Sample records for multiple analysis techniques

  1. Analysis and prediction of Multiple-Site Damage (MSD) fatigue crack growth

    NASA Technical Reports Server (NTRS)

    Dawicke, D. S.; Newman, J. C., Jr.

    1992-01-01

    A technique was developed to calculate the stress intensity factor for multiple interacting cracks. The analysis was verified through comparison with accepted methods of calculating stress intensity factors. The technique was incorporated into a fatigue crack growth prediction model and used to predict the fatigue crack growth life for multiple-site damage (MSD). The analysis was verified through comparison with experiments conducted on uniaxially loaded flat panels with multiple cracks. Configurations with nearly equal and unequal crack distributions were examined. The fatigue crack growth predictions agreed within 20 percent of the experimental lives for all crack configurations considered.
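
    The life prediction in records like this one comes down to integrating a crack-growth law over crack length. As a rough illustration (not the paper's model), the sketch below integrates the Paris law da/dN = C(ΔK)^m for a single center crack in a wide plate; the constants C and m, the stress range, and the geometry factor F are made-up placeholders.

    ```python
    import numpy as np

    # Minimal Paris-law life integration for a single center crack in a wide
    # plate: da/dN = C * (dK)^m with dK = dS * sqrt(pi * a) * F.
    # All parameter values are illustrative placeholders, not the paper's.
    C, m = 1e-11, 3.0           # Paris constants (m/cycle, MPa*sqrt(m) units)
    dS = 100.0                  # applied stress range, MPa
    a, a_crit = 0.002, 0.020    # initial and critical half-lengths, m
    F = 1.0                     # geometry factor; MSD interaction raises F

    cycles, da = 0.0, 1e-5      # march the crack forward in small steps
    while a < a_crit:
        dK = dS * np.sqrt(np.pi * a) * F
        cycles += da / (C * dK**m)   # cycles spent growing by `da`
        a += da

    print(f"predicted life: {cycles:,.0f} cycles")
    ```

    In a true MSD analysis, F would be recomputed at every step from the interacting-crack stress intensity solution rather than held constant.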

  2. Overview of Sparse Graph for Multiple Access in Future Mobile Networks

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Li, Baoguo; Li, Erbao; Gong, Zhenghui

    2017-10-01

    Multiple access via sparse graph, such as low density signature (LDS) and sparse code multiple access (SCMA), is a promising technique for future wireless communications. This survey presents an overview of the developments in this burgeoning field, including transmitter structures, extrinsic information transfer (EXIT) chart analysis and comparisons with existing multiple access techniques. This technique enables multiple access under overloaded conditions while achieving satisfactory performance. A message passing algorithm is utilized for multi-user detection in the receiver, and structures of the sparse graph are illustrated in detail. Outlooks and challenges of this technique are also presented.

  3. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar at different PRFs in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of the PRFs. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.
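
    The core of the multiple-PRF idea can be shown with a toy resolver: each PRF observes the Doppler centroid only modulo that PRF, and scoring candidate absolute centroids against several PRFs singles out the correct ambiguity. The sketch below uses invented PRF values and noise level; it illustrates the principle, not the SIR-C/X-SAR algorithm.

    ```python
    import numpy as np

    # Toy multiple-PRF ambiguity resolver. Each PRF observes the Doppler
    # centroid only modulo that PRF; scoring candidate absolute centroids
    # against all PRFs picks out the right ambiguity. Values are invented.
    prfs = np.array([1400.0, 1550.0, 1700.0])       # Hz
    f_true = 3650.0                                  # Hz, unknown in practice
    rng = np.random.default_rng(0)
    measured = f_true % prfs + rng.normal(0.0, 5.0, prfs.size)

    def residual(f):
        d = np.abs(f % prfs - measured)              # wrapped distance per PRF
        return np.sum(np.minimum(d, prfs - d) ** 2)

    candidates = np.arange(0.0, 8000.0, 1.0)         # window from attitude bound
    best = min(candidates, key=residual)
    print(f"resolved centroid ~ {best:.0f} Hz (true {f_true:.0f} Hz)")
    ```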

  4. Multiple-Group Analysis Using the sem Package in the R System

    ERIC Educational Resources Information Center

    Evermann, Joerg

    2010-01-01

    Multiple-group analysis in covariance-based structural equation modeling (SEM) is an important technique to ensure the invariance of latent construct measurements and the validity of theoretical models across different subpopulations. However, not all SEM software packages provide multiple-group analysis capabilities. The sem package for the R…

  5. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red spectroscopy, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features that often go undetected owing to limitations of any single analytical procedure have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed and the multiple technique approach has been evaluated as a whole.

  6. Using multiple group modeling to test moderators in meta-analysis.

    PubMed

    Schoemann, Alexander M

    2016-12-01

    Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis, researchers have access to powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis a model is fit to each level of the moderator simultaneously. By constraining parameters across groups, any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis, where both the mean and the between-studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
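
    A minimal way to see what a multiple-group meta-analysis tests: pool the effect sizes separately within each level of the moderator, then compare the pooled means. The sketch below hand-rolls DerSimonian-Laird random-effects pooling per group and a z-test on the difference, which is the comparison a multiple-group SEM or MLM fit makes by constraining group parameters equal; the effect sizes and variances are invented.

    ```python
    import numpy as np
    from scipy import stats

    # DerSimonian-Laird random-effects pooling within each moderator group,
    # then a z-test on the difference of pooled effects. Data are invented.
    def dl_pool(y, v):
        w = 1.0 / v
        mu_fe = np.sum(w * y) / np.sum(w)            # fixed-effect mean
        q = np.sum(w * (y - mu_fe) ** 2)
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
        w_re = 1.0 / (v + tau2)
        return np.sum(w_re * y) / np.sum(w_re), 1.0 / np.sum(w_re)

    mu1, var1 = dl_pool(np.array([0.30, 0.45, 0.25]), np.array([0.02, 0.03, 0.01]))
    mu2, var2 = dl_pool(np.array([0.05, 0.10, 0.00]), np.array([0.02, 0.02, 0.03]))
    z = (mu1 - mu2) / np.sqrt(var1 + var2)
    print(f"group difference: z = {z:.2f}, p = {2 * stats.norm.sf(abs(z)):.3f}")
    ```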

  7. The multiple imputation method: a case study involving secondary data analysis.

    PubMed

    Walani, Salimah R; Cleland, Charles M

    2015-05-01

    To illustrate, with the example of a secondary data analysis study, the use of the multiple imputation method to replace missing data. Most large public datasets have missing data, which need to be handled by researchers conducting secondary data analysis studies. Multiple imputation is a technique widely used to replace missing values while preserving the sample size and sampling variability of the data. Data were drawn from the 2004 National Sample Survey of Registered Nurses. The authors created a model to impute missing values using the chained equation method. They used imputation diagnostics procedures and conducted regression analysis of imputed data to determine the differences between the log hourly wages of internationally educated and US-educated registered nurses. The authors used multiple imputation procedures to replace missing values in a large dataset with 29,059 observations. Five multiply imputed datasets were created. Imputation diagnostics using time series and density plots showed that imputation was successful. The authors also present an example of the use of multiply imputed datasets to conduct regression analysis to answer a substantive research question. Multiple imputation is a powerful technique for imputing missing values in large datasets while preserving the sample size and variance of the data. Even though the chained equation method involves complex statistical computations, recent innovations in software and computation have made it possible for researchers to conduct this technique on large datasets. The authors recommend nurse researchers use multiple imputation methods for handling missing data to improve the statistical power and external validity of their studies.
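
    For readers who want the mechanics, the sketch below runs the chained-equation idea on toy data using scikit-learn's IterativeImputer (a MICE-style imputer standing in for whatever software the authors used) and pools regression estimates across five imputed datasets with Rubin's rules; the data layout and all numbers are invented, not the survey's.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer

    # Chained-equation multiple imputation on toy data, pooled with Rubin's
    # rules. The data layout and all numbers are invented, not the survey's.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 3))
    y = X @ np.array([0.5, -0.2, 0.1]) + rng.normal(size=500)
    data = np.column_stack([y, X])
    data[rng.random(data.shape) < 0.1] = np.nan      # punch 10% holes

    m, betas, variances = 5, [], []
    for i in range(m):                               # m imputed datasets
        imputed = IterativeImputer(sample_posterior=True,
                                   random_state=i).fit_transform(data)
        fit = sm.OLS(imputed[:, 0], sm.add_constant(imputed[:, 1:])).fit()
        betas.append(fit.params)
        variances.append(fit.bse**2)

    qbar = np.mean(betas, axis=0)                    # pooled estimates
    ubar = np.mean(variances, axis=0)                # within-imputation variance
    b = np.var(betas, axis=0, ddof=1)                # between-imputation variance
    print(qbar, np.sqrt(ubar + (1 + 1 / m) * b))     # Rubin's total SE
    ```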

  8. Method for improving accuracy in full evaporation headspace analysis.

    PubMed

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-05-01

    We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by changes in the solid content of the samples has a great impact on the measurement accuracy of conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique is effective in minimizing this problem. The proposed full evaporation multiple headspace extraction technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Biostatistics Series Module 10: Brief Overview of Multivariate Methods.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2017-01-01

    Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, which make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count-type data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA) in which an additional independent variable of interest, the covariate, is brought into the analysis. It examines whether a difference persists after "controlling" for the effect of the covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied in psychometrics, social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract, from a larger number of metric variables, a smaller number of composite factors or components that are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. The calculation-intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with the wider availability and increasing sophistication of statistical software, and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.

  10. Analysis and Interpretation of Findings Using Multiple Regression Techniques

    ERIC Educational Resources Information Center

    Hoyt, William T.; Leierer, Stephen; Millington, Michael J.

    2006-01-01

    Multiple regression and correlation (MRC) methods form a flexible family of statistical techniques that can address a wide variety of different types of research questions of interest to rehabilitation professionals. In this article, we review basic concepts and terms, with an emphasis on interpretation of findings relevant to research questions…

  11. Use of Multiple Regression and Use-Availability Analyses in Determining Habitat Selection by Gray Squirrels (Sciurus Carolinensis)

    Treesearch

    John W. Edwards; Susan C. Loeb; David C. Guynn

    1994-01-01

    Multiple regression and use-availability analyses are two methods for examining habitat selection. Use-availability analysis is commonly used to evaluate macrohabitat selection whereas multiple regression analysis can be used to determine microhabitat selection. We compared these techniques using behavioral observations (n = 5534) and telemetry locations (n = 2089) of...

  12. Market segmentation for multiple option healthcare delivery systems--an application of cluster analysis.

    PubMed

    Jarboe, G R; Gates, R H; McDaniel, C D

    1990-01-01

    Healthcare providers of multiple option plans may be confronted with special market segmentation problems. This study demonstrates how cluster analysis may be used for discovering distinct patterns of preference for multiple option plans. The availability of metric, as opposed to categorical or ordinal, data provides the ability to use sophisticated analysis techniques which may be superior to frequency distributions and cross-tabulations in revealing preference patterns.
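
    A minimal sketch of the segmentation step described here: cluster respondents' metric preference ratings and profile each cluster. The attribute ratings below are invented, and k-means stands in for whichever clustering algorithm the study used.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Cluster respondents' metric ratings of plan attributes, then profile
    # each cluster. Two planted segments with invented rating profiles.
    rng = np.random.default_rng(2)
    ratings = np.vstack([rng.normal([7, 2, 5], 1.0, (40, 3)),   # segment A
                         rng.normal([3, 8, 4], 1.0, (40, 3))])  # segment B

    km = KMeans(n_clusters=2, n_init=10, random_state=0)
    labels = km.fit_predict(StandardScaler().fit_transform(ratings))
    for k in range(2):           # mean preference profile per segment
        print(k, ratings[labels == k].mean(axis=0).round(1))
    ```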

  13. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  14. Combining results of multiple search engines in proteomics.

    PubMed

    Shteynberg, David; Nesvizhskii, Alexey I; Moritz, Robert L; Deutsch, Eric W

    2013-09-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques.

  15. Combining Results of Multiple Search Engines in Proteomics*

    PubMed Central

    Shteynberg, David; Nesvizhskii, Alexey I.; Moritz, Robert L.; Deutsch, Eric W.

    2013-01-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques. PMID:23720762

  16. Mixed Models and Reduction Techniques for Large-Rotation, Nonlinear Analysis of Shells of Revolution with Application to Tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.; Tanner, J. A.

    1984-01-01

    An effective computational strategy is presented for the large-rotation, nonlinear axisymmetric analysis of shells of revolution. The three key elements of the computational strategy are: (1) use of mixed finite-element models with discontinuous stress resultants at the element interfaces; (2) substantial reduction in the total number of degrees of freedom through the use of a multiple-parameter reduction technique; and (3) reduction in the size of the analysis model through the decomposition of asymmetric loads into symmetric and antisymmetric components coupled with the use of the multiple-parameter reduction technique. The potential of the proposed computational strategy is discussed. Numerical results are presented to demonstrate the high accuracy of the mixed models developed and to show the potential of using the proposed computational strategy for the analysis of tires.

  17. MULGRES: a computer program for stepwise multiple regression analysis

    Treesearch

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
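
    MULGRES itself is a FORTRAN-era program, but its stepwise-deletion strategy is easy to sketch: refit the regression repeatedly, dropping the least significant predictor until all remaining p-values clear a threshold. The Python sketch below mirrors that strategy on toy data; the 0.05 cutoff is an assumed choice, not taken from the program.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Backward stepwise deletion: refit, drop the least significant
    # predictor, repeat until all remaining p-values clear the cutoff.
    rng = np.random.default_rng(3)
    X = rng.normal(size=(200, 5))
    y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(size=200)

    cols, alpha = list(range(X.shape[1])), 0.05      # assumed 0.05 cutoff
    while cols:
        fit = sm.OLS(y, sm.add_constant(X[:, cols])).fit()
        pvals = fit.pvalues[1:]                      # skip the intercept
        worst = int(np.argmax(pvals))
        if pvals[worst] < alpha:                     # all significant: stop
            break
        cols.pop(worst)                              # delete the weakest variable

    print("retained predictors:", cols)
    ```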

  18. General Nature of Multicollinearity in Multiple Regression Analysis.

    ERIC Educational Resources Information Center

    Liu, Richard

    1981-01-01

    Discusses multiple regression, a very popular statistical technique in the field of education. One of the basic assumptions in regression analysis requires that independent variables in the equation should not be highly correlated. The problem of multicollinearity and some of the solutions to it are discussed. (Author)
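
    A standard diagnostic for the multicollinearity problem described here is the variance inflation factor (VIF), which measures how much of each predictor is explained by the others. The sketch below, on toy data with one deliberately collinear pair, is illustrative and is not drawn from the article itself.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Variance inflation factors flag predictors that are nearly linear
    # combinations of the others (VIF >> 10 is a common warning sign).
    rng = np.random.default_rng(4)
    x1 = rng.normal(size=300)
    x2 = x1 + rng.normal(scale=0.1, size=300)   # nearly collinear with x1
    x3 = rng.normal(size=300)
    X = sm.add_constant(np.column_stack([x1, x2, x3]))

    for i in range(1, X.shape[1]):              # skip the constant column
        print(f"VIF x{i}: {variance_inflation_factor(X, i):.1f}")
    ```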

  19. Analysis in Motion Initiative – Summarization Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arendt, Dustin; Pirrung, Meg; Jasper, Rob

    2017-06-22

    Analysts are tasked with integrating information from multiple data sources for important and timely decision making. What if sense making and overall situation awareness could be improved through visualization techniques? The Analysis in Motion initiative is advancing the ability to summarize and abstract multiple streams and static data sources over time.

  20. Incorporating Multiple-Choice Questions into an AACSB Assurance of Learning Process: A Course-Embedded Assessment Application to an Introductory Finance Course

    ERIC Educational Resources Information Center

    Santos, Michael R.; Hu, Aidong; Jordan, Douglas

    2014-01-01

    The authors offer a classification technique to make a quantitative skills rubric more operational, with the groupings of multiple-choice questions to match the student learning levels in knowledge, calculation, quantitative reasoning, and analysis. The authors applied this classification technique to the mid-term exams of an introductory finance…

  1. Developing Scenarios: Linking Environmental Scanning and Strategic Planning.

    ERIC Educational Resources Information Center

    Whiteley, Meredith A.; And Others

    1990-01-01

    The multiple scenario analysis technique for organizational planning used by multinational corporations is adaptable for colleges and universities. Arizona State University launched a futures-based planning project using the Delphi technique and cross-impact analysis to produce three alternative scenarios (stable, turbulent, and chaotic) to expand…

  2. Performance Analysis of Diversity-Controlled Multi-User Superposition Transmission for 5G Wireless Networks

    PubMed Central

    Yeom, Jeong Seon; Jung, Bang Chul; Jin, Hu

    2018-01-01

    In this paper, we propose a novel low-complexity multi-user superposition transmission (MUST) technique for 5G downlink networks, which allows multiple cell-edge users to be multiplexed with a single cell-center user. We call the proposed technique diversity-controlled MUST technique since the cell-center user enjoys the frequency diversity effect via signal repetition over multiple orthogonal frequency division multiplexing (OFDM) sub-carriers. We assume that a base station is equipped with a single antenna but users are equipped with multiple antennas. In addition, we assume that the quadrature phase shift keying (QPSK) modulation is used for users. We mathematically analyze the bit error rate (BER) of both cell-edge users and cell-center users, which is the first theoretical result in the literature to the best of our knowledge. The mathematical analysis is validated through extensive link-level simulations. PMID:29439413

  3. Performance Analysis of Diversity-Controlled Multi-User Superposition Transmission for 5G Wireless Networks.

    PubMed

    Yeom, Jeong Seon; Chu, Eunmi; Jung, Bang Chul; Jin, Hu

    2018-02-10

    In this paper, we propose a novel low-complexity multi-user superposition transmission (MUST) technique for 5G downlink networks, which allows multiple cell-edge users to be multiplexed with a single cell-center user. We call the proposed technique diversity-controlled MUST technique since the cell-center user enjoys the frequency diversity effect via signal repetition over multiple orthogonal frequency division multiplexing (OFDM) sub-carriers. We assume that a base station is equipped with a single antenna but users are equipped with multiple antennas. In addition, we assume that the quadrature phase shift keying (QPSK) modulation is used for users. We mathematically analyze the bit error rate (BER) of both cell-edge users and cell-center users, which is the first theoretical result in the literature to the best of our knowledge. The mathematical analysis is validated through extensive link-level simulations.

  4. Zero-forcing pre-coding for MIMO WiMAX transceivers: Performance analysis and implementation issues

    NASA Astrophysics Data System (ADS)

    Cattoni, A. F.; Le Moullec, Y.; Sacchi, C.

    Next generation wireless communication networks are expected to achieve ever increasing data rates. Multi-User Multiple-Input-Multiple-Output (MU-MIMO) is a key technique to obtain the expected performance, because it combines the high capacity achievable using the MIMO channel with the benefits of space division multiple access. In MU-MIMO systems, the base stations transmit signals to two or more users over the same channel; for this reason, every user can experience inter-user interference. This paper provides a capacity analysis of an online, interference-based pre-coding algorithm able to mitigate the multi-user interference of MU-MIMO systems in the context of a realistic WiMAX application scenario. Simulation results show that pre-coding can significantly increase the channel capacity. Furthermore, the paper presents several feasibility considerations for implementation of the analyzed technique in a possible FPGA-based software defined radio.
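
    The zero-forcing pre-coder named in the title has a compact closed form: with channel matrix H (users x transmit antennas), W = H^H (H H^H)^(-1) pre-inverts the channel so each user sees only its own stream. The sketch below demonstrates this on a random channel; the dimensions and power normalization are illustrative choices, not the paper's system parameters.

    ```python
    import numpy as np

    # Zero-forcing pre-coding: W = H^H (H H^H)^(-1) pre-inverts the channel
    # so H @ W is (a scaled) identity and users see no mutual interference.
    rng = np.random.default_rng(5)
    n_users, n_tx = 2, 4
    H = (rng.normal(size=(n_users, n_tx)) +
         1j * rng.normal(size=(n_users, n_tx))) / np.sqrt(2)

    W = H.conj().T @ np.linalg.inv(H @ H.conj().T)   # ZF pre-coder
    W /= np.linalg.norm(W)                           # total-power normalization
    print(np.round(H @ W, 3))                        # ~scaled identity matrix
    ```

    The price of this simplicity is noise and power amplification when H is badly conditioned, which is one reason the paper weighs capacity against implementation cost.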

  5. Exploring how surgeon teachers motivate residents in the operating room.

    PubMed

    Dath, Deepak; Hoogenes, Jen; Matsumoto, Edward D; Szalay, David A

    2013-02-01

    Motivation in teaching, mainly studied in disciplines outside of surgery, may also be an important part of intraoperative teaching. We explored techniques surgeons use to motivate learners in the operating room (OR). Forty-four experienced surgeon teachers from multiple specialties participated in 9 focus groups about teaching in the OR. Focus groups were transcribed and subjected to qualitative thematic analysis by 3 reviewers through an iterative, rigorous process. Analysis revealed 8 motivational techniques. Surgeons used motivational techniques tacitly, describing multiple ways that they facilitate resident motivation while teaching. Two major categories of motivational techniques emerged: (1) the facilitation of intrinsic motivation; and (2) the provision of factors to stimulate extrinsic motivation. Surgeons unknowingly but tacitly and commonly use motivation in intraoperative teaching and use a variety of techniques to foster learners' intrinsic and extrinsic motivation. Motivating learners is one vital role that surgeon teachers play in nontechnical intraoperative teaching. Copyright © 2013 Elsevier Inc. All rights reserved.

  6. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…

  7. Untangling the Diverse Interior and Multiple Exterior Guest Interactions of a Supramolecular Host by the Simultaneous Analysis of Complementary Observables.

    PubMed

    Sgarlata, Carmelo; Raymond, Kenneth N

    2016-07-05

    The entropic and enthalpic driving forces for encapsulation versus sequential exterior guest binding to the [Ga4L6](12-) supramolecular host in solution are very different, which significantly complicates the determination of these thermodynamic parameters. The simultaneous use of complementary techniques, such as NMR, UV-vis, and isothermal titration calorimetry, enables the disentanglement of such multiple host-guest interactions. Indeed, data collected by each technique measure different components of the host-guest equilibria and together provide a complete picture of the solution thermodynamics. Unfortunately, commercially available programs do not allow for global analysis of different physical observables. We thus resorted to a novel procedure for the simultaneous refinement of multiple parameters (ΔG°, ΔH°, and ΔS°) by treating different observables through a weighted nonlinear least-squares analysis of a constrained model. The refinement procedure is discussed for the multiple binding of the Et4N(+) guest, but it is broadly applicable to the deconvolution of other intricate host-guest equilibria.

  8. The Impact of Multiple Endpoint Dependency on "Q" and "I"[superscript 2] in Meta-Analysis

    ERIC Educational Resources Information Center

    Thompson, Christopher Glen; Becker, Betsy Jane

    2014-01-01

    A common assumption in meta-analysis is that effect sizes are independent. When correlated effect sizes are analyzed using traditional univariate techniques, this assumption is violated. This research assesses the impact of dependence arising from treatment-control studies with multiple endpoints on homogeneity measures "Q" and…

  9. Conjoint Analysis: A Study of the Effects of Using Person Variables.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    Three statistical techniques--conjoint analysis, a multiple linear regression model, and a multiple linear regression model with a surrogate person variable--were used to estimate the relative importance of five university attributes for students in the process of selecting a college. The five attributes include: availability and variety of…

  10. A Technique of Fuzzy C-Mean in Multiple Linear Regression Model toward Paddy Yield

    NASA Astrophysics Data System (ADS)

    Syazwan Wahab, Nur; Saifullah Rusiman, Mohd; Mohamad, Mahathir; Amira Azmi, Nur; Che Him, Norziha; Ghazali Kamardan, M.; Ali, Maselan

    2018-04-01

    In this paper, we propose a hybrid model that combines a multiple linear regression model with the fuzzy c-means method. This research involved the relationship between 20 topsoil variates, analyzed prior to planting, and paddy yields at standard fertilizer rates. Data used were from the multi-location trials for rice carried out by MARDI at major paddy granaries in Peninsular Malaysia during the period from 2009 to 2012. Missing observations were estimated using mean estimation techniques. The data were analyzed using a multiple linear regression model alone and in combination with the fuzzy c-means method. Analysis of normality and multicollinearity indicated that the data are normally scattered without multicollinearity among the independent variables. Fuzzy c-means analysis clusters the paddy yields into two clusters before the multiple linear regression model is applied. The comparison between the two methods indicates that the hybrid of the multiple linear regression model and the fuzzy c-means method outperforms the multiple linear regression model alone, with a lower mean square error.

  11. Single, double or multiple-injection techniques for non-ultrasound guided axillary brachial plexus block in adults undergoing surgery of the lower arm.

    PubMed

    Chin, Ki Jinn; Alakkad, Husni; Cubillos, Javier E

    2013-08-08

    Regional anaesthesia comprising axillary block of the brachial plexus is a common anaesthetic technique for distal upper limb surgery. This is an update of a review first published in 2006 and updated in 2011. To compare the relative effects (benefits and harms) of three injection techniques (single, double and multiple) of axillary block of the brachial plexus for distal upper extremity surgery. We considered these effects primarily in terms of anaesthetic effectiveness; the complication rate (neurological and vascular); and pain and discomfort caused by performance of the block. We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library), MEDLINE, EMBASE and reference lists of trials. We contacted trial authors. The date of the last search was March 2013 (updated from March 2011). We included randomized controlled trials that compared double with single-injection techniques, multiple with single-injection techniques, or multiple with double-injection techniques for axillary block in adults undergoing surgery of the distal upper limb. We excluded trials using ultrasound-guided techniques. Independent study selection, risk of bias assessment and data extraction were performed by at least two investigators. We undertook meta-analysis. The 21 included trials involved a total of 2148 participants who received regional anaesthesia for hand, wrist, forearm or elbow surgery. Risk of bias assessment indicated that trial design and conduct were generally adequate; the most common areas of weakness were in blinding and allocation concealment. Eight trials comparing double versus single injections showed a statistically significant decrease in primary anaesthesia failure (risk ratio (RR) 0.51, 95% confidence interval (CI) 0.30 to 0.85). Subgroup analysis by method of nerve location showed that the effect size was greater when neurostimulation was used rather than the transarterial technique. Eight trials comparing multiple with single injections showed a statistically significant decrease in primary anaesthesia failure (RR 0.25, 95% CI 0.14 to 0.44) and of incomplete motor block (RR 0.61, 95% CI 0.39 to 0.96) in the multiple injection group. Eleven trials comparing multiple with double injections showed a statistically significant decrease in primary anaesthesia failure (RR 0.28, 95% CI 0.20 to 0.40) and of incomplete motor block (RR 0.55, 95% CI 0.36 to 0.85) in the multiple injection group. Tourniquet pain was significantly reduced with multiple injections compared with double injections (RR 0.53, 95% CI 0.33 to 0.84). Otherwise there were no statistically significant differences between groups in any of the three comparisons on secondary analgesia failure, complications and patient discomfort. The time for block performance was significantly shorter for single and double injections compared with multiple injections. This review provides evidence that multiple-injection techniques using nerve stimulation for axillary plexus block produce more effective anaesthesia than either double or single-injection techniques. However, there was insufficient evidence for a significant difference in other outcomes, including safety.
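
    As a worked example of the effect measure used throughout this review, the sketch below computes a risk ratio and its 95% confidence interval from a 2x2 table on the log scale; the failure counts are invented and do not come from the included trials.

    ```python
    import numpy as np

    # Risk ratio with a 95% CI from a 2x2 table, computed on the log scale.
    # Counts are invented, not taken from the included trials.
    e1, n1 = 12, 100     # failures / participants, multiple-injection arm
    e2, n2 = 40, 100     # failures / participants, single-injection arm

    rr = (e1 / n1) / (e2 / n2)
    se_log = np.sqrt(1/e1 - 1/n1 + 1/e2 - 1/n2)      # SE of log(RR)
    lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log)
    print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```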

  12. Simple lock-in detection technique utilizing multiple harmonics for digital PGC demodulators.

    PubMed

    Duan, Fajie; Huang, Tingting; Jiang, Jiajia; Fu, Xiao; Ma, Ling

    2017-06-01

    A simple lock-in detection technique especially suited for digital phase-generated carrier (PGC) demodulators is proposed in this paper. It mixes the interference signal with rectangular waves whose Fourier expansions contain multiple odd or multiple even harmonics of the carrier to recover the quadrature components needed for interference phase demodulation. In this way, the use of a multiplier is avoided and the efficiency of the algorithm is improved. Noise performance with regard to light intensity variation and circuit noise is analyzed theoretically for both the proposed technique and the traditional lock-in technique, and results show that the former provides a better signal-to-noise ratio than the latter with proper modulation depth and average interference phase. Detailed simulations were conducted and the theoretical analysis was verified. A fiber-optic Michelson interferometer was constructed and the feasibility of the proposed technique is demonstrated.

  13. Using cognitive task analysis to develop simulation-based training for medical tasks.

    PubMed

    Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette

    2013-10-01

    Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  14. Digital processing of array seismic recordings

    USGS Publications Warehouse

    Ryall, Alan; Birtill, John

    1962-01-01

    This technical letter contains a brief review of the operations which are involved in digital processing of array seismic recordings by the methods of velocity filtering, summation, cross-multiplication and integration, and by combinations of these operations (the "UK Method" and multiple correlation). Examples are presented of analyses by the several techniques on array recordings which were obtained by the U.S. Geological Survey during chemical and nuclear explosions in the western United States. Seismograms are synthesized using actual noise and Pn-signal recordings, such that the signal-to-noise ratio, onset time and velocity of the signal are predetermined for the synthetic record. These records are then analyzed by summation, cross-multiplication, multiple correlation and the UK technique, and the results are compared. For all of the examples presented, analysis by the non-linear techniques of multiple correlation and cross-multiplication of the traces on an array recording are preferred to analyses by the linear operations involved in summation and the UK Method.

  15. Physical and Cognitive-Affective Factors Associated with Fatigue in Individuals with Fibromyalgia: A Multiple Regression Analysis

    ERIC Educational Resources Information Center

    Muller, Veronica; Brooks, Jessica; Tu, Wei-Mo; Moser, Erin; Lo, Chu-Ling; Chan, Fong

    2015-01-01

    Purpose: The main objective of this study was to determine the extent to which physical and cognitive-affective factors are associated with fibromyalgia (FM) fatigue. Method: A quantitative descriptive design using correlation techniques and multiple regression analysis. The participants consisted of 302 members of the National Fibromyalgia &…

  16. Computational Tools for Probing Interactions in Multiple Linear Regression, Multilevel Modeling, and Latent Curve Analysis

    ERIC Educational Resources Information Center

    Preacher, Kristopher J.; Curran, Patrick J.; Bauer, Daniel J.

    2006-01-01

    Simple slopes, regions of significance, and confidence bands are commonly used to evaluate interactions in multiple linear regression (MLR) models, and the use of these techniques has recently been extended to multilevel or hierarchical linear modeling (HLM) and latent curve analysis (LCA). However, conducting these tests and plotting the…

  17. Methods for Improving Information from ’Undesigned’ Human Factors Experiments.

    DTIC Science & Technology

    Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction

  18. An iterative forward analysis technique to determine the equation of state of dynamically compressed materials

    DOE PAGES

    Ali, S. J.; Kraus, R. G.; Fratanduono, D. E.; ...

    2017-05-18

    Here, we developed an iterative forward analysis (IFA) technique with the ability to use hydrocode simulations as a fitting function for analysis of dynamic compression experiments. The IFA method optimizes over parameterized quantities in the hydrocode simulations, breaking the degeneracy of contributions to the measured material response. Velocity profiles from synthetic data generated using a hydrocode simulation are analyzed as a first-order validation of the technique. We also analyze multiple magnetically driven ramp compression experiments on copper and compare with more conventional techniques. Excellent agreement is obtained in both cases.
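
    The IFA loop is, in essence, an optimizer wrapped around a forward simulator. The sketch below shows that shape with a toy analytic model standing in for the hydrocode: simulate a velocity profile from candidate parameters, score the misfit against the measurement, and iterate. Everything about the stand-in model is invented.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # IFA shape: an optimizer wrapped around a forward simulator. The toy
    # analytic model below stands in for the hydrocode; all values invented.
    def forward(params, t):
        c0, s = params                       # toy "EOS" parameters
        return c0 * (1.0 - np.exp(-s * t))   # synthetic velocity profile

    t = np.linspace(0.0, 1.0, 200)
    rng = np.random.default_rng(6)
    measured = forward([5.0, 3.0], t) + rng.normal(0.0, 0.05, t.size)

    misfit = lambda p: np.sum((forward(p, t) - measured) ** 2)
    fit = minimize(misfit, x0=[1.0, 1.0], method="Nelder-Mead")
    print("recovered parameters:", np.round(fit.x, 2))
    ```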

  19. Modification and evaluation of a Barnes-type objective analysis scheme for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.

    1982-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications are introduced to the original version in order to increase its flexibility and to permit greater ease of usage. The code was rewritten for an interactive computer environment. Furthermore, a multiple iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments in order to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution in order to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta scale phenomena.
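
    The Barnes scheme and its multiple-iteration correction are compact enough to sketch directly: a first Gaussian-weighted pass over the stations, then correction passes that spread the remaining station residuals back onto the grid with a sharpened weight. The sketch below uses invented stations and parameter values (kappa, gamma, pass count), not PROAM's.

    ```python
    import numpy as np

    # Barnes-type analysis: a Gaussian-weighted first pass, then correction
    # passes that spread station residuals back onto the grid with a
    # sharpened weight (gamma < 1). Stations and parameters are invented.
    def barnes_pass(targets, obs_xy, values, kappa):
        d2 = ((targets[:, None, :] - obs_xy[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / kappa)
        return (w * values).sum(1) / w.sum(1)

    rng = np.random.default_rng(7)
    obs_xy = rng.uniform(0.0, 10.0, (50, 2))              # station locations
    temps = 15.0 + 0.5 * obs_xy[:, 0] + rng.normal(0.0, 0.3, 50)
    grid = np.stack(np.meshgrid(np.arange(11.0), np.arange(11.0)),
                    -1).reshape(-1, 2)

    kappa, gamma = 4.0, 0.3
    analysis = barnes_pass(grid, obs_xy, temps, kappa)    # first pass
    at_obs = barnes_pass(obs_xy, obs_xy, temps, kappa)
    for _ in range(2):                                    # correction passes
        resid = temps - at_obs                            # unexplained residual
        analysis += barnes_pass(grid, obs_xy, resid, gamma * kappa)
        at_obs += barnes_pass(obs_xy, obs_xy, resid, gamma * kappa)

    print(analysis.reshape(11, 11)[0].round(1))           # first grid row
    ```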

  20. A multiple ion counter total evaporation (MICTE) method for precise analysis of plutonium by thermal ionization mass spectrometry

    DOE PAGES

    Inglis, Jeremy D.; Maassen, Joel; Kara, Azim; ...

    2017-04-28

    This study presents a total evaporation method for the analysis of sub-picogram quantities of Pu, utilizing an array of multiple ion counters. Data from three standards are presented to assess the utility of the technique. An external precision of 1.5% RSD (2σ) was achieved on aliquots approaching 100 fg for the minor 240Pu isotope. Accurate analysis of <1 femtogram of 240Pu is achievable, with an external reproducibility of better than 10% RSD (2σ). Finally, this new technique represents a significant advance in the total evaporation method and will allow routine measurement of femtogram-sized Pu samples by thermal ionization mass spectrometry.

  1. A multiple ion counter total evaporation (MICTE) method for precise analysis of plutonium by thermal ionization mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inglis, Jeremy D.; Maassen, Joel; Kara, Azim

    This study presents a total evaporation method for the analysis of sub-picogram quantities of Pu, utilizing an array of multiple ion counters. Data from three standards are presented to assess the utility of the technique. An external precision of 1.5% RSD (2σ) was achieved on aliquots approaching 100 fg for the minor 240Pu isotope. Accurate analysis of <1 femtogram of 240Pu is achievable, with an external reproducibility of better than 10% RSD (2σ). Finally, this new technique represents a significant advance in the total evaporation method and will allow routine measurement of femtogram-sized Pu samples by thermal ionization mass spectrometry.

  2. An R package for the integrated analysis of metabolomics and spectral data.

    PubMed

    Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel

    2016-06-01

    Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as happens with other omics data, the analysis of metabolomics datasets provides multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, of the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple to use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  4. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  5. Analysis of multiple instructional techniques on the understanding and retention of select mechanical topics

    NASA Astrophysics Data System (ADS)

    Fetsco, Sara Elizabeth

    There are several topics that introductory physics students typically have difficulty understanding. The purpose of this thesis is to investigate whether multiple instructional techniques help students to better understand and retain the material. The three units analyzed in this study are graphing motion, projectile motion, and conservation of momentum. For each unit, students were taught using new or altered instructional methods, including online laboratory simulations, inquiry labs, and interactive demonstrations. Additionally, traditional instructional methods such as lecture and problem sets were retained. Effectiveness was measured through pre- and post-tests and student opinion surveys. Results suggest that incorporating multiple instructional techniques into teaching will improve student understanding and retention. Students stated that they learned well from all of the instructional methods used except the online simulations.

  6. Real-Time Visualization of Network Behaviors for Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.

    Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.

  7. Noninvasive spectral imaging of skin chromophores based on multiple regression analysis aided by Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Nishidate, Izumi; Wiswadarma, Aditya; Hase, Yota; Tanaka, Noriyuki; Maeda, Takaaki; Niizeki, Kyuichi; Aizu, Yoshihisa

    2011-08-01

    In order to visualize melanin and blood concentrations and oxygen saturation in human skin tissue, a simple imaging technique based on multispectral diffuse reflectance images acquired at six wavelengths (500, 520, 540, 560, 580 and 600 nm) was developed. The technique utilizes multiple regression analysis aided by Monte Carlo simulation of diffuse reflectance spectra. Using the absorbance spectrum as a response variable and the extinction coefficients of melanin, oxygenated hemoglobin, and deoxygenated hemoglobin as predictor variables, multiple regression analysis provides regression coefficients. Concentrations of melanin and total blood are then determined from the regression coefficients using conversion vectors that are deduced numerically in advance, while oxygen saturation is obtained directly from the regression coefficients. Experiments with a tissue-like agar gel phantom validated the method. In vivo experiments on skin of the human hand during upper limb occlusion and of the inner forearm exposed to UV irradiation demonstrated the ability of the method to evaluate physiological reactions of human skin tissue.
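
    The regression step described here reduces to least squares of the absorbance spectrum on the chromophore extinction spectra plus an offset. The sketch below shows that step with fabricated extinction coefficients; real use requires tabulated spectra for melanin, oxy- and deoxyhemoglobin at the six wavelengths, and the paper's Monte Carlo-derived conversion vectors are not reproduced here.

    ```python
    import numpy as np

    # Least-squares fit of an absorbance spectrum to chromophore extinction
    # spectra plus an offset. Extinction values are fabricated; real use
    # needs tabulated melanin/HbO2/Hb spectra at the six wavelengths.
    eps = np.array([                      # rows: 500..600 nm; cols: mel, HbO2, Hb
        [2.1, 0.9, 1.1], [1.9, 1.3, 1.0], [1.7, 1.6, 1.4],
        [1.5, 1.2, 1.5], [1.3, 1.5, 1.2], [1.1, 0.4, 0.5]])
    true_c = np.array([0.8, 0.5, 0.3])    # melanin, HbO2, Hb "concentrations"
    absorbance = eps @ true_c + 0.1       # 0.1 mimics a scattering offset

    X = np.column_stack([eps, np.ones(len(eps))])
    (c_mel, c_hbo, c_hb, offset), *_ = np.linalg.lstsq(X, absorbance, rcond=None)
    print(f"melanin ~ {c_mel:.2f}, SO2 ~ {c_hbo / (c_hbo + c_hb):.2f}")
    ```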

  8. Advanced analysis techniques for uranium assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication (f4) and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  9. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  10. Effectiveness of applying progressive muscle relaxation technique on quality of life of patients with multiple sclerosis.

    PubMed

    Ghafari, Somayeh; Ahmadi, Fazlolah; Nabavi, Masoud; Anoshirvan, Kazemnejad; Memarian, Robabe; Rafatbakhsh, Mohamad

    2009-08-01

    To identify the effects of applying the Progressive Muscle Relaxation Technique on quality of life of patients with multiple sclerosis. In view of the growing caring options in multiple sclerosis, improvement of quality of life has become increasingly relevant as a caring intervention. Complementary therapies are widely used by multiple sclerosis patients, and the Progressive Muscle Relaxation Technique is one form of complementary therapy. Quasi-experimental study. Multiple sclerosis patients (n = 66) were selected by non-probability sampling and then assigned to experimental and control groups (33 patients in each group). Means of data collection included: Individual Information Questionnaire, SF-8 Health Survey, and a self-reported checklist. PMRT was performed for 63 sessions by the experimental group over two months, while no intervention was given to the control group. Statistical analysis was done with SPSS software. Student's t-test showed no significant difference between the two groups in mean scores of health-related quality of life before the study, but showed a significant difference between the two groups one and two months after the intervention (p < 0.05). An ANOVA test with repeated measurements showed a significant difference in the mean score of overall health-related quality of life and its dimensions between the two groups across the three time points (p < 0.05). Although this study provides modest support for the effectiveness of the Progressive Muscle Relaxation Technique on quality of life of multiple sclerosis patients, further research is required to determine better methods to promote quality of life of patients suffering from multiple sclerosis and other chronic diseases. The Progressive Muscle Relaxation Technique is practically feasible and is associated with increased quality of life of multiple sclerosis patients; health professionals therefore need to update their knowledge about complementary therapies.

  11. The clinico-radiological paradox of cognitive function and MRI burden of white matter lesions in people with multiple sclerosis: A systematic review and meta-analysis.

    PubMed

    Mollison, Daisy; Sellar, Robin; Bastin, Mark; Mollison, Denis; Chandran, Siddharthan; Wardlaw, Joanna; Connick, Peter

    2017-01-01

    Moderate correlation exists between the imaging quantification of brain white matter lesions and cognitive performance in people with multiple sclerosis (MS). This may reflect the greater importance of other features, including subvisible pathology, or methodological limitations of the primary literature. To summarise the cognitive clinico-radiological paradox and explore the potential methodological factors that could influence the assessment of this relationship. Systematic review and meta-analysis of primary research relating cognitive function to white matter lesion burden. Fifty papers met eligibility criteria for review, and meta-analysis of overall results was possible in thirty-two (2050 participants). Aggregate correlation between cognition and T2 lesion burden was r = -0.30 (95% confidence interval: -0.34, -0.26). Wide methodological variability was seen, particularly related to key factors in the cognitive data capture and image analysis techniques. Resolving the persistent clinico-radiological paradox will likely require simultaneous evaluation of multiple components of the complex pathology using optimum measurement techniques for both cognitive and MRI feature quantification. We recommend a consensus initiative to support common standards for image analysis in MS, enabling benchmarking while also supporting ongoing innovation.
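
    For readers unfamiliar with how a pooled correlation like the r = -0.30 above is formed, the sketch below aggregates per-study correlations via the Fisher z-transform with n - 3 weights and back-transforms the mean; the study values are invented, and the review's actual aggregation model may differ.

    ```python
    import numpy as np

    # Fisher-z aggregation of per-study correlations (weights n - 3), then a
    # back-transformed pooled r with a 95% CI. Study values are invented.
    r = np.array([-0.25, -0.41, -0.18, -0.33])
    n = np.array([120, 85, 240, 60])

    z = np.arctanh(r)                                  # Fisher transform
    zbar = np.sum((n - 3) * z) / np.sum(n - 3)
    se = 1.0 / np.sqrt(np.sum(n - 3))
    lo, hi = np.tanh([zbar - 1.96 * se, zbar + 1.96 * se])
    print(f"pooled r = {np.tanh(zbar):.2f} (95% CI {lo:.2f} to {hi:.2f})")
    ```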

  12. A Case for More Multiple Scattering Lidar from Space: Analysis of Four LITE Pulses Returned from a Marine Stratocumulus Deck

    NASA Technical Reports Server (NTRS)

    Davis, Anthony B.; Winker, David M.

    2011-01-01

    Outline: (1) signal physics for multiple-scattering cloud lidar; (2) SNR estimation; (3) cloud property retrievals: (a) several techniques, (b) application to Lidar In-space Technology Experiment (LITE) data, and (c) relation to the O2 A-band.

  13. An Introduction to Modern Missing Data Analyses

    ERIC Educational Resources Information Center

    Baraldi, Amanda N.; Enders, Craig K.

    2010-01-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous to traditional techniques (e.g. deletion and mean imputation techniques) because they require less stringent assumptions and mitigate the pitfalls of traditional…

  14. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electron temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
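
    For context, electron temperature estimation from emission peaks typically rests on the Boltzmann relation between two lines of the same species. The sketch below is a generic two-line calculation under the assumption of local thermodynamic equilibrium, not the authors' LPO-based pipeline; the line constants and intensities are hypothetical placeholders.

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant [eV/K]

def electron_temperature(I1, I2, line1, line2):
    """Two-line Boltzmann estimate of the plasma electron temperature.

    Each line is (wavelength_nm, g_k, A_ki_per_s, E_k_eV): statistical
    weight, transition probability and upper-level energy. Assumes local
    thermodynamic equilibrium and optically thin emission.
    """
    lam1, g1, A1, E1 = line1
    lam2, g2, A2, E2 = line2
    # Boltzmann plot relation: ln[I*lam/(g*A)] = -E_k/(k_B*T) + const,
    # so the difference of the two left-hand sides isolates T.
    lhs = np.log(I1 * lam1 / (g1 * A1)) - np.log(I2 * lam2 / (g2 * A2))
    return (E2 - E1) / (K_B_EV * lhs)

# Hypothetical intensities and line constants (placeholders, not NIST data)
line_a = (500.0, 3, 5.0e6, 13.0)
line_b = (450.0, 5, 4.0e6, 15.0)
print(f"T_e ~ {electron_temperature(4660.0, 1000.0, line_a, line_b):.0f} K")
```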

  15. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    PubMed

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices, like soils, sediments, sludge, and airborne particulate matter. In the first part of this review we reported an overview on these procedures and described the applications of chemometric uni- and bivariate techniques and of multivariate pattern recognition techniques based on variable reduction to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values and the calculation of element fractionation. We will describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied and the key findings obtained. The following topics will be critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA) and other less common techniques; modelling by multiple linear regression (MLR); investigation of spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; other multivariate techniques less commonly applied. Copyright © 2010 Elsevier B.V. All rights reserved.

  16. An Array of Qualitative Data Analysis Tools: A Call for Data Analysis Triangulation

    ERIC Educational Resources Information Center

    Leech, Nancy L.; Onwuegbuzie, Anthony J.

    2007-01-01

    One of the most important steps in the qualitative research process is analysis of data. The purpose of this article is to provide elements for understanding multiple types of qualitative data analysis techniques available and the importance of utilizing more than one type of analysis, thus utilizing data analysis triangulation, in order to…

  17. Confounding Problems in Multifactor AOV When Using Several Organismic Variables of Limited Reliability

    ERIC Educational Resources Information Center

    Games, Paul A.

    1975-01-01

    A brief introduction is presented on how multiple regression and linear model techniques can handle data analysis situations that most educators and psychologists think of as appropriate for analysis of variance. (Author/BJG)

  18. Diversity Performance Analysis on Multiple HAP Networks.

    PubMed

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-06-30

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques.
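
    The closed-form ASER derivations are not reproduced in the abstract, but results of this kind can be cross-checked by simulation. The sketch below is a minimal Monte Carlo of BPSK over a flat Rician fading channel (single link, perfect CSI, hypothetical Rician factor), not the authors' shadowed-Rician V-MIMO model.

```python
import numpy as np

rng = np.random.default_rng(7)

def bpsk_ser_rician(snr_db, K, n=200_000):
    """Monte Carlo symbol error rate of BPSK over flat Rician fading
    with Rician factor K (linear) and average SNR snr_db."""
    snr = 10 ** (snr_db / 10)
    # Rician channel: deterministic LOS part + diffuse Rayleigh part,
    # normalised so E[|h|^2] = 1.
    los = np.sqrt(K / (K + 1))
    nlos = (rng.normal(size=n) + 1j * rng.normal(size=n)) * np.sqrt(1 / (2 * (K + 1)))
    h = los + nlos
    bits = rng.integers(0, 2, n)
    s = 2 * bits - 1                       # BPSK symbols +/-1
    noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) * np.sqrt(1 / (2 * snr))
    r = h * s + noise
    # Coherent detection assuming perfect channel state information
    detected = (np.real(r * np.conj(h)) > 0).astype(int)
    return np.mean(detected != bits)

for snr_db in (0, 5, 10, 15):
    print(snr_db, "dB:", bpsk_ser_rician(snr_db, K=3.0))
```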

  19. A Unified Approach to Functional Principal Component Analysis and Functional Multiple-Set Canonical Correlation.

    PubMed

    Choi, Ji Yeh; Hwang, Heungsun; Yamamoto, Michio; Jung, Kwanghee; Woodward, Todd S

    2017-06-01

    Functional principal component analysis (FPCA) and functional multiple-set canonical correlation analysis (FMCCA) are data reduction techniques for functional data that are collected in the form of smooth curves or functions over a continuum such as time or space. In FPCA, low-dimensional components are extracted from a single functional dataset such that they explain the most variance of the dataset, whereas in FMCCA, low-dimensional components are obtained from each of multiple functional datasets in such a way that the associations among the components are maximized across the different sets. In this paper, we propose a unified approach to FPCA and FMCCA. The proposed approach subsumes both techniques as special cases. Furthermore, it permits a compromise between the techniques, such that components are obtained from each set of functional data to maximize their associations across different datasets, while accounting for the variance of the data well. We propose a single optimization criterion for the proposed approach, and develop an alternating regularized least squares algorithm to minimize the criterion in combination with basis function approximations to functions. We conduct a simulation study to investigate the performance of the proposed approach based on synthetic data. We also apply the approach for the analysis of multiple-subject functional magnetic resonance imaging data to obtain low-dimensional components of blood-oxygen level-dependent signal changes of the brain over time, which are highly correlated across the subjects as well as representative of the data. The extracted components are used to identify networks of neural activity that are commonly activated across the subjects while carrying out a working memory task.
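
    As background to the unified criterion described above (which the abstract does not spell out), plain FPCA on a common, equally spaced grid reduces to PCA of the sampled curves. The sketch below shows that discretized special case on synthetic data; all names and values are illustrative.

```python
import numpy as np

# Discretized FPCA: rows are subjects, columns are grid points; the
# quadrature weight of the equally spaced grid is a constant and can
# be absorbed, so functional PCA reduces to PCA of the data matrix.
rng = np.random.default_rng(4)
n, grid = 80, np.linspace(0, 1, 200)
# Synthetic smooth curves: two latent scores times fixed smooth modes
scores = rng.normal(size=(n, 2)) * np.array([3.0, 1.0])
modes = np.vstack([np.sin(2 * np.pi * grid), np.cos(4 * np.pi * grid)])
X = scores @ modes + 0.1 * rng.normal(size=(n, grid.size))

Xc = X - X.mean(axis=0)                  # remove the pointwise mean function
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
eigenfunctions = Vt[:2]                  # leading functional components
explained = s[:2] ** 2 / np.sum(s ** 2)  # fraction of variance explained
fpc_scores = Xc @ eigenfunctions.T       # low-dimensional scores per subject
print(explained)
```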

  20. A simple white noise analysis of neuronal light responses.

    PubMed

    Chichilnisky, E J

    2001-05-01

    A white noise technique is presented for estimating the response properties of spiking visual system neurons. The technique is simple, robust, efficient and well suited to simultaneous recordings from multiple neurons. It provides a complete and easily interpretable model of light responses even for neurons that display a common form of response nonlinearity that precludes classical linear systems analysis. A theoretical justification of the technique is presented that relies only on elementary linear algebra and statistics. Implementation is described with examples. The technique and the underlying model of neural responses are validated using recordings from retinal ganglion cells, and in principle are applicable to other neurons. Advantages and disadvantages of the technique relative to classical approaches are discussed.
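
    Although the abstract does not reproduce the estimator, the classic white-noise workhorse it builds on is the spike-triggered average: the mean stimulus preceding spikes recovers the neuron's linear filter (up to scale) under Gaussian white-noise stimulation, even through a static output nonlinearity. A self-contained sketch with a simulated threshold neuron:

```python
import numpy as np

def spike_triggered_average(stimulus, spikes, n_lags):
    """Average the n_lags stimulus frames preceding each spike."""
    stimulus = np.atleast_2d(stimulus.T).T          # ensure shape (T, d)
    sta = np.zeros((n_lags, stimulus.shape[1]))
    n_spikes = 0
    for t in np.nonzero(spikes)[0]:
        if t >= n_lags:
            sta += spikes[t] * stimulus[t - n_lags:t]
            n_spikes += spikes[t]
    return sta / max(n_spikes, 1)

# Toy demo: threshold neuron driven by a known 20-lag exponential kernel
rng = np.random.default_rng(0)
T, n_lags = 50_000, 20
stim = rng.normal(size=T)
kernel = np.exp(-np.arange(n_lags) / 5.0)           # weights on lags 1..20
drive = np.convolve(stim, np.concatenate(([0.0], kernel)))[:T]
spikes = (drive > 2.0).astype(int)                  # crude static nonlinearity
sta = spike_triggered_average(stim, spikes, n_lags)
# sta[::-1, 0] approximates `kernel` up to a scale factor
```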

  1. Maintenance Operations in Mission Oriented Protective Posture Level IV (MOPPIV)

    DTIC Science & Technology

    1987-10-01

    [Table-of-contents fragment: Repair FADAC Printed Circuit Board; 3. Data Analysis Techniques: a. Multiple Linear Regression; Analysis/Discussion: 1. Example of Regression Analysis, 2. Regression results for all tasks; Table 9: Task Grouping for Analysis; Table 10: Remove/Replace H60A3 Power Pack.]

  2. Tau-independent Phase Analysis: A Novel Method for Accurately Determining Phase Shifts.

    PubMed

    Tackenberg, Michael C; Jones, Jeff R; Page, Terry L; Hughey, Jacob J

    2018-06-01

    Estimations of period and phase are essential in circadian biology. While many techniques exist for estimating period, comparatively few methods are available for estimating phase. Current approaches to analyzing phase often vary between studies and are sensitive to coincident changes in period and the stage of the circadian cycle at which the stimulus occurs. Here we propose a new technique, tau-independent phase analysis (TIPA), for quantifying phase shifts in multiple types of circadian time-course data. Through comprehensive simulations, we show that TIPA is both more accurate and more precise than the standard actogram approach. TIPA is computationally simple and therefore will enable accurate and reproducible quantification of phase shifts across multiple subfields of chronobiology.

  3. Estimating regional greenhouse gas fluxes: An uncertainty analysis of planetary boundary layer techniques and bottom-up inventories

    USDA-ARS?s Scientific Manuscript database

    Quantification of regional greenhouse gas (GHG) fluxes is essential for establishing mitigation strategies and evaluating their effectiveness. Here, we used multiple top-down approaches and multiple trace gas observations at a tall tower to estimate GHG regional fluxes and evaluate the GHG fluxes de...

  4. Identification of Multiple Nonreturner Profiles to Inform the Development of Targeted College Retention Interventions

    ERIC Educational Resources Information Center

    Mattern, Krista D.; Marini, Jessica P.; Shaw, Emily J.

    2015-01-01

    Throughout the college retention literature, there is a recurring theme that students leave college for a variety of reasons making retention a difficult phenomenon to model. In the current study, cluster analysis techniques were employed to investigate whether multiple empirically based profiles of nonreturning students existed to more fully…

  5. Visual Attention for Solving Multiple-Choice Science Problem: An Eye-Tracking Analysis

    ERIC Educational Resources Information Center

    Tsai, Meng-Jung; Hou, Huei-Tse; Lai, Meng-Lung; Liu, Wan-Yi; Yang, Fang-Ying

    2012-01-01

    This study employed an eye-tracking technique to examine students' visual attention when solving a multiple-choice science problem. Six university students participated in a problem-solving task to predict occurrences of landslide hazards from four images representing four combinations of four factors. Participants' responses and visual attention…

  6. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for drug approval, the formulation of clinical protocols and guidelines, and decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. On the market, regardless of the clinical condition under evaluation, many interventions are usually available, and few of them have been compared in head-to-head studies. This scenario precludes conclusions from being drawn about the full profile (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how to conduct a network meta-analysis, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, the assumptions involved and the steps for performing the analysis. PMID:28503228
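
    The elementary building block behind the indirect comparisons described above is the Bucher adjusted indirect comparison: if treatments A and B were each compared with a common comparator C, an A-versus-B estimate follows from the difference of the pooled effects. A minimal sketch, assuming effects on a log odds ratio scale and hypothetical numbers:

```python
import math

def indirect_comparison(d_ac, se_ac, d_bc, se_bc):
    """Adjusted indirect comparison (Bucher method): effect of A vs B
    from pooled A-vs-C and B-vs-C effects on the same scale (e.g. log OR).
    Assumes transitivity: the A-C and B-C trial sets are comparable."""
    d_ab = d_ac - d_bc
    se_ab = math.sqrt(se_ac ** 2 + se_bc ** 2)   # variances add
    ci = (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)
    return d_ab, se_ab, ci

# Hypothetical pooled log odds ratios from two pairwise meta-analyses
d_ab, se_ab, ci = indirect_comparison(-0.40, 0.12, -0.15, 0.10)
print(f"log OR(A vs B) = {d_ab:.2f} (SE {se_ab:.2f}), 95% CI {ci}")
```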

  7. Temporal profile of inflammatory response to fracture and hemorrhagic shock: Proposal of a novel long-term survival murine multiple trauma model.

    PubMed

    Kleber, Christian; Becker, Christopher A; Malysch, Tom; Reinhold, Jens M; Tsitsilonis, Serafeim; Duda, Georg N; Schmidt-Bleek, Katharina; Schaser, Klaus D

    2015-07-01

    Hemorrhagic shock (hS) interacts with the posttraumatic immune response and fracture healing in multiple trauma. Owing to the lack of long-term survival multiple trauma animal models, no standardized analysis of the impact of multiple trauma on fracture healing has been performed. We propose a new long-term survival (21 days) murine multiple trauma model combining hS (microsurgical cannulation of the carotid artery, withdrawal of blood and continuous blood pressure measurement), femoral fracture (osteotomy/external fixation) and tibial fracture (3-point bending technique/antegrade nail). The posttraumatic immune response was measured via IL-6 and sIL-6R ELISA. The hS was investigated via macrohemodynamics, blood gas analysis, wet-dry lung ratio and histologic analysis of the shock organs. The proposed murine long-term survival (21 days) multiple trauma model mimics clinically relevant injury patterns and the previously published human posttraumatic immune response. Based on blood gas analysis and histologic analysis of the shock organs, we characterized and standardized our murine multiple trauma model. Furthermore, we identified hemorrhagic shock as a causative factor that triggers sIL-6R formation, underscoring the fundamental pathophysiologic role of the trans-signaling mechanism in multiple trauma. © 2015 Orthopaedic Research Society. Published by Wiley Periodicals, Inc.

  8. Multiplication factor versus regression analysis in stature estimation from hand and foot dimensions.

    PubMed

    Krishan, Kewal; Kanchan, Tanuj; Sharma, Abhilasha

    2012-05-01

    Estimation of stature is an important parameter in the identification of human remains in forensic examinations. The present study aims to compare the reliability and accuracy of stature estimation, and to demonstrate the variability between estimated and actual stature, using the multiplication factor and regression analysis methods. The study is based on a sample of 246 subjects (123 males and 123 females) from North India aged between 17 and 20 years. Four anthropometric measurements (hand length, hand breadth, foot length and foot breadth), taken on the left side of each subject, were included in the study. Stature was measured using standard anthropometric techniques. Multiplication factors were calculated and linear regression models were derived for the estimation of stature from hand and foot dimensions. The derived multiplication factors and regression formulae were applied to the hand and foot measurements in the study sample. The stature estimated from the multiplication factors and from regression analysis was compared with the actual stature to find the error in estimated stature. The results indicate that the range of error in stature estimation by the regression analysis method is less than that of the multiplication factor method, thus confirming that regression analysis is better than multiplication factor analysis for stature estimation. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
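
    The two estimation methods being compared reduce to simple arithmetic, sketched below on synthetic data (not the study's sample): the multiplication factor is the mean stature-to-dimension ratio, while regression fits slope and intercept by least squares. The lower error of the regression line mirrors the study's conclusion.

```python
import numpy as np

def compare_methods(dimension_cm, stature_cm):
    """Contrast multiplication factor (mean stature/dimension ratio)
    with least-squares linear regression on the same dimension."""
    mf = np.mean(stature_cm / dimension_cm)            # multiplication factor
    slope, intercept = np.polyfit(dimension_cm, stature_cm, 1)
    est_mf = mf * dimension_cm
    est_reg = slope * dimension_cm + intercept
    rmse = lambda est: np.sqrt(np.mean((stature_cm - est) ** 2))
    return mf, (slope, intercept), rmse(est_mf), rmse(est_reg)

# Synthetic foot-length/stature sample (hypothetical values)
rng = np.random.default_rng(1)
foot = rng.normal(24.0, 1.2, 200)
stature = 60.0 + 4.2 * foot + rng.normal(0, 3.0, 200)
mf, coefs, err_mf, err_reg = compare_methods(foot, stature)
print(f"MF error {err_mf:.2f} cm vs regression error {err_reg:.2f} cm")
```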

  9. A Review of Meta-Analysis Packages in R

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.

    2017-01-01

    Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…

  10. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including list wise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…

  11. Homomorphic filtering textural analysis technique to reduce multiplicative noise in the 11Oba nano-doped liquid crystalline compounds

    NASA Astrophysics Data System (ADS)

    Madhav, B. T. P.; Pardhasaradhi, P.; Manepalli, R. K. N. R.; Pisipati, V. G. K. M.

    2015-07-01

    The compound undecyloxy benzoic acid (11Oba) exhibits nematic and smectic-C phases, while the same compound nano-doped with ZnO exhibits the same nematic and smectic-C phases with a reduced clearing temperature, as expected. The doping was done with 0.5% and 1% ZnO molecules, and the clearing temperatures are reduced by approximately 4° and 6°, respectively (differential scanning calorimeter data). While collecting images from a polarizing microscope fitted with a hot stage and camera, illumination and reflectance combine multiplicatively, and the resulting loss of image quality makes it difficult to identify the exact phase of the compound. A novel homomorphic filtering technique is used in this manuscript, through which the multiplicative noise components of the image are separated linearly in the frequency domain. This technique provides a frequency-domain procedure to improve the appearance of an image by gray-level range compression and contrast enhancement.
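
    A minimal sketch of the homomorphic filtering operation described above, assuming a 2-D grayscale image and a Gaussian high-emphasis transfer function; the cutoff and gain parameters are placeholders to be tuned per image.

```python
import numpy as np

def homomorphic_filter(img, cutoff=30.0, gamma_low=0.5, gamma_high=2.0):
    """Separate multiplicative illumination/reflectance components:
    log -> FFT -> Gaussian high-emphasis filter -> inverse FFT -> exp.
    Low frequencies (illumination) are attenuated, high frequencies
    (reflectance detail) are boosted."""
    log_img = np.log1p(img.astype(np.float64))          # multiplicative -> additive
    spec = np.fft.fftshift(np.fft.fft2(log_img))
    rows, cols = img.shape
    u = np.arange(rows) - rows / 2
    v = np.arange(cols) - cols / 2
    d2 = u[:, None] ** 2 + v[None, :] ** 2              # squared distance from DC
    h = (gamma_high - gamma_low) * (1 - np.exp(-d2 / (2 * cutoff ** 2))) + gamma_low
    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(spec * h)))
    return np.clip(np.expm1(filtered), 0, None)         # back to intensity domain
```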

  12. Spacecraft Multiple Array Communication System Performance Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.

    2010-01-01

    The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which the tools operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase radio frequency coverage and data rate performance. The technique achieves phase coherence among the phased arrays so that the signals combine constructively at the target receiver. There are many technical challenges in spacecraft integration with a high transmit power communication system. The array combining technique can improve the communication system data rate and coverage performance without increasing the system transmit power requirements. Example simulation results indicate that significant performance improvement can be achieved with phase coherence implementation.
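
    The SNR benefit of phase-coherent combining mentioned above follows from signals adding in voltage while independent noise adds in power. A minimal Monte Carlo illustration of that principle (not the CSSL simulation tools) is sketched below; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def coherent_combining_snr_db(n_arrays, snr_db, n=100_000):
    """Output SNR when n_arrays copies of a unit-power signal are
    phase-aligned and summed: signal adds in voltage, independent
    noise adds in power, so SNR grows roughly n_arrays-fold."""
    snr = 10 ** (snr_db / 10)
    signal = np.exp(1j * 2 * np.pi * rng.random(n))     # unit-power symbols
    combined = np.zeros(n, dtype=complex)
    for _ in range(n_arrays):
        phase = np.exp(1j * 2 * np.pi * rng.random())   # per-array offset
        noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2 * snr)
        # Receiver estimates and removes the phase offset before summing
        combined += (signal * phase + noise) * np.conj(phase)
    scale = np.mean(combined * np.conj(signal))         # ~ n_arrays
    return 10 * np.log10(np.abs(scale) ** 2 / np.var(combined - scale * signal))

print(coherent_combining_snr_db(4, snr_db=0.0))  # ~6 dB with 4 arrays at 0 dB each
```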

  13. Correlative Tomography

    PubMed Central

    Burnett, T. L.; McDonald, S. A.; Gholinia, A.; Geurts, R.; Janus, M.; Slater, T.; Haigh, S. J.; Ornek, C.; Almuaili, F.; Engelberg, D. L.; Thompson, G. E.; Withers, P. J.

    2014-01-01

    Increasingly researchers are looking to bring together perspectives across multiple scales, or to combine insights from different techniques, for the same region of interest. To this end, correlative microscopy has already yielded substantial new insights in two dimensions (2D). Here we develop correlative tomography where the correlative task is somewhat more challenging because the volume of interest is typically hidden beneath the sample surface. We have threaded together x-ray computed tomography, serial section FIB-SEM tomography, electron backscatter diffraction and finally TEM elemental analysis all for the same 3D region. This has allowed observation of the competition between pitting corrosion and intergranular corrosion at multiple scales revealing the structural hierarchy, crystallography and chemistry of veiled corrosion pits in stainless steel. With automated correlative workflows and co-visualization of the multi-scale or multi-modal datasets the technique promises to provide insights across biological, geological and materials science that are impossible using either individual or multiple uncorrelated techniques. PMID:24736640

  14. Tackling missing radiographic progression data: multiple imputation technique compared with inverse probability weights and complete case analysis.

    PubMed

    Descalzo, Miguel Á; Garcia, Virginia Villaverde; González-Alvaro, Isidoro; Carbonell, Jordi; Balsa, Alejandro; Sanmartí, Raimon; Lisbona, Pilar; Hernandez-Barrera, Valentín; Jiménez-Garcia, Rodrigo; Carmona, Loreto

    2013-02-01

    To describe the results of different statistical approaches to a radiographic outcome affected by missing data (multiple imputation, inverse probability weights and complete case analysis) using data from an observational study. A random sample of 96 RA patients was selected for a follow-up study in which radiographs of hands and feet were scored. Radiographic progression was tested by comparing the change in the total Sharp-van der Heijde radiographic score (TSS) and the joint erosion score (JES) from baseline to the end of the second year of follow-up. The MI technique, inverse probability weights in a weighted estimating equation (WEE) and CC analysis were used to fit a negative binomial regression. Major predictors of radiographic progression were JES and joint space narrowing (JSN) at baseline, together with baseline disease activity measured by DAS28 for TSS and MTX use for JES. Results from CC analysis show larger coefficients and standard errors compared with the MI and weighted techniques. The results from the WEE model were quite in line with those of MI. If it seems plausible that CC or MI analysis may be valid, then MI should be preferred because of its greater efficiency. CC analysis resulted in inefficient estimates or, translated into non-statistical terminology, could lead to inaccurate results and unwise conclusions. The methods discussed here will contribute to the use of alternative approaches for tackling missing data in observational studies.
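
    For readers who want to experiment with the MI-versus-complete-case comparison, the sketch below shows the generic multiple-imputation workflow (impute m times, analyse each completed dataset, pool) using scikit-learn's IterativeImputer and, for simplicity, a linear rather than negative binomial analysis model; names and settings are illustrative, not the authors' code.

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import LinearRegression

def mi_pooled_coefs(X, y_col, m=20):
    """Impute m completed datasets with chained equations, fit the
    analysis model on each, and pool the coefficients. (Full Rubin's
    rules would also combine the within-imputation variances from
    each fit; that term is omitted here for brevity.)"""
    coefs = []
    for i in range(m):
        imp = IterativeImputer(sample_posterior=True, random_state=i)
        Xc = imp.fit_transform(X)                  # X: (n, p) array with NaNs
        y = Xc[:, y_col]                           # outcome column
        Z = np.delete(Xc, y_col, axis=1)           # predictor columns
        coefs.append(LinearRegression().fit(Z, y).coef_)
    coefs = np.asarray(coefs)
    # Pooled estimate and between-imputation variance
    return coefs.mean(axis=0), coefs.var(axis=0, ddof=1)
```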

  15. Magnetic fabric constraints of the emplacement of igneous intrusions

    NASA Astrophysics Data System (ADS)

    Maes, Stephanie M.

    Fabric analysis is critical to evaluating the history, kinematics, and dynamics of geological deformation. This is particularly true of igneous intrusions, where the development of fabric is used to constrain magmatic flow and emplacement mechanisms. Fabric analysis was applied to three mafic intrusions, with different tectonic and petrogenetic histories, to study emplacement and magma flow: the Insizwa sill (Mesozoic Karoo Large Igneous Province, South Africa), Sonju Lake intrusion (Proterozoic Midcontinent Rift, Minnesota, USA), and Palisades sill (Mesozoic rift basin, New Jersey, USA). Multiple fabric analysis techniques were used to define the fabric in each intrusive body. Using digital image analysis techniques on multiple thin sections, the three-dimensional shape-preferred orientation (SPO) of populations of mineral phases were calculated. Low-field anisotropy of magnetic susceptibility (AMS) measurements were used as a proxy for the mineral fabric of the ferromagnetic phases (e.g., magnetite). In addition, a new technique---high-field AMS---was used to isolate the paramagnetic component of the fabric (e.g., silicate fabric). Each fabric analysis technique was then compared to observable field fabrics as a framework for interpretation. In the Insizwa sill, magnetic properties were used to corroborate vertical petrologic zonation and distinguish sub-units within lithologically defined units. Abrupt variation in magnetic properties provides evidence supporting the formation of the Insizwa sill by separate magma intrusions. Low-field AMS fabrics in the Sonju Lake intrusion exhibit consistent SW-plunging lineations and SW-dipping foliations. These fabric orientations provide evidence that the cumulate layers in the intrusion were deposited in a dynamic environment, and indicate magma flowed from southwest to northeast, parallel to the pre-existing rift structures. In the Palisades sill, the magnetite SPO and low-field AMS lineation have developed orthogonal to the plagioclase SPO and high-field AMS lineation. Magma flow in the Palisades magmatic system is interpreted to have originated from a point source feeder. Low-field AMS records the flow direction, whereas high-field AMS records extension within the igneous sheet. The multiple fabric analysis techniques presented in this dissertation have advanced our understanding of the development of fabric and its relationship to internal structure, emplacement, and magma dynamics in mafic igneous systems.

  16. A Systematic Analysis of 2 Monoisocentric Techniques for the Treatment of Multiple Brain Metastases.

    PubMed

    Narayanasamy, Ganesh; Stathakis, Sotirios; Gutierrez, Alonso N; Pappas, Evangelos; Crownover, Richard; Floyd, John R; Papanikolaou, Niko

    2017-10-01

    In this treatment planning study, we compare the plan quality and delivery parameters for the treatment of multiple brain metastases using 2 monoisocentric techniques: the Multiple Metastases Element from Brainlab and the RapidArc volumetric-modulated arc therapy from Varian Medical Systems. Eight patients who were treated in our institution for multiple metastases (3-7 lesions) were replanned with Multiple Metastases Element using noncoplanar dynamic conformal arcs. The same patients were replanned with the RapidArc technique in Eclipse using 4 noncoplanar arcs. Both techniques were designed using a single isocenter. Plan quality metrics (conformity index, homogeneity index, gradient index, and R 50% ), monitor unit, and the planning time were recorded. Comparison of the Multiple Metastases Element and RapidArc plans was performed using Shapiro-Wilk test, paired Student t test, and Wilcoxon signed rank test. A paired Wilcoxon signed rank test between Multiple Metastases Element and RapidArc showed comparable plan quality metrics and dose to brain. Mean ± standard deviation values of conformity index were 1.8 ± 0.7 and 1.7 ± 0.6, homogeneity index were 1.3 ± 0.1 and 1.3 ± 0.1, gradient index were 5.0 ± 1.8 and 5.1 ± 1.9, and R 50% were 4.9 ± 1.8 and 5.0 ± 1.9 for Multiple Metastases Element and RapidArc plans, respectively. Mean brain dose was 2.3 and 2.7 Gy for Multiple Metastases Element and RapidArc plans, respectively. The mean value of monitor units in Multiple Metastases Element plan was 7286 ± 1065, which is significantly lower than the RapidArc monitor units of 9966 ± 1533 ( P < .05). For the planning of multiple brain lesions to be treated with stereotactic radiosurgery, Multiple Metastases Element planning software produced equivalent conformity, homogeneity, dose falloff, and brain V 12 Gy but required significantly lower monitor units, when compared to RapidArc plans.

  17. A Systematic Analysis of 2 Monoisocentric Techniques for the Treatment of Multiple Brain Metastases

    PubMed Central

    Stathakis, Sotirios; Gutierrez, Alonso N.; Pappas, Evangelos; Crownover, Richard; Floyd, John R.; Papanikolaou, Niko

    2016-01-01

    Background: In this treatment planning study, we compare the plan quality and delivery parameters for the treatment of multiple brain metastases using 2 monoisocentric techniques: the Multiple Metastases Element from Brainlab and the RapidArc volumetric-modulated arc therapy from Varian Medical Systems. Methods: Eight patients who were treated in our institution for multiple metastases (3-7 lesions) were replanned with Multiple Metastases Element using noncoplanar dynamic conformal arcs. The same patients were replanned with the RapidArc technique in Eclipse using 4 noncoplanar arcs. Both techniques were designed using a single isocenter. Plan quality metrics (conformity index, homogeneity index, gradient index, and R50%), monitor unit, and the planning time were recorded. Comparison of the Multiple Metastases Element and RapidArc plans was performed using Shapiro-Wilk test, paired Student t test, and Wilcoxon signed rank test. Results: A paired Wilcoxon signed rank test between Multiple Metastases Element and RapidArc showed comparable plan quality metrics and dose to brain. Mean ± standard deviation values of conformity index were 1.8 ± 0.7 and 1.7 ± 0.6, homogeneity index were 1.3 ± 0.1 and 1.3 ± 0.1, gradient index were 5.0 ± 1.8 and 5.1 ± 1.9, and R50% were 4.9 ± 1.8 and 5.0 ± 1.9 for Multiple Metastases Element and RapidArc plans, respectively. Mean brain dose was 2.3 and 2.7 Gy for Multiple Metastases Element and RapidArc plans, respectively. The mean value of monitor units in Multiple Metastases Element plan was 7286 ± 1065, which is significantly lower than the RapidArc monitor units of 9966 ± 1533 (P < .05). Conclusion: For the planning of multiple brain lesions to be treated with stereotactic radiosurgery, Multiple Metastases Element planning software produced equivalent conformity, homogeneity, dose falloff, and brain V12 Gy but required significantly lower monitor units, when compared to RapidArc plans. PMID:27612917

  18. Modelling and multi objective optimization of WEDM of commercially Monel super alloy using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Varun, Sajja; Reddy, Kalakada Bhargav Bal; Vardhan Reddy, R. R. Vishnu

    2016-09-01

    In this research work, a multi-response optimization technique has been developed using traditional desirability analysis and a non-traditional particle swarm optimization technique (for different customers' priorities) in wire electrical discharge machining (WEDM). Monel 400 was selected as the work material for experimentation. The effects of key process parameters, such as pulse-on time (TON), pulse-off time (TOFF), peak current (IP) and wire feed (WF), on material removal rate (MRR) and surface roughness (SR) in the WEDM operation were investigated. Further, the responses MRR and SR were modelled empirically through regression analysis. The developed models can be used by machinists to predict MRR and SR over a wide range of input parameters. The optimization of multiple responses satisfying the priorities of multiple users was performed using the Taguchi-desirability function method and the particle swarm optimization technique. Analysis of variance (ANOVA) was also applied to investigate the effect of the influential parameters. Finally, confirmation experiments were conducted for the optimal set of machining parameters, and the improvement was verified.

  19. Fast algorithm for spectral processing with application to on-line welding quality assurance

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.

    2006-10-01

    A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate, in real time, the plasma electron temperature. The estimation of the electron temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms are analysed and compared, and it is shown that the LPO (linear phase operator) sub-pixel algorithm is the better solution within the proposed system. Experimental tests during TIG welding, using a fibre optic to capture the arc light together with a low-cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
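
    The LPO operator referenced in this abstract is not specified here; the classic alternative for sub-pixel peak localisation is three-point parabolic interpolation, which the sketch below uses as a stand-in. All numerical values are illustrative.

```python
import numpy as np

def subpixel_peak(spectrum, wavelengths):
    """Sub-pixel estimate of a peak's central wavelength by fitting a
    parabola through the maximum sample and its two neighbours.
    (A stand-in for the LPO operator referenced in the abstract.)"""
    i = int(np.argmax(spectrum))
    if i == 0 or i == len(spectrum) - 1:
        return wavelengths[i]                       # peak at grid edge
    y0, y1, y2 = spectrum[i - 1], spectrum[i], spectrum[i + 1]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)    # vertex offset in pixels
    step = wavelengths[i + 1] - wavelengths[i]
    return wavelengths[i] + delta * step

# Toy check: Gaussian line sampled on a coarse 0.1 nm grid
wl = np.linspace(650.0, 660.0, 101)
line = np.exp(-0.5 * ((wl - 654.93) / 0.35) ** 2)
print(subpixel_peak(line, wl))   # close to 654.93 despite the coarse pixels
```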

  20. Eccentricity Fluctuations Make Flow Measurable in High Multiplicity p-p Collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casalderrey-Solana, Jorge; Wiedemann, Urs Achim

    2010-03-12

    Elliptic flow is a hallmark of collectivity in hadronic collisions. Its measurement relies on analysis techniques which require high event multiplicity and so far can only be applied to heavy ion collisions. Here, we delineate the conditions under which elliptic flow becomes measurable in samples of high-multiplicity (dN_ch/dy >= 50) p-p collisions, which will soon be collected at the LHC. We observe that fluctuations in the p-p interaction region can result in a sizable spatial eccentricity even for the most central p-p collisions. Under relatively mild assumptions on the nature of such fluctuations and on the eccentricity scaling of elliptic flow, we find that the resulting elliptic flow signal in high-multiplicity p-p collisions at the LHC becomes measurable with standard techniques.

  1. An Improved Wake Vortex Tracking Algorithm for Multiple Aircraft

    NASA Technical Reports Server (NTRS)

    Switzer, George F.; Proctor, Fred H.; Ahmad, Nashat N.; LimonDuparcmeur, Fanny M.

    2010-01-01

    The accurate tracking of vortex evolution from Large Eddy Simulation (LES) data is a complex and computationally intensive problem. The vortex tracking requires the analysis of very large three-dimensional and time-varying datasets. The complexity of the problem is further compounded by the fact that these vortices are embedded in a background turbulence field, and they may interact with the ground surface. Another level of complication can arise if vortices from multiple aircraft are simulated. This paper presents a new technique for post-processing LES data to obtain wake vortex tracks and wake intensities. The new approach isolates vortices by defining "regions of interest" (ROI) around each vortex and has the ability to identify vortex pairs from multiple aircraft. The paper describes the new methodology for tracking wake vortices and presents application of the technique for single and multiple aircraft.

  2. Modulation and synchronization technique for MF-TDMA system

    NASA Technical Reports Server (NTRS)

    Faris, Faris; Inukai, Thomas; Sayegh, Soheil

    1994-01-01

    This report addresses modulation and synchronization techniques for a multi-frequency time division multiple access (MF-TDMA) system with onboard baseband processing. The types of synchronization techniques analyzed are asynchronous (conventional) TDMA, preambleless asynchronous TDMA, bit synchronous timing with a preamble, and preambleless bit synchronous timing. Among these alternatives, preambleless bit synchronous timing simplifies onboard multicarrier demultiplexer/demodulator designs (about 2:1 reduction in mass and power), requires smaller onboard buffers (10:1 to approximately 3:1 reduction in size), and provides better frame efficiency as well as lower onboard processing delay. Analysis and computer simulation illustrate that this technique can support a bit rate of up to 10 Mbit/s (or higher) with proper selection of design parameters. High bit rate transmission may require Doppler compensation and multiple phase error measurements. The recommended modulation technique for bit synchronous timing is coherent QPSK with differential encoding for the uplink and coherent QPSK for the downlink.

  3. Practical Guidance for Conducting Mediation Analysis With Multiple Mediators Using Inverse Odds Ratio Weighting

    PubMed Central

    Nguyen, Quynh C.; Osypuk, Theresa L.; Schmidt, Nicole M.; Glymour, M. Maria; Tchetgen Tchetgen, Eric J.

    2015-01-01

    Despite the recent flourishing of mediation analysis techniques, many modern approaches are difficult to implement or applicable to only a restricted range of regression models. This report provides practical guidance for implementing a new technique utilizing inverse odds ratio weighting (IORW) to estimate natural direct and indirect effects for mediation analyses. IORW takes advantage of the odds ratio's invariance property and condenses information on the odds ratio for the relationship between the exposure (treatment) and multiple mediators, conditional on covariates, by regressing exposure on mediators and covariates. The inverse of the covariate-adjusted exposure-mediator odds ratio association is used to weight the primary analytical regression of the outcome on treatment. The treatment coefficient in such a weighted regression estimates the natural direct effect of treatment on the outcome, and indirect effects are identified by subtracting direct effects from total effects. Weighting renders treatment and mediators independent, thereby deactivating indirect pathways of the mediators. This new mediation technique accommodates multiple discrete or continuous mediators. IORW is easily implemented and is appropriate for any standard regression model, including quantile regression and survival analysis. An empirical example is given using data from the Moving to Opportunity (1994–2002) experiment, testing whether neighborhood context mediated the effects of a housing voucher program on obesity. Relevant Stata code (StataCorp LP, College Station, Texas) is provided. PMID:25693776
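
    A minimal numerical sketch of the IORW recipe described above, assuming a binary exposure, continuous outcome, and linear outcome model; this is one common implementation, not the authors' Stata code. In practice the weights are often stabilized and standard errors obtained by bootstrap.

```python
import numpy as np
import statsmodels.api as sm

def iorw_direct_effect(y, a, M, C):
    """Inverse odds ratio weighting sketch (one common implementation):
    1) regress the binary exposure a on mediators M and covariates C;
    2) give exposed units weight (1 - p)/p, their inverse predicted
       odds (controls keep weight 1), which breaks the exposure to
       mediator association;
    3) the exposure coefficient in the weighted outcome model then
       estimates the natural direct effect; indirect = total - direct."""
    X = sm.add_constant(np.column_stack([M, C]))
    p = sm.Logit(a, X).fit(disp=0).predict(X)
    w = np.where(a == 1, (1 - p) / p, 1.0)           # inverse odds for exposed
    design = sm.add_constant(np.column_stack([a, C]))
    direct = sm.WLS(y, design, weights=w).fit().params[1]
    total = sm.OLS(y, design).fit().params[1]
    return direct, total - direct
```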

  4. Using AVIRIS data and multiple-masking techniques to map urban forest trees species

    Treesearch

    Q. Xiao; S.L. Ustin; E.G. McPherson

    2004-01-01

    Tree type and species information are critical parameters for urban forest management, benefit cost analysis and urban planning. However, traditionally, these parameters have been derived based on limited field samples in urban forest management practice. In this study we used high-resolution Airborne Visible Infrared Imaging Spectrometer (AVIRIS) data and multiple-...

  5. What Is Wrong with ANOVA and Multiple Regression? Analyzing Sentence Reading Times with Hierarchical Linear Models

    ERIC Educational Resources Information Center

    Richter, Tobias

    2006-01-01

    Most reading time studies using naturalistic texts yield data sets characterized by a multilevel structure: Sentences (sentence level) are nested within persons (person level). In contrast to analysis of variance and multiple regression techniques, hierarchical linear models take the multilevel structure of reading time data into account. They…

  6. Some Applied Research Concerns Using Multiple Linear Regression Analysis.

    ERIC Educational Resources Information Center

    Newman, Isadore; Fraas, John W.

    This paper is intended as an overall reference on how a researcher can apply multiple linear regression so as to utilize the advantages it offers. The advantages of the technique, and some concerns expressed about it, are examined. A number of practical ways by which researchers can deal with such concerns as…

  7. Estimating Interaction Effects With Incomplete Predictor Variables

    PubMed Central

    Enders, Craig K.; Baraldi, Amanda N.; Cham, Heining

    2014-01-01

    The existing missing data literature does not provide a clear prescription for estimating interaction effects with missing data, particularly when the interaction involves a pair of continuous variables. In this article, we describe maximum likelihood and multiple imputation procedures for this common analysis problem. We outline 3 latent variable model specifications for interaction analyses with missing data. These models apply procedures from the latent variable interaction literature to analyses with a single indicator per construct (e.g., a regression analysis with scale scores). We also discuss multiple imputation for interaction effects, emphasizing an approach that applies standard imputation procedures to the product of 2 raw score predictors. We thoroughly describe the process of probing interaction effects with maximum likelihood and multiple imputation. For both missing data handling techniques, we outline centering and transformation strategies that researchers can implement in popular software packages, and we use a series of real data analyses to illustrate these methods. Finally, we use computer simulations to evaluate the performance of the proposed techniques. PMID:24707955

  8. The use of artificial intelligence techniques to improve the multiple payload integration process

    NASA Technical Reports Server (NTRS)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  9. Configural Frequency Analysis as a Statistical Tool for Developmental Research.

    ERIC Educational Resources Information Center

    Lienert, Gustav A.; Oeveste, Hans Zur

    1985-01-01

    Configural frequency analysis (CFA) is suggested as a technique for longitudinal research in developmental psychology. Stability and change in answers to multiple choice and yes-no item patterns obtained with repeated measurements are identified by CFA and illustrated by developmental analysis of an item from Gorham's Proverb Test. (Author/DWH)

  10. Criteria for the use of regression analysis for remote sensing of sediment and pollutants

    NASA Technical Reports Server (NTRS)

    Whitlock, C. H.; Kuo, C. Y.; Lecroy, S. R. (Principal Investigator)

    1982-01-01

    Data analysis procedures for quantification of water quality parameters that are already identified and known to exist within the water body are considered. The linear multiple-regression technique was examined as a procedure for defining and calibrating data analysis algorithms for such instruments as spectrometers and multispectral scanners.

  11. Statistical technique for analysing functional connectivity of multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

    A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and it estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive such that it estimates weak influences; it supports the simultaneous analysis of multiple influences; it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by the neural network model of the leaky integrate and fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. A performance analysis of DS-CDMA and SCPC VSAT networks

    NASA Technical Reports Server (NTRS)

    Hayes, David P.; Ha, Tri T.

    1990-01-01

    Spread-spectrum and single-channel-per-carrier (SCPC) transmission techniques work well in very small aperture terminal (VSAT) networks for multiple-access purposes while allowing the earth station antennas to remain small. Direct-sequence code-division multiple-access (DS-CDMA) is the simplest spread-spectrum technique to use in a VSAT network since a frequency synthesizer is not required for each terminal. An examination is made of the DS-CDMA and SCPC Ku-band VSAT satellite systems for low-density (64-kb/s or less) communications. A method for improving the standard link analysis of DS-CDMA satellite-switched networks by including certain losses is developed. The performance of 50-channel full mesh and star network architectures is analyzed. The selection of operating conditions producing optimum performance is demonstrated.

  13. Rapid Multi-Damage Identification for Health Monitoring of Laminated Composites Using Piezoelectric Wafer Sensor Arrays

    PubMed Central

    Si, Liang; Wang, Qian

    2016-01-01

    Through the use of wave reflections from damage in a structure, a Hilbert spectral analysis-based rapid multi-damage identification (HSA-RMDI) technique with piezoelectric wafer sensor arrays (PWSA) is developed to monitor and identify the presence, location and severity of damage in carbon fiber composite structures. The capability of the rapid multi-damage identification technique to extract and estimate hidden significant information from the collected data and to provide a high-resolution energy-time spectrum can be employed to successfully interpret Lamb wave interactions with single or multiple damage sites. To accomplish precise positioning and effective quantification of multiple damage sites in a composite structure, two functional metrics from the RMDI technique are proposed and used in damage identification: the energy density metric and the energy time-phase shift metric. In the designed damage experimental tests, damage invisible to the naked eye, especially delaminations, was detected in the leftward propagating waves as well as in the selected sensor responses; the time-phase shift spectra located the multiple damage sites, whereas the energy density spectra quantified them. The increasing damage was shown to follow a linear trend calculated by the RMDI technique. All damage cases considered fully demonstrated the potential of the developed RMDI technique as an effective online damage inspection and assessment tool. PMID:27153070
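
    The abstract does not give the HSA-RMDI equations, but its core ingredient, envelope extraction via the Hilbert transform (the analytic signal) to time-localise wave-packet energy, can be sketched minimally. Assumptions: a single echo, known sampling rate, and hypothetical signal parameters.

```python
import numpy as np
from scipy.signal import hilbert

def envelope_tof(signal, fs, excitation_time=0.0):
    """Estimate a wave packet's arrival time from the peak of its
    Hilbert envelope; with a known group velocity v_g this gives the
    reflector (damage) distance as d = v_g * (t_arrival - t_0) / 2."""
    env = np.abs(hilbert(signal))            # analytic-signal magnitude
    t_arrival = np.argmax(env) / fs
    return t_arrival - excitation_time, env

# Toy echo: a 100 kHz tone burst arriving 80 microseconds after excitation
fs = 10e6
t = np.arange(int(2e-4 * fs)) / fs
echo = np.sin(2 * np.pi * 100e3 * t) * np.exp(-0.5 * ((t - 80e-6) / 8e-6) ** 2)
tof, env = envelope_tof(echo, fs)
print(f"time of flight ~ {tof * 1e6:.1f} us")
```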

  14. An observational model for biomechanical assessment of sprint kayaking technique.

    PubMed

    McDonnell, Lisa K; Hume, Patria A; Nolte, Volker

    2012-11-01

    Sprint kayaking stroke phase descriptions for biomechanical analysis of technique vary among kayaking literature, with inconsistencies not conducive for the advancement of biomechanics applied service or research. We aimed to provide a consistent basis for the categorisation and analysis of sprint kayak technique by proposing a clear observational model. Electronic databases were searched using key words kayak, sprint, technique, and biomechanics, with 20 sources reviewed. Nine phase-defining positions were identified within the kayak literature and were divided into three distinct types based on how positions were defined: water-contact-defined positions, paddle-shaft-defined positions, and body-defined positions. Videos of elite paddlers from multiple camera views were reviewed to determine the visibility of positions used to define phases. The water-contact-defined positions of catch, immersion, extraction, and release were visible from multiple camera views, therefore were suitable for practical use by coaches and researchers. Using these positions, phases and sub-phases were created for a new observational model. We recommend that kayaking data should be reported using single strokes and described using two phases: water and aerial. For more detailed analysis without disrupting the basic two-phase model, a four-sub-phase model consisting of entry, pull, exit, and aerial sub-phases should be used.

  15. Multiple-Star System Adaptive Vortex Coronagraphy Using a Liquid Crystal Light Valve

    NASA Astrophysics Data System (ADS)

    Aleksanyan, Artur; Kravets, Nina; Brasselet, Etienne

    2017-05-01

    We propose the development of a high-contrast imaging technique enabling the simultaneous and selective nulling of several light sources. This is done by realizing a reconfigurable multiple-vortex phase mask made of a liquid crystal thin film on which local topological features can be addressed electro-optically. The method is illustrated by reporting on a triple-star optical vortex coronagraphy laboratory demonstration, which can be easily extended to higher multiplicity. These results allow considering the direct observation and analysis of worlds with multiple suns and more complex extrasolar planetary systems.

  16. Performance optimization of spectral amplitude coding OCDMA system using new enhanced multi diagonal code

    NASA Astrophysics Data System (ADS)

    Imtiaz, Waqas A.; Ilyas, M.; Khan, Yousaf

    2016-11-01

    This paper proposes a new code to optimize the performance of spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. The unique two-matrix structure of the proposed enhanced multi diagonal (EMD) code and its effective correlation properties between intended and interfering subscribers significantly elevate the performance of the SAC-OCDMA system by negating multiple access interference (MAI) and the associated phase-induced intensity noise (PIIN). The performance of the SAC-OCDMA system based on the proposed code is thoroughly analyzed for two detection techniques through analytical and simulation analysis, by referring to bit error rate (BER), signal-to-noise ratio (SNR) and eye patterns at the receiving end. It is shown that the EMD code, when used with the SDD technique, provides high transmission capacity, reduces receiver complexity, and performs better than the complementary subtraction detection (CSD) technique. Furthermore, the analysis shows that, for a minimum acceptable BER of 10^-9, the proposed system supports 64 subscribers at data rates of up to 2 Gbps for both uplink and downlink transmission.

  17. Multiple Vehicle Detection and Segmentation in Malaysia Traffic Flow

    NASA Astrophysics Data System (ADS)

    Fariz Hasan, Ahmad; Fikri Che Husin, Mohd; Affendi Rosli, Khairul; Norhafiz Hashim, Mohd; Faiz Zainal Abidin, Amar

    2018-03-01

    Vision-based systems are widely used in the field of Intelligent Transportation Systems (ITS) to extract large amounts of information for analyzing traffic scenes. The rapid growth in the number of vehicles on the road, together with the significant increase in the number of cameras, has dictated the need for traffic surveillance systems. Such systems can take over the burdensome tasks performed by human operators in traffic monitoring centres. The main technique proposed by this paper concentrates on developing multiple vehicle detection and segmentation for monitoring through closed-circuit television (CCTV) video. The system is able to automatically segment vehicles extracted from heavy traffic scenes by optical flow estimation alongside a blob analysis technique in order to detect the moving vehicles. Prior to segmentation, the blob analysis technique computes the area of the interest region corresponding to a moving vehicle, which is used to create a bounding box on that particular vehicle. Experimental validation of the proposed system was performed, and the algorithm is demonstrated on various sets of traffic scenes.
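
    As a rough illustration of the flow-plus-blob pipeline the abstract describes (not the authors' implementation), the sketch below uses OpenCV's dense Farneback optical flow and connected-component statistics; the thresholds are placeholder values, and the opencv-python package is assumed to be installed.

```python
import cv2
import numpy as np

def detect_moving_vehicles(prev_gray, gray, mag_thresh=1.0, min_area=400):
    """Dense optical flow flags moving pixels between two consecutive
    grayscale frames; connected-component (blob) analysis then turns
    the motion mask into per-vehicle bounding boxes."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)                  # per-pixel motion magnitude
    mask = (mag > mag_thresh).astype(np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    boxes = [tuple(stats[i, :4]) for i in range(1, n)   # label 0 is background
             if stats[i, cv2.CC_STAT_AREA] >= min_area]
    return boxes   # (x, y, w, h) per detected vehicle region
```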

  18. Diversity Performance Analysis on Multiple HAP Networks

    PubMed Central

    Dong, Feihong; Li, Min; Gong, Xiangwu; Li, Hongjun; Gao, Fengyue

    2015-01-01

    One of the main design challenges in wireless sensor networks (WSNs) is achieving a high-data-rate transmission for individual sensor devices. The high altitude platform (HAP) is an important communication relay platform for WSNs and next-generation wireless networks. Multiple-input multiple-output (MIMO) techniques provide the diversity and multiplexing gain, which can improve the network performance effectively. In this paper, a virtual MIMO (V-MIMO) model is proposed by networking multiple HAPs with the concept of multiple assets in view (MAV). In a shadowed Rician fading channel, the diversity performance is investigated. The probability density function (PDF) and cumulative distribution function (CDF) of the received signal-to-noise ratio (SNR) are derived. In addition, the average symbol error rate (ASER) with BPSK and QPSK is given for the V-MIMO model. The system capacity is studied for both perfect channel state information (CSI) and unknown CSI individually. The ergodic capacity with various SNR and Rician factors for different network configurations is also analyzed. The simulation results validate the effectiveness of the performance analysis. It is shown that the performance of the HAPs network in WSNs can be significantly improved by utilizing the MAV to achieve overlapping coverage, with the help of the V-MIMO techniques. PMID:26134102

  19. Multiple-wavelength neutron holography with pulsed neutrons

    PubMed Central

    Hayashi, Kouichi; Ohoyama, Kenji; Happo, Naohisa; Matsushita, Tomohiro; Hosokawa, Shinya; Harada, Masahide; Inamura, Yasuhiro; Nitani, Hiroaki; Shishido, Toetsu; Yubuta, Kunio

    2017-01-01

    Local structures around impurities in solids provide important information for understanding the mechanisms of material functions, because most of them are controlled by dopants. For this purpose, the x-ray absorption fine structure method, which provides radial distribution functions around specific elements, is most widely used. However, a similar method using neutron techniques has not yet been developed. If one can establish a method of local structural analysis with neutrons, then a new frontier of materials science can be explored owing to the specific nature of neutron scattering—that is, its high sensitivity to light elements and magnetic moments. Multiple-wavelength neutron holography using the time-of-flight technique with pulsed neutrons has great potential to realize this. We demonstrated multiple-wavelength neutron holography using a Eu-doped CaF2 single crystal and obtained a clear three-dimensional atomic image around trivalent Eu substituted for divalent Ca, revealing an interesting feature of the local structure that allows it to maintain charge neutrality. The new holography technique is expected to provide new information on local structures using the neutron technique. PMID:28835917

  20. Multiple-wavelength neutron holography with pulsed neutrons.

    PubMed

    Hayashi, Kouichi; Ohoyama, Kenji; Happo, Naohisa; Matsushita, Tomohiro; Hosokawa, Shinya; Harada, Masahide; Inamura, Yasuhiro; Nitani, Hiroaki; Shishido, Toetsu; Yubuta, Kunio

    2017-08-01

    Local structures around impurities in solids provide important information for understanding the mechanisms of material functions, because most of them are controlled by dopants. For this purpose, the x-ray absorption fine structure method, which provides radial distribution functions around specific elements, is most widely used. However, a similar method using neutron techniques has not yet been developed. If one can establish a method of local structural analysis with neutrons, then a new frontier of materials science can be explored owing to the specific nature of neutron scattering, that is, its high sensitivity to light elements and magnetic moments. Multiple-wavelength neutron holography using the time-of-flight technique with pulsed neutrons has great potential to realize this. We demonstrated multiple-wavelength neutron holography using a Eu-doped CaF2 single crystal and obtained a clear three-dimensional atomic image around trivalent Eu substituted for divalent Ca, revealing an interesting feature of the local structure that allows it to maintain charge neutrality. The new holography technique is expected to provide new information on local structures using the neutron technique.

  1. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function

    PubMed Central

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D.

    2009-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings. PMID:19834575
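
    A simplified sketch of the cross-modality linkage idea: plain FastICA is run on each modality and the subject loadings are correlated across components. This is a stand-in for paraICA, whose joint optimization and overfitting controls are not reproduced here; the data are synthetic.

    ```python
    # Correlating per-modality ICA subject loadings (simplified paraICA idea).
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(1)
    fmri = rng.standard_normal((60, 500))   # 60 subjects x 500 voxels (toy)
    snp = rng.standard_normal((60, 300))    # 60 subjects x 300 SNPs (toy)

    ica_f = FastICA(n_components=5, random_state=0)
    ica_g = FastICA(n_components=5, random_state=0)
    load_f = ica_f.fit_transform(fmri)      # subject loadings, fMRI components
    load_g = ica_g.fit_transform(snp)       # subject loadings, SNP components

    # Cross-modality linkage: correlate subject loadings between components.
    link = np.corrcoef(load_f.T, load_g.T)[:5, 5:]
    i, j = np.unravel_index(np.abs(link).argmax(), link.shape)
    print(f"strongest fMRI-SNP component link: r = {link[i, j]:.2f} ({i} vs {j})")
    ```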

  2. A Parallel Independent Component Analysis Approach to Investigate Genomic Influence on Brain Function.

    PubMed

    Liu, Jingyu; Demirci, Oguz; Calhoun, Vince D

    2008-01-01

    Relationships between genomic data and functional brain images are of great interest but require new analysis approaches to integrate the high-dimensional data types. This letter presents an extension of a technique called parallel independent component analysis (paraICA), which enables the joint analysis of multiple modalities including interconnections between them. We extend our earlier work by allowing for multiple interconnections and by providing important overfitting controls. Performance was assessed by simulations under different conditions, and indicated reliable results can be extracted by properly balancing overfitting and underfitting. An application to functional magnetic resonance images and single nucleotide polymorphism array produced interesting findings.

  3. Computerized multiple image analysis on mammograms: performance improvement of nipple identification for registration of multiple views using texture convergence analyses

    NASA Astrophysics Data System (ADS)

    Zhou, Chuan; Chan, Heang-Ping; Sahiner, Berkman; Hadjiiski, Lubomir M.; Paramagul, Chintana

    2004-05-01

    Automated registration of multiple mammograms for CAD depends on accurate nipple identification. We developed two new image analysis techniques based on geometric and texture convergence analyses to improve the performance of our previously developed nipple identification method. A gradient-based algorithm is used to automatically track the breast boundary. The nipple search region along the boundary is then defined by geometric convergence analysis of the breast shape. Three nipple candidates are identified by detecting the changes along the gray level profiles inside and outside the boundary and the changes in the boundary direction. A texture orientation-field analysis method is developed to estimate the fourth nipple candidate based on the convergence of the tissue texture pattern towards the nipple. The final nipple location is determined from the four nipple candidates by a confidence analysis. Our training and test data sets consisted of 419 and 368 randomly selected mammograms, respectively. The nipple location identified on each image by an experienced radiologist was used as the ground truth. For 118 of the training and 70 of the test images, the radiologist could not positively identify the nipple, but provided an estimate of its location. These were referred to as invisible nipple images. In the training data set, 89.37% (269/301) of the visible nipples and 81.36% (96/118) of the invisible nipples could be detected within 1 cm of the truth. In the test data set, 92.28% (275/298) of the visible nipples and 67.14% (47/70) of the invisible nipples were identified within 1 cm of the truth. In comparison, our previous nipple identification method without using the two convergence analysis techniques detected 82.39% (248/301), 77.12% (91/118), 89.93% (268/298) and 54.29% (38/70) of the nipples within 1 cm of the truth for the visible and invisible nipples in the training and test sets, respectively. The results indicate that the nipple on mammograms can be detected accurately. This will be an important step towards automatic multiple image analysis for CAD techniques.

  4. An improved large-field focusing schlieren system

    NASA Technical Reports Server (NTRS)

    Weinstein, Leonard M.

    1991-01-01

    The analysis and performance of a high-brightness large-field focusing schlieren system is described. The system can be used to examine complex two- and three-dimensional flows. Techniques are described to obtain focusing schlieren through distorting optical elements, to use multiple colors in a time multiplexing technique, and to use diffuse screen holography for three-dimensional photographs.

  5. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA. Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using the coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  6. HBCU Efficiency and Endowments: An Exploratory Analysis

    ERIC Educational Resources Information Center

    Coupet, Jason; Barnum, Darold

    2010-01-01

    Discussions of efficiency among Historically Black Colleges and Universities (HBCUs) are often missing in academic conversations. This article seeks to assess efficiency of individual HBCUs using Data Envelopment Analysis (DEA), a non-parametric technique that can synthesize multiple inputs and outputs to determine a single efficiency score for…
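
    A minimal sketch of the DEA building block named above: the input-oriented CCR efficiency score for each decision-making unit (DMU), solved as a linear program; the input/output matrix is invented for illustration, not HBCU data.

    ```python
    # Input-oriented CCR DEA: minimize theta s.t. the composite peer uses
    # at most theta of DMU k's inputs and at least its outputs.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[3.0, 5.0], [4.0, 2.0], [6.0, 4.0]])   # inputs, row per DMU
    Y = np.array([[40.0], [30.0], [50.0]])                # outputs, row per DMU
    n, m, r = X.shape[0], X.shape[1], Y.shape[1]

    for k in range(n):
        # Variables z = [theta, lambda_1..lambda_n]; minimize theta.
        c = np.r_[1.0, np.zeros(n)]
        A_in = np.c_[-X[k].reshape(m, 1), X.T]    # sum lam*x_ij <= theta*x_ik
        A_out = np.c_[np.zeros((r, 1)), -Y.T]     # sum lam*y_rj >= y_rk
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[k]],
                      bounds=[(None, None)] + [(0, None)] * n)
        print(f"DMU {k}: efficiency = {res.x[0]:.3f}")
    ```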

  7. Applications of multi-frequency single beam sonar fisheries analysis methods for seep quantification and characterization

    NASA Astrophysics Data System (ADS)

    Price, V.; Weber, T.; Jerram, K.; Doucet, M.

    2016-12-01

    The analysis of multi-frequency, narrow-band single-beam acoustic data for fisheries applications has long been established, with methodology focusing on characterizing targets in the water column by utilizing complex algorithms and false-color time series data to create and compare frequency response curves for dissimilar biological groups. These methods were built on concepts developed for multi-frequency analysis of satellite imagery in terrestrial studies and have been applied to a broad range of data types and applications. Single-beam systems operating at multiple frequencies are also used for the detection and identification of seeps in water column data. Here we apply the same analysis and visualization techniques used for fisheries applications to characterize and quantify seeps, by creating and comparing frequency response curves and applying false coloration to shallow and deep multi-channel seep data. From this information, we can establish methods to differentiate bubble size in the echogram and to differentiate seep composition. These techniques are also useful in distinguishing plume content from biological noise (volume reverberation) created by euphausiid layers and fish with gas-filled swim bladders. Combining the multiple frequencies using false coloring and other image analysis techniques, after applying established normalization and beam-pattern correction algorithms, is a novel approach to quantitatively describing seeps. Further, this information could be paired with geological models, backscatter, and bathymetry data to assess seep distribution.
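
    A minimal sketch of the false-coloring step: mapping volume backscatter (Sv) echograms at three frequencies to the R, G and B channels; the frequency assignment and dB range are assumptions for illustration.

    ```python
    # False-colour composite from three Sv echograms (dB), e.g. mapping
    # 18/38/120 kHz to R/G/B; channel names and dB range are illustrative.
    import numpy as np

    def false_colour(sv18, sv38, sv120, lo=-90.0, hi=-40.0):
        """Stack three Sv echograms (dB) into an RGB image in [0, 1]."""
        def norm(ch):
            return np.clip((ch - lo) / (hi - lo), 0.0, 1.0)
        return np.dstack([norm(sv18), norm(sv38), norm(sv120)])

    # Toy echograms: depth x ping matrices of Sv values in dB.
    rng = np.random.default_rng(2)
    rgb = false_colour(*(rng.uniform(-90, -40, (200, 300)) for _ in range(3)))
    print(rgb.shape, rgb.min(), rgb.max())
    ```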

  8. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensable technique used in the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as for biological studies. We herein describe the application of large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  9. Synchronous in-field application of life-detection techniques in planetary analog missions

    NASA Astrophysics Data System (ADS)

    Amador, Elena S.; Cable, Morgan L.; Chaudry, Nosheen; Cullen, Thomas; Gentry, Diana; Jacobsen, Malene B.; Murukesan, Gayathri; Schwieterman, Edward W.; Stevens, Adam H.; Stockton, Amanda; Yin, Chang; Cullen, David C.; Geppert, Wolf

    2015-02-01

    Field expeditions that simulate the operations of robotic planetary exploration missions at analog sites on Earth can help establish best practices and are therefore a positive contribution to the planetary exploration community. There are many sites in Iceland that possess heritage as planetary exploration analog locations and whose environmental extremes make them suitable for simulating scientific sampling and robotic operations. We conducted a planetary exploration analog mission at two recent lava fields in Iceland, Fimmvörðuháls (2010) and Eldfell (1973), using a specially developed field laboratory. We tested the utility of in-field sampling-site down-selection and tiered-analysis operational capabilities with three life detection and characterization techniques: fluorescence microscopy (FM), adenosine triphosphate (ATP) bioluminescence assay, and quantitative polymerase chain reaction (qPCR) assay. The study made use of multiple cycles of sample collection at multiple distance scales and field laboratory analysis using the synchronous life-detection techniques to heuristically develop the continuing sampling and analysis strategy during the expedition. Here we report the operational lessons learned and provide brief summaries of scientific data. The full scientific data report will follow separately. We found that rapid in-field analysis to determine subsequent sampling decisions is operationally feasible, and that the chosen life detection and characterization techniques are suitable for a terrestrial life-detection field mission. In-field analysis enables the rapid acquisition of scientific data and thus facilitates the collection of the most scientifically relevant samples within a single field expedition, without the need for sample relocation to external laboratories. The operational lessons learned in this study could be applied to future terrestrial field expeditions employing other analytical techniques and to future robotic planetary exploration missions.

  10. 3D fluid-structure modelling and vibration analysis for fault diagnosis of Francis turbine using multiple ANN and multiple ANFIS

    NASA Astrophysics Data System (ADS)

    Saeed, R. A.; Galybin, A. N.; Popov, V.

    2013-01-01

    This paper discusses condition monitoring and fault diagnosis in Francis turbines based on the integration of numerical modelling with several different artificial intelligence (AI) techniques. In this study, a numerical approach for fluid-structure (turbine runner) analysis is presented. The results of the numerical analysis provide frequency response function (FRF) data sets along the x-, y- and z-directions under different operating loads and different positions and sizes of faults in the structure. To extract features and reduce the dimensionality of the obtained FRF data, principal component analysis (PCA) has been applied. Subsequently, the extracted features are formulated and fed into multiple artificial neural networks (ANN) and multiple adaptive neuro-fuzzy inference systems (ANFIS) in order to identify the size and position of the damage in the runner and estimate the turbine operating conditions. The results demonstrate the effectiveness of this approach and provide satisfactory accuracy even when the input data are corrupted with a certain level of noise.
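
    A minimal sketch of the PCA-then-ANN stage described above, assuming FRF feature vectors and fault size/position targets; the data are synthetic placeholders and the ANFIS branch is not included.

    ```python
    # PCA for dimensionality reduction, then an MLP regressing fault
    # size and position from FRF features; all data are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(3)
    X = rng.standard_normal((150, 600))          # 150 cases x 600 FRF samples
    y = rng.uniform(0, 1, (150, 2))              # [fault size, fault position]

    model = make_pipeline(PCA(n_components=20),
                          MLPRegressor(hidden_layer_sizes=(32, 16),
                                       max_iter=2000, random_state=0))
    model.fit(X[:120], y[:120])
    print("held-out R^2:", model.score(X[120:], y[120:]))
    ```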

  11. Multimodal Neuroimaging: Basic Concepts and Classification of Neuropsychiatric Diseases.

    PubMed

    Tulay, Emine Elif; Metin, Barış; Tarhan, Nevzat; Arıkan, Mehmet Kemal

    2018-06-01

    Neuroimaging techniques are widely used in neuroscience to visualize neural activity, to improve our understanding of brain mechanisms, and to identify biomarkers, especially for psychiatric diseases; however, each neuroimaging technique has several limitations. These limitations led to the development of multimodal neuroimaging (MN), which combines data obtained from multiple neuroimaging techniques, such as electroencephalography and functional magnetic resonance imaging, and yields more detailed information about brain dynamics. There are several types of MN, including visual inspection, data integration, and data fusion. This literature review aimed to provide a brief summary and basic information about MN techniques (data fusion approaches in particular) and classification approaches. Data fusion approaches are generally categorized as asymmetric and symmetric. The present review focused exclusively on studies based on symmetric data fusion methods (data-driven methods), such as independent component analysis and principal component analysis. Machine learning techniques have recently been introduced for use in identifying diseases and biomarkers of disease. The machine learning technique most widely used by neuroscientists is classification, especially support vector machine classification. Several studies differentiated patients with psychiatric diseases from healthy controls using combined datasets. The common conclusion among these studies is that the prediction of diseases improves when data are combined via MN techniques; however, there remain a few challenges associated with MN, such as sample size. Perhaps in the future N-way fusion can be used to combine multiple neuroimaging techniques or nonimaging predictors (e.g., cognitive ability) to overcome the limitations of MN.
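
    A minimal sketch of the feature-level data-integration idea with SVM classification discussed above; the EEG/fMRI feature matrices and labels are synthetic placeholders.

    ```python
    # Concatenate features from two modalities, then classify with an SVM.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(8)
    eeg = rng.standard_normal((80, 40))      # 80 subjects x 40 EEG features
    fmri = rng.standard_normal((80, 60))     # 80 subjects x 60 fMRI features
    y = rng.integers(0, 2, 80)               # patient vs control labels

    fused = np.hstack([eeg, fmri])           # simple feature-level fusion
    scores = cross_val_score(SVC(kernel="linear"), fused, y, cv=5)
    print(f"mean CV accuracy: {scores.mean():.2f}")
    ```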

  12. Moving beyond Univariate Post-Hoc Testing in Exercise Science: A Primer on Descriptive Discriminate Analysis

    ERIC Educational Resources Information Center

    Barton, Mitch; Yeatts, Paul E.; Henson, Robin K.; Martin, Scott B.

    2016-01-01

    There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent…

  13. New Software for Market Segmentation Analysis: A Chi-Square Interaction Detector. AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Lay, Robert S.

    The advantages and disadvantages of new software for market segmentation analysis are discussed, and the application of this new, chi-square based procedure (CHAID), is illustrated. A comparison is presented of an earlier, binary segmentation technique (THAID) and a multiple discriminant analysis. It is suggested that CHAID is superior to earlier…

  14. A Review on Spectral Amplitude Coding Optical Code Division Multiple Access

    NASA Astrophysics Data System (ADS)

    Kaur, Navpreet; Goyal, Rakesh; Rani, Monika

    2017-06-01

    This manuscript deals with the analysis of spectral amplitude coding optical code division multiple access (SAC-OCDMA) systems. The major noise source in optical CDMA is co-channel interference from other users, known as multiple access interference (MAI). The system performance in terms of bit error rate (BER) degrades as a result of increased MAI. The number of users and the type of codes used in the optical system directly decide the performance of the system. MAI can be restricted by efficiently designing optical codes and implementing them with a unique architecture to accommodate a larger number of users. Hence, it is necessary to design a technique such as the spectral direct detection (SDD) technique with a modified double weight code, which can provide better cardinality and good correlation properties.

  15. Combining lipophilic dye, in situ hybridization, immunohistochemistry, and histology.

    PubMed

    Duncan, Jeremy; Kersigo, Jennifer; Gray, Brian; Fritzsch, Bernd

    2011-03-17

    Going beyond single gene function to cut deeper into gene regulatory networks requires multiple mutations combined in a single animal. Such analysis of two or more genes needs to be complemented with in situ hybridization of other genes, or immunohistochemistry of their proteins, both in whole-mounted developing organs and in sections, for detailed resolution of the cellular and tissue expression alterations. Combining multiple gene alterations requires the use of Cre or flippase to conditionally delete genes and avoid embryonic lethality. The required breeding schemes dramatically increase effort and cost in proportion to the number of genes mutated, with an outcome of very few animals carrying the full repertoire of genetic modifications desired. Amortizing the vast amount of effort and time needed to obtain these few precious specimens carrying multiple mutations necessitates tissue optimization. Moreover, investigating a single animal with multiple techniques makes it easier to correlate gene deletion defects with expression profiles. We have developed a technique to obtain a more thorough analysis of a given animal, with the ability to analyze several different histologically recognizable structures as well as gene and protein expression, all from the same specimen, in both whole-mounted organs and sections. Although mice have been utilized to demonstrate the effectiveness of this technique, it can be applied to a wide array of animals. To do this we combine lipophilic dye tracing, whole-mount in situ hybridization, immunohistochemistry, and histology to extract the maximal possible amount of data.

  16. Combining Lipophilic dye, in situ Hybridization, Immunohistochemistry, and Histology

    PubMed Central

    Duncan, Jeremy; Kersigo, Jennifer; Gray, Brian; Fritzsch, Bernd

    2011-01-01

    Going beyond single gene function to cut deeper into gene regulatory networks requires multiple mutations combined in a single animal. Such analysis of two or more genes needs to be complemented with in situ hybridization of other genes, or immunohistochemistry of their proteins, both in whole-mounted developing organs and in sections, for detailed resolution of the cellular and tissue expression alterations. Combining multiple gene alterations requires the use of Cre or flippase to conditionally delete genes and avoid embryonic lethality. The required breeding schemes dramatically increase effort and cost in proportion to the number of genes mutated, with an outcome of very few animals carrying the full repertoire of genetic modifications desired. Amortizing the vast amount of effort and time needed to obtain these few precious specimens carrying multiple mutations necessitates tissue optimization. Moreover, investigating a single animal with multiple techniques makes it easier to correlate gene deletion defects with expression profiles. We have developed a technique to obtain a more thorough analysis of a given animal, with the ability to analyze several different histologically recognizable structures as well as gene and protein expression, all from the same specimen, in both whole-mounted organs and sections. Although mice have been utilized to demonstrate the effectiveness of this technique, it can be applied to a wide array of animals. To do this we combine lipophilic dye tracing, whole-mount in situ hybridization, immunohistochemistry, and histology to extract the maximal possible amount of data. PMID:21445047

  17. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  18. Cerebrovascular pattern improved by ozone autohemotherapy: an entropy-based study on multiple sclerosis patients.

    PubMed

    Molinari, Filippo; Rimini, Daniele; Liboni, William; Acharya, U Rajendra; Franzini, Marianno; Pandolfi, Sergio; Ricevuti, Giovanni; Vaiano, Francesco; Valdenassi, Luigi; Simonetti, Vincenzo

    2017-08-01

    Ozone major autohemotherapy is effective in reducing the symptoms of multiple sclerosis (MS) patients, but its effects on the brain are still not clear. In this work, we monitored the changes in the cerebrovascular pattern of MS patients and normal subjects during major ozone autohemotherapy, using near-infrared spectroscopy (NIRS) as a functional and vascular technique. NIRS signals were analyzed using a combination of time-domain analysis, time-frequency analysis, and nonlinear analysis of the intrinsic mode function signals obtained from the empirical mode decomposition technique. Our results show an improvement in the cerebrovascular pattern of all subjects, indicated by increased entropy of the NIRS signals. Hence, we can conclude that ozone therapy increases brain metabolism and helps recovery from the lower activity levels that are predominant in MS patients.

  19. Automated Track Recognition and Event Reconstruction in Nuclear Emulsion

    NASA Technical Reports Server (NTRS)

    Deines-Jones, P.; Cherry, M. L.; Dabrowska, A.; Holynski, R.; Jones, W. V.; Kolganova, E. D.; Kudzia, D.; Nilsen, B. S.; Olszewski, A.; Pozharova, E. A.; hide

    1998-01-01

    The major advantages of nuclear emulsion for detecting charged particles are its submicron position resolution and sensitivity to minimum ionizing particles. These must be balanced, however, against the difficult manual microscope measurement by skilled observers required for the analysis. We have developed an automated system to acquire and analyze the microscope images from emulsion chambers. Each emulsion plate is analyzed independently, allowing coincidence techniques to be used in order to reject background and estimate error rates. The system has been used to analyze a sample of high-multiplicity Pb-Pb interactions (charged particle multiplicities approx. 1100) produced by the 158 GeV/c per nucleon Pb-208 beam at CERN. Automatically reconstructed track lists agree with our best manual measurements to 3%. We describe the image analysis and track reconstruction techniques, and discuss the measurement and reconstruction uncertainties.

  20. A Versatile Integrated Ambient Ionization Source Platform.

    PubMed

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-30

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and the laser desorption (LD) technique. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  1. A Versatile Integrated Ambient Ionization Source Platform

    NASA Astrophysics Data System (ADS)

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-01

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and the laser desorption (LD) technique. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  2. [Progress in industrial bioprocess engineering in China].

    PubMed

    Zhuang, Yingping; Chen, Hongzhang; Xia, Jianye; Tang, Wenjun; Zhao, Zhimin

    2015-06-01

    The advances of industrial biotechnology depend highly on the development of industrial bioprocess research. In China, we face several challenges because of the huge national industrial fermentation capacity. Industrial bioprocess development has gone through several main stages. This work reviews the development of industrial bioprocess engineering in China during the past 30 to 40 years, covering the early-stage kinetic model studies derived from classical chemical engineering, research methods based on control theory, multi-parameter analysis techniques with on-line measuring instruments, multi-scale analysis theory, and solid-state fermentation techniques and fermenters. In addition, the cutting edge of bioprocess engineering is also addressed.

  3. Multifunctional, three-dimensional tomography for analysis of electrohydrodynamic jetting

    NASA Astrophysics Data System (ADS)

    Nguyen, Xuan Hung; Gim, Yeonghyeon; Ko, Han Seo

    2015-05-01

    A three-dimensional optical tomography technique was developed to reconstruct three-dimensional objects using a set of two-dimensional shadowgraphic images and normal gray images. From three high-speed cameras, which were positioned at an offset angle of 45° between each other, number, size, and location of electrohydrodynamic jets with respect to the nozzle position were analyzed using shadowgraphic tomography employing multiplicative algebraic reconstruction technique (MART). Additionally, a flow field inside a cone-shaped liquid (Taylor cone) induced under an electric field was observed using a simultaneous multiplicative algebraic reconstruction technique (SMART), a tomographic method for reconstructing light intensities of particles, combined with three-dimensional cross-correlation. Various velocity fields of circulating flows inside the cone-shaped liquid caused by various physico-chemical properties of liquid were also investigated.
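
    A hedged sketch of the MART update at the heart of such reconstructions, written for a generic linear tomography system Ax = b; the system matrix, relaxation factor and toy object are illustrative, not the paper's configuration.

    ```python
    # Multiplicative algebraic reconstruction technique (MART):
    # row-wise multiplicative updates that keep the estimate positive.
    import numpy as np

    def mart(A, b, n_iter=50, relax=0.5):
        """Reconstruct x >= 0 from ray sums b = A x via MART."""
        x = np.ones(A.shape[1])
        for _ in range(n_iter):
            for i in range(A.shape[0]):
                proj = A[i] @ x
                if proj > 0 and b[i] > 0:
                    x *= (b[i] / proj) ** (relax * A[i])
        return x

    # Toy 2-pixel, 3-ray example with a known object.
    A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    x_true = np.array([2.0, 3.0])
    print(mart(A, A @ x_true))   # should approach [2, 3]
    ```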

  4. Development of Techniques for Multiple Data Stream Analysis and Short- Term Forecasting. Volume I. Multiple Data Stream Analysis

    DTIC Science & Technology

    1975-11-15

    ... This table shows great similarity between the NYT and TOL as follows; ... from which the data have been derived. The authors challenge the contention by other data collectors that variation in interaction data derived from ...

  5. Application of higher harmonic blade feathering for helicopter vibration reduction

    NASA Technical Reports Server (NTRS)

    Powers, R. W.

    1978-01-01

    Higher harmonic blade feathering for helicopter vibration reduction is considered. Recent wind tunnel tests confirmed the effectiveness of higher harmonic control in reducing articulated rotor vibratory hub loads. Several predictive analyses developed in support of the NASA program were shown to be capable of calculating single harmonic control inputs required to minimize a single 4P hub response. In addition, a multiple-input, multiple-output harmonic control predictive analysis was developed. All techniques developed thus far obtain a solution by extracting empirical transfer functions from sampled data. Algorithm data sampling and processing requirements are minimal to encourage adaptive control system application of such techniques in a flight environment.

  6. Multiple stage MS in analysis of plasma, serum, urine and in vitro samples relevant to clinical and forensic toxicology.

    PubMed

    Meyer, Golo M; Maurer, Hans H; Meyer, Markus R

    2016-01-01

    This paper reviews MS approaches applied to metabolism studies, structure elucidation and qualitative or quantitative screening of drugs (of abuse) and/or their metabolites. Applications in clinical and forensic toxicology were included using blood plasma or serum, urine, in vitro samples, liquids, solids or plant material. Techniques covered are liquid chromatography coupled to low-resolution and high-resolution multiple stage mass analyzers. Only PubMed listed studies published in English between January 2008 and January 2015 were considered. Approaches are discussed focusing on sample preparation and mass spectral settings. Comments on advantages and limitations of these techniques complete the review.

  7. Phased-mission system analysis using Boolean algebraic methods

    NASA Technical Reports Server (NTRS)

    Somani, Arun K.; Trivedi, Kishor S.

    1993-01-01

    Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, the system configuration, and the success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state space explosion that commonly plagues Markov chain-based analyses. A phase algebra was developed to account for the effects of variable configurations and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. The use of our technique is demonstrated by means of an example, and numerical results are presented to show the effects of mission phases on the system reliability.

  8. Time-Frequency Analysis of the Dispersion of Lamb Modes

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Seale, Michael D.; Smith, Barry T.

    1999-01-01

    Accurate knowledge of the velocity dispersion of Lamb modes is important for ultrasonic nondestructive evaluation methods used in detecting and locating flaws in thin plates and in determining their elastic stiffness coefficients. Lamb mode dispersion is also important in the acoustic emission technique for accurately triangulating the location of emissions in thin plates. In this research, the ability to characterize Lamb mode dispersion through a time-frequency analysis (the pseudo-Wigner-Ville distribution) was demonstrated. A major advantage of time-frequency methods is the ability to analyze acoustic signals containing multiple propagation modes, which overlap and superimpose in the time domain signal. By combining time-frequency analysis with a broadband acoustic excitation source, the dispersion of multiple Lamb modes over a wide frequency range can be determined from as little as a single measurement. In addition, the technique provides a direct measurement of the group velocity dispersion. The technique was first demonstrated in the analysis of a simulated waveform in an aluminum plate in which the Lamb mode dispersion was well known. Portions of the dispersion curves of the A0, A1, S0, and S2 Lamb modes were obtained from this one waveform. The technique was also applied for the analysis of experimental waveforms from a unidirectional graphite/epoxy composite plate. Measurements were made both along and perpendicular to the fiber direction. In this case, the signals contained only the lowest order symmetric and antisymmetric modes. A least squares fit of the results from several source to detector distances was used. Theoretical dispersion curves were calculated and are shown to be in good agreement with experimental results.

  9. Selectivity/Specificity Improvement Strategies in Surface-Enhanced Raman Spectroscopy Analysis

    PubMed Central

    Wang, Feng; Cao, Shiyu; Yan, Ruxia; Wang, Zewei; Wang, Dan; Yang, Haifeng

    2017-01-01

    Surface-enhanced Raman spectroscopy (SERS) is a powerful technique for the discrimination, identification, and potential quantification of certain compounds/organisms. However, its real application is challenging due to the multiple interference from the complicated detection matrix. Therefore, selective/specific detection is crucial for the real application of SERS technique. We summarize in this review five selective/specific detection techniques (chemical reaction, antibody, aptamer, molecularly imprinted polymers and microfluidics), which can be applied for the rapid and reliable selective/specific detection when coupled with SERS technique. PMID:29160798

  10. Designing to Support Command and Control in Urban Firefighting

    DTIC Science & Technology

    2008-06-01

    ... complex human-machine systems. Keywords: command and control, firefighting, cognitive systems engineering, cognitive task analysis.

  11. Model for spectral and chromatographic data

    DOEpatents

    Jarman, Kristin [Richland, WA; Willse, Alan [Richland, WA; Wahl, Karen [Richland, WA; Wahl, Jon [Richland, WA

    2002-11-26

    A method and apparatus using a spectral analysis technique are disclosed. In one form of the invention, probabilities are selected to characterize the presence (and in another form, also a quantification of a characteristic) of peaks in an indexed data set for samples that match a reference species, and other probabilities are selected for samples that do not match the reference species. An indexed data set is acquired for a sample, and a determination is made according to techniques exemplified herein as to whether the sample matches or does not match the reference species. When quantification of peak characteristics is undertaken, the model is appropriately expanded, and the analysis accounts for the characteristic model and data. Further techniques are provided to apply the methods and apparatuses to process control, cluster analysis, hypothesis testing, analysis of variance, and other procedures involving multiple comparisons of indexed data.
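
    A minimal sketch of the underlying idea, assuming a Bernoulli model of peak presence: a log-likelihood ratio compares the match and non-match probabilities for an observed peak pattern; all probabilities are invented for illustration.

    ```python
    # Likelihood-ratio score for a sample's peak pattern against a
    # reference species; positive values favour a match.
    import numpy as np

    p_match = np.array([0.9, 0.8, 0.7, 0.1])    # P(peak | match)
    p_other = np.array([0.2, 0.3, 0.4, 0.5])    # P(peak | non-match)
    sample = np.array([1, 1, 0, 0])             # observed peaks at 4 indices

    llr = np.sum(sample * np.log(p_match / p_other)
                 + (1 - sample) * np.log((1 - p_match) / (1 - p_other)))
    print(f"log-likelihood ratio = {llr:.2f} "
          f"({'match' if llr > 0 else 'no match'})")
    ```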

  12. A novel method for tracing the movement of multiple individual soil particles under rainfall conditions using florescent videography.

    NASA Astrophysics Data System (ADS)

    Hardy, Robert; Pates, Jackie; Quinton, John

    2016-04-01

    The importance of developing new techniques to study soil movement cannot be overstated, especially techniques that integrate new technology. Currently there are limited empirical data available about the movement of individual soil particles, particularly high quality time-resolved data. Here we present a new technique which allows multiple individual soil particles to be traced in real time under simulated rainfall conditions. The technique utilises fluorescent videography in combination with a fluorescent soil tracer based on natural particles. The system has been successfully used on particles greater than ~130 micrometres in diameter. The technique uses HD video shot at 50 frames per second, providing extremely high temporal (0.02 s) and spatial (sub-millimetre) resolution of a particle's location without the need to perturb the system. Once the tracer has been filmed, the images are processed and analysed using a particle analysis and visualisation toolkit written in Python. The toolkit enables the creation of 2- and 3-D time-resolved graphs showing the location of one or more particles. Quantitative numerical analysis of a pathway (or collection of pathways) is also possible, allowing parameters such as particle speed and displacement to be assessed. Filming the particles removes the need to destructively sample material and has many side-benefits, reducing the time, money and effort expended in the collection, transport and laboratory analysis of soils, while delivering data in a digital form well suited to modern computer-driven analysis techniques. There are many potential applications for the technique. High resolution empirical data on how soil particles move could be used to create, parameterise and evaluate soil movement models, particularly those that use the movement of individual particles. As data can be collected while rainfall is occurring, the technique may offer the ability to study systems under dynamic conditions (rather than rainfall of a constant intensity), which are more realistic; this was one motivation behind the development of this technique.
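
    A minimal sketch of the per-frame localisation step such a system needs, assuming frames arrive as 2-D arrays; the threshold and toy frame are illustrative, and the published toolkit itself is not reproduced.

    ```python
    # Locate bright fluorescent particles in one video frame by
    # thresholding, labelling blobs and taking intensity centroids.
    import numpy as np
    from scipy import ndimage

    def locate_particles(frame, threshold=50):
        """Return (row, col) centroids of bright blobs in one frame."""
        mask = frame > threshold
        labels, n = ndimage.label(mask)
        return ndimage.center_of_mass(frame, labels, range(1, n + 1))

    # Toy frame with two bright particles.
    frame = np.zeros((100, 100))
    frame[20:23, 30:33] = 200.0
    frame[70:74, 60:64] = 180.0
    for r, c in locate_particles(frame):
        print(f"particle at row {r:.1f}, col {c:.1f}")
    ```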

  13. Visualization techniques for computer network defense

    NASA Astrophysics Data System (ADS)

    Beaver, Justin M.; Steed, Chad A.; Patton, Robert M.; Cui, Xiaohui; Schultz, Matthew

    2011-06-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  14. Application of optical correlation techniques to particle imaging velocimetry

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.; Edwards, Robert V.

    1988-01-01

    Pulsed laser sheet velocimetry yields nonintrusive measurements of velocity vectors across an extended two-dimensional region of the flow field. The application of optical correlation techniques to the analysis of multiple exposure laser light sheet photographs can reduce and/or simplify the data reduction time and hardware. Here, Matched Spatial Filters (MSF) are used in a pattern recognition system. Usually MSFs are used to identify assembly-line parts; in this application, they are used to identify the iso-velocity vector contours in the flow. The patterns to be recognized are the recorded particle images in a pulsed laser light sheet photograph. Measurement of the direction of the particle image displacements between exposures yields the velocity vector. The particle image exposure sequence is designed such that the velocity vector direction is determined unambiguously. A global analysis technique is used, in contrast to the more common particle tracking algorithms and Young's fringe analysis technique.
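
    A hedged sketch of the digital analogue of this optical correlation: FFT-based cross-correlation of two interrogation windows to recover the particle-image displacement; the window contents and the imposed shift are synthetic.

    ```python
    # FFT cross-correlation of two PIV interrogation windows; the
    # correlation peak gives the displacement between exposures.
    import numpy as np

    def displacement(win_a, win_b):
        """Peak of the circular cross-correlation between two windows."""
        corr = np.fft.ifft2(np.fft.fft2(win_a).conj() * np.fft.fft2(win_b)).real
        peak = np.unravel_index(corr.argmax(), corr.shape)
        # Map wrapped indices to signed shifts.
        return [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]

    rng = np.random.default_rng(4)
    a = rng.random((32, 32))
    b = np.roll(a, (3, -5), axis=(0, 1))      # particles shifted by (3, -5)
    print(displacement(a, b))                  # expect [3, -5]
    ```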

  15. Contribution of multiple inert gas elimination technique to pulmonary medicine. 1. Principles and information content of the multiple inert gas elimination technique.

    PubMed Central

    Roca, J.; Wagner, P. D.

    1994-01-01

    This introductory review summarises four different aspects of the multiple inert gas elimination technique (MIGET). Firstly, it covers the historical background that facilitated, in the mid 1970s, the development of the MIGET as a tool to obtain more information about the entire spectrum of the VA/Q distribution in the lung by measuring the exchange of six gases of different solubility administered in trace concentrations. Its principle is based on the observation that the retention (or excretion) of any gas is dependent on the solubility (lambda) of that gas and the VA/Q distribution. A second major aspect is the analysis of the information content and limitations of the technique. During the last 15 years a substantial amount of clinical research using the MIGET has been generated by several groups around the world. The technique has been shown to be adequate for understanding the mechanisms of hypoxaemia in different forms of pulmonary disease and the effects of therapeutic interventions, and also for separately determining the quantitative role of each extrapulmonary factor on systemic arterial PO2 when they change between two conditions of MIGET measurement. This information will be extensively reviewed in the forthcoming articles of this series. Next, the different modalities of the MIGET, practical considerations involved in the measurements, and guidelines for quality control are indicated. Finally, a section is devoted to the analysis of available data in healthy subjects under different conditions. The lack of systematic information on the VA/Q distributions of older healthy subjects is emphasised, since it will be required to fully understand the changes brought about by diseases that affect the older population. PMID:8091330
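
    For a single homogeneous compartment, the classical MIGET retention relation is R = lambda / (lambda + VA/Q). The sketch below evaluates it across solubilities; the values are illustrative stand-ins for the six trace gases.

    ```python
    # Retention (arterial/venous partial-pressure ratio) for one VA/Q
    # compartment, R = lambda / (lambda + VA/Q); solubilities illustrative.
    solubilities = {"SF6": 0.005, "ethane": 0.09, "cyclopropane": 0.55,
                    "halothane": 2.4, "ether": 12.0, "acetone": 300.0}

    def retention(lam, va_q):
        """Single-compartment inert gas retention."""
        return lam / (lam + va_q)

    for name, lam in solubilities.items():
        print(f"{name:12s} retention at VA/Q = 1: {retention(lam, 1.0):.3f}")
    ```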

  16. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The purpose of this study is the development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates and the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The procedure was also modified to allow coarse parallelization of the solution algorithm. This document is a final report outlining the development and techniques used in the procedure. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Numerical dissipation is used to gain solution stability but is reduced in viscous dominated flow regions. Local time stepping and implicit residual smoothing are used to increase the rate of convergence. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes being generated by the system (TIGG3D) developed earlier under this contract. The grid generation scheme meets the average-passage requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. Pure internal flow solutions were obtained as well as solutions with flow about the cowl/nacelle and various engine core flow conditions. The efficiency of the solution procedure was shown to be the same as the original analysis.

  17. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  18. Integrative sparse principal component analysis of gene expression data.

    PubMed

    Liu, Mengque; Fan, Xinyan; Fang, Kuangnan; Zhang, Qingzhao; Ma, Shuangge

    2017-12-01

    In the analysis of gene expression data, dimension reduction techniques have been extensively adopted. The most popular one is perhaps the PCA (principal component analysis). To generate more reliable and more interpretable results, the SPCA (sparse PCA) technique has been developed. With the "small sample size, high dimensionality" characteristic of gene expression data, the analysis results generated from a single dataset are often unsatisfactory. Under contexts other than dimension reduction, integrative analysis techniques, which jointly analyze the raw data of multiple independent datasets, have been developed and shown to outperform "classic" meta-analysis, other multidataset techniques, and single-dataset analysis. In this study, we conduct integrative analysis by developing the iSPCA (integrative SPCA) method. iSPCA achieves the selection and estimation of sparse loadings using a group penalty. To take advantage of the similarity across datasets and generate more accurate results, we further impose contrasted penalties. Different penalties are proposed to accommodate different data conditions. Extensive simulations show that iSPCA outperforms the alternatives under a wide spectrum of settings. The analysis of breast cancer and pancreatic cancer data further shows iSPCA's satisfactory performance.

  19. Making Sense of Sensemaking: Requirements of a Cognitive Analysis to Support C2 Decision Support System Design

    DTIC Science & Technology

    2006-06-01

    ... heart of a distinction within the CSE community with respect to the differences between Cognitive Task Analysis (CTA) and Cognitive Work Analysis (CWA) ...

  20. An Innovative Technique to Assess Spontaneous Baroreflex Sensitivity with Short Data Segments: Multiple Trigonometric Regressive Spectral Analysis.

    PubMed

    Li, Kai; Rüdiger, Heinz; Haase, Rocco; Ziemssen, Tjalf

    2018-01-01

    Objective: As the multiple trigonometric regressive spectral (MTRS) analysis is extraordinary in its ability to analyze short local data segments down to 12 s, we wanted to evaluate the impact of the data segment settings by applying the technique of MTRS analysis for baroreflex sensitivity (BRS) estimation using a standardized data pool. Methods: Spectral and baroreflex analyses were performed on the EuroBaVar dataset (42 recordings, including lying and standing positions). For this analysis, the technique of MTRS was used. We used different global and local data segment lengths, and chose the global data segments from different positions. Three global data segments of 1 and 2 min and three local data segments of 12, 20, and 30 s were used in MTRS analysis for BRS. Results: All the BRS-values calculated on the three global data segments were highly correlated, both in the supine and standing positions; the different global data segments provided similar BRS estimations. When using different local data segments, all the BRS-values were also highly correlated. However, in the supine position, using short local data segments of 12 s overestimated BRS compared with those using 20 and 30 s. In the standing position, the BRS estimations using different local data segments were comparable. There was no proportional bias for the comparisons between different BRS estimations. Conclusion: We demonstrate that BRS estimation by the MTRS technique is stable when using different global data segments, and MTRS is extraordinary in its ability to evaluate BRS in even short local data segments (20 and 30 s). Because of the non-stationary character of most biosignals, the MTRS technique would be preferable for BRS analysis especially in conditions when only short stationary data segments are available or when dynamic changes of BRS should be monitored.
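
    A hedged sketch of a simpler spectral BRS estimate, the alpha index sqrt(P_RR / P_SBP) over the low-frequency band, standing in for the MTRS method itself; the beat series and the 4 Hz resampling rate are synthetic assumptions.

    ```python
    # Alpha-index BRS from resampled RR-interval and systolic pressure
    # series; signals are synthetic with a common 0.1 Hz oscillation.
    import numpy as np
    from scipy.signal import welch

    fs = 4.0                                  # resampled beat series, Hz
    rng = np.random.default_rng(5)
    t = np.arange(0, 300, 1 / fs)
    rr = 900 + 20 * np.sin(2 * np.pi * 0.1 * t) + rng.standard_normal(t.size)
    sbp = 120 + 3 * np.sin(2 * np.pi * 0.1 * t) + rng.standard_normal(t.size)

    f, p_rr = welch(rr - rr.mean(), fs=fs, nperseg=256)
    _, p_sbp = welch(sbp - sbp.mean(), fs=fs, nperseg=256)
    lf = (f >= 0.04) & (f < 0.15)             # low-frequency band
    brs_alpha = np.sqrt(p_rr[lf].sum() / p_sbp[lf].sum())
    print(f"alpha-index BRS ~ {brs_alpha:.1f} ms/mmHg")
    ```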

  1. Application of several physical techniques in the total analysis of a canine urinary calculus.

    PubMed

    Rodgers, A L; Mezzabotta, M; Mulder, K J; Nassimbeni, L R

    1981-06-01

    A single calculus from the bladder of a Beagle bitch has been analyzed by a multiple technique approach employing x-ray diffraction, infrared spectroscopy, scanning electron microscopy, x-ray fluorescence spectrometry, atomic absorption spectrophotometry and density gradient fractionation. The qualitative and quantitative data obtained showed excellent agreement, lending confidence to such an approach for the evaluation and understanding of stone disease.

  2. Multispan Elevated Guideway Design for Passenger Transport Vehicles : Volume 1. Text.

    DOT National Transportation Integrated Search

    1975-04-01

    Analysis techniques, a design procedure and design data are described for passenger vehicle, simply supported, single span and multiple span elevated guideway structures. Analyses and computer programs are developed to determine guideway deflections,...

  3. A formal concept analysis approach to consensus clustering of multi-experiment expression data

    PubMed Central

    2014-01-01

    Background: Presently, with the increasing number and complexity of available gene expression datasets, the combination of data from multiple microarray studies addressing a similar biological question is gaining importance. The analysis and integration of multiple datasets are expected to yield more reliable and robust results since they are based on a larger number of samples and the effects of the individual study-specific biases are diminished. This is supported by recent studies suggesting that important biological signals are often preserved or enhanced by multiple experiments. An approach to combining data from different experiments is the aggregation of their clusterings into a consensus or representative clustering solution which increases the confidence in the common features of all the datasets and reveals the important differences among them. Results: We propose a novel generic consensus clustering technique that applies a Formal Concept Analysis (FCA) approach for the consolidation and analysis of clustering solutions derived from several microarray datasets. These datasets are initially divided into groups of related experiments with respect to a predefined criterion. Subsequently, a consensus clustering algorithm is applied to each group, resulting in a clustering solution per group. These solutions are pooled together and further analysed by employing FCA, which allows extracting valuable insights from the data and generating a gene partition over all the experiments. In order to validate the FCA-enhanced approach, two consensus clustering algorithms are adapted to incorporate the FCA analysis. Their performance is evaluated on gene expression data from a multi-experiment study examining the global cell-cycle control of fission yeast. The FCA results derived from both methods demonstrate that, although both algorithms optimize different clustering characteristics, FCA is able to overcome and diminish these differences and preserve some relevant biological signals. Conclusions: The proposed FCA-enhanced consensus clustering technique is a general approach to the combination of clustering algorithms with FCA for deriving clustering solutions from multiple gene expression matrices. The experimental results presented herein demonstrate that it is a robust data integration technique able to produce a good-quality clustering solution that is representative of the whole set of expression matrices. PMID:24885407
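
    A minimal sketch of a basic consensus-clustering step (a co-association matrix followed by hierarchical merging), offered as a simpler stand-in for the FCA-based consolidation; the toy expression matrix is synthetic.

    ```python
    # Consensus clustering via a co-association matrix built from an
    # ensemble of k-means runs, then hierarchical merging.
    import numpy as np
    from sklearn.cluster import KMeans
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    rng = np.random.default_rng(6)
    genes = np.vstack([rng.normal(0, 1, (20, 8)), rng.normal(4, 1, (20, 8))])

    # Ensemble of base clusterings (different seeds and k).
    co = np.zeros((40, 40))
    for seed in range(10):
        labels = KMeans(n_clusters=int(rng.integers(2, 5)), n_init=5,
                        random_state=seed).fit_predict(genes)
        co += (labels[:, None] == labels[None, :])
    co /= 10.0

    # Consensus partition from the co-association distances.
    dist = squareform(1.0 - co, checks=False)
    consensus = fcluster(linkage(dist, method="average"),
                         t=2, criterion="maxclust")
    print(np.bincount(consensus))
    ```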

  4. A Proposed Data Fusion Architecture for Micro-Zone Analysis and Data Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kevin McCarthy; Milos Manic

    Data Fusion requires the ability to combine or “fuse” data from multiple data sources. Time Series Analysis is a data mining technique used to predict future values from a data set based upon past values. Unlike other data mining techniques, however, Time Series places special emphasis on periodicity and how seasonal and other time-based factors tend to affect trends over time. One of the difficulties encountered in developing generic time series techniques is the wide variability of the data sets available for analysis. This presents challenges all the way from the data gathering stage to results presentation. This paper presents an architecture designed and used to facilitate the collection of disparate data sets well suited to Time Series analysis as well as other predictive data mining techniques. Results show this architecture provides a flexible, dynamic framework for the capture and storage of a myriad of dissimilar data sets and can serve as a foundation from which to build a complete data fusion architecture.

  5. Feasibility study of a synthesis procedure for array feeds to improve radiation performance of large distorted reflector antennas

    NASA Technical Reports Server (NTRS)

    Stutzman, W. L.; Takamizawa, K.; Werntz, P.; Lapean, J.; Barts, R.

    1991-01-01

    The following subject areas are covered: General Reflector Antenna Systems Program version 7 (GRASP7); Multiple Reflector Analysis Program for Cylindrical Antennas (MRAPCA); Tri-Reflector 2D Synthesis Code (TRTDS); geometrical optics and physical optics synthesis techniques; the beam scanning reflector, the type 2 and 6 reflectors, the spherical reflector, and multiple reflector imaging systems; and radiometric array design.

  6. Systematic procedure for designing processes with multiple environmental objectives.

    PubMed

    Kim, Ki-Joo; Smith, Raymond L

    2005-04-01

    Evaluation of multiple objectives is very important in designing environmentally benign processes. It requires a systematic procedure for solving multiobjective decision-making problems due to the complex nature of the problems, the need for complex assessments, and the complicated analysis of multidimensional results. In this paper, a novel systematic procedure is presented for designing processes with multiple environmental objectives. This procedure has four steps: initialization, screening, evaluation, and visualization. The first two steps are used for systematic problem formulation based on mass and energy estimation and order of magnitude analysis. In the third step, an efficient parallel multiobjective steady-state genetic algorithm is applied to design environmentally benign and economically viable processes and to provide more accurate and uniform Pareto optimal solutions. In the last step a new visualization technique for illustrating multiple objectives and their design parameters on the same diagram is developed. Through these integrated steps the decision-maker can easily determine design alternatives with respect to his or her preferences. Most importantly, this technique is independent of the number of objectives and design parameters. As a case study, acetic acid recovery from aqueous waste mixtures is investigated by minimizing eight potential environmental impacts and maximizing total profit. After applying the systematic procedure, the most preferred design alternatives and their design parameters are easily identified.

  7. Extracranial glioblastoma diagnosed by examination of pleural effusion using the cell block technique: case report.

    PubMed

    Hori, Yusuke S; Fukuhara, Toru; Aoi, Mizuho; Oda, Kazunori; Shinno, Yoko

    2018-06-01

    Metastatic glioblastoma is a rare condition, and several studies have reported the involvement of multiple organs including the lymph nodes, liver, and lung. The lung and pleura are reportedly the most frequent sites of metastasis, and diagnosis using less invasive tools such as cytological analysis with fine needle aspiration biopsy is challenging. Cytological analysis of fluid specimens tends to be negative because of the small number of cells obtained, whereas the cell block technique reportedly has higher sensitivity because of a decrease in cellular dispersion. Herein, the authors describe a patient with a history of diffuse astrocytoma who developed intractable, progressive accumulation of pleural fluid. Initial cytological analysis of the pleural effusion obtained by thoracocentesis was negative, but reanalysis using the cell block technique revealed the presence of glioblastoma cells. This is the first report to suggest the effectiveness of the cell block technique in the diagnosis of extracranial glioblastoma using pleural effusion. In patients with a history of glioma, the presence of extremely intractable pleural effusion warrants cytological analysis of the fluid using this technique in order to initiate appropriate chemotherapy.

  8. Automatic cytometric device using multiple wavelength excitations

    NASA Astrophysics Data System (ADS)

    Rongeat, Nelly; Ledroit, Sylvain; Chauvet, Laurence; Cremien, Didier; Urankar, Alexandra; Couderc, Vincent; Nérin, Philippe

    2011-05-01

    Precise identification of eosinophils, basophils, and specific subpopulations of blood cells (B lymphocytes) in an unconventional automatic hematology analyzer is demonstrated. Our apparatus mixes two excitation radiations by means of an acousto-optic tunable filter to properly control the fluorescence emission of phycoerythrin cyanin 5 (PC5) conjugated to antibodies (anti-CD20 or anti-CRTH2) and of Thiazole Orange. In this way, our analyzer, which combines hematology analysis and flow cytometry techniques based on multiple fluorescence detection, drastically improves the signal-to-noise ratio and decreases the impact of spectral overlaps among multiple fluorescence emissions.

  9. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klumpp, John

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution.
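
    The record's system is described only at the level of ideas, so the sketch below is a minimal nonparametric stand-in for the learn-background/detect loop: build an empirical distribution of background counts, derive a decision threshold from a chosen false-alarm rate, and test new measurements against it. The Poisson background and every number are fabricated for illustration.

        # "Learning mode": accumulate background counts; "detection mode":
        # compare a new measurement against the empirical distribution.
        import numpy as np

        rng = np.random.default_rng(0)
        background = rng.poisson(lam=12.0, size=10_000)   # counts per interval

        # Decision threshold for a 0.1 % false-alarm rate, taken from the
        # empirical background distribution rather than an assumed model.
        threshold = np.quantile(background, 0.999)

        def test_measurement(counts):
            """Return (alarm flag, empirical p-value) for one measurement."""
            p_value = np.mean(background >= counts)
            return counts > threshold, p_value

        print(test_measurement(14))   # typically consistent with background
        print(test_measurement(30))   # typically flagged as a source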

  10. Fostering multiple repertoires in undergraduate behavior analysis students

    PubMed Central

    Polson, David A. D.

    1995-01-01

    Eight techniques used by the author in teaching an introductory applied behavior analysis course are described: (a) a detailed study guide, (b) frequent tests, (c) composition of practice test questions, (d) in-class study groups, (e) fluency building with a computerized flash-card program, (f) bonus marks for participation during question-and-answer sessions, (g) student presentations that summarize and analyze recently published research, and (h) in-class behavior analysis of comic strips. Together, these techniques require an extensive amount of work by students. Nevertheless, students overwhelmingly prefer this approach to the traditional lecture-midterm-final format, and most earn an A as their final course grade. PMID:22478226

  11. Applications of data compression techniques in modal analysis for on-orbit system identification

    NASA Technical Reports Server (NTRS)

    Carlin, Robert A.; Saggio, Frank; Garcia, Ephrahim

    1992-01-01

    Data compression techniques have been investigated for use with modal analysis applications. A redundancy-reduction algorithm was used to compress frequency response functions (FRFs) in order to reduce the amount of disk space necessary to store the data and/or save time in processing it. Tests were performed for both single- and multiple-degree-of-freedom (SDOF and MDOF, respectively) systems, with varying amounts of noise. Analysis was done on both the compressed and uncompressed FRFs using an SDOF Nyquist curve fit as well as the Eigensystem Realization Algorithm. Significant savings were realized with minimal errors incurred by the compression process.

  12. Practical guidance for conducting mediation analysis with multiple mediators using inverse odds ratio weighting.

    PubMed

    Nguyen, Quynh C; Osypuk, Theresa L; Schmidt, Nicole M; Glymour, M Maria; Tchetgen Tchetgen, Eric J

    2015-03-01

    Despite the recent flourishing of mediation analysis techniques, many modern approaches are difficult to implement or applicable to only a restricted range of regression models. This report provides practical guidance for implementing a new technique utilizing inverse odds ratio weighting (IORW) to estimate natural direct and indirect effects for mediation analyses. IORW takes advantage of the odds ratio's invariance property and condenses information on the odds ratio for the relationship between the exposure (treatment) and multiple mediators, conditional on covariates, by regressing exposure on mediators and covariates. The inverse of the covariate-adjusted exposure-mediator odds ratio association is used to weight the primary analytical regression of the outcome on treatment. The treatment coefficient in such a weighted regression estimates the natural direct effect of treatment on the outcome, and indirect effects are identified by subtracting direct effects from total effects. Weighting renders treatment and mediators independent, thereby deactivating indirect pathways of the mediators. This new mediation technique accommodates multiple discrete or continuous mediators. IORW is easily implemented and is appropriate for any standard regression model, including quantile regression and survival analysis. An empirical example is given using data from the Moving to Opportunity (1994-2002) experiment, testing whether neighborhood context mediated the effects of a housing voucher program on obesity. Relevant Stata code (StataCorp LP, College Station, Texas) is provided.
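
    A hedged sketch of the IORW workflow for one binary exposure follows, assuming the statsmodels package. The simplified weights (1 for the unexposed; the inverse of the fitted exposure-mediator odds-ratio factor for the exposed) and all simulated variables are illustrative stand-ins, not the article's Stata code.

        # IORW sketch: (1) logistic regression of exposure A on mediators and
        # covariates, (2) inverse odds ratio weights to deactivate mediated
        # pathways, (3) direct effect from the weighted outcome regression;
        # indirect effect = total - direct.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(1)
        n = 2000
        C = rng.normal(size=n)                       # covariate
        A = rng.binomial(1, 0.5, size=n)             # binary exposure
        M1 = 0.6 * A + 0.3 * C + rng.normal(size=n)  # mediators
        M2 = 0.4 * A + rng.normal(size=n)
        Y = 0.5 * A + 0.8 * M1 + 0.5 * M2 + 0.2 * C + rng.normal(size=n)

        X_exp = sm.add_constant(np.column_stack([M1, M2, C]))
        logit = sm.Logit(A, X_exp).fit(disp=0)
        b_m1, b_m2 = logit.params[1], logit.params[2]

        w = np.where(A == 1, np.exp(-(b_m1 * M1 + b_m2 * M2)), 1.0)

        X_out = sm.add_constant(np.column_stack([A, C]))
        total = sm.OLS(Y, X_out).fit().params[1]      # total effect of A
        direct = sm.WLS(Y, X_out, weights=w).fit().params[1]
        print(f"total={total:.2f} direct={direct:.2f} indirect={total - direct:.2f}")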

  13. Improving the analysis of slug tests

    USGS Publications Warehouse

    McElwee, C.D.

    2002-01-01

    This paper examines several techniques that have the potential to improve the quality of slug test analysis. These techniques are applicable in the range from low hydraulic conductivities with overdamped responses to high hydraulic conductivities with nonlinear oscillatory responses. Four techniques for improving slug test analysis will be discussed: use of an extended capability nonlinear model, sensitivity analysis, correction for acceleration and velocity effects, and use of multiple slug tests. The four-parameter nonlinear slug test model used in this work is shown to allow accurate analysis of slug tests with widely differing character. The parameter β represents a correction to the water column length caused primarily by radius variations in the wellbore and is most useful in matching the oscillation frequency and amplitude. The water column velocity at slug initiation (V0) is an additional model parameter, which would ideally be zero but may not be due to the initiation mechanism. The remaining two model parameters are A (parameter for nonlinear effects) and K (hydraulic conductivity). Sensitivity analysis shows that in general β and V0 have the lowest sensitivity and K usually has the highest. However, for very high K values the sensitivity to A may surpass the sensitivity to K. Oscillatory slug tests involve higher accelerations and velocities of the water column; thus, the pressure transducer responses are affected by these factors and the model response must be corrected to allow maximum accuracy for the analysis. The performance of multiple slug tests will allow some statistical measure of the experimental accuracy and of the reliability of the resulting aquifer parameters.
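
    McElwee's four-parameter model is not reproduced here; as a hedged stand-in, the sketch below fits a generic damped-cosine head response by nonlinear least squares to show the curve-matching workflow for oscillatory tests. SciPy is assumed, and the functional form, parameter names and data are synthetic.

        # Fit a generic underdamped slug-test response and report parameter
        # uncertainties from the covariance of the fit.
        import numpy as np
        from scipy.optimize import curve_fit

        def head_response(t, h0, damping, omega):
            """Normalized water-level deviation for an underdamped test."""
            return h0 * np.exp(-damping * t) * np.cos(omega * t)

        t = np.linspace(0.0, 30.0, 200)                  # seconds
        rng = np.random.default_rng(2)
        observed = head_response(t, 1.0, 0.15, 1.2) + 0.02 * rng.normal(size=t.size)

        popt, pcov = curve_fit(head_response, t, observed, p0=[1.0, 0.1, 1.0])
        perr = np.sqrt(np.diag(pcov))                    # 1-sigma uncertainties
        print("h0, damping, omega:", popt, "+/-", perr)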

  14. Simultaneous analysis for water- and fat-soluble vitamins by a novel single chromatography technique unifying supercritical fluid chromatography and liquid chromatography.

    PubMed

    Taguchi, Kaori; Fukusaki, Eiichiro; Bamba, Takeshi

    2014-10-03

    Chromatography techniques usually use a single state in the mobile phase, such as liquid, gas, or supercritical fluid. Chromatographers manage one of these techniques for their purpose but are sometimes required to use multiple methods, or even multiple techniques, when the target compounds have a wide range of chemical properties. To overcome this challenge, we developed a single method covering a diverse compound range by means of a "unified" chromatography which completely bridges supercritical fluid chromatography and liquid chromatography. In our method, the phase state was continuously changed in the following order: supercritical, subcritical and liquid. Moreover, the gradient of the mobile phase starting at almost 100% CO2 was completely replaced by 100% methanol at the end. As a result, this approach achieved further extension of the polarity range of the mobile phase in a single run, and successfully enabled the simultaneous analysis of fat- and water-soluble vitamins with a wide logP range of -2.11 to 10.12. Furthermore, the 17 vitamins were exceptionally well separated in 4 min. Our results indicated that the use of dense CO2 and the replacement of CO2 by methanol are practical approaches in unified chromatography covering diverse compounds. Additionally, this is the first report to apply this novel approach to unified chromatography, and it can open another door for diverse compound analysis in a single chromatographic technique with a single injection, single column and single system.

  15. Advanced statistics: linear regression, part II: multiple linear regression.

    PubMed

    Marill, Keith A

    2004-01-01

    The applications of simple linear regression in medical research are limited, because in most situations, there are multiple relevant predictor variables. Univariate statistical techniques such as simple linear regression use a single predictor variable, and they often may be mathematically correct but clinically misleading. Multiple linear regression is a mathematical technique used to model the relationship between multiple independent predictor variables and a single dependent outcome variable. It is used in medical research to model observational data, as well as in diagnostic and therapeutic studies in which the outcome is dependent on more than one factor. Although the technique generally is limited to data that can be expressed with a linear function, it benefits from a well-developed mathematical framework that yields unique solutions and exact confidence intervals for regression coefficients. Building on Part I of this series, this article acquaints the reader with some of the important concepts in multiple regression analysis. These include multicollinearity, interaction effects, and an expansion of the discussion of inference testing, leverage, and variable transformations to multivariate models. Examples from the first article in this series are expanded on using a primarily graphic, rather than mathematical, approach. The importance of the relationships among the predictor variables and the dependence of the multivariate model coefficients on the choice of these variables are stressed. Finally, concepts in regression model building are discussed.
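
    To make two of these concepts concrete, the hedged Python example below (statsmodels assumed) fits a multiple regression with an interaction term and screens for multicollinearity with variance inflation factors; the clinical-sounding variable names are invented.

        # Multiple linear regression with an interaction term plus VIF-based
        # multicollinearity screening.
        import numpy as np
        import statsmodels.api as sm
        from statsmodels.stats.outliers_influence import variance_inflation_factor

        rng = np.random.default_rng(3)
        n = 500
        age = rng.normal(50, 10, n)
        dose = rng.normal(5, 1, n)
        outcome = (2.0 + 0.05 * age + 1.5 * dose + 0.02 * age * dose
                   + rng.normal(0, 1, n))

        X = sm.add_constant(np.column_stack([age, dose, age * dose]))
        fit = sm.OLS(outcome, X).fit()
        print(fit.params)                 # intercept and three coefficients
        print(fit.conf_int())             # exact confidence intervals

        # VIFs for the non-constant columns; large values flag collinearity
        # (a raw interaction column is typically collinear unless centered).
        for j in range(1, X.shape[1]):
            print("VIF", j, variance_inflation_factor(X, j))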

  16. Ratioed scatter diagrams - An erotetic method for phase identification on complex surfaces using scanning Auger microscopy

    NASA Technical Reports Server (NTRS)

    Browning, R.

    1984-01-01

    By ratioing multiple Auger intensities and plotting a two-dimensional occupational scatter diagram while digitally scanning across an area, the number and elemental association of surface phases can be determined. This can prove a useful tool in scanning Auger microscopic analysis of complex materials. The technique is illustrated by results from an anomalous region on the reaction zone of a SiC/Ti-6Al-4V metal matrix composite material. The anomalous region is shown to be a single phase associated with sulphur and phosphorus impurities. Imaging of a selected phase from the ratioed scatter diagram is possible and may be a useful technique for presenting multiple scanning Auger images.

  17. Analysis to feature-based video stabilization/registration techniques within application of traffic data collection

    NASA Astrophysics Data System (ADS)

    Sadat, Mojtaba T.; Viti, Francesco

    2015-02-01

    Machine vision is rapidly gaining popularity in the field of Intelligent Transportation Systems. In particular, advantages are foreseen in the exploitation of Aerial Vehicles (AVs) for delivering a superior view on traffic phenomena. However, vibration on AVs makes it difficult to extract moving objects on the ground. To partly overcome this issue, image stabilization/registration procedures are adopted to correct and stitch multiple frames taken of the same scene but from different positions, angles, or sensors. In this study, we examine the impact of multiple feature-based techniques for stabilization, and we show that the SURF detector outperforms the others in terms of time efficiency and output similarity.
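
    A hedged sketch of such a registration pipeline follows. Because SURF sits in OpenCV's non-free contrib module, the example substitutes ORB from stock OpenCV; the detect/match/estimate-homography/warp steps are the same, and the file names are placeholders.

        # Register a vibrating frame against a reference frame using feature
        # matching and a RANSAC-estimated homography.
        import cv2
        import numpy as np

        def stabilize(ref_gray, frame_gray):
            orb = cv2.ORB_create(nfeatures=2000)
            kp1, des1 = orb.detectAndCompute(ref_gray, None)
            kp2, des2 = orb.detectAndCompute(frame_gray, None)
            matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
            matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
            good = matches[:200]                        # keep the best matches
            src = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
            dst = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
            H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
            h, w = ref_gray.shape
            return cv2.warpPerspective(frame_gray, H, (w, h))

        # Usage (paths are placeholders):
        # ref = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
        # cur = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
        # registered = stabilize(ref, cur)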

  18. FIRE: an SPSS program for variable selection in multiple linear regression analysis via the relative importance of predictors.

    PubMed

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2011-03-01

    We provide an SPSS program that implements currently recommended techniques and recent developments for selecting variables in multiple linear regression analysis via the relative importance of predictors. The approach consists of: (1) optimally splitting the data for cross-validation, (2) selecting the final set of predictors to be retained in the regression equation, and (3) assessing the behavior of the chosen model using standard indices and procedures. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.

  19. Kernel canonical-correlation Granger causality for multiple time series

    NASA Astrophysics Data System (ADS)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.
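
    The kernel extension is beyond a short example, but the linear multivariate Granger baseline it generalizes can be sketched: compare the residual covariance of a VAR that predicts y from its own past against one that also uses the past of x. The code below is a generic illustration, not the authors' method.

        # Multivariate linear Granger statistic:
        # ln( det(Sigma_restricted) / det(Sigma_full) ) > 0 suggests x -> y.
        import numpy as np

        def lagged(z, p):
            # Stack z[t-1], ..., z[t-p] as regressors for each t = p .. T-1.
            return np.hstack([z[p - k:len(z) - k] for k in range(1, p + 1)])

        def granger_stat(x, y, p=2):
            Y = y[p:]                                  # prediction targets
            Zr = lagged(y, p)                          # restricted: own past
            Zf = np.hstack([Zr, lagged(x, p)])         # full: add past of x
            def resid_cov(Z):
                X = np.column_stack([np.ones(len(Z)), Z])
                B, *_ = np.linalg.lstsq(X, Y, rcond=None)
                E = Y - X @ B
                return np.atleast_2d(np.cov(E.T))
            Sr, Sf = resid_cov(Zr), resid_cov(Zf)
            return float(np.log(np.linalg.det(Sr) / np.linalg.det(Sf)))

        rng = np.random.default_rng(4)
        T = 500
        x = rng.normal(size=(T, 1))
        y = np.zeros((T, 1))
        for t in range(1, T):                          # x drives y with lag 1
            y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + 0.3 * rng.normal()
        print(granger_stat(x, y), granger_stat(y, x))  # x -> y should dominate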

  20. Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John N.

    1997-01-01

    A multidisciplinary design optimization procedure that couples formal multiobjective techniques and complex analysis procedures (such as computational fluid dynamics (CFD) codes) was developed. The procedure has been demonstrated on a specific high speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained multiple objective function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are applied to each objective function during the transformation process. This enhanced procedure provides the designer the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a CFD code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as the design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique has been used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
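
    The K-S scalarization plus BFGS core can be sketched directly. In the hedged Python example below (SciPy assumed), two toy quadratic objectives stand in for the aerodynamic and sonic-boom objectives, and the draw-down factor rho and the weights are illustrative choices.

        # Kreisselmeier-Steinhauser envelope: a smooth max of the weighted
        # objectives, minimized as a single unconstrained function with BFGS.
        import numpy as np
        from scipy.optimize import minimize

        def ks(fvals, rho=50.0):
            """Smooth maximum; larger rho hugs max(fvals) more tightly."""
            fmax = np.max(fvals)
            return fmax + np.log(np.sum(np.exp(rho * (fvals - fmax)))) / rho

        def objectives(x, weights=(1.0, 1.0)):
            f1 = (x[0] - 1.0) ** 2 + x[1] ** 2        # stand-in objective 1
            f2 = x[0] ** 2 + (x[1] - 2.0) ** 2        # stand-in objective 2
            return np.array([weights[0] * f1, weights[1] * f2])

        res = minimize(lambda x: ks(objectives(x)), x0=np.zeros(2), method="BFGS")
        print(res.x, res.fun)    # compromise design between the two objectives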

  1. Sensor failure and multivariable control for airbreathing propulsion systems. Ph.D. Thesis - Dec. 1979 Final Report

    NASA Technical Reports Server (NTRS)

    Behbehani, K.

    1980-01-01

    A new sensor/actuator failure analysis technique for turbofan jet engines was developed. Three phases of failure analysis, namely detection, isolation, and accommodation, are considered. Failure detection and isolation techniques are developed by utilizing the concept of Generalized Likelihood Ratio (GLR) tests. These techniques are applicable to both time-varying and time-invariant systems. Three GLR detectors are developed for: (1) hard-over sensor failure; (2) hard-over actuator failure; and (3) brief disturbances in the actuators. The probability distribution of the GLR detectors and the detectability of sensor/actuator failures are established. Failure type is determined by the maximum of the GLR detectors. Failure accommodation is accomplished by extending the Multivariable Nyquist Array (MNA) control design techniques to nonsquare system designs. The performance and effectiveness of the failure analysis technique are studied by applying the technique to a turbofan jet engine, namely the Quiet Clean Short Haul Experimental Engine (QCSEE). Single and multiple sensor/actuator failures in the QCSEE are simulated and analyzed, and the effects of model degradation are studied.

  2. Time-Frequency Analysis of the Dispersion of Lamb Modes

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Seale, Michael D.; Smith, Barry T.

    1999-01-01

    Accurate knowledge of the velocity dispersion of Lamb modes is important for ultrasonic nondestructive evaluation methods used in detecting and locating flaws in thin plates and in determining their elastic stiffness coefficients. Lamb mode dispersion is also important in the acoustic emission technique for accurately triangulating the location of emissions in thin plates. In this research, the ability to characterize Lamb mode dispersion through a time-frequency analysis (the pseudo Wigner-Ville distribution) was demonstrated. A major advantage of time-frequency methods is the ability to analyze acoustic signals containing multiple propagation modes, which overlap and superimpose in the time domain signal. By combining time-frequency analysis with a broadband acoustic excitation source, the dispersion of multiple Lamb modes over a wide frequency range can be determined from as little as a single measurement. In addition, the technique provides a direct measurement of the group velocity dispersion. The technique was first demonstrated in the analysis of a simulated waveform in an aluminum plate in which the Lamb mode dispersion was well known. Portions of the dispersion curves of the A0, A1, S0, and S2 Lamb modes were obtained from this one waveform. The technique was also applied for the analysis of experimental waveforms from a unidirectional graphite/epoxy composite plate. Measurements were made both along and perpendicular to the fiber direction. In this case, the signals contained only the lowest order symmetric and antisymmetric modes. A least squares fit of the results from several source-to-detector distances was used. Theoretical dispersion curves were calculated and are shown to be in good agreement with experimental results.
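
    A minimal pseudo Wigner-Ville implementation is sketched below (NumPy/SciPy assumed). The Hamming lag window, the chirp test signal and the ridge extraction are illustrative choices, not the paper's processing chain.

        # Pseudo Wigner-Ville distribution: FFT over the lag variable of the
        # windowed instantaneous autocorrelation of the analytic signal.
        import numpy as np
        from scipy.signal import hilbert, get_window

        def pwvd(x, fs, win_len=127):
            z = hilbert(x)                      # analytic signal suppresses
            n_samp = len(z)                     # negative-frequency cross terms
            half = win_len // 2
            h = get_window("hamming", win_len)  # "pseudo" = windowed in lag
            W = np.zeros((win_len, n_samp))
            for n in range(n_samp):
                m = min(n, n_samp - 1 - n, half)      # admissible lag range
                tau = np.arange(-m, m + 1)
                kern = np.zeros(win_len, dtype=complex)
                kern[tau + half] = h[tau + half] * z[n + tau] * np.conj(z[n - tau])
                W[:, n] = np.abs(np.fft.fft(np.fft.ifftshift(kern)))
            # Lag sampling halves the span: bin k maps to k * fs / (2 * win_len).
            f = np.fft.fftfreq(win_len, d=2.0 / fs)
            return f[:half + 1], W[:half + 1]   # keep non-negative frequencies

        fs = 1000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        sig = np.cos(2 * np.pi * (50 * t + 100 * t ** 2))  # chirp, 50 -> 250 Hz
        f, W = pwvd(sig, fs)
        ridge = f[np.argmax(W, axis=0)]         # instantaneous-frequency track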

  3. Cooperative Robots to Observe Moving Targets: Review.

    PubMed

    Khan, Asif; Rinner, Bernhard; Cavallaro, Andrea

    2018-01-01

    The deployment of multiple robots for achieving a common goal helps to improve the performance, efficiency, and/or robustness in a variety of tasks. In particular, the observation of moving targets is an important multirobot application that still exhibits numerous open challenges, including the effective coordination of the robots. This paper reviews control techniques for cooperative mobile robots monitoring multiple targets. The simultaneous movement of robots and targets makes this problem particularly interesting, and our review systematically addresses this cooperative multirobot problem for the first time. We classify and critically discuss the control techniques: cooperative multirobot observation of multiple moving targets; cooperative search, acquisition, and track; cooperative tracking; and multirobot pursuit evasion. We also identify the five major elements that characterize this problem, namely, the coordination method, the environment, the target, the robot and its sensor(s). These elements are used to systematically analyze the control techniques. The majority of the studied work is based on simulation and laboratory studies, which may not accurately reflect real-world operational conditions. Importantly, while our systematic analysis is focused on multitarget observation, our proposed classification is useful also for related multirobot applications.

  4. The Effectiveness of Trace DNA Profiling-A Comparison Between a U.S. and a U.K. Law Enforcement Jurisdiction.

    PubMed

    Bond, John W; Weart, Jocelyn R

    2017-05-01

    Recovery, profiling, and speculative searching of trace DNA (not attributable to a body fluid/cell type) over a twelve-month period in a U.S. Crime Laboratory and U.K. police force are compared. Results show greater numbers of U.S. firearm-related items submitted for analysis compared with the U.K., where greatest numbers were submitted from burglary or vehicle offenses. U.S. multiple recovery techniques (double swabbing) occurred mainly during laboratory examination, whereas the majority of U.K. multiple recovery techniques occurred at the scene. No statistical difference was observed for useful profiles from single or multiple recovery. Database loading of interpretable profiles was most successful for U.K. items related to burglary or vehicle offenses. Database associations (matches) represented 7.0% of all U.S. items and 13.1% of all U.K. items. The U.K. strategy for burglary and vehicle examination demonstrated that careful selection of both items and sampling techniques is crucial to obtaining the observed results.

  5. Diagnosis of 25 genotypes of human papillomaviruses for their physical statuses in cervical precancerous/cancerous lesions: a comparison of E2/E6E7 ratio-based vs. multiple E1-L1/E6E7 ratio-based detection techniques.

    PubMed

    Zhang, Rong; He, Yi-feng; Chen, Mo; Chen, Chun-mei; Zhu, Qiu-jing; Lu, Huan; Wei, Zhen-hong; Li, Fang; Zhang, Xiao-xin; Xu, Cong-jian; Yu, Long

    2014-10-02

    Cervical lesions caused by integrated human papillomavirus (HPV) infection are highly dangerous because they can quickly develop into invasive cancers. However, clinicians are currently hampered by the lack of a quick, convenient and precise technique to detect integrated/mixed infections of various genotypes of HPVs in the cervix. This study aimed to develop a practical tool to determine the physical status of different HPVs and evaluate its clinical significance. The target population comprised 1162 women with an HPV infection history of more than six months and an abnormal cervical cytological finding. The multiple E1-L1/E6E7 ratio analysis, a novel technique, was developed based on determining the ratios of E1/E6E7, E2/E6E7, E4E5/E6E7, L2/E6E7 and L1/E6E7 within the viral genome. Any imbalanced ratios indicate integration. Its diagnostic and predictive performances were compared with those of E2/E6E7 ratio analysis. The detection accuracy of both techniques was evaluated using the gold-standard technique "detection of integrated papillomavirus sequences" (DIPS). To realize a multigenotypic detection goal, a primer and probe library was established. The integration rate of a particular genotype of HPV was correlated with its tumorigenic potential, and women with higher lesion grades often carried lower viral loads. The E1-L1/E6E7 ratio analysis achieved 92.7% sensitivity and 99.0% specificity in detecting HPV integration, while the E2/E6E7 ratio analysis showed a much lower sensitivity (75.6%) and a similar specificity (99.3%). Interference due to episomal copies was observed in both techniques, leading to false-negative results. However, some positive results of the E1-L1/E6E7 ratio analysis were missed by DIPS due to its stochastic detection nature. The E1-L1/E6E7 ratio analysis is more efficient than E2/E6E7 ratio analysis and DIPS in predicting precancerous/cancerous lesions, with both the positive predictive values (36.7%-82.3%) and negative predictive values (75.9%-100%) being highest (based on the results of three rounds of biopsies). The multiple E1-L1/E6E7 ratio analysis is more sensitive and predictive than E2/E6E7 ratio analysis as a triage test for detecting HPV integration. It can effectively narrow the range of candidates for colposcopic examination and cervical biopsy, thereby lowering the expense of cervical cancer prevention.

  6. Performance Analysis of MIMO Relay Network via Propagation Measurement in L-Shaped Corridor Environment

    NASA Astrophysics Data System (ADS)

    Lertwiram, Namzilp; Tran, Gia Khanh; Mizutani, Keiichi; Sakaguchi, Kei; Araki, Kiyomichi

    Deploying relays can address the shadowing problem between a transmitter (Tx) and a receiver (Rx). Moreover, the Multiple-Input Multiple-Output (MIMO) technique has been introduced to improve wireless link capacity. The MIMO technique can be applied in relay networks to enhance system performance. However, the efficiency of relaying schemes and relay placement has not been well investigated in experiment-based studies. This paper provides a propagation measurement campaign of a MIMO two-hop relay network in the 5 GHz band in an L-shaped corridor environment with various relay locations. Furthermore, this paper proposes a Relay Placement Estimation (RPE) scheme to identify the optimum relay location, i.e., the point at which the network performance is highest. Analysis results for channel capacity show that the relaying technique is beneficial over direct transmission in a strong shadowing environment, while it is ineffective in a non-shadowing environment. In addition, the optimum relay location estimated with the RPE scheme agrees with the location where the network achieves the highest performance as identified by network capacity. Finally, the capacity analysis shows that the two-way MIMO relay employing network coding has the best performance, while the cooperative relaying scheme is not effective because the shadowing effect weakens the signal strength of the direct link.

  7. Using real options analysis to support strategic management decisions

    NASA Astrophysics Data System (ADS)

    Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan

    2013-12-01

    Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk, as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision making process. Typical use cases of real options are analyzed, with an elaborate study of the applications and advantages that company management can derive from them. Numerical results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improvements in management decisions.
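
    The paper's modified tree is not specified in enough detail to reproduce, but the standard Cox-Ross-Rubinstein backward induction it extends is easy to sketch. In the hedged example below, an option to expand a project is valued; the payoff rule and every figure are invented.

        # CRR binomial tree for a real option: the right (not obligation) to
        # expand a project, a call-like payoff on the project value.
        import numpy as np

        def expansion_option(V0, K, r, sigma, T, steps):
            dt = T / steps
            u = np.exp(sigma * np.sqrt(dt))       # up factor
            d = 1.0 / u                           # down factor
            p = (np.exp(r * dt) - d) / (u - d)    # risk-neutral up probability
            disc = np.exp(-r * dt)
            j = np.arange(steps + 1)
            V = V0 * u ** (steps - j) * d ** j    # terminal project values
            value = np.maximum(0.3 * V - K, 0.0)  # expand: pay K, gain 30 %
            for _ in range(steps):                # backward induction
                value = disc * (p * value[:-1] + (1 - p) * value[1:])
            return value[0]

        print(expansion_option(V0=100.0, K=20.0, r=0.05, sigma=0.35, T=3.0,
                               steps=300))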

  8. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis computer program user's manual

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The computer program user's manual for the ADPACAPES (Advanced Ducted Propfan Analysis Code-Average Passage Engine Simulation) program is included. The objective of the computer program is development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes meeting the requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. The efficiency of the solution procedure was shown to be the same as the original analysis.

  9. Numerically accurate computational techniques for optimal estimator analyses of multi-parameter models

    NASA Astrophysics Data System (ADS)

    Berger, Lukas; Kleinheinz, Konstantin; Attili, Antonio; Bisetti, Fabrizio; Pitsch, Heinz; Mueller, Michael E.

    2018-05-01

    Modelling unclosed terms in partial differential equations typically involves two steps: First, a set of known quantities needs to be specified as input parameters for a model, and second, a specific functional form needs to be defined to model the unclosed terms by the input parameters. Both steps involve a certain modelling error, with the former known as the irreducible error and the latter referred to as the functional error. Typically, only the total modelling error, which is the sum of functional and irreducible error, is assessed, but the concept of the optimal estimator enables the separate analysis of the total and the irreducible errors, yielding a systematic modelling error decomposition. In this work, attention is paid to the techniques themselves required for the practical computation of irreducible errors. Typically, histograms are used for optimal estimator analyses, but this technique is found to add a non-negligible spurious contribution to the irreducible error if models with multiple input parameters are assessed. Thus, the error decomposition of an optimal estimator analysis becomes inaccurate, and misleading conclusions concerning modelling errors may be drawn. In this work, numerically accurate techniques for optimal estimator analyses are identified and a suitable evaluation of irreducible errors is presented. Four different computational techniques are considered: a histogram technique, artificial neural networks, multivariate adaptive regression splines, and an additive model based on a kernel method. For multiple input parameter models, only artificial neural networks and multivariate adaptive regression splines are found to yield satisfactorily accurate results. Beyond a certain number of input parameters, the assessment of models in an optimal estimator analysis even becomes practically infeasible if histograms are used. The optimal estimator analysis in this paper is applied to modelling the filtered soot intermittency in large eddy simulations using a dataset of a direct numerical simulation of a non-premixed sooting turbulent flame.
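
    In one dimension, the histogram technique examined in the paper reduces to estimating the optimal estimator E[y|x] by binning and measuring the residual variance around it, which is the irreducible error. A minimal hedged sketch with synthetic data:

        # Binned optimal-estimator analysis: irreducible error is the mean
        # squared deviation of y from its conditional mean E[y|x].
        import numpy as np

        rng = np.random.default_rng(5)
        x = rng.uniform(-1, 1, 100_000)
        y = np.sin(3 * x) + 0.2 * rng.normal(size=x.size)   # modelling target

        bins = np.linspace(-1, 1, 51)
        idx = np.digitize(x, bins) - 1
        cond_mean = np.array([y[idx == b].mean() for b in range(len(bins) - 1)])

        irreducible = np.mean((y - cond_mean[idx]) ** 2)  # error around E[y|x]
        total_const = np.var(y)                   # error of a constant model
        print(irreducible, total_const)           # ~0.04 versus much larger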

  10. Optimum decoding and detection of a multiplicative amplitude-encoded watermark

    NASA Astrophysics Data System (ADS)

    Barni, Mauro; Bartolini, Franco; De Rosa, Alessia; Piva, Alessandro

    2002-04-01

    The aim of this paper is to present a novel approach to the decoding and the detection of multibit, multiplicative, watermarks embedded in the frequency domain. Watermark payload is conveyed by amplitude modulating a pseudo-random sequence, thus resembling conventional DS spread spectrum techniques. As opposed to conventional communication systems, though, the watermark is embedded within the host DFT coefficients by using a multiplicative rule. The watermark decoding technique presented in the paper is an optimum one, in that it minimizes the bit error probability. The problem of watermark presence assessment, which is often underestimated by state-of-the-art research on multibit watermarking, is addressed too, and the optimum detection rule derived according to the Neyman-Pearson criterion. Experimental results are shown both to demonstrate the validity of the theoretical analysis and to highlight the good performance of the proposed system.

  11. Wavelet regression model in forecasting crude oil price

    NASA Astrophysics Data System (ADS)

    Hamid, Mohd Helmie; Shabri, Ani

    2017-05-01

    This study presents the performance of the wavelet multiple linear regression (WMLR) technique in daily crude oil price forecasting. The WMLR model was developed by integrating the discrete wavelet transform (DWT) and the multiple linear regression (MLR) model. The original time series was decomposed into sub-time series with different scales by wavelet theory. Correlation analysis was conducted to assist in the selection of optimal decomposed components as inputs for the WMLR model. The daily WTI crude oil price series was used in this study to test the prediction capability of the proposed model. The forecasting performance of the WMLR model was also compared with regular multiple linear regression (MLR), the autoregressive integrated moving average (ARIMA) model and generalized autoregressive conditional heteroscedasticity (GARCH) using root mean square error (RMSE) and mean absolute error (MAE). Based on the experimental results, it appears that the WMLR model performs better than the other forecasting techniques tested in this study.
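
    A hedged sketch of the WMLR idea follows, assuming the PyWavelets package: decompose the series, reconstruct each component at full length, and regress the next-day price on today's component values. The wavelet, decomposition level, train/test split and synthetic series are illustrative choices only.

        # Wavelet-MLR forecaster: DWT components as regressors for a one-step-
        # ahead ordinary least squares prediction.
        import numpy as np
        import pywt

        def wavelet_components(x, wavelet="db4", level=3):
            coeffs = pywt.wavedec(x, wavelet, level=level)
            parts = []
            for i in range(len(coeffs)):
                keep = [c if j == i else np.zeros_like(c)
                        for j, c in enumerate(coeffs)]
                parts.append(pywt.waverec(keep, wavelet)[: len(x)])
            return np.column_stack(parts)       # (T, level + 1) components

        rng = np.random.default_rng(6)
        price = np.cumsum(rng.normal(size=600)) + 50.0   # synthetic series
        X = wavelet_components(price)[:-1]               # components at day t
        X = np.column_stack([np.ones(len(X)), X])
        y = price[1:]                                    # price at day t + 1

        beta, *_ = np.linalg.lstsq(X[:500], y[:500], rcond=None)
        pred = X[500:] @ beta
        print("out-of-sample RMSE:", np.sqrt(np.mean((pred - y[500:]) ** 2)))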

  12. Multiple fault separation and detection by joint subspace learning for the health assessment of wind turbine gearboxes

    NASA Astrophysics Data System (ADS)

    Du, Zhaohui; Chen, Xuefeng; Zhang, Han; Zi, Yanyang; Yan, Ruqiang

    2017-09-01

    The gearbox of a wind turbine (WT) has dominant failure rates and the highest downtime loss among all WT subsystems. Thus, gearbox health assessment for maintenance cost reduction is of paramount importance. The concurrence of multiple faults in gearbox components is a common phenomenon due to fault induction mechanisms. This problem should be considered before planning to replace the components of the WT gearbox. Therefore, the key fault patterns should be reliably identified from noisy observation data for the development of an effective maintenance strategy. However, most of the existing studies focusing on multiple fault diagnosis suffer from an inappropriate division of fault information imposed to satisfy various rigorous decomposition principles or statistical assumptions, such as the smooth envelope principle of ensemble empirical mode decomposition and the mutual independence assumption of independent component analysis. Thus, this paper presents a joint subspace learning-based multiple fault detection (JSL-MFD) technique to construct different subspaces adaptively for different fault patterns. Its main advantage is its capability to learn multiple fault subspaces directly from the observation signal itself. It can also sparsely concentrate the feature information into a few dominant subspace coefficients. Furthermore, it can eliminate noise by simply performing coefficient shrinkage operations. Consequently, multiple fault patterns are reliably identified by utilizing the maximum fault information criterion. The superiority of JSL-MFD in multiple fault separation and detection is comprehensively investigated and verified by the analysis of a data set from a 750 kW WT gearbox. Results show that JSL-MFD is superior to a state-of-the-art technique in detecting hidden fault patterns and enhancing detection accuracy.

  13. A Novel Technique to Detect Code for SAC-OCDMA System

    NASA Astrophysics Data System (ADS)

    Bharti, Manisha; Kumar, Manoj; Sharma, Ajay K.

    2018-04-01

    The main task of an optical code division multiple access (OCDMA) system is the detection of the code used by a user in the presence of multiple access interference (MAI). In this paper, a new method of detection known as XOR subtraction detection for spectral amplitude coding OCDMA (SAC-OCDMA) based on double weight codes is proposed and presented. As MAI is the main source of performance deterioration in OCDMA systems, the SAC technique is used in this paper to eliminate the effect of MAI to a large extent. A comparative analysis is then made between the proposed scheme and conventional detection schemes such as complementary subtraction detection, AND subtraction detection and NAND subtraction detection. The system performance is characterized by Q-factor, BER and received optical power (ROP) with respect to input laser power and fiber length. The theoretical and simulation investigations reveal that the proposed detection technique provides a better quality factor, security and received power in comparison to the conventional techniques. The wide eye opening obtained with the proposed technique also demonstrates its robustness.

  14. Simplified multiple headspace extraction gas chromatographic technique for determination of monomer solubility in water.

    PubMed

    Chai, X S; Schork, F J; DeCinque, Anthony

    2005-04-08

    This paper reports an improved headspace gas chromatographic (GC) technique for the determination of monomer solubilities in water. The method is based on a multiple headspace extraction GC technique developed previously [X.S. Chai, Q.X. Hou, F.J. Schork, J. Appl. Polym. Sci., in press], but with a major modification of the calibration technique. As a result, only a few iterations of headspace extraction and GC measurement are required, which avoids "exhaustive" headspace extraction and thus reduces the experimental time for each analysis. For highly insoluble monomers, effort must be made to minimize adsorption in the headspace sampling channel, the transport conduit and the capillary column by using a higher operating temperature and a short capillary column in the headspace sampler and GC system. For highly water soluble monomers, a new calibration method is proposed. The combination of these modifications results in a method that is simple, rapid and automated. While the current focus of the authors is on the determination of monomer solubility in aqueous solutions, the method should be applicable to the determination of the solubility of any organic in water.
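
    The arithmetic underlying any MHE method is a geometric depletion series: the log of the peak area falls linearly with extraction number, so the total response can be extrapolated from a few extractions. A small hedged computation with fabricated areas:

        # MHE extrapolation: fit ln(area) vs. extraction index, then sum the
        # implied geometric series, total = A1 / (1 - exp(-q)).
        import numpy as np

        areas = np.array([1520.0, 1140.0, 851.0, 640.0])  # successive GC areas
        i = np.arange(len(areas))

        slope, intercept = np.polyfit(i, np.log(areas), 1)
        q = -slope                                # depletion rate per extraction
        A1 = np.exp(intercept)
        total = A1 / (1.0 - np.exp(-q))           # sum of the full series
        print(f"q = {q:.3f}, extrapolated total area = {total:.0f}")
        # Converting the total area to a concentration requires a calibration,
        # which is the step the record's modified method simplifies.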

  15. Domino: Extracting, Comparing, and Manipulating Subsets across Multiple Tabular Datasets

    PubMed Central

    Gratzl, Samuel; Gehlenborg, Nils; Lex, Alexander; Pfister, Hanspeter; Streit, Marc

    2016-01-01

    Answering questions about complex issues often requires analysts to take into account information contained in multiple interconnected datasets. A common strategy in analyzing and visualizing large and heterogeneous data is dividing it into meaningful subsets. Interesting subsets can then be selected and the associated data and the relationships between the subsets visualized. However, neither the extraction and manipulation nor the comparison of subsets is well supported by state-of-the-art techniques. In this paper we present Domino, a novel multiform visualization technique for effectively representing subsets and the relationships between them. By providing comprehensive tools to arrange, combine, and extract subsets, Domino allows users to create both common visualization techniques and advanced visualizations tailored to specific use cases. In addition to the novel technique, we present an implementation that enables analysts to manage the wide range of options that our approach offers. Innovative interactive features such as placeholders and live previews support rapid creation of complex analysis setups. We introduce the technique and the implementation using a simple example and demonstrate scalability and effectiveness in a use case from the field of cancer genomics. PMID:26356916

  16. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions.
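
    As an illustration of the first technique, the hedged sketch below fits a random-intercept multilevel model to mock momentary-assessment data with statsmodels; the subjects, variables and effect sizes are invented.

        # Multilevel model: repeated craving ratings nested within subjects,
        # with a random intercept per subject.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(7)
        n_sub, n_obs = 40, 30
        subject = np.repeat(np.arange(n_sub), n_obs)
        u = rng.normal(0, 0.8, n_sub)[subject]        # subject-level intercepts
        stress = rng.normal(size=n_sub * n_obs)
        craving = 2.0 + 0.5 * stress + u + rng.normal(size=n_sub * n_obs)

        df = pd.DataFrame({"subject": subject, "stress": stress,
                           "craving": craving})
        fit = smf.mixedlm("craving ~ stress", df, groups=df["subject"]).fit()
        print(fit.summary())   # fixed effect of stress + between-person variance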

  17. Helping agencies improve their planning analysis techniques.

    DOT National Transportation Integrated Search

    2011-11-18

    This report summarizes the results of a peer review of the AZTDM. The peer review was : supported by the Travel Model Improvement Program (TMIP), which is sponsored by FHWA. : The peer review of a travel model can serve multiple purposes, including i...

  18. Microbiological Detection Systems for Molecular Analysis of Environmental Water and Soil Samples

    EPA Science Inventory

    Multiple detection systems are being targeted to track various species and genotypes of pathogens found in environmental samples, with the overarching goal of developing analytical separation and detection techniques for Salmonella enterica Serovars Typhi, Cryptosporidium parvum,...

  19. [Difference between perinatal mortality in multiple pregnancies obtained spontaneously versus assisted reproduction].

    PubMed

    del Rayo Rivas-Ortiz, Yazmín; Hernández-Herrera, Ricardo Jorge

    2010-06-01

    Assisted reproduction techniques have recently become more common, which increases multiple pregnancies and adverse perinatal outcomes. Some authors report increased mortality among the products of multiple pregnancies obtained by assisted reproduction techniques vs. those conceived spontaneously, although other authors found no significant difference. To evaluate the mortality rate of multiple pregnancies, comparing those obtained by assisted reproduction vs. spontaneous conception. Retrospective, observational and comparative study. We included pregnant women with 3 or more products who attended the Unidad Médica de Alta Especialidad No. 23, IMSS, in Monterrey, NL (Mexico), between 2002-2008. We compared the number of complicated pregnancies and dead products obtained by a technique of assisted reproduction vs. spontaneous conception. 68 multiple pregnancies were included. On average, spontaneously conceived fetuses had more weeks of gestation and higher birth weight than those achieved by assisted reproduction techniques (p = ns). 20.5% (14/68) of multiple pregnancies had one or more fatal events: 10/40 (25%) by assisted reproduction techniques vs. 4/28 (14%) of spontaneous multiple pregnancies (p = 0.22). 21/134 (16%) of the products conceived by assisted reproduction techniques and 6/88 (7%) of spontaneous products (p < 0.03) died. 60% of all multiple pregnancies were obtained by a technique of assisted reproduction, and 21% of the cases had one or more fatal events (11% more in pregnancies achieved by assisted reproduction techniques). 12% of the products of multiple pregnancies died (9% more in those obtained by a technique of assisted reproduction).

  20. Monitoring Urban Greenness Dynamics Using Multiple Endmember Spectral Mixture Analysis

    PubMed Central

    Gan, Muye; Deng, Jinsong; Zheng, Xinyu; Hong, Yang; Wang, Ke

    2014-01-01

    Urban greenness is increasingly recognized as an essential constituent of the urban environment and can provide a range of services and enhance residents’ quality of life. Understanding the pattern of urban greenness and exploring its spatiotemporal dynamics would contribute valuable information for urban planning. In this paper, we investigated the pattern of urban greenness in Hangzhou, China, over the past two decades using time series Landsat-5 TM data obtained in 1990, 2002, and 2010. Multiple endmember spectral mixture analysis was used to derive vegetation cover fractions at the subpixel level. An RGB-vegetation fraction model, change intensity analysis and the concentric technique were integrated to reveal the detailed, spatial characteristics and the overall pattern of change in the vegetation cover fraction. Our results demonstrated the ability of multiple endmember spectral mixture analysis to accurately model the vegetation cover fraction in pixels despite the complex spectral confusion of different land cover types. The integration of multiple techniques revealed various changing patterns in urban greenness in this region. The overall vegetation cover has exhibited a drastic decrease over the past two decades, while no significant change occurred in the scenic spots that were studied. Meanwhile, a remarkable recovery of greenness was observed in the existing urban area. The increasing coverage of small green patches has played a vital role in the recovery of urban greenness. These changing patterns were more obvious during the period from 2002 to 2010 than from 1990 to 2002, and they revealed the combined effects of rapid urbanization and greening policies. This work demonstrates the usefulness of time series of vegetation cover fractions for conducting accurate and in-depth studies of the long-term trajectories of urban greenness to obtain meaningful information for sustainable urban development. PMID:25375176
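
    A simplified sketch of the MESMA idea follows (not the authors' processing chain): for each pixel, every candidate vegetation/non-vegetation endmember pair is unmixed with non-negative least squares and the best-fitting model's vegetation fraction is kept. The 4-band spectra are fabricated stand-ins for a TM endmember library.

        # Per-pixel MESMA-style unmixing: choose the endmember model with the
        # lowest reconstruction RMSE, then report the vegetation fraction.
        import numpy as np
        from scipy.optimize import nnls

        veg = np.array([[0.05, 0.08, 0.45, 0.22],     # candidate vegetation
                        [0.04, 0.07, 0.55, 0.30]])    # spectra (rows)
        imperv = np.array([[0.20, 0.25, 0.28, 0.30],  # candidate impervious
                           [0.30, 0.34, 0.36, 0.38]]) # spectra (rows)

        def mesma_veg_fraction(pixel):
            best_rmse, best_frac = np.inf, 0.0
            for v in veg:                     # iterate over endmember models
                for s in imperv:
                    E = np.column_stack([v, s])
                    frac, _ = nnls(E, pixel)  # non-negative fractions
                    rmse = np.sqrt(np.mean((E @ frac - pixel) ** 2))
                    if rmse < best_rmse:
                        best_rmse = rmse
                        best_frac = frac[0] / frac.sum()   # vegetation share
            return best_frac

        pixel = 0.6 * veg[1] + 0.4 * imperv[0] + 0.005   # mostly vegetated
        print(round(mesma_veg_fraction(pixel), 2))       # ~0.6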

  1. Sensor fusion III: 3-D perception and recognition; Proceedings of the Meeting, Boston, MA, Nov. 5-8, 1990

    NASA Technical Reports Server (NTRS)

    Schenker, Paul S. (Editor)

    1991-01-01

    The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.

  2. Analysis of dense-medium light scattering with applications to corneal tissue: experiments and Monte Carlo simulations.

    PubMed

    Kim, K B; Shanyfelt, L M; Hahn, D W

    2006-01-01

    Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed to make use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations is reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantifying dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering, together with animal studies, is necessary.

  3. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    NASA Technical Reports Server (NTRS)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.
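
    MACSYMA is no longer in common use, but the same workflow can be illustrated with SymPy: build a Rayleigh-Ritz integrand symbolically, integrate it in closed form, and emit Fortran for the resulting coefficient. The one-dimensional trial functions below are toy stand-ins for the laminated-plate energy terms.

        # Symbolic derivation of a stiffness-like coefficient, then FORTRAN
        # code generation, echoing the MACSYMA-to-FORTRAN pipeline.
        import sympy as sp

        x, a = sp.symbols("x a", positive=True)
        phi1 = x * (a - x)                 # toy polynomial trial functions
        phi2 = x**2 * (a - x)
        integrand = sp.diff(phi1, x, 2) * sp.diff(phi2, x, 2)

        k12 = sp.simplify(sp.integrate(integrand, (x, 0, a)))
        print(k12)                                # closed-form coefficient
        print(sp.fcode(k12, assign_to="K12"))     # generated FORTRAN statement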

  4. Analysis of Extracellular Vesicles in the Tumor Microenvironment.

    PubMed

    Al-Nedawi, Khalid; Read, Jolene

    2016-01-01

    Extracellular vesicles (ECV) are membrane compartments shed from all types of cells in various physiological and pathological states. In recent years, ECV have gained increasing interest from the scientific community for their role as intercellular communicators that play important roles in modifying the tumor microenvironment. Multiple techniques have been established to collect ECV from conditioned media of cell culture or from physiological fluids. The gold standard methodology is differential centrifugation. Although alternative techniques exist to collect ECV, these techniques have not proven suitable as substitutes for the ultracentrifugation procedure.

  5. Multivariate statistical analysis of wildfires in Portugal

    NASA Astrophysics Data System (ADS)

    Costa, Ricardo; Caramelo, Liliana; Pereira, Mário

    2013-04-01

    Several studies demonstrate that wildfires in Portugal present high temporal and spatial variability as well as cluster behavior (Pereira et al., 2005, 2011). This study aims to contribute to the characterization of the fire regime in Portugal with a multivariate statistical analysis of the time series of the number of fires and area burned in Portugal during the 1980-2009 period. The data used in the analysis is an extended version of the Rural Fire Portuguese Database (PRFD) (Pereira et al., 2011), provided by the National Forest Authority (Autoridade Florestal Nacional, AFN), the Portuguese Forest Service, which includes information for more than 500,000 fire records. Several advanced techniques exist for examining the relationships among multiple time series at the same time (e.g., canonical correlation analysis, principal components analysis, factor analysis, path analysis, multiple analysis of variance, clustering systems). This study compares and discusses the results obtained with these different techniques. Pereira, M.G., Trigo, R.M., DaCamara, C.C., Pereira, J.M.C., Leite, S.M., 2005: "Synoptic patterns associated with large summer forest fires in Portugal". Agricultural and Forest Meteorology, 129, 11-25. Pereira, M. G., Malamud, B. D., Trigo, R. M., and Alves, P. I.: The history and characteristics of the 1980-2005 Portuguese rural fire database, Nat. Hazards Earth Syst. Sci., 11, 3343-3358, doi:10.5194/nhess-11-3343-2011, 2011. This work is supported by European Union Funds (FEDER/COMPETE - Operational Competitiveness Programme) and by national funds (FCT - Portuguese Foundation for Science and Technology) under the project FCOMP-01-0124-FEDER-022692, the project FLAIR (PTDC/AAC-AMB/104702/2008) and the EU 7th Framework Program through FUME (contract number 243888).

  6. Smart Sampling and HPC-based Probabilistic Look-ahead Contingency Analysis Implementation and its Evaluation with Real-world Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    This paper describes a probabilistic look-ahead contingency analysis application that incorporates smart sampling and high-performance computing (HPC) techniques. Smart sampling techniques are implemented to effectively represent the structure and statistical characteristics of uncertainty introduced by different sources in the power system. They can significantly reduce the data set size required for multiple look-ahead contingency analyses, and therefore reduce the time required to compute them. HPC techniques are used to further reduce computational time. These two techniques enable a predictive capability that forecasts the impact of various uncertainties on potential transmission limit violations. The developed package has been tested with real-world data from the Bonneville Power Administration. Case study results are presented to demonstrate the performance of the applications developed.
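
    The paper does not spell out its smart-sampling scheme in this abstract; as one plausible illustration, Latin hypercube sampling covers an uncertainty space far more evenly than the same number of independent draws, so fewer scenarios are needed per look-ahead contingency run. A minimal sketch (requires SciPy >= 1.7; variable names and bounds are hypothetical):

```python
import numpy as np
from scipy.stats import qmc

# Two uncertain inputs: load forecast error (MW) and wind output (MW); bounds illustrative
l_bounds = [-150.0, 0.0]
u_bounds = [150.0, 800.0]

sampler = qmc.LatinHypercube(d=2, seed=42)
unit_sample = sampler.random(n=64)                    # 64 stratified points in [0, 1)^2
scenarios = qmc.scale(unit_sample, l_bounds, u_bounds)

# Each row is one scenario to feed a look-ahead contingency analysis run;
# 64 stratified scenarios can stand in for many more naive Monte Carlo draws.
for load_err, wind in scenarios[:3]:
    print(f"load error = {load_err:7.1f} MW, wind = {wind:6.1f} MW")
```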

  7. An Optimization of Inventory Demand Forecasting in University Healthcare Centre

    NASA Astrophysics Data System (ADS)

    Bon, A. T.; Ng, T. K.

    2017-01-01

    Healthcare has become an important industry nowadays, as it directly concerns people's health. Accordingly, forecasting demand for health services is an important step in managerial decision making for all healthcare organizations. A case study was conducted in University Health Centre to collect historical demand data of Panadol 650 mg for 68 months, from January 2009 until August 2014. The aim of the research is to optimize the overall inventory demand through forecasting techniques. Quantitative (time series) forecasting models were used in the case study to forecast future data as a function of past data. The data pattern needs to be identified before applying the forecasting techniques; the pattern here is a trend, and ten forecasting techniques were then applied using the Risk Simulator software. The best forecasting technique is identified as the one with the least forecasting error. The ten forecasting techniques are single moving average, single exponential smoothing, double moving average, double exponential smoothing, regression, Holt-Winter's additive, seasonal additive, Holt-Winter's multiplicative, seasonal multiplicative, and the autoregressive integrated moving average (ARIMA). According to the forecasting accuracy measurement, the best forecasting technique is regression analysis.
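
    As a sketch of how such a comparison works, two of the listed techniques (trend regression and single exponential smoothing) can be fitted to a monthly demand series and scored by RMSE; the series below is synthetic, and the Risk Simulator workflow is not reproduced.

```python
import numpy as np

demand = np.array([120, 132, 128, 141, 150, 149, 158, 166, 172, 170, 181, 190], float)
t = np.arange(len(demand))

# Technique 1: linear trend regression (fits the trend pattern noted in the study)
slope, intercept = np.polyfit(t, demand, 1)
reg_fit = intercept + slope * t

# Technique 2: single exponential smoothing, S[t+1] = a*y[t] + (1-a)*S[t]
alpha, ses_fit = 0.3, [demand[0]]
for y in demand[:-1]:
    ses_fit.append(alpha * y + (1 - alpha) * ses_fit[-1])
ses_fit = np.array(ses_fit)

def rmse(actual, fitted):
    return np.sqrt(np.mean((actual - fitted) ** 2))

# The technique with the smaller error would be selected, as in the study
print(f"RMSE regression:      {rmse(demand, reg_fit):.2f}")
print(f"RMSE exp. smoothing:  {rmse(demand, ses_fit):.2f}")
```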

  8. Radio Occultation Experiments with Venus Express and Mars Express using the Planetary Radio Interferometry and Doppler Experiment (PRIDE) Technique

    NASA Astrophysics Data System (ADS)

    Bocanegra Bahamon, T.; Gurvits, L.; Molera Calves, G.; Cimo, G.; Duev, D.; Pogrebenko, S.; Dirkx, D.; Rosenblatt, P.

    2017-12-01

    The Planetary Radio Interferometry and Doppler Experiment (PRIDE) is a technique that can be used to enhance multiple radio science experiments of planetary missions. By 'eavesdropping' on the spacecraft signal using radio telescopes from different VLBI networks around the world, the PRIDE technique provides precise open-loop Doppler and VLBI observables from which the spacecraft's orbit can be reconstructed. The application of this technique to atmospheric studies has been assessed by observing ESA's Venus Express (VEX) and Mars Express (MEX) during multiple Venus and Mars occultation events between 2012 and 2014. From these observing sessions, density, temperature, and pressure profiles of the neutral atmospheres and ionospheres of Venus and Mars have been retrieved. We present an error propagation analysis in which the uncertainties of the atmospheric properties measured with this technique have been derived. These activities serve as a demonstration of the applicability of the PRIDE technique to radio occultation studies, and provide a benchmark against the traditional Doppler tracking provided by NASA's DSN and ESA's Estrack networks for the same purposes, in the framework of the upcoming ESA JUICE mission to the Jovian system.

  9. An analysis of pilot error-related aircraft accidents

    NASA Technical Reports Server (NTRS)

    Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.

    1974-01-01

    A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used: Critical element analysis, which demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable. Cluster analysis, which was an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses. Pattern recognition, by which accidents can be categorized by pattern conformity after critical element identification by cluster analysis.

  10. Analysis of remote sensing data for evaluation of vegetation resources

    NASA Technical Reports Server (NTRS)

    1970-01-01

    Research has centered around: (1) completion of a study on the use of remote sensing techniques as an aid to multiple use management; (2) determination of the information transfer at various image resolution levels for wildland areas; and (3) determination of the value of small scale multiband, multidate photography for the analysis of vegetation resources. In addition, a substantial effort was made to upgrade the automatic image classification and spectral signature acquisition capabilities of the laboratory. It was found that: (1) Remote sensing techniques should be useful in multiple use management to provide a first-cut analysis of an area. (2) Imagery with 400-500 feet ground resolvable distance (GRD), such as that expected from ERTS-1, should allow discriminations to be made between woody vegetation, grassland, and water bodies with approximately 80% accuracy. (3) Barley and wheat acreages in Maricopa County, Arizona could be estimated with acceptable accuracies using small scale multiband, multidate photography. Sampling errors for acreages of wheat, barley, small grains (wheat and barley combined), and all cropland were 13%, 11%, 8% and 3% respectively.

  11. Investigation to realize a computationally efficient implementation of the high-order instantaneous-moments-based fringe analysis method

    NASA Astrophysics Data System (ADS)

    Gorthi, Sai Siva; Rajshekhar, Gannavarpu; Rastogi, Pramod

    2010-06-01

    Recently, a high-order instantaneous moments (HIM)-operator-based method was proposed for accurate phase estimation in digital holographic interferometry. The method relies on piece-wise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients from the HIM operator using single-tone frequency estimation. The work presents a comparative analysis of the performance of different single-tone frequency estimation techniques, like Fourier transform followed by optimization, estimation of signal parameters by rotational invariance technique (ESPRIT), multiple signal classification (MUSIC), and iterative frequency estimation by interpolation on Fourier coefficients (IFEIF) in HIM-operator-based methods for phase estimation. Simulation and experimental results demonstrate the potential of the IFEIF technique with respect to computational efficiency and estimation accuracy.
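
    The single-tone estimation step can be illustrated with a coarse FFT peak search refined by interpolation on the neighboring Fourier coefficients; the sketch below uses the well-known Jacobsen estimator as a generic stand-in, not the exact IFEIF algorithm of the paper.

```python
import numpy as np

def estimate_tone(x):
    """Estimate the normalized frequency (cycles/sample) of a single complex tone."""
    X = np.fft.fft(x)
    k = int(np.argmax(np.abs(X)))               # coarse estimate: FFT peak bin
    n = len(x)
    Xm, Xp = X[(k - 1) % n], X[(k + 1) % n]
    # Fine offset by interpolation on Fourier coefficients (Jacobsen estimator)
    delta = -np.real((Xp - Xm) / (2 * X[k] - Xm - Xp))
    return (k + delta) / n

n = 256
f_true = 0.1234
x = np.exp(2j * np.pi * f_true * np.arange(n))
print(f"true {f_true:.6f}  estimated {estimate_tone(x):.6f}")
```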

  12. Short-Arc Analysis of Intersatellite Tracking Data in a Gravity Mapping Mission

    NASA Technical Reports Server (NTRS)

    Rowlands, David D.; Ray, Richard D.; Chinn, Douglas S.; Lemoine, Frank G.; Smith, David E. (Technical Monitor)

    2001-01-01

    A technique for the analysis of low-low intersatellite range-rate data in a gravity mapping mission is explored. The technique is based on standard tracking data analysis for orbit determination but uses a spherical coordinate representation of the 12 epoch state parameters describing the baseline between the two satellites. This representation of the state parameters is exploited to allow the intersatellite range-rate analysis to benefit from information provided by other tracking data types without large simultaneous multiple data type solutions. The technique appears especially valuable for estimating gravity from short arcs (e.g., less than 15 minutes) of data. Gravity recovery simulations which use short arcs are compared with those using arcs a day in length. For a high-inclination orbit, the short-arc analysis recovers low-order gravity coefficients remarkably well, although higher order terms, especially sectorial terms, are less accurate. Simulations suggest that either long or short arcs of GRACE data are likely to improve parts of the geopotential spectrum by orders of magnitude.

  13. A Pragmatic Cognitive System Engineering Approach to Model Dynamic Human Decision-Making Activities in Intelligent and Automated Systems

    DTIC Science & Technology

    2003-10-01

    Among the procedures developed to identify cognitive processes, there are the Cognitive Task Analysis (CTA) and the Cognitive Work Analysis (CWA)... of Cognitive Task Design. [11] Potter, S.S., Roth, E.M., Woods, D.D., and Elm, W.C. (2000). Cognitive Task Analysis as Bootstrapping Multiple... Converging Techniques. In Schraagen, Chipman, and Shalin (Eds.), Cognitive Task Analysis. Mahwah, NJ: Lawrence Erlbaum Associates. [12] Roth, E.M

  14. Visualization Techniques for Computer Network Defense

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beaver, Justin M; Steed, Chad A; Patton, Robert M

    2011-01-01

    Effective visual analysis of computer network defense (CND) information is challenging due to the volume and complexity of both the raw and analyzed network data. A typical CND is comprised of multiple niche intrusion detection tools, each of which performs network data analysis and produces a unique alerting output. The state-of-the-practice in the situational awareness of CND data is the prevalent use of custom-developed scripts by Information Technology (IT) professionals to retrieve, organize, and understand potential threat events. We propose a new visual analytics framework, called the Oak Ridge Cyber Analytics (ORCA) system, for CND data that allows an operator to interact with all detection tool outputs simultaneously. Aggregated alert events are presented in multiple coordinated views with timeline, cluster, and swarm model analysis displays. These displays are complemented with both supervised and semi-supervised machine learning classifiers. The intent of the visual analytics framework is to improve CND situational awareness, to enable an analyst to quickly navigate and analyze thousands of detected events, and to combine sophisticated data analysis techniques with interactive visualization such that patterns of anomalous activities may be more easily identified and investigated.

  15. Analyzing Multiple-Choice Questions by Model Analysis and Item Response Curves

    NASA Astrophysics Data System (ADS)

    Wattanakasiwich, P.; Ananta, S.

    2010-07-01

    In physics education research, the main goal is to improve physics teaching so that most students understand physics conceptually and are able to apply concepts when solving problems. Many multiple-choice instruments have therefore been developed to probe students' conceptual understanding of various topics. Two techniques, model analysis and item response curves, were used to analyze students' responses from the Force and Motion Conceptual Evaluation (FMCE). For this study, FMCE data from more than 1000 students at Chiang Mai University were collected over the past three years. With model analysis, we can obtain students' alternative knowledge and the probabilities with which students use such knowledge in a range of equivalent contexts. Model analysis consists of two algorithms: the concentration factor and model estimation. This paper only presents results from using the model estimation algorithm to obtain a model plot; the plot helps to identify whether a class model state lies in the misconception region. An item response curve (IRC), derived from item response theory, is a plot of the percentage of students selecting a particular choice against their total score. Pros and cons of both techniques are compared and discussed.
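
    An item response curve of the kind described is simple to compute from raw responses: group students by total score and take the fraction selecting each choice. A minimal sketch on synthetic responses (not the FMCE data of the study):

```python
import numpy as np

rng = np.random.default_rng(1)
n_students, n_choices = 1000, 5
total_score = rng.integers(0, 31, n_students)          # total scores, 0..30
# Synthetic choices on one item: higher scorers favor choice 0 (the correct one)
p_correct = total_score / 30
choice = np.where(rng.random(n_students) < p_correct, 0,
                  rng.integers(1, n_choices, n_students))

# IRC: for each total score, the percentage of students selecting each choice
for s in range(0, 31, 5):
    sel = choice[total_score == s]
    if sel.size:
        pcts = [100 * np.mean(sel == c) for c in range(n_choices)]
        print(f"score {s:2d}: " + "  ".join(f"{p:4.0f}%" for p in pcts))
```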

  16. Analysis and prediction of flow from local source in a river basin using a Neuro-fuzzy modeling tool.

    PubMed

    Aqil, Muhammad; Kita, Ichiro; Yano, Akira; Nishiyama, Soichi

    2007-10-01

    Traditionally, the multiple linear regression technique has been one of the most widely used models for simulating hydrological time series. However, when the nonlinear phenomenon is significant, multiple linear regression will fail to develop an appropriate predictive model. Recently, neuro-fuzzy systems have gained much popularity for calibrating nonlinear relationships. This study evaluated the potential of a neuro-fuzzy system as an alternative to the traditional statistical regression technique for predicting flow from a local source in a river basin. The effectiveness of the proposed identification technique was demonstrated through a simulation study of the river flow time series of the Citarum River in Indonesia. Furthermore, to quantify the uncertainty associated with the estimation of river flow, a Monte Carlo simulation was performed. As a comparison, the multiple linear regression analysis being used by the Citarum River Authority was also examined using various statistical indices. The simulation results using 95% confidence intervals indicated that the neuro-fuzzy model consistently underestimated the magnitude of high flows, while low and medium flow magnitudes were estimated closer to the observed data. The comparison of prediction accuracy indicated that the neuro-fuzzy approach was more accurate in predicting river flow dynamics, improving the root mean square error (RMSE) and mean absolute percentage error (MAPE) of the multiple linear regression forecasts by about 13.52% and 10.73%, respectively. Considering its simplicity and efficiency, the neuro-fuzzy model is recommended as an alternative tool for modeling flow dynamics in the study area.
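
    The Monte Carlo uncertainty step can be sketched independently of the neuro-fuzzy machinery: propagate assumed input uncertainties through the trained predictor and read off a 95% interval. The model below is a placeholder linear function standing in for the trained system; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

def model(rain, level):                      # placeholder for the trained predictor
    return 0.8 * rain + 1.5 * level + 2.0

rain_obs, level_obs = 42.0, 3.1              # observed inputs (illustrative units)
sigma_rain, sigma_level = 4.0, 0.2           # assumed input uncertainties

# Monte Carlo: draw perturbed inputs, collect the distribution of predicted flow
draws = model(rng.normal(rain_obs, sigma_rain, 10_000),
              rng.normal(level_obs, sigma_level, 10_000))
lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"flow = {draws.mean():.1f} (95% CI {lo:.1f}-{hi:.1f})")
```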

  17. Tradespace Exploration for the Engineering of Resilient Systems

    DTIC Science & Technology

    2015-05-01

    world scenarios. The types of tools within the SAE set include visualization, decision analysis, and M&S, so it is difficult to categorize this toolset... overpopulated, or questionable. ERS Tradespace Workshop: Create predictive models using multiple techniques (e.g., regression, Kriging, neural nets

  18. Technique for estimating depth of 100-year floods in Tennessee

    USGS Publications Warehouse

    Gamble, Charles R.; Lewis, James G.

    1977-01-01

    Preface: A method is presented for estimating the depth of the 100-year flood in four hydrologic areas in Tennessee. Depths at 151 gaging stations on streams that were not significantly affected by man-made changes were related to basin characteristics by multiple regression techniques. Equations derived from the analysis can be used to estimate the depth of the 100-year flood if the size of the drainage basin is known.
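
    Regional equations of this kind are typically power laws obtained by log-log regression of station flood depths on basin characteristics. A minimal single-predictor sketch with hypothetical station data:

```python
import numpy as np

# Hypothetical station data: drainage area (sq mi) and 100-year flood depth (ft)
area = np.array([12, 25, 60, 110, 240, 500, 900], float)
depth = np.array([6.1, 7.8, 9.9, 11.5, 14.2, 16.8, 19.5], float)

# Fit depth = a * area^b by ordinary least squares in log space
b, log_a = np.polyfit(np.log(area), np.log(depth), 1)
a = np.exp(log_a)
print(f"depth ~ {a:.2f} * area^{b:.3f}")
print(f"estimated 100-year depth for a 300 sq mi basin: {a * 300**b:.1f} ft")
```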

  19. ROCKETSHIP: a flexible and modular software tool for the planning, processing and analysis of dynamic MRI studies.

    PubMed

    Barnes, Samuel R; Ng, Thomas S C; Santa-Maria, Naomi; Montagne, Axel; Zlokovic, Berislav V; Jacobs, Russell E

    2015-06-16

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a promising technique to characterize pathology and evaluate treatment response. However, analysis of DCE-MRI data is complex and benefits from concurrent analysis of multiple kinetic models and parameters. Few software tools are currently available that specifically focus on DCE-MRI analysis with multiple kinetic models. Here, we developed ROCKETSHIP, an open-source, flexible and modular software package for DCE-MRI analysis, implemented in the MATLAB programming language. ROCKETSHIP incorporates analyses with multiple kinetic models, including data-driven nested model analysis. The robustness of the software in providing reliable fits using multiple kinetic models is demonstrated using simulated data; the simulations also demonstrate the utility of the data-driven nested model analysis. Applicability of ROCKETSHIP to both preclinical and clinical studies is shown using DCE-MRI studies of the human brain and a murine tumor model. ROCKETSHIP was designed to be easily accessible for the beginner, but flexible enough for changes or additions to be made by the advanced user as well. The availability of a flexible analysis tool will aid future studies using DCE-MRI. A public release of ROCKETSHIP is available at https://github.com/petmri/ROCKETSHIP.

  20. Analysis of intracranial pressure: past, present, and future.

    PubMed

    Di Ieva, Antonio; Schmitz, Erika M; Cusimano, Michael D

    2013-12-01

    The monitoring of intracranial pressure (ICP) is an important tool in medicine for its ability to portray the brain's compliance status. The bedside monitor displays the ICP waveform and intermittent mean values to guide physicians in the management of patients, particularly those having sustained a traumatic brain injury. Researchers in the fields of engineering and physics have investigated various mathematical analysis techniques applicable to the waveform in order to extract additional diagnostic and prognostic information, although these largely remain limited to research applications. The purpose of this review is to present the current techniques used to monitor and interpret ICP and to explore the potential of advanced mathematical techniques to provide information about system perturbations from states of homeostasis. We discuss the limits of each proposed technique, and we propose that nonlinear analysis could be a reliable approach to describing ICP signals over time, with the fractal dimension as a potentially predictive, clinically meaningful biomarker. Our goal is to stimulate translational research that can move modern analysis of ICP using these techniques into widespread practical use, and to investigate the clinical utility of a tool capable of simplifying the multiple variables obtained from various sensors.
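
    As one concrete instance of the nonlinear tools discussed, Higuchi's method estimates the fractal dimension of a signal from average curve lengths at multiple scales; a compact sketch on a synthetic ICP-like trace (not clinical data):

```python
import numpy as np

def higuchi_fd(x, k_max=8):
    """Higuchi fractal dimension of a 1-D signal."""
    n = len(x)
    ks = np.arange(1, k_max + 1)
    lengths = []
    for k in ks:
        lk = []
        for m in range(k):
            idx = np.arange(m, n, k)                 # subsampled series x[m::k]
            d = np.abs(np.diff(x[idx])).sum()
            lk.append(d * (n - 1) / ((len(idx) - 1) * k * k))  # normalized length
        lengths.append(np.mean(lk))
    # FD is the slope of log(length) versus log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lengths), 1)
    return slope

rng = np.random.default_rng(3)
t = np.linspace(0, 60, 6000)
icp = 10 + 2 * np.sin(2 * np.pi * 1.2 * t) + 0.5 * rng.standard_normal(t.size)
print(f"Higuchi FD ~ {higuchi_fd(icp):.2f}")   # near 1 for smooth, near 2 for noisy signals
```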

  1. Time-Sharing-Based Synchronization and Performance Evaluation of Color-Independent Visual-MIMO Communication.

    PubMed

    Kwon, Tae-Ho; Kim, Jai-Eun; Kim, Ki-Doo

    2018-05-14

    In the field of communication, synchronization is always an important issue. The communication between a light-emitting diode (LED) array (LEA) and a camera is known as visual multiple-input multiple-output (MIMO), for which the data transmitter and receiver must be synchronized for seamless communication. In visual-MIMO, the LEDs generally have a faster data rate than the camera. Hence, we propose an effective time-sharing-based synchronization technique whose color-independent characteristics are the key to overcoming this synchronization problem in visual-MIMO communication. We also evaluated the performance of our synchronization technique by varying the distance between the LEA and the camera, and present a graphical analysis comparing the symbol error rate (SER) at different distances.

  2. Coordinating Multiple Spacecraft Assets for Joint Science Campaigns

    NASA Technical Reports Server (NTRS)

    Estlin, Tara; Chien, Steve; Castano, Rebecca; Gaines, Daniel; de Granville, Charles; Doubleday, Josh; Anderson, Robert C.; Knight, Russell; Bornstein, Benjamin; Rabideau, Gregg

    2010-01-01

    This paper describes technology to support a new paradigm of space science campaigns. These campaigns enable opportunistic science observations to be autonomously coordinated between multiple spacecraft. Coordinated spacecraft can consist of multiple orbiters, landers, rovers, or other in-situ vehicles (such as an aerobot). In this paradigm, opportunistic science detections can be cued by any of these assets, with additional spacecraft requested to take further observations characterizing the identified event or surface feature. Such coordination will enable a number of science campaigns not possible with present spacecraft technology. Examples from Mars include enabling rapid data collection from multiple craft on dynamic events such as new Mars dark slope streaks, dust devils, or trace gases. Technology to support the identification of opportunistic science events and/or the re-tasking of a spacecraft to take new measurements of the event is already in place on several individual missions, such as the Mars Exploration Rover (MER) Mission and the Earth Observing One (EO1) Mission. This technology includes onboard data analysis techniques as well as capabilities for planning and scheduling. This paper describes how these techniques can be used to cue and coordinate multiple spacecraft to observe the same science event from their different vantage points.

  3. Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.

    PubMed

    Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P; McDonald-Maier, Klaus D

    2015-05-08

    A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences.

  4. Simplified Phased-Mission System Analysis for Systems with Independent Component Repairs

    NASA Technical Reports Server (NTRS)

    Somani, Arun K.

    1996-01-01

    Accurate analysis of system reliability requires accounting for all major variations in the system's operation. Most reliability analyses assume that the system configuration, success criteria, and component behavior remain the same throughout a mission; in practice, however, multiple phases are natural. We present a new computationally efficient technique for the analysis of phased-mission systems in which the operational states of a system can be described by combinations of component states (such as fault trees or assertions). Moreover, individual components may be repaired, if failed, as part of system operation, but repairs are independent of the system state. For repairable systems, Markov analysis techniques are commonly used, but they suffer from state-space explosion, which limits the size of system that can be analyzed and makes the computation expensive. We avoid the state-space explosion. A phase algebra is used to account for the effects of variable configurations, repairs, and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate the technique by means of several examples and present numerical results to show the effects of phases and repairs on system reliability/availability.
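
    The flavor of phased-mission analysis can be conveyed with a small Monte Carlo sketch: components fail at constant rates, each phase imposes its own success criterion, and, because component states only degrade when there is no repair, checking survival at each phase end suffices for monotone criteria. This toy version omits the repairs and the exact phase algebra of the paper; rates, phase times, and criteria are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Three components with constant failure rates (per hour, illustrative)
rates = np.array([1e-3, 2e-3, 5e-4])
phase_ends = np.array([2.0, 6.0, 7.5])       # cumulative phase end times (hours)

# Success criteria per phase, over component up/down states:
criteria = [
    lambda up: up[0] and up[1],              # phase 1: components 1 AND 2
    lambda up: up[0] and (up[1] or up[2]),   # phase 2: 1 AND (2 OR 3)
    lambda up: up[2],                        # phase 3: component 3 only
]

n_trials = 50_000
lifetimes = rng.exponential(1.0 / rates, size=(n_trials, rates.size))
ok = np.ones(n_trials, bool)
for t_end, crit in zip(phase_ends, criteria):
    up = lifetimes > t_end                   # which components survive to phase end
    ok &= np.array([crit(u) for u in up])
print(f"mission reliability ~ {ok.mean():.4f}")
```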

  5. Detecting multiple outliers in linear functional relationship model for circular variables using clustering technique

    NASA Astrophysics Data System (ADS)

    Mokhtar, Nurkhairany Amyra; Zubairi, Yong Zulina; Hussin, Abdul Ghapor

    2017-05-01

    Outlier detection has been used extensively in data analysis to detect anomalous observations, and has important applications in fraud detection and robust analysis. In this paper, we propose a method for detecting multiple outliers for circular variables in the linear functional relationship model. Using the residual values of the Caires and Wyatt model, we apply a hierarchical clustering procedure. Using a tree diagram, we illustrate the graphical approach to outlier detection. A simulation study is carried out to verify the accuracy of the proposed method, and an illustration with a real data set is given to show its practical applicability.
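
    The clustering step can be sketched generically: single-linkage hierarchical clustering of residuals isolates observations far from the main group as their own small clusters. The sketch below runs on synthetic linear-model residuals rather than the circular-regression residuals of the paper.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(2)
residuals = np.concatenate([rng.normal(0.0, 0.1, 60),    # bulk of observations
                            np.array([1.4, -1.7, 1.9])]) # planted outliers

# Single-linkage tree on the 1-D residuals; cutting it at height h splits any
# observation farther than h from the main group into its own cluster.
Z = linkage(residuals.reshape(-1, 1), method="single")
labels = fcluster(Z, t=0.5, criterion="distance")

main = np.bincount(labels).argmax()          # largest cluster = inliers
outlier_idx = np.where(labels != main)[0]
print("flagged outliers at indices:", outlier_idx)
```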

  6. Metabolite identification of triptolide by data-dependent accurate mass spectrometric analysis in combination with online hydrogen/deuterium exchange and multiple data-mining techniques.

    PubMed

    Du, Fuying; Liu, Ting; Liu, Tian; Wang, Yongwei; Wan, Yakun; Xing, Jie

    2011-10-30

    Triptolide (TP), the primary active component of the herbal medicine Tripterygium wilfordii Hook F, has shown promising antileukemic and anti-inflammatory activity. The pharmacokinetic profile of TP indicates extensive metabolic elimination in vivo; however, metabolic data are rarely available, partly because identification is difficult owing to the absence of appropriate ultraviolet chromophores in the structure and the presence of endogenous interferences in biological samples. In the present study, the biotransformation of TP was investigated by improved data-dependent accurate mass spectrometric analysis, using an LTQ/Orbitrap hybrid mass spectrometer in conjunction with the online hydrogen (H)/deuterium (D) exchange technique for rapid structural characterization. Accurate full-scan MS and MS/MS data were processed with multiple post-acquisition data-mining techniques, which were complementary and effective in detecting both common and uncommon metabolites in biological matrices. As a result, 38 phase I, 9 phase II and 8 N-acetylcysteine (NAC) metabolites of TP were found in rat urine. Accurate MS/MS data were used to support assignments of metabolite structures, and online H/D exchange experiments provided additional evidence for exchangeable hydrogen atoms in the structures. The results showed that the main phase I metabolic pathways of TP are hydroxylation, hydrolysis and desaturation, with the resulting metabolites subsequently undergoing phase II processes. The presence of NAC conjugates indicated the capability of TP to form reactive intermediate species. This study also demonstrated the effectiveness of LC/HR-MS(n) in combination with multiple post-acquisition data-mining methods and the online H/D exchange technique for the rapid identification of drug metabolites. Copyright © 2011 John Wiley & Sons, Ltd.

  7. Multiple locus VNTR analysis highlights that geographical clustering and distribution of Dichelobacter nodosus, the causal agent of footrot in sheep, correlates with inter-country movements

    PubMed Central

    Russell, Claire L.; Smith, Edward M.; Calvo-Bado, Leonides A.; Green, Laura E.; Wellington, Elizabeth M.H.; Medley, Graham F.; Moore, Lynda J.; Grogono-Thomas, Rosemary

    2014-01-01

    Dichelobacter nodosus is a Gram-negative, anaerobic bacterium and the causal agent of footrot in sheep. Multiple locus variable number tandem repeat (VNTR) analysis (MLVA) is a portable technique that involves the identification and enumeration of polymorphic tandem repeats across the genome. The aims of this study were to develop an MLVA scheme for D. nodosus suitable for use as a molecular typing tool, and to apply it to a global collection of isolates. Seventy-seven isolates selected from regions with a long history of footrot (GB, Australia) and regions where footrot has recently been reported (India, Scandinavia), were characterised. From an initial 61 potential VNTR regions, four loci were identified as usable and in combination had the attributes required of a typing method for use in bacterial epidemiology: high discriminatory power (D > 0.95), typeability and reproducibility. Results from the analysis indicate that D. nodosus appears to have evolved via recombinational exchanges and clonal diversification. This has resulted in some clonal complexes that contain isolates from multiple countries and continents; and others that contain isolates from a single geographic location (country or region). The distribution of alleles between countries matches historical accounts of sheep movements, suggesting that the MLVA technique is sufficiently specific and sensitive for an epidemiological investigation of the global distribution of D. nodosus. PMID:23748018
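
    The discriminatory power quoted (D > 0.95) is the Hunter-Gaston index, computed directly from the sizes of the observed types; a short sketch with hypothetical MLVA type counts:

```python
def hunter_gaston(type_counts):
    """Hunter-Gaston discriminatory index from a list of type sizes."""
    n = sum(type_counts)
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

# Hypothetical typing result: 77 isolates falling into 19 MLVA types
counts = [9, 8, 7, 6, 6, 5, 5, 4, 4, 4, 3, 3, 3, 2, 2, 2, 2, 1, 1]
print(f"D = {hunter_gaston(counts):.3f}")   # -> 0.943 for these counts
```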

  8. Flight-determined stability analysis of multiple-input-multiple-output control systems

    NASA Technical Reports Server (NTRS)

    Burken, John J.

    1992-01-01

    Singular value analysis can give conservative stability margin results. Applying structure to the uncertainty can reduce this conservatism. This paper presents flight-determined stability margins for the X-29A lateral-directional, multiloop control system. These margins are compared with the predicted unscaled singular values and scaled structured singular values. The algorithm was further evaluated with flight data by changing the roll-rate-to-aileron command-feedback gain by +/- 20 percent. Minimum eigenvalues of the return difference matrix which bound the singular values are also presented. Extracting multiloop singular values from flight data and analyzing the feedback gain variations validates this technique as a measure of robustness. This analysis can be used for near-real-time flight monitoring and safety testing.
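
    The margin computation itself reduces to the minimum singular value of the return difference matrix I + L(jw) over a frequency grid; a small sketch with an invented 2x2 loop transfer matrix:

```python
import numpy as np

def loop_tf(s):
    """Illustrative 2x2 loop transfer matrix L(s) (not the X-29A model)."""
    return np.array([[2.0 / (s + 1.0), 0.3 / (s + 2.0)],
                     [0.1 / (s + 1.5), 1.5 / (s + 0.5)]])

freqs = np.logspace(-2, 2, 400)                      # rad/s grid
min_sv = [np.linalg.svd(np.eye(2) + loop_tf(1j * w), compute_uv=False)[-1]
          for w in freqs]
w_crit = freqs[int(np.argmin(min_sv))]
# A small minimum singular value of I + L means small simultaneous margins
print(f"min sigma(I + L) = {min(min_sv):.3f} at {w_crit:.2f} rad/s")
```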

  9. Combination of multiple model population analysis and mid-infrared technology for the estimation of copper content in Tegillarca granosa

    NASA Astrophysics Data System (ADS)

    Hu, Meng-Han; Chen, Xiao-Jing; Ye, Peng-Chao; Chen, Xi; Shi, Yi-Jian; Zhai, Guang-Tao; Yang, Xiao-Kang

    2016-11-01

    The aim of this study was to use mid-infrared spectroscopy coupled with multiple model population analysis based on Monte Carlo uninformative variable elimination (MC-UVE) for rapidly estimating the copper content of Tegillarca granosa. Copper-specific wavelengths were first extracted from the whole spectra, and a least squares support vector machine was subsequently used to develop the prediction models. Compared with the prediction model based on the full wavelength set, models using 100 MC-UVE-selected wavelengths without and with the bin operation showed comparable performance, with prediction correlation coefficients (Rp) of 0.97 (RMSEP = 14.60 mg/kg) and 0.94 (RMSEP = 20.85 mg/kg) versus 0.96 (RMSEP = 17.27 mg/kg), and ratios of percent deviation (number of wavelengths) of 2.77 (407) and 1.84 (45) versus 2.32 (1762). The obtained results demonstrate that the mid-infrared technique can be used for estimating copper content in T. granosa. In addition, the proposed multiple model population analysis effectively eliminates uninformative, weakly informative and interfering wavelengths, which substantially reduces model complexity and computation time.
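
    The MC-UVE selection step can be sketched generically: refit a linear model on many random subsets and rank each wavelength by the stability (mean over standard deviation) of its coefficient. The sketch uses synthetic data and ordinary least squares, not the authors' spectra or their LS-SVM models.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 80, 40
X = rng.standard_normal((n, p))
y = X[:, 0] * 2.0 + X[:, 1] * 1.0 + rng.standard_normal(n) * 0.5  # 2 informative vars

n_runs, coefs = 500, []
for _ in range(n_runs):
    idx = rng.choice(n, size=int(0.7 * n), replace=False)   # random 70% subset
    b, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    coefs.append(b)
coefs = np.array(coefs)

stability = np.abs(coefs.mean(axis=0)) / coefs.std(axis=0)  # MC-UVE reliability score
top = np.argsort(stability)[::-1][:5]
print("top variables by stability:", top)    # variables 0 and 1 should rank first
```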

  10. Imaging mass spectrometry data reduction: automated feature identification and extraction.

    PubMed

    McDonnell, Liam A; van Remoortere, Alexandra; de Velde, Nico; van Zeijl, René J M; Deelder, André M

    2010-12-01

    Imaging MS now enables the parallel analysis of hundreds of biomolecules, spanning multiple molecular classes, which allows tissues to be described by their molecular content and distribution. When combined with advanced data analysis routines, tissues can be analyzed and classified based solely on their molecular content. Such molecular histology techniques have been used to distinguish regions with differential molecular signatures that could not be distinguished using established histologic tools. However, its potential to provide an independent, complementary analysis of clinical tissues has been limited by the very large file sizes and large number of discrete variables associated with imaging MS experiments. Here we demonstrate data reduction tools, based on automated feature identification and extraction, for peptide, protein, and lipid imaging MS, using multiple imaging MS technologies, that reduce data loads and the number of variables by >100×, and that highlight highly-localized features that can be missed using standard data analysis strategies. It is then demonstrated how these capabilities enable multivariate analysis on large imaging MS datasets spanning multiple tissues. Copyright © 2010 American Society for Mass Spectrometry. Published by Elsevier Inc. All rights reserved.
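
    Feature-based reduction of this kind can be sketched as peak picking on a dataset mean spectrum followed by keeping only each pixel's intensities at the picked peaks; the spectra below are synthetic, with scipy.signal.find_peaks standing in for the paper's feature identification.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(8)
n_pixels, n_channels = 500, 2000
centers = rng.choice(n_channels, 15, replace=False)      # 15 "molecular" features

mz = np.arange(n_channels)
profiles = np.exp(-0.5 * ((mz[None, :] - centers[:, None]) / 3.0) ** 2)
abundances = rng.lognormal(0.0, 1.0, (n_pixels, len(centers)))
cube = abundances @ profiles + rng.normal(0, 0.05, (n_pixels, n_channels))

# Peak-pick the mean spectrum, then reduce each pixel to intensities at the peaks
mean_spec = cube.mean(axis=0)
peaks, _ = find_peaks(mean_spec, prominence=0.5)
reduced = cube[:, peaks]
print(f"reduced {n_channels} channels to {len(peaks)} features "
      f"({n_channels / max(len(peaks), 1):.0f}x smaller)")
```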

  11. Proceedings of the Mobile Satellite System Architectures and Multiple Access Techniques Workshop

    NASA Technical Reports Server (NTRS)

    Dessouky, Khaled

    1989-01-01

    The Mobile Satellite System Architectures and Multiple Access Techniques Workshop served as a forum for the debate of system and network architecture issues. Particular emphasis was on those issues relating to the choice of multiple access technique(s) for the Mobile Satellite Service (MSS). These proceedings contain articles that expand upon the 12 presentations given in the workshop. Contrasting views on Frequency Division Multiple Access (FDMA), Code Division Multiple Access (CDMA), and Time Division Multiple Access (TDMA)-based architectures are presented, and system issues relating to signaling, spacecraft design, and network management constraints are addressed. An overview article that summarizes the issues raised in the numerous discussion periods of the workshop is also included.

  12. Evaluation of multiple frequency bioelectrical impedance and Cole-Cole analysis for the assessment of body water volumes in healthy humans.

    PubMed

    Cornish, B H; Ward, L C; Thomas, B J; Jebb, S A; Elia, M

    1996-03-01

    Objective: To assess the application of a Cole-Cole analysis of multiple-frequency bioelectrical impedance analysis (MFBIA) measurements to predict total body water (TBW) and extracellular water (ECW) in humans. This technique has previously been shown to produce accurate and reliable estimates in both normal and abnormal animals. Methods: The whole-body impedance of 60 healthy humans was measured at 496 frequencies (ranging from 4 kHz to 1 MHz), and the impedance at zero frequency, R0, and at the characteristic frequency, Zc, were determined from the impedance spectrum (Cole-Cole plot). TBW and ECW were independently determined using deuterium and bromide tracer dilution techniques. Setting: The Dunn Clinical Nutrition Centre and the Department of Biochemistry, University of Queensland. Subjects: 60 healthy adult volunteers (27 men and 33 women, aged 18-45 years). Results: The results suggest that the swept-frequency bioimpedance technique estimates total body water (SEE = 5.2%) and extracellular water (SEE = 10%) only slightly better in normal, healthy subjects than a method based on single-frequency bioimpedance or anthropometric estimates based on weight, height and gender. This study has undertaken the most extensive analysis to date of relationships between TBW (and ECW) and individual impedances obtained at different frequencies (> 400 frequencies), and has shown marginal advantages of using one frequency over another, even if values predicted from theoretical bioimpedance models are used in the estimations. However, in situations where there are disturbances of fluid distribution, values predicted from the Cole-Cole analysis of swept-frequency bioimpedance measurements could prove to be more useful.
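
    Extrapolating R0 from a measured spectrum amounts to fitting a circular arc to the complex-plane (Cole-Cole) plot and reading off the real-axis intercepts. A minimal algebraic (Kasa) circle fit on a synthetic depressed arc, with all impedance values invented:

```python
import numpy as np

# Synthetic depressed Cole-Cole arc with R_inf = 300 ohm and R0 = 550 ohm
r_inf, r0, dep = 300.0, 550.0, 0.2          # dep: depression angle (rad)
cx, half = (r_inf + r0) / 2, (r0 - r_inf) / 2
cy, rad = -half * np.tan(dep), half / np.cos(dep)
th = np.linspace(np.arctan2(-cy, half), np.pi - np.arctan2(-cy, half), 25)
R, negX = cx + rad * np.cos(th), cy + rad * np.sin(th)   # measured (R, -X) points

# Kasa circle fit: x^2 + y^2 + D*x + E*y + F = 0 solved by linear least squares
A = np.column_stack([R, negX, np.ones_like(R)])
D, E, F = np.linalg.lstsq(A, -(R**2 + negX**2), rcond=None)[0]
fit_cx, fit_cy = -D / 2, -E / 2
fit_rad = np.sqrt(fit_cx**2 + fit_cy**2 - F)

# Real-axis intercepts: the larger is R0 (zero frequency), the smaller R_infinity
dx = np.sqrt(fit_rad**2 - fit_cy**2)
print(f"R0 ~ {fit_cx + dx:.1f} ohm, R_inf ~ {fit_cx - dx:.1f} ohm")
```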

  13. [Comparison between rapid detection method of enzyme substrate technique and multiple-tube fermentation technique in water coliform bacteria detection].

    PubMed

    Sun, Zong-ke; Wu, Rong; Ding, Pei; Xue, Jin-Rong

    2006-07-01

    To compare the rapid enzyme substrate technique with the multiple-tube fermentation technique for detecting coliform bacteria in water. Inoculated and real water samples were used to compare the equivalence and false-positive rates of the two methods. Results demonstrate that the enzyme substrate technique is equivalent to the multiple-tube fermentation technique (P = 0.059), and the false-positive rates of the two methods show no statistically significant difference. It is suggested that the enzyme substrate technique can be used as a standard method for evaluating the microbiological safety of water.

  14. Design of a transportable high efficiency fast neutron spectrometer

    DOE PAGES

    Roecker, C.; Bernstein, A.; Bowden, N. S.; ...

    2016-04-12

    A transportable fast neutron detection system has been designed and constructed for measuring neutron energy spectra and flux ranging from tens to hundreds of MeV. The transportability of the spectrometer reduces the detector-related systematic bias between different neutron spectra and flux measurements, which allows for the comparison of measurements above or below ground. The spectrometer will measure neutron fluxes that are of prohibitively low intensity compared to the site-specific background rates targeted by other transportable fast neutron detection systems. To measure low-intensity high-energy neutron fluxes, a conventional capture-gating technique is used for measuring neutron energies above 20 MeV and a novel multiplicity technique is used for measuring neutron energies above 100 MeV. The spectrometer is composed of two Gd-containing plastic scintillator detectors arranged around a lead spallation target. To calibrate and characterize the position-dependent response of the spectrometer, a Monte Carlo model was developed and used in conjunction with experimental data from gamma ray sources. Multiplicity event identification algorithms were developed and used with a Cf-252 neutron multiplicity source to validate the Gd concentration and secondary neutron capture efficiency of the Monte Carlo model. The validated Monte Carlo model was used to predict an effective area for the multiplicity and capture-gating analyses. For incident neutron energies between 100 MeV and 1000 MeV with an isotropic angular distribution, the multiplicity analysis predicted an effective area of 500 cm² rising to 5000 cm². For neutron energies above 20 MeV, the capture-gating analysis predicted an effective area between 1800 cm² and 2500 cm². As a result, the multiplicity mode was found to be sensitive to the incident neutron angular distribution.

  15. Transgender Phonosurgery: A Systematic Review and Meta-analysis.

    PubMed

    Song, Tara Elena; Jiang, Nancy

    2017-05-01

    Objectives: Different surgical techniques have been described in the literature to increase vocal pitch. The purpose of this study is to systematically review these surgeries and perform a meta-analysis to determine which technique increases pitch the most. Data Sources: CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct. Review Methods: A systematic review and meta-analysis of the literature was performed using the CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct databases. Studies were eligible for inclusion if they evaluated pitch-elevating phonosurgical techniques in live humans and performed pre- and postoperative acoustic analysis. Data were gathered regarding surgical technique, pre- and postoperative fundamental frequencies, perioperative care measures, and complications. Results: Twenty-nine studies were identified. After applying inclusion and exclusion criteria, a total of 13 studies were included in the meta-analysis. Mechanisms of pitch elevation included increasing vocal cord tension (cricothyroid approximation), shortening the vocal cord length (cold knife glottoplasty, laser-shortening glottoplasty), and decreasing mass (laser reduction glottoplasty). The most common interventions were shortening techniques and cricothyroid approximation (6 studies each). The largest increase in fundamental frequency was seen with techniques that shortened the vocal cords. Preoperative speech therapy, postoperative voice rest, and reporting of patient satisfaction were inconsistent. Many of the studies were limited by low power and short length of follow-up. Conclusions: Multiple techniques for elevation of vocal pitch exist, but vocal cord shortening procedures appear to result in the largest increase in fundamental frequency.

  16. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and they are highly dependent on comprehensive analysis of the chemical components of the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of the analysis of medicinal plants. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications in the comprehensive analysis of medicinal plants. The application of various MS and hyphenated techniques for the analysis of medicinal plants, including but not limited to one-dimensional and multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, is reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a major role as the platform for further research aimed at understanding both their empirical therapeutic efficacy and their quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  17. Common aero vehicle autonomous reentry trajectory optimization satisfying waypoint and no-fly zone constraints

    NASA Astrophysics Data System (ADS)

    Jorris, Timothy R.

    2007-12-01

    To support the Air Force's Global Reach concept, a Common Aero Vehicle is being designed to support the Global Strike mission. "Waypoints" are specified for reconnaissance or multiple payload deployments, and "no-fly zones" are specified for geopolitical restrictions or threat avoidance. Because of time-critical targets and multiple-scenario analysis, an autonomous solution is preferred over a time-intensive, manually iterative one. Thus, a real-time or near-real-time autonomous trajectory optimization technique is presented to minimize the flight time, satisfy terminal and intermediate constraints, and remain within the specified vehicle heating and control limitations. This research uses the Hypersonic Cruise Vehicle (HCV) as a simplified two-dimensional platform to compare multiple solution techniques: a unique geometric approach developed herein, a derived analytical dynamic optimization technique, and a rapidly emerging collocation numerical approach. The numerical technique is a direct solution method involving discretization followed by dualization, with pseudospectral methods and nonlinear programming used to converge to the optimal solution. This numerical approach is applied to the Common Aero Vehicle (CAV) as the test platform for the full three-dimensional reentry trajectory optimization problem. The culmination of this research is the verification of the optimality of the proposed numerical technique, as shown for both the two-dimensional and three-dimensional models. Additionally, user implementation strategies are presented to improve accuracy and enhance solution convergence. The contributions of this research are thus the geometric approach, the user implementation strategies, and the determination and verification of a numerical solution technique for the optimal reentry trajectory problem that minimizes time to target while satisfying vehicle dynamics, control limitations, and heating, waypoint, and no-fly zone constraints.

  1. Using foreground/background analysis to determine leaf and canopy chemistry

    NASA Technical Reports Server (NTRS)

    Pinzon, J. E.; Ustin, S. L.; Hart, Q. J.; Jacquemoud, S.; Smith, M. O.

    1995-01-01

    Spectral Mixture Analysis (SMA) has become a well-established procedure for analyzing imaging spectrometry data; however, the technique is relatively insensitive to minor sources of spectral variation (e.g., those discriminating stressed from unstressed vegetation, or variations in canopy chemistry). Other statistical approaches have been tried, e.g., stepwise multiple linear regression (SMLR) analysis to predict canopy chemistry. Grossman et al. reported that SMLR is sensitive to measurement error and that the prediction of minor chemical components is not independent of patterns observed in more dominant spectral components like water. Further, they observed that the relationships were strongly dependent on the mode of expressing reflectance (R, -log R) and whether chemistry was expressed on a weight (g/g) or area (g/sq m) basis. Thus, alternative multivariate techniques need to be examined. Smith et al. reported a revised SMA that they termed Foreground/Background Analysis (FBA), which permits directing the analysis along any axis of variance by identifying vectors through the n-dimensional spectral volume orthonormal to each other. Here, we report an application of the FBA technique for the detection of canopy chemistry using a modified form of the analysis.
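
    For context, the underlying SMA step solves each pixel spectrum as a nonnegative mixture of endmember spectra; FBA then directs the analysis along chosen foreground/background vectors. The sketch below shows only the standard unmixing step, with made-up endmember spectra:

```python
import numpy as np
from scipy.optimize import nnls

# Made-up endmember spectra (rows: green vegetation, dry soil, shade) over 6 bands
endmembers = np.array([[0.05, 0.08, 0.04, 0.45, 0.50, 0.30],
                       [0.10, 0.15, 0.20, 0.25, 0.30, 0.35],
                       [0.02, 0.02, 0.02, 0.03, 0.03, 0.03]])

true_fracs = np.array([0.6, 0.3, 0.1])
pixel = true_fracs @ endmembers + 0.005 * np.random.default_rng(0).standard_normal(6)

# SMA: solve pixel = E^T * fractions with nonnegativity, then renormalize to sum 1
fracs, resid = nnls(endmembers.T, pixel)
fracs /= fracs.sum()
print("estimated fractions:", np.round(fracs, 2))   # ~ [0.6, 0.3, 0.1]
```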

  2. Deriving amplitude equations for weakly-nonlinear oscillators and their generalizations

    NASA Astrophysics Data System (ADS)

    O'Malley, Robert E., Jr.; Williams, David B.

    2006-06-01

    Results by physicists on renormalization group techniques have recently sparked interest in the singular perturbations community of applied mathematicians. The survey paper, [Phys. Rev. E 54(1) (1996) 376-394], by Chen et al. demonstrated that many problems which applied mathematicians solve using disparate methods can be solved using a single approach. Analysis of that renormalization group method by Mudavanhu and O'Malley [Stud. Appl. Math. 107(1) (2001) 63-79; SIAM J. Appl. Math. 63(2) (2002) 373-397], among others, indicates that the technique can be streamlined. This paper carries that analysis several steps further to present an amplitude equation technique which is both well adapted for use with a computer algebra system and easy to relate to the classical methods of averaging and multiple scales.
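
    As a worked instance of the amplitude equations in question, the classical weakly nonlinear result for the van der Pol oscillator, recoverable by the renormalization group method, averaging, or multiple scales alike:

```latex
% Weakly nonlinear van der Pol oscillator:
%   \ddot{x} + x = \epsilon (1 - x^2)\dot{x}, with 0 < \epsilon \ll 1.
% Writing x(t) \approx A(T)\cos(t + \phi(T)) on the slow time T = \epsilon t,
% multiple scales, averaging, and the RG method all yield
\begin{align}
  \frac{dA}{dT} &= \frac{A}{2}\left(1 - \frac{A^{2}}{4}\right), &
  \frac{d\phi}{dT} &= 0,
\end{align}
% so every nonzero initial amplitude relaxes to the limit cycle A = 2.
```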

  3. A novel universal real-time PCR system using the attached universal duplex probes for quantitative analysis of nucleic acids.

    PubMed

    Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing

    2008-06-04

    Real-time PCR techniques are being widely used for nucleic acid analysis, but one limitation of current frequently employed real-time PCR is the high cost of the labeled probe for each target molecule. We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) lies in close proximity to a complementary quenching probe (QP), so that fluorescence can be quenched. The PCR primer pair with the attached universal template (UT) and the FP are identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP technique has been successfully applied to nucleic acid analysis, and the developed AUDP real-time PCR technique offers an alternative for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost.

  4. Waveforms and Sonic Boom Perception and Response (WSPR): Low-Boom Community Response Program Pilot Test Design, Execution, and Analysis

    NASA Technical Reports Server (NTRS)

    Page, Juliet A.; Hodgdon, Kathleen K.; Krecker, Peg; Cowart, Robbie; Hobbs, Chris; Wilmer, Clif; Koening, Carrie; Holmes, Theresa; Gaugler, Trent; Shumway, Durland L.; hide

    2014-01-01

    The Waveforms and Sonic boom Perception and Response (WSPR) Program was designed to test and demonstrate the applicability and effectiveness of techniques to gather data relating human subjective response to multiple low-amplitude sonic booms. It was in essence a practice session for future wider-scale testing on naive communities, using a purpose-built low-boom demonstrator aircraft. The low-boom community response pilot experiment was conducted in California in November 2011. The WSPR team acquired sufficient data to assess and evaluate the effectiveness of the various physical and psychological data-gathering techniques and analysis methods.

  5. Using Multilevel Modeling in Language Assessment Research: A Conceptual Introduction

    ERIC Educational Resources Information Center

    Barkaoui, Khaled

    2013-01-01

    This article critiques traditional single-level statistical approaches (e.g., multiple regression analysis) to examining relationships between language test scores and variables in the assessment setting. It highlights the conceptual, methodological, and statistical problems associated with these techniques in dealing with multilevel or nested…

  6. Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.

    PubMed

    Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X

    2015-01-01

    Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.
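
    The comparison protocol lends itself to a short sketch: under an assumed wall-clock budget, each classifier keeps refitting on a growing learning set, and accuracy is then scored on held-out data. The dataset, budget, chunk size, and classifiers below are illustrative, not those of the study.

    ```python
    import time
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import SGDClassifier
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    BUDGET_S = 2.0   # assumed application learning-time limit (seconds)
    CHUNK = 1000     # samples added per refit

    for clf in (SGDClassifier(random_state=0), RandomForestClassifier(random_state=0)):
        start, n = time.perf_counter(), CHUNK
        while n <= len(X_tr):
            clf.fit(X_tr[:n], y_tr[:n])      # refit on a growing learning set
            n += CHUNK
            if time.perf_counter() - start > BUDGET_S:
                break                         # budget expired
        print(type(clf).__name__, "samples used:", n - CHUNK,
              "test accuracy:", round(clf.score(X_te, y_te), 3))
    ```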

  7. Groundwater-level prediction using multiple linear regression and artificial neural network techniques: a comparative assessment

    NASA Astrophysics Data System (ADS)

    Sahoo, Sasmita; Jha, Madan K.

    2013-12-01

    The potential of multiple linear regression (MLR) and artificial neural network (ANN) techniques in predicting transient water levels over a groundwater basin was compared. MLR and ANN modeling was carried out at 17 sites in Japan, considering all significant inputs: rainfall, ambient temperature, river stage, 11 seasonal dummy variables, and influential lags of rainfall, ambient temperature, river stage and groundwater level. Seventeen site-specific ANN models were developed, using multi-layer feed-forward neural networks trained with Levenberg-Marquardt backpropagation algorithms. The performance of the models was evaluated using statistical and graphical indicators. Comparison of the goodness-of-fit statistics of the MLR models with those of the ANN models indicated that there is better agreement between the ANN-predicted groundwater levels and the observed groundwater levels at all the sites, compared to the MLR. This finding was supported by the graphical indicators and the residual analysis. Thus, it is concluded that the ANN technique is superior to the MLR technique in predicting spatio-temporal distribution of groundwater levels in a basin. However, considering the practical advantages of the MLR technique, it is recommended as an alternative and cost-effective groundwater modeling tool.
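
    A compact sketch of this kind of MLR-versus-ANN comparison follows, on synthetic data with a lagged groundwater-level input. Note that scikit-learn's MLPRegressor trains with adam or L-BFGS rather than the Levenberg-Marquardt algorithm used in the study; all inputs and dynamics are made up for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.neural_network import MLPRegressor
    from sklearn.metrics import r2_score

    # Synthetic basin: groundwater level (gwl) driven by rainfall, temperature,
    # river stage, and its own one-step lag (persistence).
    rng = np.random.default_rng(1)
    n = 500
    rain, temp, stage = rng.gamma(2, 5, n), 15 + 10 * rng.random(n), rng.random(n)
    gwl = np.zeros(n)
    for t in range(1, n):
        gwl[t] = 0.8 * gwl[t-1] + 0.05 * rain[t] - 0.02 * temp[t] + stage[t] \
                 + rng.normal(0, 0.1)

    X = np.column_stack([rain[1:], temp[1:], stage[1:], gwl[:-1]])
    y = gwl[1:]
    X_tr, X_te, y_tr, y_te = X[:400], X[400:], y[:400], y[400:]

    mlr = LinearRegression().fit(X_tr, y_tr)
    ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                       random_state=0).fit(X_tr, y_tr)
    print("MLR R2:", round(r2_score(y_te, mlr.predict(X_te)), 3))
    print("ANN R2:", round(r2_score(y_te, ann.predict(X_te)), 3))
    ```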

  8. Use of Empirical Estimates of Shrinkage in Multiple Regression: A Caution.

    ERIC Educational Resources Information Center

    Kromrey, Jeffrey D.; Hines, Constance V.

    1995-01-01

    The accuracy of four empirical techniques to estimate shrinkage in multiple regression was studied through Monte Carlo simulation. None of the techniques provided unbiased estimates of the population squared multiple correlation coefficient, but the normalized jackknife and bootstrap techniques demonstrated marginally acceptable performance with…
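
    One common empirical shrinkage estimate, the optimism bootstrap, can be sketched as below (a generic illustration, not necessarily one of the four techniques studied): refit the regression on bootstrap resamples, score each fit on the full original sample, and subtract the average optimism from the apparent R-squared.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)
    n, p = 60, 5
    X = rng.normal(size=(n, p))
    y = X @ rng.normal(size=p) + rng.normal(size=n)

    model = LinearRegression().fit(X, y)
    apparent_r2 = r2_score(y, model.predict(X))   # optimistic in-sample fit

    optimism = []
    for _ in range(500):
        idx = rng.integers(0, n, n)               # bootstrap resample
        m = LinearRegression().fit(X[idx], y[idx])
        optimism.append(r2_score(y[idx], m.predict(X[idx]))
                        - r2_score(y, m.predict(X)))   # train minus original
    shrunken_r2 = apparent_r2 - np.mean(optimism)
    print("apparent R2:", round(apparent_r2, 3),
          "shrunken R2:", round(shrunken_r2, 3))
    ```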

  9. A review of advantages of high-efficiency X-ray spectrum imaging for analysis of nanostructured ferritic alloys

    DOE PAGES

    Parish, Chad M.; Miller, Michael K.

    2014-12-09

    Nanostructured ferritic alloys (NFAs) exhibit complex microstructures consisting of 100-500 nm ferrite grains, grain boundary solute enrichment, and multiple populations of precipitates and nanoclusters (NCs). Understanding these materials' excellent creep and radiation-tolerance properties requires a combination of multiple atomic-scale experimental techniques. Recent advances in scanning transmission electron microscopy (STEM) hardware and data analysis methods have the potential to revolutionize nanometer- to micrometer-scale materials analysis. These methods are applied to NFAs as a test case and are compared to both conventional STEM methods and complementary methods such as scanning electron microscopy and atom probe tomography. In this paper, we review past results and present new results illustrating the effectiveness of latest-generation STEM instrumentation and data analysis.

  10. Incipient fault detection study for advanced spacecraft systems

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Black, Michael C.; Hovenga, J. Mike; Mcclure, Paul F.

    1986-01-01

    A feasibility study to investigate the application of vibration monitoring to the rotating machinery of planned NASA advanced spacecraft components is described. Factors investigated include: (1) special problems associated with small, high-RPM machines; (2) application across multiple component types; (3) microgravity; (4) multiple fault types; (5) comparison of eight different analysis techniques, including signature analysis, high-frequency demodulation, cepstrum, clustering, amplitude analysis, and pattern recognition; and (6) small-sample statistical analysis to compare performance by computation of probability of detection and false alarm for an ensemble of repeated baseline and faulted tests. Both detection and classification performance are quantified. Vibration monitoring is shown to be an effective means of detecting the most important problem types for small, high-RPM fans and pumps typical of those planned for the advanced spacecraft. A preliminary monitoring system design and implementation plan is presented.

  11. Practical aspects of a maximum likelihood estimation method to extract stability and control derivatives from flight data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1976-01-01

    A maximum likelihood estimation method was applied to flight data, and procedures to facilitate the routine analysis of large amounts of flight data are described. Techniques that can be used to obtain stability and control derivatives from aircraft maneuvers that are less than ideal for this purpose are described. The techniques involve detecting and correcting the effects of dependent or nearly dependent variables, structural vibration, data drift, inadequate instrumentation, and difficulties with the data acquisition system and the mathematical model. The use of uncertainty levels and multiple-maneuver analysis also proved to be useful in improving the quality of the estimated coefficients. The procedures used for editing the data and for overall analysis are also discussed.

  12. Correlating Detergent Fiber Analysis and Dietary Fiber Analysis Data for Corn Stover

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfrum, E. J.; Lorenz, A. J.; deLeon, N.

    There exist large amounts of detergent fiber analysis data [neutral detergent fiber (NDF), acid detergent fiber (ADF), acid detergent lignin (ADL)] for many different potential cellulosic ethanol feedstocks, since these techniques are widely used for the analysis of forages. Researchers working in the area of cellulosic ethanol are interested in the structural carbohydrates in a feedstock (principally glucan and xylan), which are typically determined by acid hydrolysis of the structural fraction after multiple extractions of the biomass. These so-called dietary fiber analysis methods are significantly more involved than detergent fiber analysis methods. The purpose of this study was to determine whether it is feasible to correlate detergent fiber analysis values to glucan and xylan content determined by dietary fiber analysis methods for corn stover. In the detergent fiber analysis literature cellulose is often estimated as the difference between ADF and ADL, while hemicellulose is often estimated as the difference between NDF and ADF. Examination of a corn stover dataset containing both detergent fiber analysis data and dietary fiber analysis data predicted using near infrared spectroscopy shows that correlations between structural glucan measured using dietary fiber techniques and cellulose estimated using detergent techniques, and between structural xylan measured using dietary fiber techniques and hemicellulose estimated using detergent techniques are high, but are driven largely by the underlying correlation between total extractives measured by fiber analysis and NDF/ADF. That is, detergent analysis data is correlated to dietary fiber analysis data for structural carbohydrates, but only indirectly; the main correlation is between detergent analysis data and solvent extraction data produced during the dietary fiber analysis procedure.

  13. Mobile multiple access study

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Multiple access techniques (FDMA, CDMA, TDMA) for the mobile user are discussed, and an attempt is made to identify the current best technique. Traffic loading is considered, as well as voice and data modulation and spacecraft and system design. Emphasis is placed on developing mobile terminal cost estimates for the selected design. In addition, design examples are presented for the alternative multiple access techniques in order to compare them with the selected technique.

  14. Multiple Removal of Spent Rocket Upper Stages with an Ion Beam Shepherd

    NASA Astrophysics Data System (ADS)

    Bombardelli, C.; Herrera-Montojo, J.; Gonzalo, J. L.

    2013-08-01

    Among the many advantages of the recently proposed ion beam shepherd (IBS) debris removal technique is the capability to deal with multiple targets in a single mission. A preliminary analysis is here conducted in order to estimate the cost in terms of spacecraft mass and total mission time to remove multiple large-size upper stages of the Zenit family. Zenit-2 upper stages are clustered at 71 degrees inclination around 850 km altitude in low Earth orbit. It is found that a removal of two targets per year is feasible with a modest size spacecraft. The most favorable combinations of targets are outlined.

  15. Assessing the use of multiple sources in student essays.

    PubMed

    Hastings, Peter; Hughes, Simon; Magliano, Joseph P; Goldman, Susan R; Lawless, Kimberly

    2012-09-01

    The present study explored different approaches for automatically scoring student essays that were written on the basis of multiple texts. Specifically, these approaches were developed to classify whether or not important elements of the texts were present in the essays. The first was a simple pattern-matching approach called "multi-word" that allowed for flexible matching of words and phrases in the sentences. The second technique was latent semantic analysis (LSA), which was used to compare student sentences to original source sentences using its high-dimensional vector-based representation. Finally, the third was a machine-learning technique, support vector machines, which learned a classification scheme from the corpus. The results of the study suggested that the LSA-based system was superior for detecting the presence of explicit content from the texts, but the multi-word pattern-matching approach was better for detecting inferences outside or across texts. These results suggest that the best approach for analyzing essays of this nature should draw upon multiple natural language processing approaches.
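
    A minimal LSA-style sketch of the second approach follows, assuming a toy corpus; real systems train the semantic space on a much larger corpus and tune a match threshold against human ratings.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    # Project sentences into a low-rank "semantic" space and compare by cosine.
    corpus = [
        "The drought reduced crop yields across the region.",       # source idea
        "Farmers lost much of their harvest to the dry spell.",     # essay sentence
        "The festival attracted visitors from neighboring towns.",  # unrelated
    ]
    tfidf = TfidfVectorizer().fit_transform(corpus)
    lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

    # Similarity of each essay sentence to the source idea; a threshold would
    # decide whether the idea counts as "present" in the essay.
    sims = cosine_similarity(lsa[:1], lsa[1:])
    print("similarity to source idea:", sims.round(2))
    ```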

  16. Advanced statistics: linear regression, part I: simple linear regression.

    PubMed

    Marill, Keith A

    2004-01-01

    Simple linear regression is a mathematical technique used to model the relationship between a single independent predictor variable and a single dependent outcome variable. In this, the first of a two-part series exploring concepts in linear regression analysis, the four fundamental assumptions and the mechanics of simple linear regression are reviewed. The most common technique used to derive the regression line, the method of least squares, is described. The reader will be acquainted with other important concepts in simple linear regression, including: variable transformations, dummy variables, relationship to inference testing, and leverage. Simplified clinical examples with small datasets and graphic models are used to illustrate the points. This will provide a foundation for the second article in this series: a discussion of multiple linear regression, in which there are multiple predictor variables.
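
    For reference, the least-squares estimates for the simple linear regression model are the standard closed-form expressions below.

    ```latex
    % Method of least squares for y_i = \beta_0 + \beta_1 x_i + \varepsilon_i:
    \hat{\beta}_1 = \frac{\sum_{i=1}^{n}(x_i-\bar{x})(y_i-\bar{y})}
                         {\sum_{i=1}^{n}(x_i-\bar{x})^{2}},
    \qquad
    \hat{\beta}_0 = \bar{y} - \hat{\beta}_1\,\bar{x}.
    ```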

  17. Application of stepwise multiple regression techniques to inversion of Nimbus 'IRIS' observations.

    NASA Technical Reports Server (NTRS)

    Ohring, G.

    1972-01-01

    Exploratory studies with Nimbus-3 infrared interferometer-spectrometer (IRIS) data indicate that, in addition to temperature, such meteorological parameters as geopotential heights of pressure surfaces, tropopause pressure, and tropopause temperature can be inferred from the observed spectra with the use of simple regression equations. The technique of screening the IRIS spectral data by means of stepwise regression to obtain the best radiation predictors of meteorological parameters is validated. The simplicity of application of the technique and the simplicity of the derived linear regression equations - which contain only a few terms - suggest usefulness for this approach. Based upon the results obtained, suggestions are made for further development and exploitation of the stepwise regression analysis technique.
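
    A minimal forward stepwise selection loop of the kind described can be sketched as follows, with synthetic "spectral channels" standing in for IRIS radiances; all names, data, and the stopping threshold are illustrative.

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(2)
    n, n_channels = 200, 30
    radiances = rng.normal(size=(n, n_channels))
    # Hypothetical target: a meteorological parameter driven by two channels.
    tropopause_temp = 2 * radiances[:, 3] - radiances[:, 17] + rng.normal(0, 0.5, n)

    selected, remaining, best_r2 = [], list(range(n_channels)), 0.0
    while remaining:
        scores = []
        for j in remaining:   # try adding each remaining channel as a predictor
            cols = selected + [j]
            m = LinearRegression().fit(radiances[:, cols], tropopause_temp)
            scores.append((r2_score(tropopause_temp,
                                    m.predict(radiances[:, cols])), j))
        r2, j = max(scores)
        if r2 - best_r2 < 0.01:   # stop when the improvement is negligible
            break
        selected.append(j); remaining.remove(j); best_r2 = r2
    print("selected channels:", selected, "R2:", round(best_r2, 3))
    ```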

  18. Application of Linear Mixed-Effects Models in Human Neuroscience Research: A Comparison with Pearson Correlation in Two Auditory Electrophysiology Studies.

    PubMed

    Koerner, Tess K; Zhang, Yang

    2017-02-27

    Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining the strength of associations between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures, as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages as well as the necessity to apply mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers.
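
    A brief sketch of such an LME model with a random intercept per subject, using statsmodels on synthetic data (all variable names and effect sizes are made up):

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(3)
    subjects = np.repeat(np.arange(20), 4)           # 20 subjects x 4 conditions
    condition = np.tile([0, 1, 2, 3], 20)
    subj_baseline = rng.normal(0, 1, 20)[subjects]   # between-subject differences
    neural = rng.normal(size=80)
    behavior = 0.5 * neural - 0.3 * condition + subj_baseline \
               + rng.normal(0, 0.5, 80)

    df = pd.DataFrame(dict(subject=subjects, condition=condition,
                           neural=neural, behavior=behavior))
    # Fixed effects for the neural measure and listening condition;
    # random intercept for each subject handles repeated measures.
    model = smf.mixedlm("behavior ~ neural + C(condition)", df,
                        groups=df["subject"]).fit()
    print(model.summary())
    ```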

  19. Performance Analysis of Triple Asymmetrical Optical Micro Ring Resonator with 2 × 2 Input-Output Bus Waveguide

    NASA Astrophysics Data System (ADS)

    Ranjan, Suman; Mandal, Sanjoy

    2017-12-01

    Modeling of triple asymmetrical optical micro ring resonator (TAOMRR) in z-domain with 2 × 2 input-output system with detailed design of its waveguide configuration using finite-difference time-domain (FDTD) method is presented. Transfer function in z-domain using delay-line signal processing technique of the proposed TAOMRR is determined for different input and output ports. The frequency response analysis is carried out using MATLAB software. Group delay and dispersion characteristics are also determined in MATLAB. The electric field analysis is done using FDTD. The paper proposes a new methodology for designing and drawing multiple configurations of coupled ring resonators having multiple input and output ports. Various important parameters such as coupling coefficients and FSR are also determined.

  1. Probing of multiple magnetic responses in magnetic inductors using atomic force microscopy.

    PubMed

    Park, Seongjae; Seo, Hosung; Seol, Daehee; Yoon, Young-Hwan; Kim, Mi Yang; Kim, Yunseok

    2016-02-08

    Even though nanoscale analysis of magnetic properties is of significant interest, probing methods remain relatively underdeveloped given the significance of the technique, which has multiple potential applications. Here, we demonstrate an approach for probing various magnetic properties associated with eddy current, coil current and magnetic domains in magnetic inductors (MIs) using multidimensional magnetic force microscopy (MMFM). The MMFM images provide combined magnetic responses from the three different origins; however, each contribution to the MMFM response can be differentiated through analysis based on the bias dependence of the response. In particular, the bias-dependent MMFM images show locally different eddy current behavior, with values dependent on the type of materials that comprise the MI. This approach for probing magnetic responses can be further extended to the analysis of local physical features.

  2. Assessing the validity of discourse analysis: transdisciplinary convergence

    NASA Astrophysics Data System (ADS)

    Jaipal-Jamani, Kamini

    2014-12-01

    Research studies using discourse analysis approaches make claims about phenomena or issues based on interpretation of written or spoken text, which includes images and gestures. How are findings/interpretations from discourse analysis validated? This paper proposes transdisciplinary convergence as a way to validate discourse analysis approaches to research. The argument is made that discourse analysis explicitly grounded in semiotics, systemic functional linguistics, and critical theory, offers a credible research methodology. The underlying assumptions, constructs, and techniques of analysis of these three theoretical disciplines can be drawn on to show convergence of data at multiple levels, validating interpretations from text analysis.

  3. Analysis of Environmental Data and Landscape Characterization on Multiple WetlandTypes Using Water Level Loggers and GIS Techniques in Tampa, FL

    EPA Science Inventory

    To better characterize the relationships between both adjacent hydrology/ precipitation and nutrient processing with groundwater level fluctuations, continuous water level data are being collected across three dominant wetland types, each with varied landscape characteristics. Th...

  4. Applied Missing Data Analysis. Methodology in the Social Sciences Series

    ERIC Educational Resources Information Center

    Enders, Craig K.

    2010-01-01

    Walking readers step by step through complex concepts, this book translates missing data techniques into something that applied researchers and graduate students can understand and utilize in their own research. Enders explains the rationale and procedural details for maximum likelihood estimation, Bayesian estimation, multiple imputation, and…

  5. Break-even Analysis: Tool for Budget Planning

    ERIC Educational Resources Information Center

    Lohmann, Roger A.

    1976-01-01

    Multiple funding creates special management problems for the administrator of a human service agency. This article presents a useful analytic technique adapted from business practice that can help the administrator draw up and balance a unified budget. Such a budget also affords a reliable overview of the agency's financial status. (Author)

  6. Topographic Brain Mapping: A Window on Brain Function?

    ERIC Educational Resources Information Center

    Karniski, Walt M.

    1989-01-01

    The article reviews the method of topographic mapping of the brain's electrical activity. Multiple electroencephalogram (EEG) electrodes and computerized analysis of the EEG signal are used to generate maps of frequency and voltage (evoked potential). This relatively new technique holds promise in the evaluation of children with behavioral and…

  7. Factors Influencing the Academic Achievement of First-Generation College Students

    ERIC Educational Resources Information Center

    Strayhorn, Terrell L.

    2006-01-01

    First-generation college students face a number of unique challenges in college. These obstacles may have a disparate effect on educational outcomes such as academic achievement. This study presents findings from an analysis of the Baccalaureate & Beyond Longitudinal Study using hierarchical multiple regression techniques to measure the influence…

  8. Child Mortality in a Developing Country: A Statistical Analysis

    ERIC Educational Resources Information Center

    Uddin, Md. Jamal; Hossain, Md. Zakir; Ullah, Mohammad Ohid

    2009-01-01

    This study uses data from the "Bangladesh Demographic and Health Survey (BDHS) 1999-2000" to investigate the predictors of child (age 1-4 years) mortality in a developing country like Bangladesh. The cross-tabulation and multiple logistic regression techniques have been used to estimate the predictors of child mortality. The…

  9. Weighting and Aggregation in Composite Indicator Construction: A Multiplicative Optimization Approach

    ERIC Educational Resources Information Center

    Zhou, P.; Ang, B. W.; Zhou, D. Q.

    2010-01-01

    Composite indicators (CIs) have increasingly been accepted as a useful tool for benchmarking, performance comparisons, policy analysis and public communication in many different fields. Several recent studies show that as a data aggregation technique in CI construction the weighted product (WP) method has some desirable properties. However, a…

  10. Single-molecule dilution and multiple displacement amplification for molecular haplotyping.

    PubMed

    Paul, Philip; Apgar, Josh

    2005-04-01

    Separate haploid analysis is frequently required for heterozygous genotyping to resolve phase ambiguity or confirm allelic sequence. We demonstrate a technique of single-molecule dilution followed by multiple strand displacement amplification to haplotype polymorphic alleles. Dilution of DNA to haploid equivalency, or a single molecule, is a simple method for separating di-allelic DNA. Strand displacement amplification is a robust method for non-specific DNA expansion that employs random hexamers and phage polymerase Phi29 for double-stranded DNA displacement and primer extension, resulting in high processivity and exceptional product length. Single-molecule dilution was followed by strand displacement amplification to expand separated alleles to microgram quantities of DNA for more efficient haplotype analysis of heterozygous genes.

  11. Comprehensive Method for Culturing Embryonic Dorsal Root Ganglion Neurons for Seahorse Extracellular Flux XF24 Analysis

    PubMed Central

    Lange, Miranda; Zeng, Yan; Knight, Andrew; Windebank, Anthony; Trushina, Eugenia

    2012-01-01

    Changes in mitochondrial dynamics and function contribute to progression of multiple neurodegenerative diseases including peripheral neuropathies. The Seahorse Extracellular Flux XF24 analyzer provides a comprehensive assessment of the relative state of glycolytic and aerobic metabolism in live cells making this method instrumental in assessing mitochondrial function. One of the most important steps in the analysis of mitochondrial respiration using the Seahorse XF24 analyzer is plating a uniform monolayer of firmly attached cells. However, culturing of primary dorsal root ganglion (DRG) neurons is associated with multiple challenges, including their propensity to form clumps and detach from the culture plate. This could significantly interfere with proper analysis and interpretation of data. We have tested multiple cell culture parameters including coating substrates, culture medium, XF24 microplate plastics, and plating techniques in order to optimize plating conditions. Here we describe a highly reproducible method to obtain neuron-enriched monolayers of securely attached dissociated primary embryonic (E15) rat DRG neurons suitable for analysis with the Seahorse XF24 platform. PMID:23248613

  13. Metric Selection for Evaluation of Human Supervisory Control Systems

    DTIC Science & Technology

    2009-12-01

    finding a significant effect when there is none becomes more likely. The inflation of type I error due to multiple dependent variables can be handled...with multivariate analysis techniques, such as Multivariate Analysis of Variance (MANOVA) (Johnson & Wichern, 2002). However, it should be noted that...the few significant differences among many insignificant ones. The best way to avoid failure to identify significant differences is to design an

  14. A perturbation analysis of a mechanical model for stable spatial patterning in embryology

    NASA Astrophysics Data System (ADS)

    Bentil, D. E.; Murray, J. D.

    1992-12-01

    We investigate a mechanical cell-traction mechanism that generates stationary spatial patterns. A linear analysis highlights the model's potential for these heterogeneous solutions. We use multiple-scale perturbation techniques to study the evolution of these solutions and compare our solutions with numerical simulations of the model system. We discuss some potential biological applications among which are the formation of ridge patterns, dermatoglyphs, and wound healing.

  15. An experimental study addressing the use of geoforensic analysis for the exploitation of improvised explosive devices (IEDs).

    PubMed

    Wilks, Beth; Morgan, Ruth M; Rose, Neil L

    2017-09-01

    The use of geoforensic analysis in criminal investigations is continuing to develop, with the diversification of analytical techniques, many of which are semi-automated, facilitating prompt analysis of large sample sets at a relatively low cost. Whilst micro-scale geoforensic analysis has been shown to assist criminal investigations including homicide (Concheri et al., 2011 [1]), wildlife crime (Morgan et al., 2006 [2]), illicit drug distribution (Stanley, 1992 [3]), and burglary (Mildenhall, 2006 [4]), its application to the pressing international security threat posed by Improvised Explosive Devices (IEDs) is yet to be considered. This experimental study simulated an IED supply chain from the sourcing of raw materials through to device emplacement. Mineralogy, quartz grain surface texture analysis (QGSTA) and particle size analysis (PSA) were used to assess whether environmental materials were transferred and subsequently persisted on the different components of three pressure plate IEDs. The research also addressed whether these samples were comprised of material from single or multiple geographical provenances that represented supply chain activity nodes. The simulation demonstrated that material derived from multiple activity nodes was transferred and persisted on device components. The results from the mineralogy and QGSTA illustrated the value these techniques offer for the analysis of mixed provenance samples. The results from the PSA, which produces a bulk signature of the sample, failed to distinguish multiple provenances. The study also considered how the environmental material recovered could be used to generate information regarding the geographical locations the device had been in contact with, in an intelligence style investigation, and demonstrated that geoforensic analysis has the potential to be of value to international counter-IED efforts. It is a tool that may be used to prevent the distribution of large quantities of devices, by aiding the identification of the geographical location of key activity nodes. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Biomolecular signatures of diabetic wound healing by structural mass spectrometry

    PubMed Central

    Hines, Kelly M.; Ashfaq, Samir; Davidson, Jeffrey M.; Opalenik, Susan R.; Wikswo, John P.; McLean, John A.

    2013-01-01

    Wound fluid is a complex biological sample containing byproducts associated with the wound repair process. Contemporary techniques, such as immunoblotting and enzyme immunoassays, require extensive sample manipulation and do not permit the simultaneous analysis of multiple classes of biomolecular species. Structural mass spectrometry, implemented as ion mobility-mass spectrometry (IM-MS), comprises two sequential, gas-phase dispersion techniques well suited for the study of complex biological samples due to its ability to separate and simultaneously analyze multiple classes of biomolecules. As a model of diabetic wound healing, polyvinyl alcohol (PVA) sponges were inserted subcutaneously into non-diabetic (control) and streptozotocin-induced diabetic rats to elicit a granulation tissue response and to collect acute wound fluid. Sponges were harvested at days 2 or 5 to capture different stages of the early wound healing process. Utilizing IM-MS, statistical analysis, and targeted ultra-performance liquid chromatography (UPLC) analysis, biomolecular signatures of diabetic wound healing have been identified. The protein S100-A8 was highly enriched in the wound fluids collected from day 2 diabetic rats. Lysophosphatidylcholine (20:4) and cholic acid also contributed significantly to the differences between diabetic and control groups. This report provides a generalized workflow for wound fluid analysis demonstrated with a diabetic rat model. PMID:23452326

  17. Non-destructive analysis of sensory traits of dry-cured loins by MRI-computer vision techniques and data mining.

    PubMed

    Caballero, Daniel; Antequera, Teresa; Caro, Andrés; Ávila, María Del Mar; G Rodríguez, Pablo; Perez-Palacios, Trinidad

    2017-07-01

    Magnetic resonance imaging (MRI) combined with computer vision techniques has been proposed as an alternative or complementary technique to determine the quality parameters of food in a non-destructive way. The aim of this work was to analyze the sensory attributes of dry-cured loins using this technique. For that, different MRI acquisition sequences (spin echo, gradient echo and turbo 3D), algorithms for MRI analysis (GLCM, NGLDM, GLRLM and GLCM-NGLDM-GLRLM) and predictive data mining techniques (multiple linear regression and isotonic regression) were tested. The correlation coefficient (R) and mean absolute error (MAE) were used to validate the prediction results. The combination of spin echo, GLCM and isotonic regression produced the most accurate results. In addition, the MRI data from dry-cured loins seem to be more suitable than the data from fresh loins. The application of predictive data mining techniques to computational texture features from the MRI data of loins enables the determination of the sensory traits of dry-cured loins in a non-destructive way. © 2016 Society of Chemical Industry.
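
    As a hedged illustration of the GLCM-plus-isotonic-regression combination that performed best, the sketch below computes one GLCM texture feature per synthetic image and fits a monotone relation to made-up sensory scores. It assumes scikit-image's graycomatrix/graycoprops (the spelling used in skimage >= 0.19); real MRI slices and panel scores would replace the synthetic inputs.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.isotonic import IsotonicRegression

    rng = np.random.default_rng(4)
    images = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(30)]

    def glcm_contrast(img):
        # Gray-level co-occurrence matrix at distance 1, angle 0.
        glcm = graycomatrix(img, distances=[1], angles=[0],
                            levels=256, symmetric=True, normed=True)
        return graycoprops(glcm, "contrast")[0, 0]

    contrast = np.array([glcm_contrast(im) for im in images])
    sensory = 2.0 + 0.001 * contrast + rng.normal(0, 0.2, 30)  # made-up scores

    # Isotonic regression: monotone, nonparametric link from texture to score.
    iso = IsotonicRegression(out_of_bounds="clip").fit(contrast, sensory)
    print("predicted score:", round(float(iso.predict([contrast[0]])[0]), 2))
    ```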

  18. Analysis of electrohydrodynamic jetting using multifunctional and three-dimensional tomography

    NASA Astrophysics Data System (ADS)

    Ko, Han Seo; Nguyen, Xuan Hung; Lee, Soo-Hong; Kim, Young Hyun

    2013-11-01

    A three-dimensional optical tomography technique was developed to reconstruct three-dimensional flow fields using a set of two-dimensional shadowgraphic images and normal gray images. From three high-speed cameras, which were positioned at an offset angle of 45° relative to one another, the number, size and location of electrohydrodynamic jets with respect to the nozzle position were analyzed using shadowgraphic tomography employing a multiplicative algebraic reconstruction technique (MART). Additionally, a flow field inside a cone-shaped liquid (Taylor cone), which was induced under an electric field, was also observed using a simultaneous multiplicative algebraic reconstruction technique (SMART) for reconstructing intensities of particle light combined with a three-dimensional cross-correlation. Various velocity fields of a circulating flow inside the cone-shaped liquid due to different physico-chemical properties of the liquid and applied voltages were also investigated. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean government (MEST) (No. S-2011-0023457).

  19. Comparison of imaging characteristics of multiple-beam equalization and storage phosphor direct digitizer radiographic systems

    NASA Astrophysics Data System (ADS)

    Sankaran, A.; Chuang, Keh-Shih; Yonekawa, Hisashi; Huang, H. K.

    1992-06-01

    The imaging characteristics of two chest radiographic systems, Advanced Multiple Beam Equalization Radiography (AMBER) and the Konica Direct Digitizer [using a storage phosphor (SP) plate], have been compared. The variables affecting image quality and the computer display/reading systems used are detailed. Utilizing specially designed wedge, geometric, and anthropomorphic phantoms, studies were conducted on: exposure and energy response of detectors; nodule detectability; different exposure techniques; and various look-up tables (LUTs), gray scale displays and laser printers. Methods for scatter estimation and reduction were investigated. It is concluded that AMBER with screen-film and equalization techniques provides better nodule detectability than SP plates. However, SP plates have other advantages, such as flexibility in the selection of exposure techniques, image processing features, and excellent sensitivity when combined with optimum reader operating modes. The equalization feature of AMBER provides better nodule detectability under the denser regions of the chest. Results of diagnostic accuracy are demonstrated with nodule detectability plots and analysis of images obtained with phantoms.

  20. The Research of Multiple Attenuation Based on Feedback Iteration and Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Xu, X.; Tong, S.; Wang, L.

    2017-12-01

    Multiple suppression is a difficult problem in seismic data processing. The traditional technology for multiple attenuation is based on the principle of minimum output energy of the seismic signal; this criterion rests on second-order statistics and cannot achieve multiple attenuation when the primaries and multiples are non-orthogonal. In order to solve this problem, we combine a feedback iteration method based on the wave equation with an improved independent component analysis (ICA) based on higher-order statistics to suppress the multiples. We first use the iterative feedback method to predict the free-surface multiples of each order. Then, so that the predicted multiples match the real multiples in amplitude and phase, we design an expanded pseudo-multichannel matching filtering method to obtain a more accurate matching result. Finally, we apply an improved FastICA algorithm, based on maximizing the non-Gaussianity of the output signal, to the matched multiples and obtain better separation of the primaries and the multiples. The advantage of our method is that no prior information is needed for the prediction of the multiples, and a better separation result can be achieved. The method has been applied to several synthetic datasets generated by the finite-difference modeling technique and to the Sigsbee2B model multiple data; the primaries and multiples are non-orthogonal in these models. The experiments show that after three to four iterations we obtain accurate multiple predictions. Using our matching method and FastICA adaptive multiple subtraction, we can not only effectively preserve the energy of the primaries in the seismic records but also effectively suppress the free-surface multiples, especially the multiples related to the middle and deep areas.
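
    A toy sketch of the ICA separation step follows; the wave-equation prediction and matching stages described above are collapsed into a fixed "predicted multiple" trace, and all signals are synthetic. Real workflows operate on many traces at once.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    # Treat the recorded trace and the matched multiple prediction as two
    # mixtures of the (primary, multiple) sources and let FastICA unmix them.
    rng = np.random.default_rng(5)
    t = np.linspace(0, 1, 2000)
    primary = np.sign(np.sin(40 * t))       # strongly non-Gaussian source
    multiple = np.sin(25 * t) ** 3

    recorded = primary + 0.8 * multiple     # trace = primary + multiple
    predicted = 0.1 * primary + 1.0 * multiple   # imperfect multiple prediction
    X = np.column_stack([recorded, predicted])

    sources = FastICA(n_components=2, random_state=0).fit_transform(X)
    # One recovered component aligns with the primary (up to sign and scale).
    for s in sources.T:
        print("corr with primary:", round(abs(np.corrcoef(s, primary)[0, 1]), 2))
    ```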

  1. A model of human decision making in multiple process monitoring situations

    NASA Technical Reports Server (NTRS)

    Greenstein, J. S.; Rouse, W. B.

    1982-01-01

    Human decision making in multiple process monitoring situations is considered. It is proposed that human decision making in many multiple process monitoring situations can be modeled in terms of the human's detection of process-related events and his allocation of attention among processes once he feels events have occurred. A mathematical model of human event detection and attention allocation performance in multiple process monitoring situations is developed. An assumption made in developing the model is that, in attempting to detect events, the human generates estimates of the probabilities that events have occurred. An elementary pattern recognition technique, discriminant analysis, is used to model the human's generation of these probability estimates. The performance of the model is compared to that of four subjects in a multiple process monitoring situation requiring allocation of attention among processes.
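
    A small sketch of the discriminant-analysis idea on synthetic data: estimate a per-process probability that an event has occurred, then allocate attention to the most probable one. The features and dynamics are invented for illustration, not taken from the model in the paper.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(6)
    n = 400
    features = rng.normal(size=(n, 2))   # e.g., recent deviation statistics
    event = (features @ np.array([1.5, -1.0]) + rng.normal(0, 1, n) > 0).astype(int)

    # Discriminant analysis learns to map features to P(event occurred).
    lda = LinearDiscriminantAnalysis().fit(features, event)

    processes = rng.normal(size=(3, 2))            # current state of 3 processes
    p_event = lda.predict_proba(processes)[:, 1]   # event probability per process
    print("attend to process", int(np.argmax(p_event)),
          "probabilities:", p_event.round(2))
    ```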

  2. Neural-adaptive control of single-master-multiple-slaves teleoperation for coordinated multiple mobile manipulators with time-varying communication delays and input uncertainties.

    PubMed

    Li, Zhijun; Su, Chun-Yi

    2013-09-01

    In this paper, adaptive neural network control is investigated for single-master-multiple-slaves teleoperation in consideration of time delays and input dead-zone uncertainties for multiple mobile manipulators carrying a common object in a cooperative manner. Firstly, concise dynamics of teleoperation systems consisting of a single master robot, multiple coordinated slave robots, and the object are developed in the task space. To handle asymmetric time-varying delays in communication channels and unknown asymmetric input dead zones, the nonlinear dynamics of the teleoperation system are transformed into two subsystems through feedback linearization: local master or slave dynamics including the unknown input dead zones and delayed dynamics for the purpose of synchronization. Then, a model reference neural network control strategy based on linear matrix inequalities (LMI) and adaptive techniques is proposed. The developed control approach ensures that the defined tracking errors converge to zero whereas the coordination internal force errors remain bounded and can be made arbitrarily small. Throughout this paper, stability analysis is performed via explicit Lyapunov techniques under specific LMI conditions. The proposed adaptive neural network control scheme is robust against motion disturbances, parametric uncertainties, time-varying delays, and input dead zones, which is validated by simulation studies.

  3. Label-free imaging of trabecular meshwork cells using Coherent Anti-Stokes Raman Scattering (CARS) microscopy

    PubMed Central

    Lei, Tim C.; Ammar, David A.; Masihzadeh, Omid; Gibson, Emily A.

    2011-01-01

    Purpose To image the human trabecular meshwork (TM) using a non-invasive, non-destructive technique without the application of exogenous label. Methods Flat-mounted TM samples from a human cadaver eye were imaged using two nonlinear optical techniques: coherent anti-Stokes Raman scattering (CARS) and two-photon autofluorescence (TPAF). In TPAF, two optical photons are simultaneously absorbed and excite molecules in the sample that then emit a higher energy photon. The signal is predominately from collagen and elastin. The CARS technique uses two laser frequencies to specifically excite carbon-hydrogen bonds, allowing the visualization of lipid-rich cell membranes. Multiple images were taken along an axis perpendicular to the surface of the TM for subsequent analysis. Results Analysis of multiple TPAF images of the TM reveals the characteristic overlapping bundles of collagen of various sizes. Simultaneous CARS imaging revealed elliptical structures of ~7×10 µm in diameter populating the meshwork which were consistent with TM cells. Irregularly shaped objects of ~4 µm diameter appeared in both the TPAF and CARS channels, and are consistent with melanin granules. Conclusions CARS techniques were successful in imaging live TM cells in freshly isolated human TM samples. Similar images have been obtained with standard histological techniques, however the method described here has the advantage of being performed on unprocessed, unfixed tissue free from the potential distortions of the fine tissue morphology that can occur due to infusion of fixatives and treatment with alcohols. CARS imaging of the TM represents a new avenue for exploring details of aqueous outflow and TM cell physiology. PMID:22025898

  4. Multifactorial analysis of human blood cell responses to clinical total body irradiation

    NASA Technical Reports Server (NTRS)

    Yuhas, J. M.; Stokes, T. R.; Lushbaugh, C. C.

    1972-01-01

    Multiple regression analysis techniques are used to study the effects of therapeutic radiation exposure, number of fractions, and time on such quantal responses as tumor control and skin injury. The potential of these methods for the analysis of human blood cell responses is demonstrated and estimates are given of the effects of total amount of exposure and time of protraction in determining the minimum white blood cell concentration observed after exposure of patients from four disease groups.

  5. Multiple Criteria Decision Analysis for Health Care Decision Making--An Introduction: Report 1 of the ISPOR MCDA Emerging Good Practices Task Force.

    PubMed

    Thokala, Praveen; Devlin, Nancy; Marsh, Kevin; Baltussen, Rob; Boysen, Meindert; Kalo, Zoltan; Longrenn, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Ijzerman, Maarten

    2016-01-01

    Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting, objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making and a set of techniques, known under the collective heading multiple criteria decision analysis (MCDA), are useful for this purpose. MCDA methods are widely used in other sectors, and recently there has been an increase in health care applications. In 2014, ISPOR established an MCDA Emerging Good Practices Task Force. It was charged with establishing a common definition for MCDA in health care decision making and developing good practice guidelines for conducting MCDA to aid health care decision making. This initial ISPOR MCDA task force report provides an introduction to MCDA - it defines MCDA; provides examples of its use in different kinds of decision making in health care (including benefit risk analysis, health technology assessment, resource allocation, portfolio decision analysis, shared patient clinician decision making and prioritizing patients' access to services); provides an overview of the principal methods of MCDA; and describes the key steps involved. Upon reviewing this report, readers should have a solid overview of MCDA methods and their potential for supporting health care decision making. Copyright © 2016. Published by Elsevier Inc.

  6. Estimation of Theaflavins (TF) and Thearubigins (TR) Ratio in Black Tea Liquor Using Electronic Vision System

    NASA Astrophysics Data System (ADS)

    Akuli, Amitava; Pal, Abhra; Ghosh, Arunangshu; Bhattacharyya, Nabarun; Bandhopadhyya, Rajib; Tamuly, Pradip; Gogoi, Nagen

    2011-09-01

    Quality of black tea is generally assessed using organoleptic tests by professional tea tasters. They determine the quality of black tea based on its appearance (in dry condition and during liquor formation), aroma and taste. Variation in the above parameters is actually contributed by a number of chemical compounds like Theaflavins (TF), Thearubigins (TR), Caffeine, Linalool, Geraniol etc. Among the above, TF and TR are the most important chemical compounds, which actually contribute to the formation of taste, colour and brightness in tea liquor. Estimation of TF and TR in black tea is generally done using a spectrophotometer instrument. But the analysis technique requires a rigorous and time-consuming effort for sample preparation, and the operation of a costly spectrophotometer requires expert manpower. To overcome the above problems, an Electronic Vision System based on digital image processing techniques has been developed. The system is faster, low cost and repeatable, and it can accurately estimate the TF to TR ratio for black tea liquor. The data analysis is done using Principal Component Analysis (PCA), Multiple Linear Regression (MLR) and Multiple Discriminant Analysis (MDA). A correlation has been established between the colour of tea liquor images and the TF to TR ratio. This paper describes the newly developed E-Vision system, experimental methods, data analysis algorithms and, finally, the performance of the E-Vision System as compared to the results of a traditional spectrophotometer.
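
    In the spirit of the described analysis chain, here is a hedged PCA-plus-MLR sketch on synthetic color features; the feature set and data are hypothetical stand-ins, not the E-Vision system's actual pipeline.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(7)
    n = 120
    # Hypothetical per-image color features (e.g., mean R, G, B and HSV stats).
    color_features = rng.normal(size=(n, 9))
    tf_tr_ratio = color_features @ rng.normal(size=9) * 0.1 + 1.0 \
                  + rng.normal(0, 0.05, n)

    # Reduce the correlated color features with PCA, then regress with MLR.
    model = make_pipeline(PCA(n_components=3), LinearRegression())
    model.fit(color_features[:100], tf_tr_ratio[:100])
    print("held-out R2:",
          round(model.score(color_features[100:], tf_tr_ratio[100:]), 3))
    ```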

  7. Identifying configurations of behavior change techniques in effective medication adherence interventions: a qualitative comparative analysis.

    PubMed

    Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara

    2016-05-04

    Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76 %) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50 %) and uniquely accounting for 15 (44 %). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.

  8. Improvements to direct quantitative analysis of multiple microRNAs facilitating faster analysis.

    PubMed

    Ghasemi, Farhad; Wegman, David W; Kanoatov, Mirzo; Yang, Burton B; Liu, Stanley K; Yousef, George M; Krylov, Sergey N

    2013-11-05

    Studies suggest that patterns of deregulation in sets of microRNA (miRNA) can be used as cancer diagnostic and prognostic biomarkers. Establishing a "miRNA fingerprint"-based diagnostic technique requires a suitable miRNA quantitation method. The appropriate method must be direct, sensitive, capable of simultaneous analysis of multiple miRNAs, rapid, and robust. Direct quantitative analysis of multiple microRNAs (DQAMmiR) is a recently introduced capillary electrophoresis-based hybridization assay that satisfies most of these criteria. Previous implementations of the method suffered, however, from slow analysis time and required lengthy and stringent purification of hybridization probes. Here, we introduce a set of critical improvements to DQAMmiR that address these technical limitations. First, we have devised an efficient purification procedure that achieves the required purity of the hybridization probe in a fast and simple fashion. Second, we have optimized the concentrations of the DNA probe to decrease the hybridization time to 10 min. Lastly, we have demonstrated that the increased probe concentrations and decreased incubation time removed the need for masking DNA, further simplifying the method and increasing its robustness. The presented improvements bring DQAMmiR closer to use in a clinical setting.

  9. Use of partial dissolution techniques in geochemical exploration

    USGS Publications Warehouse

    Chao, T.T.

    1984-01-01

    Application of partial dissolution techniques to geochemical exploration has advanced from an early empirical approach to an approach based on sound geochemical principles. This advance assures a prominent future position for the use of these techniques in geochemical exploration for concealed mineral deposits. Partial dissolution techniques are classified as single dissolution or sequential multiple dissolution depending on the number of steps taken in the procedure, or as "nonselective" extraction and as "selective" extraction in terms of the relative specificity of the extraction. The choice of dissolution techniques for use in geochemical exploration is dictated by the geology of the area, the type and degree of weathering, and the expected chemical forms of the ore and of the pathfinding elements. Case histories have illustrated many instances where partial dissolution techniques exhibit advantages over conventional methods of chemical analysis used in geochemical exploration. ?? 1984.

  10. FAST: A multi-processed environment for visualization of computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin

    1991-01-01

    Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY2 and CRAY-YMP supercomputers. With multiple processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.

  11. Advanced Automation for Ion Trap Mass Spectrometry-New Opportunities for Real-Time Autonomous Analysis

    NASA Technical Reports Server (NTRS)

    Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)

    1994-01-01

    The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. A broader acceptance of this technique has not yet been realized as it requires large, complex, and costly instrumentation which has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Towards this end we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan proxies, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications will be described.

  12. SU-F-T-349: Dosimetric Comparison of Three Different Simultaneous Integrated Boost Irradiation Techniques for Multiple Brain Metastases: Intensity-Modulatedradiotherapy, Hybrid Intensity-Modulated Radiotherapy and Volumetric Modulated Arc Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, X; Sun, T; Yin, Y

    Purpose: To study the dosimetric impact of intensity-modulated radiotherapy (IMRT), hybrid intensity-modulated radiotherapy (h-IMRT) and volumetric modulated arc therapy (VMAT) for whole-brain radiotherapy (WBRT) with simultaneous integrated boost in patients with multiple brain metastases. Methods: Ten patients with multiple brain metastases were included in this analysis. The prescribed dose was 45 Gy to the whole brain (PTVWBRT) and 55 Gy to individual brain metastases (PTVboost), delivered simultaneously in 25 fractions. Three treatment techniques were designed: the 7 equally spaced fields IMRT plan, the hybrid IMRT plan, and VMAT with two 358° arcs. In the hybrid IMRT plan, two fields (90° and 270°) were planned for the whole brain. This was used as a base dose plan. Then a 5-field IMRT plan was optimized based on the two-field plan. The dose distribution in the target, the dose to the organs at risk, and the total MU for the three techniques were compared. Results: For the target dose, conformity and homogeneity in the PTV, no statistically significant differences were observed among the three techniques. For the maximum dose to the bilateral lenses and the mean dose to the bilateral eyes, the IMRT and h-IMRT plans showed the highest and lowest values, respectively. No statistically significant differences were observed in the doses to the optic nerve and brainstem. For the monitor units, the IMRT and VMAT plans showed the highest and lowest values, respectively. Conclusion: For WBRT with simultaneous integrated boost in patients with multiple brain metastases, hybrid IMRT could reduce the doses to the lenses and eyes. It is feasible for patients with brain metastases.

  13. A simple algorithm for quantifying DNA methylation levels on multiple independent CpG sites in bisulfite genomic sequencing electropherograms.

    PubMed

    Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A

    2008-06-01

    DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are either laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
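
    The core peak-height ratio behind this kind of quantification can be sketched in a few lines. The example below uses hypothetical peak heights and a bare C/(C+T) ratio; the paper's Mquant method adds normalization steps not reproduced here.

        # Sketch: estimate per-site DNA methylation from bisulfite-sequencing
        # electropherogram peak heights (hypothetical values, not Mquant itself).

        def methylation_level(c_peak: float, t_peak: float) -> float:
            """Fraction methylated at one CpG: C signal / (C + T) signal."""
            total = c_peak + t_peak
            if total == 0:
                raise ValueError("no signal at this site")
            return c_peak / total

        # Peak heights (arbitrary fluorescence units) at three CpG sites.
        sites = [(820.0, 240.0), (150.0, 900.0), (500.0, 500.0)]
        for i, (c, t) in enumerate(sites, start=1):
            print(f"CpG {i}: {100 * methylation_level(c, t):.1f}% methylated")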

  14. New Techniques for Thermo-electrochemical Analysis of Lithium-ion Batteries for Space Applications

    NASA Technical Reports Server (NTRS)

    Walker, William; Ardebili, H.

    2013-01-01

    The overall goal of this study was achieved: the numerical assessment performed by Chen et al. (2005) was replicated, and the ability of Thermal Desktop to be coupled with thermo-electrochemical analysis techniques was demonstrated, such that the local heat generated on the cells is a function of the model itself using logic blocks and arrays. Differences between the Thermal Desktop temperature vs. depth-of-discharge profiles and Chen's were most likely due to differences in two primary areas: contact regions and conductance values, and density and specific heat values. The model results are highly dependent on the accuracy of the material properties with respect to the multiple layers of an individual cell.

  15. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
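
    The Monte Carlo step described above can be sketched compactly. The following uses synthetic criteria rasters and assumed AHP-style weights, not the paper's data: weights are perturbed and re-normalized, the weighted-sum susceptibility is recomputed, and the per-cell spread serves as an uncertainty surface.

        # Sketch of Monte Carlo weight sensitivity for a weighted-sum
        # (AHP-style) susceptibility model; rasters and weights are synthetic.
        import numpy as np

        rng = np.random.default_rng(0)
        n_criteria, shape = 4, (50, 50)
        criteria = rng.random((n_criteria, *shape))        # standardized criteria layers
        base_w = np.array([0.4, 0.3, 0.2, 0.1])            # assumed AHP-derived weights

        runs = []
        for _ in range(1000):
            w = base_w + rng.normal(0, 0.05, n_criteria)   # perturb the weights
            w = np.clip(w, 0, None); w /= w.sum()          # keep valid, re-normalize
            runs.append(np.tensordot(w, criteria, axes=1)) # weighted-sum susceptibility

        runs = np.stack(runs)
        uncertainty = runs.std(axis=0)   # per-cell spread = sensitivity to weights
        print("mean susceptibility:", runs.mean().round(3),
              "max per-cell std:", uncertainty.max().round(3))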

  16. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.

  17. A novel universal real-time PCR system using the attached universal duplex probes for quantitative analysis of nucleic acids

    PubMed Central

    Yang, Litao; Liang, Wanqi; Jiang, Lingxi; Li, Wenquan; Cao, Wei; Wilson, Zoe A; Zhang, Dabing

    2008-01-01

    Background: Real-time PCR techniques are widely used for nucleic acid analysis, but one limitation of current real-time PCR methods is the high cost of the labeled probe for each target molecule. Results: We describe a real-time PCR technique employing attached universal duplex probes (AUDP), which has the advantage over current real-time PCR methods of generating fluorescence by probe hydrolysis and strand displacement. AUDP involves one set of universal duplex probes in which the 5' end of the fluorescent probe (FP) and a complementary quenching probe (QP) lie in close proximity so that fluorescence can be quenched. The PCR primer pair carries an attached universal template (UT), and the FP is identical to the UT sequence. We have shown that the AUDP technique can be used for detecting multiple target DNA sequences in both simplex and duplex real-time PCR assays for gene expression analysis, genotype identification, and genetically modified organism (GMO) quantification, with sensitivity, reproducibility, and repeatability comparable to other real-time PCR methods. Conclusion: The results from gene expression analysis, genotype identification, and GMO quantification using AUDP real-time PCR assays indicate that the AUDP technique has been successfully applied in nucleic acid analysis, and that it offers an alternative for nucleic acid analysis with high efficiency, reliability, and flexibility at low cost. PMID:18522756

  18. Multiple directed graph large-class multi-spectral processor

    NASA Technical Reports Server (NTRS)

    Casasent, David; Liu, Shiaw-Dong; Yoneyama, Hideyuki

    1988-01-01

    Numerical analysis techniques for the interpretation of high-resolution imaging-spectrometer data are described and demonstrated. The method proposed involves the use of (1) a hierarchical classifier with a tree structure generated automatically by a Fisher linear-discriminant-function algorithm and (2) a novel multiple-directed-graph scheme which reduces the local maxima and the number of perturbations required. Results for a 500-class test problem involving simulated imaging-spectrometer data are presented in tables and graphs; 100-percent-correct classification is achieved with an improvement factor of 5.

  19. Uncovering multiple pathways to substance use: a comparison of methods for identifying population subgroups.

    PubMed

    Dierker, Lisa; Rose, Jennifer; Tan, Xianming; Li, Runze

    2010-12-01

    This paper describes and compares a selection of available modeling techniques for identifying homogeneous population subgroups in the interest of informing targeted substance use intervention. We present a nontechnical review of the common and unique features of three methods: (a) trajectory analysis, (b) functional hierarchical linear modeling (FHLM), and (c) decision tree methods. Differences among the techniques are described, including required data features, strengths and limitations in terms of the flexibility with which outcomes and predictors can be modeled, and the potential of each technique for helping to inform the selection of targets and timing of substance intervention programs.
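
    As a toy illustration of the decision-tree approach to subgroup identification, the sketch below fits a shallow tree to synthetic risk-factor data; all variable names and effect sizes are invented for illustration.

        # Sketch: using a decision tree to expose homogeneous subgroups with
        # respect to a substance-use outcome (synthetic data, not the paper's).
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(1)
        n = 500
        age_onset = rng.integers(11, 19, n)          # age of first use (invented)
        peer_use = rng.integers(0, 2, n)             # peer substance use, 0/1 (invented)
        # Risk is elevated for early onset combined with peer use.
        p = 0.15 + 0.4 * (age_onset < 14) * peer_use
        y = rng.random(n) < p

        X = np.column_stack([age_onset, peer_use])
        tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=30).fit(X, y)
        print(export_text(tree, feature_names=["age_onset", "peer_use"]))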

  20. A modal parameter extraction procedure applicable to linear time-invariant dynamic systems

    NASA Technical Reports Server (NTRS)

    Kurdila, A. J.; Craig, R. R., Jr.

    1985-01-01

    Modal analysis has emerged as a valuable tool in many phases of the engineering design process. Complex vibration and acoustic problems in new designs can often be remedied through use of the method. Moreover, the technique has been used to enhance the conceptual understanding of structures by serving to verify analytical models. A new modal parameter estimation procedure is presented. The technique is applicable to linear, time-invariant systems and accommodates multiple input excitations. In order to provide a background for the derivation of the method, some modal parameter extraction procedures currently in use are described. Key features implemented in the new technique are elaborated upon.

  1. Consistent detection and identification of individuals in a large camera network

    NASA Astrophysics Data System (ADS)

    Colombo, Alberto; Leung, Valerie; Orwell, James; Velastin, Sergio A.

    2007-10-01

    In the wake of an increasing number of terrorist attacks, counter-terrorism measures are now a main focus of many research programmes. An important issue for the police is the ability to track individuals and groups reliably through underground stations, and in the case of post-event analysis, to be able to ascertain whether specific individuals have been at the station previously. While many motion detection and tracking algorithms exist, their reliable deployment in a large network is still ongoing research. Specifically, to track individuals through multiple views, on multiple levels and between levels, consistent detection and labelling of individuals is crucial. In view of these issues, we have developed a change detection algorithm to work reliably in the presence of periodic movements, e.g. escalators and scrolling advertisements, as well as a content-based retrieval technique for identification. The change detection technique automatically extracts periodically varying elements in the scene using Fourier analysis, and constructs a Markov model for the process. Training is performed online, and no manual intervention is required, making this system suitable for deployment in large networks. Experiments on real data show significant improvement over existing techniques. The content-based retrieval technique uses MPEG-7 descriptors to identify individuals. Given the environment under which the system operates, i.e. at relatively low resolution, this approach is suitable for short timescales. For longer timescales, other forms of identification, such as gait or, if the resolution allows, face recognition, will be required.
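
    The periodic-background idea lends itself to a compact illustration. The following is a minimal sketch on synthetic pixel time series, not the authors' implementation: a pixel whose intensity is dominated by a single non-DC Fourier component is flagged as a periodic scene element rather than a moving person.

        # Sketch: flag periodically varying pixels (e.g., escalator steps) by
        # the dominance of one non-DC Fourier component in their time series.
        import numpy as np

        rng = np.random.default_rng(2)
        T = 256                                   # frames
        t = np.arange(T)
        static = rng.normal(0.5, 0.02, T)         # noisy but static pixel
        periodic = 0.5 + 0.3 * np.sin(2 * np.pi * t / 16) + rng.normal(0, 0.02, T)

        def periodicity_score(series: np.ndarray) -> float:
            spec = np.abs(np.fft.rfft(series - series.mean()))
            return spec.max() / (spec.sum() + 1e-12)   # energy share of strongest tone

        for name, s in [("static", static), ("escalator-like", periodic)]:
            print(f"{name}: score = {periodicity_score(s):.2f}")
        # A score near 1 suggests a periodic background element to be modeled,
        # rather than a moving person.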

  2. Coagulation dynamics of a blood sample by multiple scattering analysis

    NASA Astrophysics Data System (ADS)

    Faivre, Magalie; Peltié, Philippe; Planat-Chrétien, Anne; Cosnier, Marie-Line; Cubizolles, Myriam; Nougier, Christophe; Négrier, Claude; Pouteau, Patrick

    2011-05-01

    We report a new technique to measure coagulation dynamics in whole-blood samples. The method relies on the analysis of the speckle pattern resulting from a whole-blood sample mixed with a coagulation reagent and introduced into a thin chamber illuminated with coherent light. A dynamic study of the speckle reveals a typical behavior due to coagulation. We compare our measured coagulation times to a reference method obtained in a medical laboratory.

  3. Spectral Analysis of Breast Cancer on Tissue Microarrays: Seeing Beyond Morphology

    DTIC Science & Technology

    2005-04-01


  4. A new cooperative MIMO scheme based on SM for energy-efficiency improvement in wireless sensor network.

    PubMed

    Peng, Yuyang; Choi, Jaeho

    2014-01-01

    Improving the energy efficiency in wireless sensor networks (WSN) has attracted considerable attention. The multiple-input multiple-output (MIMO) technique has been proven to be a good candidate for improving energy efficiency, but it may not be feasible in a WSN due to the size limitation of the sensor node. As a solution, the cooperative multiple-input multiple-output (CMIMO) technique overcomes this constraint and shows dramatically good performance. In this paper, a new CMIMO scheme based on the spatial modulation (SM) technique, named CMIMO-SM, is proposed for energy-efficiency improvement. We first establish the system model of CMIMO-SM. Based on this model, the transmission approach is introduced graphically. In order to evaluate the performance of the proposed scheme, a detailed analysis in terms of energy consumption per bit of the proposed scheme compared with conventional CMIMO is presented. Guided by this new scheme, we then extend CMIMO-SM to a multihop clustered WSN to further improve energy efficiency by finding an optimal hop length; the traditional equidistant-hop scheme is used for comparison. Results from the simulations and numerical experiments indicate that the proposed scheme achieves significant savings in total energy consumption. Combining the proposed scheme with monitoring sensor nodes will provide good performance in arbitrarily deployed WSNs such as forest fire detection systems.

  5. Effect of denoising on supervised lung parenchymal clusters

    NASA Astrophysics Data System (ADS)

    Jayamani, Padmapriya; Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.

    2012-03-01

    Denoising is a critical preconditioning step for quantitative analysis of medical images. Despite promises for more consistent diagnosis, denoising techniques are seldom explored in clinical settings. While this may be attributed to the esoteric nature of the parameter-sensitive algorithms, the lack of quantitative measures of their efficacy in enhancing clinical decision making is a primary cause of physician apathy. This paper addresses this issue by exploring the effect of denoising on the integrity of supervised lung parenchymal clusters. Multiple Volumes of Interest (VOIs) were selected across multiple high-resolution CT scans to represent samples of different patterns (normal, emphysema, ground glass, honeycombing and reticular). The VOIs were labeled through consensus of four radiologists. The original datasets were filtered by multiple denoising techniques (median filtering, anisotropic diffusion, bilateral filtering and non-local means) and the corresponding filtered VOIs were extracted. A plurality of cluster indices based on multiple histogram-based pair-wise similarity measures was used to assess the quality of supervised clusters in the original and filtered space. The resultant rank orders were analyzed using the Borda criteria to find the denoising-similarity measure combination that has the best cluster quality. Our exhaustive analysis reveals that (a) for a number of similarity measures, the cluster quality is inferior in the filtered space; and (b) for measures that benefit from denoising, simple median filtering outperforms non-local means and bilateral filtering. Our study suggests the need to judiciously choose, if required, a denoising technique that does not deteriorate the integrity of supervised clusters.

  6. Quantile Regression in the Study of Developmental Sciences

    ERIC Educational Resources Information Center

    Petscher, Yaacov; Logan, Jessica A. R.

    2014-01-01

    Linear regression analysis is one of the most common techniques applied in developmental research, but only allows for an estimate of the average relations between the predictor(s) and the outcome. This study describes quantile regression, which provides estimates of the relations between the predictor(s) and outcome, but across multiple points of…
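
    A minimal quantile-regression example in the spirit of the description above, using statsmodels on synthetic data in which the predictor-outcome relation varies across the outcome distribution:

        # Sketch: quantile regression at several conditional quantiles
        # with statsmodels (synthetic developmental-style data).
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(3)
        n = 400
        x = rng.uniform(0, 10, n)                            # predictor
        y = 2 + 0.5 * x + rng.normal(0, 0.3 + 0.15 * x, n)   # spread grows with x

        df = pd.DataFrame({"x": x, "y": y})
        for q in (0.1, 0.5, 0.9):
            fit = smf.quantreg("y ~ x", df).fit(q=q)
            print(f"q={q}: slope = {fit.params['x']:.3f}")
        # Unlike OLS, the slope differs across quantiles when predictor-outcome
        # relations are not uniform across the outcome distribution.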

  7. Fiscal Impacts and Redistributive Effects of the New Federalism on Michigan School Districts.

    ERIC Educational Resources Information Center

    Kearney, C. Philip; Kim, Taewan

    1990-01-01

    The fiscal impacts and redistribution effects of the recently enacted (1981) federal education block grant on 525 elementary and secondary school districts in Michigan were examined using a quasi-experimental time-series design and multiple regression and analysis of covariance techniques. Implications of changes in federal policy are discussed.…

  8. User’s guide to SNAP for ArcGIS® :ArcGIS interface for scheduling and network analysis program

    Treesearch

    Woodam Chung; Dennis Dykstra; Fred Bower; Stephen O’Brien; Richard Abt; John Sessions

    2012-01-01

    This document introduces a software package named SNAP for ArcGIS®, which has been developed to streamline scheduling and transportation planning for timber harvest areas. Using modern optimization techniques, it can be used to spatially schedule timber harvest with consideration of harvesting costs, multiple products, alternative...

  9. Comparison of Radio Frequency Distinct Native Attribute and Matched Filtering Techniques for Device Discrimination and Operation Identification

    DTIC Science & Technology

    Unintentional radiated emissions (URE) from ten MSP430F5529 16-bit microcontrollers were analyzed using: 1) RF distinct native attribute (RF-DNA) fingerprints paired with multiple discriminant analysis/maximum likelihood (MDA/ML) classification, and 2) RF-DNA fingerprints paired with generalized relevance learning vector quantization.

  10. Corruption in Higher Education: Conceptual Approaches and Measurement Techniques

    ERIC Educational Resources Information Center

    Osipian, Ararat L.

    2007-01-01

    Corruption is a complex and multifaceted phenomenon. Forms of corruption are multiple. Measuring corruption is necessary not only for getting ideas about the scale and scope of the problem, but for making simple comparisons between the countries and conducting comparative analysis of corruption. While the total impact of corruption is indeed…

  11. Classification of mathematics deficiency using shape and scale analysis of 3D brain structures

    NASA Astrophysics Data System (ADS)

    Kurtek, Sebastian; Klassen, Eric; Gore, John C.; Ding, Zhaohua; Srivastava, Anuj

    2011-03-01

    We investigate the use of a recent technique for shape analysis of brain substructures in identifying learning disabilities in third-grade children. This Riemannian technique provides a quantification of differences in shapes of parameterized surfaces, using a distance that is invariant to rigid motions and re-parameterizations. Additionally, it provides an optimal registration across surfaces for improved matching and comparisons. We utilize an efficient gradient based method to obtain the optimal re-parameterizations of surfaces. In this study we consider 20 different substructures in the human brain and correlate the differences in their shapes with abnormalities manifested in deficiency of mathematical skills in 106 subjects. The selection of these structures is motivated in part by the past links between their shapes and cognitive skills, albeit in broader contexts. We have studied the use of both individual substructures and multiple structures jointly for disease classification. Using a leave-one-out nearest neighbor classifier, we obtained a 62.3% classification rate based on the shape of the left hippocampus. The use of multiple structures resulted in an improved classification rate of 71.4%.
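
    The leave-one-out nearest-neighbor step is easy to sketch once pairwise shape distances are in hand. The example below substitutes a random symmetric matrix for the Riemannian shape distances, so it only illustrates the classification mechanics:

        # Sketch: leave-one-out 1-NN classification from a precomputed pairwise
        # shape-distance matrix (random stand-in for the Riemannian metric).
        import numpy as np

        rng = np.random.default_rng(4)
        n = 106
        labels = rng.integers(0, 2, n)              # 0 = control, 1 = deficiency
        D = rng.random((n, n)); D = (D + D.T) / 2   # symmetric distances
        np.fill_diagonal(D, np.inf)                 # exclude self-matches

        pred = labels[np.argmin(D, axis=1)]         # label of nearest other subject
        print(f"LOO 1-NN rate: {100 * np.mean(pred == labels):.1f}%")
        # With random distances this hovers near chance; an informative shape
        # metric is what pushes the rate above it.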

  12. Integrated data analysis for genome-wide research.

    PubMed

    Steinfath, Matthias; Repsilber, Dirk; Scholz, Matthias; Walther, Dirk; Selbig, Joachim

    2007-01-01

    Integrated data analysis is introduced as the intermediate level of a systems biology approach to analyse different 'omics' datasets, i.e., genome-wide measurements of transcripts, protein levels or protein-protein interactions, and metabolite levels aiming at generating a coherent understanding of biological function. In this chapter we focus on different methods of correlation analyses ranging from simple pairwise correlation to kernel canonical correlation which were recently applied in molecular biology. Several examples are presented to illustrate their application. The input data for this analysis frequently originate from different experimental platforms. Therefore, preprocessing steps such as data normalisation and missing value estimation are inherent to this approach. The corresponding procedures, potential pitfalls and biases, and available software solutions are reviewed. The multiplicity of observations obtained in omics-profiling experiments necessitates the application of multiple testing correction techniques.
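
    As a small illustration of pairwise 'omics' correlation followed by multiple-testing correction, the sketch below plants one true gene-metabolite association in synthetic data and controls the false discovery rate with Benjamini-Hochberg:

        # Sketch: pairwise transcript-metabolite correlations with
        # Benjamini-Hochberg multiple-testing correction (synthetic data).
        import numpy as np
        from scipy.stats import pearsonr
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(5)
        n_samples, n_genes, n_mets = 30, 20, 15
        genes = rng.normal(size=(n_samples, n_genes))
        mets = rng.normal(size=(n_samples, n_mets))
        mets[:, 0] += 1.5 * genes[:, 0]             # plant one true association

        pvals = [pearsonr(genes[:, i], mets[:, j])[1]
                 for i in range(n_genes) for j in range(n_mets)]
        reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        print(f"{reject.sum()} of {len(pvals)} pairs significant after FDR control")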

  13. From Molecules to Cells to Organisms: Understanding Health and Disease with Multidimensional Single-Cell Methods

    NASA Astrophysics Data System (ADS)

    Candia, Julián

    2013-03-01

    The multidimensional nature of many single-cell measurements (e.g. multiple markers measured simultaneously using Fluorescence-Activated Cell Sorting (FACS) technologies) offers unprecedented opportunities to unravel emergent phenomena that are governed by the cooperative action of multiple elements across different scales, from molecules and proteins to cells and organisms. We will discuss an integrated analysis framework to investigate multicolor FACS data from different perspectives: Singular Value Decomposition to achieve an effective dimensional reduction in the data representation, machine learning techniques to separate different patient classes and improve diagnosis, as well as a novel cell-similarity network analysis method to identify cell subpopulations in an unbiased manner. Besides FACS data, this framework is versatile: in this vein, we will demonstrate an application to the multidimensional single-cell shape analysis of healthy and prematurely aged cells.

  14. A conflict analysis of 4D descent strategies in a metered, multiple-arrival route environment

    NASA Technical Reports Server (NTRS)

    Izumi, K. H.; Harris, C. S.

    1990-01-01

    A conflict analysis was performed on multiple arrival traffic at a typical metered airport. The Flow Management Evaluation Model (FMEM) was used to simulate arrival operations using Denver Stapleton's arrival route structure. Sensitivities of conflict performance to three different 4-D descent strategies (clean-idle Mach/Constant AirSpeed (CAS), constant descent angle Mach/CAS, and energy optimal) were examined for three traffic mixes represented by those found at Denver Stapleton, John F. Kennedy, and typical en route metering (ERM) airports. The Monte Carlo technique was used to generate simulation entry point times. Analysis results indicate that the clean-idle descent strategy offers the best compromise in overall performance. Performance measures primarily include susceptibility to conflict and conflict severity. Fuel usage performance is extrapolated from previous descent strategy studies.

  15. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    DOE PAGES

    Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...

    2008-01-01

    The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  16. Hierarchical modeling and robust synthesis for the preliminary design of large scale complex systems

    NASA Astrophysics Data System (ADS)

    Koch, Patrick Nathan

    Large-scale complex systems are characterized by multiple interacting subsystems and the analysis of multiple disciplines. The design and development of such systems inevitably requires the resolution of multiple conflicting objectives. The size of complex systems, however, prohibits the development of comprehensive system models, and thus these systems must be partitioned into their constituent parts. Because simultaneous solution of individual subsystem models is often not manageable, iteration is inevitable and often excessive. In this dissertation, these issues are addressed through the development of a method for hierarchical robust preliminary design exploration, which facilitates concurrent system and subsystem design exploration and the concurrent generation of robust system and subsystem specifications for the preliminary design of multi-level, multi-objective, large-scale complex systems. This method is developed through the integration and expansion of current design techniques: (1) hierarchical partitioning and modeling techniques for partitioning large-scale complex systems into more tractable parts and allowing integration of subproblems for system synthesis, (2) statistical experimentation and approximation techniques for increasing both the efficiency and the comprehensiveness of preliminary design exploration, and (3) noise modeling techniques for implementing robust preliminary design when approximate models are employed. The method and associated approaches are illustrated through their application to the preliminary design of a commercial turbofan turbine propulsion system; the turbofan system-level problem is partitioned into engine cycle and configuration design, and a compressor module is integrated for more detailed subsystem-level design exploration, improving system evaluation.

  17. Handling Missing Data With Multilevel Structural Equation Modeling and Full Information Maximum Likelihood Techniques.

    PubMed

    Schminkey, Donna L; von Oertzen, Timo; Bullock, Linda

    2016-08-01

    With increasing access to population-based data and electronic health records for secondary analysis, missing data are common. In the social and behavioral sciences, missing data frequently are handled with multiple imputation methods or full information maximum likelihood (FIML) techniques, but healthcare researchers have not embraced these methodologies to the same extent and more often use either traditional imputation techniques or complete case analysis, which can compromise power and introduce unintended bias. This article is a review of options for handling missing data, concluding with a case study demonstrating the utility of multilevel structural equation modeling using full information maximum likelihood (MSEM with FIML) to handle large amounts of missing data. MSEM with FIML is a parsimonious and hypothesis-driven strategy to cope with large amounts of missing data without compromising power or introducing bias. This technique is relevant for nurse researchers faced with ever-increasing amounts of electronic data and decreasing research budgets. © 2016 Wiley Periodicals, Inc.

  18. A study and experiment plan for digital mobile communication via satellite

    NASA Technical Reports Server (NTRS)

    Jones, J. J.; Craighill, E. J.; Evans, R. G.; Vincze, A. D.; Tom, N. N.

    1978-01-01

    The viability of mobile communications is examined within the context of a frequency division multiple access, single channel per carrier satellite system emphasizing digital techniques to serve a large population of users. The intent is to provide the mobile users with a grade of service consistent with the requirements for remote, rural (perhaps emergency) voice communications, but which approaches toll-quality speech. A traffic model is derived on which to base the determination of the required maximum number of satellite channels to provide the anticipated level of service. Various voice digitization and digital modulation schemes are reviewed along with a general link analysis of the mobile system. Demand assignment multiple access considerations and analysis tradeoffs are presented. Finally, a complete configuration is described.

  19. Missing data treatments matter: an analysis of multiple imputation for anterior cervical discectomy and fusion procedures.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Cui, Jonathan J; Basques, Bryce A; Albert, Todd J; Grauer, Jonathan N

    2018-04-09

    The presence of missing data is a limitation of large datasets, including the National Surgical Quality Improvement Program (NSQIP). In addressing this issue, most studies use complete case analysis, which excludes cases with missing data, thus potentially introducing selection bias. Multiple imputation, a statistically rigorous approach that approximates missing data and preserves sample size, may be an improvement over complete case analysis. The present study aims to evaluate the impact of using multiple imputation in comparison with complete case analysis for assessing the associations between preoperative laboratory values and adverse outcomes following anterior cervical discectomy and fusion (ACDF) procedures. This is a retrospective review of prospectively collected data. Patients undergoing one-level ACDF were identified in NSQIP 2012-2015. Perioperative adverse outcome variables assessed included the occurrence of any adverse event, severe adverse events, and hospital readmission. Missing preoperative albumin and hematocrit values were handled using complete case analysis and multiple imputation. These preoperative laboratory levels were then tested for associations with 30-day postoperative outcomes using logistic regression. A total of 11,999 patients were included. Of this cohort, 63.5% of patients had missing preoperative albumin and 9.9% had missing preoperative hematocrit. When using complete case analysis, only 4,311 patients were studied. The removed patients were significantly younger, healthier, of a common body mass index, and male. Logistic regression analysis failed to identify either preoperative hypoalbuminemia or preoperative anemia as significantly associated with adverse outcomes. When employing multiple imputation, all 11,999 patients were included. Preoperative hypoalbuminemia was significantly associated with the occurrence of any adverse event and severe adverse events. Preoperative anemia was significantly associated with the occurrence of any adverse event, severe adverse events, and hospital readmission. Multiple imputation is a rigorous statistical procedure that is being increasingly used to address missing values in large datasets. Using this technique for ACDF avoided the loss of cases that may have affected the representativeness and power of the study and led to different results than complete case analysis. Multiple imputation should be considered for future spine studies. Copyright © 2018 Elsevier Inc. All rights reserved.
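
    The multiple-imputation workflow can be sketched on synthetic stand-ins for the NSQIP variables: impute several completed datasets with chained equations, fit the logistic model to each, and pool with Rubin's rules. Variable names, thresholds, and effect sizes below are invented; the paper's actual models are richer.

        # Sketch of multiple imputation by chained equations followed by
        # Rubin's-rules pooling (synthetic stand-in for the NSQIP variables).
        import numpy as np
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer
        import statsmodels.api as sm

        rng = np.random.default_rng(6)
        n = 1000
        albumin = rng.normal(4.0, 0.5, n)
        hct = 30 + 2.5 * albumin + rng.normal(0, 2, n)     # correlated labs
        logit = -2 + 1.2 * (albumin < 3.5) + 0.8 * (hct < 36)
        adverse = rng.random(n) < 1 / (1 + np.exp(-logit))

        X = np.column_stack([albumin, hct])
        X[rng.random(n) < 0.6, 0] = np.nan                 # ~60% missing albumin

        coefs, variances = [], []
        for seed in range(5):                              # m = 5 imputed datasets
            imp = IterativeImputer(sample_posterior=True, random_state=seed)
            Xi = imp.fit_transform(X)
            design = sm.add_constant((Xi < [3.5, 36]).astype(float))
            fit = sm.Logit(adverse.astype(float), design).fit(disp=0)
            coefs.append(fit.params[1]); variances.append(fit.bse[1] ** 2)

        m = len(coefs)
        pooled = np.mean(coefs)                            # Rubin's rules
        total_var = np.mean(variances) + (1 + 1 / m) * np.var(coefs, ddof=1)
        print(f"pooled hypoalbuminemia log-odds: {pooled:.2f} "
              f"± {np.sqrt(total_var):.2f}")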

  20. A near-optimal low complexity sensor fusion technique for accurate indoor localization based on ultrasound time of arrival measurements from low-quality sensors

    NASA Astrophysics Data System (ADS)

    Mitilineos, Stelios A.; Argyreas, Nick D.; Thomopoulos, Stelios C. A.

    2009-05-01

    A fusion-based localization technique for location-based services in indoor environments is introduced herein, based on ultrasound time-of-arrival measurements from multiple off-the-shelf range-estimating sensors which are used in a market-available localization system. In-situ field measurement results indicated that the respective off-the-shelf system was unable to estimate position in most of the cases, while the underlying sensors are of low quality and yield highly inaccurate range and position estimates. An extensive analysis is performed and a model of the sensor performance characteristics is established. A low-complexity but accurate sensor fusion and localization technique is then developed, which consists of evaluating multiple sensor measurements and selecting the one considered most accurate based on the underlying sensor model. Optimality, in the sense of a genie selecting the optimum sensor, is subsequently evaluated and compared to the proposed technique. The experimental results indicate that the proposed fusion method exhibits near-optimal performance and, albeit being theoretically suboptimal, it largely overcomes most flaws of the underlying single-sensor system, resulting in a localization system of increased accuracy, robustness and availability.
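
    A toy version of the selection idea, using a static per-sensor error model rather than the paper's measurement-dependent model, picks the modeled-best sensor per fix:

        # Sketch: selection-style fusion that keeps the range measurement whose
        # modeled sensor error is smallest (illustrative static error model).
        import numpy as np

        rng = np.random.default_rng(13)
        true_range = 7.5                                  # meters
        sensor_sigma = np.array([0.3, 1.2, 0.8])          # modeled accuracies

        readings = true_range + rng.normal(0, sensor_sigma, (1000, 3))
        fused = readings[:, sensor_sigma.argmin()]        # pick modeled-best sensor
        naive = readings.mean(axis=1)                     # plain averaging

        for name, est in [("selection fusion", fused), ("naive average", naive)]:
            rmse = np.sqrt(((est - true_range) ** 2).mean())
            print(f"{name}: RMSE = {rmse:.3f} m")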

  1. Isotope ratio measurements of pg-size plutonium samples using TIMS in combination with "multiple ion counting" and filament carburization

    NASA Astrophysics Data System (ADS)

    Jakopic, Rozle; Richter, Stephan; Kühn, Heinz; Benedik, Ljudmila; Pihlar, Boris; Aregbe, Yetunde

    2009-01-01

    A sample preparation procedure for isotopic measurements using thermal ionization mass spectrometry (TIMS) was developed which employs the technique of carburization of rhenium filaments. Carburized filaments were prepared in a special vacuum chamber in which the filaments were exposed to benzene vapour as a carbon supply and carburized electrothermally. To find the optimal conditions for the carburization and isotopic measurements using TIMS, the influence of various parameters such as benzene pressure, carburization current and the exposure time were tested. As a result, carburization of the filaments improved the overall efficiency by one order of magnitude. Additionally, a new "multi-dynamic" measurement technique was developed for Pu isotope ratio measurements using a "multiple ion counting" (MIC) system. This technique was combined with filament carburization and applied to the NBL-137 isotopic standard and samples of the NUSIMEP 5 inter-laboratory comparison campaign, which included certified plutonium materials at the ppt-level. The multi-dynamic measurement technique for plutonium, in combination with filament carburization, has been shown to significantly improve the precision and accuracy for isotopic analysis of environmental samples with low-levels of plutonium.

  2. Multiple regression technique for Pth degree polynomials with and without linear cross products

    NASA Technical Reports Server (NTRS)

    Davis, J. W.

    1973-01-01

    A multiple regression technique was developed by which the nonlinear behavior of specified independent variables can be related to a given dependent variable. The polynomial expression can be of Pth degree and can incorporate N independent variables. Two cases are treated such that mathematical models can be studied both with and without linear cross products. The resulting surface fits can be used to summarize trends for a given phenomenon and provide a mathematical relationship for subsequent analysis. To implement this technique, separate computer programs were developed for the case without linear cross products and for the case incorporating such cross products, which evaluate the various constants in the model regression equation. In addition, the significance of the estimated regression equation is considered, and the standard deviation, the F statistic, the maximum absolute percent error, and the average of the absolute values of the percent error are evaluated. The computer programs and their manner of utilization are described. Sample problems are included to illustrate the use and capability of the technique, showing the output formats and typical plots comparing computer results to each set of input data.
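
    The two design-matrix variants, pure powers only versus pure powers plus linear cross products, can be sketched with a plain least-squares solve (synthetic data, not the original programs):

        # Sketch: least-squares fit of a Pth-degree polynomial in N variables,
        # with pure power terms and optional linear cross products x_i * x_j.
        import numpy as np
        from itertools import combinations

        def design_matrix(X, degree, cross_products=False):
            cols = [np.ones(len(X))]
            for j in range(X.shape[1]):            # powers x_j, x_j^2, ..., x_j^P
                cols += [X[:, j] ** p for p in range(1, degree + 1)]
            if cross_products:                      # linear cross terms x_i * x_j
                cols += [X[:, i] * X[:, j]
                         for i, j in combinations(range(X.shape[1]), 2)]
            return np.column_stack(cols)

        rng = np.random.default_rng(7)
        X = rng.uniform(-1, 1, (200, 2))
        y = 1 + 2 * X[:, 0] - X[:, 1] ** 2 + 0.5 * X[:, 0] * X[:, 1]

        for cross in (False, True):
            A = design_matrix(X, degree=2, cross_products=cross)
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((A @ coef - y) ** 2)
            print(f"cross products={cross}: residual sum of squares = {rss:.4f}")
        # The cross-product model recovers the x0*x1 interaction that the
        # pure-power model cannot represent.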

  3. Estimation of liver T₂ in transfusion-related iron overload in patients with weighted least squares T₂ IDEAL.

    PubMed

    Vasanawala, Shreyas S; Yu, Huanzhou; Shimakawa, Ann; Jeng, Michael; Brittain, Jean H

    2012-01-01

    MR imaging of hepatic iron overload can be achieved by estimating T(2) values using multiple-echo sequences. The purpose of this work is to develop and clinically evaluate a weighted least squares algorithm based on the T(2) Iterative Decomposition of water and fat with Echo Asymmetry and Least-squares estimation (IDEAL) technique for volumetric estimation of hepatic T(2) in the setting of iron overload. The weighted least squares T(2) IDEAL technique improves T(2) estimation by automatically decreasing the impact of later, noise-dominated echoes. The technique was evaluated in 37 patients with iron overload. Each patient underwent (i) a standard 2D multiple-echo gradient echo sequence for T(2) assessment with nonlinear exponential fitting, and (ii) a 3D T(2) IDEAL technique, with and without a weighted least squares fit. Regression and Bland-Altman analysis demonstrated strong correlation between conventional 2D and T(2) IDEAL estimation. In cases of severe iron overload, T(2) IDEAL without weighted least squares reconstruction resulted in a relative overestimation of T(2) compared with weighted least squares. Copyright © 2011 Wiley-Liss, Inc.
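
    The weighting idea can be sketched with a log-linear monoexponential fit in which each echo is weighted by its own signal, so late, noise-dominated echoes count less. This is a simplification of the T(2) IDEAL reconstruction, with invented echo times and noise levels:

        # Sketch: fit S(TE) = S0 * exp(-TE / T2) in log space, weighting each
        # echo by its signal so late, noise-dominated echoes contribute less.
        import numpy as np

        rng = np.random.default_rng(8)
        t2_true, s0 = 2.0, 1000.0                      # ms; severe iron overload
        te = np.array([1.0, 2.5, 4.0, 5.5, 7.0, 8.5])  # echo times (ms, invented)
        signal = s0 * np.exp(-te / t2_true) + rng.normal(0, 15, te.size)
        signal = np.clip(signal, 1.0, None)            # keep the log defined

        def fit_t2(te, s, weights):
            # Weighting log residuals by s approximates 1/sigma weighting,
            # since the log-signal noise scales as sigma/s.
            coef = np.polyfit(te, np.log(s), 1, w=weights)
            return -1.0 / coef[0]

        print(f"unweighted: T2 = {fit_t2(te, signal, np.ones_like(te)):.2f} ms")
        print(f"weighted  : T2 = {fit_t2(te, signal, signal):.2f} ms")
        # Down-weighting the noise floor at late echoes reduces the T2
        # overestimation seen in severe iron overload.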

  4. A default Bayesian hypothesis test for mediation.

    PubMed

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).

  5. Analysis of multiple tank car releases in train accidents.

    PubMed

    Liu, Xiang; Liu, Chang; Hong, Yili

    2017-10-01

    There are annually over two million carloads of hazardous materials transported by rail in the United States. The American railroads use large blocks of tank cars to transport petroleum crude oil and other flammable liquids from production to consumption sites. Being different from roadway transport of hazardous materials, a train accident can potentially result in the derailment and release of multiple tank cars, which may result in significant consequences. The prior literature predominantly assumes that the occurrence of multiple tank car releases in a train accident is a series of independent Bernoulli processes, and thus uses the binomial distribution to estimate the total number of tank car releases given the number of tank cars derailing or damaged. This paper shows that the traditional binomial model can incorrectly estimate multiple tank car release probability by magnitudes in certain circumstances, thereby significantly affecting railroad safety and risk analysis. To bridge this knowledge gap, this paper proposes a novel, alternative Correlated Binomial (CB) model that accounts for the possible correlations of multiple tank car releases in the same train. We test three distinct correlation structures in the CB model, and find that they all outperform the conventional binomial model based on empirical tank car accident data. The analysis shows that considering tank car release correlations would result in a significantly improved fit of the empirical data than otherwise. Consequently, it is prudent to consider alternative modeling techniques when analyzing the probability of multiple tank car releases in railroad accidents. Copyright © 2017 Elsevier Ltd. All rights reserved.
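
    One simple way to see why release correlation matters is to compare an independent binomial with a beta-binomial, in which a shared per-train release probability induces positive correlation between cars. This is an illustrative structure with invented parameters, not necessarily one of the paper's three CB variants:

        # Sketch: independent binomial vs. a correlated (beta-binomial)
        # model for the number of releasing tank cars among those derailed.
        import numpy as np

        rng = np.random.default_rng(9)
        n_derailed, p_release, trains = 10, 0.2, 100_000

        # Independent model: each derailed car releases with probability p.
        indep = rng.binomial(n_derailed, p_release, trains)

        # Correlated model: p varies train-to-train around the same mean,
        # inducing positive correlation between cars in the same train.
        a, b = 0.8, 3.2                               # Beta mean = a/(a+b) = 0.2
        p_train = rng.beta(a, b, trains)
        corr = rng.binomial(n_derailed, p_train)

        for name, x in [("independent", indep), ("correlated", corr)]:
            print(f"{name}: mean={x.mean():.2f}, var={x.var():.2f}, "
                  f"P(all 10 release)={(x == n_derailed).mean():.4f}")
        # Same mean, but the correlated model puts far more probability on
        # extreme outcomes such as every derailed car releasing.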

  6. Minimal invasive epicardial lead implantation: optimizing cardiac resynchronization with a new mapping device for epicardial lead placement.

    PubMed

    Maessen, J G; Phelps, B; Dekker, A L A J; Dijkman, B

    2004-05-01

    To optimize resynchronization in biventricular pacing with epicardial leads, mapping to determine the best pacing site is a prerequisite. A port-access surgical mapping technique was developed that allowed multiple pacing site selection and reproducible lead evaluation and implantation. Pressure-volume loop analysis was used for real-time guidance in targeting epicardial lead placement. Even the smallest changes in lead position revealed significantly different functional results. Optimizing the pacing site with this technique allowed functional improvement of up to 40% versus random pacing site selection.

  7. Estimating free-body modal parameters from tests of a constrained structure

    NASA Technical Reports Server (NTRS)

    Cooley, Victor M.

    1993-01-01

    Hardware advances in suspension technology for ground tests of large space structures provide near on-orbit boundary conditions for modal testing. Further advances in determining free-body modal properties of constrained large space structures have been made, on the analysis side, by using time domain parameter estimation and perturbing the stiffness of the constraints over multiple sub-tests. In this manner, passive suspension constraint forces, which are fully correlated and therefore not usable for spectral averaging techniques, are made effectively uncorrelated. The technique is demonstrated with simulated test data.

  8. Quantitative Analysis of Drugs with Highly Different Concentrations of Pharmaceutical Components Using Spectral Subtraction Techniques

    NASA Astrophysics Data System (ADS)

    Ayoub, B. M.

    2017-11-01

    Two simple spectrophotometric methods were developed for determination of empagliflozin and metformin by manipulating their ratio spectra, with application to a recently approved pharmaceutical combination, Synjardy® tablets. A spiking technique was used to increase the concentration of empagliflozin after extraction from the tablets to allow its simultaneous determination with metformin. Validation parameters according to ICH guidelines were acceptable over the concentration range of 2-12 μg/mL for both drugs using the constant multiplication and spectrum subtraction methods. The optimized methods are suitable for QC labs.

  9. A new method for inferring carbon monoxide concentrations from gas filter radiometer data

    NASA Technical Reports Server (NTRS)

    Wallio, H. A.; Reichle, H. G., Jr.; Casas, J. C.; Gormsen, B. B.

    1981-01-01

    A method for inferring carbon monoxide concentrations from gas filter radiometer data is presented. The technique can closely approximate the results of more costly line-by-line radiative transfer calculations over a wide range of altitudes, ground temperatures, and carbon monoxide concentrations. The technique can also be used over a larger range of conditions than those used for the regression analysis. Because the inference of the carbon monoxide mixing ratio requires only addition, multiplication and a minimum of logic, the method can be implemented on very small computers or microprocessors.

  10. 3D polymer gel dosimetry using a 3D (DESS) and a 2D MultiEcho SE (MESE) sequence

    NASA Astrophysics Data System (ADS)

    Maris, Thomas G.; Pappas, Evangelos; Karolemeas, Kostantinos; Papadakis, Antonios E.; Zacharopoulou, Fotini; Papanikolaou, Nickolas; Gourtsoyiannis, Nicholas

    2006-12-01

    The utilization of 3D techniques in Magnetic Resonance Imaging data acquisition and post-processing analysis is a prerequisite, especially when modern radiotherapy techniques (conformal RT, IMRT, stereotactic RT) are to be used. The aim of this work is to compare a 3D Double Echo Steady State (DESS) and a 2D Multiple Echo Spin Echo (MESE) sequence in 3D MRI radiation dosimetry, using two different MRI scanners and utilising N-vinylpyrrolidone (VIPAR)-based polymer gels.

  11. Application of a Modified Time Delay Spectrometry Technique in Modeling of Underwater Acoustic Propagation.

    DTIC Science & Technology

    1987-03-01

    An analysis of sound propagating by multiple paths in an ocean at short ranges has been conducted using a Modified Time Delay Spectrometry (TDS) technique.

  12. Theory and simulations of covariance mapping in multiple dimensions for data analysis in high-event-rate experiments

    NASA Astrophysics Data System (ADS)

    Zhaunerchyk, V.; Frasinski, L. J.; Eland, J. H. D.; Feifel, R.

    2014-05-01

    Multidimensional covariance analysis and its validity for correlation of processes leading to multiple products are investigated from a theoretical point of view. The need to correct for false correlations induced by experimental parameters which fluctuate from shot to shot, such as the intensity of self-amplified spontaneous emission x-ray free-electron laser pulses, is emphasized. Threefold covariance analysis based on simple extension of the two-variable formulation is shown to be valid for variables exhibiting Poisson statistics. In this case, false correlations arising from fluctuations in an unstable experimental parameter that scale linearly with signals can be eliminated by threefold partial covariance analysis, as defined here. Fourfold covariance based on the same simple extension is found to be invalid in general. Where fluctuations in an unstable parameter induce nonlinear signal variations, a technique of contingent covariance analysis is proposed here to suppress false correlations. In this paper we also show a method to eliminate false correlations associated with fluctuations of several unstable experimental parameters.
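
    The two-variable partial covariance that the paper extends to higher dimensions can be sketched directly: subtracting cov(X,I)·cov(I,Y)/var(I) removes false correlations that scale linearly with a fluctuating parameter I, such as pulse intensity. The data below are synthetic.

        # Sketch: covariance and partial covariance maps over shots, removing
        # false correlations driven by a fluctuating parameter I (intensity).
        import numpy as np

        rng = np.random.default_rng(10)
        shots, bins = 5000, 64
        I = rng.gamma(5.0, 1.0, shots)                 # fluctuating intensity
        base = rng.random(bins)
        # Two spectra whose yields both scale linearly with I (no true correlation).
        X = np.outer(I, base) + rng.normal(0, 0.5, (shots, bins))
        Y = np.outer(I, base[::-1]) + rng.normal(0, 0.5, (shots, bins))

        def cov_map(A, B):
            return (A - A.mean(0)).T @ (B - B.mean(0)) / (len(A) - 1)

        c_xy = cov_map(X, Y)
        c_xi = cov_map(X, I[:, None]); c_iy = cov_map(I[:, None], Y)
        pcov = c_xy - (c_xi @ c_iy) / I.var(ddof=1)    # partial covariance
        print("max |cov| :", np.abs(c_xy).max().round(3))
        print("max |pcov|:", np.abs(pcov).max().round(3))
        # pcov is near zero: the intensity-driven false correlations are gone.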

  13. Limitations of quantitative analysis of deep crustal seismic reflection data: Examples from GLIMPCE

    USGS Publications Warehouse

    Lee, Myung W.; Hutchinson, Deborah R.

    1992-01-01

    Amplitude preservation in seismic reflection data can be obtained by a relative true amplitude (RTA) processing technique in which the relative strength of reflection amplitudes is preserved vertically as well as horizontally, after compensating for amplitude distortion by near-surface effects and propagation effects. Quantitative analysis of relative true amplitudes of the Great Lakes International Multidisciplinary Program on Crustal Evolution seismic data is hampered by large uncertainties in estimates of the water bottom reflection coefficient and the vertical amplitude correction and by inadequate noise suppression. Processing techniques such as deconvolution, F-K filtering, and migration significantly change the overall shape of amplitude curves and hence calculation of reflection coefficients and average reflectance. Thus lithological interpretation of deep crustal seismic data based on the absolute value of estimated reflection strength alone is meaningless. The relative strength of individual events, however, is preserved on curves generated at different stages in the processing. We suggest that qualitative comparisons of relative strength, if used carefully, provide a meaningful measure of variations in reflectivity. Simple theoretical models indicate that peg-leg multiples rather than water bottom multiples are the most severe source of noise contamination. These multiples are extremely difficult to remove when the water bottom reflection coefficient is large (>0.6), a condition that exists beneath parts of Lake Superior and most of Lake Huron.

  14. Application of Linear Mixed-Effects Models in Human Neuroscience Research: A Comparison with Pearson Correlation in Two Auditory Electrophysiology Studies

    PubMed Central

    Koerner, Tess K.; Zhang, Yang

    2017-01-01

    Neurophysiological studies are often designed to examine relationships between measures from different testing conditions, time points, or analysis techniques within the same group of participants. Appropriate statistical techniques that can take into account repeated measures and multivariate predictor variables are integral and essential to successful data analysis and interpretation. This work implements and compares conventional Pearson correlations and linear mixed-effects (LME) regression models using data from two recently published auditory electrophysiology studies. For the specific research questions in both studies, the Pearson correlation test is inappropriate for determining strengths between the behavioral responses for speech-in-noise recognition and the multiple neurophysiological measures as the neural responses across listening conditions were simply treated as independent measures. In contrast, the LME models allow a systematic approach to incorporate both fixed-effect and random-effect terms to deal with the categorical grouping factor of listening conditions, between-subject baseline differences in the multiple measures, and the correlational structure among the predictor variables. Together, the comparative data demonstrate the advantages as well as the necessity to apply mixed-effects models to properly account for the built-in relationships among the multiple predictor variables, which has important implications for proper statistical modeling and interpretation of human behavior in terms of neural correlates and biomarkers. PMID:28264422
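
    A minimal mixed-model counterpart to a pooled Pearson correlation, with a random intercept per subject over repeated listening conditions (synthetic data, statsmodels formula API):

        # Sketch: random-intercept LME relating a neural measure to a
        # behavioral outcome across repeated listening conditions.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(11)
        subjects, conditions = 20, 4
        subj = np.repeat(np.arange(subjects), conditions)
        cond = np.tile(np.arange(conditions), subjects)
        baseline = rng.normal(0, 1, subjects)[subj]    # subject-level offsets
        neural = rng.normal(0, 1, subj.size) + baseline
        behavior = (0.6 * neural + 0.3 * cond + baseline
                    + rng.normal(0, 0.5, subj.size))

        df = pd.DataFrame({"subject": subj, "condition": cond,
                           "neural": neural, "behavior": behavior})
        fit = smf.mixedlm("behavior ~ neural + condition", df,
                          groups=df["subject"]).fit()
        print(fit.summary())
        # Unlike a pooled Pearson correlation, the random intercept absorbs
        # between-subject baseline differences across the repeated measures.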

  15. Satellite Remote Sensing of Harmful Algal Blooms (HABs) and a Potential Synthesized Framework

    PubMed Central

    Shen, Li; Xu, Huiping; Guo, Xulin

    2012-01-01

    Harmful algal blooms (HABs) are severe ecological disasters threatening aquatic systems throughout the World, which necessitate scientific efforts in detecting and monitoring them. Compared with traditional in situ point observations, satellite remote sensing is considered as a promising technique for studying HABs due to its advantages of large-scale, real-time, and long-term monitoring. The present review summarizes the suitability of current satellite data sources and different algorithms for detecting HABs. It also discusses the spatial scale issue of HABs. Based on the major problems identified from previous literature, including the unsystematic understanding of HABs, the insufficient incorporation of satellite remote sensing, and a lack of multiple oceanographic explanations of the mechanisms causing HABs, this review also attempts to provide a comprehensive understanding of the complicated mechanism of HABs impacted by multiple oceanographic factors. A potential synthesized framework can be established by combining multiple accessible satellite remote sensing approaches including visual interpretation, spectra analysis, parameters retrieval and spatial-temporal pattern analysis. This framework aims to lead to a systematic and comprehensive monitoring of HABs based on satellite remote sensing from multiple oceanographic perspectives. PMID:22969372

  16. Air-to-air radar flight testing

    NASA Astrophysics Data System (ADS)

    Scott, Randall E.

    1988-06-01

    This volume in the AGARD Flight Test Techniques Series describes flight test techniques, flight test instrumentation, ground simulation, data reduction and analysis methods used to determine the performance characteristics of a modern air-to-air (a/a) radar system. Following a general coverage of specification requirements, test plans, support requirements, development and operational testing, and management information systems, the report goes into more detailed flight test techniques covering a/a radar capabilities of: detection, manual acquisition, automatic acquisition, tracking a single target, and detection and tracking of multiple targets. There follows a section on additional flight test considerations such as electromagnetic compatibility, electronic countermeasures, displays and controls, degraded and backup modes, radome effects, environmental considerations, and use of testbeds. Other sections cover ground simulation, flight test instrumentation, and data reduction and analysis. The final sections deal with reporting and a discussion of considerations for the future and how they may affect radar flight testing.

  17. Investigating effects of communications modulation technique on targeting performance

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Eusebio, Gerald; Huling, Edward

    2006-05-01

    One of the key challenges facing the global war on terrorism (GWOT) and urban operations is the increased need for rapid and diverse information from distributed sources. For users to get adequate information on target types and movements, they would need reliable data. In order to facilitate reliable computational intelligence, we seek to explore the communication modulation tradeoffs affecting information distribution and accumulation. In this analysis, we explore the modulation techniques of Orthogonal Frequency Division Multiplexing (OFDM), Direct Sequence Spread Spectrum (DSSS), and statistical time-division multiple access (TDMA) as a function of the bit error rate and jitter that affect targeting performance. In the analysis, we simulate a Link 16 with a simple bandpass frequency shift keying (FSK) technique using different signal-to-noise ratios. The communications transfer delay and accuracy tradeoffs are assessed as to the effects incurred in targeting performance.

  18. Monitoring technique for multiple power splitter-passive optical networks using a tunable OTDR and FBGs

    NASA Astrophysics Data System (ADS)

    Hann, Swook; Kim, Dong-Hwan; Park, Chang-Soo

    2006-04-01

    A monitoring technique for multiple power splitter-passive optical networks (PS-PON) is presented. The technique is based on remote sensing of fiber Bragg gratings (FBG) using a tunable OTDR. To monitor the multiple PS-PON, an FBG can be used as a wavelength-dependent reflective reference on each branch end of the PS. The FBG helps discern individual events in the multiple PS-PON, in combination with information from the Rayleigh backscattered power. The multiple PS-PON can be analyzed with this monitoring method from the central office during 10-Gbit/s in-service operation.

  19. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis☆

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criterion weights. PMID:25843987
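
    A minimal sketch of the Monte Carlo weight-sensitivity step, assuming a plain weighted linear combination; the criterion count, base weights, and Dirichlet perturbation are illustrative stand-ins for the paper's AHP/OWA settings:

    ```python
    # Perturb criterion weights with Monte Carlo draws and measure how a
    # weighted-linear-combination susceptibility score varies per cell.
    import numpy as np

    rng = np.random.default_rng(42)
    n_cells, n_criteria = 1000, 5                  # e.g. slope, lithology, ...
    criteria = rng.random((n_cells, n_criteria))   # standardized criterion maps
    base_w = np.array([0.35, 0.25, 0.20, 0.12, 0.08])

    n_runs = 2000
    scores = np.empty((n_runs, n_cells))
    for i in range(n_runs):
        # Dirichlet draw centred on the base weights keeps weights positive
        # and summing to one (a common way to randomize MCDA weights).
        w = rng.dirichlet(base_w * 100)
        scores[i] = criteria @ w

    mean_map = scores.mean(axis=0)    # expected susceptibility per cell
    std_map = scores.std(axis=0)      # per-cell uncertainty from the weights
    print("max per-cell std of susceptibility:", std_map.max().round(4))
    ```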

  20. Error determination of a successive correction type objective analysis scheme. [for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.; Leslie, F. W.

    1984-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a successive correction type scheme for the analysis of surface meteorological data. The scheme is subjected to a series of experiments to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple pass technique increases the accuracy of the analysis. Furthermore, the tests suggest appropriate values for the analysis parameters in resolving disturbances for the data set used in this investigation.
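
    The successive correction idea can be sketched with a Cressman-type pass (assumed here for illustration; PROAM's actual weight function and analysis parameters are not reproduced):

    ```python
    # Each pass pulls a gridded first guess toward the observations with a
    # shrinking influence radius, as in classic successive-correction schemes.
    import numpy as np

    def cressman_pass(grid_xy, field, obs_xy, obs_val, radius):
        """One correction pass: distance-weighted obs-minus-analysis update."""
        # estimate the current analysis at each observation (nearest grid point)
        d_obs = np.linalg.norm(grid_xy[None, :, :] - obs_xy[:, None, :], axis=2)
        at_obs = field[d_obs.argmin(axis=1)]
        innov = obs_val - at_obs                       # observation increments
        # spread increments back to the grid with Cressman weights
        d = np.linalg.norm(obs_xy[None, :, :] - grid_xy[:, None, :], axis=2)
        w = np.where(d < radius, (radius**2 - d**2) / (radius**2 + d**2), 0.0)
        wsum = w.sum(axis=1)
        corr = np.where(wsum > 0,
                        (w * innov).sum(axis=1) / np.where(wsum > 0, wsum, 1.0),
                        0.0)
        return field + corr

    rng = np.random.default_rng(0)
    gx, gy = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
    grid_xy = np.column_stack([gx.ravel(), gy.ravel()])
    obs_xy = rng.uniform(0, 10, size=(30, 2))
    obs_val = 15.0 + 0.5 * obs_xy[:, 0] + rng.normal(0, 0.3, 30)  # synthetic temps

    field = np.full(len(grid_xy), obs_val.mean())      # first guess
    for radius in (5.0, 3.0, 1.5):                     # multiple passes
        field = cressman_pass(grid_xy, field, obs_xy, obs_val, radius)
    print("analysed field range:", field.min().round(2), field.max().round(2))
    ```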

  1. Innovative Visualization Techniques applied to a Flood Scenario

    NASA Astrophysics Data System (ADS)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user in collaboratively exploring, presenting and communicating visually complex and dynamic data. Here we present these concepts in the context of a 4-hour flood scenario from Lisbon in 2010, with data that include measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for the colour legend, etc., and capture the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. The second technique, web-based linked views, uses multiple windows which respond interactively to user selections, so that when an object is selected and changed in one window, it automatically updates in all the other windows. These concepts can be part of a collaborative platform, where multiple people share and work together on the data via online access, which also allows remote usage from a mobile platform. Storytelling augments analysis and decision-making capabilities, allowing users to assimilate complex situations and reach informed decisions, in addition to helping the public visualize information. In our visualization scenario, developed in the context of the VA-4D project for the European Space Agency (see http://www.ca3-uninova.org/project_va4d), we make use of the GAV (GeoAnalytics Visualization) framework, a web-oriented visual analytics application based on multiple interactive views. The final visualization that we produce includes multiple interactive views, including a dynamic multi-layer map surrounded by other visualizations such as bar charts, time graphs and scatter plots. The map provides flood and building information on top of a base city map (street maps and/or satellite imagery provided by online map services such as Google Maps, Bing Maps, etc.). Damage over time for selected buildings, damage for all buildings at a chosen time period, and the correlation between damage and water depth can be analysed in the other views. This interactive web-based visualization, which incorporates the ideas of storytelling, web-based linked views, and other visualization techniques for a 4-hour flood event in Lisbon in 2010, can be found online at http://www.ncomva.se/flash/projects/esa/flooding/.

  2. The Immersive Virtual Reality Experience: A Typology of Users Revealed Through Multiple Correspondence Analysis Combined with Cluster Analysis Technique.

    PubMed

    Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz

    2016-03-01

    Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, and despite users getting actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, there are individual factors, such as age, video game knowledge, and the predisposition to immersion, that may be associated with the quality of the virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the right user to be involved in a VRE through a head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with a cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is more associated with higher levels of virtual reality effectiveness, that is, a higher predisposition to be immersed and reduced cybersickness symptoms in the VRE, than the console gamer and nongamer profiles. These findings can be a useful orientation in clinical practice and future research as they help identify which users are more predisposed to benefit from immersive VREs.

  3. Advanced correlation grid: Analysis and visualisation of functional connectivity among multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman; Stuart, Liz

    2017-07-15

    This study analyses multiple spike train (MST) data, defines its functional connectivity and subsequently visualises an accurate diagram of connections. This is a challenging problem; for example, it is difficult to distinguish the common input to two spike trains from a direct functional connection between them. The new method presented in this paper is based on the traditional pairwise cross-correlation function (CCF) and a new combination of statistical techniques. First, the CCF is used to create the Advanced Correlation Grid (ACG), where both the significant peak of the CCF and the corresponding time delay are used for detailed analysis of connectivity. Second, these two features of functional connectivity are used to classify connections. Finally, a visualization technique is used to represent the topology of functional connections. Examples are presented in the paper to demonstrate the new Advanced Correlation Grid method and to show how it enables discrimination between (i) influence from one spike train to another through an intermediate spike train and (ii) influence from one common spike train on another pair of analysed spike trains. The ACG method enables scientists to automatically distinguish direct connections from spurious connections, such as common-source and indirect connections, whereas existing methods require in-depth analysis to identify such connections. The ACG is a new and effective method for studying the functional connectivity of multiple spike trains. It can accurately identify all direct connections and can automatically distinguish common-source and indirect connections. Copyright © 2017 Elsevier B.V. All rights reserved.
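
    A hedged sketch of the pairwise building block: bin two spike trains, scan the cross-correlation over a window of lags, and keep the peak and its lag, with jittered surrogates as a rough significance check; bin width, delay, and jitter window are illustrative, not the ACG's exact settings:

    ```python
    # Pairwise CCF peak/lag detection on synthetic spike trains.
    import numpy as np

    rng = np.random.default_rng(1)
    T, DT = 60.0, 0.005                               # duration (s), bin width (s)

    def bin_spikes(times, t_max=T, dt=DT):
        counts, _ = np.histogram(times, bins=np.arange(0.0, t_max + dt, dt))
        return counts.astype(float)

    def ccf_peak(a, b, max_lag=40):
        a = (a - a.mean()) / (a.std() * np.sqrt(a.size))
        b = (b - b.mean()) / (b.std() * np.sqrt(b.size))
        lags = np.arange(-max_lag, max_lag + 1)
        vals = np.array([np.dot(a[max(0, -k): a.size - max(0, k)],
                                b[max(0, k): b.size - max(0, -k)]) for k in lags])
        i = np.abs(vals).argmax()
        return vals[i], lags[i]                       # peak value, lag in bins

    src = np.sort(rng.uniform(0, T, 600))             # source train
    tgt = np.sort(src + 0.010 + rng.normal(0, 0.002, src.size))  # driven, ~10 ms later

    peak, lag = ccf_peak(bin_spikes(src), bin_spikes(tgt))

    # jitter surrogates estimate the chance level of the CCF peak
    null = [ccf_peak(bin_spikes(np.sort(src + rng.uniform(-0.05, 0.05, src.size))),
                     bin_spikes(tgt))[0] for _ in range(200)]
    thresh = np.percentile(np.abs(null), 99)
    print(f"peak = {peak:.3f} at lag {lag} bins (~{lag * DT * 1e3:.0f} ms); "
          f"99% surrogate threshold = {thresh:.3f}")
    ```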

  4. Deep convolutional neural network based antenna selection in multiple-input multiple-output system

    NASA Astrophysics Data System (ADS)

    Cai, Jiaxin; Li, Yan; Hu, Ying

    2018-03-01

    Antenna selection in wireless communication systems has attracted increasing attention due to the challenge of balancing communication performance against computational complexity in large-scale Multiple-Input Multiple-Output antenna systems. Recently, deep learning based methods have achieved promising performance for large-scale data processing and analysis in many application fields. This paper is the first attempt to introduce the deep learning technique into the field of Multiple-Input Multiple-Output antenna selection in wireless communications. First, labels for the attenuation-coefficient channel matrices are generated by minimizing the key performance indicator of the training antenna systems. Then, a deep convolutional neural network that explicitly exploits the massive latent cues of the attenuation coefficients is learned on the training antenna systems. Finally, we use the trained deep convolutional neural network to classify the channel matrix labels of test antennas and select the optimal antenna subset. Simulation results demonstrate that our method can achieve better performance than state-of-the-art baselines for data-driven wireless antenna selection.

  5. Performance Analysis of a De-correlated Modified Code Tracking Loop for Synchronous DS-CDMA System under Multiuser Environment

    NASA Astrophysics Data System (ADS)

    Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng

    This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems under multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) in the time lead and time lag portions of the signal causes tracking bias or instability problem in the traditional correlating tracking loop like delay lock loop (DLL) or modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike applying the same technique to DLL which requires an extensive search algorithm to compensate the noise imbalance which may introduce small tracking bias under low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias for the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean square tracking error, mean-time-to-lose-lock and near-far resistance than the other tracking schemes, including traditional DLL (T-DLL), traditional MCTL (T-MCTL) and modified de-correlated DLL (MD-DLL).
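
    The de-correlating step can be illustrated on a toy synchronous CDMA model (an assumed setup, not the D-MCTL loop itself): projecting the matched-filter outputs through the inverse of the code correlation matrix removes the MAI even with strong near-far interferers:

    ```python
    # Decorrelating multiuser detection for a toy synchronous DS-CDMA link.
    import numpy as np

    rng = np.random.default_rng(7)
    n_chips, n_users = 31, 4
    # random binary spreading codes, normalized to unit energy
    S = rng.choice([-1.0, 1.0], size=(n_chips, n_users)) / np.sqrt(n_chips)
    bits = rng.choice([-1.0, 1.0], size=n_users)
    amps = np.array([1.0, 2.0, 2.0, 3.0])       # strong interferers (near-far)

    r = S @ (amps * bits) + rng.normal(0, 0.1, n_chips)  # received chip vector

    y_mf = S.T @ r                              # conventional matched filter
    y_dec = np.linalg.solve(S.T @ S, y_mf)      # decorrelating detector output

    print("matched filter signs :", np.sign(y_mf))
    print("decorrelator signs   :", np.sign(y_dec), " true bits:", bits)
    ```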

  6. Multiple Small Diameter Drillings Increase Femoral Neck Stability Compared with Single Large Diameter Femoral Head Core Decompression Technique for Avascular Necrosis of the Femoral Head.

    PubMed

    Brown, Philip J; Mannava, Sandeep; Seyler, Thorsten M; Plate, Johannes F; Van Sikes, Charles; Stitzel, Joel D; Lang, Jason E

    2016-10-26

    Femoral head core decompression is an efficacious joint-preserving procedure for treatment of early stage avascular necrosis. However, postoperative fractures have been described which may be related to the decompression technique used. Femoral head decompressions were performed on 12 matched human cadaveric femora comparing large 8 mm single bore versus multiple 3 mm small drilling techniques. Ultimate failure strength of the femora was tested using a servo-hydraulic material testing system. Ultimate load to failure was compared between the different decompression techniques using two paired ANCOVA linear regression models. Prior to biomechanical testing and after the intervention, volumetric bone mineral density was determined using quantitative computed tomography to account for variation between cadaveric samples and to assess the amount of bone disruption by the core decompression. Core decompression, using the small diameter bore and multiple drilling technique, withstood significantly greater load prior to failure compared with the single large bore technique after adjustment for bone mineral density (p < 0.05). The 8 mm single bore technique removed a significantly larger volume of bone compared to the 3 mm multiple drilling technique (p < 0.001). However, total fracture energy was similar between the two core decompression techniques. When considering core decompression for the treatment of early stage avascular necrosis, the multiple small bore technique removed less bone volume, thereby potentially leading to higher load to failure.

  7. Thermal photons in heavy ion collisions at 158 A GeV

    NASA Astrophysics Data System (ADS)

    Dutt, Sunil

    2018-05-01

    The essence of experimental ultra-relativistic heavy ion collision physics is the production and study of strongly interacting matter at extreme energy densities and temperatures, and the consequent search for the equation of state of nuclear matter. The focus of the analysis has been to examine pseudo-rapidity distributions obtained for the γ-like particles in a pre-shower photon multiplicity detector. This allows the extension of scaled factorial moment analysis to bin sizes smaller than those accessible to other experimental techniques. Scaled factorial moments are calculated using both horizontal (corrected) and vertical analyses. The results are compared with a simulation analysis using the VENUS event generator.
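
    For reference, the standard horizontally averaged scaled factorial moment of order q (a textbook definition, assumed rather than quoted from this abstract) is:

    ```latex
    % Scaled factorial moment of order q for M bins of pseudo-rapidity
    % width \delta\eta, with multiplicity n_m in bin m:
    \[
      F_q(\delta\eta) = \frac{1}{M} \sum_{m=1}^{M}
        \frac{\left\langle n_m (n_m - 1) \cdots (n_m - q + 1) \right\rangle}
             {\left\langle n_m \right\rangle^{q}}
    \]
    % Intermittency appears as power-law growth,
    % F_q(\delta\eta) \propto (\delta\eta)^{-\phi_q}, as \delta\eta \to 0.
    ```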

  8. Using Regression Analysis To Determine If Faculty Salaries Are Overly Compressed. AIR 1997 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Toutkoushian, Robert K.

    This paper proposes a five-step process by which to analyze whether the salary ratio between junior and senior college faculty exhibits salary compression, a term used to describe an unusually small differential between faculty with different levels of experience. The procedure utilizes commonly used statistical techniques (multiple regression…

  9. Examination of the Relation between the Values of Adolescents and Virtual Sensitiveness

    ERIC Educational Resources Information Center

    Yilmaz, Hasan

    2013-01-01

    The aim of this study is to examine the relation between the values adolescents have and virtual sensitiveness. The study is carried out on 447 adolescents, 160 of whom are female, 287 males. The Humanistic Values Scale and Virtual Sensitiveness scale were used. Pearson Product Moment Coefficient and multiple regression analysis techniques were…

  10. Utilization of high performance liquid chromatography coupled to tandem mass spectrometry for characterization of 8-O-methylbostrycoidin production by species of the fungus Fusarium

    USDA-ARS?s Scientific Manuscript database

    The pigment, 8-O-methylbostrycoidin is a polyketide metabolite produced by multiple species of the fungus Fusarium that infects plant crops, including maize. A technique was developed for the analysis of 8-O-methylbostrycoidin by high performance liquid chromatography coupled to electrospray ionizat...

  11. Testing and Evaluating C3I Systems That Employ AI. Volume 1. Handbook for Testing Expert Systems

    DTIC Science & Technology

    1991-01-31

    [Only extraction fragments of this record survive. The recoverable text concerns evaluation designs for testing expert systems, including: designs in which a control group does not receive the system; multiple time-series designs using a control group; and nonequivalent (and nonrandomized) control group designs that rely on statistical techniques like analysis of ... and that obtain pretest and ...]

  12. Source-space ICA for MEG source imaging.

    PubMed

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography/magnetoencephalography (EEG/MEG) source imaging is the application of an inverse technique (such as dipole modelling or sLORETA) to the components extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer a high spatial resolution. However, in order to have both the high spatial resolution of the beamformer and the ability to handle multiple concurrent sources, sensor-space ICA + beamformer is not an ideal combination. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA in both simulated and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in the spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG recordings from two healthy subjects with visual stimuli were also used to compare the performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of the minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.

  13. Hypergraph partitioning implementation for parallelizing matrix-vector multiplication using CUDA GPU-based parallel computing

    NASA Astrophysics Data System (ADS)

    Murni, Bustamam, A.; Ernastuti, Handhika, T.; Kerami, D.

    2017-07-01

    Calculation of matrix-vector multiplication in real-world problems often involves large matrices of arbitrary size. Parallelization is therefore needed to speed up a calculation process that usually takes a long time. Graph partitioning techniques discussed in previous studies cannot be used to parallelize matrix-vector multiplication with matrices of arbitrary size, because graph partitioning assumes square, symmetric matrices. Hypergraph partitioning techniques overcome this shortcoming of graph partitioning. This paper addresses the efficient parallelization of matrix-vector multiplication through hypergraph partitioning techniques using CUDA GPU-based parallel computing. CUDA (compute unified device architecture) is a parallel computing platform and programming model created by NVIDIA and implemented on the GPU (graphics processing unit).
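
    A CPU-side sketch of the partitioned computation (a scipy stand-in, not the paper's CUDA kernels): the rows of an arbitrary, non-square sparse matrix are split into independent blocks of y = Ax. A hypergraph partitioner would choose these parts to minimize communication volume rather than using simple row blocks:

    ```python
    # Row-partitioned sparse matrix-vector multiplication on the CPU.
    import numpy as np
    from scipy import sparse

    rng = np.random.default_rng(3)
    n_rows, n_cols, n_parts = 10_000, 7_000, 4        # rectangular, non-square
    A = sparse.random(n_rows, n_cols, density=1e-3, format="csr", random_state=3)
    x = rng.random(n_cols)

    bounds = np.linspace(0, n_rows, n_parts + 1, dtype=int)
    y = np.empty(n_rows)
    for p in range(n_parts):                          # each part is independent,
        lo, hi = bounds[p], bounds[p + 1]             # so a device could own it
        y[lo:hi] = A[lo:hi] @ x                       # row-block of the product

    assert np.allclose(y, A @ x)                      # matches the serial result
    ```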

  14. Evaluating the decision accuracy and speed of clinical data visualizations.

    PubMed

    Pieczkiewicz, David S; Finkelstein, Stanley M

    2010-01-01

    Clinicians face an increasing volume of biomedical data. Assessing the efficacy of systems that enable accurate and timely clinical decision making merits corresponding attention. This paper discusses the multiple-reader multiple-case (MRMC) experimental design and linear mixed models as means of assessing and comparing decision accuracy and latency (time) for decision tasks in which clinician readers must interpret visual displays of data. These experimental and statistical techniques, used extensively in radiology imaging studies, offer a number of practical and analytic advantages over more traditional quantitative methods such as percent-correct measurements and ANOVAs, and are recommended for their statistical efficiency and generalizability. An example analysis using readily available free and commercial statistical software is provided as an appendix. While these techniques are not appropriate for all evaluation questions, they can provide a valuable addition to the evaluative toolkit of medical informatics research.
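
    A minimal sketch of the recommended mixed-model analysis using Python's statsmodels (the variable names, single random factor, and synthetic data are illustrative; a full MRMC model would cross readers with cases):

    ```python
    # Linear mixed model: decision latency vs. display type, with a random
    # intercept per reader.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(5)
    readers = np.repeat(np.arange(12), 20)            # 12 readers x 20 cases
    display = np.tile(np.repeat(["table", "graph"], 10), 12)
    reader_effect = rng.normal(0, 1.5, 12)[readers]   # between-reader variation
    latency = (10 + (display == "table") * 2.0        # tables assumed slower here
               + reader_effect + rng.normal(0, 1, 240))

    df = pd.DataFrame({"reader": readers, "display": display, "latency": latency})
    model = smf.mixedlm("latency ~ display", df, groups=df["reader"])
    print(model.fit().summary())
    ```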

  15. Private multiple congenital anomaly syndromes may result from unbalanced subtle translocations: t(2q;4p) explains the Lambotte syndrome.

    PubMed

    Herens, C; Jamar, M; Alvarez-Gonzalez, M L; Lesenfants, S; Lombet, J; Bonnivert, J; Koulischer, L; Verloes, A

    1997-12-12

    In 1990, Lambotte syndrome was reported as an apparently autosomal recessive multiple congenital anomaly/mental retardation (MCA/MR) syndrome observed in 4 of 12 sibs from a probably consanguineous mating [Verloes et al., Am J Med Genet 1990; 37:119-123]. Major manifestations included intrauterine growth retardation (IUGR), microcephaly, large soft pinnae, hypertelorism, beaked nose, and extremely severe neurologic impairment, with holoprosencephaly in one instance. After the observation of a further affected child born to one unaffected sister, in situ hybridization analysis and chromosome painting techniques demonstrated a subtle t(2;4)(q37.1;p16.2) translocation in the mother, suggesting a combination of 2q/4p trisomy/monosomy in all of the affected children of the family. Many private sporadic or recurrent MCA/MR syndromes may be due to similar symmetric translocations, undetectable by conventional banding techniques.

  16. Conceptual Model Evaluation using Advanced Parameter Estimation Techniques with Heat as a Tracer

    NASA Astrophysics Data System (ADS)

    Naranjo, R. C.; Morway, E. D.; Healy, R. W.

    2016-12-01

    Temperature measurements made at multiple depths beneath the sediment-water interface have proven useful for estimating seepage rates from surface-water channels and the corresponding subsurface flow direction. Commonly, parsimonious zonal representations of the subsurface structure are defined a priori by interpretation of temperature envelopes, slug tests or analysis of soil cores. However, combining multiple observations into a single zone may limit the inverse model solution and does not take full advantage of the information content within the measured data. Further, simulating the correct thermal gradient, flow paths, and transient behavior of solutes may be biased by inadequacies in the spatial description of subsurface hydraulic properties. The use of pilot points in PEST offers a more sophisticated approach to estimating the structure of subsurface heterogeneity. This presentation evaluates seepage estimation in a cross-sectional model of a trapezoidal canal with intermittent flow representing four typical sedimentary environments. Recent improvements in heat-as-a-tracer measurement techniques (i.e., multi-depth temperature probes), along with the use of modern calibration techniques (i.e., pilot points), provide opportunities for improved calibration of flow models and, subsequently, improved model predictions.

  17. Fluorescence Fluctuation Approaches to the Study of Adhesion and Signaling

    PubMed Central

    Bachir, Alexia I.; Kubow, Kristopher E.; Horwitz, Alan R.

    2013-01-01

    Cell–matrix adhesions are large, multimolecular complexes through which cells sense and respond to their environment. They also mediate migration by serving as traction points and signaling centers and allow the cell to modify the surrounding tissue. Due to their fundamental role in cell behavior, adhesions are germane to nearly all major human health pathologies. However, adhesions are extremely complex and dynamic structures that include over 100 known interacting proteins and operate over multiple space (nm–µm) and time (ms–min) regimes. Fluorescence fluctuation techniques are well suited for studying adhesions. These methods are sensitive over a large spatiotemporal range and provide a wealth of information including molecular transport dynamics, interactions, and stoichiometry from a single time series. Earlier chapters in this volume have provided the theoretical background, instrumentation, and analysis algorithms for these techniques. In this chapter, we discuss their implementation in living cells to study adhesions in migrating cells. Although each technique and application has its own unique instrumentation and analysis requirements, we provide general guidelines for sample preparation, selection of imaging instrumentation, and optimization of data acquisition and analysis parameters. Finally, we review several recent studies that implement these techniques in the study of adhesions. PMID:23280111

  18. Do not blame the driver: a systems analysis of the causes of road freight crashes.

    PubMed

    Newnam, Sharon; Goode, Natassia

    2015-03-01

    Although many have advocated a systems approach in road transportation, this view has not meaningfully penetrated road safety research, practice or policy. In this study, a systems theory-based approach, Rasmussen's (1997) risk management framework and the associated Accimap technique, is applied to the analysis of road freight transportation crashes. Twenty-seven highway crash investigation reports were downloaded from the National Transport Safety Bureau website. Thematic analysis was used to identify the complex system of contributory factors, and relationships, identified within the reports. The Accimap technique was then used to represent the linkages and dependencies within and across system levels in the road freight transportation industry and to identify common factors and interactions across multiple crashes. The results demonstrate how a systems approach can increase knowledge in this safety-critical domain, while the findings can be used to guide prevention efforts and the development of system-based investigation processes for the heavy vehicle industry. A research agenda for developing an investigation technique to better support the application of the Accimap technique by practitioners in the road freight transportation industry is proposed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e., risk-benefit analysis), we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate (the ratio of thrombosis to bleeding). Results The analysis showed that, compared to placebo, ximelagatran was superior to the other options, but the final results were influenced by the type of surgery: ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques, we demonstrate a method that allows comparing multiple competing interventions in the absence of randomized trials with multiple arms by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
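
    The simulation logic can be sketched as follows; all event counts and thresholds are illustrative, not the pooled estimates from the paper's meta-analysis:

    ```python
    # Beta distributions from event proportions, Monte Carlo draws, and net
    # clinical benefit across acceptability thresholds.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 100_000

    # beta(events + 1, non-events + 1) for each arm (illustrative counts)
    bleed_new  = rng.beta(12 + 1, 988 + 1, n)   # major bleeding, new drug
    bleed_ref  = rng.beta(8 + 1, 992 + 1, n)    # major bleeding, comparator
    thromb_new = rng.beta(20 + 1, 980 + 1, n)   # thrombosis, new drug
    thromb_ref = rng.beta(45 + 1, 955 + 1, n)   # thrombosis, comparator

    inc_risk = bleed_new - bleed_ref            # extra bleeds per patient
    inc_benefit = thromb_ref - thromb_new       # thromboses averted per patient

    # net clinical benefit over a range of acceptability thresholds mu,
    # where mu is the number of extra bleeds one accepts per thrombosis averted
    for mu in (0.5, 1.0, 2.0, 4.0):
        ncb = inc_benefit - inc_risk / mu
        print(f"mu = {mu}: P(net benefit > 0) = {(ncb > 0).mean():.3f}")
    ```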

  20. Sparse Group Penalized Integrative Analysis of Multiple Cancer Prognosis Datasets

    PubMed Central

    Liu, Jin; Huang, Jian; Xie, Yang; Ma, Shuangge

    2014-01-01

    SUMMARY In cancer research, high-throughput profiling studies have been extensively conducted, searching for markers associated with prognosis. Because of the “large d, small n” characteristic, results generated from the analysis of a single dataset can be unsatisfactory. Recent studies have shown that integrative analysis, which simultaneously analyzes multiple datasets, can be more effective than single-dataset analysis and classic meta-analysis. Most existing integrative analyses assume the homogeneity model, which postulates that different datasets share the same set of markers, and several approaches have been designed to reinforce this assumption. In practice, different datasets may differ in terms of patient selection criteria, profiling techniques, and many other aspects. Such differences may make the homogeneity model too restrictive. In this study, we assume the heterogeneity model, under which different datasets are allowed to have different sets of markers. With multiple cancer prognosis datasets, we adopt the AFT (accelerated failure time) model to describe survival; this model may have the lowest computational cost among popular semiparametric survival models. For marker selection, we adopt a sparse group MCP (minimax concave penalty) approach. This approach has an intuitive formulation and can be computed using an effective group coordinate descent algorithm. A simulation study shows that it outperforms existing approaches under both the homogeneity and heterogeneity models. Data analysis further demonstrates the merit of the heterogeneity model and the proposed approach. PMID:23938111
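
    For reference, the univariate minimax concave penalty underlying the approach (standard form from the penalized-regression literature, not quoted from the paper) is:

    ```latex
    % MCP with regularization parameter \lambda and concavity parameter
    % \gamma > 1:
    \[
      \rho(t; \lambda, \gamma) =
      \begin{cases}
        \lambda |t| - \dfrac{t^{2}}{2\gamma}, & |t| \le \gamma \lambda, \\[6pt]
        \dfrac{\gamma \lambda^{2}}{2},        & |t| > \gamma \lambda,
      \end{cases}
    \]
    % applied in the sparse group setting to both groups and individual
    % coefficients within groups.
    ```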

  1. Multiple Query Evaluation Based on an Enhanced Genetic Algorithm.

    ERIC Educational Resources Information Center

    Tamine, Lynda; Chrisment, Claude; Boughanem, Mohand

    2003-01-01

    Explains the use of genetic algorithms to combine results from multiple query evaluations to improve relevance in information retrieval. Discusses niching techniques, relevance feedback techniques, and evolution heuristics, and compares retrieval results obtained by both genetic multiple query evaluation and classical single query evaluation…

  2. Pairwise Classifier Ensemble with Adaptive Sub-Classifiers for fMRI Pattern Analysis.

    PubMed

    Kim, Eunwoo; Park, HyunWook

    2017-02-01

    The multi-voxel pattern analysis technique is applied to fMRI data for classification of high-level brain functions using pattern information distributed over multiple voxels. In this paper, we propose a classifier ensemble for multiclass classification in fMRI analysis, exploiting the fact that specific neighboring voxels can contain spatial pattern information. The proposed method converts the multiclass classification to a pairwise classifier ensemble, and each pairwise classifier consists of multiple sub-classifiers using an adaptive feature set for each class-pair. Simulated and real fMRI data were used to verify the proposed method. Intra- and inter-subject analyses were performed to compare the proposed method with several well-known classifiers, including single and ensemble classifiers. The comparison results showed that the proposed method can be generally applied to multiclass classification in both simulations and real fMRI analyses.
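
    A structurally similar pairwise decomposition is available in scikit-learn's OneVsOneClassifier, sketched below on synthetic wide data; note that this stand-in lacks the paper's adaptive per-pair feature (voxel) selection:

    ```python
    # One classifier per class-pair on fMRI-like wide data (features >> samples).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.multiclass import OneVsOneClassifier
    from sklearn.svm import LinearSVC

    X, y = make_classification(n_samples=300, n_features=500, n_informative=25,
                               n_classes=4, random_state=0)
    ovo = OneVsOneClassifier(LinearSVC(dual=False, max_iter=5000))
    print("mean CV accuracy:", cross_val_score(ovo, X, y, cv=5).mean().round(3))
    ```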

  3. Master of Puppets: Cooperative Multitasking for In Situ Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-01-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.

  4. Henson v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-04-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. The design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.

  5. The novel application of Benford's second order analysis for monitoring radiation output in interventional radiology.

    PubMed

    Cournane, S; Sheehy, N; Cooke, J

    2014-06-01

    Benford's law is an empirical observation which predicts the expected frequency of digits in naturally occurring datasets spanning multiple orders of magnitude, and it has been most successfully applied as an audit tool in accountancy. This study investigated the sensitivity of the technique in identifying system output changes, using simulated changes in interventional radiology Dose-Area-Product (DAP) data, with any deviations from Benford's distribution identified using z-statistics. The radiation output of interventional radiology X-ray equipment is monitored annually during quality control testing; however, for a considerable portion of the year an increased output of the system, potentially caused by engineering adjustments or spontaneous system faults, may go unnoticed, leading to a potential increase in the radiation dose to patients. In normal operation, recorded examination radiation outputs vary over multiple orders of magnitude, rendering normal statistics ineffective for detecting systematic changes in the output. In this work, the annual DAP datasets complied with Benford's first-order law for the first digit, the second digit, and combinations of the first and second digits. Further, a continuous 'rolling' second-order technique was devised for trending simulated changes over shorter timescales. This distribution analysis, the first employment of the method for radiation output trending, detected significant changes simulated on the original data, proving the technique useful in this case. The potential is demonstrated for implementation of this novel analysis for monitoring and identifying change in suitable datasets for the purpose of system process control. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
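
    The first-digit check can be sketched as follows, using synthetic data spanning several decades; the per-digit z-statistic with continuity correction is the form commonly used in Benford auditing, assumed rather than quoted from the paper:

    ```python
    # Compare observed leading-digit frequencies against Benford's law.
    import numpy as np

    rng = np.random.default_rng(9)
    dap = 10 ** rng.uniform(0, 4, 5000)   # log-uniform data follow Benford's law

    digits = (dap // 10 ** np.floor(np.log10(dap))).astype(int)  # leading digits
    n = digits.size
    d = np.arange(1, 10)
    p_exp = np.log10(1 + 1 / d)           # Benford first-digit probabilities
    p_obs = np.array([(digits == k).mean() for k in d])

    # per-digit z-statistic with continuity correction
    z = (np.abs(p_obs - p_exp) - 1 / (2 * n)) / np.sqrt(p_exp * (1 - p_exp) / n)
    for k, zk in zip(d, z):
        flag = "  <-- deviates (z > 1.96)" if zk > 1.96 else ""
        print(f"digit {k}: obs {p_obs[k - 1]:.3f}, exp {p_exp[k - 1]:.3f}, "
              f"z = {zk:4.2f}{flag}")
    ```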

  6. Advanced Statistics for Exotic Animal Practitioners.

    PubMed

    Hodsoll, John; Hellier, Jennifer M; Ryan, Elizabeth G

    2017-09-01

    Correlation and regression assess the association between 2 or more variables. This article reviews the core knowledge needed to understand these analyses, moving from visual analysis in scatter plots through correlation, simple and multiple linear regression, and logistic regression. Correlation estimates the strength and direction of a relationship between 2 variables. Regression can be considered more general and quantifies the numerical relationships between an outcome and 1 or multiple variables in terms of a best-fit line, allowing predictions to be made. Each technique is discussed with examples and the statistical assumptions underlying their correct application. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Improving Predictions of Multiple Binary Models in ILP

    PubMed Central

    2014-01-01

    Despite the success of ILP systems in learning first-order rules from small numbers of examples and complexly structured data in various domains, they struggle to deal with multiclass problems. In most cases they boil a multiclass problem down into multiple black-box binary problems following the one-versus-one or one-versus-rest binarisation techniques and learn a theory for each one. When evaluating the learned theories of multiple-class problems, particularly in the one-versus-rest paradigm, there is a bias caused by the default rule toward the negative classes, leading to an unrealistically high performance besides the lack of prediction integrity between the theories. Here we discuss the problem of using the one-versus-rest binarisation technique when evaluating multiclass data and propose several methods to remedy this problem. We also illustrate the methods and highlight their link to binary trees and Formal Concept Analysis (FCA). Our methods allow learning of a simple, consistent, and reliable multiclass theory by combining the rules of the multiple one-versus-rest theories into one rule-list or rule-set theory. Empirical evaluation over a number of data sets shows that our proposed methods produce coherent and accurate rule models from the rules learned by the ILP system Aleph. PMID:24696657

  8. Direct Position Determination of Multiple Non-Circular Sources with a Moving Coprime Array.

    PubMed

    Zhang, Yankui; Ba, Bin; Wang, Daming; Geng, Wei; Xu, Haiyun

    2018-05-08

    Direct position determination (DPD) is currently a hot topic in wireless localization research, as it is more accurate than traditional two-step positioning. However, current DPD algorithms are all based on uniform arrays, which have insufficient degrees of freedom and limited estimation accuracy. To improve DPD accuracy, this paper introduces a coprime array into the position model of multiple non-circular sources with a moving array. To maximize the advantages of the coprime array, we reconstruct the covariance matrix by vectorization, apply a spatial smoothing technique, and combine the subspace data from each measuring position to establish the cost function. Finally, we obtain the position coordinates of the multiple non-circular sources. The complexity of the proposed method is computed and compared with that of other methods, and the Cramér–Rao lower bound of DPD for multiple sources with a moving coprime array is derived. Theoretical analysis and simulation results show that the proposed algorithm is not only applicable to circular sources, but also improves the positioning accuracy of non-circular sources. Compared with existing two-step positioning algorithms and DPD algorithms based on uniform linear arrays, the proposed technique offers a significant improvement in positioning accuracy with a slight increase in complexity.
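
    The geometry that supplies the extra degrees of freedom can be sketched directly (a common prototype coprime configuration, assumed for illustration; the paper's exact array may differ):

    ```python
    # A coprime pair (M, N) gives two uniform subarrays whose union has only
    # M + N - 1 sensors yet yields many more distinct lags in its difference
    # coarray than a uniform array of the same size.
    import numpy as np

    M, N = 4, 5                             # coprime integers
    sub1 = N * np.arange(M)                 # M sensors with spacing N (units of d)
    sub2 = M * np.arange(N)                 # N sensors with spacing M (units of d)
    positions = np.union1d(sub1, sub2)      # subarrays share the sensor at 0

    diffs = np.unique((positions[:, None] - positions[None, :]).ravel())
    print("sensor positions   :", positions, f"({positions.size} elements)")
    print("unique coarray lags:", diffs.size)
    ```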

  9. Multiple excitation nano-spot generation and confocal detection for far-field microscopy.

    PubMed

    Mondal, Partha Pratim

    2010-03-01

    An imaging technique is developed for the controlled generation of multiple excitation nano-spots for far-field microscopy. The system point spread function (PSF) is obtained by interfering two counter-propagating extended depth-of-focus PSF (DoF-PSF), resulting in highly localized multiple excitation spots along the optical axis. The technique permits (1) simultaneous excitation of multiple planes in the specimen; (2) control of the number of spots by confocal detection; and (3) overcoming the point-by-point based excitation. Fluorescence detection from the excitation spots can be efficiently achieved by Z-scanning the detector/pinhole assembly. The technique complements most of the bioimaging techniques and may find potential application in high resolution fluorescence microscopy and nanoscale imaging.

  10. Multiple excitation nano-spot generation and confocal detection for far-field microscopy

    NASA Astrophysics Data System (ADS)

    Mondal, Partha Pratim

    2010-03-01

    An imaging technique is developed for the controlled generation of multiple excitation nano-spots for far-field microscopy. The system point spread function (PSF) is obtained by interfering two counter-propagating extended depth-of-focus PSF (DoF-PSF), resulting in highly localized multiple excitation spots along the optical axis. The technique permits (1) simultaneous excitation of multiple planes in the specimen; (2) control of the number of spots by confocal detection; and (3) overcoming the point-by-point based excitation. Fluorescence detection from the excitation spots can be efficiently achieved by Z-scanning the detector/pinhole assembly. The technique complements most of the bioimaging techniques and may find potential application in high resolution fluorescence microscopy and nanoscale imaging.

  11. Femoral head necrosis: A finite element analysis of common and novel surgical techniques.

    PubMed

    Cilla, Myriam; Checa, Sara; Preininger, Bernd; Winkler, Tobias; Perka, Carsten; Duda, Georg N; Pumberger, Matthias

    2017-10-01

    Femoral head necrosis is a common cause of secondary osteoarthritis. At the early stages, treatment strategies are normally based on core decompression techniques, where the number, location and diameter of the drilling holes vary depending on the selected approach. The purpose of this study was to investigate the risk of femoral head, neck and subtrochanteric fracture following six different core decompression techniques. Five common techniques and a newly proposed one were analyzed with respect to their biomechanical consequences using finite element analysis. The geometry of a femur was reconstructed from computed-tomography images, and the drilling configurations were then simulated. The strains in the intact and drilled femurs were determined under physiological, patient-specific muscle and joint contact forces. The following results were observed: (i) the risk of femoral head collapse and fracture increases with disease progression; (ii) for a single-hole approach at the subtrochanteric region, the fracture risk increases with hole diameter; (iii) the highest fracture risks occur for an 8 mm single-hole drilling at the subtrochanteric region and for approaches with multiple drillings at various entry points; and (iv) the proposed novel approach resulted in the most physiological strains (closest to those experienced by healthy bone). Our results suggest that all common core decompression methods have a significant impact on the biomechanical competence of the proximal femur and its mechanical potential. Fracture risk increases with drilling diameter and with multiple drillings of smaller diameter. We recommend the anterior approach due to its reduced soft-tissue trauma and its biomechanical performance. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Optical spectroscopy of nanoscale and heterostructured oxides

    NASA Astrophysics Data System (ADS)

    Senty, Tess R.

    Through careful analysis of a material's properties, devices are continually becoming smaller, faster and more efficient. Without a complete scientific understanding of material properties, devices cannot continue to improve. This dissertation uses optical spectroscopy techniques to understand light-matter interactions in several oxide materials with promising uses, mainly in light-harvesting applications. Linear absorption, photoluminescence and transient absorption spectroscopy are primarily used on europium-doped yttrium vanadate nanoparticles, copper gallium oxide delafossites doped with iron, and cadmium selenide quantum dots attached to titanium dioxide nanoparticles. Europium-doped yttrium vanadate nanoparticles have promising applications for linking to biomolecules. Using Fourier-transform infrared spectroscopy, it was shown that organic ligands (benzoic acid, 3-nitro-4-chloro-benzoic acid and 3,4-dihydroxybenzoic acid) can be attached to the surface of these nanoparticles using metal-carboxylate coordination. Photoluminescence spectroscopy shows little difference in the position of the dominant photoluminescence peaks between samples with different organic ligands, although there is a strong decrease in their intensity when 3,4-dihydroxybenzoic acid is attached. It is shown that this strong quenching is due to the presence of high-frequency hydroxide vibrational modes within the organic linker. Ultraviolet/visible linear absorption measurements show that doping copper gallium oxide with iron allows the previously forbidden fundamental gap transition to be accessed. Using Tauc plots, it is shown that doping with iron lowers the bandgap from 2.8 eV for pure copper gallium oxide to 1.7 eV for samples with 1-5% iron doping. Using terahertz transient absorption spectroscopy measurements, it was also determined that doping with iron reduces the charge mobility of the pure delafossite samples. A comparison of cadmium selenide quantum dots, both with and without capping ligands, attached to titanium dioxide nanoparticles is performed using a new transient absorption analysis technique. Multiple-exponential fit models were applied to the system and compared with the new inversion analysis technique. It is shown how the new inversion analysis can map out the charge carrier dynamics, providing carrier recombination rates and lifetimes as a function of carrier concentration, whereas the multiple-exponential fit technique is not dependent on the carrier concentration. With the inversion analysis technique it is shown that capping ligands allow for increased charge transfer due to traps being passivated on the quantum dot surface.
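
    The Tauc-plot extraction mentioned above can be sketched on synthetic data; the direct-allowed-transition exponent and absorption model are assumed for illustration, with the 1.7 eV value echoing the abstract:

    ```python
    # Tauc plot for a direct allowed transition: plot (alpha * h * nu)^2
    # against photon energy and extrapolate the linear edge to zero.
    import numpy as np

    E = np.linspace(1.2, 3.0, 200)          # photon energy, eV
    Eg_true = 1.7
    alpha = np.where(E > Eg_true, np.sqrt(E - Eg_true) / E, 0.0)  # toy model
    tauc = (alpha * E) ** 2                 # (alpha h nu)^2 ~ E - Eg above gap

    # fit the steep linear region just above the edge and extrapolate to zero
    edge = (E > Eg_true + 0.05) & (E < Eg_true + 0.4)
    m, b = np.polyfit(E[edge], tauc[edge], 1)
    print(f"estimated bandgap: {-b / m:.2f} eV (true {Eg_true} eV)")
    ```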

  13. Rapid and highly reproducible analysis of rare earth elements by multiple collector inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Baker, Joel; Waight, Tod; Ulfbeck, David

    2002-10-01

    A method has been developed for the rapid chemical separation and highly reproducible analysis of the rare earth elements (REE) by isotope dilution analysis by means of a multiple collector inductively coupled plasma mass spectrometer (MC-ICP-MS). This technique is superior in terms of the analytical reproducibility or rapidity of analysis compared with quadrupole ICP-MS or with thermal ionization mass spectrometric isotope dilution techniques. Samples are digested by standard hydrofluoric-nitric acid-based techniques and spiked with two mixed spikes. The bulk REE are separated from the sample on a cation exchange column, collecting the middle-heavy and light REE as two groups, which provides a middle-heavy REE cut with sufficient separation of the light from the heavier REE to render oxide interferences trivial, and a Ba-free light REE cut. The heavy (Er-Lu), middle (Eu-Gd), and light REE (La-Eu) concentrations are determined by three short (1 to 2 min) analyses with a CETAC Aridus desolvating nebulizer introduction system. Replicate digestions of international rock standards demonstrate that concentrations can be reproduced to <1%, which reflects weighing errors during digestion and aliquotting as inter-REE ratios reproduce to ≤0.2% (2 SD). Eu and Ce anomalies reproduce to <0.15%. In addition to determining the concentrations of polyisotopic REE by isotope dilution analysis, the concentration of monoisotopic Pr can be measured during the light REE isotope dilution run, by reference to Pr/Ce and Pr/Nd ratios measured in a REE standard solution. Pr concentrations determined in this way reproduce to <1%, and Pr/REE ratios reproduce to <0.4%. Ce anomalies calculated with La and Pr also reproduce to <0.15% (2 SD). The precise Ce (and Eu) anomaly measurements should allow greater use of these features in studying the recycling of materials with these anomalies into the mantle, or redox-induced effects on the REE during recycling and dehydration of oceanic lithosphere, partial melting, metamorphism, alteration, or sedimentation processes. Moreover, this technique consumes very small amounts (subnanograms) of the REE and will allow precise REE determinations to be made on much smaller samples than hitherto possible.

  14. Single-Molecule Studies of Actin Assembly and Disassembly Factors

    PubMed Central

    Smith, Benjamin A.; Gelles, Jeff; Goode, Bruce L.

    2014-01-01

    The actin cytoskeleton is very dynamic and highly regulated by multiple associated proteins in vivo. Understanding how this system of proteins functions in the processes of actin network assembly and disassembly requires methods to dissect the mechanisms of activity of individual factors and of multiple factors acting in concert. The advent of single-filament and single-molecule fluorescence imaging methods has provided a powerful new approach to discovering actin-regulatory activities and obtaining direct, quantitative insights into the pathways of molecular interactions that regulate actin network architecture and dynamics. Here we describe techniques for acquisition and analysis of single-molecule data, applied to the novel challenges of studying the filament assembly and disassembly activities of actin-associated proteins in vitro. We discuss the advantages of single-molecule analysis in directly visualizing the order of molecular events, measuring the kinetic rates of filament binding and dissociation, and studying the coordination among multiple factors. The methods described here complement traditional biochemical approaches in elucidating actin-regulatory mechanisms in reconstituted filamentous networks. PMID:24630103

  15. Use of principal-component, correlation, and stepwise multiple-regression analyses to investigate selected physical and hydraulic properties of carbonate-rock aquifers

    USGS Publications Warehouse

    Brown, C. Erwin

    1993-01-01

    Correlation analysis, in conjunction with principal-component and multiple-regression analyses, was applied to laboratory chemical and petrographic data to assess the usefulness of these techniques in evaluating selected physical and hydraulic properties of carbonate-rock aquifers in central Pennsylvania. Correlation and principal-component analyses were used to establish relations and associations among variables, to determine the dimensions of property variation of samples, and to filter out variables containing similar information. Principal-component and correlation analyses showed that porosity is related to the other measured variables and that permeability is most related to porosity and grain size. Four principal components are found to be significant in explaining the variance of the data. Stepwise multiple-regression analysis was used to see how well the measured variables could predict porosity and (or) permeability for this suite of rocks. The variation in permeability and porosity is not totally predicted by the other variables, but the regression is significant at the 5% significance level. © 1993.

  16. Serum-free light-chain analysis in diagnosis and management of multiple myeloma and related conditions.

    PubMed

    Milani, Paolo; Palladini, Giovanni; Merlini, Giampaolo

    2016-01-01

    The introduction of the serum-free light-chain (S-FLC) assay has been a breakthrough in the diagnosis and management of plasma cell dyscrasias, particularly monoclonal light-chain diseases. The first method, proposed in 2001, quantifies serum-free light-chains using polyclonal antibodies. More recently, assays based on monoclonal antibodies have entered into clinical practice. S-FLC measurement plays a central role in the screening for multiple myeloma and related conditions, in association with electrophoretic techniques. Analysis of S-FLC is essential in assessing the risk of progression of precursor diseases to overt plasma cell dyscrasias. It is also useful for risk stratification in solitary plasmacytoma and AL amyloidosis. The S-FLC measurement is part of the new diagnostic criteria for multiple myeloma, and provides a marker to follow changes in clonal substructure over time. Finally, the evaluation of S-FLC is fundamental for assessing the response to treatment in monoclonal light chain diseases.

  17. Global Single and Multiple Cloud Classification with a Fuzzy Logic Expert System

    NASA Technical Reports Server (NTRS)

    Welch, Ronald M.; Tovinkere, Vasanth; Titlow, James; Baum, Bryan A.

    1996-01-01

    An unresolved problem in remote sensing concerns the analysis of satellite imagery containing both single and multiple cloud layers. While cloud parameterizations are very important both in global climate models and in studies of the Earth's radiation budget, most cloud retrieval schemes, such as the bispectral method used by the International Satellite Cloud Climatology Project (ISCCP), have no way of determining whether overlapping cloud layers exist in any group of satellite pixels. Coakley (1983) used a spatial coherence method to determine whether a region contained more than one cloud layer. Baum et al. (1995) developed a scheme for detection and analysis of daytime multiple cloud layers using merged AVHRR (Advanced Very High Resolution Radiometer) and HIRS (High-resolution Infrared Radiometer Sounder) data collected during the First ISCCP Regional Experiment (FIRE) Cirrus 2 field campaign. Baum et al. (1995) explored the use of a cloud classification technique based on AVHRR data. This study examines the feasibility of applying the cloud classifier to global satellite imagery.

  18. Treatments of Missing Values in Large National Data Affect Conclusions: The Impact of Multiple Imputation on Arthroplasty Research.

    PubMed

    Ondeck, Nathaniel T; Fu, Michael C; Skrip, Laura A; McLynn, Ryan P; Su, Edwin P; Grauer, Jonathan N

    2018-03-01

    Despite the advantages of large, national datasets, one continuing concern is missing data values. Complete case analysis, where only cases with complete data are analyzed, is commonly used rather than more statistically rigorous approaches such as multiple imputation. This study characterizes the potential selection bias introduced by complete case analysis and compares the results of common regressions using both techniques following unicompartmental knee arthroplasty. Patients undergoing unicompartmental knee arthroplasty were extracted from the 2005 to 2015 National Surgical Quality Improvement Program. As examples, the demographics of patients with and without missing preoperative albumin and hematocrit values were compared. Missing data were then treated with both complete case analysis and multiple imputation (an approach that reproduces the variation and associations that would have been present in a full dataset) and the conclusions of common regressions for adverse outcomes were compared. A total of 6117 patients were included, of which 56.7% were missing at least one value. Younger, female, and healthier patients were more likely to have missing preoperative albumin and hematocrit values. The use of complete case analysis removed 3467 patients from the study, in comparison with multiple imputation, which included all 6117 patients. The 2 methods of handling missing values led to differing associations of low preoperative laboratory values with commonly studied adverse outcomes. The use of complete case analysis can introduce selection bias and may lead to different conclusions in comparison with the statistically rigorous multiple imputation approach. Joint surgeons should consider the methods of handling missing values when interpreting arthroplasty research. Copyright © 2017 Elsevier Inc. All rights reserved.
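
    As a rough illustration of the contrast this record draws, the sketch below fits the same regression after (a) dropping incomplete rows and (b) multiply imputing them. It assumes synthetic data and uses scikit-learn's IterativeImputer with sample_posterior=True as a simple stand-in for a full multiple-imputation procedure with Rubin's pooling rules; the study itself used different data, models, and software.

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 500
      X = rng.normal(size=(n, 3))              # e.g. albumin, hematocrit, age
      y = (X @ [1.0, -0.5, 0.2] + rng.normal(size=n) > 0).astype(int)
      X_miss = X.copy()
      X_miss[rng.random(n) < 0.4, 0] = np.nan  # 40% of "albumin" missing

      # Complete case analysis: rows with any missing value are dropped.
      cc = ~np.isnan(X_miss).any(axis=1)
      coef_cc = LogisticRegression().fit(X_miss[cc], y[cc]).coef_[0]

      # Multiple imputation: several stochastic imputations, estimates pooled.
      coefs = []
      for seed in range(5):
          imp = IterativeImputer(sample_posterior=True, random_state=seed)
          coefs.append(LogisticRegression().fit(imp.fit_transform(X_miss), y).coef_[0])
      print("complete case:", coef_cc.round(2))
      print("pooled MI    :", np.mean(coefs, axis=0).round(2))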

  19. Comparison of Nonlinear Random Response Using Equivalent Linearization and Numerical Simulation

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Muravyov, Alexander A.

    2000-01-01

    A recently developed finite-element-based equivalent linearization approach for the analysis of random vibrations of geometrically nonlinear multiple degree-of-freedom structures is validated. The validation is based on comparisons with results from a finite element based numerical simulation analysis using a numerical integration technique in physical coordinates. In particular, results for the case of a clamped-clamped beam are considered for an extensive load range to establish the limits of validity of the equivalent linearization approach.
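
    Equivalent linearization is easiest to see on a single-degree-of-freedom toy problem rather than the paper's finite-element beam model. The sketch below assumes a Duffing oscillator driven by white noise with two-sided spectral density S0 and iterates the textbook fixed point between the equivalent stiffness and the linear response variance; all parameter values are illustrative.

      import math

      # SDOF Duffing oscillator: m x'' + c x' + k x + eps x^3 = w(t),
      # with w(t) white noise of two-sided PSD S0 (all values made up).
      m, c, k, eps, S0 = 1.0, 0.05, 1.0, 0.5, 0.01

      # Gaussian closure gives k_eq = k + 3*eps*var_x, while the linear
      # system's stationary response variance is var_x = pi*S0 / (c*k_eq).
      k_eq = k
      for _ in range(100):
          var_x = math.pi * S0 / (c * k_eq)
          k_eq_next = k + 3.0 * eps * var_x
          if abs(k_eq_next - k_eq) < 1e-12:
              break
          k_eq = k_eq_next
      print(f"equivalent stiffness: {k_eq:.4f}, RMS response: {var_x ** 0.5:.4f}")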

  20. Use of multiple cluster analysis methods to explore the validity of a community outcomes concept map.

    PubMed

    Orsi, Rebecca

    2017-02-01

    Concept mapping is now a commonly-used technique for articulating and evaluating programmatic outcomes. However, research regarding validity of knowledge and outcomes produced with concept mapping is sparse. The current study describes quantitative validity analyses using a concept mapping dataset. We sought to increase the validity of concept mapping evaluation results by running multiple cluster analysis methods and then using several metrics to choose from among solutions. We present four different clustering methods based on analyses using the R statistical software package: partitioning around medoids (PAM), fuzzy analysis (FANNY), agglomerative nesting (AGNES) and divisive analysis (DIANA). We then used the Dunn and Davies-Bouldin indices to assist in choosing a valid cluster solution for a concept mapping outcomes evaluation. We conclude that the validity of the outcomes map is high, based on the analyses described. Finally, we discuss areas for further concept mapping methods research. Copyright © 2016 Elsevier Ltd. All rights reserved.
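
    The four R routines named above (pam, fanny, agnes, and diana from the cluster package) have no single Python equivalent, but the pattern of scoring competing cluster solutions with validity indices is easy to reproduce. The sketch below compares two clustering methods on synthetic data using scikit-learn's Davies-Bouldin score and a hand-rolled Dunn index; it illustrates the selection logic only, not the study's analysis.

      import numpy as np
      from scipy.spatial.distance import cdist, pdist
      from sklearn.cluster import AgglomerativeClustering, KMeans
      from sklearn.metrics import davies_bouldin_score

      def dunn_index(X, labels):
          # Minimum between-cluster distance / maximum within-cluster diameter.
          groups = [X[labels == c] for c in np.unique(labels)]
          sep = min(cdist(a, b).min() for i, a in enumerate(groups)
                    for b in groups[i + 1:])
          diam = max(pdist(g).max() for g in groups if len(g) > 1)
          return sep / diam

      rng = np.random.default_rng(2)
      X = np.vstack([rng.normal(loc, 0.5, size=(30, 2)) for loc in (0, 3, 6)])

      for name, model in [("agglomerative", AgglomerativeClustering(n_clusters=3)),
                          ("k-means", KMeans(n_clusters=3, n_init=10))]:
          labels = model.fit_predict(X)
          print(name, "Dunn:", round(dunn_index(X, labels), 2),
                "Davies-Bouldin:", round(davies_bouldin_score(X, labels), 2))

    Higher Dunn and lower Davies-Bouldin values indicate a more compact, better-separated solution, which is the basis on which a cluster solution would be chosen.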

  1. A prospective randomized controlled trial of the two-window technique without membrane versus the solo-window technique with membrane over the osteotomy window for maxillary sinus augmentation.

    PubMed

    Yu, Huajie; He, Danqing; Qiu, Lixin

    2017-12-01

    Maturation of the grafted volume after lateral sinus elevation is crucial for the long-term survival of dental implants. The aim was to compare endo-sinus histomorphometric bone formation between the solo- and two-window maxillary sinus augmentation techniques, with or without membrane coverage, for the rehabilitation of multiple missing posterior teeth. Patients with severely atrophic posterior maxillae were randomized to receive lateral sinus floor elevation via the solo-window technique with membrane coverage (Control Group) or the two-window technique without coverage (Test Group). Six months after surgery, bone core specimens harvested from the lateral aspect were histomorphometrically analyzed. Ten patients in each group underwent a total of 21 maxillary sinus augmentations. Histomorphometric analysis revealed mean newly formed bone values of 26.08 ± 16.23% and 27.14 ± 18.11%, mean connective tissue values of 59.34 ± 12.42% and 50.03 ± 17.13%, and mean residual graft material values of 14.6 ± 14.56% and 22.78 ± 10.83% in the Test and Control Groups, respectively, with no significant differences. The two-window technique achieved comparable maturation of the grafted volume even without membrane coverage, and is a viable alternative for the rehabilitation of severely atrophic posterior maxillae with multiple missing posterior teeth. © 2017 Wiley Periodicals, Inc.

  2. Time-Distance Analysis of Deep Solar Convection

    NASA Technical Reports Server (NTRS)

    Duvall, T. L., Jr.; Hanasoge, S. M.

    2011-01-01

    Recently it was shown by Hanasoge, Duvall, and DeRosa (2010) that the upper limit to convective flows for spherical harmonic degrees l

  3. Improving Systematic Constraint-driven Analysis Using Incremental and Parallel Techniques

    DTIC Science & Technology

    2012-05-01

  4. The Well-Being of Children Born to Teen Mothers: Multiple Approaches to Assessing the Causal Links. JCPR Working Paper.

    ERIC Educational Resources Information Center

    Levine, Judith A.; Pollack, Harold

    This study used linked maternal-child data from the 1997-1998 National Longitudinal Survey of Youth to explore the wellbeing of children born to teenage mothers. Two econometric techniques explored the causal impact of early childbearing on subsequent child and adolescent outcomes. First, a fixed-effect, cousin-comparison analysis controlled for…

  5. Browns Ferry Unit-3 cavity neutron spectral analysis. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, G.C.

    1981-08-01

    This report describes neutron dosimetry measurements performed in the Browns Ferry Unit-3 reactor cavity using multiple-dosimeter and spectrum-unfolding techniques to assess radiation-induced degradation of nuclear plant pressure vessels. Test results and conclusions are presented, indicating the feasibility of determining neutron flux spectra and flux densities in the pressure-vessel cavity region via dosimetric measurements.

  6. Principal component analysis of the cytokine and chemokine response to human traumatic brain injury.

    PubMed

    Helmy, Adel; Antoniades, Chrystalina A; Guilfoyle, Mathew R; Carpenter, Keri L H; Hutchinson, Peter J

    2012-01-01

    There is a growing realisation that neuro-inflammation plays a fundamental role in the pathology of Traumatic Brain Injury (TBI). This has led to the search for biomarkers that reflect these underlying inflammatory processes, using techniques such as cerebral microdialysis. The interpretation of such biomarker data has been limited by the statistical methods used. When analysing data of this sort, the multiple putative interactions between mediators need to be considered, as well as the timing of production and the high degree of statistical covariance in the levels of these mediators. Here we present a cytokine and chemokine dataset from human brain following traumatic brain injury, and use principal component analysis and partial least squares discriminant analysis to demonstrate the pattern of production following TBI, the distinct phases of the humoral inflammatory response, and the differing patterns of response in brain and in peripheral blood. This technique has the added advantage of making no assumptions about the Relative Recovery (RR) of microdialysis-derived parameters. Taken together, these techniques can be used on complex microdialysis datasets to summarise the data succinctly and generate hypotheses for future study.
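
    A minimal sketch of the two techniques named in this record, applied to a hypothetical mediator panel: PCA summarises the covarying cytokine levels, and PLS-DA (implemented here, as is common, by thresholding a PLS regression against class labels) separates the two compartments. The data, dimensions, and class structure are invented for illustration.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(3)
      # 40 samples x 12 mediators; 20 brain microdialysates (0), 20 plasma (1).
      X = rng.normal(size=(40, 12))
      y = np.repeat([0, 1], 20)
      X[y == 1, :4] += 1.5          # pretend four mediators differ by compartment

      scores = PCA(n_components=2).fit_transform(X)   # low-dimensional summary

      pls = PLSRegression(n_components=2).fit(X, y)
      pred = (pls.predict(X).ravel() > 0.5).astype(int)
      print("training accuracy of PLS-DA:", (pred == y).mean())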

  7. Characterizing plant cell wall derived oligosaccharides using hydrophilic interaction chromatography with mass spectrometry detection.

    PubMed

    Leijdekkers, A G M; Sanders, M G; Schols, H A; Gruppen, H

    2011-12-23

    Analysis of complex mixtures of plant cell wall-derived oligosaccharides is still challenging, and multiple analytical techniques are often required for separation and characterization of these mixtures. In this work it is demonstrated that hydrophilic interaction chromatography coupled with evaporative light scattering and mass spectrometry detection (HILIC-ELSD-MS(n)) is a valuable tool for identification of a wide range of neutral and acidic cell wall-derived oligosaccharides. The separation potential for acidic oligosaccharides observed with HILIC is much better than that of other existing techniques, such as capillary electrophoresis, reversed-phase chromatography, and porous-graphitized carbon chromatography. Important structural information, such as the presence of methyl esters and acetyl groups, is retained during analysis. Separation of acidic oligosaccharides with equal charge yet different degrees of polymerization can be obtained. The efficient coupling of HILIC with ELSD and MS(n) detection enables characterization and quantification of many different oligosaccharide structures present in complex mixtures. This makes HILIC-ELSD-MS(n) a versatile and powerful additional technique in plant cell wall analysis. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Multiple Beam Interferometry in Elementary Teaching

    ERIC Educational Resources Information Center

    Tolansky, S.

    1970-01-01

    Discusses a relatively simple technique for demonstrating multiple beam interferometry. The technique can be applied to measuring (1) radii of curvature of lenses, (2) surface finish of glass, and (3) differential phase change on reflection. Microtopographies, modulated fringe systems and opaque objects may also be observed by this technique.…

  9. Robust decentralized controller for minimizing coupling effect in single inductor multiple output DC-DC converter operating in continuous conduction mode.

    PubMed

    Medeiros, Renan Landau Paiva de; Barra, Walter; Bessa, Iury Valente de; Chaves Filho, João Edgar; Ayres, Florindo Antonio de Cavalho; Neves, Cleonor Crescêncio das

    2018-02-01

    This paper describes a novel robust decentralized control design methodology for a single inductor multiple output (SIMO) DC-DC converter. Based on a nominal multiple input multiple output (MIMO) plant model and performance requirements, a pairing input-output analysis is performed to select the suitable input to control each output aiming to attenuate the loop coupling. Thus, the plant uncertainty limits are selected and expressed in interval form with parameter values of the plant model. A single inductor dual output (SIDO) DC-DC buck converter board is developed for experimental tests. The experimental results show that the proposed methodology can maintain a desirable performance even in the presence of parametric uncertainties. Furthermore, the performance indexes calculated from experimental data show that the proposed methodology outperforms classical MIMO control techniques. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  10. Performance Analysis of Direct-Sequence Code-Division Multiple-Access Communications with Asymmetric Quadrature Phase-Shift-Keying Modulation

    NASA Technical Reports Server (NTRS)

    Wang, C.-W.; Stark, W.

    2005-01-01

    This article considers a quaternary direct-sequence code-division multiple-access (DS-CDMA) communication system with asymmetric quadrature phase-shift-keying (AQPSK) modulation for unequal error protection (UEP) capability. Both time synchronous and asynchronous cases are investigated. An expression for the probability distribution of the multiple-access interference is derived. The exact bit-error performance and the approximate performance using a Gaussian approximation and random signature sequences are evaluated by extending the techniques used for uniform quadrature phase-shift-keying (QPSK) and binary phase-shift-keying (BPSK) DS-CDMA systems. Finally, a general system model with unequal user power and the near-far problem is considered and analyzed. The results show that, for a system with UEP capability, the less protected data bits are more sensitive to the near-far effect that occurs in a multiple-access environment than are the more protected bits.
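
    For orientation, the classical standard Gaussian approximation for asynchronous BPSK DS-CDMA with random signatures, which this article generalizes to AQPSK and unequal powers, can be evaluated directly. The sketch below encodes that textbook formula only; it is not the article's exact AQPSK analysis.

      import math

      def q_func(x):
          # Gaussian tail probability Q(x).
          return 0.5 * math.erfc(x / math.sqrt(2))

      def ber_sga(K, N, eb_n0_db):
          # Standard Gaussian approximation: K users, processing gain N.
          eb_n0 = 10 ** (eb_n0_db / 10)
          sinr = 1.0 / ((K - 1) / (3 * N) + 1 / (2 * eb_n0))
          return q_func(math.sqrt(sinr))

      for K in (1, 10, 30):
          print(f"K={K:2d}: BER ~ {ber_sga(K, N=64, eb_n0_db=10):.2e}")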

  11. [Bus drivers' biomechanical risk assessment in two different contexts].

    PubMed

    Baracco, A; Coggiola, M; Perrelli, F; Banchio, M; Martignone, S; Gullino, A; Romano, C

    2012-01-01

    The application of standardized methods for biomechanical risk assessment in non-industrial cyclic activities is not always possible. A typical case is the public transport sector, where workers report shoulder pain more often than elbow and wrist pain. The authors present the results of two studies, involving two public transport companies, on the risk of biomechanical overload of the upper limbs for bus and tram drivers. The analysis was made using three different approaches: focus groups; static analysis using anthropometric manikins; and a work sampling technique, monitoring each worker's activity and posture every minute, for two hours and for each vehicle-route pairing, considering P5F and P95M drivers and assessing perceived effort through Borg's CR10 scale. The results show that an ergonomic analysis conducted with multiple non-standardized techniques can reach consistent and repeatable results in line with the epidemiological evidence.

  12. Environmental applications for the analysis of chlorinated dibenzo-p-dioxins and dibenzofurans using mass spectrometry/mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reiner, E.J.; Schellenberg, D.H.; Taguchi, V.Y.

    1991-01-01

    A mass spectrometry/mass spectrometry-multiple reaction monitoring (MS/MS-MRM) technique for the analysis of all tetra- through octachlorinated dibenzo-p-dioxins (ClxDD, x = 4-8) and dibenzofurans (ClxDF, x = 4-8) has been developed at the Ministry of the Environment (MOE) utilizing a triple quadrupole mass spectrometer. Optimization of instrumental parameters using the analyte of interest in a direct insertion probe (DIP) resulted in sensitivities approaching those obtainable by high-resolution mass spectrometric (HRMS) methods. All congeners of dioxins and furans were detected in the femtogram range. Results on selected samples indicated that for some matrices, fewer chemical interferences were observed by MS/MS than by HRMS. The technique used to optimize the instrument for chlorinated dibenzo-p-dioxins (CDDs) and chlorinated dibenzofurans (CDFs) analysis is adaptable to other analytes.

  13. Evaluation and comparison of ERTS measurements of major crops and soil associations for selected test sites in the central United States. [Texas, Indiana, Kansas, Iowa, Nebraska, and North Dakota

    NASA Technical Reports Server (NTRS)

    Baumgardner, M. F. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. Multispectral scanner data obtained by ERTS-1 over six test sites in the Central United States were analyzed and interpreted. ERTS-1 data for some of the test sites were geometrically corrected and temporally overlaid. Computer-implemented pattern recognition techniques were used in the analysis of all multispectral data. These techniques were used to evaluate ERTS-1 data as a tool for soil survey. Geology maps and land use inventories were prepared by digital analysis of multispectral data. Identification and mapping of crop species and rangelands were achieved through the analysis of 1972 and 1973 ERTS-1 data. Multiple dates of ERTS-1 data were examined to determine the variation with time of the areal extent of surface water resources on the Southern Great Plains.

  14. Neural data science: accelerating the experiment-analysis-theory cycle in large-scale neuroscience.

    PubMed

    Paninski, L; Cunningham, J P

    2018-06-01

    Modern large-scale multineuronal recording methodologies, including multielectrode arrays, calcium imaging, and optogenetic techniques, produce single-neuron resolution data of a magnitude and precision that were the realm of science fiction twenty years ago. The major bottlenecks in systems and circuit neuroscience no longer lie in simply collecting data from large neural populations, but rather in understanding these data: developing novel scientific questions, with corresponding analysis techniques and experimental designs to fully harness these new capabilities and meaningfully interrogate these questions. Advances in methods for signal processing, network analysis, dimensionality reduction, and optimal control, developed in lockstep with advances in experimental neurotechnology, promise major breakthroughs in multiple fundamental neuroscience problems. These trends are clear in a broad array of subfields of modern neuroscience; this review focuses on recent advances in methods for analyzing neural time-series data with single-neuronal precision. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. Validated Metrics of Quick Flow Improve Assessments of Streamflow Generation Processes at the Long-Term Sleepers River Research Site

    NASA Astrophysics Data System (ADS)

    Sebestyen, S. D.; Shanley, J. B.

    2015-12-01

    There are multiple approaches to quantifying the quick flow components of streamflow. Physical hydrograph separations of quick flow using recession analysis (RA) are objective, reproducible, and easily calculated for long-duration streamflow records (years to decades). However, this approach has rarely been validated to have a physical basis for interpretation. In contrast, isotopic hydrograph separation (IHS) and end member mixing analysis using multiple solutes (EMMA) have been used to identify flow components and flowpath routing through catchment soils. Nonetheless, these two approaches are limited to the brief, isolated periods (hours to weeks) during which more-intensive grab samples were analyzed. These limitations oftentimes make IHS and EMMA difficult to generalize beyond brief windows of time. At the Sleepers River Research Watershed (SRRW) in northern Vermont, USA, we have data from multiple snowmelt events over a two-decade period and from multiple nested catchments to assess relationships among RA, IHS, and EMMA. Quick flow separations were highly correlated among the three techniques, which shows links among metrics of quick flow, water sources, and flowpath routing in a small (41 ha), forested catchment (W-9). The similarity in responses validates a physical interpretation for a particular RA approach (the Eckhardt recursive RA filter). This validation provides a new tool to estimate new water inputs and flowpath routing for more and longer periods when chemical or isotopic tracers may not have been measured. At three other SRRW catchments, we found similarly strong correlations among the three techniques. Consistent responses across four catchments provide evidence to support other research at the SRRW showing that runoff generation mechanisms are similar despite differences in catchment sizes and land covers.
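
    The RA approach validated here is a short recursive filter. The sketch below implements the two-parameter Eckhardt (2005) form, with illustrative parameter values and a made-up hydrograph; the calibrated values used at Sleepers River are not reproduced.

      def eckhardt_baseflow(streamflow, a=0.98, bfi_max=0.8):
          """Eckhardt (2005) recursive digital filter; quick flow is the residual.

          a       -- recession constant, typically taken from recession analysis
          bfi_max -- maximum baseflow index, a catchment-dependent parameter
          """
          baseflow = [streamflow[0]]
          for y in streamflow[1:]:
              b = ((1 - bfi_max) * a * baseflow[-1]
                   + (1 - a) * bfi_max * y) / (1 - a * bfi_max)
              baseflow.append(min(b, y))   # baseflow cannot exceed total flow
          return baseflow

      flow = [1.0, 1.1, 4.0, 8.0, 5.0, 3.0, 2.0, 1.5, 1.2, 1.1]  # invented
      base = eckhardt_baseflow(flow)
      quick = [y - b for y, b in zip(flow, base)]
      print([round(q, 2) for q in quick])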

  16. Implementing technical refinement in high-level athletics: exploring the knowledge schemas of coaches.

    PubMed

    Kearney, Philip E; Carson, Howie J; Collins, Dave

    2018-05-01

    This paper explores the approaches adopted by high-level field athletics coaches when attempting to refine an athlete's already well-established technique (long and triple jump and javelin throwing). Six coaches, who had all coached multiple athletes to multiple major championships, took part in semi-structured interviews focused upon a recent example of technique refinement. Data were analysed using a thematic content analysis. The coaching tools reported were generally consistent with those advised by the existing literature, focusing on attaining "buy-in", utilising part-practice, restoring movement automaticity and securing performance under pressure. Five of the six coaches reported using a systematic sequence of stages to implement the refinement, although the number and content of these stages varied between them. Notably, however, there were no formal sources of knowledge (e.g., coach education or training) provided to inform coaches' decision making. Instead, coaches' decisions were largely based on experience both within and outside the sporting domain. Data offer a useful stimulus for reflection amongst sport practitioners confronted by the problem of technique refinement. Certainly the limited awareness of existing guidelines on technique refinement expressed by the coaches emphasises a need for further collaborative work by researchers and coach educators to disseminate best practice.

  17. Application of the MIDAS approach for analysis of lysine acetylation sites.

    PubMed

    Evans, Caroline A; Griffiths, John R; Unwin, Richard D; Whetton, Anthony D; Corfe, Bernard M

    2013-01-01

    Multiple Reaction Monitoring Initiated Detection and Sequencing (MIDAS™) is a mass spectrometry-based technique for the detection and characterization of specific post-translational modifications (Unwin et al. 4:1134-1144, 2005), for example acetylated lysine residues (Griffiths et al. 18:1423-1428, 2007). The MIDAS™ technique has application for discovery and analysis of acetylation sites. It is a hypothesis-driven approach that requires a priori knowledge of the primary sequence of the target protein and a proteolytic digest of this protein. MIDAS essentially performs a targeted search for the presence of modified, for example acetylated, peptides. The detection is based on the combination of the predicted molecular weight (measured as mass-charge ratio) of the acetylated proteolytic peptide and a diagnostic fragment (product ion of m/z 126.1), which is generated by specific fragmentation of acetylated peptides during collision induced dissociation performed in tandem mass spectrometry (MS) analysis. Sequence information is subsequently obtained which enables acetylation site assignment. The technique of MIDAS was later trademarked by ABSciex for targeted protein analysis where an MRM scan is combined with full MS/MS product ion scan to enable sequence confirmation.

  18. Study of spread spectrum multiple access systems for satellite communications with overlay on current services

    NASA Technical Reports Server (NTRS)

    Ha, Tri T.; Pratt, Timothy

    1989-01-01

    The feasibility of using spread spectrum techniques to provide a low-cost multiple access system for a very large number of low-data-rate terminals was investigated. Two applications of spread spectrum technology to very small aperture terminal (VSAT) satellite communication networks are presented. Two spread spectrum multiple access systems which use a form of noncoherent M-ary FSK (MFSK) as the primary modulation are described and their throughput analyzed. The analysis considers such factors as satellite power constraints and adjacent satellite interference. The effect of on-board processing on the multiple access efficiency is also considered, and the feasibility of overlaying low-data-rate spread spectrum signals on existing satellite traffic as a form of frequency reuse is investigated. The use of chirp for spread spectrum communications is examined. In a chirp communication system, each data bit is converted into one or more up or down sweeps of frequency, which spread the RF energy across a broad range of frequencies. Several different forms of chirp communication systems are considered, and a multiple-chirp coded system is proposed for overlay service. The mutual interference problem is examined in detail and a performance analysis undertaken for the case of a chirp data channel overlaid on a video channel.

  19. Many-body-theory study of lithium photoionization

    NASA Technical Reports Server (NTRS)

    Chang, T. N.; Poe, R. T.

    1975-01-01

    A detailed theoretical calculation is carried out for the photoionization of lithium at low energies within the framework of Brueckner-Goldstone perturbational approach. In this calculation extensive use is made of the recently developed multiple-basis-set technique. Through this technique all second-order perturbation terms, plus a number of important classes of terms to infinite order, have been taken into account. Analysis of the results enables one to resolve the discrepancies between two previous works on this subject. The detailed calculation also serves as a test on the convergence of the many-body perturbation-expansion approach.

  20. Pharmacokinetics-on-a-Chip Using Label-Free SERS Technique for Programmable Dual-Drug Analysis.

    PubMed

    Fei, Jiayuan; Wu, Lei; Zhang, Yizhi; Zong, Shenfei; Wang, Zhuyuan; Cui, Yiping

    2017-06-23

    Synergistic effects of dual or multiple drugs have attracted great attention in medical fields, especially in cancer therapies. We provide a programmable microfluidic platform for pharmacokinetic detection of multiple drugs in multiple cells. The well-designed microfluidic platform includes two 2 × 3 microarrays of cell chambers, two gradient generators, and several pneumatic valves. Through the combined use of valves and gradient generators, each chamber can be controlled to infuse different kinds of living cells and drugs with specific concentrations as needed. In our experiments, 6-mercaptopurine (6MP) and methimazole (MMI) were chosen as two drug models, and their pharmacokinetic parameters in different living cells were monitored through intracellular SERS spectra, which reflect the molecular structure of these drugs. The dynamic changes of the SERS fingerprints of 6MP and MMI molecules were recorded during drug metabolism in living cells. The results indicated that both 6MP and MMI molecules diffused into the cells within 4 min and were excreted after 36 h. Moreover, the intracellular distribution of these drugs was monitored through SERS mapping. Thus, our microfluidic platform simultaneously accomplishes the functions of monitoring the pharmacokinetic action, distribution, and fingerprints of multiple drugs in multiple cells. Owing to its real-time, rapid, high-precision, and programmable capability for multiple-drug and multicell analysis, such a microfluidic platform has great potential in drug design and development.

  1. 4D Hyperspherical Harmonic (HyperSPHARM) Representation of Surface Anatomy: A Holistic Treatment of Multiple Disconnected Anatomical Structures

    PubMed Central

    Hosseinbor, A. Pasha; Chung, Moo K.; Koay, Cheng Guan; Schaefer, Stacey M.; van Reekum, Carien M.; Schmitz, Lara Peschke; Sutterer, Matt; Alexander, Andrew L.; Davidson, Richard J.

    2015-01-01

    Image-based parcellation of the brain often leads to multiple disconnected anatomical structures, which pose significant challenges for analyses of morphological shapes. Existing shape models, such as the widely used spherical harmonic (SPHARM) representation, assume topological invariance, so they are unable to simultaneously parameterize multiple disjoint structures. In such a situation, SPHARM has to be applied separately to each individual structure. We present a novel surface parameterization technique using 4D hyperspherical harmonics to represent multiple disjoint objects as a single analytic function, terming it HyperSPHARM. The underlying idea behind HyperSPHARM is to stereographically project an entire collection of disjoint 3D objects onto the 4D hypersphere and then simultaneously parameterize them with the 4D hyperspherical harmonics. Hence, HyperSPHARM allows for a holistic treatment of multiple disjoint objects, unlike SPHARM. In an imaging dataset of healthy adult human brains, we apply HyperSPHARM to the hippocampi and amygdalae. The HyperSPHARM representations are employed as a data smoothing technique, while the HyperSPHARM coefficients are utilized in a support vector machine setting for object classification. HyperSPHARM yields nearly identical results as SPHARM, as will be shown in the paper. Its key advantage over SPHARM is computational: HyperSPHARM can parameterize multiple disjoint structures using far fewer basis functions, and the stereographic projection obviates SPHARM's burdensome surface flattening. In addition, HyperSPHARM can handle any type of topology, unlike SPHARM, whose analysis is confined to topologically invariant structures. PMID:25828650
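
    The projection step at the heart of HyperSPHARM is compact enough to state directly. The function below is a sketch of the inverse stereographic projection of 3D coordinates onto the unit hypersphere S^3; the subsequent hyperspherical-harmonic expansion is omitted.

      import numpy as np

      def to_hypersphere(points):
          """Inverse stereographic projection of 3D points onto the unit S^3.

          Each row (x, y, z) maps to (2x, 2y, 2z, r^2 - 1) / (r^2 + 1),
          which always has unit norm.
          """
          r2 = np.sum(points ** 2, axis=1, keepdims=True)
          return np.hstack([2 * points, r2 - 1]) / (r2 + 1)

      pts = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 2.0]])
      u = to_hypersphere(pts)
      print(u)
      print(np.linalg.norm(u, axis=1))   # all norms equal 1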

  2. An Analysis of Nondestructive Evaluation Techniques for Polymer Matrix Composite Sandwich Materials

    NASA Technical Reports Server (NTRS)

    Cosgriff, Laura M.; Roberts, Gary D.; Binienda, Wieslaw K.; Zheng, Diahua; Averbeck, Timothy; Roth, Donald J.; Jeanneau, Philippe

    2006-01-01

    Structural sandwich materials composed of triaxially braided polymer matrix composite material face sheets sandwiching a foam core are being utilized for applications including aerospace components and recreational equipment. Since full scale components are being made from these sandwich materials, it is necessary to develop proper inspection practices for their manufacture and in-field use. Specifically, nondestructive evaluation (NDE) techniques need to be investigated for analysis of components made from these materials. Hockey blades made from sandwich materials and a flat sandwich sample were examined with multiple NDE techniques including thermographic, radiographic, and shearographic methods to investigate damage induced in the blades and flat panel components. Hockey blades used during actual play and a flat polymer matrix composite sandwich sample with damage inserted into the foam core were investigated with each technique. NDE images from the samples were presented and discussed. Structural elements within each blade were observed with radiographic imaging. Damaged regions and some structural elements of the hockey blades were identified with thermographic imaging. Structural elements, damaged regions, and other material variations were detected in the hockey blades with shearography. Each technique's advantages and disadvantages were considered in making recommendations for inspection of components made from these types of materials.

  3. Turning challenges into design principles: Telemonitoring systems for patients with multiple chronic conditions.

    PubMed

    Sultan, Mehwish; Kuluski, Kerry; McIsaac, Warren J; Cafazzo, Joseph A; Seto, Emily

    2018-01-01

    People with multiple chronic conditions often struggle with managing their health. The purpose of this research was to identify specific challenges of patients with multiple chronic conditions and to use the findings to form design principles for a telemonitoring system tailored for these patients. Semi-structured interviews with 15 patients with multiple chronic conditions and 10 clinicians were conducted to gain an understanding of their needs and preferences for a smartphone-based telemonitoring system. The interviews were analyzed using a conventional content analysis technique, resulting in six themes. Design principles developed from the themes included that the system must be modular to accommodate various combinations of conditions, reinforce a routine, consolidate record keeping, as well as provide actionable feedback to the patients. Designing an application for multiple chronic conditions is complex due to variability in patient conditions, and therefore, design principles developed in this study can help with future innovations aimed to help manage this population.

  4. Multiple Uses of a Word Study Technique

    ERIC Educational Resources Information Center

    Joseph, Laurice M.; Orlins, Andrew

    2005-01-01

    This paper presents two case studies that illustrate the multiple uses of word sorts, a word study phonics technique. Case study children were Sara, a second grader, who had difficulty with reading basic words and John, a third grader, who had difficulty with spelling basic words. Multiple baseline designs were employed to study the effects of…

  5. Hardware Implementation of Multiple Fan Beam Projection Technique in Optical Fibre Process Tomography

    PubMed Central

    Rahim, Ruzairi Abdul; Fazalul Rahiman, Mohd Hafiz; Leong, Lai Chen; Chan, Kok San; Pang, Jon Fea

    2008-01-01

    The main objective of this project is to implement the multiple fan beam projection technique using optical fibre sensors, with the aim of achieving a high data acquisition rate. The multiple fan beam projection technique is defined here as allowing more than one emitter to transmit light at the same time using the switch-mode fan beam method. For the thirty-two pairs of sensors used, the 2-projection and 4-projection techniques were investigated. Sixteen sets of projections complete one frame of light emission for the 2-projection technique, while eight sets of projections complete one frame for the 4-projection technique. In order to facilitate the data acquisition process, a PIC microcontroller and a sample-and-hold circuit were used. This paper summarizes the hardware configuration and design for this project. PMID:27879885

  6. Maternal factors predicting cognitive and behavioral characteristics of children with fetal alcohol spectrum disorders.

    PubMed

    May, Philip A; Tabachnick, Barbara G; Gossage, J Phillip; Kalberg, Wendy O; Marais, Anna-Susan; Robinson, Luther K; Manning, Melanie A; Blankenship, Jason; Buckley, David; Hoyme, H Eugene; Adnams, Colleen M

    2013-06-01

    The aim was to provide an analysis of multiple predictors of cognitive and behavioral traits for children with fetal alcohol spectrum disorders (FASDs). Multivariate correlation techniques were used with maternal and child data from epidemiologic studies in a community in South Africa. Data on 561 first-grade children with fetal alcohol syndrome (FAS), partial FAS (PFAS), or no FASD, and their mothers, were analyzed by grouping 19 maternal variables into categories (physical, demographic, childbearing, and drinking) and using them in structural equation models (SEMs) to assess correlates of child intelligence (verbal and nonverbal) and behavior. A first SEM using only 7 maternal alcohol use variables to predict cognitive/behavioral traits was statistically significant (B = 3.10, p < .05) but explained only 17.3% of the variance. The second model incorporated multiple maternal variables and was statistically significant, explaining 55.3% of the variance. Significantly correlated with low intelligence and problem behavior were demographic characteristics (B = 3.83, p < .05) (low maternal education, low socioeconomic status [SES], and rural residence) and maternal physical characteristics (B = 2.70, p < .05) (short stature, small head circumference, and low weight). Childbearing history and alcohol use composites were not statistically significant in the final complex model and were overpowered by SES and maternal physical traits. Although other analytic techniques have amply demonstrated the negative effects of maternal drinking on intelligence and behavior, this highly controlled analysis of multiple maternal influences reveals that maternal demographics and physical traits make a significant enabling or disabling contribution to child functioning in FASD.

  7. Fragmentation of organic ions bearing fixed multiple charges observed in MALDI MS.

    PubMed

    Lou, Xianwen; Li, Bao; de Waal, Bas F M; Schill, Jurgen; Baker, Matthew B; Bovee, Ralf A A; van Dongen, Joost L J; Milroy, Lech-Gustav; Meijer, E W

    2018-01-01

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS) was used to analyze a series of synthetic organic ions bearing fixed multiple charges. Despite the multiple intrinsic charges, only singly charged ions were recorded in each case. In addition to the pseudo-molecular ions formed by counterion adduction, deprotonation and electron capture, a number of fragment ions were also observed. Charge splitting by fragmentation was found to be a viable route for charge reduction leading to the formation of the observed singly charged fragment ions. Unlike multivalent metal ions, organic ions can rearrange and/or fragment during charge reduction. This fragmentation process will evidently complicate the interpretation of the MALDI MS spectrum. Because MALDI MS is usually considered as a soft ionization technique, the fragment ion peaks can easily be erroneously interpreted as impurities. Therefore, the awareness and understanding of the underlying MALDI-induced fragmentation pathways is essential for a proper interpretation of the corresponding mass spectra. Due to the fragment ions generated during charge reduction, special care should be taken in the MALDI MS analysis of multiply charged ions. In this work, the possible mechanisms by which the organic ions bearing fixed multiple charges fragment are investigated. With an improved understanding of the fragmentation mechanisms, MALDI TOF MS should still be a useful technique for the characterization of organic ions with fixed multiple charges. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Nuts and Bolts - Techniques for Genesis Sample Curation

    NASA Technical Reports Server (NTRS)

    Burkett, Patti J.; Rodriquez, M. C.; Allton, J. H.

    2011-01-01

    The Genesis curation staff at NASA Johnson Space Center provides samples and data for analysis to the scientific community, following allocation approval by the Genesis Oversight Committee, a sub-committee of CAPTEM (Curation Analysis Planning Team for Extraterrestrial Materials). We are often asked by investigators within the scientific community how we choose samples to best fit the requirements of the request. Here we will demonstrate our techniques for characterizing samples and satisfying allocation requests. Even with a systematic approach, every allocation is unique. We are also providing updated status of the cataloging and characterization of solar wind collectors as of January 2011. The collection consists of 3721 inventoried samples consisting of a single fragment, or multiple fragments containerized or pressed between post-it notes, jars or vials of various sizes.

  9. Using Classification and Regression Trees (CART) and random forests to analyze attrition: Results from two simulations.

    PubMed

    Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J

    2015-12-01

    In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers. © 2015 APA, all rights reserved.
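
    The weighting idea described in this record can be sketched briefly: fit a tree-based model to the probability of remaining in the sample, then weight each observed case by the inverse of that probability. The Python below uses a random forest on synthetic data with a deliberately interactive dropout mechanism; the names and settings are illustrative and do not reproduce the authors' simulation design.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(4)
      n = 1000
      covariates = rng.normal(size=(n, 4))      # baseline predictors of dropout
      # Nonlinear, interactive selection: dropout depends on a product term.
      p_drop = 1 / (1 + np.exp(-covariates[:, 0] * covariates[:, 1]))
      observed = rng.random(n) > p_drop         # True = retained at follow-up

      rf = RandomForestClassifier(n_estimators=200, random_state=0)
      rf.fit(covariates, observed)
      p_obs = rf.predict_proba(covariates)[:, 1].clip(0.05, 0.95)  # cap weights
      weights = 1.0 / p_obs[observed]
      print("weight range for retained cases:",
            weights.min().round(2), "-", weights.max().round(2))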

  10. An Identification Procedure for Behavioral Analysis in a Multi-User environment.

    PubMed

    Guerra, Claudio; Bianchi, Valentina; De Munari, Ilaria; Ciampolini, Paolo

    2015-01-01

    As the average age of the EU population increases, ICT solutions will play a key role in answering the new challenges that demographic change brings. At the University of Parma, an AAL (Ambient Assisted Living) system named CARDEA has been developed over the last 10 years. Within CARDEA, behavioral analysis is carried out based on environmental sensors. If multiple users live in the same environment, however, data coming from the sensors need to be properly tagged: in this paper, a simple technique for such tagging is proposed which exploits the same wireless link used for transmitting sensor data, thus requiring no additional hardware components and avoiding more complex and expensive (radio)localization techniques. Preliminary results are shown, featuring satisfactory accuracy.

  11. Using Classification and Regression Trees (CART) and Random Forests to Analyze Attrition: Results From Two Simulations

    PubMed Central

    Hayes, Timothy; Usami, Satoshi; Jacobucci, Ross; McArdle, John J.

    2016-01-01

    In this article, we describe a recent development in the analysis of attrition: using classification and regression trees (CART) and random forest methods to generate inverse sampling weights. These flexible machine learning techniques have the potential to capture complex nonlinear, interactive selection models, yet to our knowledge, their performance in the missing data analysis context has never been evaluated. To assess the potential benefits of these methods, we compare their performance with commonly employed multiple imputation and complete case techniques in 2 simulations. These initial results suggest that weights computed from pruned CART analyses performed well in terms of both bias and efficiency when compared with other methods. We discuss the implications of these findings for applied researchers. PMID:26389526

  12. Analysis and optimization of gyrokinetic toroidal simulations on homogenous and heterogenous platforms

    DOE PAGES

    Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; ...

    2013-07-18

    The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality vs. synchronization tradeoffs on CPU- and GPU-based architectures. Our optimized hybrid parallel implementation of GTC uses MPI, OpenMP, and NVIDIA CUDA, achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems, and scales efficiently to tens of thousands of cores.

  13. Performance Analysis of Blind Subspace-Based Signature Estimation Algorithms for DS-CDMA Systems with Unknown Correlated Noise

    NASA Astrophysics Data System (ADS)

    Zarifi, Keyvan; Gershman, Alex B.

    2006-12-01

    We analyze the performance of two popular blind subspace-based signature waveform estimation techniques proposed by Wang and Poor and Buzzi and Poor for direct-sequence code division multiple-access (DS-CDMA) systems with unknown correlated noise. Using the first-order perturbation theory, analytical expressions for the mean-square error (MSE) of these algorithms are derived. We also obtain simple high SNR approximations of the MSE expressions which explicitly clarify how the performance of these techniques depends on the environmental parameters and how it is related to that of the conventional techniques that are based on the standard white noise assumption. Numerical examples further verify the consistency of the obtained analytical results with simulation results.

  14. Preparing Colorful Astronomical Images II

    NASA Astrophysics Data System (ADS)

    Levay, Z. G.; Frattare, L. M.

    2002-12-01

    We present additional techniques for using mainstream graphics software (Adobe Photoshop and Illustrator) to produce composite color images and illustrations from astronomical data. These techniques have been used on numerous images from the Hubble Space Telescope to produce photographic, print and web-based products for news, education and public presentation as well as illustrations for technical publication. We expand on a previous paper to present more detail and additional techniques, taking advantage of new or improved features available in the latest software versions. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to work with scaled images, masks, text and graphics in multiple semi-transparent layers and channels.

  15. Logistic Stick-Breaking Process

    PubMed Central

    Ren, Lu; Du, Lan; Carin, Lawrence; Dunson, David B.

    2013-01-01

    A logistic stick-breaking process (LSBP) is proposed for non-parametric clustering of general spatially- or temporally-dependent data, imposing the belief that proximate data are more likely to be clustered together. The sticks in the LSBP are realized via multiple logistic regression functions, with shrinkage priors employed to favor contiguous and spatially localized segments. The LSBP is also extended for the simultaneous processing of multiple data sets, yielding a hierarchical logistic stick-breaking process (H-LSBP). The model parameters (atoms) within the H-LSBP are shared across the multiple learning tasks. Efficient variational Bayesian inference is derived, and comparisons are made to related techniques in the literature. Experimental analysis is performed for audio waveforms and images, and it is demonstrated that for segmentation applications the LSBP yields generally homogeneous segments with sharp boundaries. PMID:25258593
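
    The stick-breaking construction behind the LSBP takes only a few lines. The sketch below computes the mixture weights implied by hypothetical logistic-regression values at one covariate location; the shrinkage priors and variational inference described in the paper are not shown.

      import numpy as np

      def lsbp_weights(logits):
          """Mixture weights of a logistic stick-breaking process.

          logits -- values f_1(x), ..., f_{K-1}(x) of the logistic regression
          functions at a single location x.  Component k receives
          sigmoid(f_k) times the stick left over by components 1..k-1;
          the final component takes whatever remains.
          """
          v = 1.0 / (1.0 + np.exp(-np.asarray(logits, dtype=float)))
          remaining = np.concatenate([[1.0], np.cumprod(1 - v)])
          return np.concatenate([v * remaining[:-1], [remaining[-1]]])

      w = lsbp_weights([2.0, -1.0, 0.5])
      print(w, w.sum())   # four weights summing to 1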

  16. Graded Multiple Choice Questions: Rewarding Understanding and Preventing Plagiarism

    NASA Astrophysics Data System (ADS)

    Denyer, G. S.; Hancock, D.

    2002-08-01

    This paper describes an easily implemented method that allows the generation and analysis of graded multiple-choice examinations. The technique, which uses standard functions in user-end software (Microsoft Excel 5+), can also produce several different versions of an examination, which can be employed to prevent rewarding plagiarism. The manuscript also discusses the advantages of having a graded marking system for the elimination of ambiguities, for use in multi-step calculation questions, and for questions that require extrapolation or reasoning. The advantages of the scrambling strategy, which maintains the same question order, are discussed with reference to student equity. The system provides a non-confrontational mechanism for dealing with cheating in large-class multiple-choice examinations, as well as providing a reward for problem solving over surface learning.

  17. A new formation control of multiple underactuated surface vessels

    NASA Astrophysics Data System (ADS)

    Xie, Wenjing; Ma, Baoli; Fernando, Tyrone; Iu, Herbert Ho-Ching

    2018-05-01

    This work investigates a new formation control problem of multiple underactuated surface vessels. The controller design is based on the input-output linearisation technique, graph theory, consensus ideas, and some nonlinear tools. The proposed smooth time-varying distributed control law guarantees that the multiple underactuated surface vessels globally exponentially converge to some desired geometric shape, centred at the initial average position of the vessels. Furthermore, the stability analysis of the zero dynamics proves that the orientations of the vessels tend to constants that depend on the initial values of the vessels, and that the velocities and control inputs of the vessels decay to zero. All the results are obtained under the communication scenario of a static directed balanced graph containing a spanning tree. The effectiveness of the proposed distributed control scheme is demonstrated using a simulation example.

  18. Multiple Criteria Decision Analysis for Health Care Decision Making--Emerging Good Practices: Report 2 of the ISPOR MCDA Emerging Good Practices Task Force.

    PubMed

    Marsh, Kevin; IJzerman, Maarten; Thokala, Praveen; Baltussen, Rob; Boysen, Meindert; Kaló, Zoltán; Lönngren, Thomas; Mussen, Filip; Peacock, Stuart; Watkins, John; Devlin, Nancy

    2016-01-01

    Health care decisions are complex and involve confronting trade-offs between multiple, often conflicting objectives. Using structured, explicit approaches to decisions involving multiple criteria can improve the quality of decision making. A set of techniques, known under the collective heading multiple criteria decision analysis (MCDA), are useful for this purpose. In 2014, ISPOR established an Emerging Good Practices Task Force. The task force's first report defined MCDA, provided examples of its use in health care, described the key steps, and provided an overview of the principal methods of MCDA. This second task force report provides emerging good-practice guidance on the implementation of MCDA to support health care decisions. The report includes a checklist to support the design, implementation, and review of an MCDA; guidance to support the implementation of the checklist; the order in which the steps should be implemented; an illustration of how to incorporate budget constraints into an MCDA; an overview of the skills and resources, including available software, required to implement MCDA; and future research directions. Copyright © 2016 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  19. Sparsity-aware multiple relay selection in large multi-hop decode-and-forward relay networks

    NASA Astrophysics Data System (ADS)

    Gouissem, A.; Hamila, R.; Al-Dhahir, N.; Foufou, S.

    2016-12-01

    In this paper, we propose and investigate two novel techniques for multiple relay selection in large multi-hop decode-and-forward relay networks. The two proposed techniques exploit sparse signal recovery theory to select multiple relays using the orthogonal matching pursuit algorithm, and outperform state-of-the-art techniques in terms of outage probability and computational complexity. To reduce the amount of collected channel state information (CSI), we propose a limited-feedback scheme where only a limited number of relays feed back their CSI. Furthermore, a detailed performance-complexity tradeoff investigation is conducted for the different studied techniques and verified by Monte Carlo simulations.
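
    As a toy stand-in for the paper's formulation, the sketch below poses relay selection as sparse recovery: each column of a hypothetical dictionary represents one relay's contribution, and scikit-learn's orthogonal matching pursuit picks the few columns that best explain a target signal. The paper's actual outage-probability objective and network model differ.

      import numpy as np
      from sklearn.linear_model import OrthogonalMatchingPursuit

      rng = np.random.default_rng(5)
      n_relays, n_obs, k = 50, 20, 3            # large relay pool, choose k
      A = rng.normal(size=(n_obs, n_relays))    # hypothetical relay dictionary
      best = rng.choice(n_relays, size=k, replace=False)
      target = A[:, best] @ rng.uniform(0.5, 1.0, size=k)

      omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k).fit(A, target)
      print("planted relays :", sorted(best))
      print("selected relays:", sorted(np.flatnonzero(omp.coef_)))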

  20. Least-squares deconvolution of evoked potentials and sequence optimization for multiple stimuli under low-jitter conditions.

    PubMed

    Bardy, Fabrice; Dillon, Harvey; Van Dun, Bram

    2014-04-01

    Rapid presentation of stimuli in an evoked response paradigm can lead to overlap of multiple responses and consequently to difficulties interpreting waveform morphology. This paper presents a deconvolution method allowing overlapping multiple responses to be disentangled. The deconvolution technique uses a least-squares error approach. A methodology is proposed to optimize the stimulus sequence associated with the deconvolution technique under low-jitter conditions; it controls the condition number of the matrices involved in recovering the responses. Simulations were performed using the proposed deconvolution technique. Multiple overlapping responses can be recovered perfectly in noiseless conditions. In the presence of noise, the amount of error introduced by the technique can be controlled a priori by the condition number of the matrix associated with the stimulus sequence used. The simulation results indicate the need for a minimum amount of jitter, as well as a sufficient number of overlap combinations, to obtain optimum results. An aperiodic model is recommended to improve reconstruction. We propose a deconvolution technique allowing multiple overlapping responses to be extracted, and a method of choosing the stimulus sequence optimal for response recovery. This technique may allow audiologists, psychologists, and electrophysiologists to optimize their experimental designs involving rapidly presented stimuli, and to recover overlapping evoked responses. Copyright © 2013 International Federation of Clinical Neurophysiology. All rights reserved.
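
    A generic linear-systems version of the approach can be sketched directly: stack shifted copies of the stimulus impulses into a design matrix, inspect its condition number, and solve least squares for the underlying response. The onsets, response shape, and noise level below are invented, and the paper's sequence-optimization procedure is not reproduced.

      import numpy as np

      def stimulus_matrix(onsets, n_samples, resp_len):
          # Column j of M holds an impulse shifted to each onset, so that
          # M @ response reproduces the overlapped recording.
          M = np.zeros((n_samples, resp_len))
          for t in onsets:
              rows = np.arange(t, min(t + resp_len, n_samples))
              M[rows, rows - t] = 1.0
          return M

      rng = np.random.default_rng(6)
      resp = np.exp(-np.arange(30) / 8.0) * np.sin(np.arange(30) / 3.0)
      onsets = [0, 11, 25, 38, 54, 67]          # jittered, overlapping stimuli
      M = stimulus_matrix(onsets, 100, len(resp))

      # The condition number controls how much noise the inversion amplifies;
      # choosing the stimulus sequence is choosing this number.
      print("condition number:", round(np.linalg.cond(M), 1))

      recording = M @ resp + rng.normal(scale=0.01, size=100)
      recovered = np.linalg.lstsq(M, recording, rcond=None)[0]
      print("max recovery error:", np.abs(recovered - resp).max().round(3))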

  1. Classification of air quality using fuzzy synthetic multiplication.

    PubMed

    Abdullah, Lazim; Khalid, Noor Dalina

    2012-11-01

    Proper identification of an environment's air quality based on limited observations is an essential task in meeting the goals of environmental management. Various classification methods have been used to estimate changes in air quality status and health. However, discrepancies frequently arise from the lack of a clear distinction between each air quality class, the uncertainty in the quality criteria employed, and the vagueness or fuzziness embedded in the decision-making output values. Owing to this inherent imprecision, conventional methodologies often have difficulty describing integrated air quality conditions with respect to various pollutants. Therefore, this paper presents two fuzzy synthetic multiplication techniques to establish a classification of air quality. The fuzzy multiplication technique employs max-min operations for "or" and "and" in executing the fuzzy arithmetic operations. A set of air pollutant data (carbon monoxide, sulfur dioxide, nitrogen dioxide, ozone, and particulate matter (PM10)) collected from a network of 51 stations in the Klang Valley and in East Malaysia (Sabah and Sarawak) was utilized in this evaluation. The two fuzzy multiplication techniques consistently classified Malaysia's air quality as "good." The findings indicated that the techniques may have successfully harmonized inherent discrepancies and interpreted complex conditions. It was demonstrated that fuzzy synthetic multiplication techniques are quite appropriate for air quality management.
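
    The max-min composition at the core of the fuzzy synthetic evaluation can be shown in a few lines. The membership matrix and weights below are invented for illustration and bear no relation to the Malaysian monitoring data.

      import numpy as np

      pollutants = ["CO", "SO2", "NO2", "O3", "PM10"]
      classes = ["good", "moderate", "unhealthy"]

      # Hypothetical membership matrix R: degree to which each pollutant's
      # observed level belongs to each air-quality class.
      R = np.array([[0.8, 0.2, 0.0],
                    [0.7, 0.3, 0.0],
                    [0.6, 0.3, 0.1],
                    [0.9, 0.1, 0.0],
                    [0.5, 0.25, 0.1]])
      w = np.array([0.3, 0.1, 0.2, 0.1, 0.3])   # pollutant weights, sum to 1

      # Max-min composition: b_j = max_i min(w_i, R_ij); "and" = min, "or" = max.
      b = np.max(np.minimum(w[:, None], R), axis=0)
      print(dict(zip(classes, b.round(2))), "->", classes[int(np.argmax(b))])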

  2. Analysis of the psychometric properties of the Multiple Sclerosis Impact Scale-29 (MSIS-29) in relapsing–remitting multiple sclerosis using classical and modern test theory

    PubMed Central

    Wyrwich, KW; Phillips, GA; Vollmer, T; Guo, S

    2016-01-01

    Background: Investigations using classical test theory support the psychometric properties of the original version of the Multiple Sclerosis Impact Scale (MSIS-29v1), a disease-specific measure of multiple sclerosis (MS) impact (physical and psychological subscales). Later, assessments of the MSIS-29v1 in an MS community-based sample using Rasch analysis led to revisions of the instrument's response options (MSIS-29v2). Objective: To evaluate the psychometric properties of the MSIS-29v1 in a clinical trial cohort of relapsing–remitting MS patients (RRMS). Methods: Data from 600 patients with RRMS enrolled in the SELECT clinical trial were used. Assessments were performed at baseline and at Weeks 12, 24, and 52. In addition to traditional psychometric analyses, Item Response Theory (IRT) and Rasch analysis were used to evaluate the measurement properties of the MSIS-29v1. Results: Both MSIS-29v1 subscales demonstrated strong reliability, construct validity, and responsiveness. The IRT and Rasch analysis showed overall support for response category threshold ordering, person-item fit, and item fit for both subscales. Conclusions: Both MSIS-29v1 subscales demonstrated robust measurement properties using classical, IRT, and Rasch techniques. Unlike previous research using a community-based sample, the MSIS-29v1 was found to be psychometrically sound to assess physical and psychological impairments in a clinical trial sample of patients with RRMS. PMID:28607741

  3. Analysis of the psychometric properties of the Multiple Sclerosis Impact Scale-29 (MSIS-29) in relapsing-remitting multiple sclerosis using classical and modern test theory.

    PubMed

    Bacci, E D; Wyrwich, K W; Phillips, G A; Vollmer, T; Guo, S

    2016-01-01

    Investigations using classical test theory support the psychometric properties of the original version of the Multiple Sclerosis Impact Scale (MSIS-29v1), a disease-specific measure of multiple sclerosis (MS) impact (physical and psychological subscales). Later, assessments of the MSIS-29v1 in an MS community-based sample using Rasch analysis led to revisions of the instrument's response options (MSIS-29v2). The objective of this paper is to evaluate the psychometric properties of the MSIS-29v1 in a clinical trial cohort of relapsing-remitting MS patients (RRMS). Data from 600 patients with RRMS enrolled in the SELECT clinical trial were used. Assessments were performed at baseline and at Weeks 12, 24, and 52. In addition to traditional psychometric analyses, Item Response Theory (IRT) and Rasch analysis were used to evaluate the measurement properties of the MSIS-29v1. Both MSIS-29v1 subscales demonstrated strong reliability, construct validity, and responsiveness. The IRT and Rasch analysis showed overall support for response category threshold ordering, person-item fit, and item fit for both subscales. Both MSIS-29v1 subscales demonstrated robust measurement properties using classical, IRT, and Rasch techniques. Unlike previous research using a community-based sample, the MSIS-29v1 was found to be psychometrically sound to assess physical and psychological impairments in a clinical trial sample of patients with RRMS.

  4. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Delaney, Robert A.

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This document is the final report describing the theoretical basis and analytical results from the ADPAC-AOACR codes developed under task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR Program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.

  5. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 5: Unsteady counterrotation ducted propfan analysis. Computer program user's manual

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Delaney, Robert A.; Adamczyk, John J.; Miller, Christopher J.; Arnone, Andrea; Swanson, Charles

    1993-01-01

    The primary objective of this study was the development of a time-marching three-dimensional Euler/Navier-Stokes aerodynamic analysis to predict steady and unsteady compressible transonic flows about ducted and unducted propfan propulsion systems employing multiple blade rows. The computer codes resulting from this study are referred to as ADPAC-AOACR (Advanced Ducted Propfan Analysis Codes-Angle of Attack Coupled Row). This report is intended to serve as a computer program user's manual for the ADPAC-AOACR codes developed under Task 5 of NASA Contract NAS3-25270, Unsteady Counterrotating Ducted Propfan Analysis. The ADPAC-AOACR program is based on a flexible multiple blocked grid discretization scheme permitting coupled 2-D/3-D mesh block solutions with application to a wide variety of geometries. For convenience, several standard mesh block structures are described for turbomachinery applications. Aerodynamic calculations are based on a four-stage Runge-Kutta time-marching finite volume solution technique with added numerical dissipation. Steady flow predictions are accelerated by a multigrid procedure. Numerical calculations are compared with experimental data for several test cases to demonstrate the utility of this approach for predicting the aerodynamics of modern turbomachinery configurations employing multiple blade rows.

  6. Hyperspectral wide gap second derivative analysis for in vivo detection of cervical intraepithelial neoplasia

    NASA Astrophysics Data System (ADS)

    Zheng, Wenli; Wang, Chaojian; Chang, Shufang; Zhang, Shiwu; Xu, Ronald X.

    2015-12-01

    The hyperspectral reflectance imaging technique has been used for in vivo detection of cervical intraepithelial neoplasia. However, the clinical outcome of this technique has been suboptimal owing to multiple limitations, such as nonuniform illumination, a high-cost and bulky setup, and time-consuming data acquisition and processing. To overcome these limitations, we acquired the hyperspectral data cube over a wavelength range from 600 to 800 nm and processed it by a wide gap second derivative analysis method. This method effectively reduced the image artifacts caused by nonuniform illumination and background absorption. Furthermore, with second derivative analysis, only three specific wavelengths (620, 696, and 772 nm) are needed for tissue classification with optimal separability. Clinical feasibility of the proposed image analysis and classification method was tested in a clinical trial where cervical hyperspectral images from three patients were used for classification analysis. Our proposed method successfully classified the cervix tissue into three categories: normal, inflammation, and high-grade lesion. These classification results were consistent with those of an experienced gynecology oncologist after applying acetic acid. Our preliminary clinical study has demonstrated the technical feasibility of in vivo and noninvasive detection of cervical neoplasia without acetic acid. Further clinical research is needed in order to establish a large-scale diagnostic database and optimize the tissue classification technique.
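
    The wide-gap second derivative is simply a central second difference taken over a wide sample gap; a minimal sketch (with an invented spectrum and gap, not the study's data) shows how it suppresses a smooth illumination baseline while keeping a broad feature near one of the wavelengths cited above.

```python
import numpy as np

def wide_gap_second_derivative(s: np.ndarray, gap: int) -> np.ndarray:
    """Central second difference with a wide gap: d[i] = s[i-g] - 2*s[i] + s[i+g]."""
    d = np.zeros_like(s, dtype=float)
    d[gap:-gap] = s[:-2 * gap] - 2.0 * s[gap:-gap] + s[2 * gap:]
    return d

# Toy spectrum: a broad absorption feature near 696 nm riding on a sloped
# illumination baseline (both invented for illustration).
wl = np.linspace(600, 800, 201)                     # nm, 1 nm sampling
baseline = 1.0 + 0.002 * (wl - 600)                 # slowly varying illumination
feature = -0.05 * np.exp(-0.5 * ((wl - 696) / 10.0) ** 2)
d2 = wide_gap_second_derivative(baseline + feature, gap=15)

# The linear baseline cancels exactly in the second difference.
print("feature detected near", int(wl[int(np.argmax(d2))]), "nm")
```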

  7. Hyperspectral wide gap second derivative analysis for in vivo detection of cervical intraepithelial neoplasia.

    PubMed

    Zheng, Wenli; Wang, Chaojian; Chang, Shufang; Zhang, Shiwu; Xu, Ronald X

    2015-12-01

    The hyperspectral reflectance imaging technique has been used for in vivo detection of cervical intraepithelial neoplasia. However, the clinical outcome of this technique has been suboptimal owing to multiple limitations, such as nonuniform illumination, a high-cost and bulky setup, and time-consuming data acquisition and processing. To overcome these limitations, we acquired the hyperspectral data cube over a wavelength range from 600 to 800 nm and processed it by a wide gap second derivative analysis method. This method effectively reduced the image artifacts caused by nonuniform illumination and background absorption. Furthermore, with second derivative analysis, only three specific wavelengths (620, 696, and 772 nm) are needed for tissue classification with optimal separability. Clinical feasibility of the proposed image analysis and classification method was tested in a clinical trial where cervical hyperspectral images from three patients were used for classification analysis. Our proposed method successfully classified the cervix tissue into three categories: normal, inflammation, and high-grade lesion. These classification results were consistent with those of an experienced gynecology oncologist after applying acetic acid. Our preliminary clinical study has demonstrated the technical feasibility of in vivo and noninvasive detection of cervical neoplasia without acetic acid. Further clinical research is needed in order to establish a large-scale diagnostic database and optimize the tissue classification technique.

  8. Challenge Paper: Validation of Forensic Techniques for Criminal Prosecution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erbacher, Robert F.; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.

    2007-04-10

    Abstract: As in many domains, there is increasing agreement in the user and research community that digital forensics analysts would benefit from the extension, development, and application of advanced techniques in performing large-scale and heterogeneous data analysis. Modern digital forensics analysis of cyber-crimes and cyber-enabled crimes often requires scrutiny of massive amounts of data. For example, a case involving network compromise across multiple enterprises might require forensic analysis of numerous sets of network logs and computer hard drives, potentially involving hundreds of gigabytes of heterogeneous data, or even terabytes or petabytes of data. Also, the goal for forensic analysis is to not only determine whether the illicit activity being considered is taking place, but also to identify the source of the activity and the full extent of the compromise or impact on the local network. Even after this analysis, there remains the challenge of using the results in subsequent criminal and civil processes.

  9. Nonlinear zero-sum differential game analysis by singular perturbation methods

    NASA Technical Reports Server (NTRS)

    Sinar, J.; Farber, N.

    1982-01-01

    A class of nonlinear, zero-sum differential games, exhibiting time-scale separation properties, can be analyzed by singular-perturbation techniques. The merits of such an analysis, leading to an approximate game solution, as well as the 'well-posedness' of the formulation, are discussed. This approach is shown to be attractive for investigating pursuit-evasion problems; the original multidimensional differential game is decomposed to a 'simple pursuit' (free-stream) game and two independent (boundary-layer) optimal-control problems. Using multiple time-scale boundary-layer models results in a pair of uniformly valid zero-order composite feedback strategies. The dependence of suboptimal strategies on relative geometry and own-state measurements is demonstrated by a three dimensional, constant-speed example. For game analysis with realistic vehicle dynamics, the technique of forced singular perturbations and a variable modeling approach is proposed. Accuracy of the analysis is evaluated by comparison with the numerical solution of a time-optimal, variable-speed 'game of two cars' in the horizontal plane.

  10. Analysis of multiple mycotoxins in food.

    PubMed

    Hajslova, Jana; Zachariasova, Milena; Cajka, Tomas

    2011-01-01

    Mycotoxins are secondary metabolites of microscopic filamentous fungi. Given the widespread distribution of fungi in the environment, mycotoxins are considered to be among the most important natural contaminants in foods and feeds. To protect consumers' health and reduce economic losses, surveillance and control of mycotoxins in food and feed have become a major objective for producers, regulatory authorities, and researchers worldwide. In this context, the availability of reliable analytical methods applicable for this purpose is essential. Since the variety of chemical structures among mycotoxins makes it impossible to use one single technique for their analysis, a vast number of analytical methods have been developed and validated. Both the large variability of food matrices and the growing demand for fast, cost-saving, and accurate determination of multiple mycotoxins by a single method outline new challenges for analytical research. This effort is facilitated by technical developments in mass spectrometry that reduce the influence of matrix effects even when the sample clean-up step is omitted. The current state of the art, together with future trends, is presented in this chapter. Attention is focused mainly on instrumental methods; advances in biosensors and other screening bioanalytical approaches enabling analysis of multiple mycotoxins are not discussed in detail.

  11. Non-Orthogonal Multiple Access for Ubiquitous Wireless Sensor Networks.

    PubMed

    Anwar, Asim; Seet, Boon-Chong; Ding, Zhiguo

    2018-02-08

    Ubiquitous wireless sensor networks (UWSNs) have become a critical technology for enabling smart cities and other ubiquitous monitoring applications. Their deployment, however, can be seriously hampered by the spectrum available to the sheer number of sensors for communication. To support the communication needs of UWSNs without requiring more spectrum resources, the power-domain non-orthogonal multiple access (NOMA) technique originally proposed for 5th Generation (5G) cellular networks is investigated for UWSNs for the first time in this paper. However, unlike 5G networks that operate in the licensed spectrum, UWSNs mostly operate in unlicensed spectrum, where sensors also experience cross-technology interference from other devices sharing the same spectrum. In this paper, we model the interference from various sources at the sensors using a stochastic geometry framework. To evaluate the performance, we derive a theorem and present a new closed-form expression for the outage probability of the sensors in a downlink scenario in an interference-limited environment. In addition, diversity analysis for the ordered NOMA users is performed. Based on the derived outage probability, we evaluate the average link throughput and energy consumption efficiency of NOMA against the conventional orthogonal multiple access (OMA) technique in UWSNs. Further, the required computational complexity for the NOMA users is presented.
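
    A Monte Carlo sketch of the power-domain NOMA principle discussed above: superposed transmission with more power allocated to the weaker user and successive interference cancellation (SIC) at the stronger one. The power split, target rates, and fading statistics are assumptions for illustration, not the paper's stochastic-geometry model.

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 200_000
P, N0 = 10.0, 1.0                 # transmit power and noise power (assumed)
a_far, a_near = 0.8, 0.2          # power split: more power to the far (weak) user
R_far, R_near = 1.0, 1.0          # target rates in bit/s/Hz (assumed)

# Rayleigh fading: channel gains |h|^2 are exponential; the far user is weaker.
g_far = rng.exponential(0.3, trials)
g_near = rng.exponential(1.0, trials)

# Far user decodes its own signal, treating the near user's signal as interference.
sinr_far = a_far * P * g_far / (a_near * P * g_far + N0)

# Near user first decodes and removes the far user's signal (SIC), then its own.
sinr_sic = a_far * P * g_near / (a_near * P * g_near + N0)
sinr_near = a_near * P * g_near / N0

out_far = np.mean(np.log2(1 + sinr_far) < R_far)
out_near = np.mean((np.log2(1 + sinr_sic) < R_far) | (np.log2(1 + sinr_near) < R_near))
print(f"outage probability: far user {out_far:.3f}, near user {out_near:.3f}")
```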

  12. Context-based automated defect classification system using multiple morphological masks

    DOEpatents

    Gleason, Shaun S.; Hunt, Martin A.; Sari-Sarraf, Hamed

    2002-01-01

    Automatic detection of defects during the fabrication of semiconductor wafers is largely automated, but the classification of those defects is still performed manually by technicians. This invention includes novel digital image analysis techniques that generate unique feature vector descriptions of semiconductor defects as well as classifiers that use these descriptions to automatically categorize the defects into one of a set of pre-defined classes. Feature extraction techniques based on multiple-focus images, multiple-defect mask images, and segmented semiconductor wafer images are used to create unique feature-based descriptions of the semiconductor defects. These feature-based defect descriptions are subsequently classified by a defect classifier into categories that depend on defect characteristics and defect contextual information, that is, the semiconductor process layer(s) with which the defect comes in contact. At the heart of the system is a knowledge database that stores and distributes historical semiconductor wafer and defect data to guide the feature extraction and classification processes. In summary, this invention takes as its input a set of images containing semiconductor defect information, and generates as its output a classification for the defect that describes not only the defect itself, but also the location of that defect with respect to the semiconductor process layers.

  13. Comparison of seven techniques for typing international epidemic strains of Clostridium difficile: restriction endonuclease analysis, pulsed-field gel electrophoresis, PCR-ribotyping, multilocus sequence typing, multilocus variable-number tandem-repeat analysis, amplified fragment length polymorphism, and surface layer protein A gene sequence typing.

    PubMed

    Killgore, George; Thompson, Angela; Johnson, Stuart; Brazier, Jon; Kuijper, Ed; Pepin, Jacques; Frost, Eric H; Savelkoul, Paul; Nicholson, Brad; van den Berg, Renate J; Kato, Haru; Sambol, Susan P; Zukowski, Walter; Woods, Christopher; Limbago, Brandi; Gerding, Dale N; McDonald, L Clifford

    2008-02-01

    Using 42 isolates contributed by laboratories in Canada, The Netherlands, the United Kingdom, and the United States, we compared the results of analyses done with seven Clostridium difficile typing techniques: multilocus variable-number tandem-repeat analysis (MLVA), amplified fragment length polymorphism (AFLP), surface layer protein A gene sequence typing (slpAST), PCR-ribotyping, restriction endonuclease analysis (REA), multilocus sequence typing (MLST), and pulsed-field gel electrophoresis (PFGE). We assessed the discriminating ability and typeability of each technique as well as the agreement among techniques in grouping isolates by allele profile A (AP-A) through AP-F, which are defined by toxinotype, the presence of the binary toxin gene, and deletion in the tcdC gene. We found that all isolates were typeable by all techniques and that discrimination index scores for the techniques tested ranged from 0.964 to 0.631 in the following order: MLVA, REA, PFGE, slpAST, PCR-ribotyping, MLST, and AFLP. All the techniques were able to distinguish the current epidemic strain of C. difficile (BI/027/NAP1) from other strains. All of the techniques showed multiple types for AP-A (toxinotype 0, binary toxin negative, and no tcdC gene deletion). REA, slpAST, MLST, and PCR-ribotyping all included AP-B (toxinotype III, binary toxin positive, and an 18-bp deletion in tcdC) in a single group that excluded other APs. PFGE, AFLP, and MLVA grouped two, one, and two different non-AP-B isolates, respectively, with their AP-B isolates. All techniques appear to be capable of detecting outbreak strains, but only REA and MLVA showed sufficient discrimination to distinguish strains from different outbreaks.

  14. Multiple techniques for mineral identification of terrestrial evaporites relevant to Mars exploration

    NASA Astrophysics Data System (ADS)

    Stivaletta, N.; Dellisanti, F.; D'Elia, M.; Fonti, S.; Mancarella, F.

    2013-05-01

    Sulfates, commonly found in evaporite deposits, have been observed on the Martian surface by orbital remote sensing and surface exploration. In terrestrial environments, evaporite precipitation creates excellent microniches for microbial colonization, especially in desert areas. Deposits composed of gypsum, calcite, quartz, and silicates (phyllosilicates, feldspars) from the Sahara Desert in southern Tunisia contain endolithic colonies just below the rock surface. Previous optical observations verified the presence of microbial communities and, as described in this paper, spectral analyses in the visible region have led to the identification of chlorophylls belonging to photosynthetic bacteria. Spectral analyses in the infrared region have clearly detected the presence of gypsum and phyllosilicates (mainly illite and/or smectite), as well as traces of calcite, but not quartz. X-ray diffraction (XRD) analysis has identified the dominant presence of gypsum as well as that of other secondary minerals such as quartz, feldspars, and Mg-Al-rich phyllosilicates, such as chlorite, illite, and smectite. The occurrence of a small quantity of calcite in all the samples was also highlighted by the loss of CO2 in thermal analysis (TG-DTA). A normative calculation using XRD, thermal data, and X-ray fluorescence (XRF) analysis made it possible to obtain the concentrations of the minerals occurring in the samples. The combination of multiple techniques provides information about the mineralogy of the rocks and hence an indication of environments suitable for supporting microbial life on the Martian surface.

  15. Platform for Post-Processing Waveform-Based NDE

    NASA Technical Reports Server (NTRS)

    Roth, Don J.

    2010-01-01

    Signal- and image-processing methods are commonly needed to extract information from the waves, improve the resolution of an image, and highlight defects in it. Since some similarity exists among all waveform-based nondestructive evaluation (NDE) methods, a common software platform containing multiple signal- and image-processing techniques to process the waveforms and images makes sense where multiple techniques, scientists, engineers, and organizations are involved. NDE Wave & Image Processor Version 2.0 software provides a single, integrated signal- and image-processing and analysis environment for total NDE data processing and analysis. It brings some of the most useful algorithms developed for NDE over the past 20 years into a commercial-grade product. The software can import signal/spectroscopic data, image data, and image series data. It offers the user hundreds of basic and advanced signal- and image-processing capabilities, including esoteric 1D and 2D wavelet-based de-noising, de-trending, and filtering. Batch processing is included for signal- and image-processing capability so that an optimized sequence of processing operations can be applied to entire folders of signals, spectra, and images. Additionally, an extensive interactive model-based curve-fitting facility has been included to allow fitting of spectroscopy data such as from Raman spectroscopy. An extensive joint time-frequency module is included for analysis of non-stationary or transient data such as that from acoustic emission, vibration, or earthquake measurements.
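
    As one example of the signal-processing capabilities mentioned (wavelet-based de-noising), here is a generic 1-D sketch using the PyWavelets library; the wavelet, decomposition level, and universal threshold are common defaults, not settings taken from the NDE Wave & Image Processor.

```python
import numpy as np
import pywt

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)   # toy ultrasonic-like signal
noisy = clean + 0.3 * rng.standard_normal(t.size)

# Decompose, soft-threshold the detail coefficients, reconstruct.
coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))         # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: noisy.size]

print("rms error, noisy vs denoised:",
      round(float(np.sqrt(np.mean((noisy - clean) ** 2))), 3),
      round(float(np.sqrt(np.mean((denoised - clean) ** 2))), 3))
```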

  16. A Quantile Regression Approach to Understanding the Relations Between Morphological Awareness, Vocabulary, and Reading Comprehension in Adult Basic Education Students

    PubMed Central

    Tighe, Elizabeth L.; Schatschneider, Christopher

    2015-01-01

    The purpose of this study was to investigate the joint and unique contributions of morphological awareness and vocabulary knowledge at five reading comprehension levels in Adult Basic Education (ABE) students. We introduce the statistical technique of multiple quantile regression, which enabled us to assess the predictive utility of morphological awareness and vocabulary knowledge at multiple points (quantiles) along the continuous distribution of reading comprehension. To demonstrate the efficacy of our multiple quantile regression analysis, we compared and contrasted our results with a traditional multiple regression analytic approach. Our results indicated that morphological awareness and vocabulary knowledge accounted for a large portion of the variance (82-95%) in reading comprehension skills across all quantiles. Morphological awareness exhibited the greatest unique predictive ability at lower levels of reading comprehension whereas vocabulary knowledge exhibited the greatest unique predictive ability at higher levels of reading comprehension. These results indicate the utility of using multiple quantile regression to assess trajectories of component skills across multiple levels of reading comprehension. The implications of our findings for ABE programs are discussed. PMID:25351773
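
    A brief sketch of the multiple quantile regression idea using statsmodels' QuantReg: the same two predictors are fit at several quantiles of a simulated outcome whose error scale grows with one predictor, so the coefficients shift across quantiles. The data are simulated; the variable names only echo the study's constructs.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 400
morph = rng.normal(size=n)      # stand-in for morphological awareness (simulated)
vocab = rng.normal(size=n)      # stand-in for vocabulary knowledge (simulated)

# Heteroscedastic outcome, so predictor importance shifts across quantiles.
reading = 0.6 * morph + 0.4 * vocab + np.exp(0.5 * vocab) * rng.normal(size=n)

X = sm.add_constant(np.column_stack([morph, vocab]))
for q in (0.1, 0.25, 0.5, 0.75, 0.9):
    res = sm.QuantReg(reading, X).fit(q=q)
    print(f"q={q:.2f}: morph={res.params[1]:+.2f}, vocab={res.params[2]:+.2f}")
```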

  17. On-chip wavelength multiplexed detection of cancer DNA biomarkers in blood

    PubMed Central

    Cai, H.; Stott, M. A.; Ozcelik, D.; Parks, J. W.; Hawkins, A. R.; Schmidt, H.

    2016-01-01

    We have developed an optofluidic analysis system that processes biomolecular samples starting from whole blood and then analyzes and identifies multiple targets on a silicon-based molecular detection platform. We demonstrate blood filtration, sample extraction, target enrichment, and fluorescent labeling using programmable microfluidic circuits. We detect and identify multiple targets using a spectral multiplexing technique based on wavelength-dependent multi-spot excitation on an antiresonant reflecting optical waveguide chip. Specifically, we extract two types of melanoma biomarkers, mutated cell-free nucleic acids (BRAFV600E and NRAS), from whole blood. We detect and identify these two targets simultaneously using the spectral multiplexing approach with up to a 96% success rate. These results point the way toward a full front-to-back chip-based optofluidic compact system for high-performance analysis of complex biological samples. PMID:28058082

  18. Differentially Variable Component Analysis (dVCA): Identifying Multiple Evoked Components using Trial-to-Trial Variability

    NASA Technical Reports Server (NTRS)

    Knuth, Kevin H.; Shah, Ankoor S.; Truccolo, Wilson; Ding, Ming-Zhou; Bressler, Steven L.; Schroeder, Charles E.

    2003-01-01

    Electric potentials and magnetic fields generated by ensembles of synchronously active neurons in response to external stimuli provide information essential to understanding the processes underlying cognitive and sensorimotor activity. Interpreting recordings of these potentials and fields is difficult as each detector records signals simultaneously generated by various regions throughout the brain. We introduce the differentially Variable Component Analysis (dVCA) algorithm, which relies on trial-to-trial variability in response amplitude and latency to identify multiple components. Using simulations we evaluate the importance of response variability to component identification, the robustness of dVCA to noise, and its ability to characterize single-trial data. Finally, we evaluate the technique using visually evoked field potentials recorded at incremental depths across the layers of cortical area V1, in an awake, behaving macaque monkey.

  19. Application of stable isotope ratio analysis for biodegradation monitoring in groundwater

    USGS Publications Warehouse

    Hatzinger, Paul B.; Böhlke, John Karl; Sturchio, Neil C.

    2013-01-01

    Stable isotope ratio analysis is increasingly being applied as a tool to detect, understand, and quantify biodegradation of organic and inorganic contaminants in groundwater. An important feature of this approach is that it allows degradative losses of contaminants to be distinguished from those caused by non-destructive processes such as dilution, dispersion, and sorption. Recent advances in analytical techniques, and new approaches for interpreting stable isotope data, have expanded the utility of this method while also exposing complications and ambiguities that must be considered in data interpretations. Isotopic analyses of multiple elements in a compound, and multiple compounds in the environment, are being used to distinguish biodegradative pathways by their characteristic isotope effects. Numerical models of contaminant transport, degradation pathways, and isotopic composition are improving quantitative estimates of in situ contaminant degradation rates under realistic environmental conditions.
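
    The standard quantitative link between an isotope shift and degradative loss is the Rayleigh fractionation model; the sketch below uses assumed delta values and an assumed enrichment factor purely for illustration, not values from the cited work.

```python
import numpy as np

def fraction_degraded(delta0: float, delta: float, epsilon: float) -> float:
    """Estimate the biodegraded fraction from a Rayleigh model.

    delta0, delta: initial and measured isotope ratios (per mil)
    epsilon: isotopic enrichment factor (per mil), pathway-specific
    Rayleigh: (delta + 1000) / (delta0 + 1000) = f ** (epsilon / 1000),
    where f is the remaining fraction of the compound.
    """
    f = ((delta + 1000.0) / (delta0 + 1000.0)) ** (1000.0 / epsilon)
    return 1.0 - f

# Assumed example: delta13C rises from -27 to -20 per mil with an
# enrichment factor of -5 per mil (a plausible order of magnitude only).
print(f"fraction degraded ~ {fraction_degraded(-27.0, -20.0, -5.0):.2f}")
```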

  20. Lifting degeneracy in holographic characterization of colloidal particles using multi-color imaging.

    PubMed

    Ruffner, David B; Cheong, Fook Chiong; Blusewicz, Jaroslaw M; Philips, Laura A

    2018-05-14

    Micrometer-sized particles can be accurately characterized using holographic video microscopy and Lorenz-Mie fitting. In this work, we explore some of the limitations in holographic microscopy and introduce methods for increasing the accuracy of this technique with the use of multiple wavelengths of laser illumination. Large high-index particle holograms have near-degenerate solutions that can confuse standard fitting algorithms. Using a model based on diffraction from a phase disk, we explain the source of these degeneracies. We introduce multiple color holography as an effective approach to distinguish between degenerate solutions and provide improved accuracy for the holographic analysis of sub-visible colloidal particles.

  1. Laplace Transform Based Radiative Transfer Studies

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Lin, B.; Ng, T.; Yang, P.; Wiscombe, W.; Herath, J.; Duffy, D.

    2006-12-01

    Multiple scattering is the major uncertainty in data analysis of space-based lidar measurements. Until now, accurate quantitative lidar data analysis has been limited to very thin objects dominated by single scattering, where photons from the laser beam scatter only once off particles in the atmosphere before reaching the receiver and a simple linear relationship between physical properties and the lidar signal exists. In reality, multiple scattering is always a factor in space-based lidar measurement, and it dominates space-based lidar returns from clouds, dust aerosols, vegetation canopy, and phytoplankton. While multiple-scattering returns are clear signals, the lack of a fast-enough lidar multiple-scattering computation tool forces us to treat them as unwanted "noise" and use simple multiple-scattering correction schemes to remove them. Such treatments waste the multiple-scattering signals and may cause errors of orders of magnitude in retrieved physical properties. Thus the lack of fast and accurate time-dependent radiative transfer tools significantly limits lidar remote sensing capabilities. Analyzing lidar multiple-scattering signals requires fast and accurate time-dependent radiative transfer computations. Currently, multiple-scattering computations are done with Monte Carlo simulations, which take minutes to hours, are too slow for interactive satellite data analysis processes, and can only be used to help system/algorithm design and error assessment. We present an innovative physics approach to solve the time-dependent radiative transfer problem. The technique utilizes FPGA-based reconfigurable computing hardware. The approach is as follows. 1. Physics solution: perform a Laplace transform on the time and spatial dimensions and a Fourier transform on the viewing azimuth dimension, converting the solution of the radiative transfer differential equation into a fast matrix inversion problem; the majority of the radiative transfer computation then goes to matrix inversion, FFTs, and inverse Laplace transforms. 2. Hardware solution: perform the well-defined matrix inversions, FFTs, and Laplace transforms on highly parallel, reconfigurable computing hardware. This physics-based computational tool leads to accurate quantitative analysis of space-based lidar signals and improves the data quality of current lidar missions such as CALIPSO. This presentation will introduce the basic idea of this approach, preliminary results based on SRC's FPGA-based Mapstation, and how we may apply it to CALIPSO data analysis.

  2. Utilization of the Differential Die-Away Self-Interrogation Technique for Characterization and Verification of Spent Nuclear Fuel

    NASA Astrophysics Data System (ADS)

    Trahan, Alexis Chanel

    New nondestructive assay techniques are sought to better characterize spent nuclear fuel. One of the NDA instruments selected for possible deployment is differential die-away self-interrogation (DDSI). The proposed DDSI approach for spent fuel assembly assay utilizes primarily the spontaneous fission and (alpha, n) neutrons in the assemblies as an internal interrogating radiation source. The neutrons released in spontaneous fission or (alpha,n) reactions are thermalized in the surrounding water and induce fission in fissile isotopes, thereby creating a measurable signal from isotopes of interest that would be otherwise difficult to measure. The DDSI instrument employs neutron coincidence counting with 3He tubes and list-mode-based data acquisition to allow for production of Rossi-alpha distributions (RADs) in post-processing. The list-mode approach to data collection and subsequent construction of RADs has expanded the analytical possibilities, as will be demonstrated throughout this thesis. One of the primary advantages is that the measured signal in the form of a RAD can be analyzed in its entirety including determination of die-away times in different time domains. This capability led to the development of the early die-away method, a novel leakage multiplication determination method which is tested throughout the thesis on different sources in simulation space and fresh fuel experiments. The early die-away method is a robust, accurate, improved method of determining multiplication without the need for knowledge of the (alpha,n) source term. The DDSI technique and instrument are presented along with the many novel capabilities enabled by and discovered through RAD analysis. Among the new capabilities presented are the early die-away method, total plutonium content determination, and highly sensitive missing pin detection. Simulation of hundreds of different spent and fresh fuel assemblies were used to develop the analysis algorithms and the techniques were tested on a variety of spontaneous fission-driven fresh fuel assemblies at Los Alamos National Laboratory and the BeRP ball at the Nevada National Security Site. The development of the new, improved analysis and characterization methods with the DDSI instrument makes it a viable technique for implementation in a facility to meet material control and safeguards needs.

  3. MRI measurements of Blood-Brain Barrier function in dementia: A review of recent studies.

    PubMed

    Raja, Rajikha; Rosenberg, Gary A; Caprihan, Arvind

    2018-05-15

    The blood-brain barrier (BBB) separates the systemic circulation and the brain, regulating the transport of most molecules to protect the brain microenvironment. Multiple structural and functional components preserve the integrity of the BBB. Several imaging modalities are available to study disruption of the BBB. However, the subtle changes in BBB leakage that occur in vascular cognitive impairment and Alzheimer's disease have been less well studied. Dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is the most widely adopted non-invasive imaging technique for evaluating BBB breakdown. It is used as a significant marker for a wide variety of diseases, from those with large permeability leaks, such as brain tumors and multiple sclerosis, to the more subtle disruption in chronic vascular disease and dementia. DCE-MRI analysis of the BBB includes both model-free parameters and quantitative parameters derived from pharmacokinetic modelling. We review MRI studies of BBB breakdown in dementia. The challenges in measuring subtle BBB changes and the state-of-the-art techniques are examined first. Subsequently, a systematic review comparing methodologies from recent in-vivo MRI studies is presented. Various factors related to subtle BBB permeability measurement, such as DCE-MRI acquisition parameters, arterial input assessment, T1 mapping, and data analysis methods, are reviewed with a focus on finding the optimal technique. Finally, the reported BBB permeability values in dementia are compared across different studies and across various brain regions. We conclude that reliable measurement of low-level BBB permeability across sites remains a difficult problem and that a standardization of the methodology for both data acquisition and quantitative analysis is required. This article is part of the Special Issue entitled 'Cerebral Ischemia'. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Multiplex gas chromatography for use in space craft

    NASA Technical Reports Server (NTRS)

    Valentin, J. R.

    1985-01-01

    Gas chromatography is a powerful technique for the analysis of gaseous mixtures. Some limitations in this technique still exist which can be alleviated with multiplex gas chromatography (MGC). In MGC, rapid multiple sample injections are made into the column without having to wait for one determination to be finished before taking a new sample. The resulting data must then be reduced using computational methods such as cross-correlation. In order to efficiently perform multiplex gas chromatography experiments in the laboratory and on board future spacecraft, skills, equipment, and computer software were developed. Three new techniques for modulating, i.e., changing, sample concentrations were demonstrated using desorption, decomposition, and catalytic modulators. In all of them, the need for a separate gas stream as the carrier was avoided by placing the modulator at the head of the column to directly modulate a sample stream. Finally, the analysis of an environmental sample by multiplex chromatography was accomplished by employing silver oxide to catalytically modulate methane in ambient air.
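
    A toy numpy sketch of the multiplex principle: injections follow a binary sequence, the detector sees many overlapping chromatograms, and cross-correlation with the mean-removed sequence recovers the single-injection chromatogram. A random sequence stands in for a true m-sequence here, so the recovery is only approximate; peak positions and noise levels are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 255                                  # sequence length (an m-sequence would have N = 2**k - 1)
s = rng.integers(0, 2, N).astype(float)  # injection pattern (random stand-in for a PRBS)

# Single-injection chromatogram (impulse response): two Gaussian peaks (invented).
t = np.arange(N)
h = np.exp(-0.5 * ((t - 40) / 4.0) ** 2) + 0.6 * np.exp(-0.5 * ((t - 90) / 6.0) ** 2)

# Detector output: circular convolution of injections with the chromatogram,
# i.e., many overlapping chromatograms, plus detector noise.
y = np.real(np.fft.ifft(np.fft.fft(s) * np.fft.fft(h))) + 0.05 * rng.standard_normal(N)

# Cross-correlation recovery: correlating with the mean-removed sequence
# approximately inverts the modulation (exactly so for an ideal m-sequence).
s0 = s - s.mean()
h_hat = np.real(np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(s0)))) / (N * s.var())
print("true peaks at samples 40 and 90; recovered maximum near", int(np.argmax(h_hat)))
```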

  5. Covariance mapping of two-photon double core hole states in C2H2 and C2H6 produced by an x-ray free electron laser

    DOE PAGES

    Mucke, M; Zhaunerchyk, V; Frasinski, L J; ...

    2015-07-01

    Few-photon ionization and relaxation processes in acetylene (C2H2) and ethane (C2H6) were investigated at the Linac Coherent Light Source x-ray free electron laser (FEL) at SLAC, Stanford, using a highly efficient multi-particle correlation spectroscopy technique based on a magnetic bottle. The analysis method of covariance mapping has been applied and enhanced, allowing us to identify electron pairs associated with double core hole (DCH) production and competing multiple ionization processes, including Auger decay sequences. The experimental technique and the analysis procedure are discussed in the light of earlier investigations of DCH studies carried out at the same FEL and at third-generation synchrotron radiation sources. In particular, we demonstrate the capability of the covariance mapping technique to disentangle the formation of molecular DCH states, which is barely feasible with conventional electron spectroscopy methods.
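
    The core of covariance mapping is a shot-resolved covariance matrix over spectral bins, C(i, j) = <x_i x_j> - <x_i><x_j>; correlated emission (for example, an electron pair from one DCH event) shows up as an off-diagonal island. The sketch below uses invented Poisson data, not the experiment's spectra.

```python
import numpy as np

rng = np.random.default_rng(4)
shots, nbins = 5000, 64

# Toy electron spectra: each FEL shot yields counts in energy bins.  Two bins
# (assumed indices 10 and 42) receive counts from the same simulated events,
# mimicking an electron pair; the rest is uncorrelated background.
X = rng.poisson(1.0, size=(shots, nbins)).astype(float)
pair = rng.poisson(0.5, size=shots)        # events producing both electrons
X[:, 10] += pair
X[:, 42] += pair

# Covariance map over shots.
C = (X.T @ X) / shots - np.outer(X.mean(0), X.mean(0))

# The strongest off-diagonal element marks the correlated pair of bins.
i, j = np.unravel_index(np.argmax(C - np.diag(np.diag(C))), C.shape)
print("strongest off-diagonal island at bins:", i, j)
```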

  6. A numerical model for predicting crack path and modes of damage in unidirectional metal matrix composites

    NASA Technical Reports Server (NTRS)

    Bakuckas, J. G.; Tan, T. M.; Lau, A. C. W.; Awerbuch, J.

    1993-01-01

    A finite element-based numerical technique has been developed to simulate damage growth in unidirectional composites. This technique incorporates elastic-plastic analysis, micromechanics analysis, failure criteria, and a node splitting and node force relaxation algorithm to create crack surfaces. Any combination of fiber and matrix properties can be used. One of the salient features of this technique is that damage growth can be simulated without pre-specifying a crack path. In addition, multiple damage mechanisms in the forms of matrix cracking, fiber breakage, fiber-matrix debonding and plastic deformation are capable of occurring simultaneously. The prevailing failure mechanism and the damage (crack) growth direction are dictated by the instantaneous near-tip stress and strain fields. Once the failure mechanism and crack direction are determined, the crack is advanced via the node splitting and node force relaxation algorithm. Simulations of the damage growth process in center-slit boron/aluminum and silicon carbide/titanium unidirectional specimens were performed. The simulation results agreed quite well with the experimental observations.

  7. Integration of Scale Invariant Generator Technique and S-A Technique for Characterizing 2-D Patterns for Information Retrieve

    NASA Astrophysics Data System (ADS)

    Cao, L.; Cheng, Q.

    2004-12-01

    The scale invariant generator technique (SIG) and the spectrum-area analysis technique (S-A) were developed independently in connection with the concept of generalized scale invariance (GSI). The former was developed for characterizing the parameters involved in the GSI for characterizing and simulating multifractal measures, whereas the latter was for identifying scaling breaks for decomposition of superimposed multifractal measures caused by multiple geophysical processes. A natural integration of these two techniques may yield a new technique serving two purposes: on the one hand, it can enrich the power of S-A by increasing the interpretability of decomposed patterns in some applications of S-A; on the other hand, it can provide a means to test the uniqueness of the multifractality of measures, which is essential for applying the SIG technique in more complicated environments. The proposed technique has been implemented as a Dynamic Link Library (DLL) in Visual C++. The program can be readily used for method validation and application in different fields.

  8. Empirical validation of statistical parametric mapping for group imaging of fast neural activity using electrical impedance tomography.

    PubMed

    Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D

    2016-06-01

    Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is an approach for EIT images of neural activity.
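
    For reference, the non-parametric validation route mentioned above can be as simple as a sign-flip permutation test with a maximum-statistic null, which controls family-wise error across voxels without random field theory. The toy image set below is invented (22 images to echo the study, but random data), not the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(5)
n_images, n_vox = 22, 2000            # group images x voxels (toy stand-in)
data = rng.standard_normal((n_images, n_vox))
data[:, :50] += 0.8                   # a small patch of true activation

def t_stat(x: np.ndarray) -> np.ndarray:
    """One-sample t statistic per voxel against a mean of zero."""
    return x.mean(0) / (x.std(0, ddof=1) / np.sqrt(len(x)))

t_obs = t_stat(data)

# Null distribution of the maximum statistic under random sign flips
# (valid for a one-sample test of zero mean).
max_null = np.empty(1000)
for p in range(1000):
    signs = rng.choice([-1.0, 1.0], size=(n_images, 1))
    max_null[p] = t_stat(data * signs).max()

threshold = np.quantile(max_null, 0.95)   # FWE-corrected, p < 0.05
print("significant voxels:", int((t_obs > threshold).sum()))
```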

  9. 3D Tracking Based Augmented Reality for Cultural Heritage Data Management

    NASA Astrophysics Data System (ADS)

    Battini, C.; Landi, G.

    2015-02-01

    The development of contactless documentation techniques is allowing researchers to collect high volumes of three-dimensional data in a short time but with high levels of accuracy. The digitalisation of cultural heritage opens up the possibility of using image processing and analysis, and computer graphics techniques, to preserve this heritage for future generations; augmenting it with additional information or with new possibilities for its enjoyment and use. The collection of precise datasets about cultural heritage status is crucial for its interpretation, its conservation and during the restoration processes. The application of digital-imaging solutions for various feature extraction, image data-analysis techniques, and three-dimensional reconstruction of ancient artworks, allows the creation of multidimensional models that can incorporate information coming from heterogeneous data sets, research results and historical sources. Real objects can be scanned and reconstructed virtually, with high levels of data accuracy and resolution. Real-time visualisation software and hardware is rapidly evolving and complex three-dimensional models can be interactively visualised and explored on applications developed for mobile devices. This paper will show how a 3D reconstruction of an object, with multiple layers of information, can be stored and visualised through a mobile application that will allow interaction with a physical object for its study and analysis, using 3D Tracking based Augmented Reality techniques.

  10. Empirical validation of statistical parametric mapping for group imaging of fast neural activity using electrical impedance tomography

    PubMed Central

    Packham, B; Barnes, G; dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D

    2016-01-01

    Abstract: Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that such an analysis is an approach for EIT images of neural activity. PMID:27203477

  11. Repeated stool sampling and use of multiple techniques enhance the sensitivity of helminth diagnosis: a cross-sectional survey in southern Lao People's Democratic Republic.

    PubMed

    Sayasone, Somphou; Utzinger, Jürg; Akkhavong, Kongsap; Odermatt, Peter

    2015-01-01

    Intestinal parasitic infections are common in Lao People's Democratic Republic (Lao PDR). We investigated the accuracy of the Kato-Katz (KK) technique in relation to varying stool sampling efforts, and determined the effect of the concurrent use of a quantitative formalin-ethyl acetate concentration technique (FECT) on helminth diagnosis and the appraisal of concomitant infections. The study was carried out between March and May 2006 in Champasack province, southern Lao PDR. Overall, 485 individuals aged ≥6 months who provided three stool samples were included in the final analysis. All stool samples were subjected to the KK technique. Additionally, one stool sample per individual was processed by FECT. Diagnosis was done under a light microscope by experienced laboratory technicians. Analysis of three stool samples with KK plus a single FECT was considered the diagnostic 'gold' standard and resulted in prevalence estimates of hookworm, Opisthorchis viverrini, Ascaris lumbricoides, Trichuris trichiura and Schistosoma mekongi infection of 77.9%, 65.0%, 33.4%, 26.2% and 24.3%, respectively. As expected, a single KK and a single FECT missed a considerable number of infections. While our diagnostic 'gold' standard produced results similar to those obtained with a mathematical model for most helminth infections, the 'true' prevalence predicted by the model for S. mekongi (28.1%) was somewhat higher than that from multiple KK plus a single FECT (24.3%). In the current setting, triplicate KK plus a single FECT diagnosed helminth infections with high sensitivity. Hence, such a diagnostic approach might be utilised for generating high-quality baseline data, assessing anthelminthic drug efficacy and rigorous monitoring of community interventions. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

    As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.

  13. Analysis Test of Understanding of Vectors with the Three-Parameter Logistic Model of Item Response Theory and Item Response Curves Technique

    ERIC Educational Resources Information Center

    Rakkapao, Suttida; Prasitpong, Singha; Arayathanitkul, Kwan

    2016-01-01

    This study investigated the multiple-choice test of understanding of vectors (TUV), by applying item response theory (IRT). The difficulty, discriminatory, and guessing parameters of the TUV items were fit with the three-parameter logistic model of IRT, using the parscale program. The TUV ability is an ability parameter, here estimated assuming…
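
    For readers unfamiliar with the model, the three-parameter logistic item response function is P(correct | theta) = c + (1 - c) / (1 + exp(-a(theta - b))); the sketch below evaluates it for one invented item (the parameters are not TUV calibrations).

```python
import numpy as np

def p_correct(theta, a, b, c):
    """Three-parameter logistic (3PL) item response function.

    theta: ability; a: discrimination; b: difficulty; c: guessing floor.
    """
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

# Illustrative item: moderate discrimination, average difficulty, and a
# guessing floor of 0.2 as might be plausible for multiple-choice items.
theta = np.linspace(-3, 3, 7)
print(p_correct(theta, a=1.2, b=0.0, c=0.2).round(2))
```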

  14. A Multiple Case Study Analysis of Digital Preservation Techniques Across Government, Private, and Public Service Organizations

    DTIC Science & Technology

    2008-03-01

    capture archived encoded data. A black basalt slab with strange inscriptions on it, the Rosetta Stone was unearthed in July 1799 by Napoleon’s army...the bitstream. When you start thinking about 1s and 0s and when you start thinking about fiber optics, its just either on or off. So there is a

  15. The Role of Efficient XML Interchange (EXI) in Navy Wide-Area Network (WAN) Optimization

    DTIC Science & Technology

    2015-03-01

    compress, and re-encrypt data to continue providing optimization through compression; however, that capability requires careful consideration of...optimization of encrypted data requires a careful analysis and comparison of performance improvements and IA vulnerabilities. It is important...Contained EXI capitalizes on multiple techniques to improve compression, and they vary depending on a set of EXI options passed to the codec

  16. Executive Order 12898 and Social, Economic, and Sociopolitical Factors Influencing Toxic Release Inventory Facility Location in EPA Region 6: A Multi-Scale Spatial Assessment of Environmental Justice

    ERIC Educational Resources Information Center

    Moore, Andrea Lisa

    2013-01-01

    Toxic Release Inventory facilities are among the many environmental hazards shown to create environmental inequities in the United States. This project examined four factors associated with Toxic Release Inventory, specifically, manufacturing facility location at multiple spatial scales using spatial analysis techniques (i.e., O-ring statistic and…

  17. Software for Automated Image-to-Image Co-registration

    NASA Technical Reports Server (NTRS)

    Benkelman, Cody A.; Hughes, Heidi

    2007-01-01

    The project objectives are: a) Develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) Create a reusable software development kit (SDK) to enable incorporation of these tools into other software; d) provide automated testing for quantitative analysis; and e) Develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.

  18. Space vehicle engine and heat shield environment review. Volume 1: Engineering analysis

    NASA Technical Reports Server (NTRS)

    Mcanelly, W. B.; Young, C. T. K.

    1973-01-01

    Methods for predicting the base heating characteristics of a multiple rocket engine installation are discussed. The environmental data are applied to the design of an adequate protection system for the engine components. The methods for predicting the base region thermal environment are categorized as: (1) scale model testing, (2) extrapolation of previous and related flight test results, and (3) semiempirical analytical techniques.

  19. Use of Forest Inventory and Analysis information in wildlife habitat modeling: a process for linking multiple scales

    Treesearch

    Thomas C. Edwards; Gretchen G. Moisen; Tracey S. Frescino; Joshua L. Lawler

    2002-01-01

    We describe our collective efforts to develop and apply methods for using FIA data to model forest resources and wildlife habitat. Our work demonstrates how flexible regression techniques, such as generalized additive models, can be linked with spatially explicit environmental information for the mapping of forest type and structure. We illustrate how these maps of...

  20. Investigation of advanced phase-shifting projected fringe profilometry techniques

    NASA Astrophysics Data System (ADS)

    Liu, Hongyu

    1999-11-01

    The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with competing techniques, it is notable for its full-field measurement capability, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle, with some new approaches, three important problems that severely limit the capability and the accuracy of the PSPFP technique. Chapter 1 briefly introduces background information on the PSPFP technique, including measurement principles, basic features, and related techniques; the objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the preceding theoretical analysis. Various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents a theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and the system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on system design are given to improve measurement accuracy. Chapter 5 discusses a new technique for combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced. Some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
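
    The core PSPFP computation is compact enough to sketch: with four fringe images phase-shifted by pi/2, the wrapped phase follows from an arctangent and is then unwrapped. The synthetic one-dimensional check below is illustrative only, not the dissertation's prototype system.

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four fringe images shifted by pi/2 each.

    I_k = A + B * cos(phi + (k - 1) * pi / 2)  =>  phi = atan2(I4 - I2, I1 - I3).
    """
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic check: a known phase ramp is recovered up to 2*pi wrapping.
phi = np.linspace(0, 6 * np.pi, 500)
A, B = 1.0, 0.7
frames = [A + B * np.cos(phi + k * np.pi / 2) for k in range(4)]
wrapped = four_step_phase(*frames)
unwrapped = np.unwrap(wrapped)
print("max recovery error:", float(np.abs(unwrapped - phi).max()))
```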

  1. Assessment of apically extruded debris produced by the single-file ProTaper F2 technique under reciprocating movement.

    PubMed

    De-Deus, Gustavo; Brandão, Maria Claudia; Barino, Bianca; Di Giorgi, Karina; Fidel, Rivail Antonio Sergio; Luna, Aderval Severino

    2010-09-01

    This study was designed to quantitatively evaluate the amount of dentin debris extruded from the apical foramen by comparing the conventional sequence of the ProTaper Universal nickel-titanium (NiTi) files with the single-file ProTaper F2 technique. Thirty mesial roots of lower molars were selected, and the use of different instrumentation techniques resulted in 3 groups (n=10 each). In G1, a crown-down hand-file technique was used; in G2, the conventional ProTaper Universal technique was used; and in G3, the ProTaper F2 file was used in a reciprocating motion. The apical finish preparation was equivalent to ISO size 25. An apparatus was used to evaluate the apically extruded debris. Statistical analysis was performed using 1-way analysis of variance and Tukey multiple comparisons. No significant difference was found in the amount of debris extruded between the conventional sequence of the ProTaper Universal NiTi files and the single-file ProTaper F2 technique (P>.05). In contrast, the hand instrumentation group extruded significantly more debris than both NiTi groups (P<.05). The present results yielded favorable input for the F2 single-file technique in terms of apically extruded debris, inasmuch as it is the simplest and most cost-effective instrumentation approach. Copyright (c) 2010 Mosby, Inc. All rights reserved.
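    For readers unfamiliar with the statistics used above, the following sketch runs a one-way ANOVA followed by Tukey multiple comparisons on hypothetical debris-mass data for three groups of ten; the numbers are invented for illustration and are not the study's measurements.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
# Hypothetical debris masses (mg) for the three groups (n = 10 each).
g1_hand = rng.normal(1.10, 0.20, 10)      # crown-down hand files
g2_protaper = rng.normal(0.60, 0.15, 10)  # conventional ProTaper sequence
g3_f2_recip = rng.normal(0.55, 0.15, 10)  # single-file F2, reciprocating

# One-way ANOVA across the three groups.
f_stat, p_val = stats.f_oneway(g1_hand, g2_protaper, g3_f2_recip)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

# Tukey HSD identifies which pairwise differences drive the ANOVA result.
values = np.concatenate([g1_hand, g2_protaper, g3_f2_recip])
labels = ["G1"] * 10 + ["G2"] * 10 + ["G3"] * 10
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```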

  2. Eversion Technique to Prevent Biliary Stricture After Living Donor Liver Transplantation in the Universal Minimal Hilar Dissection Era.

    PubMed

    Ikegami, Toru; Shimagaki, Tomonari; Kawasaki, Junji; Yoshizumi, Tomoharu; Uchiyama, Hideaki; Harada, Noboru; Harimoto, Norifumi; Itoh, Shinji; Soejima, Yuji; Maehara, Yoshihiko

    2017-01-01

    Biliary anastomosis stricture (BAS) remains among the major concerns after living donor liver transplantation (LDLT), even after technical refinements including the universal use of the blood flow-preserving hilar dissection technique. The aim of this study was to investigate which factors are still associated with BAS after LDLT. An analysis of 279 adult-to-adult LDLT grafts (left lobe, n = 161; right lobe, n = 118) with duct-to-duct biliary reconstruction, performed since the universal application of the minimal hilar dissection technique and the gradual introduction of the eversion technique, was conducted. There were 39 patients with BAS. Univariate analysis showed that a right lobe graft (P = 0.008), multiple bile ducts (P < 0.001), ductoplasty (P < 0.001), not using the eversion technique (P = 0.004) and fewer biliary stents than bile duct orifices (P = 0.002) were among the factors associated with BAS. The 1-year and 5-year BAS rates were 17.7% and 21.2% in the noneversion group (n = 134), and 6.2% and 7.9% in the eversion group (n = 145), respectively (P = 0.002). The perioperative factors, including graft biliary anatomy, were not different between everted (n = 145) and noneverted (n = 134) patients. The application of the eversion technique together with the minimal hilar dissection technique could be a key to preventing BAS in duct-to-duct biliary reconstruction in LDLT.

  3. A Multi-criteria Decision Analysis System for Prioritizing Sites and Types of Low Impact Development Practices

    NASA Astrophysics Data System (ADS)

    Song, Jae Yeol; Chung, Eun-Sung

    2017-04-01

    This study developed a multi-criteria decision analysis framework to prioritize sites and types of low impact development (LID) practices. This framework was systemized as a web-based system coupled with the Storm Water Management Model (SWMM) from the Environmental Protection Agency (EPA). Using the technique for order of preference by similarity to ideal solution (TOPSIS), which is a type of multi-criteria decision-making (MCDM) method, multiple types and sites of designated LID practices are prioritized. This system is named the Water Management Prioritization Module (WMPM) and is an improved version of the Water Management Analysis Module (WMAM) that automatically generates and simulates multiple scenarios of LID design and planning parameters for a single LID type. WMPM can simultaneously determine the priority of multiple LID types and sites. In this study, an infiltration trench and permeable pavement were considered for multiple sub-catchments in South Korea to demonstrate the WMPM procedures. The TOPSIS method was manually incorporated to select the vulnerable target sub-catchments and to prioritize the LID planning scenarios for multiple types and sites considering socio-economic, hydrologic and physical-geometric factors. In this application, the Delphi method and entropy theory were used to determine the subjective and objective weights, respectively. Comparing the ranks derived by this system, two sub-catchments, S16 and S4, out of 18 were considered to be the most suitable places for installing an infiltration trench and porous pavement to reduce the peak and total flow, respectively, considering both socio-economic factors and hydrological effectiveness. WMPM can help policy-makers to objectively develop urban water plans for sustainable development. Keywords: Low Impact Development, Multi-Criteria Decision Analysis, SWMM, TOPSIS, Water Management Prioritization Module (WMPM)
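    A minimal sketch of the TOPSIS ranking step described above, assuming a small hypothetical decision matrix of sub-catchment scores; the criteria, weights, and values are invented and do not correspond to the actual WMPM inputs.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix  : (alternatives x criteria) decision matrix
    weights : criterion weights summing to 1
    benefit : True where larger is better, False where smaller is better
    """
    m = np.asarray(matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    v = weights * m / np.linalg.norm(m, axis=0)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)   # relative closeness: higher = better

# Hypothetical sub-catchment scores: [peak-flow reduction, total-flow
# reduction, cost]; cost is a "smaller is better" criterion.
scores = [[0.30, 0.25, 120.0],
          [0.22, 0.31,  90.0],
          [0.18, 0.12,  60.0]]
closeness = topsis(scores, np.array([0.4, 0.4, 0.2]),
                   np.array([True, True, False]))
print(np.argsort(closeness)[::-1])  # best-to-worst ordering
```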

  4. Technique of nonvascularized toe phalangeal transfer and distraction lengthening in the treatment of multiple digit symbrachydactyly.

    PubMed

    Netscher, David T; Lewis, Eric V

    2008-06-01

    A combination of nonvascularized multiple toe phalangeal transfers, web space deepening, and distraction lengthening may provide excellent function in the child born with the oligodactylous type of symbrachydactyly. These techniques may reconstruct multiple digits, maintaining a wide and stable grip span with good prehension to the thumb. We detail the techniques of each of these 3 stages in reconstruction and describe appropriate patient selection. Potential complications are discussed. However, with strict attention to technical details, these complications can be minimized.

  5. Indirect fabrication of multiple post-and-core patterns with a vinyl polysiloxane matrix.

    PubMed

    Sabbak, Sahar Asaad

    2002-11-01

    In the described technique, a vinyl polysiloxane material is used as a matrix for the indirect fabrication of multiple custom-cast posts and cores. The matrix technique enables the clinician to fabricate multiple posts and cores in a short period of time. The form, harmony, and common axis of preparation for all cores are well controlled before the definitive crown/fixed partial denture restorations are fabricated. Oral tissues are not exposed to the heat of polymerization or the excess monomer of the resin material when this technique is used.

  6. Material property analytical relations for the case of an AFM probe tapping a viscoelastic surface containing multiple characteristic times

    PubMed Central

    López-Guerra, Enrique A

    2017-01-01

    We explore the contact problem of a flat-end indenter penetrating intermittently a generalized viscoelastic surface, containing multiple characteristic times. This problem is especially relevant for nanoprobing of viscoelastic surfaces with the highly popular tapping-mode AFM imaging technique. By focusing on the material perspective and employing a rigorous rheological approach, we deliver analytical closed-form solutions that provide physical insight into the viscoelastic sources of repulsive forces, tip–sample dissipation and virial of the interaction. We also offer a systematic comparison to the well-established standard harmonic excitation, which is the case relevant for dynamic mechanical analysis (DMA) and for AFM techniques where tip–sample sinusoidal interaction is permanent. This comparison highlights the substantial complexity added by the intermittent-contact nature of the interaction, which precludes the derivation of straightforward equations as is the case for the well-known harmonic excitations. The derivations offered have been thoroughly validated through numerical simulations. Despite the complexities inherent to the intermittent-contact nature of the technique, the analytical findings highlight the potential feasibility of extracting meaningful viscoelastic properties with this imaging method. PMID:29114450
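    As a rough illustration of a viscoelastic material with multiple characteristic times, the sketch below evaluates a generalized Maxwell (Prony series) relaxation modulus; the parameters are hypothetical, and the code does not reproduce the paper's closed-form tapping-mode solutions.

```python
import numpy as np

# Generalized Maxwell (Prony series) relaxation modulus:
#   G(t) = G_inf + sum_i G_i * exp(-t / tau_i)
# Hypothetical parameters: an equilibrium modulus plus three arms with
# distinct characteristic times (all values illustrative).
G_inf = 1.0e6                           # Pa
G_i = np.array([5.0e6, 2.0e6, 1.0e6])   # Pa
tau_i = np.array([1e-6, 1e-4, 1e-2])    # s

def relaxation_modulus(t):
    t = np.atleast_1d(t)
    return G_inf + np.sum(G_i * np.exp(-t[:, None] / tau_i), axis=1)

# For a linear viscoelastic solid under a step strain eps0 at t = 0,
# the stress is simply sigma(t) = eps0 * G(t).
t = np.logspace(-7, 0, 8)
print(relaxation_modulus(t) * 1e-3)     # stress per unit strain, kPa
```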

  7. Value-cell bar charts for visualizing large transaction data sets.

    PubMed

    Keim, Daniel A; Hao, Ming C; Dayal, Umeshwar; Lyons, Martha

    2007-01-01

    One of the common problems businesses need to solve is how to use large volumes of sales histories, Web transactions, and other data to understand the behavior of their customers and increase their revenues. Bar charts are widely used for daily analysis, but only show highly aggregated data. Users often need to visualize detailed multidimensional information reflecting the health of their businesses. In this paper, we propose an innovative visualization solution based on the use of value cells within bar charts to represent business metrics. The value of a transaction can be discretized into one or multiple cells: high-value transactions are mapped to multiple value cells, whereas many small-value transactions are combined into one cell. With value-cell bar charts, users can 1) visualize transaction value distributions and correlations, 2) identify high-value transactions and outliers at a glance, and 3) instantly display values at the transaction record level. Value-Cell Bar Charts have been applied with success to different sales and IT service usage applications, demonstrating the benefits of the technique over traditional charting techniques. A comparison with two variants of the well-known Treemap technique and our earlier work on Pixel Bar Charts is also included.
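    The core discretization idea can be sketched as follows: each transaction contributes cells of a fixed monetary size, so high-value transactions span several cells while small transactions pool into shared cells. This is a simplified reading of the technique (the cell value and pooling rule are assumptions), not the authors' implementation.

```python
def to_value_cells(transactions, cell_value):
    """Discretize transaction values into value cells.

    A transaction worth k * cell_value fills k cells (any remainder below
    one cell is dropped in this sketch); transactions smaller than one
    cell are pooled until they fill a shared cell. Returns a list of
    cells, each a list of contributing transaction ids.
    """
    cells, pool, pooled = [], [], 0.0
    for tid, value in transactions:
        if value >= cell_value:
            for _ in range(int(value // cell_value)):
                cells.append([tid])          # high-value: one cell each
        else:
            pool.append(tid)
            pooled += value
            if pooled >= cell_value:         # small values share a cell
                cells.append(pool)
                pool, pooled = [], 0.0
    if pool:
        cells.append(pool)                   # residual partial cell
    return cells

print(to_value_cells([("t1", 250.0), ("t2", 40.0), ("t3", 70.0)], 100.0))
```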

  8. Neutron coincidence counting based on time interval analysis with one- and two-dimensional Rossi-alpha distributions: an application for passive neutron waste assay

    NASA Astrophysics Data System (ADS)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.

    1996-02-01

    Neutron coincidence counting is commonly used for the non-destructive assay of plutonium bearing waste or for safeguards verification measurements. A major drawback of conventional coincidence counting is that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass (240Pu-eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste, however, may have quite different matrices and source distributions compared to the calibration samples, which often biases the assay result. This paper presents a new neutron multiplicity sensitive coincidence counting technique including an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are intended to be measured by a PC-based fast time interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu-eq mass. The presented theory, referred to as Time Interval Analysis (TIA), is complementary to the Time Correlation Analysis (TCA) theories developed in the past, but is much simpler from the theoretical point of view and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency-dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
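    A one-dimensional Rossi-alpha distribution is simple to build from a list of pulse arrival times: every pulse serves as a trigger, and the delays to all subsequent pulses within a fixed window are histogrammed. The sketch below does this for a synthetic pulse train with injected correlated pairs; it illustrates the principle only and omits the two-dimensional (triplet-triggered) distribution and the factorial-moment analysis of the paper.

```python
import numpy as np

def rossi_alpha(timestamps, window, nbins):
    """One-dimensional Rossi-alpha distribution from a pulse train.

    For every pulse taken as a trigger, histogram the arrival times of
    all subsequent pulses that fall within `window` of the trigger.
    """
    timestamps = np.sort(timestamps)
    dt = []
    for i, t0 in enumerate(timestamps):
        j = i + 1
        while j < len(timestamps) and timestamps[j] - t0 < window:
            dt.append(timestamps[j] - t0)
            j += 1
    return np.histogram(dt, bins=nbins, range=(0.0, window))

# Hypothetical pulse train: Poisson background plus correlated pairs
# mimicking fission chains (all times in microseconds).
rng = np.random.default_rng(0)
bg = np.cumsum(rng.exponential(50.0, 20000))           # accidentals
parents = bg[::20]
correlated = parents + rng.exponential(5.0, parents.size)  # die-away ~5 us
counts, edges = rossi_alpha(np.concatenate([bg, correlated]), 64.0, 32)
print(counts[:8])  # excess at short delays reveals real coincidences
```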

  9. On reliable time-frequency characterization and delay estimation of stimulus frequency otoacoustic emissions

    NASA Astrophysics Data System (ADS)

    Biswal, Milan; Mishra, Srikanta

    2018-05-01

    The limited information on the origin and nature of stimulus frequency otoacoustic emissions (SFOAEs) necessitates a thorough reexamination of SFOAE analysis procedures, which will lead to a better understanding of SFOAE generation. The SFOAE response waveform in the time domain can be interpreted as a summation of amplitude-modulated and frequency-modulated component waveforms, and the efficiency of a technique in segregating these components is critical for describing the nature of SFOAEs. Recent advancements in robust time-frequency analysis algorithms claim more accurate extraction of such components from composite signals buried in noise. However, their potential has not been fully explored for SFOAE analysis, and because each technique is sensitive to different signal features, the choice of technique may affect the scientific conclusions. This paper attempts to bridge this gap in the literature by evaluating the performance of three linear time-frequency analysis algorithms, the short-time Fourier transform (STFT), the continuous wavelet transform (CWT), and the S-transform (ST), and two nonlinear algorithms, the Hilbert-Huang transform (HHT) and the synchrosqueezed wavelet transform (SWT). We revisit the extraction of constituent components and the estimation of their magnitude and delay, carefully evaluating the impact of variations in the analysis parameters. The performance of the HHT and SWT, from the perspective of time-frequency filtering and delay estimation, was found to be relatively less efficient for analyzing SFOAEs. The intrinsic mode functions of the HHT do not completely characterize the reflection components, and hence IMF-based filtering alone is not recommended for segregating the principal emission from multiple reflection components. We found the STFT, CWT, and ST to be suitable for canceling multiple internal reflection components with marginal alteration of the SFOAE.
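    As a minimal illustration of one of the linear techniques evaluated above, the sketch below applies the STFT to a synthetic two-component signal and reads off the latency of the dominant component; the signal and parameters are invented and are not SFOAE recordings.

```python
import numpy as np
from scipy.signal import stft

# Hypothetical SFOAE-like composite: two components with different
# latencies, buried in noise (not real OAE data).
fs = 44100.0
t = np.arange(0, 0.05, 1 / fs)
principal = np.sin(2 * np.pi * 2000 * t) * np.exp(-((t - 0.010) / 0.004) ** 2)
reflection = 0.4 * np.sin(2 * np.pi * 2000 * t) * np.exp(-((t - 0.025) / 0.004) ** 2)
x = principal + reflection + 0.05 * np.random.default_rng(3).standard_normal(t.size)

# Short-time Fourier transform; the window length trades time resolution
# against frequency resolution, a key analysis parameter in the paper.
f, tau, Z = stft(x, fs=fs, nperseg=256, noverlap=192)
band = np.argmin(np.abs(f - 2000.0))
print(tau[np.argmax(np.abs(Z[band]))])  # latency of the dominant component
```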

  10. Structural and chemical ordering of Heusler CoxMnyGez epitaxial films on Ge (111): Quantitative study using traditional and anomalous x-ray diffraction techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, B. A.; Chu, Y. S.; He, L.

    2015-12-01

    Epitaxial films of CoxMnyGez grown on Ge (111) substrates by molecular-beam-epitaxy techniques have been investigated as a continuous function of composition using combinatorial synchrotron x-ray diffraction (XRD) and x-ray fluorescence (XRF) spectroscopy techniques. A high-resolution ternary epitaxial phase diagram is obtained, revealing a small number of structural phases stabilized over large compositional regions. Ordering of the constituent elements in the compositional region near the full Heusler alloy Co2MnGe has been examined in detail using both traditional XRD and a new multiple-edge anomalous diffraction (MEAD) technique. Multiple-edge anomalous diffraction involves analyzing the energy dependence of multiple reflections across each constituent absorption edge in order to detect and quantify the elemental distribution of occupation in specific lattice sites. Results of this paper show that structural and chemical ordering are very sensitive to the Co:Mn atomic ratio, such that the ordering is the highest at an atomic ratio of 2 but significantly reduced even a few percent off this ratio. The in-plane lattice is nearly coherent with that of the Ge substrate, while the approximately 2% lattice mismatch is accommodated by the out-of-plane tetragonal strain. The quantitative MEAD analysis further reveals no detectable amount (< 0.5%) of Co-Mn site swapping, but instead high levels (26%) of Mn-Ge site swapping. Increasing Ge concentration above the Heusler stoichiometry (Co0.5Mn0.25Ge0.25) is shown to correlate with increased lattice vacancies, antisites, and stacking faults, but reduced lattice relaxation. The highest degree of chemical ordering is observed off the Heusler stoichiometry with a Ge enrichment of 5 at.%.

  11. Monitoring and Management of a Sensitive Resource: A Landscape-level Approach with Amphibians

    DTIC Science & Technology

    2001-03-01

    Results show that each technique is effective for a portion of the amphibian community and that the use of multiple techniques is essential to any...combinations of species. These results show that multiple techniques are needed for a full assessment of amphibian populations and communities at...against which future assessments of amphibian populations and communities on each installation can be evaluated. The standardized techniques used in FY

  12. Fan fault diagnosis based on symmetrized dot pattern analysis and image matching

    NASA Astrophysics Data System (ADS)

    Xu, Xiaogang; Liu, Haixiao; Zhu, Hao; Wang, Songling

    2016-07-01

    To detect the mechanical failure of fans, a new diagnostic method based on the symmetrized dot pattern (SDP) analysis and image matching is proposed. Vibration signals of 13 kinds of running states are acquired on a centrifugal fan test bed and reconstructed by the SDP technique. The SDP pattern templates of each running state are established. An image matching method is performed to diagnose the fault. In order to improve the diagnostic accuracy, the single template, multiple templates and clustering fault templates are used to perform the image matching.

  13. Study of process variables associated with manufacturing hermetically-sealed nickel-cadmium cells

    NASA Technical Reports Server (NTRS)

    Miller, L.; Doan, D. J.; Carr, E. S.

    1971-01-01

    A program to determine and study the critical process variables associated with the manufacture of aerospace, hermetically-sealed, nickel-cadmium cells is described. The determination and study of the process variables associated with the positive and negative plaque impregnation/polarization process are emphasized. The experimental data resulting from the implementation of fractional factorial design experiments are analyzed by means of a linear multiple regression analysis technique. This analysis permits the selection of preferred levels for certain process variables to achieve desirable impregnated plaque characteristics.
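    The regression step can be sketched in a few lines: with coded factor levels from a fractional factorial design, ordinary least squares yields the factor coefficients, whose magnitudes and signs indicate the critical process variables and their preferred levels. The factors and responses below are hypothetical, not the program's data.

```python
import numpy as np

# Hypothetical fractional-factorial results: coded levels (-1/+1) for
# temperature, current density, and bath concentration, with a measured
# plaque loading as the response. All values are illustrative.
X = np.array([[-1, -1, -1],
              [+1, -1, +1],
              [-1, +1, +1],
              [+1, +1, -1]], dtype=float)
y = np.array([1.52, 1.71, 1.65, 1.80])

# Linear multiple regression via least squares: y = b0 + X @ b.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(dict(zip(["b0", "temp", "current", "conc"], coef.round(4))))
# Factors with the largest |coefficient| are the critical process
# variables; the coefficient signs point to the preferred levels.
```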

  14. Spontaneous abortion in multiple pregnancy: focus on fetal pathology.

    PubMed

    Joó, József Gábor; Csaba, Ákos; Szigeti, Zsanett; Rigó, János

    2012-08-15

    Multiple pregnancy, with its wide array of medical consequences, represents an important condition in obstetrics. We performed perinatal autopsy in 49 cases of spontaneous abortion resulting from multiple pregnancies during the study period. Twenty-seven of the 44 twin pregnancies ending in miscarriage were conceived naturally, whereas 17 were conceived through assisted reproductive techniques. Each of the 5 triplet pregnancies ending in miscarriage was conceived through assisted reproductive techniques. There was a positive history of miscarriage in 22.4% of the cases. Monochorial placentation occurred more commonly in multiple pregnancies terminating in miscarriage than in multiple pregnancies without miscarriage. A fetal congenital malformation was found in 8 cases; 3 of these cases were conceived through assisted reproductive techniques, and 5 were conceived naturally. Miscarriage was due to intrauterine infection in 36% of the cases. Our study confirms that spontaneous abortion is more common in multiple than in singleton pregnancies. Monochorial placentation predicted higher fetal morbidity and mortality. In pregnancies where all fetuses were male, miscarriage was more common than in pregnancies where all fetuses were female. Assisted reproductive techniques do not predispose to the development of fetal malformations. Copyright © 2012 Elsevier GmbH. All rights reserved.

  15. Data extraction for complex meta-analysis (DECiMAL) guide.

    PubMed

    Pedder, Hugo; Sarri, Grammati; Keeney, Edna; Nunes, Vanessa; Dias, Sofia

    2016-12-13

    As more complex meta-analytical techniques such as network and multivariate meta-analyses become increasingly common, further pressures are placed on reviewers to extract data in a systematic and consistent manner. Failing to do this appropriately wastes time, resources and jeopardises accuracy. This guide (data extraction for complex meta-analysis (DECiMAL)) suggests a number of points to consider when collecting data, primarily aimed at systematic reviewers preparing data for meta-analysis. Network meta-analysis (NMA), multiple outcomes analysis and analysis combining different types of data are considered in a manner that can be useful across a range of data collection programmes. The guide has been shown to be both easy to learn and useful in a small pilot study.

  16. Network meta-analysis: an introduction for pharmacists.

    PubMed

    Xu, Yina; Amiche, Mohamed Amine; Tadrous, Mina

    2018-05-21

    Network meta-analysis is a new tool used to summarize and compare studies for multiple interventions, irrespective of whether these interventions have been directly evaluated against each other. Network meta-analysis is quickly becoming the standard in conducting therapeutic reviews and clinical guideline development. However, little guidance is available to help pharmacists review network meta-analysis studies in their practice. Major institutions such as the Cochrane Collaboration, Agency for Healthcare Research and Quality, Canadian Agency for Drugs and Technologies in Health, and National Institute for Health and Care Excellence Decision Support Unit have endorsed utilizing network meta-analysis to establish therapeutic evidence and inform decision making. Our objective is to introduce this novel technique to pharmacy practitioners, and highlight key assumptions behind network meta-analysis studies.

  17. Phylogenetic analysis of metastatic progression in breast cancer using somatic mutations and copy number aberrations

    PubMed Central

    Brown, David; Smeets, Dominiek; Székely, Borbála; Larsimont, Denis; Szász, A. Marcell; Adnet, Pierre-Yves; Rothé, Françoise; Rouas, Ghizlane; Nagy, Zsófia I.; Faragó, Zsófia; Tőkés, Anna-Mária; Dank, Magdolna; Szentmártoni, Gyöngyvér; Udvarhelyi, Nóra; Zoppoli, Gabriele; Pusztai, Lajos; Piccart, Martine; Kulka, Janina; Lambrechts, Diether; Sotiriou, Christos; Desmedt, Christine

    2017-01-01

    Several studies using genome-wide molecular techniques have reported various degrees of genetic heterogeneity between primary tumours and their distant metastases. However, it has been difficult to discern patterns of dissemination owing to the limited number of patients and available metastases. Here, we use phylogenetic techniques on data generated using whole-exome sequencing and copy number profiling of primary and multiple-matched metastatic tumours from ten autopsied patients to infer the evolutionary history of breast cancer progression. We observed two modes of disease progression. In some patients, all distant metastases cluster on a branch separate from their primary lesion. Clonal frequency analyses of somatic mutations show that the metastases have a monoclonal origin and descend from a common ‘metastatic precursor’. Alternatively, multiple metastatic lesions are seeded from different clones present within the primary tumour. We further show that a metastasis can be horizontally cross-seeded. These findings provide insights into breast cancer dissemination. PMID:28429735

  18. A robust and scalable neuromorphic communication system by combining synaptic time multiplexing and MIMO-OFDM.

    PubMed

    Srinivasa, Narayan; Zhang, Deying; Grigorian, Beayna

    2014-03-01

    This paper describes a novel architecture for enabling robust and efficient neuromorphic communication. The architecture combines two concepts: 1) synaptic time multiplexing (STM) that trades space for speed of processing to create an intragroup communication approach that is firing rate independent and offers more flexibility in connectivity than cross-bar architectures and 2) a wired multiple input multiple output (MIMO) communication with orthogonal frequency division multiplexing (OFDM) techniques to enable a robust and efficient intergroup communication for neuromorphic systems. The MIMO-OFDM concept for the proposed architecture was analyzed by simulating large-scale spiking neural network architecture. Analysis shows that the neuromorphic system with MIMO-OFDM exhibits robust and efficient communication while operating in real time with a high bit rate. Through combining STM with MIMO-OFDM techniques, the resulting system offers a flexible and scalable connectivity as well as a power and area efficient solution for the implementation of very large-scale spiking neural architectures in hardware.

  19. Something from (almost) nothing: the impact of multiple displacement amplification on microbial ecology.

    PubMed

    Binga, Erik K; Lasken, Roger S; Neufeld, Josh D

    2008-03-01

    Microbial ecology is a field that applies molecular techniques to analyze genes and communities associated with a plethora of unique environments on this planet. In the past, low biomass and the predominance of a few abundant community members have impeded the application of techniques such as PCR, microarray analysis and metagenomics to complex microbial populations. In the absence of suitable cultivation methods, it was not possible to obtain DNA samples from individual microorganisms. Recently, a method called multiple displacement amplification (MDA) has been used to circumvent these limitations by amplifying DNA from microbial communities in low-biomass environments, individual cells from uncultivated microbial species and active organisms obtained through stable isotope probing incubations. This review describes the development and applications of MDA, discusses its strengths and limitations and highlights the impact of MDA on the field of microbial ecology. Whole genome amplification via MDA has increased access to the genomic DNA of uncultivated microorganisms and low-biomass environments and represents a 'power tool' in the molecular toolbox of microbial ecologists.

  20. Chromatographic Separation of Cd from Plants via Anion-Exchange Resin for an Isotope Determination by Multiple Collector ICP-MS.

    PubMed

    Wei, Rongfei; Guo, Qingjun; Wen, Hanjie; Peters, Marc; Yang, Junxing; Tian, Liyan; Han, Xiaokun

    2017-01-01

    In this study, key factors affecting the chromatographic separation of Cd from plants, such as the resin column and the digestion and purification procedures, were experimentally investigated. A technique for separating Cd from plant samples based on single ion-exchange chromatography has been developed, which is suitable for high-precision analysis of Cd isotopes by multiple-collector inductively coupled plasma mass spectrometry (MC-ICP-MS). The robustness of the technique was assessed by replicate analyses of Cd standard solutions and plant samples. The Cd yields of the whole separation process were higher than 95%, and the δ114/110Cd values of three Cd secondary standard solutions (Münster Cd, Spex Cd, Spex-1 Cd) relative to NIST SRM 3108 were measured accurately, enabling comparison with Cd isotope results obtained in other laboratories. Hence, stable Cd isotope analyses represent a powerful tool for fingerprinting specific Cd sources and/or examining biogeochemical reactions in ecological and environmental systems.

  1. Direct Surface and Droplet Microsampling for Electrospray Ionization Mass Spectrometry Analysis with an Integrated Dual-Probe Microfluidic Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Cong-Min; Zhu, Ying; Jin, Di-Qiong

    Ambient mass spectrometry (MS) has revolutionized the way of MS analysis and broadened its application in various fields. This paper describes the use of microfluidic techniques to simplify the setup and improve the functions of ambient MS by integrating the sampling probe, electrospray emitter probe, and online mixer on a single glass microchip. Two types of sampling probes, including a parallel-channel probe and a U-shaped channel probe, were designed for dry-spot and liquid-phase droplet samples, respectively. We demonstrated that the microfabrication techniques not only enhanced the capability of ambient MS methods in analysis of dry-spot samples on various surfaces, but also enabled new applications in the analysis of nanoliter-scale chemical reactions in an array of droplets. The versatility of the microchip-based ambient MS method was demonstrated in multiple different applications including evaluation of residual pesticide on fruit surfaces, sensitive analysis of low-ionizable analytes using postsampling derivatization, and high-throughput screening of Ugi-type multicomponent reactions.

  2. SPICE: exploration and analysis of post-cytometric complex multivariate datasets.

    PubMed

    Roederer, Mario; Nozzi, Joshua L; Nason, Martha C

    2011-02-01

    Polychromatic flow cytometry results in complex, multivariate datasets. To date, tools for the aggregate analysis of these datasets across multiple specimens grouped by different categorical variables, such as demographic information, have not been optimized. Often, the exploration of such datasets is accomplished by visualization of patterns with pie charts or bar charts, without easy access to statistical comparisons of measurements that comprise multiple components. Here we report on algorithms and a graphical interface we developed for these purposes. In particular, we discuss thresholding necessary for accurate representation of data in pie charts, the implications for display and comparison of normalized versus unnormalized data, and the effects of averaging when samples with significant background noise are present. Finally, we define a statistic for the nonparametric comparison of complex distributions to test for difference between groups of samples based on multi-component measurements. While originally developed to support the analysis of T cell functional profiles, these techniques are amenable to a broad range of datatypes. Published 2011 Wiley-Liss, Inc.

  3. Information theoretic partitioning and confidence based weight assignment for multi-classifier decision level fusion in hyperspectral target recognition applications

    NASA Astrophysics Data System (ADS)

    Prasad, S.; Bruce, L. M.

    2007-04-01

    There is a growing interest in using multiple sources for automatic target recognition (ATR) applications. One approach is to take multiple, independent observations of a phenomenon and perform a feature level or a decision level fusion for ATR. This paper proposes a method to utilize these types of multi-source fusion techniques to exploit hyperspectral data when only a small number of training pixels are available. Conventional hyperspectral image based ATR techniques project the high dimensional reflectance signature onto a lower dimensional subspace using techniques such as Principal Components Analysis (PCA), Fisher's linear discriminant analysis (LDA), subspace LDA and stepwise LDA. While some of these techniques attempt to solve the curse of dimensionality, or small sample size problem, these are not necessarily optimal projections. In this paper, we present a divide and conquer approach to address the small sample size problem. The hyperspectral space is partitioned into contiguous subspaces such that the discriminative information within each subspace is maximized, and the statistical dependence between subspaces is minimized. We then treat each subspace as a separate source in a multi-source multi-classifier setup and test various decision fusion schemes to determine their efficacy. Unlike previous approaches which use correlation between variables for band grouping, we study the efficacy of higher order statistical information (using average mutual information) for a bottom up band grouping. We also propose a confidence measure based decision fusion technique, where the weights associated with various classifiers are based on their confidence in recognizing the training data. To this end, training accuracies of all classifiers are used for weight assignment in the fusion process of test pixels. The proposed methods are tested using hyperspectral data with known ground truth, such that the efficacy can be quantitatively measured in terms of target recognition accuracies.
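    A minimal sketch of the confidence-weighted decision fusion idea: each subspace classifier's vote is weighted by its training accuracy before the posteriors are combined. The weighting and normalization details here are assumptions for illustration, not the authors' exact scheme.

```python
import numpy as np

def confidence_weighted_fusion(class_probs, train_accuracies):
    """Fuse per-subspace classifier outputs with confidence weights.

    class_probs      : (n_classifiers, n_classes) posteriors for one pixel
    train_accuracies : training accuracy of each subspace classifier,
                       used here as its confidence weight (a simplified
                       reading of the paper's weight-assignment idea).
    """
    w = np.asarray(train_accuracies, dtype=float)
    w = w / w.sum()                       # normalize the weights
    fused = w @ np.asarray(class_probs)   # weighted sum of posteriors
    return int(np.argmax(fused))

# Three hypothetical band-group classifiers voting on target vs clutter.
probs = [[0.55, 0.45],   # subspace 1
         [0.30, 0.70],   # subspace 2
         [0.48, 0.52]]   # subspace 3
print(confidence_weighted_fusion(probs, [0.90, 0.60, 0.75]))
```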

  4. The Characterization of Biosignatures in Caves Using an Instrument Suite

    NASA Astrophysics Data System (ADS)

    Uckert, Kyle; Chanover, Nancy J.; Getty, Stephanie; Voelz, David G.; Brinckerhoff, William B.; McMillan, Nancy; Xiao, Xifeng; Boston, Penelope J.; Li, Xiang; McAdam, Amy; Glenar, David A.; Chavez, Arriana

    2017-12-01

    The search for life and habitable environments on other Solar System bodies is a major motivator for planetary exploration. Due to the difficulty and significance of detecting extant or extinct extraterrestrial life in situ, several independent measurements from multiple instrument techniques will bolster the community's confidence in making any such claim. We demonstrate the detection of subsurface biosignatures using a suite of instrument techniques including IR reflectance spectroscopy, laser-induced breakdown spectroscopy, and scanning electron microscopy/energy dispersive X-ray spectroscopy. We focus our measurements on subterranean calcium carbonate field samples, whose biosignatures are analogous to those that might be expected on some high-interest astrobiology targets. In this work, we discuss the feasibility and advantages of using each of the aforementioned instrument techniques for the in situ search for biosignatures and present results on the autonomous characterization of biosignatures using multivariate statistical analysis techniques.

  5. The Characterization of Biosignatures in Caves Using an Instrument Suite.

    PubMed

    Uckert, Kyle; Chanover, Nancy J; Getty, Stephanie; Voelz, David G; Brinckerhoff, William B; McMillan, Nancy; Xiao, Xifeng; Boston, Penelope J; Li, Xiang; McAdam, Amy; Glenar, David A; Chavez, Arriana

    2017-12-01

    The search for life and habitable environments on other Solar System bodies is a major motivator for planetary exploration. Due to the difficulty and significance of detecting extant or extinct extraterrestrial life in situ, several independent measurements from multiple instrument techniques will bolster the community's confidence in making any such claim. We demonstrate the detection of subsurface biosignatures using a suite of instrument techniques including IR reflectance spectroscopy, laser-induced breakdown spectroscopy, and scanning electron microscopy/energy dispersive X-ray spectroscopy. We focus our measurements on subterranean calcium carbonate field samples, whose biosignatures are analogous to those that might be expected on some high-interest astrobiology targets. In this work, we discuss the feasibility and advantages of using each of the aforementioned instrument techniques for the in situ search for biosignatures and present results on the autonomous characterization of biosignatures using multivariate statistical analysis techniques. Key Words: Biosignature suites-Caves-Mars-Life detection. Astrobiology 17, 1203-1218.

  6. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors.

    PubMed

    Kawahito, Shoji; Seo, Min-Woong

    2016-11-06

    This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels, based on an analysis of the noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the effect of noise reduction as a function of the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e−rms) when compared with the CMS gain of two (2.4 e−rms) or 16 (1.1 e−rms).
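    The 1/sqrt(M) trend for uncorrelated noise can be checked with a toy simulation: average M reset samples and M signal samples, then difference the averages. The sketch below uses an invented 2 e− per-sample noise; real sensors deviate from the ideal trend because 1/f noise does not average down the same way.

```python
import numpy as np

rng = np.random.default_rng(7)

def cms_read(signal_e, read_noise_e, m):
    """Correlated multiple sampling: average m reset and m signal samples.

    Uncorrelated (thermal) read noise falls roughly as 1/sqrt(m); the
    numbers here are illustrative, not the paper's measured values.
    """
    reset = rng.normal(0.0, read_noise_e, m).mean()
    sig = rng.normal(signal_e, read_noise_e, m).mean()
    return sig - reset   # correlated double sampling of the averages

for m in (2, 16, 128):
    reads = np.array([cms_read(0.0, 2.0, m) for _ in range(20000)])
    print(f"m = {m:3d}: effective read noise = {reads.std():.2f} e-")
```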

  7. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors

    PubMed Central

    Kawahito, Shoji; Seo, Min-Woong

    2016-01-01

    This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels, based on an analysis of the noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis, and the effect of noise reduction as a function of the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage in image contrast for the gain of 128 (median noise: 0.29 e−rms) when compared with the CMS gain of two (2.4 e−rms) or 16 (1.1 e−rms). PMID:27827972

  8. Computational models for the analysis of three-dimensional internal and exhaust plume flowfields

    NASA Technical Reports Server (NTRS)

    Dash, S. M.; Delguidice, P. D.

    1977-01-01

    This paper describes computational procedures developed for the analysis of three-dimensional supersonic ducted flows and multinozzle exhaust plume flowfields. The models/codes embodying these procedures cater to a broad spectrum of geometric situations via the use of multiple reference plane grid networks in several coordinate systems. Shock capturing techniques are employed to trace the propagation and interaction of multiple shock surfaces while the plume interface, separating the exhaust and external flows, and the plume external shock are discretely analyzed. The computational grid within the reference planes follows the trace of streamlines to facilitate the incorporation of finite-rate chemistry and viscous computational capabilities. Exhaust gas properties consist of combustion products in chemical equilibrium. The computational accuracy of the models/codes is assessed via comparisons with exact solutions, results of other codes and experimental data. Results are presented for the flows in two-dimensional convergent and divergent ducts, expansive and compressive corner flows, flow in a rectangular nozzle and the plume flowfields for exhausts issuing out of single and multiple rectangular nozzles.

  9. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  10. Real-time flutter identification

    NASA Technical Reports Server (NTRS)

    Roy, R.; Walker, R.

    1985-01-01

    The techniques and a FORTRAN 77 MOdal Parameter IDentification (MOPID) computer program developed for identification of the frequencies and damping ratios of multiple flutter modes in real time are documented. Physically meaningful model parameterization was combined with state-of-the-art recursive identification techniques and applied to the problem of real-time flutter mode monitoring. The performance of the algorithm in terms of convergence speed and parameter estimation error is demonstrated for several simulated data cases, and the results of actual flight data analysis from two different vehicles are presented. The results indicate that the algorithm is capable of real-time monitoring of aircraft flutter characteristics with a high degree of reliability.
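    Recursive identification of this kind is typically built on a recursive-least-squares (RLS) update. The sketch below tracks a second-order AR model of a lightly damped mode; it is a generic illustration, not the MOPID algorithm, and the mode parameters are invented.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.98):
    """One recursive-least-squares step with forgetting factor lam.

    theta : current parameter estimate (e.g., AR coefficients whose
            roots give modal frequency and damping)
    P     : covariance matrix of the estimate
    phi   : regressor (past outputs), y : new measurement
    """
    k = P @ phi / (lam + phi @ P @ phi)       # gain vector
    theta = theta + k * (y - phi @ theta)     # innovation correction
    P = (P - np.outer(k, phi @ P)) / lam      # covariance update
    return theta, P

# Hypothetical lightly damped 6 Hz mode sampled at 100 Hz, simulated as
# an AR(2) process and tracked online (illustrative values only).
fs, f0, zeta = 100.0, 6.0, 0.03
r = np.exp(-zeta * 2 * np.pi * f0 / fs)       # discrete pole radius
a, b = 2 * r * np.cos(2 * np.pi * f0 / fs), -r**2
rng = np.random.default_rng(2)
y = np.zeros(2000)
for n in range(2, y.size):
    y[n] = a * y[n - 1] + b * y[n - 2] + 0.1 * rng.standard_normal()

theta, P = np.zeros(2), np.eye(2) * 100.0
for n in range(2, y.size):
    theta, P = rls_update(theta, P, np.array([y[n - 1], y[n - 2]]), y[n])
print(theta, (a, b))   # estimates should approach the true (a, b)
```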

  11. New techniques for the analysis of manual control systems. [mathematical models of human operator behavior

    NASA Technical Reports Server (NTRS)

    Bekey, G. A.

    1971-01-01

    Studies are summarized on the application of advanced analytical and computational methods to the development of mathematical models of human controllers in multiaxis manual control systems. Specific accomplishments include the following: (1) analytical and computer methods were developed for the measurement of random parameters in linear models of human operators; (2) discrete models of human operator behavior in a multiple-display situation were developed; (3) sensitivity techniques were developed which make possible the identification of unknown sampling intervals in linear systems; and (4) the adaptive behavior of human operators following particular classes of vehicle failures was studied and a model structure proposed.

  12. Plio-Pleistocene time evolution of the 100-ky cycle in marine paleoclimate records

    NASA Technical Reports Server (NTRS)

    Park, Jeffrey; Maasch, Kirk A.

    1992-01-01

    To constrain theories for the dynamical evolution of global ice mass through the late Neogene, it is important to determine whether major changes in the record were gradual or rapid. Of particular interest is the evolution of the near 100-ky ice age cycle in the middle Pleistocene. We have applied a new technique based on multiple taper spectrum analysis which allows us to model the time evolution of quasi-periodic signals. This technique uses both phase and amplitude information, and enables us to address the question of abrupt versus gradual onset of the 100-ky periodicity in the middle Pleistocene.
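    The multiple-taper estimate itself is straightforward: the series is multiplied by K orthogonal Slepian (DPSS) tapers and the resulting eigenspectra are averaged. The sketch below recovers a 100-ky line from a synthetic noisy series; it omits the quasi-periodic amplitude and phase modeling that is the paper's contribution.

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, nw=4.0, k=7):
    """Average-of-eigenspectra multitaper PSD estimate (unit sample rate).

    Uses K discrete prolate spheroidal (Slepian) tapers with
    time-bandwidth product NW; a basic sketch of the multiple-taper
    method, not the paper's full demodulation analysis.
    """
    n = len(x)
    tapers = dpss(n, nw, k)                  # shape (k, n)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return spectra.mean(axis=0)

# Hypothetical delta-18-O-like series: a 100-ky cycle plus noise,
# sampled every 2 ky (all values illustrative).
rng = np.random.default_rng(4)
t = np.arange(0, 800, 2.0)                   # ky
x = np.sin(2 * np.pi * t / 100.0) + 0.5 * rng.standard_normal(t.size)
psd = multitaper_psd(x)
freqs = np.fft.rfftfreq(t.size, d=2.0)       # cycles per ky
print(1.0 / freqs[np.argmax(psd[1:]) + 1])   # dominant period, ~100 ky
```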

  13. Slow crack growth in spinel in water

    NASA Technical Reports Server (NTRS)

    Schwantes, S.; Elber, W.

    1983-01-01

    Magnesium aluminate spinel was tested in a water environment at room temperature to establish its slow crack-growth behavior. Ring specimens with artificial flaws on the outside surface were loaded hydraulically on the inside surface. The time to failure was measured. Various precracking techniques were evaluated and multiple precracks were used to minimize the scatter in the static fatigue tests. Statistical analysis techniques were developed to determine the strength and crack velocities for a single flaw. Slow crack-growth rupture was observed at stress intensities as low as 70 percent of Kc. A strengthening effect was observed in specimens that had survived long-time static fatigue tests.

  14. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.

    PubMed

    Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K

    2016-07-20

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.

  15. Spectral analysis of views interpolated by chroma subpixel downsampling for 3D autostereoscopic displays

    NASA Astrophysics Data System (ADS)

    Marson, Avishai; Stern, Adrian

    2015-05-01

    One of the main limitations of horizontal parallax autostereoscopic displays is the horizontal resolution loss due to the need to repartition the pixels of the display panel among the multiple views. Recently we have shown that this problem can be alleviated by applying a color sub-pixel rendering technique [1]. Interpolated views are generated by down-sampling the panel pixels at the sub-pixel level, thus increasing the number of views. The method takes advantage of the lower acuity of the human eye for chromatic resolution. Here we supply further support for the technique by analyzing the spectra of the subsampled images.

  16. Rule-based statistical data mining agents for an e-commerce application

    NASA Astrophysics Data System (ADS)

    Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar

    2003-03-01

    Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application is related to multiple domains such as statistical analysis, market competition, price comparison, profit improvement and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system fusing intelligent techniques, statistical data mining, and personal information to enhance the QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, is successfully implemented using Java servlets and an Oracle 8i database server. Simulation results have shown that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.

  17. Using Movies to Analyse Gene Circuit Dynamics in Single Cells

    PubMed Central

    Locke, James CW; Elowitz, Michael B

    2010-01-01

    Many bacterial systems rely on dynamic genetic circuits to control critical processes. A major goal of systems biology is to understand these behaviours in terms of individual genes and their interactions. However, traditional techniques based on population averages wash out critical dynamics that are either unsynchronized between cells or driven by fluctuations, or ‘noise,’ in cellular components. Recently, the combination of time-lapse microscopy, quantitative image analysis, and fluorescent protein reporters has enabled direct observation of multiple cellular components over time in individual cells. In conjunction with mathematical modelling, these techniques are now providing powerful insights into genetic circuit behaviour in diverse microbial systems. PMID:19369953

  18. Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models

    PubMed Central

    Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon

    2010-01-01

    Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region, with sensors deployed on a field. We verify that in a sensor network the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay and transmit power have been performed for different numbers of antenna elements in the receiver array. Our results show that increasing the number of antenna elements in a wireless sensor network does indeed improve the achievable BER. PMID:22163510
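    The benefit of receive diversity is easy to reproduce in a toy Monte Carlo: with maximal-ratio combining over independent Rayleigh branches, the BER falls rapidly as antennas are added. The sketch below uses a generic flat-fading BPSK model, not the paper's GBSBE channel.

```python
import numpy as np

rng = np.random.default_rng(6)

def mrc_ber(n_rx, snr_db, n_bits=200_000):
    """BPSK bit-error rate over flat Rayleigh fading with maximal-ratio
    combining (MRC) across n_rx receive antennas; a generic diversity
    sketch, not the paper's GBSBE channel model."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, n_bits)
    s = 2.0 * bits - 1.0                                  # BPSK symbols
    h = (rng.standard_normal((n_rx, n_bits)) +
         1j * rng.standard_normal((n_rx, n_bits))) / np.sqrt(2)
    noise = (rng.standard_normal((n_rx, n_bits)) +
             1j * rng.standard_normal((n_rx, n_bits))) / np.sqrt(2 * snr)
    r = h * s + noise
    z = np.real(np.sum(np.conj(h) * r, axis=0))           # MRC combiner
    return np.mean((z > 0).astype(int) != bits)

for n_rx in (1, 2, 4):
    print(n_rx, "antennas:", mrc_ber(n_rx, snr_db=10.0))
```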

  19. Combined use of atomic force microscopy, X-ray photoelectron spectroscopy, and secondary ion mass spectrometry for cell surface analysis.

    PubMed

    Dague, Etienne; Delcorte, Arnaud; Latgé, Jean-Paul; Dufrêne, Yves F

    2008-04-01

    Understanding the surface properties of microbial cells is a major challenge of current microbiological research and a key to efficiently exploit them in biotechnology. Here, we used three advanced surface analysis techniques with different sensitivity, probing depth, and lateral resolution, that is, in situ atomic force microscopy, X-ray photoelectron spectroscopy, and secondary ion mass spectrometry, to gain insight into the surface properties of the conidia of the human fungal pathogen Aspergillus fumigatus. We show that the native ultrastructure, surface protein and polysaccharide concentrations, and amino acid composition of three mutants affected in hydrophobin production are markedly different from those of the wild-type, thereby providing novel insight into the cell wall architecture of A. fumigatus. The results demonstrate the power of using multiple complementary techniques for probing microbial cell surfaces.

  20. Chronodes: Interactive Multifocus Exploration of Event Sequences

    PubMed Central

    POLACK, PETER J.; CHEN, SHANG-TSE; KAHNG, MINSUK; DE BARBARO, KAYA; BASOLE, RAHUL; SHARMIN, MOUSHUMI; CHAU, DUEN HORNG

    2018-01-01

    The advent of mobile health (mHealth) technologies challenges the capabilities of current visualizations, interactive tools, and algorithms. We present Chronodes, an interactive system that unifies data mining and human-centric visualization techniques to support explorative analysis of longitudinal mHealth data. Chronodes extracts and visualizes frequent event sequences that reveal chronological patterns across multiple participant timelines of mHealth data. It then combines novel interaction and visualization techniques to enable multifocus event sequence analysis, which allows health researchers to interactively define, explore, and compare groups of participant behaviors using event sequence combinations. Through summarizing insights gained from a pilot study with 20 behavioral and biomedical health experts, we discuss Chronodes’s efficacy and potential impact in the mHealth domain. Ultimately, we outline important open challenges in mHealth, and offer recommendations and design guidelines for future research. PMID:29515937

  1. A Brief History of the use of Electromagnetic Induction Techniques in Soil Survey

    NASA Astrophysics Data System (ADS)

    Brevik, Eric C.; Doolittle, James

    2017-04-01

    Electromagnetic induction (EMI) has been used to characterize the spatial variability of soil properties since the late 1970s. Initially used to assess soil salinity, the use of EMI in soil studies has expanded to include: mapping soil types; characterizing soil water content and flow patterns; assessing variations in soil texture, compaction, organic matter content, and pH; and determining the depth to subsurface horizons, stratigraphic layers or bedrock, among other uses. In all cases the soil property being investigated must influence soil apparent electrical conductivity (ECa) either directly or indirectly for EMI techniques to be effective. An increasing number and diversity of EMI sensors have been developed in response to users' needs and the availability of allied technologies, which have greatly improved the functionality of these tools and increased the amount and types of data that can be gathered with a single pass. EMI investigations provide several benefits for soil studies. The large amount of georeferenced data that can be rapidly and inexpensively collected with EMI provides more complete characterization of the spatial variations in soil properties than traditional sampling techniques. In addition, compared to traditional soil survey methods, EMI can more effectively characterize diffuse soil boundaries and identify included areas of dissimilar soils within mapped soil units, giving soil scientists greater confidence when collecting spatial soil information. EMI techniques do have limitations; results are site-specific and can vary depending on the complex interactions among multiple and variable soil properties. Despite this, EMI techniques are increasingly being used to investigate the spatial variability of soil properties at field and landscape scales. The future should witness a greater use of multiple-frequency and multiple-coil EMI sensors and integration with other sensors to assess the spatial variability of soil properties. Data analysis will be improved with advanced processing and presentation systems and more sophisticated geostatistical modeling algorithms will be developed and used to interpolate EMI data, improve the resolution of subsurface features, and assess soil properties.

  2. Evaluation of potential substrates for restenosis and thrombosis in overlapped versus edge-to-edge juxtaposed bioabsorbable scaffolds: Insights from a computed fluid dynamic study.

    PubMed

    Rigatelli, Gianluca; Zuin, Marco; Dell'Avvocata, Fabio; Cardaioli, Paolo; Vassiliev, Dobrin; Ferenc, Miroslaw; Nghia, Nguyen Tuan; Nguyen, Thach; Foin, Nicholas

    2018-04-01

    Multiple BRS, and specifically the Absorb scaffold (BVS) (Abbott Vascular, Santa Clara, CA, USA), have often been used to treat long, diffuse coronary artery lesions. We evaluated, by means of a computational fluid dynamic (CFD) study, the impact of standard overlapping versus the edge-to-edge technique on intravascular fluid rheology around multiple bioabsorbable scaffolds (BRS). We simulated the treatment of a real, long, significant coronary lesion (>70% luminal narrowing) involving the left anterior descending artery (LAD), treated with the standard or the edge-to-edge technique, respectively. Simulations were performed after BVS implantation in two different conditions: 1) the edge-to-edge technique, where the scaffolds are kissed but not overlapped, resulting in a luminal encroachment of 0.015 cm (150 μm); and 2) standard overlapping, where the scaffolds are overlapped, resulting in a luminal encroachment of 0.030 cm (300 μm). After positioning the BVS across the long lesion, the implantation procedure was performed in silico following all the usual procedural steps. Analysis of the wall shear stress (WSS) suggested that at the vessel wall the WSS was lower in the overlapped zones than in the edge-to-edge zone (∆ = 0.061 Pa, p = 0.01). At the strut level the difference between the two WSS values was more striking (∆ = 1.065e-4, p = 0.01), favouring the edge-to-edge zone. Our study suggests that at both the vessel wall and scaffold strut levels, WSS was lower when multiple BVS were implanted with the standard overlapping technique rather than the edge-to-edge technique. This lower WSS might represent a substrate for restenosis and for early and late BVS thrombosis, potentially explaining, at least in part, the recent evidence of poor device performance. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Higher moments of net-proton multiplicity distributions in a heavy-ion event pile-up scenario

    NASA Astrophysics Data System (ADS)

    Garg, P.; Mishra, D. K.

    2017-10-01

    High-luminosity modern accelerators, like the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory (BNL) and the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN), inherently have event pile-up scenarios which contribute significantly to physics events as a background. While state-of-the-art tracking algorithms and detector concepts take care of these event pile-up scenarios, several offline analytical techniques are used to remove such events from the physics analysis. It is still difficult to identify the remaining pile-up events in an event sample for physics analysis. Since the fraction of these events is small, it may not be as serious an issue for other analyses as it is for an event-by-event analysis; particularly when the characteristics of the multiplicity distribution are the observable, one needs to be very careful. In the present work, we demonstrate how a small fraction of residual pile-up events can change the moments, and the ratios of those moments, of an event-by-event net-proton multiplicity distribution, which are sensitive to the dynamical fluctuations due to the QCD critical point. For this study, we assume that the individual event-by-event proton and antiproton multiplicity distributions follow Poisson, negative binomial, or binomial distributions. We observe a significant effect on the cumulants and their ratios of the net-proton multiplicity distribution due to pile-up events, particularly at lower energies. It might be crucial to estimate the fraction of pile-up events in the data sample while interpreting the experimental observables for the critical point.
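    The effect is easy to reproduce numerically: draw independent Poisson proton and antiproton counts, replace a small fraction of events with sums of two events, and compare the cumulants. The yields and pile-up fraction below are invented for illustration.

```python
import numpy as np

def cumulants(x):
    """First four cumulants of a sample: C1, C2, C3, C4."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    d = x - mu
    m2, m3, m4 = (d**2).mean(), (d**3).mean(), (d**4).mean()
    return mu, m2, m3, m4 - 3 * m2**2

rng = np.random.default_rng(5)
n_events = 200_000
# Hypothetical single-collision net-proton numbers: independent Poisson
# protons and antiprotons (illustrative yields, not measured ones).
net = rng.poisson(5.0, n_events) - rng.poisson(2.0, n_events)

# Contaminate a small fraction alpha with pile-up: two collisions
# recorded as one event, so their net-proton numbers add.
alpha = 0.005
n_pile = int(alpha * n_events)
pileup = (rng.poisson(5.0, n_pile) - rng.poisson(2.0, n_pile)) + \
         (rng.poisson(5.0, n_pile) - rng.poisson(2.0, n_pile))
mixed = np.concatenate([net[n_pile:], pileup])

for label, sample in (("clean", net), ("with pile-up", mixed)):
    c1, c2, c3, c4 = cumulants(sample)
    print(f"{label:13s} C1={c1:.3f} C2={c2:.3f} C3={c3:.3f} "
          f"C4={c4:.3f} C4/C2={c4 / c2:.3f}")
```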

  4. Adaptive control of a jet turboshaft engine driving a variable pitch propeller using multiple models

    NASA Astrophysics Data System (ADS)

    Ahmadian, Narjes; Khosravi, Alireza; Sarhadi, Pouria

    2017-08-01

    In this paper, a multiple model adaptive control (MMAC) method is proposed for a gas turbine engine. The model of a twin-spool turboshaft engine driving a variable-pitch propeller includes various operating points. Variations in fuel flow and propeller pitch inputs produce different operating conditions, which force the controller to adapt rapidly. The important operating points are the idle, cruise and full-thrust cases, which span the entire flight envelope. A multi-input multi-output (MIMO) version of second-level adaptation using multiple models is developed, and a stability analysis using the Lyapunov method is presented. The proposed method is compared with two conventional techniques: first-level adaptation and model reference adaptive control. Simulation results for the JetCat SPT5 turboshaft engine demonstrate the performance and fidelity of the proposed method.
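
    The core idea of second-level adaptation can be sketched on a scalar toy problem. The Python below is a simplified illustration under assumed values and gains, not the paper's MIMO turboshaft design:

        import numpy as np

        # Plant: x' = a*x + u with a unknown but assumed inside the convex
        # hull of fixed model parameters a_i. Each identification model
        # yields an error e_i with e_i' = a_m*e_i + (a_i - a)*x, so adapting
        # convex weights alpha to null sum_i alpha_i*e_i drives the combined
        # estimate sum_i alpha_i*a_i toward a (second-level adaptation).
        a_true, a_m = 1.5, -2.0
        a_models = np.array([0.0, 1.0, 3.0])
        alpha = np.ones(3) / 3
        dt, gamma = 1e-3, 5.0

        x, e = 1.0, np.zeros(3)
        for step in range(200_000):
            u = -3.0 * x + np.sin(np.pi * step * dt)  # stabilize + excite
            s = alpha @ e                    # second-level combined error
            alpha -= dt * gamma * s * (e - e.mean())  # keep sum(alpha) = 1
            alpha = np.clip(alpha, 0.0, None)
            alpha /= alpha.sum()
            e += dt * (a_m * e + (a_models - a_true) * x)
            x += dt * (a_true * x + u)
        print("combined estimate of a:", alpha @ a_models)  # ~1.5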

  5. Adaptive sensor-fault tolerant control for a class of multivariable uncertain nonlinear systems.

    PubMed

    Khebbache, Hicham; Tadjine, Mohamed; Labiod, Salim; Boulkroune, Abdesselem

    2015-03-01

    This paper deals with the active fault tolerant control (AFTC) problem for a class of multiple-input multiple-output (MIMO) uncertain nonlinear systems subject to sensor faults and external disturbances. The proposed AFTC method can tolerate three additive sensor faults (bias, drift and loss of accuracy) and one multiplicative sensor fault (loss of effectiveness). By employing the backstepping technique, a novel adaptive backstepping-based AFTC scheme is developed, exploiting the fact that sensor faults and system uncertainties (including external disturbances and unexpected nonlinear functions caused by sensor faults) can be estimated and compensated on-line via robust adaptive schemes. The stability of the closed-loop system is rigorously proven using a Lyapunov approach. The effectiveness of the proposed controller is illustrated by two simulation examples. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
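
    As a much simpler stand-in for the paper's adaptive backstepping scheme, the sketch below estimates a constant sensor bias with an augmented-state Luenberger observer and uses it to correct the measurement; the plant, fault and gains are illustrative assumptions.

        import numpy as np

        # Plant x' = a*x + u, faulty measurement y = x + b with unknown
        # constant bias b. Augment the state with b and run an observer;
        # the pair is observable because y mixes x and b while only x has
        # dynamics.
        a, b_true, dt = -1.0, 0.7, 1e-3
        L = np.array([2.0, 1.5])      # assumed stabilizing observer gains

        x, z = 0.0, np.zeros(2)       # z = [x_hat, b_hat]
        for step in range(20_000):
            u = np.sin(step * dt)
            y = x + b_true                       # faulty sensor
            innov = y - (z[0] + z[1])
            z += dt * (np.array([a * z[0] + u, 0.0]) + L * innov)
            x += dt * (a * x + u)
        print("bias estimate:", z[1])            # ~0.7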

  6. Optimal space communications techniques. [using digital and phase locked systems for signal processing

    NASA Technical Reports Server (NTRS)

    Schilling, D. L.

    1974-01-01

    Digital multiplication of two waveforms using delta modulation (DM) is discussed. It is shown that while conventional multiplication of two N-bit words requires complexity of order N², multiplication using DM requires complexity which increases linearly with N. Bounds on the signal-to-quantization noise ratio (SNR) resulting from this multiplication are determined and compared with the SNR obtained using standard multiplication techniques. The phase locked loop (PLL) system, consisting of a phase detector, voltage controlled oscillator, and a linear loop filter, is discussed in terms of its design and system advantages. Areas requiring further research are identified.
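
    A minimal linear delta modulator shows the 1-bit representation underlying the scheme (an illustrative Python sketch; the step size and test signal are arbitrary choices):

        import numpy as np

        # Each sample is encoded as a single bit (+1/-1) telling the decoder
        # to step its staircase approximation up or down.
        def dm_encode(signal, step):
            bits, approx = [], 0.0
            for s in signal:
                bit = 1 if s >= approx else -1
                approx += bit * step
                bits.append(bit)
            return np.array(bits)

        def dm_decode(bits, step):
            return np.cumsum(bits) * step   # integrate the bit stream

        t = np.linspace(0, 1, 1000)
        x = np.sin(2 * np.pi * 3 * t)
        bits = dm_encode(x, step=0.05)
        x_hat = dm_decode(bits, step=0.05)
        print("RMS quantization error:", np.sqrt(np.mean((x - x_hat) ** 2)))

    Because every DM sample is a single ±1 bit, multiplying two DM-encoded waveforms reduces to sign flips and accumulation of fixed steps, which is intuitively where the linear-in-N complexity comes from.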

  7. Real-time optical multiple object recognition and tracking system and method

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin (Inventor); Liu, Hua Kuang (Inventor)

    1987-01-01

    The invention relates to an apparatus and associated methods for the optical recognition and tracking of multiple objects in real time. Multiple-point spatial filters are employed that pre-define the objects to be recognized at run-time. The system takes the basic technology of a Vander Lugt filter and adds a hololens. The technique replaces time-, space- and cost-intensive digital techniques. In place of multiple objects, the system can also recognize multiple orientations of a single object. This latter capability has potential for space applications where space and weight are at a premium.
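
    The digital counterpart of this optical correlation is FFT-based matched filtering. The sketch below (synthetic data; not the patented optical system) recognizes a template by the position of its correlation peak:

        import numpy as np

        # Embed a small template in a noisy scene, then locate it via the
        # product of Fourier transforms that a Vander Lugt correlator
        # realizes optically.
        rng = np.random.default_rng(1)
        scene = rng.normal(0, 0.1, (256, 256))
        template = np.ones((8, 8))
        scene[100:108, 60:68] += template      # object placed at (100, 60)

        F_scene = np.fft.fft2(scene)
        F_filt = np.fft.fft2(template, s=scene.shape).conj()  # matched filter
        corr = np.fft.ifft2(F_scene * F_filt).real
        print("detected at:", np.unravel_index(corr.argmax(), corr.shape))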

  8. Laser scanning confocal microscopy: history, applications, and related optical sectioning techniques.

    PubMed

    Paddock, Stephen W; Eliceiri, Kevin W

    2014-01-01

    Confocal microscopy is an established light microscopical technique for imaging fluorescently labeled specimens with significant three-dimensional structure. Applications of confocal microscopy in the biomedical sciences include imaging the spatial distribution of macromolecules in either fixed or living cells, the automated collection of 3D data, the imaging of multiply labeled specimens and the measurement of physiological events in living cells. The laser scanning confocal microscope continues to be chosen for most routine work, although a number of instruments have been developed for more specific applications. Significant improvements have been made to all areas of the confocal approach: not only to the instruments themselves, but also to the protocols for specimen preparation and to the analysis, display, reproduction, sharing and management of confocal images using bioinformatics techniques.

  9. Frequency-domain method for measuring spectral properties in multiple-scattering media: methemoglobin absorption spectrum in a tissuelike phantom

    NASA Astrophysics Data System (ADS)

    Fishkin, Joshua B.; So, Peter T. C.; Cerussi, Albert E.; Gratton, Enrico; Fantini, Sergio; Franceschini, Maria Angela

    1995-03-01

    We have measured the optical absorption and scattering coefficient spectra of a multiple-scattering medium (i.e., a biological tissue-simulating phantom comprising a lipid colloid) containing methemoglobin by using frequency-domain techniques. The methemoglobin absorption spectrum determined in the multiple-scattering medium is in excellent agreement with a corrected methemoglobin absorption spectrum obtained from a steady-state spectrophotometer measurement of the optical density of a minimally scattering medium. The determination of the corrected methemoglobin absorption spectrum takes into account the scattering from impurities in the methemoglobin solution containing no lipid colloid. Frequency-domain techniques allow for the separation of the absorbing from the scattering properties of multiple-scattering media, and these techniques thus provide an absolute

  10. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.
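
    On a toy problem the goal-oriented adjoint machinery is compact (a minimal sketch, not the project's software). For u' = -p·u with u(0) = u0 and quantity of interest J = u(T), the adjoint solves λ' = p·λ backward from λ(T) = 1, and dJ/dp = -∫ λ(t) u(t) dt over [0, T]:

        import numpy as np

        # Forward solution, adjoint solution, and the adjoint sensitivity
        # dJ/dp, checked against the analytic derivative of J = u0*exp(-p*T).
        p, u0, T, n = 0.8, 1.0, 2.0, 200_000
        t = np.linspace(0.0, T, n + 1)
        u = u0 * np.exp(-p * t)          # forward: u' = -p*u
        lam = np.exp(p * (t - T))        # adjoint: lam' = p*lam, lam(T) = 1
        f = lam * u
        h = T / n                        # trapezoidal quadrature
        dJdp = -h * (0.5 * f[0] + f[1:-1].sum() + 0.5 * f[-1])
        print(dJdp, -T * u0 * np.exp(-p * T))  # agree to quadrature error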

  11. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    PubMed

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade sensitivity analysis techniques have been shown to be very useful for analysing complex and high-dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and fully addresses the analysis of oscillatory systems. It performs sensitivity analysis of models under parameter perturbations, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting: the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also lets the user explore the effect of parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models, including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix, it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems; it allows for simulation and analysis of models under a variety of environmental conditions and for experimental optimisation of complex combined experiments. With its unique set of tools it makes a valuable addition to the current library of sensitivity analysis toolboxes. We believe that this software will be of great use to the wider biological, systems biology and modelling communities.
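
    A hand-rolled version of the kind of experiment PeTTSy automates, written in plain SciPy rather than PeTTSy's MATLAB interface, applies a temporary perturbation to one parameter of a van der Pol oscillator and measures its effect on the solution:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Temporary +20% perturbation of the van der Pol damping parameter
        # during t in [5, 10]; compare against the unperturbed solution.
        def vdp(t, y, mu_of_t):
            mu = mu_of_t(t)
            return [y[1], mu * (1.0 - y[0] ** 2) * y[1] - y[0]]

        t_eval = np.linspace(0.0, 30.0, 3001)

        def run(mu_of_t):
            sol = solve_ivp(vdp, (0.0, 30.0), [2.0, 0.0], args=(mu_of_t,),
                            t_eval=t_eval, max_step=0.01)
            return sol.y[0]

        x_base = run(lambda t: 1.0)
        x_pert = run(lambda t: 1.2 if 5.0 <= t <= 10.0 else 1.0)
        print("max |dx| after the pulse:",
              np.abs(x_pert - x_base)[t_eval > 10.0].max())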

  12. Statistical analysis of Thematic Mapper Simulator data for the geobotanical discrimination of rock types in southwest Oregon

    NASA Technical Reports Server (NTRS)

    Morrissey, L. A.; Weinstock, K. J.; Mouat, D. A.; Card, D. H.

    1984-01-01

    This research evaluates Thematic Mapper Simulator (TMS) data for the geobotanical discrimination of rock types based on vegetative cover characteristics. A methodology for accomplishing this evaluation using univariate and multivariate techniques is presented. TMS data acquired with a Daedalus DEI-1260 multispectral scanner were integrated with vegetation and geologic information for subsequent statistical analyses, which included a chi-square test, an analysis of variance, stepwise discriminant analysis, and Duncan's multiple range test. Results indicate that ultramafic rock types are spectrally separable from nonultramafics based on vegetative cover through the use of statistical analyses.
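
    The univariate tests named above are straightforward to reproduce; here is a sketch on synthetic two-class band-reflectance samples (illustrative numbers, not the TMS data):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        ultramafic = rng.normal(0.32, 0.05, 40)     # band reflectance, class 1
        nonultramafic = rng.normal(0.45, 0.05, 40)  # band reflectance, class 2

        # one-way analysis of variance between the two classes
        f, p = stats.f_oneway(ultramafic, nonultramafic)
        print(f"ANOVA: F = {f:.1f}, p = {p:.2e}")

        # chi-square test of independence on a binarized band value
        table = np.array(
            [[np.sum(ultramafic > 0.4), np.sum(ultramafic <= 0.4)],
             [np.sum(nonultramafic > 0.4), np.sum(nonultramafic <= 0.4)]])
        chi2, p, dof, _ = stats.chi2_contingency(table)
        print(f"chi-square: chi2 = {chi2:.1f}, p = {p:.2e}")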

  13. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high-speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high-speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used to solve the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high-speed wing-body configurations simultaneously improve the aerodynamic, sonic boom and structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load-carrying member is modeled as either an isotropic or a composite box beam: the isotropic box beam is analyzed using thin-wall theory, and the composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
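
    The Kreisselmeier-Steinhauser function mentioned above folds many objectives or constraints g_i into one smooth envelope, KS(g) = (1/ρ) ln Σ exp(ρ g_i), which approaches max(g) from above as ρ grows; a numerically safe sketch:

        import numpy as np

        # The shift by g_max leaves the value unchanged but avoids overflow
        # in the exponentials.
        def ks(g, rho=50.0):
            g = np.asarray(g, dtype=float)
            g_max = g.max()
            return g_max + np.log(np.exp(rho * (g - g_max)).sum()) / rho

        g = [0.30, 0.80, 0.75]   # e.g. normalized drag, boom, stress measures
        for rho in (5.0, 50.0, 500.0):
            print(rho, ks(g, rho))   # tends to max(g) = 0.80 as rho grows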

  14. Determination of renewable energy yield from mixed waste material from the use of novel image analysis methods.

    PubMed

    Wagland, S T; Dudley, R; Naftaly, M; Longhurst, P J

    2013-11-01

    Two novel techniques are presented in this study which together aim to provide a system able to determine the renewable energy potential of mixed waste materials. An image analysis tool was applied to two waste samples prepared using known quantities of source-segregated recyclable materials. The technique was used to determine the composition of the wastes; the biogenic content of the samples was then calculated from the properties of the waste components. The percentage renewable energy determined by image analysis for each sample was accurate to within 5% of the actual calculated values. Microwave-based multiple-point imaging (AutoHarvest) was used to demonstrate the ability of such a technique to determine the moisture content of mixed samples; this proof-of-concept experiment produced moisture measurements accurate to within 10%. Overall, the image analysis tool was able to determine the renewable energy potential of the mixed samples, and AutoHarvest should enable net calorific value calculations by providing moisture content measurements. The proposed system is suitable for combustion facilities and enables the operator to understand the renewable energy potential of the waste prior to combustion. Copyright © 2013 Elsevier Ltd. All rights reserved.
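
    The composition-to-energy arithmetic implied above is simple to illustrate; all component properties below are invented for the sketch, not taken from the paper:

        # Component mass fractions (e.g. from image analysis) combined with
        # assumed net calorific values (MJ/kg) and biogenic energy shares.
        components = {
            "paper":    (0.35, 14.0, 1.0),
            "plastics": (0.25, 32.0, 0.0),
            "wood":     (0.20, 16.0, 1.0),
            "textiles": (0.20, 18.0, 0.5),
        }
        total = sum(m * ncv for m, ncv, b in components.values())
        renewable = sum(m * ncv * b for m, ncv, b in components.values())
        print(f"renewable share of energy: {100 * renewable / total:.1f}%")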

  15. Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center

    NASA Technical Reports Server (NTRS)

    Reinath, Michael S.

    1997-01-01

    Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charge-coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty does not depend on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty decreases; for an average of 100 images, for example, the analysis predicts a precision uncertainty of +/-0.5 m/s. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components, and uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s for the U, V, and W components, respectively, at 68% confidence.
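
    The quoted precisions follow the standard 1/√N law for averaging independent images; a quick numerical check using the stated 5 m/s single-image precision (synthetic data):

        import numpy as np

        rng = np.random.default_rng(3)
        sigma_single, true_v = 5.0, 120.0   # m/s; true_v is arbitrary
        for n_images in (1, 100):
            trials = rng.normal(true_v, sigma_single,
                                (10_000, n_images)).mean(axis=1)
            print(n_images, "images -> precision ~",
                  trials.std().round(2), "m/s")
        # 1 image -> ~5 m/s, 100 images -> ~0.5 m/s, as in the abstract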

  16. Faster Double-Size Bipartite Multiplication out of Montgomery Multipliers

    NASA Astrophysics Data System (ADS)

    Yoshino, Masayuki; Okeya, Katsuyuki; Vuillaume, Camille

    This paper proposes novel algorithms for computing double-size modular multiplications with few modulus-dependent precomputations. Low-end devices such as smartcards are usually equipped with hardware Montgomery multipliers. However, due to progress in mathematical attacks, security institutions such as NIST have steadily demanded longer bit-lengths for public-key cryptography, making the multipliers quickly obsolete. In an attempt to extend the lifespan of such multipliers, double-size techniques compute modular multiplications with twice the bit-length of the multipliers. Techniques are known for extending the bit-length of classical Euclidean multipliers, of Montgomery multipliers, and of the combination thereof, namely bipartite multipliers. However, unlike classical and bipartite multiplications, Montgomery multiplications involve modulus-dependent precomputations, which amount to a large part of an RSA encryption or signature verification. The proposed double-size technique simulates double-size multiplications based on single-size Montgomery multipliers, and yet precomputations are essentially free: in a 2048-bit RSA encryption or signature verification with public exponent e = 2^16 + 1, the proposal with a 1024-bit Montgomery multiplier is at least 1.5 times faster than previous double-size Montgomery multiplications.
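
    For reference, the single-size primitive involved is Montgomery multiplication, which returns a·b·R^-1 mod n for R = 2^k > n. The minimal big-integer sketch below illustrates that primitive only, not the paper's double-size construction:

        # Requires Python 3.8+ for pow(x, -1, m).
        def mont_setup(n, k):
            R = 1 << k
            n_prime = -pow(n, -1, R) % R       # n * n' == -1 (mod R)
            return R, n_prime

        def montmul(a, b, n, R, n_prime, k):
            t = a * b
            m = ((t & (R - 1)) * n_prime) & (R - 1)  # m = t*n' mod R
            u = (t + m * n) >> k                     # exact division by R
            return u - n if u >= n else u

        n, k = 101, 8                      # toy modulus; real use: 1024-bit+
        R, n_prime = mont_setup(n, k)
        a, b = 55, 77
        abar, bbar = a * R % n, b * R % n  # into the Montgomery domain
        prod = montmul(abar, bbar, n, R, n_prime, k)   # = a*b*R mod n
        print(prod * pow(R, -1, n) % n, a * b % n)     # both 55*77 mod 101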

  17. Training in Multiple Launch Rocket System Units

    DTIC Science & Technology

    1992-04-01

    training conducted is Section-level training. This training is typically conducted in the garrison environment, rather than in the field. ... The major ... environment. Such vehicles could also be used to train on land navigation and other terrain appreciation and analysis tasks, movement techniques, and ... including Battery, Platoon, and Section, of the person interviewed. Valid values for Battery are A through I. Valid codes for Platoon are 1 through 3

  18. Trends in Human-Computer Interaction to Support Future Intelligence Analysis Capabilities

    DTIC Science & Technology

    2011-06-01

    that allows data to be moved between different computing systems and displays. Figure 4 - G-Speak gesture interaction (Oblong, 2011) 5.2 Multitouch ... Multitouch refers to a touchscreen interaction technique in which multiple simultaneous touchpoints and movements can be detected and used to ... much of the style of interaction (such as rotate, pinch, zoom and flick movements) found in multitouch devices but can typically recognize more than

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tate, John G.; Richardson, Bradley S.; Love, Lonnie J.

    ORNL worked with the Schaeffler Group USA to explore additive manufacturing techniques that might be appropriate for prototyping bearing cages. Multiple additive manufacturing techniques were investigated, including e-beam, binder-jet, and multiple laser-based processes. The binder-jet process worked best for printing the thin, detailed cages.

  20. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    NASA Astrophysics Data System (ADS)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home of a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, the datasets stored in MIND come from a variety of analytical techniques spanning multiple spatial and spectral scales, including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software); georeferencing techniques transform individual instrument data maps into a layered, co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.
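
    The PCA step used to explore such multi-instrument spectral sets is easy to sketch; the spectra below are synthetic, not MIND data:

        import numpy as np
        from sklearn.decomposition import PCA

        # Mixtures of two synthetic endmember spectra plus noise collapse
        # onto a low-dimensional component space.
        rng = np.random.default_rng(4)
        wavelengths = np.linspace(250, 400, 200)          # nm, deep-UV region
        end1 = np.exp(-((wavelengths - 300) / 15) ** 2)
        end2 = np.exp(-((wavelengths - 350) / 20) ** 2)
        mix = rng.uniform(0, 1, (100, 1))
        spectra = (mix * end1 + (1 - mix) * end2
                   + rng.normal(0, 0.01, (100, 200)))

        pca = PCA(n_components=5).fit(spectra)
        print(pca.explained_variance_ratio_.round(3))
        # variance concentrates in the first 1-2 components, mirroring the
        # two-endmember structure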
