Sample records for sequential testing method

  1. Sequential Testing: Basics and Benefits

    DTIC Science & Technology

    1978-03-01

    TARADCOM Technical Report No. 12325, Sequential Testing: Basics and Benefits. Contents: I. Introduction and Summary; II. Sequential Analysis; III. Mathematics of Sequential Testing; IV. ...testing. The added benefit of reduced energy needs is inherent in this testing method. The text was originally released by the authors in 1972. The text

  2. Delay test generation for synchronous sequential circuits

    NASA Astrophysics Data System (ADS)

    Devadas, Srinivas

    1989-05-01

    We address the problem of generating tests for delay faults in non-scan synchronous sequential circuits. Delay test generation for sequential circuits is a considerably more difficult problem than delay testing of combinational circuits and has received much less attention. In this paper, we present a method for generating test sequences to detect delay faults in sequential circuits using the stuck-at fault sequential test generator STALLION. The method is complete in that it will generate a delay test sequence for a targeted fault given sufficient CPU time, if such a sequence exists. We term faults for which no delay test sequence exists, under our test methodology, sequentially delay redundant. We describe means of eliminating sequential delay redundancies in logic circuits. We present a partial-scan methodology for enhancing the testability of difficult-to-test or untestable sequential circuits, wherein a small number of flip-flops are selected and made controllable/observable. The selection process guarantees the elimination of all sequential delay redundancies. We show that an intimate relationship exists between state assignment and delay testability of a sequential machine. We describe a state assignment algorithm for the synthesis of sequential machines with maximal delay fault testability. Preliminary experimental results using the test generation, partial-scan and synthesis algorithms are presented.

  3. Test pattern generation for ILA sequential circuits

    NASA Technical Reports Server (NTRS)

    Feng, Yu; Frenzel, James F.; Maki, Gary K.

    1993-01-01

    An efficient method of generating test patterns for sequential machines implemented using one-dimensional, unilateral, iterative logic arrays (ILA's) of BTS pass transistor networks is presented. Based on a transistor level fault model, the method affords a unique opportunity for real-time fault detection with improved fault coverage. The resulting test sets are shown to be equivalent to those obtained using conventional gate level models, thus eliminating the need for additional test patterns. The proposed method advances the simplicity and ease of the test pattern generation for a special class of sequential circuitry.

  4. Proceedings of the Conference on the Design of Experiments in Army Research, Development and Testing (29th)

    DTIC Science & Technology

    1984-06-01

    SEQUENTIAL TESTING session (Bldg. A, Room C), 1300-1545, including A TRUNCATED SEQUENTIAL PROBABILITY RATIO TEST. Keywords: suicide, optical data, operational testing, reliability, random numbers, bootstrap methods, missing data, sequential testing, fire support, complex computer model, carcinogenesis studies. ...contributed papers can be ascertained from the titles of the

  5. Algorithms for the Construction of Parallel Tests by Zero-One Programming. Project Psychometric Aspects of Item Banking No. 7. Research Report 86-7.

    ERIC Educational Resources Information Center

    Boekkooi-Timminga, Ellen

    Nine methods for automated test construction are described. All are based on the concepts of information from item response theory. Two general kinds of methods for the construction of parallel tests are presented: (1) sequential test design; and (2) simultaneous test design. Sequential design implies that the tests are constructed one after the…

  6. A sampling and classification item selection approach with content balancing.

    PubMed

    Chen, Pei-Hua

    2015-03-01

    Existing automated test assembly methods typically employ constrained combinatorial optimization. Constructing forms sequentially based on an optimization approach usually results in unparallel forms and requires heuristic modifications. Methods based on a random search approach have the major advantage of producing parallel forms sequentially without further adjustment. This study incorporated a flexible content-balancing element into the statistical perspective item selection method of the cell-only method (Chen et al. in Educational and Psychological Measurement, 72(6), 933-953, 2012). The new method was compared with a sequential interitem distance weighted deviation model (IID WDM) (Swanson & Stocking in Applied Psychological Measurement, 17(2), 151-166, 1993), a simultaneous IID WDM, and a big-shadow-test mixed integer programming (BST MIP) method to construct multiple parallel forms based on matching a reference form item-by-item. The results showed that the cell-only method with content balancing and the sequential and simultaneous versions of IID WDM yielded results comparable to those obtained using the BST MIP method. The cell-only method with content balancing is computationally less intensive than the sequential and simultaneous versions of IID WDM.

  7. Monte Carlo Simulation of Sudden Death Bearing Testing

    NASA Technical Reports Server (NTRS)

    Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.

    2003-01-01

    Monte Carlo simulations combined with sudden death testing were used to compare resultant bearing lives to the calculated bearing life and the cumulative test time and calendar time relative to sequential and censored sequential testing. A total of 30 960 virtual 50-mm bore deep-groove ball bearings were evaluated in 33 different sudden death test configurations comprising 36, 72, and 144 bearings each. Variations in both life and Weibull slope were a function of the number of bearings failed, independent of the test method used, and not the total number of bearings tested. Variations in L10 life as a function of the number of bearings failed were similar to variations in life obtained from sequentially failed real bearings and from Monte Carlo (virtual) testing of entire populations. Reductions up to 40 percent in bearing test time and calendar time can be achieved by testing to failure or the L50 life and terminating all testing when the last of the predetermined bearing failures has occurred. Sudden death testing is not a more efficient method to reduce bearing test time or calendar time when compared to censored sequential testing.
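
    To make the sudden-death idea concrete, the sketch below (a rough illustration, not the NASA study's code) simulates Weibull-distributed bearing lives, observes only the first failure in each sudden-death group, and recovers an L10 estimate through median-rank regression and the minimum-of-m Weibull relation; the shape, scale, and group sizes are arbitrary assumptions.

      import numpy as np

      rng = np.random.default_rng(0)

      def sudden_death_L10(n_groups=12, group_size=12, shape=1.5, scale=1.0):
          """Sudden-death simulation sketch: only the first failure in each group is
          observed; L10 is recovered from the group minima via the Weibull
          minimum-of-m relation (the min of m i.i.d. Weibulls has scale eta*m**(-1/beta))."""
          lives = rng.weibull(shape, size=(n_groups, group_size)) * scale
          first_failures = np.sort(lives.min(axis=1))            # one observed failure per group
          # Median-rank regression (Benard's approximation) on the group minima.
          ranks = (np.arange(1, n_groups + 1) - 0.3) / (n_groups + 0.4)
          beta_hat, intercept = np.polyfit(np.log(first_failures),
                                           np.log(-np.log(1.0 - ranks)), 1)
          eta_min = np.exp(-intercept / beta_hat)                 # Weibull scale of the minima
          eta_parent = eta_min * group_size ** (1.0 / beta_hat)   # back out the parent scale
          return eta_parent * (-np.log(0.9)) ** (1.0 / beta_hat)  # L10 estimate

      print(sudden_death_L10())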

  8. 40 CFR 53.34 - Test procedure for methods for PM10 and Class I methods for PM2.5.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... simultaneous PM10 or PM2.5 measurements as necessary (see table C-4 of this subpart), each set consisting of...) in appendix A to this subpart). (f) Sequential samplers. For sequential samplers, the sampler shall be configured for the maximum number of sequential samples and shall be set for automatic collection...

  9. The Sequential Probability Ratio Test and Binary Item Response Models

    ERIC Educational Resources Information Center

    Nydick, Steven W.

    2014-01-01

    The sequential probability ratio test (SPRT) is a common method for terminating item response theory (IRT)-based adaptive classification tests. To decide whether a classification test should stop, the SPRT compares a simple log-likelihood ratio, based on the classification bound separating two categories, to prespecified critical values. As has…
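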
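
    As a small illustration of the mechanism this abstract describes, the sketch below implements a generic Wald SPRT that accumulates per-item log-likelihood ratios and stops at Wald's approximate bounds; the Rasch item model, the two hypothesized theta values, and the error rates are illustrative assumptions, not the paper's specific IRT setup.

      import math

      def rasch_llr(response, difficulty, theta0, theta1):
          """Per-item log-likelihood ratio under a Rasch model (illustrative choice)."""
          def p(theta):
              return 1.0 / (1.0 + math.exp(-(theta - difficulty)))
          p0, p1 = p(theta0), p(theta1)
          return math.log((p1 if response else 1.0 - p1) / (p0 if response else 1.0 - p0))

      def wald_sprt(log_lik_ratios, alpha=0.05, beta=0.05):
          """Generic Wald SPRT: accumulate log-likelihood ratios and compare the sum
          with Wald's approximate critical values until one is crossed."""
          upper = math.log((1.0 - beta) / alpha)   # decide for H1 (above the bound)
          lower = math.log(beta / (1.0 - alpha))   # decide for H0 (below the bound)
          s, n = 0.0, 0
          for llr in log_lik_ratios:
              n += 1
              s += llr
              if s >= upper:
                  return "H1", n
              if s <= lower:
                  return "H0", n
          return "continue", n

      # e.g. classify theta against a cutoff of 0.0 with hypotheses at +/-0.5:
      items = [(-1.0, 1), (0.2, 0), (0.5, 1), (-0.3, 1)]       # (difficulty, response)
      llrs = [rasch_llr(u, b, theta0=-0.5, theta1=0.5) for b, u in items]
      print(wald_sprt(llrs))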

  10. Group Sequential Testing of the Predictive Accuracy of a Continuous Biomarker with Unknown Prevalence

    PubMed Central

    Koopmeiners, Joseph S.; Feng, Ziding

    2015-01-01

    Group sequential testing procedures have been proposed as an approach to conserving resources in biomarker validation studies. Previously, Koopmeiners and Feng (2011) derived the asymptotic properties of the sequential empirical positive predictive value (PPV) and negative predictive value curves, which summarize the predictive accuracy of a continuous marker, under case-control sampling. A limitation of their approach is that the prevalence cannot be estimated from a case-control study and must be assumed known. In this manuscript, we consider group sequential testing of the predictive accuracy of a continuous biomarker with unknown prevalence. First, we develop asymptotic theory for the sequential empirical PPV and NPV curves when the prevalence must be estimated, rather than assumed known, in a case-control study. We then discuss how our results can be combined with standard group sequential methods to develop group sequential testing procedures and bias-adjusted estimators for the PPV and NPV curves. The small sample properties of the proposed group sequential testing procedures and estimators are evaluated by simulation and we illustrate our approach in the context of a study to validate a novel biomarker for prostate cancer. PMID:26537180
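
    The point estimates behind the curves discussed above can be written compactly; the sketch below (our notation, not the authors' estimators or their group-sequential machinery) plugs a separately estimated prevalence into the empirical PPV/NPV at a marker threshold.

      import numpy as np

      def empirical_ppv_npv(cases, controls, prevalence, threshold):
          """Empirical PPV/NPV at a marker threshold under case-control sampling,
          plugging in an externally estimated prevalence."""
          tpr = np.mean(np.asarray(cases) >= threshold)     # sensitivity at this threshold
          fpr = np.mean(np.asarray(controls) >= threshold)  # 1 - specificity
          p = prevalence
          ppv = p * tpr / (p * tpr + (1.0 - p) * fpr)
          npv = (1.0 - p) * (1.0 - fpr) / ((1.0 - p) * (1.0 - fpr) + p * (1.0 - tpr))
          return ppv, npv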

  11. An exact computational method for performance analysis of sequential test algorithms for detecting network intrusions

    NASA Astrophysics Data System (ADS)

    Chen, Xinjia; Lacy, Fred; Carriere, Patrick

    2015-05-01

    Sequential test algorithms are playing increasingly important roles in the quick detection of network intrusions such as port scanners. In view of the fact that such algorithms are usually analyzed based on intuitive approximation or asymptotic analysis, we develop an exact computational method for the performance analysis of such algorithms. Our method can be used to calculate the probability of false alarm and the average detection time to any pre-specified accuracy.

  12. Robust inference for group sequential trials.

    PubMed

    Ganju, Jitendra; Lin, Yunzhi; Zhou, Kefei

    2017-03-01

    For ethical reasons, group sequential trials were introduced to allow trials to stop early in the event of extreme results. Endpoints in such trials are usually mortality or irreversible morbidity. For a given endpoint, the norm is to use a single test statistic and to use that same statistic for each analysis. This approach is risky because the test statistic has to be specified before the study is unblinded, and there is loss in power if the assumptions that ensure optimality for each analysis are not met. To minimize the risk of moderate to substantial loss in power due to a suboptimal choice of a statistic, a robust method was developed for nonsequential trials. The concept is analogous to diversification of financial investments to minimize risk. The method is based on combining P values from multiple test statistics for formal inference while controlling the type I error rate at its designated value. This article evaluates the performance of 2 P value combining methods for group sequential trials. The emphasis is on time-to-event trials although results from less complex trials are also included. The gain or loss in power with the combination method relative to a single statistic is asymmetric in its favor. Depending on the power of each individual test, the combination method can give more power than any single test or give power that is closer to the test with the most power. The versatility of the method is that it can combine P values from different test statistics for analysis at different times. The robustness of results suggests that inference from group sequential trials can be strengthened with the use of combined tests. Copyright © 2017 John Wiley & Sons, Ltd.
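
    The abstract does not reproduce the specific P value combiner, so the sketch below shows two standard ones (Fisher and inverse-normal/Stouffer) purely for illustration; both assume independent component P values, an assumption the article's group-sequential setting would need to relax.

      import numpy as np
      from scipy import stats

      def fisher_combined_p(pvals):
          """Fisher's method: -2*sum(log p_i) ~ chi-square with 2k df under the
          global null, assuming independent component P values."""
          pvals = np.asarray(pvals, dtype=float)
          stat = -2.0 * np.log(pvals).sum()
          return stats.chi2.sf(stat, df=2 * len(pvals))

      def stouffer_combined_p(pvals, weights=None):
          """Inverse-normal (Stouffer) combination of one-sided P values."""
          pvals = np.asarray(pvals, dtype=float)
          w = np.ones_like(pvals) if weights is None else np.asarray(weights, float)
          z = stats.norm.isf(pvals)                     # convert P values to Z scores
          return stats.norm.sf((w * z).sum() / np.sqrt((w ** 2).sum()))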

  13. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis

    2010-01-01

    When facing a conjunction between space objects, decision makers must choose whether or not to maneuver for collision avoidance. We apply a well-known decision procedure, the sequential probability ratio test, to this problem. We propose two approaches to the problem solution, one based on a frequentist method and the other on a Bayesian method. The frequentist method does not require any prior knowledge concerning the conjunction, while the Bayesian method assumes knowledge of prior probability densities. Our results show that both methods achieve the desired missed-detection rates, but the frequentist method's false-alarm performance is inferior to the Bayesian method's.

  14. A Bayesian sequential design with adaptive randomization for 2-sided hypothesis test.

    PubMed

    Yu, Qingzhao; Zhu, Lin; Zhu, Han

    2017-11-01

    Bayesian sequential and adaptive randomization designs are gaining popularity in clinical trials thanks to their potential to reduce the number of required participants and save resources. We propose a Bayesian sequential design with adaptive randomization rates so as to more efficiently assign newly recruited patients to different treatment arms. In this paper, we consider 2-arm clinical trials. Patients are allocated to the 2 arms with a randomization rate chosen to achieve minimum variance for the test statistic. Algorithms are presented to calculate the optimal randomization rate, critical values, and power for the proposed design. Sensitivity analysis is implemented to check the influence on the design of changing the prior distributions. Simulation studies are applied to compare the proposed method and traditional methods in terms of power and actual sample sizes. Simulations show that, when the total sample size is fixed, the proposed design can obtain greater power and/or require a smaller actual sample size than the traditional Bayesian sequential design. Finally, we apply the proposed method to a real data set and compare the results with the Bayesian sequential design without adaptive randomization in terms of sample sizes. The proposed method can further reduce the required sample size. Copyright © 2017 John Wiley & Sons, Ltd.
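
    As a hedged illustration of a variance-minimizing randomization rate (the generic Neyman allocation for a two-arm comparison of means, not the paper's Bayesian updating scheme), the fraction of new patients sent to arm 1 is sigma1/(sigma1+sigma2):

      import math

      def neyman_allocation_rate(sd_arm1, sd_arm2):
          """Variance-minimizing (Neyman) allocation for a two-arm difference in means:
          assign a fraction sd1/(sd1+sd2) of new patients to arm 1. The paper's design
          updates this rate from posterior estimates; here the standard deviations are
          assumed known for illustration."""
          return sd_arm1 / (sd_arm1 + sd_arm2)

      # Binary endpoints: sd = sqrt(p*(1-p)), e.g. response rates of 0.3 and 0.5.
      rate = neyman_allocation_rate(math.sqrt(0.3 * 0.7), math.sqrt(0.5 * 0.5))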

  15. Repeated significance tests of linear combinations of sensitivity and specificity of a diagnostic biomarker

    PubMed Central

    Wu, Mixia; Shu, Yu; Li, Zhaohai; Liu, Aiyi

    2016-01-01

    A sequential design is proposed to test whether the accuracy of a binary diagnostic biomarker meets the minimal level of acceptance. The accuracy of a binary diagnostic biomarker is a linear combination of the marker’s sensitivity and specificity. The objective of the sequential method is to minimize the maximum expected sample size under the null hypothesis that the marker’s accuracy is below the minimal level of acceptance. The exact results of two-stage designs based on Youden’s index and efficiency indicate that the maximum expected sample sizes are smaller than the sample sizes of the fixed designs. Exact methods are also developed for estimation, confidence interval and p-value concerning the proposed accuracy index upon termination of the sequential testing. PMID:26947768
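
    For reference, the two accuracy indices named above are simple linear combinations of sensitivity and specificity; the definitions below use the standard forms (equal weighting for Youden's J, prevalence weighting for efficiency), which we assume match the paper's usage.

      def youden_index(sensitivity, specificity):
          """Youden's J statistic: J = Se + Sp - 1 (equal weighting of Se and Sp)."""
          return sensitivity + specificity - 1.0

      def efficiency(sensitivity, specificity, prevalence):
          """Diagnostic efficiency (overall correct-classification rate): a
          prevalence-weighted linear combination of Se and Sp."""
          return prevalence * sensitivity + (1.0 - prevalence) * specificity

      # e.g. youden_index(0.85, 0.90) -> 0.75; efficiency(0.85, 0.90, 0.30) -> 0.885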

  16. Sequential Probability Ratio Testing with Power Projective Base Method Improves Decision-Making for BCI

    PubMed Central

    Liu, Rong

    2017-01-01

    Obtaining a fast and reliable decision is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this study, the EEG signals were first analyzed with a power projective base method. Then we applied a decision-making model, sequential probability ratio testing (SPRT), for single-trial classification of motor imagery movement events. The unique strength of this proposed classification method lies in its accumulative process, which increases the discriminative power as more and more evidence is observed over time. The properties of the method were illustrated on thirteen subjects' recordings from three datasets. Results showed that our proposed power projective method outperformed two benchmark methods for every subject. Moreover, with the sequential classifier, the accuracies across subjects were significantly higher than those with nonsequential ones. The average maximum accuracy of the SPRT method was 84.1%, as compared with 82.3% accuracy for the sequential Bayesian (SB) method. The proposed SPRT method provides an explicit relationship between stopping time, thresholds, and error, which is important for balancing the time-accuracy trade-off. These results suggest SPRT would be useful in speeding up decision-making while trading off errors in BCI. PMID:29348781

  17. The effect of a sequential structure of practice for the training of perceptual-cognitive skills in tennis

    PubMed Central

    2017-01-01

    Objective: Anticipation of opponent actions, through the use of advanced (i.e., pre-event) kinematic information, can be trained using video-based temporal occlusion. Typically, this involves isolated opponent skills/shots presented as trials in a random order. However, two different areas of research concerning representative task design and contextual (non-kinematic) information suggest this structure of practice restricts expert performance. The aim of this study was to examine the effect of a sequential structure of practice during video-based training of anticipatory behavior in tennis, as well as the transfer of these skills to the performance environment. Methods: In a pre-practice-retention-transfer design, participants viewed life-sized video of tennis rallies across practice in either a sequential order (sequential group), in which participants were exposed to opponent skills/shots in the order they occur in the sport, or a non-sequential (non-sequential group) random order. Results: In the video-based retention test, the sequential group was significantly more accurate in their anticipatory judgments when the retention condition replicated the sequential structure compared to the non-sequential group. In the non-sequential retention condition, the non-sequential group was more accurate than the sequential group. In the field-based transfer test, overall decision time was significantly faster in the sequential group compared to the non-sequential group. Conclusion: Findings highlight the benefits of a sequential structure of practice for the transfer of anticipatory behavior in tennis. We discuss the role of contextual information, and the importance of representative task design, for the testing and training of perceptual-cognitive skills in sport. PMID:28355263

  18. An extended sequential goodness-of-fit multiple testing method for discrete data.

    PubMed

    Castro-Conde, Irene; Döhler, Sebastian; de Uña-Álvarez, Jacobo

    2017-10-01

    The sequential goodness-of-fit (SGoF) multiple testing method has recently been proposed as an alternative to the familywise error rate- and the false discovery rate-controlling procedures in high-dimensional problems. For discrete data, the SGoF method may be very conservative. In this paper, we introduce an alternative SGoF-type procedure that takes into account the discreteness of the test statistics. Like the original SGoF, our new method provides weak control of the false discovery rate/familywise error rate but attains false discovery rate levels closer to the desired nominal level, and thus it is more powerful. We study the performance of this method in a simulation study and illustrate its application to a real pharmacovigilance data set.
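
    A rough sketch of the original, continuous-data SGoF idea follows (the discrete-data refinement proposed in the paper is not reproduced): count the P values falling below a threshold gamma, compare that count with its Binomial(n, gamma) critical value at level alpha, and reject the excess number of smallest P values.

      import numpy as np
      from scipy import stats

      def sgof(pvalues, alpha=0.05, gamma=0.05):
          """Sketch of the original SGoF procedure: if the number of P values below
          gamma significantly exceeds its binomial expectation under the complete
          null, declare that many excess hypotheses (smallest P values) significant."""
          p = np.sort(np.asarray(pvalues, dtype=float))
          n = len(p)
          observed = int(np.sum(p <= gamma))
          critical = int(stats.binom.isf(alpha, n, gamma))  # one-sided binomial critical count
          n_reject = max(observed - critical, 0)
          return p[:n_reject]                               # the P values declared significant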

  19. Ultrasensitive surveillance of sensors and processes

    DOEpatents

    Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.

    2001-01-01

    A method and apparatus for monitoring a source of data for determining an operating state of a working system. The method includes determining a sensor (or source of data) arrangement associated with monitoring the source of data for a system; activating a first method for performing a sequential probability ratio test if the data source includes a single data (sensor) source; activating a second method for performing a regression sequential probability ratio testing procedure if the arrangement includes a pair of sensors (data sources) with signals which are linearly or non-linearly related; activating a third method for performing a bounded angle ratio test procedure if the sensor arrangement includes multiple sensors; and utilizing at least one of the first, second and third methods to accumulate sensor signals and determine the operating state of the system.

  20. Ultrasensitive surveillance of sensors and processes

    DOEpatents

    Wegerich, Stephan W.; Jarman, Kristin K.; Gross, Kenneth C.

    1999-01-01

    A method and apparatus for monitoring a source of data for determining an operating state of a working system. The method includes determining a sensor (or source of data) arrangement associated with monitoring the source of data for a system; activating a first method for performing a sequential probability ratio test if the data source includes a single data (sensor) source; activating a second method for performing a regression sequential probability ratio testing procedure if the arrangement includes a pair of sensors (data sources) with signals which are linearly or non-linearly related; activating a third method for performing a bounded angle ratio test procedure if the sensor arrangement includes multiple sensors; and utilizing at least one of the first, second and third methods to accumulate sensor signals and determine the operating state of the system.

  1. Numerical study on the sequential Bayesian approach for radioactive materials detection

    NASA Astrophysics Data System (ADS)

    Qingpei, Xiang; Dongfeng, Tian; Jianyu, Zhu; Fanhua, Hao; Ge, Ding; Jun, Zeng

    2013-01-01

    A new detection method, based on the sequential Bayesian approach proposed by Candy et al., offers new horizons for research into radioactive detection. Compared with the commonly adopted detection methods incorporating statistical theory, the sequential Bayesian approach offers the advantage of shorter verification time during the analysis of spectra that contain low total counts, especially with complex radionuclide components. In this paper, a simulation experiment platform implementing the methodology of the sequential Bayesian approach was developed. Event sequences of γ-rays associated with the true parameters of a LaBr3(Ce) detector were obtained from an event-sequence generator using Monte Carlo sampling theory to study the performance of the sequential Bayesian approach. The numerical experimental results are in accordance with those of Candy. Moreover, the relationship between the detection model and the event generator, respectively represented by the expected detection rate (Am) and the tested detection rate (Gm) parameters, is investigated. To achieve an optimal performance for this processor, the interval of the tested detection rate as a function of the expected detection rate is also presented.

  2. EEG Classification with a Sequential Decision-Making Method in Motor Imagery BCI.

    PubMed

    Liu, Rong; Wang, Yongxuan; Newman, Geoffrey I; Thakor, Nitish V; Ying, Sarah

    2017-12-01

    To develop subject-specific classifiers to recognize mental states fast and reliably is an important issue in brain-computer interfaces (BCI), particularly in practical real-time applications such as wheelchair or neuroprosthetic control. In this paper, a sequential decision-making strategy is explored in conjunction with an optimal wavelet analysis for EEG classification. The subject-specific wavelet parameters based on a grid-search method were first developed to determine the evidence accumulative curve for the sequential classifier. Then we proposed a new method to set the two constrained thresholds in the sequential probability ratio test (SPRT) based on the cumulative curve and a desired expected stopping time. As a result, it balanced the decision time of each class, and we term it balanced threshold SPRT (BTSPRT). The properties of the method were illustrated on 14 subjects' recordings from offline and online tests. Results showed the average maximum accuracy of the proposed method to be 83.4% and the average decision time of 2.77 s, when compared with 79.2% accuracy and a decision time of 3.01 s for the sequential Bayesian (SB) method. The BTSPRT method not only improves the classification accuracy and decision speed compared with the other nonsequential or SB methods, but also provides an explicit relationship between stopping time, thresholds and error, which is important for balancing the speed-accuracy tradeoff. These results suggest that BTSPRT would be useful in explicitly adjusting the tradeoff between rapid decision-making and error-free device control.

  3. Sequential sampling of ribes populations in the control of white pine blister rust (Cronartium ribicola Fischer) in California

    Treesearch

    Harold R. Offord

    1966-01-01

    Sequential sampling based on a negative binomial distribution of ribes populations required less than half the time taken by regular systematic line transect sampling in a comparison test. It gave the same control decision as the regular method in 9 of 13 field trials. A computer program that permits sequential plans to be built readily for other white pine regions is...

  4. 16 CFR 1500.41 - Method of testing primary irritant substances.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... corrosivity properties of substances, including testing that does not require animals, are presented in the CPSC's animal testing policy set forth in 16 CFR 1500.232. A weight-of-evidence analysis or a validated... conducted, a sequential testing strategy is recommended to reduce the number of test animals. The method of...

  5. The application of intraoperative transit time flow measurement to accurately assess anastomotic quality in sequential vein grafting

    PubMed Central

    Yu, Yang; Zhang, Fan; Gao, Ming-Xin; Li, Hai-Tao; Li, Jing-Xing; Song, Wei; Huang, Xin-Sheng; Gu, Cheng-Xiong

    2013-01-01

    OBJECTIVES Intraoperative transit time flow measurement (TTFM) is widely used to assess anastomotic quality in coronary artery bypass grafting (CABG). However, in sequential vein grafting, the flow characteristics collected by the conventional TTFM method are usually associated with total graft flow and might not accurately indicate the quality of every distal anastomosis in a sequential graft. The purpose of our study was to examine a new TTFM method that could assess the quality of each distal anastomosis in a sequential graft more reliably than the conventional TTFM approach. METHODS Two TTFM methods were tested in 84 patients who underwent sequential saphenous off-pump CABG in Beijing An Zhen Hospital between April and August 2012. In the conventional TTFM method, normal blood flow in the sequential graft was maintained during the measurement, and the flow probe was placed a few centimetres above the anastomosis to be evaluated. In the new method, blood flow in the sequential graft was temporarily reduced during the measurement by placing an atraumatic bulldog clamp at the graft a few centimetres distal to the anastomosis to be evaluated, while the position of the flow probe remained the same as in the conventional method. This new TTFM method was named the flow reduction TTFM. Graft flow parameters measured by both methods were compared. RESULTS Compared with the conventional TTFM, the flow reduction TTFM resulted in significantly lower mean graft blood flow (P < 0.05) but, in contrast, yielded a significantly higher pulsatility index (P < 0.05). Diastolic filling was not significantly different between the two methods and was >50% in both cases. Interestingly, the flow reduction TTFM identified two defective middle distal anastomoses that the conventional TTFM failed to detect. Graft flows near the defective distal anastomoses were improved substantially after revision. CONCLUSIONS In this study, we found that temporary reduction of graft flow during TTFM seemed to enhance the sensitivity of TTFM to less-than-critical anastomotic defects in a sequential graft and to improve the overall accuracy of the intraoperative assessment of anastomotic quality in sequential vein grafting. PMID:24000314

  6. A Robust Real Time Direction-of-Arrival Estimation Method for Sequential Movement Events of Vehicles.

    PubMed

    Liu, Huawei; Li, Baoqing; Yuan, Xiaobing; Zhou, Qianwei; Huang, Jingchang

    2018-03-27

    Parameter estimation for sequential movement events of vehicles faces the challenges of noise interference and the demands of portable implementation. In this paper, we propose a robust direction-of-arrival (DOA) estimation method for the sequential movement events of vehicles based on a small Micro-Electro-Mechanical System (MEMS) microphone array system. Inspired by the incoherent signal-subspace method (ISM), the method proposed in this work employs multiple sub-bands, selected from the wideband signals with high magnitude-squared coherence, to track moving vehicles in the presence of wind noise. The field test results demonstrate that the proposed method performs better at estimating the DOA of a moving vehicle, even in the case of severe wind interference, than the narrowband multiple signal classification (MUSIC) method, the sub-band DOA estimation method, and the classical two-sided correlation transformation (TCT) method.

  7. Blocking for Sequential Political Experiments

    PubMed Central

    Moore, Sally A.

    2013-01-01

    In typical political experiments, researchers randomize a set of households, precincts, or individuals to treatments all at once, and characteristics of all units are known at the time of randomization. However, in many other experiments, subjects “trickle in” to be randomized to treatment conditions, usually via complete randomization. To take advantage of the rich background data that researchers often have (but underutilize) in these experiments, we develop methods that use continuous covariates to assign treatments sequentially. We build on biased coin and minimization procedures for discrete covariates and demonstrate that our methods outperform complete randomization, producing better covariate balance in simulated data. We then describe how we selected and deployed a sequential blocking method in a clinical trial and demonstrate the advantages of our having done so. Further, we show how that method would have performed in two larger sequential political trials. Finally, we compare causal effect estimates from differences in means, augmented inverse propensity weighted estimators, and randomization test inversion. PMID:24143061
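
    The sketch below illustrates classical Pocock-Simon-style minimization with a biased coin for discrete covariates only, as background for the continuous-covariate sequential blocking the paper develops; the factor names, probability p_best, and two-arm setup are assumptions for illustration.

      import random
      from collections import defaultdict

      def minimization_assign(new_covariates, history, arms=("treatment", "control"),
                              p_best=0.8, rng=random.Random(0)):
          """Pocock-Simon-style minimization sketch (discrete covariates only).
          new_covariates: dict factor -> level for the incoming subject.
          history: list of (assigned_arm, covariates_dict) for prior subjects.
          The arm that would minimize total covariate imbalance is chosen with
          probability p_best (biased coin); otherwise another arm is picked."""
          counts = defaultdict(int)   # (arm, factor, level) -> count so far
          for arm, cov in history:
              for f, lvl in cov.items():
                  counts[(arm, f, lvl)] += 1

          def imbalance(candidate_arm):
              total = 0
              for f, lvl in new_covariates.items():
                  hypothetical = {a: counts[(a, f, lvl)] + (1 if a == candidate_arm else 0)
                                  for a in arms}
                  total += max(hypothetical.values()) - min(hypothetical.values())
              return total

          ranked = sorted(arms, key=imbalance)
          return ranked[0] if rng.random() < p_best else rng.choice(ranked[1:])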

  8. NASA DOE POD NDE Capabilities Data Book

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.

  9. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  10. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM-2.5

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  11. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM2.5 and PM−2.5.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... reference method samplers shall be of single-filter design (not multi-filter, sequential sample design... and multiplicative bias (comparative slope and intercept). (1) For each test site, calculate the mean...

  12. Beyond Grand Rounds: A Comprehensive and Sequential Intervention to Improve Identification of Delirium

    ERIC Educational Resources Information Center

    Ramaswamy, Ravishankar; Dix, Edward F.; Drew, Janet E.; Diamond, James J.; Inouye, Sharon K.; Roehl, Barbara J. O.

    2011-01-01

    Purpose of the Study: Delirium is a widespread concern for hospitalized seniors, yet is often unrecognized. A comprehensive and sequential intervention (CSI) aiming to effect change in clinician behavior by improving knowledge about delirium was tested. Design and Methods: A 2-day CSI program that consisted of progressive 4-part didactic series,…

  13. Radiation detection method and system using the sequential probability ratio test

    DOEpatents

    Nelson, Karl E [Livermore, CA; Valentine, John D [Redwood City, CA; Beauchamp, Brock R [San Ramon, CA

    2007-07-17

    A method and system using the Sequential Probability Ratio Test to enhance the detection of an elevated level of radiation, by determining whether a set of observations are consistent with a specified model within a given bounds of statistical significance. In particular, the SPRT is used in the present invention to maximize the range of detection, by providing processing mechanisms for estimating the dynamic background radiation, adjusting the models to reflect the amount of background knowledge at the current point in time, analyzing the current sample using the models to determine statistical significance, and determining when the sample has returned to the expected background conditions.

  14. Alternative methods for the median lethal dose (LD(50)) test: the up-and-down procedure for acute oral toxicity.

    PubMed

    Rispin, Amy; Farrar, David; Margosches, Elizabeth; Gupta, Kailash; Stitzel, Katherine; Carr, Gregory; Greene, Michael; Meyer, William; McCall, Deborah

    2002-01-01

    The authors have developed an improved version of the up-and-down procedure (UDP) as one of the replacements for the traditional acute oral toxicity test formerly used by the Organisation for Economic Co-operation and Development member nations to characterize industrial chemicals, pesticides, and their mixtures. This method improves the performance of acute testing for applications that use the median lethal dose (classic LD50) test while achieving significant reductions in animal use. It uses sequential dosing, together with sophisticated computer-assisted computational methods during the execution and calculation phases of the test. Staircase design, a form of sequential test design, can be applied to acute toxicity testing with its binary experimental endpoints (yes/no outcomes). The improved UDP provides a point estimate of the LD50 and approximate confidence intervals in addition to observed toxic signs for the substance tested. It does not provide information about the dose-response curve. Computer simulation was used to test performance of the UDP without the need for additional laboratory validation.
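
    The core staircase rule is easy to state in code; the sketch below covers only the dosing sequence (the likelihood-based stopping rule and maximum-likelihood LD50 estimate of the full guideline are omitted), and the default starting dose and half-log spacing are illustrative assumptions.

      def up_and_down_doses(outcomes, start_dose=175.0, step_factor=3.2):
          """Staircase (up-and-down) dosing sketch: after each animal, the next dose
          moves down by step_factor if the animal died (True) and up otherwise."""
          doses = [start_dose]
          for died in outcomes:
              doses.append(doses[-1] / step_factor if died else doses[-1] * step_factor)
          return doses

      # e.g. up_and_down_doses([False, False, True, False, True])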

  15. Evaluation of non-animal methods for assessing skin sensitisation hazard: A Bayesian Value-of-Information analysis.

    PubMed

    Leontaridou, Maria; Gabbert, Silke; Van Ierland, Ekko C; Worth, Andrew P; Landsiedel, Robert

    2016-07-01

    This paper offers a Bayesian Value-of-Information (VOI) analysis for guiding the development of non-animal testing strategies, balancing information gains from testing with the expected social gains and costs from the adoption of regulatory decisions. Testing is assumed to have value, if, and only if, the information revealed from testing triggers a welfare-improving decision on the use (or non-use) of a substance. As an illustration, our VOI model is applied to a set of five individual non-animal prediction methods used for skin sensitisation hazard assessment, seven battery combinations of these methods, and 236 sequential 2-test and 3-test strategies. Their expected values are quantified and compared to the expected value of the local lymph node assay (LLNA) as the animal method. We find that battery and sequential combinations of non-animal prediction methods reveal a significantly higher expected value than the LLNA. This holds for the entire range of prior beliefs. Furthermore, our results illustrate that the testing strategy with the highest expected value does not necessarily have to follow the order of key events in the sensitisation adverse outcome pathway (AOP). 2016 FRAME.

  16. Parallel heuristics for scalable community detection

    DOE PAGES

    Lu, Hao; Halappanavar, Mahantesh; Kalyanaraman, Ananth

    2015-08-14

    Community detection has become a fundamental operation in numerous graph-theoretic applications. Despite its potential for application, there is only limited support for community detection on large-scale parallel computers, largely owing to the irregular and inherently sequential nature of the underlying heuristics. In this paper, we present parallelization heuristics for fast community detection using the Louvain method as the serial template. The Louvain method is an iterative heuristic for modularity optimization. Originally developed in 2008, the method has become increasingly popular owing to its ability to detect high modularity community partitions in a fast and memory-efficient manner. However, the method is also inherently sequential, thereby limiting its scalability. Here, we observe certain key properties of this method that present challenges for its parallelization, and consequently propose heuristics that are designed to break the sequential barrier. For evaluation purposes, we implemented our heuristics using OpenMP multithreading, and tested them over real world graphs derived from multiple application domains. Compared to the serial Louvain implementation, our parallel implementation is able to produce community outputs with a higher modularity for most of the inputs tested, in a comparable number or fewer iterations, while providing real speedups of up to 16x using 32 threads.

  17. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2015-08-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.

  18. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2016-04-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.

  19. Rise and fall of political complexity in island South-East Asia and the Pacific.

    PubMed

    Currie, Thomas E; Greenhill, Simon J; Gray, Russell D; Hasegawa, Toshikazu; Mace, Ruth

    2010-10-14

    There is disagreement about whether human political evolution has proceeded through a sequence of incremental increases in complexity, or whether larger, non-sequential increases have occurred. The extent to which societies have decreased in complexity is also unclear. These debates have continued largely in the absence of rigorous, quantitative tests. We evaluated six competing models of political evolution in Austronesian-speaking societies using phylogenetic methods. Here we show that in the best-fitting model political complexity rises and falls in a sequence of small steps. This is closely followed by another model in which increases are sequential but decreases can be either sequential or in bigger drops. The results indicate that large, non-sequential jumps in political complexity have not occurred during the evolutionary history of these societies. This suggests that, despite the numerous contingent pathways of human history, there are regularities in cultural evolution that can be detected using computational phylogenetic methods.

  20. Physics-based, Bayesian sequential detection method and system for radioactive contraband

    DOEpatents

    Candy, James V; Axelrod, Michael C; Breitfeller, Eric F; Chambers, David H; Guidry, Brian L; Manatt, Douglas R; Meyer, Alan W; Sale, Kenneth E

    2014-03-18

    A distributed sequential method and system for detecting and identifying radioactive contraband from highly uncertain (noisy) low-count, radionuclide measurements, i.e. an event mode sequence (EMS), using a statistical approach based on Bayesian inference and physics-model-based signal processing based on the representation of a radionuclide as a monoenergetic decomposition of monoenergetic sources. For a given photon event of the EMS, the appropriate monoenergy processing channel is determined using a confidence interval condition-based discriminator for the energy amplitude and interarrival time and parameter estimates are used to update a measured probability density function estimate for a target radionuclide. A sequential likelihood ratio test is then used to determine one of two threshold conditions signifying that the EMS is either identified as the target radionuclide or not, and if not, then repeating the process for the next sequential photon event of the EMS until one of the two threshold conditions is satisfied.

  1. Three-dimensional mapping of equiprobable hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shirley, C.; Pohlmann, K.; Andricevic, R.

    1996-09-01

    Geological and geophysical data are used with the sequential indicator simulation algorithm of Gomez-Hernandez and Srivastava to produce multiple, equiprobable, three-dimensional maps of informal hydrostratigraphic units at the Frenchman Flat Corrective Action Unit, Nevada Test Site. The upper 50 percent of the Tertiary volcanic lithostratigraphic column comprises the study volume. Semivariograms are modeled from indicator-transformed geophysical tool signals. Each equiprobable study volume is subdivided into discrete classes using the ISIM3D implementation of the sequential indicator simulation algorithm. Hydraulic conductivity is assigned within each class using the sequential Gaussian simulation method of Deutsch and Journel. The resulting maps show the contiguity of high and low hydraulic conductivity regions.

  2. Simulations of 6-DOF Motion with a Cartesian Method

    NASA Technical Reports Server (NTRS)

    Murman, Scott M.; Aftosmis, Michael J.; Berger, Marsha J.; Kwak, Dochan (Technical Monitor)

    2003-01-01

    Coupled 6-DOF/CFD trajectory predictions using an automated Cartesian method are demonstrated by simulating a GBU-32/JDAM store separating from an F-18C aircraft. Numerical simulations are performed at two Mach numbers near the sonic speed, and compared with flight-test telemetry and photographic-derived data. Simulation results obtained with a sequential-static series of flow solutions are contrasted with results using a time-dependent flow solver. Both numerical methods show good agreement with the flight-test data through the first half of the simulations. The sequential-static and time-dependent methods diverge over the last half of the trajectory prediction, after the store produces peak angular rates. A cost comparison for the Cartesian method is included, in terms of absolute cost and relative to computing uncoupled 6-DOF trajectories. A detailed description of the 6-DOF method, as well as a verification of its accuracy, is provided in an appendix.

  3. A Rejection Principle for Sequential Tests of Multiple Hypotheses Controlling Familywise Error Rates

    PubMed Central

    BARTROFF, JAY; SONG, JINLIN

    2015-01-01

    We present a unifying approach to multiple testing procedures for sequential (or streaming) data by giving sufficient conditions for a sequential multiple testing procedure to control the familywise error rate (FWER). Together we call these conditions a “rejection principle for sequential tests,” which we then apply to some existing sequential multiple testing procedures to give simplified understanding of their FWER control. Next the principle is applied to derive two new sequential multiple testing procedures with provable FWER control, one for testing hypotheses in order and another for closed testing. Examples of these new procedures are given by applying them to a chromosome aberration data set and to finding the maximum safe dose of a treatment. PMID:26985125

  4. Diagnostic test accuracy and prevalence inferences based on joint and sequential testing with finite population sampling.

    PubMed

    Su, Chun-Lung; Gardner, Ian A; Johnson, Wesley O

    2004-07-30

    The two-test two-population model, originally formulated by Hui and Walter, for estimation of test accuracy and prevalence assumes conditionally independent tests, constant accuracy across populations and binomial sampling. The binomial assumption is incorrect if all individuals in a population (e.g., a child-care centre, a village in Africa, or a cattle herd) are sampled or if the sample size is large relative to the population size. In this paper, we develop statistical methods for evaluating diagnostic test accuracy and prevalence estimation based on finite sample data in the absence of a gold standard. Moreover, two tests are often applied simultaneously for the purpose of obtaining a 'joint' testing strategy that has either higher overall sensitivity or specificity than either of the two tests considered singly. Sequential versions of such strategies are often applied in order to reduce the cost of testing. We thus discuss joint (simultaneous and sequential) testing strategies and inference for them. Using the developed methods, we analyse two real and one simulated data sets, and we compare 'hypergeometric' and 'binomial-based' inferences. Our findings indicate that the posterior standard deviations for prevalence (but not sensitivity and specificity) based on finite population sampling tend to be smaller than their counterparts for infinite population sampling. Finally, we make recommendations about how small the sample size should be relative to the population size to warrant use of the binomial model for prevalence estimation. Copyright 2004 John Wiley & Sons, Ltd.
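
    For orientation, the familiar accuracy of the two basic joint strategies under the conditional-independence simplification (the very assumption the model here examines) is sketched below: an "either positive" rule trades specificity for sensitivity, and a "both positive" rule does the reverse.

      def parallel_series_accuracy(se1, sp1, se2, sp2):
          """Sensitivity/specificity of two-test strategies assuming conditionally
          independent tests (an idealization, not the paper's general treatment)."""
          se_parallel = se1 + se2 - se1 * se2   # 'either positive' (believe the positive)
          sp_parallel = sp1 * sp2
          se_series = se1 * se2                 # 'both positive' (believe the negative)
          sp_series = sp1 + sp2 - sp1 * sp2
          return {"either_positive": (se_parallel, sp_parallel),
                  "both_positive": (se_series, sp_series)}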

  5. Feature Selection based on Machine Learning in MRIs for Hippocampal Segmentation

    NASA Astrophysics Data System (ADS)

    Tangaro, Sabina; Amoroso, Nicola; Brescia, Massimo; Cavuoti, Stefano; Chincarini, Andrea; Errico, Rosangela; Paolo, Inglese; Longo, Giuseppe; Maglietta, Rosalia; Tateo, Andrea; Riccio, Giuseppe; Bellotti, Roberto

    2015-01-01

    Neurodegenerative diseases are frequently associated with structural changes in the brain. Magnetic resonance imaging (MRI) scans can show these variations and therefore can be used as a supportive feature for a number of neurodegenerative diseases. The hippocampus has been known to be a biomarker for Alzheimer disease and other neurological and psychiatric diseases. However, it requires accurate, robust, and reproducible delineation of hippocampal structures. Fully automatic methods usually take a voxel-based approach, in which a number of local features are calculated for each voxel. In this paper, we compared four different techniques for feature selection from a set of 315 features extracted for each voxel: (i) a filter method based on the Kolmogorov-Smirnov test; two wrapper methods, namely (ii) sequential forward selection and (iii) sequential backward elimination; and (iv) an embedded method based on the Random Forest classifier, on a set of 10 T1-weighted brain MRIs and tested on an independent set of 25 subjects. The resulting segmentations were compared with manual reference labelling. By using only 23 features for each voxel (sequential backward elimination) we obtained performance comparable to that of the standard tool FreeSurfer.
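
    Of the four techniques compared, the wrapper methods are the easiest to sketch; below is a generic greedy sequential forward selection loop with a cross-validated classifier, where the estimator, scoring, and feature-count cap are illustrative choices rather than the paper's exact configuration (X is an n_samples x n_features array).

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      def sequential_forward_selection(X, y, n_features, estimator=None, cv=5):
          """Greedy wrapper selection sketch: at each step, add the feature that most
          improves cross-validated accuracy of the chosen classifier."""
          est = estimator or RandomForestClassifier(n_estimators=100, random_state=0)
          selected, remaining = [], list(range(X.shape[1]))
          while len(selected) < n_features and remaining:
              scores = []
              for j in remaining:
                  cols = selected + [j]
                  scores.append((cross_val_score(est, X[:, cols], y, cv=cv).mean(), j))
              best_score, best_j = max(scores)      # feature giving the highest CV score
              selected.append(best_j)
              remaining.remove(best_j)
          return selected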

  6. Prediction of rat protein subcellular localization with pseudo amino acid composition based on multiple sequential features.

    PubMed

    Shi, Ruijia; Xu, Cunshuan

    2011-06-01

    The study of rat proteins is an indispensable task in experimental medicine and drug development. The function of a rat protein is closely related to its subcellular location. Based on this concept, we construct a benchmark rat protein dataset and develop a combined approach for predicting the subcellular localization of rat proteins. From the protein primary sequence, multiple sequential features are obtained by using discrete Fourier analysis, a position conservation scoring function and increment of diversity, and these sequential features are selected as input parameters of a support vector machine. By the jackknife test, the overall success rate of prediction is 95.6% on the rat protein dataset. When our method is applied to the apoptosis protein dataset and the Gram-negative bacterial protein dataset with the jackknife test, the overall success rates are 89.9% and 96.4%, respectively. The above results indicate that our proposed method is quite promising and may play a complementary role to the existing predictors in this area.

  7. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM 2.5 and PM −2.5.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... section. All reference method samplers shall be of single-filter design (not multi-filter, sequential sample design). Each candidate method shall be setup and operated in accordance with its associated... precision specified in table C-4 of this subpart. (g) Test for additive and multiplicative bias (comparative...

  8. 40 CFR 53.35 - Test procedure for Class II and Class III methods for PM 2.5 and PM −2.5.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... section. All reference method samplers shall be of single-filter design (not multi-filter, sequential sample design). Each candidate method shall be setup and operated in accordance with its associated... precision specified in table C-4 of this subpart. (g) Test for additive and multiplicative bias (comparative...

  9. Adaptive sequential Bayesian classification using Page's test

    NASA Astrophysics Data System (ADS)

    Lynch, Robert S., Jr.; Willett, Peter K.

    2002-03-01

    In this paper, the previously introduced Mean-Field Bayesian Data Reduction Algorithm is extended for adaptive sequential hypothesis testing utilizing Page's test. In general, Page's test is well understood as a method of detecting a permanent change in distribution associated with a sequence of observations. However, the relationship between detecting a change in distribution utilizing Page's test and the problems of classification and feature fusion is not well understood. Thus, the contribution of this work is based on developing a method of classifying an unlabeled vector of fused features (i.e., detecting a change to an active statistical state) as quickly as possible given an acceptable mean time between false alerts. In this case, the developed classification test can be thought of as equivalent to performing a sequential probability ratio test repeatedly until a class is decided, with the lower log-threshold of each test being set to zero and the upper log-threshold being determined by the expected distance between false alerts. It is of interest to estimate the delay (or related stopping time) to a classification decision (the number of time samples it takes to classify the target), and the mean time between false alerts, as a function of feature selection and fusion by the Mean-Field Bayesian Data Reduction Algorithm. Results are demonstrated by plotting the delay to declaring the target class versus the mean time between false alerts, and are shown using both different numbers of simulated training data and different numbers of relevant features for each class.
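
    The accumulation rule described above (repeated SPRTs with the lower log-threshold at zero) is Page's CUSUM; a minimal sketch follows, with the upper threshold h left as a free parameter to be set from the acceptable mean time between false alerts.

      def page_cusum(log_lik_ratios, threshold):
          """Page's test (CUSUM) sketch: accumulate log-likelihood ratios, reset the
          statistic at zero, and alarm when it crosses the upper threshold h."""
          w = 0.0
          for n, llr in enumerate(log_lik_ratios, start=1):
              w = max(0.0, w + llr)
              if w >= threshold:
                  return n        # alarm (change/classification declared) at sample n
          return None             # no alarm within the observed sequence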

  10. Sequential Bayesian Geostatistical Inversion and Evaluation of Combined Data Worth for Aquifer Characterization at the Hanford 300 Area

    NASA Astrophysics Data System (ADS)

    Murakami, H.; Chen, X.; Hahn, M. S.; Over, M. W.; Rockhold, M. L.; Vermeul, V.; Hammond, G. E.; Zachara, J. M.; Rubin, Y.

    2010-12-01

    Subsurface characterization for predicting groundwater flow and contaminant transport requires us to integrate large and diverse datasets in a consistent manner, and quantify the associated uncertainty. In this study, we sequentially assimilated multiple types of datasets for characterizing a three-dimensional heterogeneous hydraulic conductivity field at the Hanford 300 Area. The datasets included constant-rate injection tests, electromagnetic borehole flowmeter tests, lithology profile and tracer tests. We used the method of anchored distributions (MAD), which is a modular-structured Bayesian geostatistical inversion method. MAD has two major advantages over the other inversion methods. First, it can directly infer a joint distribution of parameters, which can be used as an input in stochastic simulations for prediction. In MAD, in addition to typical geostatistical structural parameters, the parameter vector includes multiple point values of the heterogeneous field, called anchors, which capture local trends and reduce uncertainty in the prediction. Second, MAD allows us to integrate the datasets sequentially in a Bayesian framework such that it updates the posterior distribution, as a new dataset is included. The sequential assimilation can decrease computational burden significantly. We applied MAD to assimilate different combinations of the datasets, and then compared the inversion results. For the injection and tracer test assimilation, we calculated temporal moments of pressure build-up and breakthrough curves, respectively, to reduce the data dimension. A massive parallel flow and transport code PFLOTRAN is used for simulating the tracer test. For comparison, we used different metrics based on the breakthrough curves not used in the inversion, such as mean arrival time, peak concentration and early arrival time. This comparison intends to yield the combined data worth, i.e. which combination of the datasets is the most effective for a certain metric, which will be useful for guiding the further characterization effort at the site and also the future characterization projects at the other sites.
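
    As noted above, the injection and tracer test data were reduced to temporal moments of the pressure build-up and breakthrough curves before assimilation. The Python sketch below shows that reduction for a synthetic breakthrough curve; the curve shape and time grid are assumptions for illustration, not the Hanford 300 Area data.

        import numpy as np

        def temporal_moments(t, c):
            """Zeroth temporal moment (area under the curve) and normalized first
            moment (mean arrival time) of a breakthrough curve c(t), using a
            simple rectangle rule on a uniform time grid."""
            dt = t[1] - t[0]
            m0 = np.sum(c) * dt
            mean_arrival = np.sum(t * c) / np.sum(c)
            return m0, mean_arrival

        # Synthetic breakthrough curve (log-normal-shaped pulse), not field data.
        t = np.linspace(0.01, 50.0, 5000)
        c = np.exp(-(np.log(t) - np.log(10.0))**2 / 0.5)
        m0, mean_arrival = temporal_moments(t, c)
        print(f"zeroth moment = {m0:.2f}, mean arrival time = {mean_arrival:.2f}")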

  11. Brief Lags in Interrupted Sequential Performance: Evaluating a Model and Model Evaluation Method

    DTIC Science & Technology

    2015-01-05

    rehearsal mechanism in the model. To evaluate the model we developed a simple new goodness-of-fit test based on analysis of variance that offers an...repeated step). Sequential constraints are common in medicine, equipment maintenance, computer programming and technical support, data analysis ... legal analysis, accounting, and many other home and workplace environments. Sequential constraints also play a role in such basic cognitive processes

  12. Evaluating specificity of sequential extraction for chemical forms of lead in artificially-contaminated and field-contaminated soils.

    PubMed

    Tai, Yiping; McBride, Murray B; Li, Zhian

    2013-03-30

    In the present study, we evaluated a commonly employed modified Bureau Communautaire de Référence (BCR test) 3-step sequential extraction procedure for its ability to distinguish forms of solid-phase Pb in soils with different sources and histories of contamination. When the modified BCR test was applied to mineral soils spiked with three forms of Pb (pyromorphite, hydrocerussite and nitrate salt), the added Pb was highly susceptible to dissolution in the operationally-defined "reducible" or "oxide" fraction regardless of form. When three different materials (mineral soil, organic soil and goethite) were spiked with soluble Pb nitrate, the BCR sequential extraction profiles revealed that soil organic matter was capable of retaining Pb in more stable and acid-resistant forms than silicate clay minerals or goethite. However, the BCR sequential extraction for field-collected soils with known and different sources of Pb contamination was not sufficiently discriminatory in the dissolution of soil Pb phases to allow soil Pb forms to be "fingerprinted" by this method. It is concluded that standard sequential extraction procedures are probably not very useful in predicting lability and bioavailability of Pb in contaminated soils. Copyright © 2013 Elsevier B.V. All rights reserved.

  13. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
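
    The decision rule described above is a Wald-type sequential probability ratio test whose thresholds follow from the targeted false alarm and missed detection rates. The Python sketch below shows the generic test only; the Gaussian log-likelihood ratios standing in for sequential collision-probability estimates are an illustrative assumption, not the paper's formulation.

        import numpy as np

        def wald_sprt(llr_stream, alpha, beta):
            """Wald SPRT: accumulate log-likelihood ratios and stop when the sum
            crosses log((1-beta)/alpha) (accept H1) or log(beta/(1-alpha))
            (accept H0), with alpha the targeted false alarm rate and beta the
            targeted missed detection rate."""
            upper = np.log((1 - beta) / alpha)
            lower = np.log(beta / (1 - alpha))
            s = 0.0
            for n, llr in enumerate(llr_stream, start=1):
                s += llr
                if s >= upper:
                    return "accept H1", n
                if s <= lower:
                    return "accept H0", n
            return "continue", n

        # Placeholder data favouring H1 (Gaussian mean 1 vs 0, unit variance).
        rng = np.random.default_rng(2)
        data = rng.normal(1.0, 1.0, 100)
        decision, n = wald_sprt((x - 0.5 for x in data), alpha=0.01, beta=0.05)
        print(decision, "after", n, "observations")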

  14. Sequential accelerated tests: Improving the correlation of accelerated tests to module performance in the field

    NASA Astrophysics Data System (ADS)

    Felder, Thomas; Gambogi, William; Stika, Katherine; Yu, Bao-Ling; Bradley, Alex; Hu, Hongjie; Garreau-Iles, Lucie; Trout, T. John

    2016-09-01

    DuPont has been working steadily to develop accelerated backsheet tests that correlate with solar panel observations in the field. This report updates efforts in sequential testing. Single exposure tests are more commonly used and can be completed more quickly, and certain tests provide helpful predictions of certain backsheet failure modes. DuPont recommendations for single exposure tests are based on 25-year exposure levels for UV and humidity/temperature, and form a good basis for sequential test development. We recommend a sequential exposure of damp heat followed by UV, then repetitions of thermal cycling and UVA. This sequence preserves 25-year exposure levels for humidity/temperature and UV, and correlates well with a large body of field observations. Measurements can be taken at intervals in the test, although the full test runs 10 months. A second, shorter sequential test based on damp heat and thermal cycling tests mechanical durability and correlates with loss of mechanical properties seen in the field. Ongoing work is directed toward shorter sequential tests that preserve good correlation to field data.

  15. Pre-testing Orientation for the Disadvantaged.

    ERIC Educational Resources Information Center

    Mihalka, Joseph A.

    A pre-testing orientation was incorporated into the Work Incentives Program, a pre-vocational program for disadvantaged youth. Test-taking skills were taught in seven and one half hours of instruction and a variety of methods were used to provide a sequential experience with distributed learning, positive reinforcement, and immediate feedback of…

  16. New Testing Methods to Assess Technical Problem-Solving Ability.

    ERIC Educational Resources Information Center

    Hambleton, Ronald K.; And Others

    Tests to assess problem-solving ability being provided for the Air Force are described, and some details on the development and validation of these computer-administered diagnostic achievement tests are discussed. Three measurement approaches were employed: (1) sequential problem solving; (2) context-free assessment of fundamental skills and…

  17. A Novel Ship-Tracking Method for GF-4 Satellite Sequential Images.

    PubMed

    Yao, Libo; Liu, Yong; He, You

    2018-06-22

    The geostationary remote sensing satellite has the capability of wide scanning, persistent observation and operational response, and has tremendous potential for maritime target surveillance. The GF-4 satellite is the first geostationary orbit (GEO) optical remote sensing satellite with medium resolution in China. In this paper, a novel ship-tracking method in GF-4 satellite sequential imagery is proposed. The algorithm has three stages. First, a local visual saliency map based on local peak signal-to-noise ratio (PSNR) is used to detect ships in a single frame of GF-4 satellite sequential images. Second, accurate positioning of each potential target is achieved by a dynamic correction using the rational polynomial coefficients (RPCs) and automatic identification system (AIS) data of ships. Finally, an improved multiple hypotheses tracking (MHT) algorithm with amplitude information is used to track ships by further removing the false targets, and to estimate ships’ motion parameters. The algorithm has been tested using GF-4 sequential images and AIS data. The results of the experiment demonstrate that the algorithm achieves good tracking performance in GF-4 satellite sequential images and estimates the motion information of ships accurately.

  18. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    NASA Technical Reports Server (NTRS)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.

  19. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming, highly elliptical orbit formation flying mission.

  20. Evaluation of sequential extraction procedures for soluble and insoluble hexavalent chromium compounds in workplace air samples.

    PubMed

    Ashley, Kevin; Applegate, Gregory T; Marcy, A Dale; Drake, Pamela L; Pierce, Paul A; Carabin, Nathalie; Demange, Martine

    2009-02-01

    Because toxicities may differ for Cr(VI) compounds of varying solubility, some countries and organizations have promulgated different occupational exposure limits (OELs) for soluble and insoluble hexavalent chromium (Cr(VI)) compounds, and analytical methods are needed to determine these species in workplace air samples. To address this need, international standard methods ASTM D6832 and ISO 16740 have been published that describe sequential extraction techniques for soluble and insoluble Cr(VI) in samples collected from occupational settings. However, no published performance data were previously available for these Cr(VI) sequential extraction procedures. In this work, the sequential extraction methods outlined in the relevant international standards were investigated. The procedures tested involved the use of either deionized water or an ammonium sulfate/ammonium hydroxide buffer solution to target soluble Cr(VI) species. This was followed by extraction in a sodium carbonate/sodium hydroxide buffer solution to dissolve insoluble Cr(VI) compounds. Three-step sequential extraction with (1) water, (2) sulfate buffer and (3) carbonate buffer was also investigated. Sequential extractions were carried out on spiked samples of soluble, sparingly soluble and insoluble Cr(VI) compounds, and analyses were then generally carried out by using the diphenylcarbazide method. Similar experiments were performed on paint pigment samples and on airborne particulate filter samples collected from stainless steel welding. Potential interferences from soluble and insoluble Cr(III) compounds, as well as from Fe(II), were investigated. Interferences from Cr(III) species were generally absent, while the presence of Fe(II) resulted in low Cr(VI) recoveries. Two-step sequential extraction of spiked samples with (first) either water or sulfate buffer, and then carbonate buffer, yielded quantitative recoveries of soluble Cr(VI) and insoluble Cr(VI), respectively. Three-step sequential extraction gave excessively high recoveries of soluble Cr(VI), low recoveries of sparingly soluble Cr(VI), and quantitative recoveries of insoluble Cr(VI). Experiments on paint pigment samples using two-step extraction with water and carbonate buffer yielded varying percentages of relative fractions of soluble and insoluble Cr(VI). Sequential extractions of stainless steel welding fume air filter samples demonstrated the predominance of soluble Cr(VI) compounds in such samples. The performance data obtained in this work support the Cr(VI) sequential extraction procedures described in the international standards.

  1. Assessment of sequential same arm agreement of blood pressure measurements by a CVProfilor DO-2020 versus a Baumanometer mercury sphygmomanometer.

    PubMed

    Prisant, L M; Resnick, L M; Hollenberg, S M

    2001-06-01

    The aim of this study was to assess the agreement between sequential same arm blood pressure measurements made with a mercury sphygmomanometer and the oscillometric blood pressure measurements from a device that also determines arterial elasticity. A prospective, multicentre, clinical study evaluated sequential same arm blood pressure measurements, using a mercury sphygmomanometer (Baumanometer, W. A. Baum Co., Inc., Copiague, New York, USA) and an oscillometric non-invasive device that calculates arterial elasticity (CVProfilor DO-2020 Cardiovascular Profiling System, Hypertension Diagnostics, Inc., Eagan, Minnesota, USA). Blood pressure was measured supine in triplicate, 3 min apart in a randomized sequence after a period of rest. The study population of 230 normotensive and hypertensive subjects included 57% females, 51% Caucasians, and 33% African Americans. The mean differences between test methods for systolic blood pressure, diastolic blood pressure, and heart rate were -3.2 +/- 6.9 mmHg, +0.8 +/- 5.9 mmHg, and +1.0 +/- 5.7 beats/minute, respectively. For systolic and diastolic blood pressure, 60.9 and 70.4% of sequential measurements by each method were within +/- 5 mmHg. Few or no points fell beyond the mean +/- 2 standard deviation lines for each cuff bladder size. In sequential same arm measurements, the CVProfilor DO-2020 Cardiovascular Profiling System measured blood pressure by an oscillometric method (dynamic linear deflation) with reasonable agreement with the mercury sphygmomanometer.

  2. Experiences with digital processing of images at INPE

    NASA Technical Reports Server (NTRS)

    Mascarenhas, N. D. A. (Principal Investigator)

    1984-01-01

    Four different research experiments with digital image processing at INPE will be described: (1) edge detection by hypothesis testing; (2) image interpolation by finite impulse response filters; (3) spatial feature extraction methods in multispectral classification; and (4) translational image registration by sequential tests of hypotheses.

  3. Sequential biases in accumulating evidence

    PubMed Central

    Huggins, Richard; Dogo, Samson Henry

    2015-01-01

    Whilst it is common in clinical trials to use the results of tests at one phase to decide whether to continue to the next phase and to subsequently design the next phase, we show that this can lead to biased results in evidence synthesis. Two new kinds of bias associated with accumulating evidence, termed ‘sequential decision bias’ and ‘sequential design bias’, are identified. Both kinds of bias are the result of making decisions on the usefulness of a new study, or its design, based on the previous studies. Sequential decision bias is determined by the correlation between the value of the current estimated effect and the probability of conducting an additional study. Sequential design bias arises from using the estimated value instead of the clinically relevant value of an effect in sample size calculations. We considered both the fixed‐effect and the random‐effects models of meta‐analysis and demonstrated analytically and by simulations that in both settings the problems due to sequential biases are apparent. According to our simulations, the sequential biases increase with increased heterogeneity. Minimisation of sequential biases arises as a new and important research area necessary for successful evidence‐based approaches to the development of science. © 2015 The Authors. Research Synthesis Methods Published by John Wiley & Sons Ltd. PMID:26626562
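
    A minimal Python simulation can make 'sequential decision bias' concrete: when the probability of conducting an additional study depends on the current pooled estimate, the distribution of the final estimate shifts away from the true effect. All parameters below (true effect, standard error, continuation probabilities) are arbitrary illustrative assumptions, and simple equal-weight pooling stands in for a full fixed-effect meta-analysis.

        import numpy as np

        rng = np.random.default_rng(3)
        true_effect, se, n_max = 0.2, 0.1, 10

        def pooled(estimates):
            # Equal-variance fixed-effect pooling of study estimates.
            return np.mean(estimates)

        final = []
        for _ in range(20000):
            est = [rng.normal(true_effect, se)]
            while len(est) < n_max:
                # The decision to run another study is correlated with the
                # current pooled estimate, which is what induces the bias.
                p_continue = 0.9 if pooled(est) > true_effect else 0.3
                if rng.random() > p_continue:
                    break
                est.append(rng.normal(true_effect, se))
            final.append(pooled(est))

        print(f"mean pooled estimate = {np.mean(final):.4f} (true effect = {true_effect})")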

  4. Modified sequential extraction for biochar and petroleum coke: Metal release potential and its environmental implications.

    PubMed

    von Gunten, Konstantin; Alam, Md Samrat; Hubmann, Magdalena; Ok, Yong Sik; Konhauser, Kurt O; Alessi, Daniel S

    2017-07-01

    A modified Community Bureau of Reference (CBR) sequential extraction method was tested to assess the composition of untreated pyrogenic carbon (biochar) and oil sands petroleum coke. Wood biochar samples were found to contain lower concentrations of metals, but had higher fractions of easily mobilized alkaline earth and transition metals. Sewage sludge biochar was determined to be less recalcitrant and had higher total metal concentrations, with most of the metals found in the more resilient extraction fractions (oxidizable, residual). Petroleum coke was the most stable material, with a similar metal distribution pattern as the sewage sludge biochar. The applied sequential extraction method represents a suitable technique to recover metals from these materials, and is a valuable tool in understanding the metal retaining and leaching capability of various biochar types and carbonaceous petroleum coke samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Structural Optimization for Reliability Using Nonlinear Goal Programming

    NASA Technical Reports Server (NTRS)

    El-Sayed, Mohamed E.

    1999-01-01

    This report details the development of a reliability-based multi-objective design tool for solving structural optimization problems. Based on two different optimization techniques, namely sequential unconstrained minimization and nonlinear goal programming, the developed design method has the capability to take into account the effects of variability on the proposed design through a user-specified reliability design criterion. In its sequential unconstrained minimization mode, the developed design tool uses a composite objective function, in conjunction with weight-ordered design objectives, in order to take into account conflicting and multiple design criteria. The design criteria of interest include structural weight, load-induced stress and deflection, and mechanical reliability. The nonlinear goal programming mode, on the other hand, provides a design method that eliminates the difficulty of having to define an objective function and constraints, while at the same time having the capability of handling rank-ordered design objectives or goals. For simulation purposes the design of a pressure vessel cover plate was undertaken as a test bed for the newly developed design tool. The formulation of this structural optimization problem in sequential unconstrained minimization and goal programming form is presented. The resulting optimization problem was solved using (i) the linear extended interior penalty function method and (ii) Powell's conjugate directions method. Both single- and multi-objective numerical test cases are included, demonstrating the design tool's capabilities as applied to this design problem.
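
    The sequential unconstrained minimization mode described above amounts to solving a series of penalized unconstrained problems with a growing penalty parameter. The Python sketch below shows that idea with SciPy on a one-dimensional toy constraint; it is not the pressure vessel cover plate problem, and it uses a plain exterior penalty rather than the extended interior penalty formulation of the report.

        from scipy.optimize import minimize_scalar

        # Toy constrained problem: minimize f(x) = (x - 3)^2 subject to x <= 1.
        # The constrained optimum is x = 1.
        f = lambda x: (x - 3.0)**2
        g = lambda x: x - 1.0            # constraint written as g(x) <= 0

        def sumt(r0=1.0, growth=10.0, iters=6):
            """Sequential unconstrained minimization: solve a sequence of
            exterior-penalty problems with an increasing penalty parameter r."""
            r, x = r0, 0.0
            for _ in range(iters):
                phi = lambda x, r=r: f(x) + r * max(0.0, g(x))**2
                x = minimize_scalar(phi).x
                r *= growth
            return x

        print(f"penalized solution x = {sumt():.4f}")   # approaches 1.0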

  6. Hybrid Computerized Adaptive Testing: From Group Sequential Design to Fully Sequential Design

    ERIC Educational Resources Information Center

    Wang, Shiyu; Lin, Haiyan; Chang, Hua-Hua; Douglas, Jeff

    2016-01-01

    Computerized adaptive testing (CAT) and multistage testing (MST) have become two of the most popular modes in large-scale computer-based sequential testing. Though most designs of CAT and MST exhibit strengths and weaknesses in recent large-scale implementations, there is no simple answer to the question of which design is better because different…

  7. Using sequential self-calibration method to identify conductivity distribution: Conditioning on tracer test data

    USGS Publications Warehouse

    Hu, B.X.; He, C.

    2008-01-01

    An iterative inverse method, the sequential self-calibration method, is developed for mapping the spatial distribution of a hydraulic conductivity field by conditioning on nonreactive tracer breakthrough curves. A streamline-based, semi-analytical simulator is adopted to simulate solute transport in a heterogeneous aquifer. The simulation is used as the forward modeling step. In this study, the hydraulic conductivity is assumed to be a deterministic or random variable. Within the framework of the streamline-based simulator, the efficient semi-analytical method is used to calculate sensitivity coefficients of the solute concentration with respect to the hydraulic conductivity variation. The calculated sensitivities account for spatial correlations between the solute concentration and parameters. The performance of the inverse method is assessed by two synthetic tracer tests conducted in an aquifer with a distinct spatial pattern of heterogeneity. The study results indicate that the developed iterative inverse method is able to identify and reproduce the large-scale heterogeneity pattern of the aquifer given appropriate observation wells in these synthetic cases. © International Association for Mathematical Geology 2008.

  8. Surveillance system and method having an adaptive sequential probability fault detection test

    NASA Technical Reports Server (NTRS)

    Herzog, James P. (Inventor); Bickford, Randall L. (Inventor)

    2005-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  9. Surveillance system and method having an adaptive sequential probability fault detection test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)

    2006-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  10. Surveillance System and Method having an Adaptive Sequential Probability Fault Detection Test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L. (Inventor); Herzog, James P. (Inventor)

    2008-01-01

    System and method providing surveillance of an asset such as a process and/or apparatus by providing training and surveillance procedures that numerically fit a probability density function to an observed residual error signal distribution that is correlative to normal asset operation and then utilizes the fitted probability density function in a dynamic statistical hypothesis test for providing improved asset surveillance.

  11. Some sequential, distribution-free pattern classification procedures with applications

    NASA Technical Reports Server (NTRS)

    Poage, J. L.

    1971-01-01

    Some sequential, distribution-free pattern classification techniques are presented. The decision problem to which the proposed classification methods are applied is that of discriminating between two kinds of electroencephalogram responses recorded from a human subject: spontaneous EEG and EEG driven by a stroboscopic light stimulus at the alpha frequency. The classification procedures proposed make use of the theory of order statistics. Estimates of the probabilities of misclassification are given. The procedures were tested on Gaussian samples and the EEG responses.

  12. Depression and Delinquency Covariation in an Accelerated Longitudinal Sample of Adolescents

    ERIC Educational Resources Information Center

    Kofler, Michael J.; McCart, Michael R.; Zajac, Kristyn; Ruggiero, Kenneth J.; Saunders, Benjamin E.; Kilpatrick, Dean G.

    2011-01-01

    Objectives: The current study tested opposing predictions stemming from the failure and acting out theories of depression-delinquency covariation. Method: Participants included a nationwide longitudinal sample of adolescents (N = 3,604) ages 12 to 17. Competing models were tested with cohort-sequential latent growth curve modeling to determine…

  13. A Novel Method for Discovering Fuzzy Sequential Patterns Using the Simple Fuzzy Partition Method.

    ERIC Educational Resources Information Center

    Chen, Ruey-Shun; Hu, Yi-Chung

    2003-01-01

    Discusses sequential patterns, data mining, knowledge acquisition, and fuzzy sequential patterns described by natural language. Proposes a fuzzy data mining technique to discover fuzzy sequential patterns by using the simple partition method which allows the linguistic interpretation of each fuzzy set to be easily obtained. (Author/LRW)

  14. A weight modification sequential method for VSC-MTDC power system state estimation

    NASA Astrophysics Data System (ADS)

    Yang, Xiaonan; Zhang, Hao; Li, Qiang; Guo, Ziming; Zhao, Kun; Li, Xinpeng; Han, Feng

    2017-06-01

    This paper presents an effective sequential approach based on weight modification for VSC-MTDC power system state estimation, called the weight modification sequential method. The proposed approach simplifies the AC/DC system state estimation algorithm by modifying the weights of the state quantities so that the matrix dimension remains constant. The weight modification sequential method also makes the VSC-MTDC system state estimation results more accurate and speeds up the calculation. The effectiveness of the proposed weight modification sequential method is demonstrated and validated on a modified IEEE 14-bus system.
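
    As a generic illustration only, the Python sketch below shows sequential measurement processing in weighted least squares state estimation, where measurements are folded in one at a time while the state dimension stays fixed. The two-state toy measurement model and weights are assumptions; this is not the paper's weight modification scheme for the AC/DC VSC-MTDC model.

        import numpy as np

        def sequential_wls(H, z, w, x_dim):
            """Generic sequential weighted least squares: each measurement is
            processed in turn by accumulating the information matrix and
            vector, so the problem dimension never grows."""
            info = np.zeros((x_dim, x_dim))
            vec = np.zeros(x_dim)
            for h_i, z_i, w_i in zip(H, z, w):
                info += w_i * np.outer(h_i, h_i)
                vec += w_i * h_i * z_i
            return np.linalg.solve(info, vec)

        # Toy linear measurement model: two states, three weighted measurements.
        H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, -1.0]])
        z = np.array([1.02, 0.49, 0.52])
        w = np.array([100.0, 100.0, 50.0])   # weights = 1 / measurement variance
        print(sequential_wls(H, z, w, x_dim=2))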

  15. Helicobacter pylori eradication with either seven-day or 10-day triple therapies, and with a 10-day sequential regimen

    PubMed Central

    Scaccianoce, Giuseppe; Hassan, Cesare; Panarese, Alba; Piglionica, Donato; Morini, Sergio; Zullo, Angelo

    2006-01-01

    BACKGROUND Helicobacter pylori eradication rates achieved by standard seven-day triple therapies are decreasing in several countries, while a novel 10-day sequential regimen has achieved a very high success rate. A longer 10-day triple therapy, similar to the sequential regimen, was tested to see whether it could achieve a better infection cure rate. METHODS Patients with nonulcer dyspepsia and H pylori infection were randomly assigned to one of the following three therapies: esomeprazole 20 mg, clarithromycin 500 mg and amoxycillin 1 g for seven days or 10 days, or a 10-day sequential regimen including esomeprazole 20 mg plus amoxycillin 1 g for five days and esomeprazole 20 mg, clarithromycin 500 mg and tinidazole 500 mg for the remaining five days. All drugs were given twice daily. H pylori eradication was checked four to six weeks after treatment by using a 13C-urea breath test. RESULTS Overall, 213 patients were enrolled. H pylori eradication was achieved in 75.7% and 77.9%, in 81.7% and 84.1%, and in 94.4% and 97.1% of patients following seven-day or 10-day triple therapy and the 10-day sequential regimen, at intention-to-treat and per protocol analyses, respectively. The eradication rate following the sequential regimen was higher than either seven-day (P=0.002) or 10-day triple therapy (P=0.02), while no significant difference emerged between the latter two regimens (P=0.6). CONCLUSIONS The 10-day sequential regimen was significantly more effective than both triple regimens, while 10-day triple therapy failed to significantly increase the H pylori eradication rate achieved by the standard seven-day regimen. PMID:16482238

  16. SW-846 Test Method 3200: Mercury Species Fractionation and Quantification by Microwave Assisted Extraction, Selective Solvent Extraction and/or Solid Phase Extraction

    EPA Pesticide Factsheets

    A sequential extraction and separation procedure that may be used in conjunction with a determinative method to differentiate mercury species that are present in soils and sediments. It provides information on both total mercury and various mercury species.

  17. Sequential detection of influenza epidemics by the Kolmogorov-Smirnov test

    PubMed Central

    2012-01-01

    Background Influenza is a well known and common human respiratory infection, causing significant morbidity and mortality every year. Despite Influenza variability, fast and reliable outbreak detection is required for health resource planning. Clinical health records, as published by the Diagnosticat database in Catalonia, host useful data for probabilistic detection of influenza outbreaks. Methods This paper proposes a statistical method to detect influenza epidemic activity. Non-epidemic incidence rates are modeled against the exponential distribution, and the maximum likelihood estimate for the decaying factor λ is calculated. The sequential detection algorithm updates the parameter as new data becomes available. Binary epidemic detection of weekly incidence rates is assessed by Kolmogorov-Smirnov test on the absolute difference between the empirical and the cumulative density function of the estimated exponential distribution with significance level 0 ≤ α ≤ 1. Results The main advantage with respect to other approaches is the adoption of a statistically meaningful test, which provides an indicator of epidemic activity with an associated probability. The detection algorithm was initiated with parameter λ0 = 3.8617 estimated from the training sequence (corresponding to non-epidemic incidence rates of the 2008-2009 influenza season) and sequentially updated. Kolmogorov-Smirnov test detected the following weeks as epidemic for each influenza season: 50−10 (2008-2009 season), 38−50 (2009-2010 season), weeks 50−9 (2010-2011 season) and weeks 3 to 12 for the current 2011-2012 season. Conclusions Real medical data was used to assess the validity of the approach, as well as to construct a realistic statistical model of weekly influenza incidence rates in non-epidemic periods. For the tested data, the results confirmed the ability of the algorithm to detect the start and the end of epidemic periods. In general, the proposed test could be applied to other data sets to quickly detect influenza outbreaks. The sequential structure of the test makes it suitable for implementation in many platforms at a low computational cost without requiring to store large data sets. PMID:23031321
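
    The detection step described above, fitting an exponential model to non-epidemic weekly incidence rates by maximum likelihood and applying a Kolmogorov-Smirnov test to new data, can be sketched in a few lines of Python with SciPy. The synthetic training and test rates below are illustrative assumptions, not the Diagnosticat data.

        import numpy as np
        from scipy import stats

        def epidemic_flag(train_rates, new_rates, alpha=0.05):
            """Fit an exponential model to non-epidemic weekly incidence rates
            (maximum likelihood estimate of the decay factor) and flag new
            weeks whose empirical distribution departs from it according to a
            Kolmogorov-Smirnov test at significance level alpha."""
            lam = 1.0 / np.mean(train_rates)               # MLE decaying factor
            stat, p = stats.kstest(new_rates, "expon", args=(0, 1.0 / lam))
            return p < alpha, p

        rng = np.random.default_rng(4)
        non_epidemic = rng.exponential(scale=0.26, size=40)      # training weeks
        epidemic_weeks = rng.normal(loc=2.0, scale=0.3, size=6)  # elevated rates
        print(epidemic_flag(non_epidemic, epidemic_weeks))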

  18. Sequential Tests of Multiple Hypotheses Controlling Type I and II Familywise Error Rates

    PubMed Central

    Bartroff, Jay; Song, Jinlin

    2014-01-01

    This paper addresses the following general scenario: A scientist wishes to perform a battery of experiments, each generating a sequential stream of data, to investigate some phenomenon. The scientist would like to control the overall error rate in order to draw statistically-valid conclusions from each experiment, while being as efficient as possible. The between-stream data may differ in distribution and dimension but also may be highly correlated, even duplicated exactly in some cases. Treating each experiment as a hypothesis test and adopting the familywise error rate (FWER) metric, we give a procedure that sequentially tests each hypothesis while controlling both the type I and II FWERs regardless of the between-stream correlation, and only requires arbitrary sequential test statistics that control the error rates for a given stream in isolation. The proposed procedure, which we call the sequential Holm procedure because of its inspiration from Holm’s (1979) seminal fixed-sample procedure, shows simultaneous savings in expected sample size and less conservative error control relative to fixed sample, sequential Bonferroni, and other recently proposed sequential procedures in a simulation study. PMID:25092948
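
    For context, the fixed-sample procedure that inspired the sequential Holm procedure above is easy to state in code. The Python sketch below implements the classical Holm step-down adjustment on a vector of p-values; it is not the streamwise sequential version proposed in the paper.

        def holm(p_values, alpha=0.05):
            """Classical fixed-sample Holm step-down procedure: sort the
            p-values, compare the k-th smallest (k = 0, ..., m-1) against
            alpha / (m - k), and reject until the first non-rejection."""
            m = len(p_values)
            order = sorted(range(m), key=lambda i: p_values[i])
            reject = [False] * m
            for k, i in enumerate(order):
                if p_values[i] <= alpha / (m - k):
                    reject[i] = True
                else:
                    break               # stop at the first non-rejection
            return reject

        print(holm([0.001, 0.04, 0.03, 0.2]))   # -> [True, False, False, False]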

  19. One-sided truncated sequential t-test: application to natural resource sampling

    Treesearch

    Gary W. Fowler; William G. O' Regan

    1974-01-01

    A new procedure for constructing one-sided truncated sequential t-tests and its application to natural resource sampling are described. Monte Carlo procedures were used to develop a series of one-sided truncated sequential t-tests and the associated approximations to the operating characteristic and average sample number functions. Different truncation points and...
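
    A rough Python illustration of a one-sided truncated sequential t-test follows: the t statistic is recomputed as observations arrive, compared against a stopping boundary, and the test is forced to a decision at the truncation point. The fixed boundary and truncation point used here are arbitrary assumptions, not the Monte Carlo boundaries developed in the paper.

        import numpy as np
        from scipy import stats

        def truncated_sequential_t(sample_stream, mu0, crit=2.5, n_max=30, n_min=5):
            """Illustrative one-sided truncated sequential t-test: reject H0
            early if the one-sample t statistic against mu0 exceeds a fixed
            boundary, otherwise fall back to an ordinary t-test at truncation."""
            x = []
            for obs in sample_stream:
                x.append(obs)
                n = len(x)
                if n >= n_min:
                    t = (np.mean(x) - mu0) / (np.std(x, ddof=1) / np.sqrt(n))
                    if t >= crit:
                        return "reject H0", n
                if n >= n_max:
                    return ("reject H0" if t >= stats.t.ppf(0.95, n - 1)
                            else "accept H0"), n

        rng = np.random.default_rng(5)
        print(truncated_sequential_t(iter(rng.normal(0.8, 1.0, 100)), mu0=0.0))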

  20. Examining Differential Item Functions of Different Item Ordered Test Forms According to Item Difficulty Levels

    ERIC Educational Resources Information Center

    Çokluk, Ömay; Gül, Emrah; Dogan-Gül, Çilem

    2016-01-01

    The study aims to examine whether differential item functioning is displayed in three different test forms that have random and sequential item orderings (easy-to-hard and hard-to-easy), based on Classical Test Theory (CTT) and Item Response Theory (IRT) methods and bearing item difficulty levels in mind. In the correlational research, the…

  1. Efficient Testing Combining Design of Experiment and Learn-to-Fly Strategies

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick C.; Brandon, Jay M.

    2017-01-01

    Rapid modeling and efficient testing methods are important in a number of aerospace applications. In this study efficient testing strategies were evaluated in a wind tunnel test environment and combined to suggest a promising approach for both ground-based and flight-based experiments. Benefits of using Design of Experiment techniques, well established in scientific, military, and manufacturing applications are evaluated in combination with newly developing methods for global nonlinear modeling. The nonlinear modeling methods, referred to as Learn-to-Fly methods, utilize fuzzy logic and multivariate orthogonal function techniques that have been successfully demonstrated in flight test. The blended approach presented has a focus on experiment design and identifies a sequential testing process with clearly defined completion metrics that produce increased testing efficiency.

  2. Separation of left and right lungs using 3D information of sequential CT images and a guided dynamic programming algorithm

    PubMed Central

    Park, Sang Cheol; Leader, Joseph Ken; Tan, Jun; Lee, Guee Sang; Kim, Soo Hyung; Na, In Seop; Zheng, Bin

    2011-01-01

    Objective: This article presents a new computerized scheme that aims to accurately and robustly separate the left and right lungs on CT examinations. Methods: We developed and tested a method to separate the left and right lungs using sequential CT information and a guided dynamic programming algorithm with adaptively and automatically selected start and end points, even for especially severe and multiple connections. Results: The scheme successfully identified and separated all 827 connections in the 4034 CT images of an independent testing dataset of CT examinations. The proposed scheme separated multiple connections regardless of their locations, and the guided dynamic programming algorithm reduced the computation time to approximately 4.6% of that of traditional dynamic programming while avoiding permeation of the separation boundary into normal lung tissue. Conclusions: The proposed method is able to robustly and accurately disconnect all connections between the left and right lungs, and the guided dynamic programming algorithm is able to remove redundant processing. PMID:21412104
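
    The separation boundary search described above rests on dynamic programming over a cost image. The Python sketch below shows only the generic minimum-cost top-to-bottom path recursion with backtracking; the cost image is synthetic, and the guidance, start/end point selection, and 3D sequential CT information of the proposed method are not reproduced.

        import numpy as np

        def min_cost_vertical_path(cost):
            """Dynamic programming for a minimum-cost top-to-bottom path through
            a 2D cost image (e.g., high cost inside lung tissue, low cost along
            the junction line), the core idea behind the boundary search."""
            rows, cols = cost.shape
            acc = cost.astype(float).copy()
            back = np.zeros((rows, cols), dtype=int)
            for r in range(1, rows):
                for c in range(cols):
                    lo, hi = max(0, c - 1), min(cols, c + 2)
                    j = int(np.argmin(acc[r - 1, lo:hi])) + lo
                    acc[r, c] += acc[r - 1, j]
                    back[r, c] = j
            path = [int(np.argmin(acc[-1]))]
            for r in range(rows - 1, 0, -1):
                path.append(back[r, path[-1]])
            return path[::-1]            # column index of the path in each row

        cost = np.random.default_rng(6).random((6, 8))
        cost[:, 4] *= 0.1                # cheap "junction" column
        print(min_cost_vertical_path(cost))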

  3. Sequential analysis as a tool for detection of amikacin ototoxicity in the treatment of multidrug-resistant tuberculosis.

    PubMed

    Vasconcelos, Karla Anacleto de; Frota, Silvana Maria Monte Coelho; Ruffino-Netto, Antonio; Kritski, Afrânio Lineu

    2018-04-01

    To investigate early detection of amikacin-induced ototoxicity in a population treated for multidrug-resistant tuberculosis (MDR-TB), by means of three different tests: pure-tone audiometry (PTA); high-frequency audiometry (HFA); and distortion-product otoacoustic emission (DPOAE) testing. This was a longitudinal prospective cohort study involving patients aged 18-69 years with a diagnosis of MDR-TB who had to receive amikacin for six months as part of their antituberculosis drug regimen for the first time. Hearing was assessed before treatment initiation and at two and six months after treatment initiation. Sequential statistics were used to analyze the results. We included 61 patients, but the final population consisted of 10 patients (7 men and 3 women) because of sequential analysis. Comparison of the test results obtained at two and six months after treatment initiation with those obtained at baseline revealed that HFA at two months and PTA at six months detected hearing threshold shifts consistent with ototoxicity. However, DPOAE testing did not detect such shifts. The statistical method used in this study makes it possible to conclude that, over the six-month period, amikacin-associated hearing threshold shifts were detected by HFA and PTA, and that DPOAE testing was not efficient in detecting such shifts.

  4. Sitting Tai Chi Improves the Balance Control and Muscle Strength of Community-Dwelling Persons with Spinal Cord Injuries: A Pilot Study

    PubMed Central

    Tsang, William W. N.; Gao, Kelly L.; Chan, K. M.; Purves, Sheila; Macfarlane, Duncan J.; Fong, Shirley S. M.

    2015-01-01

    Objective. To investigate the effects of sitting Tai Chi on muscle strength, balance control, and quality of life (QOL) among survivors with spinal cord injuries (SCI). Methods. Eleven SCI survivors participated in the sitting Tai Chi training (90 minutes/session, 2 times/week for 12 weeks) and eight SCI survivors acted as controls. Dynamic sitting balance was evaluated using limits of stability test and a sequential weight shifting test in sitting. Handgrip strength was also tested using a hand-held dynamometer. QOL was measured using the World Health Organization's Quality of Life Scale. Results. Tai Chi practitioners achieved significant improvements in their reaction time (P = 0.042); maximum excursion (P = 0.016); and directional control (P = 0.025) in the limits of stability test after training. In the sequential weight shifting test, they significantly improved their total time to sequentially hit the 12 targets (P = 0.035). Significant improvement in handgrip strength was also found among the Tai Chi practitioners (P = 0.049). However, no significant within and between-group differences were found in the QOL outcomes (P > 0.05). Conclusions. Twelve weeks of sitting Tai Chi training could improve the dynamic sitting balance and handgrip strength, but not QOL, of the SCI survivors. PMID:25688276

  5. Using the Larval Zebrafish Locomotor Asssay in Functional Neurotoxicity Screening: Light Brightness and the Order of Stimulus Presentation Affect the Outcome

    EPA Science Inventory

    We are evaluating methods to screen/prioritize large numbers of chemicals using 6 day old zebrafish (Danio rerio) as an alternative model for detecting neurotoxic effects. Our behavioral testing paradigm simultaneously tests individual larval zebrafish under sequential light and...

  6. Examining Parallelism of Sets of Psychometric Measures Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Patelis, Thanos; Marcoulides, George A.

    2011-01-01

    A latent variable modeling approach that can be used to examine whether several psychometric tests are parallel is discussed. The method consists of sequentially testing the properties of parallel measures via a corresponding relaxation of parameter constraints in a saturated model or an appropriately constructed latent variable model. The…

  7. Damage diagnosis algorithm using a sequential change point detection method with an unknown distribution for damage

    NASA Astrophysics Data System (ADS)

    Noh, Hae Young; Rajagopal, Ram; Kiremidjian, Anne S.

    2012-04-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method for the cases where the post-damage feature distribution is unknown a priori. This algorithm extracts features from structural vibration data using time-series analysis and then declares damage using the change point detection method. The change point detection method asymptotically minimizes detection delay for a given false alarm rate. The conventional method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori. Therefore, our algorithm estimates and updates this distribution as data are collected using the maximum likelihood and the Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using multiple sets of simulated data and a set of experimental data collected from a four-story steel special moment-resisting frame. Our algorithm was able to estimate the post-damage distribution consistently and resulted in detection delays only a few seconds longer than the delays from the conventional method that assumes we know the post-damage feature distribution. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
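
    The core of the algorithm, a sequential CUSUM-type change point statistic in which the unknown post-damage distribution is re-estimated as data arrive, can be sketched in Python as below. The trailing-window estimate of the post-change mean and the Gaussian feature model are simplifying assumptions for illustration, not the authors' full maximum likelihood and Bayesian updating scheme.

        import numpy as np

        def adaptive_cusum(x, mu0, sigma, h, window=20):
            """CUSUM-style sequential change detection in which the unknown
            post-change mean is re-estimated from a trailing window of the
            damage-sensitive feature as new samples arrive."""
            s = 0.0
            for n in range(len(x)):
                lo = max(0, n - window + 1)
                mu1 = max(np.mean(x[lo:n + 1]), mu0 + 1e-6)  # running post-damage mean estimate
                llr = (mu1 - mu0) * (x[n] - (mu0 + mu1) / 2) / sigma**2
                s = max(0.0, s + llr)
                if s >= h:
                    return n             # damage declared at this sample
            return None

        rng = np.random.default_rng(7)
        feature = np.concatenate([rng.normal(0, 1, 300), rng.normal(1.5, 1, 60)])
        print("damage declared at sample",
              adaptive_cusum(feature, mu0=0.0, sigma=1.0, h=6.0))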

  8. CLAss-Specific Subspace Kernel Representations and Adaptive Margin Slack Minimization for Large Scale Classification.

    PubMed

    Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan

    2018-02-01

    In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing the classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm called adaptive margin slack minimization that iteratively improves the classification accuracy by adaptive data selection. We motivate each part separately, and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: memory-efficient sequential processing for sequential data processing, and parallelized sequential processing for distributed computing with sequential data acquisition. We tested our methods on several benchmark data sets and compared them with state-of-the-art techniques to verify the validity of the proposed techniques.

  9. Context-Dependent Upper Limb Prosthesis Control for Natural and Robust Use.

    PubMed

    Amsuess, Sebastian; Vujaklija, Ivan; Goebel, Peter; Roche, Aidan D; Graimann, Bernhard; Aszmann, Oskar C; Farina, Dario

    2016-07-01

    Pattern recognition and regression methods applied to the surface EMG have been used for estimating the user intended motor tasks across multiple degrees of freedom (DOF), for prosthetic control. While these methods are effective in several conditions, they are still characterized by some shortcomings. In this study we propose a methodology that combines these two approaches for mutually alleviating their limitations. This resulted in a control method capable of context-dependent movement estimation that switched automatically between sequential (one DOF at a time) or simultaneous (multiple DOF) prosthesis control, based on an online estimation of signal dimensionality. The proposed method was evaluated in scenarios close to real-life situations, with the control of a physical prosthesis in applied tasks of varying difficulties. Test prostheses were individually manufactured for both able-bodied and transradial amputee subjects. With these prostheses, two amputees performed the Southampton Hand Assessment Procedure test with scores of 58 and 71 points. The five able-bodied individuals performed standardized tests, such as the box&block and clothes pin test, reducing the completion times by up to 30%, with respect to using a state-of-the-art pure sequential control algorithm. Apart from facilitating fast simultaneous movements, the proposed control scheme was also more intuitive to use, since human movements are predominated by simultaneous activations across joints. The proposed method thus represents a significant step towards intelligent, intuitive and natural control of upper limb prostheses.

  10. Introducing a Model for Optimal Design of Sequential Objective Structured Clinical Examinations

    ERIC Educational Resources Information Center

    Mortaz Hejri, Sara; Yazdani, Kamran; Labaf, Ali; Norcini, John J.; Jalili, Mohammad

    2016-01-01

    In a sequential OSCE, which has been suggested as a way to reduce testing costs, candidates take a short screening test, and those who fail it are asked to take the full OSCE. In order to introduce an effective and accurate sequential design, we developed a model for designing and evaluating screening OSCEs. Based on two datasets from a 10-station…

  11. Sequential change detection and monitoring of temporal trends in random-effects meta-analysis.

    PubMed

    Dogo, Samson Henry; Clark, Allan; Kulinskaya, Elena

    2017-06-01

    Temporal changes in the magnitude of effect sizes reported in many areas of research are a threat to the credibility of the results and conclusions of meta-analysis. Numerous sequential methods for meta-analysis have been proposed to detect changes and monitor trends in effect sizes so that meta-analysis can be updated when necessary and interpreted based on the time it was conducted. The difficulties of sequential meta-analysis under the random-effects model are caused by dependencies in increments introduced by the estimation of the heterogeneity parameter τ2. In this paper, we propose the use of a retrospective cumulative sum (CUSUM)-type test with bootstrap critical values. This method allows retrospective analysis of the past trajectory of cumulative effects in random-effects meta-analysis and its visualization on a chart similar to a CUSUM chart. Simulation results show that the new method demonstrates good control of Type I error regardless of the number or size of the studies and the amount of heterogeneity. Application of the new method is illustrated on two examples of medical meta-analyses. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
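
    A small Python sketch of a retrospective CUSUM-type statistic with resampling-based critical values is given below. It permutes the sequence of effect estimates to build the null distribution, which is a simplification; the published method works with cumulative random-effects estimates and handles the dependence introduced by estimating τ2, neither of which is reproduced here.

        import numpy as np

        def cusum_stat(effects):
            """Maximum absolute CUSUM of standardized deviations from the mean,
            computed over the ordered sequence of study effect estimates."""
            d = (effects - effects.mean()) / effects.std(ddof=1)
            return np.max(np.abs(np.cumsum(d))) / np.sqrt(len(effects))

        def permutation_pvalue(effects, n_resamples=2000, seed=0):
            """Retrospective CUSUM-type test: resample (permute) the ordering of
            the effects to approximate the null distribution of the statistic."""
            rng = np.random.default_rng(seed)
            obs = cusum_stat(effects)
            null = [cusum_stat(rng.permutation(effects)) for _ in range(n_resamples)]
            return float(np.mean(np.asarray(null) >= obs))

        rng = np.random.default_rng(8)
        effects = np.concatenate([rng.normal(0.5, 0.2, 15), rng.normal(0.1, 0.2, 15)])
        print(f"p-value for a temporal shift: {permutation_pvalue(effects):.3f}")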

  12. Preliminary report of a Web-based instrument to assess and teach knowledge and clinical thinking to medical student

    PubMed Central

    Tokunaga, Hironobu; Ando, Hirotaka; Obika, Mikako; Miyoshi, Tomoko; Tokuda, Yasuharu; Bautista, Miho; Kataoka, Hitomi; Terasawa, Hidekazu

    2014-01-01

    Objectives We report the preliminary development of a unique Web-based instrument for assessing and teaching knowledge and developing clinical thinking called the “Sequential Questions and Answers” (SQA) test. Included in this feasibility report are physicians’ answers to the Sequential Questions and Answers pre- and posttests and their brief questionnaire replies. Methods The authors refined the SQA test case scenario for content, ease of modifications of case scenarios, test uploading and answer retrieval. Eleven geographically distant physicians evaluated the SQA test, taking the pretest and posttest within two weeks. These physicians completed a brief questionnaire about the SQA test. Results Eleven physicians completed the SQA pre- and posttest; all answers were downloaded for analysis. They reported the ease of website login and navigating within the test module together with many helpful suggestions. Their average posttest score gain was 53% (p=0.012). Conclusions We report the successful launch of a unique Web-based instrument referred to as the Sequential Questions and Answers test. This distinctive test combines teaching organization of the clinical narrative into an assessment tool that promotes acquiring medical knowledge and clinical thinking. We successfully demonstrated the feasibility of geographically distant physicians to access the SQA instrument. The physicians’ helpful suggestions will be added to future SQA test versions. Medical schools might explore the integration of this multi-language-capable SQA assessment and teaching instrument into their undergraduate medical curriculum. PMID:25341203

  13. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  14. Testing sequential extraction methods for the analysis of multiple stable isotope systems from a bone sample

    NASA Astrophysics Data System (ADS)

    Sahlstedt, Elina; Arppe, Laura

    2017-04-01

    Stable isotope compositions of bones, analysed either from the mineral phase (hydroxyapatite) or from the organic phase (mainly collagen), carry important climatological and ecological information and are therefore widely used in paleontological and archaeological research. For the analysis of the stable isotope compositions, both of the phases, hydroxyapatite and collagen, have their more or less well established separation and analytical techniques. Recent developments in IRMS and wet chemical extraction methods have facilitated the analysis of very small bone fractions (500 μg or less starting material) for the O-isotope composition of phosphate (PO4). However, the uniqueness and (pre-)historical value of each archaeological and paleontological finding leave precious little material available for stable isotope analyses, encouraging further development of microanalytical methods for stable isotope analyses. Here we present the first results in developing extraction methods for combining collagen C- and N-isotope analyses with phosphate O-isotope analyses from a single bone sample fraction. We tested sequential extraction starting with dilute acid demineralization and collection of both the collagen and phosphate fractions, followed by a further purification step with H2O2 (phosphate fraction). First results show that bone sample separates as small as 2 mg may be analysed for their δ15N, δ13C and δ18O(PO4) values. The method may be incorporated in detailed investigation of sequentially developing skeletal material such as teeth, potentially allowing for the investigation of interannual variability in climatological/environmental signals or of the early life history of an individual.

  15. Improving the identification accuracy of senior witnesses: do prelineup questions and sequential testing help?

    PubMed

    Memon, Amina; Gabbert, Fiona

    2003-04-01

    Eyewitness research has identified sequential lineup testing as a way of reducing false lineup choices while maintaining accurate identifications. The authors examined the usefulness of this procedure for reducing false choices in older adults. Young and senior witnesses viewed a crime video and were later presented with target-present or target-absent lineups in a simultaneous or sequential format. In addition, some participants received prelineup questions about their memory for a perpetrator's face and about their confidence in their ability to identify the culprit or to correctly reject the lineup. The sequential lineup reduced false choosing rates among young and older adults in target-absent conditions. In target-present conditions, sequential testing significantly reduced the correct identification rate in both age groups.

  16. Win-Stay, Lose-Sample: a simple sequential algorithm for approximating Bayesian inference.

    PubMed

    Bonawitz, Elizabeth; Denison, Stephanie; Gopnik, Alison; Griffiths, Thomas L

    2014-11-01

    People can behave in a way that is consistent with Bayesian models of cognition, despite the fact that performing exact Bayesian inference is computationally challenging. What algorithms could people be using to make this possible? We show that a simple sequential algorithm "Win-Stay, Lose-Sample", inspired by the Win-Stay, Lose-Shift (WSLS) principle, can be used to approximate Bayesian inference. We investigate the behavior of adults and preschoolers on two causal learning tasks to test whether people might use a similar algorithm. These studies use a "mini-microgenetic method", investigating how people sequentially update their beliefs as they encounter new evidence. Experiment 1 investigates a deterministic causal learning scenario and Experiments 2 and 3 examine how people make inferences in a stochastic scenario. The behavior of adults and preschoolers in these experiments is consistent with our Bayesian version of the WSLS principle. This algorithm provides both a practical method for performing Bayesian inference and a new way to understand people's judgments. Copyright © 2014 Elsevier Inc. All rights reserved.
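
    The following Python sketch gives one reading of the Win-Stay, Lose-Sample idea: keep the current hypothesis with probability equal to the likelihood it assigns to the new observation, otherwise resample a hypothesis from the posterior over all data seen so far. The coin-bias toy example and this particular staying rule are illustrative assumptions rather than the authors' exact algorithm.

        import numpy as np

        def wsls(observations, hypotheses, prior, likelihood, rng):
            """Win-Stay, Lose-Sample sketch: stay with the current hypothesis
            with probability equal to the likelihood of the new observation
            under it; on a 'lose', sample a hypothesis from the posterior
            given all observations so far."""
            post = np.array(prior, dtype=float)
            current = rng.choice(len(hypotheses), p=post / post.sum())
            for obs in observations:
                post *= [likelihood(h, obs) for h in hypotheses]  # exact posterior, kept for resampling
                post /= post.sum()
                if rng.random() > likelihood(hypotheses[current], obs):  # "lose"
                    current = rng.choice(len(hypotheses), p=post)        # sample
            return hypotheses[current]

        # Toy causal-learning example: which of three coins (bias 0.2, 0.5, 0.8)
        # generated the observed flips?
        rng = np.random.default_rng(9)
        hyps = [0.2, 0.5, 0.8]
        lik = lambda theta, x: theta if x == 1 else 1 - theta
        flips = rng.binomial(1, 0.8, size=30)
        print("chosen hypothesis:", wsls(flips, hyps, [1/3, 1/3, 1/3], lik, rng))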

  17. The Relevance of Visual Sequential Memory to Reading.

    ERIC Educational Resources Information Center

    Crispin, Lisa; And Others

    1984-01-01

    Results of three visual sequential memory tests and a group reading test given to 19 elementary students are discussed in terms of task analysis and structuralist approaches to analysis of reading skills. Relation of visual sequential memory to other reading subskills is considered in light of current reasearch. (CMG)

  18. Sequential structural damage diagnosis algorithm using a change point detection method

    NASA Astrophysics Data System (ADS)

    Noh, H.; Rajagopal, R.; Kiremidjian, A. S.

    2013-11-01

    This paper introduces a damage diagnosis algorithm for civil structures that uses a sequential change point detection method. The general change point detection method uses the known pre- and post-damage feature distributions to perform a sequential hypothesis test. In practice, however, the post-damage distribution is unlikely to be known a priori, unless we are looking for a known specific type of damage. Therefore, we introduce an additional algorithm that estimates and updates this distribution as data are collected, using maximum likelihood and Bayesian methods. We also applied an approximate method to reduce the computation load and memory requirement associated with the estimation. The algorithm is validated using a set of experimental data collected from a four-story steel special moment-resisting frame and multiple sets of simulated data. Various features of different dimensions have been explored, and the algorithm was able to identify damage, particularly when it used multidimensional damage-sensitive features, with lower false alarm rates when the post-damage feature distribution was known. For unknown feature distribution cases, the post-damage distribution was consistently estimated and the detection delays were only a few time steps longer than the delays from the general method that assumes the post-damage feature distribution is known. We confirmed that the Bayesian method is particularly efficient in declaring damage with minimal memory requirement, but the maximum likelihood method provides an insightful heuristic approach.
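    For readers unfamiliar with the underlying test, the sketch below shows a generic sequential change-point detector of the kind referred to above: it accumulates the log-likelihood ratio between assumed post- and pre-damage feature distributions (taken to be Gaussian purely for illustration). The paper's specific features, thresholds, and the Bayesian/maximum-likelihood updating of the unknown post-damage distribution are not reproduced here.

```python
import math

def cusum_change_detector(samples, mu0, sigma0, mu1, sigma1, threshold):
    """Page-style CUSUM detector using known pre-change (mu0, sigma0) and
    post-change (mu1, sigma1) Gaussian feature distributions.
    Returns the index at which a change is declared, or None."""
    def loglik(x, mu, sigma):
        return -math.log(sigma) - 0.5 * ((x - mu) / sigma) ** 2

    g = 0.0  # cumulative log-likelihood ratio, clipped at zero
    for i, x in enumerate(samples):
        g = max(0.0, g + loglik(x, mu1, sigma1) - loglik(x, mu0, sigma0))
        if g > threshold:
            return i  # damage (distribution change) declared at sample i
    return None
```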

  19. Potential for leaching of arsenic from excavated rock after different drying treatments.

    PubMed

    Li, Jining; Kosugi, Tomoya; Riya, Shohei; Hashimoto, Yohey; Hou, Hong; Terada, Akihiko; Hosomi, Masaaki

    2016-07-01

    Leaching of arsenic (As) from excavated rock subjected to different drying methods is compared using sequential leaching tests and rapid small-scale column tests combined with a sequential extraction procedure. Although the total As content in the rock was low (8.81 mg kg⁻¹), its resulting concentration in the leachate when leached at a liquid-to-solid ratio of 10 L kg⁻¹ exceeded the environmental standard (10 μg L⁻¹). As existed mainly in dissolved forms in the leachates. All of the drying procedures applied in this study increased the leaching of As, with freeze-drying leading to the largest increase. Water extraction of As using the two tests showed different leaching behaviors as a function of the liquid-to-solid ratio, and achieved average extractions of up to 35.7% and 25.8% total As, respectively. Dissolution of As from the mineral surfaces and subsequent re-adsorption controlled the short-term release of As; dissolution of Fe, Al, and dissolved organic carbon played important roles in long-term As leaching. Results of the sequential extraction procedure showed that use of 0.05 M (NH4)2SO4 underestimates the readily soluble As. Long-term water extraction removed almost all of the non-specifically sorbed As and most of the specifically sorbed As. The concept of pollution potential indices, which are easily determined by the sequential leaching test, is proposed in this study and is considered for possible use in assessing efficacy of treatment of excavated rocks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Your Scores in Basic Skills: Iowa Tests of Basic Skills. AISD Junior High Schools, School Year 1981-82. AISD Senior High Schools, School Year 1981-82.

    ERIC Educational Resources Information Center

    Austin Independent School District, TX.

    Designed for junior high and high school students and their parents, this brochure explains the structure, function, and method for interpretation of the Iowa Tests of Basic Skills and the Sequential Tests of Educational Progress. A question and answer format is used to provide information on scope and purposes of the tests, meaning and accuracy…

  1. Academic Dishonesty: A Mixed-Method Study of Rational Choice among Students at the College of Basic Education in Kuwait

    ERIC Educational Resources Information Center

    Alsuwaileh, Bader Ghannam; Russ-Eft, Darlene F.; Alshurai, Saad R.

    2016-01-01

    The research herein used a sequential mixed-methods design to investigate why academic dishonesty is widespread among the students at the College of Basic Education in Kuwait. Qualitative interviews were conducted to generate research hypotheses. Then, using a questionnaire survey, the research hypotheses were quantitatively tested. The findings…

  2. The Effect of English Language Learning on Creative Thinking Skills: A Mixed Methods Case Study

    ERIC Educational Resources Information Center

    Sehic, Sandro

    2017-01-01

    The purpose of this sequential explanatory mixed-methods case study was to investigate the effects of English language learning on creative thinking skills in the domains of fluency, flexibility, originality, and elaboration as measured with the Alternate Uses Test. Unlike the previous research studies that investigated the links between English…

  3. Resistance of various shiga toxin-producing Escherichia coli to electrolyzed oxidizing water

    USDA-ARS?s Scientific Manuscript database

    The resistance of thirty-two strains of Escherichia coli O157:H7 and six major serotypes of non-O157 Shiga toxin-producing E. coli (STEC) plus E. coli O104 was tested against electrolyzed oxidizing (EO) water using two different methods: the modified AOAC 955.16 sequential inoculation method and minim...

  4. Computerized Classification Testing with the Rasch Model

    ERIC Educational Resources Information Center

    Eggen, Theo J. H. M.

    2011-01-01

    If classification in a limited number of categories is the purpose of testing, computerized adaptive tests (CATs) with algorithms based on sequential statistical testing perform better than estimation-based CATs (e.g., Eggen & Straetmans, 2000). In these computerized classification tests (CCTs), the Sequential Probability Ratio Test (SPRT) (Wald,…
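    As a concrete illustration of the kind of SPRT-based classification decision described above, the sketch below applies Wald's SPRT to dichotomous item responses under a Rasch model, deciding between two ability values that bracket the cutoff. The item difficulties, ability values, and error rates are illustrative assumptions, not values from the cited work.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def sprt_classify(responses, difficulties, theta_low, theta_high,
                  alpha=0.05, beta=0.05):
    """Wald SPRT: H0 ability = theta_low (fail) vs. H1 ability = theta_high (pass).
    responses    -- sequence of 0/1 item scores, in administration order
    difficulties -- matching sequence of Rasch item difficulties
    Returns 'pass', 'fail', or 'continue' after the available items."""
    upper = math.log((1 - beta) / alpha)
    lower = math.log(beta / (1 - alpha))
    llr = 0.0
    for x, b in zip(responses, difficulties):
        p1, p0 = rasch_p(theta_high, b), rasch_p(theta_low, b)
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "pass"
        if llr <= lower:
            return "fail"
    return "continue"  # no decision yet; administer another item
```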

  5. Comparison of regression and geostatistical methods for mapping Leaf Area Index (LAI) with Landsat ETM+ data over a boreal forest.

    Treesearch

    Mercedes Berterretche; Andrew T. Hudak; Warren B. Cohen; Thomas K. Maiersperger; Stith T. Gower; Jennifer Dungan

    2005-01-01

    This study compared aspatial and spatial methods of using remote sensing and field data to predict maximum growing season leaf area index (LAI) maps in a boreal forest in Manitoba, Canada. The methods tested were orthogonal regression analysis (reduced major axis, RMA) and two geostatistical techniques: kriging with an external drift (KED) and sequential Gaussian...

  6. Diagnostic value of tendon thickness and structure in the sonographic diagnosis of supraspinatus tendinopathy: room for a two-step approach.

    PubMed

    Arend, Carlos Frederico; Arend, Ana Amalia; da Silva, Tiago Rodrigues

    2014-06-01

    The aim of our study was to systematically compare different methodologies to establish an evidence-based approach based on tendon thickness and structure for sonographic diagnosis of supraspinatus tendinopathy when compared to MRI. US was obtained from 164 symptomatic patients with supraspinatus tendinopathy detected at MRI and 42 asymptomatic controls with normal MRI. Diagnostic yield was calculated for maximal supraspinatus tendon thickness (MSTT) and tendon structure as isolated criteria and using different combinations of parallel and sequential testing at US. Chi-squared tests were performed to assess sensitivity, specificity, and accuracy of different diagnostic approaches. Mean MSTT was 6.68 mm in symptomatic patients and 5.61 mm in asymptomatic controls (P < .05). When used as an isolated criterion, MSTT > 6.0 mm provided the best results for accuracy (93.7%) when compared to other measurements of tendon thickness. Also as an isolated criterion, abnormal tendon structure (ATS) yielded 93.2% accuracy for diagnosis. The best overall yield was obtained by both parallel and sequential testing using either MSTT > 6.0 mm or ATS as diagnostic criteria in no particular order, which provided 99.0% accuracy, 100% sensitivity, and 95.2% specificity. Among these parallel and sequential tests that provided the best overall yield, additional analysis revealed that sequential testing first evaluating tendon structure required assessment of 258 criteria (vs. 261 for sequential testing first evaluating tendon thickness and 412 for parallel testing) and demanded a mean of 16.1 s to assess diagnostic criteria and reach the diagnosis (vs. 43.3 s for sequential testing first evaluating tendon thickness and 47.4 s for parallel testing). We found that using either MSTT > 6.0 mm or ATS as diagnostic criteria for both parallel and sequential testing provides the best overall yield for sonographic diagnosis of supraspinatus tendinopathy when compared to MRI. Among these strategies, a two-step sequential approach first assessing tendon structure was advantageous because it required a lower number of criteria to be assessed and demanded less time to assess diagnostic criteria and reach the diagnosis. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
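    The difference between the parallel strategy and the two-step sequential strategy discussed above can be made explicit with a small sketch. The 6.0 mm threshold and the two criteria come from the abstract; the bookkeeping of "criteria assessed" simply mirrors the idea that sequential testing skips the second criterion once the first is already positive.

```python
def parallel_positive(mstt_mm, abnormal_structure):
    """Parallel testing: both criteria are always assessed and the result is
    positive if either criterion is met."""
    criteria_assessed = 2
    return (mstt_mm > 6.0) or abnormal_structure, criteria_assessed

def sequential_positive_structure_first(mstt_mm, abnormal_structure):
    """Two-step sequential testing: assess tendon structure first and measure
    tendon thickness only when the structure appears normal."""
    if abnormal_structure:
        return True, 1  # diagnosis reached after a single criterion
    return mstt_mm > 6.0, 2
```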

  7. Optimal Sequential Rules for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  8. Accurately controlled sequential self-folding structures by polystyrene film

    NASA Astrophysics Data System (ADS)

    Deng, Dongping; Yang, Yang; Chen, Yong; Lan, Xing; Tice, Jesse

    2017-08-01

    Four-dimensional (4D) printing overcomes traditional fabrication limitations by designing heterogeneous materials that enable the printed structures to evolve over time (the fourth dimension) under external stimuli. Here, we present a simple 4D printing of self-folding structures that can be sequentially and accurately folded. When heated above their glass transition temperature, pre-strained polystyrene films shrink in the XY plane. In our process, silver ink traces printed on the film provide the heat stimulus by conducting current to trigger the self-folding behavior. The parameters affecting the folding process are studied and discussed. Sequential folding and accurately controlled folding angles are achieved by using printed ink traces and an angle-lock design. Theoretical analyses are done to guide the design of the folding processes. Programmable structures such as a lock and a three-dimensional antenna are fabricated to test the feasibility and potential applications of this method. These self-folding structures change their shapes after fabrication under a controlled stimulus (electric current) and have potential applications in the fields of electronics, consumer devices, and robotics. Our design and fabrication method provides an easy way, using silver ink printed on polystyrene films, to 4D print self-folding structures with electrically induced sequential folding and angular control.

  9. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  10. Sequential and simultaneous SLAR block adjustment. [spline function analysis for mapping

    NASA Technical Reports Server (NTRS)

    Leberl, F.

    1975-01-01

    Two sequential methods of planimetric SLAR (Side Looking Airborne Radar) block adjustment, with and without splines, and three simultaneous methods based on the principles of least squares are evaluated. A limited experiment with simulated SLAR images indicates that sequential block formation with splines followed by external interpolative adjustment is superior to the simultaneous methods such as planimetric block adjustment with similarity transformations. The use of the sequential block formation is recommended, since it represents an inexpensive tool for satisfactory point determination from SLAR images.

  11. The Influence of High-Stakes Testing on Teacher Self-Efficacy and Job-Related Stress

    ERIC Educational Resources Information Center

    Gonzalez, Alejandro; Peters, Michelle L.; Orange, Amy; Grigsby, Bettye

    2017-01-01

    In the United States, teachers' job-related stress and self-efficacy levels across all grades are influenced in some manner by the demands of high-stakes testing. This sequential mixed-methods study aimed at examining the dynamics among assigned subject matter, teacher job-related stress, and teacher self-efficacy in a large south-eastern Texas…

  12. A 3D front tracking method on a CPU/GPU system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo, Wurigen; Grove, John

    2011-01-21

    We describe the method to port a sequential 3D interface tracking code to a GPU with CUDA. The interface is represented as a triangular mesh. Interface geometry properties and point propagation are performed on a GPU. Interface mesh adaptation is performed on a CPU. The convergence of the method is assessed from the test problems with given velocity fields. Performance results show overall speedups from 11 to 14 for the test problems under mesh refinement. We also briefly describe our ongoing work to couple the interface tracking method with a hydro solver.

  13. Identifying High-Rate Flows Based on Sequential Sampling

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Fang, Binxing; Luo, Hao

    We consider the problem of fast identification of high-rate flows in backbone links with possibly millions of flows. Accurate identification of high-rate flows is important for active queue management, traffic measurement and network security such as detection of distributed denial of service attacks. It is difficult to directly identify high-rate flows in backbone links because tracking the possible millions of flows needs correspondingly large high speed memories. To reduce the measurement overhead, the deterministic 1-out-of-k sampling technique is adopted which is also implemented in Cisco routers (NetFlow). Ideally, a high-rate flow identification method should have short identification time, low memory cost and processing cost. Most importantly, it should be able to specify the identification accuracy. We develop two such methods. The first method is based on fixed sample size test (FSST) which is able to identify high-rate flows with user-specified identification accuracy. However, since FSST has to record every sampled flow during the measurement period, it is not memory efficient. Therefore the second novel method based on truncated sequential probability ratio test (TSPRT) is proposed. Through sequential sampling, TSPRT is able to remove the low-rate flows and identify the high-rate flows at the early stage which can reduce the memory cost and identification time respectively. According to the way to determine the parameters in TSPRT, two versions of TSPRT are proposed: TSPRT-M which is suitable when low memory cost is preferred and TSPRT-T which is suitable when short identification time is preferred. The experimental results show that TSPRT requires less memory and identification time in identifying high-rate flows while satisfying the accuracy requirement as compared to previously proposed methods.

  14. Mechanical System Reliability and Cost Integration Using a Sequential Linear Approximation Method

    NASA Technical Reports Server (NTRS)

    Kowal, Michael T.

    1997-01-01

    The development of new products is dependent on product designs that incorporate high levels of reliability along with a design that meets predetermined levels of system cost. Additional constraints on the product include explicit and implicit performance requirements. Existing reliability and cost prediction methods result in no direct linkage between the variables affecting these two dominant product attributes. A methodology to integrate reliability and cost estimates using a sequential linear approximation method is proposed. The sequential linear approximation method utilizes probability-of-failure sensitivities determined from probabilistic reliability methods as well as manufacturing cost sensitivities. The application of the sequential linear approximation method to a mechanical system is demonstrated.

  15. Evaluating the Mobility of Arsenic in Synthetic Iron-containing Solids Using a Modified Sequential Extraction Method.

    PubMed

    Shan, Jilei; Sáez, A Eduardo; Ela, Wendell P

    2010-02-01

    Many water treatment technologies for arsenic removal that are used today produce arsenic-bearing residuals which are disposed in non-hazardous landfills. Previous works have established that many of these residuals will release arsenic to a much greater extent than predicted by standard regulatory leaching tests (e.g. the toxicity characteristic leaching procedure, TCLP) and, consequently, require stabilization to ensure benign behavior after disposal. In this work, a four-step sequential extraction method was developed in an effort to determine the proportion of arsenic in various phases in untreated as well as stabilized iron-based solid matrices. The solids synthesized using various potential stabilization techniques included: amorphous arsenic-iron sludge (ASL), reduced ASL via reaction with zero valent iron (RASL), amorphous ferrous arsenate (PFA), a mixture of PFA and SL (M1), crystalline ferrous arsenate (HPFA), and a mixture of HPFA and SL (M2). The overall arsenic mobility of the tested samples increased in the following order: ASL > RASL > PFA > M1 > HPFA > M2.

  16. Sequential limiting in continuous and discontinuous Galerkin methods for the Euler equations

    NASA Astrophysics Data System (ADS)

    Dobrev, V.; Kolev, Tz.; Kuzmin, D.; Rieben, R.; Tomov, V.

    2018-03-01

    We present a new predictor-corrector approach to enforcing local maximum principles in piecewise-linear finite element schemes for the compressible Euler equations. The new element-based limiting strategy is suitable for continuous and discontinuous Galerkin methods alike. In contrast to synchronized limiting techniques for systems of conservation laws, we constrain the density, momentum, and total energy in a sequential manner which guarantees positivity preservation for the pressure and internal energy. After the density limiting step, the total energy and momentum gradients are adjusted to incorporate the irreversible effect of density changes. Antidiffusive corrections to bounds-compatible low-order approximations are limited to satisfy inequality constraints for the specific total and kinetic energy. An accuracy-preserving smoothness indicator is introduced to gradually adjust lower bounds for the element-based correction factors. The employed smoothness criterion is based on a Hessian determinant test for the density. A numerical study is performed for test problems with smooth and discontinuous solutions.

  17. Separation of left and right lungs using 3-dimensional information of sequential computed tomography images and a guided dynamic programming algorithm.

    PubMed

    Park, Sang Cheol; Leader, Joseph Ken; Tan, Jun; Lee, Guee Sang; Kim, Soo Hyung; Na, In Seop; Zheng, Bin

    2011-01-01

    This article presents a new computerized scheme that aims to accurately and robustly separate the left and right lungs on computed tomography (CT) examinations. We developed and tested a method that separates the left and right lungs using sequential CT information and a guided dynamic programming algorithm with adaptively and automatically selected start and end points, even in cases with especially severe and multiple connections. The scheme successfully identified and separated all 827 connections in the total of 4034 CT images in an independent testing data set of CT examinations. The proposed scheme separated multiple connections regardless of their locations, and the guided dynamic programming algorithm reduced the computation time to approximately 4.6% of that of traditional dynamic programming and avoided the permeation of the separation boundary into normal lung tissue. The proposed method is able to robustly and accurately disconnect all connections between the left and right lungs, and the guided dynamic programming algorithm is able to remove redundant processing.

  18. A sequential test for assessing observed agreement between raters.

    PubMed

    Bersimis, Sotiris; Sachlas, Athanasios; Chakraborti, Subha

    2018-01-01

    Assessing the agreement between two or more raters is an important topic in medical practice. Existing techniques, which deal with categorical data, are based on contingency tables. This is often an obstacle in practice as we have to wait for a long time to collect the appropriate sample size of subjects to construct the contingency table. In this paper, we introduce a nonparametric sequential test for assessing agreement, which can be applied as data accrues, does not require a contingency table, facilitating a rapid assessment of the agreement. The proposed test is based on the cumulative sum of the number of disagreements between the two raters and a suitable statistic representing the waiting time until the cumulative sum exceeds a predefined threshold. We treat the cases of testing two raters' agreement with respect to one or more characteristics and using two or more classification categories, the case where the two raters extremely disagree, and finally the case of testing more than two raters' agreement. The numerical investigation shows that the proposed test has excellent performance. Compared to the existing methods, the proposed method appears to require significantly smaller sample size with equivalent power. Moreover, the proposed method is easily generalizable and brings the problem of assessing the agreement between two or more raters and one or more characteristics under a unified framework, thus providing an easy to use tool to medical practitioners. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
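    A loose illustration of the idea, not the authors' exact statistic or decision limits: ratings are processed as they accrue, disagreements are accumulated, and lack of agreement is declared if a preset number of disagreements is reached after too few subjects.

```python
def waiting_time_to_r_disagreements(paired_ratings, r):
    """Number of subjects rated until the cumulative count of disagreements
    between the two raters first reaches r (None if it never does).
    paired_ratings -- iterable of (rating_by_rater_1, rating_by_rater_2)."""
    disagreements = 0
    for n, (a, b) in enumerate(paired_ratings, start=1):
        if a != b:
            disagreements += 1
            if disagreements == r:
                return n
    return None

def sequential_agreement_decision(paired_ratings, r, min_waiting_time):
    """Declare 'no agreement' if r disagreements accumulate within fewer than
    min_waiting_time subjects; otherwise 'agreement'. In the cited work the
    decision limits follow from chosen error probabilities, which this
    sketch does not derive."""
    n = waiting_time_to_r_disagreements(paired_ratings, r)
    return "no agreement" if (n is not None and n < min_waiting_time) else "agreement"
```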

  19. ROC and Loss Function Analysis in Sequential Testing

    ERIC Educational Resources Information Center

    Muijtjens, Arno M. M.; Van Luijk, Scheltus J.; Van Der Vleuten, Cees P. M.

    2006-01-01

    Sequential testing is applied to reduce costs in SP-based tests (OSCEs). Initially, all candidates take a screening test consisting of a part of the OSCE. Candidates who fail the screen sit the complete test, whereas those who pass the screen are credited with a pass on the complete test. The procedure may result in a reduction of testing…

  20. Multiparticle imaging technique for two-phase fluid flows using pulsed laser speckle velocimetry. Final report, September 1988--November 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassan, T.A.

    1992-12-01

    The practical use of Pulsed Laser Velocimetry (PLV) requires the use of fast, reliable computer-based methods for tracking numerous particles suspended in a fluid flow. Two methods for performing tracking are presented. One method tracks a particle through multiple sequential images (minimum of four required) by prediction and verification of particle displacement and direction. The other method, requiring only two sequential images, uses a dynamic, binary, spatial, cross-correlation technique. The algorithms are tested on computer-generated synthetic data and experimental data which was obtained with traditional PLV methods. This allowed error analysis and testing of the algorithms on real engineering flows. A novel method is proposed which eliminates tedious, undesirable, manual, operator assistance in removing erroneous vectors. This method uses an iterative process involving an interpolated field produced from the most reliable vectors. Methods are developed to allow fast analysis and presentation of sets of PLV image data. Experimental investigation of a two-phase, horizontal, stratified, flow regime was performed to determine the interface drag force, and correspondingly, the drag coefficient. A horizontal, stratified flow test facility using water and air was constructed to allow interface shear measurements with PLV techniques. The experimentally obtained local drag measurements were compared with theoretical results given by conventional interfacial drag theory. Close agreement was shown when local conditions near the interface were similar to space-averaged conditions. However, theory based on macroscopic, space-averaged flow behavior was shown to give incorrect results if the local gas velocity near the interface was unstable, transient, and dissimilar from the average gas velocity through the test facility.

  1. Multiparticle imaging technique for two-phase fluid flows using pulsed laser speckle velocimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassan, T.A.

    1992-12-01

    The practical use of Pulsed Laser Velocimetry (PLV) requires the use of fast, reliable computer-based methods for tracking numerous particles suspended in a fluid flow. Two methods for performing tracking are presented. One method tracks a particle through multiple sequential images (minimum of four required) by prediction and verification of particle displacement and direction. The other method, requiring only two sequential images, uses a dynamic, binary, spatial, cross-correlation technique. The algorithms are tested on computer-generated synthetic data and experimental data which was obtained with traditional PLV methods. This allowed error analysis and testing of the algorithms on real engineering flows. A novel method is proposed which eliminates tedious, undesirable, manual, operator assistance in removing erroneous vectors. This method uses an iterative process involving an interpolated field produced from the most reliable vectors. Methods are developed to allow fast analysis and presentation of sets of PLV image data. Experimental investigation of a two-phase, horizontal, stratified, flow regime was performed to determine the interface drag force, and correspondingly, the drag coefficient. A horizontal, stratified flow test facility using water and air was constructed to allow interface shear measurements with PLV techniques. The experimentally obtained local drag measurements were compared with theoretical results given by conventional interfacial drag theory. Close agreement was shown when local conditions near the interface were similar to space-averaged conditions. However, theory based on macroscopic, space-averaged flow behavior was shown to give incorrect results if the local gas velocity near the interface was unstable, transient, and dissimilar from the average gas velocity through the test facility.

  2. Distributed Immune Systems for Wireless Network Information Assurance

    DTIC Science & Technology

    2010-04-26

    ratio test (SPRT), where the goal is to optimize a hypothesis testing problem given a trade-off between the probability of errors and the...using cumulative sum (CUSUM) and Girshik-Rubin-Shiryaev (GRSh) statistics. In sequential versions of the problem the sequential probability ratio ...the more complicated problems, in particular those where no clear mean can be established. We developed algorithms based on the sequential probability

  3. Isolation of Polyvalent Bacteriophages by Sequential Multiple-Host Approaches

    PubMed Central

    Yu, Pingfeng; Li, Mengyan; Dai, Zhaoyi; Alvarez, Pedro J. J.

    2015-01-01

    Many studies on phage biology are based on isolation methods that may inadvertently select for narrow-host-range phages. Consequently, broad-host-range phages, whose ecological significance is largely unexplored, are consistently overlooked. To enhance research on such polyvalent phages, we developed two sequential multihost isolation methods and tested both culture-dependent and culture-independent phage libraries for broad infectivity. Lytic phages isolated from activated sludge were capable of interspecies or even interorder infectivity without a significant reduction in the efficiency of plating (0.45 to 1.15). Two polyvalent phages (PX1 of the Podoviridae family and PEf1 of the Siphoviridae family) were characterized in terms of adsorption rate (3.54 × 10−10 to 8.53 × 10−10 ml/min), latent time (40 to 55 min), and burst size (45 to 99 PFU/cell), using different hosts. These phages were enriched with a nonpathogenic host (Pseudomonas putida F1 or Escherichia coli K-12) and subsequently used to infect model problematic bacteria. By using a multiplicity of infection of 10 in bacterial challenge tests, >60% lethality was observed for Pseudomonas aeruginosa relative to uninfected controls. The corresponding lethality for Pseudomonas syringae was ∼50%. Overall, this work suggests that polyvalent phages may be readily isolated from the environment by using different sequential hosts, and this approach should facilitate the study of their ecological significance as well as enable novel applications. PMID:26590277

  4. Polymeric assay film for direct colorimetric detection

    DOEpatents

    Charych, Deborah; Nagy, Jon; Spevak, Wayne

    2002-01-01

    A lipid bilayer with affinity to an analyte, which directly signals binding by a change in the light absorption spectrum. This novel assay means and method has special applications in the drug development and medical testing fields. Using a spectrometer, the system is easily automated, and a multiple-well embodiment allows inexpensive screening and sequential testing. This invention also has applications in industry for feedstock and effluent monitoring.

  5. Polymeric assay film for direct colorimetric detection

    DOEpatents

    Charych, Deborah; Nagy, Jon; Spevak, Wayne

    1999-01-01

    A lipid bilayer with affinity to an analyte, which directly signals binding by a change in the light absorption spectrum. This novel assay means and method has special applications in the drug development and medical testing fields. Using a spectrometer, the system is easily automated, and a multiple-well embodiment allows inexpensive screening and sequential testing. This invention also has applications in industry for feedstock and effluent monitoring.

  6. Double-blind photo lineups using actual eyewitnesses: an experimental test of a sequential versus simultaneous lineup procedure.

    PubMed

    Wells, Gary L; Steblay, Nancy K; Dysart, Jennifer E

    2015-02-01

    Eyewitnesses (494) to actual crimes in 4 police jurisdictions were randomly assigned to view simultaneous or sequential photo lineups using laptop computers and double-blind administration. The sequential procedure used in the field experiment mimicked how it is conducted in actual practice (e.g., using a continuation rule, witness does not know how many photos are to be viewed, witnesses resolve any multiple identifications), which is not how most lab experiments have tested the sequential lineup. No significant differences emerged in rates of identifying lineup suspects (25% overall) but the sequential procedure produced a significantly lower rate (11%) of identifying known-innocent lineup fillers than did the simultaneous procedure (18%). The simultaneous/sequential pattern did not significantly interact with estimator variables and no lineup-position effects were observed for either the simultaneous or sequential procedures. Rates of nonidentification were not significantly different for simultaneous and sequential but nonidentifiers from the sequential procedure were more likely to use the "not sure" response option than were nonidentifiers from the simultaneous procedure. Among witnesses who made an identification, 36% (41% of simultaneous and 32% of sequential) identified a known-innocent filler rather than a suspect, indicating that eyewitness performance overall was very poor. The results suggest that the sequential procedure that is used in the field reduces the identification of known-innocent fillers, but the differences are relatively small.

  7. A path-level exact parallelization strategy for sequential simulation

    NASA Astrophysics Data System (ADS)

    Peredo, Oscar F.; Baeza, Daniel; Ortiz, Julián M.; Herrero, José R.

    2018-01-01

    Sequential Simulation is a well-known method in geostatistical modelling. Following the Bayesian approach for simulation of conditionally dependent random events, the Sequential Indicator Simulation (SIS) method draws simulated values for K categories (categorical case) or classes defined by K different thresholds (continuous case). Similarly, the Sequential Gaussian Simulation (SGS) method draws simulated values from a multivariate Gaussian field. In this work, a path-level approach to parallelize the SIS and SGS methods is presented. A first stage of re-arrangement of the simulation path is performed, followed by a second stage of parallel simulation of non-conflicting nodes. A key advantage of the proposed parallelization method is that it generates realizations identical to those of the original non-parallelized methods. Case studies are presented using two sequential simulation codes from GSLIB: SISIM and SGSIM. Execution time and speedup results are shown for large-scale domains, with many categories and maximum kriging neighbours in each case, achieving high speedups in the best scenarios using 16 threads of execution on a single machine.

  8. Forecasting daily streamflow using online sequential extreme learning machines

    NASA Astrophysics Data System (ADS)

    Lima, Aranildo R.; Cannon, Alex J.; Hsieh, William W.

    2016-06-01

    While nonlinear machine learning methods have been widely used in environmental forecasting, in situations where new data arrive continually the need to make frequent model updates can become cumbersome and computationally costly. To alleviate this problem, an online sequential learning algorithm for single-hidden-layer feedforward neural networks - the online sequential extreme learning machine (OSELM) - is updated automatically and inexpensively as new data arrive (and the new data can then be discarded). OSELM was applied to forecast daily streamflow at two small watersheds in British Columbia, Canada, at lead times of 1-3 days. Predictors used were weather forecast data generated by the NOAA Global Ensemble Forecasting System (GEFS) and local hydro-meteorological observations. OSELM forecasts were tested with daily, monthly or yearly model updates. More frequent updating gave smaller forecast errors, including errors for data above the 90th percentile. Larger datasets used in the initial training of OSELM helped to find better parameters (number of hidden nodes) for the model, yielding better predictions. With online sequential multiple linear regression (OSMLR) as a benchmark, we concluded that OSELM is an attractive approach as it easily outperformed OSMLR in forecast accuracy.
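    A compact sketch of the kind of update OSELM performs is given below: a single hidden layer with fixed random weights, an initial least-squares solution for the output weights, and a recursive least-squares update as each new chunk of data arrives. The hyperparameters and the handling of the GEFS predictors in the study are not reproduced; this is a generic illustration in NumPy.

```python
import numpy as np

class OSELM:
    """Online sequential extreme learning machine (single hidden layer)."""

    def __init__(self, n_inputs, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(size=(n_inputs, n_hidden))  # fixed random input weights
        self.b = rng.normal(size=n_hidden)              # fixed random biases

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))  # sigmoid activations

    def fit_initial(self, X0, y0):
        # Batch least squares on the initial block (needs at least n_hidden rows).
        H = self._hidden(X0)
        self.P = np.linalg.inv(H.T @ H)
        self.beta = self.P @ H.T @ y0

    def update(self, X, y):
        """Incorporate a new chunk of data without retraining from scratch."""
        H = self._hidden(X)
        K = np.eye(H.shape[0]) + H @ self.P @ H.T
        self.P = self.P - self.P @ H.T @ np.linalg.solve(K, H @ self.P)
        self.beta = self.beta + self.P @ H.T @ (y - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta
```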

  9. Alternatives to the sequential lineup: the importance of controlling the pictures.

    PubMed

    Lindsay, R C; Bellinger, K

    1999-06-01

    Because sequential lineups reduce false-positive choices, their use has been recommended (R. C. L. Lindsay, 1999; R. C. L. Lindsay & G. L. Wells, 1985). Blind testing is included in the recommended procedures. Police, concerned about blind testing, devised alternative procedures, including self-administered sequential lineups, to reduce use of relative judgments (G. L. Wells, 1984) while permitting the investigating officer to conduct the procedure. Identification data from undergraduates exposed to a staged crime (N = 165) demonstrated that 4 alternative identification procedures tested were less effective than the original sequential lineup. Allowing witnesses to control the photographs resulted in higher rates of false-positive identification. Self-reports of using relative judgments were shown to be postdictive of decision accuracy.

  10. A STUDY OF METHODS OF CONTROLLING IMPULSES.

    ERIC Educational Resources Information Center

    WHITESIDE, RAY

    The person less able to control his impulses is also apt to exhibit socially disvalued behavior. Vocational and academic failure is a partial consequence of impulsiveness and lack of self-control. To investigate impulse control, two instruments believed to measure attributes of opposite poles of this concept (Sequential Tests of Educational…

  11. Sequential Design of Experiments to Maximize Learning from Carbon Capture Pilot Plant Testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soepyan, Frits B.; Morgan, Joshua C.; Omell, Benjamin P.

    Pilot plant test campaigns can be expensive and time-consuming. Therefore, it is of interest to maximize the amount of learning and the efficiency of the test campaign given the limited number of experiments that can be conducted. This work investigates the use of sequential design of experiments (SDOE) to overcome these challenges by demonstrating its usefulness for a recent solvent-based CO2 capture plant test campaign. Unlike traditional design of experiments methods, SDOE regularly uses information from ongoing experiments to determine the optimum locations in the design space for subsequent runs within the same experiment. However, there are challenges that need to be addressed, including reducing the high computational burden to efficiently update the model, and the need to incorporate the methodology into a computational tool. We address these challenges by applying SDOE in combination with a software tool, the Framework for Optimization, Quantification of Uncertainty and Surrogates (FOQUS) (Miller et al., 2014a, 2016, 2017). The results of applying SDOE to a pilot plant test campaign for CO2 capture suggest that, relative to traditional design of experiments methods, SDOE can more effectively reduce the uncertainty of the model, thus decreasing technical risk. Future work includes integrating SDOE into FOQUS and using SDOE to support additional large-scale pilot plant test campaigns.

  12. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.

  13. Efficacy of premixed versus sequential administration of clonidine as an adjuvant to hyperbaric bupivacaine intrathecally in cesarean section

    PubMed Central

    Sachan, Prachee; Kumar, Nidhi; Sharma, Jagdish Prasad

    2014-01-01

    Background: Density of the drugs injected intrathecally is an important factor that influences their spread in the cerebrospinal fluid. Mixing adjuvants with local anesthetics (LA) alters their density and hence their spread compared to when they are given sequentially in separate syringes. Aims: To evaluate the efficacy of intrathecal administration of hyperbaric bupivacaine (HB) and clonidine as a mixture and sequentially in terms of block characteristics, hemodynamics, neonatal outcome, and postoperative pain. Setting and Design: Prospective randomized single-blind study at a tertiary center from 2010 to 2012. Materials and Methods: Ninety full-term parturients scheduled for elective cesarean sections were divided into three groups on the basis of the technique of intrathecal drug administration. Group M received a mixture of 75 μg clonidine and 10 mg HB 0.5%. Group A received 75 μg clonidine after administration of 10 mg HB 0.5% through a separate syringe. Group B received 75 μg clonidine before HB 0.5% (10 mg) through a separate syringe. Statistical Analysis Used: Observational descriptive statistics, analysis of variance with Bonferroni multiple comparison post hoc test, and Chi-square test. Results: Time to achieve complete sensory and motor block was less in groups A and B, in which the drugs were given sequentially. Duration of analgesia lasted longer in group B (474.3 ± 20.79 min) and group A (472.50 ± 22.11 min) than in group M (337 ± 18.22 min), with clinically insignificant influence on hemodynamic parameters and sedation. Conclusion: The sequential technique reduces the time to achieve complete sensory and motor block, delays block regression, and significantly prolongs the duration of analgesia. However, it did not matter much whether clonidine was administered before or after HB. PMID:25886098

  14. Near real-time adverse drug reaction surveillance within population-based health networks: methodology considerations for data accrual.

    PubMed

    Avery, Taliser R; Kulldorff, Martin; Vilk, Yury; Li, Lingling; Cheetham, T Craig; Dublin, Sascha; Davis, Robert L; Liu, Liyan; Herrinton, Lisa; Brown, Jeffrey S

    2013-05-01

    This study describes practical considerations for the implementation of near real-time medical product safety surveillance in a distributed health data network. We conducted pilot active safety surveillance comparing generic divalproex sodium to the historical branded product at four health plans from April to October 2009. Outcomes reported are all-cause emergency room visits and fractures. One retrospective data extract was completed (January 2002-June 2008), followed by seven prospective monthly extracts (January 2008-November 2009). To evaluate delays in claims processing, we used three analytic approaches: near real-time sequential analysis, sequential analysis with a 1.5-month delay, and nonsequential analysis (using final retrospective data). Sequential analyses used the maximized sequential probability ratio test. Procedural and logistical barriers to active surveillance were documented. We identified 6586 new users of generic divalproex sodium and 43,960 new users of the branded product. Quality control methods identified 16 extract errors, which were corrected. Near real-time extracts captured 87.5% of emergency room visits and 50.0% of fractures, which improved to 98.3% and 68.7%, respectively, with the 1.5-month delay. We did not identify signals for either outcome regardless of extract timeframe, and slight differences in the test statistic and relative risk estimates were found. Near real-time sequential safety surveillance is feasible, but several barriers warrant attention. Data quality review of each data extract was necessary. Although signal detection was not affected by the delay in analysis, when using a historical control group, differential accrual between exposures and outcomes may theoretically bias near real-time risk estimates towards the null, causing failure to detect a signal. Copyright © 2013 John Wiley & Sons, Ltd.

  15. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  16. Detecting Signals of Disproportionate Reporting from Singapore's Spontaneous Adverse Event Reporting System: An Application of the Sequential Probability Ratio Test.

    PubMed

    Chan, Cheng Leng; Rudrappa, Sowmya; Ang, Pei San; Li, Shu Chuen; Evans, Stephen J W

    2017-08-01

    The ability to detect safety concerns from spontaneous adverse drug reaction reports in a timely and efficient manner remains important in public health. This paper explores the behaviour of the Sequential Probability Ratio Test (SPRT) and its ability to detect signals of disproportionate reporting (SDRs) in the Singapore context. We used the SPRT with a combination of two hypothesised relative risks (hRRs) of 2 and 4.1 to detect signals of both common and rare adverse events in our small database. We compared the SPRT with other methods in terms of the number of signals detected and whether labelled adverse drug reactions were detected or the reaction terms were considered serious. The other methods used were the reporting odds ratio (ROR), Bayesian Confidence Propagation Neural Network (BCPNN) and Gamma Poisson Shrinker (GPS). The SPRT produced 2187 signals in common with all methods, 268 unique signals, and 70 signals in common with at least one other method; it did not produce signals in 178 cases where two other methods detected them, and 403 signals were unique to one of the other methods. In terms of sensitivity, ROR performed better than the other methods, but the SPRT method found more new signals. The performances of the methods were similar for negative predictive value and specificity. Using a combination of hRRs for the SPRT could be a useful screening tool for regulatory agencies, and more detailed investigation of the medical utility of the system is merited.
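    For concreteness, the sketch below implements a Poisson-based SPRT statistic of the kind the abstract describes, comparing the observed number of reports for a drug-event pair with its expected count under a hypothesised relative risk. The database specifics and the exact error rates and thresholds used by the authors are not reproduced; the values shown are illustrative assumptions.

```python
import math

def poisson_sprt_llr(observed, expected, h_rr):
    """Log-likelihood ratio of H1 (relative risk = h_rr) versus H0 (relative
    risk = 1) for a Poisson report count with baseline expectation `expected`."""
    return observed * math.log(h_rr) - expected * (h_rr - 1.0)

def flag_sdr(observed, expected, h_rrs=(2.0, 4.1), alpha=0.05, beta=0.20):
    """Flag a signal of disproportionate reporting if the SPRT statistic crosses
    its upper boundary for any of the hypothesised relative risks.
    alpha and beta are illustrative error rates, not the values in the paper."""
    upper = math.log((1.0 - beta) / alpha)
    return any(poisson_sprt_llr(observed, expected, rr) >= upper for rr in h_rrs)
```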

  17. Effectiveness of motor sequential learning according to practice schedules in healthy adults; distributed practice versus massed practice

    PubMed Central

    Kwon, Yong Hyun; Kwon, Jung Won; Lee, Myoung Hee

    2015-01-01

    [Purpose] The purpose of the current study was to compare the effectiveness of motor sequential learning under two different practice schedules, a distributed practice schedule (two 12-hour inter-trial intervals) and a massed practice schedule (two 10-minute inter-trial intervals), using a serial reaction time (SRT) task. [Subjects and Methods] Thirty healthy subjects were recruited and then randomly and evenly assigned to either the distributed practice group or the massed practice group. All subjects performed three consecutive sessions of the SRT task following one of the two practice schedules. Distributed practice was scheduled with two 12-hour inter-session intervals including sleeping time, whereas massed practice was administered with two 10-minute inter-session intervals. Response time (RT) and response accuracy (RA) were measured at pre-test, mid-test, and post-test. [Results] For RT, univariate analysis demonstrated significant main effects in the within-group comparison of the three tests as well as an interaction effect of two groups × three tests, whereas the between-group comparison showed no significant effect. The results for RA showed no significant differences in either the between-group comparison or the interaction effect of two groups × three tests, whereas the within-group comparison of the three tests showed a significant main effect. [Conclusion] Distributed practice led to enhancement of motor skill acquisition at the first inter-session interval as well as at the second interval the following day, compared to massed practice. Consequently, the results of this study suggest that a distributed practice schedule can enhance the effectiveness of motor sequential learning in one-day as well as two-day learning formats compared with massed practice. PMID:25931727

  18. The PMHT: solutions for some of its problems

    NASA Astrophysics Data System (ADS)

    Wieneke, Monika; Koch, Wolfgang

    2007-09-01

    Tracking multiple targets in a cluttered environment is a challenging task. Probabilistic Multiple Hypothesis Tracking (PMHT) is an efficient approach for dealing with it. Essentially, PMHT is based on the method of Expectation-Maximization for handling association conflicts. Linearity in the number of targets and measurements is the main motivation for further development and extension of this methodology. Unfortunately, compared with the Probabilistic Data Association Filter (PDAF), PMHT has not yet shown its superiority in terms of track-lost statistics. Furthermore, the problem of track extraction and deletion is apparently not yet satisfactorily solved within this framework. Four properties of PMHT are responsible for its problems in track maintenance: Non-Adaptivity, Hospitality, Narcissism and Local Maxima. 1, 2 In this work we present a solution for each of them and derive an improved PMHT by integrating the solutions into the PMHT formalism. The new PMHT is evaluated by Monte-Carlo simulations. A sequential Likelihood-Ratio (LR) test for track extraction has been developed and already integrated into the framework of traditional Bayesian Multiple Hypothesis Tracking. 3 As a multi-scan approach, the PMHT methodology also has the potential for track extraction. In this paper an analogous integration of a sequential LR test into the PMHT framework is proposed. We present an LR formula for track extraction and deletion using the PMHT update formulae. As PMHT provides all required ingredients for a sequential LR calculation, the LR is a by-product of the PMHT iteration process. Therefore the resulting update formula for the sequential LR test affords the development of Track-Before-Detect algorithms for PMHT. The approach is illustrated by a simple example.

  19. Introducing Science Experiments to Rote-Learning Classes in Pakistani Middle Schools

    ERIC Educational Resources Information Center

    Pell, Anthony William; Iqbal, Hafiz Muhammad; Sohail, Shahida

    2010-01-01

    A mixed-methods sequential research design has been used to test the effect of introducing teacher science demonstrations to a traditional book-learning sample of 384 Grade 7 boys and girls from five schools in Lahore, Pakistan. In the quasi-experimental quantitative study, the eight classes of comparable ability were designated either…

  20. Lessons in Listening and Learning.

    ERIC Educational Resources Information Center

    Phibbs, Mary E.

    1991-01-01

    Presents a teacher's search for solutions to the problem of students not listening in science class. The author discovered that the sequential nature of aural origami makes it an excellent method for getting students to listen. Used cassette tape recordings of the paper-folding directions twice a week for a month. Students' test scores improved as well as…

  1. PATTERN RECOGNITION APPROACH TO MEDICAL DIAGNOSIS,

    DTIC Science & Technology

    A sequential method of pattern recognition was used to recognize hyperthyroidism in a sample of 2219 patients being treated at the Straub Clinic in...the most prominent class features are selected. Thus, the symptoms which best distinguish hyperthyroidism are extracted at every step and the number of tests required to reach a diagnosis is reduced. (Author)

  2. Evaluating the Mobility of Arsenic in Synthetic Iron-containing Solids Using a Modified Sequential Extraction Method

    PubMed Central

    Shan, Jilei; Sáez, A. Eduardo; Ela, Wendell P.

    2013-01-01

    Many water treatment technologies for arsenic removal that are used today produce arsenic-bearing residuals which are disposed in non-hazardous landfills. Previous works have established that many of these residuals will release arsenic to a much greater extent than predicted by standard regulatory leaching tests (e.g. the toxicity characteristic leaching procedure, TCLP) and, consequently, require stabilization to ensure benign behavior after disposal. In this work, a four-step sequential extraction method was developed in an effort to determine the proportion of arsenic in various phases in untreated as well as stabilized iron-based solid matrices. The solids synthesized using various potential stabilization techniques included: amorphous arsenic-iron sludge (ASL), reduced ASL via reaction with zero valent iron (RASL), amorphous ferrous arsenate (PFA), a mixture of PFA and SL (M1), crystalline ferrous arsenate (HPFA), and a mixture of HPFA and SL (M2). The overall arsenic mobility of the tested samples increased in the following order: ASL > RASL > PFA > M1 > HPFA > M2. PMID:23459695

  3. A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.

    2002-01-01

    In this paper we present a comparison of optimization approaches to the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, Simplex, Genetic Algorithms, and Simulated Annealing. Each method is applied to a variety of test cases including circular-to-circular coplanar orbits, LEO to GEO, and orbit phasing in highly elliptic orbits. We also compare different constrained optimization routines on complex orbit rendezvous problems with complicated, highly nonlinear constraints.
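    As a small illustration of the direct SQP approach mentioned above (not the authors' formulation), the sketch below minimises the total delta-v of a two-burn transfer using SciPy's SLSQP solver. The linear "rendezvous defect" constraint is a placeholder standing in for the full orbital dynamics, and all numbers are made up for the example.

```python
import numpy as np
from scipy.optimize import minimize

def total_delta_v(dv):
    """Objective: sum of the magnitudes of two impulsive burns, dv = [burn1, burn2]."""
    return np.linalg.norm(dv[:3]) + np.linalg.norm(dv[3:])

# Toy linear map standing in for the transfer dynamics: it sends the two burns
# to a terminal state error that the equality constraint drives to zero.
A = np.hstack([np.eye(3), 0.5 * np.eye(3)])
target = np.array([1.0, -0.5, 0.2])

def rendezvous_defect(dv):
    """Placeholder terminal rendezvous constraint; a real problem would
    propagate the orbital dynamics here instead of a fixed linear map."""
    return A @ dv - target

x0 = np.full(6, 0.1)  # start away from the non-smooth point at zero
result = minimize(total_delta_v, x0, method="SLSQP",
                  constraints=[{"type": "eq", "fun": rendezvous_defect}],
                  options={"maxiter": 200, "ftol": 1e-10})
print(result.x, result.fun)
```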

  4. A comparison of 2 methods of endoscopic laryngeal sensory testing: a preliminary study.

    PubMed

    Kaneoka, Asako; Krisciunas, Gintas P; Walsh, Kayo; Raade, Adele S; Langmore, Susan E

    2015-03-01

    This study examined the association between laryngeal sensory deficits and penetration or aspiration. Two methods of testing laryngeal sensation were carried out to determine which was more highly correlated with Penetration-Aspiration Scale (PAS) scores. Healthy participants and patients with dysphagia received an endoscopic swallowing evaluation including 2 sequential laryngeal sensory tests: the air pulse method followed by the touch method. Normal/impaired responses were correlated with PAS scores. Fourteen participants completed the endoscopic swallowing evaluation and both sensory tests. The air pulse method identified sensory impairment with greater frequency than the touch method (P < .0001). However, the impairment identified by the air pulse method was not associated with abnormal PAS scores (P = .46). The sensory deficits identified by the touch method were associated with abnormal PAS scores (P = .05). Sensory impairment detected by the air pulse method does not appear to be associated with risk of penetration/aspiration. Significant laryngeal sensory loss revealed by the touch method is associated with compromised airway protection. © The Author(s) 2014.

  5. Sequential peritoneal equilibration test: a new method for assessment and modelling of peritoneal transport.

    PubMed

    Galach, Magda; Antosiewicz, Stefan; Baczynski, Daniel; Wankowicz, Zofia; Waniewski, Jacek

    2013-02-01

    In spite of many peritoneal tests proposed, there is still a need for a simple and reliable new approach for deriving detailed information about peritoneal membrane characteristics, especially those related to fluid transport. The sequential peritoneal equilibration test (sPET) that includes PET (glucose 2.27%, 4 h) followed by miniPET (glucose 3.86%, 1 h) was performed in 27 stable continuous ambulatory peritoneal dialysis patients. Ultrafiltration volumes, glucose absorption, ratio of concentration in dialysis fluid to concentration in plasma (D/P), sodium dip (Dip D/P Sodium), free water fraction (FWF60) and the ultrafiltration passing through small pores at 60 min (UFSP60), were calculated using clinical data. Peritoneal transport parameters were estimated using the three-pore model (3p model) and clinical data. Osmotic conductance for glucose was calculated from the parameters of the model. D/P creatinine correlated with diffusive mass transport parameters for all considered solutes, but not with fluid transport characteristics. Hydraulic permeability (L(p)S) correlated with net ultrafiltration from miniPET, UFSP60, FWF60 and sodium dip. The fraction of ultrasmall pores correlated with FWF60 and sodium dip. The sequential PET described and interpreted mechanisms of ultrafiltration and solute transport. Fluid transport parameters from the 3p model were independent of the PET D/P creatinine, but correlated with fluid transport characteristics from PET and miniPET.

  6. Simple and flexible SAS and SPSS programs for analyzing lag-sequential categorical data.

    PubMed

    O'Connor, B P

    1999-11-01

    This paper describes simple and flexible programs for analyzing lag-sequential categorical data, using SAS and SPSS. The programs read a stream of codes and produce a variety of lag-sequential statistics, including transitional frequencies, expected transitional frequencies, transitional probabilities, adjusted residuals, z values, Yule's Q values, likelihood ratio tests of stationarity across time and homogeneity across groups or segments, transformed kappas for unidirectional dependence, bidirectional dependence, parallel and nonparallel dominance, and significance levels based on both parametric and randomization tests.
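
    The programs themselves are written in SAS and SPSS; as a rough Python illustration of the core lag-1 computations they report (transitional frequencies, expected transitional frequencies, transitional probabilities, and adjusted residuals), the sketch below processes a short hypothetical stream of event codes. The code stream and variable names are invented for illustration and this is not the authors' program.

      # Minimal sketch of lag-1 sequential statistics on a hypothetical code stream;
      # the SAS/SPSS programs described above compute these and many more.
      from collections import Counter
      from math import sqrt

      codes = list("ABACABBCABAC")   # hypothetical stream of event codes
      states = sorted(set(codes))

      freq = Counter(zip(codes[:-1], codes[1:]))   # observed lag-1 transitional frequencies
      n_trans = len(codes) - 1

      row = {a: sum(freq[(a, b)] for b in states) for a in states}
      col = {b: sum(freq[(a, b)] for a in states) for b in states}
      for a in states:
          for b in states:
              obs = freq[(a, b)]
              exp = row[a] * col[b] / n_trans                  # expected transitional frequency
              prob = obs / row[a] if row[a] else 0.0           # transitional probability
              denom = sqrt(exp * (1 - row[a] / n_trans) * (1 - col[b] / n_trans))
              adj = (obs - exp) / denom if denom else float("nan")   # adjusted residual
              print(f"{a}->{b}: f={obs} p={prob:.2f} exp={exp:.2f} z={adj:.2f}")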

  7. A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.

    2003-01-01

    In this paper we present a comparison of trajectory optimization approaches for the minimum fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, and Nelder-Mead simplex. Several cost function parameterizations are considered for the direct approach. We choose one direct approach that appears to be the most flexible. Both the direct and indirect methods are applied to a variety of test cases which are chosen to demonstrate the performance of each method in different flight regimes. The first test case is a simple circular-to-circular coplanar rendezvous. The second test case is an elliptic-to-elliptic line of apsides rotation. The final test case is an orbit phasing maneuver sequence in a highly elliptic orbit. For each test case we present a comparison of the performance of all methods we consider in this paper.
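
    As a hedged illustration of the direct approach with an SQP-type solver, the sketch below minimizes a toy two-impulse "fuel" cost subject to a single nonlinear equality constraint using SciPy's SLSQP method. The cost function, constraint, and numbers are invented stand-ins, not the primer-vector or rendezvous formulations used in the paper.

      # Toy direct optimization with an SQP-type solver (SciPy's SLSQP).
      # Cost and constraint are hypothetical, for illustration only.
      import numpy as np
      from scipy.optimize import minimize

      def total_dv(x):
          # "Fuel" proxy: sum of the magnitudes of two impulse vectors
          dv1, dv2 = x[:3], x[3:]
          return np.linalg.norm(dv1) + np.linalg.norm(dv2)

      def terminal_constraint(x):
          # Hypothetical terminal condition: combined along-track impulse must equal 1.0
          return (x[0] + x[3]) - 1.0

      x0 = np.full(6, 0.1)   # initial guess for the two 3-vector impulses
      result = minimize(total_dv, x0, method="SLSQP",
                        constraints=[{"type": "eq", "fun": terminal_constraint}])
      print(result.x, result.fun)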

  8. Finding False Paths in Sequential Circuits

    NASA Astrophysics Data System (ADS)

    Matrosova, A. Yu.; Andreeva, V. V.; Chernyshov, S. V.; Rozhkova, S. V.; Kudin, D. V.

    2018-02-01

    A method for finding false paths in sequential circuits is developed. In contrast with the heuristic approaches currently used abroad, a precise method based on applying operations to Reduced Ordered Binary Decision Diagrams (ROBDDs) extracted from the combinational part of a sequential control logic circuit is suggested. The method finds false paths when the transfer sequence length is not more than a given value and obviates the need to investigate combinational circuit equivalents of the given lengths. The possibilities of applying the developed method to more complicated circuits are discussed.

  9. SPMBR: a scalable algorithm for mining sequential patterns based on bitmaps

    NASA Astrophysics Data System (ADS)

    Xu, Xiwei; Zhang, Changhai

    2013-12-01

    Some current sequential pattern mining algorithms generate too many candidate sequences and incur a high processing cost for support counting. We therefore present an effective and scalable algorithm called SPMBR (Sequential Patterns Mining based on Bitmap Representation) for mining sequential patterns in large databases. Our method differs from previous work on mining sequential patterns: the main difference is that the sequence database is represented by bitmaps, and a simplified bitmap structure is introduced first. The algorithm generates candidate sequences by SE (Sequence Extension) and IE (Item Extension), and then obtains all frequent sequences by comparing the original bitmap with the extended item bitmap. This approach simplifies the problem of mining sequential patterns and avoids the high processing cost of support counting. Both theory and experiments indicate that the performance of SPMBR is superior for large transaction databases, that much less memory is required for storing temporary data during mining, and that all sequential patterns can be mined feasibly.
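
    The abstract does not spell out SPMBR's data structures; as a rough illustration of the general idea of bitmap-based sequence extension and support checking (in the spirit of vertical-bitmap miners), the Python sketch below uses integers as bitmaps over a single hypothetical sequence. All names and data are invented.

      # Rough sketch of bitmap-based sequence-extension checking.
      # One bit per itemset position in a single hypothetical sequence.
      sequence = [{"a"}, {"b"}, {"a", "c"}, {"b"}]   # hypothetical sequence database entry

      def item_bitmap(item):
          """Bit i is set if the item occurs in itemset i of the sequence."""
          bits = 0
          for i, itemset in enumerate(sequence):
              if item in itemset:
                  bits |= 1 << i
          return bits

      def s_extension(prefix_bits):
          """Bitmap of positions strictly after the first set bit of the prefix,
          used before ANDing in a sequence-extension step."""
          if prefix_bits == 0:
              return 0
          first = prefix_bits & -prefix_bits            # lowest set bit
          return ~((first << 1) - 1) & ((1 << len(sequence)) - 1)

      # Does the pattern <a, b> occur?  AND the extension bitmap of 'a' with the bitmap of 'b'.
      occurs = s_extension(item_bitmap("a")) & item_bitmap("b")
      print(bool(occurs))   # True: a 'b' appears after an 'a'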

  10. CFD simulation of hemodynamics in sequential and individual coronary bypass grafts based on multislice CT scan datasets.

    PubMed

    Hajati, Omid; Zarrabi, Khalil; Karimi, Reza; Hajati, Azadeh

    2012-01-01

    There is still controversy over the differences in the patency rates of the sequential and individual coronary artery bypass grafting (CABG) techniques. The purpose of this paper was to non-invasively evaluate hemodynamic parameters using complete 3D computational fluid dynamics (CFD) simulations of the sequential and the individual methods based on patient-specific data extracted from computed tomography (CT) angiography. For CFD analysis, the geometric model of the coronary arteries was reconstructed using an ECG-gated 64-detector row CT. Modeling the sequential and individual bypass grafting, this study simulates the flow from the aorta to the occluded posterior descending artery (PDA) and the posterior left ventricle (PLV) vessel with six coronary branches, using the physiologically measured inlet flow as the boundary condition. The maximum calculated wall shear stress (WSS) in the sequential and individual models was estimated to be 35.1 N/m² and 36.5 N/m², respectively. Compared to the individual bypass method, the sequential graft showed a higher velocity at the proximal segment and a lower spatial wall shear stress gradient (SWSSG) due to the flow splitting caused by the side-to-side anastomosis. The simulation results, combined with surgical benefits that include a shorter required vein length and fewer anastomoses, support the sequential method as the more favorable CABG technique.

  11. The bandwidth of consolidation into visual short-term memory (VSTM) depends on the visual feature

    PubMed Central

    Miller, James R.; Becker, Mark W.; Liu, Taosheng

    2014-01-01

    We investigated the nature of the bandwidth limit in the consolidation of visual information into visual short-term memory. In the first two experiments, we examined whether previous results showing differential consolidation bandwidth for color and orientation resulted from methodological differences by testing the consolidation of color information with methods used in prior orientation experiments. We briefly presented two color patches with masks, either sequentially or simultaneously, followed by a location cue indicating the target. Participants identified the target color via button-press (Experiment 1) or by clicking a location on a color wheel (Experiment 2). Although these methods have previously demonstrated that two orientations are consolidated in a strictly serial fashion, here we found equivalent performance in the sequential and simultaneous conditions, suggesting that two colors can be consolidated in parallel. To investigate whether this difference resulted from different consolidation mechanisms or a common mechanism with different features consuming different amounts of bandwidth, Experiment 3 presented a color patch and an oriented grating either sequentially or simultaneously. We found a lower performance in the simultaneous than the sequential condition, with orientation showing a larger impairment than color. These results suggest that consolidation of both features share common mechanisms. However, it seems that color requires less information to be encoded than orientation. As a result two colors can be consolidated in parallel without exceeding the bandwidth limit, whereas two orientations or an orientation and a color exceed the bandwidth and appear to be consolidated serially. PMID:25317065

  12. Multi-point objective-oriented sequential sampling strategy for constrained robust design

    NASA Astrophysics Data System (ADS)

    Zhu, Ping; Zhang, Siliang; Chen, Wei

    2015-03-01

    Metamodelling techniques are widely used to approximate system responses of expensive simulation models. In association with the use of metamodels, objective-oriented sequential sampling methods have been demonstrated to be effective in balancing the need for searching an optimal solution versus reducing the metamodelling uncertainty. However, existing infilling criteria are developed for deterministic problems and restricted to one sampling point in one iteration. To exploit the use of multiple samples and identify the true robust solution in fewer iterations, a multi-point objective-oriented sequential sampling strategy is proposed for constrained robust design problems. In this article, earlier development of objective-oriented sequential sampling strategy for unconstrained robust design is first extended to constrained problems. Next, a double-loop multi-point sequential sampling strategy is developed. The proposed methods are validated using two mathematical examples followed by a highly nonlinear automotive crashworthiness design example. The results show that the proposed method can mitigate the effect of both metamodelling uncertainty and design uncertainty, and identify the robust design solution more efficiently than the single-point sequential sampling approach.
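
    As a hedged, much-reduced sketch of objective-oriented sequential sampling (single-point, deterministic, and unconstrained, rather than the multi-point robust-design strategy proposed in the article), the snippet below fits a Gaussian-process metamodel and selects the next sample by expected improvement. The test function, kernel settings, and candidate grid are invented.

      # Simplified single-point sequential sampling with a GP metamodel and
      # expected improvement; a reduced stand-in for the multi-point strategy.
      import numpy as np
      from scipy.stats import norm
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      def expensive_model(x):              # hypothetical simulation response
          return np.sin(3 * x) + 0.5 * x

      X = np.array([[0.1], [0.9], [1.8]])  # initial design
      y = expensive_model(X).ravel()

      candidates = np.linspace(0.0, 2.0, 201).reshape(-1, 1)
      for _ in range(5):                   # five sequential infill iterations
          gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X, y)
          mu, sigma = gp.predict(candidates, return_std=True)
          best = y.min()
          z = (best - mu) / np.maximum(sigma, 1e-12)
          ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)   # expected improvement
          x_next = candidates[np.argmax(ei)].reshape(1, -1)
          X = np.vstack([X, x_next])
          y = np.append(y, expensive_model(x_next).ravel())

      print(X[np.argmin(y)], y.min())      # best design found so far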

  13. Configural and component processing in simultaneous and sequential lineup procedures.

    PubMed

    Flowe, Heather D; Smith, Harriet M J; Karoğlu, Nilda; Onwuegbusi, Tochukwu O; Rai, Lovedeep

    2016-01-01

    Configural processing supports accurate face recognition, yet it has never been examined within the context of criminal identification lineups. We tested, using the inversion paradigm, the role of configural processing in lineups. Recent research has found that face discrimination accuracy in lineups is better in a simultaneous compared to a sequential lineup procedure. Therefore, we compared configural processing in simultaneous and sequential lineups to examine whether there are differences. We had participants view a crime video, and then they attempted to identify the perpetrator from a simultaneous or sequential lineup. The test faces were presented either upright or inverted, as previous research has shown that inverting test faces disrupts configural processing. The size of the inversion effect for faces was the same across lineup procedures, indicating that configural processing underlies face recognition in both procedures. Discrimination accuracy was comparable across lineup procedures in both the upright and inversion condition. Theoretical implications of the results are discussed.

  14. [Bilateral cochlear implants in children: acquisition of binaural hearing].

    PubMed

    Ramos-Macías, Angel; Deive-Maggiolo, Leopoldo; Artiles-Cabrera, Ovidio; González-Aguado, Rocío; Borkoski-Barreiro, Silvia A; Masgoret-Palau, Elizabeth; Falcón-González, Juan C; Bueno-Yanes, Jorge

    2013-01-01

    Several studies have indicated the benefit of bilateral cochlear implants in the acquisition of binaural hearing and bilateralism. In children with cochlear implants, is it possible to achieve binaurality after a second implant, and when is the ideal time to implant it? The objective of this study was to analyse the binaural effect in children with bilateral implants and the differences between subjects with simultaneous implants and sequential implants with short and long inter-implant intervals. Ninety patients, aged between 1 and 2 years at the first surgery, were implanted between 2000 and 2008. Of these, 25 were unilateral users and 65 bilateral; 17 patients had received simultaneous implants, 29 had received the second implant within 12 months of the first (short inter-implant period) and 19 after 12 months (long period). All were tested for verbal perception in silence and in noise, and tonal threshold audiometry was performed. In the perception-in-silence test, the simultaneous and short-period sequential implant patients (mean, 84.67%) differed from the unilateral and long-period sequential implant patients (mean, 79.66%) with a statistically significant difference (P=0.23). Likewise, the perception-in-noise test showed a statistically significant difference (P=0.22) between the simultaneous and short-period sequential implant patients (mean, 77.17%) and the unilateral and long-period sequential ones (mean, 69.32%). The simultaneous and short-period sequential implants acquired the advantages of binaural hearing. Copyright © 2012 Elsevier España, S.L. All rights reserved.

  15. A sequential coalescent algorithm for chromosomal inversions

    PubMed Central

    Peischl, S; Koch, E; Guerrero, R F; Kirkpatrick, M

    2013-01-01

    Chromosomal inversions are common in natural populations and are believed to be involved in many important evolutionary phenomena, including speciation, the evolution of sex chromosomes and local adaptation. While recent advances in sequencing and genotyping methods are leading to rapidly increasing amounts of genome-wide sequence data that reveal interesting patterns of genetic variation within inverted regions, efficient simulation methods to study these patterns are largely missing. In this work, we extend the sequential Markovian coalescent, an approximation to the coalescent with recombination, to include the effects of polymorphic inversions on patterns of recombination. Results show that our algorithm is fast, memory-efficient and accurate, making it feasible to simulate large inversions in large populations for the first time. The SMC algorithm enables studies of patterns of genetic variation (for example, linkage disequilibria) and tests of hypotheses (using simulation-based approaches) that were previously intractable. PMID:23632894

  16. Comparison of Performance of Eight-Year-Old Children on Three Auditory Sequential Memory Tests.

    ERIC Educational Resources Information Center

    Chermak, Gail D.; O'Connell, Vickie I.

    1981-01-01

    Twenty normal children were administered three tests of auditory sequential memory. A Pearson product-moment correlation of .50 and coefficients of determination showed all but one relationship to be nonsignificant and predictability between pairs of scores to be poor. (Author)

  17. Sequential Computerized Mastery Tests--Three Simulation Studies

    ERIC Educational Resources Information Center

    Wiberg, Marie

    2006-01-01

    A simulation study of a sequential computerized mastery test is carried out with items modeled with the 3 parameter logistic item response theory model. The examinees' responses are either identically distributed, not identically distributed, or not identically distributed together with estimation errors in the item characteristics. The…

  18. The possibility of application of spiral brain computed tomography to traumatic brain injury.

    PubMed

    Lim, Daesung; Lee, Soo Hoon; Kim, Dong Hoon; Choi, Dae Seub; Hong, Hoon Pyo; Kang, Changwoo; Jeong, Jin Hee; Kim, Seong Chun; Kang, Tae-Sin

    2014-09-01

    Spiral computed tomography (CT), with its advantages of low radiation dose, shorter scan time, and multidimensional reconstruction, is accepted as an essential diagnostic method for evaluating the degree of injury in severe trauma patients and for establishing therapeutic plans. However, conventional sequential CT is preferred over spiral CT for the evaluation of traumatic brain injury (TBI) due to image noise and artifact. We aimed to compare the diagnostic power of spiral facial CT for TBI to that of conventional sequential brain CT. We retrospectively evaluated the images of 315 traumatized patients who underwent both brain CT and facial CT simultaneously. Hemorrhagic traumatic brain injuries such as epidural hemorrhage, subdural hemorrhage, subarachnoid hemorrhage, and contusional hemorrhage were evaluated in both image sets. Statistics were performed using Cohen's κ to assess agreement between the 2 imaging modalities, together with the sensitivity, specificity, positive predictive value, and negative predictive value of spiral facial CT against conventional sequential brain CT. Almost perfect agreement was noted for hemorrhagic traumatic brain injuries between spiral facial CT and conventional sequential brain CT (Cohen's κ coefficient, 0.912). Relative to conventional sequential brain CT, the sensitivity, specificity, positive predictive value, and negative predictive value of spiral facial CT were 92.2%, 98.1%, 95.9%, and 96.3%, respectively. In TBI, the diagnostic power of spiral facial CT was equal to that of conventional sequential brain CT. Therefore, expanded spiral facial CT covering the whole frontal lobe can be applied to evaluate TBI in the future. Copyright © 2014 Elsevier Inc. All rights reserved.
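
    The counts below are hypothetical, not the study's data; the sketch only shows how the agreement and diagnostic statistics reported here (Cohen's κ, sensitivity, specificity, and predictive values) are computed from a 2x2 table of results against the reference modality.

      # Agreement and diagnostic statistics from a hypothetical 2x2 table
      # (spiral facial CT vs. conventional sequential brain CT as reference).
      # a = both positive, b = spiral+/reference-, c = spiral-/reference+, d = both negative
      a, b, c, d = 94, 4, 8, 209          # invented counts for illustration
      n = a + b + c + d

      sensitivity = a / (a + c)
      specificity = d / (b + d)
      ppv = a / (a + b)
      npv = d / (c + d)

      po = (a + d) / n                                      # observed agreement
      pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2   # chance agreement
      kappa = (po - pe) / (1 - pe)                          # Cohen's kappa

      print(f"sens={sensitivity:.3f} spec={specificity:.3f} "
            f"ppv={ppv:.3f} npv={npv:.3f} kappa={kappa:.3f}")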

  19. Effects of neostriatal 6-OHDA lesion on performance in a rat sequential reaction time task.

    PubMed

    Domenger, D; Schwarting, R K W

    2008-10-31

    Work in humans and monkeys has provided evidence that the basal ganglia, and the neurotransmitter dopamine therein, play an important role in sequential learning and performance. Compared to primates, experimental work in rodents is rather sparse, largely due to the fact that tasks comparable to the human ones, especially serial reaction time tasks (SRTT), had been lacking until recently. We have developed a rat model of the SRTT, which allows the study of neural correlates of sequential performance and motor sequence execution. Here, we report the effects of dopaminergic neostriatal lesions, performed using bilateral 6-hydroxydopamine injections, on the performance of well-trained rats tested in our SRTT. Sequential behavior was measured in two ways: first, the effects of small violations of otherwise well-trained sequences were examined as a measure of attention and automation; second, sequential versus random performance was compared as a measure of sequential learning. Neurochemically, the lesions led to sub-total dopamine depletions in the neostriatum, which ranged around 60% in the lateral and around 40% in the medial neostriatum. These lesions led to a general instrumental impairment in terms of reduced speed (response latencies) and response rate, and these deficits were correlated with the degree of striatal dopamine loss. Furthermore, the violation test indicated that the lesion group made less automated responses. The comparison of random versus sequential responding showed that the lesion group did not retain its superior sequential performance in terms of speed, whereas it did in terms of accuracy. Also, rats with lesions did not improve further in overall performance compared to pre-lesion values, whereas controls did. These results support previous findings that neostriatal dopamine is involved in instrumental behavior in general. They also show that these lesions are not sufficient to completely abolish sequential performance, at least when the sequence was acquired before the lesion, as tested here.

  20. In-Flight System Identification

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1998-01-01

    A method is proposed and studied whereby the system identification cycle consisting of experiment design and data analysis can be repeatedly implemented aboard a test aircraft in real time. This adaptive in-flight system identification scheme has many advantages, including increased flight test efficiency, adaptability to dynamic characteristics that are imperfectly known a priori, in-flight improvement of data quality through iterative input design, and immediate feedback of the quality of flight test results. The technique uses equation error in the frequency domain with a recursive Fourier transform for the real time data analysis, and simple design methods employing square wave input forms to design the test inputs in flight. Simulation examples are used to demonstrate that the technique produces increasingly accurate model parameter estimates resulting from sequentially designed and implemented flight test maneuvers. The method has reasonable computational requirements, and could be implemented aboard an aircraft in real time.
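
    A hedged sketch of the recursive discrete Fourier transform update that makes real-time frequency-domain analysis practical: each new sample adds one complex term to a running accumulator at each analysis frequency. The signal, analysis frequencies, and sample rate below are hypothetical, and this is not the flight software described in the report.

      # Recursive (running) discrete Fourier transform at a few analysis
      # frequencies, updated one sample at a time; values are hypothetical.
      import numpy as np

      dt = 0.02                                  # sample interval [s] (assumed)
      freqs = np.array([0.5, 1.0, 2.0])          # analysis frequencies [Hz] (assumed)
      X = np.zeros(len(freqs), dtype=complex)    # running Fourier accumulators

      def update(X, sample, n):
          """Add sample n to the running DFT: X_k += x(n) * exp(-j*2*pi*f_k*n*dt)."""
          return X + sample * np.exp(-1j * 2 * np.pi * freqs * n * dt)

      t = np.arange(500) * dt
      signal = np.sin(2 * np.pi * 1.0 * t)       # toy measurement
      for n, x_n in enumerate(signal):
          X = update(X, x_n, n)                  # one update per new sample

      print(np.abs(X) * dt)                      # spectral magnitude estimates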

  1. A rapid method for the sequential separation of polonium, plutonium, americium and uranium in drinking water.

    PubMed

    Lemons, B; Khaing, H; Ward, A; Thakur, P

    2018-06-01

    A new sequential separation method for the determination of polonium and actinides (Pu, Am and U) in drinking water samples has been developed that can be used for emergency response or routine water analyses. For the first time, the application of a TEVA chromatography column to the sequential separation of polonium and plutonium has been studied. This method utilizes a rapid Fe³⁺ co-precipitation step to remove matrix interferences, followed by plutonium oxidation state adjustment to Pu⁴⁺ and an incubation period of ~1 h at 50-60 °C to allow Po²⁺ to oxidize to Po⁴⁺. The polonium and plutonium were then separated on a TEVA column, while separation of americium from uranium was performed on a TRU column. After separation, polonium was micro-precipitated with copper sulfide (CuS), while actinides were micro co-precipitated using neodymium fluoride (NdF₃) for counting by alpha spectrometry. The method is simple, robust and can be performed quickly with excellent removal of interferences, high chemical recovery and very good alpha peak resolution. The efficiency and reliability of the procedures were tested using spiked samples. The effects of several transition metals (Cu²⁺, Pb²⁺, Fe³⁺, Fe²⁺, and Ni²⁺) on the performance of this method were also assessed to evaluate potential matrix effects. Studies indicate that the presence of up to 25 mg of these cations in the samples had no adverse effect on the recovery or the resolution of the polonium alpha peaks. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Cost-benefit analysis of sequential warning lights in nighttime work zone tapers.

    DOT National Transportation Integrated Search

    2011-06-01

    Improving safety at nighttime work zones is important because of the extra visibility concerns. The deployment of sequential lights is an innovative method for improving driver recognition of lane closures and work zone tapers. Sequential lights are ...

  3. Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test

    NASA Technical Reports Server (NTRS)

    Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara

    2016-01-01

    The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. By using a modified version of Wald's sequential probability ratio test we demonstrate that we can reduce the number of false positives reported by the virtual bumper, thereby saving valuable mission time. We use the concept of the sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98% relative to traditional virtual bumper safeguarding without speed control.
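
    The paper uses a modified version of Wald's test tailored to the rover application; the sketch below is only the textbook sequential probability ratio test for distinguishing two Gaussian means from a stream of measurements, with invented means, noise level, and error rates, not the authors' modified algorithm.

      # Textbook Wald SPRT for H0: mean = mu0 ("clear") vs. H1: mean = mu1 ("obstacle")
      # on a stream of noisy measurements; all parameters are invented.
      import math
      import random

      mu0, mu1, sigma = 0.0, 1.0, 0.5    # assumed "clear" vs. "obstacle" means and noise
      alpha, beta = 0.01, 0.01           # target false-positive / false-negative rates
      A = math.log((1 - beta) / alpha)   # accept-H1 threshold
      B = math.log(beta / (1 - alpha))   # accept-H0 threshold

      def sprt(measurements):
          llr, n = 0.0, 0
          for n, x in enumerate(measurements, start=1):
              # Log-likelihood ratio increment for a Gaussian observation
              llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
              if llr >= A:
                  return "obstacle", n
              if llr <= B:
                  return "clear", n
          return "undecided", n

      random.seed(0)
      stream = (random.gauss(1.0, sigma) for _ in range(100))  # simulated obstacle returns
      print(sprt(stream))   # typically decides "obstacle" after only a few samples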

  4. Receiver operating characteristic analysis of eyewitness memory: comparing the diagnostic accuracy of simultaneous versus sequential lineups.

    PubMed

    Mickes, Laura; Flowe, Heather D; Wixted, John T

    2012-12-01

    A police lineup presents a real-world signal-detection problem because there are two possible states of the world (the suspect is either innocent or guilty), some degree of information about the true state of the world is available (the eyewitness has some degree of memory for the perpetrator), and a decision is made (identifying the suspect or not). A similar state of affairs applies to diagnostic tests in medicine because, in a patient, the disease is either present or absent, a diagnostic test yields some degree of information about the true state of affairs, and a decision is made about the presence or absence of the disease. In medicine, receiver operating characteristic (ROC) analysis is the standard method for assessing diagnostic accuracy. By contrast, in the eyewitness memory literature, this powerful technique has never been used. Instead, researchers have attempted to assess the diagnostic performance of different lineup procedures using methods that cannot identify the better procedure (e.g., by computing a diagnosticity ratio). Here, we describe the basics of ROC analysis, explaining why it is needed and showing how to use it to measure the performance of different lineup procedures. To illustrate the unique advantages of this technique, we also report 3 ROC experiments that were designed to investigate the diagnostic accuracy of simultaneous versus sequential lineups. According to our findings, the sequential procedure appears to be inferior to the simultaneous procedure in discriminating between the presence versus absence of a guilty suspect in a lineup.
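
    As a hedged sketch of the ROC approach advocated here, one can sweep a confidence criterion over suspect identifications and compare the resulting curves or areas for the two lineup procedures. The confidence ratings and target-present labels below are fabricated for illustration only and are not the authors' data.

      # ROC sketch for lineup data: sweep a confidence criterion over suspect
      # identifications and compare AUC across procedures (fabricated ratings).
      import numpy as np
      from sklearn.metrics import roc_curve, auc

      # confidence in the suspect identification (0 = no ID); 1 = target-present lineup
      conf_simultaneous = np.array([90, 80, 0, 70, 20, 0, 60, 0, 10, 0])
      present_sim       = np.array([ 1,  1, 1,  1,  0, 0,  1, 0,  0, 1])
      conf_sequential   = np.array([80, 0, 0, 60, 30, 0, 50, 0, 20, 0])
      present_seq       = np.array([ 1, 1, 1,  1,  0, 0,  1, 0,  0, 1])

      for label, conf, present in [("simultaneous", conf_simultaneous, present_sim),
                                   ("sequential", conf_sequential, present_seq)]:
          fpr, tpr, _ = roc_curve(present, conf)   # criterion swept over confidence
          print(label, "AUC =", round(auc(fpr, tpr), 3))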

  5. Video systems for real-time oil-spill detection

    NASA Technical Reports Server (NTRS)

    Millard, J. P.; Arvesen, J. C.; Lewis, P. L.; Woolever, G. F.

    1973-01-01

    Three airborne television systems are being developed to evaluate techniques for oil-spill surveillance. These include a conventional TV camera, two cameras operating in a subtractive mode, and a field-sequential camera. False-color enhancement and wavelength and polarization filtering are also employed. The first of a series of flight tests indicates that an appropriately filtered conventional TV camera is a relatively inexpensive method of improving contrast between oil and water. False-color enhancement improves the contrast, but the problem caused by sun glint now limits the application to overcast days. Future effort will be aimed toward a one-camera system. Solving the sun-glint problem and developing the field-sequential camera into an operable system offers potential for color 'flagging' oil on water.

  6. A Randomised Trial of empiric 14-day Triple, five-day Concomitant, and ten-day Sequential Therapies for Helicobacter pylori in Seven Latin American Sites

    PubMed Central

    Greenberg, E. Robert; Anderson, Garnet L.; Morgan, Douglas R.; Torres, Javier; Chey, William D.; Bravo, Luis Eduardo; Dominguez, Ricardo L.; Ferreccio, Catterina; Herrero, Rolando; Lazcano-Ponce, Eduardo C.; Meza-Montenegro, Mercedes María; Peña, Rodolfo; Peña, Edgar M.; Salazar-Martínez, Eduardo; Correa, Pelayo; Martínez, María Elena; Valdivieso, Manuel; Goodman, Gary E.; Crowley, John J.; Baker, Laurence H.

    2011-01-01

    Summary Background Evidence from Europe, Asia, and North America suggests that standard three-drug regimens of a proton pump inhibitor plus amoxicillin and clarithromycin are significantly less effective for eradicating Helicobacter pylori (H. pylori) infection than five-day concomitant and ten-day sequential four-drug regimens that include a nitroimidazole. These four-drug regimens also entail fewer antibiotic doses and thus may be suitable for eradication programs in low-resource settings. Studies are limited from Latin America, however, where the burden of H. pylori-associated diseases is high. Methods We randomised 1463 men and women ages 21–65 selected from general populations in Chile, Colombia, Costa Rica, Honduras, Nicaragua, and Mexico (two sites) who tested positive for H. pylori by a urea breath test (UBT) to: 14 days of lansoprazole, amoxicillin, and clarithromycin (standard therapy); five days of lansoprazole, amoxicillin, clarithromycin, and metronidazole (concomitant therapy); or five days of lansoprazole and amoxicillin followed by five of lansoprazole, clarithromycin, and metronidazole (sequential therapy). Eradication was assessed by UBT six–eight weeks after randomisation. Findings In intention-to-treat analyses, the probability of eradication with standard therapy was 82·2%, which was 8·6% higher (95% adjusted CI: 2·6%, 14·5%) than with concomitant therapy (73·6%) and 5·6% higher (95% adjusted CI: −0·04%, 11·6%) than with sequential therapy (76·5%). In analyses limited to the 1314 participants who adhered to their assigned therapy, the probabilities of eradication were 87·1%, 78·7%, and 81·1% with standard, concomitant, and sequential therapies, respectively. Neither four-drug regimen was significantly better than standard triple therapy in any of the seven sites. Interpretation Standard 14-day triple-drug therapy is preferable to five-day concomitant or ten-day sequential four-drug regimens as empiric therapy for H. pylori among diverse Latin American populations. Funding Bill & Melinda Gates Foundation and US National Institutes of Health. PMID:21777974

  7. Discovering Visual Scanning Patterns in a Computerized Cancellation Test

    ERIC Educational Resources Information Center

    Huang, Ho-Chuan; Wang, Tsui-Ying

    2013-01-01

    The purpose of this study was to develop an attention sequential mining mechanism for investigating the sequential patterns of children's visual scanning process in a computerized cancellation test. Participants had to locate and cancel the target amongst other non-targets in a structured form, and a random form with Chinese stimuli. Twenty-three…

  8. Robustness of the sequential lineup advantage.

    PubMed

    Gronlund, Scott D; Carlson, Curt A; Dailey, Sarah B; Goodsell, Charles A

    2009-06-01

    A growing movement in the United States and around the world involves promoting the advantages of conducting an eyewitness lineup in a sequential manner. We conducted a large study (N = 2,529) that included 24 comparisons of sequential versus simultaneous lineups. A liberal statistical criterion revealed only 2 significant sequential lineup advantages and 3 significant simultaneous advantages. Both sequential advantages occurred when the good photograph of the guilty suspect or either innocent suspect was in the fifth position in the sequential lineup; all 3 simultaneous advantages occurred when the poorer quality photograph of the guilty suspect or either innocent suspect was in the second position. Adjusting the statistical criterion to control for the multiple tests (.05/24) revealed no significant sequential advantages. Moreover, despite finding more conservative overall choosing for the sequential lineup, no support was found for the proposal that a sequential advantage was due to that conservative criterion shift. Unless lineups with particular characteristics predominate in the real world, there appears to be no strong preference for conducting lineups in either a sequential or a simultaneous manner. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  9. To Build a Better (Digital) Mousetrap: Testing the Learning Effectiveness of Al Gore's Book "Our Choice" against the iPad App Based on the Book in Order to Suggest Design Improvements for Future Enriched Ebooks

    ERIC Educational Resources Information Center

    Halliday, Steven W.

    2012-01-01

    This sequential explanatory mixed methods study tested the learning effectiveness of a codex book against a convergent media resource based on the same content. It also investigated whether users of the two formats reported any differences in their liking of the two formats, or in their tendency to be persuaded to the degree that they altered…

  10. Speciation of mercury in sludge solids: washed sludge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bannochie, C. J.; Lourie, A. P.

    2017-10-24

    The objective of this applied research task was to study the type and concentration of mercury compounds found within the contaminated Savannah River Site Liquid Waste System (SRS LWS). A method of selective sequential extraction (SSE), developed by Eurofins Frontier Global Sciences [1,2] and adapted by SRNL, utilizes an extraction procedure divided into seven separate tests for different species of mercury. In SRNL's modified procedure, four of these tests were applied to a washed sample of high level radioactive waste sludge.

  11. A proposed method to detect kinematic differences between and within individuals.

    PubMed

    Frost, David M; Beach, Tyson A C; McGill, Stuart M; Callaghan, Jack P

    2015-06-01

    The primary objective was to examine the utility of a novel method of detecting "actual" kinematic changes using the within-subject variation. Twenty firefighters were assigned to one of two groups (lifting or firefighting). Participants performed 25 repetitions of two lifting or firefighting tasks, in three sessions. The magnitude and within-subject variation of several discrete kinematic measures were computed. Sequential averages of each variable were used to derive a cubic, quadratic and linear regression equation. The efficacy of each equation was examined by contrasting participants' sequential means to their 25-trial mean±1SD and 2SD. The magnitude and within-subject variation of each dependent measure was repeatable for all tasks; however, each participant did not exhibit the same movement patterns as the group. The number of instances across all variables, tasks and testing sessions whereby the 25-trial mean±1SD was contained within the boundaries established by the regression equations increased as the aggregate scores included more trials. Each equation achieved success in at least 88% of all instances when three trials were included in the sequential mean (95% with five trials). The within-subject variation may offer a means to examine participant-specific changes without having to collect a large number of trials. Copyright © 2015 Elsevier Ltd. All rights reserved.
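
    A minimal sketch (with simulated trials rather than the firefighters' data) of the sequential-averaging idea described above: cumulative means over the first k trials are checked against the 25-trial mean ± 1 SD band. All values and the variable name are hypothetical.

      # Sketch of the sequential-averaging check: does the running mean of the
      # first k trials fall inside the 25-trial mean +/- 1 SD band?
      import numpy as np

      rng = np.random.default_rng(1)
      trials = rng.normal(loc=42.0, scale=3.0, size=25)   # hypothetical peak joint angle (deg)

      full_mean, full_sd = trials.mean(), trials.std(ddof=1)
      cumulative_means = np.cumsum(trials) / np.arange(1, 26)

      for k in (3, 5, 10):
          inside = abs(cumulative_means[k - 1] - full_mean) <= full_sd
          print(f"mean of first {k} trials inside mean +/- 1 SD band: {inside}")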

  12. Landsat-4 (TDRSS-user) orbit determination using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Hakimi, M.; Samii, M. V.; Doll, C. E.; Mistretta, G. D.; Hart, R. C.

    1992-01-01

    TDRSS user orbit determination is analyzed using a batch least-squares method and a sequential estimation method. It was found that in the batch least-squares method analysis, the orbit determination consistency for Landsat-4, which was heavily tracked by TDRSS during January 1991, was about 4 meters in the rms overlap comparisons and about 6 meters in the maximum position differences in overlap comparisons. The consistency was about 10 to 30 meters in the 3 sigma state error covariance function in the sequential method analysis. As a measure of consistency, the first residual of each pass was within the 3 sigma bound in the residual space.
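
    The flight-dynamics filters used for TDRSS orbit determination are far more elaborate; as a hedged, scalar illustration of the batch-versus-sequential distinction, the sketch below estimates a constant from noisy measurements both ways and shows that the two estimates agree. All numbers are invented.

      # Scalar illustration of batch least-squares vs. sequential (recursive)
      # estimation of a constant from noisy measurements.
      import numpy as np

      rng = np.random.default_rng(0)
      truth = 7000.0                                   # hypothetical "state" (e.g., km)
      z = truth + rng.normal(0.0, 0.005, size=50)      # noisy measurements

      # Batch least squares: one estimate from all measurements at once
      x_batch = z.mean()

      # Sequential estimation: update the estimate one measurement at a time
      x_hat, P, R = 0.0, 1e6, 0.005 ** 2               # prior estimate, prior variance, noise variance
      for zk in z:
          K = P / (P + R)                              # gain
          x_hat += K * (zk - x_hat)                    # measurement update
          P *= (1 - K)                                 # covariance update

      print(x_batch, x_hat)                            # the two estimates agree closely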

  13. Serious Game Leverages Productive Negativity to Facilitate Conceptual Change in Undergraduate Molecular Biology: A Mixed-Methods Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Gauthier, Andrea; Jenkinson, Jodie

    2017-01-01

    We designed a serious game, MolWorlds, to facilitate conceptual change about molecular emergence by using game mechanics (resource management, immersed 3rd person character, sequential level progression, and 3-star scoring system) to encourage cycles of productive negativity. We tested the value-added effect of game design by comparing and…

  14. Efficient algorithm for locating and sizing series compensation devices in large power transmission grids: II. Solutions and applications

    DOE PAGES

    Frolov, Vladimir; Backhaus, Scott; Chertkov, Misha

    2014-10-01

    In a companion manuscript, we developed a novel optimization method for placement, sizing, and operation of Flexible Alternating Current Transmission System (FACTS) devices to relieve transmission network congestion. Specifically, we addressed FACTS that provide Series Compensation (SC) via modification of line inductance. In this manuscript, this heuristic algorithm and its solutions are explored on a number of test cases: a 30-bus test network and a realistically-sized model of the Polish grid (~ 2700 nodes and ~ 3300 lines). The results on the 30-bus network are used to study the general properties of the solutions including non-locality and sparsity. The Polish grid is used as a demonstration of the computational efficiency of the heuristics that leverages sequential linearization of power flow constraints and cutting plane methods that take advantage of the sparse nature of the SC placement solutions. Using these approaches, the algorithm is able to solve an instance of the Polish grid in tens of seconds. We explore the utility of the algorithm by analyzing transmission networks congested by (a) uniform load growth, (b) multiple overloaded configurations, and (c) sequential generator retirements.

  15. Problematizing the concept of the "borderline" group in performance assessments.

    PubMed

    Homer, Matt; Pell, Godfrey; Fuller, Richard

    2017-05-01

    Many standard setting procedures focus on the performance of the "borderline" group, defined through expert judgments by assessors. In performance assessments such as Objective Structured Clinical Examinations (OSCEs), these judgments usually apply at the station level. Using largely descriptive approaches, we analyze the assessment profile of OSCE candidates at the end of a five year undergraduate medical degree program to investigate the consistency of the borderline group across stations. We look specifically at those candidates who are borderline in individual stations, and in the overall assessment. While the borderline group can be clearly defined at the individual station level, our key finding is that the membership of this group varies considerably across stations. These findings pose challenges for some standard setting methods, particularly the borderline group and objective borderline methods. They also suggest that institutions should ensure appropriate conjunctive rules to limit compensation in performance between stations to maximize "diagnostic accuracy". In addition, this work highlights a key benefit of sequential testing formats in OSCEs. In comparison with a traditional, single-test format, sequential models allow assessment of "borderline" candidates across a wider range of content areas with concomitant improvements in pass/fail decision-making.

  16. Efficient Algorithm for Locating and Sizing Series Compensation Devices in Large Transmission Grids: Solutions and Applications (PART II)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frolov, Vladimir; Backhaus, Scott N.; Chertkov, Michael

    2014-01-14

    In a companion manuscript, we developed a novel optimization method for placement, sizing, and operation of Flexible Alternating Current Transmission System (FACTS) devices to relieve transmission network congestion. Specifically, we addressed FACTS that provide Series Compensation (SC) via modification of line inductance. In this manuscript, this heuristic algorithm and its solutions are explored on a number of test cases: a 30-bus test network and a realistically-sized model of the Polish grid (~2700 nodes and ~3300 lines). The results on the 30-bus network are used to study the general properties of the solutions including non-locality and sparsity. The Polish grid is used as a demonstration of the computational efficiency of the heuristics that leverages sequential linearization of power flow constraints and cutting plane methods that take advantage of the sparse nature of the SC placement solutions. Using these approaches, the algorithm is able to solve an instance of the Polish grid in tens of seconds. We explore the utility of the algorithm by analyzing transmission networks congested by (a) uniform load growth, (b) multiple overloaded configurations, and (c) sequential generator retirements.

  17. Comparative study of six sequential spectrophotometric methods for quantification and separation of ribavirin, sofosbuvir and daclatasvir: An application on Laboratory prepared mixture, pharmaceutical preparations, spiked human urine, spiked human plasma, and dissolution test.

    PubMed

    Hassan, Wafaa S; Elmasry, Manal S; Elsayed, Heba M; Zidan, Dalia W

    2018-09-05

    In accordance with International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use (ICH) guidelines, six novel, simple and precise sequential spectrophotometric methods were developed and validated for the simultaneous analysis of Ribavirin (RIB), Sofosbuvir (SOF), and Daclatasvir (DAC) in their mixture without prior separation steps. These drugs are co-administered for the treatment of hepatitis C virus (HCV) infection. HCV is the cause of hepatitis C and some cancers such as liver cancer (hepatocellular carcinoma) and lymphomas in humans. The techniques consist of several sequential steps using zero, ratio and/or derivative spectra. DAC was first determined through direct spectrophotometry at 313.7 nm without any interference from the other two drugs, while RIB and SOF can be determined after ratio subtraction by five methods: the ratio difference spectrophotometric method, the successive derivative ratio method, constant center, the isoabsorptive method at 238.8 nm, and mean centering of the ratio spectra (MCR) at 224 nm and 258 nm for RIB and SOF, respectively. The calibration curves are linear over the concentration ranges of 6-42, 10-70 and 4-16 μg/mL for RIB, SOF, and DAC, respectively. The methods were successfully applied to commercial pharmaceutical preparations of the drugs, spiked human urine, and spiked human plasma. These are very simple methods, developed for the simultaneous determination of binary and ternary mixtures, that enhance the signal-to-noise ratio. They have been successfully applied to the simultaneous analysis of RIB, SOF, and DAC in laboratory prepared mixtures. The obtained results are statistically compared with those obtained by the official or reported methods, showing no significant difference with respect to accuracy and precision at p = 0.05. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Determination of yeast assimilable nitrogen content in wine fermentations by sequential injection analysis with spectrophotometric detection.

    PubMed

    Muik, Barbara; Edelmann, Andrea; Lendl, Bernhard; Ayora-Cañada, María José

    2002-09-01

    An automated method for measuring the primary amino acid concentration in wine fermentations by sequential injection analysis with spectrophotometric detection was developed. Isoindole derivatives of the primary amino acids were formed by reaction with o-phthaldialdehyde and N-acetyl-L-cysteine and measured at 334 nm against a baseline point at 700 nm to compensate for the observed Schlieren effect. As the reaction kinetics were strongly matrix dependent, the analytical readout was evaluated at the final reaction equilibrium. Four parallel reaction coils were therefore included in the flow system so that four samples could be processed simultaneously. Using isoleucine as the representative primary amino acid in wine fermentations, a linear calibration curve from 2 to 10 mM isoleucine, corresponding to 28 to 140 mg nitrogen/L (N/L), was obtained. The coefficient of variation of the method was 1.5% at a throughput of 12 samples per hour. The developed method was successfully used to monitor two wine fermentations during alcoholic fermentation. The results were in agreement with an external reference method based on high performance liquid chromatography. A mean t-test showed no significant differences between the two methods at a confidence level of 95%.
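
    A hedged sketch of the calibration step: a straight line is fitted through absorbance readings of isoleucine standards and inverted to read back an unknown sample. The absorbance values below are invented; only the 2-10 mM range and its stated equivalence to 28-140 mg N/L come from the abstract.

      # Fitting and using a linear calibration curve for isoleucine standards;
      # absorbance readings are invented for illustration.
      import numpy as np

      conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])             # mM isoleucine standards
      absorbance = np.array([0.11, 0.21, 0.32, 0.42, 0.53])   # hypothetical A(334 nm)

      slope, intercept = np.polyfit(conc, absorbance, 1)

      def to_concentration(a334):
          """Invert the calibration line to get mM primary amino acid."""
          return (a334 - intercept) / slope

      sample_abs = 0.27
      c_mM = to_concentration(sample_abs)
      # 1 mM corresponds to 14 mg N/L, per the stated 2-10 mM vs. 28-140 mg N/L range
      print(f"{c_mM:.2f} mM  ->  {c_mM * 14:.0f} mg N/L")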

  19. Translating Basic Behavioral and Social Science Research to Clinical Application: The EVOLVE Mixed Methods Approach

    PubMed Central

    Peterson, Janey C.; Czajkowski, Susan; Charlson, Mary E.; Link, Alissa R.; Wells, Martin T.; Isen, Alice M.; Mancuso, Carol A.; Allegrante, John P.; Boutin-Foster, Carla; Ogedegbe, Gbenga; Jobe, Jared B.

    2012-01-01

    Objective To describe a mixed-methods approach to develop and test a basic behavioral science-informed intervention to motivate behavior change in three high-risk clinical populations. Our theoretically-derived intervention comprised a combination of positive affect and self-affirmation (PA/SA) which we applied to three clinical chronic disease populations. Methods We employed a sequential mixed methods model (EVOLVE) to design and test the PA/SA intervention in order to increase physical activity in people with coronary artery disease (post-percutaneous coronary intervention [PCI]) or asthma (ASM), and to improve medication adherence in African Americans with hypertension (HTN). In an initial qualitative phase, we explored participant values and beliefs. We next pilot tested and refined the intervention, and then conducted three randomized controlled trials (RCTs) with parallel study design. Participants were randomized to combined PA/SA vs. an informational control (IC) and followed bimonthly for 12 months, assessing for health behaviors and interval medical events. Results Over 4.5 years, we enrolled 1,056 participants. Changes were sequentially made to the intervention during the qualitative and pilot phases. The three RCTs enrolled 242 PCI, 258 ASM and 256 HTN participants (n=756). Overall, 45.1% of PA/SA participants versus 33.6% of IC participants achieved successful behavior change (p=0.001). In multivariate analysis PA/SA intervention remained a significant predictor of achieving behavior change (p<0.002, OR=1.66, 95% CI 1.22–2.27), controlling for baseline negative affect, comorbidity, gender, race/ethnicity, medical events, smoking and age. Conclusions The EVOLVE method is a means by which basic behavioral science research can be translated into efficacious interventions for chronic disease populations. PMID:22963594

  20. Improving the Sequential Time Perception of Teenagers with Mild to Moderate Mental Retardation with 3D Immersive Virtual Reality (IVR)

    ERIC Educational Resources Information Center

    Passig, David

    2009-01-01

    Children with mental retardation have pronounced difficulties in using cognitive strategies and comprehending abstract concepts--among them, the concept of sequential time (Van-Handel, Swaab, De-Vries, & Jongmans, 2007). The perception of sequential time is generally tested by using scenarios presenting a continuum of actions. The goal of this…

  1. A detailed description of the sequential probability ratio test for 2-IMU FDI

    NASA Technical Reports Server (NTRS)

    Rich, T. M.

    1976-01-01

    The sequential probability ratio test (SPRT) for 2-IMU FDI (inertial measuring unit failure detection/isolation) is described. The SPRT is a statistical technique for detecting and isolating soft IMU failures originally developed for the strapdown inertial reference unit. The flowchart of a subroutine incorporating the 2-IMU SPRT is included.

  2. Simultaneous control of microorganisms and disinfection by-products by sequential chlorination.

    PubMed

    Chen, Chao; Zhang, Xiao-Jian; He, Wen-Jie; Han, Hong-Da

    2007-04-01

    To introduce a new sequential chlorination disinfection process in which short-term free chlorine and chloramine are sequentially added. Pilot tests of this sequential chlorination were carried out in a drinking water plant. The sequential chlorination disinfection process had the same or better efficiency on microbe (including virus) inactivation compared with the free chlorine disinfection process. There seemed to be some synergetic disinfection effect between free chlorine and monochloramine because they attacked different targets. The sequential chlorination disinfection process resulted in 35.7%-77.0% TTHM formation and 36.6%-54.8% THAA5 formation less than the free chlorination process. The poorer the water quality was, the more advantage the sequential chlorination disinfection had over the free chlorination. This process takes advantages of free chlorine's quick inactivation of microorganisms and chloramine's low disinfection by-product (DBP) yield and long-term residual effect, allowing simultaneous control of microbes and DBPs in an effective and economic way.

  3. Comparison of three commercially available fit-test methods.

    PubMed

    Janssen, Larry L; Luinenburg, D Michael; Mullins, Haskell E; Nelson, Thomas J

    2002-01-01

    American National Standards Institute (ANSI) standard Z88.10, Respirator Fit Testing Methods, includes criteria to evaluate new fit-tests. The standard allows generated aerosol, particle counting, or controlled negative pressure quantitative fit-tests to be used as the reference method to determine acceptability of a new test. This study examined (1) comparability of three Occupational Safety and Health Administration-accepted fit-test methods, all of which were validated using generated aerosol as the reference method; and (2) the effect of the reference method on the apparent performance of a fit-test method under evaluation. Sequential fit-tests were performed using the controlled negative pressure and particle counting quantitative fit-tests and the bitter aerosol qualitative fit-test. Of 75 fit-tests conducted with each method, the controlled negative pressure method identified 24 failures; bitter aerosol identified 22 failures; and the particle counting method identified 15 failures. The sensitivity of each method, that is, agreement with the reference method in identifying unacceptable fits, was calculated using each of the other two methods as the reference. None of the test methods met the ANSI sensitivity criterion of 0.95 or greater when compared with either of the other two methods. These results demonstrate that (1) the apparent performance of any fit-test depends on the reference method used, and (2) the fit-tests evaluated use different criteria to identify inadequately fitting respirators. Although "acceptable fit" cannot be defined in absolute terms at this time, the ability of existing fit-test methods to reject poor fits can be inferred from workplace protection factor studies.

  4. Learning to Monitor Machine Health with Convolutional Bi-Directional LSTM Networks

    PubMed Central

    Zhao, Rui; Yan, Ruqiang; Wang, Jinjiang; Mao, Kezhi

    2017-01-01

    In modern manufacturing systems and industries, more and more research efforts have been made in developing effective machine health monitoring systems. Among various machine health monitoring approaches, data-driven methods are gaining in popularity due to the development of advanced sensing and data analytic techniques. However, considering the noise, varying length and irregular sampling behind sensory data, this kind of sequential data cannot be fed into classification and regression models directly. Therefore, previous work focuses on feature extraction/fusion methods requiring expensive human labor and high quality expert knowledge. With the development of deep learning methods in the last few years, which redefine representation learning from raw data, a deep neural network structure named Convolutional Bi-directional Long Short-Term Memory networks (CBLSTM) has been designed here to address raw sensory data. CBLSTM firstly uses CNN to extract local features that are robust and informative from the sequential input. Then, bi-directional LSTM is introduced to encode temporal information. Long Short-Term Memory networks (LSTMs) are able to capture long-term dependencies and model sequential data, and the bi-directional structure enables the capture of past and future contexts. Stacked, fully-connected layers and the linear regression layer are built on top of bi-directional LSTMs to predict the target value. Here, a real-life tool wear test is introduced, and our proposed CBLSTM is able to predict the actual tool wear based on raw sensory data. The experimental results have shown that our model is able to outperform several state-of-the-art baseline methods. PMID:28146106
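
    A hedged Keras sketch of the overall CBLSTM-style layer stack (1-D convolution for local features, a bidirectional LSTM for temporal encoding, stacked fully-connected layers, and a linear regression output). Layer sizes, sequence length, and channel count are invented; the paper's exact architecture and hyperparameters are not reproduced here.

      # Rough Keras sketch of a CNN + bidirectional LSTM regressor in the spirit
      # of CBLSTM; sizes and hyperparameters are invented, not the paper's.
      import numpy as np
      from tensorflow.keras import layers, models

      timesteps, channels = 200, 3           # assumed sensory window: 200 samples, 3 channels

      model = models.Sequential([
          layers.Input(shape=(timesteps, channels)),
          layers.Conv1D(32, kernel_size=5, activation="relu", padding="same"),  # local features
          layers.MaxPooling1D(2),
          layers.Bidirectional(layers.LSTM(64)),                                # temporal encoding
          layers.Dense(64, activation="relu"),                                  # stacked FC layers
          layers.Dense(1)                                                       # linear output (e.g., tool wear)
      ])
      model.compile(optimizer="adam", loss="mse")

      # Train on fabricated data just to show the expected tensor shapes
      x = np.random.rand(16, timesteps, channels).astype("float32")
      y = np.random.rand(16, 1).astype("float32")
      model.fit(x, y, epochs=1, verbose=0)
      print(model.predict(x[:2]).shape)      # (2, 1) predicted values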

  5. Learning to Monitor Machine Health with Convolutional Bi-Directional LSTM Networks.

    PubMed

    Zhao, Rui; Yan, Ruqiang; Wang, Jinjiang; Mao, Kezhi

    2017-01-30

    In modern manufacturing systems and industries, more and more research efforts have been made in developing effective machine health monitoring systems. Among various machine health monitoring approaches, data-driven methods are gaining in popularity due to the development of advanced sensing and data analytic techniques. However, considering the noise, varying length and irregular sampling behind sensory data, this kind of sequential data cannot be fed into classification and regression models directly. Therefore, previous work focuses on feature extraction/fusion methods requiring expensive human labor and high quality expert knowledge. With the development of deep learning methods in the last few years, which redefine representation learning from raw data, a deep neural network structure named Convolutional Bi-directional Long Short-Term Memory networks (CBLSTM) has been designed here to address raw sensory data. CBLSTM firstly uses CNN to extract local features that are robust and informative from the sequential input. Then, bi-directional LSTM is introduced to encode temporal information. Long Short-Term Memory networks(LSTMs) are able to capture long-term dependencies and model sequential data, and the bi-directional structure enables the capture of past and future contexts. Stacked, fully-connected layers and the linear regression layer are built on top of bi-directional LSTMs to predict the target value. Here, a real-life tool wear test is introduced, and our proposed CBLSTM is able to predict the actual tool wear based on raw sensory data. The experimental results have shown that our model is able to outperform several state-of-the-art baseline methods.

  6. Accelerated drug release and clearance of PEGylated epirubicin liposomes following repeated injections: a new challenge for sequential low-dose chemotherapy

    PubMed Central

    Yang, Qiang; Ma, Yanling; Zhao, Yongxue; She, Zhennan; Wang, Long; Li, Jie; Wang, Chunling; Deng, Yihui

    2013-01-01

    Background Sequential low-dose chemotherapy has received great attention for its unique advantages in attenuating multidrug resistance of tumor cells. Nevertheless, it runs the risk of producing new problems associated with the accelerated blood clearance phenomenon, especially with multiple injections of PEGylated liposomes. Methods Liposomes were labeled with the fluorescent phospholipid 1,2-dipalmitoyl-sn-glycero-3-phosphoethanolamine-N-(7-nitro-2-1,3-benzoxadiazol-4-yl) and epirubicin (EPI). The pharmacokinetic profile and biodistribution of the drug and liposome carrier following multiple injections were determined. Meanwhile, the antitumor effect of sequential low-dose chemotherapy was tested. To clarify this unexpected phenomenon, experiments on the production of polyethylene glycol (PEG)-specific immunoglobulin M (IgM), drug release, and residual complement activity were conducted in serum. Results The first or sequential injections of PEGylated liposomes within a certain dose range induced the rapid clearance of subsequently injected PEGylated liposomal EPI. Of note, the clearance of EPI was two- to three-fold faster than that of the liposome itself, and a large amount of EPI was released from the liposomes in the first 30 minutes in a manner directly dependent on complement activation. The therapeutic efficacy of liposomal EPI at 0.75 mg EPI/kg body weight, given as sequential injections over 10 days in S180 tumor-bearing mice, was almost completely abolished between the sixth and tenth days of the sequential injections, even though the subsequently injected doses were doubled. The level of PEG-specific IgM in the blood increased rapidly, with a larger amount of complement being activated, while the concentration of EPI in blood and tumor tissue was significantly reduced. Conclusion Our investigation implied that the accelerated blood clearance phenomenon, and its accompanying rapid leakage and clearance of drug following sequential low-dose injections, may reverse the unique pharmacokinetic–toxicity profile of liposomes and therefore deserves attention. A more reasonable treatment regimen should be selected to lessen or even eliminate this phenomenon. PMID:23576868

  7. The analysis of verbal interaction sequences in dyadic clinical communication: a review of methods.

    PubMed

    Connor, Martin; Fletcher, Ian; Salmon, Peter

    2009-05-01

    To identify methods available for the sequential analysis of dyadic verbal clinical communication and to review their methodological and conceptual differences. Critical review, based on literature describing sequential analyses of clinical and other relevant social interaction. Dominant approaches analyze communication according to its precise position in the series of utterances that constitute event-coded dialogue. For practical reasons, these methods focus on very short-term processes, typically the influence of one party's speech on what the other says next; studies of longer-term influences are rare. Some analyses have statistical limitations, particularly in disregarding heterogeneity between consultations, patients, or practitioners. Additional techniques, including ones that can use information about the timing and duration of speech from interval coding, are becoming available. There is a danger that the constraints of commonly used methods shape research questions and divert researchers from potentially important communication processes, including ones that operate over a longer term than one or two speech turns. Given that no single method can model the complexity of clinical communication, multiple methods, both quantitative and qualitative, are necessary. Broadening the range of methods will allow the current emphasis on exploratory studies to be balanced by tests of hypotheses about clinically important communication processes.

  8. Cable tester

    NASA Astrophysics Data System (ADS)

    Rammage, Robert L.

    1990-10-01

    A device for sequentially testing the plurality of connectors in a wiring harness is disclosed. The harness is attached to the tester by means of adapter cables, and a rotary switch is used to test the connectors sequentially and individually by passing a current through each connector. If a connector is unbroken, a light flashes to show that it is electrically sound. The adapters allow a large number of cable configurations to be tested using a single tester configuration.

  9. Evaluation Using Sequential Trials Methods.

    ERIC Educational Resources Information Center

    Cohen, Mark E.; Ralls, Stephen A.

    1986-01-01

    Although dental school faculty as well as practitioners are interested in evaluating products and procedures used in clinical practice, research design and statistical analysis can sometimes pose problems. Sequential trials methods provide an analytical structure that is both easy to use and statistically valid. (Author/MLW)

  10. Evidence for decreased interaction and improved carotenoid bioavailability by sequential delivery of a supplement.

    PubMed

    Salter-Venzon, Dawna; Kazlova, Valentina; Izzy Ford, Samantha; Intra, Janjira; Klosner, Allison E; Gellenbeck, Kevin W

    2017-05-01

    Despite the notable benefits of carotenoids for human health, the majority of diets worldwide are repeatedly shown to fall short of current health recommendations for intake of carotenoid-rich fruits and vegetables. To address this deficit, strategies designed to increase dietary intakes and subsequent plasma levels of carotenoids are warranted. When mixed carotenoids are delivered into the intestinal tract simultaneously, they compete for micelle formation and absorption, affecting carotenoid bioavailability. Previously, we tested the in vitro viability of a carotenoid mix designed to deliver individual carotenoids sequentially spaced from one another over the 6 hr transit time of the human upper gastrointestinal system. We hypothesized that temporally and spatially separating the individual carotenoids would reduce competition for micelle formation, improve uptake, and maximize efficacy. Here, we test this hypothesis in a double-blind, repeated-measure, cross-over human study with 12 subjects by comparing the change in plasma carotenoid levels for 8 hr after oral doses of a sequentially spaced carotenoid mix and a matched mix without sequential spacing. We find that the carotenoid change from baseline, measured as area under the curve, is increased following consumption of the sequentially spaced mix compared with concomitant carotenoid delivery. These results demonstrate reduced interaction and regulation between the sequentially spaced carotenoids, suggesting improved bioavailability from a novel sequentially spaced carotenoid mix.

  11. An apparatus for sequentially combining microvolumes of reagents by infrasonic mixing.

    PubMed

    Camien, M N; Warner, R C

    1984-05-01

    A method employing high-speed infrasonic mixing for obtaining timed samples for following the progress of a moderately rapid chemical reaction is described. Drops of 10 to 50 microliter each of two reagents are mixed to initiate the reaction, followed, after a measured time interval, by mixing with a drop of a third reagent to quench the reaction. The method was developed for measuring the rate of denaturation of covalently closed, circular DNA in NaOH at several temperatures. For this purpose the timed samples were analyzed by analytical ultracentrifugation. The apparatus was tested by determination of the rate of hydrolysis of 2,4-dinitrophenyl acetate in an alkaline buffer. The important characteristics of the method are (i) it requires very small volumes of sample and reagents; (ii) the components of the reaction mixture are pre-equilibrated and mixed with no transfer outside the prescribed constant temperature environment; (iii) the mixing is very rapid; and (iv) satisfactorily precise measurements of relatively short time intervals (approximately 2 sec minimum) between sequential mixings of the components are readily obtainable.

  12. Non-parametric characterization of long-term rainfall time series

    NASA Astrophysics Data System (ADS)

    Tiwari, Harinarayan; Pandey, Brij Kishor

    2018-03-01

    The statistical study of rainfall time series is one of the approaches to efficient hydrological system design, and identifying and characterizing long-term rainfall time series can aid in improving hydrological forecasting. In the present study, eventual statistics was applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series, and the observed trend was corroborated using the innovative trend analysis method, which proved a strong tool for detecting the general trend of rainfall time series. The sequential Mann-Kendall test was also carried out to examine nonlinear trends in the series, and the partial sum of cumulative deviation test was likewise found suitable for detecting nonlinear trends. Innovative trend analysis, the sequential Mann-Kendall test, and the partial cumulative deviation test thus have the potential to detect both general and nonlinear trends in rainfall time series. Annual rainfall analysis suggests that the maximum change in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous India region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity, and singular spectrum analysis was applied to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones shows a higher departure from homogeneity, and the singular spectrum analysis results are coherent with this finding.
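
    For reference, the following is a minimal Python sketch of the (non-sequential) Mann-Kendall trend test mentioned above, using the normal approximation and ignoring tie corrections; the sequential variant applies the same statistic to progressive and retrograde sub-series. The example series is made up.

```python
# Minimal Mann-Kendall trend test (no tie correction), normal approximation.
import math

def mann_kendall(x):
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance without tie correction
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2))        # two-sided p-value
    return s, z, p

print(mann_kendall([3.1, 2.8, 3.5, 4.0, 4.2, 4.6, 5.1]))  # increasing series -> positive Z
```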

  13. The cost-effectiveness of 10 antenatal syphilis screening and treatment approaches in Peru, Tanzania, and Zambia

    PubMed Central

    Terris-Prestholt, Fern; Vickerman, Peter; Torres-Rueda, Sergio; Santesso, Nancy; Sweeney, Sedona; Mallma, Patricia; Shelley, Katharine D.; Garcia, Patricia J.; Bronzan, Rachel; Gill, Michelle M.; Broutet, Nathalie; Wi, Teodora; Watts, Charlotte; Mabey, David; Peeling, Rosanna W.; Newman, Lori

    2015-01-01

    Objective: Rapid plasma reagin (RPR) is frequently used to test women for maternal syphilis. Rapid syphilis immunochromatographic strip tests detecting only Treponema pallidum antibodies (single RSTs) or both treponemal and non-treponemal antibodies (dual RSTs) are now available. This study assessed the cost-effectiveness of algorithms using these tests to screen pregnant women. Methods: Observed costs of maternal syphilis screening and treatment using clinic-based RPR and single RSTs in 20 clinics across Peru, Tanzania, and Zambia were used to model the cost-effectiveness of algorithms using combinations of RPR, single, and dual RSTs, as well as no treatment and mass treatment. Sensitivity analyses determined the drivers of key results. Results: Although screening using RPR was relatively cheap, most (> 70%) true cases went untreated. Algorithms using single RSTs were the most cost-effective in all observed settings, followed by dual RSTs, which became the most cost-effective if dual RST costs were halved. Single-test algorithms dominated most sequential testing algorithms, although sequential algorithms reduced overtreatment. Mass treatment was relatively cheap and effective in the absence of screening supplies, though it treated many uninfected women. Conclusion: This analysis highlights the advantages of introducing RSTs in three diverse settings, and the results should be applicable to other similar settings. PMID:25963907

  14. Multi-volatile method for aroma analysis using sequential dynamic headspace sampling with an application to brewed coffee.

    PubMed

    Ochiai, Nobuo; Tsunokawa, Jun; Sasamoto, Kikuo; Hoffmann, Andreas

    2014-12-05

    A novel multi-volatile method (MVM) using sequential dynamic headspace (DHS) sampling for the analysis of aroma compounds in aqueous samples was developed. The MVM consists of three different DHS parameter sets, including the choice of replaceable adsorbent trap. The first DHS sampling, at 25 °C using a carbon-based adsorbent trap, targets very volatile solutes with high vapor pressure (>20 kPa). The second DHS sampling, at 25 °C using the same type of carbon-based adsorbent trap, targets volatile solutes with moderate vapor pressure (1-20 kPa). The third DHS sampling, using a Tenax TA trap at 80 °C, targets solutes with low vapor pressure (<1 kPa) and/or hydrophilic character. After the three sequential DHS samplings from the same HS vial, the three traps are thermally desorbed in reverse order of the DHS sampling, and the desorbed compounds are trapped and concentrated in a programmed temperature vaporizing (PTV) inlet and subsequently analyzed in a single GC-MS run. Recoveries of the 21 test aroma compounds for each DHS sampling and for the combined MVM procedure were evaluated as a function of vapor pressure in the range of 0.000088-120 kPa. The MVM provided very good recoveries in the range of 91-111%. The method showed good linearity (r2>0.9910) and high sensitivity (limit of detection: 1.0-7.5 ng mL(-1)) even in MS scan mode. The feasibility and benefit of the method were demonstrated by the analysis of a wide variety of aroma compounds in brewed coffee. Ten potent aroma compounds spanning top note to base note (acetaldehyde, 2,3-butanedione, 4-ethyl guaiacol, furaneol, guaiacol, 3-methyl butanal, 2,3-pentanedione, 2,3,5-trimethyl pyrazine, vanillin, and 4-vinyl guaiacol) could be identified together with an additional 72 aroma compounds. Thirty compounds, including 9 potent aroma compounds, were quantified in the range of 74-4300 ng mL(-1) (RSD<10%, n=5). Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Meta-analyses and adaptive group sequential designs in the clinical development process.

    PubMed

    Jennison, Christopher; Turnbull, Bruce W

    2005-01-01

    The clinical development process can be viewed as a succession of trials, possibly overlapping in calendar time. The design of each trial may be influenced by results from previous studies and other currently proceeding trials, as well as by external information. Results from all of these trials must be considered together in order to assess the efficacy and safety of the proposed new treatment, and meta-analysis techniques provide a formal way of combining the information. We examine how such methods can be used to combine results from: (1) a collection of separate studies, (2) a sequence of studies in an organized development program, and (3) stages within a single study using a (possibly adaptive) group sequential design. We present two examples. The first concerns combining results from a Phase IIb trial using several dose levels or treatment arms with those of the Phase III trial comparing the treatment selected in Phase IIb against a control; this enables a "seamless transition" from Phase IIb to Phase III. The second examines the use of combination tests to analyze data from an adaptive group sequential trial.

  16. Sample size determination in group-sequential clinical trials with two co-primary endpoints

    PubMed Central

    Asakura, Koko; Hamasaki, Toshimitsu; Sugimoto, Tomoyuki; Hayashi, Kenichi; Evans, Scott R; Sozu, Takashi

    2014-01-01

    We discuss sample size determination in group-sequential designs with two endpoints as co-primary. We derive the power and sample size within two decision-making frameworks. One is to claim the test intervention’s benefit relative to control when superiority is achieved for the two endpoints at the same interim timepoint of the trial. The other is when the superiority is achieved for the two endpoints at any interim timepoint, not necessarily simultaneously. We evaluate the behaviors of sample size and power with varying design elements and provide a real example to illustrate the proposed sample size methods. In addition, we discuss sample size recalculation based on observed data and evaluate the impact on the power and Type I error rate. PMID:24676799

  17. The Impact of Optional Flexible Year Program on Texas Assessment of Knowledge and Skills Test Scores of Fifth Grade Students

    ERIC Educational Resources Information Center

    Longbotham, Pamela J.

    2012-01-01

    The study examined the impact of participation in an optional flexible year program (OFYP) on academic achievement. The ex post facto study employed an explanatory sequential mixed methods design. The non-probability sample consisted of 163 fifth grade students in an OFYP district and 137 5th graders in a 180-day instructional year school…

  18. Development of an ADP Training Program to Serve the EPA Data Processing Community.

    DTIC Science & Technology

    1976-07-29

    divide, compute, perform and alter statements; data representation and conversion; table processing; and indexed sequential and random access file ... processing. The course workshop will include the testing of coded exercises and problems on a computer system. CLASS SIZE: Individualized. METHODS/CONDUCT ... familiarization with computer concepts will be helpful. OBJECTIVES OF CURRICULUM: After completing this course, the student should have developed a working

  19. Memory and other properties of multiple test procedures generated by entangled graphs.

    PubMed

    Maurer, Willi; Bretz, Frank

    2013-05-10

    Methods for addressing multiplicity in clinical trials have attracted much attention during the past 20 years. They include the investigation of new classes of multiple test procedures, such as fixed sequence, fallback and gatekeeping procedures. More recently, sequentially rejective graphical test procedures have been introduced to construct and visualize complex multiple test strategies. These methods propagate the local significance level of a rejected null hypothesis to not-yet rejected hypotheses. In the graph defining the test procedure, hypotheses together with their local significance levels are represented by weighted vertices and the propagation rule by weighted directed edges. An algorithm provides the rules for updating the local significance levels and the transition weights after rejecting an individual hypothesis. These graphical procedures have no memory in the sense that the origin of the propagated significance level is ignored in subsequent iterations. However, in some clinical trial applications, memory is desirable to reflect the underlying dependence structure of the study objectives. In such cases, it would allow the further propagation of significance levels to be dependent on their origin and thus reflect the grouped parent-descendant structures of the hypotheses. We will give examples of such situations and show how to induce memory and other properties by convex combination of several individual graphs. The resulting entangled graphs provide an intuitive way to represent the underlying relative importance relationships between the hypotheses, are as easy to perform as the original individual graphs, remain sequentially rejective and control the familywise error rate in the strong sense. Copyright © 2012 John Wiley & Sons, Ltd.
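
    For orientation, the sketch below implements the standard (memoryless) sequentially rejective graphical procedure that entangled graphs extend: local significance levels are propagated along weighted edges each time a hypothesis is rejected, and the transition weights are updated. The weights, graph, p-values, and overall alpha are made-up illustrations, not taken from the paper.

```python
# Standard sequentially rejective graphical procedure (memoryless version).
def graphical_test(p, weights, G, alpha=0.025):
    """p: p-values; weights: initial local weights (sum <= 1); G: transition matrix.
    Returns the set of rejected hypothesis indices."""
    m = len(p)
    active = set(range(m))
    rejected = set()
    w = list(weights)
    g = [row[:] for row in G]
    while True:
        j = next((i for i in active if p[i] <= w[i] * alpha), None)
        if j is None:
            return rejected
        rejected.add(j)
        active.discard(j)
        # Propagate the freed local level along the outgoing edges of j, then rewire the graph.
        new_w = {k: w[k] + w[j] * g[j][k] for k in active}
        new_g = {}
        for k in active:
            for l in active:
                if k == l:
                    continue
                denom = 1.0 - g[k][j] * g[j][k]
                new_g[(k, l)] = (g[k][l] + g[k][j] * g[j][l]) / denom if denom > 1e-12 else 0.0
        for k in active:
            w[k] = new_w[k]
            for l in active:
                if k != l:
                    g[k][l] = new_g[(k, l)]

# Two primary hypotheses (H0, H1) passing their levels on to two secondaries (H2, H3).
p_values = [0.010, 0.005, 0.030, 0.200]
init_weights = [0.5, 0.5, 0.0, 0.0]
graph = [[0, 0, 1, 0],
         [0, 0, 0, 1],
         [0, 1, 0, 0],
         [1, 0, 0, 0]]
print(sorted(graphical_test(p_values, init_weights, graph)))   # -> [0, 1]
```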

  20. Guided particle swarm optimization method to solve general nonlinear optimization problems

    NASA Astrophysics Data System (ADS)

    Abdelhalim, Alyaa; Nakata, Kazuhide; El-Alem, Mahmoud; Eltawil, Amr

    2018-04-01

    The development of hybrid algorithms is becoming an important topic in the global optimization research area. This article proposes a new technique for hybridizing the particle swarm optimization (PSO) algorithm and the Nelder-Mead (NM) simplex search algorithm to solve general nonlinear unconstrained optimization problems. Unlike traditional hybrid methods, the proposed method embeds the NM algorithm inside the PSO to improve the velocities and positions of the particles iteratively; the hybridization treats the PSO and NM algorithms as one heuristic rather than combining them in a sequential or hierarchical manner. The NM algorithm is applied to improve the initial random solution of the PSO algorithm and is then applied iteratively at every step to improve the overall performance of the method. The performance of the proposed method was tested on 20 optimization test functions with varying dimensions. Comprehensive comparisons with other methods in the literature indicate that the proposed solution method is promising and competitive.
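
    The following is an illustrative Python sketch in the spirit of the hybrid described above: a basic PSO loop in which the incumbent best position is refined by a few Nelder-Mead iterations at every step. The inertia and acceleration coefficients, bounds, and refinement schedule are assumptions, not the authors' settings.

```python
# Basic PSO with per-iteration Nelder-Mead refinement of the global best (illustrative only).
import numpy as np
from scipy.optimize import minimize

def hybrid_pso_nm(f, dim=2, n_particles=20, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))        # positions
    v = np.zeros_like(x)                              # velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
        # Local Nelder-Mead refinement of the incumbent best solution.
        res = minimize(f, gbest, method="Nelder-Mead",
                       options={"maxiter": 20, "xatol": 1e-8, "fatol": 1e-8})
        if res.fun < f(gbest):
            gbest = res.x
    return gbest, f(gbest)

rosenbrock = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
print(hybrid_pso_nm(rosenbrock))
```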

  1. Implementing Quality Criteria in Designing and Conducting a Sequential Quan [right arrow] Qual Mixed Methods Study of Student Engagement with Learning Applied Research Methods Online

    ERIC Educational Resources Information Center

    Ivankova, Nataliya V.

    2014-01-01

    In spite of recent methodological developments related to quality assurance in mixed methods research, practical examples of how to implement quality criteria in designing and conducting sequential QUAN [right arrow] QUAL mixed methods studies to ensure the process is systematic and rigorous remain scarce. This article discusses a three-step…

  2. A stacked sequential learning method for investigator name recognition from web-based medical articles

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoli; Zou, Jie; Le, Daniel X.; Thoma, George

    2010-01-01

    "Investigator Names" is a newly required field in MEDLINE citations. It consists of personal names listed as members of corporate organizations in an article. Extracting investigator names automatically is necessary because of the increasing volume of articles reporting collaborative biomedical research in which a large number of investigators participate. In this paper, we present an SVM-based stacked sequential learning method in a novel application - recognizing named entities such as the first and last names of investigators from online medical journal articles. Stacked sequential learning is a meta-learning algorithm which can boost any base learner. It exploits contextual information by adding the predicted labels of the surrounding tokens as features. We apply this method to tag words in text paragraphs containing investigator names, and demonstrate that stacked sequential learning improves the performance of a nonsequential base learner such as an SVM classifier.

  3. Sequential vs simultaneous revascularization in patients undergoing liver transplantation: A meta-analysis

    PubMed Central

    Wang, Jia-Zhong; Liu, Yang; Wang, Jin-Long; Lu, Le; Zhang, Ya-Fei; Lu, Hong-Wei; Li, Yi-Ming

    2015-01-01

    AIM: We undertook this meta-analysis to investigate the relationship between revascularization and outcomes after liver transplantation. METHODS: A literature search was performed using MeSH and key words. The quality of the included studies was assessed using the Jadad Score and the Newcastle-Ottawa Scale. Heterogeneity was evaluated by the χ2 and I2 tests. The risk of publication bias was assessed using a funnel plot and Egger’s test, and the risk of bias was assessed using a domain-based assessment tool. A sensitivity analysis was conducted by reanalyzing the data using different statistical approaches. RESULTS: Six studies with a total of 467 patients were included. Ischemic-type biliary lesions were significantly reduced in the simultaneous revascularization group compared with the sequential revascularization group (OR = 4.97, 95%CI: 2.45-10.07; P < 0.00001), and intensive care unit (ICU) days were decreased (MD = 2.00, 95%CI: 0.55-3.45; P = 0.007) in the simultaneous revascularization group. Although warm ischemia time was prolonged in the simultaneous revascularization group (MD = -25.84, 95%CI: -29.28 to -22.40; P < 0.00001), there were no significant differences in other outcomes between the sequential and simultaneous revascularization groups. Assessment of the risk of bias showed that the methods of random sequence generation and blinding might have been a source of bias. The sensitivity analysis strengthened the reliability of the results of this meta-analysis. CONCLUSION: The results of this study indicate that simultaneous revascularization in liver transplantation may reduce the incidence of ischemic-type biliary lesions and the length of stay of patients in the ICU. PMID:26078582

  4. System training and assessment in simultaneous proportional myoelectric prosthesis control

    PubMed Central

    2014-01-01

    Background: Pattern recognition control of prosthetic hands takes inputs from one or more myoelectric sensors and controls one or more degrees of freedom, but most systems allow only sequential control of one motion class at a time. Additionally, only recently have researchers demonstrated proportional myoelectric control in such systems, an option that is believed to make fine control easier for the user. Recent developments suggest improved reliability if the user follows a so-called prosthesis guided training (PGT) scheme. Methods: In this study, a system for simultaneous proportional myoelectric control was developed for a hand prosthesis with two motor functions (hand open/close and wrist pro-/supination). The prosthesis was used with a prosthesis socket equivalent designed for normally-limbed subjects, and an extended version of PGT was developed for use with proportional control. The control system’s performance was tested for two subjects in the Clothespin Relocation Task and the Southampton Hand Assessment Procedure (SHAP). Simultaneous proportional control was compared with three other control strategies implemented on the same prosthesis: mutex proportional control (the same system but with simultaneous control disabled), mutex on-off control, and a more traditional, sequential proportional control system with co-contractions for state switching. Results: The practical tests indicate that the simultaneous proportional control strategy and the two mutex-based pattern recognition strategies performed equally well, and superiorly to the more traditional sequential strategy, according to the chosen outcome measures. Conclusions: This is the first simultaneous proportional myoelectric control system demonstrated on a prosthesis affixed to the forearm of a subject. The study illustrates that PGT is a promising system training method for proportional control. Due to the limited number of subjects in this study, no definite conclusions can be drawn. PMID:24775602

  5. Two-IMU FDI performance of the sequential probability ratio test during shuttle entry

    NASA Technical Reports Server (NTRS)

    Rich, T. M.

    1976-01-01

    Performance data for the sequential probability ratio test (SPRT) during shuttle entry are presented. Current modeling constants and failure thresholds are included for the full mission 3B from entry through landing trajectory. Minimum 100 percent detection/isolation failure levels and a discussion of the effects of failure direction are presented. Finally, a limited comparison of failures introduced at trajectory initiation shows that the SPRT algorithm performs slightly worse than the data tracking test.
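
    As background, a minimal sketch of Wald's sequential probability ratio test for a shift in the mean of Gaussian observations with known standard deviation is shown below; the thresholds use the usual approximations A = ln((1-beta)/alpha) and B = ln(beta/(1-alpha)). The hypotheses and data are illustrative and unrelated to the IMU application.

```python
# Wald's SPRT for N(mu0, sigma) vs N(mu1, sigma); illustrative values only.
import math, random

def sprt_gaussian(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)    # accept H1 when crossed
    lower = math.log(beta / (1 - alpha))    # accept H0 when crossed
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        # Log-likelihood ratio increment for N(mu1, sigma) vs N(mu0, sigma).
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue sampling", n

random.seed(1)
data = (random.gauss(0.5, 1.0) for _ in range(1000))   # true mean 0.5
print(sprt_gaussian(data, mu0=0.0, mu1=0.5, sigma=1.0))
```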

  6. Acquisition of Inductive Biconditional Reasoning Skills: Training of Simultaneous and Sequential Processing.

    ERIC Educational Resources Information Center

    Lee, Seong-Soo

    1982-01-01

    Tenth-grade students (n=144) received training on one of three processing methods: coding-mapping (simultaneous), coding only, or decision tree (sequential). The induced simultaneous processing strategy worked optimally under rule learning, while the sequential strategy was difficult to induce and/or not optimal for rule-learning operations.…

  7. Therapeutic effect for liver-metastasized tumor by sequential intravenous injection of anionic polymer and cationic lipoplex of siRNA.

    PubMed

    Hattori, Yoshiyuki; Arai, Shohei; Kikuchi, Takuto; Ozaki, Kei-Ichi; Kawano, Kumi; Yonemochi, Etsuo

    2016-04-01

    Previously, we developed a novel siRNA transfer method to the liver by sequential intravenous injection of anionic polymer and cationic liposome/siRNA complex (cationic lipoplex). In this study, we investigated whether siRNA delivered by this sequential injection could significantly suppress mRNA expression of the targeted gene in liver metastasis and inhibit tumor growth. When cationic lipoplex was intravenously injected into mice bearing liver metastasis of human breast tumor MCF-7 at 1 min after intravenous injection of chondroitin sulfate C (CS) or poly-l-glutamic acid (PGA), siRNA was accumulated in tumor-metastasized liver. In terms of a gene silencing effect, sequential injections of CS or PGA plus cationic lipoplex of luciferase siRNA could reduce luciferase activity in liver MCF-7-Luc metastasis. Regarding the side effects, sequential injections of CS plus cationic lipoplex did not exhibit hepatic damage or induction of inflammatory cytokines in serum after repeated injections, but sequential injections of PGA plus cationic lipoplex did. Finally, sequential injections of CS plus cationic lipoplex of protein kinase N3 siRNA could suppress tumor growth in the mice bearing liver metastasis. From these findings, sequential injection of CS and cationic lipoplex of siRNA might be a novel systemic method of delivering siRNA to liver metastasis.

  8. Iodine speciation in coastal and inland bathing waters and seaweeds extracts using a sequential injection standard addition flow-batch method.

    PubMed

    Santos, Inês C; Mesquita, Raquel B R; Bordalo, Adriano A; Rangel, António O S S

    2015-02-01

    The present work describes the development of a sequential injection standard addition method for iodine speciation in bathing waters and seaweeds extracts without prior sample treatment. Iodine speciation was obtained by assessing the iodide and iodate content, the two inorganic forms of iodine in waters. For the determination of iodide, an iodide ion selective electrode (ISE) was used. The indirect determination of iodate was based on the spectrophotometric determination of nitrite (Griess reaction). For the iodate measurement, a mixing chamber was employed (flow batch approach) to explore the inherent efficient mixing, essential for the indirect determination of iodate. The application of the standard addition method enabled detection limits of 0.14 µM for iodide and 0.02 µM for iodate, together with the direct introduction of the target water samples, coastal and inland bathing waters. The results obtained were in agreement with those obtained by ICP-MS and a colorimetric reference procedure. Recovery tests also confirmed the accuracy of the developed method which was effectively applied to bathing waters and seaweed extracts. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. ANAEROBIC AND AEROBIC TREATMENT OF CHLORINATED ALIPHATIC COMPOUNDS

    EPA Science Inventory

    Biological degradation of 12 chlorinated aliphatic compounds (CACs) was assessed in bench-top reactors and in serum bottle tests. Three continuously mixed daily batch-fed reactor systems were evaluated: anaerobic, aerobic, and sequential-anaerobic-aerobic (sequential). Glucose,...

  10. Efficient sequential and parallel algorithms for record linkage

    PubMed Central

    Mamun, Abdullah-Al; Mi, Tian; Aseltine, Robert; Rajasekaran, Sanguthevar

    2014-01-01

    Background and objective: Integrating data from multiple sources is a crucial and challenging problem. Even though there exist numerous algorithms for record linkage or deduplication, they suffer from either large time needs or restrictions on the number of datasets that they can integrate. In this paper we report efficient sequential and parallel algorithms for record linkage which handle any number of datasets and outperform previous algorithms. Methods: Our algorithms employ hierarchical clustering algorithms as the basis. A key idea that we use is radix sorting on certain attributes to eliminate identical records before any further processing. Another novel idea is to form a graph that links similar records and find the connected components. Results: Our sequential and parallel algorithms have been tested on a real dataset of 1 083 878 records and synthetic datasets ranging in size from 50 000 to 9 000 000 records. Our sequential algorithm runs at least two times faster, for any dataset, than the previous best-known algorithm, the two-phase algorithm using faster computation of the edit distance (TPA (FCED)). The speedups obtained by our parallel algorithm are almost linear. For example, we get a speedup of 7.5 with 8 cores (residing in a single node), 14.1 with 16 cores (residing in two nodes), and 26.4 with 32 cores (residing in four nodes). Conclusions: We have compared the performance of our sequential algorithm with TPA (FCED) and found that our algorithm outperforms the previous one. The accuracy is the same as that of this previous best-known algorithm. PMID:24154837
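
    A condensed Python sketch of the two ideas highlighted in the abstract follows: exact duplicates are collapsed up front (hashing stands in for the radix-sort step), and a similarity graph over the remaining records is clustered via its connected components using union-find. The string-ratio similarity measure and threshold are illustrative assumptions, not the paper's similarity function.

```python
# Record linkage sketch: exact dedup, then connected components of a similarity graph.
from difflib import SequenceMatcher

def link_records(records, threshold=0.85):
    # 1) Collapse exact duplicates (order-preserving).
    unique = list(dict.fromkeys(records))
    # 2) Union-find over a graph of sufficiently similar record pairs.
    parent = list(range(len(unique)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    for i in range(len(unique)):
        for j in range(i + 1, len(unique)):
            if SequenceMatcher(None, unique[i], unique[j]).ratio() >= threshold:
                union(i, j)
    clusters = {}
    for i, rec in enumerate(unique):
        clusters.setdefault(find(i), []).append(rec)
    return list(clusters.values())

print(link_records(["john smith 1970", "john smith 1970", "jon smith 1970", "mary jones 1985"]))
```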

  11. Translating basic behavioral and social science research to clinical application: the EVOLVE mixed methods approach.

    PubMed

    Peterson, Janey C; Czajkowski, Susan; Charlson, Mary E; Link, Alissa R; Wells, Martin T; Isen, Alice M; Mancuso, Carol A; Allegrante, John P; Boutin-Foster, Carla; Ogedegbe, Gbenga; Jobe, Jared B

    2013-04-01

    To describe a mixed-methods approach to develop and test a basic behavioral science-informed intervention to motivate behavior change in 3 high-risk clinical populations. Our theoretically derived intervention comprised a combination of positive affect and self-affirmation (PA/SA), which we applied to 3 clinical chronic disease populations. We employed a sequential mixed methods model (EVOLVE) to design and test the PA/SA intervention in order to increase physical activity in people with coronary artery disease (post-percutaneous coronary intervention [PCI]) or asthma (ASM) and to improve medication adherence in African Americans with hypertension (HTN). In an initial qualitative phase, we explored participant values and beliefs. We next pilot tested and refined the intervention and then conducted 3 randomized controlled trials with parallel study design. Participants were randomized to combined PA/SA versus an informational control and were followed bimonthly for 12 months, assessing for health behaviors and interval medical events. Over 4.5 years, we enrolled 1,056 participants. Changes were sequentially made to the intervention during the qualitative and pilot phases. The 3 randomized controlled trials enrolled 242 participants who had undergone PCI, 258 with ASM, and 256 with HTN (n = 756). Overall, 45.1% of PA/SA participants versus 33.6% of informational control participants achieved successful behavior change (p = .001). In multivariate analysis, PA/SA intervention remained a significant predictor of achieving behavior change (p < .002, odds ratio = 1.66, 95% CI [1.22, 2.27]), controlling for baseline negative affect, comorbidity, gender, race/ethnicity, medical events, smoking, and age. The EVOLVE method is a means by which basic behavioral science research can be translated into efficacious interventions for chronic disease populations.

  12. A method for simultaneously counterbalancing condition order and assignment of stimulus materials to conditions.

    PubMed

    Zeelenberg, René; Pecher, Diane

    2015-03-01

    Counterbalanced designs are frequently used in the behavioral sciences. Studies often counterbalance either the order in which conditions are presented in the experiment or the assignment of stimulus materials to conditions. Occasionally, researchers need to simultaneously counterbalance both condition order and stimulus assignment to conditions. Lewis (1989; Behavior Research Methods, Instruments, & Computers 25:414-415, 1993) presented a method for constructing Latin squares that fulfill these requirements. The resulting Latin squares counterbalance immediate sequential effects, but not remote sequential effects. Here, we present a new method for generating Latin squares that simultaneously counterbalance both immediate and remote sequential effects and assignment of stimuli to conditions. An Appendix is provided to facilitate implementation of these Latin square designs.
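
    For context, the sketch below constructs a classic Williams-style balanced Latin square, which counterbalances condition order with respect to immediate (first-order) sequential effects only; the method described in the abstract goes further by also counterbalancing remote sequential effects and the assignment of stimuli to conditions, and is not reproduced here.

```python
# Classic row-complete (Williams-style) Latin square for an even number of conditions.
def williams_square(n):
    """Balanced Latin square for an even number of conditions n."""
    base = []
    lo, hi = 0, n - 1
    while lo <= hi:                 # interleave 0, n-1, 1, n-2, ...
        base.append(lo)
        if lo != hi:
            base.append(hi)
        lo, hi = lo + 1, hi - 1
    # Each row is a cyclic shift of the base sequence.
    return [[(c + shift) % n for c in base] for shift in range(n)]

for row in williams_square(4):
    print(row)   # each condition appears once per row and column; each ordered pair of
                 # adjacent conditions occurs exactly once across rows (n even)
```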

  13. Analyses of group sequential clinical trials.

    PubMed

    Koepcke, W

    1989-12-01

    In the first part of this article the methodology of group sequential plans is reviewed. After the basic definition of such plans is introduced, their main properties are shown. At the end of this section, three different plans (Pocock, O'Brien-Fleming, Koepcke) are compared. In the second part of the article some unresolved issues and recent developments in the application of group sequential methods to long-term controlled clinical trials are discussed. These include deviations from the assumptions, life table methods, multiple-arm clinical trials, multiple outcome measures, and confidence intervals.

  14. The Development of Man and His Culture: Old World Prehistory. Grade 5. Teacher Guide [And] Pupil Text [And] Pupil Guide [And] Teacher Background Material [And] A Sequential Curriculum in Anthropology. Test Form 5, Composite Form for Pre- and Post-Test. Revised, January 1968. Publications No. 25, 31, 23, 24 and 43.

    ERIC Educational Resources Information Center

    Potterfield, James E.; And Others

    This social studies unit includes a teaching guide, student text, study guide, teacher background material, and composite pretest/posttest covering archaeological methods, evolution, fossils and man, and development of culture during the prehistoric periods in the Old World. It is part of the Anthropology Curriculum Project and is designed for…

  15. Test Generation for Highly Sequential Circuits

    DTIC Science & Technology

    1989-08-01

    Sequential Circuits. Abhijit Ghosh, Srinivas Devadas, and A. Richard Newton. Abstract: We address the problem of generating test sequences for stuck-at ... Electrical Engineering and Computer Sciences, University of California, Berkeley, CA 94720. Devadas: Department of Electrical Engineering and Computer ...

  16. A Bayesian sequential design using alpha spending function to control type I error.

    PubMed

    Zhu, Han; Yu, Qingzhao

    2017-10-01

    We propose in this article a Bayesian sequential design using alpha spending functions to control the overall type I error in phase III clinical trials. We provide algorithms to calculate critical values, power, and sample sizes for the proposed design. Sensitivity analysis is implemented to check the effects of different prior distributions, and conservative priors are recommended. We compare the power and actual sample sizes of the proposed Bayesian sequential design with different alpha spending functions through simulations, and we also compare the power of the proposed method with that of a frequentist sequential design using the same alpha spending function. Simulations show that, at the same sample size, the proposed method provides larger power than the corresponding frequentist sequential design. It also has larger power than a traditional Bayesian sequential design that sets equal critical values for all interim analyses. Among the alpha spending functions compared, the O'Brien-Fleming alpha spending function has the largest power and is the most conservative, in the sense that at the same sample size the null hypothesis is the least likely to be rejected at an early stage of the trial. Finally, we show that adding a futility stopping step to the Bayesian sequential design can reduce the overall type I error and the actual sample sizes.
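
    A small sketch of the alpha spending idea follows, using the Lan-DeMets O'Brien-Fleming-type and Pocock-type spending functions to report how much of the overall one-sided alpha is spent at each interim look. Deriving the corresponding critical values additionally requires the joint distribution of the sequential test statistics, which is omitted here; the information fractions and alpha level are illustrative.

```python
# Cumulative and incremental alpha spent at each look for two common spending functions.
from math import log, e, sqrt
from scipy.stats import norm

def obrien_fleming_spend(t, alpha=0.025):
    return 2.0 * (1.0 - norm.cdf(norm.ppf(1.0 - alpha / 2.0) / sqrt(t)))

def pocock_spend(t, alpha=0.025):
    return alpha * log(1.0 + (e - 1.0) * t)

looks = [0.25, 0.5, 0.75, 1.0]            # information fractions of the interim analyses
for name, f in [("O'Brien-Fleming-type", obrien_fleming_spend), ("Pocock-type", pocock_spend)]:
    spent = [f(t) for t in looks]
    increments = [spent[0]] + [b - a for a, b in zip(spent, spent[1:])]
    print(name, [round(x, 5) for x in increments])
```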

  17. Optimal sequential measurements for bipartite state discrimination

    NASA Astrophysics Data System (ADS)

    Croke, Sarah; Barnett, Stephen M.; Weir, Graeme

    2017-05-01

    State discrimination is a useful test problem with which to clarify the power and limitations of different classes of measurement. We consider the problem of discriminating between given states of a bipartite quantum system via sequential measurement of the subsystems, with classical feed-forward of measurement results. Our aim is to understand when sequential measurements, which are relatively easy to implement experimentally, perform as well, or almost as well, as optimal joint measurements, which are in general more technologically challenging. We construct conditions that the optimal sequential measurement must satisfy, analogous to the well-known Helstrom conditions for minimum error discrimination in the unrestricted case. We give several examples and compare the optimal probability of correctly identifying the state via global versus sequential measurement strategies.

  18. Standard triple, bismuth pectin quadruple and sequential therapies for Helicobacter pylori eradication

    PubMed Central

    Gao, Xiao-Zhong; Qiao, Xiu-Li; Song, Wen-Chong; Wang, Xiao-Feng; Liu, Feng

    2010-01-01

    AIM: To compare the effectiveness of standard triple, bismuth pectin quadruple and sequential therapies for Helicobacter pylori (H. pylori) eradication in a randomized, double-blinded, comparative clinical trial in China. METHODS: A total of 215 H. pylori-positive patients were enrolled in the study and randomly allocated into three groups: group A (n = 72) received a 10-d bismuth pectin quadruple therapy (20 mg rabeprazole bid, 1000 mg amoxicillin bid, 100 mg bismuth pectin qid, and 500 mg levofloxacin qd); group B (n = 72) received the sequential therapy (20 mg omeprazole bid and 1000 mg amoxicillin bid for 5 d, followed by 20 mg omeprazole bid, 500 mg tinidazole bid, and 500 mg clarithromycin bid for another 5 d); group C (n = 71) received a standard 1-wk triple therapy (20 mg omeprazole bid, 1000 mg amoxicillin bid, 500 mg clarithromycin bid). After all these treatments, 20 mg omeprazole bid was administered for 3 wk. H. pylori status was assessed by histology, 13C-urea breath test and rapid urease test at baseline and 4-6 wk after completion of treatment. Ulcer cicatrization was assessed by gastroscopy. The χ2 test (P < 0.05) was used to compare the eradication rates and ulcer cicatrization rates among the three groups. RESULTS: The eradication rate was 83.33% (60/72) in group A, 88.89% (64/72) in group B, and 80.56% (58/71) in group C. The ulcer cicatrization rate was 86.44% (51/59) in group A, 90.16% (55/61) in group B, and 84.91% (45/53) in group C. The sequential therapy yielded a higher eradication rate and ulcer cicatrization rate than the standard triple and bismuth pectin quadruple therapies. Statistically, the eradication rate of group B was significantly different from those of groups A and C (P < 0.05), but the differences in ulcer cicatrization rate and side effects were not statistically significant among the three groups (P > 0.05). The three protocols were generally well tolerated. CONCLUSION: The sequential therapy achieved a significantly higher eradication rate and is a more suitable first-line protocol against H. pylori infection compared with the standard triple and bismuth pectin quadruple therapies. PMID:20818821

  19. Sequential Testing of Hypotheses Concerning the Reliability of a System Modeled by a Two-Parameter Weibull Distribution.

    DTIC Science & Technology

    1981-12-01

    Sequential Testing of Hypotheses Concerning the Reliability of a System Modeled by a Two-Parameter Weibull Distribution. Thesis AFIT/GOR/MA/81D-8, Philippe A. Lussier, 2nd Lt, USAF ... Presented to the Faculty of the School of Engineering of the Air Force Institute of Technology ... repetitions are used for these test procedures.

  20. A real-time comparison between direct control, sequential pattern recognition control and simultaneous pattern recognition control using a Fitts’ law style assessment procedure

    PubMed Central

    2014-01-01

    Background: Pattern recognition (PR) based strategies for the control of myoelectric upper limb prostheses are generally evaluated through offline classification accuracy, which is an admittedly useful metric but insufficient to characterize functional performance in real time. Existing functional tests are laborious to set up, and most fail to provide a challenging, objective framework for assessing strategy performance in real time. Methods: Nine able-bodied and two amputee subjects gave informed consent and participated in the study, which was approved by the local Institutional Review Board. We designed a two-dimensional target acquisition task based on the principles of Fitts’ law for human motor control. Subjects were prompted to steer a cursor from the screen center into a series of subsequently appearing targets of different difficulties. Three cursor control systems were tested, corresponding to three electromyography-based prosthetic control strategies: 1) amplitude-based direct control (the clinical standard of care), 2) sequential PR control, and 3) simultaneous PR control, allowing concurrent activation of two degrees of freedom (DOF). We computed throughput (bits/second), path efficiency (%), reaction time (seconds), and overshoot (%), and used general linear models to assess significant differences between the strategies for each metric. Results: We validated the proposed methodology by achieving very high coefficients of determination for Fitts’ law. Both PR strategies significantly outperformed direct control in two-DOF targets and were more intuitive to operate. In one-DOF targets, the simultaneous approach was the least precise. Direct control was efficient in one-DOF targets but cumbersome to operate in two-DOF targets through its switch-dependent sequential cursor control. Conclusions: We designed a test capable of comprehensively describing prosthetic control strategies in real time. When implemented on control subjects, the test was able to capture statistically significant differences (p < 0.05) in control strategies when considering throughputs, path efficiencies and reaction times. Of particular note, we found statistically significant (p < 0.01) improvements in throughputs and path efficiencies with simultaneous PR when compared to direct control or sequential PR. Amputees could readily achieve the task; however, a limited number of subjects was tested and a statistical analysis was not performed with that population. PMID:24886664
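
    For reference, the outcome measures above rest on Fitts'-law quantities such as the index of difficulty and throughput; a brief sketch using the Shannon formulation is given below, with made-up movement data.

```python
# Fitts'-law index of difficulty (Shannon formulation) and throughput.
import math

def index_of_difficulty(distance, width):
    return math.log2(distance / width + 1.0)                       # bits

def throughput(distance, width, movement_time):
    return index_of_difficulty(distance, width) / movement_time    # bits per second

# e.g. a 200-pixel movement into an 80-pixel target completed in 1.3 s
print(round(index_of_difficulty(200, 80), 3), "bits,",
      round(throughput(200, 80, 1.3), 3), "bits/s")
```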

  1. The compatibility of consumer DLP projectors with time-sequential stereoscopic 3D visualisation

    NASA Astrophysics Data System (ADS)

    Woods, Andrew J.; Rourke, Tegan

    2007-02-01

    A range of advertised "Stereo-Ready" DLP projectors are now available in the market which allow high-quality, flicker-free stereoscopic 3D visualization using the time-sequential stereoscopic display method. The ability to use a single projector for stereoscopic viewing offers a range of advantages, including extremely good stereoscopic alignment and, in some cases, portability. It has also recently become known that some consumer DLP projectors can be used for time-sequential stereoscopic visualization; however, it was not well understood which projectors are compatible or incompatible, which display modes (frequency and resolution) are compatible, and which stereoscopic display quality attributes are important. We conducted a study to test a wide range of projectors for stereoscopic compatibility. This paper reports on the testing of 45 consumer DLP projectors of widely differing specifications (brand, resolution, brightness, etc.). The projectors were tested for stereoscopic compatibility with various video formats (PAL, NTSC, 480P, 576P, and various VGA resolutions) and video input connections (composite, S-Video, component, and VGA). Fifteen projectors were found to work well at up to 85 Hz stereo in VGA mode, and twenty-three projectors worked at 60 Hz stereo in VGA mode.

  2. Classification and Sequential Pattern Analysis for Improving Managerial Efficiency and Providing Better Medical Service in Public Healthcare Centers

    PubMed Central

    Chung, Sukhoon; Rhee, Hyunsill; Suh, Yongmoo

    2010-01-01

    Objectives: This study sought to find answers to the following questions: 1) Can we predict whether a patient will revisit a healthcare center? 2) Can we anticipate the diseases of patients who revisit the center? Methods: For the first question, we applied 5 classification algorithms (decision tree, artificial neural network, logistic regression, Bayesian networks, and Naïve Bayes) and the stacking-bagging method for building classification models. To address the second question, we performed sequential pattern analysis. Results: We determined that: 1) in general, the most influential variables affecting whether a patient will revisit a public healthcare center are personal burden, insurance bill, period of prescription, age, systolic pressure, name of disease, and postal code; 2) the best plain classification model depends on the dataset; and 3) based on average classification accuracy, the proposed stacking-bagging method outperformed all traditional classification models, and the sequential pattern analysis revealed 16 sequential patterns. Conclusions: Classification models and sequential patterns can help public healthcare centers plan and implement healthcare service programs and businesses that are more appropriate to local residents, encouraging them to revisit public health centers. PMID:21818426
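
    An approximate scikit-learn sketch of a stacked ensemble over most of the base learners listed above is shown below; Bayesian networks are omitted, bagging the meta-learner only loosely mirrors the paper's stacking-bagging scheme, and the toy data and all hyperparameters are assumptions.

```python
# Stacked ensemble of several base classifiers with a bagged meta-learner (illustrative).
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

base_learners = [
    ("tree", DecisionTreeClassifier(max_depth=5)),
    ("ann", MLPClassifier(hidden_layer_sizes=(16,), max_iter=500)),
    ("logit", LogisticRegression(max_iter=1000)),
    ("nb", GaussianNB()),
]
revisit_model = StackingClassifier(
    estimators=base_learners,
    final_estimator=BaggingClassifier(DecisionTreeClassifier(), n_estimators=25),
    cv=5,
)

# Toy data standing in for patient-visit features and the revisit label.
X, y = make_classification(n_samples=300, n_features=8, random_state=0)
print(revisit_model.fit(X[:200], y[:200]).score(X[200:], y[200:]))
```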

  3. CONORBIT: constrained optimization by radial basis function interpolation in trust regions

    DOE PAGES

    Regis, Rommel G.; Wild, Stefan M.

    2016-09-26

    This paper presents CONORBIT (CONstrained Optimization by Radial Basis function Interpolation in Trust regions), a derivative-free algorithm for constrained black-box optimization where the objective and constraint functions are computationally expensive. CONORBIT employs a trust-region framework that uses interpolating radial basis function (RBF) models for the objective and constraint functions, and is an extension of the ORBIT algorithm. It uses a small margin for the RBF constraint models to facilitate the generation of feasible iterates, and extensive numerical tests confirm that such a margin is helpful in improving performance. CONORBIT is compared with other algorithms on 27 test problems, a chemical process optimization problem, and an automotive application. Numerical results show that CONORBIT performs better than COBYLA, a sequential penalty derivative-free method, an augmented Lagrangian method, a direct search method, and another RBF-based algorithm on the test problems and on the automotive application.

  4. A comparative synthesis and physicochemical characterizations of Ni/Al2O3-MgO nanocatalyst via sequential impregnation and sol-gel methods used for CO2 reforming of methane.

    PubMed

    Aghamohammadi, Sogand; Haghighi, Mohammad; Karimipour, Samira

    2013-07-01

    Carbon dioxide reforming of methane is an interesting route for synthesis gas production, especially over nano-sized catalysts. The present research deals with catalyst development for dry reforming of methane with the aim of reaching the most stable catalyst. The effect of the preparation method, one of the most significant variables, on the properties of the catalysts was taken into account. The Ni/Al2O3-MgO catalysts were prepared via sol-gel and sequential impregnation methods and characterized with XRD, FESEM, EDAX, BET and FTIR techniques. The reforming reactions were carried out using different feed ratios, gas hourly space velocities (GHSV) and reaction temperatures to identify the influence of the operational variables. FESEM images indicate a uniform particle size distribution for the sample synthesized by the sol-gel method. The sol-gel method was found to have the potential to improve the desired catalyst properties, especially metal surface enrichment, resulting in enhanced catalytic performance. The highest yield of products was obtained at 850 °C for both catalysts. During the 10 h stability test, CH4 and CO2 conversions reached higher values for the sol-gel-made catalyst than for the impregnated one.

  5. Universal Method for Creating Hierarchical Wrinkles on Thin-Film Surfaces.

    PubMed

    Jung, Woo-Bin; Cho, Kyeong Min; Lee, Won-Kyu; Odom, Teri W; Jung, Hee-Tae

    2018-01-10

    One of the most interesting topics in physical and materials science is the creation of complex wrinkled structures on thin-film surfaces, because of their advantages of high surface area, localized strain, and stress tolerance. In this study, a significant step was taken toward overcoming the limitations of previous artificial-wrinkle fabrication. A universal method was developed for preparing hierarchical three-dimensional wrinkle structures of thin films spanning multiple scales (e.g., nanometers to micrometers) by sequential wrinkling with different skin layers. Notably, this method is not limited to specific materials and has been applicable to fabricating hierarchical wrinkles on all of the thin-film surfaces tested thus far, including those of metals, two-dimensional and one-dimensional materials, and polymers. The hierarchical wrinkles with multiscale structures were prepared by sequential wrinkling, in which a sacrificial layer was used as the additional skin layer between sequences. For example, a hierarchical MoS2 wrinkle exhibited highly enhanced catalytic behavior because of its superaerophobicity and effective surface area, which are related to topological effects. As the developed method can be applied to a majority of thin films, it is thought to be a universal route to enhancing the physical properties of various materials.

  6. Asymptotic Properties of the Sequential Empirical ROC, PPV and NPV Curves Under Case-Control Sampling.

    PubMed

    Koopmeiners, Joseph S; Feng, Ziding

    2011-01-01

    The receiver operating characteristic (ROC) curve, the positive predictive value (PPV) curve and the negative predictive value (NPV) curve are three measures of performance for a continuous diagnostic biomarker. The ROC, PPV and NPV curves are often estimated empirically to avoid assumptions about the distributional form of the biomarkers. Recently, there has been a push to incorporate group sequential methods into the design of diagnostic biomarker studies. A thorough understanding of the asymptotic properties of the sequential empirical ROC, PPV and NPV curves will provide more flexibility when designing group sequential diagnostic biomarker studies. In this paper we derive asymptotic theory for the sequential empirical ROC, PPV and NPV curves under case-control sampling using sequential empirical process theory. We show that the sequential empirical ROC, PPV and NPV curves converge to the sum of independent Kiefer processes and show how these results can be used to derive asymptotic results for summaries of the sequential empirical ROC, PPV and NPV curves.

  8. Effective Identification of Similar Patients Through Sequential Matching over ICD Code Embedding.

    PubMed

    Nguyen, Dang; Luo, Wei; Venkatesh, Svetha; Phung, Dinh

    2018-04-11

    Evidence-based medicine often involves the identification of patients with similar conditions, which are often captured in ICD (International Classification of Diseases; World Health Organization 2013) code sequences. With no satisfactory prior solutions for matching ICD-10 code sequences, this paper presents a method which effectively captures the clinical similarity among routine patients who have multiple comorbidities and complex care needs. Our method leverages recent progress in representation learning of individual ICD-10 codes, and it explicitly uses the sequential order of codes for matching. Empirical evaluation on a state-wide cancer data collection shows that our proposed method achieves significantly higher matching performance compared with state-of-the-art methods that ignore the sequential order. Our method better identifies similar patients with respect to a number of clinical outcomes, including readmission and mortality outlook. Although this paper focuses on ICD-10 diagnosis code sequences, our method can be adapted to work with other codified sequence data.
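
    As an illustration of order-aware matching over code embeddings, the sketch below aligns two ICD code sequences with a simple dynamic-time-warping recursion whose local cost is one minus the cosine similarity of the codes' embedding vectors; the embedding table, codes, and alignment scheme are assumptions and may differ from the paper's actual matching procedure.

```python
# Order-aware similarity between two embedded code sequences via a DTW-style alignment.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def sequence_distance(seq_a, seq_b, emb):
    """DTW over embedded code sequences; lower means more similar patients."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 1.0 - cosine(emb[seq_a[i - 1]], emb[seq_b[j - 1]])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy 3-dimensional "embeddings" for a handful of hypothetical codes.
emb = {"I21": np.array([0.9, 0.1, 0.0]), "I50": np.array([0.8, 0.3, 0.1]),
       "E11": np.array([0.1, 0.9, 0.2]), "C50": np.array([0.0, 0.2, 0.9])}
print(sequence_distance(["I21", "I50", "E11"], ["I21", "E11"], emb))
```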

  9. Evaluation and costs of different haemoglobin methods for use in district hospitals in Malawi

    PubMed Central

    Medina Lara, A; Mundy, C; Kandulu, J; Chisuwo, L; Bates, I

    2005-01-01

    Aims: To evaluate the characteristics of manual haemoglobin methods in use in Malawi and provide evidence for the Ministry of Health in Malawi to enable them to choose a suitable method for district hospitals. Methods: Criteria on accuracy, clinical usefulness, user friendliness, speed, training time, and economic costs were determined by local health professionals and used to compare six different manual haemoglobin methods. These were introduced sequentially into use in a district hospital in Malawi alongside the reference method. Results: HemoCue was the optimal method based on most of the outcome measures but was also the most expensive (US$0.75/test). DHT meter and Jenway colorimeter were the second choice because they were cheaper (US$0.20–0.35/test), but they were not as accurate or user friendly as HemoCue. Conclusions: The process for choosing appropriate laboratory methods is complex and very little guidance is available for health managers in poorer countries. This paper describes the development and testing of a practical model for gathering evidence about test efficiency that could be adapted for use in other resource poor settings. PMID:15623483

  10. Controlling the type I error rate in two-stage sequential adaptive designs when testing for average bioequivalence.

    PubMed

    Maurer, Willi; Jones, Byron; Chen, Ying

    2018-05-10

    In a 2×2 crossover trial for establishing average bioequivalence (ABE) of a generic agent and a currently marketed drug, the recommended approach to hypothesis testing is the two one-sided test (TOST) procedure, which depends, among other things, on the estimated within-subject variability. The power of this procedure, and therefore the sample size required to achieve a minimum power, depends on having a good estimate of this variability. When there is uncertainty, it is advisable to plan the design in two stages, with an interim sample size reestimation after the first stage, using an interim estimate of the within-subject variability. One method and three variations of doing this were proposed by Potvin et al. Using simulation, the operating characteristics, including the empirical type I error rate, of the four variations (called Methods A, B, C, and D) were assessed by Potvin et al and Methods B and C were recommended. However, none of these four variations formally controls the type I error rate of falsely claiming ABE, even though the amount of inflation produced by Method C was considered acceptable. A major disadvantage of assessing type I error rate inflation using simulation is that unless all possible scenarios for the intended design and analysis are investigated, it is impossible to be sure that the type I error rate is controlled. Here, we propose an alternative, principled method of sample size reestimation that is guaranteed to control the type I error rate at any given significance level. This method uses a new version of the inverse-normal combination of p-values test, in conjunction with standard group sequential techniques, that is more robust to large deviations in initial assumptions regarding the variability of the pharmacokinetic endpoints. The sample size reestimation step is based on significance levels and power requirements that are conditional on the first-stage results. This necessitates a discussion and exploitation of the peculiar properties of the power curve of the TOST testing procedure. We illustrate our approach with an example based on a real ABE study and compare the operating characteristics of our proposed method with those of Method B of Potvin et al. Copyright © 2018 John Wiley & Sons, Ltd.
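
    For reference, a minimal sketch of the fixed-sample TOST step on the log scale is given below; it assumes that a point estimate of the log-mean difference, its standard error, and degrees of freedom are already available from the crossover analysis, and it does not include the inverse-normal combination or group sequential machinery proposed in the paper. The illustrative numbers are not taken from the paper's example study.

```python
import numpy as np
from scipy import stats

def tost_abe(diff_log, se, df, theta=np.log(1.25), alpha=0.05):
    """Two one-sided tests (TOST) for average bioequivalence on the log scale.

    diff_log: estimated difference in log means (test - reference), se: its
    standard error, df: its degrees of freedom. ABE is declared if both
    one-sided nulls are rejected at level alpha, i.e. if the
    100*(1 - 2*alpha)% CI for the difference lies inside (-theta, theta)."""
    p_lower = stats.t.sf((diff_log + theta) / se, df)   # H0: diff <= -theta
    p_upper = stats.t.cdf((diff_log - theta) / se, df)  # H0: diff >= +theta
    return max(p_lower, p_upper) < alpha, (float(p_lower), float(p_upper))

print(tost_abe(diff_log=0.05, se=0.08, df=22))
```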

  11. Native Frames: Disentangling Sequential from Concerted Three-Body Fragmentation

    NASA Astrophysics Data System (ADS)

    Rajput, Jyoti; Severt, T.; Berry, Ben; Jochim, Bethany; Feizollah, Peyman; Kaderiya, Balram; Zohrabi, M.; Ablikim, U.; Ziaee, Farzaneh; Raju P., Kanaka; Rolles, D.; Rudenko, A.; Carnes, K. D.; Esry, B. D.; Ben-Itzhak, I.

    2018-03-01

    A key question concerning the three-body fragmentation of polyatomic molecules is the distinction of sequential and concerted mechanisms, i.e., the stepwise or simultaneous cleavage of bonds. Using laser-driven fragmentation of OCS into O^+ + C^+ + S^+ and employing coincidence momentum imaging, we demonstrate a novel method that enables the clear separation of sequential and concerted breakup. The separation is accomplished by analyzing the three-body fragmentation in the native frame associated with each step and taking advantage of the rotation of the intermediate molecular fragment, CO^2+ or CS^2+, before its unimolecular dissociation. This native-frame method works for any projectile (electrons, ions, or photons), provides details on each step of the sequential breakup, and enables the retrieval of the relevant spectra for sequential and concerted breakup separately. Specifically, this allows the determination of the branching ratio of all these processes in OCS^3+ breakup. Moreover, we find that the first step of sequential breakup is tightly aligned along the laser polarization and identify the likely electronic states of the intermediate dication that undergo unimolecular dissociation in the second step. Finally, the separated concerted breakup spectra show clearly that the central carbon atom is preferentially ejected perpendicular to the laser field.

  12. Oral acute toxic class method: a successful alternative to the oral LD50 test.

    PubMed

    Schlede, Eva; Genschow, Elke; Spielmann, Horst; Stropp, Gisela; Kayser, Detlev

    2005-06-01

    The oral acute toxic class method (ATC method) was developed as an alternative to replace the oral LD50 test. The ATC method is a sequential testing procedure using only three animals of one sex per step at any of the defined dose levels. Depending on the mortality rate, three, but never more than six, animals are used per dose level. This approach reduces the number of animals used by 40-70% in comparison to the LD50 test. The principle of the oral ATC method is based on the Probit model, and it was first evaluated on a biometric basis before a national and subsequently an international ring study were conducted. The results demonstrated an excellent agreement between the toxicity and the animal numbers predicted biometrically and those observed in the validation studies. The oral ATC method was adopted as an official test guideline by OECD in 1996 and was slightly amended in 2001. The ATC method has been successfully used in Germany, and in 2003 >85% of all acute oral toxicity tests were conducted as oral ATC tests. In member states of the European Union, the ATC method is used in around 50% of all tests conducted. Meanwhile, the oral LD50 test has been deleted by OECD, by the European Union and by the USA, making the use of alternatives to the oral LD50 test mandatory.

  13. Low-dose cerebral perfusion computed tomography image restoration via low-rank and total variation regularizations

    PubMed Central

    Niu, Shanzhou; Zhang, Shanli; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Yu, Gaohang; Liang, Zhengrong; Ma, Jianhua

    2016-01-01

    Cerebral perfusion x-ray computed tomography (PCT) is an important functional imaging modality for evaluating cerebrovascular diseases and has been widely used in clinics over the past decades. However, due to the protocol of PCT imaging with repeated dynamic sequential scans, the associated radiation dose unavoidably increases as compared with that used in conventional CT examinations. Minimizing the radiation exposure in PCT examination is a major task in the CT field. In this paper, considering the rich similarity redundancy information among enhanced sequential PCT images, we propose a low-dose PCT image restoration model by incorporating the low-rank and sparse matrix characteristics of sequential PCT images. Specifically, the sequential PCT images were first stacked into a matrix (i.e., a low-rank matrix), and then a non-convex spectral norm/regularization and a spatio-temporal total variation norm/regularization were built on the low-rank matrix to describe the low rank and sparsity of the sequential PCT images, respectively. Subsequently, an improved split Bregman method was adopted to minimize the associated objective function with a reasonable convergence rate. Both qualitative and quantitative studies were conducted using a digital phantom and clinical cerebral PCT datasets to evaluate the present method. Experimental results show that the presented method can achieve images with several noticeable advantages over the existing methods in terms of noise reduction and universal quality index. More importantly, the present method can produce more accurate kinetic enhanced details and diagnostic hemodynamic parameter maps. PMID:27440948
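
    The sketch below illustrates only the low-rank ingredient of such a model: sequential frames are stacked as columns and passed through singular value thresholding, the proximal operator of the nuclear norm. It is an assumption-laden toy (a synthetic rank-one "anatomy" plus noise), not the paper's split Bregman scheme with the non-convex spectral norm and spatio-temporal total variation terms.

```python
import numpy as np

def svt(M, tau):
    """Singular value thresholding: the proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

# Hypothetical stack of 20 vectorised 64x64 frames sharing a rank-one "anatomy"
# plus noise, standing in for a low-dose sequential PCT acquisition.
rng = np.random.default_rng(0)
base = rng.normal(size=(64 * 64, 1)) @ np.ones((1, 20))
noisy = base + 0.5 * rng.normal(size=base.shape)
denoised = svt(noisy, tau=40.0)
print(np.linalg.matrix_rank(denoised),
      round(float(np.linalg.norm(denoised - base) / np.linalg.norm(base)), 3))
```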

  14. C-learning: A new classification framework to estimate optimal dynamic treatment regimes.

    PubMed

    Zhang, Baqun; Zhang, Min

    2017-12-11

    A dynamic treatment regime is a sequence of decision rules, each corresponding to a decision point, that determine the next treatment based on each individual's own available characteristics and treatment history up to that point. We show that identifying the optimal dynamic treatment regime can be recast as a sequential optimization problem and propose a direct sequential optimization method to estimate the optimal treatment regimes. In particular, at each decision point, the optimization is equivalent to sequentially minimizing a weighted expected misclassification error. Based on this classification perspective, we propose a powerful and flexible C-learning algorithm to learn the optimal dynamic treatment regimes backward sequentially from the last stage until the first stage. C-learning is a direct optimization method that directly targets optimizing decision rules by exploiting powerful optimization/classification techniques, and it allows incorporation of patients' characteristics and treatment history to improve performance, hence enjoying advantages of both the traditional outcome regression-based methods (Q- and A-learning) and the more recent direct optimization methods. The superior performance and flexibility of the proposed methods are illustrated through extensive simulation studies. © 2017, The International Biometric Society.

  15. Comparing multiple imputation methods for systematically missing subject-level data.

    PubMed

    Kline, David; Andridge, Rebecca; Kaizar, Eloise

    2017-06-01

    When conducting research synthesis, the studies to be combined often do not measure the same set of variables, which creates missing data. When the studies to combine are longitudinal, missing data can occur on the observation-level (time-varying) or the subject-level (non-time-varying). Traditionally, the focus of missing data methods for longitudinal data has been on missing observation-level variables. In this paper, we focus on missing subject-level variables and compare two multiple imputation approaches: a joint modeling approach and a sequential conditional modeling approach. We find the joint modeling approach to be preferable to the sequential conditional approach, except when the covariance structure of the repeated outcome for each individual has homogeneous variance and exchangeable correlation. Specifically, the regression coefficient estimates from an analysis incorporating imputed values based on the sequential conditional method are attenuated and less efficient than those from the joint method. Remarkably, the estimates from the sequential conditional method are often less efficient than a complete case analysis, which, in the context of research synthesis, implies that we lose efficiency by combining studies. Copyright © 2015 John Wiley & Sons, Ltd.

  16. Towards the operational estimation of a radiological plume using data assimilation after a radiological accidental atmospheric release

    NASA Astrophysics Data System (ADS)

    Winiarek, Victor; Vira, Julius; Bocquet, Marc; Sofiev, Mikhail; Saunier, Olivier

    2011-06-01

    In the event of an accidental atmospheric release of radionuclides from a nuclear power plant, accurate real-time forecasting of the activity concentrations of radionuclides is required by the decision makers for the preparation of adequate countermeasures. The accuracy of the forecast plume is highly dependent on the source term estimation. On several academic test cases, including real data, inverse modelling and data assimilation techniques were proven to help in the assessment of the source term. In this paper, a semi-automatic method is proposed for the sequential reconstruction of the plume, by implementing a sequential data assimilation algorithm based on inverse modelling, with care taken to develop realistic methods for operational risk agencies. The performance of the assimilation scheme has been assessed through an intercomparison between French and Finnish frameworks. Two dispersion models have been used: Polair3D and Silam, developed in two different research centres. Different release locations, as well as different meteorological situations, are tested. The existing and newly planned surveillance networks are used, and realistically large multiplicative observational errors are assumed. The inverse modelling scheme accounts for the strong error bias encountered with such errors. The efficiency of the data assimilation system is tested via statistical indicators. For France and Finland, the average performance of the data assimilation system is strong. However, there are outlying situations where the inversion fails because the observability is too poor. In addition, in the case where the power plant responsible for the accidental release is not known, robust statistical tools are developed and tested to discriminate between candidate release sites.

  17. Sequential lineup presentation promotes less-biased criterion setting but does not improve discriminability.

    PubMed

    Palmer, Matthew A; Brewer, Neil

    2012-06-01

    When compared with simultaneous lineup presentation, sequential presentation has been shown to reduce false identifications to a greater extent than it reduces correct identifications. However, there has been much debate about whether this difference in identification performance represents improved discriminability or more conservative responding. In this research, data from 22 experiments that compared sequential and simultaneous lineups were analyzed using a compound signal-detection model, which is specifically designed to describe decision-making performance on tasks such as eyewitness identification tests. Sequential (cf. simultaneous) presentation did not influence discriminability, but produced a conservative shift in response bias that resulted in less-biased choosing for sequential than simultaneous lineups. These results inform understanding of the effects of lineup presentation mode on eyewitness identification decisions.

  18. Automated ILA design for synchronous sequential circuits

    NASA Technical Reports Server (NTRS)

    Liu, M. N.; Liu, K. Z.; Maki, G. K.; Whitaker, S. R.

    1991-01-01

    An iterative logic array (ILA) architecture for synchronous sequential circuits is presented. This technique utilizes linear algebra to produce the design equations. The ILA realization of synchronous sequential logic can be fully automated with a computer program. A programmable design procedure is proposed to fulfill the design task and layout generation. A software algorithm in the C language has been developed and tested to generate 1 micron CMOS layouts using the Hewlett-Packard FUNGEN module generator shell.

  19. Modeling Eye Gaze Patterns in Clinician-Patient Interaction with Lag Sequential Analysis

    PubMed Central

    Montague, E; Xu, J; Asan, O; Chen, P; Chewning, B; Barrett, B

    2011-01-01

    Objective The aim of this study was to examine whether lag-sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multi-user health care settings where trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Background Nonverbal communication patterns are important aspects of clinician-patient interactions and may impact patient outcomes. Method Eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag-sequential method to identify significant behavior sequences. Lag-sequential analysis included both event-based lag and time-based lag. Results Results from event-based lag analysis showed that the patients’ gaze followed that of clinicians, while clinicians did not follow patients. Time-based sequential analysis showed that responses from the patient usually occurred within two seconds after the initial behavior of the clinician. Conclusion Our data suggest that the clinician’s gaze significantly affects the medical encounter but not the converse. Application Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs. PMID:22046723
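
    A minimal sketch of event-based lag-1 sequential analysis is shown below, assuming a single stream of hypothetical gaze codes; it tabulates observed lag-1 transitions and adjusted standardized residuals, which is one common way such analyses are summarized, though not necessarily the exact statistics used in this study.

```python
import numpy as np

def lag1_sequential(events, codes):
    """Event-based lag-1 sequential analysis (minimal sketch).

    events: behaviour codes in temporal order. Returns observed lag-1
    transition counts and adjusted standardized residuals; large positive
    residuals mean the column behaviour follows the row behaviour more often
    than expected by chance."""
    idx = {c: i for i, c in enumerate(codes)}
    k = len(codes)
    obs = np.zeros((k, k))
    for a, b in zip(events[:-1], events[1:]):
        obs[idx[a], idx[b]] += 1
    n = obs.sum()
    row, col = obs.sum(axis=1, keepdims=True), obs.sum(axis=0, keepdims=True)
    exp = row @ col / n
    resid = (obs - exp) / np.sqrt(exp * (1 - row / n) * (1 - col / n))
    return obs, resid

# Hypothetical gaze codes; a real analysis would use coded clinician/patient events.
codes = ["doc_gaze_patient", "doc_gaze_chart", "pat_gaze_doc", "pat_gaze_away"]
events = list(np.random.default_rng(3).choice(codes, size=200))
print(lag1_sequential(events, codes)[1].round(2))
```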

  20. Constrained optimization of sequentially generated entangled multiqubit states

    NASA Astrophysics Data System (ADS)

    Saberi, Hamed; Weichselbaum, Andreas; Lamata, Lucas; Pérez-García, David; von Delft, Jan; Solano, Enrique

    2009-08-01

    We demonstrate how the matrix-product state formalism provides a flexible structure to solve the constrained optimization problem associated with the sequential generation of entangled multiqubit states under experimental restrictions. We consider a realistic scenario in which an ancillary system with a limited number of levels performs restricted sequential interactions with qubits in a row. The proposed method relies on a suitable local optimization procedure, yielding an efficient recipe for the realistic and approximate sequential generation of any entangled multiqubit state. We give paradigmatic examples that may be of interest for theoretical and experimental developments.

  1. Sequential causal inference: Application to randomized trials of adaptive treatment strategies

    PubMed Central

    Dawson, Ree; Lavori, Philip W.

    2009-01-01

    SUMMARY Clinical trials that randomize subjects to decision algorithms, which adapt treatments over time according to individual response, have gained considerable interest as investigators seek designs that directly inform clinical decision making. We consider designs in which subjects are randomized sequentially at decision points, among adaptive treatment options under evaluation. We present a sequential method to estimate the comparative effects of the randomized adaptive treatments, which are formalized as adaptive treatment strategies. Our causal estimators are derived using Bayesian predictive inference. We use analytical and empirical calculations to compare the predictive estimators to (i) the ‘standard’ approach that allocates the sequentially obtained data to separate strategy-specific groups as would arise from randomizing subjects at baseline; (ii) the semi-parametric approach of marginal mean models that, under appropriate experimental conditions, provides the same sequential estimator of causal differences as the proposed approach. Simulation studies demonstrate that sequential causal inference offers substantial efficiency gains over the standard approach to comparing treatments, because the predictive estimators can take advantage of the monotone structure of shared data among adaptive strategies. We further demonstrate that the semi-parametric asymptotic variances, which are marginal ‘one-step’ estimators, may exhibit significant bias, in contrast to the predictive variances. We show that the conditions under which the sequential method is attractive relative to the other two approaches are those most likely to occur in real studies. PMID:17914714

  2. Gear tooth stress measurements of two helicopter planetary stages

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    1992-01-01

    Two versions of the planetary reduction stages from U.S. Army OH-58 helicopter main rotor transmissions were tested at NASA Lewis. One sequential and one nonsequential planetary were tested. Sun gear and ring gear teeth strains were measured, and stresses were calculated from the strains. The alternating stress at the fillet of both the loaded and unloaded sides of the teeth and at the root of the sun gear teeth are reported. Typical stress variations as the gear tooth moves through mesh are illustrated. At the tooth root location of the thin rimmed sun gear, a significant stress was produced by a phenomenon other than the passing of a planet gear. The load variation among the planets was studied. Each planet produced its own distinctive load distribution on the ring and sun gears. The load variation was less for a three planet, nonsequential design as compared to that of a four planet, sequential design. The reported results enhance the data base for gear stress levels and provide data for the validation of analytical methods.

  3. Evaluating the parent-adolescent communication toolkit: Usability and preliminary content effectiveness of an online intervention.

    PubMed

    Toombs, Elaine; Unruh, Anita; McGrath, Patrick

    2018-01-01

    This study aimed to assess the Parent-Adolescent Communication Toolkit, an online intervention designed to help improve parent communication with their adolescents. Participant preferences for two module delivery systems (sequential and unrestricted module access) were identified. Usability assessment of the PACT intervention was completed using pre-test and posttest comparisons. Usability data, including participant completion and satisfaction ratings, were examined. Parents (N = 18) of adolescents were randomized to a sequential or unrestricted chapter access group. Parent participants completed pre-test measures, the PACT intervention and posttest measures. Participants provided feedback for the intervention to improve modules and provided usability ratings. Adolescent pre- and posttest ratings were evaluated. Usability ratings were high and parent feedback was positive. The sequential module access group rated the intervention content higher and completed more content than the unrestricted chapter access group, indicating support for the sequential access design. Parent mean posttest communication scores were significantly higher (p < .05) than pre-test scores. No significant differences were detected for adolescent participants. Findings suggest that the Parent-Adolescent Communication Toolkit has potential to improve parent-adolescent communication but further effectiveness assessment is required.

  4. Expert system for online surveillance of nuclear reactor coolant pumps

    DOEpatents

    Gross, Kenny C.; Singer, Ralph M.; Humenik, Keith E.

    1993-01-01

    An expert system for online surveillance of nuclear reactor coolant pumps. This system provides a means for early detection of pump or sensor degradation. Degradation is determined through the use of a statistical analysis technique, sequential probability ratio test, applied to information from several sensors which are responsive to differing physical parameters. The results of sequential testing of the data provide the operator with an early warning of possible sensor or pump failure.

  5. Technical Reports Prepared Under Contract N00014-76-C-0475.

    DTIC Science & Technology

    1987-05-29

    Listing of technical reports (No., Title, Author, Date): 264, Approximations to Densities in Geometric Probability, H. Solomon and M.A. Stephens, 10/27/78; 265, Sequential ... Certain Multivariate Normal Probabilities, S. Iyengar, 8/12/82; 323, EDF Statistics for Testing for the Gamma Distribution with..., M.A. Stephens, 8/13/82; ...20-85, ... Nets; 360, Random Sequential Coding by Hamming Distance, Yoshiaki Itoh and Herbert Solomon, 07-11-85; 361, Transforming Censored Samples and Testing Fit.

  6. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.
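
    A generic Wald SPRT between two simple hypotheses can be sketched as below; the Gaussian example and error rates are illustrative assumptions and do not reproduce the collision-probability likelihood ratio derived in the paper.

```python
import numpy as np

def wald_sprt(observations, llr_fn, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test between two simple hypotheses.

    llr_fn(x) returns log[p1(x) / p0(x)] for a single observation; alpha and
    beta are the nominal error rates, giving thresholds A = log((1-beta)/alpha)
    and B = log(beta/(1-alpha)). Returns ('H1' | 'H0' | 'continue', n_used)."""
    upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    s = 0.0
    for n, x in enumerate(observations, start=1):
        s += llr_fn(x)
        if s >= upper:
            return "H1", n
        if s <= lower:
            return "H0", n
    return "continue", len(observations)

# Toy example: N(0, 1) under H0 versus N(1, 1) under H1 (assumed densities).
rng = np.random.default_rng(42)
llr = lambda x: x - 0.5          # log-likelihood ratio for unit-variance Gaussians
print(wald_sprt(rng.normal(1.0, 1.0, 100), llr))
```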

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Serin, E.; Codel, G.; Mabhouti, H.

    Purpose: In small field geometries, the electronic equilibrium can be lost, making it challenging for the dose-calculation algorithm to accurately predict the dose, especially in the presence of tissue heterogeneities. In this study, the dosimetric accuracy of the Monte Carlo (MC) advanced dose calculation and sequential algorithms of the Multiplan treatment planning system was investigated for small radiation fields incident on homogeneous and heterogeneous geometries. Methods: Small open fields of the fixed cones of a Cyberknife M6 unit (100 to 500 mm2) were used for this study. The fields were incident on an in-house phantom containing lung, air, and bone inhomogeneities and also on a homogeneous phantom. Using the same film batch, the net OD to dose calibration curve was obtained using CK with the 60 mm fixed cone by delivering 0-800 cGy. Films were scanned 48 hours after irradiation using an Epson 1000XL flatbed scanner. The dosimetric accuracy of the MC and sequential algorithms in the presence of the inhomogeneities was compared against EBT3 film dosimetry. Results: Open field tests in a homogeneous phantom showed good agreement between the two algorithms and film measurement. For the MC algorithm, the minimum gamma analysis passing rates between measured and calculated dose distributions were 99.7% and 98.3% for homogeneous and inhomogeneous fields in the case of lung and bone, respectively. For the sequential algorithm, the minimum gamma analysis passing rates were 98.9% and 92.5% for homogeneous and inhomogeneous fields, respectively, for all cone sizes used. In the case of the air heterogeneity, the differences were larger for both calculation algorithms. Overall, when compared to measurement, the MC algorithm had better agreement than the sequential algorithm. Conclusion: The Monte Carlo calculation algorithm in the Multiplan treatment planning system is an improvement over the existing sequential algorithm. Dose discrepancies were observed in the presence of air inhomogeneities.

  8. Temporal presentation protocols in stereoscopic displays: Flicker visibility, perceived motion, and perceived depth

    PubMed Central

    Hoffman, David M.; Karasev, Vasiliy I.; Banks, Martin S.

    2011-01-01

    Most stereoscopic displays rely on field-sequential presentation to present different images to the left and right eyes. With sequential presentation, images are delivered to each eye in alternation with dark intervals, and each eye receives its images in counter phase with the other eye. This type of presentation can exacerbate image artifacts including flicker, and the appearance of unsmooth motion. To address the flicker problem, some methods repeat images multiple times before updating to new ones. This greatly reduces flicker visibility, but makes motion appear less smooth. This paper describes an investigation of how different presentation methods affect the visibility of flicker, motion artifacts, and distortions in perceived depth. It begins with an examination of these methods in the spatio-temporal frequency domain. From this examination, it describes a series of predictions for how presentation rate, object speed, simultaneity of image delivery to the two eyes, and other properties ought to affect flicker, motion artifacts, and depth distortions, and reports a series of experiments that tested these predictions. The results confirmed essentially all of the predictions. The paper concludes with a summary and series of recommendations for the best approach to minimize these undesirable effects. PMID:21572544

  9. Exploiting Complexity Information for Brain Activation Detection

    PubMed Central

    Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui

    2016-01-01

    We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of the voxel complexity. Under this hypothesis, the voxel complexity could be modulated in pertinent cognitive tasks, and it changes across experimental paradigms. We calculate the complexity of sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well-known general linear model based Statistical Parametric Mapping package (SPM12), where a decided difference has been observed. This is because the SampEn method detects brain complexity changes in two experiments with different conditions, and the data-driven SampEn method evaluates just the complexity of the specific sequential fMRI data. Also, the larger and smaller SampEn values correspond to different meanings, and the neutral-blank design produces higher predictability than threat-neutral. Complexity information can be considered as a complementary method to the existing fMRI analysis strategies, and it may help improve the understanding of human brain functions from a different perspective. PMID:27045838
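
    A minimal sample entropy implementation is sketched below for concreteness; the tolerance r = 0.2 times the standard deviation and m = 2 are common defaults, not necessarily the settings used in the study, and the O(N^2) loop is written for clarity rather than speed.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series (plain O(N^2) sketch).

    Counts pairs of templates of length m (and m + 1) whose Chebyshev distance
    is within r, excluding self-matches, and returns -ln(A / B)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        hits = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            hits += int(np.sum(d <= r))
        return hits

    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
print(round(sample_entropy(rng.normal(size=500)), 3))            # irregular: higher
print(round(sample_entropy(np.sin(np.arange(500) / 5.0)), 3))    # regular: lower
```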

  10. A repeated measures model for analysis of continuous outcomes in sequential parallel comparison design studies.

    PubMed

    Doros, Gheorghe; Pencina, Michael; Rybin, Denis; Meisner, Allison; Fava, Maurizio

    2013-07-20

    Previous authors have proposed the sequential parallel comparison design (SPCD) to address the issue of high placebo response rate in clinical trials. The original use of SPCD focused on binary outcomes, but recent use has since been extended to continuous outcomes that arise more naturally in many fields, including psychiatry. Analytic methods proposed to date for analysis of SPCD trial continuous data included methods based on seemingly unrelated regression and ordinary least squares. Here, we propose a repeated measures linear model that uses all outcome data collected in the trial and accounts for data that are missing at random. An appropriate contrast formulated after the model has been fit can be used to test the primary hypothesis of no difference in treatment effects between study arms. Our extensive simulations show that when compared with the other methods, our approach preserves the type I error even for small sample sizes and offers adequate power and the smallest mean squared error under a wide variety of assumptions. We recommend consideration of our approach for analysis of data coming from SPCD trials. Copyright © 2013 John Wiley & Sons, Ltd.

  11. Examination of the Relation between TEOG Score of Turkish Revolution History and Kemalism Course and Reading Comprehension Skill (An Example of Explanatory Sequential Mixed Design)

    ERIC Educational Resources Information Center

    Yuvaci, Ibrahim; Demir, Selçuk Besir

    2016-01-01

    This paper aims to determine the relation between reading comprehension skill and TEOG success. In this research, a mixed research method, sequential explanatory mixed design, is utilized to examine the relation between reading comprehension skills and TEOG success of 8th grade students thoroughly. In explanatory sequential mixed design…

  12. Regeneration of glass nanofluidic chips through a multiple-step sequential thermochemical decomposition process at high temperatures.

    PubMed

    Xu, Yan; Wu, Qian; Shimatani, Yuji; Yamaguchi, Koji

    2015-10-07

    Due to the lack of regeneration methods, the reusability of nanofluidic chips is a significant technical challenge impeding the efficient and economic promotion of both fundamental research and practical applications on nanofluidics. Herein, a simple method for the total regeneration of glass nanofluidic chips was described. The method consists of sequential thermal treatment with six well-designed steps, which correspond to four sequential thermal and thermochemical decomposition processes, namely, dehydration, high-temperature redox chemical reaction, high-temperature gasification, and cooling. The method enabled the total regeneration of typical 'dead' glass nanofluidic chips by eliminating physically clogged nanoparticles in the nanochannels, removing chemically reacted organic matter on the glass surface and regenerating permanent functional surfaces of dissimilar materials localized in the nanochannels. The method provides a technical solution to significantly improve the reusability of glass nanofluidic chips and will be useful for the promotion and acceleration of research and applications on nanofluidics.

  13. Sequential Pattern Analysis: Method and Application in Exploring How Students Develop Concept Maps

    ERIC Educational Resources Information Center

    Chiu, Chiung-Hui; Lin, Chien-Liang

    2012-01-01

    Concept mapping is a technique that represents knowledge in graphs. It has been widely adopted in science education and cognitive psychology to aid learning and assessment. To realize the sequential manner in which students develop concept maps, most research relies upon human-dependent, qualitative approaches. This article proposes a method for…

  14. An Overview of Markov Chain Methods for the Study of Stage-Sequential Developmental Processes

    ERIC Educational Resources Information Center

    Kaplan, David

    2008-01-01

    This article presents an overview of quantitative methodologies for the study of stage-sequential development based on extensions of Markov chain modeling. Four methods are presented that exemplify the flexibility of this approach: the manifest Markov model, the latent Markov model, latent transition analysis, and the mixture latent Markov model.…
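
    As a concrete anchor for the simplest of these models, the sketch below estimates the transition matrix of a manifest Markov model from hypothetical stage sequences; the latent Markov, latent transition, and mixture latent Markov variants discussed in the article add a measurement model and/or mixture structure on top of this core.

```python
import numpy as np

def manifest_markov(sequences, n_stages):
    """Maximum-likelihood transition matrix of a manifest (observed) Markov model.

    sequences: list of stage sequences, each a list of integer stage labels
    0..n_stages-1 observed at consecutive measurement occasions. Row-normalised
    counts give P[i, j] = Pr(stage j at t+1 | stage i at t)."""
    counts = np.zeros((n_stages, n_stages))
    for seq in sequences:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# Hypothetical data: 3 developmental stages, 5 children measured at 4 occasions.
data = [[0, 0, 1, 2], [0, 1, 1, 2], [0, 1, 2, 2], [1, 1, 2, 2], [0, 0, 1, 1]]
print(manifest_markov(data, 3).round(2))
```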

  15. Optimum target sizes for a sequential sawing process

    Treesearch

    H. Dean Claxton

    1972-01-01

    A method for solving a class of problems in random sequential processes is presented. Sawing cedar pencil blocks is used to illustrate the method. Equations are developed for the function representing loss from improper sizing of blocks. A weighted over-all distribution for sawing and drying operations is developed and graphed. Loss minimizing changes in the control...

  16. Estimation of parameters and basic reproduction ratio for Japanese encephalitis transmission in the Philippines using sequential Monte Carlo filter

    USDA-ARS?s Scientific Manuscript database

    We developed a sequential Monte Carlo filter to estimate the states and the parameters in a stochastic model of Japanese Encephalitis (JE) spread in the Philippines. This method is particularly important for its adaptability to the availability of new incidence data. This method can also capture the...
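
    A minimal bootstrap particle filter (state estimation only) is sketched below on a toy linear-Gaussian model; the JE transmission model, its parameters, and any joint state-parameter estimation scheme used by the authors are not reproduced here.

```python
import numpy as np

def bootstrap_pf(y, n_particles, propagate, loglik, init, rng):
    """Minimal bootstrap sequential Monte Carlo filter (state estimation only).

    propagate(x, rng) samples x_t | x_{t-1} for a vector of particles,
    loglik(y_t, x) returns log p(y_t | x_t) per particle, init(rng, n) draws
    the initial particles. Returns the filtered posterior means."""
    x = init(rng, n_particles)
    means = []
    for yt in y:
        x = propagate(x, rng)
        logw = loglik(yt, x)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.sum(w * x)))
        x = x[rng.choice(n_particles, size=n_particles, p=w)]   # multinomial resampling
    return np.array(means)

# Toy linear-Gaussian model: x_t = 0.9 x_{t-1} + N(0, 0.5^2), y_t = x_t + N(0, 1).
rng = np.random.default_rng(7)
true_x = np.zeros(50)
for t in range(1, 50):
    true_x[t] = 0.9 * true_x[t - 1] + rng.normal(0, 0.5)
obs = true_x + rng.normal(0, 1.0, 50)
est = bootstrap_pf(obs, 500,
                   propagate=lambda x, r: 0.9 * x + r.normal(0, 0.5, x.size),
                   loglik=lambda yt, x: -0.5 * (yt - x) ** 2,
                   init=lambda r, n: r.normal(0, 1, n),
                   rng=rng)
print(np.round(est[:5], 2))
```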

  17. A Mixed-Methods Sequential Explanatory Study: Elementary Principals' Perceptions of the Impact of the Massachusetts Comprehensive Assessment System Testing Culture on the Six Conditions for Effective Learning in Schools

    ERIC Educational Resources Information Center

    Fraine, Patrick David

    2012-01-01

    There has been much debate regarding the impact of state-mandated assessment in schools. Most of the literature on this topic has been gathered from studies focused on teachers' perceptions (Hungerford, 2004). The effects, typically perceived to be negative, indicate reduced quality of teaching and learning in schools. The purpose of this study…

  18. Optimization of the gypsum-based materials by the sequential simplex method

    NASA Astrophysics Data System (ADS)

    Doleželová, Magdalena; Vimmrová, Alena

    2017-11-01

    The application of the sequential simplex optimization method to the design of gypsum-based materials is described. The principles of the simplex method are explained, and several examples of the method's use for the optimization of lightweight gypsum and ternary gypsum-based materials are given. By this method, lightweight gypsum-based materials with the desired properties and a ternary gypsum-based material with higher strength (16 MPa) were successfully developed. The simplex method is a useful tool for optimizing gypsum-based materials, but the objective of the optimization has to be formulated appropriately.
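
    As an illustration of the underlying idea, the sketch below runs a simplex (Nelder-Mead) search on a hypothetical response surface for compressive strength as a function of two mix fractions; in the experimental workflow described above, each candidate point would correspond to a prepared and tested specimen rather than a formula.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical response surface: predicted compressive strength (MPa) of a
# ternary gypsum-based mix as a function of two additive fractions.
def negative_strength(x):
    a, b = x
    strength = 16.0 - 40.0 * (a - 0.25) ** 2 - 60.0 * (b - 0.10) ** 2
    return -strength  # minimise the negative to maximise strength

result = minimize(negative_strength, x0=[0.05, 0.05], method="Nelder-Mead",
                  options={"xatol": 1e-3, "fatol": 1e-3})
print(result.x.round(3), round(-result.fun, 2))  # optimum fractions and strength
```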

  19. Enhanced Biocide Treatments with D-amino Acid Mixtures against a Biofilm Consortium from a Water Cooling Tower.

    PubMed

    Jia, Ru; Li, Yingchao; Al-Mahamedh, Hussain H; Gu, Tingyue

    2017-01-01

    Different species of microbes form mixed-culture biofilms in cooling water systems. They cause microbiologically influenced corrosion (MIC) and biofouling, leading to increased operational and maintenance costs. In this work, two D-amino acid mixtures were found to enhance two non-oxidizing biocides [tetrakis hydroxymethyl phosphonium sulfate (THPS) and NALCO 7330 (isothiazoline derivatives)] and one oxidizing biocide [bleach (NaClO)] against a biofilm consortium from a water cooling tower in lab tests. Fifty ppm (w/w) of an equimass mixture of D-methionine, D-leucine, D-tyrosine, D-tryptophan, D-serine, D-threonine, D-phenylalanine, and D-valine (D8) enhanced 15 ppm THPS and 15 ppm NALCO 7330 with similar efficacies achieved by the 30 ppm THPS alone treatment and the 30 ppm NALCO 7330 alone treatment, respectively in the single-batch 3-h biofilm removal test. A sequential treatment method was used to enhance bleach because D-amino acids react with bleach. After a 4-h biofilm removal test, the sequential treatment of 5 ppm bleach followed by 50 ppm D8 achieved extra 1-log reduction in sessile cell counts of acid producing bacteria, sulfate reducing bacteria, and general heterotrophic bacteria compared with the 5 ppm bleach alone treatment. The 10 ppm bleach alone treatment showed a similar efficacy with the sequential treatment of 5 ppm bleach followed by 50 ppm D8. The efficacy of D8 was found better than that of D4 (an equimass mixture of D-methionine, D-leucine, D-tyrosine, and D-tryptophan) in the enhancement of the three individual biocides against the biofilm consortium.

  20. Enhanced Biocide Treatments with D-amino Acid Mixtures against a Biofilm Consortium from a Water Cooling Tower

    PubMed Central

    Jia, Ru; Li, Yingchao; Al-Mahamedh, Hussain H.; Gu, Tingyue

    2017-01-01

    Different species of microbes form mixed-culture biofilms in cooling water systems. They cause microbiologically influenced corrosion (MIC) and biofouling, leading to increased operational and maintenance costs. In this work, two D-amino acid mixtures were found to enhance two non-oxidizing biocides [tetrakis hydroxymethyl phosphonium sulfate (THPS) and NALCO 7330 (isothiazoline derivatives)] and one oxidizing biocide [bleach (NaClO)] against a biofilm consortium from a water cooling tower in lab tests. Fifty ppm (w/w) of an equimass mixture of D-methionine, D-leucine, D-tyrosine, D-tryptophan, D-serine, D-threonine, D-phenylalanine, and D-valine (D8) enhanced 15 ppm THPS and 15 ppm NALCO 7330 with similar efficacies achieved by the 30 ppm THPS alone treatment and the 30 ppm NALCO 7330 alone treatment, respectively in the single-batch 3-h biofilm removal test. A sequential treatment method was used to enhance bleach because D-amino acids react with bleach. After a 4-h biofilm removal test, the sequential treatment of 5 ppm bleach followed by 50 ppm D8 achieved extra 1-log reduction in sessile cell counts of acid producing bacteria, sulfate reducing bacteria, and general heterotrophic bacteria compared with the 5 ppm bleach alone treatment. The 10 ppm bleach alone treatment showed a similar efficacy with the sequential treatment of 5 ppm bleach followed by 50 ppm D8. The efficacy of D8 was found better than that of D4 (an equimass mixture of D-methionine, D-leucine, D-tyrosine, and D-tryptophan) in the enhancement of the three individual biocides against the biofilm consortium. PMID:28861053

  1. Fault detection on a sewer network by a combination of a Kalman filter and a binary sequential probability ratio test

    NASA Astrophysics Data System (ADS)

    Piatyszek, E.; Voignier, P.; Graillot, D.

    2000-05-01

    One of the aims of sewer networks is the protection of the population against floods and the reduction of pollution rejected into the receiving water during rainy events. To meet these goals, managers have to equip the sewer networks with sensors and set up real-time control systems. Unfortunately, a component fault (leading to intolerable behaviour of the system) or a sensor fault (deteriorating the process view and disturbing the local automatism) makes sewer network supervision delicate. In order to ensure adequate flow management during rainy events, it is essential to set up procedures capable of detecting and diagnosing these anomalies. This article introduces a real-time fault detection method, applicable to sewer networks, for the follow-up of rainy events. This method consists of comparing the sensor response with a forecast of this response. This forecast is provided by a model, and more precisely by a state estimator: a Kalman filter. This Kalman filter provides not only a flow estimate but also an entity called the 'innovation'. In order to detect abnormal operations within the network, this innovation is analysed with the binary sequential probability ratio test of Wald. Moreover, by combining available information from several nodes of the network, a diagnosis of the detected anomalies is carried out. This method provided encouraging results during the analysis of several rain events on the sewer network of Seine-Saint-Denis County, France.
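
    The sketch below shows the general pattern under simplifying assumptions (scalar state, known noise variances, a fixed postulated innovation shift): a Kalman filter produces standardized innovations, and a restarted Wald SPRT on those innovations flags a fault when the upper threshold is crossed. It is not the specific sewer-network model or test configuration used in the study.

```python
import numpy as np

def kalman_innovation_sprt(y, F, H, Q, R, x0, P0, shift=3.0, alpha=0.01, beta=0.01):
    """Scalar Kalman filter whose standardized innovations feed a restarted
    Wald SPRT (H0: innovation ~ N(0, 1) vs H1: innovation ~ N(shift, 1)).

    Returns the index at which a fault (H1) is declared, or None."""
    upper, lower = np.log((1 - beta) / alpha), np.log(beta / (1 - alpha))
    x, P, s = x0, P0, 0.0
    for t, yt in enumerate(y):
        x, P = F * x, F * P * F + Q                    # predict
        innov, S = yt - H * x, H * P * H + R           # innovation and its variance
        z = innov / np.sqrt(S)
        K = P * H / S                                  # Kalman gain and update
        x, P = x + K * innov, (1 - K * H) * P
        s += shift * z - 0.5 * shift ** 2              # log-LR increment for z
        if s >= upper:
            return t                                   # fault declared
        if s <= lower:
            s = 0.0                                    # accept "no fault", restart test
    return None

# Hypothetical flow measurements with a +4 sensor bias appearing at t = 40.
rng = np.random.default_rng(5)
clean = rng.normal(0.0, 1.0, 60)
faulty = np.concatenate([clean[:40], clean[40:] + 4.0])
print(kalman_innovation_sprt(faulty, F=1.0, H=1.0, Q=0.01, R=1.0, x0=0.0, P0=1.0))
```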

  2. System For Surveillance Of Spectral Signals

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Criss-Puszkiewicz, Cynthia; Wilks, Alan D.

    2004-10-12

    A method and system for monitoring at least one of a system, a process and a data source. A method and system have been developed for carrying out surveillance, testing and modification of an ongoing process or other source of data, such as a spectroscopic examination. A signal from the system under surveillance is collected and compared with a reference signal, a frequency domain transformation carried out for the system signal and reference signal, a frequency domain difference function established. The process is then repeated until a full range of data is accumulated over the time domain and a Sequential Probability Ratio Test ("SPRT") methodology applied to determine a three-dimensional surface plot characteristic of the operating state of the system under surveillance.

  3. System For Surveillance Of Spectral Signals

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan; Criss-Puszkiewicz, Cynthia; Wilks, Alan D.

    2003-04-22

    A method and system for monitoring at least one of a system, a process and a data source. A method and system have been developed for carrying out surveillance, testing and modification of an ongoing process or other source of data, such as a spectroscopic examination. A signal from the system under surveillance is collected and compared with a reference signal, a frequency domain transformation carried out for the system signal and reference signal, a frequency domain difference function established. The process is then repeated until a full range of data is accumulated over the time domain and a Sequential Probability Ratio Test methodology applied to determine a three-dimensional surface plot characteristic of the operating state of the system under surveillance.

  4. System for surveillance of spectral signals

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Criss-Puszkiewicz, Cynthia; Wilks, Alan D.

    2006-02-14

    A method and system for monitoring at least one of a system, a process and a data source. A method and system have been developed for carrying out surveillance, testing and modification of an ongoing process or other source of data, such as a spectroscopic examination. A signal from the system under surveillance is collected and compared with a reference signal, a frequency domain transformation carried out for the system signal and reference signal, a frequency domain difference function established. The process is then repeated until a full range of data is accumulated over the time domain and a Sequential Probability Ratio Test ("SPRT") methodology applied to determine a three-dimensional surface plot characteristic of the operating state of the system under surveillance.

  5. System for surveillance of spectral signals

    DOEpatents

    Gross, Kenneth C.; Wegerich, Stephan W.; Criss-Puszkiewicz, Cynthia; Wilks, Alan D.

    2001-01-01

    A method and system for monitoring at least one of a system, a process and a data source. A method and system have been developed for carrying out surveillance, testing and modification of an ongoing process or other source of data, such as a spectroscopic examination. A signal from the system under surveillance is collected and compared with a reference signal, a frequency domain transformation carried out for the system signal and reference signal, a frequency domain difference function established. The process is then repeated until a full range of data is accumulated over the time domain and a Sequential Probability Ratio Test ("SPRT") methodology applied to determine a three-dimensional surface plot characteristic of the operating state of the system under surveillance.

  6. SEQUENTIAL EXTRACTIONS FOR PARTITIONING OF ARSENIC ON HYDROUS IRON OXIDES AND IRON SULFIDES

    EPA Science Inventory

    The objective of this study was to use model solids to test solutions designed to extract arsenic from relatively labile solid phase fractions. The use of sequential extractions provides analytical constraints on the identification of mineral phases that control arsenic mobility...

  7. The Development of Man and His Culture: New World Prehistory. Grade Two. Teacher's Guide [And] Teacher Background Material [And] Pupil Text [And] Pupil's Study Guide [And] A Sequential Curriculum in Anthropology. Test: Form 2, Composite Form for Pre- and Post-test. Publications No. 28, 29, 30, 33 and 41.

    ERIC Educational Resources Information Center

    Austin, Carol E.; And Others

    The social studies unit includes a teaching guide, teacher background material, student text, study guide, and composite pretest/posttest. Subject matter focuses on archaeological methods, history of man in America, and the Hopi Indians in the past and present. The unit is part of the Anthropology Curriculum Project and is designed to be used in…

  8. Development, Characterization, and Resultant Properties of a Carbon, Boron, and Chromium Ternary Diffusion System

    NASA Astrophysics Data System (ADS)

    Domec, Brennan S.

    In today's industry, engineering materials are continuously pushed to the limits. Often, the application only demands high-specification properties in a narrowly-defined region of the material, such as the outermost surface. This, in combination with the economic benefits, makes case hardening an attractive solution to meet industry demands. While case hardening has been in use for decades, applications demanding high hardness, deep case depth, and high corrosion resistance are often under-served by this process. Instead, new solutions are required. The goal of this study is to develop and characterize a new borochromizing process applied to a pre-carburized AISI 8620 alloy steel. The process was successfully developed using a combination of computational simulations, calculations, and experimental testing. Process kinetics were studied by fitting case depth measurement data to Fick's Second Law of Diffusion and an Arrhenius equation. Results indicate that the kinetics of the co-diffusion method are unaffected by the addition of chromium to the powder pack. The results also show that significant structural degradation of the case occurs when chromizing is applied sequentially to an existing boronized case. The amount of degradation is proportional to the chromizing parameters. Microstructural evolution was studied using metallographic methods, simulation and computational calculations, and analytical techniques. While the co-diffusion process failed to enrich the substrate with chromium, significant enrichment is obtained with the sequential diffusion process. The amount of enrichment is directly proportional to the chromizing parameters with higher parameters resulting in more enrichment. The case consists of M7C3 and M23C6 carbides nearest the surface, minor amounts of CrB, and a balance of M2B. Corrosion resistance was measured with salt spray and electrochemical methods. These methods confirm the benefit of surface enrichment by chromium in the sequential diffusion method with corrosion resistance increasing directly with chromium concentration. The results also confirm the deleterious effect of surface-breaking case defects and the need to reduce or eliminate them. The best combination of microstructural integrity, mean surface hardness, effective case depth, and corrosion resistance is obtained in samples sequentially boronized and chromized at 870°C for 6hrs. Additional work is required to further optimize process parameters and case properties.

  9. Tracking Time Evolution of Collective Attention Clusters in Twitter: Time Evolving Nonnegative Matrix Factorisation.

    PubMed

    Saito, Shota; Hirata, Yoshito; Sasahara, Kazutoshi; Suzuki, Hideyuki

    2015-01-01

    Micro-blogging services, such as Twitter, offer opportunities to analyse user behaviour. Discovering and distinguishing behavioural patterns in micro-blogging services is valuable. However, it is difficult and challenging to distinguish users, and to track the temporal development of collective attention within distinct user groups in Twitter. In this paper, we formulate this problem as tracking matrices decomposed by Nonnegative Matrix Factorisation for time-sequential matrix data, and propose a novel extension of Nonnegative Matrix Factorisation, which we refer to as Time Evolving Nonnegative Matrix Factorisation (TENMF). In our method, we describe users and words posted in some time interval by a matrix, and use several matrices as time-sequential data. Subsequently, we apply Time Evolving Nonnegative Matrix Factorisation to these time-sequential matrices. TENMF can decompose time-sequential matrices, and can track the connection among decomposed matrices, whereas previous NMF decomposes a matrix into two lower dimension matrices arbitrarily, which might lose the time-sequential connection. Our proposed method has an adequately good performance on artificial data. Moreover, we present several results and insights from experiments using real data from Twitter.
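
    To make the tracking idea concrete, the sketch below factorises a sequence of hypothetical user-by-word matrices with standard multiplicative-update NMF, warm-starting each window from the previous factors so the components stay aligned over time; the actual TENMF update rules are not reproduced here.

```python
import numpy as np

def nmf_multiplicative(V, W, H, n_iter=200, eps=1e-9):
    """Standard multiplicative-update NMF (Frobenius loss) from given initial factors."""
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def track_factors(matrices, rank, rng):
    """Factorise a time-ordered sequence of matrices, warm-starting each window
    from the previous factors so components stay aligned across windows."""
    n, m = matrices[0].shape
    W = rng.random((n, rank)) + 0.1
    H = rng.random((rank, m)) + 0.1
    history = []
    for V in matrices:
        W, H = nmf_multiplicative(V, W.copy(), H.copy())
        history.append((W, H))
    return history

# Hypothetical user-by-word count matrices for four consecutive time windows.
rng = np.random.default_rng(0)
frames = [rng.random((30, 50)) for _ in range(4)]
hist = track_factors(frames, rank=3, rng=rng)
print([round(float(np.linalg.norm(frames[t] - hist[t][0] @ hist[t][1])), 2)
       for t in range(4)])
```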

  10. Enhancing the performance of tungsten doped InZnO thin film transistors via sequential ambient annealing

    NASA Astrophysics Data System (ADS)

    Park, Hyun-Woo; Song, Aeran; Kwon, Sera; Choi, Dukhyun; Kim, Younghak; Jun, Byung-Hyuk; Kim, Han-Ki; Chung, Kwun-Bum

    2018-03-01

    This study suggests a sequential ambient annealing process as an excellent post-treatment method to enhance the device performance and stability of W (tungsten) doped InZnO thin film transistors (WIZO-TFTs). Sequential ambient annealing at 250 °C significantly enhanced the device performance and stability of WIZO-TFTs, compared with other post-treatment methods, such as air ambient annealing and vacuum ambient annealing at 250 °C. To understand the enhanced device performance and stability of WIZO-TFT with sequential ambient annealing, we investigate the correlations between device performance and stability and electronic structures, such as band alignment, a feature of the conduction band, and band edge states below the conduction band. The enhanced performance of WIZO-TFTs with sequential ambient annealing is related to the modification of the electronic structure. In addition, the dominant mechanism responsible for the enhanced device performance and stability of WIZO-TFTs is considered to be a change in the shallow-level and deep-level band edge states below the conduction band.

  11. Robust multiperson detection and tracking for mobile service and social robots.

    PubMed

    Li, Liyuan; Yan, Shuicheng; Yu, Xinguo; Tan, Yeow Kee; Li, Haizhou

    2012-10-01

    This paper proposes an efficient system which integrates multiple vision models for robust multiperson detection and tracking for mobile service and social robots in public environments. The core technique is a novel maximum likelihood (ML)-based algorithm which combines the multimodel detections in mean-shift tracking. First, a likelihood probability which integrates detections and similarity to local appearance is defined. Then, an expectation-maximization (EM)-like mean-shift algorithm is derived under the ML framework. In each iteration, the E-step estimates the associations to the detections, and the M-step locates the new position according to the ML criterion. To be robust to the complex crowded scenarios for multiperson tracking, an improved sequential strategy to perform the mean-shift tracking is proposed. Under this strategy, human objects are tracked sequentially according to their priority order. To balance the efficiency and robustness for real-time performance, at each stage, the first two objects from the list of the priority order are tested, and the one with the higher score is selected. The proposed method has been successfully implemented on real-world service and social robots. The vision system integrates stereo-based and histograms-of-oriented-gradients-based human detections, occlusion reasoning, and sequential mean-shift tracking. Various examples to show the advantages and robustness of the proposed system for multiperson tracking from mobile robots are presented. Quantitative evaluations on the performance of multiperson tracking are also performed. Experimental results indicate that significant improvements have been achieved by using the proposed method.
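
    For orientation, a plain Gaussian-kernel mean-shift iteration (mode seeking on a point cloud) is sketched below; the paper's tracker layers detection likelihoods, an EM-like association step, and a sequential priority ordering on top of this basic update, none of which are shown here.

```python
import numpy as np

def mean_shift(points, start, bandwidth=1.0, n_iter=50, tol=1e-5):
    """Plain Gaussian-kernel mean-shift iteration (single-object mode seeking)."""
    x = np.asarray(start, dtype=float)
    for _ in range(n_iter):
        w = np.exp(-0.5 * np.sum((points - x) ** 2, axis=1) / bandwidth ** 2)
        new_x = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(new_x - x) < tol:
            break
        x = new_x
    return x

# Hypothetical 2-D point cloud with two clusters standing in for image evidence.
rng = np.random.default_rng(2)
cloud = np.vstack([rng.normal([0, 0], 0.5, (200, 2)),
                   rng.normal([5, 5], 0.5, (100, 2))])
print(mean_shift(cloud, start=[4.0, 4.5]).round(2))   # converges near the (5, 5) mode
```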

  12. Lifelong Transfer Learning for Heterogeneous Teams of Agents in Sequential Decision Processes

    DTIC Science & Technology

    2016-06-01

    ...sequential decision-making (SDM) tasks in dynamic environments with simulated and physical robots. Subject terms: sequential decision making, lifelong learning, transfer... sequential decision-making (SDM) tasks in dynamic environments with both simple benchmark tasks and more complex aerial and ground robot tasks. Our work... and ground robots in the presence of disturbances: we applied our methods to the problem of learning controllers for robots with novel disturbances in...

  13. Development of a standardized sequential extraction protocol for simultaneous extraction of multiple actinide elements

    DOE PAGES

    Faye, Sherry A.; Richards, Jason M.; Gallardo, Athena M.; ...

    2017-02-07

    Sequential extraction is a useful technique for assessing the potential to leach actinides from soils; however, current literature lacks uniformity in experimental details, making direct comparison of results impossible. This work continued development toward a standardized five-step sequential extraction protocol by analyzing extraction behaviors of 232Th, 238U, 239,240Pu and 241Am from lake and ocean sediment reference materials. Results produced a standardized procedure after creating more defined reaction conditions to improve method repeatability. A NaOH fusion procedure is recommended following sequential leaching for the complete dissolution of insoluble species.

  14. Sequential Aldol Condensation – Transition Metal-Catalyzed Addition Reactions of Aldehydes, Methyl Ketones and Arylboronic Acids

    PubMed Central

    Liao, Yuan-Xi; Xing, Chun-Hui; Israel, Matthew; Hu, Qiao-Sheng

    2011-01-01

    Sequential aldol condensation of aldehydes with methyl ketones followed by transition metal-catalyzed addition reactions of arylboronic acids to form β-substituted ketones is described. By using the 1,1′-spirobiindane-7,7′-diol (SPINOL)-based phosphite, an asymmetric version of this type of sequential reaction, with up to 92% ee, was also realized. Our study provided an efficient method to access β-substituted ketones and might lead to the development of other sequential/tandem reactions with transition metal-catalyzed addition reactions as the key step. PMID:21417359

  15. Sequential aldol condensation-transition metal-catalyzed addition reactions of aldehydes, methyl ketones, and arylboronic acids.

    PubMed

    Liao, Yuan-Xi; Xing, Chun-Hui; Israel, Matthew; Hu, Qiao-Sheng

    2011-04-15

    Sequential aldol condensation of aldehydes with methyl ketones followed by transition metal-catalyzed addition reactions of arylboronic acids to form β-substituted ketones is described. By using the 1,1'-spirobiindane-7,7'-diol (SPINOL)-based phosphite, an asymmetric version of this type of sequential reaction, with up to 92% ee, was also realized. Our study provided an efficient method to access β-substituted ketones and might lead to the development of other sequential/tandem reactions with transition metal-catalyzed addition reactions as the key step. © 2011 American Chemical Society

  16. Design and evaluation of a hybrid storage system in HEP environment

    NASA Astrophysics Data System (ADS)

    Xu, Qi; Cheng, Yaodong; Chen, Gang

    2017-10-01

    Nowadays, High Energy Physics (HEP) experiments produce large amounts of data. These data are stored in mass storage systems that need to balance cost, performance, and manageability. In this paper, a hybrid storage system comprising SSDs (solid-state drives) and HDDs (hard disk drives) is designed to accelerate data analysis while maintaining a low cost. File access performance is a decisive factor for the HEP computing system. A new deployment model of the hybrid storage system in HEP is proposed and shown to have higher I/O performance. Detailed evaluation methods and evaluations of the SSD/HDD ratio and the logical block size are also given. In all evaluations, sequential-read, sequential-write, random-read, and random-write workloads are tested to obtain comprehensive results. The results show that the hybrid storage system performs well in areas such as accessing large files in HEP.

  17. Validity of the mockwitness paradigm: testing the assumptions.

    PubMed

    McQuiston, Dawn E; Malpass, Roy S

    2002-08-01

    Mockwitness identifications are used to provide a quantitative measure of lineup fairness. Some theoretical and practical assumptions of this paradigm have not been studied in terms of mockwitnesses' decision processes and procedural variation (e.g., instructions, lineup presentation method), and the current experiment was conducted to empirically evaluate these assumptions. Four hundred and eighty mockwitnesses were given physical information about a culprit, received 1 of 4 variations of lineup instructions, and were asked to identify the culprit from either a fair or unfair sequential lineup containing 1 of 2 targets. Lineup bias estimates varied as a result of lineup fairness and the target presented. Mockwitnesses generally reported that the target's physical description was their main source of identifying information. Our findings support the use of mockwitness identifications as a useful technique for sequential lineup evaluation, but only for mockwitnesses who selected only 1 lineup member. Recommendations for the use of this evaluation procedure are discussed.

  18. Determination of ambroxol hydrochloride, methylparaben and benzoic acid in pharmaceutical preparations based on sequential injection technique coupled with monolithic column.

    PubMed

    Satínský, Dalibor; Huclová, Jitka; Ferreira, Raquel L C; Montenegro, Maria Conceição B S M; Solich, Petr

    2006-02-13

    Porous monolithic columns show high performance at relatively low pressure. Coupling short monoliths with the sequential injection analysis (SIA) technique offers a new way to add a separation step to an otherwise non-separating, low-pressure method. In this contribution, a new separation method for the simultaneous determination of ambroxol, methylparaben and benzoic acid was developed, based on a novel reversed-phase sequential injection chromatography (SIC) technique with UV detection. A Chromolith SpeedROD RP-18e 50 x 4.6 mm column with a 10 mm precolumn and a FIAlab 3000 system with a six-port selection valve and a 5 ml syringe were used for the sequential injection chromatographic separations in our study. The mobile phase was acetonitrile-tetrahydrofuran-0.05 M acetic acid (10:10:90, v/v/v), pH 3.75 adjusted with triethylamine, at a flow rate of 0.48 ml min(-1); UV detection was at 245 nm. The analysis time was <11 min. The new SIC method was validated and compared with HPLC. The method was found to be useful for the routine analysis of the active compound ambroxol and the preservatives (methylparaben or benzoic acid) in various pharmaceutical syrups and drops.

  19. An adaptive two-stage sequential design for sampling rare and clustered populations

    USGS Publications Warehouse

    Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.

    2008-01-01

    How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.
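
    As a rough illustration of the allocation idea (not the authors' exact design), the toy sketch below draws a small first-stage sample in each primary unit and then adaptively adds second-stage effort only to units whose first-stage counts exceed a trigger value; the unit layout, trigger and sample sizes are invented.

      # Toy sketch of adaptive two-stage sequential sampling: extra second-stage
      # effort goes only to primary units whose first-stage total exceeds a trigger.
      # All parameter values and the population below are illustrative.
      import random

      def adaptive_two_stage(primary_units, n1=3, trigger=1, extra=4):
          """primary_units: dict mapping unit id -> list of secondary-unit counts."""
          sample = {}
          for u, counts in primary_units.items():
              idx = list(range(len(counts)))
              stage1 = random.sample(idx, min(n1, len(idx)))
              chosen = list(stage1)
              # Adaptive allocation: spend extra effort only where the first
              # stage suggests the unit is high-abundance.
              if sum(counts[i] for i in stage1) > trigger:
                  remaining = [i for i in idx if i not in stage1]
                  chosen += random.sample(remaining, min(extra, len(remaining)))
              sample[u] = chosen
          return sample

      if __name__ == "__main__":
          random.seed(1)
          units = {u: [0] * 20 for u in range(8)}                 # mostly empty units
          units[2] = [0, 3, 5, 0, 4, 2, 0, 0, 6, 1] + [0] * 10    # one clustered unit
          for u, chosen in adaptive_two_stage(units).items():
              print(u, "sample size:", len(chosen),
                    "observed total:", sum(units[u][i] for i in chosen))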

  20. Individuation of Pairs of Objects in Infancy

    ERIC Educational Resources Information Center

    Leslie, Alan M.; Chen, Marian L.

    2007-01-01

    Looking-time studies examined whether 11-month-old infants can individuate two pairs of objects using only shape information. In order to test individuation, the object pairs were presented sequentially. Infants were familiarized either with the sequential pairs, disk-triangle/disk-triangle (XY/XY), whose shapes differed within but not across…

  1. Accounting for Test Variability through Sizing Local Domains in Sequential Design Optimization with Concurrent Calibration-Based Model Validation

    DTIC Science & Technology

    2013-08-01

    in Sequential Design Optimization with Concurrent Calibration-Based Model Validation Dorin Drignei 1 Mathematics and Statistics Department...Validation 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Dorin Drignei; Zissimos Mourelatos; Vijitashwa Pandey

  2. Sequential Objective Structured Clinical Examination based on item response theory in Iran.

    PubMed

    Hejri, Sara Mortaz; Jalili, Mohammad

    2017-01-01

    In a sequential objective structured clinical examination (OSCE), all students initially take a short screening OSCE. Examinees who pass are excused from further testing, but an additional OSCE is administered to the remaining examinees. Previous investigations of sequential OSCE were based on classical test theory. We aimed to design and evaluate screening OSCEs based on item response theory (IRT). We carried out a retrospective observational study. At each station of a 10-station OSCE, the students' performance was graded on a Likert-type scale. Since the data were polytomous, the difficulty parameters, discrimination parameters, and students' ability were calculated using a graded response model. To design several screening OSCEs, we identified the 5 most difficult stations and the 5 most discriminative ones. For each test, 5, 4, or 3 stations were selected. Normal and stringent cut-scores were defined for each test. We compared the results of each of the 12 screening OSCEs to the main OSCE and calculated the positive and negative predictive values (PPV and NPV), as well as the exam cost. A total of 253 students (95.1%) passed the main OSCE, while 72.6% to 94.4% of examinees passed the screening tests. The PPV values ranged from 0.98 to 1.00, and the NPV values ranged from 0.18 to 0.59. Two tests effectively predicted the results of the main exam, resulting in financial savings of 34% to 40%. If stations with the highest IRT-based discrimination values and stringent cut-scores are utilized in the screening test, sequential OSCE can be an efficient and convenient way to conduct an OSCE.
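
    The screening logic can be checked with a few lines of code: compare pass/fail decisions from a short screening OSCE against the full exam and compute the positive and negative predictive values. This is a generic sketch with made-up pass/fail vectors, not the study data.

      # Minimal sketch of how a screening OSCE's predictive values can be checked
      # against the full exam; the pass/fail vectors below are invented.
      def predictive_values(screen_pass, main_pass):
          tp = sum(s and m for s, m in zip(screen_pass, main_pass))        # pass both
          fp = sum(s and not m for s, m in zip(screen_pass, main_pass))    # pass screen, fail main
          tn = sum((not s) and (not m) for s, m in zip(screen_pass, main_pass))
          fn = sum((not s) and m for s, m in zip(screen_pass, main_pass))
          ppv = tp / (tp + fp) if tp + fp else float("nan")
          npv = tn / (tn + fn) if tn + fn else float("nan")
          return ppv, npv

      if __name__ == "__main__":
          screen = [True] * 90 + [False] * 10                  # hypothetical screening outcomes
          main = [True] * 88 + [False] * 2 + [False] * 6 + [True] * 4
          print("PPV = %.2f, NPV = %.2f" % predictive_values(screen, main))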

  3. Substructure hybrid testing of reinforced concrete shear wall structure using a domain overlapping technique

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Pan, Peng; Gong, Runhua; Wang, Tao; Xue, Weichen

    2017-10-01

    An online hybrid test was carried out on a 40-story, 120-m high concrete shear wall structure. The structure was divided into two substructures: a physical model of the bottom three stories was tested in the laboratory, and the upper 37 stories were simulated numerically using ABAQUS. An overlapping domain method was employed for the bottom three stories to ensure the validity of the boundary conditions of the superstructure. Mixed control was adopted in the test: displacement control was used to apply the horizontal displacement, while two force-controlled actuators were used to simulate the overturning moment, which is very large and cannot be ignored in substructure hybrid tests of high-rise buildings. A series of tests with earthquake excitations of sequentially increasing intensity was carried out. The test results indicate that the proposed hybrid test method can reproduce the seismic response of high-rise concrete shear wall buildings, and that the seismic performance of the tested precast high-rise building satisfies the requirements of the Chinese seismic design code.

  4. An investigation of several numerical procedures for time-asymptotic compressible Navier-Stokes solutions

    NASA Technical Reports Server (NTRS)

    Rudy, D. H.; Morris, D. J.; Blanchard, D. K.; Cooke, C. H.; Rubin, S. G.

    1975-01-01

    The status of an investigation of four numerical techniques for the time-dependent compressible Navier-Stokes equations is presented. Results for free shear layer calculations in the Reynolds number range from 1000 to 81000 indicate that a sequential alternating-direction implicit (ADI) finite-difference procedure requires longer computing times to reach steady state than a low-storage hopscotch finite-difference procedure. A finite-element method with cubic approximating functions was found to require excessive computer storage and computation times. A fourth method, an alternating-direction cubic spline technique which is still being tested, is also described.

  5. Attenuation-difference radar tomography: results of a multiple-plane experiment at the U.S. Geological Survey Fractured-Rock Research Site, Mirror Lake, New Hampshire

    USGS Publications Warehouse

    Lane, J.W.; Day-Lewis, F. D.; Harris, J.M.; Haeni, F.P.; Gorelick, S.M.

    2000-01-01

    Attenuation-difference borehole-radar tomography was used to monitor a series of sodium chloride tracer injection tests conducted within the FSE wellfield at the U.S. Geological Survey Fractured-Rock Hydrology Research Site in Grafton County, New Hampshire, USA. Borehole-radar tomography surveys were conducted using the sequential-scanning and injection method in three boreholes that form a triangular prism of adjoining tomographic image planes. Results indicate that time-lapse tomography methods provide high-resolution images of tracer distribution in permeable zones.

  6. A 2-step penalized regression method for family-based next-generation sequencing association studies.

    PubMed

    Ding, Xiuhua; Su, Shaoyong; Nandakumar, Kannabiran; Wang, Xiaoling; Fardo, David W

    2014-01-01

    Large-scale genetic studies are often composed of related participants, and utilizing familial relationships can be cumbersome and computationally challenging. We present an approach to efficiently handle sequencing data from complex pedigrees that incorporates information from rare variants as well as common variants. Our method employs a 2-step procedure that sequentially regresses out correlation from familial relatedness and then uses the resulting phenotypic residuals in a penalized regression framework to test for associations with variants within genetic units. The operating characteristics of this approach are detailed using simulation data based on a large, multigenerational cohort.
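
    A minimal sketch of the two-step idea follows, assuming simulated data: step 1 removes phenotypic correlation due to family structure (here crudely approximated by family-mean centering, standing in for a mixed-model or kinship adjustment), and step 2 fits a penalized (lasso) regression of the residuals on the variants in a genetic unit. This is not the authors' software; all names and parameter values are illustrative.

      # Minimal sketch of a 2-step family-based rare-variant analysis:
      # (1) residualize the phenotype for family structure, (2) lasso on the residuals.
      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(0)
      n_fam, fam_size, n_var = 50, 4, 20
      family = np.repeat(np.arange(n_fam), fam_size)
      geno = rng.binomial(2, 0.05, size=(n_fam * fam_size, n_var)).astype(float)  # rare-ish variants

      # Simulate a phenotype with a shared family effect plus two causal variants.
      fam_effect = rng.normal(0, 1.0, n_fam)[family]
      y = fam_effect + 0.8 * geno[:, 0] + 0.6 * geno[:, 3] + rng.normal(0, 1.0, len(family))

      # Step 1: regress out familial relatedness (family-mean centering as a crude stand-in).
      fam_means = np.array([y[family == f].mean() for f in range(n_fam)])
      resid = y - fam_means[family]

      # Step 2: penalized regression of the residuals on the variants within the unit.
      model = Lasso(alpha=0.05).fit(geno, resid)
      print("variants with nonzero estimated effects:", np.flatnonzero(model.coef_))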

  7. "RCL-Pooling Assay": A Simplified Method for the Detection of Replication-Competent Lentiviruses in Vector Batches Using Sequential Pooling.

    PubMed

    Corre, Guillaume; Dessainte, Michel; Marteau, Jean-Brice; Dalle, Bruno; Fenard, David; Galy, Anne

    2016-02-01

    Nonreplicative recombinant HIV-1-derived lentiviral vectors (LV) are increasingly used in gene therapy of various genetic diseases, infectious diseases, and cancer. Before they are used in humans, preparations of LV must undergo extensive quality control testing. In particular, testing of LV must demonstrate the absence of replication-competent lentiviruses (RCL) with suitable methods, on representative fractions of vector batches. Current methods based on cell culture are challenging because high titers of vector batches translate into high volumes of cell culture to be tested in RCL assays. As vector batch sizes and titers are continuously increasing because of improvements in production and purification methods, it became necessary for us to modify the current RCL assay based on the detection of p24 in cultures of indicator cells. Here, we propose a practical optimization of this method using a pairwise pooling strategy that enables easier testing of higher vector inoculum volumes. These modifications significantly decrease material handling and operator time, leading to a cost-effective method, while maintaining optimal sensitivity of the RCL testing. This optimized "RCL-pooling assay" improves the feasibility of the quality control of large-scale batches of clinical-grade LV while maintaining the same sensitivity.

  8. Computer-generated holograms by multiple wavefront recording plane method with occlusion culling.

    PubMed

    Symeonidou, Athanasia; Blinder, David; Munteanu, Adrian; Schelkens, Peter

    2015-08-24

    We propose a novel fast method for full-parallax computer-generated holograms with occlusion processing, suitable for volumetric data such as point clouds. A novel light wave propagation strategy relying on the sequential use of the wavefront recording plane method is proposed, which employs look-up tables in order to reduce the computational complexity in the calculation of the fields. Also, a novel technique for occlusion culling with little additional computation cost is introduced. Additionally, the method applies a Gaussian distribution to the individual points in order to improve visual quality. Performance tests show that for a full-parallax high-definition CGH a speedup factor of more than 2,500 compared to the ray-tracing method can be achieved without hardware acceleration.

  9. The effects of the sequential addition of synthesis parameters on the performance of alkali activated fly ash mortar

    NASA Astrophysics Data System (ADS)

    Dassekpo, Jean-Baptiste Mawulé; Zha, Xiaoxiong; Zhan, Jiapeng; Ning, Jiaqian

    Geopolymer is an energy-efficient and sustainable material that is currently used in the construction industry as an alternative to Portland cement. As it is a new material, a specific mix design method is essential, and efforts have been made to develop a mix design procedure focused on achieving better compressive strength and economy. In this paper, the sequential addition of synthesis parameters such as fly ash-sand, alkaline liquids, plasticizer and additional water at well-defined time intervals was investigated. A total of 4 mix procedures were used to study the compressive performance of fly ash-based geopolymer mortar, and the results of each method were analyzed and discussed. Experimental results show that the sequential addition of sodium hydroxide (NaOH), sodium silicate (Na2SiO3) and plasticizer (PL), followed by adding water (WA), considerably increases the compressive strength of the geopolymer-based mortar. These results clearly demonstrate the highly significant influence of the sequential addition of synthesis parameters on the compressive properties of geopolymer materials, and also provide a new mixing method for the preparation of geopolymer paste, mortar and concrete.

  10. Application of silver nanoparticles to the chemiluminescence determination of cefditoren pivoxil using the luminol-ferricyanide system.

    PubMed

    Alarfaj, Nawal A; Aly, Fatma A; El-Tohamy, Maha F

    2015-02-01

    A new, simple, accurate and sensitive sequential injection analysis chemiluminescence (CL) detection method for the determination of cefditoren pivoxil (CTP) has been developed. The developed method was based on the enhancement effect of silver nanoparticles on the CL signal arising from a luminol-potassium ferricyanide reaction in the presence of CTP. The optimum conditions relevant to the effects of luminol, potassium ferricyanide and silver nanoparticle concentrations were investigated. The proposed method showed a linear relationship between relative CL intensity and the investigated drug concentration over the range 0.001-5000 ng/mL (r = 0.9998, n = 12), with a detection limit of 0.5 pg/mL and a quantification limit of 0.001 ng/mL. The relative standard deviation was 1.6%. The proposed method was employed for the determination of CTP in bulk drug, in its pharmaceutical dosage forms, and in biological fluids such as human serum and urine. The interference of some common additives such as glucose, lactose, starch, talc and magnesium stearate was investigated. In addition, the interference of some related cephalosporins was tested. No interference was recorded. The obtained sequential injection analysis-CL results were statistically compared with those from a reported method and did not show any significant differences. Copyright © 2014 John Wiley & Sons, Ltd.

  11. Meta-cognitive online sequential extreme learning machine for imbalanced and concept-drifting data classification.

    PubMed

    Mirza, Bilal; Lin, Zhiping

    2016-08-01

    In this paper, a meta-cognitive online sequential extreme learning machine (MOS-ELM) is proposed for class imbalance and concept drift learning. In MOS-ELM, meta-cognition is used to self-regulate the learning by selecting suitable learning strategies for class imbalance and concept drift problems. MOS-ELM is the first sequential learning method to alleviate the imbalance problem for both binary class and multi-class data streams with concept drift. In MOS-ELM, a new adaptive window approach is proposed for concept drift learning. A single output update equation is also proposed which unifies various application specific OS-ELM methods. The performance of MOS-ELM is evaluated under different conditions and compared with methods each specific to some of the conditions. On most of the datasets in comparison, MOS-ELM outperforms the competing methods. Copyright © 2016 Elsevier Ltd. All rights reserved.
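
    For orientation, the sketch below shows the plain online sequential ELM (OS-ELM) recursive update that methods of this family build on; the meta-cognitive sample selection, adaptive window and class-imbalance handling described above are omitted. Dimensions, data and hyperparameters are illustrative only.

      # Compact sketch of the plain OS-ELM update: random fixed hidden layer,
      # recursive least-squares update of the output weights for each data chunk.
      import numpy as np

      rng = np.random.default_rng(0)

      class OSELM:
          def __init__(self, n_features, n_hidden, n_outputs):
              self.W = rng.normal(size=(n_hidden, n_features))   # fixed random input weights
              self.b = rng.normal(size=n_hidden)                 # fixed random biases
              self.beta = np.zeros((n_hidden, n_outputs))        # output weights (learned)
              self.P = None                                      # running inverse of H'H

          def _hidden(self, X):
              return 1.0 / (1.0 + np.exp(-(X @ self.W.T + self.b)))  # sigmoid activations

          def init_fit(self, X0, T0):
              H0 = self._hidden(X0)
              self.P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(H0.shape[1]))
              self.beta = self.P @ H0.T @ T0

          def partial_fit(self, X, T):
              H = self._hidden(X)
              # Recursive least-squares update of P and beta for the new chunk.
              K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
              self.P = self.P - self.P @ H.T @ K @ H @ self.P
              self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)

          def predict(self, X):
              return self._hidden(X) @ self.beta

      if __name__ == "__main__":
          X = rng.normal(size=(300, 5))
          T = (X[:, :1] + 0.5 * X[:, 1:2] > 0).astype(float)     # toy binary target
          net = OSELM(n_features=5, n_hidden=40, n_outputs=1)
          net.init_fit(X[:100], T[:100])
          for start in range(100, 300, 50):                      # data arrive in chunks
              net.partial_fit(X[start:start + 50], T[start:start + 50])
          acc = ((net.predict(X) > 0.5) == (T > 0.5)).mean()
          print("training accuracy: %.2f" % acc)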

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, L; Han, Y; Jin, M

    Purpose: To develop an iterative reconstruction method for X-ray CT, in which the reconstruction can quickly converge to the desired solution with a much reduced number of projection views. Methods: The reconstruction is formulated as a convex feasibility problem, i.e., the solution is an intersection of three convex sets: 1) data fidelity (DF) set - the L2 norm of the difference of observed projections and those from the reconstructed image is no greater than an error bound; 2) non-negativity of image voxels (NN) set; and 3) piecewise constant (PC) set - the total variation (TV) of the reconstructed image is no greater than an upper bound. The solution can be found by applying projection onto convex sets (POCS) sequentially for these three convex sets. Specifically, the algebraic reconstruction technique and setting negative voxels as zero are used for projection onto the DF and NN sets, respectively, while the projection onto the PC set is achieved by solving a standard Rudin, Osher, and Fatemi (ROF) model. The proposed method is named full sequential POCS (FS-POCS), which is tested using the Shepp-Logan phantom and the Catphan600 phantom and compared with two similar algorithms, TV-POCS and CP-TV. Results: Using the Shepp-Logan phantom, the root mean square error (RMSE) of the reconstructed images as a function of the number of iterations is used as the convergence measure. In general, FS-POCS converges faster than TV-POCS and CP-TV, especially with fewer projection views. FS-POCS can also achieve accurate reconstruction of cone-beam CT of the Catphan600 phantom using only 54 views, comparable to that of FDK using 364 views. Conclusion: We developed an efficient iterative reconstruction for sparse-view CT using full sequential POCS. The simulation and physical phantom data demonstrated the computational efficiency and effectiveness of FS-POCS.
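
    A toy sketch of one sequential-POCS pass of this kind is given below: an ART-style sweep toward the data-fidelity set, clipping for non-negativity, and a crude total-variation reduction standing in for the exact ROF projection onto the piecewise-constant set. The tiny 1-D system, matrices and step sizes are illustrative and are not the FS-POCS implementation.

      # Toy sequential POCS loop: DF set via an ART (Kaczmarz) sweep, NN set via
      # clipping, PC set via a crude gradient step that reduces 1-D total variation.
      import numpy as np

      rng = np.random.default_rng(0)

      def art_step(x, A, b, relax=1.0):
          """One sweep of the algebraic reconstruction technique (Kaczmarz)."""
          for i in range(A.shape[0]):
              a = A[i]
              x = x + relax * (b[i] - a @ x) / (a @ a) * a
          return x

      def tv_shrink(x, weight=0.02, n_iter=10):
          """Crude gradient descent on the 1-D total variation (ROF stand-in)."""
          for _ in range(n_iter):
              d = np.diff(x)
              g = np.zeros_like(x)
              g[:-1] -= np.sign(d)
              g[1:] += np.sign(d)
              x = x - weight * g
          return x

      if __name__ == "__main__":
          x_true = np.zeros(50)
          x_true[15:35] = 1.0                       # piecewise-constant object
          A = rng.normal(size=(20, 50))             # under-determined "projections"
          b = A @ x_true
          x = np.zeros(50)
          for _ in range(30):                       # sequential POCS outer loop
              x = art_step(x, A, b)                 # data fidelity (DF) set
              x = np.clip(x, 0.0, None)             # non-negativity (NN) set
              x = tv_shrink(x)                      # piecewise-constant (PC) set
          print("RMSE vs. ground truth: %.3f" % np.sqrt(np.mean((x - x_true) ** 2)))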

  13. Optimizing Standard Sequential Extraction Protocol With Lake And Ocean Sediments

    EPA Science Inventory

    The environmental mobility/availability behavior of radionuclides in soils and sediments depends on their speciation. Experiments have been carried out to develop a simple but robust radionuclide sequential extraction method for identification of radionuclide partitioning in sed...

  14. Enduring Advantages of Early Cochlear Implantation for Spoken Language Development

    PubMed Central

    Geers, Ann E.; Nicholas, Johanna G.

    2013-01-01

    Purpose To determine whether the precise age of implantation (AOI) remains an important predictor of spoken language outcomes in later childhood for those who received a cochlear implant (CI) between 12–38 months of age. Relative advantages of receiving a bilateral CI after age 4.5, better pre-CI aided hearing, and longer CI experience were also examined. Method Sixty children participated in a prospective longitudinal study of outcomes at 4.5 and 10.5 years of age. Twenty-nine children received a sequential second CI. Test scores were compared to normative samples of hearing age-mates and predictors of outcomes identified. Results Standard scores on language tests at 10.5 years of age remained significantly correlated with age of first cochlear implantation. Scores were not associated with receipt of a second, sequentially-acquired CI. Significantly higher scores were achieved for vocabulary as compared with overall language, a finding not evident when the children were tested at younger ages. Conclusion Age-appropriate spoken language skills continued to be more likely with younger AOI, even after an average of 8.6 years of additional CI use. Receipt of a second implant between ages 4–10 years and longer duration of device use did not provide significant added benefit. PMID:23275406

  15. Sequential search leads to faster, more efficient fragment-based de novo protein structure prediction.

    PubMed

    de Oliveira, Saulo H P; Law, Eleanor C; Shi, Jiye; Deane, Charlotte M

    2018-04-01

    Most current de novo structure prediction methods randomly sample protein conformations and thus require large amounts of computational resource. Here, we consider a sequential sampling strategy, building on ideas from recent experimental work which shows that many proteins fold cotranslationally. We have investigated whether a pseudo-greedy search approach, which begins sequentially from one of the termini, can improve the performance and accuracy of de novo protein structure prediction. We observed that our sequential approach converges when fewer than 20 000 decoys have been produced, fewer than commonly expected. Using our software, SAINT2, we also compared the run time and quality of models produced in a sequential fashion against a standard, non-sequential approach. Sequential prediction produces an individual decoy 1.5-2.5 times faster than non-sequential prediction. When considering the quality of the best model, sequential prediction led to a better model being produced for 31 out of 41 soluble protein validation cases and for 18 out of 24 transmembrane protein cases. Correct models (TM-Score > 0.5) were produced for 29 of these cases by the sequential mode and for only 22 by the non-sequential mode. Our comparison reveals that a sequential search strategy can be used to drastically reduce computational time of de novo protein structure prediction and improve accuracy. Data are available for download from: http://opig.stats.ox.ac.uk/resources. SAINT2 is available for download from: https://github.com/sauloho/SAINT2. saulo.deoliveira@dtc.ox.ac.uk. Supplementary data are available at Bioinformatics online.

  16. Evaluation of Ophthalmic Surgical Instrument Sterility Using Short-Cycle Sterilization for Sequential Same-Day Use.

    PubMed

    Chang, David F; Hurley, Nikki; Mamalis, Nick; Whitman, Jeffrey

    2018-03-27

    The common practice of short-cycle sterilization for ophthalmic surgical instrumentation has come under increased regulatory scrutiny. This study was undertaken to evaluate the efficacy of short-cycle sterilization processing for consecutive same-day cataract procedures. Testing of specific sterilization processing methods by an independent medical device validation testing laboratory. Phaco handpieces from 3 separate manufacturers were tested along with appropriate biologic indicators and controls using 2 common steam sterilizers. A STATIM 2000 sterilizer (SciCan, Canonsburg, PA) with the STATIM metal cassette, and an AMSCO Century V116 pre-vacuum sterilizer (STERIS, Mentor, OH) using a Case Medical SteriTite container (Case Medical, South Hackensack, NJ) rigid container were tested using phaco tips and handpieces from 3 different manufacturers. Biological indicators were inoculated with highly resistant Geobacillus stearothermophilus, and each sterility verification test was performed in triplicate. Both wrapped and contained loads were tested with full dry cycles and a 7-day storage time to simulate prolonged storage. In adherence with the manufacturers' instructions for use (IFU), short cycles (3.0-3.5-minute exposure times) for unwrapped and contained loads were also tested after only 1 minute of dry time to simulate use on a consecutive case. Additional studies were performed to demonstrate whether any moisture present in the load containing phaco handpieces postprocessing was sterile and would affect the sterility of the contents after a 3-minute transit/storage time. This approximated the upper limit of time needed to transfer a containment device to the operating room. Presence or absence of microbial growth from cultured test samples. All inoculated test samples from both sterilizers were negative for growth of the target organism whether the full dry phase was interrupted or not. Pipetted postprocessing moisture samples and swabs of the handpieces were also negative for growth after a 3-minute transit/storage time. These studies support the use of unwrapped, short-cycle sterilization that adheres to the IFU of these 2 popular Food and Drug Administration-cleared sterilizers for sequential same-day cataract surgeries. A full drying phase is not necessary when the instruments are kept within the covered sterilizer containment device for prompt use on a sequential case. Copyright © 2018 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.

  17. Modeling eye gaze patterns in clinician-patient interaction with lag sequential analysis.

    PubMed

    Montague, Enid; Xu, Jie; Chen, Ping-Yu; Asan, Onur; Barrett, Bruce P; Chewning, Betty

    2011-10-01

    The aim of this study was to examine whether lag sequential analysis could be used to describe eye gaze orientation between clinicians and patients in the medical encounter. This topic is particularly important as new technologies are implemented into multiuser health care settings in which trust is critical and nonverbal cues are integral to achieving trust. This analysis method could lead to design guidelines for technologies and more effective assessments of interventions. Nonverbal communication patterns are important aspects of clinician-patient interactions and may affect patient outcomes. The eye gaze behaviors of clinicians and patients in 110 videotaped medical encounters were analyzed using the lag sequential method to identify significant behavior sequences. Lag sequential analysis included both event-based lag and time-based lag. Results from event-based lag analysis showed that the patient's gaze followed that of the clinician, whereas the clinician's gaze did not follow the patient's. Time-based sequential analysis showed that responses from the patient usually occurred within 2 s after the initial behavior of the clinician. Our data suggest that the clinician's gaze significantly affects the medical encounter but that the converse is not true. Findings from this research have implications for the design of clinical work systems and modeling interactions. Similar research methods could be used to identify different behavior patterns in clinical settings (physical layout, technology, etc.) to facilitate and evaluate clinical work system designs.
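
    A compact sketch of event-based lag-1 sequential analysis is shown below: transitions between coded behaviors are tallied and compared with the counts expected under independence using adjusted residuals. The gaze coding scheme and the example sequence are invented for illustration.

      # Minimal sketch of event-based lag-1 sequential analysis with adjusted residuals.
      import numpy as np

      def lag1_adjusted_residuals(seq, codes):
          k = len(codes)
          idx = {c: i for i, c in enumerate(codes)}
          obs = np.zeros((k, k))
          for a, b in zip(seq[:-1], seq[1:]):
              obs[idx[a], idx[b]] += 1
          n = obs.sum()
          row = obs.sum(axis=1, keepdims=True)
          col = obs.sum(axis=0, keepdims=True)
          exp = row @ col / n
          # Allison-Liker adjusted residuals for a lag-1 contingency table.
          var = exp * (1 - row / n) * (1 - col / n)
          return obs, (obs - exp) / np.sqrt(var)

      if __name__ == "__main__":
          # C = clinician gazes at patient, P = patient gazes at clinician,
          # T = either party gazes at technology (hypothetical coding scheme).
          seq = list("CPCPTCPCPTTCPCPCTPCPCPT")
          obs, z = lag1_adjusted_residuals(seq, codes=["C", "P", "T"])
          print("observed transitions:\n", obs)
          print("adjusted residuals (|z| > 1.96 suggests a dependency):\n", np.round(z, 2))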

  18. Apollo experience report: Command and service module sequential events control subsystem

    NASA Technical Reports Server (NTRS)

    Johnson, G. W.

    1975-01-01

    The Apollo command and service module sequential events control subsystem is described, with particular emphasis on the major systems and component problems and solutions. The subsystem requirements, design, and development and the test and flight history of the hardware are discussed. Recommendations to avoid similar problems on future programs are outlined.

  19. Treatment Utility of the Kaufman Assessment Battery for Children: Effects of Matching Instruction and Student Processing Strength.

    ERIC Educational Resources Information Center

    Good, Roland H, III; And Others

    1993-01-01

    Tested hypothesis that achievement would be maximized by matching student's Kaufman Assessment Battery for Children-identified processing strength with sequential or simultaneous instruction. Findings from analyses of data from three students with strengths in sequential processing and three students with strengths in simultaneous processing…

  20. Sequential color video to parallel color video converter

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The engineering design, development, breadboard fabrication, test, and delivery of a breadboard field sequential color video to parallel color video converter is described. The converter was designed for use onboard a manned space vehicle to eliminate a flickering TV display picture and to reduce the weight and bulk of previous ground conversion systems.

  1. Alternating and Sequential Motion Rates in Older Adults

    ERIC Educational Resources Information Center

    Pierce, John E.; Cotton, Susan; Perry, Alison

    2013-01-01

    Background: Alternating motion rate (AMR) and sequential motion rate (SMR) are tests of articulatory diadochokinesis that are widely used in the evaluation of motor speech. However, there are no quality normative data available for adults aged 65 years and older. Aims: There were two aims: (1) to obtain a representative, normative dataset of…

  2. Adaptive x-ray threat detection using sequential hypotheses testing with fan-beam experimental data (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Thamvichai, Ratchaneekorn; Huang, Liang-Chih; Ashok, Amit; Gong, Qian; Coccarelli, David; Greenberg, Joel A.; Gehm, Michael E.; Neifeld, Mark A.

    2017-05-01

    We employ an adaptive measurement system, based on a sequential hypotheses testing (SHT) framework, for detecting material-based threats using experimental data acquired on an X-ray testbed system. This testbed employs a 45-degree fan-beam geometry and 15 views over a 180-degree span to generate energy-sensitive X-ray projection data. Using this testbed system, we acquire multiple-view projection data for 200 bags. We consider an adaptive measurement design in which the X-ray projection measurements are acquired sequentially and the adaptation occurs through the choice of the optimal "next" source/view system parameter. Our analysis of such an adaptive measurement design using the experimental data demonstrates a 3x-7x reduction in the probability of error relative to a static measurement design. Here the static measurement design refers to the operational system baseline, which corresponds to a sequential measurement using all the available sources/views. We also show that by using adaptive measurements it is possible to reduce the number of sources/views by nearly 50% compared to a system that relies on static measurements.

  3. Sequential bilateral cochlear implantation improves working performance, quality of life, and quality of hearing.

    PubMed

    Härkönen, Kati; Kivekäs, Ilkka; Rautiainen, Markus; Kotti, Voitto; Sivonen, Ville; Vasama, Juha-Pekka

    2015-05-01

    This prospective study shows that working performance, quality of life (QoL), and quality of hearing (QoH) are better with two cochlear implants (CIs) than with a single cochlear implant (CI). The impact of the second CI on the patient's QoL is as significant as the impact of the first CI. To evaluate the benefits of sequential bilateral cochlear implantation on working performance, QoL, and QoH. We studied working performance, work-related stress, QoL, and QoH with specific questionnaires in 15 patients with a unilateral CI scheduled for sequential implantation of the other ear. Sound localization performance and speech perception in noise were measured with specific tests. All questionnaires and tests were performed before the second CI surgery and 6 and 12 months after its activation. Bilateral CIs increased patients' working performance, and their work-related stress and fatigue decreased. Communication with co-workers was easier, and patients were more active in their working environment. Sequential bilateral cochlear implantation improved QoL, QoH, sound localization, and speech perception in noise statistically significantly.

  4. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    PubMed

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States's Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies for which total maximum daily loads (TMDLs) of pollution inputs into water bodies are developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to be measured to have comparable Type I and Type II error rates as the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as SPRT to current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
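
    A minimal sketch of Wald's SPRT for a stream of binary exceedance outcomes is shown below (H0: exceedance proportion p0, H1: p1, with Wald's approximate boundaries). The rates, error levels and simulated samples are illustrative and do not reproduce California's listing policy.

      # Minimal sketch of Wald's SPRT for binary "exceedance" outcomes
      # (1 = sample exceeds the water-quality threshold).
      import math
      import random

      def sprt(samples, p0=0.1, p1=0.25, alpha=0.05, beta=0.10):
          upper = math.log((1 - beta) / alpha)      # decide for H1 (impaired) above this
          lower = math.log(beta / (1 - alpha))      # decide for H0 (not impaired) below this
          llr = 0.0
          for n, x in enumerate(samples, start=1):
              llr += x * math.log(p1 / p0) + (1 - x) * math.log((1 - p1) / (1 - p0))
              if llr >= upper:
                  return "list as impaired", n
              if llr <= lower:
                  return "do not list", n
          return "continue sampling", len(samples)

      if __name__ == "__main__":
          random.seed(2)
          clean_site = [1 if random.random() < 0.05 else 0 for _ in range(60)]
          dirty_site = [1 if random.random() < 0.30 else 0 for _ in range(60)]
          print(sprt(clean_site))   # typically stops early in favour of H0
          print(sprt(dirty_site))   # typically stops early in favour of H1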

  5. Computer-Based Radiographic Quantification of Joint Space Narrowing Progression Using Sequential Hand Radiographs: Validation Study in Rheumatoid Arthritis Patients from Multiple Institutions.

    PubMed

    Ichikawa, Shota; Kamishima, Tamotsu; Sutherland, Kenneth; Fukae, Jun; Katayama, Kou; Aoki, Yuko; Okubo, Takanobu; Okino, Taichi; Kaneda, Takahiko; Takagi, Satoshi; Tanimura, Kazuhide

    2017-10-01

    We have developed a refined computer-based method to detect joint space narrowing (JSN) progression with the joint space narrowing progression index (JSNPI) by superimposing sequential hand radiographs. The purpose of this study is to assess the validity of the computer-based method in rheumatoid arthritis (RA) patients using images obtained from multiple institutions. Sequential hand radiographs of 42 patients (37 females and 5 males) with RA from two institutions were analyzed by the computer-based method, with visual scoring systems as the standard of reference. A JSNPI above the smallest detectable difference (SDD) defined JSN progression at the joint level. The sensitivity and specificity of the computer-based method for JSN progression were calculated using the SDD and a receiver operating characteristic (ROC) curve. Out of 314 metacarpophalangeal joints, 34 joints progressed based on the SDD, while 11 joints widened. Twenty-one joints progressed in the computer-based method, 11 joints in the scoring systems, and 13 joints in both methods. Based on the SDD, sensitivity was lower and specificity higher, at 54.2% and 92.8%, respectively. At the most discriminant cutoff point according to the ROC curve, the sensitivity and specificity were 70.8% and 81.7%, respectively. The proposed computer-based method provides a quantitative measurement of JSN progression using sequential hand radiographs and may be a useful tool in the follow-up assessment of joint damage in RA patients.

  6. Method for universal detection of two-photon polarization entanglement

    NASA Astrophysics Data System (ADS)

    Bartkiewicz, Karol; Horodecki, Paweł; Lemr, Karel; Miranowicz, Adam; Życzkowski, Karol

    2015-03-01

    Detecting and quantifying quantum entanglement of a given unknown state poses problems that are fundamentally important for quantum information processing. Surprisingly, no direct (i.e., without quantum tomography) universal experimental implementation of a necessary and sufficient test of entanglement has been designed even for a general two-qubit state. Here we propose an experimental method for detecting a collective universal witness, which is a necessary and sufficient test of two-photon polarization entanglement. It allows us to detect entanglement for any two-qubit mixed state and to establish tight upper and lower bounds on its amount. A different element of this method is the sequential character of its main components, which allows us to obtain relatively complicated information about quantum correlations with the help of simple linear-optical elements. As such, this proposal realizes a universal two-qubit entanglement test within the present state of the art of quantum optics. We show the optimality of our setup with respect to the minimal number of measured quantities.

  7. The Obstacles for the Teaching of 8th Grade TR History of Revolution and Kemalism Course According to the Constructivist Approach (An Example of Exploratory Sequential Mixed Method Design)

    ERIC Educational Resources Information Center

    Karademir, Yavuz; Demir, Selcuk Besir

    2015-01-01

    The aim of this study is to ascertain the problems social studies teachers face in the teaching of topics covered in the 8th grade TRHRK Course. The study was conducted in line with the explanatory sequential mixed method design, which is one of the mixed research methods. The study involves three phases. In the first step, exploratory process…

  8. Enantioselective synthesis of syn/anti-1,3-amino alcohols via proline-catalyzed sequential alpha-aminoxylation/alpha-amination and Horner-Wadsworth-Emmons olefination of aldehydes.

    PubMed

    Jha, Vishwajeet; Kondekar, Nagendra B; Kumar, Pradeep

    2010-06-18

    A novel and general method for asymmetric synthesis of both syn/anti-1,3-amino alcohols is described. The method uses proline-catalyzed sequential alpha-aminoxylation/alpha-amination and Horner-Wadsworth-Emmons (HWE) olefination of aldehydes as the key step. By using this method, a short synthesis of a bioactive molecule, (R)-1-((S)-1-methylpyrrolidin-2-yl)-5-phenylpentan-2-ol, is also accomplished.

  9. Cost-effectiveness of simultaneous versus sequential surgery in head and neck reconstruction.

    PubMed

    Wong, Kevin K; Enepekides, Danny J; Higgins, Kevin M

    2011-02-01

    To determine whether simultaneous head and neck reconstruction (ablation and reconstruction overlapping, performed by two teams) is cost-effective compared with sequentially performed surgery (ablation followed by reconstruction). Case-controlled study. Tertiary care hospital. Oncology patients undergoing free flap reconstruction of the head and neck. A matched-pair comparison study was performed with a retrospective chart review examining the total time of surgery for sequential and simultaneous surgery. Nine patients were selected for both the sequential and simultaneous groups. Sequential head and neck reconstruction patients were pair-matched with patients who had undergone similar oncologic ablative or reconstructive procedures performed in a simultaneous fashion. A detailed cost analysis using the microcosting method was then undertaken, looking at the direct costs of the surgeons, anesthesiologist, operating room, and nursing. On average, simultaneous surgery required 3 hours 15 minutes less operating time, leading to a cost savings of approximately $1200/case when compared to sequential surgery. This represents approximately a 15% reduction in the cost of the entire operation. Simultaneous head and neck reconstruction is more cost-effective when compared to sequential surgery.

  10. Comparison of DNA testing strategies in monitoring human papillomavirus infection prevalence through simulation.

    PubMed

    Lin, Carol Y; Li, Ling

    2016-11-07

    HPV DNA diagnostic tests for epidemiology monitoring (research purpose) or cervical cancer screening (clinical purpose) have often been considered separately. Women with positive Linear Array (LA) polymerase chain reaction (PCR) research test results typically are neither informed nor referred for colposcopy. Recently, a sequential testing by using Hybrid Capture 2 (HC2) HPV clinical test as a triage before genotype by LA has been adopted for monitoring HPV infections. Also, HC2 has been reported as a more feasible screening approach for cervical cancer in low-resource countries. Thus, knowing the performance of testing strategies incorporating HPV clinical test (i.e., HC2-only or using HC2 as a triage before genotype by LA) compared with LA-only testing in measuring HPV prevalence will be informative for public health practice. We conducted a Monte Carlo simulation study. Data were generated using mathematical algorithms. We designated the reported HPV infection prevalence in the U.S. and Latin America as the "true" underlying type-specific HPV prevalence. Analytical sensitivity of HC2 for detecting 14 high-risk (oncogenic) types was considered to be less than LA. Estimated-to-true prevalence ratios and percentage reductions were calculated. When the "true" HPV prevalence was designated as the reported prevalence in the U.S., with LA genotyping sensitivity and specificity of (0.95, 0.95), estimated-to-true prevalence ratios of 14 high-risk types were 2.132, 1.056, 0.958 for LA-only, HC2-only, and sequential testing, respectively. Estimated-to-true prevalence ratios of two vaccine-associated high-risk types were 2.359 and 1.063 for LA-only and sequential testing, respectively. When designated type-specific prevalence of HPV16 and 18 were reduced by 50 %, using either LA-only or sequential testing, prevalence estimates were reduced by 18 %. Estimated-to-true HPV infection prevalence ratios using LA-only testing strategy are generally higher than using HC2-only or using HC2 as a triage before genotype by LA. HPV clinical testing can be incorporated to monitor HPV prevalence or vaccine effectiveness. Caution is needed when comparing apparent prevalence from different testing strategies.
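
    The direction of these biases can be illustrated with a generic toy calculation (not the authors' simulation): a single imperfect test inflates a low apparent prevalence through false positives, whereas a sequential strategy that requires positivity on a triage test before genotyping suppresses them. The sensitivities, specificities and true prevalence below are made-up values, and the two tests are assumed conditionally independent.

      # Generic toy calculation of apparent prevalence under two testing strategies.
      def apparent_single(p, sens, spec):
          return sens * p + (1 - spec) * (1 - p)

      def apparent_sequential(p, sens1, spec1, sens2, spec2):
          # Counted positive only if positive on the triage test AND the genotyping test.
          return sens1 * sens2 * p + (1 - spec1) * (1 - spec2) * (1 - p)

      if __name__ == "__main__":
          p_true = 0.04
          est_single = apparent_single(p_true, sens=0.95, spec=0.93)
          est_seq = apparent_sequential(p_true, sens1=0.90, spec1=0.95, sens2=0.95, spec2=0.93)
          print("estimated-to-true ratio, single test: %.2f" % (est_single / p_true))
          print("estimated-to-true ratio, sequential : %.2f" % (est_seq / p_true))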

  11. "Plate cherry picking": a novel semi-sequential screening paradigm for cheaper, faster, information-rich compound selection.

    PubMed

    Crisman, Thomas J; Jenkins, Jeremy L; Parker, Christian N; Hill, W Adam G; Bender, Andreas; Deng, Zhan; Nettles, James H; Davies, John W; Glick, Meir

    2007-04-01

    This work describes a novel semi-sequential technique for in silico enhancement of high-throughput screening (HTS) experiments now employed at Novartis. It is used in situations in which the size of the screen is limited by the readout (e.g., high-content screens) or the amount of reagents or tools (proteins or cells) available. By performing computational chemical diversity selection on a per plate basis (instead of a per compound basis), 25% of the 1,000,000-compound screening was optimized for general initial HTS. Statistical models are then generated from target-specific primary results (percentage inhibition data) to drive the cherry picking and testing from the entire collection. Using retrospective analysis of 11 HTS campaigns, the authors show that this method would have captured on average two thirds of the active compounds (IC(50) < 10 microM) and three fourths of the active Murcko scaffolds while decreasing screening expenditure by nearly 75%. This result is true for a wide variety of targets, including G-protein-coupled receptors, chemokine receptors, kinases, metalloproteinases, pathway screens, and protein-protein interactions. Unlike time-consuming "classic" sequential approaches that require multiple iterations of cherry picking, testing, and building statistical models, here individual compounds are cherry picked just once, based directly on primary screening data. Strikingly, the authors demonstrate that models built from primary data are as robust as models built from IC(50) data. This is true for all HTS campaigns analyzed, which represent a wide variety of target classes and assay types.

  12. Sequential vs simultaneous revascularization in patients undergoing liver transplantation: A meta-analysis.

    PubMed

    Wang, Jia-Zhong; Liu, Yang; Wang, Jin-Long; Lu, Le; Zhang, Ya-Fei; Lu, Hong-Wei; Li, Yi-Ming

    2015-06-14

    We undertook this meta-analysis to investigate the relationship between revascularization and outcomes after liver transplantation. A literature search was performed using MeSH and key words. The quality of the included studies was assessed using the Jadad Score and the Newcastle-Ottawa Scale. Heterogeneity was evaluated by the χ(2) and I (2) tests. The risk of publication bias was assessed using a funnel plot and Egger's test, and the risk of bias was assessed using a domain-based assessment tool. A sensitivity analysis was conducted by reanalyzing the data using different statistical approaches. Six studies with a total of 467 patients were included. Ischemic-type biliary lesions were significantly reduced in the simultaneous revascularization group compared with the sequential revascularization group (OR = 4.97, 95%CI: 2.45-10.07; P < 0.00001), and intensive care unit (ICU) days were decreased (MD = 2.00, 95%CI: 0.55-3.45; P = 0.007) in the simultaneous revascularization group. Although warm ischemia time was prolonged in the simultaneous revascularization group (MD = -25.84, 95%CI: -29.28 to -22.40; P < 0.00001), there were no significant differences in other outcomes between the sequential and simultaneous revascularization groups. Assessment of the risk of bias showed that the methods of random sequence generation and blinding might have been a source of bias. The sensitivity analysis strengthened the reliability of the results of this meta-analysis. The results of this study indicate that simultaneous revascularization in liver transplantation may reduce the incidence of ischemic-type biliary lesions and the length of stay of patients in the ICU.

  13. Computational time analysis of the numerical solution of 3D electrostatic Poisson's equation

    NASA Astrophysics Data System (ADS)

    Kamboh, Shakeel Ahmed; Labadin, Jane; Rigit, Andrew Ragai Henri; Ling, Tech Chaw; Amur, Khuda Bux; Chaudhary, Muhammad Tayyab

    2015-05-01

    The 3D Poisson equation is solved numerically to simulate the electric potential in a prototype design of an electrohydrodynamic (EHD) ion-drag micropump. The finite difference method (FDM) is employed to discretize the governing equation. The system of linear equations resulting from the FDM is solved iteratively using the sequential Jacobi (SJ) and sequential Gauss-Seidel (SGS) methods, and the simulation results are compared to examine the differences between them. The main objective was to analyze the computational time required by both methods for different grid sizes and to parallelize the Jacobi method to reduce the computational time. In general, the SGS method is faster than the SJ method, but the data parallelism of the Jacobi method may produce a good speedup over the SGS method. In this study, the feasibility of using a parallel Jacobi (PJ) method is examined in relation to the SGS method. The MATLAB Parallel/Distributed computing environment is used, and a parallel code for the SJ method is implemented. It was found that for small grid sizes the SGS method remains dominant over the SJ and PJ methods, while for large grid sizes both sequential methods may require excessive processing time to converge. The PJ method, however, reduces the computational time to some extent for large grid sizes.
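
    As a minimal sketch (in Python rather than MATLAB, and not the authors' code), the function below runs the sequential Jacobi iteration for the 3-D finite-difference Poisson equation with zero Dirichlet boundaries; Gauss-Seidel differs only in updating the array in place, which is why Jacobi is the easier method to parallelize. Grid size, source term and tolerance are illustrative.

      # Sequential Jacobi iteration for laplacian(phi) = f on a uniform 3-D grid,
      # phi = 0 on the boundary, grid spacing h.
      import numpy as np

      def jacobi_poisson_3d(f, h, tol=1e-5, max_iter=5000):
          phi = np.zeros_like(f)
          for it in range(max_iter):
              new = phi.copy()
              new[1:-1, 1:-1, 1:-1] = (
                  phi[2:, 1:-1, 1:-1] + phi[:-2, 1:-1, 1:-1] +
                  phi[1:-1, 2:, 1:-1] + phi[1:-1, :-2, 1:-1] +
                  phi[1:-1, 1:-1, 2:] + phi[1:-1, 1:-1, :-2] -
                  h * h * f[1:-1, 1:-1, 1:-1]
              ) / 6.0
              if np.max(np.abs(new - phi)) < tol:
                  return new, it
              phi = new
          return phi, max_iter

      if __name__ == "__main__":
          n, h = 24, 1.0 / 23
          f = np.zeros((n, n, n))
          f[n // 2, n // 2, n // 2] = -1.0       # point-like source term
          phi, iters = jacobi_poisson_3d(f, h)
          print("stopped after", iters, "iterations; peak potential %.6f" % phi.max())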

  14. Assessment of in vitro cyto/genotoxicity of sequentially treated electroplating effluent on the human hepatocarcinoma HuH-7 cell line.

    PubMed

    Naik, Umesh Chandra; Das, Mihir Tanay; Sauran, Swati; Thakur, Indu Shekhar

    2014-03-01

    The present study compares in vitro toxicity of electroplating effluent after the batch treatment process with that obtained after the sequential treatment process. Activated charcoal prepared from sugarcane bagasse through chemical carbonization, and tolerant indigenous bacteria, Bacillus sp. strain IST105, were used individually and sequentially for the treatment of electroplating effluent. The sequential treatment involving activated charcoal followed by bacterial treatment removed 99% of Cr(VI) compared with the batch processes, which removed 40% (charcoal) and 75% (bacteria), respectively. Post-treatment in vitro cyto/genotoxicity was evaluated by the MTT test and the comet assay in human HuH-7 hepatocarcinoma cells. The sequentially treated sample showed an increase in LC50 value with a 6-fold decrease in comet-assay DNA migration compared with that of untreated samples. A significant decrease in DNA migration and an increase in LC50 value of treated effluent proved the higher effectiveness of the sequential treatment process over the individual batch processes. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Retrieving rupture history using waveform inversions in time sequence

    NASA Astrophysics Data System (ADS)

    Yi, L.; Xu, C.; Zhang, X.

    2017-12-01

    The rupture history of a large earthquake is generally reconstructed by waveform inversion of seismological records. In the waveform inversion, based on the superposition principle, the rupture process is linearly parameterized. After discretizing the fault plane into sub-faults, the local source time function of each sub-fault is usually parameterized using the multi-time-window method, e.g., mutually overlapping triangular functions. The forward waveform of each sub-fault is then synthesized by convolving its source time function with its Green's function. According to the superposition principle, the forward waveforms generated over the fault plane sum to the recorded waveforms after the arrival times are aligned, and the slip history is retrieved with the waveform inversion method. Beyond the isolation of the forward waveforms generated by each sub-fault, we also observe that these waveforms are superimposed on the recorded waveforms gradually and sequentially. We thus put forward the idea that the rupture model may be separable into sequential rupture times. According to the constrained waveform length method emphasized in our previous work, the length of the waveforms used in the inversion is objectively constrained by the rupture velocity and rise time, and one essential prior condition is the predetermined fault plane, which limits the duration of the rupture time; in other words, the waveform inversion is restricted to a pre-set rupture duration. We therefore propose a strategy for inverting the rupture process sequentially, using progressively shifted rupture times as the rupture front expands across the fault plane. We have designed a synthetic inversion test to assess the feasibility of this method. Our test result shows the promise of this idea, which requires further investigation.

  16. Evaluation of an automated safety surveillance system using risk adjusted sequential probability ratio testing.

    PubMed

    Matheny, Michael E; Normand, Sharon-Lise T; Gross, Thomas P; Marinac-Dabic, Danica; Loyo-Berrios, Nilsa; Vidi, Venkatesan D; Donnelly, Sharon; Resnic, Frederic S

    2011-12-14

    Automated adverse outcome surveillance tools and methods have potential utility in quality improvement and medical product surveillance activities. Their use for assessing hospital performance on the basis of patient outcomes has received little attention. We compared risk-adjusted sequential probability ratio testing (RA-SPRT) implemented in an automated tool to Massachusetts public reports of 30-day mortality after isolated coronary artery bypass graft surgery. A total of 23,020 isolated adult coronary artery bypass surgery admissions performed in Massachusetts hospitals between January 1, 2002 and September 30, 2007 were retrospectively re-evaluated. The RA-SPRT method was implemented within an automated surveillance tool to identify hospital outliers in yearly increments. We used an overall type I error rate of 0.05, an overall type II error rate of 0.10, and a threshold that signaled if the odds of dying 30 days after surgery were at least twice those expected. Annual hospital outlier status, based on the state-reported classification, was considered the gold standard. An event was defined as at least one occurrence of a higher-than-expected hospital mortality rate during a given year. We examined a total of 83 hospital-year observations. The RA-SPRT method alerted on 6 events among three hospitals for 30-day mortality compared with 5 events among two hospitals using the state public reports, yielding a sensitivity of 100% (5/5) and a specificity of 98.8% (79/80). The automated RA-SPRT method performed well, detecting all of the true institutional outliers with a small false-positive alerting rate. Such a system could provide confidential automated notification to local institutions in advance of public reporting, providing opportunities for earlier quality improvement interventions.
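
    The accumulating statistic behind such a system can be sketched in a few lines: for each consecutive patient the log-likelihood ratio compares the null (mortality risk equal to the model prediction) against an alternative in which the odds of death are doubled, and crosses Wald-style boundaries derived from the chosen error rates. The patient risks and outcomes below are simulated placeholders, not registry data, and the boundary handling is simplified relative to an operational surveillance tool.

      # Minimal sketch of a risk-adjusted SPRT for binary mortality outcomes.
      import math
      import random

      def ra_sprt(risks, outcomes, odds_ratio=2.0, alpha=0.05, beta=0.10):
          upper = math.log((1 - beta) / alpha)      # signal: performance worse than expected
          lower = math.log(beta / (1 - alpha))      # accept: performance as expected
          llr = 0.0
          for i, (p, y) in enumerate(zip(risks, outcomes), start=1):
              # Log-likelihood ratio increment for predicted risk p and outcome y.
              llr += y * math.log(odds_ratio) - math.log(1 - p + odds_ratio * p)
              if llr >= upper:
                  return "alert (outlier)", i
              if llr <= lower:
                  return "no alert", i
          return "undecided", len(outcomes)

      if __name__ == "__main__":
          random.seed(3)
          risks = [random.uniform(0.01, 0.08) for _ in range(800)]   # predicted 30-day mortality
          as_expected = [1 if random.random() < p else 0 for p in risks]
          doubled_odds = [1 if random.random() < (2 * p / (1 + p)) else 0 for p in risks]
          print(ra_sprt(risks, as_expected))
          print(ra_sprt(risks, doubled_odds))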

  17. A Comparison of Two Instructional Methods for Teaching Logo to Learning Disabled and Nonlearning Disabled Children.

    ERIC Educational Resources Information Center

    Mathinos, Debra A.; Leonard, Ann Scheier

    The study examines the use of LOGO, a computer language, with 19 learning disabled (LD) and 19 non-LD students in grades 4-6. Ss were randomly assigned to one of two instructional groups: sequential or whole-task, each with 10 LD and 10 non-LD students. The sequential method features a carefully ordered plan for teaching LOGO commands; the…

  18. Power Distribution System Planning with GIS Consideration

    NASA Astrophysics Data System (ADS)

    Wattanasophon, Sirichai; Eua-Arporn, Bundhit

    This paper proposes a method for solving radial distribution system planning problems that takes geographical information into account. The proposed method can automatically determine the appropriate location and size of a substation, the routing of feeders, and the sizes of conductors while satisfying all constraints, i.e., technical constraints (voltage drop and thermal limit) and geographical constraints (obstacles, existing infrastructure, and high-cost passages). Sequential quadratic programming (SQP) and a minimum path algorithm (MPA) are applied to solve the planning problem based on net present value (NPV) considerations. In addition, the method integrates the planner's experience with the optimization process to achieve an appropriate, practical solution. The proposed method has been tested on an actual distribution system, and the results indicate that it can provide satisfactory plans.

  19. Environmentally friendly microwave-assisted sequential extraction method followed by ICP-OES and ion-chromatographic analysis for rapid determination of sulphur forms in coal samples.

    PubMed

    Mketo, Nomvano; Nomngongo, Philiswa N; Ngila, J Catherine

    2018-05-15

    A rapid three-step sequential extraction method was developed under microwave radiation, followed by inductively coupled plasma-optical emission spectroscopic (ICP-OES) and ion-chromatographic (IC) analysis, for the determination of sulphur forms in coal samples. The experimental conditions of the proposed microwave-assisted sequential extraction (MW-ASE) procedure were optimized by using multivariate mathematical tools. Pareto charts generated from a 2³ full factorial design showed that extraction time had an insignificant effect on the extraction of sulphur species; therefore, all the sequential extraction steps were performed for 5 min. The optimum values according to the central composite designs and contour plots of the response surface methodology were 200 °C (microwave temperature) and 0.1 g (coal amount) for all the investigated extracting reagents (H₂O, HCl and HNO₃). When the optimum conditions of the proposed MW-ASE procedure were applied to coal CRMs, SARM 18 showed more organic sulphur (72%) and the other two coal CRMs (SARMs 19 and 20) were dominated by sulphide sulphur species (52-58%). The sum of the sulphur forms from the sequential extraction steps showed consistent agreement (95-96%) with the certified total sulphur values on the coal CRM certificates. This correlation, in addition to the good precision (1.7%) achieved by the proposed procedure, suggests that the sequential extraction method is reliable, accurate and reproducible. To prevent the destruction of pyritic and organic sulphur forms in extraction step 1, water was used instead of HCl. Additionally, the notorious acidic mixture (HCl/HNO₃/HF) was replaced by a greener reagent (H₂O₂) in the last extraction step. Therefore, the proposed MW-ASE method can be applied in routine laboratories for the determination of sulphur forms in coal and coal-related matrices. Copyright © 2018 Elsevier B.V. All rights reserved.
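
    The 2³ full factorial screening step mentioned above is straightforward to enumerate. Below is a minimal Python sketch; the factor labels and coded levels are assumptions for illustration, not values taken from the paper.

        from itertools import product

        # Coded levels (-1 = low, +1 = high) for three hypothetical screening factors
        factors = {"temperature": (-1, 1), "coal_mass": (-1, 1), "extraction_time": (-1, 1)}

        runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
        for i, run in enumerate(runs, start=1):   # 2**3 = 8 runs
            print(i, run)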

  20. Automated Sample Preparation (ASP): Development of a Rapid Method to Sequentially Isolate Nucleic Acids and Protein from Any Sample Type by a Cartridge-Based System

    DTIC Science & Technology

    2013-11-27

    CUBRC has developed an in-line, multi-analyte isolation technology that utilizes solid phase extraction chemistries to purify...goals. Specifically, CUBRC will design and manufacture a prototype cartridge(s) and test the prototype cartridge for its ability to isolate each...

  1. Transition play in team performance of volleyball: a log-linear analysis.

    PubMed

    Eom, H J; Schutz, R W

    1992-09-01

    The purpose of this study was to develop and test a method to analyze and evaluate sequential skill performances in a team sport. An on-line computerized system was developed to record and summarize the sequential skill performances in volleyball. Seventy-two sample games from the third Federation of International Volleyball Cup men's competition were videotaped and grouped into two categories according to the final team standing and game outcome. Log-linear procedures were used to investigate the nature and degree of the relationship in the first-order (pass-to-set, set-to-spike) and second-order (pass-to-spike) transition plays. Results showed that there was a significant dependency in both the first-order and second-order transition plays, indicating that the outcome of a skill performance is highly influenced by the quality of a preceding skill performance. In addition, the pattern of the transition plays was stable and consistent, regardless of the classification status: Game Outcome, Team Standing, or Transition Process. The methodology and subsequent results provide valuable aids for a thorough understanding of the characteristics of transition plays in volleyball. In addition, the concept of sequential performance analysis may serve as an example for sport scientists in investigating probabilistic patterns of motor performance.
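
    Although the study used log-linear models, the basic dependency question (does set quality depend on the preceding pass quality?) can be illustrated with a simple contingency-table test. The Python sketch below applies a chi-square test of independence to made-up transition counts as a stand-in for the full log-linear analysis.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Hypothetical pass-to-set transition counts (rows: pass quality, cols: set quality)
        table = np.array([[120, 60, 20],
                          [ 50, 90, 40],
                          [ 15, 45, 80]])

        chi2, p_value, dof, expected = chi2_contingency(table)
        print(f"chi2={chi2:.1f}, dof={dof}, p={p_value:.4g}")  # small p suggests dependency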

  2. Implementation of Temperature Sequential Controller on Variable Speed Drive

    NASA Astrophysics Data System (ADS)

    Cheong, Z. X.; Barsoum, N. N.

    2008-10-01

    There are many pump and motor installations with quite extensive speed variation, such as Sago conveyors, heating, ventilation and air conditioning (HVAC), and water pumping systems. A common solution for these applications is to run several fixed-speed motors in parallel, with flow control accomplished by turning the motors on and off. This type of control method causes high in-rush current and adds a risk of damage caused by pressure transients. This paper explains the design and implementation of a temperature speed control system for use in the industrial and commercial sectors. Advanced temperature speed control can be achieved by using the ABB ACS800 variable speed drive's direct torque sequential control macro, a programmable logic controller, and a temperature transmitter. The principle of the direct torque sequential control (DTC-SC) macro is based on the control of torque and flux utilizing stator flux field orientation over seven preset constant speeds. As a result of continuous comparison of the ambient temperature to the reference temperatures, the electromagnetic torque response to the motor state is particularly fast and the drive is able to maintain constant speeds. Experimental tests have been carried out using an ABB ACS800-U1-0003-2 to validate the effectiveness and dynamic response of the ABB ACS800 against temperature variation, loads, and mechanical shocks.

  3. Validation of a novel sequential cultivation method for the production of enzymatic cocktails from Trichoderma strains.

    PubMed

    Florencio, C; Cunha, F M; Badino, A C; Farinas, C S

    2015-02-01

    The development of new cost-effective bioprocesses for the production of cellulolytic enzymes is needed in order to ensure that the conversion of biomass becomes economically viable. The aim of this study was to determine whether a novel sequential solid-state and submerged fermentation method (SF) could be validated for different strains of the Trichoderma genus. Cultivation of the Trichoderma reesei Rut-C30 reference strain under SF using sugarcane bagasse as substrate was shown to be favorable for endoglucanase (EGase) production, resulting in up to 4.2-fold improvement compared with conventional submerged fermentation. Characterization of the enzymes in terms of the optimum pH and temperature for EGase activity and comparison of the hydrolysis profiles obtained using a synthetic substrate did not reveal any qualitative differences among the different cultivation conditions investigated. However, the thermostability of the EGase was influenced by the type of carbon source and cultivation system. All three strains of Trichoderma tested (T. reesei Rut-C30, Trichoderma harzianum, and Trichoderma sp INPA 666) achieved higher enzymatic productivity when cultivated under SF, hence validating the proposed SF method for use with different Trichoderma strains. The results suggest that this bioprocess configuration is a very promising development for the cellulosic biofuels industry.

  4. Analysis of Optimal Sequential State Discrimination for Linearly Independent Pure Quantum States.

    PubMed

    Namkung, Min; Kwon, Younghun

    2018-04-25

    Recently, J. A. Bergou et al. proposed sequential state discrimination as a new quantum state discrimination scheme. In the scheme, by the successful sequential discrimination of a qubit state, receivers Bob and Charlie can share the information of the qubit prepared by a sender Alice. A merit of the scheme is that a quantum channel is established between Bob and Charlie, but a classical communication is not allowed. In this report, we present a method for extending the original sequential state discrimination of two qubit states to a scheme of N linearly independent pure quantum states. Specifically, we obtain the conditions for the sequential state discrimination of N = 3 pure quantum states. We can analytically provide conditions when there is a special symmetry among N = 3 linearly independent pure quantum states. Additionally, we show that the scenario proposed in this study can be applied to quantum key distribution. Furthermore, we show that the sequential state discrimination of three qutrit states performs better than the strategy of probabilistic quantum cloning.

  5. The perceptual processing capacity of summary statistics between and within feature dimensions

    PubMed Central

    Attarha, Mouna; Moore, Cathleen M.

    2015-01-01

    The simultaneous–sequential method was used to test the processing capacity of statistical summary representations both within and between feature dimensions. Sixteen gratings varied with respect to their size and orientation. In Experiment 1, the gratings were equally divided into four separate smaller sets, one of which had a mean size that was larger or smaller than the other three sets, and one of which had a mean orientation that was tilted more leftward or rightward. The task was to report the mean size and orientation of the oddball sets. This therefore required four summary representations for size and another four for orientation. The sets were presented at the same time in the simultaneous condition or across two temporal frames in the sequential condition. Experiment 1 showed evidence of a sequential advantage, suggesting that the system may be limited with respect to establishing multiple within-feature summaries. Experiment 2 eliminated the possibility that some aspect of the task, other than averaging, was contributing to this observed limitation. In Experiment 3, the same 16 gratings appeared as one large superset, and therefore the task required only one summary representation for size and another for orientation. Equal simultaneous–sequential performance indicated that between-feature summaries are capacity free. These findings challenge the view that within-feature summaries drive a global sense of visual continuity across areas of the peripheral visual field, and suggest a shift in focus to seeking an understanding of how between-feature summaries in one area of the environment control behavior. PMID:26360153

  6. Parsing the Passive: Comparing Children with Specific Language Impairment to Sequential Bilingual Children

    ERIC Educational Resources Information Center

    Marinis, Theodoros; Saddy, Douglas

    2013-01-01

    Twenty-five monolingual (L1) children with specific language impairment (SLI), 32 sequential bilingual (L2) children, and 29 L1 controls completed the Test of Active & Passive Sentences-Revised (van der Lely 1996) and the Self-Paced Listening Task with Picture Verification for actives and passives (Marinis 2007). These revealed important…

  7. Contribution of Implicit Sequence Learning to Spoken Language Processing: Some Preliminary Findings with Hearing Adults

    ERIC Educational Resources Information Center

    Conway, Christopher M.; Karpicke, Jennifer; Pisoni, David B.

    2007-01-01

    Spoken language consists of a complex, sequentially arrayed signal that contains patterns that can be described in terms of statistical relations among language units. Previous research has suggested that a domain-general ability to learn structured sequential patterns may underlie language acquisition. To test this prediction, we examined the…

  8. The Development of Auditory Sequential Memory in Young Black and White Children.

    ERIC Educational Resources Information Center

    Hurley, Oliver L.; And Others

    The question of whether Black children "peak" earlier than White children in auditory sequential memory (ASM) was investigated in 122 Black children and 120 White children in grades k-3 in two racially mixed schools in a large southern community. Each S was given the ASM subtest of the Illinois Test of Psycholinguistic Abilities. Results…

  9. Examining Age-Related Movement Representations for Sequential (Fine-Motor) Finger Movements

    ERIC Educational Resources Information Center

    Gabbard, Carl; Cacola, Priscila; Bobbio, Tatiana

    2011-01-01

    Theory suggests that imagined and executed movement planning relies on internal models for action. Using a chronometry paradigm to compare the movement duration of imagined and executed movements, we tested children aged 7-11 years and adults on their ability to perform sequential finger movements. Underscoring this tactic was our desire to gain a…

  10. Sequential Organization and Room Reverberation for Speech Segregation

    DTIC Science & Technology

    2012-02-28

    we have proposed two algorithms for sequential organization, an unsupervised clustering algorithm applicable to monaural recordings and a binaural ...algorithm that integrates monaural and binaural analyses. In addition, we have conducted speech intelligibility tests that Firmly establish the...comprehensive version is currently under review for journal publication. A binaural approach in room reverberation Most existing approaches to binaural or

  11. Trial-to-Trial Modulations of the Simon Effect in Conditions of Attentional Limitations : Evidence from Dual Tasks

    ERIC Educational Resources Information Center

    Fischer, Rico; Plessow, Franziska; Kunde, Wilfried; Kiesel, Andrea

    2010-01-01

    Interference effects are reduced after trials including response conflict. This sequential modulation has often been attributed to a top-down mediated adaptive control mechanism and/or to feature repetition mechanisms. In the present study we tested whether mechanisms responsible for such sequential modulations are subject to attentional…

  12. Effects of a Web-Based Tailored Multiple-Lifestyle Intervention for Adults: A Two-Year Randomized Controlled Trial Comparing Sequential and Simultaneous Delivery Modes

    PubMed Central

    Kremers, Stef PJ; Vandelanotte, Corneel; van Adrichem, Mathieu JG; Schneider, Francine; Candel, Math JJM; de Vries, Hein

    2014-01-01

    Background Web-based computer-tailored interventions for multiple health behaviors can have a significant public health impact. Yet, few randomized controlled trials have tested this assumption. Objective The objective of this paper was to test the effects of a sequential and simultaneous Web-based tailored intervention on multiple lifestyle behaviors. Methods A randomized controlled trial was conducted with 3 tailoring conditions (ie, sequential, simultaneous, and control conditions) in the Netherlands in 2009-2012. Follow-up measurements took place after 12 and 24 months. The intervention content was based on the I-Change model. In a health risk appraisal, all respondents (N=5055) received feedback on their lifestyle behaviors that indicated whether they complied with the Dutch guidelines for physical activity, vegetable consumption, fruit consumption, alcohol intake, and smoking. Participants in the sequential (n=1736) and simultaneous (n=1638) conditions received tailored motivational feedback to change unhealthy behaviors one at a time (sequential) or all at the same time (simultaneous). Mixed model analyses were performed as primary analyses; regression analyses were done as sensitivity analyses. An overall risk score was used as outcome measure, then effects on the 5 individual lifestyle behaviors were assessed and a process evaluation was performed regarding exposure to and appreciation of the intervention. Results Both tailoring strategies were associated with small self-reported behavioral changes. The sequential condition had the most significant effects compared to the control condition after 12 months (T1, effect size=0.28). After 24 months (T2), the simultaneous condition was most effective (effect size=0.18). All 5 individual lifestyle behaviors changed over time, but few effects differed significantly between the conditions. At both follow-ups, the sequential condition had significant changes in smoking abstinence compared to the simultaneous condition (T1 effect size=0.31; T2 effect size=0.41). The sequential condition was more effective in decreasing alcohol consumption than the control condition at 24 months (effect size=0.27). Change was predicted by the amount of exposure to the intervention (total visiting time: beta=–.06; P=.01; total number of visits: beta=–.11; P<.001). Both interventions were appreciated well by respondents without significant differences between conditions. Conclusions Although evidence was found for the effectiveness of both programs, no simple conclusive finding could be drawn about which intervention mode was more effective. The best kind of intervention may depend on the behavior that is targeted or on personal preferences and motivation. Further research is needed to identify moderators of intervention effectiveness. The results need to be interpreted in view of the high and selective dropout rates, multiple comparisons, and modest effect sizes. However, a large number of people were reached at low cost and behavioral change was achieved after 2 years. Trial Registration Nederlands Trial Register: NTR 2168; http://www.trialregister.nl/trialreg/admin/rctview.asp?TC=2168 (Archived by WebCite at http://www.webcitation.org/6MbUqttYB). PMID:24472854

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faye, Sherry A.; Richards, Jason M.; Gallardo, Athena M.

    Sequential extraction is a useful technique for assessing the potential to leach actinides from soils; however, current literature lacks uniformity in experimental details, making direct comparison of results impossible. This work continued development toward a standardized five-step sequential extraction protocol by analyzing extraction behaviors of 232Th, 238U, 239,240Pu and 241Am from lake and ocean sediment reference materials. Results produced a standardized procedure after creating more defined reaction conditions to improve method repeatability. A NaOH fusion procedure is recommended following sequential leaching for the complete dissolution of insoluble species.

  14. Speech Perception and Production by Sequential Bilingual Children: A Longitudinal Study of Voice Onset Time Acquisition

    PubMed Central

    McCarthy, Kathleen M; Mahon, Merle; Rosen, Stuart; Evans, Bronwen G

    2014-01-01

    The majority of bilingual speech research has focused on simultaneous bilinguals. Yet, in immigrant communities, children are often initially exposed to their family language (L1), before becoming gradually immersed in the host country's language (L2). This is typically referred to as sequential bilingualism. Using a longitudinal design, this study explored the perception and production of the English voicing contrast in 55 children (40 Sylheti-English sequential bilinguals and 15 English monolinguals). Children were tested twice: when they were in nursery (52-month-olds) and 1 year later. Sequential bilinguals' perception and production of English plosives were initially driven by their experience with their L1, but after starting school, changed to match that of their monolingual peers. PMID:25123987

  15. Simultaneous vs. sequential treatment for smoking and weight management in tobacco quitlines: 6 and 12 month outcomes from a randomized trial.

    PubMed

    Bush, Terry; Lovejoy, Jennifer; Javitz, Harold; Torres, Alula Jimenez; Wassum, Ken; Tan, Marcia M; Spring, Bonnie

    2018-05-31

    Smoking cessation often results in weight gain, which discourages many smokers from quitting and can increase health risks. Treatments to reduce cessation-related weight gain have been tested in highly controlled trials of in-person treatment, but have never been tested in a real-world setting, which has inhibited dissemination. The Best Quit Study (BQS) is a replication and "real world" translation, using telephone delivery, of a prior in-person efficacy trial: a randomized controlled trial in a quitline setting. Eligible smokers (n = 2540) were randomized to the standard 5-call quitline intervention or quitline plus simultaneous or sequential weight management. Regression analyses tested the effectiveness of treatments on self-reported smoking abstinence and weight change at 6 and 12 months. Study enrollees were from 10 commercial employer groups and three state quitlines. Participants were between the ages of 18 and 72; 65.8% were female, 68.2% white, 23.0% Medicaid-insured, and 76.3% overweight/obese. The follow-up response rate was lower in the simultaneous group than the control group at 6 months (p = 0.01). While a completers analysis of 30-day point prevalence abstinence detected no differences among groups at 6 or 12 months, multiply imputed abstinence showed quit rate differences at 6 months: simultaneous (40.3%) vs. sequential (48.3%), p = 0.034, and simultaneous vs. control (44.9%), p = 0.043. At 12 months, multiply imputed abstinence was significantly lower for the simultaneous group (40.7%) vs. control (46.0%), p < 0.05, and vs. sequential (46.3%), p < 0.05. Weight gain at 6 and 12 months was minimal and did not differ among treatment groups. The sequential group completed fewer total calls (3.75) vs. control (4.16) and vs. the simultaneous group (3.83), p = 0.01, and fewer weight calls (0.94) than the simultaneous group (2.33), p < 0.0001. The number of calls completed predicted 30-day abstinence, p < 0.001, but not weight outcomes. This study offers a model for evaluating population-level public health interventions conducted in partnership with tobacco quitlines. Simultaneous (vs. sequential) delivery of phone/web weight management with cessation treatment in the quitline setting may adversely affect quit rates. Neither a simultaneous nor a sequential approach to addressing weight produced any benefit in suppressing weight gain. This study highlights the need for, and the challenges of, testing intensive interventions in real-world settings. ClinicalTrials.gov Identifier: NCT01867983 . Registered: May 30, 2013.

  16. A cost and policy analysis comparing immediate sequential cataract surgery and delayed sequential cataract surgery from the physician perspective in the United States.

    PubMed

    Neel, Sean T

    2014-11-01

    A cost analysis was performed to evaluate the effect on physicians in the United States of a transition from delayed sequential cataract surgery to immediate sequential cataract surgery. Financial and efficiency impacts of this change were evaluated to determine whether efficiency gains could offset potential reduced revenue. A cost analysis using Medicare cataract surgery volume estimates, Medicare 2012 physician cataract surgery reimbursement schedules, and estimates of potential additional office visit revenue comparing immediate sequential cataract surgery with delayed sequential cataract surgery for a single specialty ophthalmology practice in West Tennessee. This model should give an indication of the effect on physicians on a national basis. A single specialty ophthalmology practice in West Tennessee was found to have a cataract surgery revenue loss of $126,000, increased revenue from office visits of $34,449 to $106,271 (minimum and maximum offset methods), and a net loss of $19,900 to $91,700 (base case) with the conversion to immediate sequential cataract surgery. Physicians likely stand to lose financially, and this loss cannot be offset by increased patient visits under the current reimbursement system. This may result in physician resistance to converting to immediate sequential cataract surgery, gaming, and supplier-induced demand.

  17. Dizocilpine (MK-801) impairs learning in the active place avoidance task but has no effect on the performance during task/context alternation.

    PubMed

    Vojtechova, Iveta; Petrasek, Tomas; Hatalova, Hana; Pistikova, Adela; Vales, Karel; Stuchlik, Ales

    2016-05-15

    The prevention of engram interference, pattern separation, flexibility, cognitive coordination and spatial navigation are usually studied separately at the behavioral level. Impairment in executive functions is often observed in patients suffering from schizophrenia. We have designed a protocol for assessing these functions all together as behavioral separation. This protocol is based on alternated or sequential training in two tasks testing different hippocampal functions (the Morris water maze and active place avoidance), and alternated or sequential training in two similar environments of the active place avoidance task. In Experiment 1, we tested, in adult rats, whether the performance in two different spatial tasks was affected by their order in sequential learning, or by their day-to-day alternation. In Experiment 2, rats learned to solve the active place avoidance task in two environments either alternately or sequentially. We found that rats are able to acquire both tasks and to discriminate both similar contexts without obvious problems regardless of the order or the alternation. We used two groups of rats, controls and a rat model of psychosis induced by a subchronic intraperitoneal application of 0.08mg/kg of dizocilpine (MK-801), a non-competitive antagonist of NMDA receptors. Dizocilpine had no selective effect on parallel/sequential learning of tasks/contexts. However, it caused hyperlocomotion and a significant deficit in learning in the active place avoidance task regardless of the task alternation. Cognitive coordination tested by this task is probably more sensitive to dizocilpine than spatial orientation because no hyperactivity or learning impairment was observed in the Morris water maze. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. SIMPLE: a sequential immunoperoxidase labeling and erasing method.

    PubMed

    Glass, George; Papin, Jason A; Mandell, James W

    2009-10-01

    The ability to simultaneously visualize expression of multiple antigens in cells and tissues can provide powerful insights into cellular and organismal biology. However, standard methods are limited to the use of just two or three simultaneous probes and have not been widely adopted for routine use in paraffin-embedded tissue. We have developed a novel approach called sequential immunoperoxidase labeling and erasing (SIMPLE) that enables the simultaneous visualization of at least five markers within a single tissue section. Utilizing the alcohol-soluble peroxidase substrate 3-amino-9-ethylcarbazole, combined with a rapid non-destructive method for antibody-antigen dissociation, we demonstrate the ability to erase the results of a single immunohistochemical stain while preserving tissue antigenicity for repeated rounds of labeling. SIMPLE is greatly facilitated by the use of a whole-slide scanner, which can capture the results of each sequential stain without any information loss.

  19. Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time

    NASA Astrophysics Data System (ADS)

    Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.

    2017-12-01

    We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We characterize statistically such time-series by a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used for implementation of novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time for different assumptions in knowledge of parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.
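
    To make the SMC idea concrete, here is a generic bootstrap particle filter for a latent AR(1)-type state observed through a nonlinear function. This is a minimal sketch under simplified assumptions, not the authors' method for ARMA innovations correlated in time.

        import numpy as np

        rng = np.random.default_rng(0)

        def bootstrap_pf(y, n_particles=500, phi=0.9, sigma_x=0.5, sigma_y=0.3):
            """Bootstrap particle filter for x_t = phi*x_{t-1} + v_t, y_t = x_t**2/2 + w_t."""
            T = len(y)
            particles = rng.normal(0.0, 1.0, n_particles)
            means = np.empty(T)
            for t in range(T):
                # Propagate particles through the state transition density
                particles = phi * particles + rng.normal(0.0, sigma_x, n_particles)
                # Weight by the (Gaussian) observation likelihood
                log_w = -0.5 * ((y[t] - particles**2 / 2.0) / sigma_y) ** 2
                w = np.exp(log_w - log_w.max())
                w /= w.sum()
                means[t] = np.sum(w * particles)
                # Multinomial resampling
                particles = particles[rng.choice(n_particles, n_particles, p=w)]
            return means

        # Simulate data from the same toy model and filter it
        x, obs = 0.0, []
        for _ in range(100):
            x = 0.9 * x + rng.normal(0.0, 0.5)
            obs.append(x**2 / 2.0 + rng.normal(0.0, 0.3))
        print(bootstrap_pf(np.array(obs))[:5])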

  20. Sequential Designs Based on Bayesian Uncertainty Quantification in Sparse Representation Surrogate Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Ray -Bing; Wang, Weichung; Jeff Wu, C. F.

    A numerical method, called OBSM, was recently proposed which employs overcomplete basis functions to achieve sparse representations. While the method can handle non-stationary response without the need of inverting large covariance matrices, it lacks the capability to quantify uncertainty in predictions. We address this issue by proposing a Bayesian approach which first imposes a normal prior on the large space of linear coefficients, then applies the MCMC algorithm to generate posterior samples for predictions. From these samples, Bayesian credible intervals can then be obtained to assess prediction uncertainty. A key application for the proposed method is the efficient construction of sequential designs. Several sequential design procedures with different infill criteria are proposed based on the generated posterior samples. As a result, numerical studies show that the proposed schemes are capable of solving problems of positive point identification, optimization, and surrogate fitting.

  1. Sequential Designs Based on Bayesian Uncertainty Quantification in Sparse Representation Surrogate Modeling

    DOE PAGES

    Chen, Ray -Bing; Wang, Weichung; Jeff Wu, C. F.

    2017-04-12

    A numerical method, called OBSM, was recently proposed which employs overcomplete basis functions to achieve sparse representations. While the method can handle non-stationary response without the need of inverting large covariance matrices, it lacks the capability to quantify uncertainty in predictions. We address this issue by proposing a Bayesian approach which first imposes a normal prior on the large space of linear coefficients, then applies the MCMC algorithm to generate posterior samples for predictions. From these samples, Bayesian credible intervals can then be obtained to assess prediction uncertainty. A key application for the proposed method is the efficient construction of sequential designs. Several sequential design procedures with different infill criteria are proposed based on the generated posterior samples. As a result, numerical studies show that the proposed schemes are capable of solving problems of positive point identification, optimization, and surrogate fitting.

  2. Computational aspects of helicopter trim analysis and damping levels from Floquet theory

    NASA Technical Reports Server (NTRS)

    Gaonkar, Gopal H.; Achar, N. S.

    1992-01-01

    Helicopter trim settings of periodic initial state and control inputs are investigated for convergence of Newton iteration in computing the settings sequentially and in parallel. The trim analysis uses a shooting method and a weak version of two temporal finite element methods with displacement formulation and with mixed formulation of displacements and momenta. These three methods broadly represent two main approaches of trim analysis: adaptation of initial-value and finite element boundary-value codes to periodic boundary conditions, particularly for unstable and marginally stable systems. In each method, both the sequential and in-parallel schemes are used and the resulting nonlinear algebraic equations are solved by damped Newton iteration with an optimally selected damping parameter. The impact of damped Newton iteration, including earlier-observed divergence problems in trim analysis, is demonstrated by the maximum condition number of the Jacobian matrices of the iterative scheme and by virtual elimination of divergence. The advantages of the in-parallel scheme over the conventional sequential scheme are also demonstrated.
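
    The damped Newton iteration at the heart of the trim solver is easy to sketch. The following Python example is a generic illustration, not the paper's rotorcraft code: it solves a small nonlinear algebraic system F(x) = 0 with a fixed damping parameter.

        import numpy as np

        def damped_newton(F, J, x0, damping=0.5, tol=1e-10, max_iter=100):
            """Solve F(x) = 0 with the update x <- x - damping * J(x)^{-1} F(x)."""
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                fx = F(x)
                if np.linalg.norm(fx) < tol:
                    break
                x = x - damping * np.linalg.solve(J(x), fx)
            return x

        # Toy 2-equation system standing in for the periodic trim equations
        F = lambda x: np.array([x[0]**2 + x[1] - 2.0, x[0] - x[1]**3])
        J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, -3.0 * x[1]**2]])
        print(damped_newton(F, J, x0=[1.5, 0.5]))   # converges to (1, 1)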

  3. Sensitivity Analysis in Sequential Decision Models.

    PubMed

    Chen, Qiushi; Ayer, Turgay; Chhatwal, Jagpreet

    2017-02-01

    Sequential decision problems are frequently encountered in medical decision making, and they are commonly solved using Markov decision processes (MDPs). Modeling guidelines recommend conducting sensitivity analyses in decision-analytic models to assess the robustness of the model results against the uncertainty in model parameters. However, standard methods of conducting sensitivity analyses cannot be directly applied to sequential decision problems because this would require evaluating all possible decision sequences, typically on the order of trillions, which is not practically feasible. As a result, most MDP-based modeling studies do not examine confidence in their recommended policies. In this study, we provide an approach to estimate uncertainty and confidence in the results of sequential decision models. First, we provide a probabilistic univariate method to identify the most sensitive parameters in MDPs. Second, we present a probabilistic multivariate approach to estimate the overall confidence in the recommended optimal policy considering joint uncertainty in the model parameters. We provide a graphical representation, which we call a policy acceptability curve, to summarize the confidence in the optimal policy by incorporating stakeholders' willingness to accept the base case policy. For a cost-effectiveness analysis, we provide an approach to construct a cost-effectiveness acceptability frontier, which shows the most cost-effective policy as well as the confidence in it for a given willingness-to-pay threshold. We demonstrate our approach using a simple MDP case study. We developed a method to conduct sensitivity analysis in sequential decision models, which could increase the credibility of these models among stakeholders.
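
    One minimal way to visualize the multivariate idea is to sample parameter sets, solve the MDP for each, and record how often the base-case policy remains optimal. The Python sketch below does this for a tiny two-state, two-action MDP; the transition probabilities, rewards, and sampling distributions are invented for illustration and are not the paper's case study.

        import numpy as np

        rng = np.random.default_rng(1)
        GAMMA = 0.95

        def optimal_policy(P, R, n_iter=200):
            """Value iteration. P has shape (A, S, S); R has shape (S, A)."""
            V = np.zeros(R.shape[0])
            for _ in range(n_iter):
                EV = np.tensordot(P, V, axes=([2], [0])).T   # (S, A) expected continuation value
                Q = R + GAMMA * EV
                V = Q.max(axis=1)
            return Q.argmax(axis=1)

        def sample_mdp():
            # Hypothetical uncertainty: one Beta-distributed transition probability, noisy rewards
            p = rng.beta(8, 2)                                   # P(stay well | action 0)
            P = np.array([[[p, 1 - p], [0.0, 1.0]],              # action 0 ("treat")
                          [[0.6, 0.4], [0.0, 1.0]]])             # action 1 ("wait")
            R = np.array([[rng.normal(10.0, 1.0), rng.normal(8.0, 1.0)],   # rewards in state 0
                          [0.0, 0.0]])                                      # state 1 is absorbing
            return P, R

        base_case_policy = np.array([0, 0])   # assumed base-case recommendation
        n_samples = 1000
        agree = sum(np.array_equal(optimal_policy(*sample_mdp()), base_case_policy)
                    for _ in range(n_samples))
        print(f"Probability the base-case policy stays optimal: {agree / n_samples:.2f}")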

  4. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    PubMed Central

    Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users’ privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from the mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former obtains the probabilities of arriving at target locations along simple paths that include only current locations, target locations, and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conducted extensive experiments, which verified the correctness and flexibility of our proposed algorithm. PMID:27508502
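
    The n-step prediction step reduces to normalizing the single-step transition counts and raising the resulting matrix to the n-th power. A minimal Python sketch follows, with made-up counts over three anonymized regions (the data are illustrative, not from the paper).

        import numpy as np

        # Hypothetical single-step transition counts between three anonymized regions
        counts = np.array([[30.0,  5.0,  5.0],
                           [10.0, 20.0, 10.0],
                           [ 2.0,  8.0, 30.0]])

        P1 = counts / counts.sum(axis=1, keepdims=True)   # row-normalized transition matrix
        P3 = np.linalg.matrix_power(P1, 3)                # 3-step transition probabilities

        current_location = 0
        print("3-step arrival probabilities from region 0:", P3[current_location])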

  5. Effects of scalding method and sequential tanks on broiler processing wastewater loadings

    USDA-ARS?s Scientific Manuscript database

    The effects of scalding time and temperature, and of sequential scalding tanks, were evaluated based on their impact on poultry processing wastewater (PPW) stream loading rates following the slaughter of commercially raised broilers. On 3 separate weeks (trials), broilers were obtained following feed withdrawa...

  6. Phosphorus concentrations in sequentially fractionated soil samples as affected by digestion methods

    USDA-ARS?s Scientific Manuscript database

    Sequential fractionation has been used for several decades for improving our understanding on the effects of agricultural practices and management on the lability and bioavailability of phosphorus in soil, manure, and other soil amendments. Nevertheless, there have been no reports on how manipulatio...

  7. THRESHOLD ELEMENTS AND THE DESIGN OF SEQUENTIAL SWITCHING NETWORKS.

    DTIC Science & Technology

    The report covers research performed from March 1966 to March 1967. The major topics treated are: (1) methods for finding weight-threshold vectors...that realize a given switching function in multi-threshold linear logic; (2) synthesis of sequential machines by means of shift registers and simple

  8. Effects of sequential and discrete rapid naming on reading in Japanese children with reading difficulty.

    PubMed

    Wakamiya, Eiji; Okumura, Tomohito; Nakanishi, Makoto; Takeshita, Takashi; Mizuta, Mekumi; Kurimoto, Naoko; Tamai, Hiroshi

    2011-06-01

    To clarify whether rapid naming ability itself is a main underpinning factor of rapid automatized naming tests (RAN) and how deep an influence the discrete decoding process has on reading, we performed discrete naming tasks and discrete hiragana reading tasks as well as sequential naming tasks and sequential hiragana reading tasks with 38 Japanese schoolchildren with reading difficulty. There were high correlations between both discrete and sequential hiragana reading and sentence reading, suggesting that some mechanism which automatizes hiragana reading makes sentence reading fluent. In object and color tasks, there were moderate correlations between sentence reading and sequential naming, and between sequential naming and discrete naming. But no correlation was found between reading tasks and discrete naming tasks. The influence of rapid naming ability of objects and colors upon reading seemed relatively small, and multi-item processing may work in relation to these. In contrast, in the digit naming task there was moderate correlation between sentence reading and discrete naming, while no correlation was seen between sequential naming and discrete naming. There was moderate correlation between reading tasks and sequential digit naming tasks. Digit rapid naming ability has more direct effect on reading while its effect on RAN is relatively limited. The ratio of how rapid naming ability influences RAN and reading seems to vary according to kind of the stimuli used. An assumption about components in RAN which influence reading is discussed in the context of both sequential processing and discrete naming speed. Copyright © 2010 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.

  9. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    PubMed Central

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate the analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, the extremum points of the metamodel and the minimum points of a density function; more accurate metamodels are then constructed by repeating this procedure. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
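
    The core loop (fit a metamodel, add a new sample at an interesting point, refit) can be sketched in a few lines. The example below uses a plain Gaussian RBF interpolant on a 1-D test function and infills at the metamodel minimum; it illustrates the general idea only and does not reproduce the paper's specific infill criteria (in practice the kernel system may also need regularization).

        import numpy as np

        def rbf_fit(x, y, eps=2.0):
            """Fit a Gaussian RBF interpolant and return a callable predictor."""
            K = np.exp(-(eps * (x[:, None] - x[None, :])) ** 2)
            w = np.linalg.solve(K, y)
            return lambda q: np.exp(-(eps * (q[:, None] - x[None, :])) ** 2) @ w

        f = lambda x: np.sin(3 * x) + 0.5 * x            # stand-in for an expensive simulation
        x = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # initial design
        grid = np.linspace(0.0, 2.0, 401)

        for _ in range(5):                               # sequential infill iterations
            model = rbf_fit(x, f(x))
            # candidate points not yet sampled, to keep the kernel matrix nonsingular
            candidates = grid[np.all(np.abs(grid[:, None] - x[None, :]) > 1e-6, axis=1)]
            x = np.append(x, candidates[np.argmin(model(candidates))])

        print("final design points:", np.round(np.sort(x), 3))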

  10. Model Order Reduction of Aeroservoelastic Model of Flexible Aircraft

    NASA Technical Reports Server (NTRS)

    Wang, Yi; Song, Hongjun; Pant, Kapil; Brenner, Martin J.; Suh, Peter

    2016-01-01

    This paper presents a holistic model order reduction (MOR) methodology and framework that integrates the key technological elements of sequential model reduction, consistent model representation, and model interpolation for constructing high-quality linear parameter-varying (LPV) aeroservoelastic (ASE) reduced order models (ROMs) of flexible aircraft. The sequential MOR encapsulates a suite of reduction techniques, such as truncation and residualization, modal reduction, and balanced realization and truncation, to achieve optimal ROMs at grid points across the flight envelope. Consistency in state representation among local ROMs is obtained by the novel method of common subspace reprojection. Model interpolation is then exploited to stitch the ROMs at grid points together to build a global LPV ASE ROM applicable to arbitrary flight conditions. The MOR method is applied to the X-56A MUTT vehicle with flexible wing being tested at NASA/AFRC for flutter suppression and gust load alleviation. Our studies demonstrated that, relative to the full-order model, the X-56A ROM can accurately and reliably capture vehicle dynamics at various flight conditions in the target frequency regime while the number of states in the ROM is reduced by roughly 10X (from 180 to 19), and hence holds great promise for robust ASE controller synthesis and novel vehicle design.

  11. Multi-GPU maximum entropy image synthesis for radio astronomy

    NASA Astrophysics Data System (ADS)

    Cárcamo, M.; Román, P. E.; Casassus, S.; Moral, V.; Rannou, F. R.

    2018-01-01

    The maximum entropy method (MEM) is a well known deconvolution technique in radio interferometry. This method solves a non-linear optimization problem with an entropy regularization term. Other heuristics such as CLEAN are faster but highly user dependent. Nevertheless, MEM has the following advantages: it is unsupervised, it has a statistical basis, and it has better resolution and better image quality under certain conditions. This work presents a high performance GPU version of non-gridding MEM, which is tested using real and simulated data. We propose a single-GPU and a multi-GPU implementation for single and multi-spectral data, respectively. We also make use of the Peer-to-Peer and Unified Virtual Addressing features of newer GPUs, which allow multiple GPUs to be exploited transparently and efficiently. Several ALMA data sets are used to demonstrate the effectiveness in imaging and to evaluate GPU performance. The results show that speedups of 1000 to 5000 times over a sequential version can be achieved, depending on data and image size. This allows the HD142527 CO(6-5) short-baseline data set to be reconstructed in 2.1 min, instead of the 2.5 days taken by the sequential CPU version.

  12. A new strategy to improve the cost-effectiveness of human immunodeficiency virus, hepatitis B virus, hepatitis C virus, and syphilis testing of blood donations in sub-Saharan Africa: a pilot study in Burkina Faso.

    PubMed

    Kania, Dramane; Sangaré, Lassana; Sakandé, Jean; Koanda, Abdoulaye; Nébié, Yacouba Kompingnin; Zerbo, Oumarou; Combasséré, Alain Wilfried; Guissou, Innocent Pierre; Rouet, François

    2009-10-01

    In Africa, where blood-borne agents are highly prevalent, cheaper and feasible alternative strategies for testing blood donations are specifically required. From May to August 2002, 500 blood donations from Burkina Faso were tested for hepatitis B surface antigen (HBsAg), human immunodeficiency virus (HIV), syphilis, and hepatitis C virus (HCV) according to two distinct strategies. The first strategy was a conventional simultaneous screening of these four blood-borne infectious agents on each blood donation using single-marker assays. The second strategy was a sequential screening starting with HBsAg. HBsAg-nonreactive blood donations were then further tested for HIV. If nonreactive, they were further tested for syphilis. If nonreactive, they were finally assessed for HCV antibodies. The accuracy and cost-effectiveness of the two strategies were compared. Using the simultaneous strategy, the seroprevalences of HBsAg, HIV, syphilis, and HCV among blood donors in Ouagadougou were estimated to be 19.2, 9.8, 1.6, and 5.2%. No significant difference in HIV, syphilis, and HCV prevalence rates was observed using the sequential strategy (9.2, 1.9, and 4.7%, respectively). Whatever the strategy used, 157 blood donations (31.4%) were found to be reactive for at least one transfusion-transmissible agent and were thus discarded. The sequential strategy allowed a cost decrease of €908.6 compared to the simultaneous strategy. Given that there are approximately 50,000 blood donations annually in Burkina Faso, the potential savings reach €90,860. In resource-limited settings, the implementation of a sequential strategy appears to be a pragmatic solution to promote a safe blood supply and ensure the sustainability of the system.
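
    The cost advantage of the sequential algorithm comes from discarding a donation reactive at an early step without running the remaining assays. The Python sketch below reproduces that logic with hypothetical per-test costs (the unit prices are assumptions, not figures from the study); the step-wise reactivity rates are those reported above.

        # Hypothetical unit costs per assay (euros) -- illustrative only
        cost = {"HBsAg": 1.5, "HIV": 2.0, "syphilis": 0.8, "HCV": 3.5}
        # Reported reactivity rates at each sequential step
        reactive = {"HBsAg": 0.192, "HIV": 0.092, "syphilis": 0.019, "HCV": 0.047}
        order = ["HBsAg", "HIV", "syphilis", "HCV"]
        n_donations = 500

        simultaneous = n_donations * sum(cost.values())

        sequential, remaining = 0.0, float(n_donations)
        for marker in order:
            sequential += remaining * cost[marker]       # only still-negative donations are tested
            remaining *= 1.0 - reactive[marker]

        print(f"simultaneous: {simultaneous:.0f} EUR, sequential: {sequential:.0f} EUR, "
              f"saving: {simultaneous - sequential:.0f} EUR")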

  13. Performance of the architect EBV antibody panel for determination of Epstein-Barr virus infection stage in immunocompetent adolescents and young adults with clinical suspicion of infectious mononucleosis.

    PubMed

    Guerrero-Ramos, Alvaro; Patel, Mauli; Kadakia, Kinjal; Haque, Tanzina

    2014-06-01

    The Architect EBV antibody panel is a new chemiluminescence immunoassay system used to determine the stage of Epstein-Barr virus (EBV) infection based on the detection of IgM and IgG antibodies to viral capsid antigen (VCA) and IgG antibodies against Epstein-Barr nuclear antigen 1 (EBNA-1). We evaluated its diagnostic accuracy in immunocompetent adolescents and young adults with clinical suspicion of infectious mononucleosis (IM) using the RecomLine EBV IgM and IgG immunoblots as the reference standard. In addition, the use of the antibody panel in a sequential testing algorithm based on initial EBNA-1 IgG analysis was assessed for cost-effectiveness. Finally, we investigated the degree of cross-reactivity of the VCA IgM marker during other primary viral infections that may present with an EBV IM-like picture. High sensitivity (98.3% [95% confidence interval {CI}, 90.7 to 99.7%]) and specificity (94.2% [95% CI, 87.9 to 97.8%]) were found after testing 162 precharacterized archived serum samples. There was perfect agreement between the use of the antibody panel in sequential and parallel testing algorithms, but substantial cost savings (23%) were obtained with the sequential strategy. A high rate of reactive VCA IgM results was found in primary cytomegalovirus (CMV) infections (60.7%). In summary, the Architect EBV antibody panel performs satisfactorily in the investigation of EBV IM in immunocompetent adolescents and young adults, and the application of an EBNA-1 IgG-based sequential testing algorithm is cost-effective in this diagnostic setting. Concomitant testing for CMV is strongly recommended to aid in the interpretation of EBV serological patterns. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  14. Performance of the Architect EBV Antibody Panel for Determination of Epstein-Barr Virus Infection Stage in Immunocompetent Adolescents and Young Adults with Clinical Suspicion of Infectious Mononucleosis

    PubMed Central

    Patel, Mauli; Kadakia, Kinjal; Haque, Tanzina

    2014-01-01

    The Architect EBV antibody panel is a new chemiluminescence immunoassay system used to determine the stage of Epstein-Barr virus (EBV) infection based on the detection of IgM and IgG antibodies to viral capsid antigen (VCA) and IgG antibodies against Epstein-Barr nuclear antigen 1 (EBNA-1). We evaluated its diagnostic accuracy in immunocompetent adolescents and young adults with clinical suspicion of infectious mononucleosis (IM) using the RecomLine EBV IgM and IgG immunoblots as the reference standard. In addition, the use of the antibody panel in a sequential testing algorithm based on initial EBNA-1 IgG analysis was assessed for cost-effectiveness. Finally, we investigated the degree of cross-reactivity of the VCA IgM marker during other primary viral infections that may present with an EBV IM-like picture. High sensitivity (98.3% [95% confidence interval {CI}, 90.7 to 99.7%]) and specificity (94.2% [95% CI, 87.9 to 97.8%]) were found after testing 162 precharacterized archived serum samples. There was perfect agreement between the use of the antibody panel in sequential and parallel testing algorithms, but substantial cost savings (23%) were obtained with the sequential strategy. A high rate of reactive VCA IgM results was found in primary cytomegalovirus (CMV) infections (60.7%). In summary, the Architect EBV antibody panel performs satisfactorily in the investigation of EBV IM in immunocompetent adolescents and young adults, and the application of an EBNA-1 IgG-based sequential testing algorithm is cost-effective in this diagnostic setting. Concomitant testing for CMV is strongly recommended to aid in the interpretation of EBV serological patterns. PMID:24695777

  15. Plane-Based Sampling for Ray Casting Algorithm in Sequential Medical Images

    PubMed Central

    Lin, Lili; Chen, Shengyong; Shao, Yan; Gu, Zichun

    2013-01-01

    This paper proposes a plane-based sampling method to improve the traditional Ray Casting Algorithm (RCA) for the fast reconstruction of a three-dimensional biomedical model from sequential images. In the novel method, the optical properties of all sampling points depend on the intersection points when a ray travels through an equidistant parallel plane cluster of the volume dataset. The results show that the method improves the rendering speed by more than three times compared with the conventional algorithm, while the image quality is well guaranteed. PMID:23424608

  16. Sequential x-ray diffraction topography at 1-BM x-ray optics testing beamline at the advanced photon source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoupin, Stanislav, E-mail: sstoupin@aps.anl.gov; Shvyd’ko, Yuri; Trakhtenberg, Emil

    2016-07-27

    We report progress on implementation and commissioning of sequential X-ray diffraction topography at 1-BM Optics Testing Beamline of the Advanced Photon Source to accommodate growing needs of strain characterization in diffractive crystal optics and other semiconductor single crystals. The setup enables evaluation of strain in single crystals in the nearly-nondispersive double-crystal geometry. Si asymmetric collimator crystals of different crystallographic orientations were designed, fabricated and characterized using in-house capabilities. Imaging the exit beam using digital area detectors permits rapid sequential acquisition of X-ray topographs at different angular positions on the rocking curve of a crystal under investigation. Results on sensitivity and spatial resolution are reported based on experiments with high-quality Si and diamond crystals. The new setup complements laboratory-based X-ray topography capabilities of the Optics group at the Advanced Photon Source.

  17. A Predictive Model for Medical Events Based on Contextual Embedding of Temporal Sequences

    PubMed Central

    Wang, Zhimu; Huang, Yingxiang; Wang, Shuang; Wang, Fei; Jiang, Xiaoqian

    2016-01-01

    Background Medical concepts are inherently ambiguous and error-prone due to human fallibility, which makes it hard for them to be fully used by classical machine learning methods (eg, for tasks like early stage disease prediction). Objective Our objective was to create a new machine-friendly representation that resembles the semantics of medical concepts. We then developed a sequential predictive model for medical events based on this new representation. Methods We developed novel contextual embedding techniques to combine different medical events (eg, diagnoses, prescriptions, and lab tests). Each medical event is converted into a numerical vector that resembles its “semantics,” via which the similarity between medical events can be easily measured. We developed simple and effective predictive models based on these vectors to predict novel diagnoses. Results We evaluated our sequential prediction model (and standard learning methods) in estimating the risk of potential diseases based on our contextual embedding representation. Our model achieved an area under the receiver operating characteristic (ROC) curve (AUC) of 0.79 on chronic systolic heart failure and an average AUC of 0.67 (over the 80 most common diagnoses) using the Medical Information Mart for Intensive Care III (MIMIC-III) dataset. Conclusions We propose a general early prognosis predictor for 80 different diagnoses. Our method computes a numeric representation for each medical event to uncover the potential meaning of those events. Our results demonstrate the efficiency of the proposed method, which will benefit patients and physicians by offering more accurate diagnoses. PMID:27888170
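
    The central operation, measuring similarity between embedded medical events, is a plain vector comparison. The Python sketch below is purely hypothetical: the event names and vectors are invented, whereas the paper's embeddings are learned from EHR co-occurrence.

        import numpy as np

        # Invented 4-dimensional embeddings for three medical events
        embeddings = {
            "dx:heart_failure": np.array([0.9, 0.1, 0.4, 0.0]),
            "rx:furosemide":    np.array([0.8, 0.2, 0.5, 0.1]),
            "lab:glucose_high": np.array([0.1, 0.9, 0.0, 0.3]),
        }

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        query = "dx:heart_failure"
        for event, vec in embeddings.items():
            if event != query:
                print(query, "vs", event, "->", round(cosine(embeddings[query], vec), 3))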

  18. Computer-aided target tracking in motion analysis studies

    NASA Astrophysics Data System (ADS)

    Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.

    1990-08-01

    Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.

  19. A Simulation Approach to Assessing Sampling Strategies for Insect Pests: An Example with the Balsam Gall Midge

    PubMed Central

    Carleton, R. Drew; Heard, Stephen B.; Silk, Peter J.

    2013-01-01

    Estimation of pest density is a basic requirement for integrated pest management in agriculture and forestry, and efficiency in density estimation is a common goal. Sequential sampling techniques promise efficient sampling, but their application can involve cumbersome mathematics and/or intensive warm-up sampling when pests have complex within- or between-site distributions. We provide tools for assessing the efficiency of sequential sampling and of alternative, simpler sampling plans, using computer simulation with “pre-sampling” data. We illustrate our approach using data for balsam gall midge (Paradiplosis tumifex) attack in Christmas tree farms. Paradiplosis tumifex proved recalcitrant to sequential sampling techniques. Midge distributions could not be fit by a common negative binomial distribution across sites. Local parameterization, using warm-up samples to estimate the clumping parameter k for each site, performed poorly: k estimates were unreliable even for samples of n∼100 trees. These methods were further confounded by significant within-site spatial autocorrelation. Much simpler sampling schemes, involving random or belt-transect sampling to preset sample sizes, were effective and efficient for P. tumifex. Sampling via belt transects (through the longest dimension of a stand) was the most efficient, with sample means converging on true mean density for sample sizes of n∼25–40 trees. Pre-sampling and simulation techniques provide a simple method for assessing sampling strategies for estimating insect infestation. We suspect that many pests will resemble P. tumifex in challenging the assumptions of sequential sampling methods. Our software will allow practitioners to optimize sampling strategies before they are brought to real-world applications, while potentially avoiding the need for the cumbersome calculations required for sequential sampling methods. PMID:24376556
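
    The pre-sampling simulation idea can be reproduced with a few lines of NumPy: draw negative binomial counts with a given mean and clumping parameter k, then watch how the sampling distribution of the mean tightens as the sample size grows. The parameter values below are illustrative, not those of the midge data.

        import numpy as np

        rng = np.random.default_rng(42)

        def nb_counts(mean, k, size):
            """Negative binomial counts (mean, clumping k) via a gamma-Poisson mixture."""
            lam = rng.gamma(shape=k, scale=mean / k, size=size)   # variance = mean + mean**2 / k
            return rng.poisson(lam)

        true_mean, k = 12.0, 1.5      # illustrative density and clumping
        for n in (10, 25, 40, 100):
            estimates = [nb_counts(true_mean, k, n).mean() for _ in range(2000)]
            print(f"n={n:3d}: mean of estimates {np.mean(estimates):5.2f}, SD {np.std(estimates):4.2f}")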

  20. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    NASA Astrophysics Data System (ADS)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents a reliability-based sequential optimization (RBSO) method to solve the trajectory optimization problem with parametric uncertainties in the entry dynamics for a Mars entry mission. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, a modified sequential optimization method, in which the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method are employed, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method contributes to the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and to the efficient approximation of the trajectory solution. The MPP method, which assesses the reliability of constraint satisfaction only up to the necessary level, is employed to further improve the computational efficiency. The cycle comprising SO, reliability assessment and constraint updates is repeated in the RBSO until the reliability requirements for constraint satisfaction are satisfied. Finally, the RBSO is compared with the traditional DO and with a traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and efficiency of the proposed method.
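
    Non-intrusive PCE in its simplest form is a least-squares fit of orthogonal polynomials in the uncertain parameter to sampled model outputs. The Python sketch below builds a 1-D Hermite chaos surrogate for a toy response; it is only a schematic of the idea, not the entry-dynamics model of the paper.

        import numpy as np
        from numpy.polynomial.hermite_e import hermevander

        rng = np.random.default_rng(7)

        def response(xi):
            """Toy 'expensive' model driven by a standard-normal uncertain parameter."""
            return np.exp(0.3 * xi) + 0.1 * xi**2

        degree, n_samples = 4, 200
        xi = rng.standard_normal(n_samples)          # samples of the germ
        Psi = hermevander(xi, degree)                # probabilists' Hermite basis He_0..He_4
        coeffs, *_ = np.linalg.lstsq(Psi, response(xi), rcond=None)

        # The He_0 coefficient approximates the mean response; compare with Monte Carlo
        xi_mc = rng.standard_normal(200_000)
        print("PCE mean:", round(coeffs[0], 4), " MC mean:", round(response(xi_mc).mean(), 4))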

  1. The Effect of Sequential Dependence on the Sampling Distributions of KR-20, KR-21, and Split-Halves Reliabilities.

    ERIC Educational Resources Information Center

    Sullins, Walter L.

    Five-hundred dichotomously scored response patterns were generated with sequentially independent (SI) items and 500 with dependent (SD) items for each of thirty-six combinations of sampling parameters (i.e., three test lengths, three sample sizes, and four item difficulty distributions). KR-20, KR-21, and Split-Half (S-H) reliabilities were…

  2. Sequential infiltration synthesis for enhancing multiple-patterning lithography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darling, Seth B.; Elam, Jeffrey W.; Tseng, Yu-Chih

    Simplified methods of multiple-patterning photolithography using sequential infiltration synthesis to modify the photoresist such that it withstands plasma etching better than unmodified resist and replaces one or more hard masks and/or a freezing step in MPL processes including litho-etch-litho-etch photolithography or litho-freeze-litho-etch photolithography.

  3. Phosphorus concentrations in sequentially fractionated soil samples as affected by digestion methods

    USDA-ARS?s Scientific Manuscript database

    Sequential fractionation has been used for several decades for improving our understanding on the effects of agricultural practices and management on the lability and bioavailability of P in soil, manure, and other soil amendments. Nevertheless, there have been no reports on how manipulation of diff...

  4. The Motivating Language of Principals: A Sequential Transformative Strategy

    ERIC Educational Resources Information Center

    Holmes, William Tobias

    2012-01-01

    This study implemented a Sequential Transformative Mixed Methods design with teachers (as recipients) and principals (to give voice) in the examination of principal talk in two different school accountability contexts (Continuously Improving and Continuously Zigzag) using the conceptual framework of Motivating Language Theory. In phase one,…

  5. Spacecraft Data Simulator for the test of level zero processing systems

    NASA Technical Reports Server (NTRS)

    Shi, Jeff; Gordon, Julie; Mirchandani, Chandru; Nguyen, Diem

    1994-01-01

    The Microelectronic Systems Branch (MSB) at Goddard Space Flight Center (GSFC) has developed a Spacecraft Data Simulator (SDS) to support the development, test, and verification of prototype and production Level Zero Processing (LZP) systems. Based on a disk array system, the SDS is capable of generating large test data sets up to 5 Gigabytes and outputting serial test data at rates up to 80 Mbps. The SDS supports data formats including NASA Communication (Nascom) blocks, Consultative Committee for Space Data System (CCSDS) Version 1 & 2 frames and packets, and all the Advanced Orbiting Systems (AOS) services. The capability to simulate both sequential and non-sequential time-ordered downlink data streams with errors and gaps is crucial to test LZP systems. This paper describes the system architecture, hardware and software designs, and test data designs. Examples of test data designs are included to illustrate the application of the SDS.

  6. Simultaneous Versus Sequential Presentation in Testing Recognition Memory for Faces.

    PubMed

    Finley, Jason R; Roediger, Henry L; Hughes, Andrea D; Wahlheim, Christopher N; Jacoby, Larry L

    2015-01-01

    Three experiments examined the issue of whether faces could be better recognized in a simultaneous test format (2-alternative forced choice [2AFC]) or a sequential test format (yes-no). All experiments showed that when target faces were present in the test, the simultaneous procedure led to superior performance (area under the ROC curve), whether lures were high or low in similarity to the targets. However, when a target-absent condition was used in which no lures resembled the targets but the lures were similar to each other, the simultaneous procedure yielded higher false alarm rates (Experiments 2 and 3) and worse overall performance (Experiment 3). This pattern persisted even when we excluded responses that participants opted to withhold rather than volunteer. We conclude that for the basic recognition procedures used in these experiments, simultaneous presentation of alternatives (2AFC) generally leads to better discriminability than does sequential presentation (yes-no) when a target is among the alternatives. However, our results also show that the opposite can occur when there is no target among the alternatives. An important future step is to see whether these patterns extend to more realistic eyewitness lineup procedures. The pictures used in the experiment are available online at http://www.press.uillinois.edu/journals/ajp/media/testing_recognition/.

  7. Bridging the qualitative-quantitative divide: Experiences from conducting a mixed methods evaluation in the RUCAS programme.

    PubMed

    Makrakis, Vassilios; Kostoulas-Makrakis, Nelly

    2016-02-01

    Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Method of Real-Time Principal-Component Analysis

    NASA Technical Reports Server (NTRS)

    Duong, Tuan; Duong, Vu

    2005-01-01

    Dominant-element-based gradient descent and dynamic initial learning rate (DOGEDYN) is a method of sequential principal-component analysis (PCA) that is well suited for such applications as data compression and extraction of features from sets of data. In comparison with a prior method of gradient-descent-based sequential PCA, this method offers a greater rate of learning convergence. Like the prior method, DOGEDYN can be implemented in software. However, the main advantage of DOGEDYN over the prior method lies in the facts that it requires less computation and can be implemented in simpler hardware. It should be possible to implement DOGEDYN in compact, low-power, very-large-scale integrated (VLSI) circuitry that could process data in real time.

  9. Theory and procedures for finding a correct kinetic model for the bacteriorhodopsin photocycle.

    PubMed

    Hendler, R W; Shrager, R; Bose, S

    2001-04-26

    In this paper, we present the implementation and results of new methodology based on linear algebra. The theory behind these methods is covered in detail in the Supporting Information, available electronically (Shrager and Hendler). In brief, the methods presented search through all possible forward sequential submodels in order to find candidates that can be used to construct a complete model for the BR-photocycle. The methodology is limited only to forward sequential models. If no such models are compatible with the experimental data, none will be found. The procedures apply objective tests and filters to eliminate possibilities that cannot be correct, thus cutting the total number of candidate sequences to be considered. In the current application, which uses six exponentials, the total sequences were cut from 1950 to 49. The remaining sequences were further screened using known experimental criteria. The approach led to a solution which consists of a pair of sequences, one with 5 exponentials showing BR* → L(f) → M(f) → N → O → BR and the other with three exponentials showing BR* → L(s) → M(s) → BR. The deduced complete kinetic model for the BR photocycle is thus either a single photocycle branched at the L intermediate or a pair of two parallel photocycles. Reasons for preferring the parallel photocycles are presented. Synthetic data constructed on the basis of the parallel photocycles were indistinguishable from the experimental data in a number of analytical tests that were applied.

  10. Phase II design with sequential testing of hypotheses within each stage.

    PubMed

    Poulopoulou, Stavroula; Karlis, Dimitris; Yiannoutsos, Constantin T; Dafni, Urania

    2014-01-01

    The main goal of a Phase II clinical trial is to decide whether a particular therapeutic regimen is effective enough to warrant further study. The hypothesis tested by Fleming's Phase II design (Fleming, 1982) is H0: p ≤ p0 versus H1: p ≥ p1, with level α and with a power 1 - β at p = p1, where p0 is chosen to represent the response probability achievable with standard treatment and p1 is chosen such that the difference p1 - p0 represents a targeted improvement with the new treatment. This hypothesis creates a misinterpretation mainly among clinicians that rejection of the null hypothesis is tantamount to accepting the alternative, and vice versa. As mentioned by Storer (1992), this introduces ambiguity in the evaluation of type I and II errors and the choice of the appropriate decision at the end of the study. Instead of testing this hypothesis, an alternative class of designs is proposed in which two hypotheses are tested sequentially. The hypothesis H0: p ≤ p0 versus H1: p > p0 is tested first. If this null hypothesis is rejected, the hypothesis H0: p ≤ p1 versus H1: p > p1 is tested next, in order to examine whether the therapy is effective enough to consider further testing in a Phase III study. For the derivation of the proposed design the exact binomial distribution is used to calculate the decision cut-points. The optimal design parameters are chosen so as to minimize the average sample number (ASN) under specific upper bounds for error levels. The optimal values for the design were found using a simulated annealing method.
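
    For a sense of how exact binomial cut-points arise in a design of this kind, the sketch below finds, for each sequentially tested hypothesis, the smallest rejection cut-point that controls the exact type I error. It does not reproduce the paper's optimal-ASN search (which used simulated annealing), and the sample size, response rates, and error level are illustrative assumptions.

        from math import comb

        def binom_tail(n, r, p):
            """Exact P(X >= r) for X ~ Binomial(n, p)."""
            return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(r, n + 1))

        def cut_point(n, p_null, alpha):
            """Smallest response count r such that rejecting 'p <= p_null' when
            X >= r keeps the exact type I error at or below alpha."""
            for r in range(n + 1):
                if binom_tail(n, r, p_null) <= alpha:
                    return r
            return None

        # Illustrative design values (assumptions, not the paper's optimized design).
        n, p0, p1, alpha = 40, 0.20, 0.40, 0.05
        r0 = cut_point(n, p0, alpha)   # first hypothesis:  p <= p0 vs p > p0
        r1 = cut_point(n, p1, alpha)   # second hypothesis: p <= p1 vs p > p1 (only if the first is rejected)
        print(f"reject 'p <= p0' if responses >= {r0}; power at p1 = {binom_tail(n, r0, p1):.3f}")
        print(f"then reject 'p <= p1' if responses >= {r1}")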

  11. Biphasic Finite Element Modeling Reconciles Mechanical Properties of Tissue-Engineered Cartilage Constructs Across Testing Platforms.

    PubMed

    Meloni, Gregory R; Fisher, Matthew B; Stoeckl, Brendan D; Dodge, George R; Mauck, Robert L

    2017-07-01

    Cartilage tissue engineering is emerging as a promising treatment for osteoarthritis, and the field has progressed toward utilizing large animal models for proof of concept and preclinical studies. Mechanical testing of the regenerative tissue is an essential outcome for functional evaluation. However, testing modalities and constitutive frameworks used to evaluate in vitro grown samples differ substantially from those used to evaluate in vivo derived samples. To address this, we developed finite element (FE) models (using FEBio) of unconfined compression and indentation testing, modalities commonly used for such samples. We determined the model sensitivity to tissue radius and subchondral bone modulus, as well as its ability to estimate material parameters using the built-in parameter optimization tool in FEBio. We then sequentially tested agarose gels of 4%, 6%, 8%, and 10% weight/weight using a custom indentation platform, followed by unconfined compression. Similarly, we evaluated the ability of the model to generate material parameters for living constructs by evaluating engineered cartilage. Juvenile bovine mesenchymal stem cells were seeded (2 × 10^7 cells/mL) in 1% weight/volume hyaluronic acid hydrogels and cultured in a chondrogenic medium for 3, 6, and 9 weeks. Samples were planed and tested sequentially in indentation and unconfined compression. The model successfully completed parameter optimization routines for each testing modality for both acellular and cell-based constructs. Traditional outcome measures and the FE-derived outcomes showed significant changes in material properties during the maturation of engineered cartilage tissue, capturing dynamic changes in functional tissue mechanics. These outcomes were significantly correlated with one another, establishing this FE modeling approach as a singular method for the evaluation of functional engineered and native tissue regeneration, both in vitro and in vivo.

  12. Sequential strand displacement beacon for detection of DNA coverage on functionalized gold nanoparticles.

    PubMed

    Paliwoda, Rebecca E; Li, Feng; Reid, Michael S; Lin, Yanwen; Le, X Chris

    2014-06-17

    Functionalizing nanomaterials for diverse analytical, biomedical, and therapeutic applications requires determination of surface coverage (or density) of DNA on nanomaterials. We describe a sequential strand displacement beacon assay that is able to quantify specific DNA sequences conjugated or coconjugated onto gold nanoparticles (AuNPs). Unlike the conventional fluorescence assay that requires the target DNA to be fluorescently labeled, the sequential strand displacement beacon method is able to quantify multiple unlabeled DNA oligonucleotides using a single (universal) strand displacement beacon. This unique feature is achieved by introducing two short unlabeled DNA probes for each specific DNA sequence and by performing sequential DNA strand displacement reactions. Varying the relative amounts of the specific DNA sequences and spacing DNA sequences during their coconjugation onto AuNPs results in different densities of the specific DNA on AuNP, ranging from 90 to 230 DNA molecules per AuNP. Results obtained from our sequential strand displacement beacon assay are consistent with those obtained from the conventional fluorescence assays. However, labeling of DNA with some fluorescent dyes, e.g., tetramethylrhodamine, alters DNA density on AuNP. The strand displacement strategy overcomes this problem by obviating direct labeling of the target DNA. This method has broad potential to facilitate more efficient design and characterization of novel multifunctional materials for diverse applications.

  13. Enhanced Performance of Perovskite CH3NH3PbI3 Solar Cell by Using CH3NH3I as Additive in Sequential Deposition.

    PubMed

    Xie, Yian; Shao, Feng; Wang, Yaoming; Xu, Tao; Wang, Deliang; Huang, Fuqiang

    2015-06-17

    Sequential deposition is a widely adopted method to prepare CH3NH3PbI3 on mesostructured TiO2 electrodes for organic lead halide perovskite solar cells. However, this method often suffers from uncontrollable crystal size, surface morphology, and residual PbI2 in the resulting CH3NH3PbI3, which are all detrimental to the device performance. We herein present an optimized sequential solution deposition method by introducing different amounts of CH3NH3I into the PbI2 precursor solution in the first step to prepare the CH3NH3PbI3 absorber on mesoporous TiO2 substrates. The addition of CH3NH3I to the PbI2 precursor solution can affect the crystallization and composition of the PbI2 raw films, resulting in variation of the UV-vis absorption and surface morphology. Proper addition of CH3NH3I not only enhances the absorption but also improves the efficiency of CH3NH3PbI3 solar cells from 11.13% to 13.37%. Photoluminescence spectra suggest that the improvement of device performance is attributed to a decrease in the recombination rate of carriers in the CH3NH3PbI3 absorber. This method provides a highly repeatable route for enhancing the efficiency of CH3NH3PbI3 solar cells prepared by sequential solution deposition.

  14. Sequential Change of Wound Calculated by Image Analysis Using a Color Patch Method during a Secondary Intention Healing.

    PubMed

    Yang, Sejung; Park, Junhee; Lee, Hanuel; Kim, Soohyun; Lee, Byung-Uk; Chung, Kee-Yang; Oh, Byungho

    2016-01-01

    Photographs of skin wounds carry the most important information during secondary intention healing (SIH); however, there is no standard method for handling those images and analyzing them efficiently and conveniently. The aim was to investigate the sequential changes of SIH depending on the body site using a color patch method. We performed retrospective reviews of 30 patients (11 facial and 19 non-facial areas) who underwent SIH for the restoration of skin defects and captured sequential photographs with a color patch specially designed for automatically calculating defect and scar sizes. Using this image analysis method, skin defects were calculated more accurately (range of error rate: -3.39% to +3.05%). All patients had a smaller scar size than the original defect size after SIH treatment (rates of decrease: 18.8% to 86.1%), and the facial area showed a significantly higher decrease rate than non-facial areas such as the scalp and extremities (67.05 ± 12.48 vs. 53.29 ± 18.11, P < 0.05). Estimating the date corresponding to half of the final decrement, all facial areas showed improvement within two weeks (8.45 ± 3.91 days), whereas non-facial areas needed 14.33 ± 9.78 days. Based on these sequential changes of skin defects, SIH can be recommended as an alternative treatment method for restoration, with more careful dressing during the initial two weeks.
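
    At its core, a color patch of known size acts as an in-image scale reference, so wound area follows from a pixel ratio; the minimal sketch below shows that conversion only (the patch size, pixel counts, and function name are hypothetical, and the authors' automatic patch detection and color correction are not reproduced).

        def area_from_patch(region_pixels, patch_pixels, patch_area_cm2=4.0):
            """Convert a segmented region's pixel count to cm^2 using a color patch of
            known physical area photographed in the same plane as the wound."""
            return region_pixels * (patch_area_cm2 / patch_pixels)

        # Hypothetical pixel counts for a defect photo and a follow-up scar photo.
        defect_cm2 = area_from_patch(region_pixels=15230, patch_pixels=4870)
        scar_cm2 = area_from_patch(region_pixels=2050, patch_pixels=4790)
        print(f"defect {defect_cm2:.1f} cm^2 -> scar {scar_cm2:.1f} cm^2 "
              f"({100 * (1 - scar_cm2 / defect_cm2):.0f}% decrease)")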

  15. Acute Oral Toxicity Up-And-Down-Procedure

    EPA Pesticide Factsheets

    The Up-and-Down Procedure is an alternative acute toxicity test that provides a way to determine the toxicity of chemicals with fewer test animals by using sequential dosing steps. Find out about this test procedure.
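
    A highly simplified sketch of the sequential dosing logic behind an up-and-down design is given below: each animal's outcome moves the next dose up or down by a fixed progression factor. The progression factor, stopping rule, and outcome model are illustrative assumptions and do not reproduce the official test guideline's stopping criteria.

        def up_and_down(start_dose, factor, outcome, max_animals=15):
            """Run a simple up-and-down dosing sequence.
            outcome(dose) -> True if the animal dies at that dose."""
            dose, history = start_dose, []
            for _ in range(max_animals):
                died = outcome(dose)
                history.append((dose, died))
                dose = dose / factor if died else dose * factor
                # Simplified stopping rule: stop after four reversals of direction.
                reversals = sum(1 for a, b in zip(history, history[1:]) if a[1] != b[1])
                if reversals >= 4:
                    break
            return history

        # Hypothetical outcome model: deaths become certain above 300 mg/kg.
        for dose, died in up_and_down(start_dose=175, factor=3.2, outcome=lambda d: d > 300):
            print(f"{dose:7.1f} mg/kg -> {'death' if died else 'survival'}")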

  16. Space-Time Fluid-Structure Interaction Computation of Flapping-Wing Aerodynamics

    DTIC Science & Technology

    2013-12-01

    The fluid mechanics computations use the ST-VMS method in combination with the ST-SUPS method. The structural mechanics computations are based on the Kirchhoff–Love shell model. We use a sequential coupling technique, which is applicable to some classes of FSI problems.

  17. Risk-adjusted sequential probability ratio tests: applications to Bristol, Shipman and adult cardiac surgery.

    PubMed

    Spiegelhalter, David; Grigg, Olivia; Kinsman, Robin; Treasure, Tom

    2003-02-01

    To investigate the use of the risk-adjusted sequential probability ratio test in monitoring the cumulative occurrence of adverse clinical outcomes. Retrospective analysis of three longitudinal datasets. Patients aged 65 years and over under the care of Harold Shipman between 1979 and 1997, patients under 1 year of age undergoing paediatric heart surgery in Bristol Royal Infirmary between 1984 and 1995, adult patients receiving cardiac surgery from a team of cardiac surgeons in London, UK. Annual and 30-day mortality rates. Using reasonable boundaries, the procedure could have indicated an 'alarm' in Bristol after publication of the 1991 Cardiac Surgical Register, and in 1985 or 1997 for Harold Shipman depending on the data source and the comparator. The cardiac surgeons showed no significant deviation from expected performance. The risk-adjusted sequential probability ratio test is simple to implement, can be applied in a variety of contexts, and might have been useful to detect specific instances of past divergent performance. The use of this and related techniques deserves further attention in the context of prospectively monitoring adverse clinical outcomes.
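
    The statistic behind a risk-adjusted SPRT can be sketched as a cumulative log-likelihood ratio that compares each patient's observed outcome with their predicted risk, under an alternative expressed as a fixed odds ratio. The boundaries, odds ratio, error rates, and data below are illustrative assumptions, not values taken from the paper.

        from math import log

        def ra_sprt(outcomes, risks, odds_ratio=2.0, a=0.05, b=0.20):
            """Cumulative risk-adjusted SPRT statistic for testing odds ratio 1 vs odds_ratio.
            outcomes: 1 = adverse event, 0 = none; risks: predicted event probabilities.
            Returns the running statistic and a signal ('alarm', 'accept' or None)."""
            upper = log((1 - b) / a)     # Wald-type boundaries (illustrative error rates)
            lower = log(b / (1 - a))
            llr, trace = 0.0, []
            for y, p in zip(outcomes, risks):
                llr += y * log(odds_ratio) - log(1 - p + odds_ratio * p)
                trace.append(llr)
                if llr >= upper:
                    return trace, "alarm"    # evidence of elevated odds of the adverse outcome
                if llr <= lower:
                    return trace, "accept"   # evidence consistent with expected performance
            return trace, None

        # Hypothetical monitored series: predicted risks and observed 30-day outcomes.
        risks = [0.05, 0.10, 0.02, 0.20, 0.08, 0.15, 0.03, 0.30, 0.12, 0.07]
        outcomes = [0, 1, 0, 1, 0, 1, 0, 1, 1, 0]
        trace, signal = ra_sprt(outcomes, risks)
        print([round(v, 2) for v in trace], signal)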

  18. A sequential adaptation technique and its application to the Mark 12 IFF system

    NASA Astrophysics Data System (ADS)

    Bailey, John S.; Mallett, John D.; Sheppard, Duane J.; Warner, F. Neal; Adams, Robert

    1986-07-01

    Sequential adaptation uses only two sets of receivers, correlators, and A/D converters, which are time-multiplexed to effect spatial adaptation in a system with N adaptive degrees of freedom. This technique can substantially reduce the hardware cost over what is realizable in a parallel architecture. A three-channel L-band version of the sequential adapter was built and tested for use with the MARK XII IFF (identification, friend or foe) system. In this system, the sequentially determined adaptive weights were obtained digitally but implemented at RF. As a result, many of the post-RF hardware-induced sources of error that normally limit cancellation, such as receiver mismatch, are removed by the feedback property. The result is a system that can yield high levels of cancellation and be readily retrofitted to currently fielded equipment.

  19. Biohydrogen and methane production via a two-step process using an acid pretreated native microalgae consortium.

    PubMed

    Carrillo-Reyes, Julian; Buitrón, Germán

    2016-12-01

    A native microalgae consortium treated under thermal-acidic hydrolysis was used to produce hydrogen and methane in a two-step sequential process. Different acid concentrations were tested, generating hydrogen and methane yields of up to 45 mL H2 gVS(-1) and 432 mL CH4 gVS(-1), respectively. The hydrogen production step solubilized the particulate COD (chemical oxygen demand) up to 30%, creating considerable amounts of volatile fatty acids (up to 10 g COD L(-1)). It was observed that lower acid concentration presented higher hydrogen and methane production potential. The results revealed that thermal acid hydrolysis of a native microalgae consortium is a simple but effective strategy for producing hydrogen and methane in the sequential process. In addition to COD removal (50-70%), this method resulted in an energy recovery of up to 15.9 kJ per g of volatile solids of microalgae biomass, one of the highest reported. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Visual detection and sequential injection determination of aluminium using a cinnamoyl derivative.

    PubMed

    Elečková, Lenka; Alexovič, Michal; Kuchár, Juraj; Balogh, Ioseph S; Andruch, Vasil

    2015-02-01

    A cinnamoyl derivative, 3-[4-(dimethylamino)cinnamoyl]-4-hydroxy-6-methyl-3,4-2H-pyran-2-one, was used as a ligand for the determination of aluminium. Upon the addition of an acetonitrile solution of the ligand to an aqueous solution containing Al(III) and a buffer solution at pH 8, a marked change in colour from yellow to orange is observed. The colour intensity is proportional to the concentration of Al(III); thus, the 'naked-eye' detection of aluminium is possible. The reaction is also applied for sequential injection determination of aluminium. Beer's law is obeyed in the range from 0.055 to 0.66 mg L(-1) of Al(III). The limit of detection, calculated as three times the standard deviation of the blank test (n=10), was found to be 4 μg L(-1) for Al(III). The method was applied for the determination of aluminium in spiked water samples and pharmaceutical preparations. Copyright © 2014 Elsevier B.V. All rights reserved.
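
    The reported working range and detection limit follow from a straight-line calibration and the three-times-the-blank-standard-deviation convention; a minimal sketch of that arithmetic, with invented absorbance readings, is shown below (statistics.linear_regression requires Python 3.10+).

        import statistics

        # Hypothetical calibration: Al(III) standards (mg/L) and absorbance readings.
        conc = [0.055, 0.10, 0.20, 0.33, 0.50, 0.66]
        absorbance = [0.021, 0.038, 0.077, 0.126, 0.192, 0.252]
        blank_reads = [0.0009, 0.0013, 0.0008, 0.0011, 0.0010,
                       0.0012, 0.0009, 0.0014, 0.0010, 0.0011]   # n = 10 blank tests

        slope, intercept = statistics.linear_regression(conc, absorbance)
        lod = 3 * statistics.stdev(blank_reads) / slope           # LOD as 3 x SD(blank) / slope
        print(f"slope = {slope:.3f} AU per mg/L, intercept = {intercept:.4f}")
        print(f"LOD ~ {1000 * lod:.1f} ug/L Al(III)")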

  1. 16 CFR 1500.42 - Test for eye irritants.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., including testing that does not require animals, are presented in the CPSC's animal testing policy set forth... conducted, a sequential testing strategy is recommended to reduce the number of test animals. Additionally... eye irritation. Both eyes of each animal in the test group shall be examined before testing, and only...

  2. An iterative particle filter approach for coupled hydro-geophysical inversion of a controlled infiltration experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manoli, Gabriele, E-mail: manoli@dmsa.unipd.it; Nicholas School of the Environment, Duke University, Durham, NC 27708; Rossi, Matteo

    The modeling of unsaturated groundwater flow is affected by a high degree of uncertainty related to both measurement and model errors. Geophysical methods such as Electrical Resistivity Tomography (ERT) can provide useful indirect information on the hydrological processes occurring in the vadose zone. In this paper, we propose and test an iterated particle filter method to solve the coupled hydrogeophysical inverse problem. We focus on an infiltration test monitored by time-lapse ERT and modeled using Richards equation. The goal is to identify hydrological model parameters from ERT electrical potential measurements. Traditional uncoupled inversion relies on the solution of two sequential inverse problems, the first one applied to the ERT measurements, the second one to Richards equation. This approach does not ensure an accurate quantitative description of the physical state, typically violating mass balance. To avoid one of these two inversions and incorporate in the process more physical simulation constraints, we cast the problem within the framework of a SIR (Sequential Importance Resampling) data assimilation approach that uses a Richards equation solver to model the hydrological dynamics and a forward ERT simulator combined with Archie's law to serve as measurement model. ERT observations are then used to update the state of the system as well as to estimate the model parameters and their posterior distribution. The limitations of the traditional sequential Bayesian approach are investigated and an innovative iterative approach is proposed to estimate the model parameters with high accuracy. The numerical properties of the developed algorithm are verified on both homogeneous and heterogeneous synthetic test cases based on a real-world field experiment.
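
    The SIR (Sequential Importance Resampling) machinery named above can be sketched generically: weight each particle by the likelihood of the new observation, then resample. The sketch below replaces the Richards-equation and ERT forward models with a placeholder linear map and invented data, so it illustrates only the weight-update and resampling mechanics, not the authors' iterated coupled inversion.

        import random
        from math import exp

        def sir_step(particles, observe, measurement, noise_sd, rng):
            """One SIR assimilation step: weight particles by the likelihood of the
            measurement, then resample in proportion to those weights."""
            weights = [exp(-0.5 * ((measurement - observe(x)) / noise_sd) ** 2) for x in particles]
            total = sum(weights) or 1e-300
            weights = [w / total for w in weights]
            return rng.choices(particles, weights=weights, k=len(particles))

        rng = random.Random(42)
        # Placeholder problem: estimate a hydraulic parameter theta from noisy 'ERT-like'
        # observations y = 2*theta + noise (both the forward map and data are invented).
        particles = [rng.uniform(0.0, 5.0) for _ in range(2000)]
        for y_obs in [4.1, 3.9, 4.3]:                          # sequential observations
            particles = sir_step(particles, observe=lambda th: 2.0 * th,
                                 measurement=y_obs, noise_sd=0.3, rng=rng)
            # Jitter after resampling to keep particle diversity (a common practical fix).
            particles = [th + rng.gauss(0, 0.05) for th in particles]
        print("posterior mean theta ~", round(sum(particles) / len(particles), 3))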

  3. Effect of rotation preference on spontaneous alternation behavior on Y maze and introduction of a new analytical method, entropy of spontaneous alternation.

    PubMed

    Bak, Jia; Pyeon, Hae-In; Seok, Jin-I; Choi, Yun-Sik

    2017-03-01

    Y maze has been used to test spatial working memory in rodents. To this end, the percentage of spontaneous alternation has been employed. Alternation indicates sequential entries into all three arms; e.g., when an animal visits all three arms clockwise or counterclockwise sequentially, alternation is achieved. Interestingly, animals have a tendency to rotate or turn to a preferred side. Thus, when an animal has a high rotation preference, this may influence their alternation behavior. Here, we have generated a new analytical method, termed entropy of spontaneous alternation, to offset the effect of rotation preference on Y maze. To validate the entropy of spontaneous alternation, we employed a free rotation test using a cylinder and a spatial working memory test on Y maze. We identified that mice showed 65.1% rotation preference on average. Importantly, the percentage of spontaneous alternation in the high preference group (more than 70% rotation to a preferred side) was significantly higher than that in the no preference group (<55%). In addition, there was a clear correlation between rotation preference on cylinder and turning preference on Y maze. On the other hand, this potential leverage effect that arose from rotation preference disappeared when the animal behavior on Y maze was analyzed with the entropy of spontaneous alternation. Further, entropy of spontaneous alternation significantly determined the loss of spatial working memory by scopolamine administration. Combined, these data indicate that the entropy of spontaneous alternation provides higher credibility when spatial working memory is evaluated using Y maze. Copyright © 2016 Elsevier B.V. All rights reserved.
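
    For concreteness, the conventional spontaneous-alternation percentage, and one plausible entropy over the observed arm-entry triads, can be computed from an entry sequence as below; the Shannon-entropy-of-triads definition is our illustrative reading, not necessarily the authors' exact formula.

        from collections import Counter
        from math import log2

        def alternation_percentage(entries):
            """Entries is the arm sequence, e.g. 'ABCBACAB'. A triad counts as an
            alternation when its three consecutive entries are all different arms."""
            triads = [entries[i:i + 3] for i in range(len(entries) - 2)]
            alternations = sum(1 for t in triads if len(set(t)) == 3)
            return 100.0 * alternations / len(triads)

        def triad_entropy(entries):
            """Shannon entropy (bits) of the distribution of consecutive arm triads."""
            triads = [entries[i:i + 3] for i in range(len(entries) - 2)]
            counts = Counter(triads)
            n = len(triads)
            return -sum((c / n) * log2(c / n) for c in counts.values())

        seq = "ABCACBABCBCACBAB"          # hypothetical Y-maze entry sequence
        print(f"{alternation_percentage(seq):.1f}% alternation, "
              f"entropy {triad_entropy(seq):.2f} bits")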

  4. Repeatability of quantitative FDG-PET/CT and contrast-enhanced CT in recurrent ovarian carcinoma: test-retest measurements for tumor FDG uptake, diameter, and volume.

    PubMed

    Rockall, Andrea G; Avril, Norbert; Lam, Raymond; Iannone, Robert; Mozley, P David; Parkinson, Christine; Bergstrom, Donald; Sala, Evis; Sarker, Shah-Jalal; McNeish, Iain A; Brenton, James D

    2014-05-15

    Repeatability of baseline FDG-PET/CT measurements has not been tested in ovarian cancer. This dual-center, prospective study assessed variation in tumor 2[18F]fluoro-2-deoxy-D-glucose (FDG) uptake, tumor diameter, and tumor volume from sequential FDG-PET/CT and contrast-enhanced computed tomography (CECT) in patients with recurrent platinum-sensitive ovarian cancer. Patients underwent two pretreatment baseline FDG-PET/CT (n = 21) and CECT (n = 20) at two clinical sites with different PET/CT instruments. Patients were included if they had at least one target lesion in the abdomen with a standardized uptake value (SUV) maximum (SUVmax) of ≥ 2.5 and a long axis diameter of ≥ 15 mm. Two independent reading methods were used to evaluate repeatability of tumor diameter and SUV uptake: on site and at an imaging clinical research organization (CRO). Tumor volume reads were only performed by CRO. In each reading set, target lesions were independently measured on sequential imaging. Median time between FDG-PET/CT was two days (range 1-7). For site reads, concordance correlation coefficients (CCC) for SUVmean, SUVmax, and tumor diameter were 0.95, 0.94, and 0.99, respectively. Repeatability coefficients were 16.3%, 17.3%, and 8.8% for SUVmean, SUVmax, and tumor diameter, respectively. Similar results were observed for CRO reads. Tumor volume CCC was 0.99 with a repeatability coefficient of 28.1%. There was excellent test-retest repeatability for FDG-PET/CT quantitative measurements across two sites and two independent reading methods. Cutoff values for determining change in SUVmean, SUVmax, and tumor volume establish limits to determine metabolic and/or volumetric response to treatment in platinum-sensitive relapsed ovarian cancer. ©2014 American Association for Cancer Research.
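
    Two of the agreement statistics quoted above, Lin's concordance correlation coefficient and a percent repeatability coefficient, can be computed from paired test-retest measurements as sketched below. The SUVmax values are invented, and the repeatability definition used here (1.96 times the SD of within-pair percentage differences) is an assumption about the convention, not a statement of the study's exact formula.

        import statistics

        def ccc(x, y):
            """Lin's concordance correlation coefficient for paired measurements."""
            n = len(x)
            mx, my = statistics.fmean(x), statistics.fmean(y)
            sxx = sum((a - mx) ** 2 for a in x) / n
            syy = sum((b - my) ** 2 for b in y) / n
            sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
            return 2 * sxy / (sxx + syy + (mx - my) ** 2)

        def repeatability_pct(x, y):
            """1.96 x SD of within-pair percentage differences (relative to the pair mean)."""
            pct = [200.0 * (a - b) / (a + b) for a, b in zip(x, y)]
            return 1.96 * statistics.stdev(pct)

        # Hypothetical SUVmax readings from two baseline scans of the same lesions.
        scan1 = [4.2, 6.8, 3.1, 9.5, 5.4, 7.7, 2.9, 8.3]
        scan2 = [4.5, 6.4, 3.3, 9.1, 5.8, 7.2, 3.0, 8.8]
        print(f"CCC = {ccc(scan1, scan2):.3f}, repeatability = {repeatability_pct(scan1, scan2):.1f}%")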

  5. Animals and the 3Rs in toxicology research and testing: The way forward.

    PubMed

    Stokes, W S

    2015-12-01

    Despite efforts to eliminate the use of animals in testing and the availability of many accepted alternative methods, animals are still widely used for toxicological research and testing. While research using in vitro and computational models has dramatically increased in recent years, such efforts have not yet measurably impacted animal use for regulatory testing and are not likely to do so for many years or even decades. Until regulatory authorities have accepted test methods that can totally replace animals and these are fully implemented, large numbers of animals will continue to be used and many will continue to experience significant pain and distress. In order to positively impact the welfare of these animals, accepted alternatives must be implemented, and efforts must be directed at eliminating pain and distress and reducing animal numbers. Animal pain and distress can be reduced by earlier predictive humane endpoints, pain-relieving medications, and supportive clinical care, while sequential testing and routine use of integrated testing and decision strategies can reduce animal numbers. Applying advances in science and technology to the development of scientifically sound alternative testing models and strategies can improve animal welfare and further reduce and replace animal use. © The Author(s) 2015.

  6. Amantadine Ameliorates Dopamine-Releasing Deficits and Behavioral Deficits in Rats after Fluid Percussion Injury

    PubMed Central

    Huang, Eagle Yi-Kung; Tsui, Pi-Fen; Kuo, Tung-Tai; Tsai, Jing-Jr.; Chou, Yu-Ching; Ma, Hsin-I; Chiang, Yung-Hsiao; Chen, Yuan-Hao

    2014-01-01

    Aims To investigate the role of dopamine in cognitive and motor learning skill deficits after a traumatic brain injury (TBI), we investigated dopamine release and behavioral changes at a series of time points after fluid percussion injury, and explored the potential of amantadine hydrochloride as a chronic treatment to provide behavioral recovery. Materials and Methods In this study, we sequentially investigated dopamine release at the striatum and behavioral changes at 1, 2, 4, 6, and 8 weeks after fluid percussion injury. Rats subjected to 6-Pa cerebral cortical fluid percussion injury were treated by using subcutaneous infusion pumps filled with either saline (sham group) or amantadine hydrochloride, with a release rate of 3.6 mg/kg/hour for 8 weeks. Dopamine release and metabolism were analyzed sequentially by fast scan cyclic voltammetry (FSCV) and high-pressure liquid chromatography (HPLC). Novel object recognition (NOR) and fixed-speed rotarod (FSRR) behavioral tests were used to determine treatment effects on cognitive and motor deficits after injury. Results Sequential dopamine-release deficits were revealed in 6-Pa-fluid-percussion cerebral cortical injured animals. The reuptake rate (tau value) of dopamine in injured animals was prolonged, but the tau value became close to the value for the control group after amantadine therapy. Cognitive and motor learning impairments were evidenced by the NOR and FSRR behavioral tests after injury. Chronic amantadine therapy reversed dopamine-release deficits, and behavioral impairments after fluid percussion injury were ameliorated in the rats treated with the amantadine infusion pumps. Conclusion Chronic treatment with amantadine hydrochloride can ameliorate dopamine-release deficits as well as cognitive and motor deficits caused by cerebral fluid-percussion injury. PMID:24497943

  7. Eyewitness decisions in simultaneous and sequential lineups: a dual-process signal detection theory analysis.

    PubMed

    Meissner, Christian A; Tredoux, Colin G; Parker, Janat F; MacLin, Otto H

    2005-07-01

    Many eyewitness researchers have argued for the application of a sequential alternative to the traditional simultaneous lineup, given its role in decreasing false identifications of innocent suspects (sequential superiority effect). However, Ebbesen and Flowe (2002) have recently noted that sequential lineups may merely bring about a shift in response criterion, having no effect on discrimination accuracy. We explored this claim, using a method that allows signal detection theory measures to be collected from eyewitnesses. In three experiments, lineup type was factorially combined with conditions expected to influence response criterion and/or discrimination accuracy. Results were consistent with signal detection theory predictions, including that of a conservative criterion shift with the sequential presentation of lineups. In a fourth experiment, we explored the phenomenological basis for the criterion shift, using the remember-know-guess procedure. In accord with previous research, the criterion shift in sequential lineups was associated with a reduction in familiarity-based responding. It is proposed that the relative similarity between lineup members may create a context in which fluency-based processing is facilitated to a greater extent when lineup members are presented simultaneously.
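
    The two signal detection quantities at issue in this debate, discrimination (d') and response criterion (c), can be computed from hit and false-alarm counts as in the sketch below; the counts are invented and the log-linear correction is one common convention, not necessarily the one used in these studies.

        from statistics import NormalDist

        z = NormalDist().inv_cdf

        def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
            """d' and criterion c from raw counts, with a log-linear correction to
            avoid infinite z-scores when a rate is 0 or 1."""
            hr = (hits + 0.5) / (hits + misses + 1)
            far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
            return z(hr) - z(far), -0.5 * (z(hr) + z(far))

        # Hypothetical counts for simultaneous vs sequential lineup conditions.
        for label, counts in [("simultaneous", (42, 18, 12, 48)), ("sequential", (33, 27, 6, 54))]:
            d, c = dprime_and_criterion(*counts)
            print(f"{label:12s}: d' = {d:.2f}, c = {c:+.2f}")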

  8. Sequential vs simultaneous encoding of spatial information: a comparison between the blind and the sighted.

    PubMed

    Ruotolo, Francesco; Ruggiero, Gennaro; Vinciguerra, Michela; Iachini, Tina

    2012-02-01

    The aim of this research is to assess whether the crucial factor in determining the characteristics of blind people's spatial mental images is concerned with the visual impairment per se or the processing style that the dominant perceptual modalities used to acquire spatial information impose, i.e. simultaneous (vision) vs sequential (kinaesthesis). Participants were asked to learn six positions in a large parking area via movement alone (congenitally blind, adventitiously blind, blindfolded sighted) or with vision plus movement (simultaneous sighted, sequential sighted), and then to mentally scan between positions in the path. The crucial manipulation concerned the sequential sighted group. Their visual exploration was made sequential by putting visual obstacles within the pathway in such a way that they could not see simultaneously the positions along the pathway. The results revealed a significant time/distance linear relation in all tested groups. However, the linear component was lower in sequential sighted and blind participants, especially congenital. Sequential sighted and congenitally blind participants showed an almost overlapping performance. Differences between groups became evident when mentally scanning farther distances (more than 5m). This threshold effect could be revealing of processing limitations due to the need of integrating and updating spatial information. Overall, the results suggest that the characteristics of the processing style rather than the visual impairment per se affect blind people's spatial mental images. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. [Reflex epilepsy evoked by decision making: report of a case (author's transl)].

    PubMed

    Mutani, R; Ganga, A; Agnetti, V

    1980-01-01

    A 17-year-old girl with a history of grand mal attacks occurring during mathematics lessons or while solving mathematical problems was investigated with prolonged EEG recordings. During the sessions, relaxation periods were alternated with arithmetical or mathematical testing, with card or checkers games and the solution of puzzles and crossword problems, and with different neuropsychological tests. The EEG recordings were characterized by the appearance, on a normal background, of bilaterally synchronous and symmetrical spike-and-wave and polyspike-and-wave discharges, associated with loss of consciousness. During relaxation their mean frequency was one per 54 min; it doubled during execution of tests involving nonsequential decision making, and was eight times as high (one per 7 min) during tests involving sequential decision making. A degree of tension, challenge, and complexity in the performance was also an important precipitating factor: its absence deprived sequential tests of their efficacy, while its presence sometimes gave nonsequential tests full efficacy.

  10. Methods for detecting long-term CNS dysfunction after prenatal exposure to neurotoxins.

    PubMed

    Vorhees, C V

    1997-11-01

    Current U.S. Environmental Protection Agency regulatory guidelines for developmental neurotoxicity emphasize functional categories such as motor activity, auditory startle, and learning and memory. A single test of some simple form of learning and memory is accepted to meet the latter category. The rationale for this emphasis has been that sensitive and reliable methods for assessing complex learning and memory are either not available or are too burdensome, and that insufficient data exist to endorse one approach over another. There has been little discussion of the fact that learning and memory is not a single identifiable functional category and no single test can assess all types of learning and memory. Three methods for assessing complex learning and memory are presented that assess two different types of learning and memory, are relatively efficient to conduct, and are sensitive to several known neurobehavioral teratogens. The tests are a 9-unit multiple-T swimming maze, and the Morris and Barnes mazes. The first of these assesses sequential learning, while the latter two assess spatial learning. A description of each test is provided, along with procedures for their use, and data exemplifying effects obtained using developmental exposure to phenytoin, methamphetamine, and MDMA. It is argued that multiple tests of learning and memory are required to ascertain cognitive deficits; something no single method can accomplish. Methods for acoustic startle are also presented.

  11. Thermodynamically balanced inside-out (TBIO) PCR-based gene synthesis: a novel method of primer design for high-fidelity assembly of longer gene sequences

    PubMed Central

    Gao, Xinxin; Yo, Peggy; Keith, Andrew; Ragan, Timothy J.; Harris, Thomas K.

    2003-01-01

    A novel thermodynamically-balanced inside-out (TBIO) method of primer design was developed and compared with a thermodynamically-balanced conventional (TBC) method of primer design for PCR-based gene synthesis of codon-optimized gene sequences for the human protein kinase B-2 (PKB2; 1494 bp), p70 ribosomal S6 subunit protein kinase-1 (S6K1; 1622 bp) and phosphoinositide-dependent protein kinase-1 (PDK1; 1712 bp). Each of the 60mer TBIO primers coded for identical nucleotide regions that the 60mer TBC primers covered, except that half of the TBIO primers were reverse complement sequences. In addition, the TBIO and TBC primers contained identical regions of temperature- optimized primer overlaps. The TBC method was optimized to generate sequential overlapping fragments (∼0.4–0.5 kb) for each of the gene sequences, and simultaneous and sequential combinations of overlapping fragments were tested for their ability to be assembled under an array of PCR conditions. However, no fully synthesized gene sequences could be obtained by this approach. In contrast, the TBIO method generated an initial central fragment (∼0.4–0.5 kb), which could be gel purified and used for further inside-out bidirectional elongation by additional increments of 0.4–0.5 kb. By using the newly developed TBIO method of PCR-based gene synthesis, error-free synthetic genes for the human protein kinases PKB2, S6K1 and PDK1 were obtained with little or no corrective mutagenesis. PMID:14602936

  12. S/He's a Rebel: Toward a Sequential Stress Theory of Delinquency and Gendered Pathways to Disadvantage in Emerging Adulthood.

    ERIC Educational Resources Information Center

    Hagan, John; Foster, Holly

    2003-01-01

    Data from the National Longitudinal Study of Adolescent Health on 11,506 high school students were used to test a gendered and age-graded sequential stress theory in which delinquency can play an additive and intervening role in adolescents' movement from early anger through rebellious or aggressive forms of behavior to later depressive symptoms…

  13. Aging in Movement Representations for Sequential Finger Movements: A Comparison between Young-, Middle-Aged, and Older Adults

    ERIC Educational Resources Information Center

    Cacola, Priscila; Roberson, Jerroed; Gabbard, Carl

    2013-01-01

    Studies show that as we enter older adulthood (greater than 64 years), our ability to mentally represent action in the form of using motor imagery declines. Using a chronometry paradigm to compare the movement duration of imagined and executed movements, we tested young-, middle-aged, and older adults on their ability to perform sequential finger…

  14. The Quality of French Minority Students' Fictional Texts: A Study of the Influence of a Preferential Cognitive Style and Writing Strategy Scaffolding

    ERIC Educational Resources Information Center

    Cavanagh, Martine Odile; Langevin, Rene

    2010-01-01

    The object of this exploratory study was to test two hypotheses. The first was that a student's preferential cognitive style, sequential or simultaneous, can negatively affect the imaginative fiction texts that he or she produces. The second hypothesis was that students possessing a sequential or simultaneous preferential cognitive style would…

  15. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing.

    PubMed

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment.
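
    As an illustration of the transformed up-down family the toolbox implements (the toolbox itself is MATLAB; this is an independent Python rendering under simplified assumptions), a 2-down-1-up staircase that converges on roughly the 70.7%-correct point might look like this; the listener model, step size, and threshold estimate are illustrative.

        import math
        import random

        def two_down_one_up(start_level, step, listener, trials=80, seed=3):
            """2-down-1-up adaptive staircase: the level decreases after two consecutive
            correct responses and increases after any error (tracks ~70.7% correct)."""
            rng = random.Random(seed)
            level, run, track = start_level, 0, []
            for _ in range(trials):
                track.append(level)
                if listener(level, rng):
                    run += 1
                    if run == 2:
                        level, run = level - step, 0
                else:
                    level, run = level + step, 0
            return track

        def listener(level_db, rng, threshold_db=30.0, slope=0.3):
            # Hypothetical psychometric function: P(correct) rises with signal level.
            return rng.random() < 1.0 / (1.0 + math.exp(-slope * (level_db - threshold_db)))

        levels = two_down_one_up(start_level=50.0, step=2.0, listener=listener)
        print("estimated threshold ~", round(sum(levels[-20:]) / 20, 1), "dB")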

  16. PSYCHOACOUSTICS: a comprehensive MATLAB toolbox for auditory testing

    PubMed Central

    Soranzo, Alessandro; Grassi, Massimo

    2014-01-01

    PSYCHOACOUSTICS is a new MATLAB toolbox which implements three classic adaptive procedures for auditory threshold estimation. The first includes those of the Staircase family (method of limits, simple up-down and transformed up-down); the second is the Parameter Estimation by Sequential Testing (PEST); and the third is the Maximum Likelihood Procedure (MLP). The toolbox comes with more than twenty built-in experiments each provided with the recommended (default) parameters. However, if desired, these parameters can be modified through an intuitive and user friendly graphical interface and stored for future use (no programming skills are required). Finally, PSYCHOACOUSTICS is very flexible as it comes with several signal generators and can be easily extended for any experiment. PMID:25101013

  17. 16 CFR § 1500.42 - Test for eye irritants.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... before testing, and only those animals without eye defects or irritation shall be used. The animal is... substances, including testing that does not require animals, are presented in the CPSC's animal testing... conducted, a sequential testing strategy is recommended to reduce the number of test animals. Additionally...

  18. Fully vs. Sequentially Coupled Loads Analysis of Offshore Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick; Wendt, Fabian; Musial, Walter

    The design and analysis methods for offshore wind turbines must consider the aerodynamic and hydrodynamic loads and response of the entire system (turbine, tower, substructure, and foundation) coupled to the turbine control system dynamics. Whereas a fully coupled (turbine and support structure) modeling approach is more rigorous, intellectual property concerns can preclude this approach. In fact, turbine control system algorithms and turbine properties are strictly guarded and often not shared. In many cases, a partially coupled analysis using separate tools and an exchange of reduced sets of data via sequential coupling may be necessary. In the sequentially coupled approach, the turbine and substructure designers will independently determine and exchange an abridged model of their respective subsystems to be used in their partners' dynamic simulations. Although the ability to achieve design optimization is sacrificed to some degree with a sequentially coupled analysis method, the central question here is whether this approach can deliver the required safety and how the differences in the results from the fully coupled method could affect the design. This work summarizes the scope and preliminary results of a study conducted for the Bureau of Safety and Environmental Enforcement aimed at quantifying differences between these approaches through aero-hydro-servo-elastic simulations of two offshore wind turbines on a monopile and jacket substructure.

  19. Sequential Pharmacotherapy for Children with Comorbid Attention-Deficit/hyperactivity and Anxiety Disorders.

    ERIC Educational Resources Information Center

    Abikoff, Howard; McGough, James; Vitiello, Benedetto; McCracken, James; Davies, Mark; Walkup, John; Riddle, Mark; Oatis, Melvin; Greenhill, Laurence; Skrobala, Anne; March, John; Gammon, Pat; Robinson, James; Lazell, Robert; McMahon, Donald J.; Ritz, Louise

    2005-01-01

    Objective: Attention-deficit/hyperactivity disorder (ADHD) is often accompanied by clinically significant anxiety, but few empirical data guide treatment of children meeting full DSM-IV criteria for ADHD and anxiety disorders (ADHD/ANX). This study examined the efficacy of sequential pharmacotherapy for ADHD/ANX children. Method: Children, age 6…

  20. Terminating Sequential Delphi Survey Data Collection

    ERIC Educational Resources Information Center

    Kalaian, Sema A.; Kasim, Rafa M.

    2012-01-01

    The Delphi survey technique is an iterative mail or electronic (e-mail or web-based) survey method used to obtain agreement or consensus among a group of experts in a specific field on a particular issue through a well-designed and systematic multiple sequential rounds of survey administrations. Each of the multiple rounds of the Delphi survey…

  1. PC_Eyewitness and the sequential superiority effect: computer-based lineup administration.

    PubMed

    MacLin, Otto H; Zimmerman, Laura A; Malpass, Roy S

    2005-06-01

    Computer technology has become an increasingly important tool for conducting eyewitness identifications. In the area of lineup identifications, computerized administration offers several advantages for researchers and law enforcement. PC_Eyewitness is designed specifically to administer lineups. To assess this new lineup technology, two studies were conducted in order to replicate the results of previous studies comparing simultaneous and sequential lineups. One hundred twenty university students participated in each experiment. Experiment 1 used traditional paper-and-pencil lineup administration methods to compare simultaneous to sequential lineups. Experiment 2 used PC_Eyewitness to administer simultaneous and sequential lineups. The results of these studies were compared to the meta-analytic results reported by N. Steblay, J. Dysart, S. Fulero, and R. C. L. Lindsay (2001). No differences were found between paper-and-pencil and PC_Eyewitness lineup administration methods. The core findings of the N. Steblay et al. (2001) meta-analysis were replicated by both administration procedures. These results show that computerized lineup administration using PC_Eyewitness is an effective means for gathering eyewitness identification data.

  2. A wireless sequentially actuated microvalve system

    NASA Astrophysics Data System (ADS)

    Baek, Seung-Ki; Yoon, Yong-Kyu; Jeon, Hye-Seon; Seo, Soonmin; Park, Jung-Hwan

    2013-04-01

    A wireless microvalve system was fabricated based on induction heating for flow control in microfluidics by sequential valve opening. In this approach, we used paraffin wax as a flow plug, which can be changed from solid to liquid with adjacent heating elements operated by induction heating. Programmable opening of valves was devised by using different thermal responses of metal discs to a magnetic field. Copper and nickel discs with a diameter of 2.5 mm and various thicknesses (50, 100 and 200 µm) were prepared as heating elements by a laser cutting method, and they were integrated in the microfluidic channel as part of the microvalve. A calorimetric test was used to measure the thermal properties of the discs in terms of kinds of metal and disc thickness. Sequential openings of the microvalves were performed using the difference in the thermal response of 100 µm thick copper disc and 50 µm thick nickel disc for short-interval openings and 200 µm thick copper disc and 100-µm-thick nickel disc for long-interval openings. The thermal effect on fluid samples as a result of induction heating of the discs was studied by investigating lysozyme denaturation. More heat was generated in heating elements made of copper than in those made of nickel, implying differences in the thermal response of heating elements made of copper and nickel. Also, the thickness of the heating elements affected the thermal response in the elements. Valve openings for short intervals of 1-5 s and long intervals of 15-23 s were achieved by using two sets of heating elements. There was no significant change in lysozyme activity by increasing the temperature of the heating discs. This study demonstrates that a wireless sequentially actuated microvalve system can provide programmed valve opening, portability, ease of fabrication and operation, disposability, and low cost.

  3. The Impact of Adjuvant Postoperative Radiation Therapy and Chemotherapy on Survival After Esophagectomy for Esophageal Carcinoma.

    PubMed

    Wong, Andrew T; Shao, Meng; Rineer, Justin; Lee, Anna; Schwartz, David; Schreiber, David

    2017-06-01

    The objective of this study was to analyze the impact on overall survival (OS) from the addition of postoperative radiation with or without chemotherapy after esophagectomy, using a large, hospital-based dataset. Previous retrospective studies have suggested an OS advantage for postoperative chemoradiation over surgery alone, although prospective data are lacking. The National Cancer Data Base was queried to select patients diagnosed with stage pT3-4Nx-0M0 or pT1-4N1-3M0 esophageal carcinoma (squamous cell or adenocarcinoma) from 1998 to 2011 treated with definitive esophagectomy ± postoperative radiation and/or chemotherapy. OS was analyzed using the Kaplan-Meier method and compared using the log-rank test. Multivariate Cox regression analysis was used to identify covariates associated with OS. There were 4893 patients selected, of whom 1153 (23.6%) received postoperative radiation. Most patients receiving radiation also received sequential/concomitant chemotherapy (89.9%). For the entire cohort, postoperative radiation was associated with a statistically significant but modest absolute improvement in survival (hazard ratio 0.77; 95% CI, 0.71-0.83; P < 0.001). On subgroup analysis, postoperative radiation was associated with improved OS for patients with node-positive disease (3-yr OS 34.3 % vs 27.8%, P < 0.001) or positive margins (3-yr OS 36.4% vs 18.0%, P < 0.001). When chemotherapy usage was incorporated, sequential chemotherapy was associated with the best survival (P < 0.001). Multivariate analysis revealed that the addition of chemotherapy to radiation therapy, whether sequentially or concurrently, was a strong prognostic factor for OS. In this hospital-based study, the addition of postoperative chemoradiation (either sequentially or concomitantly) after esophagectomy was associated with improved OS for patients with node-positive disease or positive margins.
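
    A bare-bones Kaplan-Meier estimator of the kind underlying the OS comparisons above is sketched below on a handful of hypothetical (time, event) pairs; it is not the National Cancer Data Base analysis and omits the log-rank test and Cox regression entirely.

        def kaplan_meier(times, events):
            """Return (time, survival) points of the Kaplan-Meier estimate.
            events[i] is 1 for a death at times[i] and 0 for a censored observation."""
            data = sorted(zip(times, events))
            surv, curve, i = 1.0, [], 0
            while i < len(data):
                t = data[i][0]
                deaths = sum(1 for tt, e in data if tt == t and e == 1)
                at_risk = sum(1 for tt, _ in data if tt >= t)
                if deaths:
                    surv *= 1.0 - deaths / at_risk
                    curve.append((t, surv))
                i += sum(1 for tt, _ in data if tt == t)   # skip past tied times
            return curve

        # Hypothetical follow-up times in months (1 = death, 0 = censored).
        times  = [6, 8, 12, 12, 15, 20, 22, 30, 36, 40]
        events = [1, 0,  1,  1,  0,  1,  0,  1,  0,  0]
        for t, s in kaplan_meier(times, events):
            print(f"t = {t:2d} mo: S(t) = {s:.3f}")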

  4. Using a Mixed Methods Sequential Design to Identify Factors Associated with African American Mothers' Intention to Vaccinate Their Daughters Aged 9 to 12 for HPV with a Purpose of Informing a Culturally-Relevant, Theory-Based Intervention

    ERIC Educational Resources Information Center

    Cunningham, Jennifer L.

    2013-01-01

    The purpose of this sequential, explanatory mixed methods research study was to understand what factors influenced African American maternal intentions to get their daughters aged 9 years to 12 years vaccinated in Alabama. In the first, quantitative phase of the study, the research questions focused on identifying the predictive power of eleven…

  5. Exposure Control Using Adaptive Multi-Stage Item Bundles.

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    This paper presents a multistage adaptive testing test development paradigm that promises to handle content balancing and other test development needs, psychometric reliability concerns, and item exposure. The bundled multistage adaptive testing (BMAT) framework is a modification of the computer-adaptive sequential testing framework introduced by…

  6. Dynamic leaching and fractionation of trace elements from environmental solids exploiting a novel circulating-flow platform.

    PubMed

    Mori, Masanobu; Nakano, Koji; Sasaki, Masaya; Shinozaki, Haruka; Suzuki, Shiho; Okawara, Chitose; Miró, Manuel; Itabashi, Hideyuki

    2016-02-01

    A dynamic flow-through microcolumn extraction system based on extractant re-circulation is herein proposed as a novel analytical approach for simplifying bioaccessibility tests of trace elements in sediments. On-line metal leaching is undertaken in the format of all-injection (AI) analysis, a sequel to flow injection analysis that involves extraction under steady-state conditions. The minimum circulation times and flow rates required to determine the maximum bioaccessible pools of target metals (viz., Cu, Zn, Cd, and Pb) from lake and river sediment samples were estimated using Tessier's sequential extraction scheme and an acid single extraction test. The on-line AI method was successfully validated by mass balance studies of CRM and real sediment samples. Tessier's test in the on-line AI format was carried out in one third of the extraction time (6 h against more than 17 h for the conventional method), with better analytical precision (<9.2% against >15% for the conventional method) and a significant decrease in blank readouts compared with the manual batch counterpart. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Backfilled, self-assembled monolayers and methods of making same

    DOEpatents

    Fryxell, Glen E [Kennewick, WA; Zemanian, Thomas S [Richland, WA; Addleman, R Shane [Benton City, WA; Aardahl, Christopher L [Sequim, WA; Zheng, Feng [Richland, WA; Busche, Brad [Raleigh, NC; Egorov, Oleg B [West Richland, WA

    2009-06-30

    Backfilled, self-assembled monolayers and methods of making the same are disclosed. The self-assembled monolayer comprises at least one functional organosilane species and a substantially random dispersion of at least one backfilling organosilane species among the functional organosilane species, wherein the functional and backfilling organosilane species have been sequentially deposited on a substrate. The method comprises depositing sequentially a first organosilane species followed by a backfilling organosilane species, and employing a relaxation agent before or during deposition of the backfilling organosilane species, wherein the first and backfilling organosilane species are substantially randomly dispersed on a substrate.

  8. Proposal of a method for the evaluation of inaccuracy of home sphygmomanometers.

    PubMed

    Akpolat, Tekin

    2009-10-01

    There is no formal protocol for evaluating the individual accuracy of home sphygmomanometers. The aims of this study were to propose a method for evaluating the accuracy of automated home sphygmomanometers and to test the applicability of the defined method. The purposes of this method were to avoid major inaccuracies and to estimate the optimal circumstances for individual accuracy. The method has three stages and uses sequential measurement of blood pressure. The tested devices were categorized into four groups: accurate, acceptable, inaccurate and very inaccurate (major inaccuracy). The defined method takes approximately 10 min (excluding relaxation time) and was tested on three different occasions. The application of the method has shown that inaccuracy is a common problem among non-tested devices, that validated devices are superior to those that are non-validated or whose validation status is unknown, that major inaccuracy is common, especially in non-tested devices, and that validation does not guarantee individual accuracy. A protocol addressing the accuracy of a particular sphygmomanometer in an individual patient is required, and a practical method has been suggested to achieve this. This method can be modified, but the main idea and approach should be preserved unless a better method is proposed. The purchase of validated devices and evaluation of the accuracy of the purchased device in an individual patient will improve the monitoring of self-measurement of blood pressure at home. This study addresses device inaccuracy, but errors related to the patient, observer or blood pressure measurement technique should not be underestimated, and strict adherence to the manufacturer's instructions is essential.

  9. Frequency of Germline Mutations in 25 Cancer Susceptibility Genes in a Sequential Series of Patients With Breast Cancer

    PubMed Central

    Lin, Nancy U.; Kidd, John; Allen, Brian A.; Singh, Nanda; Wenstrup, Richard J.; Hartman, Anne-Renee; Winer, Eric P.; Garber, Judy E.

    2016-01-01

    Purpose Testing for germline mutations in BRCA1/2 is standard for select patients with breast cancer to guide clinical management. Next-generation sequencing (NGS) allows testing for mutations in additional breast cancer predisposition genes. The frequency of germline mutations detected by using NGS has been reported in patients with breast cancer who were referred for BRCA1/2 testing or with triple-negative breast cancer. We assessed the frequency and predictors of mutations in 25 cancer predisposition genes, including BRCA1/2, in a sequential series of patients with breast cancer at an academic institution to examine the utility of genetic testing in this population. Methods Patients with stages I to III breast cancer who were seen at a single cancer center between 2010 and 2012, and who agreed to participate in research DNA banking, were included (N = 488). Personal and family cancer histories were collected and germline DNA was sequenced with NGS to identify mutations. Results Deleterious mutations were identified in 10.7% of women, including 6.1% in BRCA1/2 (5.1% in non-Ashkenazi Jewish patients) and 4.6% in other breast/ovarian cancer predisposition genes including CHEK2 (n = 10), ATM (n = 4), BRIP1 (n = 4), and one each in PALB2, PTEN, NBN, RAD51C, RAD51D, MSH6, and PMS2. Whereas young age (P < .01), Ashkenazi Jewish ancestry (P < .01), triple-negative breast cancer (P = .01), and family history of breast/ovarian cancer (P = .01) predicted for BRCA1/2 mutations, no factors predicted for mutations in other breast cancer predisposition genes. Conclusion Among sequential patients with breast cancer, 10.7% were found to have a germline mutation in a gene that predisposes women to breast or ovarian cancer, using a panel of 25 predisposition genes. Factors that predict for BRCA1/2 mutations do not predict for mutations in other breast/ovarian cancer susceptibility genes when these genes are analyzed as a single group. Additional cohorts will be helpful to define individuals at higher risk of carrying mutations in genes other than BRCA1/2. PMID:26976419

  10. Applications of geostatistics and Markov models for logo recognition

    NASA Astrophysics Data System (ADS)

    Pham, Tuan

    2003-01-01

    Spatial covariances based on geostatistics are extracted as representative features of logo or trademark images. These spatial covariances are different from other statistical features for image analysis in that the structural information of an image is independent of the pixel locations and represented in terms of spatial series. We then design a classifier in the sense of hidden Markov models to make use of these geostatistical sequential data to recognize the logos. High recognition rates are obtained from testing the method against a public-domain logo database.
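
    A rough sketch of that pipeline is given below, assuming the hmmlearn package: a directional spatial covariance series is extracted from each image and per-class Gaussian HMMs are trained and scored. The covariance_sequence feature, the number of hidden states, and the lag range are simplifications introduced here, not the paper's exact geostatistical features.

      # Sketch: spatial-covariance sequences classified with per-class Gaussian HMMs
      import numpy as np
      from hmmlearn import hmm

      def covariance_sequence(img, max_lag=20):
          """Directional spatial covariance C(h) for horizontal lags h = 1..max_lag."""
          x = img.astype(float) - img.mean()
          seq = [np.mean(x[:, :-h] * x[:, h:]) for h in range(1, max_lag + 1)]
          return np.array(seq).reshape(-1, 1)       # (n_lags, 1) observation sequence

      def train_class_model(training_images, n_states=3):
          seqs = [covariance_sequence(im) for im in training_images]
          X, lengths = np.vstack(seqs), [len(s) for s in seqs]
          model = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=100)
          model.fit(X, lengths)
          return model

      def classify(img, class_models):
          seq = covariance_sequence(img)
          scores = {name: m.score(seq) for name, m in class_models.items()}
          return max(scores, key=scores.get)        # highest log-likelihood wins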

  11. Producing carbon stripper foils containing boron

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stoner, J. O. Jr.

    2012-12-19

    Parameters being actively tested by the accelerator community for the purpose of extending carbon stripper foil lifetimes in fast ion beams include methods of deposition, parting agents, mounting techniques, support (fork) materials, and inclusion of alloying elements, particularly boron. Specialized production apparatus is required for either sequential deposition or co-deposition of boron in carbon foils. A dual-use vacuum evaporator for arc evaporation of carbon and electron-beam evaporation of boron and other materials has been built for such development. Production of both carbon and boron foils has begun and improvements are in progress.

  12. Clinical Study on Prospective Efficacy of All-Trans Acid, Realgar-Indigo Naturalis Formula Combined with Chemotherapy as Maintenance Treatment of Acute Promyelocytic Leukemia

    PubMed Central

    Lu-Qun, Wang; Hao, Li; Xiao-Peng, He; Fang-Lin, Li; Ling-Ling, Wang; Xue-Liang, Chen; Ming, Hou

    2014-01-01

    Objectives. To evaluate the efficacy and safety of the sequential application of all-trans retinoic acid (ATRA), Realgar-Indigo naturalis formula (RIF), and chemotherapy (CT) as maintenance treatment in patients with acute promyelocytic leukemia (APL). Methods. This was a retrospective study of 98 patients with newly diagnosed APL who received one of two different maintenance treatments. After remission induction and consolidation chemotherapy according to their Sanz scores, patients received one of two maintenance schemes. The first regimen used ATRA, RIF, and standard-dose CT sequentially (ATRA/RIF/CT regimen), while the second used ATRA and low-dose chemotherapy with methotrexate (MTX) plus 6-mercaptopurine (6-MP) alternately (ATRA/CTlow regimen). The OS, DFS, relapse rate, minimal residual disease, and adverse reactions in the two groups were monitored and evaluated. Results. The ATRA/RIF/CT regimen effectively reduced the chance of relapse across risk stratifications, but there was no significant difference in 5-year DFS rate or OS rate between the two groups. In addition, patients in the experimental group suffered less severe adverse reactions than those in the control group. Conclusions. The repeated sequential therapeutic regimen for APL with ATRA, RIF, and chemotherapy is worth popularizing for its high effectiveness and low toxicity. PMID:24963332

  13. Clinical study on prospective efficacy of all-trans Acid, realgar-indigo naturalis formula combined with chemotherapy as maintenance treatment of acute promyelocytic leukemia.

    PubMed

    Xiang-Xin, Li; Lu-Qun, Wang; Hao, Li; Xiao-Peng, He; Fang-Lin, Li; Ling-Ling, Wang; Xue-Liang, Chen; Ming, Hou

    2014-01-01

    Objectives. To evaluate the efficacy and safety of the sequential application of all-trans retinoic acid (ATRA), Realgar-Indigo naturalis formula (RIF), and chemotherapy (CT) as maintenance treatment in patients with acute promyelocytic leukemia (APL). Methods. This was a retrospective study of 98 patients with newly diagnosed APL who received one of two different maintenance treatments. After remission induction and consolidation chemotherapy according to their Sanz scores, patients received one of two maintenance schemes. The first regimen used ATRA, RIF, and standard-dose CT sequentially (ATRA/RIF/CT regimen), while the second used ATRA and low-dose chemotherapy with methotrexate (MTX) plus 6-mercaptopurine (6-MP) alternately (ATRA/CTlow regimen). The OS, DFS, relapse rate, minimal residual disease, and adverse reactions in the two groups were monitored and evaluated. Results. The ATRA/RIF/CT regimen effectively reduced the chance of relapse across risk stratifications, but there was no significant difference in 5-year DFS rate or OS rate between the two groups. In addition, patients in the experimental group suffered less severe adverse reactions than those in the control group. Conclusions. The repeated sequential therapeutic regimen for APL with ATRA, RIF, and chemotherapy is worth popularizing for its high effectiveness and low toxicity.

  14. Three-dimensional Simulation and Prediction of Solenoid Valve Failure Mechanism Based on Finite Element Model

    NASA Astrophysics Data System (ADS)

    Li, Jianfeng; Xiao, Mingqing; Liang, Yajun; Tang, Xilang; Li, Chao

    2018-01-01

    The solenoid valve is a basic automation component with wide application. Analyzing and predicting its degradation and failure mechanisms is important for improving solenoid valve reliability and for research on prolonging its life. In this paper, a three-dimensional finite element analysis model of a solenoid valve is established based on ANSYS Workbench software. A sequential coupling method for calculating the temperature field and mechanical stress field of the solenoid valve is put forward. The simulation results show that the sequential coupling method can calculate and analyze the temperature and stress distributions of the solenoid valve accurately, which has been verified through an accelerated life test. A Kalman filtering algorithm is introduced for the data processing, which can effectively reduce measurement deviation and recover more accurate data. Based on different driving currents, a failure mechanism that can easily cause the degradation of the coils is obtained, and an optimization design scheme for the electro-insulating rubbers is also proposed. The high temperature generated by the driving current and the thermal stress resulting from thermal expansion can easily cause the degradation of the coil wires, which lowers the electrical resistance of the coils and results in the eventual failure of the solenoid valve. The finite element analysis method can be applied to fault diagnosis and prognostics of various solenoid valves and can improve the reliability of solenoid valve health management.
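
    For the data-processing step, a minimal one-dimensional Kalman filter of the kind mentioned above can be sketched as follows; the random-walk state model and the noise variances are assumed for illustration, not taken from the paper.

      # Minimal 1-D Kalman filter for smoothing a noisy sensor trace (toy parameters)
      import numpy as np

      def kalman_smooth(measurements, process_var=1e-3, meas_var=0.5):
          x, p = measurements[0], 1.0          # initial state estimate and variance
          estimates = []
          for z in measurements:
              p = p + process_var              # predict: random-walk state model
              k = p / (p + meas_var)           # Kalman gain
              x = x + k * (z - x)              # update with the new measurement
              p = (1.0 - k) * p
              estimates.append(x)
          return np.array(estimates)

      noisy = 80.0 + np.random.normal(0.0, 2.0, size=200)   # synthetic measurement trace
      smoothed = kalman_smooth(noisy)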

  15. Chemical and physiological metal bioaccessibility assessment in surface bottom sediments from the Deba River urban catchment: Harmonization of PBET, TCLP and BCR sequential extraction methods.

    PubMed

    Unda-Calvo, Jessica; Martínez-Santos, Miren; Ruiz-Romera, Estilita

    2017-04-01

    In the present study, the physiologically based extraction test PBET (gastric and intestinal phases) and two chemically based extraction methods, the toxicity characteristic leaching procedure (TCLP) and the sequential extraction procedure BCR 701 (Community Bureau of Reference of the European Commission), have been used to estimate and evaluate the bioaccessibility of metals (Fe, Mn, Zn, Cu, Ni, Cr and Pb) in sediments from the Deba River urban catchment. The statistical analysis of the data and the comparison between physiological and chemical methods have highlighted the relevance of simulating the gastrointestinal tract environment, since metal bioaccessibility seems to depend on water and sediment properties such as pH, redox potential and organic matter content, and, primarily, on the form in which metals are present in the sediment. Indeed, metals distributed among all fractions (Mn, Ni, Zn) were the most bioaccessible, followed by those predominantly bound to the oxidizable fraction (Cu, Cr and Pb), especially near major urban areas. Finally, a toxicological risk assessment was also performed by determining the hazard quotient (HQ), which demonstrated that, although sediments from mid- and downstream sampling points presented the highest metal bioaccessibilities, these were not high enough to have adverse effects on human health, with Cr being the most potentially toxic element.

  16. Comparing cluster-level dynamic treatment regimens using sequential, multiple assignment, randomized trials: Regression estimation and sample size considerations.

    PubMed

    NeCamp, Timothy; Kilbourne, Amy; Almirall, Daniel

    2017-08-01

    Cluster-level dynamic treatment regimens can be used to guide sequential treatment decision-making at the cluster level in order to improve outcomes at the individual or patient-level. In a cluster-level dynamic treatment regimen, the treatment is potentially adapted and re-adapted over time based on changes in the cluster that could be impacted by prior intervention, including aggregate measures of the individuals or patients that compose it. Cluster-randomized sequential multiple assignment randomized trials can be used to answer multiple open questions preventing scientists from developing high-quality cluster-level dynamic treatment regimens. In a cluster-randomized sequential multiple assignment randomized trial, sequential randomizations occur at the cluster level and outcomes are observed at the individual level. This manuscript makes two contributions to the design and analysis of cluster-randomized sequential multiple assignment randomized trials. First, a weighted least squares regression approach is proposed for comparing the mean of a patient-level outcome between the cluster-level dynamic treatment regimens embedded in a sequential multiple assignment randomized trial. The regression approach facilitates the use of baseline covariates which is often critical in the analysis of cluster-level trials. Second, sample size calculators are derived for two common cluster-randomized sequential multiple assignment randomized trial designs for use when the primary aim is a between-dynamic treatment regimen comparison of the mean of a continuous patient-level outcome. The methods are motivated by the Adaptive Implementation of Effective Programs Trial which is, to our knowledge, the first-ever cluster-randomized sequential multiple assignment randomized trial in psychiatry.
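
    A minimal sketch of a weighted least-squares comparison with cluster-robust standard errors is shown below, assuming the statsmodels package. The simulated data, the unit weights, and the regression coding of the embedded regimens are illustrative assumptions, not the estimator or the sample size formulas derived in the paper.

      # Weighted least-squares sketch for a cluster-randomized SMART-style comparison
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n_clusters, n = 20, 400
      cluster = rng.integers(0, n_clusters, size=n)        # cluster membership of each patient
      a1 = rng.choice([-1, 1], size=n_clusters)[cluster]   # first-stage treatment, randomized by cluster
      a2 = rng.choice([-1, 1], size=n_clusters)[cluster]   # second-stage treatment, randomized by cluster
      baseline = rng.normal(size=n)                        # patient-level baseline covariate
      y = 1.0 + 0.5 * baseline + 0.3 * a1 + 0.2 * a1 * a2 + rng.normal(size=n)

      # Regression with indicators for the embedded regimens; cluster-robust
      # standard errors account for within-cluster correlation of outcomes.
      X = sm.add_constant(np.column_stack([baseline, a1, a2, a1 * a2]))
      weights = np.ones(n)                                 # placeholder for design-based SMART weights
      fit = sm.WLS(y, X, weights=weights).fit(cov_type="cluster", cov_kwds={"groups": cluster})
      print(fit.summary())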

  17. Visual short-term memory for sequential arrays.

    PubMed

    Kumar, Arjun; Jiang, Yuhong

    2005-04-01

    The capacity of visual short-term memory (VSTM) for a single visual display has been investigated in past research, but VSTM for multiple sequential arrays has been explored only recently. In this study, we investigate the capacity of VSTM across two sequential arrays separated by a variable stimulus onset asynchrony (SOA). VSTM for spatial locations (Experiment 1), colors (Experiments 2-4), orientations (Experiments 3 and 4), and conjunction of color and orientation (Experiment 4) were tested, with the SOA across the two sequential arrays varying from 100 to 1,500 msec. We find that VSTM for the trailing array is much better than VSTM for the leading array, but when averaged across the two arrays VSTM has a constant capacity independent of the SOA. We suggest that multiple displays compete for retention in VSTM and that separating information into two temporally discrete groups does not enhance the overall capacity of VSTM.

  18. Concurrent versus sequential sorafenib therapy in combination with radiation for hepatocellular carcinoma.

    PubMed

    Wild, Aaron T; Gandhi, Nishant; Chettiar, Sivarajan T; Aziz, Khaled; Gajula, Rajendra P; Williams, Russell D; Kumar, Rachit; Taparra, Kekoa; Zeng, Jing; Cades, Jessica A; Velarde, Esteban; Menon, Siddharth; Geschwind, Jean F; Cosgrove, David; Pawlik, Timothy M; Maitra, Anirban; Wong, John; Hales, Russell K; Torbenson, Michael S; Herman, Joseph M; Tran, Phuoc T

    2013-01-01

    Sorafenib (SOR) is the only systemic agent known to improve survival for hepatocellular carcinoma (HCC). However, SOR prolongs survival by less than 3 months and does not alter symptomatic progression. To improve outcomes, several phase I-II trials are currently examining SOR with radiation (RT) for HCC utilizing heterogeneous concurrent and sequential treatment regimens. Our study provides preclinical data characterizing the effects of concurrent versus sequential RT-SOR on HCC cells both in vitro and in vivo. Concurrent and sequential RT-SOR regimens were tested for efficacy among 4 HCC cell lines in vitro by assessment of clonogenic survival, apoptosis, cell cycle distribution, and γ-H2AX foci formation. Results were confirmed in vivo by evaluating tumor growth delay and performing immunofluorescence staining in a hind-flank xenograft model. In vitro, concurrent RT-SOR produced radioprotection in 3 of 4 cell lines, whereas sequential RT-SOR produced decreased colony formation among all 4. Sequential RT-SOR increased apoptosis compared to RT alone, while concurrent RT-SOR did not. Sorafenib induced reassortment into less radiosensitive phases of the cell cycle through G1-S delay and cell cycle slowing. More double-strand breaks (DSBs) persisted 24 h post-irradiation for RT alone versus concurrent RT-SOR. In vivo, sequential RT-SOR produced the greatest tumor growth delay, while concurrent RT-SOR was similar to RT alone. More persistent DSBs were observed in xenografts treated with sequential RT-SOR or RT alone versus concurrent RT-SOR. Sequential RT-SOR additionally produced a greater reduction in xenograft tumor vascularity and mitotic index than either concurrent RT-SOR or RT alone. In conclusion, sequential RT-SOR demonstrates greater efficacy against HCC than concurrent RT-SOR both in vitro and in vivo. These results may have implications for clinical decision-making and prospective trial design.

  19. Concurrent versus Sequential Sorafenib Therapy in Combination with Radiation for Hepatocellular Carcinoma

    PubMed Central

    Chettiar, Sivarajan T.; Aziz, Khaled; Gajula, Rajendra P.; Williams, Russell D.; Kumar, Rachit; Taparra, Kekoa; Zeng, Jing; Cades, Jessica A.; Velarde, Esteban; Menon, Siddharth; Geschwind, Jean F.; Cosgrove, David; Pawlik, Timothy M.; Maitra, Anirban; Wong, John; Hales, Russell K.; Torbenson, Michael S.; Herman, Joseph M.; Tran, Phuoc T.

    2013-01-01

    Sorafenib (SOR) is the only systemic agent known to improve survival for hepatocellular carcinoma (HCC). However, SOR prolongs survival by less than 3 months and does not alter symptomatic progression. To improve outcomes, several phase I-II trials are currently examining SOR with radiation (RT) for HCC utilizing heterogeneous concurrent and sequential treatment regimens. Our study provides preclinical data characterizing the effects of concurrent versus sequential RT-SOR on HCC cells both in vitro and in vivo. Concurrent and sequential RT-SOR regimens were tested for efficacy among 4 HCC cell lines in vitro by assessment of clonogenic survival, apoptosis, cell cycle distribution, and γ-H2AX foci formation. Results were confirmed in vivo by evaluating tumor growth delay and performing immunofluorescence staining in a hind-flank xenograft model. In vitro, concurrent RT-SOR produced radioprotection in 3 of 4 cell lines, whereas sequential RT-SOR produced decreased colony formation among all 4. Sequential RT-SOR increased apoptosis compared to RT alone, while concurrent RT-SOR did not. Sorafenib induced reassortment into less radiosensitive phases of the cell cycle through G1-S delay and cell cycle slowing. More double-strand breaks (DSBs) persisted 24 h post-irradiation for RT alone versus concurrent RT-SOR. In vivo, sequential RT-SOR produced the greatest tumor growth delay, while concurrent RT-SOR was similar to RT alone. More persistent DSBs were observed in xenografts treated with sequential RT-SOR or RT alone versus concurrent RT-SOR. Sequential RT-SOR additionally produced a greater reduction in xenograft tumor vascularity and mitotic index than either concurrent RT-SOR or RT alone. In conclusion, sequential RT-SOR demonstrates greater efficacy against HCC than concurrent RT-SOR both in vitro and in vivo. These results may have implications for clinical decision-making and prospective trial design. PMID:23762417

  20. Sequential fuzzy diagnosis method for motor roller bearing in variable operating conditions based on vibration analysis.

    PubMed

    Li, Ke; Ping, Xueliang; Wang, Huaqing; Chen, Peng; Cao, Yi

    2013-06-21

    A novel intelligent fault diagnosis method for motor roller bearings which operate under unsteady rotating speed and load is proposed in this paper. The pseudo Wigner-Ville distribution (PWVD) and the relative crossing information (RCI) methods are used for extracting the feature spectra from the non-stationary vibration signal measured for condition diagnosis. The RCI is used to automatically extract the feature spectrum from the time-frequency distribution of the vibration signal. The extracted feature spectrum is instantaneous and not correlated with the rotation speed and load. By using the ant colony optimization (ACO) clustering algorithm, the synthesizing symptom parameters (SSP) for condition diagnosis are obtained. The experimental results show that the diagnostic sensitivity of the SSP is higher than that of the original symptom parameters (SP), and that the SSP can sensitively reflect the characteristics of the feature spectrum for precise condition diagnosis. Finally, a fuzzy diagnosis method based on sequential inference and possibility theory is also proposed, by which the conditions of the machine can be identified sequentially as well.

  1. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data.

    PubMed

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-12

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
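
    To illustrate the basic idea behind (spatio)temporal sequential indicator simulation, a heavily simplified one-dimensional SIS sketch follows; inverse-distance weighting stands in for indicator kriging, and the conditioning data, search radius, and fallback global proportion are assumed.

      # Simplified sequential indicator simulation (SIS) on a 1-D grid
      import numpy as np

      rng = np.random.default_rng(1)

      def sis_1d(n_cells=200, data=None, rng=rng, search_radius=15.0):
          """data: dict {cell_index: 0/1 indicator of exceeding a PM2.5 threshold}."""
          sim = np.full(n_cells, -1, dtype=int)
          for idx, val in (data or {}).items():
              sim[idx] = val
          path = rng.permutation(np.where(sim < 0)[0])      # random simulation path
          for cell in path:
              known = np.where(sim >= 0)[0]                 # data plus previously simulated cells
              near = known[np.abs(known - cell) <= search_radius]
              if near.size == 0:
                  p = 0.5                                   # fall back to a global proportion
              else:
                  w = 1.0 / (np.abs(near - cell) + 1.0)     # inverse-distance weights
                  p = np.sum(w * sim[near]) / np.sum(w)     # local probability of "exceed"
              sim[cell] = int(rng.random() < p)             # draw from the local distribution
          return sim

      realizations = np.array([sis_1d(data={10: 1, 50: 0, 120: 1}) for _ in range(50)])
      exceed_prob = realizations.mean(axis=0)               # per-cell uncertainty summary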

  2. Sequential Fuzzy Diagnosis Method for Motor Roller Bearing in Variable Operating Conditions Based on Vibration Analysis

    PubMed Central

    Li, Ke; Ping, Xueliang; Wang, Huaqing; Chen, Peng; Cao, Yi

    2013-01-01

    A novel intelligent fault diagnosis method for motor roller bearings which operate under unsteady rotating speed and load is proposed in this paper. The pseudo Wigner-Ville distribution (PWVD) and the relative crossing information (RCI) methods are used for extracting the feature spectra from the non-stationary vibration signal measured for condition diagnosis. The RCI is used to automatically extract the feature spectrum from the time-frequency distribution of the vibration signal. The extracted feature spectrum is instantaneous, and not correlated with the rotation speed and load. By using the ant colony optimization (ACO) clustering algorithm, the synthesizing symptom parameters (SSP) for condition diagnosis are obtained. The experimental results shows that the diagnostic sensitivity of the SSP is higher than original symptom parameter (SP), and the SSP can sensitively reflect the characteristics of the feature spectrum for precise condition diagnosis. Finally, a fuzzy diagnosis method based on sequential inference and possibility theory is also proposed, by which the conditions of the machine can be identified sequentially as well. PMID:23793021

  3. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    NASA Astrophysics Data System (ADS)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the number of STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicate that a better performance was obtained with the STSIS method.

  4. Multi-atlas segmentation of the cartilage in knee MR images with sequential volume- and bone-mask-based registrations

    NASA Astrophysics Data System (ADS)

    Lee, Han Sang; Kim, Hyeun A.; Kim, Hyeonjin; Hong, Helen; Yoon, Young Cheol; Kim, Junmo

    2016-03-01

    In spite of its clinical importance in diagnosis of osteoarthritis, segmentation of cartilage in knee MRI remains a challenging task due to its shape variability and low contrast with surrounding soft tissues and synovial fluid. In this paper, we propose a multi-atlas segmentation of cartilage in knee MRI with sequential atlas registrations and locally-weighted voting (LWV). First, bone is segmented by sequential volume- and object-based registrations and LWV. Second, to overcome the shape variability of cartilage, cartilage is segmented by bone-mask-based registration and LWV. In experiments, the proposed method improved the bone segmentation by reducing misclassified bone regions, and enhanced the cartilage segmentation by preventing cartilage leakage into surrounding similar-intensity regions, with the help of sequential registrations and LWV.
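
    A toy version of the locally-weighted voting step is sketched below: each registered atlas casts a voxel-wise vote weighted by its local intensity similarity to the target image. The Gaussian similarity kernel and binary label masks are assumptions made for illustration.

      # Locally-weighted voting (LWV) sketch over pre-registered atlases
      import numpy as np

      def locally_weighted_voting(target, atlas_images, atlas_labels, sigma=25.0):
          """target: (H, W) image; atlas_images/atlas_labels: registered (H, W) arrays."""
          votes = np.zeros_like(target, dtype=float)
          weight_sum = np.zeros_like(target, dtype=float)
          for img, lab in zip(atlas_images, atlas_labels):
              w = np.exp(-((target - img) ** 2) / (2.0 * sigma ** 2))  # voxel-wise similarity
              votes += w * lab                                         # lab is a 0/1 mask
              weight_sum += w
          return (votes / np.maximum(weight_sum, 1e-12)) > 0.5         # weighted majority vote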

  5. ChIP-re-ChIP: Co-occupancy Analysis by Sequential Chromatin Immunoprecipitation.

    PubMed

    Beischlag, Timothy V; Prefontaine, Gratien G; Hankinson, Oliver

    2018-01-01

    Chromatin immunoprecipitation (ChIP) exploits the specific interactions between DNA and DNA-associated proteins. It can be used to examine a wide range of experimental parameters. A number of proteins bound at the same genomic location can identify a multi-protein chromatin complex where several proteins work together to regulate gene transcription or chromatin configuration. In many instances, this can be achieved using sequential ChIP; or simply, ChIP-re-ChIP. Whether it is for the examination of specific transcriptional or epigenetic regulators, or for the identification of cistromes, the ability to perform a sequential ChIP adds a higher level of power and definition to these analyses. In this chapter, we describe a simple and reliable method for the sequential ChIP assay.

  6. Use of personalized Dynamic Treatment Regimes (DTRs) and Sequential Multiple Assignment Randomized Trials (SMARTs) in mental health studies

    PubMed Central

    Liu, Ying; ZENG, Donglin; WANG, Yuanjia

    2014-01-01

    Dynamic treatment regimens (DTRs) are sequential decision rules tailored at each point where a clinical decision is made, based on each patient's time-varying characteristics and intermediate outcomes observed at earlier points in time. The complexity, patient heterogeneity, and chronicity of mental disorders call for learning optimal DTRs to dynamically adapt treatment to an individual's response over time. The Sequential Multiple Assignment Randomized Trial (SMART) design allows for estimating causal effects of DTRs. Modern statistical tools have been developed to optimize DTRs based on personalized variables and intermediate outcomes using rich data collected from SMARTs; these statistical methods can also be used to recommend tailoring variables for designing future SMART studies. This paper introduces DTRs and SMARTs using two examples in mental health studies, discusses two machine learning methods for estimating optimal DTRs from SMART data, and demonstrates the performance of the statistical methods using simulated data. PMID:25642116
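
    One of the statistical methods alluded to above, two-stage Q-learning with linear working models, can be sketched as follows for a simulated two-stage SMART; the variables, treatment coding, and outcome model are toy stand-ins rather than the examples discussed in the paper.

      # Two-stage Q-learning sketch for estimating a DTR from simulated SMART data
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      n = 500
      x1 = rng.normal(size=n)                        # baseline severity
      a1 = rng.choice([-1, 1], size=n)               # stage-1 randomized treatment
      x2 = 0.5 * x1 + 0.3 * a1 + rng.normal(size=n)  # intermediate response
      a2 = rng.choice([-1, 1], size=n)               # stage-2 randomized treatment
      y = x2 + a2 * (0.4 - 0.6 * x2) + rng.normal(size=n)   # final outcome (higher = better)

      # Stage 2: regress Y on history and A2 (with an interaction term)
      H2 = np.column_stack([x1, a1, x2, a2, a2 * x2])
      q2 = LinearRegression().fit(H2, y)

      # Pseudo-outcome: value of the best stage-2 decision for each patient
      def q2_value(a2_val):
          H = np.column_stack([x1, a1, x2, np.full(n, a2_val), a2_val * x2])
          return q2.predict(H)
      y_tilde = np.maximum(q2_value(-1), q2_value(+1))

      # Stage 1: regress the pseudo-outcome on baseline history and A1
      H1 = np.column_stack([x1, a1, a1 * x1])
      q1 = LinearRegression().fit(H1, y_tilde)
      print("stage-1 rule: choose the A1 value with the larger predicted Q")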

  7. Sequencing bilateral and unilateral task-oriented training versus task oriented training alone to improve arm function in individuals with chronic stroke.

    PubMed

    McCombe Waller, Sandy; Whitall, Jill; Jenkins, Toye; Magder, Laurence S; Hanley, Daniel F; Goldberg, Andrew; Luft, Andreas R

    2014-12-14

    Recovering useful hand function after stroke is a major scientific challenge for patients with limited motor recovery. We hypothesized that sequential training beginning with proximal bilateral training followed by unilateral task-oriented training is superior to time-matched unilateral training alone, because proximal bilateral training could optimally prepare the motor system to respond to the more challenging task-oriented training. Twenty-six participants with moderate-severity hemiparesis received either 6 weeks of bilateral proximal training followed sequentially by 6 weeks of unilateral task-oriented training (COMBO) or 12 weeks of unilateral task-oriented training alone (SAEBO). A subset of 8 COMBO and 9 SAEBO participants underwent three functional magnetic resonance imaging (fMRI) scans of hand and elbow movement every 6 weeks. Outcome measures were the Fugl-Meyer Upper Extremity scale, the Modified Wolf Motor Function Test, the University of Maryland Arm Questionnaire for Stroke, and motor cortex activation (fMRI). The COMBO group demonstrated significantly greater gains between baseline and 12 weeks over all outcome measures (p = .018 based on a MANOVA test) and specifically in the Modified Wolf Motor Function Test (time). Both groups demonstrated within-group gains on the Fugl-Meyer Upper Extremity test (impairment) and the University of Maryland Arm Questionnaire for Stroke (functional use). fMRI subset analyses showed that motor cortex (primary and premotor) activation during hand movement was significantly increased by sequential combination training but not by task-oriented training alone. Sequentially combining proximal bilateral training before unilateral task-oriented training may be an effective way to facilitate gains in arm and hand function in those with moderate to severe paresis post-stroke, compared to unilateral task-oriented training alone.

  8. Nanoparticle bioconjugates as "bottom-up" assemblies of artificial multienzyme complexes

    NASA Astrophysics Data System (ADS)

    Keighron, Jacqueline D.

    2010-11-01

    The sequential enzymes of several metabolic pathways have been shown to exist in close proximity with each other in the living cell. Although not proven in all cases, such colocalization has been proposed to have several benefits for the overall rate of metabolite formation. These include reduced diffusion distance for intermediates and sequestering of intermediates from competing pathways and the cytoplasm. Restricted diffusion in the vicinity of an enzyme can also cause the pooling of metabolites, which can alter reaction equilibria to control the rate of reaction through inhibition. Associations of metabolic enzymes are difficult to isolate ex vivo due to the weak interactions believed to colocalize sequential enzymes within the cell. Therefore, model systems in which the proximity of enzymes and the diffusion of intermediates are controlled are attractive alternatives for exploring the effects of colocalization of sequential enzymes. To this end, three model systems for multienzyme complexes have been constructed. Direct-adsorption enzyme:gold nanoparticle bioconjugates functionalized with malate dehydrogenase (MDH) and citrate synthase (CS) allow the proximity between the enzymes to be controlled from the nanometer to the micron range. Results show that, while the enzymes in the colocalized and non-colocalized systems compared here behaved differently, overall the sequential activity of the pathway was improved by (1) decreasing the diffusion distance between active sites, (2) decreasing the diffusion coefficient of the reaction intermediate to prevent its escape into the bulk solution, and (3) decreasing the overall amount of bioconjugate in solution to prevent the pathway from being inhibited by the buildup of metabolite over time. Layer-by-layer (LBL) assemblies of MDH and CS were used to examine the layering effect of sequential enzymes found in multienzyme complexes such as the pyruvate dehydrogenase complex (PDC). By controlling the orientation of the enzymes in the complex (i.e., how deeply embedded each enzyme is), it was hypothesized that differences in sequential activity would determine an optimal orientation for a multienzyme complex. It was determined during the course of these experiments that the polyelectrolyte (PE) assembly itself served to slow diffusion of intermediates, leading to a buildup of oxaloacetate within the PE layers to form a pool of metabolite that equalized the rate of sequential reaction between the different orientations tested. Hexahistidine tag/Ni(II) nitrilotriacetic acid (NTA) chemistry is an attractive method to control the proximity between sequential enzymes because each enzyme can be bound in a specific orientation, with minimal loss of activity, and the interaction is reversible. Modifying gold nanoparticles or large unilamellar vesicles with this functionality allows another class of model to be constructed in which proximity between enzymes is dynamic. Some metabolic pathways (such as the de novo purine biosynthetic pathway) have demonstrated dynamic proximity of sequential enzymes in response to specific cellular stimuli. Results indicate that Ni(II)NTA scaffolds immobilize histidine-tagged enzymes non-destructively, with near-100% reversibility. This model can be used to demonstrate the possible implications of dynamic proximity, such as pathway regulation. Insight into the benefits and mechanisms of sequential enzyme colocalization can enhance the general understanding of cellular processes, as well as allow for the development of new and innovative ways to modulate pathway activity. This may provide new designs for treatments of metabolic diseases and cancer, where metabolic pathways are altered.

  9. A three-dimensional quality-guided phase unwrapping method for MR elastography

    NASA Astrophysics Data System (ADS)

    Wang, Huifang; Weaver, John B.; Perreard, Irina I.; Doyley, Marvin M.; Paulsen, Keith D.

    2011-07-01

    Magnetic resonance elastography (MRE) uses accumulated phases that are acquired at multiple, uniformly spaced relative phase offsets, to estimate harmonic motion information. Heavily wrapped phase occurs when the motion is large and unwrapping procedures are necessary to estimate the displacements required by MRE. Two unwrapping methods were developed and compared in this paper. The first method is a sequentially applied approach. The three-dimensional MRE phase image block for each slice was processed by two-dimensional unwrapping followed by a one-dimensional phase unwrapping approach along the phase-offset direction. This unwrapping approach generally works well for low noise data. However, there are still cases where the two-dimensional unwrapping method fails when noise is high. In this case, the baseline of the corrupted regions within an unwrapped image will not be consistent. Instead of separating the two-dimensional and one-dimensional unwrapping in a sequential approach, an interleaved three-dimensional quality-guided unwrapping method was developed to combine both the two-dimensional phase image continuity and one-dimensional harmonic motion information. The quality of one-dimensional harmonic motion unwrapping was used to guide the three-dimensional unwrapping procedures and it resulted in stronger guidance than in the sequential method. In this work, in vivo results generated by the two methods were compared.
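
    The sequential (non-interleaved) approach can be sketched in a few lines, assuming scikit-image for the two-dimensional spatial unwrap and NumPy for the one-dimensional unwrap along the phase-offset axis; the paper's interleaved quality-guided variant is not reproduced here.

      # Sequential unwrapping sketch: 2-D spatial unwrap per offset, then 1-D unwrap
      # along the phase-offset dimension (a simplification of the first method above).
      import numpy as np
      from skimage.restoration import unwrap_phase

      def sequential_unwrap(wrapped):
          """wrapped: (n_offsets, H, W) array of wrapped phase values in radians."""
          spatially_unwrapped = np.stack([unwrap_phase(frame) for frame in wrapped])
          return np.unwrap(spatially_unwrapped, axis=0)   # unwrap across phase offsets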

  10. Scanning Mode Sensor for Detection of Flow Inhomogeneities

    NASA Technical Reports Server (NTRS)

    Adamovsky, Grigory (Inventor)

    1998-01-01

    A scanning mode sensor and method is provided for detection of flow inhomogeneities such as shock. The field of use of this invention is ground test control and engine control during supersonic flight. Prior art measuring techniques include interferometry, Schlieren, and shadowgraph techniques. These techniques, however, have problems with light dissipation. The present method and sensor utilizes a pencil beam of energy which is passed through a transparent aperture in a flow inlet in a time-sequential manner so as to alter the energy beam. The altered beam or its effects are processed and can be studied to reveal information about flow through the inlet which can in turn be used for engine control.

  11. Scanning Mode Sensor for Detection of Flow Inhomogeneities

    NASA Technical Reports Server (NTRS)

    Adamovsky, Grigory (Inventor)

    1996-01-01

    A scanning mode sensor and method is provided for detection of flow inhomogeneities such as shock. The field of use of this invention is ground test control and engine control during supersonic flight. Prior art measuring techniques include interferometry, Schlieren, and shadowgraph techniques. These techniques, however, have problems with light dissipation. The present method and sensor utilizes a pencil beam of energy which is passed through a transparent aperture in a flow inlet in a time-sequential manner so as to alter the energy beam. The altered beam or its effects are processed and can be studied to reveal information about flow through the inlet which can in turn be used for engine control.

  12. Evaluation of TDRSS-user orbit determination accuracy using batch least-squares and sequential methods

    NASA Technical Reports Server (NTRS)

    Oza, D. H.; Jones, T. L.; Hodjatzadeh, M.; Samii, M. V.; Doll, C. E.; Hart, R. C.; Mistretta, G. D.

    1991-01-01

    The development of the Real-Time Orbit Determination/Enhanced (RTOD/E) system as a prototype system for sequential orbit determination on a Disk Operating System (DOS) based Personal Computer (PC) is addressed. The results of a study to compare the orbit determination accuracy of a Tracking and Data Relay Satellite System (TDRSS) user spacecraft obtained using RTOD/E with the accuracy of an established batch least-squares system, the Goddard Trajectory Determination System (GTDS), are also addressed. Independent assessments were made to examine the consistency of results obtained by the batch and sequential methods. Comparisons were made between the forward-filtered RTOD/E orbit solutions and definitive GTDS orbit solutions for the Earth Radiation Budget Satellite (ERBS); the maximum solution differences were less than 25 m after the filter had reached steady state.
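
    The kind of consistency check described above can be illustrated with a toy linear estimation problem, comparing a batch least-squares fit over the whole data arc with a sequential (recursive least-squares) estimator processed one measurement at a time; the measurement model and noise level are assumed.

      # Toy comparison: batch least squares vs. sequential (recursive) least squares
      import numpy as np

      rng = np.random.default_rng(2)
      t = np.linspace(0.0, 10.0, 200)
      H = np.column_stack([np.ones_like(t), t])          # measurement model z = a + b*t
      true = np.array([1.0, -0.3])
      z = H @ true + rng.normal(0.0, 0.1, size=t.size)

      # Batch least squares over the full data arc
      batch, *_ = np.linalg.lstsq(H, z, rcond=None)

      # Sequential processing: recursive least squares, one measurement at a time
      x = np.zeros(2)
      P = np.eye(2) * 1e3                                # diffuse prior covariance
      for h, zi in zip(H, z):
          k = P @ h / (h @ P @ h + 0.1 ** 2)             # gain
          x = x + k * (zi - h @ x)
          P = P - np.outer(k, h) @ P
      print(batch, x)                                    # the two solutions should agree closely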

  13. Parallel high-precision orbit propagation using the modified Picard-Chebyshev method

    NASA Astrophysics Data System (ADS)

    Koblick, Darin C.

    2012-03-01

    The modified Picard-Chebyshev method, when run in parallel, is thought to be more accurate and faster than the most efficient sequential numerical integration techniques when applied to orbit propagation problems. Previous experiments have shown that the modified Picard-Chebyshev method can achieve up to an order-of-magnitude speedup over the 12th-order Runge-Kutta-Nystrom method. For this study, the accuracy and computational time of the modified Picard-Chebyshev method are evaluated using the Java Astrodynamics Toolkit high-precision force model to assess its runtime performance. Simulation results of the modified Picard-Chebyshev method, implemented in MATLAB and the MATLAB Parallel Computing Toolbox, are compared against the most efficient first- and second-order Ordinary Differential Equation (ODE) solvers. A total of six processors were used to assess the runtime performance of the modified Picard-Chebyshev method. It was found that for all orbit propagation test cases, where the gravity model was simulated to be of higher degree and order (above 225, to increase computational overhead), the modified Picard-Chebyshev method was faster, by as much as a factor of two, than the other ODE solvers that were tested.
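
    A scalar, sequential (non-parallel) sketch of Picard iteration on Chebyshev nodes is given below using NumPy's Chebyshev utilities; the node count, iteration count, and test equation are arbitrary choices for illustration and no high-precision force model is involved.

      # Picard iteration on Chebyshev nodes for dx/dt = f(t, x), x(t0) = x0
      import numpy as np
      from numpy.polynomial import chebyshev as C

      def picard_chebyshev(f, x0, t0, tf, n_nodes=32, n_iter=30):
          s = C.chebpts2(n_nodes)                 # Chebyshev nodes on [-1, 1]
          t = 0.5 * (tf - t0) * (s + 1.0) + t0    # mapped to [t0, tf]
          x = np.full_like(t, x0)                 # initial guess: constant trajectory
          for _ in range(n_iter):
              g = f(t, x) * 0.5 * (tf - t0)       # chain rule for the change of variable
              c = C.chebfit(s, g, deg=n_nodes - 1)
              ci = C.chebint(c, lbnd=-1.0)        # integral from s = -1 (i.e. t = t0)
              x = x0 + C.chebval(s, ci)           # Picard update x <- x0 + integral of f
          return t, x

      # Example: dx/dt = -x, x(0) = 1; compare with the exact solution exp(-t)
      t, x = picard_chebyshev(lambda t, x: -x, x0=1.0, t0=0.0, tf=3.0)
      print(np.max(np.abs(x - np.exp(-t))))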

  14. Test apparatus for locating shorts during assembly of electrical buses

    NASA Technical Reports Server (NTRS)

    Deboo, G. J.; Devine, D. L. (Inventor)

    1981-01-01

    A test apparatus is described for locating electrical shorts that is especially suited for use while an electrical circuit is being fabricated or assembled. A ring counter derives input pulses from a square wave oscillator. The outputs of the counter are fed through transistors to an array of light emitting diodes. Each diode is connected to an electrical conductor, such as a bus bar, that is to be tested. In the absence of a short between the electrical conductors the diodes are sequentially illuminated. When a short occurs, a comparator/multivibrator circuit triggers an alarm and stops the oscillator and the sequential energization of the diodes. The two diodes that remain illuminated identify the electrical conductors that are shorted.

  15. Sequential and simultaneous choices: testing the diet selection and sequential choice models.

    PubMed

    Freidin, Esteban; Aw, Justine; Kacelnik, Alex

    2009-03-01

    We investigate simultaneous and sequential choices in starlings, using Charnov's Diet Choice Model (DCM) and Shapiro, Siller and Kacelnik's Sequential Choice Model (SCM) to integrate function and mechanism. During a training phase, starlings encountered one food-related option per trial (A, B or R) in random sequence and with equal probability. A and B delivered food rewards after programmed delays (shorter for A), while R ('rejection') moved directly to the next trial without reward. In this phase we measured latencies to respond. In a later, choice, phase, birds encountered the pairs A-B, A-R and B-R, the first implementing a simultaneous choice and the second and third sequential choices. The DCM predicts when R should be chosen to maximize intake rate, and SCM uses latencies of the training phase to predict choices between any pair of options in the choice phase. The predictions of both models coincided, and both successfully predicted the birds' preferences. The DCM does not deal with partial preferences, while the SCM does, and experimental results were strongly correlated to this model's predictions. We believe that the SCM may expose a very general mechanism of animal choice, and that its wider domain of success reflects the greater ecological significance of sequential over simultaneous choices.

  16. The cost and cost-effectiveness of rapid testing strategies for yaws diagnosis and surveillance.

    PubMed

    Fitzpatrick, Christopher; Asiedu, Kingsley; Sands, Anita; Gonzalez Pena, Tita; Marks, Michael; Mitja, Oriol; Meheus, Filip; Van der Stuyft, Patrick

    2017-10-01

    Yaws is a non-venereal treponemal infection caused by Treponema pallidum subspecies pertenue. The disease is targeted by WHO for eradication by 2020. Rapid diagnostic tests (RDTs) are envisaged for confirmation of clinical cases during treatment campaigns and for certification of the interruption of transmission. Yaws testing requires both treponemal (trep) and non-treponemal (non-trep) assays for diagnosis of current infection. We evaluate a sequential testing strategy (using a treponemal RDT before a trep/non-trep RDT) in terms of cost and cost-effectiveness, relative to a single-assay combined testing strategy (using the trep/non-trep RDT alone), for two use cases: individual diagnosis and community surveillance. We use cohort decision analysis to examine the diagnostic and cost outcomes. We estimate cost and cost-effectiveness of the alternative testing strategies at different levels of prevalence of past/current infection and current infection under each use case. We take the perspective of the global yaws eradication programme. We calculate the total number of correct diagnoses for each strategy over a range of plausible prevalences. We employ probabilistic sensitivity analysis (PSA) to account for uncertainty and report 95% intervals. At current prices of the treponemal and trep/non-trep RDTs, the sequential strategy is cost-saving for individual diagnosis at prevalence of past/current infection less than 85% (81-90); it is cost-saving for surveillance at less than 100%. The threshold price of the trep/non-trep RDT (below which the sequential strategy would no longer be cost-saving) is US$ 1.08 (1.02-1.14) for individual diagnosis at high prevalence of past/current infection (51%) and US$ 0.54 (0.52-0.56) for community surveillance at low prevalence (15%). We find that the sequential strategy is cost-saving for both diagnosis and surveillance in most relevant settings. In the absence of evidence assessing relative performance (sensitivity and specificity), cost-effectiveness is uncertain. However, the conditions under which the combined test only strategy might be more cost-effective than the sequential strategy are limited. A cheaper trep/non-trep RDT is needed, costing no more than US$ 0.50-1.00, depending on the use case. Our results will help enhance the cost-effectiveness of yaws programmes in the 13 countries known to be currently endemic. It will also inform efforts in the much larger group of 71 countries with a history of yaws, many of which will have to undertake surveillance to confirm the interruption of transmission.
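
    The core of the cohort decision analysis can be illustrated with a back-of-the-envelope expected-cost calculation for the two strategies; all prices, sensitivity, and specificity values below are illustrative assumptions rather than the study's inputs.

      # Expected cost per person screened: sequential (treponemal RDT first, then the
      # trep/non-trep RDT only for screen-positives) vs. combined test alone.
      def expected_cost(prev_past_or_current, cost_trep=0.35, cost_combo=2.00,
                        sens_trep=0.95, spec_trep=0.95, strategy="sequential"):
          if strategy == "combined":
              return cost_combo                              # everyone gets the dual test
          # sequential: everyone gets the treponemal RDT; only screen-positives get the dual test
          p_screen_pos = (prev_past_or_current * sens_trep
                          + (1.0 - prev_past_or_current) * (1.0 - spec_trep))
          return cost_trep + p_screen_pos * cost_combo

      for prev in (0.15, 0.50, 0.85):
          seq = expected_cost(prev, strategy="sequential")
          comb = expected_cost(prev, strategy="combined")
          print(f"prevalence {prev:.0%}: sequential ${seq:.2f} vs combined ${comb:.2f}")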

  17. A generic motif discovery algorithm for sequential data.

    PubMed

    Jensen, Kyle L; Styczynski, Mark P; Rigoutsos, Isidore; Stephanopoulos, Gregory N

    2006-01-01

    Motif discovery in sequential data is a problem of great interest and with many applications. However, previous methods have been unable to combine exhaustive search with complex motif representations and are each typically only applicable to a certain class of problems. Here we present a generic motif discovery algorithm (Gemoda) for sequential data. Gemoda can be applied to any dataset with a sequential character, including both categorical and real-valued data. As we show, Gemoda deterministically discovers motifs that are maximal in composition and length. As well, the algorithm allows any choice of similarity metric for finding motifs. Finally, Gemoda's output motifs are representation-agnostic: they can be represented using regular expressions, position weight matrices or any number of other models for any type of sequential data. We demonstrate a number of applications of the algorithm, including the discovery of motifs in amino acids sequences, a new solution to the (l,d)-motif problem in DNA sequences and the discovery of conserved protein substructures. Gemoda is freely available at http://web.mit.edu/bamel/gemoda

  18. Fictitious domain method for fully resolved reacting gas-solid flow simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Longhui; Liu, Kai; You, Changfu

    2015-10-01

    Fully resolved simulation (FRS) for gas-solid multiphase flow considers solid objects as finite-sized regions in the flow field, and their behaviours are predicted by solving equations in both the fluid and solid regions directly. Fixed-mesh numerical methods, such as the fictitious domain method, are preferred for solving FRS problems and have been widely researched. However, for reacting gas-solid flows no suitable fictitious domain numerical method has been developed. This work presents a new fictitious domain finite element method for FRS of reacting particulate flows. Low-Mach-number reacting flow governing equations are solved sequentially on a regular background mesh. Particles are immersed in the mesh and driven by the surface forces and torques integrated over their immersed interfaces. Additional treatments for energy and surface reactions are developed. Several numerical test cases validated the method, and a simulation of a falling array of burning carbon particles demonstrated its capability for solving problems with moving, reacting particle clusters.

  19. Random sequential adsorption of cubes

    NASA Astrophysics Data System (ADS)

    Cieśla, Michał; Kubala, Piotr

    2018-01-01

    Random packings built of cubes are studied numerically using a random sequential adsorption algorithm. To compare the obtained results with previous reports, three different models of cube orientation sampling were used. Also, three different cube-cube intersection algorithms were tested to find the most efficient one. The study focuses on the mean saturated packing fraction as well as kinetics of packing growth. Microstructural properties of packings were analyzed using density autocorrelation function.
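
    A stripped-down RSA sketch is shown below for axis-aligned cubes in a unit box (no rotations, no periodic boundaries), which conveys the adsorption loop even though it omits the orientation sampling and intersection tests studied in the paper; the edge length and the consecutive-failure stopping rule are assumed.

      # Random sequential adsorption of axis-aligned cubes in a unit box
      import numpy as np

      def rsa_aligned_cubes(edge=0.15, max_failures=5000, rng=np.random.default_rng(3)):
          centers = []
          failures = 0
          while failures < max_failures:                 # crude proxy for saturation
              c = rng.uniform(edge / 2, 1 - edge / 2, size=3)
              # axis-aligned equal cubes overlap iff they overlap along every axis
              if any(np.all(np.abs(c - p) < edge) for p in centers):
                  failures += 1
              else:
                  centers.append(c)
                  failures = 0
          return np.array(centers)

      EDGE = 0.15
      cubes = rsa_aligned_cubes(edge=EDGE)
      print(len(cubes), len(cubes) * EDGE ** 3)          # count and packing fraction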

  20. Analyzing Communication Architectures Using Commercial Off-The-Shelf (COTS) Modeling and Simulation Tools

    DTIC Science & Technology

    1998-06-01

    4] By 2010, we should be able to change how we conduct the most intense joint operations. Instead of relying on massed forces and sequential ...not independent, sequential steps. Data probes to support the analysis phase were required to complete the logical models. This generated a need...Networks) Identify Granularity (System Level) - Establish Physical Bounds or Limits to Systems • Determine System Test Configuration and Lineup

  1. Topics in the Sequential Design of Experiments

    DTIC Science & Technology

    1992-03-01

    decision , unless so designated by other documentation. 12a. DISTRIBUTION /AVAILABIIUTY STATEMENT 12b. DISTRIBUTION CODE Approved for public release...3 0 1992 D 14. SUBJECT TERMS 15. NUMBER OF PAGES12 Design of Experiments, Renewal Theory , Sequential Testing 1 2. PRICE CODE Limit Theory , Local...distributions for one parameter exponential families," by Michael Woodroofe. Stntca, 2 (1991), 91-112. [6] "A non linear renewal theory for a functional of

  2. A Pocock Approach to Sequential Meta-Analysis of Clinical Trials

    ERIC Educational Resources Information Center

    Shuster, Jonathan J.; Neu, Josef

    2013-01-01

    Three recent papers have provided sequential methods for meta-analysis of two-treatment randomized clinical trials. This paper provides an alternate approach that has three desirable features. First, when carried out prospectively (i.e., we only have the results up to the time of our current analysis), we do not require knowledge of the…

  3. Sequential-Injection Analysis: Principles, Instrument Construction, and Demonstration by a Simple Experiment

    ERIC Educational Resources Information Center

    Economou, A.; Tzanavaras, P. D.; Themelis, D. G.

    2005-01-01

    Sequential-injection analysis (SIA) is an approach to sample handling that enables the automation of manual wet-chemistry procedures in a rapid, precise and efficient manner. Experiments using SIA fit well in the course of Instrumental Chemical Analysis and especially in the section of Automatic Methods of analysis provided by chemistry…

  4. Propagating probability distributions of stand variables using sequential Monte Carlo methods

    Treesearch

    Jeffrey H. Gove

    2009-01-01

    A general probabilistic approach to stand yield estimation is developed based on sequential Monte Carlo filters, also known as particle filters. The essential steps in the development of the sampling importance resampling (SIR) particle filter are presented. The SIR filter is then applied to simulated and observed data showing how the 'predictor - corrector'...
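
    A minimal SIR particle filter for a scalar state is sketched below to make the 'predictor-corrector' steps concrete; the growth model, noise levels, and prior are toy assumptions, not the stand-level yield model used in the paper.

      # Sampling importance resampling (SIR) particle filter for a scalar state
      import numpy as np

      rng = np.random.default_rng(4)

      def sir_filter(observations, n_particles=1000, proc_sd=0.5, obs_sd=2.0):
          particles = rng.normal(10.0, 5.0, size=n_particles)     # draw from the prior
          estimates = []
          for z in observations:
              # predictor: propagate particles through the (toy) state model
              particles = particles + 1.0 + rng.normal(0.0, proc_sd, size=n_particles)
              # corrector: weight by the observation likelihood
              w = np.exp(-0.5 * ((z - particles) / obs_sd) ** 2)
              w /= w.sum()
              # importance resampling to avoid weight degeneracy
              idx = rng.choice(n_particles, size=n_particles, p=w)
              particles = particles[idx]
              estimates.append(particles.mean())
          return np.array(estimates)

      true = 10.0 + np.arange(20)                                  # hypothetical trajectory
      obs = true + rng.normal(0.0, 2.0, size=true.size)
      print(sir_filter(obs))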

  5. Exploring Liquid Sequential Injection Chromatography to Teach Fundamentals of Separation Methods: A Very Fast Analytical Chemistry Experiment

    ERIC Educational Resources Information Center

    Penteado, Jose C.; Masini, Jorge Cesar

    2011-01-01

    The influence of solvent strength (determined by the addition of a mobile-phase organic modifier) and of pH on the chromatographic separation of sorbic acid and vanillin has been investigated by the relatively new technique of liquid sequential injection chromatography (SIC). This technique uses a reversed-phase monolithic stationary phase to execute fast…

  6. Lexical and Grammatical Associations in Sequential Bilingual Preschoolers

    ERIC Educational Resources Information Center

    Kohnert, Kathryn; Kan, Pui Fong; Conboy, Barbara T.

    2010-01-01

    Purpose: The authors investigated potential relationships between traditional linguistic domains (words, grammar) in the first (L1) and second (L2) languages of young sequential bilingual preschool children. Method: Participants were 19 children, ages 2;11 (years;months) to 5;2 (M = 4;3) who began learning Hmong as the L1 from birth and English as…

  7. Comparison of solution-mixed and sequentially processed P3HT: F4TCNQ films: effect of doping-induced aggregation on film morphology

    DOE PAGES

    Jacobs, Ian E.; Aasen, Erik W.; Oliveira, Julia L.; ...

    2016-03-23

    Doping polymeric semiconductors often drastically reduces the solubility of the polymer, leading to difficulties in processing doped films. Here, we compare optical, electrical, and morphological properties of P3HT films doped with F4TCNQ, both from mixed solutions and using sequential solution processing with orthogonal solvents. We demonstrate that sequential doping occurs rapidly (<1 s), and that the film doping level can be precisely controlled by varying the concentration of the doping solution. Furthermore, the choice of sequential doping solvent controls whether dopant anions are included or excluded from polymer crystallites. Atomic force microscopy (AFM) reveals that sequential doping produces significantly more uniform films on the nanoscale than the mixed-solution method. In addition, we show that mixed-solution doping induces the formation of aggregates even at low doping levels, resulting in drastic changes to film morphology. Sequentially coated films show 3–15 times higher conductivities at a given doping level than solution-doped films, with sequentially doped films processed to exclude dopant anions from polymer crystallites showing the highest conductivities. In conclusion, we propose a mechanism for doping-induced aggregation in which the shift of the polymer HOMO level upon aggregation couples ionization and solvation energies. To show that the methodology is widely applicable, we demonstrate that several different polymer:dopant systems can be prepared by sequential doping.

  8. Comparison of solution-mixed and sequentially processed P3HT: F4TCNQ films: effect of doping-induced aggregation on film morphology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobs, Ian E.; Aasen, Erik W.; Oliveira, Julia L.

    Doping polymeric semiconductors often drastically reduces the solubility of the polymer, leading to difficulties in processing doped films. Here, we compare optical, electrical, and morphological properties of P3HT films doped with F4TCNQ, both from mixed solutions and using sequential solution processing with orthogonal solvents. We demonstrate that sequential doping occurs rapidly (<1 s), and that the film doping level can be precisely controlled by varying the concentration of the doping solution. Furthermore, the choice of sequential doping solvent controls whether dopant anions are included or excluded from polymer crystallites. Atomic force microscopy (AFM) reveals that sequential doping produces significantly more uniform films on the nanoscale than the mixed-solution method. In addition, we show that mixed-solution doping induces the formation of aggregates even at low doping levels, resulting in drastic changes to film morphology. Sequentially coated films show 3–15 times higher conductivities at a given doping level than solution-doped films, with sequentially doped films processed to exclude dopant anions from polymer crystallites showing the highest conductivities. In conclusion, we propose a mechanism for doping induced aggregation in which the shift of the polymer HOMO level upon aggregation couples ionization and solvation energies. To show that the methodology is widely applicable, we demonstrate that several different polymer:dopant systems can be prepared by sequential doping.

  9. Automatic bearing fault diagnosis of permanent magnet synchronous generators in wind turbines subjected to noise interference

    NASA Astrophysics Data System (ADS)

    Guo, Jun; Lu, Siliang; Zhai, Chao; He, Qingbo

    2018-02-01

    An automatic bearing fault diagnosis method is proposed for permanent magnet synchronous generators (PMSGs), which are widely installed in wind turbines subjected to low rotating speeds, speed fluctuations, and electrical device noise interferences. The mechanical rotating angle curve is first extracted from the phase current of a PMSG by sequentially applying a series of algorithms. The synchronous sampled vibration signal of the fault bearing is then resampled in the angular domain according to the obtained rotating phase information. Considering that the resampled vibration signal is still overwhelmed by heavy background noise, an adaptive stochastic resonance filter is applied to the resampled signal to enhance the fault indicator and facilitate bearing fault identification. Two types of fault bearings with different fault sizes in a PMSG test rig are subjected to experiments to test the effectiveness of the proposed method. The proposed method is fully automated and thus shows potential for convenient, highly efficient and in situ bearing fault diagnosis for wind turbines subjected to harsh environments.
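
    A minimal sketch of the angular-domain resampling step described above, assuming the instantaneous shaft angle has already been extracted (here it is simply synthesized); the phase-current analysis and the adaptive stochastic resonance filter of the paper are not reproduced.

        import numpy as np

        def angular_resample(t, vib, t_angle, angle, samples_per_rev=256):
            """Resample a time-domain vibration signal onto a uniform shaft-angle grid."""
            # Instantaneous shaft angle at each vibration sample (by interpolation).
            theta = np.interp(t, t_angle, angle)
            # Uniform angular grid covering the observed rotation.
            theta_uniform = np.arange(theta[0], theta[-1], 2 * np.pi / samples_per_rev)
            # Vibration amplitude re-expressed as a function of angle.
            return theta_uniform, np.interp(theta_uniform, theta, vib)

        # Toy usage: a slightly fluctuating speed plus an order-2 vibration component.
        t = np.linspace(0, 1, 5000)
        angle = 2 * np.pi * (10 * t + 0.3 * np.sin(2 * np.pi * t))   # hypothetical phase curve
        vib = np.sin(2 * angle) + 0.1 * np.random.randn(t.size)
        theta_u, vib_u = angular_resample(t, vib, t, angle)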

  10. Constrained optimization by radial basis function interpolation for high-dimensional expensive black-box problems with infeasible initial points

    NASA Astrophysics Data System (ADS)

    Regis, Rommel G.

    2014-02-01

    This article develops two new algorithms for constrained expensive black-box optimization that use radial basis function surrogates for the objective and constraint functions. These algorithms are called COBRA and Extended ConstrLMSRBF and, unlike previous surrogate-based approaches, they can be used for high-dimensional problems where all initial points are infeasible. They both follow a two-phase approach where the first phase finds a feasible point while the second phase improves this feasible point. COBRA and Extended ConstrLMSRBF are compared with alternative methods on 20 test problems and on the MOPTA08 benchmark automotive problem (D.R. Jones, Presented at MOPTA 2008), which has 124 decision variables and 68 black-box inequality constraints. The alternatives include a sequential penalty derivative-free algorithm, a direct search method with kriging surrogates, and two multistart methods. Numerical results show that COBRA algorithms are competitive with Extended ConstrLMSRBF and they generally outperform the alternatives on the MOPTA08 problem and most of the test problems.

  11. Novel biorelevant dissolution medium as a prognostic tool for polysaccharide-based colon-targeted drug delivery system.

    PubMed

    Yadav, Ankit Kumar; Sadora, Manik; Singh, Sachin Kumar; Gulati, Monica; Maharshi, Peddi; Sharma, Abhinav; Kumar, Bimlesh; Rathee, Harish; Ghai, Deepak; Malik, Adil Hussain; Garg, Varun; Gowthamrajan, K

    2017-01-01

    To overcome the limitations of the conventionally used methods for evaluation of orally administered colon-targeted delivery systems, a novel dissolution method using probiotics has been recently reported. In the present study, the universal suitability of this medium, composed of five different probiotics, is established. Different delivery systems - mini tablets, liquisolid compacts, and microspheres coated with different polysaccharides - were prepared and subjected to sequential dissolution testing in medium with and without microbiota. The results obtained from the fluid thioglycollate medium (FTM)-based probiotic medium for all the polysaccharide-based formulations showed dissolution profiles statistically similar to those in the rat and goat cecal content media. Hence, it can be concluded that the developed FTM-based probiotic medium, once established, may eliminate the need for further animal sacrifice in the dissolution testing of polysaccharide-based colon-targeted delivery systems.

  12. Comparison between variable and fixed dwell-time PN acquisition algorithms. [for synchronization in pseudonoise spread spectrum systems

    NASA Technical Reports Server (NTRS)

    Braun, W. R.

    1981-01-01

    Pseudo noise (PN) spread spectrum systems require a very accurate alignment between the PN code epochs at the transmitter and receiver. This synchronism is typically established through a two-step algorithm, including a coarse synchronization procedure and a fine synchronization procedure. A standard approach for the coarse synchronization is a sequential search over all code phases. The measurement of the power in the filtered signal is used to either accept or reject the code phase under test as the phase of the received PN code. This acquisition strategy, called a single dwell-time system, has been analyzed by Holmes and Chen (1977). A synopsis of the field of sequential analysis as it applies to the PN acquisition problem is provided. From this, the implementation of the variable dwell time algorithm as a sequential probability ratio test is developed. The performance of this algorithm is compared to the optimum detection algorithm and to the fixed dwell-time system.
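
    The variable dwell-time decision can be illustrated with a minimal Wald sequential probability ratio test sketch; the Gaussian power model, means and error probabilities below are illustrative assumptions rather than the detector analyzed in the paper.

        import numpy as np

        def sprt(samples, mu0, mu1, sigma, alpha=0.01, beta=0.01):
            """Wald SPRT for H1: mean mu1 (code aligned) vs H0: mean mu0 (not aligned)."""
            A = np.log((1 - beta) / alpha)      # accept-H1 threshold
            B = np.log(beta / (1 - alpha))      # accept-H0 threshold
            llr = 0.0
            for n, x in enumerate(samples, 1):
                # Gaussian log-likelihood ratio of one more dwell measurement.
                llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
                if llr >= A:
                    return "accept phase", n
                if llr <= B:
                    return "reject phase", n
            return "undecided", len(samples)

        rng = np.random.default_rng(1)
        decision, n_used = sprt(rng.normal(1.0, 0.5, 100), mu0=0.0, mu1=1.0, sigma=0.5)
        print(decision, n_used)   # variable dwell time: n_used is typically small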

  13. Constrained multiple indicator kriging using sequential quadratic programming

    NASA Astrophysics Data System (ADS)

    Soltani-Mohammadi, Saeed; Erhan Tercan, A.

    2012-11-01

    Multiple indicator kriging (MIK) is a nonparametric method used to estimate conditional cumulative distribution functions (CCDF). Indicator estimates produced by MIK may not satisfy the order relations of a valid CCDF, which is ordered and bounded between 0 and 1. In this paper a new method is presented that guarantees the order relations of the cumulative distribution functions estimated by multiple indicator kriging. The method is based on minimizing the sum of kriging variances for each cutoff under unbiasedness and order-relation constraints, and on solving the constrained indicator kriging system by sequential quadratic programming. A computer code is written in the Matlab environment to implement the developed algorithm, and the method is applied to the thickness data.
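
    As a rough illustration of enforcing order relations with sequential quadratic programming, the sketch below projects raw indicator estimates onto a valid CCDF using SciPy's SLSQP solver; the stand-in quadratic objective replaces the paper's full sum-of-kriging-variances system.

        import numpy as np
        from scipy.optimize import minimize

        def order_constrained_ccdf(raw):
            """Project raw indicator-kriging estimates onto a valid CCDF:
            values bounded in [0, 1] and non-decreasing across cutoffs."""
            n = len(raw)
            objective = lambda f: np.sum((f - raw) ** 2)        # stand-in quadratic objective
            constraints = [{"type": "ineq", "fun": lambda f, i=i: f[i + 1] - f[i]}
                           for i in range(n - 1)]               # F(z_{i+1}) >= F(z_i)
            res = minimize(objective, x0=np.clip(raw, 0, 1),
                           method="SLSQP", bounds=[(0.0, 1.0)] * n,
                           constraints=constraints)
            return res.x

        raw_estimates = np.array([0.12, 0.35, 0.31, 0.70, 1.05])  # violates order relations
        print(order_constrained_ccdf(raw_estimates))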

  14. Biodegradation and detoxification of textile azo dyes by bacterial consortium under sequential microaerophilic/aerobic processes

    PubMed Central

    Lade, Harshad; Kadam, Avinash; Paul, Diby; Govindwar, Sanjay

    2015-01-01

    Release of textile azo dyes to the environment is an issue of health concern, while the use of microorganisms has proved to be the best option for remediation. Thus, in the present study, a bacterial consortium consisting of Providencia rettgeri strain HSL1 and Pseudomonas sp. SUK1 has been investigated for degradation and detoxification of structurally different azo dyes. The consortium showed 98-99 % decolorization of all the selected azo dyes, viz. Reactive Black 5 (RB 5), Reactive Orange 16 (RO 16), Disperse Red 78 (DR 78) and Direct Red 81 (DR 81), within 12 to 30 h at 100 mg L-1 concentration at 30 ± 0.2 °C under microaerophilic, sequential aerobic/microaerophilic and microaerophilic/aerobic processes. However, decolorization under microaerophilic conditions, viz. RB 5 (0.26 mM), RO 16 (0.18 mM), DR 78 (0.20 mM) and DR 81 (0.23 mM), and sequential aerobic/microaerophilic processes, viz. RB 5 (0.08 mM), RO 16 (0.06 mM), DR 78 (0.07 mM) and DR 81 (0.09 mM), resulted in the formation of aromatic amines. In contrast, the sequential microaerophilic/aerobic process did not show the formation of amines. Additionally, a 62-72 % reduction in total organic carbon content was observed in all the dye-decolorized broths under sequential microaerophilic/aerobic processes, suggesting the efficacy of the method in mineralization of the dyes. Notable induction in the levels of azoreductase and NADH-DCIP reductase (97 and 229 % for RB 5, 55 and 160 % for RO 16, 63 and 196 % for DR 78, 108 and 258 % for DR 81) observed under sequential microaerophilic/aerobic processes suggested their critical involvement in the initial breakdown of azo bonds, whereas a slight increase in the levels of laccase and veratryl alcohol oxidase confirmed subsequent oxidation of the formed amines. Also, the acute toxicity assay with Daphnia magna revealed the nontoxic nature of the dye-degraded metabolites under sequential microaerophilic/aerobic processes. As biodegradation under the sequential microaerophilic/aerobic process completely detoxified all the selected textile azo dyes, further efforts should be made to implement such methods in large-scale dye wastewater treatment technologies. PMID:26417357

  15. Sequential CFAR detectors using a dead-zone limiter

    NASA Astrophysics Data System (ADS)

    Tantaratana, Sawasd

    1990-09-01

    The performances of some proposed sequential constant-false-alarm-rate (CFAR) detectors are evaluated. The observations are passed through a dead-zone limiter, the output of which is -1, 0, or +1, depending on whether the input is less than -c, between -c and c, or greater than c, where c is a constant. The test statistic is the sum of the outputs. The test is performed on a reduced set of data (those with absolute value larger than c), with the test statistic being the sum of the signs of the reduced set of data. Both constant and linear boundaries are considered. Numerical results show a significant reduction of the average number of observations needed to achieve the same false alarm and detection probabilities as a fixed-sample-size CFAR detector using the same kind of test statistic.
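
    A minimal sketch of the dead-zone limiter statistic with constant boundaries, assuming symmetric detect/dismiss thresholds and a truncation length chosen for illustration; the linear-boundary case and the CFAR threshold design of the paper are not reproduced.

        import numpy as np

        def dead_zone_sequential_test(x, c, a, b, n_max):
            """Sequential sign test on dead-zone-limited data: the statistic is the
            running sum of sign(x_i) over samples with |x_i| > c, compared against
            constant boundaries a (declare target) and -b (dismiss)."""
            s = 0
            used = 0
            for xi in x:
                if abs(xi) <= c:          # dead zone: sample contributes 0
                    continue
                s += 1 if xi > c else -1
                used += 1
                if s >= a:
                    return "target present", used
                if s <= -b:
                    return "target absent", used
                if used >= n_max:
                    break
            return "truncated: no decision", used

        rng = np.random.default_rng(2)
        print(dead_zone_sequential_test(rng.normal(0.4, 1.0, 500), c=0.5, a=10, b=10, n_max=200))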

  16. Is there a role for antibody testing in the diagnosis of invasive candidiasis?

    PubMed

    Quindós, Guillermo; Moragues, María Dolores; Pontón, José

    2004-03-01

    During the last decades, the use of antibody tests for the diagnosis of invasive mycoses has declined as a consequence of the general belief that they are insensitive and non-specific. However, there is clear evidence that antibodies can be detected in highly immunodeficient patients (such as bone marrow transplant recipients), and that those antibodies are useful for the diagnosis. Antibody tests are currently in use as diagnostic tools for some primary mycoses, such as the endemic mycoses, aspergilloma, allergic bronchopulmonary aspergillosis and sporotrichosis. For invasive candidiasis, diagnostic methods must differentiate Candida colonization of mucous membranes or superficial infection from tissue invasion by this microorganism. Substantial progress has been made in the diagnosis of invasive candidiasis with the development of a variety of methods for the detection of antibodies and antigens. However, no single test has found widespread clinical use, and there is a consensus that diagnosis based on a single specimen lacks sensitivity. It is necessary to test sequential samples taken while the patient is at greatest risk of developing invasive candidiasis to optimize the diagnosis. Results obtained from a panel of diagnostic tests in association with clinical aspects will likely be the most useful strategy for early diagnosis and therapy.

  17. Validation of antibiotic residue tests for dairy goats.

    PubMed

    Zeng, S S; Hart, S; Escobar, E N; Tesfai, K

    1998-03-01

    The SNAP test, LacTek test (B-L and CEF), Charm Bacillus stearothermophilus var. calidolactis disk assay (BsDA), and Charm II Tablet Beta-lactam sequential test were validated using antibiotic-fortified and -incurred goat milk following the protocol for test kit validations of the U.S. Food and Drug Administration Center for Veterinary Medicine. The SNAP, Charm BsDA, and Charm II Tablet Sequential tests were sensitive and reliable in detecting antibiotic residues in goat milk. All three assays showed greater than 90% sensitivity and specificity at tolerance and detection levels. However, caution should be taken in interpreting test results at detection levels. Because of the high sensitivity of these three tests, false-violative results could be obtained in goat milk containing antibiotic residues below the tolerance level. Goat milk testing positive by these tests must be confirmed using a more sophisticated methodology, such as high-performance liquid chromatography, before the milk is condemned. The LacTek B-L test did not detect several antibiotics, including penicillin G, in goat milk at tolerance levels. However, LacTek CEF was excellent in detecting ceftiofur residue in goat milk.

  18. Adrenal vein sampling in primary aldosteronism: concordance of simultaneous vs sequential sampling.

    PubMed

    Almarzooqi, Mohamed-Karji; Chagnon, Miguel; Soulez, Gilles; Giroux, Marie-France; Gilbert, Patrick; Oliva, Vincent L; Perreault, Pierre; Bouchard, Louis; Bourdeau, Isabelle; Lacroix, André; Therasse, Eric

    2017-02-01

    Many investigators believe that basal adrenal venous sampling (AVS) should be done simultaneously, whereas others opt for sequential AVS for simplicity and reduced cost. This study aimed to evaluate the concordance of sequential and simultaneous AVS methods. Between 1989 and 2015, bilateral simultaneous sets of basal AVS were obtained twice within 5 min, in 188 consecutive patients (59 women and 129 men; mean age: 53.4 years). Selectivity was defined by an adrenal-to-peripheral cortisol ratio ≥2, and lateralization was defined as an adrenal aldosterone-to-cortisol ratio ≥2 times that of the contralateral side. Sequential AVS was simulated using right sampling at -5 min (t = -5) and left sampling at 0 min (t = 0). There was no significant difference in mean selectivity ratio (P = 0.12 and P = 0.42 for the right and left sides respectively) or in mean lateralization ratio (P = 0.93) between t = -5 and t = 0. Kappa for selectivity between 2 simultaneous AVS was 0.71 (95% CI: 0.60-0.82), whereas it was 0.84 (95% CI: 0.76-0.92) and 0.85 (95% CI: 0.77-0.93) between sequential and simultaneous AVS at -5 min and at 0 min respectively. Kappa for lateralization between 2 simultaneous AVS was 0.84 (95% CI: 0.75-0.93), whereas it was 0.86 (95% CI: 0.78-0.94) and 0.80 (95% CI: 0.71-0.90) between sequential AVS and simultaneous AVS at -5 min and at 0 min respectively. Concordance between simultaneous and sequential AVS was not different from that between 2 repeated simultaneous AVS in the same patient. Therefore, a better diagnostic performance is not a good argument to select the AVS method. © 2017 European Society of Endocrinology.
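
    The cut-offs quoted above translate into a small calculation per sampling set; the sketch below applies them to hypothetical hormone values and is only an illustration of the definitions, not the study's analysis code.

        def avs_indices(ald_r, cort_r, ald_l, cort_l, ald_p, cort_p):
            """Selectivity and lateralization indices from one AVS set
            (r = right adrenal, l = left adrenal, p = peripheral vein)."""
            selective_r = cort_r / cort_p >= 2          # adrenal-to-peripheral cortisol ratio
            selective_l = cort_l / cort_p >= 2
            ratio_r = ald_r / cort_r                    # cortisol-corrected aldosterone
            ratio_l = ald_l / cort_l
            li = max(ratio_r, ratio_l) / min(ratio_r, ratio_l)   # lateralization index
            lateralized = selective_r and selective_l and li >= 2
            return {"selective": (selective_r, selective_l),
                    "lateralization_index": li,
                    "lateralized": lateralized}

        # Hypothetical hormone values (aldosterone and cortisol in consistent arbitrary units)
        print(avs_indices(ald_r=5200, cort_r=820, ald_l=310, cort_l=760, ald_p=28, cort_p=95))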

  19. New protocol for dissociating visuospatial working memory ability in reaching space and in navigational space.

    PubMed

    Lupo, Michela; Ferlazzo, Fabio; Aloise, Fabio; Di Nocera, Francesco; Tedesco, Anna Maria; Cardillo, Chiara; Leggio, Maria

    2018-04-27

    Several studies have demonstrated that the processing of visuospatial memory for locations in reaching space and in navigational space is supported by independent systems, and that the coding of visuospatial information depends on the modality of the presentation (i.e., sequential or simultaneous). However, these lines of evidence and the most common neuropsychological tests used by clinicians to investigate visuospatial memory have several limitations (e.g., they are unable to analyze all the subcomponents of this function and are not directly comparable). Therefore, we developed a new battery of tests that is able to investigate these subcomponents. We recruited 71 healthy subjects who underwent sequential and simultaneous navigational tests by using an innovative sensorized platform, as well as comparable paper tests to evaluate the same components in reaching space (Exp. 1). Consistent with the literature, the principal-component method of analysis used in this study demonstrated the presence of distinct memory for sequences in different portions of space, but no distinction was found for simultaneous presentation, suggesting that different modalities of eye gaze exploration are used when subjects have to perform different types of tasks. For this purpose, an infrared Tobii Eye-Tracking X50 system was used in both spatial conditions (Exp. 2), showing that a clear effect of the presentation modality was due to the specific strategy used by subjects to explore the stimuli in space. Given these findings, the neuropsychological battery established in the present study allows us to show basic differences in the normal coding of stimuli, which can explain the specific visuospatial deficits found in various neurological conditions.

  20. Analysis and optimization of population annealing

    NASA Astrophysics Data System (ADS)

    Amey, Christopher; Machta, Jonathan

    2018-03-01

    Population annealing is an easily parallelizable sequential Monte Carlo algorithm that is well suited for simulating the equilibrium properties of systems with rough free-energy landscapes. In this work we seek to understand and improve the performance of population annealing. We derive several useful relations between quantities that describe the performance of population annealing and use these relations to suggest methods to optimize the algorithm. These optimization methods were tested by performing large-scale simulations of the three-dimensional (3D) Edwards-Anderson (Ising) spin glass and measuring several observables. The optimization methods were found to substantially decrease the amount of computational work necessary as compared to previously used, unoptimized versions of population annealing. We also obtain more accurate values of several important observables for the 3D Edwards-Anderson model.

  1. A simple algorithm for sequentially incorporating gravity observations in seismic traveltime tomography

    USGS Publications Warehouse

    Parsons, T.; Blakely, R.J.; Brocher, T.M.

    2001-01-01

    The geologic structure of the Earth's upper crust can be revealed by modeling variation in seismic arrival times and in potential field measurements. We demonstrate a simple method for sequentially satisfying seismic traveltime and observed gravity residuals in an iterative 3-D inversion. The algorithm is portable to any seismic analysis method that uses a gridded representation of velocity structure. Our technique calculates the gravity anomaly resulting from a velocity model by converting to density with Gardner's rule. The residual between calculated and observed gravity is minimized by weighted adjustments to the model velocity-depth gradient where the gradient is steepest and where seismic coverage is least. The adjustments are scaled by the sign and magnitude of the gravity residuals, and a smoothing step is performed to minimize vertical streaking. The adjusted model is then used as a starting model in the next seismic traveltime iteration. The process is repeated until one velocity model can simultaneously satisfy both the gravity anomaly and seismic traveltime observations within acceptable misfits. We test our algorithm with data gathered in the Puget Lowland of Washington state, USA (Seismic Hazards Investigation in Puget Sound [SHIPS] experiment). We perform resolution tests with synthetic traveltime and gravity observations calculated with a checkerboard velocity model using the SHIPS experiment geometry, and show that the addition of gravity significantly enhances resolution. We calculate a new velocity model for the region using SHIPS traveltimes and observed gravity, and show examples where correlation between surface geology and modeled subsurface velocity structure is enhanced.
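
    Two ingredients of the loop — Gardner's rule for converting velocity to density and a residual-scaled velocity adjustment — can be sketched crudely for a single 1-D column with a Bouguer-slab gravity forward model; the constants, step size and clipping below are illustrative assumptions, and the full 3-D traveltime inversion is not reproduced.

        import numpy as np

        G = 6.674e-11                     # gravitational constant, m^3 kg^-1 s^-2

        def gardner_density(vp_ms):
            """Gardner's rule: density (kg/m^3) from P-wave velocity (m/s)."""
            return 1000.0 * 0.31 * vp_ms ** 0.25      # 0.31*Vp^0.25 gives g/cm^3

        def bouguer_anomaly(vp_column, dz):
            """Gravity effect of a stack of infinite slabs (very crude forward model)."""
            rho = gardner_density(vp_column)
            return 2 * np.pi * G * np.sum(rho * dz)

        def adjust_velocity(vp_column, dz, g_obs, step=50.0):
            """Nudge the column's velocities according to the sign and size of the
            gravity residual (scale factors chosen arbitrarily for illustration)."""
            residual = g_obs - bouguer_anomaly(vp_column, dz)
            weight = min(abs(residual) * 1e5, 1.0)
            return vp_column + step * np.sign(residual) * weight

        vp = np.array([1800.0, 2500.0, 3200.0, 4500.0])   # hypothetical starting model, m/s
        dz = 500.0                                        # layer thickness, m
        print(adjust_velocity(vp, dz, g_obs=1.1 * bouguer_anomaly(vp, dz)))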

  2. A sequential method for spline approximation with variable knots. [recursive piecewise polynomial signal processing

    NASA Technical Reports Server (NTRS)

    Mier Muth, A. M.; Willsky, A. S.

    1978-01-01

    In this paper we describe a method for approximating a waveform by a spline. The method is quite efficient, as the data are processed sequentially. The basis of the approach is to view the approximation problem as a question of estimation of a polynomial in noise, with the possibility of abrupt changes in the highest derivative. This allows us to bring several powerful statistical signal processing tools into play. We also present some initial results on the application of our technique to the processing of electrocardiograms, where the knot locations themselves may be some of the most important pieces of diagnostic information.

  3. The Use of Test Results from ASA Workshops to Evaluate Workshop Effectiveness

    ERIC Educational Resources Information Center

    Donegan, Judith H.; And Others

    1976-01-01

    Results of tests given to participants in six American Society of Anesthesiologists workshops were analyzed to determine whether attendance increased scores on sequential tests (before, immediately after, and three months later). Both workshop and control groups of anesthesiologists increased their scores with each successive test. (Editor/JT)

  4. Robustness of Ability Estimation to Multidimensionality in CAST with Implications to Test Assembly

    ERIC Educational Resources Information Center

    Zhang, Yanwei; Nandakumar, Ratna

    2006-01-01

    Computer Adaptive Sequential Testing (CAST) is a test delivery model that combines features of the traditional conventional paper-and-pencil testing and item-based computerized adaptive testing (CAT). The basic structure of CAST is a panel composed of multiple testlets adaptively administered to examinees at different stages. Current applications…

  5. Sequential ensemble-based optimal design for parameter estimation: SEQUENTIAL ENSEMBLE-BASED OPTIMAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Man, Jun; Zhang, Jiangjiang; Li, Weixuan

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy, respectively. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, larger ensemble size improves the parameter estimation and convergence of optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied in any other hydrological problems.
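
    One of the information metrics named above, relative entropy, can be sketched by fitting Gaussians to a prior and a posterior parameter ensemble and computing their KL divergence; the ensembles below are synthetic, and the EnKF update itself is not shown.

        import numpy as np

        def gaussian_kl(ens_post, ens_prior):
            """Relative entropy D(post || prior) of Gaussian fits to two ensembles
            (rows = ensemble members, columns = parameters)."""
            mu0, mu1 = ens_post.mean(0), ens_prior.mean(0)
            S0 = np.cov(ens_post, rowvar=False)
            S1 = np.cov(ens_prior, rowvar=False)
            k = len(mu0)
            S1_inv = np.linalg.inv(S1)
            return 0.5 * (np.trace(S1_inv @ S0)
                          + (mu1 - mu0) @ S1_inv @ (mu1 - mu0)
                          - k + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

        rng = np.random.default_rng(3)
        prior = rng.normal(0.0, 1.0, size=(200, 3))                 # hypothetical prior ensemble
        posterior = 0.4 * prior + rng.normal(0.2, 0.1, (200, 3))    # tighter, shifted ensemble
        print(gaussian_kl(posterior, prior))   # larger value = more informative design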

  6. A novel method for the sequential removal and separation of multiple heavy metals from wastewater.

    PubMed

    Fang, Li; Li, Liang; Qu, Zan; Xu, Haomiao; Xu, Jianfang; Yan, Naiqiang

    2018-01-15

    A novel method was developed and applied for the treatment of simulated wastewater containing multiple heavy metals. A sorbent of ZnS nanocrystals (NCs) was synthesized and showed extraordinary performance for the removal of Hg²⁺, Cu²⁺, Pb²⁺ and Cd²⁺. The removal efficiencies of Hg²⁺, Cu²⁺, Pb²⁺ and Cd²⁺ were 99.9%, 99.9%, 90.8% and 66.3%, respectively. Meanwhile, it was determined that the solubility product (Ksp) of heavy metal sulfides was closely related to the adsorption selectivity of various heavy metals on the sorbent. The removal efficiency of Hg²⁺ was higher than that of Cd²⁺, while the Ksp of HgS was lower than that of CdS. This indicated that preferential adsorption of a heavy metal occurred when the Ksp of its sulfide was lower. In addition, the differences in the Ksp of heavy metal sulfides allowed for the exchange of heavy metals, indicating the potential application for the sequential removal and separation of heavy metals from wastewater. According to the cumulative adsorption experimental results, multiple heavy metals were sequentially adsorbed and separated from the simulated wastewater in the order of the Ksp of their sulfides. This method holds the promise of sequentially removing and separating multiple heavy metals from wastewater. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Sequential sentinel SNP Regional Association Plots (SSS-RAP): an approach for testing independence of SNP association signals using meta-analysis data.

    PubMed

    Zheng, Jie; Gaunt, Tom R; Day, Ian N M

    2013-01-01

    Genome-Wide Association Studies (GWAS) frequently incorporate meta-analysis within their framework. However, conditional analysis of individual-level data, which is an established approach for fine mapping of causal sites, is often precluded where only group-level summary data are available for analysis. Here, we present a numerical and graphical approach, "sequential sentinel SNP regional association plot" (SSS-RAP), which estimates regression coefficients (beta) with their standard errors using the meta-analysis summary results directly. Under an additive model, typical for genes with small effect, the effect for a sentinel SNP can be transformed to the predicted effect for a possibly dependent SNP through a 2×2 two-SNP haplotype table. The approach assumes Hardy-Weinberg equilibrium for test SNPs. SSS-RAP is available as a Web-tool (http://apps.biocompute.org.uk/sssrap/sssrap.cgi). To develop and illustrate SSS-RAP, we analyzed lipid and ECG trait data from the British Women's Heart and Health Study (BWHHS), evaluated a meta-analysis for an ECG trait, and presented several simulations. We compared results with existing approaches such as model selection methods and conditional analysis. Generally, findings were consistent. SSS-RAP represents a tool for testing independence of SNP association signals using meta-analysis data, and is also a convenient approach based on biological principles for fine mapping in group-level summary data. © 2012 Blackwell Publishing Ltd/University College London.
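
    The kind of transformation described above can be sketched with a textbook LD projection under HWE and an additive model: the sentinel effect is scaled by Cov(g_sentinel, g_test)/Var(g_test) computed from two-SNP haplotype frequencies. This is an approximation for illustration, not the SSS-RAP web tool's own formulas, and the haplotype table below is hypothetical.

        def predicted_beta(beta_sentinel, hap_freqs):
            """Predict the marginal additive effect at a test SNP from the sentinel
            SNP effect, given 2x2 haplotype frequencies {(a_s, a_t): freq} for the
            two alleles (1 = effect allele, 0 = other) at sentinel and test SNPs.
            Assumes HWE and an additive model (textbook LD projection, see lead-in)."""
            p_s = hap_freqs[(1, 1)] + hap_freqs[(1, 0)]     # sentinel allele-1 frequency
            p_t = hap_freqs[(1, 1)] + hap_freqs[(0, 1)]     # test allele-1 frequency
            D = hap_freqs[(1, 1)] - p_s * p_t               # linkage disequilibrium
            return beta_sentinel * D / (p_t * (1 - p_t))    # Cov(g_s,g_t)/Var(g_t) under HWE

        haps = {(1, 1): 0.18, (1, 0): 0.02, (0, 1): 0.07, (0, 0): 0.73}   # hypothetical table
        print(predicted_beta(0.30, haps))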

  8. Sequential Neighborhood Effects: The Effect of Long-Term Exposure to Concentrated Disadvantage on Children's Reading and Math Test Scores.

    PubMed

    Hicks, Andrew L; Handcock, Mark S; Sastry, Narayan; Pebley, Anne R

    2018-02-01

    Prior research has suggested that children living in a disadvantaged neighborhood have lower achievement test scores, but these studies typically have not estimated causal effects that account for neighborhood choice. Recent studies used propensity score methods to account for the endogeneity of neighborhood exposures, comparing disadvantaged and nondisadvantaged neighborhoods. We develop an alternative propensity function approach in which cumulative neighborhood effects are modeled as a continuous treatment variable. This approach offers several advantages. We use our approach to examine the cumulative effects of neighborhood disadvantage on reading and math test scores in Los Angeles. Our substantive results indicate that recency of exposure to disadvantaged neighborhoods may be more important than average exposure for children's test scores. We conclude that studies of child development should consider both average cumulative neighborhood exposure and the timing of this exposure.

  9. Sequential Neighborhood Effects: The Effect of Long-Term Exposure to Concentrated Disadvantage on Children's Reading and Math Test Scores

    PubMed Central

    Hicks, Andrew L.; Handcock, Mark S.; Sastry, Narayan

    2018-01-01

    Prior research has suggested that children living in a disadvantaged neighborhood have lower achievement test scores, but these studies typically have not estimated causal effects that account for neighborhood choice. Recent studies used propensity score methods to account for the endogeneity of neighborhood exposures, comparing disadvantaged and nondisadvantaged neighborhoods. We develop an alternative propensity function approach in which cumulative neighborhood effects are modeled as a continuous treatment variable. This approach offers several advantages. We use our approach to examine the cumulative effects of neighborhood disadvantage on reading and math test scores in Los Angeles. Our substantive results indicate that recency of exposure to disadvantaged neighborhoods may be more important than average exposure for children's test scores. We conclude that studies of child development should consider both average cumulative neighborhood exposure and the timing of this exposure. PMID:29192386

  10. Kriging for Simulation Metamodeling: Experimental Design, Reduced Rank Kriging, and Omni-Rank Kriging

    NASA Astrophysics Data System (ADS)

    Hosking, Michael Robert

    This dissertation improves an analyst's use of simulation by offering improvements in the utilization of kriging metamodels. There are three main contributions. First an analysis is performed of what comprises good experimental designs for practical (non-toy) problems when using a kriging metamodel. Second is an explanation and demonstration of how reduced rank decompositions can improve the performance of kriging, now referred to as reduced rank kriging. Third is the development of an extension of reduced rank kriging which solves an open question regarding the usage of reduced rank kriging in practice. This extension is called omni-rank kriging. Finally these results are demonstrated on two case studies. The first contribution focuses on experimental design. Sequential designs are generally known to be more efficient than "one shot" designs. However, sequential designs require some sort of pilot design from which the sequential stage can be based. We seek to find good initial designs for these pilot studies, as well as designs which will be effective if there is no following sequential stage. We test a wide variety of designs over a small set of test-bed problems. Our findings indicate that analysts should take advantage of any prior information they have about their problem's shape and/or their goals in metamodeling. In the event of a total lack of information we find that Latin hypercube designs are robust default choices. Our work is most distinguished by its attention to the higher levels of dimensionality. The second contribution introduces and explains an alternative method for kriging when there is noise in the data, which we call reduced rank kriging. Reduced rank kriging is based on using a reduced rank decomposition which artificially smoothes the kriging weights similar to a nugget effect. Our primary focus will be showing how the reduced rank decomposition propagates through kriging empirically. In addition, we show further evidence for our explanation through tests of reduced rank kriging's performance over different situations. In total, reduced rank kriging is a useful tool for simulation metamodeling. For the third contribution we will answer the question of how to find the best rank for reduced rank kriging. We do this by creating an alternative method which does not need to search for a particular rank. Instead it uses all potential ranks; we call this approach omnirank kriging. This modification realizes the potential gains from reduced rank kriging and provides a workable methodology for simulation metamodeling. Finally, we will demonstrate the use and value of these developments on two case studies, a clinic operation problem and a location problem. These cases will validate the value of this research. Simulation metamodeling always attempts to extract maximum information from limited data. Each one of these contributions will allow analysts to make better use of their constrained computational budgets.

  11. Breaking from binaries - using a sequential mixed methods design.

    PubMed

    Larkin, Patricia Mary; Begley, Cecily Marion; Devane, Declan

    2014-03-01

    To outline the traditional worldviews of healthcare research and discuss the benefits and challenges of using mixed methods approaches in contributing to the development of nursing and midwifery knowledge. There has been much debate about the contribution of mixed methods research to nursing and midwifery knowledge in recent years. A sequential exploratory design is used as an exemplar of a mixed methods approach. The study discussed used a combination of focus-group interviews and a quantitative instrument to obtain a fuller understanding of women's experiences of childbirth. In the mixed methods study example, qualitative data were analysed using thematic analysis and quantitative data using regression analysis. Polarised debates about the veracity, philosophical integrity and motivation for conducting mixed methods research have largely abated. A mixed methods approach can contribute to a deeper, more contextual understanding of a variety of subjects and experiences; as a result, it furthers knowledge that can be used in clinical practice. The purpose of the research study should be the main instigator when choosing from an array of mixed methods research designs. Mixed methods research offers a variety of models that can augment investigative capabilities and provide richer data than can a discrete method alone. This paper offers an example of an exploratory, sequential approach to investigating women's childbirth experiences. A clear framework for the conduct and integration of the different phases of the mixed methods research process is provided. This approach can be used by practitioners and policy makers to improve practice.

  12. A 37-mm Ceramic Gun Nozzle Stress Analysis

    DTIC Science & Technology

    2006-05-01

    Report sections include: 1. Introduction; 2. Ceramic Nozzle Structure and Materials; 3. Sequentially-Coupled and Fully-Coupled Thermal Stress FEM Analysis; 4. Ceramic Nozzle Thermal Stress Response; 5. Ceramic Nozzle Dynamic FEM; 6. Ceramic Nozzle Dynamic Responses and Discussions. The candidate ceramics and the test fixture model components are listed in table 1.

  13. APPLICATION OF A DUAL FINE PARTICLE SEQUENTIAL SAMPLER, A TAPERED ELEMENT OSCILLATING MICROBALANCE, AND OTHER AIR MONITORING METHODS TO ASSESS TRANSBOUNDARY INFLUENCE OF PM 2.5

    EPA Science Inventory

    Transboundary influences of particulate matter less than or equal to 2.5 µm in aerodynamic diameter (PM2.5) have been investigated in a U.S.-Mexican border region using a dual fine particle sequential sampler (DFPSS) and a tapered element oscillating microbalance (TEOM). Daily me...

  14. Method of selective reduction of halodisilanes with alkyltin hydrides

    DOEpatents

    D'Errico, John J.; Sharp, Kenneth G.

    1989-01-01

    The invention relates to the selective and sequential reduction of halodisilanes by reacting these compounds at room temperature or below with trialkyltin hydrides or dialkyltin dihydrides without the use of free radical intermediates. The alkyltin hydrides selectively and sequentially reduce the Si-Cl, Si-Br or Si-I bonds while leaving intact the Si-Si and Si-F bonds present.

  15. Lexical Diversity and Omission Errors as Predictors of Language Ability in the Narratives of Sequential Spanish-English Bilinguals: A Cross-Language Comparison

    ERIC Educational Resources Information Center

    Jacobson, Peggy F.; Walden, Patrick R.

    2013-01-01

    Purpose: This study explored the utility of language sample analysis for evaluating language ability in school-age Spanish-English sequential bilingual children. Specifically, the relative potential of lexical diversity and word/morpheme omission as predictors of typical or atypical language status was evaluated. Method: Narrative samples were…

  16. Type I and Type II Error Rates and Overall Accuracy of the Revised Parallel Analysis Method for Determining the Number of Factors

    ERIC Educational Resources Information Center

    Green, Samuel B.; Thompson, Marilyn S.; Levy, Roy; Lo, Wen-Juo

    2015-01-01

    Traditional parallel analysis (T-PA) estimates the number of factors by sequentially comparing sample eigenvalues with eigenvalues for randomly generated data. Revised parallel analysis (R-PA) sequentially compares the "k"th eigenvalue for sample data to the "k"th eigenvalue for generated data sets, conditioned on "k"-…
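
    The comparison that traditional parallel analysis performs can be sketched as follows; the 95th-percentile criterion and the synthetic two-factor data are illustrative choices, and the revised variant's conditioning on earlier factors is not implemented.

        import numpy as np

        def parallel_analysis(data, n_sims=200, quantile=95, seed=0):
            """Retain factors whose sample correlation-matrix eigenvalues exceed the
            chosen quantile of eigenvalues from random normal data of the same shape."""
            rng = np.random.default_rng(seed)
            n, p = data.shape
            sample_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
            random_eig = np.empty((n_sims, p))
            for s in range(n_sims):
                r = rng.standard_normal((n, p))
                random_eig[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
            threshold = np.percentile(random_eig, quantile, axis=0)
            n_factors = 0
            for k in range(p):              # stop at the first eigenvalue that fails
                if sample_eig[k] > threshold[k]:
                    n_factors += 1
                else:
                    break
            return n_factors, sample_eig, threshold

        rng = np.random.default_rng(4)
        latent = rng.standard_normal((300, 2))
        data = latent @ rng.standard_normal((2, 6)) + 0.5 * rng.standard_normal((300, 6))
        print(parallel_analysis(data)[0])    # typically suggests about 2 factors here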

  17. An algorithm for propagating the square-root covariance matrix in triangular form

    NASA Technical Reports Server (NTRS)

    Tapley, B. D.; Choe, C. Y.

    1976-01-01

    A method for propagating the square root of the state error covariance matrix in lower triangular form is described. The algorithm can be combined with any triangular square-root measurement update algorithm to obtain a triangular square-root sequential estimation algorithm. The triangular square-root algorithm compares favorably with the conventional sequential estimation algorithm with regard to computation time.
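
    One standard way to keep the propagated square root triangular is a QR factorization of the stacked factors, sketched below for a toy constant-velocity model; this illustrates the general idea rather than the specific algorithm of the report.

        import numpy as np

        def propagate_sqrt_cov(L, F, Lq):
            """Propagate the lower-triangular square root L of P through x' = F x + w,
            with process-noise square root Lq, using a QR factorization so the result
            stays triangular: P' = F P F^T + Q."""
            A = np.vstack([(F @ L).T, Lq.T])       # stack the two factors
            R = np.linalg.qr(A, mode="r")          # upper-triangular factor of A
            L_new = R.T                            # lower-triangular square root of P'
            # Fix signs so the diagonal is non-negative (QR is unique only up to signs).
            s = np.sign(np.diag(L_new))
            s[s == 0] = 1.0
            return L_new * s

        F = np.array([[1.0, 1.0], [0.0, 1.0]])     # constant-velocity transition
        L = np.linalg.cholesky(np.diag([4.0, 1.0]))
        Lq = np.linalg.cholesky(np.diag([0.1, 0.1]))
        L_new = propagate_sqrt_cov(L, F, Lq)
        print(np.allclose(L_new @ L_new.T, F @ (L @ L.T) @ F.T + Lq @ Lq.T))   # True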

  18. System and method for detecting components of a mixture including tooth elements for alignment

    DOEpatents

    Sommer, Gregory Jon; Schaff, Ulrich Y.

    2016-11-22

    Examples are described including assay platforms having tooth elements. An impinging element may sequentially engage tooth elements on the assay platform to sequentially align corresponding detection regions with a detection unit. In this manner, multiple measurements may be made of detection regions on the assay platform without necessarily requiring the starting and stopping of a motor.

  19. An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid.

    PubMed

    van Staden, J F; Mashamba, Mulalo G; Stefan, Raluca I

    2002-09-01

    An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid is proposed. A solution of 0.1 mol L(-1) sodium chloride is used as carrier. Titration is achieved by aspirating acetic acid samples between two strong base-zone volumes into a holding coil and by channelling the stack of well-defined zones with flow reversal through a reaction coil to a potentiometric sensor where the peak widths were measured. A linear relationship between peak width and logarithm of the acid concentration was obtained in the range 1-9 g/100 mL. Vinegar samples were analysed without any sample pre-treatment. The method has a relative standard deviation of 0.4% with a sample frequency of 28 samples per hour. The results revealed good agreement between the proposed sequential injection and an automated batch titration method.
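
    The reported linear relationship between peak width and the logarithm of acid concentration amounts to a simple calibration that can be fitted and inverted as sketched below; the calibration numbers are hypothetical, not the paper's data.

        import numpy as np

        # Hypothetical calibration data: acetic acid (g/100 mL) vs measured peak width (s)
        conc = np.array([1.0, 3.0, 5.0, 7.0, 9.0])
        width = np.array([12.5, 18.9, 22.1, 24.0, 25.6])

        slope, intercept = np.polyfit(np.log10(conc), width, 1)   # width = a*log10(c) + b

        def concentration_from_width(w):
            """Invert the linear peak-width vs log-concentration calibration."""
            return 10 ** ((w - intercept) / slope)

        print(concentration_from_width(20.0))   # estimated g/100 mL for an unknown sample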

  20. PARTICLE FILTERING WITH SEQUENTIAL PARAMETER LEARNING FOR NONLINEAR BOLD fMRI SIGNALS.

    PubMed

    Xia, Jing; Wang, Michelle Yongmei

    Analyzing the blood oxygenation level dependent (BOLD) effect in the functional magnetic resonance imaging (fMRI) is typically based on recent ground-breaking time series analysis techniques. This work represents a significant improvement over existing approaches to system identification using nonlinear hemodynamic models. It is important for three reasons. First, instead of using linearized approximations of the dynamics, we present a nonlinear filtering based on the sequential Monte Carlo method to capture the inherent nonlinearities in the physiological system. Second, we simultaneously estimate the hidden physiological states and the system parameters through particle filtering with sequential parameter learning to fully take advantage of the dynamic information of the BOLD signals. Third, during the unknown static parameter learning, we employ the low-dimensional sufficient statistics for efficiency and avoiding potential degeneration of the parameters. The performance of the proposed method is validated using both the simulated data and real BOLD fMRI data.

  1. Facilitated assignment of large protein NMR signals with covariance sequential spectra using spectral derivatives.

    PubMed

    Harden, Bradley J; Nichols, Scott R; Frueh, Dominique P

    2014-09-24

    Nuclear magnetic resonance (NMR) studies of larger proteins are hampered by difficulties in assigning NMR resonances. Human intervention is typically required to identify NMR signals in 3D spectra, and subsequent procedures depend on the accuracy of this so-called peak picking. We present a method that provides sequential connectivities through correlation maps constructed with covariance NMR, bypassing the need for preliminary peak picking. We introduce two novel techniques to minimize false correlations and merge the information from all original 3D spectra. First, we take spectral derivatives prior to performing covariance to emphasize coincident peak maxima. Second, we multiply covariance maps calculated with different 3D spectra to destroy erroneous sequential correlations. The maps are easy to use and can readily be generated from conventional triple-resonance experiments. Advantages of the method are demonstrated on a 37 kDa nonribosomal peptide synthetase domain subject to spectral overlap.

  2. Mining sequential patterns for protein fold recognition.

    PubMed

    Exarchos, Themis P; Papaloukas, Costas; Lampros, Christos; Fotiadis, Dimitrios I

    2008-02-01

    Protein data contain discriminative patterns that can be used in many beneficial applications if they are defined correctly. In this work sequential pattern mining (SPM) is utilized for sequence-based fold recognition. Protein classification in terms of fold recognition plays an important role in computational protein analysis, since it can contribute to the determination of the function of a protein whose structure is unknown. Specifically, one of the most efficient SPM algorithms, cSPADE, is employed for the analysis of protein sequence. A classifier uses the extracted sequential patterns to classify proteins in the appropriate fold category. For training and evaluating the proposed method we used the protein sequences from the Protein Data Bank and the annotation of the SCOP database. The method exhibited an overall accuracy of 25% in a classification problem with 36 candidate categories. The classification performance reaches up to 56% when the five most probable protein folds are considered.

  3. PC_Eyewitness: evaluating the New Jersey method.

    PubMed

    MacLin, Otto H; Phelan, Colin M

    2007-05-01

    One important variable in eyewitness identification research is lineup administration procedure. Lineups administered sequentially (one at a time) have been shown to reduce the number of false identifications in comparison with those administered simultaneously (all at once). As a result, some policymakers have adopted sequential administration. However, they have made slight changes to the method used in psychology laboratories. Eyewitnesses in the field are allowed to take multiple passes through a lineup, whereas participants in the laboratory are allowed only one pass. PC_Eyewitness (PCE) is a computerized system used to construct and administer simultaneous or sequential lineups in both the laboratory and the field. It is currently being used in laboratories investigating eyewitness identification in the United States, Canada, and abroad. A modified version of PCE is also being developed for a local police department. We developed a new module for PCE, the New Jersey module, to examine the effects of a second pass. We found that the sequential advantage was eliminated when the participants were allowed to view the lineup a second time. The New Jersey module, and steps we are taking to improve on the module, are presented here and are being made available to the research and law enforcement communities.

  4. Establishment and validation of a method for multi-dose irradiation of cells in 96-well microplates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abatzoglou, Ioannis; Zois, Christos E.; Pouliliou, Stamatia

    2013-02-15

    Highlights: ► We established a method for multi-dose irradiation of cell cultures within a 96-well plate. ► Equations to adjust to preferable dose levels are produced and provided. ► Up to eight different dose levels can be tested in one microplate. ► This method results in fast and reliable estimation of radiation dose–response curves. -- Abstract: Microplates are useful tools in chemistry, biotechnology and molecular biology. In radiobiology research, they can also be applied to assess the effect of a certain radiation dose delivered to the whole microplate, to test radio-sensitivity, radio-sensitization or radio-protection. Whether different radiation doses can be accurately applied to a single 96-well plate, to further facilitate and accelerate research on the one hand and spare funds on the other, is a question dealt with in the current paper. Following repeated ion-chamber, TLD and radiotherapy planning dosimetry, we established a method for multi-dose irradiation of cell cultures within a 96-well plate, which allows an accurate delivery of desired doses in sequential columns of the microplate. Up to eight different dose levels can be tested in one microplate. This method results in fast and reliable estimation of radiation dose–response curves.

  5. Pilot Testing of the NURSE Stress Management Intervention.

    PubMed

    Delaney, Colleen; Barrere, Cynthia; Robertson, Sue; Zahourek, Rothlyn; Diaz, Desiree; Lachapelle, Leeanne

    2016-12-01

    Student nurses experience significant stress during their education, which may contribute to illness and alterations in health, poor academic performance, and program attrition. The aim of this pilot study was to evaluate the feasibility and potential efficacy of an innovative stress management program in two baccalaureate nursing programs in Connecticut, named NURSE (Nurture nurse, Use resources, foster Resilience, Stress and Environment management), that assists nursing students to develop stress management plans. An explanatory sequential mixed-methods design was used to evaluate the effects of the intervention with 40 junior nursing students. Results from this study provide evidence that the NURSE intervention is highly feasible, and support further testing to examine the effect of the intervention in improving stress management in nursing students. © The Author(s) 2015.

  6. Cochran Q test with Turbo BASIC.

    PubMed

    Seuc, A H

    1995-01-01

    A microcomputer program written in Turbo BASIC for the sequential application of the Cochran Q test is given. A clinical application where the test is used in order to explore the structure of the agreement between observers is also presented. A program listing is available on request.
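
    Since the Turbo BASIC listing is available only on request, the sketch below computes the Cochran Q statistic itself (binary ratings, rows = subjects, columns = observers) in Python as an illustration; the agreement data are hypothetical.

        import numpy as np
        from scipy.stats import chi2

        def cochran_q(x):
            """Cochran Q test for k related binary samples.
            x: 2-D 0/1 array, rows = subjects, columns = conditions (observers)."""
            x = np.asarray(x)
            k = x.shape[1]
            col = x.sum(axis=0)                   # successes per condition
            row = x.sum(axis=1)                   # successes per subject
            N = x.sum()
            q = (k - 1) * (k * np.sum(col ** 2) - N ** 2) / (k * N - np.sum(row ** 2))
            return q, chi2.sf(q, df=k - 1)

        ratings = np.array([[1, 1, 0],            # hypothetical agreement data:
                            [1, 0, 0],            # 3 observers rating 6 cases
                            [1, 1, 1],
                            [0, 0, 0],
                            [1, 1, 0],
                            [1, 0, 1]])
        print(cochran_q(ratings))                 # (Q statistic, p-value)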

  7. 16 CFR 1212.4 - Test protocol.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... participate. (6) Two children at a time shall participate in testing of surrogate multi-purpose lighters... at the same time. Two children at a time shall participate in testing of surrogate multi-purpose... appearance, including color. The surrogate multi-purpose lighters shall be labeled with sequential numbers...

  8. A multi-method approach toward de novo glycan characterization: a Man-5 case study.

    PubMed

    Prien, Justin M; Prater, Bradley D; Cockrill, Steven L

    2010-05-01

    Regulatory agencies' expectations for biotherapeutic approval are becoming more stringent with regard to product characterization, where minor species as low as 0.1% of a given profile are typically identified. The mission of this manuscript is to demonstrate a multi-method approach toward de novo glycan characterization and quantitation, including minor species at or approaching the 0.1% benchmark. Recently, unexpected isomers of the Man(5)GlcNAc(2) (M(5)) were reported (Prien JM, Ashline DJ, Lapadula AJ, Zhang H, Reinhold VN. 2009. The high mannose glycans from bovine ribonuclease B isomer characterization by ion trap mass spectrometry (MS). J Am Soc Mass Spectrom. 20:539-556). In the current study, quantitative analysis of these isomers found in commercial M(5) standard demonstrated that they are in low abundance (<1% of the total) and therefore an exemplary "litmus test" for minor species characterization. A simple workflow devised around three core well-established analytical procedures: (1) fluorescence derivatization; (2) online rapid resolution reversed-phase separation coupled with negative-mode sequential mass spectrometry (RRRP-(-)-MS(n)); and (3) permethylation derivatization with nanospray sequential mass spectrometry (NSI-MS(n)) provides comprehensive glycan structural determination. All methods have limitations; however, a multi-method workflow is an at-line stopgap/solution which mitigates each method's individual shortcoming(s) providing greater opportunity for more comprehensive characterization. This manuscript is the first to demonstrate quantitative chromatographic separation of the M(5) isomers and the use of a commercially available stable isotope variant of 2-aminobenzoic acid to detect and chromatographically resolve multiple M(5) isomers in bovine ribonuclease B. With this multi-method approach, we have the capabilities to comprehensively characterize a biotherapeutic's glycan array in a de novo manner, including structural isomers at ≥0.1% of the total chromatographic peak area.

  9. The subtyping of primary aldosteronism by adrenal vein sampling: sequential blood sampling causes factitious lateralization.

    PubMed

    Rossitto, Giacomo; Battistel, Michele; Barbiero, Giulio; Bisogni, Valeria; Maiolino, Giuseppe; Diego, Miotto; Seccia, Teresa M; Rossi, Gian Paolo

    2018-02-01

    The pulsatile secretion of adrenocortical hormones and a stress reaction occurring when starting adrenal vein sampling (AVS) can affect the selectivity and also the assessment of lateralization when sequential blood sampling is used. We therefore tested the hypothesis that a simulated sequential blood sampling could decrease the diagnostic accuracy of lateralization index for identification of aldosterone-producing adenoma (APA), as compared with bilaterally simultaneous AVS. In 138 consecutive patients who underwent subtyping of primary aldosteronism, we compared the results obtained simultaneously bilaterally when starting AVS (t-15) and 15 min after (t0), with those gained with a simulated sequential right-to-left AVS technique (R ⇒ L) created by combining hormonal values obtained at t-15 and at t0. The concordance between simultaneously obtained values at t-15 and t0, and between simultaneously obtained values and values gained with a sequential R ⇒ L technique, was also assessed. We found a marked interindividual variability of lateralization index values in the patients with bilaterally selective AVS at both time point. However, overall the lateralization index simultaneously determined at t0 provided a more accurate identification of APA than the simulated sequential lateralization indexR ⇒ L (P = 0.001). Moreover, regardless of which side was sampled first, the sequential AVS technique induced a sequence-dependent overestimation of lateralization index. While in APA patients the concordance between simultaneous AVS at t0 and t-15 and between simultaneous t0 and sequential technique was moderate-to-good (K = 0.55 and 0.66, respectively), in non-APA patients, it was poor (K = 0.12 and 0.13, respectively). Sequential AVS generates factitious between-sides gradients, which lower its diagnostic accuracy, likely because of the stress reaction arising upon starting AVS.

  10. Automatic load forecasting. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, D.J.; Vemuri, S.

    A method which lends itself to on-line forecasting of hourly electric loads is presented and the results of its use are compared to models developed using the Box-Jenkins method. The method consists of processing the historical hourly loads with a sequential least-squares estimator to identify a finite order autoregressive model which in turn is used to obtain a parsimonious autoregressive-moving average model. A procedure is also defined for incorporating temperature as a variable to improve forecasts where loads are temperature dependent. The method presented has several advantages in comparison to the Box-Jenkins method including much less human intervention and improved model identification. The method has been tested using three-hourly data from the Lincoln Electric System, Lincoln, Nebraska. In the exhaustive analyses performed on this data base this method produced significantly better results than the Box-Jenkins method. The method also proved to be more robust in that greater confidence could be placed in the accuracy of models based upon the various measures available at the identification stage.
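
    The first stage described above — sequential least-squares identification of an autoregressive model — can be sketched with a recursive least-squares update; the forgetting factor, model order and synthetic load series below are illustrative, and the ARMA reduction and temperature term are not reproduced.

        import numpy as np

        def rls_ar(y, order, lam=1.0):
            """Sequential least-squares estimate of AR coefficients for series y,
            updated one observation at a time (lam = forgetting factor)."""
            theta = np.zeros(order)
            P = np.eye(order) * 1e4                 # large initial uncertainty
            for t in range(order, len(y)):
                phi = y[t - order:t][::-1]          # regressor: the last `order` loads
                k = P @ phi / (lam + phi @ P @ phi) # gain
                theta += k * (y[t] - phi @ theta)   # innovation update
                P = (P - np.outer(k, phi @ P)) / lam
            return theta

        rng = np.random.default_rng(5)
        y = np.zeros(500)
        for t in range(2, 500):                     # synthetic AR(2) "load" series
            y[t] = 1.2 * y[t - 1] - 0.4 * y[t - 2] + rng.normal(0, 0.5)
        print(rls_ar(y, order=2))                   # close to [1.2, -0.4]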

  11. Simultaneous Versus Sequential Side-by-Side Bilateral Metal Stent Placement for Malignant Hilar Biliary Obstructions.

    PubMed

    Inoue, Tadahisa; Ishii, Norimitsu; Kobayashi, Yuji; Kitano, Rena; Sakamoto, Kazumasa; Ohashi, Tomohiko; Nakade, Yukiomi; Sumida, Yoshio; Ito, Kiyoaki; Nakao, Haruhisa; Yoneda, Masashi

    2017-09-01

    Endoscopic bilateral self-expandable metallic stent (SEMS) placement for malignant hilar biliary obstructions (MHBOs) is technically demanding, and a second SEMS insertion is particularly challenging. A simultaneous side-by-side (SBS) placement technique using a thinner delivery system may mitigate these issues. We aimed to examine the feasibility and efficacy of simultaneous SBS SEMS placement for treating MHBOs using a novel SEMS that has a 5.7-Fr ultra-thin delivery system. Thirty-four patients with MHBOs underwent SBS SEMS placement between 2010 and 2016. We divided the patient cohort into those who underwent sequential (conventional) SBS placement between 2010 and 2014 (sequential group) and those who underwent simultaneous SBS placement between 2015 and 2016 (simultaneous group), and compared the groups with respect to the clinical outcomes. The technical success rates were 71% (12/17) and 100% (17/17) in the sequential and simultaneous groups, respectively, a difference that was significant (P = .045). The median procedure time was significantly shorter in the simultaneous group (22 min) than in the sequential group (52 min) (P = .017). There were no significant group differences in the time to recurrent biliary obstruction (sequential group: 113 days; simultaneous group: 140 days) or other adverse event rates (sequential group: 12%; simultaneous group: 12%). Simultaneous SBS placement using the novel 5.7-Fr SEMS delivery system may be more straightforward and have a higher success rate compared to that with sequential SBS placement. This new method may be useful for bilateral stenting to treat MHBOs.

  12. [Sequential sampling plans to Orthezia praelonga Douglas (Hemiptera: Sternorrhyncha, Ortheziidae) in citrus].

    PubMed

    Costa, Marilia G; Barbosa, José C; Yamamoto, Pedro T

    2007-01-01

    Sequential sampling uses samples of variable size and has the advantage of reducing sampling time and costs compared with fixed-size sampling. To support adequate management of orthezia, sequential sampling plans were developed for orchards under low and high infestation. Data were collected in Matão, SP, in commercial stands of the orange variety 'Pêra Rio', at five, nine and 15 years of age. Twenty samplings were performed in the whole area of each stand by observing the presence or absence of scales on plants, with each plot comprising ten plants. After observing that in all three stands the scale population was distributed according to the contagious model, fitting the Negative Binomial Distribution in most samplings, two sequential sampling plans were constructed according to the Sequential Likelihood Ratio Test (SLRT). To construct these plans an economic threshold of 2% was adopted and the type I and II error probabilities were fixed at alpha = beta = 0.10. Results showed that the expected maximum numbers of samples required to reach a control decision were 172 and 76 for stands with low and high infestation, respectively.
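
    A minimal sketch of how Wald-style sequential sampling boundaries of this kind can be computed for presence/absence data follows. The error rates match the abstract (alpha = beta = 0.10), but the "safe" and "damaging" infestation proportions p0 and p1 are assumptions, and the published plans were built on a negative binomial fit rather than this simple binomial form.

```python
# Minimal sketch (illustrative, not the published plan): Wald-style sequential
# sampling boundaries for presence/absence data. The error rates follow the
# abstract (alpha = beta = 0.10); the infestation proportions p0 and p1 are
# assumptions, and the study itself used a negative binomial fit.
import math

alpha = beta = 0.10
p0, p1 = 0.01, 0.03  # assumed proportions of infested plants below/above threshold

g = math.log(p1 / p0) + math.log((1 - p0) / (1 - p1))
slope = math.log((1 - p0) / (1 - p1)) / g
h_accept = math.log((1 - alpha) / beta) / g  # offset of the "no control" line
h_reject = math.log((1 - beta) / alpha) / g  # offset of the "apply control" line

def decide(n_plots, n_infested):
    """Decision after sampling n_plots with n_infested positive plots."""
    lower = slope * n_plots - h_accept
    upper = slope * n_plots + h_reject
    if n_infested <= lower:
        return "stop: infestation below threshold"
    if n_infested >= upper:
        return "stop: apply control"
    return "continue sampling"

for n, d in [(20, 0), (60, 4), (200, 1)]:
    print(n, d, decide(n, d))
```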

  13. Identification of DNA-Binding Proteins Using Mixed Feature Representation Methods.

    PubMed

    Qu, Kaiyang; Han, Ke; Wu, Song; Wang, Guohua; Wei, Leyi

    2017-09-22

    DNA-binding proteins play vital roles in cellular processes, such as DNA packaging, replication, transcription, regulation, and other DNA-associated activities. The current main prediction method is based on machine learning, and its accuracy mainly depends on the feature extraction method. Therefore, using an efficient feature representation method is important to enhance the classification accuracy. However, existing feature representation methods cannot efficiently distinguish DNA-binding proteins from non-DNA-binding proteins. In this paper, a multi-feature representation method, which combines three feature representation methods, namely, K-Skip-N-Grams, information theory, and sequential and structural features (SSF), is used to represent the protein sequences and improve feature representation ability. A support vector machine is used as the classifier. The mixed-feature representation method is evaluated using 10-fold cross-validation and a test set. Feature vectors obtained by combining all three feature extractions show the best performance in 10-fold cross-validation, both without dimensionality reduction and with dimensionality reduction by max-relevance-max-distance. Moreover, the reduced mixed-feature method performs better than the non-reduced one. Feature vectors combining SSF and K-Skip-N-Grams show the best performance on the test set. Overall, mixed features outperform single features.
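
    A minimal sketch of the workflow described above: several feature blocks are concatenated into a mixed representation and a support vector machine is scored with 10-fold cross-validation. The feature extraction itself (K-Skip-N-Grams, information theory, SSF) is not reproduced; the random placeholder features, their dimensions, and the sample size are assumptions.

```python
# Minimal sketch (assumed data and dimensions, not the paper's pipeline):
# concatenate per-sequence feature blocks, then evaluate an SVM with 10-fold
# cross-validation. Real feature extraction is replaced by random placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_proteins = 200

# Placeholder feature blocks standing in for the three representations
kskip_ngram_feats = rng.normal(size=(n_proteins, 400))
info_theory_feats = rng.normal(size=(n_proteins, 40))
ssf_feats = rng.normal(size=(n_proteins, 60))
labels = rng.integers(0, 2, size=n_proteins)  # 1 = DNA-binding, 0 = non-binding

# Mixed representation: simple concatenation of the feature blocks
X = np.hstack([kskip_ngram_feats, info_theory_feats, ssf_feats])

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(model, X, labels, cv=10, scoring="accuracy")
print(f"10-fold CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```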

  14. Pharmacy diabetes care program: analysis of two screening methods for undiagnosed type 2 diabetes in Australian community pharmacy.

    PubMed

    Krass, I; Mitchell, B; Clarke, P; Brillant, M; Dienaar, R; Hughes, J; Lau, P; Peterson, G; Stewart, K; Taylor, S; Wilkinson, J; Armour, C

    2007-03-01

    To compare the efficacy and cost-effectiveness of two methods of screening for undiagnosed type 2 diabetes in Australian community pharmacy. A random sample of 30 pharmacies was allocated into two groups: (i) tick test only (TTO); or (ii) sequential screening (SS) method. Both methods used the same initial risk assessment for type 2 diabetes. Subjects with one or more risk factors in the TTO group were offered a referral to their general practitioner (GP). Under the SS method, patients with risk factors were offered a capillary blood glucose test and those identified as being at risk were referred to a GP. The effectiveness and cost-effectiveness of these approaches were assessed. A total of 1286 people were screened over a period of 3 months. The rate of diagnosis of diabetes was significantly higher for SS compared with the TTO method (1.7% versus 0.2%; p=0.008). The SS method resulted in fewer referrals to the GP and a higher uptake of referrals than the TTO method and so was the more cost-effective screening method. SS is the superior method from a cost and efficacy perspective. It should be considered as the preferred option for screening by community-based pharmacists in Australia.

  15. Simulated Space Vacuum Ultraviolet (VUV) Exposure Testing for Polymer Films

    NASA Technical Reports Server (NTRS)

    Dever, Joyce A.; Pietromica, Anthony J.; Stueber, Thomas J.; Sechkar, Edward A.; Messer, Russell K.

    2002-01-01

    Vacuum ultraviolet (VUV) radiation of wavelengths between 115 and 200 nm produced by the sun in the space environment can cause degradation to polymer films producing changes in optical, mechanical, and chemical properties. These effects are particularly important for thin polymer films being considered for ultra-lightweight space structures, because, for most polymers, VUV radiation is absorbed in a thin surface layer. NASA Glenn Research Center has developed facilities and methods for long-term ground testing of polymer films to evaluate space environmental VUV radiation effects. VUV exposure can also be used as part of sequential simulated space environmental exposures to determine combined damaging effects. This paper will describe the effects of VUV on polymer films and the necessity for ground testing. Testing practices used at Glenn Research Center for VUV exposure testing will be described including characterization of the VUV radiation source used, calibration procedures traceable to the National Institute of Standards and Technology (NIST), and testing techniques for VUV exposure of polymer surfaces.

  16. HIV testing among MSM in Bogotá, Colombia: The role of structural and individual characteristics

    PubMed Central

    Reisen, Carol A.; Zea, Maria Cecilia; Bianchi, Fernanda T.; Poppen, Paul J.; del Río González, Ana Maria; Romero, Rodrigo A. Aguayo; Pérez, Carolin

    2014-01-01

    This study used mixed methods to examine characteristics related to HIV testing among men who have sex with men (MSM) in Bogotá, Colombia. A sample of 890 MSM responded to a computerized quantitative survey. Follow-up qualitative data included 20 in-depth interviews with MSM and 12 key informant interviews. Hierarchical logistic set regression indicated that sequential sets of variables reflecting demographic characteristics, insurance coverage, risk appraisal, and social context each added to the explanation of HIV testing. Follow-up logistic regression showed that individuals who were older, had higher income, paid for their own insurance, had had a sexually transmitted infection, knew more people living with HIV, and had greater social support were more likely to have been tested for HIV at least once. Qualitative findings provided details of personal and structural barriers to testing, as well as interrelationships among these factors. Recommendations to increase HIV testing among Colombian MSM are offered. PMID:25068180
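
    A minimal sketch of hierarchical (set) logistic regression in the spirit of the analysis described above, on synthetic data with assumed variable names: predictor blocks are added sequentially and each block's contribution is assessed with a likelihood-ratio test between nested models.

```python
# Minimal sketch (synthetic data, assumed variable names): hierarchical/set
# logistic regression -- predictor blocks are added sequentially and each
# block's incremental contribution is tested with a likelihood-ratio test.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)
n = 890
age = rng.normal(30, 8, n)
income = rng.normal(0, 1, n)
insured = rng.integers(0, 2, n)
social_support = rng.normal(0, 1, n)
logit = -1 + 0.04 * age + 0.3 * income + 0.5 * insured + 0.4 * social_support
tested = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # 1 = ever tested for HIV

blocks = [
    ("demographics", np.column_stack([age, income])),
    ("insurance", insured.reshape(-1, 1)),
    ("social context", social_support.reshape(-1, 1)),
]

X = np.ones((n, 1))                      # intercept-only baseline model
prev = sm.Logit(tested, X).fit(disp=0)
for name, block in blocks:
    X = np.hstack([X, block])
    curr = sm.Logit(tested, X).fit(disp=0)
    lr = 2 * (curr.llf - prev.llf)       # likelihood-ratio statistic
    p = stats.chi2.sf(lr, df=block.shape[1])
    print(f"{name}: LR chi2 = {lr:.2f}, p = {p:.3f}")
    prev = curr
```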

  17. Automatic sequential fluid handling with multilayer microfluidic sample isolated pumping

    PubMed Central

    Liu, Jixiao; Fu, Hai; Yang, Tianhang; Li, Songjing

    2015-01-01

    To sequentially handle fluids is of great significance in quantitative biology, analytical chemistry, and bioassays. However, the technological options are limited when building such microfluidic sequential processing systems, and one of the encountered challenges is the need for reliable, efficient, and mass-production available microfluidic pumping methods. Herein, we present a bubble-free and pumping-control unified liquid handling method that is compatible with large-scale manufacture, termed multilayer microfluidic sample isolated pumping (mμSIP). The core part of the mμSIP is the selective permeable membrane that isolates the fluidic layer from the pneumatic layer. The air diffusion from the fluidic channel network into the degassing pneumatic channel network leads to fluidic channel pressure variation, which further results in consistent bubble-free liquid pumping into the channels and the dead-end chambers. We characterize the mμSIP by comparing the fluidic actuation processes with different parameters and a flow rate range of 0.013 μl/s to 0.097 μl/s is observed in the experiments. As the proof of concept, we demonstrate an automatic sequential fluid handling system aiming at digital assays and immunoassays, which further proves the unified pumping-control and suggests that the mμSIP is suitable for functional microfluidic assays with minimal operations. We believe that the mμSIP technology and demonstrated automatic sequential fluid handling system would enrich the microfluidic toolbox and benefit further inventions. PMID:26487904

  18. Article and method of forming an article

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lacy, Benjamin Paul; Kottilingam, Srikanth Chandrudu; Dutta, Sandip

    Provided are an article and a method of forming an article. The method includes providing a metallic powder, heating the metallic powder to a temperature sufficient to join at least a portion of the metallic powder to form an initial layer, sequentially forming additional layers in a build direction by providing a distributed layer of the metallic powder over the initial layer and heating the distributed layer of the metallic powder, repeating the steps of sequentially forming the additional layers in the build direction to form a portion of the article having a hollow space formed in the build direction, and forming an overhang feature extending into the hollow space. The article includes an article formed by the method described herein.

  19. [Not Available].

    PubMed

    Paturzo, Marco; Colaceci, Sofia; Clari, Marco; Mottola, Antonella; Alvaro, Rosaria; Vellone, Ercole

    2016-01-01

    Mixed methods designs: an innovative methodological approach for nursing research. Mixed methods research designs (MM) combine qualitative and quantitative approaches in the research process, in a single study or series of studies. Their use can provide a wider understanding of multifaceted phenomena. This article presents a general overview of the structure and design of MM to disseminate this approach within the Italian nursing research community. The MM designs most commonly used in the nursing field are the convergent parallel design, the sequential explanatory design, the exploratory sequential design and the embedded design. For each method a research example is presented. The use of MM can add value to improving clinical practice because, through the integration of qualitative and quantitative methods, researchers can better assess the complex phenomena typical of nursing.

  20. Bridging the clinician/researcher gap with systemic research: the case for process research, dyadic, and sequential analysis.

    PubMed

    Oka, Megan; Whiting, Jason

    2013-01-01

    In Marriage and Family Therapy (MFT), as in many clinical disciplines, concern surfaces about the clinician/researcher gap. This gap includes a lack of accessible, practical research for clinicians. MFT clinical research often borrows from the medical tradition of randomized controlled trials, which typically use linear methods or follow procedures distanced from "real-world" therapy. We review traditional research methods and their use in MFT and propose increased use of methods that are more systemic in nature and more applicable to MFTs: process research, dyadic data analysis, and sequential analysis. We review current research employing these methods and offer suggestions and directions for further research. © 2013 American Association for Marriage and Family Therapy.
