NASA Technical Reports Server (NTRS)
Rao, R. G. S.; Ulaby, F. T.
1977-01-01
The paper examines optimal sampling techniques for obtaining accurate spatial averages of soil moisture, at various depths and for cell sizes in the range 2.5-40 acres, with a minimum number of samples. Both simple random sampling and stratified sampling procedures are used to reach a set of recommended sample sizes for each depth and each cell size. Major conclusions from statistical sampling test results are that (1) the number of samples required decreases with increasing depth; (2) when the total number of samples cannot be prespecified or the moisture in only a single layer is of interest, a simple random sampling procedure should be used, based on the observed mean and SD for data from a single field; (3) when the total number of samples can be prespecified and the objective is to measure the soil moisture profile with depth, stratified random sampling based on optimal allocation should be used; and (4) decreasing the sensor resolution cell size leads to fairly large decreases in sample sizes with stratified sampling procedures, whereas only a moderate decrease is obtained with simple random sampling procedures.
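As a rough illustration of the two designs this abstract compares (not the paper's field data or its recommended numbers), the Python sketch below computes a simple-random-sampling size from an observed SD for a single layer and a Neyman (optimal) allocation of a prespecified total across depth strata; all values are hypothetical.

# Illustrative sketch only; the SD, margin, and strata below are invented, not the paper's.
import math

def srs_sample_size(sd, margin, z=1.96):
    """Samples needed so the field mean is within +/- margin with ~95% confidence."""
    return math.ceil((z * sd / margin) ** 2)

def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    """Optimal allocation: n_h proportional to N_h * S_h."""
    weights = [n * s for n, s in zip(stratum_sizes, stratum_sds)]
    total = sum(weights)
    return [round(n_total * w / total) for w in weights]

# Single layer: observed SD of 4.0 (% volumetric), target half-width of 1.5
print(srs_sample_size(sd=4.0, margin=1.5))                       # 28 samples

# Prespecified total of 40 samples over four depth layers, shallow layers more variable
print(neyman_allocation(40, [1, 1, 1, 1], [5.0, 4.0, 2.5, 1.5]))  # e.g. [15, 12, 8, 5]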
Evaluation of Bayesian Sequential Proportion Estimation Using Analyst Labels
NASA Technical Reports Server (NTRS)
Lennington, R. K.; Abotteen, K. M. (Principal Investigator)
1980-01-01
The author has identified the following significant results. A total of ten Large Area Crop Inventory Experiment Phase 3 blind sites and analyst-interpreter labels were used in a study to compare proportional estimates obtained by the Bayes sequential procedure with estimates obtained from simple random sampling and from Procedure 1. The analyst error rate using the Bayes technique was shown to be no greater than that for the simple random sampling. Also, the segment proportion estimates produced using this technique had smaller bias and mean squared errors than the estimates produced using either simple random sampling or Procedure 1.
40 CFR 761.355 - Third level of sample selection.
Code of Federal Regulations, 2012 CFR
2012-07-01
... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...
40 CFR 761.355 - Third level of sample selection.
Code of Federal Regulations, 2011 CFR
2011-07-01
... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...
40 CFR 761.355 - Third level of sample selection.
Code of Federal Regulations, 2013 CFR
2013-07-01
... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...
40 CFR 761.355 - Third level of sample selection.
Code of Federal Regulations, 2010 CFR
2010-07-01
... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...
40 CFR 761.355 - Third level of sample selection.
Code of Federal Regulations, 2014 CFR
2014-07-01
... of sample selection further reduces the size of the subsample to 100 grams which is suitable for the... procedures in § 761.353 of this part into 100 gram portions. (b) Use a random number generator or random number table to select one 100 gram size portion as a sample for a procedure used to simulate leachate...
NASA Astrophysics Data System (ADS)
WANG, P. T.
2015-12-01
Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Because detailed information is lacking and spatial heterogeneity is inherent, geological properties can be treated as random variables. The hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be generated. Statistical sampling therefore plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure of LHS with simulation using LU decomposition to form LULHS. Both conditional and unconditional LULHS simulations were developed. The simulation efficiency and spatial correlation of LULHS are compared with three other simulation methods. The results show that for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort: fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
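A minimal sketch of the idea behind an unconditional LULHS-style simulation, as I read the abstract: draw a Latin hypercube sample of standard normals, then impose spatial correlation with a lower-triangular factor of the covariance matrix. The 1-D grid and exponential covariance model are assumptions for illustration, not the study's configuration.

# Sketch only; the covariance model, grid, and sizes are assumed.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_grid, n_real = 100, 50                                  # grid cells, realizations

x = np.arange(n_grid, dtype=float)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 10.0)       # exponential covariance (assumed)
L = np.linalg.cholesky(C + 1e-10 * np.eye(n_grid))        # lower-triangular factor

# Latin hypercube sample of standard normals: one stratum per realization, per grid cell
strata = rng.permuted(np.tile(np.arange(n_real), (n_grid, 1)), axis=1)
u = (strata + rng.random((n_grid, n_real))) / n_real
z = norm.ppf(u)                                           # uncorrelated LHS normals

fields = L @ z                                            # spatially correlated realizations
print(fields.shape)                                       # (100, 50)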
Exact intervals and tests for median when one sample value possibly an outlier
NASA Technical Reports Server (NTRS)
Keller, G. J.; Walsh, J. E.
1973-01-01
Available are independent observations (continuous data) that are believed to be a random sample. Desired are distribution-free confidence intervals and significance tests for the population median. However, there is the possibility that either the smallest or the largest observation is an outlier. Then, use of a procedure for rejection of an outlying observation might seem appropriate. Such a procedure would consider that two alternative situations are possible and would select one of them. Either (1) the n observations are truly a random sample, or (2) an outlier exists and its removal leaves a random sample of size n-1. For either situation, confidence intervals and tests are desired for the median of the population yielding the random sample. Unfortunately, satisfactory rejection procedures of a distribution-free nature do not seem to be available. Moreover, all rejection procedures impose undesirable conditional effects on the observations and can select the wrong one of the two situations. It is found that two-sided intervals and tests based on two symmetrically located order statistics (not the largest and smallest) of the n observations remain valid for either situation.
Bayesian estimation of Karhunen–Loève expansions; A random subspace approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chowdhary, Kenny; Najm, Habib N.
One of the most widely used statistical procedures for dimensionality reduction of high-dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., it minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, the basis functions of the KLE, and this error is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite-dimensional stochastic process inspired by Brownian motion.
Bayesian estimation of Karhunen–Loève expansions; A random subspace approach
Chowdhary, Kenny; Najm, Habib N.
2016-04-13
One of the most widely used statistical procedures for dimensionality reduction of high-dimensional random fields is Principal Component Analysis (PCA), which is based on the Karhunen-Loève expansion (KLE) of a stochastic process with finite variance. The KLE is analogous to a Fourier series expansion for a random process, where the goal is to find an orthogonal transformation for the data such that the projection of the data onto this orthogonal subspace is optimal in the L2 sense, i.e., it minimizes the mean square error. In practice, this orthogonal transformation is determined by performing an SVD (Singular Value Decomposition) on the sample covariance matrix or on the data matrix itself. Sampling error is typically ignored when quantifying the principal components, or, equivalently, the basis functions of the KLE, and this error is exacerbated when the sample size is much smaller than the dimension of the random field. In this paper, we introduce a Bayesian KLE procedure, allowing one to obtain a probabilistic model on the principal components, which can account for inaccuracies due to limited sample size. The probabilistic model is built via Bayesian inference, from which the posterior becomes the matrix Bingham density over the space of orthonormal matrices. We use a modified Gibbs sampling procedure to sample on this space and then build probabilistic Karhunen-Loève expansions over random subspaces to obtain a set of low-dimensional surrogates of the stochastic process. We illustrate this probabilistic procedure with a finite-dimensional stochastic process inspired by Brownian motion.
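For context, here is a minimal sketch of the classical (non-Bayesian) KLE/PCA point estimate that the paper builds on: center the ensemble, take its SVD, and read the estimated basis functions and eigenvalues off the singular vectors and values. The Bayesian procedure over the matrix Bingham posterior is not reproduced here; the toy data are only Brownian-motion-like paths.

# Classical KLE/PCA baseline via SVD; not the paper's Bayesian method.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_dim = 30, 200                   # few samples, high-dimensional field

X = rng.standard_normal((n_samples, n_dim)).cumsum(axis=1)   # Brownian-motion-like paths
Xc = X - X.mean(axis=0)                      # center the ensemble

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 5
modes = Vt[:k]                               # estimated KL basis functions
energies = s[:k] ** 2 / (n_samples - 1)      # eigenvalues of the sample covariance
coeffs = Xc @ modes.T                        # projection onto the low-dimensional subspace
print(modes.shape, coeffs.shape)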
CTEPP STANDARD OPERATING PROCEDURE FOR TELEPHONE SAMPLE SUBJECTS RECRUITMENT (SOP-1.12)
The subject recruitment procedures for the telephone sample component are described in the SOP. A random telephone sample list is ordered from a commercial survey sampling firm. Using this list, introductory letters are sent to targeted homes prior to making initial telephone c...
Generating Random Samples of a Given Size Using Social Security Numbers.
ERIC Educational Resources Information Center
Erickson, Richard C.; Brauchle, Paul E.
1984-01-01
The purposes of this article are (1) to present a method by which social security numbers may be used to draw cluster samples of a predetermined size and (2) to describe procedures used to validate this method of drawing random samples. (JOW)
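The abstract does not spell out the mechanics, but a common reading of such SSN-based methods is to select everyone whose number ends in a randomly chosen set of terminal digits, since terminal digits are effectively random. The sketch below only illustrates that idea with hypothetical data; it should not be taken as the authors' exact procedure.

# Hypothetical terminal-digit selection; not necessarily the article's method.
import random

def select_by_terminal_digits(ssns, sampling_fraction, seed=42):
    """Keep people whose SSN's last two digits fall in a randomly chosen set."""
    k = round(sampling_fraction * 100)       # number of two-digit endings to accept
    rng = random.Random(seed)
    chosen = set(rng.sample(range(100), k))
    return [s for s in ssns if int(s[-2:]) in chosen]

roster = ["123-45-6789", "987-65-4321", "555-12-3407", "222-33-4489"]
print(select_by_terminal_digits(roster, sampling_fraction=0.10))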
CTEPP STANDARD OPERATING PROCEDURE FOR SAMPLE SELECTION (SOP-1.10)
The procedures for selecting CTEPP study subjects are described in the SOP. The primary, county-level stratification is by region and urbanicity. Six sample counties in each of the two states (North Carolina and Ohio) are selected using stratified random sampling and reflect ...
A method for determining the weak statistical stationarity of a random process
NASA Technical Reports Server (NTRS)
Sadeh, W. Z.; Koper, C. A., Jr.
1978-01-01
A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.
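A bare-bones sketch of the segmentation idea described above: split one long record into equal, non-overlapping sample records, form equivalent-ensemble averages at each within-record time, and check their time invariance. The paper's specific variance tests are not reproduced; the crude spread statistics below only indicate where they would apply.

# Equivalent-ensemble averaging sketch; the formal variance tests are not implemented here.
import numpy as np

rng = np.random.default_rng(2)
signal = rng.standard_normal(100_000)        # stand-in for a long turbulence record

n_records, rec_len = 50, 2000
ensemble = signal[: n_records * rec_len].reshape(n_records, rec_len)

ens_mean = ensemble.mean(axis=0)             # equivalent-ensemble average at each time
ens_var = ensemble.var(axis=0, ddof=1)

# Crude check of time invariance: spread of the ensemble statistics over time
print("mean drift:", ens_mean.std(), "variance drift:", ens_var.std())

# Heuristic ergodicity check: one record's time average vs. the ensemble mean level
print("time average of record 0:", ensemble[0].mean(), "ensemble level:", ens_mean.mean())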
ERIC Educational Resources Information Center
Montague, Margariete A.
This study investigated the feasibility of concurrently and randomly sampling examinees and items in order to estimate group achievement. Seven 32-item tests reflecting a 640-item universe of simple open sentences were used such that item selection (random, systematic) and assignment (random, systematic) of items (four, eight, sixteen) to forms…
Application and testing of a procedure to evaluate transferability of habitat suitability criteria
Thomas, Jeff A.; Bovee, Ken D.
1993-01-01
A procedure designed to test the transferability of habitat suitability criteria was evaluated in the Cache la Poudre River, Colorado. Habitat suitability criteria were developed for active adult and juvenile rainbow trout in the South Platte River, Colorado. These criteria were tested by comparing microhabitat use predicted from the criteria with observed microhabitat use by adult rainbow trout in the Cache la Poudre River. A one-sided χ² test, using counts of occupied and unoccupied cells in each suitability classification, was used to test for non-random selection for optimum habitat use over usable habitat and for suitable over unsuitable habitat. Criteria for adult rainbow trout were judged to be transferable to the Cache la Poudre River, but juvenile criteria (applied to adults) were not transferable. Random subsampling of occupied and unoccupied cells was conducted to determine the effect of sample size on the reliability of the test procedure. The incidence of type I and type II errors increased rapidly as the sample size was reduced below 55 occupied and 200 unoccupied cells. Recommended modifications to the procedure included the adoption of a systematic or randomized sampling design and direct measurement of microhabitat variables. With these modifications, the procedure is economical, simple and reliable. Use of the procedure as a quality assurance device in routine applications of the instream flow incremental methodology was encouraged.
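A hedged sketch of the kind of one-sided χ² test described above, on a 2×2 table of occupied/unoccupied cells classified as optimum versus merely usable habitat; the counts are invented, and halving the two-sided p-value when the occupied-optimum count exceeds its expectation is one reasonable way to make the test one-sided, not necessarily the authors' exact computation.

# Invented counts; not the Cache la Poudre data.
import numpy as np
from scipy.stats import chi2_contingency

#                     optimum  usable (non-optimum)
occupied = np.array([40, 15])
unoccupied = np.array([160, 300])

table = np.vstack([occupied, unoccupied])
chi2, p_two_sided, dof, expected = chi2_contingency(table, correction=False)

# One-sided: selection *for* optimum habitat (more occupied-optimum cells than expected)
p_one_sided = p_two_sided / 2 if occupied[0] > expected[0, 0] else 1 - p_two_sided / 2
print(chi2, p_one_sided)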
Decision tree modeling using R.
Zhang, Zhongheng
2016-08-01
In the machine learning field, the decision tree learner is powerful and easy to interpret. It employs a recursive binary partitioning algorithm that splits the sample on the partitioning variable with the strongest association with the response variable. The process continues until some stopping criteria are met. In the example I focus on the conditional inference tree, which incorporates tree-structured regression models into conditional inference procedures. Because a single tree is sensitive to small changes in the training data, the random forests procedure is introduced to address this problem. The sources of diversity for random forests are random sampling of the training data and a restricted set of input variables from which each split can be selected. Finally, I introduce R functions to perform model-based recursive partitioning. This method incorporates recursive partitioning into conventional parametric model building.
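The paper's examples are in R (conditional inference trees, random forests, and model-based recursive partitioning); as a language-neutral illustration of the same recursive-partitioning and forest ideas, here is a small scikit-learn sketch on a toy dataset. It is not a substitute for the R functions the author describes.

# scikit-learn analog of the ideas in the abstract (the paper itself uses R).
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A single tree: recursive binary partitioning, interpretable but sensitive to the data
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)

# Random forest: diversity from bootstrap sampling of rows and random feature subsets
forest = RandomForestClassifier(n_estimators=200, max_features="sqrt",
                                random_state=0).fit(X_tr, y_tr)

print("single tree accuracy:", tree.score(X_te, y_te))
print("random forest accuracy:", forest.score(X_te, y_te))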
ERIC Educational Resources Information Center
Bauer, Daniel J.; Preacher, Kristopher J.; Gil, Karen M.
2006-01-01
The authors propose new procedures for evaluating direct, indirect, and total effects in multilevel models when all relevant variables are measured at Level 1 and all effects are random. Formulas are provided for the mean and variance of the indirect and total effects and for the sampling variances of the average indirect and total effects.…
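The abstract does not reproduce the formulas, but the central point these procedures rest on can be sketched: with random Level-1 slopes a_j (X→M) and b_j (M→Y) and random direct effect c'_j, the average indirect and total effects pick up the covariance between the random slopes. In the usual notation for this model (an assumption here, since the abstract gives none):

E(a_j b_j) = \mu_a \mu_b + \sigma_{ab}, \qquad E(a_j b_j + c'_j) = \mu_a \mu_b + \sigma_{ab} + \mu_{c'}

so ignoring \sigma_{ab} biases the estimated average indirect effect whenever the two slopes covary across Level-2 units.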
Kretzschmar, A; Durand, E; Maisonnasse, A; Vallon, J; Le Conte, Y
2015-06-01
A new procedure of stratified sampling is proposed in order to establish an accurate estimation of Varroa destructor populations on sticky bottom boards of the hive. It is based on spatial sampling theory, which recommends using regular grid stratification in the case of a spatially structured process. Because the distribution of varroa mites on the sticky board is observed to be spatially structured, we designed a sampling scheme based on a regular grid with circles centered on each grid element. This new procedure is then compared with a former method using partially random sampling. Relative error improvements are shown on the basis of a large sample of simulated sticky boards (n=20,000), which provides a complete range of spatial structures, from a random structure to a highly frame-driven structure. The improvement of varroa mite number estimation is then measured by the percentage of counts with an error greater than a given level. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America.
Housworth, E A; Martins, E P
2001-01-01
Statistical randomization tests in evolutionary biology often require a set of random, computer-generated trees. For example, earlier studies have shown how large numbers of computer-generated trees can be used to conduct phylogenetic comparative analyses even when the phylogeny is uncertain or unknown. These methods were limited, however, in that (in the absence of molecular sequence or other data) they allowed users to assume that no phylogenetic information was available or that all possible trees were known. Intermediate situations where only a taxonomy or other limited phylogenetic information (e.g., polytomies) are available are technically more difficult. The current study describes a procedure for generating random samples of phylogenies while incorporating limited phylogenetic information (e.g., four taxa belong together in a subclade). The procedure can be used to conduct comparative analyses when the phylogeny is only partially resolved or can be used in other randomization tests in which large numbers of possible phylogenies are needed.
ERIC Educational Resources Information Center
Hayford, Samuel K.; Ocansey, Frederick
2017-01-01
This study reports part of a national survey on sources of information, education and communication materials on HIV/AIDS available to students with visual impairments in residential, segregated, and integrated schools in Ghana. A multi-staged stratified random sampling procedure and a purposive and simple random sampling approach, where…
Toward a Principled Sampling Theory for Quasi-Orders
Ünlü, Ali; Schrepp, Martin
2016-01-01
Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, even on item sets of up to 50 items, the new algorithms create close-to-representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement over existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets. PMID:27965601
Toward a Principled Sampling Theory for Quasi-Orders.
Ünlü, Ali; Schrepp, Martin
2016-01-01
Quasi-orders, that is, reflexive and transitive binary relations, have numerous applications. In educational theories, the dependencies of mastery among the problems of a test can be modeled by quasi-orders. Methods such as item tree or Boolean analysis that mine for quasi-orders in empirical data are sensitive to the underlying quasi-order structure. These data mining techniques have to be compared based on extensive simulation studies, with unbiased samples of randomly generated quasi-orders at their basis. In this paper, we develop techniques that can provide the required quasi-order samples. We introduce a discrete doubly inductive procedure for incrementally constructing the set of all quasi-orders on a finite item set. A randomization of this deterministic procedure allows us to generate representative samples of random quasi-orders. With an outer level inductive algorithm, we consider the uniform random extensions of the trace quasi-orders to higher dimension. This is combined with an inner level inductive algorithm to correct the extensions that violate the transitivity property. The inner level correction step entails sampling biases. We propose three algorithms for bias correction and investigate them in simulation. It is evident that, even on item sets of up to 50 items, the new algorithms create close-to-representative quasi-order samples within acceptable computing time. Hence, the principled approach is a significant improvement over existing methods that are used to draw quasi-orders uniformly at random but cannot cope with reasonably large item sets.
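As a point of contrast with the inductive procedure described above, the naive way to draw a random quasi-order (include each ordered pair with probability 1/2, then force reflexivity and take the transitive closure) is easy to code but biased toward large relations, which is exactly the kind of bias the paper's algorithms are designed to avoid. The sketch below shows that naive baseline only, not the authors' doubly inductive method.

# Naive, biased random quasi-order generator; NOT the paper's procedure.
import random

def naive_random_quasi_order(n, p=0.5, seed=None):
    rng = random.Random(seed)
    rel = [[i == j or rng.random() < p for j in range(n)] for i in range(n)]
    # Warshall's algorithm: the transitive closure makes the relation a quasi-order
    for k in range(n):
        for i in range(n):
            if rel[i][k]:
                for j in range(n):
                    if rel[k][j]:
                        rel[i][j] = True
    return rel

q = naive_random_quasi_order(6, seed=3)
print(sum(map(sum, q)), "pairs in the relation (reflexive and transitive)")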
Pigeons' Choices between Fixed-Interval and Random-Interval Schedules: Utility of Variability?
ERIC Educational Resources Information Center
Andrzejewski, Matthew E.; Cardinal, Claudia D.; Field, Douglas P.; Flannery, Barbara A.; Johnson, Michael; Bailey, Kathleen; Hineline, Philip N.
2005-01-01
Pigeons' choosing between fixed-interval and random-interval schedules of reinforcement was investigated in three experiments using a discrete-trial procedure. In all three experiments, the random-interval schedule was generated by sampling a probability distribution at an interval (and in multiples of the interval) equal to that of the…
Scalable randomized benchmarking of non-Clifford gates
NASA Astrophysics Data System (ADS)
Cross, Andrew; Magesan, Easwar; Bishop, Lev; Smolin, John; Gambetta, Jay
Randomized benchmarking is a widely used experimental technique to characterize the average error of quantum operations. Benchmarking procedures that scale to enable characterization of n-qubit circuits rely on efficient procedures for manipulating those circuits and, as such, have been limited to subgroups of the Clifford group. However, universal quantum computers require additional, non-Clifford gates to approximate arbitrary unitary transformations. We define a scalable randomized benchmarking procedure over n-qubit unitary matrices that correspond to protected non-Clifford gates for a class of stabilizer codes. We present efficient methods for representing and composing group elements, sampling them uniformly, and synthesizing corresponding poly(n)-sized circuits. The procedure provides experimental access to two independent parameters that together characterize the average gate fidelity of a group element. We acknowledge support from ARO under Contract W911NF-14-1-0124.
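For orientation, the standard randomized-benchmarking analysis (shared by Clifford and non-Clifford variants) fits the sequence survival probability against length m to A·p^m + B and converts the decay parameter p into an average gate fidelity. The sketch below fits that model to synthetic data and assumes a single qubit (d = 2) purely for illustration; it is not the group-theoretic machinery the abstract describes.

# Standard RB decay fit on synthetic data; d = 2 is an assumption for illustration.
import numpy as np
from scipy.optimize import curve_fit

def rb_decay(m, A, p, B):
    return A * p ** m + B

m = np.array([1, 2, 4, 8, 16, 32, 64, 128])
rng = np.random.default_rng(4)
survival = rb_decay(m, 0.5, 0.985, 0.5) + rng.normal(0, 0.005, m.size)  # fake data

(A, p, B), _ = curve_fit(rb_decay, m, survival, p0=[0.5, 0.98, 0.5])

d = 2                                           # Hilbert-space dimension (assumed)
avg_gate_fidelity = 1 - (1 - p) * (d - 1) / d
print(p, avg_gate_fidelity)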
The purpose of this SOP is to define the procedure for conducting a data accuracy check on a randomly selected 10% sample of all electronic data. This procedure applies to the cleaned, working databases generated during the Arizona NHEXAS project and the "Border" study. Keyword...
Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai
2014-11-10
Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data based on an equal correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on score statistic performs well generally and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least square estimate and logarithmic transformation with Mantel-Haenszel estimate are recommended as they do not involve any computation of maximum likelihood estimates requiring iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.
Correcting Evaluation Bias of Relational Classifiers with Network Cross Validation
2010-01-01
classification algorithms: simple random resampling (RRS), equal-instance random resampling (ERS), and network cross-validation (NCV). The first two... NCV procedure that eliminates overlap between test sets altogether. The procedure samples k disjoint test sets that will be used for evaluation... [fragment of the NCV algorithm listing: sample a proportion of labeled nodes from the training pool, set the inference set to the network minus the training set, and collect the <trainSet, testSet, inferenceSet> triples as output F] ... NCV addresses...
Extending cluster Lot Quality Assurance Sampling designs for surveillance programs
Hund, Lauren; Pagano, Marcello
2014-01-01
Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance based on the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible non-parametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. PMID:24633656
Extending cluster lot quality assurance sampling designs for surveillance programs.
Hund, Lauren; Pagano, Marcello
2014-07-20
Lot quality assurance sampling (LQAS) has a long history of applications in industrial quality control. LQAS is frequently used for rapid surveillance in global health settings, with areas classified as poor or acceptable performance on the basis of the binary classification of an indicator. Historically, LQAS surveys have relied on simple random samples from the population; however, implementing two-stage cluster designs for surveillance sampling is often more cost-effective than simple random sampling. By applying survey sampling results to the binary classification procedure, we develop a simple and flexible nonparametric procedure to incorporate clustering effects into the LQAS sample design to appropriately inflate the sample size, accommodating finite numbers of clusters in the population when relevant. We use this framework to then discuss principled selection of survey design parameters in longitudinal surveillance programs. We apply this framework to design surveys to detect rises in malnutrition prevalence in nutrition surveillance programs in Kenya and South Sudan, accounting for clustering within villages. By combining historical information with data from previous surveys, we design surveys to detect spikes in the childhood malnutrition rate. Copyright © 2014 John Wiley & Sons, Ltd.
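A small sketch of the cluster-design adjustment the authors describe in general terms: inflate the simple-random-sample LQAS size by a design effect of the familiar 1 + (m - 1)ρ form for clusters of size m and intra-cluster correlation ρ. The numbers are placeholders, and the finite-cluster corrections discussed in the paper are not included.

# Design-effect inflation of an LQAS sample size; placeholder numbers only.
import math

def clustered_sample_size(n_srs, cluster_size, icc):
    deff = 1 + (cluster_size - 1) * icc        # design effect for equal-sized clusters
    n = n_srs * deff
    n_clusters = math.ceil(n / cluster_size)
    return n_clusters * cluster_size, n_clusters

n_total, n_clusters = clustered_sample_size(n_srs=192, cluster_size=10, icc=0.05)
print(n_total, "children across", n_clusters, "villages")   # 280 across 28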
21 CFR 203.34 - Policies and procedures; administrative systems.
Code of Federal Regulations, 2010 CFR
2010-04-01
... distribution security and audit system, including conducting random and for-cause audits of sales... Policies and procedures; administrative systems. Each manufacturer or authorized distributor of record that distributes drug samples shall...
Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.
Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils
2017-09-15
A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. The evaluation showed that the sampling procedure was reproducible with results comparable to the collected sample. However, the sampling procedure favoured sampling of large farms. Furthermore, both under-sampled and over-sampled areas were found using scan statistics. In conclusion, sampling conducted at abattoirs can provide a spatially representative sample. Hence it is a possible cost-effective alternative to simple random sampling. However, it is important to assess the properties of the resulting sample so that any potential selection bias can be addressed when reporting the findings. Copyright © 2017 Elsevier B.V. All rights reserved.
Group Matching: Is This a Research Technique to Be Avoided?
ERIC Educational Resources Information Center
Ross, Donald C.; Klein, Donald F.
1988-01-01
The variance of the sample difference and the power of the "F" test for mean differences were studied under group matching on covariates and also under random assignment. Results shed light on systematic assignment procedures advocated to provide more precise estimates of treatment effects than simple random assignment. (TJH)
Unbiased Estimates of Variance Components with Bootstrap Procedures
ERIC Educational Resources Information Center
Brennan, Robert L.
2007-01-01
This article provides general procedures for obtaining unbiased estimates of variance components for any random-model balanced design under any bootstrap sampling plan, with the focus on designs of the type typically used in generalizability theory. The results reported here are particularly helpful when the bootstrap is used to estimate standard…
Collective Negotiations and Teacher Satisfaction in Selected Indiana Secondary Schools.
ERIC Educational Resources Information Center
Davies, Paul R.; Kline, Charles E.
This paper reports a study that sought to determine whether differences in bargaining procedures are related to differences in teacher satisfaction or morale. Of the forty schools in the random sample, 27 were operating under traditional collective negotiation procedures -- teachers relatively unorganized; eight were operating under procedural…
48 CFR 13.303-6 - Review procedures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...
48 CFR 13.303-6 - Review procedures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...
48 CFR 13.303-6 - Review procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...
48 CFR 13.303-6 - Review procedures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...
48 CFR 13.303-6 - Review procedures.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Review procedures. (a) The contracting officer placing orders under a BPA, or the designated representative of the contracting officer, shall review a sufficient random sample of the BPA files at least... into the BPA shall— (1) Ensure that each BPA is reviewed at least annually and, if necessary, updated...
Handwashing with soap or alcoholic solutions? A randomized clinical trial of its effectiveness.
Zaragoza, M; Sallés, M; Gomez, J; Bayas, J M; Trilla, A
1999-06-01
The effectiveness of an alcoholic solution compared with the standard hygienic handwashing procedure during regular work in clinical wards and intensive care units of a large public university hospital in Barcelona was assessed. A prospective, randomized clinical trial with crossover design, paired data, and blind evaluation was done. Eligible health care workers (HCWs) included permanent and temporary HCWs of wards and intensive care units. From each category, a random sample of persons was selected. HCWs were randomly assigned to regular handwashing (liquid soap and water) or handwashing with the alcoholic solution by using a crossover design. The number of colony-forming units on agar plates from hands printing in 3 different samples was counted. A total of 47 HCWs were included. The average reduction in the number of colony-forming units from samples before handwashing to samples after handwashing was 49.6% for soap and water and 88.2% for the alcoholic solution. When both methods were compared, the average number of colony-forming units recovered after the procedure showed a statistically significant difference in favor of the alcoholic solution (P <.001). The alcoholic solution was well tolerated by HCWs. Overall acceptance rate was classified as "good" by 72% of HCWs after 2 weeks use. Of all HCWs included, 9.3% stated that the use of the alcoholic solution worsened minor pre-existing skin conditions. Although the regular use of hygienic soap and water handwashing procedures is the gold standard, the use of alcoholic solutions is effective and safe and deserves more attention, especially in situations in which the handwashing compliance rate is hampered by architectural problems (lack of sinks) or nursing work overload.
Patel, Amita; Czerniawski, Barbara; Gray, Shari; Lui, Eric
2003-01-01
BACKGROUND: Heel prick blood sampling is the most common painful invasive procedure performed on neonates. Currently, there are no effective ways to provide pain relief from this painful procedure. OBJECTIVE: To assess the efficacy of the topical anesthetic amethocaine 4% gel (Ametop, Smith & Nephew Inc, St Laurent) in reducing the pain of heel prick blood sampling in neonates. METHODS: A randomized, double-blind, placebo controlled, crossover trial was conducted. Neonates between 33 and 37 weeks' gestational age in their first seven days of life were eligible. Heel prick blood sampling was performed on each participant twice. Each infant was randomly assigned to receive either amethocaine 4% gel or placebo to the heel for the first prick, and then received the alternative agent for the second prick. Prick pain was assessed using both the Premature Infant Pain Profile (PIPP) and the Neonatal Infant Pain Scale (NIPS). Squeeze pain was assessed by NIPS. RESULTS: Ten babies were recruited. There were no significant differences in the average PIPP and NIPS scores between the treatment and placebo groups for both prick and squeeze pains from heel prick blood sampling. For prick pain, linear regression showed a significant correlation between the PIPP and NIPS scores. No adverse reactions were observed after application of either the active or placebo agents. CONCLUSION: Topical amethocaine 4% gel is not shown to reduce prick and squeeze pains significantly from heel prick blood sampling in neonates between 33 and 37 weeks' gestational age. Further studies are needed to find ways to provide effective pain relief from this common procedure. PMID:20020001
Individualizing drug dosage with longitudinal data.
Zhu, Xiaolu; Qu, Annie
2016-10-30
We propose a two-step procedure to personalize drug dosage over time under the framework of a log-linear mixed-effect model. We model patients' heterogeneity using subject-specific random effects, which are treated as the realizations of an unspecified stochastic process. We extend the conditional quadratic inference function to estimate both fixed-effect coefficients and individual random effects on a longitudinal training data sample in the first step and propose an adaptive procedure to estimate new patients' random effects and provide dosage recommendations for new patients in the second step. An advantage of our approach is that we do not impose any distribution assumption on estimating random effects. Moreover, the new approach can accommodate more general time-varying covariates corresponding to random effects. We show in theory and numerical studies that the proposed method is more efficient compared with existing approaches, especially when covariates are time varying. In addition, a real data example of a clozapine study confirms that our two-step procedure leads to more accurate drug dosage recommendations. Copyright © 2016 John Wiley & Sons, Ltd.
A posteriori noise estimation in variable data sets. With applications to spectra and light curves
NASA Astrophysics Data System (ADS)
Czesla, S.; Molle, T.; Schmitt, J. H. M. M.
2018-01-01
Most physical data sets contain a stochastic contribution produced by measurement noise or other random sources along with the signal. Usually, neither the signal nor the noise are accurately known prior to the measurement so that both have to be estimated a posteriori. We have studied a procedure to estimate the standard deviation of the stochastic contribution assuming normality and independence, requiring a sufficiently well-sampled data set to yield reliable results. This procedure is based on estimating the standard deviation in a sample of weighted sums of arbitrarily sampled data points and is identical to the so-called DER_SNR algorithm for specific parameter settings. To demonstrate the applicability of our procedure, we present applications to synthetic data, high-resolution spectra, and a large sample of space-based light curves and, finally, give guidelines to apply the procedure in situations not explicitly considered here to promote its adoption in data analysis.
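Since the abstract notes that the procedure reduces to the DER_SNR algorithm for specific parameter settings, a compact version of that special case may help orient the reader: the noise standard deviation is estimated from the median absolute value of a second-difference-like combination of every other data point, which is insensitive to smooth signal structure. Treat this as a sketch of the special case only, not of the general procedure.

# DER_SNR-style noise estimate (the special case mentioned in the abstract).
import numpy as np

def der_snr_noise(flux):
    """Estimate the noise sigma assuming normal, independent errors."""
    f = np.asarray(flux, dtype=float)
    f = f[np.isfinite(f)]
    # 1.482602 / sqrt(6) scales the median absolute second difference to a sigma
    return 1.482602 / np.sqrt(6.0) * np.median(np.abs(2.0 * f[2:-2] - f[:-4] - f[4:]))

rng = np.random.default_rng(5)
x = np.linspace(0.0, 1.0, 2000)
spectrum = np.sin(8.0 * x) + rng.normal(0.0, 0.05, x.size)   # smooth signal + noise
print(der_snr_noise(spectrum))                               # close to 0.05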
An Efficient Alternative Mixed Randomized Response Procedure
ERIC Educational Resources Information Center
Singh, Housila P.; Tarray, Tanveer A.
2015-01-01
In this article, we have suggested a new modified mixed randomized response (RR) model and studied its properties. It is shown that the proposed mixed RR model is always more efficient than the Kim and Warde's mixed RR model. The proposed mixed RR model has also been extended to stratified sampling. Numerical illustrations and graphical…
ERIC Educational Resources Information Center
Ndirangu, Caroline
2017-01-01
This study aims to evaluate teachers' attitude towards implementation of learner-centered methodology in science education in Kenya. The study used a survey design methodology, adopting the purposive, stratified random and simple random sampling procedures and hypothesised that there was no significant relationship between the head teachers'…
Sampling procedures for inventory of commercial volume tree species in Amazon Forest.
Netto, Sylvio P; Pelissari, Allan L; Cysneiros, Vinicius C; Bonazza, Marcelo; Sanquetta, Carlos R
2017-01-01
The spatial distribution of tropical tree species can affect the consistency of the estimators in commercial forest inventories, therefore, appropriate sampling procedures are required to survey species with different spatial patterns in the Amazon Forest. For this, the present study aims to evaluate the conventional sampling procedures and introduce the adaptive cluster sampling for volumetric inventories of Amazonian tree species, considering the hypotheses that the density, the spatial distribution and the zero-plots affect the consistency of the estimators, and that the adaptive cluster sampling allows to obtain more accurate volumetric estimation. We use data from a census carried out in Jamari National Forest, Brazil, where trees with diameters equal to or higher than 40 cm were measured in 1,355 plots. Species with different spatial patterns were selected and sampled with simple random sampling, systematic sampling, linear cluster sampling and adaptive cluster sampling, whereby the accuracy of the volumetric estimation and presence of zero-plots were evaluated. The sampling procedures applied to species were affected by the low density of trees and the large number of zero-plots, wherein the adaptive clusters allowed concentrating the sampling effort in plots with trees and, thus, agglutinating more representative samples to estimate the commercial volume.
Teacher Evaluation: Practices and Procedures. ERS Report.
ERIC Educational Resources Information Center
Educational Research Service, Arlington, VA.
This report presents findings of the 1988 Educational Research Service survey of teacher evaluation practices and procedures in U.S. schools. The survey instrument was mailed to a random sample of 1,730 superintendents of school districts of varying size. The response rate was 52.5 percent. The first section discusses the purposes of teacher…
Borak, T B
1986-04-01
Periodic grab sampling in combination with time-of-occupancy surveys has been the accepted procedure for estimating the annual exposure of underground U miners to Rn daughters. Temporal variations in the concentration of potential alpha energy in the mine generate uncertainties in this process. A system to randomize the selection of locations for measurement is described which can reduce uncertainties and eliminate systematic biases in the data. In general, a sample frequency of 50 measurements per year is sufficient to satisfy the criteria that the annual exposure be determined in working level months to within +/- 50% of the true value with a 95% level of confidence. Suggestions for implementing this randomization scheme are presented.
Acquisition of delayed matching in the pigeon.
Berryman, R; Cumming, W W; Nevin, J A
1963-01-01
Pigeons were exposed to three successive matching-to-sample procedures. On a given trial, the sample (red, green or blue light) appeared on a center key; observing responses to this key produced the comparison stimuli on two side keys. Seven different experimental conditions could govern the temporal relations between the sample and comparison stimuli. In the "simultaneous" condition, the center key response was followed immediately by illumination of the side key comparison stimuli, with the center key remaining on. In "zero delay" the center key response simultaneously turned the side keys on and the center key off, while in the "variable delay" conditions, intervals of 1, 2, 4, 10, and 24 sec were interposed between the offset of the sample and the appearance of the comparison stimuli on the side keys. In all conditions, a response to the side key of matching hue produced reinforcement, while a response to the non-matching side key was followed by a blackout. In Procedure I, all seven experimental conditions were presented in randomly permuted order. After nine sessions of exposure (at 191 trials per session, for a total of 1719 trials) the birds gave no evidence of acquisition in any of the conditions. They were therefore transferred to Procedure II, which required them to match only in the "simultaneous" condition, with both the sample and comparison stimuli present at the same time. With the exception of one bird, all subjects acquired this performance to near 100% levels. Next, in Procedure III, they were once more exposed to presentation of all seven experimental conditions in random order. In contrast to Procedure I, they now acquired the delay performance, and were able to match effectively at delays of about 4 sec.
Balejo, Rodrigo Dalla Pria; Cortelli, José Roberto; Costa, Fernando Oliveira; Cyrino, Renata Magalhães; Aquino, Davi Romeiro; Cogo-Müller, Karina; Miranda, Taís Browne; Moura, Sara Porto; Cortelli, Sheila Cavalca
2017-01-01
Objective: Single dose of systemic antibiotics and short-term use of mouthwashes reduce bacteremia. However, the effects of a single dose of preprocedural rinse are still controversial. This study evaluated, in periodontally diseased patients, the effects of a pre-procedural mouth rinse on induced bacteremia. Material and Methods: Systemically healthy individuals with gingivitis (n=27) or periodontitis (n = 27) were randomly allocated through a sealed envelope system to: 0.12% chlorhexidine pre-procedural rinse (13 gingivitis and 13 periodontitis patients) or no rinse before dental scaling (14 gingivitis and 15 periodontitis patients). Periodontal probing depth, clinical attachment level, plaque, and gingival indices were measured and subgingival samples were collected. Blood samples were collected before dental scaling, 2 and 6 minutes after scaling. Total bacterial load and levels of P. gingivalis were determined in oral and blood samples by real-time polymerase chain reaction, while aerobic and anaerobic counts were determined by culture in blood samples. The primary outcome was the antimicrobial effect of the pre-procedural rinse. Data was compared by Mann-Whitney and Signal tests (p<0.05). Results: In all sampling times, polymerase chain reaction revealed higher blood bacterial levels than culture (p<0.0001), while gingivitis patients presented lower bacterial levels in blood than periodontitis patients (p<0.0001). Individuals who experienced bacteremia showed worse mean clinical attachment level (3.4 mm vs. 1.1 mm) and more subgingival bacteria (p<0.005). The pre-procedural rinse did not reduce induced bacteremia. Conclusions: Bacteremia was influenced by periodontal parameters. In periodontally diseased patients, pre-procedural rinsing showed a discrete effect on bacteremia control. PMID:29211279
ERIC Educational Resources Information Center
Bakar, Ab Rahim; Mohamed, Shamsiah; Hamzah, Ramlah
2013-01-01
This study was performed to identify the employability skills of technical students from the Industrial Training Institutes (ITI) and Indigenous People's Trust Council (MARA) Skills Training Institutes (IKM) in Malaysia. The study sample consisted of 850 final year trainees of IKM and ITI. The sample was chosen by a random sampling procedure from…
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. The scope of the preprocessing techniques was restricted to standard material from the EROS Data Center, accompanied by some enlarging procedures and the use of the diazo process. Investigation has shown that the most appropriate sampling strategy for this study is the stratified random technique. A viable sampling procedure, together with a method for determining the minimum number of sample points needed to test the results of any interpretation, is presented.
People's Need for Additional Job Training: Development and Evaluation of an Assessment Procedure.
ERIC Educational Resources Information Center
Copa, George H.; Maurice, Clyde F.
A procedure was developed and evaluated for assessing the self-perceived educational needs of people as one input to the process of planning, approving, and implementing relevant educational programs. The method of data collection involved selecting samples of people by randomly selecting households in a given geographic area, and then contacting…
Types of Bullying in the Senior High Schools in Ghana
ERIC Educational Resources Information Center
Antiri, Kwasi Otopa
2016-01-01
The main objective of the study was to examine the types of bullying that were taking place in the senior high schools in Ghana. A multi-stage sampling procedure, comprising purposive, simple random and snowball sampling techniques, was used in the selection of the sample. A total of 354 respondents were drawn from six schools in Ashanti, Central and…
Multistage point relascope and randomized branch sampling for downed coarse woody debris estimation
Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine
2002-01-01
New sampling methods have recently been introduced that allow estimation of downed coarse woody debris using an angle gauge, or relascope. The theory behind these methods is based on sampling straight pieces of downed coarse woody debris. When pieces deviate from this ideal situation, auxillary methods must be employed. We describe a two-stage procedure where the...
Sarkar, Papri; Mikhail, Emad; Schickler, Robyn; Plosker, Shayne; Imudia, Anthony N
2017-09-01
To estimate the optimal order of office hysteroscopy and endometrial biopsy when performed successively for evaluation of abnormal uterine bleeding. Patients undergoing successive office hysteroscopy and endometrial biopsy were included in a single-blind, prospective, randomized trial. The primary outcome was to evaluate the effect of order of procedures on patients' pain score. Prespecified secondary outcomes include procedure duration, hysteroscopic visualization of the uterine cavity, endometrial sample adequacy, and number of attempts at biopsy. Pain scores were assessed using a visual analog scale from 0 to 10 and endometrial sample adequacy was determined from the histopathology report. Hysteroscopy images were recorded. A sample size of 34 per group (n=68) was determined to be adequate to detect a difference of 20% in visual analog scale score between hysteroscopy first (group A) and biopsy first (group B) at α of 0.05 and 80% power. Between October 2015 and January 2017, 78 women were randomized to group A (n=40) and group B (n=38). There was no difference in global pain perception [7 (0-10) vs 7 (0-10); P=.57, 95% CI 5.8-7.1]. Procedure duration [3 (1-9) vs 3 (2-10), P=.32, 95% CI 3.3-4.1] and endometrial sample adequacy (78.9% vs 75.7%, P=.74) were similar in both groups. Group A patients had better endometrial visualization (P<.001) than group B based on the hysteroscopic images: excellent (50% vs 7.9%), good (20% vs 34.2%), and fair (22.5% vs 44.7%); group B participants required fewer endometrial biopsy attempts at obtaining adequate tissue sample (two vs one; P<.001, 1.6-1.9). For patients undergoing successive office hysteroscopy and endometrial biopsy for evaluation of abnormal uterine bleeding, global pain perception and the time required are independent of the order in which the procedures are performed. Performing hysteroscopy first ensures a better image, whereas biopsy first yields an adequate tissue sample with fewer attempts. ClinicalTrials.gov, NCT02472184.
Analytical Applications of Monte Carlo Techniques.
ERIC Educational Resources Information Center
Guell, Oscar A.; Holcombe, James A.
1990-01-01
Described are analytical applications of the theory of random processes, in particular solutions obtained by using statistical procedures known as Monte Carlo techniques. Supercomputer simulations, sampling, integration, ensemble, annealing, and explicit simulation are discussed. (CW)
Contrasting lexical similarity and formal definitions in SNOMED CT: consistency and implications.
Agrawal, Ankur; Elhanan, Gai
2014-02-01
To quantify the presence of and evaluate an approach for detection of inconsistencies in the formal definitions of SNOMED CT (SCT) concepts utilizing a lexical method. Utilizing SCT's Procedure hierarchy, we algorithmically formulated similarity sets: groups of concepts with similar lexical structure of their fully specified name. We formulated five random samples, each with 50 similarity sets, based on the same parameter: number of parents, attributes, groups, all the former, as well as a randomly selected control sample. All samples' sets were reviewed for types of formal definition inconsistencies: hierarchical, attribute assignment, attribute target values, groups, and definitional. For the Procedure hierarchy, 2111 similarity sets were formulated, covering 18.1% of eligible concepts. The evaluation revealed that 38% (Control) to 70% (Different relationships) of similarity sets within the samples exhibited significant inconsistencies. The rate of inconsistencies for the sample with different relationships was highly significant compared to Control, as was the number of attribute assignment and hierarchical inconsistencies within their respective samples. While the formal definitions of SCT are only a minor consideration at this time of the HITECH initiative, they are essential in the grand scheme of sophisticated, meaningful use of captured clinical data. However, a significant portion of the concepts in the most semantically complex hierarchy of SCT, the Procedure hierarchy, are modeled inconsistently in a manner that affects their computability. Lexical methods can efficiently identify such inconsistencies and possibly allow for their algorithmic resolution. Copyright © 2013 Elsevier Inc. All rights reserved.
Performance review using sequential sampling and a practice computer.
Difford, F
1988-06-01
The use of sequential sample analysis for repeated performance review is described with examples from several areas of practice. The value of a practice computer in providing a random sample from a complete population, evaluating the parameters of a sequential procedure, and producing a structured worksheet is discussed. It is suggested that sequential analysis has advantages over conventional sampling in the area of performance review in general practice.
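The abstract does not give the exact scheme, but sequential performance review of this kind is typically a Wald-style sequential probability ratio test on an error (or compliance) rate: randomly drawn records are reviewed one at a time, a log-likelihood ratio is accumulated, and review stops as soon as it crosses an acceptance or rejection boundary. The sketch below shows that generic SPRT logic with invented rates and thresholds; it is not taken from the article.

# Generic Wald SPRT for an error rate; rates and error levels are invented.
import math
import random

def sprt(records, p0=0.05, p1=0.15, alpha=0.05, beta=0.10):
    """records: iterable of 0/1 flags (1 = problem found). Returns decision and count used."""
    lower = math.log(beta / (1 - alpha))
    upper = math.log((1 - beta) / alpha)
    llr, n = 0.0, 0
    for flag in records:
        n += 1
        llr += math.log(p1 / p0) if flag else math.log((1 - p1) / (1 - p0))
        if llr <= lower:
            return "acceptable performance", n
        if llr >= upper:
            return "needs review", n
    return "undecided", n

random.seed(6)
audit = [1 if random.random() < 0.04 else 0 for _ in range(400)]   # true rate 4%
print(sprt(audit))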
Validity and Reliability of Psychosocial Factors Related to Breast Cancer Screening.
ERIC Educational Resources Information Center
Zapka, Jane G.; And Others
1991-01-01
The construct validity of hypothesized survey items and data reduction procedures for selected psychosocial constructs frequently used in breast cancer screening research were investigated in telephone interviews with randomly selected samples of 1,184 and 903 women and a sample of 169 Hispanic clinic clients. Validity of the constructs is…
Tomie, Arthur; Tirado, Aidaluz D; Yu, Lung; Pohorecky, Larissa A
2004-08-12
Pavlovian autoshaping procedures provide for pairings of a small object conditioned stimulus (CS) with a rewarding substance unconditioned stimulus (US), resulting in the acquisition of complex sequences of CS-directed skeletal-motor responses or autoshaping conditioned responses (CRs). Autoshaping procedures induce higher post-session levels of corticosterone than in controls receiving the CS and US randomly, and the enhanced post-session corticosterone levels have been attributed to the appetitive or arousal-inducing effects of autoshaping procedures. Enhanced corticosterone release can be induced by aversive stimulation or stressful situations, where it is often accompanied by higher levels of norepinephrine (NE) and serotonin (5-HT) in prefrontal cortex (PFC) but not in striatum (ST). Effects of autoshaping procedures on post-session corticosterone levels, NE contents in PFC, and 5-HT contents in PFC and ST were investigated in male Long-Evans rats. Post-session blood samples revealed higher corticosterone levels in the CS-US Paired group (n = 46) than in the CS-US Random control group (n = 21), and brain samples revealed higher levels of PFC NE and 5-HT in the CS-US Paired group. Striatal 5-HT levels were unaltered by the autoshaping procedures. Autoshaping procedures provide for appetitive stimulation and induce an arousal-like state, as well as simultaneous stress-like changes in plasma corticosterone and monoamine levels in PFC. Autoshaping, therefore, may be useful for the study of endocrine and central processes associated with appetitive conditions.
Effects of Sample Selection on Estimates of Economic Impacts of Outdoor Recreation
Donald B.K. English
1997-01-01
Estimates of the economic impacts of recreation often come from spending data provided by a self-selected subset of a random sample of site visitors. The subset is frequently less than half the onsite sample. Biased vectors of per trip spending and impact estimates can result if self-selection is related to spending patterns, and proper corrective procedures are not...
A sampling design framework for monitoring secretive marshbirds
Johnson, D.H.; Gibbs, J.P.; Herzog, M.; Lor, S.; Niemuth, N.D.; Ribic, C.A.; Seamans, M.; Shaffer, T.L.; Shriver, W.G.; Stehman, S.V.; Thompson, W.L.
2009-01-01
A framework for a sampling plan for monitoring marshbird populations in the contiguous 48 states is proposed here. The sampling universe is the breeding habitat (i.e. wetlands) potentially used by marshbirds. Selection protocols would be implemented within large geographical strata, such as Bird Conservation Regions. Site selection will be done using a two-stage cluster sample. Primary sampling units (PSUs) would be land areas, such as legal townships, and would be selected by a procedure such as systematic sampling. Secondary sampling units (SSUs) will be wetlands or portions of wetlands in the PSUs. SSUs will be selected by a randomized spatially balanced procedure. For analysis, the use of a variety of methods is encouraged as a means of increasing confidence in the conclusions that may be reached. Additional effort will be required to work out details and implement the plan.
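A minimal sketch of the two-stage selection outlined above: primary sampling units chosen by systematic sampling with a random start, then secondary units drawn within each selected PSU. For simplicity the SSUs here are drawn by simple random sampling rather than the spatially balanced procedure the plan calls for, and the PSU frame is hypothetical.

    # Two-stage cluster sample: systematic selection of PSUs (e.g., townships),
    # then random selection of SSUs (e.g., wetlands) within each selected PSU.
    import numpy as np

    rng = np.random.default_rng(1)
    n_psu_total, n_psu_sample, n_ssu_per_psu = 200, 20, 3
    wetlands_per_psu = rng.integers(5, 40, size=n_psu_total)   # hypothetical SSU frame sizes

    # Stage 1: systematic sample of PSUs with a random start.
    step = n_psu_total // n_psu_sample
    start = int(rng.integers(0, step))
    psu_ids = np.arange(start, n_psu_total, step)[:n_psu_sample]

    # Stage 2: simple random sample of SSUs within each selected PSU
    # (a spatially balanced selection would be used in practice).
    sample = {int(p): sorted(rng.choice(int(wetlands_per_psu[p]),
                                        size=min(n_ssu_per_psu, int(wetlands_per_psu[p])),
                                        replace=False).tolist())
              for p in psu_ids}
    print(sample)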
Characterization of Friction Joints Subjected to High Levels of Random Vibration
NASA Technical Reports Server (NTRS)
deSantos, Omar; MacNeal, Paul
2012-01-01
This paper describes the test program in detail including test sample description, test procedures, and vibration test results of multiple test samples. The material pairs used in the experiment were Aluminum-Aluminum, Aluminum- Dicronite coated Aluminum, and Aluminum-Plasmadize coated Aluminum. Levels of vibration for each set of twelve samples of each material pairing were gradually increased until all samples experienced substantial displacement. Data was collected on 1) acceleration in all three axes, 2) relative static displacement between vibration runs utilizing photogrammetry techniques, and 3) surface galling and contaminant generation. This data was used to estimate the values of static friction during random vibratory motion when "stick-slip" occurs and compare these to static friction coefficients measured before and after vibration testing.
Quantifying Adventitious Error in a Covariance Structure as a Random Effect
Wu, Hao; Browne, Michael W.
2017-01-01
We present an approach to quantifying errors in covariance structures in which adventitious error, identified as the process underlying the discrepancy between the population and the structured model, is explicitly modeled as a random effect with a distribution, and the dispersion parameter of this distribution to be estimated gives a measure of misspecification. Analytical properties of the resultant procedure are investigated and the measure of misspecification is found to be related to the RMSEA. An algorithm is developed for numerical implementation of the procedure. The consistency and asymptotic sampling distributions of the estimators are established under a new asymptotic paradigm and an assumption weaker than the standard Pitman drift assumption. Simulations validate the asymptotic sampling distributions and demonstrate the importance of accounting for the variations in the parameter estimates due to adventitious error. Two examples are also given as illustrations. PMID:25813463
Jackknifing Techniques for Evaluation of Equating Accuracy. Research Report. ETS RR-09-39
ERIC Educational Resources Information Center
Haberman, Shelby J.; Lee, Yi-Hsuan; Qian, Jiahe
2009-01-01
Grouped jackknifing may be used to evaluate the stability of equating procedures with respect to sampling error and with respect to changes in anchor selection. Properties of grouped jackknifing are reviewed for simple-random and stratified sampling, and its use is described for comparisons of anchor sets. Application is made to examples of item…
ERIC Educational Resources Information Center
Chiner, Esther; Cardona, Maria Cristina
2013-01-01
This study examined regular education teachers' perceptions of inclusion in elementary and secondary schools in Spain and how these perceptions may differ depending on teaching experience, skills, and the availability of resources and supports. Stratified random sampling procedures were used to draw a representative sample of 336 general education…
Predictor sort sampling, tight t's, and the analysis of covariance: theory, tables, and examples
S. P. Verrill; D. W. Green
In recent years wood strength researchers have begun to replace experimental unit allocation via random sampling with allocation via sorts based on nondestructive measurements of strength predictors such as modulus of elasticity and specific gravity. Although this procedure has the potential of greatly increasing experimental sensitivity, as currently implemented it...
ERIC Educational Resources Information Center
Johnson, Matthew S.; Jenkins, Frank
2005-01-01
Large-scale educational assessments such as the National Assessment of Educational Progress (NAEP) sample examinees to whom an exam will be administered. In most situations the sampling design is not a simple random sample and must be accounted for in the estimating model. After reviewing the current operational estimation procedure for NAEP, this…
NASA Technical Reports Server (NTRS)
Reeves, C. A. (Principal Investigator)
1978-01-01
The author has identified the following significant results. An assumption that short prairie grass and salt grass could be differentiated on aircraft photographs was inaccurate for the Weld County site. However, rangeland could be differentiated using procedure 1 from LACIE. Estimates derived from either random or systematic sampling were satisfactory. Level 1 features were separated and mapped, and proportions were estimated with accompanying confidence statements.
Long Term Resource Monitoring Program procedures: fish monitoring
Ratcliff, Eric N.; Glittinger, Eric J.; O'Hara, T. Matt; Ickes, Brian S.
2014-01-01
This manual constitutes the second revision of the U.S. Army Corps of Engineers’ Upper Mississippi River Restoration-Environmental Management Program (UMRR-EMP) Long Term Resource Monitoring Program (LTRMP) element Fish Procedures Manual. The original (1988) manual merged and expanded on ideas and recommendations related to Upper Mississippi River fish sampling presented in several early documents. The first revision to the manual was made in 1995 reflecting important protocol changes, such as the adoption of a stratified random sampling design. The 1995 procedures manual has been an important document through the years and has been cited in many reports and scientific manuscripts. The resulting data collected by the LTRMP fish component represent the largest dataset on fish within the Upper Mississippi River System (UMRS) with more than 44,000 collections of approximately 5.7 million fish. The goal of this revision of the procedures manual is to document changes in LTRMP fish sampling procedures since 1995. Refinements to sampling methods become necessary as monitoring programs mature. Possible refinements are identified through field experiences (e.g., sampling techniques and safety protocols), data analysis (e.g., planned and studied gear efficiencies and reallocations of effort), and technological advances (e.g., electronic data entry). Other changes may be required because of financial necessity (i.e., unplanned effort reductions). This version of the LTRMP fish monitoring manual describes the most current (2014) procedures of the LTRMP fish component.
Study Design Rigor in Animal-Experimental Research Published in Anesthesia Journals.
Hoerauf, Janine M; Moss, Angela F; Fernandez-Bustamante, Ana; Bartels, Karsten
2018-01-01
Lack of reproducibility of preclinical studies has been identified as an impediment for translation of basic mechanistic research into effective clinical therapies. Indeed, the National Institutes of Health has revised its grant application process to require more rigorous study design, including sample size calculations, blinding procedures, and randomization steps. We hypothesized that the reporting of such metrics of study design rigor has increased over time for animal-experimental research published in anesthesia journals. PubMed was searched for animal-experimental studies published in 2005, 2010, and 2015 in primarily English-language anesthesia journals. A total of 1466 publications were graded on the performance of sample size estimation, randomization, and blinding. Cochran-Armitage test was used to assess linear trends over time for the primary outcome of whether or not a metric was reported. Interrater agreement for each of the 3 metrics (power, randomization, and blinding) was assessed using the weighted κ coefficient in a 10% random sample of articles rerated by a second investigator blinded to the ratings of the first investigator. A total of 1466 manuscripts were analyzed. Reporting for all 3 metrics of experimental design rigor increased over time (2005 to 2010 to 2015): for power analysis, from 5% (27/516), to 12% (59/485), to 17% (77/465); for randomization, from 41% (213/516), to 50% (243/485), to 54% (253/465); and for blinding, from 26% (135/516), to 38% (186/485), to 47% (217/465). The weighted κ coefficients and 98.3% confidence interval indicate almost perfect agreement between the 2 raters beyond that which occurs by chance alone (power, 0.93 [0.85, 1.0], randomization, 0.91 [0.85, 0.98], and blinding, 0.90 [0.84, 0.96]). Our hypothesis that reported metrics of rigor in animal-experimental studies in anesthesia journals have increased during the past decade was confirmed. More consistent reporting, or explicit justification for absence, of sample size calculations, blinding techniques, and randomization procedures could better enable readers to evaluate potential sources of bias in animal-experimental research manuscripts. Future studies should assess whether such steps lead to improved translation of animal-experimental anesthesia research into successful clinical trials.
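The linear trend reported for each metric can be checked with a hand-coded Cochran-Armitage test; the sketch below applies it to the power-analysis counts quoted in the abstract (27/516, 59/485, 77/465) and is an illustration rather than the authors' analysis code.

    # Cochran-Armitage test for a linear trend in reporting rates across 2005/2010/2015,
    # using the power-analysis counts reported in the abstract.
    import numpy as np
    from scipy.stats import norm

    reported = np.array([27, 59, 77])       # manuscripts reporting a power analysis
    totals   = np.array([516, 485, 465])
    scores   = np.array([0.0, 1.0, 2.0])    # ordered scores for the three years

    p_bar = reported.sum() / totals.sum()
    t_stat = np.sum(scores * (reported - totals * p_bar))
    var_t = p_bar * (1 - p_bar) * (np.sum(totals * scores**2)
                                   - np.sum(totals * scores)**2 / totals.sum())
    z = t_stat / np.sqrt(var_t)
    print(z, 2 * norm.sf(abs(z)))           # two-sided p-value for the trend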
Qiu, Xing; Hu, Rui; Wu, Zhixin
2014-01-01
Normalization procedures are widely used in high-throughput genomic data analyses to remove various technological noise and variations. They are known to have a profound impact on the subsequent gene differential expression analysis. Although there has been some research in evaluating different normalization procedures, few attempts have been made to systematically evaluate the gene detection performances of normalization procedures from the bias-variance trade-off point of view, especially with strong gene differentiation effects and large sample size. In this paper, we conduct a thorough study to evaluate the effects of normalization procedures combined with several commonly used statistical tests and multiple testing procedures (MTPs) under different configurations of effect size and sample size. We conduct theoretical evaluation based on a random effect model, as well as simulation and biological data analyses to verify the results. Based on our findings, we provide some practical guidance for selecting a suitable normalization procedure under different scenarios. PMID:24941114
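The paper compares several normalization procedures; purely as an illustration of what such a procedure does, the sketch below applies one widely used method (quantile normalization) to a made-up expression matrix. It is not the authors' pipeline.

    # Quantile normalization of a genes-by-samples matrix: every sample (column)
    # is forced to share the same empirical distribution of values.
    import numpy as np

    def quantile_normalize(x):
        order = np.argsort(x, axis=0)                   # within-column ranks
        rank_means = np.sort(x, axis=0).mean(axis=1)    # mean value at each rank
        out = np.empty_like(x, dtype=float)
        for j in range(x.shape[1]):
            out[order[:, j], j] = rank_means
        return out

    rng = np.random.default_rng(0)
    expr = rng.lognormal(mean=2.0, sigma=1.0, size=(1000, 6))   # hypothetical data
    normalized = quantile_normalize(expr)
    print(normalized.mean(axis=0))   # column summaries now agree across samples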
NASA Technical Reports Server (NTRS)
Chapman, G. M. (Principal Investigator); Carnes, J. G.
1981-01-01
Several techniques which use clusters generated by a new clustering algorithm, CLASSY, are proposed as alternatives to random sampling to obtain greater precision in crop proportion estimation: (1) Proportional Allocation/relative count estimator (PA/RCE) uses proportional allocation of dots to clusters on the basis of cluster size and a relative count cluster-level estimate; (2) Proportional Allocation/Bayes Estimator (PA/BE) uses proportional allocation of dots to clusters and a Bayesian cluster-level estimate; and (3) Bayes Sequential Allocation/Bayesian Estimator (BSA/BE) uses sequential allocation of dots to clusters and a Bayesian cluster-level estimate. Clustering is an effective method for making proportion estimates. It is estimated that, to obtain the same precision with random sampling as obtained by the proportional sampling of 50 dots with an unbiased estimator, samples of 85 or 166 would need to be taken if dot sets with AI labels (integrated procedure) or ground truth labels, respectively, were input. Dot reallocation provides dot sets that are unbiased. It is recommended that these proportion estimation techniques be maintained, particularly the PA/BE because it provides the greatest precision.
Beukes, Lorika S; Schmidt, Stefan
2018-04-16
The aim of this study was to assess pit latrine samples from a peri-urban community in KwaZulu-Natal (South Africa) for the presence of multidrug-resistant (MDR) Staphylococcus spp. Standard procedures were used to isolate Staphylococcus spp. from pit latrine fecal sludge samples, with confirmation at genus level by polymerase chain reaction (PCR). Sixty-eight randomly selected pit latrine Staphylococcus spp. isolates were further characterized by using established disk diffusion procedures. An average Staphylococcus spp. count of 2.1 × 10^5 CFU per g fecal material was established using two randomly selected pit latrine samples. Of the 68 selected Staphylococcus spp. pit latrine isolates, 49% were identified as coagulase positive, 51% as coagulase negative and 65% (12 coagulase positive, 32 coagulase negative isolates) were categorized as MDR. The majority (66/68) of Staphylococcus spp. isolates displayed resistance to fusidic acid while only 5/68 isolates displayed resistance to chloramphenicol. The pit latrine samples analyzed in this study are a source of MDR Staphylococcus spp., highlighting the need for proper hygiene and sanitation regimes in rural communities using these facilities.
Adaptive sampling in research on risk-related behaviors.
Thompson, Steven K; Collins, Linda M
2002-11-01
This article introduces adaptive sampling designs to substance use researchers. Adaptive sampling is particularly useful when the population of interest is rare, unevenly distributed, hidden, or hard to reach. Examples of such populations are injection drug users, individuals at high risk for HIV/AIDS, and young adolescents who are nicotine dependent. In conventional sampling, the sampling design is based entirely on a priori information, and is fixed before the study begins. By contrast, in adaptive sampling, the sampling design adapts based on observations made during the survey; for example, drug users may be asked to refer other drug users to the researcher. In the present article several adaptive sampling designs are discussed. Link-tracing designs such as snowball sampling, random walk methods, and network sampling are described, along with adaptive allocation and adaptive cluster sampling. It is stressed that special estimation procedures taking the sampling design into account are needed when adaptive sampling has been used. These procedures yield estimates that are considerably better than conventional estimates. For rare and clustered populations adaptive designs can give substantial gains in efficiency over conventional designs, and for hidden populations link-tracing and other adaptive procedures may provide the only practical way to obtain a sample large enough for the study objectives.
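As one small illustration of the adaptive allocation mentioned above (a generic sketch, not a design from the article), the code below runs a short pilot in each stratum and then directs the remaining interviews toward the strata that showed the most variability.

    # Adaptive allocation sketch: after a pilot phase, allocate the remaining budget
    # roughly in proportion to N_h * s_h (Neyman-style), so more variable strata
    # receive more sampling effort. All values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(2)
    stratum_sizes = np.array([5000, 2000, 800])   # N_h in each stratum (hypothetical)
    true_sd = np.array([1.0, 3.0, 6.0])           # unknown in practice; used to simulate
    pilot_n, total_budget = 15, 300

    pilot_sd = np.array([rng.normal(0.0, sd, pilot_n).std(ddof=1) for sd in true_sd])
    weights = stratum_sizes * pilot_sd
    remaining = total_budget - len(true_sd) * pilot_n
    extra = np.floor(remaining * weights / weights.sum()).astype(int)
    print("pilot SDs:", pilot_sd.round(2))
    print("additional samples per stratum:", extra)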
SNP selection and classification of genome-wide SNP data using stratified sampling random forests.
Wu, Qingyao; Ye, Yunming; Liu, Yang; Ng, Michael K
2012-09-01
For high dimensional genome-wide association (GWA) case-control data of complex disease, there are usually a large portion of single-nucleotide polymorphisms (SNPs) that are irrelevant to the disease. A simple random sampling method in random forests, using the default mtry parameter to choose the feature subspace, will select too many subspaces without informative SNPs. Exhaustively searching for an optimal mtry is often required in order to include useful and relevant SNPs and exclude the vast number of non-informative SNPs; however, this is too time-consuming and not favorable in GWA for high-dimensional data. The main aim of this paper is to propose a stratified sampling method for feature subspace selection to generate decision trees in a random forest for GWA high-dimensional data. Our idea is to design an equal-width discretization scheme for informativeness to divide SNPs into multiple groups. In feature subspace selection, we randomly select the same number of SNPs from each group and combine them to form a subspace to generate a decision tree. This stratified sampling procedure ensures that each subspace contains enough useful SNPs, avoids the very high computational cost of an exhaustive search for an optimal mtry, and maintains the randomness of a random forest. We employ two genome-wide SNP data sets (Parkinson case-control data comprising 408,803 SNPs and Alzheimer case-control data comprising 380,157 SNPs) to demonstrate that the proposed stratified sampling method is effective, and that it can generate better random forests with higher accuracy and a lower error bound than those produced by Breiman's random forest generation method. For the Parkinson data, we also show some interesting genes identified by the method, which may be associated with neurological disorders and merit further biological investigation.
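A minimal sketch of the subspace-selection step described above: SNPs are binned into equal-width groups by an informativeness score, and the same number of SNPs is drawn from every group to form the feature subspace for one tree. The scores and sizes below are hypothetical.

    # Stratified feature-subspace selection for one tree of a random forest.
    import numpy as np

    rng = np.random.default_rng(3)
    n_snps, n_groups, per_group = 10_000, 5, 20
    informativeness = rng.random(n_snps)      # hypothetical per-SNP informativeness scores

    # Equal-width discretization of informativeness into n_groups groups.
    edges = np.linspace(informativeness.min(), informativeness.max(), n_groups + 1)
    group = np.clip(np.digitize(informativeness, edges[1:-1]), 0, n_groups - 1)

    # Draw the same number of SNPs from every group to form the subspace.
    subspace = np.concatenate([
        rng.choice(np.where(group == g)[0],
                   size=min(per_group, int((group == g).sum())), replace=False)
        for g in range(n_groups)
    ])
    print(subspace.shape)   # indices of SNPs used to grow one decision tree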
Lofwall, Michelle R; Nuzzo, Paul A; Campbell, Charles; Walsh, Sharon L
2014-06-01
Aripiprazole is a partial agonist at dopamine (D2) and serotonin (5-HT1a) receptors and 5-HT2 antagonist. Because cocaine affects dopamine and serotonin, this study assessed whether aripiprazole could diminish the reinforcing efficacy of cocaine. Secondary aims evaluated aripiprazole on ad lib cigarette smoking and with a novel 40-hr smoking abstinence procedure. Adults with regular cocaine and cigarette use completed this inpatient double blind, randomized, placebo-controlled mixed-design study. A placebo lead-in was followed by randomization to aripiprazole (0, 2 or 10 mg/day/p.o.; n = 7 completed/group). Three sets of test sessions, each consisting of 3 cocaine sample-choice (i.e., self-administration) sessions and 1 dose-response session, were conducted (once during the lead-in and twice after randomization). Sample sessions tested each cocaine dose (0, 20 and 40 mg/70 kg, i.v.) in random order; subjective, observer-rated and physiologic outcomes were collected. Later that day, participants chose between the morning's sample dose or descending amounts of money over 7 trials. In dose response sessions, all doses were given 1 hr apart in ascending order for pharmacodynamic and pharmacokinetic assessment. Two sets of smoking topography sessions were conducted during the lead-in and after randomization; 1 with and 1 without 40 hr of smoking abstinence. Number of ad lib cigarettes smoked during non-session days was collected. Cocaine produced prototypic effects, but aripiprazole did not significantly alter these effects or smoking outcomes. The smoking abstinence procedure reliably produced nicotine withdrawal and craving and increased smoking modestly. These data do not support further investigation of aripiprazole for cocaine or tobacco use disorder treatment. PsycINFO Database Record (c) 2014 APA, all rights reserved.
A multiple-objective optimal exploration strategy
Christakos, G.; Olea, R.A.
1988-01-01
Exploration for natural resources is accomplished through partial sampling of extensive domains. Such imperfect knowledge is subject to sampling error. Complex systems of equations resulting from modelling based on the theory of correlated random fields are reduced to simple analytical expressions providing global indices of estimation variance. The indices are utilized by multiple objective decision criteria to find the best sampling strategies. The approach is not limited by the geometric nature of the sampling, covers a wide range in spatial continuity and leads to a step-by-step procedure. © 1988.
Audit of lymphadenectomy in lung cancer resections using a specimen collection kit and checklist
Osarogiagbon, Raymond U.; Sareen, Srishti; Eke, Ransome; Yu, Xinhua; McHugh, Laura M.; Kernstine, Kemp H.; Putnam, Joe B.; Robbins, Edward T.
2014-01-01
Background Audits of operative summaries and pathology reports reveal wide discordance in identifying the extent of lymphadenectomy performed (the communication gap). We tested the ability of a pre-labeled lymph node specimen collection kit and checklist to narrow the communication gap between operating surgeons, pathologists, and auditors of surgeons’ operation notes. Methods We conducted a prospective single cohort study of lung cancer resections performed with a lymph node collection kit from November 2010 to January 2013. We used the kappa statistic to compare surgeon claims on a checklist of lymph node stations harvested intraoperatively, to pathology reports, and an independent audit of surgeons’ operative summaries. Lymph node collection procedures were classified into 4 groups based on the anatomic origin of resected lymph nodes: mediastinal lymph node dissection, systematic sampling, random sampling and no sampling. Results From the pathology report, 73% of 160 resections had a mediastinal lymph node dissection or systematic sampling procedure, 27% had random sampling. The concordance with surgeon claims was 80% (kappa statistic 0.69 [CI 0.60 – 0.79]). Concordance between independent audits of the operation notes and either the pathology report (kappa 0.14 [0.04 – 0.23]), or surgeon claims (kappa 0.09 [0.03 – 0.22]), was poor. Conclusion A pre-labeled specimen collection kit and checklist significantly narrowed the communication gap between surgeons and pathologists in identifying the extent of lymphadenectomy. Audit of surgeons’ operation notes did not accurately reflect the procedure performed, bringing its value for quality improvement work into question. PMID:25530090
Lacey, John H.; Kelley-Baker, Tara; Voas, Robert B.; Romano, Eduardo; Furr-Holden, C. Debra; Torres, Pedro; Berning, Amy
2013-01-01
This article describes the methodology used in the 2007 U.S. National Roadside Survey to estimate the prevalence of alcohol- and drug-impaired driving and alcohol- and drug-involved driving. This study involved randomly stopping drivers at 300 locations across the 48 continental U.S. states at sites selected through a stratified random sampling procedure. Data were collected during a 2-hour Friday daytime session at 60 locations and during 2-hour nighttime weekend periods at 240 locations. Both self-report and biological measures were taken. Biological measures included breath alcohol measurements from 9,413 respondents, oral fluid samples from 7,719 respondents, and blood samples from 3,276 respondents. PMID:21997324
Reboussin, Beth A; Preisser, John S; Song, Eun-Young; Wolfson, Mark
2012-07-01
Under-age drinking is an enormous public health issue in the USA. Evidence that community level structures may impact on under-age drinking has led to a proliferation of efforts to change the environment surrounding the use of alcohol. Although the focus of these efforts is to reduce drinking by individual youths, environmental interventions are typically implemented at the community level with entire communities randomized to the same intervention condition. A distinct feature of these trials is the tendency of the behaviours of individuals residing in the same community to be more alike than that of others residing in different communities, which is herein called 'clustering'. Statistical analyses and sample size calculations must account for this clustering to avoid type I errors and to ensure an appropriately powered trial. Clustering itself may also be of scientific interest. We consider the alternating logistic regressions procedure within the population-averaged modelling framework to estimate the effect of a law enforcement intervention on the prevalence of under-age drinking behaviours while modelling the clustering at multiple levels, e.g. within communities and within neighbourhoods nested within communities, by using pairwise odds ratios. We then derive sample size formulae for estimating intervention effects when planning a post-test-only or repeated cross-sectional community-randomized trial using the alternating logistic regressions procedure.
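As a simplified illustration of why the clustering matters for planning, the sketch below uses a standard design-effect calculation with hypothetical prevalences and intra-community correlation; it is a stand-in for, not a reproduction of, the alternating-logistic-regressions formulae derived in the paper.

    # Design-effect sketch for a community-randomized trial: inflate the sample size
    # needed under individual randomization by 1 + (m - 1) * ICC.
    from scipy.stats import norm

    p0, p1 = 0.40, 0.32      # hypothetical drinking prevalence: control vs intervention
    alpha, power = 0.05, 0.80
    m, icc = 100, 0.02       # youths surveyed per community; within-community correlation

    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    p_bar = (p0 + p1) / 2
    n_indiv = 2 * z**2 * p_bar * (1 - p_bar) / (p0 - p1) ** 2   # per arm, ignoring clustering
    deff = 1 + (m - 1) * icc
    communities_per_arm = n_indiv * deff / m
    print(round(n_indiv), round(communities_per_arm, 1))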
Weighting by Inverse Variance or by Sample Size in Random-Effects Meta-Analysis
ERIC Educational Resources Information Center
Marin-Martinez, Fulgencio; Sanchez-Meca, Julio
2010-01-01
Most of the statistical procedures in meta-analysis are based on the estimation of average effect sizes from a set of primary studies. The optimal weight for averaging a set of independent effect sizes is the inverse variance of each effect size, but in practice these weights have to be estimated, being affected by sampling error. When assuming a…
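For reference, the inverse-variance weighting discussed above reduces to a few lines: fixed-effect weights 1/v_i, and random-effects weights 1/(v_i + tau^2) with tau^2 estimated here by the DerSimonian-Laird moment estimator. The effect sizes and variances below are made up.

    # Inverse-variance weighted average of independent effect sizes,
    # fixed-effect and DerSimonian-Laird random-effects versions.
    import numpy as np

    y = np.array([0.30, 0.10, 0.55, 0.20, 0.40])   # effect sizes from primary studies
    v = np.array([0.02, 0.05, 0.04, 0.01, 0.03])   # their sampling variances

    w = 1.0 / v                                    # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * y) / np.sum(w)

    q = np.sum(w * (y - fixed) ** 2)               # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # DerSimonian-Laird between-study variance

    w_re = 1.0 / (v + tau2)                        # random-effects weights
    random_effects = np.sum(w_re * y) / np.sum(w_re)
    print(fixed, random_effects, np.sqrt(1.0 / np.sum(w_re)))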
ERIC Educational Resources Information Center
Teva, Inmaculada; Bermudez, Maria Paz; Buela-Casal, Gualberto
2010-01-01
The aim of this study was to assess whether coping styles, social stress, and sexual sensation seeking were predictors of HIV/STD risk behaviours in adolescents. A representative sample of 4,456 female and male Spanish high school students aged 13 to 18 years participated. A stratified random sampling procedure was used. Self-report questionnaires…
ERIC Educational Resources Information Center
BROOKS, MELVIN S.; HILGENDORF, ROBERT L.
MIGRANT LABORERS WHO PICKED STRAWBERRIES IN SOUTHERN ILLINOIS IN THE SPRING OF 1958 WERE SURVEYED. THIS PARTICULAR SAMPLE WAS SELECTED FOR STUDY BECAUSE FAR MORE CHILDREN ARE INVOLVED IN THE HARVEST OF STRAWBERRIES THAN IN ANY OTHER FARM TASK OF THE AREA. MIGRANTS WHO WERE INTERVIEWED WERE SELECTED BY SYSTEMATIC RANDOM SAMPLING--A PROCEDURE THAT…
SAS procedures for designing and analyzing sample surveys
Stafford, Joshua D.; Reinecke, Kenneth J.; Kaminski, Richard M.
2003-01-01
Complex surveys often are necessary to estimate occurrence (or distribution), density, and abundance of plants and animals for purposes of research and conservation. Most scientists are familiar with simple random sampling, where sample units are selected from a population of interest (sampling frame) with equal probability. However, the goal of ecological surveys often is to make inferences about populations over large or complex spatial areas where organisms are not homogeneously distributed or sampling frames are inconvenient or impossible to construct. Candidate sampling strategies for such complex surveys include stratified, multistage, and adaptive sampling (Thompson 1992, Buckland 1994).
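Although the paper concerns SAS survey procedures, the core stratified estimator is easy to state; the sketch below (Python, with hypothetical stratum data) computes a stratified estimate of a population total and its standard error with the finite population correction.

    # Stratified estimate of a population total and its standard error:
    # total = sum_h N_h * ybar_h ; var = sum_h N_h^2 * (1 - n_h/N_h) * s_h^2 / n_h.
    import numpy as np

    rng = np.random.default_rng(4)
    N_h = np.array([400, 250, 100])   # stratum sizes in the frame (hypothetical)
    samples = [rng.normal(mu, sd, n)  # observed values in each stratum's sample
               for mu, sd, n in [(5.0, 2.0, 30), (12.0, 4.0, 25), (30.0, 9.0, 15)]]

    total = sum(N * s.mean() for N, s in zip(N_h, samples))
    var = sum(N**2 * (1 - len(s) / N) * s.var(ddof=1) / len(s)
              for N, s in zip(N_h, samples))
    print(total, np.sqrt(var))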
Determining Consumer Preference for Furniture Product Characteristics
ERIC Educational Resources Information Center
Turner, Carolyn S.; Edwards, Kay P.
1974-01-01
The paper describes instruments for determining preferences of consumers for selected product characteristics associated with furniture choices--specifically style, color, color scheme, texture, and materials--and the procedures for administration of those instruments. Results are based on a random sampling of public housing residents. (Author/MW)
Michael, Claire W; Naik, Kalyani; McVicker, Michael
2013-05-01
We developed a value stream map (VSM) of the Papanicolaou test procedure to identify opportunities to reduce waste and errors, created a new VSM, and implemented a new process emphasizing Lean tools. Preimplementation data revealed the following: (1) processing time (PT) for 1,140 samples averaged 54 hours; (2) 27 accessioning errors were detected on review of 357 random requisitions (7.6%); (3) 5 of the 20,060 tests had labeling errors that had gone undetected in the processing stage. Four were detected later during specimen processing but 1 reached the reporting stage. Postimplementation data were as follows: (1) PT for 1,355 samples averaged 31 hours; (2) 17 accessioning errors were detected on review of 385 random requisitions (4.4%); and (3) no labeling errors were undetected. Our results demonstrate that implementation of Lean methods, such as first-in first-out processes and minimizing batch size by staff actively participating in the improvement process, allows for higher quality, greater patient safety, and improved efficiency.
Rigorously Assessing Whether the Data Backs the Back School
Vinh, David T.; Johnson, Craig W.; Phelps, Cynthia L.
2003-01-01
A rigorous between-subjects methodology employing independent random samples and having broad clinical applicability was designed and implemented to evaluate the effectiveness of back safety and patient transfer training interventions for both hospital nurses and nursing assistants. Effects upon self-efficacy, cognitive, and affective measures are assessed for each of three back safety procedures. The design solves the problem of obtaining randomly assigned independent controls where all experimental subjects must participate in the training interventions. PMID:14728544
Robustness of methods for blinded sample size re-estimation with overdispersed count data.
Schneider, Simon; Schmidli, Heinz; Friede, Tim
2013-09-20
Counts of events are increasingly common as primary endpoints in randomized clinical trials. With between-patient heterogeneity leading to variances in excess of the mean (referred to as overdispersion), statistical models reflecting this heterogeneity by mixtures of Poisson distributions are frequently employed. Sample size calculation in the planning of such trials requires knowledge of the nuisance parameters, that is, the control (or overall) event rate and the overdispersion parameter. Usually, there is only little prior knowledge regarding these parameters in the design phase, resulting in considerable uncertainty regarding the sample size. In this situation internal pilot studies have been found very useful, and very recently several blinded procedures for sample size re-estimation have been proposed for overdispersed count data, one of which is based on an EM algorithm. In this paper we investigate the EM-algorithm-based procedure with respect to aspects of its implementation by studying the algorithm's dependence on the choice of convergence criterion, and find that the procedure is sensitive to the choice of the stopping criterion in scenarios relevant to clinical practice. We also compare the EM-based procedure to other competing procedures regarding their operating characteristics, such as sample size distribution and power. Furthermore, the robustness of these procedures to deviations from the model assumptions is explored. We find that some of the procedures are robust to at least moderate deviations. The results are illustrated using data from the US National Heart, Lung and Blood Institute sponsored Asymptomatic Cardiac Ischemia Pilot study. Copyright © 2013 John Wiley & Sons, Ltd.
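To make the nuisance-parameter problem concrete, the sketch below shows a blinded, moment-based estimate of the overall event rate and the negative binomial overdispersion from pooled internal-pilot counts. It is a deliberately simplified stand-in for, not a reproduction of, the EM-algorithm-based re-estimation procedure investigated in the paper.

    # Blinded moment estimates of the overall rate mu and overdispersion phi
    # (Var = mu + phi * mu^2) from pooled internal-pilot counts, ignoring arm labels.
    import numpy as np

    rng = np.random.default_rng(5)
    mu_true, phi_true, n_pilot = 2.0, 0.5, 120
    r = 1.0 / phi_true                     # negative binomial shape parameter
    p = r / (r + mu_true)                  # success probability giving mean mu_true
    counts = rng.negative_binomial(r, p, size=n_pilot)   # simulated blinded counts

    mu_hat = counts.mean()
    phi_hat = max(0.0, (counts.var(ddof=1) - mu_hat) / mu_hat**2)
    print(mu_hat, phi_hat)                 # plug into the sample size formula at re-estimation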
1973 U.S. national roadside breathtesting survey : procedures and results
DOT National Transportation Integrated Search
1974-10-01
Author's abstract: This first U.S. national roadside breathtesting survey was conducted at 185 roadside locations in 18 states. Random samples of 3,698 motorists were stopped between 10PM and 3AM on eight weekends in the fall of 1973. From these driv...
Linear discriminant analysis with misallocation in training samples
NASA Technical Reports Server (NTRS)
Chhikara, R. (Principal Investigator); Mckeon, J.
1982-01-01
Linear discriminant analysis for a two-class case is studied in the presence of misallocation in training samples. A general approach to modeling of misallocation is formulated, and the mean vectors and covariance matrices of the mixture distributions are derived. The asymptotic distribution of the discriminant boundary is obtained and the asymptotic first two moments of the two types of error rate are given. Certain numerical results for the error rates are presented by considering the random and two non-random misallocation models. It is shown that when the allocation procedure for training samples is objectively formulated, the effect of misallocation on the error rates of the Bayes linear discriminant rule can almost be eliminated. If, however, this is not possible, the use of the Fisher rule may be preferred over the Bayes rule.
Audit of lymphadenectomy in lung cancer resections using a specimen collection kit and checklist.
Osarogiagbon, Raymond U; Sareen, Srishti; Eke, Ransome; Yu, Xinhua; McHugh, Laura M; Kernstine, Kemp H; Putnam, Joe B; Robbins, Edward T
2015-02-01
Audits of operative summaries and pathology reports reveal wide discordance in identifying the extent of lymphadenectomy performed (the communication gap). We tested the ability of a prelabeled lymph node specimen collection kit and checklist to narrow the communication gap between operating surgeons, pathologists, and auditors of surgeons' operation notes. We conducted a prospective single cohort study of lung cancer resections performed with a lymph node collection kit from November 2010 to January 2013. We used the kappa statistic to compare surgeon claims on a checklist of lymph node stations harvested intraoperatively with pathology reports and an independent audit of surgeons' operative summaries. Lymph node collection procedures were classified into four groups based on the anatomic origin of resected lymph nodes: mediastinal lymph node dissection, systematic sampling, random sampling, and no sampling. From the pathology reports, 73% of 160 resections had a mediastinal lymph node dissection or systematic sampling procedure, 27% had random sampling. The concordance with surgeon claims was 80% (kappa statistic 0.69, 95% confidence interval: 0.60 to 0.79). Concordance between independent audits of the operation notes and either the pathology report (kappa 0.14, 95% confidence interval: 0.04 to 0.23) or surgeon claims (kappa 0.09, 95% confidence interval: 0.03 to 0.22) was poor. A prelabeled specimen collection kit and checklist significantly narrowed the communication gap between surgeons and pathologists in identifying the extent of lymphadenectomy. Audit of surgeons' operation notes did not accurately reflect the procedure performed, bringing its value for quality improvement work into question. Copyright © 2015 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
Evaluation of Written Language.
ERIC Educational Resources Information Center
Hillerich, Robert L.
An evaluation procedure was formulated to ascertain the effectiveness of an emphasis on the clarity and interest appeal of a composition as opposed to its mechanical correctness in improving a child's written expression. A random sample of themes were submitted to a general evaluation of content by six criteria and a linguistic analysis by nine…
Drug and Alcohol Use by Canadian University Athletes: A National Survey.
ERIC Educational Resources Information Center
Spence, John C.; Gauvin, Lise
1996-01-01
Using a stratified random sampling procedure, 754 student athletes were surveyed regarding drug and alcohol use in eight different sports from eight universities across Canada. Provides statistics of substances athletes reported using, including pain medications, weight loss products, anabolic steroids, smokeless tobacco products, alcohol,…
Do Social Workers Make Better Child Welfare Workers than Non-Social Workers?
ERIC Educational Resources Information Center
Perry, Robin E.
2006-01-01
Objective: To empirically examine whether the educational background of child welfare workers in Florida impacts on performance evaluations of their work. Method: A proportionate, stratified random sample of supervisor and peer evaluations of child protective investigators and child protective service workers is conducted. ANOVA procedures are…
Conducting a wildland visual resources inventory
James F. Palmer
1979-01-01
This paper describes a procedure for systematically inventorying the visual resources of wildland environments. Visual attributes are recorded photographically using two separate sampling methods: one based on professional judgment and the other on random selection. The location and description of each inventoried scene are recorded on U.S. Geological Survey...
SCIENCE TEACHING IN THE PUBLIC JUNIOR HIGH SCHOOL.
ERIC Educational Resources Information Center
ROGERS, LOLA ERIKSEN
INFORMATION RELATED TO SCHOOL ORGANIZATION, PROCEDURES, PRACTICES, AND CONDITIONS AFFECTING SCIENCE INSTRUCTION IN THE PUBLIC JUNIOR HIGH SCHOOLS IS PRESENTED. QUESTIONNAIRES SENT TO THE PRINCIPALS OF A RANDOM SAMPLE OF SCHOOLS WHICH INCLUDED GRADES 7, 8, AND 9 WERE USED TO OBTAIN INFORMATION. CATEGORIES OF INFORMATION INCLUDED (1) ENROLLMENT AND…
Accounting for Sampling Error in Genetic Eigenvalues Using Random Matrix Theory.
Sztepanacz, Jacqueline L; Blows, Mark W
2017-07-01
The distribution of genetic variance in multivariate phenotypes is characterized by the empirical spectral distribution of the eigenvalues of the genetic covariance matrix. Empirical estimates of genetic eigenvalues from random effects linear models are known to be overdispersed by sampling error, where large eigenvalues are biased upward, and small eigenvalues are biased downward. The overdispersion of the leading eigenvalues of sample covariance matrices has been demonstrated to conform to the Tracy-Widom (TW) distribution. Here we show that genetic eigenvalues estimated using restricted maximum likelihood (REML) in a multivariate random effects model with an unconstrained genetic covariance structure will also conform to the TW distribution after empirical scaling and centering. However, where estimation procedures using either REML or MCMC impose boundary constraints, the resulting genetic eigenvalues tend not to be TW distributed. We show how using confidence intervals from sampling distributions of genetic eigenvalues without reference to the TW distribution is insufficient protection against mistaking sampling error for genetic variance, particularly when eigenvalues are small. By scaling such sampling distributions to the appropriate TW distribution, the critical value of the TW statistic can be used to determine if the magnitude of a genetic eigenvalue exceeds the sampling error for each eigenvalue in the spectral distribution of a given genetic covariance matrix. Copyright © 2017 by the Genetics Society of America.
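The overdispersion described above is easy to reproduce in a small simulation: with a true identity covariance (all eigenvalues equal to 1), the eigenvalues of a sample covariance matrix still spread well above and below 1. This generic sketch does not involve the REML or Tracy-Widom machinery of the paper.

    # Sample covariance eigenvalues are overdispersed relative to the true spectrum.
    import numpy as np

    rng = np.random.default_rng(6)
    p, n = 10, 50                          # number of traits, number of individuals
    x = rng.standard_normal((n, p))        # true covariance is the identity matrix

    s = np.cov(x, rowvar=False)
    eig = np.sort(np.linalg.eigvalsh(s))[::-1]
    print(eig.round(2))                    # largest values biased upward, smallest downward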
Boosting association rule mining in large datasets via Gibbs sampling.
Qian, Guoqi; Rao, Calyampudi Radhakrishna; Sun, Xiaoying; Wu, Yuehua
2016-05-03
Current algorithms for association rule mining from transaction data are mostly deterministic and enumerative. They can be computationally intractable even for mining a dataset containing just a few hundred transaction items, if no action is taken to constrain the search space. In this paper, we develop a Gibbs-sampling-induced stochastic search procedure to randomly sample association rules from the itemset space, and perform rule mining from the reduced transaction dataset generated by the sample. Also a general rule importance measure is proposed to direct the stochastic search so that, as a result of the randomly generated association rules constituting an ergodic Markov chain, the overall most important rules in the itemset space can be uncovered from the reduced dataset with probability 1 in the limit. In the simulation study and a real genomic data example, we show how to boost association rule mining by an integrated use of the stochastic search and the Apriori algorithm.
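A toy analogue of the idea (not the authors' algorithm or importance measure): represent an itemset as a binary inclusion vector and run a Gibbs-style sampler whose stationary distribution favours itemsets with a higher importance score, here simply the support plus a small constant.

    # Toy Gibbs-style sampler over itemsets: update each item's inclusion in turn with
    # probability proportional to the importance of the resulting itemset.
    import numpy as np
    from collections import Counter

    rng = np.random.default_rng(7)
    transactions = rng.random((500, 8)) < np.linspace(0.6, 0.1, 8)   # hypothetical data

    def importance(z):
        if not z.any():
            return 1e-6
        return np.mean(np.all(transactions[:, z], axis=1)) + 1e-6    # support + eps

    z = np.zeros(8, dtype=bool)
    visited = []
    for sweep in range(2000):
        for j in range(8):
            z_in, z_out = z.copy(), z.copy()
            z_in[j], z_out[j] = True, False
            p_in = importance(z_in)
            z[j] = rng.random() < p_in / (p_in + importance(z_out))
        visited.append(tuple(np.where(z)[0]))
    print(Counter(visited).most_common(5))   # frequently visited itemsets seed rule mining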
Direct generation of all-optical random numbers from optical pulse amplitude chaos.
Li, Pu; Wang, Yun-Cai; Wang, An-Bang; Yang, Ling-Zhen; Zhang, Ming-Jiang; Zhang, Jian-Zhong
2012-02-13
We propose and theoretically demonstrate an all-optical method for directly generating all-optical random numbers from pulse amplitude chaos produced by a mode-locked fiber ring laser. Under an appropriate pump intensity, the mode-locked laser can experience a quasi-periodic route to chaos. Such a chaos consists of a stream of pulses with a fixed repetition frequency but random intensities. In this method, we do not require a sampling procedure or externally triggered clocks, but directly quantize the chaotic pulse stream into a random number sequence via an all-optical flip-flop. Moreover, our simulation results show that the pulse amplitude chaos has no periodicity and possesses a highly symmetric distribution of amplitude. Thus, in theory, the obtained random number sequence has high-quality randomness without post-processing, as verified by industry-standard statistical tests.
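As a loose digital analogue of the quantization step only (the paper performs it all-optically with a flip-flop, and its chaos comes from a mode-locked fiber laser rather than a map), the toy sketch below thresholds a chaotic amplitude stream against its median to produce a bit sequence. Unlike the laser chaos studied in the paper, this toy stream would not pass statistical randomness test suites without further processing.

    # Toy analogue: chaotic "pulse amplitudes" from a logistic map, quantized into bits
    # by comparison with the median amplitude.
    import numpy as np

    x = 0.123
    amps = []
    for _ in range(10_000):
        x = 3.99 * x * (1.0 - x)        # logistic map in its chaotic regime
        amps.append(x)
    amps = np.array(amps)

    bits = (amps > np.median(amps)).astype(int)
    print(bits[:32], bits.mean())       # roughly balanced sequence of 0s and 1s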
Sampling design for spatially distributed hydrogeologic and environmental processes
Christakos, G.; Olea, R.A.
1992-01-01
A methodology for the design of sampling networks over space is proposed. The methodology is based on spatial random field representations of nonhomogeneous natural processes, and on optimal spatial estimation techniques. One of the most important results of random field theory for physical sciences is its rationalization of correlations in spatial variability of natural processes. This correlation is extremely important both for interpreting spatially distributed observations and for predictive performance. The extent of site sampling and the types of data to be collected will depend on the relationship of subsurface variability to predictive uncertainty. While hypothesis formulation and initial identification of spatial variability characteristics are based on scientific understanding (such as knowledge of the physics of the underlying phenomena, geological interpretations, intuition and experience), the support offered by field data is statistically modelled. This model is not limited by the geometric nature of sampling and covers a wide range in subsurface uncertainties. A factorization scheme of the sampling error variance is derived, which possesses certain attractive properties allowing significant savings in computations. By means of this scheme, a practical sampling design procedure providing suitable indices of the sampling error variance is established. These indices can be used by way of multiobjective decision criteria to obtain the best sampling strategy. Neither the actual implementation of the in-situ sampling nor the solution of the large spatial estimation systems of equations is necessary. The required values of the accuracy parameters involved in the network design are derived using reference charts (readily available for various combinations of data configurations and spatial variability parameters) and certain simple yet accurate analytical formulas. Insight is gained by applying the proposed sampling procedure to realistic examples related to sampling problems in two dimensions. © 1992.
NASA Astrophysics Data System (ADS)
Beck, L.; Wood, B.; Whitney, S.; Rossi, R.; Spanner, M.; Rodriguez, M.; Rodriguez-Ramirez, A.; Salute, J.; Legters, L.; Roberts, D.; Rejmankova, E.; Washino, R.
1993-08-01
This paper describes a procedure whereby remote sensing and geographic information system (GIS) technologies are used in a sample design to study the habitat of Anopheles albimanus, one of the principal vectors of malaria in Central America. This procedure incorporates Landsat-derived land cover maps with digital elevation and road network data to identify a random selection of larval habitats accessible for field sampling. At the conclusion of the sampling season, the larval counts will be used to determine habitat productivity, and then integrated with information on human settlement to assess where people are at high risk of malaria. This approach would be appropriate in areas where land cover information is lacking and problems of access constrain field sampling. The use of a GIS also permits other data (such as insecticide spraying data) to be incorporated into the sample design as they arise. This approach would also be pertinent for other tropical vector-borne diseases, particularly where human activities impact disease vector habitat.
1979 Reserve Force Studies Surveys: Survey Design, Sample Design and Administrative Procedures,
1981-08-01
three factors: the need for a statistically significant number of usable questionnaires from different groups within the random sample and from...Because of the multipurpose nature of these surveys and the large number of questions needed to fully address some of the topics covered, we...varies. Collection of data at the unit level is needed to accurately estimate actual reserve compensation and benefits and their possible role in both
Adaptive pre-specification in randomized trials with and without pair-matching.
Balzer, Laura B; van der Laan, Mark J; Petersen, Maya L
2016-11-10
In randomized trials, adjustment for measured covariates during the analysis can reduce variance and increase power. To avoid misleading inference, the analysis plan must be pre-specified. However, it is often unclear a priori which baseline covariates (if any) should be adjusted for in the analysis. Consider, for example, the Sustainable East Africa Research in Community Health (SEARCH) trial for HIV prevention and treatment. There are 16 matched pairs of communities and many potential adjustment variables, including region, HIV prevalence, male circumcision coverage, and measures of community-level viral load. In this paper, we propose a rigorous procedure to data-adaptively select the adjustment set, which maximizes the efficiency of the analysis. Specifically, we use cross-validation to select from a pre-specified library the candidate targeted maximum likelihood estimator (TMLE) that minimizes the estimated variance. For further gains in precision, we also propose a collaborative procedure for estimating the known exposure mechanism. Our small sample simulations demonstrate the promise of the methodology to maximize study power, while maintaining nominal confidence interval coverage. We show how our procedure can be tailored to the scientific question (intervention effect for the study sample vs. for the target population) and study design (pair-matched or not). Copyright © 2016 John Wiley & Sons, Ltd.
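A simplified stand-in for the selection idea (not the pre-specified TMLE library or the SEARCH analysis): among candidate adjustment covariates, including no adjustment, pick the one whose adjusted treatment-effect estimator has the smallest estimated variance. The published procedure does this with cross-validated, influence-curve-based variances, whereas the sketch uses plain OLS variances on simulated data.

    # Data-adaptive choice of an adjustment covariate by minimizing the estimated
    # variance of a linearly adjusted treatment-effect estimator (simulated data).
    import numpy as np

    rng = np.random.default_rng(8)
    n = 200
    w = rng.standard_normal((n, 3))                        # candidate baseline covariates
    a = rng.integers(0, 2, n).astype(float)                # randomized treatment indicator
    y = 1.0 * a + 2.0 * w[:, 0] + rng.standard_normal(n)   # covariate 0 is prognostic

    def effect_and_variance(y, a, x=None):
        cols = [np.ones_like(y), a] + ([] if x is None else [x])
        design = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(design, y, rcond=None)
        resid = y - design @ beta
        sigma2 = resid @ resid / (len(y) - design.shape[1])
        cov = np.linalg.inv(design.T @ design) * sigma2
        return beta[1], cov[1, 1]          # treatment effect and its estimated variance

    candidates = {"unadjusted": None, "w0": w[:, 0], "w1": w[:, 1], "w2": w[:, 2]}
    results = {k: effect_and_variance(y, a, x) for k, x in candidates.items()}
    best = min(results, key=lambda k: results[k][1])
    print(best, results[best])             # adjusting for the prognostic covariate wins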
İşlekdemir, Burcu; Kaya, Nurten
2016-01-01
Patients generally prefer to have their family present during medical or nursing interventions. Family presence is assumed to reduce anxiety, especially during painful interventions. This study employed a randomized controlled experimental design to determine the effects of family presence on pain and anxiety during invasive nursing procedures. The study population consisted of patients hospitalized in the observation unit of the internal medicine section in the emergency department of a university hospital. The sample comprised 138 patients assigned into the experimental and control groups by drawing lots. The invasive nursing procedure was carried out in the presence of family members, for members of the experimental group, and without family members, for members of the control group. Thus, the effects of family presence on pain and anxiety during the administration of an invasive nursing procedure to patients were analyzed. The results showed that members of the experimental and control groups did not differ with respect to the pain and state anxiety scores during the intervention. Family presence does not influence the participants' pain and anxiety during an invasive nursing procedure. Thus, the decision regarding family presence during such procedures should be based on patient preference. Copyright © 2015 Elsevier Ltd. All rights reserved.
Correcting Classifiers for Sample Selection Bias in Two-Phase Case-Control Studies
Theis, Fabian J.
2017-01-01
Epidemiological studies often utilize stratified data in which rare outcomes or exposures are artificially enriched. This design can increase precision in association tests but distorts predictions when applying classifiers on nonstratified data. Several methods correct for this so-called sample selection bias, but their performance remains unclear especially for machine learning classifiers. With an emphasis on two-phase case-control studies, we aim to assess which corrections to perform in which setting and to obtain methods suitable for machine learning techniques, especially the random forest. We propose two new resampling-based methods to resemble the original data and covariance structure: stochastic inverse-probability oversampling and parametric inverse-probability bagging. We compare all techniques for the random forest and other classifiers, both theoretically and on simulated and real data. Empirical results show that the random forest profits from only the parametric inverse-probability bagging proposed by us. For other classifiers, correction is mostly advantageous, and methods perform uniformly. We discuss consequences of inappropriate distribution assumptions and reason for different behaviors between the random forest and other classifiers. In conclusion, we provide guidance for choosing correction methods when training classifiers on biased samples. For random forests, our method outperforms state-of-the-art procedures if distribution assumptions are roughly fulfilled. We provide our implementation in the R package sambia. PMID:29312464
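A minimal sketch of the stochastic inverse-probability oversampling idea on made-up data (not the authors' sambia implementation): resample rows with weights proportional to the inverse of their selection probabilities so the training set resembles the source population, then fit a random forest.

    # Inverse-probability oversampling before training a classifier on stratified
    # (case-enriched) data. Selection probabilities and data are hypothetical.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(9)
    n_cases, n_controls = 300, 300
    x = np.vstack([rng.normal(1.0, 1.0, (n_cases, 5)),
                   rng.normal(0.0, 1.0, (n_controls, 5))])
    y = np.r_[np.ones(n_cases), np.zeros(n_controls)]

    sel_prob = np.where(y == 1, 1.0, 0.10)     # all cases sampled, 10% of controls
    weights = 1.0 / sel_prob
    weights = weights / weights.sum()

    idx = rng.choice(len(y), size=4 * len(y), replace=True, p=weights)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(x[idx], y[idx])
    print(clf.predict_proba(x[:5])[:, 1])      # predictions now reflect population balance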
Petrovskaya, Natalia B.; Forbes, Emily; Petrovskii, Sergei V.; Walters, Keith F. A.
2018-01-01
Studies addressing many ecological problems require accurate evaluation of the total population size. In this paper, we revisit a sampling procedure used for the evaluation of the abundance of an invertebrate population from assessment data collected on a spatial grid of sampling locations. We first discuss how insufficient information about the spatial population density obtained on a coarse sampling grid may affect the accuracy of an evaluation of total population size. Such information deficit in field data can arise because of inadequate spatial resolution of the population distribution (spatially variable population density) when coarse grids are used, which is especially true when a strongly heterogeneous spatial population density is sampled. We then argue that the average trap count (the quantity routinely used to quantify abundance), if obtained from a sampling grid that is too coarse, is a random variable because of the uncertainty in sampling spatial data. Finally, we show that a probabilistic approach similar to bootstrapping techniques can be an efficient tool to quantify the uncertainty in the evaluation procedure in the presence of a spatial pattern reflecting a patchy distribution of invertebrates within the sampling grid. PMID:29495513
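A short sketch of the resampling idea discussed above: bootstrap the trap counts from a coarse grid to attach an uncertainty interval to the abundance evaluation. The counts and grid size are made up, and the generic bootstrap here stands in for the probabilistic procedure developed in the paper.

    # Bootstrap uncertainty for a total-abundance evaluation based on the mean trap
    # count from a coarse grid over a patchily distributed population.
    import numpy as np

    rng = np.random.default_rng(10)
    trap_counts = np.array([0, 0, 1, 0, 14, 2, 0, 31, 0, 3, 0, 0])   # sampled locations
    n_grid_cells = 400                         # cells in the full grid covering the field

    point = trap_counts.mean() * n_grid_cells
    boot = np.array([rng.choice(trap_counts, trap_counts.size, replace=True).mean()
                     * n_grid_cells for _ in range(5000)])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(point, (lo, hi))                     # wide interval reflects the patchy density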
47 CFR 1.1602 - Designation for random selection.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...
47 CFR 1.1602 - Designation for random selection.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Designation for random selection. 1.1602 Section 1.1602 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1602 Designation for random selection...
Olsen, Geary W; Logan, Perry W; Hansen, Kristen J; Simpson, Cathy A; Burris, Jean M; Burlew, Michele M; Vorarath, Phanasouk P; Venkateswarlu, Pothapragada; Schumpert, John C; Mandel, Jeffrey H
2003-01-01
This investigation randomly sampled a fluorochemical manufacturing employee population to determine the distribution of serum fluorochemical levels according to employees' jobs and work areas. Previous analyses of medical surveillance data have not shown significant associations between fluorochemical production employees' clinical chemistry and hematology tests and their serum PFOS and perfluorooctanoate (PFOA, C(7)F(15)COO(-)) concentrations, but may have been subject to nonparticipation bias. A random sample of the on-site film plant employee population, where fluorochemicals are not produced, determined their serum concentrations also. Of the 232 employees randomly selected for serum sampling, 186 (80%) employees participated (n=126 chemical plant; n=60 film plant). Sera samples were extracted using an ion-pairing extraction procedure and were quantitatively analyzed for seven fluorochemicals using high-pressure liquid chromatography electrospray tandem mass spectrometry methods. Geometric means (in parts per million) and 95% confidence intervals (in parentheses) of the random sample of 126 chemical plant employees were: PFOS 0.941 (0.787-1.126); PFOA 0.899 (0.722-1.120); perfluorohexanesulfonate 0.180 (0.145-0.223); N-ethyl perfluorooctanesulfonamidoacetate 0.008 (0.006-0.011); N-methyl perfluorooctanesulfonamidoacetate 0.081 (0.067-0.098); perfluorooctanesulfonamide 0.013 (0.009-0.018); and perfluorooctanesulfonamidoacetate 0.022 (0.018-0.029). These geometric means were approximately one order of magnitude higher than those observed for the film plant employees.
Khan, Ajmal; Aggarwal, Ashutosh N; Agarwal, Ritesh; Bal, Amanjit; Gupta, Dheeraj
2011-01-01
Although electrocoagulation at time of endobronchial biopsy can potentially reduce procedure-related bleeding during fiberoptic bronchoscopy (FOB), it can also impair quality of tissue specimen; credible data for either are lacking. To evaluate the impact of hot biopsy on the quality of tissue samples and to quantify the amount of procedure-related bleeding during endobronchial biopsy. In this single-center, prospective, single-blind, randomized controlled study we included adult patients referred for FOB and having endobronchial lesions. Patients were randomized to bronchial biopsy using an electrocoagulation-enabled biopsy forceps, with (EC+ group) or without (EC- group) application of electrocoagulation current (40 W for 10 s in a monopolar mode). Procedure-related bleeding was semi-quantified by observer description, as well as through a visual analogue scale. Overall quality of biopsy specimen and tissue damage were assessed and graded by a pulmonary pathologist blinded to FOB details. 160 patients were randomized to endobronchial biopsy with (n = 81) or without (n = 79) the application of electrocoagulation. There were no severe bleeding episodes in either group, and severity of bleeding in the EC+ and EC- groups was similar (median visual analogue scale scores of 14 and 16, respectively). Histopathological diagnosis was similar in the EC+ and EC- groups (77.8% and 82.3%, respectively). There was no significant difference in tissue quality between the two groups. Use of electrocoagulation-enabled endobronchial biopsy does not alter specimen quality and does not result in any significant reduction in procedure-related bleeding. Copyright © 2010 S. Karger AG, Basel.
Suhonen, Riitta; Stolt, Minna; Katajisto, Jouko; Leino-Kilpi, Helena
2015-12-01
To report a review of the quality of sampling, sample, and data collection procedures in empirical nursing research on ethical climate in which nurses were informants. Surveys are needed to obtain generalisable information about topics sensitive to nursing. Methodological quality of the studies is of key concern, especially the description of sampling and data collection procedures. Methodological literature review. Using the electronic MEDLINE database, empirical nursing research articles focusing on ethical climate were accessed in 2013 (earliest to 22 November 2013). Using the search terms 'ethical' AND ('climate*' OR 'environment*') AND ('nurse*' OR 'nursing'), 376 citations were retrieved. Based on a four-phase retrieval process, 26 studies were included in the detailed analysis. The sampling method was reported in 58% of the studies, and it was random in a minority of the studies (26%). The target sample and its size were identified in most studies (92%), whereas justification for the sample size was less often given. In over two-thirds (69%) of the studies with an identifiable response rate, the rate was below 75%. A variety of data collection procedures were used, with a large amount of missing data about who distributed, recruited and collected the questionnaires. Methods to increase response rates were seldom described. Discussion of nonresponse, representativeness of the sample and generalisability of the results was missing in many studies. This review highlights the methodological challenges and developments that need to be considered in ensuring the use of valid information in developing health care through research findings. © 2015 Nordic College of Caring Science.
47 CFR 1.1603 - Conduct of random selection.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The...
47 CFR 1.1603 - Conduct of random selection.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Conduct of random selection. 1.1603 Section 1.1603 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1603 Conduct of random selection. The...
Robust reliable sampled-data control for switched systems with application to flight control
NASA Astrophysics Data System (ADS)
Sakthivel, R.; Joby, Maya; Shi, P.; Mathiyalagan, K.
2016-11-01
This paper addresses the robust reliable stabilisation problem for a class of uncertain switched systems with random delays and norm-bounded uncertainties. The main aim is to obtain a reliable robust sampled-data control design, involving a random time delay and an appropriate gain control matrix, that achieves robust exponential stabilisation of the uncertain switched system against actuator failures. In particular, the involved delays are assumed to be randomly time-varying and to obey mutually uncorrelated Bernoulli-distributed white noise sequences. By constructing an appropriate Lyapunov-Krasovskii functional (LKF) and employing an average dwell-time approach, a new set of criteria is derived for ensuring the robust exponential stability of the closed-loop switched system. More precisely, the Schur complement and Jensen's integral inequality are used in the derivation of the stabilisation criteria. By considering the relationship among the random time-varying delay and its lower and upper bounds, a new set of sufficient conditions is established for the existence of a reliable robust sampled-data control in terms of the solution to linear matrix inequalities (LMIs). Finally, an illustrative example based on the F-18 aircraft model is provided to show the effectiveness of the proposed design procedures.
Sandhu, Gurkirat; Khinda, Paramjit Kaur; Gill, Amarjit Singh; Singh Khinda, Vineet Inder; Baghi, Kamal; Chahal, Gurparkash Singh
2017-01-01
Periodontal surgical procedures produce varying degrees of stress in all patients. Nitrous oxide-oxygen inhalation sedation is very effective for adult patients with mild-to-moderate anxiety due to dental procedures and needle phobia. The present study was designed to perform periodontal surgical procedures under nitrous oxide-oxygen inhalation sedation and to assess whether this technique actually reduces stress physiologically, in comparison to local anesthesia alone (LA), during lengthy periodontal surgical procedures. This was a randomized, split-mouth, cross-over study in which a total of 16 patients were selected. One surgical session (SS) was performed under local anesthesia aided by nitrous oxide-oxygen inhalation sedation, and the other SS was performed on the contralateral quadrant under LA. For each session, blood samples to measure and evaluate serum cortisol levels were obtained, and vital parameters including blood pressure, heart rate, respiratory rate, and arterial blood oxygen saturation were monitored before, during, and after the periodontal surgical procedures. Paired t-tests and repeated-measures ANOVA were used for statistical analysis. The findings of the present study revealed a statistically significant decrease in serum cortisol levels, blood pressure and pulse rate, and a statistically significant increase in respiratory rate and arterial blood oxygen saturation during periodontal surgical procedures under nitrous oxide inhalation sedation. Nitrous oxide-oxygen inhalation sedation for periodontal surgical procedures is capable of reducing stress physiologically, in comparison to LA, during lengthy periodontal surgical procedures.
Lofwall, M.R.; Nuzzo, P.A.; Campbell, C.; Walsh, S.L.
2014-01-01
Aripiprazole is a partial agonist at dopamine D2 and serotonin 5-HT1a receptors and an antagonist at 5-HT2 receptors. Because both dopamine and serotonin systems are involved in the action of cocaine, this study aimed to determine whether aripiprazole could diminish the reinforcing efficacy of cocaine. Secondary aims evaluated aripiprazole effects on ad lib cigarette smoking and a novel 40-hour cigarette smoking abstinence procedure. Healthy adults with regular cocaine and cigarette use completed this approximately 30-day inpatient, double-blind, randomized, placebo-controlled, mixed-design study. An oral placebo lead-in period was followed by randomization to oral aripiprazole (0, 2 or 10 mg daily; n=7 completed/group). Three sets of test sessions, each consisting of three cocaine sample-choice (i.e., self-administration) sessions and one dose-response session, were conducted (during the lead-in period and after randomization before and after achieving aripiprazole steady state). Sample-choice sessions tested three cocaine doses (0, 20, and 40 mg/70 kg, i.v.) with one dose (random order) administered in each sample session; subjective, observer-rated and physiologic outcomes were collected repeatedly before and after cocaine administration. Later that day, participants chose between receiving the sample dose from that morning or descending amounts of money over seven trials ($19, 16, 13, 10, 7, 4, 1). Dose-response sessions administered the three cocaine doses in ascending order for pharmacodynamic and potential pharmacokinetic assessment. A set of two cigarette smoking topography sessions was conducted during the placebo lead-in and after randomization, one with and one without 40 hours of cigarette smoking abstinence. The number of ad lib cigarettes smoked on non-session days was also collected. Cocaine produced prototypic pharmacodynamic effects and self-administration; neither was significantly altered by aripiprazole. The 40-hour smoking abstinence procedure reliably produced nicotine withdrawal and craving and increased smoking modestly. Aripiprazole did not significantly alter smoking outcomes. These data do not support further investigation of aripiprazole for the treatment of cocaine or tobacco use disorders. PMID:24467369
Gotelli, Nicholas J.; Dorazio, Robert M.; Ellison, Aaron M.; Grossman, Gary D.
2010-01-01
Quantifying patterns of temporal trends in species assemblages is an important analytical challenge in community ecology. We describe methods of analysis that can be applied to a matrix of counts of individuals that is organized by species (rows) and time-ordered sampling periods (columns). We first developed a bootstrapping procedure to test the null hypothesis of random sampling from a stationary species abundance distribution with temporally varying sampling probabilities. This procedure can be modified to account for undetected species. We next developed a hierarchical model to estimate species-specific trends in abundance while accounting for species-specific probabilities of detection. We analysed two long-term datasets on stream fishes and grassland insects to demonstrate these methods. For both assemblages, the bootstrap test indicated that temporal trends in abundance were more heterogeneous than expected under the null model. We used the hierarchical model to estimate trends in abundance and identified sets of species in each assemblage that were steadily increasing, decreasing or remaining constant in abundance over more than a decade of standardized annual surveys. Our methods of analysis are broadly applicable to other ecological datasets, and they represent an advance over most existing procedures, which do not incorporate effects of incomplete sampling and imperfect detection.
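As an illustration of the bootstrap step described above, the following minimal Python sketch tests for heterogeneity of temporal trends in a species-by-period count matrix. It is not the authors' exact algorithm: the test statistic (the variance across species of the rank correlation between counts and time) and the toy data are assumptions, and the null model simply redraws each period's counts from the pooled relative abundances while preserving the observed per-period totals (the "temporally varying sampling probabilities").

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)

    def trend_heterogeneity(counts):
        # variance across species of the rank correlation between abundance and time
        t = np.arange(counts.shape[1])
        rhos = [spearmanr(row, t).correlation if row.sum() > 0 else 0.0 for row in counts]
        return np.nanvar(np.array(rhos, dtype=float))

    def bootstrap_null_test(counts, n_boot=999):
        # null: stationary relative abundances, per-period totals fixed at observed values
        p = counts.sum(axis=1) / counts.sum()
        totals = counts.sum(axis=0)
        observed = trend_heterogeneity(counts)
        null = [trend_heterogeneity(np.column_stack([rng.multinomial(n, p) for n in totals]))
                for _ in range(n_boot)]
        p_value = (1 + np.sum(np.array(null) >= observed)) / (1 + n_boot)
        return observed, p_value

    # toy matrix: 5 species (rows) counted in 10 annual samples (columns)
    counts = rng.poisson(np.outer([20, 10, 5, 5, 2], np.linspace(0.5, 1.5, 10)))
    print(bootstrap_null_test(counts))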
Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.
Blutke, Andreas; Wanke, Rüdiger
2018-03-06
In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, the establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time of sampling, represents a meaningful approach to taking full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimens for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimens from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.
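Two of the steps listed above, Cavalieri point-counting for organ volume and systematic uniform random selection of sampling sites, can be sketched as follows; the slab counts, slab thickness, grid area, and number of sites are invented numbers, and this sketch does not reproduce the published sampling guidelines.

    import numpy as np

    rng = np.random.default_rng(7)

    def cavalieri_volume(points_per_slab, slab_thickness_cm, grid_point_area_cm2):
        # Cavalieri estimator: V = slab thickness * area per grid point * total points hitting tissue
        return slab_thickness_cm * grid_point_area_cm2 * sum(points_per_slab)

    def systematic_uniform_random_sample(n_items, n_samples):
        # every k-th item after a uniformly random start, so every location has equal probability
        k = n_items / n_samples
        start = rng.uniform(0, k)
        return [int(start + i * k) for i in range(n_samples)]

    # hypothetical counts of grid points hitting tissue on 12 serial organ slabs
    points = [14, 18, 22, 25, 27, 26, 24, 20, 17, 12, 8, 4]
    volume = cavalieri_volume(points, slab_thickness_cm=1.0, grid_point_area_cm2=0.5)
    sites = systematic_uniform_random_sample(n_items=len(points), n_samples=4)
    print(f"estimated organ volume: {volume:.1f} cm^3; slabs selected for blocks: {sites}")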
Exclusion in Schools in Northern Ireland: The Pupils' Voice
ERIC Educational Resources Information Center
Knipe, Damian; Reynolds, Margaret; Milner, Sharon
2007-01-01
The Department of Education in Northern Ireland has been reviewing the procedures for suspending and expelling pupils from school. This article reports the views of a random sample of 114 children (11-16 years) towards the proposed changes. Pupils' thoughts on: dealing with misbehaviour; setting rules; the decision-making process; appropriate…
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 3 2014-10-01 2014-10-01 false Surveys. 416.140 Section 416.140 Public Health... Furnished Before January 1, 2008 § 416.140 Surveys. (a) Timing, purpose, and procedures. (1) No more often than once a year, CMS conducts a survey of a randomly selected sample of participating ASCs to collect...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 3 2011-10-01 2011-10-01 false Surveys. 416.140 Section 416.140 Public Health... January 1, 2008 § 416.140 Surveys. (a) Timing, purpose, and procedures. (1) No more often than once a year, CMS conducts a survey of a randomly selected sample of participating ASCs to collect data for analysis...
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 3 2013-10-01 2013-10-01 false Surveys. 416.140 Section 416.140 Public Health... Furnished Before January 1, 2008 § 416.140 Surveys. (a) Timing, purpose, and procedures. (1) No more often than once a year, CMS conducts a survey of a randomly selected sample of participating ASCs to collect...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 3 2012-10-01 2012-10-01 false Surveys. 416.140 Section 416.140 Public Health... Furnished Before January 1, 2008 § 416.140 Surveys. (a) Timing, purpose, and procedures. (1) No more often than once a year, CMS conducts a survey of a randomly selected sample of participating ASCs to collect...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 3 2010-10-01 2010-10-01 false Surveys. 416.140 Section 416.140 Public Health... January 1, 2008 § 416.140 Surveys. (a) Timing, purpose, and procedures. (1) No more often than once a year, CMS conducts a survey of a randomly selected sample of participating ASCs to collect data for analysis...
Linking Teacher Competences to Organizational Citizenship Behaviour: The Role of Empowerment
ERIC Educational Resources Information Center
Kasekende, Francis; Munene, John C.; Otengei, Samson Omuudu; Ntayi, Joseph Mpeera
2016-01-01
Purpose: The purpose of this paper is to examine relationship between teacher competences and organizational citizenship behavior (OCB) with empowerment as a mediating factor. Design/methodology/approach: The study took a cross-sectional descriptive and analytical design. Using cluster and random sampling procedures, data were obtained from 383…
ERIC Educational Resources Information Center
Gray, James R.; Blair, Thomas J.
This report investigated the characteristics of summertime recreationists in northeastern New Mexico. A description of the area was given including the physical and economic characteristics. Data were gathered through a modified random sampling procedure. A prepared questionnaire was distributed to recreationists at 13 sites in New Mexico. The…
Assessing Professional Openness to a Novel Publication Approach.
ERIC Educational Resources Information Center
DeFiore, Roberta M.; Kramer, Thomas J.
In response to criticism of the peer review publication process, a study surveyed a random sample of 600 members of the American Psychological Association (APA) to determine (1) whether professional openness exists regarding the publication of a journal highlighting studies that, due to a difficulty in the experimental procedure, would usually be…
The Effectiveness of a Phone Help Line As Indicated by Student Awareness and Use
ERIC Educational Resources Information Center
Johnson, Craig W.
1976-01-01
Randomly sampled students (N=287) indicated the Help Line at the University of Nebraska significantly helped with their personal and informational needs. Perceptions of Help Line's purposes fell into two groups: informational versus personal. The two groups had radically different call rates. Advertising procedures are also discussed. (Author)
USDA-ARS?s Scientific Manuscript database
To identify the factors associated with anemia and to document the severity of micronutrient deficiencies, malaria and inflammation, a nationally representative cross-sectional survey was conducted. A three-stage sampling procedure was used to randomly select children <5 years of age and adult women...
Effective Recruitment of Schools for Randomized Clinical Trials: Role of School Nurses.
Petosa, R L; Smith, L
2017-01-01
In school settings, nurses lead efforts to improve student health and well-being to support academic success. Nurses are guided by evidence-based practice and data to inform care decisions. The randomized controlled trial (RCT) is considered the gold standard of scientific rigor for clinical trials. RCTs are critical to the development of evidence-based health promotion programs in schools. The purpose of this article is to present practical solutions for implementing principles of randomization in RCTs conducted in school settings. Randomization is a powerful sampling method used to build internal and external validity. The school's daily organization and educational mission present several barriers to randomization. Based on their experience in conducting school-based RCTs, the authors offer a host of practical solutions for working with schools to successfully implement randomization procedures. Nurses play a critical role in implementing RCTs in schools to promote rigorous science in support of evidence-based practice.
Westfall, Jacob; Kenny, David A; Judd, Charles M
2014-10-01
Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
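The plateau in power described above can be demonstrated with a brute-force simulation rather than the authors' analytic formulas. In this sketch the variance components, effect size, and the simple analysis (a two-sample t-test on per-stimulus means, with stimuli assigned between conditions) are all assumptions chosen only to show that once stimulus variability dominates, adding participants no longer raises power toward one.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def simulated_power(n_participants, n_stimuli=20, d=0.4,
                        var_participant=0.2, var_stimulus=0.2, var_error=0.6,
                        n_sims=1000, alpha=0.05):
        # participants crossed with stimuli; stimuli split between two conditions
        half = n_stimuli // 2
        condition_effect = np.repeat([0.0, d], half)
        hits = 0
        for _ in range(n_sims):
            s = rng.normal(0, np.sqrt(var_stimulus), n_stimuli) + condition_effect
            p = rng.normal(0, np.sqrt(var_participant), (n_participants, 1))
            e = rng.normal(0, np.sqrt(var_error), (n_participants, n_stimuli))
            stim_means = (s + p + e).mean(axis=0)          # average over participants
            test = stats.ttest_ind(stim_means[half:], stim_means[:half])
            hits += test.pvalue < alpha
        return hits / n_sims

    for n in (10, 50, 500):        # power levels off despite a 50-fold increase in participants
        print(n, simulated_power(n))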
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wild, M.; Rouhani, S.
1995-02-01
A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs and long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
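As a concrete illustration of the spatial-structure idea (nearby samples being more alike than distant ones), the sketch below computes a classical Matheron empirical semivariogram from synthetic measurements; the estimator is textbook geostatistics rather than anything specific to the cited work, and the coordinates and values are invented.

    import numpy as np

    rng = np.random.default_rng(3)

    def empirical_semivariogram(coords, values, n_bins=10):
        # classical Matheron estimator: gamma(h) = mean squared difference / 2, binned by distance h
        dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
        sq_diff = (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)
        dists, sq_diff = dists[iu], sq_diff[iu]
        edges = np.linspace(0.0, dists.max(), n_bins + 1)
        centers, gamma = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            mask = (dists >= lo) & (dists < hi)
            if mask.any():
                centers.append((lo + hi) / 2.0)
                gamma.append(sq_diff[mask].mean() / 2.0)
        return np.array(centers), np.array(gamma)

    # synthetic contaminant measurements at 80 random locations on a 100 m x 100 m site
    coords = rng.uniform(0, 100, size=(80, 2))
    values = np.sin(coords[:, 0] / 25.0) + rng.normal(0, 0.2, 80)   # spatial trend plus noise
    lags, gamma = empirical_semivariogram(coords, values)
    print(np.round(lags, 1), np.round(gamma, 3))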
Sample size determination for GEE analyses of stepped wedge cluster randomized trials.
Li, Fan; Turner, Elizabeth L; Preisser, John S
2018-06-19
In stepped wedge cluster randomized trials, intact clusters of individuals switch from control to intervention from a randomly assigned period onwards. Such trials are becoming increasingly popular in health services research. When a closed cohort is recruited from each cluster for longitudinal follow-up, proper sample size calculation should account for three distinct types of intraclass correlations: the within-period, the inter-period, and the within-individual correlations. Setting the latter two correlation parameters to be equal accommodates cross-sectional designs. We propose sample size procedures for continuous and binary responses within the framework of generalized estimating equations that employ a block exchangeable within-cluster correlation structure defined from the distinct correlation types. For continuous responses, we show that the intraclass correlations affect power only through two eigenvalues of the correlation matrix. We demonstrate that analytical power agrees well with simulated power for as few as eight clusters, when data are analyzed using bias-corrected estimating equations for the correlation parameters concurrently with a bias-corrected sandwich variance estimator. © 2018, The International Biometric Society.
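A small sketch of the block-exchangeable correlation structure named above, under one assumed parameterization: a closed cohort of m individuals followed over T periods, with within-period correlation a0, inter-period correlation a1, and within-individual correlation a2. The paper's notation and power formulas are not reproduced; the code only assembles the matrix and lists its distinct eigenvalues, the quantities through which, for continuous responses, the correlations are said to affect power.

    import numpy as np

    def block_exchangeable(m, T, a0, a1, a2):
        # rows/columns ordered by period, then individual within period (closed cohort)
        I_m, J_m = np.eye(m), np.ones((m, m))
        I_T, J_T = np.eye(T), np.ones((T, T))
        same_period = np.kron(I_T, (1 - a0) * I_m + a0 * J_m)          # diagonal period blocks
        diff_period = np.kron(J_T - I_T, (a2 - a1) * I_m + a1 * J_m)   # off-diagonal period blocks
        return same_period + diff_period

    R = block_exchangeable(m=10, T=4, a0=0.05, a1=0.02, a2=0.40)
    distinct_eigenvalues = np.unique(np.round(np.linalg.eigvalsh(R), 6))
    print(distinct_eigenvalues)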
Ibrahim, Irwani; Yau, Ying Wei; Ong, Lizhen; Chan, Yiong Huak; Kuan, Win Sen
2015-03-01
Arterial punctures are important procedures performed by emergency physicians in the assessment of ill patients. However, arterial punctures are painful and can create anxiety and needle phobia in patients. The pain scores of radial arterial punctures were compared between an insulin needle and a standard 23-gauge hypodermic needle. In a randomized controlled crossover design, healthy volunteers were recruited to undergo bilateral radial arterial punctures. They were assigned to receive either the insulin or the standard needle as the first puncture, using blocked randomization. The primary outcome was the pain score measured on a 100-mm visual analogue scale (VAS) for pain, and secondary outcomes were the rate of hemolysis, mean potassium values, and procedural complications immediately and 24 hours postprocedure. Fifty healthy volunteers were included in the study. The mean (±standard deviation) VAS score for punctures with the insulin needle was lower than for the standard needle (23 ± 22 mm vs. 39 ± 24 mm; mean difference = -15 mm; 95% confidence interval = -22 mm to -7 mm; p < 0.001). The rates of hemolysis and mean potassium value were greater in samples obtained using the insulin needle compared to the standard needle (31.3% vs. 11.6%, p = 0.035; and 4.6 ± 0.7 mmol/L vs. 4.2 ± 0.5 mmol/L, p = 0.002). Procedural complications were lower in punctures with the insulin needle both immediately postprocedure (0% vs. 24%; p < 0.001) and at 24 hours postprocedure (5.4% vs. 34.2%; p = 0.007). Arterial punctures using insulin needles cause less pain and fewer procedural complications compared to standard needles. However, due to the higher rate of hemolysis, their use should be limited to conditions that do not require a concurrent potassium value in the same blood sample. © 2015 by the Society for Academic Emergency Medicine.
Kettis-Lindblad, Asa; Ring, Lena; Viberth, Eva; Hansson, Mats G
2007-01-01
To assess the Swedish public's preferences for information and consent procedures when being asked for permission to use previously collected tissue samples for new research studies. Cross-sectional study employing postal questionnaires to a random sample of the Swedish general public (n = 6,000) in October 2002-February 2003. The response rate was 49% (n = 2,928). This paper includes only respondents who reportedly would approve of samples being taken and stored (n = 2,122). When potential tissue sample donors in the general public have to strike a balance between the values at stake, i.e. the autonomy of the donor versus the research value, most (72%) prefer general consent, i.e. where consent is asked for at the outset only. They want the research ethics committee (REC) alone to decide on the use of stored samples, and they would allow storage as long as the sample is useful for research. The minority of respondents who were in favour of specific consent were more likely to be young, well educated, have negative experiences of healthcare and low trust in healthcare authorities. The majority of the Swedish general public prefer general consent, and are thus willing to delegate some decisions to the RECs. However, preferences for information and consent procedures depend on the context, e.g. the risks for the donor and the purpose of the research. If feasible, procedures should be differentiated according to the preferences of individual donors, thus protecting the interests of both the minority and the majority.
Molander, Anders; Warfvinge, Johan; Reit, Claes; Kvist, Thomas
2007-10-01
The present investigation recorded the 2-year clinical and radiographic outcome of one- and two-visit endodontic treatment and studied the significance of the bacteriologic sampling results for the outcome. A randomization procedure allocated 53 teeth to one-visit treatment and 48 teeth to two-visit treatment. At the end of the study period, 32 teeth (65%) in the one-visit group and 30 teeth (75%) in the two-visit group were classified as healed. The statistical analysis of the healing results did not show any significant difference between the groups (p = 0.75). Forty-nine (80%) of the 61 teeth that were obturated after a negative microbiologic sample were classified as healed, whereas teeth sealed after positive samples healed in 44% of cases. The present study gave evidence that similar healing results might be obtained through one- and two-visit antimicrobial treatment.
How Does Active Parental Consent Influence the Findings of Drug-Use Surveys in Schools?
ERIC Educational Resources Information Center
White, Victoria M.; Hill, David J.; Effendi, Yuksel
2004-01-01
This study examines the impact of passive and active parental consent procedures on the type of adolescents participating in a school-based survey examining substance use. Schools recruited from a random sample of metropolitan schools were assigned to passive or active parental consent condition. Results showed that participation rates in active…
Texas School Survey of Substance Abuse: Grades 7-12. 1992.
ERIC Educational Resources Information Center
Liu, Liang Y.; Fredlund, Eric V.
The 1992 Texas School Survey results for secondary students are based on data collected from a sample of 73,073 students in grades 7 through 12. Students were randomly selected from school districts throughout the state using a multi-stage probability design. The procedure ensured that students living in metropolitan and rural areas of Texas are…
Brutal Borders? Examining the Treatment of Deportees during Arrest and Detention
ERIC Educational Resources Information Center
Phillips, Scott; Hagan, Jacqueline Maria; Rodriguez, Nestor
2006-01-01
Recent legislation has produced a dramatic rise in the detention and removal of immigrants from the United States. Drawing on interviews with a random sample of Salvadoran deportees, we examine treatment during arrest and detention. Our findings indicate: (1) deportees are often subject to verbal harassment, procedural failings and use of force;…
A Procedure to Detect Item Bias Present Simultaneously in Several Items
1991-04-25
exhibit a coherent and major biasing influence at the test level. In particular, this can be true even if each individual item displays only a minor...response functions (IRFs) without the use of item parameter estimation algorithms when the sample size is too small for their use. Thissen, Steinberg...convention). A random sample of examinees is drawn from each group, and a test of N items is administered to them. Typically it is suspected that a
Sandhu, Gurkirat; Khinda, Paramjit Kaur; Gill, Amarjit Singh; Singh Khinda, Vineet Inder; Baghi, Kamal; Chahal, Gurparkash Singh
2017-01-01
Context: Periodontal surgical procedures produce varying degree of stress in all patients. Nitrous oxide-oxygen inhalation sedation is very effective for adult patients with mild-to-moderate anxiety due to dental procedures and needle phobia. Aim: The present study was designed to perform periodontal surgical procedures under nitrous oxide-oxygen inhalation sedation and assess whether this technique actually reduces stress physiologically, in comparison to local anesthesia alone (LA) during lengthy periodontal surgical procedures. Settings and Design: This was a randomized, split-mouth, cross-over study. Materials and Methods: A total of 16 patients were selected for this randomized, split-mouth, cross-over study. One surgical session (SS) was performed under local anesthesia aided by nitrous oxide-oxygen inhalation sedation, and the other SS was performed on the contralateral quadrant under LA. For each session, blood samples to measure and evaluate serum cortisol levels were obtained, and vital parameters including blood pressure, heart rate, respiratory rate, and arterial blood oxygen saturation were monitored before, during, and after periodontal surgical procedures. Statistical Analysis Used: Paired t-test and repeated measure ANOVA. Results: The findings of the present study revealed a statistically significant decrease in serum cortisol levels, blood pressure and pulse rate and a statistically significant increase in respiratory rate and arterial blood oxygen saturation during periodontal surgical procedures under nitrous oxide inhalation sedation. Conclusion: Nitrous oxide-oxygen inhalation sedation for periodontal surgical procedures is capable of reducing stress physiologically, in comparison to LA during lengthy periodontal surgical procedures. PMID:29386796
How to Do Random Allocation (Randomization)
Shin, Wonshik
2014-01-01
Purpose To explain the concept and procedure of random allocation as used in a randomized controlled study. Methods We explain the general concept of random allocation and demonstrate how to perform the procedure easily and how to report it in a paper. PMID:24605197
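A minimal sketch of one standard way to carry out such an allocation, permuted-block randomization; the block size, arm labels, and seed below are arbitrary illustrative choices and are not taken from the article.

    import random

    def permuted_block_randomization(n_subjects, block_size=4, groups=("A", "B"), seed=42):
        # allocate subjects in randomly permuted blocks so the arms stay balanced during recruitment
        rng = random.Random(seed)
        per_group = block_size // len(groups)
        allocation = []
        while len(allocation) < n_subjects:
            block = list(groups) * per_group
            rng.shuffle(block)
            allocation.extend(block)
        return allocation[:n_subjects]

    print(permuted_block_randomization(10))   # a balanced sequence such as ['B', 'A', 'A', 'B', ...]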
Caprilli, Simona; Anastasi, Francesca; Grotto, Rosa Pia Lauro; Scollo Abeti, Marianna; Messeri, Andrea
2007-10-01
The experience of venipuncture is seen by children as one of the most fearful experiences during hospitalization. Children experience anxiety both before and during the procedure. Therefore, any intervention aiming to prevent or reduce distress should focus on the entire experience of the procedure, including waiting, actual preparation, and conclusion. This study was designed to determine whether the presence of musicians, who had attended specific training to work in medical settings, could reduce distress and pain in children undergoing blood tests. Our sample population was composed of 108 unpremedicated children (4-13 years of age) undergoing blood tests. They were randomly assigned to a music group (n=54), in which the child underwent the procedure while interacting with the musicians in the presence of a parent, or to a control group (n=54), in which only the parent provided support to the child during the procedure. The distress experienced by the child before, during and after the blood test was assessed with the Amended Form of the Observation Scale of Behavioral Distress, and pain was assessed with the FACES scale (Wong-Baker scale) only after the venipuncture. Our results show that distress and pain intensity were significantly lower (p<.001; p<.05) in the music group compared with the control group before, during, and after blood sampling. This controlled study demonstrates that songs and music, performed by "professional" musicians, have a beneficial effect in reducing distress before, during, and after blood tests. This study shows, moreover, that the presence of musicians has a minor, but still significant, effect on pain due to needle insertion.
Organic cattle products: Authenticating production origin by analysis of serum mineral content.
Rodríguez-Bermúdez, Ruth; Herrero-Latorre, Carlos; López-Alonso, Marta; Losada, David E; Iglesias, Roberto; Miranda, Marta
2018-10-30
An authentication procedure for differentiating between organic and non-organic cattle production on the basis of analysis of serum samples has been developed. For this purpose, the concentrations of fourteen mineral elements (As, Cd, Co, Cr, Cu, Fe, Hg, I, Mn, Mo, Ni, Pb, Se and Zn) in 522 serum samples from cows (341 from organic farms and 181 from non-organic farms), determined by inductively coupled plasma spectrometry, were used. The chemical information provided by serum analysis was employed to construct different pattern recognition classification models that predict the origin of each sample: organic or non-organic class. Among all classification procedures considered, the best results were obtained with the decision tree C5.0, Random Forest and AdaBoost neural networks, with hit levels close to 90% for both production types. The proposed method, involving analysis of serum samples, provided rapid, accurate in vivo classification of cattle according to organic and non-organic production type. Copyright © 2018 Elsevier Ltd. All rights reserved.
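The Random Forest classification step can be sketched with scikit-learn as follows; the feature matrix here is a random placeholder with the study's dimensions (522 serum samples, 14 elements), not the actual measurements, so the printed cross-validated accuracy is only a template for the roughly 90% hit level reported above.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    elements = ["As", "Cd", "Co", "Cr", "Cu", "Fe", "Hg", "I", "Mn", "Mo", "Ni", "Pb", "Se", "Zn"]

    # placeholder serum-mineral feature matrix: 522 samples x 14 element concentrations
    X = rng.lognormal(mean=0.0, sigma=1.0, size=(522, len(elements)))
    y = np.array([1] * 341 + [0] * 181)                  # 1 = organic farm, 0 = non-organic farm

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())

    clf.fit(X, y)
    ranking = sorted(zip(elements, clf.feature_importances_), key=lambda t: -t[1])
    print("most discriminating elements:", ranking[:3])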
Autoshaping, random control, and omission training in the rat
Locurto, Charles; Terrace, H. S.; Gibbon, John
1976-01-01
The role of the stimulus-reinforcer contingency in the development and maintenance of lever contact responding was studied in hooded rats. In Experiment I, three groups of experimentally naive rats were trained either on autoshaping, omission training, or a random-control procedure. Subjects trained by the autoshaping procedure responded more consistently than did either random-control or omission-trained subjects. The probability of at least one lever contact per trial was slightly higher in subjects trained by the omission procedure than by the random-control procedure. However, these differences were not maintained during extended training, nor were they evident in total lever-contact frequencies. When omission and random-control subjects were switched to the autoshaping condition, lever contacts increased in all animals, but a pronounced retardation was observed in omission subjects relative to the random-control subjects. In addition, subjects originally exposed to the random-control procedure, and later switched to autoshaping, acquired more rapidly than naive subjects that were exposed only on the autoshaping procedure. In Experiment II, subjects originally trained by an autoshaping procedure were exposed either to an omission, a random-control, or an extinction procedure. No differences were observed among the groups either in the rate at which lever contacts decreased or in the frequency of lever contacts at the end of training. These data implicate prior experience in the interpretation of omission-training effects and suggest limitations in the influence of stimulus-reinforcer relations in autoshaping. PMID:16811960
Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleijnen, J.P.C.; Helton, J.C.
1999-04-01
The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
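The five increasingly complex tests listed above can be sketched for a single input-output scatterplot as follows; partitioning the input into five quantile classes is an arbitrary choice, Levene's test stands in for the variance/interquartile-range comparison, and the data are synthetic.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    x = rng.uniform(0, 1, 300)                            # sampled input variable
    y = np.sin(3 * x) + rng.normal(0, 0.3, 300)           # model output with a nonlinear trend

    edges = np.quantile(x, np.linspace(0, 1, 6))          # five classes of x for the class-based tests
    labels = np.digitize(x, edges[1:-1])
    groups = [y[labels == k] for k in range(5)]

    print("(1) linear (Pearson r):      ", stats.pearsonr(x, y))
    print("(2) monotonic (Spearman rho):", stats.spearmanr(x, y))
    print("(3) central tendency (K-W):  ", stats.kruskal(*groups))
    print("(4) variability (Levene):    ", stats.levene(*groups))

    # (5) deviation from randomness: chi-square on x-class by y-above/below-median counts
    table = np.array([[np.sum(g > np.median(y)), np.sum(g <= np.median(y))] for g in groups])
    chi2, p_value, _, _ = stats.chi2_contingency(table)
    print("(5) no-pattern chi-square:   ", chi2, p_value)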
Nonlinear probabilistic finite element models of laminated composite shells
NASA Technical Reports Server (NTRS)
Engelstad, S. P.; Reddy, J. N.
1993-01-01
A probabilistic finite element analysis procedure for laminated composite shells has been developed. A total Lagrangian finite element formulation, employing a degenerated 3-D laminated composite shell with the full Green-Lagrange strains and first-order shear deformable kinematics, forms the modeling foundation. The first-order second-moment technique for probabilistic finite element analysis of random fields is employed and results are presented in the form of mean and variance of the structural response. The effects of material nonlinearity are included through the use of a rate-independent anisotropic plasticity formulation with the macroscopic point of view. Both ply-level and micromechanics-level random variables can be selected, the latter by means of the Aboudi micromechanics model. A number of sample problems are solved to verify the accuracy of the procedures developed and to quantify the variability of certain material type/structure combinations. Experimental data is compared in many cases, and the Monte Carlo simulation method is used to check the probabilistic results. In general, the procedure is quite effective in modeling the mean and variance response of the linear and nonlinear behavior of laminated composite shells.
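The first-order second-moment idea can be illustrated generically: the mean response is approximated by evaluating the response function at the mean inputs, and the variance by propagating the input covariance through the gradient. The toy response function, means, and variances below are placeholders and not the laminated-shell finite element formulation; a Monte Carlo check is included, echoing the verification approach mentioned above.

    import numpy as np

    def fosm(g, mu, cov, h=1e-6):
        # first-order second-moment: mean ~ g(mu), variance ~ grad(g)^T Cov grad(g)
        mu = np.asarray(mu, dtype=float)
        grad = np.zeros_like(mu)
        for i in range(len(mu)):
            step = np.zeros_like(mu)
            step[i] = h
            grad[i] = (g(mu + step) - g(mu - step)) / (2 * h)
        return g(mu), grad @ cov @ grad

    # toy "structural response" depending on two assumed random variables (e.g., modulus, thickness)
    g = lambda v: v[0] * v[1] ** 2 / 10.0
    mu = [200.0, 1.5]
    cov = np.diag([15.0 ** 2, 0.05 ** 2])                 # independent inputs with assumed variances

    mean_fosm, var_fosm = fosm(g, mu, cov)

    rng = np.random.default_rng(2)                        # Monte Carlo check of the linearized estimate
    samples = rng.multivariate_normal(mu, cov, 20000)
    mc = np.array([g(s) for s in samples])
    print(mean_fosm, var_fosm, mc.mean(), mc.var())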
The appropriateness of use of percutaneous transluminal coronary angioplasty in Spain.
Aguilar, M D; Fitch, K; Lázaro, P; Bernstein, S J
2001-05-01
The rapid increase in the number of percutaneous transluminal coronary angioplasty (PTCA) procedures performed in Spain in recent years raises questions about how appropriately this procedure is being used. To examine this issue, we studied the appropriateness of use of PTCA in Spanish patients and factors associated with inappropriate use. We applied criteria for the appropriate use of PTCA developed by an expert panel of Spanish cardiologists and cardiovascular surgeons to a random sample of 1913 patients undergoing PTCA in Spain in 1997. The patients were selected through a two-step sampling process, stratifying by hospital type (public/private) and volume of procedures (low/medium/high). We examined the association between inappropriate use of PTCA and different clinical and sociodemographic factors. Overall, 46% of the PTCA procedures were appropriate, 31% were uncertain and 22% were inappropriate. Two factors contributing to inappropriate use were patients' receipt of less than optimal medical therapy and their failure to undergo stress testing. Institutional type and volume of procedures were not significantly related with inappropriate use. One of every five PTCA procedures in Spain is done for inappropriate reasons. Assuring that patients receive optimal medical therapy and undergo stress testing when indicated could contribute to more appropriate use of PTCA.
Occupational Commonalities: A Base for Course Construction. Paper No. 2219, Journal Series.
ERIC Educational Resources Information Center
Dillon, Roy D.; Horner, James T.
To determine competencies and activities used by workers in a cross section of the statewide labor force, data were obtained from a random sample of 1,500 employed persons drawn from 14 purposively selected index counties in Nebraska. An interview-questionnaire procedure yielded an 87.7 percent response to a checklist of 144 activities, duties,…
ERIC Educational Resources Information Center
Kassis, Wassilis; Artz, Sibylle; Moldenhauer, Stephanie
2013-01-01
Questionnaire data from a cross-sectional study of a randomly selected sample of 5,149 middle-school students from four EU countries (Austria, Germany, Slovenia, and Spain) were used to explore the effects of family violence burden level, structural and procedural risk and protective factors, and personal characteristics on adolescents who are…
Effects of Training Method and Gender on Learning 2D/3D Geometry
ERIC Educational Resources Information Center
Khairulanuar, Samsudin; Nazre, Abd Rashid; Jamilah, H.; Sairabanu, Omar Khan; Norasikin, Fabil
2010-01-01
This article reports the findings of an experimental study involving 36 primary school students (16 girls, 20 boys, Mean age = 9.5 years, age range: 8-10 years) in geometrical understanding of 2D and 3D objects. Students were assigned into two experimental groups and one control group based on a stratified random sampling procedure. The first…
ERIC Educational Resources Information Center
Perl, Joseph; And Others
A treatment program was instituted to reduce dating anxiety in infrequent daters. A sample of 85 extremely anxious and inhibited college undergraduates of both sexes volunteered for the program. Subjects were randomly assigned to one of three practice dating groups and two control groups. In one practice dating group, each non-dater was paired…
'Pygmy' old-growth redwood characteristics on an edaphic ecotone in Mendocino County, California
Will Russell; Suzie. Woolhouse
2012-01-01
The 'pygmy forest' is a specialized community that is adapted to highly acidic, hydrophobic, nutrient deprived soils, and exists in pockets within the coast redwood forest in Mendocino County. While coast redwood is known as an exceptionally tall tree, stunted trees exhibit unusual growth-forms on pygmy soils. We used a stratified random sampling procedure to...
Abercrombie, M L; Jewell, J S
1986-01-01
Results of EMIT, Abuscreen RIA, and GC/MS tests for THC metabolites in a high volume random urinalysis program are compared. Samples were field tested by non-laboratory personnel with an EMIT system using a 100 ng/mL cutoff. Samples were then sent to the Army Forensic Toxicology Drug Testing Laboratory (WRAMC) at Fort Meade, Maryland, where they were tested by RIA (Abuscreen) using a statistical 100 ng/mL cutoff. Confirmations of all RIA positives were accomplished using a GC/MS procedure. EMIT and RIA results agreed for 91% of samples. Data indicated a 4% false positive rate and a 10% false negative rate for EMIT field testing. In a related study, results for samples which tested positive by RIA for THC metabolites using a statistical 100 ng/mL cutoff were compared with results by GC/MS utilizing a 20 ng/mL cutoff for the THCA metabolite. Presence of THCA metabolite was detected in 99.7% of RIA positive samples. No relationship between quantitations determined by the two tests was found.
Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff
NASA Astrophysics Data System (ADS)
Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.
2016-03-01
Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
Usami, Satoshi
2017-03-01
Behavioral and psychological researchers have shown strong interest in investigating contextual effects (i.e., the influences of combinations of individual- and group-level predictors on individual-level outcomes). The present research provides generalized formulas for determining the sample size needed to investigate contextual effects according to the desired level of statistical power as well as the width of the confidence interval. These formulas are derived within a three-level random intercept model that includes one predictor/contextual variable at each level, so as to simultaneously cover the various kinds of contextual effects in which researchers may be interested. The relative influences of the indices included in the formulas on the standard errors of contextual effect estimates are investigated, with the aim of further simplifying sample size determination procedures. In addition, simulation studies are performed to investigate the finite-sample behavior of the calculated statistical power, showing that estimated sample sizes based on the derived formulas can be both positively and negatively biased due to complex effects of unreliability of contextual variables, multicollinearity, and violation of the assumption of known variances. Thus, it is advisable to compare estimated sample sizes under various specifications of the indices and to evaluate their potential bias, as illustrated in the example.
Pavlovian autoshaping procedures increase plasma corticosterone levels in rats.
Tomie, Arthur; Silberman, Yuval; Williams, Kayon; Pohorecky, Larissa A
2002-06-01
Pavlovian autoshaping conditioned responses (CRs) are complex sequences of conditioned stimulus (CS)-directed skeletal-motor responses that are elicited by CS objects predictive of food unconditioned stimulus (US). Autoshaping CRs are observed under conditions known to be conducive to elevations in plasma corticosterone levels, as, for example, in response to the eating of food as well as in response to signals predictive of food. Two experiments investigated the relationships between Pavlovian autoshaping procedures, the performance of Pavlovian autoshaping CRs, and plasma corticosterone levels in male Long-Evans rats. In Experiment 1, rats in the CS-US paired group (n=30) were given 20 daily sessions of Pavlovian autoshaping training wherein the insertion of a retractable lever CS was followed by the response-independent presentation of the food US. Tail blood samples obtained after the 20th autoshaping session revealed higher plasma corticosterone levels in the CS-US paired group than in the CS-US random control group (n=10). In Experiment 2, rats (n=35) were assessed for basal plasma corticosterone levels 2 weeks prior to autoshaping training. Plasma samples obtained immediately following the first autoshaping session, and prior to the acquisition of lever-press autoshaping CR performance, revealed higher plasma corticosterone levels in the CS-US paired group (n=24) relative to basal levels. This effect was not observed in the CS-US random control group (n=11). Data suggest that corticosterone release is a physiological endocrine Pavlovian CR induced by lever CS-food US pairings during Pavlovian autoshaping procedures, rather than a by-product of autoshaping CR performance. Implications of the link between autoshaping procedures and corticosterone release are discussed.
Doulas for surgical management of miscarriage and abortion: a randomized controlled trial.
Wilson, Susan F; Gurney, Elizabeth P; Sammel, Mary D; Schreiber, Courtney A
2017-01-01
Women undergoing office-based surgical management of a failed or undesired pregnancy often report fear of pain and anxiety pertaining to the procedure. Doulas are trained to specifically address women's physical and emotional needs in obstetric care, and recently have extended their practice to support women through all pregnancy outcomes. We sought to evaluate the impact of doulas on patients' physical and emotional responses to surgical management of a first-trimester failed or undesired pregnancy under local anesthesia. In this nonblinded, randomized trial, women received doula support or routine care during office uterine aspiration for failed or unwanted pregnancies in the first trimester. The primary outcome was pain measured on a 100-mm visual analog scale. Secondary outcomes included satisfaction, emotional state, sense of personal empowerment, and ability to cope immediately and 1 month after the procedure, as well as medical assistants' assessment of the doula's utility. A sample size of 35 per group (N = 70) was planned to detect a 20% difference in pain score. From April 2014 through January 2015, 129 women were screened and 70 were randomized. The 2 study groups were similar on all baseline characteristics. The primary outcome was not different between the doula and control groups (pain score 70.7 ± 24.5 mm vs 59.7 ± 32.5 mm, P = .11, respectively), even after controlling for procedure indication (P = .20). While 97% of women who received doula support reported this helped with their experience, there was no statistically significant difference in satisfaction, emotional response, sense of empowerment, or perceived ability to cope between the 2 groups of women immediately following or 1 month after the procedure. Of all study participants, 72% reported that it was important to have someone with them during the procedure, but that the support person did not have to be a doula. Doula support during office uterine aspiration for failed or undesired pregnancies is well received and desired by women undergoing this procedure despite no significant effect on physical comfort or emotional responses related to the procedure. This may suggest an unmet psychosocial need for procedure-related support among such women. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Ng, Daniel; Supaporn, Potibut
A study investigated the trend of current U.S. television commercial informativeness by comparing the results with Alan Resnik and Bruce Stern's previous benchmark study conducted in 1977. A systematic random sampling procedure was used to select viewing dates and times of commercials from the three national networks. Ultimately, a total of 550…
ERIC Educational Resources Information Center
Kwaah, Christopher Yaw; Essilfie, Gabriel
2017-01-01
This study was designed to identify the causes of stress and coping strategies adopted among distance education students at the College of Distance Education in the University of Cape Coast. A total of 332 diploma and post-diploma final year students in 2014/2015 academic year were selected from two study centers using random sampling procedure to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Novak, Erik; Trolinger, James D.; Lacey, Ian
This work reports on the development of a binary pseudo-random test sample optimized to calibrate the MTF of optical microscopes. The sample consists of a number of 1-D and 2-D patterns, with different minimum sizes of spatial artifacts from 300 nm to 2 microns. We describe the mathematical background, fabrication process, and data acquisition and analysis procedure used to return a spatial-frequency-based instrument calibration. We show that the developed samples satisfy the characteristics of a test standard: functionality, ease of specification and fabrication, reproducibility, and low sensitivity to manufacturing error. © 2015 Society of Photo-Optical Instrumentation Engineers (SPIE).
A smart Monte Carlo procedure for production costing and uncertainty analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, C.; Stremel, J.
1996-11-01
Electric utilities using chronological production costing models to decide whether to buy or sell power over the next week or next few weeks need to determine potential profits or losses under a number of uncertainties. A large amount of money can be at stake, often $100,000 a day or more, and one party to the sale must always take on the risk. In the case of fixed price ($/MWh) contracts, the seller accepts the risk. In the case of cost-plus contracts, the buyer must accept the risk. So, modeling uncertainty and understanding the risk accurately can improve the competitive edge of the user. This paper investigates an efficient procedure for representing risks and costs from capacity outages. Typically, production costing models use an algorithm based on some form of random number generator to select resources as available or on outage. These algorithms allow experiments to be repeated and gains and losses to be observed in a short time. The authors perform several experiments to examine the capability of three unit outage selection methods and measure their results. Specifically, a brute force Monte Carlo procedure, a Monte Carlo procedure with Latin Hypercube sampling, and a Smart Monte Carlo procedure with cost stratification and directed sampling are examined.
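The contrast between brute-force Monte Carlo and stratified sampling of unit outages can be sketched as follows; the unit capacities and forced outage rates are invented, and scipy's Latin hypercube generator stands in for whichever stratification and directed-sampling scheme the production-costing tool actually uses.

    import numpy as np
    from scipy.stats import qmc

    rng = np.random.default_rng(11)

    # hypothetical generating units: capacity (MW) and forced outage rate
    capacity = np.array([400, 350, 300, 250, 200, 150, 100, 80, 60, 50], dtype=float)
    outage_rate = np.array([0.08, 0.06, 0.05, 0.07, 0.04, 0.05, 0.03, 0.02, 0.02, 0.01])

    def available_capacity(u):
        # a unit is on outage when its uniform draw falls below its forced outage rate
        return np.where(u < outage_rate, 0.0, capacity).sum(axis=1)

    n, load = 500, 1600.0
    u_mc = rng.uniform(size=(n, len(capacity)))                       # brute-force Monte Carlo draws
    u_lhs = qmc.LatinHypercube(d=len(capacity), seed=11).random(n)    # Latin hypercube draws

    for name, u in (("Monte Carlo", u_mc), ("Latin hypercube", u_lhs)):
        cap = available_capacity(u)
        print(f"{name}: mean available {cap.mean():.0f} MW, "
              f"P(available < {load:.0f} MW) = {np.mean(cap < load):.3f}")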
Wickenberg-Bolin, Ulrika; Göransson, Hanna; Fryknäs, Mårten; Gustafsson, Mats G; Isaksson, Anders
2006-03-13
Supervised learning for classification of cancer employs a set of design examples to learn how to discriminate between tumors. In practice it is crucial to confirm that the classifier is robust with good generalization performance to new examples, or at least that it performs better than random guessing. A suggested alternative is to obtain a confidence interval of the error rate using repeated design and test sets selected from available examples. However, it is known that even in the ideal situation of repeated designs and tests with completely novel samples in each cycle, a small test set size leads to a large bias in the estimate of the true variance between design sets. Therefore different methods for small sample performance estimation such as a recently proposed procedure called Repeated Random Sampling (RSS) is also expected to result in heavily biased estimates, which in turn translates into biased confidence intervals. Here we explore such biases and develop a refined algorithm called Repeated Independent Design and Test (RIDT). Our simulations reveal that repeated designs and tests based on resampling in a fixed bag of samples yield a biased variance estimate. We also demonstrate that it is possible to obtain an improved variance estimate by means of a procedure that explicitly models how this bias depends on the number of samples used for testing. For the special case of repeated designs and tests using new samples for each design and test, we present an exact analytical expression for how the expected value of the bias decreases with the size of the test set. We show that via modeling and subsequent reduction of the small sample bias, it is possible to obtain an improved estimate of the variance of classifier performance between design sets. However, the uncertainty of the variance estimate is large in the simulations performed indicating that the method in its present form cannot be directly applied to small data sets.
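For orientation only, a small sketch of the kind of repeated design/test resampling from a fixed bag of samples that the paper analyzes; the data set, classifier, and split sizes are hypothetical, and the naive between-cycle variance printed here is exactly the quantity the authors show to be biased for small test sets.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 100))                        # hypothetical small expression data set
y = (X[:, 0] + 0.5 * rng.standard_normal(60) > 0).astype(int)

# Repeated design/test cycles by resampling the *same* bag of samples
errors = []
for seed in range(200):
    X_d, X_t, y_d, y_t = train_test_split(X, y, test_size=10, random_state=seed, stratify=y)
    clf = KNeighborsClassifier(n_neighbors=3).fit(X_d, y_d)
    errors.append(1 - clf.score(X_t, y_t))

print("mean error:", np.mean(errors))
print("naive between-cycle variance:", np.var(errors, ddof=1))
```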
Study of Dynamic Characteristics of Aeroelastic Systems Utilizing Randomdec Signatures
NASA Technical Reports Server (NTRS)
Chang, C. S.
1975-01-01
The feasibility of utilizing the random decrement method in conjunction with a signature analysis procedure to determine the dynamic characteristics of an aeroelastic system for the purpose of on-line prediction of potential onset of flutter was examined. Digital computer programs were developed to simulate sampled response signals of a two-mode aeroelastic system. Simulated response data were used to test the random decrement method. A special curve-fit approach was developed for analyzing the resulting signatures. A number of numerical 'experiments' were conducted on the combined processes. The method is capable of determining frequency and damping values accurately from randomdec signatures of carefully selected lengths.
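A minimal sketch of the random decrement idea, assuming simple level-crossing triggering and a synthetic two-mode signal (all parameters hypothetical); the curve-fitting step applied to the resulting signature is not shown.

```python
import numpy as np

def randomdec(y, trigger, seg_len):
    """Random decrement signature: the average of signal segments that start
    wherever y crosses the trigger level from below (level-crossing triggering)."""
    starts = [i for i in range(len(y) - seg_len)
              if y[i] < trigger <= y[i + 1]]
    if not starts:
        raise ValueError("no trigger crossings found")
    segments = np.array([y[i:i + seg_len] for i in starts])
    return segments.mean(axis=0), len(starts)

# Hypothetical two-mode response: damped sinusoids plus measurement noise
fs = 200.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
y = (np.sin(2 * np.pi * 3.0 * t) * np.exp(-0.02 * t) +
     0.5 * np.sin(2 * np.pi * 7.5 * t) * np.exp(-0.05 * t) +
     0.3 * rng.standard_normal(t.size))

sig, n_seg = randomdec(y, trigger=y.std(), seg_len=int(2 * fs))
print(f"signature built from {n_seg} segments, length {sig.size} samples")
```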
Khazaal, Yasser; van Singer, Mathias; Chatton, Anne; Achab, Sophia; Zullino, Daniele; Rothen, Stephane; Khan, Riaz; Billieux, Joel; Thorens, Gabriel
2014-07-07
The number of medical studies performed through online surveys has increased dramatically in recent years. Despite their numerous advantages (eg, sample size, facilitated access to individuals presenting stigmatizing issues), selection bias may exist in online surveys. However, evidence on the representativeness of self-selected samples in online studies is patchy. Our objective was to explore the representativeness of a self-selected sample of online gamers using online players' virtual characters (avatars). All avatars belonged to individuals playing World of Warcraft (WoW), currently the most widely used online game. Avatars' characteristics were defined using various games' scores, reported on the WoW's official website, and two self-selected samples from previous studies were compared with a randomly selected sample of avatars. We used scores linked to 1240 avatars (762 from the self-selected samples and 478 from the random sample). The two self-selected samples of avatars had higher scores on most of the assessed variables (except for guild membership and exploration). Furthermore, some guilds were overrepresented in the self-selected samples. Our results suggest that more proficient players or players more involved in the game may be more likely to participate in online surveys. Caution is needed in the interpretation of studies based on online surveys that used a self-selection recruitment procedure. Epidemiological evidence on the reduced representativeness of sample of online surveys is warranted.
A method to estimate the effect of deformable image registration uncertainties on daily dose mapping
Murphy, Martin J.; Salguero, Francisco J.; Siebers, Jeffrey V.; Staub, David; Vaman, Constantin
2012-01-01
Purpose: To develop a statistical sampling procedure for spatially-correlated uncertainties in deformable image registration and then use it to demonstrate their effect on daily dose mapping. Methods: Sequential daily CT studies are acquired to map anatomical variations prior to fractionated external beam radiotherapy. The CTs are deformably registered to the planning CT to obtain displacement vector fields (DVFs). The DVFs are used to accumulate the dose delivered each day onto the planning CT. Each DVF has spatially-correlated uncertainties associated with it. Principal components analysis (PCA) is applied to measured DVF error maps to produce decorrelated principal component modes of the errors. The modes are sampled independently and reconstructed to produce synthetic registration error maps. The synthetic error maps are convolved with dose mapped via deformable registration to model the resulting uncertainty in the dose mapping. The results are compared to the dose mapping uncertainty that would result from uncorrelated DVF errors that vary randomly from voxel to voxel. Results: The error sampling method is shown to produce synthetic DVF error maps that are statistically indistinguishable from the observed error maps. Spatially-correlated DVF uncertainties modeled by our procedure produce patterns of dose mapping error that are different from that due to randomly distributed uncertainties. Conclusions: Deformable image registration uncertainties have complex spatial distributions. The authors have developed and tested a method to decorrelate the spatial uncertainties and make statistical samples of highly correlated error maps. The sample error maps can be used to investigate the effect of DVF uncertainties on daily dose mapping via deformable image registration. An initial demonstration of this methodology shows that dose mapping uncertainties can be sensitive to spatial patterns in the DVF uncertainties. PMID:22320766
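A hedged sketch of the decorrelate-and-resample idea described above, using a hypothetical stack of 1-D "error maps" with an assumed exponential spatial covariance in place of measured DVF errors:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical observed error maps: n_maps maps, each a vector of n_vox voxel errors
n_maps, n_vox = 40, 500
lags = np.abs(np.subtract.outer(np.arange(n_vox), np.arange(n_vox)))
cov = 0.5 * np.exp(-lags / 25.0)                         # assumed spatial correlation
observed = rng.multivariate_normal(np.zeros(n_vox), cov, size=n_maps)

# Principal components analysis of the observed error maps
mean_map = observed.mean(axis=0)
U, s, Vt = np.linalg.svd(observed - mean_map, full_matrices=False)   # rows of Vt: PCA modes
scores_std = s / np.sqrt(n_maps - 1)                     # std of the (decorrelated) mode scores

# Sample mode amplitudes independently and reconstruct synthetic, spatially correlated error maps
n_synth = 10
amps = rng.standard_normal((n_synth, len(s))) * scores_std
synthetic = mean_map + amps @ Vt
print(synthetic.shape)   # (10, 500): synthetic maps sharing the observed correlation structure
```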
Al-Mohrej, Omar A; Alshammari, Faris O; Aljuraisi, Abdulrahman M; Bin Amer, Lujain A; Masuadi, Emad M; Al-Kenani, Nader S
2018-04-01
Studies on total knee arthroplasty (TKA) in Saudi Arabia are scarce, and none have reported the knowledge and attitude of the procedure in Saudi Arabia. Our study aims to measure the knowledge and attitude of TKA among the adult Saudi population. To encompass a representative sample of this cross-sectional survey, all 13 administrative areas were used as ready-made geographical clusters. For each cluster, stratified random sampling was performed to maximize participation in the study. In each area, random samples of mobile phone numbers were selected with a probability proportional to the administrative area population size. Sample size calculation was based on the assumption that 50% of the participants would have some level of knowledge, with a 2% margin of error and 95% confidence level. To reach our intended sample size of 1540, we contacted 1722 participants with a response rate of 89.4%. The expected percentage of public knowledge was 50%; however, the actual percentage revealed by this study was much lower (29.7%). A stepwise multiple logistic regression was used to assess the factors that positively affected the knowledge score regarding TKA. Age [P = 0.016 with OR of 0.47], higher income [P = 0.001 with OR of 0.52] and participants with a positive history of TKA or that have known someone who underwent the surgery [P < 0.001 with OR of 0.15] had a positive impact on the total knowledge score. There are still misconceptions among the public in Saudi Arabia concerning TKA, its indications and results. We recommend that doctors use the results of our survey to assess their conversations with their patients, and to determine whether the results of the procedure are adequately clarified.
Confidence intervals for a difference between lognormal means in cluster randomization trials.
Poirier, Julia; Zou, G Y; Koval, John
2017-04-01
Cluster randomization trials, in which intact social units are randomized to different interventions, have become popular in the last 25 years. Outcomes from these trials in many cases are positively skewed, following approximately lognormal distributions. When inference is focused on the difference between treatment arm arithmetic means, existing confidence interval procedures either make restrictive assumptions or are complex to implement. We approach this problem by assuming log-transformed outcomes from each treatment arm follow a one-way random effects model. The treatment arm means are functions of multiple parameters for which separate confidence intervals are readily available, suggesting that the method of variance estimates recovery may be applied to obtain closed-form confidence intervals. A simulation study showed that this simple approach performs well in small sample sizes in terms of empirical coverage, relatively balanced tail errors, and interval widths as compared to existing methods. The methods are illustrated using data arising from a cluster randomization trial investigating a critical pathway for the treatment of community-acquired pneumonia.
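A simplified sketch of the method of variance estimates recovery (MOVER) for the difference between two lognormal arithmetic means, ignoring the clustering and one-way random-effects structure that the paper actually handles; the data are simulated.

```python
import numpy as np
from scipy import stats

def lognormal_mean_ci(logy, alpha=0.05):
    """MOVER CI for a single lognormal mean exp(mu + sigma^2/2) from log-scale data."""
    n = len(logy)
    ybar, s2 = np.mean(logy), np.var(logy, ddof=1)
    t = stats.t.ppf(1 - alpha / 2, n - 1)
    l_mu, u_mu = ybar - t * np.sqrt(s2 / n), ybar + t * np.sqrt(s2 / n)
    l_v = (n - 1) * s2 / (2 * stats.chi2.ppf(1 - alpha / 2, n - 1))
    u_v = (n - 1) * s2 / (2 * stats.chi2.ppf(alpha / 2, n - 1))
    eta = ybar + s2 / 2
    l_eta = eta - np.sqrt((ybar - l_mu) ** 2 + (s2 / 2 - l_v) ** 2)
    u_eta = eta + np.sqrt((u_mu - ybar) ** 2 + (u_v - s2 / 2) ** 2)
    return np.exp(eta), np.exp(l_eta), np.exp(u_eta)

def difference_ci(logy1, logy2, alpha=0.05):
    """MOVER CI for the difference between two lognormal arithmetic means."""
    m1, l1, u1 = lognormal_mean_ci(logy1, alpha)
    m2, l2, u2 = lognormal_mean_ci(logy2, alpha)
    d = m1 - m2
    return (d,
            d - np.sqrt((m1 - l1) ** 2 + (u2 - m2) ** 2),
            d + np.sqrt((u1 - m1) ** 2 + (m2 - l2) ** 2))

rng = np.random.default_rng(0)
arm1 = rng.lognormal(mean=1.0, sigma=0.8, size=60)   # hypothetical outcomes, arm 1
arm2 = rng.lognormal(mean=0.8, sigma=0.8, size=60)   # hypothetical outcomes, arm 2
print(difference_ci(np.log(arm1), np.log(arm2)))
```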
Blaivas, Michael; Adhikari, Srikar; Lander, Lina
2011-09-01
Emergency physicians (EPs) are beginning to use ultrasound (US) guidance to perform regional nerve blocks. The primary objective of this study was to compare length of stay (LOS) in patients randomized to US-guided interscalene block or procedural sedation to facilitate reduction of shoulder dislocation in the emergency department (ED). The secondary objectives were to compare one-on-one health care provider time, pain experienced by the patient during reduction, and patient satisfaction between the two groups. This was a prospective, randomized study of patients presenting to the ED with shoulder dislocation. The study was conducted at an academic Level I trauma center ED with an annual census of approximately 80,000. Patients were eligible for the study if they were at least 18 years of age and required reduction of a shoulder dislocation. A convenience sample of patients was randomized to either traditional procedural sedation or US-guided interscalene nerve block. Procedural sedation was performed with etomidate as the sole agent. Interscalene blocks were performed by hospital-credentialed EPs using sterile technique and a SonoSite MicroMaxx US machine with a high-frequency linear array transducer. Categorical variables were evaluated using Fisher's exact test, and continuous variables were analyzed using the Wilcoxon rank sum test. Forty-two patients were enrolled, with 21 patients randomized to each group. The groups were not significantly different with respect to sex or age. The mean (±SD) LOS in the ED was significantly higher in the procedural sedation group (177.3 ± 37.9 min) than in the US-guided interscalene block group (100.3 ± 28.2 minutes; p < 0.0001). The mean (±SD) one-on-one health care provider time was 47.1 (±9.8) minutes for the sedation group and 5 (±0.7) minutes for the US-guided interscalene block group (p < 0.0001). There was no statistically significant difference between the two groups in patient satisfaction or pain experienced during the procedure. There were no significant differences between groups with respect to complications such as hypoxia or hypotension (p = 0.49). In this study, patients undergoing shoulder dislocation reduction using US-guided interscalene block spent less time in the ED and required less one-on-one health care provider time compared to those receiving procedural sedation. There was no difference in pain level or satisfaction when compared to procedural sedation patients. © 2011 by the Society for Academic Emergency Medicine.
Dolch, Michael E; Janitza, Silke; Boulesteix, Anne-Laure; Graßmann-Lichtenauer, Carola; Praun, Siegfried; Denzer, Wolfgang; Schelling, Gustav; Schubert, Sören
2016-12-01
Identification of microorganisms in positive blood cultures still relies on standard techniques such as Gram staining followed by culturing with definite microorganism identification. Alternatively, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry or the analysis of headspace volatile compound (VC) composition produced by cultures can help to differentiate between microorganisms under experimental conditions. This study assessed the efficacy of volatile compound-based microorganism differentiation into Gram-negatives and -positives in unselected positive blood culture samples from patients. Headspace gas samples of positive blood culture samples were transferred to sterilized, sealed, and evacuated 20 ml glass vials and stored at -30 °C until batch analysis. Headspace gas VC content analysis was carried out via an auto sampler connected to an ion-molecule reaction mass spectrometer (IMR-MS). Measurements covered a mass range from 16 to 135 u including CO2, H2, N2, and O2. Prediction rules for microorganism identification based on VC composition were derived using a training data set and evaluated using a validation data set within a random split validation procedure. One hundred fifty-two aerobic samples growing 27 Gram-negatives, 106 Gram-positives, and 19 fungi and 130 anaerobic samples growing 37 Gram-negatives, 91 Gram-positives, and two fungi were analysed. In anaerobic samples, ten discriminators were identified by the random forest method allowing for bacteria differentiation into Gram-negative and -positive (error rate: 16.7 % in the validation data set). For aerobic samples the error rate was not better than random. In anaerobic blood culture samples of patients, IMR-MS-based headspace VC composition analysis facilitates bacteria differentiation into Gram-negative and -positive.
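For illustration only, a sketch of a random split validation of a random forest rule on hypothetical headspace intensity data (the feature matrix and Gram labels below are simulated, not the study's measurements):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical headspace VC intensities: 130 cultures x 10 discriminating masses
X = rng.lognormal(mean=0.0, sigma=1.0, size=(130, 10))
# Hypothetical labels (0 = Gram-negative, 1 = Gram-positive), weakly tied to the first mass
y = ((X[:, 0] + 0.5 * rng.standard_normal(130)) > np.median(X[:, 0])).astype(int)

# Random split validation: derive the rule on a training set, report error on a held-out set
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print("validation error rate:", 1 - rf.score(X_te, y_te))
```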
Skin penetration operators' knowledge and attitudes towards infection control.
Oberdorfer, Aurmporn; Wiggers, John H; Considine, Robyn J; Bowman, Jenny; Cockburn, Jill
2003-01-01
To assess the knowledge and attitudes of owners/managers of commercial skin-penetration premises regarding infection control. A telephone survey was conducted with a randomly selected sample of 874 owners/managers. Participants appeared to lack knowledge of essential infection-control practices. Fewer than 39% correctly identified recommended disinfection procedures, and between 12% and 67% were not aware of inappropriate sterilization procedures. Almost all participants accepted the need for guidelines. Half acknowledged a need to improve their infection-control compliance, and most accepted having their premises regularly checked by the councils. There is a considerable opportunity to increase infection-control compliance among skin-penetration operators.
Improved Equivalent Linearization Implementations Using Nonlinear Stiffness Evaluation
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Muravyov, Alexander A.
2001-01-01
This report documents two new implementations of equivalent linearization for solving geometrically nonlinear random vibration problems of complicated structures. The implementations are given the acronym ELSTEP, for "Equivalent Linearization using a STiffness Evaluation Procedure." Both implementations of ELSTEP are fundamentally the same in that they use a novel nonlinear stiffness evaluation procedure to numerically compute otherwise inaccessible nonlinear stiffness terms from commercial finite element programs. The commercial finite element program MSC/NASTRAN (NASTRAN) was chosen as the core of ELSTEP. The FORTRAN implementation calculates the nonlinear stiffness terms and performs the equivalent linearization analysis outside of NASTRAN. The Direct Matrix Abstraction Program (DMAP) implementation performs these operations within NASTRAN. Both provide nearly identical results. Within each implementation, two error minimization approaches for the equivalent linearization procedure are available - force and strain energy error minimization. Sample results for a simply supported rectangular plate are included to illustrate the analysis procedure.
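The ELSTEP implementations operate on finite element models through NASTRAN; as a toy illustration of the underlying equivalent-linearization iteration with force-error minimization, here is a sketch for a single-degree-of-freedom Duffing oscillator under white-noise excitation (all parameters hypothetical).

```python
import numpy as np

# Equivalent linearization of a Duffing oscillator under stationary white-noise excitation:
#   x'' + c x' + k x + a x^3 = w(t),  w(t) white noise with two-sided PSD S0.
# Force-error minimization for a zero-mean Gaussian response replaces k x + a x^3 with
# k_eq x, where k_eq = k + 3 a E[x^2], giving the fixed-point iteration below.
c, k, a, S0 = 0.05, 1.0, 0.5, 0.01            # hypothetical parameters

var_x = np.pi * S0 / (c * k)                   # linear-system response variance as a starting guess
for _ in range(100):
    k_eq = k + 3.0 * a * var_x                 # equivalent linear stiffness
    new_var = np.pi * S0 / (c * k_eq)          # displacement variance of the equivalent linear SDOF
    if abs(new_var - var_x) < 1e-12:
        break
    var_x = new_var

print(f"equivalent stiffness k_eq = {k_eq:.4f}, response variance = {var_x:.5f}")
```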
Paoletti, Claudia; Esbensen, Kim H
2015-01-01
Material heterogeneity influences the effectiveness of sampling procedures. Most sampling guidelines used for assessment of food and/or feed commodities are based on classical statistical distribution requirements (the normal, binomial, and Poisson distributions) and almost universally rely on the assumption of randomness. However, this is unrealistic. The scientific food and feed community recognizes a strong preponderance of nonrandom distributions within commodity lots, which should be a more realistic prerequisite for the definition of effective sampling protocols. Nevertheless, these heterogeneity issues are overlooked as the prime focus is often placed only on financial, time, equipment, and personnel constraints instead of mandating acquisition of documented representative samples under realistic heterogeneity conditions. This study shows how the principles promulgated in the Theory of Sampling (TOS), practically tested over 60 years, provide an effective framework for dealing with the complete set of adverse aspects of both compositional and distributional heterogeneity (material sampling errors), as well as with the errors incurred by the sampling process itself. The results of an empirical European Union study on genetically modified soybean heterogeneity (the Kernel Lot Distribution Assessment study) are summarized, as they have a strong bearing on the issue of proper sampling protocol development. TOS principles apply universally in the food and feed realm and must therefore be considered the only basis for development of valid sampling protocols free from distributional constraints.
Sara A. Goeking; Paul L. Patterson
2013-01-01
The USDA Forest Service's Forest Inventory and Analysis (FIA) Program applies specific sampling and analysis procedures to estimate a variety of forest attributes. FIA's Interior West region uses post-stratification, where strata consist of forest/nonforest polygons based on MODIS imagery, and assumes that nonresponse plots are distributed at random across each stratum...
ERIC Educational Resources Information Center
Keach, Everett T., Jr.; Pierfy, David A.
The research in this report was conducted to assess the cognitive impact of a simulation game designed to teach selected geographic data about wind and ocean currents to fifth graders. A two-group, post-test research design was used. A random procedure was used to assign 185 students to two treatment groups. The sample was divided by sex, ranked…
ERIC Educational Resources Information Center
Klein, Thomas W.
Steps involved in the item analysis and scaling of the 1990 edition of Forms A and B of the Nevada High School Proficiency Examinations (NHSPEs) are described. Pilot tests of Forms A and B of the 47-item reading and 45-item mathematics tests were each administered to random samples of more than 600 eleventh-grade students. A computer program was…
A closer look at cross-validation for assessing the accuracy of gene regulatory networks and models.
Tabe-Bordbar, Shayan; Emad, Amin; Zhao, Sihai Dave; Sinha, Saurabh
2018-04-26
Cross-validation (CV) is a technique to assess the generalizability of a model to unseen data. This technique relies on assumptions that may not be satisfied when studying genomics datasets. For example, random CV (RCV) assumes that a randomly selected set of samples, the test set, well represents unseen data. This assumption doesn't hold true where samples are obtained from different experimental conditions, and the goal is to learn regulatory relationships among the genes that generalize beyond the observed conditions. In this study, we investigated how the CV procedure affects the assessment of supervised learning methods used to learn gene regulatory networks (or in other applications). We compared the performance of a regression-based method for gene expression prediction estimated using RCV with that estimated using a clustering-based CV (CCV) procedure. Our analysis illustrates that RCV can produce over-optimistic estimates of the model's generalizability compared to CCV. Next, we defined the 'distinctness' of test set from training set and showed that this measure is predictive of performance of the regression method. Finally, we introduced a simulated annealing method to construct partitions with gradually increasing distinctness and showed that performance of different gene expression prediction methods can be better evaluated using this method.
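A small sketch of the contrast between random CV and a clustering-based CV in which whole clusters of similar samples are held out; the data, model, and use of k-means groups with scikit-learn's GroupKFold are illustrative assumptions, not the paper's pipeline.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

rng = np.random.default_rng(0)
# Hypothetical expression data: samples from 4 "conditions" with condition-specific shifts
n_per, n_genes = 30, 50
condition = np.repeat(np.arange(4), n_per)
X = rng.standard_normal((4 * n_per, n_genes)) + condition[:, None] * 0.8
y = X[:, :5].sum(axis=1) + condition * 2.0 + rng.standard_normal(4 * n_per)

model = Ridge(alpha=1.0)

# Random CV: test folds resemble the training conditions
rcv = cross_val_score(model, X, y, cv=KFold(n_splits=5, shuffle=True, random_state=0))

# Clustering-based CV: hold out whole clusters of similar samples
groups = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
ccv = cross_val_score(model, X, y, cv=GroupKFold(n_splits=4), groups=groups)

print("random CV  R^2:", rcv.mean())
print("cluster CV R^2:", ccv.mean())   # often lower: held-out clusters are more 'distinct'
```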
Motives for cosmetic procedures in Saudi women.
Al-Natour, Sahar H
2014-01-01
The media-fuelled obsession with beauty in modern society has led more women to seek elective cosmetic procedures to meet the portrayed ideals of beauty in different cultures. This study gives insights into incentives and desires to undergo cosmetic procedures in a conservative society with strict religious practices where women are veiled. Questionnaire data were obtained from 509 Saudi women who responded to a survey distributed randomly to a sample of Saudi women aged 17 to 72 years. At least 1 elective cosmetic procedure was performed in 42% of the women, of whom 77.8% wore a veil. Another 33% considered having a procedure. The motives for seeking a cosmetic procedure were to improve self-esteem in 83.7%, attract a husband in 63.3%, or prevent a husband from seeking another wife in 36.2%. The decision to seek a procedure was affected by the media, with high peer influence. Motivation for elective cosmetic procedures in Saudi women is influenced by a combination of emotional and cultural factors, level of education, marital status, and religious beliefs. The veil is not an impediment to seeking such procedures. A limitation of the study was missing data, as some items in the questionnaire were completed inaccurately or left unanswered.
Randomization Procedures Applied to Analysis of Ballistic Data
1991-06-01
Malcolm S. Taylor and Barry A. Bodt, Technical Report BRL-TR-3245, June 1991. Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Only a fragment of the abstract is recoverable: "...Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data."
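A minimal sketch of a two-sample randomization (permutation) test of the kind applied in such analyses; the "standard" and "dynamic" indexing data below are made up for illustration.

```python
import numpy as np

def randomization_test(x, y, n_perm=10000, seed=0):
    """Two-sample randomization test for a difference in means: group labels are
    repeatedly shuffled and the observed difference is compared with the
    randomization distribution."""
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([x, y])
    observed = x.mean() - y.mean()
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = pooled[:len(x)].mean() - pooled[len(x):].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)   # add-one correction for a valid p-value

# Hypothetical "standard" vs "dynamic" indexing measurements
standard = np.array([1.8, 2.1, 2.4, 1.9, 2.6, 2.2])
dynamic  = np.array([1.7, 2.0, 2.3, 1.8, 2.5, 2.1])
print(randomization_test(standard, dynamic))
```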
1998-05-01
Modeling Training Site Vegetation Coverage Probability with a Random Optimization Procedure: An Artificial Neural Network Approach, by Biing T. Guan, George Z. Gertner, and Alan B... (report documentation page; only fragments of the abstract are recoverable): "...coverage based on past coverage. Approach: A literature survey was conducted to identify artificial neural network analysis techniques applicable for..."
Huprich, Steven K; Defife, Jared; Westen, Drew
2014-01-01
We sought to determine whether meaningful subtypes of Dysthymic patients could be identified when grouping them by similar personality profiles. A random, national sample of psychiatrists and clinical psychologists (n=1201) described a randomly selected current patient with personality pathology using the descriptors in the Shedler-Westen Assessment Procedure-II (SWAP-II), completed assessments of patients' adaptive functioning, and provided DSM-IV Axis I and II diagnoses. We applied Q-factor cluster analyses to those patients diagnosed with Dysthymic Disorder. Four clusters were identified: High Functioning, Anxious/Dysphoric, Emotionally Dysregulated, and Narcissistic. These factor scores corresponded with a priori hypotheses regarding diagnostic comorbidity and level of adaptive functioning. We compared these groups to diagnostic constructs described and empirically identified in the past literature. The results converge with past and current ideas about the ways in which chronic depression and personality are related and offer an enhanced means by which to understand a heterogeneous diagnostic category that is empirically grounded and clinically useful. © 2013 Published by Elsevier B.V.
Ayasrah, Shahnaz Mohammed; Ahmad, Muayyad M
2016-01-01
To explore the effectiveness of an educational video intervention in lowering periprocedural anxiety among Jordanian patients hospitalized for cardiac catheterization (CATH). There are many potential reasons for anxiety related to CATH, including involvement of the heart and the actual test procedure. A randomized controlled trial took place in a specialized heart institute in Jordan. The sample size was 186 patients who had undergone the CATH procedure. Patients' anxiety levels were measured by physiological parameters of anxiety (blood pressure, heart rate, and respiratory rate) and by the Spielberger State Anxiety Inventory (SAI). After video education, there was a significant difference in periprocedural perceived anxiety between the groups: preprocedural anxiety levels (M = 39.03, SD = 5.70) for the experimental group versus (M = 49.34, SD = 6.00) for the control, p < .001, and postprocedural perceived anxiety for the experimental group (M = 29.18, SD = 5.42) versus (M = 41.73, SD = 5.41) for the control. Providing an educational video intervention about CATH may effectively decrease periprocedural anxiety levels.
Reiss, K; Makarova, N; Spallek, J; Zeeb, H; Razum, O
2013-06-01
In 2009, 19.6% of the population of Germany either had migrated themselves or were the offspring of people with migration experience. Migrants differ from the autochthonous German population in terms of health status, health awareness and health behaviour. To further investigate the health situation of migrants in Germany, epidemiological studies are needed. Such studies can employ existing databases which provide detailed information on migration status. Otherwise, onomastic or toponomastic procedures can be applied to identify people with migration background. If migrants have to be recruited into an epidemiological study, this can be done register-based (e. g., data from registration offices or telephone lists), based on residential location (random-route or random-walk procedure), via snowball sampling (e. g., through key persons) or via settings (e. g., school entry examination). An oversampling of people with migration background is not sufficient to avoid systematic bias in the sample due to non-participation. Additional measures have to be taken to increase access and raise participation rates. Personal contacting, multilingual instruments, multilingual interviewers and extensive public relations increase access and willingness to participate. Empirical evidence on 'successful' recruitment strategies for studies with migrants is still lacking in epidemiology and health sciences in Germany. The choice of the recruitment strategy as well as the measures to raise accessibility and willingness to participate depend on the available resources, the research question and the specific migrant target group. © Georg Thieme Verlag KG Stuttgart · New York.
Ji, Peter; DuBois, David L; Flay, Brian R; Brechling, Vanessa
2008-03-01
Recruiting schools into a matched-pair randomized controlled trial (MP-RCT) to evaluate the efficacy of a school-level prevention program presents challenges for researchers. We considered which of 2 procedures would be more effective for recruiting schools into the study and assigning them to conditions. In 1 procedure (recruit and match/randomize), we would recruit schools and match them prior to randomization, and in the other (match/randomize and recruit), we would match schools and randomize them prior to recruitment. We considered how each procedure impacted the randomization process and our ability to recruit schools into the study. After implementing the selected procedure, the equivalence of both treatment and control group schools and the participating and nonparticipating schools on school demographic variables was evaluated. We decided on the recruit and match/randomize procedure because we thought it would provide the opportunity to build rapport with the schools and prepare them for the randomization process, thereby increasing the likelihood that they would accept their randomly assigned conditions. Neither the treatment and control group schools nor the participating and nonparticipating schools exhibited statistically significant differences from each other on any of the school demographic variables. Recruitment of schools prior to matching and randomization in an MP-RCT may facilitate the recruitment of schools and thus enhance both the statistical power and the representativeness of study findings. Future research would benefit from the consideration of a broader range of variables (eg, readiness to implement a comprehensive prevention program) both in matching schools and in evaluating their representativeness to nonparticipating schools.
Robust estimation of the proportion of treatment effect explained by surrogate marker information.
Parast, Layla; McDermott, Mary M; Tian, Lu
2016-05-10
In randomized treatment studies where the primary outcome requires long follow-up of patients and/or expensive or invasive obtainment procedures, the availability of a surrogate marker that could be used to estimate the treatment effect and could potentially be observed earlier than the primary outcome would allow researchers to make conclusions regarding the treatment effect with less required follow-up time and resources. The Prentice criterion for a valid surrogate marker requires that a test for treatment effect on the surrogate marker also be a valid test for treatment effect on the primary outcome of interest. Based on this criterion, methods have been developed to define and estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on the surrogate marker. These methods aim to identify useful statistical surrogates that capture a large proportion of the treatment effect. However, current methods to estimate this proportion usually require restrictive model assumptions that may not hold in practice and thus may lead to biased estimates of this quantity. In this paper, we propose a nonparametric procedure to estimate the proportion of treatment effect on the primary outcome that is explained by the treatment effect on a potential surrogate marker and extend this procedure to a setting with multiple surrogate markers. We compare our approach with previously proposed model-based approaches and propose a variance estimation procedure based on a perturbation-resampling method. Simulation studies demonstrate that the procedure performs well in finite samples and outperforms model-based procedures when the specified models are not correct. We illustrate our proposed procedure using a data set from a randomized study investigating a group-mediated cognitive behavioral intervention for peripheral artery disease participants. Copyright © 2015 John Wiley & Sons, Ltd.
Effect of virtual reality on adolescent pain during burn wound care.
Jeffs, Debra; Dorman, Dona; Brown, Susan; Files, Amber; Graves, Tamara; Kirk, Elizabeth; Meredith-Neve, Sandra; Sanders, Janise; White, Benjamin; Swearingen, Christopher J
2014-01-01
The objective of this study was to compare the effect of virtual reality to passive distraction and standard care on burn treatment pain in adolescents.This single-blinded, randomized controlled study enrolled 30 adolescents who were 10 to 17 years of age from the burn clinic of a large children's hospital. After providing informed consent/assent, these participants were randomly assigned to one of three groups during wound care: standard care, passive distraction watching a movie, or virtual reality (VR) using a tripod-arm device rather than an immersive helmet. Before wound care, participants completed the Spielberger's State-Trait Anxiety Inventory for Children and Pre-Procedure Questionnaire while blinded to group assignment. A total of 28 participants completed the study and rated treatment pain after wound care by using the Adolescent Pediatric Pain Tool and completed a Post-Procedure Questionnaire. The VR group reported less pain during wound care than either the passive distraction or standard care group as determined by multivariable linear regression adjusted for age, sex, preprocedure pain, state anxiety, opiate use, and treatment length. The VR group was the only group to have an estimated decrease in pain perception from baseline preprocedure pain to procedural pain reported. Adolescents pretreated with opiate analgesics and female adolescents reported more pain during wound care.This between-subjects clinical study provides further support for VR, even without requiring wearing of an immersive helmet, in lessening burn wound care pain in adolescents. Passive distraction by watching a movie may be less effective in reducing treatment pain. Additional between-subjects randomized controlled trials with larger samples of children and during other healthcare treatments may further support VR's effectiveness in pediatric procedural pain management.
Steimer, Andreas; Schindler, Kaspar
2015-01-01
Oscillations between high and low values of the membrane potential (UP and DOWN states respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies have dealt with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample, whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximative relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron that is missing such an additional current boost performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimations of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational theories about UP states during slow wave sleep and present possible extensions of the model in the context of spike-frequency adaptation.
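A hedged sketch of simulating the EIF model and collecting ISIs as samples (Euler-Maruyama integration with hypothetical parameters; this is not the authors' code or parameter regime):

```python
import numpy as np

# Euler-Maruyama simulation of an exponential integrate-and-fire (EIF) neuron driven by a
# noisy current; each interspike interval (ISI) is one "sample" in the sense of the abstract.
dt, T = 0.1, 50000.0                                  # ms
gL, EL, VT, DeltaT = 0.1, -65.0, -50.0, 2.0           # hypothetical EIF parameters (C = 1)
V_reset, V_spike = -65.0, 0.0
mu, sigma = 1.5, 1.5                                  # mean drive and noise amplitude

rng = np.random.default_rng(0)
n_steps = int(T / dt)
noise = rng.standard_normal(n_steps)
V, last_spike, isis = EL, 0.0, []
for i in range(n_steps):
    dV = -gL * (V - EL) + gL * DeltaT * np.exp((V - VT) / DeltaT) + mu
    V += dt * dV + sigma * np.sqrt(dt) * noise[i]
    if V >= V_spike:                                  # spike: record the ISI and reset
        t = (i + 1) * dt
        isis.append(t - last_spike)
        last_spike, V = t, V_reset

print(len(isis), "spikes; mean ISI =", round(float(np.mean(isis)), 1), "ms")
```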
ERIC Educational Resources Information Center
Smith-Lock, Karen M.; Leitão, Suze; Prior, Polly; Nickels, Lyndsey
2015-01-01
Purpose: This study compared the effectiveness of two grammar treatment procedures for children with specific language impairment. Method: A double-blind superiority trial with cluster randomization was used to compare a cueing procedure, designed to elicit a correct production following an initial error, to a recasting procedure, which required…
Predicate Argument Structure Frames for Modeling Information in Operative Notes
Wang, Yan; Pakhomov, Serguei; Melton, Genevieve B.
2015-01-01
The rich information about surgical procedures contained in operative notes is a valuable data source for improving the clinical evidence base and clinical research. In this study, we propose a set of Predicate Argument Structure (PAS) frames for surgical action verbs to assist in the creation of an information extraction (IE) system to automatically extract details about the techniques, equipment, and operative steps from operative notes. We created PropBank-style PAS frames for the 30 top surgical action verbs based on examination of randomly selected sample sentences from 3,000 Laparoscopic Cholecystectomy notes. To assess the completeness of the PAS frames in representing usage of the same action verbs, we evaluated the created PAS frames on sample sentences from operative notes of 6 other gastrointestinal surgical procedures. Our results showed that the PAS frames created with one type of surgery can successfully denote the usage of the same verbs in operative notes of broader surgical categories. PMID:23920664
FOCIS: A forest classification and inventory system using LANDSAT and digital terrain data
NASA Technical Reports Server (NTRS)
Strahler, A. H.; Franklin, J.; Woodcook, C. E.; Logan, T. L.
1981-01-01
Accurate, cost-effective stratification of forest vegetation and timber inventory is the primary goal of a Forest Classification and Inventory System (FOCIS). Conventional timber stratification using photointerpretation can be time-consuming, costly, and inconsistent from analyst to analyst. FOCIS was designed to overcome these problems by using machine processing techniques to extract and process tonal, textural, and terrain information from registered LANDSAT multispectral and digital terrain data. Comparison of samples from timber strata identified by conventional procedures showed that both have about the same potential to reduce the variance of timber volume estimates over simple random sampling.
Random analysis of bearing capacity of square footing using the LAS procedure
NASA Astrophysics Data System (ADS)
Kawa, Marek; Puła, Wojciech; Suska, Michał
2016-09-01
In the present paper, a three-dimensional problem of the bearing capacity of a square footing on a random soil medium is analyzed. The random fields of the strength parameters c and φ are generated using the LAS procedure (Local Average Subdivision, Fenton and Vanmarcke 1990). The procedure has been re-implemented by the authors in the Mathematica environment in order to combine it with a commercial program. Since the procedure is still being tested, the random field has been assumed to be one-dimensional: the strength properties of the soil are random in the vertical direction only. Individual realizations of the bearing-capacity boundary problem, with strength parameters of the medium defined by the above procedure, are solved using the FLAC3D software. The analysis is performed for two qualitatively different cases, namely for purely cohesive and for cohesive-frictional soils. For the latter case the friction angle and cohesion have been assumed to be independent random variables. For these two cases the random square-footing bearing capacity results have been obtained for a range of fluctuation scales from 0.5 m to 10 m. Each time, 1000 Monte Carlo realizations have been performed. The results obtained allow not only the mean and variance but also the probability density function to be estimated. An example of the application of this function to reliability calculation is presented in the final part of the paper.
NASA Technical Reports Server (NTRS)
Lites, B. W.; Skumanich, A.
1985-01-01
A method is presented for recovery of the vector magnetic field and thermodynamic parameters from polarization measurements of photospheric line profiles measured with filtergraphs. The method includes magneto-optic effects and may be utilized on data sampled at arbitrary wavelengths within the line profile. The accuracy of this method is explored through inversion of synthetic Stokes profiles subjected to varying levels of random noise, instrumental wavelength resolution, and line profile sampling. The level of error introduced by the systematic effect of profile sampling over a finite fraction of the 5-minute oscillation cycle is also investigated. The results presented here are intended to guide instrumental design and observational procedure.
Leili, Mostafa; Pirmoghani, Amin; Samadi, Mohammad Taghi; Shokoohi, Reza; Roshanaei, Ghodratollah; Poormohammadi, Ali
2016-11-01
The objective of this study was to determine the residual concentrations of ethion and imidacloprid in cucumbers grown in greenhouses. The effect of some simple processing procedures on both ethion and imidacloprid residues was also studied. Ten active greenhouses that produce cucumbers were randomly selected. Ethion and imidacloprid, the most widely used pesticides, were measured in cucumber samples from the studied greenhouses. Moreover, the effect of storing, washing, and peeling as simple processing procedures on both ethion and imidacloprid residues was investigated. One hour after pesticide application, the residue levels of ethion and imidacloprid were higher than the Codex maximum residue levels (MRLs). One day after pesticide application, the levels of pesticides had decreased by about 35 and 31% for ethion and imidacloprid, respectively, but still exceeded the MRLs. The washing procedure led to about a 51 and 42.5% loss in ethion and imidacloprid residues, respectively. The peeling procedure led to the highest losses, of 93.4 and 63.7% in ethion and imidacloprid residues, respectively. The recovery for both target analytes was in the range of 88 to 102%. The residue values in samples collected one hour after pesticide application were higher than the standard value. Storing, washing, and peeling lead to a decrease of pesticide residues in greenhouse cucumbers. Among them, the peeling procedure has the greatest impact on residue reduction. Therefore, these procedures can be used as simple and effective processing techniques for reducing and removing pesticides from greenhouse products before their consumption.
NASA Astrophysics Data System (ADS)
Bushel, Pierre R.; Bennett, Lee; Hamadeh, Hisham; Green, James; Ableson, Alan; Misener, Steve; Paules, Richard; Afshari, Cynthia
2002-06-01
We present an analysis of pattern recognition procedures used to predict the classes of samples exposed to pharmacologic agents by comparing gene expression patterns from samples treated with two classes of compounds. Rat liver mRNA samples following exposure for 24 hours with phenobarbital or peroxisome proliferators were analyzed using a 1700 rat cDNA microarray platform. Sets of genes that were consistently differentially expressed in the rat liver samples following treatment were stored in the MicroArray Project System (MAPS) database. MAPS identified 238 genes in common that possessed a low probability (P < 0.01) of being randomly detected as differentially expressed at the 95% confidence level. Hierarchical cluster analysis on the 238 genes clustered specific gene expression profiles that separated samples based on exposure to a particular class of compound.
47 CFR 1.1604 - Post-selection hearings.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...
47 CFR 1.1604 - Post-selection hearings.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Post-selection hearings. 1.1604 Section 1.1604 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1604 Post-selection hearings. (a) Following the random...
Effect of airway clearance techniques on the efficacy of the sputum induction procedure.
Elkins, M R; Lane, T; Goldberg, H; Pagliuso, J; Garske, L A; Hector, E; Marchetto, L; Alison, J A; Bye, P T P
2005-11-01
Sputum induction is used in the early identification of tuberculosis (TB) and pneumocystis infections of the lung. Although manual physiotherapy techniques to clear the airways are often incorporated in the sputum induction procedure, their efficacy in this setting is unknown. This randomised, crossover trial enrolled adults referred for sputum induction for suspected TB and pneumocystis infections of the lung. All participants underwent two sputum induction procedures, inhaling 3% saline via ultrasonic nebuliser. During one randomly allocated procedure, airway clearance techniques (chest wall percussion, vibration, huffing) were incorporated. In total, 59 participants completed the trial. The airway clearance techniques had no significant effect on how the test was tolerated, the volume expectorated or the quality of the sample obtained (assessed by the presence of alveolar macrophages). The techniques did not significantly affect how often the test identified a suspected organism, nor the sensitivity or specificity of sputum induction. In conclusion, the study was unable to demonstrate any effect of airway clearance techniques on the sputum induction procedure. The results provide some justification for not including airway clearance techniques as part of the sputum induction procedure.
Designing with fiber-reinforced plastics (planar random composites)
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1982-01-01
The use of composite mechanics to predict the hygrothermomechanical behavior of planar random composites (PRC) is reviewed and described. These composites are usually made from chopped fiber reinforced resins (thermoplastics or thermosets). The hygrothermomechanical behavior includes mechanical properties, physical properties, thermal properties, fracture toughness, creep and creep rupture. Properties are presented in graphical form with sample calculations to illustrate their use. Concepts such as directional reinforcement and strip hybrids are described. Typical data that can be used for preliminary design for various PRCs are included. Several resins and molding compounds used to make PRCs are described briefly. Pertinent references are cited that cover analysis and design methods, materials, data, fabrication procedures and applications.
Sampling procedures for throughfall monitoring: A simulation study
NASA Astrophysics Data System (ADS)
Zimmermann, Beate; Zimmermann, Alexander; Lark, Richard Murray; Elsenbeer, Helmut
2010-01-01
What is the most appropriate sampling scheme to estimate event-based average throughfall? A satisfactory answer to this seemingly simple question has yet to be found, a failure which we attribute to previous efforts' dependence on empirical studies. Here we try to answer this question by simulating stochastic throughfall fields based on parameters for statistical models of large monitoring data sets. We subsequently sampled these fields with different sampling designs and variable sample supports. We evaluated the performance of a particular sampling scheme with respect to the uncertainty of possible estimated means of throughfall volumes. Even for a relative error limit of 20%, an impractically large number of small, funnel-type collectors would be required to estimate mean throughfall, particularly for small events. While stratification of the target area is not superior to simple random sampling, cluster random sampling involves the risk of being less efficient. A larger sample support, e.g., the use of trough-type collectors, considerably reduces the necessary sample sizes and eliminates the sensitivity of the mean to outliers. Since the gain in time associated with the manual handling of troughs versus funnels depends on the local precipitation regime, the employment of automatically recording clusters of long troughs emerges as the most promising sampling scheme. Even so, a relative error of less than 5% appears out of reach for throughfall under heterogeneous canopies. We therefore suspect a considerable uncertainty of input parameters for interception models derived from measured throughfall, in particular, for those requiring data of small throughfall events.
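The impractically large collector numbers reported for small events follow directly from the simple-random-sampling sample size formula n ≈ (z·CV / relative error)²; a small sketch with assumed coefficients of variation (the CV values are illustrative, not taken from the monitoring data sets):

```python
from scipy import stats

def n_collectors(cv, rel_error, conf=0.95):
    """Simple-random-sampling sample size so the estimated mean is within a given
    relative error: n ~ (z * CV / rel_error)^2, with CV the coefficient of
    variation of throughfall volumes across collectors."""
    z = stats.norm.ppf(0.5 + conf / 2)
    return (z * cv / rel_error) ** 2

# Hypothetical CVs: small events tend to be far more variable than large ones
for cv in (0.3, 0.6, 1.0):
    print(f"CV={cv:.1f}: n ~ {n_collectors(cv, 0.20):.0f} (20% error), "
          f"{n_collectors(cv, 0.05):.0f} (5% error)")
```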
Yang, Baohui; Eyeson-Annan, Margo
2006-01-01
Background Computer assisted telephone interviewing (CATI) is widely used for health surveys. The advantages of CATI over face-to-face interviewing are timeliness and cost reduction to achieve the same sample size and geographical coverage. Two major CATI sampling procedures are used: sampling directly from the electronic white pages (EWP) telephone directory and list assisted random digit dialling (LA-RDD) sampling. EWP sampling covers telephone numbers of households listed in the printed white pages. LA-RDD sampling has a better coverage of households than EWP sampling but is considered to be more expensive due to interviewers dialling more out-of-scope numbers. Methods This study compared an EWP sample and a LA-RDD sample from the New South Wales Population Health Survey in 2003 on demographic profiles, health estimates, coefficients of variation in weights, design effects on estimates, and cost effectiveness, on the basis of achieving the same level of precision of estimates. Results The LA-RDD sample better represented the population than the EWP sample, with a coefficient of variation of weights of 1.03 for LA-RDD compared with 1.21 for EWP, and average design effects of 2.00 for LA-RDD compared with 2.38 for EWP. Also, a LA-RDD sample can save up to 14.2% in cost compared to an EWP sample to achieve the same precision for health estimates. Conclusion A LA-RDD sample better represents the population, which potentially leads to reduced bias in health estimates, and rather than costing more than EWP actually costs less. PMID:16504117
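The reported coefficients of variation of weights and average design effects are roughly consistent with Kish's approximation for the weighting design effect, deff ≈ 1 + cv(w)²; a quick check (the remaining gap presumably reflects clustering and other design features not captured by the approximation):

```python
# Kish's approximation: the design effect due to unequal weights is roughly 1 + cv(w)^2,
# where cv(w) is the coefficient of variation of the survey weights.
for label, cv_w in [("LA-RDD", 1.03), ("EWP", 1.21)]:
    print(label, "approximate weighting design effect:", round(1 + cv_w ** 2, 2))
# LA-RDD: ~2.06 (reported average design effect 2.00)
# EWP:    ~2.46 (reported average design effect 2.38)
```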
Rietbergen, Charlotte; Stefansdottir, Gudrun; Leufkens, Hubert G; Knol, Mirjam J; De Bruin, Marie L; Klugkist, Irene
2017-01-01
The current system of harm assessment of medicines has been criticized for relying on intuitive expert judgment. There is a call for more quantitative approaches and transparency in decision-making. Illustrated with the case of cardiovascular safety concerns for rosiglitazone, we aimed to explore a structured procedure for the collection, quality assessment, and statistical modeling of safety data from observational and randomized studies. We distinguished five stages in the synthesis process. In Stage I, the general research question, population and outcome, and general inclusion and exclusion criteria are defined and a systematic search is performed. Stage II focusses on the identification of sub-questions examined in the included studies and the classification of the studies into the different categories of sub-questions. In Stage III, the quality of the identified studies is assessed. Coding and data extraction are performed in Stage IV. Finally, meta-analyses on the study results per sub-question are performed in Stage V. A PubMed search identified 30 randomized and 14 observational studies meeting our search criteria. From these studies, we identified 4 higher level sub-questions and 4 lower level sub-questions. We were able to categorize 29 individual treatment comparisons into one or more of the sub-question categories, and selected study duration as an important covariate. We extracted covariate, outcome, and sample size information at the treatment arm level of the studies. We extracted absolute numbers of myocardial infarctions from the randomized studies, and adjusted risk estimates with 95% confidence intervals from the observational studies. Overall, few events were observed in the randomized studies, which were frequently of relatively short duration. The large observational studies provided more information since these were often of longer duration. A Bayesian random effects meta-analysis on these data showed no significant increase in risk of rosiglitazone for any of the sub-questions. The proposed procedure can be of additional value for drug safety assessment because it provides a stepwise approach that guides the decision-making and increases process transparency. The procedure allows for the inclusion of results from both randomized and observational studies, which is especially relevant for this type of research.
Sample size requirements for the design of reliability studies: precision consideration.
Shieh, Gwowen
2014-09-01
In multilevel modeling, the intraclass correlation coefficient based on the one-way random-effects model is routinely employed to measure the reliability or degree of resemblance among group members. To facilitate the advocated practice of reporting confidence intervals in future reliability studies, this article presents exact sample size procedures for precise interval estimation of the intraclass correlation coefficient under various allocation and cost structures. Although the suggested approaches do not admit explicit sample size formulas and require special algorithms for carrying out iterative computations, they are more accurate than the closed-form formulas constructed from large-sample approximations with respect to the expected width and assurance probability criteria. This investigation notes the deficiency of existing methods and expands the sample size methodology for the design of reliability studies that have not previously been discussed in the literature.
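For context, a sketch of the one-way random-effects intraclass correlation and its exact F-based confidence interval for a balanced design (simulated data; the paper's contribution, choosing the numbers of groups and members to control the expected interval width, is not reproduced here):

```python
import numpy as np
from scipy import stats

def icc_oneway(data, alpha=0.05):
    """ICC(1) and its exact CI from a balanced one-way random-effects layout.
    data: (n_groups, k) array of measurements."""
    n, k = data.shape
    grand = data.mean()
    msb = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)                   # between-group MS
    msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))   # within-group MS
    icc = (msb - msw) / (msb + (k - 1) * msw)
    f_obs = msb / msw
    fl = f_obs / stats.f.ppf(1 - alpha / 2, n - 1, n * (k - 1))
    fu = f_obs * stats.f.ppf(1 - alpha / 2, n * (k - 1), n - 1)
    return icc, (fl - 1) / (fl + k - 1), (fu - 1) / (fu + k - 1)

rng = np.random.default_rng(0)
# Hypothetical data: 30 groups of 4 members, between- and within-group variances both 1 (true ICC 0.5)
data = rng.normal(0, 1.0, size=(30, 1)) + rng.normal(0, 1.0, size=(30, 4))
print(icc_oneway(data))
```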
Prevalence of Listeria monocytogenes in Idiazabal cheese.
Arrese, E; Arroyo-Izaga, M
2012-01-01
Raw-milk cheese has been identified in risk assessments as a food of greater concern to public health due to listeriosis. The aim was to determine the prevalence and levels of Listeria monocytogenes in semi-hard Idiazabal cheese manufactured by different producers in the Basque Country at the consumer level. A total of 51 Idiazabal cheese samples were obtained from 10 separate retail establishments, chosen by stratified random sampling. Samples were tested using the official standard ISO 11290-1 procedure for detection and enumeration. All cheese samples tested negative for L. monocytogenes. However, 9.8% tested positive for Listeria spp. other than L. monocytogenes. Positive samples came from two brands; two were natural and three were smoked. The presence of Listeria spp. suggests that the cheese-making process and hygiene, whether at milking or during cheese making, could be insufficient.
Chow, Jeffrey T Y; Turkstra, Timothy P; Yim, Edmund; Jones, Philip M
2018-06-01
Although every randomized clinical trial (RCT) needs participants, determining the ideal number of participants that balances limited resources and the ability to detect a real effect is difficult. Focussing on two-arm, parallel-group, superiority RCTs published in six general anesthesiology journals, this study aimed to compare the quality of sample size calculations for RCTs published in 2010 vs 2016. Each RCT's full text was searched for the presence of a sample size calculation, and the assumptions made by the investigators were compared with the actual values observed in the results. Analyses were only performed for sample size calculations that were amenable to replication, defined as using a clearly identified outcome that was continuous or binary in a standard sample size calculation procedure. The percentage of RCTs reporting all sample size calculation assumptions increased from 51% in 2010 to 84% in 2016. For most RCTs, the difference between the values observed in the study and the expected values used for the sample size calculation was > 10% of the expected value, with negligible improvement from 2010 to 2016. While the reporting of sample size calculations improved from 2010 to 2016, the expected values in these sample size calculations often assumed effect sizes larger than those actually observed in the study. Since overly optimistic assumptions may systematically lead to underpowered RCTs, improvements in how sample sizes are calculated and reported in anesthesiology research are needed.
Quasi-Monte Carlo Methods Applied to Tau-Leaping in Stochastic Biological Systems.
Beentjes, Casper H L; Baker, Ruth E
2018-05-25
Quasi-Monte Carlo methods have proven to be effective extensions of traditional Monte Carlo methods in, amongst others, problems of quadrature and the sample path simulation of stochastic differential equations. By replacing the random number input stream in a simulation procedure by a low-discrepancy number input stream, variance reductions of several orders have been observed in financial applications. Analysis of stochastic effects in well-mixed chemical reaction networks often relies on sample path simulation using Monte Carlo methods, even though these methods suffer from the typical slow O(N^(-1/2)) convergence rate as a function of the number of sample paths N. This paper investigates the combination of (randomised) quasi-Monte Carlo methods with an efficient sample path simulation procedure, namely tau-leaping. We show that this combination is often more effective than traditional Monte Carlo simulation in terms of the decay of statistical errors. The observed convergence rate behaviour is, however, non-trivial due to the discrete nature of the models of chemical reactions. We explain how this affects the performance of quasi-Monte Carlo methods by looking at a test problem in standard quadrature.
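As a rough, self-contained illustration of the variance-reduction idea (plain quadrature on the unit cube rather than the paper's tau-leaping setting), the Python sketch below compares ordinary Monte Carlo with randomized quasi-Monte Carlo based on a scrambled Sobol' sequence; the integrand, dimension and sample sizes are arbitrary choices.

import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(1)
f = lambda u: np.exp(np.sum(u, axis=1))          # test integrand on the unit cube
dim, n, reps = 4, 2 ** 10, 32
exact = (np.e - 1) ** dim                        # exact value of the integral

mc_est, rqmc_est = [], []
for _ in range(reps):
    u = rng.random((n, dim))                     # plain Monte Carlo points
    mc_est.append(f(u).mean())
    sobol = qmc.Sobol(d=dim, scramble=True, seed=rng)   # randomized (scrambled) Sobol' points
    rqmc_est.append(f(sobol.random(n)).mean())

print("MC   RMSE:", np.sqrt(np.mean((np.array(mc_est) - exact) ** 2)))
print("RQMC RMSE:", np.sqrt(np.mean((np.array(rqmc_est) - exact) ** 2)))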
A DNA-Based Procedure for In Planta Detection of Fusarium oxysporum f. sp. phaseoli.
Alves-Santos, Fernando M; Ramos, Brisa; García-Sánchez, M Asunción; Eslava, Arturo P; Díaz-Mínguez, José María
2002-03-01
We have characterized strains of Fusarium oxysporum from common bean fields in Spain that were nonpathogenic on common bean, as well as F. oxysporum strains (F. oxysporum f. sp. phaseoli) pathogenic to common bean, by random amplified polymorphic DNA (RAPD) analysis. We identified a RAPD marker (RAPD 4.12) specific for the highly virulent pathogenic strains of the seven races of F. oxysporum f. sp. phaseoli. Sequence analysis of RAPD 4.12 allowed the design of oligonucleotides that amplify a 609-bp sequence characterized amplified region (SCAR) marker (SCAR-B310A280). Under controlled environmental and greenhouse conditions, detection of the pathogen by polymerase chain reaction was 100% successful in root samples of infected but still symptomless plants and in stem samples of plants with a disease severity of ≥4 on the Centro Internacional de Agricultura Tropical (CIAT; Cali, Colombia) scale. The diagnostic procedure can be completed in 5 h and allows the detection of all known races of the pathogen in plant samples at early stages of the disease with no visible symptoms.
Er, Buket; Onurdag, Fatma Kaynak; Demirhan, Burak; Ozgacar, Selda Özgen; Oktem, Aysel Bayhan; Abbasoglu, Ufuk
2013-08-01
This study aimed to determine quinolone antibiotic residues in chicken and beef sold in Ankara, Turkey. A total of 127 chicken and 104 beef meat samples were collected randomly from local markets for analysis. Extraction and determination of quinolones were performed by an ELISA procedure. One hundred eighteen of the 231 (51.1%) examined chicken and beef samples were found to contain quinolone antibiotic residue; 58 (45.7%) of the chicken samples and 60 (57.7%) of the beef samples were positive. The mean levels (±SE) of quinolones were 30.81 ± 0.45 µg/kg in chicken and 6.64 ± 1.11 µg/kg in beef samples. This study indicated that some chicken and beef meat sold in Ankara contains quinolone antibiotic residues.
Canbulat, Nejla; Ayhan, Fatma; Inal, Sevil
2015-02-01
The aim of this study was to investigate the effect of external cold and vibration stimulation via Buzzy on the pain and anxiety level of children during peripheral intravenous (IV) cannulation. This study was a prospective, randomized controlled trial. The sample consisted of 176 children ages 7 to 12 years who were randomly assigned to two groups: a control group that received no peripheral IV cannulation intervention and an experimental group that received external cold and vibration via Buzzy. The same nurse conducted the peripheral IV cannulation in all the children, and the same researcher applied the external cold and vibration to all the children. The external cold and the vibration were applied 1 minute before the peripheral IV cannulation procedure and continued until the end of the procedure. Preprocedural anxiety was assessed using the Children's Fear Scale, along with reports by the children, their parents, and an observer. Procedural anxiety was assessed with the Children's Fear Scale and the parents' and the observer's reports. Procedural pain was assessed using the Wong Baker Faces Scale and the visual analog scale self-reports of the children. Preprocedural anxiety did not differ significantly. Comparison of the two groups showed significantly lower pain and anxiety levels in the experimental group than in the control group during the peripheral IV cannulation. Buzzy can be considered to provide an effective combination of coldness and vibration. This method can be used during pediatric peripheral IV cannulation by pediatric nurses. Copyright © 2015 American Society for Pain Management Nursing. Published by Elsevier Inc. All rights reserved.
A two-stage Monte Carlo approach to the expression of uncertainty with finite sample sizes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowder, Stephen Vernon; Moyer, Robert D.
2005-05-01
Proposed supplement I to the GUM outlines a 'propagation of distributions' approach to deriving the distribution of a measurand for any non-linear function and for any set of random inputs. The supplement's proposed Monte Carlo approach assumes that the distributions of the random inputs are known exactly. This implies that the sample sizes are effectively infinite. In this case, the mean of the measurand can be determined precisely using a large number of Monte Carlo simulations. In practice, however, the distributions of the inputs will rarely be known exactly, but must be estimated using possibly small samples. If these approximated distributions are treated as exact, the uncertainty in estimating the mean is not properly taken into account. In this paper, we propose a two-stage Monte Carlo procedure that explicitly takes into account the finite sample sizes used to estimate parameters of the input distributions. We will illustrate the approach with a case study involving the efficiency of a thermistor mount power sensor. The performance of the proposed approach will be compared to the standard GUM approach for finite samples using simple non-linear measurement equations. We will investigate performance in terms of coverage probabilities of derived confidence intervals.
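A minimal two-stage sketch in the spirit of the approach described above (not the authors' exact procedure; the measurement equation, sample sizes and distributions are made up): the outer stage draws plausible input-distribution parameters consistent with the small observed samples, and the inner stage propagates random inputs through the measurement equation for each plausible parameter set, so the spread of the outer-stage results reflects the extra uncertainty due to finite samples.

import numpy as np

rng = np.random.default_rng(42)

# Small observed calibration samples for two inputs (illustrative data only).
x_obs = rng.normal(10.0, 0.5, size=8)
y_obs = rng.normal(2.0, 0.2, size=6)

def measurand(x, y):
    # Made-up non-linear measurement equation.
    return x * np.exp(-y) + np.sqrt(np.abs(x * y))

outer, inner = 500, 2000
means = np.empty(outer)
for i in range(outer):
    # Stage 1: draw plausible (mean, sd) for each input, reflecting finite sample size
    # (here via a simple nonparametric bootstrap of the observed samples).
    xb = rng.choice(x_obs, size=x_obs.size, replace=True)
    yb = rng.choice(y_obs, size=y_obs.size, replace=True)
    # Stage 2: propagate random inputs drawn from the stage-1 distributions.
    x = rng.normal(xb.mean(), xb.std(ddof=1), size=inner)
    y = rng.normal(yb.mean(), yb.std(ddof=1), size=inner)
    means[i] = measurand(x, y).mean()

# The spread of the outer-loop means reflects the extra uncertainty due to finite samples.
print("mean of measurand:", means.mean(), " outer-stage sd:", means.std(ddof=1))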
Long, Ju
2016-05-01
In China, -(SEA), -α(3.7) and -α(4.2) are common deletional α-thalassemia alleles. Gap-PCR is the detection method currently used for these alleles; its disadvantages include a time-consuming procedure and an increased potential for PCR product contamination, so an improved method is needed. Based on identical-primer homologous fragments, a qPCR system was developed for deletional α-thalassemia genotyping, composed of one group of quantitatively related primers and corresponding probes plus two groups of qualitatively related primers and corresponding probes. To verify the accuracy of the qPCR system, samples of known genotype and random samples were employed. The standard curve results demonstrated that the designed primers and probes all yielded good amplification efficiency. In the tests of known-genotype and random samples, detection results were consistent with the verification results, and the αα, -(SEA), -α(3.7) and -α(4.2) alleles were accurately detected. In addition, this method offers a wider detection range, greater speed and a reduced PCR product contamination risk compared with currently common gap-PCR detection reagents. Copyright © 2016 Elsevier B.V. All rights reserved.
Taş, Nilay; Korkmaz, Hakan; Yağan, Özgür; Korkmaz, Mukadder
2015-01-01
Background Sugammadex is a reversal agent with well-known advantages, but its effects on haemostasis and bleeding have been a topic of interest. Septoplasty is a common surgical procedure in which postoperative respiratory complications and bleeding may occur. The aim of this study was to investigate the effects of sugammadex on postoperative coagulation parameters and bleeding after the septoplasty procedure. Material/Methods In this randomized controlled study, fifty patients were assigned to two groups: neostigmine (Group N) vs. sugammadex (Group S). For the evaluation of PT, aPTT and INR, blood samples were taken at 120 minutes postoperatively, and the changes in these values with respect to preoperative values were documented. Postoperative bleeding was measured by evaluating the amount of blood absorbed by the nasal tip dressing during the first 3 hours postoperatively. Results The amount of postoperative bleeding was significantly higher in Group S than in Group N (p=0.013). No significant difference was observed between the two groups in coagulation parameters (PT, p=0.953; aPTT, p=0.734; INR, p=0.612). Conclusions Sugammadex was associated with a higher amount of postoperative bleeding than neostigmine in septoplasty patients. In surgical procedures with a high risk of bleeding, the safety of sugammadex needs to be verified. PMID:26271275
Effects of music therapy and distraction cards on pain relief during phlebotomy in children.
Aydin, Diler; Sahiner, Nejla Canbulat
2017-02-01
To investigate the effect of three different distraction methods (distraction cards, listening to music, and distraction cards + music) on pain and anxiety relief in children during phlebotomy. This study was a prospective, randomized, controlled trial. The sample consisted of children aged 7 to 12 years who required blood tests. The children were randomized into four groups: distraction cards, music, distraction cards + music, and controls. Data were obtained through face-to-face interviews with the children, their parents, and the observer before and after the procedure. The children's pain levels were reported by the parents and the observer and self-reported by the children using the Wong-Baker FACES scale. The children's anxiety levels were also assessed using the Children's Fear Scale. Two hundred children (mean age: 9.01 ± 2.35 years) were included. No difference was found between the groups in the self-, parent-, and observer-reported procedural pain levels (p=0.72, p=0.23, p=0.15, respectively). Furthermore, no significant differences were observed between groups in procedural child anxiety levels according to the parents and the observer (p=0.092, p=0.096, respectively). Pain and anxiety relief was seen with all three methods during phlebotomy; however, no statistically significant difference was observed. Copyright © 2016 Elsevier Inc. All rights reserved.
Ozone measurement system for NASA global air sampling program
NASA Technical Reports Server (NTRS)
Tiefermann, M. W.
1979-01-01
The ozone measurement system used in the NASA Global Air Sampling Program is described. The system uses a commercially available ozone concentration monitor that was modified and repackaged so as to operate unattended in an aircraft environment. The modifications required for aircraft use are described along with the calibration techniques, the measurement of ozone loss in the sample lines, and the operating procedures that were developed for use in the program. Based on calibrations with JPL's 5-meter ultraviolet photometer, all previously published GASP ozone data are biased high by 9 percent. A system error analysis showed that the total system measurement random error is from 3 to 8 percent of reading (depending on the pump diaphragm material) or 3 ppbv, whichever is greater.
Testing for qualitative heterogeneity: An application to composite endpoints in survival analysis.
Oulhaj, Abderrahim; El Ghouch, Anouar; Holman, Rury R
2017-01-01
Composite endpoints are frequently used in clinical outcome trials to capture more outcome events, thereby increasing statistical power. A key requirement for a composite endpoint to be meaningful is the absence of so-called qualitative heterogeneity, to ensure a valid overall interpretation of any treatment effect identified. Qualitative heterogeneity occurs when individual components of a composite endpoint exhibit differences in the direction of a treatment effect. In this paper, we develop a general statistical method to test for qualitative heterogeneity, that is, to test whether a given set of parameters share the same sign. This method is based on the intersection-union principle and, provided that the sample size is large, is valid whatever the model used for parameter estimation. We propose two versions of our testing procedure, one based on random sampling from a Gaussian distribution and another based on bootstrapping. Our work covers both the case of completely observed data and the case where some observations are censored, which is an important issue in many clinical trials. We evaluated the size and power of our proposed tests by carrying out extensive Monte Carlo simulations in the case of multivariate time-to-event data. The simulations were designed under a variety of conditions on dimensionality, censoring rate, sample size and correlation structure. Our testing procedure showed very good performance in terms of statistical power and type I error. The proposed test was applied to a data set from a single-center, randomized, double-blind controlled trial in the area of Alzheimer's disease.
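A rough sketch of the Gaussian-sampling flavour of such a sign-consistency check (not the authors' implementation; the component estimates, their covariance matrix and the reported quantity are purely illustrative): draw parameter vectors from a normal distribution centred at the estimated component effects and record how often all components share the same sign.

import numpy as np

def sign_consistency(est, cov, n_draws=100_000, seed=0):
    # Monte Carlo proportion of draws in which all components share one sign.
    # est : array of estimated treatment effects for the composite's components
    # cov : their estimated covariance matrix
    rng = np.random.default_rng(seed)
    draws = rng.multivariate_normal(est, cov, size=n_draws)
    return np.mean(np.all(draws > 0, axis=1) | np.all(draws < 0, axis=1))

# Illustrative use: three component effects estimated from some fitted model.
est = np.array([0.30, 0.12, 0.25])
cov = np.diag([0.01, 0.02, 0.015])
print("Proportion of draws with all components sharing the same sign:", sign_consistency(est, cov))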
High resolution 4-D spectroscopy with sparse concentric shell sampling and FFT-CLEAN.
Coggins, Brian E; Zhou, Pei
2008-12-01
Recent efforts to reduce the measurement time for multidimensional NMR experiments have fostered the development of a variety of new procedures for sampling and data processing. We recently described concentric ring sampling for 3-D NMR experiments, which is superior to radial sampling as input for processing by a multidimensional discrete Fourier transform. Here, we report the extension of this approach to 4-D spectroscopy as Randomized Concentric Shell Sampling (RCSS), where sampling points for the indirect dimensions are positioned on concentric shells, and where random rotations in the angular space are used to avoid coherent artifacts. With simulations, we show that RCSS produces a very low level of artifacts, even with a very limited number of sampling points. The RCSS sampling patterns can be adapted to fine rectangular grids to permit use of the Fast Fourier Transform in data processing, without an apparent increase in the artifact level. These artifacts can be further reduced to the noise level using the iterative CLEAN algorithm developed in radioastronomy. We demonstrate these methods on the high resolution 4-D HCCH-TOCSY spectrum of protein G's B1 domain, using only 1.2% of the sampling that would be needed conventionally for this resolution. The use of a multidimensional FFT instead of the slow DFT for initial data processing and for subsequent CLEAN significantly reduces the calculation time, yielding an artifact level that is on par with the level of the true spectral noise.
High Resolution 4-D Spectroscopy with Sparse Concentric Shell Sampling and FFT-CLEAN
Coggins, Brian E.; Zhou, Pei
2009-01-01
Recent efforts to reduce the measurement time for multidimensional NMR experiments have fostered the development of a variety of new procedures for sampling and data processing. We recently described concentric ring sampling for 3-D NMR experiments, which is superior to radial sampling as input for processing by a multidimensional discrete Fourier transform. Here, we report the extension of this approach to 4-D spectroscopy as Randomized Concentric Shell Sampling (RCSS), where sampling points for the indirect dimensions are positioned on concentric shells, and where random rotations in the angular space are used to avoid coherent artifacts. With simulations, we show that RCSS produces a very low level of artifacts, even with a very limited number of sampling points. The RCSS sampling patterns can be adapted to fine rectangular grids to permit use of the Fast Fourier Transform in data processing, without an apparent increase in the artifact level. These artifacts can be further reduced to the noise level using the iterative CLEAN algorithm developed in radioastronomy. We demonstrate these methods on the high resolution 4-D HCCH-TOCSY spectrum of protein G's B1 domain, using only 1.2% of the sampling that would be needed conventionally for this resolution. The use of a multidimensional FFT instead of the slow DFT for initial data processing and for subsequent CLEAN significantly reduces the calculation time, yielding an artifact level that is on par with the level of the true spectral noise. PMID:18853260
Effect of Setting Time on the Shear Bond Strength Between Biodentine and Composite
2015-06-01
Methods: Sample cylinders (n=134) and Biodentine capsules were randomly assigned to groups based on the setting time allowed for Biodentine (Group 1 = 15 minutes, Group 2 = 1 hour, Group 3 = 24 hours, Group 4 = 2 weeks). Biodentine was prepared and placed in the wells of the acrylic cylinders and ... widely used as a temporary intracanal medicament during root canal therapy, as a liner, and for direct and indirect pulp capping procedures. Although
Morrissey, M A; Hill, H H
1989-09-01
A simplified procedure was developed for the determination of 2,4-dichlorophenoxyacetic acid (2,4-D) in soils. Soil samples were separated by supercritical fluid chromatography after extraction, without derivatization and without the use of column chromatography for cleanup. Interferences in the chromatographic separation were eliminated by using a tunably selective ion mobility detector. An atmospheric pressure ion formed by the free acid was selectively monitored so that the detector could respond to 2,4-D in the presence of other electron-capturing compounds. For a randomly chosen soil sample, the level of 2,4-D detected was estimated at 500 ppb.
Hardigan, Patrick C; Popovici, Ioana; Carvajal, Manuel J
2016-01-01
There is a gap between increasing demands from pharmacy journals, publishers, and reviewers for high survey response rates and the actual responses often obtained in the field by survey researchers. Presumably demands have been set high because response rates, times, and costs affect the validity and reliability of survey results. Explore the extent to which survey response rates, average response times, and economic costs are affected by conditions under which pharmacist workforce surveys are administered. A random sample of 7200 U.S. practicing pharmacists was selected. The sample was stratified by delivery method, questionnaire length, item placement, and gender of respondent for a total of 300 observations within each subgroup. A job satisfaction survey was administered during March-April 2012. Delivery method was the only classification showing significant differences in response rates and average response times. The postal mail procedure accounted for the highest response rates of completed surveys, but the email method exhibited the quickest turnaround. A hybrid approach, consisting of a combination of postal and electronic means, showed the least favorable results. Postal mail was 2.9 times more cost effective than the email approach and 4.6 times more cost effective than the hybrid approach. Researchers seeking to increase practicing pharmacists' survey participation and reduce response time and related costs can benefit from the analytical procedures tested here. Copyright © 2016 Elsevier Inc. All rights reserved.
Tommaselli, Giovanni A; Di Carlo, Costantino; Formisano, Carmen; Fabozzi, Annamaria; Nappi, Carmine
2014-08-01
To evaluate the effect of a protocol of local anesthesia and epinephrine associated with sedo-analgesia on post-TVT-O pain, in comparison with infiltration of saline and epinephrine. Forty-two patients undergoing TVT-O were randomized into two groups to receive periurethral infiltration with epinephrine only (group A, n = 21) or with epinephrine plus 1% lidocaine hydrochloride (group B, n = 21). Post-operative pain was assessed using a visual analog scale (VAS) from 0 (absence of pain) to 10 (maximum pain possible) at 1, 6, 12 and 24 h after the procedure. The total amount of analgesia was recorded, and the proportion of women reporting a pain VAS score ≥4 at 1 h after the procedure was calculated. ANOVA for repeated measures with Bonferroni correction, the Student t test for independent samples, the Mann-Whitney U test, the Fisher exact test, or the χ2 test was used, as appropriate. Pain level was significantly lower in group B at 1 h (p = 0.01) and 6 h (p = 0.05) after surgery, but not at 12 and 24 h after the procedure. No significant difference was observed in the proportion of women requesting analgesia or in the total dosage of analgesics between the two groups. A significantly higher proportion of women in group A reported a pain VAS score higher than 4 at 1 h after surgery in comparison with patients in group B. This randomized study seems to indicate that systematic infiltration with local anesthetic before TVT-O positioning may reduce immediate post-operative pain.
Handelsman, D J; Sivananathan, T; Andres, L; Bathur, F; Jayadev, V; Conway, A J
2013-11-01
Semen is collected, preferentially in laboratories, to evaluate male fertility or to cryostore sperm, but such collection facilities have no standard fit-out. It is widely believed, but untested, that providing erotic material (EM) is required to collect semen by masturbation in an unfamiliar environment. To test this assumption, 1520 men (1046 undergoing fertility evaluation and 474 sperm cryostorage, providing 1932 semen collection episodes) consecutively attending the semen laboratory of a major metropolitan teaching hospital for semen analysis were eligible for randomization to be provided or not with printed erotic material (X-rated, soft-core magazines) during semen collection. Randomization was performed by providing magazines in the collection rooms (as a variation on non-standard fit-out) on alternate weeks using a schedule concealed from participants. In the pilot study, men were randomized without seeking consent. In the second part of the study, which continued from the first without interruption, an approved informed consent procedure was added. The primary outcome, the time to collect semen, defined as the time from receiving to returning the sample receptacle, was significantly longer (by ~6%, 14.9 ± 0.3 [mean ± standard error of mean] vs. 14.0 ± 0.2 minutes, p = 0.02) among men provided with EM than among those randomized to not being provided with it. There was no significant increase in the failure to collect semen samples (2.6% overall), nor any difference in age, semen volume, or sperm concentration, output or motility according to whether EM was provided or not. The significantly longer time to collect was evident in the pilot study and in the study overall, but not in the main study where the informed consent procedure was used. This study provides evidence refuting the assumption that EM needs to be provided for semen collection in a laboratory. It also provides an example of a usually unobservable participation bias influencing the outcome of a randomized controlled trial. © 2013 American Society of Andrology and European Academy of Andrology.
Ryan Clarke, P; Frey, Rebecca K; Rhyan, Jack C; McCollum, Matt P; Nol, Pauline; Aune, Keith
2014-03-01
OBJECTIVE--To determine the feasibility of qualifying individuals or groups of Yellowstone National Park bison as free from brucellosis. DESIGN--Cohort study. SAMPLE--Serum, blood, and various samples from live bison and tissues taken at necropsy from 214 bison over 7 years. PROCEDURES--Blood was collected from bison every 30 to 45 days for serologic tests and microbiological culture of blood for Brucella abortus. Seropositive bison were euthanized until all remaining bison had 2 consecutive negative test results. Half the seronegative bison were randomly euthanized, and tissues were collected for bacteriologic culture. The remaining seronegative bison were bred, and blood was tested at least twice per year. Cow-calf pairs were sampled immediately after calving and 6 months after calving for evidence of B abortus. RESULTS--Post-enrollment serial testing for B abortus antibodies revealed no bison that seroconverted after 205 days (first cohort) and 180 days (second cohort). During initial serial testing, 85% of bison seroconverted within 120 days after removal from the infected population. Brucella abortus was not cultured from any euthanized seronegative bison (0/88). After parturition, no cows or calves had a positive test result for B abortus antibodies, nor was B abortus cultured from any samples. CONCLUSIONS AND CLINICAL RELEVANCE--Results suggested it is feasible to qualify brucellosis-free bison from an infected herd following quarantine procedures as published in the USDA APHIS brucellosis eradication uniform methods and rules. Latent infection was not detected in this sample of bison when applying the USDA APHIS quarantine protocol.
Zhou, Hanzhi; Elliott, Michael R; Raghunathan, Trivellore E
2016-06-01
Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in "Delta-V," a key crash severity measure.
Zhou, Hanzhi; Elliott, Michael R.; Raghunathan, Trivellore E.
2017-01-01
Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in “Delta-V,” a key crash severity measure. PMID:29226161
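A compressed sketch of one common form of the weighted finite-population Bayesian bootstrap (a Polya-urn scheme; not the authors' full two-stage implementation, and the case weights and population size below are illustrative): the sampled units are retained and then repeatedly re-drawn, with selection probabilities driven by their case weights, until a synthetic population is built that can subsequently be treated as a simple random sample for imputation.

import numpy as np

def weighted_fpbb(indices, weights, rng):
    # Generate one synthetic population via a weighted Polya urn
    # (finite-population Bayesian bootstrap), given case weights that
    # sum approximately to the population size N.
    n = len(indices)
    N = int(round(weights.sum()))
    counts = np.zeros(n)                    # times each sampled unit has already been re-drawn
    synthetic = list(indices)               # the sampled units are kept in the population
    for k in range(N - n):
        probs = (weights - 1 + counts) / (N - n + k)
        probs = np.clip(probs, 0, None)
        probs /= probs.sum()
        j = rng.choice(n, p=probs)
        counts[j] += 1
        synthetic.append(indices[j])
    return np.array(synthetic)

rng = np.random.default_rng(0)
idx = np.arange(8)                                  # sampled case ids (illustrative)
w = np.array([5., 3., 7., 2., 6., 4., 8., 5.])      # case weights, summing to N = 40
pop = weighted_fpbb(idx, w, rng)
print(len(pop), np.bincount(pop))                   # synthetic population of size N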
Arteagoitia, Iciar; Zumarraga, Mercedes; Dávila, Ricardo; Barbier, Luis; Santamaría, Gorka
2014-01-01
Objectives: To evaluate the effect of different regional anesthetics (articaine with epinephrine versus prilocaine with felypressin) on stress in the extraction of impacted lower third molars in healthy subjects. Study Design: A prospective, single-blind, split-mouth, cross-over randomized study was designed, with a control group. The experimental group consisted of 24 otherwise healthy male volunteers, with two impacted lower third molars which were surgically extracted after inferior alveolar nerve block (regional anesthesia), with a fortnight's interval: the right using 4% articaine with 1:100,000 epinephrine, and the left 3% prilocaine with 1:1,850,000 felypressin. Patients were randomized for the first surgical procedure. To analyze the variation in four stress markers, homovanillic acid, 3-methoxy-4-hydroxyphenylglycol, prolactin and cortisol, 10-mL blood samples were obtained at t = 0, 5, 60, and 120 minutes. The control group consisted of 12 healthy volunteers, who did not undergo either extractions or anesthetic procedures but from whom blood samples were collected and analyzed in the same way. Results: Plasma cortisol increased in the experimental group (multiple range test, P<0.05), the levels being significantly higher when 3% prilocaine with 1:1,850,000 felypressin was used (signed rank test, p<0.0007). There was a significant reduction in homovanillic acid over time in both groups (multiple range test, P<0.05). No significant differences were observed in homovanillic acid, 3-methoxy-4-hydroxyphenylglycol or prolactin concentrations between the experimental and control groups. Conclusions: The effect of regional anesthesia on stress is lower when 4% articaine with 1:100,000 epinephrine is used in this surgical procedure. Key words: Stress markers, epinephrine versus felypressin. PMID:24316704
Pataky, Todd C; Vanrenterghem, Jos; Robinson, Mark A
2015-05-01
Biomechanical processes are often manifested as one-dimensional (1D) trajectories. It has been shown that 1D confidence intervals (CIs) are biased when based on 0D statistical procedures, and the non-parametric 1D bootstrap CI has emerged in the Biomechanics literature as a viable solution. The primary purpose of this paper was to clarify that, for 1D biomechanics datasets, the distinction between 0D and 1D methods is much more important than the distinction between parametric and non-parametric procedures. A secondary purpose was to demonstrate that a parametric equivalent to the 1D bootstrap exists in the form of a random field theory (RFT) correction for multiple comparisons. To emphasize these points we analyzed six datasets consisting of force and kinematic trajectories in one-sample, paired, two-sample and regression designs. Results showed, first, that the 1D bootstrap and other 1D non-parametric CIs were qualitatively identical to RFT CIs, and all were very different from 0D CIs. Second, 1D parametric and 1D non-parametric hypothesis testing results were qualitatively identical for all six datasets. Last, we highlight the limitations of 1D CIs by demonstrating that they are complex, design-dependent, and thus non-generalizable. These results suggest that (i) analyses of 1D data based on 0D models of randomness are generally biased unless one explicitly identifies 0D variables before the experiment, and (ii) parametric and non-parametric 1D hypothesis testing provide an unambiguous framework for analysis when one's hypothesis explicitly or implicitly pertains to whole 1D trajectories. Copyright © 2015 Elsevier Ltd. All rights reserved.
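A small Python sketch of a non-parametric 1D bootstrap confidence band for a mean trajectory (illustrative only, not the authors' code, and the random field theory alternative is not shown): whole curves are resampled subject-wise, and the band is built from the bootstrap distribution of the maximum deviation so that it covers the entire 1D domain simultaneously.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative dataset: 20 subjects, each contributing a 101-node trajectory.
n_subj, n_nodes = 20, 101
t = np.linspace(0, 1, n_nodes)
data = np.sin(2 * np.pi * t) + rng.normal(0, 0.3, (n_subj, n_nodes))

mean = data.mean(axis=0)
n_boot = 2000
max_dev = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, n_subj, n_subj)       # resample whole subjects (whole curves)
    m_b = data[idx].mean(axis=0)
    max_dev[b] = np.max(np.abs(m_b - mean))     # worst deviation anywhere along the curve

h = np.quantile(max_dev, 0.95)                  # simultaneous 95% half-width
lower, upper = mean - h, mean + h               # 1D confidence band for the mean trajectory
print("band half-width:", round(h, 3))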
Hoffman, Steven J; Justicz, Victoria
2016-07-01
To develop and validate a method for automatically quantifying the scientific quality and sensationalism of individual news records. After retrieving 163,433 news records mentioning the Severe Acute Respiratory Syndrome (SARS) and H1N1 pandemics, a maximum entropy model for inductive machine learning was used to identify relationships among 500 randomly sampled news records that correlated with systematic human assessments of their scientific quality and sensationalism. These relationships were then computationally applied to automatically classify 10,000 additional randomly sampled news records. The model was validated by randomly sampling 200 records and comparing human assessments of them to the computer assessments. The computer model correctly assessed the relevance of 86% of news records, the quality of 65% of records, and the sensationalism of 73% of records, as compared to human assessments. Overall, the scientific quality of SARS and H1N1 news media coverage had potentially important shortcomings, but coverage was not too sensationalizing. Coverage slightly improved between the two pandemics. Automated methods can evaluate news records faster, cheaper, and possibly better than humans. The specific procedure implemented in this study can at the very least identify subsets of news records that are far more likely to have particular scientific and discursive qualities. Copyright © 2016 Elsevier Inc. All rights reserved.
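Maximum entropy classification of text is equivalent to multinomial logistic regression over word features; the short sketch below (an illustrative stand-in using scikit-learn, not the study's model or data) trains such a classifier on a handful of hand-labelled records and applies it to unlabelled ones.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Illustrative hand-labelled records: text plus a human quality rating (0 = low, 1 = high).
labelled_texts = ["officials confirm vaccine trial results published in journal",
                  "miracle cure shocks doctors everywhere",
                  "study of 10,000 patients finds modest risk reduction",
                  "you won't believe what this virus does next"]
labels = [1, 0, 1, 0]

# Maximum entropy model == multinomial logistic regression on word features.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression(max_iter=1000))
model.fit(labelled_texts, labels)

# Apply the trained model to new, unlabelled news records.
unlabelled = ["randomized trial reports small benefit", "shocking pandemic secret revealed"]
print(model.predict(unlabelled), model.predict_proba(unlabelled))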
Portella, Claudio Elidio; Silva, Julio Guilherme; Bastos, Victor Hugo; Machado, Dionis; Cunha, Marlo; Cagy, Maurício; Basile, Luis; Piedade, Roberto; Ribeiro, Pedro
2006-06-01
The objective of the present study was to evaluate attentional, motor and electroencephalographic (EEG) parameters during a procedural task when subjects had ingested 6 mg of bromazepam. The sample consisted of 26 healthy subjects, male or female, between 19 and 36 years of age. The control (placebo) and experimental (bromazepam 6 mg) groups performed a typewriting task in a randomized, double-blind design. The findings did not show significant differences in attentional and motor measures between groups. Coherence measures (qEEG) were evaluated between scalp regions in the theta, alpha and beta bands. A first analysis revealed a main effect for condition (two-way ANOVA: condition versus blocks). A second two-way ANOVA (condition versus scalp regions) showed a main effect for both factors. The coherence measure was not a sensitive tool for demonstrating differences between cortical areas as a function of procedural learning.
Kheur, Mohit G; Kheur, Supriya; Lakha, Tabrez; Jambhekar, Shantanu; Le, Bach; Jain, Vinay
2018-04-01
The absence of an adequate volume of bone at implant sites requires augmentation procedures before the placement of implants. The aim of the present study was to assess the ridge width gain with the use of allografts and biphasic β-tricalcium phosphate with hydroxyapatite (alloplast) in ridge split procedures, when each were used in small (0.25 to 1 mm) and large (1 to 2 mm) particle sizes. A randomized controlled trial of 23 subjects with severe atrophy of the mandible in the horizontal dimension was conducted in a private institute. The patients underwent placement of 49 dental implants after a staged ridge split procedure. The patients were randomly allocated to alloplast and allograft groups (predictor variable). In each group, the patients were randomly assigned to either small graft particle or large graft particle size (predictor variable). The gain in ridge width (outcome variable) was assessed before implant placement. A 2-way analysis of variance test and the Student unpaired t test were used for evaluation of the ridge width gain between the allograft and alloplast groups (predictor variable). Differences were considered significant if P values were < .05. The sample included 23 patients (14 men and 9 women). The patients were randomly allocated to the alloplast (n = 11) or allograft (n = 12) group before the ridge split procedure. In each group, they were assigned to a small graft particle or large graft particle size (alloplast group, small particle in 5 and large particle size in 6 patients; allograft group, small particle in 6 and large particle size in 6). A statistically significant difference was observed between the 2 graft types. The average ridge width gain was significantly greater in the alloplast group (large, 4.40 ± 0.24 mm; small, 3.52 ± 0.59 mm) than in the allograft group (large, 3.82 ± 0.19 mm; small, 2.57 ± 0.16 mm). For both graft types (alloplast and allograft), the large particle size graft resulted in a greater ridge width gain compared with the small particle size graft (P < .05). Within the limitations of the present study, we suggest the use of large particle alloplast as the graft material of choice for staged ridge split procedures in the posterior mandible. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Carlson, Mike; Jackson, Jeanne; Mandel, Deborah; Blanchard, Jeanine; Holguin, Jess; Lai, Mei-Ying; Marterella, Abbey; Vigen, Cheryl; Gleason, Sarah; Lam, Claudia; Azen, Stan; Clark, Florence
2014-04-01
The purpose of this study was to document predictors of long-term retention among minority participants in the Well Elderly 2 Study, a randomized controlled trial of a lifestyle intervention for community-dwelling older adults. The primary sample included 149 African American and 92 Hispanic men and women aged 60 to 95 years, recruited at senior activity centers and senior residences. Chi-square and logistic regression procedures were undertaken to examine study-based, psychosocial and health-related predictors of retention at 18 months following study entry. For both African Americans and Hispanics, intervention adherence was the strongest predictor. Retention was also related to high active coping and average (vs. high or low) levels of activity participation among African Americans and high social network strength among Hispanics. The results suggest that improved knowledge of the predictors of retention among minority elders can spawn new retention strategies that can be applied at individual, subgroup, and sample-wide levels.
On the predictivity of pore-scale simulations: Estimating uncertainties with multilevel Monte Carlo
NASA Astrophysics Data System (ADS)
Icardi, Matteo; Boccardo, Gianluca; Tempone, Raúl
2016-09-01
A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can be, in fact, hindered by many factors including sample heterogeneity, computational and imaging limitations, model inadequacy and not perfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another "equivalent" sample and setup). The stochastic nature can arise due to the multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost needed for computing accurate statistics of effective parameters and other quantities of interest, under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error, and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is done by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A totally automatic workflow is developed in an open-source code [1], which includes rigid body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers, and extrapolation and post-processing techniques. The proposed method can be efficiently used in many porous media applications for problems such as stochastic homogenization/upscaling, propagation of uncertainty from microscopic fluid and rock properties to macro-scale parameters, and robust estimation of the Representative Elementary Volume size for arbitrary physics.
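The multilevel Monte Carlo idea can be sketched generically as below (a toy stand-in: the "model" is an arbitrary function with level-dependent resolution rather than a pore-scale solver, and the per-level sample counts are fixed instead of optimally allocated): most samples are taken on cheap coarse levels, and only a few expensive fine-level samples are needed to correct the bias through the telescoping sum.

import numpy as np

rng = np.random.default_rng(0)

def model(theta, level):
    # Toy level-dependent approximation of a quantity of interest:
    # finer levels (larger `level`) have smaller discretization error.
    h = 2.0 ** (-level)                     # mesh size shrinks with level
    return np.sin(theta) + h * np.cos(3 * theta)

def mlmc_estimate(levels=4, n_per_level=(4000, 2000, 1000, 500)):
    total = 0.0
    for l in range(levels):
        theta = rng.uniform(0, np.pi, n_per_level[l])   # random input (e.g. a geometry parameter)
        fine = model(theta, l)
        coarse = model(theta, l - 1) if l > 0 else 0.0  # same inputs on the coarser level
        total += np.mean(fine - coarse)                 # telescoping-sum correction
    return total

print("MLMC estimate:", mlmc_estimate())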
Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard
2007-01-01
Background Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. Methods We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. Application We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. Conclusion This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy. PMID:17543100
Vallée, Julie; Souris, Marc; Fournet, Florence; Bochaton, Audrey; Mobillion, Virginie; Peyronnie, Karine; Salem, Gérard
2007-06-01
Geographical objectives and probabilistic methods are difficult to reconcile in a unique health survey. Probabilistic methods focus on individuals to provide estimates of a variable's prevalence with a certain precision, while geographical approaches emphasise the selection of specific areas to study interactions between spatial characteristics and health outcomes. A sample selected from a small number of specific areas creates statistical challenges: the observations are not independent at the local level, and this results in poor statistical validity at the global level. Therefore, it is difficult to construct a sample that is appropriate for both geographical and probability methods. We used a two-stage selection procedure with a first non-random stage of selection of clusters. Instead of randomly selecting clusters, we deliberately chose a group of clusters, which as a whole would contain all the variation in health measures in the population. As there was no health information available before the survey, we selected a priori determinants that can influence the spatial homogeneity of the health characteristics. This method yields a distribution of variables in the sample that closely resembles that in the overall population, something that cannot be guaranteed with randomly-selected clusters, especially if the number of selected clusters is small. In this way, we were able to survey specific areas while minimising design effects and maximising statistical precision. We applied this strategy in a health survey carried out in Vientiane, Lao People's Democratic Republic. We selected well-known health determinants with unequal spatial distribution within the city: nationality and literacy. We deliberately selected a combination of clusters whose distribution of nationality and literacy is similar to the distribution in the general population. This paper describes the conceptual reasoning behind the construction of the survey sample and shows that it can be advantageous to choose clusters using reasoned hypotheses, based on both probability and geographical approaches, in contrast to a conventional, random cluster selection strategy.
Menlove, Howard O.; Stewart, James E.
1986-01-01
Apparatus and method for the direct, nondestructive evaluation of the ²³⁵U nuclide content of samples containing UF₆, UF₄, or UO₂ utilizing the passive neutron self-interrogation of the sample resulting from the intrinsic production of neutrons therein. The ratio of the emitted neutron coincidence count rate to the total emitted neutron count rate is determined and yields a measure of the bulk fissile mass. The accuracy of the method is 6.8% (1σ) for cylinders containing UF₆ with enrichments ranging from 6% to 98% with measurement times varying from 3-6 min. The samples contained from below 1 kg to greater than 16 kg. Since the subject invention relies on fast neutron self-interrogation, complete sampling of the UF₆ takes place, reducing difficulties arising from inhomogeneity of the sample which adversely affects other assay procedures.
Menlove, H.O.; Stewart, J.E.
1985-02-04
Apparatus and method for the direct, nondestructive evaluation of the ²³⁵U nuclide content of samples containing UF₆, UF₄, or UO₂ utilizing the passive neutron self-interrogation of the sample resulting from the intrinsic production of neutrons therein. The ratio of the emitted neutron coincidence count rate to the total emitted neutron count rate is determined and yields a measure of the bulk fissile mass. The accuracy of the method is 6.8% (1σ) for cylinders containing UF₆ with enrichments ranging from 6% to 98% with measurement times varying from 3-6 min. The samples contained from below 1 kg to greater than 16 kg. Since the subject invention relies on fast neutron self-interrogation, complete sampling of the UF₆ takes place, reducing difficulties arising from inhomogeneity of the sample which adversely affects other assay procedures. 4 figs., 1 tab.
In-Person versus Telehealth Assessment of Discourse Ability in Adults with Traumatic Brain Injury
Turkstra, Lyn S.; Quinn-Padron, Maura; Johnson, Jacqueline E.; Workinger, Marilyn S.; Antoniotti, Nina
2011-01-01
Objectives To compare in-person (IP) vs. telehealth (TH) assessment of discourse ability in adults with chronic traumatic brain injury (TBI). Design Repeated-measures design with random order of conditions. Participants Twenty adults with moderate-to-severe TBI. Method Participants completed conversation, picture description, story-generation, and procedural description tasks. Sessions were video-recorded and transcribed. Measures Measures of productivity and quality of discourse. Results Significant differences between conditions were not detected in this sample, and feedback from participants was positive. Conclusions These preliminary results support the use of TH for the assessment of discourse ability in adults with TBI, at least for individuals with sufficient cognitive skills to follow TH procedures. PMID:22190010
Technical note: Alternatives to reduce adipose tissue sampling bias.
Cruz, G D; Wang, Y; Fadel, J G
2014-10-01
Understanding the mechanisms by which nutritional and pharmaceutical factors can manipulate adipose tissue growth and development in production animals has direct and indirect effects on the profitability of an enterprise. Adipocyte cellularity (number and size) is a key biological response that is commonly measured in animal science research. The variability and sampling of adipocyte cellularity within a muscle have been addressed in previous studies, but these issues have not been critically investigated in the literature. The present study evaluated 2 sampling techniques (random and systematic) in an attempt to minimize sampling bias and to determine the minimum number of samples, from 1 to 15, needed to represent the overall adipose tissue in the muscle. Both sampling procedures were applied to adipose tissue samples dissected from 30 longissimus muscles from cattle finished either on grass or grain. Briefly, adipose tissue samples were fixed with osmium tetroxide, and the size and number of adipocytes were determined with a Coulter Counter. These results were then fit to a finite mixture model to obtain distribution parameters for each sample. To evaluate the benefit of increasing the number of samples and the advantage of the new sampling technique, the concept of acceptance ratio was used; simply stated, the higher the acceptance ratio, the better the representation of the overall population. As expected, a great improvement in the estimation of the overall adipocyte cellularity parameters was observed with both sampling techniques when the number of samples increased from 1 to 15, with the acceptance ratio of both techniques increasing from approximately 3% to 25%. When comparing sampling techniques, the systematic procedure slightly improved parameter estimation. The results suggest that more detailed research using other sampling techniques may provide better estimates for minimum sampling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sistine, Angela Van; Salzer, John J.; Janowiecki, Steven
2016-06-10
The ALFALFA Hα survey utilizes a large sample of H i-selected galaxies from the ALFALFA survey to study star formation (SF) in the local universe. ALFALFA Hα contains 1555 galaxies with distances between ∼20 and ∼100 Mpc. We have obtained continuum-subtracted narrowband Hα images and broadband R images for each galaxy, creating one of the largest homogeneous sets of Hα images ever assembled. Our procedures were designed to minimize the uncertainties related to the calculation of the local SF rate density (SFRD). The galaxy sample we constructed is as close to volume-limited as possible, is a robust statistical sample, and spans a wide range of galaxy environments. In this paper, we discuss the properties of our Fall sample of 565 galaxies, our procedure for deriving individual galaxy SF rates, and our method for calculating the local SFRD. We present a preliminary value of log(SFRD [M⊙ yr⁻¹ Mpc⁻³]) = −1.747 ± 0.018 (random) ± 0.05 (systematic) based on the 565 galaxies in our Fall sub-sample. Compared to the weighted average of SFRD values around z ≈ 2, our local value indicates a drop in the global SFRD of a factor of 10.2 over that lookback time.
Judd, Charles M; Westfall, Jacob; Kenny, David A
2012-07-01
Throughout social and cognitive psychology, participants are routinely asked to respond in some way to experimental stimuli that are thought to represent categories of theoretical interest. For instance, in measures of implicit attitudes, participants are primed with pictures of specific African American and White stimulus persons sampled in some way from possible stimuli that might have been used. Yet seldom is the sampling of stimuli taken into account in the analysis of the resulting data, in spite of numerous warnings about the perils of ignoring stimulus variation (Clark, 1973; Kenny, 1985; Wells & Windschitl, 1999). Part of this failure to attend to stimulus variation is due to the demands imposed by traditional analysis of variance procedures for the analysis of data when both participants and stimuli are treated as random factors. In this article, we present a comprehensive solution using mixed models for the analysis of data with crossed random factors (e.g., participants and stimuli). We show the substantial biases inherent in analyses that ignore one or the other of the random factors, and we illustrate the substantial advantages of the mixed models approach with both hypothetical and actual, well-known data sets in social psychology (Bem, 2011; Blair, Chapleau, & Judd, 2005; Correll, Park, Judd, & Wittenbrink, 2002). PsycINFO Database Record (c) 2012 APA, all rights reserved
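In Python, the crossed-random-factors analysis described above can be approximated with statsmodels by placing the whole dataset in a single group and declaring participants and stimuli as variance components, a common workaround for crossed random intercepts; the data frame below is simulated and all column names are hypothetical.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical fully crossed design: every participant responds to every stimulus.
participants, stimuli = 30, 20
df = pd.DataFrame([(p, s, int(s < stimuli // 2)) for p in range(participants)
                   for s in range(stimuli)], columns=["participant", "stimulus", "condition"])
p_eff = rng.normal(0, 0.5, participants)
s_eff = rng.normal(0, 0.5, stimuli)
df["rt"] = (1.0 + 0.3 * df["condition"] + p_eff[df["participant"]]
            + s_eff[df["stimulus"]] + rng.normal(0, 0.4, len(df)))

# Crossed random intercepts for participants and stimuli, fixed effect of condition.
df["all"] = 1  # single all-encompassing group so both factors enter as crossed variance components
model = smf.mixedlm("rt ~ condition", df, groups="all", re_formula="0",
                    vc_formula={"participant": "0 + C(participant)",
                                "stimulus": "0 + C(stimulus)"})
print(model.fit(method="lbfgs").summary())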
Steimer, Andreas; Schindler, Kaspar
2015-01-01
Oscillations between high and low values of the membrane potential (UP and DOWN states, respectively) are a ubiquitous feature of cortical neurons during slow wave sleep and anesthesia. Nevertheless, only a surprisingly small number of quantitative studies deal with this phenomenon's implications for computation. Here we present a novel theory that explains on a detailed mathematical level the computational benefits of UP states. The theory is based on random sampling by means of interspike intervals (ISIs) of the exponential integrate and fire (EIF) model neuron, such that each spike is considered a sample whose analog value corresponds to the spike's preceding ISI. As we show, the EIF's exponential sodium current, which kicks in when balancing a noisy membrane potential around values close to the firing threshold, leads to a particularly simple, approximate relationship between the neuron's ISI distribution and input current. Approximation quality depends on the frequency spectrum of the current and is improved upon increasing the voltage baseline towards threshold. Thus, the conceptually simpler leaky integrate and fire neuron, which lacks such an additional current boost, performs consistently worse than the EIF and does not improve when the voltage baseline is increased. For the EIF, in contrast, the presented mechanism is particularly effective in the high-conductance regime, which is a hallmark feature of UP states. Our theoretical results are confirmed by accompanying simulations, which were conducted for input currents of varying spectral composition. Moreover, we provide analytical estimates of the range of ISI distributions the EIF neuron can sample from at a given approximation level. Such samples may be considered by any algorithmic procedure that is based on random sampling, such as Markov Chain Monte Carlo or message-passing methods. Finally, we explain how spike-based random sampling relates to existing computational theories about UP states during slow wave sleep and present possible extensions of the model in the context of spike-frequency adaptation. PMID:26203657
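A bare-bones Euler simulation of an exponential integrate-and-fire neuron driven by a noisy current, collecting ISIs as the "samples" the theory refers to (the parameter values and noise scaling are arbitrary placeholders, not those of the paper):

import numpy as np

rng = np.random.default_rng(0)

# Exponential integrate-and-fire (EIF) parameters -- illustrative values only.
tau, e_l, v_t, delta_t = 20.0, -65.0, -50.0, 2.0   # ms, mV, mV, mV
v_cut, v_reset = -30.0, -65.0                      # spike detection and reset (mV)
dt, t_max = 0.05, 5000.0                           # time step and duration (ms)
i_mean, i_sigma = 16.0, 8.0                        # noisy input current (in mV equivalent)

v, last_spike, isis = e_l, 0.0, []
for step in range(int(t_max / dt)):
    i_t = i_mean + i_sigma * rng.normal() / np.sqrt(dt)   # white-noise input
    dv = (-(v - e_l) + delta_t * np.exp((v - v_t) / delta_t) + i_t) / tau
    v += dv * dt
    if v >= v_cut:                                 # spike: record the ISI and reset
        t_now = step * dt
        isis.append(t_now - last_spike)
        last_spike, v = t_now, v_reset

isis = np.array(isis[1:])                          # drop the first, possibly atypical, interval
print(len(isis), "ISIs; mean =", round(isis.mean(), 2), "ms")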
Budnik, Lygia T; Fahrenholtz, Svea; Kloth, Stefan; Baur, Xaver
2010-04-01
Protection against infestation of a container cargo by alien species is achieved by mandatory fumigation with pesticides. Most of the effective fumigants are methyl and ethyl halide gases that are highly toxic and are a risk to both human health and the environment. There is a worldwide need for a reliable and robust analytical screening procedure for these volatile chemicals in a multitude of health and environmental scenarios. We have established a highly sensitive broad spectrum mass spectrometry method combined with thermal desorption gas chromatography to detect, identify and quantify volatile pesticide residues. Using this method, 1201 random ambient air samples taken from freight containers arriving at the biggest European ports of Hamburg and Rotterdam were analyzed over a period of two and a half years. This analytical procedure is a valuable strategy to measure air pollution from these hazardous chemicals, to help in the identification of pesticides in the new mixtures/formulations that are being adopted globally and to analyze expired breath samples after suspected intoxication in biomonitoring.
Estimation of divergence from Hardy-Weinberg form.
Stark, Alan E
2015-08-01
The Hardy–Weinberg (HW) principle explains how random mating (RM) can produce and maintain a population in equilibrium, that is, with constant genotypic proportions. When proportions diverge from HW form, it is of interest to estimate the fixation index F, which reflects the degree of divergence. Starting from a sample of genotypic counts, a mixed procedure gives first the orthodox estimate of gene frequency q and then a Bayesian estimate of F, based on a credible prior distribution of F, which is described here.
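A hedged sketch of the general idea, not Stark's exact mixed procedure: the allele frequency q is estimated by simple gene counting, and a discretized posterior for F is then computed from a multinomial likelihood and a user-supplied prior. The genotype counts and the flat default prior below are illustrative assumptions.

```python
# Hedged sketch (not Stark's exact procedure): gene-counting estimate of q,
# then a grid posterior for the fixation index F under a multinomial
# likelihood, with genotype probabilities
#   P(AA) = p^2 + Fpq,  P(Aa) = 2pq(1 - F),  P(aa) = q^2 + Fpq.
import numpy as np

def estimate_q_and_F(n_AA, n_Aa, n_aa, prior=None, grid=2001):
    n = n_AA + n_Aa + n_aa
    q = (n_Aa + 2 * n_aa) / (2 * n)        # "orthodox" gene-counting estimate of q
    p = 1.0 - q

    # F must keep all three genotype probabilities non-negative.
    f_min = -min(p / q, q / p)
    F = np.linspace(f_min, 1.0, grid)

    pAA = p ** 2 + F * p * q
    pAa = 2 * p * q * (1.0 - F)
    paa = q ** 2 + F * p * q
    log_like = (n_AA * np.log(pAA + 1e-300)
                + n_Aa * np.log(pAa + 1e-300)
                + n_aa * np.log(paa + 1e-300))

    weight = np.ones_like(F) if prior is None else prior(F)   # prior over F
    post = np.exp(log_like - log_like.max()) * weight
    post /= post.sum()
    return q, F, post

q, F, post = estimate_q_and_F(n_AA=50, n_Aa=30, n_aa=20)
print(f"q = {q:.3f}, posterior mean of F = {np.dot(F, post):.3f}")
```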
Bridoux, Valerie; Regimbeau, Jean Marc; Ouaissi, Mehdi; Mathonnet, Muriel; Mauvais, Francois; Houivet, Estelle; Schwarz, Lilian; Mege, Diane; Sielezneff, Igor; Sabbagh, Charles; Tuech, Jean-Jacques
2017-12-01
About 25% of patients with acute diverticulitis require emergency intervention. Currently, most patients with diverticular peritonitis undergo a Hartmann's procedure. Our objective was to assess whether primary anastomosis (PA) with a diverting stoma results in lower mortality rates than Hartmann's procedure (HP) in patients with diverticular peritonitis. We conducted a multicenter randomized controlled trial between June 2008 and May 2012: the DIVERTI (Primary vs Secondary Anastomosis for Hinchey Stage III-IV Diverticulitis) trial. Follow-up duration was up to 18 months. A random sample of 102 eligible participants with purulent or fecal diverticular peritonitis from tertiary care referral centers and associated centers in France was equally randomized to either a PA arm or an HP arm. Data were analyzed on an intention-to-treat basis. The primary end point was mortality rate at 18 months. Secondary outcomes were postoperative complications, operative time, length of hospital stay, rate of definitive stoma, and morbidity. All 102 patients enrolled were comparable for age (p = 0.4453), sex (p = 0.2347), Hinchey stage III vs IV (p = 0.2347), and Mannheim Peritonitis Index (p = 0.0606). Overall mortality did not differ significantly between HP (7.7%) and PA (4%) (p = 0.4233). Morbidity for both resection and stoma reversal operations was comparable (39% in the HP arm vs 44% in the PA arm; p = 0.4233). At 18 months, 96% of PA patients and 65% of HP patients had a stoma reversal (p = 0.0001). Although mortality was similar in both arms, the rate of stoma reversal was significantly higher in the PA arm. This trial provides additional evidence in favor of PA with diverting ileostomy over HP in patients with diverticular peritonitis. ClinicalTrials.gov Identifier: NCT 00692393. Copyright © 2017. Published by Elsevier Inc.
Kudo, Taiki; Kawakami, Hiroshi; Hayashi, Tsuyoshi; Yasuda, Ichiro; Mukai, Tsuyoshi; Inoue, Hiroyuki; Katanuma, Akio; Kawakubo, Kazumichi; Ishiwatari, Hirotoshi; Doi, Shinpei; Yamada, Reiko; Maguchi, Hiroyuki; Isayama, Hiroyuki; Mitsuhashi, Tomoko; Sakamoto, Naoya
2014-12-01
EUS-guided FNA (EUS-FNA) has a high diagnostic accuracy for pancreatic diseases. However, although most reports have typically focused on cytology, histological tissue quality has rarely been investigated. The effectiveness of EUS-FNA combined with high negative pressure (HNP) suction was recently indicated for tissue acquisition, but has not thus far been tested in a prospective, randomized clinical trial. To evaluate the adequacy of EUS-FNA with HNP for the histological diagnosis of pancreatic lesions by using 25-gauge needles. Prospective, single-blind, randomized, controlled crossover trial. Seven tertiary referral centers. Patients referred for EUS-FNA of pancreatic solid lesions. From July 2011 to April 2012, 90 patients underwent EUS-FNA of pancreatic solid masses by using normal negative pressure (NNP) and HNP with 2 respective passes. The order of the passes was randomized, and the sample adequacy, quality, and histology were evaluated by a single expert pathologist. EUS-FNA by using NNP and HNP. The adequacy of tissue acquisition and the accuracy of histological diagnoses made by using the EUS-FNA technique with HNP. We found that 72.2% (65/90) and 90% (81/90) of the specimens obtained using NNP and HNP, respectively, were adequate for histological diagnosis (P = .0003, McNemar test). For 73.3% (66/90) and 82.2% (74/90) of the specimens obtained by using NNP and HNP, respectively, an accurate diagnosis was achieved (P = .06, McNemar test). Pancreatitis developed in 1 patient after this procedure, which subsided with conservative therapy. This was a single-blinded, crossover study. Biopsy procedures that combine the EUS-FNA with HNP techniques are superior to EUS-FNA with NNP procedures for tissue acquisition. ( UMIN000005939.). Copyright © 2014 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Golino, Hudson F.; Gomes, Cristiano M. A.
2016-01-01
This paper presents a non-parametric imputation technique, named random forest, from the machine learning field. The random forest procedure has two main tuning parameters: the number of trees grown in the prediction and the number of predictors used. Fifty experimental conditions were created in the imputation procedure, with different…
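The record above is truncated, but the general approach can be sketched in a missForest-like way with scikit-learn; this is an assumption about the technique, not the authors' procedure. The two tuning parameters mentioned correspond here to n_estimators (number of trees) and max_features (number of predictors tried at each split).

```python
# A missForest-style sketch of random-forest imputation with scikit-learn
# (an assumption about the general approach, not the authors' exact procedure).
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
X[:, 4] = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

# Knock out 10% of the entries completely at random.
mask = rng.random(X.shape) < 0.10
X_missing = X.copy()
X_missing[mask] = np.nan

imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=100, max_features="sqrt",
                                    random_state=0),
    max_iter=10, random_state=0,
)
X_imputed = imputer.fit_transform(X_missing)
rmse = np.sqrt(np.mean((X_imputed[mask] - X[mask]) ** 2))
print(f"Imputation RMSE on the masked entries: {rmse:.3f}")
```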
NASA Astrophysics Data System (ADS)
Meirovitch, Hagai
1985-12-01
The scanning method proposed by us [J. Phys. A 15, L735 (1982); Macromolecules 18, 563 (1985)] for simulation of polymer chains is further developed and applied, for the first time, to a model with finite interactions. In addition to "importance sampling," we remove the bias introduced by the scanning method with a procedure suggested recently by Schmidt [Phys. Rev. Lett. 51, 2175 (1983)]; this procedure has the advantage of enabling one to estimate the statistical error. We find these two procedures to be equally efficient. The model studied is an N-step random walk on a lattice, in which a random walk i has a statistical weight p^{M_i}, where p < 1 is an attractive energy parameter and M_i is the number of distinct sites visited by walk i. This model, which corresponds to a model of random walks moving in a medium with randomly distributed static traps, has been solved analytically for N → ∞ for any dimension d by Donsker and Varadhan (DV) and by others.
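A small simple-sampling sketch of the trapping model itself (not of the scanning method): N-step random walks on a square lattice are generated and each is weighted by p^{M_i}, with M_i the number of distinct sites visited. The walk length, value of p, and number of walks are illustrative.

```python
# Simple-sampling sketch of the model (not the scanning method): N-step random
# walks on a square lattice, each weighted by p**M_i, where M_i is the number
# of distinct sites the walk visits. Parameter values are illustrative.
import random

def weighted_walks(n_walks=2000, n_steps=50, p=0.9, seed=0):
    rng = random.Random(seed)
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    total_w = weighted_m = 0.0
    for _ in range(n_walks):
        x = y = 0
        visited = {(0, 0)}
        for _ in range(n_steps):
            dx, dy = rng.choice(steps)
            x, y = x + dx, y + dy
            visited.add((x, y))
        w = p ** len(visited)            # statistical weight p**M_i of this walk
        total_w += w
        weighted_m += w * len(visited)
    return weighted_m / total_w          # weighted average number of distinct sites

print(f"weighted mean number of distinct sites: {weighted_walks():.1f}")
```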
Resampling methods in Microsoft Excel® for estimating reference intervals
Theodorsson, Elvar
2015-01-01
Computer-intensive resampling/bootstrap methods are feasible when calculating reference intervals from non-Gaussian or small reference samples. Microsoft Excel® in version 2010 or later includes native functions which lend themselves well to this purpose, including recommended interpolation procedures for estimating the 2.5 and 97.5 percentiles. The purpose of this paper is to introduce the reader to resampling estimation techniques in general, and to the use of Microsoft Excel® 2010 for estimating reference intervals in particular. Parametric methods are preferable to resampling methods when the distribution of observations in the reference samples is Gaussian or can be transformed to that distribution, even when the number of reference samples is less than 120. Resampling methods are appropriate when the distribution of data from the reference samples is non-Gaussian and when the number of reference individuals and corresponding samples is on the order of 40. At least 500-1000 random samples with replacement should be taken from the results of measurement of the reference samples. PMID:26527366
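The paper implements this in Excel; the same percentile-bootstrap idea can be sketched in Python as follows, with a simulated set of about 40 reference values standing in for real measurements.

```python
# Percentile-bootstrap sketch of a reference interval (the paper does this in
# Excel); the reference values here are simulated.
import numpy as np

rng = np.random.default_rng(42)
reference = rng.lognormal(mean=1.0, sigma=0.3, size=40)   # ~40 reference samples

n_boot = 1000   # at least 500-1000 resamples with replacement, as recommended
lower, upper = [], []
for _ in range(n_boot):
    resample = rng.choice(reference, size=reference.size, replace=True)
    lower.append(np.percentile(resample, 2.5))
    upper.append(np.percentile(resample, 97.5))

print(f"Reference interval ≈ ({np.mean(lower):.2f}, {np.mean(upper):.2f})")
```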
Ryeznik, Yevgen; Sverdlov, Oleksandr
2018-06-04
Randomization designs for multiarm clinical trials are increasingly used in practice, especially in phase II dose-ranging studies. Many new methods have been proposed in the literature; however, there is a lack of systematic, head-to-head comparison of the competing designs. In this paper, we systematically investigate statistical properties of various restricted randomization procedures for multiarm trials with fixed and possibly unequal allocation ratios. The design operating characteristics include measures of allocation balance, randomness of treatment assignments, variations in the allocation ratio, and statistical characteristics such as type I error rate and power. The results from the current paper should help clinical investigators select an appropriate randomization procedure for their clinical trial. We also provide a web-based R shiny application that can be used to reproduce all results in this paper and run simulations under additional user-defined experimental scenarios. Copyright © 2018 John Wiley & Sons, Ltd.
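As a sketch of one of the simplest restricted randomization procedures covered by such comparisons, the following generates a permuted-block sequence for a three-arm trial with a fixed, unequal 2:1:1 allocation ratio; the arms and ratio are illustrative, not taken from the paper.

```python
# Sketch of one simple restricted randomization procedure: permuted blocks for
# a three-arm trial with a fixed, unequal 2:1:1 allocation ratio.
import random

def permuted_block_sequence(n_patients, arms=("A", "B", "C"), ratio=(2, 1, 1), seed=0):
    rng = random.Random(seed)
    block = [arm for arm, r in zip(arms, ratio) for _ in range(r)]   # one block = A A B C
    sequence = []
    while len(sequence) < n_patients:
        rng.shuffle(block)               # randomize the order within each block
        sequence.extend(block)
    return sequence[:n_patients]

seq = permuted_block_sequence(20)
print(seq)
print({arm: seq.count(arm) for arm in "ABC"})
```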
1981-06-01
normality and several types of nonnormality. Overall the rank transformation procedure seems to be the best. The Fisher's LSD multiple comparisons procedure...the rank transformation procedure appears to maintain power better than Fisher's LSD or the randomization procedures. The conclusion of this study...best. The Fisher's LSD multiple comparisons procedure in the one-way and two-way layouts is compared with a randomization procedure and with the same
Randomized study of surgical prophylaxis in immunocompromised hosts.
Lopes, D R; Peres, M P S M; Levin, A S
2011-02-01
Although prophylaxis is current practice, there are no randomized controlled studies evaluating preoperative antimicrobial prophylaxis in dental procedures in patients immunocompromised by chemotherapy or organ transplants. To evaluate prophylaxis in invasive dental procedures in patients with cancer or solid organ transplants, 414 patients were randomized to receive one oral 500-mg dose 2 hours before the procedure (1-dose group) or a 500-mg dose 2 hours before the procedure and an additional dose 8 hours later (2-dose group). Procedures were exodontia or periodontal scaling/root planing. Follow-up was 4 weeks. No deaths or surgical site infections occurred. Six patients (1.4%) presented with use of pain medication > 3 days or hospitalization during follow-up: 4 of 207 (2%) in the 1-dose group and 2 of 207 (1%) in the 2-dose group (relative risk, 2.02; 95% confidence interval, 0.37-11.15). In conclusion, no statistically significant difference occurred in outcome using 1 or 2 doses of prophylactic amoxicillin for invasive dental procedures in immunocompromised patients.
Esteves, A; Patarata, L; Aymerich, T; Garriga, M; Martins, C
2007-03-01
Sources and tracing of Staphylococcus aureus in alheira (garlic sausage) production were evaluated by multifactorial correspondence analysis (MCA) of occurrence data and a random amplified polymorphic DNA (RAPD) on S. aureus isolates. Samples from four production lines, four different production batches, and 14 different sampling sites (including raw material, different contact surfaces, and several stages of alheira manufacturing) were analyzed at four sampling times. From the 896 microbial analyses completed, a collection of 170 S. aureus isolates was obtained. Although analysis of the occurrence data alone was not elucidative enough, MCA and RAPD-PCR were able to assess the sources of contamination and to trace the spread of this microorganism along the production lines. MCA results indicated that the presence of S. aureus in alheira was related to its presence in the intermediate manufacturing stages after heat treatment but before stuffing in the casings. It was also possible to associate a cross-contamination path related to handler procedures. RAPD-PCR typing in accordance to MCA results confirmed the cross-contamination path between the raw material and casings and the role of handlers as an important cross-contamination vehicle.
Carbon monoxide measurement in the global atmospheric sampling program
NASA Technical Reports Server (NTRS)
Dudzinski, T. J.
1979-01-01
The carbon monoxide measurement system used in the NASA Global Atmospheric Sampling Program (GASP) is described. The system used a modified version of a commercially available infrared absorption analyzer. The modifications increased the sensitivity of the analyzer to 1 ppmv full scale, with a limit of detectability of 0.02 ppmv. Packaging was modified for automatic, unattended operation in an aircraft environment. The GASP system is described along with analyzer operation, calibration procedures, and measurement errors. Uncertainty of the CO measurement over a 2-year period ranged from ±3 to ±13 percent of reading, plus an error due to random fluctuation of the output signal of ±3 to ±15 ppbv.
A DNA fingerprinting procedure for ultra high-throughput genetic analysis of insects.
Schlipalius, D I; Waldron, J; Carroll, B J; Collins, P J; Ebert, P R
2001-12-01
Existing procedures for the generation of polymorphic DNA markers are not optimal for insect studies in which the organisms are often tiny and background molecular information is often non-existent. We have used a new high throughput DNA marker generation protocol called randomly amplified DNA fingerprints (RAF) to analyse the genetic variability in three separate strains of the stored grain pest, Rhyzopertha dominica. This protocol is quick, robust and reliable even though it requires minimal sample preparation, minute amounts of DNA and no prior molecular analysis of the organism. Arbitrarily selected oligonucleotide primers routinely produced approximately 50 scoreable polymorphic DNA markers, between individuals of three independent field isolates of R. dominica. Multivariate cluster analysis using forty-nine arbitrarily selected polymorphisms generated from a single primer reliably separated individuals into three clades corresponding to their geographical origin. The resulting clades were quite distinct, with an average genetic difference of 37.5 +/- 6.0% between clades and of 21.0 +/- 7.1% between individuals within clades. As a prelude to future gene mapping efforts, we have also assessed the performance of RAF under conditions commonly used in gene mapping. In this analysis, fingerprints from pooled DNA samples accurately and reproducibly reflected RAF profiles obtained from individual DNA samples that had been combined to create the bulked samples.
Cross-talk free selective reconstruction of individual objects from multiplexed optical field data
NASA Astrophysics Data System (ADS)
Zea, Alejandro Velez; Barrera, John Fredy; Torroba, Roberto
2018-01-01
In this paper we present a data multiplexing method for the simultaneous storage of several optical fields of three-dimensional (3D) objects in a single package, and for their individual cross-talk-free retrieval. Optical field data are extracted from off-axis Fourier holograms and then sampled by multiplying them with random binary masks. The resulting sampled optical fields can be used to reconstruct the original objects. Sampling causes a loss of quality that can be controlled by the number of white pixels in the binary masks and by applying a padding procedure on the optical field data. This process can be performed using a different binary mask for each optical field, and the results are then added to form a multiplexed package. With an adequate choice of sampling and padding, we can achieve a volume reduction in the multiplexed package over the addition of all individual optical fields. Moreover, the package can be multiplied by a binary mask to select a specific optical field, and after the reconstruction procedure the corresponding 3D object is recovered without any cross-talk. We demonstrate the effectiveness of our proposal for data compression with a comparison with discrete cosine transform filtering. Experimental results confirm the validity of our proposal.
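A hedged numpy sketch of the multiplexing idea follows. To make the recovery exactly cross-talk free in this toy version, the binary masks are made mutually disjoint (each pixel is assigned to exactly one field), which is an assumption for illustration; random complex arrays stand in for the optical field data.

```python
# Hedged sketch of the multiplexing idea: each optical field is sampled by its
# own binary mask and the sampled fields are summed into one package; multiplying
# the package by a mask selects that field's samples. Here the masks are made
# mutually disjoint so recovery is exactly cross-talk free (an illustration
# choice, not necessarily the paper's construction).
import numpy as np

rng = np.random.default_rng(0)
n_fields, shape = 3, (64, 64)

# Stand-ins for the complex optical field data of the 3D objects.
fields = [rng.normal(size=shape) + 1j * rng.normal(size=shape)
          for _ in range(n_fields)]

# Disjoint random binary masks: assign every pixel to exactly one field.
owner = rng.integers(0, n_fields, size=shape)
masks = [(owner == k).astype(float) for k in range(n_fields)]

package = sum(m * f for m, f in zip(masks, fields))   # multiplexed package

# Select field 1 from the package; the other fields contribute nothing.
recovered = masks[1] * package
assert np.allclose(recovered, masks[1] * fields[1])
print("recovered the samples of field 1 without cross-talk")
```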
The topomer-sampling model of protein folding
Debe, Derek A.; Carlson, Matt J.; Goddard, William A.
1999-01-01
Clearly, a protein cannot sample all of its conformations (e.g., ≈3^100 ≈ 10^48 for a 100-residue protein) on an in vivo folding timescale (<1 s). To investigate how the conformational dynamics of a protein can accommodate subsecond folding time scales, we introduce the concept of the native topomer, which is the set of all structures similar to the native structure (obtainable from the native structure through local backbone coordinate transformations that do not disrupt the covalent bonding of the peptide backbone). We have developed a computational procedure for estimating the number of distinct topomers required to span all conformations (compact and semicompact) for a polypeptide of a given length. For 100 residues, we find ≈3 × 10^7 distinct topomers. Based on the distance calculated between different topomers, we estimate that a 100-residue polypeptide diffusively samples one topomer every ≈3 ns. Hence, a 100-residue protein can find its native topomer by random sampling in just ≈100 ms. These results suggest that subsecond folding of modest-sized, single-domain proteins can be accomplished by a two-stage process of (i) topomer diffusion: random, diffusive sampling of the 3 × 10^7 distinct topomers to find the native topomer (≈0.1 s), followed by (ii) intratopomer ordering: nonrandom, local conformational rearrangements within the native topomer to settle into the precise native state. PMID:10077555
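A quick check of the back-of-the-envelope numbers quoted above:

```python
# Quick check of the abstract's back-of-the-envelope numbers.
import math

conformations = 3 ** 100                 # roughly 10^48 conformations
n_topomers = 3e7                         # distinct topomers for ~100 residues
t_per_topomer = 3e-9                     # seconds per topomer sampled (≈3 ns)

print(f"3^100 ≈ 10^{math.log10(conformations):.0f}")
print(f"random topomer search ≈ {n_topomers * t_per_topomer:.2f} s")   # ≈0.1 s
```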
Mail merge can be used to create personalized questionnaires in complex surveys.
Taljaard, Monica; Chaudhry, Shazia Hira; Brehaut, Jamie C; Weijer, Charles; Grimshaw, Jeremy M
2015-10-16
Low response rates and inadequate question comprehension threaten the validity of survey results. We describe a simple procedure to implement personalized, as opposed to generically worded, questionnaires in the context of a complex web-based survey of corresponding authors of a random sample of 300 published cluster randomized trials. The purpose of the survey was to gather more detailed information about informed consent procedures used in the trial, over and above basic information provided in the trial report. We describe our approach, which allowed extensive personalization without the need for specialized computer technology, and discuss its potential application in similar settings. The mail merge feature of standard word processing software was used to generate unique, personalized questionnaires for each author by incorporating specific information from the article, including naming the randomization unit (e.g., family practice, school, worksite), and identifying specific individuals who may have been considered research participants at the cluster level (family doctors, teachers, employers) and individual level (patients, students, employees) in questions regarding informed consent procedures in the trial. The response rate was relatively high (64%, 182/285) and did not vary significantly by author, publication, or study characteristics. The refusal rate was low (7%). While controlled studies are required to examine the specific effects of our approach on comprehension, quality of responses, and response rates, we showed how mail merge can be used as a simple but useful tool to add personalized fields to complex survey questionnaires, or to request additional information required from study authors. One potential application is in eliciting specific information about published articles from study authors when conducting systematic reviews and meta-analyses.
Glaser, John; Reeves, Scott T; Stoll, William David; Epperson, Thomas I; Hilbert, Megan; Madan, Alok; George, Mark S; Borckardt, Jeffrey J
2016-05-01
Randomized, controlled pilot trial. The present study is the first randomized, double-blind, sham-controlled pilot clinical trial of transcranial direct current stimulation (tDCS) for pain and patient-controlled analgesia (PCA) opioid usage among patients receiving spine surgery. Lumbar spinal surgeries are common, and while pain is often a complaint that precedes surgical intervention, the procedures themselves are associated with considerable postoperative pain lasting days to weeks. Adequate postoperative pain control is an important factor in determining recovery, and new analgesic strategies are needed that can be used adjunctively to existing strategies, potentially to reduce reliance on opioid analgesia. Several novel brain stimulation technologies including tDCS are beginning to demonstrate promise as treatments for a variety of pain conditions. Twenty-seven patients undergoing lumbar spine procedures at the Medical University of South Carolina were randomly assigned to receive four 20-minute sessions of real or sham tDCS during their postsurgical hospital stay. Patient-administered hydromorphone usage was tracked along with numeric rating scale pain ratings. The effect of tDCS on the slope of the cumulative PCA curve was significant (P < 0.001), and tDCS was associated with a 23% reduction in PCA usage. In the real tDCS group, a 31% reduction was observed in pain-at-its-least ratings from admission to discharge (P = 0.027), but no other changes in numeric rating scale pain ratings were significant in either group. The present pilot trial is the first study to demonstrate an opioid-sparing effect of tDCS after spine surgical procedures. Although this was a small pilot trial in a heterogeneous sample of spinal surgery patients, a moderate effect size was observed for tDCS, suggesting that future work in this area is warranted.
Sadahiro, Sotaro; Suzuki, Toshiyuki; Tanaka, Akira; Okada, Kazutake; Kamata, Hiroko; Ozaki, Toru; Koga, Yasuhiro
2014-03-01
We have already reported that, for patients undergoing elective colon cancer operations, perioperative infection can be prevented by a single intravenous dose of an antibiotic given immediately beforehand if mechanical bowel preparation and the administration of oral antibiotics are implemented. Synbiotics have been reported to reduce the rate of infection in patients after pancreatic cancer operations. The effectiveness of oral antibiotics and probiotics in preventing postoperative infection in elective colon cancer procedures was examined in a randomized controlled trial. Three hundred ten patients with colon cancer were randomly assigned to one of three groups. All patients underwent mechanical bowel preparation and received a single intravenous dose of flomoxef immediately before operation. Probiotics were administered in Group A; oral antibiotics were administered in Group B; and neither probiotics nor oral antibiotics were administered in Group C. Stool samples were collected 9 and 2 days before and 7 and 14 days after the procedure. Clostridium difficile toxin and the number of bacteria in the intestine were determined. The rates of incisional surgical-site infection were 18.0%, 6.1%, and 17.9% in Groups A, B, and C, and the rates of leakage were 12.0%, 1.0%, and 7.4% in Groups A, B, and C, respectively, indicating that both rates were lower in Group B than in Groups A and C (P = .014 and P = .004, respectively). The detection rates of C. difficile toxin did not change among the three groups. We recommend oral antibiotics, rather than probiotics, as bowel preparation for elective colon cancer procedures to prevent surgical-site infections. Copyright © 2014 Mosby, Inc. All rights reserved.
Efficacy of abstinence promotion media messages: findings from an online randomized trial.
Evans, W Douglas; Davis, Kevin C; Ashley, Olivia Silber; Blitstein, Jonathan; Koo, Helen; Zhang, Yun
2009-10-01
We conducted an online randomized experiment to evaluate the efficacy of messages from the Parents Speak Up National Campaign (PSUNC) to promote parent-child communication about sex. We randomly assigned a national sample of 1,969 mothers and fathers to treatment (PSUNC exposure) and control (no exposure) conditions. Mothers were further randomized into treatment and booster (additional messages) conditions to evaluate dose-response effects. Participants were surveyed at baseline, 4 weeks postexposure, and 6 months postexposure. We used multivariable logistic regression procedures in our analysis. Treatment fathers were more likely than control fathers to initiate conversations about sex at 4 weeks, and treatment fathers and mothers were more likely than controls at 6 months to recommend that their children wait to have sex. Treatment fathers and mothers were far more likely than controls to use the campaign Web site. There was a dose-response effect for mothers' Web site use. Using new media methods, this study shows that PSUNC messages are efficacious in promoting parent-child communication about sex and abstinence. Future research should evaluate mechanisms and effectiveness in natural settings.
Stival, Rebecca Saray Marchesini; Cavalheiro, Patrícia Rechetello; Stasiak, Camila Edith Stachera; Galdino, Dayana Talita; Hoekstra, Bianca Eliza; Schafranski, Marcelo Derbli
2014-01-01
To evaluate the efficacy of acupuncture in the treatment of fibromyalgia, considering the immediate response on the visual analogue pain scale (VAS) as the primary outcome. Randomized, controlled, double-blind study including 36 patients with fibromyalgia (ACR 1990) selected from the outpatient rheumatology clinic, Santa Casa de Misericórdia, Ponta Grossa, PR. Twenty-one patients underwent an acupuncture session, under the principles of traditional Chinese medicine, and 15 patients underwent a placebo procedure (sham acupuncture). For pain assessment, the subjects completed a visual analogue scale (VAS) before and immediately after the proposed procedure. The mean change in VAS was compared among groups. The variation between the final and initial VAS values was -4.36±3.23 (P=0.0001) in the treatment group and -1.70±1.55 in the control group (P=0.06). The difference in terms of amplitude of variation of VAS (initial - final VAS) among groups favored the actual procedure (P=0.005). The effect size (ES) for the treatment group was d=1.7, which is considered a large effect. Although the sample was small, the statistical power for these results was high (94.8%). Acupuncture proved effective in immediate pain reduction in patients with fibromyalgia, with a quite significant effect size. Copyright © 2014 Elsevier Editora Ltda. All rights reserved.
TRANSFER OF AVERSIVE RESPONDENT ELICITATION IN ACCORDANCE WITH EQUIVALENCE RELATIONS
Valverde, Miguel RodrÍguez; Luciano, Carmen; Barnes-Holmes, Dermot
2009-01-01
The present study investigates the transfer of aversively conditioned respondent elicitation through equivalence classes, using skin conductance as the measure of conditioning. The first experiment is an attempt to replicate Experiment 1 in Dougher, Augustson, Markham, Greenway, and Wulfert (1994), with different temporal parameters in the aversive conditioning procedure employed. Match-to-sample procedures were used to teach 17 participants two 4-member equivalence classes. Then, one member of one class was paired with electric shock and one member of the other class was presented without shock. The remaining stimuli from each class were presented in transfer tests. Unlike the findings in the original study, transfer of conditioning was not achieved. In Experiment 2, similar procedures were used with 30 participants, although several modifications were introduced (formation of five-member classes, direct conditioning with several elements of each class, random sequences of stimulus presentation in transfer tests, reversal in aversive conditioning contingencies). More than 80% of participants who had shown differential conditioning also showed the transfer of function effect. Moreover, this effect was replicated within subjects for 3 participants. This is the first demonstration of the transfer of aversive respondent elicitation through stimulus equivalence classes with the presentation of transfer test trials in random order. The latter prevents the possibility that transfer effects are an artefact of transfer test presentation order. PMID:20119523
Hu, Pei-Hsin; Peng, Yen-Chun; Lin, Yu-Ting; Chang, Chi-Sen; Ou, Ming-Chiu
2010-01-01
Colonoscopy is generally well tolerated, but some patients regard the procedure as unpleasant and painful, and it is generally performed with the patient sedated and receiving analgesics. The effect of sedation and analgesia for colonoscopy is limited. Aromatherapy is also applied in gastrointestinal endoscopy to reduce procedural anxiety, but there is a lack of information about aromatherapy specific to colonoscopy. In this study, we aimed to perform a randomized controlled study to investigate the effect of aromatherapy on anxiety, stress, and physiological parameters during colonoscopy. A randomized controlled trial was carried out, with data collected in 2009 and 2010. The participants were randomized into two groups, and aromatherapy was carried out by inhalation of sunflower oil (control group) or Neroli oil (experimental group). The anxiety index was evaluated by the State Trait Anxiety Inventory-state (STAI-S) score before aromatherapy and after colonoscopy, as well as the post-procedural pain index by visual analogue scale (VAS). Physiological indicators, such as blood pressure (systolic and diastolic), heart rate, and respiratory rate, were evaluated before and after aromatherapy. Participants in this study were 27 subjects, 13 in the control group and 14 in the Neroli group, with an average age of 52.26 +/- 17.79 years. There was no significant difference in procedural anxiety by STAI-S score or in procedural pain by VAS. The physiological parameters showed significantly lower pre- and post-procedural systolic blood pressure in the Neroli group than in the control group. Aromatic care for colonoscopy, although with no significant effect on procedural anxiety, is an inexpensive, effective and safe pre-procedural technique that could decrease systolic blood pressure.
SUNPLIN: simulation with uncertainty for phylogenetic investigations.
Martins, Wellington S; Carmo, Welton C; Longo, Humberto J; Rosa, Thierson C; Rangel, Thiago F
2013-11-15
Phylogenetic comparative analyses usually rely on a single consensus phylogenetic tree in order to study evolutionary processes. However, most phylogenetic trees are incomplete with regard to species sampling, which may critically compromise analyses. Some approaches have been proposed to integrate non-molecular phylogenetic information into incomplete molecular phylogenies. An expanded tree approach consists of adding missing species to random locations within their clade. The information contained in the topology of the resulting expanded trees can be captured by the pairwise phylogenetic distance between species and stored in a matrix for further statistical analysis. Thus, the random expansion and processing of multiple phylogenetic trees can be used to estimate the phylogenetic uncertainty through a simulation procedure. Because of the computational burden required, unless this procedure is efficiently implemented, the analyses are of limited applicability. In this paper, we present efficient algorithms and implementations for randomly expanding and processing phylogenetic trees so that simulations involved in comparative phylogenetic analysis with uncertainty can be conducted in a reasonable time. We propose algorithms for both randomly expanding trees and calculating distance matrices. We made available the source code, which was written in the C++ language. The code may be used as a standalone program or as a shared object in the R system. The software can also be used as a web service through the link: http://purl.oclc.org/NET/sunplin/. We compare our implementations to similar solutions and show that significant performance gains can be obtained. Our results open up the possibility of accounting for phylogenetic uncertainty in evolutionary and ecological analyses of large datasets.
Template matching for auditing hospital cost and quality.
Silber, Jeffrey H; Rosenbaum, Paul R; Ross, Richard N; Ludwig, Justin M; Wang, Wei; Niknam, Bijan A; Mukherjee, Nabanita; Saynisch, Philip A; Even-Shoshan, Orit; Kelz, Rachel R; Fleisher, Lee A
2014-10-01
Develop an improved method for auditing hospital cost and quality. Medicare claims in general, gynecologic and urologic surgery, and orthopedics from Illinois, Texas, and New York between 2004 and 2006. A template of 300 representative patients was constructed and then used to match 300 patients at hospitals that had a minimum of 500 patients over a 3-year study period. From each of 217 hospitals we chose 300 patients most resembling the template using multivariate matching. The matching algorithm found close matches on procedures and patient characteristics, far more balanced than measured covariates would be in a randomized clinical trial. These matched samples displayed little to no differences across hospitals in common patient characteristics yet found large and statistically significant hospital variation in mortality, complications, failure-to-rescue, readmissions, length of stay, ICU days, cost, and surgical procedure length. Similar patients at different hospitals had substantially different outcomes. The template-matched sample can produce fair, directly standardized audits that evaluate hospitals on patients with similar characteristics, thereby making benchmarking more believable. Through examining matched samples of individual patients, administrators can better detect poor performance at their hospitals and better understand why these problems are occurring. © Health Research and Educational Trust.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghoos, K., E-mail: kristel.ghoos@kuleuven.be; Dekeyser, W.; Samaey, G.
2016-10-01
The plasma and neutral transport in the plasma edge of a nuclear fusion reactor is usually simulated using coupled finite volume (FV)/Monte Carlo (MC) codes. However, under conditions of future reactors like ITER and DEMO, convergence issues become apparent. This paper examines the convergence behaviour and the numerical error contributions with a simplified FV/MC model for three coupling techniques: Correlated Sampling, Random Noise and Robbins-Monro. Also, practical procedures to estimate the errors in complex codes are proposed. Moreover, first results with more complex models show that an order of magnitude speedup can be achieved without any loss in accuracy by making use of averaging in the Random Noise coupling technique.
Methods and analysis of realizing randomized grouping.
Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi
2011-07-01
Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
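The article realizes these steps in SAS; a Python sketch of the completely randomized grouping step (shuffle the subject identifiers, then cut the shuffled list into groups) looks like the following. The group count and subject identifiers are illustrative.

```python
# Sketch of completely randomized grouping (the article does this in SAS):
# shuffle the subject IDs and split the shuffled list into equal-sized groups.
import random

def complete_randomization(subject_ids, n_groups, seed=2011):
    rng = random.Random(seed)
    ids = list(subject_ids)
    rng.shuffle(ids)
    return {g: ids[g::n_groups] for g in range(n_groups)}

groups = complete_randomization(range(1, 25), n_groups=3)
for g, members in groups.items():
    print(f"group {g + 1}: {sorted(members)}")
```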
Optimal sample sizes for the design of reliability studies: power consideration.
Shieh, Gwowen
2014-09-01
Intraclass correlation coefficients are used extensively to measure the reliability or degree of resemblance among group members in multilevel research. This study concerns the problem of the necessary sample size to ensure adequate statistical power for hypothesis tests concerning the intraclass correlation coefficient in the one-way random-effects model. In view of the incomplete and problematic numerical results in the literature, the approximate sample size formula constructed from Fisher's transformation is reevaluated and compared with an exact approach across a wide range of model configurations. These comprehensive examinations showed that the Fisher transformation method is appropriate only under limited circumstances, and therefore it is not recommended as a general method in practice. For advance design planning of reliability studies, the exact sample size procedures are fully described and illustrated for various allocation and cost schemes. Corresponding computer programs are also developed to implement the suggested algorithms.
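Under the standard one-way random-effects model, the exact power of the F-test of H0: ICC <= rho0 against a true value rho1 can be computed directly, and the required number of subjects found by search; the following sketch uses that textbook result and is not necessarily the paper's exact algorithm.

```python
# Sketch (standard one-way random-effects theory, not necessarily the paper's
# procedure): exact power of the F-test of H0: ICC <= rho0 when the true ICC
# is rho1, with n subjects and k ratings per subject, and the smallest n that
# reaches a target power.
from scipy.stats import f

def icc_power(n, k, rho0, rho1, alpha=0.05):
    df1, df2 = n - 1, n * (k - 1)
    c0 = 1 + k * rho0 / (1 - rho0)       # scaling of the F statistic under H0
    c1 = 1 + k * rho1 / (1 - rho1)       # scaling under the true ICC
    f_crit = f.ppf(1 - alpha, df1, df2)
    return f.sf(f_crit * c0 / c1, df1, df2)

def min_n(k, rho0, rho1, power=0.8, alpha=0.05, n_max=10_000):
    for n in range(2, n_max):
        if icc_power(n, k, rho0, rho1, alpha) >= power:
            return n
    raise ValueError("no n found below n_max")

print(min_n(k=3, rho0=0.6, rho1=0.8))    # subjects needed for 80% power
```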
Psychometric evaluation of the Revised Professional Practice Environment (RPPE) scale.
Erickson, Jeanette Ives; Duffy, Mary E; Ditomassi, Marianne; Jones, Dorothy
2009-05-01
The purpose was to examine the psychometric properties of the Revised Professional Practice Environment (RPPE) scale. Despite renewed focus on studying health professionals' practice environments, there are still few reliable and valid instruments available to assist nurse administrators in decision making. A psychometric evaluation using a random-sample cross-validation procedure (calibration sample [CS], n = 775; validation sample [VS], n = 775) was undertaken. Cronbach alpha internal consistency reliability of the total score (r = 0.93 [CS] and 0.92 [VS]), resulting subscale scores (r range: 0.80-0.87 [CS], 0.81-0.88 [VS]), and principal components analyses with Varimax rotation and Kaiser normalization (8 components, 59.2% variance [CS], 59.7% [VS]) produced almost identical results in both samples. The multidimensional RPPE is a psychometrically sound measure of 8 components of the professional practice environment in the acute care setting and sufficiently reliable and valid for use as independent subscales in healthcare research.
USGS Blind Sample Project: monitoring and evaluating laboratory analytical quality
Ludtke, Amy S.; Woodworth, Mark T.
1997-01-01
The U.S. Geological Survey (USGS) collects and disseminates information about the Nation's water resources. Surface- and ground-water samples are collected and sent to USGS laboratories for chemical analyses. The laboratories identify and quantify the constituents in the water samples. Random and systematic errors occur during sample handling, chemical analysis, and data processing. Although all errors cannot be eliminated from measurements, the magnitude of their uncertainty can be estimated and tracked over time. Since 1981, the USGS has operated an independent, external, quality-assurance project called the Blind Sample Project (BSP). The purpose of the BSP is to monitor and evaluate the quality of laboratory analytical results through the use of double-blind quality-control (QC) samples. The information provided by the BSP assists the laboratories in detecting and correcting problems in the analytical procedures. The information also can aid laboratory users in estimating the extent that laboratory errors contribute to the overall errors in their environmental data.
Adaptive sampling of information in perceptual decision-making.
Cassey, Thomas C; Evens, David R; Bogacz, Rafal; Marshall, James A R; Ludwig, Casimir J H
2013-01-01
In many perceptual and cognitive decision-making problems, humans sample multiple noisy information sources serially, and integrate the sampled information to make an overall decision. We derive the optimal decision procedure for two-alternative choice tasks in which the different options are sampled one at a time, sources vary in the quality of the information they provide, and the available time is fixed. To maximize accuracy, the optimal observer allocates time to sampling different information sources in proportion to their noise levels. We tested human observers in a corresponding perceptual decision-making task. Observers compared the direction of two random dot motion patterns that were triggered only when fixated. Observers allocated more time to the noisier pattern, in a manner that correlated with their sensory uncertainty about the direction of the patterns. There were several differences between the optimal observer predictions and human behaviour. These differences point to a number of other factors, beyond the quality of the currently available sources of information, that influence the sampling strategy.
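The stated optimal rule can be checked numerically: with total time T fixed, the variance of the combined estimate, s1^2/t1 + s2^2/t2, is minimized when time is allocated in proportion to each source's noise standard deviation. The values below are illustrative.

```python
# Numerical check of the stated rule: with fixed total time T, the variance of
# the combined estimate s1**2/t1 + s2**2/t2 is minimized by allocating time in
# proportion to each source's noise SD. Values are illustrative.
import numpy as np

s1, s2, T = 1.0, 2.0, 10.0                       # noise SDs and total time

t1_grid = np.linspace(0.5, T - 0.5, 1000)
variance = s1**2 / t1_grid + s2**2 / (T - t1_grid)
t1_best = t1_grid[np.argmin(variance)]

print(f"numerical optimum t1 = {t1_best:.2f}")
print(f"proportional rule    = {T * s1 / (s1 + s2):.2f}")
```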
Jothika, Mohan; Vanajassun, P. Pranav; Someshwar, Battu
2015-01-01
Aim: To determine the short-term efficiency of probiotic, chlorhexidine, and fluoride mouthwashes on plaque Streptococcus mutans level at four periodic intervals. Materials and Methods: This was a single-blind, randomized control study in which each subject was tested with only one mouthwash regimen. Fifty-two healthy qualified adult patients were selected randomly for the study and were divided into the following groups: group 1- 10 ml of distilled water, group 2- 10 ml of 0.2% chlorhexidine mouthwash, group 3- 10 ml of 500 ppm F/400 ml sodium fluoride mouthwash, and group 4- 10 ml of probiotic mouthwash. Plaque samples were collected from the buccal surface of premolars and molars in the maxillary quadrant. Sampling procedure was carried out by a single examiner after 7 days, 14 days, and 30 days, respectively, after the use of the mouthwash. All the samples were subjected to microbiological analysis and statistically analyzed with one-way analysis of variance (ANOVA) and post-hoc test. Results: One-way ANOVA comparison among groups 2, 3, and 4 showed no statistical significance, whereas group 1 showed statistically significant difference when compared with groups 2, 3, and 4 at 7th, 14th, and 30th day. Conclusion: Chlorhexidine, sodium fluoride, and probiotic mouthwashes reduce plaque S. mutans levels. Probiotic mouthwash is effective and equivalent to chlorhexidine and sodium fluoride mouthwashes. Thus, probiotic mouthwash can also be considered as an effective oral hygiene regimen. PMID:25984467
Is early cord clamping, delayed cord clamping or cord milking best?
Vatansever, Binay; Demirel, Gamze; Ciler Eren, Elif; Erel, Ozcan; Neselioglu, Salim; Karavar, Hande Nur; Gundogdu, Semra; Ulfer, Gozde; Bahadir, Selcen; Tastekin, Ayhan
2018-04-01
To compare the antioxidant status of three cord clamping procedures (early clamping, delayed clamping and milking) by analyzing the thiol-disulfide balance. This randomized controlled study enrolled 189 term infants who were divided into three groups according to the cord clamping procedure: early clamping, delayed clamping and milking. Blood samples were collected from the umbilical arteries immediately after clamping, and the thiol/disulfide homeostasis was analyzed. The native and total thiol levels were significantly (p < .05) lower in the early cord clamping group compared with the other two groups. The disulfide/total thiol ratio was significantly (p = .026) lower in the delayed cord clamping and milking groups compared with the early clamping groups. Early cord clamping causes the production of more disulfide bonds and lower thiol levels, indicating that oxidation reactions are increased in the early cord clamping procedure compared with the delayed cord clamping and milking procedures. The oxidant capacity is greater with early cord clamping than with delayed clamping or cord milking. Delayed cord clamping or milking are beneficial in neonatal care, and we suggest that they be performed routinely in all deliveries.
Kabbasch, C; Dorn, F; Wenchel, H M; Krug, B; Mpotsaris, A; Liebig, T
2017-03-01
Bacterial contamination during angiographic procedures is a potential source of bacteremia. It is largely unknown whether it is clinically relevant. Our aim was to evaluate the incidence of contamination of liquids during catheter-based neuroangiographic examinations, the spectrum of microorganisms, a comparison of two different trolley-settings, and a follow-up of all patients with regard to clinical and lab signs of infection. A total of 170 patients underwent either diagnostic angiography (n = 111) or arterial neuroendovascular procedures (n = 59). To study the impact of airborne contamination of sterile liquids, we randomly assigned equal numbers of procedures to two distinct setups. Group A with standard open-surface bowls and group B with repetitive coverage of liquids throughout the procedure. Patient preparation was performed with utmost care. After each procedure, samples of the liquids were sent for microbiological evaluation. Patients were followed for signs of infection (fever, white blood cell count, C-reactive-protein). Of all samples, 25.3 % were contaminated. Contamination consisted of resident skin microbiota only and was more common with procedures (28.8 %) than with diagnostic angiography (23.4 %) and less common in uncovered (23.5 %) than in covered bowls (27.1 %). However, these differences were insignificant. None of the patients developed clinical or lab signs of infection. Contamination during diagnostic and interventional angiography does occur and cannot be avoided by intermittent coverage. Despite a surprisingly high incidence, our findings support the common strategy that antibiotic coverage is unnecessary in most patients undergoing arterial angiography as it lacks clinical impact. Airborne contamination does not appear to play a role.
Van Broeck, Bianca; Timmers, Maarten; Ramael, Steven; Bogert, Jennifer; Shaw, Leslie M; Mercken, Marc; Slemmon, John; Van Nueten, Luc; Engelborghs, Sebastiaan; Streffer, Johannes Rolf
2016-05-19
Cerebrospinal fluid (CSF) amyloid-beta (Aβ) peptides are predictive biomarkers for Alzheimer's disease and are proposed as pharmacodynamic markers for amyloid-lowering therapies. However, frequent sampling results in fluctuating CSF Aβ levels that have a tendency to increase compared with baseline. The impact of sampling frequency, volume, catheterization procedure, and ibuprofen pretreatment on CSF Aβ levels using continuous sampling over 36 h was assessed. In this open-label biomarker study, healthy participants (n = 18; either sex, age 55-85 years) were randomized into one of three cohorts (n = 6/cohort; high-frequency sampling). In all cohorts except cohort 2 (sampling started 6 h post catheterization), sampling through lumbar catheterization started immediately post catheterization. Cohort 3 received ibuprofen (800 mg) before catheterization. Following interim data review, an additional cohort 4 (n = 6) with an optimized sampling scheme (low-frequency and lower volume) was included. CSF Aβ(1-37), Aβ(1-38), Aβ(1-40), and Aβ(1-42) levels were analyzed. Increases and fluctuations in mean CSF Aβ levels occurred in cohorts 1-3 at times of high-frequency sampling. Some outliers were observed (cohorts 2 and 3) with an extreme pronunciation of this effect. Cohort 4 demonstrated minimal fluctuation of CSF Aβ both on a group and an individual level. Intersubject variability in CSF Aβ profiles over time was observed in all cohorts. CSF Aβ level fluctuation upon catheterization primarily depends on the sampling frequency and volume, but not on the catheterization procedure or inflammatory reaction. An optimized low-frequency sampling protocol minimizes or eliminates fluctuation of CSF Aβ levels, which will improve the capability of accurately measuring the pharmacodynamic read-out for amyloid-lowering therapies. ClinicalTrials.gov NCT01436188 . Registered 15 September 2011.
Reflexion on linear regression trip production modelling method for ensuring good model quality
NASA Astrophysics Data System (ADS)
Suprayitno, Hitapriya; Ratnasari, Vita
2017-11-01
Transport modelling is important. In certain cases the conventional model still has to be used, for which a good trip production model is essential. A good model can only be obtained from a good sample. Two of the basic principles of good sampling are having a sample capable of representing the population characteristics and capable of producing an acceptable error at a certain confidence level. It seems that these principles are not yet well understood and applied in trip production modelling. Therefore, it is necessary to investigate trip production modelling practice in Indonesia and to formulate a better modelling method that ensures model quality. The results are as follows. Statistics provides a method to calculate the span of a predicted value at a certain confidence level for linear regression, called the confidence interval of the predicted value. Common modelling practice uses R2 as the principal quality measure; sampling practice varies and does not always conform to the sampling principles. An experiment indicates that a small sample can already give an excellent R2 value and that sample composition can significantly change the model. Hence, a good R2 value does not always mean good model quality. This leads to three basic ideas for ensuring good model quality: reformulating the quality measure, the calculation procedure, and the sampling method. The quality measure is defined as having both a good R2 value and a good confidence interval of the predicted value. The calculation procedure must incorporate the appropriate statistical calculation methods and statistical tests. A good sampling method must use random, well-distributed, stratified sampling with a certain minimum number of samples. These three ideas need further development and testing.
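A sketch of the quality measure the abstract argues for, produced with statsmodels on simulated data (the variable names and numbers are illustrative): both the confidence interval for the mean predicted value and the wider interval for an individual new observation are reported alongside R2.

```python
# Sketch of the confidence interval of the predicted value for a linear
# trip-production-style regression, using statsmodels on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"households": rng.integers(50, 500, size=60)})
df["trips"] = 1.8 * df["households"] + rng.normal(0, 80, size=60)

model = smf.ols("trips ~ households", data=df).fit()
print(f"R2 = {model.rsquared:.3f}")

new = pd.DataFrame({"households": [100, 300]})
pred = model.get_prediction(new).summary_frame(alpha=0.05)
# mean_ci_* : confidence interval for the mean predicted value
# obs_ci_*  : wider interval for an individual new observation
print(pred[["mean", "mean_ci_lower", "mean_ci_upper",
            "obs_ci_lower", "obs_ci_upper"]])
```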
Adaptive adjustment of the randomization ratio using historical control data
Hobbs, Brian P.; Carlin, Bradley P.; Sargent, Daniel J.
2013-01-01
Background Prospective trial design often occurs in the presence of “acceptable” [1] historical control data. Typically this data is only utilized for treatment comparison in a posteriori retrospective analysis to estimate population-averaged effects in a random-effects meta-analysis. Purpose We propose and investigate an adaptive trial design in the context of an actual randomized controlled colorectal cancer trial. This trial, originally reported by Goldberg et al. [2], succeeded a similar trial reported by Saltz et al. [3], and used a control therapy identical to that tested (and found beneficial) in the Saltz trial. Methods The proposed trial implements an adaptive randomization procedure for allocating patients aimed at balancing total information (concurrent and historical) among the study arms. This is accomplished by assigning more patients to receive the novel therapy in the absence of strong evidence for heterogeneity among the concurrent and historical controls. Allocation probabilities adapt as a function of the effective historical sample size (EHSS) characterizing relative informativeness defined in the context of a piecewise exponential model for evaluating time to disease progression. Commensurate priors [4] are utilized to assess historical and concurrent heterogeneity at interim analyses and to borrow strength from the historical data in the final analysis. The adaptive trial’s frequentist properties are simulated using the actual patient-level historical control data from the Saltz trial and the actual enrollment dates for patients enrolled into the Goldberg trial. Results Assessing concurrent and historical heterogeneity at interim analyses and balancing total information with the adaptive randomization procedure leads to trials that on average assign more new patients to the novel treatment when the historical controls are unbiased or slightly biased compared to the concurrent controls. Large magnitudes of bias lead to approximately equal allocation of patients among the treatment arms. Using the proposed commensurate prior model to borrow strength from the historical data, after balancing total information with the adaptive randomization procedure, provides admissible estimators of the novel treatment effect with desirable bias-variance trade-offs. Limitations Adaptive randomization methods in general are sensitive to population drift and more suitable for trials that initiate with gradual enrollment. Balancing information among study arms in time-to-event analyses is difficult in the presence of informative right-censoring. Conclusions The proposed design could prove important in trials that follow recent evaluations of a control therapy. Efficient use of the historical controls is especially important in contexts where reliance on pre-existing information is unavoidable because the control therapy is exceptionally hazardous, expensive, or the disease is rare. PMID:23690095
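As a heavily hedged illustration of the balancing idea, and not the authors' actual allocation formula, one can treat the historical controls as contributing EHSS effective patients to the control arm and randomize each new patient with a probability that favours the arm with less total information:

```python
# Heavily hedged illustration (not the authors' formula): if the historical
# controls are worth EHSS "effective" patients, balance total information by
# randomizing each new patient toward the arm with less information.
import random

def allocate_trial(n_patients, ehss, seed=0):
    rng = random.Random(seed)
    n_exp, n_ctl = 0, 0
    for _ in range(n_patients):
        info_exp = n_exp
        info_ctl = n_ctl + ehss          # concurrent controls plus historical worth
        p_exp = (info_ctl + 1) / (info_exp + info_ctl + 2)
        if rng.random() < p_exp:
            n_exp += 1
        else:
            n_ctl += 1
    return n_exp, n_ctl

for ehss in (0, 20, 60):
    print(ehss, allocate_trial(200, ehss))
```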
Bayesian Analysis for Exponential Random Graph Models Using the Adaptive Exchange Sampler.
Jin, Ick Hoon; Yuan, Ying; Liang, Faming
2013-10-01
Exponential random graph models have been widely used in social network analysis. However, these models are extremely difficult to handle from a statistical viewpoint, because of the intractable normalizing constant and model degeneracy. In this paper, we consider a fully Bayesian analysis for exponential random graph models using the adaptive exchange sampler, which solves the intractable normalizing constant and model degeneracy issues encountered in Markov chain Monte Carlo (MCMC) simulations. The adaptive exchange sampler can be viewed as a MCMC extension of the exchange algorithm, and it generates auxiliary networks via an importance sampling procedure from an auxiliary Markov chain running in parallel. The convergence of this algorithm is established under mild conditions. The adaptive exchange sampler is illustrated using a few social networks, including the Florentine business network, molecule synthetic network, and dolphins network. The results indicate that the adaptive exchange algorithm can produce more accurate estimates than approximate exchange algorithms, while maintaining the same computational efficiency.
NASA Astrophysics Data System (ADS)
de Santana, Felipe Bachion; de Souza, André Marcelo; Poppi, Ronei Jesus
2018-02-01
This study evaluates the use of visible and near infrared spectroscopy (Vis-NIRS) combined with multivariate regression based on random forest to quantify several soil quality parameters. The parameters analyzed were soil cation exchange capacity (CEC), sum of exchange bases (SB), organic matter (OM), clay and sand present in the soils of several regions of Brazil. Current methods for evaluating these parameters are laborious, time-consuming and require various wet analytical methods that are not adequate for use in precision agriculture, where faster and automatic responses are required. The random forest regression models were statistically better than PLS regression models for CEC, OM, clay and sand, demonstrating resistance to overfitting, attenuating the effect of outlier samples and indicating the most important variables for the model. The methodology demonstrates the potential of Vis-NIRS as an alternative for determination of CEC, SB, OM, sand and clay, making it possible to develop a fast and automatic analytical procedure.
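A minimal scikit-learn sketch of the comparison described above, with simulated spectra standing in for the Vis-NIRS data and an arbitrary soil property as the target; the data, model settings and metrics are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

# placeholder data standing in for Vis-NIR spectra (rows = samples, columns = wavelengths)
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 200))
y = X[:, :10].sum(axis=1) + rng.normal(scale=0.5, size=300)   # e.g. a soil property such as OM

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
pls = PLSRegression(n_components=10).fit(X_tr, y_tr)

for name, model in [("random forest", rf), ("PLS", pls)]:
    pred = np.ravel(model.predict(X_te))
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: R2={r2_score(y_te, pred):.2f}  RMSE={rmse:.2f}")

# variable importance highlights the wavelengths driving the prediction
top = np.argsort(rf.feature_importances_)[::-1][:5]
print("most important wavelength indices:", top)
```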
The Study on Mental Health at Work: Design and sampling.
Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit
2017-08-01
The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used to compare the population, the gross sample and the respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
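A minimal sketch of the two-stage cluster sampling design described above (municipalities first, then addresses), assuming for illustration that each selected municipality has 5,000 addresses and that addresses are allocated roughly equally across municipalities.

```python
import numpy as np

rng = np.random.default_rng(1)

# stage 1: sample 206 primary sampling units (municipalities) from a frame of 12,227
municipalities = np.arange(12_227)
psu = rng.choice(municipalities, size=206, replace=False)

# stage 2: draw addresses within the selected municipalities until 13,590 are reached
# (assumed frame of 5,000 addresses per municipality, equal allocation, for illustration only)
addresses_per_psu = int(np.ceil(13_590 / len(psu)))
sample = [(m, a) for m in psu for a in rng.integers(0, 5_000, size=addresses_per_psu)]
sample = sample[:13_590]

print(len(psu), "municipalities,", len(sample), "addresses drawn")
```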
[Kriging estimation and its simulated sampling of Chilo suppressalis population density].
Yuan, Zheming; Bai, Lianyang; Wang, Kuiwu; Hu, Xiangyue
2004-07-01
In order to draw up a rational sampling plan for the larval population of Chilo suppressalis, an original population and its two derivative populations, a random population and a sequence population, were sampled and compared using random sampling, gap-range-random sampling, and a new systematic sampling scheme that integrated Kriging interpolation with a random origin position. For the original population, whose distribution was aggregated and whose dependence range in the line direction was 115 cm (6.9 units), gap-range-random sampling in the line direction was more precise than random sampling. Correctly distinguishing the population pattern is the key to better precision. Gap-range-random sampling and random sampling suit aggregated and random populations, respectively, but both are difficult to apply in practice. Therefore, a new systematic sampling scheme, the Kriging sample (n = 441), was developed to estimate the density of a partial sample (partial estimation, n = 441) and of the population (overall estimation, N = 1500). For the original population, the Kriging sample estimated both the partial sample and the population more precisely than the investigation sample. As the aggregation intensity of the population increased, the Kriging sample was more effective than the investigation sample in both partial and overall estimation at an appropriate sampling gap chosen according to the dependence range.
Hutsell, Blake A; Banks, Matthew L
2015-08-15
Working memory is a domain of 'executive function.' Delayed nonmatching-to-sample (DNMTS) procedures are commonly used to examine working memory in both human laboratory and preclinical studies. The aim was to develop an automated DNMTS procedure maintained by food pellets in rhesus monkeys using a touch-sensitive screen attached to the housing chamber. Specifically, the DNMTS procedure was a 2-stimulus, 2-choice recognition memory task employing unidimensional discriminative stimuli and randomized delay interval presentations. DNMTS maintained a delay-dependent decrease in discriminability that was independent of the retention interval distribution. Eliminating reinforcer availability during a single delay session or providing food pellets before the session did not systematically alter accuracy, but did reduce total choices. Increasing the intertrial interval enhanced accuracy at short delays. Acute Δ(9)-THC pretreatment produced delay interval-dependent changes in the forgetting function at doses that did not alter total choices. Acute methylphenidate pretreatment only decreased total choices. All monkeys were trained to perform NMTS at the 1s training delay within 60 days of initiating operant touch training. Furthermore, forgetting functions were reliably delay interval-dependent and stable over the experimental period (∼6 months). Consistent with previous studies, increasing the intertrial interval improved DNMTS performance, whereas Δ(9)-THC disrupted DNMTS performance independent of changes in total choices. Overall, the touchscreen-based DNMTS procedure described provides an efficient method for training and testing experimental manipulations on working memory in unrestrained rhesus monkeys. Copyright © 2015 Elsevier B.V. All rights reserved.
Lee, Chang-Hyun; Chung, Chun Kee; Kim, Chi Heon
2017-11-01
Radiofrequency denervation is commonly used for the treatment of chronic facet joint pain that has been refractory to more conservative treatments, although the evidence supporting this treatment has been controversial. We aimed to elucidate the precise effects of radiofrequency denervation in patients with low back pain originating from the facet joints relative to those obtained using control treatments, with particular attention to consistency in the denervation protocol. A meta-analysis of randomized controlled trials was carried out. Adult patients undergoing radiofrequency denervation or control treatments (sham or epidural block) for facet joint disease of the lumbar spine comprised the patient sample. Visual analog scale (VAS) pain scores were measured and stratified by response of diagnostic block procedures. We searched PubMed, Embase, Web of Science, and the Cochrane Database for randomized controlled trials regarding radiofrequency denervation and control treatments for back pain. Changes in VAS pain scores of the radiofrequency group were compared with those of the control group as well as the minimal clinically important difference (MCID) for back pain VAS. Meta-regression model was developed to evaluate the effect of radiofrequency treatment according to responses of diagnostic block while controlling for other variables. We then calculated mean differences and 95% confidence intervals (CIs) using random-effects models. We included data from seven trials involving 454 patients who had undergone radiofrequency denervation (231 patients) and control treatments such as sham or epidural block procedures (223 patients). The radiofrequency group exhibited significantly greater improvements in back pain score when compared with the control group for 1-year follow-up. Although the average improvement in VAS scores exceeded the MCID, the lower limit of the 95% CI encompassed the MCID. A subgroup of patients who responded very well to diagnostic block procedures demonstrated significant improvements in back pain relative to the control group at all times. When placed into our meta-regression model, the response to diagnostic block procedure was responsible for a statistically significant portion of treatment effect. Studies published over the last two decades revealed that radiofrequency denervation reduced back pain significantly in patients with facet joint disease compared with the MCID and control treatments. Conventional radiofrequency denervation resulted in significant reductions in low back pain originating from the facet joints in patients showing the best response to diagnostic block over the first 12 months when compared with sham procedures or epidural nerve blocks. Copyright © 2017 Elsevier Inc. All rights reserved.
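The pooling step described above can be illustrated with a short DerSimonian-Laird random-effects sketch; the mean differences and variances below are made up and are not the trial data.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of per-study mean differences (DerSimonian-Laird)."""
    effects, variances = np.asarray(effects, float), np.asarray(variances, float)
    w = 1.0 / variances                                   # fixed-effect weights
    fixed = np.sum(w * effects) / np.sum(w)
    q = np.sum(w * (effects - fixed) ** 2)                # Cochran's Q
    df = len(effects) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_star = 1.0 / (variances + tau2)
    pooled = np.sum(w_star * effects) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

# illustrative (made-up) mean differences in VAS change and their variances
md  = [-1.2, -0.8, -1.6, -0.4, -1.0, -0.9, -1.3]
var = [0.10, 0.15, 0.20, 0.12, 0.08, 0.25, 0.18]
print(dersimonian_laird(md, var))
```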
NASA Astrophysics Data System (ADS)
Wan Salleh, Masturah; Sulaiman, Hajar
2013-04-01
Technology has long been used in the teaching of mathematics at the university level, but many lecturers, especially those who have taught for many years, still opt for a traditional teaching method, that is, lecturing. One reason is that lecturers themselves have not been exposed to the available technologies and how they can assist the teaching and learning (T&L) of mathematics. GeoGebra is open and free mathematical software that has only recently been introduced in Malaysia. Compared with Cabri Geometry and Geometer's Sketchpad (GSP), which focus only on geometry, GeoGebra connects geometric, algebraic and numerical representations. Recognizing this, the researchers conducted a study to expose university lecturers to the use of GeoGebra in T&L. The research was carried out with mathematics lecturers at the Department of Computer Science and Mathematics (JSKM), Universiti Teknologi Mara (UiTM), Penang. The objective of this study was to determine whether exposure to GeoGebra software can affect the conceptual knowledge and procedural teaching of mathematics at the university level. The study combined descriptive and qualitative approaches. One session was conducted as an open workshop for all 45 lecturers, from whom four were selected as a sample using simple random sampling. Modules were used as workshop materials. In terms of conceptual knowledge, the results showed that GeoGebra is appropriate, relevant and highly effective for in-depth understanding of the selected topics. In terms of procedural aspects of teaching, it can serve as a teaching aid and considerably ease the lecturers' work.
Effect of post-curing treatment on mechanical properties of composite resins.
Almeida-Chetti, Verónica A; Macchi, Ricardo L; Iglesias, María E
2014-01-01
The aim of this study was to assess the effect of additional curing procedures on the flexural strength and modulus of elasticity of indirect and direct composite materials. Twenty-four rectangular prism-shaped 2 mm x 2 mm x 25 mm samples of Belleglass, Premisa (Kerr), Adoro and Heliomolar (Ivoclar Vivadent) were prepared. Each composite was packed into an ad-hoc stainless steel device with a Teflon instrument. A Mylar strip and a glass slab were placed on top to obtain a flat surface. Polymerization was activated for 20 seconds with a halogen unit (Astralis 10, Ivoclar Vivadent) using a soft-start regime and an output in the 350 to 1200 mW/cm2 range, at four different points according to the diameter of the end of the light guide. The specimens obtained were then randomly divided into two groups: with and without additional treatment. In the group with additional treatment, the Adoro samples were submitted to 25 minutes in a Lumamat 100 (Ivoclar Vivadent) and the rest to 20 minutes in a BelleGlass HP unit (Kerr). After the curing procedures, all samples were polished with sandpapers of decreasing grain size under water flow, and stored in distilled water for 24 h. Flexural strength was measured according to the ISO 4049 recommendations and elastic modulus was determined following the procedures of ANSI/ADA standard No. 27. Statistical differences were found among the different materials and curing procedures employed (P<0.01). The elastic modulus was significantly higher after the additional curing treatment for all materials except Premisa. Further work is needed to determine the association between the actual monomers present in the matrix and the effect of additional curing processes on the mechanical properties of both direct and indirect composites, and its clinical relevance.
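The flexural properties reported here come from three-point bend tests; a small worked example of the standard formulas (strength sigma = 3FL/(2bh^2), modulus from the load-deflection slope) is given below, with an assumed span and load since the abstract does not report them.

```python
def flexural_strength(load_N, span_mm, width_mm, height_mm):
    """Three-point bend flexural strength (MPa): sigma = 3*F*L / (2*b*h^2)."""
    return 3 * load_N * span_mm / (2 * width_mm * height_mm ** 2)

def flexural_modulus(slope_N_per_mm, span_mm, width_mm, height_mm):
    """Flexural (elastic) modulus in MPa from the slope of the load-deflection curve:
    E = (F/d) * L^3 / (4*b*h^3)."""
    return slope_N_per_mm * span_mm ** 3 / (4 * width_mm * height_mm ** 3)

# illustrative values for a 2 x 2 x 25 mm bar tested on an assumed 20 mm span and 30 N failure load
print(flexural_strength(load_N=30, span_mm=20, width_mm=2, height_mm=2))        # 112.5 MPa
print(flexural_modulus(slope_N_per_mm=50, span_mm=20, width_mm=2, height_mm=2) / 1000)  # ~6.25 GPa
```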
Alkemade, Nathan; Bowden, Stephen C; Salzman, Louis
2015-02-01
It has been suggested that MMPI-2 scoring requires removal of some items when assessing patients after a traumatic brain injury (TBI). Gass (1991. MMPI-2 interpretation and closed head injury: A correction factor. Psychological Assessment, 3, 27-31) proposed a correction procedure in line with the hypothesis that MMPI-2 endorsement may be affected by symptoms of TBI. This study assessed the validity of the Gass correction procedure. Participants were a sample of patients with a TBI (n = 242) and a random subset of the MMPI-2 normative sample (n = 1,786). The correction procedure implies a failure of measurement invariance across populations. This study examined measurement invariance of one of the MMPI-2 scales (Hs) that includes TBI correction items. A four-factor model of the MMPI-2 Hs items was defined. The factor model was found to meet the criteria for partial measurement invariance. Analysis of the change in sensitivity and specificity values implied by partial measurement invariance failed to indicate significant practical impact of partial invariance. Overall, the results support continued use of all Hs items to assess psychological well-being in patients with TBI. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
A randomized study of a method for optimizing adolescent assent to biomedical research.
Annett, Robert D; Brody, Janet L; Scherer, David G; Turner, Charles W; Dalen, Jeanne; Raissy, Hengameh
2017-01-01
Voluntary consent/assent with adolescents invited to participate in research raises challenging problems. No studies to date have attempted to manipulate autonomy in relation to assent/consent processes. This study evaluated the effects of an autonomy-enhanced individualized assent/consent procedure embedded within a randomized pediatric asthma clinical trial. Families were randomly assigned to remain together or separated during a consent/assent process; the latter we characterize as an autonomy-enhanced assent/consent procedure. We hypothesized that separating adolescents from their parents would improve adolescent assent by increasing knowledge and appreciation of the clinical trial and willingness to participate. Sixty-four adolescent-parent dyads completed procedures. The together versus separate randomization made no difference in adolescent or parent willingness to participate. However, significant differences were found in both parent and adolescent knowledge of the asthma clinical trial based on the assent/consent procedure and adolescent age. The separate assent/consent procedure improved knowledge of study risks and benefits for older adolescents and their parents but not for the younger youth or their parents. Regardless of the assent/consent process, younger adolescents had lower comprehension of information associated with the study medication and research risks and benefits, but not study procedures or their research rights and privileges. The use of an autonomy-enhanced assent/consent procedure for adolescents may improve their and their parent's informed assent/consent without impacting research participation decisions. Traditional assent/consent procedures may result in a "diffusion of responsibility" effect between parents and older adolescents, specifically in attending to key information associated with study risks and benefits.
Outcomes of Mini vs Roux-en-Y gastric bypass: A meta-analysis and systematic review.
Wang, Fu-Gang; Yan, Wen-Mao; Yan, Ming; Song, Mao-Min
2018-05-10
Numerous published studies have shown that mini gastric bypass can achieve excellent metabolic results. Compared to Roux-en-Y gastric bypass, mini gastric bypass is a technically simpler and reversible procedure. However, the comparative effectiveness of mini gastric bypass and Roux-en-Y gastric bypass remains unclear. A systematic literature search was performed in PubMed, Embase and the Cochrane Library from inception to February 9, 2018. For assessment of methodological quality, the NOS (Newcastle-Ottawa Scale) and the Cochrane Collaboration's tool for assessing risk of bias were used for cohort studies and randomized controlled trials, respectively. The meta-analysis was performed with RevMan 5.3 software. Ten cohort studies and one randomized controlled trial were included in our meta-analysis. The 10 cohort studies were rated as high quality according to the Newcastle-Ottawa Scale, and the randomized controlled trial was judged to have a low risk of bias according to the Cochrane Collaboration's assessment. Patients receiving mini gastric bypass had multiple advantageous indexes compared with patients receiving Roux-en-Y gastric bypass, including a higher 1-year EWL% (P < 0.05), higher 2-year EWL% (P < 0.05), a higher type 2 diabetes mellitus remission rate, and a shorter operation time (P < 0.05). No statistically significant difference was observed in hypertension remission rate, mortality, leakage rate, GERD rate, or hospital stay between mini gastric bypass and Roux-en-Y gastric bypass. Mini gastric bypass seems to be a simpler procedure with a better weight reduction effect; the same appears to hold for remission rates of type 2 diabetes mellitus when comparing mini gastric bypass with Roux-en-Y gastric bypass. A small sample size and biased data may have influenced the stability of our results, so surgeons should interpret them conservatively. Larger, multi-center randomized controlled trials are needed to compare the effectiveness and safety of mini gastric bypass and Roux-en-Y gastric bypass. Copyright © 2018. Published by Elsevier Ltd.
Ishwaran, Hemant; Lu, Min
2018-06-04
Random forests are a popular nonparametric tree ensemble procedure with broad applications to data analysis. While its widespread popularity stems from its prediction performance, an equally important feature is that it provides a fully nonparametric measure of variable importance (VIMP). A current limitation of VIMP, however, is that no systematic method exists for estimating its variance. As a solution, we propose a subsampling approach that can be used to estimate the variance of VIMP and for constructing confidence intervals. The method is general enough that it can be applied to many useful settings, including regression, classification, and survival problems. Using extensive simulations, we demonstrate the effectiveness of the subsampling estimator and in particular find that the delete-d jackknife variance estimator, a close cousin, is especially effective under low subsampling rates due to its bias correction properties. These 2 estimators are highly competitive when compared with the .164 bootstrap estimator, a modified bootstrap procedure designed to deal with ties in out-of-sample data. Most importantly, subsampling is computationally fast, thus making it especially attractive for big data settings. Copyright © 2018 John Wiley & Sons, Ltd.
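A rough sketch of the subsampling idea using scikit-learn permutation importance as a stand-in for VIMP; the m/n rescaling is the usual subsampling variance correction and, like the data and settings, is an assumption of this illustration rather than the authors' exact estimator.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.normal(size=(n, p))
y = 2 * X[:, 0] + X[:, 1] + rng.normal(size=n)

def vimp(X, y):
    """Variable importance (permutation importance as a stand-in for forest VIMP)."""
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    return permutation_importance(rf, X, y, n_repeats=5, random_state=0).importances_mean

theta_full = vimp(X, y)

# subsampling: recompute VIMP on B subsamples of size m << n, then rescale the spread
B, m = 25, 100
subs = np.array([vimp(X[idx], y[idx])
                 for idx in (rng.choice(n, size=m, replace=False) for _ in range(B))])
var_vimp = (m / n) * subs.var(axis=0, ddof=1)     # subsampling variance estimate

for j in range(3):
    se = np.sqrt(var_vimp[j])
    print(f"x{j}: VIMP={theta_full[j]:.3f}  "
          f"95% CI=({theta_full[j] - 1.96 * se:.3f}, {theta_full[j] + 1.96 * se:.3f})")
```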
Systematic versus random sampling in stereological studies.
West, Mark J
2012-12-01
The sampling that takes place at all levels of an experimental design must be random if the estimate is to be unbiased in a statistical sense. There are two fundamental ways by which one can make a random sample of the sections and positions to be probed on the sections. Using a card-sampling analogy, one can pick any card at all out of a deck of cards. This is referred to as independent random sampling because the sampling of any one card is made without reference to the position of the other cards. The other approach to obtaining a random sample would be to pick a card within a set number of cards and others at equal intervals within the deck. Systematic sampling along one axis of many biological structures is more efficient than random sampling, because most biological structures are not randomly organized. This article discusses the merits of systematic versus random sampling in stereological studies.
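The card analogy translates directly into code; here is a minimal numpy sketch contrasting the two schemes on, say, a stack of serial sections (the counts are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(3)
n_sections, n_sample = 120, 10                    # e.g. 120 serial sections, sample 10 of them

# independent random sampling: any 10 sections, chosen without regard to position
independent = np.sort(rng.choice(n_sections, size=n_sample, replace=False))

# systematic uniform random sampling: one random start, then every k-th section
k = n_sections // n_sample
start = rng.integers(0, k)
systematic = np.arange(start, n_sections, k)[:n_sample]

print("independent:", independent)
print("systematic :", systematic)
```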
Moore, M A; Katzgraber, Helmut G
2014-10-01
Starting from preferences on N proposed policies obtained via questionnaires from a sample of the electorate, an Ising spin-glass model in a field can be constructed from which a political party could find the subset of the proposed policies which would maximize its appeal, form a coherent choice in the eyes of the electorate, and have maximum overlap with the party's existing policies. We illustrate the application of the procedure by simulations of a spin glass in a random field on scale-free networks.
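A minimal Metropolis sketch of an Ising spin glass in a random field on a scale-free (Barabási-Albert) network; here the couplings and fields are random placeholders rather than quantities estimated from questionnaire data, and the policy interpretation is only schematic.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.barabasi_albert_graph(n=200, m=3, seed=0)           # scale-free network of policies
J = {e: rng.choice([-1.0, 1.0]) for e in G.edges}           # random couplings (placeholder)
h = rng.normal(scale=0.5, size=G.number_of_nodes())         # random external field (placeholder)
s = rng.choice([-1, 1], size=G.number_of_nodes())           # spin = adopt / reject each policy

def delta_energy(i):
    """Energy change from flipping spin i for H = -sum_ij J_ij s_i s_j - sum_i h_i s_i."""
    coupling = sum((J[(i, j)] if (i, j) in J else J[(j, i)]) * s[j] for j in G.neighbors(i))
    return 2.0 * s[i] * (coupling + h[i])

beta = 2.0                                                   # inverse temperature
for sweep in range(200):
    for i in rng.integers(0, G.number_of_nodes(), size=G.number_of_nodes()):
        dE = delta_energy(i)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):     # Metropolis acceptance
            s[i] = -s[i]

print("fraction of policies 'adopted':", (s == 1).mean())
```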
Undergraduate student mental health at Makerere University, Uganda
OVUGA, EMILIO; BOARDMAN, JED; WASSERMAN, DANUTA
2006-01-01
There is little information on the current mental health of University students in Uganda. The present study was carried out to determine the prevalence of depressed mood and suicidal ideation among students at Makerere University. Two student samples participated. Sample I comprised 253 fresh students admitted to all faculties at the University in the academic year 2000/2001, selected by a simple random sampling procedure. Sample II comprised 101 students admitted to the Faculty of Medicine during the academic year 2002/2003. The prevalence of depressed mood was measured using the 13-item Beck Depression Inventory (BDI). The prevalence of depressed mood (BDI score 10 or more) was significantly higher in sample I (16.2%) than sample II (4.0%). Sample I members were significantly more likely than those of sample II to report lifetime and past week suicide ideation. Thus, there is a high prevalence of mental health problems among the general population of new students entering Makerere University and this is significantly higher than for new students in the Faculty of Medicine. PMID:16757997
Norris, David C; Wilson, Andrew
2016-01-01
In a 2014 report on adolescent mental health outcomes in the Moving to Opportunity for Fair Housing Demonstration (MTO), Kessler et al. reported that, at 10- to 15-year follow-up, boys from households randomized to an experimental housing voucher intervention experienced 12-month prevalence of post-traumatic stress disorder (PTSD) at several times the rate of boys from control households. We reanalyze this finding here, bringing to light a PTSD outcome imputation procedure used in the original analysis, but not described in the study report. By bootstrapping with repeated draws from the frequentist sampling distribution of the imputation model used by Kessler et al., and by varying two pseudorandom number generator seeds that fed their analysis, we account for several purely statistical components of the uncertainty inherent in their imputation procedure. We also discuss other sources of uncertainty in this procedure that were not accessible to a formal reanalysis.
Calculation of power spectrums from digital time series with missing data points
NASA Technical Reports Server (NTRS)
Murray, C. W., Jr.
1980-01-01
Two algorithms are developed for calculating power spectrums from the autocorrelation function when there are missing data points in the time series. Both methods use an average sampling interval to compute lagged products. One method, the correlation function power spectrum, takes the discrete Fourier transform of the lagged products directly to obtain the spectrum, while the other, the modified Blackman-Tukey power spectrum, takes the Fourier transform of the mean lagged products. Both techniques require fewer calculations than other procedures since only 50% to 80% of the maximum lags need be calculated. The algorithms are compared with the Fourier transform power spectrum and two least squares procedures (all for an arbitrary data spacing). Examples are given showing recovery of frequency components from simulated periodic data where portions of the time series are missing and random noise has been added to both the time points and to values of the function. In addition the methods are compared using real data. All procedures performed equally well in detecting periodicities in the data.
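A simplified reading of the correlation-function approach: compute mean lagged products over the available pairs at each lag, then take a windowed cosine transform of the autocorrelation. The details below (window choice, missing-data pattern, test signal) are illustrative assumptions, not the report's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n = 1.0, 512                                    # average sampling interval and series length
t = np.arange(n) * dt
x = np.sin(2 * np.pi * 0.05 * t) + 0.3 * rng.normal(size=n)
x[rng.random(n) < 0.25] = np.nan                    # ~25% of points missing at random
x = x - np.nanmean(x)

max_lag = n // 2                                    # only ~50% of the maximum lags are needed
# mean lagged products over the pairs that are actually present at each lag
acf = np.array([np.nanmean(x[:n - k] * x[k:]) for k in range(max_lag)])

lags = np.arange(max_lag) * dt
window = np.hanning(2 * max_lag)[max_lag:]          # taper the autocorrelation (Blackman-Tukey style)
freqs = np.linspace(0.0, 0.5 / dt, max_lag)
weights = np.where(lags == 0, 1.0, 2.0)             # lag 0 counted once, other lags twice
psd = np.array([dt * np.sum(weights * window * acf * np.cos(2 * np.pi * f * lags)) for f in freqs])
print("estimated peak frequency:", freqs[np.argmax(psd)])   # expect a peak near 0.05
```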
Mixed model approaches for diallel analysis based on a bio-model.
Zhu, J; Weir, B S
1996-12-01
A MINQUE(1) procedure, which is minimum norm quadratic unbiased estimation (MINQUE) method with 1 for all the prior values, is suggested for estimating variance and covariance components in a bio-model for diallel crosses. Unbiasedness and efficiency of estimation were compared for MINQUE(1), restricted maximum likelihood (REML) and MINQUE theta which has parameter values for the prior values. MINQUE(1) is almost as efficient as MINQUE theta for unbiased estimation of genetic variance and covariance components. The bio-model is efficient and robust for estimating variance and covariance components for maternal and paternal effects as well as for nuclear effects. A procedure of adjusted unbiased prediction (AUP) is proposed for predicting random genetic effects in the bio-model. The jack-knife procedure is suggested for estimation of sampling variances of estimated variance and covariance components and of predicted genetic effects. Worked examples are given for estimation of variance and covariance components and for prediction of genetic merits.
A method to estimate statistical errors of properties derived from charge-density modelling
Lecomte, Claude
2018-01-01
Estimating uncertainties of property values derived from a charge-density model is not straightforward. A methodology, based on calculation of sample standard deviations (SSD) of properties using randomly deviating charge-density models, is proposed with the MoPro software. The parameter shifts applied in the deviating models are generated in order to respect the variance–covariance matrix issued from the least-squares refinement. This ‘SSD methodology’ procedure can be applied to estimate uncertainties of any property related to a charge-density model obtained by least-squares fitting. This includes topological properties such as critical point coordinates, electron density, Laplacian and ellipticity at critical points and charges integrated over atomic basins. Errors on electrostatic potentials and interaction energies are also available now through this procedure. The method is exemplified with the charge density of compound (E)-5-phenylpent-1-enylboronic acid, refined at 0.45 Å resolution. The procedure is implemented in the freely available MoPro program dedicated to charge-density refinement and modelling. PMID:29724964
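A minimal sketch of the SSD idea: draw parameter sets from a multivariate normal with the least-squares variance-covariance matrix, re-evaluate the derived property on each, and report the sample standard deviation; the parameter values, covariance matrix and property function below are toy stand-ins.

```python
import numpy as np

rng = np.random.default_rng(7)

# refined parameter values and their variance-covariance matrix from least squares (toy numbers)
p_hat = np.array([1.20, -0.35, 0.08])
cov = np.array([[4e-4, 1e-5, 0.0],
                [1e-5, 9e-4, 2e-5],
                [0.0,  2e-5, 1e-4]])

def derived_property(p):
    """Stand-in for a property computed from the charge-density model
    (e.g. electron density at a critical point); any deterministic function of p works."""
    return p[0] * np.exp(p[1]) + p[2] ** 2

# generate deviating models that respect the covariance and evaluate the property on each
draws = rng.multivariate_normal(p_hat, cov, size=2000)
values = np.apply_along_axis(derived_property, 1, draws)

print("property =", derived_property(p_hat).round(4),
      "+/-", values.std(ddof=1).round(4), "(SSD estimate)")
```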
Lu, Tsui-Shan; Longnecker, Matthew P.; Zhou, Haibo
2016-01-01
An outcome-dependent sampling (ODS) scheme is a cost-effective sampling scheme in which one observes the exposure with a probability that depends on the outcome. Well-known examples are the case-control design for a binary response, the case-cohort design for failure time data and the general ODS design for a continuous response. While substantial work has been done for the univariate response case, statistical inference and design for ODS with multivariate responses remain under-developed. Motivated by the need in biological studies to take advantage of the available responses for subjects in a cluster, we propose a multivariate outcome-dependent sampling (Multivariate-ODS) design that is based on a general selection of the continuous responses within a cluster. The proposed inference procedure for the Multivariate-ODS design is semiparametric, with all the underlying distributions of covariates modeled nonparametrically using empirical likelihood methods. We show that the proposed estimator is consistent and establish its asymptotic normality. Simulation studies show that the proposed estimator is more efficient than the estimator obtained using only the simple-random-sample portion of the Multivariate-ODS or the estimator from a simple random sample with the same sample size. The Multivariate-ODS design together with the proposed estimator provides an approach to further improve study efficiency for a given fixed study budget. We illustrate the proposed design and estimator with an analysis of the association of PCB exposure with hearing loss in children born into the Collaborative Perinatal Study. PMID:27966260
Rodgers, Wendy M; Hall, Craig R; Wilson, Philip M; Berry, Tanya R
2009-02-01
The purpose of this research was to examine whether exercisers and nonexercisers are rated similarly on a variety of characteristics by a sample of randomly selected regular exercisers, nonexercisers who intend to exercise, and nonexercisers with no intention to exercise. Previous research by Martin Ginis et al. (2003) has demonstrated an exerciser stereotype that advantages exercisers. It is unknown, however, to what extent an exerciser stereotype is shared by nonexercisers, particularly nonintenders. Following an item-generation procedure, a sample of 470 people (n=218 men; n=252 women) selected using random digit dialing responded to a questionnaire assessing the extent to which they agreed that exercisers and nonexercisers possessed 24 characteristics, such as "happy," "fit," "fat," and "lazy." The results strongly support a positive exerciser bias, with exercisers rated more favorably on 22 of the 24 items. The degree of bias was equivalent in all groups of respondents. Examination of the demographic characteristics revealed no differences among the three groups on age, work status, or child-care responsibilities, suggesting that there is a pervasive positive exerciser bias.
Donovan, John E.; Chung, Tammy
2015-01-01
Objective: Most studies of adolescent drinking focus on single alcohol use behaviors (e.g., high-volume drinking, drunkenness) and ignore the patterning of adolescents’ involvement across multiple alcohol behaviors. The present latent class analyses (LCAs) examined a procedure for empirically determining multiple cut points on the alcohol use behaviors in order to establish a typology of adolescent alcohol involvement. Method: LCA was carried out on six alcohol use behavior indicators collected from 6,504 7th through 12th graders who participated in Wave I of the National Longitudinal Study of Adolescent Health (AddHealth). To move beyond dichotomous indicators, a “progressive elaboration” strategy was used, starting with six dichotomous indicators and then evaluating a series of models testing additional cut points on the ordinal indicators at progressively higher points for one indicator at a time. Analyses were performed on one random half-sample, and confirmatory LCAs were performed on the second random half-sample and in the Wave II data. Results: The final model consisted of four latent classes (never or non–current drinkers, low-intake drinkers, non–problem drinkers, and problem drinkers). Confirmatory LCAs in the second random half-sample from Wave I and in Wave II support this four-class solution. The means on the four latent classes were also generally ordered on an array of measures reflecting psychosocial risk for problem behavior. Conclusions: These analyses suggest that there may be four different classes or types of alcohol involvement among adolescents, and, more importantly, they illustrate the utility of the progressive elaboration strategy for moving beyond dichotomous indicators in latent class models. PMID:25978828
Dental Procedures, Oral Practices, and Associated Anxiety: A Study on Late-teenagers
Bhola, Rahul; Malhotra, Reema
2014-01-01
Objectives The study aims to determine the degree of anxiety pertaining to dental procedures and various oral hygiene practices among college teenagers. Methods Corah's Modified Dental Anxiety Scale was administered on a randomly chosen sample of 100 Indian college students (50 males and 50 females) of Delhi University, belonging to the age group of 17–20 years. Results Descriptive statistical computations revealed 12.14 years as the mean age of first dental visit, with moderately high levels of anxiety (60.75%) for various dental procedures among the Indian teenagers and 5% lying in the “phobic or extremely anxious” category. With merely 4.16% people going for regular consultations, general check-ups evoked 78.3% anxiety and having an injection or a tooth removed was perceived as the most threatening. The sample subgroup not using mouthwash and mouthspray, smokers, and alcohol drinkers with improper oral hygiene practices experienced much higher anxiety towards routine dental procedures. Conclusion The majority of the Indian youngsters had an evasive attitude of delaying dental treatment. The core problems lay in deficient health care knowledge, lack of patient-sensitive pedagogy to train dental professionals, inaccessibility of services, and a dismissive attitude towards medical help. The feelings of fear and anxiety prevalent among the Indian youth offer significant insights into causes and preventive measures for future research and practice. Methods of education and motivation could be developed to dissipate the anxiety amongst Indian teenagers that prevent routine dental visits and maintenance of adequate oral hygiene. PMID:25379373
Implementing self sustained quality control procedures in a clinical laboratory.
Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N
2013-01-01
Quality control is an essential component of every clinical laboratory: it maintains laboratory standards, supports proper disease diagnosis and patient care, and strengthens the health care system overall. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time-consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and simple statistical tools for quality assurance. The pooled serum was prepared as per guidelines for the preparation of stabilized liquid quality control serum from human sera. Internal quality assessment was performed on this sample on a daily basis and included measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules over a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. This study highlights a simple internal quality control procedure that laboratories can perform with minimal technology, expenditure and expertise to improve the reliability and validity of test reports.
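A minimal sketch of the daily evaluation described here: z-score each result against the established mean and SD of the pooled serum and flag two of the common Westgard rules; the control limits and values are placeholders.

```python
import numpy as np

def westgard_flags(results, mean, sd):
    """Flag daily QC results on a Levey-Jennings basis using the 1-3s and 2-2s rules."""
    z = (np.asarray(results, float) - mean) / sd
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1-3s violation (random error suspected)"))
        if i >= 1 and ((z[i] > 2 and z[i - 1] > 2) or (z[i] < -2 and z[i - 1] < -2)):
            flags.append((i, "2-2s violation (systematic error suspected)"))
    return flags

# e.g. daily glucose results (mg/dL) on the in-house pooled serum (illustrative numbers)
control_mean, control_sd = 92.0, 3.0
daily = [91, 93, 90, 95, 97, 99, 99, 88, 92, 104, 91, 90]
for day, msg in westgard_flags(daily, control_mean, control_sd):
    print(f"day {day + 1}: {msg}")
```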
Adelborg, Kasper; Sundbøll, Jens; Munch, Troels; Frøslev, Trine; Sørensen, Henrik Toft; Bøtker, Hans Erik; Schmidt, Morten
2016-01-01
Objective Danish medical registries are widely used for cardiovascular research, but little is known about the data quality of cardiac interventions. We computed positive predictive values (PPVs) of codes for cardiac examinations, procedures and surgeries registered in the Danish National Patient Registry during 2010–2012. Design Population-based validation study. Setting We randomly sampled patients from 1 university hospital and 2 regional hospitals in the Central Denmark Region. Participants 1239 patients undergoing different cardiac interventions. Main outcome measure PPVs with medical record review as reference standard. Results A total of 1233 medical records (99% of the total sample) were available for review. PPVs ranged from 83% to 100%. For examinations, the overall PPV was 98%, reflecting PPVs of 97% for echocardiography, 97% for right heart catheterisation and 100% for coronary angiogram. For procedures, the PPV was 98% overall, with PPVs of 98% for thrombolysis, 92% for cardioversion, 100% for radiofrequency ablation, 98% for percutaneous coronary intervention, and 100% for both cardiac pacemakers and implantable cardiac defibrillators. For cardiac surgery, the overall PPV was 99%, encompassing PPVs of 100% for mitral valve surgery, 99% for aortic valve surgery, 98% for coronary artery bypass graft surgery, and 100% for heart transplantation. The accuracy of coding was consistent within age, sex, and calendar year categories, and the agreement between independent reviewers was high (99%). Conclusions Cardiac examinations, procedures and surgeries have high PPVs in the Danish National Patient Registry. PMID:27940630
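The PPV here is simply the number of confirmed codes divided by the number reviewed; a small sketch with a Wilson score interval follows (the counts are illustrative, not the registry data).

```python
import math

def ppv_wilson(confirmed, reviewed, z=1.96):
    """Positive predictive value with a Wilson score confidence interval."""
    p = confirmed / reviewed
    denom = 1 + z ** 2 / reviewed
    centre = (p + z ** 2 / (2 * reviewed)) / denom
    half = z * math.sqrt(p * (1 - p) / reviewed + z ** 2 / (4 * reviewed ** 2)) / denom
    return p, (centre - half, centre + half)

# illustrative: 98 of 100 reviewed PCI codes confirmed against the medical record
ppv, ci = ppv_wilson(98, 100)
print(f"PPV = {ppv:.0%}, 95% CI = ({ci[0]:.0%}, {ci[1]:.0%})")
```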
Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization
ERIC Educational Resources Information Center
Lock, Robin H.; Lock, Patti Frazer
2008-01-01
Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation based procedures in an integrated curriculum…
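A compact sketch of the two procedures on made-up data: a bootstrap percentile interval for a mean and a randomization (permutation) test for a two-group difference.

```python
import numpy as np

rng = np.random.default_rng(0)
group_a = rng.normal(10.0, 2.0, size=30)      # made-up measurements, e.g. leaf lengths
group_b = rng.normal(11.2, 2.0, size=30)

# bootstrap percentile CI for the mean of group A
boot_means = [rng.choice(group_a, size=group_a.size, replace=True).mean() for _ in range(5000)]
ci = np.percentile(boot_means, [2.5, 97.5])

# randomization (permutation) test for the difference in group means
observed = group_b.mean() - group_a.mean()
pooled = np.concatenate([group_a, group_b])
perm_diffs = []
for _ in range(5000):
    rng.shuffle(pooled)
    perm_diffs.append(pooled[30:].mean() - pooled[:30].mean())
p_value = np.mean(np.abs(perm_diffs) >= abs(observed))

print(f"bootstrap 95% CI for mean(A): ({ci[0]:.2f}, {ci[1]:.2f})")
print(f"observed difference = {observed:.2f}, permutation p = {p_value:.4f}")
```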
Fonseca, S N; Melon Kunzle, S R; Barbosa Silva, S A; Schmidt, J G; Mele, R R
1999-01-01
To describe the implementation and results of a perioperative antibiotic prophylaxis (PAP) program. A protocol for correct use of PAP was implemented in December 1994. For selected months we measured the PAP protocol compliance of a random sample of clean and clean-contaminated procedures and calculated the cost of incorrect use of PAP. SELLING: A 180-bed general hospital in Ribeirão Preto, Brazil. The cost of unnecessary PAP in the obstetric and gynecologic, cardiothoracic, and orthopedic services dropped from $4,224.54 ($23.47/procedure) in November 1994 to $1,147.24 ($6.17/procedure, January 1995), $544.42 ($3.58/procedure, May 1995), $99.06 ($0.50/procedure, August 1995), and $30 ($0.12/procedure, March 1996). In November 1994, only 13.6% of all surgical procedures were done with correct use of PAP, compared to 59% in January 1995, 73% in August 1995, 78% in March 1996, 92% in November 1996, and 98% in May 1997. Incorrect PAP use wastes resources, which is a particular problem in developing countries. Our program is simple and can be implemented without the use of computers and now is being adopted in other hospitals in our region. We credit the success of our program to the commitment of all participants and to the strong support of the hospital directors.
[Quality of the Early Cervical Cancer Detection Program in the State of Nuevo León].
Salinas-Martínez, A M; Villarreal-Ríos, E; Garza-Elizondo, M E; Fraire-Gloria, J M; López-Franco, J J; Barboza-Quintana, O
1997-01-01
To determine the quality of the Early Cervical Cancer Detection Program in the state of Nuevo León. A random selection of 4791 cytologic reports were analyzed, emitted by the State Ministry of Health, the University Hospital and the Mexican Institute for Social Security early cervical cancer detection modules. Pap tests of women with hysterectomy, current pregnancy, menopause or positive result were excluded. Quality was measured with previously defined standards. Analysis included, besides univariate statistics, tests of significance for proportions and means. The quality of the program was fairly satisfactory at the level of the State. The quality of the sampling procedure was low; 39.9% of the tests contained endocervical cells. Quality of coverage was low; 15.6% were women 25+years with first time Pap test. Quality of opportunity was high; 8.5 +/- 7 weekdays between the date of the pap smear and the interpretation date. Strategies are needed to increase the impact of the state program, such as improving the sampling procedure and the coverage quality levels.
Zhu, Ruoqing; Zeng, Donglin; Kosorok, Michael R.
2015-01-01
In this paper, we introduce a new type of tree-based method, reinforcement learning trees (RLT), which exhibits significantly improved performance over traditional methods such as random forests (Breiman, 2001) under high-dimensional settings. The innovations are three-fold. First, the new method implements reinforcement learning at each selection of a splitting variable during the tree construction processes. By splitting on the variable that brings the greatest future improvement in later splits, rather than choosing the one with largest marginal effect from the immediate split, the constructed tree utilizes the available samples in a more efficient way. Moreover, such an approach enables linear combination cuts at little extra computational cost. Second, we propose a variable muting procedure that progressively eliminates noise variables during the construction of each individual tree. The muting procedure also takes advantage of reinforcement learning and prevents noise variables from being considered in the search for splitting rules, so that towards terminal nodes, where the sample size is small, the splitting rules are still constructed from only strong variables. Last, we investigate asymptotic properties of the proposed method under basic assumptions and discuss rationale in general settings. PMID:26903687
Rogo-Gupta, Lisa; Litwin, Mark S; Saigal, Christopher S; Anger, Jennifer T
2013-07-01
To describe trends in the surgical management of female stress urinary incontinence (SUI) in the United States from 2002 to 2007. As part of the Urologic Diseases of America Project, we analyzed data from a 5% national random sample of female Medicare beneficiaries aged 65 and older. Data were obtained from the Centers for Medicare and Medicaid Services carrier and outpatient files from 2002 to 2007. Women who were diagnosed with urinary incontinence identified by the International Classification of Diseases, Ninth Edition (ICD-9) diagnosis codes and who underwent surgical management identified by Current Procedural Terminology, Fourth Edition (CPT-4) procedure codes were included in the analysis. Trends were analyzed over the 6-year period. Unweighted procedure counts were multiplied by 20 to estimate the rate among all female Medicare beneficiaries. The total number of surgical procedures remained stable during the study period, from 49,340 in 2002 to 49,900 in 2007. Slings were the most common procedure across all years, which increased from 25,840 procedures in 2002 to 33,880 procedures in 2007. Injectable bulking agents were the second most common procedure, which accounted for 14,100 procedures in 2002 but decreased to 11,320 in 2007. Procedures performed in ambulatory surgery centers and physician offices increased, although those performed in inpatient settings declined. Hospital outpatient procedures remained stable. The surgical management of women with SUI shifted toward a dominance of procedures performed in ambulatory surgery centers from 2002 to 2007, although the overall number of procedures remained stable. Slings remained the dominant surgical procedure, followed by injectable bulking agents, both of which are easily performed in outpatient settings. Copyright © 2013 Elsevier Inc. All rights reserved.
Gu, Yingxin; Wylie, Bruce K.; Boyte, Stephen; Picotte, Joshua J.; Howard, Danny; Smith, Kelcy; Nelson, Kurtis
2016-01-01
Regression tree models have been widely used for remote sensing-based ecosystem mapping. Improper use of the sample data (model training and testing data) may cause overfitting and underfitting effects in the model. The goal of this study is to develop an optimal sampling data usage strategy for any dataset and identify an appropriate number of rules in the regression tree model that will improve its accuracy and robustness. Landsat 8 data and Moderate-Resolution Imaging Spectroradiometer-scaled Normalized Difference Vegetation Index (NDVI) were used to develop regression tree models. A Python procedure was designed to generate random replications of model parameter options across a range of model development data sizes and rule number constraints. The mean absolute difference (MAD) between the predicted and actual NDVI (scaled NDVI, value from 0–200) and its variability across the different randomized replications were calculated to assess the accuracy and stability of the models. In our case study, a six-rule regression tree model developed from 80% of the sample data had the lowest MAD (training MAD = 2.5 and testing MAD = 2.4), which was suggested as the optimal model. This study demonstrates how the training data and rule number selections impact model accuracy and provides important guidance for future remote-sensing-based ecosystem modeling.
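The study's sweep over training-data sizes and rule-number constraints can be mimicked with scikit-learn, using max_leaf_nodes as a stand-in for the rule count (the original work used rule-based regression trees); the data and settings below are placeholders.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(1000, 6))                       # stand-in predictors (e.g. Landsat bands)
y = 200 * (0.4 * X[:, 0] + 0.3 * X[:, 1] ** 2) + rng.normal(scale=5, size=1000)  # scaled-NDVI-like target

def mad(a, b):
    return np.mean(np.abs(a - b))

for train_frac in (0.6, 0.8):
    for n_rules in (2, 6, 20):                        # max_leaf_nodes as a stand-in for rule count
        mads = []
        for rep in range(20):                         # randomized replications
            X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=train_frac, random_state=rep)
            tree = DecisionTreeRegressor(max_leaf_nodes=n_rules, random_state=rep).fit(X_tr, y_tr)
            mads.append(mad(y_te, tree.predict(X_te)))
        print(f"train={train_frac:.0%} rules={n_rules:2d}  "
              f"test MAD={np.mean(mads):.2f} +/- {np.std(mads):.2f}")
```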
Improved Compressive Sensing of Natural Scenes Using Localized Random Sampling
Barranca, Victor J.; Kovačič, Gregor; Zhou, Douglas; Cai, David
2016-01-01
Compressive sensing (CS) theory demonstrates that by using uniformly-random sampling, rather than uniformly-spaced sampling, higher quality image reconstructions are often achievable. Considering that the structure of sampling protocols has such a profound impact on the quality of image reconstructions, we formulate a new sampling scheme motivated by physiological receptive field structure, localized random sampling, which yields significantly improved CS image reconstructions. For each set of localized image measurements, our sampling method first randomly selects an image pixel and then measures its nearby pixels with probability depending on their distance from the initially selected pixel. We compare the uniformly-random and localized random sampling methods over a large space of sampling parameters, and show that, for the optimal parameter choices, higher quality image reconstructions can be consistently obtained by using localized random sampling. In addition, we argue that the localized random CS optimal parameter choice is stable with respect to diverse natural images, and scales with the number of samples used for reconstruction. We expect that the localized random sampling protocol helps to explain the evolutionarily advantageous nature of receptive field structure in visual systems and suggests several future research areas in CS theory and its application to brain imaging. PMID:27555464
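A minimal sketch of the localized random sampling mask: pick random center pixels, then include nearby pixels with probability decaying with distance; the Gaussian fall-off and all parameters are illustrative assumptions rather than the authors' exact protocol.

```python
import numpy as np

def localized_random_mask(shape, n_centers, radius, rng):
    """Binary sampling mask: for each randomly chosen center pixel, sample nearby pixels
    with probability exp(-d^2 / (2*radius^2)) based on their distance d to the center."""
    rows, cols = np.indices(shape)
    mask = np.zeros(shape, dtype=bool)
    for _ in range(n_centers):
        r0, c0 = rng.integers(0, shape[0]), rng.integers(0, shape[1])
        mask[r0, c0] = True
        d2 = (rows - r0) ** 2 + (cols - c0) ** 2
        prob = np.exp(-d2 / (2 * radius ** 2))
        mask |= rng.random(shape) < prob
    return mask

rng = np.random.default_rng(0)
uniform = rng.random((64, 64)) < 0.10                 # uniformly random sampling, ~10% of pixels
localized = localized_random_mask((64, 64), n_centers=40, radius=1.5, rng=rng)
print("uniform fraction:", uniform.mean().round(3), "| localized fraction:", localized.mean().round(3))
```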
2012-01-01
Background With the current focus on personalized medicine, patient/subject level inference is often of key interest in translational research. As a result, random effects models (REM) are becoming popular for patient level inference. However, for very large data sets characterized by large sample size, it can be difficult to fit REM using commonly available statistical software such as SAS, since they require inordinate amounts of computer time and memory allocations beyond what is available, preventing model convergence. For example, in a retrospective cohort study of over 800,000 Veterans with type 2 diabetes with longitudinal data over 5 years, fitting REM via generalized linear mixed modeling using currently available standard procedures in SAS (e.g. PROC GLIMMIX) was very difficult, and the same problems exist in Stata's gllamm and R's lme packages. Thus, this study proposes and assesses the performance of a meta-regression approach and compares it with methods based on sampling of the full data. Data We use both simulated and real data from a national cohort of Veterans with type 2 diabetes (n=890,394) which was created by linking multiple patient and administrative files, resulting in a cohort with longitudinal data collected over 5 years. Methods and results The outcome of interest was mean annual HbA1c measured over a 5-year period. Using this outcome, we compared parameter estimates from the proposed random effects meta-regression (REMR) with estimates based on simple random sampling and VISN (Veterans Integrated Service Networks) based stratified sampling of the full data. Our results indicate that REMR provides parameter estimates that are less likely to be biased, with tighter confidence intervals, when the VISN level estimates are homogeneous. Conclusion When the interest is to fit REM in repeated measures data with very large sample size, REMR can be used as a good alternative. It leads to reasonable inference for both Gaussian and non-Gaussian responses if parameter estimates are homogeneous across VISNs. PMID:23095325
Canullo, Luigi; Peñarrocha-Oltra, David; Marchionni, Silvia; Bagán, Leticia; Micarelli, Costanza
2014-01-01
Objectives: A randomized controlled trial was performed to assess soft tissue cell adhesion to implant titanium abutments subjected to different cleaning procedures and test if plasma cleaning can enhance cell adhesion at an early healing time. Study Design: Eighteen patients with osseointegrated and submerged implants were included. Before re-opening, 18 abutments were divided into 3 groups corresponding to different clinical conditions with different cleaning processes: no treatment (G1), laboratory customization and cleaning by steam (G2), cleaning by argon plasma (G3). Abutments were removed after 1 week and scanning electron microscopy was used to analyze cell adhesion to the abutment surface quantitatively (percentage of area occupied by cells) and qualitatively (aspect of adhered cells and presence of contaminants). Results: Mean percentages of area occupied by cells were 17.6 ± 22.7%, 16.5 ± 12.9% and 46.3 ± 27.9% for G1, G2 and G3 respectively. Differences were statistically significant between G1 and G3 (p=0.030), close to significance between G2 and G3 (p=0.056), and non-significant between G1 and G2 (p=0.530). The proportion of samples presenting adhered cells was homogeneous among the 3 groups (p-value = 1.000). In all cases cells presented a flattened aspect; in 2 cases cells were less efficiently adhered and in 1 case cells presented filopodia. Three cases showed contamination with coccobacteria. Conclusions: Within the limits of the present study, argon plasma may enhance cell adhesion to titanium abutments, even at the early stage of soft tissue healing. Further studies with larger samples are necessary to confirm these findings. Key words: Connective tissue, dental abutments, randomized controlled trial, clinical research, glow discharged abutment, plasma cleaning. PMID:24121917
Hogendoorn, E A; Westhuis, K; Dijkman, E; Heusinkveld, H A; den Boer, A C; Evers, E A; Baumann, R A
1999-10-08
The coupled-column (LC-LC) configuration consisting of a 3 microm C18 column (50 x 4.6 mm I.D.) as the first column and a 5 microm C18 semi-permeable-surface (SPS) column (150 x 4.6 mm I.D.) as the second column appeared to be successful for the screening of acidic pesticides in surface water samples. In comparison to LC-LC employing two C18 columns, the combination of C18/SPS-C18 significantly decreased the baseline deviation caused by the hump of the co-extracted humic substances when using UV detection (217 nm). The developed LC-LC procedure allowed the simultaneous determination of the target analytes bentazone and bromoxynil in uncleaned extracts of surface water samples to a level of 0.05 microg/l in less than 15 min. In combination with a simple solid-phase extraction step (200 ml of water on a 500 mg C18-bonded silica) the analytical procedure provides a high sample throughput. During a period of about five months more than 200 ditch-water samples originating from agricultural locations were analyzed with the developed procedure. Validation of the method was performed by randomly analyzing recoveries of water samples spiked at levels of 0.1 microg/l (n=10), 0.5 microg/l (n=7) and 2.5 microg/l (n=4). Weighted regression of the recovery data showed that the method provides overall recoveries of 95 and 100% for bentazone and bromoxynil, respectively, with corresponding intra-laboratory reproducibilities of 10 and 11%, respectively. Confirmation of the analytes in part of the samples extracts was carried out with GC-negative ion chemical ionization MS involving a derivatization step with bis(trifluoromethyl)benzyl bromide. No false negatives or positives were observed.
Joshi, Sonal B.; Bhagwat, S.V; Patil, Sanjana A
2016-01-01
Introduction Root Canal Treatment (RCT) has become a mainstream procedure in dentistry. A successful RCT is indicated by the absence of clinical signs and symptoms in teeth without any radiographic evidence of periodontal involvement. Completing this procedure in one visit or multiple visits has long been a topic of discussion. Aim To evaluate the incidence of postoperative pain after root canal therapy performed in a single visit versus two visits. Material and Methods An unblinded (open-label) randomized controlled trial was carried out in the endodontic department of the Dental Institute, where 78 patients were recruited from the regular pool of patients. A total of 66 maxillary central incisors requiring root canal therapy fulfilled the inclusion and exclusion criteria. Using a biased-coin randomization method, the selected patients were assigned to two groups: group A (n=33) and group B (n=33). Single-visit root canal treatment was performed for group A and two-visit root canal treatment for group B. An independent-sample t-test was used for statistical analysis. Results Thirty-three patients were allotted to group A, where endodontic treatment was completed in a single visit, while 33 patients were allotted to group B, where endodontic treatment was completed in two visits. One patient dropped out from Group A; hence, 32 patients were analysed in Group A and 33 in Group B. At 6, 12 and 24 hours after obturation, pain was significantly higher in Group B than in Group A. However, there was no significant difference in the pain experienced by the patients in the two groups 48 hours after treatment. Conclusion The incidence of pain after endodontic treatment performed in one visit or two visits is not significantly different. PMID:27437339
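A minimal sketch of biased-coin assignment of the kind mentioned here, favouring the currently under-represented arm; the 2/3 bias probability is the classic Efron choice and an assumption, since the abstract does not give details.

```python
import random

def biased_coin_assign(n_patients, p_bias=2 / 3, seed=1):
    """Efron-style biased coin: favour the under-represented arm with probability p_bias."""
    rng = random.Random(seed)
    counts = {"A": 0, "B": 0}
    assignments = []
    for _ in range(n_patients):
        if counts["A"] == counts["B"]:
            arm = rng.choice(["A", "B"])                 # arms balanced: toss a fair coin
        else:
            lagging = "A" if counts["A"] < counts["B"] else "B"
            arm = lagging if rng.random() < p_bias else ("B" if lagging == "A" else "A")
        counts[arm] += 1
        assignments.append(arm)
    return assignments, counts

arms, counts = biased_coin_assign(66)                    # 66 incisors, as in the trial
print(counts)
```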
Chehelcheraghi, Farzaneh; Abbaszadeh, Abolfazl; Tavafi, Magid
2018-03-06
Skin flap procedures are employed in plastic surgery, but failure can lead to necrosis of the flap. Studies have used bone marrow mesenchymal stem cells (BM-MSCs) to improve flap viability. BM-MSCs and acellular amniotic membrane (AAM) have been introduced as alternatives. The objective of this study was to evaluate the effect of BM-MSCs and AAM on mast cells of random skin flaps (RSF) in rats. RSFs (80 × 30 mm) were created on 40 rats that were randomly assigned to one of four groups: (I) AAM, (II) BM-MSCs, (III) BM-MSCs/AAM, and (IV) saline (control). Transplantation was carried out during the procedure (day zero). Flap necrosis was observed on day 7, and skin samples were collected from the transition line of the flap to evaluate the total number and types of mast cells. The development and the total number of mast cells were related to the development of capillaries. The results of one-way ANOVA indicated that there was no statistically significant difference between the mean numbers of mast cell types for the different study groups. However, the difference between the total numbers of mast cells in the study groups was statistically significant (p = 0.001). The present study suggests that the use of AAM/BM-MSCs can increase the total number of mast cells and accelerate the growth of capillaries at the transition site in RSFs in rats.
Yoshida, Masao; Takizawa, Kohei; Suzuki, Sho; Koike, Yoshiki; Nonaka, Satoru; Yamasaki, Yasushi; Minagawa, Takeyoshi; Sato, Chiko; Takeuchi, Chihiro; Watanabe, Ko; Kanzaki, Hiromitsu; Morimoto, Hiroyuki; Yano, Takafumi; Sudo, Kosuke; Mori, Keita; Gotoda, Takuji; Ono, Hiroyuki
2018-05-01
The aim of this study was to clarify whether dental floss clip (DFC) traction improves the technical outcomes of endoscopic submucosal dissection (ESD). A superiority, randomized controlled trial was conducted at 14 institutions across Japan. Patients with a single gastric neoplasm meeting the indications of the Japanese guidelines for gastric treatment were enrolled and assigned to receive conventional ESD or DFC traction-assisted ESD (DFC-ESD). Randomization was performed according to a computer-generated random sequence with stratification by institution, tumor location, tumor size, and operator experience. The primary endpoint was ESD procedure time, defined as the time from the start of the submucosal injection to the end of the tumor removal procedure. Between July 2015 and September 2016, 640 patients underwent randomization. Of these, 316 patients who underwent conventional ESD and 319 patients who underwent DFC-ESD were included in our analysis. The mean ESD procedure time was 60.7 and 58.1 minutes for conventional ESD and DFC-ESD, respectively (P = .45). Perforation was less frequent in the DFC-ESD group (2.2% vs 0.3%, P = .04). For lesions located in the greater curvature of the upper or middle stomach, the mean procedure time was significantly shorter in the DFC-ESD group (104.1 vs 57.2 minutes, P = .01). Our findings suggest that DFC-ESD does not result in a shorter procedure time in the overall patient population, but it can reduce the risk of perforation. When selectively applied to lesions located in the greater curvature of the upper or middle stomach, DFC-ESD provides a remarkable reduction in procedure time. Copyright © 2018 American Society for Gastrointestinal Endoscopy. Published by Elsevier Inc. All rights reserved.
Goenka, Ajit H; Remer, Erick M; Veniero, Joseph C; Thupili, Chakradhar R; Klein, Eric A
2015-09-01
The objective of our study was to review our experience with CT-guided transgluteal prostate biopsy in patients without rectal access. Twenty-one CT-guided transgluteal prostate biopsy procedures were performed in 16 men (mean age, 68 years; age range, 60-78 years) under conscious sedation. The mean prostate-specific antigen (PSA) value was 11.4 ng/mL (range, 2.3-39.4 ng/mL). Six patients had undergone seven prior unsuccessful transperineal or transurethral biopsies. Biopsy results, complications, sedation time, and radiation dose were recorded. The mean PSA values and number of core specimens were compared between patients with malignant results and patients with nonmalignant results using the Student t test. The average procedural sedation time was 50.6 minutes (range, 15-90 minutes) (n = 20), and the mean effective radiation dose was 8.2 mSv (median, 6.6 mSv; range, 3.6-19.3 mSv) (n = 13). Twenty of the 21 (95%) procedures were technically successful. The only complication was a single episode of gross hematuria and penile pain in one patient, which resolved spontaneously. Of the 20 successful biopsies, 8 (40%) yielded adenocarcinoma (Gleason score: mean, 8; range, 7-9). Twelve biopsies (60%) yielded nonmalignant results: high-grade prostatic intraepithelial neoplasia (n = 3) or benign prostatic tissue with or without inflammation (n = 9). Three patients had carcinoma diagnosed on subsequent biopsies (second biopsy, n = 2 patients; third biopsy, n = 1 patient). A malignant biopsy result was not significantly associated with the number of core specimens (p = 0.3) or the mean PSA value (p = 0.1). CT-guided transgluteal prostate biopsy is a safe and reliable technique for systematic random sampling of the prostate in patients without rectal access. In patients with initial negative biopsy results, repeat biopsy should be considered if there is a persistent rise in the PSA value.
A Randomized Study of a Method for Optimizing Adolescent Assent to Biomedical Research
Annett, Robert D.; Brody, Janet L.; Scherer, David G.; Turner, Charles W.; Dalen, Jeanne; Raissy, Hengameh
2018-01-01
Purpose Voluntary consent/assent with adolescents invited to participate in research raises challenging problems. No studies to date have attempted to manipulate autonomy in relation to assent/consent processes. This study evaluated the effects of an autonomy-enhanced individualized assent/consent procedure embedded within a randomized pediatric asthma clinical trial. Methods Families were randomly assigned to remain together or be separated during a consent/assent process, the latter of which we characterize as an autonomy-enhanced assent/consent procedure. We hypothesized that separating adolescents from their parents would improve adolescent assent by increasing knowledge and appreciation of the clinical trial and willingness to participate. Results Sixty-four adolescent-parent dyads completed the procedures. The together versus separate randomization made no difference in adolescent or parent willingness to participate. However, significant differences were found in both parent and adolescent knowledge of the asthma clinical trial based on the assent/consent procedure and adolescent age. The separate assent/consent procedure improved knowledge of study risks and benefits for older adolescents and their parents but not for younger adolescents or their parents. Regardless of the assent/consent process, younger adolescents had lower comprehension of information associated with the study medication and research risks and benefits, but not of study procedures or their research rights and privileges. Conclusions The use of an autonomy-enhanced assent/consent procedure for adolescents may improve their and their parents' informed assent/consent without impacting research participation decisions. Traditional assent/consent procedures may result in a "diffusion of responsibility" effect between parents and older adolescents, specifically in attending to key information associated with study risks and benefits. PMID:28949898
Gerson, Lauren; Stouch, Bruce; Lobonţiu, Adrian
2018-01-01
The TIF procedure has emerged as an endoscopic treatment for patients with refractory gastro-esophageal reflux disease (GERD). Previous systematic reviews of the TIF procedure conflated findings from studies with modalities that do not reflect the current 2.0 procedure technique or refined data-backed patient selection criteria. A meta-analysis was conducted using data only from randomized studies that assessed the TIF 2.0 procedure compared to a control. The purpose of the meta-analysis was to determine the efficacy and long-term outcomes associated with performance of the TIF 2.0 procedure in patients with chronic long-term refractory GERD on optimized PPI therapy, including esophageal pH, PPI utilization and quality of life. Methods: Three prospective research questions were predicated on the outcomes of the TIF procedure compared to patients who received PPI therapy or sham, concomitant treatment for GERD, and patient-reported quality of life. Event rates were calculated using the random effects model. Since the time of follow-up post-TIF procedure was variable, the analysis was performed to incorporate the time of follow-up for each individual patient at the 3-year time point. Results: Results from this meta-analysis, including data from 233 patients, demonstrated that TIF subjects at 3 years had improved esophageal pH, a decrease in PPI utilization, and improved quality of life. Conclusions: In a meta-analysis of randomized, controlled trials (RCTs), the TIF procedure in patients with GERD refractory to PPIs produced significant changes, compared with sham or PPI therapy, in esophageal pH, decreased PPI utilization, and improved quality of life.
Psychological correlates of loneliness in the older adult.
Walton, C G; Shultz, C M; Beck, C M; Walls, R C
1991-06-01
Loneliness is the emotional response to the discrepancy between desired and available relationships. As people grow old, the likelihood of experiencing age-related losses increases. Such losses may impede the maintenance or acquisition of desired relationships, resulting in a higher incidence of loneliness. This pilot study examines how loneliness relates to age-related losses, hopelessness, self-transcendence, and spiritual well-being in a convenience sample of 107 adults aged 65 years or older. The collective utility of the independent variables in predicting loneliness was investigated by means of a regression decision tree with an automatic random-subset cross-validation procedure. This procedure explained 46% of the variance. Higher scores for age-related losses and hopelessness were associated with higher loneliness scores. Higher scores for self-transcendence and existential spiritual well-being were associated with lower loneliness scores.
Chloroacetanilide herbicide metabolites in Wisconsin groundwater: 2001 survey results.
Postle, Jeffrey K; Rheineck, Bruce D; Allen, Paula E; Baldock, Jon O; Cook, Cody J; Zogbaum, Randy; Vandenbrook, James P
2004-10-15
A survey of agricultural chemicals in Wisconsin groundwater was conducted between October 2000 and April 2001 to obtain a current picture of agricultural chemicals in groundwater used for private drinking water. A stratified, random sampling procedure was used to select 336 sampling locations. Water from private drinking water wells randomly selected from within the 336 sampling locations was analyzed for 18 compounds including herbicides, herbicide metabolites, and nitrate. This report focuses on the frequency and concentration of chloroacetanilide herbicides and their metabolites. Analysis of data resulted in an estimated proportion of 38+/-5.0% of wells that contained detectable levels of a herbicide or herbicide metabolite. The most commonly detected compound was alachlor ESA with a proportion estimate of 28+/-4.6%. Other detected compounds in order of prevalence were metolachlor ESA, metolachlor OA, alachlor OA, acetochlor ESA, and parent alachlor. Estimates of the mean concentration for the detects ranged from 0.15+/-0.082 microg/L for acetochlor ESA to 1.8+/-0.60 microg/L for alachlor OA. Water quality standards have not been developed for these chloroacetanilide herbicide metabolites. The results of this survey emphasize the need for toxicological assessments of herbicide metabolite compounds and establishment of water quality standards at the state and federal levels.
Censoring approach to the detection limits in X-ray fluorescence analysis
NASA Astrophysics Data System (ADS)
Pajek, M.; Kubala-Kukuś, A.
2004-10-01
We demonstrate that the effect of detection limits in X-ray fluorescence analysis (XRF), which limits the determination of very low concentrations of trace elements and results in the appearance of so-called "nondetects", can be accounted for using the statistical concept of censoring. More precisely, the results of such measurements can be viewed as left randomly censored data, which can further be analyzed using the Kaplan-Meier method to correct the data for the presence of nondetects. Using this approach, the measured, detection-limit-censored concentrations can be interpreted in a nonparametric manner including the correction for the nondetects, i.e. the measurements in which the concentrations were found to be below the actual detection limits. Moreover, using the Monte Carlo simulation technique we show that with the Kaplan-Meier approach the corrected mean concentrations for a population of samples can be estimated to within a few percent of the simulated, uncensored values. In practice this means that the final uncertainties of the estimated mean values are limited by the number of studied samples and not by the correction procedure itself. The discussed random left-censoring approach was applied to analyze the XRF detection-limit-censored concentration measurements of trace elements in biomedical samples.
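For readers who want to experiment with this idea, the short sketch below implements the flip-then-Kaplan-Meier trick for left-censored concentration data in Python with NumPy. It is a minimal illustration of the product-limit correction for nondetects, not the analysis code used in the study; the function name, toy inputs, and the choice of flipping constant are assumptions, and the estimated mean is restricted to the observed concentration range.

```python
import numpy as np

def km_mean_left_censored(conc, detected, flip_at=None):
    """Kaplan-Meier (product-limit) mean for left-censored concentrations.
    `conc` holds the measured value when `detected` is True and the detection
    limit when it is False (a nondetect).  The data are flipped (x -> M - x)
    into a right-censored problem, the usual KM survival curve is built, its
    restricted mean is taken as the area under the curve, and the result is
    flipped back."""
    conc = np.asarray(conc, dtype=float)
    detected = np.asarray(detected, dtype=bool)
    M = flip_at if flip_at is not None else conc.max() + 1.0
    t = M - conc                      # flipped values; nondetects become right-censored
    order = np.argsort(t)
    t, d = t[order], detected[order]

    at_risk = len(t)
    s = 1.0
    times, svals = [0.0], [1.0]
    for ti, di in zip(t, d):
        if di:                        # observed (uncensored) value: KM step down
            s *= 1.0 - 1.0 / at_risk
        at_risk -= 1
        times.append(ti)
        svals.append(s)

    times, svals = np.array(times), np.array(svals)
    mean_flipped = np.sum(np.diff(times) * svals[:-1])   # area under the step curve
    return M - mean_flipped

# toy example: three detects and two nondetects reported at their detection limits
conc     = [12.0, 3.5, 0.8, 0.5, 20.0]
detected = [True, True, False, False, True]
print(km_mean_left_censored(conc, detected))
```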
Lewandowska, Dagmara W; Zagordi, Osvaldo; Geissberger, Fabienne-Desirée; Kufner, Verena; Schmutz, Stefan; Böni, Jürg; Metzner, Karin J; Trkola, Alexandra; Huber, Michael
2017-08-08
Sequence-specific PCR is the most common approach for virus identification in diagnostic laboratories. However, as specific PCR only detects pre-defined targets, novel virus strains or viruses not included in routine test panels will be missed. Recently, advances in high-throughput sequencing have allowed virus-sequence-independent identification of entire virus populations in clinical samples, yet standardized protocols are needed to allow broad application in clinical diagnostics. Here, we describe a comprehensive sample preparation protocol for high-throughput metagenomic virus sequencing using random amplification of total nucleic acids from clinical samples. In order to optimize metagenomic sequencing for application in virus diagnostics, we tested different enrichment and amplification procedures on plasma samples spiked with RNA and DNA viruses. A protocol including filtration, nuclease digestion, and random amplification of RNA and DNA in separate reactions provided the best results, allowing reliable recovery of viral genomes and a good correlation of the relative number of sequencing reads with the virus input. We further validated our method by sequencing a multiplexed viral pathogen reagent containing a range of human viruses from different virus families. Our method proved successful in detecting the majority of the included viruses with high read numbers and compared well to other protocols in the field validated against the same reference reagent. Our sequencing protocol works not only with plasma but also with other clinical samples such as urine and throat swabs. The workflow for virus metagenomic sequencing that we established proved successful in detecting a variety of viruses in different clinical samples. Our protocol supplements existing virus-specific detection strategies, providing opportunities to identify atypical and novel viruses commonly not accounted for in routine diagnostic panels.
Bovali, Efstathia; Kiliaridis, Stavros; Cornelis, Marie A
2014-12-01
The objective of this 2-arm parallel single-center trial was to compare placement time and numbers of failures of mandibular lingual retainers bonded with an indirect procedure vs a direct bonding procedure. Sixty-four consecutive patients at the postgraduate orthodontic clinic of the University of Geneva in Switzerland scheduled for debonding and mandibular fixed retainer placement were randomly allocated to either an indirect bonding procedure or a traditional direct bonding procedure. Eligibility criteria were the presence of the 4 mandibular incisors and the 2 mandibular canines, and no active caries, restorations, fractures, or periodontal disease of these teeth. The patients were randomized in blocks of 4; the randomization sequence was generated using an online randomization service (www.randomization.com). Allocation concealment was secured by contacting the sequence generator for treatment assignment; blinding was possible for outcome assessment only. Bonding time was measured for each procedure. Unpaired t tests were used to assess differences in time. Patients were recalled at 1, 2, 4, and 6 months after bonding. Mandibular fixed retainers having at least 1 composite pad debonded were considered as failures. The log-rank test was used to compare the Kaplan-Meier survival curves of both procedures. A test of proportion was applied to compare the failures at 6 months between the treatment groups. Sixty-four patients were randomized in a 1:1 ratio. One patient dropped out at baseline after the bonding procedure, and 3 patients did not attend the recalls at 4 and 6 months. Bonding time was significantly shorter for the indirect procedure (321 ± 31 seconds, mean ± SD) than for the direct procedure (401 ± 40 seconds) (per protocol analysis of 63 patients: mean difference = 80 seconds; 95% CI = 62.4-98.1; P <0.001). The 6-month numbers of failures were 10 of 31 (32%) with the indirect technique and 7 of 29 (24%) with the direct technique (log rank: P = 0.35; test of proportions: risk difference = 0.08; 95% CI = -0.15 to 0.31; P = 0.49). No serious harm was observed except for plaque accumulation. Indirect bonding was statistically significantly faster than direct bonding, with both techniques showing similar risks of failure. This trial was not registered. The protocol was not published before trial commencement. No funding or conflict of interest to be declared. Copyright © 2014 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.
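The block randomization described in this trial can be mimicked with a few lines of code. The sketch below is a generic permuted-block generator for two arms in blocks of 4, offered only as an illustration of the scheme; the actual trial used an online randomization service, and the function name, arm labels, and seed here are ours.

```python
import random

def permuted_block_sequence(n_patients, block_size=4, arms=("indirect", "direct"), seed=None):
    """Permuted-block randomization: within each block the arms appear equally
    often in random order, so treatment totals never drift apart by more than
    half a block."""
    rng = random.Random(seed)
    per_arm = block_size // len(arms)
    sequence = []
    while len(sequence) < n_patients:
        block = list(arms) * per_arm
        rng.shuffle(block)
        sequence.extend(block)
    return sequence[:n_patients]

# e.g. 64 consecutive allocations of a 1:1 trial randomized in blocks of 4
print(permuted_block_sequence(64, seed=1))
```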
Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee
2015-08-01
Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, a user interface software package developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
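RARtool itself is a MATLAB package for censored time-to-event outcomes; as a much simpler illustration of the response-adaptive idea it simulates, the sketch below implements a randomized play-the-winner urn for a binary outcome in Python. All names and the toy success probabilities are ours, and the rule shown is not one of the optimal-allocation procedures the software targets.

```python
import random

def randomized_play_the_winner(outcome, n_patients=100, seed=None):
    """Randomized play-the-winner urn: draw the next assignment from an urn,
    then add a ball for the same arm after a success and for the other arm
    after a failure, so allocation drifts toward the better-performing arm.
    `outcome(arm)` returns True for a treatment success."""
    rng = random.Random(seed)
    urn = {"A": 1, "B": 1}
    assignments = []
    for _ in range(n_patients):
        total = urn["A"] + urn["B"]
        arm = "A" if rng.random() < urn["A"] / total else "B"
        other = "B" if arm == "A" else "A"
        urn[arm if outcome(arm) else other] += 1
        assignments.append(arm)
    return assignments, urn

# toy responses: arm A succeeds 70% of the time, arm B 40%
toy_rng = random.Random(3)
toy_outcome = lambda arm: toy_rng.random() < (0.7 if arm == "A" else 0.4)
seq, urn = randomized_play_the_winner(toy_outcome, seed=2)
print(seq.count("A"), seq.count("B"), urn)
```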
Bello, Segun; Moustgaard, Helene; Hróbjartsson, Asbjørn
2014-10-01
To assess the proportion of clinical trials explicitly reporting the risk of unblinding, to evaluate the completeness of reporting on unblinding risk, and to describe the reported procedures involved in assessing unblinding. We sampled at random 300 blinded randomized clinical trials indexed in PubMed in 2010. Two authors read the trial publications and extracted data independently. Twenty-four trial publications, or 8% (95% confidence interval [CI], 5, 12%), explicitly reported the risk of unblinding, of which 16 publications, or 5% (95% CI, 3, 8%), reported compromised blinding; and 8 publications, or 3% (95% CI, 1, 5%), intact blinding. The reporting on risk of unblinding in the 24 trial publications was generally incomplete. The median proportion of assessments per trial affected by unblinding was 3% (range 1-30%). The most common mechanism for unblinding was perceptible physical properties of the treatments, for example, a difference in the taste and odor of a typhoid vaccine compared with its placebo. Published articles on randomized clinical trials infrequently reported risk of unblinding. This may reflect a tendency for avoiding reporting actual or suspected unblinding or a genuine low risk of unblinding. Copyright © 2014 Elsevier Inc. All rights reserved.
Full-field OCT for fast diagnostic of head and neck cancer
NASA Astrophysics Data System (ADS)
De Leeuw, Frederic; Casiraghi, Odile; Ben Lakhdar, Aïcha; Abbaci, Muriel; Laplace-Builhé, Corinne
2015-02-01
Full-Field OCT (FFOCT) produces optical slices of tissue using white light interferometry, providing in-depth 2D images with an isotropic resolution of around 1 micrometer. These optical biopsy images are similar to those obtained with established histological procedures, but without tissue preparation and within a few minutes. This technology could be useful when diagnosing a lesion or at the time of its surgical management. Here we evaluate the clinical value of FFOCT imaging in the management of patients with Head and Neck cancers by assessing the accuracy of diagnoses made on FFOCT images of resected specimens. FFOCT images from Head and Neck samples were first compared to the gold standard (HES-conventional histology). An image atlas dedicated to the training of pathologists was built and diagnostic criteria were identified. Then, we performed a morphological correlative study: both healthy and cancerous samples from patients who underwent Head and Neck surgery of the oral cavity, pharynx, and larynx were imaged. Images were interpreted in random order by two pathologists, and the FFOCT-based diagnoses were compared with HES (gold standard) of the same samples. Here we present preliminary results showing that FFOCT provides a quick assessment of tissue architecture at the microscopic level that could guide surgeons for tumor margin delineation during intraoperative procedures.
Template Matching for Auditing Hospital Cost and Quality
Silber, Jeffrey H; Rosenbaum, Paul R; Ross, Richard N; Ludwig, Justin M; Wang, Wei; Niknam, Bijan A; Mukherjee, Nabanita; Saynisch, Philip A; Even-Shoshan, Orit; Kelz, Rachel R; Fleisher, Lee A
2014-01-01
Objective Develop an improved method for auditing hospital cost and quality. Data Sources/Setting Medicare claims in general, gynecologic and urologic surgery, and orthopedics from Illinois, Texas, and New York between 2004 and 2006. Study Design A template of 300 representative patients was constructed and then used to match 300 patients at hospitals that had a minimum of 500 patients over a 3-year study period. Data Collection/Extraction Methods From each of 217 hospitals we chose the 300 patients most resembling the template using multivariate matching. Principal Findings The matching algorithm found close matches on procedures and patient characteristics, far more balanced than measured covariates would be in a randomized clinical trial. These matched samples displayed little to no difference across hospitals in common patient characteristics, yet revealed large and statistically significant hospital variation in mortality, complications, failure-to-rescue, readmissions, length of stay, ICU days, cost, and surgical procedure length. Similar patients at different hospitals had substantially different outcomes. Conclusion The template-matched sample can produce fair, directly standardized audits that evaluate hospitals on patients with similar characteristics, thereby making benchmarking more believable. Through examining matched samples of individual patients, administrators can better detect poor performance at their hospitals and better understand why these problems are occurring. PMID:24588413
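One simple way to select, at a given hospital, the patients closest to a fixed template is greedy Mahalanobis nearest-neighbour matching, sketched below. It is only a stand-in for the multivariate (optimal) matching the study actually performed; the function name, array shapes, and the toy covariates are assumptions.

```python
import numpy as np

def greedy_template_match(template, candidates):
    """Greedy Mahalanobis nearest-neighbour matching: for each template patient,
    take the closest not-yet-used patient at the hospital.  `template` and
    `candidates` are (n, p) and (m, p) arrays of patient covariates (m >= n)."""
    template = np.asarray(template, dtype=float)
    candidates = np.asarray(candidates, dtype=float)
    # inverse of the pooled covariance (assumed non-singular for this sketch)
    VI = np.linalg.inv(np.cov(np.vstack([template, candidates]), rowvar=False))
    available, matches = set(range(len(candidates))), []
    for t in template:
        avail = sorted(available)
        diffs = candidates[avail] - t
        d2 = np.einsum("ij,jk,ik->i", diffs, VI, diffs)   # squared Mahalanobis distances
        best = avail[int(np.argmin(d2))]
        matches.append(best)
        available.remove(best)
    return matches

# toy example: a 5-patient template matched against 20 candidates on 3 covariates
rng = np.random.default_rng(0)
print(greedy_template_match(rng.normal(size=(5, 3)), rng.normal(size=(20, 3))))
```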
47 CFR 1.957 - Procedure with respect to amateur radio operator license.
Code of Federal Regulations, 2013 CFR
2013-10-01
... AND PROCEDURE Grants by Random Selection Wireless Radio Services Applications and Proceedings Application Requirements and Procedures § 1.957 Procedure with respect to amateur radio operator license. Each... 47 Telecommunication 1 2013-10-01 2013-10-01 false Procedure with respect to amateur radio...
47 CFR 1.957 - Procedure with respect to amateur radio operator license.
Code of Federal Regulations, 2014 CFR
2014-10-01
... AND PROCEDURE Grants by Random Selection Wireless Radio Services Applications and Proceedings Application Requirements and Procedures § 1.957 Procedure with respect to amateur radio operator license. Each... 47 Telecommunication 1 2014-10-01 2014-10-01 false Procedure with respect to amateur radio...
47 CFR 1.957 - Procedure with respect to amateur radio operator license.
Code of Federal Regulations, 2012 CFR
2012-10-01
... AND PROCEDURE Grants by Random Selection Wireless Radio Services Applications and Proceedings Application Requirements and Procedures § 1.957 Procedure with respect to amateur radio operator license. Each... 47 Telecommunication 1 2012-10-01 2012-10-01 false Procedure with respect to amateur radio...
NASA Astrophysics Data System (ADS)
Ye, Su; Pontius, Robert Gilmore; Rakshit, Rahul
2018-07-01
Object-based image analysis (OBIA) has gained widespread popularity for creating maps from remotely sensed data. Researchers routinely claim that OBIA procedures outperform pixel-based procedures; however, it is not immediately obvious how to evaluate the degree to which an OBIA map compares to reference information in a manner that accounts for the fact that the OBIA map consists of objects that vary in size and shape. Our study reviews 209 journal articles concerning OBIA published between 2003 and 2017. We focus on the three stages of accuracy assessment: (1) sampling design, (2) response design and (3) accuracy analysis. First, we report the literature's overall characteristics concerning OBIA accuracy assessment. Simple random sampling was the most used method among probability sampling strategies, slightly more than stratified sampling. Office interpreted remotely sensed data was the dominant reference source. The literature reported accuracies ranging from 42% to 96%, with an average of 85%. A third of the articles failed to give sufficient information concerning accuracy methodology such as sampling scheme and sample size. We found few studies that focused specifically on the accuracy of the segmentation. Second, we identify a recent increase of OBIA articles in using per-polygon approaches compared to per-pixel approaches for accuracy assessment. We clarify the impacts of the per-pixel versus the per-polygon approaches respectively on sampling, response design and accuracy analysis. Our review defines the technical and methodological needs in the current per-polygon approaches, such as polygon-based sampling, analysis of mixed polygons, matching of mapped with reference polygons and assessment of segmentation accuracy. Our review summarizes and discusses the current issues in object-based accuracy assessment to provide guidance for improved accuracy assessments for OBIA.
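As a small concrete illustration of the per-pixel versus per-polygon distinction in the accuracy-analysis stage, the sketch below computes overall accuracy from a sample of assessment units, with optional area weights so that large polygons are not counted the same as small ones. It reflects only one of the accuracy measures discussed in the review, and the function name and toy labels are ours.

```python
import numpy as np

def overall_accuracy(map_labels, ref_labels, weights=None):
    """Overall accuracy from a sample of assessment units (pixels or polygons).
    For per-pixel assessment each unit counts equally; for per-polygon
    assessment `weights` can carry polygon areas so large objects are not
    under-weighted.  Inputs are parallel sequences of map and reference labels."""
    map_labels = np.asarray(map_labels)
    ref_labels = np.asarray(ref_labels)
    agree = (map_labels == ref_labels).astype(float)
    if weights is None:
        return float(agree.mean())
    weights = np.asarray(weights, dtype=float)
    return float(np.sum(agree * weights) / np.sum(weights))

# toy example: five sampled polygons, the largest one misclassified
print(overall_accuracy(["forest", "water", "crop", "crop", "urban"],
                       ["forest", "water", "crop", "forest", "urban"],
                       weights=[12.0, 3.0, 5.0, 40.0, 8.0]))
```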
Gomes, Heloisa Sousa; Miranda, Analya Rodrigues; Viana, Karolline Alves; Batista, Aline Carvalho; Costa, Paulo Sucasas; Daher, Anelise; Machado, Geovanna de Castro Morais; Sado-Filho, Joji; Vieira, Liliani Aires Candido; Corrêa-Faria, Patrícia; Hosey, Marie Therese; Costa, Luciane Rezende
2017-04-11
Uncooperative children may need to receive dental treatment under sedation, which is indicated when nonpharmacological behavior guidance is unsuccessful. There are randomized controlled trials (RCTs) comparing different sedative protocols for dental procedures; however, the evidence for superiority of one form over another is weak. The primary aim of this study is to investigate the efficacy of intranasally administered ketamine plus midazolam for the dental treatment of children. We have designed a three-armed, parallel RCT to assess intranasal sedation using ketamine/midazolam in terms of the following measures: efficacy, safety, and cost-effectiveness. Two- to 6-year-old healthy children, referred for dental treatment in a dental sedation center in Brazil due to uncooperative behavior and requiring restorative dental procedures, will be recruited. Each child will be randomly assigned to one of the three groups: A - Intranasal administration of ketamine (4.0 mg/kg, maximum 100 mg) and midazolam (0.2 mg/kg, maximum 5.0 mg); B - Oral administration of ketamine (4.0 mg/kg, maximum 100 mg) and midazolam (0.5 mg/kg, maximum 20 mg); and C - Oral administration of midazolam (1.0 mg/kg, maximum 20 mg). The primary outcome is the child's behavior assessed through an observational scale using digital videos of the restorative dental treatment under sedation. The secondary outcomes are as follows: acceptance of sedative administration; memory of intraoperative events; the child's stress; adverse events; the child's pain during the procedure; the parent's, dentists', and child's perceptions of sedation; and economic analysis. Measures will be taken at baseline and drug administration and during and after the dental procedure. The necessary sample size was estimated to be 84 children after a blinded interim analysis of the first 30 cases. This study will provide data that can substantially add to science and pediatric dentistry as it examines the effect of sedative regimes from different perspectives (outcomes). ClinicalTrials.gov, identifier: NCT02447289 . Registered on 11 May 2015, named "Midazolam and Ketamine Effect Administered Through the Nose for Sedation of Children for Dental Treatment (NASO)."
Amstadter, Ananda B; Koenen, Karestan C; Ruggiero, Kenneth J; Acierno, Ron; Galea, Sandro; Kilpatrick, Dean G; Gelernter, Joel
2009-01-01
This study examined whether rs4606, a single nucleotide polymorphism (SNP) in the translated region at the 3' end of RGS2, was related to suicidal ideation in an epidemiologic sample of adults living in areas affected by the 2004 Florida hurricanes. An epidemiologic sample of residents of Florida was recruited via random digit-dial procedures after the 2004 Florida hurricanes; participants were interviewed about suicidal ideation, hurricane exposure, and social support. Participants who returned buccal DNA samples via mail (n = 607) were included here. Rs4606 in RGS2 was associated with increased symptoms of current suicidal ideation (p < 0.01). Each "C" allele was associated with 5.59 times increased risk of having current ideation. No gene-by-environment interactions were found, perhaps due to low power. RGS2 rs4606 is related to risk of current suicidal ideation in stressor-exposed adults.
Failure to replicate depletion of self-control.
Xu, Xiaomeng; Demos, Kathryn E; Leahey, Tricia M; Hart, Chantelle N; Trautvetter, Jennifer; Coward, Pamela; Middleton, Kathryn R; Wing, Rena R
2014-01-01
The limited resource or strength model of self-control posits that the use of self-regulatory resources leads to depletion and poorer performance on subsequent self-control tasks. We conducted four studies (two with community samples, two with young adult samples) utilizing a frequently used depletion procedure (crossing out letters protocol) and the two most frequently used dependent measures of self-control (handgrip perseverance and modified Stroop). In each study, participants completed a baseline self-control measure, a depletion or control task (randomized), and then the same measure of self-control a second time. There was no evidence for significant depletion effects in any of these four studies. The null results obtained in four attempts to replicate using strong methodological approaches may indicate that depletion has more limited effects than implied by prior publications. We encourage further efforts to replicate depletion (particularly among community samples) with full disclosure of positive and negative results.
Dezhdar, Shahin; Jahanpour, Faezeh; Firouz Bakht, Saeedeh; Ostovar, Afshin
2016-04-01
Hospitalized premature babies often undergo various painful procedures. Kangaroo mother care (KMC) and swaddling are two pain reduction methods. This study was undertaken to compare the effects of swaddling and KMC on pain during venous sampling in premature neonates. This study was performed as a randomized clinical trial on 90 premature neonates. The neonates were divided into three groups using random allocation blocks. The three groups were group A (swaddling), group B (KMC), and group C (control). In all three groups, the heart rate and arterial oxygen saturation were measured and recorded at time intervals of 30 seconds before, during, and 30, 60, 90, and 120 seconds after blood sampling. The neonate's face was video recorded and assessed using the premature infant pain profile (PIPP) at time intervals of 30 seconds. The data were analyzed using the t-test, chi-square test, repeated-measures analysis of variance (ANOVA), Kruskal-Wallis, post hoc, and Bonferroni tests. The findings revealed that pain was reduced to a great extent with the swaddling and KMC methods compared to the control group. However, there was no significant difference between KMC and swaddling (P ≥ 0.05). The results of this study indicate that there is no meaningful difference between swaddling and KMC in physiological indexes and pain in neonates. Therefore, the swaddling method may be a good substitute for KMC.
Cuing effects for informational masking
NASA Astrophysics Data System (ADS)
Richards, Virginia M.; Neff, Donna L.
2004-01-01
The detection of a tone added to a random-frequency, multitone masker can be very poor even when the maskers have little energy in the frequency region of the signal. This paper examines the effects of adding a pretrial cue to reduce uncertainty for the masker or the signal. The first two experiments examined the effect of cuing a fixed-frequency signal as the number of masker components and presentation methods were manipulated. Cue effectiveness varied across observers, but could reduce thresholds by as much as 20 dB. Procedural comparisons indicated observers benefited more from having two masker samples to compare, with or without a signal cue, than having a single interval with one masker sample and a signal cue. The third experiment used random-frequency signals and compared no-cue, signal-cue, and masker-cue conditions, and also systematically varied the time interval between cue offset and trial onset. Thresholds with a cued random-frequency signal remained higher than for a cued fixed-frequency signal. For time intervals between the cue and trial of 50 ms or longer, thresholds were approximately the same with a signal or a masker cue and lower than when there was no cue. Without a cue or with a masker cue, analyses of possible decision strategies suggested observers attended to the potential signal frequencies, particularly the highest signal frequency. With a signal cue, observers appeared to attend to the frequency of the subsequent signal.
Effect of different bleaching strategies on microhardness of a silorane-based composite resin.
Bahari, Mahmoud; Savadi Oskoee, Siavash; Mohammadi, Narmin; Ebrahimi Chaharom, Mohammad Esmaeel; Godrati, Mostafa; Savadi Oskoee, Ayda
2016-01-01
Background. Dentists' awareness of the effects of bleaching agents on the surface and mechanical properties of restorative materials is of utmost importance. Therefore, this in vitro study was undertaken to investigate the effects of different bleaching strategies on the microhardness of a silorane-based composite resin. Methods. Eighty samples of a silorane-based composite resin (measuring 4 mm in diameter and 2 mm in thickness) were prepared within acrylic molds. The samples were polished and randomly assigned to 4 groups (n=20). The samples in group 1 (control) were stored in distilled water for 2 weeks. The samples in group 2 underwent a bleaching procedure with 15% carbamide peroxide for two hours daily over two weeks. The samples in group 3 were bleached with 35% hydrogen peroxide twice, 5 days apart, for 30 minutes each time. The samples in group 4 underwent a bleaching procedure with light-activated 35% hydrogen peroxide under LED light once for 40 minutes. The microhardness of the samples was then determined using the Vickers method. Data were analyzed with one-way ANOVA and post hoc Tukey tests (P < 0.05). Results. All the bleaching agents significantly decreased microhardness compared to the control group (P < 0.05). In addition, there were significant differences in microhardness between groups 2 and 4 (P = 0.001) and between groups 3 and 4 (P < 0.001). However, no significant differences were detected in microhardness between groups 2 and 3 (P > 0.05). Conclusion. Bleaching agents decreased the microhardness of silorane-based composite resin restorations, the magnitude of the decrease depending on the bleaching strategy used.
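The statistical workflow named in the Methods (one-way ANOVA followed by post hoc Tukey tests) can be reproduced with standard Python libraries, as sketched below. The hardness values shown are invented placeholders, not the study's data, and the group labels and sample counts are ours.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical Vickers microhardness readings (n=5 per group here for brevity;
# the study used n=20).  Values are illustrative only.
control   = [62.1, 60.8, 63.0, 61.5, 62.7]
cp_15     = [55.2, 54.1, 56.0, 53.8, 55.5]   # 15% carbamide peroxide
hp_35     = [54.9, 53.6, 55.8, 54.0, 54.4]   # 35% hydrogen peroxide
hp_35_led = [50.3, 49.1, 51.2, 48.7, 50.0]   # light-activated 35% hydrogen peroxide

f_stat, p_value = f_oneway(control, cp_15, hp_35, hp_35_led)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

values = np.concatenate([control, cp_15, hp_35, hp_35_led])
groups = ["control"] * 5 + ["15% CP"] * 5 + ["35% HP"] * 5 + ["35% HP + LED"] * 5
print(pairwise_tukeyhsd(values, groups, alpha=0.05))   # post hoc pairwise comparisons
```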
Shannon, Casey P; Chen, Virginia; Takhar, Mandeep; Hollander, Zsuzsanna; Balshaw, Robert; McManus, Bruce M; Tebbutt, Scott J; Sin, Don D; Ng, Raymond T
2016-11-14
Gene network inference (GNI) algorithms can be used to identify sets of coordinately expressed genes, termed network modules, from whole-transcriptome gene expression data. The identification of such modules has become a popular approach to systems biology, with important applications in translational research. Although diverse computational and statistical approaches have been devised to identify such modules, their performance behavior is still not fully understood, particularly in complex human tissues. Given human heterogeneity, one important question is how sensitive the outputs of these computational methods are to the input sample set, that is, their stability. A related question is how this sensitivity depends on the size of the sample set. We describe here the SABRE (Similarity Across Bootstrap RE-sampling) procedure for assessing the stability of gene network modules using a re-sampling strategy, introduce a novel criterion for identifying stable modules, and demonstrate the utility of this approach in a clinically relevant cohort, using two different gene network module discovery algorithms. The stability of modules increased as sample size increased, and stable modules were more likely to be replicated in larger sets of samples. Random modules derived from permuted gene expression data were consistently unstable, as assessed by SABRE, and provide a useful baseline value for our proposed stability criterion. Gene module sets identified by different algorithms varied with respect to their stability, as assessed by SABRE. Finally, stable modules were more readily annotated in various curated gene set databases. The SABRE procedure and proposed stability criterion may provide guidance when designing systems biology studies in complex human disease and tissues.
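A minimal re-sampling stability check in the spirit of SABRE can be sketched as follows: modules are re-derived on bootstrap re-samples of the sample set and compared with the full-data modules through the adjusted Rand index. This is not the authors' implementation or their exact criterion (SABRE compares modules across re-samplings); the clustering choice, module count, and function names below are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.metrics import adjusted_rand_score

def module_labels(expr, n_modules=10):
    """Cluster genes (rows of `expr`, samples in columns) into modules by
    average-linkage hierarchical clustering of 1 - correlation distance."""
    dist = 1.0 - np.corrcoef(expr)
    iu = np.triu_indices_from(dist, k=1)          # condensed distance vector
    Z = linkage(dist[iu], method="average")
    return fcluster(Z, t=n_modules, criterion="maxclust")

def bootstrap_stability(expr, n_boot=50, n_modules=10, seed=0):
    """Resample samples with replacement, re-derive modules, and compare them
    with the full-data modules via the adjusted Rand index (mean over boots)."""
    rng = np.random.default_rng(seed)
    reference = module_labels(expr, n_modules)
    scores = []
    for _ in range(n_boot):
        cols = rng.integers(0, expr.shape[1], expr.shape[1])   # bootstrap sample set
        scores.append(adjusted_rand_score(reference, module_labels(expr[:, cols], n_modules)))
    return float(np.mean(scores))

# toy example: 200 genes x 40 samples of random expression data (expect low stability)
expr = np.random.default_rng(1).normal(size=(200, 40))
print(bootstrap_stability(expr, n_boot=20))
```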
Gupta, Alisha; Agarwala, Sandeep; Sreenivas, Vishnubhatla; Srinivas, Madhur; Bhatnagar, Veereshwar
2017-01-01
Females with Krickenbeck low-type anorectal malformations - vestibular fistula (VF) and perineal fistula (PF) - are managed either by a primary definitive or a conventional three-staged approach. The ultimate outcome in these children may be affected by wound dehiscence leading to healing by fibrosis. Most of the literature favors one approach over the other based on retrospective analysis of outcomes. Whether a statistically significant difference in wound dehiscence rates exists between these approaches remained to be determined. A randomized controlled trial for girls <14 years with VF or PF was conducted. Random number tables were used to randomize 33 children to Group I (primary procedure) and 31 to Group II (three-staged procedure). Statistical analysis was done for significance of differences (P < 0.05) in the primary outcome (wound dehiscence) and secondary outcomes (immediate and early postoperative complications). Of the 64 children randomized, 54 (84%) had VF. Both groups were comparable in demography, clinical profile and age at surgery. The incidence of wound dehiscence (39.4% vs. 18.2%; P = 0.04), immediate postoperative complications (51.5% vs. 12.9%; P = 0.001), and early postoperative complications (42.4% vs. 12.9%; P = 0.01) was significantly higher in Group I than in Group II. Six of 13 children (46.2%) with dehiscence in Group I required a diverting colostomy. Females with VF or PF undergoing a primary definitive procedure have a significantly higher incidence of wound dehiscence (P = 0.04) and of immediate (P = 0.001) and early (P = 0.01) postoperative complications.
Fluoride content in table salt distributed in Mexico City, Mexico.
Hernández-Guerrero, Juan Carlos; de la Fuente-Hernández, Javier; Jiménez-Farfán, Maria Dolores; Ledesma-Montes, Constantino; Castañeda-Castaneira, Enrique; Molina-Frechero, Nelly; Jacinto-Alemán, Luís Fernando; Juárez-Lopez, Lilia Adriana; Moreno-Altamirano, Alejandra
2008-01-01
The aim of this study was to analyze table salt available in Mexico City's market to identify the fluoride concentrations and to compare these with the Mexican regulations. We analyzed 44 different brands of table salt. All samples were purchased at random in different stores, supermarkets, and groceries from Mexico City's metropolitan area and analyzed in triplicate in three different laboratories (nine determinations per sample) with an Orion 720 A potentiometer and an Orion 9609 BN ion-specific electrode. Fluoride concentration in the samples varied from 0 ppm to 485 ppm. It was found that fluoride concentration varied widely among the analyzed brands. Also, we found that fluoride concentration in 92 percent of the analyzed samples did not match with that printed on the label. Only 6.8 percent of the analyzed samples contained fluoride concentrations that meet Mexican and WHO regulations. The broad variation in the analyzed samples suggests that Mexican Public Health authorities must implement more stringent regulation guidelines and procedures for controlling the distribution of salt and its fluoride concentration for human consumption.
Increased cognitive load enables unlearning in procedural category learning.
Crossley, Matthew J; Maddox, W Todd; Ashby, F Gregory
2018-04-19
Interventions for drug abuse and other maladaptive habitual behaviors may yield temporary success but are often fragile, and relapse is common. This implies that current interventions do not erase or substantially modify the representations that support the underlying addictive behavior; that is, they do not cause true unlearning. One example of an intervention that fails to induce true unlearning comes from Crossley, Ashby, and Maddox (2013, Journal of Experimental Psychology: General), who reported that a sudden shift to random feedback did not cause unlearning of category knowledge obtained through procedural systems, and whose results suggest that this failure occurs because random feedback is noncontingent on behavior. These results imply the existence of a mechanism that (a) estimates feedback contingency and (b) protects procedural learning from modification when feedback contingency is low (i.e., during random feedback). This article reports the results of an experiment in which increasing cognitive load via an explicit dual task during the random feedback period facilitated unlearning. This result is consistent with the hypothesis that the mechanism that protects procedural learning when feedback contingency is low depends on executive function. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
40 CFR 89.413 - Raw sampling procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Raw sampling procedures. 89.413 Section 89.413 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... Test Procedures § 89.413 Raw sampling procedures. Follow these procedures when sampling for gaseous...
40 CFR 89.413 - Raw sampling procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Raw sampling procedures. 89.413 Section 89.413 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... Test Procedures § 89.413 Raw sampling procedures. Follow these procedures when sampling for gaseous...
Kuznetsova, Olga M; Tymofyeyev, Yevgen
2014-04-30
In open-label studies, partial predictability of permuted block randomization provides potential for selection bias. To lessen the selection bias in two-arm studies with equal allocation, a number of allocation procedures that limit the imbalance in treatment totals at a pre-specified level but do not require the exact balance at the ends of the blocks were developed. In studies with unequal allocation, however, the task of designing a randomization procedure that sets a pre-specified limit on imbalance in group totals is not resolved. Existing allocation procedures either do not preserve the allocation ratio at every allocation or do not include all allocation sequences that comply with the pre-specified imbalance threshold. Kuznetsova and Tymofyeyev described the brick tunnel randomization for studies with unequal allocation that preserves the allocation ratio at every step and, in the two-arm case, includes all sequences that satisfy the smallest possible imbalance threshold. This article introduces wide brick tunnel randomization for studies with unequal allocation that allows all allocation sequences with imbalance not exceeding any pre-specified threshold while preserving the allocation ratio at every step. In open-label studies, allowing a larger imbalance in treatment totals lowers selection bias because of the predictability of treatment assignments. The applications of the technique in two-arm and multi-arm open-label studies with unequal allocation are described. Copyright © 2013 John Wiley & Sons, Ltd.
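To make the idea of bounding imbalance under unequal allocation concrete, here is a simplified sketch (ours, not the brick tunnel or wide brick tunnel algorithm itself): at each step, arms whose selection would push the maximum deviation from the target counts above a chosen threshold are excluded, and the remaining arms are drawn in proportion to their target weights. Unlike the procedures described in the article, this naive rule does not guarantee that the unconditional allocation ratio is preserved at every allocation.

```python
import random

def imbalance_bounded_randomization(n, weights, threshold, seed=None):
    """Simplified imbalance-bounded randomization for unequal allocation.
    `weights` maps each arm to its target allocation proportion (summing to 1);
    after k assignments the imbalance is max_i |n_i - k * w_i|, and any arm
    whose selection would exceed `threshold` is excluded from the next draw."""
    rng = random.Random(seed)
    arms = list(weights)
    counts = {a: 0 for a in arms}
    sequence = []
    for k in range(1, n + 1):
        feasible = []
        for a in arms:
            trial = dict(counts)
            trial[a] += 1
            if max(abs(trial[b] - k * weights[b]) for b in arms) <= threshold:
                feasible.append(a)
        if not feasible:              # threshold too tight for this step; ignore it once
            feasible = arms
        choice = rng.choices(feasible, weights=[weights[a] for a in feasible])[0]
        counts[choice] += 1
        sequence.append(choice)
    return sequence

# toy example: 2:1 allocation with imbalance kept within 1.5
print(imbalance_bounded_randomization(12, {"A": 2 / 3, "B": 1 / 3}, threshold=1.5, seed=7))
```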
Autoshaping of chlordiazepoxide drinking in non-deprived rats.
Tomie, Arthur; Wong, Lauren E; Pohorecky, Larissa A
2005-02-28
Effects of autoshaping procedures (Paired versus Random) and sipper fluid [chlordiazepoxide (CDP) versus water] on sipper-directed drinking were evaluated in 32 male Long-Evans rats maintained with free access to food and water. For the Paired/CDP group (n = 16), autoshaping procedures consisted of the presentation of the CDP sipper conditioned stimulus (CS) followed by the response-independent presentation of the food unconditioned stimulus (US). The concentration of CDP in the sipper CS (0.05, 0.10, 0.15, 0.20, and 0.25 mg/ml CDP) was increased across sessions. The Paired/Water group (n = 8) received only water in the sipper CS. The Random/CDP group (n = 8) received the CDP sipper CS and food US randomly with respect to one another. The Paired/CDP group drank significantly more of the 0.20 mg/ml and 0.25 mg/ml CDP solutions than the Random/CDP control, and more fluid than the Paired/Water control group when the sipper CS for the Paired/CDP group contained the three highest concentrations of CDP. CS-Only extinction procedures reliably reduced sipper CS-directed drinking in the Paired/CDP and the Paired/Water groups, but not in the Random/CDP group. Data are consistent with the hypothesis that Pavlovian autoshaping procedures induce sipper CS-directed drinking of CDP in rats deprived of neither food nor fluid. Implications for the autoshaping model of drug abuse are discussed.
Lu, Tsui-Shan; Longnecker, Matthew P; Zhou, Haibo
2017-03-15
Outcome-dependent sampling (ODS) is a cost-effective sampling scheme in which one observes the exposure with a probability that depends on the outcome. Well-known such designs are the case-control design for a binary response, the case-cohort design for failure time data, and the general ODS design for a continuous response. While substantial work has been carried out for the univariate response case, statistical inference and design for the ODS with multivariate cases remain under-developed. Motivated by the need in biological studies to take advantage of the available responses for subjects in a cluster, we propose a multivariate outcome-dependent sampling (multivariate-ODS) design that is based on a general selection of the continuous responses within a cluster. The proposed inference procedure for the multivariate-ODS design is semiparametric, with all the underlying distributions of covariates modeled nonparametrically using empirical likelihood methods. We show that the proposed estimator is consistent and derive its asymptotic normality. Simulation studies show that the proposed estimator is more efficient than the estimator obtained using only the simple-random-sample portion of the multivariate-ODS or the estimator from a simple random sample with the same sample size. The multivariate-ODS design, together with the proposed estimator, provides an approach to further improve study efficiency for a given fixed study budget. We illustrate the proposed design and estimator with an analysis of the association of polychlorinated biphenyl exposure with hearing loss in children born to the Collaborative Perinatal Study. Copyright © 2016 John Wiley & Sons, Ltd.
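The selection side of an ODS design can be illustrated in a few lines of code: a simple-random-sample portion is supplemented by clusters whose response summary falls in the tails of the outcome distribution. This sketch covers only the sampling step, not the semiparametric empirical-likelihood estimation proposed in the paper; the cutoffs, sample sizes and function name are assumptions, and any analysis of such a sample must account for the biased selection.

```python
import numpy as np

def ods_sample(cluster_means, n_srs, n_low, n_high, cutoffs, seed=0):
    """Outcome-dependent sampling sketch: draw a simple-random-sample portion,
    then supplement it with clusters whose continuous response summary falls in
    the low or high tail defined by `cutoffs = (low, high)`.  `cluster_means`
    has one response summary per cluster; selected cluster indices are returned."""
    rng = np.random.default_rng(seed)
    idx = np.arange(len(cluster_means))
    srs = rng.choice(idx, size=n_srs, replace=False)

    low_pool = np.setdiff1d(idx[cluster_means < cutoffs[0]], srs)
    high_pool = np.setdiff1d(idx[cluster_means > cutoffs[1]], srs)
    low = rng.choice(low_pool, size=min(n_low, len(low_pool)), replace=False)
    high = rng.choice(high_pool, size=min(n_high, len(high_pool)), replace=False)
    return np.concatenate([srs, low, high])

# toy example: 200 clusters, oversampling the tails below -1 and above +1
means = np.random.default_rng(1).normal(size=200)
print(len(ods_sample(means, n_srs=40, n_low=15, n_high=15, cutoffs=(-1.0, 1.0))))
```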
Hawkins, Alesia Oscea; Danielson, Carla Kmett; de Arellano, Michael A; Hanson, Rochelle F; Ruggiero, Kenneth J; Smith, Daniel W; Saunders, Benjamin E; Kilpatrick, Dean G
2010-08-01
Limited research has examined whether similar patterns in injurious spanking and other forms of child physical abuse (CPA) exist across specific ethnic/racial groups. The authors examined and compared differences in the lifetime prevalence of injurious spanking and CPA in two national samples of adolescents across ethnic/racial groups and over time. Participants were 4,023 youth (12-17 years) and 3,614 youth (12-17 years) who participated in the 1995 National Survey of Adolescents (NSA) and 2005 National Survey of Adolescents-Replication (NSA-R), respectively. Adolescents, who were identified through random digit dial procedures, completed a telephone interview assessment. Results indicated significant ethnic/racial variation across groups in reports of injurious spanking in the NSA and the NSA-R samples; however, significant differences were not observed within groups between the two samples over time. Ethnic/racial differences also were found between groups in reports of CPA in the NSA-R sample. Limitations and future directions of this research are discussed.
Statistical inference for the additive hazards model under outcome-dependent sampling.
Yu, Jichang; Liu, Yanyan; Sandler, Dale P; Zhou, Haibo
2015-09-01
Cost-effective study designs and proper inference procedures for data from such designs are always of particular interest to study investigators. In this article, we propose a biased sampling scheme, an outcome-dependent sampling (ODS) design, for survival data with right censoring under the additive hazards model. We develop a weighted pseudo-score estimator for the regression parameters for the proposed design and derive the asymptotic properties of the proposed estimator. We also provide some suggestions for using the proposed method by evaluating the relative efficiency of the proposed method against a simple random sampling design and derive the optimal allocation of the subsamples for the proposed design. Simulation studies show that the proposed ODS design is more powerful than other existing designs and the proposed estimator is more efficient than other estimators. We apply our method to analyze a cancer study conducted at NIEHS, the Cancer Incidence and Mortality of Uranium Miners Study, to study the risk of cancer associated with radon exposure.
Cosmetic procedures among youths: a survey of junior college and medical students in Singapore.
Ng, Jia Hui; Yeak, Seth; Phoon, Natalie; Lo, Stephen
2014-08-01
Although cosmetic procedures have become increasingly popular among the younger population in recent years, limited research on this subject has been done in the Asian context. We aimed to explore the views and knowledge regarding cosmetic procedures among junior college (JC) and medical students in Singapore. In the first phase of the study, a cross-sectional, self-administered survey of 1,500 JC students aged 16-21 years from six JCs was conducted in 2010. The same survey was then conducted on a random sample of Year 2-5 medical students from an undergraduate medical school in 2011. In total, 1,164 JC and 241 medical students responded to the surveys. There was an overall female to male ratio of 1.3:1. Of all the respondents, 2.5% of the JC students and 3.0% of the medical students admitted to having undergone cosmetic procedures. Among those who claimed to have never had cosmetic procedures done, 9.0% and 44.0% of the JC and medical students, respectively, responded that they would consider such procedures in the future. Those who disapproved of their peers undergoing cosmetic surgery comprised 35.0% of JC students and 56.8% of medical students. Among the JC and medical students, 52.0% and 36.1%, respectively, were unaware of any risks associated with cosmetic procedures. The younger population is increasingly accepting of cosmetic procedures. However, there is a general lack of understanding of the risks associated with such procedures. Education of both the general public and medical students may help prevent potential medicolegal issues.
Karoly, Paul; Ruehlman, Linda S.
2005-01-01
A heterogeneous national sample of adults (mean age = 40 years) employed in management positions was contacted by random digit dialing procedures and interviewed about current pain experience, work-goal cognitions, and psychological status (depression and anxiety). In accord with predictions, persistent pain experience was differentially related to the construal of work-related goals. Specifically, individuals with both persistent and episodic pain (relative to those with no pain) reported lower levels of goal-centered value, self-efficacy, and positive arousal and heightened perceptions of goal-based self-criticism, negative arousal, and conflict between work and nonwork goals. Furthermore, regression analyses revealed that goal cognition accounted for unique variance in depression and anxiety over and above the contribution of pain chronicity. PMID:8891717
Dissociation and serenity induction.
Zoellner, Lori A; Sacks, Matthew B; Foa, Edna B
2007-09-01
Dissociation is a common experience during or immediately after a traumatic event; yet, most of the current knowledge regarding dissociation is retrospective in nature. The present study investigated a non-pharmacological method of dissociative induction with a clinical sample. Participants with PTSD and non-trauma-exposed participants were randomly assigned to receive either a dissociative induction or a serenity induction, based on modified Velten mood induction procedures. Participants receiving the dissociative induction reported higher state dissociation than those receiving the serenity induction. The PTSD group reported greater state dissociation than the non-trauma-exposed group, regardless of induction. State dissociation was related to trait dissociation, PTSD severity, and depression. The present results provide an initial demonstration of the viability of inducing state dissociation in the laboratory with a PTSD sample.
Minnesota dentists׳ attitudes toward the dental therapist workforce model.
Blue, Christine M; Rockwood, Todd; Riggs, Sheila
2015-06-01
The purpose of this study was to evaluate dentists' attitudes and perceptions toward dental therapists, a new licensed dental provider in Minnesota. This study employed mixed modes to administer a survey using a stratified random sample of 1000 dentists in Minnesota. The response rate was 55% (AAPOR RR1: n=551/999). Results showed a majority of dentists were opposed to dental therapists performing irreversible procedures. In addition, results identified perceived barriers to hiring a dental therapist and found dentists do not believe dental therapists will alleviate oral health disparity in the State. Published by Elsevier Inc.
Machine learning prediction for classification of outcomes in local minimisation
NASA Astrophysics Data System (ADS)
Das, Ritankar; Wales, David J.
2017-01-01
Machine learning schemes are employed to predict which local minimum will result from local energy minimisation of random starting configurations for a triatomic cluster. The input data consists of structural information at one or more of the configurations in optimisation sequences that converge to one of four distinct local minima. The ability to make reliable predictions, in terms of the energy or other properties of interest, could save significant computational resources in sampling procedures that involve systematic geometry optimisation. Results are compared for two energy minimisation schemes, and for neural network and quadratic functions of the inputs.
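A toy version of this classification task can be set up with scikit-learn as below: a small neural network is trained to predict which of four basins a starting configuration will relax into, using structural features as inputs. The data-generating rule, feature choice, and network size here are invented for illustration and are unrelated to the triatomic cluster results reported in the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in data: each row holds structural features of a starting
# configuration (e.g. three interparticle distances), and `basin` labels which
# of four local minima the optimisation converged to.  Real inputs would come
# from recorded optimisation sequences, not this toy rule.
rng = np.random.default_rng(0)
X = rng.uniform(0.8, 3.0, size=(2000, 3))
basin = (X[:, 0] > 1.9).astype(int) * 2 + (X[:, 1] > X[:, 2]).astype(int)  # toy labels 0-3

X_train, X_test, y_train, y_test = train_test_split(X, basin, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```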
The quality of the new birth certificate data: a validation study in North Carolina.
Buescher, P A; Taylor, K P; Davis, M H; Bowling, J M
1993-01-01
A random sample of 395 December 1989 North Carolina birth certificates and the corresponding maternal hospital medical records were examined to validate selected items. Reporting was very accurate for birth weight, Apgar score, and method of delivery; fair to good for tobacco use, prenatal care, weight gain during pregnancy, obstetrical procedures, and events of labor and delivery; and poor for medical history and alcohol use. This study suggests that many of the new birth certificate items will support valid aggregate analyses for maternal and child health research and evaluation. PMID:8342728
Health-related quality-of-life as co-primary endpoint in randomized clinical trials in oncology.
Fiteni, Frédéric; Pam, Alhousseiny; Anota, Amélie; Vernerey, Dewi; Paget-Bailly, Sophie; Westeel, Virginie; Bonnetain, Franck
2015-01-01
Overall survival (OS) has been considered the most relevant primary endpoint, but trials using OS often require large numbers of patients and long-term follow-up. Composite endpoints, which can be assessed earlier, are therefore frequently used as primary endpoints, but they suffer from important limitations, especially a lack of validation as surrogates of OS. Health-related quality of life (HRQoL) could therefore be considered as an outcome with which to judge the efficacy of a treatment. An alternative approach would be to combine HRQoL with composite endpoints as co-primary endpoints to ensure a clinical benefit to patients from a new therapy. The decision rules of such a design, the procedure to control the type I error, and the determination of sample size remain open questions. Here, we discuss HRQoL as a co-primary endpoint in randomized clinical trials in oncology and provide some solutions to promote such designs.
7 CFR 42.121 - Sampling and inspection procedures.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Sampling and inspection procedures. 42.121 Section 42... REGULATIONS STANDARDS FOR CONDITION OF FOOD CONTAINERS Skip Lot Sampling and Inspection Procedures § 42.121 Sampling and inspection procedures. (a) Following skip lot procedure authorization, inspect every lot...
7 CFR 42.121 - Sampling and inspection procedures.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Sampling and inspection procedures. 42.121 Section 42... REGULATIONS STANDARDS FOR CONDITION OF FOOD CONTAINERS Skip Lot Sampling and Inspection Procedures § 42.121 Sampling and inspection procedures. (a) Following skip lot procedure authorization, inspect every lot...
Cozzani, Mauro; Delucchi, Alessia; Barreca, Carlo; Rinchuse, Daniel J.; Servetto, Roberto; Calevo, Maria Grazia; Piras, Vincenzo
2016-01-01
Objectives: To assess the effects of a follow-up text message and a telephone call after bonding on participants' self-reported level of pain. Materials and methods: Eighty-four participants were randomly assigned to one of three trial arms. Randomization was performed by the Department of Epidemiology and Biostatistics of IRCCS G.Gaslini. Participants were enrolled from patients with a permanent dentition who were beginning fixed, non-extraction treatment at the Orthodontic Department, Gaslini Hospital. Participants completed baseline questionnaires to assess their levels of pain prior to treatment. After the initial appointment, participants completed a pain questionnaire at the same time, daily, for 7 days. The first group, which served as the control, did not receive any post-procedure communication; the second group received a structured text message; and the third group received a structured telephone call. Participants were blinded to group assignment. Limitations: A larger sample size should have been considered in order to increase the generalizability of this study's results. Results: Participants in both the telephone call group and the text message group reported lower levels of pain than participants in the control group, with a larger and more consistent effect for the telephone call group. Most participants reported a higher level of pain during the first 48 hours post-bonding. Analgesic consumption correlated significantly with the level of pain during the previous 24 hours. Female participants appeared to be more sensitive to pain than male participants. Conclusions: A telephone follow-up after orthodontic treatment may be an effective procedure to reduce participants' level of pain. Protocol: The research protocol was approved by the Italian Comitato Etico Regionale della Liguria-sezione 3^ c/o IRCCS- Istituto G.Gaslini 845/2014. Registration: 182 Reg 2014, 16/09/2014 Comitato Etico Regione Liguria, Sez.3. PMID:26070922
Cozzani, Mauro; Ragazzini, Giulia; Delucchi, Alessia; Barreca, Carlo; Rinchuse, Daniel J; Servetto, Roberto; Calevo, Maria Grazia; Piras, Vincenzo
2016-06-01
To assess the effects of a follow-up text message and a telephone call after bonding on participants' self-reported level of pain. Eighty-four participants were randomly assigned to one of three trial arms. Randomization was performed by the Department of Epidemiology and Biostatistics of IRCCS G.Gaslini. Participants were enrolled from patients with a permanent dentition who were beginning fixed, non-extraction treatment at the Orthodontic Department, Gaslini Hospital. Participants completed baseline questionnaires to assess their levels of pain prior to treatment. After the initial appointment, participants completed a pain questionnaire at the same time, daily, for 7 days. The first group, which served as the control, did not receive any post-procedure communication; the second group received a structured text message; and the third group received a structured telephone call. Participants were blinded to group assignment. A larger sample size should have been considered in order to increase the generalizability of this study's results. Participants in both the telephone call group and the text message group reported lower levels of pain than participants in the control group, with a larger and more consistent effect for the telephone call group. Most participants reported a higher level of pain during the first 48 hours post-bonding. Analgesic consumption correlated significantly with the level of pain during the previous 24 hours. Female participants appeared to be more sensitive to pain than male participants. A telephone follow-up after orthodontic treatment may be an effective procedure to reduce participants' level of pain. The research protocol was approved by the Italian Comitato Etico Regionale della Liguria-sezione 3^ c/o IRCCS- Istituto G.Gaslini 845/2014. 182 Reg 2014, 16/09/2014 Comitato Etico Regione Liguria, Sez.3. © The Author 2015. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com.
The Effect of Empowerment and Educational Programs on the Quality of Life in Iranian Women with HIV.
Moghadam, Zahra Behboodi; Rezaei, Elham; Sharifi, Bahareh; Nejat, Saharnaz; Saeieh, Sara Esmaelzadeh; Khiaban, Maryam Ordibeheshti
2018-01-01
AIDS affects physical, mental, social, and psychological health status. One of the goals of Health for All in the 21st century is to improve quality of life. This study is a randomized clinical trial conducted on 120 HIV-positive women. Women were administered assessment questionnaires to be completed during a structured interview. After sample collection, participants were divided randomly into 3 groups using a table of random numbers and then received, respectively, an educational intervention, an empowerment program, or the routine procedures offered by the center; the questionnaires were completed again 12 weeks after the intervention. Depending on the type of data, chi-square tests, analysis of variance, and paired t tests were used, and SPSS version 16 was used for data analysis. The findings showed that knowledge increased after the intervention in the educational (P = .02) and empowerment (P = .006) groups; the empowerment group also showed significant improvements in the psychological (P = .006) and spiritual (P = .001) domains and in total quality of life (P = .004). According to this study, exposing HIV-positive women to empowerment education is effective in improving their quality of life.
Weintraub, W S; Becker, E R; Mauldin, P D; Culler, S; Kosinski, A S; King, S B
2000-10-01
The Emory Angioplasty versus Surgery Trial (EAST) was a randomized trial that compared, by intention to treat, the clinical outcome and costs of percutaneous transluminal coronary angioplasty (PTCA) and coronary bypass grafting (CABG) for multivessel coronary artery disease. We present the findings of the economic analysis of EAST through 8 years of follow-up and compare the cost and outcomes of patients randomized in EAST versus patients eligible but not randomized (registry patients). Charges were assessed from hospital UB82 and UB92 bills and professional charges from the Emory Clinic. Hospital charges were reduced to cost through step-down accounting methods. All costs and charges were inflated to 1997 dollars. Costs were assessed for initial hospitalization and for cumulative costs of the initial hospitalization and additional revascularization procedures up to 8 years. Total 8-year costs were $46,548 for CABG and $44,491 for PTCA (p = 0.37). Cost of CABG in the eligible registry group showed a pattern similar to that for randomized patients, but total cost of PTCA was lower for registry patients than for randomized patients. Thus, the primary procedural costs of CABG are more than those for PTCA; this cost advantage, given the limits of measurement, is largely or even completely lost for randomized patients over the course of 8 years because of additional procedures after a first revascularization by PTCA.
Medvedovici, Andrei; Udrescu, Stefan; Albu, Florin; Tache, Florentin; David, Victor
2011-09-01
Liquid-liquid extraction of target compounds from biological matrices, followed by injection of a large volume of the organic layer onto a chromatographic column operated under reversed-phase (RP) conditions, combines the selectivity and straightforward character of the procedure with enhanced sensitivity, compared with the usual approach involving solvent evaporation and residue re-dissolution. Large-volume injection of samples in diluents that are not miscible with the mobile phase was recently introduced in chromatographic practice. The risk of random errors produced during the manipulation of samples is also substantially reduced. A bioanalytical method designed for the bioequivalence of fenspiride-containing pharmaceutical formulations was based on a sample preparation procedure involving extraction of the target analyte and the internal standard (trimetazidine) from alkalinized plasma samples into 1-octanol. A volume of 75 µl from the octanol layer was directly injected onto a Zorbax SB C18 Rapid Resolution column (50 mm length × 4.6 mm internal diameter × 1.8 µm particle size), with the RP separation carried out under gradient elution conditions. Detection was performed by positive-mode ESI and MS/MS. Aspects related to method development and validation are discussed. The bioanalytical method was successfully applied to assess the bioequivalence of a modified-release pharmaceutical formulation containing 80 mg fenspiride hydrochloride in two different studies: a single-dose administration under fasting and fed conditions (four arms) and a multiple-dose administration, respectively. The quality attributes of the bioanalytical method, as demonstrated by its application to the bioequivalence studies, show that sample preparation based on large-volume injection of immiscible diluents has considerable potential for application in bioanalysis.
Tosri, Nisanuch; Rojanasthien, Noppamas; Srichairatanakool, Somdet; Sangdee, Chaichan
2013-01-01
The objective of this study was to determine the pharmacokinetics of caffeine after single administration of a coffee enema versus coffee consumed orally in healthy male subjects. The study design was an open-label, randomized, two-phase crossover study. Eleven healthy subjects were randomly assigned either to receive 500 mL of coffee enema for 10 minutes or to consume 180 mL of ready-to-drink coffee beverage. After a washout period of at least 10 days, all subjects were switched to receive the alternate coffee procedure. Blood samples were collected immediately before and at specific time points until 12 hours after coffee administration in each phase. The mean caffeine content of the coffee solution prepared for the coffee enema and of the ready-to-drink coffee beverage was not statistically different. The Cmax and AUC of caffeine obtained from the coffee enema were significantly lower, by a factor of about 3.5, than those of the coffee consumed orally, despite a slightly but significantly earlier Tmax. The t1/2 of caffeine did not differ significantly between the two coffee procedures. In summary, the relative bioavailability of caffeine obtained from the coffee enema was about 3.5 times lower than that of the coffee consumed orally. PMID:23533801
Empirical Tests of Acceptance Sampling Plans
NASA Technical Reports Server (NTRS)
White, K. Preston, Jr.; Johnson, Kenneth L.
2012-01-01
Acceptance sampling is a quality control procedure applied as an alternative to 100% inspection. A random sample of items is drawn from a lot to determine the fraction of items which have a required quality characteristic. Both the number of items to be inspected and the criterion for determining conformance of the lot to the requirement are given by an appropriate sampling plan with specified risks of Type I and Type II sampling errors. In this paper, we present the results of empirical tests of the accuracy of selected sampling plans reported in the literature. These plans are for measurable quality characteristics which are known to have binomial, exponential, normal, gamma, Weibull, inverse Gaussian, or Poisson distributions. In the main, results support the accepted wisdom that variables acceptance plans are superior to attributes (binomial) acceptance plans, in the sense that they provide comparable protection against risks at reduced sampling cost. For the Gaussian and Weibull plans, however, there are ranges of the shape parameters for which the required sample sizes are in fact larger than for the corresponding attributes plans, dramatically so for instances of large skew. Tests further confirm that the published inverse-Gaussian (IG) plan is flawed, as reported by White and Johnson (2011).
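As a concrete illustration of this kind of empirical check, the sketch below simulates an attributes (binomial) acceptance sampling plan and compares the estimated probability of acceptance with the exact binomial operating-characteristic value. The plan parameters (n, c) and the quality levels are illustrative, not values from the paper.

```python
# Minimal sketch: empirical check of an attributes (binomial) acceptance sampling plan.
import numpy as np
from scipy.stats import binom

n, c = 50, 2                 # inspect n items, accept the lot if <= c defectives are found
p_aql, p_ltpd = 0.01, 0.10   # illustrative "good" and "bad" lot fractions defective

rng = np.random.default_rng(1)
n_trials = 100_000

def empirical_accept_prob(p):
    defects = rng.binomial(n, p, size=n_trials)   # defectives found in each simulated sample
    return np.mean(defects <= c)

for p in (p_aql, p_ltpd):
    exact = binom.cdf(c, n, p)                    # exact operating-characteristic value
    print(f"p={p:.2f}  empirical P(accept)={empirical_accept_prob(p):.3f}  exact={exact:.3f}")

# Type I risk (rejecting a good lot) is 1 - OC(p_aql); Type II risk (accepting a bad lot)
# is OC(p_ltpd).
```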
Nonuniform sampling theorems for random signals in the linear canonical transform domain
NASA Astrophysics Data System (ADS)
Shuiqing, Xu; Congmei, Jiang; Yi, Chai; Youqiang, Hu; Lei, Huang
2018-06-01
Nonuniform sampling can be encountered in various practical processes because of random events or poor timebase. The analysis and applications of nonuniform sampling for deterministic signals related to the linear canonical transform (LCT) have been well studied, but until now no papers have been published on nonuniform sampling theorems for random signals related to the LCT. The aim of this article is to explore the nonuniform sampling and reconstruction of random signals associated with the LCT. First, some special nonuniform sampling models are briefly introduced. Second, based on these models, reconstruction theorems for random signals from various nonuniform samples associated with the LCT are derived. Finally, simulation results are presented to verify the accuracy of the sampling theorems. In addition, potential practical applications of nonuniform sampling for random signals are also discussed.
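The sketch below illustrates only the classical Fourier-domain special case (to which the LCT reduces for particular parameter choices), not the LCT theorems derived in the paper: a bandlimited signal is reconstructed from jittered, nonuniformly spaced samples by least squares on a shifted-sinc basis. The bandwidth, jitter model, and test signal are arbitrary assumptions.

```python
# Minimal sketch (classical Fourier case only): least-squares reconstruction of a
# bandlimited signal from jittered, nonuniformly spaced samples using a sinc basis.
import numpy as np

rng = np.random.default_rng(2)
B = 4.0                         # assumed bandwidth (Hz); Nyquist interval is 1/(2B)
T = 1.0 / (2 * B)
n = 80

# Test signal: a random mixture of sinusoids with frequencies strictly below B.
freqs = rng.uniform(0.5, B - 0.5, size=5)
amps = rng.normal(size=5)
def signal(t):
    return (amps[None, :] * np.cos(2 * np.pi * np.outer(t, freqs))).sum(axis=1)

# Nonuniform sample times: the uniform grid perturbed by bounded random jitter.
t_grid = np.arange(n) * T
t_samples = t_grid + rng.uniform(-0.3 * T, 0.3 * T, size=n)
y = signal(t_samples)

# Least-squares fit of coefficients on the shifted-sinc basis anchored at the uniform grid.
A = np.sinc((t_samples[:, None] - t_grid[None, :]) / T)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Evaluate the reconstruction away from the edges (the truncated basis degrades near them).
t_fine = np.linspace(0.2 * t_grid[-1], 0.8 * t_grid[-1], 500)
recon = np.sinc((t_fine[:, None] - t_grid[None, :]) / T) @ coef
print(f"max interior reconstruction error: {np.max(np.abs(recon - signal(t_fine))):.3e}")
```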
Lentz, Robert J; Argento, A Christine; Colby, Thomas V; Rickman, Otis B; Maldonado, Fabien
2017-07-01
Transbronchial lung biopsy with a cryoprobe, or cryobiopsy, is a promising new bronchoscopic biopsy technique capable of obtaining larger and better-preserved samples than previously possible using traditional biopsy forceps. Over two dozen case series and several small randomized trials are now available describing experiences with this technique, largely for the diagnosis of diffuse parenchymal lung disease (DPLD), in which the reported diagnostic yield is typically 70% to 80%. Cryobiopsy technique varies widely between centers and this predominantly single center-based retrospective literature heterogeneously defines diagnostic yield and complications, limiting the degree to which this technique can be compared between centers or to surgical lung biopsy (SLB). This review explores the broad range of cryobiopsy techniques currently in use, their rationale, the current state of the literature, and suggestions for the direction of future study into this promising but unproven procedure.
Are all data created equal?--Exploring some boundary conditions for a lazy intuitive statistician.
Lindskog, Marcus; Winman, Anders
2014-01-01
The study investigated potential effects of the presentation order of numeric information on retrospective subjective judgments of descriptive statistics of this information. The studies were theoretically motivated by the assumption in the naïve sampling model of independence between temporal encoding order of data in long-term memory and retrieval probability (i.e. as implied by a "random sampling" from memory metaphor). In Experiment 1, participants experienced Arabic numbers that varied in distribution shape/variability between the first and the second half of the information sequence. Results showed no effects of order on judgments of mean, variability or distribution shape. To strengthen the interpretation of these results, Experiment 2 used a repeated judgment procedure, with an initial judgment occurring prior to the change in distribution shape of the information half-way through data presentation. The results of Experiment 2 were in line with those from Experiment 1, and in addition showed that the act of making explicit judgments did not impair accuracy of later judgments, as would be suggested by an anchoring and insufficient adjustment strategy. Overall, the results indicated that participants were very responsive to the properties of the data while at the same time being more or less immune to order effects. The results were interpreted as being in line with the naïve sampling models in which values are stored as exemplars and sampled randomly from long-term memory.
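A minimal simulation of the "random sampling from memory" assumption illustrates why such a model predicts no order effects: exemplars are stored in encoding order, but each judgment is based on a random sample of stored exemplars, so the estimate tracks the data as a whole. The distributions, sample size, and judgment rule below are illustrative choices, not the naïve sampling model's exact parameters.

```python
# Minimal sketch of the "random sampling from memory" idea: judgments based on random
# retrieval are insensitive to whether the low- or high-variability half came first.
import numpy as np

rng = np.random.default_rng(3)

def experienced_sequence(order):
    low = rng.normal(10, 1, size=50)      # low-variability half of the presented numbers
    high = rng.normal(10, 4, size=50)     # high-variability half
    return np.concatenate([low, high]) if order == "low_first" else np.concatenate([high, low])

def judged_sd(memory, k=15, n_judgments=5000):
    # Each simulated judgment uses a random sample of k stored exemplars
    # (drawn with replacement, for simplicity), ignoring encoding order.
    idx = rng.integers(0, len(memory), size=(n_judgments, k))
    return memory[idx].std(axis=1, ddof=1).mean()

for order in ("low_first", "high_first"):
    seq = experienced_sequence(order)
    print(f"{order}: judged SD = {judged_sd(seq):.2f}  (true SD = {seq.std(ddof=1):.2f})")
```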
Sorrentino, G; Fumagalli, M; Milani, S; Cortinovis, I; Zorz, A; Cavallaro, G; Mosca, F; Plevani, L
2017-07-01
The heel stick is the method of choice in most neonatal units for capillary blood sampling, and it represents the most common event among all painful procedures performed on newborns. The type and design of the heel stick device and the clinical procedure used to collect a blood sample may also have an impact on the newborn's pain response. To compare the pain response and efficiency of different automated devices for capillary blood collection in newborns. Randomized clinical trial. Postnatal ward of a tertiary-care university hospital in Italy. Newborn infants at gestational age ≥34 weeks undergoing the metabolic screening test after the 49th hour of life. A total of 762 neonates were recruited and randomized into 6 groups (127 babies in each group) assigned to 6 different capillary blood collection devices (Ames Minilet™ Lancet; Cardinal Health Gentleheel®; Natus Medical NeatNick™; BD Quikheel™ Lancet; Vitrex Steriheel® Baby Lancet; Accriva Diagnostics Tenderfoot®). The following data were collected and assessed for each of the 6 groups evaluated: a) number of heel sticks, b) pain score according to the Neonatal Infant Pain Scale (NIPS) and c) need to squeeze the heel. The Ames Minilet™ Lancet performed by far the worst of the devices under examination: it required the highest number of sticks (mean=3.91; 95% CI: 3.46-4.36), evoked the most intense pain (mean=3.98; 95% CI: 3.77-4.20), and most frequently necessitated squeezing the heel (92.9%; 95% CI: 86.9-96.3). The other five devices appeared to be similar in terms of the number of sticks required, but differed slightly in NIPS score and in the need to squeeze the heel. The Accriva Diagnostics Tenderfoot® device demonstrated the greatest efficiency for blood sampling and evoked the least pain. With this device, the metabolic screening test could be performed with a single skin incision in the large majority of infants (98.4%), heel squeezing was limited to only 6.3% of infants, and the NIPS score was lower than with the other devices in our study (1.22; 95% CI 1.05-1.39). Copyright © 2017 Elsevier Ltd. All rights reserved.
Hassan, A; Wahba, A; Haggag, H
2016-01-01
Which is better, Tramadol or Celecoxib, in reducing pain associated with outpatient hysteroscopy? Both Tramadol and Celecoxib are effective in reducing pain associated with outpatient hysteroscopy, but Celecoxib may be better tolerated. Pain is the most common cause of failure of outpatient hysteroscopy. A systematic review and meta-analysis showed that local anaesthetics were effective in reducing pain associated with hysteroscopy, but there was insufficient evidence to support the use of oral analgesics, opioids and non-steroidal anti-inflammatory drugs to reduce hysteroscopy-associated pain, and further studies were recommended. This was a randomized double-blind placebo-controlled trial with balanced randomization (allocation ratio 1:1:1) conducted in a university hospital from May 2014 to November 2014. Two hundred and ten women who had diagnostic outpatient hysteroscopy were randomly divided into three equal groups: group 1 received oral Tramadol 100 mg, group 2 received Celecoxib 200 mg and group 3 received an oral placebo. All the drugs were given 1 h before the procedure. The patients' perception of pain was assessed during the procedure, immediately afterwards and 30 min after the procedure with the use of a visual analogue scale (VAS). There was a significant difference in the pain scores among the groups during the procedure, immediately afterwards and 30 min after the procedure (P < 0.001, 0.001, < 0.001, respectively). Tramadol had significantly lower pain scores when compared with the placebo during the procedure (mean difference = 1.54, 95% confidence interval (CI) (0.86, 2.22), P < 0.001), immediately after the procedure (mean difference = 1.09; 95% CI (0.5, 1.68), P < 0.001) and 30 min later (mean difference = 0.95, 95% CI (0.48, 1.41), P < 0.001). Celecoxib administration also led to significantly lower pain scores than the placebo during the procedure (mean difference = 1.28, 95% CI (0.62, 1.94), P < 0.001), immediately after the procedure (mean difference = 0.72; 95% CI (0.13, 1.32), P = 0.016) and 30 min later (mean difference = 0.77, 95% CI (0.3, 1.24), P = 0.001). There were no significant differences in pain scores between Tramadol and Celecoxib at any time. Time until no pain differed significantly among the groups (P = 0.01); it was shorter in both the Tramadol and Celecoxib groups when compared with placebo (P = 0.002 and 0.046, respectively). The procedure failed to be completed in one patient in the placebo group, but no failure to complete the procedure occurred in the Tramadol or Celecoxib groups. Four women in the Tramadol group reported nausea, but no side effects were reported in the Celecoxib group and no complications were reported in any group of patients. All results were based on the subjective perception of pain, which varies among individuals and is related to the individual's previous pain experience and level of anxiety. Tramadol and Celecoxib are effective in reducing pain in outpatient hysteroscopy. Celecoxib may be better tolerated as no side effects were reported in the study; however, further research on a larger sample size is required before drawing firm conclusions about the lack of side effects. This research did not receive any specific grant from any funding agency in the public, commercial or not-for-profit sector. All authors declare no conflict of interest. www.clinicaltrials.gov - NCT02071303. © The Author 2015. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
49 CFR 219.605 - Positive drug test results; procedures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Random Alcohol and Drug Testing Programs § 219.605 Positive drug test results; procedures. (a) [Reserved] (b) Procedures for administrative... 49 Transportation 4 2012-10-01 2012-10-01 false Positive drug test results; procedures. 219.605...
49 CFR 219.605 - Positive drug test results; procedures.
Code of Federal Regulations, 2010 CFR
2010-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Random Alcohol and Drug Testing Programs § 219.605 Positive drug test results; procedures. (a) [Reserved] (b) Procedures for administrative... 49 Transportation 4 2010-10-01 2010-10-01 false Positive drug test results; procedures. 219.605...
49 CFR 219.605 - Positive drug test results; procedures.
Code of Federal Regulations, 2011 CFR
2011-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Random Alcohol and Drug Testing Programs § 219.605 Positive drug test results; procedures. (a) [Reserved] (b) Procedures for administrative... 49 Transportation 4 2011-10-01 2011-10-01 false Positive drug test results; procedures. 219.605...
49 CFR 219.605 - Positive drug test results; procedures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Random Alcohol and Drug Testing Programs § 219.605 Positive drug test results; procedures. (a) [Reserved] (b) Procedures for administrative... 49 Transportation 4 2013-10-01 2013-10-01 false Positive drug test results; procedures. 219.605...
49 CFR 219.605 - Positive drug test results; procedures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION CONTROL OF ALCOHOL AND DRUG USE Random Alcohol and Drug Testing Programs § 219.605 Positive drug test results; procedures. (a) [Reserved] (b) Procedures for administrative... 49 Transportation 4 2014-10-01 2014-10-01 false Positive drug test results; procedures. 219.605...
Utility-based designs for randomized comparative trials with categorical outcomes
Murray, Thomas A.; Thall, Peter F.; Yuan, Ying
2016-01-01
A general utility-based testing methodology for design and conduct of randomized comparative clinical trials with categorical outcomes is presented. Numerical utilities of all elementary events are elicited to quantify their desirabilities. These numerical values are used to map the categorical outcome probability vector of each treatment to a mean utility, which is used as a one-dimensional criterion for constructing comparative tests. Bayesian tests are presented, including fixed sample and group sequential procedures, assuming Dirichlet-multinomial models for the priors and likelihoods. Guidelines are provided for establishing priors, eliciting utilities, and specifying hypotheses. Efficient posterior computation is discussed, and algorithms are provided for jointly calibrating test cutoffs and sample size to control overall type I error and achieve specified power. Asymptotic approximations for the power curve are used to initialize the algorithms. The methodology is applied to re-design a completed trial that compared two chemotherapy regimens for chronic lymphocytic leukemia, in which an ordinal efficacy outcome was dichotomized and toxicity was ignored to construct the trial’s design. The Bayesian tests also are illustrated by several types of categorical outcomes arising in common clinical settings. Freely available computer software for implementation is provided. PMID:27189672
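The core computation can be sketched as follows: under Dirichlet-multinomial conjugacy, each arm's posterior over the categorical outcome probabilities is again Dirichlet, and the posterior probability that one arm has a higher mean utility is easily obtained by Monte Carlo. The utilities, prior, and outcome counts below are illustrative, and the sketch omits the group-sequential machinery and the cutoff/sample-size calibration described in the paper.

```python
# Minimal sketch of the utility-based comparison: Dirichlet posteriors on each arm's
# categorical outcome probabilities, mapped to mean utilities.  All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(4)

utilities = np.array([0.0, 40.0, 70.0, 100.0])     # elicited utility of each elementary outcome
prior = np.array([0.25, 0.25, 0.25, 0.25])         # weak Dirichlet prior (effective size 1)
counts_A = np.array([20, 15, 10, 5])               # observed outcome counts, arm A
counts_B = np.array([12, 13, 14, 11])              # observed outcome counts, arm B

n_draws = 100_000
theta_A = rng.dirichlet(prior + counts_A, size=n_draws)   # posterior draws of outcome probabilities
theta_B = rng.dirichlet(prior + counts_B, size=n_draws)

mean_util_A = theta_A @ utilities                  # mean utility implied by each posterior draw
mean_util_B = theta_B @ utilities

post_prob = np.mean(mean_util_B > mean_util_A)
print(f"P(mean utility B > mean utility A | data) = {post_prob:.3f}")
# A fixed-sample Bayesian test would compare this posterior probability with a cutoff
# calibrated to control the overall type I error.
```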
Drew, L.J.; Attanasi, E.D.; Schuenemeyer, J.H.
1988-01-01
If observed oil and gas field size distributions are obtained by random sampling, the fitted distributions should approximate that of the parent population of oil and gas fields. However, empirical evidence strongly suggests that larger fields tend to be discovered earlier in the discovery process than they would be by random sampling. Economic factors also can limit the number of small fields that are developed and reported. This paper examines observed size distributions in state and federal waters of offshore Texas. Results of the analysis demonstrate how the shape of the observable size distributions changes with significant hydrocarbon price changes. Comparison of state and federal observed size distributions in the offshore area shows how production cost differences also affect the shape of the observed size distribution. Methods for modifying the discovery rate estimation procedures when economic factors significantly affect the discovery sequence are presented. A primary conclusion of the analysis is that, because hydrocarbon price changes can significantly affect the observed discovery size distribution, one should not be confident about inferring the form and specific parameters of the parent field size distribution from the observed distributions. © 1988 International Association for Mathematical Geology.
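A small simulation makes the central point concrete: if fields are "discovered" with probability proportional to their size (sampling without replacement), the early discoveries are systematically larger than a simple random sample from the same parent population. The lognormal parent and its parameters are assumptions for illustration only.

```python
# Minimal sketch: size-biased "discovery" versus simple random sampling from the same
# hypothetical lognormal parent population of field sizes.
import numpy as np

rng = np.random.default_rng(5)
parent = rng.lognormal(mean=3.0, sigma=1.5, size=5000)   # hypothetical parent field sizes

n_discovered = 200

# Size-biased discovery order: sample without replacement with probability proportional to size.
p = parent / parent.sum()
size_biased = rng.choice(parent, size=n_discovered, replace=False, p=p)

# Simple random sampling from the same population, for comparison.
random_sample = rng.choice(parent, size=n_discovered, replace=False)

print(f"parent mean field size:                  {parent.mean():10.1f}")
print(f"mean of simple random sample:            {random_sample.mean():10.1f}")
print(f"mean of first {n_discovered} size-biased discoveries: {size_biased.mean():10.1f}")
```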
This procedure summarizes the sample shipping procedures that have been described in the individual NHEXAS sample collection protocols. This procedure serves as a quick reference tool for the field staff when samples are prepared for shipment at the field lab/staging area. For ea...
VARIABLE SELECTION FOR REGRESSION MODELS WITH MISSING DATA
Garcia, Ramon I.; Ibrahim, Joseph G.; Zhu, Hongtu
2009-01-01
We consider the variable selection problem for a class of statistical models with missing data, including missing covariate and/or response data. We investigate the smoothly clipped absolute deviation penalty (SCAD) and adaptive LASSO and propose a unified model selection and estimation procedure for use in the presence of missing data. We develop a computationally attractive algorithm for simultaneously optimizing the penalized likelihood function and estimating the penalty parameters. Particularly, we propose to use a model selection criterion, called the ICQ statistic, for selecting the penalty parameters. We show that the variable selection procedure based on ICQ automatically and consistently selects the important covariates and leads to efficient estimates with oracle properties. The methodology is very general and can be applied to numerous situations involving missing data, from covariates missing at random in arbitrary regression models to nonignorably missing longitudinal responses and/or covariates. Simulations are given to demonstrate the methodology and examine the finite sample performance of the variable selection procedures. Melanoma data from a cancer clinical trial is presented to illustrate the proposed methodology. PMID:20336190
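The sketch below shows only the complete-data adaptive LASSO step, implemented via the standard column-rescaling trick; it does not reproduce the paper's missing-data machinery or the ICQ criterion (cross-validation stands in for the penalty-parameter selection).

```python
# Minimal complete-data sketch of the adaptive LASSO via column rescaling.
import numpy as np
from sklearn.linear_model import LinearRegression, LassoCV

rng = np.random.default_rng(6)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = np.array([2.0, -1.5, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta + rng.normal(size=n)

# Step 1: an initial consistent estimate (here, ordinary least squares).
beta_init = LinearRegression().fit(X, y).coef_

# Step 2: adaptive weights w_j = 1/|beta_init_j|, applied by rescaling the columns, so that
# an ordinary lasso on X_scaled is equivalent to an adaptive lasso on X.
w = 1.0 / np.maximum(np.abs(beta_init), 1e-8)
X_scaled = X / w

fit = LassoCV(cv=5).fit(X_scaled, y)
beta_hat = fit.coef_ / w               # transform coefficients back to the original scale

selected = np.flatnonzero(np.abs(beta_hat) > 1e-8)
print("selected covariates:", selected, " true nonzero:", np.flatnonzero(beta))
```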
Yang, Shuo; Lok, Charmaine; Arnold, Renee; Rajan, Dheeraj; Glickman, Marc
2017-03-28
Due to early and late failures that may occur with surgically created hemodialysis arteriovenous fistulas (SAVF), post-creation procedures are commonly required to facilitate AVF maturation and maintain patency. This study compared AVF post-creation procedures and their associated costs in patients with SAVF to those in patients with a new endovascularly created AVF (endoAVF). A 5% random sample from Medicare Standard Analytical Files was abstracted to determine post-creation procedures and associated costs for SAVF created from 2011 to 2013. Medicare enrollment during the 6 months prior to and after the AVF creation was required. Patients' follow-up inpatient, outpatient, and physician claims were used to identify post-creation procedures and to estimate average procedure costs. Comparative procedural information on endoAVF was obtained from the Novel Endovascular Access Trial (NEAT). Of 3764 Medicare SAVF patients, 60 were successfully matched to endoAVF patients using 1:1 propensity score matching of baseline demographic and clinical characteristics. The total post-creation procedural event rate within 1 year was lower for endoAVF patients (0.59 per patient-year) compared to the matched SAVF cohort (3.43 per patient-year; p<0.05). In the endoAVF cohort, event rates of angioplasty, thrombectomy, revision, catheter placement, subsequent arteriovenous graft (AVG), new SAVF, and vascular access-related infection were all significantly lower than in the SAVF cohort. The average first-year cost per patient-year associated with post-creation procedures was estimated to be US$11,240 lower for endoAVF than for SAVF. Compared to patients with SAVF, patients with endoAVF required fewer post-creation procedures and had lower associated mean costs within the first year.
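The matching step can be illustrated with a minimal sketch on synthetic data (this is not the Medicare claims analysis itself): propensity scores from a logistic regression followed by greedy 1:1 nearest-neighbour matching without replacement.

```python
# Minimal sketch of 1:1 propensity-score matching with synthetic covariates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n_treated, n_control = 60, 3764
X_t = rng.normal(loc=0.3, size=(n_treated, 3))        # baseline covariates, "endoAVF-like" arm
X_c = rng.normal(loc=0.0, size=(n_control, 3))        # baseline covariates, "SAVF-like" pool

X = np.vstack([X_t, X_c])
z = np.concatenate([np.ones(n_treated), np.zeros(n_control)])

ps = LogisticRegression().fit(X, z).predict_proba(X)[:, 1]   # estimated propensity scores
ps_t, ps_c = ps[:n_treated], ps[n_treated:]

# Greedy 1:1 nearest-neighbour matching on the propensity score, without replacement.
available = np.ones(n_control, dtype=bool)
matches = []
for i in np.argsort(-ps_t):                            # match high-score treated patients first
    dist = np.abs(ps_c - ps_t[i])
    dist[~available] = np.inf
    j = int(np.argmin(dist))
    available[j] = False
    matches.append((i, j))

print(f"matched {len(matches)} treated patients to unique controls")
```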
Azarmnejad, Elham; Sarhangi, Forogh; Javadi, Mahrooz; Rejeh, Nahid
2015-04-19
Due to the devastating effects of pain in neonates, it is very important to ease it through safe and feasible methods. This study aimed to determine the effect of familiar auditory stimuli on arterial blood sampling (ABS)-induced pain in term neonates. The study was conducted on 30 newborns hospitalized in the neonatal intensive care unit (NICU) of a hospital in Tehran. Samples were selected by convenience sampling and randomly divided into control and test groups. In the test group, the recorded mothers' voices were played for the newborns before and after the blood sampling procedure. Pain measures were then recorded 10 minutes before, during, and 10 minutes after blood collection based on the Neonatal Infant Pain Scale (NIPS), and changes in pain level were reviewed. The findings showed significant differences between the control and test groups, indicating an effect of the mother's voice on reducing neonates' pain during ABS (p<0.005). These findings demonstrate that the mother's voice reduces ABS-induced pain in term neonates.
NASA Technical Reports Server (NTRS)
Hornung, Steven D.; Biesinger, Paul; Kirsch, Mike; Beeson, Harold; Leuders, Kathy
1999-01-01
The NASA White Sands Test Facility (WSTF) has developed an entirely aqueous final cleaning and verification process to replace the current chlorofluorocarbon (CFC) 113 based process. This process has been accepted for final cleaning and cleanliness verification of WSTF ground support equipment. The aqueous process relies on ultrapure water at 50 C (323 K) and ultrasonic agitation for removal of organic compounds and particulate. The cleanliness is verified by determining the total organic carbon (TOC) content and by filtration with particulate counting. The effectiveness of the aqueous methods for detecting hydrocarbon contamination and particulate was compared to the accepted CFC 113 sampling procedures. Testing with known contaminants, such as hydraulic fluid and cutting and lubricating oils, was performed to establish a correlation between aqueous TOC and CFC 113 nonvolatile residue (NVR). Particulate sampling was performed on cleaned batches of hardware that were randomly separated and sampled by the two methods. This paper presents the approach and results, and discusses the issues in establishing the equivalence of aqueous sampling to CFC 113 sampling, while describing the approach for implementing aqueous techniques on Space Shuttle Propulsion hardware.
Sample size allocation for food item radiation monitoring and safety inspection.
Seto, Mayumi; Uriu, Koichiro
2015-03-01
The objective of this study is to identify a procedure for determining sample size allocation for food radiation inspections of more than one food item to minimize the potential risk to consumers of internal radiation exposure. We consider a simplified case of food radiation monitoring and safety inspection in which a risk manager is required to monitor two food items, milk and spinach, in a contaminated area. Three protocols for food radiation monitoring with different sample size allocations were assessed by simulating random sampling and inspections of milk and spinach in a conceptual monitoring site. Distributions of (131)I and radiocesium concentrations were determined in reference to (131)I and radiocesium concentrations detected in Fukushima prefecture, Japan, for March and April 2011. The results of the simulations suggested that a protocol that allocates sample size to milk and spinach based on the estimation of (131)I and radiocesium concentrations using the apparent decay rate constants sequentially calculated from past monitoring data can most effectively minimize the potential risks of internal radiation exposure. © 2014 Society for Risk Analysis.
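One plausible, simplified version of such an allocation rule, not the paper's actual protocol, is sketched below: fit an apparent exponential decay to past monitoring data for each food item, predict the current concentration, and split a fixed sampling budget in proportion to the predictions. All concentrations, dates, and the budget are hypothetical.

```python
# Toy sketch: predict concentrations from an apparent exponential decay fitted to past
# monitoring data, then allocate a fixed sample budget in proportion to the predictions.
import numpy as np

days = np.array([0, 7, 14, 21])                       # days since the first monitoring round
past = {
    "milk":    np.array([120.0, 80.0, 55.0, 40.0]),   # hypothetical mean I-131 levels (Bq/kg)
    "spinach": np.array([900.0, 450.0, 230.0, 120.0]),
}

total_samples = 60
predict_day = 28

predicted = {}
for item, conc in past.items():
    # Apparent decay rate from a log-linear fit: conc(t) ~ conc0 * exp(k * t), k < 0.
    slope, intercept = np.polyfit(days, np.log(conc), 1)
    predicted[item] = np.exp(intercept + slope * predict_day)

total_pred = sum(predicted.values())
allocation = {item: round(total_samples * c / total_pred) for item, c in predicted.items()}
print("predicted concentrations (Bq/kg):", {k: round(v, 1) for k, v in predicted.items()})
print("sample allocation:", allocation)
```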
47 CFR 1.926 - Application processing; initial procedures.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 47 Telecommunication 1 2013-10-01 2013-10-01 false Application processing; initial procedures. 1.926 Section 1.926 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Wireless Radio Services Applications and Proceedings Application Requirements...
47 CFR 1.926 - Application processing; initial procedures.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 47 Telecommunication 1 2014-10-01 2014-10-01 false Application processing; initial procedures. 1.926 Section 1.926 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Wireless Radio Services Applications and Proceedings Application Requirements...
47 CFR 1.926 - Application processing; initial procedures.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 47 Telecommunication 1 2012-10-01 2012-10-01 false Application processing; initial procedures. 1.926 Section 1.926 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Grants by Random Selection Wireless Radio Services Applications and Proceedings Application Requirements...
Securebox: a multibiopsy sample container for specimen identification and transport.
Palmieri, Beniamino; Sblendorio, Valeriana; Saleh, Farid; Al-Sebeih, Khalid
2008-01-01
To describe an original multicompartment disposable container for tissue surgical specimens or serial biopsy samples (Securebox). The increasing number of pathology samples from a single patient required for an accurate diagnosis led us to design and manufacture a unique container with 4 boxes; in each box, 1 or more biopsy samples can be lodged. A magnification lens on a convex segment of the plastic framework allows inspection of macroscopic details of the recovered specimens. We investigated 400 randomly selected cases (compared with 400 controls) who underwent multiple biopsies from January 2006 to January 2007 to evaluate compliance with the new procedure and to detect errors resulting from missing some of the multiple specimens or from technical mistakes during the procedure or delivery that might have compromised the final diagnosis. Using our Securebox, the percentage of patients whose diagnosis failed or could not be reached was 0.5%, compared to 4% with the traditional method (p = 0.0012). Moreover, the percentage of medical and nursing staff who were satisfied with the Securebox compared to the traditional method was 85% vs. 15%, respectively (p < 0.0001). The average number of days to reach a proper diagnosis with the Securebox was 3.38 +/- 1.16 SD compared to 6.76 +/- 0.52 SD with the traditional method (p < 0.0001). The compact Securebox makes it safer and easier to introduce the specimens and to ship them to the pathology laboratories, reducing the risk of error.
Kelley, Shannon E; van Dongen, Josanne D M; Donnellan, M Brent; Edens, John F; Eisenbarth, Hedwig; Fossati, Andrea; Howner, Katarina; Somma, Antonella; Sörman, Karolina
2018-05-01
The Triarchic Assessment Procedure for Inconsistent Responding (TAPIR; Mowle et al., 2016) was recently developed to identify inattentiveness or comprehension difficulties that may compromise the validity of responses on the Triarchic Psychopathy Measure (TriPM; Patrick, 2010). The TAPIR initially was constructed and cross-validated using exclusively English-speaking participants from the United States; however, research using the TriPM has been increasingly conducted internationally, with numerous foreign language translations of the measure emerging. The present study examined the cross-language utility of the TAPIR in German, Dutch, Swedish, and Italian translations of the TriPM using 6 archival samples of community members, university students, forensic psychiatric inpatients, forensic detainees, and adolescents residing outside the United States (combined N = 5,404). Findings suggest that the TAPIR effectively detects careless responding across these 4 translated versions of the TriPM without the need for language-specific modifications. The TAPIR total score meaningfully discriminated genuine participant responses from both fully and partially randomly generated data in every sample, and demonstrated further utility in detecting fixed "all true" or "all false" response patterns. In addition, TAPIR scores were reliably associated with inconsistent responding scores from another psychopathy inventory. Specificity for a range of tentative cut scores for assessing profile validity was modestly reduced among our samples relative to rates previously obtained with the English version of the TriPM; however, overall the TAPIR appears to demonstrate satisfactory cross-language generalizability. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
The Analysis of Completely Randomized Factorial Experiments When Observations Are Lost at Random.
ERIC Educational Resources Information Center
Hummel, Thomas J.
An investigation was conducted of the characteristics of two estimation procedures and corresponding test statistics used in the analysis of completely randomized factorial experiments when observations are lost at random. For one estimator, contrast coefficients for cell means did not involve the cell frequencies. For the other, contrast…
Conditional Monte Carlo randomization tests for regression models.
Parhat, Parwen; Rosenberger, William F; Diao, Guoqing
2014-08-15
We discuss the computation of randomization tests for clinical trials of two treatments when the primary outcome is based on a regression model. We begin by revisiting the seminal paper of Gail, Tan, and Piantadosi (1988), and then describe a method based on Monte Carlo generation of randomization sequences. The tests based on this Monte Carlo procedure are design based, in that they incorporate the particular randomization procedure used. We discuss permuted block designs, complete randomization, and biased coin designs. We also use a new technique by Plamadeala and Rosenberger (2012) for simple computation of conditional randomization tests. Like Gail, Tan, and Piantadosi, we focus on residuals from generalized linear models and martingale residuals from survival models. Such techniques do not apply to longitudinal data analysis, and we introduce a method for computation of randomization tests based on the predicted rate of change from a generalized linear mixed model when outcomes are longitudinal. We show, by simulation, that these randomization tests preserve the size and power well under model misspecification. Copyright © 2014 John Wiley & Sons, Ltd.
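A minimal sketch of a design-based Monte Carlo randomization test is given below: the test statistic is the treatment-group difference in residuals from a covariate-only regression, and the reference distribution is generated by re-running the same permuted-block randomization procedure that produced the actual assignment. The data, block size, and statistic are illustrative; this is not the authors' code.

```python
# Minimal sketch of a design-based Monte Carlo randomization test with permuted blocks.
import numpy as np

rng = np.random.default_rng(8)
n, block_size = 80, 4

def permuted_block_assignment(n, block_size, rng):
    # Each block contains an equal number of 0s and 1s in random order.
    blocks = [rng.permutation([0, 1] * (block_size // 2)) for _ in range(n // block_size)]
    return np.concatenate(blocks)

# Synthetic trial: one covariate and a modest treatment effect.
x = rng.normal(size=n)
z = permuted_block_assignment(n, block_size, rng)
y = 1.0 + 0.8 * x + 0.5 * z + rng.normal(size=n)

# Residuals from the covariate-only model (treatment excluded), following the residual-based idea.
X_cov = np.column_stack([np.ones(n), x])
resid = y - X_cov @ np.linalg.lstsq(X_cov, y, rcond=None)[0]

def statistic(assignment):
    return resid[assignment == 1].mean() - resid[assignment == 0].mean()

observed = statistic(z)
null = np.array([statistic(permuted_block_assignment(n, block_size, rng)) for _ in range(5000)])
p_value = np.mean(np.abs(null) >= abs(observed))
print(f"observed statistic = {observed:.3f},  randomization p-value = {p_value:.4f}")
```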
Lin, Yi Hung; Tu, Yu Kang; Lu, Chun Tai; Chung, Wen Chen; Huang, Chiung Fang; Huang, Mao Suan; Lu, Hsein Kun
2014-01-01
Repigmentation variably occurs with different treatment methods in patients with gingival pigmentation. A systematic review was conducted of various treatment modalities for eliminating melanin pigmentation of the gingiva, comprising bur abrasion, scalpel surgery, cryosurgery, electrosurgery, gingival grafts, and laser techniques, to compare the recurrence rates (Rrs) of these treatment procedures. Electronic databases, including PubMed, Web of Science, Google, and Medline, were comprehensively searched, and manual searches were conducted for studies published from January 1951 to June 2013. After applying inclusion and exclusion criteria, the final list of articles was reviewed in depth to achieve the objectives of this review. A Poisson regression was used to analyze the outcome of depigmentation using the various treatment methods. The systematic review was based mainly on case reports. In total, 61 eligible publications met the defined criteria. The various therapeutic procedures showed variable clinical results with a wide range of Rrs. A random-effects Poisson regression showed that cryosurgery (Rr = 0.32%), electrosurgery (Rr = 0.74%), and laser depigmentation (Rr = 1.16%) yielded superior results, whereas bur abrasion yielded the highest Rr (8.89%). Within the limits of the sampling level, the present evidence-based results show that cryosurgery exhibits the best predictability for depigmentation of the gingiva among all procedures examined, followed by electrosurgery and laser techniques. It is possible to treat melanin pigmentation of the gingiva with various methods and to prevent repigmentation. Among those treatment modalities, cryosurgery, electrosurgery, and laser surgery appear to be the best choices for treating gingival pigmentation. © 2014 Wiley Periodicals, Inc.
Hayes, Galina; Singh, Ameet; Gibson, Tom; Moens, Noel; Oblak, Michelle; Ogilvie, Adam; Reynolds, Debbie
2017-10-01
To determine the influence of orthopedic reinforced gloves on contamination events during small animal orthopedic surgery. Prospective randomized controlled trial. Sample population: two hundred and thirty-seven pairs of orthopedic gloves (474 gloves) and 203 pairs of double standard gloves (812 gloves) worn during 193 orthopedic procedures. Primary and assistant surgeons were randomized to wear either orthopedic reinforced gloves or double gloves. Gloves were leak tested to identify perforations at the end of procedures. Perforations detected intraoperatively or postoperatively were recorded. A contamination event was defined as at least one perforation on either hand for orthopedic reinforced gloves, or a perforation of both the inner and outer glove on the same hand for double gloves. Baseline characteristics between the 2 intervention groups were similar. There was no difference in contamination events between the double-gloved and orthopedic gloved groups (OR = 0.95, 95% CI = 0.49-1.87, P = .89). The same percentage of contamination events (8% glove pairs used) occurred in the double gloved group (17 contamination events) and in the orthopedic gloved group (19 contamination events). The odds of a contamination event increased by 1.02 (95% CI 1.01-1.03, P < .001) with each additional minute of surgery. Orthopedic reinforced gloves and double standard gloving were equally effective at preventing contamination events in small animal orthopedic procedures. Surgeons reluctant to double glove due to perceptions of decreased dexterity and discomfort may safely opt for wearing orthopedic gloves, which may improve their compliance. © 2017 The American College of Veterinary Surgeons.
Pollution gets personal! A first population-based human biomonitoring study in Austria.
Hohenblum, Philipp; Steinbichl, Philipp; Raffesberg, Wolfgang; Weiss, Stefan; Moche, Wolfgang; Vallant, Birgit; Scharf, Sigrid; Haluza, Daniela; Moshammer, Hanns; Kundi, Michael; Piegler, Brigitte; Wallner, Peter; Hutter, Hans-Peter
2012-02-01
Humans are exposed to a broad variety of man-made chemicals. Human biomonitoring (HBM) data reveal the individual body burden irrespective of sources and routes of uptake. A first population-based study was started in Austria in 2008 and was finished at the end of May 2011. This cross-sectional study aims at documenting the extent, the distribution and the determinants of human exposure to industrial chemicals, as well as proving the feasibility of a representative HBM study. Overall, 150 volunteers (50 families) were selected by stratified random sampling. Exposure to phthalates, trisphosphates, polybrominated diphenyl ethers (PBDE), bisphenol A (along with nonyl- and octylphenol) and methyl mercury was assessed. Sixteen of the 18 PBDE congeners determined were detected above the limit of quantification (LOQ) in blood samples, with #153 and #197 the most abundant species. Bisphenol A in urine was measured in a subsample of 25, with only 4 samples found above the LOQ. In 3 of 100 urine samples, at least one of the 8 trisphosphate compounds assessed was above the LOQ. These first analytical results of the human biomonitoring data show that the body burden of the Austrian population with respect to the assessed compounds is comparable to or even lower than in other European countries. Overall, the study revealed that, in order to develop a feasible protocol for representative human biomonitoring studies, procedures have to be optimized to allow for non-invasive sampling of body tissues in accordance with the main metabolic pathways. Participant recruitment procedures were, however, labor intensive and have to be improved. Copyright © 2011 Elsevier GmbH. All rights reserved.
Pontis, Alessandro; Sedda, Federica; Mereu, Liliana; Podda, Mauro; Melis, Gian Benedetto; Pisanu, Adolfo; Angioni, Stefano
2016-09-01
To critically appraise published randomized controlled trials (RCTs) comparing laparo-endoscopic single site (LESS) and multi-port laparoscopic (MPL) approaches in gynecologic operative surgery; the aim was to assess the feasibility, safety, and potential benefits of LESS in comparison to MPL. A systematic review and meta-analysis of eleven RCTs. Women undergoing operative LESS and MPL gynecologic procedures (hysterectomy, cystectomy, salpingectomy, salpingo-oophorectomy, myomectomy). Outcomes evaluated were as follows: postoperative overall morbidity, postoperative pain at 6, 12, 24 and 48 h, cosmetic patient satisfaction, conversion rate, body mass index (BMI), operative time, blood loss, hemoglobin drop, and postoperative hospital stay. Eleven RCTs comprising 956 women with gynecologic surgical disease randomized to either LESS (477) or MPL procedures (479) were analyzed systematically. The LESS approach had a longer operative time and better cosmetic results than MPL, but the differences did not reach statistical significance. Operative outcomes, postoperative recovery, postoperative morbidity and patient satisfaction are similar for LESS and MPL. LESS may be considered an alternative to MPL with comparable feasibility and safety in gynecologic operative procedures. However, it does not offer the expected advantages in terms of postoperative pain and cosmetic satisfaction.
Yarmus, Lonny B; Semaan, Roy W; Arias, Sixto A; Feller-Kopman, David; Ortiz, Ricardo; Bösmüller, Hans; Illei, Peter B; Frimpong, Bernice O; Oakjones-Burgess, Karen; Lee, Hans J
2016-08-01
Transbronchial forceps biopsy (FBx) has been the preferred method for obtaining bronchoscopic lung biopsy specimens. Cryoprobe biopsy (CBx) has been shown to obtain larger and higher quality samples, but is limited by its inability to retrieve the sample through the working channel of the bronchoscope, requiring the bronchoscope to leave the airway for sample retrieval. We evaluated a novel device using a sheath cryobiopsy (SCBx). This method allows for specimen retrieval through the working channel of the bronchoscope, with the scope remaining inside the airway. This prospective, randomized controlled, single-blinded porcine study compared a 1.1-mm SCBx probe, a 1.9-mm CBx probe, and 2.0-mm FBx forceps. Histologic accessibility, sample quantity and quality, number of attempts to acquire and retrieve samples, cryoprobe activation time, fluoroscopy activation time, technical feasibility, and complications were compared. Samples adequate for standard pathologic processing were retrieved with 82.1% of the SCBx specimens, 82.9% of the CBx specimens, and 30% of the FBx specimens. The histologic accessibility of both SCBx (P = .0002) and CBx (P = .0003) was superior to FBx. Procedure time for FBx was faster than for both SCBx and CBx, but SCBx was significantly faster than CBx (P < .0001). Fluoroscopy time was lower for both SCBx and CBx compared with FBx. There were no significant bleeding events. SCBx is a feasible technique providing a higher quality lung biopsy specimen compared with FBx, and the specimen can successfully be retrieved through the working channel. Human studies are needed to further assess this technique with additional safety data. Copyright © 2016 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
Sebastião, E; Gobbi, S; Chodzko-Zajko, W; Schwingel, A; Papini, C B; Nakamura, P M; Netto, A V; Kokubun, E
2012-11-01
To explore issues associated with measuring physical activity using the International Physical Activity Questionnaire (IPAQ)-long form in adults living in a mid-sized Brazilian city. A stratified random sampling procedure was used to select a representative sample of adults living in Rio Claro. This yielded 1572 participants who were interviewed using the IPAQ-long form. The data were analysed using standard statistical procedures. Overall, 83% of men and 89% of women reported at least 150 min of combined moderate and/or vigorous physical activity per week. Reliable values of leisure and transportation-related physical activity were observed for both males and females. With regard to the household and work-related physical activity domains, both males and females reported unusually high levels of participation. The IPAQ-long form appears to overestimate levels of physical activity for both males and females, suggesting that the instrument has problems in measuring levels of physical activity in Brazilian adults. Accordingly, caution is warranted before using IPAQ data to support public policy decisions related to physical activity. Copyright © 2012 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Decreasing Errors in Reading-Related Matching to Sample Using a Delayed-Sample Procedure
ERIC Educational Resources Information Center
Doughty, Adam H.; Saunders, Kathryn J.
2009-01-01
Two men with intellectual disabilities initially demonstrated intermediate accuracy in two-choice matching-to-sample (MTS) procedures. A printed-letter identity MTS procedure was used with 1 participant, and a spoken-to-printed-word MTS procedure was used with the other participant. Errors decreased substantially under a delayed-sample procedure,…
Benkerroum, N; Bouhlal, Y; El Attar, A; Marhaben, A
2004-06-01
Samples of meat and dairy products taken from the city of Rabat, Morocco, were examined for the presence of Escherichia coli O157 by the selective enrichment procedure followed by plating on cefixime-tellurite-sorbitol MacConkey agar and a latex agglutination test. The ability of isolates to produce Shiga toxins (ST1 or ST2) was also tested by an agglutination test using sensitized latex. Dairy samples (n = 44) included different products commonly consumed in the country. Meat samples (n = 36) were taken from traditional butchers because these products are generally marketed in this way. Random samples were taken from each product during the period of January through May. Of the 80 samples tested, 8 (10%) harbored E. coli O157. Four dairy and four meat samples were contaminated (9.1 and 11.1%, respectively). Of 10 E. coli O157 isolates from contaminated samples demonstrating true antigen-antibody agglutination, 5 (50%) produced either ST2 alone or ST2 plus ST1. Four of the five strains (80%) were meat isolates and produced ST2 with or without ST1, and the fifth was a dairy isolate producing ST2.
McDonell, Michael G.; Leickly, Emily; McPherson, Sterling; Skalisky, Jordan; Srebnik, Debra; Angelo, Frank; Vilardaga, Roger; Nepom, Jenny R.; Roll, John M.; Ries, Richard K.
2017-01-01
Objective To determine if a contingency management intervention using the ethyl glucuronide (EtG) alcohol biomarker resulted in increased alcohol abstinence in outpatients with co-occurring serious mental illnesses. Secondary objectives were to determine if contingency management was associated with changes in heavy drinking, treatment attendance, drug use, cigarette smoking, psychiatric symptoms, and HIV-risk behavior. Method Seventy-nine (37% female, 44% non-white) outpatients with serious mental illness and alcohol dependence receiving treatment as usual completed a 4-week observation period and were randomized to 12-weeks of contingency management for EtG-negative urine samples and addiction treatment attendance, or reinforcement only for study participation. Contingency management included the variable magnitude of reinforcement “prize draw” procedure contingent on EtG-negative samples (<150 ng/mL) three times a week and weekly gift cards for outpatient treatment attendance. Urine EtG, drug test, and self-report outcomes were assessed during the 12-week intervention and 3-month follow-up periods. Results Contingency management participants were 3.1 times (95% CI: 2.2, 4.5) more likely to submit an EtG-negative urine test during the 12-week intervention period, attaining nearly 1.5 weeks of additional abstinence relative to controls. Contingency management participants had significantly lower mean EtG levels, reported less drinking and fewer heavy drinking episodes, and were more likely to submit stimulant-negative urine and smoking-negative breath samples, relative to controls. Differences in self-reported alcohol use were maintained at the 3-month follow-up. Conclusions This is the first randomized trial utilizing an accurate and validated biomarker (EtG) to demonstrate the efficacy of contingency management for alcohol dependence in outpatients with serious mental illness. PMID:28135843
McDonell, Michael G; Leickly, Emily; McPherson, Sterling; Skalisky, Jordan; Srebnik, Debra; Angelo, Frank; Vilardaga, Roger; Nepom, Jenny R; Roll, John M; Ries, Richard K
2017-04-01
The authors examined whether a contingency management intervention using the ethyl glucuronide (EtG) alcohol biomarker resulted in increased alcohol abstinence in outpatients with co-occurring serious mental illnesses. Secondary objectives were to determine whether contingency management was associated with changes in heavy drinking, treatment attendance, drug use, cigarette smoking, psychiatric symptoms, and HIV-risk behavior. Seventy-nine (37% female, 44% nonwhite) outpatients with serious mental illness and alcohol dependence receiving treatment as usual completed a 4-week observation period and were randomly assigned to 12 weeks of contingency management for EtG-negative urine samples and addiction treatment attendance, or reinforcement only for study participation. Contingency management included the variable magnitude of reinforcement "prize draw" procedure contingent on EtG-negative samples (<150 ng/mL) three times a week and weekly gift cards for outpatient treatment attendance. Urine EtG, drug test, and self-report outcomes were assessed during the 12-week intervention and 3-month follow-up periods. Contingency management participants were 3.1 times (95% CI=2.2-4.5) more likely to submit an EtG-negative urine test during the 12-week intervention period, attaining nearly 1.5 weeks of additional alcohol abstinence compared with controls. Contingency management participants had significantly lower mean EtG levels, reported less drinking and fewer heavy drinking episodes, and were more likely to submit stimulant-negative urine and smoking-negative breath samples, compared with controls. Differences in self-reported alcohol use were maintained at the 3-month follow-up. This is the first randomized trial utilizing an accurate and validated biomarker (EtG) to demonstrate the efficacy of contingency management for alcohol dependence in outpatients with serious mental illness.
The Effect of Hypnosis on Anxiety in Patients With Cancer: A Meta-Analysis.
Chen, Pei-Ying; Liu, Ying-Mei; Chen, Mei-Ling
2017-06-01
Anxiety is a common form of psychological distress in patients with cancer. One recognized nonpharmacological intervention to reduce anxiety for various populations is hypnotherapy or hypnosis. However, its effect in reducing anxiety in cancer patients has not been systematically evaluated. This meta-analysis was designed to synthesize the immediate and sustained effects of hypnosis on anxiety in cancer patients and to identify moderators of these hypnosis effects. Qualified studies, including randomized controlled trials (RCTs) and pre-post design studies, were identified by searching seven electronic databases: Scopus, Medline Ovidsp, PubMed, PsycInfo-Ovid, Academic Search Premier, CINAHL Plus with FT-EBSCO, and SDOL. Effect size (Hedges' g) was computed for each study. Random-effects modeling was used to combine effect sizes across studies. All statistical analyses were conducted with Comprehensive Meta-Analysis, version 2 (Biostat, Inc., Englewood, NJ, USA). Our meta-analysis of 20 studies found that hypnosis had a significant immediate effect on anxiety in cancer patients (Hedges' g: 0.70-1.41, p < .01) and that the effect was sustained (Hedges' g: 0.61-2.77, p < .01). The adjusted mean effect size (determined by Duval and Tweedie's trim-and-fill method) was 0.46. RCTs had a significantly higher effect size than non-RCT studies. Higher mean effect sizes were also found with pediatric study samples, hematological malignancy, studies on procedure-related stressors, and mixed-gender samples. Hypnosis delivered by a therapist was significantly more effective than self-hypnosis. Hypnosis can reduce anxiety in cancer patients, especially pediatric cancer patients who experience procedure-related stress. We recommend that therapist-delivered hypnosis be preferred until more effective self-hypnosis strategies are developed. © 2017 Sigma Theta Tau International.
Koyama, Kento; Hokunan, Hidekazu; Hasegawa, Mayumi; Kawamura, Shuso; Koseki, Shigenobu
2016-12-01
We investigated a bacterial sample preparation procedure for single-cell studies. In the present study, we examined whether single bacterial cells obtained via 10-fold dilution followed a theoretical Poisson distribution. Four serotypes of Salmonella enterica, three serotypes of enterohaemorrhagic Escherichia coli and one serotype of Listeria monocytogenes were used as sample bacteria. An inoculum of each serotype was prepared via a 10-fold dilution series to obtain bacterial cell counts with mean values of one or two. To determine whether the experimentally obtained bacterial cell counts follow a theoretical Poisson distribution, a likelihood ratio test was conducted between the experimentally obtained cell counts and a Poisson distribution whose parameter was estimated by maximum likelihood estimation (MLE). The bacterial cell counts of each serotype sufficiently followed a Poisson distribution. Furthermore, to examine the validity of the Poisson distribution parameters obtained from the experimental bacterial cell counts, we compared them with the parameters of a Poisson distribution estimated using random number generation via computer simulation. The Poisson distribution parameters experimentally obtained from bacterial cell counts were within the range of the parameters estimated using the computer simulation. These results demonstrate that the bacterial cell counts of each serotype obtained via 10-fold dilution followed a Poisson distribution. The finding that the frequency of bacterial cell counts follows a Poisson distribution at low numbers can be applied to single-cell studies involving a few bacterial cells. In particular, the procedure presented in this study enables us to develop an inactivation model at the single-cell level that can estimate the variability of surviving bacterial numbers during the bacterial death process. Copyright © 2016 Elsevier Ltd. All rights reserved.
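As a rough illustration of the test described above, the sketch below runs a likelihood-ratio (G-test) check of whether observed single-cell counts are consistent with a Poisson distribution whose parameter is the maximum-likelihood estimate (the sample mean). The variable names, binning choice, and simulated counts are illustrative assumptions, not the authors' code.

```python
# Hedged sketch: G-test of observed cell counts against a Poisson fit (MLE parameter).
import numpy as np
from scipy.stats import poisson, chi2

def poisson_lr_test(counts, max_bin=5):
    counts = np.asarray(counts)
    lam = counts.mean()                      # MLE of the Poisson parameter
    n = counts.size
    # Observed frequencies for 0, 1, ..., max_bin-1 plus a pooled tail bin.
    observed = np.array([(counts == k).sum() for k in range(max_bin)]
                        + [(counts >= max_bin).sum()], dtype=float)
    expected = np.array([poisson.pmf(k, lam) for k in range(max_bin)]
                        + [poisson.sf(max_bin - 1, lam)]) * n
    mask = observed > 0                      # avoid log(0) terms
    G = 2.0 * np.sum(observed[mask] * np.log(observed[mask] / expected[mask]))
    df = mask.sum() - 1 - 1                  # bins - 1 - one estimated parameter
    return G, chi2.sf(G, df)

# Example: 50 replicate tubes with a target mean of about one cell per tube.
rng = np.random.default_rng(0)
print(poisson_lr_test(rng.poisson(1.0, size=50)))
```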
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Definitions. 1.1621 Section 1.1621 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1621 Definitions. (a) Medium of mass communications means...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Definitions. 1.1621 Section 1.1621 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1621 Definitions. (a) Medium of mass communications means...
ERIC Educational Resources Information Center
Lee, Eunjung
2013-01-01
The purpose of this research was to compare the equating performance of various equating procedures for the multidimensional tests. To examine the various equating procedures, simulated data sets were used that were generated based on a multidimensional item response theory (MIRT) framework. Various equating procedures were examined, including…
Weldu, Yemane; Gebru, Hagos; Kahsay, Getahun; Teweldemedhn, Gebremichael; Hagos, Yifter; Kahsay, Amlsha
2017-01-01
The aim of this study was to assess the utilization of standard operating procedures for acid-fast bacilli (AFB) smear microscopy. A facility-based cross-sectional study was conducted in select health institutions in Mekelle City, Ethiopia, from July 1, 2015, through August 30, 2015. Using a simple random sampling technique, 18 health facilities were included in the study. Data were collected using a standard checklist and entered into Epi Info version 3.5.4 (Centers for Disease Control and Prevention, Atlanta, GA) for editing. Analysis was done using SPSS version 20 (SPSS, Chicago, IL). Of the 18 laboratory facilities, only seven (38.9%) had a legible AFB registration book. In three (16.7%) of the laboratories, heat fixation was not applied before adding primary staining reagent. In 12 (66.7%), the staining reagents had precipitates. Two laboratories had microscopes with mechanical stages that could not move freely on both axes. Seven (38.9%) of the laboratories reported samples to be negative before examining all required fields. Most laboratories, 16 (88.9%) and 17 (94.4%), respectively, did not run positive and negative controls after new batch reagent preparation. Tuberculosis microscopy was found to be substandard with clear gaps in documentation, sample collection, and processing. © American Society for Clinical Pathology, 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
Nishioka, Michele A; Pinfildi, Carlos E; Sheliga, Tatiana Rodrigues; Arias, Victor E; Gomes, Heitor C; Ferreira, Lydia M
2012-09-01
Skin flap procedures are commonly used in plastic surgery. Failures can follow, leading to necrosis of the flap. Therefore, many studies have used LLLT to improve flap viability. More recently, LED therapy has been introduced as an alternative to LLLT. The objective of this study was to evaluate the effect of LLLT and LED on the viability of random skin flaps in rats. Forty-eight rats were divided into four groups, and a random skin flap (10 × 4 cm) was performed in all animals. Group 1 was the sham group; group 2 received LLLT at 660 nm, 0.14 J; group 3 received LED at 630 nm, 2.49 J; and group 4 received LLLT at 660 nm, 2.49 J. Irradiation was applied after surgery and repeated on the four subsequent days. On the 7th postoperative day, the percentage of flap necrosis was calculated and skin samples were collected from the viable area and from the transition line of the flap to evaluate blood vessels and mast cells. The percentage of necrosis was significantly lower in groups 3 and 4 compared to groups 1 and 2. Concerning blood vessel and mast cell numbers, only the animals in group 3 showed a significant increase compared to group 1 in the skin sample from the transition line. LED and LLLT with the same total energies were effective in increasing viability of random skin flaps. LED was more effective in increasing the number of mast cells and blood vessels in the transition line of random skin flaps.
40 CFR 133.104 - Sampling and test procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 22 2011-07-01 2011-07-01 false Sampling and test procedures. 133.104 Section 133.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS SECONDARY TREATMENT REGULATION § 133.104 Sampling and test procedures. (a) Sampling and test procedures for...
40 CFR 133.104 - Sampling and test procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Sampling and test procedures. 133.104 Section 133.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS SECONDARY TREATMENT REGULATION § 133.104 Sampling and test procedures. (a) Sampling and test procedures for...
40 CFR 90.415 - Raw gaseous sampling procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Raw gaseous sampling procedures. 90.415 Section 90.415 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... Test Procedures § 90.415 Raw gaseous sampling procedures. Fit all heated sampling lines with a heated...
40 CFR 90.415 - Raw gaseous sampling procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Raw gaseous sampling procedures. 90.415 Section 90.415 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... Test Procedures § 90.415 Raw gaseous sampling procedures. Fit all heated sampling lines with a heated...
40 CFR 133.104 - Sampling and test procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 23 2012-07-01 2012-07-01 false Sampling and test procedures. 133.104 Section 133.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS SECONDARY TREATMENT REGULATION § 133.104 Sampling and test procedures. (a) Sampling and test procedures for...
Cosmetic procedures among youths: a survey of junior college and medical students in Singapore
Ng, Jia Hui; Yeak, Seth; Phoon, Natalie; Lo, Stephen
2014-01-01
INTRODUCTION Although cosmetic procedures have become increasingly popular among the younger population in recent years, limited research on this subject has been done in the Asian context. We aimed to explore the views and knowledge regarding cosmetic procedures among junior college (JC) and medical students in Singapore. METHODS In the first phase of the study, a cross-sectional, self-administered survey of 1,500 JC students aged 16–21 years from six JCs was conducted in 2010. The same survey was then conducted on a random sample of Year 2–5 medical students from an undergraduate medical school in 2011. RESULTS In total, 1,164 JC and 241 medical students responded to the surveys. There was an overall female to male ratio of 1.3:1. Of all the respondents, 2.5% of the JC students and 3.0% of the medical students admitted to having undergone cosmetic procedures. Among those who claimed to have never had cosmetic procedures done, 9.0% and 44.0% of the JC and medical students, respectively, responded that they would consider such procedures in the future. Those who disapproved of their peers undergoing cosmetic surgery comprised 35.0% of JC students and 56.8% of medical students. Among the JC and medical students, 52.0% and 36.1%, respectively, were unaware of any risks associated with cosmetic procedures. CONCLUSION The younger population is increasingly accepting of cosmetic procedures. However, there is a general lack of understanding of the risks associated with such procedures. Education of both the general public and medical students may help prevent potential medicolegal issues. PMID:25189303
Brick tunnel randomization and the momentum of the probability mass.
Kuznetsova, Olga M
2015-12-30
The allocation space of an unequal-allocation permuted block randomization can be quite wide. The development of unequal-allocation procedures with a narrower allocation space, however, is complicated by the need to preserve the unconditional allocation ratio at every step (the allocation ratio preserving (ARP) property). When the allocation paths are depicted on the K-dimensional unitary grid, where allocation to the l-th treatment is represented by a step along the l-th axis, l = 1 to K, the ARP property can be expressed in terms of the center of the probability mass after i allocations. Specifically, for an ARP allocation procedure that randomizes subjects to K treatment groups in the ratio w1:⋯:wK, with w1+⋯+wK = 1, the coordinates of the center of the mass are (w1·i, …, wK·i). In this paper, the momentum with respect to the center of the probability mass (expected imbalance in treatment assignments) is used to compare ARP procedures in how closely they approximate the target allocation ratio. It is shown that the two-arm and three-arm brick tunnel randomizations (BTR) are the ARP allocation procedures with the tightest allocation space among all allocation procedures with the same allocation ratio; the two-arm BTR is the minimum-momentum two-arm ARP allocation procedure. Resident probabilities of two-arm and three-arm BTR are analytically derived from the coordinates of the center of the probability mass; the existence of the respective transition probabilities is proven. The probability of deterministic assignments with BTR is found to be generally acceptable. Copyright © 2015 John Wiley & Sons, Ltd.
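The ARP property can be checked empirically: after i assignments, the mean allocation counts across simulated paths should sit at (w1·i, …, wK·i). The sketch below is not the paper's brick tunnel code; it simulates an ordinary unequal-allocation permuted block procedure (which is itself ARP) with an illustrative 1:2 ratio and block size, purely to show what the "center of the probability mass" check looks like.

```python
# Hedged sketch: verify the ARP property of a permuted block randomization by simulation.
import random
import numpy as np

def permuted_block_centers(ratio=(1, 2), n_subjects=12, n_sims=20000, seed=1):
    block = [arm for arm, w in enumerate(ratio) for _ in range(w)]
    centers = np.zeros((n_subjects, len(ratio)))     # mean counts after each step
    rng = random.Random(seed)
    for _ in range(n_sims):
        seq = []
        while len(seq) < n_subjects:
            b = block[:]
            rng.shuffle(b)                            # one permuted block
            seq.extend(b)
        tally = np.zeros(len(ratio))
        for i, arm in enumerate(seq[:n_subjects]):
            tally[arm] += 1
            centers[i] += tally
    return centers / n_sims

w = np.array([1, 2]) / 3
center = permuted_block_centers()
target = np.outer(np.arange(1, 13), w)                # (w1*i, w2*i) for i = 1..12
print(np.abs(center - target).max())                  # small for an ARP procedure
```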
Anxiety Outcomes after Physical Activity Interventions: Meta-Analysis Findings
Conn, Vicki S.
2011-01-01
Background Although numerous primary studies have documented the mental health benefits of physical activity (PA), no previous quantitative synthesis has examined anxiety outcomes of interventions to increase PA. Objectives This meta-analysis integrates extant research about anxiety outcomes from interventions to increase PA among healthy adults. Method Extensive literature searching located published and unpublished PA intervention studies with anxiety outcomes. Eligible studies reported findings from interventions designed to increase PA delivered to healthy adults without anxiety disorders. Data were coded from primary studies. Random-effects meta-analytic procedures were completed. Exploratory moderator analyses using meta-analysis ANOVA and regression analogues were conducted to determine if report, methods, sample, or intervention characteristics were associated with differences in anxiety outcomes. Results Data were synthesized across 3,289 subjects from 19 eligible reports. The overall mean anxiety effect size (d-index) for two-group comparisons was 0.22 with significant heterogeneity (Q = 32.15). Exploratory moderator analyses found larger anxiety improvement effect sizes among studies that included larger samples, used random allocation of subjects to treatment and control conditions, targeted only PA behavior instead of multiple health behaviors, included supervised exercise (vs. home-based PA), used moderate or high-intensity instead of low-intensity PA, and suggested subjects exercise at a fitness facility (vs. home) following interventions. Discussion These findings document that some interventions can decrease anxiety symptoms among healthy adults. Exploratory moderator analyses suggest possible directions for future primary research to compare interventions in randomized trials to confirm causal relationships. PMID:20410849
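For readers unfamiliar with the "random-effects meta-analytic procedures" mentioned above, the sketch below shows the standard DerSimonian-Laird pooling step. The effect sizes and variances are toy inputs; the original synthesis was done with dedicated meta-analysis software, so this is only a hedged illustration of the method's arithmetic.

```python
# Hedged sketch: DerSimonian-Laird random-effects pooling of study effect sizes.
import numpy as np

def dersimonian_laird(d, v):
    d, v = np.asarray(d, float), np.asarray(v, float)
    w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
    d_fixed = np.sum(w * d) / np.sum(w)
    Q = np.sum(w * (d - d_fixed) ** 2)            # heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(d) - 1)) / c)       # between-study variance estimate
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    d_pooled = np.sum(w_star * d) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return d_pooled, se, tau2, Q

# Four hypothetical studies (d-index, sampling variance).
print(dersimonian_laird([0.15, 0.30, 0.10, 0.45], [0.02, 0.05, 0.03, 0.04]))
```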
Duan, Xiaotao; Young, Rebecca; Straubinger, Robert M.; Page, Brian J.; Cao, Jin; Wang, Hao; Yu, Haoying; Canty, John M.; Qu, Jun
2009-01-01
For label-free expression profiling of tissue proteomes, efficient protein extraction, thorough and quantitative sample cleanup and digestion procedures, as well as sufficient and reproducible chromatographic separation, are highly desirable but remain challenging. However, optimal methodology has remained elusive, especially for proteomes that are rich in membrane proteins, such as the mitochondria. Here we describe a straightforward and reproducible sample preparation procedure, coupled with a highly selective and sensitive nano-LC/Orbitrap analysis, which enables reliable and comprehensive expression profiling of tissue mitochondria. The mitochondrial proteome of swine heart was selected as a test system. Efficient protein extraction was accomplished using a strong buffer containing both ionic and non-ionic detergents. Overnight precipitation was used for cleanup of the extract, and the sample was subjected to an optimized 2-step, on-pellet digestion approach. In the first step, the protein pellet was dissolved via a 4 h tryptic digestion under vigorous agitation, which nano-LC/LTQ/ETD showed to produce large and incompletely cleaved tryptic peptides. The mixture was then reduced, alkylated, and digested into its full complement of tryptic peptides with additional trypsin. This solvent precipitation/on-pellet digestion procedure achieved significantly higher and more reproducible peptide recovery of the mitochondrial preparation, than observed using a prevalent alternative procedure for label-free expression profiling, SDS-PAGE/in-gel digestion (87% vs. 54%). Furthermore, uneven peptide losses were lower than observed with SDS-PAGE/in-gel digestion. The resulting peptides were sufficiently resolved by a 5 h gradient using a nano-LC configuration that features a low-void-volume, high chromatographic reproducibility, and an LTQ/Orbitrap analyzer for protein identification and quantification. The developed method was employed for label-free comparison of the mitochondrial proteomes of myocardium from healthy animals vs. those with hibernating myocardium. Each experimental group consisted of a relatively large number of animals (n=10), and samples were analyzed in random order to minimize quantitative false-positives. Using this approach, 904 proteins were identified and quantified with high confidence, and those mitochondrial proteins that were altered significantly between groups were compared with the results of a parallel 2D-DIGE analysis. The sample preparation and analytical strategy developed here represents an advancement that can be adapted to analyze other tissue proteomes. PMID:19290621
Divine, George; Norton, H James; Hunt, Ronald; Dienemann, Jacqueline
2013-09-01
When a study uses an ordinal outcome measure with unknown differences in the anchors and a small range such as 4 or 7, use of the Wilcoxon rank sum test or the Wilcoxon signed rank test may be most appropriate. However, because nonparametric methods are at best indirect functions of standard measures of location such as means or medians, the choice of the most appropriate summary measure can be difficult. The issues underlying use of these tests are discussed. The Wilcoxon-Mann-Whitney odds directly reflects the quantity that the rank sum procedure actually tests, and thus it can be a superior summary measure. Unlike the means and medians, its value will have a one-to-one correspondence with the Wilcoxon rank sum test result. The companion article appearing in this issue of Anesthesia & Analgesia ("Aromatherapy as Treatment for Postoperative Nausea: A Randomized Trial") illustrates these issues and provides an example of a situation for which the medians imply no difference between 2 groups, even though the groups are, in fact, quite different. The trial cited also provides an example of a single sample that has a median of zero, yet there is a substantial shift for much of the nonzero data, and the Wilcoxon signed rank test is quite significant. These examples highlight the potential discordance between medians and Wilcoxon test results. Along with the issues surrounding the choice of a summary measure, there are considerations for the computation of sample size and power, confidence intervals, and multiple comparison adjustment. In addition, despite the increased robustness of the Wilcoxon procedures relative to parametric tests, some circumstances in which the Wilcoxon tests may perform poorly are noted, along with alternative versions of the procedures that correct for such limitations.
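The Wilcoxon-Mann-Whitney odds discussed above can be computed directly from the U statistic: the probability index p = P(X > Y) + 0.5·P(X = Y) equals U/(n1·n2) when ties are handled by mid-ranks, and the odds is p/(1 − p). The sketch below is a hedged illustration using SciPy, with made-up ordinal scores; it is not the companion article's analysis code.

```python
# Hedged sketch: Wilcoxon rank sum test with the Wilcoxon-Mann-Whitney odds as summary.
import numpy as np
from scipy.stats import mannwhitneyu

def wmw_odds(x, y):
    x, y = np.asarray(x), np.asarray(y)
    res = mannwhitneyu(x, y, alternative='two-sided')
    p_index = res.statistic / (x.size * y.size)   # P(X > Y) + 0.5 * P(X = Y)
    return p_index / (1.0 - p_index), res.pvalue

# Hypothetical ordinal scores on a 0-3 scale for two groups.
odds, pval = wmw_odds([0, 1, 1, 2, 3, 3], [0, 0, 1, 1, 1, 2])
print(odds, pval)
```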
Tomie, Arthur; Kuo, Teresa; Apor, Khristine R; Salomon, Kimberly E; Pohorecky, Larissa A
2004-04-01
The effects of autoshaping procedures (paired vs. random) and sipper fluid (ethanol vs. water) on sipper-directed drinking were evaluated in male Long-Evans rats maintained with free access to food and water. For the paired/ethanol group (n=16), autoshaping procedures consisted of presenting the ethanol sipper (containing 0% to 28% unsweetened ethanol) conditioned stimulus (CS) followed by the response-independent presentation of food unconditioned stimulus (US). The random/ethanol group (n=8) received the sipper CS and food US randomly with respect to one another. The paired/water group (n=8) received only water in the sipper CS. The paired/ethanol group showed higher grams per kilogram ethanol intake than the random/ethanol group did at ethanol concentrations of 8% to 28%. The paired/ethanol group showed higher sipper CS-directed milliliter fluid consumption than the paired/water group did at ethanol concentrations of 1% to 6%, and 15%, 16%, 18%, and 20%. Following a 42-day retention interval, the paired/ethanol group showed superior retention of CS-directed drinking of 18% ethanol, relative to the random/ethanol group, and superior retention of CS-directed milliliter fluid drinking relative to the paired/water group. When tested for home cage ethanol preference using limited access two-bottle (28% ethanol vs. water) procedures, the paired/ethanol and random/ethanol groups did not differ on any drinking measures.
Topology in two dimensions. II - The Abell and ACO cluster catalogues
NASA Astrophysics Data System (ADS)
Plionis, Manolis; Valdarnini, Riccardo; Coles, Peter
1992-09-01
We apply a method for quantifying the topology of projected galaxy clustering to the Abell and ACO catalogues of rich clusters. We use numerical simulations to quantify the statistical bias involved in using high peaks to define the large-scale structure, and we use the results obtained to correct our observational determinations for this known selection effect and also for possible errors introduced by boundary effects. We find that the Abell cluster sample is consistent with clusters being identified with high peaks of a Gaussian random field, but that the ACO shows a slight meatball shift away from the Gaussian behavior over and above that expected purely from the high-peak selection. The most conservative explanation of this effect is that it is caused by some artefact of the procedure used to select the clusters in the two samples.
47 CFR 1.1623 - Probability calculation.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Probability calculation. 1.1623 Section 1.1623 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1623 Probability calculation. (a) All calculations shall be...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 47 Telecommunication 1 2010-10-01 2010-10-01 false Scope. 1.1601 Section 1.1601 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1601 Scope. The provisions of this subpart, and the provisions...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 47 Telecommunication 1 2011-10-01 2011-10-01 false Scope. 1.1601 Section 1.1601 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL PRACTICE AND PROCEDURE Random Selection Procedures for Mass Media Services General Procedures § 1.1601 Scope. The provisions of this subpart, and the provisions...
An anatomically sound surgical simulation model for myringotomy and tympanostomy tube insertion.
Hong, Paul; Webb, Amanda N; Corsten, Gerard; Balderston, Janet; Haworth, Rebecca; Ritchie, Krista; Massoud, Emad
2014-03-01
Myringotomy and tympanostomy tube insertion (MT) is a common surgical procedure. Although surgical simulation has proven to be an effective training tool, an anatomically sound simulation model for MT is lacking. We developed such a model and assessed its impact on the operating room performance of senior medical students. Prospective randomized trial. A randomized single-blind controlled study of simulation training with the MT model versus no simulation training. Each participant was randomized to either the simulation model group or control group, after performing an initial MT procedure. Within two weeks of the first procedure, the students performed a second MT. All procedures were performed on real patients and rated with a Global Rating Scale by two attending otolaryngologists. Time to complete the MT was also recorded. Twenty-four senior medical students were enrolled. Control and intervention groups did not differ at baseline on their Global Rating Scale score or time to complete the MT procedure. Following simulation training, the study group received significantly higher scores (P=.005) and performed the MT procedure in significantly less time (P=.034). The control group did not improve their performance scores (P>.05) or the time to complete the procedure (P>.05). Our surgical simulation model shows promise for being a valuable teaching tool for MT for senior medical students. Such anatomically appropriate physical simulators may benefit teaching of junior trainees. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Hoffmann, Tammy C; Walker, Marion F; Langhorne, Peter; Eames, Sally; Thomas, Emma; Glasziou, Paul
2015-01-01
Objective To assess, in a sample of systematic reviews of non-pharmacological interventions, the completeness of intervention reporting, identify the most frequently missing elements, and assess review authors’ use of and beliefs about providing intervention information. Design Analysis of a random sample of systematic reviews of non-pharmacological stroke interventions; online survey of review authors. Data sources and study selection The Cochrane Library and PubMed were searched for potentially eligible systematic reviews and a random sample of these assessed for eligibility until 60 (30 Cochrane, 30 non-Cochrane) eligible reviews were identified. Data collection In each review, the completeness of the intervention description in each eligible trial (n=568) was assessed by 2 independent raters using the Template for Intervention Description and Replication (TIDieR) checklist. All review authors (n=46) were invited to complete a survey. Results Most reviews were missing intervention information for the majority of items. The most incompletely described items were: modifications, fidelity, materials, procedure and tailoring (missing from all interventions in 97%, 90%, 88%, 83% and 83% of reviews, respectively). Items that scored better, but were still incomplete for the majority of reviews, were: ‘when and how much’ (in 31% of reviews, adequate for all trials; in 57% of reviews, adequate for some trials); intervention mode (in 22% of reviews, adequate for all trials; in 38%, adequate for some trials); and location (in 19% of reviews, adequate for all trials). Of the 33 (71%) authors who responded, 58% reported having further intervention information but not including it, and 70% tried to obtain information. Conclusions Most focus on intervention reporting has been directed at trials. Poor intervention reporting in stroke systematic reviews is prevalent, compounded by poor trial reporting. Without adequate intervention descriptions, the conduct, usability and interpretation of reviews are restricted and therefore, require action by trialists, systematic reviewers, peer reviewers and editors. PMID:26576811
Methodology Series Module 5: Sampling Strategies.
Setia, Maninder Singh
2016-01-01
Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling - based on the researcher's choice and on the population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as the simple random sample or the stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
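The two probability sampling methods named in the abstract above are easy to make concrete. The sketch below draws a simple random sample and a proportionally allocated stratified random sample from an illustrative sampling frame; the frame, the 'clinic' stratum variable, and the sample size are all assumptions made for the example.

```python
# Hedged sketch: simple random sampling vs. proportionally allocated stratified sampling.
import random

def simple_random_sample(frame, n, seed=0):
    return random.Random(seed).sample(frame, n)

def stratified_random_sample(frame, n, strata_of, seed=0):
    rng = random.Random(seed)
    strata = {}
    for unit in frame:
        strata.setdefault(strata_of(unit), []).append(unit)
    sample = []
    for units in strata.values():
        k = max(1, round(n * len(units) / len(frame)))   # proportional allocation
        sample.extend(rng.sample(units, min(k, len(units))))
    return sample

# Hypothetical frame of 300 patients, stratified by clinic type.
frame = [{'id': i, 'clinic': 'urban' if i % 3 else 'rural'} for i in range(300)]
print(len(simple_random_sample(frame, 30)))
print(len(stratified_random_sample(frame, 30, lambda u: u['clinic'])))
```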
Melvin, Neal R; Poda, Daniel; Sutherland, Robert J
2007-10-01
When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
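The core of systematic random sampling as described above is a random start followed by equidistant steps. The sketch below shows that logic in one dimension (a 2-D stage grid would apply the same rule on each axis); the extent and step values are illustrative assumptions, not parameters from the paper.

```python
# Hedged sketch: systematic random sampling sites along one axis.
import random

def systematic_random_sites(extent, step, seed=None):
    rng = random.Random(seed)
    start = rng.uniform(0.0, step)          # random start within the first interval
    sites = []
    position = start
    while position < extent:
        sites.append(position)
        position += step                    # pre-determined, equidistant interval
    return sites

# Example: a 1000-micron extent sampled every 150 microns.
print(systematic_random_sites(extent=1000.0, step=150.0, seed=42))
```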
Kundu, Anjana; Lin, Yuting; Oron, Assaf P; Doorenbos, Ardith Z
2014-02-01
To examine the effects of Reiki as an adjuvant therapy to opioid therapy for postoperative pain control in pediatric patients. This was a double-blind, randomized controlled study of children undergoing dental procedures. Participants were randomly assigned to receive either Reiki therapy or the control therapy (sham Reiki) preoperatively. Postoperative pain scores, opioid requirements, and side effects were assessed. Family members were also asked about perioperative care satisfaction. Multiple linear regressions were used for analysis. Thirty-eight children participated. The blinding procedure was successful. No statistically significant difference was observed between groups on all outcome measures. Our study provides a successful example of a blinding procedure for Reiki therapy among children in the perioperative period. This study does not support the effectiveness of Reiki as an adjuvant therapy to opioid therapy for postoperative pain control in pediatric patients. Copyright © 2013 Elsevier Ltd. All rights reserved.
Kundu, Anjana; Lin, Yuting; Oron, Assaf P.; Doorenbos, Ardith Z.
2014-01-01
Purpose To examine the effects of Reiki as an adjuvant therapy to opioid therapy for postoperative pain control in pediatric patients. Methods This was a double-blind, randomized controlled study of children undergoing dental procedures. Participants were randomly assigned to receive either Reiki therapy or the control therapy (sham Reiki) preoperatively. Postoperative pain scores, opioid requirements, and side effects were assessed. Family members were also asked about perioperative care satisfaction. Multiple linear regressions were used for analysis. Results Thirty-eight children participated. The blinding procedure was successful. No statistically significant difference was observed between groups on all outcome measures. Implications Our study provides a successful example of a blinding procedure for Reiki therapy among children in the perioperative period. This study does not support the effectiveness of Reiki as an adjuvant therapy to opioid therapy for postoperative pain control in pediatric patients. PMID:24439640
The PX-EM algorithm for fast stable fitting of Henderson's mixed model
Foulley, Jean-Louis; Van Dyk, David A
2000-01-01
This paper presents procedures for implementing the PX-EM algorithm of Liu, Rubin and Wu to compute REML estimates of variance-covariance components in Henderson's linear mixed models. The class of models considered encompasses several correlated random factors having the same vector length, e.g., as in random regression models for longitudinal data analysis and in sire-maternal grandsire models for genetic evaluation. Numerical examples are presented to illustrate the procedures. Much better results in terms of convergence characteristics (number of iterations and time required for convergence) are obtained for PX-EM relative to the basic EM algorithm in the random regression case. PMID:14736399
Efficient sampling of complex network with modified random walk strategies
NASA Astrophysics Data System (ADS)
Xie, Yunya; Chang, Shuhua; Zhang, Zhipeng; Zhang, Mi; Yang, Lei
2018-02-01
We present two novel random walk strategies, choosing seed node (CSN) random walk and no-retracing (NR) random walk. Different from classical random walk sampling, the CSN and NR strategies focus on the influence of the seed node choice and of path overlap, respectively. The three random walk samplings are applied to the Erdös-Rényi (ER), Barabási-Albert (BA), Watts-Strogatz (WS), and weighted USAir networks. Then, the major properties of the sampled subnets, such as sampling efficiency, degree distributions, average degree and average clustering coefficient, are studied. Similar conclusions are reached with all three random walk strategies. First, networks with small scales and simple structures are conducive to sampling. Second, the average degree and the average clustering coefficient of the sampled subnet tend to the corresponding values of the original networks within a limited number of steps. Third, all the degree distributions of the subnets are slightly biased toward the high-degree side. However, the NR strategy performs better for the average clustering coefficient of the subnet. In the real weighted USAir network, obvious characteristics such as the larger clustering coefficient and the fluctuation of the degree distribution are reproduced well by these random walk strategies.
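A no-retracing walk of the kind described above simply excludes the node the walker just came from when another neighbour exists. The sketch below implements that rule on a NetworkX graph and returns the visited subnet; the graph model, step count, and seed-node rule are illustrative assumptions rather than the authors' exact configuration.

```python
# Hedged sketch: no-retracing (NR) random walk sampling of a network.
import random
import networkx as nx

def no_retracing_walk(G, steps, seed=0):
    rng = random.Random(seed)
    current = rng.choice(list(G.nodes))       # seed node chosen uniformly here
    previous = None
    visited = [current]
    for _ in range(steps):
        neighbours = list(G.neighbors(current))
        choices = [n for n in neighbours if n != previous] or neighbours
        previous, current = current, rng.choice(choices)
        visited.append(current)
    return G.subgraph(set(visited))

subnet = no_retracing_walk(nx.barabasi_albert_graph(1000, 3, seed=1), steps=200)
print(subnet.number_of_nodes(), subnet.number_of_edges())
```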
Baird, D D; Saldana, T M; Shore, D L; Hill, M C; Schectman, J M
2015-12-01
How well can a single baseline ultrasound assessment of fibroid burden (presence or absence of fibroids and size of largest, if present) predict future probability of having a major uterine procedure? During an 8-year follow-up period, the risk of having a major uterine procedure was 2% for those without fibroids and increased with fibroid size for those with fibroids, reaching 47% for those with fibroids ≥ 4 cm in diameter at baseline. Uterine fibroids are a leading indication for hysterectomy. However, when fibroids are found, there are few available data to help clinicians advise patients about disease progression. Women who were 35-49 years old were randomly selected from the membership of a large urban health plan; 80% of those determined to be eligible were enrolled and screened with ultrasound for fibroids ≥ 0.5 cm in diameter. African-American and white premenopausal participants who responded to at least one follow-up interview (N = 964, 85% of those eligible) constituted the study cohort. During follow-up (5822 person-years), participants self-reported any major uterine procedure (67% hysterectomies). Life-table analyses and Cox regression (with censoring for menopause) were used to estimate the risk of having a uterine procedure for women with no fibroids, small (<2 cm in diameter), medium (2-3.9 cm), and large fibroids (≥ 4 cm). Differences between African-American and white women, importance of a clinical diagnosis of fibroids prior to study enrollment, and the impact of submucosal fibroids on risk were investigated. There was a greater loss to follow-up for African-Americans than whites (19 versus 11%). For those with follow-up data, 64% had fibroids at baseline, 33% of whom had had a prior diagnosis. Of those with fibroids, 27% had small fibroids (<2 cm in diameter), 46% had medium (largest fibroid 2-3.9 cm in diameter), and 27% had large fibroids (largest ≥ 4 cm in diameter). Twenty-one percent had at least one submucosal fibroid. Major uterine procedures were reported by 115 women during follow-up. The estimated risk of having a procedure in any given year of follow-up for those with fibroids compared with those without fibroids increased markedly with fibroid-size category (from 4-fold, confidence interval (CI) (1.4-11.1) for the small fibroids to 10-fold, CI (4.4-24.8) for the medium fibroids, to 27-fold, CI (11.5-65.2) for the large fibroids). This influence of fibroid size on risk did not differ between African-Americans and whites (P-value for interaction = 0.88). Once fibroid size at enrollment was accounted for, having a prior diagnosis at the time of ultrasound screening was not predictive of having a procedure. Exclusion of women with a submucosal fibroid had little influence on the results. The 8-year risk of a procedure based on lifetable analyses was 2% for women with no fibroids, 8, 23, and 47%, respectively, for women who had small, medium or large fibroids at enrollment. Given the strong association of fibroid size with subsequent risk of a procedure, these findings are unlikely to be due to chance. Despite a large sample size, the number of women having procedures during follow-up was relatively small. Thus, covariates such as BMI, which were not important in our analyses, may have associations that were too small to detect with our sample size. Another limitation is that the medical procedures were self-reported. However, we attempted to retrieve medical records when participants agreed, and 77% of the total procedures reported were verified. 
Our findings are likely to be generalizable to other African-American and white premenopausal women in their late 30s and 40s, but other ethnic groups have not been studied. Though further studies are needed to confirm and extend the results, our findings provide an initial estimate of disease progression that will be helpful to clinicians and their patients. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology 2015. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Williams, Jessica R; Halstead, Valerie; Salani, Deborah; Koermer, Natasha
2016-01-01
This study examines policies and procedures for identifying and responding to intimate partner violence (IPV) among different types of health care settings. This epidemiologic, cross-sectional, observational study design collected data from June 2014 to January 2015 through a telephone questionnaire from a stratified random sample of 288 health care facilities in Miami-Dade County, Florida. An overall response rate of 76.2% was achieved from 72 primary care clinics, 93 obstetrics/gynecology clinics, 106 pediatric clinics, and 17 emergency departments (EDs). There is a general awareness of the importance of IPV screening with 78.1% of facilities (95% CI, 73.9%-82.3%) reporting some type of IPV screening procedures. Wide variation exists, however, in how practices are implemented, with only 35.3% of facilities (95% CI, 29.5%-41.1%) implementing multicomponent, comprehensive IPV screening and response programs. Differences were also observed by setting with EDs reporting the most comprehensive programs. This study yields important empirical information regarding the extent to which IPV screening and response procedures are currently being implemented in both clinic and acute health care settings along with areas where improvements are needed. Copyright © 2016 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.
Using tablet technology and instructional videos to enhance preclinical dental laboratory learning.
Gadbury-Amyot, Cynthia C; Purk, John H; Williams, Brian Joseph; Van Ness, Christopher J
2014-02-01
The purpose of this pilot study was to examine whether tablet technology with accompanying instructional videos enhanced the teaching and learning outcomes in a preclinical dental laboratory setting. Two procedures deemed most challenging in Operative Dentistry II were chosen for the development of instructional videos. A random sample of thirty students was chosen to participate in the pilot. Comparison of faculty evaluations of the procedures between the experimental (tablet) and control (no tablet) groups resulted in no significant differences; however, there was a trend toward fewer failures in the experimental group. The ability to accurately self-assess was compared by exploring correlations between faculty and student evaluations. While correlations were stronger in the experimental group, the control group had significant correlations for all three procedures, whereas the experimental group had significant correlations on only two of the procedures. Students strongly perceived that the tablets and videos helped them perform better and more accurately self-assess their work products. Students did not support requiring that they purchase/obtain a specific brand of technology. As a result of this pilot study, further development of ideal and non-ideal videos is in progress, and the school will be implementing a "Bring Your Own Device" policy with incoming students.
Methodology Series Module 5: Sampling Strategies
Setia, Maninder Singh
2016-01-01
Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin, etc.); and 2) non-probability sampling – based on the researcher's choice and on the population that is accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as the simple random sample or the stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of the results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438
Incidence of tuberculosis among school-going adolescents in South India.
Uppada, Dharma Rao; Selvam, Sumithra; Jesuraj, Nelson; Lau, Esther L; Doherty, T Mark; Grewal, Harleen M S; Vaz, Mario; Lindtjørn, Bernt
2016-07-26
Tuberculosis (TB) incidence data in vaccine target populations, particularly adolescents, are important for designing and powering vaccine clinical trials. Little is known about the incidence of tuberculosis among adolescents in India. The objective of the current study was to estimate the incidence of pulmonary tuberculosis (PTB) disease among adolescents attending school in South India using two different surveillance methods (active and passive) and to compare the incidence between the two groups. The study was a prospective cohort study with a 2-year follow-up period. The study was conducted in Palamaner, Chittoor District of Andhra Pradesh, South India from February 2007 to July 2010. A random sampling procedure was used to select a subset of schools to enable approximately 8000 subjects to be available for randomization in the study. A stratified randomization procedure was used to assign the selected schools to either active or passive surveillance. Participants who met the criteria for being exposed to TB were referred to the diagnostic ward for pulmonary tuberculosis confirmation. A total of 3441 males and 3202 females aged 11 to less than 18 years were enrolled into the study. Of the 3102 participants in the active surveillance group, four subjects were diagnosed with definite tuberculosis, four subjects with probable tuberculosis, and 71 subjects had non-tuberculous mycobacteria (NTM) isolated from their sputum. Of the 3541 participants in the passive surveillance group, four subjects were diagnosed with definite tuberculosis, two subjects with probable tuberculosis, and 48 subjects had non-tuberculous mycobacteria isolated from their sputum. The incidence of definite + probable TB was 147.60 / 100,000 person-years in the active surveillance group and 87 / 100,000 person-years in the passive surveillance group. The incidence of pulmonary tuberculosis among adolescents in our study is lower than in similar studies conducted in South Africa and Eastern Uganda - countries with a higher incidence of tuberculosis and human immunodeficiency virus (HIV) than India. The study data will inform sample design for vaccine efficacy trials among adolescents in India.
Randomized Comparison of 3 High-Level Disinfection and Sterilization Procedures for Duodenoscopes.
Snyder, Graham M; Wright, Sharon B; Smithey, Anne; Mizrahi, Meir; Sheppard, Michelle; Hirsch, Elizabeth B; Chuttani, Ram; Heroux, Riley; Yassa, David S; Olafsdottir, Lovisa B; Davis, Roger B; Anastasiou, Jiannis; Bapat, Vijay; Bidari, Kiran; Pleskow, Douglas K; Leffler, Daniel; Lane, Benjamin; Chen, Alice; Gold, Howard S; Bartley, Anthony; King, Aleah D; Sawhney, Mandeep S
2017-10-01
Duodenoscopes have been implicated in the transmission of multidrug-resistant organisms (MDRO). We compared the frequency of duodenoscope contamination with MDRO or any other bacteria after disinfection or sterilization by 3 different methods. We performed a single-center prospective randomized study in which duodenoscopes were randomly reprocessed by standard high-level disinfection (sHLD), double high-level disinfection (dHLD), or standard high-level disinfection followed by ethylene oxide gas sterilization (HLD/ETO). Samples were collected from the elevator mechanism and working channel of each duodenoscope and cultured before use. The primary outcome was the proportion of duodenoscopes with an elevator mechanism or working channel culture showing 1 or more MDRO; secondary outcomes included the frequency of duodenoscope contamination with more than 0 and 10 or more colony-forming units (CFU) of aerobic bacterial growth on either sampling location. After 3 months of enrollment, the study was closed for futility; we did not observe sufficient events to evaluate the primary outcome. Among 541 duodenoscope culture events, 516 were included in the final analysis. No duodenoscope culture in any group was positive for MDRO. Bacterial growth of more than 0 CFU was noted in 16.1% of duodenoscopes in the sHLD group, 16.0% in the dHLD group, and 22.5% in the HLD/ETO group (P = .21). Bacterial growth of 10 or more CFU was noted in 2.3% of duodenoscopes in the sHLD group, 4.1% in the dHLD group, and 4.2% in the HLD/ETO group (P = .36). MDROs were cultured from 3.2% of pre-procedure rectal swabs and 2.5% of duodenal aspirates. In a comparison of duodenoscopes reprocessed by sHLD, dHLD, or HLD/ETO, we found no significant differences between groups in MDRO or bacterial contamination. Enhanced disinfection methods (dHLD or HLD/ETO) did not provide additional protection against contamination. However, insufficient events occurred to assess our primary study end-point. ClinicalTrials.gov no: NCT02611648. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
Indovina, Paola; Barone, Daniela; Gallo, Luigi; Chirico, Andrea; De Pietro, Giuseppe; Antonio, Giordano
2018-02-26
This review aims to provide a framework for evaluating the utility of virtual reality (VR) as a distraction intervention to alleviate pain and distress during medical procedures. We firstly describe the theoretical bases underlying the VR analgesic and anxiolytic effects and define the main factors contributing to its efficacy, which largely emerged from studies on healthy volunteers. Then, we provide a comprehensive overview of the clinical trials using VR distraction during different medical procedures, such as burn injury treatments, chemotherapy, surgery, dental treatment, and other diagnostic and therapeutic procedures. A broad literature search was performed using as main terms "virtual reality", "distraction" and "pain". No date limit was applied and all the retrieved studies on immersive VR distraction during medical procedures were selected. VR has proven to be effective in reducing procedural pain, as almost invariably observed even in patients subjected to extremely painful procedures, such as patients with burn injuries undergoing wound care and physical therapy. Moreover, VR seemed to decrease cancer-related symptoms in different settings, including during chemotherapy. Only mild and infrequent side effects were observed. Despite these promising results, future long-term randomized controlled trials with larger sample sizes and evaluating not only self-report measures but also physiological variables are needed. Further studies are also required both to establish predictive factors to select patients who can benefit from VR distraction and to design hardware/software systems tailored to the specific needs of different patients and able to provide the greatest distraction at the lowest cost.
Makowski, David; Bancal, Rémi; Bensadoun, Arnaud; Monod, Hervé; Messéan, Antoine
2017-09-01
According to E.U. regulations, the maximum allowable rate of adventitious transgene presence in non-genetically modified (GM) crops is 0.9%. We compared four sampling methods for the detection of transgenic material in agricultural non-GM maize fields: random sampling, stratified sampling, random sampling + ratio reweighting, random sampling + regression reweighting. Random sampling involves simply sampling maize grains from different locations selected at random from the field concerned. The stratified and reweighting sampling methods make use of an auxiliary variable corresponding to the output of a gene-flow model (a zero-inflated Poisson model) simulating cross-pollination as a function of wind speed, wind direction, and distance to the closest GM maize field. With the stratified sampling method, an auxiliary variable is used to define several strata with contrasting transgene presence rates, and grains are then sampled at random from each stratum. With the two methods involving reweighting, grains are first sampled at random from various locations within the field, and the observations are then reweighted according to the auxiliary variable. Data collected from three maize fields were used to compare the four sampling methods, and the results were used to determine the extent to which transgene presence rate estimation was improved by the use of stratified and reweighting sampling methods. We found that transgene rate estimates were more accurate and that substantially smaller samples could be used with sampling strategies based on an auxiliary variable derived from a gene-flow model. © 2017 Society for Risk Analysis.
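The "random sampling + ratio reweighting" strategy described above amounts to the classical ratio estimator: the plain sample mean is rescaled by the ratio of the known field-wide mean of the auxiliary variable to its sample mean. The sketch below illustrates that with simulated data; the gamma-distributed cross-pollination index, the 0.9 proportionality factor, and the sample size are illustrative assumptions, not values from the study.

```python
# Hedged sketch: ratio reweighting of a random sample using a model-based auxiliary variable.
import numpy as np

def ratio_reweighted_rate(sample_rate, sample_aux, aux_field_mean):
    # Classical ratio estimator: y_bar * (X_bar / x_bar).
    return sample_rate.mean() * aux_field_mean / sample_aux.mean()

rng = np.random.default_rng(3)
aux_field = rng.gamma(shape=0.5, scale=0.02, size=5000)    # gene-flow index at all locations
true_rate = aux_field * 0.9                                # presence rate tracks the index (toy)
idx = rng.choice(aux_field.size, size=100, replace=False)  # random sample of locations
print(ratio_reweighted_rate(true_rate[idx], aux_field[idx], aux_field.mean()))
print(true_rate[idx].mean())                               # plain random-sampling estimate
```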
The computation of pi to 29,360,000 decimal digits using Borweins' quartically convergent algorithm
NASA Technical Reports Server (NTRS)
Bailey, David H.
1988-01-01
The quartically convergent numerical algorithm developed by Borwein and Borwein (1987) for 1/pi is implemented via a prime-modulus-transform multiprecision technique on the NASA Ames Cray-2 supercomputer to compute the first 2.936 x 10 to the 7th digits of the decimal expansion of pi. The history of pi computations is briefly recalled; the most recent algorithms are characterized; the implementation procedures are described; and samples of the output listing are presented. Statistical analyses show that the present decimal expansion is completely random, with only acceptable numbers of long repeating strings and single-digit runs.
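For reference, the quartically convergent Borwein iteration for 1/pi mentioned above can be sketched in a few lines of arbitrary-precision arithmetic. The version below uses mpmath rather than the paper's prime-modulus-transform multiprecision technique, and the digit and iteration counts are illustrative; each iteration roughly quadruples the number of correct digits.

```python
# Hedged sketch: Borweins' quartically convergent iteration for 1/pi (mpmath precision).
from mpmath import mp, mpf, sqrt

def borwein_quartic_pi(digits=1000, iterations=8):
    mp.dps = digits + 20                     # working precision with guard digits
    y = sqrt(2) - 1
    a = 6 - 4 * sqrt(2)
    for k in range(iterations):
        r = (1 - y**4) ** mpf('0.25')
        y = (1 - r) / (1 + r)
        a = a * (1 + y)**4 - 2**(2 * k + 3) * y * (1 + y + y**2)
    return 1 / a                             # a_k converges quartically to 1/pi

print(borwein_quartic_pi(digits=50, iterations=4))   # agrees with pi to the working precision
```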
Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses.
Faul, Franz; Erdfelder, Edgar; Buchner, Axel; Lang, Albert-Georg
2009-11-01
G*Power is a free power analysis program for a variety of statistical tests. We present extensions and improvements of the version introduced by Faul, Erdfelder, Lang, and Buchner (2007) in the domain of correlation and regression analyses. In the new version, we have added procedures to analyze the power of tests based on (1) single-sample tetrachoric correlations, (2) comparisons of dependent correlations, (3) bivariate linear regression, (4) multiple linear regression based on the random predictor model, (5) logistic regression, and (6) Poisson regression. We describe these new features and provide a brief introduction to their scope and handling.
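As a hedged, stand-alone analogue of one of the analyses listed above (not G*Power's own routine, whose exact algorithms may differ), the sketch below approximates the power of a two-sided test of H0: rho = 0 for a bivariate correlation using the Fisher z transformation.

```python
# Hedged sketch: approximate power for testing a bivariate correlation via Fisher's z.
import numpy as np
from scipy.stats import norm

def correlation_power(rho, n, alpha=0.05):
    z_effect = np.arctanh(rho) * np.sqrt(n - 3)   # Fisher z with standard error 1/sqrt(n-3)
    z_crit = norm.ppf(1 - alpha / 2)
    return norm.sf(z_crit - z_effect) + norm.cdf(-z_crit - z_effect)

print(correlation_power(rho=0.3, n=84))   # roughly 0.80 for this textbook case
```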
Classical verification of quantum circuits containing few basis changes
NASA Astrophysics Data System (ADS)
Demarie, Tommaso F.; Ouyang, Yingkai; Fitzsimons, Joseph F.
2018-04-01
We consider the task of verifying the correctness of quantum computation for a restricted class of circuits which contain at most two basis changes. This contains circuits giving rise to the second level of the Fourier hierarchy, the lowest level for which there is an established quantum advantage. We show that when the circuit has an outcome with probability at least the inverse of some polynomial in the circuit size, the outcome can be checked in polynomial time with bounded error by a completely classical verifier. This verification procedure is based on random sampling of computational paths and is only possible given knowledge of the likely outcome.
[Pay attention to the complexity of cataract surgery in vitrectomized eyes].
Bao, Y Z
2017-04-11
With the widespread adoption of pars plana vitrectomy, cataract surgery in vitrectomized (vitreous-free) eyes is becoming increasingly common. This kind of surgery varies greatly between individuals, and large randomized clinical trials are lacking; surgical strategy has therefore relied largely on the surgeon's personal experience. Surgeons should be fully aware of the individual and common characteristics of cataract surgery in vitrectomized eyes. The timing of surgery should be decided carefully, and complete ocular examination, evaluation, planning of the cataract surgical procedure, and appropriate intraocular lens selection are needed. Close attention must be paid to cataract surgery in eyes without vitreous. (Chin J Ophthalmol, 2017, 53: 241-243).
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Alfonsi; C. Rabiti; D. Mandelli
The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that dispatches several functionalities: deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
Proietti, Riccardo; Pecoraro, Valentina; Di Biase, Luigi; Natale, Andrea; Santangeli, Pasquale; Viecca, Maurizio; Sagone, Antonio; Galli, Alessio; Moja, Lorenzo; Tagliabue, Ludovica
2013-09-01
The aim of this study was to determine the efficacy and safety of remote magnetic navigation (RMN) with open-irrigated catheter vs. manual catheter navigation (MCN) in performing atrial fibrillation (AF) ablation. We searched in PubMed (1948-2013) and EMBASE (1974-2013) studies comparing RMN with MCN. Outcomes considered were AF recurrence (primary outcome), pulmonary vein isolation (PVI), procedural complications, and data on procedure's performance. Odds ratios (OR) and mean difference (MD) were extracted and pooled using a random-effect model. Confidence in the estimates of the obtained effects (quality of evidence) was assessed using the Grading of Recommendations Assessment, Development and Evaluation approach. We identified seven controlled trials, six non-randomized and one randomized, including a total of 941 patients. Studies were at high risk of bias. No difference was observed between RMN and MCN on AF recurrence [OR 1.18, 95% confidence interval (CI) 0.85 to 1.65, P = 0.32] or PVI (OR 0.41, 95% CI 0.11-1.47, P = 0.17). Remote magnetic navigation was associated with less peri-procedural complications (Peto OR 0.41, 95% CI 0.19-0.88, P = 0.02). Mean fluoroscopy time was reduced in RMN group (-22.22 min; 95% CI -42.48 to -1.96, P = 0.03), although the overall duration of the procedure was longer (60.91 min; 95% CI 31.17 to 90.65, P < 0.0001). In conclusion, RMN is not superior to MCN in achieving freedom from recurrent AF at mid-term follow-up or PVI. The procedure implies less peri-procedural complications, requires a shorter fluoroscopy time but a longer total procedural time. For the low quality of the available evidence, a proper designed randomized controlled trial could turn the direction and the effect of the dimensions explored.
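The pooling step described above (odds ratios and mean differences combined under a random-effects model) can be sketched with the DerSimonian-Laird estimator, a common choice for this kind of analysis; the study effects and variances below are invented for illustration and are not the data of this meta-analysis.

```python
import numpy as np

def dersimonian_laird(y, v):
    """Pool study effects y (e.g., log odds ratios or mean differences)
    with within-study variances v under a random-effects model."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                 # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)            # between-study variance
    w_star = 1.0 / (v + tau2)
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# illustrative log odds ratios and variances from hypothetical studies
log_or = [0.25, -0.10, 0.40, 0.05]
var    = [0.04,  0.09, 0.16, 0.02]
est, ci = dersimonian_laird(log_or, var)
print(np.exp(est), np.exp(ci[0]), np.exp(ci[1]))       # back-transformed OR and 95% CI
```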
Sears, Erika Davis; Burke, James F; Davis, Matthew M; Chung, Kevin C
2013-03-01
The purpose of this study was to (1) understand national variation in delay of emergency procedures in patients with open tibial fracture at the hospital level and (2) compare length of stay and cost in patients cared for at the best- and worst-performing hospitals for delay. The authors retrospectively analyzed the 2003 to 2009 Nationwide Inpatient Sample. Adult patients with open tibial fracture were included. Hospital probability of delay in performing emergency procedures beyond the day of admission was calculated. Multilevel linear regression random-effects models were created to evaluate the relationship between the treating hospital's tendency for delay (in quartiles) and the log-transformed outcomes of length of stay and cost. The final sample included 7029 patients from 332 hospitals. Patients treated at hospitals in the fourth (worst) quartile for delay were estimated to have 12 percent (95 percent CI, 2 to 21 percent) higher cost compared with patients treated at hospitals in the first quartile. In addition, patients treated at hospitals in the fourth quartile had an estimated 11 percent (95 percent CI, 4 to 17 percent) longer length of stay compared with patients treated at hospitals in the first quartile. Patients with open tibial fracture treated at hospitals with more timely initiation of surgical care had lower cost and shorter length of stay than patients treated at hospitals with less timely initiation of care. Policies directed toward mitigating variation in care may reduce unnecessary waste.
Sampling Large Graphs for Anticipatory Analytics
2015-05-15
Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... (Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller, Lincoln...) ...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
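A minimal sketch of random area ("snowball") sampling on a graph, assuming a breadth-first expansion from randomly chosen seed vertices; the synthetic Barabasi-Albert graph, seed count, and target size are arbitrary choices, and networkx is used only for convenience.

```python
import random
import networkx as nx

def random_area_sample(g, n_seeds=5, target=200, seed=0):
    """Grow areas around randomly chosen seed vertices until the
    requested number of vertices has been collected."""
    rng = random.Random(seed)
    seeds = rng.sample(list(g.nodes), n_seeds)
    visited, frontier = set(seeds), list(seeds)
    while frontier and len(visited) < target:
        node = frontier.pop(0)                 # breadth-first expansion
        for nb in g.neighbors(node):
            if nb not in visited and len(visited) < target:
                visited.add(nb)
                frontier.append(nb)
    return g.subgraph(visited).copy()

g = nx.barabasi_albert_graph(10_000, 3, seed=1)
sample = random_area_sample(g)
print(sample.number_of_nodes(), sample.number_of_edges())
```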
Electromagnetic Scattering by Fully Ordered and Quasi-Random Rigid Particulate Samples
NASA Technical Reports Server (NTRS)
Mishchenko, Michael I.; Dlugach, Janna M.; Mackowski, Daniel W.
2016-01-01
In this paper we have analyzed circumstances under which a rigid particulate sample can behave optically as a true discrete random medium consisting of particles randomly moving relative to each other during measurement. To this end, we applied the numerically exact superposition T-matrix method to model far-field scattering characteristics of fully ordered and quasi-randomly arranged rigid multiparticle groups in fixed and random orientations. We have shown that, in and of itself, averaging optical observables over movements of a rigid sample as a whole is insufficient unless it is combined with a quasi-random arrangement of the constituent particles in the sample. Otherwise, certain scattering effects typical of discrete random media (including some manifestations of coherent backscattering) may not be accurately replicated.
Robust Approach to Verifying the Weak Form of the Efficient Market Hypothesis
NASA Astrophysics Data System (ADS)
Střelec, Luboš
2011-09-01
The weak form of the efficient markets hypothesis states that prices incorporate only past information about the asset. An implication of this form of the efficient markets hypothesis is that one cannot detect mispriced assets and consistently outperform the market through technical analysis of past prices. One possible formulation of the efficient market hypothesis used for weak form tests is that share prices follow a random walk. This means that returns are realizations of an IID sequence of random variables. Consequently, for verifying the weak form of the efficient market hypothesis, we can use distribution tests, among others, such as tests of normality and/or graphical methods. Many procedures for testing the normality of univariate samples have been proposed in the literature [7]. Today the most popular omnibus test of normality for general use is the Shapiro-Wilk test. The Jarque-Bera test is the most widely adopted omnibus test of normality in econometrics and related fields. In particular, the Jarque-Bera test (i.e. a test based on the classical measures of skewness and kurtosis) is frequently used when one is more concerned about heavy-tailed alternatives. As these measures are based on moments of the data, this test has a zero breakdown value [2]. In other words, a single outlier can make the test worthless. The reason so many classical procedures are nonrobust to outliers is that the parameters of the model are expressed in terms of moments, and their classical estimators are expressed in terms of sample moments, which are very sensitive to outliers. Another approach to robustness is to concentrate on the parameters of interest suggested by the problem under study. Consequently, novel robust procedures for testing normality are presented in this paper to overcome shortcomings of classical normality tests in the field of financial data, which typically feature remote data points and additional types of deviations from normality. This study also discusses some results of simulation power studies of these tests for normality against selected alternatives. Based on the outcome of the power simulation study, selected normality tests were then used to verify the weak form of efficiency in Central European stock markets.
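A short sketch of the moment-based fragility discussed above: the Shapiro-Wilk and Jarque-Bera tests are applied to simulated IID returns, and a single injected outlier is then enough to drive the moment-based statistic to rejection. The return series and the size of the outlier are illustrative assumptions, not data from the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
returns = rng.normal(0, 0.01, size=1000)        # simulated IID daily returns

w, p_sw = stats.shapiro(returns)
jb, p_jb = stats.jarque_bera(returns)
print(p_sw, p_jb)                               # high p-values: normality not rejected

# a single extreme observation is enough to upset the moment-based test
contaminated = returns.copy()
contaminated[0] = 0.15
jb_c, p_jb_c = stats.jarque_bera(contaminated)
print(p_jb_c)                                   # collapses toward zero
```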
Local linear regression for function learning: an analysis based on sample discrepancy.
Cervellera, Cristiano; Macciò, Danilo
2014-11-01
Local linear regression models, a kind of nonparametric structure that locally performs a linear estimation of the target function, are analyzed in the context of empirical risk minimization (ERM) for function learning. The analysis is carried out with emphasis on geometric properties of the available data. In particular, the discrepancy of the observation points used both to build the local regression models and compute the empirical risk is considered. This makes it possible to treat equally the case in which the samples come from a random external source and the one in which the input space can be freely explored. Both consistency of the ERM procedure and approximating capabilities of the estimator are analyzed, proving conditions to ensure convergence. Since the theoretical analysis shows that the estimation improves as the discrepancy of the observation points becomes smaller, low-discrepancy sequences, a family of sampling methods commonly employed for efficient numerical integration, are also analyzed. Simulation results involving two different examples of function learning are provided.
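The two ingredients discussed above, a local linear (kernel-weighted least squares) estimator and a low-discrepancy design of the input space, can be sketched as follows; the bandwidth, the Sobol design size, and the target function are illustrative choices and not the experiments of the paper.

```python
import numpy as np
from scipy.stats import qmc

def local_linear(x_query, x, y, bandwidth=0.1):
    """Local linear estimate at x_query with a Gaussian kernel (1-D inputs)."""
    w = np.exp(-0.5 * ((x - x_query) / bandwidth) ** 2)
    X = np.column_stack([np.ones_like(x), x - x_query])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]                    # intercept = fitted value at x_query

# low-discrepancy (Sobol) design of the input space instead of i.i.d. sampling
sobol = qmc.Sobol(d=1, scramble=True, seed=0)
x = sobol.random(256).ravel()
y = np.sin(2 * np.pi * x) + 0.1 * np.random.default_rng(0).normal(size=x.size)

print(local_linear(0.25, x, y))       # close to sin(pi/2) = 1
```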
Marien, Koen M.; Andries, Luc; De Schepper, Stefanie; Kockx, Mark M.; De Meyer, Guido R.Y.
2015-01-01
Tumor angiogenesis is measured by counting microvessels in tissue sections at high power magnification as a potential prognostic or predictive biomarker. Until now, regions of interest (ROIs) were selected by manual operations within a tumor by using a systematic uniform random sampling (SURS) approach. Although SURS is the most reliable sampling method, it implies a high workload. However, SURS can be semi-automated and in this way contribute to the development of a validated quantification method for microvessel counting in the clinical setting. Here, we report a method to use semi-automated SURS for microvessel counting: • Whole slide imaging with Pannoramic SCAN (3DHISTECH) • Computer-assisted sampling in Pannoramic Viewer (3DHISTECH) extended by two self-written AutoHotkey applications (AutoTag and AutoSnap) • The use of digital grids in Photoshop® and Bridge® (Adobe Systems) This rapid procedure allows traceability essential for high throughput protein analysis of immunohistochemically stained tissue. PMID:26150998
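The sampling scheme itself is simple to express in code: systematic uniform random sampling places a regular grid of fields of view over the region of interest with a single random offset. The sketch below is a generic illustration with arbitrary dimensions and units, not the AutoHotkey/Photoshop workflow described above.

```python
import numpy as np

def surs_positions(width, height, step, rng=None):
    """Systematic uniform random sampling: a regular grid of sampling
    positions with a single random offset applied to the whole grid."""
    rng = rng or np.random.default_rng()
    x0 = rng.uniform(0, step)
    y0 = rng.uniform(0, step)
    xs = np.arange(x0, width, step)
    ys = np.arange(y0, height, step)
    return [(x, y) for y in ys for x in xs]

# e.g. a 2,000-unit grid over a 20,000 x 15,000 unit tumor region (units arbitrary)
fields = surs_positions(20_000, 15_000, 2_000, np.random.default_rng(3))
print(len(fields), fields[:3])
```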
Microbiological survey of raw and ready-to-eat leafy green vegetables marketed in Italy.
Losio, M N; Pavoni, E; Bilei, S; Bertasi, B; Bove, D; Capuano, F; Farneti, S; Blasi, G; Comin, D; Cardamone, C; Decastelli, L; Delibato, E; De Santis, P; Di Pasquale, S; Gattuso, A; Goffredo, E; Fadda, A; Pisanu, M; De Medici, D
2015-10-01
The presence of foodborne pathogens (Salmonella spp., Listeria monocytogenes, Escherichia coli O157:H7, thermotolerant Campylobacter, Yersinia enterocolitica and norovirus) in fresh leafy (FL) and ready-to-eat (RTE) vegetable products, sampled at random on the Italian market, was investigated to evaluate the level of risk to consumers. Nine regional laboratories, representing 18 of the 20 regions of Italy and in which 97.7% of the country's population resides, were involved in this study. All laboratories used the same sampling procedures and analytical methods. The vegetable samples were screened using validated real-time PCR (RT-PCR) methods and standardized reference ISO culturing methods. The results show that 3.7% of 1372 fresh leafy vegetable products and 1.8% of 1160 "fresh-cut" or "ready-to-eat" (RTE) vegetable products retailed in supermarkets or farm markets were contaminated with one or more foodborne pathogens harmful to human health. Copyright © 2015 Elsevier B.V. All rights reserved.
Smoking profile among the gay and lesbian community in Ireland.
Kabir, Z; Keogan, S; Clarke, V; Currie, L M; Clancy, L
2010-09-01
We hypothesized that smoking rates among the Gay and Lesbian Community (GLC) in Ireland are not significantly different from the general Irish population. A convenience sample of self-identified GLC members was recruited using electronic (n = 700) and print (n = 500) media procedures in response to survey call advertisements (December 2006-March 2007). In all, 1,113 had complete smoking data and were analyzed. Data on a random sample of 4,000 individuals, using the Irish Office of Tobacco Control monthly telephone survey, were analyzed for the same period. Adjusted smoking rates were 26% in the GLC and 24.6% in the general Irish population (P = 0.99), while "heavy" (> or =20 cigarettes/day) smoking prevalence was 44.1 and 36.6%, respectively (P = 0.02). Upper-SES GLC members were more often "heavy" smokers than the general population of the same SES group (P = 0.01). When considering two different sampling methodologies, this study suggests that smoking rates among the GLC in Ireland are not significantly different from the general Irish population.
Miller, Michael A; Colby, Alison C C; Kanehl, Paul D; Blocksom, Karen
2009-03-01
The Wisconsin Department of Natural Resources (WDNR), with support from the U.S. EPA, conducted an assessment of wadeable streams in the Driftless Area ecoregion in western Wisconsin using a probabilistic sampling design. This ecoregion encompasses 20% of Wisconsin's land area and contains 8,800 miles of perennial streams. Randomly-selected stream sites (n = 60) equally distributed among stream orders 1-4 were sampled. Watershed land use, riparian and in-stream habitat, water chemistry, macroinvertebrate, and fish assemblage data were collected at each true random site and an associated "modified-random" site on each stream that was accessed via a road crossing nearest to the true random site. Targeted least-disturbed reference sites (n = 22) were also sampled to develop reference conditions for various physical, chemical, and biological measures. Cumulative distribution function plots of various measures collected at the true random sites evaluated with reference condition thresholds, indicate that high proportions of the random sites (and by inference the entire Driftless Area wadeable stream population) show some level of degradation. Study results show no statistically significant differences between the true random and modified-random sample sites for any of the nine physical habitat, 11 water chemistry, seven macroinvertebrate, or eight fish metrics analyzed. In Wisconsin's Driftless Area, 79% of wadeable stream lengths were accessible via road crossings. While further evaluation of the statistical rigor of using a modified-random sampling design is warranted, sampling randomly-selected stream sites accessed via the nearest road crossing may provide a more economical way to apply probabilistic sampling in stream monitoring programs.
Strategic Use of Random Subsample Replication and a Coefficient of Factor Replicability
ERIC Educational Resources Information Center
Katzenmeyer, William G.; Stenner, A. Jackson
1975-01-01
The problem of demonstrating replicability of factor structure across random variables is addressed. Procedures are outlined which combine the use of random subsample replication strategies with the correlations between factor score estimates across replicate pairs to generate a coefficient of replicability and confidence intervals associated with…
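A rough sketch of the general idea, under the assumption of a single-factor model and scikit-learn's FactorAnalysis: the sample is split into a random replicate pair, each half yields a factor solution, and the correlation between the resulting factor score estimates serves as a replicability coefficient. The simulated data and the use of scikit-learn are illustrative choices, not the authors' procedure.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
# simulated item responses: 500 respondents, 8 items loading on one factor
n, p = 500, 8
factor = rng.normal(size=(n, 1))
X = factor @ rng.uniform(0.5, 0.9, size=(1, p)) + rng.normal(scale=0.6, size=(n, p))

# split the sample into a random replicate pair
idx = rng.permutation(n)
half_a, half_b = idx[: n // 2], idx[n // 2:]

fa_a = FactorAnalysis(n_components=1).fit(X[half_a])
fa_b = FactorAnalysis(n_components=1).fit(X[half_b])

# factor score estimates for the full sample under each half's solution
scores_a = fa_a.transform(X).ravel()
scores_b = fa_b.transform(X).ravel()

# the correlation across the replicate pair serves as a replicability coefficient
r = abs(np.corrcoef(scores_a, scores_b)[0, 1])   # abs() because factor sign is arbitrary
print(round(r, 3))
```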
Xu, Jiao; Zhang, Juan; Wang, Xue-Qiang; Wang, Xuan-Lin; Wu, Ya; Chen, Chan-Cheng; Zhang, Han-Yu; Zhang, Zhi-Wan; Fan, Kai-Yi; Zhu, Qiang; Deng, Zhi-Wei
2017-12-01
Total knee arthroplasty (TKA) has become the procedure most preferred by patients for the relief of pain caused by knee osteoarthritis. TKA patients aim for a speedy recovery after the surgery. Joint mobilization techniques for rehabilitation have been widely used to relieve pain and improve joint mobility. However, relevant randomized controlled trials showing the curative effect of these techniques remain lacking to date. Accordingly, this study aims to investigate whether joint mobilization techniques are effective for primary TKA. We will conduct a single-blind, prospective, randomized, controlled trial of 120 patients with unilateral TKA. Patients will be randomized into an intervention group, a physical modality therapy group, and a usual care group. The intervention group will undergo joint mobilization manipulation treatment once a day and regular training twice a day for a month. The physical modality therapy group will undergo physical therapy once a day and regular training twice a day for a month. The usual care group will perform regular training twice a day for a month. Primary outcome measures will be based on the visual analog scale, the knee joint Hospital for Special Surgery score, range of motion, surrounded degree, and adverse effects. Secondary indicators will include manual muscle testing, the 36-Item Short Form Health Survey, Berg Balance Scale function evaluation, the Pittsburgh Sleep Quality Index, proprioception, and muscle morphology. We will perform an intention-to-treat analysis if a subject withdraws from the trial. The important features of this trial of joint mobilization techniques in primary TKA are its randomization procedures, single-blind design, large sample size, and standardized protocol. This study aims to investigate whether joint mobilization techniques are effective for early TKA patients. The result of this study may serve as a guide for TKA patients, medical personnel, and healthcare decision makers. The trial has been registered at http://www.chictr.org.cn/showproj.aspx?proj=15262 (Identifier: ChiCTR-IOR-16009192), registered 11 September 2016, and is also listed in the WHO trial registry at http://apps.who.int/trialsearch/Trial2.aspx?TrialID=ChiCTR-IOR-16009192.
19 CFR 151.52 - Sampling procedures.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 19 Customs Duties 2 2011-04-01 2011-04-01 false Sampling procedures. 151.52 Section 151.52 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Metal-Bearing Ores and Other Metal-Bearing Materials § 151.52 Sampling procedures. (a) Commercial samples taken under Customs supervision...
19 CFR 151.52 - Sampling procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 19 Customs Duties 2 2010-04-01 2010-04-01 false Sampling procedures. 151.52 Section 151.52 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Metal-Bearing Ores and Other Metal-Bearing Materials § 151.52 Sampling procedures. (a) Commercial samples taken under Customs supervision...
Effect of Spinal Manipulative Therapy on the Singing Voice.
Fachinatto, Ana Paula A; Duprat, André de Campos; Silva, Marta Andrada E; Bracher, Eduardo Sawaya Botelho; Benedicto, Camila de Carvalho; Luz, Victor Botta Colangelo; Nogueira, Maruan Nogueira; Fonseca, Beatriz Suster Gomes
2015-09-01
This study investigated the effect of spinal manipulative therapy (SMT) on the singing voice of male individuals. Randomized, controlled, case-crossover trial. Twenty-nine subjects were selected among male members of the Heralds of the Gospel. This association was chosen because it is a group of persons with similar singing activities. Participants were randomly assigned to two groups: (A) chiropractic SMT procedure and (B) nontherapeutic transcutaneous electrical nerve stimulation (TENS) procedure. Recordings of the singing voice of each participant were taken immediately before and after the procedures. After a 14-day period, procedures were switched between groups: participants who underwent SMT on the first day were subjected to TENS and vice versa. Recordings were subjected to perceptual audio and acoustic evaluations. The same recording segment of each participant was selected. Perceptual audio evaluation was performed by a specialist panel (SP). Recordings of each participant were randomly presented thus making the SP blind to intervention type and recording session (before/after intervention). Recordings compiled in a randomized order were also subjected to acoustic evaluation. No differences in the quality of the singing on perceptual audio evaluation were observed between TENS and SMT. No differences in the quality of the singing voice of asymptomatic male singers were observed on perceptual audio evaluation or acoustic evaluation after a single spinal manipulative intervention of the thoracic and cervical spine. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Collecting Wipe Samples for VX Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koester, C; Hoppes, W G
2010-02-11
This standard operating procedure (SOP) provides uniform procedures for the collection of wipe samples of VX residues from surfaces. Personnel may use this procedure to collect and handle wipe samples in the field. Various surfaces, including building materials (wood, metal, tile, vinyl, etc.) and equipment, may be sampled based on this procedure. The purpose of such sampling is to determine whether or not the relevant surfaces are contaminated, to determine the extent of their contamination, to evaluate the effectiveness of decontamination procedures, and to determine the amount of contaminant that might be present as a contact hazard.
Model selection with multiple regression on distance matrices leads to incorrect inferences.
Franckowiak, Ryan P; Panasci, Michael; Jarvis, Karl J; Acuña-Rodriguez, Ian S; Landguth, Erin L; Fortin, Marie-Josée; Wagner, Helene H
2017-01-01
In landscape genetics, model selection procedures based on Information Theoretic and Bayesian principles have been used with multiple regression on distance matrices (MRM) to test the relationship between multiple vectors of pairwise genetic, geographic, and environmental distance. Using Monte Carlo simulations, we examined the ability of model selection criteria based on Akaike's information criterion (AIC), its small-sample correction (AICc), and the Bayesian information criterion (BIC) to reliably rank candidate models when applied with MRM while varying the sample size. The results showed a serious problem: all three criteria exhibit a systematic bias toward selecting unnecessarily complex models containing spurious random variables and erroneously suggest a high level of support for the incorrectly ranked best model. These problems effectively increased with increasing sample size. The failure of AIC, AICc, and BIC was likely driven by the inflated sample size and different sum-of-squares partitioned by MRM, and the resulting effect on delta values. Based on these findings, we strongly discourage the continued application of AIC, AICc, and BIC for model selection with MRM.
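The inflation problem can be seen directly by counting observations: with n sampling units, MRM regresses n(n-1)/2 pairwise distances, and information criteria computed from that regression treat the pairs as independent. The sketch below, with invented data and a deliberately spurious predictor, only illustrates that bookkeeping; it is not the simulation design of the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 30                        # number of sampled populations/individuals
coords = rng.random((n, 2))   # geographic coordinates
env = rng.random((n, 1))      # an environmental variable
gen = rng.random((n, 1))      # a stand-in for a genetic trait

# pairwise distance vectors (lower triangle): the response and predictors in MRM
d_gen  = pdist(gen)
d_geo  = pdist(coords)
d_env  = pdist(env)
d_junk = pdist(rng.random((n, 1)))   # a spurious random variable

X = sm.add_constant(np.column_stack([d_geo, d_env, d_junk]))
fit = sm.OLS(d_gen, X).fit()

# AIC is computed from n*(n-1)/2 = 435 pairwise distances as if they were
# independent observations, even though only 30 sampling units exist --
# the sample-size inflation discussed in the abstract above.
print(len(d_gen), fit.aic)
```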
Noronha, Vladimir-Reimar-Augusto-de Souza; Gurgel, Gladson-de Souza; Alves, Luiz-César-Fonseca; Noman-Ferreira, Luiz-Cláudio; Mendonça, Lisette-Lobato; Aguiar, Evandro-Guimarães de; Abdo, Evandro-Neves
2009-08-01
The purpose of this study is to compare the analgesic effect of lysine clonixinate, paracetamol and dipyrone after lower third molar extraction. The sample consisted of 90 individuals with clinical indication for inferior third molar extraction. The mean age of the sample was 22.3 years (SD +/-2.5). The individuals received the medication in unidentified bottles along with the intake instructions. The postoperative pain parameters were measured according to the Visual Analogue Scale (VAS) and the data were evaluated using the Kruskal-Wallis Test and Friedman Test, with the latter used to test different time intervals for each one of the drugs. The final sample consisted of 64 individuals, including 23 males (35.9%) and 41 females (64.1%). The mean age of the entire sample was 22.3 years (+/-2.5). The average length of the procedures was 33.9 minutes (+/-9.8). The distribution of mean values for this variable showed little variance for the different drugs (p=0.07). Lysine clonixinate did not show any substantial impact on postoperative pain control when compared with the other drugs.
Vickers, Andrew J; Young-Afat, Danny A; Ehdaie, Behfar; Kim, Scott Yh
2018-02-01
Informed consent for randomized trials often causes significant and persistent anxiety, distress and confusion to patients. Where an experimental treatment is compared to a standard care control, much of this burden is potentially avoidable in the control group. We propose a "just-in-time" consent in which consent discussions take place in two stages: an initial consent to research from all participants and a later specific consent to randomized treatment only from those assigned to the experimental intervention. All patients are first approached and informed about research procedures, such as questionnaires or tests. They are also informed that they might be randomly selected to receive an experimental treatment and that, if selected, they can learn more about the treatment and decide whether or not to accept it at that time. After randomization, control patients undergo standard clinical consent whereas patients randomized to the experimental procedure undergo a second consent discussion. Analysis would be by intent-to-treat, which protects the trial from selection bias, although not from poor acceptance of experimental treatment. The advantages of just-in-time consent stem from the fact that only patients randomized to the experimental treatment are subject to a discussion of that intervention. We hypothesize that this will reduce much of the patient's burden associated with the consent process, such as decisional anxiety, confusion and information overload. We recommend well-controlled studies to compare just-in-time and traditional consent, with endpoints to include characteristics of participants, distress and anxiety and participants' understanding of research procedures.
Assessing the quality of reproductive health services in Egypt via exit interviews.
Zaky, Hassan H M; Khattab, Hind A S; Galal, Dina
2007-05-01
This study assesses the quality of reproductive health services using client satisfaction exit interviews among three groups of primary health care units run by the Ministry of Health and Population of Egypt. Each group applied a different model of intervention. The Ministry will use the results in assessing its reproductive health component in the health sector reform program, and to benefit from the strengths of other models of intervention. The sample was selected in two stages. First, a stratified random sampling procedure was used to select the health units. Then the sample of female clients in each health unit was selected using the systematic random approach, whereby one in every two women visiting the unit was approached. All women in the sample coming for reproductive health services were included in the analysis. The results showed that reproductive health beneficiaries at the units implementing the new health sector reform program were more satisfied with the quality of services. Still there were various areas where clients showed significant dissatisfaction, such as waiting time, interior furnishings, cleanliness of the units and consultation time. The study showed that the staff of these units did not provide a conducive social environment as other interventions did. A significant proportion of women expressed their intention to go to private physicians owing to their flexible working hours and variety of specializations. Beneficiaries were generally more satisfied with the quality of health services after attending the reformed units than the other types of units, but the generalization did not fully apply. Areas of weakness are identified.
Mantovani, Cínthia de Carvalho; Lima, Marcela Bittar; Oliveira, Carolina Dizioli Rodrigues de; Menck, Rafael de Almeida; Diniz, Edna Maria de Albuquerque; Yonamine, Mauricio
2014-04-15
A method using accelerated solvent extraction (ASE) for the isolation of cocaine/crack biomarkers in meconium samples, followed by solid phase extraction (SPE) and simultaneous quantification by gas chromatography-mass spectrometry (GC-MS), was developed and validated. Initially, meconium samples were submitted to an ASE procedure, which was followed by SPE with Bond Elut Certify I cartridges. The analytes were derivatized with PFP/PFPA and analyzed by GC-MS. The limits of detection (LOD) were between 11 and 17 ng/g for all analytes. The limits of quantification (LOQ) were 30 ng/g for anhydroecgonine methyl ester, and 20 ng/g for cocaine, benzoylecgonine, ecgonine methyl ester and cocaethylene. Linearity ranged from the LOQ to 1500 ng/g for all analytes, with coefficients of determination greater than 0.991, except for m-hydroxybenzoylecgonine, which was only qualitatively detected. Precision and accuracy were evaluated at three concentration levels. For all analytes, inter-assay precision ranged from 3.2 to 18.1%, and intra-assay precision did not exceed 12.7%. The accuracy results were between 84.5 and 114.2% and the average recovery ranged from 17 to 84%. The method was applied to 342 meconium samples randomly collected in the University Hospital-University of São Paulo (HU-USP), Brazil. Cocaine biomarkers were detected in 19 samples, representing an exposure prevalence of 5.6%. Significantly lower birth weight, length and head circumference were found for the exposed newborns compared with the non-exposed group. This is the first report in which ASE was used as a sample preparation technique to extract cocaine biomarkers from a complex biological matrix such as meconium. The advantages of the developed method are the smaller demand for organic solvents and the reduced sample handling, which allow a faster and more accurate procedure, appropriate to confirm fetal exposure to cocaine/crack. Copyright © 2014 Elsevier B.V. All rights reserved.
A random spatial sampling method in a rural developing nation
Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas
2014-01-01
Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...
Orban, Jean-Christophe; Fontaine, Eric; Cassuto, Elisabeth; Baumstarck, Karine; Leone, Marc; Constantin, Jean-Michel; Ichai, Carole
2018-04-17
Renal transplantation represents the treatment of choice for end-stage kidney disease. Delayed graft function (DGF) remains the most frequent complication after this procedure, reaching more than 30%. Its prevention is essential as it worsens the early- and long-term prognosis of transplantation. Numerous pharmacological interventions aiming to prevent ischemia-reperfusion injuries have failed to reduce the rate of DGF. We hypothesize that cyclosporine given as an early preconditioning procedure in donors would be associated with decreased DGF. The Cis-A-rein study is an investigator-initiated, prospective, multicenter, double-blind, randomized, controlled study performed to assess the effects of donor preconditioning with cyclosporine A on kidney graft function in transplanted patients. After randomization, a brain-dead donor will receive 2.5 mg/kg of cyclosporine A or the same volume of 5% glucose solution. The primary objective is to compare the rate of DGF, defined as the need for at least one dialysis session within the 7 days following transplantation, between both groups. The secondary objectives include rate of slow graft function, mild and severe DGF, urine output and serum creatinine during the first week after transplantation, rate of primary graft dysfunction, renal function and mortality at 1 year. The sample size (n = 648) was determined to obtain 80% power to detect a 10% difference in the rate of DGF at day 7 between the two groups (30% of the patients in the placebo group and 20% of the patients in the intervention group). Delayed graft function is a major issue after renal transplantation, compromising long-term prognosis. Cyclosporine A pretreatment in deceased donors could improve the outcome of patients after renal transplantation. ClinicalTrials.gov, ID: NCT02907554. Registered on 20 September 2016.
Procedures for sampling radium-contaminated soils
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fleischhauer, H.L.
Two procedures for sampling the surface layer (0 to 15 centimeters) of radium-contaminated soil are recommended for use in remedial action projects. Both procedures adhere to the philosophy that soil samples should have constant geometry and constant volume in order to ensure uniformity. In the first procedure, a ''cookie cutter'' fashioned from pipe or steel plate is driven to the desired depth by means of a slide hammer, and the sample extracted as a core or plug. The second procedure requires use of a template to outline the sampling area, from which the sample is obtained using a trowel or spoon. Sampling to the desired depth must then be performed incrementally. Selection of one procedure over the other is governed primarily by soil conditions, the cookie cutter being effective in nongravelly soils, and the template procedure appropriate for use in both gravelly and nongravelly soils. In any event, a minimum sample volume of 1000 cubic centimeters is recommended. The step-by-step procedures are accompanied by a description of the minimum requirements for sample documentation. Transport of the soil samples from the field is then addressed in a discussion of the federal regulations for shipping radioactive materials. Interpretation of those regulations, particularly in light of their application to remedial action soil-sampling programs, is provided in the form of guidance and suggested procedures. Due to the complex nature of the regulations, however, there is no guarantee that our interpretations of them are complete or entirely accurate. Preparation of soil samples for radium-226 analysis by means of gamma-ray spectroscopy is described.
Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.
2009-01-01
The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2001 through June 2003. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for six of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, chloride, magnesium, nitrate (ion chromatography), potassium, and sodium. The calcium procedure was biased throughout the analysis period for the high-concentration sample, but was within control limits. The total monomeric aluminum and fluoride procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum, pH, specific conductance, and sulfate procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 16 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for the dissolved organic carbon or specific conductance procedures. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 90 percent of the samples met data-quality objectives for all procedures except total monomeric aluminum (83 percent of samples met objectives), total aluminum (76 percent of samples met objectives), ammonium (73 percent of samples met objectives), dissolved organic carbon (86 percent of samples met objectives), and nitrate (81 percent of samples met objectives). The data-quality objective was not met for the nitrite procedure. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated satisfactory or above data quality over the time period, with most performance ratings for each sample in the good-to-excellent range. The N-sample (nutrient constituents) analysis had one unsatisfactory rating for the ammonium procedure in one study. The T-sample (trace constituents) analysis had one unsatisfactory rating for the magnesium procedure and one marginal rating for the potassium procedure in one study and one unsatisfactory rating for the sodium procedure in another. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 90 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were acid-neutralizing capacity, ammonium, dissolved organic carbon, and sodium. 
Data-quality objectives were not met in 37 percent of samples analyzed for acid-neutralizing capacity, 28 percent of samples analyzed for dissolved organic carbon, and 30 percent of samples analyzed for sodium. Results indicate a positive bias for the ammonium procedure in one study and a negative bias in another. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 90 percent of the samples analyzed for calcium, chloride, magnesium, pH, potassium, and sodium. Data-quality objectives were met by 78 percent of
Bachman, E A; Senapati, S; Sammel, M D; Kalra, S K
2014-06-01
Many women experience pain during hysterosalpingogram (HSG). This prospective, randomized, double-blinded, placebo-controlled study assessed whether the use of benzocaine spray during HSG is associated with reduced pain as compared with placebo. Thirty women presenting for HSG were enrolled and randomized to either benzocaine or saline spray. Treatment groups were similar in age, race, parity, pre-procedure oral analgesic use and history of dysmenorrhoea and/or chronic pelvic pain. Median change in pain score from baseline to procedure was 50.6mm (-7.4 to 98.8mm) in the benzocaine group and 70.4mm (19.8 to 100mm) in the placebo group. There was no difference between groups after adjusting for history of dysmenorrhoea. There was no difference in resolution of pain in benzocaine versus placebo groups at 5 min post procedure--median pain score difference -11.1 (-90.1 to 18.5) versus -37.0 (-100 to 1.2)--or at 30 min post procedure. Satisfaction scores did not differ by treatment and did not correlate with pain score during the procedure (rho=0.005). The use of benzocaine spray does not significantly improve pain relief during HSG nor does it hasten resolution of pain post HSG. Of interest, patient satisfaction was not correlated with pain. Many women experience pain during hysterosalpingogram (HSG), which is a test used to evaluate the uterine cavity and fallopian tube. We conducted a prospective, randomized, double-blinded, placebo-controlled study to assess whether the use of benzocaine spray during HSG is associated with reduced pain as compared with placebo. Thirty women presenting for HSG were enrolled and randomized to either benzocaine or saline spray. Treatment groups were similar in age, race, previous pregnancies, pre-procedure oral analgesic use and history of dysmenorrhoea (painful periods) and/or chronic pelvic pain. There was no difference in pain scores or resolution of pain between the two groups. Satisfaction scores did not differ by treatment group and did not correlate with the pain score during the procedure. We conclude that the use of benzocaine spray does not significantly improve pain relief during HSG nor does it hasten resolution of pain post HSG. Of interest, patient satisfaction was not correlated with pain. Copyright © 2014 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
The efficacy and safety of using cooled radiofrequency in treating chronic sacroiliac joint pain
Sun, Hui-Hui; Zhuang, Su-Yang; Hong, Xin; Xie, Xin-Hui; Zhu, Lei; Wu, Xiao-Tao
2018-01-01
Background: Cooled radiofrequency procedure is a novel minimally invasive surgical technique and has been occasionally utilized in managing chronic sacroiliac joint (SIJ) pain. A meta-analysis was conducted to systematically assess the efficacy and safety of using cooled radiofrequency in treating patients with chronic SIJ pain in terms of pain and disability relief, patients’ satisfaction degree as well as complications. Methods: Studies of using cooled radiofrequency procedure in managing SIJ pain were retrieved from Medline and Web of Science according to inclusion and exclusion criteria. Quality evaluation was conducted using the Cochrane collaboration tool for randomized controlled trials and the MINORS quality assessment for noncomparative trials. Statistics were managed using Review Manager 5.3. Results: In total, 7 studies with 240 eligible patients were enrolled. The overall pooled results demonstrated that pain intensity decreased significantly after the cooled radiofrequency procedure compared with that measured before treatment. The mean difference (MD) was 3.81 [95% confidence intervals (95% CIs): 3.29–4.33, P < .001] and 3.78 (95% CIs: 3.31–4.25, P < .001) as measured by the Numerical Rating Scale (NRS) and Visual Analog Scale (VAS), respectively. Disability was also relieved significantly after treatment compared with that measured before treatment. The MD was 18.2 (95% CIs: 12.22–24.17, P < .001) as measured by the Oswestry Disability Index (ODI). Seventy-two percent of the patients presented positive results as measured by the Global Perceived Effect (GPE). The OR was 0.01 (95% CIs: 0.00–0.05, P < .001). Only mild complications were observed in the 7 studies, including transient hip pain, soreness, and numbness. Conclusion: The cooled radiofrequency procedure can significantly relieve pain and disability with no severe complications, and the majority of patients are satisfied with this technique. Thus, it is safe and effective to use this procedure in managing patients with chronic SIJ pain. More high-quality and large-scale randomized controlled trials (RCTs) are required to validate our findings. Limitations: The sample size of the included studies was small and considerable heterogeneity existed. PMID:29419679
Dhooria, Sahajal; Aggarwal, Ashutosh N; Gupta, Dheeraj; Behera, Digambar; Agarwal, Ritesh
2015-07-01
The use of endoscopic ultrasound with bronchoscope-guided fine-needle aspiration (EUS-B-FNA) has been described in the evaluation of mediastinal lymphadenopathy. Herein, we conduct a meta-analysis to estimate the overall diagnostic yield and safety of EUS-B-FNA combined with endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA), in the diagnosis of mediastinal lymphadenopathy. The PubMed and EmBase databases were searched for studies reporting the outcomes of EUS-B-FNA in diagnosis of mediastinal lymphadenopathy. The study quality was assessed using the QualSyst tool. The yield of EBUS-TBNA alone and the combined procedure (EBUS-TBNA and EUS-B-FNA) were analyzed by calculating the sensitivity, specificity, positive likelihood ratio, negative likelihood ratio, and diagnostic odds ratio for each study, and pooling the study results using a random effects model. Heterogeneity and publication bias were assessed for individual outcomes. The additional diagnostic gain of EUS-B-FNA over EBUS-TBNA was calculated using proportion meta-analysis. Our search yielded 10 studies (1,080 subjects with mediastinal lymphadenopathy). The sensitivity of the combined procedure was significantly higher than EBUS-TBNA alone (91% vs 80%, P = .004), in staging of lung cancer (4 studies, 465 subjects). The additional diagnostic gain of EUS-B-FNA over EBUS-TBNA was 7.6% in the diagnosis of mediastinal adenopathy. No serious complication of EUS-B-FNA procedure was reported. Clinical and statistical heterogeneity was present without any evidence of publication bias. Combining EBUS-TBNA and EUS-B-FNA is an effective and safe method, superior to EBUS-TBNA alone, in the diagnosis of mediastinal lymphadenopathy. Good quality randomized controlled trials are required to confirm the results of this systematic review. Copyright © 2015 by Daedalus Enterprises.
Applying appropriate-use criteria to cardiac revascularisation in India.
Sood, Neeraj; Ugargol, Allen P; Barnes, Kayleigh; Mahajan, Anish
2016-03-30
The high prevalence of coronary heart disease and dramatic growth of cardiac interventions in India motivate an evaluation of the appropriateness of coronary revascularisation procedures in India. Although appropriate-use criteria (AUC) have been used to analyse the appropriateness of cardiovascular care in the USA, they are yet to be applied to care in India. In our study, we apply AUC to cardiac care in Karnataka, India, compare our results to international applications of AUC, and suggest ways to improve the appropriateness of care in India. Data were collected from the Vajpayee Arogyashree Scheme, a government-sponsored health insurance scheme in Karnataka, India. These data were collected as part of the preauthorisation process for cardiac procedures. The final data included a random sample of 600 patients from 28 hospitals in Karnataka, who obtained coronary artery bypass grafting or percutaneous coronary intervention between 1 October 2014 and 31 December 2014. We obtained our primary baseline results using a random imputation simulation to fill in missing data. Our secondary outcome measure was a best case-worst case scenario where missing data were filled to give the lowest or highest number of appropriate cases. Of the cases, 86.7% (CI 83.7% to 89.2%) were deemed appropriate, 3.65% (CI 2.3% to 5.5%) were inappropriate and 9.63% (CI 7.4% to 12.3%) were uncertain. The vast majority of cardiac revascularisation procedures performed on beneficiaries of a government-sponsored insurance programme in India were found to be appropriate. These results meet or exceed levels of appropriate use of cardiac care in the USA. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Shabandokht-Zarmi, Hosniyeh; Bagheri-Nesami, Masoumeh; Shorofi, Seyed Afshin; Mousavinasab, Seyed Nouraddin
2017-11-01
This study was intended to examine the effect of selective soothing music on fistula puncture-related pain in hemodialysis patients. This is a randomized clinical trial in which 114 participants were selected from two hemodialysis units by means of a non-random, convenience sampling method. The participants were then allocated in three groups of music (N = 38), headphone (N = 38), and control (N = 38). The fistula puncture-related pain was measured 1 min after venipuncture procedure in all three groups. The music group listened to their self-selected and preferred music 6 min before needle insertion into a fistula until the end of procedure. The headphone group wore a headphone alone without listening to music 6 min before needle insertion into a fistula until the end of procedure. The control group did not receive any intervention from the research team during needle insertion into a fistula. The pain intensity was measured immediately after the intervention in all three groups. This study showed a significant difference between the music and control groups, and the music and headphone groups in terms of the mean pain score after the intervention. However, the analysis did not indicate any significant difference between the headphone and control groups with regard to the mean pain score after the intervention. It is concluded that music can be used effectively for pain related to needle insertion into a fistula in hemodialysis patients. Future research should investigate the comparative effects of pharmacological and non-pharmacological interventions on fistula puncture-related pain. Copyright © 2017 Elsevier Ltd. All rights reserved.
Pourmoradian, Samira; Mahdavi, Reza; Mobasseri, Majid; Faramarzi, Elnaz; Mobasseri, Mehrnoosh
2014-05-01
It has been proposed that royal jelly has antioxidant properties and may improve oxidative stress status and glycemic control. Therefore, we investigated the effects of royal jelly supplementation in diabetic females. In this pilot, parallel-design randomized clinical trial, 50 female volunteers with type 2 diabetes were randomly allocated to the supplemented (n=25) and placebo (n=25) groups, based on a random block procedure produced by Random Allocation Software, and given a daily dose of 1,000 mg royal jelly soft gel or placebo, respectively, for 8 weeks. Before and after the intervention, glycemic control indices, antioxidant and oxidative stress factors were measured. After royal jelly supplementation, the mean fasting blood glucose decreased remarkably (163.05±42.51 mg/dL vs. 149.68±42.7 mg/dL). Royal jelly supplementation resulted in a significant reduction in the mean serum glycosylated hemoglobin levels (8.67%±2.24% vs. 7.05%±1.45%, P=0.001) and a significant elevation in the mean insulin concentration (70.28±29.16 pmol/L vs. 86.46±27.50 pmol/L, P=0.01). Supplementation significantly increased erythrocyte superoxide dismutase and glutathione peroxidase activities and decreased malondialdehyde levels (P<0.05). At the end of the study, the mean total antioxidant capacity increased nonsignificantly in both groups. On the basis of our findings, it seems that royal jelly supplementation may be beneficial in controlling diabetes outcomes. Further studies with larger sample sizes are warranted.
Modeling stimulus variation in three common implicit attitude tasks.
Wolsiefer, Katie; Westfall, Jacob; Judd, Charles M
2017-08-01
We explored the consequences of ignoring the sampling variation due to stimuli in the domain of implicit attitudes. A large literature in psycholinguistics has examined the statistical treatment of random stimulus materials, but the recommendations from this literature have not been applied to the social psychological literature on implicit attitudes. This is partly because of inherent complications in applying crossed random-effect models to some of the most common implicit attitude tasks, and partly because no work to date has demonstrated that random stimulus variation is in fact consequential in implicit attitude measurement. We addressed this problem by laying out statistically appropriate and practically feasible crossed random-effect models for three of the most commonly used implicit attitude measures-the Implicit Association Test, affect misattribution procedure, and evaluative priming task-and then applying these models to large datasets (average N = 3,206) that assess participants' implicit attitudes toward race, politics, and self-esteem. We showed that the test statistics from the traditional analyses are substantially (about 60 %) inflated relative to the more-appropriate analyses that incorporate stimulus variation. Because all three tasks used the same stimulus words and faces, we could also meaningfully compare the relative contributions of stimulus variation across the tasks. In an appendix, we give syntax in R, SAS, and SPSS for fitting the recommended crossed random-effects models to data from all three tasks, as well as instructions on how to structure the data file.
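The appendix referenced above provides R, SAS, and SPSS syntax for the recommended crossed random-effects models. As a simpler Python illustration of why stimulus variation matters, the sketch below contrasts a traditional "by-subject" analysis (which ignores the stimulus sample, as criticized above) with a "by-stimulus" analysis on simulated trial-level data; the data, variable names, and effect sizes are all assumptions, and this is not the paper's analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_sub, n_stim = 40, 16

# simulated trial-level scores with subject and stimulus random effects
sub_eff  = rng.normal(0, 0.30, n_sub)        # between-subject variation
stim_eff = rng.normal(0, 0.25, n_stim)       # between-stimulus variation
effect   = 0.10                               # true condition effect
cond = np.tile([0, 1], n_stim // 2)           # half the stimuli in each condition

scores = (effect * cond[None, :]
          + sub_eff[:, None] + stim_eff[None, :]
          + rng.normal(0, 0.20, (n_sub, n_stim)))   # subjects x stimuli

# "by-subject" analysis: average over stimuli within condition, ignore stimulus sampling
by_sub = [scores[:, cond == c].mean(axis=1) for c in (0, 1)]
t_subject = stats.ttest_rel(by_sub[1], by_sub[0])

# "by-stimulus" analysis: average over subjects, treat stimuli as the sampling units
by_stim = scores.mean(axis=0)
t_stimulus = stats.ttest_ind(by_stim[cond == 1], by_stim[cond == 0])

# the by-subject statistic is typically inflated relative to the by-stimulus one,
# which is the kind of inflation the crossed random-effects models address
print(t_subject.statistic, t_stimulus.statistic)
```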
Psychiatric comorbidities in a community sample of women with fibromyalgia.
Raphael, Karen G; Janal, Malvin N; Nayak, Sangeetha; Schwartz, Joseph E; Gallagher, Rollin M
2006-09-01
Prior studies of careseeking fibromyalgia (FM) patients often report that they have an elevated risk of psychiatric disorders, but biased sampling may distort true risk. The current investigation utilizes state-of-the-art diagnostic procedures for both FM and psychiatric disorders to estimate prevalence rates of FM and the comorbidity of FM and specific psychiatric disorders in a diverse community sample of women. Participants were screened by telephone for FM and MDD, by randomly selecting telephone numbers from a list of households with women in the NY/NJ metropolitan area. Eligible women were invited to complete physical examinations for FM and clinician-administered psychiatric interviews. Data were weighted to adjust for sampling procedures and population demographics. The estimated overall prevalence of FM among women in the NY/NJ metropolitan area was 3.7% (95% CI=3.2, 4.4), with higher rates among racial minorities. Although risk of current MDD was nearly 3-fold higher in community women with than without FM, the groups had similar risk of lifetime MDD. Risk of lifetime anxiety disorders, particularly obsessive compulsive disorder and post-traumatic stress disorder, was approximately 5-fold higher among women with FM. Overall, this study found a community prevalence for FM among women that replicates prior North American studies, and revealed that FM may be even more prevalent among racial minority women. These community-based data also indicate that the relationship between MDD and FM may be more complicated than previously thought, and call for an increased focus on anxiety disorders in FM.
[Comparison of hot versus cold biopsy forceps in the diagnosis of endobronchial lesions].
Firoozbakhsh, Shahram; Seifirad, Soroush; Safavi, Enayat; Dinparast, Reza; Taslimi, Shervin; Derakhshandeilami, Gholamreza
2011-11-01
Traditionally, cold biopsy forceps were used for endobronchial biopsy, and electrocautery (hot) bronchoscopy biopsy forceps have recently been introduced. It is hypothesized that hot biopsy forceps may decrease procedure-related bleeding and also may decrease the quality of obtained samples. Patients with different indications for endobronchial biopsy during fiberoptic bronchoscopy underwent three hot and three cold biopsies in a random fashion. All biopsies were obtained with a single biopsy forceps with and without the application of an electrocoagulation current, set on soft coagulation mode (40W). A four-point scale was used for quantification of bleeding. A single pathologist blinded to the patients' history was requested to review all samples. A three-point scale was used to assess electrocoagulation damage. A total of 240 biopsies were obtained from 40 patients. Frequency of positive concordance between the two methods was 85%. The degree of electrocoagulation damage of the samples was as follows: grade 1=52.5%, grade 2=32.5%, and grade 3=15%. The average bleeding score following hot biopsy was significantly lower compared to the cold biopsy (P=.006). The concordance between diagnostic yield of hot and cold biopsies was 85%. There was no significant difference between the diagnostic yields of the two biopsy methods (P=.687). Hot biopsy forceps significantly decreased procedure-related bleeding. The quality of samples was not impaired significantly. Given the low prevalence of bleeding following endobronchial biopsy, routine use of hot bronchoscopy forceps is not reasonable. However, familiarity of bronchoscopists with this method may improve bronchoscopy safety. Copyright © 2011 SEPAR. Published by Elsevier España. All rights reserved.
Reduction of display artifacts by random sampling
NASA Technical Reports Server (NTRS)
Ahumada, A. J., Jr.; Nagel, D. C.; Watson, A. B.; Yellott, J. I., Jr.
1983-01-01
The application of random-sampling techniques to remove visible artifacts (such as flicker, moire patterns, and paradoxical motion) introduced in TV-type displays by discrete sequential scanning is discussed and demonstrated. Sequential-scanning artifacts are described; the window of visibility defined in spatiotemporal frequency space by Watson and Ahumada (1982 and 1983) and Watson et al. (1983) is explained; the basic principles of random sampling are reviewed and illustrated by the case of the human retina; and it is proposed that the sampling artifacts can be replaced by random noise, which can then be shifted to frequency-space regions outside the window of visibility. Vertical sequential, single-random-sequence, and continuously renewed random-sequence plotting displays generating 128 points at update rates up to 130 Hz are applied to images of stationary and moving lines, and best results are obtained with the single random sequence for the stationary lines and with the renewed random sequence for the moving lines.
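As an informal sketch of the sampling schemes compared above, the code below builds a fixed sequential scan order, a single fixed random order, and a continuously renewed random order for a 128-point plotting display; the point count follows the abstract, while the rest is an assumed toy setup (draw_points is a hypothetical display call).

    import numpy as np

    rng = np.random.default_rng(0)
    n_points = 128                                  # display points, as in the abstract

    sequential    = np.arange(n_points)             # discrete sequential scanning
    single_random = rng.permutation(n_points)       # one fixed random plotting order

    def renewed_random_order():
        """A freshly drawn random plotting order for every refresh frame."""
        return rng.permutation(n_points)

    for frame in range(3):                          # a few refresh frames
        order = renewed_random_order()              # or `sequential` / `single_random`
        # draw_points(points[order])                # hypothetical display call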
Schneider, Emily N; Riley, Regan; Espey, Eve; Mishra, Shiraz I; Singh, Rameet H
2017-03-01
To evaluate whether inhaled nitrous oxide with oxygen (N2O/O2) is associated with less pain than oral sedation for pain management during in-office hysteroscopic sterilization. This double-blinded randomized controlled trial enrolled women undergoing in-office hysteroscopic sterilization. All participants received pre-procedure intramuscular ketorolac and a standardized paracervical block. The intervention group also received N2O/O2 via a nasal mask, titrated by a nurse to a maximum 70%:30% mixture during the procedure, plus placebo pills pre-procedure; the active control group received inhaled O2 during the procedure plus 5/325 mg hydrocodone/acetaminophen and 1 mg lorazepam pre-procedure. The primary outcome was maximum procedure pain on a 100 mm Visual Analog Scale (VAS, with anchors at 0=no pain and 100=worst imaginable pain) assessed 3-5 min post-procedure. Thirty women per treatment arm were required to detect a clinically significant pain difference of 20 mm. Seventy-two women, 36 per study arm, were randomized. Mean age of participants was 34.1±5.7 years and mean BMI was 30.1±6.6 kg/m2. Mean maximum procedure pain scores were 22.8±27.6 mm and 54.5±32.7 mm for the intervention and control groups, respectively (p<.001). Most study participants (97%) stated that N2O/O2 should be offered for gynecologic office procedures, and 86% would pay for it if it were not a covered benefit. N2O/O2 decreased pain with in-office hysteroscopic sterilization compared to oral sedation and is an effective pain management option for this procedure. Given its safety and favorable side effect profile, N2O/O2 adds a safe, easily administered option to currently available pain management strategies for in-office hysteroscopic sterilization. Copyright © 2016 Elsevier Inc. All rights reserved.
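The stated thirty women per arm is consistent with a standard two-sample normal-approximation sample-size calculation; the sketch below assumes a common standard deviation of about 27 mm, which is an illustrative guess rather than a figure reported in the abstract.

    from scipy.stats import norm

    # Two-sample sample-size sketch: detect a 20 mm VAS difference with 80%
    # power at a two-sided alpha of 0.05.  The 27 mm SD is an assumption made
    # for illustration, not a value taken from the study.
    delta, sd    = 20.0, 27.0
    alpha, power = 0.05, 0.80
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_per_arm = 2 * (z * sd / delta) ** 2
    print(round(n_per_arm))                          # roughly 29-30 per arm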
2013-01-01
Background Depression in primary care is common, yet this costly and disabling condition remains underdiagnosed and undertreated. Persisting gaps in the primary care of depression are due in part to patients' reluctance to bring depressive symptoms to the attention of their primary care clinician and, when depression is diagnosed, to accept initial treatment for the condition. Both targeted and tailored communication strategies offer promise for fomenting discussion and reducing barriers to appropriate initial treatment of depression. Methods/design The Activating Messages to Enhance Primary Care Practice (AMEP2) Study is a stratified randomized controlled trial comparing an attention control with two computerized multimedia patient interventions: one targeted (to patient gender and income level) and one tailored (to level of depressive symptoms, visit agenda, treatment preferences, depression causal attributions, communication self-efficacy, and stigma). AMEP2 consists of two linked sub-studies, one focusing on patients with significant depressive symptoms (Patient Health Questionnaire-9 [PHQ-9] scores ≥ 5), the other on patients with few or no depressive symptoms (PHQ-9 < 5). The first sub-study examined the effectiveness of the interventions; key outcomes included delivery of components of initial depression care (antidepressant prescription or mental health referral). The second sub-study tracked potential hazards (clinical distraction and overtreatment). A telephone interview screening procedure assessed patients for eligibility and oversampled patients with significant depressive symptoms. Sampled, consenting patients used computers to answer survey questions, be randomized, and view their assigned interventions just before scheduled primary care office visits. Patient surveys were also collected immediately post-visit and 12 weeks later. Physicians completed brief reporting forms after each patient's index visit. Additional data were obtained from medical record abstraction and visit audio recordings. Of 6,191 patients assessed, 867 were randomized and included in the analysis, with 559 in the first sub-study and 308 in the second. Discussion Based on formative research, we developed two novel multimedia programs for encouraging patients to discuss depressive symptoms with their primary care clinicians. Our computer-based enrollment and randomization procedures ensured that randomization was fully concealed and that data missingness was minimized. Analyses will focus on the interventions' potential benefits among depressed persons and the potential hazards among the non-depressed. Trial registration ClinicalTrials.gov Identifier: NCT01144104 PMID:23594572
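The concealed, computer-based randomization described above can be sketched as stratified block randomization; the stratum labels, block size, and seed below are hypothetical and do not reproduce the trial's actual allocation scheme.

    import random

    # Hypothetical sketch of stratified block randomization with concealed
    # allocation: within each stratum, arms are assigned from shuffled blocks,
    # so the next assignment cannot be predicted before enrollment.
    ARMS  = ["targeted", "tailored", "attention_control"]
    BLOCK = ARMS * 2                                 # block size 6, balanced across arms

    def make_allocator(seed):
        rng = random.Random(seed)
        queues = {}
        def assign(stratum):
            q = queues.setdefault(stratum, [])
            if not q:                                # refill with a freshly shuffled block
                q.extend(rng.sample(BLOCK, len(BLOCK)))
            return q.pop()
        return assign

    assign = make_allocator(seed=2013)
    print(assign("PHQ9>=5"), assign("PHQ9<5"))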
78 FR 43002 - Proposed Collection; Comment Request for Revenue Procedure 2004-29
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-18
... comments concerning statistical sampling in Sec. 274 Context. DATES: Written comments should be received on... INFORMATION: Title: Statistical Sampling in Sec. 274 Context. OMB Number: 1545-1847. Revenue Procedure Number: Revenue Procedure 2004-29. Abstract: Revenue Procedure 2004-29 prescribes the statistical sampling...
NASA Technical Reports Server (NTRS)
Hadass, Z.
1974-01-01
The design procedure for feedback controllers is described, and considerations for selecting the design parameters are given. The frequency-domain properties of single-input single-output systems using state feedback controllers are analyzed, and desirable phase and gain margin properties are demonstrated. Special consideration is given to the design of controllers for tracking systems, especially those designed to track polynomial commands. As an example, a controller was designed for a tracking telescope with a polynomial tracking requirement and some special features, such as actuator saturation and multiple measurements, one of which is sampled. The resulting system has a tracking performance that compares favorably with that of a much more complicated digital aided tracker. Parameter sensitivity reduction is treated by considering the variable parameters as random variables. A performance index is defined as a weighted sum of the state and control covariances, which result from both the random system disturbances and the parameter uncertainties, and is minimized numerically by adjusting a set of free parameters.
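One hedged way to write out the performance index described above, using symbols assumed here rather than the report's own notation: with state covariance \Sigma_x and control covariance \Sigma_u collecting contributions from both the random disturbances and the parameter uncertainties, the index minimized over the free controller parameters \theta might take the form

    J(\theta) = \mathrm{tr}\!\left(Q\,\Sigma_x(\theta)\right) + \mathrm{tr}\!\left(R\,\Sigma_u(\theta)\right),
    \qquad \Sigma_x(\theta) = \Sigma_x^{\mathrm{dist}}(\theta) + \Sigma_x^{\mathrm{param}}(\theta),

with weighting matrices Q and R, and an analogous decomposition for \Sigma_u.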
Peltzer, Karl; Simbayi, Leickness; Banyini, Mercy; Kekana, Queen
2011-01-01
The aim of this study was to test a 180-minute group HIV risk-reduction counseling intervention with men undergoing traditional circumcision in South Africa, designed to reduce behavioral disinhibition (a false sense of security) resulting from the procedure. A cluster randomized controlled trial design was employed with a sample of 160 men, 80 in the experimental group and 80 in the control group. Comparisons between baseline and 3-month follow-up assessments on key behavioral outcomes were completed. We found that behavioral intentions, risk-reduction skills, and male role norms did not change in the experimental condition compared to the control condition. However, HIV-related stigma beliefs were significantly reduced in both conditions over time. These findings show that a single small-group HIV risk-reduction intervention did not reduce sexual risk behaviors in recently traditionally circumcised men at high risk for behavioral disinhibition. Copyright © 2011 Association of Nurses in AIDS Care. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yashchuk, Valeriy V.; Anderson, Erik H.; Barber, Samuel K.
2011-03-14
A modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays [Proc. SPIE 7077-7 (2007), Opt. Eng. 47, 073602 (2008)] has been proven to be an effective MTF calibration method for a number of interferometric microscopes and a scatterometer [Nucl. Instr. and Meth. A616, 172 (2010)]. Here we report on a further expansion of the application range of the method. We describe the MTF calibration of a 6-inch phase-shifting Fizeau interferometer. Beyond providing a direct measurement of the interferometer's MTF, tests with a BPR array surface have revealed an asymmetry in the instrument's data processing algorithm that fundamentally limits its bandwidth. Moreover, the tests have illustrated the effects of the instrument's detrending and filtering procedures on power spectral density measurements. The details of the development of a BPR test sample suitable for calibration of scanning and transmission electron microscopes are also presented. Such a test sample is realized as a multilayer structure with the layer thicknesses of two materials corresponding to a BPR sequence. The investigations confirm the universal character of the method, which makes it applicable to a large variety of metrology instrumentation with spatial wavelength bandwidths from a few nanometers to hundreds of millimeters.
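The appeal of binary pseudo-random test patterns is that an ideal BPR sequence has a nearly flat (white) power spectrum, so any roll-off measured through an instrument traces its MTF directly. The sketch below generates a short maximum-length sequence with a linear feedback shift register and inspects its spectrum; the register length and feedback taps are illustrative choices, not those used for the gratings or multilayer samples described above.

    import numpy as np

    def m_sequence(n_stages=7, taps=(7, 6)):
        """Maximum-length binary pseudo-random sequence from a Fibonacci LFSR
        (feedback polynomial x^7 + x^6 + 1, an illustrative primitive choice)."""
        state = [1] * n_stages                       # any nonzero seed
        out = []
        for _ in range(2 ** n_stages - 1):           # period 2^n - 1 = 127
            out.append(state[-1])
            fb = 0
            for t in taps:
                fb ^= state[t - 1]
            state = [fb] + state[:-1]
        return np.array(out)

    seq = 2 * m_sequence() - 1                       # map {0, 1} -> {-1, +1}
    psd = np.abs(np.fft.rfft(seq)) ** 2              # nearly flat away from DC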
van Berkum, Susanne; Erné, Ben H.
2013-01-01
The magnetic remanence of silica microspheres with a low concentration of embedded cobalt ferrite nanoparticles is studied after demagnetization and remagnetization treatments. When the microspheres are dispersed in a liquid, alternating current (AC) magnetic susceptibility spectra reveal a constant characteristic frequency, corresponding to the rotational diffusion of the microparticles; this depends only on particle size and liquid viscosity, making the particles suitable as a rheological probe and indicating that interactions between the microspheres are weak. On the macroscopic scale, a sample with the dry microparticles is magnetically remanent after treatment in a saturating field, and after a demagnetization treatment, the remanence goes down to zero. The AC susceptibility of a liquid dispersion, however, characterizes the remanence on the scale of the individual microparticles, which does not become zero after demagnetization. The reason is that an individual microparticle contains only a relatively small number of magnetic units, so that even if they can be reoriented magnetically at random, the average vector sum of the nanoparticle dipoles is not negligible on the scale of the microparticle. In contrast, on the macroscopic scale, the demagnetization procedure randomizes the orientations of a macroscopic number of magnetic units, resulting in a remanent magnetization that is negligible compared to the saturation magnetization of the entire sample. PMID:24009021
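The scale argument above, that the vector sum of a small number of randomly reoriented dipoles is not negligible, follows from random-walk scaling: the magnitude of the sum of N random unit dipoles grows roughly as sqrt(N), so the remanence relative to saturation decays as 1/sqrt(N). The Monte Carlo sketch below illustrates this; the particle counts are arbitrary and are not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(1)

    def relative_remanence(n_dipoles, n_trials=200):
        """Mean |vector sum| / N for N randomly oriented unit dipoles in 3D."""
        v = rng.normal(size=(n_trials, n_dipoles, 3))
        v /= np.linalg.norm(v, axis=2, keepdims=True)    # random unit vectors
        return np.linalg.norm(v.sum(axis=1), axis=1).mean() / n_dipoles

    for n in (100, 10_000):
        print(n, relative_remanence(n))                  # decays roughly as 1/sqrt(N)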
Park, Woo Young; Shin, Yang-Sik; Lee, Sang Kil; Kim, So Yeon; Lee, Tai Kyung
2014-01-01
Purpose Endoscopic submucosal dissection (ESD) is a technically difficult and lengthy procedure requiring optimal depth of sedation. The bispectral index (BIS) monitor is a non-invasive tool that objectively evaluates the depth of sedation. The purpose of this prospective randomized controlled trial was to evaluate whether BIS guided sedation with propofol and remifentanil could reduce the number of patients requiring rescue propofol, and thus reduce the incidence of sedation- and/or procedure-related complications. Materials and Methods A total of 180 patients who underwent the ESD procedure for gastric adenoma or early gastric cancer were randomized to two groups. The control group (n=90) was monitored by the Modified Observer's Assessment of Alertness and Sedation scale and the BIS group (n=90) was monitored using BIS. The total doses of propofol and remifentanil, the need for rescue propofol, and the rates of complications were recorded. Results The number of patients who needed rescue propofol during the procedure was significantly higher in the control group than the BIS group (47.8% vs. 30.0%, p=0.014). There were no significant differences in the incidence of sedation- and/or procedure-related complications. Conclusion BIS-guided propofol infusion combined with remifentanil reduced the number of patients requiring rescue propofol in ESD procedures. However, this finding did not lead to clinical benefits and thus BIS monitoring is of limited use during anesthesiologist-directed sedation. PMID:25048506
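The headline comparison above (47.8% versus 30.0% of 90 patients per group needing rescue propofol) can be checked with a standard two-proportion test; the counts below are reconstructed from the reported percentages, which is an assumption about the underlying data.

    from statsmodels.stats.proportion import proportions_ztest

    # Roughly 43 of 90 control patients vs. 27 of 90 BIS-group patients needed
    # rescue propofol (counts reconstructed from the reported percentages).
    stat, pval = proportions_ztest(count=[43, 27], nobs=[90, 90])
    print(round(pval, 3))                            # close to the reported p = 0.014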
10 CFR 707.7 - Random drug testing requirements and identification of testing designated positions.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 4 2012-01-01 2012-01-01 false Random drug testing requirements and identification of... PROGRAMS AT DOE SITES Procedures § 707.7 Random drug testing requirements and identification of testing... evidence of the use of illegal drugs of employees in testing designated positions identified in this...
10 CFR 707.7 - Random drug testing requirements and identification of testing designated positions.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 4 2014-01-01 2014-01-01 false Random drug testing requirements and identification of... PROGRAMS AT DOE SITES Procedures § 707.7 Random drug testing requirements and identification of testing... evidence of the use of illegal drugs of employees in testing designated positions identified in this...
10 CFR 707.7 - Random drug testing requirements and identification of testing designated positions.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 4 2011-01-01 2011-01-01 false Random drug testing requirements and identification of... PROGRAMS AT DOE SITES Procedures § 707.7 Random drug testing requirements and identification of testing... evidence of the use of illegal drugs of employees in testing designated positions identified in this...
10 CFR 707.7 - Random drug testing requirements and identification of testing designated positions.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 4 2013-01-01 2013-01-01 false Random drug testing requirements and identification of... PROGRAMS AT DOE SITES Procedures § 707.7 Random drug testing requirements and identification of testing... evidence of the use of illegal drugs of employees in testing designated positions identified in this...
10 CFR 707.7 - Random drug testing requirements and identification of testing designated positions.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Random drug testing requirements and identification of... PROGRAMS AT DOE SITES Procedures § 707.7 Random drug testing requirements and identification of testing... evidence of the use of illegal drugs of employees in testing designated positions identified in this...
Braun, M; Buchen, B; Sievers, A
1996-06-27
A special fixation device and fixation procedure have been developed to investigate for the first time the ultrastructure of gravity-sensing, unicellular Chara rhizoids grown for 30 h under microgravity (MG) conditions during the IML-2 mission. The fixation unit allowed culture, fixation, and storage of Chara rhizoids in the same chamber without transferring the samples. The procedure was easy and safe to perform and required a minimum of crew time. Rhizoids fixed with glutaraldehyde in space and further processed for electron microscopy on the ground showed that the fixation was of high quality and corresponded to the fixation quality of rhizoids in the ground controls. Thus, the equipment coped with the manifold problems related to the physical effects of MG. The polarity of the rhizoids was maintained in MG. Well-preserved organelles and microtubules showed no obvious difference in ultrastructure or distribution after 30 h of growth in MG compared to ground controls. The statoliths were more randomly distributed, although only up to 50 microns basal to the tip. Thus, changing the gravity conditions does not disturb the cellular organisation of the rhizoids, enabling the tip-growing cells to follow their genetic program of development and growth under MG as well.