Sample records for complex sampling designs

  1. A nonparametric method to generate synthetic populations to adjust for complex sampling design features.

    PubMed

    Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E

    2014-06-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered, unequal-probability-of-selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS), both of which have stratified, clustered, unequal-probability-of-selection sample designs.
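
    As a rough illustration of the kind of resampling these records describe, the sketch below generates one synthetic population from a weighted sample using a simplified weighted Polya-urn (finite population Bayesian bootstrap) scheme. It is not the authors' exact algorithm; the function name and the specific urn-updating rule are illustrative assumptions.

```python
import numpy as np

def weighted_polya_synthetic_population(y, w, rng=None):
    """Draw one synthetic population from a weighted sample via a simplified
    weighted Polya urn (a finite-population Bayesian bootstrap variant).

    y : observed values (length n); w : case weights, with round(sum(w))
    taken as the population size N. Illustrative only; the published method
    uses a specific weighted Polya posterior, and real surveys would need a
    faster, vectorized implementation.
    """
    rng = np.random.default_rng(rng)
    y = np.asarray(y)
    w = np.asarray(w, dtype=float)
    n = len(y)
    N = int(round(w.sum()))
    counts = np.ones(n)                   # observed units start in the population
    mass = np.clip(w - 1.0, 0.0, None)    # remaining "weight" each unit represents
    for _ in range(N - n):                # fill in the N - n unobserved units
        p = mass + (counts - 1.0)         # urn mass grows as a unit is re-selected
        p = p / p.sum()
        i = rng.choice(n, p=p)
        counts[i] += 1.0
    return np.repeat(y, counts.astype(int))
```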

  2. A nonparametric method to generate synthetic populations to adjust for complex sampling design features

    PubMed Central

    Dong, Qi; Elliott, Michael R.; Raghunathan, Trivellore E.

    2017-01-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods to analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered, unequal-probability-of-selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS), both of which have stratified, clustered, unequal-probability-of-selection sample designs. PMID:29200608

  3. Accuracy assessment with complex sampling designs

    Treesearch

    Raymond L. Czaplewski

    2010-01-01

    A reliable accuracy assessment of remotely sensed geospatial data requires a sufficiently large probability sample of expensive reference data. Complex sampling designs reduce cost or increase precision, especially with regional, continental and global projects. The General Restriction (GR) Estimator and the Recursive Restriction (RR) Estimator separate a complex...

  4. Complex sample survey estimation in static state-space

    Treesearch

    Raymond L. Czaplewski

    2010-01-01

    Increased use of remotely sensed data is a key strategy adopted by the Forest Inventory and Analysis Program. However, multiple sensor technologies require complex sampling units and sampling designs. The Recursive Restriction Estimator (RRE) accommodates this complexity. It is a design-consistent Empirical Best Linear Unbiased Prediction for the state-vector, which...

  5. Performance of Random Effects Model Estimators under Complex Sampling Designs

    ERIC Educational Resources Information Center

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    In this article, we consider estimation of parameters of random effects models from samples collected via complex multistage designs. Incorporation of sampling weights is one way to reduce estimation bias due to unequal probabilities of selection. Several weighting methods have been proposed in the literature for estimating the parameters of…

  6. Navigating complex sample analysis using national survey data.

    PubMed

    Saylor, Jennifer; Friedmann, Erika; Lee, Hyeon Joo

    2012-01-01

    The National Center for Health Statistics conducts the National Health and Nutrition Examination Survey and other national surveys with probability-based complex sample designs. Goals of national surveys are to provide valid data for the population of the United States. Analyses of data from population surveys present unique challenges in the research process but are valuable avenues to study the health of the United States population. The aim of this study was to demonstrate the importance of using complex data analysis techniques for data obtained with a complex multistage sampling design and to provide an example of analysis using the SPSS Complex Samples procedure. Challenges and solutions specific to secondary data analysis of national databases are illustrated using the National Health and Nutrition Examination Survey as the exemplar. Oversampling of small or sensitive groups provides necessary estimates of variability within small groups. Use of weights without complex samples accurately estimates population means and frequencies from the sample after accounting for over- or undersampling of specific groups. Weighting alone, however, leads to inappropriate estimates of variability, because they are computed as if the measures came from the entire population rather than from a sample. The SPSS Complex Samples procedure allows inclusion of all sampling design elements: stratification, clusters, and weights. Use of national data sets gives access to extensive, expensive, and well-documented survey data for exploratory questions but limits analysis to the variables included in the data set. The large sample permits examination of multiple predictors and interactive relationships. Merging data files, the availability of data from several survey waves, and complex sampling help provide a representative sample but present unique challenges; sophisticated data analysis techniques make the most of these data.
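
    To make the weights-plus-design-variables point concrete, the sketch below hand-rolls a Taylor-linearized standard error (with-replacement PSU approximation) for a weighted mean using stratum and PSU identifiers. It only illustrates what procedures such as SPSS Complex Samples or the R survey package do; the column names are placeholders.

```python
import numpy as np
import pandas as pd

def svy_mean_and_se(df, y, weight, stratum, psu):
    """Design-based mean and Taylor-linearized SE of a weighted mean,
    using the usual with-replacement PSU approximation. Sketch only."""
    w = df[weight].to_numpy(float)
    yv = df[y].to_numpy(float)
    wsum = w.sum()
    mean = (w * yv).sum() / wsum
    df = df.assign(_z=w * (yv - mean) / wsum)     # linearized values
    var = 0.0
    for _, s in df.groupby(stratum):
        zc = s.groupby(psu)["_z"].sum()           # PSU totals of linearized values
        a = len(zc)                               # number of PSUs in the stratum
        if a > 1:
            var += a / (a - 1) * ((zc - zc.mean()) ** 2).sum()
    return mean, np.sqrt(var)

# e.g. svy_mean_and_se(nhanes, "bmi", "wtmec2yr", "sdmvstra", "sdmvpsu")
# (column names here are hypothetical)
```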

  7. Multiple Imputation in Two-Stage Cluster Samples Using The Weighted Finite Population Bayesian Bootstrap.

    PubMed

    Zhou, Hanzhi; Elliott, Michael R; Raghunathan, Trivellore E

    2016-06-01

    Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in "Delta-V," a key crash severity measure.

  8. Multiple Imputation in Two-Stage Cluster Samples Using The Weighted Finite Population Bayesian Bootstrap

    PubMed Central

    Zhou, Hanzhi; Elliott, Michael R.; Raghunathan, Trivellore E.

    2017-01-01

    Multistage sampling is often employed in survey samples for cost and convenience. However, accounting for clustering features when generating datasets for multiple imputation is a nontrivial task, particularly when, as is often the case, cluster sampling is accompanied by unequal probabilities of selection, necessitating case weights. Thus, multiple imputation often ignores complex sample designs and assumes simple random sampling when generating imputations, even though failing to account for complex sample design features is known to yield biased estimates and confidence intervals that have incorrect nominal coverage. In this article, we extend a recently developed, weighted, finite-population Bayesian bootstrap procedure to generate synthetic populations conditional on complex sample design data that can be treated as simple random samples at the imputation stage, obviating the need to directly model design features for imputation. We develop two forms of this method: one where the probabilities of selection are known at the first and second stages of the design, and the other, more common in public use files, where only the final weight based on the product of the two probabilities is known. We show that this method has advantages in terms of bias, mean square error, and coverage properties over methods where sample designs are ignored, with little loss in efficiency, even when compared with correct fully parametric models. An application is made using the National Automotive Sampling System Crashworthiness Data System, a multistage, unequal probability sample of U.S. passenger vehicle crashes, which suffers from a substantial amount of missing data in “Delta-V,” a key crash severity measure. PMID:29226161

  9. Community Health Centers: Providers, Patients, and Content of Care

    MedlinePlus

    ... Statistics (NCHS). NAMCS uses a multistage probability sample design involving geographic primary sampling units (PSUs), physician practices ... 05 level. To account for the complex sample design during variance estimation, all analyses were performed using ...

  10. Variance Estimation Using Replication Methods in Structural Equation Modeling with Complex Sample Data

    ERIC Educational Resources Information Center

    Stapleton, Laura M.

    2008-01-01

    This article discusses replication sampling variance estimation techniques that are often applied in analyses using data from complex sampling designs: jackknife repeated replication, balanced repeated replication, and bootstrapping. These techniques are used with traditional analyses such as regression, but are currently not used with structural…
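
    A minimal sketch of one of the replication techniques named above, the delete-one-PSU jackknife, applied to a weighted mean. Real public-use files typically ship ready-made replicate weights; the data-frame column names here are assumptions.

```python
import numpy as np
import pandas as pd

def jackknife_se_weighted_mean(df, y, weight, stratum, psu):
    """Delete-one-PSU jackknife (JKn) SE for a weighted mean. Illustrative
    sketch of the replication idea, not production code."""
    def wmean(d, wcol):
        return np.average(d[y], weights=d[wcol])

    theta_full = wmean(df, weight)
    var = 0.0
    for h, s in df.groupby(stratum):
        psus = s[psu].unique()
        a = len(psus)
        if a < 2:
            continue
        for c in psus:
            d = df.copy()
            in_h = d[stratum] == h
            d["_rw"] = d[weight]
            d.loc[in_h & (d[psu] == c), "_rw"] = 0.0            # drop one PSU
            d.loc[in_h & (d[psu] != c), "_rw"] *= a / (a - 1)   # reweight the rest
            var += (a - 1) / a * (wmean(d, "_rw") - theta_full) ** 2
    return np.sqrt(var)
```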

  11. SAS procedures for designing and analyzing sample surveys

    USGS Publications Warehouse

    Stafford, Joshua D.; Reinecke, Kenneth J.; Kaminski, Richard M.

    2003-01-01

    Complex surveys often are necessary to estimate occurrence (or distribution), density, and abundance of plants and animals for purposes of research and conservation. Most scientists are familiar with simple random sampling, where sample units are selected from a population of interest (sampling frame) with equal probability. However, the goal of ecological surveys often is to make inferences about populations over large or complex spatial areas where organisms are not homogeneously distributed or sampling frames are inconvenient or impossible to construct. Candidate sampling strategies for such complex surveys include stratified, multistage, and adaptive sampling (Thompson 1992, Buckland 1994).
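
    As a small worked example of the stratified-design idea mentioned above, the sketch below computes a Neyman (optimal) allocation of a fixed total sample across strata; the numbers in the usage comment are invented.

```python
import numpy as np

def neyman_allocation(n_total, stratum_sizes, stratum_sds):
    """Neyman allocation: n_h proportional to N_h * S_h.
    Illustrative; ignores per-stratum cost differences, and rounding means
    the allocated sizes may not sum exactly to n_total."""
    N = np.asarray(stratum_sizes, float)
    S = np.asarray(stratum_sds, float)
    share = N * S / (N * S).sum()
    return np.maximum(1, np.round(n_total * share)).astype(int)

# e.g. three strata of wetland area with different expected variability:
# print(neyman_allocation(300, [1000, 4000, 500], [2.0, 1.0, 4.0]))
```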

  12. Numerically stable algorithm for combining census and sample estimates with the multivariate composite estimator

    Treesearch

    R. L. Czaplewski

    2009-01-01

    The minimum variance multivariate composite estimator is a relatively simple sequential estimator for complex sampling designs (Czaplewski 2009). Such designs combine a probability sample of expensive field data with multiple censuses and/or samples of relatively inexpensive multi-sensor, multi-resolution remotely sensed data. Unfortunately, the multivariate composite...

  13. Apollo Experiment Report: Lunar-Sample Processing in the Lunar Receiving Laboratory High-Vacuum Complex

    NASA Technical Reports Server (NTRS)

    White, D. R.

    1976-01-01

    A high-vacuum complex composed of an atmospheric decontamination system, sample-processing chambers, storage chambers, and a transfer system was built to process and examine lunar material while maintaining quarantine status. Problems identified, equipment modifications, and procedure changes made for Apollo 11 and 12 sample processing are presented. The sample processing experiences indicate that only a few operating personnel are required to process the sample efficiently, safely, and rapidly in the high-vacuum complex. The high-vacuum complex was designed to handle the many contingencies, both quarantine and scientific, associated with handling an unknown entity such as the lunar sample. Lunar sample handling necessitated a complex system that could not respond rapidly to changing scientific requirements as the characteristics of the lunar sample were better defined. Although the complex successfully handled the processing of Apollo 11 and 12 lunar samples, the scientific requirement for vacuum samples was deleted after the Apollo 12 mission just as the vacuum system was reaching its full potential.

  14. Prevalence of Obesity Among Adults and Youth: United States, 2011-2014

    MedlinePlus

    ... sample is selected through a complex, multistage probability design. In 2011–2012 and 2013–2014, non-Hispanic ... All variance estimates accounted for the complex survey design by using Taylor series linearization. Pregnant females were ...

  15. Randomization-Based Inference about Latent Variables from Complex Samples: The Case of Two-Stage Sampling

    ERIC Educational Resources Information Center

    Li, Tiandong

    2012-01-01

    In large-scale assessments, such as the National Assessment of Educational Progress (NAEP), plausible values based on Multiple Imputations (MI) have been used to estimate population characteristics for latent constructs under complex sample designs. Mislevy (1991) derived a closed-form analytic solution for a fixed-effect model in creating…
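
    The plausible-value machinery referenced above is usually combined with Rubin's multiple-imputation combining rules; a generic sketch of those rules (not code from the dissertation) is shown below.

```python
import numpy as np

def combine_plausible_values(estimates, variances):
    """Rubin's rules for combining estimates computed on M plausible values
    (or multiply imputed data sets): pooled estimate, total variance, and
    degrees of freedom. Design-based variances should supply the
    within-imputation component under complex sampling."""
    q = np.asarray(estimates, float)    # M point estimates
    u = np.asarray(variances, float)    # M sampling variances
    M = len(q)
    qbar = q.mean()
    ubar = u.mean()                     # within-imputation variance
    b = q.var(ddof=1)                   # between-imputation variance
    t = ubar + (1 + 1 / M) * b          # total variance
    df = (M - 1) * (1 + ubar / ((1 + 1 / M) * b)) ** 2
    return qbar, t, df
```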

  16. Accounting for selection bias in association studies with complex survey data.

    PubMed

    Wirth, Kathleen E; Tchetgen Tchetgen, Eric J

    2014-05-01

    Obtaining representative information from hidden and hard-to-reach populations is fundamental to describe the epidemiology of many sexually transmitted diseases, including HIV. Unfortunately, simple random sampling is impractical in these settings, as no registry of names exists from which to sample the population at random. However, complex sampling designs can be used, as members of these populations tend to congregate at known locations, which can be enumerated and sampled at random. For example, female sex workers may be found at brothels and street corners, whereas injection drug users often come together at shooting galleries. Despite the logistical appeal, complex sampling schemes lead to unequal probabilities of selection, and failure to account for this differential selection can result in biased estimates of population averages and relative risks. However, standard techniques to account for selection can lead to substantial losses in efficiency. Consequently, researchers implement a variety of strategies in an effort to balance validity and efficiency. Some researchers fully or partially account for the survey design, whereas others do nothing and treat the sample as a realization of the population of interest. We use directed acyclic graphs to show how certain survey sampling designs, combined with subject-matter considerations unique to individual exposure-outcome associations, can induce selection bias. Finally, we present a novel yet simple maximum likelihood approach for analyzing complex survey data; this approach optimizes statistical efficiency at no cost to validity. We use simulated data to illustrate this method and compare it with other analytic techniques.
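
    A toy illustration of the weighting issue this paper starts from: when venues are sampled with unequal probabilities, the unweighted prevalence is biased and inverse-probability (design) weights recover the population value. All numbers below are invented, and this is only the standard weighted baseline, not the paper's maximum likelihood approach.

```python
import numpy as np

rng = np.random.default_rng(0)

p_select = np.array([0.30, 0.05])        # selection probability by venue type
prev =     np.array([0.40, 0.10])        # true prevalence by venue type
n_pop =    np.array([2_000, 18_000])     # population size by venue type

# build the sample: high-risk venues are over-sampled
venue = np.concatenate([np.full(int(n * p), t)
                        for t, (n, p) in enumerate(zip(n_pop, p_select))])
y = rng.binomial(1, prev[venue])
w = 1.0 / p_select[venue]                # inverse-probability weights

print("unweighted:", y.mean())                       # biased upward
print("weighted  :", np.average(y, weights=w))       # near the population value
print("true      :", np.average(prev, weights=n_pop))
```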

  17. Prospective power calculations for the Four Lab study of a multigenerational reproductive/developmental toxicity rodent bioassay using a complex mixture of disinfection by-products in the low-response region.

    PubMed

    Dingus, Cheryl A; Teuschler, Linda K; Rice, Glenn E; Simmons, Jane Ellen; Narotsky, Michael G

    2011-10-01

    In complex mixture toxicology, there is growing emphasis on testing environmentally representative doses that improve the relevance of results for health risk assessment, but are typically much lower than those used in traditional toxicology studies. Traditional experimental designs with typical sample sizes may have insufficient statistical power to detect effects caused by environmentally relevant doses. Proper study design, with adequate statistical power, is critical to ensuring that experimental results are useful for environmental health risk assessment. Studies with environmentally realistic complex mixtures have practical constraints on sample concentration factor and sample volume as well as the number of animals that can be accommodated. This article describes methodology for calculation of statistical power for non-independent observations for a multigenerational rodent reproductive/developmental bioassay. The use of the methodology is illustrated using the U.S. EPA's Four Lab study in which rodents were exposed to chlorinated water concentrates containing complex mixtures of drinking water disinfection by-products. Possible experimental designs included two single-block designs and a two-block design. Considering the possible study designs and constraints, a design of two blocks of 100 females with a 40:60 ratio of control:treated animals and a significance level of 0.05 yielded maximum prospective power (~90%) to detect pup weight decreases, while providing the most power to detect increased prenatal loss.

  18. Prospective Power Calculations for the Four Lab Study of A Multigenerational Reproductive/Developmental Toxicity Rodent Bioassay Using A Complex Mixture of Disinfection By-Products in the Low-Response Region

    PubMed Central

    Dingus, Cheryl A.; Teuschler, Linda K.; Rice, Glenn E.; Simmons, Jane Ellen; Narotsky, Michael G.

    2011-01-01

    In complex mixture toxicology, there is growing emphasis on testing environmentally representative doses that improve the relevance of results for health risk assessment, but are typically much lower than those used in traditional toxicology studies. Traditional experimental designs with typical sample sizes may have insufficient statistical power to detect effects caused by environmentally relevant doses. Proper study design, with adequate statistical power, is critical to ensuring that experimental results are useful for environmental health risk assessment. Studies with environmentally realistic complex mixtures have practical constraints on sample concentration factor and sample volume as well as the number of animals that can be accommodated. This article describes methodology for calculation of statistical power for non-independent observations for a multigenerational rodent reproductive/developmental bioassay. The use of the methodology is illustrated using the U.S. EPA’s Four Lab study in which rodents were exposed to chlorinated water concentrates containing complex mixtures of drinking water disinfection by-products. Possible experimental designs included two single-block designs and a two-block design. Considering the possible study designs and constraints, a design of two blocks of 100 females with a 40:60 ratio of control:treated animals and a significance level of 0.05 yielded maximum prospective power (~90%) to detect pup weight decreases, while providing the most power to detect increased prenatal loss. PMID:22073030

  19. Rapid Sampling of Hydrogen Bond Networks for Computational Protein Design.

    PubMed

    Maguire, Jack B; Boyken, Scott E; Baker, David; Kuhlman, Brian

    2018-05-08

    Hydrogen bond networks play a critical role in determining the stability and specificity of biomolecular complexes, and the ability to design such networks is important for engineering novel structures, interactions, and enzymes. One key feature of hydrogen bond networks that makes them difficult to rationally engineer is that they are highly cooperative and are not energetically favorable until the hydrogen bonding potential has been satisfied for all buried polar groups in the network. Existing computational methods for protein design are ill-equipped for creating these highly cooperative networks because they rely on energy functions and sampling strategies that are focused on pairwise interactions. To enable the design of complex hydrogen bond networks, we have developed a new sampling protocol in the molecular modeling program Rosetta that explicitly searches for sets of amino acid mutations that can form self-contained hydrogen bond networks. For a given set of designable residues, the protocol often identifies many alternative sets of mutations/networks, and we show that it can readily be applied to large sets of residues at protein-protein interfaces or in the interior of proteins. The protocol builds on a recently developed method in Rosetta for designing hydrogen bond networks that has been experimentally validated for small symmetric systems but was not extensible to many larger protein structures and complexes. The sampling protocol we describe here not only recapitulates previously validated designs with performance improvements but also yields viable hydrogen bond networks for cases where the previous method fails, such as the design of large, asymmetric interfaces relevant to engineering protein-based therapeutics.

  20. Simulation methods to estimate design power: an overview for applied research.

    PubMed

    Arnold, Benjamin F; Hogan, Daniel R; Colford, John M; Hubbard, Alan E

    2011-06-20

    Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research.
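
    A compact version of the simulation recipe the authors describe (simulate under the alternative, analyze as planned, count rejections), here for a two-arm cluster-randomized design analyzed on cluster means. This Python sketch parallels, but is not taken from, the R/Stata code the paper provides; the default parameter values are arbitrary.

```python
import numpy as np
from scipy import stats

def simulated_power(n_clusters=20, m=30, icc=0.05, effect=0.3,
                    alpha=0.05, n_sims=2000, seed=1):
    """Monte Carlo power for a cluster-randomized trial analyzed with a
    t-test on cluster means; total individual-level variance is 1."""
    rng = np.random.default_rng(seed)
    sig_b, sig_w = np.sqrt(icc), np.sqrt(1 - icc)
    arm = np.arange(n_clusters) % 2                  # alternate treatment assignment
    rejections = 0
    for _ in range(n_sims):
        cluster_means = (effect * arm
                         + rng.normal(0, sig_b, n_clusters)
                         + rng.normal(0, sig_w / np.sqrt(m), n_clusters))
        _, p = stats.ttest_ind(cluster_means[arm == 1], cluster_means[arm == 0])
        rejections += p < alpha
    return rejections / n_sims

# print(simulated_power())
```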

  1. Simulation methods to estimate design power: an overview for applied research

    PubMed Central

    2011-01-01

    Background Estimating the required sample size and statistical power for a study is an integral part of study design. For standard designs, power equations provide an efficient solution to the problem, but they are unavailable for many complex study designs that arise in practice. For such complex study designs, computer simulation is a useful alternative for estimating study power. Although this approach is well known among statisticians, in our experience many epidemiologists and social scientists are unfamiliar with the technique. This article aims to address this knowledge gap. Methods We review an approach to estimate study power for individual- or cluster-randomized designs using computer simulation. This flexible approach arises naturally from the model used to derive conventional power equations, but extends those methods to accommodate arbitrarily complex designs. The method is universally applicable to a broad range of designs and outcomes, and we present the material in a way that is approachable for quantitative, applied researchers. We illustrate the method using two examples (one simple, one complex) based on sanitation and nutritional interventions to improve child growth. Results We first show how simulation reproduces conventional power estimates for simple randomized designs over a broad range of sample scenarios to familiarize the reader with the approach. We then demonstrate how to extend the simulation approach to more complex designs. Finally, we discuss extensions to the examples in the article, and provide computer code to efficiently run the example simulations in both R and Stata. Conclusions Simulation methods offer a flexible option to estimate statistical power for standard and non-traditional study designs and parameters of interest. The approach we have described is universally applicable for evaluating study designs used in epidemiologic and social science research. PMID:21689447

  2. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    NASA Astrophysics Data System (ADS)

    Xie, Yanmin

    2011-08-01

    Sheet metal forming process design is not a trivial task because of the complex issues that must be taken into account (conflicting design goals, forming of complex shapes, and so on), and optimization methods have been widely applied to it. Proper design methods that reduce time and cost therefore have to be developed, based largely on computer-aided procedures. At the same time, variation during manufacturing can significantly influence final product quality, rendering nominally optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software (LS_DYNA) is used to simulate the complex sheet metal stamping process, and a Kriging metamodel is adopted to map the relation between input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion provides the direction in which additional training samples should be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method, and the results indicate that the method is feasible for multi-response robust design.
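
    For readers unfamiliar with Kriging metamodels, the sketch below fits a Gaussian-process (Kriging) surrogate to a cheap stand-in for the expensive stamping simulation using scikit-learn. The toy response function and sample sizes are assumptions, and the paper's adaptive importance sampling step is not reproduced.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulation(x):
    # stand-in for an LS-DYNA-style run mapping process parameters to quality
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
X_train = rng.uniform(-1, 1, size=(25, 2))        # small design of experiments
y_train = expensive_simulation(X_train)

kriging = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                                   normalize_y=True)
kriging.fit(X_train, y_train)

X_new = rng.uniform(-1, 1, size=(5, 2))
mean, sd = kriging.predict(X_new, return_std=True)  # surrogate prediction + uncertainty
```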

  3. Using a Complex and Nonlinear Pedagogical Approach to Design Practical Primary Physical Education Lessons

    ERIC Educational Resources Information Center

    Atencio, Matthew; Chow, Jia Yi; Tan, Wee Keat Clara; Lee, Chang Yi Miriam

    2014-01-01

    This paper describes several practical activities that reveal how complex and nonlinear pedagogies might underpin primary physical education and school sport lessons. These sample activities, involving track and field, tennis and netball components, are designed to incorporate states of stability and instability through the modification of task…

  4. A Simple Explanation of Complexation

    ERIC Educational Resources Information Center

    Elliott, J. Richard

    2010-01-01

    The topics of solution thermodynamics, activity coefficients, and complex formation are introduced through computational exercises and sample applications. The presentation is designed to be accessible to freshmen in a chemical engineering computations course. The MOSCED model is simplified to explain complex formation in terms of hydrogen…

  5. Grouping methods for estimating the prevalences of rare traits from complex survey data that preserve confidentiality of respondents.

    PubMed

    Hyun, Noorie; Gastwirth, Joseph L; Graubard, Barry I

    2018-03-26

    Originally, 2-stage group testing was developed for efficiently screening individuals for a disease. In response to the HIV/AIDS epidemic, 1-stage group testing was adopted for estimating prevalences of a single or multiple traits from testing groups of size q, so that individuals did not have to be tested individually. This paper extends the methodology of 1-stage group testing to surveys with sample-weighted, complex multistage-cluster designs. Sample-weighted generalized estimating equations are used to estimate the prevalences of categorical traits while accounting for the error rates inherent in the tests. Two difficulties arise when using group testing in complex samples: (1) How does one weight the results of the test on each group, given that the sample weights will differ among observations in the same group? Furthermore, if the sample weights are related to positivity of the diagnostic test, then group-level weighting is needed to reduce bias in the prevalence estimation. (2) How does one form groups that will allow accurate estimation of the standard errors of prevalence estimates under multistage-cluster sampling, allowing for intracluster correlation of the test results? We study 5 different grouping methods to address the weighting and cluster sampling aspects of complex sample designs. Finite sample properties of the estimators of prevalences, variances, and confidence interval coverage for these grouping methods are studied using simulations. National Health and Nutrition Examination Survey data are used to illustrate the methods. Copyright © 2018 John Wiley & Sons, Ltd.
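
    A stripped-down piece of the group-testing logic described above: estimating individual prevalence from the proportion of positive groups when group size, sensitivity, and specificity are known. This unweighted sketch ignores the survey weights and clustering that are the paper's actual contribution.

```python
import numpy as np

def group_testing_prevalence(n_pos_groups, n_groups, q, se=1.0, sp=1.0):
    """Estimate individual prevalence p from 1-stage group testing with equal
    group size q, by inverting
        P(group +) = se * (1 - (1 - p)**q) + (1 - sp) * (1 - p)**q.
    Illustrative only; assumes simple random grouping."""
    theta = n_pos_groups / n_groups                   # observed group positivity
    neg_q = (se - theta) / (se + sp - 1.0)            # estimated (1 - p)**q
    neg_q = np.clip(neg_q, 1e-12, 1.0)
    return 1.0 - neg_q ** (1.0 / q)

# e.g. 14 positive groups out of 100 groups of size 5, imperfect test:
# print(group_testing_prevalence(14, 100, q=5, se=0.95, sp=0.98))
```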

  6. Model-based inference for small area estimation with sampling weights

    PubMed Central

    Vandendijck, Y.; Faes, C.; Kirby, R.S.; Lawson, A.; Hens, N.

    2017-01-01

    Obtaining reliable estimates about health outcomes for areas or domains where only few to no samples are available is the goal of small area estimation (SAE). Often, we rely on health surveys to obtain information about health outcomes. Such surveys are often characterised by a complex design, stratification, and unequal sampling weights as common features. Hierarchical Bayesian models are well recognised in SAE as a spatial smoothing method, but often ignore the sampling weights that reflect the complex sampling design. In this paper, we focus on data obtained from a health survey where the sampling weights of the sampled individuals are the only information available about the design. We develop a predictive model-based approach to estimate the prevalence of a binary outcome for both the sampled and non-sampled individuals, using hierarchical Bayesian models that take into account the sampling weights. A simulation study is carried out to compare the performance of our proposed method with other established methods. The results indicate that our proposed method achieves great reductions in mean squared error when compared with standard approaches. It performs equally well or better when compared with more elaborate methods when there is a relationship between the responses and the sampling weights. The proposed method is applied to estimate asthma prevalence across districts. PMID:28989860

  7. Testing for independence in J×K contingency tables with complex sample survey data.

    PubMed

    Lipsitz, Stuart R; Fitzmaurice, Garrett M; Sinha, Debajyoti; Hevelone, Nathanael; Giovannucci, Edward; Hu, Jim C

    2015-09-01

    The test of independence of row and column variables in a (J×K) contingency table is a widely used statistical test in many areas of application. For complex survey samples, use of the standard Pearson chi-squared test is inappropriate due to correlation among units within the same cluster. Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) proposed an approach in which the standard Pearson chi-squared statistic is multiplied by a design effect to adjust for the complex survey design. Unfortunately, this test fails to exist when one of the observed cell counts equals zero. Even with the large samples typical of many complex surveys, zero cell counts can occur for rare events, small domains, or contingency tables with a large number of cells. Here, we propose Wald and score test statistics for independence based on weighted least squares estimating equations. In contrast to the Rao-Scott test statistic, the proposed Wald and score test statistics always exist. In simulations, the score test is found to perform best with respect to type I error. The proposed method is motivated by, and applied to, post surgical complications data from the United States' Nationwide Inpatient Sample (NIS) complex survey of hospitals in 2008. © 2015, The International Biometric Society.
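
    For orientation, the sketch below applies a simplified first-order Rao-Scott style correction: the Pearson chi-square computed on the weighted table rescaled to the sample size, deflated by an average design effect supplied by the user. This is the kind of adjustment the paper contrasts against, not its proposed Wald or score tests, and it still breaks down when a margin of the table is zero.

```python
import numpy as np
from scipy import stats

def rao_scott_first_order(weighted_counts, n_sample, mean_deff):
    """Simplified first-order design-effect correction to the Pearson test of
    independence in a J x K table. Sketch only; a full Rao-Scott correction
    estimates the cell design effects from the survey design itself."""
    tbl = np.asarray(weighted_counts, float)
    tbl = tbl / tbl.sum() * n_sample                 # rescale to the sample size
    row = tbl.sum(1, keepdims=True)
    col = tbl.sum(0, keepdims=True)
    expected = row @ col / tbl.sum()
    x2 = ((tbl - expected) ** 2 / expected).sum()
    x2_rs = x2 / mean_deff                           # deflate by average design effect
    dof = (tbl.shape[0] - 1) * (tbl.shape[1] - 1)
    return x2_rs, stats.chi2.sf(x2_rs, dof)
```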

  8. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    NASA Astrophysics Data System (ADS)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; Zhang, Guannan; Ye, Ming; Wu, Jianfeng; Wu, Jichun

    2017-12-01

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  9. Optimization of the intravenous glucose tolerance test in T2DM patients using optimal experimental design.

    PubMed

    Silber, Hanna E; Nyberg, Joakim; Hooker, Andrew C; Karlsson, Mats O

    2009-06-01

    Intravenous glucose tolerance test (IVGTT) provocations are informative, but complex and laborious, for studying the glucose-insulin system. The objective of this study was to evaluate, through optimal design methodology, the possibilities of more informative and/or less laborious study design of the insulin modified IVGTT in type 2 diabetic patients. A previously developed model for glucose and insulin regulation was implemented in the optimal design software PopED 2.0. The following aspects of the study design of the insulin modified IVGTT were evaluated: (1) glucose dose, (2) insulin infusion, (3) combination of (1) and (2), (4) sampling times, (5) exclusion of labeled glucose. Constraints were incorporated to avoid prolonged hyper- and/or hypoglycemia and a reduced design was used to decrease run times. Design efficiency was calculated as a measure of the improvement with an optimal design compared to the basic design. The results showed that the design of the insulin modified IVGTT could be substantially improved by the use of an optimized design compared to the standard design and that it was possible to use a reduced number of samples. Optimization of sample times gave the largest improvement followed by insulin dose. The results further showed that it was possible to reduce the total sample time with only a minor loss in efficiency. Simulations confirmed the predictions from PopED. The predicted uncertainty of parameter estimates (CV) was low in all tested cases, despite the reduction in the number of samples/subject. The best design had a predicted average CV of parameter estimates of 19.5%. We conclude that improvement can be made to the design of the insulin modified IVGTT and that the most important design factor was the placement of sample times followed by the use of an optimal insulin dose. This paper illustrates how complex provocation experiments can be improved by sequential modeling and optimal design.

  10. Microwave Photonic Architecture for Direction Finding of LPI Emitters: Front End Analog Circuit Design and Component Characterization

    DTIC Science & Technology

    2016-09-01

    design to control the phase shifters was complex, and the calibration process was time consuming. During the redesign process, we carried out... signals in time domain with a maximum sampling frequency of 20 gigasamples per second. In the previous tests of the design, the performance of... (Author: Chew K. Tan.)

  11. Modeling Complex Workflow in Molecular Diagnostics

    PubMed Central

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  12. Optimum study designs.

    PubMed

    Gu, C; Rao, D C

    2001-01-01

    Because simplistic designs will lead to prohibitively large sample sizes, the optimization of genetic study designs is critical for successfully mapping genes for complex diseases. Creative designs are necessary for detecting and amplifying the usually weak signals for complex traits. Two important outcomes of a study design--power and resolution--are implicitly tied together by the principle of uncertainty. Overemphasis on either one may lead to suboptimal designs. To achieve optimality for a particular study, therefore, practical measures such as cost-effectiveness must be used to strike a balance between power and resolution. In this light, the myriad of factors involved in study design can be checked for their effects on the ultimate outcomes, and the popular existing designs can be sorted into building blocks that may be useful for particular situations. It is hoped that imaginative construction of novel designs using such building blocks will lead to enhanced efficiency in finding genes for complex human traits.

  13. Cardiorespiratory Fitness Levels among U.S. Youth Aged 12-15 Years: United States, 1999-2004 and 2012

    MedlinePlus

    ... use a complex, stratified, multistage probability cluster sampling design. NHANES data collection is based on a nationally ... conjunction with the 2012 NHANES and the survey design was based on the design for NHANES, with ...

  14. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE PAGES

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing; ...

    2017-12-27

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  15. A Taylor Expansion-Based Adaptive Design Strategy for Global Surrogate Modeling With Applications in Groundwater Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mo, Shaoxing; Lu, Dan; Shi, Xiaoqing

    Global sensitivity analysis (GSA) and uncertainty quantification (UQ) for groundwater modeling are challenging because of the model complexity and significant computational requirements. To reduce the massive computational cost, a cheap-to-evaluate surrogate model is usually constructed to approximate and replace the expensive groundwater models in the GSA and UQ. Constructing an accurate surrogate requires actual model simulations on a number of parameter samples. Thus, a robust experimental design strategy is desired to locate informative samples so as to reduce the computational cost in surrogate construction and consequently to improve the efficiency in the GSA and UQ. In this study, we develop a Taylor expansion-based adaptive design (TEAD) that aims to build an accurate global surrogate model with a small training sample size. TEAD defines a novel hybrid score function to search informative samples, and a robust stopping criterion to terminate the sample search that guarantees that the resulting approximation errors satisfy the desired accuracy. The good performance of TEAD in building global surrogate models is demonstrated on seven analytical functions with different dimensionality and complexity in comparison to two widely used experimental design methods. The application of the TEAD-based surrogate method in two groundwater models shows that the TEAD design can effectively improve the computational efficiency of GSA and UQ for groundwater modeling.

  16. Stuttering Frequency in Relation to Lexical Diversity, Syntactic Complexity, and Utterance Length

    ERIC Educational Resources Information Center

    Wagovich, Stacy A.; Hall, Nancy E.

    2018-01-01

    Children's frequency of stuttering can be affected by utterance length, syntactic complexity, and lexical content of language. Using a unique small-scale within-subjects design, this study explored whether language samples that contain more stuttering have (a) longer, (b) syntactically more complex, and (c) lexically more diverse utterances than…

  17. Statistical methods for efficient design of community surveys of response to noise: Random coefficients regression models

    NASA Technical Reports Server (NTRS)

    Tomberlin, T. J.

    1985-01-01

    Research studies of residents' responses to noise consist of interviews with samples of individuals who are drawn from a number of different compact study areas. The statistical techniques developed here provide a basis for such sample design decisions. These techniques are suitable for a wide range of sample survey applications. A sample may consist of a random sample of residents selected from a sample of compact study areas, or in a more complex design, of a sample of residents selected from a sample of larger areas (e.g., cities). The techniques may be applied to estimates of the effects on annoyance of noise level, numbers of noise events, the time-of-day of the events, ambient noise levels, or other factors. Methods are provided for determining, in advance, how accurately these effects can be estimated for different sample sizes and study designs. Using a simple cost function, they also provide for optimum allocation of the sample across the stages of the design for estimating these effects. These techniques are developed via a regression model in which the regression coefficients are assumed to be random, with components of variance associated with the various stages of a multi-stage sample design.
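
    One classic result behind such allocation decisions, included here as a hedged illustration rather than as this report's own derivation, is the optimum number of interviews per study area under a simple two-part cost model and an intraclass correlation rho.

```python
import numpy as np

def optimum_cluster_size(cost_per_area, cost_per_interview, icc):
    """Classic two-stage design result: the number of interviews per study
    area that minimizes variance for a fixed budget is approximately
    m = sqrt((c1 / c2) * (1 - rho) / rho). Sketch under a simple cost model."""
    return np.sqrt(cost_per_area / cost_per_interview * (1 - icc) / icc)

def design_effect(m, icc):
    """Approximate variance inflation from clustering: 1 + (m - 1) * rho."""
    return 1 + (m - 1) * icc

# e.g. setting up an area costs 50x an interview and annoyance responses
# cluster at rho = 0.10: optimum_cluster_size(50, 1, 0.10) -> about 21
```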

  18. Marriage, Cohabitation, and Men's Use of Preventive Health Care Services

    MedlinePlus

    ... N.C.) to account for the complex sample design of NHIS. All estimates shown in this report ... Parsons VL, Moriarity C, Jonas K, et al. Design and estimation for the National Health Interview Survey, ...

  19. Sentence-Combining Practice

    ERIC Educational Resources Information Center

    Akin, Judy O'Neal

    1978-01-01

    Sample sentence-combining lessons developed to accompany the first-year A-LM German textbook are presented. The exercises are designed for language manipulation practice; they involve breaking down more complex sentences into simpler sentences and the subsequent recombination into complex sentences. All language skills, and particularly writing,…

  20. Application of a surfactant-assisted dispersive liquid-liquid microextraction method along with central composite design for micro-volume based spectrophotometric determination of low level of Cr(VI) ions in aquatic samples.

    PubMed

    Sobhi, Hamid Reza; Azadikhah, Efat; Behbahani, Mohammad; Esrafili, Ali; Ghambarian, Mahnaz

    2018-05-09

    A fast, simple, and low-cost surfactant-assisted dispersive liquid-liquid microextraction method, combined with a central composite design, was developed for the determination of low levels of Cr(VI) ions in aquatic samples. Initially, Cr(VI) ions present in the aqueous sample were reacted with 1,5-diphenylcarbazide (DPC) in acidic medium to form a complex. Sodium dodecyl sulfate (SDS), an anionic surfactant, was then employed as an ion-pair agent to convert the cationic complex into a neutral one. The whole aqueous phase then underwent dispersive liquid-liquid microextraction (DLLME), transferring the neutral complex into the fine droplets of the organic extraction phase. A micro-volume spectrophotometer was used to determine Cr(VI) concentrations. Under the optimized conditions predicted by the statistical design, the limit of quantification (LOQ) was 5.0 μg/L, and the calibration curve was linear over the concentration range of 5-100 μg/L. Finally, the method was successfully applied to the determination of low levels of Cr(VI) ions in various real aquatic samples; accuracies fell within the range of 83-102%, and precisions ranged from 1.7 to 5.2%. Copyright © 2018. Published by Elsevier B.V.

  1. GLIMMPSE Lite: Calculating Power and Sample Size on Smartphone Devices

    PubMed Central

    Munjal, Aarti; Sakhadeo, Uttara R.; Muller, Keith E.; Glueck, Deborah H.; Kreidler, Sarah M.

    2014-01-01

    Researchers seeking to develop complex statistical applications for mobile devices face a common set of difficult implementation issues. In this work, we discuss general solutions to the design challenges. We demonstrate the utility of the solutions for a free mobile application designed to provide power and sample size calculations for univariate, one-way analysis of variance (ANOVA), GLIMMPSE Lite. Our design decisions provide a guide for other scientists seeking to produce statistical software for mobile platforms. PMID:25541688
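
    The kind of calculation such an app performs can be sketched with the noncentral F distribution; the function below computes one-way ANOVA power from Cohen's f and is illustrative only, not GLIMMPSE Lite's code.

```python
from scipy import stats

def anova_power(n_per_group, k_groups, effect_size_f, alpha=0.05):
    """Power of a one-way ANOVA F test via the noncentral F distribution,
    with Cohen's f as the effect size (balanced groups assumed)."""
    n_total = n_per_group * k_groups
    df1, df2 = k_groups - 1, n_total - k_groups
    nc = n_total * effect_size_f ** 2                 # noncentrality parameter
    f_crit = stats.f.ppf(1 - alpha, df1, df2)
    return stats.ncf.sf(f_crit, df1, df2, nc)

# print(anova_power(n_per_group=20, k_groups=3, effect_size_f=0.25))
```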

  2. An overview of the genetic dissection of complex traits.

    PubMed

    Rao, D C

    2008-01-01

    Thanks to the recent revolutionary genomic advances such as the International HapMap consortium, resolution of the genetic architecture of common complex traits is beginning to look hopeful. While demonstrating the feasibility of genome-wide association (GWA) studies, the pathbreaking Wellcome Trust Case Control Consortium (WTCCC) study also serves to underscore the critical importance of very large sample sizes and draws attention to potential problems, which need to be addressed as part of the study design. Even the large WTCCC study had vastly inadequate power for several of the associations reported (and confirmed) and, therefore, most of the regions harboring relevant associations may not be identified anytime soon. This chapter provides an overview of some of the key developments in the methodological approaches to genetic dissection of common complex traits. Constrained Bayesian networks are suggested as especially useful for analysis of pathway-based SNPs. Likewise, composite likelihood is suggested as a promising method for modeling complex systems. It discusses the key steps in a study design, with an emphasis on GWA studies. Potential limitations highlighted by the WTCCC GWA study are discussed, including problems associated with massive genotype imputation, analysis of pooled national samples, shared controls, and the critical role of interactions. GWA studies clearly need massive sample sizes that are only possible through genuine collaborations. After all, for common complex traits, the question is not whether we can find some pieces of the puzzle, but how large and what kind of a sample we need to (nearly) solve the genetic puzzle.

  3. Design on the x-ray oral digital image display card

    NASA Astrophysics Data System (ADS)

    Wang, Liping; Gu, Guohua; Chen, Qian

    2009-10-01

    According to the main characteristics of X-ray imaging, an X-ray display card was designed and debugged using the principle of correlated double sampling (CDS) combined with embedded computer technology. The CCD sensor drive circuit and the corresponding control procedures were designed, along with the filtering and sample-and-hold circuits, and data exchange over the PC104 bus was implemented. A complex programmable logic device provides the gating and timing logic and handles counting, reading CPU control instructions, triggering the corresponding exposure, and controlling the sample-and-hold stage. The circuit components were adjusted according to image-quality and noise analysis, and high-quality images were obtained.

  4. Novel immunoassay formats for integrated microfluidic circuits: diffusion immunoassays (DIA)

    NASA Astrophysics Data System (ADS)

    Weigl, Bernhard H.; Hatch, Anson; Kamholz, Andrew E.; Yager, Paul

    2000-03-01

    Novel designs of integrated fluidic microchips allow separations, chemical reactions, and calibration-free analytical measurements to be performed directly in very small quantities of complex samples such as whole blood and contaminated environmental samples. This technology lends itself to applications such as clinical diagnostics, including tumor marker screening, and environmental sensing in remote locations. Lab-on-a-Chip based systems offer many advantages over traditional analytical devices: They consume extremely low volumes of both samples and reagents. Each chip is inexpensive and small. The sampling-to-result time is extremely short. They perform all analytical functions, including sampling, sample pretreatment, separation, dilution, and mixing steps, chemical reactions, and detection in an integrated microfluidic circuit. Lab-on-a-Chip systems enable the design of small, portable, rugged, low-cost, easy to use, yet extremely versatile and capable diagnostic instruments. In addition, fluids flowing in microchannels exhibit unique characteristics ('microfluidics'), which allow the design of analytical devices and assay formats that would not function on a macroscale. Existing Lab-on-a-chip technologies work very well for highly predictable and homogeneous samples common in genetic testing and drug discovery processes. One of the biggest challenges for current Labs-on-a-chip, however, is to perform analysis in the presence of the complexity and heterogeneity of actual samples such as whole blood or contaminated environmental samples. Micronics has developed a variety of Lab-on-a-Chip assays that can overcome those shortcomings. We will now present various types of novel Lab-on-a-Chip-based immunoassays, including the so-called Diffusion Immunoassays (DIA) that are based on the competitive laminar diffusion of analyte molecules and tracer molecules into a region of the chip containing antibodies that target the analyte molecules. Advantages of this technique are a reduction in reagents, higher sensitivity, minimal preparation of complex samples such as blood, real-time calibration, and extremely rapid analysis.

  5. Health Insurance Coverage: Early Release of Estimates from the National Health Interview Survey, January -- June 2013

    MedlinePlus

    ... Park, NC) to account for the complex sample design of NHIS, taking into account stratum and primary sampling unit (PSU) identifiers. The Taylor series linearization method was chosen for variance estimation. Trends ...

  6. [Evaluation of the quality of Anales Españoles de Pediatría versus Medicina Clínica].

    PubMed

    Bonillo Perales, A

    2002-08-01

    To compare the scientific methodology and quality of articles published in Anales Españoles de Pediatría and Medicina Clínica. A stratified and randomized selection of 40 original articles published in 2001 in Anales Españoles de Pediatría and Medicina Clínica was made. Methodological errors in the critical analysis of original articles (21 items), epidemiological design, sample size, statistical complexity and levels of scientific evidence in both journals were compared using the chi-squared and/or Student's t-test. No differences were found between Anales Españoles de Pediatría and Medicina Clínica in the critical evaluation of original articles (p > 0.2). In original articles published in Anales Españoles de Pediatría, the designs were of lower scientific evidence (a lower proportion of clinical trials, cohort and case-control studies) (17.5 vs 42.5%, p = 0.05), sample sizes were smaller (p = 0.003) and there was less statistical complexity in the results section (p = 0.03). To improve the scientific quality of Anales Españoles de Pediatría, improved study designs, larger sample sizes and greater statistical complexity are required in its articles.

  7. [National Health and Nutrition Survey 2012: design and coverage].

    PubMed

    Romero-Martínez, Martín; Shamah-Levy, Teresa; Franco-Núñez, Aurora; Villalpando, Salvador; Cuevas-Nasu, Lucía; Gutiérrez, Juan Pablo; Rivera-Dommarco, Juan Ángel

    2013-01-01

    To describe the design and population coverage of the National Health and Nutrition Survey 2012 (NHNS 2012). The design of the NHNS 2012 is reported as a probabilistic, population-based survey with multi-stage stratified sampling, along with the inferential properties of the sample, the logistical procedures, and the coverage obtained. The household response rate for the NHNS 2012 was 87%, with complete data from 50,528 households, in which 96,031 individual interviews (selected by age) and 14,104 interviews of ambulatory health service users were also obtained. The probabilistic design of the NHNS 2012, together with its coverage, allows inferences to be generated about health and nutrition conditions, coverage of health programs, and access to health services. Because of the complex design, all estimations from the NHNS 2012 must use the survey design variables: weights, primary sampling units, and strata.

  8. Using machine learning tools to model complex toxic interactions with limited sampling regimes.

    PubMed

    Bertin, Matthew J; Moeller, Peter; Guillette, Louis J; Chapman, Robert W

    2013-03-19

    A major impediment to understanding the impact of environmental stress, including toxins and other pollutants, on organisms is that organisms are rarely challenged by one or a few stressors in natural systems. Thus, laboratory experiments, which practical considerations limit to a few stressors at a few levels, are difficult to link to real-world conditions. In addition, while the existence of complex interactions among stressors can be identified by current statistical methods, these methods do not provide a means to construct mathematical models of these interactions. In this paper, we offer a two-step process by which complex interactions of stressors on biological systems can be modeled in an experimental design that is within the limits of practicality. We begin with the notion that environmental conditions circumscribe an n-dimensional hyperspace within which biological processes or end points are embedded. We then randomly sample this hyperspace to establish experimental conditions that span the range of the relevant parameters and conduct the experiment(s) based upon these selected conditions. Models of the complex interactions of the parameters are then extracted using machine learning tools, specifically artificial neural networks. This approach can rapidly generate highly accurate models of biological responses to complex interactions among environmentally relevant toxins, identify critical subspaces where nonlinear responses exist, and provide an expedient means of designing traditional experiments to test the impact of complex mixtures on biological responses. Further, this can be accomplished with an astonishingly small sample size.
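
    The random sampling of a stressor hyperspace followed by a neural-network fit can be sketched as follows; the four "stressors" and the response function are invented stand-ins for a real bioassay, and the model is a generic scikit-learn MLP rather than the networks used by the authors.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Randomly sample an n-dimensional "environmental hyperspace"
# (here 4 hypothetical stressors, each scaled to [0, 1]).
n_samples, n_stressors = 200, 4
X = rng.uniform(0.0, 1.0, size=(n_samples, n_stressors))

# Made-up biological response with a nonlinear interaction between
# stressors 0 and 1, standing in for a measured endpoint.
y = np.sin(3 * X[:, 0] * X[:, 1]) + 0.5 * X[:, 2] ** 2 + rng.normal(0, 0.05, n_samples)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Artificial neural network as the model of the complex interactions.
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```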

  9. Design of refractive laser beam shapers to generate complex irradiance profiles

    NASA Astrophysics Data System (ADS)

    Li, Meijie; Meuret, Youri; Duerr, Fabian; Vervaeke, Michael; Thienpont, Hugo

    2014-05-01

    A Gaussian laser beam is reshaped to have a specific irradiance distribution in many applications in order to ensure optimal system performance. Refractive optics are commonly used for laser beam shaping. A refractive laser beam shaper is typically formed by either two plano-aspheric lenses or by one thick lens with two aspherical surfaces. Ray mapping is a general optical design technique for designing refractive beam shapers based on geometric optics. This design technique in principle allows the generation of any rotationally symmetric irradiance profile, yet in the literature ray mapping has mainly been developed to transform a Gaussian irradiance profile into a uniform profile. For more complex profiles, especially those with low intensity in the inner region, such as a Dark Hollow Gaussian (DHG) irradiance profile, the ray mapping technique is not directly applicable in practice: the numerical effort of calculating the aspherical surface points and fitting a surface with sufficient accuracy increases considerably. In this work we evaluate different sampling approaches and surface fitting methods. This allows us to propose and demonstrate a comprehensive numerical approach to efficiently design refractive laser beam shapers that generate rotationally symmetric collimated beams with a complex irradiance profile. Ray tracing analysis for several complex irradiance profiles demonstrates excellent performance of the designed lenses and the versatility of our design procedure.
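
    The core ray-mapping step, equating encircled energy in the input and output beams, can be sketched numerically as follows; the target profile, grids and units are hypothetical, and the subsequent lens-surface construction is not shown.

```python
import numpy as np

def encircled_energy(r, irradiance):
    """Cumulative energy E(r) = integral of I(r') * 2*pi*r' dr' (trapezoid rule)."""
    integrand = 2 * np.pi * r * irradiance
    steps = 0.5 * (integrand[1:] + integrand[:-1]) * np.diff(r)
    return np.concatenate(([0.0], np.cumsum(steps)))

# Radial grids (arbitrary units) for the input Gaussian and the target profile.
r_in = np.linspace(0.0, 3.0, 2000)
r_out = np.linspace(0.0, 3.0, 2000)
I_in = np.exp(-2 * r_in**2)                      # collimated Gaussian input
I_target = r_out**2 * np.exp(-2 * r_out**2)      # hypothetical dark-hollow-like target

E_in = encircled_energy(r_in, I_in)
E_out = encircled_energy(r_out, I_target)
E_in, E_out = E_in / E_in[-1], E_out / E_out[-1]  # normalize to equal total energy

# Ray mapping: each input radius maps to the output radius enclosing the same energy fraction.
r_map = np.interp(E_in, E_out, r_out)
for k in range(0, 2000, 400):
    print(f"r_in = {r_in[k]:.2f} -> r_out = {r_map[k]:.2f}")
```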

  10. On Design Mining: Coevolution and Surrogate Models.

    PubMed

    Preen, Richard J; Bull, Larry

    2017-01-01

    Design mining is the use of computational intelligence techniques to iteratively search and model the attribute space of physical objects evaluated directly through rapid prototyping to meet given objectives. It enables the exploitation of novel materials and processes without formal models or complex simulation. In this article, we focus upon the coevolutionary nature of the design process when it is decomposed into concurrent sub-design-threads due to the overall complexity of the task. Using an abstract, tunable model of coevolution, we consider strategies to sample subthread designs for whole-system testing and how best to construct and use surrogate models within the coevolutionary scenario. Drawing on our findings, we then describe the effective design of an array of six heterogeneous vertical-axis wind turbines.

  11. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    NASA Astrophysics Data System (ADS)

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-12-01

    A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work seeks to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking interactions, hydrogen-bonding interaction, electrostatic interaction) for the first time. In addition, application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment, under theory-based experimental design, of trace analytes from environmental, biological and clinical samples.

  12. Integrating a Genetic Algorithm Into a Knowledge-Based System for Ordering Complex Design Processes

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; McCulley, Collin M.; Bloebaum, Christina L.

    1996-01-01

    The design cycle associated with large engineering systems requires an initial decomposition of the complex system into design processes which are coupled through the transference of output data. Some of these design processes may be grouped into iterative subcycles. In analyzing or optimizing such a coupled system, it is essential to be able to determine the best ordering of the processes within these subcycles to reduce design cycle time and cost. Many decomposition approaches assume the capability is available to determine what design processes and couplings exist and what order of execution will be imposed during the design cycle. Unfortunately, this is often a complex problem and beyond the capabilities of a human design manager. A new feature, a genetic algorithm, has been added to DeMAID (Design Manager's Aid for Intelligent Decomposition) to allow the design manager to rapidly examine many different combinations of ordering processes in an iterative subcycle and to optimize the ordering based on cost, time, and iteration requirements. Two sample test cases are presented to show the effects of optimizing the ordering with a genetic algorithm.
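
    A stripped-down genetic search over process orderings can illustrate the idea; the coupling matrix is invented, the search uses swap mutations only (no crossover), and this is not the DeMAID implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 8  # number of design processes

# Hypothetical coupling matrix: C[i, j] = 1 means process i needs output from process j.
C = (rng.random((n, n)) < 0.25).astype(int)
np.fill_diagonal(C, 0)

def feedbacks(order):
    """Count couplings that point 'backwards' (feedback loops) under a given ordering."""
    pos = {p: k for k, p in enumerate(order)}
    return sum(C[i, j] for i in range(n) for j in range(n)
               if C[i, j] and pos[j] > pos[i])

def mutate(order):
    """Swap two randomly chosen processes."""
    a, b = rng.choice(n, size=2, replace=False)
    child = order.copy()
    child[a], child[b] = child[b], child[a]
    return child

# Simple mutation-only evolutionary search: keep the best orderings, mutate them.
pop = [rng.permutation(n) for _ in range(30)]
for _ in range(200):
    parents = sorted(pop, key=feedbacks)[:10]
    pop = parents + [mutate(parents[rng.integers(10)]) for _ in range(20)]

best = min(pop, key=feedbacks)
print("best ordering:", best, "feedback couplings:", feedbacks(best))
```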

  13. POWER ANALYSIS FOR COMPLEX MEDIATIONAL DESIGNS USING MONTE CARLO METHODS

    PubMed Central

    Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.

    2013-01-01

    Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex mediational models. The approach is based on the well known technique of generating a large number of samples in a Monte Carlo study, and estimating power as the percentage of cases in which an estimate of interest is significantly different from zero. Examples of power calculation for commonly used mediational models are provided. Power analyses for the single mediator, multiple mediators, three-path mediation, mediation with latent variables, moderated mediation, and mediation in longitudinal designs are described. Annotated sample syntax for Mplus is appended and tabled values of required sample sizes are shown for some models. PMID:23935262
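
    The Monte Carlo recipe described here can be sketched for the single-mediator case; the path coefficients below are hypothetical and the indirect effect is tested by joint significance of the two paths rather than with the Mplus syntax appended to the article.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def simulate_power(n, a=0.3, b=0.3, n_rep=2000, alpha=0.05):
    """Monte Carlo power for the indirect effect a*b in a single-mediator model,
    tested by joint significance of the a and b paths."""
    hits = 0
    for _ in range(n_rep):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)              # mediator model
        y = b * m + 0.1 * x + rng.normal(size=n)    # outcome model with a small direct effect
        p_a = stats.linregress(x, m).pvalue          # test of the a path
        # Test of the b path: OLS of y on (1, m, x), t-test on the m coefficient.
        X = np.column_stack([np.ones(n), m, x])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        resid = y - X @ beta
        cov = (resid @ resid / (n - 3)) * np.linalg.inv(X.T @ X)
        t_b = beta[1] / np.sqrt(cov[1, 1])
        p_b = 2 * stats.t.sf(abs(t_b), df=n - 3)
        if p_a < alpha and p_b < alpha:
            hits += 1
    return hits / n_rep

for n in (50, 100, 200):
    print(f"n = {n:3d}  estimated power = {simulate_power(n):.2f}")
```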

  14. Optimal design in pediatric pharmacokinetic and pharmacodynamic clinical studies.

    PubMed

    Roberts, Jessica K; Stockmann, Chris; Balch, Alfred; Yu, Tian; Ward, Robert M; Spigarelli, Michael G; Sherwin, Catherine M T

    2015-03-01

    It is not trivial to conduct clinical trials with pediatric participants. Ethical, logistical, and financial considerations add to the complexity of pediatric studies. Optimal design theory allows investigators the opportunity to apply mathematical optimization algorithms to define how to structure their data collection to answer focused research questions. These techniques can be used to determine an optimal sample size, optimal sample times, and the number of samples required for pharmacokinetic and pharmacodynamic studies. The aim of this review is to demonstrate how to determine the optimal sample size, optimal sample times, and number of samples required from each patient by presenting specific examples using optimal design tools. Additionally, this review discusses the relative usefulness of sparse vs rich data. This review is intended to educate the clinician, as well as the basic research scientist, who plan to conduct a pharmacokinetic/pharmacodynamic clinical trial in pediatric patients. © 2015 John Wiley & Sons Ltd.
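
    As one small example of what an optimal design tool computes, the sketch below brute-forces a D-optimal pair of sampling times for a one-compartment IV bolus model with hypothetical parameters; real pediatric designs would typically optimize a population (mixed effects) Fisher information matrix instead.

```python
import numpy as np
from itertools import combinations

# Hypothetical one-compartment IV bolus model: C(t) = (Dose/V) * exp(-(CL/V) * t)
dose, CL, V = 100.0, 2.0, 10.0

def conc(t, CL, V):
    return (dose / V) * np.exp(-(CL / V) * t)

def sensitivities(t, h=1e-5):
    """Numerical derivatives of C(t) with respect to CL and V (one Jacobian row per parameter)."""
    dCL = (conc(t, CL + h, V) - conc(t, CL - h, V)) / (2 * h)
    dV = (conc(t, CL, V + h) - conc(t, CL, V - h)) / (2 * h)
    return np.array([dCL, dV])

candidate_times = np.arange(0.25, 12.25, 0.25)   # hours

best = None
for pair in combinations(candidate_times, 2):
    J = np.array([sensitivities(t) for t in pair])   # 2 samples x 2 parameters
    fim = J.T @ J                                    # Fisher information (unit error variance)
    det = np.linalg.det(fim)
    if best is None or det > best[0]:
        best = (det, pair)

print("D-optimal pair of sampling times (h):", best[1], " det(FIM):", round(best[0], 4))
```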

  15. X-ray diffraction, FTIR, UV-VIS and SEM studies on chromium (III) complexes

    NASA Astrophysics Data System (ADS)

    Mishra, Ashutosh; Dwivedi, Jagrati; Shukla, Kritika

    2015-06-01

    Five chromium (III) complexes have been prepared using Schiff base ligands derived from benzoin and five different amino acids (H2N-R). The samples were characterized by XRD, FTIR, UV-VIS and SEM methods. X-ray diffraction patterns, recorded with a Bruker D8 Advance instrument, show that all the chromium (III) complexes are crystalline in nature with a hexagonal structure. FTIR spectroscopy, using a VERTAX 70, reveals that the samples have (C=N), (C-O), (M-N) and (M-O) bonds in the range of 4000-400 cm-1. UV-VIS spectroscopy, performed with a Lambda 960 spectrometer, shows that the samples absorb visible light in the range of 380-780 nm. SEM, which is designed for the study of solid objects, was performed using a JEOL JSM 5600 instrument.

  16. Trends in Elevated Triglyceride in Adults: United States, 2001-2012

    MedlinePlus

    ... All variance estimates accounted for the complex survey design using Taylor series linearization ( 10 ). Percentage estimates for the total adult ... al. National Health and Nutrition Examination Survey: Sample design, 2007–2010. ... KM. Taylor series methods. In: Introduction to variance estimation. 2nd ed. ...

  17. A Methodology to Teach Advanced A/D Converters, Combining Digital Signal Processing and Microelectronics Perspectives

    ERIC Educational Resources Information Center

    Quintans, C.; Colmenar, A.; Castro, M.; Moure, M. J.; Mandado, E.

    2010-01-01

    ADCs (analog-to-digital converters), especially Pipeline and Sigma-Delta converters, are designed using complex architectures in order to increase their sampling rate and/or resolution. Consequently, the learning of ADC devices also encompasses complex concepts such as multistage synchronization, latency, oversampling, modulation, noise shaping,…
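
    The oversampling and noise-shaping concepts mentioned above can be demonstrated with a first-order sigma-delta modulator simulation; the signal, rates and decimation scheme are illustrative choices, not material from the article.

```python
import numpy as np

fs_over, f_sig, n, osr = 256_000, 1_000, 8192, 64   # oversampled rate, tone, samples, OSR

t = np.arange(n) / fs_over
x = 0.5 * np.sin(2 * np.pi * f_sig * t)             # analog input, kept within [-1, 1]

# First-order sigma-delta loop: integrator + 1-bit quantizer + feedback DAC.
integrator, feedback = 0.0, 0.0
bits = np.empty(n)
for i in range(n):
    integrator += x[i] - feedback
    bits[i] = 1.0 if integrator >= 0 else -1.0
    feedback = bits[i]

# Crude decimation: averaging blocks of `osr` one-bit samples recovers the waveform,
# because the quantization noise has been shaped toward high frequencies.
trim = n - n % osr
recovered = bits[:trim].reshape(-1, osr).mean(axis=1)
reference = x[:trim].reshape(-1, osr).mean(axis=1)
print("RMS reconstruction error:", np.sqrt(np.mean((recovered - reference) ** 2)))
```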

  18. Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2011-01-01

    Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to appropriately utilize these resources. Indeed, users of one popular dataset were generally found "not" to have modeled…
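
    One routine consequence of ignoring weights and design effects is an overstated effective sample size; the sketch below computes a weighted mean together with Kish's approximate design effect due to unequal weighting, using invented weights.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical analysis variable and unequal survey weights.
y = rng.normal(50, 10, size=500)
w = rng.uniform(0.5, 4.0, size=500)

weighted_mean = np.sum(w * y) / np.sum(w)

# Kish's approximate design effect from unequal weighting alone.
n = len(w)
deff_weights = n * np.sum(w ** 2) / np.sum(w) ** 2
n_effective = n / deff_weights

print(f"weighted mean   = {weighted_mean:.2f}")
print(f"unweighted mean = {y.mean():.2f}")
print(f"design effect (weighting) = {deff_weights:.2f}, effective n = {n_effective:.0f}")
```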

  19. Weighting Test Samples in IRT Linking and Equating: Toward an Improved Sampling Design for Complex Equating. Research Report. ETS RR-13-39

    ERIC Educational Resources Information Center

    Qian, Jiahe; Jiang, Yanming; von Davier, Alina A.

    2013-01-01

    Several factors could cause variability in item response theory (IRT) linking and equating procedures, such as the variability across examinee samples and/or test items, seasonality, regional differences, native language diversity, gender, and other demographic variables. Hence, the following question arises: Is it possible to select optimal…

  20. ICS-II USA research design and methodology.

    PubMed

    Rana, H; Andersen, R M; Nakazono, T T; Davidson, P L

    1997-05-01

    The purpose of the WHO-sponsored International Collaborative Study of Oral Health Outcomes (ICS-II) was to provide policy-makers and researchers with detailed, reliable, and valid data on the oral health situation in their countries or regions, together with comparative data from other dental care delivery systems. ICS-II used a cross-sectional design with no explicit control groups or experimental interventions. A standardized methodology was developed and tested for collecting and analyzing epidemiological, sociocultural, economic, and delivery system data. Respondent information was obtained by household interviews, and clinical examinations were conducted by calibrated oral epidemiologists. Discussed are the sampling design characteristics for the USA research locations, response rates, sample sizes for interview and oral examination data, weighting procedures, and statistical methods. SUDAAN was used to adjust variance calculations, since complex sampling designs were used.

  1. Toxin activity assays, devices, methods and systems therefor

    DOEpatents

    Koh, Chung-Yan; Schaff, Ulrich Y.; Sommer, Gregory Jon

    2016-04-05

    Embodiments of the present invention are directed toward devices, systems and methods for conducting toxin activity assays using sedimentation. The toxin activity assay may include generating complexes which bind to a plurality of beads in a fluid sample. The complexes may include a target toxin and a labeling agent, or may be generated due to the presence of active target toxin and/or a labeling agent designed to be incorporated into complexes responsive to the presence of active target toxin. The plurality of beads including the complexes may be transported through a density media, wherein the density media has a density lower than that of the beads and higher than that of the fluid sample, and wherein the transporting occurs, at least in part, by sedimentation. Signal may be detected from the labeling agents of the complexes.

  2. Selected Oral Health Indicators in the United States, 2005-2008

    MedlinePlus

    ... errors of the percentages were estimated using Taylor series linearization, to take into account the complex sampling design. The statistical significance of differences between estimates were ...

  3. [New design of the Health Survey of Catalonia (Spain, 2010-2014): a step forward in health planning and evaluation].

    PubMed

    Alcañiz-Zanón, Manuela; Mompart-Penina, Anna; Guillén-Estany, Montserrat; Medina-Bustos, Antonia; Aragay-Barbany, Josep M; Brugulat-Guiteras, Pilar; Tresserras-Gaju, Ricard

    2014-01-01

    This article presents the genesis of the Health Survey of Catalonia (Spain, 2010-2014) with its semiannual subsamples and explains the basic characteristics of its multistage sampling design. In comparison with previous surveys, the organizational advantages of this new statistical operation include rapid data availability and the ability to continuously monitor the population. The main benefits are timeliness in the production of indicators and the possibility of introducing new topics through the supplemental questionnaire as a function of needs. Limitations consist of the complexity of the sample design and the lack of longitudinal follow-up of the sample. Suitable sampling weights for each specific subsample are necessary for any statistical analysis of micro-data. Accuracy in the analysis of territorial disaggregation or population subgroups increases if annual samples are accumulated. Copyright © 2013 SESPAS. Published by Elsevier Espana. All rights reserved.

  4. Modeling the Stress Complexities of Teaching and Learning of School Physics in Nigeria

    ERIC Educational Resources Information Center

    Emetere, Moses E.

    2014-01-01

    This study was designed to investigate the validity of the stress complexity model (SCM) to teaching and learning of school physics in Abuja municipal area council of Abuja, North. About two hundred students were randomly selected by a simple random sampling technique from some schools within the Abuja municipal area council. A survey research…

  5. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    PubMed Central

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-01-01

    A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work seeks to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking interactions, hydrogen-bonding interaction, electrostatic interaction) for the first time. In addition, application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment, under theory-based experimental design, of trace analytes from environmental, biological and clinical samples. PMID:27924944

  6. Recent progress in the design and clinical development of electronic-nose technologies

    Treesearch

    Dan Wilson

    2016-01-01

    Electronic-nose (e-nose) devices are instruments designed to detect and discriminate between precise complex gaseous mixtures of volatile organic compounds derived from specific organic sources, such as clinical test samples from patients, based on electronic aroma signature patterns (distinct digital sensor responses) resulting from the combined outputs of a...

  7. A Database Design and Development Case: Home Theater Video

    ERIC Educational Resources Information Center

    Ballenger, Robert; Pratt, Renee

    2012-01-01

    This case consists of a business scenario of a small video rental store, Home Theater Video, which provides background information, a description of the functional business requirements, and sample data. The case provides sufficient information to design and develop a moderately complex database to assist Home Theater Video in solving their…

  8. Development and optimization of SPE-HPLC-UV/ELSD for simultaneous determination of nine bioactive components in Shenqi Fuzheng Injection based on Quality by Design principles.

    PubMed

    Wang, Lu; Qu, Haibin

    2016-03-01

    A method combining solid phase extraction, high performance liquid chromatography, and ultraviolet/evaporative light scattering detection (SPE-HPLC-UV/ELSD) was developed according to Quality by Design (QbD) principles and used to assay nine bioactive compounds within a botanical drug, Shenqi Fuzheng Injection. Risk assessment and a Plackett-Burman design were utilized to evaluate the impact of 11 factors on the resolution and signal-to-noise ratio of the chromatographic peaks. Multiple regression and Pareto ranking analysis indicated that the sorbent mass, sample volume, flow rate, column temperature, evaporator temperature, and gas flow rate were statistically significant (p < 0.05) in this procedure. Furthermore, a Box-Behnken design combined with response surface analysis was employed to study the relationships between the quality of the SPE-HPLC-UV/ELSD analysis and the four significant factors, i.e., flow rate, column temperature, evaporator temperature, and gas flow rate. An analytical design space for SPE-HPLC-UV/ELSD was then constructed from calculated Monte Carlo probabilities. In the presented approach, the operating parameters of sample preparation, chromatographic separation, and compound detection were investigated simultaneously. Eight elements of method validation, i.e., system-suitability tests, method robustness/ruggedness, sensitivity, precision, repeatability, linearity, accuracy, and stability, were completed at a selected working point. These results show that QbD principles are suitable for the development of analytical procedures for samples in complex matrices. Meanwhile, the analytical quality and method robustness were validated by the analytical design space. The presented strategy provides a tutorial on the development of a robust QbD-compliant quantitative method for samples in complex matrices.
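
    The screening step based on a Plackett-Burman design can be sketched as follows; the generator row is one commonly tabulated 12-run choice, and the response is a made-up stand-in for the chromatographic quality measures used in the study.

```python
import numpy as np

# Standard 12-run Plackett-Burman generator row for 11 factors
# (one commonly tabulated choice; +1 = high level, -1 = low level).
gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])

rows = [np.roll(gen, k) for k in range(11)]
design = np.vstack(rows + [-np.ones(11, dtype=int)])   # 12 runs x 11 factors
assert design.shape == (12, 11) and np.all(design.sum(axis=0) == 0)

# Hypothetical screening response in which only factors 0 and 3 matter.
rng = np.random.default_rng(7)
y = 5.0 + 2.0 * design[:, 0] - 1.5 * design[:, 3] + rng.normal(0, 0.2, 12)

# Main-effect coefficients: half the difference between mean response at +1 and -1.
effects = np.array([y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
                    for j in range(11)]) / 2
print("estimated coefficients per factor:", effects.round(2))
```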

  9. Moving bed reactor setup to study complex gas-solid reactions.

    PubMed

    Gupta, Puneet; Velazquez-Vargas, Luis G; Valentine, Charles; Fan, Liang-Shih

    2007-08-01

    A bench-scale moving bed reactor setup for studying complex gas-solid reactions has been designed in order to obtain kinetic data for scale-up purposes. In this bench-scale reactor setup, gas and solid reactants can be contacted in a cocurrent and countercurrent manner at high temperatures. Gas and solid sampling can be performed through the reactor bed, with their composition profiles determined at steady state. The reactor setup can be used to evaluate and corroborate model parameters accounting for intrinsic reaction rates in both simple and complex gas-solid reaction systems. The moving bed design allows experimentation over a variety of gas and solid compositions in a single experiment, unlike differential bed reactors where the gas composition is usually fixed. The data obtained from the reactor can also be used for direct scale-up of designs for moving bed reactors.

  10. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    PubMed Central

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

    Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: (1) compatibility with LC/MS (free of detergents, etc.); (2) high protein integrity (minimal level of protein degradation and non-biological PTMs); (3) compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and (4) lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: intact protein extracts, intended primarily for sample preparation method development and optimization, and pre-digested extracts (peptides), intended primarily for instrument validation and performance monitoring.

  11. Statistical and sampling issues when using multiple particle tracking

    NASA Astrophysics Data System (ADS)

    Savin, Thierry; Doyle, Patrick S.

    2007-08-01

    Video microscopy can be used to simultaneously track several microparticles embedded in a complex material. The trajectories are used to extract a sample of displacements at random locations in the material. From this sample, averaged quantities characterizing the dynamics of the probes are calculated to evaluate structural and/or mechanical properties of the assessed material. However, the sampling of measured displacements in heterogeneous systems is singular because the volume of observation with video microscopy is finite. By carefully characterizing the sampling design in the experimental output of the multiple particle tracking technique, we derive estimators for the mean and variance of the probes’ dynamics that are independent of the peculiar statistical characteristics. We expose stringent tests of these estimators using simulated and experimental complex systems with a known heterogeneous structure. Up to a certain fundamental limitation, which we characterize through a material degree of sampling by the embedded probe tracking, these estimators can be applied to quantify the heterogeneity of a material, providing an original and intelligible kind of information on complex fluid properties. More generally, we show that the precise assessment of the statistics in the multiple particle tracking output sample of observations is essential in order to provide accurate unbiased measurements.

  12. Statistical assessment on a combined analysis of GRYN-ROMN-UCBN upland vegetation vital signs

    USGS Publications Warehouse

    Irvine, Kathryn M.; Rodhouse, Thomas J.

    2014-01-01

    As of 2013, the Rocky Mountain and Upper Columbia Basin Inventory and Monitoring Networks have multiple years of vegetation data, the Greater Yellowstone Network has three years of vegetation data, and monitoring is ongoing in all three networks. Our primary objective is to assess whether a combined analysis of these data, aimed at exploring correlations with climate and weather data, is feasible. We summarize the core survey design elements across protocols and point out the major statistical challenges for a combined analysis at present. The dissimilarity in response designs between the ROMN and UCBN-GRYN network protocols presents a statistical challenge that has not yet been resolved. However, the UCBN and GRYN data are compatible as they implement a similar response design; therefore, a combined analysis is feasible and will be pursued in the future. When data collected by different networks are combined, the survey design describing the merged dataset is (likely) a complex survey design, the result of combining datasets from different sampling designs. A complex survey design is characterized by unequal probability sampling, varying stratification, and clustering (see Lohr 2010, Chapter 7, for a general overview). Statistical analysis of complex survey data requires modifications to standard methods, one of which is to include survey design weights within a statistical model. We focus on this issue for a combined analysis of upland vegetation from these networks, leaving other topics for future research. We conduct a simulation study of the possible effects of equal versus unequal probability selection of points on parameter estimates of temporal trend, using available packages within the R statistical computing environment. We find that, as written, using lmer or lm for trend detection in a continuous response, and clm and clmm for visually estimated cover classes, with “raw” GRTS design weights specified for the weight argument leads to substantially different results and/or computational instability. However, when only fixed effects are of interest, the survey package (svyglm and svyolr) may be suitable for a model-assisted analysis of trend. We provide possible directions for future research into combined analysis for ordinal and continuous vital sign indicators.

  13. The Evaluation of Bias of the Weighted Random Effects Model Estimators. Research Report. ETS RR-11-13

    ERIC Educational Resources Information Center

    Jia, Yue; Stokes, Lynne; Harris, Ian; Wang, Yan

    2011-01-01

    Estimation of parameters of random effects models from samples collected via complex multistage designs is considered. One way to reduce estimation bias due to unequal probabilities of selection is to incorporate sampling weights. Many researchers have proposed various weighting methods (Korn, & Graubard, 2003; Pfeffermann, Skinner,…

  14. Confronting Analytical Dilemmas for Understanding Complex Human Interactions in Design-Based Research from a Cultural-Historical Activity Theory (CHAT) Framework

    ERIC Educational Resources Information Center

    Yamagata-Lynch, Lisa C.

    2007-01-01

    Understanding human activity in real-world situations often involves complicated data collection, analysis, and presentation methods. This article discusses how Cultural-Historical Activity Theory (CHAT) can inform design-based research practices that focus on understanding activity in real-world situations. I provide a sample data set with…

  15. Blood Sampling and Preparation Procedures for Proteomic Biomarker Studies of Psychiatric Disorders.

    PubMed

    Guest, Paul C; Rahmoune, Hassan

    2017-01-01

    A major challenge in proteomic biomarker discovery and validation for psychiatric diseases is the inherent biological complexity underlying these conditions. There are also many technical issues which hinder this process such as the lack of standardization in sampling, processing and storage of bio-samples in preclinical and clinical settings. This chapter describes a reproducible procedure for sampling blood serum and plasma that is specifically designed for maximizing data quality output in two-dimensional gel electrophoresis, multiplex immunoassay and mass spectrometry profiling studies.

  16. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    PubMed Central

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328
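
    A much simpler design-based relative of these estimators is the degree-weighted (Volz-Heckathorn style) estimator, which weights each respondent by the inverse of their reported network degree; the sketch below uses invented data and is not the model-assisted estimator introduced in the paper.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical RDS sample: each respondent reports a network degree and an outcome
# (e.g., infection status). Under RDS, higher-degree people are more likely to be
# recruited, so each observation is down-weighted by its reported degree.
degree = rng.integers(1, 30, size=400)
outcome = rng.binomial(1, p=np.clip(0.05 + 0.004 * degree, 0, 1))  # prevalence rises with degree

naive = outcome.mean()
vh = np.sum(outcome / degree) / np.sum(1.0 / degree)   # degree-weighted (VH-style) estimate

print(f"naive sample proportion:  {naive:.3f}")
print(f"degree-weighted estimate: {vh:.3f}")
```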

  17. Network Model-Assisted Inference from Respondent-Driven Sampling Data.

    PubMed

    Gile, Krista J; Handcock, Mark S

    2015-06-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population.

  18. A general method to determine sampling windows for nonlinear mixed effects models with an application to population pharmacokinetic studies.

    PubMed

    Foo, Lee Kien; McGree, James; Duffull, Stephen

    2012-01-01

    Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.

  19. Examination of Libby, Montana, Fill Material for Background Levels of Amphibole from the Rainy Creek Complex Using Scanning Electron Microscopy and X-Ray Microanalysis

    USGS Publications Warehouse

    Adams, David T.; Langer, William H.; Hoefen, Todd M.; Van Gosen, Bradley S.; Meeker, Gregory P.

    2010-01-01

    Natural background levels of Libby-type amphibole in the sediment of the Libby valley in Montana have not, up to this point, been determined. The purpose of this report is to provide the preliminary findings of a study designed by both the U.S. Geological Survey and the U.S. Environmental Protection Agency and performed by the U.S. Geological Survey. The study worked to constrain the natural background levels of fibrous amphiboles potentially derived from the nearby Rainy Creek Complex. The material selected for this study was sampled from three localities, two of which are active open-pit sand and gravel mines. Seventy samples were collected in total and examined using a scanning electron microscope equipped with an energy dispersive x-ray spectrometer. All samples contained varying amounts of feldspars, ilmenite, magnetite, quartz, clay minerals, pyroxene minerals, and non-fibrous amphiboles such as tremolite, actinolite, and magnesiohornblende. Of the 70 samples collected, only three had detectable levels of fibrous amphiboles compatible with those found in the Rainy Creek Complex. The maximum concentration, identified here, of the amphiboles potentially from the Rainy Creek Complex is 0.083 percent by weight.

  20. The value of value of information: best informing research design and prioritization using current methods.

    PubMed

    Eckermann, Simon; Karnon, Jon; Willan, Andrew R

    2010-01-01

    Value of information (VOI) methods have been proposed as a systematic approach to inform optimal research design and prioritization. Four related questions arise that VOI methods could address. (i) Is further research for a health technology assessment (HTA) potentially worthwhile? (ii) Is the cost of a given research design less than its expected value? (iii) What is the optimal research design for an HTA? (iv) How can research funding be best prioritized across alternative HTAs? Following Occam's razor, we consider the usefulness of VOI methods in informing questions 1-4 relative to their simplicity of use. Expected value of perfect information (EVPI) with current information, while simple to calculate, is shown to provide neither a necessary nor a sufficient condition to address question 1, given that what EVPI needs to exceed varies with the cost of research design, which can vary from very large down to negligible. Hence, for any given HTA, EVPI does not discriminate, as it can be large and further research not worthwhile or small and further research worthwhile. In contrast, each of questions 1-4 are shown to be fully addressed (necessary and sufficient) where VOI methods are applied to maximize expected value of sample information (EVSI) minus expected costs across designs. In comparing complexity in use of VOI methods, applying the central limit theorem (CLT) simplifies analysis to enable easy estimation of EVSI and optimal overall research design, and has been shown to outperform bootstrapping, particularly with small samples. Consequently, VOI methods applying the CLT to inform optimal overall research design satisfy Occam's razor in both improving decision making and reducing complexity. Furthermore, they enable consideration of relevant decision contexts, including option value and opportunity cost of delay, time, imperfect implementation and optimal design across jurisdictions. More complex VOI methods such as bootstrapping of the expected value of partial EVPI may have potential value in refining overall research design. However, Occam's razor must be seriously considered in application of these VOI methods, given their increased complexity and current limitations in informing decision making, with restriction to EVPI rather than EVSI and not allowing for important decision-making contexts. Initial use of CLT methods to focus these more complex partial VOI methods towards where they may be useful in refining optimal overall trial design is suggested. Integrating CLT methods with such partial VOI methods to allow estimation of partial EVSI is suggested in future research to add value to the current VOI toolkit.
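
    The basic EVPI calculation referred to in question 1 is a short Monte Carlo exercise: the expected value of deciding with perfect information minus the value of the best decision under current information. The net-benefit distributions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n_sim = 100_000

# Hypothetical net benefit of two options (standard care vs new technology)
# under parameter uncertainty.
nb_standard = rng.normal(10_000, 1_500, n_sim)
nb_new = rng.normal(10_500, 2_500, n_sim)
nb = np.column_stack([nb_standard, nb_new])

# EVPI = E[max over decisions] - max over decisions of E[net benefit].
evpi = np.mean(nb.max(axis=1)) - nb.mean(axis=0).max()
print(f"per-decision EVPI: {evpi:,.0f}")
```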

  1. Improving inference for aerial surveys of bears: The importance of assumptions and the cost of unnecessary complexity.

    PubMed

    Schmidt, Joshua H; Wilson, Tammy L; Thompson, William L; Reynolds, Joel H

    2017-07-01

    Obtaining useful estimates of wildlife abundance or density requires thoughtful attention to potential sources of bias and precision, and it is widely understood that addressing incomplete detection is critical to appropriate inference. When the underlying assumptions of sampling approaches are violated, both increased bias and reduced precision of the population estimator may result. Bear (Ursus spp.) populations can be difficult to sample and are often monitored using mark-recapture distance sampling (MRDS) methods, although obtaining adequate sample sizes can be cost prohibitive. With the goal of improving inference, we examined the underlying methodological assumptions and estimator efficiency of three datasets collected under an MRDS protocol designed specifically for bears. We analyzed these data using MRDS, conventional distance sampling (CDS), and open-distance sampling approaches to evaluate the apparent bias-precision tradeoff relative to the assumptions inherent under each approach. We also evaluated the incorporation of informative priors on detection parameters within a Bayesian context. We found that the CDS estimator had low apparent bias and was more efficient than the more complex MRDS estimator. When combined with informative priors on the detection process, precision was increased by >50% compared to the MRDS approach with little apparent bias. In addition, open-distance sampling models revealed a serious violation of the assumption that all bears were available to be sampled. Inference is directly related to the underlying assumptions of the survey design and the analytical tools employed. We show that for aerial surveys of bears, avoidance of unnecessary model complexity, use of prior information, and the application of open population models can be used to greatly improve estimator performance and simplify field protocols. Although we focused on distance sampling-based aerial surveys for bears, the general concepts we addressed apply to a variety of wildlife survey contexts.
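
    As a minimal illustration of the CDS component of such an analysis, the sketch below fits a half-normal detection function to simulated perpendicular distances by maximum likelihood and converts it to a density estimate; the simulation settings are hypothetical and none of the MRDS or Bayesian machinery from the study is included.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(2)

# Simulated line transect: animals uniform within a strip of half-width w (km) on
# either side of a transect of length L (km); detection is half-normal in distance.
w, sigma_true, n_avail, L = 1.0, 0.3, 2000, 100.0
x_all = rng.uniform(0, w, n_avail)                        # perpendicular distances
detected = rng.random(n_avail) < np.exp(-x_all**2 / (2 * sigma_true**2))
x = x_all[detected]

def neg_loglik(log_sigma):
    s = np.exp(log_sigma)
    mu = np.sqrt(2 * np.pi) * s * (norm.cdf(w / s) - 0.5)   # integral of g(x) over [0, w]
    return -np.sum(-x**2 / (2 * s**2) - np.log(mu))

fit = minimize_scalar(neg_loglik, bounds=(-5.0, 2.0), method="bounded")
sigma_hat = np.exp(fit.x)
mu_hat = np.sqrt(2 * np.pi) * sigma_hat * (norm.cdf(w / sigma_hat) - 0.5)

# Conventional distance sampling density estimate: D = n / (2 * mu * L).
D_hat = len(x) / (2 * mu_hat * L)
print(f"sigma_hat = {sigma_hat:.3f} km, effective strip half-width = {mu_hat:.3f} km")
print(f"estimated density = {D_hat:.1f} animals/km^2 (true value = {n_avail / (2 * w * L):.1f})")
```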

  2. Sample Manipulation System for Sample Analysis at Mars

    NASA Technical Reports Server (NTRS)

    Mumm, Erik; Kennedy, Tom; Carlson, Lee; Roberts, Dustyn

    2008-01-01

    The Sample Analysis at Mars (SAM) instrument will analyze Martian samples collected by the Mars Science Laboratory Rover with a suite of spectrometers. This paper discusses the driving requirements, design, and lessons learned in the development of the Sample Manipulation System (SMS) within SAM. The SMS stores and manipulates 74 sample cups to be used for solid sample pyrolysis experiments. Focus is given to the unique mechanism architecture developed to deliver a high packing density of sample cups in a reliable, fault tolerant manner while minimizing system mass and control complexity. Lessons learned are presented on contamination control, launch restraint mechanisms for fragile sample cups, and mechanism test data.

  3. Considerations on sample holder design and custom-made non-polarizable electrodes for Spectral Induced Polarization measurements on unsaturated soils

    NASA Astrophysics Data System (ADS)

    Kaouane, C.; Chouteau, M. C.; Fauchard, C.; Cote, P.

    2014-12-01

    Spectral Induced Polarization (SIP) is a geophysical method sensitive to water content, saturation and grain size distribution. It could be used as an alternative to nuclear probes to assess the compaction of soils in road works. To evaluate the potential of SIP as a practical tool, we designed an experiment for complex conductivity measurements on unsaturated soil samples. The literature presents a large variety of sample holders and designs, each depending on the context. Although precise descriptions of sample holders can sometimes be found, exact replication is not always possible. Furthermore, the potential measurements are often done using custom-made Ag/AgCl electrodes, and very few indications are given on their reliability with time and temperature. Our objective is to perform complex conductivity measurements on soil samples compacted in a PVC cylindrical mould (10 cm long, 5 cm diameter) according to geotechnical standards. To obtain a homogeneous current density, electrical current is transmitted through the sample via chambers filled with agar gel. Agar gel is a good non-polarizable conductor within the frequency range of interest (1 mHz-20 kHz), but its electrical properties are only partially known; we measured an increase of the agar gel electrical conductivity over time and modelled the influence of this variation on the measurement, which is minimized if the electrodes are located on the sample. Because of the dimensions at stake and the need for a simple design, the potential electrodes are located outside the sample, so the gel contributes to the measurements; since the gel is fairly conductive, we expect to overestimate the sample conductivity. The potential electrodes are non-polarizable Ag/AgCl electrodes. To avoid any leakage, the KCl solution in the electrodes is replaced by saturated KCl-agar gel. These electrodes are low cost and show a low, stable self-potential (<1 mV). In addition, the electrode fabrication technique can be easily reproduced, and storage and maintenance are simple. We measured a variation of less than 1 mS/m in the electrolyte conductivity during the time of measurement (~1 h 40 min) for a conductivity range of 25-100 mS/m, showing no ionic contamination of the solution by the electrodes. An improvement to the cell design would be to control the internal temperature of the sample.

  4. BIOMONITORING OF EXPOSURE IN FARMWORKER STUDIES

    EPA Science Inventory

    Though biomonitoring has been used in many occupational and environmental health and exposure studies, we are only beginning to understand the complexities and uncertainties involved with the biomonitoring process -- from study design, to sample collection, to chemical analysis -...

  5. Complex disease and phenotype mapping in the domestic dog

    PubMed Central

    Hayward, Jessica J.; Castelhano, Marta G.; Oliveira, Kyle C.; Corey, Elizabeth; Balkman, Cheryl; Baxter, Tara L.; Casal, Margret L.; Center, Sharon A.; Fang, Meiying; Garrison, Susan J.; Kalla, Sara E.; Korniliev, Pavel; Kotlikoff, Michael I.; Moise, N. S.; Shannon, Laura M.; Simpson, Kenneth W.; Sutter, Nathan B.; Todhunter, Rory J.; Boyko, Adam R.

    2016-01-01

    The domestic dog is becoming an increasingly valuable model species in medical genetics, showing particular promise to advance our understanding of cancer and orthopaedic disease. Here we undertake the largest canine genome-wide association study to date, with a panel of over 4,200 dogs genotyped at 180,000 markers, to accelerate mapping efforts. For complex diseases, we identify loci significantly associated with hip dysplasia, elbow dysplasia, idiopathic epilepsy, lymphoma, mast cell tumour and granulomatous colitis; for morphological traits, we report three novel quantitative trait loci that influence body size and one that influences fur length and shedding. Using simulation studies, we show that modestly larger sample sizes and denser marker sets will be sufficient to identify most moderate- to large-effect complex disease loci. This proposed design will enable efficient mapping of canine complex diseases, most of which have human homologues, using far fewer samples than required in human studies. PMID:26795439

  6. Sampling in ecology and evolution - bridging the gap between theory and practice

    USGS Publications Warehouse

    Albert, C.H.; Yoccoz, N.G.; Edwards, T.C.; Graham, C.H.; Zimmermann, N.E.; Thuiller, W.

    2010-01-01

    Sampling is a key issue for answering most ecological and evolutionary questions. The importance of developing a rigorous sampling design tailored to specific questions has already been discussed in the ecological and sampling literature and has provided useful tools and recommendations to sample and analyse ecological data. However, sampling issues are often difficult to overcome in ecological studies due to apparent inconsistencies between theory and practice, often leading to the implementation of simplified sampling designs that suffer from unknown biases. Moreover, we believe that classical sampling principles which are based on estimation of means and variances are insufficient to fully address many ecological questions that rely on estimating relationships between a response and a set of predictor variables over time and space. Our objective is thus to highlight the importance of selecting an appropriate sampling space and an appropriate sampling design. We also emphasize the importance of using prior knowledge of the study system to estimate models or complex parameters and thus better understand ecological patterns and processes generating these patterns. Using a semi-virtual simulation study as an illustration we reveal how the selection of the space (e.g. geographic, climatic), in which the sampling is designed, influences the patterns that can be ultimately detected. We also demonstrate the inefficiency of common sampling designs to reveal response curves between ecological variables and climatic gradients. Further, we show that response-surface methodology, which has rarely been used in ecology, is much more efficient than more traditional methods. Finally, we discuss the use of prior knowledge, simulation studies and model-based designs in defining appropriate sampling designs. We conclude by a call for development of methods to unbiasedly estimate nonlinear ecologically relevant parameters, in order to make inferences while fulfilling requirements of both sampling theory and field work logistics. © 2010 The Authors.

  7. Planning Considerations for a Mars Sample Receiving Facility: Summary and Interpretation of Three Design Studies

    NASA Astrophysics Data System (ADS)

    Beaty, David W.; Allen, Carlton C.; Bass, Deborah S.; Buxbaum, Karen L.; Campbell, James K.; Lindstrom, David J.; Miller, Sylvia L.; Papanastassiou, Dimitri A.

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  8. Planning considerations for a Mars Sample Receiving Facility: summary and interpretation of three design studies.

    PubMed

    Beaty, David W; Allen, Carlton C; Bass, Deborah S; Buxbaum, Karen L; Campbell, James K; Lindstrom, David J; Miller, Sylvia L; Papanastassiou, Dimitri A

    2009-10-01

    It has been widely understood for many years that an essential component of a Mars Sample Return mission is a Sample Receiving Facility (SRF). The purpose of such a facility would be to take delivery of the flight hardware that lands on Earth, open the spacecraft and extract the sample container and samples, and conduct an agreed-upon test protocol, while ensuring strict containment and contamination control of the samples while in the SRF. Any samples that are found to be non-hazardous (or are rendered non-hazardous by sterilization) would then be transferred to long-term curation. Although the general concept of an SRF is relatively straightforward, there has been considerable discussion about implementation planning. The Mars Exploration Program carried out an analysis of the attributes of an SRF to establish its scope, including minimum size and functionality, budgetary requirements (capital cost, operating costs, cost profile), and development schedule. The approach was to arrange for three independent design studies, each led by an architectural design firm, and compare the results. While there were many design elements in common identified by each study team, there were significant differences in the way human operators were to interact with the systems. In aggregate, the design studies provided insight into the attributes of a future SRF and the complex factors to consider for future programmatic planning.

  9. Creating targeted initial populations for genetic product searches in heterogeneous markets

    NASA Astrophysics Data System (ADS)

    Foster, Garrett; Turner, Callaway; Ferguson, Scott; Donndelinger, Joseph

    2014-12-01

    Genetic searches often use randomly generated initial populations to maximize diversity and enable a thorough sampling of the design space. While many of these initial configurations perform poorly, the trade-off between population diversity and solution quality is typically acceptable for small-scale problems. Navigating complex design spaces, however, often requires computationally intelligent approaches that improve solution quality. This article draws on research advances in market-based product design and heuristic optimization to strategically construct 'targeted' initial populations. Targeted initial designs are created using respondent-level part-worths estimated from discrete choice models. These designs are then integrated into a traditional genetic search. Two case study problems of differing complexity are presented to illustrate the benefits of this approach. In both problems, targeted populations lead to computational savings and product configurations with improved market share of preferences. Future research efforts to tailor this approach and extend it towards multiple objectives are also discussed.

  10. Combined qualitative and quantitative research designs.

    PubMed

    Seymour, Jane

    2012-12-01

    Mixed methods research designs have been recognized as important in addressing complexity and are recommended particularly in the development and evaluation of complex interventions. This article reports a review of studies in palliative care published between 2010 and March 2012 that combine qualitative and quantitative approaches. A synthesis of approaches to mixed methods research taken in 28 examples of published research studies of relevance to palliative and supportive care is provided, using a typology based on a classic categorization put forward in 1992. Mixed-method studies are becoming more frequently employed in palliative care research and resonate with the complexity of the palliative care endeavour. Undertaking mixed methods research requires a sophisticated understanding of the research process and recognition of some of the underlying complexities encountered when working with different traditions and perspectives on issues of: sampling, validity, reliability and rigour, different sources of data and different data collection and analysis techniques.

  11. Classifier-Guided Sampling for Complex Energy System Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Backlund, Peter B.; Eddy, John P.

    2015-09-01

    This report documents the results of a Laboratory Directed Research and Development (LDRD) effort entitled "Classifier-Guided Sampling for Complex Energy System Optimization" that was conducted during FY 2014 and FY 2015. The goal of this project was to develop, implement, and test major improvements to the classifier-guided sampling (CGS) algorithm. CGS is a type of evolutionary algorithm for performing search and optimization over a set of discrete design variables in the face of one or more objective functions. Existing evolutionary algorithms, such as genetic algorithms, may require a large number of objective function evaluations to identify optimal or near-optimal solutions. Reducing the number of evaluations can result in significant time savings, especially if the objective function is computationally expensive. CGS reduces the evaluation count by using a Bayesian network classifier to filter out non-promising candidate designs, prior to evaluation, based on their posterior probabilities. In this project, both the single-objective and multi-objective versions of the CGS algorithm were developed and tested on a set of benchmark problems. As a domain-specific case study, CGS was used to design a microgrid for use in islanded mode during an extended bulk power grid outage.
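
    The filtering step can be illustrated with a small sketch. A Gaussian naive Bayes classifier stands in for the report's Bayesian network classifier, and the discrete design problem, mutation scheme and 0.5 posterior threshold are illustrative assumptions rather than the LDRD implementation.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

def expensive_objective(x):
    """Stand-in for a costly simulation; lower is better."""
    return float(np.sum((x - 3) ** 2))

n_vars, pop_size, n_generations = 6, 30, 10
pop = rng.integers(0, 8, size=(pop_size, n_vars)).astype(float)
scores = np.array([expensive_objective(x) for x in pop])
evaluations = pop_size

clf = GaussianNB()
for _ in range(n_generations):
    # Label the better half of evaluated designs as "promising" and train the classifier.
    labels = (scores <= np.median(scores)).astype(int)
    if labels.min() == labels.max():
        break  # scores have converged; nothing left for the classifier to separate
    clf.fit(pop, labels)

    # Generate candidate offspring by mutating current designs.
    candidates = pop[rng.integers(0, pop_size, pop_size)] + rng.integers(-1, 2, (pop_size, n_vars))
    candidates = np.clip(candidates, 0, 7)

    # Classifier-guided filter: only evaluate candidates with a high posterior
    # probability of being promising (fall back to the top few if none qualify).
    posterior = clf.predict_proba(candidates)[:, 1]
    keep = candidates[posterior > 0.5]
    if len(keep) == 0:
        keep = candidates[np.argsort(posterior)[-5:]]

    new_scores = np.array([expensive_objective(x) for x in keep])
    evaluations += len(keep)

    # Survivor selection: retain the best pop_size designs seen so far.
    pop = np.vstack([pop, keep])
    scores = np.concatenate([scores, new_scores])
    order = np.argsort(scores)[:pop_size]
    pop, scores = pop[order], scores[order]

print("best objective:", scores[0], "| objective evaluations:", evaluations)
```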

  12. Design of single phase inverter using microcontroller assisted by data processing applications software

    NASA Astrophysics Data System (ADS)

    Ismail, K.; Muharam, A.; Amin; Widodo Budi, S.

    2015-12-01

    Inverters are widely used for industrial, office, and residential purposes. The inverter supports the development of alternative energy sources such as solar cells, wind turbines and fuel cells by converting dc voltage to ac voltage. Inverters have been made with a variety of hardware and software combinations, such as the use of pure analog circuits and various types of microcontrollers as controllers. With a pure analog circuit, modification is difficult because it requires changing the entire set of hardware components. In an inverter with a microcontroller-based design (with software), the calculations that generate the AC modulation are done in the microcontroller. This increases programming complexity and the amount of code downloaded to the microcontroller chip (the flash memory capacity of the microcontroller is limited). This paper discusses the design of a single phase inverter using unipolar modulation of a sine wave and a triangular wave, with the modulation computed outside the microcontroller using a data processing software application (Microsoft Excel). The results show that programming complexity was reduced and that the sampling resolution strongly influences THD; a sampling resolution of one-half degree is required to obtain the best THD (15.8%).
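
    The offline table-generation step can be sketched as follows (here in Python rather than Excel): sample a sinusoidal reference at 0.5-degree resolution, compare it and its inverse against a triangular carrier, and export the resulting unipolar switching table for the microcontroller. The modulation index and carrier ratio are illustrative assumptions.

```python
import numpy as np

# Offline generation of a unipolar SPWM switching table at 0.5-degree resolution.
resolution_deg = 0.5
modulation_index = 0.8
carrier_per_cycle = 40            # illustrative triangle-carrier frequency ratio

theta = np.arange(0.0, 360.0, resolution_deg)             # 720 samples per fundamental cycle
reference = modulation_index * np.sin(np.radians(theta))  # sinusoidal reference

# Triangle carrier between -1 and 1; the unipolar scheme compares +ref and -ref
# against the same carrier.
phase = (theta / 360.0) * carrier_per_cycle % 1.0
carrier = 4.0 * np.abs(phase - 0.5) - 1.0

# Switching states for the two legs of the full bridge.
leg_a = (reference >= carrier).astype(int)
leg_b = (-reference >= carrier).astype(int)
output_level = leg_a - leg_b        # -1, 0, +1 unipolar output pattern

# The table (theta, leg_a, leg_b) could then be exported and loaded onto the chip.
np.savetxt("spwm_table.csv", np.column_stack([theta, leg_a, leg_b]),
           delimiter=",", header="theta_deg,leg_a,leg_b", comments="")
print({level: int(np.sum(output_level == level)) for level in (-1, 0, 1)})
```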

  13. Single-cell transcriptome conservation in cryopreserved cells and tissues.

    PubMed

    Guillaumet-Adkins, Amy; Rodríguez-Esteban, Gustavo; Mereu, Elisabetta; Mendez-Lago, Maria; Jaitin, Diego A; Villanueva, Alberto; Vidal, August; Martinez-Marti, Alex; Felip, Enriqueta; Vivancos, Ana; Keren-Shaul, Hadas; Heath, Simon; Gut, Marta; Amit, Ido; Gut, Ivo; Heyn, Holger

    2017-03-01

    A variety of single-cell RNA preparation procedures have been described. So far, protocols require fresh material, which hinders complex study designs. We describe a sample preservation method that maintains transcripts in viable single cells, allowing one to disconnect time and place of sampling from subsequent processing steps. We sequence single-cell transcriptomes from >1000 fresh and cryopreserved cells using 3'-end and full-length RNA preparation methods. Our results confirm that the conservation process did not alter transcriptional profiles. This substantially broadens the scope of applications in single-cell transcriptomics and could lead to a paradigm shift in future study designs.

  14. National Intimate Partner and Sexual Violence Survey: 2010 Findings on Victimization by Sexual Orientation

    MedlinePlus

    ... information about response and cooperation rates and other methodological details of NISVS can be found in the ... software for analyzing data collected through complex sample design. The estimated number of victims affected by a ...

  15. Alcohol Use and Binge Drinking Among Women of Childbearing Age: United States, 2011-2013

    MedlinePlus

    ... SUDAAN 11.0 accounted for the complex sampling design. Among nonpregnant women, the prevalence of any alcohol ... between the two periods are likely related to methodological changes in the BRFSS in 2011, rather than ...

  16. A review on design of experiments and surrogate models in aircraft real-time and many-query aerodynamic analyses

    NASA Astrophysics Data System (ADS)

    Yondo, Raul; Andrés, Esther; Valero, Eusebio

    2018-01-01

    Full-scale aerodynamic wind tunnel testing, numerical simulation of high-dimensional (full-order) aerodynamic models, and flight testing are some of the fundamental but complex steps in the various design phases of recent civil transport aircraft. Current aircraft aerodynamic designs have increased in complexity (multidisciplinary, multi-objective or multi-fidelity) and need to address the challenges posed by the nonlinearity of the objective functions and constraints, uncertainty quantification in aerodynamic problems, and restrained computational budgets. With the aim of reducing the computational burden and generating low-cost but accurate models that mimic the full-order models at different values of the design variables, recent progress has seen the introduction, in real-time and many-query analyses, of surrogate-based approaches as rapid and cheaper-to-simulate models. In this paper, a comprehensive and state-of-the-art survey of common surrogate modeling techniques and surrogate-based optimization methods is given, with an emphasis on model selection and validation, dimensionality reduction, sensitivity analyses, constraint handling, and infill and stopping criteria. Benefits, drawbacks and comparative discussions in applying those methods are described. Furthermore, the paper familiarizes the reader with surrogate models that have been successfully applied to the general field of fluid dynamics, but not yet in the aerospace industry. Additionally, the review revisits the most popular sampling strategies used in conducting physical and simulation-based experiments in aircraft aerodynamic design. Attractive or smart designs infrequently used in the field and discussions on advanced sampling methodologies are presented, to give a glance at the various efficient possibilities for a priori sampling of the parameter space. Closing remarks address future perspectives, challenges and shortcomings associated with the use of surrogate models by aircraft industrial aerodynamicists, despite their increased interest among the research communities.
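
    As a minimal illustration of the surrogate workflow the review surveys, the sketch below draws a Latin hypercube design of experiments, fits a radial basis function surrogate to an inexpensive stand-in for a full-order model, and reuses the surrogate for many-query evaluation. The test function, bounds and sample sizes are illustrative, not aerodynamic models from the paper.

```python
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator

def expensive_model(x):
    """Stand-in for a full-order aerodynamic analysis (two design variables)."""
    return np.sin(3 * x[:, 0]) + 0.5 * np.cos(5 * x[:, 1]) + x[:, 0] * x[:, 1]

# Design of experiments: Latin hypercube sample of the 2-D design space.
sampler = qmc.LatinHypercube(d=2, seed=0)
X_train = qmc.scale(sampler.random(n=30), l_bounds=[0.0, 0.0], u_bounds=[2.0, 3.0])
y_train = expensive_model(X_train)

# Surrogate: radial basis function interpolant fitted to the sampled responses.
surrogate = RBFInterpolator(X_train, y_train, kernel="thin_plate_spline")

# Many-query use: the cheap surrogate replaces the expensive model during exploration.
X_query = qmc.scale(sampler.random(n=1000), l_bounds=[0.0, 0.0], u_bounds=[2.0, 3.0])
error = np.mean(np.abs(surrogate(X_query) - expensive_model(X_query)))
print(f"mean absolute surrogate error over 1000 query points: {error:.4f}")
```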

  17. Pseudo-Random Sequence Modifications for Ion Mobility Orthogonal Time of Flight Mass Spectrometry

    PubMed Central

    Clowers, Brian H.; Belov, Mikhail E.; Prior, David C.; Danielson, William F.; Ibrahim, Yehia; Smith, Richard D.

    2008-01-01

    Due to the inherently low duty cycle of ion mobility spectrometry (IMS) experiments that sample from continuous ion sources, a range of experimental advances have been developed to maximize ion utilization efficiency. The use of ion trapping mechanisms prior to the ion mobility drift tube has demonstrated significant gains over discrete sampling from continuous sources; however, these technologies have traditionally relied upon signal averaging to attain analytically relevant signal-to-noise ratios (SNR). Multiplexed (MP) techniques based upon the Hadamard transform offer an alternative experimental approach by which ion utilization efficiency can be elevated to ∼ 50 %. Recently, our research group demonstrated a unique multiplexed ion mobility time-of-flight (MP-IMS-TOF) approach that incorporates ion trapping and can extend ion utilization efficiency beyond 50 %. However, the spectral reconstruction of the multiplexed signal using this experimental approach requires the use of sample-specific weighing designs. Though general weighing designs have been shown to significantly enhance ion utilization efficiency using this MP technique, such weighing designs cannot be applied to all samples. By modifying both the ion funnel trap and the pseudo-random sequence (PRS) used for the MP experiment we have eliminated the need for complex weighing matrices. For both simple and complex mixtures, SNR enhancements of up to 13 were routinely observed as compared to the SA-IMS-TOF experiment. In addition, this new class of PRS provides a twofold enhancement in ion throughput compared to the traditional HT-IMS experiment. PMID:18311942
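
    The underlying multiplexing idea can be sketched in a few lines: gate the source according to a maximal-length pseudo-random sequence, model the measurement as a cyclic convolution of the true spectrum with that sequence, and recover the spectrum by inverting the circulant encoding matrix. This is a generic simplex-type illustration, not the authors' modified PRS or their weighing-design-free reconstruction; the 7-chip sequence and the spectrum are illustrative.

```python
import numpy as np
from scipy.linalg import circulant

def mseq7():
    """3-bit Fibonacci LFSR (x^3 + x^2 + 1): 7-chip maximal-length sequence."""
    s = [1, 1, 1]
    out = []
    for _ in range(7):
        out.append(s[2])              # output bit
        fb = s[2] ^ s[1]              # feedback taps
        s = [fb, s[0], s[1]]          # shift register
    return np.array(out)

prs = mseq7()                          # gate open/closed pattern: 1 1 1 0 0 1 0
S = circulant(prs).T.astype(float)     # encoding matrix of cyclic shifts (invertible)

true_spectrum = np.array([0.0, 5.0, 0.0, 0.0, 2.0, 0.0, 0.0])   # hypothetical mobility spectrum
rng = np.random.default_rng(0)
measured = S @ true_spectrum + rng.normal(0, 0.05, 7)            # multiplexed acquisition + noise

recovered = np.linalg.solve(S, measured)   # deconvolution (inverse Hadamard-type transform)
print(np.round(recovered, 2))
```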

  18. Determination of B-complex vitamins in pharmaceutical formulations by surface-enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Junior, Benedito Roberto Alvarenga; Soares, Frederico Luis Felipe; Ardila, Jorge Armando; Durango, Luis Guillermo Cuadrado; Forim, Moacir Rossi; Carneiro, Renato Lajarim

    2018-01-01

    The aim of this work was to quantify B-complex vitamins in pharmaceutical samples by the surface-enhanced Raman spectroscopy (SERS) technique using a gold colloid substrate. Synthesis of gold nanoparticles was performed according to an adapted Turkevich method. Initial assays suggested the orientation of the molecules on the gold nanoparticle surface. A central composite design was performed to obtain the highest SERS signal for nicotinamide and riboflavin. The evaluated parameters in the experimental design were the volume of AuNPs, the concentration of vitamins and the sodium chloride concentration. The best condition for nicotinamide was NaCl 2.3 × 10⁻³ mol L⁻¹ and 700 μL of AuNP colloid, and this same condition proved adequate to quantify thiamine. The experimental design for riboflavin showed the best condition at NaCl 1.15 × 10⁻² mol L⁻¹ and 2.8 mL of AuNP colloid. It was possible to quantify thiamine and nicotinamide in the presence of other vitamins and excipients in two solid multivitamin formulations using the standard addition procedure. The standard addition curves presented R² higher than 0.96 for both nicotinamide and thiamine, at orders of magnitude of 10⁻⁷ and 10⁻⁸ mol L⁻¹, respectively. The nicotinamide content in a cosmetic gel sample was also quantified by direct analysis, presenting R² of 0.98. The Student's t-test showed no significant difference with respect to the HPLC method. Despite the experimental design performed for riboflavin, its quantification in the commercial samples was not possible.

  19. Simulating and assessing boson sampling experiments with phase-space representations

    NASA Astrophysics Data System (ADS)

    Opanchuk, Bogdan; Rosales-Zárate, Laura; Reid, Margaret D.; Drummond, Peter D.

    2018-04-01

    The search for new, application-specific quantum computers designed to outperform any classical computer is driven by the ending of Moore's law and the quantum advantages potentially obtainable. Photonic networks are promising examples, with experimental demonstrations and potential for obtaining a quantum computer to solve problems believed classically impossible. This introduces a challenge: how does one design or understand such photonic networks? One must be able to calculate observables using general methods capable of treating arbitrary inputs, dissipation, and noise. We develop complex phase-space software for simulating these photonic networks, and apply this to boson sampling experiments. Our techniques give sampling errors orders of magnitude lower than experimental correlation measurements for the same number of samples. We show that these techniques remove systematic errors in previous algorithms for estimating correlations, with large improvements in errors in some cases. In addition, we obtain a scalable channel-combination strategy for assessment of boson sampling devices.

  20. Measures of precision for dissimilarity-based multivariate analysis of ecological communities

    PubMed Central

    Anderson, Marti J; Santana-Garcon, Julia

    2015-01-01

    Ecological studies require key decisions regarding the appropriate size and number of sampling units. No methods currently exist to measure precision for multivariate assemblage data when dissimilarity-based analyses are intended to follow. Here, we propose a pseudo multivariate dissimilarity-based standard error (MultSE) as a useful quantity for assessing sample-size adequacy in studies of ecological communities. Based on sums of squared dissimilarities, MultSE measures variability in the position of the centroid in the space of a chosen dissimilarity measure under repeated sampling for a given sample size. We describe a novel double resampling method to quantify uncertainty in MultSE values with increasing sample size. For more complex designs, values of MultSE can be calculated from the pseudo residual mean square of a permanova model, with the double resampling done within appropriate cells in the design. R code functions for implementing these techniques, along with ecological examples, are provided. PMID:25438826
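
    A minimal sketch of the core quantity, assuming the pseudo multivariate variance is the sum of squared inter-sample dissimilarities divided by n(n-1) and MultSE = sqrt(V/n); the community matrix is simulated, and the paper's double resampling procedure for uncertainty is not reproduced here (the authors provide R functions for that).

```python
import numpy as np
from scipy.spatial.distance import pdist

def mult_se(data, metric="braycurtis"):
    """Pseudo multivariate standard error, assumed here as sqrt(V/n) with
    V derived from the sum of squared pairwise dissimilarities."""
    n = data.shape[0]
    d = pdist(data, metric=metric)    # all pairwise dissimilarities, i < j
    ss = np.sum(d ** 2) / n           # pseudo total sum of squares
    v = ss / (n - 1)                  # pseudo multivariate variance
    return float(np.sqrt(v / n))

# Illustrative community matrix: 20 samples x 12 species abundances.
rng = np.random.default_rng(0)
community = rng.poisson(3.0, size=(20, 12))

# How does precision improve with sample size? (one draw per n; the paper's
# double resampling scheme would put uncertainty bounds on these values)
for n in (5, 10, 20):
    print(n, round(mult_se(community[:n]), 4))
```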

  1. Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples

    PubMed Central

    2012-01-01

    Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466

  2. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    PubMed Central

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, the extrema points of the metamodel and the minimum points of a density function. In this way, progressively more accurate metamodels are constructed by the procedure above. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206

  3. Habitat Complexity Metrics to Guide Restoration of Large Rivers

    NASA Astrophysics Data System (ADS)

    Jacobson, R. B.; McElroy, B. J.; Elliott, C.; DeLonay, A.

    2011-12-01

    Restoration strategies on large, channelized rivers typically strive to recover lost habitat complexity, based on the assumption that complexity and biophysical capacity are directly related. Although definition of links between complexity and biotic responses can be tenuous, complexity metrics have appeal because of their potential utility in quantifying habitat quality, defining reference conditions and design criteria, and measuring restoration progress. Hydroacoustic instruments provide many ways to measure complexity on large rivers, yet substantive questions remain about the variables and scale of complexity that are meaningful to biota, and how complexity can be measured and monitored cost effectively. We explore these issues on the Missouri River, using the example of channel re-engineering projects that are intended to aid in recovery of the pallid sturgeon, an endangered benthic fish. We are refining understanding of what habitat complexity means for adult fish by combining hydroacoustic habitat assessments with acoustic telemetry to map locations during reproductive migrations and spawning. These data indicate that migrating sturgeon select points with relatively low velocity but adjacent to areas of high velocity (that is, with high velocity gradients); the integration of points defines pathways which minimize energy expenditures during upstream migrations of tens to hundreds of kilometres. Complexity metrics that efficiently quantify migration potential at the reach scale are therefore directly relevant to channel restoration strategies. We are also exploring complexity as it relates to larval sturgeon dispersal. Larvae may drift for as many as 17 days (hundreds of kilometres at mean velocities) before using up their yolk sac, after which they "settle" into habitats where they initiate feeding. An assumption underlying channel re-engineering is that additional channel complexity, specifically increased shallow, slow water, is necessary for early feeding and refugia. Development of complexity metrics is complicated by the fact that characteristics of channel morphology may increase complexity scores without necessarily increasing biophysical capacity for target species. For example, a cross section that samples depths and velocities across the thalweg (navigation channel) and into lentic habitat may score high on most measures of hydraulic or geomorphic complexity, but does not necessarily provide habitats beneficial to native species. Complexity measures need to be bounded by best estimates of native species requirements. In the absence of specific information, creation of habitat complexity for the sake of complexity may lead to unintended consequences, for example, lentic habitats that increase a complexity score but support invasive species. An additional practical constraint on complexity measures is the need to develop metrics that can be deployed cost-effectively in an operational monitoring program. Design of a monitoring program requires informed choices of measurement variables, definition of reference sites, and design of sampling effort to capture spatial and temporal variability.

  4. Object oriented development of engineering software using CLIPS

    NASA Technical Reports Server (NTRS)

    Yoon, C. John

    1991-01-01

    Engineering applications involve numeric complexity and manipulations of a large amount of data. Traditionally, numeric computation has been the concern in developing engineering software. As engineering application software became larger and more complex, management of resources such as data, rather than numeric complexity, has become the major software design problem. Object-oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of object-oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The CLIPS Object Oriented Language's (COOL) object-oriented features are more versatile than C++'s. A software design methodology based on object-oriented and procedural approaches, appropriate for engineering software and to be implemented in CLIPS, was outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.

  5. Dietary Supplement Use Among U.S. Adults Has Increased Since NHANES III (1988-1994)

    MedlinePlus

    ... uses a complex, stratified, multistage probability cluster sampling design and oversamples in order to increase precision in estimates for certain groups. NHANES III was one in a series of periodic surveys conducted in two cycles during ...

  6. University Learning Systems for Participative Courses.

    ERIC Educational Resources Information Center

    Billingham, Carol J.; Harper, William W.

    1980-01-01

    Describes the instructional development of a course for advanced finance students on the use of data files and/or databases for solving complex finance problems. Areas covered include course goals and the design. The course class schedule and sample learning assessment assignments are provided. (JD)

  7. Improving the evidence base in palliative care to inform practice and policy: thinking outside the box.

    PubMed

    Aoun, Samar M; Nekolaichuk, Cheryl

    2014-12-01

    The adoption of evidence-based hierarchies and research methods from other disciplines may not completely translate to complex palliative care settings. The heterogeneity of the palliative care population, complexity of clinical presentations, and fluctuating health states present significant research challenges. The aim of this narrative review was to explore the debate about the use of current evidence-based approaches for conducting research, such as randomized controlled trials and other study designs, in palliative care, and more specifically to (1) describe key myths about palliative care research; (2) highlight substantive challenges of conducting palliative care research, using case illustrations; and (3) propose specific strategies to address some of these challenges. Myths about research in palliative care revolve around evidence hierarchies, sample heterogeneity, random assignment, participant burden, and measurement issues. Challenges arise because of the complex physical, psychological, existential, and spiritual problems faced by patients, families, and service providers. These challenges can be organized according to six general domains: patient, system/organization, context/setting, study design, research team, and ethics. A number of approaches for dealing with challenges in conducting research fall into five separate domains: study design, sampling, conceptual, statistical, and measures and outcomes. Although randomized controlled trials have their place whenever possible, alternative designs may offer more feasible research protocols that can be successfully implemented in palliative care. Therefore, this article highlights "outside the box" approaches that would benefit both clinicians and researchers in the palliative care field. Ultimately, the selection of research designs is dependent on a clearly articulated research question, which drives the research process. Copyright © 2014 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.

  8. How Much Confidence Can We Have in EU-SILC? Complex Sample Designs and the Standard Error of the Europe 2020 Poverty Indicators

    ERIC Educational Resources Information Center

    Goedeme, Tim

    2013-01-01

    If estimates are based on samples, they should be accompanied by appropriate standard errors and confidence intervals. This is true for scientific research in general, and is even more important if estimates are used to inform and evaluate policy measures such as those aimed at attaining the Europe 2020 poverty reduction target. In this article I…

  9. Detection of E. coli O157:H7 in complex matrices under varying flow parameters with a robotic fluorometric assay system

    NASA Astrophysics Data System (ADS)

    Leskinen, Stephaney D.; Schlemmer, Sarah M.; Kearns, Elizabeth A.; Lim, Daniel V.

    2009-02-01

    The development of rapid assays for detection of microbial pathogens in complex matrices is needed to protect public health due to continued outbreaks of disease from contaminated foods and water. An Escherichia coli O157:H7 detection assay was designed using a robotic, fluorometric assay system. The system integrates optics, fluidics, robotics and software for the detection of foodborne pathogens or toxins in as many as four samples simultaneously. It utilizes disposable fiber optic waveguides coated with biotinylated antibodies for capture of target analytes from complex sample matrices. Computer-controlled rotation of sample cups allows complete contact between the sample and the waveguide. Detection occurs via binding of a fluorophore-labeled antibody to the captured target, which leads to an increase in the fluorescence signal. Assays are completed within twenty-five minutes. Sample matrices included buffer, retentate (material recovered from the filter of the Automated Concentration System (ACS) following hollow fiber ultrafiltration), spinach wash and ground beef. The matrices were spiked with E. coli O157:H7 (10³-10⁵ cells/ml) and the limits of detection were determined. The effect of sample rotation on assay sensitivity was also examined. Rotation parameters for each sample matrix included 10 ml with rotation, 5 ml with rotation and 0.1 ml without rotation. Detection occurred at 10⁴ cells/ml in buffer and spinach wash and at 10⁵ cells/ml in retentate and ground beef. Detection was greater for rotated samples in each matrix except ground beef. Enhanced detection of E. coli from large, rotated volumes of complex matrices was confirmed.

  10. Exposure enriched outcome dependent designs for longitudinal studies of gene-environment interaction.

    PubMed

    Sun, Zhichao; Mukherjee, Bhramar; Estes, Jason P; Vokonas, Pantel S; Park, Sung Kyun

    2017-08-15

    Joint effects of genetic and environmental factors have been increasingly recognized in the development of many complex human diseases. Despite the popularity of case-control and case-only designs, longitudinal cohort studies that can capture time-varying outcome and exposure information have long been recommended for gene-environment (G × E) interactions. To date, literature on sampling designs for longitudinal studies of G × E interaction is quite limited. We therefore consider designs that can prioritize a subsample of the existing cohort for retrospective genotyping on the basis of currently available outcome, exposure, and covariate data. In this work, we propose stratified sampling based on summaries of individual exposures and outcome trajectories and develop a full conditional likelihood approach for estimation that adjusts for the biased sample. We compare the performance of our proposed design and analysis with combinations of different sampling designs and estimation approaches via simulation. We observe that the full conditional likelihood provides improved estimates for the G × E interaction and joint exposure effects over uncorrected complete-case analysis, and the exposure enriched outcome trajectory dependent design outperforms other designs in terms of estimation efficiency and power for detection of the G × E interaction. We also illustrate our design and analysis using data from the Normative Aging Study, an ongoing longitudinal cohort study initiated by the Veterans Administration in 1963. Copyright © 2017 John Wiley & Sons, Ltd.

  11. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    NASA Astrophysics Data System (ADS)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the improved VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of the high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the already acquired information on the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to demonstrate the approximation capability of the proposed approach, as well as three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
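
    In the same spirit (though not the paper's ASM-IHK algorithm), a variable-fidelity surrogate can be sketched by scaling a cheap low-fidelity model through a fitted polynomial trend and modelling the remaining discrepancy with a Gaussian process; the one-dimensional test functions and sample locations below are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def high_fidelity(x):
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)          # expensive "truth" (illustrative)

def low_fidelity(x):
    return 0.5 * high_fidelity(x) + 10 * (x - 0.5) - 5    # cheap, biased approximation

# A few expensive high-fidelity samples; the low-fidelity model is cheap everywhere.
X_hf = np.array([[0.0], [0.3], [0.6], [0.9], [1.0]])
y_hf = high_fidelity(X_hf).ravel()

# Polynomial scaling of the low-fidelity prediction (a linear trend in y_lf here),
# followed by a Gaussian process on the residual discrepancy.
y_lf_at_hf = low_fidelity(X_hf).ravel()
coeffs = np.polyfit(y_lf_at_hf, y_hf, deg=1)
residual = y_hf - np.polyval(coeffs, y_lf_at_hf)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-8).fit(X_hf, residual)

def vf_model(x):
    """Variable-fidelity prediction: scaled low-fidelity trend + GP discrepancy."""
    return np.polyval(coeffs, low_fidelity(x).ravel()) + gp.predict(x)

X_test = np.linspace(0, 1, 5).reshape(-1, 1)
print(np.round(vf_model(X_test) - high_fidelity(X_test).ravel(), 3))
```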

  12. An enhanced cluster analysis program with bootstrap significance testing for ecological community analysis

    USGS Publications Warehouse

    McKenna, J.E.

    2003-01-01

    The biosphere is filled with complex living patterns and important questions about biodiversity and community and ecosystem ecology are concerned with structure and function of multispecies systems that are responsible for those patterns. Cluster analysis identifies discrete groups within multivariate data and is an effective method of coping with these complexities, but often suffers from subjective identification of groups. The bootstrap testing method greatly improves objective significance determination for cluster analysis. The BOOTCLUS program makes cluster analysis that reliably identifies real patterns within a data set more accessible and easier to use than previously available programs. A variety of analysis options and rapid re-analysis provide a means to quickly evaluate several aspects of a data set. Interpretation is influenced by sampling design and a priori designation of samples into replicate groups, and ultimately relies on the researcher's knowledge of the organisms and their environment. However, the BOOTCLUS program provides reliable, objectively determined groupings of multivariate data.

  13. FORGE Milford Triaxial Test Data and Summary from EGI labs

    DOE Data Explorer

    Joe Moore

    2016-03-01

    Six samples were evaluated in unconfined and triaxial compression; their data are included in separate Excel spreadsheets and summarized in the Word document. Three samples were plugged along the axis of the core (presumed to be nominally vertical) and three samples were plugged perpendicular to the axis of the core. A designation of "V" indicates vertical, i.e., the long axis of the plugged sample is aligned with the axis of the core. Similarly, "H" indicates a sample that is nominally horizontal and cut orthogonal to the axis of the core. Stress-strain curves were made before and after the testing, and are included in the Word document. The confining pressure for this test was 2800 psi. A series of tests is being carried out to define a failure envelope, to provide representative hydraulic fracture design parameters, and for future geomechanical assessments. The samples are from well 52-21, which reaches a maximum depth of 3581 ft +/- 2 ft into a gneiss complex.

  14. Selective and Sensitive Fluorescent Detection of Picric Acid by New Pyrene and Anthracene Based Copper Complexes.

    PubMed

    Reddy, Kumbam Lingeshwar; Kumar, Anabathula Manoj; Dhir, Abhimanew; Krishnan, Venkata

    2016-11-01

    New pyrene- and anthracene-based copper complexes 4 and 7, respectively, were designed, synthesized and characterized. The fluorescence behaviour of both 4 and 7 was evaluated towards nitroaromatics and anions. Both 4 and 7 possess high selectivity for the detection of the well-known explosive picric acid (PA), showing the strongest fluorescence response towards it. Furthermore, complex 4 showed similar sensing efficiency towards PA over different pH ranges. It was also used for real-world applications, as illustrated by the very fast detection of PA from soil samples observed directly by the naked eye.

  15. RosettaAntibodyDesign (RAbD): A general framework for computational antibody design

    PubMed Central

    Adolf-Bryfogle, Jared; Kalyuzhniy, Oleks; Kubitz, Michael; Hu, Xiaozhen; Adachi, Yumiko; Schief, William R.

    2018-01-01

    A structural-bioinformatics-based computational methodology and framework have been developed for the design of antibodies to targets of interest. RosettaAntibodyDesign (RAbD) samples the diverse sequence, structure, and binding space of an antibody to an antigen in highly customizable protocols for the design of antibodies in a broad range of applications. The program samples antibody sequences and structures by grafting structures from a widely accepted set of the canonical clusters of CDRs (North et al., J. Mol. Biol., 406:228–256, 2011). It then performs sequence design according to amino acid sequence profiles of each cluster, and samples CDR backbones using a flexible-backbone design protocol incorporating cluster-based CDR constraints. Starting from an existing experimental or computationally modeled antigen-antibody structure, RAbD can be used to redesign a single CDR or multiple CDRs with loops of different length, conformation, and sequence. We rigorously benchmarked RAbD on a set of 60 diverse antibody–antigen complexes, using two design strategies—optimizing total Rosetta energy and optimizing interface energy alone. We utilized two novel metrics for measuring success in computational protein design. The design risk ratio (DRR) is equal to the frequency of recovery of native CDR lengths and clusters divided by the frequency of sampling of those features during the Monte Carlo design procedure. Ratios greater than 1.0 indicate that the design process is picking out the native more frequently than expected from their sampled rate. We achieved DRRs for the non-H3 CDRs of between 2.4 and 4.0. The antigen risk ratio (ARR) is the ratio of frequencies of the native amino acid types, CDR lengths, and clusters in the output decoys for simulations performed in the presence and absence of the antigen. For CDRs, we achieved cluster ARRs as high as 2.5 for L1 and 1.5 for H2. For sequence design simulations without CDR grafting, the overall recovery for the native amino acid types for residues that contact the antigen in the native structures was 72% in simulations performed in the presence of the antigen and 48% in simulations performed without the antigen, for an ARR of 1.5. For the non-contacting residues, the ARR was 1.08. This shows that the sequence profiles are able to maintain the amino acid types of these conserved, buried sites, while recovery of the exposed, contacting residues requires the presence of the antigen-antibody interface. We tested RAbD experimentally on both a lambda and kappa antibody–antigen complex, successfully improving their affinities 10 to 50 fold by replacing individual CDRs of the native antibody with new CDR lengths and clusters. PMID:29702641
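
    Both metrics reduce to simple frequency ratios, as the short sketch below illustrates with hypothetical cluster labels and the sequence-recovery frequencies quoted in the abstract.

```python
from collections import Counter

def design_risk_ratio(recovered_clusters, sampled_clusters, native_cluster):
    """DRR: frequency of the native CDR cluster among accepted designs divided by
    its frequency among all clusters sampled during the Monte Carlo design run."""
    f_recovered = Counter(recovered_clusters)[native_cluster] / len(recovered_clusters)
    f_sampled = Counter(sampled_clusters)[native_cluster] / len(sampled_clusters)
    return f_recovered / f_sampled

def antigen_risk_ratio(freq_with_antigen, freq_without_antigen):
    """ARR: frequency of a native feature in decoys designed with the antigen present
    divided by its frequency when the antigen is absent."""
    return freq_with_antigen / freq_without_antigen

# Hypothetical cluster labels from a design simulation (illustrative values only).
sampled = ["L1-11-1"] * 20 + ["L1-11-2"] * 50 + ["L1-12-1"] * 30
accepted = ["L1-11-1"] * 45 + ["L1-11-2"] * 40 + ["L1-12-1"] * 15
print(design_risk_ratio(accepted, sampled, "L1-11-1"))   # > 1.0 means the native is enriched
print(antigen_risk_ratio(0.72, 0.48))                    # sequence-recovery figures from the abstract: 1.5
```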

  16. RosettaAntibodyDesign (RAbD): A general framework for computational antibody design.

    PubMed

    Adolf-Bryfogle, Jared; Kalyuzhniy, Oleks; Kubitz, Michael; Weitzner, Brian D; Hu, Xiaozhen; Adachi, Yumiko; Schief, William R; Dunbrack, Roland L

    2018-04-01

    A structural-bioinformatics-based computational methodology and framework have been developed for the design of antibodies to targets of interest. RosettaAntibodyDesign (RAbD) samples the diverse sequence, structure, and binding space of an antibody to an antigen in highly customizable protocols for the design of antibodies in a broad range of applications. The program samples antibody sequences and structures by grafting structures from a widely accepted set of the canonical clusters of CDRs (North et al., J. Mol. Biol., 406:228-256, 2011). It then performs sequence design according to amino acid sequence profiles of each cluster, and samples CDR backbones using a flexible-backbone design protocol incorporating cluster-based CDR constraints. Starting from an existing experimental or computationally modeled antigen-antibody structure, RAbD can be used to redesign a single CDR or multiple CDRs with loops of different length, conformation, and sequence. We rigorously benchmarked RAbD on a set of 60 diverse antibody-antigen complexes, using two design strategies-optimizing total Rosetta energy and optimizing interface energy alone. We utilized two novel metrics for measuring success in computational protein design. The design risk ratio (DRR) is equal to the frequency of recovery of native CDR lengths and clusters divided by the frequency of sampling of those features during the Monte Carlo design procedure. Ratios greater than 1.0 indicate that the design process is picking out the native more frequently than expected from their sampled rate. We achieved DRRs for the non-H3 CDRs of between 2.4 and 4.0. The antigen risk ratio (ARR) is the ratio of frequencies of the native amino acid types, CDR lengths, and clusters in the output decoys for simulations performed in the presence and absence of the antigen. For CDRs, we achieved cluster ARRs as high as 2.5 for L1 and 1.5 for H2. For sequence design simulations without CDR grafting, the overall recovery for the native amino acid types for residues that contact the antigen in the native structures was 72% in simulations performed in the presence of the antigen and 48% in simulations performed without the antigen, for an ARR of 1.5. For the non-contacting residues, the ARR was 1.08. This shows that the sequence profiles are able to maintain the amino acid types of these conserved, buried sites, while recovery of the exposed, contacting residues requires the presence of the antigen-antibody interface. We tested RAbD experimentally on both a lambda and kappa antibody-antigen complex, successfully improving their affinities 10 to 50 fold by replacing individual CDRs of the native antibody with new CDR lengths and clusters.

  17. Propensity Scores in Pharmacoepidemiology: Beyond the Horizon.

    PubMed

    Jackson, John W; Schmid, Ian; Stuart, Elizabeth A

    2017-12-01

    Propensity score methods have become commonplace in pharmacoepidemiology over the past decade. Their adoption has confronted formidable obstacles that arise from pharmacoepidemiology's reliance on large healthcare databases of considerable heterogeneity and complexity. These include identifying clinically meaningful samples, defining treatment comparisons, and measuring covariates in ways that respect sound epidemiologic study design. Additional complexities involve correctly modeling treatment decisions in the face of variation in healthcare practice, and dealing with missing information and unmeasured confounding. In this review, we examine the application of propensity score methods in pharmacoepidemiology with particular attention to these and other issues, with an eye towards standards of practice, recent methodological advances, and opportunities for future progress. Propensity score methods have matured in ways that can advance comparative effectiveness and safety research in pharmacoepidemiology. These include natural extensions for categorical treatments, matching algorithms that can optimize sample size given design constraints, weighting estimators that asymptotically target matched and overlap samples, and the incorporation of machine learning to aid in covariate selection and model building. These recent and encouraging advances should be further evaluated through simulation and empirical studies, but nonetheless represent a bright path ahead for the observational study of treatment benefits and harms.
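
    One of the weighting estimators alluded to above can be sketched as follows: estimate propensity scores by logistic regression, then apply overlap weights (treated subjects weighted by 1 - e(x), controls by e(x)) so that the estimate targets the population in clinical equipoise. The simulated covariates, treatment model and outcome are illustrative assumptions, not claims data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Simulated claims-style covariates, treatment assignment, and outcome.
X = rng.normal(size=(n, 4))
p_treat = 1 / (1 + np.exp(-(0.4 * X[:, 0] - 0.6 * X[:, 1])))
treated = rng.binomial(1, p_treat)
outcome = 1.0 * treated + X[:, 0] + rng.normal(size=n)   # true treatment effect = 1.0

# Estimate propensity scores from the covariates.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Overlap weights: treated weighted by 1 - e(x), controls by e(x).
w = np.where(treated == 1, 1 - ps, ps)

# Weighted difference in mean outcomes as the effect estimate.
mu1 = np.average(outcome[treated == 1], weights=w[treated == 1])
mu0 = np.average(outcome[treated == 0], weights=w[treated == 0])
print(f"overlap-weighted effect estimate: {mu1 - mu0:.3f}")
```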

  18. Rapid production of hollow SS316 profiles by extrusion based additive manufacturing

    NASA Astrophysics Data System (ADS)

    Rane, Kedarnath; Cataldo, Salvatore; Parenti, Paolo; Sbaglia, Luca; Mussi, Valerio; Annoni, Massimiliano; Giberti, Hermes; Strano, Matteo

    2018-05-01

    Complex shaped stainless steel tubes are often required for special-purpose biomedical equipment. Nevertheless, traditional manufacturing technologies, such as extrusion, lack the ability to compete in a market of customized complex components because of the associated expense of tooling and extrusion presses. To rapidly manufacture a small number of such components with low cost and high precision, a new Extrusion based Additive Manufacturing (EAM) process is proposed in this paper, and as an example, short stainless steel 316L tubes with complex shapes and sections were prepared by EAM. Several sample parts were produced using this process; the dimensional stability, surface roughness and chemical composition of the sintered samples were investigated to prove process competence. The results indicate that a feedstock with a 316L particle content of 92.5 wt.% can be prepared with sigma-blade mixing, and its rheological behavior is fit for EAM. The green samples have sufficient strength to be handled during subsequent treatments. The sintered samples shrank considerably toward the designed dimensions and have a homogeneous microstructure that imparts mechanical strength. Achieving the dimensional accuracy and chemical composition required for biomedical equipment still needs iteration, and a kinematic correction and a modification of the debinding cycle were proposed.

  19. Structural Equation Modeling of School Violence Data: Methodological Considerations

    ERIC Educational Resources Information Center

    Mayer, Matthew J.

    2004-01-01

    Methodological challenges associated with structural equation modeling (SEM) and structured means modeling (SMM) in research on school violence and related topics in the social and behavioral sciences are examined. Problems associated with multiyear implementations of large-scale surveys are discussed. Complex sample designs, part of any…

  20. Contamination source review for Building E3162, Edgewood Area, Aberdeen Proving Ground, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, G.A.; Draugelis, A.K.; Rueda, J.

    1995-09-01

    This report was prepared by Argonne National Laboratory (ANL) to document the results of a contamination source review for Building E3162 at the Aberdeen Proving Ground (APG) in Maryland. The report may be used to assist the US Army in planning for the future use or disposition of this building. The review included a historical records search, physical inspection, photographic documentation, geophysical investigation, and collection of air samples. The field investigations were performed by ANL during 1994 and 1995. Building E3162 (APG designation) is part of the Medical Research Laboratories Building E3160 Complex. This research laboratory complex is located west of Kings Creek, east of the airfield and Ricketts Point Road, and south of Kings Creek Road in the Edgewood Area of APG. The original structures in the E3160 Complex were constructed during World War 2. The complex was originally used as a medical research laboratory. Much of the research involved wound assessment involving chemical warfare agents. Building E3162 was used as a holding and study area for animals involved in non-agent burns. The building was constructed in 1952, placed on inactive status in 1983, and remains unoccupied. Analytical results from these air samples revealed no distinguishable difference in hydrocarbon and chlorinated solvent levels between the two background samples and the sample taken inside Building E3162.

  1. Performance of some biotic indices in the real variable world: a case study at different spatial scales in North-Western Mediterranean Sea.

    PubMed

    Tataranni, Mariella; Lardicci, Claudio

    2010-01-01

    The aim of this study was to analyse the variability of four different benthic biotic indices (AMBI, BENTIX, H', M-AMBI) in two marine coastal areas of the North-Western Mediterranean Sea. In each coastal area, 36 replicates were randomly selected according to a hierarchical sampling design, which allowed estimating the variance components of the indices associated with four different spatial scales (ranging from metres to kilometres). All the analyses were performed at two different sampling periods in order to evaluate whether the observed trends were consistent over time. The variance components of the four indices revealed complex trends and different patterns in the two sampling periods. These results highlighted that, independently of the employed index, a rigorous and appropriate sampling design taking into account different scales should always be used in order to avoid erroneous classifications and to develop effective monitoring programs.

  2. Flexible sample environment for high resolution neutron imaging at high temperatures in controlled atmosphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makowska, Małgorzata G.; Theil Kuhn, Luise; Cleemann, Lars N.

    The high material penetration by neutrons allows for experiments using sophisticated sample environments providing complex conditions. Thus, neutron imaging holds potential for performing in situ nondestructive measurements on large samples or even full technological systems, which are not possible with any other technique. Our paper presents a new sample environment for in situ high resolution neutron imaging experiments at temperatures from room temperature up to 1100 degrees C and/or using controllable flow of reactive atmospheres. The design also offers the possibility to directly combine imaging with diffraction measurements. Design, special features, and specification of the furnace are described. In addition, examples of experiments successfully performed at various neutron facilities with the furnace, as well as examples of possible applications, are presented. Our work covers a broad field of research from fundamental to technological investigations of various types of materials and components.

  3. Flexible sample environment for high resolution neutron imaging at high temperatures in controlled atmosphere

    DOE PAGES

    Makowska, Małgorzata G.; Theil Kuhn, Luise; Cleemann, Lars N.; ...

    2015-12-17

    The high material penetration by neutrons allows for experiments using sophisticated sample environments providing complex conditions. Thus, neutron imaging holds potential for performing in situ nondestructive measurements on large samples or even full technological systems, which are not possible with any other technique. Our paper presents a new sample environment for in situ high resolution neutron imaging experiments at temperatures from room temperature up to 1100 degrees C and/or using controllable flow of reactive atmospheres. The design also offers the possibility to directly combine imaging with diffraction measurements. Design, special features, and specification of the furnace are described. In addition, examples of experiments successfully performed at various neutron facilities with the furnace, as well as examples of possible applications, are presented. Our work covers a broad field of research from fundamental to technological investigations of various types of materials and components.

  4. Determination of B-complex vitamins in pharmaceutical formulations by surface-enhanced Raman spectroscopy.

    PubMed

    Junior, Benedito Roberto Alvarenga; Soares, Frederico Luis Felipe; Ardila, Jorge Armando; Durango, Luis Guillermo Cuadrado; Forim, Moacir Rossi; Carneiro, Renato Lajarim

    2018-01-05

    The aim of this work was to quantify B-complex vitamins in pharmaceutical samples by the surface-enhanced Raman spectroscopy technique using a gold colloid substrate. Synthesis of gold nanoparticles was performed according to an adapted Turkevich method. Initial assays suggested the orientation of the molecules on the gold nanoparticle surface. A central composite design was performed to obtain the highest SERS signal for nicotinamide and riboflavin. The evaluated parameters in the experimental design were the volume of AuNPs, the concentration of vitamins and the sodium chloride concentration. The best condition for nicotinamide was NaCl 2.3 × 10⁻³ mol L⁻¹ and 700 μL of AuNP colloid, and this same condition proved adequate to quantify thiamine. The experimental design for riboflavin showed the best condition at NaCl 1.15 × 10⁻² mol L⁻¹ and 2.8 mL of AuNP colloid. It was possible to quantify thiamine and nicotinamide in the presence of other vitamins and excipients in two solid multivitamin formulations using the standard addition procedure. The standard addition curves presented R² higher than 0.96 for both nicotinamide and thiamine, at orders of magnitude of 10⁻⁷ and 10⁻⁸ mol L⁻¹, respectively. The nicotinamide content in a cosmetic gel sample was also quantified by direct analysis, presenting R² of 0.98. The Student's t-test showed no significant difference with respect to the HPLC method. Despite the experimental design performed for riboflavin, its quantification in the commercial samples was not possible. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Using extreme phenotype sampling to identify the rare causal variants of quantitative traits in association studies.

    PubMed

    Li, Dalin; Lewinger, Juan Pablo; Gauderman, William J; Murcray, Cassandra Elizabeth; Conti, David

    2011-12-01

    Variants identified in recent genome-wide association studies based on the common-disease common-variant hypothesis are far from fully explaining the heritability of complex traits. Rare variants may, in part, explain some of the missing heritability. Here, we explored the advantage of extreme phenotype sampling in rare-variant analysis and refined this design framework for future large-scale association studies on quantitative traits. We first proposed a power calculation approach for a likelihood-based analysis method. We then used this approach to demonstrate the potential advantages of extreme phenotype sampling for rare variants. Next, we discussed how this design can influence future sequencing-based association studies from a cost-efficiency (with the phenotyping cost included) perspective. Moreover, we discussed the potential of a two-stage design with the extreme sample as the first stage and the remaining nonextreme subjects as the second stage. We demonstrated that this two-stage design is a cost-efficient alternative to the one-stage cross-sectional design or traditional two-stage design. We then discussed the analysis strategies for this extreme two-stage design and proposed a corresponding design optimization procedure. To address many practical concerns, for example measurement error or phenotypic heterogeneity at the very extremes, we examined an approach in which individuals with very extreme phenotypes are discarded. We demonstrated that even with a substantial proportion of these extreme individuals discarded, an extreme-based sampling can still be more efficient. Finally, we expanded the current analysis and design framework to accommodate the CMC approach where multiple rare-variants in the same gene region are analyzed jointly. © 2011 Wiley Periodicals, Inc.
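
    The sampling scheme itself is simple to sketch: rank the cohort by the quantitative trait and select equal numbers from the two tails for sequencing, optionally discarding the most extreme observations first to guard against measurement error or phenotypic heterogeneity. The cohort size, budget and trimming fraction below are illustrative; the paper's likelihood-based power calculations are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cohort, budget = 10000, 1000     # cohort size and number of subjects we can afford to sequence

phenotype = rng.normal(size=n_cohort)          # quantitative trait measured on everyone

# Extreme phenotype sampling: take equal numbers from the upper and lower tails.
order = np.argsort(phenotype)
lower_tail = order[: budget // 2]
upper_tail = order[-budget // 2:]
selected = np.concatenate([lower_tail, upper_tail])

# Optional guard against measurement error at the very extremes: discard the most
# extreme 1% on each side before selecting (an approach the paper also examines).
trim = int(0.01 * n_cohort)
trimmed_order = order[trim: n_cohort - trim]
selected_trimmed = np.concatenate([trimmed_order[: budget // 2], trimmed_order[-budget // 2:]])

print(len(selected), len(selected_trimmed))
```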

  6. Using Extreme Phenotype Sampling to Identify the Rare Causal Variants of Quantitative Traits in Association Studies

    PubMed Central

    Li, Dalin; Lewinger, Juan Pablo; Gauderman, William J.; Murcray, Cassandra Elizabeth; Conti, David

    2014-01-01

    Variants identified in recent genome-wide association studies based on the common-disease common-variant hypothesis are far from fully explaining the heritability of complex traits. Rare variants may, in part, explain some of the missing heritability. Here, we explored the advantage of extreme phenotype sampling in rare-variant analysis and refined this design framework for future large-scale association studies on quantitative traits. We first proposed a power calculation approach for a likelihood-based analysis method. We then used this approach to demonstrate the potential advantages of extreme phenotype sampling for rare variants. Next, we discussed how this design can influence future sequencing-based association studies from a cost-efficiency (with the phenotyping cost included) perspective. Moreover, we discussed the potential of a two-stage design with the extreme sample as the first stage and the remaining nonextreme subjects as the second stage. We demonstrated that this two-stage design is a cost-efficient alternative to the one-stage cross-sectional design or traditional two-stage design. We then discussed the analysis strategies for this extreme two-stage design and proposed a corresponding design optimization procedure. To address many practical concerns, for example measurement error or phenotypic heterogeneity at the very extremes, we examined an approach in which individuals with very extreme phenotypes are discarded. We demonstrated that even with a substantial proportion of these extreme individuals discarded, an extreme-based sampling can still be more efficient. Finally, we expanded the current analysis and design framework to accommodate the CMC approach where multiple rare-variants in the same gene region are analyzed jointly. PMID:21922541

  7. Application of quality by design concept to develop a dual gradient elution stability-indicating method for cloxacillin forced degradation studies using combined mixture-process variable models.

    PubMed

    Zhang, Xia; Hu, Changqin

    2017-09-08

    Penicillins are typical of complex ionic samples which likely contain a large number of degradation-related impurities (DRIs) with different polarities and charge properties. It is often a challenge to develop selective and robust high performance liquid chromatography (HPLC) methods for the efficient separation of all DRIs. In this study, an analytical quality by design (AQbD) approach was proposed for stability-indicating method development of cloxacillin. The structures and the retention and UV characteristics of penicillins and their impurities were summarized and served as useful prior knowledge. Through quality risk assessment and a screening design, 3 critical process parameters (CPPs) were defined, including 2 mixture variables (MVs) and 1 process variable (PV). A combined mixture-process variable (MPV) design was conducted to evaluate the 3 CPPs simultaneously, and a response surface methodology (RSM) was used to find the optimal experimental parameters. A dual gradient elution was performed to change buffer pH, mobile-phase type and strength simultaneously. The design spaces (DSs) were evaluated using Monte Carlo simulation to estimate their probability of meeting the specifications of the CQAs. A Plackett-Burman design was performed to test the robustness around the working points and to decide the normal operating ranges (NORs). Finally, validation was performed following International Conference on Harmonisation (ICH) guidelines. To our knowledge, this is the first study to use an MPV design and dual gradient elution to develop HPLC methods and improve separations for complex ionic samples. Copyright © 2017 Elsevier B.V. All rights reserved.
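
    The Monte Carlo evaluation step can be sketched as follows: sample the CPPs uniformly over a candidate operating region, predict the CQA with a fitted response surface model, and report the probability of meeting the specification. The polynomial coefficients, CPP ranges and resolution specification below are hypothetical, not the cloxacillin models from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def predicted_resolution(ph, organic_frac, gradient_time):
    """Hypothetical response surface model for a critical quality attribute
    (e.g., critical-pair resolution) as a function of three CPPs."""
    return (1.8 + 0.6 * ph - 0.04 * ph ** 2
            + 2.5 * organic_frac - 3.0 * organic_frac ** 2
            + 0.01 * gradient_time + 0.2 * ph * organic_frac)

def prob_meeting_spec(ph_range, org_range, time_range, spec=1.5, n=20000):
    """Monte Carlo estimate of the probability that the CQA meets its specification
    when the CPPs vary uniformly over the candidate operating ranges."""
    ph = rng.uniform(*ph_range, n)
    org = rng.uniform(*org_range, n)
    gt = rng.uniform(*time_range, n)
    return float(np.mean(predicted_resolution(ph, org, gt) >= spec))

# Candidate working region (illustrative ranges): accept it as part of the design
# space only if the probability of meeting the specification is high enough.
print(prob_meeting_spec(ph_range=(4.5, 5.5), org_range=(0.25, 0.35), time_range=(20, 30)))
```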

  8. Epiverta Dieke (Coleoptera: Coccinellidae: Epilachnini): A Complex of Species, Not a Monotypic Genus

    PubMed Central

    Huo, Lizhi; Szawaryn, Karol; Wang, Xingmin

    2017-01-01

    Rich sampling and modern research techniques, including SEM, revealed that the rarely collected epilachnine species Epiverta chelonia is a complex of four closely related species: E. chelonia (Mader, 1933), E. albopilosa, E. angusta, and E. supinata spp. nov. All Epiverta species are described and illustrated, and a key to the species and a distribution map are provided. The lectotype of Solanophila chelonia Mader, 1933 is designated and its type locality delimited to Yunnan Province, Deqin County (China). PMID:28931156

  9. Robust Optimization Design for Turbine Blade-Tip Radial Running Clearance using Hierarchically Response Surface Method

    NASA Astrophysics Data System (ADS)

    Zhiying, Chen; Ping, Zhou

    2017-11-01

    Considering the computational precision and efficiency of robust optimization for complex mechanical assembly relationships such as turbine blade-tip radial running clearance, a hierarchical response surface robust optimization algorithm is proposed. The distributed collaborative response surface method is used to generate an assembly-system-level approximation model relating the overall parameters to blade-tip clearance, and then a set of samples of design parameters and objective response mean and/or standard deviation is generated by using the system approximation model and a design of experiments method. Finally, a new response surface approximation model is constructed from those samples, and this approximation model is used for the robust optimization process. The analysis results demonstrate that the proposed method can dramatically reduce the computational cost and ensure computational precision. The presented research offers an effective way for the robust optimization design of turbine blade-tip radial running clearance.

  10. The complexity of personality: advantages of a genetically sensitive multi-group design.

    PubMed

    Hahn, Elisabeth; Spinath, Frank M; Siedler, Thomas; Wagner, Gert G; Schupp, Jürgen; Kandler, Christian

    2012-03-01

    Findings from many behavioral genetic studies utilizing the classical twin design suggest that genetic and non-shared environmental effects play a significant role in human personality traits. This study focuses on the methodological advantages of extending the sampling frame to include multiple dyads of relatives. We investigated the sensitivity of heritability estimates to the inclusion of sibling pairs, mother-child pairs and grandparent-grandchild pairs from the German Socio-Economic Panel Study in addition to a classical German twin sample consisting of monozygotic and dizygotic twins. The resulting dataset contained 1,308 pairs, including 202 monozygotic and 147 dizygotic twin pairs, along with 419 sibling pairs, 438 mother-child dyads, and 102 grandparent-child dyads. This genetically sensitive multi-group design allowed the simultaneous testing of additive and non-additive genetic, common and specific environmental effects, including cultural transmission and twin-specific environmental influences. Using manifest and latent modeling of phenotypes (i.e., controlling for measurement error), we compare results from the extended sample with those from the twin sample alone and discuss implications for future research.

  11. IMPLICATIONS OF INTER-HABITAT VARIATION FOR MONITORING GREAT RIVER ECOSYSTEMS: EMAP-UMR EXPERIENCE

    EPA Science Inventory

    Great River ecosystems (GREs) are complex mosaics of habitats that vary at multiple scales. GRE monitoring designs can capture some but not all of this variation. Each discrete habitat, however defined, must either be sampled as a separate stratum or "resource population", combine...

  12. Evaluation of an electronic nose for improved biosolids alkaline-stabilization treatment and odor management

    USDA-ARS?s Scientific Manuscript database

    Electronic nose sensors are designed to detect differences in complex air sample matrices. For example, they have been used in the food industry to monitor process performance and quality control. However, no information is available on the application of sensor arrays to monitor process performanc...

  13. Dynamic behavior of geometrically complex hybrid composite samples in a Split-Hopkinson Pressure Bar system

    NASA Astrophysics Data System (ADS)

    Pouya, M.; Balasubramaniam, S.; Sharafiev, S.; F-X Wagner, M.

    2018-06-01

    The interfaces between layered materials play an important role in the overall mechanical behavior of hybrid composites, particularly during dynamic loading. Moreover, in complex-shaped composites, interfacial failure is strongly affected by the geometry and size of these contact interfaces. As preliminary work for the design of a novel sample geometry that allows the analysis of wave reflection phenomena at the interfaces of such materials, a series of experiments using a Split-Hopkinson Pressure Bar technique was performed on five different sample geometries made of a single (monomaterial) steel. A complementary explicit finite element model of the Split-Hopkinson Pressure Bar system was developed and the same sample geometries were studied numerically. The simulated input, reflected and transmitted elastic wave pulses were analyzed for the different sample geometries and were found to agree well with the experimental results. Additional simulations using different composite layers of steel and aluminum (with the same sample geometries) were performed to investigate the effect of material variation on the propagated wave pulses. The numerical results show that the reflected and transmitted wave pulses systematically depend on the sample geometry, and that elastic wave pulse propagation is affected by the properties of individual material layers.

  14. Evaluation of the mtDNA-COII Region Based Species Specific Assay for Identifying Members of the Anopheles culicifacies Species Complex

    PubMed Central

    Manonmani, Arulsamy Mary; Mathivanan, Ashok Kumar; Sadanandane, Candassamy; Jambulingam, Purushothaman

    2013-01-01

    Background: Anopheles culicifacies, a major malarial vector, has been recognized as a complex of five sibling species, A, B, C, D and E. These sibling species exhibit varied vectorial capacity, host specificity and susceptibility to malarial parasites/insecticides. In this study, a PCR assay developed earlier for distinguishing the five individual species was validated on samples of An. culicifacies collected from various parts of India. Methods: The samples were initially screened using the rDNA-ITS2 region based primers, which categorised the samples into either the A/D group or the B/C/E group. A proportion of samples belonging to each group were subjected to the mtDNA-COII PCR assay for identifying individual species. Results: Among the 615 samples analysed by the rDNA-ITS2 PCR assay, 303 were found to belong to the A/D group and 299 to the B/C/E group, while 13 were negative. Among the 163 samples belonging to the A/D group, only one sample displayed the profile characteristic of species A, and among the 176 samples falling in the B/C/E group, 51 were identified as species B, 14 as species C and 41 as species E by the mtDNA-COII PCR assay. When samples exhibiting products diagnostic of B/C/E were subjected to a PCR-RFLP assay, 15 were identified as species E. Conclusion: Validation of the mtDNA-COII PCR assay on a large number of samples showed that this technique cannot be used universally to distinguish the five members of this species complex, as it was designed based on minor/single-base differences observed in the COII region. PMID:24409441

  15. Measures of precision for dissimilarity-based multivariate analysis of ecological communities.

    PubMed

    Anderson, Marti J; Santana-Garcon, Julia

    2015-01-01

    Ecological studies require key decisions regarding the appropriate size and number of sampling units. No methods currently exist to measure precision for multivariate assemblage data when dissimilarity-based analyses are intended to follow. Here, we propose a pseudo multivariate dissimilarity-based standard error (MultSE) as a useful quantity for assessing sample-size adequacy in studies of ecological communities. Based on sums of squared dissimilarities, MultSE measures variability in the position of the centroid in the space of a chosen dissimilarity measure under repeated sampling for a given sample size. We describe a novel double resampling method to quantify uncertainty in MultSE values with increasing sample size. For more complex designs, values of MultSE can be calculated from the pseudo residual mean square of a permanova model, with the double resampling done within appropriate cells in the design. R code functions for implementing these techniques, along with ecological examples, are provided. © 2014 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.
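
    A minimal sketch of the single-group quantity follows, assuming the definition MultSE = sqrt(V/n) with pseudo variance V = SS/(n - 1) and SS computed from the sum of squared between-sample dissimilarities; the toy data and the Bray-Curtis choice are ours, and this does not reproduce the authors' R functions or the double resampling procedure.

        import numpy as np
        from scipy.spatial.distance import pdist

        def mult_se(abundance, metric="braycurtis"):
            n = abundance.shape[0]
            d = pdist(abundance, metric=metric)      # all pairwise dissimilarities
            ss = np.sum(d ** 2) / n                  # total sum of squares in dissimilarity space
            v = ss / (n - 1)                         # pseudo multivariate variance
            return np.sqrt(v / n)

        rng = np.random.default_rng(42)
        community = rng.poisson(5.0, size=(20, 30))  # 20 samples x 30 species (toy data)
        print(mult_se(community))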

  16. Power calculation for overall hypothesis testing with high-dimensional commensurate outcomes.

    PubMed

    Chi, Yueh-Yun; Gribbin, Matthew J; Johnson, Jacqueline L; Muller, Keith E

    2014-02-28

    The complexity of systems biology means that any metabolic, genetic, or proteomic pathway typically includes so many components (e.g., molecules) that statistical methods specialized for overall testing of high-dimensional and commensurate outcomes are required. While many overall tests have been proposed, very few have power and sample size methods. We develop accurate power and sample size methods and software to facilitate study planning for high-dimensional pathway analysis. By accounting for any complex correlation structure among the high-dimensional outcomes, the new methods allow power calculation even when the sample size is less than the number of variables. We derive the exact (finite-sample) and approximate non-null distributions of the 'univariate' approach to repeated measures test statistic, as well as power-equivalent scenarios useful to generalize our numerical evaluations. Extensive simulations of group comparisons support the accuracy of the approximations even when the ratio of number of variables to sample size is large. We derive a minimum set of constants and parameters sufficient and practical for power calculation. Using the new methods and specifying the minimum set to determine power for a study of metabolic consequences of vitamin B6 deficiency helps illustrate the practical value of the new results. Free software implementing the power and sample size methods applies to a wide range of designs, including one group pre-intervention and post-intervention comparisons, multiple parallel group comparisons with one-way or factorial designs, and the adjustment and evaluation of covariate effects. Copyright © 2013 John Wiley & Sons, Ltd.
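
    The paper derives analytic (non-null) distributions for the univariate approach to repeated measures statistic; the sketch below is only a generic, simulation-based stand-in that illustrates the planning question: for correlated high-dimensional outcomes, estimate power by repeatedly simulating two groups and applying a simple summary test (here a t-test on the per-subject mean score). All parameter values are hypothetical.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)

        def simulated_power(n_per_group=15, p=50, effect=0.4, rho=0.3, n_sim=2000, alpha=0.05):
            # compound-symmetric covariance among the p outcomes
            cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)
            chol = np.linalg.cholesky(cov)
            hits = 0
            for _ in range(n_sim):
                a = rng.standard_normal((n_per_group, p)) @ chol.T
                b = rng.standard_normal((n_per_group, p)) @ chol.T + effect
                # reduce each subject to the mean over outcomes, then a two-sample t-test
                t, pval = stats.ttest_ind(a.mean(axis=1), b.mean(axis=1))
                hits += pval < alpha
            return hits / n_sim

        print(simulated_power())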

  17. Effect-directed analysis supporting monitoring of aquatic ...

    EPA Pesticide Factsheets

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxi

  18. Automated design of genomic Southern blot probes

    PubMed Central

    2010-01-01

    Background Southern blotting is a DNA analysis technique that has found widespread application in molecular biology. It has been used for gene discovery and mapping and has diagnostic and forensic applications, including mutation detection in patient samples and DNA fingerprinting in criminal investigations. Southern blotting has been employed as the definitive method for detecting transgene integration, and successful homologous recombination in gene targeting experiments. The technique employs a labeled DNA probe to detect a specific DNA sequence in a complex DNA sample that has been separated by restriction digest and gel electrophoresis. Critically, for the technique to succeed, the probe must be unique to the target locus so as not to cross-hybridize to other endogenous DNA within the sample. Investigators routinely employ a manual approach to probe design. A genome browser is used to extract DNA sequence from the locus of interest, which is searched against the target genome using a BLAST-like tool. Ideally a single perfect match is obtained to the target, with little cross-reactivity caused by homologous DNA sequence present in the genome and/or repetitive and low-complexity elements in the candidate probe. This is a labor-intensive process often requiring several attempts to find a suitable probe for laboratory testing. Results We have written an informatic pipeline to automatically design genomic Southern blot probes that specifically attempts to optimize the resultant probe, employing a brute-force strategy of generating many candidate probes of acceptable length in the user-specified design window, searching all against the target genome, then scoring and ranking the candidates by uniqueness and repetitive DNA element content. Using these in silico measures we can automatically design probes that we predict to perform as well, or better, than our previous manual designs, while considerably reducing design time. We went on to experimentally validate a number of these automated designs by Southern blotting. The majority of probes we tested performed well, confirming our in silico prediction methodology and the general usefulness of the software for automated genomic Southern probe design. Conclusions Software and supplementary information are freely available at: http://www.genes2cognition.org/software/southern_blot PMID:20113467
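
    The brute-force strategy described above can be sketched as follows (a toy stand-in, not the published pipeline: exact substring counting replaces the genome-wide BLAST search, and the scoring weights and sequences are invented): candidate probes are tiled across a design window, scored for genome uniqueness and low-complexity content, and ranked.

        import re

        def low_complexity_fraction(seq):
            # crude proxy: fraction of the probe covered by single-base runs of length >= 6
            covered = sum(len(m.group(0)) for m in re.finditer(r"(A{6,}|C{6,}|G{6,}|T{6,})", seq))
            return covered / len(seq)

        def count_genome_hits(probe, genome):
            # exact-match counting as a cheap stand-in for a genome-wide BLAST search
            rev_comp = probe[::-1].translate(str.maketrans("ACGT", "TGCA"))
            return genome.count(probe) + genome.count(rev_comp)

        def rank_probes(design_region, genome, probe_len=40, step=5):
            candidates = []
            for start in range(0, len(design_region) - probe_len + 1, step):
                probe = design_region[start:start + probe_len]
                hits = count_genome_hits(probe, genome)
                score = (hits - 1) * 10 + low_complexity_fraction(probe)  # one self-hit expected
                candidates.append((score, start, probe))
            return sorted(candidates)  # lowest score = most unique, least repetitive

        # usage with toy sequences
        genome = "ACGT" * 5000 + "GATTACAGATTACAGGCCGGCCTTAAGGCCTTAAGGACGTACGTTAGC" * 10
        region = genome[10000:10400]
        best = rank_probes(region, genome)[0]
        print(best[0], best[1])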

  19. Sample size considerations when groups are the appropriate unit of analyses

    PubMed Central

    Sadler, Georgia Robins; Ko, Celine Marie; Alisangco, Jennifer; Rosbrook, Bradley P.; Miller, Eric; Fullerton, Judith

    2007-01-01

    This paper discusses issues to be considered by nurse researchers when groups should be used as a unit of randomization. Advantages and disadvantages are presented, with statistical calculations needed to determine effective sample size. Examples of these concepts are presented using data from the Black Cosmetologists Promoting Health Program. Different hypothetical scenarios and their impact on sample size are presented. Given the complexity of calculating sample size when using groups as a unit of randomization, it’s advantageous for researchers to work closely with statisticians when designing and implementing studies that anticipate the use of groups as the unit of randomization. PMID:17693219
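
    The standard arithmetic behind these calculations can be illustrated as below (a general design-effect example, not the Black Cosmetologists Promoting Health Program figures): with m participants per group and intraclass correlation ICC, the design effect 1 + (m - 1) x ICC inflates the sample size required under individual randomization.

        # standard cluster-randomization arithmetic (general illustration only)
        def design_effect(cluster_size, icc):
            return 1.0 + (cluster_size - 1) * icc

        def total_n_required(n_individual, cluster_size, icc):
            # n_individual: sample size required under individual randomization
            return n_individual * design_effect(cluster_size, icc)

        n_srs = 128            # e.g., from a standard two-group power calculation
        m, icc = 20, 0.05      # 20 participants per group, assumed ICC of 0.05
        print(design_effect(m, icc), total_n_required(n_srs, m, icc))   # 1.95 and ~250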

  20. Discrete elements for 3D microfluidics.

    PubMed

    Bhargava, Krisna C; Thompson, Bryant; Malmstadt, Noah

    2014-10-21

    Microfluidic systems are rapidly becoming commonplace tools for high-precision materials synthesis, biochemical sample preparation, and biophysical analysis. Typically, microfluidic systems are constructed in monolithic form by means of microfabrication and, increasingly, by additive techniques. These methods restrict the design and assembly of truly complex systems by placing unnecessary emphasis on complete functional integration of operational elements in a planar environment. Here, we present a solution based on discrete elements that liberates designers to build large-scale microfluidic systems in three dimensions that are modular, diverse, and predictable by simple network analysis techniques. We develop a sample library of standardized components and connectors manufactured using stereolithography. We predict and validate the flow characteristics of these individual components to design and construct a tunable concentration gradient generator with a scalable number of parallel outputs. We show that these systems are rapidly reconfigurable by constructing three variations of a device for generating monodisperse microdroplets in two distinct size regimes and in a high-throughput mode by simple replacement of emulsifier subcircuits. Finally, we demonstrate the capability for active process monitoring by constructing an optical sensing element for detecting water droplets in a fluorocarbon stream and quantifying their size and frequency. By moving away from large-scale integration toward standardized discrete elements, we demonstrate the potential to reduce the practice of designing and assembling complex 3D microfluidic circuits to a methodology comparable to that found in the electronics industry.
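
    The "simple network analysis" referred to above treats each discrete element as a hydraulic resistor, in direct analogy to an electrical circuit. The sketch below (illustrative dimensions, not the paper's component library) combines Hagen-Poiseuille resistances in series and parallel to predict the flow delivered by an applied pressure.

        import math

        MU_WATER = 1.0e-3  # Pa*s

        def channel_resistance(length_m, radius_m, mu=MU_WATER):
            # circular channel: R = 8*mu*L / (pi * r^4), so that Q = dP / R
            return 8.0 * mu * length_m / (math.pi * radius_m ** 4)

        def series(*rs):
            return sum(rs)

        def parallel(*rs):
            return 1.0 / sum(1.0 / r for r in rs)

        # two parallel branches feeding a common outlet channel
        r_branch = channel_resistance(0.02, 250e-6)      # 2 cm long, 250 um radius
        r_outlet = channel_resistance(0.01, 250e-6)
        r_total = series(parallel(r_branch, r_branch), r_outlet)
        dp = 5e3                                          # 5 kPa applied pressure
        print("flow rate (uL/min):", dp / r_total * 1e9 * 60)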

  1. A Practical, Robust Methodology for Acquiring New Observation Data Using Computationally Expensive Groundwater Models

    NASA Astrophysics Data System (ADS)

    Siade, Adam J.; Hall, Joel; Karelse, Robert N.

    2017-11-01

    Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. There exist a number of data-worth and experimental design strategies developed for this purpose. However, these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, high-parameter dimension, etc. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established d-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification. And, a heuristic methodology, based on the concept of the greedy algorithm, is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
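
    The greedy, minimax flavor of the design step can be sketched as follows (our simplification with a synthetic data-worth table and an assumption of additive worth across observations; it is not the published PEST/Null-Space Monte Carlo workflow): observations are added one at a time so as to maximize the worst-case worth over the posterior parameter samples.

        import numpy as np

        rng = np.random.default_rng(3)
        n_candidates, n_samples = 40, 25
        worth = rng.gamma(2.0, 1.0, size=(n_candidates, n_samples))   # synthetic data-worth table

        def greedy_minimax(worth, budget=5):
            chosen, total = [], np.zeros(worth.shape[1])
            remaining = set(range(worth.shape[0]))
            for _ in range(budget):
                # pick the candidate whose addition gives the best worst-case total worth
                best = max(remaining, key=lambda i: (total + worth[i]).min())
                chosen.append(best)
                total += worth[best]
                remaining.remove(best)
            return chosen, total.min()

        design, worst_case = greedy_minimax(worth)
        print(design, round(worst_case, 2))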

  2. Microcontroller-based real-time QRS detection.

    PubMed

    Sun, Y; Suppappola, S; Wrublewski, T A

    1992-01-01

    The authors describe the design of a system for real-time detection of QRS complexes in the electrocardiogram based on a single-chip microcontroller (Motorola 68HC811). A systematic analysis of the instrumentation requirements for QRS detection and of the various design techniques is also given. Detection algorithms using different nonlinear transforms for the enhancement of QRS complexes are evaluated by using the ECG database of the American Heart Association. The results show that the nonlinear transform involving multiplication of three adjacent, sign-consistent differences in the time domain gives a good performance and a quick response. When implemented with an appropriate sampling rate, this algorithm is also capable of rejecting pacemaker spikes. The eight-bit single-chip microcontroller provides sufficient throughput and shows a satisfactory performance. Implementation of multiple detection algorithms in the same system improves flexibility and reliability. The low chip count in the design also favors maintainability and cost-effectiveness.
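
    The nonlinear transform singled out above (multiplication of three adjacent, sign-consistent differences) can be sketched as follows; the threshold and refractory handling are our simplifications for illustration, not the published microcontroller firmware.

        import numpy as np

        def mobd3(ecg):
            d = np.diff(ecg)
            out = np.zeros(len(ecg))
            for n in range(3, len(ecg)):
                d1, d2, d3 = d[n - 1], d[n - 2], d[n - 3]
                if (d1 > 0 and d2 > 0 and d3 > 0) or (d1 < 0 and d2 < 0 and d3 < 0):
                    out[n] = abs(d1 * d2 * d3)   # large only during steep, monotonic QRS slopes
            return out

        def detect_qrs(ecg, fs=200, refractory_s=0.2):
            y = mobd3(ecg)
            thr = 0.3 * y.max()
            peaks, last = [], -10**9
            for n, v in enumerate(y):
                if v > thr and n - last > refractory_s * fs:
                    peaks.append(n)
                    last = n
            return peaks

        # toy signal: flat baseline with two sharp "QRS-like" deflections
        sig = np.zeros(600)
        for c in (150, 450):
            sig[c - 4:c + 5] = [0, 0.2, 0.8, 1.5, 2.0, 1.5, 0.8, 0.2, 0]
        print(detect_qrs(sig))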

  3. Spectral-spatial classification of hyperspectral image using three-dimensional convolution network

    NASA Astrophysics Data System (ADS)

    Liu, Bing; Yu, Xuchu; Zhang, Pengqiang; Tan, Xiong; Wang, Ruirui; Zhi, Lu

    2018-01-01

    Recently, hyperspectral image (HSI) classification has become a focus of research. However, the complex structure of an HSI makes feature extraction difficult to achieve. Most current methods build classifiers based on complex handcrafted features computed from the raw inputs. The design of an improved 3-D convolutional neural network (3D-CNN) model for HSI classification is described. This model extracts features from both the spectral and spatial dimensions through the application of 3-D convolutions, thereby capturing the important discrimination information encoded in multiple adjacent bands. The designed model views the HSI cube data altogether without relying on any pre- or postprocessing. In addition, the model is trained in an end-to-end fashion without any handcrafted features. The designed model was applied to three widely used HSI datasets. The experimental results demonstrate that the 3D-CNN-based method outperforms conventional methods even with limited labeled training samples.
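
    A minimal PyTorch sketch of the spectral-spatial idea is shown below (a toy architecture with our layer sizes, not the paper's exact model): 3-D convolutions slide jointly over the band and spatial dimensions of a small hyperspectral patch, and the network is trained end to end from the raw cube.

        import torch
        import torch.nn as nn

        class Tiny3DCNN(nn.Module):
            def __init__(self, n_classes=9):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)),  # spectral-spatial
                    nn.ReLU(),
                    nn.Conv3d(8, 16, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
                    nn.ReLU(),
                    nn.AdaptiveAvgPool3d(1),
                )
                self.classifier = nn.Linear(16, n_classes)

            def forward(self, x):              # x: (batch, 1, bands, height, width)
                z = self.features(x).flatten(1)
                return self.classifier(z)

        model = Tiny3DCNN()
        patch = torch.randn(4, 1, 103, 9, 9)   # e.g., 103-band patches of 9x9 pixels
        print(model(patch).shape)              # torch.Size([4, 9])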

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makowska, Małgorzata G., E-mail: malg@dtu.dk; European Spallation Source ESS AB, P.O. Box 176, SE-221 00 Lund; Theil Kuhn, Luise

    High material penetration by neutrons allows for experiments using sophisticated sample environments providing complex conditions. Thus, neutron imaging holds potential for performing in situ nondestructive measurements on large samples or even full technological systems, which are not possible with any other technique. This paper presents a new sample environment for in situ high resolution neutron imaging experiments at temperatures from room temperature up to 1100 °C and/or using controllable flow of reactive atmospheres. The design also offers the possibility to directly combine imaging with diffraction measurements. Design, special features, and specification of the furnace are described. In addition, examples of experiments successfully performed at various neutron facilities with the furnace, as well as examples of possible applications, are presented. This covers a broad field of research from fundamental to technological investigations of various types of materials and components.

  5. Toxicity bioassays with concentrated cell culture media-a methodology to overcome the chemical loss by conventional preparation of water samples.

    PubMed

    Niss, Frida; Rosenmai, Anna Kjerstine; Mandava, Geeta; Örn, Stefan; Oskarsson, Agneta; Lundqvist, Johan

    2018-04-01

    The use of in vitro bioassays for studies of toxic activity in environmental water samples is a rapidly expanding field of research. Cell-based bioassays can assess the total toxicity exerted by a water sample, regardless of whether the toxicity is caused by a known or unknown agent or by a complex mixture of different agents. When using bioassays for environmental water samples, it is often necessary to concentrate the water samples before applying them to the assay. Commonly, water samples are concentrated 10-50 times. However, there is always a risk of losing compounds in the sample in such sample preparation. We have developed an alternative experimental design by preparing a concentrated cell culture medium which was then diluted in the environmental water sample to compose the final cell culture media for the in vitro assays. Water samples from five Swedish waste water treatment plants were analyzed for oxidative stress response, estrogen receptor (ER), and aryl hydrocarbon receptor (AhR) activity using this experimental design. We were able to detect responses equivalent to 8.8-11.3 ng/L TCDD for AhR activity and 0.4-0.9 ng/L 17β-estradiol for ER activity. We were unable to detect an oxidative stress response in any of the studied water samples. In conclusion, we have developed an experimental design allowing us to examine environmental water samples in in vitro toxicity assays at a concentration factor close to 1, without the risk of losing known or unknown compounds during an extraction procedure.

  6. Adaptive sampling strategies with high-throughput molecular dynamics

    NASA Astrophysics Data System (ADS)

    Clementi, Cecilia

    Despite recent significant hardware and software developments, the complete thermodynamic and kinetic characterization of large macromolecular complexes by molecular simulations still presents significant challenges. The high dimensionality of these systems and the complexity of the associated potential energy surfaces (creating multiple metastable regions connected by high free energy barriers) do not usually allow adequate sampling of the relevant regions of their configurational space by means of a single, long Molecular Dynamics (MD) trajectory. Several different approaches have been proposed to tackle this sampling problem. We focus on the development of ensemble simulation strategies, where data from a large number of weakly coupled simulations are integrated to explore the configurational landscape of a complex system more efficiently. Ensemble methods are of increasing interest as the hardware roadmap is now mostly based on increasing core counts, rather than clock speeds. The main challenge in the development of an ensemble approach for efficient sampling is in the design of strategies to adaptively distribute the trajectories over the relevant regions of the systems' configurational space, without using any a priori information on the system's global properties. We will discuss the definition of smart adaptive sampling approaches that can redirect computational resources towards unexplored yet relevant regions. Our approaches are based on new developments in dimensionality reduction for high dimensional dynamical systems, and optimal redistribution of resources. NSF CHE-1152344, NSF CHE-1265929, Welch Foundation C-1570.

  7. Determination of arsenic species in rice samples using CPE and ETAAS.

    PubMed

    Costa, Bruno Elias Dos Santos; Coelho, Nívia Maria Melo; Coelho, Luciana Melo

    2015-07-01

    A highly sensitive and selective procedure for the determination of arsenate and total arsenic in food by electrothermal atomic absorption spectrometry after cloud point extraction (ETAAS/CPE) was developed. The procedure is based on the formation of a complex of As(V) ions with molybdate in the presence of 50.0 mmol L(-1) sulfuric acid. The complex was extracted into the surfactant-rich phase of 0.06% (w/v) Triton X-114. The variables affecting the complex formation, extraction and phase separation were optimized using factorial designs. Under the optimal conditions, the calibration graph was linear in the range of 0.05-10.0 μg L(-1). The detection and quantification limits were 10 and 33 ng L(-1), respectively and the corresponding value for the relative standard deviation for 10 replicates was below 5%. Recovery values of between 90.8% and 113.1% were obtained for spiked samples. The accuracy of the method was evaluated by comparison with the results obtained for the analysis of a rice flour sample (certified material IRMM-804) and no significant difference at the 95% confidence level was observed. The method was successfully applied to the determination of As(V) and total arsenic in rice samples. Copyright © 2015 Elsevier Ltd. All rights reserved.
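
    For readers unfamiliar with how such limits are typically obtained, the sketch below applies the common 3.3s/slope and 10s/slope estimates to a toy calibration line (the concentrations and absorbances are invented, not the authors' data).

        import numpy as np

        conc = np.array([0.05, 0.5, 1.0, 2.5, 5.0, 10.0])              # ug/L As(V) standards
        signal = np.array([0.004, 0.031, 0.060, 0.149, 0.302, 0.598])  # absorbance (toy values)

        slope, intercept = np.polyfit(conc, signal, 1)
        residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

        lod = 3.3 * residual_sd / slope    # detection limit
        loq = 10.0 * residual_sd / slope   # quantification limit
        print(f"LOD = {lod * 1000:.0f} ng/L, LOQ = {loq * 1000:.0f} ng/L")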

  8. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research.

    PubMed

    Bandyopadhyay, Mridula

    2011-11-25

    The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people's social and cultural lives. I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. When using ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health.

  9. Reliability-Based Control Design for Uncertain Systems

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.

    2005-01-01

    This paper presents a robust control design methodology for systems with probabilistic parametric uncertainty. Control design is carried out by solving a reliability-based multi-objective optimization problem where the probability of violating design requirements is minimized. Simultaneously, failure domains are optimally enlarged to enable global improvements in the closed-loop performance. To enable an efficient numerical implementation, a hybrid approach for estimating reliability metrics is developed. This approach, which integrates deterministic sampling and asymptotic approximations, greatly reduces the numerical burden associated with complex probabilistic computations without compromising the accuracy of the results. Examples using output-feedback and full-state feedback with state estimation are used to demonstrate the ideas proposed.
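
    The reliability metric being estimated is simply the probability of violating a design requirement under parametric uncertainty. The brute-force Monte Carlo sketch below (hypothetical requirement and plant model) illustrates what is estimated; the paper's hybrid deterministic-sampling/asymptotic approach is a more efficient replacement for exactly this computation.

        import numpy as np

        rng = np.random.default_rng(11)

        def settling_time(k, wn, zeta):
            # crude 2nd-order approximation: t_s ~ 4 / (zeta * wn), scaled by a gain factor
            return 4.0 / (zeta * wn) * (1.0 + 0.1 / k)

        def probability_of_violation(k, n=200_000, requirement=2.0):
            # uncertain plant parameters (natural frequency and damping ratio)
            wn = rng.normal(3.0, 0.3, n)
            zeta = rng.normal(0.7, 0.1, n)
            return np.mean(settling_time(k, wn, zeta) > requirement)

        for gain in (0.5, 1.0, 2.0):
            print(gain, probability_of_violation(gain))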

  10. A Sample Handling System for Mars Sample Return - Design and Status

    NASA Astrophysics Data System (ADS)

    Allouis, E.; Renouf, I.; Deridder, M.; Vrancken, D.; Gelmi, R.; Re, E.

    2009-04-01

    A mission to return atmosphere and soil samples from Mars is highly desired by planetary scientists from around the world, and space agencies are starting preparations for the launch of a sample return mission in the 2020 timeframe. Such a mission would return approximately 500 grams of atmosphere, rock and soil samples to Earth by 2025. Development of a wide range of new technology will be critical to the successful implementation of such a challenging mission. Technical developments required to realise the mission include guided atmospheric entry, soft landing, sample handling robotics, biological sealing, Mars atmospheric ascent sample rendezvous & capture and Earth return. The European Space Agency has been performing system definition studies along with numerous technology development studies under the framework of the Aurora programme. Within the scope of these activities Astrium has been responsible for defining an overall sample handling architecture in collaboration with European partners (sample acquisition and sample capture, Galileo Avionica; sample containment and automated bio-sealing, Verhaert). Our work has focused on the definition and development of the robotic systems required to move the sample through the transfer chain. This paper presents the Astrium team's high level design for the surface transfer system and the orbiter transfer system. The surface transfer system is envisaged to use two robotic arms of different sizes to allow flexible operations and to enable sample transfer over relatively large distances (~2 to 3 metres): the first to deploy/retract the Drill Assembly used for sample collection, the second for the transfer of the Sample Container (the vessel containing all the collected samples) from the Drill Assembly to the Mars Ascent Vehicle (MAV). The sample transfer actuator also features a complex end-effector for handling the Sample Container. The orbiter transfer system will transfer the Sample Container from the capture mechanism through a bio-sealing system to the Earth Return Capsule (ERC) and has distinctly different requirements from the surface transfer system. The operations required to transfer the samples to the ERC are clearly defined and make use of mechanisms specifically designed for the job rather than robotic arms. Though it is mechanical rather than robotic, the design of the orbiter transfer system is very complex in comparison to most previous missions to fulfil all the scientific and technological requirements. Further mechanisms will be required to lock the samples into the ERC and to close the door at the rear of the ERC through which the samples have been inserted. Having performed this overall definition study, Astrium is now leading the next step of the development of the MSR sample handling: the Mars Surface Sample Transfer and Manipulation project (MSSTM). Organised in two phases, the project will re-evaluate in phase 1 the output of the previous study in the light of new inputs (e.g. addition of a rover) and investigate further the architectures and systems involved in the sample transfer chain while identifying the critical technologies. The second phase of the project will concentrate on the prototyping of a number of these key technologies with the goal of providing an end-to-end validation of the surface sample transfer concept.

  11. Grayscale lithography-automated mask generation for complex three-dimensional topography

    NASA Astrophysics Data System (ADS)

    Loomis, James; Ratnayake, Dilan; McKenna, Curtis; Walsh, Kevin M.

    2016-01-01

    Grayscale lithography is a relatively underutilized technique that enables fabrication of three-dimensional (3-D) microstructures in photosensitive polymers (photoresists). By spatially modulating ultraviolet (UV) dosage during the writing process, one can vary the depth at which photoresist is developed. This means complex structures and bioinspired designs can readily be produced that would otherwise be cost-prohibitive or too time-intensive to fabricate. The main barrier to widespread grayscale implementation, however, stems from the laborious generation of mask files required to create complex surface topography. We present a process and associated software utility for automatically generating grayscale mask files from 3-D models created within industry-standard computer-aided design (CAD) suites. By shifting the microelectromechanical systems (MEMS) design onus to commonly used CAD programs ideal for complex surfacing, engineering professionals already familiar with traditional 3-D CAD software can readily utilize their pre-existing skills to make valuable contributions to the MEMS community. Our conversion process is demonstrated by prototyping several samples on a laser pattern generator, capital equipment already in use in many foundries. Finally, an empirical calibration technique is shown that compensates for nonlinear relationships between UV exposure intensity and photoresist development depth, as well as a thermal reflow technique to help smooth microstructure surfaces.
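
    The calibration step can be sketched as follows (invented calibration values and a toy height map, not the authors' utility): measured gray-level-to-depth points are interpolated and inverted so that each pixel of the CAD-derived height map is assigned the gray level expected to develop to its target depth.

        import numpy as np

        # empirical calibration: gray level (0-255) vs. developed depth in photoresist (um)
        gray_cal = np.array([0, 64, 128, 192, 255])
        depth_cal = np.array([0.0, 1.2, 3.1, 5.8, 8.0])   # nonlinear response (hypothetical)

        def depth_to_gray(target_depth_um):
            # invert the monotonic calibration by interpolating depth -> gray level
            return np.interp(target_depth_um, depth_cal, gray_cal)

        # toy "3-D CAD" height map: a 64x64 spherical cap, 6 um tall
        y, x = np.mgrid[-1:1:64j, -1:1:64j]
        height = 6.0 * np.sqrt(np.clip(1 - x**2 - y**2, 0, None))

        mask = np.round(depth_to_gray(height)).astype(np.uint8)   # grayscale mask image
        print(mask.min(), mask.max())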

  12. Creating and validating GIS measures of urban design for health research.

    PubMed

    Purciel, Marnie; Neckerman, Kathryn M; Lovasi, Gina S; Quinn, James W; Weiss, Christopher; Bader, Michael D M; Ewing, Reid; Rundle, Andrew

    2009-12-01

    Studies relating urban design to health have been impeded by the unfeasibility of conducting field observations across large areas and the lack of validated objective measures of urban design. This study describes measures for five dimensions of urban design - imageability, enclosure, human scale, transparency, and complexity - created using public geographic information systems (GIS) data from the US Census and city and state government. GIS measures were validated for a sample of 588 New York City block faces using a well-documented field observation protocol. Correlations between GIS and observed measures ranged from 0.28 to 0.89. Results show valid urban design measures can be constructed from digital sources.

  13. Creating and validating GIS measures of urban design for health research

    PubMed Central

    Purciel, Marnie; Neckerman, Kathryn M.; Lovasi, Gina S.; Quinn, James W.; Weiss, Christopher; Bader, Michael D.M.; Ewing, Reid; Rundle, Andrew

    2012-01-01

    Studies relating urban design to health have been impeded by the unfeasibility of conducting field observations across large areas and the lack of validated objective measures of urban design. This study describes measures for five dimensions of urban design – imageability, enclosure, human scale, transparency, and complexity – created using public geographic information systems (GIS) data from the US Census and city and state government. GIS measures were validated for a sample of 588 New York City block faces using a well-documented field observation protocol. Correlations between GIS and observed measures ranged from 0.28 to 0.89. Results show valid urban design measures can be constructed from digital sources. PMID:22956856

  14. MUTAGENIC AND CARCINOGENIC POTENCY OF EXTRACTS OF DIESEL AND RELATED ENVIRONMENTAL EMISSIONS: STUDY DESIGN, SAMPLE GENERATION, COLLECTION, AND PREPARATION (JOURNAL VERSION)

    EPA Science Inventory

    A major diesel emissions research program has been initiated by the US Environmental Protection Agency to assess the human health risk associated with increased use of diesel automobiles. This program is intended to establish the mutagenic and carcinogenic potency of complex orga...

  15. Lifestyle and Clinical Health Behaviors and PSA Tests

    ERIC Educational Resources Information Center

    Norris, Cynthia; McFall, Stephanie

    2006-01-01

    This study assessed the association of lifestyle and clinical health behaviors with prostate specific antigen (PSA) tests. The study used cross-sectional data from the 2002 Behavioral Risk Factor Surveillance System (BRFSS). We used Stata 8.0 to take into account the complex sample design in analyses. Both lifestyle and clinical health behaviors…

  16. Joint histogram-based cost aggregation for stereo matching.

    PubMed

    Min, Dongbo; Lu, Jiangbo; Do, Minh N

    2013-10-01

    This paper presents a novel method for performing efficient cost aggregation in stereo matching. The cost aggregation problem is reformulated from the perspective of a histogram, giving us the potential to reduce the complexity of the cost aggregation in stereo matching significantly. Differently from previous methods which have tried to reduce the complexity in terms of the size of an image and a matching window, our approach focuses on reducing the computational redundancy that exists among the search range, caused by a repeated filtering for all the hypotheses. Moreover, we also reduce the complexity of the window-based filtering through an efficient sampling scheme inside the matching window. The tradeoff between accuracy and complexity is extensively investigated by varying the parameters used in the proposed method. Experimental results show that the proposed method provides high-quality disparity maps with low complexity and outperforms existing local methods. This paper also provides new insights into complexity-constrained stereo-matching algorithm design.
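
    To make the notion of cost aggregation concrete, the baseline sketch below uses plain box-filtered absolute differences with a winner-take-all decision; it is not the joint-histogram method of the paper, only the conventional scheme whose per-disparity filtering redundancy the paper sets out to remove.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def wta_disparity(left, right, max_disp=16, window=7):
            h, w = left.shape
            costs = np.full((max_disp, h, w), np.inf)
            for d in range(max_disp):
                shifted = np.roll(right, d, axis=1)            # shift right image by the hypothesis d
                sad = np.abs(left - shifted)
                costs[d] = uniform_filter(sad, size=window)    # aggregate the cost over the window
                costs[d, :, :d] = np.inf                       # ignore columns with no valid match
            return costs.argmin(axis=0)

        rng = np.random.default_rng(5)
        right_img = rng.random((60, 80))
        left_img = np.roll(right_img, 6, axis=1)               # true disparity of 6 pixels everywhere
        disp = wta_disparity(left_img, right_img)
        print(np.median(disp[:, 10:]))                         # ~6 away from the border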

  17. Implications of complex adaptive systems theory for the design of research on health care organizations

    PubMed Central

    McDaniel, Reuben R.; Lanham, Holly Jordan; Anderson, Ruth A.

    2013-01-01

    Background Because health care organizations (HCOs) are complex adaptive systems (CASs), phenomena of interest often are dynamic and unfold in unpredictable ways, and unfolding events are often unique. Researchers of HCOs may recognize that the subject of their research is dynamic; however, their research designs may not take this into account. Researchers may also know that unfolding events are often unique, but their design may not have the capacity to obtain information from meager evidence. Purpose These two concerns led us to examine two ideas from organizational theory: (a) the ideas of K. E. Weick (1993) on organizational design as a verb and (b) the ideas of J. G. March, L. S. Sproull, and M. Tamuz (1991) on learning from samples of one or fewer. In this article, we applied these ideas to develop an enriched perspective of research design for studying CASs. Methodology/Approach We conducted a theoretical analysis of organizations as CASs, identifying relevant characteristics for research designs. We then explored two ideas from organizational theory and discussed the implications for research designs. Findings Weick's idea of “design as a verb” helps in understanding dynamic and process-oriented research design. The idea of “learning from samples of one or fewer” of March, Sproull, and Tamuz provides strategies for research design that enables learning from meager evidence. When studying HCOs, research designs are likely to be more effective when they (a) anticipate change, (b) include tension, (c) capitalize on serendipity, and (d) use an “act-then-look” mind set. Implications for practice are discussed. Practice Implications Practitioners who understand HCOs as CASs will be cautious in accepting findings from studies that treat HCOs mechanistically. They will consider the characteristics of CAS when evaluating the evidence base for practice. Practitioners can use the strategies proposed in this article to stimulate discussion with researchers seeking to conduct research in their HCO. PMID:19322050

  18. Implications of complex adaptive systems theory for the design of research on health care organizations.

    PubMed

    McDaniel, Reuben R; Lanham, Holly Jordan; Anderson, Ruth A

    2009-01-01

    Because health care organizations (HCOs) are complex adaptive systems (CASs), phenomena of interest often are dynamic and unfold in unpredictable ways, and unfolding events are often unique. Researchers of HCOs may recognize that the subject of their research is dynamic; however, their research designs may not take this into account. Researchers may also know that unfolding events are often unique, but their design may not have the capacity to obtain information from meager evidence. These two concerns led us to examine two ideas from organizational theory: (a) the ideas of K. E. Weick (1993) on organizational design as a verb and (b) the ideas of J. G. March, L. S. Sproull, and M. Tamuz (1991) on learning from samples of one or fewer. In this article, we applied these ideas to develop an enriched perspective of research design for studying CASs. We conducted a theoretical analysis of organizations as CASs, identifying relevant characteristics for research designs. We then explored two ideas from organizational theory and discussed the implications for research designs. Weick's idea of "design as a verb" helps in understanding dynamic and process-oriented research design. The idea of "learning from samples of one or fewer" of March, Sproull, and Tamuz provides strategies for research design that enables learning from meager evidence. When studying HCOs, research designs are likely to be more effective when they (a) anticipate change, (b) include tension, (c) capitalize on serendipity, and (d) use an "act-then-look" mind set. Implications for practice are discussed. Practitioners who understand HCOs as CASs will be cautious in accepting findings from studies that treat HCOs mechanistically. They will consider the characteristics of CAS when evaluating the evidence base for practice. Practitioners can use the strategies proposed in this article to stimulate discussion with researchers seeking to conduct research in their HCO.

  19. Classroom learning and achievement: how the complexity of classroom interaction impacts students' learning

    NASA Astrophysics Data System (ADS)

    Podschuweit, Sören; Bernholt, Sascha; Brückmann, Maja

    2016-05-01

    Background: Complexity models have provided a suitable framework in various domains to assess students' educational achievement. Complexity is often used as the analytical focus when regarding learning outcomes, i.e. when analyzing written tests or problem-centered interviews. Numerous studies reveal negative correlations between the complexity of a task and the probability of a student solving it. Purpose: Thus far, few detailed investigations explore the importance of complexity in actual classroom lessons. Moreover, the few efforts made so far revealed inconsistencies. Hence, the present study sheds light on the influence the complexity of students' and teachers' class contributions have on students' learning outcomes. Sample: Videos of 10 German 8th grade physics courses covering three consecutive lessons on two topics each (electricity, mechanics) have been analyzed. The sample includes 10 teachers and 290 students. Design and methods: Students' and teachers' verbal contributions were coded manual-based according to the level of complexity. Additionally, pre-post testing of knowledge in electricity and mechanics was applied to assess the students' learning gain. ANOVA analysis was used to characterize the influence of the complexity on the learning gain. Results: Results indicate that the mean level of complexity in classroom contributions explains a large portion of variance in post-test results on class level. Despite this overarching trend, taking classroom activities into account as well reveals even more fine-grained patterns, leading to more specific relations between the complexity in the classroom and students' achievement. Conclusions: In conclusion, we argue for more reflected teaching approaches intended to gradually increase class complexity to foster students' level of competency.

  20. Applying the design-build-test paradigm in microbiome engineering.

    PubMed

    Pham, Hoang Long; Ho, Chun Loong; Wong, Adison; Lee, Yung Seng; Chang, Matthew Wook

    2017-12-01

    The recently discovered roles of human microbiome in health and diseases have inspired research efforts across many disciplines to engineer microbiome for health benefits. In this review, we highlight recent progress in human microbiome research and how modifications to the microbiome could result in implications to human health. Furthermore, we discuss the application of a 'design-build-test' framework to expedite microbiome engineering efforts by reviewing current literature on three key aspects: design principles to engineer the human microbiome, methods to engineer microbiome with desired functions, and analytical techniques to examine complex microbiome samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Nucleic acid sequence detection using multiplexed oligonucleotide PCR

    DOEpatents

    Nolan, John P [Santa Fe, NM; White, P Scott [Los Alamos, NM

    2006-12-26

    Methods for rapidly detecting single or multiple sequence alleles in a sample nucleic acid are described. Provided are all of the oligonucleotide pairs capable of annealing specifically to a target allele and discriminating among possible sequences thereof, and ligating to each other to form an oligonucleotide complex when a particular sequence feature is present (or, alternatively, absent) in the sample nucleic acid. The design of each oligonucleotide pair permits the subsequent high-level PCR amplification of a specific amplicon when the oligonucleotide complex is formed, but not when the oligonucleotide complex is not formed. The presence or absence of the specific amplicon is used to detect the allele. Detection of the specific amplicon may be achieved using a variety of methods well known in the art, including without limitation, oligonucleotide capture onto DNA chips or microarrays, oligonucleotide capture onto beads or microspheres, electrophoresis, and mass spectrometry. Various labels and address-capture tags may be employed in the amplicon detection step of multiplexed assays, as further described herein.

  2. Experiment kits for processing biological samples inflight on SLS-2

    NASA Technical Reports Server (NTRS)

    Savage, P. D.; Hinds, W. E.; Jaquez, R.; Evans, J.; Dubrovin, L.

    1995-01-01

    This paper describes development of an innovative, modular approach to packaging the instruments used to obtain and preserve the inflight rodent tissue and blood samples associated with hematology experiments on the Spacelab Life Sciences-2 (SLS-2) mission. The design approach organized the multitude of instruments into twelve 5- x 6- x 1-in. kits, each of which was used for a particular experiment. Each kit contained the syringes, vials, microscope slides, etc., necessary for processing and storing blood and tissue samples for one rat on a particular day. A total of 1245 components, packaged into 128 kits and stowed in 17 Zero® boxes, were required. Crewmembers found the design easy to use and laid out in a logical, simple configuration that minimized the chance of error during the complex procedures in flight. This paper also summarizes inflight performance of the kits on SLS-2.

  3. The Direct Lighting Computation in Global Illumination Methods

    NASA Astrophysics Data System (ADS)

    Wang, Changyaw Allen

    1994-01-01

    Creating realistic images is a computationally expensive process, but it is very important for applications such as interior design, product design, education, virtual reality, and movie special effects. To generate realistic images, state-of-the-art rendering techniques are employed to simulate global illumination, which accounts for the interreflection of light among objects. In this document, we formalize the global illumination problem as an eight-dimensional integral and discuss various methods that can accelerate the process of approximating this integral. We focus on the direct lighting computation, which accounts for the light reaching the viewer from the emitting sources after exactly one reflection; on Monte Carlo sampling methods; and on light source simplification. Results include a new sample generation method, a framework for predicting the total number of samples used in a solution, and a generalized Monte Carlo approach for computing the direct lighting from an environment, which for the first time makes ray tracing feasible for highly complex environments.
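
    The textbook Monte Carlo estimator for direct lighting from an area source is sketched below (uniform area sampling on a rectangular light, Lambertian surface, no occlusion test, light treated as two-sided); it illustrates the kind of estimator discussed above rather than the dissertation's optimized sample generation.

        import numpy as np

        rng = np.random.default_rng(2)

        def direct_lighting(point, normal, light_corner, edge_u, edge_v, radiance, albedo, n=4096):
            light_vec = np.cross(edge_u, edge_v)
            area = np.linalg.norm(light_vec)
            light_normal = light_vec / area
            u, v = rng.random(n), rng.random(n)
            samples = light_corner + u[:, None] * edge_u + v[:, None] * edge_v
            to_light = samples - point
            dist2 = np.einsum("ij,ij->i", to_light, to_light)
            w = to_light / np.sqrt(dist2)[:, None]
            cos_x = np.clip(w @ normal, 0.0, None)        # cosine at the shaded point
            cos_y = np.abs(w @ light_normal)              # cosine at the light sample (two-sided)
            brdf = albedo / np.pi                         # Lambertian BRDF
            return area / n * np.sum(brdf * radiance * cos_x * cos_y / dist2)

        # unit-radiance 1 m x 1 m light, 2 m above a point on an upward-facing floor
        estimate = direct_lighting(
            point=np.array([0.0, 0.0, 0.0]), normal=np.array([0.0, 0.0, 1.0]),
            light_corner=np.array([-0.5, -0.5, 2.0]), edge_u=np.array([1.0, 0.0, 0.0]),
            edge_v=np.array([0.0, 1.0, 0.0]), radiance=1.0, albedo=0.8)
        print(estimate)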

  4. Color filter array design based on a human visual model

    NASA Astrophysics Data System (ADS)

    Parmar, Manu; Reeves, Stanley J.

    2004-05-01

    To reduce cost and complexity associated with registering multiple color sensors, most consumer digital color cameras employ a single sensor. A mosaic of color filters is overlaid on a sensor array such that only one color channel is sampled per pixel location. The missing color values must be reconstructed from available data before the image is displayed. The quality of the reconstructed image depends fundamentally on the array pattern and the reconstruction technique. We present a design method for color filter array patterns that use red, green, and blue color channels in an RGB array. A model of the human visual response for luminance and opponent chrominance channels is used to characterize the perceptual error between a fully sampled and a reconstructed sparsely-sampled image. Demosaicking is accomplished using Wiener reconstruction. To ensure that the error criterion reflects perceptual effects, reconstruction is done in a perceptually uniform color space. A sequential backward selection algorithm is used to optimize the error criterion to obtain the sampling arrangement. Two different types of array patterns are designed: non-periodic and periodic arrays. The resulting array patterns outperform commonly used color filter arrays in terms of the error criterion.

  5. Molecular Level Design Principle behind Optimal Sizes of Photosynthetic LH2 Complex: Taming Disorder through Cooperation of Hydrogen Bonding and Quantum Delocalization.

    PubMed

    Jang, Seogjoo; Rivera, Eva; Montemayor, Daniel

    2015-03-19

    The light harvesting 2 (LH2) antenna complex from purple photosynthetic bacteria is an efficient natural excitation energy carrier with well-known symmetric structure, but the molecular level design principle governing its structure-function relationship is unknown. Our all-atomistic simulations of nonnatural analogues of LH2 as well as those of a natural LH2 suggest that nonnatural sizes of LH2-like complexes could be built. However, stable and consistent hydrogen bonding (HB) between bacteriochlorophyll and the protein is shown to be possible only near naturally occurring sizes, leading to significantly smaller disorder than for nonnatural ones. Extensive quantum calculations of intercomplex exciton transfer dynamics, sampled for a large set of disorder, reveal that taming the negative effect of disorder through a reliable HB as well as quantum delocalization of the exciton is a critical mechanism that makes LH2 highly functional, which also explains why the natural sizes of LH2 are indeed optimal.

  6. Design strategies from sexual exploitation and sex work studies among women and girls: Methodological considerations in a hidden and vulnerable population.

    PubMed

    Gerassi, Lara; Edmond, Tonya; Nichols, Andrea

    2017-06-01

    The study of sex trafficking, prostitution, sex work, and sexual exploitation is associated with many methodological issues and challenges. Researchers' study designs must consider the many safety issues related to this vulnerable and hidden population. Community advisory boards and key stakeholder involvement are essential to study design to increase safety of participants, usefulness of study aims, and meaningfulness of conclusions. Nonrandomized sampling strategies are most often utilized when studying exploited women and girls, which have the capacity to provide rich data and require complex sampling and recruitment methods. This article reviews the current methodological issues when studying this marginalized population as well as strategies to address challenges while working with the community in order to bring about social change. The authors also discuss their own experiences in collaborating with community organizations to conduct research in this field.

  7. Design strategies from sexual exploitation and sex work studies among women and girls: Methodological considerations in a hidden and vulnerable population

    PubMed Central

    Gerassi, Lara; Edmond, Tonya; Nichols, Andrea

    2016-01-01

    The study of sex trafficking, prostitution, sex work, and sexual exploitation is associated with many methodological issues and challenges. Researchers’ study designs must consider the many safety issues related to this vulnerable and hidden population. Community advisory boards and key stakeholder involvement are essential to study design to increase safety of participants, usefulness of study aims, and meaningfulness of conclusions. Nonrandomized sampling strategies are most often utilized when studying exploited women and girls, which have the capacity to provide rich data and require complex sampling and recruitment methods. This article reviews the current methodological issues when studying this marginalized population as well as strategies to address challenges while working with the community in order to bring about social change. The authors also discuss their own experiences in collaborating with community organizations to conduct research in this field. PMID:28824337

  8. Fast prediction of the fatigue behavior of short-fiber-reinforced thermoplastics based on heat build-up measurements: application to heterogeneous cases

    NASA Astrophysics Data System (ADS)

    Serrano, Leonell; Marco, Yann; Le Saux, Vincent; Robert, Gilles; Charrier, Pierre

    2017-09-01

    Short-fiber-reinforced thermoplastics components for structural applications are usually very complex parts as stiffeners, ribs and thickness variations are used to compensate the quite low material intrinsic stiffness. These complex geometries induce complex local mechanical fields but also complex microstructures due to the injection process. Accounting for these two aspects is crucial for the design in regard to fatigue of these parts, especially for automotive industry. The aim of this paper is to challenge an energetic approach, defined to evaluate quickly the fatigue lifetime, on three different heterogeneous cases: a classic dog-bone sample with a skin-core microstructure and two structural samples representative of the thickness variations observed for industrial components. First, a method to evaluate dissipated energy fields from thermal measurements is described and is applied to the three samples in order to relate the cyclic loading amplitude to the fields of cyclic dissipated energy. Then, a local analysis is detailed in order to link the energy dissipated at the failure location to the fatigue lifetime and to predict the fatigue curve from the thermomechanical response of one single sample. The predictions obtained for the three cases are compared successfully to the Wöhler curves obtained with classic fatigue tests. Finally, a discussion is proposed to compare results for the three samples in terms of dissipation fields and fatigue lifetime. This comparison illustrates that, if the approach is leading to a very relevant diagnosis on each case, the dissipated energy field is not giving a straightforward access to the lifetime cartography as the relation between fatigue failure and dissipated energy seems to be dependent on the local mechanical and microstructural state.

  9. Employment of Near Full-Length Ribosome Gene TA-Cloning and Primer-Blast to Detect Multiple Species in a Natural Complex Microbial Community Using Species-Specific Primers Designed with Their Genome Sequences.

    PubMed

    Zhang, Huimin; He, Hongkui; Yu, Xiujuan; Xu, Zhaohui; Zhang, Zhizhou

    2016-11-01

    It remains an unsolved problem to quantify a natural microbial community by rapidly and conveniently measuring multiple species with functional significance. The most widely used high-throughput next-generation sequencing methods generate information mainly at the genus level for taxonomic identification and quantification, and detection of multiple species in a complex microbial community is still heavily dependent on approaches based on near full-length ribosomal RNA gene or genome sequence information. In this study, we used near full-length rRNA gene library sequencing plus Primer-Blast to design species-specific primers based on whole microbial genome sequences. The primers were intended to be specific at the species level within the relevant microbial communities, i.e., a defined genomic background. The primers were tested with samples collected from the Daqu (also called fermentation starters) and pit mud of a traditional Chinese liquor production plant. Sixteen pairs of primers were found to be suitable for identification of individual species. Among them, seven pairs were chosen to measure the abundance of microbial species through quantitative PCR. The combination of near full-length ribosomal RNA gene library sequencing and Primer-Blast may represent a broadly useful protocol for quantifying multiple species in complex microbial population samples with species-specific primers.

  10. Capturing heterogeneity: The role of a study area's extent for estimating mean throughfall

    NASA Astrophysics Data System (ADS)

    Zimmermann, Alexander; Voss, Sebastian; Metzger, Johanna Clara; Hildebrandt, Anke; Zimmermann, Beate

    2016-11-01

    The selection of an appropriate spatial extent of a sampling plot is one among several important decisions involved in planning a throughfall sampling scheme. In fact, the choice of the extent may determine whether or not a study can adequately characterize the hydrological fluxes of the studied ecosystem. Previous attempts to optimize throughfall sampling schemes focused on the selection of an appropriate sample size, support, and sampling design, while comparatively little attention has been given to the role of the extent. In this contribution, we investigated the influence of the extent on the representativeness of mean throughfall estimates for three forest ecosystems of varying stand structure. Our study is based on virtual sampling of simulated throughfall fields. We derived these fields from throughfall data sampled in a simply structured forest (young tropical forest) and two heterogeneous forests (old tropical forest, unmanaged mixed European beech forest). We then sampled the simulated throughfall fields with three common extents and various sample sizes for a range of events and for accumulated data. Our findings suggest that the size of the study area should be carefully adapted to the complexity of the system under study and to the required temporal resolution of the throughfall data (i.e. event-based versus accumulated). Generally, event-based sampling in complex structured forests (conditions that favor comparatively long autocorrelations in throughfall) requires the largest extents. For event-based sampling, the choice of an appropriate extent can be as important as using an adequate sample size.

  11. Career Decision Statuses among Portuguese Secondary School Students: A Cluster Analytical Approach

    ERIC Educational Resources Information Center

    Santos, Paulo Jorge; Ferreira, Joaquim Armando

    2012-01-01

    Career indecision is a complex phenomenon and an increasing number of authors have proposed that undecided individuals do not form a group with homogeneous characteristics. This study examines career decision statuses among a sample of 362 12th-grade Portuguese students. A cluster-analytical procedure, based on a battery of instruments designed to…

  12. Leadership Identities, Styles, and Practices of Women University Administrators and Presidents

    ERIC Educational Resources Information Center

    Wheat, Celeste A.; Hill, Lilian H.

    2016-01-01

    To understand the complex factors that influence women's experiences in senior administrative roles in higher education, the purpose of this study was to give voice to how they made meaning of their leadership experiences. We employed a qualitative design in which a criterion-based sample of 14 women senior administrators (i.e., dean, vice…

  13. Ethnic and Racial Socialization and Self-Esteem of Asian Adoptees: The Mediating Role of Multiple Identities

    ERIC Educational Resources Information Center

    Mohanty, Jayashree

    2013-01-01

    Positive identity development during adolescence in general is a complex process and may pose additional challenges for adolescents adopted from a different culture. Using a web-based survey design with a sample of 100 internationally adopted Asian adolescent and young adults, the present study examined the mediating role of multiple identities…

  14. A novel quality by design approach for developing an HPLC method to analyze herbal extracts: A case study of sugar content analysis.

    PubMed

    Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu

    2018-01-01

    The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, the critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. An HPLC-ELSD method for the separation and quantification of sugars in Codonopsis Radix extract (CRE) and Astragali Radix extract (ARE) samples was developed as an example of this AQbD approach. Potential CMAs and potential CMPs were identified from the analytical target profile. After the screening experiments, the retention time of the D-glucose peak in CRE samples, the signal-to-noise ratio of the D-glucose peak in CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, the flow rate, and the column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using Monte Carlo simulation and verified by experiments. The optimized method was validated as accurate and precise, and was then applied to the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for samples with complex compositions.
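
    The probability-based design space step can be illustrated with a minimal sketch, assuming a fitted model linking the CMPs to each CMA and hypothetical acceptance limits (all coefficients, limits and noise levels below are invented for illustration, not taken from the study):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical fitted models: CMA = f(CMPs) + prediction error (coefficients are illustrative only)
    def predict_cmas(flow_rate, col_temp, noise):
        rt_glucose = 12.0 - 4.0 * flow_rate + 0.05 * col_temp + noise[0]    # retention time (min)
        snr_glucose = 80.0 + 30.0 * flow_rate - 0.5 * col_temp + noise[1]   # signal-to-noise ratio
        return rt_glucose, snr_glucose

    def prob_meeting_specs(flow_rate, col_temp, n_sim=2000):
        """Estimate P(all CMAs within limits) at one CMP setting via Monte Carlo."""
        ok = 0
        for _ in range(n_sim):
            noise = rng.normal(0.0, [0.3, 5.0])            # assumed prediction-error SDs
            rt, snr = predict_cmas(flow_rate, col_temp, noise)
            ok += (8.0 < rt < 11.0) and (snr > 50.0)       # hypothetical acceptance criteria
        return ok / n_sim

    # Scan a grid of CMPs; points whose probability exceeds a threshold form the design space
    for fr in (0.4, 0.6, 0.8):
        for temp in (25, 35):
            print(fr, temp, prob_meeting_specs(fr, temp))
    ```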

  15. Design of a WSN for the Sampling of Environmental Variability in Complex Terrain

    PubMed Central

    Martín-Tardío, Miguel A.; Felicísimo, Ángel M.

    2014-01-01

    In-situ environmental parameter measurements using sensor systems connected to a wireless network have become widespread, but the problem of monitoring large and mountainous areas by means of a wireless sensor network (WSN) is not well resolved. The main reasons for this are: (1) the environmental variability distribution is unknown in the field; (2) without this knowledge, a huge number of sensors would be necessary to ensure complete coverage of the environmental variability; and (3) WSN design requirements, for example, effective connectivity (intervisibility), limiting distances and controlled redundancy, are usually solved by trial and error. Using temperature as the target environmental variable, we propose: (1) a method to determine the homogeneous environmental classes to be sampled using a digital elevation model (DEM) and geometric simulations; and (2) a procedure to determine an effective WSN design in complex terrain in terms of the number of sensors, redundancy, cost and spatial distribution. The proposed methodology, based on geographic information systems and binary integer programming, can be easily adapted to a wide range of applications that need exhaustive and continuous environmental monitoring with high spatial resolution. The results show that the WSN design is perfectly suited to the topography and the technical specifications of the sensors, and provides complete coverage of the environmental variability in terms of Sun exposure. However, these results still need to be validated in the field and the proposed procedure must be refined. PMID:25412218

  16. Practical aspects of complex permittivity reconstruction with neural-network-controlled FDTD modeling of a two-port fixture.

    PubMed

    Eves, E Eugene; Murphy, Ethan K; Yakovlev, Vadim V

    2007-01-01

    The paper discusses the characteristics of a new modeling-based technique for determining the dielectric properties of materials. Complex permittivity is found with an optimization algorithm designed to match complex S-parameters obtained from measurements and from 3D FDTD simulation. The method is developed on a two-port (waveguide-type) fixture and deals with complex reflection and transmission characteristics at the frequency of interest. The computational part is constructed as an inverse-RBF-network-based procedure that reconstructs the dielectric constant and loss factor of the sample from the FDTD modeling data sets and the measured reflection and transmission coefficients. As such, it is applicable to samples and cavities of arbitrary configuration provided that the geometry of the experimental setup is adequately represented by the FDTD model. The practical implementation of the method considered in this paper is a section of a WR975 waveguide containing a sample of a liquid in a cylindrical cutout of a rectangular Teflon cup. The method is run in two stages and employs two databases: the first, built for a sparse grid on the complex permittivity plane, is used to locate a domain containing the anticipated solution, and the second, a denser grid covering the determined domain, is used to find the exact location of the complex permittivity point. Numerical tests demonstrate that the computational part of the method is highly accurate even when the modeling data are represented by relatively small data sets. When working with reflection and transmission coefficients measured in an actual experimental fixture and reconstructing a low dielectric constant and loss factor, the technique may be less accurate. It is shown that the employed neural network is capable of finding the complex permittivity of the sample even when the measured reflection and transmission coefficients are noise-contaminated. A special modeling test is proposed for validating the results; it confirms that the values of complex permittivity for several liquids (including salt water, acetone and three types of alcohol) at 915 MHz are reconstructed with satisfactory accuracy.

  17. Molecular design of boronic acid-functionalized squarylium cyanine dyes for multiple discriminant analysis of sialic acid in biological samples: selectivity toward monosaccharides controlled by different alkyl side chain lengths.

    PubMed

    Ouchi, Kazuki; Colyer, Christa L; Sebaiy, Mahmoud; Zhou, Jin; Maeda, Takeshi; Nakazumi, Hiroyuki; Shibukawa, Masami; Saito, Shingo

    2015-02-03

    We designed a new series of boronic acid-functionalized squarylium cyanine dyes (SQ-BA) with different alkyl chain lengths, suitable for multiple discriminant analysis (MDA) of sialic acid (Neu5Ac) in biological samples. The SQ-BA dyes form aggregates through hydrophobic interactions, which quench their fluorescence in aqueous solutions. When the boronic acid binds a saccharide, the fluorescence intensity increases as a result of dissociation to the emissive monomeric complex. We inferred that different dye aggregate structures (H-aggregates and J-aggregates) were induced depending on the alkyl chain length, so that monosaccharides would be recognized in different ways (in particular, through multipoint interaction with J-aggregates). A distinctive emission enhancement of the shorter-alkyl-chain SQ-BA dyes in the presence of Neu5Ac was observed (2.4-fold fluorescence enhancement; formation constant 10^1.7 M^-1), with no such enhancement for the longer-alkyl-chain SQ-BA dyes. In addition, various enhancement factors for other monosaccharides were observed depending on the alkyl chain length. Detailed thermodynamic and NMR studies of the SQ-BA complexes revealed the unique recognition mechanism: the shorter-alkyl-chain dye aggregate adopts a slipped-parallel structure and forms a stable 2:1 complex with Neu5Ac, whereas the longer-alkyl-chain dyes form a 1:1 monomeric complex. MDA using the four SQ-BA dyes was performed on human urine samples, resulting in successful discrimination between normal and abnormal Neu5Ac levels characteristic of disease. Thus, we successfully controlled the responses to similar monosaccharides with a novel approach that chemically modifies not the boronic acid moiety itself but the length of the alkyl chain residue attached to the dye in order to generate specificity.

  18. [Methodological design for the National Survey Violence Against Women in Mexico].

    PubMed

    Olaiz, Gustavo; Franco, Aurora; Palma, Oswaldo; Echarri, Carlos; Valdez, Rosario; Herrera, Cristina

    2006-01-01

    To describe the methodology, research designs, estimation and sample selection, variable definitions, data collection instruments, and operative design and analytical procedures of the National Survey of Violence Against Women in Mexico. A complex (two-step) cross-sectional study was designed, and the qualitative component was carried out using in-depth interviews and participant observation in health care units. For the quantitative study we obtained a total of 26 240 interviews with women users of health services and 2 636 questionnaires from health workers; the survey is representative of the 32 Mexican states. For the qualitative study, 26 in-depth interviews were conducted with female users and 60 interviews with health workers in the states of Quintana Roo, Coahuila and the Federal District.

  19. Tackling complexities in understanding the social determinants of health: the contribution of ethnographic research

    PubMed Central

    2011-01-01

    Objective: The complexities inherent in understanding the social determinants of health are often not well-served by quantitative approaches. My aim is to show that well-designed and well-conducted ethnographic studies have an important contribution to make in this regard. Ethnographic research designs are a difficult but rigorous approach to research questions that require us to understand the complexity of people’s social and cultural lives. Approach: I draw on an ethnographic study to describe the complexities of studying maternal health in a rural area in India. I then show how the lessons learnt in that setting and context can be applied to studies done in very different settings. Results: I show how ethnographic research depends for rigour on a theoretical framework for sample selection; why immersion in the community under study, and rapport building with research participants, is important to ensure rich and meaningful data; and how flexible approaches to data collection lead to the gradual emergence of an analysis based on intense cross-referencing with community views and thus a conclusion that explains the similarities and differences observed. Conclusion: When using an ethnographic research design it can be difficult to specify in advance the exact details of the study design. Researchers can encounter issues in the field that require them to change what they planned on doing. In rigorous ethnographic studies, the researcher in the field is the research instrument and needs to be well trained in the method. Implication: Ethnographic research is challenging, but nevertheless provides a rewarding way of researching complex health problems that require an understanding of the social and cultural determinants of health. PMID:22168509

  20. DART - LTQ ORBITRAP as an expedient tool for the identification of synthetic cannabinoids.

    PubMed

    Habala, Ladislav; Valentová, Jindra; Pechová, Iveta; Fuknová, Mária; Devínsky, Ferdinand

    2016-05-01

    Synthetic cannabinoids as designer drugs constitute a major problem due to their rapid increase in number and the difficulties connected with their identification in complex mixtures. DART (Direct Analysis in Real Time) has emerged as an advantageous tool for the direct and rapid analysis of complex samples by mass spectrometry. Here we report on the identification of six synthetic cannabinoids originating from seized material in various matrices, employing the combination of ambient pressure ion source DART and hybrid ion trap - LTQ ORBITRAP mass spectrometer. This report also describes the sampling techniques for the provided herbal material containing the cannabinoids, either directly as plant parts or as an extract in methanol and their influence on the outcome of the analysis. The high resolution mass spectra supplied by the LTQ ORBITRAP instrument allowed for an unambiguous assignment of target compounds. The utilized instrumental coupling proved to be a convenient way for the identification of synthetic cannabinoids in real-world samples. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  1. Imaging system design and image interpolation based on CMOS image sensor

    NASA Astrophysics Data System (ADS)

    Li, Yu-feng; Liang, Fei; Guo, Rui

    2009-11-01

    An image acquisition system is introduced, which consists of a color CMOS image sensor (OV9620), SRAM (CY62148), a CPLD (EPM7128AE) and a DSP (TMS320VC5509A). The CPLD implements the logic and timing control for the system, the SRAM stores the image data, and the DSP controls the image acquisition system through the SCCB (OmniVision Serial Camera Control Bus). The timing sequence of the CMOS image sensor OV9620 is analyzed, and the imaging part and the high-speed image data memory unit are designed. The hardware and software design of the image acquisition and processing system is given. CMOS digital cameras use color filter arrays to sample different spectral components, such as red, green, and blue. At the location of each pixel only one color sample is taken, and the other colors must be interpolated from neighboring samples. We use an edge-oriented adaptive interpolation algorithm for the edge pixels and a bilinear interpolation algorithm for the non-edge pixels to improve the visual quality of the interpolated images. This method achieves high processing speed, reduces computational complexity, and effectively preserves image edges.
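
    As a rough illustration of the interpolation step described above (a sketch, not the authors' DSP implementation), the following assumes an RGGB Bayer layout and a hypothetical gradient threshold; it uses bilinear averaging in smooth regions and a direction-guided estimate where an edge is detected:

    ```python
    import numpy as np

    def interpolate_green(raw, edge_threshold=20.0):
        """Fill in the missing green samples of a Bayer (RGGB) mosaic.

        Non-edge pixels use a bilinear average of the four green neighbours;
        edge pixels (large difference between horizontal and vertical gradients)
        are interpolated along the direction of the smaller gradient so that
        edges are preserved.
        """
        raw = np.asarray(raw, dtype=float)
        h, w = raw.shape
        green = raw.copy()
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                if (y + x) % 2 == 1:                       # green is already sampled here (RGGB)
                    continue
                grad_h = abs(raw[y, x - 1] - raw[y, x + 1])
                grad_v = abs(raw[y - 1, x] - raw[y + 1, x])
                if abs(grad_h - grad_v) > edge_threshold:  # edge pixel: directional estimate
                    if grad_h < grad_v:
                        green[y, x] = (raw[y, x - 1] + raw[y, x + 1]) / 2.0
                    else:
                        green[y, x] = (raw[y - 1, x] + raw[y + 1, x]) / 2.0
                else:                                      # non-edge pixel: bilinear average
                    green[y, x] = (raw[y, x - 1] + raw[y, x + 1] +
                                   raw[y - 1, x] + raw[y + 1, x]) / 4.0
        return green
    ```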

  2. SABRE: a method for assessing the stability of gene modules in complex tissues and subject populations.

    PubMed

    Shannon, Casey P; Chen, Virginia; Takhar, Mandeep; Hollander, Zsuzsanna; Balshaw, Robert; McManus, Bruce M; Tebbutt, Scott J; Sin, Don D; Ng, Raymond T

    2016-11-14

    Gene network inference (GNI) algorithms can be used to identify sets of coordinately expressed genes, termed network modules from whole transcriptome gene expression data. The identification of such modules has become a popular approach to systems biology, with important applications in translational research. Although diverse computational and statistical approaches have been devised to identify such modules, their performance behavior is still not fully understood, particularly in complex human tissues. Given human heterogeneity, one important question is how the outputs of these computational methods are sensitive to the input sample set, or stability. A related question is how this sensitivity depends on the size of the sample set. We describe here the SABRE (Similarity Across Bootstrap RE-sampling) procedure for assessing the stability of gene network modules using a re-sampling strategy, introduce a novel criterion for identifying stable modules, and demonstrate the utility of this approach in a clinically-relevant cohort, using two different gene network module discovery algorithms. The stability of modules increased as sample size increased and stable modules were more likely to be replicated in larger sets of samples. Random modules derived from permutated gene expression data were consistently unstable, as assessed by SABRE, and provide a useful baseline value for our proposed stability criterion. Gene module sets identified by different algorithms varied with respect to their stability, as assessed by SABRE. Finally, stable modules were more readily annotated in various curated gene set databases. The SABRE procedure and proposed stability criterion may provide guidance when designing systems biology studies in complex human disease and tissues.
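
    A minimal sketch of the underlying idea, not the published SABRE implementation: re-derive modules on bootstrap resamples of the sample set and score each original module by its best overlap (here Jaccard similarity) across resamples; the module discovery routine is left abstract:

    ```python
    import numpy as np

    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if (a or b) else 0.0

    def module_stability(expr, discover_modules, n_boot=100, seed=0):
        """Score module stability across bootstrap resamples of the samples.

        expr: genes x samples matrix; discover_modules: callable returning a list
        of gene-index sets (any module discovery algorithm can be plugged in).
        Returns, for each module found on the full data, its mean best-match
        similarity across resamples (higher = more stable).
        """
        rng = np.random.default_rng(seed)
        reference = discover_modules(expr)
        n = expr.shape[1]
        scores = np.zeros((len(reference), n_boot))
        for b in range(n_boot):
            idx = rng.integers(0, n, size=n)           # resample samples with replacement
            boot_modules = discover_modules(expr[:, idx])
            for i, mod in enumerate(reference):
                scores[i, b] = max((jaccard(mod, m) for m in boot_modules), default=0.0)
        return scores.mean(axis=1)
    ```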

  3. Periodontal disease associated with red complex bacteria in dogs.

    PubMed

    Di Bello, A; Buonavoglia, A; Franchini, D; Valastro, C; Ventrella, G; Greco, M F; Corrente, M

    2014-03-01

    Red complex bacteria (Treponema denticola, Tannerella forsythia and Porphyromonas gingivalis) play a major role in the aetiology of periodontal disease in humans. This study was designed to evaluate the association of such bacteria with periodontal disease in dogs. Seventy-three subgingival samples taken from dogs ranging from 2 months to 12 years of age (median age 4 years) were tested for red complex bacteria using a polymerase chain reaction assay. Thirty-six of 73 (49.3%) dogs were found to be positive for T. forsythia and P. gingivalis. Dogs with gingivitis or periodontitis were more likely to be infected with T. forsythia and P. gingivalis [odds ratio (OR) 5.4 (confidence interval (CI) 1.9-15.6), P = 0.002] than healthy animals. Only 3 of 73 samples (4.1%) were positive for red complex bacteria, but the association with periodontal disease was not significant. The results indicate that the involvement of red complex bacteria in periodontal disease in dogs is similar to that observed in humans. Only the concurrent presence of T. forsythia and P. gingivalis was correlated with periodontal disease in dogs in this study. © 2014 British Small Animal Veterinary Association.
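
    For readers unfamiliar with the reported statistic, the sketch below shows how an odds ratio and its 95% confidence interval are computed from a 2x2 table; the counts are invented for illustration and are not the study's data:

    ```python
    import math

    # Hypothetical 2x2 table (not the study's counts):
    #                 infected   not infected
    # diseased dogs      a=30         b=10
    # healthy dogs       c=6          d=27
    a, b, c, d = 30, 10, 6, 27

    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)              # SE of log(OR), Woolf method
    lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)    # 95% confidence limits
    hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    print(f"OR = {odds_ratio:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
    ```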

  4. Spatial patterning in PM2.5 constituents under an inversion-focused sampling design across an urban area of complex terrain

    PubMed Central

    Tunno, Brett J; Dalton, Rebecca; Michanowicz, Drew R; Shmool, Jessie L C; Kinnee, Ellen; Tripathy, Sheila; Cambal, Leah; Clougherty, Jane E

    2016-01-01

    Health effects of fine particulate matter (PM2.5) vary by chemical composition, and composition can help to identify key PM2.5 sources across urban areas. Further, this intra-urban spatial variation in concentrations and composition may vary with meteorological conditions (e.g., mixing height). Accordingly, we hypothesized that spatial sampling during atmospheric inversions would help to better identify localized source effects, and reveal more distinct spatial patterns in key constituents. We designed a 2-year monitoring campaign to capture fine-scale intra-urban variability in PM2.5 composition across Pittsburgh, PA, and compared both spatial patterns and source effects during “frequent inversion” hours vs 24-h weeklong averages. Using spatially distributed programmable monitors, and a geographic information systems (GIS)-based design, we collected PM2.5 samples across 37 sampling locations per year to capture variation in local pollution sources (e.g., proximity to industry, traffic density) and terrain (e.g., elevation). We used inductively coupled plasma mass spectrometry (ICP-MS) to determine elemental composition, and unconstrained factor analysis to identify source suites by sampling scheme and season. We examined spatial patterning in source factors using land use regression (LUR), wherein GIS-based source indicators served to corroborate factor interpretations. Under both summer sampling regimes, and for winter inversion-focused sampling, we identified six source factors, characterized by tracers associated with brake and tire wear, steel-making, soil and road dust, coal, diesel exhaust, and vehicular emissions. For winter 24-h samples, four factors suggested traffic/fuel oil, traffic emissions, coal/industry, and steel-making sources. In LURs, as hypothesized, GIS-based source terms better explained spatial variability in inversion-focused samples, including a greater contribution from roadway, steel, and coal-related sources. Factor analysis produced source-related constituent suites under both sampling designs, though factors were more distinct under inversion-focused sampling. PMID:26507005

  5. Cavity-enhanced measurements for determining dielectric-membrane thickness and complex index of refraction.

    PubMed

    Stambaugh, Corey; Durand, Mathieu; Kemiktarak, Utku; Lawall, John

    2014-08-01

    The material properties of silicon nitride (SiN) play an important role in the performance of SiN membranes used in optomechanical applications. An optimum design of a subwavelength high-contrast grating requires accurate knowledge of the membrane thickness and index of refraction, and its performance is ultimately limited by material absorption. Here we describe a cavity-enhanced method to measure the thickness and complex index of refraction of dielectric membranes with small, but nonzero, absorption coefficients. By determining Brewster's angle and an angle at which reflection is minimized by means of destructive interference, both the real part of the index of refraction and the sample thickness can be measured. A comparison of the losses in the empty cavity and the cavity containing the dielectric sample provides a measurement of the absorption.

  6. Integrated control-system design via generalized LQG (GLQG) theory

    NASA Technical Reports Server (NTRS)

    Bernstein, Dennis S.; Hyland, David C.; Richter, Stephen; Haddad, Wassim M.

    1989-01-01

    Thirty years of control systems research has produced an enormous body of theoretical results in feedback synthesis. Yet such results see relatively little practical application, and there remains an unsettling gap between classical single-loop techniques (Nyquist, Bode, root locus, pole placement) and modern multivariable approaches (LQG and H-infinity theory). Large-scale, complex systems, such as high-performance aircraft and flexible space structures, now demand efficient, reliable design of multivariable feedback controllers that optimally trade off performance against modeling accuracy, bandwidth, sensor noise, actuator power, and control law complexity. A methodology is described which encompasses numerous practical design constraints within a single unified formulation. The approach, which is based upon coupled systems of modified Riccati and Lyapunov equations, encompasses time-domain linear-quadratic-Gaussian theory and frequency-domain H-infinity theory, as well as classical objectives such as gain and phase margin via the Nyquist circle criterion. In addition, this approach encompasses the optimal projection approach to reduced-order controller design. The current status of the overall theory is reviewed, including both continuous-time and discrete-time (sampled-data) formulations.

  7. Thimble microscope system

    NASA Astrophysics Data System (ADS)

    Kamal, Tahseen; Rubinstein, Jaden; Watkins, Rachel; Cen, Zijian; Kong, Gary; Lee, W. M.

    2016-12-01

    Wearable computing devices, e.g. Google Glass and smart watches, embody the new frontier of human-centered design, where technology interfaces seamlessly with human gestures. During examination of any subject in the field (clinic, surgery, agriculture, field survey, water collection), our sensory peripherals of touch and vision often go hand in hand. The sensitivity and maneuverability of the human fingers are supported by a dense distribution of nerve cells, allowing fine motor manipulation over a range of complex surfaces that are often out of sight. Our naked vision, on the other hand, is generally restricted to the line of sight and is ill-suited to viewing around corners. Hence, conventional imaging methods often resort to complex light-guide designs (periscopes, endoscopes, etc.) to navigate over obstructed surfaces. Using modular design strategies, we constructed a prototype miniature microscope system incorporated onto a wearable fixture (a thimble). This unique platform allows users to maneuver around a sample and take high-resolution microscopic images. In this paper, we provide an exposition of the methods used to achieve thimble microscopy: microscope lens fabrication, thimble design, and integration of a miniature camera and liquid crystal display.

  8. Revision of the ant genus Melophorus (Hymenoptera, Formicidae)

    PubMed Central

    Heterick, Brian E.; Castalanelli, Mark; Shattuck, Steve O.

    2017-01-01

    Abstract The fauna of the purely Australian formicine ant genus Melophorus (Hymenoptera: Formicidae) is revised. This project involved integrated morphological and molecular taxonomy using one mitochondrial gene (COI) and four nuclear genes (AA, H3, LR and Wg). Seven major clades were identified and are here designated as the M. aeneovirens, M. anderseni, M. biroi, M. fulvihirtus, M. ludius, M. majeri and M. potteri species-groups. Within these clades, smaller complexes of similar species were also identified and designated species-complexes. The M. ludius species-group was identified purely on molecular grounds, as the morphology of its members is indistinguishable from typical members of the M. biroi species-complex within the M. biroi species-group. Most species-complexes sampled were also found to be monophyletic. Sequencing generally supported monophyly in taxa sampled but some species of the M. fieldi complex and M. biroi were not monophyletic and the implications arising from this are discussed in this monograph. Based on morphology, ninety-three species are recognized, 73 described as new. A further new species (here called 'Species K' [TERC Collection]) is noted in the taxonomic list, but is not described in this work. One species is removed from Melophorus: M. scipio Forel is here placed provisionally in Prolasius. Six species and five subspecies pass into synonymy. Of the full species, M. constans Santschi, M. iridescens (Emery) and M. insularis Wheeler are synonymized under M. aeneovirens (Lowne), M. pillipes Santschi is synonymized under M. turneri Forel, M. marius Forel is synonymized under M. biroi Forel, and M. omniparens Forel is synonymized under M. wheeleri Forel. Of the subspecies, M. iridescens fraudatrix and M. iridescens froggatti Forel are synonymized under M. aeneovirens (Lowne), M. turneri aesopus Forel and M. turneri candidus Santschi are synonymized under M. turneri Forel and M. fieldi propinqua Viehmeyer is synonymized under M. biroi. Camponotus cowlei Froggatt is reinstated as a junior synonym of Melophorus bagoti Lubbock. In addition, the subspecies M. fieldi major Forel, M. ludius sulla Forel and M. turneri perthensis Forel are raised to species. A key to workers of the genus is supplied. A lectotype is designated for M. curtus Forel, M. sulla, and M. turneri. PMID:29358897

  9. Specific and Non-Specific Protein Association in Solution: Computation of Solvent Effects and Prediction of First-Encounter Modes for Efficient Configurational Bias Monte Carlo Simulations

    PubMed Central

    Cardone, Antonio; Pant, Harish; Hassan, Sergio A.

    2013-01-01

    Weak and ultra-weak protein-protein association play a role in molecular recognition, and can drive spontaneous self-assembly and aggregation. Such interactions are difficult to detect experimentally, and are a challenge to the force field and sampling technique. A method is proposed to identify low-population protein-protein binding modes in aqueous solution. The method is designed to identify preferential first-encounter complexes from which the final complex(es) at equilibrium evolves. A continuum model is used to represent the effects of the solvent, which accounts for short- and long-range effects of water exclusion and for liquid-structure forces at protein/liquid interfaces. These effects control the behavior of proteins in close proximity and are optimized based on binding enthalpy data and simulations. An algorithm is described to construct a biasing function for self-adaptive configurational-bias Monte Carlo of a set of interacting proteins. The function allows mixing large and local changes in the spatial distribution of proteins, thereby enhancing sampling of relevant microstates. The method is applied to three binary systems. Generalization to multiprotein complexes is discussed. PMID:24044772

  10. Latin American Study of Nutrition and Health (ELANS): rationale and study design.

    PubMed

    Fisberg, M; Kovalskys, I; Gómez, G; Rigotti, A; Cortés, L Y; Herrera-Cuenca, M; Yépez, M C; Pareja, R G; Guajardo, V; Zimberg, I Z; Chiavegatto Filho, A D P; Pratt, M; Koletzko, B; Tucker, K L

    2016-01-30

    Obesity is growing at an alarming rate in Latin America. Lifestyle behaviours such as physical activity and dietary intake have been largely associated with obesity in many countries; however studies that combine nutrition and physical activity assessment in representative samples of Latin American countries are lacking. The aim of this study is to present the design rationale of the Latin American Study of Nutrition and Health/Estudio Latinoamericano de Nutrición y Salud (ELANS) with a particular focus on its quality control procedures and recruitment processes. The ELANS is a multicenter cross-sectional nutrition and health surveillance study of a nationally representative sample of urban populations from eight Latin American countries (Argentina, Brazil, Chile, Colombia, Costa Rica, Ecuador, Perú and Venezuela). A standard study protocol was designed to evaluate the nutritional intakes, physical activity levels, and anthropometric measurements of 9000 enrolled participants. The study was based on a complex, multistage sample design and the sample was stratified by gender, age (15 to 65 years old) and socioeconomic level. A small-scale pilot study was performed in each country to test the procedures and tools. This study will provide valuable information and a unique dataset regarding Latin America that will enable cross-country comparisons of nutritional statuses that focus on energy and macro- and micronutrient intakes, food patterns, and energy expenditure. Clinical Trials NCT02226627.

  11. Effects of chemical disinfectant solutions on the stability and accuracy of the dental impression complex.

    PubMed

    Rios, M P; Morgano, S M; Stein, R S; Rose, L

    1996-10-01

    Currently available impression materials were not designed for disinfection or sterilization, and it is conceivable that disinfectants may adversely affect impressions. This study evaluated the accuracy and dimensional stability of polyether (Permadyne/Impregum) and polyvinyl siloxane (Express) impression materials retained by their adhesives in two different acrylic resin tray designs (perforated and nonperforated) when the materials were immersed for either 30 or 60 minutes in three high-level disinfectants. Distilled water and no solution served as controls. A stainless steel test analog similar to ADA specification No. 19 was used. A total of 400 impressions were made with all combinations of impression materials, tray designs, disinfectant, and soaking times. Samples were evaluated microscopically before and after immersion and 48 hours after soaking. Results indicated that these two impression materials were dimensionally stable. Because the results emphasized the stability and accuracy of the impression complex under various conditions, dentists can perform disinfection procedures similar to the protocol of this study without concern for clinically significant distortion of the impression.

  12. Investigation of mechanical properties for open cellular structure CoCrMo alloy fabricated by selective laser melting process

    NASA Astrophysics Data System (ADS)

    Azidin, A.; Taib, Z. A. M.; Harun, W. S. W.; Che Ghani, S. A.; Faisae, M. F.; Omar, M. A.; Ramli, H.

    2015-12-01

    Orthodontic implants have been a major focus of research into mechanical and biological performance, particularly for fabricating complex anatomical shapes. Designing a part with a complex mechanism is a challenging process, and achieving the desired balance of mechanical performance depends on choosing the right manufacturing technique. Metal additive manufacturing (MAM) is the newest fabrication technology brought forward in this field. In this study, the selective laser melting (SLM) process was applied to a medical-grade cobalt-chrome-molybdenum (CoCrMo) alloy. The work focused on the mechanical properties of CoCrMo open cellular structure samples with 60%, 70%, and 80% designed volume porosity, which could potentially emulate the properties of human bone. It was observed that hardness values decreased as the soaking time increased, except for the bottom face. In the compression tests, the 60% designed volume porosity demonstrated the highest ultimate compressive strength compared with 70% and 80%.

  13. Approximate median regression for complex survey data with skewed response.

    PubMed

    Fraser, Raphael André; Lipsitz, Stuart R; Sinha, Debajyoti; Fitzmaurice, Garrett M; Pan, Yi

    2016-12-01

    The ready availability of public-use data from various large national complex surveys has immense potential for the assessment of population characteristics using regression models. Complex surveys can be used to identify risk factors for important diseases such as cancer. Existing statistical methods based on estimating equations and/or utilizing resampling methods are often not valid with survey data due to complex survey design features, that is, stratification, multistage sampling, and weighting. In this article, we accommodate these design features in the analysis of highly skewed response variables arising from large complex surveys. Specifically, we propose a double-transform-both-sides (DTBS)-based estimating equations approach to estimate the median regression parameters of the highly skewed response; the DTBS approach applies the same Box-Cox type transformation twice to both the outcome and regression function. The usual sandwich variance estimate can be used in our approach, whereas a resampling approach would be needed for a pseudo-likelihood based on minimizing absolute deviations (MAD). Furthermore, the approach is relatively robust to the true underlying distribution, and has much smaller mean square error than a MAD approach. The method is motivated by an analysis of laboratory data on urinary iodine (UI) concentration from the National Health and Nutrition Examination Survey. © 2016, The International Biometric Society.

  14. Approximate Median Regression for Complex Survey Data with Skewed Response

    PubMed Central

    Fraser, Raphael André; Lipsitz, Stuart R.; Sinha, Debajyoti; Fitzmaurice, Garrett M.; Pan, Yi

    2016-01-01

    Summary The ready availability of public-use data from various large national complex surveys has immense potential for the assessment of population characteristics using regression models. Complex surveys can be used to identify risk factors for important diseases such as cancer. Existing statistical methods based on estimating equations and/or utilizing resampling methods are often not valid with survey data due to complex survey design features, that is, stratification, multistage sampling and weighting. In this paper, we accommodate these design features in the analysis of highly skewed response variables arising from large complex surveys. Specifically, we propose a double-transform-both-sides (DTBS)-based estimating equations approach to estimate the median regression parameters of the highly skewed response; the DTBS approach applies the same Box-Cox type transformation twice to both the outcome and regression function. The usual sandwich variance estimate can be used in our approach, whereas a resampling approach would be needed for a pseudo-likelihood based on minimizing absolute deviations (MAD). Furthermore, the approach is relatively robust to the true underlying distribution, and has much smaller mean square error than a MAD approach. The method is motivated by an analysis of laboratory data on urinary iodine (UI) concentration from the National Health and Nutrition Examination Survey. PMID:27062562

  15. A binary logistic regression model with complex sampling design of unmet need for family planning among all women aged (15-49) in Ethiopia.

    PubMed

    Workie, Demeke Lakew; Zike, Dereje Tesfaye; Fenta, Haile Mekonnen; Mekonnen, Mulusew Admasu

    2017-09-01

    Unintended pregnancy related to unmet need is a worldwide problem that affects societies. The main objective of this study was to identify the prevalence and determinants of unmet need for family planning among women aged 15-49 in Ethiopia. The Performance Monitoring and Accountability 2020/Ethiopia survey was conducted in April 2016 (round 4) among 7494 women selected with two-stage stratified sampling. Bivariable and multivariable binary logistic regression models with a complex sampling design were fitted. The prevalence of unmet need for family planning was 16.2% in Ethiopia. Women aged 15-24 years were 2.266 times more likely to have an unmet need for family planning than women above 35 years. Women who were currently married were about 8 times more likely to have an unmet need for family planning than never-married women. Women who had no child under five were 0.125 times as likely to have an unmet need for family planning as those who had more than two children under five. The key determinants of unmet need for family planning in Ethiopia were residence, age, marital status, education, number of household members, birth events and number of children under five. Thus, the Government of Ethiopia should take immediate steps to address the causes of the high unmet need for family planning among women.
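
    A minimal sketch of a design-weighted logistic regression of this kind, using invented variable names, data and weights rather than the PMA2020 file; note that the model-based standard errors produced here ignore the stratified two-stage design, for which design-based errors would require Taylor linearization or replication methods:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500

    # Hypothetical analysis file: one row per woman, with a sampling (design) weight
    df = pd.DataFrame({
        "unmet_need": rng.binomial(1, 0.16, n),      # 1 = unmet need for family planning
        "age_15_24":  rng.binomial(1, 0.40, n),
        "married":    rng.binomial(1, 0.70, n),
        "weight":     rng.uniform(0.5, 2.0, n),      # inverse-probability sampling weight
    })

    X = sm.add_constant(df[["age_15_24", "married"]])
    fit = sm.GLM(df["unmet_need"], X,
                 family=sm.families.Binomial(),
                 freq_weights=df["weight"]).fit()    # design-weighted point estimates
    print(np.exp(fit.params))                        # odds ratios
    # The reported standard errors are model-based; clustering and stratification
    # would still have to be accounted for separately.
    ```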

  16. Design and evaluation of a new Peltier-cooled laser ablation cell with on-sample temperature control.

    PubMed

    Konz, Ioana; Fernández, Beatriz; Fernández, M Luisa; Pereiro, Rosario; Sanz-Medel, Alfredo

    2014-01-27

    A new custom-built Peltier-cooled laser ablation cell is described. The proposed cryogenic cell combines a small internal volume (20 cm(3)) with a unique and reliable on-sample temperature control. The use of a flexible temperature sensor, located directly on the sample surface, ensures rigorous control of the sample temperature throughout the entire analysis time and allows an instant response to any possible fluctuation. In this way sample integrity, and therefore reproducibility, can be guaranteed during the ablation. The refrigeration of the proposed cryogenic cell combines an internal refrigeration system, controlled by a sensitive thermocouple, with an external refrigeration system. Cooling of the sample is carried out directly by eight small (1 cm × 1 cm) Peltier elements placed in a circular arrangement in the base of the cell. These Peltier elements are located below a copper plate on which the sample is placed. Due to the small size of the cooling electronics and their circular arrangement, it was possible to maintain a peephole under the sample for illumination, allowing much better visualization of the sample, a factor especially important when working with structurally complex tissue sections. The analytical performance of the cryogenic cell was studied using a glass reference material (SRM NIST 612) at room temperature and at -20°C. The proposed cell design shows a reasonable signal washout (signal decay to background level within less than 10 s), high sensitivity and good signal stability (in the range 6.6-11.7%). Furthermore, high precision (0.4-2.6%) and accuracy (0.3-3.9%) in the isotope ratio measurements were also observed when operating the cell both at room temperature and at -20°C. Finally, experimental results obtained by applying the cell to qualitative elemental imaging of structurally complex tissue samples (e.g. eye sections from a native frozen porcine eye and fresh flower leaves) demonstrate that working under cryogenic conditions is critical in this type of direct sample analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.

    PubMed

    Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin

    2016-02-15

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software and databases for target-, suspect and non-target screening as well as unknown identification are discussed together with analytical and toxicological confirmation approaches. A better understanding of optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Are Model Transferability And Complexity Antithetical? Insights From Validation of a Variable-Complexity Empirical Snow Model in Space and Time

    NASA Astrophysics Data System (ADS)

    Lute, A. C.; Luce, Charles H.

    2017-11-01

    The related challenges of predictions in ungauged basins and predictions in ungauged climates point to the need to develop environmental models that are transferable across both space and time. Hydrologic modeling has historically focused on modeling one or only a few basins using highly parameterized conceptual or physically based models. However, model parameters and structures have been shown to change significantly when calibrated to new basins or time periods, suggesting that model complexity and model transferability may be antithetical. Empirical space-for-time models provide a framework within which to assess model transferability and any tradeoff with model complexity. Using 497 SNOTEL sites in the western U.S., we develop space-for-time models of April 1 SWE and Snow Residence Time based on mean winter temperature and cumulative winter precipitation. The transferability of the models to new conditions (in both space and time) is assessed using non-random cross-validation tests with consideration of the influence of model complexity on transferability. As others have noted, the algorithmic empirical models transfer best when minimal extrapolation in input variables is required. Temporal split-sample validations use pseudoreplicated samples, resulting in the selection of overly complex models, which has implications for the design of hydrologic model validation tests. Finally, we show that low- to moderate-complexity models transfer most successfully to new conditions in space and time, providing empirical confirmation of the parsimony principle.
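
    The complexity-versus-transferability tradeoff can be illustrated with a toy space-for-time example (synthetic data and a polynomial stand-in for model complexity, not the authors' SNOTEL models): models are trained on colder sites and validated on the warmest sites, so that transfer requires extrapolation in the inputs:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic space-for-time data: April 1 SWE as a function of winter climate
    n = 500
    temp = rng.uniform(-10, 5, n)                     # mean winter temperature (deg C)
    precip = rng.uniform(200, 1500, n)                # cumulative winter precipitation (mm)
    swe = np.maximum(0, precip * 0.6 / (1 + np.exp(temp)) + rng.normal(0, 40, n))

    # Non-random split: train on colder sites, validate on the warmest sites
    train, test = temp < 2, temp >= 2

    def fit_and_score(degree):
        # Polynomial feature expansion of (temp, precip) as a stand-in for model complexity
        feats = lambda t, p: np.column_stack([t**i * p**j
                                              for i in range(degree + 1)
                                              for j in range(degree + 1 - i)])
        X_tr, X_te = feats(temp[train], precip[train]), feats(temp[test], precip[test])
        coef, *_ = np.linalg.lstsq(X_tr, swe[train], rcond=None)
        return np.sqrt(np.mean((X_te @ coef - swe[test]) ** 2))   # transfer RMSE

    for d in (1, 2, 4, 6):
        print(d, round(fit_and_score(d), 1))          # transfer error often grows with complexity
    ```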

  19. Efficient two-dimensional compressive sensing in MIMO radar

    NASA Astrophysics Data System (ADS)

    Shahbazi, Nafiseh; Abbasfar, Aliazam; Jabbarian-Jahromi, Mohammad

    2017-12-01

    Compressive sensing (CS) has been used to lower the sampling rate, leading to data reduction for processing in multiple-input multiple-output (MIMO) radar systems. In this paper, we further reduce the computational complexity of a pulse-Doppler collocated MIMO radar by introducing two-dimensional (2D) compressive sensing. To do so, we first introduce a new 2D formulation for the compressed received signals and then propose a new measurement matrix design for our 2D compressive sensing model, based on minimizing the coherence of the sensing matrix using a gradient descent algorithm. The simulation results show that our proposed 2D measurement matrix design using the gradient descent algorithm (2D-MMDGD) has much lower computational complexity than one-dimensional (1D) methods while performing better than conventional methods such as a Gaussian random measurement matrix.
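
    A minimal sketch of the coherence-minimization idea, not the authors' 2D-MMDGD algorithm: gradient descent on a smooth surrogate of mutual coherence (the off-diagonal energy of the Gram matrix) for a single measurement matrix:

    ```python
    import numpy as np

    def low_coherence_matrix(m, n, steps=500, lr=0.05, seed=0):
        """Design an m x n measurement matrix with low mutual coherence.

        Gradient descent on a smooth surrogate of coherence: the squared
        Frobenius norm of the off-diagonal part of the Gram matrix of the
        column-normalized measurement matrix.
        """
        rng = np.random.default_rng(seed)
        phi = rng.standard_normal((m, n))
        for _ in range(steps):
            phi /= np.linalg.norm(phi, axis=0, keepdims=True)   # keep unit-norm columns
            off = phi.T @ phi - np.eye(n)                        # off-diagonal Gram entries
            phi = phi - lr * (phi @ off)                         # step along a multiple of the negative gradient
        phi /= np.linalg.norm(phi, axis=0, keepdims=True)
        coherence = np.max(np.abs(phi.T @ phi - np.eye(n)))
        return phi, coherence

    phi, mu = low_coherence_matrix(20, 50)
    print(round(mu, 3))   # typically noticeably lower than the coherence of a raw Gaussian matrix
    ```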

  20. Constructing complex graphics applications with CLIPS and the X window system

    NASA Technical Reports Server (NTRS)

    Faul, Ben M.

    1990-01-01

    This article demonstrates how the artificial intelligence concepts in CLIPS are used to solve problems encountered in the design and implementation of graphics applications within the UNIX/X Window System environment. The design of an extended version of CLIPS, called XCLIPS, is presented to show how X Window System graphics can be incorporated without losing DOS compatibility. Using XCLIPS, a sample scientific application is built that combines the problem-solving capabilities of standard CLIPS with both two- and three-dimensional graphics presentations.

  1. Origami-inspired building block and parametric design for mechanical metamaterials

    NASA Astrophysics Data System (ADS)

    Jiang, Wei; Ma, Hua; Feng, Mingde; Yan, Leilei; Wang, Jiafu; Wang, Jun; Qu, Shaobo

    2016-08-01

    An origami-based building block for mechanical metamaterials is proposed and explained by introducing a mechanism model based on its geometry. According to our model, this origami mechanism exhibits a response to uniaxial tension that depends on the structure parameters. Hence, its mechanical properties can be tuned by adjusting the structure parameters. Experiments on polylactic acid (PLA) samples were carried out, and the results are in good agreement with those of finite element analysis (FEA). This work may be useful for designing building blocks of mechanical metamaterials or other complex mechanical structures.

  2. Progressive Stochastic Reconstruction Technique (PSRT) for cryo electron tomography.

    PubMed

    Turoňová, Beata; Marsalek, Lukas; Davidovič, Tomáš; Slusallek, Philipp

    2015-03-01

    Cryo Electron Tomography (cryoET) plays an essential role in Structural Biology, as it is the only technique that allows to study the structure of large macromolecular complexes in their close to native environment in situ. The reconstruction methods currently in use, such as Weighted Back Projection (WBP) or Simultaneous Iterative Reconstruction Technique (SIRT), deliver noisy and low-contrast reconstructions, which complicates the application of high-resolution protocols, such as Subtomogram Averaging (SA). We propose a Progressive Stochastic Reconstruction Technique (PSRT) - a novel iterative approach to tomographic reconstruction in cryoET based on Monte Carlo random walks guided by Metropolis-Hastings sampling strategy. We design a progressive reconstruction scheme to suit the conditions present in cryoET and apply it successfully to reconstructions of macromolecular complexes from both synthetic and experimental datasets. We show how to integrate PSRT into SA, where it provides an elegant solution to the region-of-interest problem and delivers high-contrast reconstructions that significantly improve template-based localization without any loss of high-resolution structural information. Furthermore, the locality of SA is exploited to design an importance sampling scheme which significantly speeds up the otherwise slow Monte Carlo approach. Finally, we design a new memory efficient solution for the specimen-level interior problem of cryoET, removing all associated artifacts. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Quantitation of the phosphoproteome using the library-assisted extracted ion chromatogram (LAXIC) strategy.

    PubMed

    Arrington, Justine V; Xue, Liang; Tao, W Andy

    2014-01-01

    Phosphorylation is a key posttranslational modification that regulates many signaling pathways, but quantifying changes in phosphorylation between samples can be challenging due to its low stoichiometry within cells. We have introduced a mass spectrometry-based label-free quantitation strategy termed LAXIC for the analysis of the phosphoproteome. This method uses a spiked-in synthetic peptide library designed to elute across the entire chromatogram for local normalization of phosphopeptides within complex samples. Normalization of phosphopeptides by library peptides that co-elute within a small time frame accounts for fluctuating ion suppression effects, allowing more accurate quantitation even when LC-MS performance varies. Here we explain the premise of LAXIC, the design of a suitable peptide library, and how the LAXIC algorithm can be implemented with software developed in-house.
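
    A simplified sketch of the local-normalization idea (the retention-time window and the use of a median are assumptions for illustration, not the published LAXIC algorithm): each phosphopeptide intensity is scaled by the intensity of spiked-in library peptides eluting nearby, which compensates for local ion-suppression effects:

    ```python
    import numpy as np

    def locally_normalize(peptides, library, rt_window=1.0):
        """Locally normalize phosphopeptide intensities by co-eluting library peptides.

        peptides, library: lists of (retention_time_min, intensity) tuples.
        Each phosphopeptide intensity is divided by the median intensity of the
        spiked-in library peptides eluting within +/- rt_window minutes; if none
        co-elute, the global library median is used as a fallback.
        """
        lib_rt = np.array([rt for rt, _ in library])
        lib_int = np.array([i for _, i in library])
        normalized = []
        for rt, intensity in peptides:
            nearby = lib_int[np.abs(lib_rt - rt) <= rt_window]
            factor = np.median(nearby) if nearby.size else np.median(lib_int)
            normalized.append((rt, intensity / factor))
        return normalized
    ```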

  4. A sub-microwatt asynchronous level-crossing ADC for biomedical applications.

    PubMed

    Li, Yongjia; Zhao, Duan; Serdijn, Wouter A

    2013-04-01

    A continuous-time level-crossing analog-to-digital converter (LC-ADC) for biomedical applications is presented. Compared to uniform-sampling (US) ADCs, LC-ADCs generate fewer samples for various sparse biomedical signals. Lower power consumption and reduced design complexity with respect to conventional LC-ADCs are achieved by: 1) replacing the n-bit digital-to-analog converter (DAC) with a 1-bit DAC; 2) splitting the level-crossing detections; and 3) fixing the comparison window. Designed and implemented in 0.18 μm CMOS technology, the proposed ADC occupies a chip area of 220 × 203 μm(2). Operating from a supply voltage of 0.8 V, the ADC consumes 313-582 nW from 5 Hz to 5 kHz and achieves an ENOB of up to 7.9 bits.
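
    A minimal software simulation of the level-crossing principle (the delta value and test signal are arbitrary assumptions), illustrating why a sparse, spike-like signal yields far fewer samples than uniform sampling:

    ```python
    import numpy as np

    def level_crossing_sample(signal, delta):
        """Emit a sample each time the signal moves by at least +/- delta
        from the last recorded level (the core idea of a level-crossing ADC)."""
        samples = [(0, float(signal[0]))]
        last = float(signal[0])
        for n, x in enumerate(signal[1:], start=1):
            if abs(x - last) >= delta:
                # In hardware this is detected by comparators around 'last';
                # the crossing direction and the time since the previous sample
                # are what actually get encoded.
                last += delta * np.sign(x - last)
                samples.append((n, last))
        return samples

    t = np.linspace(0, 1, 5000)                        # uniform sampling would keep all 5000 points
    spike_like = np.exp(-((t - 0.5) / 0.01) ** 2)      # a sparse, spike-like test signal
    print(len(level_crossing_sample(spike_like, delta=0.05)))   # far fewer events than 5000
    ```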

  5. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    PubMed

    West, Brady T; Sakshaug, Joseph W; Aurelien, Guy Alain S

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data.

  6. How Big of a Problem is Analytic Error in Secondary Analyses of Survey Data?

    PubMed Central

    West, Brady T.; Sakshaug, Joseph W.; Aurelien, Guy Alain S.

    2016-01-01

    Secondary analyses of survey data collected from large probability samples of persons or establishments further scientific progress in many fields. The complex design features of these samples improve data collection efficiency, but also require analysts to account for these features when conducting analysis. Unfortunately, many secondary analysts from fields outside of statistics, biostatistics, and survey methodology do not have adequate training in this area, and as a result may apply incorrect statistical methods when analyzing these survey data sets. This in turn could lead to the publication of incorrect inferences based on the survey data that effectively negate the resources dedicated to these surveys. In this article, we build on the results of a preliminary meta-analysis of 100 peer-reviewed journal articles presenting analyses of data from a variety of national health surveys, which suggested that analytic errors may be extremely prevalent in these types of investigations. We first perform a meta-analysis of a stratified random sample of 145 additional research products analyzing survey data from the Scientists and Engineers Statistical Data System (SESTAT), which describes features of the U.S. Science and Engineering workforce, and examine trends in the prevalence of analytic error across the decades used to stratify the sample. We once again find that analytic errors appear to be quite prevalent in these studies. Next, we present several example analyses of real SESTAT data, and demonstrate that a failure to perform these analyses correctly can result in substantially biased estimates with standard errors that do not adequately reflect complex sample design features. Collectively, the results of this investigation suggest that reviewers of this type of research need to pay much closer attention to the analytic methods employed by researchers attempting to publish or present secondary analyses of survey data. PMID:27355817

  7. Comparison of Efficiency of Jackknife and Variance Component Estimators of Standard Errors. Program Statistics Research. Technical Report.

    ERIC Educational Resources Information Center

    Longford, Nicholas T.

    Large-scale surveys usually employ a complex sampling design and, as a consequence, no standard methods for estimation of the standard errors associated with the estimates of population means are available. Resampling methods, such as jackknife or bootstrap, are often used, with reference to their properties of robustness and reduction of bias. A…
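
    For readers unfamiliar with the resampling idea the report evaluates, a delete-one jackknife estimate of a standard error is sketched below with simulated data; real complex-survey applications use delete-a-group or replicate-weight variants, so this is an illustration of the principle only.

```python
import numpy as np

def jackknife_se(x, statistic=np.mean):
    """Delete-one jackknife standard error of `statistic` for an IID sample."""
    x = np.asarray(x, float)
    n = x.size
    reps = np.array([statistic(np.delete(x, i)) for i in range(n)])
    return np.sqrt((n - 1) / n * np.sum((reps - reps.mean()) ** 2))

x = np.random.default_rng(1).normal(10.0, 2.0, size=50)
print("jackknife SE of the mean:", jackknife_se(x))
print("analytic SE of the mean: ", x.std(ddof=1) / np.sqrt(x.size))
```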

  8. The Role of Search in University Productivity: Inside, outside, and Interdisciplinary Dimensions. NBER Working Paper No. 15489

    ERIC Educational Resources Information Center

    Adams, James D.; Clemmons, J. Roger

    2009-01-01

    Due to improving information technology, the growing complexity of research problems, and policies designed to foster interdisciplinary research, the practice of science in the United States has undergone significant structural change. Using a sample of 110 top U.S. universities observed during the late 20th century, we find that knowledge flows,…

  9. Parametric study on the thermal performance of beam screen samples of the High-Luminosity LHC upgrade

    NASA Astrophysics Data System (ADS)

    Borges de Sousa, P.; Morrone, M.; Hovenga, N.; Garion, C.; van Weelderen, R.; Koettig, T.; Bremer, J.

    2017-12-01

    The High-Luminosity upgrade of the Large Hadron Collider (HL-LHC) will increase the accelerator’s luminosity by a factor of 10 beyond its original design value, giving rise to more collisions and generating an intense flow of debris. A new beam screen has been designed for the inner triplets that incorporates tungsten alloy blocks to shield the superconducting magnets and the 1.9 K superfluid helium bath from incoming radiation. These screens will operate between 60 K and 80 K and are designed to sustain a nominal heat load of 15 Wm-1, over 10 times the nominal heat load for the original LHC design. Their overall new and more complex design requires them and their constituent parts to be characterised from a thermal performance standpoint. In this paper we describe the experimental parametric study carried out on two principal thermal components: a representative sample of the beam screen with a tungsten-based alloy block and thermal link, and the supporting structure composed of an assembly of ceramic spheres and titanium springs. Results from both studies are shown and discussed regarding their impact on the baseline considerations for the thermal design of the beam screens.

  10. Direct determination and speciation of mercury compounds in environmental and biological samples by carbon bed atomic absorption spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skelly, E.M.

    A method was developed for the direct determination of mercury in water and biological samples using a unique carbon bed atomizer for atomic absorption spectroscopy. The method avoided sources of error such as loss of volatile mercury during sample digestion and contamination of samples through added reagents by eliminating sample pretreatment steps. The design of the atomizer allowed use of the 184.9 nm mercury resonance line in the vacuum ultraviolet region, which increased sensitivity over the commonly used spin-forbidden 253.7 nm line. The carbon bed atomizer method was applied to a study of mercury concentrations in water, hair, sweat, urine, blood, breath and saliva samples from a non-occupationally exposed population. Data were collected on the average concentration, the range and distribution of mercury in the samples. Data were also collected illustrating individual variations in mercury concentrations with time. Concentrations of mercury found were significantly higher than values reported in the literature for a "normal" population. This is attributed to the increased accuracy gained by eliminating pretreatment steps and increasing atomization efficiency. Absorption traces were obtained for various solutions of pure and complexed mercury compounds. Absorption traces of biological fluids were also obtained. Differences were observed in the absorption-temperature traces of various compounds. The utility of this technique for studying complexation was demonstrated.

  11. Molecular Analyzer for Complex Refractory Organic-Rich Surfaces (MACROS)

    NASA Technical Reports Server (NTRS)

    Getty, Stephanie A.; Cook, Jamie E.; Balvin, Manuel; Brinckerhoff, William B.; Li, Xiang; Grubisic, Andrej; Cornish, Timothy; Ferrance, Jerome; Southard, Adrian

    2017-01-01

    The Molecular Analyzer for Complex Refractory Organic-rich Surfaces, MACROS, is a novel instrument package being developed at NASA Goddard Space Flight Center. MACROS enables the in situ characterization of a sample's composition by coupling two powerful techniques into one compact instrument package: (1) laser desorption/ionization time-of-flight mass spectrometry (LDMS) for broad detection of inorganic mineral composition and non-volatile organics, and (2) liquid-phase extraction methods to gently isolate the soluble organic and inorganic fraction of a planetary powder for enrichment and detailed analysis by liquid chromatographic separation coupled to LDMS. The LDMS is capable of positive and negative ion detection, precision mass selection, and fragment analysis. Two modes are included for LDMS: single-laser LDMS as the broad survey mode and two-step laser mass spectrometry (L2MS). The liquid-phase extraction will be done in a newly designed extraction module (EM) prototype, providing selectivity in the analysis of a complex sample. For sample collection, a diamond drill front end will be used to collect rock/icy powder. With all these components and capabilities together, MACROS offers a versatile analytical instrument for a mission targeting an icy moon, carbonaceous asteroid, or comet, to fully characterize the surface composition and advance our understanding of the chemical inventory present on that body.

  12. Low complexity lossless compression of underwater sound recordings.

    PubMed

    Johnson, Mark; Partan, Jim; Hurst, Tom

    2013-03-01

    Autonomous listening devices are increasingly used to study vocal aquatic animals, and there is a constant need to record longer or with greater bandwidth, requiring efficient use of memory and battery power. Real-time compression of sound has the potential to extend recording durations and bandwidths at the expense of increased processing operations and therefore power consumption. Whereas lossy methods such as MP3 introduce undesirable artifacts, lossless compression algorithms (e.g., FLAC) guarantee exact data recovery. But these algorithms are relatively complex due to the wide variety of signals they are designed to compress. A simpler lossless algorithm is shown here to provide compression factors of three or more for underwater sound recordings over a range of noise environments. The compressor was evaluated using samples from drifting and animal-borne sound recorders with sampling rates of 16-240 kHz. It achieves >87% of the compression of more-complex methods but requires about 1/10 of the processing operations, resulting in less than 1 mW power consumption at a sampling rate of 192 kHz on a low-power microprocessor. The potential to triple recording duration with a minor increase in power consumption and no loss in sound quality may be especially valuable for battery-limited tags and robotic vehicles.
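
    The abstract does not give the algorithm's details, but the general predict-then-code recipe behind simple lossless audio compressors can be sketched as follows: delta-encode the samples, zigzag-map the residuals to non-negative integers, and count the bits a Rice code would need. The test signal, amplitudes, and Rice-parameter rule below are all assumptions for illustration, not the published method.

```python
import numpy as np

def zigzag(d):
    """Map signed residuals to non-negative integers: 0, -1, 1, -2, ... -> 0, 1, 2, 3, ..."""
    return ((d << 1) ^ (d >> 31)).astype(np.uint32)

def rice_bits(u, k):
    """Total bits to Rice-code non-negative integers with parameter k
    (unary quotient + stop bit + k-bit remainder per value)."""
    return int(np.sum((u >> k) + 1 + k))

rng = np.random.default_rng(2)
x = (50 * rng.standard_normal(192_000)).astype(np.int32)   # toy 16-bit "quiet" recording

residual = np.diff(x, prepend=x[:1])                        # first-order predictor (delta coding)
u = zigzag(residual)
k = max(0, int(np.floor(np.log2(u.mean() + 1))))            # crude Rice parameter choice
print("compression factor:", 16 * x.size / rice_bits(u, k))
```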

  13. Sample-Based Surface Coloring

    PubMed Central

    Bürger, Kai; Krüger, Jens; Westermann, Rüdiger

    2011-01-01

    In this paper, we present a sample-based approach for surface coloring, which is independent of the original surface resolution and representation. To achieve this, we introduce the Orthogonal Fragment Buffer (OFB)—an extension of the Layered Depth Cube—as a high-resolution view-independent surface representation. The OFB is a data structure that stores surface samples at a nearly uniform distribution over the surface, and it is specifically designed to support efficient random read/write access to these samples. The data access operations have a complexity that is logarithmic in the depth complexity of the surface. Thus, compared to data access operations in tree data structures like octrees, data-dependent memory access patterns are greatly reduced. Due to the particular sampling strategy that is employed to generate an OFB, it also maintains sample coherence, and thus, exhibits very good spatial access locality. Therefore, OFB-based surface coloring performs significantly faster than sample-based approaches using tree structures. In addition, since in an OFB, the surface samples are internally stored in uniform 2D grids, OFB-based surface coloring can efficiently be realized on the GPU to enable interactive coloring of high-resolution surfaces. On the OFB, we introduce novel algorithms for color painting using volumetric and surface-aligned brushes, and we present new approaches for particle-based color advection along surfaces in real time. Due to the intermediate surface representation we choose, our method can be used to color polygonal surfaces as well as any other type of surface that can be sampled. PMID:20616392

  14. Performance modeling & simulation of complex systems (A systems engineering design & analysis approach)

    NASA Technical Reports Server (NTRS)

    Hall, Laverne

    1995-01-01

    Modeling of the Multi-mission Image Processing System (MIPS) will be described as an example of the use of a modeling tool to design a distributed system that supports multiple application scenarios. This paper examines: (a) modeling tool selection, capabilities, and operation (namely NETWORK 2.5 by CACI), (b) pointers for building or constructing a model and how the MIPS model was developed, (c) the importance of benchmarking or testing the performance of equipment/subsystems being considered for incorporation into the design/architecture, (d) the essential step of model validation and/or calibration using the benchmark results, (e) sample simulation results from the MIPS model, and (f) how modeling and simulation analysis affected the MIPS design process by having a supportive and informative impact.

  15. Tractable Experiment Design via Mathematical Surrogates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    This presentation summarizes the development and implementation of quantitative design criteria motivated by targeted inference objectives for identifying new, potentially expensive computational or physical experiments. The first application is concerned with estimating features of quantities of interest arising from complex computational models, such as quantiles or failure probabilities. A sequential strategy is proposed for iterative refinement of the importance distributions used to efficiently sample the uncertain inputs to the computational model. In the second application, effective use of mathematical surrogates is investigated to help alleviate the analytical and numerical intractability often associated with Bayesian experiment design. This approach allows for the incorporation of prior information into the design process without the need for gross simplification of the design criterion. Illustrative examples of both design problems will be presented as an argument for the relevance of these research problems.
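
    As a toy illustration of the importance-sampling refinement idea (a simplified, one-dimensional stand-in for the computational models discussed), the sketch below estimates a small failure probability by sampling from a proposal shifted toward the failure region and reweighting by the likelihood ratio; the model, threshold, and distributions are assumptions.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def model(x):
    return x ** 2                          # stand-in for an expensive computational model

def norm_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

threshold = 16.0                           # "failure" when model(x) > 16, i.e. |x| > 4
n = 5_000

x = rng.normal(4.0, 1.0, n)                # proposal shifted toward the right failure region
w = norm_pdf(x, 0.0, 1.0) / norm_pdf(x, 4.0, 1.0)          # likelihood-ratio weights
p_right = np.mean((model(x) > threshold) * w)
print("importance-sampling estimate:", 2 * p_right)         # x2 for the symmetric left tail
print("exact P(|X| > 4):            ", math.erfc(4 / math.sqrt(2)))
```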

  16. Improved assemblies using a source-agnostic pipeline for MetaGenomic Assembly by Merging (MeGAMerge) of contigs

    DOE PAGES

    Scholz, Matthew; Lo, Chien-Chi; Chain, Patrick S. G.

    2014-10-01

    Assembly of metagenomic samples is a very complex process, with algorithms designed to address sequencing platform-specific issues (read length, data volume, and/or community complexity), while also faced with genomes that differ greatly in nucleotide compositional biases and in abundance. To address these issues, we have developed a post-assembly process: MetaGenomic Assembly by Merging (MeGAMerge). We compare this process to the performance of several assemblers, using both real and in silico generated samples of different community composition and complexity. MeGAMerge consistently outperforms individual assembly methods, producing larger contigs with an increased number of predicted genes, without replication of data. MeGAMerge contigs are supported by read mapping and contig alignment data, when using synthetically derived and real metagenomic data, as well as by gene prediction analyses and similarity searches. Ultimately, MeGAMerge is a flexible method that generates improved metagenome assemblies, with the ability to accommodate upcoming sequencing platforms, as well as present and future assembly algorithms.

  17. Complex mixture analysis by photoionization mass spectrometry with a VUV hydrogen laser source

    NASA Astrophysics Data System (ADS)

    Huth, T. C.; Denton, M. B.

    1985-12-01

    Trace organic analysis in complex matrices presents one of the most challenging problems in analytical mass spectrometry. When ionization is accomplished non-selectively using electron impact, extensive sample clean-up is often necessary in order to isolate the analyte from the matrix. Sample preparation can be greatly reduced when the VUV H2 laser is used to selectively photoionize only a small fraction of compounds introduced into the ion source. This device produces only parent ions, for all compounds whose ionization potentials lie below a threshold value determined by the photon energy of 7.8 eV. The only observed interference arises from electron impact ionization, when scattered laser radiation interacts with metal surfaces, producing electrons which are then accelerated by potential fields inside the source. These can be suppressed to levels acceptable for practical analysis through proper instrumental design. Results are presented which indicate the ability of this ion source to discriminate against interfering matrix components in simple extracts from a variety of complex real-world matrices, such as brewed coffee, beer, and urine.

  18. Epoch-based Entropy for Early Screening of Alzheimer's Disease.

    PubMed

    Houmani, N; Dreyfus, G; Vialatte, F B

    2015-12-01

    In this paper, we introduce a novel entropy measure, termed epoch-based entropy. This measure quantifies the disorder of EEG signals at both the temporal and spatial levels, using local density estimation by a Hidden Markov Model on inter-channel stationary epochs. The investigation is conducted on a multi-centric EEG database recorded from patients at an early stage of Alzheimer's disease (AD) and age-matched healthy subjects. We investigate the classification performance of this method, its robustness to noise, and its sensitivity to sampling frequency and to variations of hyperparameters. The measure is compared to two alternative complexity measures, Shannon's entropy and correlation dimension. The classification accuracies for the discrimination of AD patients from healthy subjects were estimated using a linear classifier designed on a development dataset, and subsequently tested on an independent test set. Epoch-based entropy reached a classification accuracy of 83% on the test dataset (specificity = 83.3%, sensitivity = 82.3%), outperforming the two other complexity measures. Furthermore, it was shown to be more stable to hyperparameter variations, and less sensitive to noise and sampling frequency disturbances than the other two complexity measures.

  19. The Danish National Health Survey 2010. Study design and respondent characteristics.

    PubMed

    Christensen, Anne Illemann; Ekholm, Ola; Glümer, Charlotte; Andreasen, Anne Helms; Hvidberg, Michael Falk; Kristensen, Peter Lund; Larsen, Finn Breinholt; Ortiz, Britta; Juel, Knud

    2012-06-01

    In 2010 the five Danish regions and the National Institute of Public Health at the University of Southern Denmark conducted a national representative health survey among the adult population in Denmark. This paper describes the study design and the sample and study population as well as the content of the questionnaire. The survey was based on five regional stratified random samples and one national random sample. The samples were mutually exclusive. A total of 298,550 individuals (16 years or older) were invited to participate. Information was collected using a mixed mode approach (paper and web questionnaires). A questionnaire with a minimum of 52 core questions was used in all six subsamples. Calibrated weights were computed in order to take account of the complex survey design and reduce non-response bias. In all, 177,639 individuals completed the questionnaire (59.5%). The response rate varied from 52.3% in the Capital Region of Denmark sample to 65.5% in the North Denmark Region sample. The response rate was particularly low among young men, unmarried people and among individuals with a different ethnic background than Danish. The survey was a result of extensive national cooperation across sectors, which makes it unique in its field of application, e.g. health surveillance, planning and prioritizing public health initiatives and research. However, the low response rate in some subgroups of the study population can pose problems in generalizing data, and efforts to increase the response rate will be important in the forthcoming surveys.

  20. Factorial design optimization of experimental variables in the on-line separation/preconcentration of copper in water samples using solid phase extraction and ICP-OES determination.

    PubMed

    Escudero, Luis A; Cerutti, S; Olsina, R A; Salonia, J A; Gasquez, J A

    2010-11-15

    An on-line preconcentration procedure using solid phase extraction (SPE) for the determination of copper in different water samples by inductively coupled plasma optical emission spectrometry (ICP-OES) is proposed. The copper was retained on a minicolumn filled with ethyl vinyl acetate (EVA) at pH 8.0 without using any complexing reagent. The experimental optimization step was performed using a two-level full factorial design. The results showed that pH, sample loading flow rate, and their interaction (at the tested levels) were statistically significant. In order to determine the best conditions for preconcentration and determination of copper, a final optimization of the significant factors was carried out using a central composite design (CCD). The calibration graph was linear with a regression coefficient of 0.995 at levels near the detection limit up to at least 300 μg L-1. An enrichment factor (EF) of 54 with a preconcentration time of 187.5 s was obtained. The limit of detection (3σ) was 0.26 μg L-1. The sampling frequency for the developed methodology was about 15 samples/h. The relative standard deviation (RSD) for six replicates containing 50 μg L-1 of copper was 3.76%. The methodology was successfully applied to the determination of Cu in tap, mineral, river water samples, and in a certified VKI standard reference material. Copyright © 2010 Elsevier B.V. All rights reserved.
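
    A minimal sketch of a two-level full factorial screening step like the one described: generate the coded design matrix for three illustrative factors and estimate main effects as the difference in mean response between the high and low settings. The factor names and response values are hypothetical, not the paper's data.

```python
import itertools
import numpy as np

factors = ["pH", "flow_rate", "eluent_conc"]                 # illustrative factor names
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # 8 coded runs

# Hypothetical recovery responses, one per design row.
y = np.array([62, 65, 70, 88, 61, 66, 72, 90], dtype=float)

for j, name in enumerate(factors):
    effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
    print(f"{name:12s} main effect: {effect:+.1f}")
```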

  1. Asymmetric flow field flow fractionation with light scattering detection - an orthogonal sensitivity analysis.

    PubMed

    Galyean, Anne A; Filliben, James J; Holbrook, R David; Vreeland, Wyatt N; Weinberg, Howard S

    2016-11-18

    Asymmetric flow field flow fractionation (AF4) has several instrumental factors that may have a direct effect on separation performance. A sensitivity analysis was applied to ascertain the relative importance of AF4 primary instrument factor settings for the separation of a complex environmental sample. The analysis evaluated the impact of instrumental factors, namely cross flow, ramp time, focus flow, injection volume, and run buffer concentration, on the multi-angle light scattering measurement of natural organic matter (NOM) molar mass (MM). A 2^(5-1) orthogonal fractional factorial design was used to minimize analysis time while preserving the accuracy and robustness in the determination of the main effects and interactions between any two instrumental factors. By assuming that separations resulting in smaller MM measurements would be more accurate, the analysis produced a ranked list of effects estimates for factors and interactions of factors based on their relative importance in minimizing the MM. The most important and statistically significant AF4 instrumental factors were buffer concentration and cross flow. The least important was ramp time. A parallel 2^(5-2) orthogonal fractional factorial design was also employed on five environmental factors for synthetic natural water samples containing silver nanoparticles (NPs), namely: NP concentration, NP size, NOM concentration, specific conductance, and pH. None of the water quality characteristic effects or interactions were found to be significant in minimizing the measured MM; however, the interaction between NP concentration and NP size was an important effect when considering NOM recovery. This work presents a structured approach for the rigorous assessment of AF4 instrument factors and optimal settings for the separation of complex samples utilizing an efficient orthogonal fractional factorial design and appropriate graphical analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
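
    The 2^(5-1) half-fraction can be generated by taking a full factorial in four factors and setting the fifth column from a defining relation such as E = ABCD (resolution V), as sketched below; the factor names follow the abstract, but the defining relation and response values are illustrative assumptions, not the study's design or data.

```python
import itertools
import numpy as np

# Full factorial in four factors; fifth column from the defining relation E = ABCD.
base = np.array(list(itertools.product([-1, 1], repeat=4)))          # 16 runs x 4
design = np.hstack([base, base.prod(axis=1, keepdims=True)])          # 16 runs x 5
factors = ["cross_flow", "ramp_time", "focus_flow", "inj_volume", "buffer_conc"]

# Simulated molar-mass responses with large buffer and cross-flow effects (toy values).
rng = np.random.default_rng(5)
y = 300 + 40 * design[:, 4] + 25 * design[:, 0] + rng.normal(0, 5, 16)

for j, name in enumerate(factors):
    effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
    print(f"{name:12s} effect: {effect:+.1f}")
```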

  2. Fusarium diversity in soil using a specific molecular approach and a cultural approach.

    PubMed

    Edel-Hermann, Véronique; Gautheron, Nadine; Mounier, Arnaud; Steinberg, Christian

    2015-04-01

    Fusarium species are ubiquitous in soil. They cause plant and human diseases and can produce mycotoxins. Surveys of Fusarium species diversity in environmental samples usually rely on laborious culture-based methods. In the present study, we have developed a molecular method to analyze Fusarium diversity directly from soil DNA. We designed primers targeting the translation elongation factor 1-alpha (EF-1α) gene and demonstrated their specificity toward Fusarium using a large collection of fungi. We used the specific primers to construct a clone library from three contrasting soils. Sequence analysis confirmed the specificity of the assay, with 750 clones identified as Fusarium and distributed among eight species or species complexes. The Fusarium oxysporum species complex (FOSC) was the most abundant one in the three soils, followed by the Fusarium solani species complex (FSSC). We then compared our molecular approach results with those obtained by isolating Fusarium colonies on two culture media and identifying species by sequencing part of the EF-1α gene. The 750 isolates were distributed into eight species or species complexes, with the same dominant species as with the cloning method. Sequence diversity was much higher in the clone library than in the isolate collection. The molecular approach proved to be a valuable tool to assess Fusarium diversity in environmental samples. Combined with high throughput sequencing, it will allow for in-depth analysis of large numbers of samples. Published by Elsevier B.V.

  3. Eigenvalue sensitivity of sampled time systems operating in closed loop

    NASA Astrophysics Data System (ADS)

    Bernal, Dionisio

    2018-05-01

    The use of feedback to create closed-loop eigenstructures with high sensitivity has received some attention in the Structural Health Monitoring field. Although practical implementation is necessarily digital, and thus in sampled time, work thus far has centered on the continuous time framework, both in design and in checking performance. It is shown in this paper that the performance in discrete time, at typical sampling rates, can differ notably from that anticipated in the continuous time formulation and that discrepancies can be particularly large on the real part of the eigenvalue sensitivities; a consequence being significant error in the (linear) estimate of the level of damage at which closed-loop stability is lost. As one anticipates, explicit consideration of the sampling rate poses no special difficulties in the closed-loop eigenstructure design and the relevant expressions are developed in the paper, including a formula for the efficient evaluation of the derivative of the matrix exponential based on the theory of complex perturbations. The paper presents an easily reproduced numerical example showing the level of error that can result when the discrete time implementation of the controller is not considered.
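
    Differentiating the matrix exponential via complex perturbations can be illustrated with the complex-step idea: perturb the parameter by a tiny imaginary amount and take the imaginary part of the result, which avoids subtractive cancellation. This is a generic sketch (the toy matrices and step sizes are assumptions), not the paper's specific formula.

```python
import numpy as np
from scipy.linalg import expm

def dexpm_dp(A0, B, p, h=1e-20):
    """Complex-step estimate of d/dp expm(A0 + p*B): the tiny imaginary
    perturbation propagates linearly, so no subtractive cancellation occurs."""
    return expm(A0 + (p + 1j * h) * B).imag / h

A0 = np.array([[0.0, 1.0], [-4.0, -0.2]])   # toy single-DOF state matrix (stiffness 4, light damping)
B = np.array([[0.0, 0.0], [-1.0, 0.0]])     # direction of a stiffness perturbation
print(dexpm_dp(A0, B, p=0.0))

# Central finite-difference check (limited by cancellation, unlike the complex step):
h = 1e-6
print((expm(A0 + h * B) - expm(A0 - h * B)) / (2 * h))
```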

  4. Generalizing the Network Scale-Up Method: A New Estimator for the Size of Hidden Populations*

    PubMed Central

    Feehan, Dennis M.; Salganik, Matthew J.

    2018-01-01

    The network scale-up method enables researchers to estimate the size of hidden populations, such as drug injectors and sex workers, using sampled social network data. The basic scale-up estimator offers advantages over other size estimation techniques, but it depends on problematic modeling assumptions. We propose a new generalized scale-up estimator that can be used in settings with non-random social mixing and imperfect awareness about membership in the hidden population. Further, the new estimator can be used when data are collected via complex sample designs and from incomplete sampling frames. However, the generalized scale-up estimator also requires data from two samples: one from the frame population and one from the hidden population. In some situations these data from the hidden population can be collected by adding a small number of questions to already planned studies. For other situations, we develop interpretable adjustment factors that can be applied to the basic scale-up estimator. We conclude with practical recommendations for the design and analysis of future studies. PMID:29375167
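
    For context, the basic scale-up estimator that the generalized estimator extends has the familiar form N_hidden ≈ (Σ reported ties to the hidden population / Σ personal network sizes) × N_population; the sketch below applies it to simulated respondents (the degree distribution, prevalence, and sample size are assumptions for illustration).

```python
import numpy as np

rng = np.random.default_rng(6)
N_population = 1_000_000
degrees = rng.poisson(300, size=2_000)                 # respondents' personal network sizes
true_prevalence = 0.005
ties = rng.binomial(degrees, true_prevalence)          # reported ties to the hidden population

N_hidden_hat = ties.sum() / degrees.sum() * N_population
print("basic scale-up estimate:", round(N_hidden_hat),
      "   true size:", int(true_prevalence * N_population))
```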

  5. Extending RosettaDock with water, sugar, and pH for prediction of complex structures and affinities for CAPRI rounds 20-27.

    PubMed

    Kilambi, Krishna Praneeth; Pacella, Michael S; Xu, Jianqing; Labonte, Jason W; Porter, Justin R; Muthu, Pravin; Drew, Kevin; Kuroda, Daisuke; Schueler-Furman, Ora; Bonneau, Richard; Gray, Jeffrey J

    2013-12-01

    Rounds 20-27 of the Critical Assessment of PRotein Interactions (CAPRI) provided a testing platform for computational methods designed to address a wide range of challenges. The diverse targets drove the creation of new computational tools and new combinations of existing ones. In this study, RosettaDock and other novel Rosetta protocols were used to successfully predict four of the 10 blind targets. For example, for the DNase domain of Colicin E2-Im2 immunity protein complex, RosettaDock and RosettaLigand were used to predict the positions of water molecules at the interface, recovering 46% of the native water-mediated contacts. For the α-repeat Rep4-Rep2 and g-type lysozyme-PliG inhibitor complexes, homology models were built and standard and pH-sensitive docking algorithms were used to generate structures with interface RMSD values of 3.3 Å and 2.0 Å, respectively. A novel flexible sugar-protein docking protocol was also developed and used for structure prediction of the BT4661-heparin-like saccharide complex, recovering 71% of the native contacts. Challenges remain in the generation of accurate homology models for protein mutants and in sampling during global docking. For proteins designed to bind influenza hemagglutinin, only about half of the mutations that affect binding were identified (T55: 54%; T56: 48%). The prediction of the structure of the xylanase complex involving homology modeling and multidomain docking pushed the limits of global conformational sampling and did not result in any successful prediction. The diversity of problems at hand requires computational algorithms to be versatile; the recent additions to the Rosetta suite expand the capabilities to encompass more biologically realistic docking problems. Copyright © 2013 Wiley Periodicals, Inc.

  6. Sampling the oxidative weathering products and the potentially acidic permafrost on Mars

    NASA Technical Reports Server (NTRS)

    Burns, Roger G.

    1988-01-01

    Large areas of Mars' surface are covered by oxidative weathering products containing ferric and sulfate ions, with analogies to terrestrial gossans derived from sulfide mineralization associated with iron-rich basalts. Chemical weathering of such massive and disseminated pyrrhotite-pentlandite assemblages and host basaltic rocks in the Martian environment could have produced metastable gossaniferous phases (limonite containing poorly crystalline hydrated ferric sulfates and oxyhydroxides, clay silicates and opal). Underlying groundwater, now permafrost on Mars, may still be acidic due to incomplete buffering reactions by wall-rock alteration of unfractured host rock. Such acidic solutions stabilize temperature-sensitive complex ions and sols which flocculate to colloidal precipitates at elevated temperatures. Sampling procedures for Martian regolith will need to be designed bearing in mind that the frozen permafrost may be corrosive and may stabilize unique complex ions and sols of Fe, Al, Mg, Ni and other minor elements.

  7. Argumentation: A Methodology to Facilitate Critical Thinking.

    PubMed

    Makhene, Agnes

    2017-06-20

    Caring is a difficult nursing activity that involves the complex nature of a human being in need of complex decision-making and problem solving through the critical thinking process. It is essential that critical thinking be facilitated in education generally, and in nursing education particularly, in order to render care in diverse multicultural patient care settings. This paper aims to describe how argumentation can be used to facilitate critical thinking in learners. A qualitative, exploratory, descriptive and contextual design was used. A purposive sampling method was used to draw the sample, and Miles and Huberman's methodology of qualitative analysis was used to analyse the data. Lincoln and Guba's strategies were employed to ensure trustworthiness, while Dhai and McQuoid-Mason's principles of ethical consideration were used. Following data analysis, the findings were integrated with the literature, culminating in the formulation of guidelines that can be followed when using argumentation as a methodology to facilitate critical thinking.

  8. Increasing complexity of clinical research in gastroenterology: implications for the training of clinician-scientists.

    PubMed

    Scott, Frank I; McConnell, Ryan A; Lewis, Matthew E; Lewis, James D

    2012-04-01

    Significant advances have been made in clinical and epidemiologic research methods over the past 30 years. We sought to demonstrate the impact of these advances on published gastroenterology research from 1980 to 2010. Twenty original clinical articles were randomly selected from each of three journals from 1980, 1990, 2000, and 2010. Each article was assessed for topic, whether the outcome was clinical or physiologic, study design, sample size, number of authors and centers collaborating, reporting of various statistical methods, and external funding. From 1980 to 2010, there was a significant increase in analytic studies, clinical outcomes, number of authors per article, multicenter collaboration, sample size, and external funding. There was increased reporting of P values, confidence intervals, and power calculations, and increased use of large multicenter databases, multivariate analyses, and bioinformatics. The complexity of clinical gastroenterology and hepatology research has increased dramatically, highlighting the need for advanced training of clinical investigators.

  9. XAP, a program for deconvolution and analysis of complex X-ray spectra

    USGS Publications Warehouse

    Quick, James E.; Haleby, Abdul Malik

    1989-01-01

    The X-ray analysis program (XAP) is a spectral-deconvolution program written in BASIC and specifically designed to analyze complex spectra produced by energy-dispersive X-ray analytical systems (EDS). XAP compensates for spectrometer drift, utilizes digital filtering to remove background from spectra, and solves for element abundances by least-squares, multiple-regression analysis. Rather than base analyses on only a few channels, broad spectral regions of a sample are reconstructed from standard reference spectra. The effects of this approach are (1) elimination of tedious spectrometer adjustments, (2) removal of background independent of sample composition, and (3) automatic correction for peak overlaps. Although the program was written specifically to operate a KEVEX 7000 X-ray fluorescence analytical system, it could be adapted (with minor modifications) to analyze spectra produced by scanning electron microscopes and electron microprobes, and X-ray diffractometer patterns obtained from whole-rock powders.
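
    The least-squares, multiple-regression step can be illustrated with a toy version: treat each standard reference spectrum as a column of a design matrix and regress the measured sample spectrum on those columns. The Gaussian "reference" peaks and abundances below are invented for illustration; XAP itself also handles drift compensation and background removal, which are omitted here.

```python
import numpy as np

rng = np.random.default_rng(7)
channels = np.arange(512)

def peak(center, width=6.0):
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# Toy reference spectra for three "elements" (columns of the regression design matrix).
references = np.column_stack([peak(120), peak(260) + 0.4 * peak(300), peak(410)])

true_abundance = np.array([2.0, 0.5, 1.2])
sample = references @ true_abundance + rng.normal(0, 0.02, channels.size)   # measured spectrum

abundance, *_ = np.linalg.lstsq(references, sample, rcond=None)
print("recovered abundances:", np.round(abundance, 2))
```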

  10. Incorporating Uncertainty into Spacecraft Mission and Trajectory Design

    NASA Astrophysics Data System (ADS)

    Feldhacker, Juliana D.

    The complex nature of many astrodynamic systems often leads to high computational costs or degraded accuracy in the analysis and design of spacecraft missions, and the incorporation of uncertainty into the trajectory optimization process often becomes intractable. This research applies mathematical modeling techniques to reduce computational cost and improve tractability for design, optimization, uncertainty quantification (UQ) and sensitivity analysis (SA) in astrodynamic systems and develops a method for trajectory optimization under uncertainty (OUU). This thesis demonstrates the use of surrogate regression models and polynomial chaos expansions for the purpose of design and UQ in the complex three-body system. Results are presented for the application of the models to the design of mid-field rendezvous maneuvers for spacecraft in three-body orbits. The models are shown to provide high accuracy with no a priori knowledge of the sample size required for convergence. Additionally, a method is developed for the direct incorporation of system uncertainties into the design process for the purpose of OUU and robust design; these methods are also applied to the rendezvous problem. It is shown that the models can be used for constrained optimization with orders of magnitude fewer samples than is required for a Monte Carlo approach to the same problem. Finally, this research considers an application for which regression models are not well-suited, namely UQ for the kinetic deflection of potentially hazardous asteroids under the assumptions of real asteroid shape models and uncertainties in the impact trajectory and the surface material properties of the asteroid, which produce a non-smooth system response. An alternate set of models is presented that enables analytic computation of the uncertainties in the imparted momentum from impact. Use of these models for a survey of asteroids allows conclusions to be drawn on the effects of an asteroid's shape on the ability to successfully divert the asteroid via kinetic impactor.

  11. The Australian longitudinal study on male health sampling design and survey weighting: implications for analysis and interpretation of clustered data.

    PubMed

    Spittal, Matthew J; Carlin, John B; Currier, Dianne; Downes, Marnie; English, Dallas R; Gordon, Ian; Pirkis, Jane; Gurrin, Lyle

    2016-10-31

    The Australian Longitudinal Study on Male Health (Ten to Men) used a complex sampling scheme to identify potential participants for the baseline survey. This raises important questions about when and how to adjust for the sampling design when analyzing data from the baseline survey. We describe the sampling scheme used in Ten to Men focusing on four important elements: stratification, multi-stage sampling, clustering and sample weights. We discuss how these elements fit together when using baseline data to estimate a population parameter (e.g., population mean or prevalence) or to estimate the association between an exposure and an outcome (e.g., an odds ratio). We illustrate this with examples using a continuous outcome (weight in kilograms) and a binary outcome (smoking status). Estimates of a population mean or disease prevalence using Ten to Men baseline data are influenced by the extent to which the sampling design is addressed in an analysis. Estimates of mean weight and smoking prevalence are larger in unweighted analyses than weighted analyses (e.g., mean = 83.9 kg vs. 81.4 kg; prevalence = 18.0 % vs. 16.7 %, for unweighted and weighted analyses respectively) and the standard error of the mean is 1.03 times larger in an analysis that acknowledges the hierarchical (clustered) structure of the data compared with one that does not. For smoking prevalence, the corresponding standard error is 1.07 times larger. Measures of association (mean group differences, odds ratios) are generally similar in unweighted or weighted analyses and whether or not adjustment is made for clustering. The extent to which the Ten to Men sampling design is accounted for in any analysis of the baseline data will depend on the research question. When the goals of the analysis are to estimate the prevalence of a disease or risk factor in the population or the magnitude of a population-level exposure-outcome association, our advice is to adopt an analysis that respects the sampling design.
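
    The standard-error inflation from clustering reported above can be illustrated with simulated data: compare the naive standard error of a mean, which ignores the hierarchy, with one computed from cluster means. The cluster sizes, variance components, and resulting ratio below are assumptions for illustration, not Ten to Men values.

```python
import numpy as np

rng = np.random.default_rng(8)
n_clusters, per_cluster = 100, 20
cluster_effect = np.repeat(rng.normal(0.0, 2.0, n_clusters), per_cluster)   # shared within-cluster component
weight_kg = 81.0 + cluster_effect + rng.normal(0.0, 12.0, n_clusters * per_cluster)
cluster_id = np.repeat(np.arange(n_clusters), per_cluster)

naive_se = weight_kg.std(ddof=1) / np.sqrt(weight_kg.size)         # ignores the hierarchy
cluster_means = np.array([weight_kg[cluster_id == c].mean() for c in range(n_clusters)])
cluster_se = cluster_means.std(ddof=1) / np.sqrt(n_clusters)        # clusters as sampling units

print(f"naive SE: {naive_se:.3f}  cluster-based SE: {cluster_se:.3f}  ratio: {cluster_se / naive_se:.2f}")
```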

  12. Sample Acquisition Drilling System for the Resource Prospector Mission

    NASA Astrophysics Data System (ADS)

    Zacny, K.; Paulsen, G.; Quinn, J.; Smith, J.; Kleinhenz, J.

    2015-12-01

    The goal of the Lunar Resource Prospector Mission (RPM) is to capture and identify volatile species within the top meter of the lunar regolith. The RPM drill has been designed to 1. Generate cuttings and place them on the surface for analysis by the Near InfraRed Volatiles Spectrometer Subsystem (NIRVSS), and 2. Capture cuttings and transfer them to the Oxygen and Volatile Extraction Node (OVEN) coupled with the Lunar Advanced Volatiles Analysis (LAVA) subsystem. The RPM drill is based on the Mars Icebreaker drill developed for capturing samples of ice and ice-cemented ground on Mars. The drill weighs approximately 10 kg and is rated at ~300 W. It is a rotary-percussive, fully autonomous system designed to capture cuttings for analysis. The drill consists of: 1. Rotary-Percussive Drill Head, 2. Sampling Auger, 3. Brushing station, 4. Z-stage, 5. Deployment stage. To reduce sample handling complexity, the drill auger is designed to capture cuttings as opposed to cores. High sampling efficiency is possible through a dual design of the auger. The lower section has deep and low-pitch flutes for retaining cuttings. The upper section has been designed to efficiently move the cuttings out of the hole. The drill uses a "bite" sampling approach where samples are captured in ~10 cm intervals. The first-generation drill was tested in a Mars chamber as well as in Antarctica and the Arctic. It demonstrated drilling at the 1-1-100-100 level (1 meter in 1 hour with 100 W and 100 N weight on bit) in ice, ice-cemented ground, soil, and rocks. The second-generation drill was deployed on a Carnegie Mellon University rover, called Zoe, and tested in the Atacama in 2012. The tests demonstrated fully autonomous sample acquisition and delivery to a carousel. The third-generation drill was tested in NASA GRC's vacuum chamber, VF13, at 10^-5 torr and approximately 200 K. It demonstrated successful capture and transfer of icy samples to a crucible. The drill has been modified and integrated onto the NASA JSC RPM rover. It has been undergoing testing in the lab and in the field during the summer of 2015.

  13. Large strain cruciform biaxial testing for FLC detection

    NASA Astrophysics Data System (ADS)

    Güler, Baran; Efe, Mert

    2017-10-01

    Selection of the proper test method, specimen design and analysis method are key issues for studying the formability of sheet metals and detecting their forming limit curves (FLC). Materials with complex microstructures may need an additional micro-mechanical investigation and accurate modelling. The cruciform biaxial test stands as an alternative to standard tests as it achieves frictionless, in-plane, multi-axial stress states with a single sample geometry. In this study, we introduce a small-scale (less than 10 cm) cruciform sample allowing micro-mechanical investigation at stress states ranging from plane strain to equibiaxial. With successful specimen design and surface finish, large forming limit strains are obtained at the test region of the sample. The large forming limit strains obtained by experiments are compared to the values obtained from the Marciniak-Kuczynski (M-K) local necking model and the Cockcroft-Latham damage model. This comparison shows that the experimental limiting strains are beyond the theoretical values, approaching the fracture strain of the two test materials: Al-6061-T6 aluminum alloy and DC-04 high-formability steel.

  14. Recent advances in the development of extraction chromatographic materials for the isolation of radionuclides from biological and environmental samples.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dietz, M. L.

    1998-11-30

    The determination of low levels of radionuclides in environmental and biological samples is often hampered by the complex and variable nature of the samples. One approach to circumventing this problem is to incorporate into the analytical scheme a separation and preconcentration step by which the species of interest can be isolated from the major constituents of the sample. Extraction chromatography (EXC), a form of liquid chromatography in which the stationary phase comprises an extractant or a solution of an extractant in an appropriate diluent coated onto an inert support, provides a simple and efficient means of performing a wide variety of metal ion separations. Recent advances in extractant design, in particular the development of extractants capable of metal ion recognition or of strong complex formation even in acidic media, have substantially improved the utility of the method. For the preconcentration of actinides, for example, an EXC resin consisting of a liquid diphosphonic acid supported on a polymeric substrate has been shown to exhibit extraordinarily strong retention of these elements from acidic chloride media. This resin, together with other related materials, can provide the basis of a number of efficient and flexible schemes for the separation and preconcentration of radionuclides from a variety of samples for subsequent determination.

  15. Disease-Concordant Twins Empower Genetic Association Studies.

    PubMed

    Tan, Qihua; Li, Weilong; Vandin, Fabio

    2017-01-01

    Genome-wide association studies with moderate sample sizes are underpowered, especially when testing SNP alleles with low allele counts, a situation that may lead to high frequency of false-positive results and lack of replication in independent studies. Related individuals, such as twin pairs concordant for a disease, should confer increased power in genetic association analysis because of their genetic relatedness. We conducted a computer simulation study to explore the power advantage of the disease-concordant twin design, which uses singletons from disease-concordant twin pairs as cases and ordinary healthy samples as controls. We examined the power gain of the twin-based design for various scenarios (i.e., cases from monozygotic and dizygotic twin pairs concordant for a disease) and compared the power with the ordinary case-control design with cases collected from the unrelated patient population. Simulation was done by assigning various allele frequencies and allelic relative risks for different mode of genetic inheritance. In general, for achieving a power estimate of 80%, the sample sizes needed for dizygotic and monozygotic twin cases were one half and one fourth of the sample size of an ordinary case-control design, with variations depending on genetic mode. Importantly, the enriched power for dizygotic twins also applies to disease-concordant sibling pairs, which largely extends the application of the concordant twin design. Overall, our simulation revealed a high value of disease-concordant twins in genetic association studies and encourages the use of genetically related individuals for highly efficiently identifying both common and rare genetic variants underlying human complex diseases without increasing laboratory cost. © 2016 John Wiley & Sons Ltd/University College London.

  16. Auto-Origami and Soft Programmable Transformers: Simulation Studies of Liquid Crystal Elastomers and Swelling Polymer Gels

    NASA Astrophysics Data System (ADS)

    Konya, Andrew; Santangelo, Christian; Selinger, Robin

    2014-03-01

    When the underlying microstructure of an actuatable material varies in space, simple sheets can transform into complex shapes. Using nonlinear finite element elastodynamic simulations, we explore the design space of two such materials: liquid crystal elastomers and swelling polymer gels. Liquid crystal elastomers (LCE) undergo shape transformations induced by stimuli such as heating/cooling or illumination; complex deformations may be programmed by ``blueprinting'' a non-uniform director field in the sample when the polymer is cross-linked. Similarly, swellable gels can undergo shape change when they are swollen anisotropically as programmed by recently developed halftone gel lithography techniques. For each of these materials we design and test programmable motifs which give rise to complex deformation trajectories including folded structures, soft swimmers, apertures that open and close, bas relief patterns, and other shape transformations inspired by art and nature. In order to accommodate the large computational needs required to model these materials, our 3-d nonlinear finite element elastodynamics simulation algorithm is implemented in CUDA, running on a single GPU-enabled workstation.

  17. An analysis of adaptive design variations on the sequential parallel comparison design for clinical trials

    PubMed Central

    Mi, Michael Y.; Betensky, Rebecca A.

    2013-01-01

    Background: Currently, a growing placebo response rate has been observed in clinical trials for antidepressant drugs, a phenomenon that has made it increasingly difficult to demonstrate efficacy. The sequential parallel comparison design (SPCD) is a clinical trial design that was proposed to address this issue. The SPCD theoretically has the potential to reduce the sample size requirement for a clinical trial and to simultaneously enrich the study population to be less responsive to the placebo. Purpose: Because the basic SPCD design already reduces the placebo response by removing placebo responders between the first and second phases of a trial, the purpose of this study was to examine whether we can further improve the efficiency of the basic SPCD and if we can do so when the projected underlying drug and placebo response rates differ considerably from the actual ones. Methods: Three adaptive designs that used interim analyses to readjust the length of study duration for individual patients were tested to reduce the sample size requirement or increase the statistical power of the SPCD. Various simulations of clinical trials using the SPCD with interim analyses were conducted to test these designs through calculations of empirical power. Results: From the simulations, we found that the adaptive designs can recover unnecessary resources spent in the traditional SPCD trial format with overestimated initial sample sizes and provide moderate gains in power. Under the first design, results showed up to a 25% reduction in person-days, with most power losses below 5%. In the second design, results showed up to an 8% reduction in person-days, with negligible loss of power. In the third design, using sample size re-estimation, up to 25% power was recovered from underestimated sample size scenarios. Limitations: Given the numerous possible test parameters that could have been chosen for the simulations, the study’s results are limited to situations described by the parameters that were used, and may not generalize to all possible scenarios. Furthermore, drop-out of patients is not considered in this study. Conclusions: It is possible to make an already complex design such as the SPCD adaptive, and thus more efficient, potentially overcoming the problem of placebo response at lower cost. Ultimately, such a design may expedite the approval of future effective treatments. PMID:23283576

  18. Implications of alternative field-sampling designs on Landsat-based mapping of stand age and carbon stocks in Oregon forests

    Treesearch

    Maureen V. Duane; Warren B. Cohen; John L. Campbell; Tara Hudiburg; David P. Turner; Dale Weyermann

    2010-01-01

    Empirical models relating forest attributes to remotely sensed metrics are widespread in the literature and underpin many of our efforts to map forest structure across complex landscapes. In this study we compared empirical models relating Landsat reflectance to forest age across Oregon using two alternate sets of ground data: one from a large (n ~ 1500) systematic...

  19. Amputation effects on the underlying complexity within transtibial amputee ankle motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurdeman, Shane R., E-mail: shanewurdeman@gmail.com; Advanced Prosthetics Center, Omaha, Nebraska 68134; Myers, Sara A.

    2014-03-15

    The presence of chaos in walking is considered to provide a stable, yet adaptable means for locomotion. This study examined whether lower limb amputation and subsequent prosthetic rehabilitation resulted in a loss of complexity in amputee gait. Twenty-eight individuals with transtibial amputation participated in a 6-week, randomized cross-over design study in which they underwent a 3-week adaptation period to two separate prostheses. One prosthesis was deemed “more appropriate” and the other “less appropriate” based on matching/mismatching activity levels of the person and the prosthesis. Subjects performed a treadmill walking trial at self-selected walking speed at multiple points of the adaptation period, while kinematics of the ankle were recorded. Bilateral sagittal plane ankle motion was analyzed for underlying complexity through the pseudoperiodic surrogation analysis technique. Results revealed the presence of underlying deterministic structure in both prostheses and both the prosthetic and sound leg ankle (discriminant measure largest Lyapunov exponent). Results also revealed that the prosthetic ankle may be more likely to suffer loss of complexity than the sound ankle, and a “more appropriate” prosthesis may be better suited to help restore a healthy complexity of movement within the prosthetic ankle motion compared to a “less appropriate” prosthesis (discriminant measure sample entropy). Results from sample entropy are less likely to be affected by the intracycle periodic dynamics as compared to the largest Lyapunov exponent. Adaptation does not seem to influence complexity in the system for experienced prosthesis users.
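
    Sample entropy, one of the discriminant measures used here, can be computed for a one-dimensional series with the standard Richman-Moorman definition, sketched below; the embedding dimension, tolerance, and test signals are illustrative choices rather than the study's settings.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln(A/B), with B the number of template pairs of length m
    within Chebyshev tolerance r, and A the same count for length m+1."""
    x = np.asarray(x, float)
    if r is None:
        r = 0.2 * x.std()
    n = x.size - m                            # same template count for m and m+1

    def pairs_within_r(length):
        templates = np.array([x[i:i + length] for i in range(n)])
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (np.sum(dist <= r) - n) / 2    # exclude self-matches, count each pair once

    B = pairs_within_r(m)
    A = pairs_within_r(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(9)
t = np.linspace(0, 10 * np.pi, 500)
regular = np.sin(t)                                    # highly periodic, low sample entropy
irregular = np.sin(t) + rng.normal(0, 0.5, t.size)     # same rhythm with added variability
print("SampEn regular:  ", sample_entropy(regular))
print("SampEn irregular:", sample_entropy(irregular))
```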

  20. Defining Long-Duration Traverses of Lunar Volcanic Complexes with LROC NAC Images

    NASA Technical Reports Server (NTRS)

    Stopar, J. D.; Lawrence, S. J.; Joliff, B. L.; Speyerer, E. J.; Robinson, M. S.

    2016-01-01

    A long-duration lunar rover [e.g., 1] would be ideal for investigating large volcanic complexes like the Marius Hills (MH) (approximately 300 x 330 km), where widely spaced sampling points are needed to explore the full geologic and compositional variability of the region. Over these distances, a rover would encounter varied surface morphologies (ranging from impact craters to rugged lava shields), each of which needs to be considered during the rover design phase. Previous rovers, including Apollo, Lunokhod, and most recently Yutu, successfully employed pre-mission orbital data for planning (at scales significantly coarser than that of the surface assets). LROC was specifically designed to provide mission-planning observations at scales useful for accurate rover traverse planning (crewed and robotic) [2]. After-the-fact analyses of the planning data can help improve predictions of future rover performance [e.g., 3-5].

  1. Method and Apparatus for Automated Isolation of Nucleic Acids from Small Cell Samples

    NASA Technical Reports Server (NTRS)

    Sundaram, Shivshankar; Prabhakarpandian, Balabhaskar; Pant, Kapil; Wang, Yi

    2014-01-01

    RNA isolation is a ubiquitous need, driven by current emphasis on microarrays and miniaturization. With commercial systems requiring 100,000 to 1,000,000 cells for successful isolation, there is a growing need for a small-footprint, easy-to-use device that can harvest nucleic acids from much smaller cell samples (1,000 to 10,000 cells). The process of extraction of RNA from cell cultures is a complex, multi-step one, and requires timed, asynchronous operations with multiple reagents/buffers. An added complexity is the fragility of RNA (subject to degradation) and its reactivity to surfaces. A novel, microfluidics-based, integrated cartridge has been developed that can fully automate the complex process of RNA isolation (lyse, capture, and elute RNA) from small cell culture samples. On-cartridge cell lysis is achieved using either reagents or high-strength electric fields made possible by the miniaturized format. Traditionally, silica-based, porous-membrane formats have been used for RNA capture, requiring slow perfusion for effective capture. In this design, high-efficiency capture/elution are achieved using a microsphere-based "microfluidized" format. Electrokinetic phenomena are harnessed to actively mix microspheres with the cell lysate and capture/elution buffer, providing important advantages in extraction efficiency, processing time, and operational flexibility. Successful RNA isolation was demonstrated using both suspension (HL-60) and adherent (BHK-21) cells. Novel features associated with this development are twofold. First, novel designs that execute needed processes with improved speed and efficiency were developed. These primarily encompass electric-field-driven lysis of cells. The configurations include electrode-containing constructs, or an "electrode-less" chip design, which is easy to fabricate and mitigates fouling at the electrode surface; and the "fluidized" extraction format based on electrokinetically assisted mixing and contacting of microbeads in a shape-optimized chamber. A secondary proprietary feature is the particular layout integrating these components to perform the desired operation of RNA isolation. Apart from a novel functional capability, advantages of the innovation include reduced or eliminated use of toxic reagents, and operator-independent extraction of RNA.

  2. Comparison of the Medical College of Georgia Complex Figures and the Rey-Osterrieth Complex Figure tests in a normal sample of Japanese university students.

    PubMed

    Yamashita, Hikari; Yasugi, Mina

    2008-08-01

    Comparability of copy and recall performance on the four figures of the Medical College of Georgia Complex Figures and the Rey-Osterrieth Complex Figure was examined using an incidental learning paradigm with 60 men and 60 women, healthy volunteers between the ages of 18 and 24 years (M = 21.5 yr., SD = 1.5) at a Japanese university. A between-subjects design was used in which each group of participants (n = 24) responded to one of the five figures. The interrater reliability of each Georgia figure was excellent. While the five figures yielded equivalent copy scores, the Rey-Osterrieth figure had significantly lower scores than the Georgia figures at recall after 3 min. There were no significant differences among the four Georgia figures. These results are consistent with the findings of the original studies in the USA.

  3. A Qualitative Inquiry into the Complex Features of Strained Interactions: Analysis and Implications for Health Care Personnel.

    PubMed

    Thunborg, Charlotta; Salzmann-Erikson, Martin

    2017-01-01

    Communication skills are vital for successful relationships between patients and health care professionals. Failure to communicate may lead to a lack of understanding and may result in strained interactions. Our theoretical point of departure was to make use of chaos and complexity theories. The aim was to examine the features of strained interactions and to discuss their relevance for health care settings. A netnography study design was applied. Data were purposefully sampled from public online venues (122 minutes of footage across 30 video clips). The results are presented in four categories: 1) unpredictability, 2) sensitivity dependence, 3) resistibility, and 4) iteration. All four are features of strained interactions. Strained interactions are a complex phenomenon that exists in health care settings. The findings give health care professionals guidance for understanding the complexity and the features of strained interactions.

  4. Ultra compact spectrometer using linear variable filters

    NASA Astrophysics Data System (ADS)

    Dami, M.; De Vidi, R.; Aroldi, G.; Belli, F.; Chicarella, L.; Piegari, A.; Sytchkova, A.; Bulir, J.; Lemarquis, F.; Lequime, M.; Abel Tibérini, L.; Harnisch, B.

    2017-11-01

    Linear Variable Filters (LVFs) are complex optical devices that, when integrated on a CCD, can realize a "single-chip spectrometer". In the framework of an ESA study, a team of industries and institutes led by SELEX-Galileo explored the design principles and manufacturing techniques, realizing and characterizing LVF samples based on both All-Dielectric (AD) and Metal-Dielectric (MD) coating structures in the VNIR and SWIR spectral ranges. In particular, the achieved performance in spectral gradient, transmission bandwidth, and Spectral Attenuation (SA) is presented and critically discussed, and potential improvements are highlighted. In addition, the results of a feasibility study of a SWIR Linear Variable Filter are presented, with a comparison of design predictions and measured performance. Finally, critical issues related to the filter-CCD packaging are discussed. The main achievements reached during these activities have been: - to evaluate, by design, manufacture, and test of LVF samples, the achievable performance against target requirements; - to evaluate the reliability of the projects by analyzing their repeatability; - to define suitable measurement methodologies.

  5. Taking a statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. The use of geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs, long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
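
    The spatial-correlation idea underlying geostatistics (nearby samples tend to be more alike than distant ones) is usually summarized with an empirical semivariogram. The sketch below uses hypothetical soil-contaminant coordinates and concentrations and plain numpy, rather than any specific EPA or commercial tool, to show how pairwise semivariance can be binned by separation distance.

```python
import numpy as np

def empirical_semivariogram(coords, values, n_bins=10):
    """Bin half-squared differences of all point pairs by separation distance.

    coords : (n, 2) array of sample locations
    values : (n,) array of measured concentrations
    Returns bin-centre distances and the mean semivariance per bin.
    """
    coords = np.asarray(coords, dtype=float)
    values = np.asarray(values, dtype=float)
    i, j = np.triu_indices(len(values), k=1)           # all unique point pairs
    dists = np.linalg.norm(coords[i] - coords[j], axis=1)
    gamma = 0.5 * (values[i] - values[j]) ** 2          # semivariance per pair
    edges = np.linspace(0.0, dists.max() + 1e-9, n_bins + 1)
    centres, means = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (dists >= lo) & (dists < hi)
        if mask.any():
            centres.append(0.5 * (lo + hi))
            means.append(gamma[mask].mean())
    return np.array(centres), np.array(means)

# Hypothetical soil-contaminant data: nearby samples should show lower semivariance.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(50, 2))
conc = np.sin(xy[:, 0] / 20.0) + rng.normal(scale=0.2, size=50)
h, gamma_h = empirical_semivariogram(xy, conc)
print(np.round(h, 1), np.round(gamma_h, 3))
```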

  6. Exploiting Complexity Information for Brain Activation Detection

    PubMed Central

    Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui

    2016-01-01

    We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of voxel complexity. Under this hypothesis, voxel complexity is modulated by pertinent cognitive tasks and changes across experimental paradigms. We calculate the complexity of sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well-known general linear model-based Statistical Parametric Mapping package (SPM12), and a marked difference is observed. This is because the SampEn method detects brain complexity changes between the two experimental conditions and, as a data-driven method, it evaluates only the complexity of the specific sequential fMRI data. Larger and smaller SampEn values also carry different meanings: the neutral-blank design produces higher predictability than the threat-neutral design. Complexity information can be considered a complementary method to the existing fMRI analysis strategies, and it may help improve the understanding of human brain functions from a different perspective. PMID:27045838
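
    As a rough illustration of the quantities named here, the following sketch computes a simplified sample entropy for short synthetic series and compares two hypothetical conditions with the Wilcoxon signed rank test. It is not the authors' pipeline; the series, the tolerance r = 0.2·SD, and the embedding dimension m = 2 are assumed defaults.

```python
import numpy as np
from scipy.stats import wilcoxon

def sample_entropy(x, m=2, r=None):
    """Simplified SampEn(m, r): -ln(A/B), where B counts template matches of
    length m and A of length m+1 (Chebyshev distance <= r, self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return np.sum(d <= r) - len(templates)   # drop the trivial self-matches

    B = count_matches(m)
    A = count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# Hypothetical per-voxel complexity values under two task conditions:
# pure noise (high complexity) versus a noisy sinusoid (more regular).
rng = np.random.default_rng(1)
cond_a = [sample_entropy(rng.normal(size=200)) for _ in range(20)]
cond_b = [sample_entropy(np.sin(np.linspace(0, 8 * np.pi, 200))
                         + rng.normal(scale=0.3, size=200)) for _ in range(20)]
stat, p = wilcoxon(cond_a, cond_b)   # paired nonparametric comparison, as in the abstract
print(p)
```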

  7. An object-oriented framework for medical image registration, fusion, and visualization.

    PubMed

    Zhu, Yang-Ming; Cochoff, Steven M

    2006-06-01

    An object-oriented framework for image registration, fusion, and visualization was developed based on the classic model-view-controller paradigm. The framework employs many design patterns to facilitate legacy code reuse, manage software complexity, and enhance the maintainability and portability of the framework. Three sample applications built atop this framework are illustrated to show its effectiveness: the first is for volume image grouping and re-sampling, the second is for 2D registration and fusion, and the last is for visualization of single images as well as registered volume images.

  8. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
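
    A minimal sketch of the kind of bootstrap-based sampling-uncertainty assessment described above, restricted to the peak-discharge component: a hypothetical annual-maximum record is resampled with replacement and a Gumbel distribution is refit by the method of moments each time. The full framework also covers hydrograph volume, shape, and regionalization, which this sketch omits.

```python
import numpy as np

def bootstrap_design_peak(annual_maxima, return_period=100, n_boot=2000, seed=0):
    """Bootstrap the sampling uncertainty of a T-year design peak discharge.

    Resamples the annual-maximum series with replacement, refits a Gumbel
    distribution by the method of moments each time, and returns the 5th,
    50th, and 95th percentiles of the resulting T-year quantile estimates.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(annual_maxima, dtype=float)
    p = 1.0 - 1.0 / return_period                        # non-exceedance probability
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, size=x.size, replace=True)
        beta = np.sqrt(6.0) * xb.std(ddof=1) / np.pi     # Gumbel scale (moments)
        mu = xb.mean() - 0.5772 * beta                   # Gumbel location
        estimates[b] = mu - beta * np.log(-np.log(p))    # Gumbel T-year quantile
    return np.percentile(estimates, [5, 50, 95])

# Hypothetical 30-year annual-maximum discharge record (m^3/s).
rng = np.random.default_rng(42)
record = rng.gumbel(loc=120.0, scale=35.0, size=30)
print(bootstrap_design_peak(record))
```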

  9. Seven common mistakes in population genetics and how to avoid them.

    PubMed

    Meirmans, Patrick G

    2015-07-01

    As the data resulting from modern genotyping tools are astoundingly complex, genotyping studies require great care in the sampling design, genotyping, data analysis and interpretation. Such care is necessary because, with data sets containing thousands of loci, small biases can easily become strongly significant patterns. Such biases may already arise in routine tasks that are part of almost every genotyping study. Here, I discuss seven common mistakes that are frequently encountered in the genotyping literature: (i) giving more attention to genotyping than to sampling, (ii) failing to perform or report experimental randomization in the laboratory, (iii) equating geopolitical borders with biological borders, (iv) testing significance of clustering output, (v) misinterpreting Mantel's r statistic, (vi) only interpreting a single value of k and (vii) forgetting that only a small portion of the genome will be associated with climate. For each of these issues, I give suggestions on how to avoid the mistake. Overall, I argue that genotyping studies would benefit from establishing a more rigorous experimental design, involving proper sampling design, randomization and better distinction of a priori hypotheses from exploratory analyses. © 2015 John Wiley & Sons Ltd.

  10. Intuitive web-based experimental design for high-throughput biomedical data.

    PubMed

    Friedrich, Andreas; Kenar, Erhan; Kohlbacher, Oliver; Nahnsen, Sven

    2015-01-01

    Big data bioinformatics aims at drawing biological conclusions from huge and complex biological datasets. Added value from the analysis of big data, however, is only possible if the data are accompanied by accurate metadata annotation. Particularly in high-throughput experiments, intelligent approaches are needed to keep track of the experimental design, including the conditions that are studied as well as information that might be interesting for failure analysis or further experiments in the future. In addition to the management of this information, means for an integrated design and interfaces for structured data annotation are urgently needed by researchers. Here, we propose a factor-based experimental design approach that enables scientists to easily create large-scale experiments with the help of a web-based system. We present a novel implementation of a web-based interface allowing the collection of arbitrary metadata. To exchange and edit information, we provide a spreadsheet-based, human-readable format. Subsequently, sample sheets with identifiers and metainformation for data generation facilities can be created. Data files created after measurement of the samples can be uploaded to a datastore, where they are automatically linked to the previously created experimental design model.

  11. The Design of a Quantitative Western Blot Experiment

    PubMed Central

    Taylor, Sean C.; Posch, Anton

    2014-01-01

    Western blotting is a technique, in practice for more than three decades, that began as a means of detecting a protein target in a complex sample. Although there have been significant advances in both the imaging and reagent technologies to improve sensitivity, dynamic range of detection, and the applicability of multiplexed target detection, the basic technique has remained essentially unchanged. In the past, western blotting was used simply to detect a specific target protein in a complex mixture, but now journal editors and reviewers are requesting the quantitative interpretation of western blot data in terms of fold changes in protein expression between samples. The calculations are based on the differential densitometry of the associated chemiluminescent and/or fluorescent signals from the blots, and this now requires a fundamental shift in the experimental methodology, acquisition, and interpretation of the data. We have recently published an updated approach to produce quantitative densitometric data from western blots (Taylor et al., 2013) and here we summarize the complete western blot workflow with a focus on sample preparation and data analysis for quantitative western blotting. PMID:24738055
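
    The quantitative interpretation mentioned here ultimately reduces to normalizing each lane's target densitometry to a loading control and expressing treated lanes as fold changes over the mean normalized control signal. A minimal sketch of that arithmetic with hypothetical lane names and signal values, not the published workflow:

```python
# Hypothetical densitometry readings (arbitrary units) for a target protein and a
# loading control measured on the same lanes.
target  = {"control_1": 1520, "control_2": 1480, "treated_1": 3120, "treated_2": 2950}
loading = {"control_1": 9800, "control_2": 10150, "treated_1": 9650, "treated_2": 10020}

# Normalize each lane's target signal to its loading control, then express
# treated lanes as fold change relative to the mean normalized control signal.
normalized = {lane: target[lane] / loading[lane] for lane in target}
controls = [v for k, v in normalized.items() if k.startswith("control")]
control_mean = sum(controls) / len(controls)
fold_changes = {lane: v / control_mean
                for lane, v in normalized.items() if lane.startswith("treated")}
print(fold_changes)
```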

  12. Multichannel waveguides for the simultaneous detection of disease biomarkers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mukundan, Harshini; Price, Dominique Z; Grace, Wynne K

    2009-01-01

    The sensor team at the Los Alamos National Laboratory has developed a waveguide-based optical biosensor that has previously been used for the detection of biomarkers associated with diseases such as tuberculosis, breast cancer, anthrax and influenza in complex biological samples (e.g., serum and urine). However, no single biomarker can accurately predict disease. To address this issue, we developed a multiplex assay for the detection of components of the Bacillus anthracis lethal toxin on single mode planar optical waveguides with tunable quantum dots as the fluorescence reporter. This limited ability to multiplex is still insufficient for accurate detection of disease or for monitoring prognosis. In this manuscript, we demonstrate for the first time the design, fabrication and successful evaluation of a multichannel planar optical waveguide for the simultaneous detection of at least three unknown samples in quadruplicate. We demonstrate the simultaneous, rapid (30 min), quantitative (with internal standard) and sensitive (limit of detection of 1 pM) detection of protective antigen and lethal factor of Bacillus anthracis in complex biological samples (serum) using specific monoclonal antibodies labeled with quantum dots as the fluorescence reporter.

  13. Development of an Enhanced Metaproteomic Approach for Deepening the Microbiome Characterization of the Human Infant Gut

    PubMed Central

    2015-01-01

    The establishment of early life microbiota in the human infant gut is highly variable and plays a crucial role in host nutrient availability/uptake and maturation of immunity. Although high-performance mass spectrometry (MS)-based metaproteomics is a powerful method for the functional characterization of complex microbial communities, the acquisition of comprehensive metaproteomic information in human fecal samples is inhibited by the presence of abundant human proteins. To alleviate this restriction, we have designed a novel metaproteomic strategy based on double filtering (DF) the raw samples, a method that fractionates microbial from human cells to enhance microbial protein identification and characterization in complex fecal samples from healthy premature infants. This method dramatically improved the overall depth of infant gut proteome measurement, with an increase in the number of identified low-abundance proteins and a greater than 2-fold improvement in microbial protein identification and quantification. This enhancement of proteome measurement depth enabled a more extensive microbiome comparison between infants by not only increasing the confidence of identified microbial functional categories but also revealing previously undetected categories. PMID:25350865

  14. Bayesian Analysis of the Cosmic Microwave Background

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey

    2007-01-01

    There is a wealth of cosmological information encoded in the spatial power spectrum of temperature anisotropies of the cosmic microwave background. Experiments designed to map the microwave sky are returning a flood of data (time streams of instrument response as a beam is swept over the sky) at several different frequencies (from 30 to 900 GHz), all with different resolutions and noise properties. The resulting analysis challenge is to estimate, and quantify our uncertainty in, the spatial power spectrum of the cosmic microwave background given the complexities of "missing data", foreground emission, and complicated instrumental noise. Bayesian formulation of this problem allows consistent treatment of many complexities, including complicated instrumental noise and foregrounds, and can be numerically implemented with Gibbs sampling. Gibbs sampling has now been validated as an efficient, statistically exact, and practically useful method for low-resolution analysis (as demonstrated on WMAP 1- and 3-year temperature and polarization data). Development is continuing for Planck; the goal is to exploit the unique capabilities of Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to the total uncertainty in cosmological parameters.
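
    To illustrate the alternating-conditional idea behind Gibbs sampling, here is a toy sampler for a standard bivariate normal with correlation rho. The actual CMB analysis alternates between far more elaborate conditionals (sky signal given the power spectrum, and power spectrum given the signal), which this sketch does not attempt.

```python
import numpy as np

def gibbs_bivariate_normal(rho=0.8, n_iter=5000, seed=0):
    """Minimal Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is normal: x | y ~ N(rho*y, 1 - rho^2), and symmetrically
    for y | x, so the chain simply alternates exact conditional draws.
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    draws = np.empty((n_iter, 2))
    sd = np.sqrt(1.0 - rho ** 2)
    for t in range(n_iter):
        x = rng.normal(rho * y, sd)
        y = rng.normal(rho * x, sd)
        draws[t] = x, y
    return draws

samples = gibbs_bivariate_normal()
print(np.corrcoef(samples[1000:].T)[0, 1])   # should be close to 0.8 after burn-in
```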

  15. A comprehensive and scalable database search system for metaproteomics.

    PubMed

    Chatterjee, Sandip; Stupp, Gregory S; Park, Sung Kyu Robin; Ducom, Jean-Christophe; Yates, John R; Su, Andrew I; Wolan, Dennis W

    2016-08-16

    Mass spectrometry-based shotgun proteomics experiments rely on accurate matching of experimental spectra against a database of protein sequences. Existing computational analysis methods are limited in the size of their sequence databases, which severely restricts the proteomic sequencing depth and functional analysis of highly complex samples. The growing amount of public high-throughput sequencing data will only exacerbate this problem. We designed a broadly applicable metaproteomic analysis method (ComPIL) that addresses protein database size limitations. Our approach to overcome this significant limitation in metaproteomics was to design a scalable set of sequence databases assembled for optimal library querying speeds. ComPIL was integrated with a modified version of the search engine ProLuCID (termed "Blazmass") to permit rapid matching of experimental spectra. Proof-of-principle analysis of human HEK293 lysate with a ComPIL database derived from high-quality genomic libraries was able to detect nearly all of the same peptides as a search with a human database (~500x fewer peptides in the database), with a small reduction in sensitivity. We were also able to detect proteins from the adenovirus used to immortalize these cells. We applied our method to a set of healthy human gut microbiome proteomic samples and showed a substantial increase in the number of identified peptides and proteins compared to previous metaproteomic analyses, while retaining a high degree of protein identification accuracy and allowing for a more in-depth characterization of the functional landscape of the samples. The combination of ComPIL with Blazmass allows proteomic searches to be performed with database sizes much larger than previously possible. These large database searches can be applied to complex meta-samples with unknown composition or proteomic samples where unexpected proteins may be identified. The protein database, proteomic search engine, and the proteomic data files for the 5 microbiome samples characterized and discussed herein are open source and available for use and additional analysis.

  16. An analysis of adaptive design variations on the sequential parallel comparison design for clinical trials.

    PubMed

    Mi, Michael Y; Betensky, Rebecca A

    2013-04-01

    Currently, a growing placebo response rate has been observed in clinical trials for antidepressant drugs, a phenomenon that has made it increasingly difficult to demonstrate efficacy. The sequential parallel comparison design (SPCD) is a clinical trial design that was proposed to address this issue. The SPCD theoretically has the potential to reduce the sample-size requirement for a clinical trial and to simultaneously enrich the study population to be less responsive to the placebo. Because the basic SPCD already reduces the placebo response by removing placebo responders between the first and second phases of a trial, the purpose of this study was to examine whether we can further improve the efficiency of the basic SPCD and whether we can do so when the projected underlying drug and placebo response rates differ considerably from the actual ones. Three adaptive designs that used interim analyses to readjust the length of study duration for individual patients were tested to reduce the sample-size requirement or increase the statistical power of the SPCD. Various simulations of clinical trials using the SPCD with interim analyses were conducted to test these designs through calculations of empirical power. From the simulations, we found that the adaptive designs can recover unnecessary resources spent in the traditional SPCD trial format with overestimated initial sample sizes and provide moderate gains in power. Under the first design, results showed up to a 25% reduction in person-days, with most power losses below 5%. In the second design, results showed up to an 8% reduction in person-days with negligible loss of power. In the third design using sample-size re-estimation, up to 25% power was recovered from underestimated sample-size scenarios. Given the numerous possible test parameters that could have been chosen for the simulations, the study's results are limited to situations described by the parameters that were used and may not generalize to all possible scenarios. Furthermore, dropout of patients is not considered in this study. It is possible to make an already complex design such as the SPCD adaptive, and thus more efficient, potentially overcoming the problem of placebo response at lower cost. Ultimately, such a design may expedite the approval of future effective treatments.

  17. Establishing and Maintaining an Extensive Library of Patient-Derived Xenograft Models.

    PubMed

    Mattar, Marissa; McCarthy, Craig R; Kulick, Amanda R; Qeriqi, Besnik; Guzman, Sean; de Stanchina, Elisa

    2018-01-01

    Patient-derived xenograft (PDX) models have recently emerged as a highly desirable platform in oncology and are expected to substantially broaden the way in vivo studies are designed and executed and to reshape drug discovery programs. However, acquisition of patient-derived samples, and propagation, annotation and distribution of PDXs are complex processes that require a high degree of coordination among clinic, surgery and laboratory personnel, and are fraught with challenges that are administrative, procedural and technical. Here, we examine in detail the major aspects of this complex process and relate our experience in establishing a PDX Core Laboratory within a large academic institution.

  18. [Sampling optimization for tropical invertebrates: an example using dung beetles (Coleoptera: Scarabaeinae) in Venezuela].

    PubMed

    Ferrer-Paris, José Rafael; Sánchez-Mercado, Ada; Rodríguez, Jon Paul

    2013-03-01

    The development of efficient sampling protocols is an essential prerequisite to evaluate and identify priority conservation areas. There are few protocols for fauna inventory and monitoring at wide geographical scales in the tropics, where the complexity of communities and high biodiversity levels make the implementation of efficient protocols more difficult. We propose here a simple strategy to optimize the capture of dung beetles, applied to sampling with baited traps and generalizable to other sampling methods. We analyzed data from eight transects sampled between 2006 and 2008 with the aim of developing a uniform sampling design that allows species richness, abundance and composition to be estimated confidently at wide geographical scales. We examined four characteristics of any sampling design that affect the effectiveness of the sampling effort: the number of traps, sampling duration, type and proportion of bait, and spatial arrangement of the traps along transects. We used species accumulation curves, rank-abundance plots, indicator species analysis, and multivariate correlograms. We captured 40 337 individuals (115 species/morphospecies of 23 genera). Most species were attracted by both dung and carrion, but two thirds had greater relative abundance in traps baited with human dung. Different aspects of the sampling design influenced each diversity attribute in different ways. To obtain reliable richness estimates, the number of traps was the most important aspect. Accurate abundance estimates were obtained when the sampling period was increased, while the spatial arrangement of traps was determinant to capture the species composition pattern. An optimum sampling strategy for accurate estimates of richness, abundance and diversity should: (1) set 50-70 traps to maximize the number of species detected, (2) collect samples during 48-72 hours and set trap groups along the transect to reliably estimate species abundance, (3) set traps in groups of at least 10 traps to suitably record the local species composition, and (4) separate trap groups by a distance greater than 5-10 km to avoid spatial autocorrelation. For the evaluation of other sampling protocols, we recommend first identifying the elements of the sampling design that could affect the sampling effort (the number of traps, sampling duration, type and proportion of bait) and their spatial distribution (the spatial arrangement of the traps), and then evaluating how they affect richness, abundance and species composition estimates.
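
    A sample-based species accumulation curve, one of the tools mentioned above, can be estimated by pooling traps in random order and averaging the number of species detected after each addition. The sketch below uses a hypothetical trap-by-species matrix, not the Venezuelan dung beetle data.

```python
import numpy as np

def species_accumulation(trap_by_species, n_perm=200, seed=0):
    """Mean sample-based species accumulation curve.

    trap_by_species : (n_traps, n_species) abundance (or presence) matrix.
    Traps are added in random order; the curve gives the expected number of
    species detected after pooling k traps, averaged over permutations.
    """
    rng = np.random.default_rng(seed)
    presence = np.asarray(trap_by_species) > 0
    n_traps = presence.shape[0]
    curve = np.zeros(n_traps)
    for _ in range(n_perm):
        order = rng.permutation(n_traps)
        seen = np.zeros(presence.shape[1], dtype=bool)
        for k, trap in enumerate(order):
            seen |= presence[trap]
            curve[k] += seen.sum()
    return curve / n_perm

# Hypothetical 50-trap x 30-species abundance matrix with uneven species frequencies.
rng = np.random.default_rng(3)
counts = rng.poisson(lam=rng.gamma(0.5, 2.0, size=30), size=(50, 30))
print(np.round(species_accumulation(counts), 1))
```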

  19. Chemodosimetric analysis in food-safety monitoring: design, synthesis, and application of a bimetallic Re(I)-Pt(II) complex for detection of dimethyl sulfide in foods.

    PubMed

    Chow, Cheuk-Fai; Gong, Fu-Wen; Gong, Cheng-Bin

    2014-09-21

    Detection of neutral biogenic sulfides plays a crucial role in food safety. A new heterobimetallic Re(I)-Pt(II) donor-acceptor complex--[Re(biq)(CO)3(CN)]-[Pt(DMSO)(Cl)2] (1, biq = 2,2'-biquinoline)--was synthesized and characterized. The X-ray crystallographic and photophysical data for 1 are reported in this study. Complex 1 exhibited luminescent chemodosimetric selectivity for dimethyl sulfide, which persisted even in the presence of a variety of interfering vapors, with a detection limit as low as 0.96 ppm. The binding constant (log K) of 1 toward dimethyl sulfide was 3.63 ± 0.03. The analyte selectivity of the complexes was found to be dependent on the ligand coordinated to the Re(I) center. Real samples (beef, chicken, and pork) were monitored in real time for gaseous dimethyl sulfide. Complex 1 shows a linear spectrofluorimetric response with increasing storage time of the meats at 30 °C.

  20. Ancient palace complex (300–100 BC) discovered in the Valley of Oaxaca, Mexico

    PubMed Central

    Redmond, Elsa M.; Spencer, Charles S.

    2017-01-01

    Recently completed excavations at the site of El Palenque in Mexico’s Valley of Oaxaca have recovered the well-preserved remains of a palace complex dated by associated radiocarbon samples and ceramics to the Late Formative period or Late Monte Albán I phase (300–100 BC), the period of archaic state emergence in the region. The El Palenque palace exhibits certain architectural and organizational features similar to the royal palaces of much later Mesoamerican states described by Colonial-period sources. The excavation data document a multifunctional palace complex covering a maximum estimated area of 2,790 m2 on the north side of the site’s plaza and consisting of both governmental and residential components. The data indicate that the palace complex was designed and built as a single construction. The palace complex at El Palenque is the oldest multifunctional palace excavated thus far in the Valley of Oaxaca. PMID:28348218

  1. Statistical Methods for Detecting Differentially Abundant Features in Clinical Metagenomic Samples

    PubMed Central

    White, James Robert; Nagarajan, Niranjan; Pop, Mihai

    2009-01-01

    Numerous studies are currently underway to characterize the microbial communities inhabiting our world. These studies aim to dramatically expand our understanding of the microbial biosphere and, more importantly, hope to reveal the secrets of the complex symbiotic relationship between us and our commensal bacterial microflora. An important prerequisite for such discoveries is computational tools that are able to rapidly and accurately compare large datasets generated from complex bacterial communities to identify features that distinguish them. We present a statistical method for comparing clinical metagenomic samples from two treatment populations on the basis of count data (e.g. as obtained through sequencing) to detect differentially abundant features. Our method, Metastats, employs the false discovery rate to improve specificity in high-complexity environments, and separately handles sparsely sampled features using Fisher's exact test. Under a variety of simulations, we show that Metastats performs well compared to previously used methods, and significantly outperforms other methods for features with sparse counts. We demonstrate the utility of our method on several datasets including a 16S rRNA survey of obese and lean human gut microbiomes, COG functional profiles of infant and mature gut microbiomes, and bacterial and viral metabolic subsystem data inferred from random sequencing of 85 metagenomes. The application of our method to the obesity dataset reveals differences between obese and lean subjects not reported in the original study. For the COG and subsystem datasets, we provide the first statistically rigorous assessment of the differences between these populations. The methods described in this paper are the first to address clinical metagenomic datasets comprising samples from multiple subjects. Our methods are robust across datasets of varied complexity and sampling level. While designed for metagenomic applications, our software can also be applied to digital gene expression studies (e.g. SAGE). A web server implementation of our methods and freely available source code can be found at http://metastats.cbcb.umd.edu/. PMID:19360128
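
    For the sparsely sampled features mentioned above, the described approach is Fisher's exact test on pooled counts, with a false discovery rate correction across features. Below is a minimal sketch with hypothetical counts, not the Metastats source code; a Benjamini-Hochberg adjustment stands in for whichever FDR procedure the software applies.

```python
import numpy as np
from scipy.stats import fisher_exact

def benjamini_hochberg(pvals):
    """Benjamini-Hochberg adjusted p-values (false discovery rate)."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
    ranked = np.minimum.accumulate(ranked[::-1])[::-1]   # enforce monotonicity
    adjusted = np.empty_like(p)
    adjusted[order] = np.clip(ranked, 0, 1)
    return adjusted

def sparse_feature_pvalue(feat_a, total_a, feat_b, total_b):
    """Fisher's exact test on a pooled 2x2 table for one sparse feature:
    feature reads versus all other reads in the two treatment populations."""
    table = [[feat_a, total_a - feat_a], [feat_b, total_b - feat_b]]
    return fisher_exact(table)[1]

# Hypothetical pooled counts for five rare features in two metagenomic populations:
# (feature reads in A, total reads in A, feature reads in B, total reads in B).
features = [(3, 10000, 15, 9000), (0, 10000, 6, 9000),
            (2, 10000, 1, 9000), (8, 10000, 0, 9000), (1, 10000, 2, 9000)]
pvals = [sparse_feature_pvalue(*f) for f in features]
print(benjamini_hochberg(pvals))
```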

  2. RACE/A: an architectural account of the interactions between learning, task control, and retrieval dynamics.

    PubMed

    van Maanen, Leendert; van Rijn, Hedderik; Taatgen, Niels

    2012-01-01

    This article discusses how sequential sampling models can be integrated in a cognitive architecture. The new theory Retrieval by Accumulating Evidence in an Architecture (RACE/A) combines the level of detail typically provided by sequential sampling models with the level of task complexity typically provided by cognitive architectures. We will use RACE/A to model data from two variants of a picture-word interference task in a psychological refractory period design. These models will demonstrate how RACE/A enables interactions between sequential sampling and long-term declarative learning, and between sequential sampling and task control. In a traditional sequential sampling model, the onset of the process within the task is unclear, as is the number of sampling processes. RACE/A provides a theoretical basis for estimating the onset of sequential sampling processes during task execution and allows for easy modeling of multiple sequential sampling processes within a task. Copyright © 2011 Cognitive Science Society, Inc.

  3. 'Complexity-compatible' policy for integrated care? Lessons from the implementation of Ontario's Health Links.

    PubMed

    Grudniewicz, Agnes; Tenbensel, Tim; Evans, Jenna M; Steele Gray, Carolyn; Baker, G Ross; Wodchis, Walter P

    2018-02-01

    Complex adaptive systems (CAS) theory views healthcare as numerous sub-systems characterized by diverse agents that interact, self-organize, and continuously adapt. We apply this complexity science perspective to examine the extent to which CAS theory is a useful lens for designing and implementing health policies. We present the case of Health Links, a "low rules" policy intervention in Ontario, Canada aimed at stimulating the development of voluntary networks of health and social organizations to improve care coordination for the most frequent users of the healthcare system. Our sample consisted of stakeholders from regional governance bodies and organizations partnering in Health Links. Qualitative interview data were coded using the key complexity concepts of sensemaking, self-organization, interconnections, coevolution, and emergence. We found that the complexity-compatible policy design successfully stimulated local dynamics of flexibility, experimentation, and learning and that important mediating factors include leadership, readiness, relationship-building, role clarity, communication, and resources. However, we saw tensions between preferences for flexibility and standardization. Desirable developments occurred only in some settings and failed to flow upward to higher levels, resulting in a piecemeal and patchy landscape. Attention needs to be paid not only to local dynamics and processes, but also to regional and provincial levels to ensure that learning flows to the top and informs decision-making. We conclude that implementation of complexity-compatible policies needs a balance between flexibility and consistency and the right leadership to coordinate the two. Complexity-compatible policy for integrated healthcare is more than simply 'letting a thousand flowers bloom'. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Measurement of the complex permittivity of dry rocks and minerals: application of polythene dilution method and Lichtenecker's mixture formulae

    NASA Astrophysics Data System (ADS)

    Zheng, Yongchun; Wang, Shijie; Feng, Junming; Ouyang, Ziyuan; Li, Xiongyao

    2005-12-01

    The complex permittivity of dry rocks and minerals varies over a very wide range; even within a single sample there is variation at different temperatures and frequencies. Most rocks and minerals are inhomogeneous materials; therefore, most present methods of dielectric measurement, designed for artificial homogeneous materials, are not suitable for rocks and minerals. The resonant cavity perturbation (RCP) method is a reliable and simple technique to determine the complex permittivity of dielectric materials in the GHz range, and this method is also used extensively. However, the traditional RCP method is suited only to the measurement of materials with low dielectric constant (ɛ') and low loss factor (ɛ'' or tanδ). The complex permittivity of most dry rocks and minerals exceeds the measurable range of the RCP method and cannot be measured by the RCP method directly. This paper proposes a new method to measure the complex permittivity of dry rocks and minerals with the RCP method incorporated in the application of the polythene (PE) dilution method and Lichtenecker's mixture formulae. Dry rocks and minerals were ground into fine powder. The rock or mineral powder was mixed with polythene powder in a defined volume fraction. The mixture was heated and pressed into a thin circular slice. The slice was processed into a small rectangular strip sample sized to meet the requirements of the RCP method. The complex permittivity of the strip was obtained by the RCP method. The relationship between the dielectric properties of the two-phase mixture and those of each phase in the mixture can be expressed by Lichtenecker's mixture formula. Thus the complex permittivity of dry rocks and minerals can be calculated from the complex permittivity of the mixture, provided that the complex permittivity of polythene is known. The presented method was verified by measurements of reference materials of known complex permittivity and by comparison with other reliable dielectric measurement methods. The experimental results showed that the new method offers high accuracy, small sample requirements, and convenient application. Moreover, the complex permittivity of rocks and minerals measured by this method is more reliable than the direct dielectric measurement of rocks or minerals without application of the polythene dilution method and Lichtenecker's mixture formulae.
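
    Lichtenecker's logarithmic mixture rule for a two-phase mixture, written here in its general complex form under the assumption that the rock/mineral powder (volume fraction v_r) and the polythene (v_PE, with v_r + v_PE = 1) are the two phases; the second line inverts it to recover the rock permittivity from the measured mixture value. The paper may use a specific variant of the formula, so this is offered only as the standard form of the rule.

```latex
% Lichtenecker's logarithmic mixture rule for the two-phase PE/powder mixture
% (volume fractions v_r + v_PE = 1), and its inversion for the rock permittivity.
\begin{align}
  \ln \varepsilon^{*}_{\mathrm{mix}}
    &= v_{r}\,\ln \varepsilon^{*}_{r} + v_{\mathrm{PE}}\,\ln \varepsilon^{*}_{\mathrm{PE}} \\
  \varepsilon^{*}_{r}
    &= \exp\!\left(\frac{\ln \varepsilon^{*}_{\mathrm{mix}}
        - v_{\mathrm{PE}}\,\ln \varepsilon^{*}_{\mathrm{PE}}}{v_{r}}\right)
\end{align}
```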

  5. Primary Care Physician Insights Into a Typology of the Complex Patient in Primary Care

    PubMed Central

    Loeb, Danielle F.; Binswanger, Ingrid A.; Candrian, Carey; Bayliss, Elizabeth A.

    2015-01-01

    PURPOSE Primary care physicians play unique roles caring for complex patients, often acting as the hub for their care and coordinating care among specialists. To inform the clinical application of new models of care for complex patients, we sought to understand how these physicians conceptualize patient complexity and to develop a corresponding typology. METHODS We conducted qualitative in-depth interviews with internal medicine primary care physicians from 5 clinics associated with a university hospital and a community health hospital. We used systematic nonprobabilistic sampling to achieve an even distribution of sex, years in practice, and type of practice. The interviews were analyzed using a team-based participatory general inductive approach. RESULTS The 15 physicians in this study endorsed a multidimensional concept of patient complexity. The physicians perceived patients to be complex if they had an exacerbating factor—a medical illness, mental illness, socioeconomic challenge, or behavior or trait (or some combination thereof)—that complicated care for chronic medical illnesses. CONCLUSION This perspective of primary care physicians caring for complex patients can help refine models of complexity to design interventions or models of care that improve outcomes for these patients. PMID:26371266

  6. Primary care physician insights into a typology of the complex patient in primary care.

    PubMed

    Loeb, Danielle F; Binswanger, Ingrid A; Candrian, Carey; Bayliss, Elizabeth A

    2015-09-01

    Primary care physicians play unique roles caring for complex patients, often acting as the hub for their care and coordinating care among specialists. To inform the clinical application of new models of care for complex patients, we sought to understand how these physicians conceptualize patient complexity and to develop a corresponding typology. We conducted qualitative in-depth interviews with internal medicine primary care physicians from 5 clinics associated with a university hospital and a community health hospital. We used systematic nonprobabilistic sampling to achieve an even distribution of sex, years in practice, and type of practice. The interviews were analyzed using a team-based participatory general inductive approach. The 15 physicians in this study endorsed a multidimensional concept of patient complexity. The physicians perceived patients to be complex if they had an exacerbating factor-a medical illness, mental illness, socioeconomic challenge, or behavior or trait (or some combination thereof)-that complicated care for chronic medical illnesses. This perspective of primary care physicians caring for complex patients can help refine models of complexity to design interventions or models of care that improve outcomes for these patients. © 2015 Annals of Family Medicine, Inc.

  7. Prevalence of overweight and obesity and some associated factors among adult residents of northeast China: a cross-sectional study

    PubMed Central

    Zhang, Peng; Gao, Chunshi; Li, Zhijun; Lv, Xin; Song, Yuanyuan; Yu, Yaqin; Li, Bo

    2016-01-01

    Objectives This study aims to estimate the prevalence of overweight and obesity and determine potential influencing factors among adults in northeast China. Methods A cross-sectional survey was conducted in Jilin Province, northeast China, in 2012. A total of 9873 men and 10 966 women aged 18–79 years from the general population were included using a multistage stratified random cluster sampling design. Data were obtained from face-to-face interviews and physical examination. After being weighted according to the complex sampling scheme, the sample was used to estimate the prevalence of overweight (body mass index (BMI) 24–27.9 kg/m2) and obesity (BMI >28 kg/m2) in Jilin Province and to analyse influencing factors using statistical methods appropriate to the complex sampling design. Results The overall prevalence of overweight was 32.3% (male 34.3%; female 30.2%), and the prevalence of obesity was 14.6% (male 16.3%; female 12.8%) in Jilin Province. The prevalences of both overweight and obesity were higher in men than in women (p<0.001). Influencing factors included sex, age, marriage status, occupation, smoking, drinking, diet and hours of sleep (p<0.05). Conclusions This study found that the prevalence of overweight and obesity among adult residents of Jilin Province, northeast China, was high. The results of this study will be submitted to the Health Department of Jilin Province and other relevant departments as a reference, which should inform policy makers in developing education and publicity to prevent and control the occurrence of overweight and obesity. PMID:27456326
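
    The design-weighted point estimate referred to here is, at its core, a ratio of weighted sums. The sketch below uses hypothetical respondents and design weights; proper variance estimation under the stratified cluster design (e.g., Taylor linearization or replicate weights) is deliberately omitted.

```python
import numpy as np

def weighted_prevalence(flags, weights):
    """Survey-weighted prevalence: sum of case weights divided by the total weight,
    mirroring point estimation under a complex (weighted) sampling scheme."""
    flags = np.asarray(flags, dtype=float)      # 1 = overweight/obese, 0 = not
    weights = np.asarray(weights, dtype=float)  # design weights (inverse selection prob.)
    return np.sum(weights * flags) / np.sum(weights)

# Hypothetical respondents: unweighted and weighted prevalence can differ noticeably
# when heavier design weights fall disproportionately on one outcome group.
flags   = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0])
weights = np.array([1200, 800, 950, 2100, 700, 1800, 900, 850, 1600, 750])
print(flags.mean(), weighted_prevalence(flags, weights))
```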

  8. Prevalence of overweight and obesity and some associated factors among adult residents of northeast China: a cross-sectional study.

    PubMed

    Wang, Rui; Zhang, Peng; Gao, Chunshi; Li, Zhijun; Lv, Xin; Song, Yuanyuan; Yu, Yaqin; Li, Bo

    2016-07-25

    This study aims to estimate the prevalence of overweight and obesity and determine potential influencing factors among adults in northeast China. A cross-sectional survey was conducted in Jilin Province, northeast China, in 2012. A total of 9873 men and 10 966 women aged 18-79 years from the general population were included using a multistage stratified random cluster sampling design. Data were obtained from face-to-face interviews and physical examination. After being weighted according to the complex sampling scheme, the sample was used to estimate the prevalence of overweight (body mass index (BMI) 24-27.9 kg/m(2)) and obesity (BMI >28 kg/m(2)) in Jilin Province and to analyse influencing factors using statistical methods appropriate to the complex sampling design. The overall prevalence of overweight was 32.3% (male 34.3%; female 30.2%), and the prevalence of obesity was 14.6% (male 16.3%; female 12.8%) in Jilin Province. The prevalences of both overweight and obesity were higher in men than in women (p<0.001). Influencing factors included sex, age, marriage status, occupation, smoking, drinking, diet and hours of sleep (p<0.05). This study found that the prevalence of overweight and obesity among adult residents of Jilin Province, northeast China, was high. The results of this study will be submitted to the Health Department of Jilin Province and other relevant departments as a reference, which should inform policy makers in developing education and publicity to prevent and control the occurrence of overweight and obesity. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  9. Improving validity of informed consent for biomedical research in Zambia using a laboratory exposure intervention.

    PubMed

    Zulu, Joseph Mumba; Lisulo, Mpala Mwanza; Besa, Ellen; Kaonga, Patrick; Chisenga, Caroline C; Chomba, Mumba; Simuyandi, Michelo; Banda, Rosemary; Kelly, Paul

    2014-01-01

    Complex biomedical research can lead to disquiet in communities with limited exposure to scientific discussions, leading to rumours or to high drop-out rates. We set out to test an intervention designed to address apprehensions commonly encountered in a community where literacy is uncommon, and where complex biomedical research has been conducted for over a decade. We aimed to determine if it could improve the validity of consent. Data were collected using focus group discussions, key informant interviews and observations. We designed an intervention that exposed participants to a detailed demonstration of laboratory processes. Each group was interviewed twice in a day, before and after exposure to the intervention, in order to assess changes in their views. Factors that motivated people to participate in invasive biomedical research included a desire to stay healthy because of the screening during the recruitment process, regular advice from doctors, free medical services, and trust in the researchers. Inhibiting factors were limited knowledge about samples taken from their bodies during endoscopic procedures, concerns about the impact of endoscopy on the function of internal organs, and concerns about the use of biomedical samples. The belief that blood can be used for Satanic practices also created insecurities about the drawing of blood samples. Further inhibiting factors included a fear of being labelled as HIV positive if known to consult health workers repeatedly, and gender inequality. Concerns about the use and storage of blood and tissue samples were overcome by the laboratory exposure intervention. Selecting a group of members from the target community and engaging them in a laboratory exposure intervention could be a useful tool for enhancing specific aspects of consent for biomedical research. Further work is needed to determine the extent to which improved understanding permeates beyond the immediate group participating in the intervention.

  10. In-Situ Analysis System for Correlated Electron Heterostructures

    DTIC Science & Technology

    2014-11-20

    ...semiconductor materials and elemental metals. Specifically, films must be pristine and ideally remain intact during the analytical procedure [1]. In addition, ... involves a rather complex engineering design described below. A laser heater (fiber-coupled, high-power 808 nm diode laser JOLD-100-CPXF-2P, Jenoptik) is free of such limitations because ... [Figure 1. (a) An empty Neocera sample holder rack ... the center of the analytical chamber.]

  11. Dragonfly: Investigating the Surface Composition of Titan

    NASA Technical Reports Server (NTRS)

    Brinckerhoff, W. B.; Lawrence, D. J.; Barnes, J. W.; Lorenz, R. D.; Horst, S. M.; Zacny, K.; Freissinet, C.; Parsons, A. M.; Turtle, E. P.; Trainer, M. G.; hide

    2018-01-01

    Dragonfly is a rotorcraft lander mission, selected as a finalist in NASA's New Frontiers Program, that is designed to sample materials and determine the surface composition in different geologic settings on Titan. This revolutionary mission concept would explore diverse locations to characterize the habitability of Titan's environment, to investigate how far prebiotic chemistry has progressed, and to search for chemical signatures that could be indicative of water-based and/or hydrocarbon-based life. Here we describe Dragonfly's capabilities to determine the composition of a variety of surface units on Titan, from elemental components to complex organic molecules. The compositional investigation includes characterization of local surface environments and finely sampled materials. The Dragonfly flexible sampling approach can robustly accommodate materials from Titan's most intriguing surface environments.

  12. Regression and Data Mining Methods for Analyses of Multiple Rare Variants in the Genetic Analysis Workshop 17 Mini-Exome Data

    PubMed Central

    Bailey-Wilson, Joan E.; Brennan, Jennifer S.; Bull, Shelley B; Culverhouse, Robert; Kim, Yoonhee; Jiang, Yuan; Jung, Jeesun; Li, Qing; Lamina, Claudia; Liu, Ying; Mägi, Reedik; Niu, Yue S.; Simpson, Claire L.; Wang, Libo; Yilmaz, Yildiz E.; Zhang, Heping; Zhang, Zhaogong

    2012-01-01

    Group 14 of Genetic Analysis Workshop 17 examined several issues related to analysis of complex traits using DNA sequence data. These issues included novel methods for analyzing rare genetic variants in an aggregated manner (often termed collapsing rare variants), evaluation of various study designs to increase power to detect effects of rare variants, and the use of machine learning approaches to model highly complex heterogeneous traits. Various published and novel methods for analyzing traits with extreme locus and allelic heterogeneity were applied to the simulated quantitative and disease phenotypes. Overall, we conclude that power is (as expected) dependent on locus-specific heritability or contribution to disease risk, large samples will be required to detect rare causal variants with small effect sizes, extreme phenotype sampling designs may increase power for smaller laboratory costs, methods that allow joint analysis of multiple variants per gene or pathway are more powerful in general than analyses of individual rare variants, population-specific analyses can be optimal when different subpopulations harbor private causal mutations, and machine learning methods may be useful for selecting subsets of predictors for follow-up in the presence of extreme locus heterogeneity and large numbers of potential predictors. PMID:22128066

  13. A methodology for system-of-systems design in support of the engineering team

    NASA Astrophysics Data System (ADS)

    Ridolfi, G.; Mooij, E.; Cardile, D.; Corpino, S.; Ferrari, G.

    2012-04-01

    Space missions have experienced a trend of increasing complexity in the last decades, resulting in the design of very complex systems formed by many elements and sub-elements working together to meet the requirements. In a classical approach, especially in a company environment, the two steps of design-space exploration and optimization are usually performed by experts inferring on major phenomena, making assumptions and doing some trial-and-error runs on the available mathematical models. This is done especially in the very early design phases where most of the costs are locked-in. With the objective of supporting the engineering team and the decision-makers during the design of complex systems, the authors developed a modelling framework for a particular category of complex, coupled space systems called System-of-Systems. Once modelled, the System-of-Systems is solved using a computationally cheap parametric methodology, named the mixed-hypercube approach, based on the utilization of a particular type of fractional factorial design-of-experiments, and analysis of the results via global sensitivity analysis and response surfaces. As an applicative example, a system-of-systems of a hypothetical human space exploration scenario for the support of a manned lunar base is presented. The results demonstrate that using the mixed-hypercube to sample the design space, an optimal solution is reached with a limited computational effort, providing support to the engineering team and decision makers thanks to sensitivity and robustness information. The analysis of the system-of-systems model that was implemented shows that the logistic support of a human outpost on the Moon for 15 years is still feasible with currently available launcher classes. The results presented in this paper have been obtained in cooperation with Thales Alenia Space—Italy, in the framework of a regional programme called STEPS. STEPS—Sistemi e Tecnologie per l'EsPlorazione Spaziale is a research project co-financed by Piedmont Region and firms and universities of the Piedmont Aerospace District in the ambit of the P.O.R-F.E.S.R. 2007-2013 program.
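
    As a generic illustration of space-filling design-of-experiments sampling (a Latin hypercube rather than the authors' specific mixed-hypercube fractional-factorial scheme), the sketch below generates candidate design points over a box-bounded space; the lunar-base logistics variables and their ranges are hypothetical.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, seed=0):
    """Space-filling Latin hypercube sample over a box-bounded design space.

    bounds : list of (low, high) pairs, one per design variable. Each variable's
    range is split into n_samples equal strata; one point is drawn per stratum
    and the strata are randomly paired across variables.
    """
    rng = np.random.default_rng(seed)
    d = len(bounds)
    strata = np.column_stack([rng.permutation(n_samples) for _ in range(d)])
    u = (strata + rng.uniform(size=(n_samples, d))) / n_samples  # jitter within strata
    lows = np.array([b[0] for b in bounds], dtype=float)
    highs = np.array([b[1] for b in bounds], dtype=float)
    return lows + u * (highs - lows)

# Hypothetical lunar-base logistics variables: crew size, resupply interval (days),
# and lander payload capacity (kg).
points = latin_hypercube(8, [(2, 6), (90, 360), (500, 2000)])
print(np.round(points, 1))
```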

  14. Surface modified capillary electrophoresis combined with in solution isoelectric focusing and MALDI-TOF/TOF MS: a gel-free multidimensional electrophoresis approach for proteomic profiling--exemplified on human follicular fluid.

    PubMed

    Hanrieder, Jörg; Zuberovic, Aida; Bergquist, Jonas

    2009-04-24

    Development of miniaturized analytical tools continues to be of great interest to face the challenges in proteomic analysis of complex biological samples such as human body fluids. In the light of these challenges, special emphasis is put on the speed and simplicity of newly designed technological approaches as well as the need for cost efficiency and low sample consumption. In this study, we present an alternative multidimensional bottom-up approach for proteomic profiling for fast, efficient and sensitive protein analysis in complex biological matrices. The presented setup was based on sample pre-fractionation using microscale in solution isoelectric focusing (IEF) followed by tryptic digestion and subsequent capillary electrophoresis (CE) coupled off-line to matrix assisted laser desorption/ionization time of flight tandem mass spectrometry (MALDI TOF MS/MS). For high performance CE-separation, PolyE-323 modified capillaries were applied to minimize analyte-wall interactions. The potential of the analytical setup was demonstrated on human follicular fluid (hFF) representing a typical complex human body fluid with clinical implication. The obtained results show significant identification of 73 unique proteins (identified at 95% significance level), including mostly acute phase proteins but also protein identities that are well known to be extensively involved in follicular development.

  15. ATMAD: robust image analysis for Automatic Tissue MicroArray De-arraying.

    PubMed

    Nguyen, Hoai Nam; Paveau, Vincent; Cauchois, Cyril; Kervrann, Charles

    2018-04-19

    Over the last two decades, an innovative technology called Tissue Microarray (TMA), which combines multi-tissue and DNA microarray concepts, has been widely used in the field of histology. It consists of a collection of several (up to 1000 or more) tissue samples that are assembled onto a single support - typically a glass slide - according to a design grid (array) layout, in order to allow multiplex analysis by treating numerous samples under identical and standardized conditions. However, during the TMA manufacturing process, the sample positions can be highly distorted from the design grid due to the imprecision when assembling tissue samples and the deformation of the embedding waxes. Consequently, these distortions may lead to severe errors of (histological) assay results when the sample identities are mismatched between the design and its manufactured output. The development of a robust method for de-arraying TMA, which localizes and matches TMA samples with their design grid, is therefore crucial to overcome the bottleneck of this prominent technology. In this paper, we propose an Automatic, fast and robust TMA De-arraying (ATMAD) approach dedicated to images acquired with brightfield and fluorescence microscopes (or scanners). First, tissue samples are localized in the large image by applying a locally adaptive thresholding on the isotropic wavelet transform of the input TMA image. To reduce false detections, a parametric shape model is considered for segmenting ellipse-shaped objects at each detected position. Segmented objects that do not meet the size and the roundness criteria are discarded from the list of tissue samples before being matched with the design grid. Sample matching is performed by estimating the TMA grid deformation under the thin-plate model. Finally, thanks to the estimated deformation, the true tissue samples that were preliminarily rejected in the early image processing step are recognized by running a second segmentation step. We developed a novel de-arraying approach for TMA analysis. By combining wavelet-based detection, active contour segmentation, and thin-plate spline interpolation, our approach is able to handle TMA images with high dynamic, poor signal-to-noise ratio, complex background and non-linear deformation of TMA grid. In addition, the deformation estimation produces quantitative information to assess the manufacturing quality of TMAs.

  16. Automated Protist Analysis of Complex Samples: Recent Investigations Using Motion and Thresholding

    DTIC Science & Technology

    2012-01-01

    Report No. CG-D-15-13. Approved for public release; distribution is unlimited. B. Nelson, et al., U.S. Coast Guard Research & Development Center, Chelsea Street, New London, CT 06320, January 2012.

  17. MAP: an iterative experimental design methodology for the optimization of catalytic search space structure modeling.

    PubMed

    Baumes, Laurent A

    2006-01-01

    One of the main problems in high-throughput research for materials is still the design of experiments. At early stages of discovery programs, purely exploratory methodologies coupled with fast screening tools should be employed. This should lead to opportunities to find unexpected catalytic results and to identify the "groups" of catalyst outputs, providing well-defined boundaries for future optimizations. However, very few recent papers deal with strategies that guide exploratory studies. Mostly, traditional designs, homogeneous coverings, or simple random samplings are exploited. Typical catalytic output distributions exhibit unbalanced datasets for which efficient learning is hard to carry out, and interesting but rare classes usually go unrecognized. A new iterative algorithm is suggested here for characterizing the search space structure, working independently of the learning processes. It enhances recognition rates by transferring catalysts to be screened from "performance-stable" zones of the space to "unsteady" ones, which need more experiments to be well modeled. Because newly proposed algorithms lack prior evidence of their efficiency, evaluation through benchmarks is essential. The method is detailed and thoroughly tested with mathematical functions exhibiting different levels of complexity. The strategy is not only evaluated empirically; the effect of the sampling on subsequent machine learning performance is also quantified. The minimum sample size required by the algorithm to be statistically distinguishable from simple random sampling is investigated.
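
    The core idea of reallocating experiments from "performance-stable" zones to "unsteady" ones can be illustrated with a deliberately simplified sketch. The zone partition, the use of within-zone output spread as the instability criterion, and all numbers are hypothetical stand-ins for the paper's space-structure characterization, not a reproduction of the MAP algorithm.

```python
import numpy as np

def allocate_next_batch(x, y, n_zones=5, batch=20, rng=None):
    """Spend the next experiments preferentially in zones of a 1-D search
    space whose screened outputs look least 'stable' (largest spread)."""
    rng = np.random.default_rng() if rng is None else rng
    edges = np.linspace(x.min(), x.max(), n_zones + 1)
    zone = np.clip(np.digitize(x, edges) - 1, 0, n_zones - 1)
    spread = np.array([y[zone == z].std() if np.any(zone == z) else y.std()
                       for z in range(n_zones)])
    weights = spread / spread.sum()
    counts = rng.multinomial(batch, weights)       # more samples go to unsteady zones
    return np.concatenate([rng.uniform(edges[z], edges[z + 1], c)
                           for z, c in enumerate(counts)])

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x0 = rng.uniform(0, 10, 60)
    y0 = np.where(x0 < 7, 0.1, np.sin(5 * x0)) + rng.normal(0, 0.05, 60)  # flat region, then a bumpy one
    print(np.sort(allocate_next_batch(x0, y0, rng=rng)).round(2))
```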

  18. Evaluation of passive samplers for the collection of dissolved organic matter in streams.

    PubMed

    Warner, Daniel L; Oviedo-Vargas, Diana; Royer, Todd V

    2015-01-01

    Traditional sampling methods for dissolved organic matter (DOM) in streams limit opportunities for long-term studies due to time and cost constraints. Passive DOM samplers were constructed following a design proposed previously which utilizes diethylaminoethyl (DEAE) cellulose as a sampling medium, and they were deployed throughout a temperate stream network in Indiana. Two deployments of the passive samplers were conducted, during which grab samples were frequently collected for comparison. Differences in DOM quality between sites and sampling methods were assessed using several common optical analyses. The analyses revealed significant differences in optical properties between sampling methods, with the passive samplers preferentially collecting terrestrial, humic-like DOM. We assert that the differences in DOM composition from each sampling method were caused by preferential binding of complex humic compounds to the DEAE cellulose in the passive samplers. Nonetheless, the passive samplers may provide a cost-effective, integrated sample of DOM in situations where the bulk DOM pool is composed mainly of terrestrial, humic-like compounds.

  19. Using hydrogels in microscopy: A tutorial.

    PubMed

    Flood, Peter; Page, Henry; Reynaud, Emmanuel G

    2016-05-01

    Sample preparation for microscopy is a crucial step to ensure the best experimental outcome. It often requires the use of specific mounting media that have to be tailored not just to the sample but to the chosen microscopy technique. The media must not damage the sample or impair the optical path, and may also have to support the correct physiological function/development of the sample. For decades, researchers have used embedding media such as hydrogels to maintain samples in place. Their ease of use and transparency have promoted them as mainstream mounting media. However, they are not as straightforward to implement as assumed. They can contain contaminants, generate forces on the sample, have complex diffusion and structural properties that are influenced by multiple factors, and are generally not designed with microscopy in mind. This short review will discuss the advantages and disadvantages of using hydrogels for microscopy sample preparation and highlight some of the less obvious problems associated with the area. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Computer generated hologram from point cloud using graphics processor.

    PubMed

    Chen, Rick H-Y; Wilkinson, Timothy D

    2009-12-20

    Computer generated holography is an extremely demanding and complex task when it comes to providing realistic reconstructions with full parallax, occlusion, and shadowing. We present an algorithm designed for data-parallel computing on modern graphics processing units to alleviate the computational burden. We apply Gaussian interpolation to create a continuous surface representation from discrete input object points. The algorithm maintains a potential occluder list for each individual hologram plane sample to keep the number of visibility tests to a minimum. We experimented with two approximations that simplify and accelerate occlusion computation. It is observed that letting several neighboring hologram plane samples share visibility information on object points leads to significantly faster computation without causing noticeable artifacts in the reconstructed images. Computing a reduced sample set via nonuniform sampling is also found to be an effective acceleration technique.
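
    The Gaussian interpolation step, turning discrete object points into a continuous surface, can be sketched in a few lines. This is an illustrative CPU version with arbitrary parameters, not the GPU kernel described in the paper, and the occlusion bookkeeping is omitted.

```python
import numpy as np

def gaussian_interpolate(points_xy, values, query_xy, sigma=0.5):
    """Continuous surface from discrete points: each query location receives a
    Gaussian-weighted average of the point-cloud attribute (here, depth)."""
    d2 = ((query_xy[:, None, :] - points_xy[None, :, :]) ** 2).sum(-1)   # (Q, P) squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w * values[None, :]).sum(-1) / np.maximum(w.sum(-1), 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    pts = rng.uniform(-1, 1, (200, 2))                 # object points (x, y)
    depth = np.sin(pts[:, 0]) * np.cos(pts[:, 1])      # attribute to interpolate
    grid = np.stack(np.meshgrid(np.linspace(-1, 1, 32),
                                np.linspace(-1, 1, 32)), -1).reshape(-1, 2)
    z = gaussian_interpolate(pts, depth, grid)
    print(z.shape, round(float(z.min()), 3), round(float(z.max()), 3))
```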

  1. Distributed Space Mission Design for Earth Observation Using Model-Based Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; LeMoigne-Stewart, Jacqueline; Cervantes, Ben; DeWeck, Oliver

    2015-01-01

    Distributed Space Missions (DSMs) are gaining momentum in their application to earth observation missions owing to their unique ability to increase observation sampling in multiple dimensions. DSM design is a complex problem with many design variables, multiple objectives determining performance and cost, and emergent, often unexpected, behaviors. There are very few open-access tools available to explore the tradespace of variables, minimize cost and maximize performance for pre-defined science goals, and thereby select the optimal design. This paper presents a software tool that can generate multiple DSM architectures based on pre-defined design variable ranges and size those architectures in terms of predefined science and cost metrics. The tool will help a user select Pareto optimal DSM designs based on design of experiments techniques. The tool will be applied to some earth observation examples to demonstrate its applicability in making some key decisions between different performance metrics and cost metrics early in the design lifecycle.
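
    The abstract does not publish the tool's models, but the workflow it describes, enumerating architectures over design variable ranges, sizing them with science and cost metrics, and keeping the Pareto-optimal ones, can be sketched generically. The variable names and the toy metric formulas below are hypothetical illustrations only.

```python
import itertools

ALTITUDE_KM = [400, 600, 800]      # hypothetical design variable ranges
N_SATS      = [2, 4, 8]
SWATH_KM    = [50, 100]

def score(alt, n, swath):
    coverage = n * swath * (alt / 400.0)              # toy "science" metric (higher is better)
    cost     = 20.0 * n + 0.05 * alt + 0.2 * swath    # toy cost metric (lower is better)
    return coverage, cost

def pareto(designs):
    """Keep designs not dominated in (maximize coverage, minimize cost)."""
    def dominates(a, b):
        return (a["coverage"] >= b["coverage"] and a["cost"] <= b["cost"]
                and (a["coverage"] > b["coverage"] or a["cost"] < b["cost"]))
    return [d for d in designs if not any(dominates(o, d) for o in designs)]

designs = []
for alt, n, swath in itertools.product(ALTITUDE_KM, N_SATS, SWATH_KM):
    cov, cost = score(alt, n, swath)
    designs.append(dict(alt=alt, n=n, swath=swath, coverage=cov, cost=cost))
for d in pareto(designs):
    print(d)
```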

  2. Leaf LIMS: A Flexible Laboratory Information Management System with a Synthetic Biology Focus.

    PubMed

    Craig, Thomas; Holland, Richard; D'Amore, Rosalinda; Johnson, James R; McCue, Hannah V; West, Anthony; Zulkower, Valentin; Tekotte, Hille; Cai, Yizhi; Swan, Daniel; Davey, Robert P; Hertz-Fowler, Christiane; Hall, Anthony; Caddick, Mark

    2017-12-15

    This paper presents Leaf LIMS, a flexible laboratory information management system (LIMS) designed to address the complexity of synthetic biology workflows. At the project's inception there was a lack of a LIMS designed specifically to address synthetic biology processes, with most systems focused on either next-generation sequencing or biobanks and clinical sample handling. Leaf LIMS implements integrated project, item, and laboratory stock tracking, offering complete sample and construct genealogy, materials and lot tracking, and modular assay data capture. Hence, it enables highly configurable task-based workflows and supports data capture from project inception to completion. As such, in addition to supporting synthetic biology, it is ideal for many laboratory environments with multiple projects and users. The system is deployed as a web application through Docker and is provided under a permissive MIT license. It is freely available for download at https://leaflims.github.io.

  3. IFSA: a microfluidic chip-platform for frit-based immunoassay protocols

    NASA Astrophysics Data System (ADS)

    Hlawatsch, Nadine; Bangert, Michael; Miethe, Peter; Becker, Holger; Gärtner, Claudia

    2013-03-01

    Point-of-care diagnostics (POC) is one of the key application fields for lab-on-a-chip devices. While in recent years much of the work has concentrated on integrating complex molecular diagnostic assays onto a microfluidic device, there is a need to also put comparatively simple immunoassay-type protocols on a microfluidic platform. In this paper, we present the development of a microfluidic cartridge using an immunofiltration approach. In this method, the sandwich immunoassay takes place in a porous frit on which the antibodies have been immobilized. The device is designed to handle three samples in parallel and up to four analytical targets per sample. In order to meet the critical cost targets for the diagnostic market, the microfluidic chip has been designed and manufactured with high-volume manufacturing technologies in mind. Validation experiments show sensitivities comparable with those of conventional immunofiltration kits.

  4. Optimized Projection Matrix for Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Xu, Jianping; Pi, Yiming; Cao, Zongjie

    2010-12-01

    Compressive sensing (CS) is mainly concerned with low-coherence pairs, since the number of samples needed to recover the signal is proportional to the mutual coherence between the projection matrix and the sparsifying matrix. Until now, papers on CS have generally assumed the projection matrix to be a random matrix. In this paper, aiming at minimizing the mutual coherence, a method is proposed to optimize the projection matrix. This method is based on equiangular tight frame (ETF) design, because an ETF has minimum coherence. It is impossible to solve the problem exactly because of its complexity; therefore, an alternating-minimization-type method is used to find a feasible solution. The optimally designed projection matrix can further reduce the necessary number of samples for recovery or improve the recovery accuracy. The proposed method demonstrates better performance than conventional optimization methods, which brings benefits to both basis pursuit and orthogonal matching pursuit.
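
    As a point of reference for the quantity being minimized, the sketch below computes the mutual coherence of the equivalent dictionary Φ·Ψ and applies a crude Gram-matrix shrinkage iteration in the spirit of ETF-based designs. It is an illustration of the general idea only, with arbitrary sizes and threshold, not the alternating minimization algorithm of the paper.

```python
import numpy as np

def mutual_coherence(Phi, Psi):
    """Largest normalized inner product between distinct columns of D = Phi @ Psi."""
    D = Phi @ Psi
    D = D / np.linalg.norm(D, axis=0, keepdims=True)
    G = np.abs(D.T @ D)
    np.fill_diagonal(G, 0.0)
    return G.max()

def shrink_step(Phi, Psi, t=0.3):
    """One crude iteration: clip large off-diagonal Gram entries toward t,
    take a rank-m square root of the shrunk Gram, and map back to Phi."""
    D = Phi @ Psi
    D = D / np.linalg.norm(D, axis=0, keepdims=True)
    G = np.clip(D.T @ D, -t, t)
    np.fill_diagonal(G, 1.0)
    w, V = np.linalg.eigh(G)
    m = Phi.shape[0]
    Dm = (V[:, -m:] * np.sqrt(np.maximum(w[-m:], 0.0))).T     # (m, N) rank-m factor
    return Dm @ np.linalg.pinv(Psi)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    m, n = 20, 80
    Psi = np.eye(n)                    # identity sparsifying basis for the demo
    Phi = rng.normal(size=(m, n))
    print("random projection, mu =", round(mutual_coherence(Phi, Psi), 3))
    for _ in range(30):
        Phi = shrink_step(Phi, Psi)
    print("after shrinkage,   mu =", round(mutual_coherence(Phi, Psi), 3))
```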

  5. The Importance of Experimental Design, Quality Assurance, and Control in Plant Metabolomics Experiments.

    PubMed

    Martins, Marina C M; Caldana, Camila; Wolf, Lucia Daniela; de Abreu, Luis Guilherme Furlan

    2018-01-01

    The output of metabolomics relies to a great extent upon the methods and instrumentation to identify, quantify, and access spatial information on as many metabolites as possible. However, the most modern machines and sophisticated tools for data analysis cannot compensate for inappropriate harvesting and/or sample preparation procedures that modify metabolic composition and can lead to erroneous interpretation of results. In addition, plant metabolism has a remarkable degree of complexity, and the number of identified compounds easily surpasses the number of samples in metabolomics analyses, increasing false discovery risk. These aspects pose a large challenge when carrying out plant metabolomics experiments. In this chapter, we address the importance of a proper experimental design taking into consideration preventable complications and unavoidable factors to achieve success in metabolomics analysis. We also focus on quality control and standardized procedures during the metabolomics workflow.

  6. [Satisfaction and perceived quality of people insured by the Social Health Protection in Mexico. Methodological foundations].

    PubMed

    Saturno-Hernández, Pedro J; Gutiérrez-Reyes, Juan Pablo; Vieyra-Romero, Waldo Ivan; Romero-Martínez, Martín; O'Shea-Cuevas, Gabriel Jaime; Lozano-Herrera, Javier; Tavera-Martínez, Sonia; Hernández-Ávila, Mauricio

    2016-01-01

    To describe the conceptual framework and methods for implementation and analysis of the satisfaction survey of the Mexican System for Social Protection in Health. We analyze the methodological elements of the 2013, 2014 and 2015 surveys, including the instrument, sampling method and study design, conceptual framework, and the characteristics and indicators of the analysis. The survey captures information on perceived quality and satisfaction. The sampling has national and state-level representation. Simple and composite indicators (an index of satisfaction and a rate of reported quality problems) are built and described. The analysis is completed using Pareto diagrams, correlation between indicators, and association with satisfaction by means of multivariate models. The measurement of satisfaction and perceived quality is a complex but necessary process to comply with regulations and to identify strategies for improvement. The described survey presents a rigorous design and analysis focused on its utility for driving improvement.

  7. Mitochondrial dysfunction in the gastrointestinal mucosa of children with autism: A blinded case-control study

    PubMed Central

    Rose, Shannon; Bennuri, Sirish C.; Murray, Katherine F.; Buie, Timothy; Winter, Harland

    2017-01-01

    Gastrointestinal (GI) symptoms are prevalent in autism spectrum disorder (ASD) but the pathophysiology is poorly understood. Imbalances in the enteric microbiome have been associated with ASD and can cause GI dysfunction, potentially through disruption of mitochondrial function, as microbiome metabolites modulate mitochondrial function and mitochondrial dysfunction is highly associated with GI symptoms. In this study, we compared mitochondrial function in rectal and cecum biopsies under the assumption that certain microbiome metabolites, such as butyrate and propionic acid, are more abundant in the cecum as compared to the rectum. Rectal and cecum mucosal biopsies were collected during elective diagnostic colonoscopy. Using a single-blind case-control design, complex I and IV and citrate synthase activities and complex I-V protein quantity from 10 children with ASD, 10 children with Crohn’s disease and 10 neurotypical children with nonspecific GI complaints were measured. Protein levels for all complexes, except complex II, were significantly higher in the cecum relative to the rectum in ASD samples as compared to the other groups. For both rectal and cecum biopsies, ASD samples demonstrated higher complex I activity, but not complex IV or citrate synthase activity, compared to other groups. Mitochondrial function in the gut mucosa from children with ASD was found to be significantly different from that of other groups who manifested similar GI symptomatology, suggesting a unique pathophysiology for GI symptoms in children with ASD. Abnormalities localized to the cecum suggest a role for imbalances in the microbiome, potentially in the production of butyrate, in children with ASD. PMID:29028817

  8. Mass amplifying probe for sensitive fluorescence anisotropy detection of small molecules in complex biological samples.

    PubMed

    Cui, Liang; Zou, Yuan; Lin, Ninghang; Zhu, Zhi; Jenkins, Gareth; Yang, Chaoyong James

    2012-07-03

    Fluorescence anisotropy (FA) is a reliable and excellent choice for fluorescence sensing. One of the key factors influencing the FA value for any molecule is the molar mass of the molecule being measured. As a result, the FA method with functional nucleic acid aptamers has been limited to macromolecules such as proteins and is generally not applicable for the analysis of small molecules because their molecular masses are too small to produce observable FA value changes. We report here a molecular mass amplifying strategy to construct anisotropy aptamer probes for small molecules. The probe is designed in such a way that only when a target molecule binds to the probe does it activate its binding ability to an anisotropy amplifier (a high molecular mass molecule such as a protein), thus significantly increasing the molecular mass and FA value of the probe/target complex. Specifically, a mass amplifying probe (MAP) consists of a targeting aptamer domain against a target molecule and a molecular mass amplifying aptamer domain for the amplifier protein. The probe is initially rendered inactive by a small blocking strand partially complementary to both the target aptamer and the amplifier protein aptamer, so that the mass amplifying aptamer domain does not bind to the amplifier protein unless the probe has been activated by the target. In this way, we prepared two probes that each consist of a target aptamer (for ATP and cocaine, respectively), a thrombin (as the mass amplifier) aptamer, and a fluorophore. Both probes worked well against their corresponding small molecule targets, and the detection limits for ATP and cocaine were 0.5 μM and 0.8 μM, respectively. More importantly, because FA is less affected by environmental interferences, ATP in cell media and cocaine in urine were directly detected without any tedious sample pretreatment. Our results established that our molecular mass amplifying strategy can be used to design aptamer probes for rapid, sensitive, and selective detection of small molecules by means of FA in complex biological samples.

  9. Proximity Operations in Microgravity, a Robotic Solution for Maneuvering about an Asteroid Surface

    NASA Astrophysics Data System (ADS)

    Indyk, Stephen; Scheidt, David; Moses, Kenneth; Perry, Justin; Mike, Krystal

    Asteroids remain some of the most under-investigated bodies in the solar system. Additionally, there is a distinct lack of directly collected information. This is in part due to complex sampling and motion problems that must be overcome before more detailed missions can be formulated. The chief caveat lies in formulating a technique for precision operation in microgravity. Locomotion, in addition to sample collection, involves forces significantly greater than the gravitational force keeping a robot on the surface. The design of a system that can successfully maneuver over unfamiliar surfaces devoid of natural anchor points is an incredible challenge. This problem was investigated at the Johns Hopkins University Applied Physics Laboratory as part of the 2009 NASA Lunar and Planetary Academy. Examining the problem through a two-dimensional robotic simulation, a swarm robotics approach was applied. In its simplest form, this comprised three grappling robots and one sampling robot. Connected by tethers, the grappling robots traverse a plane and reposition the sampling robot by tensioning the tethers. This presentation provides information on the design of the robotic system, as well as gait analysis and future considerations for a three-dimensional system.

  10. Preconcentration of lead using solidification of floating organic drop and its determination by electrothermal atomic absorption spectrometry

    PubMed Central

    Chamsaz, Mahmoud; Akhoundzadeh, Jeiran; Arbab-zavar, Mohammad Hossein

    2012-01-01

    A simple microextraction method based on solidification of a floating organic drop (SFOD) was developed for preconcentration of lead prior to its determination by electrothermal atomic absorption spectrometry (ETAAS). Ammonium pyrrolidinedithiocarbamate (APDC) was used as the complexing agent, and the formed complex was extracted into 20 μL of 1-undecanol. The extracted complex was diluted with ethanol and injected into a graphite furnace. An orthogonal array design (OAD) with an OA16 (4⁵) matrix was employed to study the effects of different parameters, such as pH, APDC concentration, stirring rate, sample solution temperature and the exposure time, on the extraction efficiency. Under the optimized experimental conditions the limit of detection (based on 3s) and the enhancement factor were 0.058 μg L⁻¹ and 113, respectively. The relative standard deviation (RSD) for 8 replicate determinations of 1 μg L⁻¹ of Pb was 8.8%. The developed method was validated by the analysis of certified reference materials and was successfully applied to the determination of lead in water and infant formula base powder samples. PMID:25685441

  11. Protocol for the process evaluation of a complex intervention designed to increase the use of research in health policy and program organisations (the SPIRIT study).

    PubMed

    Haynes, Abby; Brennan, Sue; Carter, Stacy; O'Connor, Denise; Schneider, Carmen Huckel; Turner, Tari; Gallego, Gisselle

    2014-09-27

    Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in 'real world' organisational settings where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies. We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed. This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. Using 'insider' consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants' willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.

  12. Valuing Trial Designs from a Pharmaceutical Perspective Using Value-Based Pricing.

    PubMed

    Breeze, Penny; Brennan, Alan

    2015-11-01

    Our aim was to adapt the traditional framework for expected net benefit of sampling (ENBS) to be more compatible with drug development trials from the pharmaceutical perspective. We modify the traditional framework for conducting ENBS and assume that the price of the drug is conditional on the trial outcomes. We use a value-based pricing (VBP) criterion to determine price conditional on trial data using Bayesian updating of cost-effectiveness (CE) model parameters. We assume that there is a threshold price below which the company would not market the new intervention. We present a case study in which a phase III trial sample size and trial duration are varied. For each trial design, we sampled 10,000 trial outcomes and estimated VBP using a CE model. The expected commercial net benefit is calculated as the expected profits minus the trial costs. A clinical trial with shorter follow-up, and larger sample size, generated the greatest expected commercial net benefit. Increasing the duration of follow-up had a modest impact on profit forecasts. Expected net benefit of sampling can be adapted to value clinical trials in the pharmaceutical industry to optimise the expected commercial net benefit. However, the analyses can be very time consuming for complex CE models. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd.
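
    The core calculation described here, simulating trial outcomes, pricing conditionally on the observed effect, declining to launch below a threshold price, and subtracting trial costs, can be written down as a toy Monte Carlo. Every number, the linear pricing rule, and the cost model below are hypothetical placeholders, not the paper's CE model or VBP criterion.

```python
import numpy as np

def expected_commercial_net_benefit(n_patients, follow_up_years, n_sims=10_000,
                                    true_effect=0.10, sd=0.5, threshold_price=500.0,
                                    price_per_effect=8_000.0, market_size=200_000,
                                    cost_per_patient_year=3_000.0, seed=0):
    """Toy pharmaceutical-perspective ENBS: expected profit minus trial cost,
    where the value-based price depends on the simulated trial estimate."""
    rng = np.random.default_rng(seed)
    se = sd / np.sqrt(n_patients)                        # standard error of the effect estimate
    observed = rng.normal(true_effect, se, n_sims)       # simulated trial outcomes
    price = price_per_effect * observed                  # crude value-based price
    profit = np.where(price >= threshold_price, price * market_size, 0.0)
    trial_cost = n_patients * follow_up_years * cost_per_patient_year
    return profit.mean() - trial_cost

if __name__ == "__main__":
    for n, yrs in [(500, 1), (500, 3), (2000, 1), (2000, 3)]:
        enb = expected_commercial_net_benefit(n, yrs)
        print(f"n={n:5d}, follow-up={yrs} y -> expected commercial net benefit = {enb:,.0f}")
```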

  13. Stochastic sampled-data control for synchronization of complex dynamical networks with control packet loss and additive time-varying delays.

    PubMed

    Rakkiyappan, R; Sakthivel, N; Cao, Jinde

    2015-06-01

    This study examines the exponential synchronization of complex dynamical networks with control packet loss and additive time-varying delays. Additionally, a sampled-data controller with a time-varying sampling period is considered and is assumed to switch among m different values in a random way with given probabilities. Then, a novel Lyapunov-Krasovskii functional (LKF) with triple integral terms is constructed, and by using Jensen's inequality and the reciprocally convex approach, sufficient conditions under which the dynamical network is exponentially mean-square stable are derived. When applying Jensen's inequality to partition double integral terms in the derivation of linear matrix inequality (LMI) conditions, a new kind of linear combination of positive functions weighted by the inverses of squared convex parameters appears. In order to handle such a combination, an effective method is introduced by extending the lower bound lemma. To design the sampled-data controller, the synchronization error system is represented as a switched system. Based on the derived LMI conditions and the average dwell-time method, sufficient conditions for the synchronization of the switched error system are derived in terms of LMIs. Finally, a numerical example is employed to show the effectiveness of the proposed methods. Copyright © 2015 Elsevier Ltd. All rights reserved.
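
    For orientation, a triple-integral Lyapunov-Krasovskii term of the kind referred to above commonly takes the generic form shown below (an illustrative textbook form with delay bound h and positive definite weight R; the paper's actual functional, which also includes single- and double-integral terms and the switching structure, is not reproduced here):

```latex
V_3(t) \;=\; \frac{h^{2}}{2}\int_{-h}^{0}\!\int_{\theta}^{0}\!\int_{t+\lambda}^{t}
\dot{x}^{\top}(s)\, R\, \dot{x}(s)\,\mathrm{d}s\,\mathrm{d}\lambda\,\mathrm{d}\theta,
\qquad R \succ 0 .
```

    Differentiating such a term along the trajectories produces double-integral terms that must then be bounded with Jensen-type inequalities, which is where the extended lower bound lemma mentioned in the abstract enters.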

  14. A shock tube with a high-repetition-rate time-of-flight mass spectrometer for investigations of complex reaction systems

    NASA Astrophysics Data System (ADS)

    Dürrstein, Steffen H.; Aghsaee, Mohammad; Jerig, Ludger; Fikri, Mustapha; Schulz, Christof

    2011-08-01

    A conventional membrane-type stainless steel shock tube has been coupled to a high-repetition-rate time-of-flight mass spectrometer (HRR-TOF-MS) to study complex reaction systems such as the formation of pollutants in combustion processes or the formation of nanoparticles from metal-containing organic compounds. In contrast to other TOF-MS shock tubes, our instrument is equipped with a modular sampling unit that allows sampling with or without a skimmer. The skimmer unit can be mounted or removed in less than 10 min. Thus, it is possible to adjust the sampling procedure, namely the mass flux into the ionization chamber of the HRR-TOF-MS, to the experimental situation imposed by species-specific ionization cross sections and vapor pressures. The whole sampling section was optimized with respect to a minimal distance between the nozzle tip inside the shock tube and the ion source inside the TOF-MS. The design of the apparatus is presented, and the influence of the skimmer on the measured spectra is demonstrated by comparing data from both operation modes for conditions typical of chemical kinetics experiments. The well-studied thermal decomposition of acetylene has been used as a test system to validate the new setup against kinetics mechanisms reported in the literature.

  15. Design of a Single Channel Modulated Wideband Converter for Wideband Spectrum Sensing: Theory, Architecture and Hardware Implementation

    PubMed Central

    Liu, Weisong; Huang, Zhitao; Wang, Xiang; Sun, Weichao

    2017-01-01

    In a cognitive radio sensor network (CRSN), wideband spectrum sensing devices, which aim to exploit temporarily vacant spectrum intervals as soon as possible, are of great importance. However, the challenge of increasingly high signal frequency and wide bandwidth requires an extremely high sampling rate, which may exceed the front-end bandwidth of today’s best analog-to-digital converters (ADCs). Recently, a newly proposed architecture called the modulated wideband converter (MWC) has emerged as an attractive analog compressed sensing technique that can greatly reduce the sampling rate. However, the MWC has high hardware complexity owing to its parallel channel structure, especially when the number of signals increases. In this paper, we propose a single channel modulated wideband converter (SCMWC) scheme for spectrum sensing of band-limited wide-sense stationary (WSS) signals. With one antenna or sensor, this scheme can save not only sampling rate but also hardware complexity. We then present a new SCMWC-based single-node CR prototype system, on which the spectrum sensing algorithm was tested. Experiments on our hardware prototype show that the proposed architecture leads to successful spectrum sensing, with a total sampling rate, and a hardware size, equal to that of only one MWC channel. PMID:28471410
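
    The mixing-and-low-rate-sampling idea behind an MWC branch can be imitated digitally in a few lines. The sketch below multiplies a wideband input by a periodic ±1 chip waveform, low-pass filters, and decimates; the rates, filter order and sequence are illustrative choices and do not represent the SCMWC hardware or its reconstruction stage.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def mwc_branch(x, fs, out_rate, seed=0):
    """Simulated MWC-style branch: mix with a periodic +/-1 chip sequence,
    low-pass to the output bandwidth, then decimate to the low rate."""
    rng = np.random.default_rng(seed)
    period = int(round(fs / out_rate))                  # decimation factor
    chips = rng.choice([-1.0, 1.0], size=period)        # one chip per high-rate sample
    p = np.resize(chips, x.size)                        # periodic mixing waveform
    mixed = x * p
    lp = firwin(255, out_rate / 2.0, fs=fs)             # low-pass to +/- out_rate/2
    return lfilter(lp, 1.0, mixed)[::period]            # sub-Nyquist output stream

if __name__ == "__main__":
    fs = 1.0e6                                          # simulated "analog" rate
    t = np.arange(200_000) / fs
    x = np.cos(2 * np.pi * 180e3 * t) + 0.5 * np.cos(2 * np.pi * 330e3 * t)
    y = mwc_branch(x, fs, out_rate=20e3)
    print("high-rate samples:", x.size, "-> low-rate samples:", y.size)
```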

  16. FIDO - Video File

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Field Integrated Design and Operations (FIDO) rover is a prototype of the Mars Sample Return rovers that will carry the integrated Athena Science Payload to Mars in 2003 and 2005. The purpose of FIDO is to simulate, using Mars analog settings, the complex surface operations that will be necessary to find, characterize, obtain, cache, and return samples to the ascent vehicles on the landers. This videotape shows tests of FIDO in the Mojave Desert. These tests include drilling through rock and movement of the rover. Also included in this tape are interviews with Dr. Raymond Arvidson, the test director for FIDO, and Dr. Eric Baumgartner, Robotics Engineer at the Jet Propulsion Laboratory.

  17. Lessons in the Design and Characterization Testing of the Semi-Span Super-Sonic Transport (S4T) Wind-Tunnel Model

    NASA Technical Reports Server (NTRS)

    2012-01-01

    This paper focuses on some of the more challenging design processes and characterization tests of the Semi-Span Super-Sonic Transport (S4T)-Active Controls Testbed (ACT). The model was successfully tested in four entries in the National Aeronautics and Space Administration Langley Transonic Dynamics Tunnel to satisfy the goals and objectives of the Fundamental Aeronautics Program Supersonic Project Aero-Propulso-Servo-Elastic effort. Due to the complexity of the S4T-ACT, only a small sample of the technical challenges for designing and characterizing the model will be presented. Specifically, the challenges encountered in designing the model include scaling the Technology Concept Airplane to model scale, designing the model fuselage, aileron actuator, and engine pylons. Characterization tests included full model ground vibration tests, wing stiffness measurements, geometry measurements, proof load testing, and measurement of fuselage static and dynamic properties.

  18. Aerodynamic Shape Optimization Design of Wing-Body Configuration Using a Hybrid FFD-RBF Parameterization Approach

    NASA Astrophysics Data System (ADS)

    Liu, Yuefeng; Duan, Zhuoyi; Chen, Song

    2017-10-01

    Aerodynamic shape optimization design aimed at improving the efficiency of an aircraft has always been a challenging task, especially when the configuration is complex. In this paper, a hybrid FFD-RBF surface parameterization approach is proposed for designing a civil transport wing-body configuration. This approach is simple and efficient, with the FFD technique used for parameterizing the wing shape and the RBF interpolation approach used for handling the updating of the wing-body junction region. Furthermore, combined with the Cuckoo Search algorithm and a Kriging surrogate model with an expected improvement adaptive sampling criterion, an aerodynamic shape optimization design system has been established. Finally, the aerodynamic shape optimization design of the DLR F4 wing-body configuration has been carried out as a study case, and the results show that the proposed approach is effective.
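
    The expected improvement (EI) criterion used for adaptive sampling of the Kriging surrogate has a standard closed form, sketched below with scikit-learn's Gaussian process regressor standing in for the Kriging model. The toy one-dimensional objective and all settings are illustrative; the paper's CFD-based objective and optimizer coupling are not reproduced.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(X_cand, gp, y_best, xi=0.01):
    """EI for minimization: E[max(y_best - Y(x) - xi, 0)] under the GP posterior."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-12)
    z = (y_best - mu - xi) / sigma
    return (y_best - mu - xi) * norm.cdf(z) + sigma * norm.pdf(z)

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    f = lambda x: np.sin(3 * x) + 0.5 * x              # toy objective to minimize
    X = rng.uniform(0, 3, (8, 1))
    y = f(X).ravel()
    gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)
    Xc = np.linspace(0, 3, 200).reshape(-1, 1)
    ei = expected_improvement(Xc, gp, y.min())
    print("next sample suggested at x =", round(float(Xc[np.argmax(ei), 0]), 3))
```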

  19. Automation--down to the nuts and bolts.

    PubMed

    Fix, R J; Rowe, J M; McConnell, B C

    2000-01-01

    Laboratories that once viewed automation as an expensive luxury are now looking to automation as a solution to increase sample throughput, to help ensure data integrity and to improve laboratory safety. The question is no longer, 'Should we automate?', but 'How should we approach automation?' A laboratory may choose from three approaches when deciding to automate: (1) contract with a third party vendor to produce a turnkey system, (2) develop and fabricate the system in-house or (3) some combination of approaches (1) and (2). The best approach for a given laboratory depends upon its available resources. The first lesson to be learned in automation is that no matter how straightforward an idea appears in the beginning, the solution will not be realized until many complex problems have been resolved. Issues dealing with sample vessel manipulation, liquid handling and system control must be addressed before a final design can be developed. This requires expertise in engineering, electronics, programming and chemistry. Therefore, the team concept of automation should be employed to help ensure success. This presentation discusses the advantages and disadvantages of the three approaches to automation. The development of an automated sample handling and control system for the STAR System focused microwave will be used to illustrate the complexities encountered in a seemingly simple project, and to highlight the importance of the team concept to automation no matter which approach is taken. The STAR System focused microwave from CEM Corporation is an open vessel digestion system with six microwave cells. This system is used to prepare samples for trace metal determination. The automated sample handling was developed around a XYZ motorized gantry system. Grippers were specially designed to perform several different functions and to provide feedback to the control software. Software was written in Visual Basic 5.0 to control the movement of the samples and the operation and monitoring of the STAR microwave. This software also provides a continuous update of the system's status to the computer screen. The system provides unattended preparation of up to 59 samples per run.

  20. Design of an Intelligent Individual Evacuation Model for High-Rise Building Fires Based on Neural Network within the Scope of 3D GIS

    NASA Astrophysics Data System (ADS)

    Atila, U.; Karas, I. R.; Turan, M. K.; Rahman, A. A.

    2013-09-01

    Fire is, without doubt, one of the most dangerous disasters threatening the high-rise and complex buildings of today's world, which hold thousands of occupants. Considering the high population and the complexity of such buildings, performing a rapid and safe evacuation is clearly difficult, and experience with disasters such as the World Trade Center attacks of 9/11 has not been good. It is therefore very important to design knowledge-based, real-time, interactive evacuation methods instead of classical strategies, which lack flexibility. This paper presents a 3D-GIS implementation which simulates the behaviour of an intelligent indoor pedestrian navigation model proposed for the self-evacuation of a person in case of fire. The model is based on the Multilayer Perceptron (MLP), one of the most widely used artificial neural network architectures for classification and prediction problems. A sample fire scenario following predefined instructions has been performed on a 3D model of the Corporation Complex in Putrajaya (Malaysia), and the intelligent evacuation process has been realized within the proposed 3D-GIS-based simulation.
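
    The role of the MLP in such a model is essentially a supervised mapping from local state features to an evacuation decision. The sketch below trains a small scikit-learn MLP on synthetic data with hypothetical features (distances to two exits and smoke levels along each route) and a toy labeling rule; none of this is taken from the paper, whose feature set and network are defined on the Putrajaya building model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
# Hypothetical training rows: [dist_exit_A, dist_exit_B, smoke_A, smoke_B]
X = np.column_stack([rng.uniform(0, 100, (2000, 2)),   # distances in metres
                     rng.uniform(0, 1, (2000, 2))])    # smoke densities (0..1)
# Toy labeling rule: choose the exit with the lower distance-plus-smoke cost (1 = exit B).
y = ((X[:, 0] + 40 * X[:, 2]) > (X[:, 1] + 40 * X[:, 3])).astype(int)

mlp = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
mlp.fit(X[:1500], y[:1500])
print("held-out accuracy:", round(mlp.score(X[1500:], y[1500:]), 3))
state = [[20.0, 60.0, 0.9, 0.1]]                       # close but smoky exit A vs. far, clear exit B
print("suggested exit:", "B" if mlp.predict(state)[0] == 1 else "A")
```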

  1. RNA–protein binding kinetics in an automated microfluidic reactor

    PubMed Central

    Ridgeway, William K.; Seitaridou, Effrosyni; Phillips, Rob; Williamson, James R.

    2009-01-01

    Microfluidic chips can automate biochemical assays on the nanoliter scale, which is of considerable utility for RNA–protein binding reactions that would otherwise require large quantities of proteins. Unfortunately, complex reactions involving multiple reactants cannot be prepared in current microfluidic mixer designs, nor is investigation of long-time scale reactions possible. Here, a microfluidic ‘Riboreactor’ has been designed and constructed to facilitate the study of kinetics of RNA–protein complex formation over long time scales. With computer automation, the reactor can prepare binding reactions from any combination of eight reagents, and is optimized to monitor long reaction times. By integrating a two-photon microscope into the microfluidic platform, 5-nl reactions can be observed for longer than 1000 s with single-molecule sensitivity and negligible photobleaching. Using the Riboreactor, RNA–protein binding reactions with a fragment of the bacterial 30S ribosome were prepared in a fully automated fashion and binding rates were consistent with rates obtained from conventional assays. The microfluidic chip successfully combines automation, low sample consumption, ultra-sensitive fluorescence detection and a high degree of reproducibility. The chip should be able to probe complex reaction networks describing the assembly of large multicomponent RNPs such as the ribosome. PMID:19759214

  2. Schedule Risks Due to Delays in Advanced Technology Development

    NASA Technical Reports Server (NTRS)

    Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan

    2008-01-01

    This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.

  3. Design and Analysis of Single-Cell Sequencing Experiments.

    PubMed

    Grün, Dominic; van Oudenaarden, Alexander

    2015-11-05

    Recent advances in single-cell sequencing hold great potential for exploring biological systems with unprecedented resolution. Sequencing the genome of individual cells can reveal somatic mutations and allows the investigation of clonal dynamics. Single-cell transcriptome sequencing can elucidate the cell type composition of a sample. However, single-cell sequencing comes with major technical challenges and yields complex data output. In this Primer, we provide an overview of available methods and discuss experimental design and single-cell data analysis. We hope that these guidelines will enable a growing number of researchers to leverage the power of single-cell sequencing. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Texas Adolescent Tobacco and Marketing Surveillance System’s Design

    PubMed Central

    Pérez, Adriana; Harrell, Melissa B.; Malkani, Raja I.; Jackson, Christian D.; Delk, Joanne; Allotey, Prince A.; Matthews, Krystin J.; Martinez, Pablo; Perry, Cheryl L.

    2017-01-01

    Objectives To provide a full methodological description of the design of the wave I and II (6-month follow-up) surveys of the Texas Adolescent Tobacco and Marketing Surveillance System (TATAMS), a longitudinal surveillance study of 6th, 8th, and 10th grade students who attended schools in Bexar, Dallas, Tarrant, Harris, or Travis counties, where the 5 largest cities in Texas (San Antonio, Dallas, Fort Worth, Houston, and Austin, respectively) are located. Methods TATAMS used a complex probability design, yielding representative estimates of these students in these counties during the 2014–2015 academic year. The weighted prevalence of the use of tobacco products, drugs and alcohol in wave I, and the percent of (i) bias, (ii) relative bias, and (iii) relative bias ratio between waves I and II, are estimated. Results The wave I sample included 79 schools and 3,907 students. The prevalence of current cigarette, e-cigarette and hookah use at wave I was 3.5%, 7.4%, and 2.5%, respectively. Small biases, mostly less than 3.5%, were observed for nonrespondents in wave II. Conclusions Even with adaptations to the sampling methodology, the resulting sample adequately represents the target population. Results from TATAMS will have important implications for future tobacco policy in Texas and federal regulation. PMID:29098172
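
    For readers unfamiliar with the quantities reported, a weighted prevalence and a percent relative bias can be computed as in the short sketch below. The weights, response indicator and prevalence are synthetic, and the formulas are the usual design-based estimators rather than TATAMS's exact procedures.

```python
import numpy as np

def weighted_prevalence(indicator, weights):
    """Design-based prevalence: weighted share of the population reporting use."""
    return np.sum(weights * indicator) / np.sum(weights)

def percent_relative_bias(reference_est, respondent_est):
    """Percent relative bias of the respondent-only estimate vs. the full-sample one."""
    return 100.0 * (respondent_est - reference_est) / reference_est

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    n = 3907                                            # wave I sample size from the abstract
    w = rng.gamma(2.0, 50.0, n)                         # synthetic sampling weights
    ecig = rng.binomial(1, 0.074, n)                    # synthetic current e-cigarette use
    responded_w2 = rng.binomial(1, 0.85, n).astype(bool)
    p_all = weighted_prevalence(ecig, w)
    p_resp = weighted_prevalence(ecig[responded_w2], w[responded_w2])
    print(f"wave I weighted prevalence: {100 * p_all:.2f}%")
    print(f"percent relative bias among wave II respondents: {percent_relative_bias(p_all, p_resp):+.2f}%")
```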

  5. Nanoscale effects in the characterization of viscoelastic materials with atomic force microscopy: coupling of a quasi-three-dimensional standard linear solid model with in-plane surface interactions.

    PubMed

    Solares, Santiago D

    2016-01-01

    Significant progress has been accomplished in the development of experimental contact-mode and dynamic-mode atomic force microscopy (AFM) methods designed to measure surface material properties. However, current methods are based on one-dimensional (1D) descriptions of the tip-sample interaction forces, thus neglecting the intricacies involved in the material behavior of complex samples (such as soft viscoelastic materials) as well as the differences in material response between the surface and the bulk. In order to begin to address this gap, a computational study is presented where the sample is simulated using an enhanced version of a recently introduced model that treats the surface as a collection of standard-linear-solid viscoelastic elements. The enhanced model introduces in-plane surface elastic forces that can be approximately related to a two-dimensional (2D) Young's modulus. Relevant cases are discussed for single- and multifrequency intermittent-contact AFM imaging, with focus on the calculated surface indentation profiles and tip-sample interaction force curves, as well as their implications with regards to experimental interpretation. A variety of phenomena are examined in detail, which highlight the need for further development of more physically accurate sample models that are specifically designed for AFM simulation. A multifrequency AFM simulation tool based on the above sample model is provided as supporting information.
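
    The standard-linear-solid element at the heart of the sample model obeys a simple constitutive relation that is easy to integrate for a prescribed indentation history. The sketch below simulates one such element (a spring in parallel with a Maxwell arm) under a ramp-and-hold indentation; the stiffness and damping values are arbitrary illustrations, and the paper's quasi-3D coupling with in-plane surface forces is not included.

```python
import numpy as np

def sls_force(t, indentation, k0=0.05, k1=0.20, eta=0.5):
    """Force of a standard-linear-solid (Zener) element: spring k0 [N/m] in
    parallel with a Maxwell arm (spring k1 [N/m] in series with dashpot eta [N*s/m])."""
    f_maxwell = 0.0
    force = np.zeros_like(indentation)
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        dz = indentation[i] - indentation[i - 1]
        f_maxwell += k1 * (dz - f_maxwell / eta * dt)   # explicit Euler update of the Maxwell arm
        force[i] = k0 * indentation[i] + f_maxwell
    return force

if __name__ == "__main__":
    t = np.linspace(0.0, 2.0, 4001)
    z = np.clip(t / 0.1, 0.0, 1.0) * 2e-9               # ramp to 2 nm in 0.1 s, then hold
    f = sls_force(t, z)
    print(f"peak force {f.max():.2e} N, relaxed force {f[-1]:.2e} N")
```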

  6. Self-Sealing Wet Chemistry Cell for Field Analysis

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Soto, Juancarlos; Lasnik, James; Roark, Shane

    2012-01-01

    In most analytical investigations, there is a need to process complex field samples for the unique detection of analytes, especially when detecting low concentration organic molecules that may identify extraterrestrial life. Wet chemistry based instruments are the techniques of choice for most laboratory- based analysis of organic molecules due to several factors including less fragmentation of fragile biomarkers, and ability to concentrate target species resulting in much lower limits of detection. Development of an automated wet chemistry preparation system that can operate autonomously on Earth and is also designed to operate under Martian ambient conditions will demonstrate the technical feasibility of including wet chemistry on future missions. An Automated Sample Processing System (ASPS) has recently been developed that receives fines, extracts organics through solvent extraction, processes the extract by removing non-organic soluble species, and delivers sample to multiple instruments for analysis (including for non-organic soluble species). The key to this system is a sample cell that can autonomously function under field conditions. As a result, a self-sealing sample cell was developed that can autonomously hermetically seal fines and powder into a container, regardless of orientation of the apparatus. The cap is designed with a beveled edge, which allows the cap to be self-righted as the capping motor engages. Each cap consists of a C-clip lock ring below a crucible O-ring that is placed into a groove cut into the sample cap.

  7. A review of blood sample handling and pre-processing for metabolomics studies.

    PubMed

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field, there still exist fundamental needs to consider pre-analytical variability that can introduce bias into the subsequent analytical process, decrease the reliability of the results and, moreover, confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample-related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent misinterpretation of results and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood-derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide the standardization of protocols and the reduction of undesirable variation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Fine-tuning gene networks using simple sequence repeats

    PubMed Central

    Egbert, Robert G.; Klavins, Eric

    2012-01-01

    The parameters in a complex synthetic gene network must be extensively tuned before the network functions as designed. Here, we introduce a simple and general approach to rapidly tune gene networks in Escherichia coli using hypermutable simple sequence repeats embedded in the spacer region of the ribosome binding site. By varying repeat length, we generated expression libraries that incrementally and predictably sample gene expression levels over a 1,000-fold range. We demonstrate the utility of the approach by creating a bistable switch library that programmatically samples the expression space to balance the two states of the switch, and we illustrate the need for tuning by showing that the switch’s behavior is sensitive to host context. Further, we show that mutation rates of the repeats are controllable in vivo for stability or for targeted mutagenesis—suggesting a new approach to optimizing gene networks via directed evolution. This tuning methodology should accelerate the process of engineering functionally complex gene networks. PMID:22927382

  9. Monitoring Peptidase Activities in Complex Proteomes by MALDI-TOF Mass Spectrometry

    PubMed Central

    Villanueva, Josep; Nazarian, Arpi; Lawlor, Kevin; Tempst, Paul

    2009-01-01

    Measuring enzymatic activities in biological fluids is a form of activity-based proteomics and may be utilized as a means of developing disease biomarkers. Activity-based assays allow amplification of output signals, thus potentially visualizing low-abundant enzymes on a virtually transparent whole-proteome background. The protocol presented here describes a semi-quantitative in vitro assay of proteolytic activities in complex proteomes by monitoring breakdown of designer peptide-substrates using robotic extraction and a MALDI-TOF mass spectrometric read-out. Relative quantitation of the peptide metabolites is done by comparison with spiked internal standards, followed by statistical analysis of the resulting mini-peptidome. Partial automation provides reproducibility and throughput essential for comparing large sample sets. The approach may be employed for diagnostic or predictive purposes and enables profiling of 96 samples in 30 hours. It could be tailored to many diagnostic and pharmaco-dynamic purposes, as a read-out of catalytic and metabolic activities in body fluids or tissues. PMID:19617888

  10. Comparison of complex effluent treatability in different bench scale microbial electrolysis cells.

    PubMed

    Ullery, Mark L; Logan, Bruce E

    2014-10-01

    A range of wastewaters and substrates were examined using mini microbial electrolysis cells (mini MECs) to see if they could be used to predict the performance of larger-scale cube MECs. COD removals and coulombic efficiencies corresponded well between the two reactor designs for individual samples, with 66-92% of COD removed for all samples. Current generation was consistent between the reactor types for acetate (AC) and fermentation effluent (FE) samples, but less consistent with industrial (IW) and domestic wastewaters (DW). Hydrogen was recovered from all samples in cube MECs, but gas composition and volume varied significantly between samples. Evidence for direct conversion of substrate to methane was observed with two of the industrial wastewater samples (IW-1 and IW-3). Overall, mini MECs provided organic treatment data that corresponded well with larger scale reactor results, and therefore it was concluded that they can be a useful platform for screening wastewater sources. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. A Pareto frontier intersection-based approach for efficient multiobjective optimization of competing concept alternatives

    NASA Astrophysics Data System (ADS)

    Rousis, Damon A.

    The expected growth of civil aviation over the next twenty years places significant emphasis on revolutionary technology development aimed at mitigating the environmental impact of commercial aircraft. As the number of technology alternatives grows along with model complexity, current methods for Pareto finding and multiobjective optimization quickly become computationally infeasible. Coupled with the large uncertainty in the early stages of design, optimal designs are sought while avoiding the computational burden of excessive function calls when a single design change or technology assumption could alter the results. This motivates the need for a robust and efficient evaluation methodology for quantitative assessment of competing concepts. This research presents a novel approach that combines Bayesian adaptive sampling with surrogate-based optimization to efficiently place designs near Pareto frontier intersections of competing concepts. Efficiency is increased over sequential multiobjective optimization by focusing computational resources specifically on the location in the design space where optimality shifts between concepts. At the intersection of Pareto frontiers, the selection decisions are most sensitive to the preferences placed on the objectives, and small perturbations can lead to vastly different final designs. These concepts are incorporated into an evaluation methodology that ultimately reduces the number of failed cases, infeasible designs, and Pareto-dominated solutions across all concepts. A set of algebraic sample problems, along with a truss design problem, is presented as canonical examples for the proposed approach. The methodology is applied to the design of ultra-high bypass ratio turbofans to guide NASA's technology development efforts for future aircraft. Geared-drive and variable geometry bypass nozzle concepts are explored as enablers for increased bypass ratio and as potential alternatives to traditional configurations. The method is shown to improve sampling efficiency and provide clusters of feasible designs that motivate a shift towards revolutionary technologies that reduce fuel burn, emissions, and noise on future aircraft.
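
    The kind of comparison the methodology targets, namely where along the objectives the better concept switches, can be illustrated with a simple non-dominated filter applied to two competing concepts. The sampled objective values for the "geared-drive" and "variable nozzle" concepts below are random placeholders, not results from the cited engine models.

```python
import numpy as np

def pareto_front(points):
    """Non-dominated subset for two objectives to be minimized (e.g., fuel burn, noise)."""
    front = [p for p in points
             if not any((q[0] <= p[0]) and (q[1] <= p[1]) and (q[0] < p[0] or q[1] < p[1])
                        for q in points)]
    return np.array(sorted(front, key=lambda p: p[0]))

rng = np.random.default_rng(7)
geared   = np.column_stack([rng.uniform(0.6, 1.0, 300), rng.uniform(0.4, 1.0, 300)])
variable = np.column_stack([rng.uniform(0.5, 1.1, 300), rng.uniform(0.5, 0.9, 300)])
fg, fv = pareto_front(geared), pareto_front(variable)

# Which concept offers lower noise at each fuel-burn budget? The budget at which the
# answer flips approximates the Pareto frontier intersection region.
for fuel in np.linspace(0.65, 1.0, 8):
    g = fg[fg[:, 0] <= fuel][:, 1].min() if np.any(fg[:, 0] <= fuel) else np.inf
    v = fv[fv[:, 0] <= fuel][:, 1].min() if np.any(fv[:, 0] <= fuel) else np.inf
    print(f"fuel burn <= {fuel:.2f}: better concept = {'geared-drive' if g < v else 'variable nozzle'}")
```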

  12. User evaluations of design complexity: the impact of visual perceptions for effective online health communication.

    PubMed

    Lazard, Allison; Mackert, Michael

    2014-10-01

    This paper highlights the influential role of design complexity for users' first impressions of health websites. An experimental design was utilized to investigate whether a website's level of design complexity impacts user evaluations. An online questionnaire measured the hypothesized impact of design complexity on predictors of message effectiveness. Findings reveal that increased design complexity was positively associated with higher levels of perceived design esthetics, attitude toward the website, perceived message comprehensibility, perceived ease of use, perceived usefulness, perceived message quality, perceived informativeness, and perceived visual informativeness. This research gives further evidence that design complexity should be considered an influential variable for health communicators to effectively reach their audiences, as it embodies the critical first step for message evaluation via electronic platforms. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. A process-based hierarchical framework for monitoring glaciated alpine headwaters

    USGS Publications Warehouse

    Weekes, Anne A.; Torgersen, Christian E.; Montgomery, David R.; Woodward, Andrea; Bolton, Susan M.

    2012-01-01

    Recent studies have demonstrated the geomorphic complexity and wide range of hydrologic regimes found in alpine headwater channels that provide complex habitats for aquatic taxa. These geohydrologic elements are fundamental to better understand patterns in species assemblages and indicator taxa and are necessary to aquatic monitoring protocols that aim to track changes in physical conditions. Complex physical variables shape many biological and ecological traits, including life history strategies, but these mechanisms can only be understood if critical physical variables are adequately represented within the sampling framework. To better align sampling design protocols with current geohydrologic knowledge, we present a conceptual framework that incorporates regional-scale conditions, basin-scale longitudinal profiles, valley-scale glacial macroform structure, valley segment-scale (i.e., colluvial, alluvial, and bedrock), and reach-scale channel types. At the valley segment- and reach-scales, these hierarchical levels are associated with differences in streamflow and sediment regime, water source contribution and water temperature. Examples of linked physical-ecological hypotheses placed in a landscape context and a case study using the proposed framework are presented to demonstrate the usefulness of this approach for monitoring complex temporal and spatial patterns and processes in glaciated basins. This approach is meant to aid in comparisons between mountain regions on a global scale and to improve management of potentially endangered alpine species affected by climate change and other stressors.

  14. Low-Complexity Lossless and Near-Lossless Data Compression Technique for Multispectral Imagery

    NASA Technical Reports Server (NTRS)

    Xie, Hua; Klimesh, Matthew A.

    2009-01-01

    This work extends the lossless data compression technique described in Fast Lossless Compression of Multispectral- Image Data, (NPO-42517) NASA Tech Briefs, Vol. 30, No. 8 (August 2006), page 26. The original technique was extended to include a near-lossless compression option, allowing substantially smaller compressed file sizes when a small amount of distortion can be tolerated. Near-lossless compression is obtained by including a quantization step prior to encoding of prediction residuals. The original technique uses lossless predictive compression and is designed for use on multispectral imagery. A lossless predictive data compression algorithm compresses a digitized signal one sample at a time as follows: First, a sample value is predicted from previously encoded samples. The difference between the actual sample value and the prediction is called the prediction residual. The prediction residual is encoded into the compressed file. The decompressor can form the same predicted sample and can decode the prediction residual from the compressed file, and so can reconstruct the original sample. A lossless predictive compression algorithm can generally be converted to a near-lossless compression algorithm by quantizing the prediction residuals prior to encoding them. In this case, since the reconstructed sample values will not be identical to the original sample values, the encoder must determine the values that will be reconstructed and use these values for predicting later sample values. The technique described here uses this method, starting with the original technique, to allow near-lossless compression. The extension to allow near-lossless compression adds the ability to achieve much more compression when small amounts of distortion are tolerable, while retaining the low complexity and good overall compression effectiveness of the original algorithm.
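
    The mechanism described, predicting from previously reconstructed samples, quantizing the residual, and keeping encoder and decoder reconstructions in lockstep, is easy to show on a toy signal. The sketch below uses a previous-sample predictor and a uniform residual quantizer, which are simplifications for illustration and not the NASA multispectral predictor; the error-bound property, however, is the general one.

```python
import numpy as np

def near_lossless_encode(samples, max_error=2):
    """Quantize prediction residuals so reconstruction error never exceeds max_error.
    The predictor is the previous *reconstructed* sample, keeping encoder/decoder in sync."""
    step = 2 * max_error + 1
    indices, recon = [], 0
    for s in samples:
        q = int(np.round((int(s) - recon) / step))     # quantized residual index (to be entropy coded)
        indices.append(q)
        recon += q * step                              # decoder-visible reconstruction
    return indices

def near_lossless_decode(indices, max_error=2):
    step = 2 * max_error + 1
    recon, out = 0, []
    for q in indices:
        recon += q * step
        out.append(recon)
    return np.array(out)

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    row = np.cumsum(rng.integers(-5, 6, 1000)) + 500    # synthetic, fairly smooth image row
    idx = near_lossless_encode(row, max_error=2)
    rec = near_lossless_decode(idx, max_error=2)
    print("max reconstruction error:", int(np.abs(rec - row).max()))  # bounded by max_error
```

    Setting max_error to zero reduces the scheme to the lossless case, which is consistent with the relationship between the original algorithm and this extension.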

  15. Development and evaluation of a highly sensitive immunochromatographic strip test using gold nanoparticle for direct detection of Vibrio cholerae O139 in seafood samples.

    PubMed

    Pengsuk, Chalinan; Chaivisuthangkura, Parin; Longyant, Siwaporn; Sithigorngul, Paisarn

    2013-04-15

    A strip test for the detection of Vibrio cholerae O139 was developed using two monoclonal antibodies (MAbs), namely VC-273 and VC-812, which specifically bind to the lipopolysaccharide and capsular polysaccharide of V. cholerae O139. The MAb VC-273 gold nanoparticle conjugate was sprayed onto a glass fiber pad that was placed adjacent to a sample chamber. MAb VC-812 and the goat anti-mouse immunoglobulin G (GAM) antibody were sprayed onto a nitrocellulose membrane in strips at positions designated as T and C, respectively. The test strips were assessed for their ability to directly detect V. cholerae O139 using samples dispersed in application buffer, and a 100 μL aliquot of sample was applied to the sample chamber. The results were observable within 20 min after application of the sample. In samples containing V. cholerae O139, the antigen was bound to the colloidal gold-conjugated MAb to form an antibody-antigen complex. This complex was captured by the MAbs at the T test line, resulting in the appearance of a reddish-purple band at the T position. The sensitivity of the test was determined to be 10⁴ cfu mL⁻¹. Direct detection of V. cholerae O139 in various fresh seafood samples could be accomplished with similar sensitivities. The detection limit was substantially improved to 1 cfu mL⁻¹ of the original bacterial content after pre-incubation of the sample in alkaline peptone water for 12 h. The V. cholerae strip test provides several advantages over other methods, including the speed and simplicity of use because there is no requirement for sophisticated equipment. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Attractive design: an elution solvent optimization platform for magnetic-bead-based fractionation using digital microfluidics and design of experiments.

    PubMed

    Lafrenière, Nelson M; Mudrik, Jared M; Ng, Alphonsus H C; Seale, Brendon; Spooner, Neil; Wheeler, Aaron R

    2015-04-07

    There is great interest in the development of integrated tools allowing for miniaturized sample processing, including solid phase extraction (SPE). We introduce a new format for microfluidic SPE relying on C18-functionalized magnetic beads that can be manipulated in droplets in a digital microfluidic platform. This format provides the opportunity to tune the amount (and potentially the type) of stationary phase on-the-fly, and allows the removal of beads after the extraction (to enable other operations in the same device space), maintaining device reconfigurability. Using the new method, we employed a design of experiments (DOE) operation to enable automated on-chip optimization of elution solvent composition for reversed phase SPE of a model system. Further, conditions were selected to enable on-chip fractionation of multiple analytes. Finally, the method was demonstrated to be useful for online cleanup of extracts from dried blood spot (DBS) samples. We anticipate this combination of features will prove useful for separating a wide range of analytes, from small molecules to peptides, from complex matrices.

  17. A Complex Permittivity Based Sensor for the Electrical Characterization of High-Voltage Transformer Oils

    PubMed Central

    Dervos, Constantine T.; Paraskevas, Christos D.; Skafidas, Panayotis D.; Vassiliou, Panayota

    2005-01-01

    This work investigates the use of a specially designed cylindrical metal cell to obtain complex permittivity and tan δ data for highly insulating high-voltage (HV) transformer oil samples. The data are obtained over a wide range of frequencies and operating temperatures to demonstrate polarization phenomena and thermally stimulated effects. Such complex permittivity measurements may be utilized as a criterion for the service life prediction of oil field electrical equipment (OFEE). Therefore, a single set of measurements on a small oil volume can indicate whether the transformer oil is nearing the end of its service life or can remain in service. The oil-containing cell, attached to the appropriate measuring units, can be described as a complex permittivity sensor. In this work, the dielectric data acquired from a large number of operating distribution network power transformers were correlated with corresponding physicochemical data to demonstrate the future potential of the proposed measuring technique.
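
    For context, the quantities reported by such a sensor are commonly derived from a capacitance-type measurement: with the empty-cell (geometric) capacitance C0 and the measured parallel capacitance C and conductance G at angular frequency ω, the relative permittivity components are ε' = C/C0 and ε'' = G/(ωC0), and tan δ = ε''/ε'. The sketch below assumes this standard parallel equivalent circuit; the numbers are illustrative and not taken from the paper.

    ```python
    import math

    def complex_permittivity(C, G, C0, freq_hz):
        """Relative complex permittivity from a parallel C-G equivalent circuit.

        C, G    : measured parallel capacitance (F) and conductance (S)
        C0      : empty-cell (geometric) capacitance (F)
        freq_hz : measurement frequency (Hz)
        """
        omega = 2 * math.pi * freq_hz
        eps_real = C / C0                 # real part, epsilon'
        eps_imag = G / (omega * C0)       # imaginary (loss) part, epsilon''
        tan_delta = eps_imag / eps_real   # dissipation factor
        return eps_real, eps_imag, tan_delta

    # Illustrative numbers only (not values from the paper):
    e1, e2, td = complex_permittivity(C=110e-12, G=5e-12, C0=50e-12, freq_hz=50.0)
    print(f"eps' = {e1:.2f}, eps'' = {e2:.4e}, tan d = {td:.4e}")
    ```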

  18. Studying light-harvesting models with superconducting circuits.

    PubMed

    Potočnik, Anton; Bargerbos, Arno; Schröder, Florian A Y N; Khan, Saeed A; Collodo, Michele C; Gasparinetti, Simone; Salathé, Yves; Creatore, Celestino; Eichler, Christopher; Türeci, Hakan E; Chin, Alex W; Wallraff, Andreas

    2018-03-02

    The process of photosynthesis, the main source of energy in the living world, converts sunlight into chemical energy. The high efficiency of this process is believed to be enabled by an interplay between the quantum nature of molecular structures in photosynthetic complexes and their interaction with the environment. Investigating these effects in biological samples is challenging due to their complex and disordered structure. Here we experimentally demonstrate a technique for studying photosynthetic models based on superconducting quantum circuits, which complements existing experimental, theoretical, and computational approaches. We demonstrate a high degree of freedom in design and experimental control of our approach based on a simplified three-site model of a pigment protein complex with realistic parameters scaled down in energy by a factor of 10^5. We show that the excitation transport between quantum-coherent sites disordered in energy can be enabled through the interaction with environmental noise. We also show that the efficiency of the process is maximized for structured noise resembling intramolecular phononic environments found in photosynthetic complexes.

  19. Multiplex Degenerate Primer Design for Targeted Whole Genome Amplification of Many Viral Genomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, Shea N.; Jaing, Crystal J.; Elsheikh, Maher M.

    Background. Targeted enrichment improves coverage of highly mutable viruses at low concentration in complex samples. Degenerate primers that anneal to conserved regions can facilitate amplification of divergent, low concentration variants, even when the strain present is unknown. Results. A tool for designing multiplex sets of degenerate sequencing primers to tile overlapping amplicons across multiple whole genomes is described. The new script, run_tiled_primers, is part of the PriMux software. Primers were designed for each segment of South American hemorrhagic fever viruses, tick-borne encephalitis, Henipaviruses, Arenaviruses, Filoviruses, Crimean-Congo hemorrhagic fever virus, Rift Valley fever virus, and Japanese encephalitis virus. Each group is highly diverse with as little as 5% genome consensus. Primer sets were computationally checked for nontarget cross reactions against the NCBI nucleotide sequence database. Primers for murine hepatitis virus were demonstrated in the lab to specifically amplify selected genes from a laboratory cultured strain that had undergone extensive passage in vitro and in vivo. Conclusions. This software should help researchers design multiplex sets of primers for targeted whole genome enrichment prior to sequencing to obtain better coverage of low titer, divergent viruses. Applications include viral discovery from a complex background and improved sensitivity and coverage of rapidly evolving strains or variants in a gene family.
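
    Degenerate primers use IUPAC ambiguity codes (e.g., R = A/G, Y = C/T, N = any base) so that a single oligo pool covers many sequence variants. The short Python sketch below, which is independent of the PriMux/run_tiled_primers implementation, expands a degenerate primer into its concrete oligos and reports its degeneracy (the size of the pool); the example primer is hypothetical.

    ```python
    from itertools import product

    # Standard IUPAC nucleotide ambiguity codes
    IUPAC = {
        "A": "A", "C": "C", "G": "G", "T": "T",
        "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
        "K": "GT", "M": "AC", "B": "CGT", "D": "AGT",
        "H": "ACT", "V": "ACG", "N": "ACGT",
    }

    def expand(primer):
        """All concrete oligos encoded by a degenerate primer."""
        return ["".join(p) for p in product(*(IUPAC[b] for b in primer.upper()))]

    def degeneracy(primer):
        """Pool size: product of the number of choices at each position."""
        n = 1
        for base in primer.upper():
            n *= len(IUPAC[base])
        return n

    primer = "ACWGN"           # hypothetical degenerate primer
    print(degeneracy(primer))  # 2 * 4 = 8 concrete oligos
    print(expand(primer)[:4])  # first few concrete sequences
    ```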

  20. Multiplex Degenerate Primer Design for Targeted Whole Genome Amplification of Many Viral Genomes

    DOE PAGES

    Gardner, Shea N.; Jaing, Crystal J.; Elsheikh, Maher M.; ...

    2014-01-01

    Background. Targeted enrichment improves coverage of highly mutable viruses at low concentration in complex samples. Degenerate primers that anneal to conserved regions can facilitate amplification of divergent, low concentration variants, even when the strain present is unknown. Results. A tool for designing multiplex sets of degenerate sequencing primers to tile overlapping amplicons across multiple whole genomes is described. The new script, run_tiled_primers, is part of the PriMux software. Primers were designed for each segment of South American hemorrhagic fever viruses, tick-borne encephalitis, Henipaviruses, Arenaviruses, Filoviruses, Crimean-Congo hemorrhagic fever virus, Rift Valley fever virus, and Japanese encephalitis virus. Each group is highly diverse with as little as 5% genome consensus. Primer sets were computationally checked for nontarget cross reactions against the NCBI nucleotide sequence database. Primers for murine hepatitis virus were demonstrated in the lab to specifically amplify selected genes from a laboratory cultured strain that had undergone extensive passage in vitro and in vivo. Conclusions. This software should help researchers design multiplex sets of primers for targeted whole genome enrichment prior to sequencing to obtain better coverage of low titer, divergent viruses. Applications include viral discovery from a complex background and improved sensitivity and coverage of rapidly evolving strains or variants in a gene family.

  1. Development of an Enhanced Metaproteomic Approach for Deepening the Microbiome Characterization of the Human Infant Gut

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiong, Weili; Giannone, Richard J.; Morowitz, Michael J.

    The early-life microbiota establishment in the human infant gut is highly variable and plays a crucial role in host nutrition and immune maturation. While high-performance mass spectrometry (MS)-based metaproteomics is a powerful method for the functional characterization of complex microbial communities, the construction of comprehensive metaproteomic information in human fecal samples is inhibited by the presence of abundant human proteins. To alleviate this restriction, we have designed a novel metaproteomic strategy based on Double Filtering (DF) to enhance microbial protein characterization in complex fecal samples from healthy premature infants. We improved the overall depth of infant gut proteome measurement, with an increase in the number of identified low abundance proteins, and observed greater than twofold improvement in metrics for microbial protein identifications and quantifications with a relatively high rank correlation to control. We further showed that this approach substantially enhances the interpretation of microbial functional categories between infants by affording more detailed and confidently identified categories. This approach provides an avenue for in-depth measurement of the microbial component of infant fecal samples and thus comprehensive characterization of infant gut microbiome functionality.

  2. Quantum-dot-based suspension microarray for multiplex detection of lung cancer markers: preclinical validation and comparison with the Luminex xMAP® system

    NASA Astrophysics Data System (ADS)

    Bilan, Regina; Ametzazurra, Amagoia; Brazhnik, Kristina; Escorza, Sergio; Fernández, David; Uríbarri, María; Nabiev, Igor; Sukhanova, Alyona

    2017-03-01

    A novel suspension multiplex immunoassay for the simultaneous specific detection of lung cancer markers in bronchoalveolar lavage fluid (BALF) clinical samples, based on fluorescent microspheres of different sizes spectrally encoded with quantum dots (QDEM), was developed. The designed suspension immunoassay was validated for the quantitative detection of three lung cancer markers in BALF samples from 42 lung cancer patients and 10 control subjects. Tumor markers were detected through simultaneous formation of specific immune complexes consisting of a capture molecule, the target antigen, and a biotinylated recognition molecule on the surface of the different QDEM in a mixture. The immune complexes were visualized by fluorescently labeled streptavidin and simultaneously analyzed using a flow cytometer. Preclinical validation of the immunoassay was performed and results were compared with those obtained using an alternative 3-plex immunoassay based on Luminex xMAP® technology, developed on classical organic fluorophores. The comparison showed that the QDEM and xMAP® assays yielded almost identical results, with clear discrimination between control and clinical samples. Thus, the developed QDEM technology can become a good alternative to xMAP® assays, permitting analysis of multiple protein biomarkers using conventional flow cytometers.

  3. Quantum-dot-based suspension microarray for multiplex detection of lung cancer markers: preclinical validation and comparison with the Luminex xMAP® system

    PubMed Central

    Bilan, Regina; Ametzazurra, Amagoia; Brazhnik, Kristina; Escorza, Sergio; Fernández, David; Uríbarri, María; Nabiev, Igor; Sukhanova, Alyona

    2017-01-01

    A novel suspension multiplex immunoassay for the simultaneous specific detection of lung cancer markers in bronchoalveolar lavage fluid (BALF) clinical samples, based on fluorescent microspheres of different sizes spectrally encoded with quantum dots (QDEM), was developed. The designed suspension immunoassay was validated for the quantitative detection of three lung cancer markers in BALF samples from 42 lung cancer patients and 10 control subjects. Tumor markers were detected through simultaneous formation of specific immune complexes consisting of a capture molecule, the target antigen, and a biotinylated recognition molecule on the surface of the different QDEM in a mixture. The immune complexes were visualized by fluorescently labeled streptavidin and simultaneously analyzed using a flow cytometer. Preclinical validation of the immunoassay was performed and results were compared with those obtained using an alternative 3-plex immunoassay based on Luminex xMAP® technology, developed on classical organic fluorophores. The comparison showed that the QDEM and xMAP® assays yielded almost identical results, with clear discrimination between control and clinical samples. Thus, the developed QDEM technology can become a good alternative to xMAP® assays, permitting analysis of multiple protein biomarkers using conventional flow cytometers. PMID:28300171

  4. Development of an Enhanced Metaproteomic Approach for Deepening the Microbiome Characterization of the Human Infant Gut

    DOE PAGES

    Xiong, Weili; Giannone, Richard J.; Morowitz, Michael J.; ...

    2014-10-28

    The early-life microbiota establishment in the human infant gut is highly variable and plays a crucial role in host nutrition and immune maturation. While high-performance mass spectrometry (MS)-based metaproteomics is a powerful method for the functional characterization of complex microbial communities, the construction of comprehensive metaproteomic information in human fecal samples is inhibited by the presence of abundant human proteins. To alleviate this restriction, we have designed a novel metaproteomic strategy based on Double Filtering (DF) to enhance microbial protein characterization in complex fecal samples from healthy premature infants. We improved the overall depth of infant gut proteome measurement, with an increase in the number of identified low abundance proteins, and observed greater than twofold improvement in metrics for microbial protein identifications and quantifications with a relatively high rank correlation to control. We further showed that this approach substantially enhances the interpretation of microbial functional categories between infants by affording more detailed and confidently identified categories. This approach provides an avenue for in-depth measurement of the microbial component of infant fecal samples and thus comprehensive characterization of infant gut microbiome functionality.

  5. Space shuttle nonmetallic materials age life prediction

    NASA Technical Reports Server (NTRS)

    Mendenhall, G. D.; Hassell, J. A.; Nathan, R. A.

    1975-01-01

    The chemiluminescence from samples of polybutadiene, Viton, Teflon, Silicone, PL 731 Adhesive, and SP 296 Boron-Epoxy composite was measured at temperatures from 25 to 150 C. Excellent correlations were obtained between chemiluminescence and temperature. These correlations serve to validate accelerated aging tests (at elevated temperatures) designed to predict service life at lower temperatures. In most cases, smooth or linear correlations were obtained between chemiluminescence and physical properties of purified polymer gums, including the tensile strength, viscosity, and loss tangent. The latter is a complex function of certain polymer properties. Data were obtained with far greater ease by the chemiluminescence technique than by the conventional methods of study. The chemiluminescence from the Teflon (Halon) samples was discovered to arise from trace amounts of impurities, which were undetectable by conventional, destructive analysis of the sample.

  6. An approach to optimize sample preparation for MALDI imaging MS of FFPE sections using fractional factorial design of experiments.

    PubMed

    Oetjen, Janina; Lachmund, Delf; Palmer, Andrew; Alexandrov, Theodore; Becker, Michael; Boskamp, Tobias; Maass, Peter

    2016-09-01

    A standardized workflow for matrix-assisted laser desorption/ionization imaging mass spectrometry (MALDI imaging MS) is a prerequisite for the routine use of this promising technology in clinical applications. We present an approach to develop standard operating procedures for MALDI imaging MS sample preparation of formalin-fixed and paraffin-embedded (FFPE) tissue sections based on a novel quantitative measure of dataset quality. To cover many parts of the complex workflow and simultaneously test several parameters, experiments were planned according to a fractional factorial design of experiments (DoE). The effect of ten different experiment parameters was investigated in two distinct DoE sets, each consisting of eight experiments. FFPE rat brain sections were used as standard material because of low biological variance. The mean peak intensity and a recently proposed spatial complexity measure were calculated for a list of 26 predefined peptides obtained by in silico digestion of five different proteins and served as quality criteria. A five-way analysis of variance (ANOVA) was applied to the final scores to retrieve a ranking of experiment parameters with increasing impact on data variance. Graphical abstract: MALDI imaging experiments were planned according to a fractional factorial design of experiments for the parameters under study. Selected peptide images were evaluated by the chosen quality metric (structure and intensity for a given peak list), and the calculated values were used as an input for the ANOVA. The parameters with the highest impact on the quality were deduced and SOPs recommended.
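
    As a rough illustration of the planning step (not the authors' actual factors, responses, or scoring), the sketch below constructs a two-level 2^(4-1) fractional factorial design using the standard generator D = ABC and estimates main effects from a hypothetical response column; a full multi-way ANOVA on such a design would normally be run with a statistics package.

    ```python
    import itertools
    import numpy as np

    # 2^(4-1) fractional factorial: full factorial in A, B, C, then D = A*B*C
    # (factors coded -1 / +1). Factor labels and responses are illustrative only.
    base = np.array(list(itertools.product([-1, 1], repeat=3)))               # 8 runs x (A, B, C)
    design = np.column_stack([base, base[:, 0] * base[:, 1] * base[:, 2]])    # append D = ABC
    factors = ["A", "B", "C", "D"]

    # Hypothetical quality scores for the 8 runs (e.g., mean peak intensity)
    y = np.array([4.1, 5.0, 3.8, 6.2, 4.4, 5.3, 4.0, 6.9])

    # Main effect of a factor = mean response at +1 minus mean response at -1
    for name, col in zip(factors, design.T):
        effect = y[col == 1].mean() - y[col == -1].mean()
        print(f"main effect of {name}: {effect:+.2f}")
    ```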

  7. A multiple-alignment based primer design algorithm for genetically highly variable DNA targets

    PubMed Central

    2013-01-01

    Background Primer design for highly variable DNA sequences is difficult, and experimental success requires attention to many interacting constraints. The advent of next-generation sequencing methods allows the investigation of rare variants otherwise hidden deep in large populations, but requires attention to population diversity and primer localization in relatively conserved regions, in addition to recognized constraints typically considered in primer design. Results Design constraints include degenerate sites to maximize population coverage, matching of melting temperatures, optimizing de novo sequence length, finding optimal bio-barcodes to allow efficient downstream analyses, and minimizing risk of dimerization. To facilitate primer design addressing these and other constraints, we created a novel computer program (PrimerDesign) that automates this complex procedure. We show its powers and limitations and give examples of successful designs for the analysis of HIV-1 populations. Conclusions PrimerDesign is useful for researchers who want to design DNA primers and probes for analyzing highly variable DNA populations. It can be used to design primers for PCR, RT-PCR, Sanger sequencing, next-generation sequencing, and other experimental protocols targeting highly variable DNA samples. PMID:23965160
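
    To make these constraints concrete, here is a small hedged sketch (not the PrimerDesign program itself) that screens hypothetical candidate primers on one of the listed criteria, melting-temperature matching, using the Wallace rule Tm ≈ 2(A+T) + 4(G+C); this rule is only a rough estimate for short oligos, and production tools typically use nearest-neighbor thermodynamics.

    ```python
    # Toy primer screen (not the PrimerDesign software): keep candidates whose
    # approximate melting temperature falls inside a target window.

    def wallace_tm(seq):
        """Wallace-rule Tm estimate, valid only as a rough guide for short oligos."""
        s = seq.upper()
        at = s.count("A") + s.count("T")
        gc = s.count("G") + s.count("C")
        return 2 * at + 4 * gc

    def screen(candidates, tm_min=52.0, tm_max=58.0):
        """Return (primer, Tm) pairs whose estimated Tm lies in the window."""
        return [(p, wallace_tm(p)) for p in candidates
                if tm_min <= wallace_tm(p) <= tm_max]

    # Hypothetical candidate primers
    candidates = ["ACGTGCTAGCTAGGCTA", "ATATATATATATATATAT", "GGCGCGCTAGCTAGGAC"]
    print(screen(candidates))
    ```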

  8. Participatory Ethnographic Evaluation and Research: Reflections on the Research Approach Used to Understand the Complexity of Maternal Health Issues in South Sudan

    PubMed Central

    Elmusharaf, Khalifa; Byrne, Elaine; Manandhar, Mary; Hemmings, Joanne; O’Donovan, Diarmuid

    2016-01-01

    Many methodological approaches have been used to understand cultural dimensions to maternal health issues. Although a well-designed quantitative survey with a representative sample can provide essential information on trends in behavior, it does not necessarily establish a contextualized understanding of the complexity in which different behaviors occur. This article addresses how contextualized data can be collected in a short time and under conditions in which participants in conflict-affected zones might not have established, or time to establish, trust with the researchers. The solution, the Participatory Ethnographic Evaluation and Research (PEER) approach, is illustrated through a study whereby South Sudanese marginalized women were trained to design research instruments, and collect and analyze qualitative data. PEER overcomes the problem that many ethnographic or participatory approaches face—the extensive time and resources required to develop trusting relationships with the community to understand the local context and the social networks they form. PMID:27811290

  9. Development of a computationally-designed polymeric adsorbent specific for mycotoxin patulin.

    PubMed

    Piletska, Elena V; Pink, Demi; Karim, Kal; Piletsky, Sergey A

    2017-12-04

    Patulin is a toxic compound found predominantly in apples affected by mould rot. Since apples and apple-containing products are popular foods for the elderly, children, and babies, monitoring of the toxin is crucial. This paper describes the development of a computationally-designed polymeric adsorbent for the solid-phase extraction of patulin, which provides an effective clean-up of the food samples and allows the detection and accurate quantification of patulin levels present in apple juice using conventional chromatography methods. The developed bespoke polymer demonstrates quantitative binding of the patulin present in undiluted apple juice. The polymer is inexpensive and easy to mass-produce. The key contributing factor to the function of the adsorbent is the combination of acidic and basic functional monomers, which produces a zwitterionic complex in solution that forms stronger binding complexes with the patulin molecule. The protocols described in this paper provide a blueprint for the development of polymeric adsorbents for other toxins or different food matrices.

  10. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
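
    The combination function approach mentioned here can be illustrated with the widely used inverse-normal combination of stage-wise p-values. The sketch below assumes two stages with equal pre-specified weights; the paper's actual gatekeeping and weighting scheme is more elaborate.

    ```python
    from math import sqrt
    from scipy.stats import norm

    def inverse_normal_combination(p1, p2, w1=0.5, w2=0.5):
        """Combine independent stage-wise p-values with pre-specified weights
        (w1 + w2 = 1). Returns the combined p-value of the two-stage test."""
        z = sqrt(w1) * norm.isf(p1) + sqrt(w2) * norm.isf(p2)
        return norm.sf(z)

    # Hypothetical stage-wise p-values for one hypothesis
    print(inverse_normal_combination(0.04, 0.03))
    ```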

  11. Participatory Ethnographic Evaluation and Research: Reflections on the Research Approach Used to Understand the Complexity of Maternal Health Issues in South Sudan.

    PubMed

    Elmusharaf, Khalifa; Byrne, Elaine; Manandhar, Mary; Hemmings, Joanne; O'Donovan, Diarmuid

    2017-07-01

    Many methodological approaches have been used to understand cultural dimensions to maternal health issues. Although a well-designed quantitative survey with a representative sample can provide essential information on trends in behavior, it does not necessarily establish a contextualized understanding of the complexity in which different behaviors occur. This article addresses how contextualized data can be collected in a short time and under conditions in which participants in conflict-affected zones might not have established, or time to establish, trust with the researchers. The solution, the Participatory Ethnographic Evaluation and Research (PEER) approach, is illustrated through a study whereby South Sudanese marginalized women were trained to design research instruments, and collect and analyze qualitative data. PEER overcomes the problem that many ethnographic or participatory approaches face-the extensive time and resources required to develop trusting relationships with the community to understand the local context and the social networks they form.

  12. Experimental study of the complex resistivity and dielectric constant of chrome-contaminated soil

    NASA Astrophysics Data System (ADS)

    Liu, Haorui; Yang, Heli; Yi, Fengyan

    2016-08-01

    Heavy metals such as arsenic and chromium often contaminate soils near industrialized areas. Soil samples prepared with different water contents and chromate pollutant concentrations are often needed to test soil quality. The complex resistivity and complex dielectric characteristics of these samples were measured, and the relationship between these measurements and the chromium concentration and water content was studied. Based on the soil sample observations, the amplitude of the sample complex resistivity decreased with increasing contamination concentration and water content. The phase of the complex resistivity tends to first decrease and then increase with increasing contamination concentration and water content. For soil samples with the same resistivity, a higher amplitude of complex resistivity corresponds to lower water content and higher contamination concentration. The real and imaginary parts of the complex dielectric constant increase with increasing contamination concentration and water content. Note that both resistivity and complex resistivity methods are necessary to adequately evaluate pollution at various sites.

  13. Sampling and Data Analysis for Environmental Microbiology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murray, Christopher J.

    2001-06-01

    A brief review of the literature indicates the importance of statistical analysis in applied and environmental microbiology. Sampling designs are particularly important for successful studies, and it is highly recommended that researchers review their sampling design before heading to the laboratory or the field. Most statisticians have numerous stories of scientists who approached them after their study was complete only to have to tell them that the data they gathered could not be used to test the hypothesis they wanted to address. Once the data are gathered, a large and complex body of statistical techniques is available for analysis of the data. Those methods include both numerical and graphical techniques for exploratory characterization of the data. Hypothesis testing and analysis of variance (ANOVA) are techniques that can be used to compare the mean and variance of two or more groups of samples. Regression can be used to examine the relationships between sets of variables and is often used to examine the dependence of microbiological populations on microbiological parameters. Multivariate statistics provides several methods that can be used for interpretation of datasets with a large number of variables and to partition samples into similar groups, a task that is very common in taxonomy, but also has applications in other fields of microbiology. Geostatistics and other techniques have been used to examine the spatial distribution of microorganisms. The objectives of this chapter are to provide a brief survey of some of the statistical techniques that can be used for sample design and data analysis of microbiological data in environmental studies, and to provide some examples of their use from the literature.
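
    As a concrete example of the group-comparison step mentioned above, the following hedged Python sketch runs a one-way ANOVA (and a rank-based alternative) on hypothetical log-transformed microbial counts from three sampling sites; the data are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical log10 CFU/g counts from three sampling sites
    site_a = np.array([5.2, 5.5, 5.1, 5.4, 5.3])
    site_b = np.array([5.9, 6.1, 6.0, 5.8, 6.2])
    site_c = np.array([5.3, 5.2, 5.6, 5.4, 5.5])

    # One-way ANOVA: do the site means differ?
    f_stat, p_value = stats.f_oneway(site_a, site_b, site_c)
    print(f"F = {f_stat:.2f}, p = {p_value:.4g}")

    # If ANOVA assumptions (normality, equal variance) look doubtful,
    # a rank-based alternative is the Kruskal-Wallis test:
    h_stat, p_kw = stats.kruskal(site_a, site_b, site_c)
    print(f"H = {h_stat:.2f}, p = {p_kw:.4g}")
    ```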

  14. Mendelian breeding units versus standard sampling strategies: Mitochondrial DNA variation in southwest Sardinia

    PubMed Central

    Sanna, Daria; Pala, Maria; Cossu, Piero; Dedola, Gian Luca; Melis, Sonia; Fresu, Giovanni; Morelli, Laura; Obinu, Domenica; Tonolo, Giancarlo; Secchi, Giannina; Triunfo, Riccardo; Lorenz, Joseph G.; Scheinfeldt, Laura; Torroni, Antonio; Robledo, Renato; Francalacci, Paolo

    2011-01-01

    We report a sampling strategy based on Mendelian Breeding Units (MBUs), representing an interbreeding group of individuals sharing a common gene pool. The identification of MBUs is crucial for case-control experimental design in association studies. The aim of this work was to evaluate the possible existence of bias in terms of genetic variability and haplogroup frequencies in the MBU sample, due to severe sample selection. In order to reach this goal, the MBU sampling strategy was compared to a standard selection of individuals according to their surname and place of birth. We analysed mitochondrial DNA variation (first hypervariable segment and coding region) in unrelated healthy subjects from two different areas of Sardinia: the area around the town of Cabras and the western Campidano area. No statistically significant differences were observed when the two sampling methods were compared, indicating that the stringent sample selection needed to establish a MBU does not alter original genetic variability and haplogroup distribution. Therefore, the MBU sampling strategy can be considered a useful tool in association studies of complex traits. PMID:21734814

  15. Chemical analysis of black crust on the Angkor sandstone at the Bayon temple, Cambodia

    NASA Astrophysics Data System (ADS)

    Song, Wonsuh; Oguchi, Chiaki; Waragai, Tetsuya

    2014-05-01

    The Angkor complex is one of the greatest cultural heritage sites in the world. It was constructed in the early 12th century and designated a World Cultural Heritage site by UNESCO in 1992. The temples at the Angkor complex are mainly made of sandstone and laterite. However, due to the tropical climate, plants, lichens, and various microorganisms grow readily on the rock surface. Black crusts are also easily found on the stone surface. The 21st technical session of the International Coordinating Committee for the Safeguarding and Development of the Historic Site of Angkor (ICC-Angkor), held in 2012, recommended preserving both the biofilms and the forest cover and prohibiting chlorine-based and organic biocides. However, there are many reports that lichens and microorganisms accelerate rock weathering. It is therefore important to clarify how the biofilms on the Angkor temples affect the Angkor sandstone. We sampled Angkor sandstone covered by black crust at the Bayon temple, Angkor complex, and observed the section and the surface of the rock sample using SEM. The surfaces of the samples were not polished, in order to observe their original condition, and the samples were coated with gold for 180 seconds. The depth of the black crust is up to 1 mm. Many filamentous materials were found on the black crust. Average energy-dispersive X-ray spectroscopy data from five areas of ca. 20 μm × 15 μm in the black crusts show that over 80% of the filamentous material consists of carbon compounds. These materials appear to be hyphae: thread-like structures a few μm in diameter and up to several centimeters in length. The black crusts consist of carbon compounds and the elements Na, Mg, Al, Si, Cl, K, Ca, and Fe. Further research is needed to determine the most appropriate conservation approach for the Angkor complex.

  16. Manipulating the motion of large molecules: Information from the molecular frame

    NASA Astrophysics Data System (ADS)

    Küpper, Jochen

    2011-05-01

    Large molecules have complex potential-energy surfaces with many local minima. They exhibit multiple stereoisomers, even at the low temperatures (~1 K) in a molecular beam, with rich intra- and intermolecular dynamics. Over the last years, we have developed methods to manipulate the motion of large, complex molecules and to select their quantum states. We have exploited this state-selectivity, for example, to spatially separate individual structural isomers of complex molecules and to demonstrate unprecedented degrees of laser alignment and mixed-field orientation of these molecules. Such clean, well-defined samples strongly benefit, or simply allow, novel experiments on the dynamics of complex molecules, for instance, femtosecond pump-probe measurements, X-ray or electron diffraction of molecular ensembles (including diffraction-from-within experiments), or tomographic reconstructions of molecular orbitals. These samples could also be very advantageous for metrology applications, such as, for example, matter-wave interferometry or the search for electroweak interactions in chiral molecules. Moreover, they provide an extreme level of control for stereo-dynamically controlled reaction dynamics. We have recently exploited these state-selected and oriented samples to measure photoelectron angular distributions in the molecular frame (MFPADs) from non-resonant femtosecond-laser photoionization and using the X-ray Free-Electron-Laser LCLS. We have also investigated X-ray diffraction imaging and, using ion momentum imaging, the induced radiation damage of these samples using the LCLS. This work was carried out within a collaboration for which J. Küpper, H. Chapman, and D. Rolles are spokespersons. The collaboration consists of CFEL (DESY, MPG, University Hamburg), Fritz-Haber-Institute Berlin, MPI Nuclear Physics Heidelberg, MPG Semi-conductor Lab, Aarhus University, FOM AMOLF Amsterdam, Lund University, MPI Medical Research Heidelberg, TU Berlin, Max Born Institute Berlin, and SLAC Menlo Park, CA, USA. The experiments were carried out using CAMP (designed and built by the MPG-ASG at CFEL) at the LCLS (operated by Stanford University on behalf of the US DOE).

  17. Probing the effect of electron channelling on atomic resolution energy dispersive X-ray quantification.

    PubMed

    MacArthur, Katherine E; Brown, Hamish G; Findlay, Scott D; Allen, Leslie J

    2017-11-01

    Advances in microscope stability, aberration correction and detector design now make it readily possible to achieve atomic resolution energy dispersive X-ray mapping for dose resilient samples. These maps show impressive atomic-scale qualitative detail as to where the elements reside within a given sample. Unfortunately, while electron channelling is exploited to provide atomic resolution data, this very process makes the images rather more complex to interpret quantitatively than if no electron channelling occurred. Here we propose small sample tilt as a means for suppressing channelling and improving quantification of composition, whilst maintaining atomic-scale resolution. Only by knowing composition and thickness of the sample is it possible to determine the atomic configuration within each column. The effects of neighbouring atomic columns with differing composition and of residual channelling on our ability to extract exact column-by-column composition are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Improving the sampling efficiency of Monte Carlo molecular simulations: an evolutionary approach

    NASA Astrophysics Data System (ADS)

    Leblanc, Benoit; Braunschweig, Bertrand; Toulhoat, Hervé; Lutton, Evelyne

    We present a new approach to improve the convergence of Monte Carlo (MC) simulations of molecular systems with complex energy landscapes: the problem is redefined in terms of the dynamic allocation of MC move frequencies depending on their past efficiency, measured with respect to a relevant sampling criterion. We introduce various empirical criteria with the aim of accounting for proper convergence in phase space sampling. The dynamic allocation is performed over parallel simulations by means of a new evolutionary algorithm involving 'immortal' individuals. The method is benchmarked against conventional procedures on a model of a linear polyethylene melt. We record significant improvement in sampling efficiency, and thus in computational load, while the optimal sets of move frequencies can provide interesting physical insights into the particular systems simulated. This last aspect should provide a new tool for designing more efficient MC moves.
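
    A stripped-down sketch of the underlying idea, dynamically re-weighting Monte Carlo move types according to their recent efficiency, is shown below. The efficiency measure here is simply the acceptance rate on a toy one-dimensional system, whereas the paper uses richer phase-space sampling criteria and an evolutionary algorithm across parallel simulations.

    ```python
    import math
    import random

    # Toy 1-D system: sample x ~ exp(-x**2) with two move types of different step
    # sizes. Move frequencies are periodically re-allocated in proportion to each
    # move's recent acceptance rate (a crude stand-in for the paper's criteria).

    def accept(delta_e, beta=1.0):
        return delta_e <= 0 or random.random() < math.exp(-beta * delta_e)

    moves = {"small": 0.1, "large": 2.0}           # trial step sizes
    weights = {name: 0.5 for name in moves}        # initial move frequencies
    tallies = {name: [0, 0] for name in moves}     # [accepted, attempted]

    x = 5.0
    for step in range(10_000):
        name = random.choices(list(moves), weights=[weights[m] for m in moves])[0]
        trial = x + random.uniform(-moves[name], moves[name])
        tallies[name][1] += 1
        if accept(trial ** 2 - x ** 2):
            x = trial
            tallies[name][0] += 1
        if (step + 1) % 1_000 == 0:                # re-allocate move frequencies
            rates = {m: (a + 1) / (n + 2) for m, (a, n) in tallies.items()}
            total = sum(rates.values())
            weights = {m: r / total for m, r in rates.items()}
            tallies = {m: [0, 0] for m in moves}

    print("final move frequencies:", weights)
    ```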

  19. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  20. Real-time analysis of dual-display phage immobilization and autoantibody screening using quartz crystal microbalance with dissipation monitoring.

    PubMed

    Rajaram, Kaushik; Losada-Pérez, Patricia; Vermeeren, Veronique; Hosseinkhani, Baharak; Wagner, Patrick; Somers, Veerle; Michiels, Luc

    2015-01-01

    Over the last three decades, phage display technology has been used for the display of target-specific biomarkers, peptides, antibodies, etc. Phage display-based assays are mostly limited to the phage ELISA, which is notorious for its high background signal and laborious methodology. These problems have recently been overcome by designing a dual-display phage with two different end functionalities, namely, a streptavidin (STV)-binding protein at one end and a rheumatoid arthritis-specific autoantigenic target at the other end. Using this dual-display phage, a much higher sensitivity in screening the specificities of autoantibodies in complex serum samples was detected compared to the single-display phage system in phage ELISA. Herein, we aimed to develop a novel, rapid, and sensitive dual-display phage assay to detect the presence of autoantibodies in serum samples using quartz crystal microbalance with dissipation monitoring as the sensing platform. The vertical functionalization of the phage over the STV-modified surfaces resulted in clear frequency and dissipation shifts revealing a well-defined viscoelastic signature. Screening for autoantibodies using antihuman IgG-modified surfaces and the dual-display phage with STV magnetic bead complexes allowed the target entities to be isolated from complex mixtures and achieved a large response compared to negative control samples. This novel dual-display strategy can be a potential alternative to time-consuming phage ELISA protocols for the qualitative analysis of serum autoantibodies and can be taken as a departure point to ultimately achieve a point-of-care diagnostic system.

  1. Self assembly of rectangular shapes on concentration programming and probabilistic tile assembly models.

    PubMed

    Kundeti, Vamsi; Rajasekaran, Sanguthevar

    2012-06-01

    Efficient tile sets for self assembling rectilinear shapes are of critical importance in algorithmic self assembly. A lower bound on the tile complexity of any deterministic self assembly system for an n × n square is [Formula: see text] (inferred from the Kolmogorov complexity). Deterministic self assembly systems with an optimal tile complexity have been designed for squares and related shapes in the past. However, designing [Formula: see text] unique tiles specific to a shape is still an intensive task in the laboratory. On the other hand, copies of a tile can be made rapidly using PCR (polymerase chain reaction) experiments. This led to the study of self assembly on tile concentration programming models. We present two major results in this paper on the concentration programming model. First, we show how to self assemble rectangles with a fixed aspect ratio (α:β), with high probability, using Θ(α + β) tiles. This result is much stronger than the existing results by Kao et al. (Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008) and Doty (Randomized self-assembly for exact shapes. In: proceedings of the 50th annual IEEE symposium on foundations of computer science (FOCS), IEEE, Atlanta, pp 85-94, 2009), which can only self assemble squares and rely on tiles that perform binary arithmetic. Our result, by contrast, is based on a technique called staircase sampling. This technique eliminates the need for sub-tiles that perform binary arithmetic, reduces the constant in the asymptotic bound, and eliminates the need for approximate frames (Kao et al. Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008). Our second result applies staircase sampling to the equimolar concentration programming model (The tile complexity of linear assemblies. In: proceedings of the 36th international colloquium on automata, languages and programming: Part I, ICALP '09, Springer-Verlag, pp 235-253, 2009) to self assemble rectangles (of fixed aspect ratio) with high probability. The tile complexity of our algorithm is Θ(log(n)) and is optimal on the probabilistic tile assembly model (PTAM), with n being an upper bound on the dimensions of a rectangle.

  2. A novel conformation of gel grown biologically active cadmium nicotinate

    NASA Astrophysics Data System (ADS)

    Nair, Lekshmi P.; Bijini, B. R.; Divya, R.; Nair, Prabitha B.; Eapen, S. M.; Dileep Kumar, B. S.; Nishanth Kumar, S.; Nair, C. M. K.; Deepa, M.; Rajendra Babu, K.

    2017-11-01

    The elimination of toxic heavy metals by the formation of stable co-ordination compounds with biologically active ligands is applicable in drug design. A new crystalline complex of cadmium with nicotinic acid was grown at ambient temperature using the single gel diffusion method; its crystal structure is different from those already reported. Single-crystal X-ray diffraction reveals a crystal structure belonging to the monoclinic system, space group P21/c, with cell dimensions a = 17.220(2) Å, b = 10.2480(2) Å, c = 7.229(9) Å, β = 91.829(4)°. Powder X-ray diffraction analysis confirmed the crystallinity of the sample. The unidentate mode of co-ordination between the metal atom and the carboxylate group is supported by Fourier transform infrared spectral data. Thermal analysis confirms the thermal stability of the complex, and kinetic and thermodynamic parameters are also calculated. The stoichiometry of the complex is confirmed by elemental analysis. UV-visible spectral analysis shows the wide transparency window of the complex in the visible region, and the band gap of the complex is found to be 3.92 eV. The complex shows excellent antibacterial and antifungal activity.

  3. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original statistical procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965

  4. Mars Science Laboratory CHIMRA/IC/DRT Flight Software for Sample Acquisition and Processing

    NASA Technical Reports Server (NTRS)

    Kim, Won S.; Leger, Chris; Carsten, Joseph; Helmick, Daniel; Kuhn, Stephen; Redick, Richard; Trujillo, Diana

    2013-01-01

    The design methodologies of using sequence diagrams, multi-process functional flow diagrams, and hierarchical state machines were successfully applied in designing three MSL (Mars Science Laboratory) flight software modules responsible for handling actuator motions of the CHIMRA (Collection and Handling for In Situ Martian Rock Analysis), IC (Inlet Covers), and DRT (Dust Removal Tool) mechanisms. The methodologies were essential to specify complex interactions with other modules, support concurrent foreground and background motions, and handle various fault protections. Studying task scenarios with multi-process functional flow diagrams yielded great insight to overall design perspectives. Since the three modules require three different levels of background motion support, the methodologies presented in this paper provide an excellent comparison. All three modules are fully operational in flight.

  5. Rapid Design of Knowledge-Based Scoring Potentials for Enrichment of Near-Native Geometries in Protein-Protein Docking.

    PubMed

    Sasse, Alexander; de Vries, Sjoerd J; Schindler, Christina E M; de Beauchêne, Isaure Chauvot; Zacharias, Martin

    2017-01-01

    Protein-protein docking protocols aim to predict the structures of protein-protein complexes based on the structure of individual partners. Docking protocols usually include several steps of sampling, clustering, refinement and re-scoring. The scoring step is one of the bottlenecks in the performance of many state-of-the-art protocols. The performance of scoring functions depends on the quality of the generated structures and its coupling to the sampling algorithm. A tool kit, GRADSCOPT (GRid Accelerated Directly SCoring OPTimizing), was designed to allow rapid development and optimization of different knowledge-based scoring potentials for specific objectives in protein-protein docking. Different atomistic and coarse-grained potentials can be created by a grid-accelerated directly scoring dependent Monte-Carlo annealing or by a linear regression optimization. We demonstrate that the scoring functions generated by our approach are similar to or even outperform state-of-the-art scoring functions for predicting near-native solutions. Of additional importance, we find that potentials specifically trained to identify the native bound complex perform rather poorly on identifying acceptable or medium quality (near-native) solutions. In contrast, atomistic long-range contact potentials can increase the average fraction of near-native poses by up to a factor 2.5 in the best scored 1% decoys (compared to existing scoring), emphasizing the need of specific docking potentials for different steps in the docking protocol.
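
    One of the two optimization routes mentioned, linear regression, can be sketched compactly: if each docking decoy is described by a feature vector of contact counts per atom-type pair and distance bin, a linear scoring potential is simply a weight vector fit against a quality label. The features, labels, and shapes below are invented placeholders, not GRADSCOPT's actual descriptors or training targets.

    ```python
    import numpy as np

    # Hypothetical training data: each row counts contacts per (atom-type pair,
    # distance bin) for one docking decoy; y is a quality label to regress
    # against (e.g., higher = closer to native). Shapes are illustrative only.
    rng = np.random.default_rng(0)
    n_decoys, n_features = 200, 50
    X = rng.poisson(3.0, size=(n_decoys, n_features)).astype(float)   # contact counts
    true_w = rng.normal(size=n_features)
    y = X @ true_w + rng.normal(scale=0.5, size=n_decoys)             # synthetic labels

    # Fit the per-feature weights of the linear scoring potential by least squares
    w, *_ = np.linalg.lstsq(X, y, rcond=None)

    # Scoring a new decoy is just a dot product with its contact vector
    new_decoy = rng.poisson(3.0, size=n_features).astype(float)
    print("predicted score:", float(new_decoy @ w))
    ```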

  6. Rapid DNA analysis for automated processing and interpretation of low DNA content samples.

    PubMed

    Turingan, Rosemary S; Vasantgadkar, Sameer; Palombo, Luke; Hogan, Catherine; Jiang, Hua; Tan, Eugene; Selden, Richard F

    2016-01-01

    Casework samples with low DNA content suitable for short tandem repeat (STR) analysis include those resulting from the transfer of epithelial cells from the skin to an object (e.g., cells on a water bottle or the brim of a cap), blood spatter stains, and small bone and tissue fragments. Low DNA content (LDC) samples are important in a wide range of settings, including disaster response teams to assist in victim identification and family reunification, military operations to identify friend or foe, criminal forensics to identify suspects and exonerate the innocent, and medical examiner and coroner offices to identify missing persons. Processing LDC samples requires experienced laboratory personnel, isolated workstations, and sophisticated equipment; it also requires transport time and involves complex procedures. We present a rapid DNA analysis system designed specifically to generate STR profiles from LDC samples in field-forward settings by non-technical operators. By performing STR analysis in the field, close to the site of collection, rapid DNA analysis has the potential to increase throughput and to provide actionable information in real time. A Low DNA Content BioChipSet (LDC BCS) was developed and manufactured by injection molding. It was designed to function in the fully integrated Accelerated Nuclear DNA Equipment (ANDE) instrument previously designed for analysis of buccal swab and other high DNA content samples (Investigative Genet. 4(1):1-15, 2013). The LDC BCS performs efficient DNA purification followed by microfluidic ultrafiltration of the purified DNA, maximizing the quantity of DNA available for subsequent amplification and electrophoretic separation and detection of amplified fragments. The system demonstrates accuracy, precision, resolution, signal strength, and peak height ratios appropriate for casework analysis. The LDC rapid DNA analysis system is effective for the generation of STR profiles from a wide range of sample types. The technology broadens the range of sample types that can be processed and minimizes the time between sample collection, sample processing and analysis, and generation of actionable intelligence. The fully integrated Expert System is capable of interpreting a wide range of sample types and input DNA quantities, allowing samples to be processed and interpreted without a technical operator.

  7. Bounds on the sample complexity for private learning and private data release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kasiviswanathan, Shiva; Beime, Amos; Nissim, Kobbi

    2009-01-01

    Learning is a task that generalizes many of the analyses that are applied to collections of data, and in particular, collections of sensitive individual information. Hence, it is natural to ask what can be learned while preserving individual privacy. [Kasiviswanathan, Lee, Nissim, Raskhodnikova, and Smith; FOCS 2008] initiated such a discussion. They formalized the notion of private learning, as a combination of PAC learning and differential privacy, and investigated what concept classes can be learned privately. Somewhat surprisingly, they showed that, ignoring time complexity, every PAC learning task could be performed privately with polynomially many samples, and in many natural cases this could even be done in polynomial time. While these results seem to equate non-private and private learning, there is still a significant gap: the sample complexity of (non-private) PAC learning is crisply characterized in terms of the VC-dimension of the concept class, whereas this relationship is lost in the constructions of private learners, which exhibit, generally, a higher sample complexity. Looking into this gap, we examine several private learning tasks and give tight bounds on their sample complexity. In particular, we show strong separations between sample complexities of proper and improper private learners (such separation does not exist for non-private learners), and between sample complexities of efficient and inefficient proper private learners. Our results show that VC-dimension is not the right measure for characterizing the sample complexity of proper private learning. We also examine the task of private data release (as initiated by [Blum, Ligett, and Roth; STOC 2008]), and give new lower bounds on the sample complexity. Our results show that the logarithmic dependence on size of the instance space is essential for private data release.
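
    For reference, the crisp (non-private) PAC characterization alluded to here can be stated, up to constant factors, in terms of the VC dimension d of the concept class; in the realizable case the standard upper and lower bounds on the sample complexity are:

    ```latex
    % Standard realizable-case PAC bounds (up to constants); d = VC-dimension of
    % the concept class, epsilon = accuracy, delta = confidence parameter.
    m(\epsilon,\delta) = O\!\left(\frac{d \log(1/\epsilon) + \log(1/\delta)}{\epsilon}\right),
    \qquad
    m(\epsilon,\delta) = \Omega\!\left(\frac{d + \log(1/\delta)}{\epsilon}\right).
    ```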

  8. Glass sample characterization

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees

    1990-01-01

    The development of an in-house integrated optical performance modelling capability at MSFC is described. This performance model will take into account the effects of structural and thermal distortions, as well as metrology errors in optical surfaces, to predict the performance of large and complex optical systems, such as the Advanced X-Ray Astrophysics Facility. The necessary hardware and software were identified to implement an integrated optical performance model. A number of design, development, and testing tasks were supported, including identification of the debonded mirror pad and rebuilding of the Technology Mirror Assembly. Over 300 samples of Zerodur were prepared in different sizes and shapes for acid etching, coating, and polishing experiments to characterize the subsurface damage and stresses produced by the grinding and polishing operations.

  9. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    USGS Publications Warehouse

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
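
    A compact illustration of the SISR machinery described here is a bootstrap particle filter on a toy state-space model. The random-walk state process, Gaussian observation noise, and plain multinomial resampling below are generic textbook choices, not the authors' population model or their kernel-smoothing scheme.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy state-space model: latent log-abundance follows a random walk,
    # observations are the latent state plus Gaussian noise.
    T, n_particles = 25, 2000
    true_x = np.cumsum(rng.normal(0, 0.1, T)) + 3.0
    obs = true_x + rng.normal(0, 0.3, T)

    particles = rng.normal(3.0, 1.0, n_particles)         # prior draws for x_0
    estimates = []
    for y in obs:
        particles += rng.normal(0, 0.1, n_particles)      # propagate (importance sampling)
        logw = -0.5 * ((y - particles) / 0.3) ** 2        # Gaussian observation weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))    # filtered mean
        idx = rng.choice(n_particles, n_particles, p=w)   # multinomial resampling
        particles = particles[idx]

    print("last filtered estimate vs truth:", estimates[-1], true_x[-1])
    ```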

  10. Characterization of microbial communities in heavy crude oil from Saudi Arabia.

    PubMed

    Albokari, Majed; Mashhour, Ibrahim; Alshehri, Mohammed; Boothman, Chris; Al-Enezi, Mousa

    The complete mineralization of crude oil into carbon dioxide, water, inorganic compounds and cellular constituents can be carried out as part of a bioremediation strategy. This involves the transformation of complex organic contaminants into simpler organic compounds by microbial communities, mainly bacteria. A crude oil sample and an oil sludge sample were obtained from Saudi ARAMCO Oil Company and investigated to identify the microbial communities present using PCR-based culture-independent techniques. In total, analysis of 177 clones yielded 30 distinct bacterial sequences. Clone library analysis showed that the oil sample contained Bacillus, Clostridia and Gammaproteobacteria species, while the sludge sample revealed the presence of members of the Alphaproteobacteria, Betaproteobacteria, Gammaproteobacteria, Clostridia, Sphingobacteria and Flavobacteria. The dominant bacterial classes identified in the oil and sludge samples were Bacilli and Flavobacteria, respectively. Phylogenetic analysis showed that the dominant bacterium in the oil sample has the closest sequence identity to Enterococcus aquimarinus and the dominant bacterium in the sludge sample is most closely related to the uncultured Bacteroidetes bacterium designated AH.KK.

  11. Design and Validation of a Compressive Tissue Stimulator with High-Throughput Capacity and Real-Time Modulus Measurement Capability

    PubMed Central

    Salvetti, David J.; Pino, Christopher J.; Manuel, Steven G.; Dallmeyer, Ian; Rangarajan, Sanjeet V.; Meyer, Tobias; Kotov, Misha

    2012-01-01

    Mechanical stimulation has been shown to impact the properties of engineered hyaline cartilage constructs and is relevant for engineering of cartilage and osteochondral tissues. Most mechanical stimulators developed to date emphasize precision over adaptability to standard tissue culture equipment and protocols. The realization of mechanical characteristics in engineered constructs approaching native cartilage requires the optimization of complex variables (type of stimulus, regimen, and biomolecular signals). We have proposed and validated a stimulator design that focuses on high construct capacity, compatibility with tissue culture plastic ware, and regimen adaptability to maximize throughput. This design utilizes thin force sensors in lieu of a load cell and a linear encoder to verify position. The implementation of an individual force sensor for each sample enables the measurement of Young's modulus while stimulating the sample. Removable and interchangeable Teflon plungers mounted using neodymium magnets contact each sample. Variations in plunger height and design can vary the strain and force type on individual samples. This allows for the evaluation of a myriad of culture conditions and regimens simultaneously. The system was validated using contact accuracy and Young's modulus measurement range as key parameters. Contact accuracy for the system was excellent, within 1.16% error of the construct height compared with micrometer measurements. Biomaterials ranging from bioceramics (cancellous bone, 123 MPa) to soft gels (1% agarose, 20 kPa) can be measured without any modification to the device. The accuracy of measurements in conjunction with the wide range of moduli tested demonstrates the unique characteristics of the device and the feasibility of using this device in mapping real-time changes to Young's modulus of tissue constructs (cartilage, bone) through the developmental phases in ex vivo culture conditions. PMID:21988089
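
    A minimal sketch of the per-sample modulus calculation implied above, under an unconfined-compression assumption (stress = force/area, strain = displacement/height, modulus = slope of the stress-strain sweep); the sample geometry and sensor readings are hypothetical, not taken from the paper.

```python
import numpy as np

def youngs_modulus(force_N, displacement_m, height_m, diameter_m):
    """Estimate Young's modulus of a cylindrical construct from a
    force-displacement sweep (unconfined compression assumption)."""
    area = np.pi * (diameter_m / 2.0) ** 2
    stress = np.asarray(force_N) / area            # Pa
    strain = np.asarray(displacement_m) / height_m
    slope, _intercept = np.polyfit(strain, stress, 1)  # linear fit over the sweep
    return slope

# Hypothetical soft-gel construct: 3 mm tall, 5 mm diameter.
forces = [0.000, 0.002, 0.004, 0.006, 0.008]   # N, from the thin force sensor
disps = [0.0, 15e-6, 30e-6, 45e-6, 60e-6]      # m, from the linear encoder
print(f"E ~ {youngs_modulus(forces, disps, 3e-3, 5e-3) / 1e3:.1f} kPa")
```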

  12. Fidelity in complex behaviour change interventions: a standardised approach to evaluate intervention integrity

    PubMed Central

    Mars, Tom; Ellard, David; Carnes, Dawn; Homer, Kate; Underwood, Martin; Taylor, Stephanie J C

    2013-01-01

    Objectives The aim of this study was to (1) demonstrate the development and testing of tools and procedures designed to monitor and assess the integrity of a complex intervention for chronic pain (COping with persistent Pain, Effectiveness Research into Self-management (COPERS) course); and (2) make recommendations based on our experiences. Design Fidelity assessment of a two-arm randomised controlled trial intervention, assessing the adherence and competence of the facilitators delivering the intervention. Setting The intervention was delivered in the community in two centres in the UK: one inner city and one a mix of rural and urban locations. Participants 403 people with chronic musculoskeletal pain were enrolled in the intervention arm and 300 attended the self-management course. Thirty lay and healthcare professionals were trained and 24 delivered the courses (2 per course). We ran 31 courses for up to 16 people per course and all were audio recorded. Interventions The course was run over three and a half days; facilitators delivered a semistructured manualised course. Outcomes We designed three measures to evaluate fidelity assessing adherence to the manual, competence and overall impression. Results We evaluated a random sample of four components from each course (n=122). The evaluation forms were reliable and had good face validity. There were high levels of adherence in the delivery: overall adherence was two (maximum 2, IQR 1.67–2.00), facilitator competence exhibited more variability, and overall competence was 1.5 (maximum 2, IQR 1.25–2.00). Overall impression was three (maximum 4, IQR 2.00–3.00). Conclusions Monitoring and assessing adherence and competence at the point of intervention delivery can be realised most efficiently by embedding the principles of fidelity measurement within the design stage of complex interventions and the training and assessment of those delivering the intervention. More work is necessary to ensure that more robust systems of fidelity evaluation accompany the growth of complex interventions. Trial Registration ISRCTN No ISRCTN24426731. PMID:24240140

  13. Lanthanide complexes as luminogenic probes to measure sulfide levels in industrial samples.

    PubMed

    Thorson, Megan K; Ung, Phuc; Leaver, Franklin M; Corbin, Teresa S; Tuck, Kellie L; Graham, Bim; Barrios, Amy M

    2015-10-08

    A series of lanthanide-based, azide-appended complexes were investigated as hydrogen sulfide-sensitive probes. Europium complex 1 and Tb complex 3 both displayed a sulfide-dependent increase in luminescence, while Tb complex 2 displayed a decrease in luminescence upon exposure to NaHS. The utility of the complexes for monitoring sulfide levels in industrial oil and water samples was investigated. Complex 3 provided a sensitive measure of sulfide levels in petrochemical water samples (detection limit ∼ 250 nM), while complex 1 was capable of monitoring μM levels of sulfide in partially refined crude oil. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    NASA Astrophysics Data System (ADS)

    Guariniello, Cesare

    The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is not only due to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing, and quantifying properties of the complex system as a whole and of modeling explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities.
A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.
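
    The sketch below is not the SODA/SDDA formulation itself (which the abstract does not reproduce); it only illustrates the general idea of propagating degraded operability through weighted one-to-one dependencies in a system-of-systems graph. The system names, dependency strengths, and update rule are invented for illustration.

```python
def propagate(internal, deps, n_iter=20):
    """Toy cascading-dependency propagation. internal: dict system -> self-operability
    in [0, 100]. deps: dict system -> list of (upstream_system, strength in [0, 1]).
    A system is limited by its own status and, in proportion to dependency strength,
    by the operability of each system it depends on. Iterating lets effects cascade."""
    oper = dict(internal)
    for _ in range(n_iter):
        for sys_name, links in deps.items():
            limits = [100.0 - s * (100.0 - oper[up]) for up, s in links]
            oper[sys_name] = min([internal[sys_name]] + limits)
    return oper

# Invented example: a degraded power system cascades to comms and a rover.
internal = {"power": 40.0, "comms": 100.0, "rover": 100.0}
deps = {"comms": [("power", 0.8)], "rover": [("power", 0.5), ("comms", 0.9)]}
print(propagate(internal, deps))
```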

  15. LIGHT-SABRE enables efficient in-magnet catalytic hyperpolarization

    NASA Astrophysics Data System (ADS)

    Theis, Thomas; Truong, Milton; Coffey, Aaron M.; Chekmenev, Eduard Y.; Warren, Warren S.

    2014-11-01

    Nuclear spin hyperpolarization overcomes the sensitivity limitations of traditional NMR and MRI, but the most general method demonstrated to date (dynamic nuclear polarization) has significant limitations in scalability, cost, and complex apparatus design. As an alternative, signal amplification by reversible exchange (SABRE) of parahydrogen on transition metal catalysts can hyperpolarize a variety of substrates, but to date this scheme has required transfer of the sample to low magnetic field or very strong RF irradiation. Here we demonstrate "Low-Irradiation Generation of High Tesla-SABRE" (LIGHT-SABRE) which works with simple pulse sequences and low power deposition; it should be usable at any magnetic field and for hyperpolarization of many different nuclei. This approach could drastically reduce the cost and complexity of producing hyperpolarized molecules.

  16. Damage Detection in Rotorcraft Composite Structures Using Thermography and Laser-Based Ultrasound

    NASA Technical Reports Server (NTRS)

    Anastasi, Robert F.; Zalameda, Joseph N.; Madaras, Eric I.

    2004-01-01

    New rotorcraft structural composite designs incorporate lower structural weight, reduced manufacturing complexity, and improved threat protection. These new structural concepts require nondestructive evaluation inspection technologies that can potentially be field-portable and able to inspect complex geometries for damage or structural defects. Two candidate technologies were considered: Thermography and Laser-Based Ultrasound (Laser UT). Thermography and Laser UT have the advantage of being non-contact inspection methods, with Thermography being a full-field imaging method and Laser UT a point scanning technique. These techniques were used to inspect composite samples that contained both embedded flaws and impact damage of various sizes and shapes. Results showed that the inspection techniques were able to detect both embedded and impact damage with varying degrees of success.

  17. Estrogen-, androgen- and aryl hydrocarbon receptor mediated activities in passive and composite samples from municipal waste and surface waters.

    PubMed

    Jálová, V; Jarošová, B; Bláha, L; Giesy, J P; Ocelka, T; Grabic, R; Jurčíková, J; Vrana, B; Hilscherová, K

    2013-09-01

    Passive and composite sampling in combination with in vitro bioassays and identification and quantification of individual chemicals were applied to characterize pollution by compounds with several specific modes of action in an urban area in the basin of two rivers, with 400,000 inhabitants and a variety of industrial activities. Two types of passive samplers, semipermeable membrane devices (SPMD) for hydrophobic contaminants and polar organic chemical integrative samplers (POCIS) for polar compounds such as pesticides and pharmaceuticals, were used to sample wastewater treatment plant (WWTP) influent and effluent as well as rivers upstream and downstream of the urban complex and the WWTP. Compounds with endocrine disruptive potency were detected in river water and WWTP influent and effluent. Year-round, monthly assessment of waste waters by bioassays documented estrogenic, androgenic and dioxin-like potency as well as cytotoxicity in influent waters of the WWTP and allowed characterization of seasonal variability of these biological potentials in waste waters. The WWTP effectively removed cytotoxic compounds, xenoestrogens and xenoandrogens. There was significant variability in treatment efficiency of dioxin-like potency. The study indicates that the WWTP, despite its up-to-date technology, can contribute endocrine disrupting compounds to the river. Riverine samples exhibited dioxin-like, antiestrogenic and antiandrogenic potencies. The study design enabled characterization of effects of the urban complex and the WWTP on the river. Concentrations of PAHs and contaminants and specific biological potencies sampled by POCIS decreased as a function of distance from the city. © 2013.

  18. Optimizing liquid effluent monitoring at a large nuclear complex.

    PubMed

    Chou, Charissa J; Barnett, D Brent; Johnson, Vernon G; Olson, Phil M

    2003-12-01

    Effluent monitoring typically requires a large number of analytes and samples during the initial or startup phase of a facility. Once a baseline is established, the analyte list and sampling frequency may be reduced. Although there is a large body of literature relevant to the initial design, few, if any, published papers exist on updating established effluent monitoring programs. This paper statistically evaluates four years of baseline data to optimize the liquid effluent monitoring efficiency of a centralized waste treatment and disposal facility at a large defense nuclear complex. Specific objectives were to: (1) assess temporal variability in analyte concentrations, (2) determine operational factors contributing to waste stream variability, (3) assess the probability of exceeding permit limits, and (4) streamline the sampling and analysis regime. Results indicated that the probability of exceeding permit limits was one in a million under normal facility operating conditions, sampling frequency could be reduced, and several analytes could be eliminated. Furthermore, indicators such as gross alpha and gross beta measurements could be used in lieu of more expensive specific isotopic analyses (radium, cesium-137, and strontium-90) for routine monitoring. Study results were used by the state regulatory agency to modify monitoring requirements for a new discharge permit, resulting in an annual cost savings of US $223,000. This case study demonstrates that statistical evaluation of effluent contaminant variability coupled with process knowledge can help plant managers and regulators streamline analyte lists and sampling frequencies based on detection history and environmental risk.
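
    A hedged sketch of the kind of exceedance-probability calculation described above, assuming baseline concentrations are adequately described by a lognormal distribution; the analyte values and permit limit are invented, and the actual study may have used a different distributional model.

```python
import numpy as np
from scipy import stats

# Hypothetical baseline effluent concentrations (ug/L) and a permit limit.
baseline = np.array([1.2, 0.8, 1.5, 1.1, 0.9, 1.3, 1.0, 1.4, 0.7, 1.2])
permit_limit = 10.0

# Fit a lognormal on the log scale and estimate P(concentration > limit).
mu = np.log(baseline).mean()
sigma = np.log(baseline).std(ddof=1)
p_exceed = stats.norm.sf(np.log(permit_limit), loc=mu, scale=sigma)
print(f"Estimated exceedance probability: {p_exceed:.1e}")
```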

  19. Sensemaking in a Value Based Context for Large Scale Complex Engineered Systems

    NASA Astrophysics Data System (ADS)

    Sikkandar Basha, Nazareen

    The design and the development of Large-Scale Complex Engineered Systems (LSCES) requires the involvement of multiple teams and numerous levels of the organization and interactions with large numbers of people and interdisciplinary departments. Traditionally, requirements-driven Systems Engineering (SE) is used in the design and development of these LSCES. The requirements are used to capture the preferences of the stakeholder for the LSCES. Due to the complexity of the system, multiple levels of interactions are required to elicit the requirements of the system within the organization. Since LSCES involves people and interactions between the teams and interdisciplinary departments, it should be socio-technical in nature. The elicitation of the requirements of most large-scale system projects is subject to creep in time and cost due to the uncertainty and ambiguity of requirements during design and development. In an organization structure, cost and time overruns can occur at any level and iterate back and forth, thus increasing the cost and time. To avoid such creep, past research has shown that rigorous approaches such as value-based design can be used to control it. But before the rigorous approaches can be used, the decision maker should have a proper understanding of requirements creep and the state of the system when the creep occurs. Sensemaking is used to understand the state of the system when the creep occurs and to provide guidance to the decision maker. This research proposes the use of the Cynefin framework, a sensemaking framework that can be used in the design and development of LSCES. It can aid in understanding the system and in decision making to minimize the value gap due to requirements creep by eliminating the ambiguity that occurs during design and development. A sample hierarchical organization is used to demonstrate the state of the system at the occurrence of requirements creep in terms of cost and time using the Cynefin framework. These trials are continued for different requirements and at different sub-system levels. The results obtained show that the Cynefin framework can be used to improve the value of the system and can be used for predictive analysis. Decision makers can use these findings, together with rigorous approaches, to improve the design of Large-Scale Complex Engineered Systems.

  20. A developmental approach to complex PTSD: childhood and adult cumulative trauma as predictors of symptom complexity.

    PubMed

    Cloitre, Marylene; Stolbach, Bradley C; Herman, Judith L; van der Kolk, Bessel; Pynoos, Robert; Wang, Jing; Petkova, Eva

    2009-10-01

    Exposure to multiple traumas, particularly in childhood, has been proposed to result in a complex of symptoms that includes posttraumatic stress disorder (PTSD) as well as a constrained, but variable group of symptoms that highlight self-regulatory disturbances. The relationship between accumulated exposure to different types of traumatic events and total number of different types of symptoms (symptom complexity) was assessed in an adult clinical sample (N = 582) and a child clinical sample (N = 152). Childhood cumulative trauma but not adulthood trauma predicted increasing symptom complexity in adults. Cumulative trauma predicted increasing symptom complexity in the child sample. Results suggest that Complex PTSD symptoms occur in both adult and child samples in a principled, rule-governed way and that childhood experiences significantly influenced adult symptoms. Copyright © 2009 International Society for Traumatic Stress Studies.

  1. Simple and Rapid Determination of Ferulic Acid Levels in Food and Cosmetic Samples Using Paper-Based Platforms

    PubMed Central

    Tee-ngam, Prinjaporn; Nunant, Namthip; Rattanarat, Poomrat; Siangproh, Weena; Chailapakul, Orawon

    2013-01-01

    Ferulic acid is an important phenolic antioxidant found in or added to diet supplements, beverages, and cosmetic creams. Two designs of paper-based platforms for the fast, simple and inexpensive evaluation of ferulic acid contents in food and pharmaceutical cosmetics were evaluated. The first, a paper-based electrochemical device, was developed for ferulic acid detection in uncomplicated matrix samples and was created by the photolithographic method. The second, a paper-based colorimetric device, was preceded by thin layer chromatography (TLC) for the separation and detection of ferulic acid in complex samples using a silica plate stationary phase and an 85:15:1 (v/v/v) chloroform: methanol: formic acid mobile phase. After separation, the ferulic acid-containing section of the TLC plate was attached onto the patterned paper containing the colorimetric reagent and eluted with ethanol. The resulting color change was photographed and quantitatively converted to intensity. Under the optimal conditions, the limit of detection of ferulic acid was found to be 1 ppm and 7 ppm (S/N = 3) for the first and second designs, respectively, with good agreement with the standard HPLC-UV detection method. Therefore, these methods can be used for the simple, rapid, inexpensive and sensitive quantification of ferulic acid in a variety of samples. PMID:24077320
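
    As a minimal illustration of how a detection limit like those quoted can be derived from a calibration curve under the S/N = 3 convention (LOD = 3 x blank standard deviation / calibration slope); the intensity values below are invented, not the paper's data.

```python
import numpy as np

# Hypothetical colorimetric calibration: intensity vs. ferulic acid (ppm).
conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
intensity = np.array([1.0, 5.2, 11.5, 21.8, 42.6])
blank_replicates = np.array([0.9, 1.1, 1.0, 0.8, 1.2])

slope, intercept = np.polyfit(conc, intensity, 1)
lod = 3.0 * blank_replicates.std(ddof=1) / slope   # S/N = 3 convention
print(f"slope = {slope:.2f} intensity/ppm, LOD ~ {lod:.2f} ppm")
```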

  2. A Modular Low-Complexity ECG Delineation Algorithm for Real-Time Embedded Systems.

    PubMed

    Bote, Jose Manuel; Recas, Joaquin; Rincon, Francisco; Atienza, David; Hermida, Roman

    2018-03-01

    This work presents a new modular and low-complexity algorithm for the delineation of the different ECG waves (QRS, P and T peaks, onsets, and ends). Involving a reduced number of operations per second and having a small memory footprint, this algorithm is intended to perform real-time delineation on resource-constrained embedded systems. The modular design allows the algorithm to automatically adjust the delineation quality at runtime over a wide range of modes and sampling rates, from an ultralow-power mode when no arrhythmia is detected, in which the ECG is sampled at low frequency, to a complete high-accuracy delineation mode in the case of arrhythmia, in which the ECG is sampled at high frequency and all the ECG fiducial points are detected. The delineation algorithm has been adjusted using the QT database, providing very high sensitivity and positive predictivity, and validated with the MIT database. The errors in the delineation of all the fiducial points are below the tolerances given by the Common Standards for Electrocardiography Committee in the high-accuracy mode, except for the P wave onset, for which the algorithm is above the agreed tolerances by only a fraction of the sample duration. The computational load for the ultralow-power 8-MHz TI MSP430 series microcontroller ranges from 0.2% to 8.5% according to the mode used.
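
    The delineation algorithm itself is not given in the abstract; the sketch below only conveys the flavor of low-complexity QRS detection (differentiate, square, moving-window integrate, threshold with a refractory period). It is not the authors' algorithm, and the sampling rate, window, and threshold are assumptions.

```python
import numpy as np

def detect_qrs(ecg, fs=250, win_ms=120, thresh_frac=0.4, refractory_ms=200):
    """Tiny QRS detector: differentiate, square, integrate over a short window,
    then flag threshold crossings separated by a refractory period.
    Illustrative only; real embedded delineators are considerably more refined."""
    energy = np.diff(ecg) ** 2
    win = max(1, int(fs * win_ms / 1000))
    integrated = np.convolve(energy, np.ones(win) / win, mode="same")
    threshold = thresh_frac * integrated.max()
    refractory = int(fs * refractory_ms / 1000)
    detections, last = [], -refractory
    for i, v in enumerate(integrated):
        if v > threshold and i - last > refractory:
            detections.append(i)
            last = i
    return detections

# Synthetic test: 10 s of noise with a narrow spike every second as a stand-in beat.
fs = 250
sig = 0.05 * np.random.randn(10 * fs)
sig[::fs] += 1.0
print(len(detect_qrs(sig, fs)), "beats detected")   # expect roughly 10
```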

  3. Are posttraumatic stress disorder (PTSD) and complex-PTSD distinguishable within a treatment-seeking sample of Syrian refugees living in Lebanon?

    PubMed

    Hyland, P; Ceannt, R; Daccache, F; Abou Daher, R; Sleiman, J; Gilmore, B; Byrne, S; Shevlin, M; Murphy, J; Vallières, F

    2018-01-01

    The World Health Organization will publish its 11th revision of the International Classification of Diseases (ICD-11) in 2018. The ICD-11 will include a refined model of posttraumatic stress disorder (PTSD) and a new diagnosis of complex PTSD (CPTSD). Whereas emerging data supports the validity of these proposals, the discriminant validity of PTSD and CPTSD has yet to be tested amongst a sample of refugees. Treatment-seeking Syrian refugees (N = 110) living in Lebanon completed an Arabic version of the International Trauma Questionnaire, a measure specifically designed to capture the symptom content of ICD-11 PTSD and CPTSD. In total, 62.6% of the sample met the diagnostic criteria for PTSD or CPTSD. More refugees met the criteria for CPTSD (36.1%) than PTSD (25.2%) and no gender differences were observed. Latent class analysis results identified three distinct groups: (1) a PTSD class, (2) a CPTSD class and (3) a low symptom class. Class membership was significantly predicted by levels of functional impairment. Support for the discriminant validity of ICD-11 PTSD and CPTSD was observed for the first time within a sample of refugees. In support of the cross-cultural validity of the ICD-11 proposals, the prevalence of PTSD and CPTSD were similar to those observed in culturally distinct contexts.

  4. Thermo Physics Facilities Branch Brochure ARC Jet Complex Fact Sheets, Hypervelocity Free-Flight Aerodynamic Facility Fact Sheets, Ames Vertical Gun Range Fact Sheets

    NASA Technical Reports Server (NTRS)

    Fretter, E. F. (Editor); Kuhns, Jay (Editor); Nuez, Jay (Editor)

    2003-01-01

    The Ames Arc Jet Complex has a rich heritage of over 40 years in Thermal Protection System (TPS) development for every NASA Space Transportation and Planetary program, including Apollo, Space Shuttle, Viking, Pioneer-Venus, Galileo, Mars Pathfinder, Stardust, NASP, X-33, X-34, SHARP-B1 and B2, X-37 and Mars Exploration Rovers. With this early TPS history came a long heritage in the development of the arc jet facilities. These are used to simulate the aerodynamic heating that occurs on the nose cap, wing leading edges and on other areas of the spacecraft requiring thermal protection. TPS samples have been run in the arc jets from a few minutes to over an hour, from one exposure to multiple exposures of the same sample, in order to understand the TPS material response to a hot gas flow environment (representative of real hyperthermal environments experienced in flight). The Ames Arc Jet Complex is a key enabler for customers involved in the three major areas of TPS development: selection, validation, and qualification. The arc jet data are critical for validating TPS thermal models, heat shield designs and repairs, and ultimately for flight qualification.

  5. Targeted Modification of Neutron Energy Spectra for National Security Applications

    NASA Astrophysics Data System (ADS)

    Bevins, James Edward

    At its core, research represents an attempt to break from the "this is the way we have always done it" paradigm. This idea is evidenced from the start in this research effort by the problem formulation to develop a new way to generate synthetic debris that mimics the samples that would be collected for forensics purposes following a nuclear weapon attack on the U.S. or its allies. The philosophy is also demonstrated by the design methodology used to solve the synthetic debris problem, using methods not commonly applied to nuclear engineering problems. Through this research, the bounds of what is deemed possible in neutron spectral shaping are moved ever so slightly. A capability for the production of synthetic debris and fission products was developed for the National Ignition Facility (NIF). Synthetic debris has historically been made in a limited fashion using sample doping techniques since the cessation of nuclear weapons testing, but a more robust alternative approach using neutron spectral shaping was proposed and developed by the University of California-Berkeley and Lawrence Livermore National Laboratory (LLNL). Using NIF as a starting source spectrum, the energy tuning assembly (ETA) developed in this work can irradiate samples with a combined thermonuclear and prompt fission neutron spectrum (TN+PFNS). When used with fissile foils, this irradiation will produce a synthetic fission product distribution that is realistic across all mass chains. To design the ETA, traditional parametric point design approaches were discarded in favor of formal optimization techniques. Finding a lack of suitable algorithms in the literature, a metaheuristic-based optimization algorithm, Gnowee, was developed for rapid convergence to nearly globally optimum solutions for complex, constrained engineering problems with mixed-integer and combinatorial design vectors and high-cost, noisy, discontinuous, black box objective function evaluations. Comparisons between Gnowee and several well-established metaheuristic algorithms are made for a set of continuous, mixed-integer, and combinatorial benchmarks. These results demonstrated Gnowee to have superior flexibility and convergence characteristics over a wide range of design spaces. The Gnowee algorithm was implemented in Coeus, a new piece of software, to perform optimization of design problems requiring radiation transport for the evaluation of their objective functions. Currently, Coeus solves ETA optimization problems using hybrid radiation transport (ADVANTG and MCNP) to assess design permutations developed by Gnowee. Future enhancements of Coeus will look to expand the geometries and objective functions considered to those beyond ETA design. Coeus was used to generate an ETA design for the TN+PFNS application on NIF. The design achieved a reasonable match with the objective TN+PFNS and associated fission product distributions within the size and weight constraints imposed by the NIF facility. The ETA design was built by American Elements, and initial validation tests were conducted at the Lawrence Berkeley National Laboratory's 88-Inch Cyclotron. These experiments used foil activation and pulse height spectroscopy to measure the ETA-modified spectrum. Additionally, pulse height spectroscopy measurements were taken as the ETA was built up component-by-component to measure the impact of nuclear data on the ability to model the ETA performance. Some initial analysis of these results is included here.
Finally, an integral validation experiment on NIF was proposed using the Coeus-generated ETA design. A scoping study conducted by LLNL determined the proposed experiment and ETA design are within NIF facility limitations and current radio-chemistry capabilities. The study found that the proposed ETA experiment was "low risk," has "no show stoppers," and has a "reasonable cost." All that is needed is a sponsor to close the last funding gap and bring the experiment to fruition. This research broke with the current sample doping approach and applied neutron spectral shaping to design an ETA that can create realistic synthetic fission and activation products and improve technical nuclear forensics outcomes. However, the ETA presented in this research represents more than a stand-alone point design with a limited scope and application. It is proof of a concept and the product of a unique capability that has a wide range of potential applications. This research demonstrates that the concept of neutron spectral shaping can be used to engineer complex neutron spectra within the confines of physics. There are many possible applications that could benefit from the ability to generate custom energy neutron spectra that fall outside of current sources and methods. The ETA is the product of a general-purpose optimization algorithm, Gnowee, and design framework, Coeus, which enables the use of Gnowee for complex nuclear design problems. Through Gnowee and Coeus, new ETA neutronics designs can be generated in days, not months or years, with a drastic reduction in the research effort required to do so. (Abstract shortened by ProQuest.)
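
    To make the problem class concrete (a constrained, mixed continuous/categorical design vector with an expensive black-box objective), the sketch below runs a tiny penalized random search. It is emphatically not Gnowee, which uses a much richer set of population-based metaheuristic moves; the objective, constraint, and material choices are invented placeholders.

```python
import random

def black_box(x):
    """Hypothetical expensive objective (a stand-in for a radiation-transport run)."""
    thickness_cm, material = x
    penalty = 0.0 if 0.0 <= thickness_cm <= 10.0 else 1e6   # simple box constraint
    material_cost = {"poly": 0.0, "steel": 2.0, "lead": 5.0}[material]
    return (thickness_cm - 3.7) ** 2 + material_cost + penalty

def random_search(n_evals=200, seed=1):
    """Penalized random search over a mixed continuous/categorical design vector."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_evals):
        x = (rng.uniform(0.0, 10.0), rng.choice(["poly", "steel", "lead"]))
        f = black_box(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

print(random_search())   # expect a thickness near 3.7 cm with the "poly" choice
```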

  6. The Emergent Capabilities of Distributed Satellites and Methods for Selecting Distributed Satellite Science Missions

    NASA Astrophysics Data System (ADS)

    Corbin, B. A.; Seager, S.; Ross, A.; Hoffman, J.

    2017-12-01

    Distributed satellite systems (DSS) have emerged as an effective and cheap way to conduct space science, thanks to advances in the small satellite industry. However, relatively few space science missions have utilized multiple assets to achieve their primary scientific goals. Previous research on methods for evaluating mission concept designs has shown that distributed systems are rarely competitive with monolithic systems, partially because it is difficult to quantify the added value of DSSs over monolithic systems. Comparatively little research has focused on how DSSs can be used to achieve new, fundamental space science goals that cannot be achieved with monolithic systems or how to choose a design from a larger possible tradespace of options. There are seven emergent capabilities of distributed satellites: shared sampling, simultaneous sampling, self-sampling, census sampling, stacked sampling, staged sampling, and sacrifice sampling. These capabilities are either fundamentally, analytically, or operationally unique in their application to distributed science missions, and they can be leveraged to achieve science goals that are either impossible or difficult and costly to achieve with monolithic systems. The Responsive Systems Comparison (RSC) method combines Multi-Attribute Tradespace Exploration with Epoch-Era Analysis to examine benefits, costs, and flexible options in complex systems over the mission lifecycle. Modifications to the RSC method as it exists in previously published literature were made in order to more accurately characterize how value is derived from space science missions. New metrics help rank designs by the value derived over their entire mission lifecycle and show more accurate cumulative value distributions. The RSC method was applied to four case study science missions that leveraged the emergent capabilities of distributed satellites to achieve their primary science goals. In all four case studies, RSC showed how scientific value was gained that would be impossible or unsatisfactory with monolithic systems and how changes in design and context variables affected the overall mission value. Each study serves as a blueprint for how to conduct a Pre-Phase A study using these methods to learn more about the tradespace of a particular mission.

  7. Solution of rocks and refractory minerals by acids at high temperatures and pressures. Determination of silica after decomposition with hydrofluoric acid

    USGS Publications Warehouse

    May, I.; Rowe, J.J.

    1965-01-01

    A modified Morey bomb was designed which contains a removable nichrome-cased 3.5-ml platinum crucible. This bomb is particularly useful for decompositions of refractory samples for micro- and semimicro-analysis. Temperatures of 400-450° and pressures estimated as great as 6000 p.s.i. were maintained in the bomb for periods as long as 24 h. Complete decompositions of rocks, garnet, beryl, chrysoberyl, phenacite, sapphirine, and kyanite were obtained with hydrofluoric acid or a mixture of hydrofluoric and sulfuric acids; the decomposition of chrome refractory was made with hydrochloric acid. Aluminum-rich samples formed difficultly soluble aluminum fluoride precipitates. Because no volatilization losses occur, silica can be determined on sample solutions by a molybdenum-blue procedure using aluminum(III) to complex interfering fluoride. © 1965.

  8. Cryogenic STM in 3D vector magnetic fields realized through a rotatable insert.

    PubMed

    Trainer, C; Yim, C M; McLaren, M; Wahl, P

    2017-09-01

    Spin-polarized scanning tunneling microscopy (SP-STM) performed in vector magnetic fields promises atomic scale imaging of magnetic structure, providing complete information on the local spin texture of a sample in three dimensions. Here, we have designed and constructed a turntable system for a low temperature STM which in combination with a 2D vector magnet provides magnetic fields of up to 5 T in any direction relative to the tip-sample geometry. This enables STM imaging and spectroscopy to be performed at the same atomic-scale location and field-of-view on the sample, and most importantly, without experiencing any change on the tip apex before and after field switching. Combined with a ferromagnetic tip, this enables us to study the magnetization of complex magnetic orders in all three spatial directions.

  9. Across North America tracer experiment (ANATEX): Sampling and analysis

    NASA Astrophysics Data System (ADS)

    Draxler, R. R.; Dietz, R.; Lagomarsino, R. J.; Start, G.

    Between 5 January 1987 and 29 March 1987, there were 33 releases of different tracers from each of two sites: Glasgow, MT and St. Cloud, MN. The perfluorocarbon tracers were routinely released in a 3-h period every 2.5 days, alternating between daytime and night-time tracer releases. Ground-level air samples of 24-h duration were taken at 77 sites mostly located near rawinsonde stations east of 105°W and between 26°N and 55°N. Weekly air samples were taken at 12 remote sites between San Diego, CA and Pt. Barrow, AK and between Norway and the Canary Islands. Short-term 6-h samples were collected at ground level and 200 m AGL along an arc of five towers between Tulsa, OK and Green Bay, WI. Aircraft sampling within several hundred kilometers of both tracer release sites was used to establish the initial tracer path. Experimental design required improved sampler performance, new tracers with lower atmospheric backgrounds, and improvements in analytic precision. The advances to the perfluorocarbon tracer system are discussed in detail. Results from the tracer sampling showed that the average and peak concentrations measured over the daily ground-level sampling network were consistent with what would be calculated using mass conservative approaches. However, ground-level samples from individual tracer patterns showed considerable complexity due to vertical stability or the interaction of the tracer plumes with low pressure and frontal systems. These systems could pass right through the tracer plume without appreciable effect. Aircraft tracer measurements are used to confirm the initial tracer trajectory when the narrow plume may miss the coarser spaced ground-level sampling network. Tower tracer measurements showed a more complex temporal structure than evident from the longer duration ground-level sampling sites. Few above-background plume measurements were evident in the more distant remote sampling network due to larger than expected uncertainties in the ambient background concentrations.

  10. Nanoscale effects in the characterization of viscoelastic materials with atomic force microscopy: Coupling of a quasi-three-dimensional standard linear solid model with in-plane surface interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solares, Santiago D.

    Significant progress has been accomplished in the development of experimental contact-mode and dynamic-mode atomic force microscopy (AFM) methods designed to measure surface material properties. However, current methods are based on one-dimensional (1D) descriptions of the tip-sample interaction forces, thus neglecting the intricacies involved in the material behavior of complex samples (such as soft viscoelastic materials) as well as the differences in material response between the surface and the bulk. In order to begin to address this gap, a computational study is presented where the sample is simulated using an enhanced version of a recently introduced model that treats the surface as a collection of standard-linear-solid viscoelastic elements. The enhanced model introduces in-plane surface elastic forces that can be approximately related to a two-dimensional (2D) Young's modulus. Relevant cases are discussed for single- and multifrequency intermittent-contact AFM imaging, with focus on the calculated surface indentation profiles and tip-sample interaction force curves, as well as their implications with regards to experimental interpretation. A variety of phenomena are examined in detail, which highlight the need for further development of more physically accurate sample models that are specifically designed for AFM simulation. As a result, a multifrequency AFM simulation tool based on the above sample model is provided as supporting information.

  11. Nanoscale effects in the characterization of viscoelastic materials with atomic force microscopy: Coupling of a quasi-three-dimensional standard linear solid model with in-plane surface interactions

    DOE PAGES

    Solares, Santiago D.

    2016-04-15

    Significant progress has been accomplished in the development of experimental contact-mode and dynamic-mode atomic force microscopy (AFM) methods designed to measure surface material properties. However, current methods are based on one-dimensional (1D) descriptions of the tip-sample interaction forces, thus neglecting the intricacies involved in the material behavior of complex samples (such as soft viscoelastic materials) as well as the differences in material response between the surface and the bulk. In order to begin to address this gap, a computational study is presented where the sample is simulated using an enhanced version of a recently introduced model that treats the surface as a collection of standard-linear-solid viscoelastic elements. The enhanced model introduces in-plane surface elastic forces that can be approximately related to a two-dimensional (2D) Young's modulus. Relevant cases are discussed for single- and multifrequency intermittent-contact AFM imaging, with focus on the calculated surface indentation profiles and tip-sample interaction force curves, as well as their implications with regards to experimental interpretation. A variety of phenomena are examined in detail, which highlight the need for further development of more physically accurate sample models that are specifically designed for AFM simulation. As a result, a multifrequency AFM simulation tool based on the above sample model is provided as supporting information.
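
    To make the "standard-linear-solid viscoelastic element" concrete, the sketch below evaluates the stress-relaxation response of a single SLS (Zener) element under a step strain, sigma(t) = strain * (E_inf + (E_0 - E_inf) * exp(-t/tau)); the moduli, relaxation time, and strain are arbitrary illustration values, not parameters from the paper.

```python
import numpy as np

def sls_relaxation(t, e0=1.0e6, e_inf=2.0e5, tau=0.01, strain=0.05):
    """Stress relaxation of a standard-linear-solid (Zener) element under a step
    strain: sigma(t) = strain * (E_inf + (E_0 - E_inf) * exp(-t / tau)).
    Moduli in Pa, tau in s; all values are arbitrary illustration choices."""
    return strain * (e_inf + (e0 - e_inf) * np.exp(-t / tau))

t = np.linspace(0.0, 0.05, 6)
for ti, sigma in zip(t, sls_relaxation(t)):
    print(f"t = {ti * 1e3:5.1f} ms   sigma = {sigma / 1e3:7.2f} kPa")
```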

  12. Saturation sampling for spatial variation in multiple air pollutants across an inversion-prone metropolitan area of complex terrain

    PubMed Central

    2014-01-01

    Background Characterizing intra-urban variation in air quality is important for epidemiological investigation of health outcomes and disparities. To date, however, few studies have been designed to capture spatial variation during select hours of the day, or to examine the roles of meteorology and complex terrain in shaping intra-urban exposure gradients. Methods We designed a spatial saturation monitoring study to target local air pollution sources, and to understand the role of topography and temperature inversions on fine-scale pollution variation by systematically allocating sampling locations across gradients in key local emissions sources (vehicle traffic, industrial facilities) and topography (elevation) in the Pittsburgh area. Street-level integrated samples of fine particulate matter (PM2.5), black carbon (BC), nitrogen dioxide (NO2), sulfur dioxide (SO2), and ozone (O3) were collected during morning rush and probable inversion hours (6-11 AM), during summer and winter. We hypothesized that pollution concentrations would be: 1) higher under inversion conditions, 2) exacerbated in lower-elevation areas, and 3) vary by season. Results During July - August 2011 and January - March 2012, we observed wide spatial and seasonal variability in pollution concentrations, exceeding the range measured at regulatory monitors. We identified elevated concentrations of multiple pollutants at lower-elevation sites, and a positive association between inversion frequency and NO2 concentration. We examined temporal adjustment methods for deriving seasonal concentration estimates, and found that the appropriate reference temporal trend differs between pollutants. Conclusions Our time-stratified spatial saturation approach found some evidence for modification of inversion-concentration relationships by topography, and provided useful insights for refining and interpreting GIS-based pollution source indicators for Land Use Regression modeling. PMID:24735818

  13. Emotion dysregulation and autonomic responses to film, rumination, and body awareness: Extending psychophysiological research to a naturalistic clinical setting and a chemically dependent female sample.

    PubMed

    Crowell, Sheila E; Price, Cynthia J; Puzia, Megan E; Yaptangco, Mona; Cheng, Sunny Chieh

    2017-05-01

    Substance use is a complex clinical problem characterized by emotion dysregulation and daily challenges that can interfere with laboratory research. Thus, few psychophysiological studies examine autonomic and self-report measures of emotion dysregulation with multidiagnostic, chemically dependent samples or extend this work into naturalistic settings. In this study, we used a within-subject design to examine changes in respiratory sinus arrhythmia (RSA), electrodermal activity (EDA), and self-reported affect across three tasks designed to elicit distinct psychophysiological and emotional response patterns. We also examined emotion dysregulation as a moderator of psychophysiological responses. Participants include 116 women with multiple comorbid mental health conditions enrolled in substance use treatment, many of whom also reported high emotion dysregulation. Participants were assessed in the treatment setting and completed three tasks: watching a sad movie clip, rumination on a stressful event, and a mindful interoceptive awareness meditation. Multilevel models were used to examine changes from resting baselines to the tasks. During the film, results indicate a significant decrease in RSA and an increase in EDA. For the rumination task, participants showed a decrease in RSA but no EDA response. For the body awareness task, there was an increase in RSA and a decrease in EDA. Emotion dysregulation was associated with differences in baseline RSA but not with EDA or with the slope of response patterns across tasks. Self-reported affect was largely consistent with autonomic patterns. Findings add to the literature on emotion dysregulation, substance use, and the translation of psychophysiological measurements into clinical settings with complex samples. © 2017 Society for Psychophysiological Research.

  14. Investigation of Maternal Effects, Maternal-Fetal Interactions and Parent-of-Origin Effects (Imprinting), Using Mothers and Their Offspring

    PubMed Central

    Ainsworth, Holly F; Unwin, Jennifer; Jamison, Deborah L; Cordell, Heather J

    2011-01-01

    Many complex genetic effects, including epigenetic effects, may be expected to operate via mechanisms in the intra-uterine environment. A popular design for the investigation of such effects, including effects of parent-of-origin (imprinting), maternal genotype, and maternal-fetal genotype interactions, is to collect DNA from affected offspring and their mothers (case/mother duos) and to compare with an appropriate control sample. An alternative design uses data from cases and both parents (case/parent trios) but does not require controls. In this study, we describe a novel implementation of a multinomial modeling approach that allows the estimation of such genetic effects using either case/mother duos or case/parent trios. We investigate the performance of our approach using computer simulations and explore the sample sizes and data structures required to provide high power for detection of effects and accurate estimation of the relative risks conferred. Through the incorporation of additional assumptions (such as Hardy-Weinberg equilibrium, random mating and known allele frequencies) and/or the incorporation of additional types of control sample (such as unrelated controls, controls and their mothers, or both parents of controls), we show that the (relative risk) parameters of interest are identifiable and well estimated. Nevertheless, parameter interpretation can be complex, as we illustrate by demonstrating the mathematical equivalence between various different parameterizations. Our approach scales up easily to allow the analysis of large-scale genome-wide association data, provided both mothers and affected offspring have been genotyped at all variants of interest. Genet. Epidemiol. 35:19–45, 2011. © 2010 Wiley-Liss, Inc. PMID:21181895

  15. Saturation sampling for spatial variation in multiple air pollutants across an inversion-prone metropolitan area of complex terrain.

    PubMed

    Shmool, Jessie Lc; Michanowicz, Drew R; Cambal, Leah; Tunno, Brett; Howell, Jeffery; Gillooly, Sara; Roper, Courtney; Tripathy, Sheila; Chubb, Lauren G; Eisl, Holger M; Gorczynski, John E; Holguin, Fernando E; Shields, Kyra Naumoff; Clougherty, Jane E

    2014-04-16

    Characterizing intra-urban variation in air quality is important for epidemiological investigation of health outcomes and disparities. To date, however, few studies have been designed to capture spatial variation during select hours of the day, or to examine the roles of meteorology and complex terrain in shaping intra-urban exposure gradients. We designed a spatial saturation monitoring study to target local air pollution sources, and to understand the role of topography and temperature inversions on fine-scale pollution variation by systematically allocating sampling locations across gradients in key local emissions sources (vehicle traffic, industrial facilities) and topography (elevation) in the Pittsburgh area. Street-level integrated samples of fine particulate matter (PM2.5), black carbon (BC), nitrogen dioxide (NO2), sulfur dioxide (SO2), and ozone (O3) were collected during morning rush and probable inversion hours (6-11 AM), during summer and winter. We hypothesized that pollution concentrations would be: 1) higher under inversion conditions, 2) exacerbated in lower-elevation areas, and 3) vary by season. During July - August 2011 and January - March 2012, we observed wide spatial and seasonal variability in pollution concentrations, exceeding the range measured at regulatory monitors. We identified elevated concentrations of multiple pollutants at lower-elevation sites, and a positive association between inversion frequency and NO2 concentration. We examined temporal adjustment methods for deriving seasonal concentration estimates, and found that the appropriate reference temporal trend differs between pollutants. Our time-stratified spatial saturation approach found some evidence for modification of inversion-concentration relationships by topography, and provided useful insights for refining and interpreting GIS-based pollution source indicators for Land Use Regression modeling.

  16. Design for gas chromatography-corona discharge-ion mobility spectrometry.

    PubMed

    Jafari, Mohammad T; Saraji, Mohammad; Sherafatmand, Hossein

    2012-11-20

    A corona discharge ionization-ion mobility spectrometry (CD-IMS) instrument with a novel sample inlet system was designed and constructed as a detector for capillary gas chromatography. In this design, a hollow needle was used instead of the solid needle commonly used for corona discharge creation, allowing direct axial interfacing for GC-IMS. The capillary column was passed through the needle, resulting in a reaction of effluents with reactant ions on the upstream side of the corona discharge ionization source. Using this sample introduction design, higher ionization efficiency was achieved relative to introducing the effluent through the side of the drift tube. In addition, the volume of the ionization region was reduced to minimize the residence time of compounds in the ionization source, increasing the chromatographic resolution of the instrument. The effects of various parameters such as drift gas flow, makeup gas flow, and column tip position inside the needle were investigated. The designed instrument was exhaustively validated in terms of sensitivity, resolution, and reproducibility by analyzing standard solutions of methyl isobutyl ketone, heptanone, nonanone, and acetophenone as the test compounds. The results obtained by the CD-IMS detector were compared with those of the flame ionization detector, which revealed the capability of the proposed GC-IMS for two-dimensional separation (based on the retention time and drift time information) and identification of an analyte in complex matrixes.

  17. Work Related Psychosocial and Organizational Factors for Neck Pain in Workers in the United States

    PubMed Central

    Yang, Haiou; Hitchcock, Edward; Haldeman, Scott; Swanson, Naomi; Lu, Ming-Lun; Choi, BongKyoo; Nakata, Akinori; Baker, Dean

    2016-01-01

    Background Neck pain is a prevalent musculoskeletal condition among workers in the United States. This study explores a set of workplace psychosocial and organization-related factors for neck pain. Methods Data used for this study come from the 2010 National Health Interview Survey which provides a representative sample of the US population. To account for the complex sampling design, the Taylor linearized variance estimation method was used. Logistic regression models were constructed to measure the associations. Results This study demonstrated significant associations between neck pain and a set of workplace risk factors including work-family imbalance, exposure to a hostile work environment and job insecurity, non-standard work arrangements, multiple jobs and long work hours. Conclusion Workers with neck pain may benefit from intervention programs that address issues related to these workplace risk factors. Future studies exploring both psychosocial risk factors and physical risk factors with a longitudinal design will be important. PMID:27184340

  18. Workplace psychosocial and organizational factors for neck pain in workers in the United States.

    PubMed

    Yang, Haiou; Hitchcock, Edward; Haldeman, Scott; Swanson, Naomi; Lu, Ming-Lun; Choi, BongKyoo; Nakata, Akinori; Baker, Dean

    2016-07-01

    Neck pain is a prevalent musculoskeletal condition among workers in the United States. This study explores a set of workplace psychosocial and organization-related factors for neck pain. Data used for this study come from the 2010 National Health Interview Survey which provides a representative sample of the US population. To account for the complex sampling design, the Taylor linearized variance estimation method was used. Logistic regression models were constructed to measure the associations. This study demonstrated significant associations between neck pain and a set of workplace risk factors, including work-family imbalance, exposure to a hostile work environment and job insecurity, non-standard work arrangements, multiple jobs, and long work hours. Workers with neck pain may benefit from intervention programs that address issues related to these workplace risk factors. Future studies exploring both psychosocial risk factors and physical risk factors with a longitudinal design will be important. Am. J. Ind. Med. 59:549-560, 2016. © 2016 Wiley Periodicals, Inc.

  19. An Adaptive Defect Weighted Sampling Algorithm to Design Pseudoknotted RNA Secondary Structures

    PubMed Central

    Zandi, Kasra; Butler, Gregory; Kharma, Nawwaf

    2016-01-01

    Computational design of RNA sequences that fold into targeted secondary structures has many applications in biomedicine, nanotechnology and synthetic biology. An RNA molecule is made of different types of secondary structure elements and an important RNA element named pseudoknot plays a key role in stabilizing the functional form of the molecule. However, due to the computational complexities associated with characterizing pseudoknotted RNA structures, most of the existing RNA sequence designer algorithms generally ignore this important structural element and therefore limit their applications. In this paper we present a new algorithm to design RNA sequences for pseudoknotted secondary structures. We use NUPACK as the folding algorithm to compute the equilibrium characteristics of the pseudoknotted RNAs, and describe a new adaptive defect weighted sampling algorithm named Enzymer to design low ensemble defect RNA sequences for targeted secondary structures including pseudoknots. We used a biological data set of 201 pseudoknotted structures from the Pseudobase library to benchmark the performance of our algorithm. We compared the quality characteristics of the RNA sequences we designed by Enzymer with the results obtained from the state of the art MODENA and antaRNA. Our results show our method succeeds more frequently than MODENA and antaRNA do, and generates sequences that have lower ensemble defect, lower probability defect and higher thermostability. Finally by using Enzymer and by constraining the design to a naturally occurring and highly conserved Hammerhead motif, we designed 8 sequences for a pseudoknotted cis-acting Hammerhead ribozyme. Enzymer is available for download at https://bitbucket.org/casraz/enzymer. PMID:27499762
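
    Enzymer's actual update rule is not given in the abstract; the sketch below only illustrates the generic idea of defect-weighted sampling, i.e. choosing which sequence position to mutate with probability proportional to a per-position defect score. The alphabet, defect values, and acceptance logic (omitted here) are assumptions; Enzymer couples such sampling with NUPACK ensemble-defect evaluation and pseudoknot-aware target structures.

```python
import random

BASES = "ACGU"

def mutate_weighted(seq, position_defect, rng):
    """Pick one position with probability proportional to its defect score and
    substitute a random different base there. Illustrative of defect-weighted
    sampling only; not the Enzymer implementation."""
    total = sum(position_defect)
    r, acc = rng.uniform(0.0, total), 0.0
    for i, d in enumerate(position_defect):
        acc += d
        if r <= acc:
            new_base = rng.choice([b for b in BASES if b != seq[i]])
            return seq[:i] + new_base + seq[i + 1:]
    return seq  # fallback (only reachable through floating-point edge cases)

rng = random.Random(42)
seq = "GGGAAACCC"
defects = [0.05, 0.05, 0.4, 0.1, 0.1, 0.1, 0.4, 0.05, 0.05]  # hypothetical scores
print(mutate_weighted(seq, defects, rng))
```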

  20. A system for programming experiments and for recording and analyzing data automatically

    PubMed Central

    Herrick, Robert M.; Denelsbeck, John S.

    1963-01-01

    A system designed for use in complex operant conditioning experiments is described. Some of its key features are: (a) plugboards that permit the experimenter to change either from one program to another or from one analysis to another in less than a minute, (b) time-sharing of permanently-wired, electronic logic components, (c) recordings suitable for automatic analyses. Included are flow diagrams of the system and sample logic diagrams for programming experiments and for analyzing data. PMID:14055967

  1. Comments on "Use of conditional simulation in nuclear waste site performance assessment" by Carol Gotway

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downing, D.J.

    1993-10-01

    This paper discusses Carol Gotway's paper, "The Use of Conditional Simulation in Nuclear Waste Site Performance Assessment." The paper centers on the use of conditional simulation and the use of geostatistical methods to simulate an entire field of values for subsequent use in a complex computer model. The issues of sampling designs for geostatistics, semivariogram estimation and anisotropy, the turning bands method for random field generation, and estimation of the cumulative distribution function are brought out.
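
    One of the geostatistical steps the comment raises, semivariogram estimation, has a simple empirical form: gamma(h) is half the mean squared difference between values at pairs of points separated by lag h. The sketch below is a minimal numpy/scipy illustration on simulated data; the coordinates, values, and lag bins are arbitrary choices, not from the discussed assessment.

```python
import numpy as np
from scipy.spatial.distance import pdist

def empirical_semivariogram(coords, values, bin_edges):
    """gamma(h) = (1 / 2|N(h)|) * sum over pairs in lag bin h of (z_i - z_j)^2."""
    d = pdist(coords)                                        # pairwise separation distances
    sq = pdist(values.reshape(-1, 1), metric="sqeuclidean")  # (z_i - z_j)^2 for each pair
    which = np.digitize(d, bin_edges)
    centers, gammas = [], []
    for b in range(1, len(bin_edges)):
        mask = which == b
        if mask.any():
            centers.append(0.5 * (bin_edges[b - 1] + bin_edges[b]))
            gammas.append(0.5 * sq[mask].mean())
    return np.array(centers), np.array(gammas)

# Hypothetical field: values with short-range spatial correlation plus noise
rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(300, 2))
values = np.sin(coords[:, 0] / 15.0) + 0.3 * rng.standard_normal(300)
lags, gamma = empirical_semivariogram(coords, values, np.linspace(0, 50, 11))
for h, g in zip(lags, gamma):
    print(f"lag {h:5.1f}  gamma {g:.3f}")
```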

  2. Development of a NIR-based blend uniformity method for a drug product containing multiple structurally similar actives by using the quality by design principles.

    PubMed

    Lin, Yiqing; Li, Weiyong; Xu, Jin; Boulas, Pierre

    2015-07-05

    The aim of this study is to develop an at-line near infrared (NIR) method for the rapid and simultaneous determination of four structurally similar active pharmaceutical ingredients (APIs) in powder blends intended for the manufacturing of tablets. Two of the four APIs in the formula are present in relatively small amounts, one at 0.95% and the other at 0.57%. Such small amounts, in addition to the similarity in structures, add significant complexity to the blend uniformity analysis. The NIR method is developed using spectra from six laboratory-created calibration samples augmented by a small set of spectra from a large-scale blending sample. Applying the quality by design (QbD) principles, the calibration design included concentration variations of the four APIs and a main excipient, microcrystalline cellulose. A bench-top FT-NIR instrument was used to acquire the spectra. The obtained NIR spectra were analyzed by applying principal component analysis (PCA) before calibration model development. Score patterns from the PCA were analyzed to reveal the relationship between latent variables and concentration variations of the APIs. In calibration model development, both PLS-1 and PLS-2 models were created and evaluated for their effectiveness in predicting API concentrations in the blending samples. The final NIR method shows satisfactory specificity and accuracy. Copyright © 2015 Elsevier B.V. All rights reserved.
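
    The two modeling steps named in the abstract, PCA inspection of the spectra followed by PLS calibration, can be illustrated generically with scikit-learn. The sketch below uses simulated stand-in spectra and concentrations rather than the paper's data; a PLS-2 model fits all four responses at once, while a PLS-1 model fits one response at a time.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(42)

# Simulated stand-in for NIR calibration spectra: 30 blends x 500 wavelengths,
# built from 4 "API" concentration profiles plus noise (not real data).
n_samples, n_wavelengths, n_apis = 30, 500, 4
conc = rng.uniform(0.2, 2.0, size=(n_samples, n_apis))        # nominal % w/w
pure_spectra = rng.random((n_apis, n_wavelengths))
X = conc @ pure_spectra + 0.01 * rng.standard_normal((n_samples, n_wavelengths))

# Step 1: PCA to inspect how latent variables track concentration variation.
scores = PCA(n_components=3).fit_transform(X)
print("PCA score matrix shape:", scores.shape)

# Step 2: PLS-2 model, all four responses fitted simultaneously.
pls2 = PLSRegression(n_components=4).fit(X, conc)
rmse = np.sqrt(np.mean((pls2.predict(X) - conc) ** 2, axis=0))
print("per-API calibration RMSE:", np.round(rmse, 3))

# A PLS-1 model fits each API separately; shown here for the first API only.
pls1 = PLSRegression(n_components=4).fit(X, conc[:, 0])
```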

  3. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    NASA Technical Reports Server (NTRS)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draws conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.

  4. Interfacial hydration, dynamics and electron transfer: multi-scale ET modeling of the transient [myoglobin, cytochrome b5] complex.

    PubMed

    Keinan, Shahar; Nocek, Judith M; Hoffman, Brian M; Beratan, David N

    2012-10-28

    Formation of a transient [myoglobin (Mb), cytochrome b(5) (cyt b(5))] complex is required for the reductive repair of inactive ferri-Mb to its functional ferro-Mb state. The [Mb, cyt b(5)] complex exhibits dynamic docking (DD), with its cyt b(5) partner in rapid exchange at multiple sites on the Mb surface. A triple mutant (Mb(3M)) was designed as part of efforts to shift the electron-transfer process to the simple docking (SD) regime, in which reactive binding occurs at a restricted, reactive region on the Mb surface that dominates the docked ensemble. An electrostatically-guided brownian dynamics (BD) docking protocol was used to generate an initial ensemble of reactive configurations of the complex between unrelaxed partners. This ensemble samples a broad and diverse array of heme-heme distances and orientations. These configurations seeded all-atom constrained molecular dynamics simulations (MD) to generate relaxed complexes for the calculation of electron tunneling matrix elements (T(DA)) through tunneling-pathway analysis. This procedure for generating an ensemble of relaxed complexes combines the ability of BD calculations to sample the large variety of available conformations and interprotein distances, with the ability of MD to generate the atomic level information, especially regarding the structure of water molecules at the protein-protein interface, that defines electron-tunneling pathways. We used the calculated T(DA) values to compute ET rates for the [Mb(wt), cyt b(5)] complex and for the complex with a mutant that has a binding free energy strengthened by three D/E → K charge-reversal mutations, [Mb(3M), cyt b(5)]. The calculated rate constants are in agreement with the measured values, and the mutant complex ensemble has many more geometries with higher T(DA) values than does the wild-type Mb complex. Interestingly, water plays a double role in this electron-transfer system, lowering the tunneling barrier as well as inducing protein interface remodeling that screens the repulsion between the negatively-charged propionates of the two hemes.

  5. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  6. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
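
    Two of the ingredients mentioned above, space-filling sampling of the process-parameter space and a data-driven surrogate model, can be sketched generically. In the illustration below the "expensive simulation" is a made-up function standing in for a melt-pool model, and the parameter ranges are arbitrary; this is not the authors' workflow, just the pattern of Latin hypercube sampling plus Gaussian-process regression using scipy.stats.qmc and scikit-learn.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Illustrative design space: laser power [W] and scan speed [mm/s].
lower, upper = [150.0, 500.0], [400.0, 2500.0]
sampler = qmc.LatinHypercube(d=2, seed=0)
X = qmc.scale(sampler.random(n=40), lower, upper)   # space-filling design points

def expensive_simulation(x):
    """Stand-in for a costly melt-pool simulation: returns a fake 'density' response."""
    power, speed = x
    return 99.0 - 5e-6 * (power - 280.0) ** 2 - 2e-6 * (speed - 1200.0) ** 2

y = np.array([expensive_simulation(x) for x in X])

# Data-driven surrogate: Gaussian-process regression over the sampled design.
kernel = ConstantKernel() * RBF(length_scale=[100.0, 500.0])
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# Cheap predictions (with uncertainty) anywhere in the design space.
mean, std = gp.predict(np.array([[300.0, 1000.0]]), return_std=True)
print(f"predicted density {mean[0]:.2f} +/- {std[0]:.2f}")
```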

  7. Meeting Report: Structural Determination of Environmentally Responsive Proteins

    PubMed Central

    Reinlib, Leslie

    2005-01-01

    The three-dimensional structure of gene products continues to be a missing lynchpin between linear genome sequences and our understanding of the normal and abnormal function of proteins and pathways. Enhanced activity in this area is likely to lead to better understanding of how discrete changes in molecular patterns and conformation underlie functional changes in protein complexes and, with it, sensitivity of an individual to an exposure. The National Institute of Environmental Health Sciences convened a workshop of experts in structural determination and environmental health to solicit advice for future research in structural resolution relative to environmentally responsive proteins and pathways. The highest priorities recommended by the workshop were to support studies of structure, analysis, control, and design of conformational and functional states at molecular resolution for environmentally responsive molecules and complexes; promote understanding of dynamics, kinetics, and ligand responses; investigate the mechanisms and steps in posttranslational modifications, protein partnering, impact of genetic polymorphisms on structure/function, and ligand interactions; and encourage integrated experimental and computational approaches. The workshop participants also saw value in improving the throughput and purity of protein samples and macromolecular assemblies; developing optimal processes for design, production, and assembly of macromolecular complexes; encouraging studies on protein–protein and macromolecular interactions; and examining assemblies of individual proteins and their functions in pathways of interest for environmental health. PMID:16263521

  8. Spectroelectrochemistry as a Strategy for Improving Selectivity of Sensors for Security and Defense Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heineman, William R.; Seliskar, Carl J.; Morris, Laura K.

    2012-12-19

    Spectroelectrochemistry provides improved selectivity for sensors by electrochemically modulating the optical signal associated with the analyte. The sensor consists of an optically transparent electrode (OTE) coated with a film that preconcentrates the target analyte. The OTE functions as an optical waveguide for attenuated total reflectance (ATR) spectroscopy, which detects the analyte by absorption. Alternatively, the OTE can serve as the excitation light for fluorescence detection, which is generally more sensitive than absorption. The analyte partitions into the film, undergoes an electrochemical redox reaction at the OTE surface, and absorbs or emits light in its oxidized or reduced state. The change in the optical response associated with electrochemical oxidation or reduction at the OTE is used to quantify the analyte. Absorption sensors for metal ion complexes such as [Fe(CN)6]4- and [Ru(bpy)3]2+ and fluorescence sensors for [Ru(bpy)3]2+ and the polycyclic aromatic hydrocarbon 1-hydroxypyrene have been developed. The sensor concept has been extended to binding assays for a protein using avidin–biotin and 17β-estradiol–anti-estradiol antibodies. The sensor has been demonstrated to measure metal complexes in complex samples such as nuclear waste and natural water. This sensor has qualities needed for security and defense applications that require a high level of selectivity and good detection limits for target analytes in complex samples. Quickly monitoring and designating intent of a nuclear program by measuring the Ru/Tc fission product ratio is such an application.

  9. Evaluation of Gallium as a Tracer of Exogenous Hemoglobin-Haptoglobin Complexes for Targeted Drug Delivery Applications

    NASA Astrophysics Data System (ADS)

    Xu, Shengsheng; Kaltashov, Igor A.

    2016-12-01

    Haptoglobin (Hp) is a plasma glycoprotein that generates significant interest in the drug delivery community because of its potential for delivery of antiretroviral medicines with high selectivity to macrophages and monocytes, the latent reservoirs of human immunodeficiency virus. As is the case with other therapies that exploit transport networks for targeted drug delivery, the success of the design and optimization of Hp-based therapies will critically depend on the ability to accurately localize and quantitate Hp-drug conjugates on the varying and unpredictable background of endogenous proteins having identical structure. In this work, we introduce a new strategy for detecting and quantitating exogenous Hp and Hp-based drugs with high sensitivity in complex biological samples using gallium as a tracer of this protein and inductively coupled plasma mass spectrometry (ICP MS) as a method of detection. Metal label is introduced by reconstituting hemoglobin (Hb) with gallium(III)-protoporphyrin IX followed by its complexation with Hp. Formation of the Hp/Hb assembly and its stability are evaluated with native electrospray ionization mass spectrometry. Both stable isotopes of Ga give rise to an abundant signal in ICP MS of a human plasma sample spiked with the metal-labeled Hp/Hb complex. The metal label signal exceeds the spectral interferences' contributions by more than an order of magnitude even with the concentration of the exogenous protein below 10 nM, the level that is more than adequate for the planned pharmacokinetic studies of Hp-based therapeutics.

  10. Design complexity and strength of laterality are correlated in New Caledonian crows' pandanus tool manufacture

    PubMed Central

    Hunt, Gavin R; Corballis, Michael C; Gray, Russell D

    2006-01-01

    Population-level laterality is generally considered to reflect functional brain specialization. Consequently, the strength of population-level laterality in manipulatory tasks is predicted to positively correlate with task complexity. This relationship has not been investigated in tool manufacture. Here, we report the correlation between strength of laterality and design complexity in the manufacture of New Caledonian crows' three pandanus tool designs: wide, narrow and stepped designs. We documented indirect evidence of over 5800 tool manufactures on 1232 pandanus trees at 23 sites. We found that the strength of laterality in tool manufacture was correlated with design complexity in three ways: (i) the strongest effect size among the population-level edge biases for each design was for the more complex, stepped design, (ii) the strength of laterality at individual sites was on average greater for the stepped design than it was for the simpler wide and narrow, non-stepped designs, and (iii) there was a positive, but non-significant, trend for a correlation between the strength of laterality and the number of steps on a stepped tool. These three aspects together indicate that greater design complexity generally elicits stronger lateralization of crows' pandanus tool manufacture. PMID:16600891

  11. Design complexity and strength of laterality are correlated in New Caledonian crows' pandanus tool manufacture.

    PubMed

    Hunt, Gavin R; Corballis, Michael C; Gray, Russell D

    2006-05-07

    Population-level laterality is generally considered to reflect functional brain specialization. Consequently, the strength of population-level laterality in manipulatory tasks is predicted to positively correlate with task complexity. This relationship has not been investigated in tool manufacture. Here, we report the correlation between strength of laterality and design complexity in the manufacture of New Caledonian crows' three pandanus tool designs: wide, narrow and stepped designs. We documented indirect evidence of over 5,800 tool manufactures on 1,232 pandanus trees at 23 sites. We found that the strength of laterality in tool manufacture was correlated with design complexity in three ways: (i) the strongest effect size among the population-level edge biases for each design was for the more complex, stepped design, (ii) the strength of laterality at individual sites was on average greater for the stepped design than it was for the simpler wide and narrow, non-stepped designs, and (iii) there was a positive, but non-significant, trend for a correlation between the strength of laterality and the number of steps on a stepped tool. These three aspects together indicate that greater design complexity generally elicits stronger lateralization of crows' pandanus tool manufacture.

  12. Development of an advanced spacecraft tandem mass spectrometer

    NASA Astrophysics Data System (ADS)

    Drew, Russell C.

    1992-03-01

    The purpose of this research was to apply current advanced technology in electronics and materials to the development of a miniaturized Tandem Mass Spectrometer that would have the potential for future development into a package suitable for spacecraft use. The mass spectrometer to be used as a basis for the tandem instrument would be a magnetic sector instrument, of Nier-Johnson configuration, as used on the Viking Mars Lander mission. This instrument configuration would then be matched with a suitable second stage MS to provide the benefits of tandem MS operation for rapid identification of unknown organic compounds. This tandem instrument is configured with a newly designed GC system to aid in separation of complex mixtures prior to MS analysis. A number of important results were achieved in the course of this project. Among them were the development of a miniaturized GC subsystem, with a unique desorber-injector, fully temperature feedback controlled oven with powered cooling for rapid reset to ambient conditions, a unique combination inlet system to the MS that provides for both membrane sampling and direct capillary column sample transfer, a compact and ruggedized alignment configuration for the MS, an improved ion source design for increased sensitivity, and a simple, rugged tandem MS configuration that is particularly adaptable to spacecraft use because of its low power and low vacuum pumping requirements. The potential applications of this research include use in manned spacecraft like the space station as a real-time detection and warning device for the presence of potentially harmful trace contaminants of the spacecraft atmosphere, use as an analytical device for evaluating samples collected on the Moon or a planetary surface, or even use in connection with monitoring potentially hazardous conditions that may exist in terrestrial locations such as launch pads, environmental test chambers or other sensitive areas. Commercial development of the technology could lead to a new family of environmental test instruments that would be small and portable, yet would give quick analyses of complex samples.

  13. Rapid and simultaneous determination of twenty amino acids in complex biological and food samples by solid-phase microextraction and gas chromatography-mass spectrometry with the aid of experimental design after ethyl chloroformate derivatization.

    PubMed

    Mudiam, Mohana Krishna Reddy; Ratnasekhar, Ch; Jain, Rajeev; Saxena, Prem Narain; Chauhan, Abhishek; Murthy, R C

    2012-10-15

    Amino acids play a vital role as intermediates in many important metabolic pathways such as the biosynthesis of nucleotides, vitamins and secondary metabolites. A sensitive and rapid analytical method has been proposed for the first time for the simultaneous determination of twenty amino acids using solid-phase microextraction (SPME). The protein samples were hydrolyzed by 6 M HCl under microwave radiation for 120 min. Then the amino acids were derivatized by ethyl chloroformate (ECF) and the ethoxy carbonyl ethyl esters of amino acids formed were extracted using SPME by direct immersion. Finally, the extracted analytes on the SPME fiber were desorbed at 260°C and analyzed by gas chromatography-mass spectrometry (GC-MS) in electron ionization mode. Factors which affect the SPME efficiency were screened by a Plackett-Burman design; the most significant factors were optimized with response surface methodology. The optimum conditions for SPME are as follows: pH of 1.7, ionic strength of 733 mg, extraction time of 30 min and fiber of divinyl benzene/carboxen/polydimethylsiloxane (DVB/CAR/PDMS). The recovery of all the amino acids was found to be in the range of 89.17-100.98%. The limit of detection (LOD) of all derivatized amino acids in urine, hair and soybean was found to be in the range of 0.20-7.52 μg L⁻¹, 0.21-8.40 μg L⁻¹ and 0.18-5.62 μg L⁻¹, respectively. Finally, the proposed technique was successfully applied for the determination of amino acids in complex biological (hair, urine) and food samples (soybean). The method can find wide applications in the routine analysis of amino acids in any biological as well as food samples. Copyright © 2012 Elsevier B.V. All rights reserved.
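
    The response-surface optimization step can be illustrated in generic form: fit a second-order polynomial to recoveries measured at a handful of design points and locate the predicted maximum. The numpy sketch below uses invented recovery data and only two factors (pH and extraction time) for brevity; it is not the paper's actual design or fitted model.

```python
import numpy as np
from itertools import product

# Hypothetical design points: (pH, extraction time [min]) and measured recovery [%].
X = np.array([[1.5, 20], [1.5, 40], [2.5, 20], [2.5, 40],
              [1.0, 30], [3.0, 30], [2.0, 10], [2.0, 50], [2.0, 30]], dtype=float)
y = np.array([82.0, 88.0, 84.0, 86.0, 85.0, 80.0, 78.0, 90.0, 93.0])

def quadratic_terms(x1, x2):
    """Second-order RSM model terms: 1, x1, x2, x1^2, x2^2, x1*x2."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

beta, *_ = np.linalg.lstsq(quadratic_terms(X[:, 0], X[:, 1]), y, rcond=None)

# Locate the predicted optimum on a fine grid over the factor ranges.
grid = np.array(list(product(np.linspace(1.0, 3.0, 201), np.linspace(10.0, 50.0, 201))))
pred = quadratic_terms(grid[:, 0], grid[:, 1]) @ beta
best = grid[np.argmax(pred)]
print(f"predicted optimum: pH {best[0]:.2f}, time {best[1]:.0f} min, "
      f"recovery {pred.max():.1f}%")
```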

  14. Development of an advanced spacecraft tandem mass spectrometer

    NASA Technical Reports Server (NTRS)

    Drew, Russell C.

    1992-01-01

    The purpose of this research was to apply current advanced technology in electronics and materials to the development of a miniaturized Tandem Mass Spectrometer that would have the potential for future development into a package suitable for spacecraft use. The mass spectrometer to be used as a basis for the tandem instrument would be a magnetic sector instrument, of Nier-Johnson configuration, as used on the Viking Mars Lander mission. This instrument configuration would then be matched with a suitable second stage MS to provide the benefits of tandem MS operation for rapid identification of unknown organic compounds. This tandem instrument is configured with a newly designed GC system to aid in separation of complex mixtures prior to MS analysis. A number of important results were achieved in the course of this project. Among them were the development of a miniaturized GC subsystem, with a unique desorber-injector, fully temperature feedback controlled oven with powered cooling for rapid reset to ambient conditions, a unique combination inlet system to the MS that provides for both membrane sampling and direct capillary column sample transfer, a compact and ruggedized alignment configuration for the MS, an improved ion source design for increased sensitivity, and a simple, rugged tandem MS configuration that is particularly adaptable to spacecraft use because of its low power and low vacuum pumping requirements. The potential applications of this research include use in manned spacecraft like the space station as a real-time detection and warning device for the presence of potentially harmful trace contaminants of the spacecraft atmosphere, use as an analytical device for evaluating samples collected on the Moon or a planetary surface, or even use in connection with monitoring potentially hazardous conditions that may exist in terrestrial locations such as launch pads, environmental test chambers or other sensitive areas. Commercial development of the technology could lead to a new family of environmental test instruments that would be small and portable, yet would give quick analyses of complex samples.

  15. Improving Validity of Informed Consent for Biomedical Research in Zambia Using a Laboratory Exposure Intervention

    PubMed Central

    Zulu, Joseph Mumba; Lisulo, Mpala Mwanza; Besa, Ellen; Kaonga, Patrick; Chisenga, Caroline C.; Chomba, Mumba; Simuyandi, Michelo; Banda, Rosemary; Kelly, Paul

    2014-01-01

    Background Complex biomedical research can lead to disquiet in communities with limited exposure to scientific discussions, leading to rumours or to high drop-out rates. We set out to test an intervention designed to address apprehensions commonly encountered in a community where literacy is uncommon, and where complex biomedical research has been conducted for over a decade. We aimed to determine if it could improve the validity of consent. Methods Data were collected using focus group discussions, key informant interviews and observations. We designed an intervention that exposed participants to a detailed demonstration of laboratory processes. Each group was interviewed twice in a day, before and after exposure to the intervention in order to assess changes in their views. Results Factors that motivated people to participate in invasive biomedical research included a desire to stay healthy because of the screening during the recruitment process, regular advice from doctors, free medical services, and trust in the researchers. Inhibiting factors were limited knowledge about samples taken from their bodies during endoscopic procedures, the impact of endoscopy on the function of internal organs, and concerns about the use of biomedical samples. The belief that blood can be used for Satanic practices also created insecurities about drawing of blood samples. Further inhibiting factors included a fear of being labelled as HIV positive if known to consult heath workers repeatedly, and gender inequality. Concerns about the use and storage of blood and tissue samples were overcome by a laboratory exposure intervention. Conclusion Selecting a group of members from target community and engaging them in a laboratory exposure intervention could be a useful tool for enhancing specific aspects of consent for biomedical research. Further work is needed to determine the extent to which improved understanding permeates beyond the immediate group participating in the intervention. PMID:25254378

  16. Using complex auditory-visual samples to produce emergent relations in children with autism.

    PubMed

    Groskreutz, Nicole C; Karsina, Allen; Miguel, Caio F; Groskreutz, Mark P

    2010-03-01

    Six participants with autism learned conditional relations between complex auditory-visual sample stimuli (dictated words and pictures) and simple visual comparisons (printed words) using matching-to-sample training procedures. Pre- and posttests examined potential stimulus control by each element of the complex sample when presented individually and emergence of additional conditional relations and oral labeling. Tests revealed class-consistent performance for all participants following training.

  17. Stereo Sound Field Controller Design Using Partial Model Matching on the Frequency Domain

    NASA Astrophysics Data System (ADS)

    Kumon, Makoto; Miike, Katsuhiro; Eguchi, Kazuki; Mizumoto, Ikuro; Iwai, Zenta

    The objective of sound field control is to make the acoustic characteristics of a listening room close to those of the desired system. Conventional methods apply feedforward controllers, such as digital filters, to achieve this objective. However, feedback controllers are also necessary in order to attenuate noise or to compensate for the uncertainty of the acoustic characteristics of the listening room. Since acoustic characteristics are well modeled on the frequency domain, it is efficient to design controllers with respect to frequency responses, but it is difficult to design a multi-input multi-output (MIMO) control system on a wide frequency domain. In the present study, a partial model matching method on the frequency domain was adopted because this method requires only sampled data, rather than complex mathematical models of the plant, in order to design controllers for MIMO systems. The partial model matching method was applied to design two-degree-of-freedom controllers for acoustic equalization and noise reduction. Experiments demonstrated the effectiveness of the proposed method.

  18. Surface laser marking optimization using an experimental design approach

    NASA Astrophysics Data System (ADS)

    Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.

    2017-04-01

    Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τpulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using a Design of Experiments (DOE) method: Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and then the laser marking process is performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed and the RSM model is developed and verified for predicting the process output for the given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.

  19. A Novel Latin Hypercube Algorithm via Translational Propagation

    PubMed Central

    Pan, Guang; Ye, Pengcheng

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is directly related to the experimental designs used. Optimal Latin hypercube designs are frequently used and have been shown to have good space-filling and projective properties. However, the high cost in constructing them limits their use. In this paper, a methodology for creating novel Latin hypercube designs via a translational propagation and successive local enumeration (TPSLE) algorithm is developed without using formal optimization. The TPSLE algorithm is based on the idea that a near-optimal Latin hypercube design can be constructed from a simple initial block, with a few points generated by the SLE algorithm serving as a building block. In fact, the TPSLE algorithm offers a balanced trade-off between efficiency and sampling performance. The proposed algorithm is compared to two existing algorithms and is found to be much more efficient in terms of computation time and has acceptable space-filling and projective properties. PMID:25276844
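
    The baseline that algorithms such as TPSLE improve upon is the plain random Latin hypercube: one stratified permutation per dimension, scored by a space-filling criterion such as the maximin (smallest pairwise) distance. The numpy sketch below shows that baseline and a naive best-of-many search; it is not the translational-propagation algorithm itself.

```python
import numpy as np
from scipy.spatial.distance import pdist

def random_lhs(n_points, n_dims, rng):
    """Plain Latin hypercube: each dimension gets exactly one point per 1/n stratum."""
    u = rng.random((n_points, n_dims))
    perms = np.column_stack([rng.permutation(n_points) for _ in range(n_dims)])
    return (perms + u) / n_points

def maximin(design):
    """Space-filling score: the smallest pairwise distance (larger is better)."""
    return pdist(design).min()

rng = np.random.default_rng(7)
candidates = [random_lhs(20, 3, rng) for _ in range(200)]   # naive best-of-200 search
best = max(candidates, key=maximin)
print("best-of-200 maximin distance:", round(maximin(best), 4))
```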

  20. Sampling from complex networks using distributed learning automata

    NASA Astrophysics Data System (ADS)

    Rezvanian, Alireza; Rahmati, Mohammad; Meybodi, Mohammad Reza

    2014-02-01

    A complex network provides a framework for modeling many real-world phenomena in the form of a network. In general, a complex network is considered as a graph of real-world phenomena such as biological networks, ecological networks, technological networks, information networks and particularly social networks. Recently, major studies have been reported on the characterization of social networks, due to a growing trend in the analysis of online social networks as dynamic, complex, large-scale graphs. Because real networks are large and access to them is limited, the network model is characterized by sampling an appropriate part of the network. In this paper, a new sampling algorithm based on distributed learning automata is proposed for sampling from complex networks. In the proposed algorithm, a set of distributed learning automata cooperate with each other in order to take appropriate samples from the given network. To investigate the performance of the proposed algorithm, several simulation experiments are conducted on well-known complex networks. Experimental results are compared with several sampling methods in terms of different measures. The experimental results demonstrate the superiority of the proposed algorithm over the others.
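
    The distributed-learning-automata sampler itself is not reproduced here; the sketch below only shows a common baseline that network-sampling methods are typically compared against, a simple random-walk sample of a graph, using networkx on a synthetic scale-free network. The graph size and sample size are arbitrary illustrative choices.

```python
import random
import networkx as nx

def random_walk_sample(G, sample_size, seed=None):
    """Collect a subgraph by walking the network until `sample_size` nodes are visited."""
    rng = random.Random(seed)
    current = rng.choice(list(G.nodes))
    visited = {current}
    while len(visited) < sample_size:
        neighbors = list(G.neighbors(current))
        if not neighbors:                       # dead end: restart from a random node
            current = rng.choice(list(G.nodes))
            continue
        current = rng.choice(neighbors)
        visited.add(current)
    return G.subgraph(visited).copy()

G = nx.barabasi_albert_graph(1000, 3, seed=1)   # toy scale-free "social" network
sample = random_walk_sample(G, 100, seed=1)
print(sample.number_of_nodes(), "nodes,", sample.number_of_edges(), "edges in the sample")
```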

  1. Integrated modeling tool for performance engineering of complex computer systems

    NASA Technical Reports Server (NTRS)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  2. Mass spectrometer having a derivatized sample presentation apparatus

    DOEpatents

    Nelson, Randall W.

    2000-07-25

    A mass spectrometer having a derivatized sample presentation apparatus is provided. The sample presentation apparatus has a complex bound to the surface of the sample presentation apparatus. This complex includes a molecule which may chemically modify a biomolecule.

  3. Chemical cross-linking of the urease complex from Helicobacter pylori and analysis by Fourier transform ion cyclotron resonance mass spectrometry and molecular modeling

    NASA Astrophysics Data System (ADS)

    Carlsohn, Elisabet; Ångström, Jonas; Emmett, Mark R.; Marshall, Alan G.; Nilsson, Carol L.

    2004-05-01

    Chemical cross-linking of proteins is a well-established method for structural mapping of small protein complexes. When combined with mass spectrometry, cross-linking can reveal protein topology and identify contact sites between the peptide surfaces. When applied to surface-exposed proteins from pathogenic organisms, the method can reveal structural details that are useful in vaccine design. In order to investigate the possibilities of applying cross-linking on larger protein complexes, we selected the urease enzyme from Helicobacter pylori as a model. This membrane-associated protein complex consists of two subunits: α (26.5 kDa) and β (61.7 kDa). Three (αβ) heterodimers form a trimeric (αβ)3 assembly which further associates into a unique dodecameric 1.1 MDa complex composed of four (αβ)3 units. Cross-linked peptides from trypsin-digested urease complex were analyzed by Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) and molecular modeling. Two potential cross-linked peptides (present in the cross-linked sample but undetectable in α, β, and native complex) were assigned. Molecular modeling of the urease αβ complex and trimeric urease units (αβ)3 revealed a linkage site between the α-subunit and the β-subunit, and an internal cross-linkage in the β-subunit.

  4. Direct Ink Writing of Three-Dimensional (K, Na)NbO3-Based Piezoelectric Ceramics

    PubMed Central

    Li, Yayun; Li, Longtu; Li, Bo

    2015-01-01

    A kind of piezoelectric ink was prepared with Li, Ta, Sb co-doped (K, Na)NbO3 (KNN) powders. Piezoelectric scaffolds with diameters at micrometer scale were constructed from this ink by using direct ink writing method. According to the micro-morphology and density test, the samples sintered at 1100 °C for 2 h have formed ceramics completely with a high relative density of 98%. X-ray diffraction (XRD) test shows that the main phase of sintered samples is orthogonal (Na0.52K0.4425Li0.0375)(Nb0.87Sb0.07Ta0.06)O3. The piezoelectric constant d33 of 280 pC/N, dielectric constant ε of 1775, remanent polarization Pr of 18.8 μC/cm2 and coercive field Ec of 8.5 kV/cm prove that the sintered samples exhibit good electrical properties. The direct ink writing method allows one to design and rapidly fabricate piezoelectric structures in complex three-dimensional (3D) shapes without the need for any dies or lithographic masks, which will simplify the process of material preparation and offer new ideas for the design and application of piezoelectric devices. PMID:28788028

  5. CTViz: A tool for the visualization of transport in nanocomposites.

    PubMed

    Beach, Benjamin; Brown, Joshua; Tarlton, Taylor; Derosa, Pedro A

    2016-05-01

    A visualization tool (CTViz) for charge transport processes in 3-D hybrid materials (nanocomposites) was developed, inspired by the need for a graphical application to assist in code debugging and data presentation of an existing in-house code. As the simulation code grew, troubleshooting problems grew increasingly difficult without an effective way to visualize 3-D samples and charge transport in those samples. CTViz is able to produce publication and presentation quality visuals of the simulation box, as well as static and animated visuals of the paths of individual carriers through the sample. CTViz was designed to provide a high degree of flexibility in the visualization of the data. A feature that characterizes this tool is the use of shade and transparency levels to highlight important details in the morphology or in the transport paths by hiding or dimming elements of little relevance to the current view. This is fundamental for the visualization of 3-D systems with complex structures. The code presented here provides these required capabilities, but has gone beyond the original design and could be used as is or easily adapted for the visualization of other particulate transport where transport occurs on discrete paths. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Area, speed and power measurements of FPGA-based complex orthogonal space-time block code channel encoders

    NASA Astrophysics Data System (ADS)

    Passas, Georgios; Freear, Steven; Fawcett, Darren

    2010-01-01

    Space-time coding (STC) is an important milestone in modern wireless communications. In this technique, more copies of the same signal are transmitted through different antennas (space) and different symbol periods (time), to improve the robustness of a wireless system by increasing its diversity gain. STCs are channel coding algorithms that can be readily implemented on a field programmable gate array (FPGA) device. This work provides some figures for the amount of required FPGA hardware resources, the speed that the algorithms can operate and the power consumption requirements of a space-time block code (STBC) encoder. Seven encoder very high-speed integrated circuit hardware description language (VHDL) designs have been coded, synthesised and tested. Each design realises a complex orthogonal space-time block code with a different transmission matrix. All VHDL designs are parameterisable in terms of sample precision. Precisions ranging from 4 bits to 32 bits have been synthesised. Alamouti's STBC encoder design [Alamouti, S.M. (1998), 'A Simple Transmit Diversity Technique for Wireless Communications', IEEE Journal on Selected Areas in Communications, 16:55-108.] proved to be the best trade-off, since it is on average 3.2 times smaller, 1.5 times faster and requires slightly less power than the next best trade-off in the comparison, which is a 3/4-rate full-diversity 3Tx-antenna STBC.
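
    The simplest of the encoders compared above, Alamouti's rate-1 two-antenna code, has a transmission matrix that can be written down directly. The numpy sketch below shows that mapping for one pair of complex symbols (rows are antennas, columns are symbol periods) and checks the orthogonality property; it is independent of any FPGA or VHDL implementation detail.

```python
import numpy as np

def alamouti_encode(s1, s2):
    """Alamouti 2x2 block: rows = antennas, columns = symbol periods.
    Period 1: antenna 1 sends s1, antenna 2 sends s2.
    Period 2: antenna 1 sends -conj(s2), antenna 2 sends conj(s1)."""
    return np.array([[s1, -np.conj(s2)],
                     [s2,  np.conj(s1)]])

# Example QPSK symbols
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)
block = alamouti_encode(s1, s2)
print(block)

# Orthogonality of the code: block @ block^H is proportional to the identity matrix.
print(np.allclose(block @ block.conj().T, (abs(s1)**2 + abs(s2)**2) * np.eye(2)))
```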

  7. Local Laser Strengthening of Steel Sheets for Load Adapted Component Design in Car Body Structures

    NASA Astrophysics Data System (ADS)

    Jahn, Axel; Heitmanek, Marco; Standfuss, Jens; Brenner, Berndt; Wunderlich, Gerd; Donat, Bernd

    The current trend in car body construction concerning light weight design and car safety improvement increasingly requires an adaption of the local material properties on the component load. Martensitic hardenable steels, which are typically used in car body components, show a significant hardening effect, for instance in laser welded seams. This effect can be purposefully used as a local strengthening method. For several steel grades the local strengthening, resulting from a laser remelting process was investigated. The strength in the treated zone was determined at crash relevant strain rates. A load adapted design of complex reinforcement structures was developed for compression and bending loaded tube samples, using numerical simulation of the deformation behavior. Especially for bending loaded parts, the crash energy absorption can be increased significantly by local laser strengthening.

  8. Design of a scientific probe for obtaining Mars surface material

    NASA Technical Reports Server (NTRS)

    1990-01-01

    With the recent renewed interest in interplanetary and deep space exploratory missions, the Red Planet, Mars, which has captured people's imagination for centuries, has again become a center of attention. In the late 1960s and early 1970s, a series of Mariner missions performed fly-by investigations of the Mars surface and atmosphere. Later, in the mid 1970s, the data gathered by these earlier Mariner missions provided the basis of the much-publicized Viking missions, whose main objective was to determine the possibility of extraterrestrial life on Mars. More recently, with the dramatic changes in international politics, ambitious joint manned missions between the United States and the Soviet Union have been proposed to be launched in the early 21st century. In light of these exciting developments, the Spacecraft Design course, which was newly established at UCLA under NASA/USRA sponsorship, has developed its curriculum around a design project: the synthesis of an unmanned Martian landing probe. The students are required to conceive a preliminary design of a small spacecraft that is capable of landing at a designated site, collecting soil samples, and then returning the samples to orbit. The goal of the project is to demonstrate the feasibility of such a mission. This preliminary study of an interplanetary exploration mission has shown the feasibility of such a mission. The students have learned valuable lessons about the complexity of spacecraft design, even though the mission is relatively simple.

  9. Evaluation of a hydrophilic interaction liquid chromatography design space for sugars and sugar alcohols.

    PubMed

    Hetrick, Evan M; Kramer, Timothy T; Risley, Donald S

    2017-03-17

    Based on a column-screening exercise, a column ranking system was developed for sample mixtures containing any combination of 26 sugar and sugar alcohol analytes using 16 polar stationary phases in the HILIC mode with acetonitrile/water or acetone/water mobile phases. Each analyte was evaluated on the HILIC columns with gradient elution and the subsequent chromatography data was compiled into a statistical software package where any subset of the analytes can be selected and the columns are then ranked by the greatest separation. Since these analytes lack chromophores, aerosol-based detectors, including an evaporative light scattering detector (ELSD) and a charged aerosol detector (CAD) were employed for qualitative and quantitative detection. Example qualitative applications are provided to illustrate the practicality and efficiency of this HILIC column ranking. Furthermore, the design-space approach was used as a starting point for a quantitative method for the trace analysis of glucose in trehalose samples in a complex matrix. Knowledge gained from evaluating the design-space led to rapid development of a capable method as demonstrated through validation of the following parameters: specificity, accuracy, precision, linearity, limit of quantitation, limit of detection, and range. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. High-resolution neutron powder diffractometer SPODI at research reactor FRM II

    NASA Astrophysics Data System (ADS)

    Hoelzel, M.; Senyshyn, A.; Juenke, N.; Boysen, H.; Schmahl, W.; Fuess, H.

    2012-03-01

    SPODI is a high-resolution thermal neutron diffractometer at the research reactor Heinz Maier-Leibnitz (FRM II) especially dedicated to structural studies of complex systems. Unique features like a very large monochromator take-off angle of 155° and a 5 m monochromator-sample distance in its standard configuration achieve both high resolution and a good profile shape for a broad scattering angle range. Two-dimensional data are collected by an array of 80 vertical position-sensitive 3He detectors. SPODI is well suited for studies of complex structural and magnetic order and disorder phenomena at non-ambient conditions. In addition to standard sample environment facilities (cryostats, furnaces, magnet) specific devices (rotatable load frame, cell for electric fields, multichannel potentiostat) were developed. Thus the characterisation of functional materials at in-operando conditions can be achieved. In this contribution the details of the design and present performance of the instrument are reported along with its specifications. A new concept for data reduction using a 2θ-dependent variable height for the intensity integration along the Debye-Scherrer lines is introduced.

  11. Extractive Atmospheric Pressure Photoionization (EAPPI) Mass Spectrometry: Rapid Analysis of Chemicals in Complex Matrices.

    PubMed

    Liu, Chengyuan; Yang, Jiuzhong; Wang, Jian; Hu, Yonghua; Zhao, Wan; Zhou, Zhongyue; Qi, Fei; Pan, Yang

    2016-10-01

    Extractive atmospheric pressure photoionization (EAPPI) mass spectrometry was designed for rapid qualitative and quantitative analysis of chemicals in complex matrices. In this method, an ultrasonic nebulization system was applied to sample extraction, nebulization, and vaporization. Mixed with a gaseous dopant, vaporized analytes were ionized through ambient photon-induced ion-molecule reactions, and were mass-analyzed by a high resolution time-of-flight mass spectrometer (TOF-MS). After careful optimization and testing with pure sample solution, EAPPI was successfully applied to the fast screening of capsules, soil, natural products, and viscous compounds. Analysis was completed within a few seconds without the need for preseparation. Moreover, the quantification capability of EAPPI for matrices was evaluated by analyzing six polycyclic aromatic hydrocarbons (PAHs) in soil. The correlation coefficients (R2) for standard curves of all six PAHs were above 0.99, and the detection limits were in the range of 0.16-0.34 ng/mg. In addition, EAPPI could also be used to monitor organic chemical reactions in real time.

  12. Forensic analysis of anthraquinone, azo, and metal complex acid dyes from nylon fibers by micro-extraction and capillary electrophoresis.

    PubMed

    Stefan, Amy R; Dockery, Christopher R; Nieuwland, Alexander A; Roberson, Samantha N; Baguley, Brittany M; Hendrix, James E; Morgan, Stephen L

    2009-08-01

    The extraction and separation of dyes present on textile fibers offers the possibility of enhanced discrimination between forensic trace fiber evidence. An automated liquid sample handling workstation was programmed to deliver varying solvent combinations to acid-dyed nylon samples, and the resulting extracts were analyzed by an ultraviolet/visible microplate reader to evaluate extraction efficiencies at different experimental conditions. Combinatorial experiments using three-component mixture designs varied three solvents (water, pyridine, and aqueous ammonia) and were employed at different extraction temperatures for various extraction durations. The extraction efficiency as a function of the three solvents (pyridine/ammonia/water) was modeled and used to define optimum conditions for the extraction of three subclasses of acid dyes (anthraquinone, azo, and metal complex) from nylon fibers. The capillary electrophoresis analysis of acid dye extracts is demonstrated using an electrolyte solution of 15 mM ammonium acetate in acetonitrile/water (40:60, v/v) at pH 9.3. Excellent separations and discriminating diode array spectra are obtained even for dyes of similar color.

  13. Progress and development of analytical methods for gibberellins.

    PubMed

    Pan, Chaozhi; Tan, Swee Ngin; Yong, Jean Wan Hong; Ge, Liya

    2017-01-01

    Gibberellins, as a group of phytohormones, exhibit a wide variety of bio-functions within plant growth and development, which have been used to increase crop yields. Many analytical procedures, therefore, have been developed for the determination of the types and levels of endogenous and exogenous gibberellins. As plant tissues contain gibberellins in trace amounts (usually at the level of nanogram per gram fresh weight or even lower), the sample pre-treatment steps (extraction, pre-concentration, and purification) for gibberellins are reviewed in detail. The primary focus of this comprehensive review is on the various analytical methods designed to meet the requirements for gibberellin analyses in complex matrices, with particular emphasis on high-throughput analytical methods, such as gas chromatography, liquid chromatography, and capillary electrophoresis, mostly combined with mass spectrometry. The advantages and drawbacks of each described analytical method are discussed. The overall aim of this review is to provide a comprehensive and critical view of the different analytical methods currently employed to analyze gibberellins in complex sample matrices and their foreseeable trends. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. [Management of a breast cystic syndrome: Guidelines].

    PubMed

    Uzan, C; Seror, J-Y; Seror, J

    2015-12-01

    Breast cysts are common, often discovered incidentally or subsequently to pain or palpable mass. The purpose of these recommendations is to describe the sonographic findings for classifying breast cystic lesions, to analyze the value and contribution of various imaging techniques and sampling and to provide a management strategy. Literature review conducted by a small group and then reviewed and validated by the group designated by the Collège national des gynécologues et obstétriciens français (CNGOF) to make recommendations for clinical practice for benign breast lesions. Breast cysts are classified in 3 categories: simple cysts, complicated cysts and complex cysts. For simple cysts, after ultrasound, no further imaging is necessary, cytology is to consider only as analgesic. For complicated cysts, a control at 4-6 months is recommended; the use of cytology depends on the context (familial risk, difficulty of follow-up). In case of complex cyst, sampling by cytology or biopsy is recommended. More assessments of other imaging tests are reported. The sonographic characterization is essential for management of breast cyst. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  15. Complexity analysis of brain activity in attention-deficit/hyperactivity disorder: A multiscale entropy analysis.

    PubMed

    Chenxi, Li; Chen, Yanni; Li, Youjun; Wang, Jue; Liu, Tian

    2016-06-01

    Multiscale entropy (MSE) is a novel method for quantifying the intrinsic dynamical complexity of physiological systems over several scales. To evaluate this method as a promising way to explore the neural mechanisms in ADHD, we calculated the MSE in EEG activity during the designed task. EEG data were collected from 13 outpatient boys with a confirmed diagnosis of ADHD and 13 age- and gender-matched normal control children while they performed the multi-source interference task (MSIT). We estimated the MSE by calculating the sample entropy values of the delta, theta, alpha and beta frequency bands over twenty time scales using a coarse-graining procedure. The results showed increased complexity of EEG data in the delta and theta frequency bands and decreased complexity in the alpha frequency band in ADHD children. The findings of this study revealed aberrant neural connectivity in children with ADHD during the interference task. The results showed that the MSE method may be a new index to identify and understand the neural mechanism of ADHD. Copyright © 2016 Elsevier Inc. All rights reserved.
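
    The coarse-graining plus sample-entropy recipe described above is compact enough to sketch directly. The numpy illustration below uses the conventional choices m = 2 and a tolerance of 0.15 times the standard deviation of the original series, applied to a white-noise stand-in signal; the EEG preprocessing and band decomposition used in the study are not reproduced.

```python
import numpy as np

def sample_entropy(x, m, r):
    """SampEn(m, r): -ln(A/B) using Chebyshev distance and absolute tolerance r."""
    x = np.asarray(x, float)
    n = len(x)
    def matches(length):
        templates = np.array([x[i:i + length] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(dist <= r))
        return count
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.nan

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return np.asarray(x[:n * scale], float).reshape(n, scale).mean(axis=1)

def multiscale_entropy(x, scales=range(1, 21), m=2, r_factor=0.15):
    r = r_factor * np.std(x)            # tolerance fixed from the original series
    return [sample_entropy(coarse_grain(x, s), m, r) for s in scales]

rng = np.random.default_rng(3)
signal = rng.standard_normal(3000)      # stand-in for a band-passed EEG segment
print(np.round(multiscale_entropy(signal, scales=range(1, 11)), 3))
```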

  16. Delivery route determines the presence of immune complexes on umbilical cord erythrocytes.

    PubMed

    de Lima, Andrés; Franco, Luis C; Sarmiento, Andrés; González, John M

    2017-11-01

    Umbilical cord blood offers a unique opportunity to study the basal level of immunoglobulin complexes. This study aims to determine the presence of immune complexes and complement deposition on erythrocytes from umbilical cord blood from normal, full-term pregnancies. In vitro pre-formed IgA, IgG, and IgM complexes were used as positive control for flow cytometry detection, and for C3d deposition. Blood samples (34) of umbilical cord blood taken from vaginal and cesarean deliveries were tested for the presence of immunoglobulin complexes. Fourteen samples from vaginal deliveries and 20 samples from cesarean deliveries were assessed. IgG and IgM complexes were detected on erythrocytes, whereas no IgA complexes or complement deposition was observed. Interestingly, the percentage of IgG complexes was higher on erythrocytes from vaginal delivery samples compared to those from cesarean deliveries. No other associations between immune complexes and other maternal or newborn variables were found. IgG and IgM complexes seem to be normally present on umbilical cord erythrocytes. Erythrocytes from vaginal deliveries have a higher percentage of IgG complexes present compared to that from cesarean deliveries. Since no C3d activity was detected, these complexes are non-pathological and should be part of the newborn's initial innate immune response.

  17. An instrument design and sample strategy for measuring soil respiration in the coastal temperate rain forest

    NASA Astrophysics Data System (ADS)

    Nay, S. M.; D'Amore, D. V.

    2009-12-01

    The coastal temperate rainforest (CTR) along the northwest coast of North America is a large and complex mosaic of forests and wetlands located on an undulating terrain ranging from sea level to thousands of meters in elevation. This biome stores a dynamic portion of the total carbon stock of North America. The fate of the terrestrial carbon stock is of concern due to the potential for mobilization and export of this store to both the atmosphere as carbon respiration flux and the ocean as dissolved organic and inorganic carbon flux. Soil respiration is the largest export vector in the system and must be accurately measured to gain any comprehensive understanding of how carbon moves through this system. Suitable monitoring tools capable of measuring carbon fluxes at small spatial scales are essential for our understanding of carbon dynamics at larger spatial scales within this complex assemblage of ecosystems. We have adapted instrumentation and developed a sampling strategy for optimizing replication of soil respiration measurements to quantify differences among spatially complex landscape units of the CTR. We start with the design of the instrument to ease the technological, ergonomic and financial barriers that technicians encounter in monitoring the efflux of CO2 from the soil. Our sampling strategy optimizes the physical efforts of the field work and manages the high variation of flux measurements encountered in this difficult environment of rough terrain, dense vegetation and wet climate. Our soil respirometer incorporates an infra-red gas analyzer (LiCor Inc. LI-820) and an 8300 cm3 soil respiration chamber; the device is durable, lightweight, easy to operate and can be built for under $5000 per unit. The modest unit price allows for a multiple unit fleet to be deployed and operated in an intensive field monitoring campaign. We use a large 346 cm2 collar to accommodate as much micro spatial variation as feasible and to facilitate repeated measures for tracking temporal trends. Our collar design minimizes root interference yet provides a highly stable platform for coupling with the respirometer. Meso-scale variability, characterized by large down woody debris, windthrow pits and mounds and surface roots, is handled by a hexagonal array of seven collars at two meter spacing (sample pod). Landscape scale variability is managed through stratification and replication amongst ecosystem types arrayed across a hydrologic gradient from bogs to forested wetlands to upland forests. Our strategy has allowed us to gather data sets consisting of approximately 1800 total observations with approximately 600 measurements per replication per year. Mean coefficients of variation (CV) at the collar (micro-scale) were approximately 0.67. The mean CV at the pod level (meso-scale) was reduced to approximately 0.29. The CVs for the vegetation strata were 0.43, 0.18 and 0.21 for bog, forested wetland and upland forest, respectively. With temperature and hydrological data we are able to measure and model carbon dynamics in this large and complex environment. The analysis of variability at the three spatial scales has confirmed that our approach is capturing and constraining the variability.

  18. Simulating recurrent event data with hazard functions defined on a total time scale.

    PubMed

    Jahn-Eimermacher, Antje; Ingel, Katharina; Ozga, Ann-Kathrin; Preussler, Stella; Binder, Harald

    2015-03-08

    In medical studies with recurrent event data a total time scale perspective is often needed to adequately reflect disease mechanisms. This means that the hazard process is defined on the time since some starting point, e.g. the beginning of some disease, in contrast to a gap time scale where the hazard process restarts after each event. While techniques such as the Andersen-Gill model have been developed for analyzing data from a total time perspective, techniques for the simulation of such data, e.g. for sample size planning, have not been investigated so far. We have derived a simulation algorithm covering the Andersen-Gill model that can be used for sample size planning in clinical trials as well as the investigation of modeling techniques. Specifically, we allow for fixed and/or random covariates and an arbitrary hazard function defined on a total time scale. Furthermore we take into account that individuals may be temporarily insusceptible to a recurrent incidence of the event. The methods are based on conditional distributions of the inter-event times conditional on the total time of the preceding event or study start. Closed-form solutions are provided for common distributions. The derived methods have been implemented in a readily accessible R script. The proposed techniques are illustrated by planning the sample size for a clinical trial with complex recurrent event data. The required sample size is shown to be affected not only by censoring and intra-patient correlation, but also by the presence of risk-free intervals. This demonstrates the need for a simulation algorithm that particularly allows for complex study designs where no analytical sample size formulas might exist. The derived simulation algorithm is seen to be useful for the simulation of recurrent event data that follow an Andersen-Gill model. Next to the use of a total time scale, it allows for intra-patient correlation and risk-free intervals as are often observed in clinical trial data. Its application therefore allows the simulation of data that closely resemble real settings and thus can improve the use of simulation studies for designing and analysing studies.
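
    The closed-form inversion idea described above can be sketched for one common special case: a Weibull-type cumulative hazard with a multiplicative covariate effect and an optional risk-free interval after each event. This is a minimal illustration of the conditional-distribution approach, not the authors' R implementation, and all parameter values below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_subject(beta_x, lam=0.1, k=1.5, follow_up=10.0, risk_free=0.0):
    """Recurrent event times on a total time scale (Weibull-type hazard).

    Cumulative hazard: Lambda(t) = exp(beta_x) * lam * t**k.  Conditional on
    the previous event at time s, the next event time T solves
    Lambda(T) - Lambda(s) = -log(U) with U ~ Uniform(0, 1), which inverts in
    closed form.  `risk_free` crudely shifts the restart point to mimic a
    temporary insusceptible period after each event.
    """
    rate = np.exp(beta_x) * lam
    events, s = [], 0.0
    while True:
        u = rng.uniform()
        t_next = (s ** k - np.log(u) / rate) ** (1.0 / k)
        if t_next > follow_up:
            return events
        events.append(t_next)
        s = t_next + risk_free

# Example: subjects with and without a binary covariate (log hazard ratio -0.7).
for x in (0, 1, 1):
    print(simulate_subject(beta_x=-0.7 * x, risk_free=0.5))
```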

  19. Mitochondrial dysfunction in myocardium obtained from clinically normal dogs, clinically normal anesthetized dogs, and dogs with dilated cardiomyopathy.

    PubMed

    Sleeper, Meg M; Rosato, Bradley P; Bansal, Seema; Avadhani, Narayan G

    2012-11-01

    To compare mitochondrial complex I and complex IV activity in myocardial mitochondria of clinically normal dogs, clinically normal dogs exposed to inhalation anesthesia, and dogs affected with dilated cardiomyopathy. Myocardial samples were obtained from 21 euthanized dogs (6 clinically normal [control] dogs, 5 clinically normal dogs subjected to inhalation anesthesia with isoflurane prior to euthanasia, 5 dogs with juvenile-onset dilated cardiomyopathy, and 5 dogs with adult-onset dilated cardiomyopathy). Activity of mitochondrial complex I and complex IV was assayed spectrophotometrically in isolated mitochondria from left ventricular tissue obtained from the 4 groups of dogs. Activity of complex I and complex IV was significantly decreased in anesthetized dogs, compared with activities in the control dogs and dogs with juvenile-onset or adult-onset dilated cardiomyopathy. Inhalation anesthesia disrupted the electron transport chain in the dogs, which potentially led to a burst of reactive oxygen species that caused mitochondrial dysfunction. Inhalation anesthesia depressed mitochondrial function in dogs, similar to results reported in other species. This effect is important to consider when anesthetizing animals with myocardial disease and suggests that antioxidant treatments may be beneficial in some animals. Additionally, this effect should be considered when designing studies in which mitochondrial enzyme activity will be measured. Additional studies that include a larger number of animals are warranted.

  20. Generalized sample entropy analysis for traffic signals based on similarity measure

    NASA Astrophysics Data System (ADS)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, matches signal patterns in a different way and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study that demonstrates the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation concerning the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can quantitatively distinguish time series from different complex systems by means of the similarity measure.
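
    For readers unfamiliar with the baseline method being modified, the sketch below computes classical sample entropy SampEn(m, r) with the usual Chebyshev template matching; the paper's generalized, similarity-distance-based variant replaces this matching rule. The implementation is compact rather than optimized.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Classical sample entropy SampEn(m, r) with Chebyshev template matching.

    B counts template pairs of length m that match within r*std(x) (excluding
    self-matches), A counts matches of length m+1; SampEn = -log(A / B).
    """
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(templates) - 1):
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= tol)
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# A regular signal should give a lower SampEn than white noise.
t = np.arange(2000)
print(sample_entropy(np.sin(0.05 * t)),
      sample_entropy(np.random.default_rng(0).normal(size=2000)))
```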

  1. Thomson Scattering Diagnostic Data Acquisition Systems for Modern Fusion Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ivanenko, S.V.; Khilchenko, A.D.; Ovchar, V.K.

    2015-07-01

    A uniquely designed complex data acquisition system for the Thomson scattering diagnostic was developed. It records short-duration (3-5 ns) scattered pulses with a 2 GHz sampling rate and 10-bit total resolution in oscilloscope mode. The system consists of up to 48 photodetector modules with 0-200 MHz bandwidth, 1-48 simultaneously sampling ADC modules, and a synchronization subsystem. The photodetector modules are based on avalanche photodiodes (APD) and ultra-low-noise trans-impedance amplifiers. The ADC modules include fast analog-to-digital converters and digital units based on FPGAs (Field-Programmable Gate Arrays) for data processing and storage. The synchronization subsystem is used to form triggering pulses and to organize the simultaneous operation mode of the ADC modules.

  2. Crisamicin A, a new antibiotic from Micromonospora. I. Taxonomy of the producing strain, fermentation, isolation, physico-chemical characterization and antimicrobial properties.

    PubMed

    Nelson, R A; Pope, J A; Luedemann, G M; McDaniel, L E; Schaffner, C P

    1986-03-01

    A microorganism, designated as RV-79-9-101 and now identified as Micromonospora purpureochromogenes subsp. halotolerans, isolated from a mud sample in the Philippines, has been shown to produce a complex of antibiotics called crisamicins. Thin-layer chromatography and bioautography, employing solvent extracts of whole fermentation broths, revealed a minimum of five antimicrobial components. The major biologically-active component of the antibiotic complex, crisamicin A, was obtained in pure form after preparative silica gel column chromatography followed by crystallization. Based on physico-chemical data crisamicin A has been identified as a novel member of the isochromanequinone group of antibiotics. It exhibits excellent in vitro activity against Gram-positive bacteria but little or no activity towards Gram-negative bacteria or fungi.

  3. Anti-aliasing filter design on spaceborne digital receiver

    NASA Astrophysics Data System (ADS)

    Yu, Danru; Zhao, Chonghui

    2009-12-01

    In recent years, with the development of satellite observation technologies, more and more active remote sensing technologies have been adopted in spaceborne systems. A spaceborne precipitation radar depends heavily on high-performance digital processing to collect meaningful rain echo data, which increases the complexity of the spaceborne system and requires a high-performance, reliable digital receiver. This paper analyzes the frequency aliasing in the intermediate-frequency signal sampling of digital down conversion (DDC) in spaceborne radar and presents an effective digital filter. By analysis and calculation, we choose reasonable parameters for the half-band filters to suppress frequency aliasing in the DDC. Compared with a traditional filter, the FPGA resource cost in our system is reduced by over 50%. This effectively reduces the complexity of the spaceborne digital receiver and improves system reliability.
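
    The half-band approach mentioned above can be illustrated with a generic equiripple design, assuming hypothetical band edges and tap counts rather than the paper's actual DDC parameters. Roughly every other coefficient of a half-band filter is near zero, which is the source of the hardware savings the authors exploit.

```python
import numpy as np
from scipy import signal

# Hypothetical decimate-by-2 stage of a digital down-converter (DDC).
numtaps = 31               # odd tap count, typical for half-band filters
transition = 0.05          # normalized transition width around fs/4

# Equiripple half-band design: passband below fs/4, stopband above fs/4.
edges = [0, 0.25 - transition, 0.25 + transition, 0.5]
taps = signal.remez(numtaps, edges, [1, 0], fs=1.0)

# In an ideal half-band filter nearly every other coefficient is zero,
# which halves the multiplier count (and FPGA resources) in hardware.
print(np.round(taps, 4))

# Apply before decimating by 2 to suppress aliasing into the retained band.
x = np.random.default_rng(0).standard_normal(4096)
y = signal.lfilter(taps, 1.0, x)[::2]
```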

  4. Classification of subsurface objects using singular values derived from signal frames

    DOEpatents

    Chambers, David H; Paglieroni, David W

    2014-05-06

    The classification system represents a detected object with a feature vector derived from the return signals acquired by an array of N transceivers operating in multistatic mode. The classification system generates the feature vector by transforming the real-valued return signals into complex-valued spectra, using, for example, a Fast Fourier Transform. The classification system then generates a feature vector of singular values for each user-designated spectral sub-band by applying a singular value decomposition (SVD) to the N×N square complex-valued matrix formed from sub-band samples associated with all possible transmitter-receiver pairs. The resulting feature vector of singular values may be transformed into a feature vector of singular value likelihoods and then subjected to a multi-category linear or neural network classifier for object classification.
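
    A minimal sketch of the feature-extraction chain described in the patent abstract is shown below: FFT of the real-valued returns, collapse of a user-designated sub-band into an N×N complex matrix, and SVD to obtain the singular-value feature vector. How bins within a sub-band are combined is not specified in the abstract, so the averaging step is an assumption, and the input data here are simulated.

```python
import numpy as np

def subband_singular_values(returns, band):
    """Feature vector of singular values for one spectral sub-band.

    returns : array of shape (N, N, T) holding the real-valued return signal
              for every transmitter-receiver pair of an N-element array.
    band    : slice of FFT bins defining the user-designated sub-band.
    """
    spectra = np.fft.rfft(returns, axis=-1)      # complex-valued spectra
    # Collapse the sub-band (here by averaging bins) into one N x N matrix.
    m = spectra[:, :, band].mean(axis=-1)
    return np.linalg.svd(m, compute_uv=False)    # singular values, largest first

# Illustrative use with simulated data: N = 8 transceivers, 256 time samples.
rng = np.random.default_rng(0)
sig = rng.standard_normal((8, 8, 256))
print(subband_singular_values(sig, slice(10, 20)))
```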

  5. LIGHT-SABRE enables efficient in-magnet catalytic hyperpolarization.

    PubMed

    Theis, Thomas; Truong, Milton; Coffey, Aaron M; Chekmenev, Eduard Y; Warren, Warren S

    2014-11-01

    Nuclear spin hyperpolarization overcomes the sensitivity limitations of traditional NMR and MRI, but the most general method demonstrated to date (dynamic nuclear polarization) has significant limitations in scalability, cost, and complex apparatus design. As an alternative, signal amplification by reversible exchange (SABRE) of parahydrogen on transition metal catalysts can hyperpolarize a variety of substrates, but to date this scheme has required transfer of the sample to low magnetic field or very strong RF irradiation. Here we demonstrate "Low-Irradiation Generation of High Tesla-SABRE" (LIGHT-SABRE) which works with simple pulse sequences and low power deposition; it should be usable at any magnetic field and for hyperpolarization of many different nuclei. This approach could drastically reduce the cost and complexity of producing hyperpolarized molecules. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. DNA curtains for high-throughput single-molecule optical imaging.

    PubMed

    Greene, Eric C; Wind, Shalom; Fazio, Teresa; Gorman, Jason; Visnapuu, Mari-Liis

    2010-01-01

    Single-molecule approaches provide a valuable tool in the arsenal of the modern biologist, and new discoveries continue to be made possible through the use of these state-of-the-art technologies. However, it can be inherently difficult to obtain statistically relevant data from experimental approaches specifically designed to probe individual reactions. This problem is compounded with more complex biochemical reactions, heterogeneous systems, and/or reactions requiring the use of long DNA substrates. Here we give an overview of a technology developed in our laboratory, which relies upon simple micro- or nanofabricated structures in combination with "bio-friendly" lipid bilayers, to align thousands of long DNA molecules into defined patterns on the surface of a microfluidic sample chamber. We call these "DNA curtains," and we have developed several different versions varying in complexity and DNA substrate configuration, which are designed to meet different experimental needs. This novel approach to single-molecule imaging provides a powerful experimental platform that offers the potential for concurrent observation of hundreds or even thousands of protein-DNA interactions in real time. Copyright 2010 Elsevier Inc. All rights reserved.

  7. Plot shape effects on plant species diversity measurements

    USGS Publications Warehouse

    Keeley, Jon E.; Fotheringham, C.J.

    2005-01-01

    Question: Do rectangular sample plots record more plant species than square plots as suggested by both empirical and theoretical studies? Location: Grasslands, shrublands and forests in the Mediterranean-climate region of California, USA. Methods: We compared three 0.1-ha sampling designs that differed in the shape and dispersion of 1-m2 and 100-m2 nested subplots. We duplicated an earlier study that compared the Whittaker sample design, which had square clustered subplots, with the modified Whittaker design, which had dispersed rectangular subplots. To sort out effects of dispersion from shape we used a third design that overlaid square subplots on the modified Whittaker design. Also, using data from published studies we extracted species richness values for 400-m2 subplots that were either square or 1:4 rectangles partially overlaid on each other from desert scrub in high and low rainfall years, chaparral, sage scrub, oak savanna and coniferous forests with and without fire. Results: We found that earlier empirical reports of more than 30% greater richness with rectangles were due to the confusion of shape effects with spatial effects, coupled with the use of cumulative number of species as the metric for comparison. Average species richness was not significantly different between square and 1:4 rectangular sample plots at either 1- or 100-m2. Pairwise comparisons showed no significant difference between square and rectangular samples in all but one vegetation type, and that one exhibited significantly greater richness with squares. Our three intensive study sites appear to exhibit some level of self-similarity at the scale of 400 m2, but, contrary to theoretical expectations, we could not detect plot shape effects on species richness at this scale. Conclusions: At the 0.1-ha scale or lower there is no evidence that plot shape has predictable effects on number of species recorded from sample plots. We hypothesize that for the mediterranean-climate vegetation types studied here, the primary reason that 1:4 rectangles do not sample greater species richness than squares is because species turnover varies along complex environmental gradients that are both parallel and perpendicular to the long axis of rectangular plots. Reports in the literature of much greater species richness recorded for highly elongated rectangular strips than for squares of the same area are not likely to be fair comparisons because of the dramatically different periphery/area ratio, which includes a much greater proportion of species that are using both above and below-ground niche space outside the sample area.

  8. Estimation of Variance in the Case of Complex Samples.

    ERIC Educational Resources Information Center

    Groenewald, A. C.; Stoker, D. J.

    In a complex sampling scheme it is desirable to select the primary sampling units (PSUs) without replacement to prevent duplications in the sample. Since the estimation of the sampling variances is more complicated when the PSUs are selected without replacement, L. Kish (1965) recommends that the variance be calculated using the formulas…

  9. Modeling OPC complexity for design for manufacturability

    NASA Astrophysics Data System (ADS)

    Gupta, Puneet; Kahng, Andrew B.; Muddu, Swamy; Nakagawa, Sam; Park, Chul-Hong

    2005-11-01

    Increasing design complexity in sub-90nm designs results in increased mask complexity and cost. Resolution enhancement techniques (RET) such as assist feature addition, phase shifting (attenuated PSM) and aggressive optical proximity correction (OPC) help in preserving feature fidelity in silicon but increase mask complexity and cost. The increase in data volume that accompanies rising mask complexity is becoming prohibitive for manufacturing. Mask cost is determined by mask write time and mask inspection time, which are directly related to the complexity of features printed on the mask. Aggressive RETs increase complexity by adding assist features and by modifying existing features. Passing design intent to OPC has been identified in several recent works as a solution for reducing mask complexity and cost. The goal of design-aware OPC is to relax OPC tolerances of layout features to minimize mask cost, without sacrificing parametric yield. To convey optimal OPC tolerances for manufacturing, design optimization should drive OPC tolerance optimization using models of mask cost for devices and wires. Design optimization should be aware of the impact of OPC correction levels on mask cost and on the performance of the design. This work introduces mask cost characterization (MCC), which quantifies OPC complexity, measured in terms of the fracture count of the mask, for different OPC tolerances. MCC with different OPC tolerances is a critical step in linking design and manufacturing. In this paper, we present an MCC methodology that provides models of the fracture count of standard cells and wire patterns for use in design optimization. MCC cannot be performed by designers as they do not have access to foundry OPC recipes and RET tools. To build a fracture count model, we perform OPC and fracturing on a limited set of standard cells and wire configurations with all tolerance combinations. Separately, we identify the characteristics of the layout that impact fracture count. Based on the fracture count (FC) data from OPC and mask data preparation runs, we build models of FC as a function of OPC tolerances and layout parameters.
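
    The abstract does not give the functional form of the fracture-count model, so the sketch below simply fits a log-linear model of fracture count against OPC tolerances and a layout parameter with ordinary least squares. All numeric values are hypothetical placeholders, not MCC data, and the chosen features are illustrative.

```python
import numpy as np

# Hypothetical MCC runs: each row is one OPC + fracturing run on a standard
# cell or wire pattern (edge placement tolerance, corner tolerance, edge count).
tol_epe    = np.array([1, 1, 2, 2, 4, 4, 8, 8], dtype=float)      # nm
tol_corner = np.array([2, 4, 2, 4, 4, 8, 8, 16], dtype=float)     # nm
edges      = np.array([120, 120, 120, 120, 300, 300, 300, 300], dtype=float)
fc         = np.array([950, 820, 760, 650, 1400, 1150, 900, 720], dtype=float)

# Log-linear model: log(FC) = a + b*log(tol_epe) + c*log(tol_corner) + d*log(edges)
X = np.column_stack([np.ones_like(fc), np.log(tol_epe),
                     np.log(tol_corner), np.log(edges)])
coef, *_ = np.linalg.lstsq(X, np.log(fc), rcond=None)

def predict_fc(te, tc, ne):
    """Predicted fracture count for a new tolerance / layout combination."""
    return float(np.exp(coef @ [1.0, np.log(te), np.log(tc), np.log(ne)]))

print(coef)
print(predict_fc(3.0, 6.0, 200.0))
```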

  10. Single particle tracking through highly scattering media with multiplexed two-photon excitation

    NASA Astrophysics Data System (ADS)

    Perillo, Evan; Liu, Yen-Liang; Liu, Cong; Yeh, Hsin-Chih; Dunn, Andrew K.

    2015-03-01

    3D single-particle tracking (SPT) has been a pivotal tool to furthering our understanding of dynamic cellular processes in complex biological systems, with a molecular localization accuracy (10-100 nm) often better than the diffraction limit of light. However, current SPT techniques utilize either CCDs or a confocal detection scheme which not only suffer from poor temporal resolution but also limit tracking to a depth less than one scattering mean free path in the sample (typically <15μm). In this report we highlight our novel design for a spatiotemporally multiplexed two-photon microscope which is able to reach sub-diffraction-limit tracking accuracy and sub-millisecond temporal resolution, but with a dramatically extended SPT range of up to 200 μm through dense cell samples. We have validated our microscope by tracking (1) fluorescent nanoparticles in a prescribed motion inside gelatin gel (with 1% intralipid) and (2) labeled single EGFR complexes inside skin cancer spheroids (at least 8 layers of cells thick) for ~10 minutes. Furthermore we discuss future capabilities of our multiplexed two-photon microscope design, specifically to the extension of (1) simultaneous multicolor tracking (i.e. spatiotemporal co-localization analysis) and (2) FRET studies (i.e. lifetime analysis). The high resolution, high depth penetration, and multicolor features of this microscope make it well poised to study a variety of molecular scale dynamics in the cell, especially related to cellular trafficking studies with in vitro tumor models and in vivo.

  11. Three-dimensional printed magnetophoretic system for the continuous flow separation of avian influenza H5N1 viruses.

    PubMed

    Wang, Yuhe; Li, Yanbin; Wang, Ronghui; Wang, Maohua; Lin, Jianhan

    2017-04-01

    As a result of the low concentration of avian influenza viruses in samples for routine screening, the separation and concentration of these viruses are vital for their sensitive detection. We present a novel three-dimensional printed magnetophoretic system for the continuous flow separation of the viruses using aptamer-modified magnetic nanoparticles, a magnetophoretic chip, a magnetic field, and a fluidic controller. The magnetic field was designed based on finite element magnetic simulation and developed using neodymium magnets with a maximum intensity of 0.65 T and a gradient of 32 T/m for dragging the nanoparticle-virus complexes. The magnetophoretic chip was designed by SOLIDWORKS and fabricated by a three-dimensional printer with a magnetophoretic channel for the continuous flow separation of the viruses using phosphate-buffered saline as carrier flow. The fluidic controller was developed using a microcontroller and peristaltic pumps to inject the carrier flow and the viruses. The trajectory of the virus-nanoparticle complexes was simulated using COMSOL for optimization of the carrier flow and the magnetic field, respectively. The results showed that the H5N1 viruses could be captured, separated, and concentrated using the proposed magnetophoretic system with the separation efficiency up to 88% in a continuous flow separation time of 2 min for a sample volume of 200 μL. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Exploring the Molecular Design of Protein Interaction Sites with Molecular Dynamics Simulations and Free Energy Calculations†

    PubMed Central

    Liang, Shide; Li, Liwei; Hsu, Wei-Lun; Pilcher, Meaghan N.; Uversky, Vladimir; Zhou, Yaoqi; Dunker, A. Keith; Meroueh, Samy O.

    2009-01-01

    The significant work that has been invested toward understanding protein–protein interaction has not translated into significant advances in structure-based predictions. In particular redesigning protein surfaces to bind to unrelated receptors remains a challenge, partly due to receptor flexibility, which is often neglected in these efforts. In this work, we computationally graft the binding epitope of various small proteins obtained from the RCSB database to bind to barnase, lysozyme, and trypsin using a previously derived and validated algorithm. In an effort to probe the protein complexes in a realistic environment, all native and designer complexes were subjected to a total of nearly 400 ns of explicit-solvent molecular dynamics (MD) simulation. The MD data led to an unexpected observation: some of the designer complexes were highly unstable and decomposed during the trajectories. In contrast, the native and a number of designer complexes remained consistently stable. The unstable conformers provided us with a unique opportunity to define the structural and energetic factors that lead to unproductive protein–protein complexes. To that end we used free energy calculations following the MM-PBSA approach to determine the role of nonpolar effects, electrostatics and entropy in binding. Remarkably, we found that a majority of unstable complexes exhibited more favorable electrostatics than native or stable designer complexes, suggesting that favorable electrostatic interactions are not prerequisite for complex formation between proteins. However, nonpolar effects remained consistently more favorable in native and stable designer complexes reinforcing the importance of hydrophobic effects in protein–protein binding. While entropy systematically opposed binding in all cases, there was no observed trend in the entropy difference between native and designer complexes. A series of alanine scanning mutations of hot-spot residues at the interface of native and designer complexes showed less than optimal contacts of hot-spot residues with their surroundings in the unstable conformers, resulting in more favorable entropy for these complexes. Finally, disorder predictions revealed that secondary structures at the interface of unstable complexes exhibited greater disorder than the stable complexes. PMID:19113835

  13. Advantages and challenges in automated apatite fission track counting

    NASA Astrophysics Data System (ADS)

    Enkelmann, E.; Ehlers, T. A.

    2012-04-01

    Fission track thermochronometer data are often a core element of modern tectonic and denudation studies. Soon after the development of the fission track methods, interest emerged in developing an automated counting procedure to replace the time-consuming labor of counting fission tracks under the microscope. Automated track counting became feasible in recent years with increasing improvements in computer software and hardware. One such example used in this study is the commercial automated fission track counting procedure from Autoscan Systems Pty, which has been highlighted through several venues. We conducted experiments that are designed to reliably and consistently test the ability of this fully automated counting system to recognize fission tracks in apatite and a muscovite external detector. Fission tracks were analyzed in samples with a step-wise increase in sample complexity. The first set of experiments used a large (mm-size) slice of Durango apatite cut parallel to the prism plane. Second, samples with 80-200 μm large apatite grains of Fish Canyon Tuff were analyzed. This second sample set is characterized by complexities often found in apatites in different rock types. In addition to the automated counting procedure, the same samples were also analyzed using conventional counting procedures. We found for all samples that the fully automated fission track counting procedure using the Autoscan System yields a larger scatter in the measured fission track densities compared to conventional (manual) track counting. This scatter typically resulted from the false identification of tracks due to surface and mineralogical defects, regardless of the image filtering procedure used. Large differences between track densities analyzed with the automated counting persisted between different grains analyzed in one sample as well as between different samples. As a result of these differences, a manual correction of the fully automated fission track counts is necessary for each individual surface area and grain counted. This manual correction procedure significantly increases (up to four times) the time required to analyze a sample with the automated counting procedure compared to the conventional approach.

  14. Conventional and Accelerated-Solvent Extractions of Green Tea (Camellia sinensis) for Metabolomics-based Chemometrics

    PubMed Central

    Kellogg, Joshua J.; Wallace, Emily D.; Graf, Tyler N.; Oberlies, Nicholas H.; Cech, Nadja B.

    2018-01-01

    Metabolomics has emerged as an important analytical technique for multiple applications. The value of information obtained from metabolomics analysis depends on the degree to which the entire metabolome is present and the reliability of sample treatment to ensure reproducibility across the study. The purpose of this study was to compare methods of preparing complex botanical extract samples prior to metabolomics profiling. Two extraction methodologies, accelerated solvent extraction and a conventional solvent maceration, were compared using commercial green tea [Camellia sinensis (L.) Kuntze (Theaceae)] products as a test case. The accelerated solvent protocol was first evaluated to ascertain critical factors influencing extraction using a D-optimal experimental design study. The accelerated solvent and conventional extraction methods yielded similar metabolite profiles for the green tea samples studied. The accelerated solvent extraction yielded higher total amounts of extracted catechins, was more reproducible, and required less active bench time to prepare the samples. This study demonstrates the effectiveness of accelerated solvent as an efficient methodology for metabolomics studies. PMID:28787673

  15. Laser Time-of-Flight Mass Spectrometry for Future In Situ Planetary Missions

    NASA Technical Reports Server (NTRS)

    Getty, S. A.; Brinckerhoff, W. B.; Cornish, T.; Ecelberger, S. A.; Li, X.; Floyd, M. A. Merrill; Chanover, N.; Uckert, K.; Voelz, D.; Xiao, X.

    2012-01-01

    Laser desorption/ionization time-of-flight mass spectrometry (LD-TOF-MS) is a versatile, low-complexity instrument class that holds significant promise for future landed in situ planetary missions that emphasize compositional analysis of surface materials. Here we describe a 5kg-class instrument that is capable of detecting and analyzing a variety of analytes directly from rock or ice samples. Through laboratory studies of a suite of representative samples, we show that detection and analysis of key mineral composition, small organics, and particularly, higher molecular weight organics are well suited to this instrument design. A mass range exceeding 100,000 Da has recently been demonstrated. We describe recent efforts in instrument prototype development and future directions that will enhance our analytical capabilities targeting organic mixtures on primitive and icy bodies. We present results on a series of standards, simulated mixtures, and meteoritic samples.

  16. Learning Bayesian Networks from Correlated Data

    NASA Astrophysics Data System (ADS)

    Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola

    2016-05-01

    Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.
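
    The authors describe a novel random-effects parameterization of Bayesian networks; as a simplified illustration of the underlying idea only, the sketch below scores a single candidate parent set for a continuous node with a random intercept per cluster (via statsmodels MixedLM), so that within-cluster correlation is not mistaken for a parent-child dependency. The file name, column names, and BIC-like score are assumptions, not the authors' learning metric.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: outcome y, candidate parents x1 and x2, and a cluster
# identifier (e.g. family_id) that induces correlation among observations.
df = pd.read_csv("clustered_observations.csv")

def node_score(child, parents, data, group_col="family_id"):
    """BIC-like score of one candidate parent set for a continuous node,
    with a random intercept per cluster (lower is better)."""
    rhs = " + ".join(parents) if parents else "1"
    model = smf.mixedlm(f"{child} ~ {rhs}", data, groups=data[group_col])
    fit = model.fit(reml=False)                 # ML fit so scores are comparable
    return -2.0 * fit.llf + len(fit.params) * np.log(len(data))

# Compare two candidate parent sets for node y.
print(node_score("y", ["x1"], df))
print(node_score("y", ["x1", "x2"], df))
```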

  17. Mars rover sample return: An exobiology science scenario

    NASA Technical Reports Server (NTRS)

    Rosenthal, D. A.; Sims, M. H.; Schwartz, Deborah E.; Nedell, S. S.; Mckay, Christopher P.; Mancinelli, Rocco L.

    1988-01-01

    A mission designed to collect and return samples from Mars will provide information regarding its composition, history, and evolution. At the same time, a sample return mission generates a technical challenge. Sophisticated, semi-autonomous, robotic spacecraft systems must be developed in order to carry out complex operations at the surface of a very distant planet. An interdisciplinary effort was conducted to consider how such a Mars mission can be realistically structured to maximize the planetary science return. The focus was to concentrate on a particular set of scientific objectives (exobiology), to determine the instrumentation and analyses required to search for biological signatures, and to evaluate what analyses and decision making can be effectively performed by the rover in order to minimize the overhead of constant communication between Mars and the Earth. Investigations were also begun in the area of machine vision to determine whether layered sedimentary structures can be recognized autonomously, and preliminary results are encouraging.

  18. Microfluidics-based integrated airborne pathogen detection systems

    NASA Astrophysics Data System (ADS)

    Northrup, M. Allen; Alleman-Sposito, Jennifer; Austin, Todd; Devitt, Amy; Fong, Donna; Lin, Phil; Nakao, Brian; Pourahmadi, Farzad; Vinas, Mary; Yuan, Bob

    2006-09-01

    Microfluidic Systems is focused on building microfluidic platforms that interface front-end mesofluidics, to handle real-world sample volumes for optimal sensitivity, with microfluidic circuitry that processes small liquid volumes for complex reagent metering, mixing, and biochemical analysis, particularly for pathogens. MFSI is the prime contractor on two programs for the US Department of Homeland Security: BAND (Bioagent Autonomous Networked Detector) and IBADS (Instantaneous Bio-Aerosol Detection System). The goal of BAND is to develop an autonomous system for monitoring the air for known biological agents. This consists of air collection, sample lysis, sample purification, detection of DNA, RNA, and toxins, and a networked interface to report the results. For IBADS, MFSI is developing the confirmatory device, which must verify the presence of a pathogen within 5 minutes of an air collector/trigger sounding an alarm. Instrument designs and biological assay results from both BAND and IBADS will be presented.

  19. Finite element simulation and experimental verification of ultrasonic non-destructive inspection of defects in additively manufactured materials

    NASA Astrophysics Data System (ADS)

    Taheri, H.; Koester, L.; Bigelow, T.; Bond, L. J.

    2018-04-01

    Industrial applications of additively manufactured components are increasing quickly. Adequate quality control of the parts is necessary in ensuring safety when using these materials. Base material properties, surface conditions, as well as the location and size of defects are some of the main targets for nondestructive evaluation of additively manufactured parts, and the problem of adequate characterization is compounded given the challenges of complex part geometry. Numerical modeling can allow the interplay of the various factors to be studied, which can lead to improved measurement design. This paper presents a finite element simulation, verified by experimental results, of ultrasonic waves scattering from flat-bottom holes (FBH) in additively manufactured materials. A focused-beam immersion ultrasound transducer was used in both the simulations and the experiments on the additively manufactured samples. The samples were 17-4 PH stainless steel made by laser sintering in a powder bed.

  20. Automated Comparative Metabolite Profiling of Large LC-ESIMS Data Sets in an ACD/MS Workbook Suite Add-in, and Data Clustering on a New Open-Source Web Platform FreeClust.

    PubMed

    Božičević, Alen; Dobrzyński, Maciej; De Bie, Hans; Gafner, Frank; Garo, Eliane; Hamburger, Matthias

    2017-12-05

    The technological development of LC-MS instrumentation has led to significant improvements of performance and sensitivity, enabling high-throughput analysis of complex samples, such as plant extracts. Most software suites allow preprocessing of LC-MS chromatograms to obtain comprehensive information on single constituents. However, more advanced processing needs, such as the systematic and unbiased comparative metabolite profiling of large numbers of complex LC-MS chromatograms, remain a challenge. Currently, users have to rely on different tools to perform such data analyses. We developed a two-step protocol comprising a comparative metabolite profiling tool integrated in the ACD/MS Workbook Suite, and a web platform developed in the R language designed for clustering and visualization of chromatographic data. Initially, all relevant chromatographic and spectroscopic data (retention time, molecular ions with the respective ion abundance, and sample names) are automatically extracted and assembled in an Excel spreadsheet. The file is then loaded into an online web application that includes various statistical algorithms and provides the user with tools to compare and visualize the results in intuitive 2D heatmaps. We applied this workflow to LC-ESIMS profiles obtained from 69 honey samples. Within a few hours of calculation with a standard PC, the honey samples were preprocessed and organized in clusters based on their metabolite profile similarities, thereby highlighting the common metabolite patterns and distributions among samples. Implementation in the ACD/Laboratories software package enables subsequent integration of other analytical data and of in silico prediction tools for modern drug discovery.
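
    A rough stand-in for the clustering and heatmap step is sketched below using hierarchical clustering on correlation distances between normalized feature profiles. It assumes a hypothetical exported feature table (one row per sample, one column per aligned feature) and is not the FreeClust or ACD/MS Workbook Suite implementation.

```python
import matplotlib.pyplot as plt
import pandas as pd
from scipy.cluster.hierarchy import linkage, leaves_list
from scipy.spatial.distance import pdist

# Hypothetical exported table: rows are samples, columns are aligned LC-MS
# features (retention time / m/z pairs); values are ion abundances.
table = pd.read_csv("honey_feature_table.csv", index_col=0)

# Normalize each profile to relative abundances, then cluster samples.
profiles = table.div(table.sum(axis=1), axis=0)
dist = pdist(profiles.values, metric="correlation")
order = leaves_list(linkage(dist, method="average"))

# Reorder rows so similar samples sit next to each other in a 2D heatmap.
plt.imshow(profiles.values[order], aspect="auto", cmap="viridis")
plt.yticks(range(len(order)), profiles.index[order], fontsize=6)
plt.xlabel("aligned LC-MS feature")
plt.colorbar(label="relative abundance")
plt.show()
```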

  1. Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods

    NASA Technical Reports Server (NTRS)

    Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark

    2002-01-01

    Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.

  2. Miniaturized and direct spectrophotometric multi-sample analysis of trace metals in natural waters.

    PubMed

    Albendín, Gemma; López-López, José A; Pinto, Juan J

    2016-03-15

    Trends in the analysis of trace metals in natural waters are mainly based on the development of sample treatment methods to isolate and pre-concentrate the metal from the matrix in a simpler extract for further instrumental analysis. However, direct analysis is often possible using more accessible techniques such as spectrophotometry. In this case a proper ligand is required to form a complex that absorbs radiation in the ultraviolet-visible (UV-Vis) spectrum. In this sense, the hydrazone derivative di-2-pyridylketone benzoylhydrazone (dPKBH) forms complexes with copper (Cu) and vanadium (V) that absorb light at 370 and 395 nm, respectively. Although spectrophotometric methods are considered time- and reagent-consuming, this work focused on their miniaturization by reducing the sample volume as well as the time and cost of analysis. In both methods, a micro-amount of sample is placed into a microplate reader with a capacity for 96 samples, which can be analyzed in times ranging from 5 to 10 min. The proposed methods have been optimized using a Box-Behnken design of experiments. For Cu determination, the concentrations of the phosphate buffer solution at pH 8.33, the masking agents (ammonium fluoride and sodium citrate), and dPKBH were optimized. For V analysis, the sample pH (4.5) was set using an acetic acid/sodium acetate buffer, and the masking agents were ammonium fluoride and 1,2-cyclohexanediaminetetraacetic acid. Under optimal conditions, both methods were applied to the analysis of certified reference materials TMDA-62 (lake water), LGC-6016 (estuarine water), and LGC-6019 (river water). In all cases, results proved the accuracy of the method. Copyright © 2015 Elsevier Inc. All rights reserved.
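
    For reference, a coded Box-Behnken design such as the one used here can be constructed directly: every pair of factors is run at the four (±1, ±1) combinations with the remaining factors held at their center levels, plus replicate center points. The factor names in the comment are only examples; the actual levels are set by the analyst.

```python
import numpy as np
from itertools import combinations

def box_behnken(n_factors, n_center=3):
    """Coded Box-Behnken design: all (+-1, +-1) combinations for every pair
    of factors, remaining factors at 0, plus n_center center points."""
    runs = []
    for i, j in combinations(range(n_factors), 2):
        for a in (-1, 1):
            for b in (-1, 1):
                row = [0] * n_factors
                row[i], row[j] = a, b
                runs.append(row)
    runs.extend([[0] * n_factors] * n_center)
    return np.array(runs)

# Three factors, e.g. buffer concentration, masking-agent amount, dPKBH amount.
design = box_behnken(3)
print(design)           # 12 edge-midpoint runs + 3 center points
```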

  3. Managing Increasing Complexity in Undergraduate Digital Media Design Education: The Impact and Benefits of Multidisciplinary Collaboration

    ERIC Educational Resources Information Center

    Fleischmann, Katja; Daniel, Ryan

    2013-01-01

    Increasing complexity is one of the most pertinent issues when discussing the role and future of design, designers and their education. The evolving nature of digital media technology has resulted in a profession in a state of flux with increasingly complex communication and design problems. The ability to collaborate and interact with other…

  4. The ExoMars Sample Preparation and Distribution System

    NASA Astrophysics Data System (ADS)

    Schulte, Wolfgang; Hofmann, Peter; Baglioni, Pietro; Richter, Lutz; Redlich, Daniel; Notarnicola, Marco; Durrant, Stephen

    2012-07-01

    The Sample Preparation and Distribution System (SPDS) is a key element of the ESA ExoMars Rover. It is a set of complex mechanisms designed to receive Mars soil samples acquired from the subsurface with a drill, to crush them and to distribute the obtained soil powder to the scientific instruments of the 'Pasteur Payload', in the Rover Analytical Laboratory (ALD). In particular, the SPDS consists of: (1) a Core Sample Handling System (CSHS), including a Core Sample Transportation Mechanism (CSTM) and a Blank Sample Dispenser; (2) a Crushing Station (CS); (3) a Powder Sample Dosing and Distribution System (PSDDS); and (4) a Powder Sample Handling System (PSHS), which is a carousel carrying pyrolysis ovens, a re-fillable sample container and a tool to flatten the powder sample surface. Kayser-Threde has developed, under contract with the ExoMars prime contractor Thales Alenia Space Italy, breadboards and an engineering model of the SPDS mechanisms. Tests of individual mechanisms, namely the CSTM, CS and PSDDS, were conducted both in laboratory ambient conditions and in a simulated Mars environment, using dedicated facilities. The SPDS functionalities and performances were measured and evaluated. In the course of 2011 the SPDS Dosing Station (part of the PSDDS) was also tested in simulated Mars gravity conditions during a parabolic flight campaign. By the time of the conference, an elegant breadboard of the Powder Sample Handling System will have been built and tested. The next step, planned for mid-2012, will be a complete end-to-end test of the sample handling and processing chain, combining all four SPDS mechanisms. The possibility to verify interface and operational aspects between the SPDS and the ALD scientific instruments using the available instrument breadboards with the end-to-end set-up is currently being evaluated. This paper illustrates the most recent design status of the SPDS mechanisms, summarizes the test results and highlights future development activities, including potential involvement of the ExoMars science experiments.

  5. Proteome Dynamics: Revisiting Turnover with a Global Perspective*

    PubMed Central

    Claydon, Amy J.; Beynon, Robert

    2012-01-01

    Although bulk protein turnover has been measured with the use of stable isotope labeled tracers for over half a century, it is only recently that the same approach has become applicable to the level of the proteome, permitting analysis of the turnover of many proteins instead of single proteins or an aggregated protein pool. The optimal experimental design for turnover studies is dependent on the nature of the biological system under study, which dictates the choice of precursor label, protein pool sampling strategy, and treatment of data. In this review we discuss different approaches and, in particular, explore how complexity in experimental design and data processing increases as we shift from unicellular to multicellular systems, in particular animals. PMID:23125033

  6. Illuminating the Chemistry of Life: Design, Synthesis, and Applications of “Caged” and Related Photoresponsive Compounds

    PubMed Central

    Lee, Hsienming; Larson, Daniel R.; Lawrence, David S.

    2009-01-01

    Biological systems are characterized by a level of spatial and temporal organization that often lies beyond the grasp of present day methods. Light-modulated bioreagents, including analogs of low molecular weight compounds, peptides, proteins, and nucleic acids, represent a compelling strategy to probe, perturb, or sample biological phenomena with the requisite control to address many of these organizational complexities. Although this technology has created considerable excitement in the chemical community, its application to biological questions has been relatively limited. We describe the challenges associated with the design, synthesis, and use of light-responsive bioreagents, the scope and limitations associated with the instrumentation required for their application, and recent chemical and biological advances in this field. PMID:19298086

  7. Illuminating the chemistry of life: design, synthesis, and applications of "caged" and related photoresponsive compounds.

    PubMed

    Lee, Hsien-Ming; Larson, Daniel R; Lawrence, David S

    2009-06-19

    Biological systems are characterized by a level of spatial and temporal organization that often lies beyond the grasp of present day methods. Light-modulated bioreagents, including analogs of low molecular weight compounds, peptides, proteins, and nucleic acids, represent a compelling strategy to probe, perturb, or sample biological phenomena with the requisite control to address many of these organizational complexities. Although this technology has created considerable excitement in the chemical community, its application to biological questions has been relatively limited. We describe the challenges associated with the design, synthesis, and use of light-responsive bioreagents; the scope and limitations associated with the instrumentation required for their application; and recent chemical and biological advances in this field.

  8. The Study of the Relationship between Probabilistic Design and Axiomatic Design Methodology. Volume 2

    NASA Technical Reports Server (NTRS)

    Onwubiko, Chin-Yere; Onyebueke, Landon

    1996-01-01

    Structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. Nonlinear behavior that depends on stress, stress rate, temperature, number of load cycles, and time is observed in all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual factor-of-safety margin remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. The deterministic method may be most useful in situations where the design structures are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
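
    The contrast drawn above can be made concrete with a small Monte Carlo sketch: instead of a single factor of safety, the probabilistic view estimates the probability that stress exceeds strength under assumed distributions. The distributions and parameters below are purely illustrative, not taken from the report.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Hypothetical distributions (units arbitrary): applied stress and material strength.
stress   = rng.normal(loc=300.0, scale=40.0, size=n)   # e.g. MPa
strength = rng.normal(loc=420.0, scale=35.0, size=n)

# Probabilistic design asks for the probability of non-failure performance
# rather than a single deterministic factor of safety.
p_fail = np.mean(stress >= strength)
reliability = 1.0 - p_fail
central_safety_factor = strength.mean() / stress.mean()

print(f"central factor of safety: {central_safety_factor:.2f}")
print(f"estimated probability of failure: {p_fail:.2e}, reliability: {reliability:.6f}")
```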

  9. A modular method for the extraction of DNA and RNA, and the separation of DNA pools from diverse environmental sample types

    PubMed Central

    Lever, Mark A.; Torti, Andrea; Eickenbusch, Philip; Michaud, Alexander B.; Šantl-Temkiv, Tina; Jørgensen, Bo Barker

    2015-01-01

    A method for the extraction of nucleic acids from a wide range of environmental samples was developed. This method consists of several modules, which can be individually modified to maximize yields in extractions of DNA and RNA or separations of DNA pools. Modules were designed based on elaborate tests, in which permutations of all nucleic acid extraction steps were compared. The final modular protocol is suitable for extractions from igneous rock, air, water, and sediments. Sediments range from high-biomass, organic rich coastal samples to samples from the most oligotrophic region of the world's oceans and the deepest borehole ever studied by scientific ocean drilling. Extraction yields of DNA and RNA are higher than with widely used commercial kits, indicating an advantage to optimizing extraction procedures to match specific sample characteristics. The ability to separate soluble extracellular DNA pools without cell lysis from intracellular and particle-complexed DNA pools may enable new insights into the cycling and preservation of DNA in environmental samples in the future. A general protocol is outlined, along with recommendations for optimizing this general protocol for specific sample types and research goals. PMID:26042110

  10. Robust Timing Synchronization in Aeronautical Mobile Communication Systems

    NASA Technical Reports Server (NTRS)

    Xiong, Fu-Qin; Pinchak, Stanley

    2004-01-01

    This work details a study of robust synchronization schemes suitable for satellite to mobile aeronautical applications. A new scheme, the Modified Sliding Window Synchronizer (MSWS), is devised and compared with existing schemes, including the traditional Early-Late Gate Synchronizer (ELGS), the Gardner Zero-Crossing Detector (GZCD), and the Sliding Window Synchronizer (SWS). Performance of the synchronization schemes is evaluated by a set of metrics that indicate performance in digital communications systems. The metrics are convergence time, mean square phase error (or root mean-square phase error), lowest SNR for locking, initial frequency offset performance, midstream frequency offset performance, and system complexity. The performance of the synchronizers is evaluated by means of Matlab simulation models. A simulation platform is devised to model the satellite to mobile aeronautical channel, consisting of a Quadrature Phase Shift Keying modulator, an additive white Gaussian noise channel, and a demodulator front end. Simulation results show that the MSWS provides the most robust performance at the cost of system complexity. The GZCD provides a good tradeoff between robustness and system complexity for communication systems that require high symbol rates or low overall system costs. The ELGS has a high system complexity despite its average performance. Overall, the SWS, originally designed for multi-carrier systems, performs very poorly in single-carrier communications systems. Table 5.1 in Section 5 provides a ranking of each of the synchronization schemes in terms of the metrics set forth in Section 4.1. Details of comparison are given in Section 5. Based on the results presented in Table 5, it is safe to say that the most robust synchronization scheme examined in this work is the high-sample-rate Modified Sliding Window Synchronizer. A close second is its low-sample-rate cousin. The tradeoff between complexity and lowest mean-square phase error determines the rankings of the Gardner Zero-Crossing Detector and both versions of the Early-Late Gate Synchronizer. The least robust models are the high and low-sample-rate Sliding Window Synchronizers. Consequently, the recommended replacement synchronizer for NASA's Advanced Air Transportation Technologies mobile aeronautical communications system is the high-sample-rate Modified Sliding Window Synchronizer. By incorporating this synchronizer into their system, NASA can be assured that their system will be operational in extremely adverse conditions. The quick convergence time of the MSWS should allow the use of high-level protocols. However, if NASA feels that reduced system complexity is the most important aspect of their replacement synchronizer, the Gardner Zero-Crossing Detector would be the best choice.
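
    As background on one of the compared schemes, the sketch below implements the basic Gardner zero-crossing timing error detector at two samples per symbol. It is a textbook form of the detector, not the report's Matlab model, and the test signal uses crude linearly interpolated transitions rather than a proper pulse-shaping filter.

```python
import numpy as np

def gardner_ted(samples, sps=2):
    """Gardner timing error detector at two samples per symbol.

    e[k] = y_mid[k] * (y[k] - y[k+1]), where y[k] are symbol-rate strobes and
    y_mid[k] is the sample halfway between them; a timing loop drives the
    running average of e toward zero by advancing or retarding the strobe.
    """
    on_time = samples[::sps]
    mid = samples[sps // 2::sps]
    n = min(len(on_time) - 1, len(mid))
    return mid[:n] * (on_time[:n] - on_time[1:n + 1])

# Illustrative check: +-1 symbols with linearly interpolated transitions.
rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=500)
sig = np.empty(2 * symbols.size)
sig[::2] = symbols                                   # on-time samples
sig[1::2] = 0.5 * (symbols + np.roll(symbols, -1))   # mid-point (transition) samples
print(np.mean(gardner_ted(sig)))                     # ~0 when sampling phase is correct
```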

  11. A relay identification fluorescence probe for Fe3+ and phosphate anion and its applications

    NASA Astrophysics Data System (ADS)

    Tang, Xu; Wang, Yun; Han, Juan; Ni, Liang; Wang, Lei; Li, Longhua; Zhang, Huiqin; Li, Cheng; Li, Jing; Li, Haoran

    2018-02-01

    A simple relay identification fluorescence probe for Fe3+ and phosphate anion with "on-off-on" switching was designed and synthesized based on phenylthiazole and biphenylcarbonitrile. Probe 1 displayed highly selective and sensitive recognition of Fe3+ in HEPES aqueous buffer (EtOH/H2O = 2:8, v/v, pH = 7.4) solutions. The optimized structures and the HOMO and LUMO of probe 1 and the [1-Fe3+] complex were obtained by density functional theory (DFT) calculations with B3LYP as the exchange-correlation functional using the Gaussian 09 suite of programs. The [1-Fe3+] complex solution also showed a high selectivity toward PO43-. The lower limits of detection of probe 1 for Fe3+ and of the [1-Fe3+] complex for PO43- were estimated to be 1.09 × 10-7 M and 1.86 × 10-7 M, respectively. In addition, probe 1 was successfully used to detect the target ions in real water samples and living cells.

  12. Critical thinking about fables: examining language production and comprehension in adolescents.

    PubMed

    Nippold, Marilyn A; Frantz-Kaspar, Megan W; Cramond, Paige M; Kirk, Cecilia; Hayward-Mayhew, Christine; MacKinnon, Melanie

    2015-04-01

    This study was designed primarily to determine if a critical-thinking task involving fables would elicit greater syntactic complexity than a conversational task in adolescents. Another purpose was to determine how well adolescents understand critical-thinking questions about fables. Forty adolescents (N=20 boys and 20 girls; mean age=14 years) with typical language development answered critical-thinking questions about the deeper meanings of fables. They also participated in a standard conversational task. The syntactic complexity of their responses during the speaking tasks was analyzed for mean length of communication unit (MLCU) and clausal density (CD). Both measures of syntactic complexity, MLCU and CD, were substantially greater during the critical-thinking task compared with the conversational task. It was also found that the adolescents understood the questions quite well, earning a mean accuracy score of 80%. The critical-thinking task has potential for use as a new type of language-sampling tool to examine language production and comprehension in adolescents.

  13. Development and Validation of an Extractive Spectrophotometric Method for Miconazole Nitrate Assay in Pharmaceutical Formulations.

    PubMed

    Eticha, Tadele; Kahsay, Getu; Hailu, Teklebrhan; Gebretsadikan, Tesfamichael; Asefa, Fitsum; Gebretsadik, Hailekiros; Thangabalan, Boovizhikannan

    2018-01-01

    A simple extractive spectrophotometric technique has been developed and validated for the determination of miconazole nitrate in pure and pharmaceutical formulations. The method is based on the formation of a chloroform-soluble ion-pair complex between the drug and bromocresol green (BCG) dye in an acidic medium. The complex showed absorption maxima at 422 nm, and the system obeys Beer's law in the concentration range of 1-30 µg/mL with molar absorptivity of 2.285 × 10^4 L/mol/cm. The composition of the complex was studied by Job's method of continuous variation, and the results revealed that the mole ratio of drug:BCG is 1:1. Full factorial design was used to optimize the effect of variable factors, and the method was validated based on the ICH guidelines. The method was applied for the determination of miconazole nitrate in real samples.

  14. Nano magnetic solid phase extraction for preconcentration of lead ions in environmental samples by a newly synthesized reagent.

    PubMed

    Golshekan, Mostafa; Shariati, Shahab

    2013-01-01

    In this study, magnetite nanoparticles with particle size lower than 47 nm were synthesized and were applied for preconcentration of Pb2+ ions from aqueous solutions. To preconcentrate the Pb2+ ions, the surface of the synthesized nanoparticles was modified with sodium dodecyl sulfate (SDS) as an anionic surfactant. A new chelating agent (2-((E)-2-amino-4,5-dinitrophenylimino)methyl)phenol) was synthesized and used to form a very stable complex with Pb2+ ions. The lead ions formed complexes and were quantitatively extracted with SDS-coated magnetite nanoparticles. After magnetic separation of the adsorbent, the adsorbent was eluted with 0.5% (v/v) HCl in dimethyl sulfoxide (DMSO) prior to analysis by flame atomic absorption spectrometry (FAAS). Orthogonal array design (OAD) was used to study and optimize the different experimental parameters. Under the optimum conditions, an enhancement factor of up to 63.5 was achieved for extraction from only 10 mL of sample solution, and the relative standard deviation (RSD %) of the method was lower than 2.8%. The obtained calibration curve was linear in the range of 1-300 µg L-1 with reasonable linearity (r2 > 0.998). The limit of detection (LOD) based on S/N = 3 was 0.04 µg L-1 for 10 mL sample volumes. Finally, the applicability of the proposed method was successfully confirmed by preconcentration and determination of trace amounts of lead ions in environmental samples, and satisfactory results were obtained.

  15. The facile flow-injection spectrophotometric detection of gold(III) in water and pharmaceutical samples using 3,5-dimethoxy-4-hydroxy-2-aminoacetophenone isonicotinoyl hydrazone (3,5-DMHAAINH).

    PubMed

    Babu, S Hari; Suvardhan, K; Kumar, K Suresh; Reddy, K M; Rekha, D; Chiranjeevi, P

    2005-04-11

    A simple, sensitive and rapid flow-injection spectrophotometric method was developed for the determination of trace amounts of Au(III) in aqueous dimethylformamide (DMF). The method is based on the formation of the Au(III)-(3,5-DMHAAINH)3 complex. The optimum conditions for the chromogenic reaction of Au(III) with 3,5-DMHAAINH were studied, and the colored (reddish brown) complex is selectively monitored at λmax = 490 nm at pH 6.0. The reaction and flow conditions were optimized using a full experimental design. A detection limit (2s) of 0.1 µg/L Au(III) was obtained at a sampling rate of 15 samples/h. Beer's law is obeyed over the range of 0.30-4.00 µg/mL. The molar absorptivity and Sandell's sensitivity were 3.450 × 10⁴ L/mol/cm and 0.0050 µg/mL, respectively. The complex stoichiometry was established by Job's method of continuous variation, and the corresponding stability constant was found to be 9.3 × 10¹⁵ (1:3, M:R; M, metal; R, reagent). A detailed study of various interferences confirmed the high selectivity of the developed method. The method was successfully applied to the determination of trace amounts of Au(III) in water and pharmaceutical samples. The results obtained were in agreement with those of reported methods at the 95% confidence level.

  16. A Framework for Orbital Performance Evaluation in Distributed Space Missions for Earth Observation

    NASA Technical Reports Server (NTRS)

    Nag, Sreeja; LeMoigne-Stewart, Jacqueline; Miller, David W.; de Weck, Olivier

    2015-01-01

    Distributed Space Missions (DSMs) are gaining momentum in their application to earth science missions owing to their unique ability to increase observation sampling in spatial, spectral and temporal dimensions simultaneously. DSM architectures have a large number of design variables and, since they are expected to increase mission flexibility, scalability, evolvability and robustness, their design is a complex problem with many variables and objectives affecting performance. Very few open-access tools are available that explore the tradespace of design variables, allow performance assessment, and plug easily into science goals so that the best design can be selected. This paper presents a software tool, developed on the MATLAB engine interfacing with STK, for DSM orbit design and selection. It is capable of generating thousands of homogeneous constellation or formation flight architectures based on pre-defined design variable ranges and sizing those architectures in terms of predefined performance metrics. The metrics can be input into observing system simulation experiments, as available from the science teams, allowing dynamic coupling of science and engineering designs. Design variables include, but are not restricted to, constellation type, formation flight type, instrument FOV, altitude and inclination of chief orbits, differential orbital elements, leader satellites, latitudes or regions of interest, and numbers of planes and satellites. Intermediate performance metrics include angular coverage, number of accesses, revisit coverage, and access deterioration over time at every point of the Earth grid. The orbit design process can be streamlined, and the design variables progressively bounded, because models of varying fidelity are available, from low-complexity corrected HCW equations up to high-precision STK models with J2 and drag. The tool can thus help any scientist or program manager select pre-Phase A, Pareto-optimal DSM designs for a variety of science goals without having to delve into the details of the engineering design process.
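
    A minimal sketch of the kind of tradespace enumeration such a tool performs, assuming purely illustrative design-variable ranges and a placeholder scoring function in place of real orbit propagation and coverage analysis (the tool itself runs on MATLAB/STK):

    ```python
    # Enumerate candidate homogeneous constellation architectures from
    # pre-defined design-variable ranges; ranges and the score are illustrative.
    from itertools import product

    altitudes_km = [500, 600, 700, 800]
    inclinations_deg = [45, 60, 90, 98]
    n_planes = [1, 2, 3]
    sats_per_plane = [1, 2, 4]

    architectures = []
    for alt, inc, planes, spp in product(altitudes_km, inclinations_deg,
                                         n_planes, sats_per_plane):
        n_sats = planes * spp
        # Placeholder metric: a real tool would propagate the orbits (HCW or
        # full-force models) and compute coverage/revisit over an Earth grid.
        proxy_score = n_sats * (1 + inc / 90) / (alt / 500)
        architectures.append({"alt_km": alt, "inc_deg": inc,
                              "planes": planes, "sats": n_sats,
                              "score": proxy_score})

    for arch in sorted(architectures, key=lambda a: a["score"], reverse=True)[:5]:
        print(arch)
    ```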

  17. A systems-based approach for integrated design of materials, products and design process chains

    NASA Astrophysics Data System (ADS)

    Panchal, Jitesh H.; Choi, Hae-Jin; Allen, Janet K.; McDowell, David L.; Mistree, Farrokh

    2007-12-01

    The concurrent design of materials and products provides designers with flexibility to achieve design objectives that were not previously accessible. However, the improved flexibility comes at a cost of increased complexity of the design process chains and the materials simulation models used for executing the design chains. Efforts to reduce the complexity generally result in increased uncertainty. We contend that a systems based approach is essential for managing both the complexity and the uncertainty in design process chains and simulation models in concurrent material and product design. Our approach is based on simplifying the design process chains systematically such that the resulting uncertainty does not significantly affect the overall system performance. Similarly, instead of striving for accurate models for multiscale systems (that are inherently complex), we rely on making design decisions that are robust to uncertainties in the models. Accordingly, we pursue hierarchical modeling in the context of design of multiscale systems. In this paper our focus is on design process chains. We present a systems based approach, premised on the assumption that complex systems can be designed efficiently by managing the complexity of design process chains. The approach relies on (a) the use of reusable interaction patterns to model design process chains, and (b) consideration of design process decisions using value-of-information based metrics. The approach is illustrated using a Multifunctional Energetic Structural Material (MESM) design example. Energetic materials store considerable energy which can be released through shock-induced detonation; conventionally, they are not engineered for strength properties. The design objectives for the MESM in this paper include both sufficient strength and energy release characteristics. The design is carried out by using models at different length and time scales that simulate different aspects of the system. Finally, by applying the method to the MESM design problem, we show that the integrated design of materials and products can be carried out more efficiently by explicitly accounting for design process decisions with the hierarchy of models.

  18. Designing Complexity

    ERIC Educational Resources Information Center

    Glanville, Ranulph

    2007-01-01

    This article considers the nature of complexity and design, as well as relationships between the two, and suggests that design may have much potential as an approach to improving human performance in situations seen as complex. It is developed against two backgrounds. The first is a world view that derives from second order cybernetics and radical…

  19. Robust Fixed-Structure Controller Synthesis

    NASA Technical Reports Server (NTRS)

    Corrado, Joseph R.; Haddad, Wassim M.; Gupta, Kajal (Technical Monitor)

    2000-01-01

    The ability to develop an integrated control system design methodology for robust high performance controllers satisfying multiple design criteria and real world hardware constraints constitutes a challenging task. The increasingly stringent performance specifications required for controlling such systems necessitate a trade-off between controller complexity and robustness. The principal challenge of minimal-complexity robust control design is to arrive at a tractable control design formulation in spite of the extreme complexity of such systems. Hence, the design of minimal-complexity robust controllers for systems in the face of modeling errors has been a major preoccupation of system and control theorists and practitioners for the past several decades.

  20. Methodological considerations in using complex survey data: an applied example with the Head Start Family and Child Experiences Survey.

    PubMed

    Hahs-Vaughn, Debbie L; McWayne, Christine M; Bulotsky-Shearer, Rebecca J; Wen, Xiaoli; Faria, Ann-Marie

    2011-06-01

    Complex survey data are collected by means other than simple random samples. This creates two analytical issues: nonindependence and unequal selection probability. Failing to address these issues results in underestimated standard errors and biased parameter estimates. Using data from the nationally representative Head Start Family and Child Experiences Survey (FACES; 1997 and 2000 cohorts), three diverse multilevel models are presented that illustrate differences in results depending on addressing or ignoring the complex sampling issues. Limitations of using complex survey data are reported, along with recommendations for reporting complex sample results. © The Author(s) 2011
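
    A toy illustration (fabricated data, not FACES) of the unequal-selection-probability issue: an unweighted mean is pulled toward the oversampled stratum, while a design-weighted mean recovers the population value. This generic sketch is not the multilevel modelling used in the paper:

    ```python
    # Unweighted vs design-weighted estimation with one oversampled stratum.
    import numpy as np

    rng = np.random.default_rng(0)

    # Stratum A: 80% of the population; stratum B: 20%, but oversampled 3:1.
    y_a = rng.normal(10, 2, size=100)   # outcomes for sampled stratum-A members
    y_b = rng.normal(20, 2, size=300)   # outcomes for sampled stratum-B members

    # Design weights proportional to inverse selection probabilities.
    w_a = np.full(100, 0.80 / 0.25)     # population share / sample share
    w_b = np.full(300, 0.20 / 0.75)

    y = np.concatenate([y_a, y_b])
    w = np.concatenate([w_a, w_b])

    print("Unweighted mean:", round(y.mean(), 2))                  # biased, ~17.5
    print("Weighted mean:  ", round(np.average(y, weights=w), 2))  # ~12, the population mean
    ```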

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madden, Jeremy T.; Toth, Scott J.; Dettmar, Christopher M.

    Nonlinear optical (NLO) instrumentation has been integrated with synchrotron X-ray diffraction (XRD) for combined single-platform analysis, initially targeting applications for automated crystal centering. Second-harmonic-generation microscopy and two-photon-excited ultraviolet fluorescence microscopy were evaluated for crystal detection and assessed by X-ray raster scanning. Two optical designs were constructed and characterized; one positioned downstream of the sample and one integrated into the upstream optical path of the diffractometer. Both instruments enabled protein crystal identification with integration times between 80 and 150 µs per pixel, representing a ~10³–10⁴-fold reduction in the per-pixel exposure time relative to X-ray raster scanning. Quantitative centering and analysis of phenylalanine hydroxylase from Chromobacterium violaceum (cPAH), Trichinella spiralis deubiquitinating enzyme TsUCH37, human κ-opioid receptor complex kOR-T4L produced in lipidic cubic phase (LCP), intimin prepared in LCP, and α-cellulose samples were performed by collecting multiple NLO images. The crystalline samples were characterized by single-crystal diffraction patterns, while α-cellulose was characterized by fiber diffraction. Good agreement was observed between the sample positions identified by NLO and XRD raster measurements for all samples studied.

  2. A new open tubular capillary microextraction and sweeping for the analysis of super low concentration of hydrophobic compounds.

    PubMed

    Xia, Zhining; Gan, Tingting; Chen, Hua; Lv, Rui; Wei, Weili; Yang, Fengqing

    2010-10-01

    A sample pre-concentration method based on the in-line coupling of in-tube solid-phase microextraction and electrophoretic sweeping was developed for the analysis of hydrophobic compounds. The sample pre-concentration and electrophoretic separation processes were carried out simply and sequentially with a (35%-phenyl)-methylpolysiloxane-coated capillary. The developed method was validated and applied to enrich and separate several pharmaceuticals, including loratadine, indomethacin, ibuprofen and doxazosin. Several microextraction parameters, such as temperature, pH and eluent, were investigated, and the microemulsion concentration, which influences both separation efficiency and microextraction efficiency, was also studied. Central composite design was applied for the optimization of sampling flow rate and sampling time, which interact with each other in a complex way. The precision, sensitivity and recovery of the method were investigated. Under the optimal conditions, the maximum enrichment factors for loratadine, indomethacin, ibuprofen and doxazosin in aqueous solutions are 1355, 571, 523 and 318, respectively. In addition, the developed method was applied to determine loratadine in a rabbit blood sample.
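
    A minimal sketch of a two-factor central composite design of the kind used here to co-optimize sampling flow rate and sampling time; the coded-point structure is standard, but the factor names and ranges are hypothetical, not the conditions reported in the paper:

    ```python
    # Two-factor central composite design: 4 factorial corners, 4 axial (star)
    # points at distance sqrt(2), and replicated centre points.
    import itertools

    alpha = 2 ** 0.5
    factorial = list(itertools.product([-1, 1], repeat=2))
    axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]
    centre = [(0, 0)] * 3
    coded_points = factorial + axial + centre

    def decode(coded, low, high):
        """Map a coded level onto the real factor scale (+/-1 spans low..high)."""
        mid, half = (high + low) / 2, (high - low) / 2
        return mid + coded * half

    for x1, x2 in coded_points:
        flow_rate = decode(x1, 0.5, 2.0)    # mL/min, hypothetical range
        samp_time = decode(x2, 2.0, 10.0)   # min, hypothetical range
        print(f"flow = {flow_rate:5.2f} mL/min, time = {samp_time:5.2f} min")
    ```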

  3. First Detection of Non-Chlorinated Organic Molecules Indigenous to a Martian Sample

    NASA Technical Reports Server (NTRS)

    Freissinet, C.; Glavin, D. P.; Buch, A.; Szopa, C.; Summons, R. E.; Eigenbrode, J. L.; Archer, P. D., Jr.; Brinckerhoff, W. B.; Brunner, A. E.; Cabane, M.; hide

    2016-01-01

    The Sample Analysis at Mars (SAM) instrument onboard Curiosity can perform pyrolysis of martian solid samples, and analyze the volatiles by direct mass spectrometry in evolved gas analysis (EGA) mode, or separate the components in the GCMS mode (coupling the gas chromatograph and the mass spectrometer instruments). In addition, SAM has a wet chemistry laboratory designed for the extraction and identification of complex and refractory organic molecules in the solid samples. The chemical derivatization agent used, N-methyl-N-tert-butyldimethylsilyl- trifluoroacetamide (MTBSTFA), was sealed inside seven Inconel metal cups present in SAM. Although none of these foil-capped derivatization cups have been punctured on Mars for a full wet chemistry experiment, an MTBSTFA leak was detected and the resultant MTBSTFA vapor inside the instrument has been used for a multi-sol MTBSTFA derivatization (MD) procedure instead of direct exposure to MTBSTFA liquid by dropping a solid sample directly into a punctured wet chemistry cup. Pyr-EGA, Pyr-GCMS and Der-GCMS experiments each led to the detection and identification of a variety of organic molecules in diverse formations of Gale Crater.

  4. DNA Extraction by Isotachophoresis in a Microfluidic Channel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephenson, S J

    Biological assays have many applications. For example, forensics personnel and medical professionals use these tests to diagnose diseases and track their progression or identify pathogens and the host response to them. One limitation of these tests, however, is that most of them target only one piece of the sample - such as bacterial DNA - and other components (e.g. host genomic DNA) get in the way, even though they may be useful for different tests. To address this problem, it would be useful to extract several different substances from a complex biological sample - such as blood - in an inexpensive and efficient manner. This summer, I worked with Maxim Shusteff at Lawrence Livermore National Lab on the Rapid Automated Sample Prep project. The goal of the project is to solve the aforementioned problem by creating a system that uses a series of different extraction methods to extract cells, bacteria, and DNA from a complex biological sample. Biological assays can then be run on purified output samples. In this device, an operator could input a complex sample such as blood or saliva, and would receive separate outputs of cells, bacteria, viruses, and DNA. I had the opportunity to work this summer with isotachophoresis (ITP), a technique that can be used to extract nucleic acids from a sample. This technique is intended to be the last stage of the purification device. Isotachophoresis separates particles based on different electrophoretic mobilities. This technique is convenient for our application because free solution DNA mobility is approximately equal for DNA longer than 300 base pairs in length. The sample of interest - in our case DNA - is fed into the chip with streams of leading electrolyte (LE) and trailing electrolyte (TE). When an electric field is applied, the species migrate based on their electrophoretic mobilities. Because the ions in the leading electrolyte have a high electrophoretic mobility, they race ahead of the slower sample and trailing electrolyte ions. Conversely, the trailing electrolyte ions have a slow electrophoretic mobility, so they lag behind the sample, thus trapping the species of interest between the LE and TE streams. In a typical isotachophoresis configuration, the electric field is applied in a direction parallel to the direction of flow. The species then form bands that stretch across the width of the channel. A major limitation of that approach is that only a finite amount of sample can be processed at once, and the sample must be processed in batches. For our purposes, a form of free-flow isotachophoresis is more convenient, where the DNA forms a band parallel to the edges of the channel. To achieve this, in our chip, the electric field is applied transversely. This creates a force perpendicular to the direction of flow, which causes the different ions to migrate across the flow direction. Because the mobility of the DNA is between the mobility of the leading and the trailing electrolyte, the DNA is focused in a tight band near the center of the channel. The stream of DNA can then be directed to a different output to produce a highly concentrated outlet stream without batch processing. One hurdle that must be overcome for successful ITP is isolating the electrochemical reactions that result from the application of high voltage for the actual process of isotachophoresis. The electrochemical reactions that occur around metal electrodes produce bubbles and pH changes that are detrimental to successful ITP. 
The design of the chips we use incorporates polyacrylamide gels to serve as electrodes along the central channel. For our design, the metal electrodes are located away from the chip, and high conductivity buffer streams carry the potential to the chip, functioning as a 'liquid electrode.' The stream then runs alongside a gel barrier. The gel electrode permits ion transfer while simultaneously isolating the separation chamber from any contaminants in the outer, 'liquid electrode' streams. The difference in potential from one side of the chip to the other creates an electric field. This field traverses the inner, separation channel, containing the leading electrolyte, the trailing electrolyte, and the sample of interest (DNA). To increase the ease of use of the chips, a newer chip design has been fabricated. This design has wire electrodes integrated on the chip, rather than elsewhere. To keep the pH changes and bubbling isolated from the separation channel, the chip contains deeper wells near the electrodes so that the flowing buffer can wash away any gases that form around the electrode. This design is significantly more compact because it eliminates the cumbersome electrode boxes. Eliminating the electrode boxes also decreases the required voltage, making the experiments safer. This happens because when the 'liquid electrode' streams travel through small diameter tubing, they lose much of their voltage due to the electrical resistance of the fluid in the tubing.

  5. Self assembly of rectangular shapes on concentration programming and probabilistic tile assembly models

    PubMed Central

    Rajasekaran, Sanguthevar

    2013-01-01

    Efficient tile sets for self assembling rectilinear shapes are of critical importance in algorithmic self assembly. A lower bound on the tile complexity of any deterministic self assembly system for an n × n square is Ω(log(n)/log(log(n))) (inferred from the Kolmogorov complexity). Deterministic self assembly systems with an optimal tile complexity have been designed for squares and related shapes in the past. However, designing Θ(log(n)/log(log(n))) unique tiles specific to a shape is still an intensive task in the laboratory. On the other hand, copies of a tile can be made rapidly using PCR (polymerase chain reaction) experiments. This led to the study of self assembly on tile concentration programming models. We present two major results in this paper on the concentration programming model. First we show how to self assemble rectangles with a fixed aspect ratio (α:β), with high probability, using Θ(α + β) tiles. This result is much stronger than the existing results by Kao et al. (Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008) and Doty (Randomized self-assembly for exact shapes. In: proceedings of the 50th annual IEEE symposium on foundations of computer science (FOCS), IEEE, Atlanta. pp 85–94, 2009)—which can only self-assemble squares and rely on tiles which perform binary arithmetic. On the other hand, our result is based on a technique called staircase sampling. This technique eliminates the need for sub-tiles which perform binary arithmetic, reduces the constant in the asymptotic bound, and eliminates the need for approximate frames (Kao et al. Randomized self-assembly for approximate shapes, LNCS, vol 5125. Springer, Heidelberg, 2008). Our second result applies staircase sampling on the equimolar concentration programming model (The tile complexity of linear assemblies. In: proceedings of the 36th international colloquium automata, languages and programming: Part I on ICALP ’09, Springer-Verlag, pp 235–253, 2009), to self assemble rectangles (of fixed aspect ratio) with high probability. The tile complexity of our algorithm is Θ(log(n)) and is optimal on the probabilistic tile assembly model (PTAM)—n being an upper bound on the dimensions of a rectangle. PMID:24311993

  6. Learning to Predict Combinatorial Structures

    NASA Astrophysics Data System (ADS)

    Vembu, Shankar

    2009-12-01

    The major challenge in designing a discriminative learning algorithm for predicting structured data is to address the computational issues arising from the exponential size of the output space. Existing algorithms make different assumptions to ensure efficient, polynomial time estimation of model parameters. For several combinatorial structures, including cycles, partially ordered sets, permutations and other graph classes, these assumptions do not hold. In this thesis, we address the problem of designing learning algorithms for predicting combinatorial structures by introducing two new assumptions: (i) The first assumption is that a particular counting problem can be solved efficiently. The consequence is a generalisation of the classical ridge regression for structured prediction. (ii) The second assumption is that a particular sampling problem can be solved efficiently. The consequence is a new technique for designing and analysing probabilistic structured prediction models. These results can be applied to solve several complex learning problems including but not limited to multi-label classification, multi-category hierarchical classification, and label ranking.

  7. Critical evaluation of challenges and future use of animals in experimentation for biomedical research.

    PubMed

    Singh, Vijay Pal; Pratap, Kunal; Sinha, Juhi; Desiraju, Koundinya; Bahal, Devika; Kukreti, Ritushree

    2016-12-01

    Animal experiments that are conducted worldwide contribute to significant findings and breakthroughs in the understanding of the underlying mechanisms of various diseases, bringing up appropriate clinical interventions. However, their predictive value is often low, leading to translational failure. Problems like translational failure of animal studies and poorly designed animal experiments lead to loss of animal lives and less translatable data which affect research outcomes ethically and economically. Due to increasing complexities in animal usage with changes in public perception and stringent guidelines, it is becoming difficult to use animals for conducting studies. This review deals with challenges like poor experimental design and ethical concerns and discusses key concepts like sample size, statistics in experimental design, humane endpoints, economic assessment, species difference, housing conditions, and systematic reviews and meta-analyses that are often neglected. If practiced, these strategies can refine the procedures effectively and help translate the outcomes efficiently. © The Author(s) 2016.

  8. Computational design of active, self-reinforcing gels.

    PubMed

    Yashin, Victor V; Kuksenok, Olga; Balazs, Anna C

    2010-05-20

    Many living organisms have evolved a protective mechanism that allows them to reversibly alter their stiffness in response to mechanical contact. Using theoretical modeling, we design a mechanoresponsive polymer gel that exhibits a similar self-reinforcing behavior. We focus on cross-linked gels that contain Ru(terpy)(2) units, where both terpyridine ligands are grafted to the chains. The Ru(terpy)(2) complex forms additional, chemoresponsive cross-links that break and re-form in response to a repeated oxidation and reduction of the Ru. In our model, the periodic redox variations of the anchored metal ion are generated by the Belousov-Zhabotinsky (BZ) reaction. Our computer simulations reveal that compression of the BZ gel leads to a stiffening of the sample due to an increase in the cross-link density. These findings provide guidelines for designing biomimetic, active coatings that send out a signal when the system is impacted and use this signaling process to initiate the self-protecting behavior.

  9. Design and development of an electrically-controlled beam steering mirror for microwave tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tayebi, A., E-mail: tayebiam@msu.edu; Tang, J.; Paladhi, P. Roy

    2015-03-31

    Microwave tomography has gained significant attention due to its reliability and unhazardous nature in the fields of NDE and the medical industry. A new microwave tomography system is presented in this paper, which significantly reduces the design and operational complexities of traditional microwave imaging systems. The major component of the proposed system is a reconfigurable reflectarray antenna which is used for beam steering in order to generate projections from multiple angles. The design, modeling and fabrication of the building block of the antenna, a tunable unit cell, are discussed in this paper. The unit cell is capable of dynamically altering the phase of the reflected field, which gives the reflectarray antenna its beam-steering ability. A tomographically reconstructed image of a dielectric sample using this new microwave tomography system is presented in this work.

  10. Critical evaluation of challenges and future use of animals in experimentation for biomedical research

    PubMed Central

    Singh, Vijay Pal; Pratap, Kunal; Sinha, Juhi; Desiraju, Koundinya; Bahal, Devika; Kukreti, Ritushree

    2016-01-01

    Animal experiments that are conducted worldwide contribute to significant findings and breakthroughs in the understanding of the underlying mechanisms of various diseases, bringing up appropriate clinical interventions. However, their predictive value is often low, leading to translational failure. Problems like translational failure of animal studies and poorly designed animal experiments lead to loss of animal lives and less translatable data which affect research outcomes ethically and economically. Due to increasing complexities in animal usage with changes in public perception and stringent guidelines, it is becoming difficult to use animals for conducting studies. This review deals with challenges like poor experimental design and ethical concerns and discusses key concepts like sample size, statistics in experimental design, humane endpoints, economic assessment, species difference, housing conditions, and systematic reviews and meta-analyses that are often neglected. If practiced, these strategies can refine the procedures effectively and help translate the outcomes efficiently. PMID:27694614

  11. Image acquisition system using on sensor compressed sampling technique

    NASA Astrophysics Data System (ADS)

    Gupta, Pravir Singh; Choi, Gwan Seong

    2018-01-01

    Advances in CMOS technology have made high-resolution image sensors possible. These image sensors pose significant challenges in terms of the amount of raw data generated, energy efficiency, and frame rate. This paper presents a design methodology for an imaging system and a simplified image sensor pixel design to be used in the system so that the compressed sensing (CS) technique can be implemented easily at the sensor level. This results in significant energy savings as it not only cuts the raw data rate but also reduces transistor count per pixel; decreases pixel size; increases fill factor; simplifies analog-to-digital converter, JPEG encoder, and JPEG decoder design; decreases wiring; and reduces the decoder size by half. Thus, CS has the potential to increase the resolution of image sensors for a given technology and die size while significantly decreasing the power consumption and design complexity. We show that it has potential to reduce power consumption by about 23% to 65%.
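
    A generic sketch of the compressed-sensing principle behind such a sensor: a sparse signal is recovered from far fewer random measurements than pixels. The Gaussian sensing matrix and orthogonal matching pursuit recovery below are illustrative and are not the hardware pipeline described in the paper:

    ```python
    # Compressed sensing toy example: measure y = Phi @ x with m << n, then
    # recover the k-sparse x by orthogonal matching pursuit (OMP).
    import numpy as np

    rng = np.random.default_rng(1)
    n, m, k = 256, 96, 8                # signal length, measurements, sparsity

    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)   # k-sparse "image"

    phi = rng.normal(0, 1 / np.sqrt(m), (m, n))                # random sensing matrix
    y = phi @ x                                                # compressed measurements

    support, residual = [], y.copy()
    for _ in range(k):
        idx = int(np.argmax(np.abs(phi.T @ residual)))         # most correlated column
        if idx not in support:
            support.append(idx)
        coeffs, *_ = np.linalg.lstsq(phi[:, support], y, rcond=None)
        residual = y - phi[:, support] @ coeffs

    x_hat = np.zeros(n)
    x_hat[support] = coeffs
    print("relative reconstruction error:",
          np.linalg.norm(x_hat - x) / np.linalg.norm(x))
    ```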

  12. The construction of general basis functions in reweighting ensemble dynamics simulations: Reproduce equilibrium distribution in complex systems from multiple short simulation trajectories

    NASA Astrophysics Data System (ADS)

    Zhang, Chuan-Biao; Ming, Li; Xin, Zhou

    2015-12-01

    Ensemble simulations, which use multiple short independent trajectories from dispersive initial conformations, rather than a single long trajectory as used in traditional simulations, are expected to sample complex systems such as biomolecules much more efficiently. The re-weighted ensemble dynamics (RED) method is designed to combine these short trajectories to reconstruct the global equilibrium distribution. In the RED, a number of conformational functions, termed basis functions, are applied to relate these trajectories to each other; a detailed-balance-based linear equation is then built, whose solution provides the weights of these trajectories in the equilibrium distribution. Thus, the sufficient and efficient selection of basis functions is critical to the practical application of the RED. Here, we review and present a few possible ways to construct general basis functions for applying the RED in complex molecular systems. In particular, for systems with little prior knowledge, we can generally use the root mean squared deviation (RMSD) among conformations to split the whole conformational space into a set of cells, and then use the RMSD-based cell functions as basis functions. We demonstrate the application of the RED in typical systems, including a two-dimensional toy model, the lattice Potts model, and a short peptide system. The results indicate that the RED with these constructions of basis functions not only samples complex systems more efficiently, but also provides a general way to understand the metastable structure of conformational space. Project supported by the National Natural Science Foundation of China (Grant No. 11175250).
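
    A simplified sketch of constructing RMSD-based cells to serve as indicator basis functions: a single leader-clustering pass assigns each conformation to the first cell centre within a cutoff. The coordinates are random stand-ins and no rotational/translational alignment is performed, unlike a real trajectory analysis:

    ```python
    # Assign conformations to RMSD-based cells and build indicator basis functions.
    import numpy as np

    rng = np.random.default_rng(2)
    n_frames, n_atoms = 200, 10
    frames = rng.normal(0, 1, (n_frames, n_atoms, 3))   # stand-in trajectory snapshots

    def rmsd(a, b):
        """Plain Cartesian RMSD between two conformations (no alignment)."""
        return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))

    cutoff = 2.5
    centres, labels = [], []
    for frame in frames:
        for c_idx, centre in enumerate(centres):
            if rmsd(frame, centre) < cutoff:
                labels.append(c_idx)
                break
        else:
            centres.append(frame)               # start a new cell
            labels.append(len(centres) - 1)

    labels = np.array(labels)
    # Indicator basis function f_k(x) = 1 if conformation x falls in cell k, else 0.
    basis = np.eye(len(centres))[labels]        # shape: (n_frames, n_cells)
    print(f"{len(centres)} cells; frames per cell:", basis.sum(axis=0).astype(int))
    ```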

  13. Joint Transmit and Receive Filter Optimization for Sub-Nyquist Delay-Doppler Estimation

    NASA Astrophysics Data System (ADS)

    Lenz, Andreas; Stein, Manuel S.; Swindlehurst, A. Lee

    2018-05-01

    In this article, a framework is presented for the joint optimization of the analog transmit and receive filter with respect to a parameter estimation problem. At the receiver, conventional signal processing systems restrict the two-sided bandwidth of the analog pre-filter B to the rate of the analog-to-digital converter f_s to comply with the well-known Nyquist-Shannon sampling theorem. In contrast, here we consider a transceiver that by design violates the common paradigm B ≤ f_s. To this end, at the receiver, we allow for a higher pre-filter bandwidth B > f_s and study the achievable parameter estimation accuracy under a fixed sampling rate when the transmit and receive filters are jointly optimized with respect to the Bayesian Cramér-Rao lower bound. For the case of delay-Doppler estimation, we propose to approximate the required Fisher information matrix and solve the transceiver design problem by an alternating optimization algorithm. The presented approach allows us to explore the Pareto-optimal region spanned by transmit and receive filters which are favorable under a weighted mean squared error criterion. We also discuss the computational complexity of the obtained transceiver design by visualizing the resulting ambiguity function. Finally, we verify the performance of the optimized designs by Monte-Carlo simulations of a likelihood-based estimator.

  14. Ultrasound-assisted magnetic dispersive solid-phase microextraction: A novel approach for the rapid and efficient microextraction of naproxen and ibuprofen employing experimental design with high-performance liquid chromatography.

    PubMed

    Ghorbani, Mahdi; Chamsaz, Mahmoud; Rounaghi, Gholam Hossein

    2016-03-01

    A simple, rapid, and sensitive method for the determination of naproxen and ibuprofen in complex biological and water matrices (cow milk, human urine, river, and well water samples) has been developed using ultrasound-assisted magnetic dispersive solid-phase microextraction. A magnetic ethylenediamine-functionalized graphene oxide nanocomposite was synthesized and used as a novel adsorbent for the microextraction process, and it showed great adsorptive ability toward these analytes. The different parameters affecting the microextraction were optimized with the aid of an experimental design approach: a Plackett-Burman screening design was used to identify the main variables affecting the microextraction process, and a Box-Behnken optimization design was then used to optimize the selected variables for the extraction of naproxen and ibuprofen. The optimized technique provides good repeatability (intraday relative standard deviations of 3.1 and 3.3%, interday of 5.6 and 6.1%), linearity (0.1-500 and 0.3-650 ng/mL), low limits of detection (0.03 and 0.1 ng/mL), and high enrichment factors (168 and 146) for naproxen and ibuprofen, respectively. The proposed method can be successfully applied in routine analysis for the determination of naproxen and ibuprofen in cow milk, human urine, and real water samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Xeml Lab: a tool that supports the design of experiments at a graphical interface and generates computer-readable metadata files, which capture information about genotypes, growth conditions, environmental perturbations and sampling strategy.

    PubMed

    Hannemann, Jan; Poorter, Hendrik; Usadel, Björn; Bläsing, Oliver E; Finck, Alex; Tardieu, Francois; Atkin, Owen K; Pons, Thijs; Stitt, Mark; Gibon, Yves

    2009-09-01

    Data mining depends on the ability to access machine-readable metadata that describe genotypes, environmental conditions, and sampling times and strategy. This article presents Xeml Lab. The Xeml Interactive Designer provides an interactive graphical interface at which complex experiments can be designed, and concomitantly generates machine-readable metadata files. It uses a new eXtensible Mark-up Language (XML)-derived dialect termed XEML. Xeml Lab includes a new ontology for environmental conditions, called Xeml Environment Ontology. However, to provide versatility, it is designed to be generic and also accepts other commonly used ontology formats, including OBO and OWL. A review summarizing important environmental conditions that need to be controlled, monitored and captured as metadata is posted in a Wiki (http://www.codeplex.com/XeO) to promote community discussion. The usefulness of Xeml Lab is illustrated by two meta-analyses of a large set of experiments that were performed with Arabidopsis thaliana during 5 years. The first reveals sources of noise that affect measurements of metabolite levels and enzyme activities. The second shows that Arabidopsis maintains remarkably stable levels of sugars and amino acids across a wide range of photoperiod treatments, and that adjustment of starch turnover and the leaf protein content contribute to this metabolic homeostasis.

  16. Detection of anabolic androgenic steroid abuse in doping control using mammalian reporter gene bioassays.

    PubMed

    Houtman, Corine J; Sterk, Saskia S; van de Heijning, Monique P M; Brouwer, Abraham; Stephany, Rainer W; van der Burg, Bart; Sonneveld, Edwin

    2009-04-01

    Anabolic androgenic steroids (AAS) are a class of steroid hormones related to the male hormone testosterone. They are frequently detected as drugs in sport doping control. Being similar to or derived from natural male hormones, AAS share the activation of the androgen receptor (AR) as common mechanism of action. The mammalian androgen responsive reporter gene assay (AR CALUX bioassay), measuring compounds interacting with the AR can be used for the analysis of AAS without the necessity of knowing their chemical structure beforehand, whereas current chemical-analytical approaches may have difficulty in detecting compounds with unknown structures, such as designer steroids. This study demonstrated that AAS prohibited in sports and potential designer AAS can be detected with this AR reporter gene assay, but that also additional steroid activities of AAS could be found using additional mammalian bioassays for other types of steroid hormones. Mixtures of AAS were found to behave additively in the AR reporter gene assay showing that it is possible to use this method for complex mixtures as are found in doping control samples, including mixtures that are a result of multi drug use. To test if mammalian reporter gene assays could be used for the detection of AAS in urine samples, background steroidal activities were measured. AAS-spiked urine samples, mimicking doping positive samples, showed significantly higher androgenic activities than unspiked samples. GC-MS analysis of endogenous androgens and AR reporter gene assay analysis of urine samples showed how a combined chemical-analytical and bioassay approach can be used to identify samples containing AAS. The results indicate that the AR reporter gene assay, in addition to chemical-analytical methods, can be a valuable tool for the analysis of AAS for doping control purposes.

  17. New Laboratory Technique to Determine Thermal Conductivity of Complex Regolith Simulants Under High Vacuum

    NASA Astrophysics Data System (ADS)

    Ryan, A. J.; Christensen, P. R.

    2016-12-01

    Laboratory measurements have been necessary to interpret thermal data of planetary surfaces for decades. We present a novel radiometric laboratory method to determine temperature-dependent thermal conductivity of complex regolith simulants under high vacuum and across a wide range of temperatures. Here, we present our laboratory method, strategy, and initial results. This method relies on radiometric temperature measurements instead of contact measurements, eliminating the need to disturb the sample with thermal probes. We intend to determine the conductivity of grains that are up to 2 cm in diameter and to parameterize the effects of angularity, sorting, layering, composition, and cementation. These results will support the efforts of the OSIRIS-REx team in selecting a site on asteroid Bennu that is safe and meets grain size requirements for sampling. Our system consists of a cryostat vacuum chamber with an internal liquid nitrogen dewar. A granular sample is contained in a cylindrical cup that is 4 cm in diameter and 1 to 6 cm deep. The surface of the sample is exposed to vacuum and is surrounded by a black liquid nitrogen cold shroud. Once the system has equilibrated at 80 K, the base of the sample cup is rapidly heated to 450 K. An infrared camera observes the sample from above to monitor its temperature change over time. We have built a time-dependent finite element model of the experiment in COMSOL Multiphysics. Boundary temperature conditions and all known material properties (including surface emissivities) are included to replicate the experiment as closely as possible. The Optimization module in COMSOL is specifically designed for parameter estimation. Sample thermal conductivity is assumed to be a quadratic or cubic polynomial function of temperature. We thus use gradient-based optimization methods in COMSOL to vary the polynomial coefficients in an effort to reduce the least squares error between the measured and modeled sample surface temperature.

  18. [Isolation of Sporothrix pallida complex in clinical and environmental samples from Chile].

    PubMed

    Cruz Choappa, Rodrigo M; Vieille Oyarzo, Peggy I; Carvajal Silva, Laura C

    2014-01-01

    The isolation of the S. pallida complex from medical samples and home garden soil of a patient in Chile is herein reported. Fungi of the Sporothrix schenckii complex can cause various infections. In Chile, medical and environmental isolates of this complex are rare. The aim of this study was to identify an unusual agent in a case of onychomycosis and to detect its presence in the patient's home garden. For this purpose, clinical samples were obtained by scraping the subungual region of the patient's first right toenail, and soil samples were taken from different areas of her home garden. Species identification was performed by morphophysiology, and one of the strains isolated from the patient's toenail was sent to the CBS for molecular confirmation (14.062). The S. pallida complex was identified both from the patient's toenail and from the samples taken from her home garden. Copyright © 2014 Asociación Argentina de Microbiología. Published by Elsevier España. All rights reserved.

  19. Microarrays in brain research: the good, the bad and the ugly.

    PubMed

    Mirnics, K

    2001-06-01

    Making sense of microarray data is a complex process, in which the interpretation of findings will depend on the overall experimental design and the judgement of the investigator performing the analysis. As a result, differences in tissue harvesting, microarray types, sample labelling and data analysis procedures make post hoc sharing of microarray data a great challenge. To ensure rapid and meaningful data exchange, we need to create some order out of the existing chaos. In these ground-breaking microarray standardization and data sharing efforts, NIH agencies should take a leading role.

  20. Investigation of model-based physical design restrictions (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Lucas, Kevin; Baron, Stanislas; Belledent, Jerome; Boone, Robert; Borjon, Amandine; Couderc, Christophe; Patterson, Kyle; Riviere-Cazaux, Lionel; Rody, Yves; Sundermann, Frank; Toublan, Olivier; Trouiller, Yorick; Urbani, Jean-Christophe; Wimmer, Karl

    2005-05-01

    As lithography and other patterning processes become more complex and more non-linear with each generation, the task of physical design rules necessarily increases in complexity also. The goal of the physical design rules is to define the boundary between the physical layout structures which will yield well from those which will not. This is essentially a rule-based pre-silicon guarantee of layout correctness. However the rapid increase in design rule requirement complexity has created logistical problems for both the design and process functions. Therefore, similar to the semiconductor industry's transition from rule-based to model-based optical proximity correction (OPC) due to increased patterning complexity, opportunities for improving physical design restrictions by implementing model-based physical design methods are evident. In this paper we analyze the possible need and applications for model-based physical design restrictions (MBPDR). We first analyze the traditional design rule evolution, development and usage methodologies for semiconductor manufacturers. Next we discuss examples of specific design rule challenges requiring new solution methods in the patterning regime of low K1 lithography and highly complex RET. We then evaluate possible working strategies for MBPDR in the process development and product design flows, including examples of recent model-based pre-silicon verification techniques. Finally we summarize with a proposed flow and key considerations for MBPDR implementation.

  1. Actinomycetal complex of light sierozem on the Kopet-Dag piedmont plain

    NASA Astrophysics Data System (ADS)

    Zenova, G. M.; Zvyagintsev, D. G.; Manucharova, N. A.; Stepanova, O. A.; Chernov, I. Yu.

    2016-10-01

    The population density of actinomycetes in the samples of light sierozem from the Kopet Dag piedmont plain (75 km from Ashkhabad, Turkmenistan) reaches hundreds of thousand CFU/g soil. The actinomycetal complex is represented by two genera: Streptomyces and Micromonospora. Representatives of the Streptomyces genus predominate and comprise 73 to 87% of the actinomycetal complex. In one sample, representatives of the Micromonospora genus predominated in the complex (75%). The Streptomyces genus in the studied soil samples is represented by the species from several sections and series: the species of section Helvolo-Flavus series Helvolus represent the dominant component of the streptomycetal complex; their portion is up to 77% of all isolated actinomycetes. The species of other sections and series are much less abundant. Thus, the percentage of the Cinereus Achromogenes section in the actinomycetal complex does not exceed 28%; representatives of the Albus section Albus series, Roseus section Lavendulae-Roseus series, and Imperfectus section belong to rare species; they have been isolated not from all the studied samples of light sierozem, and their portion does not exceed 10% of the actinomycetal complex.

  2. Training complexity is not decisive factor for improving adaptation to visual sensory conflict.

    PubMed

    Yang, Yang; Pu, Fang; Li, Shuyu; Li, Yan; Li, Deyu; Fan, Yubo

    2012-01-01

    Ground-based preflight training utilizing unusual visual stimuli is useful for decreasing susceptibility to space motion sickness (SMS). The effectiveness of sensorimotor adaptation training is affected by the training tasks, but what kind of task is more effective remains unknown. It needs to be analyzed whether complexity is the decisive factor to consider in designing the training, or whether other factors are more important. The results of this analysis can help optimize the preflight training tasks for astronauts. Twenty right-handed subjects were asked to draw the correct path of a 45° rotated maze before and after 30 min of training. Subjects wore up-down reversing prism spectacles in the test and training sessions. Two training tasks were performed: drawing the correct path of the horizontal maze (a complex task, but with a different orientation feature) and drawing L-shaped lines (an easy task with the same orientation feature). The error rate and the execution time were measured during the test. A paired-samples t test was used to compare the effects of the two training tasks. After each training, the error rate and the execution time decreased significantly. However, the training effectiveness of the easy task was better, as the test was finished more quickly and accurately. Complexity is not always the decisive factor in designing the adaptation training task; in this study, for example, the orientation feature was more important. In order to accelerate adaptation and counter SMS, the tasks for astronauts' preflight adaptation training could be simple activities with the key features.
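
    A minimal sketch of the paired-samples t test used to compare the two training tasks, with fabricated per-subject error rates standing in for the real data:

    ```python
    # Paired-samples t test on fabricated post-training error rates.
    from scipy import stats

    # Same hypothetical subjects measured after each training condition.
    easy_task    = [0.12, 0.08, 0.15, 0.10, 0.09, 0.11, 0.14, 0.07, 0.13, 0.10]
    complex_task = [0.18, 0.11, 0.17, 0.15, 0.12, 0.16, 0.19, 0.10, 0.18, 0.14]

    t_stat, p_value = stats.ttest_rel(easy_task, complex_task)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
    ```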

  3. Optimizing the Design of Preprinted Orders for Ambulatory Chemotherapy: Combining Oncology, Human Factors, and Graphic Design

    PubMed Central

    Jeon, Jennifer; White, Rachel E.; Hunt, Richard G.; Cassano-Piché, Andrea L.; Easty, Anthony C.

    2012-01-01

    Purpose: To establish a set of guidelines for developing ambulatory chemotherapy preprinted orders. Methods: Multiple methods were used to develop the preprinted order guidelines. These included (A) a comprehensive literature review and an environmental scan; (B) analyses of field study observations and incident reports; (C) critical review of evidence from the literature and the field study observation analyses; (D) review of the draft guidelines by a clinical advisory group; and (E) collaboration with graphic designers to develop sample preprinted orders, refine the design guidelines, and format the resulting content. Results: The Guidelines for Developing Ambulatory Chemotherapy Preprinted Orders, which consist of guidance on the design process, content, and graphic design elements of ambulatory chemotherapy preprinted orders, have been established. Conclusion: Health care is a safety critical, dynamic, and complex sociotechnical system. Identifying safety risks in such a system and effectively addressing them often require the expertise of multiple disciplines. This study illustrates how human factors professionals, clinicians, and designers can leverage each other's expertise to uncover commonly overlooked patient safety hazards and to provide health care professionals with innovative, practical, and user-centered tools to minimize those hazards. PMID:23077436

  4. Optimizing the design of preprinted orders for ambulatory chemotherapy: combining oncology, human factors, and graphic design.

    PubMed

    Jeon, Jennifer; White, Rachel E; Hunt, Richard G; Cassano-Piché, Andrea L; Easty, Anthony C

    2012-03-01

    To establish a set of guidelines for developing ambulatory chemotherapy preprinted orders. Multiple methods were used to develop the preprinted order guidelines. These included (A) a comprehensive literature review and an environmental scan; (B) analyses of field study observations and incident reports; (C) critical review of evidence from the literature and the field study observation analyses; (D) review of the draft guidelines by a clinical advisory group; and (E) collaboration with graphic designers to develop sample preprinted orders, refine the design guidelines, and format the resulting content. The Guidelines for Developing Ambulatory Chemotherapy Preprinted Orders, which consist of guidance on the design process, content, and graphic design elements of ambulatory chemotherapy preprinted orders, have been established. Health care is a safety critical, dynamic, and complex sociotechnical system. Identifying safety risks in such a system and effectively addressing them often require the expertise of multiple disciplines. This study illustrates how human factors professionals, clinicians, and designers can leverage each other's expertise to uncover commonly overlooked patient safety hazards and to provide health care professionals with innovative, practical, and user-centered tools to minimize those hazards.

  5. A new tent trap for sampling exophagic and endophagic members of the Anopheles gambiae complex.

    PubMed

    Govella, Nicodemus J; Chaki, Prosper P; Geissbuhler, Yvonne; Kannady, Khadija; Okumu, Fredros; Charlwood, J Derek; Anderson, Robert A; Killeen, Gerry F

    2009-07-14

    Mosquito sampling methods are essential for monitoring and evaluating malaria vector control interventions. In urban Dar es Salaam, human landing catch (HLC) is the only method sufficiently sensitive for monitoring malaria-transmitting Anopheles. HLC is labour intensive, cumbersome, hazardous, and requires such intense supervision that it is difficult to sustain on large scales. Novel tent traps were developed as alternatives to HLC. The Furvela tent, designed in Mozambique, incorporates CDC light trap (LT) components, while two others from Ifakara, Tanzania (designs A and B) require no electricity or moving parts. Their sensitivity for sampling malaria vectors was compared with LT and HLC over a wide range of vector abundances in rural and urban settings in Tanzania, with endophagic and exophagic populations, respectively, using randomised Latin-square and cross-over experimental designs. The sensitivity of LTs was greater than that of HLC, while the opposite was true of the Ifakara tent traps (crude mean catch of An. gambiae sensu lato relative to HLC = 0.28, 0.65 and 1.30 for designs A, B and LT in a rural setting, and 0.32 for design B in an urban setting). However, Ifakara B catches correlated far better with HLC (r² = 0.73, P < 0.001) than any other method tested (r² = 0.04, P = 0.426 and r² = 0.19, P = 0.006 for Ifakara A and LTs, respectively). Only Ifakara B in a rural setting with high vector density exhibited constant sampling efficiency relative to HLC. The relative sensitivity of Ifakara B increased as vector densities decreased in the urban setting and exceeded that of HLC at the lowest densities. None of the tent traps differed from HLC in terms of the proportions of parous mosquitoes (P ≥ 0.849) or An. gambiae s.l. sibling species (P ≥ 0.280) they sampled, but both the Ifakara A and B designs failed to reduce the proportion of blood-fed mosquitoes caught (odds ratio [95% confidence interval] = 1.6 [1.2, 2.1] and 1.0 [0.8, 1.2], P = 0.002 and 0.998, respectively), probably because of operator exposure while emptying the trap each morning. The Ifakara B trap may have potential for monitoring and evaluating a variety of endophagic and exophagic Afrotropical malaria vectors, particularly at low but epidemiologically relevant population densities. However, operator exposure to mosquito bites remains a concern, so additional modifications or protective measures will be required before this design can be considered for widespread, routine use.

  6. Response surface methodology based on central composite design as a chemometric tool for optimization of dispersive-solidification liquid-liquid microextraction for speciation of inorganic arsenic in environmental water samples.

    PubMed

    Asadollahzadeh, Mehdi; Tavakoli, Hamed; Torab-Mostaedi, Meisam; Hosseini, Ghaffar; Hemmati, Alireza

    2014-06-01

    Dispersive-solidification liquid-liquid microextraction (DSLLME) coupled with electrothermal atomic absorption spectrometry (ETAAS) was developed for the preconcentration and determination of inorganic arsenic (III, V) in water samples. At pH = 1, As(III) formed a complex with ammonium pyrrolidine dithiocarbamate (APDC) and was extracted into the fine droplets of 1-dodecanol (extraction solvent), which were dispersed with ethanol (disperser solvent) into the water sample solution. After extraction, the organic phase was separated by centrifugation and was solidified by transferring into an ice bath. The solidified solvent was transferred to a conical vial and melted quickly at room temperature. As(III) was determined in the melted organic phase while As(V) remained in the aqueous layer. Total inorganic As was determined after the reduction of the pentavalent forms of arsenic with sodium thiosulphate and potassium iodide. As(V) was calculated as the difference between the concentration of total inorganic As and that of As(III). The variables of interest in the DSLLME method, such as the volumes of extraction and disperser solvents, pH, concentration of APDC (chelating agent), extraction time and salt effect, were optimized with the aid of chemometric approaches. First, in screening experiments, a fractional factorial design (FFD) was used to select the variables which significantly affected the extraction procedure. Afterwards, the significant variables were optimized using response surface methodology (RSM) based on a central composite design (CCD). Under the optimum conditions, the proposed method was successfully applied to the determination of inorganic arsenic in different environmental water samples and a certified reference material (NIST SRM 1643e). Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Sampling design optimisation for rainfall prediction using a non-stationary geostatistical model

    NASA Astrophysics Data System (ADS)

    Wadoux, Alexandre M. J.-C.; Brus, Dick J.; Rico-Ramirez, Miguel A.; Heuvelink, Gerard B. M.

    2017-09-01

    The accuracy of spatial predictions of rainfall by merging rain-gauge and radar data is partly determined by the sampling design of the rain-gauge network. Optimising the locations of the rain-gauges may increase the accuracy of the predictions. Existing spatial sampling design optimisation methods are based on minimisation of the spatially averaged prediction error variance under the assumption of intrinsic stationarity. Over the past years, substantial progress has been made to deal with non-stationary spatial processes in kriging. Various well-documented geostatistical models relax the assumption of stationarity in the mean, while recent studies show the importance of considering non-stationarity in the variance for environmental processes occurring in complex landscapes. We optimised the sampling locations of rain-gauges using an extension of the Kriging with External Drift (KED) model for prediction of rainfall fields. The model incorporates both non-stationarity in the mean and in the variance, which are modelled as functions of external covariates such as radar imagery, distance to radar station and radar beam blockage. Spatial predictions are made repeatedly over time, each time recalibrating the model. The space-time averaged KED variance was minimised by Spatial Simulated Annealing (SSA). The methodology was tested using a case study predicting daily rainfall in the north of England for a one-year period. Results show that (i) the proposed non-stationary variance model outperforms the stationary variance model, and (ii) a small but significant decrease of the rainfall prediction error variance is obtained with the optimised rain-gauge network. In particular, it pays off to place rain-gauges at locations where the radar imagery is inaccurate, while keeping the distribution over the study area sufficiently uniform.
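
    A simplified sketch of spatial simulated annealing for gauge placement; a mean distance-to-nearest-gauge proxy stands in for the space-time averaged KED variance that is actually minimized in the study:

    ```python
    # Spatial simulated annealing over gauge locations with a placeholder criterion.
    import numpy as np

    rng = np.random.default_rng(3)
    grid = np.array([(i, j) for i in range(20) for j in range(20)], dtype=float)
    n_gauges, n_iter = 10, 2000

    def criterion(gauges):
        d = np.linalg.norm(grid[:, None, :] - gauges[None, :, :], axis=2)
        return d.min(axis=1).mean()          # mean distance to the nearest gauge

    gauges = rng.uniform(0, 19, (n_gauges, 2))
    current = criterion(gauges)
    temp = 1.0
    for _ in range(n_iter):
        proposal = gauges.copy()
        k = rng.integers(n_gauges)
        proposal[k] = np.clip(proposal[k] + rng.normal(0, 2.0, 2), 0, 19)  # move one gauge
        score = criterion(proposal)
        # Metropolis acceptance: always accept improvements, sometimes accept worse moves.
        if score < current or rng.random() < np.exp((current - score) / temp):
            gauges, current = proposal, score
        temp *= 0.998                         # geometric cooling schedule

    print(f"optimised mean nearest-gauge distance: {current:.2f}")
    ```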

  8. Evaluating multi-level models to test occupancy state responses of Plethodontid salamanders

    USGS Publications Warehouse

    Kroll, Andrew J.; Garcia, Tiffany S.; Jones, Jay E.; Dugger, Catherine; Murden, Blake; Johnson, Josh; Peerman, Summer; Brintz, Ben; Rochelle, Michael

    2015-01-01

    Plethodontid salamanders are diverse and widely distributed taxa and play critical roles in ecosystem processes. Because salamanders use structurally complex habitats and only a portion of a population is available for sampling, evaluation of sampling designs and estimators is critical to provide strong inference about Plethodontid ecology and responses to conservation and management activities. We conducted a simulation study to evaluate the effectiveness of multi-scale and hierarchical single-scale occupancy models in the context of a Before-After Control-Impact (BACI) experimental design with multiple levels of sampling. Also, we fit the hierarchical single-scale model to empirical data collected for Oregon slender and Ensatina salamanders across two years on 66 forest stands in the Cascade Range, Oregon, USA. All models were fit within a Bayesian framework. Estimator precision in both models improved with increasing numbers of primary and secondary sampling units, underscoring the potential gains accrued when adding secondary sampling units. Both models showed evidence of estimator bias at low detection probabilities and low sample sizes; this problem was particularly acute for the multi-scale model. Our results suggested that sufficient sample sizes at both the primary and secondary sampling levels could ameliorate this issue. Empirical data indicated Oregon slender salamander occupancy was strongly associated with the amount of coarse woody debris (posterior mean = 0.74; SD = 0.24); Ensatina occupancy was not associated with the amount of coarse woody debris (posterior mean = -0.01; SD = 0.29). Our simulation results indicate that either model is suitable for use in an experimental study of Plethodontid salamanders provided that sample sizes are sufficiently large. However, hierarchical single-scale and multi-scale models describe different processes and estimate different parameters. As a result, we recommend careful consideration of study questions and objectives prior to collecting data and fitting models.
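
    The problem that occupancy models address can be shown in a few lines: when detection is imperfect, the naive proportion of sites with at least one detection underestimates true occupancy. The sketch below is a generic single-season, single-scale illustration with made-up values of occupancy (psi) and detection probability (p); it is not the authors' simulation design.

        import numpy as np

        rng = np.random.default_rng(1)

        def mean_naive_occupancy(psi, p, n_sites=66, n_visits=4, n_reps=1000):
            # Simulate detection/non-detection data under a single-scale occupancy model:
            # a site is occupied with probability psi; if occupied, each visit detects
            # with probability p. Return the mean naive estimate (share of sites with at
            # least one detection), which ignores imperfect detection.
            naive = np.empty(n_reps)
            for r in range(n_reps):
                z = rng.random(n_sites) < psi                  # latent occupancy state
                y = rng.random((n_sites, n_visits)) < p        # per-visit detections
                detected = (y & z[:, None]).any(axis=1)
                naive[r] = detected.mean()
            return naive.mean()

        for p in (0.1, 0.3, 0.6):
            est = mean_naive_occupancy(psi=0.5, p=p)
            print(f"p = {p:.1f}: true psi = 0.50, mean naive estimate = {est:.2f}")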

  9. Characterization of the Sukinda and Nausahi ultramafic complexes, Orissa, India by platinum-group element geochemistry

    USGS Publications Warehouse

    Page, N.J.; Banerji, P.K.; Haffty, J.

    1985-01-01

    Twenty chromitite, 14 ultramafic and mafic rock, and 9 laterite and soil samples from the Precambrian Sukinda and Nausahi ultramafic complexes, Orissa, India, were analyzed for platinum-group elements (PGE). The maximum concentrations are: palladium, 13 parts per billion (ppb); platinum, 120 ppb; rhodium, 21 ppb; iridium, 210 ppb; and ruthenium, 630 ppb. Comparison of chondrite-normalized ratios of PGE for the chromitite samples of lower Proterozoic to Archean age with similar data from Paleozoic and Mesozoic ophiolite complexes strongly implies that these complexes represent Precambrian analogs of ophiolite complexes. This finding is consistent with the geology and petrology of the Indian complexes and suggests that plate-tectonic and ocean basin development models probably apply to some parts of Precambrian shield areas. © 1985.
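
    Chondrite normalisation is element-by-element division of the measured concentrations by the concentrations of the same elements in a chondrite reference composition, so that patterns can be compared across rocks. The snippet below shows the arithmetic using the maximum concentrations quoted above; the chondrite reference values are rough placeholders for illustration only, not the values used in the paper.

        # Chondrite-normalised ratio = sample concentration / chondrite concentration (both in ppb).
        # The reference values below are round placeholders, NOT recommended chondrite values.
        sample_ppb    = {"Pd": 13.0, "Pt": 120.0, "Rh": 21.0, "Ir": 210.0, "Ru": 630.0}
        chondrite_ppb = {"Pd": 560.0, "Pt": 1000.0, "Rh": 200.0, "Ir": 540.0, "Ru": 690.0}

        for el in sample_ppb:
            ratio = sample_ppb[el] / chondrite_ppb[el]
            print(f"{el}: chondrite-normalised ratio = {ratio:.3f}")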

  10. Bacterial community changes in an industrial algae production system.

    PubMed

    Fulbright, Scott P; Robbins-Pianka, Adam; Berg-Lyons, Donna; Knight, Rob; Reardon, Kenneth F; Chisholm, Stephen T

    2018-04-01

    While microalgae are a promising feedstock for production of fuels and other chemicals, a challenge for the algal bioproducts industry is obtaining consistent, robust algae growth. Algal cultures include complex bacterial communities and can be difficult to manage because specific bacteria can promote or reduce algae growth. To overcome bacterial contamination, algae growers may use closed photobioreactors designed to reduce the number of contaminant organisms. Even with closed systems, bacteria are known to enter and cohabitate, but little is known about these communities. Therefore, the richness, structure, and composition of bacterial communities were characterized in closed photobioreactor cultivations of Nannochloropsis salina in F/2 medium at different scales, across nine months spanning late summer-early spring, and during a sequence of serially inoculated cultivations. Using 16S rRNA sequence data from 275 samples, bacterial communities in small, medium, and large cultures were shown to be significantly different. Larger systems contained richer bacterial communities compared to smaller systems. Relationships between bacterial communities and algae growth were complex. On one hand, blooms of a specific bacterial type were observed in three abnormal, poorly performing replicate cultivations, while on the other, notable changes in the bacterial community structures were observed in a series of serial large-scale batch cultivations that had similar growth rates. Bacteria common to the majority of samples were identified, including a single OTU within the class Saprospirae that was found in all samples. This study contributes important information for crop protection in algae systems, and demonstrates the complex ecosystems that need to be understood for consistent, successful industrial algae cultivation. This is the first study to profile bacterial communities during the scale-up process of industrial algae systems.
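
    The community summaries reported in the study (richness, structure, composition) are derived from 16S OTU tables of read counts per sample. The toy example below shows two such summaries, observed richness and a Bray-Curtis dissimilarity between two samples; the counts are invented for illustration, and the real analysis would operate on the full table of 275 samples.

        import numpy as np

        # Toy OTU table: rows are samples, columns are OTUs, entries are read counts.
        # Values are invented solely to illustrate the calculations.
        otu = np.array([
            [120,  30,   0,  5,  0],   # e.g. a small-scale culture
            [ 80,  45,  12,  9,  3],   # e.g. a large-scale culture
        ])

        richness = (otu > 0).sum(axis=1)                 # observed OTU richness per sample
        rel = otu / otu.sum(axis=1, keepdims=True)       # relative abundances

        # Bray-Curtis dissimilarity between the two samples (0 = identical, 1 = disjoint).
        bray_curtis = 1.0 - np.minimum(rel[0], rel[1]).sum()

        print("richness per sample:", richness)
        print(f"Bray-Curtis dissimilarity: {bray_curtis:.3f}")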

  11. Origin of middle rare earth element enrichments in acid waters of a Canadian high Arctic lake.

    NASA Astrophysics Data System (ADS)

    Johannesson, Kevin H.; Zhou, Xiaoping

    1999-01-01

    Middle rare earth element (MREE) enriched rock-normalized rare earth element (REE) patterns of a dilute acidic lake (Colour Lake) in the Canadian High Arctic were investigated by quantifying whole-rock REE concentrations of rock samples collected from the catchment basin, as well as determining the acid leachable REE fraction of these rocks. An aliquot of each rock sample was leached with 1 N HNO3 to examine the readily leachable REE fraction of each rock, and an additional aliquot was leached with a 0.04 M NH2OH·HCl in 25% (v/v) CH3COOH solution, designed specifically to reduce Fe-Mn oxides/oxyhydroxides. Rare earth elements associated with the leachates that reacted with clastic sedimentary rock samples containing petrographically identifiable Fe-Mn oxide/oxyhydroxide cements and/or minerals/amorphous phases exhibited whole-rock-normalized REE patterns similar to the lake waters, whereas whole-rock-normalized leachates from mafic igneous rocks and other clastic sedimentary rocks from the catchment basin differed substantially from the lake waters. The whole-rock, leachate, and lake water REE data support acid leaching or dissolution of MREE-enriched Fe-Mn oxides/oxyhydroxides contained and identified within some of the catchment basin sedimentary rocks as the likely source of the unique lake water REE patterns. Solution complexation modelling of the REEs in the inflow streams and lake waters indicates that free metal ions (e.g., Ln3+, where Ln = any REE) and sulfate complexes (LnSO4+) are the dominant forms of dissolved REEs. Consequently, solution complexation reactions involving the REEs during weathering, transport to the lake, or within the lake, cannot be invoked to explain the MREE enrichments observed in the lake waters.
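
    The relative importance of free ions and sulfate complexes follows from a simple mass-action calculation. The sketch below works out the fraction of a dissolved REE carried as LnSO4+ for one set of inputs; the stability constant and free-sulfate activity are illustrative placeholders, not the values from the speciation model used in the paper.

        # Fraction of a dissolved REE present as the sulfate complex LnSO4+ versus the
        # free ion Ln3+, from the mass-action law  Ln3+ + SO4^2- <=> LnSO4+  with
        # stability constant K = [LnSO4+] / ([Ln3+][SO4^2-]).
        # log K and the free-sulfate activity are illustrative placeholders only.
        log_K = 3.5
        sulfate_activity = 1e-3      # mol/kg, free SO4^2- activity

        K = 10 ** log_K
        frac_sulfate = K * sulfate_activity / (1.0 + K * sulfate_activity)
        print(f"LnSO4+ fraction: {frac_sulfate:.2f}, free Ln3+ fraction: {1 - frac_sulfate:.2f}")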

  12. A flow system for the spectrophotometric determination of lead in different types of waters using ion-exchange for pre-concentration and elimination of interferences.

    PubMed

    Mesquita, Raquel B R; Fernandes, Sílvia M V; Rangel, António O S S

    2004-02-06

    A flow system for the spectrophotometric determination of lead in natural and waste waters is proposed. The determination is based on the colorimetric reaction between malachite green and iodide, followed by the formation of a ternary complex between those reagents and lead cations. The developed flow system includes a lead pre-concentration step in a column packed with a cationic resin (Chelex 100) operating in a sequential injection mode. To improve the mixing of sample and reagents, a flow injection approach was adopted for the colorimetric determination. In this way, a hybrid flow system involving both sequential injection and flow injection concepts was designed. Another feature of the proposed system is the efficient elimination of major interferent species, such as cadmium and copper. The elimination of cadmium interference is obtained by complexing Cd(2+) with chloride and retaining the negatively charged complexes formed in an anionic resin, AG1 X-8. As for copper, with the presence of both ionic resins as well as the conditions for cadmium elimination, it no longer acts as an interferent. Different ranges of lead concentration (50-300 and 300-1000 µg l⁻¹) can be determined with minor changes in the controlling software, useful for application to both natural and waste waters. A detection limit of 25 µg l⁻¹ was achieved. Repeatability, evaluated from 10 consecutive determinations, was better than 4%. The recoveries of lead spikes added to the samples ranged from 93 to 102%. The sampling frequency was 17 and 24 determinations per hour for the 50-300 and 300-1000 µg l⁻¹ ranges, respectively.

  13. Merging National Forest and National Forest Health Inventories to Obtain an Integrated Forest Resource Inventory – Experiences from Bavaria, Slovenia and Sweden

    PubMed Central

    Kovač, Marko; Bauer, Arthur; Ståhl, Göran

    2014-01-01

    Background, Material and Methods To meet the demands of sustainable forest management and international commitments, European nations have designed a variety of forest-monitoring systems for specific needs. While the majority of countries are committed to independent, single-purpose inventorying, a minority of countries have merged their single-purpose forest inventory systems into integrated forest resource inventories. The statistical efficiencies of the Bavarian, Slovene and Swedish integrated forest resource inventory designs are investigated with the various statistical parameters of the variables of growing stock volume, shares of damaged trees, and deadwood volume. The parameters are derived by using the estimators for the given inventory designs. The required sample sizes are derived via the general formula for non-stratified independent samples and via statistical power analyses. The cost effectiveness of the designs is compared via two simple cost effectiveness ratios. Results In terms of precision, the most illustrative parameters of the variables are relative standard errors; their values range between 1% and 3% if the variables' variations are low (s%<80%) and are higher in the case of higher variations. A comparison of the actual and required sample sizes shows that the actual sample sizes were deliberately set high to provide precise estimates for the majority of variables and strata. In turn, the successive inventories are statistically efficient, because they allow detecting the mean changes of variables with powers higher than 90%; the highest precision is attained for the changes of growing stock volume and the lowest for the changes of the shares of damaged trees. Two indicators of cost effectiveness also show that the time input spent for measuring one variable decreases with the complexity of inventories. Conclusion There is an increasing need for credible information on forest resources to be used for decision making and national and international policy making. Such information can be cost-efficiently provided through integrated forest resource inventories. PMID:24941120
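
    For simple random sampling, the "general formula for non-stratified independent samples" referenced above reduces to n = (CV / target relative standard error) squared, since the relative standard error of a mean is CV divided by the square root of n. A minimal sketch with illustrative numbers (an 80% coefficient of variation, the upper bound of the low-variation class mentioned in the results) follows; finite-population corrections and design effects are ignored here.

        import math

        def required_n(cv_percent, target_rse_percent):
            # Simple-random-sampling approximation: RSE of the mean = CV / sqrt(n),
            # so n = (CV / target RSE)^2.
            return math.ceil((cv_percent / target_rse_percent) ** 2)

        for rse in (2.0, 3.0):
            print(f"CV = 80%, target RSE = {rse:.0f}% -> n = {required_n(80.0, rse)}")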

  14. Assessment of different surveillance systems for avian influenza in commercial poultry in Catalonia (North-Eastern Spain).

    PubMed

    Alba, A; Casal, J; Napp, S; Martin, P A J

    2010-11-01

    Compulsory surveillance programmes for avian influenza (AI) have been implemented in domestic poultry and wild birds in all the European Member States since 2005. The implementation of these programmes is complex and requires close evaluation. A good indicator to assess their efficacy is the sensitivity (Se) of the surveillance system. In this study, the sensitivities of different sampling designs proposed by the Spanish authorities for the commercial poultry population of Catalonia were assessed using the scenario tree model methodology. These sampling designs were stratified throughout the territory of Spain and took into account the species, the types of production, and their specific risks. The probabilities of detecting infection at different prevalences at both individual and holding level were estimated. Furthermore, those subpopulations that contributed more to the Se of the system were identified. The model estimated that all the designs met the requirements of the European Commission. The probability of detecting AI circulating in Catalonian poultry did not change significantly when the within-holding design prevalence varied from 30% to 10%. In contrast, when the among-holding design prevalence decreased from 5% to 1%, the probability of detecting AI was drastically reduced. The sampling of duck and goose holdings, and to a lesser extent the sampling of turkey and game bird holdings, increased the Se substantially. The Se of passive surveillance in chickens for highly pathogenic avian influenza (HPAI) and low pathogenicity avian influenza (LPAI) was also assessed. The probability of the infected birds manifesting apparent clinical signs and the awareness of veterinarians and farmers had a great influence on the probability of detecting AI. In order to increase the probability of early detection of HPAI in chickens, the probability of performing AI-specific tests when AI is suspected would need to be increased. Copyright © 2010 Elsevier B.V. All rights reserved.
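
    At its core, a scenario tree calculation combines design prevalences and test sensitivities along the branches and converts them into detection probabilities at the holding and system level. The sketch below shows that core arithmetic with hypothetical inputs (the sample sizes, prevalences and test sensitivity are not the Catalonian programme's figures, and risk-based weighting is omitted); it reproduces the qualitative drop in system sensitivity when the among-holding design prevalence falls from 5% to 1%.

        def holding_sensitivity(n_sampled, within_prev, test_se):
            # Probability of detecting at least one infected bird in an infected holding,
            # given the within-holding design prevalence and individual test sensitivity.
            p_detect_per_bird = within_prev * test_se
            return 1.0 - (1.0 - p_detect_per_bird) ** n_sampled

        def system_sensitivity(n_holdings, among_prev, hse):
            # Probability that the surveillance component detects at least one infected
            # holding, given the among-holding design prevalence.
            return 1.0 - (1.0 - among_prev * hse) ** n_holdings

        # Hypothetical inputs for illustration only.
        hse = holding_sensitivity(n_sampled=15, within_prev=0.30, test_se=0.95)
        for among_prev in (0.05, 0.01):
            sse = system_sensitivity(n_holdings=200, among_prev=among_prev, hse=hse)
            print(f"among-holding design prevalence {among_prev:.0%}: system sensitivity = {sse:.3f}")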

  15. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    NASA Astrophysics Data System (ADS)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and complex 3D architectures in logic and memory devices have increased the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable, efficient, automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology and show how electron exposure dose impacts EDS metrology reproducibility, either through poor signal-to-noise ratio (SNR) at low dose or through sample modification at high dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process both in terms of throughput and metrology reliability.

  16. Computationally efficient algorithm for Gaussian Process regression in case of structured samples

    NASA Astrophysics Data System (ADS)

    Belyaev, M.; Burnaev, E.; Kapushev, Y.

    2016-04-01

    Surrogate modeling is widely used in many engineering problems. Data sets often have a Cartesian product structure (for instance, a factorial design of experiments with missing points). In such cases the size of the data set can be very large. Therefore, one of the most popular approximation algorithms, Gaussian Process regression, can hardly be applied due to its computational complexity. In this paper, a computationally efficient approach for constructing Gaussian Process regression models for data sets with Cartesian product structure is presented. Efficiency is achieved by exploiting the special structure of the data set and using tensor operations. The proposed algorithm has low computational and memory complexity compared to existing algorithms. We also introduce a regularization procedure that takes into account the anisotropy of the data set and avoids degeneracy of the regression model.
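
    The key trick behind such algorithms is that, for a full grid, the kernel matrix over the product design factorizes as a Kronecker product of small per-factor kernels, so the linear solve never forms the full n1*n2 matrix. The sketch below is a minimal illustration of that identity for a complete two-factor grid (the paper additionally handles missing points, anisotropy and regularization); the kernel, length-scales and test function are arbitrary choices made for the example.

        import numpy as np

        rng = np.random.default_rng(0)

        def rbf(x, y, ls):
            # Squared-exponential kernel between 1-D coordinate vectors x and y.
            d = x[:, None] - y[None, :]
            return np.exp(-0.5 * (d / ls) ** 2)

        # Illustrative 2-D grid (Cartesian product of two 1-D factors).
        x1 = np.linspace(0.0, 1.0, 30)
        x2 = np.linspace(0.0, 1.0, 40)
        X1, X2 = np.meshgrid(x1, x2, indexing="ij")
        f = np.sin(6 * X1) * np.cos(4 * X2)                  # arbitrary test function
        y = f + 0.05 * rng.normal(size=f.shape)              # noisy observations on the grid

        noise = 0.05 ** 2
        K1 = rbf(x1, x1, ls=0.2)                             # n1 x n1 kernel of factor 1
        K2 = rbf(x2, x2, ls=0.2)                             # n2 x n2 kernel of factor 2

        # Eigendecompose each small factor kernel instead of the full n1*n2 kernel.
        w1, Q1 = np.linalg.eigh(K1)
        w2, Q2 = np.linalg.eigh(K2)

        # alpha = (K1 kron K2 + noise*I)^{-1} y, using only per-factor matrix products:
        # rotate into the joint eigenbasis, divide by the diagonal, rotate back.
        S = Q1.T @ y @ Q2                                    # y stored as an (n1, n2) array
        S = S / (np.outer(w1, w2) + noise)                   # eigenvalues of the Kronecker kernel
        alpha = Q1 @ S @ Q2.T

        # Posterior mean at the training grid: K @ alpha, again without forming K.
        post_mean = K1 @ alpha @ K2.T
        print("mean absolute error vs. noise-free surface:", np.abs(post_mean - f).mean())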

  17. Trends in Mediation Analysis in Nursing Research: Improving Current Practice.

    PubMed

    Hertzog, Melody

    2018-06-01

    The purpose of this study was to describe common approaches used by nursing researchers to test mediation models and evaluate them within the context of current methodological advances. MEDLINE was used to locate studies testing a mediation model and published from 2004 to 2015 in nursing journals. Design (experimental/correlation, cross-sectional/longitudinal, model complexity) and analysis (method, inclusion of test of mediated effect, violations/discussion of assumptions, sample size/power) characteristics were coded for 456 studies. General trends were identified using descriptive statistics. Consistent with findings of reviews in other disciplines, evidence was found that nursing researchers may not be aware of the strong assumptions and serious limitations of their analyses. Suggestions for strengthening the rigor of such studies and an overview of current methods for testing more complex models, including longitudinal mediation processes, are presented.
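
    One of the currently recommended ways to test a mediated (indirect) effect is a percentile bootstrap of the product of the a and b paths, rather than relying on normal-theory tests of the individual paths. The sketch below illustrates that procedure on simulated single-mediator data; the effect sizes and sample size are arbitrary, and the code is a generic illustration rather than an analysis from any of the reviewed studies.

        import numpy as np

        rng = np.random.default_rng(7)

        # Simulated data for a single-mediator model X -> M -> Y (true a = 0.5, b = 0.4).
        n = 300
        x = rng.normal(size=n)
        m = 0.5 * x + rng.normal(size=n)
        y = 0.4 * m + 0.2 * x + rng.normal(size=n)

        def indirect_effect(x, m, y):
            # a-path: regress M on X; b-path: regress Y on M controlling for X.
            a = np.polyfit(x, m, 1)[0]
            X = np.column_stack([np.ones_like(x), m, x])
            b = np.linalg.lstsq(X, y, rcond=None)[0][1]
            return a * b

        # Percentile bootstrap of the indirect effect a*b.
        boot = np.empty(2000)
        for i in range(boot.size):
            idx = rng.integers(0, n, n)
            boot[i] = indirect_effect(x[idx], m[idx], y[idx])

        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"indirect effect = {indirect_effect(x, m, y):.3f}, 95% bootstrap CI = [{lo:.3f}, {hi:.3f}]")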

  18. Experimental setup for the study of resonant inelastic X-ray scattering of organometallic complexes in gas phase

    NASA Astrophysics Data System (ADS)

    Ismail, I.; Guillemin, R.; Marchenko, T.; Travnikova, O.; Ablett, J. M.; Rueff, J.-P.; Piancastelli, M.-N.; Simon, M.; Journel, L.

    2018-06-01

    A new setup has been designed and built to study organometallic complexes in the gas phase at third-generation synchrotron radiation sources. This setup consists of a new homemade computer-controlled gas cell that allows us to sublimate solid samples by accurately controlling the temperature. This cell has been developed to be a part of the high-resolution X-ray emission spectrometer permanently installed at the GALAXIES beamline of the French National Synchrotron Facility SOLEIL. To illustrate the capabilities of the setup, the cell has been successfully used to record high-resolution Kα emission spectra of gas-phase ferrocene Fe(C5H5)2 and to characterize their dependence on the excitation energy. This will make it possible to extend resonant X-ray emission studies to other organometallic molecules.

  19. Water-quality trend analysis and sampling design for the Souris River, Saskatchewan, North Dakota, and Manitoba

    USGS Publications Warehouse

    Vecchia, Aldo V.

    2000-01-01

    The Souris River Basin is a 24,600-square-mile basin located in southeast Saskatchewan, north-central North Dakota, and southwest Manitoba. The Souris River Bilateral Water Quality Monitoring Group, formed in 1989 by the governments of Canada and the United States, is responsible for documenting trends in water quality in the Souris River and making recommendations for monitoring future water-quality conditions. This report presents results of a study conducted for the Bilateral Water Quality Monitoring Group by the U.S. Geological Survey, in cooperation with the North Dakota Department of Health, to analyze historic trends in water quality in the Souris River and to determine efficient sampling designs for monitoring future trends. U.S. Geological Survey and Environment Canada water-quality data collected during 1977-96 from four sites near the boundary crossings between Canada and the United States were included in the trend analysis. A parametric time-series model was developed for detecting trends in historic constituent concentration data. The model can be applied to constituents that have at least 90 percent of observations above detection limits of the analyses, which, for the Souris River, includes most major ions and nutrients and many trace elements. The model can detect complex nonmonotonic trends in concentration in the presence of complex interannual and seasonal variability in daily discharge. A key feature of the model is its ability to handle highly irregular sampling intervals. For example, the intervals between concentration measurements may range from as short as 10 days to as long as several months, and the number of samples in any given year can range from zero to 36. Results from the trend analysis for the Souris River indicated numerous trends in constituent concentration. The most significant trends at the two sites located near the upstream boundary crossing between Saskatchewan and North Dakota consisted of increases in concentrations of most major ions, dissolved boron, and dissolved arsenic during 1987-91 and decreases in concentrations of the same constituents during 1992-96. Significant trends at the two sites located near the downstream boundary crossing between North Dakota and Manitoba included increases in dissolved sodium, dissolved chloride, and total phosphorus during 1977-86, decreases in dissolved oxygen and dissolved boron and increases in total phosphorus and dissolved iron during 1987-91, and a decrease in total phosphorus during 1992-96. The time-series model also was used to determine the sensitivity of various sampling designs for monitoring future water-quality trends in the Souris River. It was determined that at least two samples per year are required in each of three seasons--March through June, July through October, and November through February--to obtain reasonable sensitivity for detecting trends in each season. In addition, substantial improvements occurred in sensitivity for detecting trends by adding a third sample for major ions and trace elements in March through June, adding a third sample for nutrients in July through October, and adding a third sample for nutrients, trace elements, and dissolved oxygen in November through February.
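
    A full reconstruction of the parametric time-series model is beyond the scope of an abstract, but the general flavor of flow-adjusted trend detection can be sketched as a regression of log concentration on log discharge, seasonal harmonics and a linear time trend, fitted to irregularly spaced samples. The example below uses synthetic data and ordinary least squares; it is a deliberately simplified stand-in, not the model documented in the report.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic, irregularly spaced record: roughly 8 samples per year over 20 years.
        t = np.sort(rng.uniform(0.0, 20.0, size=160))               # decimal years since start
        log_q = np.sin(2 * np.pi * t) + rng.normal(0, 0.5, t.size)  # log daily discharge
        true_trend = 0.01                                            # log-units per year
        log_c = (1.0 - 0.3 * log_q + 0.2 * np.cos(2 * np.pi * t)
                 + true_trend * t + rng.normal(0, 0.15, t.size))    # log concentration

        # Flow-adjusted seasonal trend model:
        # log C = b0 + b1*log Q + b2*sin(2*pi*t) + b3*cos(2*pi*t) + b4*t + error
        X = np.column_stack([np.ones_like(t), log_q,
                             np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), t])
        beta, *_ = np.linalg.lstsq(X, log_c, rcond=None)
        print(f"estimated trend: {beta[4]:+.4f} log-units per year (true {true_trend:+.4f})")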

  20. The balanced incomplete block design is not suitable for the evaluation of complex interventions.

    PubMed

    Trietsch, Jasper; Leffers, Pieter; van Steenkiste, Ben; Grol, Richard; van der Weijden, Trudy

    2014-12-01

    In quality of care research, the balanced incomplete block (BIB) design is regularly claimed to have been used when evaluating complex interventions. In this article, we reflect on the appropriateness of using this design for evaluating complex interventions. Literature study using PubMed and handbooks. After studying various articles on health services research that claim to have applied the BIB and the original methodological literature on this design, it became clear that the applied method is in fact not a BIB design. We conclude that the use of this design is not suited for evaluating complex interventions. We stress that, to prevent improper use of terms, more attention should be paid to proper referencing of the original methodological literature. Copyright © 2014 Elsevier Inc. All rights reserved.
