Science.gov

Sample records for design epidemiological methods

  1. Epidemiological designs for vaccine safety assessment: methods and pitfalls.

    PubMed

    Andrews, Nick

    2012-09-01

    Three commonly used designs for vaccine safety assessment post licensure are cohort, case-control and self-controlled case series. These methods are often used with routine health databases and immunisation registries. This paper considers the issues that may arise when designing an epidemiological study, such as understanding the vaccine safety question, case definition and finding, limitations of data sources, uncontrolled confounding, and pitfalls that apply to the individual designs. The example of MMR and autism, where all three designs have been used, is presented to help consider these issues. PMID:21985898
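
The self-controlled case series mentioned above removes fixed between-person confounding by comparing event rates within the same individual. A minimal sketch of the idea, using hypothetical data and a simplified pooled-rate estimator rather than the conditional Poisson likelihood used in real SCCS analyses:

```python
# Toy self-controlled case series (SCCS) sketch: each case contributes
# person-time split into a post-vaccination "risk" window and a "control"
# window, and the relative incidence compares event rates within individuals.
# Data and field names are hypothetical; this pooled-rate version is a
# simplification of the conditional likelihood used in practice.

def relative_incidence(cases):
    """cases: list of dicts with event counts and person-time per window."""
    ev_risk = sum(c["events_risk"] for c in cases)
    t_risk = sum(c["time_risk"] for c in cases)
    ev_ctrl = sum(c["events_control"] for c in cases)
    t_ctrl = sum(c["time_control"] for c in cases)
    return (ev_risk / t_risk) / (ev_ctrl / t_ctrl)

cases = [
    {"events_risk": 2, "time_risk": 42, "events_control": 1, "time_control": 323},
    {"events_risk": 1, "time_risk": 42, "events_control": 2, "time_control": 323},
]
print(round(relative_incidence(cases), 2))
```

A relative incidence well above 1 would suggest an elevated event rate in the post-vaccination window, subject to the design's assumptions (e.g. events do not censor observation).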

  3. The ZInEP Epidemiology Survey: background, design and methods.

    PubMed

    Ajdacic-Gross, Vladeta; Müller, Mario; Rodgers, Stephanie; Warnke, Inge; Hengartner, Michael P; Landolt, Karin; Hagenmuller, Florence; Meier, Magali; Tse, Lee-Ting; Aleksandrowicz, Aleksandra; Passardi, Marco; Knöpfli, Daniel; Schönfelder, Herdis; Eisele, Jochen; Rüsch, Nicolas; Haker, Helene; Kawohl, Wolfram; Rössler, Wulf

    2014-12-01

    This article introduces the design, sampling, field procedures and instruments used in the ZInEP Epidemiology Survey. This survey is one of six ZInEP projects (Zürcher Impulsprogramm zur nachhaltigen Entwicklung der Psychiatrie, i.e. the "Zurich Program for Sustainable Development of Mental Health Services"). It parallels the longitudinal Zurich Study with a sample comparable in age and gender, and with similar methodology, including identical instruments. Thus, it is aimed at assessing changes in the prevalence rates of common mental disorders and in the use of professional help and psychiatric services. Moreover, the current survey widens the spectrum of topics by including sociopsychiatric questionnaires on stigma, stress-related biological measures such as load and cortisol levels, electroencephalographic (EEG) and near-infrared spectroscopy (NIRS) examinations with various paradigms, and sociophysiological tests. The structure of the ZInEP Epidemiology Survey entails four subprojects: a short telephone screening using the SCL-27 (n of nearly 10,000), a comprehensive face-to-face interview based on the SPIKE (Structured Psychopathological Interview and Rating of the Social Consequences for Epidemiology: the main instrument of the Zurich Study) with a stratified sample (n = 1500), tests in the Center for Neurophysiology and Sociophysiology (n = 227), and a prospective study with up to three follow-up interviews and further measures (n = 157). In sum, the four subprojects of the ZInEP Epidemiology Survey deliver a large interdisciplinary database. PMID:24942564

  4. Design and implementation of epidemiological field investigation method based on mobile collaboration

    NASA Astrophysics Data System (ADS)

    Zhang, Lihui; Wang, Dongchuan; Huang, Mingxiang; Gong, Jianhua; Fang, Liqun; Cao, Wuchun

    2008-10-01

    With the development of mobile technologies and their integration with spatial information technologies, it has become possible to develop new technical solutions to support epidemiological field investigation, especially the handling of emergent public health events. Based on mobile technologies and a virtual geographic environment, the authors have designed a model for collaborative work in four communication patterns, namely S2S (Static to Static), M2S (Mobile to Static), S2M (Static to Mobile), and M2M (Mobile to Mobile). Building on this model, the paper explores mobile online mapping for mobile collaboration, conducts an experimental case study of HFRS (Hemorrhagic Fever with Renal Syndrome) fieldwork, and develops a prototype emergency-response information system to test the effectiveness and usefulness of field surveys based on mobile collaboration.

  5. Overview of the epidemiology methods and applications: strengths and limitations of observational study designs.

    PubMed

    Colditz, Graham A

    2010-01-01

    The impact of study design on the results of medical research has long been an area of both substantial debate and a smaller body of empirical research. Examples come from many disciplines within clinical and public health research. Among the early major contributions in the 1970s was work by Mosteller and colleagues (Gilbert et al., 1977), who noted that innovations in surgery and anesthesia showed greater gains than standard therapy when nonrandomized, controlled trials were evaluated compared with the gains reported in randomized, controlled trials. More recently, we and others have evaluated the impact of design in medical and surgical research, and concluded that the mean gain comparing new therapies to established therapies was biased by study design in nonrandomized trials (Colditz et al., 1989; Miller et al., 1989). Benson and Hartz (2000) conducted a study in which they focused only on studies reported after 1985. On the basis of 136 reports of 19 diverse treatments, Benson and Hartz concluded that in only 2 of the 19 analyses did the combined data from the observational studies lie outside the 95% confidence interval for the combined data from the randomized trials. A similar study drew only on data reported from 1991 to 1995, which showed remarkably similar results among observational studies and randomized, controlled trials (Concato et al., 2000). These more recent data suggest that advancing the study design and analytic methods may reduce bias in some evaluations of medical and public health interventions. Such methods apply not only to the original studies, but also to the approaches that are taken to quantitatively combine results by using meta-analytic approaches such as random effects meta-regression, Bayesian meta-analysis, and the like (Normand, 1999). By focusing attention on thorough data analysis, design issues can be understood and their impact or bias can be estimated, on average, and then ideally accounted for in the interpretation of findings.

  6. Regression Discontinuity Designs in Epidemiology

    PubMed Central

    Moscoe, Ellen; Mutevedzi, Portia; Newell, Marie-Louise; Bärnighausen, Till

    2014-01-01

    When patients receive an intervention based on whether they score below or above some threshold value on a continuously measured random variable, the intervention will be randomly assigned for patients close to the threshold. The regression discontinuity design exploits this fact to estimate causal treatment effects. In spite of its recent proliferation in economics, the regression discontinuity design has not been widely adopted in epidemiology. We describe regression discontinuity, its implementation, and the assumptions required for causal inference. We show that regression discontinuity is generalizable to the survival and nonlinear models that are mainstays of epidemiologic analysis. We then present an application of regression discontinuity to the much-debated epidemiologic question of when to start HIV patients on antiretroviral therapy. Using data from a large South African cohort (2007–2011), we estimate the causal effect of early versus deferred treatment eligibility on mortality. Patients whose first CD4 count was just below the 200 cells/μL CD4 count threshold had a 35% lower hazard of death (hazard ratio = 0.65 [95% confidence interval = 0.45–0.94]) than patients presenting with CD4 counts just above the threshold. We close by discussing the strengths and limitations of regression discontinuity designs for epidemiology. PMID:25061922
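
The core of the design can be sketched numerically: fit separate local linear trends just below and just above the assignment threshold, then take the difference of the two fits at the cutoff. The sketch below uses simulated data with a known jump, not the authors' CD4/mortality analysis (which used survival models):

```python
import random

# Sharp regression discontinuity sketch: the estimated treatment effect is
# the discontinuity in the fitted outcome at the cutoff, using only
# observations within a bandwidth around it.

def linfit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def rd_effect(data, cutoff, bandwidth):
    left = [(x, y) for x, y in data if cutoff - bandwidth <= x < cutoff]
    right = [(x, y) for x, y in data if cutoff <= x <= cutoff + bandwidth]
    al, bl = linfit(*zip(*left))
    ar, br = linfit(*zip(*right))
    return (ar + br * cutoff) - (al + bl * cutoff)  # jump at the cutoff

random.seed(1)
# toy data: a smooth trend plus a jump of +2.0 at the cutoff x = 200
data = [(x, 0.01 * x + (2.0 if x >= 200 else 0.0) + random.gauss(0, 0.1))
        for x in range(150, 251)]
print(rd_effect(data, 200, 50))
```

The estimate recovers the simulated jump of about 2.0; in a real application the bandwidth choice and the continuity assumptions discussed in the paper are what make or break the causal interpretation.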

  7. Advanced epidemiologic and analytical methods.

    PubMed

    Albanese, E

    2016-01-01

    Observational studies are indispensable for etiologic research, and are key to test life-course hypotheses and improve our understanding of neurologic diseases that have long induction and latency periods. In recent years a plethora of advanced design and analytic techniques have been developed to strengthen the robustness and ultimately the validity of the results of observational studies, and to address their inherent proneness to bias. It is the responsibility of clinicians and researchers to critically appraise and appropriately contextualize the findings of the exponentially expanding scientific literature. This critical appraisal should be rooted in a thorough understanding of advanced epidemiologic methods and techniques commonly used to formulate and test relevant hypotheses and to keep bias at bay. PMID:27637951

  8. Epidemiology and the scientific method.

    PubMed

    Chalmers, A F

    1982-01-01

    This article refutes the claim that the field of epidemiology and community health would benefit from the application of the scientific method. It is argued that the methods of physics are not appropriate for other disciplines. When applied to the social sciences, positivism is a conservatizing force, causing theory to become based on a mere description of social phenomena. Since it cannot lead to a deep understanding of social phenomena, positivism is incapable of revealing ways in which society could be radically changed. Moreover, such theory is far from neutral. Rather, it is formed and influenced by the forms of life experienced and practiced in the society. This is illustrated by an analysis of the origin of modern physics at the time when society was changing from a feudal to capitalist form of organization. It is concluded that advances will be made in epidemiology and community health when this field breaks from its focus on the individual and incorporates class into its analysis. However, given the interconnection between social structure and social theory, resistance to such a radical change can be expected. PMID:7141777

  9. Melanocortin-1 receptor, skin cancer and phenotypic characteristics (M-SKIP) project: study design and methods for pooling results of genetic epidemiological studies

    PubMed Central

    2012-01-01

    Background For complex diseases like cancer, pooled-analysis of individual data represents a powerful tool to investigate the joint contribution of genetic, phenotypic and environmental factors to the development of a disease. Pooled-analysis of epidemiological studies has many advantages over meta-analysis, and preliminary results may be obtained faster and with lower costs than with prospective consortia. Design and methods Based on our experience with the study design of the Melanocortin-1 receptor (MC1R) gene, SKin cancer and Phenotypic characteristics (M-SKIP) project, we describe the most important steps in planning and conducting a pooled-analysis of genetic epidemiological studies. We then present the statistical analysis plan that we will apply, giving particular attention to methods of analysis recently proposed to account for between-study heterogeneity and to explore the joint contribution of genetic, phenotypic and environmental factors in the development of a disease. Within the M-SKIP project, data on 10,959 skin cancer cases and 14,785 controls from 31 international investigators were checked for quality and recoded for standardization. We first proposed to fit the aggregated data with random-effects logistic regression models. However, for the M-SKIP project, a two-stage analysis will be preferred to overcome the problem regarding the availability of different study covariates. The joint contribution of MC1R variants and phenotypic characteristics to skin cancer development will be studied via logic regression modeling. Discussion Methodological guidelines to correctly design and conduct pooled-analyses are needed to facilitate application of such methods, thus providing a better summary of the actual findings in specific fields. PMID:22862891
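
The two-stage analysis mentioned in the abstract can be sketched as follows: stage 1 produces a study-specific estimate and variance, and stage 2 pools them with a random-effects model. The sketch below uses hypothetical 2x2 counts and a DerSimonian-Laird estimator; the actual M-SKIP plan involves covariate-adjusted models and logic regression beyond this:

```python
import math

# Stage 1: log odds ratio and its variance from a 2x2 table
# (a = exposed cases, b = exposed controls, c = unexposed cases,
#  d = unexposed controls). No zero-cell correction is applied here.
def log_or(a, b, c, d):
    return math.log((a * d) / (b * c)), 1 / a + 1 / b + 1 / c + 1 / d

# Stage 2: DerSimonian-Laird random-effects pooling of (estimate, variance)
# pairs, estimating the between-study variance tau^2 from Cochran's Q.
def pool_dl(estimates):
    w = [1 / v for _, v in estimates]
    y = [e for e, _ in estimates]
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)   # between-study variance
    w_star = [1 / (v + tau2) for _, v in estimates]
    return sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)

# hypothetical per-study 2x2 tables
tables = [(30, 70, 20, 80), (45, 55, 30, 70), (25, 75, 28, 72)]
pooled = pool_dl([log_or(*t) for t in tables])
print(round(math.exp(pooled), 2))  # pooled odds ratio
```

The two-stage route only needs each study's estimate and variance at stage 2, which is why it sidesteps the problem of studies measuring different covariates.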

  10. Epidemiologic methods in analysis of scientific issues

    NASA Astrophysics Data System (ADS)

    Erdreich, Linda S.

    2003-10-01

    Studies of human populations provide much of the information that is used to evaluate compensation cases for hearing loss, including rates of hearing loss by age, and dose-response relationships. The reference data used to make decisions regarding workman's compensation is based on epidemiologic studies of cohorts of workers exposed to various noise levels. Epidemiology and its methods can be used in other ways in the courtroom; to assess the merits of a complaint, to support Daubert criteria, and to explain scientific issues to the trier of fact, generally a layperson. Using examples other than occupational noise induced hearing loss, these methods will be applied to respond to a complaint that hearing loss followed exposure to a sudden noise, a medication, or an occupational chemical, and thus was caused by said exposure. The standard criteria for assessing the weight of the evidence, and epidemiologic criteria for causality show the limits of such anecdotal data and incorporate quantitative and temporal issues. Reports of clusters of cases are also intuitively convincing to juries. Epidemiologic methods provide a scientific approach to assess whether rates of the outcome are indeed increased, and the extent to which increased rates provide evidence for causality.

  11. Using Epidemiologic Methods to Test Hypotheses regarding Causal Influences on Child and Adolescent Mental Disorders

    ERIC Educational Resources Information Center

    Lahey, Benjamin B.; D'Onofrio, Brian M.; Waldman, Irwin D.

    2009-01-01

    Epidemiology uses strong sampling methods and study designs to test refutable hypotheses regarding the causes of important health, mental health, and social outcomes. Epidemiologic methods are increasingly being used to move developmental psychopathology from studies that catalogue correlates of child and adolescent mental health to designs that…

  12. Kinetics methods for clinical epidemiology problems

    PubMed Central

    Corlan, Alexandru Dan; Ross, John

    2015-01-01

    Calculating the probability of each possible outcome for a patient at any time in the future is currently possible only in the simplest cases: short-term prediction in acute diseases of otherwise healthy persons. This problem is to some extent analogous to predicting the concentrations of species in a reactor when knowing initial concentrations and after examining reaction rates at the individual molecule level. The existing theoretical framework behind predicting contagion and the immediate outcome of acute diseases in previously healthy individuals is largely analogous to deterministic kinetics of chemical systems consisting of one or a few reactions. We show that current statistical models commonly used in chronic disease epidemiology correspond to simple stochastic treatment of single reaction systems. The general problem corresponds to stochastic kinetics of complex reaction systems. We attempt to formulate epidemiologic problems related to chronic diseases in chemical kinetics terms. We review methods that may be adapted for use in epidemiology. We show that some reactions cannot fit into the mass-action law paradigm and solutions to these systems would frequently exhibit an antiportfolio effect. We provide a complete example application of stochastic kinetics modeling for a deductive meta-analysis of two papers on atrial fibrillation incidence, prevalence, and mortality. PMID:26578757
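
The stochastic-kinetics framing above can be made concrete with a Gillespie-style simulation of a minimal "reaction system" for a chronic disease: Healthy converts to Ill at an incidence rate, and Ill to Dead at a mortality rate. Rates and state counts below are hypothetical; real applications such as the paper's atrial fibrillation example involve many more transitions:

```python
import random

# Gillespie stochastic simulation algorithm for a two-reaction system:
#   Healthy --incidence--> Ill --mortality--> Dead
# Propensities are proportional to the current state counts, exactly as
# reaction rates are proportional to species concentrations.

def simulate(n_healthy, incidence, mortality, t_max, rng):
    t, h, i, d = 0.0, n_healthy, 0, 0
    while t < t_max:
        r_inc, r_mort = incidence * h, mortality * i
        total = r_inc + r_mort
        if total == 0:
            break                             # no more possible events
        t += rng.expovariate(total)           # waiting time to next event
        if t >= t_max:
            break
        if rng.random() < r_inc / total:      # which reaction fired?
            h, i = h - 1, i + 1               # disease onset
        else:
            i, d = i - 1, d + 1               # death
    return h, i, d

rng = random.Random(7)
h, i, d = simulate(1000, incidence=0.02, mortality=0.1, t_max=5.0, rng=rng)
print(h, i, d)
```

Repeating the simulation many times yields the full outcome distribution at any future time, which is precisely the quantity the authors argue deterministic single-reaction models cannot provide.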

  14. DESIGN OF EXPOSURE MEASUREMENTS FOR EPIDEMIOLOGIC STUDIES

    EPA Science Inventory

    This presentation will describe the following items: (1) London daily air pollution and deaths that demonstrate how time series epidemiology can indicate that air pollution caused death; (2) Sophisticated statistical models required to establish this relationship for lower pollut...

  15. Epidemiological study air disaster in Amsterdam (ESADA): study design

    PubMed Central

    Slottje, Pauline; Huizink, Anja C; Twisk, Jos WR; Witteveen, Anke B; van der Ploeg, Henk M; Bramsen, Inge; Smidt, Nynke; Bijlsma, Joost A; Bouter, Lex M; van Mechelen, Willem; Smid, Tjabe

    2005-01-01

    Background In 1992, a cargo aircraft crashed into apartment buildings in Amsterdam, killing 43 victims and destroying 266 apartments. In the aftermath there were speculations about the cause of the crash, potential exposures to hazardous materials due to the disaster and the health consequences. Starting in 2000, the Epidemiological Study Air Disaster in Amsterdam (ESADA) aimed to assess the long-term health effects of occupational exposure to this disaster on professional assistance workers. Methods/Design Epidemiological study among all the exposed professional fire-fighters and police officers who performed disaster-related task(s), and hangar workers who sorted the wreckage of the aircraft, as well as reference groups of their non-exposed colleagues who did not perform any disaster-related tasks. The study took place, on average, 8.5 years after the disaster. Questionnaires were used to assess details on occupational exposure to the disaster. Health measures comprised laboratory assessments in urine, blood and saliva, as well as self-reported current health measures, including health-related quality of life, and various physical and psychological symptoms. Discussion In this paper we describe and discuss the design of the ESADA. The ESADA will provide additional scientific knowledge on the long-term health effects of technological disasters on professional workers. PMID:15921536

  16. The science of epidemiology and the methods needed for public health assessments: a review of epidemiology textbooks

    PubMed Central

    2014-01-01

    Objectives Epidemiology is often described as ‘the science of public health’. Here we aim to assess the extent that epidemiological methods, as covered in contemporary standard textbooks, provide tools that can assess the relative magnitude of public health problems and can be used to help rank and assess public health priorities. Study Design Narrative literature review. Methods Thirty textbooks, grouped into three categories (pure, extended, or applied epidemiology), were reviewed with attention to the ways the discipline is characterised and the nature of the analytical methods described. Results Pure texts tend to present a strict hierarchy of methods with those metrics deemed to best serve aetiological inquiry at the top. Extended and applied texts employ broader definitions of epidemiology but in most cases, the metrics described are also those used in aetiological inquiry and may not be optimal for capturing the consequences and social importance of injuries and disease onsets. Conclusions The primary scientific purpose of epidemiology, even amongst ‘applied’ textbooks, is aetiological inquiry. Authors do not readily extend to methods suitable for assessing public health problems and priorities. PMID:24507570

  17. EPI-CT: design, challenges and epidemiological methods of an international study on cancer risk after paediatric and young adult CT.

    PubMed

    Bosch de Basea, Magda; Pearce, Mark S; Kesminiene, Ausrele; Bernier, Marie-Odile; Dabin, Jérémie; Engels, Hilde; Hauptmann, Michael; Krille, Lucian; Meulepas, Johanna M; Struelens, Lara; Baatout, Sarah; Kaijser, Magnus; Maccia, Carlo; Jahnen, Andreas; Thierry-Chef, Isabelle; Blettner, Maria; Johansen, Christoffer; Kjaerheim, Kristina; Nordenskjöld, Arvid; Olerud, Hilde; Salotti, Jane A; Andersen, Tina Veje; Vrijheid, Martine; Cardis, Elisabeth

    2015-09-01

    Computed tomography (CT) has great clinical utility and its usage has increased dramatically over the years. Concerns have been raised, however, about health impacts of ionising radiation exposure from CTs, particularly in children, who have a higher risk for some radiation-induced diseases. Direct estimation of the health impact of these exposures is needed, but the conduct of epidemiological studies of paediatric CT populations poses a number of challenges which, if not addressed, could invalidate the results. The aim of the present paper is to review the main challenges of a study on the health impact of paediatric CTs and how the protocol of the European collaborative study EPI-CT, coordinated by the International Agency for Research on Cancer (IARC), is designed to address them. The study, based on a common protocol, is being conducted in Belgium, Denmark, France, Germany, the Netherlands, Norway, Spain, Sweden and the United Kingdom and it has recruited over one million patients suitable for long-term prospective follow-up. Cohort accrual relies on records of participating hospital radiology departments. Basic demographic information and technical data on the CT procedure needed to estimate organ doses are being abstracted and passive follow-up is being conducted by linkage to population-based cancer and mortality registries. The main issues which may affect the validity of study results include missing doses from other radiological procedures, missing CTs, confounding by CT indication and socioeconomic status, and dose reconstruction. Sub-studies are underway to evaluate their potential impact. By focusing on the issues which challenge the validity of risk estimates from CT exposures, EPI-CT will be able to address limitations of previous CT studies, thus providing reliable estimates of risk of solid tumours and leukaemia from paediatric CT exposures and scientific bases for the optimisation of paediatric CT protocols and patient protection. PMID:26226081

  18. A design framework for exploratory geovisualization in epidemiology

    PubMed Central

    Robinson, Anthony C.

    2009-01-01

    This paper presents a design framework for geographic visualization based on iterative evaluations of a toolkit designed to support cancer epidemiology. The Exploratory Spatio-Temporal Analysis Toolkit (ESTAT), is intended to support visual exploration through multivariate health data. Its purpose is to provide epidemiologists with the ability to generate new hypotheses or further refine those they may already have. Through an iterative user-centered design process, ESTAT has been evaluated by epidemiologists at the National Cancer Institute (NCI). Results of these evaluations are discussed, and a design framework based on evaluation evidence is presented. The framework provides specific recommendations and considerations for the design and development of a geovisualization toolkit for epidemiology. Its basic structure provides a model for future design and evaluation efforts in information visualization. PMID:20390052

  19. How to design a (good) epidemiological observational study: epidemiological research protocol at a glance.

    PubMed

    Fronteira, Ines

    2013-01-01

    In this article, we propose a general structure for designing a research protocol of an observational epidemiological study. We start by highlighting the importance of the research protocol, namely in helping to account for bias and in guaranteeing methodological rigor and study reproducibility. Next, we reflect on some of the essential elements of a research protocol no matter its objective. We further present some specific issues to be included according to the type of study: cross-sectional, case-control and cohort.

  20. A Review of Spatial Methods in Epidemiology, 2000–2010

    PubMed Central

    Auchincloss, Amy H.; Gebreab, Samson Y.; Mair, Christina; Roux, Ana V. Diez

    2013-01-01

    Understanding the impact of place on health is a key element of epidemiologic investigation, and numerous tools are being employed for analysis of spatial health-related data. This review documents the huge growth in spatial epidemiology, summarizes the tools that have been employed, and provides in-depth discussion of several methods. Relevant research articles for 2000–2010 from seven epidemiology journals were included if the study utilized a spatial analysis method in primary analysis (n = 207). Results summarized frequency of spatial methods and substantive focus; graphs explored trends over time. The most common spatial methods were distance calculations, spatial aggregation, clustering, spatial smoothing and interpolation, and spatial regression. Proximity measures were predominant and were applied primarily to air quality and climate science and resource access studies. The review concludes by noting emerging areas that are likely to be important to future spatial analysis in public health. PMID:22429160
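
The review found distance calculations to be the most common spatial method. The classic building block is the haversine great-circle distance between two latitude/longitude points; the coordinates below are illustrative (roughly central London to central Manchester), not drawn from the reviewed studies:

```python
import math

# Haversine great-circle distance between two (lat, lon) points in degrees,
# using a mean Earth radius of 6371 km. This is the usual proximity measure
# behind "distance to nearest facility / exposure source" analyses.

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# roughly London -> Manchester, on the order of 260 km
print(round(haversine_km(51.5074, -0.1278, 53.4808, -2.2426), 1))
```

More elaborate methods in the review (spatial smoothing, clustering, spatial regression) typically start from a matrix of such pairwise distances.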

  1. Design and analysis of metabolomics studies in epidemiologic research: a primer on -omic technologies.

    PubMed

    Tzoulaki, Ioanna; Ebbels, Timothy M D; Valdes, Ana; Elliott, Paul; Ioannidis, John P A

    2014-07-15

    Metabolomics is the field of "-omics" research concerned with the comprehensive characterization of the small low-molecular-weight metabolites in biological samples. In epidemiology, it represents an emerging technology and an unprecedented opportunity to measure environmental and other exposures with improved precision and far less measurement error than with standard epidemiologic methods. Advances in the application of metabolomics in large-scale epidemiologic research are now being realized through a combination of improved sample preparation and handling, automated laboratory and processing methods, and reduction in costs. The number of epidemiologic studies that use metabolic profiling is still limited, but it is fast gaining popularity in this area. In the present article, we present a roadmap for metabolomic analyses in epidemiologic studies and discuss the various challenges these data pose to large-scale studies. We discuss the steps of data preprocessing, univariate and multivariate data analysis, correction for multiplicity of comparisons with correlated data, and finally the steps of cross-validation and external validation. As data from metabolomic studies accumulate in epidemiology, there is a need for large-scale replication and synthesis of findings, increased availability of raw data, and a focus on good study design, all of which will highlight the potential clinical impact of metabolomics in this field. PMID:24966222
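
Correction for multiplicity, one of the preprocessing-to-validation steps listed above, is commonly done with the Benjamini-Hochberg step-up procedure controlling the false discovery rate. A minimal sketch with made-up p-values (metabolomic features are correlated, so real pipelines may prefer correlation-aware variants, as the abstract notes):

```python
# Benjamini-Hochberg step-up procedure: sort p-values, find the largest
# rank k with p_(k) <= alpha * k / m, and reject the k smallest hypotheses.

def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k_max = rank                      # largest rank passing the bound
    return sorted(order[:k_max])

# hypothetical p-values for ten metabolite-outcome associations
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.21, 0.76]
print(benjamini_hochberg(pvals))
```

Note the step-up property: a p-value may exceed its own threshold yet still be rejected if a larger-ranked p-value passes, which is why the loop keeps the largest passing rank rather than stopping at the first failure.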

  2. Epidemiological methods in diarrhoea studies—an update

    PubMed Central

    Schmidt, Wolf-Peter; Arnold, Benjamin F; Boisson, Sophie; Genser, Bernd; Luby, Stephen P; Barreto, Mauricio L; Clasen, Thomas; Cairncross, Sandy

    2011-01-01

    Background Diarrhoea remains a leading cause of morbidity and mortality but is difficult to measure in epidemiological studies. Challenges include the diagnosis based on self-reported symptoms, the logistical burden of intensive surveillance and the variability of diarrhoea in space, time and person. Methods We review current practices in sampling procedures to measure diarrhoea, and provide guidance for diarrhoea measurement across a range of study goals. Using 14 available data sets, we estimated typical design effects for clustering at household and village/neighbourhood level, and measured the impact of adjusting for baseline variables on the precision of intervention effect estimates. Results Incidence is the preferred outcome measure in aetiological studies, health services research and vaccine trials. Repeated prevalence measurements (longitudinal prevalence) are appropriate in high-mortality settings where malnutrition is common, although many repeat measures are rarely useful. Period prevalence is an inadequate outcome if an intervention affects illness duration. Adjusting point estimates for age or diarrhoea at baseline in randomized trials has little effect on the precision of estimates. Design effects in trials randomized at household level are usually <2 (range 1.0–3.2). Design effects for larger clusters (e.g. villages or neighbourhoods) vary greatly among different settings and study designs (range 0.1–25.8). Conclusions Using appropriate sampling strategies and outcome measures can improve the efficiency, validity and comparability of diarrhoea studies. Allocating large clusters in cluster randomized trials is compromised by unpredictable design effects and should be carried out only if the research question requires it. PMID:22268237
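
The design effects reported above have a simple standard form: for clusters of size m with intra-cluster correlation ICC, DEFF = 1 + (m - 1) * ICC, and the effective sample size is n / DEFF. The numbers below are illustrative, not taken from the paper's 14 data sets:

```python
# Design effect for cluster sampling and the resulting effective sample
# size. Small household clusters keep DEFF low; large village clusters
# can inflate it dramatically, matching the paper's observed ranges.

def design_effect(cluster_size, icc):
    return 1 + (cluster_size - 1) * icc

def effective_n(n, cluster_size, icc):
    return n / design_effect(cluster_size, icc)

# household-level clustering: small clusters, modest DEFF
print(design_effect(5, 0.2))
# village-level clustering: the same ICC logic with large clusters
print(design_effect(100, 0.1))
# 2000 sampled individuals in 100-person clusters carry far less information
print(round(effective_n(2000, 100, 0.1), 1))
```

This is why the conclusions warn against allocating large clusters: with m large, even a small ICC drives DEFF well above 2, and the ICC itself is hard to predict in advance.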

  3. Designing ROW Methods

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.

    1996-01-01

    There are many aspects to consider when designing a Rosenbrock-Wanner-Wolfbrandt (ROW) method for the numerical integration of ordinary differential equations (ODEs) solving initial value problems (IVPs). The process can be simplified by constructing ROW methods around good Runge-Kutta (RK) methods. The formulation of a new, simple, embedded, third-order, ROW method demonstrates this design approach.
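
The defining feature of ROW methods is that each stage solves only a linear system involving the Jacobian, instead of the nonlinear implicit equations of a fully implicit RK method. The simplest instance is the one-stage linearly implicit (Rosenbrock) Euler scheme, sketched below for a scalar stiff ODE; this first-order sketch only illustrates the stage structure, not the embedded third-order method the paper constructs:

```python
import math

# Linearly implicit (Rosenbrock) Euler step for y' = f(t, y):
#   k = f(t_n, y_n) / (1 - h * J),   y_{n+1} = y_n + h * k
# where J = df/dy. For systems, the division becomes one linear solve with
# the matrix (I - h*J) per step; no nonlinear iteration is needed, which is
# the appeal of ROW-type schemes for stiff problems.

def rosenbrock_euler(f, dfdy, y0, t0, t1, h):
    n = round((t1 - t0) / h)
    t, y = t0, y0
    for _ in range(n):
        k = f(t, y) / (1.0 - h * dfdy(t, y))  # one scalar "linear solve"
        y += h * k
        t += h
    return y

# stiff test problem: y' = -50*(y - cos(t)); the solution is rapidly
# attracted to a curve near cos(t)
f = lambda t, y: -50.0 * (y - math.cos(t))
dfdy = lambda t, y: -50.0
y = rosenbrock_euler(f, dfdy, y0=0.0, t0=0.0, t1=1.0, h=0.05)
print(y)
```

With h = 0.05 the explicit Euler method would be unstable here (|1 + h*lambda| = 1.5 > 1), while the linearly implicit step remains stable; the price of the one-stage scheme is only first-order accuracy, which is what higher-order embedded ROW designs like the paper's address.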

  4. Study design in genetic epidemiology: theoretical and practical considerations.

    PubMed

    Whittemore, A S; Nelson, L M

    1999-01-01

    Recent advances in molecular genetics have created new opportunities and challenges for genetic epidemiologists. Here we review some of the issues that arise when designing a study involving the genetic epidemiology of chronic diseases of late onset, such as cancer. We discuss two considerations that influence the choice of design. The first consideration is the study's goals. We describe the goals of identifying new susceptibility genes for a disease, of estimating important characteristics of known genes, and of learning how to prevent the disease in the genetically susceptible. We indicate how these goals affect the choice of design and present some guidelines for choosing designs that effectively address them. The second consideration is the set of practical constraints to successfully conducting the research. These constraints include problems of potential selection bias, reduced response rates, problems particular to family registries, problems particular to the cultures of various ethnic groups, and ethical issues. We indicate how these constraints affect the choice of design and discuss ways to deal with them. PMID:10854488

  5. [Curricular design of health postgraduate programs: the case of Masters in epidemiology].

    PubMed

    Bobadilla, J L; Lozano, R; Bobadilla, C

    1991-01-01

    This paper discusses the need to create specific programs for the training of researchers in epidemiology, a field that has traditionally been ignored by the graduate programs in public health. This is due, in part, to the emphasis that has been placed on the training of professionals in other areas of public health. The paper also includes the results of a consensus exercise developed during the curricular design of the Masters Program in Epidemiology of the School of Medicine of the National Autonomous University of Mexico. The technique used during the consensus exercise was the TKJ, which allows the presentation of ideas and possible solutions for a specific problem. This is probably the first published experience in the use of such a technique for the design of an academic curriculum. Taking as a base the general characteristics of the students, the substantive, disciplinary and methodological subjects were chosen. The results showed a need for a multidisciplinary approach based on modern methodologies of statistics and epidemiology. The usefulness of the results of the curricular design and the superiority of this method to reach consensus is also discussed. PMID:1948431

  7. Population- and individual-based approaches to the design and analysis of epidemiologic studies of sexually transmitted disease transmission.

    PubMed

    Shiboski, S; Padian, N S

    1996-10-01

    Epidemiologic studies of sexually transmitted disease (STD) transmission present a number of unique challenges in design and analysis. These arise both from the social nature of STD transmission and from inherent difficulties in collecting accurate and informative data on exposure and infection. Risk of acquiring an STD depends on both individual-level factors and the behavior and infectiousness of others. Consequently, study designs and analysis methods developed for studying chronic disease risk in individuals or groups may not apply directly. Simple models of STD transmission were used to investigate these issues, focusing on how the interplay between individual- and population-level factors influences design and interpretation of epidemiologic studies, with particular attention to interpretation of common measures of association and to common sources of bias in epidemiologic data. Existing methods for investigating risk factors can be modified such that these issues may be addressed directly. PMID:8843249

  8. A method for meta-analysis of epidemiological studies.

    PubMed

    Einarson, T R; Leeder, J S; Koren, G

    1988-10-01

    This article presents a stepwise approach for conducting a meta-analysis of epidemiological studies based on proposed guidelines. This systematic method is recommended for practitioners evaluating epidemiological studies in the literature to arrive at an overall quantitative estimate of the impact of a treatment. Bendectin is used as an illustrative example. Meta-analysts should establish a priori the purpose of the analysis and a complete protocol. This protocol should be adhered to, and all steps performed should be recorded in detail. To aid in developing such a protocol, we present methods the researcher can use to perform each of 22 steps in six major areas. The illustrative meta-analysis confirmed previous traditional narrative literature reviews that Bendectin is not related to teratogenic outcomes in humans. The overall summary odds ratio was 1.01 (chi 2 = 0.05, p = 0.815) with a 95 percent confidence interval of 0.66-1.55. When the studies were separated according to study type, the summary odds ratio for cohort studies was 0.95 with a 95 percent confidence interval of 0.62-1.45. For case-control studies, the summary odds ratio was 1.27 with a 95 percent confidence interval of 0.83-1.94. The corresponding chi-square values were not statistically significant at the p = 0.05 level.
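
A fixed-effect, inverse-variance pooling of study-level odds ratios is one common way to obtain a summary OR with a 95% confidence interval like the one reported above. A minimal sketch (the 2×2 counts in the usage line are invented for illustration):

```python
import math

def pooled_odds_ratio(tables):
    """Inverse-variance (Woolf) fixed-effect pooling of 2x2 tables (a, b, c, d):
    exposed cases, exposed controls, unexposed cases, unexposed controls."""
    wsum = wlog = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        weight = 1.0 / (1 / a + 1 / b + 1 / c + 1 / d)  # inverse of Woolf variance
        wlog += weight * log_or
        wsum += weight
    mean = wlog / wsum
    se = math.sqrt(1.0 / wsum)
    ci = (math.exp(mean - 1.96 * se), math.exp(mean + 1.96 * se))
    return math.exp(mean), ci

# Two hypothetical studies; a summary OR near 1 with a CI spanning 1
# would, as in the Bendectin example, suggest no association.
or_, (lo, hi) = pooled_odds_ratio([(12, 10, 50, 48), (30, 28, 90, 95)])
```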

  9. Control system design method

    DOEpatents

    Wilson, David G.; Robinett, III, Rush D.

    2012-02-21

    A control system design method and concomitant control system comprising representing a physical apparatus to be controlled as a Hamiltonian system, determining elements of the Hamiltonian system representation which are power generators, power dissipators, and power storage devices, analyzing stability and performance of the Hamiltonian system based on the results of the determining step and determining necessary and sufficient conditions for stability of the Hamiltonian system, creating a stable control system based on the results of the analyzing step, and employing the resulting control system to control the physical apparatus.
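
The power-classification idea can be illustrated with a toy example (my sketch, not the patented method's specifics): for a damped oscillator, the mass and spring are power storage elements, the damper is a power dissipator, and with no power generator present the Hamiltonian H = p²/(2m) + kq²/2 must be non-increasing, which a short simulation can check.

```python
def energy_trace(m=1.0, k=1.0, c=0.2, q=1.0, p=0.0, dt=0.001, steps=5000):
    """Semi-implicit Euler for a damped oscillator.
    Storage elements: mass and spring (hold energy). Dissipator: damper c > 0.
    With no generator, H = p^2/(2m) + k q^2 / 2 decays toward zero."""
    energies = []
    for _ in range(steps):
        p += dt * (-k * q - c * p / m)   # spring force plus damping
        q += dt * p / m
        energies.append(p * p / (2 * m) + k * q * q / 2)
    return energies

trace = energy_trace()
# Decaying energy is the stability certificate in this energy-based view.
```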

  10. Blastocystis: Genetic diversity and molecular methods for diagnosis and epidemiology.

    PubMed

    Stensvold, Christen Rune

    2013-01-01

    Blastocystis, an unusual anaerobic, single-celled stramenopile, is a remarkably successful intestinal parasite of a vast array of host species including humans. Fecal Deoxyribonucleic acid (DNA) analysis by nucleic-acid based methods in particular has led to significant advances in Blastocystis diagnostics and research over the past few years enabling accurate identification of carriers and molecular characterization by high discriminatory power. Moreover, Blastocystis comprises a multitude of subtypes (STs) (arguably species) many of which have been identified only recently and molecular epidemiological studies have revealed a significant difference in the distribution of STs across host species and geographical regions. Having a cosmopolitan distribution, the parasite is a common laboratory finding in the stools of individuals with and without intestinal symptoms across the entire globe and while the parasite remains extremely difficult to eradicate and isolate in culture, appropriate molecular tools are now available to resolve important questions such as whether the clinical outcome of colonization is linked to ST and whether Blastocystis is transmitted zoonotically. This review summarizes some of the recent advances in the molecular diagnosis of Blastocystis and gives an introduction to Blastocystis STs, including a recommendation of subtyping methodology based on recent data and method comparisons. A few suggestions for future directions and research areas are given in the light of relevant technological advances and the availability of mitochondrial and nuclear genomes.

  11. The Role of Applied Epidemiology Methods in the Disaster Management Cycle

    PubMed Central

    Malilay, Josephine; Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F.; Schnall, Amy H.; Podgornik, Michelle N.; Cruz, Miguel A.; Horney, Jennifer A.; Zane, David; Roisman, Rachel; Greenspan, Joel R.; Thoroughman, Doug; Anderson, Henry A.; Wells, Eden V.; Simms, Erin F.

    2014-01-01

    Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure. PMID:25211748

  13. Development of the residential case-specular epidemiologic investigation method. Final report

    SciTech Connect

    Zaffanella, L.E.; Savitz, D.A.

    1995-11-01

    The residential case-specular method is an innovative approach to epidemiologic studies of the association between wire codes and childhood cancer. This project was designed to further the development of the residential case-specular method, which seeks to help resolve the "wire code paradox". For years, wire codes have been used as surrogate measures of past electric and magnetic field (EMF) exposure. There is a magnetic field hypothesis that suggests childhood cancer is associated with exposure to magnetic fields, with wire codes as a proxy for these fields. The neighborhood hypothesis suggests that childhood cancer is associated with neighborhood characteristics and exposures other than magnetic fields, with wire codes as a proxy for these characteristics and exposures. The residential case-specular method was designed to discriminate between the magnetic field and the neighborhood hypothesis. Two methods were developed for determining the specular of a residence. These methods were tested with 400 randomly selected residences. The main advantage of the residential case-specular method is that it may efficiently confirm or eliminate the suspicion that control selection bias or confounding by neighborhood factors affected the results of case-control studies of childhood cancer and magnetic fields. The method may be applicable to both past and ongoing studies. The main disadvantage is that the method is untried. Consequently, further work is required to verify its validity and to ensure that sufficient statistical power can be obtained in a cost-effective manner.

  14. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study

    PubMed Central

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Background Many new clinical prediction rules are derived and validated, but the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules’ performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2–4.3) larger than validation studies using cohort design and unclear design. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2–3.1) compared to complete, partial and unclear verification. The summary relative diagnostic odds ratio (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2–3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
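
The diagnostic odds ratio underlying these comparisons is computed from a rule's 2×2 validation table, and a relative DOR then compares design strata. A small sketch (the counts are hypothetical, chosen only to show the direction of the case-control inflation described above):

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP/FN) / (FP/TN): odds of a positive rule result among the
    diseased versus among the non-diseased."""
    return (tp / fn) / (fp / tn)

def relative_dor(dor_a, dor_b):
    """RDOR > 1 means design A yields apparently better performance than B."""
    return dor_a / dor_b

case_control = diagnostic_odds_ratio(80, 10, 20, 90)   # 36.0
cohort = diagnostic_odds_ratio(70, 20, 30, 80)         # ~9.33
```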

  15. Diagnostic electrocardiography in epidemiological studies of Chagas' disease: multicenter evaluation of a standardized method.

    PubMed

    Lázzari, J O; Pereira, M; Antunes, C M; Guimarães, A; Moncayo, A; Chávez Domínguez, R; Hernández Pieretti, O; Macedo, V; Rassi, A; Maguire, J; Romero, A

    1998-11-01

    An electrocardiographic recording method with an associated reading guide, designed for epidemiological studies on Chagas' disease, was tested to assess its diagnostic reproducibility. Six cardiologists from five countries each read 100 electrocardiographic (ECG) tracings, including 30 from chronic chagasic patients, then reread them after an interval of 6 months. The readings were blind, with the tracings numbered randomly for the first reading and renumbered randomly for the second reading. The physicians, all experienced in interpreting ECGs from chagasic patients, followed printed instructions for reading the tracings. Reproducibility of the readings was evaluated using the kappa (κ) index for concordance. The results showed a high degree of interobserver concordance with respect to the diagnosis of normal vs. abnormal tracings (κ = 0.66; SE 0.02). While the interpretations of some categories of ECG abnormalities were highly reproducible, others, especially those having a low prevalence, showed lower levels of concordance. Intraobserver concordance was uniformly higher than interobserver concordance. The findings of this study justify the use by specialists of the recording and reading method proposed for epidemiological studies on Chagas' disease, but warrant caution in the interpretation of some categories of electrocardiographic alterations.
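
Cohen's kappa, the concordance index used here, corrects raw agreement for agreement expected by chance. A minimal version for two readers (the example labels and ratings are invented):

```python
def cohens_kappa(reader_a, reader_b):
    """Cohen's kappa for two readers rating the same set of tracings."""
    assert len(reader_a) == len(reader_b)
    n = len(reader_a)
    observed = sum(a == b for a, b in zip(reader_a, reader_b)) / n
    # Chance agreement: product of each reader's marginal category frequencies.
    expected = sum((reader_a.count(cat) / n) * (reader_b.count(cat) / n)
                   for cat in set(reader_a) | set(reader_b))
    return (observed - expected) / (1.0 - expected)

# "N" = normal, "A" = abnormal; 3 of 4 tracings read the same way.
kappa = cohens_kappa(["N", "N", "A", "A"], ["N", "A", "A", "A"])   # 0.5
```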

  16. [Occupational epidemiology].

    PubMed

    Ahrens, W; Behrens, T; Mester, B; Schmeisser, N

    2008-03-01

    The aim of occupational epidemiology is to describe workplace-related diseases and to identify their underlying causes. Its primary goal is to protect workers from hazardous effects of the working process by applying work-related primary and secondary prevention measures. To assess health risks different study designs and a wide array of complex study instruments and methods are frequently employed that cannot be replaced by toxicological investigations. This paper primarily addresses health risks by agent exposures. In this context a central task of occupational epidemiology is careful assessment of exposure. Different data sources, such as work site measurements, register data, archive material, experts' opinion, and the workers' personal estimates of exposure may be used during this process. In addition, biological markers can complement exposure assessment. Since thorough occupational epidemiologic studies allow assessment of disease risks under realistic exposure conditions, their results should be more frequently used to derive workplace-related threshold limit values. PMID:18311483

  17. Aircraft digital control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Variations in design methods for aircraft digital flight control are evaluated and compared. The methods fall into two categories; those where the design is done in the continuous domain (or s plane) and those where the design is done in the discrete domain (or z plane). Design method fidelity is evaluated by examining closed loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps except the uncompensated s plane design method which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
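
The s-plane versus z-plane distinction can be made concrete with the bilinear (Tustin) map, one standard way to carry a continuous design into the discrete domain (a generic illustration of the idea, not the report's specific procedure):

```python
def tustin_pole(s_pole: complex, T: float) -> complex:
    """Map a continuous-time pole to the z-plane via the bilinear transform
    s = (2/T)(z - 1)/(z + 1), i.e. z = (2 + s*T) / (2 - s*T)."""
    return (2 + s_pole * T) / (2 - s_pole * T)

# A stable continuous pole (Re s < 0) lands inside the unit circle for any T;
# faster sampling (smaller T) pushes it toward z = 1.
z_fast = tustin_pole(-1.0 + 0j, 0.05)   # sample rate 20 cps
z_slow = tustin_pole(-1.0 + 0j, 0.5)    # sample rate 2 cps
```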

  18. Alternative approaches to analytical designs in occupational injury epidemiology.

    PubMed

    Mittleman, M A; Maldonado, G; Gerberich, S G; Smith, G S; Sorock, G S

    1997-08-01

    In this paper, we discuss the theoretical framework upon which observational studies of occupational injuries are based. Following a general description of how causal effects are estimated, the challenges faced by researchers working in this area are outlined, with an emphasis on case-control studies. These challenges include defining the at-risk period for workers whose tasks change over time and whose hazard period may be very brief, evaluating the underreporting of both exposures and injuries, and considering the effects of multiple injuries per individual on study design and data analysis. We review both the theoretical and practical considerations in the design and conduct of traditional case-control studies, based on the collection of individual level data, as well as other approaches, such as using information culled from administrative and descriptive databases, and case-control studies in which the plant or work site is the unit of analysis. The case-crossover design is also reviewed and its utility for reducing confounding due to differences between individuals by self-matching is highlighted. While this design has not yet been applied to the work setting, its potential for increasing our understanding of the causes of acute-onset occupational injuries seems promising. Finally, a variety of hybrid designs are discussed, including combinations of case-control, case-crossover, and cohort designs. PMID:9215435

  19. Concordance and discordance of sequence survey methods for molecular epidemiology

    PubMed Central

    Hasan, Nur A.; Cebula, Thomas A.; Colwell, Rita R.; Robison, Richard A.; Johnson, W. Evan; Crandall, Keith A.

    2015-01-01

    The post-genomic era is characterized by the direct acquisition and analysis of genomic data with many applications, including the enhancement of the understanding of microbial epidemiology and pathology. However, there are a number of molecular approaches to survey pathogen diversity, and the impact of these different approaches on parameter estimation and inference are not entirely clear. We sequenced whole genomes of bacterial pathogens, Burkholderia pseudomallei, Yersinia pestis, and Brucella spp. (60 new genomes), and combined them with 55 genomes from GenBank to address how different molecular survey approaches (whole genomes, SNPs, and MLST) impact downstream inferences on molecular evolutionary parameters, evolutionary relationships, and trait character associations. We selected isolates for sequencing to represent temporal, geographic origin, and host range variability. We found that substitution rate estimates vary widely among approaches, and that SNP and genomic datasets yielded different but strongly supported phylogenies. MLST yielded poorly supported phylogenies, especially in our low diversity dataset, i.e., Y. pestis. Trait associations showed that B. pseudomallei and Y. pestis phylogenies are significantly associated with geography, irrespective of the molecular survey approach used, while Brucella spp. phylogeny appears to be strongly associated with geography and host origin. We contrast inferences made among monomorphic (clonal) and non-monomorphic bacteria, and between intra- and inter-specific datasets. We also discuss our results in light of underlying assumptions of different approaches. PMID:25737810

  20. Stochastic Methods for Aircraft Design

    NASA Technical Reports Server (NTRS)

    Pelz, Richard B.; Ogot, Madara

    1998-01-01

    The global stochastic optimization method, simulated annealing (SA), was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations for an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems including: low sonic-boom bodies, minimum drag on supersonic fore-bodies, minimum drag on supersonic aeroelastic fore-bodies, minimum drag on HSCT aeroelastic wings, FLOPS preliminary design code, another preliminary aircraft design study with vortex lattice aerodynamics, and HSR complete aircraft aerodynamics. In every case, SA provided a simple, robust and reliable optimization method which found optimal designs in roughly 100 objective function evaluations. Perhaps most importantly, technology from this academic/industrial project has been successfully transferred: the method is now the method of choice for optimization problems at Northrop Grumman.
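
The core of simulated annealing is a Metropolis acceptance rule under a decreasing temperature, which lets the search escape local minima in rough objective landscapes. A minimal sketch (the toy objective and parameters are mine, not a CFD cost function):

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95, iters=500, seed=1):
    """Minimise f over the reals: always accept improving moves, and accept
    worsening moves with probability exp(-delta / T)."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling   # cooling schedule: the walk turns greedy as T -> 0
    return best_x, best_f

# Rough objective with many local minima; the global basin lies near x = 0.
rough = lambda x: x * x + 0.5 * math.sin(10.0 * x)
x_best, f_best = simulated_annealing(rough, x0=4.0)
```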

  1. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies

    PubMed Central

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-01-01

    Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement process has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL with lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching-approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL-values. This might result in spurious or lost associations in epidemiological studies under certain circumstances. PMID:27138987
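
Relative telomere length from qPCR is conventionally expressed as a T/S ratio against a calibrator sample. The ΔΔCt form below is that common convention, not necessarily this study's exact pipeline, and the Ct values in the example are invented:

```python
def relative_telomere_length(ct_t, ct_s, cal_ct_t, cal_ct_s):
    """T/S ratio by 2^-ddCt: telomere (T) vs. single-copy gene (S) Ct values,
    normalised to a calibrator sample run alongside the study samples."""
    ddct = (ct_t - ct_s) - (cal_ct_t - cal_ct_s)
    return 2.0 ** (-ddct)

# A sample whose telomere signal crosses threshold one cycle earlier than the
# calibrator (same single-copy signal) has twice the relative telomere length,
# which is why systematic Ct shifts from DNA extraction bias RTL directly.
rtl = relative_telomere_length(14.0, 20.0, 15.0, 20.0)   # 2.0
```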

  3. Overview of molecular typing methods for outbreak detection and epidemiological surveillance.

    PubMed

    Sabat, A J; Budimir, A; Nashev, D; Sá-Leão, R; van Dijl, J M; Laurent, F; Grundmann, H; Friedrich, A W

    2013-01-01

    Typing methods for discriminating different bacterial isolates of the same species are essential epidemiological tools in infection prevention and control. Traditional typing systems based on phenotypes, such as serotype, biotype, phage-type, or antibiogram, have been used for many years. However, more recent methods that examine the relatedness of isolates at a molecular level have revolutionised our ability to differentiate among bacterial types and subtypes. Importantly, the development of molecular methods has provided new tools for enhanced surveillance and outbreak detection. This has resulted in better implementation of rational infection control programmes and efficient allocation of resources across Europe. The emergence of benchtop sequencers using next generation sequencing technology makes bacterial whole genome sequencing (WGS) feasible even in small research and clinical laboratories. WGS has already been used for the characterisation of bacterial isolates in several large outbreaks in Europe and, in the near future, is likely to replace currently used typing methodologies due to its ultimate resolution. However, WGS is still too laborious and time-consuming to obtain useful data in routine surveillance. Also, a largely unresolved question is how genome sequences must be examined for epidemiological characterisation. In the coming years, the lessons learnt from currently used molecular methods will allow us to condense the WGS data into epidemiologically useful information. On this basis, we have reviewed current and new molecular typing methods for outbreak detection and epidemiological surveillance of bacterial pathogens in clinical practice, aiming to give an overview of their specific advantages and disadvantages. PMID:23369389

  4. Imputation method for lifetime exposure assessment in air pollution epidemiologic studies

    PubMed Central

    2013-01-01

    against health data should be done as a function of PDI to check for consistency of results. The 1% of study subjects who lived for long durations near heavily trafficked intersections, had very high cumulative exposures. Thus, imputation methods must be designed to reproduce non-standard distributions. Conclusions Our approach meets a number of methodological challenges to extending historical exposure reconstruction over a lifetime and shows promise for environmental epidemiology. Application to assessment of breast cancer risks will be reported in a subsequent manuscript. PMID:23919666

  5. Design method of supercavitating pumps

    NASA Astrophysics Data System (ADS)

    Kulagin, V.; Likhachev, D.; Li, F. C.

    2016-05-01

    The problem of effective supercavitating (SC) pump is solved, and optimum load distribution along the radius of the blade is found taking into account clearance, degree of cavitation development, influence of finite number of blades, and centrifugal forces. Sufficient accuracy can be obtained using the equivalent flat SC-grid for design of any SC-mechanisms, applying the “grid effect” coefficient and substituting the skewed flow calculated for grids of flat plates with the infinite attached cavitation caverns. This article gives the universal design method and provides an example of SC-pump design.

  6. Is there epidemiology in Russia?

    PubMed Central

    Vlassov, V.

    2000-01-01

    OBJECTIVE—To examine the current state of epidemiology in Russia.
DESIGN—The structure of clinical research and the statistical methods used were examined to shed light on epidemiology in Russia. The frequencies of specific study designs were evaluated using Medline data for 1970-1997. To determine the proportion of clinical studies with advanced designs, the frequency of cohort, prospective, follow up, or longitudinal studies, and of controlled trials, was evaluated. All diagnosis-related studies were identified to determine the usage of an advanced statistical technique (ROC analysis). The adequacy of Medline information was checked by hand search of journals. All dissertations in epidemiology defended in Russia in 1995 and 1996 were evaluated for their methodology. The curriculum recommended by the Ministry of Health to medical universities was evaluated. Available literature and library indexing of epidemiological terms were examined.
MAIN RESULTS—Russian medical research makes comparatively infrequent use of advanced study designs and methods of data analysis. Medical students are taught epidemiology as a science of the spread of infectious diseases. There is no department in Russian universities where epidemiology is taught in the modern sense, and no epidemiological or biostatistical periodicals are available in Russia.
CONCLUSION—Epidemiology in Russia remains in an archaic state, understood as the science of the spread of infectious diseases, and this is detrimental to the methodology of medical research in Russia.


Keywords: Soviet Union; Russia; study design; comparative studies PMID:10990475

  7. Outcome modelling strategies in epidemiology: traditional methods and basic alternatives.

    PubMed

    Greenland, Sander; Daniel, Rhian; Pearce, Neil

    2016-04-01

    Controlling for too many potential confounders can lead to or aggravate problems of data sparsity or multicollinearity, particularly when the number of covariates is large in relation to the study size. As a result, methods to reduce the number of modelled covariates are often deployed. We review several traditional modelling strategies, including stepwise regression and the 'change-in-estimate' (CIE) approach to deciding which potential confounders to include in an outcome-regression model for estimating effects of a targeted exposure. We discuss their shortcomings, and then provide some basic alternatives and refinements that do not require special macros or programming. Throughout, we assume the main goal is to derive the most accurate effect estimates obtainable from the data and commercial software. Allowing that most users must stay within standard software packages, this goal can be roughly approximated using basic methods to assess, and thereby minimize, mean squared error (MSE).
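The change-in-estimate (CIE) screen the authors review can be sketched in a few lines: a candidate confounder is retained when adjusting for it moves the exposure coefficient by more than a chosen relative threshold. The sketch below is illustrative only — it assumes a linear outcome model and the conventional 10% threshold, and all data and function names are invented, not taken from the paper.

```python
# Minimal pure-stdlib sketch of change-in-estimate confounder selection.
# A candidate covariate is kept when adjusting for it changes the
# exposure coefficient by more than `threshold` (relative change).

def ols(rows, y):
    """Least-squares coefficients (intercept first) via normal equations."""
    n, p = len(rows), len(rows[0]) + 1
    X = [[1.0] + list(r) for r in rows]
    # Build X'X b = X'y, then solve by Gaussian elimination with pivoting.
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(p)]
         for a in range(p)]
    v = [sum(X[i][a] * y[i] for i in range(n)) for a in range(p)]
    for c in range(p):
        piv = max(range(c, p), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]
        v[c], v[piv] = v[piv], v[c]
        for r in range(c + 1, p):
            f = A[r][c] / A[c][c]
            for k in range(c, p):
                A[r][k] -= f * A[c][k]
            v[r] -= f * v[c]
    b = [0.0] * p
    for r in range(p - 1, -1, -1):
        b[r] = (v[r] - sum(A[r][k] * b[k] for k in range(r + 1, p))) / A[r][r]
    return b

def cie_select(exposure, candidates, y, threshold=0.10):
    """Indices of candidates whose adjustment shifts the exposure
    coefficient by more than `threshold` relative to the crude estimate."""
    crude = ols([[e] for e in exposure], y)[1]
    keep = []
    for j, cov in enumerate(candidates):
        adjusted = ols(list(zip(exposure, cov)), y)[1]
        if abs(adjusted - crude) / abs(crude) > threshold:
            keep.append(j)
    return keep

# Invented illustration: c1 confounds the exposure-outcome relation, c2 does not.
x  = [0, 1, 2, 3, 4, 5, 6, 7]
c1 = [0, 1, 1, 2, 2, 3, 3, 4]
c2 = [1, 0, 1, 0, 1, 0, 1, 0]
y  = [2 * xi + 3 * ci for xi, ci in zip(x, c1)]
print(cie_select(x, [c1, c2], y))  # -> [0]: only c1 passes the CIE screen
```

As the abstract notes, this kind of one-covariate-at-a-time screening has shortcomings (it ignores joint effects and MSE); the sketch is only meant to make the mechanics of the CIE rule concrete.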

  8. Outcome modelling strategies in epidemiology: traditional methods and basic alternatives

    PubMed Central

    Greenland, Sander; Daniel, Rhian; Pearce, Neil

    2016-01-01

    Controlling for too many potential confounders can lead to or aggravate problems of data sparsity or multicollinearity, particularly when the number of covariates is large in relation to the study size. As a result, methods to reduce the number of modelled covariates are often deployed. We review several traditional modelling strategies, including stepwise regression and the ‘change-in-estimate’ (CIE) approach to deciding which potential confounders to include in an outcome-regression model for estimating effects of a targeted exposure. We discuss their shortcomings, and then provide some basic alternatives and refinements that do not require special macros or programming. Throughout, we assume the main goal is to derive the most accurate effect estimates obtainable from the data and commercial software. Allowing that most users must stay within standard software packages, this goal can be roughly approximated using basic methods to assess, and thereby minimize, mean squared error (MSE). PMID:27097747

  9. Discriminatory Indices of Typing Methods for Epidemiologic Analysis of Contemporary Staphylococcus aureus Strains

    PubMed Central

    Rodriguez, Marcela; Hogan, Patrick G.; Satola, Sarah W.; Crispell, Emily; Wylie, Todd; Gao, Hongyu; Sodergren, Erica; Weinstock, George M.; Burnham, Carey-Ann D.; Fritz, Stephanie A.

    2015-01-01

    Abstract Historically, a number of typing methods have been evaluated for Staphylococcus aureus strain characterization. The emergence of contemporary strains of community-associated S. aureus, and the ensuing epidemic with a predominant strain type (USA300), necessitates re-evaluation of the discriminatory power of these typing methods for discerning molecular epidemiology and transmission dynamics, essential to investigations of hospital and community outbreaks. We compared the discriminatory index of 5 typing methods for contemporary S. aureus strain characterization. Children presenting to St. Louis Children's Hospital and community pediatric practices in St. Louis, Missouri (MO), with community-associated S. aureus infections were enrolled. Repetitive sequence-based PCR (repPCR), pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), staphylococcal protein A (spa), and staphylococcal cassette chromosome (SCC) mec typing were performed on 200 S. aureus isolates. The discriminatory index of each method was calculated using the standard formula for this metric, where a value of 1 is highly discriminatory and a value of 0 is not discriminatory. Overall, we identified 26 distinct strain types by repPCR, 17 strain types by PFGE, 30 strain types by MLST, 68 strain types by spa typing, and 5 strain types by SCCmec typing. RepPCR had the highest discriminatory index (D) of all methods (D = 0.88), followed by spa typing (D = 0.87), MLST (D = 0.84), PFGE (D = 0.76), and SCCmec typing (D = 0.60). The method with the highest D among MRSA isolates was repPCR (D = 0.64) followed by spa typing (D = 0.45) and MLST (D = 0.44). The method with the highest D among MSSA isolates was spa typing (D = 0.98), followed by MLST (D = 0.93), repPCR (D = 0.92), and PFGE (D = 0.89). Among isolates designated USA300 by PFGE, repPCR was most discriminatory, with 10 distinct strain types identified (D = 0.63). We
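The "standard formula" the abstract refers to for a typing method's discriminatory index is conventionally Simpson's index of diversity as adapted for typing systems: D = 1 − Σ nⱼ(nⱼ−1) / (N(N−1)), the probability that two isolates drawn at random receive different strain types. A sketch with invented isolate counts (not the study's data):

```python
# Discriminatory index of a typing method from its strain-type assignments.
# D = 1 means every pair of isolates is distinguished; D = 0 means none are.
from collections import Counter

def discriminatory_index(type_assignments):
    """Probability that two randomly drawn isolates get different types."""
    counts = Counter(type_assignments)
    n = len(type_assignments)
    if n < 2:
        raise ValueError("need at least two isolates")
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Invented example: 10 isolates resolved into 3 strain types
isolates = ["A"] * 5 + ["B"] * 3 + ["C"] * 2
print(round(discriminatory_index(isolates), 3))  # -> 0.689
```

A method that lumps most isolates into one predominant type (as PFGE does with USA300) scores low on this index, which is exactly the phenomenon the study quantifies.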

  10. Statistical methods in public health and epidemiology: a look at the recent past and projections for the next decade.

    PubMed

    Levy, P S; Stolte, K

    2000-02-01

    This article attempts to prognosticate from past patterns, the type of statistical methods that will be used in published public health and epidemiological studies in the decade that follows the millennium. With this in mind, we conducted a study that would characterize trends in use of statistical methods in two major public health journals: the American Journal of Public Health, and the American Journal of Epidemiology. We took a probability sample of 348 articles published in these journals between 1970 and 1998. For each article sampled, we abstracted information on the design of the study and the types of statistical methods used in the article. Our major findings are that the proportion of articles using statistical methods as well as the mean number of statistical methods used per article has increased dramatically over the three decades surveyed. Also, the proportion of published articles using study designs that we classified as analytic has increased over the years. We also examined patterns of use in these journals of three statistical methodologies: logistic regression, proportional hazards regression, and methods for analysis of data from complex sample surveys. These methods were selected because they had been introduced initially in the late 1960s or early 1970s and had made considerable impact on data analysis in the biomedical sciences in the 1970s-90s. Estimated usage of each of these techniques remained relatively low until user-friendly software became available. Our overall conclusions are that new statistical methods are developed on the basis of need, disseminated to potential users over a course of many years, and often do not reach maximum use until tools for their comfortable use are made readily available to potential users. Based on these conclusions, we identify certain needs that are not now being met and which are likely to generate new statistical methodologies that we will see in the next decade.

  11. Comparing Two Epidemiologic Surveillance Methods to Assess Underestimation of Human Stampedes in India

    PubMed Central

    Ngai, Ka Ming; Lee, Wing Yan; Madan, Aditi; Sanyal, Saswata; Roy, Nobhojit; Burkle, Frederick M.; Hsu, Edbert B.

    2013-01-01

    Background: Two separate but complementary epidemiologic surveillance methods for human stampedes have emerged since the publication of the topic in 2009. The objective of this study is to estimate the degree of underreporting in India. Method: The Ngai Search Method was compared to the Roy Search Method for human stampede events occurring in India between 2001 and 2010. Results: A total of 40 stampedes were identified by both search methods. Using the Ngai method, 34 human stampedes were identified. Using a previously defined stampede scale: 2 events were class I, 21 events were class II, 8 events were class III, and 3 events were class IV. The median deaths were 5.5 per event and median injuries were 13.5 per event. Using the Roy method, 27 events were identified, including 9 events that were not identified by the Ngai method. After excluding events based on exclusion criteria, six additional events identified by the Roy’s method had a median of 4 deaths and 30 injuries. In multivariate analysis using the Ngai method, religious (6.52, 95%CI 1.73-24.66, p=0.006) and political (277.09, 95%CI 5.12-15,001.96, p=0.006) events had higher relative number of deaths. Conclusion: Many causes accounting for the global increase in human stampede events can only be elucidated through systematic epidemiological investigation. Focusing on a country with a high recurrence of human stampedes, we compare two independent methods of data abstraction in an effort to improve the existing database and to identify pertinent risk factors. We concluded that our previous publication underestimated stampede events in India by approximately 18% and an international standardized database to systematically record occurrence of human stampedes is needed to facilitate understanding of the epidemiology of human stampedes. PMID:24077300

  12. Rationale and Design of the International Lymphoma Epidemiology Consortium (InterLymph) Non-Hodgkin Lymphoma Subtypes Project

    PubMed Central

    Morton, Lindsay M.; Sampson, Joshua N.; Cerhan, James R.; Turner, Jennifer J.; Vajdic, Claire M.; Wang, Sophia S.; Smedby, Karin E.; de Sanjosé, Silvia; Monnereau, Alain; Benavente, Yolanda; Bracci, Paige M.; Chiu, Brian C. H.; Skibola, Christine F.; Zhang, Yawei; Mbulaiteye, Sam M.; Spriggs, Michael; Robinson, Dennis; Norman, Aaron D.; Kane, Eleanor V.; Spinelli, John J.; Kelly, Jennifer L.; Vecchia, Carlo La; Dal Maso, Luigino; Maynadié, Marc; Kadin, Marshall E.; Cocco, Pierluigi; Costantini, Adele Seniori; Clarke, Christina A.; Roman, Eve; Miligi, Lucia; Colt, Joanne S.; Berndt, Sonja I.; Mannetje, Andrea; de Roos, Anneclaire J.; Kricker, Anne; Nieters, Alexandra; Franceschi, Silvia; Melbye, Mads; Boffetta, Paolo; Clavel, Jacqueline; Linet, Martha S.; Weisenburger, Dennis D.; Slager, Susan L.

    2014-01-01

    Background Non-Hodgkin lymphoma (NHL), the most common hematologic malignancy, consists of numerous subtypes. The etiology of NHL is incompletely understood, and increasing evidence suggests that risk factors may vary by NHL subtype. However, small numbers of cases have made investigation of subtype-specific risks challenging. The International Lymphoma Epidemiology Consortium therefore undertook the NHL Subtypes Project, an international collaborative effort to investigate the etiologies of NHL subtypes. This article describes in detail the project rationale and design. Methods We pooled individual-level data from 20 case-control studies (17471 NHL cases, 23096 controls) from North America, Europe, and Australia. Centralized data harmonization and analysis ensured standardized definitions and approaches, with rigorous quality control. Results The pooled study population included 11 specified NHL subtypes with more than 100 cases: diffuse large B-cell lymphoma (N = 4667), follicular lymphoma (N = 3530), chronic lymphocytic leukemia/small lymphocytic lymphoma (N = 2440), marginal zone lymphoma (N = 1052), peripheral T-cell lymphoma (N = 584), mantle cell lymphoma (N = 557), lymphoplasmacytic lymphoma/Waldenström macroglobulinemia (N = 374), mycosis fungoides/Sézary syndrome (N = 324), Burkitt/Burkitt-like lymphoma/leukemia (N = 295), hairy cell leukemia (N = 154), and acute lymphoblastic leukemia/lymphoma (N = 152). Associations with medical history, family history, lifestyle factors, and occupation for each of these 11 subtypes are presented in separate articles in this issue, with a final article quantitatively comparing risk factor patterns among subtypes. Conclusions The International Lymphoma Epidemiology Consortium NHL Subtypes Project provides the largest and most comprehensive investigation of potential risk factors for a broad range of common and rare NHL subtypes to date. The analyses contribute to our understanding of the multifactorial nature of NHL

  13. Endodontic Epidemiology

    PubMed Central

    Shahravan, Arash; Haghdoost, Ali Akbar

    2014-01-01

    Epidemiology is the study of disease distribution and factors determining or affecting it. Likewise, endodontic epidemiology can be defined as the science of studying the distribution pattern and determinants of pulp and periapical diseases; specially apical periodontitis. Although different study designs have been used in endodontics, researchers must pay more attention to study designs with higher level of evidence such as randomized clinical trials. PMID:24688577

  14. Empirical Evidence of Study Design Biases in Randomized Trials: Systematic Review of Meta-Epidemiological Studies

    PubMed Central

    Page, Matthew J.; Higgins, Julian P. T.; Clayton, Gemma; Sterne, Jonathan A. C.; Hróbjartsson, Asbjørn; Savović, Jelena

    2016-01-01

    Objective To synthesise evidence on the average bias and heterogeneity associated with reported methodological features of randomized trials. Design Systematic review of meta-epidemiological studies. Methods We retrieved eligible studies included in a recent AHRQ-EPC review on this topic (latest search September 2012), and searched Ovid MEDLINE and Ovid EMBASE for studies indexed from January 2012 to May 2015. Data were extracted by one author and verified by another. We combined estimates of average bias (e.g. ratio of odds ratios (ROR) or difference in standardised mean differences (dSMD)) in meta-analyses using the random-effects model. Analyses were stratified by type of outcome (“mortality” versus “other objective” versus “subjective”). The direction of effect was standardised so that ROR < 1 and dSMD < 0 denote a larger intervention effect estimate in trials with an inadequate or unclear (versus adequate) characteristic. Results We included 24 studies. The available evidence suggests that intervention effect estimates may be exaggerated in trials with inadequate/unclear (versus adequate) sequence generation (ROR 0.93, 95% CI 0.86 to 0.99; 7 studies) and allocation concealment (ROR 0.90, 95% CI 0.84 to 0.97; 7 studies). For these characteristics, the average bias appeared to be larger in trials of subjective outcomes than in trials of other objective outcomes. Intervention effects for subjective outcomes also appear to be exaggerated in trials with lack of/unclear blinding of participants (versus blinding) (dSMD -0.37, 95% CI -0.77 to 0.04; 2 studies), lack of/unclear blinding of outcome assessors (ROR 0.64, 95% CI 0.43 to 0.96; 1 study) and lack of/unclear double blinding (ROR 0.77, 95% CI 0.61 to 0.93; 1 study). The influence of other characteristics (e.g. unblinded trial personnel, attrition) is unclear. Conclusions Certain characteristics of randomized trials may exaggerate intervention effect estimates. The average bias appears to be greatest in trials of
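The pooling step described here — combining per-study RORs under a random-effects model — is commonly done with the DerSimonian-Laird estimator on the log scale. The sketch below is a generic illustration of that estimator, not the authors' code; the three study estimates are invented.

```python
# DerSimonian-Laird random-effects pooling of log-RORs, a common
# estimator for the "combined in meta-analyses using the random-effects
# model" step. Inputs are per-study log estimates and standard errors.
import math

def dersimonian_laird(log_ests, ses):
    """Pooled ROR and 95% CI (back-transformed from the log scale)."""
    w = [1.0 / se ** 2 for se in ses]              # fixed-effect weights
    k = len(log_ests)
    fixed = sum(wi * yi for wi, yi in zip(w, log_ests)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_ests))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)             # between-study variance
    w_re = [1.0 / (se ** 2 + tau2) for se in ses]
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_ests)) / sum(w_re)
    se_pooled = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Three hypothetical meta-epidemiological studies: (log-ROR, SE)
ror, lo, hi = dersimonian_laird([-0.08, -0.05, -0.12], [0.03, 0.04, 0.05])
```

With ROR < 1 standardised to mean a larger effect in inadequate/unclear trials, a pooled ROR below 1 (with a CI excluding 1) is the signature of average bias reported in the Results.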

  15. Comparison of methods of extracting information for meta-analysis of observational studies in nutritional epidemiology

    PubMed Central

    2016-01-01

    OBJECTIVES: A common method for conducting a quantitative systematic review (QSR) of observational studies in nutritional epidemiology is the “highest versus lowest intake” method (HLM), in which only the effect size (ES) information for the highest intake category of a food item, relative to its lowest category, is collected. In the interval collapsing method (ICM), by contrast, a method proposed to make maximum use of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. METHODS: A QSR evaluating citrus fruit intake and the risk of pancreatic cancer, with the SES calculated using the HLM, was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. RESULTS: No significant differences were observed in the directionality of the SES extracted using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more consistent ES and SES, and narrower confidence intervals, than the HLM. CONCLUSIONS: The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSRs in nutritional epidemiology. Its application should hence be recommended for future studies. PMID:26797219

  16. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation

    PubMed Central

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

    Among the methods developed to detect H. pylori infection, the choice of a gold standard remains debatable, especially for epidemiological studies. Owing to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), the urea breath test (UBT), and the stool antigen test (SAT), have been developed to diagnose H. pylori infection. Among the indirect tests, the UBT and SAT have become the best methods for determining active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicate that no single test can be considered the gold standard for the diagnosis of H. pylori infection and that the advantages and disadvantages of each method should be considered. Based on four epidemiological studies, culture and RUT present sensitivities of 74.2–90.8% and 83.3–86.9% and specificities of 97.7–98.8% and 95.1–97.2%, respectively, when IHC is used as the gold standard. The sensitivity of serology is quite high, but that of the urine test is lower than that of the other methods. Thus, validation of indirect tests is important, even though some commercial kits propose universal cut-off values. PMID:26904678
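The validation figures quoted (sensitivity and specificity of an index test against IHC as the gold standard) reduce to 2×2-table arithmetic. A sketch with invented counts, purely to make the definitions concrete:

```python
# Sensitivity and specificity of an index test (e.g. RUT) cross-tabulated
# against a gold standard (e.g. IHC). Counts below are invented.

def sensitivity_specificity(tp, fp, fn, tn):
    """(sensitivity, specificity): TP/(TP+FN) and TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical table: 96 IHC-positive and 104 IHC-negative subjects
sens, spec = sensitivity_specificity(tp=83, fp=4, fn=13, tn=100)
```

This is why the gold-standard choice matters: the same index-test results yield different sensitivity and specificity depending on which reference classifies subjects as truly infected.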

  17. Meticillin-resistant Staphylococcus aureus (MRSA): global epidemiology and harmonisation of typing methods.

    PubMed

    Stefani, Stefania; Chung, Doo Ryeon; Lindsay, Jodi A; Friedrich, Alex W; Kearns, Angela M; Westh, Henrik; Mackenzie, Fiona M

    2012-04-01

    This article reviews recent findings on the global epidemiology of healthcare-acquired/associated (HA), community-acquired/associated (CA) and livestock-associated (LA) meticillin-resistant Staphylococcus aureus (MRSA) and aims to reach a consensus regarding the harmonisation of typing methods for MRSA. MRSA rates continue to increase rapidly in many regions and there is a dynamic spread of strains across the globe. HA-MRSA is currently endemic in hospitals in most regions. CA-MRSA clones have been spreading rapidly in the community and also infiltrating healthcare in many regions worldwide. To date, LA-MRSA is only prevalent in certain high-risk groups of workers in direct contact with live animals. CA-MRSA and LA-MRSA have become a challenge for countries that have so far maintained low rates of MRSA. These evolutionary changes have resulted in MRSA continuing to be a major threat to public health. Continuous efforts to understand the changing epidemiology of S. aureus infection in humans and animals are therefore necessary, not only for appropriate antimicrobial treatment and effective infection control but also to monitor the evolution of the species. The group made several consensus decisions with regard to harmonisation of typing methods. A stratified, three-level organisation of testing laboratories was proposed: local; regional; and national. The functions of, and testing methodology used by, each laboratory were defined. The group consensus was to recommend spa and staphylococcal cassette chromosome mec (SCCmec) typing as the preferred methods. Both are informative in defining particular strain characteristics and utilise standardised nomenclatures, making them applicable globally. Effective communication between each of the different levels and between national centres was viewed as being crucial to inform and monitor the molecular epidemiology of MRSA at national and international levels.

  18. Epidemiologic Treatment in Venereal Disease—A Method to Aid in VD Control

    PubMed Central

    Mitchell, Ellis N.

    1972-01-01

    Epidemiologic therapy refers to the treatment of infectious syphilis or gonorrhea contacts without proof of laboratory diagnosis. This method of treatment is considered essential by public health authorities in the management of venereal disease, but has long been neglected in the private sector of medicine. The majority of venereal disease patients are treated by private practitioners, but apathetic attitudes, insufficient training, lack of case reporting, differing and often inadequate treatment schedules, poor follow-up and ignorance about or reluctance to use epi-treatment are all factors in our losing struggle against the current venereal disease epidemic. PMID:4635397

  19. [Secondary adentia and dental implantation (epidemiological and sociological study by a telephone interview method)].

    PubMed

    Fediaev, I M; Khamadeeva, A M; Nikol'skiĭ, V Iu; Ganzha, I R

    2004-01-01

    An epidemiological study of partial secondary adentia was carried out by telephone interviews with the population of the city of Samara (1,104 persons were interviewed). The prevalence and intensity of the disease in various age groups, as well as average indices for the whole adult population, were established. The proportion of persons requiring dental prosthetics among patients with dentition defects was determined. The same method was used to investigate the population's awareness of dental implantation and to assess public opinion on treatment with the use of implants.

  20. A practical method for use in epidemiological studies on enamel hypomineralisation.

    PubMed

    Ghanim, A; Elfrink, M; Weerheijm, K; Mariño, R; Manton, D

    2015-06-01

    With the development of the European Academy of Paediatric Dentistry (EAPD) judgment criteria, there has been increasing interest worldwide in investigating the prevalence of demarcated opacities in tooth enamel, known as molar-incisor hypomineralisation (MIH). However, the lack of a standardised system for recording MIH data in epidemiological surveys has contributed greatly to the wide variation in prevalence reported between studies. The present publication describes the rationale, development, and content of a scoring method for MIH diagnosis in epidemiological studies as well as in clinic- and hospital-based studies. The proposed grading method allows separate classification of demarcated hypomineralisation lesions and other enamel defects resembling MIH. It yields an informative description of the severity of MIH-affected teeth in terms of the stage of visible enamel destruction and the area of tooth surface affected (i.e., lesion clinical status and extent, respectively). In order to preserve the maximum amount of information from a clinical examination while permitting direct comparisons between prevalence studies, two forms of charting are proposed: a short form for simple screening surveys, and a long form desirable for prospective, longitudinal observational research in which aetiological factors in demarcated lesions are investigated in tandem with lesion distribution. Validation of the grading method is required, and its reliability and usefulness need to be tested in different age groups and populations.

  1. Experimental design methods for bioengineering applications.

    PubMed

    Keskin Gündoğdu, Tuğba; Deniz, İrem; Çalışkan, Gülizar; Şahin, Erdem Sefa; Azbar, Nuri

    2016-01-01

    Experimental design is a form of process analysis in which certain factors are selected to obtain the desired responses of interest. It may also be used to determine the effects of various independent factors on a dependent factor. The bioengineering discipline includes many different areas of scientific interest, and each study area is affected and governed by many different factors. Briefly analyzing the important factors and selecting an experimental design for optimization are very effective tools for the design of any bioprocess in question. This review summarizes experimental design methods that can be used to investigate various factors relating to bioengineering processes. The experimental design methods generally used in bioengineering are as follows: full factorial design, fractional factorial design, Plackett-Burman design, Taguchi design, Box-Behnken design and central composite design. These design methods are briefly introduced, and their application to the study of different bioengineering processes is then analyzed.
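Two of the designs listed — the two-level full factorial and a half-fraction derived from it — can be sketched directly. The sketch below uses coded −1/+1 levels and generates the half-fraction by aliasing the last factor with the product of the others (C = AB for three factors); factor names and run counts are illustrative.

```python
# Two-level full factorial design and a half-fraction of it.
from itertools import product

def full_factorial(n_factors):
    """All 2**n combinations of coded levels -1/+1."""
    return [list(run) for run in product((-1, 1), repeat=n_factors)]

def half_fraction(n_factors):
    """2**(n-1) runs: the last factor is set to the product of the
    others (defining relation C = AB for three factors)."""
    runs = []
    for run in full_factorial(n_factors - 1):
        gen = 1
        for level in run:
            gen *= level
        runs.append(run + [gen])
    return runs

design = full_factorial(3)   # 8 runs for factors A, B, C
frac = half_fraction(3)      # 4 runs with C aliased to A*B
```

The half-fraction trades run count for aliasing: with C = AB, the main effect of C cannot be separated from the A×B interaction, which is the usual cost of fractional designs.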

  2. Measuring socio-economic position for epidemiological studies in low- and middle-income countries: a methods of measurement in epidemiology paper

    PubMed Central

    Howe, Laura D; Galobardes, Bruna; Matijasevich, Alicia; Gordon, David; Johnston, Deborah; Onwujekwe, Obinna; Patel, Rita; Webb, Elizabeth A; Lawlor, Debbie A; Hargreaves, James R

    2012-01-01

    Much has been written about the measurement of socio-economic position (SEP) in high-income countries (HIC). Less has been written for an epidemiology, health systems and public health audience about the measurement of SEP in low- and middle-income countries (LMIC). The social stratification processes in many LMIC—and therefore the appropriate measurement tools—differ considerably from those in HIC. Many measures of SEP have been utilized in epidemiological studies; the aspects of SEP captured by these measures and the pathways through which they may affect health are likely to be slightly different but overlapping. No single measure of SEP will be ideal for all studies and contexts; the strengths and limitations of a given indicator are likely to vary according to the specific research question. Understanding the general properties of different indicators, however, is essential for all those involved in the design or interpretation of epidemiological studies. In this article, we describe the measures of SEP used in LMIC. We concentrate on measures of individual or household-level SEP rather than area-based or ecological measures such as gross domestic product. We describe each indicator in terms of its theoretical basis, interpretation, measurement, strengths and limitations. We also provide brief comparisons between LMIC and HIC for each measure. PMID:22438428
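One widely used household-level SEP indicator in LMIC settings, in the family of measures this article surveys, is an asset-based wealth index taken as the first principal component of household asset indicators (the Filmer-Pritchett approach). The sketch below illustrates that general technique, not this article's own analysis; the households and asset columns are invented, and the PCA is done with pure-stdlib power iteration.

```python
# Asset-based wealth index: score each household on the first principal
# component of its standardized asset indicators.
import math

def first_pc_scores(rows):
    """Scores on the first principal component of standardized columns."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    sds = [math.sqrt(sum((r[j] - means[j]) ** 2 for r in rows) / n) or 1.0
           for j in range(p)]
    z = [[(r[j] - means[j]) / sds[j] for j in range(p)] for r in rows]
    cov = [[sum(z[i][a] * z[i][b] for i in range(n)) / n for b in range(p)]
           for a in range(p)]
    v = [1.0] * p        # power iteration for the leading eigenvector
    for _ in range(200):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return [sum(z[i][j] * v[j] for j in range(p)) for i in range(n)]

# Rows = households; columns = owns radio, bicycle, improved floor (0/1)
households = [[0, 0, 0], [1, 0, 0], [1, 1, 0], [1, 1, 1]]
scores = first_pc_scores(households)   # households with more assets score higher
```

As the article stresses, such an index captures a long-run, asset-based dimension of SEP; its interpretation differs from income or expenditure measures, and the sign and scale of PCA scores are only meaningful relative to the sampled households.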

  3. Computational methods for stealth design

    SciTech Connect

Cable, V.P.

    1992-08-01

    A review is presented of the utilization of computer models for stealth design toward the ultimate goal of designing and fielding an aircraft that remains undetected at any altitude and any range. Attention is given to the advancements achieved in computational tools and their utilization. Consideration is given to the development of supercomputers for large-scale scientific computing and the development of high-fidelity, 3D, radar-signature-prediction tools for complex shapes with nonmetallic and radar-penetrable materials.

  4. 10-Year Research Update Review: The Epidemiology of Child and Adolescent Psychiatric Disorders--I. Methods and Public Health Burden

    ERIC Educational Resources Information Center

    Costello, E. Jane; Egger, Helen; Angold, Adrian

    2005-01-01

    Objective: To review recent progress in child and adolescent psychiatric epidemiology in the area of prevalence and burden. Method: The literature published in the past decade was reviewed under two headings: methods and findings. Results: Methods for assessing the prevalence and community burden of child and adolescent psychiatric disorders have…

  5. Spacesuit Radiation Shield Design Methods

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Anderson, Brooke M.; Cucinotta, Francis A.; Ware, J.; Zeitlin, Cary J.

    2006-01-01

    Meeting radiation protection requirements during EVA is predominantly an operational issue with some potential considerations for temporary shelter. The issue of spacesuit shielding is mainly guided by the potential of accidental exposure when operational and temporary shelter considerations fail to maintain exposures within operational limits. In this case, very high exposure levels are possible which could result in observable health effects and even be life threatening. Under these assumptions, potential spacesuit radiation exposures have been studied using known historical solar particle events to gain insight on the usefulness of modification of spacesuit design in which the control of skin exposure is a critical design issue and reduction of blood forming organ exposure is desirable. Transition to a new spacesuit design including soft upper-torso and reconfigured life support hardware gives an opportunity to optimize the next generation spacesuit for reduced potential health effects during an accidental exposure.

  6. Comparison of Methods to Account for Implausible Reporting of Energy Intake in Epidemiologic Studies

    PubMed Central

    Rhee, Jinnie J.; Sampson, Laura; Cho, Eunyoung; Hughes, Michael D.; Hu, Frank B.; Willett, Walter C.

    2015-01-01

    In a recent article in the American Journal of Epidemiology by Mendez et al. (Am J Epidemiol. 2011;173(4):448–458), the use of alternative approaches to the exclusion of implausible energy intakes led to significantly different cross-sectional associations between diet and body mass index (BMI), whereas the use of a simpler recommended criterion (<500 and >3,500 kcal/day) yielded no meaningful change. However, these findings might have been due to exclusions made based on weight, a primary determinant of BMI. Using data from 52,110 women in the Nurses' Health Study (1990), we reproduced the cross-sectional findings of Mendez et al. and compared the results from the recommended method with those from 2 weight-dependent alternative methods (the Goldberg method and the predicted total energy expenditure method). The same 3 exclusion criteria were then used to examine dietary variables prospectively in relation to change in BMI, which is not a direct function of attained weight. We found similar associations using the 3 methods. In a separate cross-sectional analysis using biomarkers of dietary factors, we found similar correlations for intakes of fatty acids (n = 439) and carotenoids and retinol (n = 1,293) using the 3 methods for exclusions. These results do not support the general conclusion that use of exclusion criteria based on the alternative methods might confer an advantage over the recommended exclusion method. PMID:25656533

  7. Comparison of methods to account for implausible reporting of energy intake in epidemiologic studies.

    PubMed

    Rhee, Jinnie J; Sampson, Laura; Cho, Eunyoung; Hughes, Michael D; Hu, Frank B; Willett, Walter C

    2015-02-15

    In a recent article in the American Journal of Epidemiology by Mendez et al. (Am J Epidemiol. 2011;173(4):448-458), the use of alternative approaches to the exclusion of implausible energy intakes led to significantly different cross-sectional associations between diet and body mass index (BMI), whereas the use of a simpler recommended criterion (<500 and >3,500 kcal/day) yielded no meaningful change. However, these findings might have been due to exclusions made based on weight, a primary determinant of BMI. Using data from 52,110 women in the Nurses' Health Study (1990), we reproduced the cross-sectional findings of Mendez et al. and compared the results from the recommended method with those from 2 weight-dependent alternative methods (the Goldberg method and the predicted total energy expenditure method). The same 3 exclusion criteria were then used to examine dietary variables prospectively in relation to change in BMI, which is not a direct function of attained weight. We found similar associations using the 3 methods. In a separate cross-sectional analysis using biomarkers of dietary factors, we found similar correlations for intakes of fatty acids (n = 439) and carotenoids and retinol (n = 1,293) using the 3 methods for exclusions. These results do not support the general conclusion that use of exclusion criteria based on the alternative methods might confer an advantage over the recommended exclusion method. PMID:25656533
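The "recommended" exclusion criterion discussed in these two records (<500 and >3,500 kcal/day) amounts to a simple filter on reported intake. A sketch with invented records, for illustration only:

```python
# Keep only participants whose reported energy intake is plausible
# under the simple recommended criterion (exclude <500 or >3,500 kcal/day).

def plausible_intakes(records, low=500, high=3500):
    """Keep records whose reported kcal/day lies within [low, high]."""
    return [r for r in records if low <= r["kcal"] <= high]

reports = [{"id": 1, "kcal": 1800}, {"id": 2, "kcal": 420},
           {"id": 3, "kcal": 2600}, {"id": 4, "kcal": 5100}]
kept = plausible_intakes(reports)   # ids 1 and 3 survive
```

The weight-dependent alternatives (Goldberg cut-offs, predicted total energy expenditure) instead compare reported intake with an expected energy requirement per participant, which is why they can induce exclusions correlated with BMI.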

  8. Current Methods and Challenges for Epidemiological Studies of the Associations Between Chemical Constituents of Particulate Matter and Health.

    PubMed

    Krall, Jenna R; Chang, Howard H; Sarnat, Stefanie Ebelt; Peng, Roger D; Waller, Lance A

    2015-12-01

    Epidemiological studies have been critical for estimating associations between exposure to ambient particulate matter (PM) air pollution and adverse health outcomes. Because total PM mass is a temporally and spatially varying mixture of constituents with different physical and chemical properties, recent epidemiological studies have focused on PM constituents. Most studies have estimated associations between PM constituents and health using the same statistical methods as in studies of PM mass. However, these approaches may not be sufficient to address challenges specific to studies of PM constituents, namely assigning exposure, disentangling health effects, and handling measurement error. We reviewed large, population-based epidemiological studies of PM constituents and health and describe the statistical methods typically applied to address these challenges. Development of statistical methods that simultaneously address multiple challenges, for example, both disentangling health effects and handling measurement error, could improve estimation of associations between PM constituents and adverse health outcomes.

  9. [Retrospective evaluation of occupational exposure in epidemiologic studies. Use of the Delphi method].

    PubMed

    Goldberg, M; Leclerc, A; Chastang, J F; Goldberg, P; Brodeur, J M; Fuhrer, R; Segnan, N

    1986-01-01

    A method, based on the Delphi technique, for evaluating occupational risks in a quantifiable manner was devised in the course of a case-control study on respiratory cancers in the nickel mining and refining industry in New Caledonia. There were four stages in the evaluation process: identification of eleven potential carcinogenic factors in the company during the 1930-1977 period; grouping of a limited number of work-stations; evaluation of exposure levels for the different factors for each workstation; computation of the cumulative value of exposure for each subject under study. A partial validation study shows that this kind of approach may prove useful for future occupational epidemiological studies. PMID:3547516

  10. Review of pathogenesis and diagnostic methods of immediate relevance for epidemiology and control of Salmonella Dublin in cattle.

    PubMed

    Nielsen, Liza Rosenbaum

    2013-02-22

    Salmonella enterica subsp. enterica serovar Dublin (S. Dublin) receives increasing attention in cattle production. It is host-adapted to cattle, and leads to unacceptable levels of morbidity, mortality and production losses in both newly and persistently infected herds. Cattle health promoting institutions in several countries are currently constructing active surveillance programmes or voluntary certification programmes, and encourage control and eradication of S. Dublin infected cattle herds. There is a need to understand the underlying pathogenesis of the infection at both animal and herd level to design successful programmes. Furthermore, knowledge about, and access to, diagnostic tests for use in practice is needed, including information about the accuracy and interpretation of available diagnostic test methods. The aim is to synthesise the abundant literature on elements of pathogenesis and diagnosis of immediate relevance for epidemiology and control of S. Dublin at animal and herd level. Relatively few in vivo studies of S. Dublin pathogenesis in cattle included more than a few animals, and they often showed varying results. This makes it difficult to draw conclusions about mechanisms that affect dissemination in cattle and that might be targets for control methods directed towards improving resistance against the bacteria, e.g. new vaccines. It is recommended to perform larger studies to elucidate dose-response relationships and age and genetic effects on immunity. Furthermore, it is recommended to attempt to develop faster and more sensitive methods for detection of S. Dublin for diagnosis of infectious animals.

  11. Design Methods for Clinical Systems

    PubMed Central

    Blum, B.I.

    1986-01-01

    This paper presents a brief introduction to the techniques, methods and tools used to implement clinical systems. It begins with a taxonomy of software systems, describes the classic approach to development, provides some guidelines for the planning and management of software projects, and finishes with a guide to further reading. The conclusions are that there is no single right way to develop software, that most decisions are based upon judgment built from experience, and that there are tools that can automate some of the better understood tasks.

  12. [Field epidemiology and social epidemiology].

    PubMed

    Segura del Pozo, Javier

    2006-01-01

    Comparing field epidemiology and social epidemiology, we aim to reflect on the implicit images and meanings operating in these two necessarily convergent fields, on the obstacles that keep epidemiological practice from fulfilling its social function, and on the need to change epistemological, methodological and practical foundations, beginning with the training programmes of field epidemiologists. Field epidemiology tends to act without an explicit theoretical frame. Social epidemiology, on the other hand, tends to prioritize theoretical development (thinking and research on social determinants) without corresponding action, because of the limits on changing public policies. Other differences lie in the level of intervention (micro- vs. macro-space), the aim pursued (outbreak control vs. control of inequalities) and the way each communicates with society. They are similar in their methodological concerns; in the predominance of a positivist orientation framed by statistical methods, though both are in a process of epistemological opening; in the tension between relating to a virtual world of databases or to real society; in their peripheral position with respect to the political, social, institutional and professional system; and in a tendency toward professional frustration. Finally, we pose ten questions to field epidemiologists about their present practice, in order to consider whether they are practising social epidemiology, and propose some changes in the training and practice of epidemiologists.

  13. Mixed Method Designs in Implementation Research

    PubMed Central

    Aarons, Gregory A.; Horwitz, Sarah; Chamberlain, Patricia; Hurlburt, Michael; Landsverk, John

    2010-01-01

    This paper describes the application of mixed method designs in implementation research in 22 mental health services research studies published in peer-reviewed journals over the last 5 years. Our analyses revealed 7 different structural arrangements of qualitative and quantitative methods, 5 different functions of mixed methods, and 3 different ways of linking quantitative and qualitative data together. Complexity of design was associated with number of aims or objectives, study context, and phase of implementation examined. The findings provide suggestions for the use of mixed method designs in implementation research. PMID:20967495

  14. Culture, Interface Design, and Design Methods for Mobile Devices

    NASA Astrophysics Data System (ADS)

    Lee, Kun-Pyo

    Aesthetic differences and similarities among cultures are obviously one of the very important issues in cultural design. However, ever since products became knowledge-supporting tools, the visible elements of products have become more universal so that the invisible parts of products such as interface and interaction are getting more important. Therefore, the cultural design should be extended to the invisible elements of culture like people's conceptual models beyond material and phenomenal culture. This chapter aims to explain how we address the invisible cultural elements in interface design and design methods by exploring the users' cognitive styles and communication patterns in different cultures. Regarding cultural interface design, we examined users' conceptual models while interacting with mobile phone and website interfaces, and observed cultural difference in performing tasks and viewing patterns, which appeared to agree with cultural cognitive styles known as Holistic thoughts vs. Analytic thoughts. Regarding design methods for culture, we explored how to localize design methods such as focus group interview and generative session for specific cultural groups, and the results of comparative experiments revealed cultural difference on participants' behaviors and performance in each design method and led us to suggest how to conduct them in East Asian culture. Mobile Observation Analyzer and Wi-Pro, user research tools we invented to capture user behaviors and needs especially in their mobile context, were also introduced.

  15. Validation of Simple Epidemiological or Clinical Methods for the Measurement of Body Composition in Young Children

    PubMed Central

    Jackson, Diane M; Donaghy, Zoe; Djafarian, Kurosh; Reilly, John J

    2014-01-01

    Objective: The present study aimed to determine the validity of simple epidemiological and clinical methods for the assessment of body fatness in preschool children. Methods: In 89 children (42 boys, 47 girls; mean age 4.1 y, SD 1.3), measures of body fatness were made using total body water (TBW), dual energy x-ray absorptiometry (DXA), air displacement plethysmography (BODPOD) and skinfold thickness. Methods were compared by Bland–Altman analysis using TBW as the reference method, and by paired comparisons and rank order correlations. Findings: Bias for DXA was +1.8% body fat percentage units (limits of agreement +15.5% to −11.9%), bias for BODPOD was −3.5% (limits of agreement +18.9% to −5.9%) and bias for skinfolds using the Slaughter equations was −6.5% (limits of agreement +10.0% to −23.1%). Significant rank order correlations with TBW measures of fatness were obtained for DXA estimates of fatness (r=0.54, P=0.01), but not for estimates of fat by skinfold thickness (r=0.20, P=0.2) or BODPOD (r=0.25, P=0.1). Differences between both DXA and BODPOD and the reference TBW estimates of body fatness were not significant (P=0.06 and P=0.1 respectively); however, the difference in estimated body fatness between skinfold thickness and TBW was significant (P<0.001). Conclusion: Estimates of body fatness in preschool children were inaccurate at the level of the individual child using all the methods, but DXA might provide unbiased estimates and a means of making relative assessments of body fatness. PMID:26019772
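
    The Bland–Altman quantities reported above reduce to two statistics on the per-child differences between a test method and the reference: the bias is their mean, and the limits of agreement are bias ± 1.96 SD. A minimal sketch, with made-up numbers rather than the study's data:

```python
# Bland-Altman bias and limits of agreement for a test method vs. a
# reference method (here, hypothetical DXA vs. TBW body-fat percentages).
from statistics import mean, stdev

def bland_altman(test_vals, ref_vals):
    diffs = [t - r for t, r in zip(test_vals, ref_vals)]
    bias = mean(diffs)
    half_width = 1.96 * stdev(diffs)
    return bias, bias - half_width, bias + half_width

dxa = [22.0, 25.5, 30.1, 18.9]  # illustrative % body fat by test method
tbw = [20.5, 24.0, 29.0, 19.5]  # illustrative % body fat by reference
bias, lower, upper = bland_altman(dxa, tbw)
```

    Wide limits of agreement with a small bias, as for DXA above, are exactly the pattern of a method that is inaccurate for individuals yet roughly unbiased on average.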

  16. RADRUE METHOD FOR RECONSTRUCTION OF EXTERNAL PHOTON DOSES TO CHERNOBYL LIQUIDATORS IN EPIDEMIOLOGICAL STUDIES

    PubMed Central

    Kryuchkov, Victor; Chumak, Vadim; Maceika, Evaldas; Anspaugh, Lynn R.; Cardis, Elisabeth; Bakhanova, Elena; Golovanov, Ivan; Drozdovitch, Vladimir; Luckyanov, Nickolas; Kesminiene, Ausrele; Voillequé, Paul; Bouville, André

    2010-01-01

    Between 1986 and 1990, several hundred thousand workers, called “liquidators” or “clean-up workers”, took part in decontamination and recovery activities within the 30-km zone around the Chernobyl nuclear power plant in Ukraine, where a major accident occurred in April 1986. The Chernobyl liquidators were mainly exposed to external ionizing radiation levels that depended primarily on their work locations and the time after the accident when the work was performed. Because individual doses were often monitored inadequately or were not monitored at all for the majority of liquidators, a new method of photon (i.e. gamma and x-rays) dose assessment, called “RADRUE” (Realistic Analytical Dose Reconstruction with Uncertainty Estimation) was developed to obtain unbiased and reasonably accurate estimates for use in three epidemiologic studies of hematological malignancies and thyroid cancer among liquidators. The RADRUE program implements a time-and-motion dose reconstruction method that is flexible and conceptually easy to understand. It includes a large exposure rate database and interpolation and extrapolation techniques to calculate exposure rates at places where liquidators lived and worked within ~70 km of the destroyed reactor. The RADRUE technique relies on data collected from subjects’ interviews conducted by trained interviewers, and on expert dosimetrists to interpret the information and provide supplementary information, when necessary, based upon their own Chernobyl experience. The RADRUE technique was used to estimate doses from external irradiation, as well as uncertainties, to the bone-marrow for 929 subjects and to the thyroid gland for 530 subjects enrolled in epidemiologic studies. Individual bone-marrow dose estimates were found to range from less than one μGy to 3,300 mGy, with an arithmetic mean of 71 mGy. Individual thyroid dose estimates were lower and ranged from 20 μGy to 507 mGy, with an arithmetic mean of 29 mGy. The
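
    The time-and-motion idea behind RADRUE can be illustrated in miniature: a worker's external dose is the sum, over the legs of his itinerary, of the exposure rate at that place and time multiplied by the hours spent there. The rates, locations and itinerary below are invented; the real program draws on a large exposure-rate database and interview data.

```python
# Toy time-and-motion dose reconstruction (RADRUE-style). All values are
# illustrative assumptions, not data from the actual exposure-rate database.

def external_dose(itinerary, rate_lookup):
    """itinerary: list of (location, day, hours); rate_lookup: mGy per hour."""
    return sum(rate_lookup[(loc, day)] * hours for loc, day, hours in itinerary)

rates = {("reactor_site", 1): 0.50, ("camp", 1): 0.02}   # mGy/h (invented)
trip = [("reactor_site", 1, 2.0), ("camp", 1, 10.0)]     # one day's movements
dose = external_dose(trip, rates)  # 0.5*2 + 0.02*10 = 1.2 mGy
```

    In the real method each rate carries an uncertainty, so the sum is propagated to a dose distribution per subject rather than a single number.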

  17. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.

  18. Mixed Methods Research Designs in Counseling Psychology

    ERIC Educational Resources Information Center

    Hanson, William E.; Creswell, John W.; Clark, Vicki L. Plano; Petska, Kelly S.; Creswell, David J.

    2005-01-01

    With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods…

  19. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  20. Micarta propellers IV : technical methods of design

    NASA Technical Reports Server (NTRS)

    Caldwell, F W; Clay, N S

    1924-01-01

    A description is given of the methods used in design of Micarta propellers. The most direct method for working out the design of a Micarta propeller is to start with the diameter and blade angles of a wooden propeller suited for a particular installation and then to apply one of the plan forms suitable for Micarta propellers. This allows one to obtain the corresponding blade widths and to then use these angles and blade widths for an aerodynamic analysis.

  1. Phene Plate (PhP) biochemical fingerprinting. A screening method for epidemiological typing of enterococcal isolates.

    PubMed

    Saeedi, B; Tärnberg, M; Gill, H; Hällgren, A; Jonasson, J; Nilsson, L E; Isaksson, B; Kühn, I; Hanberger, H

    2005-09-01

    Pulsed-field gel electrophoresis (PFGE) is currently considered the gold standard for genotyping of enterococci. However, PFGE is both expensive and time-consuming. The purpose of this study was to investigate whether the PhP system can be used as a reliable clinical screening method for detection of genetically related isolates of enterococci. If so, it should be possible to minimize the number of isolates subjected to PFGE typing, which would save time and money. Ninety-nine clinical enterococcal isolates were analysed by PhP (similarity levels 0.90-0.975) and PFGE (similarity levels ≤3 and ≤6 bands) and all possible pairs of isolates were cross-classified as matched or mismatched. We found that the probability that a pair of isolates (A and B) belonging to the same type according to PhP also belong to the same cluster according to PFGE, i.e. p(A_PFGE = B_PFGE | A_PhP = B_PhP), and the probability that a pair of isolates of different types according to PhP also belong to different clusters according to PFGE, i.e. p(A_PFGE ≠ B_PFGE | A_PhP ≠ B_PhP), were relatively high for E. faecalis (0.86 and 0.96, respectively), but lower for E. faecium (0.51 and 0.77, respectively). The concordance, i.e. the probability that PhP and PFGE agree on match or mismatch, was 86%-93% for E. faecalis and 54%-66% for E. faecium, which indicates that the PhP method may be useful for epidemiological typing of E. faecalis in the current settings but not for E. faecium.
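
    The pairwise cross-classification used above has a direct computational form: every pair of isolates is classified as matched or mismatched by each typing method, and the concordance is the fraction of pairs on which the two methods agree. A small sketch with invented type labels:

```python
# Pairwise concordance between two typing methods (e.g. PhP vs. PFGE).
# Type labels below are invented for illustration.
from itertools import combinations

def concordance(types_a, types_b):
    """Fraction of isolate pairs classified the same way by both methods."""
    pairs = list(combinations(range(len(types_a)), 2))
    agree = sum(
        (types_a[i] == types_a[j]) == (types_b[i] == types_b[j])
        for i, j in pairs
    )
    return agree / len(pairs)

php = ["A", "A", "B", "C"]
pfge = ["x", "x", "y", "y"]
agreement = concordance(php, pfge)  # 5 of the 6 pairs agree
```

    Conditioning the same counts on "matched by PhP" or "mismatched by PhP" yields the two conditional probabilities reported in the abstract.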

  2. Two method measurement for adolescent obesity epidemiology: Reducing the bias in self report of height and weight

    PubMed Central

    Drake, Keith M.; Longacre, Meghan R.; Dalton, Madeline A.; Langeloh, Gail; Peterson, Karen E.; Titus, Linda J.; Beach, Michael L.

    2013-01-01

    Background Despite validation studies demonstrating substantial bias, epidemiologic studies typically use self-reported height and weight as primary measures of body mass index due to feasibility and resource limitations. Purpose To demonstrate a method for calculating accurate and precise estimates that use body mass index when objectively measuring height and weight in a full sample is not feasible. Methods As part of a longitudinal study of adolescent health, 1,840 adolescents (aged 12–18) self-reported their height and weight during telephone surveys. Height and weight was measured for 407 of these adolescents. Sex specific, age-adjusted obesity status was calculated from self-reported and from measured height and weight. Prevalence and predictors of obesity were estimated using 1) self-reported data, 2) measured data, and 3) multiple imputation (of measured data). Results Among adolescents with self-reported and measured data, the obesity prevalence was lower when using self-report compared to actual measurements (p < 0.001). The obesity prevalence from multiple imputation (20%) was much closer to estimates based solely on measured data (20%) compared to estimates based solely on self-reported data (12%), indicating improved accuracy. In multivariate models, estimates of predictors of obesity were more accurate and approximately as precise (similar confidence intervals) as estimates based solely on self-reported data. Conclusions The two-method measurement design offers researchers a technique to reduce the bias typically inherent in self-reported height and weight without needing to collect measurements on the full sample. This technique enhances the ability to detect real, statistically significant differences, while minimizing the need for additional resources. PMID:23684216
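
    The core of the two-method design above is to use the validation subsample, where both self-reported and measured values exist, to correct the full sample. The study used multiple imputation; as a simplified stand-in, the sketch below fits a single-predictor linear calibration of measured BMI on self-reported BMI and applies it to subjects with self-report only. All numbers and names are invented.

```python
# Simplified two-method calibration: fit measured ~ self-report on the
# validation subsample, then predict measured BMI elsewhere. This is a
# didactic stand-in for the study's multiple imputation, not its method.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept for one predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# validation subsample: (self-reported BMI, measured BMI), invented values
self_rep = [21.0, 24.0, 27.0, 30.0]
measured = [22.1, 25.3, 28.2, 31.6]
slope, intercept = fit_line(self_rep, measured)

def predict(bmi_self):
    """Calibrated BMI for an adolescent with self-report only."""
    return slope * bmi_self + intercept
```

    Multiple imputation goes further by drawing several plausible measured values per subject, which also propagates the calibration uncertainty into the final confidence intervals.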

  3. Development of a hydraulic turbine design method

    NASA Astrophysics Data System (ADS)

    Kassanos, Ioannis; Anagnostopoulos, John; Papantonis, Dimitris

    2013-10-01

    In this paper a hydraulic turbine parametric design method is presented which is based on the combination of traditional methods and parametric surface modeling techniques. The blade of the turbine runner is described using Bezier surfaces for the definition of the meridional plane as well as the blade angle distribution, and a thickness distribution applied normal to the mean blade surface. In this way, it is possible to define parametrically the whole runner using a relatively small number of design parameters, compared to conventional methods. The above definition is then combined with a commercial CFD software and a stochastic optimization algorithm towards the development of an automated design optimization procedure. The process is demonstrated with the design of a Francis turbine runner.
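
    The building block of the parametric blade definition described above is Bezier evaluation. The study uses Bezier surfaces; a curve in 2-D, evaluated with de Casteljau's algorithm, shows the idea with invented control points:

```python
# De Casteljau evaluation of a Bezier curve: repeatedly interpolate
# between adjacent control points until one point remains.
# Control points below are invented 2-D examples, not a blade profile.

def de_casteljau(points, t):
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [
            tuple((1 - t) * a + t * b for a, b in zip(p, q))
            for p, q in zip(pts, pts[1:])
        ]
    return pts[0]

ctrl = [(0.0, 0.0), (0.5, 1.0), (1.0, 0.0)]  # quadratic control polygon
mid = de_casteljau(ctrl, 0.5)  # (0.5, 0.5) for this quadratic
```

    Because the whole runner is driven by a small set of such control points, a stochastic optimizer can treat them directly as design variables, which is what makes the automated CFD optimization loop practical.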

  4. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools

    PubMed Central

    2014-01-01

    Background The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Methods Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular with respect to intended use, types of questions and answers, presence/absence of a quality score, and whether a validation was performed. Results In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. Conclusions The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged in evidence-based decision-making.

  5. Age-Based Methods to Explore Time-Related Variables in Occupational Epidemiology Studies

    SciTech Connect

    Watkins, Janice P.; Frome, Edward L.; Cragle, Donna L.

    2005-08-31

    Although age is recognized as the strongest predictor of mortality in chronic disease epidemiology, a calendar-based approach is often employed when evaluating time-related variables. An age-based analysis file, created by determining the value of each time-dependent variable for each age that a cohort member is followed, provides a clear definition of age at exposure and allows development of diverse analytic models. To demonstrate methods, the relationship between cancer mortality and external radiation was analyzed with Poisson regression for 14,095 Oak Ridge National Laboratory workers. Based on previous analysis of this cohort, a model with ten-year lagged cumulative radiation doses partitioned by receipt before (dose-young) or after (dose-old) age 45 was examined. Dose-response estimates were similar to calendar-year-based results with elevated risk for dose-old, but not when film badge readings were weekly before 1957. Complementary results showed increasing risk with older hire ages and earlier birth cohorts, since workers hired after age 45 were born before 1915, and dose-young and dose-old were distributed differently by birth cohorts. Risks were generally higher for smoking-related than non-smoking-related cancers. It was difficult to single out specific variables associated with elevated cancer mortality because of: (1) birth cohort differences in hire age and mortality experience completeness, and (2) time-period differences in working conditions, dose potential, and exposure assessment. This research demonstrated the utility and versatility of the age-based approach.
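
    The age-based analysis file described above assigns each time-dependent variable a value at every attained age of follow-up. A minimal sketch of the key step, splitting one worker's follow-up into age-specific person-year records (field names are assumed for illustration):

```python
# Split a follow-up interval into one record per attained age, the basic
# expansion behind an age-based (rather than calendar-based) analysis file.

def expand_by_age(entry_age, exit_age):
    """Return one {'age': ..., 'pyr': ...} record per attained age."""
    records = []
    age = entry_age
    while age < exit_age:
        nxt = min(int(age) + 1, exit_age)  # next birthday or end of follow-up
        records.append({"age": int(age), "pyr": nxt - age})
        age = nxt
    return records

# A worker followed from age 30.5 to 33.0 contributes 0.5, 1.0 and 1.0
# person-years at attained ages 30, 31 and 32.
rows = expand_by_age(30.5, 33.0)
```

    Time-dependent covariates such as a ten-year lagged cumulative dose would then be evaluated once per row, making "age at exposure" explicit in the Poisson regression.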

  6. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.

  7. ESD protection device design using statistical methods

    NASA Astrophysics Data System (ADS)

    Shigyo, N.; Kawashima, H.; Yasuda, S.

    2002-12-01

    This paper describes the design of an electrostatic discharge (ESD) protection device to minimize its area Ap while maintaining the breakdown voltage VESD. Hypothesis tests using measured data were performed to find the severest applied surge condition and to select control factors for the design-of-experiments (DOE). Also, technology CAD (TCAD) was used to estimate VESD. An optimum device structure, where salicide block was employed, was found using statistical methods and TCAD.

  8. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. This approach also weighs operational approaches based on their effect on upstream design variables so that it is possible to readily, yet defensibly, establish linkages between operations and these upstream variables. To avoid the large range of problems that have defeated previous methods of dealing with the complex problems of transportation design, and to cut down the inefficient use of resources, the method described in the document identifies those areas that are of sufficient promise and that provide a higher grade of analysis for those issues, as well as the linkages at issue between operations and other factors. Ultimately, the system is designed to save resources and time, and allows for the evolution of operable space transportation system technology, and design and conceptual system approach targets.

  9. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.

  10. Optimization methods applied to hybrid vehicle design

    NASA Technical Reports Server (NTRS)

    Donoghue, J. F.; Burghart, J. H.

    1983-01-01

    The use of optimization methods as an effective design tool in the design of hybrid vehicle propulsion systems is demonstrated. Optimization techniques were used to select values for three design parameters (battery weight, heat engine power rating and power split between the two on-board energy sources) such that various measures of vehicle performance (acquisition cost, life cycle cost and petroleum consumption) were optimized. The approach produced designs which were often significant improvements over hybrid designs already reported on in the literature. The principal conclusions are as follows. First, it was found that the strategy used to split the required power between the two on-board energy sources can have a significant effect on life cycle cost and petroleum consumption. Second, the optimization program should be constructed so that performance measures and design variables can be easily changed. Third, the vehicle simulation program has a significant effect on the computer run time of the overall optimization program; run time can be significantly reduced by proper design of the types of trips the vehicle takes in a one year period. Fourth, care must be taken in designing the cost and constraint expressions which are used in the optimization so that they are relatively smooth functions of the design variables. Fifth, proper handling of constraints on battery weight and heat engine rating, variables which must be large enough to meet power demands, is particularly important for the success of an optimization study. Finally, the principal conclusion is that optimization methods provide a practical tool for carrying out the design of a hybrid vehicle propulsion system.

  11. Epidemiological causality.

    PubMed

    Morabia, Alfredo

    2005-01-01

    Epidemiological methods, which combine population thinking and group comparisons, can primarily identify causes of disease in populations. There is therefore a tension between our intuitive notion of a cause, which we want to be deterministic and invariant at the individual level, and the epidemiological notion of causes, which are invariant only at the population level. Epidemiologists have heretofore given a pragmatic solution to this tension. Causal inference in epidemiology consists in checking the logical coherence of a causality statement and determining whether what has been found grossly contradicts what we think we already know: how strong is the association? Is there a dose-response relationship? Does the cause precede the effect? Is the effect biologically plausible? Etc. This approach to causal inference can be traced back to the English philosophers David Hume and John Stuart Mill. On the other hand, the mode of establishing causality, devised by Jakob Henle and Robert Koch, which has been fruitful in bacteriology, requires that in every instance the effect invariably follows the cause (e.g., inoculation of Koch bacillus and tuberculosis). This is incompatible with epidemiological causality, which has to deal with probabilistic effects (e.g., smoking and lung cancer), and is therefore invariant only for the population.

  12. Development and Evaluation for Active Learning Instructional Design of Epidemiology in Nursing Informatics Field.

    PubMed

    Majima, Yukie

    2016-01-01

Nursing education classes are classifiable into three types: lectures, classroom practice, and clinical practice. In this study, we implemented a class that incorporated elements of active learning, including clickers, minute papers, quizzes, and group work and presentation, in the subject of "epidemiology", which is often positioned in the field of nursing informatics and which is usually taught in conventional knowledge-transmission style lectures, to help students understand the material and achieve seven class goals. Results revealed that the average scores of class achievement (on a five-level evaluation) were 3.6-3.9, which was good overall. The highest average score in the students' evaluation of the teaching materials (on a five-level evaluation) was 4.6 for quizzes, followed by 4.2 for the announcement of test statistics, 4.1 for clickers, and 4.0 for the presentation of news related to epidemiology. We regard these as useful tools for increasing students' motivation. One problem with the class was the time it took to organize: creating tests, preparing and marking class materials (such as items to be returned and the distribution of clickers), and writing comments on the minute papers. PMID:27332214

  13. Epigenetic Epidemiology: Promises for Public Health Research

    PubMed Central

    Bakulski, Kelly M.; Fallin, M. Daniele

    2014-01-01

    Epigenetic changes underlie developmental and age related biology. Promising epidemiologic research implicates epigenetics in disease risk and progression, and suggests epigenetic status depends on environmental risks as well as genetic predisposition. Epigenetics may represent a mechanistic link between environmental exposures, or genetics, and many common diseases, or may simply provide a quantitative biomarker for exposure or disease for areas of epidemiology currently lacking such measures. This great promise is balanced by issues related to study design, measurement tools, statistical methods, and biological interpretation that must be given careful consideration in an epidemiologic setting. This article describes the promises and challenges for epigenetic epidemiology, and suggests directions to advance this emerging area of molecular epidemiology. PMID:24449392

  14. The semi-individual study in air pollution epidemiology: a valid design as compared to ecologic studies.

    PubMed Central

    Künzli, N; Tager, I B

    1997-01-01

The assessment of long-term effects of air pollution in humans relies on epidemiologic studies. A widely used design consists of cross-sectional or cohort studies in which ecologic assignment of exposure, based on a fixed-site ambient monitor, is employed. Although health outcome and usually a large number of covariates are measured in individuals, these studies are often called ecological. We introduce the term semi-individual design for these studies. We review the major properties and limitations, with regard to causal inference, of truly ecologic studies, in which outcome, exposure, and covariates are available on an aggregate level only. Misclassification problems and issues related to confounding and model specification in truly ecologic studies limit etiologic inference to individuals. In contrast, the semi-individual study shares its methodological and inferential properties with typical individual-level study designs. The major caveat relates to the case where too few study areas, e.g., two or three, are used, which renders control of aggregate-level confounding impossible. The issue of exposure misclassification is of general concern in epidemiology and not an exclusive problem of the semi-individual design. In a multicenter setting, the semi-individual study is a valuable tool to approach long-term effects of air pollution. Knowledge about the error structure of the ecologically assigned exposure allows consideration of its impact on effect estimation. Semi-individual studies, i.e., individual-level air pollution studies with ecologic exposure assignment, more readily permit valid inference to individuals and should not be labeled as ecologic studies. PMID:9349825

  15. Methods and Technologies Branch (MTB)

    Cancer.gov

    The Methods and Technologies Branch focuses on methods to address epidemiologic data collection, study design and analysis, and to modify technological approaches to better understand cancer susceptibility.

  16. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

Research completed by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for the engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  17. MAST Propellant and Delivery System Design Methods

    NASA Technical Reports Server (NTRS)

    Nadeem, Uzair; Mc Cleskey, Carey M.

    2015-01-01

A Mars Aerospace Taxi (MAST) concept and propellant storage and delivery case study is undergoing investigation by NASA's Element Design and Architectural Impact (EDAI) design and analysis forum. The MAST lander concept envisions landing with its ascent propellant storage tanks empty and supplying these reusable Mars landers with propellant that is generated and transferred while on the Mars surface. The report provides an overview of the data derived from modeling different methods of propellant line routing (or "lining") and differentiates the resulting design and operations complexity of fluid and gaseous paths based on a given set of fluid sources and destinations. The EDAI team desires a rough-order-of-magnitude algorithm for estimating the lining characteristics (i.e., the plumbing mass and complexity) associated with different numbers of vehicle propellant sources and destinations. This paper explores the feasibility of preparing a mathematically sound algorithm for this purpose and offers a method for the EDAI team to implement.

  18. New method of designing CCD driver

    NASA Astrophysics Data System (ADS)

    Yu, Wei; Yu, Daoyin; Zhang, Yimo

    1993-04-01

    A new method of designing CCD driver circuits is introduced in this paper. Some kinds of programmable logic device (PLD) chips including generic array logic (GAL) and EPROM are used to drive a CCD sensor. The driver runs stably and reliably. It is widely applied in many fields with its good interchangeability, small size, and low cost.

  19. Acoustic Treatment Design Scaling Methods. Phase 2

    NASA Technical Reports Server (NTRS)

    Clark, L. (Technical Monitor); Parrott, T. (Technical Monitor); Jones, M. (Technical Monitor); Kraft, R. E.; Yu, J.; Kwan, H. W.; Beer, B.; Seybert, A. F.; Tathavadekar, P.

    2003-01-01

The ability to design, build and test miniaturized acoustic treatment panels on scale model fan rigs representative of full scale engines provides not only cost-savings, but also an opportunity to optimize the treatment by allowing multiple tests. To use scale model treatment as a design tool, the impedance of the sub-scale liner must be known with confidence. This study was aimed at developing impedance measurement methods for high frequencies. A normal incidence impedance tube method that extends the upper frequency range to 25,000 Hz without grazing flow effects was evaluated. The free field method was investigated as a potential high frequency technique. The potential of the two-microphone in-situ impedance measurement method was evaluated in the presence of grazing flow. Difficulties in achieving the high frequency goals were encountered in all methods. Results of developing a time-domain finite difference resonator impedance model indicated that a re-interpretation of the empirical fluid mechanical models used in the frequency domain model for nonlinear resistance and mass reactance may be required. A scale model treatment design that could be tested on the Universal Propulsion Simulator vehicle was proposed.

  20. 3. 6 simplified methods for design

    SciTech Connect

    Nickell, R.E.; Yahr, G.T.

    1981-01-01

    Simplified design analysis methods for elevated temperature construction are classified and reviewed. Because the major impetus for developing elevated temperature design methodology during the past ten years has been the LMFBR program, considerable emphasis is placed upon results from this source. The operating characteristics of the LMFBR are such that cycles of severe transient thermal stresses can be interspersed with normal elevated temperature operational periods of significant duration, leading to a combination of plastic and creep deformation. The various simplified methods are organized into two general categories, depending upon whether it is the material, or constitutive, model that is reduced, or the geometric modeling that is simplified. Because the elastic representation of material behavior is so prevalent, an entire section is devoted to elastic analysis methods. Finally, the validation of the simplified procedures is discussed.

  1. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to the especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.

  2. Optimization methods for alternative energy system design

    NASA Astrophysics Data System (ADS)

    Reinhardt, Michael Henry

An electric vehicle heating system and a solar thermal coffee dryer are presented as case studies in alternative energy system design optimization. Design optimization tools are compared using these case studies, including linear programming, integer programming, and fuzzy integer programming. Although most decision variables in the designs of alternative energy systems are generally discrete (e.g., numbers of photovoltaic modules, thermal panels, layers of glazing in windows), the literature shows that the optimization methods used historically for design utilize continuous decision variables. Integer programming, used to find the optimal investment in conservation measures as a function of life cycle cost of an electric vehicle heating system, is compared to linear programming, demonstrating the importance of accounting for the discrete nature of design variables. The electric vehicle study shows that conservation methods similar to those used in building design, which reduce the overall UA of a 22 ft. electric shuttle bus from 488 to 202 Btu/hr-F, can eliminate the need for fossil fuel heating systems when operating in the northeast United States. Fuzzy integer programming is presented as a means of accounting for imprecise design constraints, such as being environmentally friendly, in the optimization process. The solar thermal coffee dryer study focuses on a deep-bed design using unglazed thermal collectors (UTC). Experimental data from parchment coffee drying are gathered, including drying constants and equilibrium moisture. In this case, fuzzy linear programming is presented as a means of optimizing experimental procedures to produce the most information under imprecise constraints. Graphical optimization is used to show that for every 1 m2 deep-bed dryer of 0.4 m depth, a UTC array consisting of five 1.1-m2 panels and a photovoltaic array consisting of one 0.25-m2 panel produces the most dry coffee per dollar invested in the system.
In general this study
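The distinction drawn above between continuous and discrete decision variables can be illustrated with a toy sizing problem. The panel costs, energy yields, and demand figure below are hypothetical, not data from the study: rounding the continuous (linear-programming) optimum can violate the design constraint, while a small integer program finds the true discrete optimum.

```python
import itertools
import math

# Hypothetical panel data (not taken from the study): unit cost and
# daily energy yield per panel.
COST = {"pv": 300.0, "thermal": 180.0}   # $ per panel
YIELD = {"pv": 1.2, "thermal": 2.0}      # kWh/day per panel
DEMAND = 13.0                            # kWh/day required

def cost(n_pv, n_th):
    return COST["pv"] * n_pv + COST["thermal"] * n_th

def feasible(n_pv, n_th):
    return YIELD["pv"] * n_pv + YIELD["thermal"] * n_th >= DEMAND

# Continuous relaxation: thermal panels have the better cost per kWh
# (180/2.0 = 90 vs 300/1.2 = 250), so the relaxed optimum is thermal-only.
n_th_continuous = DEMAND / YIELD["thermal"]   # 6.5 panels -- not buildable
rounded = (0, math.floor(n_th_continuous))
assert not feasible(*rounded)   # naive rounding violates the constraint

# The integer program, solved here by exhaustive search on a small grid:
best = min(
    ((cost(p, t), p, t)
     for p, t in itertools.product(range(12), range(12))
     if feasible(p, t)),
    key=lambda item: item[0],
)
print(best)
```

For realistic problem sizes a branch-and-bound or cutting-plane solver would replace the exhaustive search, but the point stands: the discrete optimum must be searched for, not rounded to.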

  3. Waterflooding injectate design systems and methods

    SciTech Connect

    Brady, Patrick V.; Krumhansl, James L.

    2014-08-19

    A method of designing an injectate to be used in a waterflooding operation is disclosed. One aspect includes specifying data representative of chemical characteristics of a liquid hydrocarbon, a connate, and a reservoir rock, of a subterranean reservoir. Charged species at an interface of the liquid hydrocarbon are determined based on the specified data by evaluating at least one chemical reaction. Charged species at an interface of the reservoir rock are determined based on the specified data by evaluating at least one chemical reaction. An extent of surface complexation between the charged species at the interfaces of the liquid hydrocarbon and the reservoir rock is determined by evaluating at least one surface complexation reaction. The injectate is designed and is operable to decrease the extent of surface complexation between the charged species at interfaces of the liquid hydrocarbon and the reservoir rock. Other methods, apparatus, and systems are disclosed.

  4. An improved design method for EPC middleware

    NASA Astrophysics Data System (ADS)

    Lou, Guohuan; Xu, Ran; Yang, Chunming

    2014-04-01

To address the problems and difficulties that small and medium-sized enterprises encounter when using the EPC (Electronic Product Code) ALE (Application Level Events) specification to implement middleware, an improved design method for EPC middleware is presented, based on an analysis of the principles of EPC middleware. The method exploits the MySQL database, using the database to connect reader-writers with the upper application system instead of developing an ALE application program interface, to achieve middleware with general functionality. The structure is simple and easy to implement and maintain. Under this structure, different types of reader-writers can be added and configured conveniently, and the expandability of the system is improved.

  6. A survival tree method for the analysis of discrete event times in clinical and epidemiological studies.

    PubMed

    Schmid, Matthias; Küchenhoff, Helmut; Hoerauf, Achim; Tutz, Gerhard

    2016-02-28

    Survival trees are a popular alternative to parametric survival modeling when there are interactions between the predictor variables or when the aim is to stratify patients into prognostic subgroups. A limitation of classical survival tree methodology is that most algorithms for tree construction are designed for continuous outcome variables. Hence, classical methods might not be appropriate if failure time data are measured on a discrete time scale (as is often the case in longitudinal studies where data are collected, e.g., quarterly or yearly). To address this issue, we develop a method for discrete survival tree construction. The proposed technique is based on the result that the likelihood of a discrete survival model is equivalent to the likelihood of a regression model for binary outcome data. Hence, we modify tree construction methods for binary outcomes such that they result in optimized partitions for the estimation of discrete hazard functions. By applying the proposed method to data from a randomized trial in patients with filarial lymphedema, we demonstrate how discrete survival trees can be used to identify clinically relevant patient groups with similar survival behavior. PMID:26358826
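The equivalence invoked above, between the likelihood of a discrete survival model and that of a binary regression model, rests on the standard person-period expansion, sketched here on toy data (the subjects and covariate are invented for illustration):

```python
# Person-period ("long format") expansion: each subject observed up to
# discrete time t contributes one binary row per period, with y = 1 only
# in the final period and only if the event occurred. Binary-outcome
# methods (including classification trees) applied to these rows then
# estimate the discrete hazard h(t | x).
subjects = [
    # (observed_time, event_indicator (0 = censored), covariate x)
    (3, 1, 0.5),
    (2, 0, 1.7),
    (4, 1, 0.9),
]

def person_period(data):
    rows = []
    for time, event, x in data:
        for t in range(1, time + 1):
            y = 1 if (event == 1 and t == time) else 0
            rows.append({"t": t, "x": x, "y": y})
    return rows

rows = person_period(subjects)
# Subject 1 contributes 3 rows (y = 0, 0, 1), subject 2 contributes
# 2 rows (all y = 0, censored), subject 3 contributes 4 rows.
print(len(rows))
```

Fitting any binary classifier to `(t, x) -> y` on these rows maximizes the same likelihood as the discrete survival model, which is what lets the authors reuse binary-outcome tree construction.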

  7. Review of methods of dose estimation for epidemiological studies of the radiological impact of nevada test site and global fallout.

    PubMed

    Beck, Harold L; Anspaugh, Lynn R; Bouville, André; Simon, Steven L

    2006-07-01

Methods to assess radiation doses from nuclear weapons test fallout have been used to estimate doses to populations and individuals in a number of studies. However, only a few epidemiology studies have relied on fallout dose estimates. Though the methods for assessing doses from local and regional fallout are similar to those for global fallout, there are significant differences in predicted doses and contributing radionuclides depending on the source of the fallout, e.g., whether the nuclear debris originated in Nevada at the U.S. nuclear test site or at other locations worldwide. The sparse historical measurement data available are generally sufficient to estimate external exposure doses reasonably well. However, reconstruction of doses to body organs from ingestion and inhalation of radionuclides is significantly more complex and is almost always more uncertain than are external dose estimates. Internal dose estimates are generally based on estimates of the ground deposition per unit area of specific radionuclides and the subsequent transport of radionuclides through the food chain. A number of technical challenges to correctly modeling deposition of fallout under wet and dry atmospheric conditions still remain, particularly at close-in locations where the sizes of deposited particles vary significantly over modest changes in distance. This paper summarizes the various methods of dose estimation from weapons test fallout and the most important dose assessment and epidemiology studies that have relied on those methods.

  8. Program for the epidemiological evaluation of stroke in Tandil, Argentina (PREVISTA) study: rationale and design.

    PubMed

    Sposato, Luciano A; Coppola, Mariano L; Altamirano, Juan; Borrego Guerrero, Brenda; Casanova, Jorge; De Martino, Maximiliano; Díaz, Alejandro; Feigin, Valery L; Funaro, Fernando; Gradillone, María E; Lewin, María L; Lopes, Renato D; López, Daniel H; Louge, Mariel; Maccarone, Patricia; Martens, Cecilia; Miguel, Marcelo; Rabinstein, Alejandro; Morasso, Hernán; Riccio, Patricia M; Saposnik, Gustavo; Silva, Damián; Suasnabar, Ramón; Truelsen, Thomas; Uzcudun, Araceli; Viviani, Carlos A; Bahit, M Cecilia

    2013-10-01

The availability of population-based epidemiological data on the incident risk of stroke is very scarce in Argentina and other Latin American countries. In response to the priorities established by the World Health Organization and the United Nations, PREVISTA was envisaged as a population-based program to determine the incidence and mortality of first-ever and recurrent stroke and transient ischemic attack in Tandil, Buenos Aires, Argentina. The study will be conducted according to Standardized Tools for Stroke Surveillance (STEPS Stroke) methodology and will enroll all new (incident) and recurrent consecutive cases of stroke and transient ischemic attack in the City of Tandil between May 1st, 2013 and April 30, 2015. The study will include patients with ischemic stroke, non-traumatic primary intracerebral hemorrhage, subarachnoid hemorrhage, and transient ischemic attack. To ensure the inclusion of every cerebrovascular event during an observation period of two years, we will implement an 'intensive screening program', consisting of comprehensive daily tracking of every potential stroke or transient ischemic attack event using multiple overlapping sources. Mortality will be determined during follow-up for every enrolled patient. In addition, fatal community events will be screened daily through revision of death certificates at funeral homes and local offices of vital statistics. All causes of death will be adjudicated by an ad-hoc committee. The closed population of Tandil is representative of a large proportion of Latin American countries with low- and middle-income economies. The findings and conclusions of PREVISTA may provide data that could support future health policy decision-making in the region. PMID:24024917

  9. Design methods of rhombic tensegrity structures

    NASA Astrophysics Data System (ADS)

    Feng, Xi-Qiao; Li, Yue; Cao, Yan-Ping; Yu, Shou-Wen; Gu, Yuan-Tong

    2010-08-01

    As a special type of novel flexible structures, tensegrity holds promise for many potential applications in such fields as materials science, biomechanics, civil and aerospace engineering. Rhombic systems are an important class of tensegrity structures, in which each bar constitutes the longest diagonal of a rhombus of four strings. In this paper, we address the design methods of rhombic structures based on the idea that many tensegrity structures can be constructed by assembling one-bar elementary cells. By analyzing the properties of rhombic cells, we first develop two novel schemes, namely, direct enumeration scheme and cell-substitution scheme. In addition, a facile and efficient method is presented to integrate several rhombic systems into a larger tensegrity structure. To illustrate the applications of these methods, some novel rhombic tensegrity structures are constructed.

  10. Method of designing layered sound absorbing materials

    NASA Astrophysics Data System (ADS)

    Atalla, Youssef; Panneton, Raymond

    2002-11-01

A widely used model for describing sound propagation in porous materials is the Johnson-Champoux-Allard model. This rigid-frame model is based on five geometrical properties of the porous medium: airflow resistivity, porosity, tortuosity, and the viscous and thermal characteristic lengths. Using this model, and with knowledge of these properties for different absorbing materials, the design of a multiple-layered system can be optimized efficiently and rapidly. The overall impedance of the layered system can be calculated by repeated application of the single-layer impedance equation. Knowledge of the properties of the materials involved in the layered system, and of their physical meaning, allows one to perform a systematic computer evaluation of potential layer combinations, rather than doing so experimentally, which is time-consuming and not always efficient. The final design of layered materials can then be confirmed by suitable measurements. A method of designing the overall acoustic absorption of multiple layered porous materials is presented. Some aspects, based on the material properties, of designing a flat layered absorbing system are considered. Good agreement between measured and computed sound absorption coefficients has been obtained for the studied configurations. [Work supported by N.S.E.R.C. Canada, F.C.A.R. Quebec, and Bombardier Aerospace.]
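The repeated application of the single-layer impedance equation can be sketched with the standard impedance-translation formula. Each layer's characteristic impedance Zc and complex wavenumber k are taken here as given inputs (in practice they would come from the Johnson-Champoux-Allard model), and the numerical values below are placeholders, not measured material data:

```python
import cmath

RHO0_C0 = 413.3   # characteristic impedance of air at ~20 C (Pa*s/m)

def surface_impedance(layers):
    """Translate the surface impedance through a stack of porous layers.

    layers: list of (Zc, k, d) ordered from the rigid backing outward,
    where Zc is the layer's characteristic impedance, k its complex
    wavenumber and d its thickness.
    """
    # Rigid-backed single layer: Z = -i * Zc / tan(k * d)
    Zc, k, d = layers[0]
    Z = -1j * Zc / cmath.tan(k * d)
    # Each further layer transforms the backing impedance Z into a new
    # surface impedance via the impedance-translation formula.
    for Zc, k, d in layers[1:]:
        t = cmath.tan(k * d)
        Z = Zc * (Z + 1j * Zc * t) / (Zc + 1j * Z * t)
    return Z

def absorption(Z):
    """Normal-incidence absorption coefficient from surface impedance."""
    R = (Z - RHO0_C0) / (Z + RHO0_C0)
    return 1.0 - abs(R) ** 2

# Made-up two-layer stack at one frequency (all values are placeholders):
stack = [(900 - 300j, 60 - 15j, 0.02),    # 20 mm layer at the wall
         (600 - 150j, 45 - 10j, 0.03)]    # 30 mm facing layer
alpha = absorption(surface_impedance(stack))
print(round(alpha, 3))
```

Sweeping this calculation over frequency and over candidate layer orderings is exactly the kind of systematic computer evaluation the abstract describes.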

  11. The healthy men study: design and recruitment considerations for environmental epidemiologic studies in male reproductive health

    EPA Science Inventory

    Study Objective: To describe study conduct and response and participant characteristics. Design: Prospective cohort study. Setting: Participants were male partners of women enrolled in a community-based study of drinking water disinfection by-products and pregnancy healt...

  12. [Design of software and hardware complex of automated systems for the management of the State Sanitary and Epidemiological Surveillance].

    PubMed

    Mel'nichenko, P I; Muzychenko, F V; Malinovskiĭ, A A; Leont'ev, L Iu; Ustiukhin, N V

    2005-05-01

A new information system (IS) - the software and hardware complex for controlling the State Sanitary and Epidemiological Inspection (SSEI) - was created. The system represents the aggregate of automated working places of the RF MD chief state sanitary physician and specialists from the department of state sanitary-and-epidemiological inspection of the Main Military Medical Headquarters. They interact through communications with the working places of specialists from the SSEI Main Center and of the chief state sanitary physicians of the Armed Forces, military districts (fleets) and RFAF CSSEI. The special software provides automation of the following technological processes: operative sanitary-and-epidemiological monitoring; epidemiological analysis of infectious diseases; and evaluation of the quality and efficiency of sanitary-and-epidemiological work. At present the complex works in the regime of experimental exploitation, while the adjustment of the communications and special software is performed.

  13. Methods for structural design at elevated temperatures

    NASA Technical Reports Server (NTRS)

    Ellison, A. M.; Jones, W. E., Jr.; Leimbach, K. R.

    1973-01-01

A procedure which can be used to design elevated temperature structures is discussed. The desired goal is to have the same confidence in the structural integrity at elevated temperature as the factor of safety gives on mechanical loads at room temperature. Methods of design and analysis for creep, creep rupture, and creep buckling are presented. Example problems are included to illustrate the analytical methods. Creep data for some common structural materials are presented. Appendix B is a description, user's manual, and listing for the creep analysis program. The program predicts time to a given creep strain or to creep rupture for a material subjected to a specified stress-temperature-time spectrum. Fatigue at elevated temperature is discussed. Methods of analysis for high stress-low cycle fatigue, fatigue below the creep range, and fatigue in the creep range are included. The interaction of thermal fatigue and mechanical loads is considered, and a detailed approach to fatigue analysis is given for structures operating below the creep range.
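The rupture prediction described for the Appendix B program can be sketched with the time-fraction (Robinson life-fraction) rule driven by a Larson-Miller rupture correlation. The correlation coefficients and the duty cycle below are invented for illustration, not data for any real alloy:

```python
import math

def rupture_time_hours(stress_mpa, temp_k, C=20.0):
    """Time to rupture from a made-up Larson-Miller correlation.

    LMP = T * (C + log10(t_r)); the LMP(stress) fit below is a
    placeholder, not a fit to any real creep-rupture data.
    """
    lmp = 26000.0 - 3500.0 * math.log10(stress_mpa)   # hypothetical fit
    return 10.0 ** (lmp / temp_k - C)

def creep_damage(spectrum):
    """Robinson's life-fraction rule: damage = sum(t_i / t_r_i).

    spectrum: list of (hours, stress_mpa, temp_k) blocks; rupture is
    predicted when the accumulated damage reaches 1.0.
    """
    return sum(hours / rupture_time_hours(s, T) for hours, s, T in spectrum)

# Duty cycle: a long hold at moderate stress plus a short hotter transient.
spectrum = [(5000.0, 80.0, 800.0), (10.0, 140.0, 820.0)]
D = creep_damage(spectrum)
print(D)   # damage < 1.0 means rupture is not predicted for this spectrum
```

Stepping through a stress-temperature-time spectrum block by block and accumulating life fractions in this way is the usual bookkeeping behind such creep-rupture programs.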

  14. Direct optimization method for reentry trajectory design

    NASA Astrophysics Data System (ADS)

    Jallade, S.; Huber, P.; Potti, J.; Dutruel-Lecohier, G.

The software package called `Reentry and Atmospheric Transfer Trajectory' (RATT) was developed under ESA contract for the design of atmospheric trajectories. It includes four programs. 6FD and 3FD (6 and 3 degrees of freedom Flight Dynamics) are devoted to the simulation of the trajectory. SCA (Sensitivity and Covariance Analysis) performs covariance analysis on a given trajectory with respect to different uncertainties and error sources. TOP (Trajectory OPtimization) provides the optimum guidance law of a three-degree-of-freedom reentry or aeroassisted transfer (AAOT) trajectory. Deorbit and reorbit impulses (if necessary) can be taken into account in the optimization. A wide choice of cost functions is available to the user, such as the integrated heat flux, the sum of the velocity impulses, or a linear combination of both, for trajectory and vehicle design. The crossrange and the downrange can be maximized during the reentry trajectory. Path constraints are available on the load factor, the heat flux and the dynamic pressure. Results on these proposed options are presented. TOPPHY is the part of the TOP software corresponding to the definition and computation of the optimization problem physics. TOPPHY can interface with several optimizers with dynamic solvers: TOPOP and TROPIC, using direct collocation methods, and PROMIS, using a direct multiple shooting method. TOPOP was developed in the frame of this contract; it uses Hermite polynomials for the collocation method and the NPSOL optimizer from the NAG library. Both TROPIC and PROMIS were developed by the DLR (Deutsche Forschungsanstalt fuer Luft und Raumfahrt) and use the SLSQP optimizer. For the resolution of the dynamic equations, TROPIC uses a collocation method with splines and PROMIS uses a multiple shooting method with finite differences. The three different optimizers including dynamics were tested on the reentry trajectory of the

  15. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  16. Cultural epidemiology of pandemic influenza in urban and rural Pune, India: a cross-sectional, mixed-methods study

    PubMed Central

    Sundaram, Neisha; Schaetti, Christian; Purohit, Vidula; Kudale, Abhay; Weiss, Mitchell G

    2014-01-01

    Objective To identify and compare sociocultural features of pandemic influenza with reference to illness-related experience, meaning and behaviour in urban and rural areas of India. Design Cross-sectional, mixed-methods, cultural epidemiological survey with vignette-based interviews. Semistructured explanatory model interviews were used to study community ideas of the 2009 influenza pandemic. In-depth interviews elaborated respondents’ experience during the pandemic. Setting Urban and rural communities, Pune district, western India. Participants Survey of urban (n=215) and rural (n=221) residents aged between 18 and 65 years. In-depth interviews of respondents with a history of 2009 pandemic influenza (n=6). Results More urban (36.7%) than rural respondents (16.3%, p<0.001) identified the illness in the vignette as ‘swine flu’. Over half (56.7%) believed the illness would be fatal without treatment, but with treatment 96% predicted full recovery. Worry (‘tension’) about the illness was reported as more troubling than somatic symptoms. The most common perceived causes—‘exposure to a dirty environment’ and ‘cough or sneeze of an infected person’–were more prominent in the urban group. Among rural respondents, climatic conditions, drinking contaminated water, tension and cultural ideas on humoral imbalance from heat-producing or cold-producing foods were more prominent. The most widely reported home treatment was herbal remedies; more rural respondents suggested reliance on prayer, and symptom relief was more of a priority for urban respondents. Government health services were preferred in the urban communities, and rural residents relied more than urban residents on private facilities. The important preventive measures emphasised were cleanliness, wholesome lifestyle and vaccines, and more urban respondents reported the use of masks. In-depth interviews indicated treatment delays during the 2009 pandemic, especially among rural patients

  17. Foot-and-mouth disease in pigs: current epidemiological situation and control methods.

    PubMed

    León, Emilio A

    2012-03-01

    Foot-and-mouth disease (FMD) is the paradigm of a transboundary animal disease and is, beyond any doubt, the most serious challenge to livestock health. Official Veterinary Services in free countries invest considerable amounts of money to prevent its introduction, whereas those in endemic countries invest most of their resources in controlling the disease. A substantial volume of scientific work on different aspects of FMD is published every year and, as a result, current knowledge has made diagnosis of the disease considerably easier. However, FMD is still endemic in about two-thirds of the world's countries and periodically re-emerges in several others. This paper is a review of recent publications, focusing mainly on control measures and the current world epidemiological situation, with particular emphasis on pigs. PMID:22225815

  18. Surveillance in a Telemedicine Setting: Application of Epidemiologic Methods at NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Babiak-Vazquez, Adriana; Ruffaner, Lanie; Wear, Mary; Crucian, Brian; Sams, Clarence; Lee, Lesley R.; Van Baalen, Mary

    2016-01-01

    Space medicine presents unique challenges and opportunities for epidemiologists, such as the use of telemedicine during spaceflight. Medical capabilities aboard the International Space Station (ISS) are limited due to severe restrictions on power, volume, and mass. Consequently, inflight health information is based heavily on crewmember (CM) self-report of signs and symptoms, rather than formal diagnoses. While CMs are in flight, the primary source of crew health information is verbal communication between physicians and crewmembers. In 2010, NASA implemented the Lifetime Surveillance of Astronaut Health, an occupational surveillance program for the U.S. Astronaut corps. This has shifted the epidemiological paradigm from tracking diagnoses based on traditional terrestrial clinical practice to one that incorporates symptomatology and may enable a more population-based understanding of early disease processes.

  19. Three-area epidemiological study of geographic differences in stroke mortality. I. Background and methods.

    PubMed

    Nefzger, M D; Kuller, L H; Lilienfeld, A M; Diamond, E L; Miller, G D; Stolley, P D; Tonascia, S

    1977-01-01

    An epidemiological study was conducted to determine the geographical variations in stroke mortality among three U.S. areas: Savannah, Georgia (high stroke rates); Hagerstown, Maryland (intermediate stroke rates); and Pueblo, Colorado (low stroke rates). In each area, samples were drawn of the population in the 35-54 age group. The subjects were interviewed and examined to obtain the information required on medical conditions and/or living habits which would characterize each area. A brief medical and family history, as well as demographic and personal data, were obtained by interview. The medical examination included blood pressure, ECG, blood and urine chemistry, and height and weight. In all three cities, 90% of the final sample (2,375 individuals) were interviewed and 74% (1,939 individuals) were examined.

  20. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs), to describe the dynamics of how a disease may spread within a population and to enable the rational design of intervention strategies that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in the trust that can be placed in outcomes/predictions based solely on such models. Hence, there is a need to develop approaches by which mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions when such retrospective analysis cannot be performed. In this paper, we outline an approach to developing uncertainty quantification for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
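
The abstract above targets formal probabilistic model checking; as a much simpler illustration of the underlying uncertainty-quantification problem, one can propagate interval uncertainty in the parameters of a basic SIR model by Monte Carlo sampling and estimate the probability that a property (here, peak infected fraction exceeding a threshold) holds. All parameter ranges and the threshold below are illustrative assumptions, not values from the paper.

```python
import random

def sir_peak_infected(beta, gamma, i0=0.01, days=160, dt=0.1):
    """Forward-Euler integration of a basic SIR model; returns peak infected fraction."""
    s, i = 1.0 - i0, i0
    peak = i
    for _ in range(int(days / dt)):
        new_infections = beta * s * i * dt
        recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - recoveries
        peak = max(peak, i)
    return peak

def prob_peak_exceeds(threshold=0.2, n_samples=500, seed=42):
    """Crude Monte Carlo 'statistical model check': estimate the probability that
    the peak infected fraction exceeds `threshold` when beta and gamma are known
    only up to intervals (ranges here are illustrative, not from the paper)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        beta = rng.uniform(0.2, 0.5)     # transmission rate (assumed range)
        gamma = rng.uniform(0.05, 0.15)  # recovery rate (assumed range)
        if sir_peak_infected(beta, gamma) > threshold:
            hits += 1
    return hits / n_samples

print(prob_peak_exceeds())
```

Formal model checking, as the paper proposes, would replace this sampled property test with a specification in a spatio-temporal logic, but the quantity estimated is analogous: the probability that the model satisfies the property under parameter uncertainty.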

  1. Circular epidemiology.

    PubMed

    Kuller, L H

    1999-11-01

    Circular epidemiology can be defined as the continuation of specific types of epidemiologic studies beyond the point of reasonable doubt of the true existence of an important association or the absence of such an association. Circular epidemiology is an extreme example of studies of the consistency of associations. A basic problem for epidemiology is the lack of a systematic approach to acquiring new knowledge to reach a goal of improving public health and preventive medicine. For epidemiologists, research support unfortunately is biased toward the continued study of already proven hypotheses. Circular epidemiology thus freezes at one point in the evolution of epidemiologic studies, failing to move from descriptive studies to analytical case-control and longitudinal studies and, ultimately, to experimental clinical trials. Good epidemiology journals are filled with very well-conducted epidemiologic studies that primarily repeat the obvious or are variations on a theme.

  2. Epidemiologic Methods Lessons Learned from Environmental Public Health Disasters: Chernobyl, the World Trade Center, Bhopal, and Graniteville, South Carolina

    PubMed Central

    Svendsen, Erik R.; Runkle, Jennifer R.; Dhara, Venkata Ramana; Lin, Shao; Naboka, Marina; Mousseau, Timothy A.; Bennett, Charles

    2012-01-01

    Background: Environmental public health disasters involving hazardous contaminants may have devastating effects. While much is known about their immediate devastation, far less is known about the long-term impacts of these disasters. Extensive latent and chronic long-term public health effects may occur. Careful evaluation of contaminant exposures and long-term health outcomes within the constraints imposed by limited financial resources is essential. Methods: Here, we review epidemiologic methods lessons learned from conducting long-term evaluations of four environmental public health disasters involving hazardous contaminants at Chernobyl, the World Trade Center, Bhopal, and Graniteville (South Carolina, USA). Findings: We identified several lessons with direct implications for the ongoing disaster recovery work following the Fukushima radiation disaster and for future disasters. Interpretation: These lessons should prove useful in understanding and mitigating latent health effects that may result from the nuclear reactor accident in Japan or future environmental public health disasters. PMID:23066404

  3. A structural design decomposition method utilizing substructuring

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.

    1994-01-01

    A new method of design decomposition for structural analysis and optimization is described. For this method, the structure is divided into substructures where each substructure has its structural response described by a structural-response subproblem, and its structural sizing determined from a structural-sizing subproblem. The structural responses of substructures that have rigid body modes when separated from the remainder of the structure are further decomposed into displacements that have no rigid body components, and a set of rigid body modes. The structural-response subproblems are linked together through forces determined within a structural-sizing coordination subproblem which also determines the magnitude of any rigid body displacements. Structural-sizing subproblems having constraints local to the substructures are linked together through penalty terms that are determined by a structural-sizing coordination subproblem. All the substructure structural-response subproblems are totally decoupled from each other, as are all the substructure structural-sizing subproblems, thus there is significant potential for use of parallel solution methods for these subproblems.

  4. Design factors in epidemiologic cohort studies of work-related low back injury or pain.

    PubMed

    Kraus, J F; Gardner, L; Collins, J; Sorock, G; Volinn, E

    1997-08-01

    The connection between work-related exposures and the onset of back injury or pain is complex and not clearly understood. This paper raises design issues related to the planning and conduct of cohort studies of industrial low back pain or injury (LBP), with care given to the definition and measurement of exposure and outcome events. These issues include sample size, outcome definition, study biases, and practical considerations when seeking and maintaining company collaboration with a research effort. While not resolving these issues, the authors conclude: (1) cohort studies of worksite-based LBP are needed to elucidate the causal associations between work tasks and LBP onset, (2) both acute and cumulative exposures should be assessed as risk factors for low back injury or pain, and (3) attention should be paid to the planning of such studies and the minimization of potential biases that can limit the validity of the results. Consideration of these design issues will benefit researchers and companies engaged in the planning and conduct of cohort studies of industrial LBP.

  5. Research and Design of Rootkit Detection Method

    NASA Astrophysics Data System (ADS)

    Liu, Leian; Yin, Zuanxing; Shen, Yuli; Lin, Haitao; Wang, Hongjiang

    Rootkits are among the most serious security issues in network communication systems, affecting the security and privacy of Internet users. Because of back doors in the operating system, a hacker can use a rootkit to attack and invade other people's computers, easily capturing passwords and message traffic to and from those machines. As rootkit technology has developed, its applications have become more extensive and it has become increasingly difficult to detect. In addition, for various reasons such as trade secrecy and the difficulty of development, both information on rootkit detection technology and effective tools are still relatively scarce. In this paper, based on an in-depth analysis of rootkit detection technology, a new rootkit detection structure is designed and a new method (software), X-Anti, is proposed. Test results show that software based on the proposed structure is considerably more efficient than other rootkit detection software.

  6. Method for designing gas tag compositions

    DOEpatents

    Gross, K.C.

    1995-04-11

    For use in the manufacture of gas tags such as employed in a nuclear reactor gas tagging failure detection system, a method for designing gas tagging compositions utilizes an analytical approach wherein the final composition of a first canister of tag gas as measured by a mass spectrometer is designated as node No. 1. Lattice locations of tag nodes in multi-dimensional space are then used in calculating the compositions of a node No. 2 and each subsequent node so as to maximize the distance of each node from any combination of tag components which might be indistinguishable from another tag composition in a reactor fuel assembly. Alternatively, the measured compositions of tag gas numbers 1 and 2 may be used to fix the locations of nodes 1 and 2, with the locations of nodes 3-N then calculated for optimum tag gas composition. A single sphere defining the lattice locations of the tag nodes may be used to define approximately 20 tag nodes, while concentric spheres can extend the number of tag nodes to several hundred. 5 figures.
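
The patent abstract describes choosing each tag-gas composition to maximize its distance from any combination that could be confused with another tag. A hedged sketch of that idea is greedy farthest-point selection over candidate compositions; the grid, component count, and Euclidean distance metric below are illustrative assumptions, not the patent's actual lattice construction.

```python
import math

def place_tag_nodes(candidates, n_tags, first_node):
    """Greedy farthest-point selection: starting from the measured composition of
    tag node No. 1, repeatedly pick the candidate composition that maximizes the
    minimum distance to all nodes chosen so far. Compositions are modeled as
    tuples of isotope fractions (hypothetical representation)."""
    chosen = [first_node]
    pool = [c for c in candidates if c != first_node]
    for _ in range(n_tags - 1):
        best = max(pool, key=lambda c: min(math.dist(c, p) for p in chosen))
        chosen.append(best)
        pool.remove(best)
    return chosen

# Candidate compositions on a coarse 3-component grid (fractions summing to 1).
grid = [(a / 4, b / 4, (4 - a - b) / 4)
        for a in range(5) for b in range(5 - a)]
nodes = place_tag_nodes(grid, n_tags=4, first_node=(0.25, 0.25, 0.5))
print(nodes)
```

Fixing the first one or two nodes from measured compositions and optimizing the rest, as the patent describes, corresponds here to seeding `chosen` with those measured points before the greedy loop runs.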

  7. Genetic epidemiology utilizing the adoption method: studies of obesity and of premature death in adults.

    PubMed

    Sørensen, T I

    1991-03-01

    Genetic epidemiology gives no priority to genes or environment in the search for disease causation. However, a major problem in this field is disentangling the effects of environment and genes. The study of subjects separated very early in life from their biologic parents and adopted by unrelated parents provides a strong tool for estimating genetic and familial environmental influences. The degree to which the trait or disease frequency of the adoptees is similar to that seen among their biologic relatives is an indication of the strength of the genetic influence. Similarity to the adoptive relatives suggests influences of the family environment shared between them. Adoption studies of adult obesity show that it is genes, and not the family environment, that are responsible for the familial aggregation of obesity. A study of the mortality of adult adoptees and their biologic and adoptive parents indicates a genetic influence on the risk of premature death from all causes, from natural causes, infections, and cardio- and cerebrovascular conditions, and suggests familial environmental influences on death from vascular causes and from cancer.

  8. Measuring sun exposure in epidemiological studies: Matching the method to the research question.

    PubMed

    King, Laura; Xiang, Fan; Swaminathan, Ashwin; Lucas, Robyn M

    2015-12-01

    Sun exposure has risks and benefits for health. Testing these associations requires tools for measuring sun exposure that are feasible and relevant to the time-course of the health outcome. Recent sun exposure, e.g. the last week, is best captured by dosimeters and sun diaries. These can also be used for medium-term sun exposure e.g. over several weeks, but incur a high participant burden. Self-reported data on "typical time outdoors" for working and non-working days, is less detailed and not influenced by day-to-day variation. Over a longer period, e.g. the lifetime, or for particular life stages, proxies of sun exposure, such as latitude of residence or ambient ultraviolet (UV) radiation levels (from satellites or ground-level monitoring) can be used, with additional detail provided by lifetime sun exposure calendars that include locations of residence, usual time outdoors, and detail of sunburn episodes. Objective measures of lifetime sun exposure include microtopography of sun-exposed skin (e.g. using silicone casts) or conjunctival UV autofluorescence. Potential modifiers of the association between sun exposure and the health outcome, such as clothing coverage and skin colour, may also need to be measured. We provide a systematic approach to selecting sun exposure measures for use in epidemiological health research.

  10. The effect of cage and house design on egg production and egg weight of White Leghorn hens: an epidemiological study.

    PubMed

    Garner, J P; Kiess, A S; Mench, J A; Newberry, R C; Hester, P Y

    2012-07-01

    Hen performance can be affected by many interacting variables related to cage design, such as floor area, height, tier arrangement, and feeder and drinker type and placement within the cage. Likewise, features of house design such as waste management and lighting can also affect hen productivity. The influence of these design aspects on hen performance has not been fully assessed. Determining the effects of numerous, interacting variables is impractical in a traditional experiment; therefore, an epidemiological approach, using variability in cage and house design among and within commercial producers, was employed to identify features that affect egg production and egg weight. A universal cage measurement system was created to calculate cage design variables. A database for recording information on cage design, resource location, waste management, environmental conditions, and hen productivity was developed. Production outcomes were assessed from placement to 60 wk of age in White Leghorns (n = 165-168 houses). Using GLM, a statistical model was identified that best described the variance in egg traits. Eggs/hen-housed increased with greater feeder space allocation (P = 0.031); taller cages (P = 0.029); rear (vs. front) drinker location in vertical cages (P = 0.026); and regular removal of manure from the house (P = 0.005). Case weight of eggs was greater in A-frame houses where manure was removed regularly instead of being left in the house (P < 0.001); with increasing cage floor slope (P = 0.001); in cages where drinkers were placed more toward the front or back of the cage as compared with the middle of the cage (P < 0.001); with more space/hen (P = 0.024); and with higher caloric intake (P < 0.001). Perhaps because of its negative correlation with egg production, case weight of eggs increased with less feeder space allocation (P = 0.004) and shorter cage heights (P < 0.001). These results reveal important effects of feeder space, floor space, cage height

  11. Design Process Guide Method for Minimizing Loops and Conflicts

    NASA Astrophysics Data System (ADS)

    Koga, Tsuyoshi; Aoyama, Kazuhiro

    We propose a new guide method for developing an easy-to-design process for product development. This process ensures a smaller number of wasteful iterations and less multiple conflicts. The design process is modeled as a sequence of design decisions. A design decision is defined as the process of determination of product attributes. A design task is represented as a calculation flow that depends on the product constraints between the product attributes. We also propose an automatic planning algorithm for the execution of the design task, in order to minimize the design loops and design conflicts. Further, we validate the effectiveness of the proposed guide method by developing a prototype design system and a design example of piping for a power steering system. We find that the proposed method can successfully minimize design loops and design conflicts. This paper addresses (1) a design loop model, (2) a design conflict model, and (3) how to minimize design loops and design conflicts.
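
The abstract models design as a sequence of decisions that determine product attributes, planned so as to minimize loops. One simple way to realize such planning, assuming the attribute dependencies form a directed graph, is a topological sort that orders the decisions and exposes any dependency cycle as an unavoidable design loop; the scheme and the example attributes below are hypothetical, not the authors' algorithm.

```python
from collections import defaultdict, deque

def plan_design_sequence(dependencies):
    """Order design decisions (attribute determinations) so that every attribute
    is fixed before the attributes that depend on it; any attributes left over
    sit on a dependency cycle, i.e. a design loop requiring iteration.
    `dependencies` maps each attribute to the attributes it requires."""
    indeg = defaultdict(int)
    users = defaultdict(list)
    nodes = set(dependencies)
    for attr, reqs in dependencies.items():
        nodes.update(reqs)
        for r in reqs:
            indeg[attr] += 1
            users[r].append(attr)
    queue = deque(sorted(n for n in nodes if indeg[n] == 0))
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for u in users[n]:
            indeg[u] -= 1
            if indeg[u] == 0:
                queue.append(u)
    loop = sorted(nodes - set(order))  # attributes stuck in a cycle, if any
    return order, loop

# Hypothetical piping example: pipe diameter needs pump pressure,
# pump pressure needs the required flow, and the flow is given.
seq, loop = plan_design_sequence({
    "diameter": ["pressure"],
    "pressure": ["flow"],
    "flow": [],
})
print(seq, loop)
```

A non-empty `loop` signals attributes that cannot be sequenced without iteration, which is exactly the situation a design-process guide would try to restructure away.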

  12. The HIV prevention cascade: integrating theories of epidemiological, behavioural, and social science into programme design and monitoring.

    PubMed

    Hargreaves, James R; Delany-Moretlwe, Sinead; Hallett, Timothy B; Johnson, Saul; Kapiga, Saidi; Bhattacharjee, Parinita; Dallabetta, Gina; Garnett, Geoff P

    2016-07-01

    Theories of epidemiology, health behaviour, and social science have changed the understanding of HIV prevention in the past three decades. The HIV prevention cascade is emerging as a new approach to guide the design and monitoring of HIV prevention programmes in a way that integrates these multiple perspectives. This approach recognises that translating the efficacy of direct mechanisms that mediate HIV prevention (including prevention products, procedures, and risk-reduction behaviours) into population-level effects requires interventions that increase coverage. An HIV prevention cascade approach suggests that high coverage can be achieved by targeting three key components: demand-side interventions that improve risk perception and awareness and acceptability of prevention approaches; supply-side interventions that make prevention products and procedures more accessible and available; and adherence interventions that support ongoing adoption of prevention behaviours, including those that do and do not involve prevention products. Programmes need to develop delivery platforms that ensure these interventions reach target populations, to shape the policy environment so that it facilitates implementation at scale with high quality and intensity, and to monitor the programme with indicators along the cascade. PMID:27365206

  13. Adjoint methods for aerodynamic wing design

    NASA Technical Reports Server (NTRS)

    Grossman, Bernard

    1993-01-01

    A model inverse design problem is used to investigate the effect of flow discontinuities on the optimization process. The optimization involves finding the cross-sectional area distribution of a duct that produces velocities that closely match a targeted velocity distribution. Quasi-one-dimensional flow theory is used, and the target is chosen to have a shock wave in its distribution. The objective function which quantifies the difference between the targeted and calculated velocity distributions may become non-smooth due to the interaction between the shock and the discretization of the flowfield. This paper offers two techniques to resolve the resulting problems for the optimization algorithms. The first, shock-fitting, involves careful integration of the objective function through the shock wave. The second, coordinate straining with shock penalty, uses a coordinate transformation to align the calculated shock with the target and then adds a penalty proportional to the square of the distance between the shocks. The techniques are tested using several popular sensitivity and optimization methods, including finite-differences, and direct and adjoint discrete sensitivity methods. Two optimization strategies, Gauss-Newton and sequential quadratic programming (SQP), are used to drive the objective function to a minimum.

  14. Game Methodology for Design Methods and Tools Selection

    ERIC Educational Resources Information Center

    Ahmad, Rafiq; Lahonde, Nathalie; Omhover, Jean-françois

    2014-01-01

    Design process optimisation and intelligence are the key words of today's scientific community. A proliferation of methods has made design a convoluted area. Designers are usually afraid of selecting one method/tool over another and even expert designers may not necessarily know which method is the best to use in which circumstances. This…

  15. Global Dissemination of Carbapenemase-Producing Klebsiella pneumoniae: Epidemiology, Genetic Context, Treatment Options, and Detection Methods

    PubMed Central

    Lee, Chang-Ro; Lee, Jung Hun; Park, Kwang Seung; Kim, Young Bae; Jeong, Byeong Chul; Lee, Sang Hee

    2016-01-01

    The emergence of carbapenem-resistant Gram-negative pathogens poses a serious threat to public health worldwide. In particular, the increasing prevalence of carbapenem-resistant Klebsiella pneumoniae is a major source of concern. K. pneumoniae carbapenemases (KPCs) and carbapenemases of the oxacillinase-48 (OXA-48) type have been reported worldwide. New Delhi metallo-β-lactamase (NDM) carbapenemases were originally identified in Sweden in 2008 and have spread worldwide rapidly. In this review, we summarize the epidemiology of K. pneumoniae producing three carbapenemases (KPCs, NDMs, and OXA-48-like). Although the prevalence of each resistant strain varies geographically, K. pneumoniae producing KPCs, NDMs, and OXA-48-like carbapenemases have become rapidly disseminated. In addition, we used recently published molecular and genetic studies to analyze the mechanisms by which these three carbapenemases, and major K. pneumoniae clones, such as ST258 and ST11, have become globally prevalent. Because carbapenemase-producing K. pneumoniae are often resistant to most β-lactam antibiotics and many other non-β-lactam molecules, the therapeutic options available to treat infection with these strains are limited to colistin, polymyxin B, fosfomycin, tigecycline, and selected aminoglycosides. Although combination therapy has been recommended for the treatment of severe carbapenemase-producing K. pneumoniae infections, the clinical evidence for this strategy is currently limited, and more rigorous randomized controlled trials will be required to establish the most effective treatment regimen. Moreover, because rapid and accurate identification of the carbapenemase type found in K. pneumoniae may be difficult to achieve through phenotypic antibiotic susceptibility tests, novel molecular detection techniques are currently being developed. PMID:27379038

  17. An inverse design method for 2D airfoil

    NASA Astrophysics Data System (ADS)

    Liang, Zhi-Yong; Cui, Peng; Zhang, Gen-Bao

    2010-03-01

    The computational method for the aerodynamic design of aircraft is applied more universally than before, and the design of an airfoil is a key problem within it. Most related papers discuss the forward problem, but the inverse method is more useful in practical design. In this paper, the inverse design of a 2D airfoil was investigated. A finite element method based on the variational principle was used to carry out the design. The simulation showed that the method is well suited to this design problem.

  18. Review and International Recommendation of Methods for Typing Neisseria gonorrhoeae Isolates and Their Implications for Improved Knowledge of Gonococcal Epidemiology, Treatment, and Biology

    PubMed Central

    Unemo, Magnus; Dillon, Jo-Anne R.

    2011-01-01

    Summary: Gonorrhea, which may become untreatable due to multiple resistance to available antibiotics, remains a public health problem worldwide. Precise methods for typing Neisseria gonorrhoeae, together with epidemiological information, are crucial for an enhanced understanding regarding issues involving epidemiology, test of cure and contact tracing, identifying core groups and risk behaviors, and recommending effective antimicrobial treatment, control, and preventive measures. This review evaluates methods for typing N. gonorrhoeae isolates and recommends various methods for different situations. Phenotypic typing methods, as well as some now-outdated DNA-based methods, have limited usefulness in differentiating between strains of N. gonorrhoeae. Genotypic methods based on DNA sequencing are preferred, and the selection of the appropriate genotypic method should be guided by its performance characteristics and whether short-term epidemiology (microepidemiology) or long-term and/or global epidemiology (macroepidemiology) matters are being investigated. Currently, for microepidemiological questions, the best methods for fast, objective, portable, highly discriminatory, reproducible, typeable, and high-throughput characterization are N. gonorrhoeae multiantigen sequence typing (NG-MAST) or full- or extended-length porB gene sequencing. However, pulsed-field gel electrophoresis (PFGE) and Opa typing can be valuable in specific situations, i.e., extreme microepidemiology, despite their limitations. For macroepidemiological studies and phylogenetic studies, DNA sequencing of chromosomal housekeeping genes, such as multilocus sequence typing (MLST), provides a more nuanced understanding. PMID:21734242

  19. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  20. Fast and simple epidemiological typing of Pseudomonas aeruginosa using the double-locus sequence typing (DLST) method.

    PubMed

    Basset, P; Blanc, D S

    2014-06-01

    Although the molecular typing of Pseudomonas aeruginosa is important to understand the local epidemiology of this opportunistic pathogen, it remains challenging. Our aim was to develop a simple typing method based on the sequencing of two highly variable loci. Single-strand sequencing of three highly variable loci (ms172, ms217, and oprD) was performed on a collection of 282 isolates recovered between 1994 and 2007 (from patients and the environment). As expected, the resolution of each locus alone [number of types (NT) = 35-64; index of discrimination (ID) = 0.816-0.964] was lower than that of a combination of two loci (NT = 78-97; ID = 0.966-0.971). As each pairwise combination of loci gave similar results, we selected the most robust combination, ms172 [reverse; R] and ms217 [R], to constitute the double-locus sequence typing (DLST) scheme for P. aeruginosa. This combination gave: (i) a complete genotype for 276/282 isolates (typability of 98%), (ii) 86 different types, and (iii) an ID of 0.968. Analysis of multiple isolates from the same patients or taps showed that DLST genotypes are generally stable over a period of several months. The high typability, discriminatory power, and ease of use of the proposed DLST scheme make it a method of choice for local epidemiological analyses of P. aeruginosa. Moreover, the possibility of assigning unambiguous type definitions allowed the development of an Internet database ( http://www.dlst.org ) accessible to all. PMID:24326699
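
    The index of discrimination (ID) quoted in this abstract is conventionally the Hunter-Gaston adaptation of Simpson's diversity index: the probability that two isolates drawn at random belong to different types. A minimal sketch (not the authors' code; the toy labels are invented for illustration):

```python
from collections import Counter

def discriminatory_index(type_labels):
    # Hunter-Gaston index of discrimination: probability that two
    # isolates sampled at random (without replacement) differ in type
    n = len(type_labels)
    counts = Counter(type_labels).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# toy example: 10 isolates falling into 4 types
labels = ["A"] * 4 + ["B"] * 3 + ["C"] * 2 + ["D"]
print(round(discriminatory_index(labels), 3))  # -> 0.778
```

    An ID close to 1 (such as the 0.968 reported above) means nearly every pair of isolates is distinguishable, which is what makes a scheme useful for local epidemiology.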

  1. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  2. Methods and Strategies: Derby Design Day

    ERIC Educational Resources Information Center

    Kennedy, Katheryn

    2013-01-01

    In this article the author describes the "Derby Design Day" project--a project that paired high school honors physics students with second-grade children for a design challenge and competition. The overall project goals were to discover whether collaboration in a design process would: (1) increase an interest in science; (2) enhance the…

  3. An Efficient Inverse Aerodynamic Design Method For Subsonic Flows

    NASA Technical Reports Server (NTRS)

    Milholen, William E., II

    2000-01-01

    Computational Fluid Dynamics based design methods are maturing to the point that they are beginning to be used in the aircraft design process. Many design methods, however, have demonstrated deficiencies in the leading edge region of airfoil sections. The objective of the present research is to develop an efficient inverse design method which is valid in the leading edge region. The new design method is a streamline curvature method, and a new technique is presented for modeling the variation of the streamline curvature normal to the surface. The new design method allows the surface coordinates to move normal to the surface, and has been incorporated into the Constrained Direct Iterative Surface Curvature (CDISC) design method. The accuracy and efficiency of the design method are demonstrated using both two-dimensional and three-dimensional design cases.

  4. A review of the epidemiological methods used to investigate the health impacts of air pollution around major industrial areas.

    PubMed

    Pascal, Mathilde; Pascal, Laurence; Bidondo, Marie-Laure; Cochet, Amandine; Sarter, Hélène; Stempfelet, Morgane; Wagner, Vérène

    2013-01-01

    We performed a literature review to investigate how epidemiological studies have been used to assess the health consequences of living in the vicinity of industries. 77 papers on the chronic effects of air pollution around major industrial areas were reviewed. Major health themes were cancers (27 studies), morbidity (25 studies), mortality (7 studies), and birth outcome (7 studies). Only 3 studies investigated mental health. While studies were available from many different countries, a majority of papers came from the United Kingdom, Italy, and Spain. Several studies were motivated by concerns from the population or by previous observations of an overincidence of cases. Geographical ecological designs were largely used for studying cancer and mortality, including statistical designs to quantify a relationship between health indicators and exposure. Morbidity was frequently investigated through cross-sectional surveys on the respiratory health of children. Few multicenter studies were performed. In a majority of papers, exposed areas were defined based on the distance to the industry and were located from <2 km to >20 km from the plants. Improving the exposure assessment would be an asset to future studies. Criteria to include industries in multicenter studies should be defined.

  5. A Review of the Epidemiological Methods Used to Investigate the Health Impacts of Air Pollution around Major Industrial Areas

    PubMed Central

    Pascal, Laurence; Bidondo, Marie-Laure; Cochet, Amandine; Sarter, Hélène; Stempfelet, Morgane; Wagner, Vérène

    2013-01-01

    We performed a literature review to investigate how epidemiological studies have been used to assess the health consequences of living in the vicinity of industries. 77 papers on the chronic effects of air pollution around major industrial areas were reviewed. Major health themes were cancers (27 studies), morbidity (25 studies), mortality (7 studies), and birth outcome (7 studies). Only 3 studies investigated mental health. While studies were available from many different countries, a majority of papers came from the United Kingdom, Italy, and Spain. Several studies were motivated by concerns from the population or by previous observations of an overincidence of cases. Geographical ecological designs were largely used for studying cancer and mortality, including statistical designs to quantify a relationship between health indicators and exposure. Morbidity was frequently investigated through cross-sectional surveys on the respiratory health of children. Few multicenter studies were performed. In a majority of papers, exposed areas were defined based on the distance to the industry and were located from <2 km to >20 km from the plants. Improving the exposure assessment would be an asset to future studies. Criteria to include industries in multicenter studies should be defined. PMID:23818910

  6. Evolution and social epidemiology.

    PubMed

    Nishi, Akihiro

    2015-11-01

    Evolutionary biology, which aims to explain the dynamic process of shaping the diversity of life, has not yet significantly affected thinking in social epidemiology. Current challenges in social epidemiology include understanding how social exposures can affect our biology, explaining the dynamics of society and health, and designing better interventions that are mindful of the impact of exposures during critical periods. I review how evolutionary concepts and tools, such as fitness gradient in cultural evolution, evolutionary game theory, and contemporary evolution in cancer, can provide helpful insights regarding social epidemiology.

  7. MEASUREMENT ERROR ESTIMATION AND CORRECTION METHODS TO MINIMIZE EXPOSURE MISCLASSIFICATION IN EPIDEMIOLOGICAL STUDIES: PROJECT SUMMARY

    EPA Science Inventory

    This project summary highlights recent findings from research undertaken to develop improved methods to assess potential human health risks related to drinking water disinfection byproduct (DBP) exposures.

  8. Design optimization method for Francis turbine

    NASA Astrophysics Data System (ADS)

    Kawajiri, H.; Enomoto, Y.; Kurosawa, S.

    2014-03-01

    This paper presents a design optimization system coupled with CFD. The optimization algorithm of the system employs particle swarm optimization (PSO). Blade shape design is carried out using a NURBS curve defined by a series of control points. The system was applied to the design of the stationary vanes and the runner of a higher-specific-speed Francis turbine. As the first step, single-objective optimization was performed on the stay vane profile; the second step was multi-objective optimization of the runner over a wide operating range. As a result, it was confirmed that the design system is useful for the development of hydro turbines.
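
    The abstract names particle swarm optimization as the search algorithm. A minimal, generic PSO sketch on a toy objective (not the authors' implementation, which couples the optimizer to CFD evaluations of blade shapes):

```python
import random

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    # Minimal particle swarm optimizer: each velocity blends inertia,
    # a pull toward the particle's personal best, and a pull toward
    # the swarm's global best; positions are clamped to the bounds.
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(42)  # deterministic toy run
best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=2)  # sphere function
```

    In the paper's setting the objective would be one or more CFD-derived performance measures of a NURBS-parameterized blade, with the control points as the particle coordinates.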

  9. Dietary Approaches to Stop Hypertension: rationale, design, and methods. DASH Collaborative Research Group.

    PubMed

    Vogt, T M; Appel, L J; Obarzanek, E; Moore, T J; Vollmer, W M; Svetkey, L P; Sacks, F M; Bray, G A; Cutler, J A; Windhauser, M M; Lin, P H; Karanja, N M

    1999-08-01

    Epidemiologic studies across societies have shown consistent differences in blood pressure that appear to be related to diet. Vegetarian diets are consistently associated with reduced blood pressure in observational and interventional studies, but clinical trials of individual nutrient supplements have had an inconsistent pattern of results. Dietary Approaches to Stop Hypertension (DASH) was a multicenter, randomized feeding study, designed to compare the impact on blood pressure of 3 dietary patterns. DASH was designed as a test of eating patterns rather than of individual nutrients in an effort to identify practical, palatable dietary approaches that might have a meaningful impact on reducing morbidity and mortality related to blood pressure in the general population. The objectives of this article are to present the scientific rationale for this trial, review the methods used, and discuss important design considerations and implications.

  10. Alternative methods for the design of jet engine control systems

    NASA Technical Reports Server (NTRS)

    Sain, M. K.; Leake, R. J.; Basso, R.; Gejji, R.; Maloney, A.; Seshadri, V.

    1976-01-01

    Various alternatives to linear quadratic design methods for jet engine control systems are discussed. The main alternatives are classified into two broad categories: nonlinear global mathematical programming methods and linear local multivariable frequency domain methods. Specific studies within these categories include model reduction, the eigenvalue locus method, the inverse Nyquist method, polynomial design, dynamic programming, and conjugate gradient approaches.

  11. INFLUENCE OF EXPOSURE ASSESSMENT METHOD IN AN EPIDEMIOLOGIC STUDY OF TRIHALOMETHANE EXPOSURE AND SPONTANEOUS ABORTION

    EPA Science Inventory

    Trihalomethanes are common contaminants of chlorinated drinking water. Studies of their health effects have been hampered by exposure misclassification, due in part to limitations inherent in using utility sampling records. We used two exposure assessment methods, one based on ut...

  12. Demystifying Mixed Methods Research Design: A Review of the Literature

    ERIC Educational Resources Information Center

    Caruth, Gail D.

    2013-01-01

    Mixed methods research evolved in response to the observed limitations of both quantitative and qualitative designs and is a more complex method. The purpose of this paper was to examine mixed methods research in an attempt to demystify the design thereby allowing those less familiar with its design an opportunity to utilize it in future research.…

  13. Computational Methods Applied to Rational Drug Design

    PubMed Central

    Ramírez, David

    2016-01-01

    Due to the synergic relationship between medicinal chemistry, bioinformatics and molecular simulation, the development of new, accurate computational tools for small-molecule drug design has been rising over the last years. The main result is the increased number of publications where computational techniques such as molecular docking, de novo design as well as virtual screening have been used to estimate the binding mode, site and energy of novel small molecules. In this work I review some tools which enable the study of biological systems at the atomistic level, providing relevant information and thereby enhancing the process of rational drug design. PMID:27708723

  14. A data-driven epidemiological prediction method for dengue outbreaks using local and remote sensing data

    PubMed Central

    2012-01-01

    Background Dengue is the most common arboviral disease of humans, with more than one third of the world’s population at risk. Accurate prediction of dengue outbreaks may lead to public health interventions that mitigate the effect of the disease. Predicting infectious disease outbreaks is a challenging task; truly predictive methods are still in their infancy. Methods We describe a novel prediction method utilizing Fuzzy Association Rule Mining to extract relationships between clinical, meteorological, climatic, and socio-political data from Peru. These relationships are in the form of rules. The best set of rules is automatically chosen and forms a classifier. That classifier is then used to predict future dengue incidence as either HIGH (outbreak) or LOW (no outbreak), where these values are defined as being above and below the mean previous dengue incidence plus two standard deviations, respectively. Results Our automated method built three different fuzzy association rule models. Using the first two weekly models, we predicted dengue incidence three and four weeks in advance, respectively. The third prediction encompassed a four-week period, specifically four to seven weeks from time of prediction. Using previously unused test data for the period 4–7 weeks from time of prediction yielded a positive predictive value of 0.686, a negative predictive value of 0.976, a sensitivity of 0.615, and a specificity of 0.982. Conclusions We have developed a novel approach for dengue outbreak prediction. The method is general, could be extended for use in any geographical region, and has the potential to be extended to other environmentally influenced infections. The variables used in our method are widely available for most, if not all countries, enhancing the generalizability of our method. PMID:23126401
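
    The outbreak threshold (mean previous incidence plus two standard deviations) and the reported predictive values are standard quantities that can be reproduced from a 2x2 confusion matrix. A sketch with hypothetical counts chosen only to illustrate the arithmetic (not the study's data):

```python
from statistics import mean, stdev

def outbreak_threshold(past_incidence):
    # HIGH (outbreak) if incidence exceeds mean + 2 SD of prior incidence
    return mean(past_incidence) + 2 * stdev(past_incidence)

def classification_metrics(tp, fp, tn, fn):
    # Standard 2x2 confusion-matrix summaries for HIGH/LOW predictions
    return {
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

threshold = outbreak_threshold([10, 12, 8, 14, 11])         # hypothetical weekly counts
metrics = classification_metrics(tp=8, fp=4, tn=160, fn=5)  # hypothetical counts
```

    With these toy counts the sensitivity is 8/13 (about 0.615) and the PPV is 8/12 (about 0.667), illustrating how the paper's figures of merit are computed.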

  15. Method to Select Metropolitan Areas of Epidemiologic Interest for Enhanced Air Quality Monitoring

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s current Speciation Trends Network (STN) covers most major U.S. metropolitan areas and a wide range of particulate matter (PM) constituents and gaseous co-pollutants. However, using filter-based methods, most PM constituents are measured ...

  16. Novel Microbiological and Spatial Statistical Methods to Improve Strength of Epidemiological Evidence in a Community-Wide Waterborne Outbreak

    PubMed Central

    Jalava, Katri; Rintala, Hanna; Ollgren, Jukka; Maunula, Leena; Gomez-Alvarez, Vicente; Revez, Joana; Palander, Marja; Antikainen, Jenni; Kauppinen, Ari; Räsänen, Pia; Siponen, Sallamaari; Nyholm, Outi; Kyyhkynen, Aino; Hakkarainen, Sirpa; Merentie, Juhani; Pärnänen, Martti; Loginov, Raisa; Ryu, Hodon; Kuusi, Markku; Siitonen, Anja; Miettinen, Ilkka; Santo Domingo, Jorge W.; Hänninen, Marja-Liisa; Pitkänen, Tarja

    2014-01-01

    Failures in the drinking water distribution system cause gastrointestinal outbreaks with multiple pathogens. A water distribution pipe breakage caused a community-wide waterborne outbreak in Vuorela, Finland, July 2012. We investigated this outbreak with advanced epidemiological and microbiological methods. A total of 473/2931 inhabitants (16%) responded to a web-based questionnaire. Water and patient samples were subjected to analysis of multiple microbial targets, molecular typing and microbial community analysis. Spatial analysis on the water distribution network was done and we applied a spatial logistic regression model. The course of the illness was mild. Drinking untreated tap water from the defined outbreak area was significantly associated with illness (RR 5.6, 95% CI 1.9–16.4), increasing in a dose-response manner. The closer a person lived to the water distribution breakage point, the higher the risk of becoming ill. Sapovirus, enterovirus, single Campylobacter jejuni and EHEC O157:H7 findings as well as virulence genes for EPEC, EAEC and EHEC pathogroups were detected by molecular or culture methods from the faecal samples of the patients. EPEC, EAEC and EHEC virulence genes and faecal indicator bacteria were also detected in water samples. Microbial community sequencing of contaminated tap water revealed an abundance of Arcobacter species. The polyphasic approach improved the understanding of the source of the infections, and helped to define the extent and magnitude of this outbreak. PMID:25147923

  17. Novel microbiological and spatial statistical methods to improve strength of epidemiological evidence in a community-wide waterborne outbreak.

    PubMed

    Jalava, Katri; Rintala, Hanna; Ollgren, Jukka; Maunula, Leena; Gomez-Alvarez, Vicente; Revez, Joana; Palander, Marja; Antikainen, Jenni; Kauppinen, Ari; Räsänen, Pia; Siponen, Sallamaari; Nyholm, Outi; Kyyhkynen, Aino; Hakkarainen, Sirpa; Merentie, Juhani; Pärnänen, Martti; Loginov, Raisa; Ryu, Hodon; Kuusi, Markku; Siitonen, Anja; Miettinen, Ilkka; Santo Domingo, Jorge W; Hänninen, Marja-Liisa; Pitkänen, Tarja

    2014-01-01

    Failures in the drinking water distribution system cause gastrointestinal outbreaks with multiple pathogens. A water distribution pipe breakage caused a community-wide waterborne outbreak in Vuorela, Finland, July 2012. We investigated this outbreak with advanced epidemiological and microbiological methods. A total of 473/2931 inhabitants (16%) responded to a web-based questionnaire. Water and patient samples were subjected to analysis of multiple microbial targets, molecular typing and microbial community analysis. Spatial analysis on the water distribution network was done and we applied a spatial logistic regression model. The course of the illness was mild. Drinking untreated tap water from the defined outbreak area was significantly associated with illness (RR 5.6, 95% CI 1.9-16.4), increasing in a dose-response manner. The closer a person lived to the water distribution breakage point, the higher the risk of becoming ill. Sapovirus, enterovirus, single Campylobacter jejuni and EHEC O157:H7 findings as well as virulence genes for EPEC, EAEC and EHEC pathogroups were detected by molecular or culture methods from the faecal samples of the patients. EPEC, EAEC and EHEC virulence genes and faecal indicator bacteria were also detected in water samples. Microbial community sequencing of contaminated tap water revealed an abundance of Arcobacter species. The polyphasic approach improved the understanding of the source of the infections, and helped to define the extent and magnitude of this outbreak.
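
    The risk ratio with its 95% confidence interval reported above is a standard cohort-study quantity. A minimal sketch using the log-scale (Katz) interval, with hypothetical counts chosen only to illustrate the arithmetic (not the study's data):

```python
import math

def relative_risk(a, n1, c, n0, z=1.96):
    # a/n1: cases/total among the exposed; c/n0: cases/total among the
    # unexposed. Returns RR with a 95% CI computed on the log scale.
    rr = (a / n1) / (c / n0)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# hypothetical counts: 60/200 ill among exposed, 6/112 among unexposed
rr, lo, hi = relative_risk(a=60, n1=200, c=6, n0=112)  # rr == 5.6
```

    A CI excluding 1 (as in the abstract's 1.9-16.4) indicates a statistically significant association between the exposure and illness.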

  18. Statistical methods in epidemiology: Karl Pearson, Ronald Ross, Major Greenwood and Austin Bradford Hill, 1900-1945.

    PubMed

    Hardy, Anne; Magnello, M Eileen

    2002-01-01

    The tradition of epidemiological study through observation and the use of vital statistics dates back to the 18th century in Britain. At the close of the 19th century, however, a new and more sophisticated statistical approach emerged, from a base in the discipline of mathematics, which was eventually to transform the practice of epidemiology. This paper traces the evolution of that new analytical approach within English epidemiology through the work of four key contributors to its inception and establishment within the wider discipline. PMID:12134737

  19. Supersonic biplane design via adjoint method

    NASA Astrophysics Data System (ADS)

    Hu, Rui

    In developing the next generation supersonic transport airplane, two major challenges must be resolved. The fuel efficiency must be significantly improved, and the sonic boom propagating to the ground must be dramatically reduced. Both of these objectives can be achieved by reducing the shockwaves formed in supersonic flight. The Busemann biplane is famous for using favorable shockwave interaction to achieve nearly shock-free supersonic flight at its design Mach number. Its performance at off-design Mach numbers, however, can be very poor. This dissertation studies the performance of supersonic biplane airfoils at design and off-design conditions. The choked-flow and flow-hysteresis phenomena of these biplanes are studied. These effects are due to the finite thickness of the airfoils and the non-uniqueness of the solution to the Euler equations, creating over an order of magnitude more wave drag than that predicted by supersonic thin airfoil theory. As a result, the off-design performance is the major barrier to the practical use of supersonic biplanes. The main contribution of this work is to drastically improve the off-design performance of supersonic biplanes by using an adjoint-based aerodynamic optimization technique. The Busemann biplane is used as the baseline design, and its shape is altered to achieve optimal wave drag in a series of Mach numbers ranging from 1.1 to 1.7, during both acceleration and deceleration conditions. The optimized biplane airfoils dramatically reduce the effects of the choked-flow and flow-hysteresis phenomena, while maintaining a certain degree of favorable shockwave interaction effects at the design Mach number. Compared to a diamond-shaped single airfoil of the same total thickness, the wave drag of our optimized biplane is lower at almost all Mach numbers, and is significantly lower at the design Mach number. In addition, by performing a Navier-Stokes solution for the optimized airfoil, it is verified that the optimized biplane improves

  20. Ecogeographic Genetic Epidemiology

    PubMed Central

    Sloan, Chantel D.; Duell, Eric J.; Shi, Xun; Irwin, Rebecca; Andrew, Angeline S.; Williams, Scott M.; Moore, Jason H.

    2009-01-01

    Complex diseases such as cancer and heart disease result from interactions between an individual's genetics and environment, i.e. their human ecology. Rates of complex diseases have consistently demonstrated geographic patterns of incidence, or spatial “clusters” of increased incidence relative to the general population. Likewise, genetic subpopulations and environmental influences are not evenly distributed across space. Merging appropriate methods from genetic epidemiology, ecology and geography will provide a more complete understanding of the spatial interactions between genetics and environment that result in spatial patterning of disease rates. Geographic Information Systems (GIS), which are tools designed specifically for dealing with geographic data and performing spatial analyses to determine their relationship, are key to this kind of data integration. Here the authors introduce a new interdisciplinary paradigm, ecogeographic genetic epidemiology, which uses GIS and spatial statistical analyses to layer genetic subpopulation and environmental data with disease rates and thereby discern the complex gene-environment interactions which result in spatial patterns of incidence. PMID:19025788

  1. Epidemiology of Candida infection. II. Application of biochemical methods for typing of Candida albicans strains.

    PubMed

    Budak, A

    1990-01-01

    Biochemical profiles of 350 C. albicans isolates from five towns in Poland and from Freiburg in Germany were determined on the basis of the nine biochemical tests of the Odds and Abbott method, the API 20 C AUX system and, additionally, a resistogram. The analysis of the strains according to Odds and Abbott's system showed that the investigated strains can be typed into 9 profile codes of common biochemical patterns. There were some differences among the profiles according to their geographical origin and the anatomical sources of the isolation. On the basis of the ability of C. albicans strains to assimilate carbon sources, the 350 isolates were categorised into 13 separate auxotrophic profiles, with the major one, 2,576,174, accounting for 81% of the total. The majority of the investigated isolates were susceptible to antifungal agents (83%). The disproportionate distribution of auxotrophic profiles limited the use of the resistogram method and API 20 C AUX as systems for typing C. albicans strains. On the other hand, the method of Odds and Abbott provides valuable criteria for typing of C. albicans. PMID:2130802

  2. Snippets From the Past: The Evolution of Wade Hampton Frost's Epidemiology as Viewed From the American Journal of Hygiene/Epidemiology

    PubMed Central

    Morabia, Alfredo

    2013-01-01

    Wade Hampton Frost, who was a Professor of Epidemiology at Johns Hopkins University from 1919 to 1938, spurred the development of epidemiologic methods. His 6 publications in the American Journal of Hygiene, which later became the American Journal of Epidemiology, comprise a 1928 Cutter lecture on a theory of epidemics, a survey-based study of tonsillectomy and immunity to Corynebacterium diphtheriae (1931), 2 papers from a longitudinal study of the incidence of minor respiratory diseases (1933 and 1935), an attack rate ratio analysis of the decline of diphtheria in Baltimore (1936), and a 1936 lecture on the age, time, and cohort analysis of tuberculosis mortality. These 6 American Journal of Hygiene/American Journal of Epidemiology papers attest that Frost's personal evolution mirrored that of the emerging “early” epidemiology: The scope of epidemiology extended beyond the study of epidemics of acute infectious diseases, and rigorous comparative study designs and their associated quantitative methods came to light. PMID:24022889

  3. Web tools for molecular epidemiology of tuberculosis.

    PubMed

    Shabbeer, Amina; Ozcaglar, Cagri; Yener, Bülent; Bennett, Kristin P

    2012-06-01

    In this study we explore publicly available web tools designed to use molecular epidemiological data to extract information that can be employed for the effective tracking and control of tuberculosis (TB). The application of molecular methods for the epidemiology of TB complement traditional approaches used in public health. DNA fingerprinting methods are now routinely employed in TB surveillance programs and are primarily used to detect recent transmissions and in outbreak investigations. Here we present web tools that facilitate systematic analysis of Mycobacterium tuberculosis complex (MTBC) genotype information and provide a view of the genetic diversity in the MTBC population. These tools help answer questions about the characteristics of MTBC strains, such as their pathogenicity, virulence, immunogenicity, transmissibility, drug-resistance profiles and host-pathogen associativity. They provide an integrated platform for researchers to use molecular epidemiological data to address current challenges in the understanding of TB dynamics and the characteristics of MTBC.

  4. Evidence-based planning and costing palliative care services for children: novel multi-method epidemiological and economic exemplar

    PubMed Central

    2013-01-01

    Background Children’s palliative care is a relatively new clinical specialty. Its nature is multi-dimensional and its delivery necessarily multi-professional. Numerous diverse public and not-for-profit organisations typically provide services and support. Because services are not centrally coordinated, they are provided in a manner that is inconsistent and incoherent. Since the first children’s hospice opened in 1982, the epidemiology of life-limiting conditions has changed with more children living longer, and many requiring transfer to adult services. Very little is known about the number of children living within any given geographical locality, costs of care, or experiences of children with ongoing palliative care needs and their families. We integrated evidence, and undertook and used novel methodological epidemiological work to develop the first evidence-based and costed commissioning exemplar. Methods Multi-method epidemiological and economic exemplar from a health and not-for-profit organisation perspective, to estimate numbers of children under 19 years with life-limiting conditions, cost current services, determine child/parent care preferences, and cost choice of end-of-life care at home. Results The exemplar locality (North Wales) had important gaps in service provision and the clinical network. The estimated annual total cost of current children’s palliative care was about £5.5 million; average annual care cost per child was £22,771 using 2007 prevalence estimates and £2,437–£11,045 using new 2012/13 population-based prevalence estimates. Using population-based prevalence, we estimate 2271 children with a life-limiting condition in the general exemplar population and around 501 children per year with ongoing palliative care needs in contact with hospital services. Around 24 children with a wide range of life-limiting conditions require end-of-life care per year. Choice of end-of-life care at home was requested, which is not currently

  5. A graph-theory method for pattern identification in geographical epidemiology – a preliminary application to deprivation and mortality

    PubMed Central

    Maheswaran, Ravi; Craigs, Cheryl; Read, Simon; Bath, Peter A; Willett, Peter

    2009-01-01

    Background Graph theoretical methods are extensively used in the field of computational chemistry to search datasets of compounds to see if they contain particular molecular sub-structures or patterns. We describe a preliminary application of a graph theoretical method, developed in computational chemistry, to geographical epidemiology in relation to testing a prior hypothesis. We tested the methodology on the hypothesis that if a socioeconomically deprived neighbourhood is situated in a wider deprived area, then that neighbourhood would experience greater adverse effects on mortality compared with a similarly deprived neighbourhood which is situated in a wider area with generally less deprivation. Methods We used the Trent Region Health Authority area for this study, which contained 10,665 census enumeration districts (CED). Graphs are mathematical representations of objects and their relationships and within the context of this study, nodes represented CEDs and edges were determined by whether or not CEDs were neighbours (shared a common boundary). The overall area in this study was represented by one large graph comprising all CEDs in the region, along with their adjacency information. We used mortality data from 1988–1998, CED level population estimates and the Townsend Material Deprivation Index as an indicator of neighbourhood level deprivation. We defined deprived CEDs as those in the top 20% most deprived in the Region. We then set out to classify these deprived CEDs into seven groups defined by increasing deprivation levels in the neighbouring CEDs. 506 (24.2%) of the deprived CEDs had five adjacent CEDs and we limited pattern development and searching to these CEDs. We developed seven query patterns and used the RASCAL (Rapid Similarity Calculator) program to carry out the search for each of the query patterns. This program used a maximum common subgraph isomorphism method which was modified to handle geographical data. Results Of the 506 deprived CEDs
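
    The query patterns described above amount to matching a deprived node and its immediate neighbours against a deprivation template. A heavily simplified pure-Python stand-in (not the RASCAL maximum-common-subgraph code; the data and category names are invented for illustration):

```python
def matches_pattern(ced, adjacency, deprivation, n_neighbours=5, min_deprived=3):
    # Simplified stand-in for one subgraph query: is this CED deprived,
    # does it have exactly n_neighbours adjacent CEDs (shared boundaries),
    # and are at least min_deprived of those neighbours also deprived?
    if deprivation[ced] != "high":
        return False
    neighbours = adjacency[ced]
    if len(neighbours) != n_neighbours:
        return False
    return sum(deprivation[n] == "high" for n in neighbours) >= min_deprived

# toy data: CED "c0" shares boundaries with five CEDs, three of them deprived
adjacency = {"c0": {"c1", "c2", "c3", "c4", "c5"}}
deprivation = {"c0": "high", "c1": "high", "c2": "high",
               "c3": "high", "c4": "low", "c5": "low"}
hit = matches_pattern("c0", adjacency, deprivation)  # True
```

    The paper's seven query patterns correspond to varying the neighbourhood deprivation requirement; the full method uses maximum common subgraph isomorphism rather than this direct neighbour count.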

  6. Epidemiology of varicocele

    PubMed Central

    Alsaikhan, Bader; Alrabeeah, Khalid; Delouya, Guila; Zini, Armand

    2016-01-01

    Varicocele is a common problem in reproductive medicine practice. A varicocele is identified in 15% of healthy men and up to 35% of men with primary infertility. The exact pathophysiology of varicoceles is not very well understood, especially regarding its effect on male infertility. We have conducted a systematic review of studies evaluating the epidemiology of varicocele in the general population and in men presenting with infertility. In this article, we have identified some of the factors that can influence the epidemiological aspects of varicoceles. We also recognize that varicocele epidemiology remains incompletely understood, and there is a need for well-designed, large-scale studies to fully define the epidemiological aspects of this condition. PMID:26763551

  7. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or in deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
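The core idea of propagating primitive-variable scatter up to a structural response can be sketched with plain Monte Carlo sampling. The stress model, scatter ranges, and allowable below are illustrative assumptions, not the report's engine-component models:

```python
# Minimal sketch: sample each primitive variable from its assumed scatter,
# push the samples through a simple stress model, and read off an
# estimated reliability against a strength limit.
import random

random.seed(0)

def stress(load, area):
    return load / area

n = 100_000
allowable = 300.0                            # hypothetical strength limit (MPa)
failures = 0
for _ in range(n):
    load = random.gauss(25_000.0, 2_000.0)   # N, assumed scatter
    area = random.gauss(100.0, 5.0)          # mm^2, assumed scatter
    if stress(load, area) > allowable:
        failures += 1
reliability = 1.0 - failures / n
print(f"estimated reliability: {reliability:.3f}")
```

With these assumed distributions the mean stress sits well below the allowable, so the estimated reliability comes out near 0.98; tightening the scatter ranges raises it further.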

  8. A comparison of digital flight control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Many variations in design methods for aircraft digital flight control have been proposed in the literature. In general, the methods fall into two categories: those where the design is done in the continuous domain (or s-plane), and those where the design is done in the discrete domain (or z-plane). This paper evaluates several variations of each category and compares them for various flight control modes of the Langley TCV Boeing 737 aircraft. Design method fidelity is evaluated by examining closed loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps except the 'uncompensated s-plane design' method which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
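The "design in the s-plane, then discretize" route the paper evaluates can be illustrated with a Tustin (bilinear) mapping of a first-order continuous element, substituting s = (2/T)(z-1)/(z+1). The pole location and sample rate below are arbitrary examples, not values from the TCV 737 study:

```python
# Discretize the continuous lag a/(s+a) with the Tustin substitution and
# run its step response as a difference equation.

def tustin_first_order(a, T):
    """Difference-equation coefficients for a/(s+a) under Tustin:
    y[k] = a1*y[k-1] + b0*u[k] + b1*u[k-1]."""
    b0 = b1 = a * T / (2 + a * T)
    a1 = (2 - a * T) / (2 + a * T)
    return b0, b1, a1

def step_response(a, T, steps):
    b0, b1, a1 = tustin_first_order(a, T)
    y, u_prev, out = 0.0, 0.0, []
    for _ in range(steps):
        u = 1.0
        y = a1 * y + b0 * u + b1 * u_prev
        u_prev = u
        out.append(y)
    return out

# At 20 samples/s the discrete step response settles at the continuous
# element's steady-state gain of 1.
resp = step_response(a=2.0, T=0.05, steps=100)
print(round(resp[-1], 3))  # 1.0
```

Rerunning with a slower sample rate (larger T) shows the fidelity loss the paper measures: the discrete response drifts away from the continuous one as T grows.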

  9. Soft Computing Methods in Design of Superalloys

    NASA Technical Reports Server (NTRS)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1996-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modelled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.
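The surrogate-plus-genetic-algorithm loop can be sketched as follows, with a toy quadratic standing in for the trained neural-network K(sub a) model; the design variables and optimum location are invented for illustration, not superalloy chemistry:

```python
# Sketch of the model-then-optimize pattern: a cheap surrogate scores a
# candidate composition, and a simple genetic algorithm searches for the
# composition minimising the predicted attack parameter.
import random

random.seed(1)

def ka_surrogate(x):
    # Hypothetical stand-in for the neural-network K_a(chemistry) model;
    # its minimum is at x = (0.3, 0.7).
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2

def genetic_minimise(fitness, n_vars, pop=40, gens=60):
    population = [[random.random() for _ in range(n_vars)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness)
        parents = population[: pop // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_vars)     # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(n_vars)          # point mutation, clipped to [0, 1]
            child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.05)))
            children.append(child)
        population = parents + children
    return min(population, key=fitness)

best = genetic_minimise(ka_surrogate, n_vars=2)
print([round(v, 2) for v in best])
```

Because the best half of the population is carried over unchanged, the best predicted K(sub a) never worsens; after 60 generations the result typically lands near the surrogate's minimum at (0.3, 0.7).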

  10. Epidemiology and changed surgical treatment methods for fractures of the distal radius

    PubMed Central

    2013-01-01

Background and purpose The incidence of fractures of the distal radius may have changed over the last decade, and operative treatment has become more common during that time. We investigated the incidence of fractures of the distal radius and changing trends in surgical treatment during the period 2004–2010. Patients and methods Registry data on 42,583 patients with a fracture of the distal radius from 2004 to 2010 were evaluated regarding diagnosis, age, sex, and surgical treatment. Results The crude incidence rate was 31 per 10⁴ person-years with a bimodal distribution. After the age of 45 years, the incidence rate in women increased rapidly and did not level off until very old age. The incidence rate in postmenopausal women was lower than previously reported. In men, the incidence was low and increased slowly until the age of 80 years, when it amounted to 31 per 10⁴ person-years. The number of surgical procedures increased by more than 40% despite the reduced incidence during the study period. In patients ≥ 18 years of age, the proportion of fractures treated with plating increased from 16% to 70%, while the use of external fixation decreased by about the same amount. Interpretation The incidence rate of distal radius fractures in postmenopausal women appears to have decreased over the last few decades. There has been a shift in surgical treatment from external fixation to open reduction and plating. PMID:23594225
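As a units check, a crude rate of about 31 per 10,000 person-years is consistent with the reported 42,583 fractures over seven years if the population at risk averaged roughly 1.96 million. That denominator is an assumed figure for illustration; the abstract does not give the registry's true population:

```python
# Back-of-envelope crude incidence rate in the paper's units
# (cases per 10^4 person-years).
cases = 42_583                  # fractures, from the abstract
years = 7                       # 2004-2010
population = 1_960_000          # assumed mean population at risk
person_years = population * years
rate = cases / person_years * 1e4
print(f"{rate:.0f} per 10^4 person-years")
```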

  11. A Seven-Year Retrospective View of a Course in Epidemiology and Biostatistics.

    ERIC Educational Resources Information Center

    Mulvihill, Michael N.; And Others

    1980-01-01

    Modifications of a course in epidemiology and biostatistics, designed to facilitate the presentation of difficult material in a clinically relevant manner, are described. Key strategies include seminar sessions devoted to methods of epidemiology and the critique of pairs of published studies, and the use of a course-specific syllabus. (JMD)

  12. The Triton: Design concepts and methods

    NASA Technical Reports Server (NTRS)

    Meholic, Greg; Singer, Michael; Vanryn, Percy; Brown, Rhonda; Tella, Gustavo; Harvey, Bob

    1992-01-01

    During the design of the C & P Aerospace Triton, a few problems were encountered that necessitated changes in the configuration. After the initial concept phase, the aspect ratio was increased from 7 to 7.6 to produce a greater lift to drag ratio (L/D = 13) which satisfied the horsepower requirements (118 hp using the Lycoming O-235 engine). The initial concept had a wing planform area of 134 sq. ft. Detailed wing sizing analysis enlarged the planform area to 150 sq. ft., without changing its layout or location. The most significant changes, however, were made just prior to inboard profile design. The fuselage external diameter was reduced from 54 to 50 inches to reduce drag to meet the desired cruise speed of 120 knots. Also, the nose was extended 6 inches to accommodate landing gear placement. Without the extension, the nosewheel received an unacceptable percentage (25 percent) of the landing weight. The final change in the configuration was made in accordance with the stability and control analysis. In order to reduce the static margin from 20 to 13 percent, the horizontal tail area was reduced from 32.02 to 25.0 sq. ft. The Triton meets all the specifications set forth in the design criteria. If time permitted another iteration of the calculations, two significant changes would be made. The vertical stabilizer area would be reduced to decrease the aircraft lateral stability slope since the current value was too high in relation to the directional stability slope. Also, the aileron size would be decreased to reduce the roll rate below the current 106 deg/second. Doing so would allow greater flap area (increasing CL(sub max)) and thus reduce the overall wing area. C & P would also recalculate the horsepower and drag values to further validate the 120 knot cruising speed.

  13. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

It is widely accepted that design features are one of the most attractive integration methods across most fields of engineering activity, such as design modelling, process planning or production scheduling. One of the most important tasks realized in the process of integrating design and planning functions is design translation, meant as the mapping of design data into data that are important from the process planning point of view, i.e. manufacturing data. A design's geometrical shape translation can be realized with one of the following strategies: (i) designing with a previously prepared design features library, also known as the DBF (design by feature) method; (ii) interactive design features recognition (IFR); (iii) automatic design features recognition (AFR). In the DBF method, the design's geometrical shape is created with design features. There are two basic approaches to design modelling in the DBF method: the classic approach, in which a part design is modelled from beginning to end with design features previously stored in a design features database, and the hybrid approach, in which the part is partially created with standard predefined CAD system tools and the rest with suitable design features. Automatic feature recognition consists of autonomously searching a product model, represented with a specific design representation method, in order to find those model features which might potentially be recognized as design features, manufacturing features, etc. This approach requires a searching algorithm to be prepared. The searching algorithm should allow carrying out the whole recognition process without user supervision. Currently there are many AFR methods. These methods need the product model to be represented with B-Rep representation most often, CSG rarely, wireframe very rarely. In the IFR method, potential features are recognized by a user. This process is most often realized by a user who points out those surfaces which seem to belong to a

  14. A flexible layout design method for passive micromixers.

    PubMed

    Deng, Yongbo; Liu, Zhenyu; Zhang, Ping; Liu, Yongshun; Gao, Qingyong; Wu, Yihui

    2012-10-01

This paper discusses a flexible layout design method for passive micromixers based on the topology optimization of fluidic flows. Unlike the trial-and-error method, this method obtains the detailed layout of a passive micromixer according to the desired mixing performance by solving a topology optimization problem. Therefore, the dependence on the experience of the designer is weakened when this method is used to design a passive micromixer with acceptable mixing performance. Several design disciplines for passive micromixers are considered to demonstrate the flexibility of the layout design method. These design disciplines include the approximation of the real 3D micromixer, manufacturing feasibility, the spatial periodic design, and the effects of the Péclet number and Reynolds number on the designs obtained by this layout design method. The capability of this design method is validated by several comparisons performed between the obtained layouts and the optimized designs in recently published literature, where the value of the mixing measurement is improved by up to 40.4% for one cycle of the micromixer. PMID:22736305

  15. [Opportunity and challenge on molecular epidemiology].

    PubMed

    Duan, G C; Chen, S Y

    2016-08-10

Molecular epidemiology, a branch of epidemiology, combines the theories and methods of both epidemiology and molecular biology. Molecular epidemiology mainly focuses on biological markers, describing the distribution, occurrence, development and prognosis of diseases at the molecular level. The completion of the Human Genome Project and the rapid development of Precision Medicine and Big Data not only offer new development opportunities but also bring a higher demand and new challenges for molecular epidemiology. PMID:27539332

  16. A recommended epidemiological study design for examining the adverse health effects among emergency workers who experienced the TEPCO fukushima daiichi NPP accident in 2011.

    PubMed

    Yasui, Shojiro

    2016-01-01

    Results from medical examinations conducted in 2012 of workers who were engaged in radiation work in 2012 as a result of the 2011 Fukushima Daiichi Nuclear Power Plant (NPP) accident showed that the prevalence of abnormal findings was 4.21%, 3.23 points higher than the 0.98% that was found prior to the accident in the jurisdiction area of the labor inspection office which holds jurisdiction over the NPP. The Ministry of Health, Labour and Welfare (MHLW) concluded that the 2010 and 2012 data cannot be easily compared because 70% of the enterprises within the jurisdiction of the office that reported the 2012 results were different from those that did so in 2010. In addition, although the radiation workers' estimated average dose weighted by number of workers was 3.66 times higher than decontamination workers' dose, the prevalence among radiation workers was only 1.14 times higher than that among decontamination workers. Based on the results of the medical examinations, however, the MHLW decided to implement an epidemiological study on the health effects of radiation exposure on all emergency workers. This article explains key issues of the basic design of the study recommended by the expert meeting established in the MHLW and also identifies challenges that could not be resolved and thus required further consideration by the study researchers. The major issues included: (a) study methods and target group; (b) evaluation of cumulative doses; (c) health effects (end points); (d) control of confounding factors; and (e) study implementation framework. Identified key challenges that required further deliberation were: (a) preventing arbitrary partisan analysis; (b) ensuring a high participation rate; (c) inquiry about the medical radiation doses; and (d) the preparedness of new analytical technology. The study team formulated and implemented the pilot study in 2014 and started the full-scale study in April 2015 with funding from a research grant from the MHLW. PMID

  17. Method for designing and controlling compliant gripper

    NASA Astrophysics Data System (ADS)

    Spanu, A. R.; Besnea, D.; Avram, M.; Ciobanu, R.

    2016-08-01

Compliant grippers are useful for high-accuracy grasping of small objects, with adaptive control of contact points along the active surfaces of the fingers. Spatial trajectories of the elements have become a must due to the development of MEMS. The paper presents the solution for the compliant gripper designed by the authors, so both planar and spatial movements are discussed. At the beginning of the process, the gripper can work as a passive one, just until it has to reach the object surface. The forces provided by the elements have to avoid damaging the object. As part of the system, a camera takes a picture of the object in order to facilitate the positioning of the system. When contact is established, the mechanism acts as an active gripper, using an electric stepper motor with controlled movement.

  18. Design Methods and Optimization for Morphing Aircraft

    NASA Technical Reports Server (NTRS)

    Crossley, William A.

    2005-01-01

    This report provides a summary of accomplishments made during this research effort. The major accomplishments are in three areas. The first is the use of a multiobjective optimization strategy to help identify potential morphing features that uses an existing aircraft sizing code to predict the weight, size and performance of several fixed-geometry aircraft that are Pareto-optimal based upon on two competing aircraft performance objectives. The second area has been titled morphing as an independent variable and formulates the sizing of a morphing aircraft as an optimization problem in which the amount of geometric morphing for various aircraft parameters are included as design variables. This second effort consumed most of the overall effort on the project. The third area involved a more detailed sizing study of a commercial transport aircraft that would incorporate a morphing wing to possibly enable transatlantic point-to-point passenger service.
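The first accomplishment described above, filtering fixed-geometry candidates down to a Pareto-optimal set under two competing performance objectives, can be sketched with a simple non-domination test. The candidate designs and objective values below are made up; both objectives are treated as quantities to minimise:

```python
# Keep only the designs that no other design beats (or ties) on both
# objectives simultaneously.

def pareto_front(designs):
    """Return names of designs not dominated in both objectives."""
    front = []
    for name, f1, f2 in designs:
        dominated = any(g1 <= f1 and g2 <= f2 and (g1, g2) != (f1, f2)
                        for _, g1, g2 in designs)
        if not dominated:
            front.append(name)
    return front

candidates = [
    ("A", 1.0, 5.0),
    ("B", 2.0, 3.0),
    ("C", 3.0, 4.0),   # dominated by B on both objectives
    ("D", 4.0, 1.0),
]
print(pareto_front(candidates))  # ['A', 'B', 'D']
```

Designs surviving this filter are the ones worth examining for morphing potential: each trades one objective against the other, so a geometry that could morph between them might capture the best of several points on the front.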

  19. Comparison of Traditional Design Nonlinear Programming Optimization and Stochastic Methods for Structural Design

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

Structural designs generated by the traditional method, the optimization method and the stochastic design concept are compared. In the traditional method, the constraints are manipulated to obtain the design and the weight is back-calculated. In design optimization, the weight of a structure becomes the merit function, with constraints imposed on failure modes, and an optimization algorithm is used to generate the solution. The stochastic design concept accounts for uncertainties in loads, material properties, and other parameters, and a solution is obtained by solving a design optimization problem for a specified reliability. Acceptable solutions were produced by all three methods. The variation in the weight calculated by the methods was modest. Some variation was noticed in the designs calculated by the methods; it may be attributed to structural indeterminacy. It is prudent to develop a design by all three methods prior to fabrication. The traditional design method can be improved when the simplified sensitivities of the behavior constraints are used. Such sensitivities can reduce design calculations and may have the potential to unify the traditional and optimization methods. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to the mean-valued design. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure. Weight can be reduced to a small value for a most failure-prone design. Probabilistic modeling of loads and material properties remained a challenge.

  20. Statistical Reasoning and Methods in Epidemiology to Promote Individualized Health: In Celebration of the 100th Anniversary of the Johns Hopkins Bloomberg School of Public Health.

    PubMed

    Ogburn, Elizabeth L; Zeger, Scott L

    2016-03-01

    Epidemiology is concerned with determining the distribution and causes of disease. Throughout its history, epidemiology has drawn upon statistical ideas and methods to achieve its aims. Because of the exponential growth in our capacity to measure and analyze data on the underlying processes that define each person's state of health, there is an emerging opportunity for population-based epidemiologic studies to influence health decisions made by individuals in ways that take into account the individuals' characteristics, circumstances, and preferences. We refer to this endeavor as "individualized health." The present article comprises 2 sections. In the first, we describe how graphical, longitudinal, and hierarchical models can inform the project of individualized health. We propose a simple graphical model for informing individual health decisions using population-based data. In the second, we review selected topics in causal inference that we believe to be particularly useful for individualized health. Epidemiology and biostatistics were 2 of the 4 founding departments in the world's first graduate school of public health at Johns Hopkins University, the centennial of which we honor. This survey of a small part of the literature is intended to demonstrate that the 2 fields remain just as inextricably linked today as they were 100 years ago.

  1. Association Between Cannabis and Psychosis: Epidemiologic Evidence.

    PubMed

    Gage, Suzanne H; Hickman, Matthew; Zammit, Stanley

    2016-04-01

    Associations between cannabis use and psychotic outcomes are consistently reported, but establishing causality from observational designs can be problematic. We review the evidence from longitudinal studies that have examined this relationship and discuss the epidemiologic evidence for and against interpreting the findings as causal. We also review the evidence identifying groups at particularly high risk of developing psychosis from using cannabis. Overall, evidence from epidemiologic studies provides strong enough evidence to warrant a public health message that cannabis use can increase the risk of psychotic disorders. However, further studies are required to determine the magnitude of this effect, to determine the effect of different strains of cannabis on risk, and to identify high-risk groups particularly susceptible to the effects of cannabis on psychosis. We also discuss complementary epidemiologic methods that can help address these questions.

  2. Traditional epidemiology, modern epidemiology, and public health.

    PubMed Central

    Pearce, N

    1996-01-01

    There have been significant developments in epidemiologic methodology during the past century, including changes in basic concepts, methods of data analysis, and methods of exposure measurement. However, the rise of modern epidemiology has been a mixed blessing, and the new paradigm has major shortcomings, both in public health and in scientific terms. The changes in the paradigm have not been neutral but have rather helped change--and have reflected changes in--the way in which epidemiologists think about health and disease. The key issue has been the shift in the level of analysis from the population to the individual. Epidemiology has largely ceased to function as part of a multidisciplinary approach to understanding the causation of disease in populations and has become a set of generic methods for measuring associations of exposure and disease in individuals. This reductionist approach focuses on the individual, blames the victim, and produces interventions that can be harmful. We seem to be using more and more advanced technology to study more and more trivial issues, while the major causes of disease are ignored. Epidemiology must reintegrate itself into public health and must rediscover the population perspective. PMID:8629719

  3. Analytical techniques for instrument design - matrix methods

    SciTech Connect

    Robinson, R.A.

    1997-09-01

We take the traditional Cooper-Nathans approach, as has been applied for many years for steady-state triple-axis spectrometers, and consider its generalisation to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, we discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix: diagonalisation (Moller-Nielsen method), coordinate changes (e.g. from (Δk_I, Δk_F) to (ΔE, ΔQ) and 2 dummy variables), integration of one or more variables (e.g. over such dummy variables), integration subject to linear constraints (e.g. Bragg's Law for analysers), inversion to give the variance-covariance matrix, and so on. We show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the gaussian approximation. We will argue that a generalised program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. We will also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question.
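A toy version of this matrix toolbox might look like the following, using a 3-variable Gaussian in place of the full 6-dimensional Cooper-Nathans matrix and an invented covariance matrix. Two of the listed operations are shown: a linear coordinate change (Σ' = AΣAᵀ) and integrating out one variable, which for a Gaussian amounts to deleting that row and column of the covariance matrix:

```python
# Pure-Python sketch of Gaussian covariance manipulations; numbers are
# illustrative, not a real spectrometer resolution function.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def change_coordinates(C, A):
    """Covariance of y = A x when x has covariance C:  C' = A C A^T."""
    return matmul(matmul(A, C), transpose(A))

def integrate_out(C, i):
    """Marginalising a Gaussian over variable i deletes row/column i
    of the covariance matrix."""
    return [[v for k, v in enumerate(row) if k != i]
            for j, row in enumerate(C) if j != i]

C = [[4.0, 1.0, 0.0],      # hypothetical covariance of three variables
     [1.0, 2.0, 0.5],
     [0.0, 0.5, 1.0]]
A = [[1.0, -1.0, 0.0],     # e.g. map (x1, x2, x3) to (x1 - x2, x2, x3)
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
print(integrate_out(change_coordinates(C, A), 2))  # [[4.0, -1.0], [-1.0, 2.0]]
```

The printed 2x2 block checks out by hand: var(x1 - x2) = 4 + 2 - 2(1) = 4 and cov(x1 - x2, x2) = 1 - 2 = -1, matching the familiar variance-of-a-difference rule.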

  4. Hershey Medical Center Technical Workshop Report: optimizing the design and interpretation of epidemiologic studies for assessing neurodevelopmental effects from in utero chemical exposure.

    PubMed

    Amler, Robert W; Barone, Stanley; Belger, Aysenil; Berlin, Cheston M; Cox, Christopher; Frank, Harry; Goodman, Michael; Harry, Jean; Hooper, Stephen R; Ladda, Roger; LaKind, Judy S; Lipkin, Paul H; Lipsitt, Lewis P; Lorber, Matthew N; Myers, Gary; Mason, Ann M; Needham, Larry L; Sonawane, Babasaheb; Wachs, Theodore D; Yager, Janice W

    2006-09-01

    Neurodevelopmental disabilities affect 3-8% of the 4 million babies born each year in the U.S. alone, with known etiology for less than 25% of those disabilities. Numerous investigations have sought to determine the role of environmental exposures in the etiology of a variety of human neurodevelopmental disorders (e.g., learning disabilities, attention deficit-hyperactivity disorder, intellectual disabilities) that are manifested in childhood, adolescence, and young adulthood. A comprehensive critical examination and discussion of the various methodologies commonly used in investigations is needed. The Hershey Medical Center Technical Workshop: Optimizing the design and interpretation of epidemiologic studies for assessing neurodevelopmental effects from in utero chemical exposure provided such a forum for examining these methodologies. The objective of the Workshop was to develop scientific consensus on the key principles and considerations for optimizing the design and interpretation of epidemiologic studies of in utero exposure to environmental chemicals and subsequent neurodevelopmental effects. (The Panel recognized that the nervous system develops post-natally and that critical periods of exposure can span several developmental life stages.) Discussions from the Workshop Panel generated 17 summary points representing key tenets of work in this field. 
These points stressed the importance of: a well-defined, biologically plausible hypothesis as the foundation of in utero studies for assessing neurodevelopmental outcomes; understanding of the exposure to the environmental chemical(s) of interest, underlying mechanisms of toxicity, and anticipated outcomes; the use of a prospective, longitudinal cohort design that, when possible, runs for periods of 2-5 years, and possibly even longer, in an effort to assess functions at key developmental epochs; measuring potentially confounding variables at regular, fixed time intervals; including measures of specific cognitive

  5. Analytical techniques for instrument design -- Matrix methods

    SciTech Connect

    Robinson, R.A.

    1997-12-31

    The authors take the traditional Cooper-Nathans approach, as has been applied for many years for steady-state triple-axis spectrometers, and consider its generalization to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, they discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix. They show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the gaussian approximation. They will argue that a generalized program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. They also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question.

  6. Assessment of methods and results of reproductive occupational epidemiology: spontaneous abortions and malformations in the offspring of working women

    SciTech Connect

    Hemminki, K.; Axelson, O.; Niemi, M.L.; Ahlborg, G.

    1983-01-01

Epidemiological studies relating occupational exposures of working women to spontaneous abortions and malformations are reviewed, and some methodological considerations are presented. Reproductive epidemiology is less developed than epidemiology in general and seems to involve some specific problems. Exposures may be reported differently by the women depending on the outcome of the pregnancy; thus confirmation of exposure from an independent data source would be an asset. The types of occupational exposures of women suggested to carry a risk of spontaneous abortion include anesthetic agents, laboratory work, copper smelting, soldering, and chemical sterilization using ethylene oxide and glutaraldehyde. Maternal employment in laboratories and exposure to solvents have been linked to a risk of congenital malformations in the offspring in five studies. Data on the teratogenic effects of anesthetic gases have been conflicting. In one study, employment in copper smelting was associated with malformations in the offspring.

  7. HEALTHY study rationale, design and methods

    PubMed Central

    2009-01-01

    The HEALTHY primary prevention trial was designed and implemented in response to the growing numbers of children and adolescents being diagnosed with type 2 diabetes. The objective was to moderate risk factors for type 2 diabetes. Modifiable risk factors measured were indicators of adiposity and glycemic dysregulation: body mass index ≥85th percentile, fasting glucose ≥5.55 mmol l-1 (100 mg per 100 ml) and fasting insulin ≥180 pmol l-1 (30 μU ml-1). A series of pilot studies established the feasibility of performing data collection procedures and tested the development of an intervention consisting of four integrated components: (1) changes in the quantity and nutritional quality of food and beverage offerings throughout the total school food environment; (2) physical education class lesson plans and accompanying equipment to increase both participation and number of minutes spent in moderate-to-vigorous physical activity; (3) brief classroom activities and family outreach vehicles to increase knowledge, enhance decision-making skills and support and reinforce youth in accomplishing goals; and (4) communications and social marketing strategies to enhance and promote changes through messages, images, events and activities. Expert study staff provided training, assistance, materials and guidance for school faculty and staff to implement the intervention components. A cohort of students were enrolled in sixth grade and followed to end of eighth grade. They attended a health screening data collection at baseline and end of study that involved measurement of height, weight, blood pressure, waist circumference and a fasting blood draw. Height and weight were also collected at the end of the seventh grade. The study was conducted in 42 middle schools, six at each of seven locations across the country, with 21 schools randomized to receive the intervention and 21 to act as controls (data collection activities only). Middle school was the unit of sample size and

  8. Method speeds tapered rod design for directional well

    SciTech Connect

    Hu Yongquan; Yuan Xiangzhong

    1995-10-16

Determination of the minimum rod diameter from statistical relationships can decrease the time needed to design a sucker-rod string for a directional well. A tapered rod string design for a directional well is more complex than for a vertical well. Based on the theory of a continuous beam column, rod string design in a directional well is a trial-and-error method. The key to reducing the time to obtain a solution is rapidly determining the minimum rod diameter, which can be done with a statistical relationship. The paper describes sucker rods, the design method, basic rod design analysis, and the minimum rod diameter.

  9. Inhalation exposure systems: design, methods and operation.

    PubMed

    Wong, Brian A

    2007-01-01

    The respiratory system, the major route for entry of oxygen into the body, provides entry for external compounds, including pharmaceutic and toxic materials. These compounds (that might be inhaled under environmental, occupational, medical, or other situations) can be administered under controlled conditions during laboratory inhalation studies. Inhalation study results may be controlled or adversely affected by variability in four key factors: animal environment; exposure atmosphere; inhaled dose; and individual animal biological response. Three of these four factors can be managed through engineering processes. Variability in the animal environment is reduced by engineering control of temperature, humidity, oxygen content, waste gas content, and noise in the exposure facility. Exposure atmospheres are monitored and adjusted to assure a consistent and known exposure for each animal dose group. The inhaled dose, affected by changes in respiration physiology, may be controlled by exposure-specific monitoring of respiration. Selection of techniques and methods for the three factors affected by engineering allows the toxicologic pathologist to study the reproducibility of the fourth factor, the biological response of the animal. PMID:17325967

  10. A new interval optimization method considering tolerance design

    NASA Astrophysics Data System (ADS)

    Jiang, C.; Xie, H. C.; Zhang, Z. G.; Han, X.

    2015-12-01

    This study considers the design variable uncertainty in the actual manufacturing process for a product or structure and proposes a new interval optimization method based on tolerance design, which can provide not only an optimal design but also the allowable maximal manufacturing errors that the design can bear. The design variables' manufacturing errors are depicted using the interval method, and an interval optimization model for the structure is constructed. A dimensionless design tolerance index is defined to describe the overall uncertainty of all design variables, and by combining the nominal objective function, a deterministic two-objective optimization model is built. The possibility degree of interval is used to represent the reliability of the constraints under uncertainty, through which the model is transformed to a deterministic optimization problem. Three numerical examples are investigated to verify the effectiveness of the present method.
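    The constraint transformation described above can be illustrated with a toy possibility degree for intervals. This is a minimal sketch using one common formulation from interval analysis; the exact definition used in the paper, and the 0.9 acceptance threshold below, are assumptions for illustration, not taken from the study.

```python
def poss_leq(a_lo, a_hi, b_lo, b_hi):
    """Possibility degree that interval A = [a_lo, a_hi] lies below
    interval B = [b_lo, b_hi] (one common formulation; the paper's
    exact definition may differ)."""
    if a_hi <= b_lo:
        return 1.0          # A entirely below B: constraint certainly holds
    if a_lo >= b_hi:
        return 0.0          # A entirely above B: constraint certainly fails
    return (b_hi - a_lo) / ((a_hi - a_lo) + (b_hi - b_lo))

def constraint_ok(response_interval, limit_interval, required=0.9):
    # Replace an uncertain constraint "response <= limit" by a deterministic
    # one: the possibility degree of satisfaction must reach `required`.
    return poss_leq(*response_interval, *limit_interval) >= required
```

    For example, a stress response of [80, 110] MPa against an allowable of [100, 120] MPa gives a possibility degree of (120 − 80)/(30 + 20) = 0.8, and the design would be rejected at the 0.9 level.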

  11. Competency in health care management: a training model in epidemiologic methods for assessing and improving the quality of clinical practice through evidence-based decision making.

    PubMed

    Hudak, R P; Jacoby, I; Meyer, G S; Potter, A L; Hooper, T I; Krakauer, H

    1997-01-01

    This article describes a training model that focuses on health care management by applying epidemiologic methods to assess and improve the quality of clinical practice. The model's uniqueness is its focus on integrating clinical evidence-based decision making with fundamental principles of resource management to achieve attainable, cost-effective, high-quality health outcomes. The target students are current and prospective clinical and administrative executives who must optimize decision making at the clinical and managerial levels of health care organizations.

  12. An analytical method for designing low noise helicopter transmissions

    NASA Technical Reports Server (NTRS)

    Bossler, R. B., Jr.; Bowes, M. A.; Royal, A. C.

    1978-01-01

    The development and experimental validation of a method for analytically modeling the noise mechanisms in helicopter geared power transmission systems are described. This method can be used within the design process to predict interior noise levels and to investigate the noise-reducing potential of alternative transmission design details. Examples are discussed.

  13. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  14. Turbine blade fixture design using kinematic methods and genetic algorithms

    NASA Astrophysics Data System (ADS)

    Bausch, John J., III

    2000-10-01

    The design of fixtures for turbine blades is a difficult problem even for experienced toolmakers. Turbine blades are characterized by complex 3D surfaces, high-performance materials that are difficult to machine, close-tolerance finish requirements, and high-precision machining accuracy. Tool designers typically rely on modified designs based on experience, but have no analytical tools to guide or even evaluate their designs. This paper examines the application of kinematic algorithms to the design of six-point-nest, seventh-point-clamp datum transfer fixtures for turbine blade production. The kinematic algorithms, based on screw coordinate theory, are computationally intensive; when used in a blind-search mode, the time required to generate an actual design is unreasonable. To reduce the computation time, the kinematic methods are combined with genetic algorithms and a set of heuristic design rules to guide the search. The kinematic, genetic, and heuristic methods were integrated within a fixture design module as part of the Unigraphics CAD system used by Pratt and Whitney. The kinematic design module was used to generate a datum transfer fixture design for a standard production turbine blade. This design was then used to construct an actual fixture, which was compared to the existing production fixture for the same part. The positional accuracy of both designs was compared using a coordinate measurement machine (CMM). Based on the CMM data, the observed variation of the kinematic design was over two orders of magnitude less than that of the production design, resulting in greatly improved accuracy.
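    The coupling of a genetic search with a fitness evaluator can be sketched as below. This is a hedged illustration only: the fitness function is a simple point-spread proxy standing in for the screw-coordinate kinematic analysis, and the candidate contact geometry is invented.

```python
import random

random.seed(0)

# Hypothetical candidate contact locations on the blade surface
# (2-D stand-ins for points on the real 3-D airfoil).
CANDIDATES = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(40)]

def fitness(indices):
    # Placeholder for the screw-coordinate kinematic evaluation: reward widely
    # spread contact points as a crude proxy for a well-conditioned six-point nest.
    pts = [CANDIDATES[i] for i in indices]
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    return sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2 for p in pts)

def evolve(pop_size=30, genes=6, generations=50):
    pop = [random.sample(range(len(CANDIDATES)), genes) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, genes)      # one-point crossover, kept duplicate-free
            child = (a[:cut] + [g for g in b if g not in a[:cut]])[:genes]
            if random.random() < 0.2:             # mutation: swap in an unused point
                unused = [i for i in range(len(CANDIDATES)) if i not in child]
                child[random.randrange(genes)] = random.choice(unused)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best_nest = evolve()   # indices of the six selected nest points
```

    In a real fixture-design module, the heuristic design rules would constrain which candidate points may enter a chromosome, and the fitness call would invoke the expensive kinematic evaluation only for candidates that pass those rules.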

  15. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
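    The Monte Carlo variant of the idea can be sketched with a toy weight model. The weight function, nominal values, and uncertainty magnitudes below are all invented for illustration; the paper's actual analysis program is far richer (and, notably, discontinuous, which is what stresses such methods).

```python
import random
import statistics

random.seed(1)

def aircraft_weight(wing_area, aspect_ratio):
    # Hypothetical stand-in for a conceptual-design weight buildup:
    # a smooth toy model, not the program used in the paper.
    return 10_000 + 120.0 * wing_area + 450.0 * aspect_ratio

# Nominal design variables and small input uncertainties (assumed normal).
nominal = {"wing_area": 60.0, "aspect_ratio": 8.0}
sigma = {"wing_area": 1.5, "aspect_ratio": 0.2}

samples = [
    aircraft_weight(
        random.gauss(nominal["wing_area"], sigma["wing_area"]),
        random.gauss(nominal["aspect_ratio"], sigma["aspect_ratio"]),
    )
    for _ in range(20_000)
]

mean_w = statistics.fmean(samples)
std_w = statistics.stdev(samples)
# First-order method-of-moments cross-check for this linear toy model:
# std ~= sqrt((120 * 1.5)**2 + (450 * 0.2)**2) ~= 201
```

    A design-under-uncertainty loop would then tighten the design (here, accept extra weight) until the constraint-satisfaction level holds at the specified input uncertainty.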

  16. Prevalence and Epidemiologic Characteristics of FASD From Various Research Methods with an Emphasis on Recent In-School Studies

    ERIC Educational Resources Information Center

    May, Philip A.; Gossage, J. Phillip; Kalberg, Wendy O.; Robinson, Luther K.; Buckley, David; Manning, Melanie; Hoyme, H. Eugene

    2009-01-01

    Researching the epidemiology and estimating the prevalence of fetal alcohol syndrome (FAS) and other fetal alcohol spectrum disorders (FASD) for mainstream populations anywhere in the world has presented a challenge to researchers. Three major approaches have been used in the past: surveillance and record review systems, clinic-based studies, and…

  17. Triparental Families: A New Genetic-Epidemiological Design Applied to Drug Abuse, Alcohol Use Disorders, and Criminal Behavior in a Swedish National Sample

    PubMed Central

    Kendler, Kenneth S.; Ohlsson, Henrik; Sundquist, Jan; Sundquist, Kristina

    2015-01-01

    Objective The authors sought to clarify the sources of parent-offspring resemblance for drug abuse, alcohol use disorders, and criminal behavior, using a novel genetic-epidemiological design. Method Using national registries, the authors identified rates of drug abuse, alcohol use disorders, and criminal behavior in 41,360 Swedish individuals born between 1960 and 1990 and raised in triparental families comprising a biological mother who reared them, a “not-lived-with” biological father, and a stepfather. Results When each syndrome was examined individually, hazard rates for drug abuse in offspring of parents with drug abuse were highest for mothers (2.80, 95% CI=2.23–3.38), intermediate for not-lived-with fathers (2.45, 95% CI=2.14–2.79), and lowest for stepfathers (1.99, 95% CI=1.55–2.56). The same pattern was seen for alcohol use disorders (2.23, 95% CI=1.93–2.58; 1.84, 95% CI=1.69–2.00; and 1.27, 95% CI=1.12–1.43) and criminal behavior (1.55, 95% CI=1.44–1.66; 1.46, 95% CI=1.40–1.52; and 1.30, 95% CI=1.23–1.37). When all three syndromes were examined together, specificity of cross-generational transmission was highest for mothers, intermediate for not-lived-with fathers, and lowest for stepfathers. Analyses of intact families and other not-lived-with parents and stepparents showed similar cross-generation transmission for these syndromes in mothers and fathers, supporting the representativeness of results from triparental families. Conclusions A major strength of the triparental design is its inclusion, within a single family, of parents who provide, to a first approximation, their offspring with genes plus rearing, genes only, and rearing only. For drug abuse, alcohol use disorders, and criminal behavior, the results of this study suggest that parent-offspring transmission involves both genetic and environmental processes, with genetic factors being somewhat more important. These results should be interpreted in the context of the strengths

  18. Obtaining sensitive data through the Web: an example of design and methods.

    PubMed

    Baer, Atar; Saroiu, Stefan; Koutsky, Laura A

    2002-11-01

    Several studies have suggested that the quality of coital data from diaries is superior to that collected by retrospective questionnaires. By collecting data over short intervals of time, diaries can present a more comprehensive picture of exposure, while minimizing the potential for recall bias. Despite these advantages, paper diaries have limited use because of their expense and difficulty of implementation. Web-based data collection offers the opportunity to make improvements to the quality of epidemiologic exposure measurement by providing privacy and convenience to study participants while reducing costs associated with questionnaire administration and allowing for real-time data processing. We adapted coital diaries for Web-based data collection in a study of transmission rates of genital human papillomavirus infection among young adults. University women complete an online sexual behavior questionnaire ("diary") every 2 weeks over a 3-year follow-up period; men complete a single online sexual behavior questionnaire ("journal"). In this paper we describe the design, methodology and implementation issues that emerge in conducting a Web-based epidemiologic study. We also discuss compliance, as well as methods for assuring appropriate security, confidentiality and privacy.

  19. Expanding color design methods for architecture and allied disciplines

    NASA Astrophysics Data System (ADS)

    Linton, Harold E.

    2002-06-01

    The color design processes of visual artists, architects, designers, and theoreticians included in this presentation reflect the practical role of color in architecture. What the color design professional brings to the architectural design team is an expertise and rich sensibility made up of a broad awareness and a finely tuned visual perception. This includes a knowledge of design and its history, expertise with industrial color materials and their methods of application, an awareness of design context and cultural identity, a background in physiology and psychology as it relates to human welfare, and an ability to problem-solve and respond creatively to design concepts with innovative ideas. The broadening of the definition of the colorist's role in architectural design provides architects, artists, and designers with significant opportunities for continued professional and educational development.

  20. Polygenic Epidemiology

    PubMed Central

    2016-01-01

    ABSTRACT Much of the genetic basis of complex traits is present on current genotyping products, but the individual variants that affect the traits have largely not been identified. Several traditional problems in genetic epidemiology have recently been addressed by assuming a polygenic basis for disease and treating it as a single entity. Here I briefly review some of these applications, which collectively may be termed polygenic epidemiology. Methodologies in this area include polygenic scoring, linear mixed models, and linkage disequilibrium scoring. They have been used to establish a polygenic effect, estimate genetic correlation between traits, estimate how many variants affect a trait, stratify cases into subphenotypes, predict individual disease risks, and infer causal effects using Mendelian randomization. Polygenic epidemiology will continue to yield useful applications even while much of the specific variation underlying complex traits remains undiscovered. PMID:27061411

  1. Polygenic Epidemiology.

    PubMed

    Dudbridge, Frank

    2016-05-01

    Much of the genetic basis of complex traits is present on current genotyping products, but the individual variants that affect the traits have largely not been identified. Several traditional problems in genetic epidemiology have recently been addressed by assuming a polygenic basis for disease and treating it as a single entity. Here I briefly review some of these applications, which collectively may be termed polygenic epidemiology. Methodologies in this area include polygenic scoring, linear mixed models, and linkage disequilibrium scoring. They have been used to establish a polygenic effect, estimate genetic correlation between traits, estimate how many variants affect a trait, stratify cases into subphenotypes, predict individual disease risks, and infer causal effects using Mendelian randomization. Polygenic epidemiology will continue to yield useful applications even while much of the specific variation underlying complex traits remains undiscovered. PMID:27061411
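    Of the methodologies listed, polygenic scoring is the simplest to show concretely: an individual's score is a weighted sum of risk-allele counts, with weights taken from GWAS effect estimates. The variant IDs and weights below are invented for illustration.

```python
# Hypothetical per-variant effect sizes (betas) from a GWAS.
weights = {"rs1": 0.12, "rs2": -0.05, "rs3": 0.30}

def polygenic_score(genotype):
    """Weighted allele-count sum; genotype maps variant id -> count
    (0, 1, or 2 copies) of the effect allele."""
    return sum(weights[v] * genotype.get(v, 0) for v in weights)

person = {"rs1": 2, "rs2": 1, "rs3": 0}
score = polygenic_score(person)   # 0.12*2 - 0.05*1 + 0.30*0 = 0.19
```

    In practice the sum runs over thousands to millions of variants, with weights re-estimated after linkage-disequilibrium adjustment and p-value thresholding.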

  2. Cognitive epidemiology

    PubMed Central

    Deary, Ian J; Batty, G David

    2007-01-01

    This glossary provides a guide to some concepts, findings and issues of discussion in the new field of research in which intelligence test scores are associated with mortality and morbidity. Intelligence tests are devised and studied by differential psychologists. Some of the major concepts in differential psychology are explained, especially those regarding cognitive ability testing. Some aspects of IQ (intelligence) tests are described and some of the major tests are outlined. A short guide is given to the main statistical techniques used by differential psychologists in the study of human mental abilities. There is a discussion of common epidemiological concepts in the context of cognitive epidemiology. PMID:17435201

  3. Aerodynamic design optimization by using a continuous adjoint method

    NASA Astrophysics Data System (ADS)

    Luo, JiaQi; Xiong, JunTao; Liu, Feng

    2014-07-01

    This paper presents the fundamentals of a continuous adjoint method and the applications of this method to the aerodynamic design optimization of both external and internal flows. General formulation of the continuous adjoint equations and the corresponding boundary conditions are derived. With the adjoint method, the complete gradient information needed in the design optimization can be obtained by solving the governing flow equations and the corresponding adjoint equations only once for each cost function, regardless of the number of design parameters. An inverse design of an airfoil is first performed to study the accuracy of the adjoint gradient and the effectiveness of the adjoint method as an inverse design method. Then the method is used to perform a series of single and multiple point design optimization problems involving the drag reduction of airfoil, wing, and wing-body configuration, and the aerodynamic performance improvement of turbine and compressor blade rows. The results demonstrate that the continuous adjoint method can efficiently and significantly improve the aerodynamic performance of the design in a shape optimization problem.
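    The key economy of the adjoint method, that one extra linear solve yields the gradient with respect to every design parameter, can be shown on a toy linear "flow" model in place of the governing flow equations. Everything here (the 2x2 system, the quadratic cost, the design entering the right-hand side) is an illustrative stand-in, not the paper's formulation.

```python
# Toy state equation A u = b(d), with design vector d entering the
# right-hand side directly: b(d) = d.
A = [[4.0, 1.0],
     [2.0, 3.0]]

def solve2(M, rhs):
    # Direct 2x2 solve via Cramer's rule.
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(rhs[0] * M[1][1] - M[0][1] * rhs[1]) / det,
            (M[0][0] * rhs[1] - rhs[0] * M[1][0]) / det]

def cost(u, target=(1.0, 2.0)):
    # Quadratic mismatch cost J(u), standing in for e.g. drag or an
    # inverse-design pressure mismatch.
    return sum((ui - ti) ** 2 for ui, ti in zip(u, target))

def adjoint_gradient(d):
    u = solve2(A, d)                          # one state ("flow") solve
    dJdu = [2 * (u[0] - 1.0), 2 * (u[1] - 2.0)]
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]
    lam = solve2(At, dJdu)                    # one adjoint solve: A^T lam = dJ/du
    # Since b(d) = d, dJ/dd_i = lam_i: the full gradient comes from a single
    # adjoint solve, independent of the number of design parameters.
    return lam

grad = adjoint_gradient([1.0, 1.0])
```

    With N design parameters, finite differencing would need N extra state solves; the adjoint route needs one, which is what makes shape optimization with hundreds of parameters tractable.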

  4. Nutritional Epidemiology

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although observations on relationships between diet and health have always been recognized, the systematic science of nutritional epidemiology in populations is relatively recent. Important observations propelling the field of nutrition forward were numerous in the 18th and 19th centuries, as it was...

  5. Single-Case Designs and Qualitative Methods: Applying a Mixed Methods Research Perspective

    ERIC Educational Resources Information Center

    Hitchcock, John H.; Nastasi, Bonnie K.; Summerville, Meredith

    2010-01-01

    The purpose of this conceptual paper is to describe a design that mixes single-case (sometimes referred to as single-subject) and qualitative methods, hereafter referred to as a single-case mixed methods design (SCD-MM). Minimal attention has been given to the topic of applying qualitative methods to SCD work in the literature. These two…

  6. 77 FR 55832 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of a New Equivalent Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    AGENCY: Environmental Protection Agency. ACTION: Notice of the designation of a new equivalent method for ambient air monitoring. The designation is made under the provisions of 40 CFR part 53, as amended on August 31, 2011 (76 FR 54326–54341).

  7. An artificial viscosity method for the design of supercritical airfoils

    NASA Technical Reports Server (NTRS)

    Mcfadden, G. B.

    1979-01-01

    A numerical technique is presented for the design of two-dimensional supercritical wing sections with low wave drag. The method is a design mode of the analysis code H, which gives excellent agreement with experimental results and is widely used in the aircraft industry. Topics covered include the partial differential equations of transonic flow, the computational procedure and results, the design procedure, a convergence theorem, and a description of the code.

  8. Numerical methods for aerothermodynamic design of hypersonic space transport vehicles

    NASA Astrophysics Data System (ADS)

    Wanie, K. M.; Brenneis, A.; Eberle, A.; Heiss, S.

    1993-04-01

    The requirement that the design process of hypersonic vehicles predict the flow past entire configurations with wings, fins, flaps, and propulsion systems represents one of the major challenges for aerothermodynamics. In this context computational fluid dynamics has emerged as a powerful tool to support the experimental work. Several numerical methods developed at MBB to meet the needs of the design process are described. The governing equations and fundamental details of the solution methods are briefly reviewed. Results are given for both geometrically simple test cases and realistic hypersonic configurations. Since there is still a considerable lack of experience with hypersonic flow calculations, extensive testing and verification are essential. This verification is done by comparing results with experimental data and other numerical methods. The results presented prove that the methods used are robust, flexible, and accurate enough to meet the strong demands of the design process.

  9. New directions for Artificial Intelligence (AI) methods in optimum design

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1989-01-01

    Developments and applications of artificial intelligence (AI) methods in the design of structural systems are reviewed. Principal shortcomings of the current approach are emphasized, and the need for some degree of formalism in the development environment for such design tools is underscored. Emphasis is placed on efforts to integrate algorithmic computations in expert systems.

  10. Two-Method Planned Missing Designs for Longitudinal Research

    ERIC Educational Resources Information Center

    Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.

    2014-01-01

    We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…

  11. Investigating the Use of Design Methods by Capstone Design Students at Clemson University

    ERIC Educational Resources Information Center

    Miller, W. Stuart; Summers, Joshua D.

    2013-01-01

    The authors describe a preliminary study to understand the attitude of engineering students regarding the use of design methods in projects to identify the factors either affecting or influencing the use of these methods by novice engineers. A senior undergraduate capstone design course at Clemson University, consisting of approximately fifty…

  12. New knowledge network evaluation method for design rationale management

    NASA Astrophysics Data System (ADS)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

    Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is a measure for DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives: the design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. The comprehensive value of DR knowledge is also measured by the proposed method. To validate the proposed method, different styles of DR knowledge network and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that provides objective metrics and a selection basis for DR knowledge reuse during the product design process. In addition, the method offers more effective guidance and support for the application and management of DR knowledge.

  13. Design method for four-reflector type beam waveguide systems

    NASA Technical Reports Server (NTRS)

    Betsudan, S.; Katagi, T.; Urasaki, S.

    1986-01-01

    Discussed is a method for the design of four-reflector type beam waveguide feed systems, comprised of a conical horn and four focused reflectors, which are widely used as the primary reflector systems for communications satellite Earth station antennas. The design parameters for these systems are clarified, the relations between the parameters are brought out based on beam mode expansion, and the independent design parameters are specified. The characteristics of these systems, namely spillover loss, crosspolarization components, and frequency characteristics, and their relation to the design parameters, are also shown. It is also indicated that the design parameters which determine the dimensions of the conical horn or the shape of the focused reflectors can be uniquely established once the design standard for the system has been selected as either (1) minimizing the crosspolarization component while keeping the spillover loss within acceptable limits, or (2) minimizing the spillover loss while maintaining the crosspolarization components below an acceptable level, and the independent design parameters, such as the respective sizes of the focused reflectors and the distances between them, have been established according to mechanical restrictions. A sample design is also shown. In addition to clarifying the effects of each design parameter on the system and improving insight into these systems, this design method also increases the efficiency of the design process.

  14. Causality in epidemiology.

    PubMed

    Kamangar, Farin

    2012-10-01

    This article provides an introduction to the meaning of causality in epidemiology and methods that epidemiologists use to distinguish causal associations from non-causal ones. Alternatives to causal association are discussed in detail. Hill's guidelines, set forth approximately 50 years ago, and more recent developments are reviewed. The role of religious and philosophic views in our understanding of causality is briefly discussed.

  15. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2011-12-01

    Design knowledge for modern mechatronic products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of design knowledge in mechatronic products and the features of information management, a unified XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping relationship between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is clearly helpful for knowledge-based design systems and product innovation.
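    A minimal sketch of such a unified XML model is shown below: function elements, structure elements, and explicit function-to-structure mappings. The tag names, IDs, and the friction-roller content are assumptions for illustration, not the paper's schema.

```python
import xml.etree.ElementTree as ET

xml_src = """
<product name="friction_roller">
  <functions>
    <function id="F1">transmit torque</function>
    <function id="F2">reduce wear</function>
  </functions>
  <structures>
    <structure id="S1">roller shaft</structure>
    <structure id="S2">friction coating</structure>
  </structures>
  <mappings>
    <map function="F1" structure="S1"/>
    <map function="F2" structure="S2"/>
  </mappings>
</product>
"""

root = ET.fromstring(xml_src)
# Index the function-to-structure mapping and the structure names.
structure_of = {m.get("function"): m.get("structure")
                for m in root.find("mappings")}
names = {s.get("id"): s.text for s in root.find("structures")}
realizes_f1 = names[structure_of["F1"]]   # structure realizing function F1
```

    Keeping the mapping explicit (rather than nesting structures under functions) lets one function be realized by several structures and vice versa, which is the usual case in mechatronic design.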

  17. Novel parameter-based flexure bearing design method

    NASA Astrophysics Data System (ADS)

    Amoedo, Simon; Thebaud, Edouard; Gschwendtner, Michael; White, David

    2016-06-01

    A parameter study was carried out on the design variables of a flexure bearing to be used in a Stirling engine with a fixed axial displacement and a fixed outer diameter. A design method was developed in order to assist identification of the optimum bearing configuration. This was achieved through a parameter study of the bearing carried out with ANSYS®. The parameters varied were the number and the width of the arms, the thickness of the bearing, the eccentricity, the size of the starting and ending holes, and the turn angle of the spiral. Comparison was made between the different designs in terms of axial and radial stiffness, the natural frequency, and the maximum induced stresses. Moreover, the Finite Element Analysis (FEA) was compared to theoretical results for a given design. The results led to a graphical design method which assists the selection of flexure bearing geometrical parameters based on pre-determined geometric and material constraints.
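    The parameter-study workflow can be sketched as a grid sweep over a few of the listed variables. The closed-form "evaluator" below is a made-up surrogate for the ANSYS runs, and the selection rule (maximize radial-to-axial stiffness ratio under a stress limit) is one plausible reading of such a graphical design method, not the paper's actual criterion.

```python
import itertools

def evaluate(n_arms, arm_width, thickness):
    # Hypothetical closed-form surrogate standing in for the FEA runs:
    # axial stiffness, radial stiffness, and peak stress scale differently
    # with the geometry. All coefficients are illustrative only.
    axial = 0.8 * n_arms * arm_width * thickness ** 3
    radial = 50.0 * n_arms * arm_width * thickness
    stress = 200.0 / (arm_width * thickness)
    return axial, radial, stress

best = None
for n_arms, width, thick in itertools.product((3, 4, 5), (2.0, 3.0), (0.3, 0.4, 0.5)):
    axial, radial, stress = evaluate(n_arms, width, thick)
    if stress > 150.0:            # discard designs over the allowable stress
        continue
    score = radial / axial        # prefer compliant axially, stiff radially
    if best is None or score > best[0]:
        best = (score, n_arms, width, thick)
```

    Plotting `score` and `stress` over the grid, rather than keeping only the maximum, gives the kind of graphical selection chart the abstract describes.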

  18. The Design with Intent Method: a design tool for influencing user behaviour.

    PubMed

    Lockton, Dan; Harrison, David; Stanton, Neville A

    2010-05-01

    Using product and system design to influence user behaviour offers potential for improving performance and reducing user error, yet little guidance is available at the concept generation stage for design teams briefed with influencing user behaviour. This article presents the Design with Intent Method, an innovation tool for designers working in this area, illustrated via application to an everyday human-technology interaction problem: reducing the likelihood of a customer leaving his or her card in an automatic teller machine. The example application results in a range of feasible design concepts which are comparable to existing developments in ATM design, demonstrating that the method has potential for development and application as part of a user-centred design process.

  19. INNOVATIVE METHODS FOR THE OPTIMIZATION OF GRAVITY STORM SEWER DESIGN

    EPA Science Inventory

    The purpose of this paper is to describe a new method for optimizing the design of urban storm sewer systems. Previous efforts to optimize gravity sewers have met with limited success because classical optimization methods require that the problem be well behaved, e.g. describ...

  20. Designing, Teaching, and Evaluating Two Complementary Mixed Methods Research Courses

    ERIC Educational Resources Information Center

    Christ, Thomas W.

    2009-01-01

    Teaching mixed methods research is difficult. This longitudinal explanatory study examined how two classes were designed, taught, and evaluated. Curriculum, Research, and Teaching (EDCS-606) and Mixed Methods Research (EDCS-780) used a research proposal generation process to highlight the importance of the purpose, research question and…

  1. Digital Epidemiology

    PubMed Central

    Salathé, Marcel; Bengtsson, Linus; Bodnar, Todd J.; Brewer, Devon D.; Brownstein, John S.; Buckee, Caroline; Campbell, Ellsworth M.; Cattuto, Ciro; Khandelwal, Shashank; Mabry, Patricia L.; Vespignani, Alessandro

    2012-01-01

    Mobile, social, real-time: the ongoing revolution in the way people communicate has given rise to a new kind of epidemiology. Digital data sources, when harnessed appropriately, can provide local and timely information about disease and health dynamics in populations around the world. The rapid, unprecedented increase in the availability of relevant data from various digital sources creates considerable technical and computational challenges. PMID:22844241

  2. Optimal Input Signal Design for Data-Centric Estimation Methods

    PubMed Central

    Deshpande, Sunil; Rivera, Daniel E.

    2013-01-01

    Data-centric estimation methods such as Model-on-Demand and Direct Weight Optimization form attractive techniques for estimating unknown functions from noisy data. These methods rely on generating a local function approximation from a database of regressors at the current operating point with the process repeated at each new operating point. This paper examines the design of optimal input signals formulated to produce informative data to be used by local modeling procedures. The proposed method specifically addresses the distribution of the regressor vectors. The design is examined for a linear time-invariant system under amplitude constraints on the input. The resulting optimization problem is solved using semidefinite relaxation methods. Numerical examples show the benefits in comparison to a classical PRBS input design. PMID:24317042

  3. Test methods and design allowables for fibrous composites. Volume 2

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C. (Editor)

    1989-01-01

    Topics discussed include extreme/hostile environment testing, establishing design allowables, and property/behavior specific testing. Papers are presented on environmental effects on the high strain rate properties of graphite/epoxy composite, the low-temperature performance of short-fiber reinforced thermoplastics, the abrasive wear behavior of unidirectional and woven graphite fiber/PEEK, test methods for determining design allowables for fiber reinforced composites, and statistical methods for calculating material allowables for MIL-HDBK-17. Attention is also given to a test method to measure the response of composite materials under reversed cyclic loads, a through-the-thickness strength specimen for composites, the use of torsion tubes to measure in-plane shear properties of filament-wound composites, the influence of test fixture design on the Iosipescu shear test for fiber composite materials, and a method for monitoring in-plane shear modulus in fatigue testing of composites.

  4. Tradeoff methods in multiobjective insensitive design of airplane control systems

    NASA Technical Reports Server (NTRS)

    Schy, A. A.; Giesy, D. P.

    1984-01-01

The latest results of an ongoing study of computer-aided design of airplane control systems are given. Constrained minimization algorithms are used, with the design objectives in the constraint vector. The concept of Pareto optimality is briefly reviewed. It is shown how an experienced designer can use it to find designs which are well-balanced in all objectives. Then the problem of finding designs which are insensitive to uncertainty in system parameters is discussed, introducing a probabilistic vector definition of sensitivity which is consistent with the deterministic Pareto optimal problem. Insensitivity is important in any practical design, but it is particularly important in the design of feedback control systems, since it is considered to be the most important distinctive property of feedback control. Methods of tradeoff between deterministic and stochastic-insensitive (SI) design are described, and tradeoff design results are presented for the example of a Shuttle lateral stability augmentation system. This example is used because careful studies have been made of the uncertainty in Shuttle aerodynamics. Finally, since accurate statistics of uncertain parameters are usually not available, the effects of crude statistical models on SI designs are examined.

  5. Computer method for design of acoustic liners for turbofan engines

    NASA Technical Reports Server (NTRS)

    Minner, G. L.; Rice, E. J.

    1976-01-01

    A design package is presented for the specification of acoustic liners for turbofans. An estimate of the noise generation was made based on modifications of existing noise correlations, for which the inputs are basic fan aerodynamic design variables. The method does not predict multiple pure tones. A target attenuation spectrum was calculated which was the difference between the estimated generation spectrum and a flat annoyance-weighted goal attenuated spectrum. The target spectrum was combined with a knowledge of acoustic liner performance as a function of the liner design variables to specify the acoustic design. The liner design method at present is limited to annular duct configurations. The detailed structure of the liner was specified by combining the required impedance (which is a result of the previous step) with a mathematical model relating impedance to the detailed structure. The design procedure was developed for a liner constructed of perforated sheet placed over honeycomb backing cavities. A sample calculation was carried through in order to demonstrate the design procedure, and experimental results presented show good agreement with the calculated results of the method.
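The target-spectrum step can be illustrated directly. The band frequencies and levels below are hypothetical placeholders; the real method estimates the generation spectrum from basic fan aerodynamic design variables.

```python
import numpy as np

# Hypothetical octave-band center frequencies (Hz) and sound pressure
# levels (dB); in the actual method the generation spectrum comes from
# noise correlations driven by fan aerodynamic design variables.
freqs = np.array([500, 1000, 2000, 4000, 8000])
generated_spl = np.array([95.0, 102.0, 108.0, 104.0, 97.0])  # estimated fan noise
goal_spl = np.full_like(generated_spl, 93.0)                 # flat annoyance-weighted goal

# Target attenuation = estimated generation minus the goal, floored at zero
target_attenuation = np.clip(generated_spl - goal_spl, 0.0, None)
for f, att in zip(freqs, target_attenuation):
    print(f"{f:5d} Hz: {att:4.1f} dB")
```

The liner impedance is then chosen so that the liner's predicted attenuation matches this target spectrum band by band.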

  6. A new method of VLSI conform design for MOS cells

    NASA Astrophysics Data System (ADS)

    Schmidt, K. H.; Wach, W.; Mueller-Glaser, K. D.

    An automated method for the design of specialized SSI/LSI-level MOS cells suitable for incorporation in VLSI chips is described. The method uses the symbolic-layout features of the CABBAGE computer program (Hsueh, 1979; De Man et al., 1982), but restricted by a fixed grid system to facilitate compaction procedures. The techniques used are shown to significantly speed the processes of electrical design, layout, design verification, and description for subsequent CAD/CAM application. In the example presented, a 211-transistor, parallel-load, synchronous 4-bit up/down binary counter cell was designed in 9 days, as compared to 30 days for a manually-optimized-layout version and 3 days for a larger, less efficient cell designed by a programmable logic array; the cell areas were 0.36, 0.21, and 0.79 sq mm, respectively. The primary advantage of the method is seen in the extreme ease with which the cell design can be adapted to new parameters or design rules imposed by improvements in technology.

  7. 78 FR 67360 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of Five New Equivalent Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... March 6, 2009. The monitors are commercially available from the applicant, Thermo Fisher Scientific, Air... AGENCY Ambient Air Monitoring Reference and Equivalent Methods: Designation of Five New Equivalent... of the designation of five new equivalent methods for monitoring ambient air quality. SUMMARY:...

  8. Epidemiological investigation of a Legionnaires' disease outbreak in Christchurch, New Zealand: the value of spatial methods for practical public health.

    PubMed

    White, P S; Graham, F F; Harte, D J G; Baker, M G; Ambrose, C D; Humphrey, A R G

    2013-04-01

Between April and August 2005, Christchurch, New Zealand experienced an outbreak of Legionnaires' disease. There were 19 laboratory-confirmed cases, including three deaths. Legionella pneumophila serogroup 1 (Lpsg1) was identified as the causative agent for all cases. A case-control study indicated a geographical association between the cases but no specific common exposures. Rapid spatial epidemiological investigation confirmed the association and identified seven spatially significant case clusters. The clusters were all sourced in the same area and exhibited a clear anisotropic process (noticeable direction) revealing a plume effect consistent with aerosol dispersion from a prevailing southwesterly wind. Four out of five cases tested had indistinguishable allele profiles that also matched environmental isolates from a water cooling tower within the centre of the clusters. This tower was considered the most probable source for these clusters. The findings suggest a maximum dispersal distance in this outbreak of 11.6 km. This work illustrated the value of geostatistical techniques for infectious disease epidemiology and for providing timely information during outbreak investigations. PMID:22697112

  9. Method for Enzyme Design with Genetically Encoded Unnatural Amino Acids.

    PubMed

    Hu, C; Wang, J

    2016-01-01

We describe the methodologies for the design of artificial enzymes with genetically encoded unnatural amino acids. Genetically encoded unnatural amino acids offer great promise for constructing artificial enzymes with novel activities. In our studies, the design of artificial enzymes was divided into two steps. First, we considered the unnatural amino acids and the protein scaffold separately. The scaffold is designed by traditional protein design methods. The unnatural amino acids are inspired by natural structures and organic chemistry, and synthesized by either organic chemistry methods or enzymatic conversion. Given the increasing number of published unnatural amino acids with various functions, we describe an unnatural amino acid toolkit containing metal chelators, redox mediators, and click chemistry reagents. These efforts enable a researcher to search the toolkit for appropriate unnatural amino acids for a study, rather than designing and synthesizing them from scratch. In the second step, the model enzyme is optimized by computational methods and directed evolution. Lastly, we describe a general method for evolving aminoacyl-tRNA synthetases and expressing proteins with incorporated unnatural amino acids. PMID:27586330

  11. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher order equations representing the design space.
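The regression step can be sketched as follows. The "propulsion code" here is a toy quadratic stand-in for the real analysis, and the face-centered central composite design is one common DOE choice, not necessarily the one used in the study.

```python
import numpy as np

# Toy stand-in for the hypersonic propulsion code: performance as an
# (unknown to the designer) quadratic of two scaled design variables.
def propulsion_code(x1, x2):
    return 4.0 - (x1 - 0.3) ** 2 - 2.0 * (x2 + 0.1) ** 2 + 0.2 * x1 * x2

# Face-centered central composite design for two factors in [-1, 1]
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1], [0, 0]], dtype=float)
y = np.array([propulsion_code(a, b) for a, b in pts])

# Second-order regression (response surface): 1, x1, x2, x1*x2, x1^2, x2^2
A = np.column_stack([np.ones(len(pts)), pts[:, 0], pts[:, 1],
                     pts[:, 0] * pts[:, 1], pts[:, 0] ** 2, pts[:, 1] ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# The cheap regression equation now replaces the expensive analysis
def surrogate(x1, x2):
    return coef @ np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

print(surrogate(0.3, -0.1))
```

Because the toy response is itself quadratic, the fitted surface reproduces it exactly; for a real propulsion code, the regression is only an approximation over the sampled design space.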

  12. A decentralized linear quadratic control design method for flexible structures

    NASA Technical Reports Server (NTRS)

    Su, Tzu-Jeng; Craig, Roy R., Jr.

    1990-01-01

    A decentralized suboptimal linear quadratic control design procedure which combines substructural synthesis, model reduction, decentralized control design, subcontroller synthesis, and controller reduction is proposed for the design of reduced-order controllers for flexible structures. The procedure starts with a definition of the continuum structure to be controlled. An evaluation model of finite dimension is obtained by the finite element method. Then, the finite element model is decomposed into several substructures by using a natural decomposition called substructuring decomposition. Each substructure, at this point, still has too large a dimension and must be reduced to a size that is Riccati-solvable. Model reduction of each substructure can be performed by using any existing model reduction method, e.g., modal truncation, balanced reduction, Krylov model reduction, or mixed-mode method. Then, based on the reduced substructure model, a subcontroller is designed by an LQ optimal control method for each substructure independently. After all subcontrollers are designed, a controller synthesis method called substructural controller synthesis is employed to synthesize all subcontrollers into a global controller. The assembling scheme used is the same as that employed for the structure matrices. Finally, a controller reduction scheme, called the equivalent impulse response energy controller (EIREC) reduction algorithm, is used to reduce the global controller to a reasonable size for implementation. The EIREC reduced controller preserves the impulse response energy of the full-order controller and has the property of matching low-frequency moments and low-frequency power moments. An advantage of the substructural controller synthesis method is that it relieves the computational burden associated with dimensionality. Besides that, the SCS design scheme is also a highly adaptable controller synthesis method for structures with varying configuration, or varying mass
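Of the model reduction options listed, modal truncation is the simplest to sketch. The 10-DOF spring-mass chain below is a hypothetical stand-in for a substructure's finite element model.

```python
import numpy as np

# Hypothetical 10-DOF spring-mass chain standing in for a substructure's
# finite element model:  M x'' + K x = f,  with unit masses.
n = 10
M = np.eye(n)
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

# Modal truncation: solve K @ phi = w^2 * M @ phi and keep the r
# lowest-frequency modes as the reduction basis T, so the reduced model
# is (T.T M T) q'' + (T.T K T) q = T.T f.
w2, Phi = np.linalg.eigh(K)   # M = I, so this is the ordinary eigenproblem
r = 3
T = Phi[:, :r]
M_r = T.T @ M @ T             # reduced mass matrix (identity here)
K_r = T.T @ K @ T             # reduced stiffness (diagonal of kept w^2)
print(np.sqrt(w2[:r]))        # retained natural frequencies (rad/s)
```

The reduced substructure is now small enough to be "Riccati-solvable", which is the prerequisite for the subcontroller design step.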

  13. Rotordynamics and Design Methods of an Oil-Free Turbocharger

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.

    1999-01-01

    The feasibility of supporting a turbocharger rotor on air foil bearings is investigated based upon predicted rotordynamic stability, load accommodations, and stress considerations. It is demonstrated that foil bearings offer a plausible replacement for oil-lubricated bearings in diesel truck turbochargers. Also, two different rotor configurations are analyzed and the design is chosen which best optimizes the desired performance characteristics. The method of designing machinery for foil bearing use and the assumptions made are discussed.

  14. Mixed methods research design for pragmatic psychoanalytic studies.

    PubMed

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena.

  15. Scenario building as an ergonomics method in consumer product design.

    PubMed

    Suri, J F; Marsh, M

    2000-04-01

    The role of human factors in design appears to have broadened from data analysis and interpretation into application of discovery and "user experience" design. The human factors practitioner is continually in search of ways to enhance and to better communicate their contributions, as well as to raise the prominence of the user at all stages of the design process. In work with design teams on the development of many consumer products, scenario building has proved to be a valuable addition to the repertoire of more traditional human factors methods. It is a powerful exploration, prototyping and communication tool, and is particularly useful early on in the product design process. This paper describes some advantages and potential pitfalls in using scenarios, and provides examples of how and where they can be usefully applied.

  16. Design of large Francis turbine using optimal methods

    NASA Astrophysics Data System (ADS)

    Flores, E.; Bornard, L.; Tomas, L.; Liu, J.; Couston, M.

    2012-11-01

Among a large number of Francis turbine references all over the world, covering the whole market range of heads, Alstom has been especially involved in the development and equipment of the largest power plants in the world: Three Gorges (China - 32×767 MW - 61 to 113 m), Itaipu (Brazil - 20×750 MW - 98.7 to 127 m) and Xiangjiaba (China - 8×812 MW - 82.5 to 113.6 m - in erection). Many new projects are under study to equip new power plants with Francis turbines in order to meet an increasing demand for renewable energy. In this context, Alstom Hydro is carrying out many developments to answer those needs, especially for jumbo units such as the planned 1 GW units in China. The turbine design for such units requires specific care, using the state of the art in computation methods and the latest technologies in model testing, as well as the maximum feedback from Jumbo plants already in operation. We present in this paper how a large Francis turbine can be designed using specific design methods, including global and local optimization methods. The spiral case, the tandem cascade profiles, the runner and the draft tube are designed with optimization loops involving a blade design tool, automatic meshing software and a Navier-Stokes solver, piloted by a genetic algorithm. These automated optimization methods, presented in different papers over the last decade, are nowadays widely used thanks to the growing computation capacity of HPC clusters: the intensive use of such optimization methods at the turbine design stage makes it possible to reach very high levels of performance, while the hydraulic flow characteristics are carefully studied over the whole water passage to avoid any unexpected hydraulic phenomena.
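A genetic-algorithm optimization loop of the kind described can be sketched as follows. The objective is a toy stand-in for the blade-design/meshing/Navier-Stokes chain, and the population sizes, operators, and bounds are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "efficiency" standing in for the blade design tool -> mesher ->
# Navier-Stokes solver chain; maximized when every variable equals 0.6.
def evaluate(x):
    return -np.sum((x - 0.6) ** 2)

n_var, pop_size, n_gen = 4, 30, 60
pop = rng.uniform(0.0, 1.0, (pop_size, n_var))
best_x, best_f = pop[0].copy(), -np.inf

for _ in range(n_gen):
    fitness = np.array([evaluate(ind) for ind in pop])
    i = int(np.argmax(fitness))
    if fitness[i] > best_f:                     # track the best design seen
        best_f, best_x = fitness[i], pop[i].copy()
    # binary tournament selection
    idx = [max(rng.integers(pop_size, size=2), key=lambda j: fitness[j])
           for _ in range(pop_size)]
    parents = pop[idx]
    # single-point crossover on consecutive pairs
    children = parents.copy()
    for k in range(0, pop_size - 1, 2):
        cut = int(rng.integers(1, n_var))
        children[k, cut:] = parents[k + 1, cut:]
        children[k + 1, cut:] = parents[k, cut:]
    # Gaussian mutation, clipped to the design-variable bounds
    pop = np.clip(children + 0.05 * rng.standard_normal(children.shape), 0.0, 1.0)

print(best_x)   # should lie near [0.6, 0.6, 0.6, 0.6]
```

In the real loop each `evaluate` call is an expensive CFD run, which is why HPC clusters and surrogate-assisted variants matter in practice.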

  17. Computational methods of robust controller design for aerodynamic flutter suppression

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1981-01-01

The development of Riccati iteration, a tool for the design and analysis of linear control systems, is examined. First, Riccati iteration is applied to the problem of pole placement and order reduction in two-time scale control systems. Order reduction, yielding a good approximation to the original system, is demonstrated using a 16th order linear model of a turbofan engine. Next, a numerical method for solving the Riccati equation is presented and demonstrated for a set of eighth order random examples. A literature review of robust controller design methods follows, covering a number of methods for reducing the trajectory and performance index sensitivity in linear regulators. Lastly, robust controller design for large parameter variations is discussed.
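Riccati iteration in its Newton (Kleinman) form can be sketched for the LQR case. The double-integrator example below has a known closed-form solution against which the iteration can be checked; this is an illustration of the general technique, not the report's specific algorithm.

```python
import numpy as np

def lyap(Ac, M):
    """Solve Ac.T @ P + P @ Ac = -M by Kronecker vectorization."""
    n = Ac.shape[0]
    L = np.kron(np.eye(n), Ac.T) + np.kron(Ac.T, np.eye(n))
    return np.linalg.solve(L, -M.reshape(-1)).reshape(n, n)

def riccati_iteration(A, B, Q, R, K0, iters=20):
    """Newton (Kleinman) iteration for the continuous-time LQR Riccati
    equation, starting from any stabilizing gain K0."""
    K = K0
    for _ in range(iters):
        P = lyap(A - B @ K, Q + K.T @ R @ K)   # Lyapunov step
        K = np.linalg.solve(R, B.T @ P)        # gain update
    return P, K

# Double integrator with Q = I, R = 1: the exact solution is
# P = [[sqrt(3), 1], [1, sqrt(3)]] and K = [1, sqrt(3)].
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])
K0 = np.array([[1.0, 2.0]])   # stabilizing initial gain
P, K = riccati_iteration(A, B, Q, R, K0)
print(K)
```

Each iterate keeps the closed loop stable and the sequence converges quadratically, which is what makes the iteration attractive as a design-and-analysis tool.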

  18. Improved method for transonic airfoil design-by-optimization

    NASA Technical Reports Server (NTRS)

    Kennelly, R. A., Jr.

    1983-01-01

    An improved method for use of optimization techniques in transonic airfoil design is demonstrated. FLO6QNM incorporates a modified quasi-Newton optimization package, and is shown to be more reliable and efficient than the method developed previously at NASA-Ames, which used the COPES/CONMIN optimization program. The design codes are compared on a series of test cases with known solutions, and the effects of problem scaling, proximity of initial point to solution, and objective function precision are studied. In contrast to the older method, well-converged solutions are shown to be attainable in the context of engineering design using computational fluid dynamics tools, a new result. The improvements are due to better performance by the optimization routine and to the use of problem-adaptive finite difference step sizes for gradient evaluation.
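The problem-adaptive finite-difference step-size idea can be illustrated as follows. The cube-root-of-machine-epsilon scaling is a standard heuristic for central differences, not necessarily the exact rule used in FLO6QNM.

```python
import numpy as np

def central_diff_grad(f, x, scale=None):
    """Central-difference gradient with a problem-adaptive step:
    h_i ~ eps**(1/3) * max(|x_i|, 1), a standard heuristic balancing
    truncation error against round-off for central differences."""
    x = np.asarray(x, dtype=float)
    if scale is None:
        scale = np.maximum(np.abs(x), 1.0)
    h = np.cbrt(np.finfo(float).eps) * scale
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h[i]
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h[i])
    return g

# Toy objective standing in for an aerodynamic figure of merit
f = lambda x: np.sin(x[0]) + x[1] ** 2
g = central_diff_grad(f, np.array([0.3, 2.0]))
print(g)   # close to [cos(0.3), 4.0]
```

When the objective comes from an iterative CFD solver, its limited precision should also enter the step-size choice, which is the point about objective function precision made in the abstract.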

  19. An uncertain multidisciplinary design optimization method using interval convex models

    NASA Astrophysics Data System (ADS)

    Li, Fangyi; Luo, Zhen; Sun, Guangyong; Zhang, Nong

    2013-06-01

    This article proposes an uncertain multi-objective multidisciplinary design optimization methodology, which employs the interval model to represent the uncertainties of uncertain-but-bounded parameters. The interval number programming method is applied to transform each uncertain objective function into two deterministic objective functions, and a satisfaction degree of intervals is used to convert both the uncertain inequality and equality constraints to deterministic inequality constraints. In doing so, an unconstrained deterministic optimization problem will be constructed in association with the penalty function method. The design will be finally formulated as a nested three-loop optimization, a class of highly challenging problems in the area of engineering design optimization. An advanced hierarchical optimization scheme is developed to solve the proposed optimization problem based on the multidisciplinary feasible strategy, which is a well-studied method able to reduce the dimensions of multidisciplinary design optimization problems by using the design variables as independent optimization variables. In the hierarchical optimization system, the non-dominated sorting genetic algorithm II, sequential quadratic programming method and Gauss-Seidel iterative approach are applied to the outer, middle and inner loops of the optimization problem, respectively. Typical numerical examples are used to demonstrate the effectiveness of the proposed methodology.
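The interval-to-deterministic conversion at the heart of the method can be sketched as follows. Note two simplifications: the bounds here are estimated by sampling (an inner approximation), whereas the cited method uses interval analysis, and the toy objective is invented for illustration.

```python
import numpy as np

def interval_bounds(f, x, p_lo, p_hi, n_samples=200, seed=0):
    """Estimate [f_lo, f_hi] of an objective over uncertain-but-bounded
    parameters p in [p_lo, p_hi] by sampling (a simple inner
    approximation; the cited method obtains bounds by interval analysis)."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(p_lo, p_hi, size=(n_samples, len(p_lo)))
    vals = np.array([f(x, p) for p in P])
    return vals.min(), vals.max()

def deterministic_objectives(f, x, p_lo, p_hi):
    """Interval number programming: replace the uncertain objective with
    two deterministic ones, the interval midpoint and its radius."""
    lo, hi = interval_bounds(f, x, p_lo, p_hi)
    return 0.5 * (lo + hi), 0.5 * (hi - lo)

# Toy objective with one uncertain coefficient p in [0.9, 1.1]
f = lambda x, p: p[0] * (x - 2.0) ** 2
mid, rad = deterministic_objectives(f, 3.0, np.array([0.9]), np.array([1.1]))
print(mid, rad)   # midpoint near 1.0, radius near 0.1
```

Minimizing the midpoint drives nominal performance while minimizing the radius drives robustness, which is why the conversion yields a multi-objective deterministic problem.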

  20. The C8 Health Project: Design, Methods, and Participants

    PubMed Central

    Frisbee, Stephanie J.; Brooks, A. Paul; Maher, Arthur; Flensborg, Patsy; Arnold, Susan; Fletcher, Tony; Steenland, Kyle; Shankar, Anoop; Knox, Sarah S.; Pollard, Cecil; Halverson, Joel A.; Vieira, Verónica M.; Jin, Chuanfang; Leyden, Kevin M.; Ducatman, Alan M.

    2009-01-01

Background The C8 Health Project was created, authorized, and funded as part of the settlement agreement reached in the case of Jack W. Leach, et al. v. E.I. du Pont de Nemours & Company (no. 01-C-608 W.Va., Wood County Circuit Court, filed 10 April 2002). The settlement stemmed from the perfluorooctanoic acid (PFOA, or C8) contamination of drinking water in six water districts in two states near the DuPont Washington Works facility near Parkersburg, West Virginia. Objectives This study reports on the methods and results from the C8 Health Project, a population study created to gather data that would allow class members to know their own PFOA levels and permit subsequent epidemiologic investigations. Methods Final study participation was 69,030, enrolled over a 13-month period in 2005–2006. Extensive data were collected, including demographic data, medical diagnoses (both self-report and medical records review), clinical laboratory testing, and determination of serum concentrations of 10 perfluorocarbons (PFCs). Here we describe the processes used to collect, validate, and store these health data. We also describe survey participants and their serum PFC levels. Results The population geometric mean for serum PFOA was 32.91 ng/mL, 500% higher than previously reported for a representative American population. Serum concentrations for perfluorohexane sulfonate and perfluorononanoic acid were elevated by 39% and 73%, respectively, whereas perfluorooctanesulfonate was present at levels similar to those in the U.S. population. Conclusions This largest known population study of community PFC exposure permits new evaluations of associations between PFOA, in particular, and a range of health parameters. These will contribute to understanding of the biology of PFC exposure. The C8 Health Project also represents an unprecedented effort to gather basic data on an exposed population; its achievements and limitations can inform future legal settlements for populations exposed to

  1. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Phase 1

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley Multidisciplinary Design Optimization (MDO) method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. This report documents all computational experiments conducted in Phase I of the study. This report is a companion to the paper titled Initial Results of an MDO Method Evaluation Study by N. M. Alexandrov and S. Kodiyalam (AIAA-98-4884).

  2. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

The primary objective of the three year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the area of design decision-making, probabilistic modeling, and optimization. The specific focus of the three year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In particular, in the final year of research, the focus was on comparing and contrasting the three methods researched. Specifically, these three are the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regards to identification of research needs in the baseline method through implementation exercises. The end result of Task B was the documentation of the evolution of the method with time and a technology transfer to the sponsor regarding the method, such that an initial capability for execution could be obtained by the sponsor. Specifically, the results of year 3 efforts were the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and

  3. A PDE Sensitivity Equation Method for Optimal Aerodynamic Design

    NASA Technical Reports Server (NTRS)

    Borggaard, Jeff; Burns, John

    1996-01-01

    The use of gradient based optimization algorithms in inverse design is well established as a practical approach to aerodynamic design. A typical procedure uses a simulation scheme to evaluate the objective function (from the approximate states) and its gradient, then passes this information to an optimization algorithm. Once the simulation scheme (CFD flow solver) has been selected and used to provide approximate function evaluations, there are several possible approaches to the problem of computing gradients. One popular method is to differentiate the simulation scheme and compute design sensitivities that are then used to obtain gradients. Although this black-box approach has many advantages in shape optimization problems, one must compute mesh sensitivities in order to compute the design sensitivity. In this paper, we present an alternative approach using the PDE sensitivity equation to develop algorithms for computing gradients. This approach has the advantage that mesh sensitivities need not be computed. Moreover, when it is possible to use the CFD scheme for both the forward problem and the sensitivity equation, then there are computational advantages. An apparent disadvantage of this approach is that it does not always produce consistent derivatives. However, for a proper combination of discretization schemes, one can show asymptotic consistency under mesh refinement, which is often sufficient to guarantee convergence of the optimal design algorithm. In particular, we show that when asymptotically consistent schemes are combined with a trust-region optimization algorithm, the resulting optimal design method converges. We denote this approach as the sensitivity equation method. The sensitivity equation method is presented, convergence results are given and the approach is illustrated on two optimal design problems involving shocks.
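The sensitivity equation idea can be shown on a scalar ODE: differentiating the state equation with respect to a design parameter yields a second equation for the sensitivity, which is integrated with the same scheme as the forward problem. This is a toy illustration of the principle, not the paper's PDE setting.

```python
import numpy as np

# State equation: x' = -a*x, x(0) = 1. Differentiating with respect to
# the design parameter a gives the sensitivity equation for s = dx/da:
#   s' = -x - a*s,  s(0) = 0.
# Both are integrated with the same scheme (forward Euler), mirroring
# the point that the flow solver can be reused for the sensitivity.
def solve(a, T=1.0, n=20000):
    dt = T / n
    x, s = 1.0, 0.0
    for _ in range(n):
        x, s = x + dt * (-a * x), s + dt * (-x - a * s)
    return x, s

a = 0.5
x_T, s_T = solve(a)
# Exact solution: x(T) = exp(-a*T) and dx/da = -T*exp(-a*T)
print(x_T, s_T)
```

Note that no "mesh sensitivity" appears anywhere: the sensitivity is obtained by solving an auxiliary equation, which is precisely the advantage the abstract claims over differentiating the simulation scheme.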

  4. Taguchi method of experimental design in materials education

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.

    1993-01-01

    Some of the advantages and disadvantages of the Taguchi Method of experimental design as applied to Materials Science will be discussed. This is a fractional factorial method that employs the minimum number of experimental trials for the information obtained. The analysis is also very simple to use and teach, which is quite advantageous in the classroom. In addition, the Taguchi loss function can be easily incorporated to emphasize that improvements in reproducibility are often at least as important as optimization of the response. The disadvantages of the Taguchi Method include the fact that factor interactions are normally not accounted for, there are zero degrees of freedom if all of the possible factors are used, and randomization is normally not used to prevent environmental biasing. In spite of these disadvantages it is felt that the Taguchi Method is extremely useful for both teaching experimental design and as a research tool, as will be shown with a number of brief examples.
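A minimal Taguchi-style analysis can be sketched with the L4 orthogonal array; the responses below are invented for illustration. Column 3 of the array is confounded with the interaction of factors 1 and 2, which is exactly the "interactions not accounted for" caveat.

```python
import numpy as np

# L4 orthogonal array: three two-level factors in four trials (a full
# factorial would need eight). Column 3 equals the 1x2 interaction
# pattern, so its main effect is confounded with that interaction.
L4 = np.array([[1, 1, 1],
               [1, 2, 2],
               [2, 1, 2],
               [2, 2, 1]])

# Hypothetical measured responses for the four trials
y = np.array([20.0, 24.0, 30.0, 34.0])

# Main effect of each factor: mean response at level 2 minus at level 1
effects = [y[L4[:, j] == 2].mean() - y[L4[:, j] == 1].mean() for j in range(3)]
for j, eff in enumerate(effects, start=1):
    print(f"factor {j}: effect = {eff:+.1f}")
```

The factor with the largest effect dominates the response, and a Taguchi loss function can then weigh reproducibility against the optimum level found here.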

  5. Function combined method for design innovation of children's bike

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoli; Qiu, Tingting; Chen, Huijuan

    2013-03-01

As children mature, children's bike products on the market evolve with them and are frequently updated. Certain problems arise in use, such as overlapping product cycles, redundant functions, and short life cycles, which run counter to the principles of energy conservation, environmental protection, and intensive design. In this paper, a rational multi-function design method based on functional superposition, transformation, and technical implementation is proposed. An organic combination of a frog-style scooter and a children's tricycle is developed using this method. From an ergonomic perspective, the paper analyzes the body dimensions of children aged 5 to 12 and extracts data for a multi-function children's bike that can be used for both gliding and riding; by inverting the body, the handles and the pedals of the bike can be interchanged. Finally, the paper provides a detailed analysis of the components and structural design, body material, and processing technology of the bike. The study provides an effective design method that addresses these product problems, extends functionality, improves the product's market position, and enhances energy saving while supporting intensive product development.

  6. System Synthesis in Preliminary Aircraft Design using Statistical Methods

    NASA Technical Reports Server (NTRS)

    DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.

    1996-01-01

This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).

  8. A Simple Method for High-Lift Propeller Conceptual Design

    NASA Technical Reports Server (NTRS)

    Patterson, Michael; Borer, Nick; German, Brian

    2016-01-01

    In this paper, we present a simple method for designing propellers that are placed upstream of the leading edge of a wing in order to augment lift. Because the primary purpose of these "high-lift propellers" is to increase lift rather than produce thrust, these props are best viewed as a form of high-lift device; consequently, they should be designed differently than traditional propellers. We present a theory that describes how these props can be designed to provide a relatively uniform axial velocity increase, which is hypothesized to be advantageous for lift augmentation based on a literature survey. Computational modeling indicates that such propellers can generate the same average induced axial velocity while consuming less power and producing less thrust than conventional propeller designs. For an example problem based on specifications for NASA's Scalable Convergent Electric Propulsion Technology and Operations Research (SCEPTOR) flight demonstrator, a propeller designed with the new method requires approximately 15% less power and produces approximately 11% less thrust than one designed for minimum induced loss. Higher-order modeling and/or wind tunnel testing are needed to verify the predicted performance.
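    The link between a target induced axial velocity and the resulting thrust and power can be illustrated with classical actuator-disk momentum theory. This is a hedged sketch under assumed numbers (none taken from the SCEPTOR specification), not the authors' design method:

```python
# Actuator-disk momentum theory: thrust T relates to the induced axial
# velocity vi at the disk via T = 2*rho*A*vi*(V + vi). Solving this
# quadratic for vi shows how a target axial velocity increase maps
# back to required thrust and ideal induced power.
import math

def induced_velocity(T, V, rho, R):
    """Induced axial velocity at the disk for thrust T at airspeed V."""
    A = math.pi * R**2
    # 2*rho*A*vi^2 + 2*rho*A*V*vi - T = 0  ->  take the positive root.
    return (-V + math.sqrt(V**2 + 2.0*T/(rho*A))) / 2.0

# Illustrative numbers only (assumed, not SCEPTOR data).
T, V, rho, R = 220.0, 58.0, 1.225, 0.29
vi = induced_velocity(T, V, rho, R)
P_ideal = T * (V + vi)   # ideal power through the disk
print(vi, P_ideal)
```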

  9. An interdisciplinary heuristic evaluation method for universal building design.

    PubMed

    Afacan, Yasemin; Erbug, Cigdem

    2009-07-01

    This study highlights how heuristic evaluation as a usability evaluation method can feed into current building design practice to conform to universal design principles. It provides a definition of universal usability that is applicable to an architectural design context. It takes the seven universal design principles as a set of heuristics and applies an iterative sequence of heuristic evaluation in a shopping mall, aiming to achieve a cost-effective evaluation process. The evaluation was composed of three consecutive sessions. First, five evaluators from different professions were interviewed regarding the construction drawings in terms of universal design principles. Then, each evaluator was asked to perform the predefined task scenarios. In subsequent interviews, the evaluators were asked to re-analyze the construction drawings. The results showed that heuristic evaluation could successfully integrate universal usability into current building design practice in two ways: (i) it promoted an iterative evaluation process combined with multi-sessions rather than relying on one evaluator and on one evaluation session to find the maximum number of usability problems, and (ii) it highlighted the necessity of an interdisciplinary ad hoc committee regarding the heuristic abilities of each profession. A multi-session and interdisciplinary heuristic evaluation method can save both the project budget and the required time, while ensuring a reduced error rate for the universal usage of the built environments.

  10. Molecular Epidemiology of Malaria

    PubMed Central

    Conway, David J.

    2007-01-01

    Malaria persists as an undiminished global problem, but the resources available to address it have increased. Many tools for understanding its biology and epidemiology are well developed, with a particular richness of comparative genome sequences. Targeted genetic manipulation is now effectively combined with in vitro culture assays on the most important human parasite, Plasmodium falciparum, and with in vivo analysis of rodent and monkey malaria parasites in their laboratory hosts. Studies of the epidemiology, prevention, and treatment of human malaria have already been influenced by the availability of molecular methods, and analyses of parasite polymorphisms have long had useful and highly informative applications. However, the molecular epidemiology of malaria is currently undergoing its most substantial revolution as a result of the genomic information and technologies that are available in well-resourced centers. It is a challenge for research agendas to face the real needs presented by a disease that largely exists in extremely resource-poor settings, but it is one that there appears to be an increased willingness to undertake. To this end, developments in the molecular epidemiology of malaria are reviewed here, emphasizing aspects that may be current and future priorities. PMID:17223628

  11. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77; De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68; Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
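    A rough illustration of how Fisher-information-based criteria score candidate sampling distributions is sketched below. This is an assumed setup (logistic model parameters and noise level invented for illustration), not the authors' formulation; the "SE" criterion here is a simple normalized sum of variances in the spirit of standard-error-based design.

```python
# Score two candidate sets of sampling times for the logistic model
# x(t) = K / (1 + (K/x0 - 1)*exp(-r*t)) using criteria built from the
# Fisher information matrix (FIM). Parameter values are assumed.
import numpy as np

K, r, x0, sigma = 17.5, 0.7, 0.1, 0.5   # assumed "true" values

def sensitivities(t):
    # Central finite-difference sensitivities dx/dK and dx/dr at time t.
    def x(K_, r_):
        return K_ / (1.0 + (K_/x0 - 1.0)*np.exp(-r_*t))
    h = 1e-6
    return np.array([(x(K+h, r) - x(K-h, r)) / (2*h),
                     (x(K, r+h) - x(K, r-h)) / (2*h)])

def fim(times):
    F = np.zeros((2, 2))
    for t in times:
        s = sensitivities(t)
        F += np.outer(s, s) / sigma**2
    return F

def criteria(times):
    cov = np.linalg.inv(fim(times))
    return {"D": np.linalg.det(cov),           # generalized variance
            "E": np.linalg.eigvalsh(cov)[-1],  # worst-direction variance
            "SE": float(np.sum(np.diag(cov) / np.array([K, r])**2))}

dense = criteria(np.linspace(0.5, 12.0, 10))   # spans the growth curve
clumped = criteria(np.linspace(0.5, 3.0, 10))  # early times only
print(dense, clumped)
```

    Times clustered before the inflection of the logistic curve carry little information about K, so the spread-out schedule scores better (smaller) on all three criteria.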

  12. New Methods and Transducer Designs for Ultrasonic Diagnostics and Therapy

    NASA Astrophysics Data System (ADS)

    Rybyanets, A. N.; Naumenko, A. A.; Sapozhnikov, O. A.; Khokhlova, V. A.

    Recent advances in the field of physical acoustics, imaging technologies, piezoelectric materials, and ultrasonic transducer design have led to the emergence of novel methods and apparatus for ultrasonic diagnostics, therapy and body aesthetics. The paper presents the results of the development and experimental study of different high intensity focused ultrasound (HIFU) transducers. Technological peculiarities of the HIFU transducer design as well as theoretical and numerical models of such transducers and the corresponding HIFU fields are discussed. Several HIFU transducers of different design have been fabricated using different advanced piezoelectric materials. Acoustic field measurements for those transducers have been performed using a calibrated fiber optic hydrophone and an ultrasonic measurement system (UMS). The results of ex vivo experiments with different tissues as well as in vivo experiments with blood vessels are presented that prove the efficacy, safety and selectivity of the developed HIFU transducers and methods.

  13. New displacement-based methods for optimal truss topology design

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin P.; Ben-Tal, Aharon; Haftka, Raphael T.

    1991-01-01

    Two alternate methods for maximum stiffness truss topology design are presented. The ground structure approach is used, and the problem is formulated in terms of displacements and bar areas. This large, nonconvex optimization problem can be solved by a simultaneous analysis and design approach. Alternatively, an equivalent, unconstrained, and convex problem in the displacements only can be formulated, and this problem can be solved by a nonsmooth, steepest descent algorithm. In both methods, the explicit solving of the equilibrium equations and the assembly of the global stiffness matrix are circumvented. A large number of examples have been studied, showing the attractive features of topology design as well as exposing interesting features of optimal topologies.

  14. Obtaining Valid Response Rates: Considerations beyond the Tailored Design Method.

    ERIC Educational Resources Information Center

    Huang, Judy Y.; Hubbard, Susan M.; Mulvey, Kevin P.

    2003-01-01

    Reports on the use of the tailored design method (TDM) to achieve high survey response in two separate studies of the dissemination of Treatment Improvement Protocols (TIPs). Findings from these two studies identify six factors that may have influenced nonresponse, and show that use of TDM does not, in itself, guarantee a high response rate. (SLD)

  15. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150.

  16. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150.

  17. Impact design methods for ceramic components in gas turbine engines

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1991-01-01

    Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.

  18. Designs and Methods in School Improvement Research: A Systematic Review

    ERIC Educational Resources Information Center

    Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie

    2016-01-01

    Purpose: The purpose of this paper is to focus on challenges faced by longitudinal quantitative analyses of school improvement processes and to offer a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed designs and methods that are used to analyze the relation between school…

  19. Polypharmacology: in silico methods of ligand design and development.

    PubMed

    McKie, Samuel A

    2016-04-01

    How to design a ligand to bind multiple targets, rather than a single target, is the focus of this review. Rational polypharmacology draws on knowledge that is both broad ranging and hierarchical. Computer-aided multitarget ligand design methods are described according to their nested knowledge level. Ligand-only and then receptor-ligand strategies are first described, followed by the metabolic network viewpoint. Subsequently, strategies that view infectious diseases as multigenomic targets are discussed, and finally the disease-level interpretation of medicinal therapy is considered. As yet there is no consensus on how best to proceed in designing a multitarget ligand. The current methodologies are brought together in an attempt to give a practical overview of how polypharmacology design might be best initiated. PMID:27105127

  1. Design and descriptive epidemiology of the Infectious Diseases of East African Livestock (IDEAL) project, a longitudinal calf cohort study in western Kenya

    PubMed Central

    2013-01-01

    Background There is a widely recognised lack of baseline epidemiological data on the dynamics and impacts of infectious cattle diseases in east Africa. The Infectious Diseases of East African Livestock (IDEAL) project is an epidemiological study of cattle health in western Kenya with the aim of providing baseline epidemiological data, investigating the impact of different infections on key responses such as growth, mortality and morbidity, the additive and/or multiplicative effects of co-infections, and the influence of management and genetic factors. A longitudinal cohort study of newborn calves was conducted in western Kenya between 2007 and 2009. Calves were randomly selected from all those reported in a two-stage clustered sampling strategy. Calves were recruited between 3 and 7 days old. A team of veterinarians and animal health assistants carried out 5-weekly clinical and postmortem visits. Blood and tissue samples were collected in association with all visits and screened using a range of laboratory based diagnostic methods for over 100 different pathogens or infectious exposures. Results The study followed 548 calves over the first 51 weeks of life or until death, and whenever they were reported clinically ill. The cohort experienced a high all-cause mortality rate of 16%, with at least 13% of deaths due to infectious diseases. Only 307 (6%) of routine visits were classified as clinical episodes, with a further 216 reported by farmers. 54% of calves reached one year without a reported clinical episode. Mortality was mainly due to east coast fever, haemonchosis, and heartwater. Over 50 pathogens were detected in this population, with exposure to a further 6 viruses and bacteria. Conclusion The IDEAL study has demonstrated that it is possible to mount population based longitudinal animal studies. The results quantify for the first time in an animal population the high diversity of pathogens a population may have to deal with and the levels of co-infections with key

  2. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures; and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. PMID:27397810

  3. Computational methods for aerodynamic design using numerical optimization

    NASA Technical Reports Server (NTRS)

    Peeters, M. F.

    1983-01-01

    Five methods to increase the computational efficiency of aerodynamic design using numerical optimization, by reducing the computer time required to perform gradient calculations, are examined. The most promising method consists of drastically reducing the size of the computational domain on which aerodynamic calculations are made during gradient calculations. Since a gradient calculation requires the solution of the flow about an airfoil whose geometry was slightly perturbed from a base airfoil, the flow about the base airfoil is used to determine boundary conditions on the reduced computational domain. This method worked well in subcritical flow.

  4. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  5. Mixed methods research: a design for emergency care research?

    PubMed

    Cooper, Simon; Porter, Jo; Endacott, Ruth

    2011-08-01

    This paper follows previous publications on generic qualitative approaches, qualitative designs and action research in emergency care by this group of authors. Contemporary views on mixed methods approaches are considered, with a particular focus on the design choice and the amalgamation of qualitative and quantitative data, emphasising the timing of data collection for each approach, their relative 'weight' and how they will be mixed. Mixed methods studies in emergency care are reviewed before the variety of methodological approaches and best practice considerations are presented. The use of mixed methods in clinical studies is increasing; such designs aim to answer questions such as 'how many' and 'why' in the same study, and are thus an important and useful approach to many key questions in emergency care.

  6. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding the best parameter setting" for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption, and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, optimal solutions for multiple instances were found efficiently.
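    The 2-level factorial portion of such a study can be sketched as follows. All parameter names and the objective are hypothetical stand-ins (a real study would execute the GA at each design point); the sketch generates a 2^3 factorial design, fits a first-order regression in coded units, and reads the best setting off the signs of the estimated effects.

```python
# 2^3 full factorial design over three (assumed) GA parameters, with a
# regression model fitted to a noisy placeholder objective to pick the
# best setting. Lower objective values are better.
import itertools
import numpy as np

factors = {"pop_size": (50, 200), "cross_rate": (0.6, 0.9), "mut_rate": (0.01, 0.1)}

def run_ga(pop_size, cross_rate, mut_rate, rng):
    # Placeholder for a real GA run on a tardiness instance.
    return (300 - 0.2*pop_size - 100*cross_rate + 400*mut_rate
            + rng.normal(0, 2.0))

rng = np.random.default_rng(0)
design = list(itertools.product([-1, 1], repeat=3))   # 8 coded runs
rows, y = [], []
for coded in design:
    actual = [lo + (c + 1)/2*(hi - lo)
              for c, (lo, hi) in zip(coded, factors.values())]
    rows.append(coded)
    y.append(run_ga(*actual, rng))

# First-order regression in coded units: y = b0 + b1*x1 + b2*x2 + b3*x3.
X = np.column_stack([np.ones(len(rows)), np.array(rows)])
b, *_ = np.linalg.lstsq(X, np.array(y), rcond=None)

# A negative effect means the high level lowers (improves) the objective.
best = {name: (hi if coef < 0 else lo)
        for (name, (lo, hi)), coef in zip(factors.items(), b[1:])}
print(best)
```

    Unlike OFAT, a factorial design varies all parameters simultaneously, so interaction columns could be added to the regression at no extra experimental cost.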

  7. Current methods of epitope identification for cancer vaccine design.

    PubMed

    Cherryholmes, Gregory A; Stanton, Sasha E; Disis, Mary L

    2015-12-16

    The importance of the immune system in tumor development and progression has been emerging in many cancers. Previous cancer vaccines have not shown long-term clinical benefit, possibly because they were not designed to avoid eliciting regulatory T-cell responses that inhibit the anti-tumor immune response. This review will examine different methods of identifying epitopes derived from tumor associated antigens suitable for immunization and the steps used to design and validate peptide epitopes to improve efficacy of anti-tumor peptide-based vaccines. Focusing on in silico prediction algorithms, we survey the advantages and disadvantages of current cancer vaccine prediction tools.

  8. Material Design, Selection, and Manufacturing Methods for System Sustainment

    SciTech Connect

    David Sowder, Jim Lula, Curtis Marshall

    2010-02-18

    This paper describes a material selection and validation process proven to be successful for manufacturing high-reliability long-life product. The National Secure Manufacturing Center business unit of the Kansas City Plant (herein called KCP) designs and manufactures complex electrical and mechanical components used in extreme environments. The material manufacturing heritage is founded in the systems design-to-manufacturing practices that support the U.S. Department of Energy’s National Nuclear Security Administration (DOE/NNSA). Material engineers at KCP work with the systems designers to recommend materials, develop test methods, perform analytical analysis of test data, define cradle-to-grave needs, and present the final selection and fielding. The KCP material engineers typically maintain cost control by utilizing commercial products when possible, but have the resources to develop and produce unique formulations as necessary. This approach is currently being used to mature technologies to manufacture materials with improved characteristics using nano-composite filler materials that will enhance system design and production. For some products the engineers plan and carry out science-based life-cycle material surveillance processes. Recent examples of the approach include refurbished manufacturing of the high voltage power supplies for cockpit displays in operational aircraft; dry film lubricant application to improve bearing life for guided munitions gyroscope gimbals; ceramic substrate design for electrical circuit manufacturing; and tailored polymeric materials for various systems. The following examples show evidence of KCP concurrent design-to-manufacturing techniques used to achieve system solutions that satisfy or exceed demanding requirements.

  9. Race, racism, and epidemiological surveys.

    PubMed

    Adebimpe, V R

    1994-01-01

    Many studies of clinical populations have reported significant differences between whites and blacks in prevalence rates of mental disorders. However, data from the Epidemiologic Catchment Area study indicate only modest differences. The author describes factors in the treatment experiences of black and white patients that may lead researchers to find questionable disparities in prevalence rates. These factors include racial differences in treatment-seeking behavior, likelihood of involuntary commitment, representation in research samples, presentation of psychiatric symptoms and resulting diagnoses, and accuracy of psychological tests as well as disparities in treatment. The author suggests guidelines for improving research methods and designs, including documenting the ethnic composition of samples and using structured diagnostic assessments, so that unintended inequalities can be identified, addressed, and monitored and the accuracy of prevalence data among blacks can be improved.

  10. Denoising Sparse Images from GRAPPA using the Nullspace Method (DESIGN)

    PubMed Central

    Weller, Daniel S.; Polimeni, Jonathan R.; Grady, Leo; Wald, Lawrence L.; Adalsteinsson, Elfar; Goyal, Vivek K

    2011-01-01

    To accelerate magnetic resonance imaging using uniformly undersampled (nonrandom) parallel imaging beyond what is achievable with GRAPPA alone, the Denoising of Sparse Images from GRAPPA using the Nullspace method (DESIGN) is developed. The trade-off between denoising and smoothing the GRAPPA solution is studied for different levels of acceleration. Several brain images reconstructed from uniformly undersampled k-space data using DESIGN are compared against reconstructions using existing methods in terms of difference images (a qualitative measure), PSNR, and noise amplification (g-factors) as measured using the pseudo-multiple replica method. Effects of smoothing, including contrast loss, are studied in synthetic phantom data. In the experiments presented, the contrast loss and spatial resolution are competitive with existing methods. Results for several brain images demonstrate significant improvements over GRAPPA at high acceleration factors in denoising performance with limited blurring or smoothing artifacts. In addition, the measured g-factors suggest that DESIGN mitigates noise amplification better than both GRAPPA and L1 SPIR-iT (the latter limited here by uniform undersampling). PMID:22213069

  11. Docking methods for structure-based library design.

    PubMed

    Cavasotto, Claudio N; Phatak, Sharangdhar S

    2011-01-01

    The drug discovery process mainly relies on the experimental high-throughput screening of huge compound libraries in their pursuit of new active compounds. However, spiraling research and development costs and unimpressive success rates have driven the development of more rational, efficient, and cost-effective methods. With the increasing availability of protein structural information, advancement in computational algorithms, and faster computing resources, in silico docking-based methods are increasingly used to design smaller and focused compound libraries in order to reduce screening efforts and costs and at the same time identify active compounds with a better chance of progressing through the optimization stages. This chapter is a primer on the various docking-based methods developed for the purpose of structure-based library design. Our aim is to elucidate some basic terms related to the docking technique and explain the methodology behind several docking-based library design methods. This chapter also aims to guide the novice computational practitioner by laying out the general steps involved for such an exercise. Selected successful case studies conclude this chapter. PMID:20981523

  13. Application of the CSCM method to the design of wedge cavities. [Conservative Supra Characteristic Method

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Nystrom, G. A.; Bardina, J.; Lombard, C. K.

    1987-01-01

    This paper describes the application of the conservative supra characteristic method (CSCM) to predict the flow around two-dimensional slot injection cooled cavities in hypersonic flow. Seven different numerical solutions are presented that model three different experimental designs. The calculations manifest outer flow conditions including the effects of nozzle/lip geometry, angle of attack, nozzle inlet conditions, boundary and shear layer growth, and turbulence on the surrounding flow. The calculations were performed for analysis prior to wind tunnel testing for sensitivity studies early in the design process. Qualitative and quantitative understanding of the flows for each of the cavity designs and design recommendations are provided. The present paper demonstrates the ability of numerical schemes, such as the CSCM method, to play a significant role in the design process.

  14. COMPSIZE - PRELIMINARY DESIGN METHOD FOR FIBER REINFORCED COMPOSITE STRUCTURES

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1994-01-01

    The Composite Structure Preliminary Sizing program, COMPSIZE, is an analytical tool which structural designers can use when doing approximate stress analysis to select or verify preliminary sizing choices for composite structural members. It is useful in the beginning stages of design concept definition, when it is helpful to have quick and convenient approximate stress analysis tools available so that a wide variety of structural configurations can be sketched out and checked for feasibility. At this stage of the design process the stress/strain analysis does not need to be particularly accurate because any configurations tentatively defined as feasible will later be analyzed in detail by stress analysis specialists. The emphasis is on fast, user-friendly methods so that rough but technically sound evaluation of a broad variety of conceptual designs can be accomplished. Analysis equations used are, in most cases, widely known basic structural analysis methods. All the equations used in this program assume elastic deformation only. The default material selection is intermediate strength graphite/epoxy laid up in a quasi-isotropic laminate. A general flat laminate analysis subroutine is included for analyzing arbitrary laminates. However, COMPSIZE should be sufficient for most users to presume a quasi-isotropic layup and use the familiar basic structural analysis methods for isotropic materials, after estimating an appropriate elastic modulus. Homogeneous materials can be analyzed as simplified cases. The COMPSIZE program is written in IBM BASICA. The program format is interactive. It was designed on an IBM Personal Computer operating under DOS with a central memory requirement of approximately 128K. It has been implemented on an IBM compatible with GW-BASIC under DOS 3.2. COMPSIZE was developed in 1985.

  15. 77 FR 60985 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-05

... 53, as amended on August 31, 2011 (76 FR 54326-54341). The new equivalent methods are automated... beta radiation attenuation. The newly designated equivalent methods are identified as follows: EQPM-0912-204, "Teledyne Model 602 Beta PLUS Particle Measurement System" and "SWAM 5a Dual...

  16. Preliminary demonstration of a robust controller design method

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1980-01-01

Alternative computational procedures for obtaining a feedback control law which yields a control signal based on measurable quantities are evaluated. The three methods evaluated are: (1) the standard linear quadratic regulator design model; (2) minimization of the norm of the feedback matrix K via nonlinear programming, subject to the constraint that the closed-loop eigenvalues lie in a specified domain in the complex plane; and (3) maximization of the angles between the closed-loop eigenvectors in combination with minimization of the norm of K, also via constrained nonlinear programming. The third, or robust, design method was chosen to yield a closed-loop system whose eigenvalues are insensitive to small changes in the A and B matrices. The relationship between orthogonality of closed-loop eigenvectors and the sensitivity of closed-loop eigenvalues is described. Computer programs are described.

  17. Helicopter flight-control design using an H(2) method

    NASA Technical Reports Server (NTRS)

    Takahashi, Marc D.

    1991-01-01

    Rate-command and attitude-command flight-control designs for a UH-60 helicopter in hover are presented and were synthesized using an H(2) method. Using weight functions, this method allows the direct shaping of the singular values of the sensitivity, complementary sensitivity, and control input transfer-function matrices to give acceptable feedback properties. The designs were implemented on the Vertical Motion Simulator, and four low-speed hover tasks were used to evaluate the control system characteristics. The pilot comments from the accel-decel, bob-up, hovering turn, and side-step tasks indicated good decoupling and quick response characteristics. However, an underlying roll PIO tendency was found to exist away from the hover condition, which was caused by a flap regressing mode with insufficient damping.

  18. National Tuberculosis Genotyping and Surveillance Network: Design and Methods

    PubMed Central

    Braden, Christopher R.; Schable, Barbara A.; Onorato, Ida M.

    2002-01-01

    The National Tuberculosis Genotyping and Surveillance Network was established in 1996 to perform a 5-year, prospective study of the usefulness of genotyping Mycobacterium tuberculosis isolates to tuberculosis control programs. Seven sentinel sites identified all new cases of tuberculosis, collected information on patients and contacts, and obtained patient isolates. Seven genotyping laboratories performed DNA fingerprinting analysis by the international standard IS6110 method. BioImage Whole Band Analyzer software was used to analyze patterns, and distinct patterns were assigned unique designations. Isolates with six or fewer bands on IS6110 patterns were also spoligotyped. Patient data and genotyping designations were entered in a relational database and merged with selected variables from the national surveillance database. In two related databases, we compiled the results of routine contact investigations and the results of investigations of the relationships of patients who had isolates with matching genotypes. We describe the methods used in the study. PMID:12453342

  19. Optical design and active optics methods in astronomy

    NASA Astrophysics Data System (ADS)

    Lemaitre, Gerard R.

    2013-03-01

Optical designs for astronomy involve implementation of active optics and adaptive optics from the X-ray to the infrared. Developments and results of active optics methods for telescopes, spectrographs and coronagraphic planet finders are presented. The high accuracy and remarkable smoothness of surfaces generated by active optics methods also allow elaborating new optical design types with highly aspheric and/or non-axisymmetric surfaces. Depending on the goal and the performance requested for a deformable optical surface, analytical investigations are carried out with one of the various facets of elasticity theory: small-deformation thin-plate theory, large-deformation thin-plate theory, shallow spherical shell theory, and weakly conical shell theory. The resulting thickness distribution and associated bending force boundaries can be refined further with finite element analysis.

  20. A Requirements-Driven Optimization Method for Acoustic Treatment Design

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.

    2016-01-01

    Acoustic treatment designers have long been able to target specific noise sources inside turbofan engines. Facesheet porosity and cavity depth are key design variables of perforate-over-honeycomb liners that determine levels of noise suppression as well as the frequencies at which suppression occurs. Layers of these structures can be combined to create a robust attenuation spectrum that covers a wide range of frequencies. Looking to the future, rapidly-emerging additive manufacturing technologies are enabling new liners with multiple degrees of freedom, and new adaptive liners with variable impedance are showing promise. More than ever, there is greater flexibility and freedom in liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. The subject of this paper is an analytical method that derives the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground.

  1. Subsonic panel method for designing wing surfaces from pressure distribution

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.; Hawk, J. D.

    1983-01-01

    An iterative method has been developed for designing wing section contours corresponding to a prescribed subcritical distribution of pressure. The calculations are initialized by using a surface panel method to analyze a baseline wing or wing-fuselage configuration. A first-order expansion to the baseline panel method equations is then used to calculate a matrix containing the partial derivative of potential at each control point with respect to each unknown geometry parameter. In every iteration cycle, the matrix is used both to calculate the geometry perturbation and to analyze the perturbed geometry. The distribution of potential on the perturbed geometry is established by simple linear extrapolation from the baseline solution. The extrapolated potential is converted to pressure by Bernoulli's equation. Not only is the accuracy of the approach good for very large perturbations, but the computing cost of each complete iteration cycle is substantially less than one analysis solution by a conventional panel method.
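
One cycle of the iteration described above — sensitivity matrix, least-squares geometry perturbation, then linear extrapolation of the potential — can be sketched in a hypothetical linearized form. The Jacobian and target below are synthetic stand-ins, not output of an actual panel method.

```python
import numpy as np

# Hypothetical problem sizes: n control points, m unknown geometry parameters.
rng = np.random.default_rng(0)
n, m = 12, 4
jac = rng.normal(size=(n, m))      # stand-in for d(potential)/d(geometry) matrix
phi_base = rng.normal(size=n)      # baseline potential at the control points

# Synthetic target constructed so an exact geometry perturbation exists.
true_dg = np.array([0.10, -0.05, 0.02, 0.0])
phi_target = phi_base + jac @ true_dg

# One design cycle: solve for the geometry perturbation in the least-squares
# sense, then extrapolate the potential on the perturbed geometry linearly
# from the baseline solution (as the abstract describes).
dg, *_ = np.linalg.lstsq(jac, phi_target - phi_base, rcond=None)
phi_new = phi_base + jac @ dg
```

Because the target here lies in the column space of the Jacobian, one cycle recovers it exactly; in a real design problem the cycle would repeat until the extrapolated pressures (via Bernoulli's equation) match the prescribed distribution.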

  2. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

A new MDO method, BLISS, and two different variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a system optimization into several subtask optimizations that may be executed concurrently, plus a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited for exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are all ideally suited for concurrent processing. Needless to say, algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.

  3. Improve emergency light design with lumens/sq ft method.

    PubMed

    Sieron, R L

    1981-05-01

    In summary, the "Lumens/sq ft Method" outlined here is proposed as a guideline for designing emergency lighting systems such as in the accompanying examples. With this method, the total lumens delivered by the emergency lighting units in the area is divided by the floor area (in sq ft) to yield a figure of merit. The author proposes that a range from 0.25 to 1.0 lumens/sq ft be specified for emergency lighting. The lower value may be used for non-critical areas (for example, warehouses), while the higher value would be used for areas such as school corridors and hospitals.
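
The calculation the article proposes is simple enough to sketch directly; the function names below are ours, not the article's.

```python
def emergency_lighting_figure_of_merit(total_lumens, floor_area_sqft):
    """The article's figure of merit: total lumens delivered by the
    emergency lighting units divided by the floor area in sq ft."""
    return total_lumens / floor_area_sqft

def meets_guideline(fom, critical_area=False):
    """Proposed range: 0.25 lumens/sq ft for non-critical areas (e.g.
    warehouses) up to 1.0 for areas such as school corridors and hospitals."""
    minimum = 1.0 if critical_area else 0.25
    return fom >= minimum

# Hypothetical example: two 800-lumen units in a 4000 sq ft warehouse.
fom = emergency_lighting_figure_of_merit(2 * 800, 4000)  # 0.4 lumens/sq ft
```

At 0.4 lumens/sq ft this hypothetical warehouse clears the non-critical threshold but would fall short of the 1.0 figure proposed for critical areas.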

  4. Synthesis of aircraft structures using integrated design and analysis methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

Systematic research to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls is reported. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate structural sizing and the associated active control system, which is optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.

  5. Bayesian methods for design and analysis of safety trials.

    PubMed

    Price, Karen L; Xia, H Amy; Lakshminarayanan, Mani; Madigan, David; Manner, David; Scott, John; Stamey, James D; Thompson, Laura

    2014-01-01

    Safety assessment is essential throughout medical product development. There has been increased awareness of the importance of safety trials recently, in part due to recent US Food and Drug Administration guidance related to thorough assessment of cardiovascular risk in the treatment of type 2 diabetes. Bayesian methods provide great promise for improving the conduct of safety trials. In this paper, the safety subteam of the Drug Information Association Bayesian Scientific Working Group evaluates challenges associated with current methods for designing and analyzing safety trials and provides an overview of several suggested Bayesian opportunities that may increase efficiency of safety trials along with relevant case examples.

  6. Asymmetric MRI magnet design using a hybrid numerical method.

    PubMed

    Zhao, H; Crozier, S; Doddrell, D M

    1999-12-01

    This paper describes a hybrid numerical method for the design of asymmetric magnetic resonance imaging magnet systems. The problem is formulated as a field synthesis and the desired current density on the surface of a cylinder is first calculated by solving a Fredholm equation of the first kind. Nonlinear optimization methods are then invoked to fit practical magnet coils to the desired current density. The field calculations are performed using a semi-analytical method. A new type of asymmetric magnet is proposed in this work. The asymmetric MRI magnet allows the diameter spherical imaging volume to be positioned close to one end of the magnet. The main advantages of making the magnet asymmetric include the potential to reduce the perception of claustrophobia for the patient, better access to the patient by attending physicians, and the potential for reduced peripheral nerve stimulation due to the gradient coil configuration. The results highlight that the method can be used to obtain an asymmetric MRI magnet structure and a very homogeneous magnetic field over the central imaging volume in clinical systems of approximately 1.2 m in length. Unshielded designs are the focus of this work. This method is flexible and may be applied to magnets of other geometries.
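
The first step of the design procedure — solving a Fredholm equation of the first kind for the desired surface current density — is ill-posed, so discretized versions are typically stabilized by regularization. The toy sketch below uses a Gaussian smoothing kernel and Tikhonov regularization purely to illustrate the idea; it is not the paper's semi-analytical method, and the kernel is not a Biot-Savart kernel.

```python
import numpy as np

# Discretized first-kind problem: kernel @ j = b, with j the unknown
# "current density" and b the target "field" samples (toy quantities).
n = 50
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
kernel = np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.02) * dx

j_true = np.sin(np.pi * x)       # smooth synthetic source
b = kernel @ j_true              # corresponding noiseless data

# Tikhonov-regularized normal equations: (K^T K + lam I) j = K^T b.
# The penalty lam * ||j||^2 stabilizes the otherwise ill-posed inversion.
lam = 1e-6
j_est = np.linalg.solve(kernel.T @ kernel + lam * np.eye(n), kernel.T @ b)
```

In the paper's setting this inversion would be followed, as described, by nonlinear optimization to fit practical coil geometries to the computed current density.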

  7. Design Methods for Load-bearing Elements from Crosslaminated Timber

    NASA Astrophysics Data System (ADS)

    Vilguts, A.; Serdjuks, D.; Goremikins, V.

    2015-11-01

Cross-laminated timber is an environmentally friendly material, which possesses a decreased level of anisotropy in comparison with solid and glued timber. Cross-laminated timber could be used for load-bearing walls and slabs of multi-storey timber buildings as well as decking structures of pedestrian and road bridges. Design methods for cross-laminated timber elements subjected to bending and to compression combined with bending were considered. The presented methods were experimentally validated and verified by FEM. Two cross-laminated timber slabs were tested under static load. Pine wood was chosen as the board material. The design scheme of the considered plates was a freely supported beam with a span of 1.9 m loaded by a uniformly distributed load. The width of the plates was 1 m. The considered cross-laminated timber plates were also analysed by FEM. The comparison of the stresses acting in the edge fibres of the plate and of the maximum vertical displacements shows that both considered methods can be used for engineering calculations. The difference between the results obtained experimentally and analytically ranges from 2 to 31%. The difference between the results obtained by the effective strength and stiffness method and by the transformed sections method was not significant.

  8. Applying standard epidemiological methods for investigating foodborne disease outbreak in resource-poor settings: lessons from Vietnam.

    PubMed

    Vo, Thuan Huu; Nguyen, Dat Van; Le, Loan Thi Kim; Phan, Lan Trong; Nuorti, J Pekka; Tran Minh, Nguyen Nhu

    2014-07-01

    An outbreak of gastroenteritis occurred among workers of company X after eating lunch prepared by a catering service. Of 430 workers attending the meal, 56 were hospitalized with abdominal pain, diarrhea, vomiting, and nausea, according to the initial report. We conducted an investigation to identify the extent, vehicle, and source of the outbreak. In our case-control study, a case was a worker who attended the meal and who was hospitalized with acute gastroenteritis; controls were randomly selected from non-ill workers. Cases and controls were interviewed using a standard questionnaire. We used logistic regression to calculate adjusted odds ratios for the consumption of food items. Catering service facilities and food handlers working for the service were inspected. Food samples from the catering service were tested at reference laboratories. Of hospitalized cases, 54 fulfilled the case definition, but no stool specimens were collected for laboratory testing. Of four food items served during lunch, only "squash and pork soup" was significantly associated with gastroenteritis, with an adjusted odds ratio of 9.5 (95 % CI 3.2, 27.7). The caterer did not separate cooked from raw foods but used the same counter for both. Cooked foods were kept at room temperature for about 4 h before serving. Four of 14 food handlers were not trained on basic food safety principles and did not have health certificates. Although no microbiological confirmation was obtained, our epidemiological investigation suggested that squash and pork soup caused the outbreak. Hospitals should be instructed to obtain stool specimens from patients with gastroenteritis. Food catering services should be educated in basic food safety measures.
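
The crude (unadjusted) analogue of the study's adjusted odds ratio can be computed directly from a 2×2 case-control table with a Wald confidence interval. The counts below are hypothetical illustrations, not the study's data, and the paper's reported OR of 9.5 came from logistic regression adjustment rather than this simple calculation.

```python
import math

def odds_ratio_ci(exposed_cases, unexposed_cases,
                  exposed_controls, unexposed_controls, z=1.96):
    """Crude odds ratio for a case-control 2x2 table with a Wald 95% CI
    on the log-odds scale."""
    a, b, c, d = exposed_cases, unexposed_cases, exposed_controls, unexposed_controls
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 45 of 54 cases and 30 of 108 controls ate the suspect dish.
or_, lo, hi = odds_ratio_ci(45, 9, 30, 78)
```

An interval excluding 1.0, as here, is what marks an exposure such as the "squash and pork soup" as significantly associated with illness.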

  9. Epidemiology and evolution of rotaviruses and noroviruses from an archival WHO Global Study in Children (1976-79) with implications for vaccine design.

    PubMed

    Rackoff, Lauren A; Bok, Karin; Green, Kim Y; Kapikian, Albert Z

    2013-01-01

    Prompted by the discovery of new gastrointestinal viruses, the NIH, NIAID and WHO investigated the etiology of acute diarrhea that occurred from 1976-1979 in a global cohort of infants and young children. Rotaviruses were found to be major pathogens worldwide, whereas the Norwalk virus could not be detected using a radioimmunoassay. The aim of this study is to re-evaluate the role and diversity of rotaviruses and noroviruses in the original cohort using more sensitive current technologies. Stools collected from Asia, Africa, and South America (n = 485) were evaluated for viral genotypes by RT-PCR and sequencing. Rotaviruses were detected in 28.9% and noroviruses in 9.7% of the specimens, with G1 rotaviruses and GII noroviruses accounting for the majority of each respective virus. Various strains in this study predated the currently assigned dates of discovery for their particular genotype, and in addition, two noroviruses (KL45 and T091) could not be assigned to current genotypes. Phylogenetic analyses demonstrated a relative constancy in circulating rotavirus genotypes over time, with several genotypes from this study becoming established in the current repertoire of viral species. Similarly, GII noroviruses have maintained dominance, with GII.4 noroviruses continuing as a predominant genotype over time. Taken together, the complex molecular epidemiology of rotaviruses and noroviruses circulating in the 1970's is consistent with current patterns, an important consideration in the design of multivalent vaccines to control these viruses.

  10. Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods

    NASA Technical Reports Server (NTRS)

    Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark

    2002-01-01

    Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.

  11. A MODEL AND CONTROLLER REDUCTION METHOD FOR ROBUST CONTROL DESIGN.

    SciTech Connect

    YUE,M.; SCHLUETER,R.

    2003-10-20

A bifurcation subsystem based model and controller reduction approach is presented. Using this approach a robust μ-synthesis SVC control is designed for interarea oscillation and voltage control based on a small reduced-order bifurcation subsystem model of the full system. The control synthesis problem is posed by structured uncertainty modeling and control configuration formulation using the bifurcation subsystem knowledge of the nature of the interarea oscillation caused by a specific uncertainty parameter. The bifurcation subsystem method plays a key role in this paper because it provides (1) a bifurcation parameter for uncertainty modeling; (2) a criterion to reduce the order of the resulting MSVC control; and (3) a low-order model for a bifurcation subsystem based SVC (BMSVC) design. The use of the model of the bifurcation subsystem to produce a low-order controller simplifies the control design and reduces the computational effort so significantly that robust μ-synthesis control can be applied to large systems where the computation would otherwise make robust control design impractical. The RGA analysis and time simulation show that the reduced BMSVC control design captures the center manifold dynamics and uncertainty structure of the full system model and is capable of stabilizing the full system and achieving satisfactory control performance.

  12. Examination of Different Exposure Metrics in an Epidemiological Study

    EPA Science Inventory

    Epidemiological studies of air pollution have traditionally relied upon measurements of ambient concentration from central-site monitoring stations as surrogates of population exposures. However, depending on the epidemiological study design, this approach may introduce exposure...

  13. Phylogenetic Analyses of Shigella and Enteroinvasive Escherichia coli for the Identification of Molecular Epidemiological Markers: Whole-Genome Comparative Analysis Does Not Support Distinct Genera Designation.

    PubMed

    Pettengill, Emily A; Pettengill, James B; Binet, Rachel

    2015-01-01

    As a leading cause of bacterial dysentery, Shigella represents a significant threat to public health and food safety. Related, but often overlooked, enteroinvasive Escherichia coli (EIEC) can also cause dysentery. Current typing methods have limited ability to identify and differentiate between these pathogens despite the need for rapid and accurate identification of pathogens for clinical treatment and outbreak response. We present a comprehensive phylogeny of Shigella and EIEC using whole genome sequencing of 169 samples, constituting unparalleled strain diversity, and observe a lack of monophyly between Shigella and EIEC and among Shigella taxonomic groups. The evolutionary relationships in the phylogeny are supported by analyses of population structure and hierarchical clustering patterns of translated gene homolog abundance. Lastly, we identified a panel of 404 single nucleotide polymorphism (SNP) markers specific to each phylogenetic cluster for more accurate identification of Shigella and EIEC. Our findings show that Shigella and EIEC are not distinct evolutionary groups within the E. coli genus and, thus, EIEC as a group is not the ancestor to Shigella. The multiple analyses presented provide evidence for reconsidering the taxonomic placement of Shigella. The SNP markers offer more discriminatory power to molecular epidemiological typing methods involving these bacterial pathogens. PMID:26834722

  14. Phylogenetic Analyses of Shigella and Enteroinvasive Escherichia coli for the Identification of Molecular Epidemiological Markers: Whole-Genome Comparative Analysis Does Not Support Distinct Genera Designation

    PubMed Central

    Pettengill, Emily A.; Pettengill, James B.; Binet, Rachel

    2016-01-01

    As a leading cause of bacterial dysentery, Shigella represents a significant threat to public health and food safety. Related, but often overlooked, enteroinvasive Escherichia coli (EIEC) can also cause dysentery. Current typing methods have limited ability to identify and differentiate between these pathogens despite the need for rapid and accurate identification of pathogens for clinical treatment and outbreak response. We present a comprehensive phylogeny of Shigella and EIEC using whole genome sequencing of 169 samples, constituting unparalleled strain diversity, and observe a lack of monophyly between Shigella and EIEC and among Shigella taxonomic groups. The evolutionary relationships in the phylogeny are supported by analyses of population structure and hierarchical clustering patterns of translated gene homolog abundance. Lastly, we identified a panel of 404 single nucleotide polymorphism (SNP) markers specific to each phylogenetic cluster for more accurate identification of Shigella and EIEC. Our findings show that Shigella and EIEC are not distinct evolutionary groups within the E. coli genus and, thus, EIEC as a group is not the ancestor to Shigella. The multiple analyses presented provide evidence for reconsidering the taxonomic placement of Shigella. The SNP markers offer more discriminatory power to molecular epidemiological typing methods involving these bacterial pathogens. PMID:26834722

  16. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Man Mohan, Rai

    2006-01-01

An evolutionary method (DE) is first used to solve a relatively difficult problem in extended surface heat transfer wherein optimal fin geometries are obtained for different safe operating base temperatures. The objective of maximizing the safe operating base temperature range is in direct conflict with the objective of maximizing fin heat transfer. This problem is a good example of achieving robustness in the context of changing operating conditions. The evolutionary method is then used to design a turbine airfoil; the two objectives being reduced sensitivity of the pressure distribution to small changes in the airfoil shape and the maximization of the trailing edge wedge angle with the consequent increase in airfoil thickness and strength. This is a relevant example of achieving robustness to manufacturing tolerances and wear and tear in the presence of other objectives.
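
The abstract does not spell out the evolutionary algorithm, so the sketch below shows only the core mutation/crossover/selection loop of a minimal single-objective differential evolution (the common rand/1/bin scheme is assumed); the paper's method is multi-objective and more elaborate.

```python
import random

def de_minimize(f, bounds, pop_size=20, f_weight=0.8, cr=0.9, gens=200, seed=1):
    """Minimal differential evolution (rand/1/bin), illustrative only."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: combine three distinct randomly chosen members.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantee at least one mutated gene
            trial = []
            for j in range(dim):
                if rng.random() < cr or j == jrand:
                    v = pop[a][j] + f_weight * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(max(v, lo), hi))   # clip to bounds
            # Greedy selection: keep the trial only if it is no worse.
            tc = f(trial)
            if tc <= cost[i]:
                pop[i], cost[i] = trial, tc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# Usage: minimize the 2-D sphere function as a toy stand-in objective.
x, fx = de_minimize(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 2)
```

A multi-objective variant, as used in the paper, would replace the greedy scalar selection with Pareto-dominance comparisons so that conflicting objectives such as heat transfer and operating range can be traded off.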

  17. The Global Enteric Multicenter Study (GEMS) of Diarrheal Disease in Infants and Young Children in Developing Countries: Epidemiologic and Clinical Methods of the Case/Control Study

    PubMed Central

    Kotloff, Karen L.; Blackwelder, William C.; Nasrin, Dilruba; Nataro, James P.; Farag, Tamer H.; van Eijk, Annemieke; Adegbola, Richard A.; Alonso, Pedro L.; Breiman, Robert F.; Golam Faruque, Abu Syed; Saha, Debasish; Sow, Samba O.; Sur, Dipika; Zaidi, Anita K. M.; Biswas, Kousick; Panchalingam, Sandra; Clemens, John D.; Cohen, Dani; Glass, Roger I.; Mintz, Eric D.; Sommerfelt, Halvor; Levine, Myron M.

    2012-01-01

    Background. Diarrhea is a leading cause of illness and death among children aged <5 years in developing countries. This paper describes the clinical and epidemiological methods used to conduct the Global Enteric Multicenter Study (GEMS), a 3-year, prospective, age-stratified, case/control study to estimate the population-based burden, microbiologic etiology, and adverse clinical consequences of acute moderate-to-severe diarrhea (MSD) among a censused population of children aged 0–59 months seeking care at health centers in sub-Saharan Africa and South Asia. Methods. GEMS was conducted at 7 field sites, each serving a population whose demography and healthcare utilization practices for childhood diarrhea were documented. We aimed to enroll 220 MSD cases per year from selected health centers serving each site in each of 3 age strata (0–11, 12–23, and 24–59 months), along with 1–3 matched community controls. Cases and controls supplied clinical, epidemiologic, and anthropometric data at enrollment and again approximately 60 days later, and provided enrollment stool specimens for identification and characterization of potential diarrheal pathogens. Verbal autopsy was performed if a child died. Analytic strategies will calculate the fraction of MSD attributable to each pathogen and the incidence, financial costs, nutritional consequences, and case fatality overall and by pathogen. Conclusions. When completed, GEMS will provide estimates of the incidence, etiology, and outcomes of MSD among infants and young children in sub-Saharan Africa and South Asia. This information can guide development and implementation of public health interventions to diminish morbidity and mortality from diarrheal diseases. PMID:23169936

  18. Epidemiology of epilepsy.

    PubMed

    Abramovici, S; Bagić, A

    2016-01-01

    Modern epidemiology of epilepsy maximizes the benefits of advanced diagnostic methods and sophisticated techniques for case ascertainment in order to increase the diagnostic accuracy and representativeness of the cases and cohorts studied, resulting in better comparability of similarly performed studies. Overall, these advanced epidemiologic methods are expected to yield a better understanding of diverse risk factors, high-risk populations, seizure triggers, multiple and poorly understood causes of epilepsy, including the increasing and complex role of genetics, and establish the natural course of treated and untreated epilepsy and syndromes - all of which form the foundation of an attempt to prevent epileptogenesis as the primary prophylaxis of epilepsy. Although data collection continues to improve, epidemiologists still need to overcome definition and coding variability, insufficient documentation, as well as the interplay of socioeconomic factors and stigma. As most of the 65-70 million people with epilepsy live outside of resource-rich countries, extensive underdiagnosis, misdiagnosis, and undertreatment are likely. Epidemiology will continue to provide the necessary information to the medical community, public, and regulators as the foundation for improved health policies, targeted education, and advanced measures of prevention and prognostication of the most common severe brain disorder. PMID:27637958

  19. Bayesian methods for the design and analysis of noninferiority trials.

    PubMed

    Gamalo-Siebers, Margaret; Gao, Aijun; Lakshminarayanan, Mani; Liu, Guanghan; Natanegara, Fanni; Railkar, Radha; Schmidli, Heinz; Song, Guochen

    2016-01-01

The gold standard for evaluating treatment efficacy of a medical product is a placebo-controlled trial. However, when the use of placebo is considered to be unethical or impractical, a viable alternative for evaluating treatment efficacy is through a noninferiority (NI) study where a test treatment is compared to an active control treatment. The minimal objective of such a study is to determine whether the test treatment is superior to placebo. An assumption is made that if the active control treatment remains efficacious, as was observed when it was compared against placebo, then a test treatment that has comparable efficacy with the active control, within a certain range, must also be superior to placebo. Because of this assumption, the design, implementation, and analysis of NI trials present challenges for sponsors and regulators. In designing and analyzing NI trials, substantial historical data are often required on the active control treatment and placebo. Bayesian approaches provide a natural framework for synthesizing the historical data in the form of prior distributions that can effectively be used in the design and analysis of an NI clinical trial. Despite a flurry of recent research activities in the area of Bayesian approaches in medical product development, there are still substantial gaps in recognition and acceptance of Bayesian approaches in NI trial design and analysis. The Bayesian Scientific Working Group of the Drug Information Association provides a coordinated effort to target the education and implementation issues on Bayesian approaches for NI trials. In this article, we provide a review of both frequentist and Bayesian approaches in NI trials, and elaborate on the implementation of two common Bayesian methods: the hierarchical prior method and the meta-analytic-predictive approach. Simulations are conducted to investigate the properties of the Bayesian methods, and some real clinical trial examples are presented for illustration.
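
The simplest conjugate case gives a compact sketch of the historical-prior idea: a normal prior on the treatment difference (in practice derived from historical active-control data, e.g. via a meta-analytic-predictive prior) is updated with trial data, and the posterior probability of clearing the NI margin drives the decision. All numbers below are illustrative, not from any trial in the article.

```python
import math

def posterior_prob_noninferior(prior_mean, prior_sd, obs_diff, obs_se, margin):
    """Normal-normal conjugate update for a treatment difference
    (test minus control), then P(difference > -margin) under the posterior."""
    prior_prec = 1.0 / prior_sd**2
    data_prec = 1.0 / obs_se**2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * obs_diff)
    # Normal tail probability via the error function (no SciPy needed).
    z = (-margin - post_mean) / math.sqrt(post_var)
    return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

# Illustrative numbers: skeptical prior centered at 0, observed difference
# slightly below 0, NI margin of 0.10 on the same scale.
p = posterior_prob_noninferior(prior_mean=0.0, prior_sd=0.2,
                               obs_diff=-0.02, obs_se=0.05, margin=0.10)
```

Noninferiority would typically be declared when this posterior probability exceeds a prespecified threshold (0.95 or 0.975 are common choices); the hierarchical and meta-analytic-predictive methods discussed in the article generalize how the prior itself is built from multiple historical studies.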

  20. Improved Method of Design for Folding Inflatable Shells

    NASA Technical Reports Server (NTRS)

    Johnson, Christopher J.

    2009-01-01

    An improved method of designing complexly shaped inflatable shells to be assembled from gores was conceived for original application to the inflatable outer shell of a developmental habitable spacecraft module having a cylindrical mid-length section with toroidal end caps. The method is also applicable to inflatable shells of various shapes for terrestrial use. The method addresses problems associated with the assembly, folding, transport, and deployment of inflatable shells that may comprise multiple layers and have complex shapes that can include such doubly curved surfaces as toroids and spheres. One particularly difficult problem is that of mathematically defining fold lines on a gore pattern in a double-curvature region. Moreover, because the fold lines in a double-curvature region tend to be curved, there is a practical problem of how to implement the folds. Another problem is that of modifying the basic gore shapes and sizes for the various layers so that when they are folded as part of the integral structure, they do not mechanically interfere with each other at the fold lines. Heretofore, it has been a common practice to design an inflatable shell to be assembled in the deployed configuration, without regard for the need to fold it into compact form. Typically, the result has been that folding has been a difficult, time-consuming process.

  1. A molecular epidemiology project on diet and cancer: the EPIC-Italy Prospective Study. Design and baseline characteristics of participants.

    PubMed

    Palli, Domenico; Berrino, Franco; Vineis, Paolo; Tumino, Rosario; Panico, Salvatore; Masala, Giovanna; Saieva, Calogero; Salvini, Simonetta; Ceroti, Marco; Pala, Valeria; Sieri, Sabina; Frasca, Graziella; Giurdanella, Maria Concetta; Sacerdote, Carlotta; Fiorini, Laura; Celentano, Egidio; Galasso, Rocco; Decarli, Adriano; Krogh, Vittorio

    2003-01-01

    EPIC-Italy is the Italian section of a larger project known as EPIC (European Prospective Investigation into Cancer and Nutrition), a prospective study on diet and cancer carried out in 10 European countries. In the period 1993-1998, EPIC-Italy completed the recruitment of 47,749 volunteers (15,171 men, 32,578 women, aged 35-65 years) in 4 different areas covered by cancer registries: Varese (12,083 volunteers) and Turin (10,604) in the Northern part of the country; Florence (13,597) and Ragusa (6,403) in Central and Southern Italy, respectively. An associate center in Naples enrolled 5,062 women. After signing an informed consent form, each volunteer provided detailed information about diet and life-style habits, along with anthropometric measurements and a blood sample. A food frequency questionnaire specifically developed for the Italian dietary pattern was tested in a pilot phase. A computerized database with the dietary and life-style information of each participant was completed. Blood samples were processed on the same day of collection, aliquoted (RBC, WBC, serum and plasma) and stored in liquid nitrogen containers. Follow-up procedures were validated and implemented for the identification of newly diagnosed cancer cases. Cancer incidence was related to dietary habits and biochemical markers of food consumption and individual susceptibility in order to test the role of diet-related exposure in the etiology of cancer and its interaction with other environmental or genetic determinants. The comparability of information in a prospective study design is much higher than in other studies. 
The availability of such a large biological bank linked to individual data on dietary and life-style exposures also provides the unique opportunity of evaluating the role of selected genotypes involved in the metabolism of chemical compounds and DNA repair, potentially related to the risk of cancer, in residents of geographic areas of Italy characterized by specific

  2. A geometric design method for side-stream distillation columns

    SciTech Connect

    Rooks, R.E.; Malone, M.F.; Doherty, M.F.

    1996-10-01

    A side-stream distillation column may replace two simple columns for some applications, sometimes at considerable savings in energy and investment. This paper describes a geometric method for the design of side-stream columns; the method provides rapid estimates of equipment size and utility requirements. Unlike previous approaches, the geometric method is applicable to nonideal and azeotropic mixtures. Several example problems for both ideal and nonideal mixtures, including azeotropic mixtures containing distillation boundaries, are given. The authors make use of the fact that azeotropes or pure components whose classification in the residue curve map is a saddle can be removed as side-stream products. Significant process simplifications are found among some alternatives in example problems, leading to flow sheets with fewer units and a substantial savings in vapor rate.

  3. Design of time interval generator based on hybrid counting method

    NASA Astrophysics Data System (ADS)

    Yao, Yuan; Wang, Zhaoqi; Lu, Houbing; Chen, Lian; Jin, Ge

    2016-10-01

    Time Interval Generators (TIGs) are frequently used for the characterization or timing operations of instruments in particle physics experiments. Though some "off-the-shelf" TIGs can be employed, the need for a custom test system or control system makes TIGs implemented in a programmable device desirable. Nowadays, the feasibility of using Field Programmable Gate Arrays (FPGAs) to implement particle physics instrumentation has been validated in the design of Time-to-Digital Converters (TDCs) for precise time measurement. The FPGA-TDC technique is based on the architecture of the Tapped Delay Line (TDL), whose delay cells are down to a few tens of picoseconds. In this case, FPGA-based TIGs with a fine delay step are preferable, allowing the implementation of customized particle physics instrumentation and other utilities on the same FPGA device. A hybrid counting method for designing TIGs with both high resolution and wide range is presented in this paper. The combination of two different counting methods realizing an integratable TIG is described in detail. A specially designed multiplexer for tap selection is given particular attention. The special structure of the multiplexer is devised to minimize the differing additional delays caused by the unpredictable routings from different taps to the output. A Kintex-7 FPGA is used for the hybrid counting-based implementation of a TIG, providing a resolution up to 11 ps and an interval range up to 8 s.
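    The hybrid counting idea, a coarse clock counter spanning the wide range plus a fine delay-line tap covering the sub-clock residue, can be illustrated with a toy software model. Only the 11 ps tap step comes from the abstract; the coarse clock period is an assumed value.

```python
CLK_PERIOD_PS = 2000   # coarse counter clock period (assumed, 500 MHz)
TAP_DELAY_PS = 11      # per-tap fine delay step, as reported for the TIG

def plan_interval(target_ps):
    """Split a requested interval into a coarse clock-tick count plus a
    fine tap count -- the essence of hybrid counting."""
    coarse = target_ps // CLK_PERIOD_PS
    fine = round((target_ps - coarse * CLK_PERIOD_PS) / TAP_DELAY_PS)
    return coarse, fine

def realized_ps(coarse, fine):
    """Interval actually generated by the two counters."""
    return coarse * CLK_PERIOD_PS + fine * TAP_DELAY_PS

c, f = plan_interval(1_000_037)      # -> (500, 3): 500 ticks plus 3 taps
err = abs(realized_ps(c, f) - 1_000_037)
```

    The achievable error is bounded by half a tap step, while the range is set only by the width of the coarse counter, which is why the combination gives both high resolution and wide range.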

  4. Sequence design in lattice models by graph theoretical methods

    NASA Astrophysics Data System (ADS)

    Sanjeev, B. S.; Patra, S. M.; Vishveshwara, S.

    2001-01-01

    A general strategy has been developed based on graph theoretical methods, for finding amino acid sequences that take up a desired conformation as the native state. This problem of inverse design has been addressed by assigning topological indices for the monomer sites (vertices) of the polymer on a 3×3×3 cubic lattice. This is a simple design strategy, which takes into account only the topology of the target protein and identifies the best sequence for a given composition. The procedure allows the design of a good sequence for a target native state by assigning weights for the vertices on a lattice site in a given conformation. It is seen across a variety of conformations that the predicted sequences perform well both in sequence and in conformation space, in identifying the target conformation as native state for a fixed composition of amino acids. Although the method is tested in the framework of the HP model [K. F. Lau and K. A. Dill, Macromolecules 22, 3986 (1989)] it can be used in any context if proper potential functions are available, since the procedure derives unique weights for all the sites (vertices, nodes) of the polymer chain of a chosen conformation (graph).
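    A toy version of this vertex-weighting idea can be sketched on the 3×3×3 lattice. Here the weight of each site is simply its number of non-bonded lattice contacts, a crude stand-in for the paper's topological indices, and the hydrophobic (H) residues of a fixed composition are placed on the highest-weight sites.

```python
from itertools import combinations

def snake_path():
    """Boustrophedon Hamiltonian walk through the 3x3x3 cubic lattice,
    used here as a stand-in target conformation."""
    path = []
    for z in range(3):
        for y in (range(3) if z % 2 == 0 else range(2, -1, -1)):
            for x in (range(3) if (y + z) % 2 == 0 else range(2, -1, -1)):
                path.append((x, y, z))
    return path

def design_sequence(path, n_h):
    """Weight each vertex by its count of non-bonded nearest-neighbor
    contacts, then assign the n_h hydrophobic residues to the
    highest-weight (most buried) sites."""
    weight = [0] * len(path)
    for i, j in combinations(range(len(path)), 2):
        if j > i + 1 and sum(abs(a - b) for a, b in zip(path[i], path[j])) == 1:
            weight[i] += 1
            weight[j] += 1
    ranked = sorted(range(len(path)), key=lambda i: -weight[i])
    seq = ['P'] * len(path)
    for i in ranked[:n_h]:
        seq[i] = 'H'
    return ''.join(seq)

seq = design_sequence(snake_path(), n_h=6)
```

    As in the HP model, the H residues end up on the interior sites with the most contacts (the lattice center has four non-bonded neighbors), which is the intuition behind weighting vertices of the conformation graph.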

  5. A kind of optimizing design method of progressive addition lenses

    NASA Astrophysics Data System (ADS)

    Tang, Yunhai; Qian, Lin; Wu, Quanying; Yu, Jingchi; Chen, Hao; Wang, Yuanyuan

    2010-10-01

    Progressive addition lenses are a kind of ophthalmic lens with a freeform surface. The surface curvature of a progressive addition lens varies gradually from a minimum value in the upper, distance-viewing area to a maximum value in the lower, near-viewing area. An optimizing design method for progressive addition lenses is proposed that improves optical quality by modifying the vector heights of the surface of an initially designed lens. The relationship among mean power, cylinder power and the vector heights of the surface is deduced, and an optimizing factor is obtained. The vector heights of the initially designed surface are used to calculate the plots of mean power and cylinder power based on the principles of differential geometry. The mean power plot is changed by adjusting the optimizing factor. Alternatively, a new mean power plot can be derived by shifting the mean power of one selected region to another and then interpolating and smoothing. A partial differential equation of elliptic type is formulated from the changed mean power and solved by an iterative method, yielding the optimized vector heights of the surface. Compared with the original lens, the region near the nasal side of the distance-vision portion in which the astigmatism is less than 0.5 D becomes broader, and the clear regions of the distance-vision and near-vision portions are wider.
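    The differential-geometry step, computing a mean power plot from surface vector heights, can be sketched numerically. This is a generic finite-difference evaluation of the mean-curvature formula, not the authors' algorithm, and the refractive index is an assumed value; the check uses a spherical surface whose power is known analytically.

```python
import math

N_INDEX = 1.53   # refractive index assumed for the lens material

def mean_power(z, x0, y0, h=1e-4, n=N_INDEX):
    """Mean surface power in diopters (coordinates in meters) from the
    sag function z(x, y), via finite differences and the standard
    mean-curvature formula of differential geometry."""
    zx = (z(x0 + h, y0) - z(x0 - h, y0)) / (2 * h)
    zy = (z(x0, y0 + h) - z(x0, y0 - h)) / (2 * h)
    zxx = (z(x0 + h, y0) - 2 * z(x0, y0) + z(x0 - h, y0)) / h**2
    zyy = (z(x0, y0 + h) - 2 * z(x0, y0) + z(x0, y0 - h)) / h**2
    zxy = (z(x0 + h, y0 + h) - z(x0 + h, y0 - h)
           - z(x0 - h, y0 + h) + z(x0 - h, y0 - h)) / (4 * h**2)
    H = ((1 + zx**2) * zyy - 2 * zx * zy * zxy + (1 + zy**2) * zxx) \
        / (2 * (1 + zx**2 + zy**2)**1.5)
    return (n - 1) * H

# Spherical test surface of radius R chosen so (n - 1)/R = 5.0 D
R = (N_INDEX - 1) / 5.0
sphere = lambda x, y: R - math.sqrt(R * R - x * x - y * y)
print(round(mean_power(sphere, 0.01, 0.0), 2))   # -> 5.0
```

    On a real freeform surface the same evaluation over a grid of points produces the mean power plot that the optimizing factor then reshapes.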

  6. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
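    The core loop of such a probabilistic design study, estimating a failure probability by Monte Carlo and then shrinking the scatter of the most sensitive variable, can be sketched with a toy response function. The linear model, its coefficients, and the limit are all hypothetical, not the smart-wing analysis itself.

```python
import random
random.seed(1)

def tip_angle(sample):
    """Toy linear response: actuated wing-tip angle as a function of
    three random design variables (coefficients hypothetical)."""
    e_fiber, ply_thickness, voltage = sample
    return 2.0 * e_fiber - 1.5 * ply_thickness + 0.8 * voltage

def failure_prob(sds, limit=1.6, n=200_000):
    """Monte Carlo estimate of P(tip angle > limit), with each design
    variable drawn from N(1.0, sd)."""
    fails = 0
    for _ in range(n):
        s = [random.gauss(1.0, sd) for sd in sds]
        if tip_angle(s) > limit:
            fails += 1
    return fails / n

p_base = failure_prob((0.10, 0.10, 0.10))
p_tight = failure_prob((0.05, 0.10, 0.10))  # halve the scatter of e_fiber,
                                            # the most sensitive variable
```

    Because e_fiber carries the largest sensitivity (coefficient 2.0 in absolute value), reducing its scatter lowers the failure probability the most, mirroring the abstract's conclusion about distribution parameters.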

  7. Modified method to improve the design of Petlyuk distillation columns

    PubMed Central

    2014-01-01

    Background A response surface analysis was performed to study the effect of the composition and feeding thermal conditions of ternary mixtures on the number of theoretical stages and the energy consumption of Petlyuk columns. A modification of the pre-design algorithm was necessary for this purpose. Results The modified algorithm provided feasible results in 100% of the studied cases, compared with only 8.89% for the current algorithm. The proposed algorithm allowed us to attain the desired separations, despite the type of mixture and the operating conditions in the feed stream, something that was not possible with the traditional pre-design method. The results showed that the type of mixture had great influence on the number of stages and on energy consumption. A higher number of stages and a lower consumption of energy were attained with mixtures rich in the light component, while higher energy consumption occurred when the mixture was rich in the heavy component. Conclusions The proposed strategy expands the search of an optimal design of Petlyuk columns within a feasible region, which allow us to find a feasible design that meets output specifications and low thermal loads. PMID:25061476

  8. Rapid and simple method of qPCR primer design.

    PubMed

    Thornton, Brenda; Basu, Chhandak

    2015-01-01

    Quantitative real-time polymerase chain reaction (qPCR) is a powerful tool for the analysis and quantification of gene expression. It is advantageous compared to the traditional gel-based method of PCR, as gene expression can be visualized in "real time" on a computer. In qPCR, a reporter dye system is used that binds the DNA region of interest and detects DNA amplification. Some of the popular reporter systems used in qPCR are Molecular Beacon(®), SYBR Green(®), and Taqman(®). However, the success of qPCR depends on optimal primer design. Some of the considerations for primer design are GC content, primer self-dimer formation, and secondary structure formation. Freely available software can be used for ideal qPCR primer design. Here we show how to use some freely available web-based software programs (such as PrimerQuest(®), UNAFold(®), and Beacon Designer(®)) to design qPCR primers.
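    Two of the routine checks mentioned above, GC content and 3' self-complementarity, are simple enough to sketch directly. The example primer and the 4-base 3' window are arbitrary illustrations, not recommendations from the protocol.

```python
def gc_content(primer):
    """Percent G+C in the primer (commonly targeted at roughly 40-60%)."""
    p = primer.upper()
    return 100.0 * (p.count('G') + p.count('C')) / len(p)

def revcomp(seq):
    """Reverse complement of a DNA sequence."""
    comp = {'A': 'T', 'T': 'A', 'G': 'C', 'C': 'G'}
    return ''.join(comp[b] for b in reversed(seq.upper()))

def has_3prime_self_dimer(primer, k=4):
    """Crude screen: does the primer's 3' end (last k bases) base-pair
    with any region of the primer itself?"""
    return revcomp(primer[-k:]) in primer.upper()

primer = "AGCTTGCATGCCTGCAGGTC"          # hypothetical 20-mer
print(round(gc_content(primer), 1))      # -> 60.0
print(has_3prime_self_dimer(primer))     # -> False
```

    Dedicated tools such as those named in the abstract additionally fold the primer to check hairpin secondary structure and estimate melting temperatures, which a string scan like this cannot do.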

  9. Putting Life into Computer-Based Training: The Creation of an Epidemiologic Case Study.

    ERIC Educational Resources Information Center

    Gathany, Nancy C.; Stehr-Green, Jeanette K.

    1994-01-01

    Describes the design of "Pharyngitis in Louisiana," a computer-based epidemiologic case study that was created to teach students how to conduct disease outbreak investigations. Topics discussed include realistic content portrayals; graphics; interactive teaching methods; interaction between the instructional designer and the medical expert; and…

  10. Genetic Epidemiology and Public Health: The Evolution From Theory to Technology.

    PubMed

    Fallin, M Daniele; Duggal, Priya; Beaty, Terri H

    2016-03-01

    Genetic epidemiology represents a hybrid of epidemiologic designs and statistical models that explicitly consider both genetic and environmental risk factors for disease. It is a relatively new field in public health; the term was first coined only 35 years ago. In this short time, the field has been through a major evolution, changing from a field driven by theory, without the technology for genetic measurement or computational capacity to apply much of the designs and methods developed, to a field driven by rapidly expanding technology in genomic measurement and computational analyses while epidemiologic theory struggles to keep up. In this commentary, we describe 4 different eras of genetic epidemiology, spanning this evolution from theory to technology, what we have learned, what we have added to the broader field of public health, and what remains to be done. PMID:26905340

  12. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicles (RLVs) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase focused on assessing the ability of these models to accurately predict the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  13. Collocation methods for distillation design. 1: Model description and testing

    SciTech Connect

    Huss, R.S.; Westerberg, A.W.

    1996-05-01

    Fast and accurate distillation design requires a model that significantly reduces the problem size while accurately approximating a full-order distillation column model. This collocation model builds on the concepts of past collocation models for design of complex real-world separation systems. Two variable transformations make this method unique. Polynomials cannot accurately fit trajectories which flatten out. In columns, flat sections occur in the middle of large column sections or where concentrations go to 0 or 1. With an exponential transformation of the tray number which maps zero to an infinite number of trays onto the range 0--1, four collocation trays can accurately simulate a large column section. With a hyperbolic tangent transformation of the mole fractions, the model can simulate columns which reach high purities. Furthermore, this model uses multiple collocation elements for a column section, which is more accurate than a single high-order collocation section.
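    The two variable transformations can be illustrated directly. The exponential scale constant below is an assumed value, not the one used by the authors; the tanh transform is shown in its inverse/forward pair.

```python
import math

def tray_to_unit(n, scale=10.0):
    """Exponential transform mapping tray count [0, inf) onto [0, 1),
    so a few collocation points can cover an arbitrarily long column
    section ('scale' is an assumed stretching constant)."""
    return 1.0 - math.exp(-n / scale)

def to_unbounded(x):
    """Inverse-tanh transform of a mole fraction: profiles that crowd
    near 0 or 1 at high purity are spread onto an unbounded scale that
    a low-order polynomial can fit."""
    return math.atanh(2.0 * x - 1.0)

def to_fraction(u):
    """Forward map back to a mole fraction in (0, 1)."""
    return 0.5 * (1.0 + math.tanh(u))
```

    With the tray transform, tray 0 maps to 0 and a 60-tray section already sits above 0.99 on the unit interval, so four collocation points placed in [0, 1) span short and long sections alike.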

  14. Airfoil Design and Optimization by the One-Shot Method

    NASA Technical Reports Server (NTRS)

    Kuruvila, G.; Taasan, Shlomo; Salas, M. D.

    1995-01-01

    An efficient numerical approach for the design of optimal aerodynamic shapes is presented in this paper. The objective of any optimization problem is to find the optimum of a cost function subject to a certain state equation (governing equation of the flow field) and certain side constraints. As in classical optimal control methods, the present approach introduces a costate variable (Lagrange multiplier) to evaluate the gradient of the cost function. High efficiency in reaching the optimum solution is achieved by using a multigrid technique and updating the shape in a hierarchical manner such that smooth (low-frequency) changes are done separately from high-frequency changes. Thus, the design variables are changed on a grid where their changes produce nonsmooth (high-frequency) perturbations that can be damped efficiently by the multigrid. The cost of solving the optimization problem is approximately two to three times the cost of the equivalent analysis problem.
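    In generic optimal-control notation (not the paper's own symbols), the costate construction described above can be summarized as:

```latex
\min_{w}\; J(u, w) \quad \text{subject to} \quad R(u, w) = 0,
\qquad L = J + \lambda^{T} R,
```

```latex
\left(\frac{\partial R}{\partial u}\right)^{T} \lambda
  = -\left(\frac{\partial J}{\partial u}\right)^{T},
\qquad
\frac{dJ}{dw} = \frac{\partial J}{\partial w}
  + \lambda^{T} \frac{\partial R}{\partial w},
```

    where $u$ is the flow state, $w$ the design (shape) variables, $R$ the flow equations, and $\lambda$ the costate. One adjoint solve thus prices the gradient with respect to all design variables at roughly the cost of one flow analysis, which is what makes the two-to-three-times-analysis cost quoted above plausible.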

  15. Collocation methods for distillation design. 2: Applications for distillation

    SciTech Connect

    Huss, R.S.; Westerberg, A.W.

    1996-05-01

    The authors present applications for a collocation method for modeling distillation columns that they developed in a companion paper. They discuss implementation of the model, including discussion of the ASCEND (Advanced System for Computations in ENgineering Design) system, which enables one to create complex models with simple building blocks and interactively learn to solve them. They first investigate applying the model to compute minimum reflux for a given separation task, exactly solving nonsharp and approximately solving sharp split minimum reflux problems. They next illustrate the use of the collocation model to optimize the design of a single column capable of carrying out a prescribed set of separation tasks. The optimization picks the best column diameter and total number of trays. It also picks the feed tray for each of the prescribed separations.

  16. Property Exchange Method for Designing Computer-Based Learning Game

    NASA Astrophysics Data System (ADS)

    Umetsu, Takanobu; Hirashima, Tsukasa

    Motivation is one of the most important factors in learning. Many researchers of learning environments therefore pay special attention to learning games as a remarkable approach to realizing highly motivated learning. However, making a learning game is not an easy task. Although there have been several investigations into design methods for learning games, most of them only propose guidelines for the design or characteristics that learning games should have. Developers of learning games are therefore required to have sufficient knowledge and experience of learning and games in order to understand the guidelines or to deal with the characteristics. Consequently, it is very difficult for teachers to obtain learning games fitted to their learning issues.

  17. A Method for Designing CDO Conformed to Investment Parameters

    NASA Astrophysics Data System (ADS)

    Nakae, Tatsuya; Moritsu, Toshiyuki; Komoda, Norihisa

    We propose a method for designing CDOs (Collateralized Debt Obligations) that meet investor needs regarding the attributes of the CDO. It is demonstrated that adjusting the attributes of a CDO (credit capability and issue amount) to investors' preferences creates a capital loss risk that the agent takes on. We formulate a CDO optimization problem by defining an objective function using the above risk and by setting constraints that arise from investor needs and the risk premium paid to the agent. Our prototype experiment, in which fictitious underlying obligations and investor needs are given, verifies that CDOs can be designed without opportunity loss and dead stock loss, and that the capital loss is no more than a thousandth of the amount of the annual payment under guarantee for small and medium-sized enterprises by a general credit guarantee institution.

  18. Design of braided composite tubes by numerical analysis method

    SciTech Connect

    Hamada, Hiroyuki; Fujita, Akihiro; Maekawa, Zenichiro; Nakai, Asami; Yokoyama, Atsushi

    1995-11-01

    Conventional composite laminates have very poor through-thickness strength and as a result are limited in their application to structural parts with complex shapes. In this paper, a design method for braided composite tubes is proposed. The concept of an analysis model spanning from the micro model to the macro model is presented. This method was applied to predict the bending rigidity and initial fracture stress of the braided tube under bending load. The proposed analytical procedure can be included as a unit in a CAE system for braided composites.

  19. Methods to Design and Synthesize Antibody-Drug Conjugates (ADCs)

    PubMed Central

    Yao, Houzong; Jiang, Feng; Lu, Aiping; Zhang, Ge

    2016-01-01

    Antibody-drug conjugates (ADCs) have become a promising targeted therapy strategy that combines the specificity, favorable pharmacokinetics and biodistributions of antibodies with the destructive potential of highly potent drugs. One of the biggest challenges in the development of ADCs is the application of suitable linkers for conjugating drugs to antibodies. Recently, the design and synthesis of linkers are making great progress. In this review, we present the methods that are currently used to synthesize antibody-drug conjugates by using thiols, amines, alcohols, aldehydes and azides. PMID:26848651

  20. A Method of Trajectory Design for Manned Asteroids Exploration

    NASA Astrophysics Data System (ADS)

    Gan, Q. B.; Zhang, Y.; Zhu, Z. F.; Han, W. H.; Dong, X.

    2014-11-01

    A trajectory optimization method for nuclear-propulsion manned asteroid exploration is presented. For launches between 2035 and 2065, based on the Lambert transfer orbit, the phases of departure from and return to the Earth are searched first. Then the optimal flight trajectory in the feasible regions is selected by pruning the flight sequences. Setting the nuclear propulsion flight plan as propel-coast-propel, and taking the minimal departure mass as the objective, the nuclear propulsion flight trajectory of each of the three phases is separately optimized using a hybrid method. With the optimized local parameters of the three phases as initial values, the global parameters are jointly optimized. Finally, the minimal-departure-mass trajectory design result is given.

  1. Sewage-based epidemiology in monitoring the use of new psychoactive substances: Validation and application of an analytical method using LC-MS/MS.

    PubMed

    Kinyua, Juliet; Covaci, Adrian; Maho, Walid; McCall, Ann-Kathrin; Neels, Hugo; van Nuijs, Alexander L N

    2015-09-01

    Sewage-based epidemiology (SBE) employs the analysis of sewage to detect and quantify drug use within a community. While SBE has been applied repeatedly for the estimation of classical illicit drugs, only a few studies have investigated new psychoactive substances (NPS). These compounds mimic the effects of illicit drugs through slight modifications to the chemical structures of controlled illicit drugs. We describe the optimization, validation, and application of an analytical method using liquid chromatography coupled to positive electrospray tandem mass spectrometry (LC-ESI-MS/MS) for the determination of seven NPS in sewage: methoxetamine (MXE), butylone, ethylone, methylone, methiopropamine (MPA), 4-methoxymethamphetamine (PMMA), and 4-methoxyamphetamine (PMA). Sample preparation was performed using solid-phase extraction (SPE) with Oasis MCX cartridges. The LC separation was done with a HILIC (150 x 3 mm, 5 µm) column, which ensured good resolution of the analytes with a total run time of 19 min. The lower limit of quantification (LLOQ) was between 0.5 and 5 ng/L for all compounds. The method was validated by evaluating the following parameters: sensitivity, selectivity, linearity, accuracy, precision, recoveries and matrix effects. The method was applied to sewage samples collected from sewage treatment plants in Belgium and Switzerland, in which all investigated compounds were detected except MPA and PMA. Furthermore, a consistent presence of MXE was observed in most of the sewage samples at levels higher than the LLOQ.

  2. Novel computational methods to design protein-protein interactions

    NASA Astrophysics Data System (ADS)

    Zhou, Alice Qinhua; O'Hern, Corey; Regan, Lynne

    2014-03-01

    Despite the abundance of structural data, we still cannot accurately predict the structural and energetic changes resulting from mutations at protein interfaces. The inadequacy of current computational approaches to the analysis and design of protein-protein interactions has hampered the development of novel therapeutic and diagnostic agents. In this work, we apply a simple physical model that includes only a minimal set of geometrical constraints, excluded volume, and attractive van der Waals interactions to 1) rank the binding affinity of mutants of tetratricopeptide repeat proteins with their cognate peptides, 2) rank the energetics of binding of small designed proteins to the hydrophobic stem region of the influenza hemagglutinin protein, and 3) predict the stability of T4 lysozyme and staphylococcal nuclease mutants. This work will not only lead to a fundamental understanding of protein-protein interactions, but also to the development of efficient computational methods to rationally design protein interfaces with tunable specificity and affinity, and numerous applications in biomedicine. NSF DMR-1006537, PHY-1019147, Raymond and Beverly Sackler Institute for Biological, Physical and Engineering Sciences, and Howard Hughes Medical Institute.

  3. Cox regression methods for two-stage randomization designs.

    PubMed

    Lokhnygina, Yuliya; Helterbrand, Jeffrey D

    2007-06-01

    Two-stage randomization designs (TSRD) are becoming increasingly common in oncology and AIDS clinical trials as they make more efficient use of study participants to examine therapeutic regimens. In these designs patients are initially randomized to an induction treatment, followed by randomization to a maintenance treatment conditional on their induction response and consent to further study treatment. Broader acceptance of TSRDs in drug development may hinge on the ability to make appropriate intent-to-treat type inference within this design framework as to whether an experimental induction regimen is better than a standard induction regimen when maintenance treatment is fixed. Recently Lunceford, Davidian, and Tsiatis (2002, Biometrics 58, 48-57) introduced an inverse probability weighting based analytical framework for estimating survival distributions and mean restricted survival times, as well as for comparing treatment policies at landmarks in the TSRD setting. In practice Cox regression is widely used and in this article we extend the analytical framework of Lunceford et al. (2002) to derive a consistent estimator for the log hazard in the Cox model and a robust score test to compare treatment policies. Large sample properties of these methods are derived, illustrated via a simulation study, and applied to a TSRD clinical trial. PMID:17425633

  4. An introduction to quantum chemical methods applied to drug design.

    PubMed

    Stenta, Marco; Dal Peraro, Matteo

    2011-06-01

    The advent of molecular medicine allowed identifying the malfunctioning of subcellular processes as the source of many diseases. Since then, drugs are not only discovered, but actually designed to fulfill a precise task. Modern computational techniques, based on molecular modeling, play a relevant role both in target identification and drug lead development. By flanking and integrating standard experimental techniques, modeling has proven itself as a powerful tool across the drug design process. The success of computational methods depends on a balance between cost (computation time) and accuracy. Thus, the integration of innovative theories and more powerful hardware architectures allows molecular modeling to be used as a reliable tool for rationalizing the results of experiments and accelerating the development of new drug design strategies. We present an overview of the most common quantum chemistry computational approaches, providing for each one a general theoretical introduction to highlight limitations and strong points. We then discuss recent developments in software and hardware resources, which have allowed state-of-the-art of computational quantum chemistry to be applied to drug development.

  5. Sensitivity method for integrated structure/active control law design

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1987-01-01

    The development is described of an integrated structure/active control law design methodology for aeroelastic aircraft applications. A short motivating introduction to aeroservoelasticity is given, along with the need for integrated structures/controls design algorithms. Three alternative approaches to the development of an integrated design method are briefly discussed with regard to complexity, coordination and tradeoff strategies, and the nature of the resulting solutions. This leads to the formulation of the proposed approach, which is based on the concepts of sensitivity of optimum solutions and multi-level decompositions. The concept of sensitivity of the optimum is explained in more detail and compared with traditional sensitivity concepts of classical control theory. The analytical sensitivity expressions for the solution of the linear-quadratic-Gaussian (LQG) control problem are summarized in terms of the linear regulator solution and the Kalman filter solution. Numerical results for a state-space aeroelastic model of the DAST ARW-II vehicle are given, showing the changes in aircraft responses to variations of a structural parameter, in this case the first wing bending natural frequency.
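The "sensitivity of the optimum" concept can be illustrated in a scalar setting (all plant numbers below are invented; the paper itself works with matrix Riccati equations): differentiate the optimal LQR cost-to-go with respect to a plant parameter.

```python
import math

# Hedged scalar illustration of "sensitivity of the optimum": how the
# optimal LQR cost-to-go changes as a plant parameter varies.

def riccati_p(a, b=1.0, q=1.0, r=1.0):
    """Positive root of the scalar CARE: 2aP - (b^2/r)P^2 + q = 0."""
    return (a + math.sqrt(a * a + b * b * q / r)) * r / (b * b)

a0 = -0.5                      # nominal plant pole (e.g. a structural mode)
h = 1e-6                       # finite-difference step on the parameter
dP_da = (riccati_p(a0 + h) - riccati_p(a0 - h)) / (2 * h)
print(round(riccati_p(a0), 4), round(dP_da, 4))  # → 0.618 0.5528
```

A positive dP/da says the optimal cost rises as the pole moves toward instability, which is the kind of optimum-level sensitivity (as opposed to trajectory-level sensitivity) the abstract contrasts with classical control theory.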

  6. Simplified design method for shear-valve magnetorheological dampers

    NASA Astrophysics Data System (ADS)

    Ding, Yang; Zhang, Lu; Zhu, Haitao; Li, Zhongxian

    2014-12-01

    Based on the Bingham parallel-plate model, a simplified design method of shear-valve magnetorheological (MR) dampers is proposed considering the magnetic circuit optimization. Correspondingly, a new MR damper with a full-length effective damping path is proposed. The prototype dampers are also fabricated and studied numerically and experimentally. According to the test results, the Bingham parallel-plate model is further modified to obtain a damping force prediction model of the proposed MR dampers. This prediction model considers the magnetic saturation phenomenon. The study indicates that the proposed simplified design method is simple, effective and reliable. The maximum damping force of the proposed MR dampers with a full-length effective damping path is at least twice as large as those of conventional MR dampers. The dynamic range of damping force increases by at least 70%. The proposed damping force prediction model considers the magnetic saturation phenomenon and it can realize the actual characteristic of MR fluids. The model is able to predict the actual damping force of MR dampers precisely.
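The structure of a Bingham-type force model with a saturating yield term can be sketched as follows; all coefficients are invented for illustration and are not the paper's fitted values.

```python
import math

# Hedged sketch of a Bingham-type damping-force model with a saturating
# yield term; all coefficients below are invented for illustration.

C0 = 1.2e3     # viscous coefficient (N*s/m)
F_MAX = 2.0e3  # yield force at full magnetic saturation (N)
I_SAT = 1.5    # saturation current scale (A)

def damper_force(velocity, current):
    """Viscous term plus a field-dependent Coulomb (yield) term; tanh()
    caps the yield force, mimicking magnetic saturation of the circuit."""
    f_yield = F_MAX * math.tanh(current / I_SAT)
    return C0 * velocity + f_yield * math.copysign(1.0, velocity)

print(damper_force(0.1, 0.0))   # viscous-only force at zero current
print(damper_force(0.1, 10.0))  # yield term near its saturation cap
```

The tanh() term is one common way to keep the predicted yield force bounded at high coil current, which is the saturation behavior the modified prediction model accounts for.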

  7. Development of Analysis Methods for Designing with Composites

    NASA Technical Reports Server (NTRS)

    Madenci, E.

    1999-01-01

    The project involved the development of new analysis methods to achieve efficient design of composite structures. We developed a complex variational formulation to analyze the in-plane and bending coupling response of an unsymmetrically laminated plate with an elliptical cutout subjected to arbitrary edge loading as shown in Figure 1. This formulation utilizes four independent complex potentials that satisfy the coupled in-plane and bending equilibrium equations, thus eliminating the area integrals from the strain energy expression. The solution to a finite geometry laminate under arbitrary loading is obtained by minimizing the total potential energy function and solving for the unknown coefficients of the complex potentials. The validity of this approach is demonstrated by comparison with finite element analysis predictions for a laminate with an inclined elliptical cutout under bi-axial loading. The geometry and loading of this laminate with a lay-up of [-45/45] are shown in Figure 2. The deformed configuration shown in Figure 3 reflects the presence of bending-stretching coupling. The validity of the present method is established by comparing the out-of-plane deflections along the boundary of the elliptical cutout from the present approach with those of the finite element method. The comparison shown in Figure 4 indicates remarkable agreement. The details of this method are described in a manuscript by Madenci et al. (1998).

  8. A New Aerodynamic Data Dispersion Method for Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T.

    2011-01-01

    A novel method for implementing aerodynamic data dispersion analysis is herein introduced. A general mathematical approach combined with physical modeling tailored to the aerodynamic quantity of interest enables the generation of more realistically relevant dispersed data and, in turn, more reasonable flight simulation results. The method simultaneously allows for the aerodynamic quantities and their derivatives to be dispersed given a set of non-arbitrary constraints, which stresses the controls model in more ways than with the traditional bias up or down of the nominal data within the uncertainty bounds. The adoption and implementation of this new method within the NASA Ares I Crew Launch Vehicle Project has resulted in significant increases in predicted roll control authority, and lowered the induced risks for flight test operations. One direct impact on launch vehicles is a reduced size for auxiliary control systems, and the possibility of an increased payload. This technique has the potential of being applied to problems in multiple areas where nominal data together with uncertainties are used to produce simulations using Monte Carlo type random sampling methods. It is recommended that a tailored physics-based dispersion model be delivered with any aerodynamic product that includes nominal data and uncertainties, in order to make flight simulations more realistic and allow for leaner spacecraft designs.
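The core idea can be sketched numerically (the coefficient data and uncertainty bound below are invented): instead of biasing a nominal aerodynamic curve up or down by a constant, draw a smooth random perturbation so the coefficient and its slope are dispersed together while staying inside the uncertainty bounds.

```python
import random

# Hedged sketch of physics-based dispersion: a smooth random perturbation
# disperses both the coefficient and its derivative, unlike a constant
# bias shift. Data and the uncertainty bound are invented.

random.seed(1)
mach = [0.5 + 0.05 * i for i in range(11)]
nominal = [0.02 * m for m in mach]        # invented nominal coefficient
BOUND = 0.01                              # +/- uncertainty bound

def dispersed_curve():
    # A low-order random polynomial keeps the perturbation smooth, so the
    # slope of the curve is stressed as well as its level.
    a, b = random.uniform(-1, 1), random.uniform(-1, 1)
    x = lambda m: (m - 0.75) / 0.25       # map the Mach range to [-1, 1]
    return [n + 0.5 * BOUND * (a + b * x(m)) for n, m in zip(nominal, mach)]

curve = dispersed_curve()
assert all(abs(c - n) <= BOUND for c, n in zip(curve, nominal))
```

In a Monte Carlo simulation each sample would call dispersed_curve() once, producing a family of curves that exercises the control system in more ways than a uniform up/down bias.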

  9. Nanobiological studies on drug design using molecular mechanic method

    PubMed Central

    Ghaheh, Hooria Seyedhosseini; Mousavi, Maryam; Araghi, Mahmood; Rasoolzadeh, Reza; Hosseini, Zahra

    2015-01-01

    Background: Influenza H1N1 is of worldwide importance, and point mutations in the viral genome are a concern for the World Health Organization (WHO) and for drug developers, since they can make the virus resistant to existing antiviral drugs. Influenza epidemics cause severe respiratory illness in 30 to 50 million people and kill 250,000 to 500,000 people worldwide every year. Drug design is no longer done by trial and error because of the cost and time involved; bioinformatics studies are therefore essential for designing drugs. Materials and Methods: This paper presents a study of the binding site of the neuraminidase (NA) enzyme, which is central to drug design, at a temperature of 310 K and in different dielectrics. Information on the NA enzyme was extracted from the Protein Data Bank (PDB) and National Center for Biotechnology Information (NCBI) websites. The new N1 sequences were downloaded from the NCBI influenza virus sequence database. Drug binding sites were modeled by homology using ArgusLab 4.0, HyperChem 6.0 and Chem D3 software, and their stability was assessed in different dielectrics and temperatures. Result: Measurements of the potential energy (kcal/mol) of the NA binding sites in different dielectrics at 310 K revealed that at a time step size of 0 ps the drug binding sites have the maximum energy level, and at a time step size of 100 ps they have maximum stability and minimum energy. Conclusions: Drug binding sites depend more on the dielectric constant than on temperature, and the optimum dielectric constant is 39.78. PMID:26605248

  10. IODC98 optical design problem: method of progressing from an achromatic to an apochromatic design

    SciTech Connect

    Seppala, L.G.

    1998-07-20

    A general method of designing an apochromatic lens by using a triplet of special glasses, in which the buried-surfaces concept is used, can be outlined. First, one chooses a starting point which is already achromatic. Second, a thick plate or shell is added to the design, where the plate or shell has an index of refraction of 1.62, which is similar to the average index of refraction of the special glass triplet (for example: PSK53A, KZFS1 and TIF6). Third, the lens is then reoptimized to an achromatic design. Fourth, the single element is replaced by the special glass triplet. Fifth, only the internal surfaces of the triplet are varied to correct all three wavelengths. Although this step will produce little improvement, it does serve to stabilize further optimization. Sixth and finally, all potential variables are used to fully optimize the apochromatic lens. Microscope objectives, for example, could be designed using this technique. The important concept to apply is the use of multiple buried surfaces, in which each interface involves a special glass, after an achromatic design has been achieved. This extension relieves the restriction that all special glasses have a common index of refraction and allows a wider variety of special glasses to be used. However, it is still desirable to use glasses which form a large triangle on the P versus V diagram.

  11. An analytical filter design method for guided wave phased arrays

    NASA Astrophysics Data System (ADS)

    Kwon, Hyu-Sang; Kim, Jin-Yeon

    2016-12-01

    This paper presents an analytical method for designing a spatial filter that processes the data from an array of two-dimensional guided wave transducers. An inverse problem is defined where the spatial filter coefficients are determined in such a way that a prescribed beam shape, i.e., a desired array output is best approximated in the least-squares sense. Taking advantage of the 2π-periodicity of the generated wave field, Fourier-series representation is used to derive closed-form expressions for the constituting matrix elements. Special cases in which the desired array output is an ideal delta function and a gate function are considered in a more explicit way. Numerical simulations are performed to examine the performance of the filters designed by the proposed method. It is shown that the proposed filters can significantly improve the beam quality in general. Most notable is that the proposed method does not compromise between the main lobe width and the sidelobe levels; i.e. a narrow main lobe and low sidelobes are simultaneously achieved. It is also shown that the proposed filter can compensate the effects of nonuniform directivity and sensitivity of array elements by explicitly taking these into account in the formulation. From an example of detecting two separate targets, how much the angular resolution can be improved as compared to the conventional delay-and-sum filter is quantitatively illustrated. Lamb wave based imaging of localized defects in an elastic plate using a circular array is also presented as an example of practical applications.
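A toy numerical version of the least-squares design idea can be written down directly (the element count, ka value, and gate width below are invented): build the steering matrix over the scan directions and solve for weights that best match a delta-like desired pattern.

```python
import numpy as np

# Toy least-squares beam-shape filter design for a circular array.
# Weights w are chosen so the array pattern best approximates a
# delta-like desired output in the least-squares sense.

N = 16                                   # elements on a circle
ka = 4.0                                 # wavenumber times array radius
phi = 2 * np.pi * np.arange(N) / N       # element angular positions
theta = np.linspace(0, 2 * np.pi, 181)   # scan directions

# Steering matrix: plane-wave response of element n from direction theta_m
A = np.exp(1j * ka * np.cos(theta[:, None] - phi[None, :]))

target = np.pi / 2
d = (np.abs(theta - target) < np.deg2rad(2)).astype(float)  # desired beam

w, *_ = np.linalg.lstsq(A, d, rcond=None)  # least-squares filter weights
pattern = np.abs(A @ w)
peak_dir = float(theta[np.argmax(pattern)])
print(peak_dir)                            # main lobe sits near pi/2
```

The paper's method goes further by exploiting the 2π-periodicity to get closed-form matrix elements and by folding element directivity into A; this sketch only shows the inverse-problem formulation.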

  12. Interpreting epidemiological research: blinded comparison of methods used to estimate the prevalence of inherited mutations in BRCA1

    PubMed Central

    Eng, C.; Brody, L.; Wagner, T.; Devilee, P.; Vijg, J.; Szabo, C.; Tavtigian, S.; Nathanson, K.; Ostrander, E.; Frank, T.

    2001-01-01

    While sequence analysis is considered by many to be the most sensitive method of detecting unknown mutations in large genes such as BRCA1, most published estimates of the prevalence of mutations in this gene have been derived from studies that have used other methods of gene analysis. In order to determine the relative sensitivity of techniques that are widely used in research on BRCA1, a set of blinded samples containing 58 distinct mutations was analysed by four separate laboratories. Each used one of the following methods: single strand conformational polymorphism analysis (SSCP), conformation sensitive gel electrophoresis (CSGE), two dimensional gene scanning (TDGS), and denaturing high performance liquid chromatography (DHPLC). Only the laboratory using DHPLC correctly identified each of the mutations. The laboratory using TDGS correctly identified 91% of the mutations but produced three apparent false positive results. The laboratories using SSCP and CSGE detected abnormal migration for 72% and 76% of the mutations, respectively, but subsequently confirmed and reported only 65% and 60% of mutations, respectively. False negatives therefore resulted not only from failure of the techniques to distinguish wild type from mutant, but also from failure to confirm the mutation by sequence analysis as well as from human errors leading to misreporting of results. These findings characterise sources of error in commonly used methods of mutation detection that should be addressed by laboratories using these methods. Based upon sources of error identified in this comparison, it is likely that mutations in BRCA1 and BRCA2 are more prevalent than some studies have previously reported. The findings of this comparison provide a basis for interpreting studies of mutations in susceptibility genes across many inherited cancer syndromes.


Keywords: BRCA1; mutation detection; cancer genetics PMID:11748305

  13. Learning physics: A comparative analysis between instructional design methods

    NASA Astrophysics Data System (ADS)

    Mathew, Easow

    The purpose of this research was to determine if there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study utilized a quantitative quasi-experimental design methodology to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative problem solving (PBL) instructional design (i.e., experimental group) approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22) who participated in a PBL-based teaching methodology along with traditional lecture methods. The other 20 students were assigned to the control group (n = 20) who participated in the traditional lecture teaching methodology. Both the courses were taught by experienced professors who have qualifications at the doctoral level. The results indicated statistically significant differences (p < .01) in academic performance between students who participated in traditional (i.e., lower physics posttest scores and lower differences between pre- and posttest scores) versus collaborative (i.e., higher physics posttest scores, and higher differences between pre- and posttest scores) instructional design approaches to physics curricula. Despite some slight differences in control group and experimental group demographic characteristics (gender, ethnicity, and age), there were statistically significant (p = .04) differences by gender: female average academic improvement was much higher than male average academic improvement (~63%) in

  14. Formal methods in the design of Ada 1995

    NASA Technical Reports Server (NTRS)

    Guaspari, David

    1995-01-01

    Formal, mathematical methods are most useful when applied early in the design and implementation of a software system--that, at least, is the familiar refrain. I will report on a modest effort to apply formal methods at the earliest possible stage, namely, in the design of the Ada 95 programming language itself. This talk is an 'experience report' that provides brief case studies illustrating the kinds of problems we worked on, how we approached them, and the extent (if any) to which the results proved useful. It also derives some lessons and suggestions for those undertaking future projects of this kind. Ada 95 is the first revision of the standard for the Ada programming language. The revision began in 1988, when the Ada Joint Programming Office first asked the Ada Board to recommend a plan for revising the Ada standard. The first step in the revision was to solicit criticisms of Ada 83. A set of requirements for the new language standard, based on those criticisms, was published in 1990. A small design team, the Mapping Revision Team (MRT), became exclusively responsible for revising the language standard to satisfy those requirements. The MRT, from Intermetrics, is led by S. Tucker Taft. The work of the MRT was regularly subject to independent review and criticism by a committee of distinguished Reviewers and by several advisory teams--for example, the two User/Implementor teams, each consisting of an industrial user (attempting to make significant use of the new language on a realistic application) and a compiler vendor (undertaking, experimentally, to modify its current implementation in order to provide the necessary new features). One novel decision established the Language Precision Team (LPT), which investigated language proposals from a mathematical point of view. The LPT applied formal mathematical analysis to help improve the design of Ada 95 (e.g., by clarifying the language proposals) and to help promote its acceptance (e.g., by identifying a

  15. PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMEASURE ON ΔB METHOD

    NASA Astrophysics Data System (ADS)

    Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao

    Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes stability easier to understand. The ΔB method (a simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and is equivalent to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability computational software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures using the distribution of the restraining force found along survey lines, which is based on the distribution of survey-line safety factors derived from the above analysis. This paper also presents the transverse distributive method of restraining force used for planning ground stabilization on the basis of an example analysis.
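The partial-restraint idea can be shown with a small numerical sketch (forces and the target safety factor below are invented, and the real method distributes restraint across the slide body rather than line by line): restraining force is introduced only on survey lines whose safety factor falls below the planning value.

```python
# Hedged numerical sketch of the partial restraining force idea: from
# per-survey-line safety factors, extra restraint is introduced only on
# lines that fall short of the planning safety factor. Values invented.

F_PLAN = 1.20  # planning (target) safety factor

# (survey line, driving force D [kN/m], resisting force R [kN/m])
lines = [("A", 900.0, 1260.0), ("B", 1100.0, 1155.0), ("C", 800.0, 1000.0)]

needed = {}
for name, drive, resist in lines:
    fs = resist / drive                       # survey-line safety factor
    needed[name] = max(0.0, (F_PLAN - fs) * drive)
    print(name, round(fs, 2), round(needed[name], 1))
```

Here only line B (safety factor 1.05) requires added restraining force, which is the kind of per-line distribution the paper uses to plan countermeasure works.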

  16. Designing arrays for modern high-resolution methods

    SciTech Connect

    Dowla, F.U.

    1987-10-01

    A bearing estimation study of seismic wavefields propagating from a strongly heterogeneous medium shows that with the high-resolution MUSIC algorithm the bias of the direction estimate can be reduced by adopting a smaller-aperture sub-array. Further, on this sub-array, the bias of the MUSIC algorithm is less than those of the MLM and Bartlett methods. On the full array, the performances of the three methods are comparable. The improvement in bearing estimation in MUSIC with a reduced aperture might be attributed to increased signal coherency in the array. For methods with less resolution, the improved signal coherency in the smaller array is possibly offset by severe loss of resolution and the presence of weak secondary sources. Building upon the characteristics of real seismic wavefields, a design language has been developed to generate, modify, and test other arrays. Eigenstructures of wavefields and arrays have been studied empirically by simulation of a variety of realistic signals. 6 refs., 5 figs.
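The eigenstructure machinery the abstract relies on can be demonstrated with a minimal MUSIC bearing estimate on synthetic data (a uniform linear array with one source at 20 degrees; all parameters invented, unlike the heterogeneous seismic wavefields studied in the record).

```python
import numpy as np

# Minimal MUSIC bearing-estimation sketch for a uniform linear array with
# synthetic data: split the sample covariance into signal and noise
# subspaces, then scan for directions orthogonal to the noise subspace.

rng = np.random.default_rng(0)
M, snaps = 8, 200                        # sensors, snapshots
true_deg = 20.0

def steer(deg):
    k = np.pi * np.sin(np.deg2rad(deg))  # half-wavelength element spacing
    return np.exp(1j * k * np.arange(M))

sig = rng.standard_normal(snaps) + 1j * rng.standard_normal(snaps)
noise = 0.1 * (rng.standard_normal((M, snaps))
               + 1j * rng.standard_normal((M, snaps)))
X = np.outer(steer(true_deg), sig) + noise

R = X @ X.conj().T / snaps               # sample covariance
_, vecs = np.linalg.eigh(R)              # eigenvalues in ascending order
En = vecs[:, :-1]                        # noise subspace (one source assumed)

scan = np.arange(-90.0, 90.5, 0.5)
spec = [1.0 / np.linalg.norm(En.conj().T @ steer(d)) ** 2 for d in scan]
est_deg = float(scan[np.argmax(spec)])
print(est_deg)
```

Shrinking the aperture (reducing M) trades resolution for signal coherency, which is the trade-off the study examines on real seismic data.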

  17. Basic research on design analysis methods for rotorcraft vibrations

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1991-01-01

    The objective of the present work was to develop a method for identifying physically plausible finite element system models of airframe structures from test data. The assumed models were based on linear elastic behavior with general (nonproportional) damping. Physical plausibility of the identified system matrices was insured by restricting the identification process to designated physical parameters only and not simply to the elements of the system matrices themselves. For example, in a large finite element model the identified parameters might be restricted to the moduli for each of the different materials used in the structure. In the case of damping, a restricted set of damping values might be assigned to finite elements based on the material type and on the fabrication processes used. In this case, different damping values might be associated with riveted, bolted and bonded elements. The method itself is developed first, and several approaches are outlined for computing the identified parameter values. The method is applied first to a simple structure for which the 'measured' response is actually synthesized from an assumed model. Both stiffness and damping parameter values are accurately identified. The true test, however, is the application to a full-scale airframe structure. In this case, a NASTRAN model and actual measured modal parameters formed the basis for the identification of a restricted set of physically plausible stiffness and damping parameters.
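The key restriction, identifying designated physical parameters rather than raw matrix entries, can be shown in a one-parameter toy (frequencies below are invented): fit a single stiffness scale so that model frequencies match measured modal frequencies.

```python
# Hedged illustration of restricting identification to physical parameters:
# rather than updating every matrix entry, fit a single stiffness scale
# "alpha" (e.g. a modulus correction) so that model frequencies match the
# measured modal frequencies. All frequencies below are invented.

model_hz = [5.2, 13.8, 27.1]   # finite element model predictions
meas_hz = [5.0, 13.1, 26.0]    # "measured" modal frequencies

# Model: omega_meas^2 ~ alpha * omega_model^2. The (2*pi)^4 factors cancel
# in the least-squares ratio, so plain frequencies can be used directly.
num = sum((m * t) ** 2 for m, t in zip(model_hz, meas_hz))
den = sum(m ** 4 for m in model_hz)
alpha = num / den
print(round(alpha, 4))  # slightly below 1: the model is a bit too stiff
```

Because alpha multiplies a modulus, the updated model stays physically plausible, which is exactly what unrestricted matrix updating cannot guarantee.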

  18. Genetics and epidemiology, congenital anomalies and cancer

    SciTech Connect

    Friedman, J.M.

    1997-03-01

    Many of the basic statistical methods used in epidemiology - regression, analysis of variance, and estimation of relative risk, for example - originally were developed for the genetic analysis of biometric data. The familiarity that many geneticists have with this methodology has helped geneticists to understand and accept genetic epidemiology as a scientific discipline. It is worth noting, however, that most of the work in genetic epidemiology during the past decade has been devoted to linkage and other family studies, rather than to population-based investigations of the type that characterize much of mainstream epidemiology. 30 refs., 2 tabs.

  19. Schizophrenia: from Epidemiology to Rehabilitation

    PubMed Central

    Mura, Gioia; Petretto, Donatella Rita; Bhat, Krishna M; Carta, Mauro Giovanni

    2012-01-01

    Purpose/Objective: We discuss recent evidence about schizophrenia (frequency, onset, course, risk factors and genetics) and its bearing on some epidemiological myths about schizophrenia widespread among psychiatric and psychopathology clinicians. The aim is to evaluate whether the new findings should change rehabilitation approaches to schizophrenia by shifting the balance between the neurodevelopmental hypothesis, which accepts that the cognitive deficits are produced by errors during the normal development of the brain and remain stable over the course of illness, and the neurodegenerative hypothesis, according to which they derive from a degenerative process that goes on inexorably. Research Method/Design: A review of the literature on the epidemiology of schizophrenia was performed, and the contributions of this evidence to the neurodevelopmental hypothesis and to rehabilitation are described. Results: It cannot be definitively concluded for or against either the neurodevelopmental or the degenerative hypothesis, but efforts to understand the basis of schizophrenia must go on. Until now, rehabilitation programs have been based on the vulnerability-stress model, which supposes an early deficit that remains stable throughout life under favorable circumstances. Rehabilitation approaches (such as neurocognitive approaches, social skills training, and cognitive-emotional training) are therefore focused on individual and micro-group coping skills, aiming to help people with schizophrenia cope with environmental stress factors. Conclusions/Implications: Coping with cognitive deficits in schizophrenia may represent the starting point for further research on schizophrenia; cohort studies and randomized trials are necessary to define the range of effectiveness and the outcomes of the treatments. PMID:22962559

  20. AmiRNA Designer - new method of artificial miRNA design.

    PubMed

    Mickiewicz, Agnieszka; Rybarczyk, Agnieszka; Sarzynska, Joanna; Figlerowicz, Marek; Blazewicz, Jacek

    2016-01-01

    MicroRNAs (miRNAs) are small non-coding RNAs that have been found in most eukaryotic organisms. They are involved in the regulation of gene expression at the post-transcriptional level in a sequence-specific manner. MiRNAs are produced from their precursors by the Dicer-dependent small RNA biogenesis pathway. The involvement of miRNAs in a wide range of biological processes makes them excellent candidates for studying gene function or for therapeutic applications. For this purpose, different RNA-based gene silencing techniques have been developed. Artificial miRNAs (amiRNAs) targeting one or several genes of interest represent one such technique and a potential tool in functional genomics. Here, we present a new approach to amiRNA design, implemented as the AmiRNA Designer software. Our method is based on the thermodynamic analysis of the native miRNA/miRNA* and miRNA/target duplexes. In contrast to the available automated tools, our program allows the user to perform analysis of natural miRNAs for the organism of interest and to create customized constraints for the design stage. It also provides filtering of the amiRNA candidates for potential off-targets. AmiRNA Designer is freely available at http://www.cs.put.poznan.pl/arybarczyk/AmiRNA/. PMID:26784022

  1. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
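The claimed architecture can be sketched in a few lines of code (all class names, types, and individuals below are invented; the patent itself defines these structures over a formal ontology): modules registered for different ontology classification types each process only the matching abstractions from a shared semantic graph.

```python
# Toy sketch of the patented architecture: reasoning modules registered for
# different ontology classification types, each processing only matching
# abstractions from a shared semantic graph held in working memory.

semantic_graph = [
    {"individual": "alice", "type": "Person"},
    {"individual": "badge-17", "type": "Credential"},
    {"individual": "bob", "type": "Person"},
]

class ReasoningModule:
    """Processes only abstractions of its own classification type."""
    def __init__(self, classification_type):
        self.classification_type = classification_type
        self.seen = []

    def process(self, abstraction):
        if abstraction["type"] == self.classification_type:
            self.seen.append(abstraction["individual"])

modules = [ReasoningModule("Person"), ReasoningModule("Credential")]
for abstraction in semantic_graph:       # one pass over working memory
    for module in modules:
        module.process(abstraction)

print(modules[0].seen, modules[1].seen)  # → ['alice', 'bob'] ['badge-17']
```

The dispatch-by-type split is what lets the first and second modules of the claim handle different classification types independently.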

  2. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  3. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  4. Computational methods in metabolic engineering for strain design.

    PubMed

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native, heterologous, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve the runtime performances have also been developed, which allow for more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms.
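The gene-deletion search the abstract mentions can be caricatured in a toy brute-force sketch (genes and yield rules are invented; real strain-design tools score candidates with constraint-based models such as flux balance analysis rather than a hand-written function).

```python
from itertools import combinations

# Toy brute-force strain-design search: enumerate knockout sets and keep
# the one with the best product yield under an invented scoring rule.

genes = ["g1", "g2", "g3", "g4"]

def product_yield(knocked_out):
    active = set(genes) - set(knocked_out)
    # Invented rules: g1 and g3 are required for the product pathway,
    # while g2 diverts flux to a byproduct and lowers yield.
    if not {"g1", "g3"} <= active:
        return 0.0
    return 0.6 + (0.3 if "g2" not in active else 0.0)

candidates = (ko for r in range(3) for ko in combinations(genes, r))
best = max(candidates, key=product_yield)
print(best, round(product_yield(best), 2))  # → ('g2',) 0.9
```

The combinatorial growth of candidate sets is why the runtime-performance improvements mentioned in the abstract matter for realistic genome-scale models.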

  5. Unique Method for Generating Design Earthquake Time Histories

    SciTech Connect

    R. E. Spears

    2008-07-01

    A method has been developed which takes a seed earthquake time history and modifies it to produce given design response spectra. It is a multi-step process with an initial scaling step and then multiple refinement steps. It is unique in the fact that both the acceleration and displacement response spectra are considered when performing the fit (which primarily improves the low frequency acceleration response spectrum accuracy). Additionally, no matrix inversion is needed. The features include encouraging the code acceleration, velocity, and displacement ratios and attempting to fit the pseudo velocity response spectrum. Also, “smoothing” is done to transition the modified time history to the seed time history at its start and end. This is done in the time history regions below a cumulative energy of 5% and above a cumulative energy of 95%. Finally, the modified acceleration, velocity, and displacement time histories are adjusted to start and end with an amplitude of zero (using Fourier transform techniques for integration).
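Only the initial scaling step lends itself to a short sketch (the seed record and target spectrum below are invented; the full method iteratively refines the fit over both acceleration and displacement spectra): integrate single-degree-of-freedom oscillators over the seed record, compare peak pseudo-acceleration with the target spectrum, and scale the record by the average ratio.

```python
import math

# Hedged sketch of the initial scaling step for spectrum matching:
# a seed record is scaled so its response spectrum approaches a target
# design spectrum on average across a few oscillator frequencies.

DT = 0.01
seed = [math.sin(2 * math.pi * 2.0 * DT * i) for i in range(500)]  # 2 Hz burst

def peak_sdof_accel(record, freq_hz, zeta=0.05):
    """Peak pseudo-acceleration of a damped SDOF (central differences)."""
    w = 2.0 * math.pi * freq_hz
    c = zeta * w * DT
    u_prev = u = peak = 0.0
    for ag in record:
        # u'' + 2*zeta*w*u' + w^2*u = -ag (unit mass), central differences
        u_next = (2.0 * u - u_prev + c * u_prev
                  - DT * DT * (ag + w * w * u)) / (1.0 + c)
        peak = max(peak, w * w * abs(u_next))
        u_prev, u = u, u_next
    return peak

freqs = [1.0, 2.0, 4.0]
target = {1.0: 0.8, 2.0: 2.0, 4.0: 1.2}   # invented target spectral accels
scale = sum(target[f] / peak_sdof_accel(seed, f) for f in freqs) / len(freqs)
scaled = [scale * a for a in seed]
print(round(scale, 3))
```

The refinement steps in the actual method then adjust individual frequency bands, smooth the start and end of the record, and zero the end conditions, none of which a single global scale factor can do.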

  6. Computational methods in metabolic engineering for strain design.

    PubMed

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native, heterologous, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve the runtime performances have also been developed, which allow for more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms. PMID:25576846

  7. Development of impact design methods for ceramic gas turbine components

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1990-01-01

    Impact damage prediction methods are being developed to aid in the design of ceramic gas turbine engine components with improved impact resistance. Two impact damage modes were characterized: local damage near the impact site, and structural damage, usually fast fracture away from the impact site. Local damage to Si3N4 impacted by Si3N4 spherical projectiles consists of ring and/or radial cracks around the impact point. In a mechanistic model being developed, impact damage is characterized as microcrack nucleation and propagation, and the extent of damage is measured as the volume fraction of microcracks. Model capability is demonstrated by simulating late impact tests. Structural failure is caused by tensile stress during impact exceeding material strength. The EPIC3 code was successfully used to predict blade structural failures for impacts of different particle sizes on radial and axial blades.

  8. Design method of water jet pump towards high cavitation performances

    NASA Astrophysics Data System (ADS)

    Cao, L. L.; Che, B. X.; Hu, L. J.; Wu, D. Z.

    2016-05-01

    As one of the crucial components of the power system, the propulsion system is of great significance to the advance speed, noise performance, stability and other critical performances of underwater vehicles. Developing towards much higher advance speeds, underwater vehicles place increasingly critical demands on the performance of the propulsion system. Basically, an increased advance speed requires a significantly raised rotation speed of the propulsion system, which results in deteriorated cavitation performance and consequently limits the thrust and efficiency of the whole system. Compared with the traditional propeller, the water jet pump offers more favourable cavitation behaviour, propulsion efficiency and other associated performances. The present research focuses on the cavitation performance of the water jet pump blade profile, with the aim of enlarging its advantages in high-speed vehicle propulsion. Based on the specifications of a certain underwater vehicle, the design method of a water jet blade with high cavitation performance was investigated by means of numerical simulation.

  9. Virtual Design Method for Controlled Failure in Foldcore Sandwich Panels

    NASA Astrophysics Data System (ADS)

    Sturm, Ralf; Fischer, S.

    2015-12-01

    For certification, novel fuselage concepts have to demonstrate crashworthiness equivalent to the existing metal reference design. Due to the brittle failure behaviour of CFRP, this requirement can only be fulfilled by controlled progressive crash kinematics. Experiments showed that the failure of a twin-walled fuselage panel can be controlled by a local modification of the core through-thickness compression strength. For folded cores, the required change in core properties can be introduced by a modification of the fold pattern. However, the complexity of folded cores requires a virtual design methodology for tailoring the fold pattern according to all static and crash-relevant requirements. In this context, a foldcore micromodel simulation method is presented to identify the structural response of twin-walled fuselage panels with folded cores under crash-relevant loading conditions. The simulations showed that a high degree of correlation is required before simulation can replace expensive testing. In the presented studies, the necessary correlation quality could only be obtained by including imperfections of the core material in the micromodel simulation approach.

  10. Infectious Agents and Cancer Epidemiology Research Webinar Series

    Cancer.gov

    Infectious Agents and Cancer Epidemiology Research Webinar Series highlights emerging and cutting-edge research related to infection-associated cancers, shares scientific knowledge about technologies and methods, and fosters cross-disciplinary discussions on infectious agents and cancer epidemiology.

  11. Inquiry into the Practices of Expert Courseware Designers: A Pragmatic Method for the Design of Effective Instructional Systems

    ERIC Educational Resources Information Center

    Rowley, Kurt

    2005-01-01

    A multi-stage study of the practices of expert courseware designers was conducted with the final goal of identifying methods for assisting non-experts with the design of effective instructional systems. A total of 25 expert designers were involved in all stages of the inquiry. A model of the expert courseware design process was created, tested,…

  12. The Schisto Track: A System for Gathering and Monitoring Epidemiological Surveys by Connecting Geographical Information Systems in Real Time

    PubMed Central

    2014-01-01

    Background Using the Android platform as a notification instrument for diseases and disorders forms a new alternative for computerization of epidemiological studies. Objective The objective of our study was to construct a tool for gathering epidemiological data on schistosomiasis using the Android platform. Methods The developed application (app), named the Schisto Track, is a tool for data capture and analysis that was designed to meet the needs of a traditional epidemiological survey. An initial version of the app was finished and tested in both real situations and simulations for epidemiological surveys. Results The app proved to be a tool capable of automation of activities, with data organization and standardization, easy data recovery (to enable interfacing with other systems), and totally modular architecture. Conclusions The proposed Schisto Track is in line with worldwide trends toward use of smartphones with the Android platform for modeling epidemiological scenarios. PMID:25099881

  13. The Method of Complex Characteristics for Design of Transonic Compressors.

    NASA Astrophysics Data System (ADS)

    Bledsoe, Margaret Randolph

    We calculate shockless transonic flows past two-dimensional cascades of airfoils characterized by a prescribed speed distribution. The approach is to find solutions of the partial differential equation (c² − u²)φ_xx − 2uv φ_xy + (c² − v²)φ_yy = 0 by the method of complex characteristics. Here φ is the velocity potential, so ∇φ = (u, v), and c is the local speed of sound. Our method rests on the observation that the coefficients of the equation are analytic, so that analytic continuation, conformal mapping, and a spectral method in the hodograph plane can be used to determine the flow. After complex extension we obtain canonical equations for φ and for the stream function ψ, as well as an explicit map from the hodograph plane to complex characteristic coordinates. In the subsonic case, a new coordinate system is defined in which the flow region corresponds to the interior of an ellipse. We construct special solutions of the flow equations in these coordinates by solving characteristic initial value problems in the ellipse with initial data defined by the complete system of Chebyshev polynomials. The condition ψ = 0 on the boundary of the ellipse is used to determine the series representation of φ and ψ. The map from the ellipse to the complex flow coordinates is found from data specifying the speed q as a function of the arc length s. The transonic problem for shockless flow becomes well posed after appropriate modifications of this procedure. The nonlinearity of the problem is handled by an iterative method that determines the boundary value problem in the ellipse and the map function in sequence. We have implemented this method as a computer code to design two-dimensional cascades of shockless compressor airfoils with gap-to-chord ratios as low as 0.5 and supersonic zones on both the upper and lower surfaces. The method may be extended to solve more general boundary value problems for second order partial differential equations.
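
    As a hedged aside (standard second-order PDE algebra, not taken from the abstract), the characteristic directions of this equation make the name of the method concrete. Writing it as Aφ_xx + 2Bφ_xy + Cφ_yy = 0 with A = c² − u², B = −uv, C = c² − v²:

```latex
\frac{dy}{dx} = \frac{B \pm \sqrt{B^{2} - AC}}{A}
             = \frac{-uv \pm c\,\sqrt{u^{2} + v^{2} - c^{2}}}{c^{2} - u^{2}}
```

    The discriminant B² − AC = c²(q² − c²), with q² = u² + v², is negative wherever the flow is subsonic (q < c), so the characteristics are complex there; this is why analytic continuation into complex characteristic coordinates is the natural solution device.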

  14. International Lymphoma Epidemiology Consortium

    Cancer.gov

    The InterLymph Consortium, or formally the International Consortium of Investigators Working on Non-Hodgkin's Lymphoma Epidemiologic Studies, is an open scientific forum for epidemiologic research in non-Hodgkin's lymphoma.

  15. Epidemiological Cutoff Values for Fluconazole, Itraconazole, Posaconazole, and Voriconazole for Six Candida Species as Determined by the Colorimetric Sensititre YeastOne Method

    PubMed Central

    Pemán, Javier; Iñiguez, Carmen; Hervás, David; Lopez-Hontangas, Jose L.; Pina-Vaz, Cidalia; Camarena, Juan J.; Campos-Herrero, Isolina; García-García, Inmaculada; García-Tapia, Ana M.; Guna, Remedios; Merino, Paloma; Pérez del Molino, Luisa; Rubio, Carmen; Suárez, Anabel

    2013-01-01

    In the absence of clinical breakpoints (CBP), epidemiological cutoff values (ECVs) are useful to separate wild-type (WT) isolates (without mechanisms of resistance) from non-WT isolates (those that can harbor some resistance mechanisms), which is the goal of susceptibility tests. Sensititre YeastOne (SYO) is a widely used method to determine susceptibility of Candida spp. to antifungal agents. The CLSI CBP have been established, but not for the SYO method. The ECVs for four azoles, obtained using MIC distributions determined by the SYO method, were calculated via five methods (three statistical methods and based on the MIC50 and modal MIC). Respectively, the median ECVs (in mg/liter) of the five methods for fluconazole, itraconazole, posaconazole, and voriconazole (in parentheses: the percentage of isolates inhibited by MICs equal to or less than the ECVs; the number of isolates tested) were as follows: 2 (94.4%; 944), 0.5 (96.7%; 942), 0.25 (97.6%; 673), and 0.06 (96.7%; 849) for Candida albicans; 4 (86.1%; 642), 0.5 (99.4%; 642), 0.12 (93.9%; 392), and 0.06 (86.9%; 559) for C. parapsilosis; 8 (94.9%; 175), 1 (93.7%; 175), 2 (93.6%; 125), and 0.25 (90.4%; 167) for C. tropicalis; 128 (98.6%; 212), 4 (95.8%; 212), 4 (96.0%; 173), and 2 (98.5%; 205) for C. glabrata; 256 (100%; 53), 1 (98.1%; 53), 1 (100%; 33), and 1 (97.9%; 48) for C. krusei; 4 (89.2%; 93), 0.5 (100%; 93), 0.25 (100%; 33), and 0.06 (87.7%; 73) for C. orthopsilosis. All methods included ≥94% of isolates and yielded similar ECVs (within 1 dilution). These ECVs would be suitable for monitoring emergence of isolates with reduced susceptibility by using the SYO method. PMID:23761155
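
    One of the simpler ECV constructions mentioned above, choosing the MIC that captures a fixed fraction of the wild-type distribution, can be sketched as follows. The function name and the ≥95% coverage rule are illustrative assumptions, not the study's exact statistical methods:

```python
import numpy as np

def ecv_from_distribution(mic_values, counts, coverage=0.95):
    """Illustrative ECV estimate: the lowest MIC (on the two-fold dilution
    scale) that captures at least `coverage` of the observed isolates.
    A sketch only -- the study also used modal-MIC and statistical methods."""
    counts = np.asarray(counts, dtype=float)
    cum = np.cumsum(counts) / counts.sum()   # cumulative fraction of isolates
    idx = int(np.searchsorted(cum, coverage))  # first dilution reaching coverage
    return mic_values[idx]
```

    For a hypothetical distribution with counts [5, 50, 30, 10, 5] at MICs [0.25, 0.5, 1, 2, 4] mg/liter, this rule returns 2 mg/liter, the first dilution at which 95% of isolates are included.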

  16. Epidemiology, Science as Inquiry and Scientific Literacy

    ERIC Educational Resources Information Center

    Kaelin, Mark; Huebner, Wendy

    2003-01-01

    The recent worldwide SARS outbreak has put the science of epidemiology into the headlines once again. Epidemiology is "... the study of the distribution and the determinants of health-related states or events and the application of these methods to the control of health problems" (Gordis 2000). In this context, the authors have developed a…

  17. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    SciTech Connect

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-15

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the framework of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  18. SU-D-16A-01: A Novel Method to Estimate Normal Tissue Dose for Radiotherapy Patients to Support Epidemiologic Studies of Second Cancer Risk

    SciTech Connect

    Lee, C; Jung, J; Pelletier, C; Kim, J; Lee, C

    2014-06-01

    Purpose: Patient cohorts in second cancer studies often include radiotherapy patients for whom no radiological images are available. We developed methods to construct a realistic surrogate anatomy by using computational human phantoms, and tested the phantom images both in a commercial treatment planning system (Eclipse) and in a custom Monte Carlo (MC) transport code. Methods: We used the reference adult male phantom defined by the International Commission on Radiological Protection (ICRP). The hybrid phantom, originally developed in Non-Uniform Rational B-Spline (NURBS) and polygon mesh format, was converted into a more common medical imaging format. Electron density was calculated from the material composition of the organs and tissues and then converted into DICOM format. The DICOM images were imported into the Eclipse system for treatment planning, and the resulting DICOM-RT files were imported into the MC code for MC-based dose calculation. Normal tissue doses were calculated in Eclipse and in the MC code for an illustrative prostate treatment case and compared to each other. Results: DICOM images were generated from the adult male reference phantom. Densities and volumes of selected organs in the original phantom and as represented within Eclipse agreed to better than 0.6%. Mean doses from Eclipse and the MC code agreed within 7%, whereas maximum and minimum doses differed by up to 45%. Conclusion: The methods established in this study will be useful for the reconstruction of organ dose to support epidemiological studies of second cancer in cancer survivors treated with radiotherapy. We are also working on implementing body size-dependent computational phantoms to better represent patient anatomy when the height and weight of patients are available.

  19. Assessing the Epidemiological Data and Management Methods of Body Packers Admitted to a Referral Center in Iran

    PubMed Central

    Alipour-faz, Athena; Shadnia, Shahin; Mirhashemi, Seyyed Hadi; Peyvandi, Maryam; Oroei, Mahbobeh; Shafagh, Omid; Peyvandi, Hassan; Peyvandi, Ali Asghar

    2016-01-01

    Abstract The incidence of smuggling and transporting illegal substances by internal concealment, also known as body packing, is on the rise. The clinical approach to such patients has changed significantly over the past 2 decades. However, despite a recorded increase in body packing in general, there are controversies in the management of these patients. We aimed to gather data regarding the demographic characteristics, treatment, and outcome of body packers who were referred to Loghman Hakim Hospital, Tehran, Iran. The data of all body packers admitted to Loghman Hakim Hospital during 2010 to 2014 were evaluated retrospectively. Data regarding the demographic characteristics of the patients, findings of clinical imaging, treatment, and outcome were recorded. In this study, 175 individuals with a mean age of 31 ± 10 years were assessed. The most common concealed substances were crack (37%), crystal (17%), opium (13%), and heroin (6%). According to the results of surgery and imaging (abdominal radiography or computed tomography), the most common place for concealment was the stomach, in 33.3% and 12% of cases, respectively. Imaging findings were normal in 18% of the individuals. Forty-eight (27%) patients underwent surgery. The main indications for surgery were clinical manifestations of toxicity (79%) and obstruction of the gastro-intestinal tract (17%). The most common surgical techniques were laparotomy and gastrotomy (50%). The mean duration of hospitalization was 3.8 ± 4 days. The mortality rate was 3%. Conservative treatment of body packers seems to be the best treatment method. Careful monitoring of the patients for possible signs and symptoms of intoxication and gastro-intestinal obstruction is strongly recommended. PMID:27175693

  20. Assessing the Epidemiological Data and Management Methods of Body Packers Admitted to a Referral Center in Iran.

    PubMed

    Alipour-Faz, Athena; Shadnia, Shahin; Mirhashemi, Seyyed Hadi; Peyvandi, Maryam; Oroei, Mahbobeh; Shafagh, Omid; Peyvandi, Hassan; Peyvandi, Ali Asghar

    2016-05-01

    The incidence of smuggling and transporting illegal substances by internal concealment, also known as body packing, is on the rise. The clinical approach to such patients has changed significantly over the past 2 decades. However, despite a recorded increase in body packing in general, there are controversies in the management of these patients. We aimed to gather data regarding the demographic characteristics, treatment, and outcome of body packers who were referred to Loghman Hakim Hospital, Tehran, Iran. The data of all body packers admitted to Loghman Hakim Hospital during 2010 to 2014 were evaluated retrospectively. Data regarding the demographic characteristics of the patients, findings of clinical imaging, treatment, and outcome were recorded. In this study, 175 individuals with a mean age of 31 ± 10 years were assessed. The most common concealed substances were crack (37%), crystal (17%), opium (13%), and heroin (6%). According to the results of surgery and imaging (abdominal radiography or computed tomography), the most common place for concealment was the stomach, in 33.3% and 12% of cases, respectively. Imaging findings were normal in 18% of the individuals. Forty-eight (27%) patients underwent surgery. The main indications for surgery were clinical manifestations of toxicity (79%) and obstruction of the gastro-intestinal tract (17%). The most common surgical techniques were laparotomy and gastrotomy (50%). The mean duration of hospitalization was 3.8 ± 4 days. The mortality rate was 3%. Conservative treatment of body packers seems to be the best treatment method. Careful monitoring of the patients for possible signs and symptoms of intoxication and gastro-intestinal obstruction is strongly recommended. PMID:27175693

  1. Cancer Epidemiology Matters Blog

    Cancer.gov

    The Cancer Epidemiology Matters blog helps foster a dialogue between the National Cancer Institute's (NCI) Epidemiology and Genomics Research Program (EGRP), extramural researchers, and other individuals, such as clinicians, community partners, and advocates, who are interested in cancer epidemiology and genomics.

  2. A Universal Design Method for Reflecting Physical Characteristics Variability: Case Study of a Bicycle Frame.

    PubMed

    Shimada, Masato; Suzuki, Wataru; Yamada, Shuho; Inoue, Masato

    2016-01-01

    To achieve a Universal Design, designers must consider diverse users' physical and functional requirements for their products. However, satisfying these requirements and obtaining the information necessary for designing a universal product is very difficult. Therefore, we propose a new design method based on the concept of set-based design to address these issues. This paper discusses the suitability of the proposed design method by applying it to a bicycle frame design problem. PMID:27534334

  3. A Universal Design Method for Reflecting Physical Characteristics Variability: Case Study of a Bicycle Frame.

    PubMed

    Shimada, Masato; Suzuki, Wataru; Yamada, Shuho; Inoue, Masato

    2016-01-01

    To achieve a Universal Design, designers must consider diverse users' physical and functional requirements for their products. However, satisfying these requirements and obtaining the information necessary for designing a universal product is very difficult. Therefore, we propose a new design method based on the concept of set-based design to address these issues. This paper discusses the suitability of the proposed design method by applying it to a bicycle frame design problem.

  4. Design optimization methods for genomic DNA tiling arrays.

    PubMed

    Bertone, Paul; Trifonov, Valery; Rozowsky, Joel S; Schubert, Falk; Emanuelsson, Olof; Karro, John; Kao, Ming-Yang; Snyder, Michael; Gerstein, Mark

    2006-02-01

    A recent development in microarray research entails the unbiased coverage, or tiling, of genomic DNA for the large-scale identification of transcribed sequences and regulatory elements. A central issue in designing tiling arrays is that of arriving at a single-copy tile path, as significant sequence cross-hybridization can result from the presence of non-unique probes on the array. Due to the fragmentation of genomic DNA caused by the widespread distribution of repetitive elements, the problem of obtaining adequate sequence coverage increases with the sizes of subsequence tiles that are to be included in the design. This becomes increasingly problematic when considering complex eukaryotic genomes that contain many thousands of interspersed repeats. The general problem of sequence tiling can be framed as finding an optimal partitioning of non-repetitive subsequences over a prescribed range of tile sizes, on a DNA sequence comprising repetitive and non-repetitive regions. Exact solutions to the tiling problem become computationally infeasible when applied to large genomes, but successive optimizations are developed that allow their practical implementation. These include an efficient method for determining the degree of similarity of many oligonucleotide sequences over large genomes, and two algorithms for finding an optimal tile path composed of longer sequence tiles. The first algorithm, a dynamic programming approach, finds an optimal tiling in linear time and space; the second applies a heuristic search to reduce the space complexity to a constant requirement. A Web resource has also been developed, accessible at http://tiling.gersteinlab.org, to generate optimal tile paths from user-provided DNA sequences.
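
    A toy version of the dynamic-programming tile-path search can illustrate the idea. This simplified sketch (repeat-avoiding tiles of bounded length, maximising covered bases on a single sequence) is an assumption-laden reduction of the paper's formulation, not the published algorithm:

```python
def optimal_tiling(repeat_mask, lmin, lmax):
    """Toy dynamic program for sequence tiling: choose non-overlapping tiles
    of length lmin..lmax that avoid repetitive bases (repeat_mask[i] == True)
    and maximise the number of covered bases.
    Runs in O(n * (lmax - lmin + 1)) time, i.e. linear in n for a fixed range."""
    n = len(repeat_mask)
    # clean[i] = length of the repeat-free run ending at position i-1
    clean = [0] * (n + 1)
    for i in range(1, n + 1):
        clean[i] = 0 if repeat_mask[i - 1] else clean[i - 1] + 1
    dp = [0] * (n + 1)        # dp[i] = max bases covered in the prefix of length i
    choice = [0] * (n + 1)    # length of the tile ending at i (0 = no tile)
    for i in range(1, n + 1):
        dp[i], choice[i] = dp[i - 1], 0
        for L in range(lmin, min(lmax, clean[i]) + 1):
            if dp[i - L] + L > dp[i]:
                dp[i], choice[i] = dp[i - L] + L, L
    # backtrack the chosen tile path as (start, end) half-open intervals
    tiles, i = [], n
    while i > 0:
        L = choice[i]
        if L:
            tiles.append((i - L, i))
            i -= L
        else:
            i -= 1
    return dp[n], tiles[::-1]
```

    The real problem is much larger (whole genomes, probe cross-hybridization scores rather than a boolean mask), which is where the paper's similarity-computation method and the constant-space heuristic variant come in.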

  5. The Convergence Insufficiency Treatment Trial: Design, Methods, and Baseline Data

    PubMed Central

    2009-01-01

    Objective This report describes the design and methodology of the Convergence Insufficiency Treatment Trial (CITT), the first large-scale, placebo-controlled, randomized clinical trial evaluating treatments for convergence insufficiency (CI) in children. We also report the clinical and demographic characteristics of patients. Methods We prospectively randomized children 9 to 17 years of age to one of four treatment groups: 1) home-based pencil push-ups, 2) home-based computer vergence/accommodative therapy and pencil push-ups, 3) office-based vergence/accommodative therapy with home reinforcement, 4) office-based placebo therapy. Outcome data on the Convergence Insufficiency Symptom Survey (CISS) score (primary outcome), near point of convergence (NPC), and positive fusional vergence were collected after 12 weeks of active treatment and again at 6 and 12 months post-treatment. Results The CITT enrolled 221 children with symptomatic CI with a mean age of 12.0 years (SD = 2.3). The clinical profile of the cohort at baseline was 9Δ exophoria at near (+/− 4.4) and 2Δ exophoria (+/− 2.8) at distance, CISS score = 30 (+/− 9.0), NPC = 14 cm (+/− 7.5), and near positive fusional vergence break = 13Δ (+/− 4.6). There were no statistically significant or clinically relevant differences between treatment groups with respect to baseline characteristics (p > 0.05). Conclusion Hallmark features of the study design include formal definitions of conditions and outcomes, standardized diagnostic and treatment protocols, a placebo treatment arm, masked outcome examinations, and the CISS score outcome measure. The baseline data reported herein define the clinical profile of those enrolled into the CITT. PMID:18300086

  6. Visual Narrative Research Methods as Performance in Industrial Design Education

    ERIC Educational Resources Information Center

    Campbell, Laurel H.; McDonagh, Deana

    2009-01-01

    This article discusses teaching empathic research methodology as performance. The authors describe their collaboration in an activity to help undergraduate industrial design students learn empathy for others when designing products for use by diverse or underrepresented people. The authors propose that an industrial design curriculum would benefit…

  7. Application of the Random Forest method to analyse epidemiological and phenotypic characteristics of Salmonella 4,[5],12:i:- and Salmonella Typhimurium strains.

    PubMed

    Barco, L; Mancin, M; Ruffa, M; Saccardin, C; Minorello, C; Zavagnin, P; Lettini, A A; Olsen, J E; Ricci, A

    2012-11-01

    Salmonella enterica 4,[5],12:i:- is a monophasic variant of S. Typhimurium whose prevalence has risen sharply in the last decade. Although S. 4,[5],12:i:- and S. Typhimurium are known to pose a considerable public health risk, there is no detailed information on the circulation of these serovars in Italy, particularly as far as veterinary isolates are concerned. For this reason, a data set of 877 strains isolated during 2005-2010 in the north-east of Italy from foodstuffs, animals and the environment was analysed. The Random Forests (RF) method was used to identify the most important epidemiological and phenotypic variables distinguishing the two serovars. Both descriptive analysis and RF revealed that S. 4,[5],12:i:- is less heterogeneous than S. Typhimurium. RF highlighted phage type as the most important variable for differentiating the two serovars. The most common phage types identified for S. 4,[5],12:i:- were DT20a, U311 and DT193. The same phage types were also found in S. Typhimurium isolates, although at a much lower prevalence. DT7 and DT120 were ascribed to the two serovars at comparable levels. DT104, DT2 and DT99 were ascribed exclusively to S. Typhimurium, and almost all the other phage types identified were more related to the latter serovar. These data confirm that phage typing can provide an indication of the biphasic or monophasic state of the strains investigated and could therefore support serotyping results. However, phage typing cannot be used as the definitive method to differentiate the two serovars, as some phage types were detected for both serovars and, in particular, all phage types found for S. 4,[5],12:i:- were also found for S. Typhimurium.
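
    The variable-importance ranking described above can be sketched with a random forest in scikit-learn. The data below are synthetic stand-ins; the variable names, encodings and proportions are assumptions for illustration, not the study's 877-isolate data set:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in: each isolate has an (ordinally encoded) phage type and
# a source category, and we ask which variable best separates the serovars.
rng = np.random.default_rng(42)
n = 800
serovar = rng.integers(0, 2, n)     # 0 = S. Typhimurium, 1 = S. 4,[5],12:i:-
# phage type agrees with serovar 80% of the time; source is independent noise
phage = np.where(rng.random(n) < 0.8, serovar, rng.integers(0, 2, n))
source = rng.integers(0, 3, n)      # food / animal / environment
X = np.column_stack([phage, source])

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, serovar)
importance = dict(zip(["phage_type", "source"], rf.feature_importances_))
# phage_type dominates the mean-decrease-in-impurity importance ranking
```

    In practice categorical variables with many levels (such as real phage types) are better one-hot encoded or handled with permutation importance, since impurity-based importances can be biased toward high-cardinality features.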

  8. The Chinese American Eye Study: Design and Methods

    PubMed Central

    Varma, Rohit; Hsu, Chunyi; Wang, Dandan; Torres, Mina; Azen, Stanley P.

    2016-01-01

    Purpose To summarize the study design, operational strategies and procedures of the Chinese American Eye Study (CHES), a population-based assessment of the prevalence of visual impairment, ocular disease, and visual functioning in Chinese Americans. Methods This population-based, cross-sectional study included 4,570 Chinese Americans, 50 years and older, residing in the city of Monterey Park, California. Each eligible participant completed a detailed interview and eye examination. The interview included an assessment of demographic, behavioral, and ocular risk factors and health-related and vision-related quality of life. The eye examination included measurements of visual acuity, intraocular pressure, visual fields, fundus and optic disc photography, a detailed anterior and posterior segment examination, and measurements of blood pressure, glycosylated hemoglobin levels, and blood glucose levels. Results The objectives of the CHES are to obtain prevalence estimates of visual impairment, refractive error, diabetic retinopathy, open-angle and angle-closure glaucoma, lens opacities, and age-related macular degeneration in Chinese Americans. In addition, outcomes include effect estimates for risk factors associated with eye diseases. Lastly, CHES will investigate the genetic determinants of myopia and glaucoma. Conclusion The CHES will provide information about the prevalence and risk factors of ocular diseases in one of the fastest growing minority groups in the United States. PMID:24044409

  9. Design and methods of the national Vietnam veterans longitudinal study.

    PubMed

    Schlenger, William E; Corry, Nida H; Kulka, Richard A; Williams, Christianna S; Henn-Haase, Clare; Marmar, Charles R

    2015-09-01

    The National Vietnam Veterans Longitudinal Study (NVVLS) is the second assessment of a representative cohort of US veterans who served during the Vietnam War era, either in Vietnam or elsewhere. The cohort was initially surveyed in the National Vietnam Veterans Readjustment Study (NVVRS) from 1984 to 1988 to assess the prevalence, incidence, and effects of post-traumatic stress disorder (PTSD) and other post-war problems. The NVVLS sought to re-interview the cohort to assess the long-term course of PTSD. NVVLS data collection began July 3, 2012 and ended May 17, 2013, comprising three components: a mailed health questionnaire, a telephone health survey interview, and, for a probability sample of theater Veterans, a clinical diagnostic telephone interview administered by licensed psychologists. Excluding decedents, 78.8% completed the questionnaire and/or telephone survey, and 55.0% of selected living veterans participated in the clinical interview. This report provides a description of the NVVLS design and methods. Together, the NVVRS and NVVLS constitute a nationally representative longitudinal study of Vietnam veterans, and extend the NVVRS as a critical resource for scientific and policy analyses for Vietnam veterans, with policy relevance for Iraq and Afghanistan veterans.

  10. Design and methods of the national Vietnam veterans longitudinal study.

    PubMed

    Schlenger, William E; Corry, Nida H; Kulka, Richard A; Williams, Christianna S; Henn-Haase, Clare; Marmar, Charles R

    2015-09-01

    The National Vietnam Veterans Longitudinal Study (NVVLS) is the second assessment of a representative cohort of US veterans who served during the Vietnam War era, either in Vietnam or elsewhere. The cohort was initially surveyed in the National Vietnam Veterans Readjustment Study (NVVRS) from 1984 to 1988 to assess the prevalence, incidence, and effects of post-traumatic stress disorder (PTSD) and other post-war problems. The NVVLS sought to re-interview the cohort to assess the long-term course of PTSD. NVVLS data collection began July 3, 2012 and ended May 17, 2013, comprising three components: a mailed health questionnaire, a telephone health survey interview, and, for a probability sample of theater Veterans, a clinical diagnostic telephone interview administered by licensed psychologists. Excluding decedents, 78.8% completed the questionnaire and/or telephone survey, and 55.0% of selected living veterans participated in the clinical interview. This report provides a description of the NVVLS design and methods. Together, the NVVRS and NVVLS constitute a nationally representative longitudinal study of Vietnam veterans, and extend the NVVRS as a critical resource for scientific and policy analyses for Vietnam veterans, with policy relevance for Iraq and Afghanistan veterans. PMID:26096554

  11. A decision-based perspective for the design of methods for systems design

    NASA Technical Reports Server (NTRS)

    Mistree, Farrokh; Muster, Douglas; Shupe, Jon A.; Allen, Janet K.

    1989-01-01

    Topics covered include the organization of the material, a definition of decision-based design, a hierarchy of decision-based design, the decision support problem technique, conceptual modeling of designs that can be manufactured and maintained, meta-design, computer-based design, action learning, and the characteristics of decisions.

  12. [The specific epidemiological surveillance of dengue: the method and its importance since the dengue-2 epidemic in French Polynesia in 1996].

    PubMed

    Deparis, X; Chungue, E; Pauck, S; Roche, C; Murgue, B; Gleize, L

    1998-07-01

    Dengue fever is present in tropical and subtropical regions, and its geographical spread and the concurrent rise in its mortality are worrisome. In endemic or epidemic countries, the aim of dengue-specific epidemiological surveillance is to confirm as early as possible the circulation of a new dengue virus serotype, i.e. the beginning of an epidemic; the earlier the epidemic alert, the more effective the control strategy. In French Polynesia, dengue-3 virus had circulated at a low level since 1989, and in May 1996 specific epidemiological surveillance was undertaken because of the threat of a dengue-4 epidemic. For each suspected dengue case reported by 18 Polynesian physicians located in the Société Islands, a blood sample was taken for virological assay and clinical data were recorded. Between May and November 1996, the virology unit of the Institut Malardé isolated 21 viruses (2 dengue-3 and 19 dengue-2) from 302 suspected cases. The dengue-specific epidemiological surveillance confirmed that dengue-2 virus was circulating and shortened the time to epidemiological alert by 2 or 3 months compared with previous epidemics. Taking into account the day of illness, a logistic regression performed on the clinical data showed that the absence of cough was the only sign predictive of a dengue diagnosis. The performance of this dengue-specific epidemiological surveillance system led us to consider its implementation in all concerned countries; collaboration with international reference laboratories could be a solution for developing countries.
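The logistic-regression step described above can be sketched in miniature. The patient records and the fit below are invented for illustration (a handful of cases coded only by presence of cough); only the qualitative conclusion, that a negative coefficient on cough points away from confirmed dengue, mirrors the abstract:

```python
import math

# Hypothetical illustration of the surveillance study's regression step.
# Rows are invented: (cough present?, dengue laboratory-confirmed?).
data = [
    (0, 1), (0, 1), (0, 1), (0, 1), (1, 0),
    (1, 0), (1, 0), (0, 0), (1, 1), (1, 0),
]

def fit_logistic(data, lr=0.5, steps=2000):
    """Fit y ~ sigmoid(b0 + b1*x) by plain gradient ascent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / len(data)
        b1 += lr * g1 / len(data)
    return b0, b1

b0, b1 = fit_logistic(data)
print(round(b1, 2))  # negative: presence of cough predicts *against* confirmed dengue
```

With these toy data the cough coefficient converges to a clearly negative value, the same direction of effect the surveillance study reported.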

  13. ADHD in the Arab World: A Review of Epidemiologic Studies

    ERIC Educational Resources Information Center

    Farah, Lynn G.; Fayyad, John A.; Eapen, Valsamma; Cassir, Youmna; Salamoun, Mariana M.; Tabet, Caroline C.; Mneimneh, Zeina N.; Karam, Elie G.

    2009-01-01

    Objective: Epidemiological studies on psychiatric disorders are quite rare in the Arab World. This article reviews epidemiological studies on ADHD in all the Arab countries. Method: All epidemiological studies on ADHD conducted from 1966 through the present were reviewed. Samples were drawn from the general community, primary care clinical…

  14. Applications of numerical optimization methods to helicopter design problems: A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey of applications of mathematical programming methods to improve the design of helicopters and their components is presented. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are covered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  15. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  16. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
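A minimal instance of the cellular-automaton idea mentioned above: an elementary automaton (Rule 90, where each cell becomes the XOR of its two neighbours) produces intricate global patterns from a trivial local rule, and because every update depends only on local state, all cells can be updated in parallel. The lattice width and single-seed start are arbitrary choices for illustration:

```python
def step_rule90(row):
    """One synchronous update: each cell becomes the XOR of its two
    neighbours (fixed zero boundary). Each cell depends only on local
    state, so the whole row can be updated in parallel."""
    n = len(row)
    return [(row[i - 1] if i > 0 else 0) ^ (row[i + 1] if i < n - 1 else 0)
            for i in range(n)]

row = [0, 0, 0, 1, 0, 0, 0]      # single seed cell
history = [row]
for _ in range(2):
    row = step_rule90(row)
    history.append(row)
for r in history:
    print("".join(".#"[c] for c in r))
```

From a single seed, Rule 90 traces out a Sierpinski-triangle pattern, a standard example of complex behaviour emerging from many very simple interacting parts.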

  17. Exploring Clinical and Epidemiological Characteristics of Interstitial Lung Diseases: Rationale, Aims, and Design of a Nationwide Prospective Registry--The EXCITING-ILD Registry.

    PubMed

    Kreuter, Michael; Herth, Felix J F; Wacker, Margarethe; Leidl, Reiner; Hellmann, Andreas; Pfeifer, Michael; Behr, Jürgen; Witt, Sabine; Kauschka, Dagmar; Mall, Marcus; Günther, Andreas; Markart, Philipp

    2015-01-01

    Despite a number of prospective registries conducted in past years, the current epidemiology of interstitial lung diseases (ILD) is still not well defined, particularly regarding the prevalence and incidence, their management, healthcare utilisation needs, and healthcare-associated costs. To address these issues in Germany, a new prospective ILD registry, "Exploring Clinical and Epidemiological Characteristics of Interstitial Lung Diseases" (EXCITING-ILD), is being conducted by the German Centre for Lung Research in association with ambulatory, inpatient, scientific pulmonology organisations and patient support groups. This multicentre, noninterventional, prospective, and observational ILD registry aims to collect comprehensive and validated data from all healthcare institutions on the incidence, prevalence, characteristics, management, and outcomes regarding all ILD presentations in the real-world setting. Specifically, this registry will collect demographic data, disease-related data such as ILD subtype, treatments, diagnostic procedures (e.g., HRCT, surgical lung biopsy), risk factors (e.g., familial ILD), significant comorbidities, ILD managements, and disease outcomes as well as healthcare resource consumption. The EXCITING-ILD registry will include in-patient and out-patient ILD healthcare facilities in more than 100 sites. In summary, this registry will document comprehensive and current epidemiological data as well as important health economic data for ILDs in Germany. PMID:26640781

  18. Exploring Clinical and Epidemiological Characteristics of Interstitial Lung Diseases: Rationale, Aims, and Design of a Nationwide Prospective Registry—The EXCITING-ILD Registry

    PubMed Central

    Kreuter, Michael; Herth, Felix J. F.; Wacker, Margarethe; Leidl, Reiner; Hellmann, Andreas; Pfeifer, Michael; Behr, Jürgen; Witt, Sabine; Kauschka, Dagmar; Mall, Marcus; Günther, Andreas; Markart, Philipp

    2015-01-01

    Despite a number of prospective registries conducted in past years, the current epidemiology of interstitial lung diseases (ILD) is still not well defined, particularly regarding the prevalence and incidence, their management, healthcare utilisation needs, and healthcare-associated costs. To address these issues in Germany, a new prospective ILD registry, “Exploring Clinical and Epidemiological Characteristics of Interstitial Lung Diseases” (EXCITING-ILD), is being conducted by the German Centre for Lung Research in association with ambulatory, inpatient, scientific pulmonology organisations and patient support groups. This multicentre, noninterventional, prospective, and observational ILD registry aims to collect comprehensive and validated data from all healthcare institutions on the incidence, prevalence, characteristics, management, and outcomes regarding all ILD presentations in the real-world setting. Specifically, this registry will collect demographic data, disease-related data such as ILD subtype, treatments, diagnostic procedures (e.g., HRCT, surgical lung biopsy), risk factors (e.g., familial ILD), significant comorbidities, ILD managements, and disease outcomes as well as healthcare resource consumption. The EXCITING-ILD registry will include in-patient and out-patient ILD healthcare facilities in more than 100 sites. In summary, this registry will document comprehensive and current epidemiological data as well as important health economic data for ILDs in Germany. PMID:26640781

  19. Stillbirth Collaborative Research Network: design, methods and recruitment experience.

    PubMed

    Parker, Corette B; Hogue, Carol J R; Koch, Matthew A; Willinger, Marian; Reddy, Uma M; Thorsten, Vanessa R; Dudley, Donald J; Silver, Robert M; Coustan, Donald; Saade, George R; Conway, Deborah; Varner, Michael W; Stoll, Barbara; Pinar, Halit; Bukowski, Radek; Carpenter, Marshall; Goldenberg, Robert

    2011-09-01

    The Stillbirth Collaborative Research Network (SCRN) has conducted a multisite, population-based, case-control study, with prospective enrollment of stillbirths and livebirths at the time of delivery. This paper describes the general design, methods and recruitment experience. The SCRN attempted to enroll all stillbirths and a representative sample of livebirths occurring to residents of pre-defined geographical catchment areas delivering at 59 hospitals associated with five clinical sites. Livebirths <32 weeks gestation and women of African descent were oversampled. The recruitment hospitals were chosen to ensure access to at least 90% of all stillbirths and livebirths to residents of the catchment areas. Participants underwent a standardised protocol including maternal interview, medical record abstraction, placental pathology, biospecimen testing and, in stillbirths, post-mortem examination. Recruitment began in March 2006 and was completed in September 2008 with 663 women with a stillbirth and 1932 women with a livebirth enrolled, representing 69% and 63%, respectively, of the women identified. Additional surveillance for stillbirths continued until June 2009 and a follow-up of the case-control study participants was completed in December 2009. Among consenting women, there were high consent rates for the various study components. For the women with stillbirths, 95% agreed to a maternal interview, chart abstraction and a placental pathological examination; 91% of the women with a livebirth agreed to all of these components. Additionally, 84% of the women with stillbirths agreed to a fetal post-mortem examination. This comprehensive study is poised to systematically study a wide range of potential causes of, and risk factors for, stillbirths and to better understand the scope and incidence of the problem.

  20. Stillbirth Collaborative Research Network: Design, Methods and Recruitment Experience

    PubMed Central

    Parker, Corette B.; Hogue, Carol J. Rowland; Koch, Matthew A.; Willinger, Marian; Reddy, Uma; Thorsten, Vanessa R.; Dudley, Donald J.; Silver, Robert M.; Coustan, Donald; Saade, George R.; Conway, Deborah; Varner, Michael W.; Stoll, Barbara; Pinar, Halit; Bukowski, Radek; Carpenter, Marshall; Goldenberg, Robert

    2013-01-01

    The Stillbirth Collaborative Research Network (SCRN) has conducted a multisite, population-based, case-control study, with prospective enrollment of stillbirths and live births at the time of delivery. This paper describes the general design, methods, and recruitment experience. The SCRN attempted to enroll all stillbirths and a representative sample of live births occurring to residents of pre-defined geographic catchment areas delivering at 59 hospitals associated with five clinical sites. Live births <32 weeks gestation and women of African descent were oversampled. The recruitment hospitals were chosen to ensure access to at least 90% of all stillbirths and live births to residents of the catchment areas. Participants underwent a standardized protocol including maternal interview, medical record abstraction, placental pathology, biospecimen testing, and, in stillbirths, postmortem examination. Recruitment began in March 2006 and was completed in September 2008 with 663 women with a stillbirth and 1932 women with a live birth enrolled, representing 69% and 63%, respectively, of the women identified. Additional surveillance for stillbirth continued through June 2009 and a follow-up of the case-control study participants was completed in December 2009. Among consenting women, there were high consent rates for the various study components. For the women with stillbirth, 95% agreed to maternal interview, chart abstraction, and placental pathologic examination; 91% of the women with live birth agreed to all of these components. Additionally, 84% of the women with stillbirth agreed to a fetal postmortem examination. This comprehensive study is poised to systematically study a wide range of potential causes of, and risk factors for, stillbirth and to better understand the scope and incidence of the problem. PMID:21819424
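Because the SCRN oversampled live births <32 weeks and women of African descent, population estimates from such a design are typically computed with inverse-probability (design) weights rather than raw sample proportions. A sketch with invented strata counts shows why the weighted and naive rates differ:

```python
# Invented numbers for illustration: a rare-outcome indicator measured in a
# sample that deliberately oversamples one stratum, as in the SCRN design.
# Each stratum: (population size N_h, sample size n_h, outcome count in sample)
strata = {
    "live births <32 weeks (oversampled)": (200, 100, 40),
    "all other live births":               (9800, 490, 49),
}

# Horvitz-Thompson-style estimate: weight each respondent by N_h / n_h.
est_events = sum(N / n * y for N, n, y in strata.values())
population = sum(N for N, _, _ in strata.values())
weighted_rate = est_events / population

# Naive (unweighted) rate, which over-represents the oversampled stratum:
unweighted_rate = (sum(y for _, _, y in strata.values())
                   / sum(n for _, n, _ in strata.values()))

print(round(weighted_rate, 3), round(unweighted_rate, 3))  # → 0.106 0.151
```

The naive rate is inflated because the high-risk stratum makes up a sixth of the sample but only 2% of the population; the design weights undo that distortion.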

  1. Methods for combining payload parameter variations with input environment. [calculating design limit loads compatible with probabilistic structural design criteria

    NASA Technical Reports Server (NTRS)

    Merchant, D. H.

    1976-01-01

    Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution that may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the method are also presented.
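As a worked illustration of the extreme-value idea: if the mission-maximum load is taken to be Gumbel-distributed (a common extreme-value model), the design limit load at a chosen non-exceedance probability follows by inverting the CDF. The Gumbel choice and the parameter values here are assumptions for illustration, not the paper's numbers:

```python
import math

def gumbel_design_load(mu, beta, p):
    """Invert the Gumbel CDF F(x) = exp(-exp(-(x - mu)/beta)) to find the
    load not exceeded in a mission with probability p."""
    return mu - beta * math.log(-math.log(p))

# Illustrative parameters (location mu, scale beta, in kN) -- invented.
mu, beta = 100.0, 10.0
limit_load = gumbel_design_load(mu, beta, 0.99)
print(round(limit_load, 1))   # ≈ 146.0 kN at the 99% level
```

The design limit load is then just a stated percentile of the mission-maximum load distribution, which is exactly the probabilistic criterion the abstract describes.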

  2. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

    The methodology of health risk analysis is, at the present stage of development of Russian society, in demand at all levels of state management. In conjunction with methods of mathematical modelling, spatial-temporal analysis and economic tools, risk assessment makes it possible to determine the level of safety of the population, workers and consumers, and to select priority resources and threat factors as points for the application of effort. At the planning stage, risk assessment is a basis for establishing the most effective measures for minimizing hazards and dangers. At the implementation stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision stage, it permits priorities to be selected so that efforts concentrate on the objects posing the greatest health risk to the population. Risk assessments, including elements of evolutionary modelling, are incorporated in the system of state hygienic regulation, the formation of the evidence base of harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products and services. Further tasks include applying the methodology of health risk analysis to assuring the sanitary and epidemiological well-being and health of workers; developing the informational and analytical base, in particular models of "exposure-response" dependencies for different types and levels of exposure and risk contingents; enhancing the accuracy of exposure estimates; and improving the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens. PMID:26155657

  3. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

    The methodology of health risk analysis is, at the present stage of development of Russian society, in demand at all levels of state management. In conjunction with methods of mathematical modelling, spatial-temporal analysis and economic tools, risk assessment makes it possible to determine the level of safety of the population, workers and consumers, and to select priority resources and threat factors as points for the application of effort. At the planning stage, risk assessment is a basis for establishing the most effective measures for minimizing hazards and dangers. At the implementation stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision stage, it permits priorities to be selected so that efforts concentrate on the objects posing the greatest health risk to the population. Risk assessments, including elements of evolutionary modelling, are incorporated in the system of state hygienic regulation, the formation of the evidence base of harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products and services. Further tasks include applying the methodology of health risk analysis to assuring the sanitary and epidemiological well-being and health of workers; developing the informational and analytical base, in particular models of "exposure-response" dependencies for different types and levels of exposure and risk contingents; enhancing the accuracy of exposure estimates; and improving the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens.
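The "exposure-response" dependencies the abstract calls for are often parameterized as log-linear relative-risk models, from which a population attributable fraction can be derived. The slope, dose, and exposure prevalence below are invented for illustration:

```python
import math

def relative_risk(dose, beta):
    """Log-linear exposure-response model: RR(d) = exp(beta * d)."""
    return math.exp(beta * dose)

def attributable_fraction(prevalence, rr):
    """Levin's population attributable fraction for a binary exposure."""
    return prevalence * (rr - 1.0) / (prevalence * (rr - 1.0) + 1.0)

beta = 0.02                              # invented slope per unit dose
rr = relative_risk(10.0, beta)           # relative risk at dose 10
paf = attributable_fraction(0.3, rr)     # 30% of the population exposed
print(round(rr, 3), round(paf, 3))
```

Such a model links measured exposure levels to expected health losses, which is the quantitative bridge between the hygienic regulation and economic-forecasting tasks the abstract lists.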

  4. Developing Baby Bag Design by Using Kansei Engineering Method

    NASA Astrophysics Data System (ADS)

    Janari, D.; Rakhmawati, A.

    2016-01-01

    Consumer preferences and market demand are essential factors for a product's success; thus, to succeed, a product should have a design that fulfils consumers' expectations. The purpose of this research is to develop a baby bag product as stipulated by Kansei. The Kansei words represented in the results are: neat, unique, comfortable, safe, modern, gentle, elegant, antique, attractive, simple, spacious, creative, colorful, durable, stylish, smooth and strong. The significance of the correlation identified for the durable attribute is 0.000 < 0.005, meaning it is significant for the baby bag, while its regression coefficient is 0.812 > 0.005, meaning the durable attribute is not significant for the baby bag. The final baby bag design, selected on the basis of questionnaire 3, combines all four designs: the clothes compartment, diaper compartment, shoulder grip, side grip, bottle-warmer pocket and bottle pocket are derived from design 1; the top grip, clothes compartment, shoulder grip and side grip from design 2; a clothes compartment from design 3; and the diaper and clothes compartments from design 4.

  5. Teaching Improvement Model Designed with DEA Method and Management Matrix

    ERIC Educational Resources Information Center

    Montoneri, Bernard

    2014-01-01

    This study uses student evaluation of teachers to design a teaching improvement matrix based on teaching efficiency and performance by combining management matrix and data envelopment analysis. This matrix is designed to formulate suggestions to improve teaching. The research sample consists of 42 classes of freshmen following a course of English…

  6. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  7. Development of Combinatorial Methods for Alloy Design and Optimization

    SciTech Connect

    Pharr, George M.; George, Easo P.; Santella, Michael L

    2005-07-01

    The primary goal of this research was to develop a comprehensive methodology for designing and optimizing metallic alloys by combinatorial principles. Because conventional techniques for alloy preparation are unavoidably restrictive in the range of alloy composition that can be examined, combinatorial methods promise to significantly reduce the time, energy, and expense needed for alloy design. Combinatorial methods can be developed not only to optimize existing alloys, but to explore and develop new ones as well. The scientific approach involved fabricating an alloy specimen with a continuous distribution of binary and ternary alloy compositions across its surface--an ''alloy library''--and then using spatially resolved probing techniques to characterize its structure, composition, and relevant properties. The three specific objectives of the project were: (1) to devise means by which simple test specimens with a library of alloy compositions spanning the range of interest can be produced; (2) to assess how well the properties of the combinatorial specimen reproduce those of the conventionally processed alloys; and (3) to devise screening tools which can be used to rapidly assess the important properties of the alloys. As proof of principle, the methodology was applied to the Fe-Ni-Cr ternary alloy system that constitutes many commercially important materials such as stainless steels and the H-series and C-series heat and corrosion resistant casting alloys. Three different techniques were developed for making alloy libraries: (1) vapor deposition of discrete thin films on an appropriate substrate and then alloying them together by solid-state diffusion; (2) co-deposition of the alloying elements from three separate magnetron sputtering sources onto an inert substrate; and (3) localized melting of thin films with a focused electron-beam welding system. Each of the techniques was found to have its own advantages and disadvantages. A new and very powerful technique for…

  8. Categorisation of visualisation methods to support the design of Human-Computer Interaction Systems.

    PubMed

    Li, Katie; Tiwari, Ashutosh; Alcock, Jeffrey; Bermell-Garcia, Pablo

    2016-07-01

    During the design of Human-Computer Interaction (HCI) systems, the creation of visual artefacts forms an important part of design. On one hand producing a visual artefact has a number of advantages: it helps designers to externalise their thought and acts as a common language between different stakeholders. On the other hand, if an inappropriate visualisation method is employed it could hinder the design process. To support the design of HCI systems, this paper reviews the categorisation of visualisation methods used in HCI. A keyword search is conducted to identify a) current HCI design methods, b) approaches of selecting these methods. The resulting design methods are filtered to create a list of just visualisation methods. These are then categorised using the approaches identified in (b). As a result 23 HCI visualisation methods are identified and categorised in 5 selection approaches (The Recipient, Primary Purpose, Visual Archetype, Interaction Type, and The Design Process).

  9. Categorisation of visualisation methods to support the design of Human-Computer Interaction Systems.

    PubMed

    Li, Katie; Tiwari, Ashutosh; Alcock, Jeffrey; Bermell-Garcia, Pablo

    2016-07-01

    During the design of Human-Computer Interaction (HCI) systems, the creation of visual artefacts forms an important part of design. On one hand producing a visual artefact has a number of advantages: it helps designers to externalise their thought and acts as a common language between different stakeholders. On the other hand, if an inappropriate visualisation method is employed it could hinder the design process. To support the design of HCI systems, this paper reviews the categorisation of visualisation methods used in HCI. A keyword search is conducted to identify a) current HCI design methods, b) approaches of selecting these methods. The resulting design methods are filtered to create a list of just visualisation methods. These are then categorised using the approaches identified in (b). As a result 23 HCI visualisation methods are identified and categorised in 5 selection approaches (The Recipient, Primary Purpose, Visual Archetype, Interaction Type, and The Design Process). PMID:26995039
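The categorisation described above can be represented as a simple mapping from each visualisation method to its value under each selection approach, which then supports filtering by approach. The entries below are invented placeholders, not the paper's actual 23-method catalogue:

```python
# Invented entries for illustration; the paper's real catalogue assigns
# 23 HCI visualisation methods to 5 selection approaches.
catalogue = {
    "storyboard":   {"Primary Purpose": "communicate flow", "Recipient": "stakeholders"},
    "wireframe":    {"Primary Purpose": "specify layout",   "Recipient": "developers"},
    "user journey": {"Primary Purpose": "communicate flow", "Recipient": "designers"},
}

def select(catalogue, approach, value):
    """Filter methods by one selection approach, e.g. Primary Purpose."""
    return sorted(m for m, cats in catalogue.items() if cats.get(approach) == value)

print(select(catalogue, "Primary Purpose", "communicate flow"))
```

A designer could query such a structure by any of the five approaches (Recipient, Primary Purpose, Visual Archetype, Interaction Type, The Design Process) to shortlist candidate visualisation methods.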

  10. A new method for designing dual foil electron beam forming systems. I. Introduction, concept of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor intensive task as corrections to account for the effects not included in the analytical models have to be calculated separately and accounted for in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry and using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automatized scan of the system performance in function of parameters of the foils. The new method, while being computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real life design problem, as described in Part II of this work.
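The automated scan described above can be sketched as a grid search in which each candidate foil configuration is scored; here a stand-in analytic function replaces the Monte Carlo transport run, and the parameter names and values are invented for illustration:

```python
# A stand-in figure of merit replaces the Monte Carlo transport simulation;
# the foil parameters and the scoring function are invented for illustration.
def beam_flatness_score(t1_mm, t2_mm):
    """Pretend 'simulation': best score at t1 = 0.3 mm, t2 = 1.0 mm."""
    return -((t1_mm - 0.3) ** 2 + (t2_mm - 1.0) ** 2)

def scan(primary_foils, secondary_foils):
    """Systematic, automated scan over the foil-parameter grid;
    keep the best-scoring pair, as in the proposed design method."""
    return max(
        ((t1, t2) for t1 in primary_foils for t2 in secondary_foils),
        key=lambda pair: beam_flatness_score(*pair),
    )

primary = [0.1, 0.2, 0.3, 0.4]        # primary foil thicknesses, mm
secondary = [0.5, 1.0, 1.5, 2.0]      # secondary foil thicknesses, mm
print(scan(primary, secondary))       # → (0.3, 1.0)
```

In the real method each grid point costs a full Monte Carlo run, which is why the approach is computationally intensive but needs little designer involvement.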

  11. The cryogenic balance design and balance calibration methods

    NASA Astrophysics Data System (ADS)

    Ewald, B.; Polanski, L.; Graewe, E.

    1992-07-01

    The current status of a program aimed at the development of a cryogenic balance for the European Transonic Wind Tunnel is reviewed. In particular, attention is given to the cryogenic balance design philosophy, mechanical balance design, reliability and accuracy, cryogenic balance calibration concept, and the concept of an automatic calibration machine. It is shown that the use of the automatic calibration machine will improve the accuracy of calibration while reducing the man power and time required for balance calibration.

  12. Advanced transonic fan design procedure based on a Navier-Stokes method

    NASA Astrophysics Data System (ADS)

    Rhie, C. M.; Zacharias, R. M.; Hobbs, D. E.; Sarathy, K. P.; Biederman, B. P.; Lejambre, C. R.; Spear, D. A.

    1994-04-01

    A fan performance analysis method based upon three-dimensional steady Navier-Stokes equations is presented in this paper. Its accuracy is established through extensive code validation effort. Validation data comparisons ranging from a two-dimensional compressor cascade to three-dimensional fans are shown in this paper to highlight the accuracy and reliability of the code. The overall fan design procedure using this code is then presented. Typical results of this design process are shown for a current engine fan design. This new design method introduces a major improvement over the conventional design methods based on inviscid flow and boundary layer concepts. Using the Navier-Stokes design method, fan designers can confidently refine their designs prior to rig testing. This results in reduced rig testing and cost savings as the bulk of the iteration between design and experimental verification is transferred to an iteration between design and computational verification.

  13. Epidemiology, Molecular Epidemiology and Evolution of Bovine Respiratory Syncytial Virus

    PubMed Central

    Sarmiento-Silva, Rosa Elena; Nakamura-Lopez, Yuko; Vaughan, Gilberto

    2012-01-01

    The bovine respiratory syncytial virus (BRSV) is an enveloped, negative sense, single-stranded RNA virus belonging to the pneumovirus genus within the family Paramyxoviridae. BRSV has been recognized as a major cause of respiratory disease in young calves since the early 1970s. The analysis of BRSV infection was originally hampered by its characteristic lability and poor growth in vitro. However, the advent of numerous immunological and molecular methods has facilitated the study of BRSV enormously. The knowledge gained from these studies has also provided the opportunity to develop safe, stable, attenuated virus vaccine candidates. Nonetheless, many aspects of the epidemiology, molecular epidemiology and evolution of the virus are still not fully understood. The natural course of infection is rather complex and further complicates diagnosis, treatment and the implementation of preventive measures aimed to control the disease. Therefore, understanding the mechanisms by which BRSV is able to establish infection is needed to prevent viral and disease spread. This review discusses important information regarding the epidemiology and molecular epidemiology of BRSV worldwide, and it highlights the importance of viral evolution in virus transmission. PMID:23202546

  14. Aircraft design for mission performance using nonlinear multiobjective optimization methods

    NASA Technical Reports Server (NTRS)

    Dovi, Augustine R.; Wrenn, Gregory A.

    1990-01-01

    A new technique which converts a constrained optimization problem to an unconstrained one where conflicting figures of merit may be simultaneously considered was combined with a complex mission analysis system. The method is compared with existing single and multiobjective optimization methods. A primary benefit from this new method for multiobjective optimization is the elimination of separate optimizations for each objective, which is required by some optimization methods. A typical wide body transport aircraft is used for the comparative studies.
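    One standard way to fold conflicting figures of merit into a single unconstrained objective is a Kreisselmeier-Steinhauser (KS) envelope; whether this matches the paper's exact formulation is an assumption, and the two objectives below are invented stand-ins:

```python
import math

def ks(values, rho=50.0):
    # Kreisselmeier-Steinhauser envelope: a smooth approximation of
    # max(values) that tightens as rho grows, letting conflicting figures
    # of merit be minimized together as one unconstrained objective.
    m = max(values)  # factor out the max for numerical stability
    return m + math.log(sum(math.exp(rho * (v - m)) for v in values)) / rho

def combined(x):
    # Two conflicting, hypothetical figures of merit in one design
    # variable x (stand-ins for, e.g., fuel burn and gross weight);
    # scaled constraints can be folded into the same envelope.
    f1 = (x - 2.0) ** 2
    f2 = (x + 1.0) ** 2
    return ks([f1, f2])

# Crude grid scan in place of a gradient-based unconstrained optimizer;
# by symmetry the min-max compromise lies at x = 0.5.
best = min((combined(x / 100.0), x / 100.0) for x in range(-300, 400))
```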

  15. A Preliminary Rubric Design to Evaluate Mixed Methods Research

    ERIC Educational Resources Information Center

    Burrows, Timothy J.

    2013-01-01

    With the increase in frequency of the use of mixed methods, both in research publications and in externally funded grants there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed…

  16. [Epidemiology of Kawasaki disease].

    PubMed

    Nakamura, Yosikazu

    2014-09-01

    Epidemiologic features of Kawasaki disease, particularly in Japan, are summarized. The epidemiology has three aspects: (1) frequency (descriptive epidemiology), (2) risk factors (e.g., case-control studies), and (3) natural history (follow-up studies). The nationwide surveys, established in 1970, have revealed the epidemiologic features of the disease. The number of patients and the incidence rate have risen since the mid-1990s. Descriptive features indicate an association between disease onset and both infection and host factors. A follow-up study spanning more than 20 years has been conducted, and it should be continued until all the participants have died.

  17. [Molecular epidemiology in the epidemiological transition].

    PubMed

    Tapia-Conyer, R

    1997-01-01

    The epidemiological transition describes the changes in the health profile of populations in which infectious diseases are supplanted by chronic or non-communicable diseases. Yet even in industrialized countries, infectious diseases emerge as important public health problems, with very important associations with several types of neoplasm. Molecular epidemiology brings new tools to the study of the epidemiological transition by uncovering infectious agents as the etiology of diseases, neither of them new. Much progress has been made in understanding the virulence and resistance mechanisms of different strains, and in improving knowledge of the transmission dynamics and dissemination pathways of infectious diseases. As for non-communicable diseases, molecular epidemiology has enhanced the identification of endogenous risk factors linked to molecular alterations in genetic material, which will allow a more detailed definition of risk and the identification of individuals and groups at risk for several diseases. The potential impact of molecular epidemiology in other areas, such as the environment, lifestyle and nutrition, is illustrated with several examples. PMID:9504120

  18. Statistical Methods for Rapid Aerothermal Analysis and Design Technology

    NASA Technical Reports Server (NTRS)

    Morgan, Carolyn; DePriest, Douglas; Thompson, Richard (Technical Monitor)

    2002-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to establish statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The research work was focused on establishing the suitable mathematical/statistical models for these purposes. It is anticipated that the resulting models can be incorporated into a software tool to provide rapid, variable-fidelity, aerothermal environments to predict heating along an arbitrary trajectory. This work will support development of an integrated design tool to perform automated thermal protection system (TPS) sizing and material selection.
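    As a sketch of the statistical-model idea, a least-squares response surface can stand in for a high-fidelity aerothermal code once trained on a few samples, then supply cheap predictions along a trajectory; the Mach/heating data below are invented:

```python
import numpy as np

# Fit a quadratic response surface q = a + b*M + c*M^2 to a handful of
# (Mach, heating-rate) samples from a hypothetical high-fidelity code.
mach = np.array([4.0, 5.0, 6.0, 7.0, 8.0])
q_dot = np.array([12.1, 19.8, 29.5, 41.6, 55.9])  # invented heating values

A = np.vstack([np.ones_like(mach), mach, mach**2]).T  # design matrix
coeffs, *_ = np.linalg.lstsq(A, q_dot, rcond=None)

def surrogate(m):
    # Cheap surrogate evaluation, suitable for rapid trajectory sweeps.
    a, b, c = coeffs
    return a + b * m + c * m**2
```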

  19. A method for designing robust multivariable feedback systems

    NASA Technical Reports Server (NTRS)

    Milich, David Albert; Athans, Michael; Valavani, Lena; Stein, Gunter

    1988-01-01

    A new methodology is developed for the synthesis of linear, time-invariant (LTI) controllers for multivariable LTI systems. The aim is to achieve stability and performance robustness of the feedback system in the presence of multiple unstructured uncertainty blocks; i.e., to satisfy a frequency-domain inequality in terms of the structured singular value. The design technique is referred to as the Causality Recovery Methodology (CRM). Starting with an initial (nominally) stabilizing compensator, the CRM produces a closed-loop system whose performance robustness is at least as good as, and hopefully superior to, that of the original design. The robustness improvement is obtained by solving an infinite-dimensional, convex optimization program. A finite-dimensional implementation of the CRM was developed, and it was applied to a multivariable design example.

  20. A design method for an intuitive web site

    SciTech Connect

    Quinniey, M.L.; Diegert, K.V.; Baca, B.G.; Forsythe, J.C.; Grose, E.

    1999-11-03

    The paper describes a methodology for designing a web site for human factors engineers, one that is applicable to designing a web site for any group of people. Many web pages on the World Wide Web are not organized in a format that allows a user to find information efficiently. Often the information and hypertext links on web pages are not organized into intuitive groups. Intuition implies that a person is able to use their knowledge of a paradigm to solve a problem. Intuitive groups are categories that allow web page users to find information by using their intuition or mental models of categories. To improve human factors engineers' efficiency in finding information on the World Wide Web, research was performed to develop a web site that serves as a tool for finding information effectively. The paper describes a methodology for designing a web site for a group of people who perform similar tasks in an organization.

  1. Advanced 3D inverse method for designing turbomachine blades

    SciTech Connect

    Dang, T.

    1995-10-01

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbo-machine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  2. Preliminary Axial Flow Turbine Design and Off-Design Performance Analysis Methods for Rotary Wing Aircraft Engines. Part 1; Validation

    NASA Technical Reports Server (NTRS)

    Chen, Shu-cheng, S.

    2009-01-01

    For the preliminary design and the off-design performance analysis of axial flow turbines, a pair of intermediate level-of-fidelity computer codes, TD2-2 (design; reference 1) and AXOD (off-design; reference 2), are being evaluated for use in turbine design and performance prediction for modern high performance aircraft engines. TD2-2 employs a streamline curvature method for design, while AXOD approaches the flow analysis with an equal radius-height domain decomposition strategy. Both methods resolve only the flows in the annulus region while modeling the impact introduced by the blade rows. The mathematical formulations and derivations involved in both methods are documented in references 3 and 4 (for TD2-2) and in reference 5 (for AXOD). The focus of this paper is to discuss the fundamental issues of applicability and compatibility of the two codes, as a pair of companion pieces, for performing preliminary design and off-design analysis of modern aircraft engine turbines. Two validation cases for design and off-design prediction using TD2-2 and AXOD, conducted on two existing high efficiency turbines developed and tested in the NASA/GE Energy Efficient Engine (GE-E3) Program, the High Pressure Turbine (HPT; two stages, air cooled) and the Low Pressure Turbine (LPT; five stages, uncooled), are provided in support of the analysis and discussion presented in this paper.

  3. Convergence of controllers designed using state space methods

    NASA Technical Reports Server (NTRS)

    Morris, K. A.

    1991-01-01

    The convergence of finite dimensional controllers for infinite dimensional systems designed using approximations is examined. Stable coprime factorization theory is used to show that under the standard assumptions of uniform stabilizability/detectability, the controllers stabilize the original system for large enough model order. The controllers converge uniformly to an infinite dimensional controller, as does the closed loop response.

  4. Improved Methods for Classification, Prediction and Design of Antimicrobial Peptides

    PubMed Central

    Wang, Guangshun

    2015-01-01

    Peptides with diverse amino acid sequences, structures and functions are essential players in biological systems. The construction of well-annotated databases not only facilitates effective information management, search and mining, but also lays the foundation for developing and testing new peptide algorithms and machines. The antimicrobial peptide database (APD) is an original construction in terms of both database design and peptide entries. The host defense antimicrobial peptides (AMPs) registered in the APD cover the five kingdoms (bacteria, protists, fungi, plants, and animals) or three domains of life (bacteria, archaea, and eukaryota). This comprehensive database (http://aps.unmc.edu/AP) provides useful information on peptide discovery timeline, nomenclature, classification, glossary, calculation tools, and statistics. The APD enables effective search, prediction, and design of peptides with antibacterial, antiviral, antifungal, antiparasitic, insecticidal, spermicidal, or anticancer activities, or with chemotactic, immune-modulating, or anti-oxidative properties. A universal classification scheme is proposed herein to unify innate immunity peptides from a variety of biological sources. As an improvement, the upgraded APD makes predictions based on the database-defined parameter space and provides a list of the sequences most similar to natural AMPs. In addition, the powerful pipeline design of the database search engine lays a solid basis for designing novel antimicrobials to combat resistant superbugs, viruses, fungi or parasites. This comprehensive AMP database is a useful tool for both research and education. PMID:25555720
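    The kind of sequence parameters such a database-defined space can be built over (length, net charge, hydrophobic fraction) can be sketched as follows; the simplified charge model and residue sets are assumptions, not the APD's exact definitions:

```python
# Simplified residue sets: aliphatic/aromatic hydrophobics, and a charge
# model that counts K/R as +1 and D/E as -1 (ignores His and termini).
HYDROPHOBIC = set("AILMFWVC")
POSITIVE, NEGATIVE = set("KR"), set("DE")

def amp_parameters(seq):
    # Compute a small parameter vector for one peptide sequence.
    seq = seq.upper()
    charge = sum(aa in POSITIVE for aa in seq) - sum(aa in NEGATIVE for aa in seq)
    hydro = sum(aa in HYDROPHOBIC for aa in seq) / len(seq)
    return {"length": len(seq), "net_charge": charge,
            "hydrophobic_frac": round(hydro, 2)}

# Magainin 2, a well-known host defense peptide.
params = amp_parameters("GIGKFLHSAKKFGKAFVGEIMNS")
```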

  5. Designing green corrosion inhibitors using chemical computation methods

    SciTech Connect

    Singhl, W.P.; Lin, G.; Bockris, J.O.M.; Kang, Y.

    1998-12-31

    Green corrosion inhibitors have been designed by understanding the relationships between the structure of organic compounds and both toxicity and corrosion inhibition efficiency. Estimates of aquatic toxicity and corrosion inhibition efficiency are made using QSAR techniques. The predicted structures with reduced toxicity and improved corrosion inhibition efficiency are then tested experimentally for these properties, leading to green inhibitors.
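    A QSAR screen of the kind described can be sketched as a regression from computed descriptors to measured efficiency; every number below is invented for illustration (the data happen to be exactly linear so the recovered model is easy to check):

```python
import numpy as np

# Hypothetical QSAR: regress % inhibition efficiency on two
# quantum-chemical descriptors, HOMO energy (eV) and dipole moment (D).
X = np.array([
    [-9.1, 2.0],
    [-8.7, 3.5],
    [-8.9, 2.2],
    [-8.4, 3.0],
    [-8.2, 4.4],
])
y = np.array([69.0, 80.5, 72.0, 81.0, 90.0])  # invented efficiencies

# Ordinary least squares with an intercept column.
A = np.hstack([np.ones((len(X), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_efficiency(e_homo, dipole):
    # Screen a candidate molecule before synthesizing and testing it.
    return coef[0] + coef[1] * e_homo + coef[2] * dipole
```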

  6. A Prospective Method to Guide Small Molecule Drug Design

    ERIC Educational Resources Information Center

    Johnson, Alan T.

    2015-01-01

    At present, small molecule drug design follows a retrospective path when considering what analogs are to be made around a current hit or lead molecule with the focus often on identifying a compound with higher intrinsic potency. What this approach overlooks is the simultaneous need to also improve the physicochemical (PC) and pharmacokinetic (PK)…

  7. Library Design Analysis Using Post-Occupancy Evaluation Methods.

    ERIC Educational Resources Information Center

    James, Dennis C.; Stewart, Sharon L.

    1995-01-01

    Presents findings of a user-based study of the interior of Rodger's Science and Engineering Library at the University of Alabama. Compared facility evaluations from faculty, library staff, and graduate and undergraduate students. Features evaluated include: acoustics, aesthetics, book stacks, design, finishes/materials, furniture, lighting,…

  8. Improved methods for classification, prediction, and design of antimicrobial peptides.

    PubMed

    Wang, Guangshun

    2015-01-01

    Peptides with diverse amino acid sequences, structures, and functions are essential players in biological systems. The construction of well-annotated databases not only facilitates effective information management, search, and mining but also lays the foundation for developing and testing new peptide algorithms and machines. The antimicrobial peptide database (APD) is an original construction in terms of both database design and peptide entries. The host defense antimicrobial peptides (AMPs) registered in the APD cover the five kingdoms (bacteria, protists, fungi, plants, and animals) or three domains of life (bacteria, archaea, and eukaryota). This comprehensive database (http://aps.unmc.edu/AP) provides useful information on peptide discovery timeline, nomenclature, classification, glossary, calculation tools, and statistics. The APD enables effective search, prediction, and design of peptides with antibacterial, antiviral, antifungal, antiparasitic, insecticidal, spermicidal, or anticancer activities, or with chemotactic, immune-modulating, or antioxidative properties. A universal classification scheme is proposed herein to unify innate immunity peptides from a variety of biological sources. As an improvement, the upgraded APD makes predictions based on the database-defined parameter space and provides a list of the sequences most similar to natural AMPs. In addition, the powerful pipeline design of the database search engine lays a solid basis for designing novel antimicrobials to combat resistant superbugs, viruses, fungi, or parasites. This comprehensive AMP database is a useful tool for both research and education.

  9. A Pareto-optimal refinement method for protein design scaffolds.

    PubMed

    Nivón, Lucas Gregorio; Moretti, Rocco; Baker, David

    2013-01-01

    Computational design of protein function involves a search for amino acids with the lowest energy subject to a set of constraints specifying function. In many cases a set of natural protein backbone structures, or "scaffolds", are searched to find regions where functional sites (an enzyme active site, ligand binding pocket, protein-protein interaction region, etc.) can be placed, and the identities of the surrounding amino acids are optimized to satisfy functional constraints. Input native protein structures almost invariably have regions that score very poorly with the design force field, and any design based on these unmodified structures may result in mutations away from the native sequence solely as a result of the energetic strain. Because the input structure is already a stable protein, it is desirable to keep the total number of mutations to a minimum and to avoid mutations resulting from poorly-scoring input structures. Here we describe a protocol using cycles of minimization with combined backbone/sidechain restraints that is Pareto-optimal with respect to RMSD to the native structure and energetic strain reduction. The protocol should be broadly useful in the preparation of scaffold libraries for functional site design.
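    The Pareto criterion, restricted to the two axes named in the abstract (RMSD to native and design-force-field energy, both minimized), can be sketched as follows; the candidate values are invented:

```python
# Among candidate relaxed scaffolds, keep those for which no other
# candidate is simultaneously at-or-below in both RMSD and energy.
candidates = [
    ("c1", 0.2, -310.0),
    ("c2", 0.4, -335.0),
    ("c3", 0.3, -320.0),
    ("c4", 0.5, -330.0),  # dominated by c2: higher RMSD and higher energy
    ("c5", 0.8, -350.0),
]

def pareto_front(points):
    # Return the names of non-dominated (Pareto-optimal) candidates.
    front = []
    for name, rmsd, energy in points:
        dominated = any(
            r <= rmsd and e <= energy and (r, e) != (rmsd, energy)
            for _, r, e in points
        )
        if not dominated:
            front.append(name)
    return front
```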

  10. Study of design and analysis methods for transonic flow

    NASA Technical Reports Server (NTRS)

    Murman, E. M.

    1977-01-01

    An airfoil design program and a boundary layer analysis were developed. Boundary conditions were derived for ventilated transonic wind tunnels, and transonic wind-tunnel wall calculations were performed. A computational procedure for rotational transonic flow in engine inlet throats was formulated. Results and conclusions are summarized.

  11. Multicenter study of epidemiological cutoff values and detection of resistance in Candida spp. to anidulafungin, caspofungin, and micafungin using the Sensititre YeastOne colorimetric method.

    PubMed

    Espinel-Ingroff, A; Alvarez-Fernandez, M; Cantón, E; Carver, P L; Chen, S C-A; Eschenauer, G; Getsinger, D L; Gonzalez, G M; Govender, N P; Grancini, A; Hanson, K E; Kidd, S E; Klinker, K; Kubin, C J; Kus, J V; Lockhart, S R; Meletiadis, J; Morris, A J; Pelaez, T; Quindós, G; Rodriguez-Iglesias, M; Sánchez-Reus, F; Shoham, S; Wengenack, N L; Borrell Solé, N; Echeverria, J; Esperalba, J; Gómez-G de la Pedrosa, E; García García, I; Linares, M J; Marco, F; Merino, P; Pemán, J; Pérez Del Molino, L; Roselló Mayans, E; Rubio Calvo, C; Ruiz Pérez de Pipaon, M; Yagüe, G; Garcia-Effron, G; Guinea, J; Perlin, D S; Sanguinetti, M; Shields, R; Turnidge, J

    2015-11-01

    Neither breakpoints (BPs) nor epidemiological cutoff values (ECVs) have been established for Candida spp. with anidulafungin, caspofungin, and micafungin when using the Sensititre YeastOne (SYO) broth dilution colorimetric method. In addition, reference caspofungin MICs have so far proven to be unreliable. Candida species wild-type (WT) MIC distributions (for microorganisms in a species/drug combination with no detectable phenotypic resistance) were established for 6,007 Candida albicans, 186 C. dubliniensis, 3,188 C. glabrata complex, 119 C. guilliermondii, 493 C. krusei, 205 C. lusitaniae, 3,136 C. parapsilosis complex, and 1,016 C. tropicalis isolates. SYO MIC data gathered from 38 laboratories in Australia, Canada, Europe, Mexico, New Zealand, South Africa, and the United States were pooled to statistically define SYO ECVs. ECVs for anidulafungin, caspofungin, and micafungin encompassing ≥97.5% of the statistically modeled population were, respectively, 0.12, 0.25, and 0.06 μg/ml for C. albicans, 0.12, 0.25, and 0.03 μg/ml for C. glabrata complex, 4, 2, and 4 μg/ml for C. parapsilosis complex, 0.5, 0.25, and 0.06 μg/ml for C. tropicalis, 0.25, 1, and 0.25 μg/ml for C. krusei, 0.25, 1, and 0.12 μg/ml for C. lusitaniae, 4, 2, and 2 μg/ml for C. guilliermondii, and 0.25, 0.25, and 0.12 μg/ml for C. dubliniensis. Species-specific SYO ECVs for anidulafungin, caspofungin, and micafungin correctly classified 72 (88.9%), 74 (91.4%), and 76 (93.8%), respectively, of 81 Candida isolates with identified fks mutations. SYO ECVs may aid in detecting non-WT isolates with reduced susceptibility to anidulafungin, micafungin, and especially caspofungin, since testing the susceptibilities of Candida spp. to caspofungin by reference methodologies is not recommended. PMID:26282428
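    The classification rule an ECV supports is simple to state: an isolate whose MIC exceeds the species ECV is flagged as non-wild-type for that agent. A minimal sketch using the C. albicans values quoted above (the isolate MICs are invented):

```python
# ECVs for C. albicans from the abstract: anidulafungin 0.12,
# caspofungin 0.25, micafungin 0.06 ug/ml.
ECV_C_ALBICANS = {"anidulafungin": 0.12, "caspofungin": 0.25, "micafungin": 0.06}

def classify(mics, ecvs=ECV_C_ALBICANS):
    # Return 'WT' or 'non-WT' per agent for a dict of MICs (ug/ml).
    return {agent: ("WT" if mic <= ecvs[agent] else "non-WT")
            for agent, mic in mics.items()}

# A hypothetical isolate with an elevated caspofungin MIC.
result = classify({"anidulafungin": 0.06, "caspofungin": 1.0, "micafungin": 0.03})
```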

  12. Multicenter Study of Epidemiological Cutoff Values and Detection of Resistance in Candida spp. to Anidulafungin, Caspofungin, and Micafungin Using the Sensititre YeastOne Colorimetric Method

    PubMed Central

    Alvarez-Fernandez, M.; Cantón, E.; Carver, P. L.; Chen, S. C.-A.; Eschenauer, G.; Getsinger, D. L.; Gonzalez, G. M.; Grancini, A.; Hanson, K. E.; Kidd, S. E.; Klinker, K.; Kubin, C. J.; Kus, J. V.; Lockhart, S. R.; Meletiadis, J.; Morris, A. J.; Pelaez, T.; Rodriguez-Iglesias, M.; Sánchez-Reus, F.; Shoham, S.; Wengenack, N. L.; Borrell Solé, N.; Echeverria, J.; Esperalba, J.; Gómez-G. de la Pedrosa, E.; García García, I.; Linares, M. J.; Marco, F.; Merino, P.; Pemán, J.; Pérez del Molino, L.; Roselló Mayans, E.; Rubio Calvo, C.; Ruiz Pérez de Pipaon, M.; Yagüe, G.; Garcia-Effron, G.; Perlin, D. S.; Sanguinetti, M.; Shields, R.; Turnidge, J.

    2015-01-01

    Neither breakpoints (BPs) nor epidemiological cutoff values (ECVs) have been established for Candida spp. with anidulafungin, caspofungin, and micafungin when using the Sensititre YeastOne (SYO) broth dilution colorimetric method. In addition, reference caspofungin MICs have so far proven to be unreliable. Candida species wild-type (WT) MIC distributions (for microorganisms in a species/drug combination with no detectable phenotypic resistance) were established for 6,007 Candida albicans, 186 C. dubliniensis, 3,188 C. glabrata complex, 119 C. guilliermondii, 493 C. krusei, 205 C. lusitaniae, 3,136 C. parapsilosis complex, and 1,016 C. tropicalis isolates. SYO MIC data gathered from 38 laboratories in Australia, Canada, Europe, Mexico, New Zealand, South Africa, and the United States were pooled to statistically define SYO ECVs. ECVs for anidulafungin, caspofungin, and micafungin encompassing ≥97.5% of the statistically modeled population were, respectively, 0.12, 0.25, and 0.06 μg/ml for C. albicans, 0.12, 0.25, and 0.03 μg/ml for C. glabrata complex, 4, 2, and 4 μg/ml for C. parapsilosis complex, 0.5, 0.25, and 0.06 μg/ml for C. tropicalis, 0.25, 1, and 0.25 μg/ml for C. krusei, 0.25, 1, and 0.12 μg/ml for C. lusitaniae, 4, 2, and 2 μg/ml for C. guilliermondii, and 0.25, 0.25, and 0.12 μg/ml for C. dubliniensis. Species-specific SYO ECVs for anidulafungin, caspofungin, and micafungin correctly classified 72 (88.9%), 74 (91.4%), and 76 (93.8%), respectively, of 81 Candida isolates with identified fks mutations. SYO ECVs may aid in detecting non-WT isolates with reduced susceptibility to anidulafungin, micafungin, and especially caspofungin, since testing the susceptibilities of Candida spp. to caspofungin by reference methodologies is not recommended. PMID:26282428

  13. Overview of control design methods for smart structural system

    NASA Astrophysics Data System (ADS)

    Rao, Vittal S.; Sana, Sridhar

    2001-08-01

    Smart structures are the result of effectively integrating control system design and signal processing with structural systems, so as to maximally exploit new advances in materials for structures, actuation and sensing, and to obtain the best performance for the application at hand. Research in smart structures is constantly driving towards the self-adaptive and diagnostic capabilities that biological systems possess. This is manifested in the number of successful applications in many areas of engineering, such as aerospace, civil and automotive systems. Instrumental in the development of such systems are smart materials, such as piezoelectric, shape memory alloy, electrostrictive, magnetostrictive and fiber-optic materials and various composites, for use as actuators, sensors and structural members. The need for control systems that maximally utilize smart actuation and sensing materials to realize highly distributed and highly adaptable controllers has spurred research in smart structural modeling, identification, actuator/sensor design and placement, and control system design, including adaptive and robust controllers built with tools such as neural networks, fuzzy logic, genetic algorithms and linear matrix inequalities, along with electronics for controller implementation: analog electronics, microcontrollers, digital signal processors (DSPs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), multichip modules (MCMs), etc. In this paper, we give a brief overview of the state of control in smart structures. Different aspects of the development of smart structures, such as applications, technology and theoretical advances, especially in the area of control system design and implementation, are covered.

  14. Optimal reliability design method for remote solar systems

    NASA Astrophysics Data System (ADS)

    Suwapaet, Nuchida

    A unique optimal reliability design algorithm is developed for remote communication systems. The algorithm either minimizes the unavailability of the system within a fixed cost or minimizes the cost of the system subject to an unavailability constraint. The unavailability of the system is a function of three possible failure occurrences: individual component breakdown, solar energy deficiency (loss of load probability), and satellite/radio transmission loss. The three mathematical models of component failure, solar power failure, and transmission failure are combined and formulated as a nonlinear programming optimization problem with binary decision variables, such as the number and type (or size) of photovoltaic modules, batteries, radios, antennas, and controllers. The three possible failures are identified and integrated in a computer algorithm to generate the parameters for the optimization algorithm. The optimization algorithm is implemented with a branch-and-bound solution technique in MS Excel Solver. The algorithm is applied to a case study design for an actual system that will be set up in remote mountainous areas of Peru. The automated algorithm is verified with independent calculations. The optimal results from minimizing the unavailability of the system under the cost constraint and from minimizing the total cost of the system under the unavailability constraint are consistent with each other. The tradeoff feature in the algorithm allows designers to observe the results of 'what-if' scenarios of relaxing constraint bounds, thus obtaining the most benefit from the optimization process. An example of this approach applied to an existing communication system in the Andes shows dramatic improvement in reliability for little increase in cost. The algorithm is a real design tool, unlike other existing simulation design tools. The algorithm should be useful for other stochastic systems where component reliability, random supply and demand, and communication are…
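    The core optimization can be sketched with a toy version: integer redundancy choices per subsystem, series-parallel unavailability, and a budget constraint. All component figures below are invented, and exhaustive enumeration stands in for the paper's branch-and-bound (which also folds in loss-of-load and transmission-loss models):

```python
from itertools import product

# Per-unit (unavailability, cost) for three hypothetical subsystems.
components = {
    "pv":      (0.05, 300.0),
    "battery": (0.10, 150.0),
    "radio":   (0.02, 500.0),
}
BUDGET = 2000.0

def system_unavailability(counts):
    # Parallel redundancy within each subsystem, subsystems in series:
    # a subsystem with n units fails only if all n fail (q ** n).
    avail = 1.0
    for name, n in counts.items():
        q, _ = components[name]
        avail *= 1.0 - q ** n
    return 1.0 - avail

# Enumerate 1..3 units per subsystem; keep the feasible minimum.
best = None
for ns in product(range(1, 4), repeat=3):
    counts = dict(zip(components, ns))
    cost = sum(n * components[k][1] for k, n in counts.items())
    if cost <= BUDGET:
        u = system_unavailability(counts)
        if best is None or u < best[0]:
            best = (u, counts, cost)
```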

  16. Consumers' Kansei Needs Clustering Method for Product Emotional Design Based on Numerical Design Structure Matrix and Genetic Algorithms

    PubMed Central

    Chen, Deng-kai; Gu, Rong; Gu, Yu-feng; Yu, Sui-huai

    2016-01-01

    Consumers' Kansei needs reflect their perception about a product and always consist of a large number of adjectives. Reducing the dimension complexity of these needs to extract primary words not only enables the target product to be explicitly positioned, but also provides a convenient design basis for designers engaging in design work. Accordingly, this study employs a numerical design structure matrix (NDSM) by parameterizing a conventional DSM and integrating genetic algorithms to find optimum Kansei clusters. A four-point scale method is applied to assign link weights of every two Kansei adjectives as values of cells when constructing an NDSM. Genetic algorithms are used to cluster the Kansei NDSM and find optimum clusters. Furthermore, the process of the proposed method is presented. The details of the proposed approach are illustrated using an example of electronic scooter for Kansei needs clustering. The case study reveals that the proposed method is promising for clustering Kansei needs adjectives in product emotional design. PMID:27630709
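    The NDSM construction and clustering step can be sketched at toy scale: four invented adjectives, four-point-scale link weights (0 = unrelated to 3 = strongly related), and a brute-force search over two-cluster partitions standing in for the paper's genetic algorithm:

```python
from itertools import combinations

adjectives = ["sporty", "dynamic", "elegant", "refined"]
weights = {  # symmetric four-point-scale ratings between adjective pairs
    ("sporty", "dynamic"): 3, ("sporty", "elegant"): 1,
    ("sporty", "refined"): 0, ("dynamic", "elegant"): 1,
    ("dynamic", "refined"): 0, ("elegant", "refined"): 3,
}

def intra_cluster_weight(cluster):
    # Sum of link weights between all pairs inside one cluster.
    return sum(weights.get((a, b), weights.get((b, a), 0))
               for a, b in combinations(sorted(cluster), 2))

# Exhaustively score every split into two non-empty clusters and keep
# the partition with the highest total intra-cluster weight.
best = None
for r in range(1, len(adjectives)):
    for group in combinations(adjectives, r):
        rest = [a for a in adjectives if a not in group]
        score = intra_cluster_weight(group) + intra_cluster_weight(rest)
        if best is None or score > best[0]:
            best = (score, set(group), set(rest))
```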

  18. Consumers' Kansei Needs Clustering Method for Product Emotional Design Based on Numerical Design Structure Matrix and Genetic Algorithms.

    PubMed

    Yang, Yan-Pu; Chen, Deng-Kai; Gu, Rong; Gu, Yu-Feng; Yu, Sui-Huai

    2016-01-01

    Consumers' Kansei needs reflect their perception about a product and always consist of a large number of adjectives. Reducing the dimension complexity of these needs to extract primary words not only enables the target product to be explicitly positioned, but also provides a convenient design basis for designers engaging in design work. Accordingly, this study employs a numerical design structure matrix (NDSM) by parameterizing a conventional DSM and integrating genetic algorithms to find optimum Kansei clusters. A four-point scale method is applied to assign link weights of every two Kansei adjectives as values of cells when constructing an NDSM. Genetic algorithms are used to cluster the Kansei NDSM and find optimum clusters. Furthermore, the process of the proposed method is presented. The details of the proposed approach are illustrated using an example of electronic scooter for Kansei needs clustering. The case study reveals that the proposed method is promising for clustering Kansei needs adjectives in product emotional design. PMID:27630709

  19. Epidemiology as discourse: the politics of development institutions in the Epidemiological Profile of El Salvador

    PubMed Central

    Aviles, L

    2001-01-01

    STUDY OBJECTIVE—To determine the ways in which institutions devoted to international development influence epidemiological studies.
DESIGN—This article takes a descriptive epidemiological study of El Salvador, Epidemiological Profile, conducted in 1994 by the US Agency for International Development, as a case study. The methods include discourse analysis in order to uncover the ideological basis of the report and its characteristics as a discourse of development.
SETTING—El Salvador.
RESULTS—The theoretical basis of the Epidemiological Profile, the epidemiological transition theory, embodies the ethnocentrism of a "colonizer's model of the world." This report follows the logic of a discourse of development by depoliticising development, creating abnormalities, and relying on the development consulting industry. The epidemiological transition theory serves as an ideology that legitimises and dissimulates the international order.
CONCLUSIONS—Even descriptive epidemiological assessments or epidemiological profiles are imbued with theoretical assumptions shaped by the institutional setting under which epidemiological investigations are conducted.


Keywords: El Salvador; politics PMID:11160170

  20. Design component method for sensitivity analysis of built-up structures

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Seong, Hwai G.

    1986-01-01

    A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.
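    The additivity this abstract exploits (system design sensitivity obtained by simply adding contributions from each component) can be illustrated with a toy built-up structure. The components, dimensions, and material density below are hypothetical, and central finite differences stand in for the paper's analytical sensitivity expressions.

```python
# Hypothetical built-up structure: two bars plus a plate, with conventional
# design variables (cross-sectional area, thickness). Total mass is a sum of
# component masses, so its design sensitivity assembles by addition.
RHO = 7.8e-6  # steel density in kg/mm^3 (illustrative)

def bar_mass(area, length):        # design variable: cross-sectional area (mm^2)
    return RHO * area * length

def plate_mass(thickness, a, b):   # design variable: thickness (mm)
    return RHO * thickness * a * b

def sensitivity(f, x, args, h=1e-6):
    # Central finite difference w.r.t. the component's own design variable;
    # stands in for the analytical per-component sensitivity expression.
    return (f(x + h, *args) - f(x - h, *args)) / (2.0 * h)

components = [
    (bar_mass, 100.0, (500.0,)),        # bar 1: area 100, length 500
    (bar_mass, 150.0, (300.0,)),        # bar 2: area 150, length 300
    (plate_mass, 5.0, (200.0, 100.0)),  # plate: thickness 5, 200 x 100
]

total_mass = sum(f(x, *a) for f, x, a in components)
grad = [sensitivity(f, x, a) for f, x, a in components]  # one entry per component
print(round(total_mass, 6), [round(g, 6) for g in grad])
```

    The system-level gradient is assembled purely by collecting per-component contributions, mirroring how the method organizes computations component by component.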

  1. Applications of Genetic Methods to NASA Design and Operations Problems

    NASA Technical Reports Server (NTRS)

    Laird, Philip D.

    1996-01-01

    We review four recent NASA-funded applications in which evolutionary/genetic methods are important. In the process we survey: the kinds of problems being solved today with these methods; techniques and tools used; problems encountered; and areas where research is needed. The presentation slides are annotated briefly at the top of each page.

  2. Design and ergonomics. Methods for integrating ergonomics at hand tool design stage.

    PubMed

    Marsot, Jacques; Claudon, Laurent

    2004-01-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries, and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute (INRS) launched in 1999 a research project on integrating ergonomics into hand tool design, and more particularly into the design of a boning knife. After a brief recall of the difficulties of integrating ergonomics at the design stage, the present paper shows how 3 design methodological tools--Functional Analysis, Quality Function Deployment and TRIZ--have been applied to the design of a boning knife. Implementation of these tools enabled us to demonstrate the extent to which they are capable of responding to the difficulties of integrating ergonomics into product design. PMID:15028190

  3. The Checkered History of American Psychiatric Epidemiology

    PubMed Central

    Horwitz, Allan V; Grob, Gerald N

    2011-01-01

    Context American psychiatry has been fascinated with statistics ever since the specialty was created in the early nineteenth century. Initially, psychiatrists hoped that statistics would reveal the benefits of institutional care. Nevertheless, their fascination with statistics was far removed from the growing importance of epidemiology generally. The impetus to create an epidemiology of mental disorders came from the emerging social sciences, whose members were concerned with developing a scientific understanding of individual and social behavior and applying it to a series of pressing social problems. Beginning in the 1920s, the interest of psychiatric epidemiologists shifted to the ways that social environments contributed to the development of mental disorders. This emphasis dramatically changed after 1980 when the policy focus of psychiatric epidemiology became the early identification and prevention of mental illness in individuals. Methods This article reviews the major developments in psychiatric epidemiology over the past century and a half. Findings The lack of an adequate classification system for mental illness has precluded the field of psychiatric epidemiology from providing causal understandings that could contribute to more adequate policies to remediate psychiatric disorders. Because of this gap, the policy influence of psychiatric epidemiology has stemmed more from institutional and ideological concerns than from knowledge about the causes of mental disorders. Conclusion Most of the problems that have bedeviled psychiatric epidemiology since its inception remain unresolved. In particular, until epidemiologists develop adequate methods to measure mental illnesses in community populations, the policy contributions of this field will not be fully realized. PMID:22188350

  4. Air pollution, inflammation and preterm birth in Mexico City: study design and methods.

    PubMed

    O'Neill, Marie S; Osornio-Vargas, Alvaro; Buxton, Miatta A; Sánchez, Brisa N; Rojas-Bracho, Leonora; Castillo-Castrejon, Marisol; Mordhukovich, Irina B; Brown, Daniel G; Vadillo-Ortega, Felipe

    2013-03-15

    Preterm birth is one of the leading causes of perinatal mortality and is associated with long-term adverse health consequences for surviving infants. Preterm birth rates are rising worldwide, and no effective means for prevention currently exists. Air pollution exposure may be a significant cause of prematurity, but many published studies lack the individual, clinical data needed to elucidate possible biological mechanisms mediating these epidemiological associations. This paper presents the design of a prospective study now underway to evaluate those mechanisms in a cohort of pregnant women residing in Mexico City. We address how air quality may act together with other factors to induce systemic inflammation and influence the duration of pregnancy. Data collection includes: biomarkers relevant to inflammation in cervico-vaginal exudate and peripheral blood, along with full clinical information, pro-inflammatory cytokine gene polymorphisms and air pollution data to evaluate spatial and temporal variability in air pollution exposure. Samples are collected on a monthly basis and participants are followed for the duration of pregnancy. The data will be used to evaluate whether ambient air pollution is associated with preterm birth, controlling for other risk factors. We will evaluate which time windows during pregnancy are most influential in the air pollution and preterm birth association. In addition, the epidemiological study will be complemented with a parallel in vitro toxicology study, in which monocytic cells will be exposed to air particle samples to evaluate the expression of biomarkers of inflammation. PMID:23177781

  5. Choosing a future for epidemiology: II. From black box to Chinese boxes and eco-epidemiology.

    PubMed Central

    Susser, M; Susser, E

    1996-01-01

    Part I of this paper traced the evolution of modern epidemiology in terms of three eras, each with its dominant paradigm, culminating in the present era of chronic disease epidemiology with its paradigm, the black box. This paper sees the close of the present era and foresees a new era of eco-epidemiology in which the deployment of a different paradigm will be crucial. Here a paradigm is advocated for the emergent era. Encompassing many levels of organization--molecular and societal as well as individual--this paradigm, termed Chinese boxes, aims to integrate more than a single level in design, analysis, and interpretation. Such a paradigm could sustain and refine a public health-oriented epidemiology. But preventing a decline of creative epidemiology in this new era will require more than a cogent scientific paradigm. Attention will have to be paid to the social processes that foster a cohesive and humane discipline. PMID:8629718

  6. How to Combine Objectives and Methods of Evaluation in Iterative ILE Design: Lessons Learned from Designing Ambre-Add

    ERIC Educational Resources Information Center

    Nogry, S.; Jean-Daubias, S.; Guin, N.

    2012-01-01

    This article deals with evaluating an interactive learning environment (ILE) during the iterative-design process. Various aspects of the system must be assessed and a number of evaluation methods are available. In designing the ILE Ambre-add, several techniques were combined to test and refine the system. In particular, we point out the merits of…

  7. A hybrid nonlinear programming method for design optimization

    NASA Technical Reports Server (NTRS)

    Rajan, S. D.

    1986-01-01

    Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
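    A switching strategy of the kind described (apply one technique, detect trouble, fall back to another) might look like the sketch below. The objective, the divergence test, and the derivative-free fallback are illustrative stand-ins, not the rules of the actual NLP system.

```python
# Toy hybrid: a cheap fixed-step gradient descent is tried first; a simple rule
# detects divergence and switches to a derivative-free pattern search.
def f(x):  # badly scaled quadratic: fixed-step gradient descent blows up on it
    return x[0] ** 2 + 100.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 200.0 * x[1]]

def gradient_descent(x, step=0.02, iters=50):
    f0 = f(x)
    for _ in range(iters):
        x = [xi - step * gi for xi, gi in zip(x, grad(x))]
        if f(x) > 10.0 * f0:          # rule: objective exploding -> give up
            return x, False
    return x, f(x) < 1e-10

def pattern_search(x, s=1.0, tol=1e-9):
    fx = f(x)
    while s > tol:
        moved = False
        for i in range(len(x)):
            for d in (s, -s):         # probe +/- s along each coordinate
                y = list(x)
                y[i] += d
                if f(y) < fx:
                    x, fx, moved = y, f(y), True
        if not moved:                 # no improving move: refine the mesh
            s *= 0.5
    return x

def hybrid(x0):
    x, ok = gradient_descent(list(x0))
    if not ok:                        # switch technique on detected failure
        x = pattern_search(list(x0))
    return x

x = hybrid([3.0, 2.0])
print(f(x))  # → 0.0
```

    Here the fixed step is unstable along the stiff coordinate, the divergence test fires, and the pattern search recovers the minimizer; a production system would encode many such detect-diagnose-switch rules.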

  8. Design Method of Fault Detector for Injection Unit

    NASA Astrophysics Data System (ADS)

    Ochi, Kiyoshi; Saeki, Masami

    An injection unit is considered as a speed control system utilizing a reaction-force sensor. Our purpose is to design a fault detector that detects and isolates actuator and sensor faults under the condition that the system is disturbed by a reaction force. First described is the fault detector's general structure. In this system, a disturbance observer that estimates the reaction force is designed for the speed control system in order to obtain the residual signals, and then post-filters that separate the specific frequency elements from the residual signals are applied in order to generate the decision signals. Next, we describe a fault detector designed specifically for a model of the injection unit. It is shown that the disturbance imposed on the decision variables can be made significantly small by appropriate adjustments to the observer bandwidth, and that most of the sensor faults and actuator faults can be detected and some of them can be isolated in the frequency domain by setting the frequency characteristics of the post-filters appropriately. Our result is verified by experiments for an actual injection unit.
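    The structure described (a disturbance observer producing a residual, followed by a decision rule) can be sketched for a first-order speed axis. All plant numbers, observer gains, and the threshold are invented; the authors' injection-unit model and frequency-domain post-filters are not reproduced here.

```python
# First-order speed axis J*dv/dt = u - d, with d the reaction force. A
# disturbance observer estimates d; after an actuator fault the estimate
# jumps away from the nominal load, and a threshold raises the alarm.
J, DT = 0.5, 0.001
L1, L2 = 200.0, 5000.0   # observer gains: estimation-error poles near s = -100

def run(steps=2000, fault_at=1000):
    v = v_hat = d_hat = 0.0
    alarms = []
    for k in range(steps):
        u = 1.0                                   # commanded torque
        d = 0.2                                   # true (nominal) reaction force
        u_act = u if k < fault_at else 0.5 * u    # fault: actuator loses half its torque
        v += DT * (u_act - d) / J                 # plant (forward Euler)
        e = v - v_hat
        v_hat += DT * ((u - d_hat) / J + L1 * e)  # speed observer
        d_hat += DT * (-L2 * e)                   # disturbance estimate update
        if abs(d_hat - d) > 0.25:                 # decision rule: residual far from nominal load
            alarms.append(k)
    return alarms

alarms = run()
print(alarms[0])
```

    Before the fault the estimate settles to the true reaction force and the residual stays quiet; once the actuator loses torque, the apparent disturbance jumps and the alarm fires within a few observer time constants.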

  9. The CONSTANCES cohort: an open epidemiological laboratory

    PubMed Central

    2010-01-01

    Background Prospective cohorts represent an essential design for epidemiological studies and allow for the study of the combined effects of lifestyle, environment, genetic predisposition, and other risk factors on a large variety of disease endpoints. The CONSTANCES cohort is intended to provide public health information and to serve as an "open epidemiologic laboratory" accessible to the epidemiologic research community. Although designed as a "general-purpose" cohort with very broad coverage, it will particularly focus on occupational and social determinants of health, and on aging. Methods/Design The CONSTANCES cohort is designed as a randomly selected representative sample of French adults aged 18-69 years at inception; 200,000 subjects will be included over a five-year period. At inclusion, the selected subjects will be invited to fill a questionnaire and to attend a Health Screening Center (HSC) for a comprehensive health examination: weight, height, blood pressure, electrocardiogram, vision, auditory, spirometry, and biological parameters; for those aged 45 years and older, a specific work-up of functional, physical, and cognitive capacities will be performed. A biobank will be set up. The follow-up includes a yearly self-administered questionnaire, and a periodic visit to an HSC. Social and work-related events and health data will be collected from the French national retirement, health and death databases. The data that will be collected include social and demographic characteristics, socioeconomic status, life events, behaviors, and occupational factors. The health data will cover a wide spectrum: self-reported health scales, reported prevalent and incident diseases, long-term chronic diseases and hospitalizations, sick-leaves, handicaps, limitations, disabilities and injuries, healthcare utilization and services provided, and causes of death. 
To take into account non-participation at inclusion and attrition throughout the longitudinal follow-up, a cohort

  10. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150....

  11. Advanced Control and Protection system Design Methods for Modular HTGRs

    SciTech Connect

    Ball, Sydney J; Wilson Jr, Thomas L; Wood, Richard Thomas

    2012-06-01

    The project supported the Nuclear Regulatory Commission (NRC) in identifying and evaluating the regulatory implications concerning the control and protection systems proposed for use in the Department of Energy's (DOE) Next-Generation Nuclear Plant (NGNP). The NGNP, using modular high-temperature gas-cooled reactor (HTGR) technology, is to provide commercial industries with electricity and high-temperature process heat for industrial processes such as hydrogen production. Process heat temperatures range from 700 to 950 C, and for the upper range of these operation temperatures, the modular HTGR is sometimes referred to as the Very High Temperature Reactor or VHTR. Initial NGNP designs are for operation in the lower temperature range. The defining safety characteristic of the modular HTGR is that its primary defense against serious accidents is to be achieved through its inherent properties of the fuel and core. Because of its strong negative temperature coefficient of reactivity and the capability of the fuel to withstand high temperatures, fast-acting active safety systems or prompt operator actions should not be required to prevent significant fuel failure and fission product release. The plant is designed such that its inherent features should provide adequate protection despite operational errors or equipment failure. Figure 1 shows an example modular HTGR layout (prismatic core version), where its inlet coolant enters the reactor vessel at the bottom, traversing up the sides to the top plenum, down-flow through an annular core, and exiting from the lower plenum (hot duct). This research provided NRC staff with (a) insights and knowledge about the control and protection systems for the NGNP and VHTR, (b) information on the technologies/approaches under consideration for use in the reactor and process heat applications, (c) guidelines for the design of highly integrated control rooms, (d) consideration for modeling of control and protection system designs for

  12. Reducing Design Risk Using Robust Design Methods: A Dual Response Surface Approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Yeniay, Ozgur; Lepsch, Roger A. (Technical Monitor)

    2003-01-01

    Space transportation system conceptual design is a multidisciplinary process containing considerable element of risk. Risk here is defined as the variability in the estimated (output) performance characteristic of interest resulting from the uncertainties in the values of several disciplinary design and/or operational parameters. Uncertainties from one discipline (and/or subsystem) may propagate to another, through linking parameters and the final system output may have a significant accumulation of risk. This variability can result in significant deviations from the expected performance. Therefore, an estimate of variability (which is called design risk in this study) together with the expected performance characteristic value (e.g. mean empty weight) is necessary for multidisciplinary optimization for a robust design. Robust design in this study is defined as a solution that minimizes variability subject to a constraint on mean performance characteristics. Even though multidisciplinary design optimization has gained wide attention and applications, the treatment of uncertainties to quantify and analyze design risk has received little attention. This research effort explores the dual response surface approach to quantify variability (risk) in critical performance characteristics (such as weight) during conceptual design.
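    The dual response idea (one response for the mean, one for the variability, then minimise variability subject to a constraint on the mean) can be sketched with a toy weight model. Replicated sampling over a design grid stands in here for fitted polynomial response surfaces; the model, noise, and constraint value are all invented.

```python
import random
import statistics

random.seed(1)

# Toy performance model (not the paper's vehicle model): "empty weight" as a
# function of a design parameter x and a noise variable z for uncertain
# inputs. Both the mean and the variability depend on x.
def weight(x, z):
    return 50.0 + (x - 2.0) ** 2 + (1.5 + (x - 3.0) ** 2) * z

def dual_response(x, reps=500):
    # Replicated runs over the noise stand in for the fitted mean and
    # variability response surfaces.
    ys = [weight(x, random.gauss(0.0, 1.0)) for _ in range(reps)]
    return statistics.mean(ys), statistics.pstdev(ys)

candidates = [1.0 + 0.1 * i for i in range(31)]          # designs x in [1, 4]
responses = {x: dual_response(x) for x in candidates}

# Robust design: minimise the variability response subject to a constraint
# on the mean response (mean weight <= 52, an invented target).
feasible = [x for x in candidates if responses[x][0] <= 52.0]
robust_x = min(feasible, key=lambda x: responses[x][1])
print(round(robust_x, 1))
```

    The selected design sits near x = 3, where variability is smallest among designs that still meet the mean-weight constraint, which is exactly the robust-design trade-off the abstract defines.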

  13. Object-oriented design of preconditioned iterative methods

    SciTech Connect

    Bruaset, A.M.

    1994-12-31

    In this talk the author discusses how object-oriented programming techniques can be used to develop a flexible software package for preconditioned iterative methods. The ideas described have been used to implement the linear algebra part of Diffpack, which is a collection of C++ class libraries that provides high-level tools for the solution of partial differential equations. In particular, this software package is aimed at rapid development of PDE-based numerical simulators, primarily using finite element methods.
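    The design idea (solvers written against an abstract preconditioner interface, so concrete preconditioners can be swapped without touching the solver) can be sketched in a few classes. This mirrors the object-oriented pattern only; it is not Diffpack's actual C++ API.

```python
# Abstract interface: a preconditioner applies M^{-1} to a residual vector.
class Preconditioner:
    def apply(self, r):
        raise NotImplementedError

class Identity(Preconditioner):        # no preconditioning
    def apply(self, r):
        return list(r)

class Jacobi(Preconditioner):          # diagonal (Jacobi) preconditioning
    def __init__(self, A):
        self.diag = [A[i][i] for i in range(len(A))]
    def apply(self, r):
        return [ri / di for ri, di in zip(r, self.diag)]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def pcg(A, b, M, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradients; only talks to M via apply()."""
    x = [0.0] * len(b)
    r = [bi - ri for bi, ri in zip(b, matvec(A, x))]
    z = M.apply(r)
    p = list(z)
    rz = dot(r, z)
    for _ in range(maxit):
        Ap = matvec(A, p)
        alpha = rz / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r, r) ** 0.5 < tol:
            break
        z = M.apply(r)
        rz_new = dot(r, z)
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = pcg(A, b, Jacobi(A))
print([round(v, 6) for v in x])
print([round(v, 6) for v in pcg(A, b, Identity())])  # swap preconditioner freely
```

    The solver never knows which preconditioner it is given, which is the flexibility the talk attributes to the object-oriented design.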

  14. Vaccine epidemiology: A review

    PubMed Central

    Lahariya, Chandrakant

    2016-01-01

    This review article outlines the key concepts in vaccine epidemiology, such as basic reproductive numbers, force of infection, vaccine efficacy and effectiveness, vaccine failure, herd immunity, herd effect, epidemiological shift, disease modeling, and describes the application of this knowledge both at program levels and in practice by family physicians, epidemiologists, and pediatricians. A case has been made for increased knowledge and understanding of vaccine epidemiology among key stakeholders including policy makers, immunization program managers, public health experts, pediatricians, family physicians, and other experts/individuals involved in immunization service delivery. It has been argued that knowledge of vaccine epidemiology is likely to benefit society through contributions to informed decision-making and improved vaccination coverage in low- and middle-income countries (LMICs). The article ends with suggestions for the provision of systematic training and learning platforms in vaccine epidemiology to save millions of preventable deaths and improve health outcomes across the life-course. PMID:27453836
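    Two of the quantities this review names, the herd immunity threshold and the critical vaccination coverage for an imperfect vaccine, follow from the basic reproduction number via standard textbook relations (H = 1 - 1/R0 and Vc = H/E). A small sketch with illustrative, measles-like numbers:

```python
def herd_immunity_threshold(r0):
    # Fraction of the population that must be immune so the effective
    # reproduction number falls below 1: H = 1 - 1/R0.
    return 1.0 - 1.0 / r0

def critical_coverage(r0, effectiveness):
    # Coverage needed with an imperfect vaccine: Vc = (1 - 1/R0) / E.
    # A value above 1 means vaccination alone cannot reach herd immunity.
    return herd_immunity_threshold(r0) / effectiveness

# Measles-like illustration: R0 around 15, vaccine effectiveness 95%.
print(round(herd_immunity_threshold(15.0), 3))   # → 0.933
print(round(critical_coverage(15.0, 0.95), 3))   # → 0.982
```

    This makes concrete why highly transmissible diseases demand near-universal coverage: dividing the threshold by vaccine effectiveness pushes the required coverage close to, or past, 100%.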

  15. The future of epidemiology.

    PubMed

    Ness, Roberta B; Andrews, Elizabeth B; Gaudino, James A; Newman, Anne B; Soskolne, Colin L; Stürmer, Til; Wartenberg, Daniel E; Weiss, Stanley H

    2009-11-01

    In this article, the authors discuss current challenges and opportunities in epidemiology that will affect the field's future. Epidemiology is commonly considered the methodologic backbone for the fields of public health and outcomes research because its practitioners describe patterns of disease occurrence, identify risk factors and etiologic determinants, and demonstrate the usefulness of interventions. Like most aspects of science, epidemiology is in rapid flux. Several factors that are influencing and will continue to influence epidemiology and the health of the public include factors fundamental to framing the discipline of epidemiology (i.e., its means of communication, its methodologies, its access to data, its values, its population perspective), factors relating to scientific advances (e.g., genomics, comparative effectiveness in therapeutics), and factors shaping human health (e.g., increasing globalism, the environment, disease and lifestyle, demographics, infectious disease). PMID:19858828

  17. Mental health epidemiological research in South America: recent findings

    PubMed Central

    Silva de Lima, Maurício; Garcia de Oliveira Soares, Bernardo; de Jesus Mari, Jair

    2004-01-01

    This paper aims to review the recent mental health epidemiological research conducted in South America. The Latin American and the Caribbean (LILACS) database was searched from 1999 to 2003 using a specific strategy for identification of cohort, case-control and cross-sectional population-based studies in South America. The authors screened references and identified relevant studies. Further studies were obtained contacting local experts in epidemiology. 140 references were identified, and 12 studies were selected. Most selected studies explored the prevalence and risk factors for common mental disorders, and several of them used sophisticated methods of sample selection and analysis. There is a need for improving the quality of psychiatric journals in Latin America, and for increasing the distribution and access to research data. Regionally relevant problems such as violence and substance abuse should be considered in designing future investigations in this area. PMID:16633474

  18. [The epidemiological surveillance of dengue in Mexico].

    PubMed

    Montesano-Castellanos, R; Ruiz-Matus, C

    1995-01-01

    The clinical behavior of dengue fever in Mexico has changed, now with the occurrence of hemorrhagic cases. In response to the emergence of such cases, a specific epidemiologic surveillance system has been designed and implemented. This system includes the means to monitor the factors involved in the evolution of the disease. The identification and analysis of these factors is necessary to implement prevention and control measures. This paper presents the main components and procedures of the epidemiologic surveillance system for common and hemorrhagic dengue fever in Mexico, emphasizing the usefulness of the risk approach to predict the pattern of this disease. The model includes the collaboration of a multidisciplinary group. The Epidemiologic Surveillance State Committee, coordinated by the National Health System, participates in the collection and analysis of epidemiologic data, particularly data related to the population, the individual, the vector, the viruses and the environment. PMID:8599150

  20. GESDB: a platform of simulation resources for genetic epidemiology studies.

    PubMed

    Yao, Po-Ju; Chung, Ren-Hua

    2016-01-01

    Computer simulations are routinely conducted to evaluate new statistical methods, to compare the properties among different methods, and to mimic the observed data in genetic epidemiology studies. Conducting simulation studies can become a complicated task as several challenges can occur, such as the selection of an appropriate simulation tool and the specification of parameters in the simulation model. Although abundant simulated data have been generated for human genetic research, currently there is no public database designed specifically as a repository for these simulated data. With the lack of such a database, for similar studies, similar simulations may have been repeated, which resulted in redundant work. Thus, we created an online platform, the Genetic Epidemiology Simulation Database (GESDB), for simulation data sharing and discussion of simulation techniques for genetic epidemiology studies. GESDB consists of a database for storing simulation scripts, simulated data and documentation from published articles as well as a discussion forum, which provides a platform for discussion of the simulated data and exchanging simulation ideas. Moreover, summary statistics such as the simulation tools that are most commonly used and datasets that are most frequently downloaded are provided. The statistics will be informative for researchers to choose an appropriate simulation tool or select a common dataset for method comparisons. GESDB can be accessed at http://gesdb.nhri.org.tw. Database URL: http://gesdb.nhri.org.tw PMID:27242038

  2. GESDB: a platform of simulation resources for genetic epidemiology studies

    PubMed Central

    Yao, Po-Ju; Chung, Ren-Hua

    2016-01-01

    Computer simulations are routinely conducted to evaluate new statistical methods, to compare the properties among different methods, and to mimic the observed data in genetic epidemiology studies. Conducting simulation studies can become a complicated task as several challenges can occur, such as the selection of an appropriate simulation tool and the specification of parameters in the simulation model. Although abundant simulated data have been generated for human genetic research, currently there is no public database designed specifically as a repository for these simulated data. With the lack of such a database, for similar studies, similar simulations may have been repeated, which resulted in redundant work. Thus, we created an online platform, the Genetic Epidemiology Simulation Database (GESDB), for simulation data sharing and discussion of simulation techniques for genetic epidemiology studies. GESDB consists of a database for storing simulation scripts, simulated data and documentation from published articles as well as a discussion forum, which provides a platform for discussion of the simulated data and exchanging simulation ideas. Moreover, summary statistics such as the simulation tools that are most commonly used and datasets that are most frequently downloaded are provided. The statistics will be informative for researchers to choose an appropriate simulation tool or select a common dataset for method comparisons. GESDB can be accessed at http://gesdb.nhri.org.tw. Database URL: http://gesdb.nhri.org.tw PMID:27242038

  3. Genetic-evolution-based optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Rao, S. S.; Pan, T. S.; Dhingra, A. K.; Venkayya, V. B.; Kumar, V.

    1990-01-01

    This paper presents the applicability of a biological model, based on genetic evolution, for engineering design optimization. Algorithms embodying the ideas of reproduction, crossover, and mutation are developed and applied to solve different types of structural optimization problems. Both continuous and discrete variable optimization problems are solved. A two-bay truss for maximum fundamental frequency is considered to demonstrate the continuous variable case. The selection of locations of actuators in an actively controlled structure, for minimum energy dissipation, is considered to illustrate the discrete variable case.
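    A real-coded genetic algorithm with the operators this abstract names (reproduction, crossover, mutation) can be sketched on a stand-in continuous objective; the truss-frequency and actuator-placement problems themselves are not reproduced, and all parameters are illustrative.

```python
import random

random.seed(2)

# Real-coded GA on a stand-in objective with minimum at x = [1, -2].
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

POP, GENS, DIM = 30, 80, 2

def tournament(pop):                 # reproduction: binary tournament selection
    a, b = random.sample(pop, 2)
    return a if f(a) < f(b) else b

def crossover(p, q):                 # arithmetic (blend) crossover
    w = random.random()
    return [w * pi + (1.0 - w) * qi for pi, qi in zip(p, q)]

def mutate(x, sigma=0.1):            # Gaussian mutation
    return [xi + random.gauss(0.0, sigma) for xi in x]

pop = [[random.uniform(-5.0, 5.0) for _ in range(DIM)] for _ in range(POP)]
for _ in range(GENS):
    best = min(pop, key=f)           # elitism: carry the best design over
    pop = [best] + [mutate(crossover(tournament(pop), tournament(pop)))
                    for _ in range(POP - 1)]

best = min(pop, key=f)
print([round(v, 1) for v in best])
```

    For discrete design variables, such as the actuator-location problem in the paper, the same loop applies with integer genes and a relabeling mutation in place of the Gaussian one.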

  4. Paper vs. electrons. Epidemiologic publishing in a changing world.

    PubMed

    Rothenberg; Frank; Fitzmaurice

    2000-10-01

    PURPOSE: To present the parallel histories of epidemiologic and electronic publishing and consider positive and negative factors that might affect their amalgam. METHODS: We performed a quantitative assessment of the arc of epidemiologic publication from 1966-1999, using major self-designated epidemiologic journals as a sample, and of scholarly electronic publication from 1991-1997, based on current literature review. We use an online, paperless journal as a case study, and review selected information-technology opinion in the area. RESULTS: By traditional standards, growth in epidemiologic publication has been considerable, with the addition of six new journals since 1966. In contrast, scholarly electronic publication for the period 1991-1997 grew from 27 to 2459 journals (not all exclusively online). Positive features of electronic publishing include flexibility, shortened time to publication, freedom from fixed publication date, diversity in presentation, and instant linkage to relevant material. A case study of a new online journal illustrates the substantive power of the medium. Negative factors include restriction (or unrestricted expansion) of the audience, the potential for hasty peer review, pitfalls in establishing credibility, an emphasis on style over content, technologic dependence, and additions to the information explosion. Relative cost and archiving are still debated. In assessing the pros and cons, it is important to distinguish electronic mechanisms that facilitate publication from electronic publishing, and to appreciate the difference between moving an existing journal to the electronic medium, and creating a new online journal. CONCLUSIONS: The movement from print to internet is probably inexorable, but a headlong rush may be ill-advised. Several models for dual publishing now exist, with the expectation that many, including the journals that serve epidemiology, will do so. The ultimate configuration is difficult to predict, but likely to be

  5. Category's analysis and operational project capacity method of transformation in design

    NASA Astrophysics Data System (ADS)

    Obednina, S. V.; Bystrova, T. Y.

    2015-10-01

    The method of transformation is attracting widespread interest in fields such as contemporary design. However, in the theory of design little attention has been paid to the categorical status of the term "transformation". This paper presents a conceptual analysis of transformation based on the theory of form employed in the influential essays of Aristotle and Thomas Aquinas. The present work explores transformation as a method of shaping in design and demonstrates potential applications of the term in design practice.

  6. The evolution of disease: anthropological perspectives on epidemiologic transitions

    PubMed Central

    Zuckerman, Molly Kathleen; Harper, Kristin Nicole; Barrett, Ronald; Armelagos, George John

    2014-01-01

    Background The model of epidemiologic transitions has served as a guiding framework for understanding relationships between patterns of human health and disease and economic development for the past several decades. However, epidemiologic transition theory is infrequently employed in epidemiology. Objective Moving beyond Omran's original formulation, we discuss critiques and modifications of the theory of epidemiologic transitions and highlight some of the ways in which incorporating epidemiologic transition theory can benefit theory and practice in epidemiology. Design We focus on two broad contemporary trends in human health that epidemiologic transition theory is useful for conceptualizing: the increased incidence of chronic inflammatory diseases (CIDs), such as allergic and autoimmune diseases, and the emergence and reemergence of infectious disease. Results Situating these trends within epidemiologic transition theory, we explain the rise in CIDs with the hygiene hypothesis and the rise in emerging and reemerging infections with the concept of a third epidemiologic transition. Conclusions Contextualizing these trends within epidemiologic transition theory reveals implications for clinical practice, global health policies, and future research within epidemiology. PMID:24848652

  7. A Proposed Model of Retransformed Qualitative Data within a Mixed Methods Research Design

    ERIC Educational Resources Information Center

    Palladino, John M.

    2009-01-01

    Most models of mixed methods research design provide equal emphasis of qualitative and quantitative data analyses and interpretation. Other models stress one method more than the other. The present article is a discourse about the investigator's decision to employ a mixed method design to examine special education teachers' advocacy and…

  8. 40 CFR 53.8 - Designation of reference and equivalent methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 6 2014-07-01 2014-07-01 false Designation of reference and equivalent... PROGRAMS (CONTINUED) AMBIENT AIR MONITORING REFERENCE AND EQUIVALENT METHODS General Provisions § 53.8 Designation of reference and equivalent methods. (a) A candidate method determined by the Administrator...

  9. 40 CFR 53.8 - Designation of reference and equivalent methods.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 6 2012-07-01 2012-07-01 false Designation of reference and equivalent... PROGRAMS (CONTINUED) AMBIENT AIR MONITORING REFERENCE AND EQUIVALENT METHODS General Provisions § 53.8 Designation of reference and equivalent methods. (a) A candidate method determined by the Administrator...

  10. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  11. 40 CFR 53.8 - Designation of reference and equivalent methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 5 2011-07-01 2011-07-01 false Designation of reference and equivalent... PROGRAMS (CONTINUED) AMBIENT AIR MONITORING REFERENCE AND EQUIVALENT METHODS General Provisions § 53.8 Designation of reference and equivalent methods. (a) A candidate method determined by the Administrator...

  12. Cathodic protection design using the regression and correlation method

    SciTech Connect

    Niembro, A.M.; Ortiz, E.L.G.

    1997-09-01

    A computerized statistical method which calculates the current demand requirement based on potential measurements for cathodic protection systems is introduced. The method uses the regression and correlation analysis of statistical measurements of current and potentials of the piping network. This approach involves four steps: field potential measurements, statistical determination of the current required to achieve full protection, installation of more cathodic protection capacity with distributed anodes around the plant and examination of the protection potentials. The procedure is described and recommendations for the improvement of the existing and new cathodic protection systems are given.
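    The regression step of this record can be sketched in a few lines: fit protection potential against applied current from field measurements, then solve the fitted line for the current that reaches a full-protection potential. All measurement values below are invented for illustration, and the -0.85 V vs. Cu/CuSO4 criterion is a common industry convention assumed here, not a value from the paper.

    ```python
    def linear_fit(xs, ys):
        """Ordinary least-squares fit y = a + b*x; returns (a, b, r)."""
        n = len(xs)
        mx = sum(xs) / n
        my = sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        syy = sum((y - my) ** 2 for y in ys)
        b = sxy / sxx
        a = my - b * mx
        r = sxy / (sxx * syy) ** 0.5  # correlation coefficient
        return a, b, r

    # Hypothetical pipe-to-soil potentials (V vs. Cu/CuSO4) at test currents (A)
    currents = [0.0, 5.0, 10.0, 15.0, 20.0]
    potentials = [-0.55, -0.62, -0.70, -0.77, -0.85]

    a, b, r = linear_fit(currents, potentials)
    target = -0.85  # assumed full-protection criterion, V vs. Cu/CuSO4
    required_current = (target - a) / b  # current demand to reach the criterion
    ```

    The correlation coefficient r indicates whether the linear model is trustworthy before the extrapolated current demand is used to size additional anode capacity.
    
    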

  13. Flight critical system design guidelines and validation methods

    NASA Technical Reports Server (NTRS)

    Holt, H. M.; Lupton, A. O.; Holden, D. G.

    1984-01-01

    Efforts being expended at NASA-Langley to define a validation methodology, techniques for comparing advanced systems concepts, and design guidelines for characterizing fault tolerant digital avionics are described with an emphasis on the capabilities of AIRLAB, an environmentally controlled laboratory. AIRLAB has VAX 11/750 and 11/780 computers with an aggregate of 22 Mb memory and over 650 Mb storage, interconnected at 256 kbaud. An additional computer is programmed to emulate digital devices. Ongoing work is easily accessed at user stations by either chronological or key word indexing. The CARE III program aids in analyzing the capabilities of test systems to recover from faults. An additional code, the semi-Markov unreliability program (SURE) generates upper and lower reliability bounds. The AIRLAB facility is mainly dedicated to research on designs of digital flight-critical systems which must have acceptable reliability before incorporation into aircraft control systems. The digital systems would be too costly to submit to a full battery of flight tests and must be initially examined with the AIRLAB simulation capabilities.

  14. Computer control of large accelerators: design concepts and methods

    SciTech Connect

    Beck, F.; Gormley, M.

    1984-05-01

    Unlike most of the specialities treated in this volume, control system design is still an art, not a science. These lectures are an attempt to produce a primer for prospective practitioners of this art. A large modern accelerator requires a comprehensive control system for commissioning, machine studies and day-to-day operation. Faced with the requirement to design a control system for such a machine, the control system architect has a bewildering array of technical devices and techniques at his disposal, and it is our aim in the following chapters to lead him through the characteristics of the problems he will have to face and the practical alternatives available for solving them. We emphasize good system architecture using commercially available hardware and software components, but in addition we discuss the actual control strategies which are to be implemented since it is at the point of deciding what facilities shall be available that the complexity of the control system and its cost are implicitly decided. 19 references.

  15. On extracting design principles from biology: I. Method-General answers to high-level design questions for bioinspired robots.

    PubMed

    Haberland, M; Kim, S

    2015-02-02

    When millions of years of evolution suggest a particular design solution, we may be tempted to abandon traditional design methods and copy the biological example. However, biological solutions do not often translate directly into the engineering domain, and even when they do, copying eliminates the opportunity to improve. A better approach is to extract design principles relevant to the task of interest, incorporate them in engineering designs, and vet these candidates against others. This paper presents the first general framework for determining whether biologically inspired relationships between design input variables and output objectives and constraints are applicable to a variety of engineering systems. Using optimization and statistics to generalize the results beyond a particular system, the framework overcomes shortcomings observed of ad hoc methods, particularly those used in the challenging study of legged locomotion. The utility of the framework is demonstrated in a case study of the relative running efficiency of rotary-kneed and telescoping-legged robots.

  16. Fault self-diagnosis designing method of the automotive electronic control system

    NASA Astrophysics Data System (ADS)

    Ding, Yangyan; Yang, Zhigang; Fu, Xiaolin

    2005-12-01

    The fault self-diagnosis system is an important component of an automotive electronic control system. Designers of automotive electronic control systems urgently need a complete understanding of the self-diagnosis design method of the control system in order to apply it in practice. To address this need, self-diagnosis design methods for sensors, the electronic control unit (ECU), and actuators, the three main parts of automotive electronic control systems, are discussed in this paper. According to the fault types and characteristics of commonly used sensors, self-diagnosis design methods for sensors are discussed. Fault diagnosis techniques for sensors based on signal detection and on analytical redundancy are then analysed and summarized from the viewpoint of self-diagnosis design. Problems in the failure self-diagnosis of the ECU are also analyzed: for the different fault types of an ECU, a circuit monitoring method and a self-detection method of the hardware circuit are adopted respectively, and from these two methods a real-time, on-line technique for failure self-diagnosis is presented. The failure self-diagnosis design methods for the ECU are then summarized. Finally, common faults of actuators are analyzed and a general design method for the failure self-diagnosis system is presented. These self-diagnosis design methods offer a useful approach for designers of automotive electronic control systems.
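    The circuit-monitoring idea for sensor signals can be illustrated by simple range and rate plausibility checks, a common pattern in such diagnosis systems. This is only a generic sketch: the limits, fault codes, and sample values below are invented and are not taken from the paper.

    ```python
    def check_sensor(samples, lo, hi, max_step):
        """Flag out-of-range values and implausibly fast changes in a
        sampled sensor signal; returns (code, sample_index) fault tuples."""
        faults = []
        for i, v in enumerate(samples):
            if not (lo <= v <= hi):
                faults.append(("RANGE", i))   # suggests open/short circuit
            if i > 0 and abs(v - samples[i - 1]) > max_step:
                faults.append(("RATE", i))    # physically implausible jump
        return faults

    # e.g. a slowly varying signal with a transient short to the supply rail
    faults = check_sensor([1.0, 1.1, 9.9, 1.2], lo=0.0, hi=5.0, max_step=2.0)
    ```

    A real ECU would debounce these flags over several samples and map them to stored diagnostic trouble codes rather than reporting single-sample violations.
    
    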

  17. Power Analysis for Complex Mediational Designs Using Monte Carlo Methods

    ERIC Educational Resources Information Center

    Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.

    2010-01-01

    Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex…

  18. The Use of Hermeneutics in a Mixed Methods Design

    ERIC Educational Resources Information Center

    von Zweck, Claudia; Paterson, Margo; Pentland, Wendy

    2008-01-01

    Combining methods in a single study is becoming a more common practice because of the limitations of using only one approach to fully address all aspects of a research question. Hermeneutics in this paper is discussed in relation to a large national study that investigated issues influencing the ability of international graduates to work as…

  19. Application of Six Sigma Method to EMS Design

    NASA Astrophysics Data System (ADS)

    Rusko, Miroslav; Králiková, Ružena

    2011-01-01

    The Six Sigma method is a complex and flexible system for achieving, maintaining and maximizing business success. Six Sigma is based mainly on understanding customer needs and expectations, disciplined use of facts and statistical analysis, and a responsible approach to managing, improving and establishing new business, manufacturing and service processes.

  20. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  1. The research progress on Hodograph Method of aerodynamic design at Tsinghua University

    NASA Technical Reports Server (NTRS)

    Chen, Zuoyi; Guo, Jingrong

    1991-01-01

    Progress in the use of the Hodograph method of aerodynamic design is discussed. The application of Hodograph design to transonic turbine and compressor cascades was found to be subject to certain restrictive conditions; within these, the Hodograph method is suitable not only for the transonic turbine cascade but also for the transonic compressor cascade. The three-dimensional Hodograph method will be developed after the basic equation for it is obtained. As an example, the use of the Hodograph method to design a transonic turbine and a transonic compressor cascade is discussed.

  2. Comparison of Three Statistical Methods for Establishing Tentative Wild-Type Population and Epidemiological Cutoff Values for Echinocandins, Amphotericin B, Flucytosine, and Six Candida Species as Determined by the Colorimetric Sensititre YeastOne Method

    PubMed Central

    Pemán, Javier; Hervás, David; Iñiguez, Carmen; Navarro, David; Echeverría, Julia; Martínez-Alarcón, José; Fontanals, Dionisia; Gomila-Sard, Bárbara; Buendía, Buenaventura; Torroba, Luis; Ayats, Josefina; Bratos, Angel; Sánchez-Reus, Ferran; Fernández-Natal, Isabel

    2012-01-01

    The Sensititre YeastOne (SYO) method is widely used to determine the susceptibility of Candida spp. to antifungal agents. CLSI clinical breakpoints (CBP) have been reported for antifungals, but not for this method. In the absence of CBP, epidemiological cutoff values (ECVs) are useful to separate wild-type (WT) isolates (those without mechanisms of resistance) from non-WT isolates (those that can harbor some resistance mechanisms), which is the goal of any susceptibility test. The ECVs for five agents, obtained using the MIC distributions determined by the SYO test, were calculated and contrasted with those for three statistical methods and the MIC50 and modal MIC, both plus 2-fold dilutions. The median ECVs (in mg/liter) (% of isolates inhibited by MICs equal to or less than the ECV; number of isolates tested) of the five methods for anidulafungin, micafungin, caspofungin, amphotericin B, and flucytosine, respectively, were as follows: 0.25 (98.5%; 656), 0.06 (95.1%; 659), 0.25 (98.7%; 747), 2 (100%; 923), and 1 (98.5%; 915) for Candida albicans; 8 (100%; 352), 4 (99.2%; 392), 2 (99.2%; 480), 1 (99.8%; 603), and 0.5 (97.9%; 635) for C. parapsilosis; 1 (99.2%; 123), 0.12 (99.2%; 121), 0.25 (99.2%; 138), 2 (100%; 171), and 0.5 (97.2%; 175) for C. tropicalis; 0.12 (96.6%; 174), 0.06 (96%; 176), 0.25 (98.4%; 188), 2 (100%; 209), and 0.25 (97.6%; 208) for C. glabrata; 0.25 (97%; 33), 0.5 (93.9%; 33), 1 (91.9%; 37), 4 (100%; 51), and 32 (100%; 53) for C. krusei; and 4 (100%; 33), 2 (100%; 33), 2 (100%; 54), 1 (100%; 90), and 0.25 (93.4%; 91) for C. orthopsilosis. The three statistical methods gave similar ECVs (within one dilution) and included ≥95% of isolates. These tentative ECVs would be useful for monitoring the emergence of isolates with reduced susceptibility by use of the SYO method. PMID:23015676
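    Of the approaches compared in this record, the simplest is the modal MIC plus two two-fold dilutions; the statistical methods are more involved. A minimal sketch with an invented MIC distribution (the dilution ladder reflects typical antifungal test series; none of these numbers come from the study):

    ```python
    from collections import Counter

    LADDER = [0.03, 0.06, 0.12, 0.25, 0.5, 1.0]  # typical two-fold MIC series, mg/L

    def ecv_modal_plus_two(mics):
        """Tentative ECV = modal MIC moved up two two-fold dilution steps."""
        counts = Counter(mics)
        modal = max(counts, key=counts.get)
        return LADDER[LADDER.index(modal) + 2]

    # Hypothetical wild-type MIC distribution (mg/L)
    mics = [0.03, 0.06, 0.06, 0.06, 0.12, 0.12, 0.25]
    ecv = ecv_modal_plus_two(mics)
    covered = sum(m <= ecv for m in mics) / len(mics)  # fraction of WT captured
    ```

    A sanity check like `covered`, the fraction of the presumed wild-type population at or below the candidate ECV, mirrors the ≥95% inclusion criterion the record uses to compare methods.
    
    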

  3. [Epidemiological research in Brazil].

    PubMed

    Guimarães, R; Lourenço-De-Oliveira, R; Cosac, S

    2001-08-01

    The current state of epidemiological research in Brazil is described. Secondary data sources were consulted, such as the year 2000 database of the Brazilian Directory of Research Groups and the National Board of Scientific and Technological Development (CNPq). A group was classified as a research group if it had at least one research line in the field of epidemiology, as defined by the group leader. After identifying this universe of epidemiological research, which included 176 groups and 320 different research lines, the following issues are presented and discussed: the relationships between research financing and health research, focusing on CAPES (Coordination Center for the Advance of University Professionals) graduate programs; public health research and epidemiological research; the geographic and institutional distribution and outreach of current epidemiological research; the researchers and students directly participating in epidemiological research; research topics and patterns of disseminating research findings, including the journals in which full-length papers were published; and the financial support of epidemiological research, focusing on the 23 officially recognized graduate programs in the public health field. PMID:11600921

  4. Finite Element Method Applied to Fuse Protection Design

    NASA Astrophysics Data System (ADS)

    Li, Sen; Song, Zhiquan; Zhang, Ming; Xu, Liuwei; Li, Jinchao; Fu, Peng; Wang, Min; Dong, Lin

    2014-03-01

    In a poloidal field (PF) converter module, fuse protection is of great importance to ensure the safety of the thyristors. The fuse is pre-selected in a traditional way and then verified by finite element analysis. A 3D physical model is built by ANSYS software to solve the thermal-electric coupled problem of transient process in case of external fault. The result shows that this method is feasible.

  5. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  6. Networks and the Epidemiology of Infectious Disease

    PubMed Central

    Danon, Leon; Ford, Ashley P.; House, Thomas; Jewell, Chris P.; Keeling, Matt J.; Roberts, Gareth O.; Ross, Joshua V.; Vernon, Matthew C.

    2011-01-01

    The science of networks has revolutionised research into the dynamics of interacting elements. It could be argued that epidemiology in particular has embraced the potential of network theory more than any other discipline. Here we review the growing body of research concerning the spread of infectious diseases on networks, focusing on the interplay between network theory and epidemiology. The review is split into four main sections, which examine: the types of network relevant to epidemiology; the multitude of ways these networks can be characterised; the statistical methods that can be applied to infer the epidemiological parameters on a realised network; and finally simulation and analytical methods to determine epidemic dynamics on a given network. Given the breadth of areas covered and the ever-expanding number of publications, a comprehensive review of all work is impossible. Instead, we provide a personalised overview into the areas of network epidemiology that have seen the greatest progress in recent years or have the greatest potential to provide novel insights. As such, considerable importance is placed on analytical approaches and statistical methods which are both rapidly expanding fields. Throughout this review we restrict our attention to epidemiological issues. PMID:21437001

  7. Epidemiology of Crohn's Disease.

    PubMed

    Sandler, R S; Golden, A L

    1986-04-01

    Although our current understanding is limited, epidemiologic investigation of Crohn's disease holds great promise. Certain aspects of the epidemiology are clear. The incidence of Crohn's disease, which has increased over the past few decades, may have reached a plateau. The disease has its peak onset in early life, with a second peak among the elderly. It is more common in the developed countries and among Jews. Whether the disease is related to occupation, social class, marital status, stress, infection, diet, smoking, and oral contraceptives is less certain. This paper reviews the epidemiology of Crohn's disease and proposes areas in which further research is needed.

  8. A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits

    NASA Astrophysics Data System (ADS)

    Moradi, Behzad; Mirzaei, Abdolreza

    2016-11-01

    A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design of CMOS analog circuits is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated in the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that makes a distinction between high- and low-fitness areas in the design space. The learning process can detect the right directions of the evolution and lead to large steps in the evolution of the individuals. The learning phase shortens the evolution process and achieves a remarkable reduction in the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process in order to reduce the design space as well as the design time. Circuit evaluation is performed by the HSPICE simulator. In order to improve design accuracy, the bsim3v3 CMOS transistor model is adopted in the proposed design method. The proposed design method is tested on three different operational amplifier circuits, and its performance is verified by comparing it with the evolutionary strategy algorithm and other similar methods.
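    The core LEM loop summarized above can be sketched in miniature: split the population into high- and low-fitness groups, learn a description of the high group, and sample new candidates inside it. In this toy sketch the learned description is a simple axis-aligned box standing in for the decision-tree step, and the objective is an invented placeholder, not an HSPICE circuit evaluation.

    ```python
    import random

    def objective(x):
        # Placeholder "circuit fitness" to be minimized (invented target point);
        # in the paper this role is played by HSPICE simulation results.
        return sum((xi - 0.7) ** 2 for xi in x)

    def lem_step(pop, frac=0.3):
        pop = sorted(pop, key=objective)
        k = max(2, int(len(pop) * frac))
        high = pop[:k]  # high-fitness (low-objective) group
        dims = range(len(pop[0]))
        # "Learned" description of the high-fitness region: a bounding box
        lo = [min(ind[d] for ind in high) for d in dims]
        hi = [max(ind[d] for ind in high) for d in dims]
        # Generate the rest of the population inside the learned region
        new = [[random.uniform(l, h) for l, h in zip(lo, hi)]
               for _ in range(len(pop) - k)]
        return high + new

    random.seed(1)
    pop = [[random.uniform(0, 1) for _ in range(3)] for _ in range(40)]
    init_best = min(objective(ind) for ind in pop)
    for _ in range(15):
        pop = lem_step(pop)
    best = min(objective(ind) for ind in pop)
    ```

    Because the high-fitness group is carried over unchanged, the best objective value never worsens, which is the property that lets LEM take the "high steps" in evolution the abstract describes.
    
    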

  9. Matching wind turbine rotors and loads: computational methods for designers

    SciTech Connect

    Seale, J.B.

    1983-04-01

    This report provides a comprehensive method for matching wind energy conversion system (WECS) rotors with the load characteristics of common electrical and mechanical applications. The user must supply: (1) turbine aerodynamic efficiency as a function of tipspeed ratio; (2) mechanical load torque as a function of rotation speed; (3) useful delivered power as a function of incoming mechanical power; (4) site average windspeed and, for maximum accuracy, distribution data. The description of the data includes governing limits consistent with the capacities of components. The report develops a step-by-step method for converting the data into useful results: (1) from turbine efficiency and load torque characteristics, turbine power is predicted as a function of windspeed; (2) a decision is made how turbine power is to be governed (it may self-govern) to insure safety of all components; (3) mechanical conversion efficiency comes into play to predict how useful delivered power varies with windspeed; (4) wind statistics come into play to predict longterm energy output. Most systems can be approximated by a graph-and-calculator approach: Computer-generated families of coefficient curves provide data for algebraic scaling formulas. The method leads not only to energy predictions, but also to insight into the processes being modeled. Direct use of a computer program provides more sophisticated calculations where a highly unusual system is to be modeled, where accuracy is at a premium, or where error analysis is required. The analysis is fleshed out with in-depth case studies for induction generator and inverter utility systems; battery chargers; resistance heaters; positive displacement pumps, including three different load-compensation strategies; and centrifugal pumps with unregulated electric power transmission from turbine to pump.
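    Step (1) of the method, predicting turbine power versus windspeed from the aerodynamic efficiency curve, can be sketched for the simple fixed-speed case (e.g. an induction generator). The rotor size, rotation speed, and Cp table below are invented for illustration and are not taken from the report.

    ```python
    RHO = 1.225   # air density, kg/m^3
    R = 5.0       # rotor radius, m (assumed)
    OMEGA = 6.0   # fixed rotation speed, rad/s (assumed induction generator)

    # Assumed aerodynamic efficiency Cp as a function of tip-speed ratio
    CP_TABLE = [(2, 0.10), (4, 0.30), (6, 0.42), (8, 0.40), (10, 0.25), (12, 0.05)]

    def cp(tsr):
        """Linear interpolation of the Cp(tip-speed ratio) table."""
        if tsr <= CP_TABLE[0][0] or tsr >= CP_TABLE[-1][0]:
            return 0.0
        for (x0, y0), (x1, y1) in zip(CP_TABLE, CP_TABLE[1:]):
            if x0 <= tsr <= x1:
                return y0 + (y1 - y0) * (tsr - x0) / (x1 - x0)

    def turbine_power(v):
        """Mechanical power (W) at windspeed v (m/s) for a fixed-speed rotor."""
        tsr = OMEGA * R / v
        area = 3.14159265 * R ** 2
        return 0.5 * RHO * area * cp(tsr) * v ** 3
    ```

    Steps (2)-(4) would then impose the governing limits, apply the load's conversion-efficiency curve, and integrate over the site's windspeed distribution to get long-term energy.
    
    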

  10. Design of a Password-Based EAP Method

    NASA Astrophysics Data System (ADS)

    Manganaro, Andrea; Koblensky, Mingyur; Loreti, Michele

    In recent years, amendments to IEEE standards for wireless networks added support for authentication algorithms based on the Extensible Authentication Protocol (EAP). Available solutions generally use digital certificates or pre-shared keys but the management of the resulting implementations is complex or unlikely to be scalable. In this paper we present EAP-SRP-256, an authentication method proposal that relies on the SRP-6 protocol and provides a strong password-based authentication mechanism. It is intended to meet the IETF security and key management requirements for wireless networks.

  11. Defining Requirements and Related Methods for Designing Sensorized Garments

    PubMed Central

    Andreoni, Giuseppe; Standoli, Carlo Emilio; Perego, Paolo

    2016-01-01

    Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job conditions analysis, etc. This paper aims to discuss some user, textile, and technical issues to be faced in sensorized clothes development. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetical. In terms of these requirements, the user’s age, the target application, and fashion trends cannot be ignored, because they determine the compliance with the wearable system. Regarding textile requirements, functional factors—also influencing user comfort—are elasticity and washability, while more technical properties are the stability of the chemical agents’ effects for preserving the sensors’ efficacy and reliability, and assuring the proper duration of the product for the complete life cycle. From the technical side, the physiological issues are the most important: skin conductance, tolerance, irritation, and the effect of sweat and perspiration are key factors for reliable sensing. Other technical features such as battery size and duration, and the form factor of the sensor collector, should be considered, as they affect aesthetical requirements, which have proven to be crucial, as well as comfort and wearability. PMID:27240361

  12. Simplified tornado depressurization design methods for nuclear power plants

    SciTech Connect

    Howard, N.M.; Krasnopoler, M.I.

    1983-05-01

    A simplified approach for the calculation of tornado depressurization effects on nuclear power plant structures and components is based on a generic computer depressurization analysis for an arbitrary single volume V connected to the atmosphere by an effective vent area A. For a given tornado depressurization transient, the maximum depressurization ΔP of the volume was found to depend on the parameter V/A. The relation between ΔP and V/A can be represented by a single monotonically increasing curve for each of the three design-basis tornadoes described in the U.S. Nuclear Regulatory Commission's Regulatory Guide 1.76. These curves can be applied to most multiple-volume nuclear power plant structures by considering each volume and its controlling vent area. Where several possible flow areas could be controlling, the maximum value of V/A can be used to estimate a conservative value for ΔP. This simplified approach was shown to yield reasonably conservative results when compared to detailed computer calculations of moderately complex geometries. Treatment of severely complicated geometries, heating and ventilation systems, and multiple blowout panel arrangements were found to be beyond the limitations of the simplified analysis.
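    The screening procedure described in this record reduces to: compute V/A for each volume, take the controlling (maximum) value where several vent areas are possible, and read ΔP off the monotone curve for the applicable design-basis tornado. The curve points below are placeholders for the shape of such a curve, not data from Regulatory Guide 1.76.

    ```python
    # Placeholder monotone dP(V/A) curve for one design-basis tornado
    # (dP in psi, V/A in m); illustrative values only.
    CURVE = [(0.0, 0.0), (10.0, 0.5), (50.0, 1.5), (200.0, 2.5), (1000.0, 3.0)]

    def delta_p(v_over_a):
        """Interpolate the (assumed) monotone dP vs. V/A curve."""
        if v_over_a >= CURVE[-1][0]:
            return CURVE[-1][1]
        for (x0, y0), (x1, y1) in zip(CURVE, CURVE[1:]):
            if x0 <= v_over_a <= x1:
                return y0 + (y1 - y0) * (v_over_a - x0) / (x1 - x0)

    def controlling_dp(volumes):
        """volumes: list of (V in m^3, list of candidate vent areas in m^2).
        Uses the largest V/A per volume for a conservative estimate."""
        worst = max(v / min(areas) for v, areas in volumes)
        return delta_p(worst)
    ```

    Using the smallest candidate vent area maximizes V/A, which on a monotonically increasing curve yields the conservative (largest) ΔP, matching the guidance in the abstract.
    
    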

  13. Defining Requirements and Related Methods for Designing Sensorized Garments.

    PubMed

    Andreoni, Giuseppe; Standoli, Carlo Emilio; Perego, Paolo

    2016-01-01

    Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job conditions analysis, etc. This paper aims to discuss some user, textile, and technical issues to be faced in sensorized clothes development. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetical. In terms of these requirements, the user's age, the target application, and fashion trends cannot be ignored, because they determine the compliance with the wearable system. Regarding textile requirements, functional factors-also influencing user comfort-are elasticity and washability, while more technical properties are the stability of the chemical agents' effects for preserving the sensors' efficacy and reliability, and assuring the proper duration of the product for the complete life cycle. From the technical side, the physiological issues are the most important: skin conductance, tolerance, irritation, and the effect of sweat and perspiration are key factors for reliable sensing. Other technical features such as battery size and duration, and the form factor of the sensor collector, should be considered, as they affect aesthetical requirements, which have proven to be crucial, as well as comfort and wearability. PMID:27240361

  14. NMR quantum computing: applying theoretical methods to designing enhanced systems.

    PubMed

    Mawhinney, Robert C; Schreckenbach, Georg

    2004-10-01

    Density functional theory results for chemical shifts and spin-spin coupling constants are presented for compounds currently used in NMR quantum computing experiments. Specific design criteria were examined and numerical guidelines were assessed. Using a field strength of 7.0 T, protons require a coupling constant of 4 Hz with a chemical shift separation of 0.3 ppm, whereas carbon needs a coupling constant of 25 Hz for a chemical shift difference of 10 ppm, based on the minimal coupling approximation. Using these guidelines, it was determined that 2,3-dibromothiophene is limited to only two qubits; the three qubit system bromotrifluoroethene could be expanded to five qubits and the three qubit system 2,3-dibromopropanoic acid could also be used as a six qubit system. An examination of substituent effects showed that judiciously choosing specific groups could increase the number of available qubits by removing rotational degeneracies in addition to introducing specific conformational preferences that could increase (or decrease) the magnitude of the couplings. The introduction of one site of unsaturation can lead to a marked improvement in spectroscopic properties, even increasing the number of active nuclei.
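    The numerical guidelines quoted in this record (at 7.0 T: a 4 Hz coupling with 0.3 ppm separation for protons, a 25 Hz coupling with 10 ppm separation for carbon) amount to a pairwise check on candidate qubit nuclei. A minimal sketch of that check, with invented molecule data:

    ```python
    # Guideline thresholds from the record: nucleus -> (min |J| in Hz,
    # min chemical-shift separation in ppm), valid at 7.0 T.
    GUIDELINES = {
        "1H": (4.0, 0.3),
        "13C": (25.0, 10.0),
    }

    def usable_pair(nucleus, shift_a, shift_b, j_coupling):
        """True if two nuclei are resolvable and sufficiently coupled."""
        min_j, min_sep = GUIDELINES[nucleus]
        return abs(shift_a - shift_b) >= min_sep and abs(j_coupling) >= min_j

    def count_qubits(nucleus, shifts, j):
        """shifts: chemical shifts in ppm; j: dict {(i, k): Hz} with i < k.
        Crude count of nuclei all of whose pairs satisfy the guideline."""
        n = len(shifts)
        ok = [all(usable_pair(nucleus, shifts[i], shifts[k],
                              j.get((min(i, k), max(i, k)), 0.0))
                  for k in range(n) if k != i)
              for i in range(n)]
        return sum(ok)
    ```

    This all-pairs criterion corresponds to the minimal coupling approximation the record invokes; a finer analysis would also account for rotational degeneracies and conformational effects, which the substituent study addresses.
    
    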

  15. A Computational Method for Materials Design of New Interfaces

    NASA Astrophysics Data System (ADS)

    Kaminski, Jakub; Ratsch, Christian; Weber, Justin; Haverty, Michael; Shankar, Sadasivan

    2015-03-01

    We propose a novel computational approach to explore the broad configurational space of possible interfaces formed from known crystal structures to find new heterostructure materials with potentially interesting properties. In a series of steps with increasing complexity and accuracy, the vast number of possible combinations is narrowed down to a limited set of the most promising and chemically compatible candidates. This systematic screening encompasses (i) establishing the geometrical compatibility along multiple crystallographic orientations of two materials, (ii) simple functions eliminating configurations with unfavorable interatomic steric conflicts, (iii) application of empirical and semi-empirical potentials estimating approximate energetics and structures, and (iv) use of DFT-based quantum-chemical methods to ascertain the final optimal geometry and stability of the interface in question. For efficient high-throughput screening we have developed a new method to calculate surface energies, which allows for fast and systematic treatment of materials terminated with non-polar surfaces. We show that our approach leads to a maximum error of around 3% relative to the exact reference. Representative results from our search protocol will be presented for selected materials, including semiconductors and oxides.
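
    The staged screening (i)-(iv) can be sketched as a funnel of increasingly expensive filters applied in order. Everything concrete below, the single lattice-mismatch stage, its 4% tolerance, and all names, is an illustrative assumption, not the authors' code:

```python
# A candidate interface is represented here simply as a pair of lattice
# constants; a real pipeline would carry full structures through each stage.
def lattice_mismatch_ok(pair, tol=0.04):
    a1, a2 = pair                      # lattice constants of the two materials
    return abs(a1 - a2) / min(a1, a2) <= tol

def screen(candidates, stages):
    """Apply increasingly expensive filters in order, keeping survivors."""
    for stage in stages:
        candidates = [c for c in candidates if stage(c)]
    return candidates

# Cheap geometric stages run over everything; costly DFT-level stages would
# only ever see the few survivors of the earlier filters.
survivors = screen([(5.43, 5.50), (5.43, 4.05)], [lattice_mismatch_ok])
```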

  16. The Nicaraguan pediatric dengue cohort study: study design, methods, use of information technology, and extension to other infectious diseases.

    PubMed

    Kuan, Guillermina; Gordon, Aubree; Avilés, William; Ortega, Oscar; Hammond, Samantha N; Elizondo, Douglas; Nuñez, Andrea; Coloma, Josefina; Balmaseda, Angel; Harris, Eva

    2009-07-01

    Dengue is a mosquito-borne viral disease that is a major public health problem worldwide. In 2004, the Pediatric Dengue Cohort Study was established in Managua, Nicaragua, to study the natural history and transmission of dengue in children. Here, the authors describe the study design, methods, and results from 2004 to 2008. Initially, 3,721 children 2-9 years of age were recruited through door-to-door visits. Each year, new children aged 2 years are enrolled in the study to maintain the age structure. Children are provided with medical care through the study, and data from each medical visit are recorded on systematic study forms. All participants presenting with suspected dengue or undifferentiated fever are tested for dengue by virologic, serologic, and molecular biologic assays. Yearly blood samples are collected to detect inapparent dengue virus infections. Numerous information and communications technologies are used to manage study data, track samples, and maintain quality control, including personal digital assistants, barcodes, geographic information systems, and fingerprint scans. Close collaboration with the Nicaraguan Ministry of Health and the use of almost entirely local staff are essential components for success. This study is providing critical data on the epidemiology and transmission of dengue in the Americas needed for future vaccine trials.

  17. "Epidemiological criminology": coming full circle.

    PubMed

    Akers, Timothy A; Lanier, Mark M

    2009-03-01

    Members of the public health and criminal justice disciplines often work with marginalized populations: people at high risk of drug use, health problems, incarceration, and other difficulties. As these fields increasingly overlap, distinctions between them are blurred, as numerous research reports and funding trends document. However, explicit theoretical and methodological linkages between the 2 disciplines remain rare. A new paradigm that links methods and statistical models of public health with those of their criminal justice counterparts is needed, as are increased linkages between epidemiological analogies, theories, and models and the corresponding tools of criminology. We outline disciplinary commonalities and distinctions, present policy examples that integrate similarities, and propose "epidemiological criminology" as a bridging framework.

  18. A Comparison of Five Statistical Methods for Analyzing Pretest-Posttest Designs.

    ERIC Educational Resources Information Center

    Hendrix, Leland J.; And Others

    1978-01-01

    Five methods for analyzing data from pretest-posttest research designs are discussed. Analysis of gain scores with the pretest as a covariate is indicated as a superior method when the assumptions underlying covariance analysis are met. (Author/GDC)
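
    The recommended analysis, gain or posttest scores with the pretest as a covariate, is ordinary ANCOVA. A minimal pure-Python sketch on noise-free toy data (all names and numbers are our own illustration, not from the ERIC record):

```python
def ols(X, y):
    """Least squares via the normal equations (X'X) b = X'y, Gauss-Jordan."""
    n, p = len(X), len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)] for j in range(p)]
    b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    for j in range(p):
        piv = A[j][j]
        A[j] = [v / piv for v in A[j]]
        b[j] /= piv
        for i in range(p):
            if i != j:
                f = A[i][j]
                A[i] = [vi - f * vj for vi, vj in zip(A[i], A[j])]
                b[i] -= f * b[j]
    return b

pre   = [1.0, 2.0, 3.0, 4.0]
group = [0.0, 0.0, 1.0, 1.0]          # 0 = control, 1 = treatment
post  = [2 + 0.5 * p + 3 * g for p, g in zip(pre, group)]  # noise-free toy data
X = [[1.0, p, g] for p, g in zip(pre, group)]
intercept, b_pre, b_group = ols(X, post)  # b_group is the adjusted treatment effect
```

    Regressing gain scores (post minus pre) on the same covariates yields the identical group coefficient, which is why the two formulations are often described interchangeably.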

  19. Modified Fully Utilized Design (MFUD) Method for Stress and Displacement Constraints

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya; Gendy, Atef; Berke, Laszlo; Hopkins, Dale

    1997-01-01

    The traditional fully stressed method performs satisfactorily for stress-limited structural design. When this method is extended to include displacement limitations in addition to stress constraints, it is known as the fully utilized design (FUD). Typically, the FUD produces an overdesign, which is the primary limitation of this otherwise elegant method. We have modified FUD in an attempt to alleviate the limitation. This new method, called the modified fully utilized design (MFUD) method, has been tested successfully on a number of designs that were subjected to multiple loads and had both stress and displacement constraints. The solutions obtained with MFUD compare favorably with the optimum results that can be generated by using nonlinear mathematical programming techniques. The MFUD method appears to have alleviated the overdesign condition and offers the simplicity of a direct, fully stressed type of design method that is distinctly different from optimization and optimality criteria formulations. The MFUD method is being developed for practicing engineers who favor traditional design methods rather than methods based on advanced calculus and nonlinear mathematical programming techniques. The Integrated Force Method (IFM) was found to be the appropriate analysis tool in the development of the MFUD method. In this paper, the MFUD method and its optimality are presented along with a number of illustrative examples.

  20. Design of a Variational Multiscale Method for Turbulent Compressible Flows

    NASA Technical Reports Server (NTRS)

    Diosady, Laslo Tibor; Murman, Scott M.

    2013-01-01

    A spectral-element framework is presented for the simulation of subsonic compressible high-Reynolds-number flows. The focus of the work is maximizing the efficiency of the computational schemes to enable unsteady simulations with a large number of spatial and temporal degrees of freedom. A collocation scheme is combined with optimized computational kernels to provide a residual evaluation with computational cost independent of order of accuracy up to 16th order. The optimized residual routines are used to develop a low-memory implicit scheme based on a matrix-free Newton-Krylov method. A preconditioner based on the finite-difference diagonalized ADI scheme is developed which maintains the low memory of the matrix-free implicit solver, while providing improved convergence properties. Emphasis on low memory usage throughout the solver development is leveraged to implement a coupled space-time DG solver which may offer further efficiency gains through adaptivity in both space and time.

  1. Epidemiology of Bluetongue in India.

    PubMed

    Rao, P P; Hegde, N R; Reddy, Y N; Krishnajyothi, Y; Reddy, Y V; Susmitha, B; Gollapalli, S R; Putty, K; Reddy, G H

    2016-04-01

    Bluetongue (BT) is an insect-borne endemic disease in India. Although infections are observed in domestic and wild ruminants, clinical disease and mortality are observed only in sheep, especially in the southern states of the country. The difference in disease patterns in different parts of the country could be due to varied climatic conditions, sheep population density and the susceptibility of sheep breeds to BT. Over the five decades since the first report of BT in 1964, most of the known serotypes of bluetongue virus (BTV) have been reported from India, either by virus isolation or by detection of serotype-specific antibodies. There have been no structured longitudinal studies to identify the circulating serotypes throughout the country. At least ten serotypes were isolated between 1967 and 2000 (BTV-1-4, 6, 9, 16-18, 23). Since 2001, the All-India Network Programme on Bluetongue and other laboratories have isolated eight different serotypes (BTV-1-3, 9, 10, 12, 16, 21). Genetic analysis of these viruses has revealed that some of them vary substantially from reference viruses, and some show high sequence identity with modified live virus vaccines used in different parts of the world. These observations have highlighted the need to develop diagnostic capabilities, especially as BT outbreaks are still declared on the basis of clinical signs. Although virus isolation and serotyping are the gold standards, rapid methods based on the detection of viral nucleic acid may be more suitable for India. The epidemiological investigations also have implications for vaccine design: although only a handful of serotypes may be involved in causing outbreaks in a given year, the combination of serotypes may change from year to year. For effective control of BT in India, it may be pertinent to introduce sentinel and vector-trap systems to identify the circulating serotypes and to evaluate herd immunity against different serotypes, so that the relevant strains can be included in vaccines.

  3. Epidemiology of Substance Use Disorders

    PubMed Central

    Merikangas, Kathleen R.; McClair, Vetisha L.

    2013-01-01

    Epidemiological studies of substance use and substance use disorders (SUDs) have provided an abundance of data on the patterns of substance use in nationally representative samples across the world (Degenhardt et al. 2008; Johnston et al. 2011; SAMHSA 2011). This paper presents a summary of the goals, methods and recent findings on the epidemiology of substance use and disorders in the general population of adults and adolescents, and describes the methods and findings on the genetic epidemiology of drug use disorders. The high 12-month prevalence rates of substance dependence in U.S. adults (about 12% for alcohol and 2-3% for illicit drugs) approximate those of other mental disorders as well as chronic physical disorders with major public health impact. New findings from nationally representative samples of U.S. youth reveal that the lifetime prevalence of alcohol use disorders is approximately 8% and that of illicit drug use disorders is 2-3% (Merikangas et al. 2010; Swendsen et al., in press; SAMHSA 2011). The striking increase in prevalence rates from ages 13 to 18 highlights adolescence as the key period for the development of substance use disorders. Genetic epidemiological studies have consistently demonstrated that genetic factors have a major influence on the progression of substance use to dependence, whereas environmental factors unique to the individual play an important role in exposure to and initial use of substances. Identification of specific susceptibility genes and environmental factors that influence exposure and progression of drug use may enhance our ability to prevent and treat substance use disorders. PMID:22543841

  4. The epidemiology of recurrent pregnancy loss.

    PubMed

    Cramer, D W; Wise, L A

    2000-01-01

    In reviewing the epidemiology of recurrent abortion (RAB), we believe it is necessary to consider the epidemiology of spontaneous abortion (SAB) as well, since it is clear that even a single pregnancy loss increases the risk of a subsequent abortion. In addition, any attempt to identify epidemiologic risk factors for SAB or RAB must deal with the fact that at least 50% of SABs are associated with genetic abnormalities. Given that most epidemiologic studies have not distinguished karyotypically abnormal abortuses, risk factors are likely to be underestimated. Nevertheless, there is fair agreement that a variety of factors may increase the risk of SAB or RAB, including advanced maternal age, single-gene mutations such as PKU or G6PD deficiency, structural abnormalities of the uterus, poorly controlled diabetes, antiphospholipid syndrome, and smoking. More controversial is the role of luteal phase defect or hyperandrogenism, alloimmune factors, genital infections, caffeine or alcohol use, and trace element or chemical exposure from tap water or in the workplace. Besides better-designed epidemiologic studies to detect modifiable risk factors for SAB or RAB, there is a clear need for clinical trials of therapy for RAB that meet minimum epidemiologic standards, including randomization, double-blinding (when possible), and placebo control (when ethical).

  6. Epidemiology of Enterocytozoon bieneusi Infection in Humans

    PubMed Central

    Matos, Olga; Lobo, Maria Luisa; Xiao, Lihua

    2012-01-01

    A review was conducted of published work on the complex epidemiology of Enterocytozoon bieneusi infection in humans. Studies of the prevalence of this emerging microsporidian pathogen in humans in developed and developing countries, the different clinical spectra of E. bieneusi intestinal infection in children in different settings, and the risk factors associated with E. bieneusi infection are reviewed. This paper also analyses the impact that the recent application of PCR-based molecular methods for species-specific identification and genotype differentiation has had on knowledge of the molecular epidemiology of E. bieneusi in humans. The advances of the last two decades emphasize the importance of epidemiological control and prevention of E. bieneusi infections, from both the veterinary and the human medical perspectives. PMID:23091702

  7. EPIDEMIOLOGY AND EXPOSURE ASSESSMENT

    EPA Science Inventory

    Research collaborations between the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL) centered on the development and application of exposure analysis tools in environmental epidemiology include the El Paso...

  8. Epidemiology & Genomics Research Program

    Cancer.gov

    The Epidemiology and Genomics Research Program, in the National Cancer Institute's Division of Cancer Control and Population Sciences, funds research in human populations to understand the determinants of cancer occurrence and outcomes.

  9. Epidemiology of Lice

    ERIC Educational Resources Information Center

    Juranek, Dennis D.

    1977-01-01

    Research into the epidemiology of lice indicates that infestation is uncommon in blacks, more common in females than males, significantly higher in low income groups, and transmission is by way of articles of clothing. (JD)

  10. Cancer Epidemiology Cohorts

    Cancer.gov

    Cohort studies are fundamental for epidemiological research by helping researchers better understand the etiology of cancer and provide insights into the key determinants of this disease and its outcomes.

  11. Matching Learning Style Preferences with Suitable Delivery Methods on Textile Design Programmes

    ERIC Educational Resources Information Center

    Sayer, Kate; Studd, Rachel

    2006-01-01

    Textile design is a subject that encompasses both design and technology; aesthetically pleasing patterns and forms must be set within technical parameters to create successful fabrics. When considering education methods in design programmes, identifying the most relevant learning approach is key to creating future successes. Yet are the most…

  12. Assessing Adaptive Instructional Design Tools and Methods in ADAPT[IT].

    ERIC Educational Resources Information Center

    Eseryel, Deniz; Spector, J. Michael

    ADAPT[IT] (Advanced Design Approach for Personalized Training - Interactive Tools) is a European project within the Information Society Technologies program that is providing design methods and tools to guide a training designer according to the latest cognitive science and standardization principles. ADAPT[IT] addresses users in two significantly…

  13. Co-Designing and Co-Teaching Graduate Qualitative Methods: An Innovative Ethnographic Workshop Model

    ERIC Educational Resources Information Center

    Cordner, Alissa; Klein, Peter T.; Baiocchi, Gianpaolo

    2012-01-01

    This article describes an innovative collaboration between graduate students and a faculty member to co-design and co-teach a graduate-level workshop-style qualitative methods course. The goal of co-designing and co-teaching the course was to involve advanced graduate students in all aspects of designing a syllabus and leading class discussions in…

  14. Application of the MNA design method to a nonlinear turbofan engine. [multivariable Nyquist array method

    NASA Technical Reports Server (NTRS)

    Leininger, G. G.

    1981-01-01

    Using nonlinear digital simulation as a representative model of the dynamic operation of the QCSEE turbofan engine, a feedback control system is designed by variable frequency design techniques. Transfer functions are generated for each of five power level settings covering the range of operation from approach power to full throttle (62.5% to 100% full power). These transfer functions are then used by an interactive control system design synthesis program to provide a closed loop feedback control using the multivariable Nyquist array and extensions to multivariable Bode diagrams and Nichols charts.

  15. How to make epidemiological training infectious.

    PubMed

    Bellan, Steve E; Pulliam, Juliet R C; Scott, James C; Dushoff, Jonathan

    2012-01-01

    Modern infectious disease epidemiology builds on two independently developed fields: classical epidemiology and dynamical epidemiology. Over the past decade, integration of the two fields has increased in research practice, but training options within the fields remain distinct, with few opportunities for integration in the classroom. The annual Clinic on the Meaningful Modeling of Epidemiological Data (MMED) at the African Institute for Mathematical Sciences has begun to address this gap. MMED offers participants exposure to a broad range of concepts and techniques from both epidemiological traditions. During MMED 2010 we developed a pedagogical approach that bridges the traditional distinction between classical and dynamical epidemiology and can be used at multiple educational levels, from high school to graduate-level courses. The approach is hands-on, consisting of a real-time simulation of a stochastic outbreak among course participants, including realistic data reporting, followed by a variety of mathematical and statistical analyses stemming from both epidemiological traditions. During the exercise, dynamical epidemiologists developed empirical skills such as study design and learned concepts of bias, while classical epidemiologists were trained in systems thinking and began to understand epidemics as dynamic nonlinear processes. We believe this type of integrated educational tool will prove extremely valuable in the training of future infectious disease epidemiologists. We also believe that such interdisciplinary training will be critical for local capacity building in analytical epidemiology as Africa continues to produce new cohorts of well-trained mathematicians, statisticians, and scientists. And because the lessons draw on skills and concepts from many fields in biology, from pathogen biology, the evolutionary dynamics of host-pathogen interactions, and the ecology of infectious disease to bioinformatics, computational biology, and statistics, this exercise…
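
    A stochastic outbreak of the kind simulated in the MMED exercise can be sketched with a discrete-time chain-binomial SIR model. The parameter values and names below are illustrative assumptions, not the clinic's actual teaching materials:

```python
import random

def chain_binomial_sir(n, i0, beta, gamma, rng, max_steps=200):
    """Discrete-time stochastic SIR: each susceptible escapes infection with
    probability (1 - beta/n)**I per step; each infected recovers with
    probability gamma.  Returns the (S, I, R) trajectory."""
    s, i, r = n - i0, i0, 0
    series = [(s, i, r)]
    for _ in range(max_steps):
        if i == 0:
            break
        p_inf = 1.0 - (1.0 - beta / n) ** i
        new_inf = sum(rng.random() < p_inf for _ in range(s))
        new_rec = sum(rng.random() < gamma for _ in range(i))
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        series.append((s, i, r))
    return series

outbreak = chain_binomial_sir(n=100, i0=5, beta=0.4, gamma=0.2,
                              rng=random.Random(42))
```

    The resulting trajectory can then be "reported" to participants as case counts and analyzed with tools from either tradition, e.g. estimating the growth rate classically or fitting the dynamic model back to the data.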

  16. Molecular Epidemiology of Tuberculosis: Current Insights

    PubMed Central

    Mathema, Barun; Kurepina, Natalia E.; Bifani, Pablo J.; Kreiswirth, Barry N.

    2006-01-01

    Molecular epidemiologic studies of tuberculosis (TB) have focused largely on utilizing molecular techniques to address short- and long-term epidemiologic questions, such as in outbreak investigations and in assessing the global dissemination of strains, respectively. This is done primarily by examining the extent of genetic diversity of clinical strains of Mycobacterium tuberculosis. When molecular methods are used in conjunction with classical epidemiology, their utility for TB control has been realized. For instance, molecular epidemiologic studies have added much-needed accuracy and precision in describing transmission dynamics, and they have facilitated investigation of previously unresolved issues, such as estimates of recent versus reactivation disease and the extent of exogenous reinfection. In addition, there is mounting evidence to suggest that specific strains of M. tuberculosis belonging to discrete phylogenetic clusters (lineages) may differ in virulence, pathogenesis, and epidemiologic characteristics, all of which may significantly impact TB control and vaccine development strategies. Here, we review the current methods, concepts, and applications of molecular approaches used to better understand the epidemiology of TB. PMID:17041139
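
    One standard way genotype clustering feeds such estimates is the classic "n-1" method, which treats each cluster of identical genotypes as one index case plus (size - 1) recent transmissions. This is a textbook estimator offered as illustration, not necessarily the one used in the studies reviewed:

```python
from collections import Counter

def recent_transmission_n_minus_1(genotypes):
    """Proportion of isolates attributable to recent transmission under the
    n-1 assumption: each cluster contributes (cluster size - 1) cases."""
    counts = Counter(genotypes)
    clustered_minus_index = sum(c - 1 for c in counts.values() if c > 1)
    return clustered_minus_index / len(genotypes)
```

    For example, five isolates with genotypes A, A, A, B, C form one cluster of three, giving an estimate of (3 - 1) / 5 = 0.4.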

  17. The Use of Qsar and Computational Methods in Drug Design

    NASA Astrophysics Data System (ADS)

    Bajot, Fania

    The application of quantitative structure-activity relationships (QSARs) has significantly impacted the paradigm of drug discovery. Following the successful utilization of linear solvation free-energy relationships (LSERs), numerous 2D- and 3D-QSAR methods have been developed, most of them based on descriptors for hydrophobicity, polarizability, ionic interactions, and hydrogen bonding. QSAR models allow for the calculation of physicochemical properties (e.g., lipophilicity), the prediction of biological activity (or toxicity), and the evaluation of absorption, distribution, metabolism, and excretion (ADME). In pharmaceutical research, QSAR is of particular interest in the preclinical stages of drug discovery, where it can replace tedious and costly experimentation, filter large chemical databases, and aid the selection of drug candidates. However, to be part of drug discovery and development strategies, QSARs need to meet certain criteria (e.g., sufficient predictivity). This chapter describes the foundation of modern QSAR in drug discovery and presents some current challenges and applications for the discovery and optimization of drug candidates.

  18. Unique Method for Generating Design Earthquake Time History Seeds

    SciTech Connect

    R. E. Spears

    2008-07-01

    A method has been developed which takes a single seed earthquake time history and produces multiple similar seed earthquake time histories. These new time histories possess important frequency and cumulative-energy attributes of the original while having a correlation of less than 30% (per ASCE/SEI 43-05 Section 2.4 [1]). They are produced by taking the fast Fourier transform of the original seed. The averaged amplitudes are then paired with random phase angles, and the inverse fast Fourier transform is taken to produce a new time history. The average amplitude through time is then adjusted to encourage a similar cumulative energy curve. Next, the displacement is modified to approximate the original curve using Fourier techniques. Finally, the correlation is checked to ensure it is less than 30%. This process does not guarantee that the correlation will be less than 30% for every curve in a given set of new curves, but it does provide a simple tool with which a few additional iterations should produce a set of seed earthquake time histories meeting the correlation criterion.
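
    The core amplitude-keeping, phase-randomising step can be sketched as follows, using a naive O(n^2) DFT for clarity (a production implementation would use an FFT library). The cumulative-energy and displacement adjustments described above are omitted, and all names are our own:

```python
import cmath
import random

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [(sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n).real
            for t in range(n)]

def generate_seed(history, rng):
    """Keep the Fourier amplitudes of `history` but randomise the phases
    (conjugate-symmetrically, so the inverse transform is real)."""
    n = len(history)
    amps = [abs(c) for c in dft(history)]
    X = [0j] * n
    X[0] = amps[0]                         # DC bin stays real
    for k in range(1, n // 2 + 1):
        phi = rng.uniform(0.0, 2.0 * cmath.pi)
        X[k] = amps[k] * cmath.exp(1j * phi)
        X[-k] = X[k].conjugate()           # Hermitian symmetry
    if n % 2 == 0:
        X[n // 2] = amps[n // 2]           # Nyquist bin must be real
    return idft(X)
```

    In practice one would regenerate until the cross-correlation with the original seed (and with previously accepted seeds) falls below the 30% criterion.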

  19. A Computational Method for Materials Design of Interfaces

    NASA Astrophysics Data System (ADS)

    Kaminski, Jakub; Ratsch, Christian; Shankar, Sadasivan

    2014-03-01

    In the present work we propose a novel computational approach to explore the broad configurational space of possible interfaces formed from known crystal structures to find new heterostructure materials with potentially interesting properties. In a series of steps with increasing complexity and accuracy, the vast number of possible combinations is narrowed down to a limited set of the most promising and chemically compatible candidates. This systematic screening encompasses (i) establishing the geometrical compatibility along multiple crystallographic orientations of two (or more) materials, (ii) simple functions eliminating configurations with unfavorable interatomic steric conflicts, (iii) application of empirical and semi-empirical potentials estimating approximate energetics and structures, and (iv) use of DFT-based quantum-chemical methods to ascertain the final optimal geometry and stability of the interface in question. We also demonstrate the flexibility and efficiency of our approach with respect to the size of the investigated structures and of the search space. Representative results from our search protocol will be presented for selected materials, including semiconductors, transition metal systems, and oxides.

  20. Mixture design and treatment methods for recycling contaminated sediment.

    PubMed

    Wang, Lei; Kwok, June S H; Tsang, Daniel C W; Poon, Chi-Sun

    2015-01-01

    Conventional marine disposal of contaminated sediment presents a significant financial and environmental burden. This study aimed to recycle contaminated sediment by assessing the roles and integration of binder formulation, sediment pretreatment, curing method, and waste inclusion in stabilization/solidification. The results demonstrated that sediment blocks produced with coal fly ash and lime partially replacing cement, at a binder-to-sediment ratio of 3:7, attained sufficient 28-d compressive strength to serve as fill material for construction. X-ray diffraction analysis revealed that hydration products (calcium hydroxide) were difficult to form at high sediment content. Thermal pretreatment of the sediment removed 90% of indigenous organic matter, significantly increased the compressive strength, and enabled reuse as non-load-bearing masonry units. In addition, 2-h CO2 curing accelerated early-stage carbonation inside the porous structure, sequestered 5.6% of CO2 (by weight) in the sediment blocks, and achieved strength comparable to 7-d curing. Thermogravimetric analysis indicated substantial weight loss corresponding to the decomposition of poorly and well crystalline calcium carbonate. Moreover, partial replacement of the contaminated sediment by various granular waste materials notably augmented the strength of the sediment blocks. The metal leachability of the sediment blocks was minimal and acceptable for reuse. These results suggest that contaminated sediment can be viewed as a useful resource.