Science.gov

Sample records for design epidemiological methods

  1. Epidemiological designs for vaccine safety assessment: methods and pitfalls.

    PubMed

    Andrews, Nick

    2012-09-01

    Three commonly used designs for post-licensure vaccine safety assessment are cohort, case-control and self-controlled case series. These methods are often used with routine health databases and immunisation registries. This paper considers the issues that may arise when designing an epidemiological study, such as understanding the vaccine safety question, case definition and finding, limitations of data sources, uncontrolled confounding, and pitfalls that apply to the individual designs. The example of MMR and autism, where all three designs have been used, is presented to help consider these issues. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
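    The self-controlled case series design mentioned above compares each case's event rate in a post-vaccination risk window with the rate in that same person's remaining observation time. A minimal sketch of the crude pooled estimate (all counts hypothetical, not from any cited study):

```python
import math

def sccs_relative_incidence(events_risk, persontime_risk,
                            events_control, persontime_control):
    """Crude self-controlled case-series estimate: the incidence-rate
    ratio of the risk window to the control window, pooled over cases.
    The 95% CI uses the standard log-rate-ratio variance 1/a + 1/b."""
    rr = (events_risk / persontime_risk) / (events_control / persontime_control)
    se = math.sqrt(1 / events_risk + 1 / events_control)
    return rr, (rr * math.exp(-1.96 * se), rr * math.exp(1.96 * se))

# Hypothetical pooled counts: 30 events in 1,000 risk-window days,
# 45 events in 5,000 control days.
rr, ci = sccs_relative_incidence(30, 1000, 45, 5000)
print(f"relative incidence {rr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

    Because each case serves as their own control, fixed confounders (genetics, chronic conditions) cancel out of the comparison; a full analysis would use conditional Poisson regression rather than this pooled ratio.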

  2. The ZInEP Epidemiology Survey: background, design and methods.

    PubMed

    Ajdacic-Gross, Vladeta; Müller, Mario; Rodgers, Stephanie; Warnke, Inge; Hengartner, Michael P; Landolt, Karin; Hagenmuller, Florence; Meier, Magali; Tse, Lee-Ting; Aleksandrowicz, Aleksandra; Passardi, Marco; Knöpfli, Daniel; Schönfelder, Herdis; Eisele, Jochen; Rüsch, Nicolas; Haker, Helene; Kawohl, Wolfram; Rössler, Wulf

    2014-12-01

    This article introduces the design, sampling, field procedures and instruments used in the ZInEP Epidemiology Survey. This survey is one of six ZInEP projects (Zürcher Impulsprogramm zur nachhaltigen Entwicklung der Psychiatrie, i.e. the "Zurich Program for Sustainable Development of Mental Health Services"). It parallels the longitudinal Zurich Study with a sample comparable in age and gender, and with similar methodology, including identical instruments. Thus, it aims to assess changes in the prevalence rates of common mental disorders and in the use of professional help and psychiatric services. Moreover, the current survey widens the spectrum of topics by including sociopsychiatric questionnaires on stigma, stress-related biological measures such as load and cortisol levels, electroencephalographic (EEG) and near-infrared spectroscopy (NIRS) examinations with various paradigms, and sociophysiological tests. The structure of the ZInEP Epidemiology Survey entails four subprojects: a short telephone screening using the SCL-27 (n of nearly 10,000), a comprehensive face-to-face interview based on the SPIKE (Structured Psychopathological Interview and Rating of the Social Consequences for Epidemiology: the main instrument of the Zurich Study) with a stratified sample (n = 1500), tests in the Center for Neurophysiology and Sociophysiology (n = 227), and a prospective study with up to three follow-up interviews and further measures (n = 157). In sum, the four subprojects of the ZInEP Epidemiology Survey deliver a large interdisciplinary database. Copyright © 2014 John Wiley & Sons, Ltd.

  3. The Swedish Longitudinal Gambling Study (Swelogs): design and methods of the epidemiological (EP-) track

    PubMed Central

    Romild, Ulla; Volberg, Rachel; Abbott, Max

    2014-01-01

    Swelogs (Swedish Longitudinal Gambling Study) epidemiological (EP-) track is a prospective study with four waves of data collection among Swedish citizens aged 16–84 years at baseline. The major objectives of this track are to provide general population estimates of the prevalence and incidence of problem and at-risk gambling and enable comparisons with the first Swedish national study on gambling and problem gambling (Swegs) conducted in 1997/1998. The overall study (Swelogs) comprises three tracks of data collection: one epidemiological, one in-depth and one follow-up. It is expected to provide information that will inform the development of evidence-based methods and strategies to prevent the development of gambling problems. This paper gives an overview of the design of the epidemiological track, especially its first two waves. The baseline wave, performed between October 2008 and August 2009, included 8165 subjects, of whom 6021 were re-assessed one year later. A stratified random sampling procedure was applied. Computer-supported telephone interviews were used as the primary method. Postal questionnaires were used to follow-up those not reached by telephone. The response rate was 55% in the first wave and 74% in the second. The interview and questionnaire data are supplemented by register data. © 2014 The Authors. International Journal of Methods in Psychiatric Research published by John Wiley & Sons Ltd. PMID:24942902
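    With a stratified random sample like the one described above, a population prevalence is estimated by weighting each stratum's sample prevalence by that stratum's share of the population. A minimal sketch (strata and counts are hypothetical, not Swelogs data):

```python
def stratified_prevalence(strata):
    """Design-weighted prevalence estimate from a stratified random
    sample. Each stratum supplies (population_size, n_sampled, n_cases);
    weights are the strata's population shares."""
    total_pop = sum(pop for pop, _, _ in strata)
    return sum((pop / total_pop) * (cases / n) for pop, n, cases in strata)

# Hypothetical strata: (population, sampled, problem gamblers found)
strata = [
    (4_000_000, 3000, 60),   # ages 16-44
    (3_000_000, 3000, 45),   # ages 45-64
    (1_000_000, 2000, 10),   # ages 65-84
]
print(f"weighted prevalence: {stratified_prevalence(strata):.3%}")
```

    Weighting corrects for deliberate over-sampling of some strata (e.g. younger respondents); an unweighted mean of the sample would otherwise over-represent them.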

  4. Epidemiological methods: about time.

    PubMed

    Kraemer, Helena Chmura

    2010-01-01

    Epidemiological studies often produce false positive results due to use of statistical approaches that either ignore or distort time. The three time-related issues of focus in this discussion are: (1) cross-sectional vs. cohort studies, (2) statistical significance vs. public health significance, and (3) how risk factors "work together" to impact public health significance. The issue of time should be central to all thinking in epidemiology research, affecting sampling, measurement, design, analysis and, perhaps most important, the interpretation of results that might influence clinical and public-health decision-making and subsequent clinical research.

  5. Overview of the epidemiology methods and applications: strengths and limitations of observational study designs.

    PubMed

    Colditz, Graham A

    2010-01-01

    The impact of study design on the results of medical research has long been an area of both substantial debate and a smaller body of empirical research. Examples come from many disciplines within clinical and public health research. Among the early major contributions in the 1970s was work by Mosteller and colleagues (Gilbert et al., 1997), who noted that innovations in surgery and anesthesia showed greater gains than standard therapy when nonrandomized, controlled trials were evaluated compared with the gains reported in randomized, controlled trials. More recently, we and others have evaluated the impact of design in medical and surgical research, and concluded that the mean gain comparing new therapies to established therapies was biased by study design in nonrandomized trials (Colditz et al., 1989; Miller et al., 1989). Benson and Hartz (2000) conducted a study in which they focused only on studies reported after 1985. On the basis of 136 reports of 19 diverse treatments, Benson and Hartz concluded that in only 2 of the 19 analyses did the combined data from the observational studies lie outside the 95% confidence interval for the combined data from the randomized trials. A similar study drew only on data reported from 1991 to 1995, which showed remarkably similar results among observational studies and randomized, controlled trials (Concato et al., 2000). These more recent data suggest that advancing the study design and analytic methods may reduce bias in some evaluations of medical and public health interventions. Such methods apply not only to the original studies, but also to the approaches that are taken to quantitatively combine results by using meta-analytic approaches such as random effects meta-regression, Bayesian meta-analysis, and the like (Normand, 1999). By focusing attention on thorough data analysis, design issues can be understood and their impact or bias can be estimated, on average, and then ideally accounted for in the interpretation of
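    The random-effects meta-analysis mentioned above can be sketched with the classic DerSimonian-Laird method, which inflates each study's variance by a method-of-moments estimate of between-study heterogeneity. The inputs below are illustrative log odds ratios, not data from the cited studies:

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effects (e.g. log odds ratios) with
    DerSimonian-Laird random-effects weights."""
    w = [1 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1 / sum(w_re))
    return pooled, se, tau2

# Hypothetical log odds ratios and variances from four studies
pooled, se, tau2 = dersimonian_laird([0.8, 0.1, -0.2, 0.5],
                                     [0.02, 0.03, 0.04, 0.05])
print(f"pooled log OR {pooled:.3f} (SE {se:.3f}), tau^2 {tau2:.3f}")
```

    When tau-squared is zero the weights collapse to the fixed-effect (inverse-variance) solution; heterogeneous studies pull the weights toward equality, which is one reason design differences between observational studies and trials matter for pooled estimates.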

  6. Design and implementation of epidemiological field investigation method based on mobile collaboration

    NASA Astrophysics Data System (ADS)

    Zhang, Lihui; Wang, Dongchuan; Huang, Mingxiang; Gong, Jianhua; Fang, Liqun; Cao, Wuchun

    2008-10-01

    With the development of mobile technologies and their integration with spatial information technologies, it has become possible to develop new techno-support solutions for epidemiological field investigation, especially for the disposal of emergent public health events. Based on mobile technologies and a virtual geographic environment, the authors have designed a model for collaborative work in four communication patterns, namely S2S (Static to Static), M2S (Mobile to Static), S2M (Static to Mobile), and M2M (Mobile to Mobile). Building on this model, the paper explores mobile online mapping for mobile collaboration, conducts an experimental case study of HFRS (Hemorrhagic Fever with Renal Syndrome) fieldwork, and then develops a prototype emergent-response disposition information system to test the effectiveness and usefulness of field surveys based on mobile collaboration.

  7. Epidemiologic study of residential proximity to transmission lines and childhood cancer in California: description of design, epidemiologic methods and study population

    PubMed Central

    Kheifets, Leeka; Crespi, Catherine M; Hooper, Chris; Oksuzyan, Sona; Cockburn, Myles; Ly, Thomas; Mezei, Gabor

    2015-01-01

    We conducted a large epidemiologic case-control study in California to examine the association between childhood cancer risk and distance from the home address at birth to the nearest high-voltage overhead transmission line as a replication of the study of Draper et al. in the United Kingdom. We present a detailed description of the study design, methods of case ascertainment, control selection, exposure assessment and data analysis plan. A total of 5788 childhood leukemia cases and 3308 childhood central nervous system cancer cases (included for comparison) and matched controls were available for analysis. Birth and diagnosis addresses of cases and birth addresses of controls were geocoded. Distance from the home to nearby overhead transmission lines was ascertained on the basis of the electric power companies’ geographic information system (GIS) databases, additional Google Earth aerial evaluation and site visits to selected residences. We evaluated distances to power lines up to 2000 m and included consideration of lower voltages (60–69 kV). Distance measures based on GIS and Google Earth evaluation showed close agreement (Pearson correlation >0.99). Our three-tiered approach to exposure assessment allowed us to achieve high specificity, which is crucial for studies of rare diseases with low exposure prevalence. PMID:24045429

  8. Epidemiologic study of residential proximity to transmission lines and childhood cancer in California: description of design, epidemiologic methods and study population.

    PubMed

    Kheifets, Leeka; Crespi, Catherine M; Hooper, Chris; Oksuzyan, Sona; Cockburn, Myles; Ly, Thomas; Mezei, Gabor

    2015-01-01

    We conducted a large epidemiologic case-control study in California to examine the association between childhood cancer risk and distance from the home address at birth to the nearest high-voltage overhead transmission line as a replication of the study of Draper et al. in the United Kingdom. We present a detailed description of the study design, methods of case ascertainment, control selection, exposure assessment and data analysis plan. A total of 5788 childhood leukemia cases and 3308 childhood central nervous system cancer cases (included for comparison) and matched controls were available for analysis. Birth and diagnosis addresses of cases and birth addresses of controls were geocoded. Distance from the home to nearby overhead transmission lines was ascertained on the basis of the electric power companies' geographic information system (GIS) databases, additional Google Earth aerial evaluation and site visits to selected residences. We evaluated distances to power lines up to 2000 m and included consideration of lower voltages (60-69 kV). Distance measures based on GIS and Google Earth evaluation showed close agreement (Pearson correlation >0.99). Our three-tiered approach to exposure assessment allowed us to achieve high specificity, which is crucial for studies of rare diseases with low exposure prevalence.

  9. The INTERPHONE study: design, epidemiological methods, and description of the study population.

    PubMed

    Cardis, Elisabeth; Richardson, Lesley; Deltour, Isabelle; Armstrong, Bruce; Feychting, Maria; Johansen, Christoffer; Kilkenny, Monique; McKinney, Patricia; Modan, Baruch; Sadetzki, Siegal; Schüz, Joachim; Swerdlow, Anthony; Vrijheid, Martine; Auvinen, Anssi; Berg, Gabriele; Blettner, Maria; Bowman, Joseph; Brown, Julianne; Chetrit, Angela; Christensen, Helle Collatz; Cook, Angus; Hepworth, Sarah; Giles, Graham; Hours, Martine; Iavarone, Ivano; Jarus-Hakak, Avital; Klaeboe, Lars; Krewski, Daniel; Lagorio, Susanna; Lönn, Stefan; Mann, Simon; McBride, Mary; Muir, Kenneth; Nadon, Louise; Parent, Marie-Elise; Pearce, Neil; Salminen, Tiina; Schoemaker, Minouk; Schlehofer, Brigitte; Siemiatycki, Jack; Taki, Masao; Takebayashi, Toru; Tynes, Tore; van Tongeren, Martie; Vecchia, Paolo; Wiart, Joe; Woodward, Alistair; Yamaguchi, Naohito

    2007-01-01

    The very rapid worldwide increase in mobile phone use in the last decade has generated considerable interest in the possible health effects of exposure to radio frequency (RF) fields. A multinational case-control study, INTERPHONE, was set up to investigate whether mobile phone use increases the risk of cancer and, more specifically, whether the RF fields emitted by mobile phones are carcinogenic. The study focused on tumours arising in the tissues most exposed to RF fields from mobile phones: glioma, meningioma, acoustic neurinoma and parotid gland tumours. In addition to a detailed history of mobile phone use, information was collected on a number of known and potential risk factors for these tumours. The study was conducted in 13 countries: Australia, Canada, Denmark, Finland, France, Germany, Israel, Italy, Japan, New Zealand, Norway, Sweden, and the UK, using a common core protocol. This paper describes the study design and methods and the main characteristics of the study population. INTERPHONE is the largest case-control study to date investigating risks related to mobile phone use and to other potential risk factors for the tumours of interest and includes 2,765 glioma, 2,425 meningioma, 1,121 acoustic neurinoma, 109 malignant parotid gland tumour cases and 7,658 controls. Particular attention was paid to estimating the amount and direction of potential recall and participation biases and their impact on the study results.

  10. Towards non-conventional methods of designing register-based epidemiological studies: An application to pediatric research.

    PubMed

    Gong, Tong; Brew, Bronwyn; Sjölander, Arvid; Almqvist, Catarina

    2017-07-01

    Various epidemiological designs have been applied to investigate the causes and consequences of fetal growth restriction in register-based observational studies. This review seeks to provide an overview of several conventional designs, including cohort and case-control designs, as well as more recently applied non-conventional designs such as family-based designs. We also discuss some practical points regarding the application and interpretation of family-based designs. Definitions of each design, the study population, the exposure and the outcome measures are briefly summarised. Examples of study designs are taken from the field of low birth-weight research for illustrative purposes. Also examined are relative advantages and disadvantages of each design in terms of assumptions, potential selection and information bias, confounding and generalisability. Kinship data linkage, statistical models and result interpretation are discussed specific to family-based designs. When all information is retrieved from registers, there is no evident preference of the case-control design over the cohort design to estimate odds ratios. All conventional designs included in the review are prone to bias, particularly due to residual confounding. Family-based designs are able to reduce such bias and strengthen causal inference. In the field of low birth-weight research, family-based designs have been able to confirm a negative association not confounded by genetic or shared environmental factors between low birth weight and the risk of asthma. We conclude that there is a broader need for family-based design in observational research as evidenced by the meaningful contributions to the understanding of the potential causal association between low birth weight and subsequent outcomes.

  11. Practical limitations of epidemiologic methods.

    PubMed

    Lilienfeld, A M

    1983-10-01

    Epidemiologic methods can be categorized into demographic studies of mortality and morbidity and observational studies that are either retrospective or prospective. Some of the limitations of demographic studies are illustrated by a review of one specific mortality study showing possible relationship of nuclear fallout to leukemia. Problems of accuracy of diagnosis or causes of death on death certificates, estimates of population, migration from areas of study, and the issue of "ecological fallacy" are discussed. Retrospective studies have such problems as recall of previous environmental exposure, selection bias and survivor bias. In environmental epidemiology, prospective studies have been used. The problems associated with these studies are illustrated by reviewing some of the details of the study of effects of microwave radiation on embassy employees in Moscow. The study population had to be reconstructed, individuals had to be located and information on exposure status had to be obtained by questionnaire. The relatively small size of the exposed group permitted the detection of only fairly large relative risks. Despite these limitations, epidemiologic studies have been remarkably productive in elucidating etiological factors. They are necessary since "the proper study of man is man."

  12. Epidemiology and the scientific method.

    PubMed

    Chalmers, A F

    1982-01-01

    This article refutes the claim that the field of epidemiology and community health would benefit from the application of the scientific method. It is argued that the methods of physics are not appropriate for other disciplines. When applied to the social sciences, positivism is a conservatizing force, causing theory to become based on a mere description of social phenomena. Since it cannot lead to a deep understanding of social phenomena, positivism is incapable of revealing ways in which society could be radically changed. Moreover, such theory is far from neutral. Rather, it is formed and influenced by the forms of life experienced and practiced in the society. This is illustrated by an analysis of the origin of modern physics at the time when society was changing from a feudal to capitalist form of organization. It is concluded that advances will be made in epidemiology and community health when this field breaks from its focus on the individual and incorporates class into its analysis. However, given the interconnection between social structure and social theory, resistance to such a radical change can be expected.

  13. Melanocortin-1 receptor, skin cancer and phenotypic characteristics (M-SKIP) project: study design and methods for pooling results of genetic epidemiological studies

    PubMed Central

    2012-01-01

    Background: For complex diseases like cancer, pooled-analysis of individual data represents a powerful tool to investigate the joint contribution of genetic, phenotypic and environmental factors to the development of a disease. Pooled-analysis of epidemiological studies has many advantages over meta-analysis, and preliminary results may be obtained faster and with lower costs than with prospective consortia. Design and methods: Based on our experience with the study design of the Melanocortin-1 receptor (MC1R) gene, SKin cancer and Phenotypic characteristics (M-SKIP) project, we describe the most important steps in planning and conducting a pooled-analysis of genetic epidemiological studies. We then present the statistical analysis plan that we are going to apply, giving particular attention to methods of analysis recently proposed to account for between-study heterogeneity and to explore the joint contribution of genetic, phenotypic and environmental factors in the development of a disease. Within the M-SKIP project, data on 10,959 skin cancer cases and 14,785 controls from 31 international investigators were checked for quality and recoded for standardization. We first proposed to fit the aggregated data with random-effects logistic regression models. However, for the M-SKIP project, a two-stage analysis will be preferred to overcome the problem regarding the availability of different study covariates. The joint contribution of MC1R variants and phenotypic characteristics to skin cancer development will be studied via logic regression modeling. Discussion: Methodological guidelines to correctly design and conduct pooled-analyses are needed to facilitate application of such methods, thus providing a better summary of the actual findings on specific fields. PMID:22862891

  14. Melanocortin-1 receptor, skin cancer and phenotypic characteristics (M-SKIP) project: study design and methods for pooling results of genetic epidemiological studies.

    PubMed

    Raimondi, Sara; Gandini, Sara; Fargnoli, Maria Concetta; Bagnardi, Vincenzo; Maisonneuve, Patrick; Specchia, Claudia; Kumar, Rajiv; Nagore, Eduardo; Han, Jiali; Hansson, Johan; Kanetsky, Peter A; Ghiorzo, Paola; Gruis, Nelleke A; Dwyer, Terry; Blizzard, Leigh; Fernandez-de-Misa, Ricardo; Branicki, Wojciech; Debniak, Tadeusz; Morling, Niels; Landi, Maria Teresa; Palmieri, Giuseppe; Ribas, Gloria; Stratigos, Alexander; Cornelius, Lynn; Motokawa, Tomonori; Anno, Sumiko; Helsing, Per; Wong, Terence H; Autier, Philippe; García-Borrón, José C; Little, Julian; Newton-Bishop, Julia; Sera, Francesco; Liu, Fan; Kayser, Manfred; Nijsten, Tamar

    2012-08-03

    For complex diseases like cancer, pooled-analysis of individual data represents a powerful tool to investigate the joint contribution of genetic, phenotypic and environmental factors to the development of a disease. Pooled-analysis of epidemiological studies has many advantages over meta-analysis, and preliminary results may be obtained faster and with lower costs than with prospective consortia. Based on our experience with the study design of the Melanocortin-1 receptor (MC1R) gene, SKin cancer and Phenotypic characteristics (M-SKIP) project, we describe the most important steps in planning and conducting a pooled-analysis of genetic epidemiological studies. We then present the statistical analysis plan that we are going to apply, giving particular attention to methods of analysis recently proposed to account for between-study heterogeneity and to explore the joint contribution of genetic, phenotypic and environmental factors in the development of a disease. Within the M-SKIP project, data on 10,959 skin cancer cases and 14,785 controls from 31 international investigators were checked for quality and recoded for standardization. We first proposed to fit the aggregated data with random-effects logistic regression models. However, for the M-SKIP project, a two-stage analysis will be preferred to overcome the problem regarding the availability of different study covariates. The joint contribution of MC1R variants and phenotypic characteristics to skin cancer development will be studied via logic regression modeling. Methodological guidelines to correctly design and conduct pooled-analyses are needed to facilitate application of such methods, thus providing a better summary of the actual findings on specific fields.

  15. An introduction to epidemiologic and statistical methods useful in environmental epidemiology.

    PubMed

    Nitta, Hiroshi; Yamazaki, Shin; Omori, Takashi; Sato, Tosiya

    2010-01-01

    Many developments in the design and analysis of environmental epidemiology have been made in air pollution studies. In the analysis of the short-term effects of particulate matter on daily mortality, Poisson regression models with flexible smoothing methods have been developed for the analysis of time-series data. Another option for such studies is the use of case-crossover designs, and there have been extensive discussions on the selection of control periods. In the Study on Respiratory Disease and Automobile Exhaust project conducted by the Japanese Ministry of the Environment, we adopted a new 2-stage case-control design that is efficient when both exposure and disease are rare. Based on our experience in conducting air pollution epidemiologic studies, we review 2-stage case-control designs, case-crossover designs, generalized linear models, generalized additive models, and generalized estimating equations, all of which are useful approaches in environmental epidemiology.
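    For the case-crossover design mentioned above, with one referent period per case, the exposure odds ratio reduces to the classic matched-pair estimate based on discordant case/referent pairs. A minimal sketch with made-up counts:

```python
import math

def matched_pair_or(exposed_case_only, exposed_referent_only):
    """Conditional ML odds ratio for 1:1 matched data (case-crossover
    with a single referent period): the ratio of discordant pairs."""
    b, c = exposed_case_only, exposed_referent_only
    or_ = b / c
    se = math.sqrt(1 / b + 1 / c)  # SE of the log odds ratio
    return or_, (or_ * math.exp(-1.96 * se), or_ * math.exp(1.96 * se))

# Hypothetical: 40 cases exposed only in the hazard period,
# 25 exposed only in the referent period.
or_, ci = matched_pair_or(40, 25)
print(f"OR {or_:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```

    Concordant pairs (exposed or unexposed in both periods) drop out of the estimate, which is why the choice of referent periods discussed in the abstract matters so much: it determines which pairs are discordant.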

  16. Epidemiologic methods in analysis of scientific issues

    NASA Astrophysics Data System (ADS)

    Erdreich, Linda S.

    2003-10-01

    Studies of human populations provide much of the information that is used to evaluate compensation cases for hearing loss, including rates of hearing loss by age, and dose-response relationships. The reference data used to make decisions regarding workman's compensation are based on epidemiologic studies of cohorts of workers exposed to various noise levels. Epidemiology and its methods can be used in other ways in the courtroom; to assess the merits of a complaint, to support Daubert criteria, and to explain scientific issues to the trier of fact, generally a layperson. Using examples other than occupational noise induced hearing loss, these methods will be applied to respond to a complaint that hearing loss followed exposure to a sudden noise, a medication, or an occupational chemical, and thus was caused by said exposure. The standard criteria for assessing the weight of the evidence, and epidemiologic criteria for causality show the limits of such anecdotal data and incorporate quantitative and temporal issues. Reports of clusters of cases are also intuitively convincing to juries. Epidemiologic methods provide a scientific approach to assess whether rates of the outcome are indeed increased, and the extent to which increased rates provide evidence for causality.

  17. Using Epidemiologic Methods to Test Hypotheses regarding Causal Influences on Child and Adolescent Mental Disorders

    ERIC Educational Resources Information Center

    Lahey, Benjamin B.; D'Onofrio, Brian M.; Waldman, Irwin D.

    2009-01-01

    Epidemiology uses strong sampling methods and study designs to test refutable hypotheses regarding the causes of important health, mental health, and social outcomes. Epidemiologic methods are increasingly being used to move developmental psychopathology from studies that catalogue correlates of child and adolescent mental health to designs that…

  18. Kinetics methods for clinical epidemiology problems

    PubMed Central

    Corlan, Alexandru Dan; Ross, John

    2015-01-01

    Calculating the probability of each possible outcome for a patient at any time in the future is currently possible only in the simplest cases: short-term prediction in acute diseases of otherwise healthy persons. This problem is to some extent analogous to predicting the concentrations of species in a reactor when knowing initial concentrations and after examining reaction rates at the individual molecule level. The existing theoretical framework behind predicting contagion and the immediate outcome of acute diseases in previously healthy individuals is largely analogous to deterministic kinetics of chemical systems consisting of one or a few reactions. We show that current statistical models commonly used in chronic disease epidemiology correspond to simple stochastic treatment of single reaction systems. The general problem corresponds to stochastic kinetics of complex reaction systems. We attempt to formulate epidemiologic problems related to chronic diseases in chemical kinetics terms. We review methods that may be adapted for use in epidemiology. We show that some reactions cannot fit into the mass-action law paradigm and solutions to these systems would frequently exhibit an antiportfolio effect. We provide a complete example application of stochastic kinetics modeling for a deductive meta-analysis of two papers on atrial fibrillation incidence, prevalence, and mortality. PMID:26578757

  19. Kinetics methods for clinical epidemiology problems.

    PubMed

    Corlan, Alexandru Dan; Ross, John

    2015-11-17

    Calculating the probability of each possible outcome for a patient at any time in the future is currently possible only in the simplest cases: short-term prediction in acute diseases of otherwise healthy persons. This problem is to some extent analogous to predicting the concentrations of species in a reactor when knowing initial concentrations and after examining reaction rates at the individual molecule level. The existing theoretical framework behind predicting contagion and the immediate outcome of acute diseases in previously healthy individuals is largely analogous to deterministic kinetics of chemical systems consisting of one or a few reactions. We show that current statistical models commonly used in chronic disease epidemiology correspond to simple stochastic treatment of single reaction systems. The general problem corresponds to stochastic kinetics of complex reaction systems. We attempt to formulate epidemiologic problems related to chronic diseases in chemical kinetics terms. We review methods that may be adapted for use in epidemiology. We show that some reactions cannot fit into the mass-action law paradigm and solutions to these systems would frequently exhibit an antiportfolio effect. We provide a complete example application of stochastic kinetics modeling for a deductive meta-analysis of two papers on atrial fibrillation incidence, prevalence, and mortality.
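    The kinetic view above, treating disease transitions like elementary reactions, can be sketched with a minimal Gillespie-style stochastic simulation of a healthy-diseased-dead system. All rates and counts here are hypothetical, not taken from the atrial fibrillation example:

```python
import random

def gillespie_illness_death(n, rate_onset, rate_death, t_max, seed=1):
    """One stochastic path of a healthy -> diseased -> dead system,
    with exponential waiting times between transitions (Gillespie's
    direct method). Returns final (healthy, diseased, dead) counts."""
    rng = random.Random(seed)
    healthy, diseased, dead, t = n, 0, 0, 0.0
    while True:
        a1 = rate_onset * healthy     # propensity of disease onset
        a2 = rate_death * diseased    # propensity of death
        a0 = a1 + a2
        if a0 == 0:
            break                     # no transitions left
        t += rng.expovariate(a0)      # time to next transition
        if t >= t_max:
            break
        if rng.random() < a1 / a0:
            healthy -= 1; diseased += 1
        else:
            diseased -= 1; dead += 1
    return healthy, diseased, dead

h, d, x = gillespie_illness_death(1000, rate_onset=0.02, rate_death=0.1, t_max=50)
print(h, d, x)
```

    Deterministic kinetics would replace this with two coupled ODEs for the expected counts; the stochastic version additionally captures the variability that, per the authors, matters for complex multi-state systems.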

  20. Automated Dental Epidemiology System. II. Systems Analysis and Functional Design,

    DTIC Science & Technology

    1983-08-01

    AD-A134 803. Automated Dental Epidemiology System: II. Systems Analysis and Functional Design. M. C. Diehl. Naval Dental Research Institute, Naval Base, Building 1-H, Great Lakes, Illinois 60088.

  1. DESIGN OF EXPOSURE MEASUREMENTS FOR EPIDEMIOLOGIC STUDIES

    EPA Science Inventory

    This presentation will describe the following items: (1) London daily air pollution and deaths that demonstrate how time series epidemiology can indicate that air pollution caused death; (2) Sophisticated statistical models required to establish this relationship for lower pollut...

  2. Methods of Measurement in epidemiology: Sedentary Behaviour

    PubMed Central

    Atkin, Andrew J; Gorely, Trish; Clemes, Stacy A; Yates, Thomas; Edwardson, Charlotte; Brage, Soren; Salmon, Jo; Marshall, Simon J; Biddle, Stuart JH

    2012-01-01

    Background: Research examining sedentary behaviour as a potentially independent risk factor for chronic disease morbidity and mortality has expanded rapidly in recent years. Methods: We present a narrative overview of the sedentary behaviour measurement literature. Subjective and objective methods of measuring sedentary behaviour suitable for use in population-based research with children and adults are examined. The validity and reliability of each method is considered, gaps in the literature specific to each method identified and potential future directions discussed. Results: To date, subjective approaches to sedentary behaviour measurement, e.g. questionnaires, have focused predominantly on TV viewing or other screen-based behaviours. Typically, such measures demonstrate moderate reliability but slight to moderate validity. Accelerometry is increasingly being used for sedentary behaviour assessments; this approach overcomes some of the limitations of subjective methods, but detection of specific postures and postural changes by this method is somewhat limited. Instruments developed specifically for the assessment of body posture have demonstrated good reliability and validity in the limited research conducted to date. Miniaturization of monitoring devices, interoperability between measurement and communication technologies and advanced analytical approaches are potential avenues for future developments in this field. Conclusions: High-quality measurement is essential in all elements of sedentary behaviour epidemiology, from determining associations with health outcomes to the development and evaluation of behaviour change interventions. Sedentary behaviour measurement remains relatively under-developed, although new instruments, both objective and subjective, show considerable promise and warrant further testing. PMID:23045206

  3. Epidemiological study air disaster in Amsterdam (ESADA): study design

    PubMed Central

    Slottje, Pauline; Huizink, Anja C; Twisk, Jos WR; Witteveen, Anke B; van der Ploeg, Henk M; Bramsen, Inge; Smidt, Nynke; Bijlsma, Joost A; Bouter, Lex M; van Mechelen, Willem; Smid, Tjabe

    2005-01-01

    Background In 1992, a cargo aircraft crashed into apartment buildings in Amsterdam, killing 43 victims and destroying 266 apartments. In the aftermath there were speculations about the cause of the crash, potential exposures to hazardous materials due to the disaster and the health consequences. Starting in 2000, the Epidemiological Study Air Disaster in Amsterdam (ESADA) aimed to assess the long-term health effects of occupational exposure to this disaster on professional assistance workers. Methods/Design Epidemiological study among all the exposed professional fire-fighters and police officers who performed disaster-related task(s), and hangar workers who sorted the wreckage of the aircraft, as well as reference groups of their non-exposed colleagues who did not perform any disaster-related tasks. The study took place, on average, 8.5 years after the disaster. Questionnaires were used to assess details on occupational exposure to the disaster. Health measures comprised laboratory assessments in urine, blood and saliva, as well as self-reported current health measures, including health-related quality of life, and various physical and psychological symptoms. Discussion In this paper we describe and discuss the design of the ESADA. The ESADA will provide additional scientific knowledge on the long-term health effects of technological disasters on professional workers. PMID:15921536

  4. Genetic Epidemiology of COPD (COPDGene) Study Design

    PubMed Central

    Regan, Elizabeth A.; Hokanson, John E.; Murphy, James R.; Make, Barry; Lynch, David A.; Beaty, Terri H.; Curran-Everett, Douglas; Silverman, Edwin K.; Crapo, James D.

    2010-01-01

    Background COPDGene is a multicenter observational study designed to identify genetic factors associated with COPD. It will also characterize chest CT phenotypes in COPD subjects, including assessment of emphysema, gas trapping, and airway wall thickening. Finally, subtypes of COPD based on these phenotypes will be used in a comprehensive genome-wide study to identify COPD susceptibility genes. Methods/Results COPDGene will enroll 10,000 smokers with and without COPD across the GOLD stages. Both non-Hispanic white and African-American subjects are included in the cohort. Inspiratory and expiratory chest CT scans will be obtained on all participants. In addition to the cross-sectional enrollment process, these subjects will be followed regularly for longitudinal studies. A genome-wide association study (GWAS) will be done on an initial group of 4000 subjects to identify genetic variants associated with case-control status and several quantitative phenotypes related to COPD. The initial findings will be verified in an additional 2000 COPD cases and 2000 smoking control subjects, and further validation association studies will be carried out. Conclusions COPDGene will provide important new information about genetic factors in COPD, and will characterize the disease process using high-resolution CT scans. Understanding genetic factors and CT phenotypes that define COPD will potentially permit earlier diagnosis of this disease and may lead to the development of treatments to modify progression. PMID:20214461

  5. A design framework for exploratory geovisualization in epidemiology

    PubMed Central

    Robinson, Anthony C.

    2009-01-01

    This paper presents a design framework for geographic visualization based on iterative evaluations of a toolkit designed to support cancer epidemiology. The Exploratory Spatio-Temporal Analysis Toolkit (ESTAT) is intended to support visual exploration of multivariate health data. Its purpose is to provide epidemiologists with the ability to generate new hypotheses or further refine those they may already have. Through an iterative user-centered design process, ESTAT has been evaluated by epidemiologists at the National Cancer Institute (NCI). Results of these evaluations are discussed, and a design framework based on evaluation evidence is presented. The framework provides specific recommendations and considerations for the design and development of a geovisualization toolkit for epidemiology. Its basic structure provides a model for future design and evaluation efforts in information visualization. PMID:20390052

  6. The 15-Country Collaborative Study of Cancer Risk Among Radiation Workers in the Nuclear Industry: design, epidemiological methods and descriptive results.

    PubMed

    Vrijheid, M; Cardis, E; Blettner, M; Gilbert, E; Hakama, M; Hill, C; Howe, G; Kaldor, J; Muirhead, C R; Schubauer-Berigan, M; Yoshimura, T; Ahn, Y-O; Ashmore, P; Auvinen, A; Bae, J-M; Engels, H; Gulis, G; Habib, R R; Hosoda, Y; Kurtinaitis, J; Malker, H; Moser, M; Rodriguez-Artalejo, F; Rogel, A; Tardy, H; Telle-Lamberton, M; Turai, I; Usel, M; Veress, K

    2007-04-01

    Radiation protection standards are based mainly on risk estimates from studies of atomic bomb survivors in Japan. The validity of extrapolations from the relatively high-dose acute exposures in this population to the low-dose, protracted or fractionated environmental and occupational exposures of primary public health concern has long been the subject of controversy. A collaborative retrospective cohort study was conducted to provide direct estimates of cancer risk after low-dose protracted exposures. The study included nearly 600,000 workers employed in 154 facilities in 15 countries. This paper describes the design, methods and results of descriptive analyses of the study. The main analyses included 407,391 nuclear industry workers employed for at least 1 year in a participating facility who were monitored individually for external radiation exposure and whose doses resulted predominantly from exposure to higher-energy photon radiation. The total duration of follow-up was 5,192,710 person-years. There were 24,158 deaths from all causes, including 6,734 deaths from cancer. The total collective dose was 7,892 Sv. The overall average cumulative recorded dose was 19.4 mSv. A strong healthy worker effect was observed in most countries. This study provides the largest body of direct evidence to date on the effects of low-dose protracted exposures to external photon radiation.

  7. Epidemiology and Clinical Research Design, Part 2: Principles.

    PubMed

    Manja, Veena; Lakshminrusimha, Satyan

    This is the third article covering core knowledge in scholarly activities for neonatal physicians. In this article, we discuss various principles of epidemiology and clinical research design. A basic knowledge of these principles is necessary for conducting clinical research and for proper interpretation of studies. This article reviews bias and confounding, causation, incidence and prevalence, decision analysis, cost-effectiveness, sensitivity analysis, and measurement.
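
    Two of the reviewed concepts, incidence and prevalence, can be made concrete with a short numeric sketch (all counts hypothetical):

```python
def incidence_rate(new_cases, person_years):
    """Incidence rate: new cases per unit of person-time at risk."""
    return new_cases / person_years

def point_prevalence(existing_cases, population):
    """Point prevalence: proportion of the population with the disease now."""
    return existing_cases / population

# Hypothetical cohort: 50 new cases over 10,000 person-years of follow-up.
rate = incidence_rate(50, 10_000)      # 0.005 cases per person-year
# Hypothetical cross-sectional survey: 200 cases among 4,000 people.
prev = point_prevalence(200, 4_000)    # 0.05

# For a stable chronic disease, prevalence ~ incidence rate x mean duration,
# so these hypothetical numbers imply a mean disease duration of ~10 years.
mean_duration_years = prev / rate
```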

  8. How to design a (good) epidemiological observational study: epidemiological research protocol at a glance.

    PubMed

    Fronteira, Ines

    2013-01-01

    In this article, we propose a general structure for designing the research protocol of an observational epidemiological study. We start by highlighting the importance of the research protocol, namely in helping to account for bias and in guaranteeing methodological rigor and study reproducibility. Next, we reflect on some of the essential elements of a research protocol, regardless of its objective. We further present some specific issues to be included according to the type of study: cross-sectional, case-control and cohort.

  9. Designing ROW Methods

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.

    1996-01-01

    There are many aspects to consider when designing a Rosenbrock-Wanner-Wolfbrandt (ROW) method for the numerical integration of ordinary differential equations (ODE's) solving initial value problems (IVP's). The process can be simplified by constructing ROW methods around good Runge-Kutta (RK) methods. The formulation of a new, simple, embedded, third-order, ROW method demonstrates this design approach.
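
    To make the family concrete: a minimal sketch of the simplest Rosenbrock-type step, a one-stage linearly implicit Euler method for a scalar ODE. This is a first-order illustration of the design idea only (replacing implicit Euler's nonlinear solve with a single linear solve using the Jacobian), not the embedded third-order method formulated in the paper:

```python
def rosenbrock1_step(f, dfdy, y, h, gamma=1.0):
    """One linearly implicit Euler (ROS1) step for a scalar ODE y' = f(y).

    Solves (1 - h*gamma*J) k = f(y) for the stage value k, then advances.
    For systems, the scalar division becomes a linear solve with the Jacobian.
    """
    J = dfdy(y)
    k = f(y) / (1.0 - h * gamma * J)
    return y + h * k

# Test problem y' = -y, y(0) = 1, exact solution exp(-t).
f = lambda y: -y
dfdy = lambda y: -1.0

y, h = 1.0, 0.1
for _ in range(10):            # integrate to t = 1
    y = rosenbrock1_step(f, dfdy, y, h)
# First-order accurate, and stable for stiff decay problems.
```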

  10. Molecular epidemiology: issues in study design and statistical analysis.

    PubMed

    Chia, K S; Shi, C Y; Lee, J; Seow, A; Lee, H P

    1996-01-01

    Traditional analytical epidemiology is directed at identifying the association between risk factors and occurrence of disease by using crude exposure data derived from questionnaires or clinical measures, and taking clinical disease as the end point. With the rapid development in molecular biology and laboratory methods, it is now possible to use biomarkers which are capable of identifying molecular events for epidemiologic research. This improved sensitivity enables us to develop a mechanistic understanding of disease causation: a step closer to the unravelling of the "black box" of traditional epidemiology. Biomarkers may be classified as internal indicators of exposure (biomarkers of exposure), indicators of preclinical adverse effect (biomarkers of effect) or indicators of an intrinsic or acquired susceptibility to disease (biomarkers of susceptibility). Biomarkers provide a better definition of exposure and disease status and consequently they could help to reduce misclassification bias in both exposure and disease, reduce the follow-up time in prospective studies, as well as identify possible interactions between risk factors on disease occurrence. However, a biomarker needs to be validated and its distribution in large populations described before it can be used profitably for aetiologic research. Also, the use of biomarkers in epidemiologic research raises other interesting epidemiological and statistical issues like confounding, effect modification and the analysis of repeated measurements. Molecular epidemiology is a multidisciplinary endeavour which comprises molecular biology, epidemiology and biostatistics. Clearly then, to carry out research in this field profitably, the molecular biologist, epidemiologist and biostatistician must acquire not only expertise in their respective fields, but also an integrated understanding of all three fields. 
The molecular biologist is not merely a laboratory bench worker; the epidemiologist, a field data-collector and
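
    The point that biomarkers reduce misclassification bias can be illustrated with a small worked example (all counts hypothetical): nondifferential exposure misclassification attenuates an odds ratio toward the null, which is precisely what more accurate biomarker-based exposure assessment mitigates.

```python
def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: exposed/unexposed cases (a, b), controls (c, d)."""
    return (a * d) / (b * c)

def misclassify(exposed, unexposed, sensitivity, specificity):
    """Expected observed counts after nondifferential exposure misclassification."""
    obs_exposed = exposed * sensitivity + unexposed * (1 - specificity)
    return obs_exposed, (exposed + unexposed) - obs_exposed

# Hypothetical true table: 60/40 exposed/unexposed cases, 40/60 controls.
true_or = odds_ratio(60, 40, 40, 60)                  # 2.25
a, b = misclassify(60, 40, sensitivity=0.8, specificity=0.9)
c, d = misclassify(40, 60, sensitivity=0.8, specificity=0.9)
obs_or = odds_ratio(a, b, c, d)                       # ~1.77, biased toward 1
```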

  11. Epidemiology and Clinical Research Design, Part 2: Principles

    PubMed Central

    Manja, Veena; Lakshminrusimha, Satyan

    2015-01-01

    This is the third article covering core knowledge in scholarly activities for neonatal physicians. In this article, we discuss various principles of epidemiology and clinical research design. A basic knowledge of these principles is necessary for conducting clinical research and for proper interpretation of studies. This article reviews bias and confounding, causation, incidence and prevalence, decision analysis, cost-effectiveness, sensitivity analysis, and measurement. PMID:26236171

  12. Design and analysis of metabolomics studies in epidemiologic research: a primer on -omic technologies.

    PubMed

    Tzoulaki, Ioanna; Ebbels, Timothy M D; Valdes, Ana; Elliott, Paul; Ioannidis, John P A

    2014-07-15

    Metabolomics is the field of "-omics" research concerned with the comprehensive characterization of the small low-molecular-weight metabolites in biological samples. In epidemiology, it represents an emerging technology and an unprecedented opportunity to measure environmental and other exposures with improved precision and far less measurement error than with standard epidemiologic methods. Advances in the application of metabolomics in large-scale epidemiologic research are now being realized through a combination of improved sample preparation and handling, automated laboratory and processing methods, and reduction in costs. The number of epidemiologic studies that use metabolic profiling is still limited, but it is fast gaining popularity in this area. In the present article, we present a roadmap for metabolomic analyses in epidemiologic studies and discuss the various challenges these data pose to large-scale studies. We discuss the steps of data preprocessing, univariate and multivariate data analysis, correction for multiplicity of comparisons with correlated data, and finally the steps of cross-validation and external validation. As data from metabolomic studies accumulate in epidemiology, there is a need for large-scale replication and synthesis of findings, increased availability of raw data, and a focus on good study design, all of which will highlight the potential clinical impact of metabolomics in this field. © The Author 2014. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
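
    Among the analysis steps listed, the multiplicity correction is easy to make concrete. A minimal Benjamini-Hochberg false-discovery-rate sketch (a standard choice; the article's preferred procedure for correlated metabolite data is not specified here):

```python
def benjamini_hochberg(pvalues, q=0.05):
    """Return indices of hypotheses rejected at false-discovery rate q.

    Sort the p-values, find the largest rank k with p_(k) <= (k/m) * q,
    and reject the k smallest p-values.
    """
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * q:
            k = rank
    return sorted(order[:k])

# Four hypothetical metabolite p-values; the first three pass at q = 0.05.
rejected = benjamini_hochberg([0.01, 0.02, 0.03, 0.50], q=0.05)
```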

  13. Epidemiological methods in diarrhoea studies—an update

    PubMed Central

    Schmidt, Wolf-Peter; Arnold, Benjamin F; Boisson, Sophie; Genser, Bernd; Luby, Stephen P; Barreto, Mauricio L; Clasen, Thomas; Cairncross, Sandy

    2011-01-01

    Background Diarrhoea remains a leading cause of morbidity and mortality but is difficult to measure in epidemiological studies. Challenges include the diagnosis based on self-reported symptoms, the logistical burden of intensive surveillance and the variability of diarrhoea in space, time and person. Methods We review current practices in sampling procedures to measure diarrhoea, and provide guidance for diarrhoea measurement across a range of study goals. Using 14 available data sets, we estimated typical design effects for clustering at household and village/neighbourhood level, and measured the impact of adjusting for baseline variables on the precision of intervention effect estimates. Results Incidence is the preferred outcome measure in aetiological studies, health services research and vaccine trials. Repeated prevalence measurements (longitudinal prevalence) are appropriate in high-mortality settings where malnutrition is common, although many repeat measures are rarely useful. Period prevalence is an inadequate outcome if an intervention affects illness duration. Adjusting point estimates for age or diarrhoea at baseline in randomized trials has little effect on the precision of estimates. Design effects in trials randomized at household level are usually <2 (range 1.0–3.2). Design effects for larger clusters (e.g. villages or neighbourhoods) vary greatly among different settings and study designs (range 0.1–25.8). Conclusions Using appropriate sampling strategies and outcome measures can improve the efficiency, validity and comparability of diarrhoea studies. Allocating large clusters in cluster randomized trials is compromised by unpredictable design effects and should be carried out only if the research question requires it. PMID:22268237
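
    The design effects reported follow the usual relationship DEFF = 1 + (m − 1) × ICC for equal cluster sizes. A minimal sketch (ICC values hypothetical) of why household-level clusters stay below 2 while village-sized clusters do not:

```python
def design_effect(cluster_size, icc):
    """DEFF = 1 + (m - 1) * ICC for clusters of equal size m."""
    return 1 + (cluster_size - 1) * icc

def effective_sample_size(n, cluster_size, icc):
    """Sample size after deflating for within-cluster correlation."""
    return n / design_effect(cluster_size, icc)

# Household-level randomization: small clusters keep DEFF below 2
# even at a fairly high ICC (numbers hypothetical).
deff_household = design_effect(cluster_size=5, icc=0.2)        # 1.8
# Village-level randomization: large clusters inflate DEFF sharply.
deff_village = design_effect(cluster_size=200, icc=0.1)        # ~20.9
n_eff = effective_sample_size(2000, cluster_size=200, icc=0.1)  # ~96
```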

  14. Epidemiologic methods for investigating male fecundity

    PubMed Central

    Olsen, Jørn; Ramlau-Hansen, Cecilia Høst

    2014-01-01

    Fertility is a couple concept that has been measured since the beginning of demography, and male fecundity (his biological capacity to reproduce) is a component of the fertility rate. Unfortunately, we have no way of measuring the male component directly, although several indirect markers can be used. Population registers can be used to monitor the proportion of childless couples, couples who receive donor semen, trends in dizygotic twinning, and infertility diagnoses. Studies using time-to-pregnancy (TTP) may identify couple subfecundity, and TTP data will correlate with sperm quality and quantity as well as sexual activity and a number of other conditions. Having exposure data available for couples with a fecund female partner would make TTP studies of interest in identifying exposures that may affect male fecundity. Biological indicators such as sperm quality and quantity isolate the male component of fertility, and semen data therefore remain an important source of information for research. Unfortunately, often over half of those invited to provide a sperm sample will refuse, and the study is then subject to a selection that may introduce bias. Because the most important time windows for exposures that impair semen production could be early fetal life, puberty, and the time of ejaculation, longitudinal data over decades are required. The ongoing monitoring of semen quality and quantity should continue, and surveys monitoring fertility and waiting times to pregnancy should also be designed. PMID:24369129

  15. Basic epidemiologic and statistical methods in clinical research.

    PubMed

    Gottlieb, M; Anderson, G; Lepor, H

    1992-11-01

    Various study designs and approaches to statistical analysis in clinical research have their own underlying rationales and limitations for interpretation. This information is presented in an intuitive and accessible manner, relying minimally on basic algebra. Epidemiologic concepts of study design and interpretation, bias and confounding, hypothesis testing, and sample size and power are explained. Statistical tests and their appropriate applications are discussed for mean comparisons (t tests and ANOVA), percentages (chi-square), survival analysis, and correlation and regression. Applicable nonparametric tests are also introduced.

  16. Sampling designs for HIV molecular epidemiology with application to Honduras.

    PubMed

    Shepherd, Bryan E; Rossini, Anthony J; Soto, Ramon Jeremias; De Rivera, Ivette Lorenzana; Mullins, James I

    2005-11-01

    Proper sampling is essential to characterize the molecular epidemiology of human immunodeficiency virus (HIV). HIV sampling frames are difficult to identify, so most studies use convenience samples. We discuss statistically valid and feasible sampling techniques that overcome some of the potential for bias due to convenience sampling and ensure better representation of the study population. We employ a sampling design called stratified cluster sampling. This first divides the population into geographical and/or social strata. Within each stratum, a population of clusters is chosen from groups, locations, or facilities where HIV-positive individuals might be found. Some clusters are randomly selected within strata and individuals are randomly selected within clusters. Variation and cost help determine the number of clusters and the number of individuals within clusters that are to be sampled. We illustrate the approach through a study designed to survey the heterogeneity of subtype B strains in Honduras.
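
    The two-stage design described, random clusters within strata and random individuals within clusters, can be sketched with the standard library (strata, clusters, and sizes all hypothetical):

```python
import random

def stratified_cluster_sample(strata, n_clusters, n_per_cluster, seed=0):
    """Two-stage sample: random clusters within each stratum, then
    random individuals within each selected cluster.

    strata: dict mapping stratum name -> {cluster name -> list of individuals}
    """
    rng = random.Random(seed)
    sample = {}
    for stratum, clusters in strata.items():
        chosen = rng.sample(sorted(clusters), min(n_clusters, len(clusters)))
        sample[stratum] = {
            c: rng.sample(clusters[c], min(n_per_cluster, len(clusters[c])))
            for c in chosen
        }
    return sample

# Hypothetical frame: two geographic strata, clinics as clusters.
frame = {
    "north": {"clinic_a": list(range(30)), "clinic_b": list(range(25))},
    "south": {"clinic_c": list(range(40)), "clinic_d": list(range(35))},
}
s = stratified_cluster_sample(frame, n_clusters=1, n_per_cluster=10)
```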

  17. [Malignant tumours of the eye: Epidemiology, diagnostic methods and radiotherapy].

    PubMed

    Jardel, P; Caujolle, J-P; Gastaud, L; Maschi, C; Sauerwein, W; Thariat, J

    2015-12-01

    Malignant tumours of the eye are not common, barely representing 1% of all cancers. This article aims to summarise, for each of the main malignant eye diseases, aspects of epidemiology, diagnostic methods and treatments, with a focus on radiation therapy techniques. The tumours studied are: eye metastases, intraocular and ocular adnexal lymphomas, uveal melanomas, malignant tumours of the conjunctiva and of the eyelids, and retinoblastomas. The last chapter outlines ocular complications of radiation therapy and their management.

  18. [Evolution of epidemiological methods in clinical research in Spain (1975-1994)].

    PubMed

    Aibar Remón, C; Rabanaque, M J; Alvarez-Dardet, C; Nolasco, A; Moncho, J; Gascón, E

    1999-01-01

    Previous studies have shown sparing use of analytical and experimental designs in Spanish clinical research journals. The aims of this study were to compare the use of epidemiological methods in articles published in scientific journals across countries, and to determine the extent to which this research received direct funding. Cross-sectional study including all original papers published during 1994 in Medicina Clinica [Med Clin (Barc)], Revista Clinica Española (Rev Clin Esp), The Lancet (Lancet) and the New England Journal of Medicine (N Engl J Med). Papers were classified according to epidemiological design, and any mention of financial support was recorded. 594 papers were included. Epidemiological studies without a control group prevailed in the Spanish journals. The most common designs were descriptive studies in Med Clin (Barc), with 45.5%, and clinical series in Rev Clin Esp, with 41.7%. Of the original papers published in Lancet, 33.6% were randomized trials, as were 28.4% of those in N Engl J Med. We found information about financial support in 73.7% of papers published in Lancet and 77.4% of those in N Engl J Med, but in only 23.1% of Med Clin (Barc) papers and in none of the Rev Clin Esp studies. In Spanish clinical journals the use of epidemiological methods with a control group is limited and direct financial support unusual, which limits the applicability of these studies.

  19. Parameter Plane Design Method

    DTIC Science & Technology

    1989-03-01

    In this thesis a control systems analysis package is developed using parameter plane methods. It is an interactive package in which the designer is able to choose values of the parameters that provide a good compromise between cost and dynamic behavior.

  20. Control system design method

    DOEpatents

    Wilson, David G [Tijeras, NM; Robinett, III, Rush D.

    2012-02-21

    A control system design method and concomitant control system comprising representing a physical apparatus to be controlled as a Hamiltonian system, determining elements of the Hamiltonian system representation which are power generators, power dissipators, and power storage devices, analyzing stability and performance of the Hamiltonian system based on the results of the determining step and determining necessary and sufficient conditions for stability of the Hamiltonian system, creating a stable control system based on the results of the analyzing step, and employing the resulting control system to control the physical apparatus.

  1. [Curricular design of health postgraduate programs: the case of Masters in epidemiology].

    PubMed

    Bobadilla, J L; Lozano, R; Bobadilla, C

    1991-01-01

    This paper discusses the need to create specific programs for the training of researchers in epidemiology, a field that has traditionally been ignored by the graduate programs in public health. This is due, in part, to the emphasis that has been placed on the training of professionals in other areas of public health. The paper also includes the results of a consensus exercise developed during the curricular design of the Masters Program in Epidemiology of the School of Medicine of the National Autonomous University of Mexico. The technique used during the consensus exercise was the TKJ, which allows the presentation of ideas and possible solutions for a specific problem. This is probably the first published experience in the use of such a technique for the design of an academic curriculum. Taking as a base the general characteristics of the students, the substantive, disciplinary and methodological subjects were chosen. The results showed a need for a multidisciplinary approach based on modern methodologies of statistics and epidemiology. The usefulness of the results of the curricular design and the superiority of this method to reach consensus is also discussed.

  2. Anonymous statistical methods versus cryptographic methods in epidemiology.

    PubMed

    Quantin; Allaert; Dusserre

    2000-11-01

    Sensitive data are most often indirectly identifiable and so need to be rendered anonymous in order to ensure privacy. Statistical methods to provide anonymity require data perturbation and so generate data processing difficulties. Encryption methods, while preserving confidentiality, do not require data modification.
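
    One common realization of the cryptographic approach (not necessarily the authors' exact protocol) is keyed one-way hashing of direct identifiers, so records can be linked across sources without revealing who they belong to and without perturbing the data. A minimal sketch with Python's standard library (key and identifiers hypothetical):

```python
import hashlib
import hmac

def pseudonymize(identifier: str, key: bytes) -> str:
    """Keyed one-way hash (HMAC-SHA-256) of a direct identifier.

    The same identifier always maps to the same pseudonym, so records can
    be linked, but the original value cannot be recovered without the key.
    """
    return hmac.new(key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

SECRET_KEY = b"held-by-a-trusted-third-party"   # hypothetical key custodian
p1 = pseudonymize("patient-12345", SECRET_KEY)
p2 = pseudonymize("patient-12345", SECRET_KEY)
p3 = pseudonymize("patient-67890", SECRET_KEY)
# p1 == p2 (records remain linkable), p1 != p3, raw identifiers never stored.
```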

  3. Epidemiology and statistical methods in prediction of patient outcome.

    PubMed

    Bostwick, David G; Adolfsson, Jan; Burke, Harry B; Damber, Jan-Erik; Huland, Hartwig; Pavone-Macaluso, Michele; Waters, David J

    2005-05-01

    Substantial gaps exist in the data on the assessment of risk and prognosis that limit our understanding of the complex mechanisms contributing to prostate cancer, the greatest cancer epidemic of our time. This report was prepared by an international multidisciplinary committee of the World Health Organization to address contemporary issues of epidemiology and statistical methods in prostate cancer, including a summary of current risk assessment methods and prognostic factors. Emphasis was placed on the relative merits of each of the statistical methods available. We concluded that: 1. An international committee should be created to guide the assessment and validation of molecular biomarkers. The goal is to achieve more precise identification of those who would benefit from treatment. 2. Prostate cancer is a predictable disease despite its biologic heterogeneity; however, the accuracy of prediction must be improved. We expect that more precise statistical methods will supplant the current staging system. The simplicity and intuitive ease of using the current staging system must be balanced against the serious compromise in accuracy for the individual patient. 3. The most useful new statistical approaches will integrate molecular biomarkers with existing prognostic factors to predict conditional life expectancy (i.e. the expected remaining years of a patient's life) and take into account all-cause mortality.

  4. Method for Design Rotation

    DTIC Science & Technology

    1993-08-01

    desirability of a rotation as a function of the set of planar angles. Criteria for the symmetry of the design (such as the same set of factor levels for... P is -1. Hence there is no theoretical problem in obtaining rotations of a design; there are only the practical questions: why rotate a design, and... star points, which can be represented in shorthand notation by the permutations of (±1, 0, ..., 0), and (c) factorial points, which are a two-level

  5. Aircraft digital control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Variations in design methods for aircraft digital flight control are evaluated and compared. The methods fall into two categories; those where the design is done in the continuous domain (or s plane) and those where the design is done in the discrete domain (or z plane). Design method fidelity is evaluated by examining closed loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps except the uncompensated s plane design method which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
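
    The s-plane versus z-plane comparison rests on the pole mapping z = e^{sT}. A minimal sketch (pole location and sample rates hypothetical) of why continuous-domain designs hold up better at fast sample rates:

```python
import math

def discretize_pole(s_pole: float, T: float) -> float:
    """Map a continuous-time pole s to its discrete equivalent z = exp(s*T)."""
    return math.exp(s_pole * T)

# Continuous first-order lag with a pole at s = -4 rad/s.
z_fast = discretize_pole(-4.0, 1.0 / 20.0)   # 20 Hz sampling: z ~ 0.82
z_slow = discretize_pole(-4.0, 1.0 / 5.0)    # 5 Hz sampling:  z ~ 0.45
# The faster the sampling, the closer z stays to 1 and the better a
# continuous (s-plane) design approximates the sampled-data system.
```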

  6. Has epidemiology become infatuated with methods? A historical perspective on the place of methods during the classical (1945-1965) phase of epidemiology.

    PubMed

    Morabia, Alfredo

    2015-03-18

    Before World War II, epidemiology was a small discipline, practiced by a handful of people working mostly in the United Kingdom and in the United States. Today it is practiced by tens of thousands of people on all continents. Between 1945 and 1965, during what is known as its "classical" phase, epidemiology became recognized as a major academic discipline in medicine and public health. On the basis of a review of the historical evidence, this article examines to what extent classical epidemiology was a golden age of an action-driven, problem-solving science, in which epidemiologists were less concerned with the sophistication of their methods than with the societal consequences of their work. It also discusses whether the paucity of methods stymied or boosted classical epidemiology's ability to convince political and financial agencies of the need to intervene in order to improve the health of the people.

  7. New Saliva DNA Collection Method Compared to Buccal Cell Collection Techniques for Epidemiological Studies

    PubMed Central

    ROGERS, NIKKI L.; COLE, SHELLEY A.; LAN, HAO-CHANG; CROSSA, ALDO; DEMERATH, ELLEN W.

    2009-01-01

    Epidemiological studies may require noninvasive methods for off-site DNA collection. We compared the DNA yield and quality obtained using a whole-saliva collection device (Oragene™ DNA collection kit) to those from three established noninvasive methods (cytobrush, foam swab, and oral rinse). Each method was tested on 17 adult volunteers from our center, using a random crossover collection design and analyzed using repeated-measures statistics. DNA yield and quality were assessed via gel electrophoresis, spectrophotometry, and polymerase chain reaction (PCR) amplification rate. The whole-saliva method provided a significantly greater DNA yield (mean ± SD = 154.9 ± 103.05 μg, median = 181.88) than the other methods (oral rinse = 54.74 ± 41.72 μg, 36.56; swab = 11.44 ± 7.39 μg, 10.72; cytobrush = 12.66 ± 6.19 μg, 13.22) (all pairwise P < 0.05). Oral-rinse and whole-saliva samples provided the best DNA quality, whereas cytobrush and swab samples provided poorer quality DNA, as shown by lower OD260/OD280 and OD260/OD230 ratios. We conclude that both a 10-ml oral-rinse sample and a 2-ml whole-saliva sample provide sufficient DNA quantity and better quality DNA for genetic epidemiological studies than do the commonly used buccal swab and brush techniques. PMID:17421001

  8. New saliva DNA collection method compared to buccal cell collection techniques for epidemiological studies.

    PubMed

    Rogers, Nikki L; Cole, Shelley A; Lan, Hao-Chang; Crossa, Aldo; Demerath, Ellen W

    2007-01-01

    Epidemiological studies may require noninvasive methods for off-site DNA collection. We compared the DNA yield and quality obtained using a whole-saliva collection device (Oragene DNA collection kit) to those from three established noninvasive methods (cytobrush, foam swab, and oral rinse). Each method was tested on 17 adult volunteers from our center, using a random crossover collection design and analyzed using repeated-measures statistics. DNA yield and quality were assessed via gel electrophoresis, spectrophotometry, and polymerase chain reaction (PCR) amplification rate. The whole-saliva method provided a significantly greater DNA yield (mean +/- SD = 154.9 +/- 103.05 microg, median = 181.88) than the other methods (oral rinse = 54.74 +/- 41.72 microg, 36.56; swab = 11.44 +/- 7.39 microg, 10.72; cytobrush = 12.66 +/- 6.19 microg, 13.22) (all pairwise P < 0.05). Oral-rinse and whole-saliva samples provided the best DNA quality, whereas cytobrush and swab samples provided poorer quality DNA, as shown by lower OD(260)/OD(280) and OD(260)/OD(230) ratios. We conclude that both a 10-ml oral-rinse sample and a 2-ml whole-saliva sample provide sufficient DNA quantity and better quality DNA for genetic epidemiological studies than do the commonly used buccal swab and brush techniques.

  9. Bone lead measured by X-ray fluorescence: epidemiologic methods.

    PubMed Central

    Hu, H; Aro, A; Rotnitzky, A

    1995-01-01

    In vivo X-ray fluorescence (XRF) measurement of bone lead concentration has emerged as an important technique for future epidemiological studies of long-term toxicity. Several issues germane to epidemiologic methodology need to be addressed, however. First, sources of variability in measurements of bone lead need to be quantified, including imprecision related to the physical measurement itself and the variability of lead deposition over the two main compartments of bone (cortical vs. trabecular) and within each compartment. Imprecision related to the physical measurement can be estimated for each individual measurement based on the variability of the signal and background. Second, approaches to low-level data need to be debated. We argue for using the minimal detection limit (MDL) to compare instruments and interpret individual measurements; however, with regard to epidemiologic studies, we would abandon the MDL in favor of using all point estimates. In analyses using bone lead as an independent variable, statistical techniques can be used to adjust regression estimates based on estimates of measurement uncertainty and bone lead variability. Third, factors that can be expected to modify the relationship between bone lead and toxicity, such as gravida history, endocrinological states, nutrition, and other important influences on bone metabolism, need to be identified and measured in epidemiologic studies. By addressing these issues, investigators will be able to maximize the utility of XRF measurements in environmental epidemiologic studies. PMID:7621788
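
    The suggested adjustment of regression estimates for measurement uncertainty is, in its simplest form, a regression-dilution correction: under classical measurement error the observed slope shrinks by the reliability ratio λ = σ²x / (σ²x + σ²e). A simulated sketch (all variances and the slope hypothetical):

```python
import random

random.seed(1)
n = 100_000
beta = 2.0                   # true slope of outcome on true exposure
var_x, var_e = 4.0, 1.0      # true-exposure and measurement-error variances

x = [random.gauss(0, var_x ** 0.5) for _ in range(n)]           # true exposure
w = [xi + random.gauss(0, var_e ** 0.5) for xi in x]            # noisy measure
y = [beta * xi + random.gauss(0, 1) for xi in x]                # outcome

def slope(u, v):
    """Ordinary least-squares slope of v regressed on u."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = sum((a - mu) ** 2 for a in u)
    return num / den

beta_obs = slope(w, y)                    # attenuated, ~ beta * 0.8
reliability = var_x / (var_x + var_e)     # lambda = 0.8
beta_corrected = beta_obs / reliability   # ~ 2.0 again
```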

  10. Air Pollution Monitoring Design for Epidemiological Application in a Densely Populated City.

    PubMed

    Min, Kyung-Duk; Kwon, Ho-Jang; Kim, KyooSang; Kim, Sun-Young

    2017-06-25

    Introduction: Many studies have reported the association between air pollution and human health based on regulatory air pollution monitoring data. However, because regulatory monitoring networks were not designed for epidemiological studies, the collected data may not provide sufficient spatial contrasts for assessing such associations. Our goal was to develop a monitoring design supplementary to the regulatory monitoring network in Seoul, Korea. This design focused on the selection of 20 new monitoring sites to represent the variability in PM2.5 across people's residences for cohort studies. Methods: We obtained hourly measurements of PM2.5 at 37 regulatory monitoring sites in 2010 in Seoul, and computed the annual average at each site. We also computed 313 geographic variables representing various pollution sources at the regulatory monitoring sites, 31,097 children's homes from the Atopy Free School survey, and 412 community service centers in Seoul. These three types of locations represented current, subject, and candidate locations. Using the regulatory monitoring data, we performed forward variable selection and chose five variables most related to PM2.5. Then, k-means clustering was applied to categorize all locations into several groups representing a diversity in the spatial variability of the five selected variables. Finally, we computed the proportion of current to subject location in each cluster, and randomly selected new monitoring sites from candidate sites in the cluster with the minimum proportion until 20 sites were selected. Results: The five selected geographic variables were related to traffic or urbanicity with a cross-validated R² value of 0.69. Clustering analysis categorized all locations into nine clusters. Finally, one to eight new monitoring sites were selected from five clusters. Discussion: The proposed monitoring design will help future studies determine the locations of new monitoring sites representing spatial variability across
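    The site-selection workflow described in this abstract (cluster all locations on a handful of geographic covariates, then draw new sites from the cluster where current monitors are scarcest relative to residences) can be sketched as follows. The data are synthetic and the k-means implementation is deliberately minimal; this illustrates the selection logic only, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(42)

def kmeans(X, k, iters=50):
    # Minimal Lloyd's-algorithm k-means; adequate for an illustration.
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic stand-ins for the three location types in the paper's design:
current = rng.normal(0, 1, (37, 5))      # existing regulatory monitors
subject = rng.normal(0, 1, (500, 5))     # cohort residences
candidate = rng.normal(0, 1, (60, 5))    # candidate new sites

X = np.vstack([current, subject, candidate])
labels = kmeans(X, k=9)
cur_l, sub_l, cand_l = labels[:37], labels[37:537], labels[537:]

# Select new sites from the cluster where monitors are scarcest
# relative to residences (minimum current-to-subject proportion).
ratios = {c: (cur_l == c).sum() / max((sub_l == c).sum(), 1) for c in range(9)}
target = min(ratios, key=ratios.get)
new_sites = np.where(cand_l == target)[0]
```

    In the actual study this loop would repeat, removing selected candidates and recomputing, until 20 sites were chosen.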

  11. Air Pollution Monitoring Design for Epidemiological Application in a Densely Populated City

    PubMed Central

    Min, Kyung-Duk; Kwon, Ho-Jang; Kim, KyooSang; Kim, Sun-Young

    2017-01-01

    Introduction: Many studies have reported the association between air pollution and human health based on regulatory air pollution monitoring data. However, because regulatory monitoring networks were not designed for epidemiological studies, the collected data may not provide sufficient spatial contrasts for assessing such associations. Our goal was to develop a monitoring design supplementary to the regulatory monitoring network in Seoul, Korea. This design focused on the selection of 20 new monitoring sites to represent the variability in PM2.5 across people’s residences for cohort studies. Methods: We obtained hourly measurements of PM2.5 at 37 regulatory monitoring sites in 2010 in Seoul, and computed the annual average at each site. We also computed 313 geographic variables representing various pollution sources at the regulatory monitoring sites, 31,097 children’s homes from the Atopy Free School survey, and 412 community service centers in Seoul. These three types of locations represented current, subject, and candidate locations. Using the regulatory monitoring data, we performed forward variable selection and chose five variables most related to PM2.5. Then, k-means clustering was applied to categorize all locations into several groups representing a diversity in the spatial variability of the five selected variables. Finally, we computed the proportion of current to subject location in each cluster, and randomly selected new monitoring sites from candidate sites in the cluster with the minimum proportion until 20 sites were selected. Results: The five selected geographic variables were related to traffic or urbanicity with a cross-validated R2 value of 0.69. Clustering analysis categorized all locations into nine clusters. Finally, one to eight new monitoring sites were selected from five clusters. Discussion: The proposed monitoring design will help future studies determine the locations of new monitoring sites representing spatial variability across

  12. Reporting of occupational and environmental research: use and misuse of statistical and epidemiological methods

    PubMed Central

    Rushton, L.

    2000-01-01

    OBJECTIVES—To report some of the most serious omissions and errors which may occur in papers submitted to Occupational and Environmental Medicine, and to give guidelines on the essential components that should be included in papers reporting results from studies of occupational and environmental health.
METHODS—Since 1994 Occupational and Environmental Medicine has used a panel of medical statisticians to review submitted papers with substantial statistical content. Although some studies may have genuine errors in their design, execution, and analysis, many of the problems identified during the reviewing process are due to inadequate and incomplete reporting of essential aspects of a study. This paper outlines some of the most important errors and omissions that may occur. Observational studies are often the preferred choice of design in occupational and environmental medicine. Issues relating to design, execution, and analysis that should be considered when reporting three of the most common observational study designs (cross-sectional, case-control, and cohort) are described, with an illustration of good reporting practice for each. Various mathematical modelling techniques are often used in the analysis of these studies, and their reporting causes major problems for some authors. Suggestions for the presentation of results from modelling are made.
CONCLUSIONS—There is increasing interest in the development and application of formal "good epidemiology practices". These not only consider issues of data quality, study design, and study conduct, but through their structured approach to the documentation of the study procedures, provide the potential for more rigorous reporting of the results in the scientific literature.


Keywords: research reporting; statistical methods; epidemiological methods PMID:10711263

  13. [An analysis of the research focus of the American Journal of Epidemiology using bibliometric methods].

    PubMed

    Cui, L

    1996-06-01

    Using bibliometric methods, the author counted citations of papers published in the American Journal of Epidemiology over the last 3 years. The most highly cited papers and books are presented, and the journal's recent research focus is outlined.

  14. Variation in choice of study design: findings from the Epidemiology Design Decision Inventory and Evaluation (EDDIE) survey.

    PubMed

    Stang, Paul E; Ryan, Patrick B; Overhage, J Marc; Schuemie, Martijn J; Hartzema, Abraham G; Welebob, Emily

    2013-10-01

    Researchers using observational data to understand drug effects must make a number of analytic design choices that suit the characteristics of the data and the subject of the study. Review of the published literature suggests that there is a lack of consistency even when addressing the same research question in the same database. Our objective was to characterize the degree of similarity or difference in the method and analysis choices made by observational database research experts when presented with research study scenarios. We conducted an on-line survey using research scenarios on drug-effect studies to capture method selection and analysis choices, following a dependency branching based on responses to key questions. Voluntary participants experienced in epidemiological study design were solicited through registration on the Observational Medical Outcomes Partnership website, membership in particular professional organizations, or links in relevant newsletters. We describe the proportion of respondents selecting particular methods and making specific analysis choices for individual drug-outcome scenario pairs; the number of questions/decisions differed based on stem questions of study design, time-at-risk, outcome definition, and comparator. There was little consistency across scenarios, by drug or by outcome of interest, in the decisions made for design and analyses using large healthcare databases. The most consistent choice was the cohort study design, but variability in the other critical decisions was common. There is great variation among epidemiologists in the design and analytical choices that they make when implementing analyses in observational healthcare databases. These findings confirm that it will be important to generate empirical evidence to inform these decisions and to promote a better understanding of the impact of standardization on research implementation.

  15. Development of the residential case-specular epidemiologic investigation method. Final report

    SciTech Connect

    Zaffanella, L.E.; Savitz, D.A.

    1995-11-01

    The residential case-specular method is an innovative approach to epidemiologic studies of the association between wire codes and childhood cancer. This project was designed to further the development of the residential case-specular method, which seeks to help resolve the "wire code paradox". For years, wire codes have been used as surrogate measures of past electric and magnetic field (EMF) exposure. The magnetic field hypothesis suggests that childhood cancer is associated with exposure to magnetic fields, with wire codes as a proxy for these fields. The neighborhood hypothesis suggests that childhood cancer is associated with neighborhood characteristics and exposures other than magnetic fields, with wire codes as a proxy for these characteristics and exposures. The residential case-specular method was designed to discriminate between the magnetic field and the neighborhood hypotheses. Two methods were developed for determining the specular of a residence. These methods were tested with 400 randomly selected residences. The main advantage of the residential case-specular method is that it may efficiently confirm or eliminate the suspicion that control selection bias or confounding by neighborhood factors affected the results of case-control studies of childhood cancer and magnetic fields. The method may be applicable to both past and ongoing studies. The main disadvantage is that the method is untried. Consequently, further work is required to verify its validity and to ensure that sufficient statistical power can be obtained in a cost-effective manner.

  16. The role of applied epidemiology methods in the disaster management cycle.

    PubMed

    Malilay, Josephine; Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F; Schnall, Amy H; Podgornik, Michelle N; Cruz, Miguel A; Horney, Jennifer A; Zane, David; Roisman, Rachel; Greenspan, Joel R; Thoroughman, Doug; Anderson, Henry A; Wells, Eden V; Simms, Erin F

    2014-11-01

    Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure.

  17. The Role of Applied Epidemiology Methods in the Disaster Management Cycle

    PubMed Central

    Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F.; Schnall, Amy H.; Podgornik, Michelle N.; Cruz, Miguel A.; Horney, Jennifer A.; Zane, David; Roisman, Rachel; Greenspan, Joel R.; Thoroughman, Doug; Anderson, Henry A.; Wells, Eden V.; Simms, Erin F.

    2014-01-01

    Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure. PMID:25211748

  18. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study

    PubMed Central

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Background Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules’ performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2–4.3) larger than validation studies using cohort design and unclear design. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2–3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2–3.1) compared to studies with adequate sample size. The study site, reliability, and the clinical prediction rule itself were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
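    For readers unfamiliar with the summary statistics used here, the diagnostic odds ratio, and its ratio across study designs (the RDOR), can be computed from a 2x2 validation table. The counts below are invented for illustration only, not data from the study.

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP/FN) / (FP/TN): odds of a positive test among the diseased
    divided by the odds of a positive test among the non-diseased."""
    return (tp / fn) / (fp / tn)

# Hypothetical 2x2 tables from two validation designs (made-up counts):
dor_case_control = diagnostic_odds_ratio(90, 20, 10, 80)  # 36.0
dor_cohort = diagnostic_odds_ratio(80, 30, 20, 70)        # ~9.33
rdor = dor_case_control / dor_cohort                      # relative DOR
```

    An RDOR well above 1, as in this toy comparison, is exactly the pattern the meta-epidemiological analysis uses to flag design-driven overestimation.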

  19. Landscape-epidemiological study design to investigate an environmentally based disease.

    PubMed

    Tabor, Joseph A; O'Rourke, Mary Kay; Lebowitz, Michael D; Harris, Robin B

    2011-01-01

    Cost-effective approaches for identifying and enrolling subjects in community-based epidemiological studies face many challenges. Additional challenges arise when a neighborhood scale of analysis is required to distinguish between individual- and group-level risk factors with strong environmental determinants. A stratified, two-stage, cross-sectional, address-based telephone survey of Greater Tucson, Arizona, was conducted in 2002-2003. Subjects were recruited from direct marketing data at neighborhood resolution using a geographic information system (GIS). Three geomorphic strata were divided into two demographic units. Households were randomly selected within census block groups, selected using the probability proportional to size technique. Purchased direct marketing lists represented 45.2% of Census 2000 households in the surveyed block groups. Survey design effect (1.6) on coccidioidomycosis prevalence (88 per 100,000 per year) was substantially reduced in four of the six strata (0.3-0.9). Race-ethnicity was more robust than age and gender to compensate for significant selection bias using poststratification. Clustered, address-based telephone surveys provide a cost-effective, valid method for recruiting populations from address-based lists using a GIS to design surveys and population survey statistical methods for analysis. Landscape ecology provides effective methods for identifying scales of analysis and units for stratification that will improve sampling efficiency when environmental variables of interest are strong predictors.
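    The design effect reported in this abstract can be unpacked with the standard cluster-sampling formula. The cluster size, intracluster correlation, and sample size below are made up to reproduce a DEFF of 1.6, and are not taken from the study.

```python
def design_effect(m, icc):
    """DEFF = 1 + (m - 1) * ICC for a cluster sample with average
    cluster size m and intracluster correlation coefficient ICC."""
    return 1 + (m - 1) * icc

def effective_sample_size(n, deff):
    # A DEFF > 1 shrinks the information content of n clustered subjects.
    return n / deff

# Hypothetical inputs chosen to yield the 1.6 figure quoted in the abstract:
deff = design_effect(m=13, icc=0.05)        # 1.6
n_eff = effective_sample_size(1040, deff)   # 650.0
```

    A stratum-level DEFF below 1, as reported for four of the six strata, means stratification more than offset the clustering penalty there.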

  20. Stochastic Methods for Aircraft Design

    NASA Technical Reports Server (NTRS)

    Pelz, Richard B.; Ogot, Madara

    1998-01-01

    The global stochastic optimization method, simulated annealing (SA), was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and the roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations needed to reach an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems, including low sonic-boom bodies, minimum drag on supersonic fore-bodies, minimum drag on supersonic aeroelastic fore-bodies, minimum drag on HSCT aeroelastic wings, the FLOPS preliminary design code, another preliminary aircraft design study with vortex-lattice aerodynamics, and complete HSR aircraft aerodynamics. In every case, SA provided a simple, robust, and reliable optimization method that found optimal designs in on the order of 100 objective function evaluations. Perhaps most importantly, technology from this academic/industrial project has been successfully transferred: this method is the method of choice for optimization problems at Northrop Grumman.
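    A minimal version of the simulated annealing loop described here (random proposals, Metropolis acceptance, geometric cooling) is sketched below on a toy multimodal function. This is generic SA, not the authors' modified algorithm, and the objective is an arbitrary stand-in for a rough CFD-generated function.

```python
import math
import random

def simulated_annealing(f, x0, steps=20000, t0=5.0, cooling=0.9995, seed=1):
    """Generic SA sketch: Gaussian proposals, Metropolis acceptance,
    geometric temperature schedule. Returns the best point seen."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        cand = x + rng.gauss(0.0, 1.0)          # random neighbor proposal
        fc = f(cand)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                            # geometric cooling
    return best_x, best_f

# Toy multimodal objective standing in for a rough design-space function:
f = lambda x: x * x + 10.0 * math.sin(x)
best_x, best_f = simulated_annealing(f, x0=4.0)
```

    The uphill-acceptance term is what lets SA escape the local minima that defeat gradient-based optimizers on such landscapes.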

  1. Design method of supercavitating pumps

    NASA Astrophysics Data System (ADS)

    Kulagin, V.; Likhachev, D.; Li, F. C.

    2016-05-01

    The problem of effective supercavitating (SC) pump is solved, and optimum load distribution along the radius of the blade is found taking into account clearance, degree of cavitation development, influence of finite number of blades, and centrifugal forces. Sufficient accuracy can be obtained using the equivalent flat SC-grid for design of any SC-mechanisms, applying the “grid effect” coefficient and substituting the skewed flow calculated for grids of flat plates with the infinite attached cavitation caverns. This article gives the universal design method and provides an example of SC-pump design.

  2. Text mining describes the use of statistical and epidemiological methods in published medical research.

    PubMed

    Meaney, Christopher; Moineddin, Rahim; Voruganti, Teja; O'Brien, Mary Ann; Krueger, Paul; Sullivan, Frank

    2016-06-01

    To describe trends in the use of statistical and epidemiological methods in the medical literature over the past 2 decades. We obtained all 1,028,786 articles from the PubMed Central Open-Access archive (retrieved May 9, 2015). We focused on 113,450 medical research articles. A Delphi panel identified 177 statistical/epidemiological methods pertinent to clinical researchers. We used a text-mining approach to determine whether a specific statistical/epidemiological method was encountered in a given article. We report the proportion of articles using a specific method for the entire cross-sectional sample and also stratified into three blocks of time (1995-2005; 2006-2010; 2011-2015). Numeric descriptive statistics were commonplace (96.4% of articles). Other frequently encountered method groups included statistical inferential concepts (52.9%), epidemiological measures of association (53.5%), methods for diagnostic/classification accuracy (40.1%), hypothesis testing (28.8%), ANOVA (23.2%), and regression (22.6%). We observed relative percent increases in the use of regression (103.0%), missing-data methods (217.9%), survival analysis (147.6%), and correlated-data analysis (192.2%). This study identified commonly encountered and emergent methods used to investigate medical research problems. Clinical researchers must be aware of the methodological landscape in their field, as statistical/epidemiological methods underpin research claims. Copyright © 2015 Elsevier Inc. All rights reserved.
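    The core text-mining step (deciding whether a given statistical term appears in an article, then reporting the proportion of articles using it) reduces to pattern matching. Here is a toy version with an invented three-article corpus and three of the 177 terms; the real study's term list and matching rules are more elaborate.

```python
import re

# Toy corpus and term list (the study's Delphi panel defined 177 terms):
articles = [
    "We report means and standard deviations, and fit a logistic regression.",
    "Survival analysis with Cox regression; Kaplan-Meier curves shown.",
    "Descriptive statistics only; no hypothesis testing was performed.",
]
methods = ["regression", "survival analysis", "hypothesis testing"]

def proportion_mentioning(term, docs):
    # Whole-word, case-insensitive match, counting each document at most once.
    pattern = re.compile(r"\b" + re.escape(term) + r"\b", re.IGNORECASE)
    return sum(1 for d in docs if pattern.search(d)) / len(docs)

props = {m: proportion_mentioning(m, articles) for m in methods}
```

    Stratifying the same counts by publication year blocks yields the trend figures quoted in the abstract.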

  3. Trends in epidemiology in the 21st century: time to adopt Bayesian methods.

    PubMed

    Martinez, Edson Zangiacomi; Achcar, Jorge Alberto

    2014-04-01

    2013 marked the 250th anniversary of the presentation of Bayes' theorem by the philosopher Richard Price. Thomas Bayes was a figure little known in his own time, but in the 20th century the theorem that bears his name became widely used in many fields of research. Bayes' theorem is the basis of the so-called Bayesian methods, an approach to statistical inference that allows studies to incorporate prior knowledge about relevant data characteristics into statistical analysis. Nowadays, Bayesian methods are widely used in many different areas such as astronomy, economics, marketing, genetics, bioinformatics and the social sciences. A number of authors have discussed recent advances in techniques and the advantages of Bayesian methods for the analysis of epidemiological data. This article presents an overview of Bayesian methods, their application to epidemiological research, and the main areas of epidemiology that should benefit from the use of Bayesian methods in coming years.
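    The core Bayesian move the article describes, combining prior knowledge with observed data, is easiest to see in the conjugate Beta-Binomial case used for prevalence estimation. The prior and the survey counts below are invented for illustration.

```python
# Conjugate Beta-Binomial update for a disease prevalence (made-up numbers).
# Prior Beta(a, b); after observing `cases` out of `n`, the posterior is
# Beta(a + cases, b + n - cases) -- a direct application of Bayes' theorem.
def beta_posterior(a_prior, b_prior, cases, n):
    return a_prior + cases, b_prior + n - cases

# Prior Beta(2, 38) encodes a prior prevalence guess of 5%;
# a hypothetical survey then finds 12 cases among 200 subjects (6%).
a, b = beta_posterior(a_prior=2.0, b_prior=38.0, cases=12, n=200)
posterior_mean = a / (a + b)   # pulled between the prior (0.05) and the data (0.06)
```

    With more data the posterior mean moves toward the observed rate, which is how prior knowledge is gradually overridden by evidence.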

  4. [Psychiatric and clinico-psychologic epidemiology: critique of methods and systematization approach].

    PubMed

    Bochmann, F; Petermann, F

    1994-01-01

    The development of typical epidemiological questions is discussed. Possible definitions and fields of application for simple status diagnostics as well as for epidemiological longitudinal research are presented. Examples of the application of complex mathematical models such as LOGIT or LISREL analysis are presented. Insufficient methodological systematisation and problems in comparing studies of different origin (e.g., for meta-analysis) are identified as obstacles to the future development of epidemiology. A classification system of epidemiological methods is presented that includes content as well as methodological aspects on four levels: the demand of the study (descriptive vs. inferential), the time dimension (status diagnostics vs. longitudinal research), the level of operationalisation (micro- vs. macro-level), and the level of explanation (correlative vs. causal). Examples show that this system can be used for the classification of existing studies as well as for the conceptualisation and optimisation of planned studies.

  5. Concordance and discordance of sequence survey methods for molecular epidemiology

    PubMed Central

    Hasan, Nur A.; Cebula, Thomas A.; Colwell, Rita R.; Robison, Richard A.; Johnson, W. Evan; Crandall, Keith A.

    2015-01-01

    The post-genomic era is characterized by the direct acquisition and analysis of genomic data with many applications, including the enhancement of the understanding of microbial epidemiology and pathology. However, there are a number of molecular approaches to survey pathogen diversity, and the impact of these different approaches on parameter estimation and inference are not entirely clear. We sequenced whole genomes of bacterial pathogens, Burkholderia pseudomallei, Yersinia pestis, and Brucella spp. (60 new genomes), and combined them with 55 genomes from GenBank to address how different molecular survey approaches (whole genomes, SNPs, and MLST) impact downstream inferences on molecular evolutionary parameters, evolutionary relationships, and trait character associations. We selected isolates for sequencing to represent temporal, geographic origin, and host range variability. We found that substitution rate estimates vary widely among approaches, and that SNP and genomic datasets yielded different but strongly supported phylogenies. MLST yielded poorly supported phylogenies, especially in our low diversity dataset, i.e., Y. pestis. Trait associations showed that B. pseudomallei and Y. pestis phylogenies are significantly associated with geography, irrespective of the molecular survey approach used, while Brucella spp. phylogeny appears to be strongly associated with geography and host origin. We contrast inferences made among monomorphic (clonal) and non-monomorphic bacteria, and between intra- and inter-specific datasets. We also discuss our results in light of underlying assumptions of different approaches. PMID:25737810

  6. Concordance and discordance of sequence survey methods for molecular epidemiology.

    PubMed

    Castro-Nallar, Eduardo; Hasan, Nur A; Cebula, Thomas A; Colwell, Rita R; Robison, Richard A; Johnson, W Evan; Crandall, Keith A

    2015-01-01

    The post-genomic era is characterized by the direct acquisition and analysis of genomic data with many applications, including the enhancement of the understanding of microbial epidemiology and pathology. However, there are a number of molecular approaches to survey pathogen diversity, and the impact of these different approaches on parameter estimation and inference are not entirely clear. We sequenced whole genomes of bacterial pathogens, Burkholderia pseudomallei, Yersinia pestis, and Brucella spp. (60 new genomes), and combined them with 55 genomes from GenBank to address how different molecular survey approaches (whole genomes, SNPs, and MLST) impact downstream inferences on molecular evolutionary parameters, evolutionary relationships, and trait character associations. We selected isolates for sequencing to represent temporal, geographic origin, and host range variability. We found that substitution rate estimates vary widely among approaches, and that SNP and genomic datasets yielded different but strongly supported phylogenies. MLST yielded poorly supported phylogenies, especially in our low diversity dataset, i.e., Y. pestis. Trait associations showed that B. pseudomallei and Y. pestis phylogenies are significantly associated with geography, irrespective of the molecular survey approach used, while Brucella spp. phylogeny appears to be strongly associated with geography and host origin. We contrast inferences made among monomorphic (clonal) and non-monomorphic bacteria, and between intra- and inter-specific datasets. We also discuss our results in light of underlying assumptions of different approaches.

  7. Epidemiological methods for research with drug misusers: review of methods for studying prevalence and morbidity.

    PubMed

    Dunn, J; Ferri, C P

    1999-04-01

    Epidemiological studies of drug misusers have until recently relied on two main forms of sampling: probability and convenience. The former has been used when the aim was simply to estimate the prevalence of the condition, and the latter when in-depth studies of the characteristics, profiles and behaviour of drug users were required, but each method has its limitations. Probability samples become impracticable when the prevalence of the condition is very low (less than 0.5%, for example) or when the condition being studied is a clandestine activity such as illicit drug use. When stratified random samples are used, it may be difficult to obtain a truly representative sample, depending on the quality of the information used to develop the stratification strategy. The main limitation of studies using convenience samples is that the results cannot be generalised to the whole population of drug users, due to selection bias and a lack of information concerning the sampling frame. New methods have been developed which aim to overcome some of these difficulties, for example social network analysis, snowball sampling, capture-recapture techniques, the privileged access interviewer method, and contact tracing. All these methods have been applied to the study of drug misuse. The various methods are described and examples of their use given, drawn from both the Brazilian and international drug misuse literature.
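    Of the methods this review lists, capture-recapture is the most readily sketched in code. The two-source counts below are hypothetical, not drawn from the literature reviewed, and the classic estimators assume (often implausibly for hidden populations) that the two sources are independent.

```python
def lincoln_petersen(m, c, r):
    """Classic two-sample capture-recapture estimate of population size:
    m individuals known to source 1, c known to source 2, r known to both."""
    return m * c / r

def chapman(m, c, r):
    """Chapman's bias-corrected variant, preferred for small samples."""
    return (m + 1) * (c + 1) / (r + 1) - 1

# Hypothetical example: 200 drug users registered with one service,
# 150 with another, and 30 appearing on both lists.
n_lp = lincoln_petersen(200, 150, 30)   # 1000.0
n_ch = chapman(200, 150, 30)            # ~978.1
```

    The estimate covers the hidden population precisely because the overlap r reveals how incomplete each individual list is.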

  8. From Smallpox to Big Data: The Next 100 Years of Epidemiologic Methods.

    PubMed

    Gange, Stephen J; Golub, Elizabeth T

    2016-03-01

    For more than a century, epidemiology has seen major shifts in both focus and methodology. Taking into consideration the explosion of "big data," the advent of more sophisticated data collection and analytical tools, and the increased interest in evidence-based solutions, we present a framework that summarizes 3 fundamental domains of epidemiologic methods that are relevant for the understanding of both historical contributions and future directions in public health. First, the manner in which populations and their follow-up are defined is expanding, with greater interest in online populations whose definition does not fit the usual classification by person, place, and time. Second, traditional data collection methods, such as population-based surveillance and individual interviews, have been supplemented with advances in measurement. From biomarkers to mobile health, innovations in the measurement of exposures and diseases enable refined accuracy of data collection. Lastly, the comparison of populations is at the heart of epidemiologic methodology. Risk factor epidemiology, prediction methods, and causal inference strategies are areas in which the field is continuing to make significant contributions to public health. The framework presented herein articulates the multifaceted ways in which epidemiologic methods make such contributions and can continue to do so as we embark upon the next 100 years. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Design of a protocol for large-scale epidemiological studies in individual sports: the Swedish Athletics injury study.

    PubMed

    Jacobsson, Jenny; Timpka, Toomas; Ekberg, Joakim; Kowalski, Jan; Nilsson, Sverker; Renström, Per

    2010-12-01

    Epidemiological studies have mainly been performed on team sports. The authors set out to develop a protocol for large-scale epidemiological studies of injuries among elite athletics athletes. An argument-based method for investigation of complex design problems was used to structure the collection and analysis of data. Specification of the protocol was preceded by an examination of requirements on injury surveillance in individual sports and iterated drafting of protocol specifications, and followed by formative evaluations. The requirements analysis shows that the central demand on the protocol is to allow for detailed epidemiological analyses of overuse injuries, which subsequently requires regular collection of self-reported data from athletes. The resulting study protocol is centred on a web-based weekly athlete e-diary enabling continual collection of individual-level data on exposure and injuries. To be able to interpret the self-reported data on injury events, collection of a wide range of personal baseline data from the athlete, including a psychological profile, is included in the protocol. The resulting protocol can be employed in intervention programmes that can prevent suffering among both adult elite and youth talent athletes who have made considerable life investments in their sport.

  10. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies.

    PubMed

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-05-03

Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement processes has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL with lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL-values. This might result in spurious or lost associations in epidemiological studies under certain circumstances.

  11. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies

    PubMed Central

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-01-01

Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement processes has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL with lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL-values. This might result in spurious or lost associations in epidemiological studies under certain circumstances. PMID:27138987

  12. Case-crossover and case-time-control designs in birth defects epidemiology.

    PubMed

    Hernández-Díaz, Sonia; Hernán, Miguel A; Meyer, Katie; Werler, Martha M; Mitchell, Allen A

    2003-08-15

    The case-crossover and the case-time-control designs can be used to evaluate the effect of intermittent exposures on the risk of acute events. To explore how birth defects epidemiology could benefit from these approaches, the authors compared them with a traditional case-control study design that evaluated the association between use of folic acid antagonists during the second and third pregnancy months and the risk of cardiovascular defects. Among 3,870 cases and 8,387 control infants in the Slone Epidemiology Center Birth Defects Study (1976-1998), the odds ratio was 2.3 (95% confidence interval (CI): 1.4, 3.9). The case-crossover approach compared folic acid antagonist use between the 2-month embryologically sensitive period (case window) and the 2 months preceding the last menstrual period (control window) among mothers of case infants (odds ratio = 1.0, 95% CI: 0.5, 2.0). Although it controls between-person confounding and avoids issues of control selection, this design may be biased by time trends of exposure prevalence during pregnancy. The case-time-control design, which adjusts for exposure time trends under certain assumptions, yielded an odds ratio of 2.9 (95% CI: 1.2, 7.2), but it requires controls. In the presence of gestational time trends of exposure, the new designs do not offer clear advantages over the case-control design.
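The odds ratios reported above come from 2x2 tables of exposure by case/control status. As a minimal sketch of that calculation (with made-up counts, not the study's data), the standard Woolf log-normal confidence interval can be computed with the standard library alone:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Woolf (log-normal) confidence interval from a
    2x2 table: a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

For example, `odds_ratio_ci(10, 5, 100, 100)` gives an OR of 2.0 with a 95% CI spanning it; in a case-crossover analysis the same arithmetic is applied within persons, comparing exposure in the case window against the control window.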

  13. Trends in Citations to Books on Epidemiological and Statistical Methods in the Biomedical Literature

    PubMed Central

    Porta, Miquel; Vandenbroucke, Jan P.; Ioannidis, John P. A.; Sanz, Sergio; Fernandez, Esteve; Bhopal, Raj; Morabia, Alfredo; Victora, Cesar; Lopez, Tomàs

    2013-01-01

    Background There are no analyses of citations to books on epidemiological and statistical methods in the biomedical literature. Such analyses may shed light on how concepts and methods changed while biomedical research evolved. Our aim was to analyze the number and time trends of citations received from biomedical articles by books on epidemiological and statistical methods, and related disciplines. Methods and Findings The data source was the Web of Science. The study books were published between 1957 and 2010. The first year of publication of the citing articles was 1945. We identified 125 books that received at least 25 citations. Books first published in 1980–1989 had the highest total and median number of citations per year. Nine of the 10 most cited texts focused on statistical methods. Hosmer & Lemeshow's Applied logistic regression received the highest number of citations and highest average annual rate. It was followed by books by Fleiss, Armitage, et al., Rothman, et al., and Kalbfleisch and Prentice. Fifth in citations per year was Sackett, et al., Evidence-based medicine. The rise of multivariate methods, clinical epidemiology, or nutritional epidemiology was reflected in the citation trends. Educational textbooks, practice-oriented books, books on epidemiological substantive knowledge, and on theory and health policies were much less cited. None of the 25 top-cited books had the theoretical or sociopolitical scope of works by Cochrane, McKeown, Rose, or Morris. Conclusions Books were mainly cited to reference methods. Books first published in the 1980s continue to be most influential. Older books on theory and policies were rooted in societal and general medical concerns, while the most modern books are almost purely on methods. PMID:23667447

  14. Overview of molecular typing methods for outbreak detection and epidemiological surveillance.

    PubMed

Sabat, A J; Budimir, A; Nashev, D; Sá-Leão, R; van Dijl, J M; Laurent, F; Grundmann, H; Friedrich, A W

    2013-01-24

    Typing methods for discriminating different bacterial isolates of the same species are essential epidemiological tools in infection prevention and control. Traditional typing systems based on phenotypes, such as serotype, biotype, phage-type, or antibiogram, have been used for many years. However, more recent methods that examine the relatedness of isolates at a molecular level have revolutionised our ability to differentiate among bacterial types and subtypes. Importantly, the development of molecular methods has provided new tools for enhanced surveillance and outbreak detection. This has resulted in better implementation of rational infection control programmes and efficient allocation of resources across Europe. The emergence of benchtop sequencers using next generation sequencing technology makes bacterial whole genome sequencing (WGS) feasible even in small research and clinical laboratories. WGS has already been used for the characterisation of bacterial isolates in several large outbreaks in Europe and, in the near future, is likely to replace currently used typing methodologies due to its ultimate resolution. However, WGS is still too laborious and time-consuming to obtain useful data in routine surveillance. Also, a largely unresolved question is how genome sequences must be examined for epidemiological characterisation. In the coming years, the lessons learnt from currently used molecular methods will allow us to condense the WGS data into epidemiologically useful information. On this basis, we have reviewed current and new molecular typing methods for outbreak detection and epidemiological surveillance of bacterial pathogens in clinical practice, aiming to give an overview of their specific advantages and disadvantages.

  15. Imputation method for lifetime exposure assessment in air pollution epidemiologic studies

    PubMed Central

    2013-01-01

    against health data should be done as a function of PDI to check for consistency of results. The 1% of study subjects who lived for long durations near heavily trafficked intersections, had very high cumulative exposures. Thus, imputation methods must be designed to reproduce non-standard distributions. Conclusions Our approach meets a number of methodological challenges to extending historical exposure reconstruction over a lifetime and shows promise for environmental epidemiology. Application to assessment of breast cancer risks will be reported in a subsequent manuscript. PMID:23919666

  16. Trends in citations to books on epidemiological and statistical methods in the biomedical literature.

    PubMed

    Porta, Miquel; Vandenbroucke, Jan P; Ioannidis, John P A; Sanz, Sergio; Fernandez, Esteve; Bhopal, Raj; Morabia, Alfredo; Victora, Cesar; Lopez, Tomàs

    2013-01-01

    There are no analyses of citations to books on epidemiological and statistical methods in the biomedical literature. Such analyses may shed light on how concepts and methods changed while biomedical research evolved. Our aim was to analyze the number and time trends of citations received from biomedical articles by books on epidemiological and statistical methods, and related disciplines. The data source was the Web of Science. The study books were published between 1957 and 2010. The first year of publication of the citing articles was 1945. We identified 125 books that received at least 25 citations. Books first published in 1980-1989 had the highest total and median number of citations per year. Nine of the 10 most cited texts focused on statistical methods. Hosmer & Lemeshow's Applied logistic regression received the highest number of citations and highest average annual rate. It was followed by books by Fleiss, Armitage, et al., Rothman, et al., and Kalbfleisch and Prentice. Fifth in citations per year was Sackett, et al., Evidence-based medicine. The rise of multivariate methods, clinical epidemiology, or nutritional epidemiology was reflected in the citation trends. Educational textbooks, practice-oriented books, books on epidemiological substantive knowledge, and on theory and health policies were much less cited. None of the 25 top-cited books had the theoretical or sociopolitical scope of works by Cochrane, McKeown, Rose, or Morris. Books were mainly cited to reference methods. Books first published in the 1980s continue to be most influential. Older books on theory and policies were rooted in societal and general medical concerns, while the most modern books are almost purely on methods.

  17. Outcome modelling strategies in epidemiology: traditional methods and basic alternatives

    PubMed Central

    Greenland, Sander; Daniel, Rhian; Pearce, Neil

    2016-01-01

    Controlling for too many potential confounders can lead to or aggravate problems of data sparsity or multicollinearity, particularly when the number of covariates is large in relation to the study size. As a result, methods to reduce the number of modelled covariates are often deployed. We review several traditional modelling strategies, including stepwise regression and the ‘change-in-estimate’ (CIE) approach to deciding which potential confounders to include in an outcome-regression model for estimating effects of a targeted exposure. We discuss their shortcomings, and then provide some basic alternatives and refinements that do not require special macros or programming. Throughout, we assume the main goal is to derive the most accurate effect estimates obtainable from the data and commercial software. Allowing that most users must stay within standard software packages, this goal can be roughly approximated using basic methods to assess, and thereby minimize, mean squared error (MSE). PMID:27097747
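The change-in-estimate (CIE) rule reviewed above is easy to state concretely: refit the outcome model without a candidate confounder and keep it only if the exposure coefficient shifts materially. A sketch with hypothetical coefficients (the 10% cut-off is the conventional default, not a value taken from this paper):

```python
def cie_keep(beta_full, beta_reduced, threshold=0.10):
    """Change-in-estimate rule: retain a candidate confounder if dropping
    it from the model shifts the exposure coefficient by more than
    `threshold` in relative terms."""
    return abs(beta_reduced - beta_full) / abs(beta_full) > threshold

# Hypothetical exposure coefficients: beta_full from the model with all
# candidates included, and the coefficient after dropping each one in turn.
beta_full = 0.40
beta_without = {"age": 0.52, "sex": 0.41, "smoking": 0.31}
kept = {name for name, b in beta_without.items() if cie_keep(beta_full, b)}
# age moves the estimate by 30%, smoking by 22.5%, sex by only 2.5%
```

As the paper notes, such rules are heuristics; the authors' point is that they should be judged against the goal of minimizing mean squared error, not applied mechanically.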

  18. An updated systematic review of epidemiological evidence on hormonal contraceptive methods and HIV acquisition in women

    PubMed Central

    Polis, Chelsea B.; Curtis, Kathryn M.; Hannaford, Philip C.; Phillips, Sharon J.; Chipato, Tsungai; Kiarie, James N.; Westreich, Daniel J.; Steyn, Petrus S.

    2016-01-01

    Objective and design: Some studies suggest that specific hormonal contraceptive methods [particularly depot medroxyprogesterone acetate (DMPA)] may increase women's HIV acquisition risk. We updated a systematic review to incorporate recent epidemiological data. Methods: We searched for articles published between 15 January 2014 and 15 January 2016 and hand-searched reference lists. We identified longitudinal studies comparing users of a specific hormonal contraceptive method against either nonusers of hormonal contraception or users of another specific hormonal contraceptive method. We added newly identified studies to those in the previous review, assessed study quality, created forest plots to display results, and conducted a meta-analysis for data on DMPA versus non-use of hormonal contraception. Results: We identified 10 new reports of which five were considered ‘unlikely to inform the primary question’. We focus on the other five reports, along with nine from the previous review, which were considered ‘informative but with important limitations’. The preponderance of data for oral contraceptive pills, injectable norethisterone enanthate, and levonorgestrel implants do not suggest an association with HIV acquisition, though data for implants are limited. The new, higher quality studies on DMPA (or nondisaggregated injectables), which had mixed results in terms of statistical significance, had hazard ratios between 1.2 and 1.7, consistent with our meta-analytic estimate for all higher quality studies of hazard ratio 1.4. Conclusion: Although confounding in these observational data cannot be excluded, new information increases concerns about DMPA and HIV acquisition risk in women. If the association is causal, the magnitude of effect is likely hazard ratio 1.5 or less. Data for other hormonal contraceptive methods, including norethisterone enanthate, are largely reassuring. PMID:27500670
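Meta-analytic pooling of hazard ratios of the kind described above is typically done by inverse-variance weighting on the log-HR scale. A fixed-effect sketch with illustrative inputs (not the review's actual study estimates), backing out each standard error from the reported 95% CI:

```python
import math

def pooled_hazard_ratio(studies, z=1.96):
    """Fixed-effect inverse-variance pooling on the log-HR scale.
    `studies` holds (hr, ci_low, ci_high) tuples from 95% CIs."""
    num = den = 0.0
    for hr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE of log-HR from the CI
        w = 1.0 / se ** 2
        num += w * math.log(hr)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            math.exp(log_pooled - z * se_pooled),
            math.exp(log_pooled + z * se_pooled))
```

A random-effects version would additionally estimate between-study heterogeneity and widen the weights accordingly; the review's pooled estimate of 1.4 for DMPA is of this general form.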

  19. [Retrospective exposure assessment in occupational epidemiology: principles and methods].

    PubMed

    Cocco, P

    2010-01-01

Occupational histories in case-control studies typically include a variety of past exposure circumstances and no monitoring data, posing serious challenges to the retrospective assessment of occupational exposures. METHODS: Examples from the EPILYMPH case-control study on lymphoma risk are used to introduce principles and methods of retrospective assessment of occupational exposures. Exposure assessment consists of several indicators, such as frequency and intensity of exposure, as well as a confidence score expressing the occupational expert's own judgement on the reliability of the assessment itself. Testing the null hypothesis from multiple perspectives strengthens inference: while trends by the individual exposure indicators were each of borderline statistical significance, testing the association between CLL risk and exposure to ethylene oxide with Fisher's test for combined testing of multiple probabilities yielded a p-value of 0.003. Using the occupational expert's assessment as the gold standard, the specificity of a prior job-exposure matrix for benzene was 93% and its sensitivity 40%, with positive and negative predictive values ranging from 71% to 77%. Provided that bias can be excluded, and assuming a true association between exposure and disease, retrospective exposure assessment can only underestimate the true risk, whose size also depends on the frequency of the exposure itself.
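Fisher's test for combined testing of multiple probabilities, named above as the route to the p-value of 0.003, can be sketched in a few lines. The statistic -2 Σ ln p_i follows a chi-square distribution with 2k degrees of freedom under the null; because the degrees of freedom are even, the chi-square tail probability has a closed form and no statistics library is needed:

```python
import math

def fisher_combined_p(pvalues):
    """Fisher's combined probability test: X2 = -2 * sum(ln p_i) follows a
    chi-square distribution with 2k df under the null. For even df the
    survival function is sf(x; 2k) = exp(-x/2) * sum_{j<k} (x/2)^j / j!"""
    k = len(pvalues)
    half = -sum(math.log(p) for p in pvalues)  # this is x/2
    return math.exp(-half) * sum(half ** j / math.factorial(j)
                                 for j in range(k))
```

Three individually borderline p-values (e.g. 0.04, 0.06, 0.05, hypothetical here) combine to a clearly significant result, which is exactly the "multiple perspectives" point made in the abstract.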

  20. [Mendelian randomisation - a genetic approach to an epidemiological method].

    PubMed

    Stensrud, Mats Julius

    2016-06-01

BACKGROUND Genetic information is becoming more easily available, and rapid progress is being made in developing methods of illuminating issues of interest. Mendelian randomisation makes it possible to study causes of disease using observational data. The name refers to the random distribution of gene variants in meiosis. The methodology makes use of genes that influence a risk factor for a disease, without influencing the disease itself. In this review article I explain the principles behind Mendelian randomisation and present the areas of application for this methodology. MATERIAL AND METHOD Methodology articles describing Mendelian randomisation were reviewed. The articles were found through a search in PubMed with the combination «mendelian randomization» OR «mendelian randomisation», and a search in McMaster Plus with the combination «mendelian randomization». A total of 15 methodology articles were read in full text. Methodology articles were supplemented by clinical studies found in the PubMed search. RESULTS In contrast to traditional observational studies, Mendelian randomisation studies are not affected by two important sources of error: conventional confounding variables and reverse causation. Mendelian randomisation is therefore a promising tool for studying causality. Mendelian randomisation studies have already provided valuable knowledge on the risk factors for a wide range of diseases. It is nevertheless important to be aware of the limitations of the methodology. As a result of the rapid developments in genetics research, Mendelian randomisation will probably be widely used in future years. INTERPRETATION If Mendelian randomisation studies are conducted correctly, they may help to reveal both modifiable and non-modifiable causes of disease.
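As a minimal illustration of the principle behind Mendelian randomisation (ours, not the article's), the simplest estimator is the Wald ratio: the causal effect of an exposure X on an outcome Y is the gene-outcome association divided by the gene-exposure association for an instrument that affects Y only through X. All numbers below are hypothetical:

```python
def wald_ratio(beta_gx, beta_gy, se_gy):
    """Wald ratio estimator for Mendelian randomisation:
    beta_gx = association of the genetic variant with the exposure,
    beta_gy = association of the variant with the outcome.
    The SE uses the first-order delta method (ignores error in beta_gx)."""
    return beta_gy / beta_gx, se_gy / abs(beta_gx)

# Hypothetical variant: raises the exposure by 0.5 SD and the outcome by 0.1
effect, se = wald_ratio(beta_gx=0.5, beta_gy=0.1, se_gy=0.02)
```

The validity of the result rests on the instrument assumptions the article discusses (no pleiotropy, no confounding of the variant), not on the arithmetic itself.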

  1. Educational epidemiology: applying population-based design and analytic approaches to study medical education.

    PubMed

    Carney, Patricia A; Nierenberg, David W; Pipas, Catherine F; Brooks, W Blair; Stukel, Therese A; Keller, Adam M

    2004-09-01

    Conducting educational research in medical schools is challenging partly because interventional controlled research designs are difficult to apply. In addition, strict accreditation requirements and student/faculty concerns about educational inequality reduce the flexibility needed to plan and execute educational experiments. Consequently, there is a paucity of rigorous and generalizable educational research to provide an evidence-guided foundation to support educational effectiveness. "Educational epidemiology," ie, the application across the physician education continuum of observational designs (eg, cross-sectional, longitudinal, cohort, and case-control studies) and randomized experimental designs (eg, randomized controlled trials, randomized crossover designs), could revolutionize the conduct of research in medical education. Furthermore, the creation of a comprehensive national network of educational epidemiologists could enhance collaboration and the development of a strong educational research foundation.

  2. [Statistical and epidemiological methods used in biomedical research: implications for initial medical education].

    PubMed

    Picat, M-Q; Savès, M; Asselineau, J; Dumoulin, M; Coureau, G; Salmi, L-R; Perez, P; Chêne, G

    2013-06-01

The main source of key medical information consists of original articles published in peer-reviewed biomedical journals. Reported studies use increasingly sophisticated statistical and epidemiological approaches that first require a solid understanding of core methods. However, such understanding is not widely shared among physicians. Our aim was to assess whether the basic statistical and epidemiological methods used in original articles published in general biomedical journals are taught during the first years of the medical curriculum in France. We selected original articles published in The New England Journal of Medicine, The Lancet, and The Journal of the American Medical Association, over a period of six months in 2007 and in 2008. A standardized statistical content checklist was used to extract the necessary information in the "Abstract", "Methods", "Results", footnotes of tables, and legends of figures. The methods used in the selected articles were compared to the national program and the public health program of biostatistics and epidemiology taught during the first six years of medical school. The 237 analyzed original articles all used at least one statistical or epidemiological method. Descriptive statistics, confidence intervals and Chi-square or Fisher tests, methods used in more than 50% of articles, were repeatedly taught throughout the medicine curriculum. Measures of association, sample size, fit and the Kaplan-Meier method, used in 40 to 50% of articles, were specifically taught during training sessions on critical reading methods. The Cox model (41% of articles) and logistic regression (24% of articles) were never taught. The most widely used illustrations, contingency tables (92%) and flowcharts (48%), were not included in the national program. More teaching of the core methods underlying the understanding of sophisticated methods and illustrations should be included in the early medical curriculum so that physicians can read the scientific literature

  3. Epidemiological surveillance methods for vector-borne diseases.

    PubMed

    Thompson, P N; Etter, E

    2015-04-01

    Compared with many other diseases, the ever-increasing threat of vector-borne diseases (VBDs) represents a great challenge to public and animal health managers. Complex life cycles, changing distribution ranges, a variety of potential vectors and hosts, and the possible role of reservoirs make surveillance for VBDs a grave concern in a changing environment with increasing economic constraints. Surveillance activities may have various specific objectives and may focus on clinical disease, pathogens, vectors, hosts and/or reservoirs, but ultimately such activities should improve our ability to predict, prevent and/or control the diseases concerned. This paper briefly reviews existing and newly developed tools for the surveillance of VBDs. A range of examples, by no means exhaustive, illustrates that VBD surveillance usually involves a combination of methods to achieve its aims, and is best accomplished when these techniques are adapted to the specific environment and constraints of the region. More so than any other diseases, VBDs respect no administrative boundaries; in addition, animal, human and commodity movements are increasing dramatically, with illegal or unknown movements difficult to quantify. Vector-borne disease surveillance therefore becomes a serious issue for local and national organisations and is being conducted more and more at the regional and international level through multidisciplinary networks. With economic and logistical constraints, tools for optimising and evaluating the performance of surveillance systems are essential and examples of recent developments in this area are included. The continuous development of mapping, analytical and modelling tools provides us with an enhanced ability to interpret, visualise and communicate surveillance results. This review also demonstrates the importance of the link between surveillance and research, with interactions and benefits in both directions.

  4. Statistical methods for bivariate spatial analysis in marked points. Examples in spatial epidemiology.

    PubMed

    Souris, Marc; Bichaud, Laurence

    2011-12-01

This article presents methods to analyze global spatial relationships between two variables in two different sets of fixed points. Analysis of spatial relationships between two phenomena is of great interest in health geography and epidemiology, especially to highlight competition between phenomena or evidence of a common environmental factor. Our general approach extends the Moran and Pearson indices to the bivariate case in two different sets of points. The case where the variables are Boolean is treated separately through methods using nearest-neighbor distances. All tests use Monte-Carlo simulations to estimate their probability distributions, with options to distinguish spatial and non-spatial correlation in the special case of identical sets analysis. Implementation in a Geographic Information System (SavGIS) and real examples are used to illustrate these spatial indices and methods in epidemiology. Copyright © 2011 Elsevier Ltd. All rights reserved.
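The Monte-Carlo testing idea described above can be sketched in a simplified, non-spatial form: the null distribution of the statistic is built by shuffling one variable over the shared set of points. (The paper's bivariate spatial indices would replace plain Pearson correlation, but the permutation logic is the same; this sketch is ours, not SavGIS code.)

```python
import math
import random

def pearson_r(a, b):
    """Plain Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def monte_carlo_p(x, y, n_perm=999, seed=42):
    """Two-sided permutation p-value for the observed correlation:
    shuffle y over the point set and count statistics at least as extreme."""
    observed = pearson_r(x, y)
    rng = random.Random(seed)
    yy = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(yy)
        if abs(pearson_r(x, yy)) >= abs(observed):
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)
```

The `(hits + 1) / (n_perm + 1)` form is the standard Monte-Carlo p-value, which never returns exactly zero.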

  5. Epidemiological typing of clinical isolates of Achromobacter xylosoxidans: comparison of phenotypic and genotypic methods.

    PubMed

    Kaur, M; Ray, P; Bhatty, M; Sharma, M

    2009-09-01

    The purpose of this paper is to evaluate the utility of different typing methods for Achromobacter xylosoxidans clinical isolates. Ninety-two blood culture isolates of A. xylosoxidans subsp. xylosoxidans were collected over a 25-month period. The typeability, discriminatory power and reproducibility of commonly used phenotypic and genotypic methods, such as resistotyping, plasmid profiling, whole-cell protein fingerprinting, random amplification of polymorphic DNA (RAPD) and pulsed-field gel electrophoresis (PFGE), were compared. All 92 isolates were typeable by all of the methods used, with comparable reproducibility. PFGE showed the highest discriminatory power (98.9%), but whole-cell protein profiling showed better correlation with epidemiological data without significant loss in discriminatory power (94%). Whole-cell protein profiling is a reliable epidemiological tool for the analysis of A. xylosoxidans; PFGE is the most discriminatory.

  6. Violent crime in San Antonio, Texas: an application of spatial epidemiological methods.

    PubMed

    Sparks, Corey S

    2011-12-01

    Violent crimes are rarely considered a public health problem or investigated using epidemiological methods. But patterns of violent crime and other health conditions are often affected by similar characteristics of the built environment. In this paper, methods and perspectives from spatial epidemiology are used in an analysis of violent crimes in San Antonio, TX. Bayesian statistical methods are used to examine the contextual influence of several aspects of the built environment. Additionally, spatial regression models using Bayesian model specifications are used to examine spatial patterns of violent crime risk. Results indicate that the determinants of violent crime depend on the model specification, but are primarily related to the built environment and neighborhood socioeconomic conditions. Results are discussed within the context of a rapidly growing urban area with a diverse population.

  7. Experimental design methods for bioengineering applications.

    PubMed

    Keskin Gündoğdu, Tuğba; Deniz, İrem; Çalışkan, Gülizar; Şahin, Erdem Sefa; Azbar, Nuri

    2016-01-01

    Experimental design is a form of process analysis in which certain factors are selected to obtain the desired responses of interest. It may also be used for the determination of the effects of various independent factors on a dependent factor. The bioengineering discipline includes many different areas of scientific interest, and each study area is affected and governed by many different factors. Briefly analyzing the important factors and selecting an experimental design for optimization are very effective tools for the design of any bioprocess under question. This review summarizes experimental design methods that can be used to investigate various factors relating to bioengineering processes. The experimental methods generally used in bioengineering are as follows: full factorial design, fractional factorial design, Plackett-Burman design, Taguchi design, Box-Behnken design and central composite design. These design methods are briefly introduced, and then the application of these design methods to study different bioengineering processes is analyzed.
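The first of the listed methods, the full factorial design, simply enumerates every combination of factor levels. A minimal sketch with hypothetical bioprocess factors (the factor names and levels are illustrative, not from the review):

```python
from itertools import product

def full_factorial(levels):
    """Enumerate every combination of factor levels (a full factorial
    design). `levels` maps factor name -> list of levels to test."""
    names = list(levels)
    return [dict(zip(names, combo))
            for combo in product(*(levels[n] for n in names))]

# A 2 x 3 design gives 6 experimental runs
design = full_factorial({"temperature_C": [30, 37], "pH": [6.5, 7.0, 7.5]})
```

Fractional factorial, Plackett-Burman and the other listed designs exist precisely because this run count grows multiplicatively; they trade some interaction information for far fewer runs.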

  8. Discriminatory Indices of Typing Methods for Epidemiologic Analysis of Contemporary Staphylococcus aureus Strains

    PubMed Central

    Rodriguez, Marcela; Hogan, Patrick G.; Satola, Sarah W.; Crispell, Emily; Wylie, Todd; Gao, Hongyu; Sodergren, Erica; Weinstock, George M.; Burnham, Carey-Ann D.; Fritz, Stephanie A.

    2015-01-01

Historically, a number of typing methods have been evaluated for Staphylococcus aureus strain characterization. The emergence of contemporary strains of community-associated S. aureus, and the ensuing epidemic with a predominant strain type (USA300), necessitates re-evaluation of the discriminatory power of these typing methods for discerning molecular epidemiology and transmission dynamics, essential to investigations of hospital and community outbreaks. We compared the discriminatory index of 5 typing methods for contemporary S. aureus strain characterization. Children presenting to St. Louis Children's Hospital and community pediatric practices in St. Louis, Missouri (MO), with community-associated S. aureus infections were enrolled. Repetitive sequence-based PCR (repPCR), pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), staphylococcal protein A (spa), and staphylococcal cassette chromosome (SCC) mec typing were performed on 200 S. aureus isolates. The discriminatory index of each method was calculated using the standard formula for this metric, where a value of 1 is highly discriminatory and a value of 0 is not discriminatory. Overall, we identified 26 distinct strain types by repPCR, 17 strain types by PFGE, 30 strain types by MLST, 68 strain types by spa typing, and 5 strain types by SCCmec typing. RepPCR had the highest discriminatory index (D) of all methods (D = 0.88), followed by spa typing (D = 0.87), MLST (D = 0.84), PFGE (D = 0.76), and SCCmec typing (D = 0.60). The method with the highest D among MRSA isolates was repPCR (D = 0.64) followed by spa typing (D = 0.45) and MLST (D = 0.44). The method with the highest D among MSSA isolates was spa typing (D = 0.98), followed by MLST (D = 0.93), repPCR (D = 0.92), and PFGE (D = 0.89). Among isolates designated USA300 by PFGE, repPCR was most discriminatory, with 10 distinct strain types identified (D = 0.63). We

  9. Discriminatory Indices of Typing Methods for Epidemiologic Analysis of Contemporary Staphylococcus aureus Strains.

    PubMed

    Rodriguez, Marcela; Hogan, Patrick G; Satola, Sarah W; Crispell, Emily; Wylie, Todd; Gao, Hongyu; Sodergren, Erica; Weinstock, George M; Burnham, Carey-Ann D; Fritz, Stephanie A

    2015-09-01

    Historically, a number of typing methods have been evaluated for Staphylococcus aureus strain characterization. The emergence of contemporary strains of community-associated S. aureus, and the ensuing epidemic with a predominant strain type (USA300), necessitates re-evaluation of the discriminatory power of these typing methods for discerning molecular epidemiology and transmission dynamics, essential to investigations of hospital and community outbreaks. We compared the discriminatory index of 5 typing methods for contemporary S. aureus strain characterization. Children presenting to St. Louis Children's Hospital and community pediatric practices in St. Louis, Missouri (MO), with community-associated S. aureus infections were enrolled. Repetitive sequence-based PCR (repPCR), pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), staphylococcal protein A (spa), and staphylococcal cassette chromosome (SCC) mec typing were performed on 200 S. aureus isolates. The discriminatory index of each method was calculated using the standard formula for this metric, where a value of 1 is highly discriminatory and a value of 0 is not discriminatory. Overall, we identified 26 distinct strain types by repPCR, 17 strain types by PFGE, 30 strain types by MLST, 68 strain types by spa typing, and 5 strain types by SCCmec typing. RepPCR had the highest discriminatory index (D) of all methods (D = 0.88), followed by spa typing (D = 0.87), MLST (D = 0.84), PFGE (D = 0.76), and SCCmec typing (D = 0.60). The method with the highest D among MRSA isolates was repPCR (D = 0.64) followed by spa typing (D = 0.45) and MLST (D = 0.44). The method with the highest D among MSSA isolates was spa typing (D = 0.98), followed by MLST (D = 0.93), repPCR (D = 0.92), and PFGE (D = 0.89). Among isolates designated USA300 by PFGE, repPCR was most discriminatory, with 10 distinct strain types identified (D = 0.63). We identified 45
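The "standard formula" for the discriminatory index referred to above is conventionally the Hunter-Gaston application of Simpson's index of diversity: the probability that two isolates drawn at random are assigned different types. It gives 1 for a method that separates every isolate and 0 for one that separates none:

```python
def discriminatory_index(type_counts):
    """Simpson's index of diversity as applied to strain typing
    (Hunter-Gaston): D = 1 - [1 / (N(N-1))] * sum_j n_j * (n_j - 1),
    where n_j is the number of isolates assigned to strain type j
    and N is the total number of isolates."""
    n = sum(type_counts)
    return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))
```

This makes the study's pattern intuitive: spa typing scores highest among MSSA because it splits those isolates into many small groups, while every method's D drops among MRSA, where USA300 dominates the counts.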

  10. Review of freeform TIR collimator design methods

    NASA Astrophysics Data System (ADS)

    Talpur, Taimoor; Herkommer, Alois

    2016-04-01

    Total internal reflection (TIR) collimators are essential illumination components providing high efficiency and uniformity in a compact geometry. Various illumination design methods have been developed for designing such collimators, including tailoring methods, design via optimization, the mapping and feedback method, and the simultaneous multiple surface (SMS) method. This paper provides an overview of the different methods and compares the performance of the methods along with their advantages and their limitations.

  11. Comparing Two Epidemiologic Surveillance Methods to Assess Underestimation of Human Stampedes in India

    PubMed Central

    Ngai, Ka Ming; Lee, Wing Yan; Madan, Aditi; Sanyal, Saswata; Roy, Nobhojit; Burkle, Frederick M.; Hsu, Edbert B.

    2013-01-01

Background: Two separate but complementary epidemiologic surveillance methods for human stampedes have emerged since the topic was first described in a 2009 publication. The objective of this study is to estimate the degree of underreporting in India. Method: The Ngai Search Method was compared to the Roy Search Method for human stampede events occurring in India between 2001 and 2010. Results: A total of 40 stampedes were identified by both search methods. Using the Ngai method, 34 human stampedes were identified. Using a previously defined stampede scale: 2 events were class I, 21 events were class II, 8 events were class III, and 3 events were class IV. The median number of deaths was 5.5 per event and the median number of injuries was 13.5 per event. Using the Roy method, 27 events were identified, including 9 events that were not identified by the Ngai method. After applying the exclusion criteria, the six additional events identified by the Roy method had a median of 4 deaths and 30 injuries. In multivariate analysis using the Ngai method, religious (6.52, 95%CI 1.73-24.66, p=0.006) and political (277.09, 95%CI 5.12-15,001.96, p=0.006) events were associated with a higher relative number of deaths. Conclusion: Many causes accounting for the global increase in human stampede events can only be elucidated through systematic epidemiological investigation. Focusing on a country with a high recurrence of human stampedes, we compared two independent methods of data abstraction in an effort to improve the existing database and to identify pertinent risk factors. We conclude that our previous publication underestimated stampede events in India by approximately 18%, and that an international standardized database to systematically record the occurrence of human stampedes is needed to facilitate understanding of their epidemiology. PMID:24077300

  12. Rationale and Design of the International Lymphoma Epidemiology Consortium (InterLymph) Non-Hodgkin Lymphoma Subtypes Project

    PubMed Central

    Morton, Lindsay M.; Sampson, Joshua N.; Cerhan, James R.; Turner, Jennifer J.; Vajdic, Claire M.; Wang, Sophia S.; Smedby, Karin E.; de Sanjosé, Silvia; Monnereau, Alain; Benavente, Yolanda; Bracci, Paige M.; Chiu, Brian C. H.; Skibola, Christine F.; Zhang, Yawei; Mbulaiteye, Sam M.; Spriggs, Michael; Robinson, Dennis; Norman, Aaron D.; Kane, Eleanor V.; Spinelli, John J.; Kelly, Jennifer L.; Vecchia, Carlo La; Dal Maso, Luigino; Maynadié, Marc; Kadin, Marshall E.; Cocco, Pierluigi; Costantini, Adele Seniori; Clarke, Christina A.; Roman, Eve; Miligi, Lucia; Colt, Joanne S.; Berndt, Sonja I.; Mannetje, Andrea; de Roos, Anneclaire J.; Kricker, Anne; Nieters, Alexandra; Franceschi, Silvia; Melbye, Mads; Boffetta, Paolo; Clavel, Jacqueline; Linet, Martha S.; Weisenburger, Dennis D.; Slager, Susan L.

    2014-01-01

    Background Non-Hodgkin lymphoma (NHL), the most common hematologic malignancy, consists of numerous subtypes. The etiology of NHL is incompletely understood, and increasing evidence suggests that risk factors may vary by NHL subtype. However, small numbers of cases have made investigation of subtype-specific risks challenging. The International Lymphoma Epidemiology Consortium therefore undertook the NHL Subtypes Project, an international collaborative effort to investigate the etiologies of NHL subtypes. This article describes in detail the project rationale and design. Methods We pooled individual-level data from 20 case-control studies (17471 NHL cases, 23096 controls) from North America, Europe, and Australia. Centralized data harmonization and analysis ensured standardized definitions and approaches, with rigorous quality control. Results The pooled study population included 11 specified NHL subtypes with more than 100 cases: diffuse large B-cell lymphoma (N = 4667), follicular lymphoma (N = 3530), chronic lymphocytic leukemia/small lymphocytic lymphoma (N = 2440), marginal zone lymphoma (N = 1052), peripheral T-cell lymphoma (N = 584), mantle cell lymphoma (N = 557), lymphoplasmacytic lymphoma/Waldenström macroglobulinemia (N = 374), mycosis fungoides/Sézary syndrome (N = 324), Burkitt/Burkitt-like lymphoma/leukemia (N = 295), hairy cell leukemia (N = 154), and acute lymphoblastic leukemia/lymphoma (N = 152). Associations with medical history, family history, lifestyle factors, and occupation for each of these 11 subtypes are presented in separate articles in this issue, with a final article quantitatively comparing risk factor patterns among subtypes. Conclusions The International Lymphoma Epidemiology Consortium NHL Subtypes Project provides the largest and most comprehensive investigation of potential risk factors for a broad range of common and rare NHL subtypes to date. The analyses contribute to our understanding of the multifactorial nature of NHL

  13. An internet-based method of selecting control populations for epidemiologic studies.

    PubMed

    Stone, Mary Bishop; Lyon, Joseph L; Simonsen, Sara Ellis; White, George L; Alder, Stephen C

    2007-01-01

    Identifying control subjects for epidemiologic studies continues to increase in difficulty because of changes in telephone technology such as answering services and machines, caller identification, and cell phones. An Internet-based method for obtaining study subjects that may increase response rates has been developed and is described. This method uses information from two websites that, when combined, provide accurate and complete lists of names, addresses, and listed phone numbers. This method was developed by use of randomly selected streets in a suburb of Salt Lake City, Utah, in June 2005.

  14. Spacesuit Radiation Shield Design Methods

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Anderson, Brooke M.; Cucinotta, Francis A.; Ware, J.; Zeitlin, Cary J.

    2006-01-01

    Meeting radiation protection requirements during EVA is predominantly an operational issue with some potential considerations for temporary shelter. The issue of spacesuit shielding is mainly guided by the potential of accidental exposure when operational and temporary shelter considerations fail to maintain exposures within operational limits. In this case, very high exposure levels are possible which could result in observable health effects and even be life threatening. Under these assumptions, potential spacesuit radiation exposures have been studied using known historical solar particle events to gain insight on the usefulness of modification of spacesuit design in which the control of skin exposure is a critical design issue and reduction of blood forming organ exposure is desirable. Transition to a new spacesuit design including soft upper-torso and reconfigured life support hardware gives an opportunity to optimize the next generation spacesuit for reduced potential health effects during an accidental exposure.

  15. [Methodical reflections on epidemiological methods to measure adverse medical device events].

    PubMed

    Lessing, C

    2009-06-01

Drugs and medical devices are common remedies in patient care. Concerning patient safety, much research has examined medication-related events, such as adverse drug events or medication errors; however, little is known about device-related events and patient safety. Until now, only one survey on the epidemiology of adverse medical device events has been published, with estimates of 8.4 adverse medical device events per 100 hospitalizations. As this indicates, further research on epidemiological methodology is needed to investigate the frequency, distribution, causes and consequences of medical device-related events. Only profound knowledge will constitute a solid basis for the development of safety strategies that can then be implemented and evaluated. The special challenges described for data collection must also be mastered in the German health care system.

  16. Empirical Evidence of Study Design Biases in Randomized Trials: Systematic Review of Meta-Epidemiological Studies

    PubMed Central

    Page, Matthew J.; Higgins, Julian P. T.; Clayton, Gemma; Sterne, Jonathan A. C.; Hróbjartsson, Asbjørn; Savović, Jelena

    2016-01-01

    Objective To synthesise evidence on the average bias and heterogeneity associated with reported methodological features of randomized trials. Design Systematic review of meta-epidemiological studies. Methods We retrieved eligible studies included in a recent AHRQ-EPC review on this topic (latest search September 2012), and searched Ovid MEDLINE and Ovid EMBASE for studies indexed from Jan 2012-May 2015. Data were extracted by one author and verified by another. We combined estimates of average bias (e.g. ratio of odds ratios (ROR) or difference in standardised mean differences (dSMD)) in meta-analyses using the random-effects model. Analyses were stratified by type of outcome (“mortality” versus “other objective” versus “subjective”). Direction of effect was standardised so that ROR < 1 and dSMD < 0 denotes a larger intervention effect estimate in trials with an inadequate or unclear (versus adequate) characteristic. Results We included 24 studies. The available evidence suggests that intervention effect estimates may be exaggerated in trials with inadequate/unclear (versus adequate) sequence generation (ROR 0.93, 95% CI 0.86 to 0.99; 7 studies) and allocation concealment (ROR 0.90, 95% CI 0.84 to 0.97; 7 studies). For these characteristics, the average bias appeared to be larger in trials of subjective outcomes compared with other objective outcomes. Also, intervention effects for subjective outcomes appear to be exaggerated in trials with lack of/unclear blinding of participants (versus blinding) (dSMD -0.37, 95% CI -0.77 to 0.04; 2 studies), lack of/unclear blinding of outcome assessors (ROR 0.64, 95% CI 0.43 to 0.96; 1 study) and lack of/unclear double blinding (ROR 0.77, 95% CI 0.61 to 0.93; 1 study). The influence of other characteristics (e.g. unblinded trial personnel, attrition) is unclear. Conclusions Certain characteristics of randomized trials may exaggerate intervention effect estimates. The average bias appears to be greatest in trials of
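The pooling described, combining study-level ratios of odds ratios (RORs) under a random-effects model, can be sketched with the DerSimonian-Laird estimator. The numbers below are hypothetical, not the review's data:

```python
import math

def pool_ror_random_effects(rors, ses):
    """DerSimonian-Laird random-effects pooling of RORs, working on the
    log scale. `rors` are study-level RORs; `ses` are the standard errors
    of the log-RORs. Returns the pooled ROR and its 95% CI."""
    y = [math.log(r) for r in rors]
    w = [1.0 / s ** 2 for s in ses]          # inverse-variance weights
    yf = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)   # fixed-effect mean
    q = sum(wi * (yi - yf) ** 2 for wi, yi in zip(w, y)) # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c) if c > 0 else 0.0  # between-study variance
    wr = [1.0 / (s ** 2 + tau2) for s in ses]
    yr = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    se = math.sqrt(1.0 / sum(wr))
    return math.exp(yr), (math.exp(yr - 1.96 * se), math.exp(yr + 1.96 * se))

# Hypothetical example: three meta-epidemiological studies
print(pool_ror_random_effects([0.90, 0.95, 0.88], [0.04, 0.05, 0.06]))
```

A pooled ROR below 1 corresponds, under the standardisation used in the abstract, to larger intervention effect estimates in trials with the inadequate or unclear characteristic.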

  17. West Nile virus in Europe: a comparison of surveillance system designs in a changing epidemiological context.

    PubMed

    Chevalier, Veronique; Lecollinet, Sylvie; Durand, Benoit

    2011-08-01

Current knowledge suggests that there is a low-level and recurrent circulation of West Nile virus (WNV) in Europe, with sporadic human and/or equine cases. However, recent events indicate that this picture is changing, raising the possibility that Europe could experience a modification in the virus' circulation patterns. We used an existing model of WNV circulation between Southern Europe and West Africa to estimate the sample size of equivalent West Nile surveillance systems, either passive (based upon horse populations and sentinel veterinarians) or active (sentinel horses, sentinel chickens, or WNV genome detection in trapped mosquito pools). The costs and calendar day of first detection of these different surveillance systems were compared under three different epidemiological scenarios: very low level circulation, low level recurrent circulation, and epidemic situation. The passive surveillance of 1000 horses by specialized veterinarian clinics appeared to be the most cost-effective system in the current European context, and estimated median dates of first detection appeared consistent with recent field observations. Our results can be used to optimize surveillance designs for different epidemiological requirements.

  18. Directory of Design Support Methods

    DTIC Science & Technology

    2005-08-01

design’s resulting unique risk character in comparison with either another competing system and/or the new 100-point worst-case model (above). 5. It...programs, acquire required skills/competencies, find out where the money is being spent and where to allocate resources for greatest impact. ADVISOR Key...required to run one or multiple training programs, gain required skills and competencies as well as find out where the money is being spent - i.e

  19. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study.

    PubMed

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with overestimation of clinical prediction rules' performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. For each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule itself were adequately described in only 10.1%, 9.4%, and 7.0% of validation studies, respectively. Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved.

  20. Dietary assessment methods in epidemiological research: current state of the art and future prospects.

    PubMed

    Naska, Androniki; Lagiou, Areti; Lagiou, Pagona

    2017-01-01

Self-reported dietary intake is assessed by methods of real-time recording (food diaries and the duplicate portion method) and methods of recall (dietary histories, food frequency questionnaires, and 24-hour dietary recalls). Being less labor intensive, recall methods are more frequently employed in nutritional epidemiological investigations. However, sources of error, which include the participants' inability to fully and accurately recall their intakes as well as limitations inherent in the food composition databases applied to convert the reported food consumption to energy and nutrient intakes, may limit the validity of the generated information. The use of dietary biomarkers is often recommended to overcome such errors and better capture intra-individual variability in intake; nevertheless, it has its own challenges. To address measurement error associated with dietary questionnaires, large epidemiological investigations often integrate sub-studies for the validation and calibration of the questionnaires and/or administer a combination of different assessment methods (e.g. administration of different questionnaires and assessment of biomarker levels). Recent advances in the omics field could enrich the list of reliable nutrition biomarkers, whereas new approaches employing web-based and smartphone applications could reduce respondent burden and, possibly, reporting bias. Novel technologies are increasingly integrated with traditional methods, but some sources of error still remain. In the analyses, food and nutrient intakes always need to be adjusted for total daily energy intake to account for errors related to reporting.
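The energy adjustment mentioned in the final sentence is commonly done with Willett's residual method: regress nutrient intake on total energy intake and keep the residuals, re-centred at the mean nutrient intake. A minimal sketch with toy data (function name and values are illustrative, not from the article):

```python
def energy_adjust(nutrient, energy):
    """Willett residual method: ordinary least-squares regression of
    nutrient intake on total energy intake; the energy-adjusted intake is
    the residual plus the mean nutrient intake."""
    n = len(nutrient)
    mx = sum(energy) / n
    my = sum(nutrient) / n
    sxx = sum((x - mx) ** 2 for x in energy)
    sxy = sum((x - mx) * (y - my) for x, y in zip(energy, nutrient))
    slope = sxy / sxx
    # residual (y minus prediction), shifted back to the mean intake
    return [y - (my + slope * (x - mx)) + my for x, y in zip(energy, nutrient)]

# Toy data where nutrient tracks energy perfectly: adjustment removes
# all energy-driven variation, leaving everyone at the mean intake
adj = energy_adjust([50, 60, 80, 90], [1800, 2000, 2400, 2600])
print([round(v, 6) for v in adj])  # → [70.0, 70.0, 70.0, 70.0]
```

By construction the adjusted intakes are uncorrelated with total energy intake, which is the point of the adjustment.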

  1. A survey of variable selection methods in two Chinese epidemiology journals

    PubMed Central

    2010-01-01

    Background Although much has been written on developing better procedures for variable selection, there is little research on how it is practiced in actual studies. This review surveys the variable selection methods reported in two high-ranking Chinese epidemiology journals. Methods Articles published in 2004, 2006, and 2008 in the Chinese Journal of Epidemiology and the Chinese Journal of Preventive Medicine were reviewed. Five categories of methods were identified whereby variables were selected using: A - bivariate analyses; B - multivariable analysis; e.g. stepwise or individual significance testing of model coefficients; C - first bivariate analyses, followed by multivariable analysis; D - bivariate analyses or multivariable analysis; and E - other criteria like prior knowledge or personal judgment. Results Among the 287 articles that reported using variable selection methods, 6%, 26%, 30%, 21%, and 17% were in categories A through E, respectively. One hundred sixty-three studies selected variables using bivariate analyses, 80% (130/163) via multiple significance testing at the 5% alpha-level. Of the 219 multivariable analyses, 97 (44%) used stepwise procedures, 89 (41%) tested individual regression coefficients, but 33 (15%) did not mention how variables were selected. Sixty percent (58/97) of the stepwise routines also did not specify the algorithm and/or significance levels. Conclusions The variable selection methods reported in the two journals were limited in variety, and details were often missing. Many studies still relied on problematic techniques like stepwise procedures and/or multiple testing of bivariate associations at the 0.05 alpha-level. These deficiencies should be rectified to safeguard the scientific validity of articles published in Chinese epidemiology journals. PMID:20920252

  2. FEEDBACK DESIGN METHOD REVIEW AND COMPARISON.

    SciTech Connect

    ONILLON,E.

    1999-03-29

Different methods for feedback design are compared. These include classical proportional-integral-derivative (PID) control, state-variable-based methods such as pole placement, the Linear Quadratic Regulator (LQR), H-infinity, and μ-analysis. These methods are then applied to the design and analysis of the RHIC phase and radial loop, yielding a performance, stability and robustness comparison.
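As an illustration of the first of these methods, here is a minimal discrete-time PID controller driving a first-order plant. The gains and plant are hypothetical, chosen only to show the structure; this is not the RHIC loop design:

```python
class PID:
    """Minimal discrete-time PID controller: proportional, integral
    (rectangular integration), and derivative (backward difference) terms."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Regulate a toy first-order plant x' = -x + u toward setpoint 1.0
# using explicit Euler integration with step dt = 0.01
pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.01)
x = 0.0
for _ in range(2000):   # simulate 20 seconds
    x += (-x + pid.update(1.0, x)) * 0.01
print(round(x, 3))
```

The integral term is what drives the steady-state error to zero here; methods like LQR or H-infinity instead derive the controller from an explicit optimality or robustness criterion.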

  3. A method of determining where to target surveillance efforts in heterogeneous epidemiological systems

    PubMed Central

    van den Bosch, Frank; Gottwald, Timothy R.; Alonso Chavez, Vasthi

    2017-01-01

    The spread of pathogens into new environments poses a considerable threat to human, animal, and plant health, and by extension, human and animal wellbeing, ecosystem function, and agricultural productivity, worldwide. Early detection through effective surveillance is a key strategy to reduce the risk of their establishment. Whilst it is well established that statistical and economic considerations are of vital importance when planning surveillance efforts, it is also important to consider epidemiological characteristics of the pathogen in question—including heterogeneities within the epidemiological system itself. One of the most pronounced realisations of this heterogeneity is seen in the case of vector-borne pathogens, which spread between ‘hosts’ and ‘vectors’—with each group possessing distinct epidemiological characteristics. As a result, an important question when planning surveillance for emerging vector-borne pathogens is where to place sampling resources in order to detect the pathogen as early as possible. We answer this question by developing a statistical function which describes the probability distributions of the prevalences of infection at first detection in both hosts and vectors. We also show how this method can be adapted in order to maximise the probability of early detection of an emerging pathogen within imposed sample size and/or cost constraints, and demonstrate its application using two simple models of vector-borne citrus pathogens. Under the assumption of a linear cost function, we find that sampling costs are generally minimised when either hosts or vectors, but not both, are sampled. PMID:28846676
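The allocation question posed above can be illustrated with a simplified sketch that assumes independent sampling at fixed host and vector prevalences and a linear cost function. All names, costs, and prevalences below are hypothetical, and the model is far simpler than the paper's:

```python
def detection_probability(n_hosts, n_vectors, p_host, p_vector):
    """Probability that at least one sampled host or vector is positive,
    assuming independent samples at fixed prevalences."""
    return 1 - (1 - p_host) ** n_hosts * (1 - p_vector) ** n_vectors

def best_allocation(budget, cost_host, cost_vector, p_host, p_vector):
    """Enumerate integer allocations within a linear budget constraint and
    return (n_hosts, n_vectors, detection probability) maximising detection.
    Costs must be positive integers for this simple enumeration."""
    best = (0, 0, 0.0)
    for nh in range(budget // cost_host + 1):
        nv = (budget - nh * cost_host) // cost_vector
        p = detection_probability(nh, nv, p_host, p_vector)
        if p > best[2]:
            best = (nh, nv, p)
    return best

# Hypothetical costs/prevalences. Because log(1 - detection probability)
# is linear in the sample sizes, the optimum sits at a corner: sample
# whichever group yields more information per unit cost.
print(best_allocation(budget=100, cost_host=5, cost_vector=1,
                      p_host=0.05, p_vector=0.01))
```

This reproduces, in miniature, the paper's qualitative finding that under a linear cost function it is generally optimal to sample either hosts or vectors, but not both.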

  4. Methods for measuring utilization of mental health services in two epidemiologic studies

    PubMed Central

    NOVINS, DOUGLAS K.; BEALS, JANETTE; CROY, CALVIN; MANSON, SPERO M.

    2015-01-01

    Objectives of Study Psychiatric epidemiologic studies often include two or more sets of questions regarding service utilization, but the agreement across these different questions and the factors associated with their endorsement have not been examined. The objectives of this study were to describe the agreement of different sets of mental health service utilization questions that were included in the American Indian Service Utilization Psychiatric Epidemiology Risk and Protective Factors Project (AI-SUPERPFP), and compare the results to similar questions included in the baseline National Comorbidity Survey (NCS). Methods Responses to service utilization questions by 2878 AI-SUPERPFP and 5877 NCS participants were examined by calculating estimates of service use and agreement (κ) across the different sets of questions. Logistic regression models were developed to identify factors associated with endorsement of specific sets of questions. Results In both studies, estimates of mental health service utilization varied across the different sets of questions. Agreement across the different question sets was marginal to good (κ = 0.27–0.69). Characteristics of identified service users varied across the question sets. Limitations Neither survey included data to examine the validity of participant responses to service utilization questions. Recommendations for Further Research Question wording and placement appear to impact estimates of service utilization in psychiatric epidemiologic studies. Given the importance of these estimates for policy-making, further research into the validity of survey responses as well as impacts of question wording and context on rates of service utilization is warranted. PMID:18767205
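The agreement statistic reported above (κ) is Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch with toy yes/no responses (not the survey data):

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired sets of categorical responses:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(a) == len(b)
    n = len(a)
    labels = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n                 # observed
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)  # chance
    if pe == 1.0:
        return 1.0
    return (po - pe) / (1 - pe)

# Toy data: two differently worded questions about past-year service use
q1 = [1, 1, 0, 0, 1, 0, 0, 0, 1, 0]
q2 = [1, 0, 0, 0, 1, 0, 1, 0, 1, 0]
print(round(cohens_kappa(q1, q2), 2))  # → 0.58
```

Values in the abstract's reported range (κ = 0.27-0.69) indicate that differently framed utilization questions often identify noticeably different sets of service users.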

  5. A review for detecting gene-gene interactions using machine learning methods in genetic epidemiology.

    PubMed

    Koo, Ching Lee; Liew, Mei Jing; Mohamad, Mohd Saberi; Salleh, Abdul Hakim Mohamed

    2013-01-01

Recently, the greatest statistical and computational challenge in genetic epidemiology has been to identify and characterize the genes that interact with other genes and with environmental factors to influence complex multifactorial diseases. These gene-gene interactions are also denoted as epistasis, a phenomenon that cannot be addressed by traditional statistical methods due to the high dimensionality of the data and the occurrence of multiple polymorphisms. Hence, several machine learning methods, namely neural networks (NNs), support vector machines (SVMs), and random forests (RFs), have been applied to identify such susceptibility genes in common multifactorial diseases. This paper gives an overview of these machine learning methods, describing the methodology of each and its application in detecting gene-gene and gene-environment interactions. Lastly, it discusses each method and presents its strengths and weaknesses in detecting gene-gene interactions in complex human disease.

  6. Design for validation, based on formal methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1990-01-01

Validation of ultra-reliable systems decomposes into two subproblems: (1) quantification of the probability of system failure due to physical failure; and (2) establishing that design errors are not present. Methods of design, testing, and analysis of ultra-reliable software are discussed. It is concluded that a design-for-validation approach based on formal methods is needed for the digital flight control systems problem, and that formal methods will play a major role in the development of future high-reliability digital systems.

  7. Endodontic Epidemiology

    PubMed Central

    Shahravan, Arash; Haghdoost, Ali Akbar

    2014-01-01

    Epidemiology is the study of disease distribution and factors determining or affecting it. Likewise, endodontic epidemiology can be defined as the science of studying the distribution pattern and determinants of pulp and periapical diseases; specially apical periodontitis. Although different study designs have been used in endodontics, researchers must pay more attention to study designs with higher level of evidence such as randomized clinical trials. PMID:24688577

  8. Design Methods for Clinical Systems

    PubMed Central

    Blum, B.I.

    1986-01-01

    This paper presents a brief introduction to the techniques, methods and tools used to implement clinical systems. It begins with a taxonomy of software systems, describes the classic approach to development, provides some guidelines for the planning and management of software projects, and finishes with a guide to further reading. The conclusions are that there is no single right way to develop software, that most decisions are based upon judgment built from experience, and that there are tools that can automate some of the better understood tasks.

  9. Realist explanatory theory building method for social epidemiology: a protocol for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A

    2014-01-01

    A recent criticism of social epidemiological studies, and multi-level studies in particular has been a paucity of theory. We will present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach which is trans-disciplinary, encompassing both quantitative and qualitative traditions, and that assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising of an: 1) emergent phase, 2) construction phase, and 3) confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design is described. The Emergent Phase uses: interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to: categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: 1) defining stratified levels; 2) analytic resolution; 3) abductive reasoning; 4) comparative analysis (triangulation); 5) retroduction; 6) postulate and proposition development; 7) comparison and assessment of theories; and 8) conceptual frameworks and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology. The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for Theory

  10. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation.

    PubMed

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

    Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), urea breath test (UBT), and stool antigen test (SAT) have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT became the best methods to determine active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicated that no single test can be considered as the gold standard for the diagnosis of H. pylori infection and that one should consider the method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present a sensitivity of 74.2-90.8% and 83.3-86.9% and a specificity of 97.7-98.8% and 95.1-97.2%, respectively, when using IHC as a gold standard. The sensitivity of serology is quite high, but that of the urine test was lower compared with that of the other methods. Thus, indirect test validation is important although some commercial kits propose universal cut-off values.
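The sensitivity and specificity figures quoted above are computed from paired results of an index test against the gold standard (here IHC). A minimal sketch with toy True/False results (not the studies' data):

```python
def sens_spec(index_test, gold_standard):
    """Sensitivity and specificity of an index test against a gold standard,
    from paired boolean results (True = positive)."""
    tp = sum(t and g for t, g in zip(index_test, gold_standard))
    fn = sum((not t) and g for t, g in zip(index_test, gold_standard))
    tn = sum((not t) and (not g) for t, g in zip(index_test, gold_standard))
    fp = sum(t and (not g) for t, g in zip(index_test, gold_standard))
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 6 specimens, index test vs. gold standard
index = [True, True, False, False, True, False]
gold  = [True, True, True,  False, False, False]
print(sens_spec(index, gold))  # → (0.6666666666666666, 0.6666666666666666)
```

Validating an indirect test against such a gold standard is what allows population-specific cut-off values to be set, rather than relying on the universal cut-offs proposed by commercial kits.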

  12. Klebsiella pneumoniae infection on a rehabilitation unit: comparison of epidemiologic typing methods.

    PubMed

    Thompson, W; Romance, L; Bialkowska-Hobrazanska, H; Rennie, R P; Ashton, F; Nicolle, L E

    1993-04-01

    To identify factors associated with an increased occurrence of Klebsiella pneumoniae isolation in urine cultures and infected wounds on a rehabilitation unit and to compare typing methods for K pneumoniae isolates. Retrospective review of laboratory reports and patient records with case-control study. Analysis of K pneumoniae isolates using capsular serotyping, enzyme electrophoretic typing, ribotyping, and DNA typing. 48-bed rehabilitation unit in an 1,100-bed tertiary care teaching hospital in Winnipeg, Manitoba. In 1988, 20 (19%) of 106 patients admitted to the rehabilitation unit had K pneumoniae isolated from urine or wound, and in 1989, 31 (28%) of 111 patients had Klebsiella isolated. Review of ward practices revealed appropriate written policies but evidence of failure in execution leading to multiple opportunities for transmission among patients. Substantial environmental contamination was not identified, although a common urine graduate may have contributed to some transmission. Individuals with K pneumoniae isolated had a significantly longer duration of stay. Many of these were spinal cord-injured patients and were maintained on intermittent catheterization. One outbreak strain was identified in epidemiologic typing. Other strains were generally identified in individuals with non-nosocomial acquisition of infection. Comparison of epidemiologic typing methods suggests ribotyping may be the optimal method for typing K pneumoniae strains. K pneumoniae was acquired frequently by spinal cord-injured patients with extended admissions, re-emphasizing the importance of both patients and staff following appropriate infection control practices on rehabilitation wards. Ribotyping was the optimal method for typing K pneumoniae isolates.

  13. Study designs may influence results: the problems with questionnaire-based case-control studies on the epidemiology of glioma.

    PubMed

    Johansen, Christoffer; Schüz, Joachim; Andreasen, Anne-Marie Serena; Dalton, Susanne Oksbjerg

    2017-03-28

    Glioma is a rare brain tumour with a very poor prognosis and the search for modifiable factors is intense. We reviewed the literature concerning risk factors for glioma obtained in case-control designed epidemiological studies in order to discuss the influence of this methodology on the observed results. When reviewing the association between three exposures, medical radiation, exogenous hormone use and allergy, we critically appraised the evidence from both case-control and cohort studies. For medical radiation and hormone replacement therapy (HRT), questionnaire-based case-control studies appeared to show an inverse association, whereas nested case-control and cohort studies showed no association. For allergies, the inverse association was observed irrespective of study design. We recommend that the questionnaire-based case-control design be placed lower in the hierarchy of studies for establishing cause-and-effect for diseases such as glioma. We suggest that a state-of-the-art case-control study should, as a minimum, be accompanied by extensive validation of the exposure assessment methods and the representativeness of the study sample with regard to the exposures of interest. Otherwise, such studies cannot be regarded as 'hypothesis testing' but only 'hypothesis generating'. We consider that this holds true for all questionnaire-based case-control studies on cancer and other chronic diseases, although perhaps not to the same extent for each exposure-outcome combination.

  14. The Role of DNA Methylation in Cardiovascular Risk and Disease: Methodological Aspects, Study Design, and Data Analysis for Epidemiological Studies

    PubMed Central

    Baccarelli, Andrea A.

    2015-01-01

    Epidemiological studies have demonstrated that genetic, environmental, behavioral, and clinical factors contribute to cardiovascular disease (CVD) development. How these risk factors interact at the cellular level to cause CVD is not well known. Epigenetic epidemiology enables researchers to explore critical links between genomic coding, modifiable exposures, and manifestation of disease phenotype. One epigenetic link, DNA methylation, is potentially an important mechanism underlying these associations. In the past decade, there has been a significant increase in the number of epidemiological studies investigating cardiovascular risk factors and outcomes in relation to DNA methylation, but many gaps remain in our understanding of the underlying etiology and biological implications. In this review we provide a brief overview of the biology and mechanisms of DNA methylation and its role in CVD. In addition, we summarize the current evidence base in epigenetic epidemiology studies relevant to cardiovascular health and disease, and discuss the limitations, challenges, and future directions of the field. Finally, we provide guidelines for well-designed epigenetic epidemiology studies, with particular focus on methodological aspects, study design, and analytical challenges. PMID:26837743

  15. Guided Design as a Women's Studies Method.

    ERIC Educational Resources Information Center

    Trobian, Helen R.

    Guided Design has great potential as a teaching/learning method for Women's Studies courses. The Guided Design process is organized around the learner's efforts to come up with solutions to a series of carefully designed, open-ended problems. The problems are selected by the teacher according to the skills and subject matter to be learned. The…

  16. A practical method for use in epidemiological studies on enamel hypomineralisation.

    PubMed

    Ghanim, A; Elfrink, M; Weerheijm, K; Mariño, R; Manton, D

    2015-06-01

    With the development of the European Academy of Paediatric Dentistry (EAPD) judgment criteria, there has been increasing interest worldwide in investigation of the prevalence of demarcated opacities in tooth enamel substance, known as molar-incisor hypomineralisation (MIH). However, the lack of a standardised system for recording MIH data in epidemiological surveys has contributed greatly to the wide variations in the reported prevalence between studies. The present publication describes the rationale, development, and content of a scoring method for MIH diagnosis in epidemiological studies as well as clinic- and hospital-based studies. The proposed grading method allows separate classification of demarcated hypomineralisation lesions and other enamel defects identical to MIH. It yields an informative description of the severity of MIH-affected teeth in terms of the stage of visible enamel destruction and the area of tooth surface affected (i.e. lesion clinical status and extent, respectively). In order to preserve the maximum amount of information from a clinical examination consistent with the need to permit direct comparisons between prevalence studies, two forms of the charting are proposed: a short form for simple screening surveys and a long form suited to prospective, longitudinal observational research in which aetiological factors in demarcated lesions are investigated in tandem with lesion distribution. Validation of the grading method is required, and its reliability and usefulness need to be tested in different age groups and different populations.

  17. Standardization of a molecular method for epidemiologic identification of Leishmania strains.

    PubMed

    Rocha, R F; Menezes, E V; Xavier, A R E O; Royo, V A; Oliveira, D A; Júnior, A F M; Dias, E S; Lima, A C V M R; Michalsky, E M

    2016-10-06

    Molecular studies of the evolutionary relationships among Leishmania species suggest the presence of high genetic variation within this genus, which has a direct effect on public health in many countries. The coexistence of species in a particular region can result in different leishmaniasis clinical forms and treatment responses. We aimed to standardize the kinetoplast DNA (kDNA) enterobacterial repetitive intergenic consensus (ERIC) sequence polymerase chain reaction (PCR) method for molecular epidemiological identification of Leishmania strains, and estimate existing inter-strain genomic differences and kDNA signatures using this technique. ERIC-PCR of genomic DNA revealed genetic polymorphisms between species, although some strains shared many DNA fragments. Leishmania guyanensis, L. amazonensis, and L. braziliensis clustered together in a dendrogram with similarities ranging from 42.0 to 61.0%, whereas L. chagasi grouped with these three species with a similarity of 28.0%. After amplification of kDNA, 780-bp bands were extracted from an agarose gel and purified for analysis of their genetic signatures. kDNA ERIC-PCR electrophoretic patterns consisted of 100- to 600-bp fragments. Using these profiles, L. braziliensis and L. guyanensis grouped with a similarity of 26.0%, and L. amazonensis and L. chagasi clustered based on a similarity of 100%. The electrophoretic profiles and dendrograms showed that, for epidemiological identification by ERIC-PCR, genomic DNA had greater discriminatory power than kDNA did. More strains need to be analyzed to validate the kDNA ERIC-PCR method. The genomes of these strains should be sequenced for better epidemiological identification of Leishmania species.

  18. Methods for combinatorial and parallel library design.

    PubMed

    Schnur, Dora M; Beno, Brett R; Tebben, Andrew J; Cavallaro, Cullen

    2011-01-01

    Diversity has historically played a critical role in the design of combinatorial libraries, screening sets and corporate collections for lead discovery. Large library design dominated the field in the 1990s, with methods ranging anywhere from purely arbitrary through property-based reagent selection to product-based approaches. In recent years, however, there has been a downward trend in library size. This was due to increased information about the desirable targets gleaned from the genomics revolution and to the ever-growing availability of target protein structures from crystallography and homology modeling. Creation of libraries directed toward families of receptors such as GPCRs, kinases, nuclear hormone receptors, proteases, etc., replaced the generation of libraries based primarily on diversity, while single-target focused library design has remained an important objective. Concurrently, computing grids and CPU clusters have facilitated the development of structure-based tools that screen hundreds of thousands of molecules. Smaller, "smarter" combinatorial and focused parallel libraries replaced those early unfocused large libraries in the twenty-first century drug design paradigm. While diversity still plays a role in lead discovery, the focus of current library design methods has shifted to receptor-based methods, scaffold hopping/bio-isostere searching, and a much needed emphasis on synthetic feasibility. Methods such as "privileged substructures based design" and pharmacophore-based design remain important for parallel and small combinatorial library design. This chapter discusses some of the possible design methods and presents examples where they are available.

  19. Product Development by Design Navigation Method

    NASA Astrophysics Data System (ADS)

    Nakazawa, Hiromu

    Manufacturers must be able to develop new products within a specified time period. This paper discusses a method for developing high performance products from a limited number of experiments, utilizing the concept of “function error”. Unlike conventional methods where the sequence of design, prototyping and experiment must be repeated several times, the proposed method can determine optimal design values directly from experimental data obtained from the first prototype. The theoretical basis of the method is presented, then its effectiveness proven by applying it to design an extrusion machine and a CNC lathe.

  20. Mixed Method Designs in Implementation Research

    PubMed Central

    Aarons, Gregory A.; Horwitz, Sarah; Chamberlain, Patricia; Hurlburt, Michael; Landsverk, John

    2010-01-01

    This paper describes the application of mixed method designs in implementation research in 22 mental health services research studies published in peer-reviewed journals over the last 5 years. Our analyses revealed 7 different structural arrangements of qualitative and quantitative methods, 5 different functions of mixed methods, and 3 different ways of linking quantitative and qualitative data together. Complexity of design was associated with number of aims or objectives, study context, and phase of implementation examined. The findings provide suggestions for the use of mixed method designs in implementation research. PMID:20967495

  1. Mixed method designs in implementation research.

    PubMed

    Palinkas, Lawrence A; Aarons, Gregory A; Horwitz, Sarah; Chamberlain, Patricia; Hurlburt, Michael; Landsverk, John

    2011-01-01

    This paper describes the application of mixed method designs in implementation research in 22 mental health services research studies published in peer-reviewed journals over the last 5 years. Our analyses revealed 7 different structural arrangements of qualitative and quantitative methods, 5 different functions of mixed methods, and 3 different ways of linking quantitative and qualitative data together. Complexity of design was associated with number of aims or objectives, study context, and phase of implementation examined. The findings provide suggestions for the use of mixed method designs in implementation research.

  2. Applying epidemiological principles to ergonomics: a checklist for incorporating sound design and interpretation of studies.

    PubMed

    Heacock, H; Koehoorn, M; Tan, J

    1997-06-01

    The primary purpose of this paper is to provide a checklist of scientific requirements necessary for the design of sound ergonomics studies. Ergonomics researchers will be able to use the checklist when designing a study and preparing it for publication. Practitioners can use the checklist to critically appraise study results, thereby having greater confidence when applying ergonomic recommendations to the workplace. A secondary purpose of the paper is to pilot the checklist on a sample of papers in the ergonomics literature and to assess its reliability. While there are checklists to assess the epidemiological rigour of studies, none have been adapted to address methodological issues in ergonomics. Two epidemiologists independently searched five ergonomics journals (Applied Ergonomics, Ergonomics, Human Factors, International Journal of Human-Computer Interaction and Journal of Human Ergology) for research studies on VDT use and visual function published between 1990 and 1995. Twenty-one articles were reviewed. Each paper was scored according to the checklist. Overall, the reviewers found that the articles did not consistently fulfill some of the checklist criteria. An insufficient sample size was the most serious omission. Inter-rater reliability of the checklist was excellent for 11 of 14 items on the checklist (Kappa > 0.74), good for two items (Kappa between 0.40 and 0.74) and poor for one item. As ergonomics is gaining acceptance as an integral part of occupational health and safety, individuals in this field must be cognizant of the fact that study results are being applied directly to workplace procedures and design. It is incumbent upon ergonomists to base their work on a solid research foundation. The checklist can be used as a tool to improve study designs and so ultimately has implications for improving the fit between the worker and the work environment.
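
The inter-rater reliability reported above is Cohen's kappa, which discounts the agreement two reviewers would reach by chance. A small self-contained sketch of the statistic, using made-up checklist ratings for illustration:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical judgements."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # expected agreement if the two raters judged independently
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical "criterion fulfilled?" judgements on eight papers
a = ["yes", "yes", "no", "no", "yes", "no", "yes", "no"]
b = ["yes", "yes", "no", "yes", "yes", "no", "yes", "no"]
print(f"kappa = {cohens_kappa(a, b):.2f}")   # 0.75, "excellent" by the cut-offs above
```

The cut-offs in the abstract (Kappa > 0.74 excellent, 0.40-0.74 good) are conventional interpretation bands, not properties of the statistic itself.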

  3. LCR method: road map for passive design

    SciTech Connect

    Morris, W.S.

    1983-05-01

    Choosing a design tool to estimate the performance of passive solar houses is discussed. One technique is the Load Collector Ratio (LCR) method. This method allows the solar designer to get quick performance estimates plus a feeling for the results that would be obtained by taking a different approach. How to use the LCR method and the results to be obtained from using it are discussed.

  4. Quantitative methods in the tuberculosis epidemiology and in the evaluation of BCG vaccination programs.

    PubMed

    Lugosi, L

    1986-01-01

    Controversies concerning the protective efficacy of BCG vaccination result mostly from the fact that quantitative methods have not been used in the evaluation of BCG programs. Therefore, to eliminate the current controversy, an unconditional requirement is to apply valid biostatistical models to analyse the results of the BCG programs. In order to achieve objective statistical inferences and epidemiological interpretations, the following conditions should be fulfilled: data for evaluation have to be taken from epidemiological trials free of sampling error; since the morbidity rates are not normally distributed, an appropriate normalizing transformation is needed for point and confidence interval estimation; only unbiased point estimates (dependent variables) may be used in valid models for hypothesis tests; and in cases of a rejected null hypothesis, the ranked estimates of the compared groups must be evaluated in a multiple comparison model in order to diminish the Type I error in the decision. The following quantitative methods are presented to evaluate the effectiveness of BCG vaccination in Hungary: linear regression analysis, stepwise regression analysis and log-linear analysis.
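
The normalizing transformation the abstract calls for is commonly a log transform of the morbidity rate: for a Poisson count, the standard error of log(rate) is approximately 1/sqrt(cases), so a symmetric interval on the log scale back-transforms to a sensible interval for the rate. A sketch with hypothetical counts, not the Hungarian program data:

```python
import math

def rate_ci(cases, person_years, z=1.96):
    """Approximate 95% CI for an incidence rate via a log transformation.

    SE of log(rate) ~ 1/sqrt(cases) when cases follow a Poisson distribution.
    """
    rate = cases / person_years
    se_log = 1 / math.sqrt(cases)
    lo = rate * math.exp(-z * se_log)
    hi = rate * math.exp(z * se_log)
    return rate, lo, hi

# Hypothetical: 40 TB cases observed over 200,000 person-years
rate, lo, hi = rate_ci(40, 200_000)
print(f"rate = {rate*1e5:.1f}/100,000 (95% CI {lo*1e5:.1f}-{hi*1e5:.1f})")
```

The back-transformed interval is asymmetric around the point estimate, which is exactly why the transformation is preferred to a normal approximation on the raw rate when counts are small.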

  5. Applications of a transonic wing design method

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Smith, Leigh A.

    1989-01-01

    A method for designing wings and airfoils at transonic speeds using a predictor/corrector approach was developed. The procedure iterates between an aerodynamic code, which predicts the flow about a given geometry, and the design module, which compares the calculated and target pressure distributions and modifies the geometry using an algorithm that relates differences in pressure to a change in surface curvature. The modular nature of the design method makes it relatively simple to couple it to any analysis method. The iterative approach allows the design process and aerodynamic analysis to converge in parallel, significantly reducing the time required to reach a final design. Viscous and static aeroelastic effects can also be accounted for during the design or as a post-design correction. Results from several pilot design codes indicated that the method accurately reproduced pressure distributions as well as the coordinates of a given airfoil or wing by modifying an initial contour. The codes were applied to supercritical as well as conventional airfoils, forward- and aft-swept transport wings, and moderate-to-highly swept fighter wings. The design method was found to be robust and efficient, even for cases having fairly strong shocks.
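
The predictor/corrector loop described above can be caricatured in a few lines: an analysis step predicts pressures for the current geometry, and a corrector step nudges the geometry in proportion to the residual between calculated and target pressures. The "analysis" below is a toy stand-in for the aerodynamic code, and the gain and update sign depend on the assumed pressure-geometry relation, so this illustrates only the structure of the iteration, not the actual design module:

```python
def design_by_iteration(analyze, cp_target, geom, gain=0.3, n_iter=50):
    """Predictor/corrector sketch: predict pressures, correct geometry."""
    for _ in range(n_iter):
        cp = analyze(geom)                                   # predictor step
        residual = [t - c for t, c in zip(cp_target, cp)]
        geom = [y - gain * r for y, r in zip(geom, residual)]  # corrector step
    return geom

# Toy pressure model (assumption, not aerodynamics): cp falls as the surface thickens
analyze = lambda geom: [-2.0 * y for y in geom]
target = [-0.4, -0.6, -0.5]
final = design_by_iteration(analyze, target, geom=[0.0, 0.0, 0.0])
print([round(y, 3) for y in final])
```

With this linear toy model the loop converges to the geometry whose predicted pressures match the targets; the paper's point is that the real analysis and design modules converge in parallel the same way, avoiding repeated full design-prototype-experiment cycles.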

  6. The Chennai Urban Rural Epidemiology Study (CURES)--study design and methodology (urban component) (CURES-I).

    PubMed

    Deepa, M; Pradeepa, R; Rema, M; Mohan, Anjana; Deepa, R; Shanthirani, S; Mohan, V

    2003-09-01

    The report of the World Health Organization (WHO) shows that India tops the world with the largest number of diabetic subjects. This increase is attributed to the rapid epidemiological transition accompanying urbanization in India. There is very little data regarding the influence of affluence on the prevalence of diabetes and its complications, particularly retinopathy, in the Indian population. Furthermore, there are very few studies comparing the urban and rural prevalence of diabetes and its complications. The Chennai Urban Rural Epidemiology Study (CURES) is designed to answer these questions. CURES is initially planned as a cross-sectional study that will later evolve into a longitudinal study. Subjects for the urban component of CURES have been recruited from within the corporation limits of Chennai City. Chennai (formerly Madras), the largest city in Southern India and the fourth largest in India, has been divided into 10 zones and 155 wards; 46 wards were selected by a systematic random sampling method to represent the whole of Chennai. Twenty thousand and one individuals were recruited for the study, this number being derived from a sample size calculation. The study has three phases. Phase one is a door-to-door survey that includes a questionnaire, anthropometric measurements, and fasting capillary blood glucose and blood pressure measurements. Phase two focuses on the prevalence of diabetic complications, particularly retinopathy, using standardized techniques such as retinal photography. Diabetic subjects identified in phase one and age- and sex-matched non-diabetic subjects will participate in these studies. Phase three will include more detailed clinical, biochemical and vascular studies on a sub-sample of the study subjects selected on a stratified basis from phase one. CURES is perhaps one of the largest systematic population-based studies done in India in the field of diabetes and its complications like retinopathy, nephropathy
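
The ward selection described above is a systematic random sample: a random start followed by a fixed sampling interval across the ordered frame. A sketch of the mechanics (the ward numbering here is hypothetical, not the actual Chennai frame):

```python
import random

def systematic_sample(units, k, seed=0):
    """Systematic random sample of k units: random start, fixed interval."""
    interval = len(units) / k                      # sampling interval
    start = random.Random(seed).uniform(0, interval)  # random start in [0, interval)
    return [units[int(start + i * interval)] for i in range(k)]

wards = list(range(1, 156))            # 155 wards, numbered 1..155
selected = systematic_sample(wards, 46)
print(len(selected), selected[:5])
```

Because the start is random but the interval is fixed, every ward has the same inclusion probability while the sample spreads evenly across the city, which is the property the study relies on for representativeness.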

  7. Epidemiological study of suicide by physical methods between 1993 and 2013 in Ilam province, Iran.

    PubMed

    Azizpour, Yosra; Sayehmiri, Kourosh; Asadollahi, Khairollah; Kaikhavani, Satar; Bagheri, Maryam

    2017-08-23

    Suicide by aggressive physical methods such as firearms, hanging, and jumping is well known; however, different factors may influence a person while selecting a particular method. The aim of this study was to investigate the epidemiological factors involved in the selection and use of different physical methods for suicide over a long-term period in Ilam province, Iran. The present study was conducted retrospectively between 1993 and 2013 using recorded data from a comprehensive system for registration of suicide attempts in Ilam University of Medical Sciences. The epidemiological characteristics included person, time and place variables, and the outcomes of the suicide attempts. The chi-square test and univariate and multivariate logistic regression models were used for data analysis. In total, 1516 suicide attempts were evaluated (annual incidence rate: 19/100,000 individuals). The most commonly used suicide method in females (88.4%) and males (38.9%) was self-immolation. Furthermore, the highest annual incidence rates among males and females occurred in the age group of 15-24 years (24.6 and 47.8/100,000 individuals, respectively). The odds of death by suicide in the age group of 55-64 years were 2.93 times those in the age group of 10-14 years (OR = 2.93; 95% CI = 0.64-13.54, P = 0.168). This study revealed that self-immolation was the most frequently selected physical method of suicide, had the highest incidence rate, and left survivors with severe physical and mental complications. In order to reduce the use of physical methods, especially self-immolation, life skills training becomes more important than ever.
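
The odds ratio and confidence interval quoted above are standard outputs of a 2x2 analysis: the OR is the cross-product ratio of the table, and the Wald interval is symmetric on the log-odds scale. A sketch with hypothetical counts, not the Ilam registry data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio from a 2x2 table with a Wald 95% CI.

    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = or_ * math.exp(-z * se_log)
    hi = or_ * math.exp(z * se_log)
    return or_, lo, hi

# Hypothetical deaths vs. survivals in two age groups (illustration only)
or_, lo, hi = odds_ratio_ci(a=8, b=40, c=4, d=60)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

Note how small cell counts inflate the standard error of log(OR); that is why the study's interval (0.64-13.54) is wide and crosses 1 despite a point estimate near 3.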

  8. Design of a detection survey for Ostreid herpesvirus-1 using hydrodynamic dispersion models to determine epidemiological units.

    PubMed

    Pande, Anjali; Acosta, Hernando; Brangenberg, Naya Alexis; Keeling, Suzanne Elizabeth

    2015-04-01

    Using Ostreid herpesvirus-1 (OsHV-1) as a case study, this paper considers a survey design methodology for an aquatic animal pathogen that incorporates the concept of biologically independent epidemiological units. Hydrodynamically-modelled epidemiological units are used to divide marine areas into sensible sampling units for detection surveys of waterborne diseases. In the aquatic environment it is difficult to manage disease at the animal level, hence management practices are often aimed at a group of animals sharing a similar risk. Using epidemiological units is a way to define these groups, based on a similar level of probability of exposure based on the modelled potential spread of a viral particle via coastal currents, that can help inform management decisions.

  9. Measuring socio-economic position for epidemiological studies in low- and middle-income countries: a methods of measurement in epidemiology paper

    PubMed Central

    Howe, Laura D; Galobardes, Bruna; Matijasevich, Alicia; Gordon, David; Johnston, Deborah; Onwujekwe, Obinna; Patel, Rita; Webb, Elizabeth A; Lawlor, Debbie A; Hargreaves, James R

    2012-01-01

    Much has been written about the measurement of socio-economic position (SEP) in high-income countries (HIC). Less has been written for an epidemiology, health systems and public health audience about the measurement of SEP in low- and middle-income countries (LMIC). The social stratification processes in many LMIC—and therefore the appropriate measurement tools—differ considerably from those in HIC. Many measures of SEP have been utilized in epidemiological studies; the aspects of SEP captured by these measures and the pathways through which they may affect health are likely to be slightly different but overlapping. No single measure of SEP will be ideal for all studies and contexts; the strengths and limitations of a given indicator are likely to vary according to the specific research question. Understanding the general properties of different indicators, however, is essential for all those involved in the design or interpretation of epidemiological studies. In this article, we describe the measures of SEP used in LMIC. We concentrate on measures of individual or household-level SEP rather than area-based or ecological measures such as gross domestic product. We describe each indicator in terms of its theoretical basis, interpretation, measurement, strengths and limitations. We also provide brief comparisons between LMIC and HIC for each measure. PMID:22438428

  10. Impeller blade design method for centrifugal compressors

    NASA Technical Reports Server (NTRS)

    Jansen, W.; Kirschner, A. M.

    1974-01-01

    The design of a centrifugal impeller with blades that are aerodynamically efficient, easy to manufacture, and mechanically sound is discussed. The blade design method described here satisfies the first two criteria and with a judicious choice of certain variables will also satisfy stress considerations. The blade shape is generated by specifying surface velocity distributions and consists of straight-line elements that connect points at hub and shroud. The method may be used to design radially elemented and backward-swept blades. The background, a brief account of the theory, and a sample design are described.

  11. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.

  12. Need for Improved Methods to Collect and Present Spatial Epidemiologic Data for Vectorborne Diseases

    PubMed Central

    Eisen, Rebecca J.

    2007-01-01

    Improved methods for collection and presentation of spatial epidemiologic data are needed for vectorborne diseases in the United States. Lack of reliable data for probable pathogen exposure site has emerged as a major obstacle to the development of predictive spatial risk models. Although plague case investigations can serve as a model for how to ideally generate needed information, this comprehensive approach is cost-prohibitive for more common and less severe diseases. New methods are urgently needed to determine probable pathogen exposure sites that will yield reliable results while taking into account economic and time constraints of the public health system and attending physicians. Recent data demonstrate the need for a change from use of the county spatial unit for presentation of incidence of vectorborne diseases to more precise ZIP code or census tract scales. Such fine-scale spatial risk patterns can be communicated to the public and medical community through Web-mapping approaches. PMID:18258029

  13. Comparison of Methods to Account for Implausible Reporting of Energy Intake in Epidemiologic Studies

    PubMed Central

    Rhee, Jinnie J.; Sampson, Laura; Cho, Eunyoung; Hughes, Michael D.; Hu, Frank B.; Willett, Walter C.

    2015-01-01

    In a recent article in the American Journal of Epidemiology by Mendez et al. (Am J Epidemiol. 2011;173(4):448–458), the use of alternative approaches to the exclusion of implausible energy intakes led to significantly different cross-sectional associations between diet and body mass index (BMI), whereas the use of a simpler recommended criterion (<500 and >3,500 kcal/day) yielded no meaningful change. However, these findings might have been due to exclusions made based on weight, a primary determinant of BMI. Using data from 52,110 women in the Nurses' Health Study (1990), we reproduced the cross-sectional findings of Mendez et al. and compared the results from the recommended method with those from 2 weight-dependent alternative methods (the Goldberg method and predicted total energy expenditure method). The same 3 exclusion criteria were then used to examine dietary variables prospectively in relation to change in BMI, which is not a direct function of attained weight. We found similar associations using the 3 methods. In a separate cross-sectional analysis using biomarkers of dietary factors, we found similar correlations for intakes of fatty acids (n = 439) and carotenoids and retinol (n = 1,293) using the 3 methods for exclusions. These results do not support the general conclusion that use of exclusion criteria based on the alternative methods might confer an advantage over the recommended exclusion method. PMID:25656533

  14. Comparison of methods to account for implausible reporting of energy intake in epidemiologic studies.

    PubMed

    Rhee, Jinnie J; Sampson, Laura; Cho, Eunyoung; Hughes, Michael D; Hu, Frank B; Willett, Walter C

    2015-02-15

    In a recent article in the American Journal of Epidemiology by Mendez et al. (Am J Epidemiol. 2011;173(4):448-458), the use of alternative approaches to the exclusion of implausible energy intakes led to significantly different cross-sectional associations between diet and body mass index (BMI), whereas the use of a simpler recommended criterion (<500 and >3,500 kcal/day) yielded no meaningful change. However, these findings might have been due to exclusions made based on weight, a primary determinant of BMI. Using data from 52,110 women in the Nurses' Health Study (1990), we reproduced the cross-sectional findings of Mendez et al. and compared the results from the recommended method with those from 2 weight-dependent alternative methods (the Goldberg method and predicted total energy expenditure method). The same 3 exclusion criteria were then used to examine dietary variables prospectively in relation to change in BMI, which is not a direct function of attained weight. We found similar associations using the 3 methods. In a separate cross-sectional analysis using biomarkers of dietary factors, we found similar correlations for intakes of fatty acids (n = 439) and carotenoids and retinol (n = 1,293) using the 3 methods for exclusions. These results do not support the general conclusion that use of exclusion criteria based on the alternative methods might confer an advantage over the recommended exclusion method.
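
The recommended criterion (<500 and >3,500 kcal/day) is a simple range filter on reported intake, independent of body weight. A minimal sketch with invented records (field names are hypothetical):

```python
def plausible_energy(records, low=500, high=3500):
    """Keep only records whose reported intake lies within [low, high] kcal/day."""
    return [r for r in records if low <= r["kcal"] <= high]

# Invented reported intakes (illustration only)
women = [
    {"id": 1, "kcal": 1850},
    {"id": 2, "kcal": 420},    # below 500 kcal/day: excluded
    {"id": 3, "kcal": 3900},   # above 3,500 kcal/day: excluded
    {"id": 4, "kcal": 2600},
]
kept = plausible_energy(women)
print([r["id"] for r in kept])   # → [1, 4]
```

The weight-dependent alternatives (Goldberg, predicted total energy expenditure) instead compare reported intake with an expected requirement for each person, which is what makes them functions of weight and, per the abstract, potentially problematic when the outcome is BMI.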

  15. Molecular and epidemiological characterization of canine Pseudomonas otitis using a prospective case-control study design.

    PubMed

    Morris, Daniel O; Davis, Meghan F; Palmeiro, Brian S; O'Shea, Kathleen; Rankin, Shelley C

    2017-02-01

    Pseudomonas aeruginosa is an opportunistic pathogen of the canine ear canal and occupies aquatic habitats in the environment. Nosocomial and zoonotic transmission of P. aeruginosa has been documented, including clonal outbreaks. The primary objective of this study was to assess various environmental exposures as potential risk factors for canine Pseudomonas otitis. It was hypothesized that isolates derived from infected ears would be clonal to isolates derived from household water sources and the mouths of human and animal companions of the study subjects. Seventy-seven privately owned dogs with otitis were enrolled, along with their human and animal household companions, in a case-control design. Data on potential risk factors for Pseudomonas otitis were collected. Oral cavities of all study subjects, their human and animal companions, and household water sources were sampled. Pulsed-field gel electrophoresis was used to estimate clonal relatedness of P. aeruginosa isolates. In a multivariate model, visiting a dog park was associated with 77% increased odds of case status (P = 0.048). Strains clonal to the infection isolates were obtained from subjects' mouths (n = 18), companion pets' mouths (n = 5), pet owners' mouths (n = 2), water bowls (n = 7) and water taps (n = 2). Clonally related P. aeruginosa isolates were obtained from dogs that had no clear epidemiological link. Genetic homology between otic and environmental isolates is consistent with a waterborne source for some dogs, and cross-contamination with other human and animal members within some households. © 2016 ESVD and ACVD.

  16. Importance of Survey Design for Studying the Epidemiology of Emerging Tobacco Product Use Among Youth.

    PubMed

    Delnevo, Cristine D; Gundersen, Daniel A; Manderski, Michelle T B; Giovenco, Daniel P; Giovino, Gary A

    2017-03-22

    Accurate surveillance is critical for monitoring the epidemiology of emerging tobacco products in the United States, and survey science suggests that survey response format can impact prevalence estimates. We utilized data from the 2014 New Jersey Youth Tobacco Survey (n = 3,909) to compare estimates of the prevalence of 4 behaviors (ever hookah use, current hookah use, ever e-cigarette use, and current e-cigarette use) among New Jersey high school students, as assessed using "check-all-that-apply" questions, with estimates measured by means of "forced-choice" questions. Measurement discrepancies were apparent for all 4 outcomes, with the forced-choice questions yielding prevalence estimates approximately twice those of the check-all-that-apply questions, and agreement was fair to moderate. The sensitivity of the check-all-that-apply questions, treating the forced-choice format as the "gold standard," ranged from 38.1% (current hookah use) to 58.3% (ever e-cigarette use), indicating substantial false-negative rates. These findings highlight the impact of question response format on prevalence estimates of emerging tobacco products among youth and suggest that estimates generated by means of check-all-that-apply questions may be biased downward. Alternative survey designs should be considered to avoid check-all-that-apply response formats, and researchers should use caution when interpreting tobacco use data obtained from check-all-that-apply formats.
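The sensitivity figures quoted in this abstract are simple 2x2 calculations treating the forced-choice item as the gold standard. The sketch below uses invented counts, chosen only to land near the reported 38.1% for current hookah use.

```python
# Sensitivity of the check-all-that-apply item against the forced-choice
# "gold standard": TP / (TP + FN). Counts are invented for illustration.

def sensitivity(true_pos, false_neg):
    return true_pos / (true_pos + false_neg)

tp = 80    # forced-choice "yes", also flagged on check-all-that-apply
fn = 130   # forced-choice "yes", missed by check-all-that-apply
print(round(100 * sensitivity(tp, fn), 1))  # 38.1
```

A sensitivity this low implies the check-all-that-apply prevalence estimate is biased well below the forced-choice estimate, which is the paper's central point.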

  17. Mixed Methods Research Designs in Counseling Psychology

    ERIC Educational Resources Information Center

    Hanson, William E.; Creswell, John W.; Clark, Vicki L. Plano; Petska, Kelly S.; Creswell, David J.

    2005-01-01

    With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods…

  19. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  20. Iterative methods for design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Belegundu, A. D.; Yoon, B. G.

    1989-01-01

    A numerical method is presented for design sensitivity analysis in which the structure produced by a small perturbation in the design variable is reanalyzed iteratively; a forward-difference scheme is then employed to obtain the approximate sensitivity. Algorithms are developed for displacement and stress sensitivities, as well as for eigenvalue and eigenvector sensitivities, and the iterative schemes are modified so that the coefficient matrices are constant and therefore decomposed only once.
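The scheme in this abstract (iterative reanalysis of a slightly perturbed structure, then forward differencing) can be sketched on a toy 2-DOF system. The spring model, iteration scheme, and counts below are illustrative stand-ins, not the paper's algorithms.

```python
# Toy version of the abstract's idea: solve K(x) u = f once, re-solve the
# perturbed system iteratively (seeded with the unperturbed solution),
# then form a forward-difference sensitivity du/dx.

def stiffness(x):
    """Stiffness of a 2-DOF spring chain, scaled by design variable x."""
    return [[2.0 * x, -x], [-x, 2.0 * x]]

def solve_2x2(K, f):
    """Direct 2x2 solve (the 'decompose once' baseline analysis)."""
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [(K[1][1] * f[0] - K[0][1] * f[1]) / det,
            (K[0][0] * f[1] - K[1][0] * f[0]) / det]

f = [1.0, 0.0]
x0, dx = 1.0, 1e-6
u0 = solve_2x2(stiffness(x0), f)

# Gauss-Seidel reanalysis of the perturbed system, starting from u0;
# convergence is fast because the starting guess is already close.
K1 = stiffness(x0 + dx)
u = list(u0)
for _ in range(50):
    u[0] = (f[0] - K1[0][1] * u[1]) / K1[0][0]
    u[1] = (f[1] - K1[1][0] * u[0]) / K1[1][1]

du_dx = [(u[i] - u0[i]) / dx for i in range(2)]
print([round(d, 3) for d in du_dx])  # analytic answer is [-2/3, -1/3]
```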

  1. Using genetic epidemiology to study Rett syndrome: the design of a case-control study.

    PubMed

    Leonard, H; Fyfe, S; Dye, D; Leonard, S

    2000-01-01

    Rett syndrome is a neurological disorder that is seen almost exclusively in females. Although generally considered to have a genetic basis, the underlying mechanism remains obscure. One favoured hypothesis is that the syndrome is an X-linked dominant disorder, lethal or non-expressed in males. Genealogical research has also suggested that the mode of transmission in Rett syndrome may involve a premutation which over several generations is converted to a full mutation. Geographical clustering has been reported, and it has also been proposed that Rett syndrome is a clinically variable condition and that other neurological disorders may be occurring more commonly in families with Rett syndrome. Other studies have found an apparent increase in intellectual disability and seizures in the extended families of girls with Rett syndrome. The science of genetic epidemiology can be used to identify familial aggregation, which is the clustering of a disorder within a family. We have used a case-control study design to investigate both fetal wastage and familial aggregation of other disorders in families of girls with Rett syndrome. The Australian Rett Syndrome Database provided the source of cases, and control probands were girls of a similar age with normal development. This paper describes the methodology for a case-control study of this rare condition using pedigree data and discusses issues in the collection and evaluation of such data. The use of a control population is an important feature. Both the strengths and the shortcomings of our design are identified, and recommendations are made for future research.

  2. A Method for Designing Conforming Folding Propellers

    NASA Technical Reports Server (NTRS)

    Litherland, Brandon L.; Patterson, Michael D.; Derlaga, Joseph M.; Borer, Nicholas K.

    2017-01-01

    As the aviation vehicle design environment expands due to the influx of new technologies, new methods of conceptual design and modeling are required in order to meet the customer's needs. In the case of distributed electric propulsion (DEP), the use of high-lift propellers upstream of the wing leading edge augments lift at low speeds, enabling smaller wings with sufficient takeoff and landing performance. During cruise, however, these devices would normally contribute significant drag if left in a fixed or windmilling arrangement. Therefore, a design that stows the propeller blades is desirable. In this paper, we present a method for designing folding-blade configurations that conform to the nacelle surface when stowed. These folded designs maintain performance nearly identical to their straight, non-folding blade counterparts.

  3. Development of a hydraulic turbine design method

    NASA Astrophysics Data System (ADS)

    Kassanos, Ioannis; Anagnostopoulos, John; Papantonis, Dimitris

    2013-10-01

    In this paper a hydraulic turbine parametric design method is presented which is based on the combination of traditional methods and parametric surface modeling techniques. The blade of the turbine runner is described using Bezier surfaces for the definition of the meridional plane as well as the blade angle distribution, and a thickness distribution applied normal to the mean blade surface. In this way, it is possible to define parametrically the whole runner using a relatively small number of design parameters, compared to conventional methods. The above definition is then combined with a commercial CFD software and a stochastic optimization algorithm towards the development of an automated design optimization procedure. The process is demonstrated with the design of a Francis turbine runner.
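The parametric definition this abstract describes rests on Bezier curves controlled by a small number of design parameters. A minimal sketch of the idea, with invented control values standing in for a blade-angle distribution:

```python
# A cubic Bezier curve with 4 control points, here standing in for a blade
# angle distribution from leading edge (t = 0) to trailing edge (t = 1).
# Control values are illustrative, not from the paper.

def bezier3(p, t):
    """Evaluate a cubic Bezier curve with control points p at t in [0, 1]."""
    s = 1.0 - t
    return s**3 * p[0] + 3 * s**2 * t * p[1] + 3 * s * t**2 * p[2] + t**3 * p[3]

ctrl = [60.0, 45.0, 30.0, 18.0]             # blade angles (degrees)
angles = [bezier3(ctrl, i / 10) for i in range(11)]
print(round(angles[0], 1), round(angles[-1], 1))  # endpoints are interpolated
```

Because the curve interpolates its end control points and is smooth in between, a handful of such parameters can describe the whole distribution, which is what makes stochastic optimization over the design tractable.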

  4. Accuracy of two geocoding methods for geographic information system-based exposure assessment in epidemiological studies.

    PubMed

    Faure, Elodie; Danjou, Aurélie M N; Clavel-Chapelon, Françoise; Boutron-Ruault, Marie-Christine; Dossus, Laure; Fervers, Béatrice

    2017-02-24

    residential addresses in epidemiological studies not initially recorded for environmental exposure assessment, for both recent addresses and residence locations more than 20 years ago. Accuracy of the two automatic geocoding methods was comparable. The in-house method (B) allowed a better control of the geocoding process and was less time consuming.

  5. Preliminary aerothermodynamic design method for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Harloff, G. J.; Petrie, S. L.

    1987-01-01

    Preliminary design methods are presented for vehicle aerothermodynamics. Predictions are made for Shuttle orbiter, a Mach 6 transport vehicle and a high-speed missile configuration. Rapid and accurate methods are discussed for obtaining aerodynamic coefficients and heat transfer rates for laminar and turbulent flows for vehicles at high angles of attack and hypersonic Mach numbers.

  6. Current Methods and Challenges for Epidemiological Studies of the Associations Between Chemical Constituents of Particulate Matter and Health.

    PubMed

    Krall, Jenna R; Chang, Howard H; Sarnat, Stefanie Ebelt; Peng, Roger D; Waller, Lance A

    2015-12-01

    Epidemiological studies have been critical for estimating associations between exposure to ambient particulate matter (PM) air pollution and adverse health outcomes. Because total PM mass is a temporally and spatially varying mixture of constituents with different physical and chemical properties, recent epidemiological studies have focused on PM constituents. Most studies have estimated associations between PM constituents and health using the same statistical methods as in studies of PM mass. However, these approaches may not be sufficient to address challenges specific to studies of PM constituents, namely assigning exposure, disentangling health effects, and handling measurement error. We reviewed large, population-based epidemiological studies of PM constituents and health and described the statistical methods typically applied to address these challenges. Development of statistical methods that simultaneously address multiple challenges, for example, both disentangling health effects and handling measurement error, could improve estimation of associations between PM constituents and adverse health outcomes.
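One of the challenges named here, measurement error, has a well-known first-order consequence under the classical error model: the regression slope is attenuated toward the null by the reliability ratio. A sketch with illustrative values (not from any study in the review):

```python
# Classical measurement error attenuates a regression slope toward the
# null by the reliability ratio var(x) / (var(x) + var(u)). The slope
# and variances below are illustrative.

def attenuated_slope(true_slope, var_x, var_error):
    # fraction of observed exposure variance that is true signal
    reliability = var_x / (var_x + var_error)
    return true_slope * reliability

# A true slope of 0.10 observed through an error-prone PM-constituent
# measurement with var(x) = 4 and var(u) = 1:
print(round(attenuated_slope(0.10, 4.0, 1.0), 3))  # 0.08
```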

  7. Some epidemiologic, clinical, microbiologic, and organizational assumptions that influenced the design and performance of the Global Enteric Multicenter Study (GEMS).

    PubMed

    Farag, Tamer H; Nasrin, Dilruba; Wu, Yukun; Muhsen, Khitam; Blackwelder, William C; Sommerfelt, Halvor; Panchalingam, Sandra; Nataro, James P; Kotloff, Karen L; Levine, Myron M

    2012-12-01

    The overall aim of the Global Enteric Multicenter Study-1 (GEMS-1) is to identify the etiologic agents associated with moderate-to-severe diarrhea (MSD) among children <5 years of age, and thereby the attributable pathogen-specific population-based incidence of MSD, to guide investments in research and public health interventions against diarrheal disease. To accomplish this, 9 core assumptions were vetted through widespread consultation: (1) a limited number of etiologic agents may be responsible for most MSD; (2) a definition of MSD can be crafted that encompasses cases that might otherwise be fatal in the community without treatment; (3) MSD seen at sentinel centers is a proxy for fatal diarrheal disease in the community; (4) matched case/control is the appropriate epidemiologic design; (5) methods across the sites can be standardized and rigorous quality control maintained; (6) a single 60-day postenrollment visit to case and control households creates mini-cohorts, allowing comparisons; (7) broad support for GEMS-1 messages can be achieved by incorporating advice from public health spokespersons; (8) results will facilitate the setting of investment and intervention priorities; and (9) wide acceptance and dissemination of the GEMS-1 results can be achieved.

  8. Some Epidemiologic, Clinical, Microbiologic, and Organizational Assumptions That Influenced the Design and Performance of the Global Enteric Multicenter Study (GEMS)

    PubMed Central

    Farag, Tamer H.; Nasrin, Dilruba; Wu, Yukun; Muhsen, Khitam; Blackwelder, William C.; Sommerfelt, Halvor; Panchalingam, Sandra; Nataro, James P.; Kotloff, Karen L.; Levine, Myron M.

    2012-01-01

    The overall aim of the Global Enteric Multicenter Study–1 (GEMS-1) is to identify the etiologic agents associated with moderate-to-severe diarrhea (MSD) among children <5 years of age, and thereby the attributable pathogen-specific population-based incidence of MSD, to guide investments in research and public health interventions against diarrheal disease. To accomplish this, 9 core assumptions were vetted through widespread consultation: (1) a limited number of etiologic agents may be responsible for most MSD; (2) a definition of MSD can be crafted that encompasses cases that might otherwise be fatal in the community without treatment; (3) MSD seen at sentinel centers is a proxy for fatal diarrheal disease in the community; (4) matched case/control is the appropriate epidemiologic design; (5) methods across the sites can be standardized and rigorous quality control maintained; (6) a single 60-day postenrollment visit to case and control households creates mini-cohorts, allowing comparisons; (7) broad support for GEMS-1 messages can be achieved by incorporating advice from public health spokespersons; (8) results will facilitate the setting of investment and intervention priorities; and (9) wide acceptance and dissemination of the GEMS-1 results can be achieved. PMID:23169935

  9. [Eco-epidemiology: towards epidemiology of complexity].

    PubMed

    Bizouarn, Philippe

    2016-05-01

    In order to solve the public health problems posed by risk-factor epidemiology, which is centered on the individual and neglects the causal processes linking risk factors with health outcomes, Mervyn Susser proposed a multilevel epidemiology called eco-epidemiology, addressing the interdependence of individuals and their connection with the molecular, individual, societal, and environmental levels of organization that participate in causal disease processes. The aim of this epidemiology is to integrate more than one level of organization in the design, analysis, and interpretation of health problems. After presenting the main criticisms of risk-factor epidemiology focused on the individual, we will try to show how eco-epidemiology and its development could help to understand the need for a broader and integrative epidemiology, in which studies designed to identify risk factors would be balanced by studies designed to answer other questions equally vital to public health. © 2016 médecine/sciences – Inserm.

  10. Characteristics of Biostatistics, Epidemiology, and Research Design Programs in Institutions With Clinical and Translational Science Awards.

    PubMed

    Rahbar, Mohammad H; Dickerson, Aisha S; Ahn, Chul; Carter, Rickey E; Hessabi, Manouchehr; Lindsell, Christopher J; Nietert, Paul J; Oster, Robert A; Pollock, Brad H; Welty, Leah J

    2017-02-01

    To learn the size, composition, and scholarly output of biostatistics, epidemiology, and research design (BERD) units in U.S. academic health centers (AHCs). Each year for four years, the authors surveyed all BERD units in U.S. AHCs that were members of the Clinical and Translational Science Award (CTSA) Consortium. In 2010, 46 BERD units were surveyed; in 2011, 55; in 2012, 60; and in 2013, 61. Response rates to the 2010, 2011, 2012, and 2013 surveys were 93.5%, 98.2%, 98.3%, and 86.9%, respectively. Overall, the size of BERD units ranged from 3 to 86 individuals. The median FTE in BERD units remained similar and ranged from 3.0 to 3.5 FTEs over the years. BERD units reported more availability of doctoral-level biostatisticians than doctoral-level epidemiologists. In 2011, 2012, and 2013, more than a third of BERD units provided consulting support on 101 to 200 projects. A majority of BERD units reported that between 25% and 75% (in 2011) and 31% to 70% (in 2012) of their consulting was to junior investigators. More than two-thirds of BERD units reported their contributions to the submission of 20 or more non-BERD grant or contract applications annually. Nearly half of BERD units reported 1 to 10 manuscripts submitted annually with a BERD practitioner as the first or corresponding author. The findings regarding BERD units provide a benchmark against which to compare BERD resources and may be particularly useful for institutions planning to develop new units to support programs such as the CTSA.

  11. Design issues in epidemiologic studies of indoor exposure to Rn and risk of lung cancer.

    PubMed

    Lubin, J H; Samet, J M; Weinberg, C

    1990-12-01

    Recent data on indoor air quality have indicated that Rn (222Rn) and its decay products are frequently present in domestic environments. Since studies of Rn-exposed miners have established that Rn decay products are a lung carcinogen, their presence in indoor air raises concerns about an increase in lung cancer risk for the general population. To directly evaluate lung cancer risk from domestic exposure to Rn and its decay products, as well as to evaluate risk assessments derived from studies of Rn-exposed underground miners, several epidemiologic studies of indoor Rn exposure have been initiated or are planned. This paper calculates sample sizes required for a hypothetical case-control study to address several important hypotheses and shows the impact of several difficult problems associated with estimating a subject's Rn exposure. We consider the effects of subject mobility, choice of the exposure response trend which is used to characterize an alternative hypothesis, and errors in the estimation of exposure. Imprecise estimation of Rn exposure arises from errors in the measurement device, exposure to Rn decay products from sources outside the home, inability to measure exposures over time in current as well as previous residences, and the unknown relationship between measured concentration and lung dose of alpha energy from the decay of Rn and its progeny. These methodological problems can result in large discrepancies between computed and actual study power. Failure to anticipate these problems in the design of a study can result in inaccurate estimates of power. We conclude that case-control studies of indoor Rn and lung cancer may require substantial numbers of subjects in order to address the many questions of importance that burden current risk assessments with uncertainty. We suggest pooling data from studies with the largest numbers of cases and with the most precise estimates of Rn exposure as the best approach for meeting present research needs.
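The sample-size question this abstract raises can be illustrated with a standard unmatched case-control calculation (two-sided alpha = 0.05, power = 0.80). The exposure prevalence and target odds ratio below are illustrative choices, not values from the paper:

```python
# Standard two-proportion sample-size sketch for an unmatched case-control
# study. Exposure prevalence among controls (p0) and target odds ratio
# are illustrative assumptions.
import math

def cases_needed(p0, odds_ratio, z_alpha=1.96, z_beta=0.84):
    """Cases required (with an equal number of controls) to detect a
    given odds ratio when a fraction p0 of controls is exposed."""
    p1 = odds_ratio * p0 / (1 + p0 * (odds_ratio - 1))  # exposure in cases
    pbar = (p0 + p1) / 2
    num = (z_alpha * math.sqrt(2 * pbar * (1 - pbar))
           + z_beta * math.sqrt(p0 * (1 - p0) + p1 * (1 - p1))) ** 2
    return math.ceil(num / (p1 - p0) ** 2)

# e.g. 25% of controls with elevated indoor Rn, target OR = 1.5:
print(cases_needed(0.25, 1.5))
```

Note that this calculation assumes exposure is measured without error; the exposure-estimation problems discussed in the abstract inflate the required numbers beyond what such a formula suggests.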

  12. A robust method for iodine status determination in epidemiological studies by capillary electrophoresis.

    PubMed

    de Macedo, Adriana Nori; Teo, Koon; Mente, Andrew; McQueen, Matthew J; Zeidler, Johannes; Poirier, Paul; Lear, Scott A; Wielgosz, Andy; Britz-McKibbin, Philip

    2014-10-21

    Iodine deficiency is the most common preventable cause of intellectual disabilities in children. Global health initiatives to ensure optimum nutrition thus require continuous monitoring of population-wide iodine intake as determined by urinary excretion of iodide. Current methods to analyze urinary iodide are limited by complicated sample pretreatment, costly infrastructure, and/or poor selectivity, posing restrictions to large-scale epidemiological studies. We describe a simple yet selective method to analyze iodide in volume-restricted human urine specimens stored in biorepositories by capillary electrophoresis (CE) with UV detection. Excellent selectivity is achieved when using an acidic background electrolyte in conjunction with dynamic complexation via α-cyclodextrin in an unmodified fused-silica capillary under reversed polarity. Sample self-stacking is developed as a novel online sample preconcentration method to boost sensitivity with submicromolar detection limits for iodide (S/N ≈ 3, 0.06 μM) directly in urine. This assay also allows for simultaneous analysis of environmental iodide uptake inhibitors, including thiocyanate and nitrate. Rigorous method validation confirmed good linearity (R² = 0.9998), dynamic range (0.20 to 4.0 μM), accuracy (average recovery of 93% at three concentration levels) and precision for reliable iodide determination in pooled urine specimens over 29 days of analysis (RSD = 11%, n = 87).
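Two of the validation statistics quoted here, percent recovery and relative standard deviation, are straightforward to compute. The sketch below uses invented replicate measurements rather than the study's data:

```python
# Percent recovery and relative standard deviation (RSD), as used in the
# abstract's method validation, computed on hypothetical replicates.
import statistics

def recovery_pct(measured, spiked):
    """Mean percent recovery of a known spiked concentration."""
    return 100.0 * statistics.mean(measured) / spiked

def rsd_pct(values):
    """Relative standard deviation (coefficient of variation) in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical iodide results (uM) for a 2.0 uM spike measured on 3 days:
measured = [1.88, 1.86, 1.84]
print(round(recovery_pct(measured, 2.0), 1), round(rsd_pct(measured), 1))
```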

  13. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, that involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.

  14. Multidisciplinary Optimization Methods for Preliminary Design

    NASA Technical Reports Server (NTRS)

    Korte, J. J.; Weston, R. P.; Zang, T. A.

    1997-01-01

    An overview of multidisciplinary optimization (MDO) methodology and two applications of this methodology to the preliminary design phase are presented. These applications are being undertaken to improve, develop, validate and demonstrate MDO methods. Each is presented to illustrate different aspects of this methodology. The first application is an MDO preliminary design problem for defining the geometry and structure of an aerospike nozzle of a linear aerospike rocket engine. The second application demonstrates the use of the Framework for Interdisciplinary Design Optimization (FIDO), which is a computational environment system, by solving a preliminary design problem for a High-Speed Civil Transport (HSCT). The two sample problems illustrate the advantages to performing preliminary design with an MDO process.

  15. A simplified method of performance indicators development for epidemiological surveillance networks--application to the RESAPATH surveillance network.

    PubMed

    Sorbe, A; Chazel, M; Gay, E; Haenni, M; Madec, J-Y; Hendrikx, P

    2011-06-01

    Developing and calculating performance indicators makes it possible to continuously monitor the operation of an epidemiological surveillance network. This is an internal evaluation method, implemented by the coordinators in collaboration with all the actors of the network. Its purpose is to detect weak points in order to optimize management. A method for developing performance indicators for epidemiological surveillance networks was devised in 2004 and has been applied to several networks. Its implementation requires a thorough description of the network environment and of all its activities in order to define priority indicators. Because this method is considered complex, our objective was to develop a simplified approach and to apply it to an epidemiological surveillance network. We applied the initial method to a theoretical network model to obtain a list of generic indicators that can be adapted to any surveillance network. We obtained a list of 25 generic performance indicators, intended to be reformulated and described according to the specificities of each network. This list was used to develop performance indicators for RESAPATH, an epidemiological surveillance network for antimicrobial resistance in pathogenic bacteria of animal origin in France. This application allowed us to validate the simplified method, its practical value, and its level of user acceptance. Its ease of use and speed of application compared with the initial method argue in favor of its use on a broader scale. Copyright © 2011 Elsevier Masson SAS. All rights reserved.

  16. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. The approach also weighs operational approaches by their effect on upstream design variables, so that linkages between operations and those variables can be established readily yet defensibly. To avoid the range of problems that have defeated previous methods of dealing with the complexity of transportation design, and to reduce inefficient use of resources, the method identifies the areas of sufficient promise to warrant a higher grade of analysis, along with the linkages at issue between operations and other factors. Ultimately, the system is designed to save resources and time, and allows for the evolution of operable space transportation system technology, design, and conceptual system approach targets.

  17. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.

  18. Optimization methods applied to hybrid vehicle design

    NASA Technical Reports Server (NTRS)

    Donoghue, J. F.; Burghart, J. H.

    1983-01-01

    The use of optimization methods as an effective design tool in the design of hybrid vehicle propulsion systems is demonstrated. Optimization techniques were used to select values for three design parameters (battery weight, heat engine power rating and power split between the two on-board energy sources) such that various measures of vehicle performance (acquisition cost, life cycle cost and petroleum consumption) were optimized. The approach produced designs which were often significant improvements over hybrid designs already reported on in the literature. The principal conclusions are as follows. First, it was found that the strategy used to split the required power between the two on-board energy sources can have a significant effect on life cycle cost and petroleum consumption. Second, the optimization program should be constructed so that performance measures and design variables can be easily changed. Third, the vehicle simulation program has a significant effect on the computer run time of the overall optimization program; run time can be significantly reduced by proper design of the types of trips the vehicle takes in a one-year period. Fourth, care must be taken in designing the cost and constraint expressions which are used in the optimization so that they are relatively smooth functions of the design variables. Fifth, proper handling of constraints on battery weight and heat engine rating, variables which must be large enough to meet power demands, is particularly important for the success of an optimization study. Finally, the principal conclusion is that optimization methods provide a practical tool for carrying out the design of a hybrid vehicle propulsion system.

  19. Novel Methods for Electromagnetic Simulation and Design

    DTIC Science & Technology

    2016-08-03

    AFRL-AFOSR-VA-TR-2016-0272, Novel Methods for Electromagnetic Simulation and Design; Leslie Greengard, New York University, 70 Washington Square S, New York. Grant number FA9550-10-1-0180; program element 61102F. The project addressed electromagnetic scattering in realistic environments involving complex geometry during the six-year performance period (including a one-year no-cost extension).

  20. Optical fingerprinting in bacterial epidemiology: Raman spectroscopy as a real-time typing method.

    PubMed

    Willemse-Erix, Diana F M; Scholtes-Timmerman, Maarten J; Jachtenberg, Jan-Willem; van Leeuwen, Willem B; Horst-Kreft, Deborah; Bakker Schut, Tom C; Deurenberg, Ruud H; Puppels, Gerwin J; van Belkum, Alex; Vos, Margreet C; Maquelin, Kees

    2009-03-01

    Hospital-acquired infections (HAI) increase morbidity and mortality and constitute a high financial burden on health care systems. An effective weapon against HAI is early detection of potential outbreaks and sources of contamination. Such monitoring requires microbial typing with sufficient reproducibility and discriminatory power. Here, a microbial typing method based on Raman spectroscopy is presented. This technique provides strain-specific optical fingerprints in a few minutes instead of several hours to days, as is the case with genotyping methods. Although the method is generally applicable, we used 118 Staphylococcus aureus isolates to illustrate that the discriminatory power matches that of established genotyping techniques (numerical index of diversity [D] = 0.989) and that concordance with the gold standard (pulsed-field gel electrophoresis) is high (95%). The Raman clustering of isolates was reproducible to the strain level for five independent cultures, despite culture times varying from 18 to 24 h. Furthermore, this technique was able to classify stored (-80 degrees C) and recent isolates of a methicillin-resistant Staphylococcus aureus-colonized individual during surveillance studies, and did so days earlier than established genotyping techniques did. Its high throughput and ease of use make it suitable for routine diagnostic laboratory settings. This will set the stage for continuous, automated, real-time epidemiological monitoring of bacterial infections in a hospital, which can then be followed by timely corrective action by infection prevention teams.
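
    The numerical index of diversity (D) quoted above is Simpson's index as adapted for typing systems (Hunter-Gaston): the probability that two isolates drawn at random belong to different types. A minimal computation, with made-up type labels, looks like this:

```python
from collections import Counter

def hunter_gaston_d(type_labels):
    """Hunter-Gaston discriminatory index over a list of type assignments."""
    n = len(type_labels)
    counts = Counter(type_labels).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Eight illustrative isolates falling into five types
isolates = ["A"] * 3 + ["B"] * 2 + ["C", "D", "E"]
print(round(hunter_gaston_d(isolates), 3))  # -> 0.857
```

    D approaches 1 as the method separates more isolates into distinct types; values such as the 0.989 reported above indicate discrimination comparable to genotyping.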

  1. The genetic study of three population microisolates in South Tyrol (MICROS): study design and epidemiological perspectives

    PubMed Central

    Pattaro, Cristian; Marroni, Fabio; Riegler, Alice; Mascalzoni, Deborah; Pichler, Irene; Volpato, Claudia B; Dal Cero, Umberta; De Grandi, Alessandro; Egger, Clemens; Eisendle, Agatha; Fuchsberger, Christian; Gögele, Martin; Pedrotti, Sara; Pinggera, Gerd K; Stefanov, Stefan A; Vogl, Florian D; Wiedermann, Christian J; Meitinger, Thomas; Pramstaller, Peter P

    2007-01-01

    Background There is increasing evidence of the important role that small, isolated populations could play in finding genes involved in the etiology of diseases. For historical and political reasons, South Tyrol, the northernmost Italian region, includes several small villages which have remained isolated over the centuries. Methods The MICROS study is a population-based survey of three small, isolated villages characterized by: old settlement; a small number of founders; high endogamy rates; and slow/null population expansion. During stage 1 (2002/03), genealogical data, screening questionnaires, clinical measurements, blood and urine samples, and DNA were collected for 1175 adult volunteers. Stage 2, concerning trait diagnoses, linkage analysis and association studies, is ongoing. The selection of the traits is being driven by expert clinicians. Preliminary descriptive statistics were obtained. Power simulations for finding linkage on a quantitative trait locus (QTL) were undertaken. Results Starting from the participants, genealogies were reconstructed for 50,037 subjects, going back to the early 1600s. Within the last five generations, subjects clustered into one pedigree of 7049 subjects plus 178 smaller pedigrees (3 to 85 subjects each). A significant probability of familial clustering was assessed for many traits, especially among the cardiovascular, neurological and respiratory traits. Simulations showed that the MICROS pedigree has substantial power to detect a LOD score ≥ 3 when the QTL-specific heritability is ≥ 20%. Conclusion The MICROS study is an extensive, ongoing, two-stage survey aimed at characterizing the genetic epidemiology of Mendelian and complex diseases. Our approach, involving different scientific disciplines, is an advantageous strategy to define and to study population isolates. The isolation of the Alpine populations, together with the extensive data collected so far, make the MICROS study a powerful resource for the study

  2. Evaluation and validity of a polymerase chain reaction-based open reading frame typing method to dissect the molecular epidemiology for Acinetobacter baumannii in an epidemiologic study of a hospital outbreak.

    PubMed

    Fujikura, Yuji; Yuki, Atsushi; Hamamoto, Takaaki; Ichimura, Sadahiro; Kawana, Akihiko; Ohkusu, Kiyofumi; Matsumoto, Tetsuya

    2016-11-01

    Acinetobacter baumannii is regarded as one of the most important pathogens in hospital outbreaks. To obtain an efficient and simple epidemiologic method of surveillance during outbreaks, we assessed the applicability of the polymerase chain reaction-based open reading frames typing (POT) method and compared it with pulsed-field gel electrophoresis. The POT method was found to have sufficient discriminatory power to identify the strains and would be widely applicable to epidemiologic surveillance during hospital outbreaks.

  3. Computer-Aided Drug Design Methods.

    PubMed

    Yu, Wenbo; MacKerell, Alexander D

    2017-01-01

    Computational approaches are useful tools to interpret and guide experiments to expedite the antibiotic drug design process. Structure-based drug design (SBDD) and ligand-based drug design (LBDD) are the two general types of computer-aided drug design (CADD) approaches in existence. SBDD methods analyze macromolecular target 3-dimensional structural information, typically of proteins or RNA, to identify key sites and interactions that are important for their respective biological functions. Such information can then be utilized to design antibiotic drugs that can compete with essential interactions involving the target and thus interrupt the biological pathways essential for survival of the microorganism(s). LBDD methods focus on known antibiotic ligands for a target to establish a relationship between their physicochemical properties and antibiotic activities, referred to as a structure-activity relationship (SAR), information that can be used for optimization of known drugs or to guide the design of new drugs with improved activity. In this chapter, standard CADD protocols for both SBDD and LBDD will be presented with a special focus on methodologies and targets routinely studied in our laboratory for antibiotic drug discovery.

  4. Variable genetic element typing: a quick method for epidemiological subtyping of Legionella pneumophila.

    PubMed

    Pannier, K; Heuner, K; Lück, C

    2010-04-01

    A total of 57 isolates of Legionella pneumophila were randomly selected from the German National Legionella strain collection and typed by monoclonal antibody subgrouping, the seven-gene locus sequence-based typing (SBT) scheme and a newly developed variable element typing (VET) system based on the presence or absence of ten variable genetic elements. These elements were detected while screening a genomic library of strain Corby, as well as being taken from published data for PAI-1 (pathogenicity island) from strain Philadelphia. Specific primers were designed and used in gel-based polymerase chain reaction (PCR) assays. PCR amplification of the mip gene served as a control. The end-point was the presence/absence of a PCR product on an ethidium bromide-stained gel. In the present study, the index of discrimination was somewhat lower than that of the SBT (0.87 versus 0.97). Nevertheless, the results obtained showed, as a 'proof of principle', that this simple and quick typing assay might be useful for the epidemiological characterisation of L. pneumophila strains.
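
    The presence/absence end-point described above lends itself to a simple bit-profile typing scheme: each isolate's gel results for the ten variable elements become a ten-bit signature, and isolates sharing a signature are assigned the same type. The element names and gel results below are invented for illustration.

```python
# Hypothetical element names standing in for the ten variable genetic elements
ELEMENTS = [f"VE{i}" for i in range(1, 11)]

def profile(gel_results):
    """Map {element: band present?} to a ten-character 0/1 signature."""
    return "".join("1" if gel_results.get(e, False) else "0" for e in ELEMENTS)

isolate_a = {e: True for e in ELEMENTS[:4]}   # bands for VE1-VE4 only
isolate_b = {e: True for e in ELEMENTS[:4]}   # identical banding pattern
isolate_c = {"VE1": True, "VE10": True}       # different pattern

assert profile(isolate_a) == profile(isolate_b)  # same VET type
assert profile(isolate_a) != profile(isolate_c)  # distinguishable
print(profile(isolate_a))  # -> 1111000000
```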

  5. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

    Research conducted by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be available for these early design processes. In this paper, we will review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  6. MAST Propellant and Delivery System Design Methods

    NASA Technical Reports Server (NTRS)

    Nadeem, Uzair; Mc Cleskey, Carey M.

    2015-01-01

    A Mars Aerospace Taxi (MAST) concept and propellant storage and delivery case study is undergoing investigation by NASA's Element Design and Architectural Impact (EDAI) design and analysis forum. The MAST lander concept envisions landing with its ascent propellant storage tanks empty and supplying these reusable Mars landers with propellant that is generated and transferred while on the Mars surface. The report provides an overview of the data derived from modeling different methods of propellant line routing (or "lining"), differentiating the resulting design and operations complexity of fluid and gaseous paths for a given set of fluid sources and destinations. The EDAI team desires a rough-order-of-magnitude algorithm for estimating the lining characteristics (i.e., the plumbing mass and complexity) associated with different numbers of vehicle propellant sources and destinations. This paper explores the feasibility of preparing a mathematically sound algorithm for this purpose and offers a method for the EDAI team to implement.
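
    A rough-order-of-magnitude estimate of the kind sought above might scale plumbing mass with line count: dedicated point-to-point routing needs one line per source-destination pair, while a shared manifold grows only linearly. The routing options, lengths and unit masses below are invented placeholders, not the EDAI team's actual algorithm.

```python
def plumbing_mass(sources, destinations, avg_len_m=4.0, kg_per_m=1.2,
                  manifold=False):
    """Toy lining estimate: line count times assumed length and unit mass."""
    lines = sources + destinations if manifold else sources * destinations
    return lines * avg_len_m * kg_per_m

print(plumbing_mass(3, 4))                 # point-to-point: 3 x 4 = 12 lines
print(plumbing_mass(3, 4, manifold=True))  # manifold: 3 + 4 = 7 lines
```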

  7. Case study: design? Method? Or comprehensive strategy?

    PubMed

    Jones, Colin; Lyons, Christina

    2004-01-01

    As the case study approach gains popularity in nursing research, questions arise with regard to what exactly it is, and where it appears to fit paradigmatically. Is it a method, a design, and are such distinctions important? Colin Jones and Christina Lyons review some of the key issues, with specific emphasis on the use of case study within an interpretivist philosophy.

  8. Financial methods for waterflooding injectate design

    DOEpatents

    Heneman, Helmuth J.; Brady, Patrick V.

    2017-08-08

    A method of selecting an injectate for recovering liquid hydrocarbons from a reservoir includes designing a plurality of injectates, calculating a net present value of each injectate, and selecting a candidate injectate based on the net present value. For example, the candidate injectate may be selected to maximize the net present value of a waterflooding operation.
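
    The selection step described in the patent abstract reduces to computing a net present value per candidate injectate and taking the maximum. The discounting itself is standard; the candidate names, cash flows and discount rate below are invented for illustration.

```python
def npv(rate, cashflows):
    """Discount a list of yearly cash flows (year 0 first) to present value."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical candidates: up-front cost, then incremental production revenue
candidates = {
    "low-salinity brine": [-1_000_000, 450_000, 420_000, 380_000],
    "surfactant slug":    [-1_800_000, 700_000, 650_000, 500_000],
}
best = max(candidates, key=lambda name: npv(0.10, candidates[name]))
print(best)  # -> low-salinity brine
```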

  9. Statistical Methods in Algorithm Design and Analysis.

    ERIC Educational Resources Information Center

    Weide, Bruce W.

    The use of statistical methods in the design and analysis of discrete algorithms is explored. The introductory chapter contains a literature survey and background material on probability theory. In Chapter 2, probabilistic approximation algorithms are discussed with the goal of exposing and correcting some oversights in previous work. Chapter 3…

  10. An optimisation method for complex product design

    NASA Astrophysics Data System (ADS)

    Li, Ni; Yi, Wenqing; Bi, Zhuming; Kong, Haipeng; Gong, Guanghong

    2013-11-01

    Designing a complex product such as an aircraft usually requires both qualitative and quantitative data and reasoning. To assist the design process, a critical issue is how to represent qualitative data and utilise it in the optimisation. In this study, a new method is proposed for the optimal design of complex products: to make full use of available data, information and knowledge, qualitative reasoning is integrated into the optimisation process. The transformation and fusion of qualitative and quantitative data are achieved via fuzzy sets theory and a cloud model. To shorten the design process, parallel computing is implemented to solve the formulated optimisation problems, and a parallel adaptive hybrid algorithm (PAHA) has been proposed. The performance of the new algorithm has been verified by comparing its results with those of two other existing algorithms. Further, PAHA has been applied to determine the shape parameters of an aircraft model for aerodynamic optimisation purposes.

  11. Outdoor work and solar radiation exposure: Evaluation method for epidemiological studies.

    PubMed

    Modenese, Alberto; Bisegna, Fabio; Borra, Massimo; Grandi, Carlo; Gugliermetti, Franco; Militello, Andrea; Gobba, Fabriziomaria

    The health risk related to an excessive exposure to solar radiation (SR) is well known. The Sun represents the main exposure source for all the frequency bands of optical radiation, that is the part of the electromagnetic spectrum ranging between 100 nm and 1 mm, including infrared (IR), ultraviolet (UV) and visible radiation. According to recent studies, outdoor workers have a relevant exposure to SR, but few studies available in the scientific literature have attempted to retrace a detailed history of individual exposure. We propose a new method for the evaluation of SR cumulative exposure both during work and leisure time, integrating subjective and objective data. The former is collected by means of an interviewer-administered questionnaire. The latter is available through Internet databases for many geographical regions and through individual exposure measurements. The data is integrated into a mathematical algorithm, in order to obtain an estimate of the individual total amount of SR the subjects have been exposed to during their lives. The questionnaire has been tested on 58 volunteer subjects. Environmental exposure data were collected through online databases for 3 different places in Italy in 2012. Individual exposure was measured by electronic UV dosimeter in 6 fishermen. A mathematical algorithm integrating subjective and objective data has been elaborated. The method proposed may be used in epidemiological studies to evaluate specific correlations with biological effects of SR and to weigh the role of the personal and environmental factors that may increase or reduce SR exposure. Med Pr 2016;67(5):577-587.
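
    The kind of cumulative-exposure algorithm described above can be sketched as a sum over life periods of ambient dose weighted by time outdoors and behavioural corrections. The period structure, weighting factor and all numeric values below are invented, not the authors' actual algorithm.

```python
def cumulative_uv_dose(periods):
    """Sum ambient dose x time outdoors x received fraction over life periods.

    Each period: (years, days_outdoors_per_year, ambient_dose_per_day,
    fraction_received), the last factor standing in for posture, clothing
    and shading corrections derived from the questionnaire.
    """
    return sum(years * days * ambient * fraction
               for years, days, ambient, fraction in periods)

# Hypothetical exposure history (ambient dose in kJ/m^2 per day)
work_history = [
    (10, 220, 4.0, 0.3),   # 10 years of outdoor work
    (25, 60, 4.5, 0.2),    # 25 years mostly indoors, leisure exposure only
]
print(round(cumulative_uv_dose(work_history), 1), "kJ/m^2")
```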

  12. Evaluation of surveillance methods for an epidemiological study of contact lens related microbial keratitis.

    PubMed

    Keay, Lisa; Edwards, Katie; Brian, Garry; Naduvilath, Thomas; Stapleton, Fiona

    2004-08-01

    To evaluate surveillance methods in a pilot epidemiological study of contact lens related microbial keratitis (MK) cases identified by ophthalmic practitioners in Australia and New Zealand between May and August 2003 inclusive. Twelve ophthalmologists and 55 optometrists from rural and metropolitan locations were sent a study information pack with postal reporting forms. After 2 months, practitioners were emailed a link to a website for Internet reporting. After 4 months, practitioners were prompted by email and then by telephone if a response was not received. Passive response rates were the rate of returns after posting information and emailing the website link. Active response rates included personalized email and telephone follow-up. Ten cases of MK were identified by optometrists and five by ophthalmologists. The passive response rates were 79% and 58% for the first and second reporting periods, respectively. There was a lower response rate in the second reporting period compared to the first (P = 0.02). With active surveillance the response rate increased to 97% and 96%. A large proportion of optometrists (62%) and ophthalmologists (55%) used the website for at least one reporting period. Internet reporting was used by all New Zealand practitioners (5/5). A surveillance study to estimate the incidence of contact lens related MK in Australia and New Zealand is feasible and acceptable. Internet-based reporting offers a reliable, rapid and cost-effective means of running a large scale, international surveillance study. Active surveillance methods are necessary to enhance reporting rates.

  13. RADRUE METHOD FOR RECONSTRUCTION OF EXTERNAL PHOTON DOSES TO CHERNOBYL LIQUIDATORS IN EPIDEMIOLOGICAL STUDIES

    PubMed Central

    Kryuchkov, Victor; Chumak, Vadim; Maceika, Evaldas; Anspaugh, Lynn R.; Cardis, Elisabeth; Bakhanova, Elena; Golovanov, Ivan; Drozdovitch, Vladimir; Luckyanov, Nickolas; Kesminiene, Ausrele; Voillequé, Paul; Bouville, André

    2010-01-01

    Between 1986 and 1990, several hundred thousand workers, called “liquidators” or “clean-up workers”, took part in decontamination and recovery activities within the 30-km zone around the Chernobyl nuclear power plant in Ukraine, where a major accident occurred in April 1986. The Chernobyl liquidators were mainly exposed to external ionizing radiation levels that depended primarily on their work locations and the time after the accident when the work was performed. Because individual doses were often monitored inadequately or were not monitored at all for the majority of liquidators, a new method of photon (i.e. gamma and x-rays) dose assessment, called “RADRUE” (Realistic Analytical Dose Reconstruction with Uncertainty Estimation) was developed to obtain unbiased and reasonably accurate estimates for use in three epidemiologic studies of hematological malignancies and thyroid cancer among liquidators. The RADRUE program implements a time-and-motion dose reconstruction method that is flexible and conceptually easy to understand. It includes a large exposure rate database and interpolation and extrapolation techniques to calculate exposure rates at places where liquidators lived and worked within ~70 km of the destroyed reactor. The RADRUE technique relies on data collected from subjects’ interviews conducted by trained interviewers, and on expert dosimetrists to interpret the information and provide supplementary information, when necessary, based upon their own Chernobyl experience. The RADRUE technique was used to estimate doses from external irradiation, as well as uncertainties, to the bone-marrow for 929 subjects and to the thyroid gland for 530 subjects enrolled in epidemiologic studies. Individual bone-marrow dose estimates were found to range from less than one μGy to 3,300 mGy, with an arithmetic mean of 71 mGy. Individual thyroid dose estimates were lower and ranged from 20 μGy to 507 mGy, with an arithmetic mean of 29 mGy.

  14. Radrue method for reconstruction of external photon doses for Chernobyl liquidators in epidemiological studies.

    PubMed

    Kryuchkov, Victor; Chumak, Vadim; Maceika, Evaldas; Anspaugh, Lynn R; Cardis, Elisabeth; Bakhanova, Elena; Golovanov, Ivan; Drozdovitch, Vladimir; Luckyanov, Nickolas; Kesminiene, Ausrele; Voillequé, Paul; Bouville, André

    2009-10-01

    Between 1986 and 1990, several hundred thousand workers, called "liquidators" or "clean-up workers," took part in decontamination and recovery activities within the 30-km zone around the Chernobyl nuclear power plant in Ukraine, where a major accident occurred in April 1986. The Chernobyl liquidators were mainly exposed to external ionizing radiation levels that depended primarily on their work locations and the time after the accident when the work was performed. Because individual doses were often monitored inadequately or were not monitored at all for the majority of liquidators, a new method of photon (i.e., gamma and x rays) dose assessment, called "RADRUE" (Realistic Analytical Dose Reconstruction with Uncertainty Estimation), was developed to obtain unbiased and reasonably accurate estimates for use in three epidemiologic studies of hematological malignancies and thyroid cancer among liquidators. The RADRUE program implements a time-and-motion dose-reconstruction method that is flexible and conceptually easy to understand. It includes a large exposure rate database and interpolation and extrapolation techniques to calculate exposure rates at places where liquidators lived and worked within approximately 70 km of the destroyed reactor. The RADRUE technique relies on data collected from subjects' interviews conducted by trained interviewers, and on expert dosimetrists to interpret the information and provide supplementary information, when necessary, based upon their own Chernobyl experience. The RADRUE technique was used to estimate doses from external irradiation, as well as uncertainties, to the bone marrow for 929 subjects and to the thyroid gland for 530 subjects enrolled in epidemiologic studies. Individual bone marrow dose estimates were found to range from less than one μGy to 3,300 mGy, with an arithmetic mean of 71 mGy. Individual thyroid dose estimates were lower and ranged from 20 μGy to 507 mGy, with an arithmetic mean of 29 mGy.
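
    The time-and-motion idea behind RADRUE can be sketched as follows: a liquidator's external dose is the sum over itinerary segments of the exposure rate at each location (looked up for the date) times the time spent there. The toy decay model and all rates below are invented placeholders, not RADRUE's actual exposure-rate database.

```python
import math

def exposure_rate(base_rate_mgy_per_h, days_after_accident, half_life_days=30.0):
    """Toy decaying exposure rate at a work location (illustrative only)."""
    return base_rate_mgy_per_h * math.exp(
        -math.log(2) * days_after_accident / half_life_days)

def reconstruct_dose(itinerary):
    """itinerary: list of (base rate mGy/h, days after accident, hours worked)."""
    return sum(exposure_rate(r, d) * h for r, d, h in itinerary)

itinerary = [
    (0.5, 30, 6 * 20),   # month 2: twenty 6-hour shifts near the reactor
    (0.1, 90, 6 * 40),   # months 4-5: forty shifts farther out
]
print(round(reconstruct_dose(itinerary), 1), "mGy")
```

    In the real method each segment's rate comes from the interview-derived itinerary and the measured exposure-rate database, and uncertainty is propagated alongside the point estimate.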

  15. Acoustic Treatment Design Scaling Methods. Phase 2

    NASA Technical Reports Server (NTRS)

    Clark, L. (Technical Monitor); Parrott, T. (Technical Monitor); Jones, M. (Technical Monitor); Kraft, R. E.; Yu, J.; Kwan, H. W.; Beer, B.; Seybert, A. F.; Tathavadekar, P.

    2003-01-01

    The ability to design, build and test miniaturized acoustic treatment panels on scale model fan rigs representative of full scale engines provides not only cost savings, but also an opportunity to optimize the treatment by allowing multiple tests. To use scale model treatment as a design tool, the impedance of the sub-scale liner must be known with confidence. This study was aimed at developing impedance measurement methods for high frequencies. A normal incidence impedance tube method that extends the upper frequency range to 25,000 Hz without grazing flow effects was evaluated. The free field method was investigated as a potential high frequency technique. The potential of the two-microphone in-situ impedance measurement method was evaluated in the presence of grazing flow. Difficulties in achieving the high frequency goals were encountered in all methods. Results of developing a time-domain finite difference resonator impedance model indicated that a re-interpretation of the empirical fluid mechanical models used in the frequency domain model for nonlinear resistance and mass reactance may be required. A scale model treatment design that could be tested on the Universal Propulsion Simulator vehicle was proposed.

  16. 3.6 Simplified Methods for Design

    SciTech Connect

    Nickell, R.E.; Yahr, G.T.

    1981-01-01

    Simplified design analysis methods for elevated temperature construction are classified and reviewed. Because the major impetus for developing elevated temperature design methodology during the past ten years has been the LMFBR program, considerable emphasis is placed upon results from this source. The operating characteristics of the LMFBR are such that cycles of severe transient thermal stresses can be interspersed with normal elevated temperature operational periods of significant duration, leading to a combination of plastic and creep deformation. The various simplified methods are organized into two general categories, depending upon whether it is the material, or constitutive, model that is reduced, or the geometric modeling that is simplified. Because the elastic representation of material behavior is so prevalent, an entire section is devoted to elastic analysis methods. Finally, the validation of the simplified procedures is discussed.

  17. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phase of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of two most commonly identified uncertainties in radiation shield design, the shielding properties of materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.

  18. A novel method to design flexible URAs

    NASA Astrophysics Data System (ADS)

    Lang, Haitao; Liu, Liren; Yang, Qingguo

    2007-05-01

    Aperture patterns play a vital role in coded aperture imaging (CAI) applications. In recent years, many approaches were presented to design optimum or near-optimum aperture patterns. Uniformly redundant arrays (URAs) are, undoubtedly, the most successful, owing to the constant sidelobe of their periodic autocorrelation function. Unfortunately, the existing methods can only be used to design URAs with a limited number of array sizes and fixed autocorrelation sidelobe-to-peak ratios. In this paper, we present a novel method to design more flexible URAs. Our approach is based on a searching program driven by DIRECT, a global optimization algorithm. We transform the design problem into a mathematical model, based on the DIRECT algorithm, which is advantageous for computer implementation. By changing the determinative conditions, we obtain two types of URAs: the filled URAs, which can be constructed by existing methods, and the sparse URAs, which, as far as we know, have not been mentioned by other authors. Finally, we carry out an experiment to demonstrate the imaging performance of the sparse URAs.
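
    The defining URA property mentioned above, a peak at zero shift with constant autocorrelation sidelobes, can be checked in one dimension for a quadratic-residue array of prime length (a classic URA construction). This is only an illustration of the property the DIRECT-driven search targets, not the paper's method.

```python
P = 11  # prime length with P % 4 == 3, so residues form a difference set
residues = {(i * i) % P for i in range(1, P)}
a = [1 if i in residues else 0 for i in range(P)]  # 1 = open aperture cell

def periodic_autocorr(seq, shift):
    """Periodic (cyclic) autocorrelation of a 0/1 sequence at a given shift."""
    n = len(seq)
    return sum(seq[i] * seq[(i + shift) % n] for i in range(n))

corr = [periodic_autocorr(a, s) for s in range(P)]
print(corr)  # peak (P-1)/2 at shift 0, constant sidelobes (P-3)/4 elsewhere
```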

  19. Optimum Design Methods for Structural Sandwich Panels

    DTIC Science & Technology

    1988-01-01

    Optimum Design Methods for Structural Sandwich Panels. Gibson, Lorna J. The largest value of GrE, for the 320 kg/m3 foam for which the crack propagated through the adhesive, corresponds to the surface energy of the adhesive. The goal of this part of the project is to find the minimum-weight design of a foam core sandwich beam for a given strength.

  20. Optimization methods for alternative energy system design

    NASA Astrophysics Data System (ADS)

    Reinhardt, Michael Henry

    An electric vehicle heating system and a solar thermal coffee dryer are presented as case studies in alternative energy system design optimization. Design optimization tools are compared using these case studies, including linear programming, integer programming, and fuzzy integer programming. Although most decision variables in the designs of alternative energy systems are generally discrete (e.g., numbers of photovoltaic modules, thermal panels, layers of glazing in windows), the literature shows that the optimization methods used historically for design utilize continuous decision variables. Integer programming, used to find the optimal investment in conservation measures as a function of life cycle cost of an electric vehicle heating system, is compared to linear programming, demonstrating the importance of accounting for the discrete nature of design variables. The electric vehicle study shows that conservation methods similar to those used in building design, that reduce the overall UA of a 22 ft. electric shuttle bus from 488 to 202 (Btu/hr-F), can eliminate the need for fossil fuel heating systems when operating in the northeast United States. Fuzzy integer programming is presented as a means of accounting for imprecise design constraints such as being environmentally friendly in the optimization process. The solar thermal coffee dryer study focuses on a deep-bed design using unglazed thermal collectors (UTC). Experimental data from parchment coffee drying are gathered, including drying constants and equilibrium moisture. In this case, fuzzy linear programming is presented as a means of optimizing experimental procedures to produce the most information under imprecise constraints. Graphical optimization is used to show that for every 1 m² deep-bed dryer, of 0.4 m depth, a UTC array consisting of five 1.1 m² panels and a photovoltaic array consisting of one 0.25 m² panel produces the most dry coffee per dollar invested in the system.
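
    The discrete conservation-measure selection described above is a small 0/1 integer program, which at this size can be solved by enumeration: choose the combination of measures that brings the overall UA under a target at minimum cost. The measures, costs, UA reductions and target below are invented; a real study would use an IP solver.

```python
import itertools

measures = [  # (name, cost $, UA reduction Btu/hr-F) -- all hypothetical
    ("wall insulation", 900, 120),
    ("double glazing", 1400, 100),
    ("door seals", 200, 40),
    ("floor insulation", 700, 60),
]
UA_BASE, UA_TARGET = 488, 250  # base UA from the abstract; target invented

best = None
for choice in itertools.product([0, 1], repeat=len(measures)):
    ua = UA_BASE - sum(c * m[2] for c, m in zip(choice, measures))
    cost = sum(c * m[1] for c, m in zip(choice, measures))
    if ua <= UA_TARGET and (best is None or cost < best[0]):
        best = (cost, choice, ua)
print(best)  # cheapest feasible (cost, 0/1 choices, resulting UA)
```

    Rounding a continuous (linear-programming) solution would not necessarily land on the cheapest feasible 0/1 combination, which is the point the case study makes.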

  1. Epidemiological Surveillance and Typing Methods to Track Antibiotic Resistant Strains Using High Throughput Sequencing.

    PubMed

    Machado, Miguel Paulo; Ribeiro-Gonçalves, Bruno; Silva, Mickael; Ramirez, Mário; Carriço, João André

    2017-01-01

    High-Throughput Sequencing (HTS) technologies transformed the microbial typing and molecular epidemiology field by providing the cost-effective ability for researchers to probe draft genomes, not only for epidemiological markers but also for antibiotic resistance and virulence determinants. In this chapter, we provide protocols for the analysis of HTS data for the determination of multilocus sequence typing (MLST) information and for determining presence or absence of antibiotic resistance genes.
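
    The MLST step described above reduces, once allele numbers have been called from the assembly, to looking up the allele profile in a table of known sequence types (STs). The loci follow the S. aureus scheme; the tiny two-entry profile table below is only illustrative of the lookup, not the real PubMLST database.

```python
LOCI = ["arcC", "aroE", "glpF", "gmk", "pta", "tpi", "yqiL"]

PROFILES = {  # allele-number tuple -> sequence type (illustrative subset)
    (1, 4, 1, 4, 12, 1, 10): "ST5",
    (3, 3, 1, 1, 4, 4, 3): "ST8",
}

def sequence_type(allele_calls):
    """allele_calls: {locus: allele number} produced by the typing pipeline."""
    key = tuple(allele_calls[locus] for locus in LOCI)
    return PROFILES.get(key, "novel profile")

isolate = {"arcC": 3, "aroE": 3, "glpF": 1, "gmk": 1,
           "pta": 4, "tpi": 4, "yqiL": 3}
print(sequence_type(isolate))  # -> ST8
```

    Profiles absent from the table flag potentially novel STs for submission to the curated database; resistance-gene detection follows the same pattern of matching assembled sequence against a reference catalogue.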

  2. Klebsiella spp. as Nosocomial Pathogens: Epidemiology, Taxonomy, Typing Methods, and Pathogenicity Factors

    PubMed Central

    Podschun, R.; Ullmann, U.

    1998-01-01

    Bacteria belonging to the genus Klebsiella frequently cause human nosocomial infections. In particular, the medically most important Klebsiella species, Klebsiella pneumoniae, accounts for a significant proportion of hospital-acquired urinary tract infections, pneumonia, septicemias, and soft tissue infections. The principal pathogenic reservoirs for transmission of Klebsiella are the gastrointestinal tract and the hands of hospital personnel. Because of their ability to spread rapidly in the hospital environment, these bacteria tend to cause nosocomial outbreaks. Hospital outbreaks of multidrug-resistant Klebsiella spp., especially those in neonatal wards, are often caused by new types of strains, the so-called extended-spectrum-β-lactamase (ESBL) producers. The incidence of ESBL-producing strains among clinical Klebsiella isolates has been steadily increasing over the past years. The resulting limitations on the therapeutic options demand new measures for the management of Klebsiella hospital infections. While the different typing methods are useful epidemiological tools for infection control, recent findings about Klebsiella virulence factors have provided new insights into the pathogenic strategies of these bacteria. Klebsiella pathogenicity factors such as capsules or lipopolysaccharides are presently considered to be promising candidates for vaccination efforts that may serve as immunological infection control measures. PMID:9767057

  3. Genetic diversity of Bacillus anthracis in Europe: genotyping methods in forensic and epidemiologic investigations.

    PubMed

    Derzelle, Sylviane; Thierry, Simon

    2013-09-01

    Bacillus anthracis, the etiological agent of anthrax, a zoonosis relatively common throughout the world, can be used as an agent of bioterrorism. In naturally occurring outbreaks and in criminal release of this pathogen, a fast and accurate diagnosis is crucial to an effective response. Microbiological forensics and epidemiologic investigations increasingly rely on molecular markers, such as polymorphisms in DNA sequence, to obtain reliable information regarding the identification or source of a suspicious strain. Over the past decade, significant research efforts have been undertaken to develop genotyping methods with increased power to differentiate B. anthracis strains. A growing number of DNA signatures have been identified and used to survey B. anthracis diversity in nature, leading to rapid advances in our understanding of the global population of this pathogen. This article provides an overview of the different phylogenetic subgroups distributed across the world, with a particular focus on Europe. Updated information on the anthrax situation in Europe is reported. A brief description of some of the work in progress in the work package 5.1 of the AniBioThreat project is also presented, including (1) the development of a robust typing tool based on a suspension array technology and multiplexed single nucleotide polymorphisms scoring and (2) the typing of a collection of DNA from European isolates exchanged between the partners of the project. The know-how acquired will contribute to improving the EU's ability to react rapidly when the identity and real origin of a strain need to be established.

  4. Waterflooding injectate design systems and methods

    DOEpatents

    Brady, Patrick V.; Krumhansl, James L.

    2014-08-19

    A method of designing an injectate to be used in a waterflooding operation is disclosed. One aspect includes specifying data representative of chemical characteristics of a liquid hydrocarbon, a connate, and a reservoir rock, of a subterranean reservoir. Charged species at an interface of the liquid hydrocarbon are determined based on the specified data by evaluating at least one chemical reaction. Charged species at an interface of the reservoir rock are determined based on the specified data by evaluating at least one chemical reaction. An extent of surface complexation between the charged species at the interfaces of the liquid hydrocarbon and the reservoir rock is determined by evaluating at least one surface complexation reaction. The injectate is designed and is operable to decrease the extent of surface complexation between the charged species at interfaces of the liquid hydrocarbon and the reservoir rock. Other methods, apparatus, and systems are disclosed.

  5. Design and implementation of security in a data collection system for epidemiology.

    PubMed

    Ainsworth, John; Harper, Robert; Juma, Ismael; Buchan, Iain

    2006-01-01

    Health informatics can benefit greatly from the e-Science approach, which is characterised by large-scale distributed resource sharing and collaboration. Ensuring the privacy and confidentiality of data has always been the first requirement of health informatics systems. The PsyGrid data collection system addresses both, providing secure distributed data collection for epidemiology. We have used Grid-computing approaches and technologies to address this problem. We describe the architecture and implementation of the security sub-system in detail.

  6. [An evaluation of sampling design for estimating an epidemiologic volume of diabetes and for assessing present status of its control in Korea].

    PubMed

    Lee, Ji-Sung; Kim, Jaiyong; Baik, Sei-Hyun; Park, Ie-Byung; Lee, Juneyoung

    2009-03-01

    An appropriate sampling strategy for estimating the epidemiologic volume of diabetes was evaluated through a simulation. We analyzed about 250 million medical insurance claims submitted to the Health Insurance Review & Assessment Service with diabetes as a principal or subsequent diagnosis, at least once in 2003. The database was re-constructed into a 'patient-hospital profile' with 3,676,164 cases, and then into a 'patient profile' consisting of 2,412,082 observations. The patient profile data were then used to test the validity of a proposed sampling frame and sampling methods for developing diabetes-related epidemiologic indices. The simulation study showed that a stratified two-stage cluster sampling design with a total sample size of 4,000 will provide an estimate of 57.04% (95% prediction range, 49.83 - 64.24%) for the treatment prescription rate of diabetes. The proposed sampling design consists of first stratifying the nation by area ("metropolitan/city/county") and by type of hospital ("tertiary/secondary/primary/clinic") in a proportion of 5:10:10:75. Hospitals were then randomly selected within the strata as the primary sampling unit, followed by a random selection of patients within the hospitals as the secondary sampling unit. The difference between the estimate and the parameter value was projected to be less than 0.3%. The proposed sampling scheme will be applied to a subsequent nationwide field survey, not only for estimating the epidemiologic volume of diabetes but also for assessing the present status of nationwide diabetes control.
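    The stratified two-stage design described above can be sketched in a small simulation. Everything below (stratum names, hospital and patient counts, and the ~57% rate) is an illustrative stand-in, not the study's data.

```python
import random

# Sketch of a stratified two-stage cluster sample: strata by hospital
# type, hospitals as primary sampling units (PSUs), patients within
# hospitals as secondary sampling units. All numbers are simulated.

random.seed(42)

def two_stage_estimate(strata, hospitals_per_stratum, patients_per_hospital):
    """Estimate a treatment prescription rate from a stratified
    two-stage cluster sample. `strata` maps stratum name -> list of
    hospitals; each hospital is a list of 0/1 patient outcomes."""
    sampled = []
    for hospitals in strata.values():
        # Stage 1: randomly select hospitals (PSUs) within each stratum.
        for patients in random.sample(hospitals, hospitals_per_stratum):
            # Stage 2: randomly select patients within each hospital.
            k = min(patients_per_hospital, len(patients))
            sampled.extend(random.sample(patients, k))
    return sum(sampled) / len(sampled)

# Simulated population: four hospital-type strata, 30 hospitals each,
# 200 patients per hospital, true prescription rate ~57%.
strata = {
    kind: [[1 if random.random() < 0.57 else 0 for _ in range(200)]
           for _ in range(30)]
    for kind in ("tertiary", "secondary", "primary", "clinic")
}
rate = two_stage_estimate(strata, hospitals_per_stratum=5, patients_per_hospital=40)
print(round(rate, 3))  # close to 0.57, within sampling error
```

    A real survey would additionally weight each sampled patient by the inverse of the selection probabilities from both stages; the sketch assumes equal weights for simplicity.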

  7. Phene Plate (PhP) biochemical fingerprinting. A screening method for epidemiological typing of enterococcal isolates.

    PubMed

    Saeedi, B; Tärnberg, M; Gill, H; Hällgren, A; Jonasson, J; Nilsson, L E; Isaksson, B; Kühn, I; Hanberger, H

    2005-09-01

    Pulsed-field gel electrophoresis (PFGE) is currently considered the gold standard for genotyping of enterococci. However, PFGE is both expensive and time-consuming. The purpose of this study was to investigate whether the PhP system can be used as a reliable clinical screening method for detection of genetically related isolates of enterococci. If so, it should be possible to minimize the number of isolates subjected to PFGE typing, which would save time and money. Ninety-nine clinical enterococcal isolates were analysed by PhP (similarity levels 0.90-0.975) and PFGE (similarity levels ≤3 and ≤6 bands) and all possible pairs of isolates were cross-classified as matched or mismatched. We found that the probability that a pair of isolates (A and B) belonging to the same type according to PhP also belong to the same cluster according to PFGE, i.e. p(A_PFGE = B_PFGE | A_PhP = B_PhP), and the probability that a pair of isolates of different types according to PhP also belong to different clusters according to PFGE, i.e. p(A_PFGE ≠ B_PFGE | A_PhP ≠ B_PhP), were relatively high for E. faecalis (0.86 and 0.96, respectively), but lower for E. faecium (0.51 and 0.77, respectively). The concordance, i.e. the probability that PhP and PFGE agree on match or mismatch, was 86%-93% for E. faecalis and 54%-66% for E. faecium, which indicates that the PhP method may be useful for epidemiological typing of E. faecalis in the current settings but not for E. faecium.
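    The pairwise cross-classification behind these concordance figures can be sketched as follows; the isolates and type labels are invented for illustration, not taken from the study.

```python
from itertools import combinations

# Cross-classify every pair of isolates as matched/mismatched under two
# typing methods and compute the concordance: the fraction of pairs on
# which the methods agree (both call a match, or both call a mismatch).

def concordance(php_types, pfge_types):
    """Fraction of isolate pairs on which PhP and PFGE agree."""
    agree = total = 0
    for i, j in combinations(range(len(php_types)), 2):
        php_match = php_types[i] == php_types[j]
        pfge_match = pfge_types[i] == pfge_types[j]
        agree += php_match == pfge_match
        total += 1
    return agree / total

# Toy example: six isolates typed by both methods.
php  = ["A", "A", "B", "B", "C", "C"]
pfge = ["x", "x", "y", "z", "w", "w"]
print(round(concordance(php, pfge), 2))  # 14 of 15 pairs agree -> 0.93
```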

  8. Environmental epidemiology

    SciTech Connect

    Kopfler, F.C.; Craun, G.F.

    1986-01-01

    This volume is a compendium of peer-reviewed papers presented at the Symposium on Exposure Measurement and Evaluation Methods for Epidemiology, cosponsored in 1985 by the Health Effects Research Laboratory, USEPA, and the Division of Environmental Chemistry of the American Chemical Society. The book is divided into four sections: Use of Biological Monitoring to Assess Exposure, Epidemiologic Considerations for Assessing Exposure, Health and Exposure Data Bases, and Assessment of Exposure to Environmental Contaminants for Epidemiologic Studies. Both background papers and detailed reports of human studies are presented. The Biological Monitoring section contains reports of efforts to quantify adducts in blood and urine samples. In the section on Epidemiologic Considerations the feasibility of conducting epidemiologic studies of persons residing near hazardous waste sites and those exposed to arsenic in drinking water is described. The review of Data Bases includes government and industry water quality monitoring systems, the FDA Market Basket Study, major EPA air monitoring data, the National Database on Body Burden of Toxic Chemicals, and the National Human Adipose Tissue Survey. Methods of assessing current exposure and estimating past exposure are detailed in the final section. Exposure to trichloroethylene in shower water, the relationship between water quality and cardiovascular disease, the contribution of environmental lead exposures to pediatric blood lead levels, and data from the TEAM study, in which researchers compared indoor, outdoor, and breath analysis of air pollutant exposures, are also discussed.

  9. Two method measurement for adolescent obesity epidemiology: Reducing the bias in self report of height and weight

    PubMed Central

    Drake, Keith M.; Longacre, Meghan R.; Dalton, Madeline A.; Langeloh, Gail; Peterson, Karen E.; Titus, Linda J.; Beach, Michael L.

    2013-01-01

    Background Despite validation studies demonstrating substantial bias, epidemiologic studies typically use self-reported height and weight as primary measures of body mass index due to feasibility and resource limitations. Purpose To demonstrate a method for calculating accurate and precise estimates that use body mass index when objectively measuring height and weight in a full sample is not feasible. Methods As part of a longitudinal study of adolescent health, 1,840 adolescents (aged 12–18) self-reported their height and weight during telephone surveys. Height and weight were measured for 407 of these adolescents. Sex-specific, age-adjusted obesity status was calculated from self-reported and from measured height and weight. Prevalence and predictors of obesity were estimated using 1) self-reported data, 2) measured data, and 3) multiple imputation (of measured data). Results Among adolescents with self-reported and measured data, the obesity prevalence was lower when using self-report compared to actual measurements (p < 0.001). The obesity prevalence from multiple imputation (20%) was much closer to estimates based solely on measured data (20%) compared to estimates based solely on self-reported data (12%), indicating improved accuracy. In multivariate models, estimates of predictors of obesity were more accurate and approximately as precise (similar confidence intervals) as estimates based solely on self-reported data. Conclusions The two-method measurement design offers researchers a technique to reduce the bias typically inherent in self-reported height and weight without needing to collect measurements on the full sample. This technique enhances the ability to detect real, statistically significant differences, while minimizing the need for additional resources. PMID:23684216
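    A minimal sketch of the two-method idea: everyone self-reports, a subsample is also measured, and measured status is multiply imputed for the rest. The imputation model here (a simple conditional draw given the self-report, rather than the study's full multiple-imputation model) and every number are illustrative assumptions.

```python
import random

# Simulated two-method measurement design: self-report for all, a
# measured subsample, and multiple imputation of measured obesity
# status for the unmeasured remainder. All data are invented.

random.seed(1)
N, n_measured, M = 2000, 400, 20

# Simulate truth (~20% obese) and self-report that misses ~40% of the obese.
truth = [1 if random.random() < 0.20 else 0 for _ in range(N)]
self_report = [t if t == 0 or random.random() < 0.60 else 0 for t in truth]
measured = truth[:n_measured]  # the measured subsample

def cond_rate(sr_value):
    """P(measured obese | self-report value), estimated in the subsample."""
    obese = [m for m, s in zip(measured, self_report) if s == sr_value]
    return sum(obese) / len(obese)

p_given = {0: cond_rate(0), 1: cond_rate(1)}

# Multiple imputation: build M completed datasets, pool the estimates.
estimates = []
for _ in range(M):
    completed = measured + [1 if random.random() < p_given[s] else 0
                            for s in self_report[n_measured:]]
    estimates.append(sum(completed) / N)
mi_prevalence = sum(estimates) / M

# The pooled MI estimate recovers the ~20% truth; raw self-report sits lower.
print(round(sum(self_report) / N, 3), round(mi_prevalence, 3))
```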

  10. Evolutionary optimization methods for accelerator design

    NASA Astrophysics Data System (ADS)

    Poklonskiy, Alexey A.

    Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features such as ease of implementation, modest requirements on the objective function, a good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe in detail the GATool, the evolutionary algorithm and software package used in this work. Then we use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package for the COSY Infinity scientific computing package. We design the model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and the methods commonly used to overcome them. We describe REPA, a new constrained optimization method based on repairing, in detail, including the properties of its two repairing techniques: REFIND and REPROPT. We assess REPROPT's performance on the standard constrained

  11. Quality by design compliant analytical method validation.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-01-03

    The concept of quality by design (QbD) has recently been adopted for the development of pharmaceutical processes to ensure a predefined product quality. Focus on applying the QbD concept to analytical methods has increased as it is fully integrated within pharmaceutical processes and especially in the process control strategy. In addition, there is the need to switch from the traditional checklist implementation of method validation requirements to a method validation approach that should provide a high level of assurance of method reliability in order to adequately measure the critical quality attributes (CQAs) of the drug product. The intended purpose of analytical methods is directly related to the final decision that will be made with the results generated by these methods under study. The final aim for quantitative impurity assays is to correctly declare a substance or a product as compliant with respect to the corresponding product specifications. For content assays, the aim is similar: making the correct decision about product compliance with respect to their specification limits. It is for these reasons that the fitness of these methods should be defined, as they are key elements of the analytical target profile (ATP). Therefore, validation criteria, corresponding acceptance limits, and method validation decision approaches should be settled in accordance with the final use of these analytical procedures. This work proposes a general methodology to achieve this in order to align method validation within the QbD framework and philosophy. β-Expectation tolerance intervals are implemented to decide about the validity of analytical methods. The proposed methodology is also applied to the validation of analytical procedures dedicated to the quantification of impurities or active product ingredients (API) in drug substances or drug products, and its applicability is illustrated with two case studies.
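    A β-expectation tolerance interval is, in its simplest form, a prediction interval for a single future result: the method is judged valid at a given level if the interval falls within the acceptance limits. The sketch below uses invented recovery data, ±5% acceptance limits, and a hard-coded Student-t quantile; it is an illustration of the interval, not the cited validation methodology.

```python
import math
import statistics

# Beta-expectation tolerance interval = prediction interval for one
# future result: mean +/- t * s * sqrt(1 + 1/n). The method is declared
# valid if the interval lies within the acceptance limits.

def beta_expectation_interval(results, t_quantile):
    """Interval expected to contain a proportion beta of future single
    results. Pass the Student-t quantile for the chosen beta and
    n-1 degrees of freedom."""
    n = len(results)
    m = statistics.mean(results)
    s = statistics.stdev(results)
    half = t_quantile * s * math.sqrt(1 + 1 / n)
    return m - half, m + half

# Invented recovery results (%) at one concentration level;
# two-sided t for beta = 95% and df = 7 is about 2.365.
recoveries = [99.2, 100.5, 99.8, 100.9, 100.1, 99.6, 100.3, 99.9]
lo, hi = beta_expectation_interval(recoveries, t_quantile=2.365)
acceptable = 95.0 < lo and hi < 105.0  # within +/-5% acceptance limits
print(round(lo, 1), round(hi, 1), acceptable)
```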

  12. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools

    PubMed Central

    2014-01-01

    Background The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Methods Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular in respect to its intended use, types of questions and answers, presence/absence of a quality score, and if a validation was performed. Results In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. Conclusions The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision

  13. Age-Based Methods to Explore Time-Related Variables in Occupational Epidemiology Studies

    SciTech Connect

    Janice P. Watkins, Edward L. Frome, Donna L. Cragle

    2005-08-31

    Although age is recognized as the strongest predictor of mortality in chronic disease epidemiology, a calendar-based approach is often employed when evaluating time-related variables. An age-based analysis file, created by determining the value of each time-dependent variable for each age that a cohort member is followed, provides a clear definition of age at exposure and allows development of diverse analytic models. To demonstrate methods, the relationship between cancer mortality and external radiation was analyzed with Poisson regression for 14,095 Oak Ridge National Laboratory workers. Based on previous analysis of this cohort, a model with ten-year lagged cumulative radiation doses partitioned by receipt before (dose-young) or after (dose-old) age 45 was examined. Dose-response estimates were similar to calendar-year-based results with elevated risk for dose-old, but not when film badge readings were weekly before 1957. Complementary results showed increasing risk with older hire ages and earlier birth cohorts, since workers hired after age 45 were born before 1915, and dose-young and dose-old were distributed differently by birth cohorts. Risks were generally higher for smoking-related than non-smoking-related cancers. It was difficult to single out specific variables associated with elevated cancer mortality because of: (1) birth cohort differences in hire age and mortality experience completeness, and (2) time-period differences in working conditions, dose potential, and exposure assessment. This research demonstrated the utility and versatility of the age-based approach.
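    The age-based analysis file described above can be sketched as one record per attained age per worker, with the lagged, age-partitioned cumulative dose evaluated at each age. The worker, doses, and function names below are hypothetical, not the cohort's data.

```python
# One record per attained age, with ten-year-lagged cumulative dose
# partitioned into dose received before (dose_young) or after
# (dose_old) age 45. Illustrative sketch only.

def age_based_records(hire_age, exit_age, annual_dose, lag=10, split_age=45):
    """Yield (age, lagged_dose_young, lagged_dose_old) rows for one
    worker. `annual_dose` maps attained age -> dose received then."""
    rows = []
    for age in range(hire_age, exit_age + 1):
        dose_young = sum(d for a, d in annual_dose.items()
                         if a <= age - lag and a < split_age)
        dose_old = sum(d for a, d in annual_dose.items()
                       if a <= age - lag and a >= split_age)
        rows.append((age, dose_young, dose_old))
    return rows

# Hypothetical worker: hired at 40, followed to 60, 2 mSv/year to age 55.
doses = {a: 2.0 for a in range(40, 56)}
rows = age_based_records(40, 60, doses)
print(rows[0])   # at age 40, nothing falls inside the 10-year lag yet
print(rows[-1])  # at age 60, lagged dose is split at age 45
```

    Such rows would then be collapsed into person-year cells for the Poisson regression; the sketch stops at the analysis-file construction step.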

  14. Methods for structural design at elevated temperatures

    NASA Technical Reports Server (NTRS)

    Ellison, A. M.; Jones, W. E., Jr.; Leimbach, K. R.

    1973-01-01

    A procedure which can be used to design elevated temperature structures is discussed. The desired goal is to have the same confidence in the structural integrity at elevated temperature as the factor of safety gives on mechanical loads at room temperature. Methods of design and analysis for creep, creep rupture, and creep buckling are presented. Example problems are included to illustrate the analytical methods. Creep data for some common structural materials are presented. Appendix B is description, user's manual, and listing for the creep analysis program. The program predicts time to a given creep or to creep rupture for a material subjected to a specified stress-temperature-time spectrum. Fatigue at elevated temperature is discussed. Methods of analysis for high stress-low cycle fatigue, fatigue below the creep range, and fatigue in the creep range are included. The interaction of thermal fatigue and mechanical loads is considered, and a detailed approach to fatigue analysis is given for structures operating below the creep range.

  15. Two-method measurement for adolescent obesity epidemiology: reducing the bias in self-report of height and weight.

    PubMed

    Drake, Keith M; Longacre, Meghan R; Dalton, Madeline A; Langeloh, Gail; Peterson, Karen E; Titus, Linda J; Beach, Michael L

    2013-09-01

    Despite validation studies demonstrating substantial bias, epidemiologic studies typically use self-reported height and weight as primary measures of body mass index because of feasibility and resource limitations. To demonstrate a method for calculating accurate and precise estimates that use body mass index when objectively measuring height and weight in a full sample is not feasible. As part of a longitudinal study of adolescent health, 1,840 adolescents (ages 12-18) self-reported their height and weight during telephone surveys. Height and weight were measured for 407 of these adolescents. Sex-specific, age-adjusted obesity status was calculated from self-reported and from measured height and weight. Prevalence and predictors of obesity were estimated using self-reported data, measured data, and multiple imputation (of measured data). Among adolescents with self-reported and measured data, the obesity prevalence was lower when using self-report compared with actual measurements (p < .001). The obesity prevalence from multiple imputation (20%) was much closer to estimates based solely on measured data (20%) compared with estimates based solely on self-reported data (12%), indicating improved accuracy. In multivariate models, estimates of predictors of obesity were more accurate and approximately as precise (similar confidence intervals) as estimates based solely on self-reported data. The two-method measurement design offers researchers a technique to reduce the bias typically inherent in self-reported height and weight without needing to collect measurements on the full sample. This technique enhances the ability to detect real, statistically significant differences, while minimizing the need for additional resources. Copyright © 2013 Society for Adolescent Health and Medicine. All rights reserved.

  16. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  17. Block designs in method transfer experiments.

    PubMed

    Altan, Stan; Shoung, Jyh-Ming

    2008-01-01

    Method transfer is a part of the pharmaceutical development process in which an analytical (chemical) procedure developed in one laboratory (typically the research laboratory) is about to be adopted by one or more recipient laboratories (production or commercial operations). The objective is to show that the recipient laboratory is capable of performing the procedure in an acceptable manner. In the course of carrying out a method transfer, other questions may arise related to fixed or random factors of interest, such as analyst, apparatus, batch, supplier of analytical reagents, and so forth. Estimates of reproducibility and repeatability may also be of interest. This article focuses on the application of various block designs that have been found useful in the comprehensive study of method transfer beyond the laboratory effect alone. An equivalence approach to the comparison of laboratories can still be carried out on either the least squares means or subject-specific means of the laboratories to justify a method transfer or to compare analytical methods.
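    The equivalence approach mentioned above can be sketched as a two one-sided tests (TOST) style check on the difference between laboratory means. The data, the equivalence margin, and the normal-theory shortcut below are assumptions for illustration; a real transfer would justify the margin and use the study's block design.

```python
import math
import statistics

# Equivalence check between a sending and a receiving laboratory:
# declare the labs equivalent if the (1 - 2*alpha) confidence interval
# for the mean difference lies entirely within (-margin, +margin).

def tost_equivalent(x, y, margin, alpha=0.05):
    """Normal-theory TOST-style check with pooled SD (a t quantile
    would be slightly stricter); for illustration only."""
    nx, ny = len(x), len(y)
    diff = statistics.mean(x) - statistics.mean(y)
    pooled_var = ((nx - 1) * statistics.variance(x)
                  + (ny - 1) * statistics.variance(y)) / (nx + ny - 2)
    se = math.sqrt(pooled_var * (1 / nx + 1 / ny))
    z = 1.6449  # ~95th percentile of the standard normal (alpha = 0.05)
    lo, hi = diff - z * se, diff + z * se
    return -margin < lo and hi < margin

# Invented assay results (% label claim) from the two laboratories.
sending   = [99.8, 100.1, 100.3, 99.9, 100.0, 100.2]
receiving = [100.0, 100.4, 100.1, 99.7, 100.2, 100.3]
print(tost_equivalent(sending, receiving, margin=1.0))
```

    The design choice here mirrors the article's point: the comparison is run on laboratory means (least squares or subject-specific) against a pre-specified margin, rather than as a significance test of "no difference".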

  18. Methods and Technologies Branch (MTB)

    Cancer.gov

    The Methods and Technologies Branch focuses on methods to address epidemiologic data collection, study design and analysis, and to modify technological approaches to better understand cancer susceptibility.

  19. Computational and design methods for advanced imaging

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.

    This dissertation merges the optical design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science, and attempts to expand the solution space available to the optical designer. This dissertation is divided into two parts: the first discusses a new active illumination depth sensing modality, while the second part discusses a passive illumination system called plenoptic, or lightfield, imaging. The new depth sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage this method permits is the ability for the illumination and imaging axes to be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements are presented, including full system raytraces of raw plenoptic images, Zernike compression techniques of raw image files, and non-uniform lenslet arrays to compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer is proposed based on the plenoptic configuration.

  20. The "Pathological Gambling and Epidemiology" (PAGE) study program: design and fieldwork.

    PubMed

    Meyer, Christian; Bischof, Anja; Westram, Anja; Jeske, Christine; de Brito, Susanna; Glorius, Sonja; Schön, Daniela; Porz, Sarah; Gürtler, Diana; Kastirke, Nadin; Hayer, Tobias; Jacobi, Frank; Lucht, Michael; Premper, Volker; Gilberg, Reiner; Hess, Doris; Bischof, Gallus; John, Ulrich; Rumpf, Hans-Jürgen

    2015-03-01

    The German federal states initiated the "Pathological Gambling and Epidemiology" (PAGE) program to evaluate the public health relevance of pathological gambling. The aim of PAGE was to estimate the prevalence of pathological gambling and cover the heterogeneous presentation in the population with respect to comorbid substance use and mental disorders, risk and protective factors, course aspects, treatment utilization, triggering and maintenance factors of remission, and biological markers. This paper describes the methodological details of the study and reports basic prevalence data. Two sampling frames (landline and mobile telephone numbers) were used to generate a random sample from the general population consisting of 15,023 individuals (ages 14 to 64) completing a telephone interview. Additionally, high-risk populations have been approached in gambling locations, via media announcements, outpatient addiction services, debt counselors, probation assistants, self-help groups and specialized inpatient treatment facilities. The assessment included two steps: (1) a diagnostic interview comprising the gambling section of the Composite International Diagnostic Interview (CIDI) for case finding; (2) an in-depth clinical interview with participants reporting gambling problems. The in-depth clinical interview was completed by 594 participants, who were recruited from the general or high-risk populations. The program provides a rich epidemiological database which is available as a scientific use file. Copyright © 2015 John Wiley & Sons, Ltd.

  1. Development and Evaluation for Active Learning Instructional Design of Epidemiology in Nursing Informatics Field.

    PubMed

    Majima, Yukie

    2016-01-01

    Nursing education classes are classifiable into three types: lectures, classroom practice, and clinical practice. In this study, we implemented a class that incorporated elements of active learning, including clickers, minute papers, quizzes, and group work and presentations, in the subject of "epidemiology", which is often positioned in the field of nursing informatics and is usually taught in conventional knowledge-transmission style lectures, to help students understand the material and achieve seven class goals. Results revealed that the average scores for class achievement (five-level evaluation) were 3.6-3.9, which was good overall. The highest average score in the students' evaluation of teaching materials (five-level evaluation) was 4.6 for quizzes, followed by 4.2 for the announcement of test statistics, 4.1 for clickers, and 4.0 for the presentation of news related to epidemiology. We regard these as useful tools for increasing students' motivation. One problem with the class was the time it took to organize: creating tests, preparing and marking class materials (such as items to be returned and the distribution of clickers), and writing comments on minute papers.

  2. Using field-based epidemiological methods to investigate FMD outbreaks: an example from the 2002 outbreak in Korea.

    PubMed

    Wee, S-H; Nam, H-M; Moon, O-K; Yoon, H; Park, J-Y; More, S J

    2008-12-01

    Relevant to foot and mouth disease (FMD), most published epidemiological studies have been conducted using quantitative methods and substantial regional or national datasets. Veterinary epidemiology also plays a critical role during outbreak investigations, both to assist with herd-level decision-making and to contribute relevant information to assist with ongoing national or regional control strategies. Despite the importance of this role, however, little information has been published on the use of applied (field-based) epidemiological methods during disease outbreaks. In this study, we outline an investigative template for FMD, and a case study of its use during the 2002 FMD outbreak in Korea. Suitable for use during field-based epidemiological investigations of individual farms within a broader regional/national response, the template considers three steps including confirming infection, estimating date of introduction and determining method of introduction. A case study was conducted on IP13 (the 13th infected premises), the only IP during the 2002 FMD outbreak in Korea that was geographically isolated from all other known cases. The authorities first became aware of FMD on IP13 on 2 June, however, infection may have been present from 12 May. Infection was confirmed on 3 June 2002. FMD was probably spread to IP13 by a contract worker who had participated during 2-4 May in the culling operations on IP1. Other routes of spread were ruled out during the investigation. The contract worker lived in the locality of IP13 and worked on a part-time basis at a pork-processing plant that was adjacent to this farm. The contractor became heavily contaminated during the cull, but did not comply fully with cleaning and disinfection requirements once the cull had been completed. The investigative template contributed structure and focus to the field-based investigation. Results from this case study demonstrate the need for strict management of personnel in disease control and

  3. [Dermato-epidemiology].

    PubMed

    Apfelbacher, C J; Diepgen, T L; Weisshaar, E

    2011-11-01

    Dermato-epidemiology is an important scientific discipline which investigates skin diseases using epidemiological methods. Epidemiology is the science of the distribution and determinants of disease in specified populations. We describe fundamental terms of dermato-epidemiology (measures of disease occurrence, measures of risk), different study types (observational studies, interventional studies), the selection of statistical tests, bias and confounding as well as the principles of evidence-based dermatology, and give illustrative examples.

  4. A structural design decomposition method utilizing substructuring

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.

    1994-01-01

    A new method of design decomposition for structural analysis and optimization is described. For this method, the structure is divided into substructures where each substructure has its structural response described by a structural-response subproblem, and its structural sizing determined from a structural-sizing subproblem. The structural responses of substructures that have rigid body modes when separated from the remainder of the structure are further decomposed into displacements that have no rigid body components, and a set of rigid body modes. The structural-response subproblems are linked together through forces determined within a structural-sizing coordination subproblem which also determines the magnitude of any rigid body displacements. Structural-sizing subproblems having constraints local to the substructures are linked together through penalty terms that are determined by a structural-sizing coordination subproblem. All the substructure structural-response subproblems are totally decoupled from each other, as are all the substructure structural-sizing subproblems, thus there is significant potential for use of parallel solution methods for these subproblems.

  5. Comparison of epidemiological marker methods for identification of Salmonella typhimurium isolates from an outbreak caused by contaminated chocolate.

    PubMed Central

    Kapperud, G; Lassen, J; Dommarsnes, K; Kristiansen, B E; Caugant, D A; Ask, E; Jahkola, M

    1989-01-01

    Plasmid profile analysis, restriction endonuclease analysis, and multilocus enzyme electrophoresis were used in conjunction with serotyping, bacteriophage typing, and biochemical fingerprinting to trace epidemiologically related isolates of Salmonella typhimurium from an outbreak caused by contaminated chocolate products in Norway and Finland. To evaluate the efficiency of the epidemiological marker methods, isolates from the outbreak were compared with five groups of control isolates not known to be associated with the outbreak. Both plasmid profile analysis and phage typing provided further discrimination over that produced by serotyping and biochemical fingerprinting. Plasmid profile analysis and phage typing were equally reliable in differentiating the outbreak isolates from the epidemiologically unrelated controls and were significantly more effective than multilocus enzyme electrophoresis and restriction enzyme analysis of total DNA. The greatest differentiation was achieved when plasmid profile analysis and phage typing were combined to complement serotyping and biochemical fingerprinting. However, none of the methods employed, including restriction enzyme analysis of plasmid DNA, was able to distinguish the outbreak isolates from five isolates recovered in Norway and Finland over a period of years from dead passerine birds and a calf. PMID:2674198

  6. A genotypic method for determining HIV-2 coreceptor usage enables epidemiological studies and clinical decision support.

    PubMed

    Döring, Matthias; Borrego, Pedro; Büch, Joachim; Martins, Andreia; Friedrich, Georg; Camacho, Ricardo Jorge; Eberle, Josef; Kaiser, Rolf; Lengauer, Thomas; Taveira, Nuno; Pfeifer, Nico

    2016-12-20

    We developed geno2pheno[coreceptor-hiv2], a method that predicts HIV-2 coreceptor usage from the V3 loop. Using our method, we identified novel amino-acid markers of X4-capable variants in the V3 loop and found that HIV-2 coreceptor usage is also influenced by the V1/V2 region. The tool can aid clinicians in deciding whether coreceptor antagonists such as maraviroc are a treatment option and enables epidemiological studies investigating HIV-2 coreceptor usage. geno2pheno[coreceptor-hiv2] is freely available at http://coreceptor-hiv2.geno2pheno.org .

  7. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)

    PubMed Central

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal

    2016-01-01

    Objective: This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods was found in the analysis. The most frequent methods were descriptive statistics (n=315, 73.4%), chi-square/Fisher’s exact tests (n=205, 47.8%) and Student’s t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher’s exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and their frequency increased from 2005 to 2015. However, descriptive statistics remained the most frequent method of statistical analysis and the cross-sectional design the most common study design. PMID:27022365
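    The percentages reported above follow directly from the stated counts and the 429-article total; a minimal sketch that reproduces them (the dictionary labels are shortened method names, not the paper's wording):

```python
# Reproduce the frequencies reported in the PJMS review: each percentage
# equals the method's count divided by the 429 evaluated articles.
counts = {
    "cross-sectional design": 171,
    "prospective design": 116,
    "descriptive statistics": 315,
    "chi-square/Fisher's exact": 205,
    "Student's t-test": 186,
}
total = 74 + 179 + 176  # articles from the 2005, 2010 and 2015 volumes

for method, n in counts.items():
    print(f"{method}: {n}/{total} = {100 * n / total:.1f}%")
```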

  8. Neural method of spatiotemporal filter design

    NASA Astrophysics Data System (ADS)

    Szostakowski, Jaroslaw

    1997-10-01

    There are many applications in medical imaging, computer vision, and communications where video processing is critical. Although many techniques have been successfully developed for the filtering of still images, significantly fewer techniques have been proposed for the filtering of noisy image sequences. In this paper a novel approach to spatio-temporal filter design is proposed. Multilayer perceptrons and functional-link nets are used for the 3D filtering. The spatio-temporal patterns are created from real motion video images, and the neural networks learn these patterns. Perceptrons with different numbers of layers and neurons in each layer are tested, and different input functions in the functional-link net are explored. Practical examples of the filtering are shown and compared with traditional (non-neural) spatio-temporal methods. The results are very promising, and neural spatio-temporal filters appear to be a very efficient tool for video noise reduction.

  9. Method for designing gas tag compositions

    DOEpatents

    Gross, K.C.

    1995-04-11

    For use in the manufacture of gas tags such as employed in a nuclear reactor gas tagging failure detection system, a method for designing gas tagging compositions utilizes an analytical approach wherein the final composition of a first canister of tag gas as measured by a mass spectrometer is designated as node No. 1. Lattice locations of tag nodes in multi-dimensional space are then used in calculating the compositions of a node No. 2 and each subsequent node so as to maximize the distance of each node from any combination of tag components which might be indistinguishable from another tag composition in a reactor fuel assembly. Alternatively, the measured compositions of tag gas numbers 1 and 2 may be used to fix the locations of nodes 1 and 2, with the locations of nodes 3-N then calculated for optimum tag gas composition. A single sphere defining the lattice locations of the tag nodes may be used to define approximately 20 tag nodes, while concentric spheres can extend the number of tag nodes to several hundred. 5 figures.
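    The node-placement idea, spacing tag compositions so that no two are easily confused, can be sketched with a greedy farthest-point selection over candidate points on a single sphere. This is an illustration, not the patented procedure; the three-component composition space and the candidate grid are assumptions:

```python
import numpy as np

def fibonacci_sphere(n):
    # Evenly spaced candidate compositions on a unit sphere
    # (golden-angle spiral; illustrative composition space only).
    i = np.arange(n)
    phi = np.pi * (3 - np.sqrt(5)) * i
    z = 1 - 2 * (i + 0.5) / n
    r = np.sqrt(1 - z**2)
    return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])

def place_nodes(candidates, k):
    # Greedy farthest-point selection: each new node maximizes its
    # minimum distance to the nodes already chosen, mimicking the goal
    # of keeping tag compositions mutually distinguishable.
    nodes = [candidates[0]]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(
            candidates[:, None, :] - np.array(nodes)[None, :, :], axis=2), axis=1)
        nodes.append(candidates[np.argmax(d)])
    return np.array(nodes)

nodes = place_nodes(fibonacci_sphere(500), 20)  # ~20 nodes per sphere
pairwise = np.linalg.norm(nodes[:, None] - nodes[None, :], axis=2)
print("minimum separation:", pairwise[pairwise > 0].min())
```

    Concentric spheres of different radii would simply repeat the same selection at each radius, extending the number of nodes as the abstract describes.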

  10. Method for designing gas tag compositions

    DOEpatents

    Gross, Kenny C.

    1995-01-01

    For use in the manufacture of gas tags such as employed in a nuclear reactor gas tagging failure detection system, a method for designing gas tagging compositions utilizes an analytical approach wherein the final composition of a first canister of tag gas as measured by a mass spectrometer is designated as node #1. Lattice locations of tag nodes in multi-dimensional space are then used in calculating the compositions of a node #2 and each subsequent node so as to maximize the distance of each node from any combination of tag components which might be indistinguishable from another tag composition in a reactor fuel assembly. Alternatively, the measured compositions of tag gas numbers 1 and 2 may be used to fix the locations of nodes 1 and 2, with the locations of nodes 3-N then calculated for optimum tag gas composition. A single sphere defining the lattice locations of the tag nodes may be used to define approximately 20 tag nodes, while concentric spheres can extend the number of tag nodes to several hundred.

  11. Research and Design of Rootkit Detection Method

    NASA Astrophysics Data System (ADS)

    Liu, Leian; Yin, Zuanxing; Shen, Yuli; Lin, Haitao; Wang, Hongjiang

    Rootkits are among the most important security issues in network communication systems, affecting the security and privacy of Internet users. By exploiting back doors in the operating system, a hacker can use a rootkit to attack and invade other people's computers and thus easily capture passwords and message traffic to and from these computers. With the development of rootkit technology, its applications have become more and more extensive and it is increasingly difficult to detect. In addition, for various reasons such as trade secrets and development difficulty, information on rootkit detection technology and effective tools remains relatively scarce. In this paper, based on an in-depth analysis of rootkit detection technology, a new kind of rootkit detection structure is designed and a new method (software), X-Anti, is proposed. Test results show that software based on the proposed structure is much more efficient than other rootkit detection software.

  12. Geometric methods for optimal sensor design.

    PubMed

    Belabbas, M-A

    2016-01-01

    The Kalman-Bucy filter is the optimal estimator of the state of a linear dynamical system from sensor measurements. Because its performance is limited by the sensors to which it is paired, it is natural to seek optimal sensors. The resulting optimization problem is however non-convex. Therefore, many ad hoc methods have been used over the years to design sensors in fields ranging from engineering to biology to economics. We show in this paper how to obtain optimal sensors for the Kalman filter. Precisely, we provide a structural equation that characterizes optimal sensors. We furthermore provide a gradient algorithm and prove its convergence to the optimal sensor. This optimal sensor yields the lowest possible estimation error for measurements with a fixed signal-to-noise ratio. The results of the paper are proved by reducing the optimal sensor problem to an optimization problem on a Grassmannian manifold and proving that the function to be minimized is a Morse function with a unique minimum. The results presented here also apply to the dual problem of optimal actuator design.
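    The central point, that the sensor fixes the Kalman filter's achievable error, can be illustrated numerically without the paper's Grassmannian machinery: iterate the discrete-time Riccati recursion to an approximate steady state for two candidate sensor matrices and compare the resulting error covariances. The dynamics, noise levels, and candidate sensors below are toy assumptions, not taken from the paper:

```python
import numpy as np

# Rank two candidate sensor matrices C by the steady-state error
# covariance of the discrete-time Kalman filter, obtained by iterating
# the Riccati recursion (predict / gain / update).
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])        # toy position/velocity dynamics
Q = 0.01 * np.eye(2)              # process noise covariance
R = np.array([[0.1]])             # measurement noise (fixed SNR)

def steady_state_error(C, iters=500):
    P = np.eye(2)                 # initial error covariance
    for _ in range(iters):
        P = A @ P @ A.T + Q                               # predict
        K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)      # Kalman gain
        P = (np.eye(2) - K @ C) @ P                       # update
    return np.trace(P)

e_pos = steady_state_error(np.array([[1.0, 0.0]]))   # measure position only
e_mix = steady_state_error(np.array([[1.0, 1.0]]))   # measure a mixture
print(e_pos, e_mix)
```

    A gradient search over C, as in the paper, would descend on exactly this kind of steady-state cost.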

  13. Geometric methods for optimal sensor design

    PubMed Central

    Belabbas, M.-A.

    2016-01-01

    The Kalman–Bucy filter is the optimal estimator of the state of a linear dynamical system from sensor measurements. Because its performance is limited by the sensors to which it is paired, it is natural to seek optimal sensors. The resulting optimization problem is however non-convex. Therefore, many ad hoc methods have been used over the years to design sensors in fields ranging from engineering to biology to economics. We show in this paper how to obtain optimal sensors for the Kalman filter. Precisely, we provide a structural equation that characterizes optimal sensors. We furthermore provide a gradient algorithm and prove its convergence to the optimal sensor. This optimal sensor yields the lowest possible estimation error for measurements with a fixed signal-to-noise ratio. The results of the paper are proved by reducing the optimal sensor problem to an optimization problem on a Grassmannian manifold and proving that the function to be minimized is a Morse function with a unique minimum. The results presented here also apply to the dual problem of optimal actuator design. PMID:26997885

  14. [Phenotypic and genotypic methods for epidemiological typing of veterinary important bacterial pathogens of the genera Staphylococcus, Salmonella, and Pasteurella].

    PubMed

    Schwarz, Stefan; Blickwede, Maren; Kehrenberg, Corinna; Michael, Geovana Brenner

    2003-01-01

    Molecular typing methods are capable of providing detailed strain characteristics which are commonly far beyond the capacities of phenotypic typing methods. Such molecular-based characteristics have proved to be very helpful in epidemiological studies of bacterial pathogens. The primary criteria that all typing methods should fulfill include (1) the typeability of the strains in question, (2) the reproducibility of the results, and (3) a high discriminatory power. In general, molecular typing methods can be differentiated with regard to their use in methods that can be applied to virtually all bacteria (e.g. plasmid profiling, ribotyping, macrorestriction analysis) and methods which can only be used for typing of certain bacterial genera or species (e.g. IS200 typing of certain Salmonella enterica subsp. enterica serovars, or coa-PCR of coagulase-positive staphylococci). In the present review, various phenotypic and molecular methods for the epidemiological typing of bacteria of the genera Staphylococcus, Salmonella, and Pasteurella are described and their advantages/disadvantages--also with regard to the fulfillment of the above-mentioned primary criteria--are critically assessed.
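    The discriminatory-power criterion mentioned above is commonly quantified with the Hunter-Gaston index: the probability that two unrelated strains drawn at random from the population are assigned different types. A minimal sketch; the strain typing results are invented for illustration:

```python
from collections import Counter

def discriminatory_index(type_labels):
    # Hunter-Gaston index D = 1 - sum n_j (n_j - 1) / (N (N - 1)),
    # where n_j is the number of strains of type j among N strains.
    n = len(type_labels)
    counts = Counter(type_labels).values()
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical typing of 10 strains by two methods:
method_a = ["T1"] * 7 + ["T2"] * 3                 # resolves two types
method_b = ["T1", "T2", "T3", "T3", "T4",
            "T5", "T5", "T6", "T7", "T8"]          # resolves eight types
print(discriminatory_index(method_a))              # lower discrimination
print(discriminatory_index(method_b))              # higher discrimination
```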

  15. Regression discontinuity designs are underutilized in medicine, epidemiology, and public health: a review of current and best practice.

    PubMed

    Moscoe, Ellen; Bor, Jacob; Bärnighausen, Till

    2015-02-01

    Regression discontinuity (RD) designs allow for rigorous causal inference when patients receive a treatment based on scoring above or below a cutoff point on a continuously measured variable. We provide an introduction to the theory of RD and a systematic review and assessment of the RD literature in medicine, epidemiology, and public health. We review the necessary conditions for valid RD results, provide a practical guide to RD implementation, compare RD to other methodologies, and conduct a systematic review of the RD literature in PubMed. We describe five key elements of analysis all RD studies should report, including tests of validity conditions and robustness checks. Thirty-two empirical RD studies in PubMed met our selection criteria. Most of the 32 RD articles analyzed the effectiveness of social policies or mental health interventions, with only two evaluating clinical interventions to improve physical health. Seven of the 32 studies reported on all five key elements. Increased use of RD provides an exciting opportunity for obtaining unbiased causal effect estimates when experiments are not feasible or when we want to evaluate programs under "real-life" conditions. Although treatment eligibility in medicine, epidemiology, and public health is commonly determined by threshold rules, use of RD in these fields has been very limited until now. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
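    A minimal sketch of the basic RD estimator on simulated data (the outcome model, cutoff, bandwidth, and effect size are invented for illustration): fit separate local linear regressions on each side of the cutoff and take the jump in the fitted outcome at the cutoff as the local treatment effect.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated threshold rule: patients scoring at or above the cutoff
# receive treatment, which shifts the outcome by `effect`.
n, cutoff, effect = 2000, 0.0, 1.5
score = rng.uniform(-1, 1, n)
treated = (score >= cutoff).astype(float)
outcome = 2.0 + 0.8 * score + effect * treated + rng.normal(0, 0.5, n)

def rd_estimate(score, outcome, cutoff, bandwidth=0.5):
    # Local linear fit on each side within the bandwidth; the RD
    # estimate is the difference of intercepts at the cutoff.
    intercepts = {}
    for side, mask in [("below", (score < cutoff) & (score > cutoff - bandwidth)),
                       ("above", (score >= cutoff) & (score < cutoff + bandwidth))]:
        slope, intercept = np.polyfit(score[mask] - cutoff, outcome[mask], 1)
        intercepts[side] = intercept
    return intercepts["above"] - intercepts["below"]

print(f"estimated effect at cutoff: {rd_estimate(score, outcome, cutoff):.2f}")
```

    In a real study, the validity checks the review calls for (continuity of covariates, no manipulation of the score, bandwidth sensitivity) would accompany this estimate.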

  16. Design and methods in a multi-center case-control interview study.

    PubMed Central

    Hartge, P; Cahill, J I; West, D; Hauck, M; Austin, D; Silverman, D; Hoover, R

    1984-01-01

    We conducted a case-control study in ten areas of the United States in which a total of 2,982 bladder cancer patients and 5,782 population controls were interviewed. We employed a variety of existing and new techniques to reduce bias and to monitor the quality of data collected. We review here many of the design elements and field methods that can be generally applied in epidemiologic studies, particularly multi-center interview studies, and explain the reasons for our selection of the methods, instruments, and procedures used. PMID:6689843

  17. Adjoint methods for aerodynamic wing design

    NASA Technical Reports Server (NTRS)

    Grossman, Bernard

    1993-01-01

    A model inverse design problem is used to investigate the effect of flow discontinuities on the optimization process. The optimization involves finding the cross-sectional area distribution of a duct that produces velocities that closely match a targeted velocity distribution. Quasi-one-dimensional flow theory is used, and the target is chosen to have a shock wave in its distribution. The objective function which quantifies the difference between the targeted and calculated velocity distributions may become non-smooth due to the interaction between the shock and the discretization of the flowfield. This paper offers two techniques to resolve the resulting problems for the optimization algorithms. The first, shock-fitting, involves careful integration of the objective function through the shock wave. The second, coordinate straining with shock penalty, uses a coordinate transformation to align the calculated shock with the target and then adds a penalty proportional to the square of the distance between the shocks. The techniques are tested using several popular sensitivity and optimization methods, including finite-differences, and direct and adjoint discrete sensitivity methods. Two optimization strategies, Gauss-Newton and sequential quadratic programming (SQP), are used to drive the objective function to a minimum.
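    The inverse-design loop can be illustrated with a drastically simplified analogue (incompressible quasi-one-dimensional flow with v = Q/A, no shock, and an invented target distribution; all assumptions, not the paper's transonic setup): find the area distribution whose velocities match the target via a damped Gauss-Newton iteration on the least-squares mismatch.

```python
import numpy as np

# Simplified duct inverse design: with v = Q/A the Jacobian dv/dA is
# diagonal, so a damped Gauss-Newton step can be taken component-wise.
Q = 1.0                                   # volumetric flow rate (assumed)
x = np.linspace(0.0, 1.0, 50)
v_target = 1.0 + 0.5 * np.sin(np.pi * x)  # invented target velocities
A = np.ones_like(x)                       # initial area distribution

for _ in range(100):
    r = Q / A - v_target      # residual: computed minus target velocity
    dvdA = -Q / A**2          # analytic sensitivity of velocity to area
    A = A - 0.5 * r / dvdA    # damped (factor 0.5) Gauss-Newton update

print("max velocity mismatch:", np.abs(Q / A - v_target).max())
```

    The paper's shock-fitting and coordinate-straining techniques address exactly what this toy omits: the objective becomes non-smooth once a shock interacts with the discretization.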

  18. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS).

    PubMed

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M; Khan, Ajmal

    2016-01-01

    This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 volumes of Pakistan Journal of Medical Sciences (PJMS). Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods was found in the analysis. The most frequent methods were descriptive statistics (n=315, 73.4%), chi-square/Fisher's exact tests (n=205, 47.8%) and Student's t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher's exact test, logistic regression, epidemiological statistics, and non-parametric tests. This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and their frequency increased from 2005 to 2015. However, descriptive statistics remained the most frequent method of statistical analysis and the cross-sectional design the most common study design.

  19. Educating Instructional Designers: Different Methods for Different Outcomes.

    ERIC Educational Resources Information Center

    Rowland, Gordon; And Others

    1994-01-01

    Suggests new methods of teaching instructional design based on literature reviews of other design fields including engineering, architecture, interior design, media design, and medicine. Methods discussed include public presentations, visiting experts, competitions, artifacts, case studies, design studios, and internships and apprenticeships.…

  20. Epidemiological causality.

    PubMed

    Morabia, Alfredo

    2005-01-01

    Epidemiological methods, which combine population thinking and group comparisons, can primarily identify causes of disease in populations. There is therefore a tension between our intuitive notion of a cause, which we want to be deterministic and invariant at the individual level, and the epidemiological notion of causes, which are invariant only at the population level. Epidemiologists have heretofore given a pragmatic solution to this tension. Causal inference in epidemiology consists in checking the logical coherence of a causality statement and determining whether what has been found grossly contradicts what we think we already know: how strong is the association? Is there a dose-response relationship? Does the cause precede the effect? Is the effect biologically plausible? Etc. This approach to causal inference can be traced back to the English philosophers David Hume and John Stuart Mill. On the other hand, the mode of establishing causality, devised by Jakob Henle and Robert Koch, which has been fruitful in bacteriology, requires that in every instance the effect invariably follows the cause (e.g., inoculation of Koch bacillus and tuberculosis). This is incompatible with epidemiological causality, which has to deal with probabilistic effects (e.g., smoking and lung cancer) and is therefore invariant only for the population.

  1. Game Methodology for Design Methods and Tools Selection

    ERIC Educational Resources Information Center

    Ahmad, Rafiq; Lahonde, Nathalie; Omhover, Jean-françois

    2014-01-01

    Design process optimisation and intelligence are the key words of today's scientific community. A proliferation of methods has made design a convoluted area. Designers are usually afraid of selecting one method/tool over another and even expert designers may not necessarily know which method is the best to use in which circumstances. This…

  2. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  3. Epigenetic Epidemiology: Promises for Public Health Research

    PubMed Central

    Bakulski, Kelly M.; Fallin, M. Daniele

    2014-01-01

    Epigenetic changes underlie developmental and age related biology. Promising epidemiologic research implicates epigenetics in disease risk and progression, and suggests epigenetic status depends on environmental risks as well as genetic predisposition. Epigenetics may represent a mechanistic link between environmental exposures, or genetics, and many common diseases, or may simply provide a quantitative biomarker for exposure or disease for areas of epidemiology currently lacking such measures. This great promise is balanced by issues related to study design, measurement tools, statistical methods, and biological interpretation that must be given careful consideration in an epidemiologic setting. This article describes the promises and challenges for epigenetic epidemiology, and suggests directions to advance this emerging area of molecular epidemiology. PMID:24449392

  4. Epigenetic epidemiology: promises for public health research.

    PubMed

    Bakulski, Kelly M; Fallin, M Daniele

    2014-04-01

    Epigenetic changes underlie developmental and age related biology. Promising epidemiologic research implicates epigenetics in disease risk and progression, and suggests epigenetic status depends on environmental risks as well as genetic predisposition. Epigenetics may represent a mechanistic link between environmental exposures, or genetics, and many common diseases, or may simply provide a quantitative biomarker for exposure or disease for areas of epidemiology currently lacking such measures. This great promise is balanced by issues related to study design, measurement tools, statistical methods, and biological interpretation that must be given careful consideration in an epidemiologic setting. This article describes the promises and challenges for epigenetic epidemiology, and suggests directions to advance this emerging area of molecular epidemiology. Copyright © 2014 Wiley Periodicals, Inc.

  5. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  6. Global optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Arora, Jasbir S.

    1990-01-01

    The problem is to find a global minimum of Problem P. Necessary and sufficient conditions are available for local optimality; however, a global solution can be assured only under the assumption of convexity of the problem. If the constraint set S is compact and the cost function is continuous on it, existence of a global minimum is guaranteed. However, in view of the fact that no global optimality conditions are available, a global solution can be found only by an exhaustive search to satisfy Inequality. The exhaustive search can be organized in such a way that the entire design space need not be searched for the solution, which reduces the computational burden somewhat. It is concluded that the zooming algorithm for global optimization appears to be a good alternative to stochastic methods, although more testing is needed and a general, robust, and efficient local minimizer is required. IDESIGN, which is based on a sequential quadratic programming algorithm, was used in all numerical calculations; since the feasible set keeps shrinking, a good algorithm to find an initial feasible point is also required. Such algorithms need to be developed and evaluated.
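    The organized exhaustive search described above is often approximated in practice by multistart local minimization: run a local descent from many starting points and keep the best result. A minimal sketch on an invented one-dimensional nonconvex objective (neither IDESIGN nor the zooming algorithm is reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    return x**4 - 3 * x**2 + x        # nonconvex: two local minima

def df(x):
    return 4 * x**3 - 6 * x + 1       # analytic gradient

def local_min(x, lr=0.01, steps=2000):
    # Simple gradient descent as the local minimizer.
    for _ in range(steps):
        x -= lr * df(x)
    return x

starts = rng.uniform(-2, 2, 20)       # many starting points
best = min((local_min(x0) for x0 in starts), key=f)
print(f"global minimum near x = {best:.3f}, f = {f(best):.3f}")
```

    The shallower minimum near x = 1.13 would trap a single local search started to its right; multiple starts make finding the deeper minimum near x = -1.30 overwhelmingly likely.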

  7. An Efficient Inverse Aerodynamic Design Method For Subsonic Flows

    NASA Technical Reports Server (NTRS)

    Milholen, William E., II

    2000-01-01

    Computational Fluid Dynamics-based design methods are maturing to the point that they are beginning to be used in the aircraft design process. Many design methods, however, have demonstrated deficiencies in the leading edge region of airfoil sections. The objective of the present research is to develop an efficient inverse design method which is valid in the leading edge region. The new design method is a streamline curvature method, and a new technique is presented for modeling the variation of the streamline curvature normal to the surface. The new design method allows the surface coordinates to move normal to the surface, and has been incorporated into the Constrained Direct Iterative Surface Curvature (CDISC) design method. The accuracy and efficiency of the design method is demonstrated using both two-dimensional and three-dimensional design cases.

  8. Review of methods of dose estimation for epidemiological studies of the radiological impact of nevada test site and global fallout.

    PubMed

    Beck, Harold L; Anspaugh, Lynn R; Bouville, André; Simon, Steven L

    2006-07-01

    Methods to assess radiation doses from nuclear weapons test fallout have been used to estimate doses to populations and individuals in a number of studies. However, only a few epidemiology studies have relied on fallout dose estimates. Though the methods for assessing doses from local and regional fallout compared to global fallout are similar, there are significant differences in predicted doses and contributing radionuclides depending on the source of the fallout, e.g. whether the nuclear debris originated in Nevada at the U.S. nuclear test site or whether it originated at other locations worldwide. The sparse historical measurement data available are generally sufficient to estimate external exposure doses reasonably well. However, reconstruction of doses to body organs from ingestion and inhalation of radionuclides is significantly more complex and is almost always more uncertain than are external dose estimates. Internal dose estimates are generally based on estimates of the ground deposition per unit area of specific radionuclides and subsequent transport of radionuclides through the food chain. A number of technical challenges to correctly modeling deposition of fallout under wet and dry atmospheric conditions still remain, particularly at close-in locations where sizes of deposited particles vary significantly over modest changes in distance. This paper summarizes the various methods of dose estimation from weapons test fallout and the most important dose assessment and epidemiology studies that have relied on those methods.

  9. The healthy men study: design and recruitment considerations for environmental epidemiologic studies in male reproductive health

    EPA Science Inventory

    Study Objective: To describe study conduct and response and participant characteristics. Design: Prospective cohort study. Setting: Participants were male partners of women enrolled in a community-based study of drinking water disinfection by-products and pregnancy healt...

  10. The healthy men study: design and recruitment considerations for environmental epidemiologic studies in male reproductive health

    EPA Science Inventory

    Study Objective: To describe study conduct and response and participant characteristics. Design: Prospective cohort study. Setting: Participants were male partners of women enrolled in a community-based study of drinking water disinfection by-products and pregnancy healt...

  11. A Review of Exposure Assessment Methods in Epidemiological Studies on Incinerators

    PubMed Central

    Ranzi, Andrea; De Leo, Giulio A.; Lauriola, Paolo

    2013-01-01

    Incineration is a common technology for waste disposal, and there is public concern for the health impact deriving from incinerators. Poor exposure assessment has been claimed as one of the main causes of inconsistency in the epidemiological literature. We reviewed 41 studies on incinerators published between 1984 and January 2013 and classified them on the basis of exposure assessment approach. Moreover, we performed a simulation study to explore how the different exposure metrics may influence the exposure levels used in epidemiological studies. 19 studies used linear distance as a measure of exposure to incinerators, 11 studies atmospheric dispersion models, and the remaining 11 studies a qualitative variable such as presence/absence of the source. All reviewed studies utilized residence as a proxy for population exposure, although residence location was evaluated with different precision (e.g., municipality, census block, or exact address). Only one study reconstructed temporal variability in exposure. Our simulation study showed a notable degree of exposure misclassification caused by the use of distance compared to dispersion modelling. We suggest that future studies (i) make full use of pollution dispersion models; (ii) localize population on a fine-scale; and (iii) explicitly account for the presence of potential environmental and socioeconomic confounding. PMID:23840228
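    The misclassification the simulation study points to can be seen in a toy Gaussian-plume calculation (all plume parameters below are assumed neutral-stability values, not from the review): two residences at nearly the same distance from the stack can receive very different ground-level concentrations depending on wind direction, which a pure distance metric cannot capture.

```python
import numpy as np

def plume_concentration(x, y, Qe=1.0, u=3.0, a=0.08, b=0.06):
    # Ground-level concentration from a simple Gaussian plume at
    # downwind distance x and crosswind offset y, with linear
    # dispersion coefficients sigma_y = a*x and sigma_z = b*x.
    if x <= 0:
        return 0.0
    sy, sz = a * x, b * x
    return Qe / (np.pi * u * sy * sz) * np.exp(-y**2 / (2 * sy**2))

downwind = plume_concentration(1000.0, 0.0)    # 1000 m directly downwind
offset = plume_concentration(1000.0, 300.0)    # ~1044 m, off the plume axis
print(downwind, offset)  # similar distance, very different exposure
```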

  12. Design optimization method for Francis turbine

    NASA Astrophysics Data System (ADS)

    Kawajiri, H.; Enomoto, Y.; Kurosawa, S.

    2014-03-01

    This paper presents a design optimization system coupled with CFD. The optimization algorithm of the system employs particle swarm optimization (PSO). Blade shape design is carried out with a NURBS curve defined by a series of control points. The system was applied to designing the stationary vanes and the runner of a higher-specific-speed Francis turbine. As the first step, single-objective optimization was performed on the stay vane profile; the second step was multi-objective optimization of the runner over a wide operating range. As a result, it was confirmed that the design system is useful for the development of hydro turbines.
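    A minimal particle swarm optimizer of the kind used to drive the blade control points, here applied to a simple test function instead of a CFD objective (the swarm parameters are common textbook values, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(3)

def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    # Standard PSO: each particle tracks its personal best (pbest) and
    # is pulled toward it and toward the swarm's global best (gbest).
    x = rng.uniform(-5, 5, (n_particles, dim))   # positions
    v = np.zeros_like(x)                         # velocities
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, f(gbest)

best, val = pso(lambda p: np.sum(p**2), dim=4)   # 4 "control points"
print(val)  # near zero for the sphere function
```

    In the paper's setting, evaluating f would mean meshing the NURBS-defined blade and running a CFD solve, which is why swarm size and iteration count matter so much there.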

  13. An IARC Manual series aimed at assisting cancer epidemiology and prevention. "Environmental carcinogens: selected methods of analysis".

    PubMed

    O'Neill, I K; Fishbein, L

    1986-01-01

    Since 1975, the IARC has been preparing a series of volumes entitled "Environmental Carcinogens: Selected Methods of Analysis" (IARC Manual series) of which the purposes are to assist analysts, epidemiologists and regulatory authorities in planning or performing exposure measurements that are truly comparable between different studies. The Manual series provides expert information within each volume on multi-media sampling, methods of analyses and some background of epidemiology, metabolism, use/occurrence for a group of known or suspect carcinogens. So far, eleven volumes have been published or are in preparation on the following subjects: N-nitrosamines, vinyl chloride, PAH, aromatic amines, mycotoxins, N-nitroso compounds, volatile halogenated hydrocarbons, metals, passive smoking, benzene and alkylated benzenes, dioxins, PCDFs and PCBs. The presentation will discuss needs and priorities for use of analytical chemistry in estimating exposures of apparently greatest relevance to cancer causation, i.e. the approach to developing this series. Indications from epidemiology, evaluations of carcinogenic risk to humans, and recent developments in total exposure assessment are that new methods and matrices need more emphasis, e.g. as with biochemical dosimetry, exhaled breath, and in indoor air.

  14. Alternative methods for the design of jet engine control systems

    NASA Technical Reports Server (NTRS)

    Sain, M. K.; Leake, R. J.; Basso, R.; Gejji, R.; Maloney, A.; Seshadri, V.

    1976-01-01

    Various alternatives to linear quadratic design methods for jet engine control systems are discussed. The main alternatives are classified into two broad categories: nonlinear global mathematical programming methods and linear local multivariable frequency domain methods. Specific studies within these categories include model reduction, the eigenvalue locus method, the inverse Nyquist method, polynomial design, dynamic programming, and conjugate gradient approaches.

  16. Demystifying Mixed Methods Research Design: A Review of the Literature

    ERIC Educational Resources Information Center

    Caruth, Gail D.

    2013-01-01

    Mixed methods research evolved in response to the observed limitations of both quantitative and qualitative designs and is a more complex method. The purpose of this paper was to examine mixed methods research in an attempt to demystify the design, thereby allowing those less familiar with it an opportunity to utilize it in future research.…

  17. Cultural epidemiology of pandemic influenza in urban and rural Pune, India: a cross-sectional, mixed-methods study

    PubMed Central

    Sundaram, Neisha; Schaetti, Christian; Purohit, Vidula; Kudale, Abhay; Weiss, Mitchell G

    2014-01-01

    Objective To identify and compare sociocultural features of pandemic influenza with reference to illness-related experience, meaning and behaviour in urban and rural areas of India. Design Cross-sectional, mixed-methods, cultural epidemiological survey with vignette-based interviews. Semistructured explanatory model interviews were used to study community ideas of the 2009 influenza pandemic. In-depth interviews elaborated respondents’ experience during the pandemic. Setting Urban and rural communities, Pune district, western India. Participants Survey of urban (n=215) and rural (n=221) residents aged between 18 and 65 years. In-depth interviews of respondents with a history of 2009 pandemic influenza (n=6). Results More urban (36.7%) than rural respondents (16.3%, p<0.001) identified the illness in the vignette as ‘swine flu’. Over half (56.7%) believed the illness would be fatal without treatment, but with treatment 96% predicted full recovery. Worry (‘tension’) about the illness was reported as more troubling than somatic symptoms. The most common perceived causes—‘exposure to a dirty environment’ and ‘cough or sneeze of an infected person’—were more prominent in the urban group. Among rural respondents, climatic conditions, drinking contaminated water, tension and cultural ideas on humoral imbalance from heat-producing or cold-producing foods were more prominent. The most widely reported home treatment was herbal remedies; more rural respondents suggested reliance on prayer, and symptom relief was more of a priority for urban respondents. Government health services were preferred in the urban communities, and rural residents relied more than urban residents on private facilities. The important preventive measures emphasised were cleanliness, wholesome lifestyle and vaccines, and more urban respondents reported the use of masks. In-depth interviews indicated treatment delays during the 2009 pandemic, especially among rural patients

  18. The impact of chronic migraine: The Chronic Migraine Epidemiology and Outcomes (CaMEO) Study methods and baseline results

    PubMed Central

    Serrano, Daniel; Buse, Dawn C; Reed, Michael L; Marske, Valerie; Fanning, Kristina M; Lipton, Richard B

    2015-01-01

    Background Longitudinal migraine studies have rarely assessed headache frequency and disability variation over a year. Methods The Chronic Migraine Epidemiology and Outcomes (CaMEO) Study is a cross-sectional and longitudinal Internet study designed to characterize the course of episodic migraine (EM) and chronic migraine (CM). Participants were recruited from a Web-panel using quota sampling in an attempt to obtain a sample demographically similar to the US population. Participants who passed the screener were assessed every three months with the Core (baseline, six, and 12 months) and Snapshot (months three and nine) modules, which assessed headache frequency, headache-related disability, treatments, and treatment satisfaction. The Core also assessed resource use, health-related quality of life, and other features. One-time cross-sectional modules measured family burden, barriers to medical care, and comorbidities/endophenotypes. Results Of 489,537 invitees, we obtained 58,418 (11.9%) usable returns including 16,789 individuals who met ICHD-3 beta migraine criteria (EM (<15 headache days/mo): n = 15,313 (91.2%); CM (≥15 headache days/mo): n = 1476 (8.8%)). At baseline, all qualified respondents (n = 16,789) completed the Screener, Core, and Barriers to Care modules. Subsequent modules showed some attrition (Comorbidities/Endophenotypes, n = 12,810; Family Burden (Proband), n = 13,064; Family Burden (Partner), n = 4022; Family Burden (Child), n = 2140; Snapshot (three months), n = 9741; Core (six months), n = 7517; Snapshot (nine months), n = 6362; Core (12 months), n = 5915). A total of 3513 respondents (21.0%) completed all modules, and 3626 (EM: n = 3303 (21.6%); CM: n = 323 (21.9%)) completed all longitudinal assessments. Conclusions The CaMEO Study provides cross-sectional and longitudinal data that will contribute to our understanding of the course of migraine over one year and quantify variations in

  19. Computational Methods Applied to Rational Drug Design.

    PubMed

    Ramírez, David

    2016-01-01

    Due to the synergic relationship between medicinal chemistry, bioinformatics and molecular simulation, the development of new, accurate computational tools for small-molecule drug design has been rising over recent years. The main result is the increased number of publications where computational techniques such as molecular docking, de novo design and virtual screening have been used to estimate the binding mode, site and energy of novel small molecules. In this work I review some tools that enable the study of biological systems at the atomistic level, providing relevant information and thereby enhancing the process of rational drug design.

  20. Computational Methods Applied to Rational Drug Design

    PubMed Central

    Ramírez, David

    2016-01-01

    Due to the synergic relationship between medicinal chemistry, bioinformatics and molecular simulation, the development of new, accurate computational tools for small-molecule drug design has been rising over recent years. The main result is the increased number of publications where computational techniques such as molecular docking, de novo design and virtual screening have been used to estimate the binding mode, site and energy of novel small molecules. In this work I review some tools that enable the study of biological systems at the atomistic level, providing relevant information and thereby enhancing the process of rational drug design. PMID:27708723

  1. An International Comparison of the Instigation and Design of Health Registers in the Epidemiological Response to Major Environmental Health Incidents.

    PubMed

    Behbod, Behrooz; Leonardi, Giovanni; Motreff, Yvon; Beck, Charles R; Yzermans, Joris; Lebret, Erik; Muravov, Oleg I; Bayleyegn, Tesfaye; Wolkin, Amy Funk; Lauriola, Paolo; Close, Rebecca; Crabbe, Helen; Pirard, Philippe

    Epidemiological preparedness is vital in providing relevant, transparent, and timely intelligence for the management, mitigation, and prevention of public health impacts following major environmental health incidents. A register is a set of records containing systematically collected, standardized data about individual people. Planning for a register of people affected by or exposed to an incident is one of the evolving tools in the public health preparedness and response arsenal. We compared and contrasted the instigation and design of health registers in the epidemiological response to major environmental health incidents in England, France, Italy, the Netherlands, and the United States. Consultation with experts from the 5 nations, supplemented with a review of gray and peer-reviewed scientific literature to identify examples where registers have been used. Populations affected by or at risk from major environmental health incidents in England, France, Italy, the Netherlands, and the United States. Nations were compared with respect to the (1) types of major incidents in their remit for considering a register; (2) arrangements for triggering a register; (3) approaches to design of register; (4) arrangements for register implementation; (5) uses of registers; and (6) examples of follow-up studies. Health registers have played a key role in the effective public health response to major environmental incidents, including sudden chemical, biological, radiological, or nuclear, as well as natural, more prolonged incidents. Value has been demonstrated in the early and rapid deployment of health registers, enabling the capture of a representative population. The decision to establish a health register must ideally be confirmed immediately or soon after the incident using a set of agreed criteria. The establishment of protocols for the instigation, design, and implementation of health registers is recommended as part of preparedness activities. 
Key stakeholders must be

  2. Surveillance in a Telemedicine Setting: Application of Epidemiologic Methods at NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Babiak-Vazquez, Adriana; Ruffaner, Lanie; Wear, Mary; Crucian, Brian; Sams, Clarence; Lee, Lesley R.; Van Baalen, Mary

    2016-01-01

    Space medicine presents unique challenges and opportunities for epidemiologists, such as the use of telemedicine during spaceflight. Medical capabilities aboard the International Space Station (ISS) are limited due to severe restrictions on power, volume, and mass. Consequently, inflight health information is based heavily on crewmember (CM) self-report of signs and symptoms, rather than formal diagnoses. While CMs are in flight, the primary source of crew health information is verbal communication between physicians and crewmembers. In 2010 NASA implemented the Lifetime Surveillance of Astronaut Health, an occupational surveillance program for the U.S. Astronaut corps. This has shifted the epidemiological paradigm from tracking diagnoses based on traditional terrestrial clinical practice to one that incorporates symptomatology and may enable a more population-based understanding of early detection of disease processes.

  3. The Work Design Method for Human Friendly

    NASA Astrophysics Data System (ADS)

    Harada, Narumi; Sasaki, Masatoshi; Ichikawa, Masami

    In order to realize "the product life cycle with respect for human nature", work design ought to configure the work environment to keep workers sound in mind and body, with due consideration of not only physical but also mental factors from the workers' viewpoint. The former includes excessively heavy work, unreasonable working postures, local fatigue of the body, safety, and working comfort; the latter includes work motivation, work worthiness, stress, etc. For the purpose of evaluating the degree of working comfort and safety on human-oriented production lines, we confirmed, for work design, the effectiveness of a work design technique that duly considers working-time variation. We also formulated a model for a mental factor experienced by workers based on the degree of working delays. This study covers a work design technique we developed with the effect of this factor as the evaluation value.

  4. A Method of Integrated Description of Design Information for Reusability

    NASA Astrophysics Data System (ADS)

    Tsumaya, Akira; Nagae, Masao; Wakamatsu, Hidefumi; Shirase, Keiichi; Arai, Eiji

    Much of product design is executed concurrently these days. For such concurrent design, a method that can share and reuse various kinds of design information among designers is needed. However, complete understanding of design information among designers has been a difficult issue. In this paper, a design process model that makes use of designers’ intentions is proposed. A method to combine the design process information and the design object information is also proposed. We introduce how to describe designers’ intentions by providing some databases. The Keyword Database consists of ontological data related to design objects/activities. Designers select suitable keyword(s) from the Keyword Database and explain the reasons/ideas for their design activities in descriptions using those keyword(s). We also developed an integrated design information management system architecture using this method of integrated description with designers’ intentions. This system realizes connections between information related to the design process and that related to the design object through designers’ intentions. Designers can communicate with each other through it to understand how others make decisions in design. Designers can also reuse both design process information and design object information through the database management subsystem.

  5. Supersonic biplane design via adjoint method

    NASA Astrophysics Data System (ADS)

    Hu, Rui

    In developing the next generation supersonic transport airplane, two major challenges must be resolved. The fuel efficiency must be significantly improved, and the sonic boom propagating to the ground must be dramatically reduced. Both of these objectives can be achieved by reducing the shockwaves formed in supersonic flight. The Busemann biplane is famous for using favorable shockwave interaction to achieve nearly shock-free supersonic flight at its design Mach number. Its performance at off-design Mach numbers, however, can be very poor. This dissertation studies the performance of supersonic biplane airfoils at design and off-design conditions. The choked-flow and flow-hysteresis phenomena of these biplanes are studied. These effects are due to the finite thickness of the airfoils and the non-uniqueness of the solution to the Euler equations, creating over an order of magnitude more wave drag than that predicted by supersonic thin airfoil theory. As a result, the off-design performance is the major barrier to the practical use of supersonic biplanes. The main contribution of this work is to drastically improve the off-design performance of supersonic biplanes by using an adjoint-based aerodynamic optimization technique. The Busemann biplane is used as the baseline design, and its shape is altered to achieve optimal wave drag over a series of Mach numbers ranging from 1.1 to 1.7, during both acceleration and deceleration conditions. The optimized biplane airfoils dramatically reduce the effects of the choked-flow and flow-hysteresis phenomena, while maintaining a certain degree of favorable shockwave interaction effects at the design Mach number. Compared to a diamond-shaped single airfoil of the same total thickness, the wave drag of our optimized biplane is lower at almost all Mach numbers, and is significantly lower at the design Mach number. 
In addition, by performing a Navier-Stokes solution for the optimized airfoil, it is verified that the optimized biplane improves

  6. A rapid method for estimating the levels of urinary thiobarbituric Acid reactive substances for environmental epidemiologic survey.

    PubMed

    Kil, Han-Na; Eom, Sang-Yong; Park, Jung-Duck; Kawamoto, Toshihiro; Kim, Yong-Dae; Kim, Heon

    2014-03-01

    Malondialdehyde (MDA), used as an oxidative stress marker, is commonly assayed by measuring the thiobarbituric acid reactive substances (TBARS) using HPLC, as an indicator of the MDA concentration. Since the HPLC method, though highly specific, is time-consuming and expensive, it is usually not suitable as a rapid test in large-scale environmental epidemiologic surveys. The purpose of this study is to develop a simple and rapid method for estimating TBARS levels by using a multiple regression equation that includes TBARS levels measured with a microplate reader as independent variables. Twelve-hour urine samples were obtained from 715 subjects. The concentration of TBARS was measured at three different wavelengths (fluorescence: λ-ex 530 nm and λ-em 550 nm; λ-ex 515 nm and λ-em 553 nm; and absorbance: 532 nm) using a microplate reader as well as HPLC. Five hundred samples were used to develop a regression equation, and the remaining 215 samples were used to evaluate the validity of the regression analysis. The induced multiple regression equation is as follows: TBARS level (μM) = -0.282 + 1.830 × (TBARS level measured with a microplate reader at the fluorescence wavelengths λ-ex 530 nm and λ-em 550 nm, μM) -0.685 × (TBARS level measured with a microplate reader at the fluorescence wavelengths λ-ex 515 nm and λ-em 553 nm, μM) + 0.035 × (TBARS level measured with a microplate reader at the absorbance wavelength 532 nm, μM). The estimated TBARS levels showed a better correlation with, and are closer to, the corresponding TBARS levels measured by HPLC compared to the values obtained by the microplate method. The TBARS estimation method reported here is simple and rapid, and generally in concordance with HPLC measurements. This method might be a useful tool for monitoring of urinary TBARS levels in environmental epidemiologic surveys with large sample sizes.
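The regression equation quoted in the abstract transcribes directly into code. The function below is a minimal sketch of that published equation; the argument names are illustrative, and the three inputs are the microplate-reader TBARS readings (μM) at the two fluorescence wavelength pairs and at the 532 nm absorbance wavelength, in that order.

```python
def estimate_tbars(fl_530_550, fl_515_553, abs_532):
    """Estimated urinary TBARS level (uM) from the paper's multiple
    regression equation on three microplate-reader measurements:
      fl_530_550: fluorescence, ex 530 nm / em 550 nm (uM)
      fl_515_553: fluorescence, ex 515 nm / em 553 nm (uM)
      abs_532:    absorbance at 532 nm (uM)
    """
    return (-0.282
            + 1.830 * fl_530_550
            - 0.685 * fl_515_553
            + 0.035 * abs_532)

# Hypothetical readings for one sample:
level = estimate_tbars(0.95, 0.40, 1.20)  # ≈ 1.22 uM
```

Note that the equation is only valid for microplate readings expressed in the same μM units used to fit the model.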

  7. JASMINE design and method of data reduction

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Niwa, Yoshito

    2008-07-01

    Japan Astrometry Satellite Mission for Infrared Exploration (JASMINE) aims to construct a map of the Galactic bulge with 10 μarcsec accuracy. We use z-band CCDs to avoid dust absorption, and observe an area of about 10 × 20 degrees around the Galactic bulge region. Because the stellar density is very high, the FOVs can be combined with high accuracy. With 5 years of observation, we will construct a map accurate to 10 μarcsec. In this poster, I show the observation strategy, the design of the JASMINE hardware, the reduction scheme, and the error budget. We have also constructed simulation software named the JASMINE Simulator, and show simulation results and the design of the software.

  8. Designing a mixed methods study in pediatric oncology nursing research.

    PubMed

    Wilkins, Krista; Woodgate, Roberta

    2008-01-01

    Despite the appeal of discovering the different strengths of various research methods, mixed methods research remains elusive in pediatric oncology nursing research. If pediatric oncology nurses are to succeed in mixing quantitative and qualitative methods, they need practical guidelines for managing the complex data and analyses of mixed methods research. This article discusses mixed methods terminology, designs, and key design features. Specific areas addressed include the myths about mixed methods research, types of mixed method research designs, steps involved in developing a mixed method research study, and the benefits and challenges of using mixed methods designs in pediatric oncology research. Examples of recent research studies that have combined quantitative and qualitative research methods are provided. The term mixed methods research is used throughout this article to reflect the use of both quantitative and qualitative methods within one study rather than the use of these methods in separate studies concerning the same research problem.

  9. A method for nonlinear optimization with discrete design variables

    NASA Technical Reports Server (NTRS)

    Olsen, Gregory R.; Vanderplaats, Garret N.

    1987-01-01

    A numerical method is presented for the solution of nonlinear discrete optimization problems. The applicability of discrete optimization to engineering design is discussed, and several standard structural optimization problems are solved using discrete design variables. The method uses approximation techniques to create subproblems suitable for linear mixed-integer programming methods. The method employs existing software for continuous optimization and integer programming.
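As a toy illustration of optimization with discrete design variables, the sketch below sizes a hypothetical two-member structure from a discrete catalog of section areas by exhaustive enumeration. The paper's actual approach (approximation techniques yielding linear mixed-integer subproblems) scales far better; the catalog, stress model, and constants here are invented purely for illustration.

```python
from itertools import product

# Hypothetical catalog of available discrete section areas (cm^2).
catalog = [1.0, 1.5, 2.0, 3.0, 4.5]

def weight(a1, a2, length=100.0, density=0.0078):
    """Toy objective: total member weight for two members of equal length."""
    return density * length * (a1 + a2)

def stress_ok(a1, a2, load=10.0, allowable=6.0):
    """Toy constraint: axial stress load/area must not exceed the allowable."""
    return load / a1 <= allowable and load / a2 <= allowable

# Exhaustive enumeration over all discrete combinations (feasible for tiny
# problems only; mixed-integer programming handles realistic sizes).
feasible = [(a1, a2) for a1, a2 in product(catalog, repeat=2)
            if stress_ok(a1, a2)]
best = min(feasible, key=lambda s: weight(*s))  # -> (2.0, 2.0)
```

The enumeration makes the combinatorial nature of discrete sizing explicit: each added member multiplies the search space by the catalog size, which is exactly why the paper resorts to approximate subproblems and integer programming software.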

  11. Lithography aware overlay metrology target design method

    NASA Astrophysics Data System (ADS)

    Lee, Myungjun; Smith, Mark D.; Lee, Joonseuk; Jung, Mirim; Lee, Honggoo; Kim, Youngsik; Han, Sangjun; Adel, Michael E.; Lee, Kangsan; Lee, Dohwa; Choi, Dongsub; Liu, Zephyr; Itzkovich, Tal; Levinski, Vladimir; Levy, Ady

    2016-03-01

    We present a metrology target design (MTD) framework based on co-optimizing lithography and metrology performance. The overlay metrology performance is strongly related to the target design, and optimizing the target under different process variations in a high-NA optical lithography tool and measurement conditions in a metrology tool becomes critical for sub-20nm nodes. The lithography performance can be quantified by device matching and printability metrics, while accuracy and precision metrics are used to quantify the metrology performance. Using these metrics, we demonstrate how the optimized target can improve target printability while maintaining good metrology performance for rotated dipole illumination used for printing a sub-100nm diagonal feature in a memory active layer. The remaining challenges and the existing tradeoff between metrology and lithography performance are explored from the metrology target designer's perspective. The proposed target design framework is completely general and can be used to optimize targets for different lithography conditions. The results from our analysis are both physically sensible and in good agreement with experimental results.

  12. Participatory design methods in telemedicine research.

    PubMed

    Clemensen, Jane; Rothmann, Mette J; Smith, Anthony C; Caffery, Liam J; Danbjorg, Dorthe B

    2016-01-01

    Healthcare systems require a paradigm shift in the way healthcare services are delivered to counteract demographic changes in patient populations, expanding technological developments and the increasing complexity of healthcare. Participatory design (PD) is a methodology that promotes the participation of users in the design process of potential telehealth applications. A PD project can be divided into four phases including: the identification and analysis of participant needs; the generation of ideas and development of prototypes; testing and further development of prototypes; and evaluation. PD is an iterative process where each phase is planned by reflecting on the results from the previous phase with respect to the participants' contribution. Key activities of a PD project include: fieldwork; literature reviewing; and development and testing. All activities must be applied with a participatory mindset that will ensure genuine participation throughout the project. Challenges associated with the use of PD include: the time required to properly engage with participants; language and culture barriers amongst participants; the selection of participants to ensure good representation of the user group; and empowerment. PD is an important process, which is complemented by other evaluation strategies that assess organisational requirements, clinical safety, and clinical and cost effectiveness. PD is a methodology which encourages genuine involvement, where participants have an opportunity to identify practical problems and to design and test technology. The process engages participants in storytelling, future planning and design. PD is a multifaceted assessment tool that helps explore more accurately clinical requirements and patient perspectives in telehealth.

  13. Epidemiologic Methods Lessons Learned from Environmental Public Health Disasters: Chernobyl, the World Trade Center, Bhopal, and Graniteville, South Carolina

    PubMed Central

    Svendsen, Erik R.; Runkle, Jennifer R.; Dhara, Venkata Ramana; Lin, Shao; Naboka, Marina; Mousseau, Timothy A.; Bennett, Charles

    2012-01-01

    Background: Environmental public health disasters involving hazardous contaminants may have devastating effects. While much is known about their immediate devastation, far less is known about long-term impacts of these disasters. Extensive latent and chronic long-term public health effects may occur. Careful evaluation of contaminant exposures and long-term health outcomes within the constraints imposed by limited financial resources is essential. Methods: Here, we review epidemiologic methods lessons learned from conducting long-term evaluations of four environmental public health disasters involving hazardous contaminants at Chernobyl, the World Trade Center, Bhopal, and Graniteville (South Carolina, USA). Findings: We found several lessons learned which have direct implications for the on-going disaster recovery work following the Fukushima radiation disaster or for future disasters. Interpretation: These lessons should prove useful in understanding and mitigating latent health effects that may result from the nuclear reactor accident in Japan or future environmental public health disasters. PMID:23066404

  14. The application of mixed methods designs to trauma research.

    PubMed

    Creswell, John W; Zhang, Wanqing

    2009-12-01

    Despite the use of quantitative and qualitative data in trauma research and therapy, mixed methods studies in this field have not been analyzed to help researchers designing investigations. This discussion begins by reviewing four core characteristics of mixed methods research in the social and human sciences. Combining these characteristics, the authors focus on four select mixed methods designs that are applicable in trauma research. These designs are defined and their essential elements noted. Applying these designs to trauma research, a search was conducted to locate mixed methods trauma studies. From this search, one sample study was selected, and its characteristics of mixed methods procedures noted. Finally, drawing on other mixed methods designs available, several follow-up mixed methods studies were described for this sample study, enabling trauma researchers to view design options for applying mixed methods research in trauma investigations.

  15. Methods for library-scale computational protein design.

    PubMed

    Johnson, Lucas B; Huber, Thaddaus R; Snow, Christopher D

    2014-01-01

    Faced with a protein engineering challenge, a contemporary researcher can choose from myriad design strategies. Library-scale computational protein design (LCPD) is a hybrid method suitable for the engineering of improved protein variants with diverse sequences. This chapter discusses the background and merits of several practical LCPD techniques. First, LCPD methods suitable for delocalized protein design are presented in the context of example design calculations for cellobiohydrolase II. Second, localized design methods are discussed in the context of an example design calculation intended to shift the substrate specificity of a ketol-acid reductoisomerase Rossmann domain from NADPH to NADH.

  16. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
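The idea of propagating scatter in primitive variables up to a reliability figure can be illustrated with a minimal Monte Carlo sketch. This is not the report's computational simulation method, and the strength and load distributions below are assumed purely for illustration.

```python
import random

def failure_probability(n=100_000, seed=1):
    """Monte Carlo sketch: propagate assumed scatter in two primitive
    variables (material strength and service load) to a component
    failure probability, defined here as P(load >= strength)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        strength = rng.gauss(100.0, 10.0)  # assumed strength scatter
        load = rng.gauss(60.0, 15.0)       # assumed load scatter
        if load >= strength:
            failures += 1
    return failures / n

p_fail = failure_probability()  # roughly 1-2% under these assumptions
```

In a full analysis the sampled primitives would feed a structural model rather than a one-line limit state, but the propagation principle, i.e. sample the scatter, evaluate the response, and tally exceedances, is the same.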

  17. Different methods to analyze stepped wedge trial designs revealed different aspects of intervention effects.

    PubMed

    Twisk, J W R; Hoogendijk, E O; Zwijsen, S A; de Boer, M R

    2016-04-01

    Within epidemiology, a stepped wedge trial design (i.e., a one-way crossover trial in which several arms start the intervention at different time points) is increasingly popular as an alternative to a classical cluster randomized controlled trial. Despite this increasing popularity, there is a huge variation in the methods used to analyze data from a stepped wedge trial design. Four linear mixed models were used to analyze data from a stepped wedge trial design on two example data sets. The four methods were chosen because they have been (frequently) used in practice. Method 1 compares all the intervention measurements with the control measurements. Method 2 treats the intervention variable as a time-independent categorical variable comparing the different arms with each other. In method 3, the intervention variable is a time-dependent categorical variable comparing groups with different number of intervention measurements, whereas in method 4, the changes in the outcome variable between subsequent measurements are analyzed. Regarding the results in the first example data set, methods 1 and 3 showed a strong positive intervention effect, which disappeared after adjusting for time. Method 2 showed an inverse intervention effect, whereas method 4 did not show a significant effect at all. In the second example data set, the results were the opposite. Both methods 2 and 4 showed significant intervention effects, whereas the other two methods did not. For method 4, the intervention effect attenuated after adjustment for time. Different methods to analyze data from a stepped wedge trial design reveal different aspects of a possible intervention effect. The choice of a method partly depends on the type of the intervention and the possible time-dependent effect of the intervention. Furthermore, it is advised to combine the results of the different methods to obtain an interpretable overall result. Copyright © 2016 Elsevier Inc. All rights reserved.
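Method 1 above (pooling all intervention measurements against all control measurements) can be sketched as a simple difference in means. The toy stepped wedge data below, with three arms crossing over at successive time points, is invented for illustration; a real analysis would use a linear mixed model, and as the abstract notes, this crude comparison confounds the intervention with time.

```python
def method1_effect(data):
    """Method 1 from the abstract: pool every intervention-phase
    measurement against every control-phase measurement and compare
    means. `data` is a list of (arm, time, treated, outcome) tuples."""
    treated = [y for _, _, tx, y in data if tx]
    control = [y for _, _, tx, y in data if not tx]
    return sum(treated) / len(treated) - sum(control) / len(control)

# Toy stepped wedge: 3 arms, 4 time points, arms cross over at t = 1, 2, 3.
# Outcome is 10.0 at baseline with a noise-free +2.0 intervention effect.
data = []
for arm, switch in enumerate([1, 2, 3]):
    for t in range(4):
        tx = t >= switch
        data.append((arm, t, tx, 10.0 + (2.0 if tx else 0.0)))

effect = method1_effect(data)  # recovers 2.0 in this noise-free toy case
```

Adding a secular time trend to the toy outcome would bias this pooled estimate, because later measurements are disproportionately intervention measurements, which is exactly why the abstract stresses adjusting for time.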

  18. A comparison of digital flight control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Many variations in design methods for aircraft digital flight control have been proposed in the literature. In general, the methods fall into two categories: those where the design is done in the continuous domain (or s-plane), and those where the design is done in the discrete domain (or z-plane). This paper evaluates several variations of each category and compares them for various flight control modes of the Langley TCV Boeing 737 aircraft. Design method fidelity is evaluated by examining closed-loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps, except the 'uncompensated s-plane design' method, which was acceptable only above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
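    One concrete difference between the two categories is that a continuous-domain (s-plane) design must eventually be mapped to the discrete domain for a given sample period T. A small sketch using the exact pole mapping z = exp(sT) (the first-order lag 1/(s + 2) and the sample rates are assumed for illustration, not taken from the paper):

    ```python
    import math

    # Map a continuous-domain pole to its discrete equivalent for a given
    # sample period T via z = exp(s*T). The example pole s = -2 (a first-
    # order lag 1/(s + 2)) and the sample rates are illustrative only.

    def pole_map(s_pole, T):
        """Discrete pole z = exp(s*T) for a continuous pole s."""
        return math.exp(s_pole * T)

    for rate_cps in (20.0, 10.0, 5.0):
        T = 1.0 / rate_cps
        z = pole_map(-2.0, T)
        # 1 - 2*T is the fast-sampling (small-T) approximation of the pole
        print(rate_cps, z, 1.0 - 2.0 * T)
    ```

    As the sample rate drops, exp(-2T) drifts from its fast-sampling approximation 1 - 2T: one simple illustration of why s-plane designs discretized without compensation degrade at low sample rates.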

  19. Improved hybrid SMS-DSF method of nonimaging optical design

    NASA Astrophysics Data System (ADS)

    Bortz, John; Shatz, Narkis

    2011-10-01

    The hybrid SMS-DSF method of nonimaging optical design combines the discrete simultaneous multiple surface (SMS) method with the dual-surface functional (DSF) method to obtain improved optical performance relative to the discrete SMS method alone. In this contribution we present a new extension of the hybrid SMS-DSF method that uses differential ray tracing to produce designs having significantly improved performance relative to the original hybrid SMS-DSF method.

  20. Computational Methods for Design, Control and Optimization

    DTIC Science & Technology

    2007-10-01

    "scenario" that applies to channel flows (Poiseuille flows, Couette flow) and pipe flows. Over the past 75 years many complex "transition theories" have... Simulation of Turbulent Flows, Springer Verlag, 2005. Additional Publications Supported by this Grant: 1. J. Borggaard and T. Iliescu, Approximate Deconvolution... rigorous analysis of design algorithms that combine numerical simulation codes, approximate sensitivity calculations and optimization codes. The fundamental

  1. Soft computing methods in design of superalloys

    NASA Technical Reports Server (NTRS)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1995-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modeled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.
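    The NN + GA loop described above can be sketched with a toy genetic algorithm. A simple quadratic surrogate stands in for the trained neural-network model of K(sub a); the 3-component "composition" vector and its optimum are invented for illustration:

    ```python
    import random

    random.seed(0)

    # Toy genetic algorithm sketch of the NN + GA approach described above.
    # The quadratic surrogate stands in for the neural-network model of the
    # cyclic oxidation attack parameter Ka; the composition vector and its
    # assumed optimum are illustrative only.

    TARGET = [0.3, 0.5, 0.2]  # assumed composition minimizing the surrogate

    def surrogate_ka(x):
        """Stand-in for the neural network predicting Ka from composition."""
        return sum((xi - ti) ** 2 for xi, ti in zip(x, TARGET))

    def mutate(x, scale=0.05):
        return [min(1.0, max(0.0, xi + random.uniform(-scale, scale)))
                for xi in x]

    def crossover(a, b):
        return [random.choice(pair) for pair in zip(a, b)]

    population = [[random.random() for _ in range(3)] for _ in range(30)]
    for _ in range(100):
        population.sort(key=surrogate_ka)
        parents = population[:10]  # elitist selection: keep the best 10
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(20)]
        population = parents + children

    best = min(population, key=surrogate_ka)
    print(best, surrogate_ka(best))
    ```

    Elitism guarantees the best composition never worsens between generations, so the loop steadily drives the surrogate Ka toward its minimum.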

  2. Soft Computing Methods in Design of Superalloys

    NASA Technical Reports Server (NTRS)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1996-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modelled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.

  3. The epidemiology of male infertility.

    PubMed

    Winters, Brian R; Walsh, Thomas J

    2014-02-01

    The purpose of this review is to integrate understanding of epidemiology and infertility. A primer on epidemiologic science and an example disease for which the design of epidemiologic investigations is readily apparent are provided. Key features of infertility that limit epidemiologic investigation are described, and a survey of available data on the epidemiology of infertility is provided. Finally, the work that must be completed to move this area of research forward is proposed, and, with this new perspective of "infertility as a disease," the improvements in public health that may be gained through a better understanding of the epidemiology of male infertility are envisioned.

  4. A comparison of methods currently used in inclusive design.

    PubMed

    Goodman-Deane, Joy; Ward, James; Hosking, Ian; Clarkson, P John

    2014-07-01

    Inclusive design has unique challenges because it aims to improve usability for a wide range of users. This typically includes people with lower levels of ability, as well as mainstream users. This paper examines the effectiveness of two methods that are used in inclusive design: user trials and exclusion calculations (an inclusive design inspection method). A study examined three autoinjectors using both methods (n=30 for the user trials). The usability issues identified by each method are compared and the effectiveness of the methods is discussed. The study found that each method identified different kinds of issues, all of which are important for inclusive design. We therefore conclude that a combination of methods should be used in inclusive design rather than relying on a single method. Recommendations are also given for how the individual methods can be used more effectively in this context.

  5. Waterflooding injectate design systems and methods

    DOEpatents

    Brady, Patrick V.; Krumhansl, James L.

    2016-12-13

    A method of recovering a liquid hydrocarbon using an injectate includes recovering the liquid hydrocarbon through primary extraction. Physico-chemical data representative of electrostatic interactions between the liquid hydrocarbon and the reservoir rock are measured. At least one additive of the injectate is selected based on the physico-chemical data. The method includes recovering the liquid hydrocarbon from the reservoir rock through secondary extraction using the injectate.

  6. Design of a set of probes with high potential for influenza virus epidemiological surveillance

    PubMed Central

    Carreño-Durán, Luis R; Larios-Serrato, V; Jaimes-Díaz, Hueman; Pérez-Cervantes, Hilda; Zepeda-López, Héctor; Sánchez-Vallejo, Carlos Javier; Olguín-Ruiz, Gabriela Edith; Maldonado-Rodríguez, Rogelio; Méndez-Tenorio, Alfonso

    2013-01-01

    An Influenza Probe Set (IPS) consisting of 1,249 9-mer probes for genomic fingerprinting of closely and distantly related Influenza Virus strains was designed and tested in silico. The IPS was derived from alignments of Influenza genomes. The RNA segments of 5,133 influenza strains having diverse degrees of relatedness were concatenated and aligned. After alignment, 9-mer sites having high Shannon entropy were searched for. Additional criteria were then applied to select probes with high sequence entropy: G+C content between 35 and 65%, absence of consecutive dimer or trimer repeats, a minimum of 2 differences between 9-mers, and Tm values between 34.5 and 36.5 °C. Virtual Hybridization was used to predict Genomic Fingerprints to assess the capability of the IPS to discriminate between influenza and related strains. Distance scores between pairs of Influenza Genomic Fingerprints were calculated and used for estimating Taxonomic Trees. Visual examination of both Genomic Fingerprints and Taxonomic Trees suggests that the IPS is able to discriminate between distant and closely related Influenza strains. It is proposed that the IPS can be used to investigate, by virtual or experimental hybridization, any new, and potentially virulent, strain. PMID:23750091
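    The probe-selection criteria above are easy to express as filters. A minimal sketch of the Shannon-entropy, G+C, and repeat screens (the Tm screen is omitted because the abstract does not specify which thermodynamic model was used):

    ```python
    import math
    from collections import Counter

    # Sketch of the probe-selection filters described above: Shannon entropy
    # of the bases observed at one alignment position, G+C content between
    # 35 and 65%, and rejection of consecutive dimer/trimer repeats. The
    # Tm screen is omitted (the thermodynamic model is not specified).

    def shannon_entropy(column):
        """Shannon entropy (bits) of the bases at one alignment position."""
        counts = Counter(column)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values())

    def gc_content(probe):
        return sum(base in "GC" for base in probe) / len(probe)

    def has_short_repeat(probe):
        """True if the probe contains a consecutive dimer or trimer repeat,
        e.g. 'ATAT' or 'CAGCAG'."""
        for k in (2, 3):
            for i in range(len(probe) - 2 * k + 1):
                if probe[i:i + k] == probe[i + k:i + 2 * k]:
                    return True
        return False

    def passes_filters(probe):
        return 0.35 <= gc_content(probe) <= 0.65 and not has_short_repeat(probe)

    print(shannon_entropy("ACGT"), passes_filters("GACTGTCAG"))
    ```

    High-entropy alignment positions are the most informative for fingerprinting, which is why the entropy screen comes first in the pipeline described above.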

  7. Design and development of an instrument to measure overall lifestyle habits for epidemiological research: the Mediterranean Lifestyle (MEDLIFE) index.

    PubMed

    Sotos-Prieto, Mercedes; Moreno-Franco, Belén; Ordovás, Jose M; León, Montse; Casasnovas, Jose A; Peñalvo, Jose L

    2015-04-01

    To design and develop a questionnaire that can account for an individual's adherence to a Mediterranean lifestyle including the assessment of diet and physical activity patterns, as well as social interaction. The Mediterranean Lifestyle (MEDLIFE) index was created based on the current Spanish Mediterranean food guide pyramid. MEDLIFE is a twenty-eight-item derived index consisting of questions about food consumption (fifteen items), traditional Mediterranean dietary habits (seven items) and physical activity, rest and social interaction habits (six items). Linear regression models and Spearman rank correlation were fitted to assess content validity and internal consistency. A subset of participants in the Aragon Workers' Health Study cohort (Zaragoza, Spain) provided the data for development of MEDLIFE. Participants (n 988) of the Aragon Workers' Health Study cohort in Spain. Mean MEDLIFE score was 11·3 (sd 2·6; range: 0-28), and the quintile distribution of MEDLIFE score showed a significant association with each of the individual items as well as with specific nutrients and lifestyle indicators (intra-validity). We also quantified MEDLIFE correspondence with previously reported diet quality indices and found significant correlations (ρ range: 0·44-0·53; P<0·001) for the Alternate Healthy Eating Index, the Alternate Mediterranean Diet Index and Mediterranean Diet Adherence Screener. MEDLIFE is the first index to include an overall assessment of lifestyle habits. It is expected to be a more holistic tool to measure adherence to the Mediterranean lifestyle in epidemiological studies.
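    Structurally, a twenty-eight-item index of this kind reduces to summing binary criteria. A minimal scoring sketch (the item names and responses are invented, not the actual MEDLIFE questionnaire content):

    ```python
    # Minimal scoring sketch for a MEDLIFE-style index: twenty-eight binary
    # items (1 point when the criterion is met) summed into a 0-28 score.
    # The item names and responses below are invented for illustration.

    def medlife_score(item_responses):
        """item_responses: dict mapping item name -> True if criterion met."""
        if len(item_responses) != 28:
            raise ValueError("MEDLIFE requires exactly 28 items")
        return sum(bool(met) for met in item_responses.values())

    responses = {f"item_{i}": (i % 2 == 0) for i in range(1, 29)}
    print(medlife_score(responses))  # -> 14 (every second criterion met)
    ```

    Validation as described in the abstract would then correlate this total score against nutrient intakes and existing diet-quality indices.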

  8. The HIV prevention cascade: integrating theories of epidemiological, behavioural, and social science into programme design and monitoring.

    PubMed

    Hargreaves, James R; Delany-Moretlwe, Sinead; Hallett, Timothy B; Johnson, Saul; Kapiga, Saidi; Bhattacharjee, Parinita; Dallabetta, Gina; Garnett, Geoff P

    2016-07-01

    Theories of epidemiology, health behaviour, and social science have changed the understanding of HIV prevention in the past three decades. The HIV prevention cascade is emerging as a new approach to guide the design and monitoring of HIV prevention programmes in a way that integrates these multiple perspectives. This approach recognises that translating the efficacy of direct mechanisms that mediate HIV prevention (including prevention products, procedures, and risk-reduction behaviours) into population-level effects requires interventions that increase coverage. An HIV prevention cascade approach suggests that high coverage can be achieved by targeting three key components: demand-side interventions that improve risk perception and awareness and acceptability of prevention approaches; supply-side interventions that make prevention products and procedures more accessible and available; and adherence interventions that support ongoing adoption of prevention behaviours, including those that do and do not involve prevention products. Programmes need to develop delivery platforms that ensure these interventions reach target populations, to shape the policy environment so that it facilitates implementation at scale with high quality and intensity, and to monitor the programme with indicators along the cascade. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications, including some semantic information about the system, into machine-executable form. Very high level design methods range from general, domain-independent methods to approaches implementable only for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though a given approach does not always fall exactly into any one class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  10. Comparison of four nonstationary hydrologic design methods for changing environment

    NASA Astrophysics Data System (ADS)

    Yan, Lei; Xiong, Lihua; Guo, Shenglian; Xu, Chong-Yu; Xia, Jun; Du, Tao

    2017-08-01

    The hydrologic design of nonstationary flood extremes is an emerging field that is essential for water resources management and hydrologic engineering design to cope with changing environment. This paper aims to investigate and compare the capability of four nonstationary hydrologic design strategies, including the expected number of exceedances (ENE), design life level (DLL), equivalent reliability (ER), and average design life level (ADLL), with the last three methods taking into consideration the design life of the project. The confidence intervals of the calculated design floods were also estimated using the nonstationary bootstrap approach. A comparison of these four methods was performed using the annual maximum flood series (AMFS) of the Weihe River basin, Jinghe River basin, and Assunpink Creek basin. The results indicated that ENE, ER and ADLL yielded the same or very similar design values and confidence intervals for both increasing and decreasing trends of AMFS considered. DLL also yields similar design values if the relationship between DLL and ER/ADLL return periods is considered. Both ER and ADLL are recommended for practical use as they have associated design floods with the design life period of projects and yield reasonable design quantiles and confidence intervals. Furthermore, by assuming that the design results using either a stationary or nonstationary hydrologic design strategy should have the same reliability, the ER method enables us to solve the nonstationary hydrologic design problems by adopting the stationary design reliability, thus bridging the gap between stationary and nonstationary design criteria.
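    The expected-number-of-exceedances (ENE) criterion mentioned above chooses the design value z so that the expected count of annual exceedances over the project's design life equals one. A minimal sketch, assuming a Gumbel annual-maximum distribution with a linearly trending location parameter (an illustration only, not the models fitted in the paper):

    ```python
    import math

    # Sketch of the ENE criterion: find z such that the expected number of
    # annual exceedances over the design life equals 1. The Gumbel model
    # with a linear trend in its location parameter is an assumed example.

    def gumbel_exceedance(z, mu, beta=1.0):
        """P(annual maximum > z) for a Gumbel(mu, beta) distribution."""
        return 1.0 - math.exp(-math.exp(-(z - mu) / beta))

    def ene_design_value(design_life, mu0=10.0, trend=0.05, target=1.0):
        """Bisection for z such that sum_t P(X_t > z) == target."""
        def expected_exceedances(z):
            return sum(gumbel_exceedance(z, mu0 + trend * t)
                       for t in range(1, design_life + 1))
        lo, hi = mu0, mu0 + 50.0
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if expected_exceedances(mid) > target:
                lo = mid  # z too small: too many expected exceedances
            else:
                hi = mid
        return 0.5 * (lo + hi)

    print(ene_design_value(design_life=50))
    ```

    With trend = 0 this reduces to the stationary 50-year design value 10 - ln(-ln(0.98)) ≈ 13.9, so the nonstationary answer can be read as an adjustment to the stationary quantile.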

  11. Diagnosis of Dementia by Machine learning methods in Epidemiological studies: a pilot exploratory study from south India.

    PubMed

    Bhagyashree, Sheshadri Iyengar Raghavan; Nagaraj, Kiran; Prince, Martin; Fall, Caroline H D; Krishna, Murali

    2017-07-11

    There are limited data on the use of artificial intelligence methods for the diagnosis of dementia in epidemiological studies in low- and middle-income country (LMIC) settings. A culture- and education-fair battery of cognitive tests was developed and validated for population-based studies in low- and middle-income countries, including India, by the 10/66 Dementia Research Group. We explored machine learning methods based on the 10/66 battery of cognitive tests for the diagnosis of dementia in a birth cohort study in South India. The data sets for 466 men and women were obtained from the ongoing Mysore Studies of Natal effect of Health and Ageing (MYNAH), in south India. The data sets included demographics, performance on the 10/66 cognitive function tests, the 10/66 diagnosis of mental disorders, and population-based normative data for the 10/66 battery of cognitive function tests. Diagnosis of dementia from the rule-based approach was compared against the 10/66 diagnosis of dementia. We applied machine learning techniques to identify the minimal number of the 10/66 cognitive function tests required for diagnosing dementia and derived an algorithm to improve the accuracy of dementia diagnosis. Of 466 subjects, 27 had a 10/66 diagnosis of dementia, 19 of whom were correctly identified as having dementia by JRip classification with 100% accuracy. This pilot exploratory study indicates that machine learning methods can help identify community-dwelling older adults with a 10/66 criterion diagnosis of dementia with good accuracy in an LMIC setting such as India. This should reduce the duration of the diagnostic assessment and make the process easier and quicker for clinicians and patients, and will be useful for 'case' ascertainment in population-based epidemiological studies.
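    JRip learns an ordered list of if-then rules (Weka's implementation of RIPPER). A hand-written stand-in shows the shape of such a rule list; the feature names and cutoffs below are invented, not the rules learned from the 10/66 battery:

    ```python
    # Illustrative stand-in for rule-based (JRip-style) classification with
    # a reduced set of cognitive tests. Feature names and cutoffs are
    # invented; the actual study learned its rules from the 10/66 battery.

    RULES = [
        # (condition, label): the first matching rule fires, as in a RIPPER
        # rule list; anything unmatched falls through to the default rule.
        (lambda p: p["recall_score"] < 4 and p["verbal_fluency"] < 8,
         "dementia"),
        (lambda p: p["cogscore"] < 20, "dementia"),
    ]

    def classify(participant):
        for condition, label in RULES:
            if condition(participant):
                return label
        return "no dementia"  # default rule

    print(classify({"recall_score": 3, "verbal_fluency": 5, "cogscore": 25}))
    print(classify({"recall_score": 7, "verbal_fluency": 12, "cogscore": 28}))
    ```

    Because each rule reads only a few features, a learned rule list directly identifies a minimal subset of tests needed for classification, which is the property the study exploits.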

  12. The Triton: Design concepts and methods

    NASA Technical Reports Server (NTRS)

    Meholic, Greg; Singer, Michael; Vanryn, Percy; Brown, Rhonda; Tella, Gustavo; Harvey, Bob

    1992-01-01

    During the design of the C & P Aerospace Triton, a few problems were encountered that necessitated changes in the configuration. After the initial concept phase, the aspect ratio was increased from 7 to 7.6 to produce a greater lift to drag ratio (L/D = 13) which satisfied the horsepower requirements (118 hp using the Lycoming O-235 engine). The initial concept had a wing planform area of 134 sq. ft. Detailed wing sizing analysis enlarged the planform area to 150 sq. ft., without changing its layout or location. The most significant changes, however, were made just prior to inboard profile design. The fuselage external diameter was reduced from 54 to 50 inches to reduce drag to meet the desired cruise speed of 120 knots. Also, the nose was extended 6 inches to accommodate landing gear placement. Without the extension, the nosewheel received an unacceptable percentage (25 percent) of the landing weight. The final change in the configuration was made in accordance with the stability and control analysis. In order to reduce the static margin from 20 to 13 percent, the horizontal tail area was reduced from 32.02 to 25.0 sq. ft. The Triton meets all the specifications set forth in the design criteria. If time permitted another iteration of the calculations, two significant changes would be made. The vertical stabilizer area would be reduced to decrease the aircraft lateral stability slope since the current value was too high in relation to the directional stability slope. Also, the aileron size would be decreased to reduce the roll rate below the current 106 deg/second. Doing so would allow greater flap area (increasing CL(sub max)) and thus reduce the overall wing area. C & P would also recalculate the horsepower and drag values to further validate the 120 knot cruising speed.

  13. Global Dissemination of Carbapenemase-Producing Klebsiella pneumoniae: Epidemiology, Genetic Context, Treatment Options, and Detection Methods

    PubMed Central

    Lee, Chang-Ro; Lee, Jung Hun; Park, Kwang Seung; Kim, Young Bae; Jeong, Byeong Chul; Lee, Sang Hee

    2016-01-01

    The emergence of carbapenem-resistant Gram-negative pathogens poses a serious threat to public health worldwide. In particular, the increasing prevalence of carbapenem-resistant Klebsiella pneumoniae is a major source of concern. K. pneumoniae carbapenemases (KPCs) and carbapenemases of the oxacillinase-48 (OXA-48) type have been reported worldwide. New Delhi metallo-β-lactamase (NDM) carbapenemases were originally identified in Sweden in 2008 and have spread worldwide rapidly. In this review, we summarize the epidemiology of K. pneumoniae producing three carbapenemases (KPCs, NDMs, and OXA-48-like). Although the prevalence of each resistant strain varies geographically, K. pneumoniae producing KPCs, NDMs, and OXA-48-like carbapenemases have become rapidly disseminated. In addition, we used recently published molecular and genetic studies to analyze the mechanisms by which these three carbapenemases, and major K. pneumoniae clones, such as ST258 and ST11, have become globally prevalent. Because carbapenemase-producing K. pneumoniae are often resistant to most β-lactam antibiotics and many other non-β-lactam molecules, the therapeutic options available to treat infection with these strains are limited to colistin, polymyxin B, fosfomycin, tigecycline, and selected aminoglycosides. Although combination therapy has been recommended for the treatment of severe carbapenemase-producing K. pneumoniae infections, the clinical evidence for this strategy is currently limited, and more accurate randomized controlled trials will be required to establish the most effective treatment regimen. Moreover, because rapid and accurate identification of the carbapenemase type found in K. pneumoniae may be difficult to achieve through phenotypic antibiotic susceptibility tests, novel molecular detection techniques are currently being developed. PMID:27379038

  14. The causal pie model: an epidemiological method applied to evolutionary biology and ecology.

    PubMed

    Wensink, Maarten; Westendorp, Rudi G J; Baudisch, Annette

    2014-05-01

    A general concept for thinking about causality facilitates swift comprehension of results, and the vocabulary that belongs to the concept is instrumental in cross-disciplinary communication. The causal pie model has fulfilled this role in epidemiology and could be of similar value in evolutionary biology and ecology. In the causal pie model, outcomes result from sufficient causes. Each sufficient cause is made up of a "causal pie" of "component causes". Several different causal pies may exist for the same outcome. If and only if all component causes of a sufficient cause are present, that is, a causal pie is complete, does the outcome occur. The effect of a component cause hence depends on the presence of the other component causes that constitute some causal pie. Because all component causes are equally and fully causative for the outcome, the sum of causes for some outcome exceeds 100%. The causal pie model provides a way of thinking that maps into a number of recurrent themes in evolutionary biology and ecology: It charts when component causes have an effect and are subject to natural selection, and how component causes affect selection on other component causes; which partitions of outcomes with respect to causes are feasible and useful; and how to view the composition of a(n apparently homogeneous) population. The diversity of specific results that is directly understood from the causal pie model is a test for both the validity and the applicability of the model. The causal pie model provides a common language in which results across disciplines can be communicated and serves as a template along which future causal analyses can be made.
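    The sufficient-component-cause logic described above is compact enough to encode directly: the outcome occurs if and only if at least one causal pie is complete. A minimal sketch (the component labels and pies are invented for illustration):

    ```python
    # Minimal encoding of the sufficient-component-cause ("causal pie")
    # logic: the outcome occurs iff at least one causal pie is complete.
    # The component labels and pies below are invented for illustration.

    PIES = [
        {"A", "B"},       # sufficient cause 1: components A and B
        {"A", "C", "D"},  # sufficient cause 2: components A, C and D
    ]

    def outcome_occurs(present, pies=PIES):
        """True iff every component of some sufficient cause is present."""
        return any(pie <= set(present) for pie in pies)

    print(outcome_occurs({"A", "B"}))       # -> True  (pie 1 complete)
    print(outcome_occurs({"A", "C"}))       # -> False (no pie complete)
    print(outcome_occurs({"A", "C", "D"}))  # -> True  (pie 2 complete)
    ```

    The model's key point falls out directly: component C has an effect only when A and D are also present, i.e. a component cause is causative only within a complete pie.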

  15. The causal pie model: an epidemiological method applied to evolutionary biology and ecology

    PubMed Central

    Wensink, Maarten; Westendorp, Rudi G J; Baudisch, Annette

    2014-01-01

    A general concept for thinking about causality facilitates swift comprehension of results, and the vocabulary that belongs to the concept is instrumental in cross-disciplinary communication. The causal pie model has fulfilled this role in epidemiology and could be of similar value in evolutionary biology and ecology. In the causal pie model, outcomes result from sufficient causes. Each sufficient cause is made up of a “causal pie” of “component causes”. Several different causal pies may exist for the same outcome. If and only if all component causes of a sufficient cause are present, that is, a causal pie is complete, does the outcome occur. The effect of a component cause hence depends on the presence of the other component causes that constitute some causal pie. Because all component causes are equally and fully causative for the outcome, the sum of causes for some outcome exceeds 100%. The causal pie model provides a way of thinking that maps into a number of recurrent themes in evolutionary biology and ecology: It charts when component causes have an effect and are subject to natural selection, and how component causes affect selection on other component causes; which partitions of outcomes with respect to causes are feasible and useful; and how to view the composition of a(n apparently homogeneous) population. The diversity of specific results that is directly understood from the causal pie model is a test for both the validity and the applicability of the model. The causal pie model provides a common language in which results across disciplines can be communicated and serves as a template along which future causal analyses can be made. PMID:24963386

  16. Optimization and Application of Direct Infusion Nanoelectrospray HRMS Method for Large-Scale Urinary Metabolic Phenotyping in Molecular Epidemiology

    PubMed Central

    2017-01-01

    Large-scale metabolic profiling requires the development of novel economical high-throughput analytical methods to facilitate characterization of systemic metabolic variation in population phenotypes. We report a fit-for-purpose direct infusion nanoelectrospray high-resolution mass spectrometry (DI-nESI-HRMS) method with time-of-flight detection for rapid targeted parallel analysis of over 40 urinary metabolites. The newly developed 2 min infusion method requires <10 μL of urine sample and generates high-resolution MS profiles in both positive and negative polarities, enabling further data mining and relative quantification of hundreds of metabolites. Here we present optimization of the DI-nESI-HRMS method in a detailed step-by-step guide and provide a workflow with rigorous quality assessment for large-scale studies. We demonstrate for the first time the application of the method for urinary metabolic profiling in human epidemiological investigations. Implementation of the presented DI-nESI-HRMS method enabled cost-efficient analysis of >10 000 24 h urine samples from the INTERMAP study in 12 weeks and >2200 spot urine samples from the ARIC study in <3 weeks with the required sensitivity and accuracy. We illustrate the application of the technique by characterizing the differences in metabolic phenotypes of the USA and Japanese population from the INTERMAP study. PMID:28245357

  17. Descriptive and analytic epidemiology. Bridges to cancer control

    SciTech Connect

    Mettlin, C.

    1988-10-15

    Epidemiology serves as a bridge between basic science and cancer control. The two major orientations of epidemiology are descriptive and analytic. The former is useful in assessing the scope and dimensions of the cancer problem, and the latter is used to assess environmental and lifestyle sources of cancer risk. A recent development in descriptive epidemiology is the use of functional measures of disease such as lost life expectancy. In analytical epidemiology, there is new or renewed interest in several lifestyle factors, including diet and exercise, as well as environmental factors such as involuntary tobacco exposure and radon in dwellings. Review of the evidence should consider the strengths and weaknesses of different research procedures. Each method is inconclusive by itself, but the different research designs of epidemiology collectively may represent a hierarchy of proof. Although the roles of many factors remain to be defined, the aggregate epidemiologic data continue to demonstrate the special importance of personal behavior and lifestyle in affecting cancer risk.

  18. Research and Methods for Simulation Design: State of the Art

    DTIC Science & Technology

    1990-09-01

    designers. Designers may use this review to identify methods to aid the training-device design process, and individuals who manage research programs... maximum training effectiveness at a given cost. The methods should apply to the concept-formulation phase of the training-device development process... design process. Finally, individuals who manage research programs may use this information to set priorities for future research efforts.

  19. How to Construct a Mixed Methods Research Design.

    PubMed

    Schoonenboom, Judith; Johnson, R Burke

    2017-01-01

    This article provides researchers with knowledge of how to design a high quality mixed methods research study. To design a mixed study, researchers must understand and carefully consider each of the dimensions of mixed methods design, and always keep an eye on the issue of validity. We explain the seven major design dimensions: purpose, theoretical drive, timing (simultaneity and dependency), point of integration, typological versus interactive design approaches, planned versus emergent design, and design complexity. There also are multiple secondary dimensions that need to be considered during the design process. We explain ten secondary dimensions of design to be considered for each research study. We also provide two case studies showing how the mixed designs were constructed.

  20. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
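    The statistical model checking idea above can be sketched with Monte Carlo simulation: repeatedly run a stochastic epidemic model and estimate the probability that a property holds. The chain-binomial SIR model, the property "peak infected stays below a threshold", and all parameters below are illustrative; a probabilistic model checker would instead accept the property in a formal spatio-temporal logic:

    ```python
    import random

    random.seed(42)

    # Sketch of statistical model checking for an epidemiological model:
    # Monte Carlo estimation of the probability that a stochastic chain-
    # binomial SIR model satisfies "peak infected stays below a threshold".
    # All parameters are illustrative.

    def run_sir(n=200, i0=2, beta=0.3, gamma=0.1, steps=100):
        """One stochastic trajectory; returns the peak infected count."""
        s, i, peak = n - i0, i0, i0
        for _ in range(steps):
            new_inf = sum(random.random() < beta * i / n for _ in range(s))
            new_rec = sum(random.random() < gamma for _ in range(i))
            s, i = s - new_inf, i + new_inf - new_rec
            peak = max(peak, i)
        return peak

    def prob_peak_below(threshold=100, trials=200):
        """Monte Carlo estimate of P(peak infected < threshold)."""
        return sum(run_sir() < threshold for _ in range(trials)) / trials

    print(prob_peak_below())
    ```

    Sequential statistical model checking methods (e.g. SPRT-based) refine this by choosing the number of trials adaptively to meet a requested error bound, which is what makes the approach tractable for complex models.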

  1. Design Features of Explicit Values Clarification Methods: A Systematic Review.

    PubMed

    Witteman, Holly O; Scherer, Laura D; Gavaruzzi, Teresa; Pieterse, Arwen H; Fuhrel-Forbis, Andrea; Chipenda Dansokho, Selma; Exe, Nicole; Kahn, Valerie C; Feldman-Stewart, Deb; Col, Nananda F; Turgeon, Alexis F; Fagerlin, Angela

    2016-05-01

    Values clarification is a recommended element of patient decision aids. Many different values clarification methods exist, but there is little evidence synthesis available to guide design decisions. To describe practices in the field of explicit values clarification methods according to a taxonomy of design features. MEDLINE, all EBM Reviews, CINAHL, EMBASE, Google Scholar, manual search of reference lists, and expert contacts. Articles were included if they described 1 or more explicit values clarification methods. We extracted data about decisions addressed; use of theories, frameworks, and guidelines; and 12 design features. We identified 110 articles describing 98 explicit values clarification methods. Most of these addressed decisions in cancer or reproductive health, and half addressed a decision between just 2 options. Most used neither theory nor guidelines to structure their design. "Pros and cons" was the most common type of values clarification method. Most methods did not allow users to add their own concerns. Few methods explicitly presented tradeoffs inherent in the decision, supported an iterative process of values exploration, or showed how different options aligned with users' values. Study selection criteria and choice of elements for the taxonomy may have excluded values clarification methods or design features. Explicit values clarification methods have diverse designs but can be systematically cataloged within the structure of a taxonomy. Developers of values clarification methods should carefully consider each of the design features in this taxonomy and publish adequate descriptions of their designs. More research is needed to study the effects of different design features. © The Author(s) 2016.

  2. Meta-epidemiology.

    PubMed

    Bae, Jong-Myon

    2014-01-01

    The concept of meta-epidemiology was introduced in response to the methodological limitations of systematic reviews of intervention trials. The paradigm of meta-epidemiology has shifted from a statistical method to a new methodology for closing gaps between evidence and practice. The main interest of meta-epidemiology is to control potential biases in previous quantitative systematic reviews and to draw appropriate evidence for establishing evidence-based guidelines. More recently, network meta-epidemiology has been suggested in order to overcome some limitations of meta-epidemiology. To promote meta-epidemiologic studies, tools for assessing risk of bias and reporting guidelines such as the Consolidated Standards for Reporting Trials (CONSORT) should be implemented.

  3. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

    It is widely accepted that design features are one of the most attractive integration methods for most fields of engineering activity, such as design modelling, process planning or production scheduling. One of the most important tasks realized in the integration of design and planning functions is design translation, meant as the mapping of design data into data that matters from the process planning point of view, i.e. manufacturing data. A design geometrical shape translation process can be realized with one of the following strategies: (i) designing with a previously prepared design features library, also known as the DBF (design by feature) method, (ii) interactive design features recognition (IFR), (iii) automatic design features recognition (AFR). In the DBF method, the design geometrical shape is created with design features. There are two basic approaches to design modelling in the DBF method: the classic approach, in which a part design is modelled from beginning to end with design features previously stored in a design features database, and the hybrid approach, in which a part is partially created with standard predefined CAD system tools and the rest with suitable design features. Automatic feature recognition consists in an autonomous search of a product model, represented with a specific design representation method, for those model features which might potentially be recognized as design features, manufacturing features, etc. This approach requires a searching algorithm to be prepared; the algorithm should allow the whole recognition process to be carried out without user supervision. Currently there are many AFR methods. These methods most often require the product model to be represented with a B-Rep representation, rarely CSG, and very rarely wireframe. In the IFR method, potential features are recognized by a user, most often by pointing out those surfaces which seem to belong to a

  4. Epidemiologic methods lessons learned from environmental public health disasters: Chernobyl, the World Trade Center, Bhopal, and Graniteville, South Carolina.

    PubMed

    Svendsen, Erik R; Runkle, Jennifer R; Dhara, Venkata Ramana; Lin, Shao; Naboka, Marina; Mousseau, Timothy A; Bennett, Charles

    2012-08-01

    Environmental public health disasters involving hazardous contaminants may have devastating effects. While much is known about their immediate devastation, far less is known about the long-term impacts of these disasters. Extensive latent and chronic long-term public health effects may occur. Careful evaluation of contaminant exposures and long-term health outcomes, within the constraints imposed by limited financial resources, is essential. Here, we review epidemiologic methods lessons learned from conducting long-term evaluations of four environmental public health disasters involving hazardous contaminants: Chernobyl, the World Trade Center, Bhopal, and Graniteville (South Carolina, USA). We found several lessons with direct implications for the ongoing disaster recovery work following the Fukushima radiation disaster and for future disasters. These lessons should prove useful in understanding and mitigating latent health effects that may result from the nuclear reactor accident in Japan or future environmental public health disasters.

  5. The potential of the case-control method for rapid epidemiological assessment.

    PubMed

    Baltazar, J C

    1991-01-01

    Over the past few decades, the case-control method has mostly been applied to risk-factor studies of chronic diseases. Among its more recent applications is the study of the health effects of improvements in sanitation and water supply. The methodological considerations, prospects and constraints of the method for rapid assessment are reviewed.
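    The core computation behind the case-control method is the odds ratio estimated from a 2×2 exposure-by-disease table. As a minimal illustration (the function and numbers below are hypothetical, not from the paper):

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio from a 2x2 case-control table: cross-product ratio."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical rapid-assessment data: diarrhoea cases vs. controls,
# classified by access to an improved water supply ("exposed").
or_estimate = odds_ratio(exposed_cases=40, unexposed_cases=60,
                         exposed_controls=70, unexposed_controls=30)
print(round(or_estimate, 2))  # → 0.29, i.e. improved supply appears protective
```

    An odds ratio below 1 in such a study would suggest the sanitation improvement is associated with reduced disease odds, subject to the usual confounding caveats.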

  6. Design Methods and Optimization for Morphing Aircraft

    NASA Technical Reports Server (NTRS)

    Crossley, William A.

    2005-01-01

    This report provides a summary of accomplishments made during this research effort. The major accomplishments are in three areas. The first is the use of a multiobjective optimization strategy to help identify potential morphing features; it uses an existing aircraft sizing code to predict the weight, size and performance of several fixed-geometry aircraft that are Pareto-optimal with respect to two competing aircraft performance objectives. The second area, titled "morphing as an independent variable", formulates the sizing of a morphing aircraft as an optimization problem in which the amounts of geometric morphing for various aircraft parameters are included as design variables; this second effort consumed most of the overall effort on the project. The third area involved a more detailed sizing study of a commercial transport aircraft that would incorporate a morphing wing, possibly to enable transatlantic point-to-point passenger service.
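    The first area relies on identifying Pareto-optimal designs among fixed-geometry candidates. A generic non-dominated filter for two minimization objectives might look like this (an illustrative sketch with made-up design scores, not the sizing code used in the project):

```python
def pareto_front(designs):
    """Return names of designs not dominated on two minimization objectives.

    Each design is a (name, f1, f2) tuple; design X dominates Y if X is
    no worse on both objectives and strictly better on at least one.
    """
    front = []
    for name, f1, f2 in designs:
        dominated = any(g1 <= f1 and g2 <= f2 and (g1 < f1 or g2 < f2)
                        for _, g1, g2 in designs)
        if not dominated:
            front.append(name)
    return front

# Hypothetical fixed-geometry aircraft scored on two competing objectives
# (e.g. fuel burn vs. field length, both to be minimized).
candidates = [("A", 1.0, 5.0), ("B", 2.0, 3.0), ("C", 4.0, 1.0), ("D", 3.0, 4.0)]
print(pareto_front(candidates))  # → ['A', 'B', 'C']  (D is dominated by B)
```

    Designs on the front represent the best achievable trade-offs; a morphing aircraft would aim to cover several points of this front with a single vehicle.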

  7. Preliminary design method for deployable spacecraft beams

    NASA Technical Reports Server (NTRS)

    Mikulas, Martin M., Jr.; Cassapakis, Costas

    1995-01-01

    There is currently considerable interest in low-cost, lightweight, compactly packageable deployable elements for various future missions involving small spacecraft. These elements must also have a simple and reliable deployment scheme and possess zero or very small free-play. Although most small spacecraft do not experience large disturbances, very low-stiffness appendages or free-play can couple with even small disturbances and lead to unacceptably large attitude errors, which may require the introduction of a flexible-body control system. A class of structures referred to as 'rigidized structures' offers significant promise in providing deployable elements that will meet these needs for small spacecraft. The purpose of this paper is to introduce several rigidizable concepts and to develop a design methodology which permits a rational comparison of these elements with alternate concepts.

  8. Method for designing and controlling compliant gripper

    NASA Astrophysics Data System (ADS)

    Spanu, A. R.; Besnea, D.; Avram, M.; Ciobanu, R.

    2016-08-01

    Compliant grippers are useful for high-accuracy grasping of small objects, with adaptive control of contact points along the active surfaces of the fingers. With the development of MEMS, spatial trajectories of the elements become a must. The paper presents the solution for the compliant gripper designed by the authors, so both planar and spatial movements are discussed. At the beginning of the process, the gripper works as a passive one, up to the moment when it has to reach the object surface; the forces applied by the elements have to avoid damaging the object. As part of the system, a camera takes a picture of the object in order to facilitate the positioning of the system. Once contact is established, the mechanism acts as an active gripper driven by an electric stepper motor with controlled movement.

  9. A flexible layout design method for passive micromixers.

    PubMed

    Deng, Yongbo; Liu, Zhenyu; Zhang, Ping; Liu, Yongshun; Gao, Qingyong; Wu, Yihui

    2012-10-01

    This paper discusses a flexible layout design method for passive micromixers based on the topology optimization of fluidic flows. Unlike the trial-and-error method, this method obtains the detailed layout of a passive micromixer according to the desired mixing performance by solving a topology optimization problem. Therefore, the dependence on the experience of the designer is weakened when this method is used to design a passive micromixer with acceptable mixing performance. Several design disciplines for passive micromixers are considered to demonstrate the flexibility of the layout design method. These design disciplines include the approximation of the real 3D micromixer, manufacturing feasibility, the spatial periodic design, and the effects of the Péclet number and Reynolds number on the designs obtained by this layout design method. The capability of this design method is validated by several comparisons between the obtained layouts and the optimized designs in recently published literature, where the value of the mixing measurement is improved by up to 40.4% for one cycle of the micromixer.

  10. Comparison of Traditional Design Nonlinear Programming Optimization and Stochastic Methods for Structural Design

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

    Structural designs generated by the traditional method, the optimization method, and the stochastic design concept are compared. In the traditional method, the constraints are manipulated to obtain the design, and the weight is back-calculated. In design optimization, the weight of a structure becomes the merit function, with constraints imposed on failure modes, and an optimization algorithm is used to generate the solution. The stochastic design concept accounts for uncertainties in loads, material properties, and other parameters, and the solution is obtained by solving a design optimization problem for a specified reliability. Acceptable solutions were produced by all three methods. The variation in the weight calculated by the methods was modest, and some variation was noticed in the designs themselves, which may be attributed to structural indeterminacy. It is prudent to develop a design by all three methods prior to fabrication. The traditional design method can be improved when simplified sensitivities of the behavior constraints are used; such sensitivities can reduce design calculations and may have the potential to unify the traditional and optimization methods. Weight versus reliability traced out an inverted-S-shaped graph, the center of which corresponded to the mean-valued design. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, while weight can be reduced to a small value for a most failure-prone design. Probabilistic modeling of loads and material properties remained a challenge.
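    The contrast between the traditional approach (satisfy the constraint, then back-calculate weight) and the optimization view (minimize weight subject to the constraint) can be illustrated on a single tension bar. This is an assumed toy problem with made-up load and material values, not one of the structures studied in the report:

```python
# Toy problem: size the cross-sectional area A of a steel tension bar.
LOAD = 50_000.0        # applied axial load, N (assumed)
ALLOWABLE = 250.0e6    # allowable stress, Pa (assumed)
LENGTH = 2.0           # bar length, m
DENSITY = 7850.0       # steel density, kg/m^3

# Traditional method: manipulate the stress constraint directly
# (stress = LOAD / A <= ALLOWABLE), then back-calculate the weight.
area_traditional = LOAD / ALLOWABLE
weight_traditional = DENSITY * area_traditional * LENGTH

# Optimization view: minimize weight subject to the stress constraint.
# For this monotone one-variable problem, bisection on the active
# constraint recovers the same design as the traditional method.
lo, hi = 1e-8, 1e-2
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if LOAD / mid > ALLOWABLE:   # constraint violated -> need more area
        lo = mid
    else:
        hi = mid
weight_optimized = DENSITY * hi * LENGTH

print(abs(weight_traditional - weight_optimized) < 1e-3)  # → True
```

    In a statically indeterminate structure with many members the two routes can diverge slightly, which is consistent with the design variation the abstract attributes to structural indeterminacy.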

  11. [Epidemiology of head traumas. "Barcelona" data base. Objectives, design and analysis of 584 cases].

    PubMed

    Vilalta, J; Bosch, J; Castaño, C H; Poca, M A; Rubio, E; Godet, C; Puig, G

    1992-01-01

    higher in patients with severe CET (74.6, 64, and 47% in severe, moderate, and slight CET, respectively). The main lesions were: acute subdural hematoma, 72 (12.3%); cerebral contusion, 207 (35.4%); epidural hematoma, 88 (15%); normal computerized tomography/subarachnoid hemorrhage, 87 (14.8%); swelling, 17 (2.9%); diffuse axonal injury, 74 (12.6%); and the remaining 39 (6.6%) had other lesions such as hydrocephalus, fracture-sinking, etc. Mortality was 44.2, 12.2, and 3.7%, respectively, in severe, moderate, and slight CET. The data base may contribute to establishing the prognosis of CET and to determining the efficacy of therapeutic procedures as well as that of diagnostic and investigational methods.

  12. Studies on Aircraft Conceptual Design Incorporating Boundary Element Method for University Design Education

    NASA Astrophysics Data System (ADS)

    Kawai, Toshiyuki; Rinoie, Kenichi

    The aircraft conceptual design method currently used for university design education mainly utilises empirical values based on a statistical database to determine the main design parameters. Therefore, it is often difficult for students to understand the effects of aerodynamic parameters, such as wing aspect ratio and taper ratio, during the design process. In this paper, a conceptual design method that incorporates a boundary element method is discussed, so that aerodynamic characteristic estimations are possible and students can easily comprehend the effects of aerodynamic parameters while designing the airplane. A single-engine light airplane has been designed by the present conceptual design method. The results obtained by the present method and those by the conventional method are compared and discussed.

  13. Conceptual design of clean processes: Tools and methods

    SciTech Connect

    Hurme, M.

    1996-12-31

    Design tools available for implementing clean design in practice are discussed. The application areas, together with methods for comparing clean process alternatives, are presented. Environmental principles are becoming increasingly important over the whole life cycle of products, from design, manufacturing and marketing to disposal. The hindrance to implementing clean technology in design has been the necessity of applying it in all phases of design from the very beginning, since it concerns the major selections made in conceptual process design. Therefore both a modified design approach and new tools are needed to make the application of clean technology practical. The first topic, extended process design methodologies, has been presented by Hurme, Douglas, Rossiter and Klee, and Hilaly and Sikdar. The aim of this paper is to discuss the latter topic: the process design tools which assist in implementing clean principles in process design. 22 refs., 2 tabs.

  14. Analytical techniques for instrument design - matrix methods

    SciTech Connect

    Robinson, R.A.

    1997-09-01

    We take the traditional Cooper-Nathans approach, as has been applied for many years to steady-state triple-axis spectrometers, and consider its generalisation to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, we discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix: diagonalisation (Moller-Nielsen method), coordinate changes (e.g. from (Δk_I, Δk_F) to (ΔE, ΔQ) plus 2 dummy variables), integration of one or more variables (e.g. over such dummy variables), integration subject to linear constraints (e.g. Bragg's Law for analysers), inversion to give the variance-covariance matrix, and so on. We show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the Gaussian approximation. We argue that a generalised program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. We also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question.
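    Two of the toolbox operations, integrating a variable out of an exponential of a quadratic form and inverting the matrix to obtain the variance-covariance matrix, reduce to standard Gaussian identities. A minimal sketch (a hypothetical 2×2 resolution matrix, not real spectrometer parameters):

```python
import numpy as np

# A (hypothetical) 2x2 resolution matrix M defining the Gaussian
# transmission function exp(-0.5 * x^T M x).
M = np.array([[4.0, 1.0],
              [1.0, 2.0]])

# Inversion gives the variance-covariance matrix of the resolution.
C = np.linalg.inv(M)

# Integrating variable x1 out of the quadratic form leaves a 1-D Gaussian
# whose variance is simply the corresponding diagonal element of C.
marginal_variance_x0 = C[0, 0]

# Equivalent direct route: complete the square in x1 before integrating,
# which reduces the form to M'[0,0] = M[0,0] - M[0,1]^2 / M[1,1].
reduced = M[0, 0] - M[0, 1] ** 2 / M[1, 1]
print(np.isclose(marginal_variance_x0, 1.0 / reduced))  # → True
```

    The same identities carry over to the full 6-dimensional case, which is why the whole toolbox can be built from a handful of linear-algebra primitives.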

  15. A Review of the Epidemiological Methods Used to Investigate the Health Impacts of Air Pollution around Major Industrial Areas

    PubMed Central

    Pascal, Laurence; Bidondo, Marie-Laure; Cochet, Amandine; Sarter, Hélène; Stempfelet, Morgane; Wagner, Vérène

    2013-01-01

    We performed a literature review to investigate how epidemiological studies have been used to assess the health consequences of living in the vicinity of industries. 77 papers on the chronic effects of air pollution around major industrial areas were reviewed. Major health themes were cancers (27 studies), morbidity (25 studies), mortality (7 studies), and birth outcome (7 studies). Only 3 studies investigated mental health. While studies were available from many different countries, a majority of papers came from the United Kingdom, Italy, and Spain. Several studies were motivated by concerns from the population or by previous observations of an overincidence of cases. Geographical ecological designs were largely used for studying cancer and mortality, including statistical designs to quantify a relationship between health indicators and exposure. Morbidity was frequently investigated through cross-sectional surveys on the respiratory health of children. Few multicenter studies were performed. In a majority of papers, exposed areas were defined based on the distance to the industry and were located from <2 km to >20 km from the plants. Improving the exposure assessment would be an asset to future studies. Criteria to include industries in multicenter studies should be defined. PMID:23818910

  16. A review of the epidemiological methods used to investigate the health impacts of air pollution around major industrial areas.

    PubMed

    Pascal, Mathilde; Pascal, Laurence; Bidondo, Marie-Laure; Cochet, Amandine; Sarter, Hélène; Stempfelet, Morgane; Wagner, Vérène

    2013-01-01

    We performed a literature review to investigate how epidemiological studies have been used to assess the health consequences of living in the vicinity of industries. 77 papers on the chronic effects of air pollution around major industrial areas were reviewed. Major health themes were cancers (27 studies), morbidity (25 studies), mortality (7 studies), and birth outcome (7 studies). Only 3 studies investigated mental health. While studies were available from many different countries, a majority of papers came from the United Kingdom, Italy, and Spain. Several studies were motivated by concerns from the population or by previous observations of an overincidence of cases. Geographical ecological designs were largely used for studying cancer and mortality, including statistical designs to quantify a relationship between health indicators and exposure. Morbidity was frequently investigated through cross-sectional surveys on the respiratory health of children. Few multicenter studies were performed. In a majority of papers, exposed areas were defined based on the distance to the industry and were located from <2 km to >20 km from the plants. Improving the exposure assessment would be an asset to future studies. Criteria to include industries in multicenter studies should be defined.

  17. MEASUREMENT ERROR ESTIMATION AND CORRECTION METHODS TO MINIMIZE EXPOSURE MISCLASSIFICATION IN EPIDEMIOLOGICAL STUDIES: PROJECT SUMMARY

    EPA Science Inventory

    This project summary highlights recent findings from research undertaken to develop improved methods to assess potential human health risks related to drinking water disinfection byproduct (DBP) exposures.

  18. MEASUREMENT ERROR ESTIMATION AND CORRECTION METHODS TO MINIMIZE EXPOSURE MISCLASSIFICATION IN EPIDEMIOLOGICAL STUDIES: PROJECT SUMMARY

    EPA Science Inventory

    This project summary highlights recent findings from research undertaken to develop improved methods to assess potential human health risks related to drinking water disinfection byproduct (DBP) exposures.

  19. HEALTHY study rationale, design and methods

    PubMed Central

    2009-01-01

    The HEALTHY primary prevention trial was designed and implemented in response to the growing numbers of children and adolescents being diagnosed with type 2 diabetes. The objective was to moderate risk factors for type 2 diabetes. Modifiable risk factors measured were indicators of adiposity and glycemic dysregulation: body mass index ≥85th percentile, fasting glucose ≥5.55 mmol/l (100 mg per 100 ml) and fasting insulin ≥180 pmol/l (30 μU/ml). A series of pilot studies established the feasibility of performing data collection procedures and tested the development of an intervention consisting of four integrated components: (1) changes in the quantity and nutritional quality of food and beverage offerings throughout the total school food environment; (2) physical education class lesson plans and accompanying equipment to increase both participation and number of minutes spent in moderate-to-vigorous physical activity; (3) brief classroom activities and family outreach vehicles to increase knowledge, enhance decision-making skills and support and reinforce youth in accomplishing goals; and (4) communications and social marketing strategies to enhance and promote changes through messages, images, events and activities. Expert study staff provided training, assistance, materials and guidance for school faculty and staff to implement the intervention components. A cohort of students was enrolled in sixth grade and followed to the end of eighth grade. They attended health screening data collections at baseline and end of study that involved measurement of height, weight, blood pressure, waist circumference and a fasting blood draw. Height and weight were also collected at the end of seventh grade. The study was conducted in 42 middle schools, six at each of seven locations across the country, with 21 schools randomized to receive the intervention and 21 to act as controls (data collection activities only). Middle school was the unit of sample size and

  20. The Epidemiology of Substance Use Disorders in US Veterans: A Systematic Review and Analysis of Assessment Methods

    PubMed Central

    Lan, Chiao-Wen; Fiellin, David A.; Barry, Declan T.; Bryant, Kendall J.; Gordon, Adam J.; Edelman, E. Jennifer; Gaither, Julie R.; Maisto, Stephen A.; Marshall, Brandon D.L.

    2016-01-01

    Background: Substance use disorders (SUDs), which encompass alcohol and drug use disorders (AUDs, DUDs), constitute a major public health challenge among US veterans. SUDs are among the most common and costly of all health conditions among veterans. Objectives: This study sought to examine the epidemiology of SUDs among US veterans, compare the prevalence of SUDs in studies using diagnostic versus administrative criteria as assessment methods, and summarize trends in the prevalence of SUDs reported in studies sampling US veterans over time. Methods: Comprehensive electronic database searches were conducted, identifying a total of 3,490 studies. We analyzed studies that sampled US veterans and reported the prevalence and distribution of AUDs and DUDs. Results: Of the studies identified, 72 met inclusion criteria; these were published between 1995 and 2013. Studies using diagnostic criteria reported a higher prevalence of AUDs (32% vs. 10%) and DUDs (20% vs. 5%) than studies using administrative criteria. Regardless of assessment method, both the lifetime and past-year prevalence of AUDs in studies sampling US veterans has declined gradually over time. Conclusion: The prevalence of SUDs reported in studies sampling US veterans is affected by assessment method. Given the significant public health problems of SUDs among US veterans, improved guidelines for clinical screening using validated diagnostic criteria to assess AUDs and DUDs in US veteran populations are needed. Scientific Significance: These findings may inform the VA and other healthcare systems in prevention, diagnosis, and intervention for SUDs among US veterans. PMID:26693830

  1. System and method of designing models in a feedback loop

    DOEpatents

    Gosink, Luke C.; Pulsipher, Trenton C.; Sego, Landon H.

    2017-02-14

    A method and system for designing models is disclosed. The method includes selecting a plurality of models for modeling a common event of interest. The method further includes aggregating the results of the models and analyzing each model compared to the aggregate result to obtain comparative information. The method also includes providing the information back to the plurality of models to design more accurate models through a feedback loop.
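    The loop described, aggregate the models' outputs, compare each model to the aggregate, and feed the comparison back, can be sketched generically. This is one illustrative reading of the abstract with made-up numbers, not the patented implementation:

```python
def feedback_round(predictions):
    """One feedback iteration: aggregate model outputs (here, a simple
    mean) and score each model by its deviation from the aggregate."""
    aggregate = sum(predictions.values()) / len(predictions)
    comparative = {name: pred - aggregate for name, pred in predictions.items()}
    return aggregate, comparative

# Three hypothetical models predicting the same event of interest.
preds = {"model_a": 10.0, "model_b": 12.0, "model_c": 14.0}
aggregate, deviations = feedback_round(preds)
print(aggregate)              # → 12.0
print(deviations["model_c"])  # → 2.0
```

    The comparative information (here, signed deviations) is what would be fed back to each model so its parameters can be adjusted in the next round.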

  2. [Cost analysis of rapid methods for diagnosis of multidrug resistant tuberculosis in different epidemiologic groups in Perú].

    PubMed

    Solari, Lely; Gutiérrez, Alfonso; Suárez, Carmen; Jave, Oswaldo; Castillo, Edith; Yale, Gloria; Ascencios, Luis; Quispe, Neyda; Valencia, Eddy; Suárez, Víctor

    2011-01-01

    To evaluate the costs of three methods for the diagnosis of drug susceptibility in tuberculosis, and to compare the cost per case of multidrug-resistant tuberculosis (MDR TB) diagnosed with these (MODS, GRIESS and Genotype MTBDR plus®) in 4 epidemiologic groups in Peru. On the basis of programmatic figures, we divided the population into 4 groups: new cases from Lima/Callao, new cases from other provinces, previously treated patients from Lima/Callao, and previously treated patients from other provinces. We calculated the costs of each test with the standard methodology of the Ministry of Health, from the perspective of the health system. Finally, we calculated the cost per patient diagnosed with MDR TB for each epidemiologic group. The estimated costs per test for MODS, GRIESS, and Genotype MTBDR plus® were 14.83, 15.51 and 176.41 nuevos soles, respectively (the local currency; 1 nuevo sol = 0.36 US dollars as of August 2011). The cost per patient diagnosed with GRIESS and MODS was lower than 200 nuevos soles in 3 out of the 4 groups. The cost per diagnosed MDR TB case with Genotype MTBDR plus® was higher than 2,000 nuevos soles in the two groups of new patients, and lower than 1,000 nuevos soles in the group of previously treated patients. In high-prevalence groups, such as previously treated patients, the costs per MDR TB diagnosis with the 3 evaluated tests were low; nevertheless, the costs with the molecular test in the low-prevalence groups were high. The use of the molecular tests must be optimized in high-prevalence areas.
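    The cost per MDR TB case diagnosed scales inversely with the prevalence of resistance in the group screened, which is why the expensive molecular test fares better among previously treated patients. The arithmetic can be sketched as follows (costs per test are from the abstract; the prevalence values and perfect sensitivity are illustrative assumptions):

```python
def cost_per_case_detected(cost_per_test, prevalence, sensitivity=1.0):
    """Screening cost divided by expected cases found per person tested."""
    return cost_per_test / (prevalence * sensitivity)

# Costs per test in nuevos soles, as reported in the abstract.
costs = {"MODS": 14.83, "GRIESS": 15.51, "Genotype MTBDR plus": 176.41}

# Hypothetical MDR TB prevalences for two epidemiologic groups.
for group, prev in [("new cases", 0.05), ("previously treated", 0.25)]:
    for test, c in costs.items():
        print(group, test, round(cost_per_case_detected(c, prev), 1))
```

    With these assumed prevalences the molecular test costs over 3,500 nuevos soles per case found among new cases but roughly 700 among previously treated patients, matching the pattern the abstract reports.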

  3. The use of mathematical models in the epidemiological study of infectious diseases and in the design of mass immunization programmes.

    PubMed

    Nokes, D J; Anderson, R M

    1988-08-01

    The relationship between the number of people vaccinated against an infectious disease and the resulting decrease in incidence of the disease is not straightforward and linear, because many independent variables determine the course of infection. However, these variables are quantifiable and can therefore be used to model the course of an infectious disease and the impact of mass vaccination. Before one can construct a model, one must know, for any specific infectious disease, the number of individuals in the community protected by maternally derived antibodies, the number susceptible to infection, the number infected but not yet infectious (i.e., with latent infection), the number of infectious individuals, and the number of recovered (i.e., immune) individuals. Compartmental models are sets of differential equations which describe the rates of flow of individuals between these categories. Several major epidemiologic concepts comprise the ingredients of the model: the net rate of infection (i.e., incidence), the per capita rate of infection, the force of infection, and the basic reproductive rate of infection. When a community attains a high level of vaccination coverage, it is no longer necessary to vaccinate everyone, because the herd immunity of the population protects the unvaccinated by lowering the likelihood of their coming into contact with an infectious individual. Many infections that confer lasting immunity tend to have interepidemic periods when the number of susceptibles is too low to sustain an epidemic. Mass vaccination programs reduce the net rate of transmission of the infective organism; they also increase the length of the interepidemic period. Many diseases primarily associated with children have much more serious consequences in older people, and the question arises as to at what point childhood immunization will successfully prevent the more dangerous incidence of the disease in older cohorts. Mathematical models of disease transmission enable one
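    The compartmental structure described, susceptibles, latent infections, infectious individuals, and recovered individuals, with vaccination moving people directly into the immune class, can be sketched as a simple set of difference equations. This is a generic SEIR-with-vaccination toy model with assumed parameter values, not one of the models discussed in the paper:

```python
def seir_step(s, e, i, r, beta, sigma, gamma, p_vacc, births):
    """One time step of a simple SEIR model in population fractions,
    with a fraction p_vacc of newborns vaccinated straight into the
    recovered/immune class."""
    new_inf = beta * s * i            # force of infection acting on susceptibles
    s2 = s + births * (1 - p_vacc) - new_inf
    e2 = e + new_inf - sigma * e      # latent -> infectious
    i2 = i + sigma * e - gamma * i    # infectious -> recovered
    r2 = r + gamma * i + births * p_vacc
    return s2, e2, i2, r2

# Assumed illustrative parameters; compartments start as fractions summing to 1.
state = (0.9, 0.0, 0.1, 0.0)
for _ in range(50):
    state = seir_step(*state, beta=0.5, sigma=0.3, gamma=0.2,
                      p_vacc=0.8, births=0.001)
print(all(x >= 0 for x in state))  # → True
```

    A real model of the kind the paper reviews would add a maternal-antibody compartment and age structure, and would be written as differential rather than difference equations.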

  4. Invited commentary: do-it-yourself modern epidemiology--at last!

    PubMed

    Morabia, Alfredo

    2014-10-01

    In this issue of the Journal, Keyes and Galea (Am J Epidemiol. 2014;180(7):661-668) propose "7 foundational steps" for introducing epidemiologic methods and concepts to beginners. Keyes and Galea's credo is that the methodological and conceptual components that comprise epidemiology, today scattered across textbook chapters, come together as an integrated and coherent methodological corpus in the process of designing studies. Thus, they expound, the process of designing studies should be the core of teaching epidemiology. Two aspects of their 7-steps-to-epidemiology, do-it-yourself user manual stand out as novel: 1) the approach, because of its emphasis on modern epidemiology's causal framework of a dynamic population in a steady state evolving across time, and 2) the ambition to teach modern epidemiology in introductory courses, instead of the popular mix of classical and modern epidemiology that is often used today to keep introductory courses simple. Both aspects are of potentially great significance for our discipline.

  5. INFLUENCE OF EXPOSURE ASSESSMENT METHOD IN AN EPIDEMIOLOGIC STUDY OF TRIHALOMETHANE EXPOSURE AND SPONTANEOUS ABORTION

    EPA Science Inventory

    Trihalomethanes are common contaminants of chlorinated drinking water. Studies of their health effects have been hampered by exposure misclassification, due in part to limitations inherent in using utility sampling records. We used two exposure assessment methods, one based on ut...

  6. INFLUENCE OF EXPOSURE ASSESSMENT METHOD IN AN EPIDEMIOLOGIC STUDY OF TRIHALOMETHANE EXPOSURE AND SPONTANEOUS ABORTION

    EPA Science Inventory

    Trihalomethanes are common contaminants of chlorinated drinking water. Studies of their health effects have been hampered by exposure misclassification, due in part to limitations inherent in using utility sampling records. We used two exposure assessment methods, one based on ut...

  7. Match rate and positional accuracy of two geocoding methods for epidemiologic research.

    PubMed

    Zhan, F Benjamin; Brender, Jean D; De Lima, Ionara; Suarez, Lucina; Langlois, Peter H

    2006-11-01

    This study compares the match rate and positional accuracy of two geocoding methods: the popular geocoding tool in ArcGIS 9.1 and the Centrus GeoCoder for ArcGIS. We first geocoded 11,016 Texas addresses in a case-control study using both methods and obtained the match rate of each method. We then randomly selected 200 addresses from those geocoded by both methods and obtained geographic coordinates for them with a global positioning system (GPS) device. Of the 200 addresses, 110 were case maternal residence addresses and 90 were control maternal residence addresses. These GPS-surveyed coordinates were used as the "true" coordinates to calculate positional errors of geocoded locations. We used the Wilcoxon signed rank test to evaluate whether differences in positional errors between the two methods were statistically significantly different from zero. In addition, we calculated the sensitivity and specificity of the two methods for classifying maternal addresses within 1500 m of toxic release inventory facilities when distance is used as a proxy for exposure. The match rate of the Centrus GeoCoder was more than 10% greater than that of the geocoding tool in ArcGIS 9.1. Positional errors with the Centrus GeoCoder were less than those of the geocoding tool in ArcGIS 9.1, and this difference was statistically significant. Sensitivity and specificity of the two methods were similar. The Centrus GeoCoder for ArcGIS gives greater match rates than the geocoding tool in ArcGIS 9.1. Although the Centrus GeoCoder has better positional accuracy, both methods give similar results in classifying maternal addresses within 1500 m of toxic release inventory facilities when distance is used as a proxy for exposure.
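    The accuracy comparison boils down to paired positional errors against GPS-surveyed "true" coordinates. A minimal sketch with fabricated toy coordinates (in a real analysis, a signed-rank test such as `scipy.stats.wilcoxon` would replace the crude sign check used here):

```python
import math

def positional_error(geocoded, truth):
    """Planar distance between a geocoded point and its GPS-surveyed
    location; assumes coordinates are already projected to meters."""
    return math.hypot(geocoded[0] - truth[0], geocoded[1] - truth[1])

# Toy projected coordinates (meters) for three addresses: GPS truth and
# the locations returned by two hypothetical geocoding methods.
truth    = [(0.0, 0.0), (100.0, 50.0), (-30.0, 70.0)]
method_a = [(12.0, 5.0), (95.0, 58.0), (-20.0, 66.0)]
method_b = [(3.0, 4.0), (102.0, 49.0), (-33.0, 72.0)]

errs_a = [positional_error(g, t) for g, t in zip(method_a, truth)]
errs_b = [positional_error(g, t) for g, t in zip(method_b, truth)]

# Paired differences in positional error; a Wilcoxon signed-rank test
# would assess whether their median differs from zero.
diffs = [a - b for a, b in zip(errs_a, errs_b)]
print(all(d > 0 for d in diffs))  # → True: method B is closer at every address
```

    The same per-address errors also feed the exposure-classification step: each geocoded point is flagged as within 1500 m of a facility or not, and sensitivity/specificity follow from comparing those flags against the GPS-based classification.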

  8. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  9. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...

  10. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...

  11. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...

  12. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...

  13. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the following...

  14. A design method of divertor in tokamak reactors

    NASA Astrophysics Data System (ADS)

    Ueda, N.; Itoh, S.-I.; Tanaka, M.; Itoh, K.

    1990-08-01

    A computational method for designing an efficient divertor configuration in tokamak reactors is presented. A two-dimensional code was developed to analyze the distributions of plasma and neutral particles for realistic configurations. Using this code, a method for designing an efficient divertor configuration is developed. An example of a new divertor, consisting of baffle and fin plates, is analyzed.

  15. The epidemiology of substance use disorders in US Veterans: A systematic review and analysis of assessment methods.

    PubMed

    Lan, Chiao-Wen; Fiellin, David A; Barry, Declan T; Bryant, Kendall J; Gordon, Adam J; Edelman, E Jennifer; Gaither, Julie R; Maisto, Stephen A; Marshall, Brandon D L

    2016-01-01

    Substance use disorders (SUDs), which encompass alcohol and drug use disorders (AUDs, DUDs), constitute a major public health challenge among US veterans; they are among the most common and costly of all health conditions in this population. This study sought to examine the epidemiology of SUDs among US veterans, compare the prevalence of SUDs in studies using diagnostic versus administrative assessment criteria, and summarize trends over time in the prevalence of SUDs reported in studies sampling US veterans. Comprehensive electronic database searches were conducted, identifying a total of 3,490 studies. We analyzed studies that sampled US veterans and reported the prevalence and distribution of AUDs and DUDs. Of the studies identified, 72 met inclusion criteria; they were published between 1995 and 2013. Studies using diagnostic criteria reported a higher prevalence of AUDs (32% vs. 10%) and DUDs (20% vs. 5%) than studies using administrative criteria. Regardless of assessment method, both the lifetime and past-year prevalence of AUDs in studies sampling US veterans has declined gradually over time. The prevalence of SUDs reported in studies sampling US veterans is affected by assessment method. Given the significant public health problems of SUDs among US veterans, improved guidelines for clinical screening using validated diagnostic criteria to assess AUDs and DUDs in US veteran populations are needed. These findings may inform the VA and other healthcare systems in prevention, diagnosis, and intervention for SUDs among US veterans. © American Academy of Addiction Psychiatry.

  16. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
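
    Two of the propagation methods named above (the method of moments and Monte Carlo simulation) can be sketched on a toy weight model. The weight function and the uncertainty values below are hypothetical stand-ins, not the aircraft program used in the paper:

```python
import math, random

def weight(span, chord):
    """Toy stand-in for an aircraft weight response (not the NASA code)."""
    return 1200.0 + 15.0 * span ** 1.5 + 80.0 * chord ** 2

# Hypothetical input uncertainties on two configuration design variables.
mu_s, sd_s = 30.0, 0.5   # span: mean, standard deviation
mu_c, sd_c = 4.0, 0.1    # chord: mean, standard deviation

# Method of moments (first order): propagate via finite-difference gradients.
h = 1e-4
dw_ds = (weight(mu_s + h, mu_c) - weight(mu_s - h, mu_c)) / (2 * h)
dw_dc = (weight(mu_s, mu_c + h) - weight(mu_s, mu_c - h)) / (2 * h)
mom_mean = weight(mu_s, mu_c)
mom_sd = math.sqrt((dw_ds * sd_s) ** 2 + (dw_dc * sd_c) ** 2)

# Monte Carlo simulation of the same propagation.
random.seed(1)
samples = [weight(random.gauss(mu_s, sd_s), random.gauss(mu_c, sd_c))
           for _ in range(20000)]
mc_mean = sum(samples) / len(samples)
mc_sd = math.sqrt(sum((w - mc_mean) ** 2 for w in samples) / (len(samples) - 1))
```

    For a smooth response and small input uncertainties the two estimates agree closely; the discontinuous design spaces mentioned in the abstract are precisely where a first-order moments approach needs the special implementation features the authors describe.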

  17. Method to Select Metropolitan Areas of Epidemiologic Interest for Enhanced Air Quality Monitoring

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s current Speciation Trends Network (STN) covers most major U.S. metropolitan areas and a wide range of particulate matter (PM) constituents and gaseous co-pollutants. However, using filter-based methods, most PM constituents are measured ...

  19. Expanding color design methods for architecture and allied disciplines

    NASA Astrophysics Data System (ADS)

    Linton, Harold E.

    2002-06-01

    The color design processes of visual artists, architects, designers, and theoreticians included in this presentation reflect the practical role of color in architecture. What the color design professional brings to the architectural design team is an expertise and rich sensibility made up of a broad awareness and a finely tuned visual perception. This includes a knowledge of design and its history, expertise with industrial color materials and their methods of application, an awareness of design context and cultural identity, a background in physiology and psychology as it relates to human welfare, and an ability to problem-solve and respond creatively to design concepts with innovative ideas. The broadening of the definition of the colorist's role in architectural design provides architects, artists and designers with significant opportunities for continued professional and educational development.

  20. Designing Adaptive Intensive Interventions Using Methods from Engineering

    PubMed Central

    Lagoa, Constantino M.; Bekiroglu, Korkut; Lanza, Stephanie T.; Murphy, Susan A.

    2014-01-01

    Objective Adaptive intensive interventions are introduced and new methods from the field of control engineering for use in their design are illustrated. Method A detailed step-by-step explanation of how control engineering methods can be used with intensive longitudinal data to design an adaptive intensive intervention is provided. The methods are evaluated via simulation. Results Simulation results illustrate how the designed adaptive intensive intervention can result in improved outcomes with less treatment by providing treatment only when it is needed. Furthermore, the methods are robust to model misspecification as well as the influence of unobserved causes. Conclusions These new methods can be used to design adaptive interventions that are effective yet reduce participant burden. PMID:25244394

  1. Advances in spatial epidemiology and geographic information systems.

    PubMed

    Kirby, Russell S; Delmelle, Eric; Eberth, Jan M

    2017-01-01

    The field of spatial epidemiology has evolved rapidly in the past two decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied epidemiologic research. We highlight technical developments and opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research.

  2. Molecular epidemiology and a loop-mediated isothermal amplification method for diagnosis of infection with rabies virus in Zambia.

    PubMed

    Muleya, Walter; Namangala, Boniface; Mweene, Aaron; Zulu, Luke; Fandamu, Paul; Banda, Douglas; Kimura, Takashi; Sawa, Hirofumi; Ishii, Akihiro

    2012-01-01

    The National Livestock Epidemiology and Information Center (NALEIC) in Zambia reported over 132 cases of canine rabies diagnosed by the direct fluorescent antibody test (DFAT) from 2004 to 2009. In this study, the lineage of rabies virus (RABV) in Zambia was determined by phylogenetic analysis of the nucleoprotein (N) and glycoprotein (G) gene sequences. Total RNA was extracted from 87 DFAT brain specimens, of which only 35 (40%) were positive on nested reverse transcription polymerase chain reaction (RT-PCR) for each gene, with 26 positive for both genes. Positive specimens for the N (n=33) and G (n=35) genes were used for phylogenetic analysis. Phylogenetic analysis of the N gene showed two phylogenetic clusters in Zambia belonging to the Africa 1b lineage present in eastern and southern Africa. While one cluster exclusively comprised Zambian strains, the other was more heterogeneous regarding the RABV origins and included strains from Tanzania, Mozambique and Zambia. Phylogenetic analysis of the G gene revealed similar RABV strains in different hosts and regions of Zambia. We designed primers for a reverse transcription loop-mediated isothermal amplification (RT-LAMP) assay from the consensus sequence of the N gene in an attempt to improve the molecular diagnosis of RABV in Zambia. The specificity and reproducibility of the RT-LAMP assay were confirmed with actual clinical specimens. Therefore, the RT-LAMP assay presented in this study may prove to be useful for routine diagnosis of rabies in Zambia.

  3. Telomere length varies by DNA extraction method: implications for epidemiologic research.

    PubMed

    Cunningham, Julie M; Johnson, Ruth A; Litzelman, Kristin; Skinner, Halcyon G; Seo, Songwon; Engelman, Corinne D; Vanderboom, Russell J; Kimmel, Grace W; Gangnon, Ronald E; Riegert-Johnson, Douglas L; Baron, John A; Potter, John D; Haile, Robert; Buchanan, Daniel D; Jenkins, Mark A; Rider, David N; Thibodeau, Stephen N; Petersen, Gloria M; Boardman, Lisa A

    2013-11-01

    Both shorter and longer telomeres in peripheral blood leukocyte (PBL) DNA have been associated with cancer risk. However, associations remain inconsistent across studies of the same cancer type. This study compares DNA preparation methods to determine telomere length from patients with colorectal cancer. We examined PBL relative telomere length (RTL) measured by quantitative PCR (qPCR) in 1,033 patients with colorectal cancer and 2,952 healthy controls. DNA was extracted with phenol/chloroform, PureGene, or QIAamp. We observed differences in RTL depending on DNA extraction method (P < 0.001). Phenol/chloroform-extracted DNA had a mean RTL (T/S ratio) of 0.78 (range 0.01-6.54) compared with PureGene-extracted DNA (mean RTL of 0.75; range 0.00-12.33). DNA extracted by QIAamp yielded a mean RTL of 0.38 (range 0.02-3.69). We subsequently compared RTL measured by qPCR from an independent set of 20 colorectal cancer cases and 24 normal controls in PBL DNA extracted by each of the three extraction methods. The range of RTL measured by qPCR from QIAamp-extracted DNA (0.17-0.58) was less than from either PureGene or phenol/chloroform (ranges, 0.04-2.67 and 0.32-2.81, respectively). RTL measured by qPCR from QIAamp-extracted DNA was less than from either PureGene or phenol/chloroform (P < 0.001). Differences in DNA extraction method may contribute to the discrepancies between studies seeking to find an association between the risk of cancer or other diseases and RTL. ©2013 AACR.
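
    RTL by qPCR is conventionally expressed as a T/S ratio: the telomere (T) signal relative to a single-copy gene (S) signal, normalized to a reference sample. A minimal sketch of the 2^-ΔΔCq arithmetic, with illustrative values that are not this study's data:

```python
def relative_telomere_length(cq_t, cq_s, ref_cq_t, ref_cq_s):
    """T/S ratio by the 2^-ddCq method.

    cq_t, cq_s:         telomere and single-copy-gene Cq for the sample
    ref_cq_t, ref_cq_s: the same quantities for the reference DNA
    """
    d_sample = cq_t - cq_s   # delta-Cq of the sample
    d_ref = ref_cq_t - ref_cq_s   # delta-Cq of the reference
    return 2.0 ** -(d_sample - d_ref)

# A sample whose telomere signal amplifies one cycle later than the
# reference (at equal single-copy signal) has half the relative length.
example = relative_telomere_length(21.0, 18.0, 20.0, 18.0)
```

    Because the T/S ratio is a relative quantity, systematic shifts introduced upstream, such as the extraction-method differences reported above, propagate directly into it.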

  4. Positional accuracy and geographic bias of four methods of geocoding in epidemiologic research.

    PubMed

    Schootman, Mario; Sterling, David A; Struthers, James; Yan, Yan; Laboube, Ted; Emo, Brett; Higgs, Gary

    2007-06-01

    We examined the geographic bias of four methods of geocoding addresses using ArcGIS, commercial firm, SAS/GIS, and aerial photography. We compared "point-in-polygon" (ArcGIS, commercial firm, and aerial photography) and the "look-up table" method (SAS/GIS) to allocate addresses to census geography, particularly as it relates to census-based poverty rates. We randomly selected 299 addresses of children treated for asthma at an urban emergency department (1999-2001). The coordinates of the building address side door were obtained by constant offset based on ArcGIS and a commercial firm and true ground location based on aerial photography. Coordinates were available for 261 addresses across all methods. For 24% to 30% of geocoded road/door coordinates the positional error was 51 meters or greater, which was similar across geocoding methods. The mean bearing was -26.8 degrees for the vector of coordinates based on aerial photography and ArcGIS and 8.5 degrees for the vector based on aerial photography and the commercial firm (p < 0.0001). ArcGIS and the commercial firm performed very well relative to SAS/GIS in terms of allocation to census geography. For 20%, the door location based on aerial photography was assigned to a different block group compared to SAS/GIS. The block group poverty rate varied at least two standard deviations for 6% to 7% of addresses. We found important differences in distance and bearing between geocoding relative to aerial photography. Allocation of locations based on aerial photography to census-based geographic areas could lead to substantial errors.
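
    The mean-bearing comparison above relies on computing the initial great-circle bearing of each error vector (from the GPS-surveyed point to the geocoded point). A standard forward-azimuth formula, shown here as an illustrative sketch with hypothetical coordinates:

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees
    clockwise from north, in the range (-180, 180]."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(x, y))

# Hypothetical aerial-photography ("true") and geocoded coordinates:
# the bearing of the displacement indicates the direction of the offset.
offset_bearing = bearing_deg(38.6270, -90.1994, 38.6273, -90.1990)
```

    Averaging these bearings over many addresses (with circular statistics, since bearings wrap around) reveals a systematic directional bias such as the -26.8 and 8.5 degree means reported above.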

  5. A simple inverse design method for pump turbine

    NASA Astrophysics Data System (ADS)

    Yin, Junlian; Li, Jingjing; Wang, Dezhong; Wei, Xianzhu

    2014-03-01

    In this paper, a simple inverse design method for pump turbines is proposed. The key idea is that the blade loading distribution is first extracted from an existing model and then applied in the new design. As an example, the blade loading distribution of a runner designed for a 200 m head was analyzed. The extracted blade loading was then combined with a meridional passage suitable for a 500 m head to design a new runner. CFD analysis and model tests show that the new runner performs very well in terms of efficiency and cavitation. The inverse design method can therefore be extended, as an alternative approach, to other design applications.

  6. Design methods for fault-tolerant finite state machines

    NASA Technical Reports Server (NTRS)

    Niranjan, Shailesh; Frenzel, James F.

    1993-01-01

    VLSI electronic circuits are increasingly being used in space-borne applications where high levels of radiation may induce faults, known as single event upsets. In this paper we review the classical methods of designing fault tolerant digital systems, with an emphasis on those methods which are particularly suitable for VLSI-implementation of finite state machines. Four methods are presented and will be compared in terms of design complexity, circuit size, and estimated circuit delay.
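
    One classical fault-tolerance technique of the kind surveyed here is error-detecting state encoding. The sketch below is an illustration of the general idea, not one of the paper's four methods specifically: an even-parity bit is appended to each state code so that any single event upset is detected and the machine recovers to a safe reset state.

```python
# States encoded with an even-parity bit appended: any single-bit upset
# yields odd parity, which the next-state logic detects and maps to RESET.
def parity(x):
    p = 0
    while x:
        p ^= x & 1
        x >>= 1
    return p

def encode(state):
    """Append a parity bit so the full word has even parity."""
    return (state << 1) | parity(state)

def decode(word):
    """Return the state, or None when a single-bit upset is detected."""
    return None if parity(word) else word >> 1

RESET, RUN, DONE = 0, 1, 2
TRANSITIONS = {(RESET, 'go'): RUN, (RUN, 'stop'): DONE, (DONE, 'go'): RUN}

def next_word(word, event):
    state = decode(word)
    if state is None:          # single event upset detected
        return encode(RESET)   # recover to the safe state
    return encode(TRANSITIONS.get((state, event), state))
```

    A distance-2 code like this only detects single upsets; the error-correcting and redundancy-based schemes compared in the paper trade additional circuit size and delay for the ability to continue operating through an upset.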

  7. Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary (STROBE-Vet) Statement.

    PubMed

    Sargeant, J M; O'Connor, A M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P

    2016-12-01

    Reporting of observational studies in veterinary research presents challenges that often are not addressed in published reporting guidelines. Our objective was to develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. We conducted a consensus meeting with 17 experts in Mississauga, Canada. Experts completed a premeeting survey about whether items in the STROBE statement should be modified or added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. During the meeting, each STROBE item was discussed to determine whether or not rewording was recommended, and whether additions were warranted. Anonymous voting was used to determine consensus. Six items required no modifications or additions. Modifications or additions were made to the STROBE items 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources and measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). The methods and processes used were similar to those used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food-producing and companion animals, products of animal origin, aquaculture, and wildlife.

  8. Novel Microbiological and Spatial Statistical Methods to Improve Strength of Epidemiological Evidence in a Community-Wide Waterborne Outbreak

    PubMed Central

    Jalava, Katri; Rintala, Hanna; Ollgren, Jukka; Maunula, Leena; Gomez-Alvarez, Vicente; Revez, Joana; Palander, Marja; Antikainen, Jenni; Kauppinen, Ari; Räsänen, Pia; Siponen, Sallamaari; Nyholm, Outi; Kyyhkynen, Aino; Hakkarainen, Sirpa; Merentie, Juhani; Pärnänen, Martti; Loginov, Raisa; Ryu, Hodon; Kuusi, Markku; Siitonen, Anja; Miettinen, Ilkka; Santo Domingo, Jorge W.; Hänninen, Marja-Liisa; Pitkänen, Tarja

    2014-01-01

    Failures in the drinking water distribution system cause gastrointestinal outbreaks with multiple pathogens. A water distribution pipe breakage caused a community-wide waterborne outbreak in Vuorela, Finland, July 2012. We investigated this outbreak with advanced epidemiological and microbiological methods. A total of 473/2931 inhabitants (16%) responded to a web-based questionnaire. Water and patient samples were subjected to analysis of multiple microbial targets, molecular typing and microbial community analysis. Spatial analysis on the water distribution network was done and we applied a spatial logistic regression model. The course of the illness was mild. Drinking untreated tap water from the defined outbreak area was significantly associated with illness (RR 5.6, 95% CI 1.9–16.4) increasing in a dose response manner. The closer a person lived to the water distribution breakage point, the higher the risk of becoming ill. Sapovirus, enterovirus, single Campylobacter jejuni and EHEC O157:H7 findings as well as virulence genes for EPEC, EAEC and EHEC pathogroups were detected by molecular or culture methods from the faecal samples of the patients. EPEC, EAEC and EHEC virulence genes and faecal indicator bacteria were also detected in water samples. Microbial community sequencing of contaminated tap water revealed abundance of Arcobacter species. The polyphasic approach improved the understanding of the source of the infections, and aided to define the extent and magnitude of this outbreak. PMID:25147923

  9. Novel microbiological and spatial statistical methods to improve strength of epidemiological evidence in a community-wide waterborne outbreak.

    PubMed

    Jalava, Katri; Rintala, Hanna; Ollgren, Jukka; Maunula, Leena; Gomez-Alvarez, Vicente; Revez, Joana; Palander, Marja; Antikainen, Jenni; Kauppinen, Ari; Räsänen, Pia; Siponen, Sallamaari; Nyholm, Outi; Kyyhkynen, Aino; Hakkarainen, Sirpa; Merentie, Juhani; Pärnänen, Martti; Loginov, Raisa; Ryu, Hodon; Kuusi, Markku; Siitonen, Anja; Miettinen, Ilkka; Santo Domingo, Jorge W; Hänninen, Marja-Liisa; Pitkänen, Tarja

    2014-01-01

    Failures in the drinking water distribution system cause gastrointestinal outbreaks with multiple pathogens. A water distribution pipe breakage caused a community-wide waterborne outbreak in Vuorela, Finland, July 2012. We investigated this outbreak with advanced epidemiological and microbiological methods. A total of 473/2931 inhabitants (16%) responded to a web-based questionnaire. Water and patient samples were subjected to analysis of multiple microbial targets, molecular typing and microbial community analysis. Spatial analysis on the water distribution network was done and we applied a spatial logistic regression model. The course of the illness was mild. Drinking untreated tap water from the defined outbreak area was significantly associated with illness (RR 5.6, 95% CI 1.9-16.4) increasing in a dose response manner. The closer a person lived to the water distribution breakage point, the higher the risk of becoming ill. Sapovirus, enterovirus, single Campylobacter jejuni and EHEC O157:H7 findings as well as virulence genes for EPEC, EAEC and EHEC pathogroups were detected by molecular or culture methods from the faecal samples of the patients. EPEC, EAEC and EHEC virulence genes and faecal indicator bacteria were also detected in water samples. Microbial community sequencing of contaminated tap water revealed abundance of Arcobacter species. The polyphasic approach improved the understanding of the source of the infections, and aided to define the extent and magnitude of this outbreak.
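
    The risk-ratio arithmetic behind estimates like the RR of 5.6 above can be sketched with the Katz log method for the confidence interval. The counts below are made up for illustration; they happen to reproduce a point estimate of 5.6 but not the study's interval:

```python
import math

def relative_risk(ill_exp, n_exp, ill_unexp, n_unexp, z=1.96):
    """Relative risk with an approximate 95% CI (Katz log method)."""
    r1 = ill_exp / n_exp       # attack rate among the exposed
    r0 = ill_unexp / n_unexp   # attack rate among the unexposed
    rr = r1 / r0
    se = math.sqrt(1 / ill_exp - 1 / n_exp + 1 / ill_unexp - 1 / n_unexp)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 60/300 ill among tap-water drinkers in the outbreak
# area vs. 6/168 among non-drinkers.
rr, lo, hi = relative_risk(60, 300, 6, 168)
```

    A dose-response pattern, as reported above, would be examined by computing such ratios across categories of water consumption and testing for trend.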

  10. Review of design optimization methods for turbomachinery aerodynamics

    NASA Astrophysics Data System (ADS)

    Li, Zhihui; Zheng, Xinqian

    2017-08-01

    In today's competitive environment, new turbomachinery designs need to be not only more efficient, quieter, and "greener", but also developed on much shorter time scales and at lower costs. A number of advanced optimization strategies have been developed to achieve these requirements. This paper reviews recent progress in turbomachinery design optimization to solve real-world aerodynamic problems, especially for compressors and turbines. The review covers the following topics that are important for optimizing turbomachinery designs: (1) optimization methods, (2) stochastic optimization combined with blade parameterization methods and design-of-experiment methods, (3) gradient-based optimization methods for compressors and turbines, and (4) data mining techniques for Pareto fronts. We also present our own insights regarding current research trends and the future optimization of turbomachinery designs.

  11. Inviscid transonic wing design using inverse methods in curvilinear coordinates

    NASA Technical Reports Server (NTRS)

    Gally, Thomas A.; Carlson, Leland A.

    1987-01-01

    An inverse wing design method has been developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; a wing- and fuselage-fitted curvilinear grid; and a viscous boundary-layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse method as an extension of previous methods for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in supercritical flow regimes. The selected test cases also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  12. System Design Support by Optimization Method Using Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    We propose a new optimization method based on a stochastic process. Its distinguishing characteristic is that it obtains an approximation of the optimum solution as an expected value. In the numerical calculation, a kind of Monte Carlo method is used to obtain this expectation. The method also yields the probability distribution of the design variables, because candidate designs are generated with probability proportional to the evaluation function value. This probability distribution shows the influence of each design variable on the evaluation function value, which is very useful information for system design. In this paper, it is shown that the proposed method is useful not only for optimization but also for system design. A flight trajectory optimization problem for a hang-glider is given as a numerical example.
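
    A minimal sketch of the sampling idea described above, as my own illustration rather than the authors' algorithm: candidate designs are drawn with probability proportional to a power of the evaluation function, and the sample mean approximates the optimum while the sample spread shows the variable's influence. The function f, the grid, and the sharpening exponent beta are hypothetical:

```python
import random

def f(x):
    """Toy evaluation function to be maximised; peak at x = 2."""
    return 1.0 / (1.0 + (x - 2.0) ** 2)

def stochastic_optimum(f, grid, beta=20.0, n=20000, seed=0):
    """Sample design values with probability proportional to f(x)**beta
    and return the sample mean as the expected-value estimate of the
    optimum.  Larger beta concentrates the samples near the peak."""
    rng = random.Random(seed)
    weights = [f(x) ** beta for x in grid]
    samples = rng.choices(grid, weights=weights, k=n)
    return sum(samples) / n

grid = [i * 0.01 for i in range(501)]  # design variable on 0.00 .. 5.00
estimate = stochastic_optimum(f, grid)
```

    The full histogram of the samples, not just the mean, is the probability distribution the abstract refers to: a broad histogram signals a design variable with little influence on the evaluation function.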

  13. [Pharmacological vigilance and pharmacological epidemiology: principles, definition, methods and current trends in neurology].

    PubMed

    Montastruc, J L; Bagheri, H; Lapeyre-Mestre, M; Senard, J M

    1999-04-01

    It is now well established that clinical trials performed before drug approval are not, by themselves, sufficient for a full modern pharmacological evaluation of drugs and treatments. The need for both pharmacovigilance and pharmacoepidemiology is underlined, in order to evaluate drugs under real-world conditions. After a summary of the methods used in pharmacoepidemiological studies (spontaneous reports, imputability assessment, cohorts, case-control studies, etc.), recent pharmacoepidemiological data useful to the neurologist are summarized: side effects of tacrine and vaccines, the serotoninergic syndrome, and side effects of new antiepileptic drugs.

  14. Evolution and social epidemiology.

    PubMed

    Nishi, Akihiro

    2015-11-01

    Evolutionary biology, which aims to explain the dynamic process of shaping the diversity of life, has not yet significantly affected thinking in social epidemiology. Current challenges in social epidemiology include understanding how social exposures can affect our biology, explaining the dynamics of society and health, and designing better interventions that are mindful of the impact of exposures during critical periods. I review how evolutionary concepts and tools, such as fitness gradient in cultural evolution, evolutionary game theory, and contemporary evolution in cancer, can provide helpful insights regarding social epidemiology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. A graph-theory method for pattern identification in geographical epidemiology – a preliminary application to deprivation and mortality

    PubMed Central

    Maheswaran, Ravi; Craigs, Cheryl; Read, Simon; Bath, Peter A; Willett, Peter

    2009-01-01

    Background Graph theoretical methods are extensively used in the field of computational chemistry to search datasets of compounds to see if they contain particular molecular sub-structures or patterns. We describe a preliminary application of a graph theoretical method, developed in computational chemistry, to geographical epidemiology in relation to testing a prior hypothesis. We tested the methodology on the hypothesis that if a socioeconomically deprived neighbourhood is situated in a wider deprived area, then that neighbourhood would experience greater adverse effects on mortality compared with a similarly deprived neighbourhood which is situated in a wider area with generally less deprivation. Methods We used the Trent Region Health Authority area for this study, which contained 10,665 census enumeration districts (CED). Graphs are mathematical representations of objects and their relationships and within the context of this study, nodes represented CEDs and edges were determined by whether or not CEDs were neighbours (shared a common boundary). The overall area in this study was represented by one large graph comprising all CEDs in the region, along with their adjacency information. We used mortality data from 1988–1998, CED level population estimates and the Townsend Material Deprivation Index as an indicator of neighbourhood level deprivation. We defined deprived CEDs as those in the top 20% most deprived in the Region. We then set out to classify these deprived CEDs into seven groups defined by increasing deprivation levels in the neighbouring CEDs. 506 (24.2%) of the deprived CEDs had five adjacent CEDs and we limited pattern development and searching to these CEDs. We developed seven query patterns and used the RASCAL (Rapid Similarity Calculator) program to carry out the search for each of the query patterns. This program used a maximum common subgraph isomorphism method which was modified to handle geographical data. Results Of the 506 deprived CEDs
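
    The neighbourhood-pattern idea can be sketched without any chemistry software: represent CEDs as graph nodes, adjacency as edges, and classify each deprived node by how many of its neighbours are also deprived, which is the analogue of the seven query patterns. This is a simplified illustration, not the RASCAL maximum-common-subgraph search, and the map fragment is hypothetical:

```python
def classify(adjacency, deprived):
    """For each deprived node, count how many of its neighbours are also
    deprived -- the basis for grouping nodes by surrounding deprivation."""
    return {n: sum(1 for m in adjacency[n] if m in deprived)
            for n in adjacency if n in deprived}

# Hypothetical map fragment: node A has five adjacent areas, as in the
# CEDs selected for pattern searching above.
adjacency = {
    'A': {'B', 'C', 'D', 'E', 'F'},
    'B': {'A', 'C'}, 'C': {'A', 'B'}, 'D': {'A', 'E'},
    'E': {'A', 'D'}, 'F': {'A'},
}
deprived = {'A', 'B', 'D', 'E'}
groups = classify(adjacency, deprived)
```

    The study's query patterns go further by matching the full subgraph of a node and its five neighbours against template graphs, but the grouping principle is the same: deprived areas are stratified by the deprivation of their surroundings before comparing mortality across strata.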

  16. Tabu search method with random moves for globally optimal design

    NASA Astrophysics Data System (ADS)

    Hu, Nanfang

    1992-09-01

    Optimum engineering design problems are usually formulated as non-convex optimization problems in continuous variables. Because of the absence of convexity structure, they can have multiple minima, and global optimization becomes difficult. Traditional methods of optimization, such as penalty methods, can often be trapped at a local optimum. The tabu search method with random moves is introduced to solve these problems approximately. Its reliability and efficiency are examined with the help of standard test functions. Analysis of the implementations shows that this method is easy to use and requires no derivative information. It outperforms the random search method and a composite genetic algorithm. In particular, it is applied to minimum-weight design examples of a three-bar truss, coil springs, a Z-section and a channel section. For the channel section, the optimal design found by the tabu search method with random moves saved 26.14 percent of the weight obtained with the SUMT method.
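
    The following toy sketch, my own illustration rather than the paper's implementation, shows the flavor of tabu search with random moves on a one-dimensional multimodal function: random global moves escape local minima, local moves refine the incumbent, and a short-term tabu list blocks revisiting recently sampled points. All names and parameter values are hypothetical:

```python
import math, random

def g(x):
    """Multimodal test function; global minimum near x = 2.77."""
    return (x - 3.0) ** 2 + 0.5 * math.cos(8.0 * x)

def tabu_search(f, lo, hi, iters=2000, step=0.2, tabu_len=20, tol=0.02, seed=1):
    """Derivative-free global search: alternate random jumps over the
    domain with local moves around the best point, rejecting candidates
    too close to recently visited (tabu) points."""
    rng = random.Random(seed)
    best_x = rng.uniform(lo, hi)
    best = f(best_x)
    tabu = []
    for i in range(iters):
        if i % 2:   # local move around the incumbent
            cand = min(hi, max(lo, best_x + rng.uniform(-step, step)))
        else:       # random move anywhere in the domain
            cand = rng.uniform(lo, hi)
        if any(abs(cand - t) < tol for t in tabu):
            continue            # tabu: recently visited region
        tabu.append(cand)
        if len(tabu) > tabu_len:
            tabu.pop(0)         # short-term memory: oldest entry expires
        if f(cand) < best:
            best_x, best = cand, f(cand)
    return best_x, best
```

    Note that no gradient of g is ever evaluated, matching the abstract's point that no derivative information is necessary.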

  17. Designing adaptive intensive interventions using methods from engineering.

    PubMed

    Lagoa, Constantino M; Bekiroglu, Korkut; Lanza, Stephanie T; Murphy, Susan A

    2014-10-01

    Adaptive intensive interventions are introduced, and new methods from the field of control engineering for use in their design are illustrated. A detailed step-by-step explanation of how control engineering methods can be used with intensive longitudinal data to design an adaptive intensive intervention is provided. The methods are evaluated via simulation. Simulation results illustrate how the designed adaptive intensive intervention can result in improved outcomes with less treatment by providing treatment only when it is needed. Furthermore, the methods are robust to model misspecification as well as the influence of unobserved causes. These new methods can be used to design adaptive interventions that are effective yet reduce participant burden. PsycINFO Database Record (c) 2014 APA, all rights reserved.
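    A toy simulation can illustrate the control-engineering idea of delivering treatment only when it is needed. The first-order outcome model and every parameter value below are hypothetical, not taken from the paper; a simple proportional controller stands in for the engineering methods the authors describe.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical first-order model of a symptom score x (higher = worse):
#   x[t+1] = a*x[t] + d + b*u[t] + w[t],  u[t] >= 0 is treatment intensity.
a, b, d = 0.9, -0.5, 0.5   # slow decay, upward drift; treatment lowers x
target = 2.0               # clinically acceptable level (assumed)
x, u_total = 5.0, 0.0      # start at the untreated equilibrium d/(1-a) = 5
trajectory = []
for t in range(60):
    error = x - target
    u = max(0.0, 0.8 * error)            # treat only when above target
    x = a * x + d + b * u + rng.normal(0, 0.1)
    u_total += u
    trajectory.append(x)
# The outcome settles near the target with treatment delivered mainly
# early on, i.e., only when it is needed.
```

    The adaptive rule here is intentionally crude; the point is only that feedback on intensive longitudinal measurements lets intensity taper off as the outcome improves.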

  18. Evidence-based planning and costing palliative care services for children: novel multi-method epidemiological and economic exemplar

    PubMed Central

    2013-01-01

    Background Children’s palliative care is a relatively new clinical specialty. Its nature is multi-dimensional and its delivery necessarily multi-professional. Numerous diverse public and not-for-profit organisations typically provide services and support. Because services are not centrally coordinated, they are provided in a manner that is inconsistent and incoherent. Since the first children’s hospice opened in 1982, the epidemiology of life-limiting conditions has changed with more children living longer, and many requiring transfer to adult services. Very little is known about the number of children living within any given geographical locality, costs of care, or experiences of children with ongoing palliative care needs and their families. We integrated evidence, and undertook and used novel methodological epidemiological work to develop the first evidence-based and costed commissioning exemplar. Methods Multi-method epidemiological and economic exemplar from a health and not-for-profit organisation perspective, to estimate numbers of children under 19 years with life-limiting conditions, cost current services, determine child/parent care preferences, and cost choice of end-of-life care at home. Results The exemplar locality (North Wales) had important gaps in service provision and the clinical network. The estimated annual total cost of current children’s palliative care was about £5.5 million; average annual care cost per child was £22,771 using 2007 prevalence estimates and £2,437- £11,045 using new 2012/13 population-based prevalence estimates. Using population-based prevalence, we estimate 2271 children with a life-limiting condition in the general exemplar population and around 501 children per year with ongoing palliative care needs in contact with hospital services. Around 24 children with a wide range of life-limiting conditions require end-of-life care per year. Choice of end-of-life care at home was requested, which is not currently

  19. Hormonal contraceptive methods and risk of HIV acquisition in women: a systematic review of epidemiological evidence.

    PubMed

    Polis, Chelsea B; Phillips, Sharon J; Curtis, Kathryn M; Westreich, Daniel J; Steyn, Petrus S; Raymond, Elizabeth; Hannaford, Philip; Turner, Abigail Norris

    2014-10-01

    Whether use of various types of hormonal contraception (HC) affect risk of HIV acquisition is a critical question for women's health. For this systematic review, we identified 22 studies published by January 15, 2014 which met inclusion criteria; we classified thirteen studies as having severe methodological limitations, and nine studies as "informative but with important limitations". Overall, data do not support an association between use of oral contraceptives and increased risk of HIV acquisition. Uncertainty persists regarding whether an association exists between depot-medroxyprogesterone acetate (DMPA) use and risk of HIV acquisition. Most studies suggested no significantly increased HIV risk with norethisterone enanthate (NET-EN) use, but when assessed in the same study, point estimates for NET-EN tended to be larger than for DMPA, though 95% confidence intervals overlapped substantially. No data have suggested significantly increased risk of HIV acquisition with use of implants, though data were limited. No data are available on the relationship between use of contraceptive patches, rings, or hormonal intrauterine devices and risk of HIV acquisition. Women choosing progestin-only injectable contraceptives such as DMPA or NET-EN should be informed of the current uncertainty regarding whether use of these methods increases risk of HIV acquisition, and like all women at risk of HIV, should be empowered to access and use condoms and other HIV preventative measures. Programs, practitioners, and women urgently need guidance on how to maximize health with respect to avoiding both unintended pregnancy and HIV given inconclusive or limited data for certain HC methods.

  20. Robust Multivariable Controller Design via Implicit Model-Following Methods.

    DTIC Science & Technology

    1983-12-01

    Scanned-document residue; recoverable information: ROBUST MULTIVARIABLE CONTROLLER DESIGN VIA IMPLICIT MODEL-FOLLOWING METHODS. Thesis, AFIT/GE/EE/83D-48, William G. Miller, Capt USAF; Air Force Inst of Tech, Wright-Patterson AFB OH, School of Engineering. Approved for public release; distribution

  1. An inverse method with regularity condition for transonic airfoil design

    NASA Technical Reports Server (NTRS)

    Zhu, Ziqiang; Xia, Zhixun; Wu, Liyi

    1991-01-01

    It is known from Lighthill's exact solution of the incompressible inverse problem that in the inverse design problem, the surface pressure distribution and the free stream speed cannot both be prescribed independently. This implies the existence of a constraint on the prescribed pressure distribution. The same constraint exists at compressible speeds. Presented here is an inverse design method for transonic airfoils. In this method, the target pressure distribution contains a free parameter that is adjusted during the computation to satisfy the regularity condition. Some design results are presented in order to demonstrate the capabilities of the method.

  2. Design of freeform unobscured reflective imaging systems using CI method

    NASA Astrophysics Data System (ADS)

    Yang, Tong; Hou, Wei; Wu, Xiaofei; Jin, Guofan; Zhu, Jun

    2016-10-01

    In this paper, we demonstrate a design method for freeform unobscured reflective imaging systems using the point-by-point Construction-Iteration (CI) method. Compared with other point-by-point design methods, light rays of multiple fields and different pupil coordinates are employed in the design. The whole design process starts from a simple initial system consisting of decentered and tilted planes. In the preliminary surface-construction stage, the coordinates as well as the surface normals of the feature data points on each freeform surface are calculated point by point, directly from the given object-image relationships. Then, the freeform surfaces are generated through a novel surface-fitting method that considers both the coordinates and the surface normals of the data points. Next, an iterative process is employed to significantly improve the image quality. In this way, an unobscured design with freeform surfaces can be obtained directly, and it can be taken as a good starting point for further optimization. The benefit and feasibility of this design method are demonstrated by two design examples of high-performance freeform unobscured imaging systems. Both systems show good imaging performance after the final design.
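    The surface-fitting step — fitting a surface to feature data points using both their coordinates and their surface normals — can be sketched as a stacked least-squares problem. The quadric basis, the weighting scheme, and the function name below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def fit_surface_with_normals(pts, normals, w=1.0):
    """Least-squares fit of z = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2
    to both point coordinates and surface normals (weighted by w)."""
    x, y, z = pts.T
    nx, ny, nz = normals.T
    # value rows: z(x_i, y_i) = z_i
    A_val = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
    # normal rows: dz/dx = -nx/nz and dz/dy = -ny/nz at each point
    A_dx = np.column_stack([0*x, np.ones_like(x), 0*x, 2*x, y, 0*x])
    A_dy = np.column_stack([0*x, 0*x, np.ones_like(x), 0*x, x, 2*y])
    A = np.vstack([A_val, w * A_dx, w * A_dy])
    b = np.concatenate([z, -w * nx / nz, -w * ny / nz])
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    return c

# Demo: points sampled from z = x^2 + y^2, with exact surface normals
gx, gy = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
x, y = gx.ravel(), gy.ravel()
pts = np.column_stack([x, y, x**2 + y**2])
normals = np.column_stack([-2*x, -2*y, np.ones_like(x)])
c = fit_surface_with_normals(pts, normals)
# c recovers [0, 0, 0, 1, 0, 1], i.e., z = x^2 + y^2
```

    Including the normal rows constrains the surface gradient as well as its height, which is the property the CI method exploits when constructing freeform surfaces.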

  3. History and Impact of Nutritional Epidemiology

    PubMed Central

    Alpers, David H.; Bier, Dennis M.; Carpenter, Kenneth J.; McCormick, Donald B.; Miller, Anthony B.; Jacques, Paul F.

    2014-01-01

    The real and important role of epidemiology was discussed, noting heretofore unknown associations that led to improved understanding of the cause and prevention of individual nutritional deficiencies. However, epidemiology has been less successful in linking individual nutrients to the cause of chronic diseases, such as cancer and cardiovascular disease. Dietary changes, such as decreasing caloric intake to prevent cancer and the Mediterranean diet to prevent diabetes, were confirmed as successful approaches to modifying the incidence of chronic diseases. The role of the epidemiologist was confirmed as a collaborator, not an isolated expert of last resort. The challenge for the future is to decide which epidemiologic methods and study designs are most useful in studying chronic disease, then to determine which associations and the hypotheses derived from them are especially strong and worthy of pursuit, and finally to design randomized studies that are feasible, affordable, and likely to result in confirmation or refutation of these hypotheses. PMID:25469385

  4. Design Method for EPS Control System Based on KANSEI Structure

    NASA Astrophysics Data System (ADS)

    Saitoh, Yumi; Itoh, Hideaki; Ozaki, Fuminori; Nakamura, Takenobu; Kawaji, Shigeyasu

    Recently, it has been recognized that KANSEI engineering plays an important role in functional design development for highly sophisticated products. In practical development, however, products are designed and their designs optimised by trial and error, which means the outcome depends on the skills of experts. In this paper, we focus on an automobile electric power steering (EPS) system for which a functional design is required. First, the KANSEI structure is determined on the basis of the steering feel of an experienced driver, and an EPS control design based on this KANSEI structure is proposed. Then, the EPS control parameters are adjusted in accordance with the KANSEI index. Finally, by assessing the experimental results obtained from the driver, the effectiveness of the proposed design method is verified.

  5. Epidemiologic Methods of Assessing Asthma and Wheezing Episodes in Longitudinal Studies: Measures of Change and Stability

    PubMed Central

    Soto-Ramírez, Nelís; Ziyab, Ali H.; Karmaus, Wilfried; Zhang, Hongmei; Kurukulaaratchy, Ramesh J.; Ewart, Susan; Arshad, Syed Hasan

    2013-01-01

    Background In settings in which diseases wax and wane, there is a need to measure disease dynamics in longitudinal studies. Traditional measures of disease occurrence (eg, cumulative incidence) do not address change or stability or are limited to stable cohorts (eg, incidence) and may thus lead to erroneous conclusions. To illustrate how different measures can be used to detect disease dynamics, we investigated sex differences in the occurrence of asthma and wheezing, using a population-based study cohort that covered the first 18 years of life. Methods In the Isle of Wight birth cohort (n = 1456), prevalence, incidence, cumulative incidence, positive and negative transitions, and remission were determined at ages 1 or 2, 4, 10, and 18 years. Latent transition analysis was used to simultaneously identify classes of asthma and wheezing (related phenotypes) and characterize transition probabilities over time. Trajectory analysis was used to characterize the natural history of asthma and wheezing. Results Regarding time-specific changes, positive and negative transition probabilities were more informative than other measures of associations because they revealed a sex switchover in asthma prevalence (P < 0.05). Transition probabilities were able to identify the origin of a sex-specific dynamic; in particular, prior wheezing transitioned to asthma at age 18 years among girls but not among boys. In comparison with latent transition analysis, trajectory analysis did not directly identify a switchover in prevalence among boys and girls. Conclusions In longitudinal analyses, transition analyses that impose minimal restrictions on data are needed in order to produce appropriate information on disease dynamics. PMID:23994864
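    As a minimal illustration of the transition measures discussed here, positive and negative transition probabilities between two waves can be computed from binary disease status. The helper and the data below are hypothetical, not from the Isle of Wight cohort.

```python
import numpy as np

def transition_probabilities(status_t0, status_t1):
    """Positive transition: P(present at t1 | absent at t0).
    Negative transition (remission): P(absent at t1 | present at t0)."""
    s0 = np.asarray(status_t0, dtype=bool)
    s1 = np.asarray(status_t1, dtype=bool)
    pos = s1[~s0].mean() if (~s0).any() else np.nan
    neg = (~s1)[s0].mean() if s0.any() else np.nan
    return pos, neg

# Hypothetical wheeze status for 8 children at ages 10 and 18
at_10 = [0, 0, 0, 0, 1, 1, 1, 1]
at_18 = [0, 1, 0, 0, 1, 0, 0, 1]
pos, neg = transition_probabilities(at_10, at_18)
# pos = 0.25 (1 of the 4 without wheeze at 10 wheezes at 18)
# neg = 0.50 (2 of the 4 with wheeze at 10 have remitted by 18)
```

    Unlike cumulative incidence, these conditional probabilities separate who acquires the condition from who loses it between waves, which is what reveals the sex switchover described in the abstract.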

  6. Methods for estimation of radiation risk in epidemiological studies accounting for classical and Berkson errors in doses.

    PubMed

    Kukush, Alexander; Shklyar, Sergiy; Masiuk, Sergii; Likhtarov, Illya; Kovgan, Lina; Carroll, Raymond J; Bouville, Andre

    2011-02-16

    With a binary response Y, the dose-response model under consideration is logistic in flavor, with pr(Y=1 | D) = R(1+R)^(-1), R = λ_0 + EAR·D, where λ_0 is the baseline incidence rate and EAR is the excess absolute risk per gray. The calculated thyroid dose of person i is expressed as D_i(mes) = f_i Q_i(mes) / M_i(mes). Here, Q_i(mes) is the measured content of radioiodine in the thyroid gland of person i at time t(mes), M_i(mes) is the estimate of the thyroid mass, and f_i is the normalizing multiplier. The Q_i and M_i are measured with multiplicative errors V_i(Q) and V_i(M), so that Q_i(mes) = Q_i(tr) V_i(Q) (a classical measurement error model) and M_i(tr) = M_i(mes) V_i(M) (a Berkson measurement error model). Here, Q_i(tr) is the true content of radioactivity in the thyroid gland, and M_i(tr) is the true value of the thyroid mass. The error in f_i is much smaller than the errors in (Q_i(mes), M_i(mes)) and is ignored in the analysis. By means of Parametric Full Maximum Likelihood and Regression Calibration (under the assumption that the set of true doses is lognormally distributed), Nonparametric Full Maximum Likelihood, Nonparametric Regression Calibration, and a properly tuned SIMEX method, we study the influence of measurement errors in thyroid dose on the estimates of λ_0 and EAR. A simulation study is presented based on a real sample from the epidemiological studies. The doses were reconstructed in the framework of the Ukrainian-American project on the investigation of post-Chernobyl thyroid cancers in Ukraine, and the underlying subpopulation was artificially enlarged in order to increase the statistical power. The true risk parameters were given the values from earlier epidemiological studies, and the binary response was then simulated according to the dose-response model.
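    The data-generating structure described above — classical multiplicative error on the measured activity and Berkson multiplicative error on the thyroid mass — can be sketched in a short simulation. All parameter values here are illustrative assumptions, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Hypothetical parameters (illustrative only, not from the study)
lam0, EAR = 0.001, 0.5           # baseline rate; excess absolute risk per Gy
sigma_Q, sigma_M = 0.3, 0.3      # lognormal error sizes

Q_tr = rng.lognormal(mean=0.0, sigma=0.8, size=n)   # true thyroid activity
M_mes = rng.lognormal(mean=2.0, sigma=0.4, size=n)  # measured thyroid mass
f = 1.0                                             # normalizing multiplier

# Classical error: the measurement scatters around the true value
Q_mes = Q_tr * rng.lognormal(0.0, sigma_Q, size=n)
# Berkson error: the true value scatters around the measurement
M_tr = M_mes * rng.lognormal(0.0, sigma_M, size=n)

D_tr = f * Q_tr / M_tr     # true dose
D_mes = f * Q_mes / M_mes  # calculated dose

# Binary response from the model pr(Y=1 | D) = R/(1+R), R = lam0 + EAR*D
R = lam0 + EAR * D_tr
Y = rng.random(n) < R / (1 + R)
```

    Fitting the dose-response model naively to D_mes instead of D_tr is what attenuates the EAR estimate; the likelihood, regression-calibration, and SIMEX methods in the paper are different ways of correcting for this.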

  7. Single-Case Designs and Qualitative Methods: Applying a Mixed Methods Research Perspective

    ERIC Educational Resources Information Center

    Hitchcock, John H.; Nastasi, Bonnie K.; Summerville, Meredith

    2010-01-01

    The purpose of this conceptual paper is to describe a design that mixes single-case (sometimes referred to as single-subject) and qualitative methods, hereafter referred to as a single-case mixed methods design (SCD-MM). Minimal attention has been given to the topic of applying qualitative methods to SCD work in the literature. These two…

  9. An artificial viscosity method for the design of supercritical airfoils

    NASA Technical Reports Server (NTRS)

    Mcfadden, G. B.

    1979-01-01

    A numerical technique is presented for the design of two-dimensional supercritical wing sections with low wave drag. The method is a design mode of the analysis code H which gives excellent agreement with experimental results and is widely used in the aircraft industry. Topics covered include the partial differential equations of transonic flow, the computational procedure and results; the design procedure; a convergence theorem; and description of the code.

  10. Stabilizing State-Feedback Design via the Moving Horizon Method.

    DTIC Science & Technology

    1982-01-01

    Scanned-document residue; recoverable information: Keywords: stabilizing control design; linear time-varying systems; fixed depth horizon; index optimization methods; dual system. Approved for public release; distribution unlimited. Abstract: A stabilizing control design for general linear time-varying systems through

  11. Numerical methods for aerothermodynamic design of hypersonic space transport vehicles

    NASA Astrophysics Data System (ADS)

    Wanie, K. M.; Brenneis, A.; Eberle, A.; Heiss, S.

    1993-04-01

    The requirement of the design process of hypersonic vehicles to predict flow past entire configurations with wings, fins, flaps, and propulsion system represents one of the major challenges for aerothermodynamics. In this context, computational fluid dynamics has emerged as a powerful tool to support the experimental work. Several numerical methods developed at MBB to meet the needs of the design process are described. The governing equations and fundamental details of the solution methods are briefly reviewed. Results are given for both geometrically simple test cases and realistic hypersonic configurations. Since there is still a considerable lack of experience with hypersonic flow calculations, extensive testing and verification are essential. This verification is done by comparing results with experimental data and with other numerical methods. The results presented prove that the methods used are robust, flexible, and accurate enough to fulfill the strong needs of the design process.

  12. Working stress design method for reinforced soil walls

    SciTech Connect

    Ehrlich, M.; Mitchell, J.K.

    1994-04-01

    A method for the internal design of reinforced soil walls based on working stresses is developed and evaluated using measurements from five full-scale structures containing a range of reinforcement types. It is shown that, in general, the stiffer the reinforcement system and the higher the stresses induced during compaction, the higher are the tensile stresses that must be resisted by the reinforcements. Unique features of this method, compared to currently used reinforced soil wall design methods, are that it can be applied to all types of reinforcement systems, reinforcement and soil stiffness properties are considered, and backfill compaction stresses are taken explicitly into account. The method can be applied either analytically or using design charts. A design example is included.

  13. Two-Method Planned Missing Designs for Longitudinal Research

    ERIC Educational Resources Information Center

    Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.

    2014-01-01

    We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…

  14. New directions for Artificial Intelligence (AI) methods in optimum design

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1989-01-01

    Developments and applications of artificial intelligence (AI) methods in the design of structural systems is reviewed. Principal shortcomings in the current approach are emphasized, and the need for some degree of formalism in the development environment for such design tools is underscored. Emphasis is placed on efforts to integrate algorithmic computations in expert systems.

  16. Investigating the Use of Design Methods by Capstone Design Students at Clemson University

    ERIC Educational Resources Information Center

    Miller, W. Stuart; Summers, Joshua D.

    2013-01-01

    The authors describe a preliminary study to understand the attitude of engineering students regarding the use of design methods in projects to identify the factors either affecting or influencing the use of these methods by novice engineers. A senior undergraduate capstone design course at Clemson University, consisting of approximately fifty…

  17. Approximate method of designing a two-element airfoil

    NASA Astrophysics Data System (ADS)

    Abzalilov, D. F.; Mardanov, R. F.

    2011-09-01

    An approximate method is proposed for designing a two-element airfoil. The method is based on reducing an inverse boundary-value problem in a doubly connected domain to a problem in a singly connected domain located on a multisheet Riemann surface. The essence of the method is replacement of channels between the airfoil elements by channels of flow suction and blowing. The shape of these channels asymptotically tends to the annular shape of channels passing to infinity on the second sheet of the Riemann surface. The proposed method can be extended to designing multielement airfoils.

  18. New knowledge network evaluation method for design rationale management

    NASA Astrophysics Data System (ADS)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

    Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is a measure of DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives, namely, the design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. A comprehensive value of the DR knowledge is also measured by the proposed method. To validate the proposed method, different styles of DR knowledge network and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that can provide objective metrics and a selection basis for DR knowledge reuse during the product design process. The method is also shown to give more effective guidance and support for the application and management of DR knowledge.

  19. Design method for four-reflector type beam waveguide systems

    NASA Technical Reports Server (NTRS)

    Betsudan, S.; Katagi, T.; Urasaki, S.

    1986-01-01

    Discussed is a method for the design of four-reflector beam waveguide feed systems, comprised of a conical horn and four focused reflectors, which are widely used as the primary feed systems for communications-satellite Earth-station antennas. The design parameters for these systems are clarified, the relations between the parameters are derived based on beam-mode expansion, and the independent design parameters are specified. The characteristics of these systems, namely spillover loss, crosspolarization components, and frequency characteristics, and their relation to the design parameters, are also shown. It is also indicated that the design parameters which determine the dimensions of the conical horn or the shape of the focused reflectors can be uniquely established once the design criterion for the system has been selected as either (1) minimizing the crosspolarization component while keeping the spillover loss within acceptable limits, or (2) minimizing the spillover loss while maintaining the crosspolarization components below an acceptable level, and the independent design parameters, such as the respective sizes of the focused reflectors and the distances between them, have been established according to mechanical restrictions. A sample design is also shown. In addition to clarifying the effects of each design parameter on the system and improving insight into these systems, this method will also increase the efficiency of designing such systems.

  20. Ecogeographic Genetic Epidemiology

    PubMed Central

    Sloan, Chantel D.; Duell, Eric J.; Shi, Xun; Irwin, Rebecca; Andrew, Angeline S.; Williams, Scott M.; Moore, Jason H.

    2009-01-01

    Complex diseases such as cancer and heart disease result from interactions between an individual's genetics and environment, i.e. their human ecology. Rates of complex diseases have consistently demonstrated geographic patterns of incidence, or spatial “clusters” of increased incidence relative to the general population. Likewise, genetic subpopulations and environmental influences are not evenly distributed across space. Merging appropriate methods from genetic epidemiology, ecology and geography will provide a more complete understanding of the spatial interactions between genetics and environment that result in spatial patterning of disease rates. Geographic Information Systems (GIS), which are tools designed specifically for dealing with geographic data and performing spatial analyses to determine their relationship, are key to this kind of data integration. Here the authors introduce a new interdisciplinary paradigm, ecogeographic genetic epidemiology, which uses GIS and spatial statistical analyses to layer genetic subpopulation and environmental data with disease rates and thereby discern the complex gene-environment interactions which result in spatial patterns of incidence. PMID:19025788

  1. Web tools for molecular epidemiology of tuberculosis.

    PubMed

    Shabbeer, Amina; Ozcaglar, Cagri; Yener, Bülent; Bennett, Kristin P

    2012-06-01

    In this study we explore publicly available web tools designed to use molecular epidemiological data to extract information that can be employed for the effective tracking and control of tuberculosis (TB). The application of molecular methods for the epidemiology of TB complement traditional approaches used in public health. DNA fingerprinting methods are now routinely employed in TB surveillance programs and are primarily used to detect recent transmissions and in outbreak investigations. Here we present web tools that facilitate systematic analysis of Mycobacterium tuberculosis complex (MTBC) genotype information and provide a view of the genetic diversity in the MTBC population. These tools help answer questions about the characteristics of MTBC strains, such as their pathogenicity, virulence, immunogenicity, transmissibility, drug-resistance profiles and host-pathogen associativity. They provide an integrated platform for researchers to use molecular epidemiological data to address current challenges in the understanding of TB dynamics and the characteristics of MTBC.

  2. [Epidemiology and heterogeny].

    PubMed

    Breilh, J; Granda, E

    1989-01-01

    The innovation of epidemiology plays a crucial role in the development of the health sciences. The authors emphasize the importance of epistemological analysis related to scientific and technical production. They focus on the theoretical and methodological contributions of the principal Latin American groups in the field of epidemiology, stating their main accomplishments, issues and potentials. When reviewing those conceptual and practical innovations, the authors analyse the effects of broader historical conditions on scientific work. To them, contemporary innovative Latin American epidemiological research has developed clearly differentiated principles, methods and technical projections, which have led to a movement of critical or 'social' epidemiology. The functionalist approach of conventional epidemiology, characterized by an empiricist viewpoint, is being overcome by a more rigorous and analytical approach. This new epidemiological approach, in which the authors as members of CEAS (Health Research and Advisory Center) are working, has selectively incorporated some of the technical instruments of conventional epidemiology, subordinating them to a different theoretical and logical paradigm. The new framework of this group explains the need to consider the people's objective situation and necessities when constructing scientific interpretations and planning technical action. In order to accomplish this goal, epidemiological reasoning has to reflect the unity of external epidemiological facts and associations, the so-called phenomenological aspect of health, with the underlying determinants and conditioning processes or internal relations, which are the essence of the health-disease production and distribution process. Epidemiological analysis is considered not only as a problem of empirical observation but as a process of theoretical construction, in which there is a dynamic fusion of deductive and inductive reasoning. (ABSTRACT TRUNCATED AT 250

  3. Snippets from the past: the evolution of Wade Hampton Frost's epidemiology as viewed from the American Journal of Hygiene/Epidemiology.

    PubMed

    Morabia, Alfredo

    2013-10-01

    Wade Hampton Frost, who was a Professor of Epidemiology at Johns Hopkins University from 1919 to 1938, spurred the development of epidemiologic methods. His 6 publications in the American Journal of Hygiene, which later became the American Journal of Epidemiology, comprise a 1928 Cutter lecture on a theory of epidemics, a survey-based study of tonsillectomy and immunity to Corynebacterium diphtheriae (1931), 2 papers from a longitudinal study of the incidence of minor respiratory diseases (1933 and 1935), an attack rate ratio analysis of the decline of diphtheria in Baltimore (1936), and a 1936 lecture on the age, time, and cohort analysis of tuberculosis mortality. These 6 American Journal of Hygiene/American Journal of Epidemiology papers attest that Frost's personal evolution mirrored that of the emerging "early" epidemiology: The scope of epidemiology extended beyond the study of epidemics of acute infectious diseases, and rigorous comparative study designs and their associated quantitative methods came to light.

  4. Ambient air pollution epidemiology systematic review and meta-analysis: A review of reporting and methods practice.

    PubMed

    Sheehan, Mary C; Lam, Juleen; Navas-Acien, Ana; Chang, Howard H

    2016-01-01

    Systematic review and meta-analysis (SRMA) are increasingly employed in environmental health (EH) epidemiology and, provided methods and reporting are sound, contribute to translating science evidence to policy. Ambient air pollution (AAP) is both among the leading environmental causes of mortality and morbidity worldwide and of growing policy relevance due to the health co-benefits associated with greenhouse gas emission reductions. We reviewed the published AAP SRMA literature (2009 to mid-2015) and evaluated the consistency of methods, reporting and evidence evaluation using a 22-point questionnaire developed from available best-practice consensus guidelines and emerging recommendations for EH. Our goal was to contribute to enhancing the utility of AAP SRMAs to EH policy. We identified 43 studies that used both SR and MA techniques to examine associations between the AAPs PM2.5, PM10, NO2, SO2, CO and O3 and various health outcomes. On average, AAP SRMAs partially or thoroughly addressed 16 of 22 questions (range 10-21), and thoroughly addressed 13 of 22 (range 5-19). We found evidence of an improving trend over the period. However, we observed some weaknesses, particularly infrequent formal review of underlying study quality and risk of bias, which correlated with less thorough evaluation of key study-quality parameters. Several other areas for enhanced reporting are highlighted. The AAP SRMA literature, in particular the more recent studies, indicates broad concordance with current and emerging best-practice guidance. Development of an EH-specific SRMA consensus statement, including a risk-of-bias evaluation tool, would contribute to enhanced reliability and robustness as well as policy utility. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. A comparison of methods for DPLL loop filter design

    NASA Technical Reports Server (NTRS)

    Aguirre, S.; Hurd, W. J.; Kumar, R.; Statman, J.

    1986-01-01

    Four design methodologies for loop filters for a class of digital phase-locked loops (DPLLs) are presented. The first design maps an optimum analog filter into the digital domain; the second approach designs a filter that minimizes, in discrete time, a weighted combination of the variance of the phase error due to noise and the sum of squares of the deterministic phase error component; the third method uses Kalman filter estimation theory to design a filter composed of a least squares fading memory estimator and a predictor. The last design relies on classical theory, including rules for the design of compensators. Linear analysis is used throughout the article to compare the different designs, covering stability, steady-state performance and transient behavior of the loops. Design methodology is not critical when the loop update rate can be made high relative to the loop bandwidth, as the performance approaches that of continuous time. For low update rates, however, the minimization method is significantly superior to the other methods.
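The proportional-plus-integral loop filter underlying such second-order DPLLs can be illustrated with a minimal simulation; the gains `kp` and `ki` are arbitrary illustrative values, not taken from the paper:

```python
def dpll_track(phase_in, kp=0.3, ki=0.05):
    """Second-order DPLL: phase detector -> PI loop filter -> NCO update."""
    nco_phase = 0.0
    integrator = 0.0
    errors = []
    for ref in phase_in:
        err = ref - nco_phase               # phase detector output
        integrator += ki * err              # integral path of the loop filter
        nco_phase += kp * err + integrator  # NCO advances by the filtered error
        errors.append(err)
    return errors

# Track a constant-frequency input, i.e. a linear phase ramp
ramp = [0.01 * n for n in range(200)]
errs = dpll_track(ramp)
```

Because the loop is type 2 (it contains an integrator), the phase error for a frequency offset decays to zero in steady state, which is the behavior the linear analysis in the paper quantifies.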

  7. Novel parameter-based flexure bearing design method

    NASA Astrophysics Data System (ADS)

    Amoedo, Simon; Thebaud, Edouard; Gschwendtner, Michael; White, David

    2016-06-01

    A parameter study was carried out on the design variables of a flexure bearing to be used in a Stirling engine with a fixed axial displacement and a fixed outer diameter. A design method was developed in order to assist identification of the optimum bearing configuration. This was achieved through a parameter study of the bearing carried out with ANSYS®. The parameters varied were the number and the width of the arms, the thickness of the bearing, the eccentricity, the size of the starting and ending holes, and the turn angle of the spiral. Comparison was made between the different designs in terms of axial and radial stiffness, the natural frequency, and the maximum induced stresses. Moreover, the Finite Element Analysis (FEA) was compared to theoretical results for a given design. The results led to a graphical design method which assists the selection of flexure bearing geometrical parameters based on pre-determined geometric and material constraints.
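The sweep-and-select workflow of such a parameter study can be sketched as follows; the stiffness expressions are made-up proportionalities for illustration of the workflow only, not the ANSYS® results of the paper:

```python
# Toy stiffness model: axial stiffness grows with arm count and thickness^3;
# radial stiffness grows faster with arm count (made-up proportionalities).
def toy_axial(n_arms, t):
    return 0.8 * n_arms * t ** 3

def toy_radial(n_arms, t):
    return 5.0 * n_arms ** 2 * t ** 3

# Sweep the design variables, keep configurations meeting a radial-stiffness
# constraint, then pick the one with the lowest (most compliant) axial stiffness.
candidates = [(n, t) for n in (2, 3, 4) for t in (0.3, 0.4, 0.5)]
feasible = [(n, t) for n, t in candidates if toy_radial(n, t) >= 1.0]
best = min(feasible, key=lambda nt: toy_axial(*nt))
```

In the paper the per-configuration evaluations come from FEA rather than closed-form expressions, but the selection logic over the swept parameters is the same.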

  8. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    The design knowledge of modern mechatronic products centers on information processing in knowledge-intensive engineering; product design innovation is therefore essentially innovation in knowledge and information processing. Based on an analysis of the role of mechatronic product design knowledge and of information management features, a unified XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, demonstrating that this method is helpful for knowledge-based design systems and product innovation.
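A minimal sketch of the function/structure/mapping representation described above, using hypothetical element names and Python's standard `xml.etree` API:

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: a <product> holds function elements, structure elements,
# and mappings linking function ids to structure ids.
product = ET.Element("product", name="parallel_friction_roller")
functions = ET.SubElement(product, "functions")
ET.SubElement(functions, "function", id="F1", description="transmit torque")
structures = ET.SubElement(product, "structures")
ET.SubElement(structures, "structure", id="S1", description="roller pair")
mappings = ET.SubElement(product, "mappings")
ET.SubElement(mappings, "map", function="F1", structure="S1")

xml_text = ET.tostring(product, encoding="unicode")

# Query the function-to-structure mapping back out of the document
root = ET.fromstring(xml_text)
links = [(m.get("function"), m.get("structure")) for m in root.iter("map")]
```

The point of the XML form is that function knowledge, structure knowledge and their relationships live in one queryable document rather than in separate tool-specific formats.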

  10. On design methods for bolted joints in composite aircraft structures

    NASA Astrophysics Data System (ADS)

    Ireman, Tomas; Nyman, Tonny; Hellbom, Kurt

    The problems related to the determination of the load distribution in a multirow fastener joint using the finite element method are discussed. Both simple and more advanced design methods used at Saab Military Aircraft are presented. The stress distributions obtained with an analytically based method and an FE-based method are compared. Results from failure predictions with a simple analytically based method and the more advanced FE-based method of multi-fastener tension and shear loaded test specimens are compared with experiments. Finally, complicating factors such as three-dimensional effects caused by secondary bending and fastener bending are discussed and suggestions for future research are given.
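The load distribution in a multirow fastener joint is classically estimated with a one-dimensional spring model, plates as axial springs and fasteners as shear springs; a minimal sketch with illustrative stiffness values (not Saab's design data):

```python
def solve(a, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def fastener_loads(n=3, ka=2.0, kb=2.0, kf=1.0, P=1.0):
    """Spring model of a multirow joint: plate segments as axial springs
    (ka, kb), fasteners as shear springs (kf). Returns per-row load transfer."""
    size = 2 * n   # unknowns: uA_0..uA_{n-1}, uB_0..uB_{n-1}
    K = [[0.0] * size for _ in range(size)]
    f = [0.0] * size
    def add(i, j, k):
        K[i][i] += k; K[j][j] += k; K[i][j] -= k; K[j][i] -= k
    for i in range(n - 1):
        add(i, i + 1, ka)           # plate A segment between rows
        add(n + i, n + i + 1, kb)   # plate B segment between rows
    for i in range(n):
        add(i, n + i, kf)           # fastener shear spring at row i
    f[0] = P                        # load enters plate A at the first row
    K[size - 1][size - 1] += 1e9    # ground plate B at the far end (reaction)
    u = solve(K, f)
    return [kf * (u[i] - u[n + i]) for i in range(n)]

loads = fastener_loads()
```

With equal plate stiffnesses the distribution is symmetric and the outer rows carry the largest share, which is why multirow joints cannot be sized by dividing the load evenly across fasteners.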

  11. A Bright Future for Evolutionary Methods in Drug Design.

    PubMed

    Le, Tu C; Winkler, David A

    2015-08-01

    Most medicinal chemists understand that chemical space is extremely large, essentially infinite. Although high-throughput experimental methods allow exploration of drug-like space more rapidly, they are still insufficient to fully exploit the opportunities that such large chemical space offers. Evolutionary methods can synergistically blend automated synthesis and characterization methods with computational design to identify promising regions of chemical space more efficiently. We describe how evolutionary methods are implemented, and provide examples of published drug development research in which these methods have generated molecules with increased efficacy. We anticipate that evolutionary methods will play an important role in future drug discovery.
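The evolutionary loop described above (generate, score, select, recombine, mutate) can be sketched with a toy fitness function standing in for assay results; the alphabet and target string are purely illustrative, not a real molecular representation:

```python
import random

random.seed(0)
ALPHABET = "ACDEFGHIKLMNPQRSTVWY"   # stand-in building blocks
TARGET = "MKTAYIAK"                 # hypothetical optimum behind the "assay"

def fitness(seq):
    # toy surrogate for a measured efficacy score
    return sum(a == b for a, b in zip(seq, TARGET))

def evolve(pop_size=40, generations=60, mut_rate=0.1):
    pop = ["".join(random.choice(ALPHABET) for _ in TARGET)
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            p1, p2 = random.sample(parents, 2)
            cut = random.randrange(1, len(TARGET))
            child = list(p1[:cut] + p2[cut:])   # one-point crossover
            for i in range(len(child)):         # point mutation
                if random.random() < mut_rate:
                    child[i] = random.choice(ALPHABET)
            children.append("".join(child))
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In a real pipeline the fitness call is the expensive step, automated synthesis plus characterization, which is exactly why the evolutionary framing is attractive: it spends those calls only on promising regions of chemical space.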

  12. Design of diffractive optical surfaces within the nonimaging SMS design method

    NASA Astrophysics Data System (ADS)

    Mendes-Lopes, João.; Benítez, Pablo; Miñano, Juan C.

    2015-09-01

    The Simultaneous Multiple Surface (SMS) method was initially developed as a design method in Nonimaging Optics and was later extended to designing Imaging Optics. We show an extension of the SMS method to diffractive surfaces. Using this method, diffractive kinoform surfaces are calculated simultaneously and through a direct method, i.e. one not based on multi-parametric optimization techniques. Using the phase-shift properties of diffractive surfaces as an extra degree of freedom, only N/2 surfaces are needed to perfectly couple N one-parameter wavefronts. Wavefronts of different wavelengths can also be coupled, hence chromatic aberration can be corrected in SMS-based systems. The method can combine and simultaneously calculate reflective, refractive and diffractive surfaces, through direct calculation of phase and refractive/reflective profiles. Representative diffractive systems designed by the SMS method are presented.

  13. The Design with Intent Method: a design tool for influencing user behaviour.

    PubMed

    Lockton, Dan; Harrison, David; Stanton, Neville A

    2010-05-01

    Using product and system design to influence user behaviour offers potential for improving performance and reducing user error, yet little guidance is available at the concept generation stage for design teams briefed with influencing user behaviour. This article presents the Design with Intent Method, an innovation tool for designers working in this area, illustrated via application to an everyday human-technology interaction problem: reducing the likelihood of a customer leaving his or her card in an automatic teller machine. The example application results in a range of feasible design concepts which are comparable to existing developments in ATM design, demonstrating that the method has potential for development and application as part of a user-centred design process.

  14. INNOVATIVE METHODS FOR THE OPTIMIZATION OF GRAVITY STORM SEWER DESIGN

    EPA Science Inventory

    The purpose of this paper is to describe a new method for optimizing the design of urban storm sewer systems. Previous efforts to optimize gravity sewers have met with limited success because classical optimization methods require that the problem be well behaved, e.g. describ...

  15. Designing, Teaching, and Evaluating Two Complementary Mixed Methods Research Courses

    ERIC Educational Resources Information Center

    Christ, Thomas W.

    2009-01-01

    Teaching mixed methods research is difficult. This longitudinal explanatory study examined how two classes were designed, taught, and evaluated. Curriculum, Research, and Teaching (EDCS-606) and Mixed Methods Research (EDCS-780) used a research proposal generation process to highlight the importance of the purpose, research question and…

  17. Advances in multiparameter optimization methods for de novo drug design.

    PubMed

    Segall, Matthew

    2014-07-01

    A high-quality drug must achieve a balance of physicochemical and absorption, distribution, metabolism and elimination properties, safety and potency against its therapeutic target(s). Multiparameter optimization (MPO) methods guide the simultaneous optimization of multiple factors to quickly target compounds with the highest chance of downstream success. MPO can be combined with 'de novo design' methods to automatically generate and assess a large number of diverse structures and identify strategies to optimize a compound's overall balance of properties. The article reviews MPO methods, recent methodological developments and opinions in the field. It also describes advances in de novo design that improve the relevance of automatically generated compound structures and integrate MPO. Finally, the article discusses a recent case study of the automatic design of ligands to polypharmacological profiles. Recent developments have reduced the generation of chemically infeasible structures and improved the quality of compounds generated by de novo design methods. There are concerns about the ability of simple drug-like properties and ligand efficiency indices to effectively guide the detailed optimization of compounds. De novo design methods cannot identify a perfect compound for synthesis, but they can identify high-quality ideas for detailed consideration by an expert scientist.
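A common MPO scoring scheme combines per-property desirability functions into a single score; a minimal sketch with hypothetical property criteria (the ranges and the Gaussian fall-off are illustrative choices, not a published scheme):

```python
import math

def desirability(value, low, high):
    """Map a property value to [0, 1]: 1 inside the preferred range,
    decaying smoothly outside it (a common MPO scoring shape)."""
    if low <= value <= high:
        return 1.0
    edge = low if value < low else high
    return math.exp(-((value - edge) ** 2))

def mpo_score(props, criteria):
    # geometric mean of per-property desirabilities: one bad property
    # drags the whole score down, enforcing balance rather than excellence
    # in a single dimension
    ds = [desirability(props[k], *criteria[k]) for k in criteria]
    return math.prod(ds) ** (1.0 / len(ds))

# Hypothetical criteria: logP in [1, 3], molecular weight in [200, 450]
criteria = {"logp": (1.0, 3.0), "mw": (200.0, 450.0)}
good = mpo_score({"logp": 2.1, "mw": 320.0}, criteria)
poor = mpo_score({"logp": 5.5, "mw": 620.0}, criteria)
```

Scores like this are what a de novo generator would rank candidate structures by before any is passed to a chemist.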

  18. Methods and Descriptive Epidemiology of Services Provided by Athletic Trainers in High Schools: The National Athletic Treatment, Injury and Outcomes Network Study

    PubMed Central

    Kerr, Zachary Y.; Dompier, Thomas P.; Dalton, Sara L.; Miller, Sayers John; Hayden, Ross; Marshall, Stephen W.

    2015-01-01

    Context Research is limited on the extent and nature of the care provided by athletic trainers (ATs) to student-athletes in the high school setting. Objective To describe the methods of the National Athletic Treatment, Injury and Outcomes Network (NATION) project and provide the descriptive epidemiology of AT services for injury care in 27 high school sports. Design Descriptive epidemiology study. Setting Athletic training room (ATR) visits and AT services data collected in 147 high schools from 26 states. Patients or Other Participants High school student-athletes participating in 13 boys' sports and 14 girls' sports during the 2011−2012 through 2013−2014 academic years. Main Outcome Measure(s) The number of ATR visits and individual AT services, as well as the mean number of ATR visits (per injury) and AT services (per injury and ATR visit) were calculated by sport and for time-loss (TL) and non–time-loss (NTL) injuries. Results Over the 3-year period, 210 773 ATR visits and 557 381 AT services were reported for 50 604 injuries. Most ATR visits (70%) were for NTL injuries. Common AT services were therapeutic activities or exercise (45.4%), modalities (18.6%), and AT evaluation and reevaluation (15.9%), with an average of 4.17 ± 6.52 ATR visits and 11.01 ± 22.86 AT services per injury. Compared with NTL injuries, patients with TL injuries accrued more ATR visits (7.76 versus 3.47; P < .001) and AT services (18.60 versus 9.56; P < .001) per injury. An average of 2.24 ± 1.33 AT services were reported per ATR visit. Compared with TL injuries, NTL injuries had a larger average number of AT services per ATR visit (2.28 versus 2.05; P < .001). Conclusions These findings highlight the broad spectrum of care provided by ATs to high school student-athletes and demonstrate that patients with NTL injuries require substantial amounts of AT services. PMID:26678290

  19. Test methods and design allowables for fibrous composites. Volume 2

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C. (Editor)

    1989-01-01

    Topics discussed include extreme/hostile environment testing, establishing design allowables, and property/behavior specific testing. Papers are presented on environmental effects on the high strain rate properties of graphite/epoxy composite, the low-temperature performance of short-fiber reinforced thermoplastics, the abrasive wear behavior of unidirectional and woven graphite fiber/PEEK, test methods for determining design allowables for fiber reinforced composites, and statistical methods for calculating material allowables for MIL-HDBK-17. Attention is also given to a test method to measure the response of composite materials under reversed cyclic loads, a through-the-thickness strength specimen for composites, the use of torsion tubes to measure in-plane shear properties of filament-wound composites, the influence of test fixture design on the Iosipescu shear test for fiber composite materials, and a method for monitoring in-plane shear modulus in fatigue testing of composites.

  1. Comparison of Optimal Design Methods in Inverse Problems

    PubMed Central

    Banks, H. T.; Holm, Kathleen; Kappel, Franz

    2011-01-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29]. PMID:21857762
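The D- and E-optimal criteria compared above both derive from the FIM. For a simple linear model y = a + b·t with Gaussian noise the FIM is 2×2 and both criteria have closed forms, which already shows why spread-out sampling times are preferred; the candidate designs below are illustrative:

```python
import math

def fim(times, sigma=1.0):
    """Fisher Information Matrix for y = a + b*t with iid Gaussian noise:
    F = (1/sigma^2) * sum over t of [1, t][1, t]^T."""
    f11 = len(times) / sigma ** 2
    f12 = sum(times) / sigma ** 2
    f22 = sum(t * t for t in times) / sigma ** 2
    return [[f11, f12], [f12, f22]]

def d_criterion(F):
    # D-optimal design maximizes det(F) (shrinks the confidence ellipsoid)
    return F[0][0] * F[1][1] - F[0][1] * F[1][0]

def e_criterion(F):
    # E-optimal design maximizes the smallest eigenvalue of F
    tr = F[0][0] + F[1][1]
    disc = math.sqrt(tr * tr - 4 * d_criterion(F))
    return (tr - disc) / 2

clustered = [0.4, 0.5, 0.6]   # sampling times bunched together
spread = [0.0, 0.5, 1.0]      # endpoints plus midpoint
```

Both criteria rank the spread design higher: clustering the sampling times makes intercept and slope nearly indistinguishable, inflating the standard errors that the paper's SE-optimal criterion targets directly.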

  2. Tradeoff methods in multiobjective insensitive design of airplane control systems

    NASA Technical Reports Server (NTRS)

    Schy, A. A.; Giesy, D. P.

    1984-01-01

    The latest results of an ongoing study of computer-aided design of airplane control systems are given. Constrained minimization algorithms are used, with the design objectives in the constraint vector. The concept of Pareto optimality is briefly reviewed. It is shown how an experienced designer can use it to find designs which are well-balanced in all objectives. Then the problem of finding designs which are insensitive to uncertainty in system parameters is discussed, introducing a probabilistic vector definition of sensitivity which is consistent with the deterministic Pareto optimal problem. Insensitivity is important in any practical design, but it is particularly important in the design of feedback control systems, since it is considered to be the most important distinctive property of feedback control. Methods of tradeoff between deterministic and stochastic-insensitive (SI) design are described, and tradeoff design results are presented for the example of a Shuttle lateral stability augmentation system. This example is used because careful studies have been made of the uncertainty in Shuttle aerodynamics. Finally, since accurate statistics of uncertain parameters are usually not available, the effects of crude statistical models on SI designs are examined.
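Pareto optimality, as used above, means no objective can be improved without worsening another. A minimal filter for extracting the non-dominated designs from a candidate set, with hypothetical (tracking error, control effort) pairs where both objectives are minimized:

```python
def pareto_front(points):
    """Return the non-dominated subset when all objectives are minimized."""
    def dominates(p, q):
        # p dominates q: no worse in every objective, strictly better in one
        return (all(a <= b for a, b in zip(p, q))
                and any(a < b for a, b in zip(p, q)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (tracking error, control effort) pairs for candidate designs
designs = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
front = pareto_front(designs)
```

The designer's judgment then operates only on the front, trading one objective against another, which is the "well-balanced in all objectives" selection the paper describes.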

  4. Perspectives toward the stereotype production method for public symbol design: a case study of novice designers.

    PubMed

    Ng, Annie W Y; Siu, Kin Wai Michael; Chan, Chetwyn C H

    2013-01-01

    This study investigated the practices and attitudes of novice designers toward user involvement in public symbol design at the conceptual design stage, i.e. the stereotype production method. Differences between male and female novice designers were examined. Forty-eight novice designers (24 male, 24 female) were asked to design public symbol referents based on suggestions made by a group of users in a previous study and provide feedback with regard to the design process. The novice designers were receptive to the adoption of user suggestions in the conception of the design, but tended to modify the pictorial representations generated by the users to varying extents. It is also significant that the male and female novice designers appeared to emphasize different aspects of user suggestions, and the female novice designers were more positive toward these suggestions than their male counterparts. The findings should aid the optimization of the stereotype production method for user-involved symbol design. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  5. Statistical Reasoning and Methods in Epidemiology to Promote Individualized Health: In Celebration of the 100th Anniversary of the Johns Hopkins Bloomberg School of Public Health.

    PubMed

    Ogburn, Elizabeth L; Zeger, Scott L

    2016-03-01

    Epidemiology is concerned with determining the distribution and causes of disease. Throughout its history, epidemiology has drawn upon statistical ideas and methods to achieve its aims. Because of the exponential growth in our capacity to measure and analyze data on the underlying processes that define each person's state of health, there is an emerging opportunity for population-based epidemiologic studies to influence health decisions made by individuals in ways that take into account the individuals' characteristics, circumstances, and preferences. We refer to this endeavor as "individualized health." The present article comprises 2 sections. In the first, we describe how graphical, longitudinal, and hierarchical models can inform the project of individualized health. We propose a simple graphical model for informing individual health decisions using population-based data. In the second, we review selected topics in causal inference that we believe to be particularly useful for individualized health. Epidemiology and biostatistics were 2 of the 4 founding departments in the world's first graduate school of public health at Johns Hopkins University, the centennial of which we honor. This survey of a small part of the literature is intended to demonstrate that the 2 fields remain just as inextricably linked today as they were 100 years ago. © The Author 2016. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Computer method for design of acoustic liners for turbofan engines

    NASA Technical Reports Server (NTRS)

    Minner, G. L.; Rice, E. J.

    1976-01-01

    A design package is presented for the specification of acoustic liners for turbofans. An estimate of the noise generation was made based on modifications of existing noise correlations, for which the inputs are basic fan aerodynamic design variables. The method does not predict multiple pure tones. A target attenuation spectrum was calculated which was the difference between the estimated generation spectrum and a flat annoyance-weighted goal attenuated spectrum. The target spectrum was combined with a knowledge of acoustic liner performance as a function of the liner design variables to specify the acoustic design. The liner design method at present is limited to annular duct configurations. The detailed structure of the liner was specified by combining the required impedance (which is a result of the previous step) with a mathematical model relating impedance to the detailed structure. The design procedure was developed for a liner constructed of perforated sheet placed over honeycomb backing cavities. A sample calculation was carried through in order to demonstrate the design procedure, and experimental results presented show good agreement with the calculated results of the method.
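The target-attenuation step described above, estimated generation spectrum minus the goal spectrum, reduces to a per-band subtraction clipped at zero; the band levels below are hypothetical:

```python
# Hypothetical 1/3-octave band levels in dB for the estimated noise generation
generation = {500: 112.0, 1000: 118.0, 2000: 121.0, 4000: 116.0}
goal = 108.0  # flat annoyance-weighted goal level, dB

# Target attenuation: how much each band must be suppressed (never negative)
target = {band: max(0.0, level - goal) for band, level in generation.items()}
```

The resulting per-band targets are what the liner impedance model is then asked to deliver, so errors in the generation estimate propagate directly into the liner specification.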

  7. Assessment of methods and results of reproductive occupational epidemiology: spontaneous abortions and malformations in the offspring of working women

    SciTech Connect

    Hemminki, K.; Axelson, O.; Niemi, M.L.; Ahlborg, G.

    1983-01-01

    Epidemiological studies relating occupational exposures of working women to spontaneous abortions and malformations are reviewed, and some methodological considerations are presented. Reproductive epidemiology is less developed than epidemiology in general and seems to involve some specific problems. The exposures may be reported differently by the women depending on the outcome of the pregnancy; thus confirmation of exposure from an independent data source would be an asset. The types of occupational exposures of the women suggested to carry a risk of spontaneous abortions include anesthetic agents, laboratory work, copper smelting, soldering, and chemical sterilization using ethylene oxide and glutaraldehyde. Maternal employment in laboratories and exposure to solvents have been linked to a risk of congenital malformations in the offspring in five studies. Data on the teratogenic effects of anesthetic gases have been conflicting. In one study, employment in copper smelting was associated with malformations in the offspring.

  8. Epidemiology: Then and Now.

    PubMed

    Kuller, Lewis H

    2016-03-01

    Twenty-five years ago, on the 75th anniversary of the Johns Hopkins Bloomberg School of Public Health, I noted that epidemiologic research was moving away from the traditional approaches used to investigate "epidemics" and their close relationship with preventive medicine. Twenty-five years later, the role of epidemiology as an important contribution to human population research, preventive medicine, and public health is under substantial pressure because of the emphasis on "big data," phenomenology, and personalized medical therapies. Epidemiology is the study of epidemics. The primary role of epidemiology is to identify the epidemics and parameters of interest of host, agent, and environment and to generate and test hypotheses in search of causal pathways. Almost all diseases have a specific distribution in relation to time, place, and person and specific "causes" with high effect sizes. Epidemiology then uses such information to develop interventions and test (through clinical trials and natural experiments) their efficacy and effectiveness. Epidemiology is dependent on new technologies to evaluate improved measurements of host (genomics), epigenetics, identification of agents (metabolomics, proteomics), new technology to evaluate both physical and social environment, and modern methods of data collection. Epidemiology does poorly in studying anything other than epidemics and collections of numerators and denominators without specific hypotheses, even with improved statistical methodologies.

  9. An Integrated Optimization Design Method Based on Surrogate Modeling Applied to Diverging Duct Design

    NASA Astrophysics Data System (ADS)

    Hanan, Lu; Qiushi, Li; Shaobin, Li

    2016-12-01

    This paper presents an integrated optimization design method in which uniform design, response surface methodology and genetic algorithm are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain, and the system performance is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective to the design variables. Subsequently, a genetic algorithm is applied to the surrogate model to acquire the optimal solution subject to constraints. The method has been applied to the optimization design of an axisymmetric diverging duct, dealing with three design variables including one qualitative variable and two quantitative variables. The method performs well in improving the duct aerodynamic performance and can also be applied to wider fields of mechanical design, serving as a useful tool for engineering designers by reducing design time and computational cost.
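The surrogate-based loop (sample the domain, fit a response surface, optimize the cheap model instead of the simulator) can be sketched in one dimension; the quadratic "simulator" stands in for a CFD evaluation and is purely illustrative:

```python
def solve3(A, b):
    # Gaussian elimination with partial pivoting for the 3x3 normal equations
    m = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(m[r][c]))
        m[c], m[p] = m[p], m[c]
        for r in range(c + 1, 3):
            f = m[r][c] / m[c][c]
            for k in range(c, 4):
                m[r][k] -= f * m[c][k]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][k] * x[k] for k in range(r + 1, 3))) / m[r][r]
    return x

def fit_quadratic(xs, ys):
    """Least-squares response surface y ~ c0 + c1*x + c2*x^2."""
    S = [sum(x ** k for x in xs) for k in range(5)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    b = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    return solve3(A, b)

def expensive_sim(x):
    # stand-in for a CFD evaluation of duct performance
    return (x - 0.3) ** 2 + 1.0

xs = [0.0, 0.25, 0.5, 0.75, 1.0]          # uniform design over the domain
ys = [expensive_sim(x) for x in xs]       # build the database
c0, c1, c2 = fit_quadratic(xs, ys)        # response surface model
# optimize the cheap surrogate instead of the simulator
grid = [i / 1000 for i in range(1001)]
x_best = min(grid, key=lambda x: c0 + c1 * x + c2 * x * x)
```

The paper uses a genetic algorithm rather than the grid search shown here, and several design variables rather than one, but the division of labor is the same: expensive evaluations only to build the database, cheap surrogate calls for the optimization.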

  10. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher order equations representing the design space.

  11. Risk factors and study designs used in research of youths' suicide behaviour-an epidemiological discussion with focus on level of evidence.

    PubMed

    Christiansen, Erik; Larsen, Kim Juul; Agerbo, Esben; Bilenberg, Niels; Stenager, Elsebeth

    2014-11-01

    Many different epidemiological study designs have been used to analyse risk factors for suicide behaviour. The purpose of this study was to obtain an insight into the study designs currently used in research on youths' risk factors for suicide behaviour and to rank the studies according to level of evidence (LoE). We searched PubMed and PsycINFO in order to identify relevant individual studies. We included 36 studies of suicidal behaviour and ideation in children and youths; many rank low on LoE. For suicide, the cohort design was often used, and mental illness (depression, substance abuse and severity of mental illness) was the most common risk factor. Cohort studies are ranked 2b, which is high according to LoE. For suicide attempts, surveys were often used, and psychopathology, substance abuse and exposure to suicidal behaviour were the most common risk factors. For suicidal ideation, the survey was the only design used, and substance abuse and psychopathology were the most common risk factors. Surveys are ranked 4, which is low according to LoE. Many risk factors were broad and unspecific, and standard definitions of outcome and exposure were rarely used. A good study of risk factors for suicidal behaviour would need a high LoE, such as a high-powered longitudinal epidemiological study (cohort or case-control) of very specific risk factors. Such factors would have high prevention potential compared with broader, unspecific risk factors to which many people are exposed. We would recommend a cohort design (in high-risk populations) or a case-control design to identify risk factors, using clinical and/or register data instead of self-reported information, reporting adjusted estimates and using standard definitions of suicidal outcomes and risk factors.

  12. A New Design Method based on Cooperative Data Mining from Multi-Objective Design Space

    NASA Astrophysics Data System (ADS)

    Sugimura, Kazuyuki; Obayashi, Shigeru; Jeong, Shinkyu

    We propose a new multi-objective parameter design method that combines the following data mining techniques: analysis of variance, self-organizing maps, decision tree analysis, rough set theory, and association rules. The method first aims to improve multiple objective functions simultaneously using as many predominant main effects of the design variables as possible. It then resolves the remaining conflicts between the objective functions using predominant interaction effects of the design variables. The key to realizing this method is obtaining design rules that quantitatively relate levels of the design variables to levels of the objective functions. Based on comparative studies of the data mining techniques, systematic processes for obtaining these design rules have been clarified, and the key points for combining the techniques have been summarized. The method has been applied to a multi-objective robust optimization problem for an industrial fan, and the results show its superior parameter-control capabilities compared with traditional single-objective parameter design methods such as the Taguchi method.
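    The analysis-of-variance screening that this method combines with the other mining techniques can be illustrated with a minimal sketch. The one-way F statistic below relates levels of a single design variable to objective-function values; the data and level names are invented for illustration.

    ```python
    def anova_f(groups):
        """One-way ANOVA F statistic: variance between levels of one design
        variable versus variance within levels. `groups` maps each level
        name to the list of objective-function values observed there."""
        data = [v for vals in groups.values() for v in vals]
        n, k = len(data), len(groups)
        grand = sum(data) / n
        ss_between = sum(len(v) * (sum(v) / len(v) - grand) ** 2
                         for v in groups.values())
        ss_within = sum((x - sum(v) / len(v)) ** 2
                        for v in groups.values() for x in v)
        return (ss_between / (k - 1)) / (ss_within / (n - k))

    # Invented objective values at two levels of one design variable:
    print(anova_f({"low": [1.0, 2.0], "high": [5.0, 6.0]}))
    ```

    A large F marks the variable as a predominant main effect worth turning into a design rule.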

  13. A decentralized linear quadratic control design method for flexible structures

    NASA Technical Reports Server (NTRS)

    Su, Tzu-Jeng; Craig, Roy R., Jr.

    1990-01-01

    A decentralized suboptimal linear quadratic control design procedure which combines substructural synthesis, model reduction, decentralized control design, subcontroller synthesis, and controller reduction is proposed for the design of reduced-order controllers for flexible structures. The procedure starts with a definition of the continuum structure to be controlled. An evaluation model of finite dimension is obtained by the finite element method. Then, the finite element model is decomposed into several substructures by using a natural decomposition called substructuring decomposition. Each substructure, at this point, still has too large a dimension and must be reduced to a size that is Riccati-solvable. Model reduction of each substructure can be performed by using any existing model reduction method, e.g., modal truncation, balanced reduction, Krylov model reduction, or mixed-mode method. Then, based on the reduced substructure model, a subcontroller is designed by an LQ optimal control method for each substructure independently. After all subcontrollers are designed, a controller synthesis method called substructural controller synthesis is employed to synthesize all subcontrollers into a global controller. The assembling scheme used is the same as that employed for the structure matrices. Finally, a controller reduction scheme, called the equivalent impulse response energy controller (EIREC) reduction algorithm, is used to reduce the global controller to a reasonable size for implementation. The EIREC reduced controller preserves the impulse response energy of the full-order controller and has the property of matching low-frequency moments and low-frequency power moments. An advantage of the substructural controller synthesis method is that it relieves the computational burden associated with dimensionality. Besides that, the SCS design scheme is also a highly adaptable controller synthesis method for structures with varying configuration, or varying mass
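    The per-substructure LQ design step can be sketched for the simplest possible case. The code below is an illustration only: scalar substructure models with invented coefficients, not the paper's SCS or EIREC machinery. It solves the scalar continuous-time Riccati equation in closed form for each substructure independently and collects the resulting gains, mirroring the idea that each subcontroller is designed on a Riccati-solvable reduced model.

    ```python
    import math

    def scalar_lqr(a, b, q, r):
        """LQ gain for a scalar subsystem dx/dt = a*x + b*u with cost
        integral of (q*x^2 + r*u^2): the scalar Riccati equation
        (b^2/r)*P^2 - 2*a*P - q = 0 has the stabilizing root below,
        and the optimal gain is k = b*P/r."""
        p = r * (a + math.sqrt(a * a + b * b * q / r)) / (b * b)
        return b * p / r

    # Hypothetical reduced substructure models (a, b) with weights (q, r):
    substructures = [(-1.0, 1.0, 1.0, 1.0), (0.5, 2.0, 4.0, 1.0)]
    # Design each subcontroller independently; assembling them into a
    # global (here simply diagonal) gain stands in for the synthesis step.
    gains = [scalar_lqr(a, b, q, r) for a, b, q, r in substructures]
    print(gains)
    ```

    Each closed loop a - b*k is stable by construction, which is what makes the independent subcontroller designs safe to assemble.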

  14. A method for scoring the pain map of the McGill Pain Questionnaire for use in epidemiologic studies.

    PubMed

    Escalante, A; Lichtenstein, M J; White, K; Rios, N; Hazuda, H P

    1995-10-01

    Identifying and quantifying the location of pain may be important for understanding specific functional impairments in elderly populations. The purpose of the present analysis was two-fold: first, to describe the reliability of a scoring method for the McGill Pain Map (MPM), and second, to validate the method of scoring the MPM as a tool for assessing areas of body pain in an epidemiologic study. In interviews performed at the subjects' homes, 411 community-dwelling Mexican-American and non-Hispanic white subjects aged 65-74 from the San Antonio Longitudinal Study of Aging (SALSA) were asked to describe the location of their pain on the map of the human body included in the McGill Pain Questionnaire. The location of pain was scored by overlaying the survey figures with a MPM template divided into 36 anatomical areas. Inter- and intra-rater agreement among three raters was measured by calculating a kappa statistic for each of the body areas, and an intraclass correlation coefficient for the total number of painful areas (NPA). Internal validity was measured by Spearman's rho between the NPA and the Present Pain Index (PPI) and Pain Rating Index (PRI) of the McGill Pain Questionnaire, and external validity by correlation between NPA and the Perceived Health (PH), Amount of Bodily Pain (ABP), and Pain Interference with Work (PIW) items of the Medical Outcomes Study, and the Perceived Physical Health (PPH) question of the San Antonio Heart Study. Average inter-rater agreement for individual MPM areas was 0.92 +/- 0.01, and average agreement for NPA was 0.96 +/- 0.01. Intra-rater agreement for individual areas averaged 0.94 +/- 0.01, and for NPA, 0.99 +/- 0.001. Pain in one or more areas was present in 47.7% of the subjects. For the whole sample, correlations between NPA and the validation indices were: PPI (0.91), PRI (0.89), PH (0.25), ABP (0.64), PIW (0.49), and PPH (0.20). 
Among the 196 subjects with pain, correlations were: PPI (0.34), PRI (0.34), PH (0.19), ABP
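    The per-area agreement statistic used above can be sketched as follows. The rating vectors are invented, not SALSA data; each entry records whether a rater marked one MPM area as painful.

    ```python
    def cohens_kappa(r1, r2):
        """Cohen's kappa for two raters scoring the same items as
        painful (1) or not (0): observed agreement corrected for the
        agreement expected by chance from each rater's marginals."""
        n = len(r1)
        po = sum(a == b for a, b in zip(r1, r2)) / n
        p1, p2 = sum(r1) / n, sum(r2) / n
        pe = p1 * p2 + (1 - p1) * (1 - p2)  # chance agreement
        return (po - pe) / (1 - pe)

    # Invented scores for one MPM area across eight subjects:
    rater_a = [1, 1, 0, 0, 1, 0, 0, 0]
    rater_b = [1, 1, 0, 0, 0, 0, 0, 1]
    print(round(cohens_kappa(rater_a, rater_b), 3))
    ```

    Repeating this over all 36 template areas and averaging gives summary figures like the 0.92 inter-rater agreement reported above.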

  15. A method for the design of transonic flexible wings

    NASA Technical Reports Server (NTRS)

    Smith, Leigh Ann; Campbell, Richard L.

    1990-01-01

    A methodology was developed for designing airfoils and wings at transonic speeds that includes a technique to account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small-perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.

  16. Approximate Design Method for Single Stage Pulse Tube Refrigerators

    NASA Astrophysics Data System (ADS)

    Pfotenhauer, J. M.; Gan, Z. H.; Radebaugh, R.

    2008-03-01

    An approximate design method is presented for the design of a single stage Stirling type pulse tube refrigerator. The design method begins from a defined cooling power, operating temperature, average and dynamic pressure, and frequency. Using a combination of phasor analysis, approximate correlations derived from extensive use of REGEN3.2, a few 'rules of thumb,' and available models for inertance tubes, a process is presented to define appropriate geometries for the regenerator, pulse tube and inertance tube components. In addition, specifications for the acoustic power and phase between the pressure and flow required from the compressor are defined. The process enables an appreciation of the primary physical parameters operating within the pulse tube refrigerator, but relies on approximate values for the combined loss mechanisms. The defined geometries can provide both a useful starting point, and a sanity check, for more sophisticated design methodologies.
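    The phasor bookkeeping underlying such a method can be illustrated with complex arithmetic. The amplitudes and phase below are invented, not values from the paper; the snippet computes the time-averaged acoustic power and the pressure-flow phase angle, the two compressor specifications the design process works back to.

    ```python
    import cmath
    import math

    # Invented phasors: pressure and volume-flow oscillations represented
    # as complex amplitudes, with the flow lagging the pressure by 30 deg.
    p = 1.0e5 * cmath.exp(1j * 0.0)                  # dynamic pressure [Pa]
    U = 2.0e-3 * cmath.exp(-1j * math.radians(30))   # volume flow [m^3/s]

    # Time-averaged acoustic power and the pressure-flow phase angle.
    W = 0.5 * (p * U.conjugate()).real
    phase_deg = math.degrees(cmath.phase(p * U.conjugate()))
    print(round(W, 2), round(phase_deg, 1))
    ```

    In a phasor analysis of this kind, the inertance tube is sized to produce the desired phase angle at the regenerator.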

  17. A new method named as Segment-Compound method of baffle design

    NASA Astrophysics Data System (ADS)

    Qin, Xing; Yang, Xiaoxu; Gao, Xin; Liu, Xishuang

    2017-02-01

    As observation demands have increased, so have the requirements on lens imaging quality. A Segment-Compound baffle design method is proposed in this paper. The three traditional methods of baffle design can be characterized as Inside to Outside, Outside to Inside, and Mirror Symmetry. For a transmission-type optical system, all four methods were used to design stray light suppression structures. The structures were then modeled and simulated with SolidWorks, CAXA, and TracePro, and point source transmittance (PST) curves were obtained to describe their performance. The results show that the Segment-Compound method suppresses stray light more effectively. Moreover, it is easy to realize and requires no special materials.

  18. Rotordynamics and Design Methods of an Oil-Free Turbocharger

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.

    1999-01-01

    The feasibility of supporting a turbocharger rotor on air foil bearings is investigated based upon predicted rotordynamic stability, load accommodations, and stress considerations. It is demonstrated that foil bearings offer a plausible replacement for oil-lubricated bearings in diesel truck turbochargers. Also, two different rotor configurations are analyzed and the design is chosen which best optimizes the desired performance characteristics. The method of designing machinery for foil bearing use and the assumptions made are discussed.

  19. Methods for Reachability-based Hybrid Controller Design

    DTIC Science & Technology

    2012-05-10

    Due to the complexity of systems found in practical applications, the problem of controller design is often approached in a hierarchical fashion, with discrete abstractions and design methods used to satisfy high-level task specifications.

  20. Mixed methods research design for pragmatic psychoanalytic studies.

    PubMed

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena.

  1. ERSYS-SPP access method subsystem design specification

    NASA Technical Reports Server (NTRS)

    Weise, R. C. (Principal Investigator)

    1980-01-01

    The STARAN special purpose processor (SPP) is a machine allowing the same operation to be performed on up to 512 different data elements simultaneously. In the ERSYS system, it is to be attached to a 4341 plug-compatible machine (PCM) to execute certain existing algorithms and, at a later date, to perform other, to-be-specified algorithms. The part of the interface between the 4341 PCM and the SPP that is located in the 4341 PCM is known as the SPP access method (SPPAM). Access to the SPPAM will be obtained by use of the NQUEUE and DQUEUE commands. The subsystem design specification is to incorporate all applicable design considerations from the ERSYS system design specification and the Level B requirements documents relating to the SPPAM. It is intended as a basis for the preliminary design review and will expand into the subsystem detailed design specification.

  2. Design of large Francis turbine using optimal methods

    NASA Astrophysics Data System (ADS)

    Flores, E.; Bornard, L.; Tomas, L.; Liu, J.; Couston, M.

    2012-11-01

    Among a high number of Francis turbine references all over the world, covering the whole market range of heads, Alstom has been especially involved in the development and equipment of the largest power plants in the world: Three Gorges (China, 32×767 MW, 61 to 113 m), Itaipu (Brazil, 20×750 MW, 98.7 to 127 m) and Xiangjiaba (China, 8×812 MW, 82.5 to 113.6 m, under erection). Many new projects are under study to equip new power plants with Francis turbines in order to answer an increasing demand for renewable energy. In this context, Alstom Hydro is carrying out many developments to answer those needs, especially for jumbo units such as the planned 1 GW units in China. The turbine design for such units requires specific care, using the state of the art in computation methods and the latest technologies in model testing, as well as the maximum feedback from operation of jumbo plants already in service. We present in this paper how a large Francis turbine can be designed using specific design methods, including global and local optimization methods. The spiral case, the tandem cascade profiles, the runner and the draft tube are designed with optimization loops involving a blade design tool, an automatic meshing software and a Navier-Stokes solver, piloted by a genetic algorithm. These automated optimization methods, presented in different papers over the last decade, are nowadays widely used thanks to the growing computation capacity of HPC clusters: the intensive use of such optimization methods at the turbine design stage makes it possible to reach very high levels of performance, while the hydraulic flow characteristics are carefully studied over the whole water passage to avoid any unexpected hydraulic phenomena.
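    The optimization loop piloted by a genetic algorithm can be caricatured in a few lines. The sketch below is a toy real-coded GA minimizing an invented stand-in objective, not Alstom's blade-design/meshing/solver chain; it shows only the select-crossover-mutate structure such a loop pilots.

    ```python
    import random

    def genetic_minimize(f, bounds, pop_size=30, generations=60, seed=1):
        """Minimal real-coded GA: tournament selection, blend crossover,
        Gaussian mutation with clipping to the variable bounds. `f` is
        the (in practice expensive, here toy) objective; `bounds` is a
        list of (lo, hi) per design variable."""
        rng = random.Random(seed)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
               for _ in range(pop_size)]
        for _ in range(generations):
            nxt = []
            for _ in range(pop_size):
                a = min(rng.sample(pop, 3), key=f)   # tournament parent 1
                b = min(rng.sample(pop, 3), key=f)   # tournament parent 2
                child = [rng.uniform(min(x, y), max(x, y))
                         for x, y in zip(a, b)]      # blend crossover
                child = [min(max(x + rng.gauss(0, 0.05 * (hi - lo)), lo), hi)
                         for x, (lo, hi) in zip(child, bounds)]  # mutate
                nxt.append(child)
            pop = nxt
        return min(pop, key=f)

    # Toy stand-in for a hydraulic-performance objective:
    best = genetic_minimize(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                            [(-5, 5), (-5, 5)])
    print(best)
    ```

    In the industrial setting, each `f` evaluation would be a mesh generation plus a Navier-Stokes solve, which is why the population sizes and generation counts are chosen with the HPC budget in mind.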

  3. Computational methods of robust controller design for aerodynamic flutter suppression

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1981-01-01

    The development of Riccati iteration, a tool for the design and analysis of linear control systems is examined. First, Riccati iteration is applied to the problem of pole placement and order reduction in two-time scale control systems. Order reduction, yielding a good approximation to the original system, is demonstrated using a 16th order linear model of a turbofan engine. Next, a numerical method for solving the Riccati equation is presented and demonstrated for a set of eighth order random examples. A literature review of robust controller design methods follows which includes a number of methods for reducing the trajectory and performance index sensitivity in linear regulators. Lastly, robust controller design for large parameter variations is discussed.

  4. Improved method for transonic airfoil design-by-optimization

    NASA Technical Reports Server (NTRS)

    Kennelly, R. A., Jr.

    1983-01-01

    An improved method for use of optimization techniques in transonic airfoil design is demonstrated. FLO6QNM incorporates a modified quasi-Newton optimization package, and is shown to be more reliable and efficient than the method developed previously at NASA-Ames, which used the COPES/CONMIN optimization program. The design codes are compared on a series of test cases with known solutions, and the effects of problem scaling, proximity of initial point to solution, and objective function precision are studied. In contrast to the older method, well-converged solutions are shown to be attainable in the context of engineering design using computational fluid dynamics tools, a new result. The improvements are due to better performance by the optimization routine and to the use of problem-adaptive finite difference step sizes for gradient evaluation.
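    The quasi-Newton idea, approximating curvature information from successive gradients rather than computing second derivatives, can be shown in one dimension. This is an illustration only; the routine, objective, and starting points below are hypothetical and unrelated to the FLO6QNM code itself.

    ```python
    def secant_minimize(df, x0, x1, tol=1e-10, max_iter=50):
        """1-D quasi-Newton sketch: a secant iteration on the objective's
        derivative `df`, so the second derivative is approximated from
        successive gradient values instead of being evaluated."""
        g0, g1 = df(x0), df(x1)
        for _ in range(max_iter):
            if abs(g1 - g0) < 1e-300:
                break
            x2 = x1 - g1 * (x1 - x0) / (g1 - g0)  # secant (approx. Newton) step
            x0, g0, x1, g1 = x1, g1, x2, df(x2)
            if abs(g1) < tol:
                break
        return x1

    # Toy "design objective": f(x) = (x - 2)^4, derivative 4*(x - 2)^3.
    xmin = secant_minimize(lambda x: 4 * (x - 2) ** 3, 0.0, 1.0)
    print(round(xmin, 6))
    ```

    The problem-adaptive finite difference step sizes mentioned above matter because `df` is itself computed by differencing a noisy CFD objective.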

  5. Design of an explosive detection system using Monte Carlo method.

    PubMed

    Hernández-Adame, Pablo Luis; Medina-Castro, Diego; Rodriguez-Ibarra, Johanna Lizbeth; Salas-Luevano, Miguel Angel; Vega-Carrillo, Hector Rene

    2016-11-01

    Regardless of the motivation, terrorism is the most important risk to national security in many countries. Attacks with explosives are the most common method used by terrorists. Therefore, several procedures to detect explosives are utilized; among these methods are the use of neutrons and photons. In this study, an explosive detection system using a (241)AmBe neutron source was designed with the Monte Carlo method. In the design, light water, paraffin, polyethylene, and graphite were used as moderators. The explosive RDX was used, and the gamma rays induced by neutron capture in the explosive were estimated using NaI(Tl) and HPGe detectors. When light water is used as the moderator and HPGe as the detector, the system has the best performance, allowing the explosive to be distinguished from urea. For the final design, the ambient dose equivalent for neutrons and photons was estimated along the radial and axial axes.

  6. Triparental Families: A New Genetic-Epidemiological Design Applied to Drug Abuse, Alcohol Use Disorders, and Criminal Behavior in a Swedish National Sample

    PubMed Central

    Kendler, Kenneth S.; Ohlsson, Henrik; Sundquist, Jan; Sundquist, Kristina

    2015-01-01

    Objective The authors sought to clarify the sources of parent-offspring resemblance for drug abuse, alcohol use disorders, and criminal behavior, using a novel genetic-epidemiological design. Method Using national registries, the authors identified rates of drug abuse, alcohol use disorders, and criminal behavior in 41,360 Swedish individuals born between 1960 and 1990 and raised in triparental families comprising a biological mother who reared them, a “not-lived-with” biological father, and a stepfather. Results When each syndrome was examined individually, hazard rates for drug abuse in offspring of parents with drug abuse were highest for mothers (2.80, 95% CI=2.23–3.38), intermediate for not-lived-with fathers (2.45, 95% CI=2.14–2.79), and lowest for stepfathers (1.99, 95% CI=1.55–2.56). The same pattern was seen for alcohol use disorders (2.23, 95% CI=1.93–2.58; 1.84, 95% CI=1.69–2.00; and 1.27, 95% CI=1.12–1.43) and criminal behavior (1.55, 95% CI=1.44–1.66; 1.46, 95% CI=1.40–1.52; and 1.30, 95% CI=1.23–1.37). When all three syndromes were examined together, specificity of cross-generational transmission was highest for mothers, intermediate for not-lived-with fathers, and lowest for stepfathers. Analyses of intact families and other not-lived-with parents and stepparents showed similar cross-generation transmission for these syndromes in mothers and fathers, supporting the representativeness of results from triparental families. Conclusions A major strength of the triparental design is its inclusion, within a single family, of parents who provide, to a first approximation, their offspring with genes plus rearing, genes only, and rearing only. For drug abuse, alcohol use disorders, and criminal behavior, the results of this study suggest that parent-offspring transmission involves both genetic and environmental processes, with genetic factors being somewhat more important. These results should be interpreted in the context of the strengths

  7. A method for the probabilistic design assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    A formal procedure for the probabilistic design assessment of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the assessment. The probabilistic assessment consists of design criteria, modeling of composite structures and uncertainties, simulation methods, and the decision making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically assessed with accuracy and efficiency.
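    The simulation component of such an assessment can be sketched with a toy Monte Carlo reliability check. The distributions and numbers below are invented and far simpler than the constituent-level uncertainties the paper propagates; the sketch only shows the sampling pattern.

    ```python
    import random

    def failure_probability(strength_mean, strength_cv, load_mean, load_cv,
                            n=200_000, seed=7):
        """Monte Carlo sketch of a probabilistic design check: sample an
        uncertain structural strength and an uncertain load (normally
        distributed here) and estimate P(load > strength)."""
        rng = random.Random(seed)
        fails = 0
        for _ in range(n):
            strength = rng.gauss(strength_mean, strength_cv * strength_mean)
            load = rng.gauss(load_mean, load_cv * load_mean)
            fails += load > strength
        return fails / n

    # Invented numbers: strength 100 +/- 10, load 60 +/- 12 (arbitrary units).
    print(failure_probability(100.0, 0.10, 60.0, 0.20))
    ```

    A design criterion then takes the form of a bound on this probability rather than a deterministic safety factor.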

  8. Inverse design of airfoils using a flexible membrane method

    NASA Astrophysics Data System (ADS)

    Thinsurat, Kamon

    The Modified Garabedian-McFadden (MGM) method is used to inversely design airfoils. A Finite Difference Method (FDM) for non-uniform grids was developed to discretize the MGM equation for numerical solution; it has the advantage that it can be used flexibly with an unstructured airfoil grid. The commercial software FLUENT is used as the flow solver. Several conditions are set in FLUENT, such as subsonic inviscid flow, subsonic viscous flow, transonic inviscid flow, and transonic viscous flow, to test the inverse design code under each condition. A moving-grid program is used to create a mesh for new airfoils prior to importing meshes into FLUENT for the analysis of flows. For validation, an iterative process is used so that the Cp distribution of the initial airfoil, the NACA0011, achieves the Cp distribution of the target airfoil, the NACA2315, for the subsonic inviscid case at M=0.2. Three other cases were carried out to validate the code. After the code validations, the inverse design method was used to design a shock-free airfoil in the transonic condition and to design a separation-free airfoil at a high angle of attack in the subsonic condition.
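    A finite-difference stencil for non-uniform grids of the kind mentioned above can be written down directly. The three-point formula below is a standard derivation, not code from the thesis: it differentiates at the middle of three unevenly spaced nodes and is exact for quadratics.

    ```python
    def dfdx_nonuniform(x0, x1, x2, f0, f1, f2):
        """First derivative at the middle node of three unevenly spaced
        points, the building block of a non-uniform-grid FDM such as the
        one used to discretize the MGM equation. Exact for quadratics."""
        h1, h2 = x1 - x0, x2 - x1
        return (-h2 / (h1 * (h1 + h2)) * f0
                + (h2 - h1) / (h1 * h2) * f1
                + h1 / (h2 * (h1 + h2)) * f2)

    # Check against f(x) = x^2 (so f'(0.3) = 0.6) on an uneven grid:
    x = (0.0, 0.3, 1.0)
    print(dfdx_nonuniform(*x, *(xi ** 2 for xi in x)))
    ```

    On a uniform grid (h1 = h2) the formula reduces to the familiar central difference.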

  9. An uncertain multidisciplinary design optimization method using interval convex models

    NASA Astrophysics Data System (ADS)

    Li, Fangyi; Luo, Zhen; Sun, Guangyong; Zhang, Nong

    2013-06-01

    This article proposes an uncertain multi-objective multidisciplinary design optimization methodology, which employs the interval model to represent the uncertainties of uncertain-but-bounded parameters. The interval number programming method is applied to transform each uncertain objective function into two deterministic objective functions, and a satisfaction degree of intervals is used to convert both the uncertain inequality and equality constraints to deterministic inequality constraints. In doing so, an unconstrained deterministic optimization problem will be constructed in association with the penalty function method. The design will be finally formulated as a nested three-loop optimization, a class of highly challenging problems in the area of engineering design optimization. An advanced hierarchical optimization scheme is developed to solve the proposed optimization problem based on the multidisciplinary feasible strategy, which is a well-studied method able to reduce the dimensions of multidisciplinary design optimization problems by using the design variables as independent optimization variables. In the hierarchical optimization system, the non-dominated sorting genetic algorithm II, sequential quadratic programming method and Gauss-Seidel iterative approach are applied to the outer, middle and inner loops of the optimization problem, respectively. Typical numerical examples are used to demonstrate the effectiveness of the proposed methodology.
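    The interval-number-programming step, turning one uncertain objective into two deterministic ones, can be sketched as follows. The midpoint/radius split below is a simplified illustration with invented values, not the article's full formulation with satisfaction degrees and penalty functions.

    ```python
    def interval_objective(f_samples):
        """Interval number programming sketch: an uncertain objective,
        evaluated over the uncertain-but-bounded parameters, is replaced
        by two deterministic objectives, the interval midpoint and the
        interval radius (half-width). Here the interval is simply the
        min/max over sampled parameter values."""
        lo, hi = min(f_samples), max(f_samples)
        return (lo + hi) / 2.0, (hi - lo) / 2.0

    # Hypothetical objective values at corners of an uncertain-parameter box:
    mid, rad = interval_objective([3.2, 4.8, 3.9, 4.1])
    print(mid, rad)
    ```

    Minimizing the midpoint targets nominal performance while minimizing the radius targets robustness, which is how one uncertain objective becomes two deterministic ones.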

  10. Evaluation of method for secondary DNA typing of Mycobacterium tuberculosis with pTBN12 in epidemiologic study of tuberculosis.

    PubMed

    Yang, Z; Chaves, F; Barnes, P F; Burman, W J; Koehler, J; Eisenach, K D; Bates, J H; Cave, M D

    1996-12-01

    Secondary fingerprinting of Mycobacterium tuberculosis DNA with a probe containing the polymorphic GC-rich repetitive sequence present in pTBN12 has been found to have greater discriminating power than does fingerprinting with the insertion sequence IS6110 for strains carrying few copies of IS6110. To validate the use of pTBN12 fingerprinting in the molecular epidemiology of tuberculosis, M. tuberculosis isolates from 67 patients in five states in the United States and in Spain were fingerprinted with both IS6110 and pTBN12. Epidemiologic links among the 67 patients were evaluated by patient interview and/or review of medical records. The 67 isolates had 5 IS6110 fingerprint patterns with two to five copies of IS6110 and 18 pTBN12 patterns, of which 10 were shared by more than 1 isolate. Epidemiologic links were consistently found among patients whose isolates had identical pTBN12 patterns, whereas no links were found among patients whose isolates had unique pTBN12 patterns. This suggests that pTBN12 fingerprinting is a useful tool to identify epidemiologically linked tuberculosis patients whose isolates have identical IS6110 fingerprints containing fewer than six fragments.
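    Discriminating power of a typing method is commonly quantified with the Simpson-based (Hunter-Gaston) index. The sketch below uses invented pattern counts, not the study's isolates, to show the calculation.

    ```python
    def discriminatory_index(type_counts):
        """Hunter-Gaston discriminatory index for a typing method: the
        probability that two isolates drawn at random from the sample
        fall into different fingerprint patterns. `type_counts` lists
        how many isolates share each distinct pattern."""
        n = sum(type_counts)
        return 1.0 - sum(c * (c - 1) for c in type_counts) / (n * (n - 1))

    # Invented example: 10 isolates; patterns shared by 4, 3, 1, 1, 1 isolates.
    print(discriminatory_index([4, 3, 1, 1, 1]))
    ```

    A method that splits a low-copy IS6110 cluster into more pTBN12 patterns raises this index, which is the sense in which the secondary probe adds discriminating power.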

  11. Prevalence and Epidemiologic Characteristics of FASD From Various Research Methods with an Emphasis on Recent In-School Studies

    ERIC Educational Resources Information Center

    May, Philip A.; Gossage, J. Phillip; Kalberg, Wendy O.; Robinson, Luther K.; Buckley, David; Manning, Melanie; Hoyme, H. Eugene

    2009-01-01

    Researching the epidemiology and estimating the prevalence of fetal alcohol syndrome (FAS) and other fetal alcohol spectrum disorders (FASD) for mainstream populations anywhere in the world has presented a challenge to researchers. Three major approaches have been used in the past: surveillance and record review systems, clinic-based studies, and…

  13. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools.

    PubMed

    Harder, Thomas; Takla, Anja; Rehfuess, Eva; Sánchez-Vivar, Alex; Matysiak-Klose, Dorothea; Eckmanns, Tim; Krause, Gérard; de Carvalho Gomes, Helena; Jansen, Andreas; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Schünemann, Holger; Zuiderent-Jerak, Teun; Wichmann, Ole

    2014-05-21

    The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains: characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed matching questions to study designs suitable to address them and to the respective QATs. Key features of each of the included QATs were then analyzed, in particular with respect to intended use, types of questions and answers, presence or absence of a quality score, and whether a validation was performed. In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. The included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. 
The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision-making in infectious disease epidemiology

  14. Breaking from binaries - using a sequential mixed methods design.

    PubMed

    Larkin, Patricia Mary; Begley, Cecily Marion; Devane, Declan

    2014-03-01

    To outline the traditional worldviews of healthcare research and discuss the benefits and challenges of using mixed methods approaches in contributing to the development of nursing and midwifery knowledge. There has been much debate about the contribution of mixed methods research to nursing and midwifery knowledge in recent years. A sequential exploratory design is used as an exemplar of a mixed methods approach. The study discussed used a combination of focus-group interviews and a quantitative instrument to obtain a fuller understanding of women's experiences of childbirth. In the mixed methods study example, qualitative data were analysed using thematic analysis and quantitative data using regression analysis. Polarised debates about the veracity, philosophical integrity and motivation for conducting mixed methods research have largely abated. A mixed methods approach can contribute to a deeper, more contextual understanding of a variety of subjects and experiences; as a result, it furthers knowledge that can be used in clinical practice. The purpose of the research study should be the main instigator when choosing from an array of mixed methods research designs. Mixed methods research offers a variety of models that can augment investigative capabilities and provide richer data than can a discrete method alone. This paper offers an example of an exploratory, sequential approach to investigating women's childbirth experiences. A clear framework for the conduct and integration of the different phases of the mixed methods research process is provided. This approach can be used by practitioners and policy makers to improve practice.

  15. Methods and processes of developing the strengthening the reporting of observational studies in epidemiology - veterinary (STROBE-Vet) statement.

    PubMed

    Sargeant, J M; O'Connor, A M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P

    2016-11-01

    The reporting of observational studies in veterinary research presents many challenges that often are not adequately addressed in published reporting guidelines. To develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. A consensus meeting of experts was organized to develop an extension of the STROBE statement to address observational studies in veterinary medicine with respect to animal health, animal production, animal welfare, and food safety outcomes. Consensus meeting May 11-13, 2014 in Mississauga, Ontario, Canada. Seventeen experts from North America, Europe, and Australia attended the meeting. The experts were epidemiologists and biostatisticians, many of whom hold or have held editorial positions with relevant journals. Prior to the meeting, 19 experts completed a survey about whether they felt any of the 22 items of the STROBE statement should be modified and if items should be added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. At the meeting, the participants were provided with the survey responses and relevant literature concerning the reporting of veterinary observational studies. During the meeting, each STROBE item was discussed to determine whether or not re-wording was recommended, and whether additions were warranted. Anonymous voting was used to determine whether there was consensus for each item change or addition. The consensus was that six items needed no modifications or additions. Modifications or additions were made to the STROBE items numbered: 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources/measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14

  16. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Phase 1

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley Multidisciplinary Design Optimization (MDO) method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. This report documents all computational experiments conducted in Phase I of the study. This report is a companion to the paper titled Initial Results of an MDO Method Evaluation Study by N. M. Alexandrov and S. Kodiyalam (AIAA-98-4884).

  17. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

The primary objective of the three year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the areas of design decision-making, probabilistic modeling, and optimization. The specific focus of the three year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In particular, in the final year of research, the focus was on comparing and contrasting the three methods researched. Specifically, these three are the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regard to identification of research needs in the baseline method through implementation exercises. The end result of Task B was the documentation of the evolution of the method with time and a technology transfer to the sponsor regarding the method, such that an initial capability for execution could be obtained by the sponsor. Specifically, the results of year 3 efforts were the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and

  18. Association Between Cannabis and Psychosis: Epidemiologic Evidence.

    PubMed

    Gage, Suzanne H; Hickman, Matthew; Zammit, Stanley

    2016-04-01

    Associations between cannabis use and psychotic outcomes are consistently reported, but establishing causality from observational designs can be problematic. We review the evidence from longitudinal studies that have examined this relationship and discuss the epidemiologic evidence for and against interpreting the findings as causal. We also review the evidence identifying groups at particularly high risk of developing psychosis from using cannabis. Overall, evidence from epidemiologic studies provides strong enough evidence to warrant a public health message that cannabis use can increase the risk of psychotic disorders. However, further studies are required to determine the magnitude of this effect, to determine the effect of different strains of cannabis on risk, and to identify high-risk groups particularly susceptible to the effects of cannabis on psychosis. We also discuss complementary epidemiologic methods that can help address these questions.

  19. Review of SMS design methods and real-world applications

    NASA Astrophysics Data System (ADS)

    Dross, Oliver; Mohedano, Ruben; Benitez, Pablo; Minano, Juan Carlos; Chaves, Julio; Blen, Jose; Hernandez, Maikel; Munoz, Fernando

    2004-09-01

    The Simultaneous Multiple Surfaces design method (SMS), proprietary technology of Light Prescription Innovators (LPI), was developed in the early 1990's as a two dimensional method. The first embodiments had either linear or rotational symmetry and found applications in photovoltaic concentrators, illumination optics and optical communications. SMS designed devices perform close to the thermodynamic limit and are compact and simple; features that are especially beneficial in applications with today's high brightness LEDs. The method was extended to 3D "free form" geometries in 1999 that perfectly couple two incoming with two outgoing wavefronts. SMS 3D controls the light emitted by an extended light source much better than single free form surface designs, while reaching very high efficiencies. This has enabled the SMS method to be applied to automotive head lamps, one of the toughest lighting tasks in any application, where high efficiency and small size are required. This article will briefly review the characteristics of both the 2D and 3D methods and will present novel optical solutions that have been developed and manufactured to meet real world problems. These include various ultra compact LED collimators, solar concentrators and highly efficient LED low and high beam headlamp designs.

  20. A PDE Sensitivity Equation Method for Optimal Aerodynamic Design

    NASA Technical Reports Server (NTRS)

    Borggaard, Jeff; Burns, John

    1996-01-01

The use of gradient based optimization algorithms in inverse design is well established as a practical approach to aerodynamic design. A typical procedure uses a simulation scheme to evaluate the objective function (from the approximate states) and its gradient, then passes this information to an optimization algorithm. Once the simulation scheme (CFD flow solver) has been selected and used to provide approximate function evaluations, there are several possible approaches to the problem of computing gradients. One popular method is to differentiate the simulation scheme and compute design sensitivities that are then used to obtain gradients. Although this black-box approach has many advantages in shape optimization problems, one must compute mesh sensitivities in order to compute the design sensitivity. In this paper, we present an alternative approach using the PDE sensitivity equation to develop algorithms for computing gradients. This approach has the advantage that mesh sensitivities need not be computed. Moreover, when it is possible to use the CFD scheme for both the forward problem and the sensitivity equation, there are computational advantages. An apparent disadvantage of this approach is that it does not always produce consistent derivatives. However, for a proper combination of discretization schemes, one can show asymptotic consistency under mesh refinement, which is often sufficient to guarantee convergence of the optimal design algorithm. In particular, we show that when asymptotically consistent schemes are combined with a trust-region optimization algorithm, the resulting optimal design method converges. We denote this approach the sensitivity equation method. The method is presented, convergence results are given, and the approach is illustrated on two optimal design problems involving shocks.
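
    The core idea, solving a sensitivity equation alongside the state equation with the same scheme, can be illustrated on a toy ODE rather than a CFD solver. The sketch below is illustrative only (the model, parameter values, and step count are assumptions, not from the paper): for dy/dt = -p*y, the sensitivity s = dy/dp satisfies its own ODE, ds/dt = -p*s - y, integrated together with the state.

    ```python
    import numpy as np

    # Toy illustration of the sensitivity-equation idea: augment the state
    # with its parameter sensitivity and integrate both with one RK4 scheme.
    def rk4_step(f, u, t, h):
        k1 = f(t, u)
        k2 = f(t + h / 2, u + h / 2 * k1)
        k3 = f(t + h / 2, u + h / 2 * k2)
        k4 = f(t + h, u + h * k3)
        return u + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    p, y0, T, n = 1.5, 2.0, 1.0, 200  # hypothetical parameter and setup
    h = T / n

    def rhs(t, u):
        y, s = u
        # State equation dy/dt = -p*y and its sensitivity equation
        # ds/dt = d/dp(-p*y) = -p*s - y, solved simultaneously.
        return np.array([-p * y, -p * s - y])

    u = np.array([y0, 0.0])  # s(0) = 0 since y(0) does not depend on p
    for i in range(n):
        u = rk4_step(rhs, u, i * h, h)

    y_T, s_T = u
    exact_s = -T * y0 * np.exp(-p * T)  # analytic dy/dp at t = T
    print(y_T, s_T, exact_s)
    ```

    Because the sensitivity ODE reuses the forward integrator, no differentiation of the solver itself (or of the mesh) is needed, which is the computational advantage the abstract describes.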

  1. A national cross-sectional study among drug-users in France: epidemiology of HCV and highlight on practical and statistical aspects of the design

    PubMed Central

    2009-01-01

Background Epidemiology of HCV infection among drug users (DUs) has been widely studied. Prevalence and sociobehavioural data among DUs are therefore available in most countries, but no study has taken into account in the sampling weights one important aspect of the way of life of DUs, namely that they can use one or more specialized services during the study period. In 2004–2005, we conducted a national seroepidemiologic survey of DUs, based on a random sampling design using the Generalised Weight Share Method (GWSM) and on blood testing. Methods A cross-sectional multicenter survey was done among DUs who had injected or snorted drugs at least once in their life. We conducted a two-stage random survey of DUs selected to represent the diversity of drug use. The fact that DUs can use more than one structure during the study period has an impact on their inclusion probabilities. To calculate a correct sampling weight, we used the GWSM. A sociobehavioural questionnaire was administered by interviewers. Selected DUs were asked to self-collect a fingerprick blood sample on blotting paper. Results Of all DUs selected, 1462 (75%) agreed to participate. HCV seroprevalence was 59.8% [95% CI: 50.7–68.3]. Of DUs under 30 years, 28% were HCV seropositive. Of HCV-infected DUs, 27% were unaware of their status. In the month prior to interview, 13% of DUs had shared a syringe, 38% other injection paraphernalia, and 81% a crack pipe. In multivariate analysis, factors independently associated with HCV seropositivity were age over 30, HIV seropositivity, having ever injected drugs, opiate substitution treatment (OST), crack use, and precarious housing. Conclusion This is the first time that blood testing combined with GWSM has been applied to a DU population, which improves the estimate of HCV prevalence. HCV seroprevalence is high, even among the youngest DUs, and a large proportion of DUs are not aware of their status. Our multivariate analysis identifies risk factors such as crack
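
    The weighting problem the abstract describes, a drug user reachable through several services has several routes into the sample, can be sketched in a few lines. The sketch below uses hypothetical service weights and link lists (not study data) and the simplest GWSM allocation, equal sharing across a user's links.

    ```python
    # GWSM sketch: naive service-based weighting over-represents users of many
    # services, so each sampled service's design weight is shared across all
    # of a selected user's links. All numbers are hypothetical.

    # Design weight (inverse inclusion probability) of each sampled service.
    service_weight = {"A": 4.0, "B": 2.5, "C": 5.0}

    # Services each selected user attends during the study period (links).
    links = {
        "u1": ["A"],            # one route into the sample
        "u2": ["A", "B"],       # two routes: weight shared across both links
        "u3": ["A", "B", "C"],  # three routes
    }

    def gwsm_weight(user):
        # Equal sharing: each link contributes its service weight divided by
        # the user's total number of links, so shares sum to one per user.
        n_links = len(links[user])
        return sum(service_weight[s] for s in links[user]) / n_links

    for u in links:
        print(u, gwsm_weight(u))
    ```

    A user attending many services accumulates more links but each link carries a smaller share, so the heavy service user is no longer over-counted in prevalence estimates.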

  2. Taguchi method of experimental design in materials education

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.

    1993-01-01

    Some of the advantages and disadvantages of the Taguchi Method of experimental design as applied to Materials Science will be discussed. This is a fractional factorial method that employs the minimum number of experimental trials for the information obtained. The analysis is also very simple to use and teach, which is quite advantageous in the classroom. In addition, the Taguchi loss function can be easily incorporated to emphasize that improvements in reproducibility are often at least as important as optimization of the response. The disadvantages of the Taguchi Method include the fact that factor interactions are normally not accounted for, there are zero degrees of freedom if all of the possible factors are used, and randomization is normally not used to prevent environmental biasing. In spite of these disadvantages it is felt that the Taguchi Method is extremely useful for both teaching experimental design and as a research tool, as will be shown with a number of brief examples.
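
    The fractional-factorial economy the abstract mentions can be made concrete with a small numeric sketch. The orthogonal array, response values, and "larger is better" signal-to-noise ratio below are illustrative assumptions, not data from the paper.

    ```python
    import numpy as np

    # L4 orthogonal array: 4 trials cover three two-level factors (A, B, C)
    # instead of the 2**3 = 8 trials a full factorial would need.
    L4 = np.array([
        [0, 0, 0],
        [0, 1, 1],
        [1, 0, 1],
        [1, 1, 0],
    ])

    # Hypothetical response measurements, two replicates per trial.
    y = np.array([
        [12.1, 11.8],
        [14.0, 14.4],
        [13.2, 12.9],
        [15.8, 16.1],
    ])

    # Taguchi "larger is better" signal-to-noise ratio per trial, which folds
    # reproducibility into the criterion: SN = -10 * log10(mean(1 / y**2)).
    sn = -10.0 * np.log10(np.mean(1.0 / y**2, axis=1))

    # Main effect of each factor: mean S/N at level 1 minus mean at level 0.
    for j, name in enumerate("ABC"):
        effect = sn[L4[:, j] == 1].mean() - sn[L4[:, j] == 0].mean()
        print(f"factor {name}: effect on S/N = {effect:+.3f} dB")
    ```

    The orthogonality of the array is what lets each factor's effect be estimated from simple level means, which is also why factor interactions are confounded, the main disadvantage noted above.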

  4. The Next PAGE in Understanding Complex Traits: Design for the Analysis of Population Architecture Using Genetics and Epidemiology (PAGE) Study

    PubMed Central

    Matise, Tara C.; Ambite, Jose Luis; Buyske, Steven; Carlson, Christopher S.; Cole, Shelley A.; Crawford, Dana C.; Haiman, Christopher A.; Heiss, Gerardo; Kooperberg, Charles; Marchand, Loic Le; Manolio, Teri A.; North, Kari E.; Peters, Ulrike; Ritchie, Marylyn D.; Hindorff, Lucia A.; Haines, Jonathan L.

    2011-01-01

    Genetic studies have identified thousands of variants associated with complex traits. However, most association studies are limited to populations of European descent and a single phenotype. The Population Architecture using Genomics and Epidemiology (PAGE) Study was initiated in 2008 by the National Human Genome Research Institute to investigate the epidemiologic architecture of well-replicated genetic variants associated with complex diseases in several large, ethnically diverse population-based studies. Combining DNA samples and hundreds of phenotypes from multiple cohorts, PAGE is well-suited to address generalization of associations and variability of effects in diverse populations; identify genetic and environmental modifiers; evaluate disease subtypes, intermediate phenotypes, and biomarkers; and investigate associations with novel phenotypes. PAGE investigators harmonize phenotypes across studies where possible and perform coordinated cohort-specific analyses and meta-analyses. PAGE researchers are genotyping thousands of genetic variants in up to 121,000 DNA samples from African-American, white, Hispanic/Latino, Asian/Pacific Islander, and American Indian participants. Initial analyses will focus on single nucleotide polymorphisms (SNPs) associated with obesity, lipids, cardiovascular disease, type 2 diabetes, inflammation, various cancers, and related biomarkers. PAGE SNPs are also assessed for pleiotropy using the “phenome-wide association study” approach, testing each SNP for associations with hundreds of phenotypes. PAGE data will be deposited into the National Center for Biotechnology Information's Database of Genotypes and Phenotypes and made available via a custom browser. PMID:21836165

  5. Molecular library design using multi-objective optimization methods.

    PubMed

    Nicolaou, Christos A; Kannas, Christos C

    2011-01-01

Advancements in combinatorial chemistry and high-throughput screening technology have enabled the synthesis and screening of large molecular libraries for the purposes of drug discovery. Contrary to initial expectations, the increase in screening library size, typically combined with an emphasis on compound structural diversity, did not result in a comparable increase in the number of promising hits found. In an effort to improve the likelihood of discovering hits with greater optimization potential, more recent approaches attempt to incorporate additional knowledge into the library design process to effectively guide the search. Multi-objective optimization methods capable of taking into account several chemical and biological criteria have been used to design collections of compounds satisfying multiple pharmaceutically relevant objectives simultaneously. In this chapter, we present our efforts to implement a multi-objective optimization method, MEGALib, custom-designed for the library design problem. The method exploits existing knowledge, e.g. from previous biological screening experiments, to identify and profile molecular fragments used subsequently to design compounds that balance the various objectives.
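
    The selection step at the heart of such methods is a non-dominated (Pareto) filter over candidate compounds scored on several objectives. The sketch below is a generic illustration with hypothetical scores, not MEGALib's actual algorithm.

    ```python
    # Minimal Pareto filter for a compound library where each candidate is
    # scored on several objectives, all to be minimized (e.g. predicted
    # toxicity and synthesis cost). A candidate survives unless some other
    # candidate is at least as good on every objective and strictly better
    # on at least one.
    def pareto_front(scores):
        """Return indices of non-dominated rows; scores is a list of tuples."""
        front = []
        for i, a in enumerate(scores):
            dominated = any(
                all(b[k] <= a[k] for k in range(len(a))) and
                any(b[k] < a[k] for k in range(len(a)))
                for j, b in enumerate(scores) if j != i
            )
            if not dominated:
                front.append(i)
        return front

    # Hypothetical (toxicity, cost) scores for five candidate compounds.
    candidates = [(0.2, 5.0), (0.1, 7.0), (0.3, 4.0), (0.25, 6.0), (0.1, 4.5)]
    print(pareto_front(candidates))  # -> [2, 4]
    ```

    Returning the whole front, rather than a single weighted-sum winner, preserves the different trade-offs among objectives for a chemist to choose between, which is the practical appeal of multi-objective library design.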

  6. Function combined method for design innovation of children's bike

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoli; Qiu, Tingting; Chen, Huijuan

    2013-03-01

As children mature, bike products for children develop at the same time, and models are frequently updated. Certain problems occur in use, such as overlapping use cycles, repeated functions, and short life cycles, which run counter to the principles of energy conservation and the environmentally protective, intensive design concept. In this paper, a rational multi-function design method based on functional superposition, transformation, and technical implementation is proposed. An organic combination of a frog-style scooter and a children's tricycle is developed using the multi-function method. From an ergonomic perspective, the paper elaborates on the body sizes of children aged 5 to 12 and extracts data for a multi-function children's bike that can be used for both gliding and riding. By inverting the body, parts can be interchanged between the handles and the pedals of the bike. Finally, the paper provides a detailed analysis of the components, structural design, body material, and processing technology of the bike. The study of industrial product innovation design provides an effective design method to solve the bicycle problems, extend product function, improve the product's market situation, and enhance energy saving while implementing intensive product development effectively.

  7. A Simple Method for High-Lift Propeller Conceptual Design

    NASA Technical Reports Server (NTRS)

    Patterson, Michael; Borer, Nick; German, Brian

    2016-01-01

    In this paper, we present a simple method for designing propellers that are placed upstream of the leading edge of a wing in order to augment lift. Because the primary purpose of these "high-lift propellers" is to increase lift rather than produce thrust, these props are best viewed as a form of high-lift device; consequently, they should be designed differently than traditional propellers. We present a theory that describes how these props can be designed to provide a relatively uniform axial velocity increase, which is hypothesized to be advantageous for lift augmentation based on a literature survey. Computational modeling indicates that such propellers can generate the same average induced axial velocity while consuming less power and producing less thrust than conventional propeller designs. For an example problem based on specifications for NASA's Scalable Convergent Electric Propulsion Technology and Operations Research (SCEPTOR) flight demonstrator, a propeller designed with the new method requires approximately 15% less power and produces approximately 11% less thrust than one designed for minimum induced loss. Higher-order modeling and/or wind tunnel testing are needed to verify the predicted performance.
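
    The relationship between thrust, induced axial velocity, and power that underlies the comparison above follows from classical actuator-disk momentum theory. The sketch below is a back-of-the-envelope illustration with made-up numbers (not SCEPTOR values and not the paper's design method): given thrust T, freestream V, and disk area A, the induced velocity v satisfies T = 2*rho*A*v*(V + v).

    ```python
    import math

    # Actuator-disk (momentum theory) sketch for a propeller in a freestream.
    def induced_velocity(T, V, A, rho=1.225):
        # Solve 2*rho*A*v**2 + 2*rho*A*V*v - T = 0 for the positive root.
        a, b, c = 2 * rho * A, 2 * rho * A * V, -T
        return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

    T, V, A = 200.0, 30.0, 0.5   # N, m/s, m^2 -- purely illustrative values
    v = induced_velocity(T, V, A)
    P_induced = T * (V + v)      # ideal power delivered to the flow
    print(round(v, 3), round(P_induced, 1))
    ```

    Momentum theory only bounds the ideal power; the paper's point is that shaping the radial distribution of induced velocity (uniform rather than minimum-induced-loss) changes how much thrust and power accompany a given average velocity increase over the wing.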

  8. System Synthesis in Preliminary Aircraft Design using Statistical Methods

    NASA Technical Reports Server (NTRS)

    DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.

    1996-01-01

This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to search the design space for optimum configurations more efficiently. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, that are more sophisticated than those generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it yields a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions that are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and to provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
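
    The DOE/RSM step, replacing an expensive analysis with a cheap regression polynomial fitted over a designed set of runs, can be sketched generically. The grid design, the "expensive analysis" function, and all coefficients below are assumptions for illustration, not the paper's aero-propulsion model.

    ```python
    import numpy as np

    # Fit a quadratic response surface
    #   y ~ b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1**2 + b5*x2**2
    # from a small designed experiment, then use the polynomial surrogate
    # in place of the expensive analysis during optimization.
    rng = np.random.default_rng(0)

    # 3x3 grid in two coded design variables (a simple full-factorial DOE).
    x1, x2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
    x1, x2 = x1.ravel(), x2.ravel()

    # Stand-in for the expensive analysis: a smooth function plus noise.
    y = (5.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x1 * x2 - 3.0 * x1**2
         + rng.normal(0.0, 0.01, x1.size))

    # Build the regression design matrix and solve by least squares.
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    def surrogate(a, b):
        # Cheap polynomial evaluation, usable thousands of times per second.
        return beta @ np.array([1.0, a, b, a * b, a**2, b**2])

    print(np.round(beta, 2))  # recovered regression coefficients
    ```

    Because the surrogate is a closed-form polynomial, constrained optimization over it (the paper's second use of RSM) and Monte Carlo sampling for probabilistic design (the third use) both become inexpensive.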

  9. An interdisciplinary heuristic evaluation method for universal building design.

    PubMed

    Afacan, Yasemin; Erbug, Cigdem

    2009-07-01

    This study highlights how heuristic evaluation as a usability evaluation method can feed into current building design practice to conform to universal design principles. It provides a definition of universal usability that is applicable to an architectural design context. It takes the seven universal design principles as a set of heuristics and applies an iterative sequence of heuristic evaluation in a shopping mall, aiming to achieve a cost-effective evaluation process. The evaluation was composed of three consecutive sessions. First, five evaluators from different professions were interviewed regarding the construction drawings in terms of universal design principles. Then, each evaluator was asked to perform the predefined task scenarios. In subsequent interviews, the evaluators were asked to re-analyze the construction drawings. The results showed that heuristic evaluation could successfully integrate universal usability into current building design practice in two ways: (i) it promoted an iterative evaluation process combined with multi-sessions rather than relying on one evaluator and on one evaluation session to find the maximum number of usability problems, and (ii) it highlighted the necessity of an interdisciplinary ad hoc committee regarding the heuristic abilities of each profession. A multi-session and interdisciplinary heuristic evaluation method can save both the project budget and the required time, while ensuring a reduced error rate for the universal usage of the built environments.

  10. Rationale and design of the PREFERS (Preserved and Reduced Ejection Fraction Epidemiological Regional Study) Stockholm heart failure study: an epidemiological regional study in Stockholm county of 2.1 million inhabitants.

    PubMed

    Linde, Cecilia; Eriksson, Maria J; Hage, Camilla; Wallén, Håkan; Persson, Bengt; Corbascio, Matthias; Lundeberg, Joakim; Maret, Eva; Ugander, Martin; Persson, Hans

    2016-10-01

    Heart failure (HF) with preserved (HFpEF) or reduced (HFrEF) ejection fraction is associated with poor prognosis and quality of life. While the incidence of HFrEF is declining and HF treatment is effective, HFpEF is increasing, with no established therapy. PREFERS Stockholm is an epidemiological study with the aim of improving clinical care and research in HF and to find new targets for drug treatment in HFpEF (https://internwebben.ki.se/sites/default/files/20150605_4d_research_appendix_final.pdf). Patients with new-onset HF (n = 2000) will be characterized at baseline and after 1-year follow-up by standardized protocols for clinical evaluation, echocardiography, and ECG. In one subset undergoing elective coronary bypass surgery (n = 100) and classified according to LV function, myocardial biopsies will be collected during surgery, and cardiac magnetic resonance (CMR) imaging will be performed at baseline and after 1 year. Blood and tissue samples will be stored in a biobank. We will characterize and compare new-onset HFpEF and HFrEF patients regarding clinical findings and cardiac imaging, genomics, proteomics, and transcriptomics from blood and cardiac biopsies, and by established biomarkers of fibrosis, inflammation, haemodynamics, haemostasis, and thrombosis. The data will be explored by state-of-the-art bioinformatics methods to investigate gene expression patterns, sequence variation, DNA methylation, and post-translational modifications, and using systems biology approaches including pathway and network analysis. In this epidemiological HF study with biopsy studies in a subset of patients, we aim to identify new biomarkers of disease progression and to find pathophysiological mechanisms to support explorations of new treatment regimens for HFpEF. © 2016 The Authors. European Journal of Heart Failure © 2016 European Society of Cardiology.

  11. The C8 Health Project: Design, Methods, and Participants

    PubMed Central

    Frisbee, Stephanie J.; Brooks, A. Paul; Maher, Arthur; Flensborg, Patsy; Arnold, Susan; Fletcher, Tony; Steenland, Kyle; Shankar, Anoop; Knox, Sarah S.; Pollard, Cecil; Halverson, Joel A.; Vieira, Verónica M.; Jin, Chuanfang; Leyden, Kevin M.; Ducatman, Alan M.

    2009-01-01

    Background The C8 Health Project was created, authorized, and funded as part of the settlement agreement reached in the case of Jack W. Leach, et al. v. E.I. du Pont de Nemours & Company (no. 01-C-608 W.Va., Wood County Circuit Court, filed 10 April 2002). The settlement stemmed from the perfluorooctanoic acid (PFOA, or C8) contamination of drinking water in six water districts in two states near the DuPont Washington Works facility near Parkersburg, West Virginia. Objectives This study reports on the methods and results from the C8 Health Project, a population study created to gather data that would allow class members to know their own PFOA levels and permit subsequent epidemiologic investigations. Methods Final study participation was 69,030, enrolled over a 13-month period in 2005–2006. Extensive data were collected, including demographic data, medical diagnoses (both self-report and medical records review), clinical laboratory testing, and determination of serum concentrations of 10 perfluorocarbons (PFCs). Here we describe the processes used to collect, validate, and store these health data. We also describe survey participants and their serum PFC levels. Results The population geometric mean for serum PFOA was 32.91 ng/mL, 500% higher than previously reported for a representative American population. Serum concentrations for perfluorohexane sulfonate and perfluorononanoic acid were elevated 39% and 73% respectively, whereas perfluorooctanesulfonate was present at levels similar to those in the U.S. population. Conclusions This largest known population study of community PFC exposure permits new evaluations of associations between PFOA, in particular, and a range of health parameters. These will contribute to understanding of the biology of PFC exposure. The C8 Health Project also represents an unprecedented effort to gather basic data on an exposed population; its achievements and limitations can inform future legal settlements for populations exposed to

  12. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77 De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68 Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
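
    The D-optimal criterion mentioned above, choosing sampling times to maximize the determinant of the Fisher information matrix, can be shown on a small model. The sketch below uses an assumed quadratic regression model and a brute-force search over candidate time triples; it is not the paper's Prohorov metric framework or SE-optimal criterion.

    ```python
    import numpy as np
    from itertools import combinations

    # D-optimal design sketch: maximize det(F), where F = sum_t f(t) f(t)^T
    # is the Fisher information matrix of the regression model
    # y(t) = p0 + p1*t + p2*t**2 with unit observation noise (assumed model).
    def fisher(times):
        J = np.array([[1.0, t, t**2] for t in times])  # sensitivity rows f(t)
        return J.T @ J

    candidates = np.linspace(0.0, 1.0, 11)  # admissible sampling times
    best = max(combinations(candidates, 3),
               key=lambda ts: np.linalg.det(fisher(ts)))
    print(best)  # D-optimal three-point design on [0, 1]
    ```

    For this model the search lands on the endpoints plus the midpoint, the classical D-optimal answer for a quadratic on an interval; E-optimal designs would instead maximize the smallest eigenvalue of F, and the paper's SE-optimal criterion targets the resulting parameter standard errors directly.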

  13. New Methods and Transducer Designs for Ultrasonic Diagnostics and Therapy

    NASA Astrophysics Data System (ADS)

    Rybyanets, A. N.; Naumenko, A. A.; Sapozhnikov, O. A.; Khokhlova, V. A.

    Recent advances in the field of physical acoustics, imaging technologies, piezoelectric materials, and ultrasonic transducer design have led to emerging of novel methods and apparatus for ultrasonic diagnostics, therapy and body aesthetics. The paper presents the results on development and experimental study of different high intensity focused ultrasound (HIFU) transducers. Technological peculiarities of the HIFU transducer design as well as theoretical and numerical models of such transducers and the corresponding HIFU fields are discussed. Several HIFU transducers of different design have been fabricated using different advanced piezoelectric materials. Acoustic field measurements for those transducers have been performed using a calibrated fiber optic hydrophone and an ultrasonic measurement system (UMS). The results of ex vivo experiments with different tissues as well as in vivo experiments with blood vessels are presented that prove the efficacy, safety and selectivity of the developed HIFU transducers and methods.

  14. New displacement-based methods for optimal truss topology design

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin P.; Ben-Tal, Aharon; Haftka, Raphael T.

    1991-01-01

    Two alternate methods for maximum stiffness truss topology design are presented. The ground structure approach is used, and the problem is formulated in terms of displacements and bar areas. This large, nonconvex optimization problem can be solved by a simultaneous analysis and design approach. Alternatively, an equivalent, unconstrained, and convex problem in the displacements only can be formulated, and this problem can be solved by a nonsmooth, steepest descent algorithm. In both methods, the explicit solving of the equilibrium equations and the assembly of the global stiffness matrix are circumvented. A large number of examples have been studied, showing the attractive features of topology design as well as exposing interesting features of optimal topologies.

  15. Multi-objective optimization methods in drug design.

    PubMed

    Nicolaou, Christos A; Brown, Nathan

    2013-09-01

    Drug discovery is a challenging multi-objective problem where numerous pharmaceutically important objectives need to be adequately satisfied for a solution to be found. The problem is characterized by vast, complex solution spaces further perplexed by the presence of conflicting objectives. Multi-objective optimization methods, designed specifically to address such problems, were introduced to the drug discovery field over a decade ago and have steadily gained acceptance ever since. This paper reviews the latest multi-objective methods and applications reported in the literature, specifically in quantitative structure–activity modeling, docking, de novo design and library design. Further, the paper reports on related developments in drug discovery research and advances in the multi-objective optimization field.
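
As a minimal illustration of the multi-objective idea (with made-up objective scores, not data from the review), a Pareto filter keeps only candidate molecules that no other candidate beats on every objective:

```python
def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly
    # better in at least one (all objectives minimized here)
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # keep only the non-dominated candidates
    return [p for p in points if not any(dominates(q, p) for q in points)]

# hypothetical (activity loss, toxicity) scores for candidate molecules
candidates = [(0.2, 0.9), (0.4, 0.3), (0.1, 0.8), (0.5, 0.5), (0.3, 0.2)]
front = pareto_front(candidates)
print(front)  # -> [(0.1, 0.8), (0.3, 0.2)]
```

Conflicting objectives show up directly here: neither surviving candidate dominates the other, so the trade-off decision is left to the chemist.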

  16. Continuation methods in multiobjective optimization for combined structure control design

    NASA Technical Reports Server (NTRS)

    Milman, M.; Salama, M.; Scheid, R.; Bruno, R.; Gibson, J. S.

    1990-01-01

    A homotopy approach involving multiobjective functions is developed to outline the methods that have evolved for the combined control-structure optimization of physical systems encountered in the technology of large space structures. To allow timely consideration of control performance before the structural design is finalized, the control and structural design processes are integrated into a unified design methodology that combines the two optimization problems into a single formulation. This study treats the combined optimization problem as a family of weighted structural and control costs. Connections with vector optimization are described, an analysis of the zero-set of the required conditions is made, and a numerical example is given.

  17. Designs and Methods in School Improvement Research: A Systematic Review

    ERIC Educational Resources Information Center

    Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie

    2016-01-01

    Purpose: The purpose of this paper is to focus on challenges faced by longitudinal quantitative analyses of school improvement processes and offers a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed designs and methods that are used to analyze the relation between school…

  18. Impact design methods for ceramic components in gas turbine engines

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1991-01-01

    Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.

  20. Using Propensity Score Methods to Approximate Factorial Experimental Designs

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2011-01-01

    The purpose of this study is to use Monte Carlo simulation to compare several propensity score methods in approximating factorial experimental designs, and to identify the best approaches for reducing bias and mean square error of the parameter estimates of the main and interaction effects of two factors. Previous studies focused more on unbiased estimates of…
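
As a generic sketch of one propensity score method (inverse-probability weighting, with made-up propensities and outcomes, not the study's simulation design), the weighted group means approximate what a randomized comparison would estimate:

```python
# Toy inverse-probability weighting: reweight treated/control units by
# 1/e and 1/(1-e) so the weighted groups mimic a randomized comparison.
units = [  # (treated?, propensity e(x), outcome) -- made-up numbers
    (1, 0.8, 12.0), (1, 0.4, 10.0),
    (0, 0.8, 9.0),  (0, 0.4, 8.0),
]
t_num = sum(y / e for t, e, y in units if t == 1)
t_den = sum(1 / e for t, e, y in units if t == 1)
c_num = sum(y / (1 - e) for t, e, y in units if t == 0)
c_den = sum(1 / (1 - e) for t, e, y in units if t == 0)
ate = t_num / t_den - c_num / c_den   # weighted mean difference
print(round(ate, 3))  # -> 1.917
```

In a two-factor setting the same weighting is applied within each factorial cell before main and interaction effects are estimated.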

  2. Obtaining Valid Response Rates: Considerations beyond the Tailored Design Method.

    ERIC Educational Resources Information Center

    Huang, Judy Y.; Hubbard, Susan M.; Mulvey, Kevin P.

    2003-01-01

    Reports on the use of the tailored design method (TDM) to achieve high survey response rates in two separate studies of the dissemination of Treatment Improvement Protocols (TIPs). Findings from these two studies identify six factors that may have influenced nonresponse, and show that use of TDM does not, in itself, guarantee a high response rate. (SLD)

  3. Database design using NIAM (Nijssen Information Analysis Method) modeling

    SciTech Connect

    Stevens, N.H.

    1989-01-01

    The Nijssen Information Analysis Method (NIAM) is an information modeling technique based on semantics and founded in set theory. A NIAM information model is a graphical representation of the information requirements for some universe of discourse. Information models facilitate data integration and communication within an organization about data semantics. An information model is sometimes referred to as the semantic model or the conceptual schema. It helps in the logical and physical design and implementation of databases. NIAM information modeling is used at Sandia National Laboratories to design and implement relational databases containing engineering information that meets the users' information requirements. The paper focuses on the design of one database which satisfied the data needs of four disjoint but closely related applications. As they existed before, the applications did not communicate with each other even though they stored much of the same data redundantly. NIAM was used to determine the information requirements and design the integrated database. 6 refs., 7 figs.
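
In NIAM, elementary fact types map to relational tables whose uniqueness constraints become keys. A minimal sketch with hypothetical engineering tables (not Sandia's actual schema) shows how one shared Part entity can serve applications that previously stored the data redundantly:

```python
import sqlite3

# Each elementary fact type becomes its own table; uniqueness constraints
# become primary keys, and the shared entity type ties applications together.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE part (          -- entity type shared by all applications
        part_no   TEXT PRIMARY KEY,
        name      TEXT NOT NULL
    );
    CREATE TABLE part_mass (     -- fact type: Part has Mass (kg)
        part_no   TEXT PRIMARY KEY REFERENCES part(part_no),
        mass_kg   REAL NOT NULL
    );
    CREATE TABLE part_supplier ( -- fact type: Part is supplied by Supplier
        part_no   TEXT REFERENCES part(part_no),
        supplier  TEXT NOT NULL,
        PRIMARY KEY (part_no, supplier)
    );
""")
conn.execute("INSERT INTO part VALUES ('P-100', 'bracket')")
conn.execute("INSERT INTO part_mass VALUES ('P-100', 0.25)")
row = conn.execute(
    "SELECT p.name, m.mass_kg FROM part p JOIN part_mass m USING (part_no)"
).fetchone()
print(row)  # one integrated view instead of redundant per-application copies
```

The join reconstructs any application's view from the integrated schema, which is the point of driving the design from a single conceptual model.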

  4. Integrated material and structural design method for flexible pavements. Volume 3: Laboratory design guide

    NASA Astrophysics Data System (ADS)

    Baladi, G. Y.

    1988-12-01

    The research quantified relationships between structural and material mix design parameters and documented a laboratory test procedure for examining mix design from a structural viewpoint. Laboratory asphalt mix design guidelines are presented. The guidelines are based upon the analysis of the results of laboratory static and cyclic load triaxial, indirect tensile, and flexural beam tests. The guidelines allow the highway engineer and the laboratory technician to tailor the asphalt mix design procedure to optimize the structural properties of the mix. Two mix design methods are covered: the Marshall mix design with minor modifications and the indirect tensile test. Analytical and statistical equations are also included for calculating or estimating the structural properties of the mix.

  5. [Estimate methods used with complex sampling designs: their application in the Cuban 2001 health survey].

    PubMed

    Cañizares Pérez, Mayilée; Barroso Utra, Isabel; Alfonso León, Alina; García Roche, René; Alfonso Sagué, Karen; Chang de la Rosa, Martha; Bonet Gorbea, Mariano; León, Esther M

    2004-03-01

    calculated using the different methods. Analytic methods that take into account the way the data are structured as well as the study design give a more realistic picture of the problem under study and provide more exact estimates of the study parameters and their SE than conventional analytic methods. Because data from epidemiologic and public health research are often obtained through complex sampling designs, the methods described in this paper and the statistical packages that utilize them should be used more widely.

  6. [The role of epidemiology in mental disorder research].

    PubMed

    Borges, Guilherme; Medina-Mora, María Elena; López-Moreno, Sergio

    2004-01-01

    which are germane to public health, for example, violence. The epidemiology of mental disorders faces great challenges in the new millennium, including a complex, changing epidemiologic scenario. Several important issues will influence the future development of mental disorder epidemiology: measurement of mental disorders and risk factors, more efficient sampling design and methods, the relationships among biological research, genetics, social studies, and epidemiology, and the interface between epidemiology and the evaluation of therapies and health services.

  7. Assessing the global burden of ischemic heart disease, part 2: analytic methods and estimates of the global epidemiology of ischemic heart disease in 2010

    PubMed Central

    Forouzanfar, Mohammad H.; Moran, Andrew E.; Flaxman, Abraham D.; Roth, Gregory; Mensah, George A.; Ezzati, Majid; Naghavi, Mohsen; Murray, Christopher J.L.

    2012-01-01

    Background Ischemic Heart Disease (IHD) is the leading cause of death worldwide. The Global Burden of Diseases, Injuries and Risk Factors (GBD) 2010 Study estimated IHD mortality and disability burden for 21 world regions for the years 1990 to 2010. Methods Data sources for GBD IHD epidemiology estimates were mortality surveillance, verbal autopsy, and vital registration data (for IHD mortality) and systematic review of IHD epidemiology literature published 1980–2008 (for non-fatal IHD outcomes). An estimation and validation process led to an ensemble model of IHD mortality by country for all 21 world regions, adjusted for country-level covariates. Disease models were developed for the nonfatal sequelae of IHD: myocardial infarction, stable angina pectoris, and ischemic heart failure. Results Country level covariates including metabolic and nutritional risk factors, education, war, and annual income per capita contributed to the ensemble model for the analysis of IHD death. In the acute myocardial infarction model, inclusion of troponin in the diagnostic criteria of studies published after the year 2000 was associated with a 50% higher incidence. Self-reported diagnosis of angina significantly overestimated stable angina prevalence compared with “definite” angina elicited by the Rose angina questionnaire. For 2010, Eastern Europe and Central Asia had the highest rates of IHD death and the Asia Pacific High-Income, East Asia, Latin American Andean, and sub-Saharan Africa regions had the lowest. Conclusions Global and regional IHD epidemiology estimates are needed for estimating the worldwide burden of IHD. Using descriptive meta-analysis tools, the GBD 2010 standardized and pooled international data by adjusting for region-level mortality and risk factor data, and study level diagnostic method. Analyses maximized internal consistency, generalizability, and adjustment for known sources of bias. The GBD IHD analysis nonetheless highlights the need for improved

  8. Computational methods for aerodynamic design using numerical optimization

    NASA Technical Reports Server (NTRS)

    Peeters, M. F.

    1983-01-01

    Five methods to increase the computational efficiency of aerodynamic design using numerical optimization, by reducing the computer time required to perform gradient calculations, are examined. The most promising method consists of drastically reducing the size of the computational domain on which aerodynamic calculations are made during gradient calculations. Since a gradient calculation requires the solution of the flow about an airfoil whose geometry was slightly perturbed from a base airfoil, the flow about the base airfoil is used to determine boundary conditions on the reduced computational domain. This method worked well in subcritical flow.

  9. Supersonic/hypersonic aerodynamic methods for aircraft design and analysis

    NASA Technical Reports Server (NTRS)

    Torres, Abel O.

    1992-01-01

    A methodology employed in engineering codes to predict aerodynamic characteristics over arbitrary supersonic/hypersonic configurations is considered. Engineering codes use a combination of simplified methods, based on geometrical impact angle and freestream conditions, to compute the pressure distribution over the vehicle's surface in an efficient and timely manner. These approximate methods are valid for both hypersonic speeds (Mach greater than 4) and lower speeds (Mach down to 2). It is concluded that the proposed methodology enables the user to obtain reasonable estimates of vehicle performance and that engineering methods are valuable in the design process for these types of vehicles.
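
A classic example of the impact-angle pressure methods that such engineering codes combine is Newtonian theory, sketched below (the abstract does not list the specific code's method set, so this is a generic illustration):

```python
import math

def newtonian_cp(theta_deg, cp_max=2.0):
    # Newtonian impact theory: Cp = Cp_max * sin^2(theta), where theta is
    # the angle between the local surface panel and the freestream;
    # shadowed (leeward) panels with theta <= 0 are assigned Cp = 0.
    theta = math.radians(theta_deg)
    return cp_max * math.sin(theta) ** 2 if theta_deg > 0 else 0.0

print(newtonian_cp(90.0))   # blunt, stagnation-like panel -> 2.0
print(newtonian_cp(15.0))   # shallow windward panel
print(newtonian_cp(-10.0))  # shadowed panel -> 0.0
```

Summing panel pressures computed this way over a surface mesh gives the fast force and moment estimates these codes provide, without solving the flow field.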

  10. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real-world examples of research studies conducted by the authors demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures; and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application to practice and implications for research.

  11. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  13. Comparison of methods for inverse design of radiant enclosures.

    SciTech Connect

    França, Francis; Larsen, Marvin Elwood; Howell, John R.; Daun, Kyle; Leduc, Guillaume

    2005-03-01

    A particular inverse design problem is proposed as a benchmark for comparison of five solution techniques used in design of enclosures with radiating sources. The enclosure is three-dimensional and includes some surfaces that are diffuse and others that are specular diffuse. Two aspect ratios are treated. The problem is completely described, and solutions are presented as obtained by the Tikhonov method, truncated singular value decomposition, conjugate gradient regularization, quasi-Newton minimization, and simulated annealing. All of the solutions use a common set of exchange factors computed by Monte Carlo, and smoothed by a constrained maximum likelihood estimation technique that imposes conservation, reciprocity, and non-negativity. Solutions obtained by the various methods are presented and compared, and the relative advantages and disadvantages of these methods are summarized.
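
The benchmark itself is not reproduced here, but the role of Tikhonov regularization in such inverse problems can be sketched on a generic ill-conditioned system (illustrative matrix and regularization parameter, not the enclosure problem's exchange factors):

```python
import numpy as np

# Generic ill-conditioned system A x = b: a direct solve amplifies data
# noise, while Tikhonov regularization trades a little bias for stability.
rng = np.random.default_rng(0)
n = 12
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** -np.arange(n)                 # rapidly decaying singular values
A = U @ np.diag(s) @ V.T
x_true = np.ones(n)
b = A @ x_true + 1e-6 * rng.standard_normal(n)   # slightly noisy data

x_naive = np.linalg.solve(A, b)
lam = 1e-8                                 # Tikhonov regularization parameter
x_tik = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

err_naive = np.linalg.norm(x_naive - x_true)
err_tik = np.linalg.norm(x_tik - x_true)
print(err_naive, err_tik)   # regularized error is orders of magnitude smaller
```

Truncated SVD and conjugate-gradient regularization, two of the other methods compared in the paper, suppress the same small-singular-value noise amplification by truncation and by early stopping respectively.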

  14. Application of optical diffraction method in designing phase plates

    NASA Astrophysics Data System (ADS)

    Lei, Ze-Min; Sun, Xiao-Yan; Lv, Feng-Nian; Zhang, Zhen; Lu, Xing-Qiang

    2016-11-01

    The continuous phase plate (CPP), which performs beam shaping in laser systems, is an important kind of diffractive optic. Based on the Fourier transform of the Gerchberg-Saxton (G-S) algorithm for designing CPPs, we propose an optical diffraction method that reflects real system conditions. A thin lens performs the Fourier transform of the input signal, and the inverse propagation of light can be implemented in a program. Combining these two functions realizes the iterative process of repeatedly calculating the near-field and far-field light distributions, in a manner similar to the G-S algorithm. The results show that the optical diffraction method can design a CPP for a complicated laser system and give the CPP the ability both to shape the beam and to compensate for the phase aberration of the system. The method can improve the adaptability of the phase plate in systems with phase aberrations.
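
A minimal 1-D sketch of the underlying G-S iteration (toy beam and target, not the paper's system model): alternate between the near-field and far-field planes, imposing the known amplitude in each plane while keeping the computed phase.

```python
import numpy as np

n = 256
x = np.linspace(-2, 2, n)
near_amp = np.exp(-x ** 2)                       # Gaussian input beam
target = np.zeros(n)
target[n // 2 - 16 : n // 2 + 16] = 1.0          # flat-top far-field goal

phase = np.zeros(n)
for _ in range(200):
    far = np.fft.fftshift(np.fft.fft(near_amp * np.exp(1j * phase)))
    far = target * np.exp(1j * np.angle(far))    # impose target amplitude
    near = np.fft.ifft(np.fft.ifftshift(far))
    phase = np.angle(near)                       # keep phase, impose amplitude

far_amp = np.abs(np.fft.fftshift(np.fft.fft(near_amp * np.exp(1j * phase))))
frac = (far_amp[n // 2 - 16 : n // 2 + 16] ** 2).sum() / (far_amp ** 2).sum()
print(frac)   # fraction of far-field energy inside the target window
```

The recovered phase profile plays the role of the CPP surface; the paper's contribution is replacing the ideal Fourier transform in this loop with the actual system's diffraction propagation.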

  15. Designing waveforms for temporal encoding using a frequency sampling method.

    PubMed

    Gran, Fredrik; Jensen, Jørgen Arendt

    2007-10-01

    In this paper a method for designing waveforms for temporal encoding in medical ultrasound imaging is described. The method is based on least squares optimization and is used to design nonlinear frequency modulated signals for synthetic transmit aperture imaging. By using the proposed design method, the amplitude spectrum of the transmitted waveform can be optimized, such that most of the energy is transmitted where the transducer has large amplification. To test the design method, a waveform was designed for a BK8804 linear array transducer. The resulting nonlinear frequency modulated waveform was compared to a linear frequency modulated signal with amplitude tapering, previously used in clinical studies for synthetic transmit aperture imaging. The latter had a relatively flat spectrum which implied that the waveform tried to excite all frequencies including ones with low amplification. The proposed waveform, on the other hand, was designed so that only frequencies where the transducer had a large amplification were excited. Hereby, unnecessary heating of the transducer could be avoided and the signal-to-noise ratio could be increased. The experimental ultrasound scanner RASMUS was used to evaluate the method experimentally. Due to the careful waveform design optimized for the transducer at hand, a theoretic gain in signal-to-noise ratio of 4.9 dB compared to the reference excitation was found, even though the energy of the nonlinear frequency modulated signal was 71% of the energy of the reference signal. This was supported by a signal-to-noise ratio measurement and comparison in penetration depth, where an increase of 1 cm was found in favor for the proposed waveform. Axial and lateral resolutions at full-width half-maximum were compared in a water phantom at depths of 42, 62, 82, and 102 mm. The axial resolutions of the nonlinear frequency modulated signal were 0.62, 0.69, 0.60, and 0.60 mm, respectively. The corresponding axial resolutions for the reference
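
A generic frequency-sampling sketch (not the paper's least-squares NLFM design) conveys the core idea of concentrating waveform energy in a chosen band, here an assumed transducer passband:

```python
import numpy as np

n = 128
f = np.fft.rfftfreq(n)                    # normalized frequency grid
desired = np.where((f > 0.15) & (f < 0.35), 1.0, 0.0)  # assumed passband
h = np.fft.irfft(desired, n)              # frequency-sampling prototype
h = np.roll(h, n // 2) * np.hanning(n)    # center and taper the waveform

H = np.abs(np.fft.rfft(h))
band = H[(f > 0.15) & (f < 0.35)].mean()  # in-band response
stop = H[(f < 0.10) | (f > 0.40)].max()   # worst out-of-band response
print(band, stop)
```

Energy that would have been spent exciting frequencies with low transducer amplification is suppressed, which is the same motivation given for shaping the amplitude spectrum in the paper.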

  16. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into finding the best parameter setting for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, the preliminary results for optimal solutions of multiple instances were found efficiently.
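
The 2-level full factorial design is easy to generate programmatically; the GA factors and levels below are hypothetical, chosen only to illustrate the enumeration:

```python
from itertools import product

# All 2**k combinations of low/high levels for hypothetical GA parameters.
factors = {
    "population_size": (50, 200),
    "crossover_rate": (0.6, 0.9),
    "mutation_rate": (0.01, 0.1),
}
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))   # -> 2**3 = 8 experimental runs
# each run would then be executed and a regression model fit to the results
```

Unlike OFAT, every pairwise level combination appears in the run list, so interaction effects between parameters are estimable from the regression model.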

  17. Molecular Epidemiology of Breast Cancer: Development and Validation of Acetylation Methods for Carcinogen-DNA Adduct Detection

    DTIC Science & Technology

    2001-10-01

    epidemiological studies, and determine adduct levels in relation to metabolizing gene polymorphisms . The originally proposed assay is novel because one uses a...carcinogenic mechanisms. Currently, many ongoing breast cancer studies are exploring risks related to genetic polymorphisms in these genes. Yet these...the surrogate tissue). Finally, in these subjects, we will perform assays for genetic polymorphisms , to assess the association of "at risk" genetic

  18. Optimal pulse design in quantum control: A unified computational method

    PubMed Central

    Li, Jr-Shin; Ruths, Justin; Yu, Tsyr-Yan; Arthanari, Haribabu; Wagner, Gerhard

    2011-01-01

    Many key aspects of control of quantum systems involve manipulating a large quantum ensemble exhibiting variation in the value of parameters characterizing the system dynamics. Developing electromagnetic pulses to produce a desired evolution in the presence of such variation is a fundamental and challenging problem in this research area. We present such robust pulse designs as an optimal control problem of a continuum of bilinear systems with a common control function. We map this control problem of infinite dimension to a problem of polynomial approximation employing tools from geometric control theory. We then adopt this new notion and develop a unified computational method for optimal pulse design using ideas from pseudospectral approximations, by which a continuous-time optimal control problem of pulse design can be discretized to a constrained optimization problem with spectral accuracy. Furthermore, this is a highly flexible and efficient numerical method that requires low order of discretization and yields inherently smooth solutions. We demonstrate this method by designing effective broadband π/2 and π pulses with reduced rf energy and pulse duration, which show significant sensitivity enhancement at the edge of the spectrum over conventional pulses in 1D and 2D NMR spectroscopy experiments. PMID:21245345
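
The pseudospectral idea of discretizing with spectral accuracy can be sketched with a standard Chebyshev differentiation matrix (a generic illustration, not the authors' pulse-design code):

```python
import numpy as np

def cheb(N):
    # Chebyshev spectral differentiation matrix on N+1 Gauss-Lobatto nodes
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.ones(N + 1)
    c[0] = c[-1] = 2.0
    c *= (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    D = np.outer(c, 1.0 / c) / (X - X.T + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))             # diagonal via row-sum identity
    return D, x

# Spectral accuracy: differentiate exp(x) on [-1, 1] with only 17 nodes.
D, x = cheb(16)
err = np.max(np.abs(D @ np.exp(x) - np.exp(x)))
print(err)   # error near machine precision
```

In a pseudospectral optimal control transcription, a matrix like D replaces the dynamics' time derivative, turning the continuous-time pulse-design problem into a low-dimensional constrained optimization with smooth interpolated solutions.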

  19. Novel TMS coils designed using an inverse boundary element method

    NASA Astrophysics Data System (ADS)

    Cobos Sánchez, Clemente; María Guerrero Rodriguez, Jose; Quirós Olozábal, Ángel; Blanco-Navarro, David

    2017-01-01

    In this work, a new method to design TMS coils is presented. It is based on incorporating the concept of the stream function of a quasi-static electric current into a boundary element method. The proposed TMS coil design approach is a powerful technique for producing stimulators of arbitrary shape, and is remarkably versatile as it permits prototyping under many different performance requirements and constraints. To illustrate the power of this approach, it has been used to design TMS coils wound on rectangular flat, spherical and hemispherical surfaces, subject to different constraints such as minimum stored magnetic energy or power dissipation. The performance of these coils is also described, and the torque experienced by each stimulator in the presence of a main static magnetic field has been found theoretically in order to study the prospect of using them to perform TMS and fMRI concurrently. The results show that the described method is an efficient tool for the design of TMS stimulators and can be applied to a wide range of coil geometries and performance requirements.

  20. Non-contact electromagnetic exciter design with linear control method

    NASA Astrophysics Data System (ADS)

    Wang, Lin; Xiong, Xianzhi; Xu, Hua

    2017-01-01

    A non-contact type force actuator is necessary for studying the dynamic performance of a high-speed spindle system owing to its high-speed operating conditions. A non-contact electromagnetic exciter is designed for identifying the dynamic coefficients of journal bearings in high-speed grinding spindles. A linear force control method is developed based on a PID controller. The influence of the amplitude and frequency of the current, misalignment and rotational speed on the magnetic field and excitation force is investigated based on two-dimensional finite element analysis. The electromagnetic excitation force is measured with auxiliary coils and calibrated by load cells. The design is validated by the experimental results. Theoretical and experimental investigations show that the proposed design can accurately generate a linear excitation force with sufficiently large amplitude and a high signal-to-noise ratio. Moreover, fluctuations in force amplitude are greatly reduced with the designed linear control method even when the air gap changes due to rotor vibration at high-speed conditions. In addition, it is possible to apply various types of excitation based on the proposed linear control method: constant, synchronous, and non-synchronous excitation forces. This exciter can be used as a linear-force excitation and control system for studying the dynamic performance of different high-speed rotor-bearing systems.
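
A minimal discrete PID loop of the kind used for such force control can be sketched as follows (toy first-order plant and illustrative gains, not the paper's controller or exciter model):

```python
def make_pid(kp, ki, kd, dt):
    # returns a stateful controller step: (setpoint, measurement) -> output
    state = {"integral": 0.0, "prev_err": 0.0}
    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * dt
        deriv = (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

# Toy first-order plant: force lags the commanded value with a 50 ms constant.
dt, tau = 0.001, 0.05
pid = make_pid(kp=2.0, ki=8.0, kd=0.0, dt=dt)
force, target = 0.0, 10.0
for _ in range(5000):                       # simulate 5 s
    u = pid(target, force)
    force += dt * (u - force) / tau
print(round(force, 3))                      # settles at the set-point
```

The integral term is what removes the steady-state error when the plant gain drifts, which parallels the paper's point about holding force amplitude steady as the air gap changes.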

  1. Design method of coaxial reflex hollow beam generator

    NASA Astrophysics Data System (ADS)

    Wang, Jiake; Xu, Jia; Fu, Yuegang; He, Wenjun; Zhu, Qifan

    2016-10-01

    In view of the light energy loss due to central obscuration in coaxial reflective optical systems, a design method for a hollow beam generator is introduced. First, according to the geometrical parameters and obscuration ratio of the front-end coaxial reflective optical system, the required physical dimensions of the hollow beam are calculated, and the beam expansion ratio of the hollow beam generator is obtained from the parameters of the light source. A suitable magnification for the initial expanding system is chosen using the relation between the overall beam expansion ratio and that of the initial system; the traditional design method for reflective optical systems is used to design the initial optical system, and the position of the rotation axis of the hollow beam generator is then obtained through the rotation-axis translation formula. The initial system generatrix is intercepted using the translated rotation axis, and the generatrix is rotated through 360° around the axis, yielding the two working faces of the hollow beam generator. A hollow beam generator designed by this method produces a hollow beam that matches the front-end coaxial reflective optical system, improving the energy utilization of the beam and effectively reducing the back scattering of the transmission system.

  2. The future of prodrugs - design by quantum mechanics methods.

    PubMed

    Karaman, Rafik; Fattash, Beesan; Qtait, Alaa

    2013-05-01

    The revolution in computational chemistry has greatly impacted the drug design and delivery fields in general, and recently the utilization of the prodrug approach in particular. The use of ab initio, semiempirical and molecular mechanics methods to understand the organic reaction mechanisms of certain processes, especially intramolecular reactions, has opened the door to the rapid design and production of safe and efficacious delivery systems for a wide range of small-molecule and biotherapeutic agents such as prodrugs. This article provides readers with a concise overview of this modern approach to prodrug design. The use of computational approaches, such as density functional theory (DFT), semiempirical and ab initio molecular orbital methods, in modern prodrug design is discussed. The novel prodrug approach reported in this review involves prodrug design based on an enzyme model (mimicking enzyme catalysis) that has been used to understand how enzymes work. The design tool is a computational approach consisting of calculations using molecular orbital and molecular mechanics methods (DFT, ab initio and MM2) and correlations between experimental and calculated values of intramolecular processes, which were used to understand the mechanisms by which enzymes might achieve their high catalytic rates. The future of prodrug technology is exciting yet extremely challenging. Advances must be made in understanding the chemistry of many organic reactions that can be effectively utilized to enable the development of even more types of prodrugs. Despite the increase in the number of marketed prodrugs, we have only started to appreciate the potential of the prodrug approach in modern drug development, and the coming years will witness many novel prodrug innovations.

  3. Molecular epidemiology of human hepatitis A virus defined by an antigen-capture polymerase chain reaction method.

    PubMed Central

    Jansen, R W; Siegl, G; Lemon, S M

    1990-01-01

    We describe an immunoaffinity-linked nucleic acid amplification system (antigen-capture/polymerase chain reaction, or AC/PCR) for detection of viruses in clinical specimens and its application to the study of the molecular epidemiology of a picornavirus, hepatitis A virus (HAV). Immunoaffinity capture of virus, synthesis of viral cDNA, and amplification of cDNA by a polymerase chain reaction (PCR) were carried out sequentially in a single reaction vessel. This approach simplified sample preparation and enhanced the specificity of conventional PCR. AC/PCR detected less than one cell culture infectious unit of virus in 80 microliters of sample. Sequencing of AC/PCR reaction products from 34 virus strains demonstrated remarkable conservation at the nucleotide level among most strains but revealed hitherto unsuspected genetic diversity among human isolates. Epidemiologically related strains were identical or closely related in sequence. Virus strains recovered from epidemics of hepatitis A in the United States and Germany were identical in sequence, providing evidence for a previously unrecognized epidemiologic link between these outbreaks. PMID:2158093

  4. The characterization of kerogen-analytical limitations and method design

    SciTech Connect

    Larter, S.R.

    1987-04-01

    Methods suitable for high resolution total molecular characterization of kerogens and other polymeric SOM are necessary for a quantitative understanding of hydrocarbon maturation and migration phenomena in addition to being a requirement for a systematic understanding of kerogen based fuel utilization. Gas chromatographic methods, in conjunction with analytical pyrolysis methods, have proven successful in the rapid superficial characterization of kerogen pyrolysates. Most applications involve qualitative or semi-quantitative assessment of the relative concentration of aliphatic, aromatic, or oxygen-containing species in a kerogen pyrolysate. More recently, the use of alkylated polystyrene internal standards has allowed the direct determination of parameters related to the abundance of, for example, normal alkyl groups or single ring aromatic species in kerogens. The future of methods of this type for improved kerogen typing is critically discussed. The conceptual design and feasibility of methods suitable for the more complete characterization of complex geopolymers on the molecular level is discussed with practical examples.

  5. Design of transonic compressor cascades using hodograph method

    NASA Technical Reports Server (NTRS)

    Chen, Zuoyi; Guo, Jingrong

    1991-01-01

    The use of the Hodograph Method in the design of a transonic compressor cascade is discussed. The design of the flow mode in the transonic compressor cascade must satisfy the following: the flow in the nozzle part should be uniform and smooth; the location of the sonic line should be reasonable; and the aerodynamic requirements of the flow passage in the subsonic region should be met. The flow rate through the cascade may be determined by the velocity distribution in the subsonic region (i.e., by the numerical solution of the Chaplygin equation). The supersonic sections A'C' and AD are determined by the analytical solution of the Mixed-Type Hodograph equation.

  6. Rays inserting method (RIM) to design dielectric optical devices

    NASA Astrophysics Data System (ADS)

    Taskhiri, Mohammad Mahdi; Khalaj Amirhosseini, Mohammad

    2017-01-01

    In this article, a novel approach, called the Rays Inserting Method (RIM), is introduced to design dielectric optical devices. In this approach, some rays are inserted between the two ends of the desired device, and the refractive index at points along the route of the rays is then obtained. The validity of the introduced approach is verified by designing three types of optical devices, i.e. a power splitter, a bend, and a flat lens. The results are confirmed with numerical simulations by means of the FDTD scheme at a frequency of 100 GHz.

  7. Current methods of epitope identification for cancer vaccine design.

    PubMed

    Cherryholmes, Gregory A; Stanton, Sasha E; Disis, Mary L

    2015-12-16

    The importance of the immune system in tumor development and progression has been emerging in many cancers. Previous cancer vaccines have not shown long-term clinical benefit, possibly because they were not designed to avoid eliciting regulatory T-cell responses that inhibit the anti-tumor immune response. This review will examine different methods of identifying epitopes derived from tumor associated antigens suitable for immunization and the steps used to design and validate peptide epitopes to improve efficacy of anti-tumor peptide-based vaccines. Focusing on in silico prediction algorithms, we survey the advantages and disadvantages of current cancer vaccine prediction tools. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. The C8 health project: design, methods, and participants.

    PubMed

    Frisbee, Stephanie J; Brooks, A Paul; Maher, Arthur; Flensborg, Patsy; Arnold, Susan; Fletcher, Tony; Steenland, Kyle; Shankar, Anoop; Knox, Sarah S; Pollard, Cecil; Halverson, Joel A; Vieira, Verónica M; Jin, Chuanfang; Leyden, Kevin M; Ducatman, Alan M

    2009-12-01

    The C8 Health Project was created, authorized, and funded as part of the settlement agreement reached in the case of Jack W. Leach, et al. v. E.I. du Pont de Nemours & Company (no. 01-C-608 W.Va., Wood County Circuit Court, filed 10 April 2002). The settlement stemmed from the perfluorooctanoic acid (PFOA, or C8) contamination of drinking water in six water districts in two states near the DuPont Washington Works facility near Parkersburg, West Virginia. This study reports on the methods and results from the C8 Health Project, a population study created to gather data that would allow class members to know their own PFOA levels and permit subsequent epidemiologic investigations. Final study participation was 69,030, enrolled over a 13-month period in 2005-2006. Extensive data were collected, including demographic data, medical diagnoses (both self-report and medical records review), clinical laboratory testing, and determination of serum concentrations of 10 perfluorocarbons (PFCs). Here we describe the processes used to collect, validate, and store these health data. We also describe survey participants and their serum PFC levels. The population geometric mean for serum PFOA was 32.91 ng/mL, 500% higher than previously reported for a representative American population. Serum concentrations for perfluorohexane sulfonate and perfluorononanoic acid were elevated 39% and 73% respectively, whereas perfluorooctanesulfonate was present at levels similar to those in the U.S. population. This largest known population study of community PFC exposure permits new evaluations of associations between PFOA, in particular, and a range of health parameters. These will contribute to understanding of the biology of PFC exposure. The C8 Health Project also represents an unprecedented effort to gather basic data on an exposed population; its achievements and limitations can inform future legal settlements for populations exposed to environmental contaminants.

  9. Material Design, Selection, and Manufacturing Methods for System Sustainment

    SciTech Connect

    David Sowder, Jim Lula, Curtis Marshall

    2010-02-18

    This paper describes a material selection and validation process proven to be successful for manufacturing high-reliability, long-life product. The National Secure Manufacturing Center business unit of the Kansas City Plant (herein called KCP) designs and manufactures complex electrical and mechanical components used in extreme environments. The material manufacturing heritage is founded in the systems design-to-manufacturing practices that support the U.S. Department of Energy's National Nuclear Security Administration (DOE/NNSA). Material engineers at KCP work with the systems designers to recommend materials, develop test methods, perform analysis of test data, define cradle-to-grave needs, and present final selections for fielding. The KCP material engineers typically maintain cost control by utilizing commercial products when possible, but have the resources to develop and produce unique formulations as necessary. This approach is currently being used to mature technologies to manufacture materials with improved characteristics using nano-composite filler materials that will enhance system design and production. For some products the engineers plan and carry out science-based life-cycle material surveillance processes. Recent examples of the approach include refurbished manufacturing of the high voltage power supplies for cockpit displays in operational aircraft; dry film lubricant application to improve bearing life for guided munitions gyroscope gimbals; ceramic substrate design for electrical circuit manufacturing; and tailored polymeric materials for various systems. The following examples show evidence of KCP concurrent design-to-manufacturing techniques used to achieve system solutions that satisfy or exceed demanding requirements.

  10. Application of the CSCM method to the design of wedge cavities. [Conservative Supra Characteristic Method

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Nystrom, G. A.; Bardina, J.; Lombard, C. K.

    1987-01-01

    This paper describes the application of the conservative supra characteristic method (CSCM) to predict the flow around two-dimensional slot injection cooled cavities in hypersonic flow. Seven different numerical solutions are presented that model three different experimental designs. The calculations manifest outer flow conditions including the effects of nozzle/lip geometry, angle of attack, nozzle inlet conditions, boundary and shear layer growth, and turbulence on the surrounding flow. The calculations were performed for analysis prior to wind tunnel testing for sensitivity studies early in the design process. Qualitative and quantitative understanding of the flows for each of the cavity designs and design recommendations are provided. The present paper demonstrates the ability of numerical schemes, such as the CSCM method, to play a significant role in the design process.

  12. Unified computational method for design of fluid loop systems

    NASA Astrophysics Data System (ADS)

    Furukawa, Masao

    1991-12-01

    Various kinds of empirical formulas for Nusselt numbers, Fanning friction factors, and pressure loss coefficients were collected and reviewed with the object of constructing a common basis for design calculations of pumped fluid loop systems. The practical expressions obtained after numerical modifications are listed in tables with identification numbers corresponding to configurations of the flow passages. Design procedures for a cold plate and for a space radiator are clearly shown in a series of mathematical relations coupled with a number of detailed expressions which are put in the tables in order of numerical computation. Weight estimate models and several pump characteristics are given in the tables as a result of data regression. A unified computational method based upon the above procedure is presented for preliminary design analyses of a fluid loop system consisting of cold plates, plane radiators, mechanical pumps, valves, and so on.
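
    The correlations named in this abstract are standard enough to sketch. The Python fragment below is a minimal illustration with arbitrary example numbers, not the paper's tabulated expressions: it evaluates a Dittus-Boelter Nusselt number, a Blasius-type Fanning friction factor, and the corresponding pressure drop for a turbulent coolant passage.

    ```python
    def nusselt_dittus_boelter(re: float, pr: float, heating: bool = True) -> float:
        """Dittus-Boelter correlation for turbulent pipe flow (valid roughly Re > 1e4)."""
        n = 0.4 if heating else 0.3
        return 0.023 * re**0.8 * pr**n

    def fanning_friction_blasius(re: float) -> float:
        """Blasius smooth-pipe correlation for the Fanning friction factor."""
        return 0.079 * re**-0.25

    def pressure_drop(re: float, rho: float, v: float, length: float, diameter: float) -> float:
        """Pressure drop (Pa) from the Fanning factor: dp = 4 f (L/D) (rho v^2 / 2)."""
        f = fanning_friction_blasius(re)
        return 4.0 * f * (length / diameter) * 0.5 * rho * v**2

    # Illustrative coolant-channel numbers (assumed, not from the paper)
    nu = nusselt_dittus_boelter(2.0e4, 5.0)            # dimensionless
    dp = pressure_drop(2.0e4, 1000.0, 1.0, 1.0, 0.01)  # Pa
    ```

    A real loop design would chain such correlations per flow-passage configuration, as the paper's tables do.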

  13. USER-derived cloning methods and their primer design.

    PubMed

    Salomonsen, Bo; Mortensen, Uffe H; Halkier, Barbara A

    2014-01-01

    Uracil excision-based cloning through USER™ (Uracil-Specific Excision Reagent) is an efficient ligase-free cloning technique that comprises USER cloning, USER fusion, and USER cassette-free (UCF) USER fusion. These USER-derived cloning techniques enable seamless assembly of multiple DNA fragments in one construct. Though governed by a few simple rules, primer design for USER-based fusion of PCR fragments can prove time-consuming for inexperienced users. The Primer Help for USER (PHUSER) software is an easy-to-use primer design tool for USER-based methods. In this chapter, we present a PHUSER software protocol for designing primers for USER-derived cloning techniques.
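
    The core tail rule can be sketched in a few lines. In the toy Python fragment below (function names and the simplified rule are illustrative, not PHUSER's API; real tools also balance melting temperatures, assembly order, and vector compatibility), a USER primer is built from a 5' overhang that starts with A and ends in a T, written as deoxyuracil 'U' so that the USER enzyme mix can later excise it and expose a single-stranded overhang for assembly.

    ```python
    def user_tail(overhang: str) -> str:
        """Turn an A...T overhang into a USER tail by swapping the final T
        for deoxyuracil ('U'); excising the U later exposes the overhang."""
        overhang = overhang.upper()
        if not overhang.startswith("A"):
            raise ValueError("USER overhangs conventionally start with A")
        if not overhang.endswith("T"):
            raise ValueError("the overhang must end in T so it can become U")
        return overhang[:-1] + "U"

    def user_primer(overhang: str, annealing_region: str) -> str:
        """Full primer = USER tail + gene-specific annealing region."""
        return user_tail(overhang) + annealing_region.upper()
    ```

    For example, `user_primer("AGGCT", "atgcatgc")` yields `"AGGCUATGCATGC"`.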

  14. A simple design method of negative refractive index metamaterials

    NASA Astrophysics Data System (ADS)

    Kim, Dongho; Lee, Wangju; Choi, Jaeick

    2009-11-01

    We propose a very simple design method for negative refractive index (NRI) materials that can overcome some drawbacks of conventional resonant-type NRI materials. The proposed NRI materials consist of single or double metallic patterns printed on a dielectric substrate. Our metamaterials (MTMs) show two properties that are different from other types of MTMs in obtaining effective negative values of permittivity (ε) and permeability (μ) simultaneously; the geometrical outlines of the metallic patterns are not confined to any specific shape, and the metallic patterns are printed on only one side of the dielectric substrate. Therefore, they are very easy to design and fabricate using common printed circuit board (PCB) technology according to the appropriate application. Excellent agreement between the experimental and predicted data ensures the validity of our design approach.

  15. Applying Human-Centered Design Methods to Scientific Communication Products

    NASA Astrophysics Data System (ADS)

    Burkett, E. R.; Jayanty, N. K.; DeGroot, R. M.

    2016-12-01

    Knowing your users is a critical part of developing anything to be used or experienced by a human being. User interviews, journey maps, and personas are all techniques commonly employed in human-centered design practices because they have proven effective for informing the design of products and services that meet the needs of users. Many non-designers are unaware of the usefulness of personas and journey maps. Scientists who are interested in developing more effective products and communication can adopt and employ user-centered design approaches to better reach intended audiences. Journey mapping is a qualitative data-collection method that captures the story of a user's experience over time as related to the situation or product that requires development or improvement. Journey maps help define user expectations, where they are coming from, what they want to achieve, what questions they have, their challenges, and the gaps and opportunities that can be addressed by designing for them. A persona is a tool used to describe the goals and behavioral patterns of a subset of potential users or customers. The persona is a qualitative data model that takes the form of a character profile, built upon data about the behaviors and needs of multiple users. Gathering data directly from users avoids the risk of basing models on assumptions, which are often limited by misconceptions or gaps in understanding. Journey maps and user interviews together provide the data necessary to build the composite character that is the persona. Because a persona models the behaviors and needs of the target audience, it can then be used to make informed product design decisions. We share the methods and advantages of developing and using personas and journey maps to create more effective science communication products.

  16. 77 FR 32632 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-01

    ... AGENCY Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent... of lead (Pb) in the ambient air. FOR FURTHER INFORMATION CONTACT: Robert Vanderpool, Human Exposure... CFR Part 53, the EPA evaluates various methods for monitoring the concentrations of those ambient...

  17. Helicopter flight-control design using an H(2) method

    NASA Technical Reports Server (NTRS)

    Takahashi, Marc D.

    1991-01-01

    Rate-command and attitude-command flight-control designs for a UH-60 helicopter in hover are presented and were synthesized using an H(2) method. Using weight functions, this method allows the direct shaping of the singular values of the sensitivity, complementary sensitivity, and control input transfer-function matrices to give acceptable feedback properties. The designs were implemented on the Vertical Motion Simulator, and four low-speed hover tasks were used to evaluate the control system characteristics. The pilot comments from the accel-decel, bob-up, hovering turn, and side-step tasks indicated good decoupling and quick response characteristics. However, an underlying roll pilot-induced oscillation (PIO) tendency was found to exist away from the hover condition, which was caused by a flap regressing mode with insufficient damping.
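
    The paper's output-feedback H(2) synthesis with frequency weights is too involved for a snippet, but its state-feedback special case (LQR) shows the machinery: a Riccati equation trades tracking error against control effort. The sketch below uses a toy two-state hover axis with made-up numbers, not UH-60 data.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_are

    # Toy hover axis: states = [attitude, rate], input = rotor moment
    A = np.array([[0.0, 1.0],
                  [0.0, -0.5]])
    B = np.array([[0.0],
                  [1.0]])
    Q = np.diag([10.0, 1.0])  # penalize attitude error most (attitude-command flavor)
    R = np.array([[1.0]])     # control-effort weight

    P = solve_continuous_are(A, B, Q, R)  # algebraic Riccati equation
    K = np.linalg.solve(R, B.T @ P)       # optimal gain, u = -K x
    closed_loop_poles = np.linalg.eigvals(A - B @ K)
    ```

    Raising the attitude weight in Q quickens the response at the cost of more control activity, the same trade the weight functions set in the full H(2) design.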

  18. Optical design and active optics methods in astronomy

    NASA Astrophysics Data System (ADS)

    Lemaitre, Gerard R.

    2013-03-01

    Optical designs for astronomy involve implementation of active optics and adaptive optics from the X-ray to the infrared. Developments and results of active optics methods for telescopes, spectrographs and coronagraph planet finders are presented. The high accuracy and remarkable smoothness of surfaces generated by active optics methods also allow elaborating new optical design types with highly aspheric and/or non-axisymmetric surfaces. Depending on the goal and the performance requested for a deformable optical surface, analytical investigations are carried out with one of the various facets of elasticity theory: small-deformation thin plate theory, large-deformation thin plate theory, shallow spherical shell theory, or weakly conical shell theory. The resulting thickness distribution and associated bending force boundaries can be refined further with finite element analysis.

  19. National Tuberculosis Genotyping and Surveillance Network: Design and Methods

    PubMed Central

    Braden, Christopher R.; Schable, Barbara A.; Onorato, Ida M.

    2002-01-01

    The National Tuberculosis Genotyping and Surveillance Network was established in 1996 to perform a 5-year, prospective study of the usefulness of genotyping Mycobacterium tuberculosis isolates to tuberculosis control programs. Seven sentinel sites identified all new cases of tuberculosis, collected information on patients and contacts, and obtained patient isolates. Seven genotyping laboratories performed DNA fingerprinting analysis by the international standard IS6110 method. BioImage Whole Band Analyzer software was used to analyze patterns, and distinct patterns were assigned unique designations. Isolates with six or fewer bands on IS6110 patterns were also spoligotyped. Patient data and genotyping designations were entered in a relational database and merged with selected variables from the national surveillance database. In two related databases, we compiled the results of routine contact investigations and the results of investigations of the relationships of patients who had isolates with matching genotypes. We describe the methods used in the study. PMID:12453342

  20. Simplified Analysis Methods for Primary Load Designs at Elevated Temperatures

    SciTech Connect

    Carter, Peter; Jetter, Robert I; Sham, Sam

    2011-01-01

    The use of simplified (reference stress) analysis methods is discussed and illustrated for primary load high temperature design. Elastic methods are the basis of the ASME Section III, Subsection NH primary load design procedure. There are practical drawbacks with this approach, particularly for complex geometries and temperature gradients. The paper describes an approach which addresses these difficulties through the use of temperature-dependent elastic-perfectly plastic analysis. Correction factors are defined to address difficulties traditionally associated with discontinuity stresses, inelastic strain concentrations and multiaxiality. A procedure is identified to provide insight into how this approach could be implemented, but additional work clearly remains to define and clarify the procedural steps before the approach could be adopted into code language.

  1. Preliminary demonstration of a robust controller design method

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1980-01-01

    Alternative computational procedures for obtaining a feedback control law which yields a control signal based on measurable quantities are evaluated. The three methods evaluated are: (1) the standard linear quadratic regulator design model; (2) minimization of the norm of the feedback matrix K via nonlinear programming, subject to the constraint that the closed-loop eigenvalues lie in a specified domain in the complex plane; and (3) maximization of the angles between the closed-loop eigenvectors combined with minimization of the norm of K, also via constrained nonlinear programming. The third, or robust, design method was chosen to yield a closed-loop system whose eigenvalues are insensitive to small changes in the A and B matrices. The relationship between the orthogonality of closed-loop eigenvectors and the sensitivity of closed-loop eigenvalues is described. Computer programs are described.
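
    The link between eigenvector orthogonality and eigenvalue sensitivity mentioned here is the per-eigenvalue condition number 1/|yᵢᴴxᵢ|: nearly parallel eigenvectors make it large, so small changes in A move the eigenvalues far. A short sketch with illustrative matrices (not the paper's aircraft models):

    ```python
    import numpy as np

    def eig_condition_numbers(A: np.ndarray) -> np.ndarray:
        """1/|y_i^H x_i| for unit right (x_i) and left (y_i) eigenvectors;
        values near 1 mean robust eigenvalues, large values mean sensitive ones."""
        lam, X = np.linalg.eig(A)
        Y = np.linalg.inv(X).conj().T        # columns are left eigenvectors
        X = X / np.linalg.norm(X, axis=0)
        Y = Y / np.linalg.norm(Y, axis=0)
        return 1.0 / np.abs(np.sum(Y.conj() * X, axis=0))

    well_conditioned = np.diag([1.0, 2.0])      # orthogonal eigenvectors
    ill_conditioned = np.array([[1.0, 100.0],
                                [0.0,   2.0]])  # nearly parallel eigenvectors
    ```

    The robust design method above effectively pushes these condition numbers toward 1 while keeping the gain norm small.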

  2. Attenuator design method for dedicated whole-core CT.

    PubMed

    Li, Mengfei; Zhao, Yunsong; Zhang, Peng

    2016-10-03

    In whole-core CT imaging, scanned data corresponding to the central portion of a cylindrical core often suffer from photon starvation, because increasing photon flux will cause overflow on some detector units under the restriction of detector dynamic range. Either photon starvation or data overflow will lead to increased noise or severe artifacts in the reconstructed CT image. In addition, cupping shaped beam hardening artifacts also appear in the whole-core CT image. In this paper, we present a method to design an attenuator for cone beam whole-core CT, which not only reduces the dynamic range requirement for high SNR data scanning, but also corrects beam hardening artifacts. Both simulation and real data are employed to verify our design method.

  3. A Requirements-Driven Optimization Method for Acoustic Treatment Design

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.

    2016-01-01

    Acoustic treatment designers have long been able to target specific noise sources inside turbofan engines. Facesheet porosity and cavity depth are key design variables of perforate-over-honeycomb liners that determine levels of noise suppression as well as the frequencies at which suppression occurs. Layers of these structures can be combined to create a robust attenuation spectrum that covers a wide range of frequencies. Looking to the future, rapidly-emerging additive manufacturing technologies are enabling new liners with multiple degrees of freedom, and new adaptive liners with variable impedance are showing promise. More than ever, there is greater flexibility and freedom in liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. The subject of this paper is an analytical method that derives the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground.
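
    One relation behind the cavity-depth design variable is simple enough to state: a locally reacting honeycomb cavity acts roughly as a quarter-wave resonator, so depth places the attenuation peak in frequency (facesheet porosity mainly sets the resistance, i.e. peak height and width). A back-of-envelope sketch, not the paper's system-level optimization:

    ```python
    C_AIR = 343.0  # speed of sound in air, m/s, near 20 C

    def cavity_depth_for_peak(f_hz: float) -> float:
        """Cavity depth (m) that places the quarter-wave resonance at f_hz."""
        return C_AIR / (4.0 * f_hz)

    depth = cavity_depth_for_peak(2000.0)  # a 2 kHz tone wants roughly 43 mm of depth
    ```

    The optimization described in the paper answers the harder question this rule cannot: which frequencies are worth targeting once propagation and certification metrics are accounted for.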

  4. A method for the aerodynamic design of dry powder inhalers.

    PubMed

    Ertunç, O; Köksoy, C; Wachtel, H; Delgado, A

    2011-09-15

    An inhaler design methodology was developed and then used to design a new dry powder inhaler (DPI) which aimed to fulfill two main performance requirements. The first requirement was that the patient should be able to completely empty the dry powder from the blister in which it is stored by inspiratory effort alone. The second requirement was that the flow resistance of the inhaler should be geared to optimum patient comfort. The emptying of a blister is a two-phase flow problem, whilst the adjustment of the flow resistance is an aerodynamic design problem. The core of the method comprised visualization of fluid and particle flow in upscaled prototypes operated in water. The prototypes and particles were upscaled so that dynamic similarity conditions were approximated as closely as possible. The initial step in the design method was to characterize different blister prototypes by measurements of their flow resistance and particle emptying performance. The blisters were then compared with regard to their aerodynamic performance and their ease of production. Following selection of candidate blisters, the other components such as needle, bypass and mouthpiece were dimensioned on the basis of node-loop operations and validation experiments. The final shape of the inhaler was achieved by experimental iteration. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. Translational Epidemiology in Psychiatry

    PubMed Central

    Weissman, Myrna M.; Brown, Alan S.; Talati, Ardesheer

    2012-01-01

    Translational research generally refers to the application of knowledge generated by advances in basic sciences research translated into new approaches for diagnosis, prevention, and treatment of disease. This direction is called bench-to-bedside. Psychiatry has similarly emphasized the basic sciences as the starting point of translational research. This article introduces the term translational epidemiology for psychiatry research as a bidirectional concept in which the knowledge generated from the bedside or the population can also be translated to the benches of laboratory science. Epidemiologic studies are primarily observational but can generate representative samples, novel designs, and hypotheses that can be translated into more tractable experimental approaches in the clinical and basic sciences. This bedside-to-bench concept has not been explicated in psychiatry, although there are an increasing number of examples in the research literature. This article describes selected epidemiologic designs, providing examples and opportunities for translational research from community surveys and prospective, birth cohort, and family-based designs. Rapid developments in informatics, emphases on large sample collection for genetic and biomarker studies, and interest in personalized medicine—which requires information on relative and absolute risk factors—make this topic timely. The approach described has implications for providing fresh metaphors to communicate complex issues in interdisciplinary collaborations and for training in epidemiology and other sciences in psychiatry. PMID:21646577

  6. Application of an optimization method to high performance propeller designs

    NASA Technical Reports Server (NTRS)

    Li, K. C.; Stefko, G. L.

    1984-01-01

    The application of an optimization method to determine the propeller blade twist distribution which maximizes propeller efficiency is presented. The optimization employs a previously developed method which has been improved to include the effects of blade drag, camber and thickness. Before the optimization portion of the computer code is used, comparisons of calculated propeller efficiencies and power coefficients are made with experimental data for one NACA propeller at Mach numbers in the range of 0.24 to 0.50 and another NACA propeller at a Mach number of 0.71 to validate the propeller aerodynamic analysis portion of the computer code. Then comparisons of calculated propeller efficiencies for the optimized and the original propellers show the benefits of the optimization method in improving propeller performance. This method can be applied to the aerodynamic design of propellers having straight, swept, or nonplanar propeller blades.
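
    The structure of such an optimization can be sketched compactly. Below, a toy quadratic "efficiency" model (an assumed stand-in for the paper's blade aerodynamic analysis, with invented constants) is maximized over the twist at five radial stations using SciPy:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    R_STATIONS = np.linspace(0.3, 1.0, 5)                      # radial stations r/R
    IDEAL = np.degrees(np.arctan(0.4 / (np.pi * R_STATIONS)))  # helix-like ideal twist

    def efficiency(twist_deg: np.ndarray) -> float:
        """Toy model: peaks at an ideal twist that washes out toward the tip."""
        return float(0.85 - 1e-3 * np.sum((twist_deg - IDEAL) ** 2))

    res = minimize(lambda t: -efficiency(t), x0=np.full(5, 20.0), method="BFGS")
    best_twist = res.x  # optimized twist distribution, decreasing root to tip
    ```

    In the paper the objective is instead the validated aerodynamic analysis, but the outer loop has this same shape: a scalar efficiency driven by the twist distribution.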

  7. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a system optimization problem into several subtask optimizations, which may be executed concurrently, and a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited to exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are all ideally suited for concurrent processing. Needless to say, algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.

  8. Computational methods for drug design and discovery: focus on China.

    PubMed

    Zheng, Mingyue; Liu, Xian; Xu, Yuan; Li, Honglin; Luo, Cheng; Jiang, Hualiang

    2013-10-01

    In the past decades, China's computational drug design and discovery research has experienced fast development through various novel methodologies. Application of these methods spans a wide range, from drug target identification to hit discovery and lead optimization. In this review, we first provide an overview of China's status in this field and briefly analyze the possible reasons for this rapid advancement. The methodology development is then outlined. For each selected method, a short background precedes an assessment of the method with respect to the needs of drug discovery, and, in particular, work from China is highlighted. Furthermore, several successful applications of these methods are illustrated. Finally, we conclude with a discussion of current major challenges and future directions of the field.

  9. A method of designing clinical trials for combination drugs.

    PubMed

    Pigeon, J G; Copenhaver, M D; Whipple, J P

    1992-06-15

    Many pharmaceutical companies are now exploring combination drug therapies as an alternative to monotherapy. Consequently, it is of interest to investigate the simultaneous dose response relationship of two active drugs to select the lowest effective combination. In this paper, we propose a method for designing clinical trials for drug combinations that seems to offer several advantages over the 4 x 3 or even larger factorial studies that have been used to date. In addition, our proposed method provides a convenient formula for calculating the required sample size.
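
    The abstract does not reproduce the paper's formula, but the shape of such a calculation is the standard two-group normal-approximation sample size, sketched below (Python standard library only; the α, power, and effect numbers are illustrative, not from the paper):

    ```python
    from math import ceil
    from statistics import NormalDist

    def n_per_group(delta: float, sd: float,
                    alpha: float = 0.05, power: float = 0.8) -> int:
        """Standard n = 2 * ((z_{1-alpha/2} + z_power) * sd / delta)^2, rounded up."""
        z = NormalDist().inv_cdf
        return ceil(2.0 * ((z(1 - alpha / 2) + z(power)) * sd / delta) ** 2)

    # detect a 5-unit difference, sd 10, two-sided alpha 0.05, 80% power
    n = n_per_group(delta=5.0, sd=10.0)
    ```

    A factorial combination study multiplies a per-cell requirement like this across every dose pair, which is exactly the burden the proposed design aims to reduce.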

  10. Synthesis of aircraft structures using integrated design and analysis methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    Systematic research to develop and validate methods for the structural sizing of an airframe designed with the use of composite materials and active controls is reported. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate the structural sizing and the associated active control system, which is optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.

  11. Uncertainty-Based Design Methods for Flow-Structure Interactions

    DTIC Science & Technology

    2007-06-01

    Final report, 2/01/05 - 01/31/07; contract N00014-04-1-0007. ...The objective of this project is to develop advanced tools for efficient simulations of flow-structure interactions that account for random excitation and uncertain input, with emphasis on realistic three-dimensional nonlinear representation of the structures of interest. This capability will set the foundation for the ...

  12. A design method for constellation of lifting reentry vehicles

    NASA Astrophysics Data System (ADS)

    Xiang, Yu; Kun, Liu

    2017-03-01

    As the reachable domain of a single lifting reentry vehicle is not large enough to cover the whole globe in a short time, which is disadvantageous to responsive operation, it is of great significance to study how to construct a constellation of several lifting reentry vehicles that can responsively reach any point of the globe. This paper addresses a design method for such a constellation. Firstly, an approach for calculating the reachable domain of a single lifting reentry vehicle is given, combining the Gauss Pseudospectral Method with an SQP method. Based on that, the entire reachable domain, taking the responsive-time limit into consideration, is reasonably simplified to reduce the complexity of the problem. Secondly, a Streets-of-Coverage (SOC) method is used to design the constellation, and the constellation parameters are optimized through simple analysis and comparison. Lastly, a point-coverage simulation method is utilized to verify the correctness of the optimization result. The verified result shows that 6 lifting reentry vehicles with a maximum lift-to-drag ratio of 1.7 can reach nearly any point on the earth's surface between latitudes -50° and 50° in less than 90 minutes.
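    A minimal version of the point-coverage check mentioned above can be sketched by idealizing each vehicle's responsive reachable domain as a spherical cap around a reference ground point. The cap half-angle and the equatorial vehicle placement below are assumptions chosen for illustration, not the paper's computed reachable domains.

```python
import math

def angdist(lat1, lon1, lat2, lon2):
    """Great-circle angular distance between two points, in degrees."""
    p1, l1, p2, l2 = map(math.radians, (lat1, lon1, lat2, lon2))
    c = (math.sin(p1) * math.sin(p2)
         + math.cos(p1) * math.cos(p2) * math.cos(l1 - l2))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def covered(grid, vehicles, cap_deg):
    """True if every grid point lies inside at least one vehicle's cap."""
    return all(any(angdist(la, lo, vla, vlo) <= cap_deg
                   for vla, vlo in vehicles)
               for la, lo in grid)

# 6 vehicles spaced 60 deg apart on the equator; reachable domain
# idealized as a 62-deg-half-angle cap (invented number).
vehicles = [(0.0, 60.0 * k) for k in range(6)]
grid = [(la, lo) for la in range(-50, 51, 5) for lo in range(-180, 180, 5)]
print(covered(grid, vehicles, 62.0))
```

    The worst-case grid point sits at latitude ±50° midway between adjacent vehicles (about 56° away), so a 62° cap covers the band while a noticeably smaller cap does not.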

  13. Design Methods for Load-bearing Elements from Crosslaminated Timber

    NASA Astrophysics Data System (ADS)

    Vilguts, A.; Serdjuks, D.; Goremikins, V.

    2015-11-01

    Cross-laminated timber is an environmentally friendly material which possesses a decreased level of anisotropy in comparison with solid and glued timber. Cross-laminated timber can be used for load-bearing walls and slabs of multi-storey timber buildings as well as for decking structures of pedestrian and road bridges. Design methods for cross-laminated timber elements subjected to bending, and to compression combined with bending, were considered. The presented methods were experimentally validated and verified by FEM. Two cross-laminated timber slabs were tested under static load. Pine was chosen as the board material. The design scheme of the considered plates was a simply supported beam with a span of 1.9 m under uniformly distributed load. The width of the plates was 1 m. The considered cross-laminated timber plates were also analysed by FEM. A comparison of the stresses acting in the edge fibres of the plate and of the maximum vertical displacements shows that both considered methods can be used for engineering calculations. The difference between the results obtained experimentally and analytically lies within the limits of 2 to 31%. The difference between the results obtained by the effective strength and stiffness method and the transformed sections method was not significant.
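    As an illustration of the transformed-sections idea mentioned above, a rigid-bond bending-stiffness calculation for a three-layer CLT strip can be sketched as follows. The layer thicknesses and moduli are invented round numbers and do not correspond to the tested plates; cross layers are simply assigned the perpendicular-to-grain modulus.

```python
# Hedged sketch: transformed-section bending stiffness of a 3-layer CLT
# strip, assuming a rigid bond between layers (all values invented).
E0, E90 = 11000.0, 370.0     # MPa, modulus parallel / perpendicular to grain
layers = [(E0, 30.0), (E90, 30.0), (E0, 30.0)]   # (modulus, thickness mm)
b = 1000.0                   # mm, strip width

# layer centroids measured from the bottom face
z, props = 0.0, []
for E, t in layers:
    props.append((E, t, z + t / 2.0))
    z += t

# modulus-weighted neutral axis of the composite section
zn = (sum(E * t * c for E, t, c in props) /
      sum(E * t for E, t, c in props))

# parallel-axis (Steiner) sum over the transformed layers
EI = sum(E * (b * t**3 / 12.0 + b * t * (c - zn) ** 2)
         for E, t, c in props)   # N*mm^2 (MPa x mm^4)
print(EI)
```

    For this symmetric layup the neutral axis falls at mid-depth, and the outer parallel layers dominate the stiffness through their Steiner terms.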

  14. Achieving integration in mixed methods designs-principles and practices.

    PubMed

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent to which the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods.

  15. Gradient-based optimum aerodynamic design using adjoint methods

    NASA Astrophysics Data System (ADS)

    Xie, Lei

    2002-09-01

    Continuous adjoint methods and optimal control theory are applied to a pressure-matching inverse design problem of quasi 1-D nozzle flows. Pontryagin's Minimum Principle is used to derive the adjoint system and the reduced gradient of the cost functional. The properties of adjoint variables at the sonic throat and the shock location are studied, revealing a logarithmic singularity at the sonic throat and continuity at the shock location. A numerical method, based on the Steger-Warming flux-vector-splitting scheme, is proposed to solve the adjoint equations. This scheme can finely resolve the singularity at the sonic throat. A non-uniform grid, with points clustered near the throat region, can resolve it even better. The analytical solutions to the adjoint equations are also constructed via a Green's function approach for the purpose of comparing the numerical results. The pressure-matching inverse design is then conducted for a nozzle parameterized by a single geometric parameter. In the second part, the adjoint methods are applied to the problem of minimizing drag coefficient, at fixed lift coefficient, for 2-D transonic airfoil flows. Reduced gradients of several functionals are derived through application of a Lagrange Multiplier Theorem. The adjoint system is carefully studied, including the adjoint characteristic boundary conditions at the far-field boundary. A super-reduced design formulation is also explored by treating the angle of attack as an additional state; super-reduced gradients can be constructed either by solving adjoint equations with non-local boundary conditions or by a direct Lagrange multiplier method. In this way, the constrained optimization reduces to an unconstrained design problem. Numerical methods based on Jameson's finite volume scheme are employed to solve the adjoint equations. The same grid system, generated from an efficient hyperbolic grid generator, is adopted in both the Euler flow solver and the adjoint solver. Several
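    The adjoint machinery described above can be illustrated on a much smaller problem: a discrete adjoint for a linear state equation, verified against a central finite difference. This is a generic sketch of the one-adjoint-solve-per-functional idea, not the quasi-1-D nozzle or airfoil solver from the dissertation; all matrices are random.

```python
import numpy as np

# State u solves A(x) u = b with A(x) = A0 + x*A1; the cost is
# J = 0.5*||u - u_target||^2. One adjoint solve gives dJ/dx no matter
# how many design variables A depends on.
rng = np.random.default_rng(0)
n = 5
A0 = np.eye(n) * 4.0 + rng.standard_normal((n, n)) * 0.1  # well conditioned
A1 = rng.standard_normal((n, n)) * 0.1
b = rng.standard_normal(n)
u_target = rng.standard_normal(n)

def cost(x):
    u = np.linalg.solve(A0 + x * A1, b)
    return 0.5 * np.sum((u - u_target) ** 2)

def adjoint_gradient(x):
    A = A0 + x * A1
    u = np.linalg.solve(A, b)
    lam = np.linalg.solve(A.T, u - u_target)   # adjoint solve
    return -lam @ (A1 @ u)                      # dJ/dx = -lam^T (dA/dx) u

x = 0.3
fd = (cost(x + 1e-6) - cost(x - 1e-6)) / 2e-6   # central-difference check
print(adjoint_gradient(x), fd)
```

    The identity used is du/dx = -A^-1 (dA/dx) u, so dJ/dx = (u - u_target)^T du/dx collapses to a single transposed solve, which is the same structural trick the continuous adjoint exploits.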

  16. Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods

    NASA Technical Reports Server (NTRS)

    Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark

    2002-01-01

    Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.

  17. Development of quality-by-design analytical methods.

    PubMed

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities.

  18. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Man Mohan, Rai

    2006-01-01

    A differential evolution (DE) method is first used to solve a relatively difficult problem in extended surface heat transfer, wherein optimal fin geometries are obtained for different safe operating base temperatures. The objective of maximizing the safe operating base temperature range is in direct conflict with the objective of maximizing fin heat transfer. This problem is a good example of achieving robustness in the context of changing operating conditions. The evolutionary method is then used to design a turbine airfoil, the two objectives being reduced sensitivity of the pressure distribution to small changes in the airfoil shape and maximization of the trailing edge wedge angle, with the consequent increase in airfoil thickness and strength. This is a relevant example of achieving robustness to manufacturing tolerances and wear and tear in the presence of other objectives.
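    The multi-objective selection step underlying such evolutionary methods can be illustrated with a minimal Pareto-dominance filter. The candidate designs and the two maximize-type objectives below are invented; a real DE run would apply such a filter to each generation's population.

```python
# Minimal Pareto-filter sketch: keep the nondominated set among candidate
# designs scored on two objectives, both to be maximized (data invented).
def dominates(a, b):
    """a dominates b: a is >= in every objective and > in at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

designs = [(1.0, 5.0), (2.0, 4.0), (3.0, 3.0), (2.5, 2.0), (1.5, 4.5)]
print(pareto_front(designs))
```

    Here (2.5, 2.0) is dropped because (3.0, 3.0) beats it on both objectives; the remaining four designs are mutually nondominated trade-offs.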

  19. Bayesian methods for the design and analysis of noninferiority trials.

    PubMed

    Gamalo-Siebers, Margaret; Gao, Aijun; Lakshminarayanan, Mani; Liu, Guanghan; Natanegara, Fanni; Railkar, Radha; Schmidli, Heinz; Song, Guochen

    2016-01-01

    The gold standard for evaluating treatment efficacy of a medical product is a placebo-controlled trial. However, when the use of placebo is considered to be unethical or impractical, a viable alternative for evaluating treatment efficacy is through a noninferiority (NI) study where a test treatment is compared to an active control treatment. The minimal objective of such a study is to determine whether the test treatment is superior to placebo. An assumption is made that if the active control treatment remains efficacious, as was observed when it was compared against placebo, then a test treatment that has comparable efficacy with the active control, within a certain range, must also be superior to placebo. Because of this assumption, the design, implementation, and analysis of NI trials present challenges for sponsors and regulators. In designing and analyzing NI trials, substantial historical data are often required on the active control treatment and placebo. Bayesian approaches provide a natural framework for synthesizing the historical data in the form of prior distributions that can effectively be used in the design and analysis of an NI clinical trial. Despite a flurry of recent research activities in the area of Bayesian approaches in medical product development, there are still substantial gaps in recognition and acceptance of Bayesian approaches in NI trial design and analysis. The Bayesian Scientific Working Group of the Drug Information Association provides a coordinated effort to target the education and implementation issues on Bayesian approaches for NI trials. In this article, we provide a review of both frequentist and Bayesian approaches in NI trials, and elaborate on the implementation of two common Bayesian methods: the hierarchical prior method and the meta-analytic-predictive approach. Simulations are conducted to investigate the properties of the Bayesian methods, and some real clinical trial examples are presented for illustration.
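    A toy version of the Bayesian NI analysis described above can be sketched with normal approximations, using a single informative normal prior on the control arm to stand in for a meta-analytic historical summary (a deliberate simplification of the full meta-analytic-predictive approach). Every number below, including rates, standard errors, and the margin, is invented for illustration.

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n_draws = 200_000
margin = 0.10                    # NI margin on the response-rate difference

prior_ctrl = (0.60, 0.05)        # historical control rate: (mean, sd)
test_hat, test_se = 0.58, 0.03   # current-trial estimates (invented)
ctrl_hat, ctrl_se = 0.61, 0.03

def posterior_normal(prior_mean, prior_sd, est, se):
    """Conjugate update of a normal prior by a normal likelihood."""
    w0, w1 = 1.0 / prior_sd**2, 1.0 / se**2
    var = 1.0 / (w0 + w1)
    return var * (w0 * prior_mean + w1 * est), math.sqrt(var)

ctrl_post = posterior_normal(*prior_ctrl, ctrl_hat, ctrl_se)
test_post = (test_hat, test_se)  # effectively flat prior on the test arm

# Monte Carlo posterior probability that the test arm is noninferior
test_draws = rng.normal(*test_post, n_draws)
ctrl_draws = rng.normal(*ctrl_post, n_draws)
prob_ni = float(np.mean(test_draws - ctrl_draws > -margin))
print(round(prob_ni, 3))
```

    Borrowing the historical prior tightens the control posterior, so the posterior probability of noninferiority here lands well above a typical 0.95 decision threshold despite the slightly lower observed test rate.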

  20. Analytical methods for gravity-assist tour design

    NASA Astrophysics Data System (ADS)

    Strange, Nathan J.

    This dissertation develops analytical methods for the design of gravity-assist spacecraft trajectories. Such trajectories are commonly employed by planetary science missions to reach Mercury or the Outer Planets. They may also be used at the Outer Planets for the design of science tours with multiple flybys of those planets' moons. Recent work has also shown applicability to new mission concepts such as NASA's Asteroid Redirect Mission. This work is based in the theory of patched conics. This document applies rigor to the concept of pumping (i.e. using gravity assists to change orbital energy) and cranking (i.e. using gravity assists to change inclination) to develop several analytic relations with pump and crank angles. In addition, transformations are developed between pump angle, crank angle, and v-infinity magnitude to classical orbit elements. These transformations are then used to describe the limits on orbits achievable via gravity assists of a planet or moon. This is then extended to develop analytic relations for all possible ballistic gravity-assist transfers and one type of propulsive transfer, v-infinity leveraging transfers. The results in this dissertation complement existing numerical methods for the design of these trajectories by providing methods that can guide numerical searches to find promising trajectories and even, in some cases, replace numerical searches altogether. In addition, results from new techniques presented in this dissertation such as Tisserand Graphs, the V-Infinity Globe, and Non-Tangent V-Infinity Leveraging provide additional insight into the structure of the gravity-assist trajectory design problem.
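    The pump-angle transformations described above can be illustrated with the standard patched-conic velocity-triangle relations for a coplanar flyby of a body on a circular orbit. The Jupiter-like constants and the flyby periapsis radius below are approximate assumptions for the sketch, not values taken from the dissertation.

```python
import math

MU_SUN = 1.32712e11      # km^3/s^2, solar gravitational parameter
R_BODY = 7.7834e8        # km, flyby body's circular orbit radius (~Jupiter)
MU_BODY = 1.26687e8      # km^3/s^2, body's gravitational parameter

V_body = math.sqrt(MU_SUN / R_BODY)   # body's circular orbital speed

def sma_from_pump(v_inf, pump_deg):
    """Heliocentric semi-major axis from v-infinity and pump angle.

    Pump angle is measured between the outgoing v-infinity vector and the
    body's velocity: v^2 = V^2 + v_inf^2 + 2*V*v_inf*cos(pump)."""
    p = math.radians(pump_deg)
    v2 = V_body**2 + v_inf**2 + 2.0 * V_body * v_inf * math.cos(p)
    return 1.0 / (2.0 / R_BODY - v2 / MU_SUN)   # vis-viva, inverted

def max_turn_deg(v_inf, rp):
    """Maximum bending of v-infinity for a flyby at periapsis radius rp."""
    return math.degrees(2.0 * math.asin(
        1.0 / (1.0 + rp * v_inf**2 / MU_BODY)))

print(sma_from_pump(6.0, 90.0))   # km, post-flyby semi-major axis
print(max_turn_deg(6.0, 1.0e5))   # deg, achievable per-flyby turn
```

    Sweeping the pump angle at fixed v-infinity traces the family of orbits reachable by successive flybys, while the turn-angle limit bounds how far along that family a single flyby can move the spacecraft.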