Science.gov

Sample records for design epidemiological methods

  1. The ZInEP Epidemiology Survey: background, design and methods.

    PubMed

    Ajdacic-Gross, Vladeta; Müller, Mario; Rodgers, Stephanie; Warnke, Inge; Hengartner, Michael P; Landolt, Karin; Hagenmuller, Florence; Meier, Magali; Tse, Lee-Ting; Aleksandrowicz, Aleksandra; Passardi, Marco; Knöpfli, Daniel; Schönfelder, Herdis; Eisele, Jochen; Rüsch, Nicolas; Haker, Helene; Kawohl, Wolfram; Rössler, Wulf

    2014-12-01

    This article introduces the design, sampling, field procedures and instruments used in the ZInEP Epidemiology Survey. This survey is one of six ZInEP projects (Zürcher Impulsprogramm zur nachhaltigen Entwicklung der Psychiatrie, i.e. the "Zurich Program for Sustainable Development of Mental Health Services"). It parallels the longitudinal Zurich Study with a sample comparable in age and gender, and with similar methodology, including identical instruments. Thus, it aims to assess changes in the prevalence rates of common mental disorders and in the use of professional help and psychiatric services. Moreover, the current survey widens the spectrum of topics by including sociopsychiatric questionnaires on stigma, stress-related biological measures such as load and cortisol levels, electroencephalographic (EEG) and near-infrared spectroscopy (NIRS) examinations with various paradigms, and sociophysiological tests. The structure of the ZInEP Epidemiology Survey entails four subprojects: a short telephone screening using the SCL-27 (n of nearly 10,000), a comprehensive face-to-face interview based on the SPIKE (Structured Psychopathological Interview and Rating of the Social Consequences for Epidemiology: the main instrument of the Zurich Study) with a stratified sample (n = 1500), tests in the Center for Neurophysiology and Sociophysiology (n = 227), and a prospective study with up to three follow-up interviews and further measures (n = 157). In sum, the four subprojects of the ZInEP Epidemiology Survey deliver a large interdisciplinary database.

  2. Overview of the epidemiology methods and applications: strengths and limitations of observational study designs.

    PubMed

    Colditz, Graham A

    2010-01-01

    The impact of study design on the results of medical research has long been an area of both substantial debate and a smaller body of empirical research. Examples come from many disciplines within clinical and public health research. Among the early major contributions in the 1970s was work by Mosteller and colleagues (Gilbert et al., 1977), who noted that innovations in surgery and anesthesia showed greater gains over standard therapy when evaluated in nonrandomized, controlled trials than the gains reported in randomized, controlled trials. More recently, we and others have evaluated the impact of design in medical and surgical research, and concluded that the mean gain comparing new therapies to established therapies was biased by study design in nonrandomized trials (Colditz et al., 1989; Miller et al., 1989). Benson and Hartz (2000) conducted a study in which they focused only on studies reported after 1985. On the basis of 136 reports of 19 diverse treatments, Benson and Hartz concluded that in only 2 of the 19 analyses did the combined data from the observational studies lie outside the 95% confidence interval for the combined data from the randomized trials. A similar study drew only on data reported from 1991 to 1995, which showed remarkably similar results among observational studies and randomized, controlled trials (Concato et al., 2000). These more recent data suggest that advances in study design and analytic methods may reduce bias in some evaluations of medical and public health interventions. Such methods apply not only to the original studies, but also to the approaches taken to quantitatively combine results by using meta-analytic approaches such as random-effects meta-regression, Bayesian meta-analysis, and the like (Normand, 1999). By focusing attention on thorough data analysis, design issues can be understood and their impact or bias can be estimated, on average, and then ideally accounted for in the interpretation of results.
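The random-effects pooling mentioned at the end of this abstract can be sketched in a few lines. This is a minimal illustration of the standard DerSimonian-Laird moment estimator; the five log risk ratios and within-study variances below are hypothetical, not data from any study cited above.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling of study-level effects (e.g. log risk ratios)
    using the DerSimonian-Laird moment estimator of between-study variance."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                   # fixed-effect weights
    theta_f = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - theta_f) ** 2)            # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_r = 1.0 / (v + tau2)                        # random-effects weights
    theta = np.sum(w_r * y) / np.sum(w_r)
    se = np.sqrt(1.0 / np.sum(w_r))
    return theta, se, tau2

# Hypothetical log risk ratios and variances from five observational studies
theta, se, tau2 = dersimonian_laird(
    effects=[0.05, 0.45, -0.20, 0.60, 0.10],
    variances=[0.02, 0.03, 0.04, 0.05, 0.02])
```

When the studies are homogeneous (Q below its degrees of freedom), tau2 collapses to zero and the pooled estimate reduces to the fixed-effect answer; here the spread of effects yields a positive tau2 and a wider pooled confidence interval.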

  3. Design and implementation of epidemiological field investigation method based on mobile collaboration

    NASA Astrophysics Data System (ADS)

    Zhang, Lihui; Wang, Dongchuan; Huang, Mingxiang; Gong, Jianhua; Fang, Liqun; Cao, Wuchun

    2008-10-01

    With the development of mobile technologies and their integration with spatial information technologies, it has become possible to develop new techno-support solutions for epidemiological field investigation, especially for the response to emergent public health events. Based on mobile technologies and a virtual geographic environment, the authors have designed a model for collaborative work in four communication patterns, namely S2S (Static to Static), M2S (Mobile to Static), S2M (Static to Mobile), and M2M (Mobile to Mobile). Building on this model, this paper explores mobile online mapping in the context of mobile collaboration, presents an experimental case study of HFRS (Hemorrhagic Fever with Renal Syndrome) fieldwork, and describes a prototype emergency-response information system developed to test the effectiveness and usefulness of field surveys based on mobile collaboration.

  4. Epidemiologic study of residential proximity to transmission lines and childhood cancer in California: description of design, epidemiologic methods and study population.

    PubMed

    Kheifets, Leeka; Crespi, Catherine M; Hooper, Chris; Oksuzyan, Sona; Cockburn, Myles; Ly, Thomas; Mezei, Gabor

    2015-01-01

    We conducted a large epidemiologic case-control study in California to examine the association between childhood cancer risk and distance from the home address at birth to the nearest high-voltage overhead transmission line as a replication of the study of Draper et al. in the United Kingdom. We present a detailed description of the study design, methods of case ascertainment, control selection, exposure assessment and data analysis plan. A total of 5788 childhood leukemia cases and 3308 childhood central nervous system cancer cases (included for comparison) and matched controls were available for analysis. Birth and diagnosis addresses of cases and birth addresses of controls were geocoded. Distance from the home to nearby overhead transmission lines was ascertained on the basis of the electric power companies' geographic information system (GIS) databases, additional Google Earth aerial evaluation and site visits to selected residences. We evaluated distances to power lines up to 2000 m and included consideration of lower voltages (60-69 kV). Distance measures based on GIS and Google Earth evaluation showed close agreement (Pearson correlation >0.99). Our three-tiered approach to exposure assessment allowed us to achieve high specificity, which is crucial for studies of rare diseases with low exposure prevalence.
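The distance-based exposure assessment described above reduces, at its core, to a point-to-line computation. The sketch below uses a simple equirectangular planar approximation, which is adequate at the ≤2000 m scale the study considers; the home and line coordinates are hypothetical, and only the 2000 m cut-off is taken from the study description.

```python
import math

def point_segment_distance_m(p, a, b):
    """Distance in metres from point p to segment a-b, with (lat, lon)
    pairs locally projected to a plane (equirectangular approximation,
    adequate at neighbourhood scale)."""
    def to_xy(lat, lon, lat0):
        x = math.radians(lon) * 6371000.0 * math.cos(math.radians(lat0))
        y = math.radians(lat) * 6371000.0
        return x, y
    lat0 = p[0]                       # reference latitude for the projection
    px, py = to_xy(p[0], p[1], lat0)
    ax, ay = to_xy(a[0], a[1], lat0)
    bx, by = to_xy(b[0], b[1], lat0)
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        t = 0.0                       # degenerate segment: distance to a
    else:
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy)
                               / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy  # closest point on the segment
    return math.hypot(px - cx, py - cy)

# Hypothetical birth address and transmission-line segment
home = (34.0520, -118.2500)
line = [(34.0480, -118.2600), (34.0520, -118.2400)]
d = point_segment_distance_m(home, line[0], line[1])
within_study_range = d <= 2000.0      # distance cut-off used in the study
```

A production GIS would iterate this over every segment of every nearby line and keep the minimum, and would use a proper projected coordinate system; the planar shortcut here only illustrates the geometry.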

  5. The INTERPHONE study: design, epidemiological methods, and description of the study population.

    PubMed

    Cardis, Elisabeth; Richardson, Lesley; Deltour, Isabelle; Armstrong, Bruce; Feychting, Maria; Johansen, Christoffer; Kilkenny, Monique; McKinney, Patricia; Modan, Baruch; Sadetzki, Siegal; Schüz, Joachim; Swerdlow, Anthony; Vrijheid, Martine; Auvinen, Anssi; Berg, Gabriele; Blettner, Maria; Bowman, Joseph; Brown, Julianne; Chetrit, Angela; Christensen, Helle Collatz; Cook, Angus; Hepworth, Sarah; Giles, Graham; Hours, Martine; Iavarone, Ivano; Jarus-Hakak, Avital; Klaeboe, Lars; Krewski, Daniel; Lagorio, Susanna; Lönn, Stefan; Mann, Simon; McBride, Mary; Muir, Kenneth; Nadon, Louise; Parent, Marie-Elise; Pearce, Neil; Salminen, Tiina; Schoemaker, Minouk; Schlehofer, Brigitte; Siemiatycki, Jack; Taki, Masao; Takebayashi, Toru; Tynes, Tore; van Tongeren, Martie; Vecchia, Paolo; Wiart, Joe; Woodward, Alistair; Yamaguchi, Naohito

    2007-01-01

    The very rapid worldwide increase in mobile phone use in the last decade has generated considerable interest in the possible health effects of exposure to radio frequency (RF) fields. A multinational case-control study, INTERPHONE, was set up to investigate whether mobile phone use increases the risk of cancer and, more specifically, whether the RF fields emitted by mobile phones are carcinogenic. The study focused on tumours arising in the tissues most exposed to RF fields from mobile phones: glioma, meningioma, acoustic neurinoma and parotid gland tumours. In addition to a detailed history of mobile phone use, information was collected on a number of known and potential risk factors for these tumours. The study was conducted in 13 countries (Australia, Canada, Denmark, Finland, France, Germany, Israel, Italy, Japan, New Zealand, Norway, Sweden, and the UK) using a common core protocol. This paper describes the study design and methods and the main characteristics of the study population. INTERPHONE is the largest case-control study to date investigating risks related to mobile phone use and to other potential risk factors for the tumours of interest and includes 2,765 glioma, 2,425 meningioma, 1,121 acoustic neurinoma, 109 malignant parotid gland tumour cases and 7,658 controls. Particular attention was paid to estimating the amount and direction of potential recall and participation biases and their impact on the study results.

  6. Melanocortin-1 receptor, skin cancer and phenotypic characteristics (M-SKIP) project: study design and methods for pooling results of genetic epidemiological studies

    PubMed Central

    2012-01-01

    Background For complex diseases like cancer, pooled-analysis of individual data represents a powerful tool to investigate the joint contribution of genetic, phenotypic and environmental factors to the development of a disease. Pooled-analysis of epidemiological studies has many advantages over meta-analysis, and preliminary results may be obtained faster and with lower costs than with prospective consortia. Design and methods Based on our experience with the study design of the Melanocortin-1 receptor (MC1R) gene, SKin cancer and Phenotypic characteristics (M-SKIP) project, we describe the most important steps in planning and conducting a pooled-analysis of genetic epidemiological studies. We then present the statistical analysis plan that we will apply, giving particular attention to methods of analysis recently proposed to account for between-study heterogeneity and to explore the joint contribution of genetic, phenotypic and environmental factors in the development of a disease. Within the M-SKIP project, data on 10,959 skin cancer cases and 14,785 controls from 31 international investigators were checked for quality and recoded for standardization. We first proposed to fit the aggregated data with random-effects logistic regression models. However, for the M-SKIP project, a two-stage analysis will be preferred to overcome the problem of differing covariate availability across studies. The joint contribution of MC1R variants and phenotypic characteristics to skin cancer development will be studied via logic regression modeling. Discussion Methodological guidelines to correctly design and conduct pooled-analyses are needed to facilitate the application of such methods, thus providing a better summary of the actual findings in specific fields. PMID:22862891

  7. An introduction to epidemiologic and statistical methods useful in environmental epidemiology.

    PubMed

    Nitta, Hiroshi; Yamazaki, Shin; Omori, Takashi; Sato, Tosiya

    2010-01-01

    Many developments in the design and analysis of environmental epidemiology have been made in air pollution studies. In the analysis of the short-term effects of particulate matter on daily mortality, Poisson regression models with flexible smoothing methods have been developed for the analysis of time-series data. Another option for such studies is the use of case-crossover designs, and there have been extensive discussions on the selection of control periods. In the Study on Respiratory Disease and Automobile Exhaust project conducted by the Japanese Ministry of the Environment, we adopted a new 2-stage case-control design that is efficient when both exposure and disease are rare. Based on our experience in conducting air pollution epidemiologic studies, we review 2-stage case-control designs, case-crossover designs, generalized linear models, generalized additive models, and generalized estimating equations, all of which are useful approaches in environmental epidemiology.
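A Poisson regression of daily death counts on pollution, as used in the time-series studies described above, can be sketched with a hand-rolled iteratively reweighted least squares (IRLS) fit. The data are synthetic, and the flexible smoothing the abstract mentions is reduced here to a single sinusoidal seasonal term for brevity; only NumPy is assumed.

```python
import numpy as np

def fit_poisson_glm(X, y, n_iter=50):
    """Poisson regression (log link) fitted by iteratively reweighted
    least squares, the standard fitting algorithm for GLMs."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())            # start near the marginal mean
    for _ in range(n_iter):
        eta = np.clip(X @ beta, -30.0, 30.0)
        mu = np.exp(eta)
        z = eta + (y - mu) / mu           # working response
        w = mu                            # Poisson: Var(y) = mu
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))
    return beta

# Synthetic daily time series: deaths vs. PM10 with a crude seasonal term
rng = np.random.default_rng(42)
days = np.arange(365)
pm10 = rng.gamma(4.0, 10.0, size=365)            # hypothetical PM10, ug/m3
season = np.sin(2 * np.pi * days / 365)
X = np.column_stack([np.ones(365), pm10 / 10.0, season])
deaths = rng.poisson(np.exp(X @ np.array([3.0, 0.05, 0.10])))

beta = fit_poisson_glm(X, deaths)
rr_per_10 = float(np.exp(beta[1]))               # rate ratio per 10 ug/m3
```

In practice such analyses use penalized splines for season and weather rather than a single sinusoid, and report the rate ratio per 10 µg/m³ increment exactly as `rr_per_10` does here.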

  8. Epidemiologic methods in analysis of scientific issues

    NASA Astrophysics Data System (ADS)

    Erdreich, Linda S.

    2003-10-01

    Studies of human populations provide much of the information that is used to evaluate compensation cases for hearing loss, including rates of hearing loss by age, and dose-response relationships. The reference data used to make decisions regarding workers' compensation are based on epidemiologic studies of cohorts of workers exposed to various noise levels. Epidemiology and its methods can be used in other ways in the courtroom: to assess the merits of a complaint, to support Daubert criteria, and to explain scientific issues to the trier of fact, generally a layperson. Using examples other than occupational noise-induced hearing loss, these methods will be applied to respond to a complaint that hearing loss followed exposure to a sudden noise, a medication, or an occupational chemical, and thus was caused by said exposure. The standard criteria for assessing the weight of the evidence, and epidemiologic criteria for causality, show the limits of such anecdotal data and incorporate quantitative and temporal issues. Reports of clusters of cases are also intuitively convincing to juries. Epidemiologic methods provide a scientific approach to assess whether rates of the outcome are indeed increased, and the extent to which increased rates provide evidence for causality.

  9. Using Epidemiologic Methods to Test Hypotheses regarding Causal Influences on Child and Adolescent Mental Disorders

    ERIC Educational Resources Information Center

    Lahey, Benjamin B.; D'Onofrio, Brian M.; Waldman, Irwin D.

    2009-01-01

    Epidemiology uses strong sampling methods and study designs to test refutable hypotheses regarding the causes of important health, mental health, and social outcomes. Epidemiologic methods are increasingly being used to move developmental psychopathology from studies that catalogue correlates of child and adolescent mental health to designs that…

  10. Kinetics methods for clinical epidemiology problems.

    PubMed

    Corlan, Alexandru Dan; Ross, John

    2015-11-17

    Calculating the probability of each possible outcome for a patient at any time in the future is currently possible only in the simplest cases: short-term prediction in acute diseases of otherwise healthy persons. This problem is to some extent analogous to predicting the concentrations of species in a reactor from known initial concentrations and reaction rates examined at the individual-molecule level. The existing theoretical framework behind predicting contagion and the immediate outcome of acute diseases in previously healthy individuals is largely analogous to deterministic kinetics of chemical systems consisting of one or a few reactions. We show that current statistical models commonly used in chronic disease epidemiology correspond to simple stochastic treatment of single-reaction systems. The general problem corresponds to stochastic kinetics of complex reaction systems. We attempt to formulate epidemiologic problems related to chronic diseases in chemical kinetics terms. We review methods that may be adapted for use in epidemiology. We show that some reactions cannot fit into the mass-action law paradigm and that solutions to these systems would frequently exhibit an antiportfolio effect. We provide a complete example application of stochastic kinetics modeling for a deductive meta-analysis of two papers on atrial fibrillation incidence, prevalence, and mortality.
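The stochastic-kinetics analogy can be made concrete with a small Gillespie-style simulation in which disease states play the role of chemical species and transitions play the role of mass-action reactions. The three "reactions" and rate constants below are illustrative assumptions, not parameters from the paper.

```python
import random

def gillespie(state, reactions, t_max, rng):
    """Gillespie stochastic simulation of a reaction system.
    state: dict of species counts; reactions: list of (rate_fn, delta)
    where delta maps species names to count changes when the event fires."""
    t = 0.0
    while t < t_max:
        rates = [fn(state) for fn, _ in reactions]
        total = sum(rates)
        if total == 0.0:
            break                            # nothing left to happen
        t += rng.expovariate(total)          # exponential waiting time
        if t >= t_max:
            break
        r = rng.random() * total             # pick which event fires
        acc = 0.0
        for rate, (_, delta) in zip(rates, reactions):
            acc += rate
            if r < acc:
                for species, dv in delta.items():
                    state[species] += dv
                break
    return state

rng = random.Random(1)
state = {"healthy": 1000, "ill": 0, "dead": 0}
reactions = [
    (lambda s: 0.02 * s["healthy"], {"healthy": -1, "ill": +1}),   # incidence
    (lambda s: 0.05 * s["ill"],     {"ill": -1, "dead": +1}),      # case fatality
    (lambda s: 0.01 * s["healthy"], {"healthy": -1, "dead": +1}),  # other mortality
]
final = gillespie(state, reactions, t_max=10.0, rng=rng)
```

Averaging many such trajectories recovers the deterministic compartmental solution for large populations, while the run-to-run variability is exactly the stochastic behaviour the abstract argues chronic-disease models need to capture.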

  11. Methods of Measurement in epidemiology: Sedentary Behaviour

    PubMed Central

    Atkin, Andrew J; Gorely, Trish; Clemes, Stacy A; Yates, Thomas; Edwardson, Charlotte; Brage, Soren; Salmon, Jo; Marshall, Simon J; Biddle, Stuart JH

    2012-01-01

    Background Research examining sedentary behaviour as a potentially independent risk factor for chronic disease morbidity and mortality has expanded rapidly in recent years. Methods We present a narrative overview of the sedentary behaviour measurement literature. Subjective and objective methods of measuring sedentary behaviour suitable for use in population-based research with children and adults are examined. The validity and reliability of each method is considered, gaps in the literature specific to each method identified and potential future directions discussed. Results To date, subjective approaches to sedentary behaviour measurement, e.g. questionnaires, have focused predominantly on TV viewing or other screen-based behaviours. Typically, such measures demonstrate moderate reliability but slight to moderate validity. Accelerometry is increasingly being used for sedentary behaviour assessments; this approach overcomes some of the limitations of subjective methods, but detection of specific postures and postural changes by this method is somewhat limited. Instruments developed specifically for the assessment of body posture have demonstrated good reliability and validity in the limited research conducted to date. Miniaturization of monitoring devices, interoperability between measurement and communication technologies and advanced analytical approaches are potential avenues for future developments in this field. Conclusions High-quality measurement is essential in all elements of sedentary behaviour epidemiology, from determining associations with health outcomes to the development and evaluation of behaviour change interventions. Sedentary behaviour measurement remains relatively under-developed, although new instruments, both objective and subjective, show considerable promise and warrant further testing. PMID:23045206

  12. Epidemiological study air disaster in Amsterdam (ESADA): study design

    PubMed Central

    Slottje, Pauline; Huizink, Anja C; Twisk, Jos WR; Witteveen, Anke B; van der Ploeg, Henk M; Bramsen, Inge; Smidt, Nynke; Bijlsma, Joost A; Bouter, Lex M; van Mechelen, Willem; Smid, Tjabe

    2005-01-01

    Background In 1992, a cargo aircraft crashed into apartment buildings in Amsterdam, killing 43 victims and destroying 266 apartments. In the aftermath there were speculations about the cause of the crash, potential exposures to hazardous materials due to the disaster and the health consequences. Starting in 2000, the Epidemiological Study Air Disaster in Amsterdam (ESADA) aimed to assess the long-term health effects of occupational exposure to this disaster on professional assistance workers. Methods/Design Epidemiological study among all the exposed professional fire-fighters and police officers who performed disaster-related task(s), and hangar workers who sorted the wreckage of the aircraft, as well as reference groups of their non-exposed colleagues who did not perform any disaster-related tasks. The study took place, on average, 8.5 years after the disaster. Questionnaires were used to assess details on occupational exposure to the disaster. Health measures comprised laboratory assessments in urine, blood and saliva, as well as self-reported current health measures, including health-related quality of life, and various physical and psychological symptoms. Discussion In this paper we describe and discuss the design of the ESADA. The ESADA will provide additional scientific knowledge on the long-term health effects of technological disasters on professional workers. PMID:15921536

  13. Genetic Epidemiology of COPD (COPDGene) Study Design

    PubMed Central

    Regan, Elizabeth A.; Hokanson, John E.; Murphy, James R.; Make, Barry; Lynch, David A.; Beaty, Terri H.; Curran-Everett, Douglas; Silverman, Edwin K.; Crapo, James D.

    2010-01-01

    Background COPDGene is a multicenter observational study designed to identify genetic factors associated with COPD. It will also characterize chest CT phenotypes in COPD subjects, including assessment of emphysema, gas trapping, and airway wall thickening. Finally, subtypes of COPD based on these phenotypes will be used in a comprehensive genome-wide study to identify COPD susceptibility genes. Methods/Results COPDGene will enroll 10,000 smokers with and without COPD across the GOLD stages. Both non-Hispanic white and African-American subjects are included in the cohort. Inspiratory and expiratory chest CT scans will be obtained on all participants. In addition to the cross-sectional enrollment process, these subjects will be followed regularly for longitudinal studies. A genome-wide association study (GWAS) will be done on an initial group of 4000 subjects to identify genetic variants associated with case-control status and several quantitative phenotypes related to COPD. The initial findings will be verified in an additional 2000 COPD cases and 2000 smoking control subjects, and further validation association studies will be carried out. Conclusions COPDGene will provide important new information about genetic factors in COPD, and will characterize the disease process using high resolution CT scans. Understanding genetic factors and CT phenotypes that define COPD will potentially permit earlier diagnosis of this disease and may lead to the development of treatments to modify progression. PMID:20214461

  14. A design framework for exploratory geovisualization in epidemiology

    PubMed Central

    Robinson, Anthony C.

    2009-01-01

    This paper presents a design framework for geographic visualization based on iterative evaluations of a toolkit designed to support cancer epidemiology. The Exploratory Spatio-Temporal Analysis Toolkit (ESTAT) is intended to support visual exploration of multivariate health data. Its purpose is to provide epidemiologists with the ability to generate new hypotheses or further refine those they may already have. Through an iterative user-centered design process, ESTAT has been evaluated by epidemiologists at the National Cancer Institute (NCI). Results of these evaluations are discussed, and a design framework based on evaluation evidence is presented. The framework provides specific recommendations and considerations for the design and development of a geovisualization toolkit for epidemiology. Its basic structure provides a model for future design and evaluation efforts in information visualization. PMID:20390052

  15. How to design a (good) epidemiological observational study: epidemiological research protocol at a glance.

    PubMed

    Fronteira, Ines

    2013-01-01

    In this article, we propose a general structure for the research protocol of an observational epidemiological study. We start by highlighting the importance of the research protocol, namely in accounting for certain biases and in guaranteeing methodological rigor and study reproducibility. Next, we reflect on the essential elements of a research protocol, whatever its objective. We then present specific issues to be included according to the type of study: cross-sectional, case-control and cohort.

  16. Epidemiologic methods for investigating male fecundity

    PubMed Central

    Olsen, Jørn; Ramlau-Hansen, Cecilia Høst

    2014-01-01

    Fertility is a couple concept that has been measured since the beginning of demography, and male fecundity (his biological capacity to reproduce) is a component of the fertility rate. Unfortunately, we have no way of measuring the male component directly, although several indirect markers can be used. Population registers can be used to monitor the proportion of childless couples, couples who receive donor semen, trends in dizygotic twinning, and infertility diagnoses. Studies using time-to-pregnancy (TTP) may identify couple subfecundity, and TTP data will correlate with sperm quality and quantity as well as sexual activity and a number of other conditions. Having exposure data available for couples with a fecund female partner would make TTP studies of interest in identifying exposures that may affect male fecundity. Biological indicators such as sperm quality and quantity isolate the male component of fertility, and semen data therefore remain an important source of information for research. Unfortunately, often over half of those invited to provide a sperm sample will refuse, and the study is then subject to a selection that may introduce bias. Because the most important time windows for exposures that impair semen production could be early fetal life, puberty, and the time of ejaculation, longitudinal data spanning decades are required. The ongoing monitoring of semen quality and quantity should continue, and surveys monitoring fertility and waiting time to pregnancy should also be designed. PMID:24369129
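The TTP approach can be sketched under the simplest textbook assumption: a geometric model with constant per-cycle fecundability p, whose maximum-likelihood estimate is the number of couples divided by the total cycles observed. The TTP values and the two exposure groups below are hypothetical.

```python
def fecundability_mle(cycles_to_pregnancy):
    """MLE of per-cycle fecundability p under a geometric TTP model
    (each menstrual cycle carries the same probability p of conception)."""
    return len(cycles_to_pregnancy) / sum(cycles_to_pregnancy)

# Hypothetical cycles-to-pregnancy for two exposure groups of 10 couples
unexposed = [1, 2, 1, 4, 3, 1, 2, 6, 1, 2]
exposed = [2, 5, 3, 8, 4, 2, 7, 3, 6, 4]

p_u = fecundability_mle(unexposed)
p_e = fecundability_mle(exposed)
fecundability_ratio = p_e / p_u   # < 1 suggests reduced fecundability
```

Real TTP analyses use discrete-time survival models that handle censoring (couples still trying at the end of follow-up) and cycle-varying covariates; the geometric MLE above only illustrates the core quantity being estimated.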

  17. [Malignant tumours of the eye: Epidemiology, diagnostic methods and radiotherapy].

    PubMed

    Jardel, P; Caujolle, J-P; Gastaud, L; Maschi, C; Sauerwein, W; Thariat, J

    2015-12-01

    Malignant tumours of the eye are not common, barely representing 1% of all cancers. This article aims to summarise, for each of the main eye malignant diseases, aspects of epidemiology, diagnostic methods and treatments, with a focus on radiation therapy techniques. The studied tumours are: eye metastases, intraocular and ocular adnexal lymphomas, uveal melanomas, malignant tumours of the conjunctiva and of the lids, and retinoblastomas. The last chapter outlines ocular complications of radiation therapy and their management.

  18. [Curricular design of health postgraduate programs: the case of Masters in epidemiology].

    PubMed

    Bobadilla, J L; Lozano, R; Bobadilla, C

    1991-01-01

    This paper discusses the need to create specific programs for the training of researchers in epidemiology, a field that has traditionally been ignored by graduate programs in public health. This is due, in part, to the emphasis that has been placed on the training of professionals in other areas of public health. The paper also includes the results of a consensus exercise developed during the curricular design of the Masters Program in Epidemiology of the School of Medicine of the National Autonomous University of Mexico. The technique used during the consensus exercise was the TKJ, which allows the presentation of ideas and possible solutions for a specific problem. This is probably the first published experience in the use of such a technique for the design of an academic curriculum. Taking the general characteristics of the students as a base, the substantive, disciplinary and methodological subjects were chosen. The results showed a need for a multidisciplinary approach based on modern methodologies of statistics and epidemiology. The usefulness of the results of the curricular design and the superiority of this method for reaching consensus are also discussed.

  19. Anonymous statistical methods versus cryptographic methods in epidemiology.

    PubMed

    Quantin; Allaert; Dusserre

    2000-11-01

    Sensitive data are most often indirectly identifiable and so need to be rendered anonymous in order to ensure privacy. Statistical methods to provide anonymity require data perturbation and so generate data processing difficulties. Encryption methods, while preserving confidentiality, do not require data modification.

  20. Parameter Plane Design Method

    DTIC Science & Technology

    1989-03-01

    In this thesis a control systems analysis package is developed using parameter plane methods. It is an interactive package with which the designer is able to choose values of the parameters that provide a good compromise between cost and dynamic behavior.

  1. Epidemiology and statistical methods in prediction of patient outcome.

    PubMed

    Bostwick, David G; Adolfsson, Jan; Burke, Harry B; Damber, Jan-Erik; Huland, Hartwig; Pavone-Macaluso, Michele; Waters, David J

    2005-05-01

    Substantial gaps exist in the data on risk assessment and prognosis, and these gaps limit our understanding of the complex mechanisms that contribute to prostate cancer, the greatest cancer epidemic of our time. This report was prepared by an international multidisciplinary committee of the World Health Organization to address contemporary issues of epidemiology and statistical methods in prostate cancer, including a summary of current risk assessment methods and prognostic factors. Emphasis was placed on the relative merits of each of the statistical methods available. We concluded that: 1. An international committee should be created to guide the assessment and validation of molecular biomarkers. The goal is to achieve more precise identification of those who would benefit from treatment. 2. Prostate cancer is a predictable disease despite its biologic heterogeneity, but the accuracy of prediction must be improved. We expect that more precise statistical methods will supplant the current staging system. The simplicity and intuitive ease of using the current staging system must be balanced against the serious compromise in accuracy for the individual patient. 3. The most useful new statistical approaches will integrate molecular biomarkers with existing prognostic factors to predict conditional life expectancy (i.e. the expected remaining years of a patient's life) and take into account all-cause mortality.

  2. Control system design method

    DOEpatents

    Wilson, David G [Tijeras, NM; Robinett, III, Rush D.

    2012-02-21

    A control system design method and concomitant control system comprising representing a physical apparatus to be controlled as a Hamiltonian system, determining elements of the Hamiltonian system representation which are power generators, power dissipators, and power storage devices, analyzing stability and performance of the Hamiltonian system based on the results of the determining step and determining necessary and sufficient conditions for stability of the Hamiltonian system, creating a stable control system based on the results of the analyzing step, and employing the resulting control system to control the physical apparatus.

  3. Has epidemiology become infatuated with methods? A historical perspective on the place of methods during the classical (1945-1965) phase of epidemiology.

    PubMed

    Morabia, Alfredo

    2015-03-18

    Before World War II, epidemiology was a small discipline, practiced by a handful of people working mostly in the United Kingdom and the United States. Today it is practiced by tens of thousands of people on all continents. Between 1945 and 1965, during what is known as its "classical" phase, epidemiology became recognized as a major academic discipline in medicine and public health. On the basis of a review of the historical evidence, this article examines the extent to which classical epidemiology was a golden age of an action-driven, problem-solving science, in which epidemiologists were less concerned with the sophistication of their methods than with the societal consequences of their work. It also discusses whether the paucity of methods stymied or boosted classical epidemiology's ability to convince political and financial agencies of the need to intervene to improve the health of the people.

  4. New Saliva DNA Collection Method Compared to Buccal Cell Collection Techniques for Epidemiological Studies

    PubMed Central

    ROGERS, NIKKI L.; COLE, SHELLEY A.; LAN, HAO-CHANG; CROSSA, ALDO; DEMERATH, ELLEN W.

    2009-01-01

    Epidemiological studies may require noninvasive methods for off-site DNA collection. We compared the DNA yield and quality obtained using a whole-saliva collection device (Oragene™ DNA collection kit) to those from three established noninvasive methods (cytobrush, foam swab, and oral rinse). Each method was tested on 17 adult volunteers from our center, using a random crossover collection design and analyzed using repeated-measures statistics. DNA yield and quality were assessed via gel electrophoresis, spectrophotometry, and polymerase chain reaction (PCR) amplification rate. The whole-saliva method provided a significantly greater DNA yield (mean ± SD = 154.9 ± 103.05 μg, median = 181.88) than the other methods (oral rinse = 54.74 ± 41.72 μg, 36.56; swab = 11.44 ± 7.39 μg, 10.72; cytobrush = 12.66 ± 6.19 μg, 13.22) (all pairwise P < 0.05). Oral-rinse and whole-saliva samples provided the best DNA quality, whereas cytobrush and swab samples provided poorer quality DNA, as shown by lower OD260/OD280 and OD260/OD230 ratios. We conclude that both a 10-ml oral-rinse sample and 2-ml whole-saliva sample provide sufficient DNA quantity and better quality DNA for genetic epidemiological studies than do the commonly used buccal swab and brush techniques. PMID:17421001

  5. New saliva DNA collection method compared to buccal cell collection techniques for epidemiological studies.

    PubMed

    Rogers, Nikki L; Cole, Shelley A; Lan, Hao-Chang; Crossa, Aldo; Demerath, Ellen W

    2007-01-01

    Epidemiological studies may require noninvasive methods for off-site DNA collection. We compared the DNA yield and quality obtained using a whole-saliva collection device (Oragene DNA collection kit) to those from three established noninvasive methods (cytobrush, foam swab, and oral rinse). Each method was tested on 17 adult volunteers from our center, using a random crossover collection design and analyzed using repeated-measures statistics. DNA yield and quality were assessed via gel electrophoresis, spectrophotometry, and polymerase chain reaction (PCR) amplification rate. The whole-saliva method provided a significantly greater DNA yield (mean +/- SD = 154.9 +/- 103.05 microg, median = 181.88) than the other methods (oral rinse = 54.74 +/- 41.72 microg, 36.56; swab = 11.44 +/- 7.39 microg, 10.72; cytobrush = 12.66 +/- 6.19 microg, 13.22) (all pairwise P < 0.05). Oral-rinse and whole-saliva samples provided the best DNA quality, whereas cytobrush and swab samples provided poorer quality DNA, as shown by lower OD(260)/OD(280) and OD(260)/OD(230) ratios. We conclude that both a 10-ml oral-rinse sample and 2-ml whole-saliva sample provide sufficient DNA quantity and better quality DNA for genetic epidemiological studies than do the commonly used buccal swab and brush techniques.

  6. Bone lead measured by X-ray fluorescence: epidemiologic methods.

    PubMed Central

    Hu, H; Aro, A; Rotnitzky, A

    1995-01-01

    In vivo X-ray fluorescence (XRF) measurement of bone lead concentration has emerged as an important technique for future epidemiological studies of long-term toxicity. Several issues germane to epidemiologic methodology need to be addressed, however. First, sources of variability in measurements of bone lead need to be quantified, including imprecision related to the physical measurement itself and the variability of lead deposition over the two main compartments of bones (cortical vs. trabecular) and within each compartment. Imprecision related to the physical measurement can be estimated for each individual measurement based on the variability of the signal and background. Second, approaches to low-level data need to be debated. We argue for using the minimal detection limit (MDL) to compare instruments and interpret individual measurements; however, with regard to epidemiologic studies, we would abandon the MDL in favor of using all point estimates. In analyses using bone lead as an independent variable, statistical techniques can be used to adjust regression estimates based on estimates of measurement uncertainty and bone lead variability. Third, factors that can be expected to modify the relationship between bone lead and toxicity, such as gravida history, endocrinological states, nutrition, and other important influences on bone metabolism, need to be identified and measured in epidemiologic studies. By addressing these issues, investigators will be able to maximize the utility of XRF measurements in environmental epidemiologic studies. PMID:7621788

  7. [An analysis of the focus of American Journal of Epidemiology research with bibliometric methods].

    PubMed

    Cui, L

    1996-06-01

    Using bibliometric methods, the author counted citations of papers published in the American Journal of Epidemiology over the last 3 years. The most highly cited papers and books are presented, and the recent research focus of the American Journal of Epidemiology is outlined.

  8. Reporting of occupational and environmental research: use and misuse of statistical and epidemiological methods

    PubMed Central

    Rushton, L.

    2000-01-01

    OBJECTIVES—To report some of the most serious omissions and errors which may occur in papers submitted to Occupational and Environmental Medicine, and to give guidelines on the essential components that should be included in papers reporting results from studies of occupational and environmental health.
METHODS—Since 1994 Occupational and Environmental Medicine has used a panel of medical statisticians to review submitted papers with substantial statistical content. Although some studies may have genuine errors in their design, execution, and analysis, many of the problems identified during review are due to inadequate and incomplete reporting of essential aspects of a study. This paper outlines some of the most important errors and omissions that may occur. Observational studies are often the preferred choice of design in occupational and environmental medicine. Issues relating to design, execution, and analysis that should be considered when reporting three of the most common observational study designs (cross sectional, case-control, and cohort) are described, with an illustration of good reporting practice for each. Various mathematical modelling techniques are often used in the analysis of these studies, and their reporting causes major problems for some authors. Suggestions for the presentation of results from modelling are made.
CONCLUSIONS—There is increasing interest in the development and application of formal "good epidemiology practices". These not only consider issues of data quality, study design, and study conduct, but through their structured approach to the documentation of the study procedures, provide the potential for more rigorous reporting of the results in the scientific literature.


Keywords: research reporting; statistical methods; epidemiological methods PMID:10711263

  9. Method for Design Rotation

    DTIC Science & Technology

    1993-08-01

    desirability of a rotation as a function of the set of planar angles. Criteria for the symmetry of the design (such as the same set of factor levels for...P is -1. Hence there is no theoretical problem in obtaining rotations of a design; there are only the practical questions Why rotate a design? And...star points, which can be represented in a shorthand notation by the permutations of (±1,0, "’" , 0), and (c) factorial points, which are a two- level

  10. The role of applied epidemiology methods in the disaster management cycle.

    PubMed

    Malilay, Josephine; Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F; Schnall, Amy H; Podgornik, Michelle N; Cruz, Miguel A; Horney, Jennifer A; Zane, David; Roisman, Rachel; Greenspan, Joel R; Thoroughman, Doug; Anderson, Henry A; Wells, Eden V; Simms, Erin F

    2014-11-01

    Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure.

  11. The Role of Applied Epidemiology Methods in the Disaster Management Cycle

    PubMed Central

    Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F.; Schnall, Amy H.; Podgornik, Michelle N.; Cruz, Miguel A.; Horney, Jennifer A.; Zane, David; Roisman, Rachel; Greenspan, Joel R.; Thoroughman, Doug; Anderson, Henry A.; Wells, Eden V.; Simms, Erin F.

    2014-01-01

    Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure. PMID:25211748

  12. Development of the residential case-specular epidemiologic investigation method. Final report

    SciTech Connect

    Zaffanella, L.E.; Savitz, D.A.

    1995-11-01

    The residential case-specular method is an innovative approach to epidemiologic studies of the association between wire codes and childhood cancer. This project was designed to further the development of the residential case-specular method, which seeks to help resolve the "wire code paradox". For years, wire codes have been used as surrogate measures of past electric and magnetic field (EMF) exposure. There is a magnetic field hypothesis that suggests childhood cancer is associated with exposure to magnetic fields, with wire codes as a proxy for these fields. The neighborhood hypothesis suggests that childhood cancer is associated with neighborhood characteristics and exposures other than magnetic fields, with wire codes as a proxy for these characteristics and exposures. The residential case-specular method was designed to discriminate between the magnetic field and the neighborhood hypothesis. Two methods were developed for determining the specular of a residence. These methods were tested with 400 randomly selected residences. The main advantage of the residential case-specular method is that it may efficiently confirm or eliminate the suspicion that control selection bias or confounding by neighborhood factors affected the results of case-control studies of childhood cancer and magnetic fields. The method may be applicable to both past and ongoing studies. The main disadvantage is that the method is untried. Consequently, further work is required to verify its validity and to ensure that sufficient statistical power can be obtained in a cost-effective manner.

  13. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study

    PubMed Central

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Background Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules' performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2–4.3) larger than validation studies using cohort design and unclear design. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2–3.1) compared to complete, partial and unclear verification. The summary RDOR of validation studies with inadequate sample size was 1.9 (95% CI: 1.2–3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
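    The diagnostic odds ratio compared across study designs above can be computed directly from a 2×2 validation table. The sketch below is a minimal illustration; the counts are invented for demonstration, not taken from the study.

```python
def diagnostic_odds_ratio(tp, fp, fn, tn):
    """DOR = (TP/FN) / (FP/TN) = (TP*TN) / (FP*FN)."""
    return (tp * tn) / (fp * fn)

# Hypothetical 2x2 table from a validation study:
# rule-positive/negative vs. outcome present/absent
dor = diagnostic_odds_ratio(tp=90, fp=20, fn=10, tn=80)
print(dor)  # (90*80)/(20*10) = 36.0
```

    A DOR of 1 means the rule does not discriminate; inflated DORs in case-control designs are exactly the overestimation the meta-epidemiological analysis quantifies.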

  14. Aircraft digital control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Variations in design methods for aircraft digital flight control are evaluated and compared. The methods fall into two categories: those where the design is done in the continuous domain (or s plane) and those where the design is done in the discrete domain (or z plane). Design method fidelity is evaluated by examining closed loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps except the uncompensated s plane design method, which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.

  15. Landscape-epidemiological study design to investigate an environmentally based disease.

    PubMed

    Tabor, Joseph A; O'Rourke, Mary Kay; Lebowitz, Michael D; Harris, Robin B

    2011-01-01

    Cost-effective approaches for identifying and enrolling subjects in community-based epidemiological studies face many challenges. Additional challenges arise when a neighborhood scale of analysis is required to distinguish between individual- and group-level risk factors with strong environmental determinants. A stratified, two-stage, cross-sectional, address-based telephone survey of Greater Tucson, Arizona, was conducted in 2002-2003. Subjects were recruited from direct marketing data at neighborhood resolution using a geographic information system (GIS). Three geomorphic strata were divided into two demographic units. Households were randomly selected within census block groups, selected using the probability proportional to size technique. Purchased direct marketing lists represented 45.2% of Census 2000 households in the surveyed block groups. The survey design effect (1.6) on coccidioidomycosis prevalence (88 per 100,000 per year) was substantially reduced in four of the six strata (0.3-0.9). Race-ethnicity was more robust than age and gender for compensating for significant selection bias using poststratification. Clustered, address-based telephone surveys provide a cost-effective, valid method for recruiting populations from address-based lists, using a GIS to design surveys and population survey statistical methods for analysis. Landscape ecology provides effective methods for identifying scales of analysis and units for stratification that will improve sampling efficiency when environmental variables of interest are strong predictors.
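    The design effect reported above measures how much clustering inflates variance relative to simple random sampling. A minimal sketch, using the standard cluster-sampling approximation DEFF = 1 + (m − 1)ρ; the cluster size and intracluster correlation below are illustrative, not values from the survey.

```python
def design_effect(avg_cluster_size, icc):
    """Approximate design effect for single-stage cluster sampling:
    DEFF = 1 + (m - 1) * rho, with m the average cluster size and
    rho the intracluster correlation coefficient."""
    return 1.0 + (avg_cluster_size - 1.0) * icc

def effective_sample_size(n, deff):
    # A clustered sample of n is "worth" n / DEFF simple-random observations.
    return n / deff

deff = design_effect(avg_cluster_size=13, icc=0.05)
print(round(deff, 2))                               # 1.6
print(round(effective_sample_size(1000, deff), 1))  # 625.0
```

    A DEFF below 1, as in four of the six strata here, indicates the stratification actually improved precision over simple random sampling.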

  16. Trends in epidemiology in the 21st century: time to adopt Bayesian methods.

    PubMed

    Martinez, Edson Zangiacomi; Achcar, Jorge Alberto

    2014-04-01

    The year 2013 marked the 250th anniversary of the presentation of Bayes' theorem by the philosopher Richard Price. Thomas Bayes was a figure little known in his own time, but in the 20th century the theorem that bears his name became widely used in many fields of research. Bayes' theorem is the basis of the so-called Bayesian methods, an approach to statistical inference that allows studies to incorporate prior knowledge about relevant data characteristics into statistical analysis. Nowadays, Bayesian methods are widely used in many different areas such as astronomy, economics, marketing, genetics, bioinformatics and social sciences. This study observed that a number of authors discussed recent advances in techniques and the advantages of Bayesian methods for the analysis of epidemiological data. This article presents an overview of Bayesian methods, their application to epidemiological research and the main areas of epidemiology which should benefit from the use of Bayesian methods in coming years.
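    The incorporation of prior knowledge described above can be illustrated with the simplest conjugate case: updating a Beta prior on disease prevalence with binomial screening data. All numbers below are invented for illustration.

```python
# Prior: Beta(a, b), encoding a prior belief that prevalence is around 2%
a, b = 2.0, 98.0

# Data: 30 cases observed among 1000 screened individuals
cases, n = 30, 1000

# By Beta-Binomial conjugacy, the posterior is Beta(a + cases, b + n - cases)
post_a, post_b = a + cases, b + (n - cases)
posterior_mean = post_a / (post_a + post_b)
print(round(posterior_mean, 4))  # (2+30)/(2+98+1000) = 32/1100 ~= 0.0291
```

    The posterior mean sits between the prior guess (0.02) and the observed proportion (0.03), weighted by their relative information, which is the essence of the Bayesian updating the article surveys.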

  17. Concordance and discordance of sequence survey methods for molecular epidemiology

    PubMed Central

    Hasan, Nur A.; Cebula, Thomas A.; Colwell, Rita R.; Robison, Richard A.; Johnson, W. Evan; Crandall, Keith A.

    2015-01-01

    The post-genomic era is characterized by the direct acquisition and analysis of genomic data with many applications, including the enhancement of the understanding of microbial epidemiology and pathology. However, there are a number of molecular approaches to survey pathogen diversity, and the impact of these different approaches on parameter estimation and inference are not entirely clear. We sequenced whole genomes of bacterial pathogens, Burkholderia pseudomallei, Yersinia pestis, and Brucella spp. (60 new genomes), and combined them with 55 genomes from GenBank to address how different molecular survey approaches (whole genomes, SNPs, and MLST) impact downstream inferences on molecular evolutionary parameters, evolutionary relationships, and trait character associations. We selected isolates for sequencing to represent temporal, geographic origin, and host range variability. We found that substitution rate estimates vary widely among approaches, and that SNP and genomic datasets yielded different but strongly supported phylogenies. MLST yielded poorly supported phylogenies, especially in our low diversity dataset, i.e., Y. pestis. Trait associations showed that B. pseudomallei and Y. pestis phylogenies are significantly associated with geography, irrespective of the molecular survey approach used, while Brucella spp. phylogeny appears to be strongly associated with geography and host origin. We contrast inferences made among monomorphic (clonal) and non-monomorphic bacteria, and between intra- and inter-specific datasets. We also discuss our results in light of underlying assumptions of different approaches. PMID:25737810

  18. Stochastic Methods for Aircraft Design

    NASA Technical Reports Server (NTRS)

    Pelz, Richard B.; Ogot, Madara

    1998-01-01

    The global stochastic optimization method, simulated annealing (SA), was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations for an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems including: low sonic-boom bodies, minimum drag on supersonic fore-bodies, minimum drag on supersonic aeroelastic fore-bodies, minimum drag on HSCT aeroelastic wings, FLOPS preliminary design code, another preliminary aircraft design study with vortex lattice aerodynamics, HSR complete aircraft aerodynamics. In every case, SA provided a simple, robust and reliable optimization method which found optimal designs in order 100 objective function evaluations. Perhaps most importantly, from this academic/industrial project, technology has been successfully transferred; this method is the method of choice for optimization problems at Northrop Grumman.
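    The simulated annealing procedure described above can be sketched in a few lines: propose random neighbors, always accept improvements, and accept uphill moves with a Boltzmann probability that shrinks as the temperature cools. This is a generic illustration on an invented one-dimensional multimodal objective, not the authors' aircraft-design code.

```python
import math
import random

def objective(x):
    # A rough, multi-minima surface, loosely analogous to the numerically
    # generated nonlinear objective functions described in the abstract
    return x * x + 10.0 * math.sin(3.0 * x)

def simulated_annealing(f, x0, temp=10.0, cooling=0.95, steps=2000, seed=1):
    random.seed(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for _ in range(steps):
        cand = x + random.gauss(0.0, 1.0)  # random neighbor
        fc = f(cand)
        # Accept improvements always; accept worse moves with Boltzmann prob.
        if fc < fx or random.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        temp *= cooling  # geometric cooling schedule
    return best_x, best_f

x_opt, f_opt = simulated_annealing(objective, x0=5.0)
```

    The temperature schedule is what lets the search escape local minima early on while converging later, which is why SA suits the rough multi-minima design spaces the abstract describes.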

  19. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies

    PubMed Central

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-01-01

    Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement processes has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL with lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching-approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL-values. This might result in spurious or lost associations in epidemiological studies under certain circumstances. PMID:27138987

  20. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies.

    PubMed

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-05-03

    Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement processes has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL with lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching-approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL-values. This might result in spurious or lost associations in epidemiological studies under certain circumstances.

  1. Design method of supercavitating pumps

    NASA Astrophysics Data System (ADS)

    Kulagin, V.; Likhachev, D.; Li, F. C.

    2016-05-01

    The problem of effective supercavitating (SC) pump is solved, and optimum load distribution along the radius of the blade is found taking into account clearance, degree of cavitation development, influence of finite number of blades, and centrifugal forces. Sufficient accuracy can be obtained using the equivalent flat SC-grid for design of any SC-mechanisms, applying the “grid effect” coefficient and substituting the skewed flow calculated for grids of flat plates with the infinite attached cavitation caverns. This article gives the universal design method and provides an example of SC-pump design.

  2. Trends in Citations to Books on Epidemiological and Statistical Methods in the Biomedical Literature

    PubMed Central

    Porta, Miquel; Vandenbroucke, Jan P.; Ioannidis, John P. A.; Sanz, Sergio; Fernandez, Esteve; Bhopal, Raj; Morabia, Alfredo; Victora, Cesar; Lopez, Tomàs

    2013-01-01

    Background There are no analyses of citations to books on epidemiological and statistical methods in the biomedical literature. Such analyses may shed light on how concepts and methods changed while biomedical research evolved. Our aim was to analyze the number and time trends of citations received from biomedical articles by books on epidemiological and statistical methods, and related disciplines. Methods and Findings The data source was the Web of Science. The study books were published between 1957 and 2010. The first year of publication of the citing articles was 1945. We identified 125 books that received at least 25 citations. Books first published in 1980–1989 had the highest total and median number of citations per year. Nine of the 10 most cited texts focused on statistical methods. Hosmer & Lemeshow's Applied logistic regression received the highest number of citations and highest average annual rate. It was followed by books by Fleiss, Armitage, et al., Rothman, et al., and Kalbfleisch and Prentice. Fifth in citations per year was Sackett, et al., Evidence-based medicine. The rise of multivariate methods, clinical epidemiology, or nutritional epidemiology was reflected in the citation trends. Educational textbooks, practice-oriented books, books on epidemiological substantive knowledge, and on theory and health policies were much less cited. None of the 25 top-cited books had the theoretical or sociopolitical scope of works by Cochrane, McKeown, Rose, or Morris. Conclusions Books were mainly cited to reference methods. Books first published in the 1980s continue to be most influential. Older books on theory and policies were rooted in societal and general medical concerns, while the most modern books are almost purely on methods. PMID:23667447

  3. Imputation method for lifetime exposure assessment in air pollution epidemiologic studies

    PubMed Central

    2013-01-01

    against health data should be done as a function of PDI to check for consistency of results. The 1% of study subjects who lived for long durations near heavily trafficked intersections, had very high cumulative exposures. Thus, imputation methods must be designed to reproduce non-standard distributions. Conclusions Our approach meets a number of methodological challenges to extending historical exposure reconstruction over a lifetime and shows promise for environmental epidemiology. Application to assessment of breast cancer risks will be reported in a subsequent manuscript. PMID:23919666

  4. An updated systematic review of epidemiological evidence on hormonal contraceptive methods and HIV acquisition in women

    PubMed Central

    Polis, Chelsea B.; Curtis, Kathryn M.; Hannaford, Philip C.; Phillips, Sharon J.; Chipato, Tsungai; Kiarie, James N.; Westreich, Daniel J.; Steyn, Petrus S.

    2016-01-01

    Objective and design: Some studies suggest that specific hormonal contraceptive methods [particularly depot medroxyprogesterone acetate (DMPA)] may increase women's HIV acquisition risk. We updated a systematic review to incorporate recent epidemiological data. Methods: We searched for articles published between 15 January 2014 and 15 January 2016 and hand-searched reference lists. We identified longitudinal studies comparing users of a specific hormonal contraceptive method against either nonusers of hormonal contraception or users of another specific hormonal contraceptive method. We added newly identified studies to those in the previous review, assessed study quality, created forest plots to display results, and conducted a meta-analysis for data on DMPA versus non-use of hormonal contraception. Results: We identified 10 new reports of which five were considered ‘unlikely to inform the primary question’. We focus on the other five reports, along with nine from the previous review, which were considered ‘informative but with important limitations’. The preponderance of data for oral contraceptive pills, injectable norethisterone enanthate, and levonorgestrel implants does not suggest an association with HIV acquisition, though data for implants are limited. The new, higher quality studies on DMPA (or nondisaggregated injectables), which had mixed results in terms of statistical significance, had hazard ratios between 1.2 and 1.7, consistent with our meta-analytic estimate for all higher quality studies of hazard ratio 1.4. Conclusion: Although confounding in these observational data cannot be excluded, new information increases concerns about DMPA and HIV acquisition risk in women. If the association is causal, the magnitude of effect is likely hazard ratio 1.5 or less. Data for other hormonal contraceptive methods, including norethisterone enanthate, are largely reassuring. PMID:27500670

  5. Educational epidemiology: applying population-based design and analytic approaches to study medical education.

    PubMed

    Carney, Patricia A; Nierenberg, David W; Pipas, Catherine F; Brooks, W Blair; Stukel, Therese A; Keller, Adam M

    2004-09-01

    Conducting educational research in medical schools is challenging partly because interventional controlled research designs are difficult to apply. In addition, strict accreditation requirements and student/faculty concerns about educational inequality reduce the flexibility needed to plan and execute educational experiments. Consequently, there is a paucity of rigorous and generalizable educational research to provide an evidence-guided foundation to support educational effectiveness. "Educational epidemiology," ie, the application across the physician education continuum of observational designs (eg, cross-sectional, longitudinal, cohort, and case-control studies) and randomized experimental designs (eg, randomized controlled trials, randomized crossover designs), could revolutionize the conduct of research in medical education. Furthermore, the creation of a comprehensive national network of educational epidemiologists could enhance collaboration and the development of a strong educational research foundation.

  6. [Mendelian randomisation - a genetic approach to an epidemiological method].

    PubMed

    Stensrud, Mats Julius

    2016-06-01

BACKGROUND: Genetic information is becoming more easily available, and rapid progress is being made in developing methods of illuminating issues of interest. Mendelian randomisation makes it possible to study causes of disease using observational data. The name refers to the random distribution of gene variants in meiosis. The methodology makes use of genes that influence a risk factor for a disease, without influencing the disease itself. In this review article I explain the principles behind Mendelian randomisation and present the areas of application for this methodology. MATERIAL AND METHOD: Methodology articles describing Mendelian randomisation were reviewed. The articles were found through a search in PubMed with the combination «mendelian randomization» OR «mendelian randomisation», and a search in McMaster Plus with the combination «mendelian randomization». A total of 15 methodology articles were read in full text. Methodology articles were supplemented by clinical studies found in the PubMed search. RESULTS: In contrast to traditional observational studies, Mendelian randomisation studies are not affected by two important sources of error: conventional confounding variables and reverse causation. Mendelian randomisation is therefore a promising tool for studying causality. Mendelian randomisation studies have already provided valuable knowledge on the risk factors for a wide range of diseases. It is nevertheless important to be aware of the limitations of the methodology. As a result of the rapid developments in genetics research, Mendelian randomisation will probably be widely used in future years. INTERPRETATION: If Mendelian randomisation studies are conducted correctly, they may help to reveal both modifiable and non-modifiable causes of disease.
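The core of a single-instrument Mendelian randomisation analysis is the Wald ratio: the gene–outcome association divided by the gene–exposure association. A minimal sketch with hypothetical effect sizes, not taken from any study cited here:

```python
def wald_ratio(beta_gx, beta_gy, se_gy):
    """Wald ratio for single-instrument Mendelian randomisation:
    causal effect of exposure on outcome = beta_gy / beta_gx, where
    beta_gx is the gene-exposure and beta_gy the gene-outcome association.
    First-order SE, ignoring uncertainty in beta_gx."""
    return beta_gy / beta_gx, se_gy / abs(beta_gx)

# Hypothetical effects: a variant shifts the exposure by 0.3 SD and the
# outcome log-odds by 0.06 (standard error 0.015)
est, se = wald_ratio(beta_gx=0.3, beta_gy=0.06, se_gy=0.015)
```

Multi-instrument analyses pool such ratios across variants (e.g. by inverse-variance weighting), but the one-variant case shows the logic.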

  7. Violent crime in San Antonio, Texas: an application of spatial epidemiological methods.

    PubMed

    Sparks, Corey S

    2011-12-01

    Violent crimes are rarely considered a public health problem or investigated using epidemiological methods. But patterns of violent crime and other health conditions are often affected by similar characteristics of the built environment. In this paper, methods and perspectives from spatial epidemiology are used in an analysis of violent crimes in San Antonio, TX. Bayesian statistical methods are used to examine the contextual influence of several aspects of the built environment. Additionally, spatial regression models using Bayesian model specifications are used to examine spatial patterns of violent crime risk. Results indicate that the determinants of violent crime depend on the model specification, but are primarily related to the built environment and neighborhood socioeconomic conditions. Results are discussed within the context of a rapidly growing urban area with a diverse population.

  8. Discriminatory Indices of Typing Methods for Epidemiologic Analysis of Contemporary Staphylococcus aureus Strains.

    PubMed

    Rodriguez, Marcela; Hogan, Patrick G; Satola, Sarah W; Crispell, Emily; Wylie, Todd; Gao, Hongyu; Sodergren, Erica; Weinstock, George M; Burnham, Carey-Ann D; Fritz, Stephanie A

    2015-09-01

    Historically, a number of typing methods have been evaluated for Staphylococcus aureus strain characterization. The emergence of contemporary strains of community-associated S. aureus, and the ensuing epidemic with a predominant strain type (USA300), necessitates re-evaluation of the discriminatory power of these typing methods for discerning molecular epidemiology and transmission dynamics, essential to investigations of hospital and community outbreaks. We compared the discriminatory index of 5 typing methods for contemporary S. aureus strain characterization. Children presenting to St. Louis Children's Hospital and community pediatric practices in St. Louis, Missouri (MO), with community-associated S. aureus infections were enrolled. Repetitive sequence-based PCR (repPCR), pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), staphylococcal protein A (spa), and staphylococcal cassette chromosome (SCC) mec typing were performed on 200 S. aureus isolates. The discriminatory index of each method was calculated using the standard formula for this metric, where a value of 1 is highly discriminatory and a value of 0 is not discriminatory. Overall, we identified 26 distinct strain types by repPCR, 17 strain types by PFGE, 30 strain types by MLST, 68 strain types by spa typing, and 5 strain types by SCCmec typing. RepPCR had the highest discriminatory index (D) of all methods (D = 0.88), followed by spa typing (D = 0.87), MLST (D = 0.84), PFGE (D = 0.76), and SCCmec typing (D = 0.60). The method with the highest D among MRSA isolates was repPCR (D = 0.64) followed by spa typing (D = 0.45) and MLST (D = 0.44). The method with the highest D among MSSA isolates was spa typing (D = 0.98), followed by MLST (D = 0.93), repPCR (D = 0.92), and PFGE (D = 0.89). Among isolates designated USA300 by PFGE, repPCR was most discriminatory, with 10 distinct strain types identified (D = 0.63). We identified 45
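The "standard formula" referred to above is Simpson's index of diversity as adapted by Hunter and Gaston: the probability that two isolates drawn at random belong to different strain types. A sketch with toy strain-type labels, not the study's isolates:

```python
from collections import Counter

def discriminatory_index(type_assignments):
    """Hunter-Gaston discriminatory index:
    D = 1 - [1 / (N(N-1))] * sum over types of n_j * (n_j - 1),
    where n_j is the number of isolates of type j and N the total."""
    n = len(type_assignments)
    counts = Counter(type_assignments).values()
    return 1 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical toy data: 6 isolates falling into 3 strain types
d = discriminatory_index(["A", "A", "A", "B", "B", "C"])
```

A typing method that assigns every isolate the same type scores 0; one that gives every isolate a unique type scores 1, matching the interpretation in the abstract.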

  10. Comparing Two Epidemiologic Surveillance Methods to Assess Underestimation of Human Stampedes in India

    PubMed Central

    Ngai, Ka Ming; Lee, Wing Yan; Madan, Aditi; Sanyal, Saswata; Roy, Nobhojit; Burkle, Frederick M.; Hsu, Edbert B.

    2013-01-01

Background: Two separate but complementary epidemiologic surveillance methods for human stampedes have emerged since the publication of the topic in 2009. The objective of this study is to estimate the degree of underreporting in India. Method: The Ngai Search Method was compared to the Roy Search Method for human stampede events occurring in India between 2001 and 2010. Results: A total of 40 stampedes were identified by the two search methods combined. Using the Ngai method, 34 human stampedes were identified. Using a previously defined stampede scale: 2 events were class I, 21 events were class II, 8 events were class III, and 3 events were class IV. The median number of deaths was 5.5 per event and the median number of injuries was 13.5 per event. Using the Roy method, 27 events were identified, including 9 events that were not identified by the Ngai method. After applying the exclusion criteria, the six additional events identified by the Roy method had a median of 4 deaths and 30 injuries. In multivariate analysis using the Ngai method, religious (6.52, 95%CI 1.73-24.66, p=0.006) and political (277.09, 95%CI 5.12-15,001.96, p=0.006) events had a higher relative number of deaths. Conclusion: Many of the causes underlying the global increase in human stampede events can only be elucidated through systematic epidemiological investigation. Focusing on a country with a high recurrence of human stampedes, we compared two independent methods of data abstraction in an effort to improve the existing database and to identify pertinent risk factors. We conclude that our previous publication underestimated stampede events in India by approximately 18%, and that an international standardized database to systematically record the occurrence of human stampedes is needed to facilitate understanding of their epidemiology. PMID:24077300

  11. Rationale and Design of the International Lymphoma Epidemiology Consortium (InterLymph) Non-Hodgkin Lymphoma Subtypes Project

    PubMed Central

    Morton, Lindsay M.; Sampson, Joshua N.; Cerhan, James R.; Turner, Jennifer J.; Vajdic, Claire M.; Wang, Sophia S.; Smedby, Karin E.; de Sanjosé, Silvia; Monnereau, Alain; Benavente, Yolanda; Bracci, Paige M.; Chiu, Brian C. H.; Skibola, Christine F.; Zhang, Yawei; Mbulaiteye, Sam M.; Spriggs, Michael; Robinson, Dennis; Norman, Aaron D.; Kane, Eleanor V.; Spinelli, John J.; Kelly, Jennifer L.; Vecchia, Carlo La; Dal Maso, Luigino; Maynadié, Marc; Kadin, Marshall E.; Cocco, Pierluigi; Costantini, Adele Seniori; Clarke, Christina A.; Roman, Eve; Miligi, Lucia; Colt, Joanne S.; Berndt, Sonja I.; Mannetje, Andrea; de Roos, Anneclaire J.; Kricker, Anne; Nieters, Alexandra; Franceschi, Silvia; Melbye, Mads; Boffetta, Paolo; Clavel, Jacqueline; Linet, Martha S.; Weisenburger, Dennis D.; Slager, Susan L.

    2014-01-01

    Background Non-Hodgkin lymphoma (NHL), the most common hematologic malignancy, consists of numerous subtypes. The etiology of NHL is incompletely understood, and increasing evidence suggests that risk factors may vary by NHL subtype. However, small numbers of cases have made investigation of subtype-specific risks challenging. The International Lymphoma Epidemiology Consortium therefore undertook the NHL Subtypes Project, an international collaborative effort to investigate the etiologies of NHL subtypes. This article describes in detail the project rationale and design. Methods We pooled individual-level data from 20 case-control studies (17471 NHL cases, 23096 controls) from North America, Europe, and Australia. Centralized data harmonization and analysis ensured standardized definitions and approaches, with rigorous quality control. Results The pooled study population included 11 specified NHL subtypes with more than 100 cases: diffuse large B-cell lymphoma (N = 4667), follicular lymphoma (N = 3530), chronic lymphocytic leukemia/small lymphocytic lymphoma (N = 2440), marginal zone lymphoma (N = 1052), peripheral T-cell lymphoma (N = 584), mantle cell lymphoma (N = 557), lymphoplasmacytic lymphoma/Waldenström macroglobulinemia (N = 374), mycosis fungoides/Sézary syndrome (N = 324), Burkitt/Burkitt-like lymphoma/leukemia (N = 295), hairy cell leukemia (N = 154), and acute lymphoblastic leukemia/lymphoma (N = 152). Associations with medical history, family history, lifestyle factors, and occupation for each of these 11 subtypes are presented in separate articles in this issue, with a final article quantitatively comparing risk factor patterns among subtypes. Conclusions The International Lymphoma Epidemiology Consortium NHL Subtypes Project provides the largest and most comprehensive investigation of potential risk factors for a broad range of common and rare NHL subtypes to date. The analyses contribute to our understanding of the multifactorial nature of NHL

  12. An internet-based method of selecting control populations for epidemiologic studies.

    PubMed

    Stone, Mary Bishop; Lyon, Joseph L; Simonsen, Sara Ellis; White, George L; Alder, Stephen C

    2007-01-01

    Identifying control subjects for epidemiologic studies continues to increase in difficulty because of changes in telephone technology such as answering services and machines, caller identification, and cell phones. An Internet-based method for obtaining study subjects that may increase response rates has been developed and is described. This method uses information from two websites that, when combined, provide accurate and complete lists of names, addresses, and listed phone numbers. This method was developed by use of randomly selected streets in a suburb of Salt Lake City, Utah, in June 2005.

  13. Endodontic Epidemiology

    PubMed Central

    Shahravan, Arash; Haghdoost, Ali Akbar

    2014-01-01

Epidemiology is the study of disease distribution and the factors determining or affecting it. Likewise, endodontic epidemiology can be defined as the science of studying the distribution pattern and determinants of pulp and periapical diseases, especially apical periodontitis. Although different study designs have been used in endodontics, researchers must pay more attention to study designs with a higher level of evidence, such as randomized clinical trials. PMID:24688577

  14. [Methodical reflections on epidemiological methods to measure adverse medical device events].

    PubMed

    Lessing, C

    2009-06-01

Drugs and medical devices are common remedies in patient care. Concerning patient safety, much research has been undertaken to study medication-related events, such as adverse drug events or medication errors; however, little is known about device-related events and patient safety. To date, only one survey on the epidemiology of adverse medical device events has been published, estimating 8.4 adverse medical device events per 100 hospitalizations. As this indicates, further research on epidemiological methodology is needed to investigate the frequency, distribution, causes and consequences of medical device-related events. Only sound knowledge of this kind can provide a solid basis for developing safety strategies that can then be implemented and evaluated. The particular data-collection challenges described must also be mastered in the German health care system.

  15. Methods for measuring utilization of mental health services in two epidemiologic studies

    PubMed Central

    NOVINS, DOUGLAS K.; BEALS, JANETTE; CROY, CALVIN; MANSON, SPERO M.

    2015-01-01

    Objectives of Study Psychiatric epidemiologic studies often include two or more sets of questions regarding service utilization, but the agreement across these different questions and the factors associated with their endorsement have not been examined. The objectives of this study were to describe the agreement of different sets of mental health service utilization questions that were included in the American Indian Service Utilization Psychiatric Epidemiology Risk and Protective Factors Project (AI-SUPERPFP), and compare the results to similar questions included in the baseline National Comorbidity Survey (NCS). Methods Responses to service utilization questions by 2878 AI-SUPERPFP and 5877 NCS participants were examined by calculating estimates of service use and agreement (κ) across the different sets of questions. Logistic regression models were developed to identify factors associated with endorsement of specific sets of questions. Results In both studies, estimates of mental health service utilization varied across the different sets of questions. Agreement across the different question sets was marginal to good (κ = 0.27–0.69). Characteristics of identified service users varied across the question sets. Limitations Neither survey included data to examine the validity of participant responses to service utilization questions. Recommendations for Further Research Question wording and placement appear to impact estimates of service utilization in psychiatric epidemiologic studies. Given the importance of these estimates for policy-making, further research into the validity of survey responses as well as impacts of question wording and context on rates of service utilization is warranted. PMID:18767205
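Agreement statistics like the κ values quoted above can be computed directly. A minimal sketch for two binary question sets answered by the same respondents, using made-up responses rather than AI-SUPERPFP or NCS data:

```python
def cohens_kappa(x, y):
    """Cohen's kappa for two binary (1 = yes, 0 = no) question sets:
    observed agreement corrected for agreement expected by chance."""
    n = len(x)
    observed = sum(a == b for a, b in zip(x, y)) / n
    p_yes_x = sum(x) / n
    p_yes_y = sum(y) / n
    expected = p_yes_x * p_yes_y + (1 - p_yes_x) * (1 - p_yes_y)
    return (observed - expected) / (1 - expected)

# Hypothetical responses (1 = reported service use) to two question sets
k = cohens_kappa([1, 1, 0, 0, 1, 0, 0, 0],
                 [1, 0, 0, 0, 1, 0, 0, 1])
```

Here the raw agreement is 6/8 = 0.75, but the chance-corrected κ is about 0.47, squarely in the "marginal to good" band (0.27–0.69) the abstract reports.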

  16. A survey of variable selection methods in two Chinese epidemiology journals

    PubMed Central

    2010-01-01

    Background Although much has been written on developing better procedures for variable selection, there is little research on how it is practiced in actual studies. This review surveys the variable selection methods reported in two high-ranking Chinese epidemiology journals. Methods Articles published in 2004, 2006, and 2008 in the Chinese Journal of Epidemiology and the Chinese Journal of Preventive Medicine were reviewed. Five categories of methods were identified whereby variables were selected using: A - bivariate analyses; B - multivariable analysis; e.g. stepwise or individual significance testing of model coefficients; C - first bivariate analyses, followed by multivariable analysis; D - bivariate analyses or multivariable analysis; and E - other criteria like prior knowledge or personal judgment. Results Among the 287 articles that reported using variable selection methods, 6%, 26%, 30%, 21%, and 17% were in categories A through E, respectively. One hundred sixty-three studies selected variables using bivariate analyses, 80% (130/163) via multiple significance testing at the 5% alpha-level. Of the 219 multivariable analyses, 97 (44%) used stepwise procedures, 89 (41%) tested individual regression coefficients, but 33 (15%) did not mention how variables were selected. Sixty percent (58/97) of the stepwise routines also did not specify the algorithm and/or significance levels. Conclusions The variable selection methods reported in the two journals were limited in variety, and details were often missing. Many studies still relied on problematic techniques like stepwise procedures and/or multiple testing of bivariate associations at the 0.05 alpha-level. These deficiencies should be rectified to safeguard the scientific validity of articles published in Chinese epidemiology journals. PMID:20920252

  17. Realist explanatory theory building method for social epidemiology: a protocol for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A

    2014-01-01

A recent criticism of social epidemiological studies, and of multilevel studies in particular, has been a paucity of theory. We present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach which is trans-disciplinary, encompassing both quantitative and qualitative traditions, and which assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising: 1) an emergent phase, 2) a construction phase, and 3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design is described. The Emergent Phase uses interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: 1) defining stratified levels; 2) analytic resolution; 3) abductive reasoning; 4) comparative analysis (triangulation); 5) retroduction; 6) postulate and proposition development; 7) comparison and assessment of theories; and 8) conceptual frameworks and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology. The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for Theory

  18. Experimental design methods for bioengineering applications.

    PubMed

    Keskin Gündoğdu, Tuğba; Deniz, İrem; Çalışkan, Gülizar; Şahin, Erdem Sefa; Azbar, Nuri

    2016-01-01

    Experimental design is a form of process analysis in which certain factors are selected to obtain the desired responses of interest. It may also be used for the determination of the effects of various independent factors on a dependent factor. The bioengineering discipline includes many different areas of scientific interest, and each study area is affected and governed by many different factors. Briefly analyzing the important factors and selecting an experimental design for optimization are very effective tools for the design of any bioprocess under question. This review summarizes experimental design methods that can be used to investigate various factors relating to bioengineering processes. The experimental methods generally used in bioengineering are as follows: full factorial design, fractional factorial design, Plackett-Burman design, Taguchi design, Box-Behnken design and central composite design. These design methods are briefly introduced, and then the application of these design methods to study different bioengineering processes is analyzed.
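The simplest of the designs listed, the full factorial, enumerates every combination of factor levels. A sketch with hypothetical coded bioprocess factors (the factor names and levels are illustrative only):

```python
from itertools import product

def full_factorial(levels):
    """Full factorial design: one run for every combination of factor
    levels. `levels` maps factor name -> list of coded levels."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

# Hypothetical two-level design for three bioprocess factors (coded -1/+1)
runs = full_factorial({"temperature": [-1, 1],
                       "pH": [-1, 1],
                       "agitation": [-1, 1]})
# three two-level factors -> 2^3 = 8 runs
```

Fractional factorial and Plackett-Burman designs keep only a carefully chosen subset of these runs, trading some resolvable interactions for fewer experiments; Box-Behnken and central composite designs add intermediate and axial levels for fitting curvature.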

  19. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation

    PubMed Central

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

    Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), urea breath test (UBT), and stool antigen test (SAT) have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT became the best methods to determine active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicated that no single test can be considered as the gold standard for the diagnosis of H. pylori infection and that one should consider the method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present a sensitivity of 74.2–90.8% and 83.3–86.9% and a specificity of 97.7–98.8% and 95.1–97.2%, respectively, when using IHC as a gold standard. The sensitivity of serology is quite high, but that of the urine test was lower compared with that of the other methods. Thus, indirect test validation is important although some commercial kits propose universal cut-off values. PMID:26904678
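Sensitivity and specificity figures like those above come from a 2×2 comparison of each test against the gold standard (IHC in these studies). A sketch with an invented toy sample, not the underlying study data:

```python
def sensitivity_specificity(results):
    """Sensitivity and specificity of an index test against a gold
    standard, from (test_positive, gold_positive) boolean pairs."""
    tp = sum(t and g for t, g in results)            # true positives
    fn = sum((not t) and g for t, g in results)      # false negatives
    tn = sum((not t) and (not g) for t, g in results)  # true negatives
    fp = sum(t and (not g) for t, g in results)      # false positives
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical toy sample of 10 subjects
pairs = ([(True, True)] * 4 + [(False, True)] * 1 +
         [(False, False)] * 4 + [(True, False)] * 1)
sens, spec = sensitivity_specificity(pairs)
```

The abstract's point about validation is visible here: the estimates depend entirely on which test is taken as the gold standard, so a weak reference test distorts both figures.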

  1. Study designs may influence results: the problems with questionnaire-based case-control studies on the epidemiology of glioma.

    PubMed

    Johansen, Christoffer; Schüz, Joachim; Andreasen, Anne-Marie Serena; Dalton, Susanne Oksbjerg

    2017-03-28

    Glioma is a rare brain tumour with a very poor prognosis and the search for modifiable factors is intense. We reviewed the literature concerning risk factors for glioma obtained in case-control designed epidemiological studies in order to discuss the influence of this methodology on the observed results. When reviewing the association between three exposures, medical radiation, exogenous hormone use and allergy, we critically appraised the evidence from both case-control and cohort studies. For medical radiation and hormone replacement therapy (HRT), questionnaire-based case-control studies appeared to show an inverse association, whereas nested case-control and cohort studies showed no association. For allergies, the inverse association was observed irrespective of study design. We recommend that the questionnaire-based case-control design be placed lower in the hierarchy of studies for establishing cause-and-effect for diseases such as glioma. We suggest that a state-of-the-art case-control study should, as a minimum, be accompanied by extensive validation of the exposure assessment methods and the representativeness of the study sample with regard to the exposures of interest. Otherwise, such studies cannot be regarded as 'hypothesis testing' but only 'hypothesis generating'. We consider that this holds true for all questionnaire-based case-control studies on cancer and other chronic diseases, although perhaps not to the same extent for each exposure-outcome combination.
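The basic measure a questionnaire-based case-control study yields is the odds ratio. A sketch with invented counts (an OR below 1 mirrors the inverse associations discussed above), using Woolf's logit method for the confidence interval:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio from a case-control 2x2 table
    (a = exposed cases, b = unexposed cases,
     c = exposed controls, d = unexposed controls),
    with Woolf's logit-based 95% CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_,
            math.exp(math.log(or_) - z * se),
            math.exp(math.log(or_) + z * se))

# Hypothetical counts for an exposure among cases vs controls
or_, lo, hi = odds_ratio_ci(30, 70, 45, 55)
```

The arithmetic is trivial; the review's argument is that the inputs are not, since recall bias and unrepresentative sampling can push such a table toward a spurious inverse association.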

  2. A practical method for use in epidemiological studies on enamel hypomineralisation.

    PubMed

    Ghanim, A; Elfrink, M; Weerheijm, K; Mariño, R; Manton, D

    2015-06-01

With the development of the European Academy of Paediatric Dentistry (EAPD) judgment criteria, there has been increasing interest worldwide in investigating the prevalence of demarcated opacities in tooth enamel, known as molar-incisor hypomineralisation (MIH). However, the lack of a standardised system for recording MIH data in epidemiological surveys has contributed greatly to the wide variation in prevalence reported between studies. The present publication describes the rationale, development, and content of a scoring method for MIH diagnosis in epidemiological studies as well as clinic- and hospital-based studies. The proposed grading method allows separate classification of demarcated hypomineralisation lesions and of other enamel defects that can resemble MIH. It yields an informative description of the severity of MIH-affected teeth in terms of the stage of visible enamel destruction and the area of tooth surface affected (i.e. lesion clinical status and extent, respectively). In order to preserve the maximum amount of information from a clinical examination, consistent with the need to permit direct comparisons between prevalence studies, two forms of the charting are proposed: a short form for simple screening surveys, and a long form desirable for prospective, longitudinal observational research where aetiological factors in demarcated lesions are to be investigated in tandem with lesion distribution. Validation of the grading method is required, and its reliability and usefulness need to be tested in different age groups and different populations.

  3. Measuring socio-economic position for epidemiological studies in low- and middle-income countries: a methods of measurement in epidemiology paper

    PubMed Central

    Howe, Laura D; Galobardes, Bruna; Matijasevich, Alicia; Gordon, David; Johnston, Deborah; Onwujekwe, Obinna; Patel, Rita; Webb, Elizabeth A; Lawlor, Debbie A; Hargreaves, James R

    2012-01-01

    Much has been written about the measurement of socio-economic position (SEP) in high-income countries (HIC). Less has been written for an epidemiology, health systems and public health audience about the measurement of SEP in low- and middle-income countries (LMIC). The social stratification processes in many LMIC—and therefore the appropriate measurement tools—differ considerably from those in HIC. Many measures of SEP have been utilized in epidemiological studies; the aspects of SEP captured by these measures and the pathways through which they may affect health are likely to be slightly different but overlapping. No single measure of SEP will be ideal for all studies and contexts; the strengths and limitations of a given indicator are likely to vary according to the specific research question. Understanding the general properties of different indicators, however, is essential for all those involved in the design or interpretation of epidemiological studies. In this article, we describe the measures of SEP used in LMIC. We concentrate on measures of individual or household-level SEP rather than area-based or ecological measures such as gross domestic product. We describe each indicator in terms of its theoretical basis, interpretation, measurement, strengths and limitations. We also provide brief comparisons between LMIC and HIC for each measure. PMID:22438428
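One widely used household-level SEP measure in LMIC surveys is the asset-based wealth index: the first principal component of asset-ownership indicators, the approach behind DHS-style wealth indices. A sketch with hypothetical binary asset data (assumes NumPy is available):

```python
import numpy as np

def asset_index(asset_matrix):
    """Asset-based wealth index: score each household by the first
    principal component of standardized asset-ownership indicators."""
    X = np.asarray(asset_matrix, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each indicator
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    pc1 = eigvecs[:, -1]                       # loading vector of largest eigenvalue
    return X @ pc1                             # one relative SEP score per household

# Hypothetical ownership of (radio, bicycle, electricity) for 5 households
scores = asset_index([[0, 0, 0],
                      [1, 0, 0],
                      [1, 1, 0],
                      [1, 1, 1],
                      [0, 1, 1]])
```

The resulting scores are relative, which illustrates a limitation the article discusses: such an index ranks households within a sample but does not transfer across contexts the way income or education measures might.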

  4. Applying epidemiological principles to ergonomics: a checklist for incorporating sound design and interpretation of studies.

    PubMed

    Heacock, H; Koehoorn, M; Tan, J

    1997-06-01

    The primary purpose of this paper is to provide a checklist of scientific requirements necessary for the design of sound ergonomics studies. Ergonomics researchers will be able to use the checklist when designing a study and preparing it for publication. Practitioners can use the checklist to critically appraise study results, thereby having greater confidence when applying ergonomic recommendations to the workplace. A secondary purpose of the paper is to pilot the checklist on a sample of papers in the ergonomics literature and to assess its reliability. While there are checklists to assess the epidemiological rigour of studies, none have been adapted to address methodological issues in ergonomics. Two epidemiologists independently searched five ergonomics journals (Applied Ergonomics, Ergonomics, Human Factors, International Journal of Human-Computer Interaction and Journal of Human Ergology) for research studies on VDT use and visual function published between 1990 and 1995. Twenty-one articles were reviewed. Each paper was scored according to the checklist. Overall, the reviewers found that the articles did not consistently fulfill some of the checklist criteria. An insufficient sample size was the most serious omission. Inter-rater reliability of the checklist was excellent for 11 of 14 items on the checklist (Kappa > 0.74), good for two items (Kappa between 0.40 and 0.74) and poor for one item. As ergonomics is gaining acceptance as an integral part of occupational health and safety, individuals in this field must be cognizant of the fact that study results are being applied directly to workplace procedures and design. It is incumbent upon ergonomists to base their work on a solid research foundation. The checklist can be used as a tool to improve study designs and so ultimately has implications for improving the fit between the worker and the work environment.

  5. Quantitative methods in the tuberculosis epidemiology and in the evaluation of BCG vaccination programs.

    PubMed

    Lugosi, L

    1986-01-01

Controversies concerning the protective efficacy of BCG vaccination result mostly from the fact that quantitative methods have not been used in the evaluation of BCG programs. Therefore, to eliminate the current controversy, it is essential to apply valid biostatistical models to analyse the results of the BCG programs. In order to achieve objective statistical inferences and epidemiological interpretations, the following conditions should be fulfilled: (1) data for evaluation have to be taken from epidemiological trials free of sampling error; (2) since morbidity rates are not normally distributed, an appropriate normalizing transformation is needed for point and confidence interval estimation; (3) only unbiased point estimates (dependent variables) can be used in valid models for hypothesis tests; (4) when the null hypothesis is rejected, the ranked estimates of the compared groups must be evaluated in a multiple comparison model in order to diminish the Type I error in the decision. The following quantitative methods are presented to evaluate the effectiveness of BCG vaccination in Hungary: linear regression analysis, stepwise regression analysis and log-linear analysis.
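One standard normalizing device of the kind the abstract calls for is to form a confidence interval for a rate on the log scale and back-transform. The sketch below assumes Poisson-distributed case counts and is purely illustrative; it does not reproduce the paper's own models:

```python
import math

def log_rate_ci(cases, person_years, z=1.96):
    """Approximate 95% CI for an incidence rate via log transformation.

    Rates are right-skewed, so the interval is computed on the log scale
    and back-transformed; for Poisson counts the SE of log(rate) is
    1/sqrt(cases)."""
    rate = cases / person_years
    se_log = 1.0 / math.sqrt(cases)
    lo = rate * math.exp(-z * se_log)
    hi = rate * math.exp(z * se_log)
    return rate, lo, hi
```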

  6. Design of a detection survey for Ostreid herpesvirus-1 using hydrodynamic dispersion models to determine epidemiological units.

    PubMed

    Pande, Anjali; Acosta, Hernando; Brangenberg, Naya Alexis; Keeling, Suzanne Elizabeth

    2015-04-01

Using Ostreid herpesvirus-1 (OsHV-1) as a case study, this paper considers a survey design methodology for an aquatic animal pathogen that incorporates the concept of biologically independent epidemiological units. Hydrodynamically-modelled epidemiological units are used to divide marine areas into sensible sampling units for detection surveys of waterborne diseases. In the aquatic environment it is difficult to manage disease at the animal level, hence management practices are often aimed at a group of animals sharing a similar risk. Epidemiological units offer a way to define these groups by a shared probability of exposure, derived from the modelled potential spread of a viral particle via coastal currents, and can thereby help inform management decisions.

  7. Spacesuit Radiation Shield Design Methods

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Anderson, Brooke M.; Cucinotta, Francis A.; Ware, J.; Zeitlin, Cary J.

    2006-01-01

    Meeting radiation protection requirements during EVA is predominantly an operational issue with some potential considerations for temporary shelter. The issue of spacesuit shielding is mainly guided by the potential of accidental exposure when operational and temporary shelter considerations fail to maintain exposures within operational limits. In this case, very high exposure levels are possible which could result in observable health effects and even be life threatening. Under these assumptions, potential spacesuit radiation exposures have been studied using known historical solar particle events to gain insight on the usefulness of modification of spacesuit design in which the control of skin exposure is a critical design issue and reduction of blood forming organ exposure is desirable. Transition to a new spacesuit design including soft upper-torso and reconfigured life support hardware gives an opportunity to optimize the next generation spacesuit for reduced potential health effects during an accidental exposure.

  8. Comparison of Methods to Account for Implausible Reporting of Energy Intake in Epidemiologic Studies

    PubMed Central

    Rhee, Jinnie J.; Sampson, Laura; Cho, Eunyoung; Hughes, Michael D.; Hu, Frank B.; Willett, Walter C.

    2015-01-01

In a recent article in the American Journal of Epidemiology by Mendez et al. (Am J Epidemiol. 2011;173(4):448–458), the use of alternative approaches to the exclusion of implausible energy intakes led to significantly different cross-sectional associations between diet and body mass index (BMI), whereas the use of a simpler recommended criterion (<500 and >3,500 kcal/day) yielded no meaningful change. However, these findings might have been due to exclusions based on weight, a primary determinant of BMI. Using data from 52,110 women in the Nurses' Health Study (1990), we reproduced the cross-sectional findings of Mendez et al. and compared the results from the recommended method with those from 2 weight-dependent alternative methods (the Goldberg method and the predicted total energy expenditure method). The same 3 exclusion criteria were then used to examine dietary variables prospectively in relation to change in BMI, which is not a direct function of attained weight. We found similar associations using the 3 methods. In a separate cross-sectional analysis using biomarkers of dietary factors, we found similar correlations for intakes of fatty acids (n = 439) and carotenoids and retinol (n = 1,293) using the 3 methods for exclusions. These results do not support the general conclusion that use of exclusion criteria based on the alternative methods might confer an advantage over the recommended exclusion method. PMID:25656533
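The exclusion rules being compared can be sketched as simple predicates. The fixed cutoffs come from the abstract; the Goldberg-style ratio bounds below are illustrative placeholders, not values fitted in the study:

```python
def plausible_fixed(kcal, low=500, high=3500):
    """Recommended fixed-cutoff rule: keep intakes in [500, 3500] kcal/day."""
    return low <= kcal <= high

def plausible_goldberg(kcal, bmr_kcal, low_ratio=1.1, high_ratio=2.5):
    """Goldberg-style weight-dependent rule: reported intake relative to
    basal metabolic rate must fall in a plausible range. The ratio bounds
    here are hypothetical placeholders for illustration."""
    ratio = kcal / bmr_kcal
    return low_ratio <= ratio <= high_ratio
```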

  10. Importance of Survey Design for Studying the Epidemiology of Emerging Tobacco Product Use Among Youth.

    PubMed

    Delnevo, Cristine D; Gundersen, Daniel A; Manderski, Michelle T B; Giovenco, Daniel P; Giovino, Gary A

    2017-03-22

    Accurate surveillance is critical for monitoring the epidemiology of emerging tobacco products in the United States, and survey science suggests that survey response format can impact prevalence estimates. We utilized data from the 2014 New Jersey Youth Tobacco Survey (n = 3,909) to compare estimates of the prevalence of 4 behaviors (ever hookah use, current hookah use, ever e-cigarette use, and current e-cigarette use) among New Jersey high school students, as assessed using "check-all-that-apply" questions, with estimates measured by means of "forced-choice" questions. Measurement discrepancies were apparent for all 4 outcomes, with the forced-choice questions yielding prevalence estimates approximately twice those of the check-all-that-apply questions, and agreement was fair to moderate. The sensitivity of the check-all-that-apply questions, treating the forced-choice format as the "gold standard," ranged from 38.1% (current hookah use) to 58.3% (ever e-cigarette use), indicating substantial false-negative rates. These findings highlight the impact of question response format on prevalence estimates of emerging tobacco products among youth and suggest that estimates generated by means of check-all-that-apply questions may be biased downward. Alternative survey designs should be considered to avoid check-all-that-apply response formats, and researchers should use caution when interpreting tobacco use data obtained from check-all-that-apply formats.
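The sensitivity figures (38.1% to 58.3%) treat the forced-choice answers as the gold standard; for binary response vectors this is simply the true-positive fraction. A minimal sketch with invented data:

```python
def sensitivity(check_all, forced):
    """Sensitivity of check-all-that-apply responses (0/1 per respondent),
    treating the forced-choice answers as the gold standard."""
    true_pos = sum(c and f for c, f in zip(check_all, forced))
    positives = sum(forced)
    return true_pos / positives
```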

  11. Algebraic Methods to Design Signals

    DTIC Science & Technology

    2015-08-27

    group theory are employed to investigate the theory of their construction methods leading to new families of these arrays and some generalizations...sequences and arrays with desirable correlation properties. The methods used are very algebraic and number theoretic. Many new families of sequences...context of optical quantum computing, we prove that infinite families of anticirculant block weighing matrices can be obtained from generic weighing

  12. Using genetic epidemiology to study Rett syndrome: the design of a case-control study.

    PubMed

    Leonard, H; Fyfe, S; Dye, D; Leonard, S

    2000-01-01

    Rett syndrome is a neurological disorder that is seen almost exclusively in females. Although generally considered to have a genetic basis, the underlying mechanism remains obscure. One favoured hypothesis is that the syndrome is an X-linked dominant disorder, lethal or non-expressed in males. Genealogical research has also suggested that the mode of transmission in Rett syndrome may involve a premutation which over several generations is converted to a full mutation. Geographical clustering has been reported, and it has also been proposed that Rett syndrome is a clinically variable condition and that other neurological disorders may be occurring more commonly in families with Rett syndrome. Other studies have found an apparent increase in intellectual disability and seizures in the extended families of girls with Rett syndrome. The science of genetic epidemiology can be used to identify familial aggregation, which is the clustering of a disorder within a family. We have used a case-control study design to investigate both fetal wastage and familial aggregation of other disorders in families of girls with Rett syndrome. The Australian Rett Syndrome Database provided the source of cases, and control probands were girls of a similar age with normal development. This paper describes the methodology for a case-control study of this rare condition using pedigree data and discusses issues in the collection and evaluation of such data. The use of a control population is an important feature. Both the strengths and the shortcomings of our design are identified, and recommendations are made for future research.

  13. Design for validation, based on formal methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1990-01-01

Validation of ultra-reliable systems decomposes into two subproblems: (1) quantification of the probability of system failure due to physical failure; (2) establishing that design errors are not present. Methods of design, testing, and analysis of ultra-reliable software are discussed. It is concluded that a design-for-validation approach based on formal methods is needed for the digital flight control systems problem, and that formal methods will play a major role in the development of future high-reliability digital systems.

  14. Design Methods for Clinical Systems

    PubMed Central

    Blum, B.I.

    1986-01-01

    This paper presents a brief introduction to the techniques, methods and tools used to implement clinical systems. It begins with a taxonomy of software systems, describes the classic approach to development, provides some guidelines for the planning and management of software projects, and finishes with a guide to further reading. The conclusions are that there is no single right way to develop software, that most decisions are based upon judgment built from experience, and that there are tools that can automate some of the better understood tasks.

  15. Some epidemiologic, clinical, microbiologic, and organizational assumptions that influenced the design and performance of the Global Enteric Multicenter Study (GEMS).

    PubMed

    Farag, Tamer H; Nasrin, Dilruba; Wu, Yukun; Muhsen, Khitam; Blackwelder, William C; Sommerfelt, Halvor; Panchalingam, Sandra; Nataro, James P; Kotloff, Karen L; Levine, Myron M

    2012-12-01

    The overall aim of the Global Enteric Multicenter Study-1 (GEMS-1) is to identify the etiologic agents associated with moderate-to-severe diarrhea (MSD) among children <5 years of age, and thereby the attributable pathogen-specific population-based incidence of MSD, to guide investments in research and public health interventions against diarrheal disease. To accomplish this, 9 core assumptions were vetted through widespread consultation: (1) a limited number of etiologic agents may be responsible for most MSD; (2) a definition of MSD can be crafted that encompasses cases that might otherwise be fatal in the community without treatment; (3) MSD seen at sentinel centers is a proxy for fatal diarrheal disease in the community; (4) matched case/control is the appropriate epidemiologic design; (5) methods across the sites can be standardized and rigorous quality control maintained; (6) a single 60-day postenrollment visit to case and control households creates mini-cohorts, allowing comparisons; (7) broad support for GEMS-1 messages can be achieved by incorporating advice from public health spokespersons; (8) results will facilitate the setting of investment and intervention priorities; and (9) wide acceptance and dissemination of the GEMS-1 results can be achieved.

  17. A robust method for iodine status determination in epidemiological studies by capillary electrophoresis.

    PubMed

    de Macedo, Adriana Nori; Teo, Koon; Mente, Andrew; McQueen, Matthew J; Zeidler, Johannes; Poirier, Paul; Lear, Scott A; Wielgosz, Andy; Britz-McKibbin, Philip

    2014-10-21

Iodine deficiency is the most common preventable cause of intellectual disabilities in children. Global health initiatives to ensure optimum nutrition thus require continuous monitoring of population-wide iodine intake as determined by urinary excretion of iodide. Current methods to analyze urinary iodide are limited by complicated sample pretreatment, costly infrastructure, and/or poor selectivity, posing restrictions to large-scale epidemiological studies. We describe a simple yet selective method to analyze iodide in volume-restricted human urine specimens stored in biorepositories by capillary electrophoresis (CE) with UV detection. Excellent selectivity is achieved when using an acidic background electrolyte in conjunction with dynamic complexation via α-cyclodextrin in an unmodified fused-silica capillary under reversed polarity. Sample self-stacking is developed as a novel online sample preconcentration method to boost sensitivity, with submicromolar detection limits for iodide (S/N ≈ 3, 0.06 μM) directly in urine. This assay also allows for simultaneous analysis of environmental iodide uptake inhibitors, including thiocyanate and nitrate. Rigorous method validation confirmed good linearity (R² = 0.9998), dynamic range (0.20 to 4.0 μM), accuracy (average recovery of 93% at three concentration levels) and precision for reliable iodide determination in pooled urine specimens over 29 days of analysis (RSD = 11%, n = 87).
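The accuracy and precision figures quoted (93% recovery, RSD = 11%) correspond to standard validation formulas, sketched here for illustration only; the assay's raw data are not reproduced:

```python
import statistics

def percent_recovery(measured, spiked):
    """Mean recovery (%) of measured vs. spiked (known) concentrations."""
    return 100.0 * statistics.mean(m / s for m, s in zip(measured, spiked))

def percent_rsd(values):
    """Relative standard deviation (%), the precision metric quoted."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```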

  18. Evaluation and validity of a polymerase chain reaction-based open reading frame typing method to dissect the molecular epidemiology for Acinetobacter baumannii in an epidemiologic study of a hospital outbreak.

    PubMed

    Fujikura, Yuji; Yuki, Atsushi; Hamamoto, Takaaki; Ichimura, Sadahiro; Kawana, Akihiko; Ohkusu, Kiyofumi; Matsumoto, Tetsuya

    2016-11-01

    Acinetobacter baumannii is regarded as one of the most important pathogens in hospital outbreaks. To obtain an efficient and simple epidemiologic method of surveillance during outbreaks, we assessed the applicability of the polymerase chain reaction-based open reading frames typing (POT) method and compared it with pulsed-field gel electrophoresis. The POT method was found to have sufficient discriminatory power to identify the strains and would be widely applicable to epidemiologic surveillance during hospital outbreaks.

  19. Methods for combinatorial and parallel library design.

    PubMed

    Schnur, Dora M; Beno, Brett R; Tebben, Andrew J; Cavallaro, Cullen

    2011-01-01

Diversity has historically played a critical role in the design of combinatorial libraries, screening sets and corporate collections for lead discovery. Large library design dominated the field in the 1990s, with methods ranging from purely arbitrary through property-based reagent selection to product-based approaches. In recent years, however, there has been a downward trend in library size. This was due to increased information about the desirable targets gleaned from the genomics revolution and to the ever-growing availability of target protein structures from crystallography and homology modeling. Creation of libraries directed toward families of receptors such as GPCRs, kinases, nuclear hormone receptors, proteases, etc., replaced the generation of libraries based primarily on diversity, while single-target focused library design has remained an important objective. Concurrently, computing grids and CPU clusters have facilitated the development of structure-based tools that screen hundreds of thousands of molecules. Smaller, "smarter" combinatorial and focused parallel libraries replaced those early unfocused large libraries in the twenty-first-century drug design paradigm. While diversity still plays a role in lead discovery, the focus of current library design methods has shifted to receptor-based methods, scaffold hopping/bio-isostere searching, and a much-needed emphasis on synthetic feasibility. Methods such as privileged-substructure-based design and pharmacophore-based design remain important for parallel and small combinatorial library design. This chapter discusses some of the possible design methods and presents examples where they are available.

  20. Optical fingerprinting in bacterial epidemiology: Raman spectroscopy as a real-time typing method.

    PubMed

    Willemse-Erix, Diana F M; Scholtes-Timmerman, Maarten J; Jachtenberg, Jan-Willem; van Leeuwen, Willem B; Horst-Kreft, Deborah; Bakker Schut, Tom C; Deurenberg, Ruud H; Puppels, Gerwin J; van Belkum, Alex; Vos, Margreet C; Maquelin, Kees

    2009-03-01

    Hospital-acquired infections (HAI) increase morbidity and mortality and constitute a high financial burden on health care systems. An effective weapon against HAI is early detection of potential outbreaks and sources of contamination. Such monitoring requires microbial typing with sufficient reproducibility and discriminatory power. Here, a microbial-typing method is presented, based on Raman spectroscopy. This technique provides strain-specific optical fingerprints in a few minutes instead of several hours to days, as is the case with genotyping methods. Although the method is generally applicable, we used 118 Staphylococcus aureus isolates to illustrate that the discriminatory power matches that of established genotyping techniques (numerical index of diversity [D]=0.989) and that concordance with the gold standard (pulsed-field gel electrophoresis) is high (95%). The Raman clustering of isolates was reproducible to the strain level for five independent cultures, despite the various culture times from 18 h to 24 h. Furthermore, this technique was able to classify stored (-80 degrees C) and recent isolates of a methicillin-resistant Staphylococcus aureus-colonized individual during surveillance studies and did so days earlier than established genotyping techniques did. Its high throughput and ease of use make it suitable for use in routine diagnostic laboratory settings. This will set the stage for continuous, automated, real-time epidemiological monitoring of bacterial infections in a hospital, which can then be followed by timely corrective action by infection prevention teams.
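The "numerical index of diversity" D = 0.989 quoted above is Simpson's index of discrimination (in the Hunter-Gaston form commonly used for typing methods): the probability that two isolates drawn at random fall into different type clusters. A minimal sketch:

```python
def diversity_index(cluster_sizes):
    """Simpson's index of discrimination for a typing method, given the
    number of isolates assigned to each type cluster."""
    n = sum(cluster_sizes)
    return 1.0 - sum(c * (c - 1) for c in cluster_sizes) / (n * (n - 1))
```

A method that gives every isolate its own type scores D = 1; one that lumps all isolates together scores D = 0.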

  1. Applications of a transonic wing design method

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Smith, Leigh A.

    1989-01-01

    A method for designing wings and airfoils at transonic speeds using a predictor/corrector approach was developed. The procedure iterates between an aerodynamic code, which predicts the flow about a given geometry, and the design module, which compares the calculated and target pressure distributions and modifies the geometry using an algorithm that relates differences in pressure to a change in surface curvature. The modular nature of the design method makes it relatively simple to couple it to any analysis method. The iterative approach allows the design process and aerodynamic analysis to converge in parallel, significantly reducing the time required to reach a final design. Viscous and static aeroelastic effects can also be accounted for during the design or as a post-design correction. Results from several pilot design codes indicated that the method accurately reproduced pressure distributions as well as the coordinates of a given airfoil or wing by modifying an initial contour. The codes were applied to supercritical as well as conventional airfoils, forward- and aft-swept transport wings, and moderate-to-highly swept fighter wings. The design method was found to be robust and efficient, even for cases having fairly strong shocks.
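The predictor/corrector iteration described above can be sketched as a generic loop; `analyze` and `update_geometry` are caller-supplied stand-ins for the aerodynamic code and the pressure-to-curvature design module, which are not reproduced here:

```python
def design_iterate(analyze, update_geometry, geometry, target_cp,
                   tol=1e-6, max_iter=100):
    """Predictor/corrector design loop in the spirit of the abstract:
    the analysis predicts pressures for the current geometry, and the
    design module converts the pressure mismatch into a geometry change."""
    for _ in range(max_iter):
        cp = analyze(geometry)                      # predictor step
        residual = max(abs(c - t) for c, t in zip(cp, target_cp))
        if residual < tol:
            break
        geometry = update_geometry(geometry, cp, target_cp)  # corrector step
    return geometry
```

With a toy identity "analysis" and an under-relaxed update, the loop converges to the target distribution, mirroring the parallel convergence of analysis and design noted in the abstract.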

  2. The genetic study of three population microisolates in South Tyrol (MICROS): study design and epidemiological perspectives

    PubMed Central

    Pattaro, Cristian; Marroni, Fabio; Riegler, Alice; Mascalzoni, Deborah; Pichler, Irene; Volpato, Claudia B; Dal Cero, Umberta; De Grandi, Alessandro; Egger, Clemens; Eisendle, Agatha; Fuchsberger, Christian; Gögele, Martin; Pedrotti, Sara; Pinggera, Gerd K; Stefanov, Stefan A; Vogl, Florian D; Wiedermann, Christian J; Meitinger, Thomas; Pramstaller, Peter P

    2007-01-01

Background There is increasing evidence of the important role that small, isolated populations could play in finding genes involved in the etiology of diseases. For historical and political reasons, South Tyrol, the northernmost Italian region, includes several villages of small size which remained isolated over the centuries. Methods The MICROS study is a population-based survey of three small, isolated villages, characterized by: old settlement; a small number of founders; high endogamy rates; and slow/null population expansion. During stage 1 (2002/03), genealogical data, screening questionnaires, clinical measurements, blood and urine samples, and DNA were collected for 1175 adult volunteers. Stage 2, concerning trait diagnoses, linkage analysis and association studies, is ongoing. The selection of traits is being driven by expert clinicians. Preliminary descriptive statistics were obtained. Power simulations for finding linkage on a quantitative trait locus (QTL) were undertaken. Results Starting from the participants, genealogies were reconstructed for 50,037 subjects, going back to the early 1600s. Within the last five generations, subjects were clustered in one pedigree of 7049 subjects plus 178 smaller pedigrees (3 to 85 subjects each). A significant probability of familial clustering was assessed for many traits, especially among the cardiovascular, neurological and respiratory traits. Simulations showed that the MICROS pedigree has substantial power to detect a LOD score ≥ 3 when the QTL-specific heritability is ≥ 20%. Conclusion The MICROS study is an extensive, ongoing, two-stage survey aimed at characterizing the genetic epidemiology of Mendelian and complex diseases. Our approach, involving different scientific disciplines, is an advantageous strategy to define and to study population isolates. The isolation of the Alpine populations, together with the extensive data collected so far, make the MICROS study a powerful resource for genetic studies of Mendelian and complex diseases.

  3. Variable genetic element typing: a quick method for epidemiological subtyping of Legionella pneumophila.

    PubMed

    Pannier, K; Heuner, K; Lück, C

    2010-04-01

A total of 57 isolates of Legionella pneumophila were randomly selected from the German National Legionella strain collection and typed by monoclonal antibody subgrouping, the seven-gene-locus sequence-based typing (SBT) scheme and a newly developed variable genetic element typing (VET) system based on the presence or absence of ten variable genetic elements. These elements were detected while screening a genomic library of strain Corby, as well as being taken from published data for PAI-1 (pathogenicity island) from strain Philadelphia. Specific primers were designed and used in gel-based polymerase chain reaction (PCR) assays. PCR amplification of the mip gene served as a control. The end-point was the presence/absence of a PCR product on an ethidium bromide-stained gel. In the present study, the index of discrimination was somewhat lower than that of SBT (0.87 versus 0.97). Nevertheless, the results obtained showed, as a proof of principle, that this simple and quick typing assay might be useful for the epidemiological characterisation of L. pneumophila strains.

  4. Impeller blade design method for centrifugal compressors

    NASA Technical Reports Server (NTRS)

    Jansen, W.; Kirschner, A. M.

    1974-01-01

    The design of a centrifugal impeller with blades that are aerodynamically efficient, easy to manufacture, and mechanically sound is discussed. The blade design method described here satisfies the first two criteria and with a judicious choice of certain variables will also satisfy stress considerations. The blade shape is generated by specifying surface velocity distributions and consists of straight-line elements that connect points at hub and shroud. The method may be used to design radially elemented and backward-swept blades. The background, a brief account of the theory, and a sample design are described.

  5. Environmental epidemiology

    SciTech Connect

    Kopfler, F.C.; Craun, G.F.

    1986-01-01

    This volume is a compendium of peer-reviewed papers presented at the Symposium on Exposure Measurement and Evaluation Methods for Epidemiology, cosponsored in 1985 by the Health Effects Research Laboratory, USEPA, and the Division of Environmental Chemistry of the American Chemical Society. The book is divided into four sections: Use of Biological Monitoring to Assess Exposure, Epidemiologic Considerations for Assessing Exposure, Health and Exposure Data Bases, and Assessment of Exposure to Environmental Contaminants for Epidemiologic Studies. Both background papers and detailed reports of human studies are presented. The Biological Monitoring section contains reports of efforts to quantify adducts in blood and urine samples. In the section on Epidemiologic Considerations the feasibility of conducting epidemiologic studies of persons residing near hazardous waste sites and those exposed to arsenic in drinking water is described. The review of Data Bases includes government and industry water quality monitoring systems, the FDA Market Basket Study, major EPA air monitoring data, the National Database on Body Burden of Toxic chemicals, and the National Human Adipose Tissue Survey. Methods of assessing current exposure and estimating past exposure are detailed in the final section. Exposure to trichloroethylene in shower water, the relationship between water quality and cardiovascular disease, the contribution of environmental lead exposures to pediatric blood lead levels, and data from the TEAM study in which researchers compare indoor, outdoor, and breath analysis of air pollutant exposures are also discussed.

  6. RADRUE METHOD FOR RECONSTRUCTION OF EXTERNAL PHOTON DOSES TO CHERNOBYL LIQUIDATORS IN EPIDEMIOLOGICAL STUDIES

    PubMed Central

    Kryuchkov, Victor; Chumak, Vadim; Maceika, Evaldas; Anspaugh, Lynn R.; Cardis, Elisabeth; Bakhanova, Elena; Golovanov, Ivan; Drozdovitch, Vladimir; Luckyanov, Nickolas; Kesminiene, Ausrele; Voillequé, Paul; Bouville, André

    2010-01-01

Between 1986 and 1990, several hundred thousand workers, called “liquidators” or “clean-up workers”, took part in decontamination and recovery activities within the 30-km zone around the Chernobyl nuclear power plant in Ukraine, where a major accident occurred in April 1986. The Chernobyl liquidators were mainly exposed to external ionizing radiation levels that depended primarily on their work locations and the time after the accident when the work was performed. Because individual doses were often monitored inadequately or were not monitored at all for the majority of liquidators, a new method of photon (i.e. gamma and x-rays) dose assessment, called “RADRUE” (Realistic Analytical Dose Reconstruction with Uncertainty Estimation), was developed to obtain unbiased and reasonably accurate estimates for use in three epidemiologic studies of hematological malignancies and thyroid cancer among liquidators. The RADRUE program implements a time-and-motion dose reconstruction method that is flexible and conceptually easy to understand. It includes a large exposure rate database and interpolation and extrapolation techniques to calculate exposure rates at places where liquidators lived and worked within ~70 km of the destroyed reactor. The RADRUE technique relies on data collected from subjects’ interviews conducted by trained interviewers, and on expert dosimetrists to interpret the information and provide supplementary information, when necessary, based upon their own Chernobyl experience. The RADRUE technique was used to estimate doses from external irradiation, as well as uncertainties, to the bone marrow for 929 subjects and to the thyroid gland for 530 subjects enrolled in epidemiologic studies. Individual bone-marrow dose estimates were found to range from less than one μGy to 3,300 mGy, with an arithmetic mean of 71 mGy. Individual thyroid dose estimates were lower and ranged from 20 μGy to 507 mGy, with an arithmetic mean of 29 mGy.
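At its core, RADRUE's time-and-motion idea reduces to summing exposure rate × residence time over a liquidator's itinerary. The sketch below is a schematic of that core only; `exposure_rate` stands in for RADRUE's exposure-rate database with its interpolation and extrapolation machinery, which is not reproduced here:

```python
def time_and_motion_dose(itinerary, exposure_rate):
    """Schematic time-and-motion dose reconstruction: total dose is the
    sum over itinerary segments of the exposure rate at that place and
    time (mGy/h), multiplied by the hours spent there.

    `itinerary` is a list of (place, time, hours) tuples;
    `exposure_rate` is a caller-supplied lookup function."""
    return sum(exposure_rate(place, time) * hours
               for place, time, hours in itinerary)
```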

  7. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.

  8. Mixed Methods Research Designs in Counseling Psychology

    ERIC Educational Resources Information Center

    Hanson, William E.; Creswell, John W.; Clark, Vicki L. Plano; Petska, Kelly S.; Creswell, David J.

    2005-01-01

    With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods…

  9. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  10. Iterative methods for design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Belegundu, A. D.; Yoon, B. G.

    1989-01-01

    A numerical method is presented for design sensitivity analysis, using an iterative-method reanalysis of the structure generated by a small perturbation in the design variable; a forward-difference scheme is then employed to obtain the approximate sensitivity. Algorithms are developed for displacement and stress sensitivities, as well as for eigenvalue and eigenvector sensitivities, and the iterative schemes are modified so that the coefficient matrices are constant and therefore decomposed only once.
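
A minimal sketch of that scheme, under stated assumptions (a two-spring stiffness model linear in the design variable x, and a simple stationary iteration as the reanalysis; the paper's actual algorithms may differ):

```python
import numpy as np

def stiffness(x):
    # illustrative 2-DOF spring model: stiffnesses x and 2x
    return np.array([[3.0 * x, -2.0 * x],
                     [-2.0 * x,  2.0 * x]])

f = np.array([0.0, 1.0])   # applied load
x0, h = 1.0, 1e-6          # design variable and perturbation

K0_inv = np.linalg.inv(stiffness(x0))   # coefficient matrix "decomposed" once
u0 = K0_inv @ f                         # baseline displacements

# Iterative reanalysis of the perturbed structure, reusing K0_inv instead of
# refactorizing the perturbed matrix:
K1 = stiffness(x0 + h)
u = u0.copy()
for _ in range(25):
    u += K0_inv @ (f - K1 @ u)          # stationary-iteration correction step

du_dx = (u - u0) / h                    # forward-difference sensitivity
```

For this model u(x) = [1/x, 1.5/x], so the exact sensitivity at x0 = 1 is [-1, -1.5], which the forward difference approximates to O(h).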

  11. Genetic diversity of Bacillus anthracis in Europe: genotyping methods in forensic and epidemiologic investigations.

    PubMed

    Derzelle, Sylviane; Thierry, Simon

    2013-09-01

    Bacillus anthracis, the etiological agent of anthrax, a zoonosis relatively common throughout the world, can be used as an agent of bioterrorism. In naturally occurring outbreaks and in criminal release of this pathogen, a fast and accurate diagnosis is crucial to an effective response. Microbiological forensics and epidemiologic investigations increasingly rely on molecular markers, such as polymorphisms in DNA sequence, to obtain reliable information regarding the identification or source of a suspicious strain. Over the past decade, significant research efforts have been undertaken to develop genotyping methods with increased power to differentiate B. anthracis strains. A growing number of DNA signatures have been identified and used to survey B. anthracis diversity in nature, leading to rapid advances in our understanding of the global population of this pathogen. This article provides an overview of the different phylogenetic subgroups distributed across the world, with a particular focus on Europe. Updated information on the anthrax situation in Europe is reported. A brief description of some of the work in progress in the work package 5.1 of the AniBioThreat project is also presented, including (1) the development of a robust typing tool based on a suspension array technology and multiplexed single nucleotide polymorphisms scoring and (2) the typing of a collection of DNA from European isolates exchanged between the partners of the project. The know-how acquired will contribute to improving the EU's ability to react rapidly when the identity and real origin of a strain need to be established.

  12. Preliminary aerothermodynamic design method for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Harloff, G. J.; Petrie, S. L.

    1987-01-01

    Preliminary design methods are presented for vehicle aerothermodynamics. Predictions are made for Shuttle orbiter, a Mach 6 transport vehicle and a high-speed missile configuration. Rapid and accurate methods are discussed for obtaining aerodynamic coefficients and heat transfer rates for laminar and turbulent flows for vehicles at high angles of attack and hypersonic Mach numbers.

  13. Design and implementation of security in a data collection system for epidemiology.

    PubMed

    Ainsworth, John; Harper, Robert; Juma, Ismael; Buchan, Iain

    2006-01-01

    Health informatics can benefit greatly from the e-Science approach, which is characterised by large scale distributed resource sharing and collaboration. Ensuring the privacy and confidentiality of data has always been the first requirement of health informatics systems. The PsyGrid data collection system addresses both, providing secure distributed data collection for epidemiology. We have used Grid-computing approaches and technologies to address this problem. We describe the architecture and implementation of the security sub-system in detail.

  14. Phene Plate (PhP) biochemical fingerprinting. A screening method for epidemiological typing of enterococcal isolates.

    PubMed

    Saeedi, B; Tärnberg, M; Gill, H; Hällgren, A; Jonasson, J; Nilsson, L E; Isaksson, B; Kühn, I; Hanberger, H

    2005-09-01

    Pulsed-field gel electrophoresis (PFGE) is currently considered the gold standard for genotyping of enterococci. However, PFGE is both expensive and time-consuming. The purpose of this study was to investigate whether the PhP system can be used as a reliable clinical screening method for detection of genetically related isolates of enterococci. If so, it should be possible to minimize the number of isolates subjected to PFGE typing, which would save time and money. Ninety-nine clinical enterococcal isolates were analysed by PhP (similarity levels 0.90-0.975) and PFGE (similarity levels ≤3 and ≤6 bands) and all possible pairs of isolates were cross-classified as matched or mismatched. We found that the probability that a pair of isolates (A and B) of the same type according to PhP also belong to the same cluster according to PFGE, i.e. p(A_PFGE = B_PFGE | A_PhP = B_PhP), and the probability that a pair of isolates of different types according to PhP also belong to different clusters according to PFGE, i.e. p(A_PFGE ≠ B_PFGE | A_PhP ≠ B_PhP), were relatively high for E. faecalis (0.86 and 0.96, respectively), but lower for E. faecium (0.51 and 0.77, respectively). The concordance, i.e. the probability that PhP and PFGE agree on a match or mismatch, was 86%-93% for E. faecalis and 54%-66% for E. faecium, which indicates that the PhP method may be useful for epidemiological typing of E. faecalis in the current settings but not for E. faecium.
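
The pairwise cross-classification behind these probabilities is straightforward to compute. A sketch with hypothetical type assignments (not data from the study):

```python
from itertools import combinations

def concordance(typing_a, typing_b):
    """Fraction of isolate pairs on which two typing methods agree that the
    pair matches (same type) or mismatches (different types)."""
    agree = total = 0
    for i, j in combinations(sorted(typing_a), 2):
        match_a = typing_a[i] == typing_a[j]
        match_b = typing_b[i] == typing_b[j]
        agree += (match_a == match_b)
        total += 1
    return agree / total

# Hypothetical PhP types and PFGE clusters for four isolates:
php  = {"s1": "T1", "s2": "T1", "s3": "T2", "s4": "T3"}
pfge = {"s1": "C1", "s2": "C1", "s3": "C1", "s4": "C2"}
conc = concordance(php, pfge)
```

The conditional probabilities reported in the abstract follow the same pattern, restricting the denominator to pairs that match (or mismatch) under PhP.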

  15. [Dermato-epidemiology].

    PubMed

    Apfelbacher, C J; Diepgen, T L; Weisshaar, E

    2011-11-01

    Dermato-epidemiology is an important scientific discipline which investigates skin diseases using epidemiological methods. Epidemiology is the science of the distribution and determinants of disease in specified populations. We describe fundamental terms of dermato-epidemiology (measures of disease occurrence, measures of risk), different study types (observational studies, interventional studies), the selection of statistical tests, bias and confounding as well as the principles of evidence-based dermatology, and give illustrative examples.

  16. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools

    PubMed Central

    2014-01-01

    Background: The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Methods: Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular with respect to its intended use, types of questions and answers, presence/absence of a quality score, and whether a validation was performed. Results: In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. Conclusions: The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision-making.

  17. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture, which involves a tight coupling between optimization and analysis, is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process to permit parallel design and to simplify interdisciplinary communication requirements.

  18. Multidisciplinary Optimization Methods for Preliminary Design

    NASA Technical Reports Server (NTRS)

    Korte, J. J.; Weston, R. P.; Zang, T. A.

    1997-01-01

    An overview of multidisciplinary optimization (MDO) methodology and two applications of this methodology to the preliminary design phase are presented. These applications are being undertaken to improve, develop, validate and demonstrate MDO methods. Each is presented to illustrate different aspects of this methodology. The first application is an MDO preliminary design problem for defining the geometry and structure of an aerospike nozzle of a linear aerospike rocket engine. The second application demonstrates the use of the Framework for Interdisciplinary Design Optimization (FIDO), a computational environment system, by solving a preliminary design problem for a High-Speed Civil Transport (HSCT). The two sample problems illustrate the advantages of performing preliminary design with an MDO process.

  19. Age-Based Methods to Explore Time-Related Variables in Occupational Epidemiology Studies

    SciTech Connect

    Watkins, Janice P.; Frome, Edward L.; Cragle, Donna L.

    2005-08-31

    Although age is recognized as the strongest predictor of mortality in chronic disease epidemiology, a calendar-based approach is often employed when evaluating time-related variables. An age-based analysis file, created by determining the value of each time-dependent variable for each age that a cohort member is followed, provides a clear definition of age at exposure and allows development of diverse analytic models. To demonstrate the methods, the relationship between cancer mortality and external radiation was analyzed with Poisson regression for 14,095 Oak Ridge National Laboratory workers. Based on previous analysis of this cohort, a model with ten-year lagged cumulative radiation doses partitioned by receipt before (dose-young) or after (dose-old) age 45 was examined. Dose-response estimates were similar to calendar-year-based results, with elevated risk for dose-old, but not when film badge readings were weekly before 1957. Complementary results showed increasing risk with older hire ages and earlier birth cohorts, since workers hired after age 45 were born before 1915, and dose-young and dose-old were distributed differently by birth cohorts. Risks were generally higher for smoking-related than non-smoking-related cancers. It was difficult to single out specific variables associated with elevated cancer mortality because of: (1) birth cohort differences in hire age and mortality experience completeness, and (2) time-period differences in working conditions, dose potential, and exposure assessment. This research demonstrated the utility and versatility of the age-based approach.
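
A sketch of how such an age-based analysis file might be assembled (the record layout, field names, and dose history are illustrative assumptions, not the study's actual data structures):

```python
def age_based_records(birth_year, hire_year, end_year, yearly_dose, lag=10):
    """One record per attained age during follow-up, carrying the value of a
    time-dependent variable (lagged cumulative dose) at that age."""
    records = []
    for year in range(hire_year, end_year + 1):
        # cumulative dose received up to `lag` years before this record's year
        lagged = sum(d for y, d in yearly_dose.items() if y <= year - lag)
        records.append({"age": year - birth_year,
                        "year": year,
                        "lagged_dose": lagged})
    return records

# Hypothetical worker: born 1920, hired 1950, followed through 1965,
# with monitored doses (mGy) in the first three employment years.
recs = age_based_records(1920, 1950, 1965, {1950: 5.0, 1951: 5.0, 1952: 2.0})
```

Records like these, further split by dose received before or after age 45, could then feed a Poisson regression that uses age rather than calendar year as the time scale.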

  20. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. The approach also weighs operational options by their effect on upstream design variables, so that linkages between operations and those variables can be established readily yet defensibly. To avoid the range of problems that have defeated previous methods of dealing with the complexities of transportation design, and to reduce inefficient use of resources, the method identifies the areas of sufficient promise and provides a higher grade of analysis for those issues, as well as for the linkages between operations and other factors. Ultimately, the system is designed to save resources and time, and allows for the evolution of operable space transportation system technology, design, and conceptual system approach targets.
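
One concrete reading of the DSM idea: tasks are the matrix rows/columns, an entry marks a dependence, and an ordering of the matrix yields a feasible analysis sequence. The tasks and couplings below are invented for illustration:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# DSM as a mapping: task -> set of upstream tasks whose outputs it consumes.
dsm = {
    "performance": set(),
    "cost":        {"performance"},
    "operations":  {"performance", "cost"},
    "reliability": {"performance"},
}

# A topological order is a feasible execution sequence; a cycle would flag
# coupled tasks that must be iterated together.
order = list(TopologicalSorter(dsm).static_order())
```

Real DSM tools go further, reordering rows and columns to cluster coupled blocks, but the dependency bookkeeping is the same.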

  1. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.

  2. Optimization methods applied to hybrid vehicle design

    NASA Technical Reports Server (NTRS)

    Donoghue, J. F.; Burghart, J. H.

    1983-01-01

    The use of optimization methods as an effective design tool in the design of hybrid vehicle propulsion systems is demonstrated. Optimization techniques were used to select values for three design parameters (battery weight, heat engine power rating and power split between the two on-board energy sources) such that various measures of vehicle performance (acquisition cost, life cycle cost and petroleum consumption) were optimized. The approach produced designs which were often significant improvements over hybrid designs already reported in the literature. The principal conclusions are as follows. First, it was found that the strategy used to split the required power between the two on-board energy sources can have a significant effect on life cycle cost and petroleum consumption. Second, the optimization program should be constructed so that performance measures and design variables can be easily changed. Third, the vehicle simulation program has a significant effect on the computer run time of the overall optimization program; run time can be significantly reduced by proper design of the types of trips the vehicle takes in a one year period. Fourth, care must be taken in designing the cost and constraint expressions which are used in the optimization so that they are relatively smooth functions of the design variables. Fifth, proper handling of constraints on battery weight and heat engine rating, variables which must be large enough to meet power demands, is particularly important for the success of an optimization study. Finally, and most importantly, optimization methods provide a practical tool for carrying out the design of a hybrid vehicle propulsion system.
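
A toy version of this three-parameter search, with an invented cost model and power constraint standing in for the paper's vehicle simulation (all coefficients are assumptions for illustration only):

```python
from itertools import product

def life_cycle_cost(battery_kg, engine_kw, split):
    # invented model: acquisition cost plus a petroleum term that falls
    # with engine rating and battery share
    return 8.0 * battery_kg + 120.0 * engine_kw \
        + 2000.0 * (1.0 - split) / engine_kw

def feasible(battery_kg, engine_kw, split, demand_kw=60.0):
    # battery and engine together must meet peak power demand (assumed model)
    return 0.2 * battery_kg * split + engine_kw >= demand_kw

designs = product(range(100, 501, 50),   # battery weight, kg
                  range(10, 81, 10),     # heat engine rating, kW
                  (0.25, 0.5, 0.75))     # fraction of power from the battery
best = min((d for d in designs if feasible(*d)),
           key=lambda d: life_cycle_cost(*d))
```

The constraint-handling point in the abstract shows up here directly: dropping `feasible` would drive the search to designs too small to meet the power demand.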

  3. Computer-Aided Drug Design Methods.

    PubMed

    Yu, Wenbo; MacKerell, Alexander D

    2017-01-01

    Computational approaches are useful tools to interpret and guide experiments to expedite the antibiotic drug design process. Structure-based drug design (SBDD) and ligand-based drug design (LBDD) are the two general types of computer-aided drug design (CADD) approaches in existence. SBDD methods analyze macromolecular target 3-dimensional structural information, typically of proteins or RNA, to identify key sites and interactions that are important for their respective biological functions. Such information can then be utilized to design antibiotic drugs that can compete with essential interactions involving the target and thus interrupt the biological pathways essential for survival of the microorganism(s). LBDD methods focus on known antibiotic ligands for a target to establish a relationship between their physiochemical properties and antibiotic activities, referred to as a structure-activity relationship (SAR), information that can be used for optimization of known drugs or to guide the design of new drugs with improved activity. In this chapter, standard CADD protocols for both SBDD and LBDD will be presented with a special focus on methodologies and targets routinely studied in our laboratory for antibiotic drug discovery.

  4. Methods and Technologies Branch (MTB)

    Cancer.gov

    The Methods and Technologies Branch focuses on methods to address epidemiologic data collection, study design and analysis, and to modify technological approaches to better understand cancer susceptibility.

  5. Epidemiological causality.

    PubMed

    Morabia, Alfredo

    2005-01-01

    Epidemiological methods, which combine population thinking and group comparisons, can primarily identify causes of disease in populations. There is therefore a tension between our intuitive notion of a cause, which we want to be deterministic and invariant at the individual level, and the epidemiological notion of causes, which are invariant only at the population level. Epidemiologists have heretofore given a pragmatic solution to this tension. Causal inference in epidemiology consists in checking the logical coherence of a causality statement and determining whether what has been found grossly contradicts what we think we already know: how strong is the association? Is there a dose-response relationship? Does the cause precede the effect? Is the effect biologically plausible? Etc. This approach to causal inference can be traced back to the English philosophers David Hume and John Stuart Mill. On the other hand, the mode of establishing causality devised by Jakob Henle and Robert Koch, which has been fruitful in bacteriology, requires that in every instance the effect invariably follows the cause (e.g., inoculation of the Koch bacillus and tuberculosis). This is incompatible with epidemiological causality, which has to deal with probabilistic effects (e.g., smoking and lung cancer) and is therefore invariant only at the population level.

  6. MAST Propellant and Delivery System Design Methods

    NASA Technical Reports Server (NTRS)

    Nadeem, Uzair; Mc Cleskey, Carey M.

    2015-01-01

    A Mars Aerospace Taxi (MAST) concept and propellant storage and delivery case study is undergoing investigation by NASA's Element Design and Architectural Impact (EDAI) design and analysis forum. The MAST lander concept envisions landing with its ascent propellant storage tanks empty and supplying these reusable Mars landers with propellant that is generated and transferred while on the Mars surface. The report provides an overview of the data derived from modeling different methods of propellant line routing (or "lining") and differentiates the resulting design and operations complexity of fluid and gaseous paths for a given set of fluid sources and destinations. The EDAI team desires a rough-order-of-magnitude algorithm for estimating the lining characteristics (i.e., the plumbing mass and complexity) associated with different numbers of vehicle propellant sources and destinations. This paper explores the feasibility of preparing a mathematically sound algorithm for this purpose, and offers a method for the EDAI team to implement.

  7. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

    Research committed by the Langley Research Center through 1995 resulting in the HZETRN code provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  8. An optimisation method for complex product design

    NASA Astrophysics Data System (ADS)

    Li, Ni; Yi, Wenqing; Bi, Zhuming; Kong, Haipeng; Gong, Guanghong

    2013-11-01

    Designing a complex product such as an aircraft usually requires both qualitative and quantitative data and reasoning. To assist the design process, a critical issue is how to represent qualitative data and utilise it in the optimisation. In this study, a new method is proposed for the optimal design of complex products: to make full use of available data, information and knowledge, qualitative reasoning is integrated into the optimisation process. The transformation and fusion of qualitative and quantitative data are achieved via fuzzy set theory and a cloud model. To shorten the design process, parallel computing is implemented to solve the formulated optimisation problems, and a parallel adaptive hybrid algorithm (PAHA) has been proposed. The performance of the new algorithm has been verified through a comparison of its results with those of two other existing algorithms. Further, PAHA has been applied to determine the shape parameters of an aircraft model for aerodynamic optimisation purposes.

  9. Statistical Methods in Algorithm Design and Analysis.

    ERIC Educational Resources Information Center

    Weide, Bruce W.

    The use of statistical methods in the design and analysis of discrete algorithms is explored. The introductory chapter contains a literature survey and background material on probability theory. In Chapter 2, probabilistic approximation algorithms are discussed with the goal of exposing and correcting some oversights in previous work. Chapter 3…

  10. Epigenetic Epidemiology: Promises for Public Health Research

    PubMed Central

    Bakulski, Kelly M.; Fallin, M. Daniele

    2014-01-01

    Epigenetic changes underlie developmental and age related biology. Promising epidemiologic research implicates epigenetics in disease risk and progression, and suggests epigenetic status depends on environmental risks as well as genetic predisposition. Epigenetics may represent a mechanistic link between environmental exposures, or genetics, and many common diseases, or may simply provide a quantitative biomarker for exposure or disease for areas of epidemiology currently lacking such measures. This great promise is balanced by issues related to study design, measurement tools, statistical methods, and biological interpretation that must be given careful consideration in an epidemiologic setting. This article describes the promises and challenges for epigenetic epidemiology, and suggests directions to advance this emerging area of molecular epidemiology. PMID:24449392

  11. Comparison of epidemiological marker methods for identification of Salmonella typhimurium isolates from an outbreak caused by contaminated chocolate.

    PubMed Central

    Kapperud, G; Lassen, J; Dommarsnes, K; Kristiansen, B E; Caugant, D A; Ask, E; Jahkola, M

    1989-01-01

    Plasmid profile analysis, restriction endonuclease analysis, and multilocus enzyme electrophoresis were used in conjunction with serotyping, bacteriophage typing, and biochemical fingerprinting to trace epidemiologically related isolates of Salmonella typhimurium from an outbreak caused by contaminated chocolate products in Norway and Finland. To evaluate the efficiency of the epidemiological marker methods, isolates from the outbreak were compared with five groups of control isolates not known to be associated with the outbreak. Both plasmid profile analysis and phage typing provided further discrimination over that produced by serotyping and biochemical fingerprinting. Plasmid profile analysis and phage typing were equally reliable in differentiating the outbreak isolates from the epidemiologically unrelated controls and were significantly more effective than multilocus enzyme electrophoresis and restriction enzyme analysis of total DNA. The greatest differentiation was achieved when plasmid profile analysis and phage typing were combined to complement serotyping and biochemical fingerprinting. However, none of the methods employed, including restriction enzyme analysis of plasmid DNA, were able to distinguish the outbreak isolates from five isolates recovered in Norway and Finland over a period of years from dead passerine birds and a calf. PMID:2674198

  12. Acoustic Treatment Design Scaling Methods. Phase 2

    NASA Technical Reports Server (NTRS)

    Clark, L. (Technical Monitor); Parrott, T. (Technical Monitor); Jones, M. (Technical Monitor); Kraft, R. E.; Yu, J.; Kwan, H. W.; Beer, B.; Seybert, A. F.; Tathavadekar, P.

    2003-01-01

    The ability to design, build and test miniaturized acoustic treatment panels on scale model fan rigs representative of full scale engines provides not only cost savings, but also an opportunity to optimize the treatment by allowing multiple tests. To use scale model treatment as a design tool, the impedance of the sub-scale liner must be known with confidence. This study was aimed at developing impedance measurement methods for high frequencies. A normal incidence impedance tube method that extends the upper frequency range to 25,000 Hz without grazing flow effects was evaluated. The free field method was investigated as a potential high frequency technique. The potential of the two-microphone in-situ impedance measurement method was evaluated in the presence of grazing flow. Difficulties in achieving the high frequency goals were encountered in all methods. Results of developing a time-domain finite difference resonator impedance model indicated that a re-interpretation of the empirical fluid mechanical models used in the frequency domain model for nonlinear resistance and mass reactance may be required. A scale model treatment design that could be tested on the Universal Propulsion Simulator vehicle was proposed.

  13. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phases of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of the two most commonly identified uncertainties in radiation shield design: the shielding properties of the materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.

  14. A novel method to design flexible URAs

    NASA Astrophysics Data System (ADS)

    Lang, Haitao; Liu, Liren; Yang, Qingguo

    2007-05-01

    Aperture patterns play a vital role in coded aperture imaging (CAI) applications. In recent years, many approaches have been presented to design optimum or near-optimum aperture patterns. Uniformly redundant arrays (URAs) are, undoubtedly, the most successful, owing to the constant sidelobes of their periodic autocorrelation function. Unfortunately, the existing methods can only be used to design URAs with a limited number of array sizes and fixed autocorrelation sidelobe-to-peak ratios. In this paper, we present a novel method to design more flexible URAs. Our approach is based on a searching program driven by DIRECT, a global optimization algorithm. We transform the design problem into a mathematical model based on the DIRECT algorithm, which is advantageous for computer implementation. By changing the determinative conditions, we obtain two types of URAs: the filled URAs, which can be constructed by existing methods, and the sparse URAs, which, as far as we know, have not been described by other authors. Finally, we carry out an experiment to demonstrate the imaging performance of the sparse URAs.
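
The defining URA property — a periodic autocorrelation with one peak and constant sidelobes — can be checked directly. The sketch below uses a classical 1-D quadratic-residue construction (length 7) to exhibit the property; it is not the DIRECT-based search method of the paper:

```python
def quadratic_residue_array(p):
    """Open (1) where the index is a quadratic residue mod a prime p."""
    residues = {(i * i) % p for i in range(1, p)}
    return [1 if i in residues else 0 for i in range(p)]

def periodic_autocorrelation(a):
    n = len(a)
    return [sum(a[i] * a[(i + t) % n] for i in range(n)) for t in range(n)]

a = quadratic_residue_array(7)   # [0, 1, 1, 0, 1, 0, 0]
c = periodic_autocorrelation(a)  # peak at c[0]; sidelobes c[1:] are constant
```

A search-based design method like the paper's would instead treat the array entries as decision variables and drive the sidelobe variation of `periodic_autocorrelation` toward zero.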

  15. Optimization methods for alternative energy system design

    NASA Astrophysics Data System (ADS)

    Reinhardt, Michael Henry

    An electric vehicle heating system and a solar thermal coffee dryer are presented as case studies in alternative energy system design optimization. Design optimization tools are compared using these case studies, including linear programming, integer programming, and fuzzy integer programming. Although most decision variables in the designs of alternative energy systems are generally discrete (e.g., numbers of photovoltaic modules, thermal panels, layers of glazing in windows), the literature shows that the optimization methods used historically for design utilize continuous decision variables. Integer programming, used to find the optimal investment in conservation measures as a function of life cycle cost of an electric vehicle heating system, is compared to linear programming, demonstrating the importance of accounting for the discrete nature of design variables. The electric vehicle study shows that conservation methods similar to those used in building design, which reduce the overall UA of a 22 ft electric shuttle bus from 488 to 202 Btu/hr-F, can eliminate the need for fossil fuel heating systems when operating in the northeastern United States. Fuzzy integer programming is presented as a means of accounting for imprecise design constraints, such as being environmentally friendly, in the optimization process. The solar thermal coffee dryer study focuses on a deep-bed design using unglazed thermal collectors (UTC). Experimental data from parchment coffee drying are gathered, including drying constants and equilibrium moisture. In this case, fuzzy linear programming is presented as a means of optimizing experimental procedures to produce the most information under imprecise constraints. Graphical optimization is used to show that for every 1 m² of deep-bed dryer, of 0.4 m depth, a UTC array consisting of five 1.1 m² panels and a photovoltaic array consisting of one 0.25 m² panel produce the most dry coffee per dollar invested in the system. 
In general this study
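    The integer-versus-linear-programming point can be made concrete with a toy conservation-measure selection problem (the measures, costs, and UA targets below are hypothetical, not data from this dissertation). The integer optimum is found by enumeration; the linear relaxation, which allows fractional units, reaches a strictly lower cost that no buildable design can achieve:

```python
import itertools

# Hypothetical conservation measures: (name, UA reduction per unit in Btu/hr-F,
# life-cycle cost per unit in $, maximum number of units)
measures = [("window glazing", 40.0, 900.0, 4),
            ("wall insulation", 55.0, 1400.0, 3),
            ("door seals",      12.0, 150.0, 5)]
TARGET = 286.0  # required total UA reduction (illustrative)

# Integer program solved by exhaustive enumeration:
# minimize cost subject to total UA reduction >= TARGET
best = None
for counts in itertools.product(*(range(m[3] + 1) for m in measures)):
    ua = sum(c * m[1] for c, m in zip(counts, measures))
    cost = sum(c * m[2] for c, m in zip(counts, measures))
    if ua >= TARGET and (best is None or cost < best[0]):
        best = (cost, counts)

# Linear-programming relaxation: fractional units allowed, so the cheapest
# $/UA measures are bought greedily up to their caps
need, lp_cost = TARGET, 0.0
for name, ua, cost, cap in sorted(measures, key=lambda m: m[2] / m[1]):
    take = min(cap, need / ua)
    lp_cost += take * cost
    need -= take * ua
    if need <= 0:
        break

print(best, round(lp_cost, 2))  # the integer optimum costs more than the LP bound
```

    Rounding the fractional LP solution does not, in general, give the integer optimum — which is the study's argument for treating discrete design variables as discrete.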

  16. [Phenotypic and genotypic methods for epidemiological typing of veterinary important bacterial pathogens of the genera Staphylococcus, Salmonella, and Pasteurella].

    PubMed

    Schwarz, Stefan; Blickwede, Maren; Kehrenberg, Corinna; Michael, Geovana Brenner

    2003-01-01

    Molecular typing methods are capable of providing detailed strain characteristics which are commonly far beyond the capacities of phenotypic typing methods. Such molecular-based characteristics have proved to be very helpful in epidemiological studies of bacterial pathogens. The primary criteria that all typing methods should fulfill include (1) the typeability of the strains in question, (2) the reproducibility of the results, and (3) a high discriminatory power. In general, molecular typing methods can be divided into methods that can be applied to virtually all bacteria (e.g. plasmid profiling, ribotyping, macrorestriction analysis) and methods that can only be used for typing of certain bacterial genera or species (e.g. IS200 typing of certain Salmonella enterica subsp. enterica serovars, or coa-PCR of coagulase-positive staphylococci). In the present review, various phenotypic and molecular methods for the epidemiological typing of bacteria of the genera Staphylococcus, Salmonella, and Pasteurella are described and their advantages/disadvantages--also with regard to the fulfillment of the above-mentioned primary criteria--are critically assessed.

  17. Waterflooding injectate design systems and methods

    DOEpatents

    Brady, Patrick V.; Krumhansl, James L.

    2014-08-19

    A method of designing an injectate to be used in a waterflooding operation is disclosed. One aspect includes specifying data representative of chemical characteristics of a liquid hydrocarbon, a connate, and a reservoir rock, of a subterranean reservoir. Charged species at an interface of the liquid hydrocarbon are determined based on the specified data by evaluating at least one chemical reaction. Charged species at an interface of the reservoir rock are determined based on the specified data by evaluating at least one chemical reaction. An extent of surface complexation between the charged species at the interfaces of the liquid hydrocarbon and the reservoir rock is determined by evaluating at least one surface complexation reaction. The injectate is designed and is operable to decrease the extent of surface complexation between the charged species at interfaces of the liquid hydrocarbon and the reservoir rock. Other methods, apparatus, and systems are disclosed.

  18. Evolutionary optimization methods for accelerator design

    NASA Astrophysics Data System (ADS)

    Poklonskiy, Alexey A.

    Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features, such as ease of implementation, modest requirements on the objective function, good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe in detail GATool, the evolutionary algorithm and software package used in this work. Then we use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package for the COSY Infinity scientific computing package. We design the model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and the methods commonly used to overcome them. We describe in detail REPA, a new constrained optimization method based on repairing, including the properties of its two repairing techniques: REFIND and REPROPT. We assess REPROPT's performance on the standard constrained
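    The EA machinery the dissertation builds on (population, selection, crossover, mutation) can be sketched in a few lines. This is a generic real-coded evolutionary algorithm minimizing the standard sphere test function, not GATool itself; population size, mutation scale, and operators are arbitrary choices:

```python
import random

random.seed(0)

# Minimal real-coded evolutionary algorithm minimizing the sphere
# test function f(x) = sum(x_i^2) on [-5, 5]^3
def sphere(x):
    return sum(v * v for v in x)

DIM, POP, GENS = 3, 40, 120

pop = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=sphere)
    parents = pop[:POP // 2]                  # truncation selection (elitist)
    children = []
    while len(children) < POP - len(parents):
        a, b = random.sample(parents, 2)
        # blend crossover (midpoint) plus Gaussian mutation
        children.append([(ai + bi) / 2 + random.gauss(0, 0.1)
                         for ai, bi in zip(a, b)])
    pop = parents + children

best = min(pop, key=sphere)
print(round(sphere(best), 4))
```

    The "modest requirements on the objective function" mentioned in the abstract are visible here: `sphere` is only ever called, never differentiated, so any black-box simulation could be substituted.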

  19. Quality by design compliant analytical method validation.

    PubMed

    Rozet, E; Ziemons, E; Marini, R D; Boulanger, B; Hubert, Ph

    2012-01-03

    The concept of quality by design (QbD) has recently been adopted for the development of pharmaceutical processes to ensure a predefined product quality. Focus on applying the QbD concept to analytical methods has increased as it is fully integrated within pharmaceutical processes and especially in the process control strategy. In addition, there is the need to switch from the traditional checklist implementation of method validation requirements to a method validation approach that should provide a high level of assurance of method reliability in order to adequately measure the critical quality attributes (CQAs) of the drug product. The intended purpose of analytical methods is directly related to the final decision that will be made with the results generated by these methods under study. The final aim for quantitative impurity assays is to correctly declare a substance or a product as compliant with respect to the corresponding product specifications. For content assays, the aim is similar: making the correct decision about product compliance with respect to their specification limits. It is for these reasons that the fitness of these methods should be defined, as they are key elements of the analytical target profile (ATP). Therefore, validation criteria, corresponding acceptance limits, and method validation decision approaches should be settled in accordance with the final use of these analytical procedures. This work proposes a general methodology to achieve this in order to align method validation within the QbD framework and philosophy. β-Expectation tolerance intervals are implemented to decide about the validity of analytical methods. The proposed methodology is also applied to the validation of analytical procedures dedicated to the quantification of impurities or active product ingredients (API) in drug substances or drug products, and its applicability is illustrated with two case studies.

  20. Review of methods of dose estimation for epidemiological studies of the radiological impact of nevada test site and global fallout.

    PubMed

    Beck, Harold L; Anspaugh, Lynn R; Bouville, André; Simon, Steven L

    2006-07-01

    Methods to assess radiation doses from nuclear weapons test fallout have been used to estimate doses to populations and individuals in a number of studies. However, only a few epidemiology studies have relied on fallout dose estimates. Though the methods for assessing doses from local and regional compared to global fallout are similar, there are significant differences in predicted doses and contributing radionuclides depending on the source of the fallout, e.g. whether the nuclear debris originated in Nevada at the U.S. nuclear test site or whether it originated at other locations worldwide. The sparse historical measurement data available are generally sufficient to estimate external exposure doses reasonably well. However, reconstruction of doses to body organs from ingestion and inhalation of radionuclides is significantly more complex and is almost always more uncertain than are external dose estimates. Internal dose estimates are generally based on estimates of the ground deposition per unit area of specific radionuclides and subsequent transport of radionuclides through the food chain. A number of technical challenges to correctly modeling deposition of fallout under wet and dry atmospheric conditions still remain, particularly at close-in locations where sizes of deposited particles vary significantly over modest changes in distance. This paper summarizes the various methods of dose estimation from weapons test fallout and the most important dose assessment and epidemiology studies that have relied on those methods.
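    The two dose pathways contrasted in the abstract can be sketched numerically. The external calculation integrates the classic t^-1.2 fallout decay approximation for the gamma dose rate; the internal calculation chains deposition through an integrated diet-transfer factor and an ingestion dose coefficient. All numeric inputs below are hypothetical placeholders, not values from any assessment:

```python
# External dose: integrate the t^-1.2 fallout decay law for the gamma dose
# rate from fallout arrival (t1) to a later time (t2), in hours after detonation
def external_dose(rate_h1, t1, t2):
    # integral of rate_h1 * t**-1.2 dt = (rate_h1 / 0.2) * (t1**-0.2 - t2**-0.2)
    return (rate_h1 / 0.2) * (t1 ** -0.2 - t2 ** -0.2)

# Internal dose: ground deposition -> integrated dietary intake -> committed dose
def internal_dose(deposition_bq_m2, transfer_bq_per_bq_m2, dose_coeff_sv_per_bq):
    return deposition_bq_m2 * transfer_bq_per_bq_m2 * dose_coeff_sv_per_bq

ext = external_dose(1.0, 12.0, 8760.0)        # dose-rate unit at H+1, one year of exposure
inter = internal_dose(5.0e3, 2.0e-3, 1.0e-8)  # all three factors hypothetical
print(round(ext, 3), inter)
```

    The asymmetry the authors describe is structural: the external estimate needs only a measured dose rate and a decay law, while every factor in the internal chain (deposition, food-chain transfer, dose coefficient) carries its own uncertainty, which is why internal doses are almost always more uncertain.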

  1. Methods for structural design at elevated temperatures

    NASA Technical Reports Server (NTRS)

    Ellison, A. M.; Jones, W. E., Jr.; Leimbach, K. R.

    1973-01-01

    A procedure which can be used to design elevated temperature structures is discussed. The desired goal is to have the same confidence in the structural integrity at elevated temperature as the factor of safety gives on mechanical loads at room temperature. Methods of design and analysis for creep, creep rupture, and creep buckling are presented. Example problems are included to illustrate the analytical methods. Creep data for some common structural materials are presented. Appendix B is a description, user's manual, and listing for the creep analysis program. The program predicts the time to a given amount of creep or to creep rupture for a material subjected to a specified stress-temperature-time spectrum. Fatigue at elevated temperature is discussed. Methods of analysis for high stress-low cycle fatigue, fatigue below the creep range, and fatigue in the creep range are included. The interaction of thermal fatigue and mechanical loads is considered, and a detailed approach to fatigue analysis is given for structures operating below the creep range.
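    A common way to assess a stress-temperature-time spectrum of the kind the creep program accepts is the life-fraction (Robinson) rule with a Larson-Miller rupture correlation. The sketch below is not the report's program; the Larson-Miller material fit and the spectrum values are invented for illustration:

```python
import math

# Life-fraction (Robinson) damage sum with a hypothetical Larson-Miller curve:
# LMP = T * (C + log10(t_r)), with an assumed material fit LMP(stress)
C = 20.0  # a typical Larson-Miller constant

def rupture_time_hours(stress_ksi, temp_rankine):
    lmp = 40000.0 - 4000.0 * math.log10(stress_ksi)  # hypothetical material fit
    return 10.0 ** (lmp / temp_rankine - C)

# stress (ksi), temperature (deg R), hold time (hours) for each mission segment
spectrum = [(30.0, 1460.0, 200.0),
            (20.0, 1560.0, 50.0)]

# rupture is predicted when the summed life fractions reach 1
damage = sum(hold / rupture_time_hours(s, T) for s, T, hold in spectrum)
print(round(damage, 3), "acceptable" if damage < 1.0 else "creep rupture predicted")
```

    Dividing the allowable damage sum by a factor (e.g. requiring the sum to stay below 1/FS rather than 1) is one way to carry a room-temperature-style factor of safety into the creep regime, in the spirit of the report's stated goal.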

  2. A Review of Exposure Assessment Methods in Epidemiological Studies on Incinerators

    PubMed Central

    Ranzi, Andrea; De Leo, Giulio A.; Lauriola, Paolo

    2013-01-01

    Incineration is a common technology for waste disposal, and there is public concern about the health impact of incinerators. Poor exposure assessment has been claimed as one of the main causes of inconsistency in the epidemiological literature. We reviewed 41 studies on incinerators published between 1984 and January 2013 and classified them on the basis of their exposure assessment approach. Moreover, we performed a simulation study to explore how different exposure metrics may influence the exposure levels used in epidemiological studies. Nineteen studies used linear distance as a measure of exposure to incinerators, 11 used atmospheric dispersion models, and the remaining 11 used a qualitative variable such as presence/absence of the source. All reviewed studies used residence as a proxy for population exposure, although residence location was evaluated with different precision (e.g., municipality, census block, or exact address). Only one study reconstructed temporal variability in exposure. Our simulation study showed a notable degree of exposure misclassification caused by the use of distance compared to dispersion modelling. We suggest that future studies (i) make full use of pollution dispersion models; (ii) localize the population on a fine scale; and (iii) explicitly account for the presence of potential environmental and socioeconomic confounding. PMID:23840228
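    The distance-versus-dispersion misclassification the review quantifies can be reproduced qualitatively with a toy simulation (the plume model, threshold, and population below are my own crude assumptions, not the review's): residents are classified as exposed either by a distance cutoff or by a rough Gaussian-plume concentration with a prevailing wind, and the two classifications are compared.

```python
import math
import random

random.seed(7)

# Crude Gaussian-plume stand-in: stack at the origin, prevailing wind along +x,
# plume width growing with downwind distance (all parameters illustrative)
def plume_conc(x, y):
    if x <= 0:
        return 0.0            # upwind of the stack: essentially no impact
    sigma = 0.2 * x           # lateral spread grows downwind
    return (1.0 / x) * math.exp(-(y * y) / (2.0 * sigma * sigma))

# Residents scattered uniformly over a 10 km x 10 km area around the stack
residents = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(2000)]

by_distance = [math.hypot(x, y) < 3.0 for x, y in residents]   # < 3 km = "exposed"
by_plume = [plume_conc(x, y) > 0.05 for x, y in residents]     # above concentration cutoff

misclassified = sum(d != p for d, p in zip(by_distance, by_plume)) / len(residents)
print(round(misclassified, 3))
```

    Residents upwind but close by are "exposed" by distance and unexposed by the plume, while downwind residents beyond the radius show the reverse — the disagreement fraction is one simple way to express the misclassification the review warns about.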

  3. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  4. An IARC Manual series aimed at assisting cancer epidemiology and prevention. "Environmental carcinogens: selected methods of analysis".

    PubMed

    O'Neill, I K; Fishbein, L

    1986-01-01

    Since 1975, the IARC has been preparing a series of volumes entitled "Environmental Carcinogens: Selected Methods of Analysis" (IARC Manual series) of which the purposes are to assist analysts, epidemiologists and regulatory authorities in planning or performing exposure measurements that are truly comparable between different studies. The Manual series provides expert information within each volume on multi-media sampling, methods of analyses and some background of epidemiology, metabolism, use/occurrence for a group of known or suspect carcinogens. So far, eleven volumes have been published or are in preparation on the following subjects: N-nitrosamines, vinyl chloride, PAH, aromatic amines, mycotoxins, N-nitroso compounds, volatile halogenated hydrocarbons, metals, passive smoking, benzene and alkylated benzenes, dioxins, PCDFs and PCBs. The presentation will discuss needs and priorities for use of analytical chemistry in estimating exposures of apparently greatest relevance to cancer causation, i.e. the approach to developing this series. Indications from epidemiology, evaluations of carcinogenic risk to humans, and recent developments in total exposure assessment are that new methods and matrices need more emphasis, e.g. as with biochemical dosimetry, exhaled breath, and in indoor air.

  5. Block designs in method transfer experiments.

    PubMed

    Altan, Stan; Shoung, Jyh-Ming

    2008-01-01

    Method transfer is a part of the pharmaceutical development process in which an analytical (chemical) procedure developed in one laboratory (typically the research laboratory) is about to be adopted by one or more recipient laboratories (production or commercial operations). The objective is to show that the recipient laboratory is capable of performing the procedure in an acceptable manner. In the course of carrying out a method transfer, other questions may arise related to fixed or random factors of interest, such as analyst, apparatus, batch, supplier of analytical reagents, and so forth. Estimates of reproducibility and repeatability may also be of interest. This article focuses on the application of various block designs that have been found useful in the comprehensive study of method transfer beyond the laboratory effect alone. An equivalence approach to the comparison of laboratories can still be carried out on either the least squares means or subject-specific means of the laboratories to justify a method transfer or to compare analytical methods.

  6. The healthy men study: design and recruitment considerations for environmental epidemiologic studies in male reproductive health

    EPA Science Inventory

    Study Objective: To describe study conduct and response and participant characteristics. Design: Prospective cohort study. Setting: Participants were male partners of women enrolled in a community-based study of drinking water disinfection by-products and pregnancy healt...

  7. Design and methods in a multi-center case-control interview study.

    PubMed Central

    Hartge, P; Cahill, J I; West, D; Hauck, M; Austin, D; Silverman, D; Hoover, R

    1984-01-01

    We conducted a case-control study in ten areas of the United States in which a total of 2,982 bladder cancer patients and 5,782 population controls were interviewed. We employed a variety of existing and new techniques to reduce bias and to monitor the quality of data collected. We review here many of the design elements and field methods that can be generally applied in epidemiologic studies, particularly multi-center interview studies, and explain the reasons for our selection of the methods, instruments, and procedures used. PMID:6689843

  8. The epidemiology of male infertility.

    PubMed

    Winters, Brian R; Walsh, Thomas J

    2014-02-01

    The purpose of this review is to integrate the understanding of epidemiology and infertility. A primer on epidemiologic science and an example disease for which the design of epidemiologic investigations is readily apparent are provided. Key features of infertility that limit epidemiologic investigation are described, and a survey of the available data on the epidemiology of infertility is provided. Finally, the work that must be completed to move this area of research forward is proposed, and, with this new perspective of "infertility as a disease," improvements in public health that may be gained through an improved understanding of the epidemiology of male infertility are envisioned.

  9. Computational and design methods for advanced imaging

    NASA Astrophysics Data System (ADS)

    Birch, Gabriel C.

    This dissertation merges the optical design and computational aspects of imaging systems to create novel devices that solve engineering problems in optical science, and attempts to expand the solution space available to the optical designer. This dissertation is divided into two parts: the first discusses a new active illumination depth sensing modality, while the second part discusses a passive illumination system called plenoptic, or lightfield, imaging. The new depth sensing modality introduced in part one is called depth through controlled aberration. This technique illuminates a target with a known, aberrated projected pattern and takes an image using a traditional, unmodified imaging system. Knowing how the added aberration in the projected pattern changes as a function of depth, we are able to quantitatively determine the depth of a series of points from the camera. A major advantage of this method is that the illumination and imaging axes can be coincident. Plenoptic cameras capture both spatial and angular data simultaneously. This dissertation presents a new set of parameters that permit the design and comparison of plenoptic devices outside the traditionally published plenoptic 1.0 and plenoptic 2.0 configurations. Additionally, a series of engineering advancements are presented, including full-system raytraces of raw plenoptic images, Zernike compression techniques for raw image files, and non-uniform lenslet arrays to compensate for plenoptic system aberrations. Finally, a new snapshot imaging spectrometer is proposed based on the plenoptic configuration.

  10. Surveillance in a Telemedicine Setting: Application of Epidemiologic Methods at NASA Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Babiak-Vazquez, Adriana; Ruffaner, Lanie; Wear, Mary; Crucian, Brian; Sams, Clarence; Lee, Lesley R.; Van Baalen, Mary

    2016-01-01

    Space medicine presents unique challenges and opportunities for epidemiologists, such as the use of telemedicine during spaceflight. Medical capabilities aboard the International Space Station (ISS) are limited due to severe restrictions on power, volume, and mass. Consequently, in-flight health information is based heavily on crewmember (CM) self-report of signs and symptoms, rather than formal diagnoses. While CMs are in flight, the primary source of crew health information is verbal communication between physicians and crewmembers. In 2010 NASA implemented the Lifetime Surveillance of Astronaut Health, an occupational surveillance program for the U.S. Astronaut corps. This has shifted the epidemiological paradigm from tracking diagnoses based on traditional terrestrial clinical practice to one that incorporates symptomatology and may yield a more population-based understanding of early disease processes.

  11. Meta-epidemiology.

    PubMed

    Bae, Jong-Myon

    2014-01-01

    The concept of meta-epidemiology was introduced in response to the methodological limitations of systematic reviews of intervention trials. The paradigm of meta-epidemiology has since shifted from a statistical method to a new methodology for closing gaps between evidence and practice. The main interest of meta-epidemiology is to control potential biases in previous quantitative systematic reviews and to draw appropriate evidence for establishing evidence-based guidelines. More recently, network meta-epidemiology has been suggested in order to overcome some limitations of meta-epidemiology. To promote meta-epidemiologic studies, tools for assessing risk of bias and reporting guidelines such as the Consolidated Standards for Reporting Trials (CONSORT) should be implemented.

  12. A structural design decomposition method utilizing substructuring

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.

    1994-01-01

    A new method of design decomposition for structural analysis and optimization is described. For this method, the structure is divided into substructures where each substructure has its structural response described by a structural-response subproblem, and its structural sizing determined from a structural-sizing subproblem. The structural responses of substructures that have rigid body modes when separated from the remainder of the structure are further decomposed into displacements that have no rigid body components, and a set of rigid body modes. The structural-response subproblems are linked together through forces determined within a structural-sizing coordination subproblem which also determines the magnitude of any rigid body displacements. Structural-sizing subproblems having constraints local to the substructures are linked together through penalty terms that are determined by a structural-sizing coordination subproblem. All the substructure structural-response subproblems are totally decoupled from each other, as are all the substructure structural-sizing subproblems, thus there is significant potential for use of parallel solution methods for these subproblems.

  13. Method for designing gas tag compositions

    DOEpatents

    Gross, K.C.

    1995-04-11

    For use in the manufacture of gas tags such as employed in a nuclear reactor gas tagging failure detection system, a method for designing gas tagging compositions utilizes an analytical approach wherein the final composition of a first canister of tag gas as measured by a mass spectrometer is designated as node No. 1. Lattice locations of tag nodes in multi-dimensional space are then used in calculating the compositions of a node No. 2 and each subsequent node so as to maximize the distance of each node from any combination of tag components which might be indistinguishable from another tag composition in a reactor fuel assembly. Alternatively, the measured compositions of tag gas numbers 1 and 2 may be used to fix the locations of nodes 1 and 2, with the locations of nodes 3-N then calculated for optimum tag gas composition. A single sphere defining the lattice locations of the tag nodes may be used to define approximately 20 tag nodes, while concentric spheres can extend the number of tag nodes to several hundred. 5 figures.
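    The node-placement idea in the patent — choosing each new tag composition to be maximally distinguishable from every earlier one — can be sketched with a greedy farthest-point rule over a simplex of isotope fractions. This is a simplified stand-in for the patent's lattice/sphere construction, with all compositions hypothetical:

```python
import itertools
import math

# Candidate tag compositions: three isotope fractions on a coarse grid, summing to 1
candidates = [(x / 10, y / 10, z / 10)
              for x, y, z in itertools.product(range(11), repeat=3)
              if x + y + z == 10]

# Node No. 1 plays the role of the measured first-canister composition;
# each subsequent node maximizes its minimum distance to all nodes chosen so far,
# keeping every tag distinguishable from every other by the mass spectrometer
nodes = [(1.0, 0.0, 0.0)]
while len(nodes) < 6:
    nodes.append(max(candidates,
                     key=lambda c: min(math.dist(c, n) for n in nodes)))

print(nodes)
```

    The patent's concentric-sphere lattice serves the same purpose as this max-min spacing: the farther apart the nodes sit in composition space, the less likely a mixture of leaked tags is to be confused with another assembly's tag.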

  14. Geometric methods for optimal sensor design.

    PubMed

    Belabbas, M-A

    2016-01-01

    The Kalman-Bucy filter is the optimal estimator of the state of a linear dynamical system from sensor measurements. Because its performance is limited by the sensors to which it is paired, it is natural to seek optimal sensors. The resulting optimization problem is however non-convex. Therefore, many ad hoc methods have been used over the years to design sensors in fields ranging from engineering to biology to economics. We show in this paper how to obtain optimal sensors for the Kalman filter. Precisely, we provide a structural equation that characterizes optimal sensors. We furthermore provide a gradient algorithm and prove its convergence to the optimal sensor. This optimal sensor yields the lowest possible estimation error for measurements with a fixed signal-to-noise ratio. The results of the paper are proved by reducing the optimal sensor problem to an optimization problem on a Grassmannian manifold and proving that the function to be minimized is a Morse function with a unique minimum. The results presented here also apply to the dual problem of optimal actuator design.
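    The optimization the paper solves with a gradient flow can be illustrated by brute force in two dimensions (this sweep is my stand-in, not the paper's Grassmannian method; the system matrices are invented, and the filter is the discrete-time analogue rather than the continuous Kalman-Bucy setting): sweep unit-norm sensor directions, which fixes the signal-to-noise ratio, and keep the direction minimizing the steady-state error covariance trace.

```python
import math

# Illustrative two-state discrete-time system
A = [[0.99, 0.10], [0.00, 0.90]]
Q = [[0.00, 0.00], [0.00, 0.10]]   # process noise drives the second state
R = 0.05                           # measurement noise variance

def steady_trace(c):
    # iterate the Riccati recursion P' = A P A^T + Q, then the measurement
    # update P = P' - (P' c)(P' c)^T / (c P' c^T + R), to near convergence
    P = [[1.0, 0.0], [0.0, 1.0]]
    for _ in range(300):
        AP = [[sum(A[i][k] * P[k][j] for k in range(2)) for j in range(2)]
              for i in range(2)]
        Pp = [[sum(AP[i][k] * A[j][k] for k in range(2)) + Q[i][j]
               for j in range(2)] for i in range(2)]
        v = [sum(Pp[i][k] * c[k] for k in range(2)) for i in range(2)]
        S = c[0] * v[0] + c[1] * v[1] + R
        P = [[Pp[i][j] - v[i] * v[j] / S for j in range(2)] for i in range(2)]
    return P[0][0] + P[1][1]

# sweep unit-norm sensor directions c(theta) and keep the best
trace, theta = min((steady_trace([math.cos(t), math.sin(t)]), t)
                   for t in (k * math.pi / 180 for k in range(180)))
print(round(theta, 3), round(trace, 4))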

  15. Research and Design of Rootkit Detection Method

    NASA Astrophysics Data System (ADS)

    Liu, Leian; Yin, Zuanxing; Shen, Yuli; Lin, Haitao; Wang, Hongjiang

    Rootkits are one of the most important security issues in network communication systems, affecting the security and privacy of Internet users. Because of back doors in the operating system, a hacker can use a rootkit to attack and invade other people's computers and thus easily capture passwords and message traffic to and from these computers. With the development of rootkit technology, its applications have become more and more extensive and it has become increasingly difficult to detect. In addition, for various reasons, such as trade secrecy and the difficulty of development, information on rootkit detection technology and effective tools remains relatively scarce. In this paper, based on an in-depth analysis of rootkit detection technology, a new rootkit detection structure is designed and a new method (software), X-Anti, is proposed. Test results show that software based on the proposed structure is much more efficient than other rootkit detection software.

  16. Geometric methods for optimal sensor design

    PubMed Central

    Belabbas, M.-A.

    2016-01-01

    The Kalman–Bucy filter is the optimal estimator of the state of a linear dynamical system from sensor measurements. Because its performance is limited by the sensors to which it is paired, it is natural to seek optimal sensors. The resulting optimization problem is however non-convex. Therefore, many ad hoc methods have been used over the years to design sensors in fields ranging from engineering to biology to economics. We show in this paper how to obtain optimal sensors for the Kalman filter. Precisely, we provide a structural equation that characterizes optimal sensors. We furthermore provide a gradient algorithm and prove its convergence to the optimal sensor. This optimal sensor yields the lowest possible estimation error for measurements with a fixed signal-to-noise ratio. The results of the paper are proved by reducing the optimal sensor problem to an optimization problem on a Grassmannian manifold and proving that the function to be minimized is a Morse function with a unique minimum. The results presented here also apply to the dual problem of optimal actuator design. PMID:26997885

  17. Neural method of spatiotemporal filter design

    NASA Astrophysics Data System (ADS)

    Szostakowski, Jaroslaw

    1997-10-01

    There are many applications in medical imaging, computer vision, and communications where video processing is critical. Although many techniques have been successfully developed for the filtering of still images, significantly fewer techniques have been proposed for the filtering of noisy image sequences. In this paper a novel approach to spatio-temporal filter design is proposed. Multilayer perceptrons and functional-link nets are used for the 3D filtering. The spatio-temporal patterns are created from real motion video images, and the neural networks learn these patterns. Perceptrons with different numbers of layers and neurons in each layer are tested, and different input functions in the functional-link net are explored. Practical examples of the filtering are shown and compared with traditional (non-neural) spatio-temporal methods. The results are very interesting, and neural spatio-temporal filters seem to be a very efficient tool for video noise reduction.

  18. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
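    The statistical-model-checking idea in the abstract — estimate the probability that a stochastic epidemic model satisfies a formally specified property by sampling many runs — can be sketched with a toy discrete-time stochastic SIR model. The model, parameters, and property below are illustrative assumptions, not the authors' framework:

```python
import random

random.seed(1)

# Discrete-time stochastic SIR model; returns the epidemic's peak infected count
def sir_peak(n=200, i0=5, beta=0.3, gamma=0.1, days=150):
    s, i, peak = n - i0, i0, i0
    for _ in range(days):
        p_inf = 1.0 - (1.0 - beta / n) ** i            # per-susceptible infection prob.
        new_i = sum(random.random() < p_inf for _ in range(s))
        new_r = sum(random.random() < gamma for _ in range(i))
        s, i = s - new_i, i + new_i - new_r
        peak = max(peak, i)
        if i == 0:
            break
    return peak

# Statistical model checking, Monte Carlo style: estimate the probability that
# the property "peak infected count stays below 60" holds over sampled runs
N = 100
estimate = sum(sir_peak() < 60 for _ in range(N)) / N
print(estimate)
```

    A full statistical model checker would express the property in a temporal logic and add sequential stopping rules with error bounds, but the core computation is this sampled satisfaction probability.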

  19. Epidemiologic Methods Lessons Learned from Environmental Public Health Disasters: Chernobyl, the World Trade Center, Bhopal, and Graniteville, South Carolina

    PubMed Central

    Svendsen, Erik R.; Runkle, Jennifer R.; Dhara, Venkata Ramana; Lin, Shao; Naboka, Marina; Mousseau, Timothy A.; Bennett, Charles

    2012-01-01

    Background: Environmental public health disasters involving hazardous contaminants may have devastating effects. While much is known about their immediate devastation, far less is known about the long-term impacts of these disasters. Extensive latent and chronic long-term public health effects may occur. Careful evaluation of contaminant exposures and long-term health outcomes within the constraints imposed by limited financial resources is essential. Methods: Here, we review epidemiologic methods lessons learned from conducting long-term evaluations of four environmental public health disasters involving hazardous contaminants at Chernobyl, the World Trade Center, Bhopal, and Graniteville (South Carolina, USA). Findings: We identified several lessons with direct implications for the ongoing disaster recovery work following the Fukushima radiation disaster and for future disasters. Interpretation: These lessons should prove useful in understanding and mitigating latent health effects that may result from the nuclear reactor accident in Japan or future environmental public health disasters. PMID:23066404

  20. Educating Instructional Designers: Different Methods for Different Outcomes.

    ERIC Educational Resources Information Center

    Rowland, Gordon; And Others

    1994-01-01

    Suggests new methods of teaching instructional design based on literature reviews of other design fields including engineering, architecture, interior design, media design, and medicine. Methods discussed include public presentations, visiting experts, competitions, artifacts, case studies, design studios, and internships and apprenticeships.…

  1. Adjoint methods for aerodynamic wing design

    NASA Technical Reports Server (NTRS)

    Grossman, Bernard

    1993-01-01

    A model inverse design problem is used to investigate the effect of flow discontinuities on the optimization process. The optimization involves finding the cross-sectional area distribution of a duct that produces velocities that closely match a targeted velocity distribution. Quasi-one-dimensional flow theory is used, and the target is chosen to have a shock wave in its distribution. The objective function which quantifies the difference between the targeted and calculated velocity distributions may become non-smooth due to the interaction between the shock and the discretization of the flowfield. This paper offers two techniques to resolve the resulting problems for the optimization algorithms. The first, shock-fitting, involves careful integration of the objective function through the shock wave. The second, coordinate straining with shock penalty, uses a coordinate transformation to align the calculated shock with the target and then adds a penalty proportional to the square of the distance between the shocks. The techniques are tested using several popular sensitivity and optimization methods, including finite-differences, and direct and adjoint discrete sensitivity methods. Two optimization strategies, Gauss-Newton and sequential quadratic programming (SQP), are used to drive the objective function to a minimum.
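    A heavily simplified, shock-free analogue of this inverse problem can show the mechanics of a Gauss-Newton update with finite-difference sensitivities. The sketch below assumes incompressible quasi-one-dimensional continuity (v = q/A), which sidesteps the shock-related non-smoothness the paper actually addresses; all function names and parameters are illustrative.

```python
def velocities(areas, q=1.0):
    """Quasi-1D incompressible continuity: v = q / A at each station."""
    return [q / a for a in areas]

def gauss_newton_inverse(target_v, a0, iters=60, h=1e-7, damping=0.5):
    """Drive the velocity distribution toward a target by updating the duct
    area at each station. Residuals are independent per station here, so the
    Jacobian is diagonal and each update is a damped Newton step."""
    a = list(a0)
    for _ in range(iters):
        r = [v - t for v, t in zip(velocities(a), target_v)]
        if max(abs(x) for x in r) < 1e-12:
            break
        for i in range(len(a)):
            ap = a[:]
            ap[i] += h
            j = (velocities(ap)[i] - velocities(a)[i]) / h  # dr_i/dA_i
            a[i] -= damping * r[i] / j
    return a
```

    With a shock in the target, the residual becomes non-smooth and this naive update can fail, which is exactly the motivation for the shock-fitting and coordinate-straining techniques in the paper.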

  2. Game Methodology for Design Methods and Tools Selection

    ERIC Educational Resources Information Center

    Ahmad, Rafiq; Lahonde, Nathalie; Omhover, Jean-françois

    2014-01-01

    Design process optimisation and intelligence are the key words of today's scientific community. A proliferation of methods has made design a convoluted area. Designers are usually afraid of selecting one method/tool over another and even expert designers may not necessarily know which method is the best to use in which circumstances. This…

  3. Invited commentary: do-it-yourself modern epidemiology--at last!

    PubMed

    Morabia, Alfredo

    2014-10-01

    In this issue of the Journal, Keyes and Galea (Am J Epidemiol. 2014;180(7):661-668) propose "7 foundational steps" for introducing epidemiologic methods and concepts to beginners. Keyes and Galea's credo is that the methodological and conceptual components that comprise epidemiology, today scattered across textbook chapters, come together as an integrated and coherent methodological corpus in the process of designing studies. Thus, they expound, the process of designing studies should be the core of teaching epidemiology. Two aspects of their 7-steps-to-epidemiology, do-it-yourself user manual stand out as novel: 1) the approach, because of its emphasis on modern epidemiology's causal framework of a dynamic population in a steady state evolving across time, and 2) the ambition to teach modern epidemiology in introductory courses, instead of the popular mix of classical and modern epidemiology that is often used today to keep introductory courses simple. Both aspects are of potentially great significance for our discipline.

  4. The HIV prevention cascade: integrating theories of epidemiological, behavioural, and social science into programme design and monitoring.

    PubMed

    Hargreaves, James R; Delany-Moretlwe, Sinead; Hallett, Timothy B; Johnson, Saul; Kapiga, Saidi; Bhattacharjee, Parinita; Dallabetta, Gina; Garnett, Geoff P

    2016-07-01

    Theories of epidemiology, health behaviour, and social science have changed the understanding of HIV prevention in the past three decades. The HIV prevention cascade is emerging as a new approach to guide the design and monitoring of HIV prevention programmes in a way that integrates these multiple perspectives. This approach recognises that translating the efficacy of direct mechanisms that mediate HIV prevention (including prevention products, procedures, and risk-reduction behaviours) into population-level effects requires interventions that increase coverage. An HIV prevention cascade approach suggests that high coverage can be achieved by targeting three key components: demand-side interventions that improve risk perception and awareness and acceptability of prevention approaches; supply-side interventions that make prevention products and procedures more accessible and available; and adherence interventions that support ongoing adoption of prevention behaviours, including those that do and do not involve prevention products. Programmes need to develop delivery platforms that ensure these interventions reach target populations, to shape the policy environment so that it facilitates implementation at scale with high quality and intensity, and to monitor the programme with indicators along the cascade.
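    One hedged way to reason about the cascade quantitatively is to treat population-level effective coverage as the product of the proportions passing each step (demand, supply, adherence). This multiplicative model is a common illustration of cascade thinking, not a formal definition taken from the paper.

```python
def cascade_coverage(demand, supply, adherence):
    """Effective coverage as the product of cascade-step proportions.
    Each argument is the proportion (0-1) of the target population
    passing that step of the prevention cascade."""
    for p in (demand, supply, adherence):
        if not 0.0 <= p <= 1.0:
            raise ValueError("proportions must be in [0, 1]")
    return demand * supply * adherence

def weakest_step(demand, supply, adherence):
    """Identify the step that most limits coverage (smallest proportion),
    i.e., where an intervention would have the most leverage."""
    steps = {"demand": demand, "supply": supply, "adherence": adherence}
    return min(steps, key=steps.get)
```

    For example, 80% demand and 90% supply still yield only 36% effective coverage if adherence is 50%, pointing programme monitoring at the adherence step.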

  5. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  6. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  7. The causal pie model: an epidemiological method applied to evolutionary biology and ecology.

    PubMed

    Wensink, Maarten; Westendorp, Rudi G J; Baudisch, Annette

    2014-05-01

    A general concept for thinking about causality facilitates swift comprehension of results, and the vocabulary that belongs to the concept is instrumental in cross-disciplinary communication. The causal pie model has fulfilled this role in epidemiology and could be of similar value in evolutionary biology and ecology. In the causal pie model, outcomes result from sufficient causes. Each sufficient cause is made up of a "causal pie" of "component causes". Several different causal pies may exist for the same outcome. If and only if all component causes of a sufficient cause are present, that is, a causal pie is complete, does the outcome occur. The effect of a component cause hence depends on the presence of the other component causes that constitute some causal pie. Because all component causes are equally and fully causative for the outcome, the sum of causes for some outcome exceeds 100%. The causal pie model provides a way of thinking that maps into a number of recurrent themes in evolutionary biology and ecology: It charts when component causes have an effect and are subject to natural selection, and how component causes affect selection on other component causes; which partitions of outcomes with respect to causes are feasible and useful; and how to view the composition of a(n apparently homogeneous) population. The diversity of specific results that is directly understood from the causal pie model is a test for both the validity and the applicability of the model. The causal pie model provides a common language in which results across disciplines can be communicated and serves as a template along which future causal analyses can be made.
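    The core logic of the model is easy to state in code: an outcome occurs if and only if at least one causal pie is complete. The sketch below is an illustrative formalization of that definition; the pie contents are invented for the example.

```python
def outcome_occurs(present, pies):
    """Sufficient-component-cause ('causal pie') model: the outcome occurs
    iff at least one pie has all of its component causes present."""
    return any(pie <= present for pie in pies)

# Two sufficient causes for the same outcome, sharing component "A"
# (hypothetical example data)
PIES = [{"A", "B"}, {"A", "C", "D"}]

def effect_of(component, present, pies):
    """A component cause has an effect only when it completes some pie,
    i.e., its effect depends on which other components are present."""
    with_c = outcome_occurs(present | {component}, pies)
    without_c = outcome_occurs(present - {component}, pies)
    return with_c and not without_c
```

    Note how `effect_of` captures the paper's point that the effect of a component cause depends on the presence of the other components of some pie, which is also why causes for one outcome can sum to more than 100%.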

  8. Global Dissemination of Carbapenemase-Producing Klebsiella pneumoniae: Epidemiology, Genetic Context, Treatment Options, and Detection Methods

    PubMed Central

    Lee, Chang-Ro; Lee, Jung Hun; Park, Kwang Seung; Kim, Young Bae; Jeong, Byeong Chul; Lee, Sang Hee

    2016-01-01

    The emergence of carbapenem-resistant Gram-negative pathogens poses a serious threat to public health worldwide. In particular, the increasing prevalence of carbapenem-resistant Klebsiella pneumoniae is a major source of concern. K. pneumoniae carbapenemases (KPCs) and carbapenemases of the oxacillinase-48 (OXA-48) type have been reported worldwide. New Delhi metallo-β-lactamase (NDM) carbapenemases were originally identified in Sweden in 2008 and have spread rapidly worldwide. In this review, we summarize the epidemiology of K. pneumoniae producing three carbapenemases (KPCs, NDMs, and OXA-48-like). Although the prevalence of each resistant strain varies geographically, K. pneumoniae producing KPCs, NDMs, and OXA-48-like carbapenemases have become rapidly disseminated. In addition, we used recently published molecular and genetic studies to analyze the mechanisms by which these three carbapenemases, and major K. pneumoniae clones, such as ST258 and ST11, have become globally prevalent. Because carbapenemase-producing K. pneumoniae are often resistant to most β-lactam antibiotics and many other non-β-lactam molecules, the therapeutic options available to treat infection with these strains are limited to colistin, polymyxin B, fosfomycin, tigecycline, and selected aminoglycosides. Although combination therapy has been recommended for the treatment of severe carbapenemase-producing K. pneumoniae infections, the clinical evidence for this strategy is currently limited, and rigorous randomized controlled trials will be required to establish the most effective treatment regimen. Moreover, because rapid and accurate identification of the carbapenemase type found in K. pneumoniae may be difficult to achieve through phenotypic antibiotic susceptibility tests, novel molecular detection techniques are currently being developed. PMID:27379038

  9. Design of a set of probes with high potential for influenza virus epidemiological surveillance

    PubMed Central

    Carreño-Durán, Luis R; Larios-Serrato, V; Jaimes-Díaz, Hueman; Pérez-Cervantes, Hilda; Zepeda-López, Héctor; Sánchez-Vallejo, Carlos Javier; Olguín-Ruiz, Gabriela Edith; Maldonado-Rodríguez, Rogelio; Méndez-Tenorio, Alfonso

    2013-01-01

    An Influenza Probe Set (IPS) consisting of 1,249 9-mer probes for genomic fingerprinting of closely and distantly related influenza virus strains was designed and tested in silico. The IPS was derived from alignments of influenza genomes. The RNA segments of 5,133 influenza strains with diverse degrees of relatedness were concatenated and aligned. After alignment, 9-mer sites having high Shannon entropy were identified. Additional criteria, such as G+C content between 35 and 65%, absence of consecutive dimer or trimer repeats, a minimum of 2 differences between 9-mers, and Tm values between 34.5 and 36.5 °C, were applied to select probes with high sequential entropy. Virtual hybridization was used to predict genomic fingerprints and assess the capability of the IPS to discriminate between influenza and related strains. Distance scores between pairs of influenza genomic fingerprints were calculated and used to estimate taxonomic trees. Visual examination of both the genomic fingerprints and the taxonomic trees suggests that the IPS is able to discriminate between distant and closely related influenza strains. It is proposed that the IPS can be used to investigate, by virtual or experimental hybridization, any new and potentially virulent strain. PMID:23750091
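    The column-entropy and composition filters described above can be sketched as follows. The numeric thresholds mirror those stated in the abstract, but the implementation details (entropy in bits, the exact tandem-repeat test) are assumptions for illustration.

```python
import math
from collections import Counter

def column_entropy(column):
    """Shannon entropy (bits) of one alignment column, e.g. 'ACGT' -> 2.0.
    High-entropy columns are the variable sites sought for probe design."""
    counts = Counter(column)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def gc_content(seq):
    """G+C content as a percentage of sequence length."""
    return 100.0 * sum(seq.count(b) for b in "GC") / len(seq)

def passes_filters(probe, gc_min=35.0, gc_max=65.0):
    """Composition filters analogous to those described for the IPS:
    G+C between 35 and 65%, and no consecutive dimer/trimer repeats."""
    if not gc_min <= gc_content(probe) <= gc_max:
        return False
    for k in (2, 3):  # reject e.g. 'ATAT...' (dimer) or 'ACGACG...' (trimer)
        for i in range(len(probe) - 2 * k + 1):
            if probe[i:i + k] == probe[i + k:i + 2 * k]:
                return False
    return True
```

    A full pipeline would additionally enforce the Tm window and the minimum pairwise difference between selected 9-mers; those steps are omitted here for brevity.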

  10. An Efficient Inverse Aerodynamic Design Method For Subsonic Flows

    NASA Technical Reports Server (NTRS)

    Milholen, William E., II

    2000-01-01

    Computational Fluid Dynamics based design methods are maturing to the point that they are beginning to be used in the aircraft design process. Many design methods however have demonstrated deficiencies in the leading edge region of airfoil sections. The objective of the present research is to develop an efficient inverse design method which is valid in the leading edge region. The new design method is a streamline curvature method, and a new technique is presented for modeling the variation of the streamline curvature normal to the surface. The new design method allows the surface coordinates to move normal to the surface, and has been incorporated into the Constrained Direct Iterative Surface Curvature (CDISC) design method. The accuracy and efficiency of the design method is demonstrated using both two-dimensional and three-dimensional design cases.

  11. Optimization and Application of Direct Infusion Nanoelectrospray HRMS Method for Large-Scale Urinary Metabolic Phenotyping in Molecular Epidemiology

    PubMed Central

    2017-01-01

    Large-scale metabolic profiling requires the development of novel economical high-throughput analytical methods to facilitate characterization of systemic metabolic variation in population phenotypes. We report a fit-for-purpose direct infusion nanoelectrospray high-resolution mass spectrometry (DI-nESI-HRMS) method with time-of-flight detection for rapid targeted parallel analysis of over 40 urinary metabolites. The newly developed 2 min infusion method requires <10 μL of urine sample and generates high-resolution MS profiles in both positive and negative polarities, enabling further data mining and relative quantification of hundreds of metabolites. Here we present optimization of the DI-nESI-HRMS method in a detailed step-by-step guide and provide a workflow with rigorous quality assessment for large-scale studies. We demonstrate for the first time the application of the method for urinary metabolic profiling in human epidemiological investigations. Implementation of the presented DI-nESI-HRMS method enabled cost-efficient analysis of >10 000 24 h urine samples from the INTERMAP study in 12 weeks and >2200 spot urine samples from the ARIC study in <3 weeks with the required sensitivity and accuracy. We illustrate the application of the technique by characterizing the differences in metabolic phenotypes of the USA and Japanese population from the INTERMAP study. PMID:28245357

  12. Global optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Arora, Jasbir S.

    1990-01-01

    The problem is to find a global minimum for Problem P. Necessary and sufficient conditions are available for local optimality; however, a global solution can be assured only under the assumption of convexity of the problem. If the constraint set S is compact and the cost function is continuous on it, existence of a global minimum is guaranteed. However, because no global optimality conditions are available, a global solution can be found only by an exhaustive search to satisfy the inequality. The exhaustive search can be organized in such a way that the entire design space need not be searched, which reduces the computational burden somewhat. It is concluded that the zooming algorithm for global optimization appears to be a good alternative to stochastic methods. More testing is needed, and a general, robust, and efficient local minimizer is required. IDESIGN, which is based on a sequential quadratic programming algorithm, was used in all numerical calculations; since the feasible set keeps shrinking, a good algorithm for finding an initial feasible point is required. Such algorithms need to be developed and evaluated.
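    A minimal sketch of a "zooming" search, assuming it means repeatedly restricting an exhaustive grid search to the neighbourhood of the current best point (the abstract does not specify the algorithm's details, so this is an interpretation):

```python
def zoom_minimize(f, lo, hi, grid=21, rounds=6):
    """Exhaustive 1D grid search that repeatedly 'zooms' into the interval
    around the best grid point, shrinking the search space each round.
    Caveat: a coarse first grid can still miss the global minimum of a
    highly multimodal function, which is why no general guarantee exists."""
    best = lo
    for _ in range(rounds):
        step = (hi - lo) / (grid - 1)
        xs = [lo + k * step for k in range(grid)]
        best = min(xs, key=f)
        lo, hi = best - step, best + step
    return best
```

    In practice each candidate region would be polished by a local minimizer (the role IDESIGN's SQP algorithm plays in the abstract) rather than by grid refinement alone.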

  13. Advances in spatial epidemiology and geographic information systems.

    PubMed

    Kirby, Russell S; Delmelle, Eric; Eberth, Jan M

    2017-01-01

    The field of spatial epidemiology has evolved rapidly in the past 2 decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied research in epidemiology. We highlight technical developments and opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research.
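    Distance estimation, one of the building blocks listed above, is commonly implemented with the haversine great-circle formula once addresses have been geocoded to coordinates. A minimal version (illustrative, with a spherical-Earth assumption):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees,
    assuming a spherical Earth of radius 6371 km. Often used to compute
    residence-to-facility or residence-to-source exposure distances."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

    Planar (projected) distances are often preferred for small study areas; the haversine form avoids projection choices when points span large regions.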

  14. A Review of the Epidemiological Methods Used to Investigate the Health Impacts of Air Pollution around Major Industrial Areas

    PubMed Central

    Pascal, Laurence; Bidondo, Marie-Laure; Cochet, Amandine; Sarter, Hélène; Stempfelet, Morgane; Wagner, Vérène

    2013-01-01

    We performed a literature review to investigate how epidemiological studies have been used to assess the health consequences of living in the vicinity of industries. 77 papers on the chronic effects of air pollution around major industrial areas were reviewed. Major health themes were cancers (27 studies), morbidity (25 studies), mortality (7 studies), and birth outcome (7 studies). Only 3 studies investigated mental health. While studies were available from many different countries, a majority of papers came from the United Kingdom, Italy, and Spain. Several studies were motivated by concerns from the population or by previous observations of an overincidence of cases. Geographical ecological designs were largely used for studying cancer and mortality, including statistical designs to quantify a relationship between health indicators and exposure. Morbidity was frequently investigated through cross-sectional surveys on the respiratory health of children. Few multicenter studies were performed. In a majority of papers, exposed areas were defined based on the distance to the industry and were located from <2 km to >20 km from the plants. Improving the exposure assessment would be an asset to future studies. Criteria to include industries in multicenter studies should be defined. PMID:23818910

  15. A review of the epidemiological methods used to investigate the health impacts of air pollution around major industrial areas.

    PubMed

    Pascal, Mathilde; Pascal, Laurence; Bidondo, Marie-Laure; Cochet, Amandine; Sarter, Hélène; Stempfelet, Morgane; Wagner, Vérène

    2013-01-01

    We performed a literature review to investigate how epidemiological studies have been used to assess the health consequences of living in the vicinity of industries. 77 papers on the chronic effects of air pollution around major industrial areas were reviewed. Major health themes were cancers (27 studies), morbidity (25 studies), mortality (7 studies), and birth outcome (7 studies). Only 3 studies investigated mental health. While studies were available from many different countries, a majority of papers came from the United Kingdom, Italy, and Spain. Several studies were motivated by concerns from the population or by previous observations of an overincidence of cases. Geographical ecological designs were largely used for studying cancer and mortality, including statistical designs to quantify a relationship between health indicators and exposure. Morbidity was frequently investigated through cross-sectional surveys on the respiratory health of children. Few multicenter studies were performed. In a majority of papers, exposed areas were defined based on the distance to the industry and were located from <2 km to >20 km from the plants. Improving the exposure assessment would be an asset to future studies. Criteria to include industries in multicenter studies should be defined.

  16. The Epidemiology of Substance Use Disorders in US Veterans: A Systematic Review and Analysis of Assessment Methods

    PubMed Central

    Lan, Chiao-Wen; Fiellin, David A.; Barry, Declan T.; Bryant, Kendall J.; Gordon, Adam J.; Edelman, E. Jennifer; Gaither, Julie R.; Maisto, Stephen A.; Marshall, Brandon D.L.

    2016-01-01

    Background Substance use disorders (SUDs), which encompass alcohol and drug use disorders (AUDs, DUDs), constitute a major public health challenge among US veterans. SUDs are among the most common and costly of all health conditions among veterans. Objectives This study sought to examine the epidemiology of SUDs among US veterans, compare the prevalence of SUDs in studies using diagnostic and administrative criteria assessment methods, and summarize trends in the prevalence of SUDs reported in studies sampling US veterans over time. Methods Comprehensive electronic database searches were conducted. A total of 3,490 studies were identified. We analyzed studies sampling US veterans and reporting prevalence, distribution, and examining AUDs and DUDs. Results Of the studies identified, 72 met inclusion criteria. The studies were published between 1995 and 2013. Studies using diagnostic criteria reported higher prevalence of AUDs (32% vs. 10%) and DUDs (20% vs. 5%) than administrative criteria, respectively. Regardless of assessment method, both the lifetime and past-year prevalence of AUDs in studies sampling US veterans have declined gradually over time. Conclusion The prevalence of SUDs reported in studies sampling US veterans is affected by assessment method. Given the significant public health problems of SUDs among US veterans, improved guidelines for clinical screening using validated diagnostic criteria to assess AUDs and DUDs in US veteran populations are needed. Scientific Significance These findings may inform VA and other healthcare systems in prevention, diagnosis, and intervention for SUDs among US veterans. PMID:26693830

  17. Alternative methods for the design of jet engine control systems

    NASA Technical Reports Server (NTRS)

    Sain, M. K.; Leake, R. J.; Basso, R.; Gejji, R.; Maloney, A.; Seshadri, V.

    1976-01-01

    Various alternatives to linear quadratic design methods for jet engine control systems are discussed. The main alternatives are classified into two broad categories: nonlinear global mathematical programming methods and linear local multivariable frequency domain methods. Specific studies within these categories include model reduction, the eigenvalue locus method, the inverse Nyquist method, polynomial design, dynamic programming, and conjugate gradient approaches.

  18. Demystifying Mixed Methods Research Design: A Review of the Literature

    ERIC Educational Resources Information Center

    Caruth, Gail D.

    2013-01-01

    Mixed methods research evolved in response to the observed limitations of both quantitative and qualitative designs and is a more complex method. The purpose of this paper was to examine mixed methods research in an attempt to demystify the design thereby allowing those less familiar with its design an opportunity to utilize it in future research.…

  19. MEASUREMENT ERROR ESTIMATION AND CORRECTION METHODS TO MINIMIZE EXPOSURE MISCLASSIFICATION IN EPIDEMIOLOGICAL STUDIES: PROJECT SUMMARY

    EPA Science Inventory

    This project summary highlights recent findings from research undertaken to develop improved methods to assess potential human health risks related to drinking water disinfection byproduct (DBP) exposures.

  20. The use of mathematical models in the epidemiological study of infectious diseases and in the design of mass immunization programmes.

    PubMed

    Nokes, D J; Anderson, R M

    1988-08-01

    The relationship between the number of people vaccinated against an infectious disease and the resulting decrease in incidence of the disease is not straightforward and linear, because many independent variables determine the course of infection. However, these variables are quantifiable and can therefore be used to model the course of an infectious disease and the impact of mass vaccination. Before one can construct a model, one must know, for any specific infectious disease, the number of individuals in the community protected by maternally derived antibodies, the number susceptible to infection, the number infected but not yet infectious (i.e., with latent infection), the number of infectious individuals, and the number of recovered (i.e., immune) individuals. Compartmental models are sets of differential equations which describe the rates of flow of individuals between these categories. Several major epidemiologic concepts comprise the ingredients of the model: the net rate of infection (i.e., incidence), the per capita rate of infection, the force of infection, and the basic reproductive rate of infection. When a community attains a high level of vaccination coverage, it is no longer necessary to vaccinate everyone, because the herd immunity of the population protects the unvaccinated by lowering the likelihood of their coming into contact with an infectious individual. Many infections that confer lasting immunity tend to have interepidemic periods when the number of susceptibles is too low to sustain an epidemic. Mass vaccination programs reduce the net rate of transmission of the infective organism; they also increase the length of the interepidemic period. Many diseases primarily associated with children have much more serious consequences in older people, and the question arises as to at what point childhood immunization will successfully prevent the more dangerous incidence of the disease in older cohorts. Mathematical models of disease transmission enable one…
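    The threshold logic behind herd immunity can be sketched with a deterministic SIR model: vaccinating a fraction above p_c = 1 - 1/R0 drives the effective reproductive rate below 1, so an epidemic cannot take off. The Euler-stepped model and parameter values below are illustrative assumptions, not the authors' model.

```python
def herd_immunity_threshold(r0):
    """Critical vaccination coverage p_c = 1 - 1/R0."""
    return 1.0 - 1.0 / r0

def simulate_sir(r0, recovery_rate, vaccinated_frac, days, n=1_000_000, i0=100):
    """Deterministic SIR with a fraction vaccinated at t=0 (moved straight
    to the recovered/immune compartment). beta is recovered from
    R0 = beta / gamma; integration is simple Euler stepping."""
    beta = r0 * recovery_rate
    s = (n - i0) * (1.0 - vaccinated_frac)
    i = float(i0)
    r = (n - i0) * vaccinated_frac
    dt = 0.1
    for _ in range(int(days / dt)):
        new_inf = beta * s * i / n * dt
        new_rec = recovery_rate * i * dt
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
    return s, i, r
```

    With R0 = 4 the threshold is 75%: coverage of 90% makes the initial infections fade out, while zero coverage produces a large epidemic that depletes most of the susceptible pool.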

  1. Computational Methods Applied to Rational Drug Design.

    PubMed

    Ramírez, David

    2016-01-01

    Due to the synergistic relationship between medicinal chemistry, bioinformatics and molecular simulation, the development of new, accurate computational tools for small-molecule drug design has been rising in recent years. The main result is the increased number of publications in which computational techniques such as molecular docking, de novo design and virtual screening have been used to estimate the binding mode, site and energy of novel small molecules. In this work I review some tools which enable the study of biological systems at the atomistic level, providing relevant information and thereby enhancing the process of rational drug design.

  2. Computational Methods Applied to Rational Drug Design

    PubMed Central

    Ramírez, David

    2016-01-01

    Due to the synergistic relationship between medicinal chemistry, bioinformatics and molecular simulation, the development of new, accurate computational tools for small-molecule drug design has been rising in recent years. The main result is the increased number of publications in which computational techniques such as molecular docking, de novo design and virtual screening have been used to estimate the binding mode, site and energy of novel small molecules. In this work I review some tools which enable the study of biological systems at the atomistic level, providing relevant information and thereby enhancing the process of rational drug design. PMID:27708723
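    As a concrete example of the kind of filter applied early in virtual screening campaigns, Lipinski's "rule of five" flags molecules unlikely to be orally bioavailable. The rule itself is standard; allowing at most one violation is a common convention rather than something stated in this review, and the descriptor values in the usage note are approximate.

```python
def lipinski_pass(mol_weight, log_p, h_donors, h_acceptors):
    """Lipinski 'rule of five' screen: molecular weight <= 500 Da,
    logP <= 5, <= 5 hydrogen-bond donors, <= 10 hydrogen-bond acceptors.
    Returns True if the molecule has at most one violation."""
    violations = sum([
        mol_weight > 500,
        log_p > 5,
        h_donors > 5,
        h_acceptors > 10,
    ])
    return violations <= 1  # common convention: tolerate one violation
```

    For instance, aspirin (roughly 180 Da, logP about 1.2, 1 donor, 4 acceptors) passes, while a 900 Da, highly lipophilic molecule would be filtered out before docking.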

  3. The Work Design Method for Human Friendly

    NASA Astrophysics Data System (ADS)

    Harada, Narumi; Sasaki, Masatoshi; Ichikawa, Masami

    In order to realize "the product life cycle with respect for human nature", we ought to carry out work design so that the work environment is sound in mind and body, with due consideration of not only physical but also mental factors from the viewpoint of workers. The former includes excessively heavy work, unreasonable working postures, local fatigue of the body, safety, and working comfort; the latter includes work motivation, work worthiness, stress, etc. For the purpose of evaluating the degree of working comfort and safety on human-oriented production lines, we confirmed the effectiveness of a work design technique that takes working time variation into account. We also formulated a model for a mental factor experienced by workers based on the degree of working delays. This study covers the work design technique we developed, with the effect of this factor used as the evaluation value.

  4. INFLUENCE OF EXPOSURE ASSESSMENT METHOD IN AN EPIDEMIOLOGIC STUDY OF TRIHALOMETHANE EXPOSURE AND SPONTANEOUS ABORTION

    EPA Science Inventory

    Trihalomethanes are common contaminants of chlorinated drinking water. Studies of their health effects have been hampered by exposure misclassification, due in part to limitations inherent in using utility sampling records. We used two exposure assessment methods, one based on ut...

  5. Supersonic biplane design via adjoint method

    NASA Astrophysics Data System (ADS)

    Hu, Rui

    In developing the next generation supersonic transport airplane, two major challenges must be resolved. The fuel efficiency must be significantly improved, and the sonic boom propagating to the ground must be dramatically reduced. Both of these objectives can be achieved by reducing the shockwaves formed in supersonic flight. The Busemann biplane is famous for using favorable shockwave interaction to achieve nearly shock-free supersonic flight at its design Mach number. Its performance at off-design Mach numbers, however, can be very poor. This dissertation studies the performance of supersonic biplane airfoils at design and off-design conditions. The choked flow and flow-hysteresis phenomena of these biplanes are studied. These effects are due to finite thickness of the airfoils and non-uniqueness of the solution to the Euler equations, creating over an order of magnitude more wave drag than that predicted by supersonic thin airfoil theory. As a result, the off-design performance is the major barrier to the practical use of supersonic biplanes. The main contribution of this work is to drastically improve the off-design performance of supersonic biplanes by using an adjoint based aerodynamic optimization technique. The Busemann biplane is used as the baseline design, and its shape is altered to achieve optimal wave drags in series of Mach numbers ranging from 1.1 to 1.7, during both acceleration and deceleration conditions. The optimized biplane airfoils dramatically reduces the effects of the choked flow and flow-hysteresis phenomena, while maintaining a certain degree of favorable shockwave interaction effects at the design Mach number. Compared to a diamond shaped single airfoil of the same total thickness, the wave drag of our optimized biplane is lower at almost all Mach numbers, and is significantly lower at the design Mach number. 
In addition, by performing a Navier-Stokes solution for the optimized airfoil, it is verified that the optimized biplane improves
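    The appeal of the adjoint method named above is that it yields the gradient of an objective with one extra "adjoint" solve, independent of the number of design variables. The sketch below illustrates the pattern on a deliberately trivial scalar stand-in (a hypothetical state equation a*u = b with objective J = u^2, not the dissertation's Euler solver), and checks the adjoint gradient against finite differences.

```python
# Toy adjoint-gradient sketch (hypothetical scalar model, NOT the Euler/Navier-Stokes
# solver of the dissertation): state equation a*u - b = 0, objective J(u) = u^2.
def solve_state(a, b=1.0):
    return b / a                        # u satisfying the state equation

def adjoint_gradient(a, b=1.0):
    u = solve_state(a, b)
    lam = (2.0 * u) / a                 # adjoint solve: a*lam = dJ/du = 2u
    return -lam * u                     # dJ/da = -lam * d(residual)/da = -lam*u

def fd_gradient(a, b=1.0, h=1e-6):
    J = lambda x: solve_state(x, b) ** 2
    return (J(a + h) - J(a - h)) / (2 * h)

a = 2.0
print(adjoint_gradient(a), fd_gradient(a))  # both ≈ -2/a**3 = -0.25
```

For one design variable the adjoint buys nothing; the payoff comes when `a` is a high-dimensional shape parameterization and the single adjoint solve replaces one finite-difference flow solve per variable.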

  6. Molecular epidemiology and a loop-mediated isothermal amplification method for diagnosis of infection with rabies virus in Zambia.

    PubMed

    Muleya, Walter; Namangala, Boniface; Mweene, Aaron; Zulu, Luke; Fandamu, Paul; Banda, Douglas; Kimura, Takashi; Sawa, Hirofumi; Ishii, Akihiro

    2012-01-01

    The National Livestock Epidemiology and Information Center (NALEIC) in Zambia reported over 132 cases of canine rabies diagnosed by the direct fluorescent antibody test (DFAT) from 2004 to 2009. In this study, the lineage of rabies virus (RABV) in Zambia was determined by phylogenetic analysis of the nucleoprotein (N) and glycoprotein (G) gene sequences. Total RNA was extracted from 87 DFAT brain specimens, of which only 35 (40%) were positive on nested reverse transcription polymerase chain reaction (RT-PCR) for either gene, with 26 positive for both genes. Positive specimens for the N (n=33) and G (n=35) genes were used for phylogenetic analysis. Phylogenetic analysis of the N gene showed two phylogenetic clusters in Zambia belonging to the Africa 1b lineage present in eastern and southern Africa. While one cluster exclusively comprised Zambian strains, the other was more heterogeneous regarding the RABV origins and included strains from Tanzania, Mozambique and Zambia. Phylogenetic analysis of the G gene revealed similar RABV strains in different hosts and regions of Zambia. We designed primers for a reverse transcription loop-mediated isothermal amplification (RT-LAMP) assay from the consensus sequence of the N gene in an attempt to improve the molecular diagnosis of RABV in Zambia. The specificity and reproducibility of the RT-LAMP assay were confirmed with actual clinical specimens. Therefore, the RT-LAMP assay presented in this study may prove to be useful for routine diagnosis of rabies in Zambia.
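    A step the abstract mentions in passing, deriving a consensus sequence from aligned gene sequences before primer design, can be sketched in a few lines. The sequences below are invented for illustration, not RABV N-gene data, and real primer design involves much more than a majority-vote consensus.

```python
# Majority-vote consensus over aligned sequences (toy data, not RABV genes).
from collections import Counter

def consensus(aligned):
    assert len({len(s) for s in aligned}) == 1, "sequences must be aligned"
    # For each column, keep the most frequent base.
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*aligned))

seqs = ["ATGCGT", "ATGAGT", "ATGCGT", "TTGCGT"]
print(consensus(seqs))  # -> "ATGCGT"
```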

  7. A method for nonlinear optimization with discrete design variables

    NASA Technical Reports Server (NTRS)

    Olsen, Gregory R.; Vanderplaats, Garret N.

    1987-01-01

    A numerical method is presented for the solution of nonlinear discrete optimization problems. The applicability of discrete optimization to engineering design is discussed, and several standard structural optimization problems are solved using discrete design variables. The method uses approximation techniques to create subproblems suitable for linear mixed-integer programming methods. The method employs existing software for continuous optimization and integer programming.
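    The essence of discrete design variables can be shown with a deliberately small sizing problem: choose member cross-sections from a catalogue so that stresses stay within limits, minimizing weight. This is a brute-force sketch with invented numbers, not the paper's approximation/mixed-integer scheme, which exists precisely because enumeration does not scale.

```python
# Discrete sizing by enumeration (hypothetical two-bar example; real problems
# need the mixed-integer techniques the paper describes).
from itertools import product

AREAS = [1.0, 1.5, 2.0, 3.0]   # available cross-sections (in^2), made up
LOADS = [30.0, 45.0]           # axial force in each bar (kip), made up
STRESS_LIMIT = 20.0            # allowable stress (ksi)
LENGTHS = [100.0, 100.0]       # bar lengths (in)

def weight(areas):
    return sum(a * L for a, L in zip(areas, LENGTHS))

best = min(
    (combo for combo in product(AREAS, repeat=2)
     if all(P / a <= STRESS_LIMIT for P, a in zip(LOADS, combo))),
    key=weight,
)
print(best)  # -> (1.5, 3.0): the smallest catalogue areas satisfying each stress limit
```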

  8. JASMINE design and method of data reduction

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Niwa, Yoshito

    2008-07-01

    The Japan Astrometry Satellite Mission for Infrared Exploration (JASMINE) aims to construct a map of the Galactic bulge with 10 μarcsec accuracy. We use a z-band CCD to avoid dust absorption, and observe an area of about 10 × 20 degrees around the Galactic bulge region. Because the stellar density is very high, the individual FOVs can be combined with high accuracy. With 5 years of observation, we will construct a map accurate to 10 μarcsec. In this poster, I show the observation strategy, the design of the JASMINE hardware, the reduction scheme, and the error budget. We have also constructed simulation software named the JASMINE Simulator, and show simulation results and the design of the software.

  9. Method to Select Metropolitan Areas of Epidemiologic Interest for Enhanced Air Quality Monitoring

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s current Speciation Trends Network (STN) covers most major U.S. metropolitan areas and a wide range of particulate matter (PM) constituents and gaseous co-pollutants. However, using filter-based methods, most PM constituents are measured ...

  10. Methods for library-scale computational protein design.

    PubMed

    Johnson, Lucas B; Huber, Thaddaus R; Snow, Christopher D

    2014-01-01

    Faced with a protein engineering challenge, a contemporary researcher can choose from myriad design strategies. Library-scale computational protein design (LCPD) is a hybrid method suitable for the engineering of improved protein variants with diverse sequences. This chapter discusses the background and merits of several practical LCPD techniques. First, LCPD methods suitable for delocalized protein design are presented in the context of example design calculations for cellobiohydrolase II. Second, localized design methods are discussed in the context of an example design calculation intended to shift the substrate specificity of a ketol-acid reductoisomerase Rossmann domain from NADPH to NADH.

  11. Novel Microbiological and Spatial Statistical Methods to Improve Strength of Epidemiological Evidence in a Community-Wide Waterborne Outbreak

    PubMed Central

    Jalava, Katri; Rintala, Hanna; Ollgren, Jukka; Maunula, Leena; Gomez-Alvarez, Vicente; Revez, Joana; Palander, Marja; Antikainen, Jenni; Kauppinen, Ari; Räsänen, Pia; Siponen, Sallamaari; Nyholm, Outi; Kyyhkynen, Aino; Hakkarainen, Sirpa; Merentie, Juhani; Pärnänen, Martti; Loginov, Raisa; Ryu, Hodon; Kuusi, Markku; Siitonen, Anja; Miettinen, Ilkka; Santo Domingo, Jorge W.; Hänninen, Marja-Liisa; Pitkänen, Tarja

    2014-01-01

    Failures in the drinking water distribution system cause gastrointestinal outbreaks with multiple pathogens. A water distribution pipe breakage caused a community-wide waterborne outbreak in Vuorela, Finland, July 2012. We investigated this outbreak with advanced epidemiological and microbiological methods. A total of 473/2931 inhabitants (16%) responded to a web-based questionnaire. Water and patient samples were subjected to analysis of multiple microbial targets, molecular typing and microbial community analysis. Spatial analysis of the water distribution network was performed, and we applied a spatial logistic regression model. The course of the illness was mild. Drinking untreated tap water from the defined outbreak area was significantly associated with illness (RR 5.6, 95% CI 1.9–16.4), increasing in a dose-response manner. The closer a person lived to the water distribution breakage point, the higher the risk of becoming ill. Sapovirus, enterovirus, single Campylobacter jejuni and EHEC O157:H7 findings as well as virulence genes for EPEC, EAEC and EHEC pathogroups were detected by molecular or culture methods from the faecal samples of the patients. EPEC, EAEC and EHEC virulence genes and faecal indicator bacteria were also detected in water samples. Microbial community sequencing of contaminated tap water revealed an abundance of Arcobacter species. The polyphasic approach improved the understanding of the source of the infections, and helped to define the extent and magnitude of this outbreak. PMID:25147923
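    The risk-ratio estimate with its confidence interval quoted above can be illustrated with the standard log-scale formula for a 2×2 table. The counts below are invented for demonstration; they are not the Vuorela questionnaire data and do not reproduce the paper's RR of 5.6.

```python
# Risk ratio with a 95% CI via the log-RR standard error (illustrative counts only).
import math

def risk_ratio(ill_exp, n_exp, ill_unexp, n_unexp, z=1.96):
    rr = (ill_exp / n_exp) / (ill_unexp / n_unexp)
    se = math.sqrt(1/ill_exp - 1/n_exp + 1/ill_unexp - 1/n_unexp)
    lo, hi = (math.exp(math.log(rr) + s * z * se) for s in (-1, 1))
    return rr, lo, hi

rr, lo, hi = risk_ratio(ill_exp=60, n_exp=200, ill_unexp=10, n_unexp=200)
print(f"RR {rr:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
```

A spatial logistic regression, as used in the study, additionally conditions on location along the network; the plain RR above is only the unadjusted starting point.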

  12. Novel microbiological and spatial statistical methods to improve strength of epidemiological evidence in a community-wide waterborne outbreak.

    PubMed

    Jalava, Katri; Rintala, Hanna; Ollgren, Jukka; Maunula, Leena; Gomez-Alvarez, Vicente; Revez, Joana; Palander, Marja; Antikainen, Jenni; Kauppinen, Ari; Räsänen, Pia; Siponen, Sallamaari; Nyholm, Outi; Kyyhkynen, Aino; Hakkarainen, Sirpa; Merentie, Juhani; Pärnänen, Martti; Loginov, Raisa; Ryu, Hodon; Kuusi, Markku; Siitonen, Anja; Miettinen, Ilkka; Santo Domingo, Jorge W; Hänninen, Marja-Liisa; Pitkänen, Tarja

    2014-01-01

    Failures in the drinking water distribution system cause gastrointestinal outbreaks with multiple pathogens. A water distribution pipe breakage caused a community-wide waterborne outbreak in Vuorela, Finland, July 2012. We investigated this outbreak with advanced epidemiological and microbiological methods. A total of 473/2931 inhabitants (16%) responded to a web-based questionnaire. Water and patient samples were subjected to analysis of multiple microbial targets, molecular typing and microbial community analysis. Spatial analysis on the water distribution network was done and we applied a spatial logistic regression model. The course of the illness was mild. Drinking untreated tap water from the defined outbreak area was significantly associated with illness (RR 5.6, 95% CI 1.9-16.4) increasing in a dose response manner. The closer a person lived to the water distribution breakage point, the higher the risk of becoming ill. Sapovirus, enterovirus, single Campylobacter jejuni and EHEC O157:H7 findings as well as virulence genes for EPEC, EAEC and EHEC pathogroups were detected by molecular or culture methods from the faecal samples of the patients. EPEC, EAEC and EHEC virulence genes and faecal indicator bacteria were also detected in water samples. Microbial community sequencing of contaminated tap water revealed abundance of Arcobacter species. The polyphasic approach improved the understanding of the source of the infections, and aided to define the extent and magnitude of this outbreak.

  13. Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology-Veterinary (STROBE-Vet) Statement.

    PubMed

    Sargeant, J M; O'Connor, A M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P

    2016-12-01

    Reporting of observational studies in veterinary research presents challenges that often are not addressed in published reporting guidelines. Our objective was to develop an extension of the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) statement that addresses unique reporting requirements for observational studies in veterinary medicine related to health, production, welfare, and food safety. We conducted a consensus meeting with 17 experts in Mississauga, Canada. Experts completed a premeeting survey about whether items in the STROBE statement should be modified or added to address unique issues related to observational studies in animal species with health, production, welfare, or food safety outcomes. During the meeting, each STROBE item was discussed to determine whether or not rewording was recommended, and whether additions were warranted. Anonymous voting was used to determine consensus. Six items required no modifications or additions. Modifications or additions were made to the STROBE items 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources and measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations), and 22 (funding). The methods and processes used were similar to those used for other extensions of the STROBE statement. The use of this STROBE statement extension should improve reporting of observational studies in veterinary research by recognizing unique features of observational studies involving food-producing and companion animals, products of animal origin, aquaculture, and wildlife.

  14. The application of mixed methods designs to trauma research.

    PubMed

    Creswell, John W; Zhang, Wanqing

    2009-12-01

    Despite the use of quantitative and qualitative data in trauma research and therapy, mixed methods studies in this field have not been analyzed to help researchers design investigations. This discussion begins by reviewing four core characteristics of mixed methods research in the social and human sciences. Combining these characteristics, the authors focus on four select mixed methods designs that are applicable in trauma research. These designs are defined and their essential elements noted. Applying these designs to trauma research, a search was conducted to locate mixed methods trauma studies. From this search, one sample study was selected, and its mixed methods procedures were noted. Finally, drawing on other available mixed methods designs, several follow-up mixed methods studies were described for this sample study, enabling trauma researchers to view design options for applying mixed methods research in trauma investigations.

  15. Lithography aware overlay metrology target design method

    NASA Astrophysics Data System (ADS)

    Lee, Myungjun; Smith, Mark D.; Lee, Joonseuk; Jung, Mirim; Lee, Honggoo; Kim, Youngsik; Han, Sangjun; Adel, Michael E.; Lee, Kangsan; Lee, Dohwa; Choi, Dongsub; Liu, Zephyr; Itzkovich, Tal; Levinski, Vladimir; Levy, Ady

    2016-03-01

    We present a metrology target design (MTD) framework based on co-optimizing lithography and metrology performance. The overlay metrology performance is strongly related to the target design, and optimizing the target under different process variations in a high-NA optical lithography tool and measurement conditions in a metrology tool becomes critical for sub-20nm nodes. The lithography performance can be quantified by device matching and printability metrics, while accuracy and precision metrics are used to quantify the metrology performance. Using these metrics, we demonstrate how the optimized target can improve target printability while maintaining good metrology performance for rotated dipole illumination used for printing a sub-100nm diagonal feature in a memory active layer. The remaining challenges and the existing tradeoff between metrology and lithography performance are explored from the metrology target designer's perspective. The proposed target design framework is completely general and can be used to optimize targets for different lithography conditions. The results from our analysis are both physically sensible and in good agreement with experimental results.

  16. [Epidemiology and heterogeny].

    PubMed

    Breilh, J; Granda, E

    1989-01-01

    The innovation of epidemiology plays a crucial role in the development of the health sciences. The authors emphasize the importance of epistemological analysis related to scientific and technical production. They focus on the theoretical and methodological contributions of the principal Latin American groups in the field of epidemiology, stating their main accomplishments, issues and potentials. When reviewing those conceptual and practical innovations, the authors analyse the effects of broader historical conditions on scientific work. To them, Latin American contemporary innovative epidemiological research and production have developed clearly differentiated principles, methods and technical projections which have led to a movement of critical or 'social' epidemiology. The functionalist approach of conventional epidemiology, characterized by an empiricist viewpoint, is being overcome by a more rigorous and analytical approach. This new epidemiological approach, in which the authors as members of CEAS (Health Research and Advisory Center) are working, has selectively incorporated some of the technical instruments of conventional epidemiology, subordinating them to a different theoretical and logical paradigm. The new framework of this group explains the need to consider the people's objective situation and necessities, when constructing scientific interpretations and planning technical action. In order to accomplish this goal, epidemiological reasoning has to reflect the unity of external epidemiological facts and associations, the so-called phenomenological aspect of health, with the underlying determinants and conditioning processes or internal relations, which are the essence of the health-disease production and distribution process. 
Epidemiological analysis is considered not only as a problem of empirical observation but as a process of theoretical construction, in which there is a dynamic fusion of deductive and inductive reasoning.(ABSTRACT TRUNCATED AT 250

  17. Snippets from the past: the evolution of Wade Hampton Frost's epidemiology as viewed from the American Journal of Hygiene/Epidemiology.

    PubMed

    Morabia, Alfredo

    2013-10-01

    Wade Hampton Frost, who was a Professor of Epidemiology at Johns Hopkins University from 1919 to 1938, spurred the development of epidemiologic methods. His 6 publications in the American Journal of Hygiene, which later became the American Journal of Epidemiology, comprise a 1928 Cutter lecture on a theory of epidemics, a survey-based study of tonsillectomy and immunity to Corynebacterium diphtheriae (1931), 2 papers from a longitudinal study of the incidence of minor respiratory diseases (1933 and 1935), an attack rate ratio analysis of the decline of diphtheria in Baltimore (1936), and a 1936 lecture on the age, time, and cohort analysis of tuberculosis mortality. These 6 American Journal of Hygiene/American Journal of Epidemiology papers attest that Frost's personal evolution mirrored that of the emerging "early" epidemiology: The scope of epidemiology extended beyond the study of epidemics of acute infectious diseases, and rigorous comparative study designs and their associated quantitative methods came to light.

  18. A comparison of digital flight control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Many variations in design methods for aircraft digital flight control have been proposed in the literature. In general, the methods fall into two categories: those where the design is done in the continuous domain (or s-plane), and those where the design is done in the discrete domain (or z-plane). This paper evaluates several variations of each category and compares them for various flight control modes of the Langley TCV Boeing 737 aircraft. Design method fidelity is evaluated by examining closed loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps except the 'uncompensated s-plane design' method which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
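    The s-plane versus z-plane split described above comes down to when the controller is discretized. One common bridge between the two domains is the Tustin (bilinear) transform; the sketch below discretizes a hypothetical first-order filter C(s) = a/(s + a) this way and steps it, which is illustrative of the general technique rather than of the specific TCV 737 control laws.

```python
# Tustin (bilinear) discretisation of C(s) = a/(s + a): substituting
# s -> (2/T)(z-1)/(z+1) yields y[k] = b*(u[k]+u[k-1]) + c*y[k-1].
def tustin_first_order(a, T):
    k = 2.0 / T
    b = a / (k + a)
    c = (k - a) / (k + a)
    return b, c

def step_response(a, T, n):
    b, c = tustin_first_order(a, T)
    y, u_prev, out = 0.0, 0.0, []
    for _ in range(n):
        y = b * (1.0 + u_prev) + c * y   # unit-step input u[k] = 1
        u_prev = 1.0
        out.append(y)
    return out

resp = step_response(a=1.0, T=0.1, n=50)
print(resp[-1])  # approaches the continuous filter's DC gain of 1.0
```

Higher sample rates (smaller T) make the discrete response track the continuous one more closely, which is the fidelity-versus-sample-rate tradeoff the paper evaluates.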

  19. Web tools for molecular epidemiology of tuberculosis.

    PubMed

    Shabbeer, Amina; Ozcaglar, Cagri; Yener, Bülent; Bennett, Kristin P

    2012-06-01

    In this study we explore publicly available web tools designed to use molecular epidemiological data to extract information that can be employed for the effective tracking and control of tuberculosis (TB). The application of molecular methods for the epidemiology of TB complement traditional approaches used in public health. DNA fingerprinting methods are now routinely employed in TB surveillance programs and are primarily used to detect recent transmissions and in outbreak investigations. Here we present web tools that facilitate systematic analysis of Mycobacterium tuberculosis complex (MTBC) genotype information and provide a view of the genetic diversity in the MTBC population. These tools help answer questions about the characteristics of MTBC strains, such as their pathogenicity, virulence, immunogenicity, transmissibility, drug-resistance profiles and host-pathogen associativity. They provide an integrated platform for researchers to use molecular epidemiological data to address current challenges in the understanding of TB dynamics and the characteristics of MTBC.

  20. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale, where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.
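    The propagation step described above can be sketched with plain Monte Carlo: sample the primitive variables from their scatter, push each sample through the response model, and count failures. The normal distributions and units below are invented placeholders, not the report's material data or its computational simulation.

```python
# Monte Carlo propagation of primitive-variable scatter to a failure probability
# (hypothetical strength/stress distributions, illustrative only).
import random

random.seed(1)

def failure_probability(n=100_000):
    fails = 0
    for _ in range(n):
        strength = random.gauss(100.0, 10.0)   # material capability (made-up units)
        stress = random.gauss(70.0, 8.0)       # applied load effect
        fails += stress >= strength
    return fails / n

print(failure_probability())  # small tail probability, here on the order of 1%
```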

  1. A graph-theory method for pattern identification in geographical epidemiology – a preliminary application to deprivation and mortality

    PubMed Central

    Maheswaran, Ravi; Craigs, Cheryl; Read, Simon; Bath, Peter A; Willett, Peter

    2009-01-01

    Background Graph theoretical methods are extensively used in the field of computational chemistry to search datasets of compounds to see if they contain particular molecular sub-structures or patterns. We describe a preliminary application of a graph theoretical method, developed in computational chemistry, to geographical epidemiology in relation to testing a prior hypothesis. We tested the methodology on the hypothesis that if a socioeconomically deprived neighbourhood is situated in a wider deprived area, then that neighbourhood would experience greater adverse effects on mortality compared with a similarly deprived neighbourhood which is situated in a wider area with generally less deprivation. Methods We used the Trent Region Health Authority area for this study, which contained 10,665 census enumeration districts (CED). Graphs are mathematical representations of objects and their relationships and within the context of this study, nodes represented CEDs and edges were determined by whether or not CEDs were neighbours (shared a common boundary). The overall area in this study was represented by one large graph comprising all CEDs in the region, along with their adjacency information. We used mortality data from 1988–1998, CED level population estimates and the Townsend Material Deprivation Index as an indicator of neighbourhood level deprivation. We defined deprived CEDs as those in the top 20% most deprived in the Region. We then set out to classify these deprived CEDs into seven groups defined by increasing deprivation levels in the neighbouring CEDs. 506 (24.2%) of the deprived CEDs had five adjacent CEDs and we limited pattern development and searching to these CEDs. We developed seven query patterns and used the RASCAL (Rapid Similarity Calculator) program to carry out the search for each of the query patterns. This program used a maximum common subgraph isomorphism method which was modified to handle geographical data. Results Of the 506 deprived CEDs
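    The core data structure of the study, areas as nodes with edges between boundary-sharing neighbours, and the idea of a query pattern over a node's neighbourhood, can be sketched as follows. The toy graph and deprivation set are invented; the actual study used RASCAL's maximum common subgraph isomorphism over 10,665 CEDs, not this simple neighbourhood test.

```python
# Adjacency-graph sketch: a "query pattern" asking for deprived areas whose
# neighbours are all deprived too (toy data, not the Trent Region CEDs).
adjacency = {
    "A": ["B", "C"], "B": ["A", "C", "D"],
    "C": ["A", "B", "D"], "D": ["B", "C", "E"], "E": ["D"],
}
deprived = {"A", "B", "C", "D"}

def matches_pattern(node):
    """Deprived node surrounded entirely by deprived neighbours."""
    return node in deprived and all(n in deprived for n in adjacency[node])

print(sorted(n for n in adjacency if matches_pattern(n)))  # -> ['A', 'B', 'C']
```

The seven query patterns in the paper generalise this by grading how many of the five neighbours are deprived, rather than requiring all of them.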

  2. A comparison of methods currently used in inclusive design.

    PubMed

    Goodman-Deane, Joy; Ward, James; Hosking, Ian; Clarkson, P John

    2014-07-01

    Inclusive design has unique challenges because it aims to improve usability for a wide range of users. This typically includes people with lower levels of ability, as well as mainstream users. This paper examines the effectiveness of two methods that are used in inclusive design: user trials and exclusion calculations (an inclusive design inspection method). A study examined three autoinjectors using both methods (n=30 for the user trials). The usability issues identified by each method are compared and the effectiveness of the methods is discussed. The study found that each method identified different kinds of issues, all of which are important for inclusive design. We therefore conclude that a combination of methods should be used in inclusive design rather than relying on a single method. Recommendations are also given for how the individual methods can be used more effectively in this context.
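    An exclusion calculation of the kind named above can be illustrated as: given the ability level a product demands, estimate the share of the population that falls below it. The ability scores here are a fabricated stand-in for the survey-based capability data such inspections actually use.

```python
# Exclusion-calculation sketch: share of a (synthetic) population whose ability
# falls below a product's demand level. Scores are invented, not survey data.
users = [1, 2, 2, 3, 3, 3, 4, 4, 5, 5]   # e.g. dexterity on a 1 (low) - 5 (high) scale

def exclusion_rate(demand, population):
    excluded = sum(ability < demand for ability in population)
    return excluded / len(population)

for demand in (2, 3, 4):
    print(demand, exclusion_rate(demand, users))  # exclusion grows with demand
```

Lowering the demand a design places on users (e.g. an easier-to-grip autoinjector cap) directly shrinks this excluded fraction, which is the metric's design leverage.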

  3. [Pharmacological vigilance and pharmacological epidemiology: principles, definition, methods and current trends in neurology].

    PubMed

    Montastruc, J L; Bagheri, H; Lapeyre-Mestre, M; Senard, J M

    1999-04-01

    It is now well established that clinical trials performed before drug approval are not, by themselves, sufficient for a full modern pharmacological evaluation of drugs and treatments. The need for both pharmacovigilance and pharmacoepidemiology is underlined in order to evaluate drugs under real conditions. After a summary of the methods used in pharmacoepidemiological trials (spontaneous reports, imputability assessment, cohorts, case-control studies, etc.), recent pharmacoepidemiological data useful for the neurologist are summarized: side effects of tacrine and vaccines, serotoninergic syndrome, and side effects of new antiepileptic drugs.

  4. Soft computing methods in design of superalloys

    NASA Technical Reports Server (NTRS)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1995-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modeled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.
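    The surrogate-plus-genetic-algorithm loop described above can be sketched compactly. A made-up quadratic stands in for the trained neural network model of K(sub a) as a function of two composition fractions; the GA itself (elitist selection, uniform crossover, Gaussian mutation) is a generic sketch, not the NASA Lewis implementation.

```python
# GA minimising a surrogate model (a fabricated quadratic standing in for the
# trained neural network; variables are two hypothetical composition fractions).
import random

random.seed(0)

def surrogate_ka(x):
    cr, al = x
    return (cr - 0.30) ** 2 + (al - 0.10) ** 2   # stand-in for the NN's K_a prediction

def ga(pop_size=40, gens=60, sigma=0.05):
    pop = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=surrogate_ka)
        parents = pop[: pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # uniform crossover + Gaussian mutation, gene by gene
            children.append([random.choice(g) + random.gauss(0.0, sigma)
                             for g in zip(a, b)])
        pop = parents + children
    return min(pop, key=surrogate_ka)

best = ga()
print(best)  # converges near the surrogate's optimum at (0.30, 0.10)
```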

  5. Soft Computing Methods in Design of Superalloys

    NASA Technical Reports Server (NTRS)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1996-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modelled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.

  6. Computational Methods for Design, Control and Optimization

    DTIC Science & Technology

    2007-10-01

    "scenario" that applies to channel flows (Poiseuille flow, Couette flow) and pipe flows. Over the past 75 years many complex "transition theories" have... Simulation of Turbulent Flows, Springer Verlag, 2005. Additional Publications Supported by this Grant: 1. J. Borggaard and T. Iliescu, Approximate Deconvolution... rigorous analysis of design algorithms that combine numerical simulation codes, approximate sensitivity calculations and optimization codes. The fundamental

  7. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general, domain-independent methods to approaches implementable for specific applications or domains. Different approaches to higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Though a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  8. Waterflooding injectate design systems and methods

    DOEpatents

    Brady, Patrick V.; Krumhansl, James L.

    2016-12-13

    A method of recovering a liquid hydrocarbon using an injectate includes recovering the liquid hydrocarbon through primary extraction. Physico-chemical data representative of electrostatic interactions between the liquid hydrocarbon and the reservoir rock are measured. At least one additive of the injectate is selected based on the physico-chemical data. The method includes recovering the liquid hydrocarbon from the reservoir rock through secondary extraction using the injectate.

  9. Hormonal contraceptive methods and risk of HIV acquisition in women: a systematic review of epidemiological evidence.

    PubMed

    Polis, Chelsea B; Phillips, Sharon J; Curtis, Kathryn M; Westreich, Daniel J; Steyn, Petrus S; Raymond, Elizabeth; Hannaford, Philip; Turner, Abigail Norris

    2014-10-01

    Whether use of various types of hormonal contraception (HC) affect risk of HIV acquisition is a critical question for women's health. For this systematic review, we identified 22 studies published by January 15, 2014 which met inclusion criteria; we classified thirteen studies as having severe methodological limitations, and nine studies as "informative but with important limitations". Overall, data do not support an association between use of oral contraceptives and increased risk of HIV acquisition. Uncertainty persists regarding whether an association exists between depot-medroxyprogesterone acetate (DMPA) use and risk of HIV acquisition. Most studies suggested no significantly increased HIV risk with norethisterone enanthate (NET-EN) use, but when assessed in the same study, point estimates for NET-EN tended to be larger than for DMPA, though 95% confidence intervals overlapped substantially. No data have suggested significantly increased risk of HIV acquisition with use of implants, though data were limited. No data are available on the relationship between use of contraceptive patches, rings, or hormonal intrauterine devices and risk of HIV acquisition. Women choosing progestin-only injectable contraceptives such as DMPA or NET-EN should be informed of the current uncertainty regarding whether use of these methods increases risk of HIV acquisition, and like all women at risk of HIV, should be empowered to access and use condoms and other HIV preventative measures. Programs, practitioners, and women urgently need guidance on how to maximize health with respect to avoiding both unintended pregnancy and HIV given inconclusive or limited data for certain HC methods.

  10. Methods for estimation of radiation risk in epidemiological studies accounting for classical and Berkson errors in doses.

    PubMed

    Kukush, Alexander; Shklyar, Sergiy; Masiuk, Sergii; Likhtarov, Illya; Kovgan, Lina; Carroll, Raymond J; Bouville, Andre

    2011-02-16

    With a binary response Y, the dose-response model under consideration is logistic in flavor, with pr(Y=1 | D) = R(1+R)^(-1), R = λ0 + EAR·D, where λ0 is the baseline incidence rate and EAR is the excess absolute risk per gray. The calculated thyroid dose of a person i is expressed as D_i^mes = f_i Q_i^mes / M_i^mes. Here, Q_i^mes is the measured content of radioiodine in the thyroid gland of person i at time t^mes, M_i^mes is the estimate of the thyroid mass, and f_i is the normalizing multiplier. The Q_i and M_i are measured with multiplicative errors V_i^Q and V_i^M, so that Q_i^mes = Q_i^tr V_i^Q (a classical measurement error model) and M_i^tr = M_i^mes V_i^M (a Berkson measurement error model). Here, Q_i^tr is the true content of radioactivity in the thyroid gland, and M_i^tr is the true value of the thyroid mass. The error in f_i is much smaller than the errors in (Q_i^mes, M_i^mes) and is ignored in the analysis. By means of parametric full maximum likelihood and regression calibration (under the assumption that the data set of true doses has a lognormal distribution), nonparametric full maximum likelihood, nonparametric regression calibration, and a properly tuned SIMEX method, we study the influence of measurement errors in thyroid dose on the estimates of λ0 and EAR. A simulation study is presented based on a real sample from the epidemiological studies. The doses were reconstructed in the framework of the Ukrainian-American project on the investigation of post-Chernobyl thyroid cancers in Ukraine, and the underlying subpopulation was artificially enlarged in order to increase the statistical power. The true risk parameters were set to values from earlier epidemiological studies, and the binary response was then simulated according to the dose-response model.
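    The simulation design the abstract describes, lognormal true doses, multiplicative classical error on the measured dose, and a binary response drawn from the excess-absolute-risk model, can be sketched directly. The parameter values below are invented for illustration, not the Chernobyl-study values.

```python
# Simulating the abstract's dose-response model pr(Y=1 | D) = R/(1+R),
# R = lambda0 + EAR*D, with classical multiplicative error on the measured dose.
# Parameter values are invented placeholders.
import random

random.seed(3)
LAMBDA0, EAR = 0.2, 0.5               # baseline rate and excess absolute risk per Gy

def p_ill(dose):
    r = LAMBDA0 + EAR * dose
    return r / (1.0 + r)

def simulate(n=50_000, sigma=0.4):
    """True dose is lognormal; the measured dose carries multiplicative error V."""
    data = []
    for _ in range(n):
        d_true = random.lognormvariate(0.0, 0.5)
        d_meas = d_true * random.lognormvariate(0.0, sigma)   # classical error
        y = random.random() < p_ill(d_true)                   # response uses true dose
        data.append((d_meas, y))
    return data

data = simulate()
print(sum(y for _, y in data) / len(data))  # overall illness fraction
```

Fitting the model naively to `d_meas` attenuates the EAR estimate; the paper's regression calibration, full maximum likelihood, and SIMEX variants are different ways of correcting for that.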

  11. The Triton: Design concepts and methods

    NASA Technical Reports Server (NTRS)

    Meholic, Greg; Singer, Michael; Vanryn, Percy; Brown, Rhonda; Tella, Gustavo; Harvey, Bob

    1992-01-01

    During the design of the C & P Aerospace Triton, a few problems were encountered that necessitated changes in the configuration. After the initial concept phase, the aspect ratio was increased from 7 to 7.6 to produce a greater lift to drag ratio (L/D = 13) which satisfied the horsepower requirements (118 hp using the Lycoming O-235 engine). The initial concept had a wing planform area of 134 sq. ft. Detailed wing sizing analysis enlarged the planform area to 150 sq. ft., without changing its layout or location. The most significant changes, however, were made just prior to inboard profile design. The fuselage external diameter was reduced from 54 to 50 inches to reduce drag to meet the desired cruise speed of 120 knots. Also, the nose was extended 6 inches to accommodate landing gear placement. Without the extension, the nosewheel received an unacceptable percentage (25 percent) of the landing weight. The final change in the configuration was made in accordance with the stability and control analysis. In order to reduce the static margin from 20 to 13 percent, the horizontal tail area was reduced from 32.02 to 25.0 sq. ft. The Triton meets all the specifications set forth in the design criteria. If time permitted another iteration of the calculations, two significant changes would be made. The vertical stabilizer area would be reduced to decrease the aircraft lateral stability slope since the current value was too high in relation to the directional stability slope. Also, the aileron size would be decreased to reduce the roll rate below the current 106 deg/second. Doing so would allow greater flap area (increasing CL(sub max)) and thus reduce the overall wing area. C & P would also recalculate the horsepower and drag values to further validate the 120 knot cruising speed.

  12. Research and Methods for Simulation Design: State of the Art

    DTIC Science & Technology

    1990-09-01

    designers. Designers may use this review to identify methods to aid the training-device design process and individuals who manage research programs...maximum training effectiveness at a given cost. The methods should apply to the concept-formulation phase of the training-device development process...design process. Finally, individuals who manage research programs may use this information to set priorities for future research efforts.

  13. Epidemiology: Then and Now.

    PubMed

    Kuller, Lewis H

    2016-03-01

    Twenty-five years ago, on the 75th anniversary of the Johns Hopkins Bloomberg School of Public Health, I noted that epidemiologic research was moving away from the traditional approaches used to investigate "epidemics" and their close relationship with preventive medicine. Twenty-five years later, the role of epidemiology as an important contribution to human population research, preventive medicine, and public health is under substantial pressure because of the emphasis on "big data," phenomenology, and personalized medical therapies. Epidemiology is the study of epidemics. The primary role of epidemiology is to identify the epidemics and parameters of interest of host, agent, and environment and to generate and test hypotheses in search of causal pathways. Almost all diseases have a specific distribution in relation to time, place, and person and specific "causes" with high effect sizes. Epidemiology then uses such information to develop interventions and test (through clinical trials and natural experiments) their efficacy and effectiveness. Epidemiology is dependent on new technologies to evaluate improved measurements of host (genomics), epigenetics, identification of agents (metabolomics, proteomics), new technology to evaluate both physical and social environment, and modern methods of data collection. Epidemiology does poorly in studying anything other than epidemics and collections of numerators and denominators without specific hypotheses even with improved statistical methodologies.

  14. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

    It is widely accepted that design features are one of the most attractive means of integrating most fields of engineering activity, such as design modelling, process planning or production scheduling. One of the most important tasks realized in the process of integrating design and planning functions is design translation, meant as the mapping of design data into data that matter from the process-planning point of view, i.e. manufacturing data. A design geometrical shape translation process can be realized with one of the following strategies: (i) design by feature (DBF), i.e. designing with a previously prepared design features library; (ii) interactive design features recognition (IFR); (iii) automatic design features recognition (AFR). In the DBF method the design geometrical shape is created with design features. There are two basic approaches to design modelling in the DBF method: the classic approach, in which a part design is modelled from beginning to end with design features previously stored in a design features database, and the hybrid approach, in which the part is partially created with standard predefined CAD system tools and the rest with suitable design features. Automatic feature recognition consists in an autonomous search of a product model, represented with a specific design representation method, in order to find those model features which might potentially be recognized as design features, manufacturing features, etc. This approach requires a searching algorithm to be prepared; the algorithm should allow the whole recognition process to be carried out without user supervision. Currently there are many AFR methods. These methods most often require the product model to be represented with a B-Rep representation, rarely CSG, and very rarely wireframe. In the IFR method potential features are recognized by a user. This process is most often realized by a user who points out those surfaces which seem to belong to a

  15. A flexible layout design method for passive micromixers.

    PubMed

    Deng, Yongbo; Liu, Zhenyu; Zhang, Ping; Liu, Yongshun; Gao, Qingyong; Wu, Yihui

    2012-10-01

    This paper discusses a flexible layout design method for passive micromixers based on the topology optimization of fluidic flows. Unlike the trial-and-error method, this method obtains the detailed layout of a passive micromixer according to the desired mixing performance by solving a topology optimization problem. The dependence on the experience of the designer is therefore weakened when this method is used to design a passive micromixer with acceptable mixing performance. Several design disciplines for passive micromixers are considered to demonstrate the flexibility of the layout design method. These design disciplines include the approximation of the real 3D micromixer, manufacturing feasibility, spatially periodic design, and the effects of the Péclet number and Reynolds number on the designs obtained by this layout design method. The capability of this design method is validated by several comparisons between the obtained layouts and the optimized designs in recently published literature, where the value of the mixing measurement is improved by up to 40.4% for one cycle of the micromixer.

  16. Comparison of Traditional Design Nonlinear Programming Optimization and Stochastic Methods for Structural Design

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

    Structural designs generated by the traditional method, the optimization method and the stochastic design concept are compared. In the traditional method, the constraints are manipulated to obtain the design and the weight is back-calculated. In design optimization, the weight of a structure becomes the merit function, constraints are imposed on failure modes, and an optimization algorithm is used to generate the solution. The stochastic design concept accounts for uncertainties in loads, material properties, and other parameters, and the solution is obtained by solving a design optimization problem for a specified reliability. Acceptable solutions were produced by all three methods. The variation in the weight calculated by the methods was modest. Some variation was noticed in the designs calculated by the methods; it may be attributed to structural indeterminacy. It is prudent to develop a design by all three methods prior to its fabrication. The traditional design method can be improved when simplified sensitivities of the behavior constraints are used. Such sensitivities can reduce design calculations and may have the potential to unify the traditional and optimization methods. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to the mean-valued design. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure. Weight can be reduced to a small value for a most failure-prone design. Probabilistic modeling of load and material properties remained a challenge.

  17. Preliminary design method for deployable spacecraft beams

    NASA Technical Reports Server (NTRS)

    Mikulas, Martin M., Jr.; Cassapakis, Costas

    1995-01-01

    There is currently considerable interest in low-cost, lightweight, compactly packageable deployable elements for various future missions involving small spacecraft. These elements must also have a simple and reliable deployment scheme and possess zero or very small free-play. Although most small spacecraft do not experience large disturbances, very low stiffness appendages or free-play can couple with even small disturbances and lead to unacceptably large attitude errors which may involve the introduction of a flexible-body control system. A class of structures referred to as 'rigidized structures' offers significant promise in providing deployable elements that will meet these needs for small spacecraft. The purpose of this paper is to introduce several rigidizable concepts and to develop a design methodology which permits a rational comparison of these elements to be made with alternate concepts.

  18. Design Methods and Optimization for Morphing Aircraft

    NASA Technical Reports Server (NTRS)

    Crossley, William A.

    2005-01-01

    This report provides a summary of accomplishments made during this research effort. The major accomplishments are in three areas. The first is the use of a multiobjective optimization strategy to help identify potential morphing features; it uses an existing aircraft sizing code to predict the weight, size and performance of several fixed-geometry aircraft that are Pareto-optimal based upon two competing aircraft performance objectives. The second area, titled "morphing as an independent variable," formulates the sizing of a morphing aircraft as an optimization problem in which the amounts of geometric morphing for various aircraft parameters are included as design variables. This second effort consumed most of the overall effort on the project. The third area involved a more detailed sizing study of a commercial transport aircraft that would incorporate a morphing wing to possibly enable transatlantic point-to-point passenger service.

  19. Method for designing and controlling compliant gripper

    NASA Astrophysics Data System (ADS)

    Spanu, A. R.; Besnea, D.; Avram, M.; Ciobanu, R.

    2016-08-01

    Compliant grippers are useful for high-accuracy grasping of small objects, with adaptive control of contact points along the active surfaces of the fingers. The spatial trajectories of the elements become a must due to the development of MEMS. The paper presents the solution for the compliant gripper designed by the authors, so both planar and spatial movements are discussed. At the beginning of the process, the gripper can work as a passive one, up to the moment when it has to reach the object surface. The forces provided by the elements have to avoid damaging the object. As part of the system, a camera takes a picture of the object in order to facilitate the positioning of the system. When contact is established, the mechanism acts as an active gripper by using an electrical stepper motor with controlled movement.

  20. Conceptual design of clean processes: Tools and methods

    SciTech Connect

    Hurme, M.

    1996-12-31

    Design tools available for implementing clean design in practice are discussed. The application areas, together with methods for comparing clean process alternatives, are presented. Environmental principles are becoming increasingly important in the whole life cycle of products, from design, manufacturing and marketing to disposal. The hindrance to implementing clean technology in design has been the necessity of applying it in all phases of design, starting from the beginning, since it deals with the major selections made in conceptual process design. Therefore both a modified design approach and new tools are needed for process design to make the application of clean technology practical. The first item, extended process design methodologies, has been presented by Hurme, Douglas, Rossiter and Klee, and Hilaly and Sikdar. The aim of this paper is to discuss the latter topic: the process design tools which assist in implementing clean principles in process design. 22 refs., 2 tabs.

  1. Analytical techniques for instrument design - matrix methods

    SciTech Connect

    Robinson, R.A.

    1997-09-01

    We take the traditional Cooper-Nathans approach, as has been applied for many years to steady-state triple-axis spectrometers, and consider its generalisation to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, we discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix: diagonalisation (Moller-Nielsen method), coordinate changes (e.g. from (Δk_I, Δk_F) to (ΔE, ΔQ and 2 dummy variables)), integration of one or more variables (e.g. over such dummy variables), integration subject to linear constraints (e.g. Bragg's law for analysers), inversion to give the variance-covariance matrix, and so on. We show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the Gaussian approximation. We argue that a generalised program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. We also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question.
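    Several of the toolbox operations listed above reduce to standard linear algebra on a Gaussian quadratic form exp(-x^T A x / 2): inversion gives the covariance, integrating out variables amounts to deleting rows and columns of the covariance, and a linear coordinate change transforms the covariance congruently. A sketch using an arbitrary positive-definite stand-in for the Cooper-Nathans matrix (not real instrument parameters):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # A toy 6x6 symmetric positive-definite matrix A, standing in for the
    # Cooper-Nathans resolution matrix of exp(-x^T A x / 2).
    B = rng.normal(size=(6, 6))
    A = B @ B.T + 6 * np.eye(6)

    # Inversion gives the variance-covariance matrix.
    C = np.linalg.inv(A)

    # Integrating out variables 4 and 5 (e.g. the dummy variables) is just
    # deleting the corresponding rows/columns of the *covariance* matrix.
    keep = [0, 1, 2, 3]
    C_marg = C[np.ix_(keep, keep)]
    A_marg = np.linalg.inv(C_marg)   # resolution matrix of the marginal

    # A linear coordinate change x -> T x transforms the covariance as T C T^T.
    T = rng.normal(size=(6, 6))
    C_new = T @ C @ T.T

    # Diagonalisation: an orthogonal eigenbasis decouples the quadratic form.
    w, V = np.linalg.eigh(A)
    ```

    Integration subject to a linear constraint (e.g. Bragg's law) can be handled in the same spirit by first changing coordinates so the constraint fixes one variable, then marginalizing the rest.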

  2. Statistical Reasoning and Methods in Epidemiology to Promote Individualized Health: In Celebration of the 100th Anniversary of the Johns Hopkins Bloomberg School of Public Health.

    PubMed

    Ogburn, Elizabeth L; Zeger, Scott L

    2016-03-01

    Epidemiology is concerned with determining the distribution and causes of disease. Throughout its history, epidemiology has drawn upon statistical ideas and methods to achieve its aims. Because of the exponential growth in our capacity to measure and analyze data on the underlying processes that define each person's state of health, there is an emerging opportunity for population-based epidemiologic studies to influence health decisions made by individuals in ways that take into account the individuals' characteristics, circumstances, and preferences. We refer to this endeavor as "individualized health." The present article comprises 2 sections. In the first, we describe how graphical, longitudinal, and hierarchical models can inform the project of individualized health. We propose a simple graphical model for informing individual health decisions using population-based data. In the second, we review selected topics in causal inference that we believe to be particularly useful for individualized health. Epidemiology and biostatistics were 2 of the 4 founding departments in the world's first graduate school of public health at Johns Hopkins University, the centennial of which we honor. This survey of a small part of the literature is intended to demonstrate that the 2 fields remain just as inextricably linked today as they were 100 years ago.

  3. HEALTHY study rationale, design and methods

    PubMed Central

    2009-01-01

    The HEALTHY primary prevention trial was designed and implemented in response to the growing numbers of children and adolescents being diagnosed with type 2 diabetes. The objective was to moderate risk factors for type 2 diabetes. Modifiable risk factors measured were indicators of adiposity and glycemic dysregulation: body mass index ≥85th percentile, fasting glucose ≥5.55 mmol l-1 (100 mg per 100 ml) and fasting insulin ≥180 pmol l-1 (30 μU ml-1). A series of pilot studies established the feasibility of performing data collection procedures and tested the development of an intervention consisting of four integrated components: (1) changes in the quantity and nutritional quality of food and beverage offerings throughout the total school food environment; (2) physical education class lesson plans and accompanying equipment to increase both participation and number of minutes spent in moderate-to-vigorous physical activity; (3) brief classroom activities and family outreach vehicles to increase knowledge, enhance decision-making skills and support and reinforce youth in accomplishing goals; and (4) communications and social marketing strategies to enhance and promote changes through messages, images, events and activities. Expert study staff provided training, assistance, materials and guidance for school faculty and staff to implement the intervention components. A cohort of students were enrolled in sixth grade and followed to end of eighth grade. They attended a health screening data collection at baseline and end of study that involved measurement of height, weight, blood pressure, waist circumference and a fasting blood draw. Height and weight were also collected at the end of the seventh grade. The study was conducted in 42 middle schools, six at each of seven locations across the country, with 21 schools randomized to receive the intervention and 21 to act as controls (data collection activities only). Middle school was the unit of sample size and

  4. Association Between Cannabis and Psychosis: Epidemiologic Evidence.

    PubMed

    Gage, Suzanne H; Hickman, Matthew; Zammit, Stanley

    2016-04-01

    Associations between cannabis use and psychotic outcomes are consistently reported, but establishing causality from observational designs can be problematic. We review the evidence from longitudinal studies that have examined this relationship and discuss the epidemiologic evidence for and against interpreting the findings as causal. We also review the evidence identifying groups at particularly high risk of developing psychosis from using cannabis. Overall, evidence from epidemiologic studies provides strong enough evidence to warrant a public health message that cannabis use can increase the risk of psychotic disorders. However, further studies are required to determine the magnitude of this effect, to determine the effect of different strains of cannabis on risk, and to identify high-risk groups particularly susceptible to the effects of cannabis on psychosis. We also discuss complementary epidemiologic methods that can help address these questions.

  5. Methods and Descriptive Epidemiology of Services Provided by Athletic Trainers in High Schools: The National Athletic Treatment, Injury and Outcomes Network Study

    PubMed Central

    Kerr, Zachary Y.; Dompier, Thomas P.; Dalton, Sara L.; Miller, Sayers John; Hayden, Ross; Marshall, Stephen W.

    2015-01-01

    Context Research is limited on the extent and nature of the care provided by athletic trainers (ATs) to student-athletes in the high school setting. Objective To describe the methods of the National Athletic Treatment, Injury and Outcomes Network (NATION) project and provide the descriptive epidemiology of AT services for injury care in 27 high school sports. Design Descriptive epidemiology study. Setting Athletic training room (ATR) visits and AT services data collected in 147 high schools from 26 states. Patients or Other Participants High school student-athletes participating in 13 boys' sports and 14 girls' sports during the 2011−2012 through 2013−2014 academic years. Main Outcome Measure(s) The number of ATR visits and individual AT services, as well as the mean number of ATR visits (per injury) and AT services (per injury and ATR visit) were calculated by sport and for time-loss (TL) and non–time-loss (NTL) injuries. Results Over the 3-year period, 210 773 ATR visits and 557 381 AT services were reported for 50 604 injuries. Most ATR visits (70%) were for NTL injuries. Common AT services were therapeutic activities or exercise (45.4%), modalities (18.6%), and AT evaluation and reevaluation (15.9%), with an average of 4.17 ± 6.52 ATR visits and 11.01 ± 22.86 AT services per injury. Compared with NTL injuries, patients with TL injuries accrued more ATR visits (7.76 versus 3.47; P < .001) and AT services (18.60 versus 9.56; P < .001) per injury. An average of 2.24 ± 1.33 AT services were reported per ATR visit. Compared with TL injuries, NTL injuries had a larger average number of AT services per ATR visit (2.28 versus 2.05; P < .001). Conclusions These findings highlight the broad spectrum of care provided by ATs to high school student-athletes and demonstrate that patients with NTL injuries require substantial amounts of AT services. PMID:26678290

  6. System and method of designing models in a feedback loop

    DOEpatents

    Gosink, Luke C.; Pulsipher, Trenton C.; Sego, Landon H.

    2017-02-14

    A method and system for designing models is disclosed. The method includes selecting a plurality of models for modeling a common event of interest. The method further includes aggregating the results of the models and analyzing each model compared to the aggregate result to obtain comparative information. The method also includes providing the information back to the plurality of models to design more accurate models through a feedback loop.
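    The aggregate-compare-feedback loop described in this abstract might be sketched as below. The model set, the mean aggregation, and the inverse-deviation reweighting rule are all assumptions made for illustration, not the patented method:

    ```python
    import numpy as np

    def feedback_round(models, x, weights):
        """One pass of the aggregate-compare-feedback loop (illustrative)."""
        preds = np.array([m(x) for m in models])          # each model's prediction
        agg = np.average(preds, axis=0, weights=weights)  # aggregate result
        # Comparative information: each model's deviation from the aggregate.
        dev = np.mean((preds - agg) ** 2, axis=1)
        # Feed back: down-weight models that disagree most with the consensus.
        new_w = 1.0 / (dev + 1e-12)
        return agg, new_w / new_w.sum()

    # Three hypothetical models of a common event of interest.
    x = np.linspace(0, 1, 50)
    models = [lambda x: 2 * x + 1,      # two models close to each other
              lambda x: 2 * x + 1.1,
              lambda x: 5 * x]          # one outlier model
    w = np.ones(3) / 3
    for _ in range(5):
        agg, w = feedback_round(models, x, w)
    ```

    After a few rounds the outlier model's weight collapses, so the aggregate is dominated by the mutually consistent models, which is one plausible way "comparative information" could drive more accurate modeling.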

  7. Assessment of methods and results of reproductive occupational epidemiology: spontaneous abortions and malformations in the offspring of working women

    SciTech Connect

    Hemminki, K.; Axelson, O.; Niemi, M.L.; Ahlborg, G.

    1983-01-01

    Epidemiological studies relating occupational exposures of working women to spontaneous abortions and malformations are reviewed and some methodological considerations are presented. Reproductive epidemiology is less developed than epidemiology in general and seems to involve some specific problems. Exposures may be reported differently by the women depending on the outcome of the pregnancy; thus confirmation of exposure from an independent data source would be an asset. The types of occupational exposures of the women suggested to carry a risk of spontaneous abortion include anesthetic agents, laboratory work, copper smelting, soldering, and chemical sterilization using ethylene oxide and glutaraldehyde. Maternal employment in laboratories and exposure to solvents have been linked to a risk of congenital malformations in the offspring in five studies. Data on the teratogenic effects of anesthetic gases have been conflicting. In one study, employment in copper smelting was associated with malformations in the offspring.

  8. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the...

  9. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the...

  10. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the...

  11. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the...

  12. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Designation of noise description methods... TRANSPORTATION (CONTINUED) AIRPORTS NOTICE AND APPROVAL OF AIRPORT NOISE AND ACCESS RESTRICTIONS General Provisions § 161.9 Designation of noise description methods. For purposes of this part, the...

  13. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  14. [The role of epidemiology in mental disorder research].

    PubMed

    Borges, Guilherme; Medina-Mora, María Elena; López-Moreno, Sergio

    2004-01-01

    which are germane to public health, for example, violence. The epidemiology of mental disorders faces great challenges in the new millennium, including a complex, changing epidemiologic scenario. Several important issues will influence the future development of mental disorder epidemiology: measurement of mental disorders and risk factors, more efficient sampling design and methods, the relationships among biological research, genetics, social studies, and epidemiology, and the interface between epidemiology and the evaluation of therapies and health services.

  15. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
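    The contrast between Monte Carlo simulation and the first-order method of moments can be illustrated with a stand-in analysis function; the quadratic weight model and all numbers below are hypothetical, not the aircraft sizing code's:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical smooth mapping from two design variables to aircraft weight.
    def weight(x1, x2):
        return 1000 + 40 * x1 + 25 * x2 + 3 * x1 * x2

    mu = np.array([10.0, 8.0])       # nominal design variables
    sigma = np.array([0.2, 0.15])    # input uncertainties (standard deviations)

    # Monte Carlo propagation: sample inputs, push them through the analysis.
    samples = rng.normal(mu, sigma, size=(100_000, 2))
    w_mc = weight(samples[:, 0], samples[:, 1])

    # First-order method of moments: linearize about the mean (finite differences).
    h = 1e-5
    g = np.array([
        (weight(mu[0] + h, mu[1]) - weight(mu[0] - h, mu[1])) / (2 * h),
        (weight(mu[0], mu[1] + h) - weight(mu[0], mu[1] - h)) / (2 * h),
    ])
    w_mom_mean = weight(*mu)
    w_mom_std = np.sqrt(np.sum((g * sigma) ** 2))
    ```

    For small input uncertainties and a smooth response, the two estimates of the output mean and standard deviation agree closely; it is precisely where the design space is discontinuous, as in the paper's examples, that the gradient-based method of moments becomes unreliable and sampling-based methods earn their keep.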

  16. Expanding color design methods for architecture and allied disciplines

    NASA Astrophysics Data System (ADS)

    Linton, Harold E.

    2002-06-01

    The color design processes of visual artists, architects, designers, and theoreticians included in this presentation reflect the practical role of color in architecture. What the color design professional brings to the architectural design team is an expertise and rich sensibility made up of a broad awareness and a finely tuned visual perception. This includes a knowledge of design and its history, expertise with industrial color materials and their methods of application, an awareness of design context and cultural identity, a background in physiology and psychology as it relates to human welfare, and an ability to problem-solve and respond creatively to design concepts with innovative ideas. The broadening of the definition of the colorists's role in architectural design provides architects, artists and designers with significant opportunities for continued professional and educational development.

  17. A simple inverse design method for pump turbine

    NASA Astrophysics Data System (ADS)

    Yin, Junlian; Li, Jingjing; Wang, Dezhong; Wei, Xianzhu

    2014-03-01

    In this paper, a simple inverse design method is proposed for pump turbines. The main point of this method is that the blade loading distribution is first extracted from an existing model and then applied in the new design. As an example, the blade loading distribution of a runner designed for a 200 m head was analyzed. Then, the combination of the extracted blade loading and a meridional passage suitable for a 500 m head was applied to design a new runner project. CFD analysis and a model test show that the new runner performs very well in terms of efficiency and cavitation. Therefore, as an alternative, the inverse design method can be extended to other design applications.

  18. Design methods for fault-tolerant finite state machines

    NASA Technical Reports Server (NTRS)

    Niranjan, Shailesh; Frenzel, James F.

    1993-01-01

    VLSI electronic circuits are increasingly being used in space-borne applications where high levels of radiation may induce faults, known as single event upsets. In this paper we review the classical methods of designing fault tolerant digital systems, with an emphasis on those methods which are particularly suitable for VLSI-implementation of finite state machines. Four methods are presented and will be compared in terms of design complexity, circuit size, and estimated circuit delay.
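    One of the classical fault-tolerance methods such a review covers is triple modular redundancy (TMR) with majority voting, which masks a single event upset in any one copy of the state register. A behavioral sketch (the toy FSM and helper names are hypothetical, chosen only to show the voting mechanism):

    ```python
    # Triple modular redundancy on a finite state machine: three copies of the
    # state advance in lockstep, and a bitwise majority vote masks a single
    # event upset (SEU) that corrupts one copy.

    TRANS = {0: 1, 1: 2, 2: 3, 3: 0}     # toy 2-bit counter FSM

    def vote(a, b, c):
        """Bitwise majority vote over three state copies."""
        return (a & b) | (a & c) | (b & c)

    def step(states, upset=None):
        """Advance the three redundant copies; 'upset' flips a bit in one copy."""
        states = [TRANS[s] for s in states]
        if upset is not None:
            copy, bit = upset
            states[copy] ^= 1 << bit     # inject an SEU into one copy
        s = vote(*states)
        return [s, s, s]                 # resynchronize all copies to the vote

    states = [0, 0, 0]
    states = step(states, upset=(1, 0))  # SEU hits bit 0 of copy 1
    # The vote over [1, 0, 1] recovers 1, so all copies resume the correct state.
    ```

    The cost profile matches the paper's comparison axes: roughly triple the circuit size plus the voter, modest added delay, but a very simple design process, which is why TMR is often the baseline against which coded-state methods are judged.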

  19. Prevalence and Epidemiologic Characteristics of FASD From Various Research Methods with an Emphasis on Recent In-School Studies

    ERIC Educational Resources Information Center

    May, Philip A.; Gossage, J. Phillip; Kalberg, Wendy O.; Robinson, Luther K.; Buckley, David; Manning, Melanie; Hoyme, H. Eugene

    2009-01-01

    Researching the epidemiology and estimating the prevalence of fetal alcohol syndrome (FAS) and other fetal alcohol spectrum disorders (FASD) for mainstream populations anywhere in the world has presented a challenge to researchers. Three major approaches have been used in the past: surveillance and record review systems, clinic-based studies, and…

  20. Evaluation of method for secondary DNA typing of Mycobacterium tuberculosis with pTBN12 in epidemiologic study of tuberculosis.

    PubMed

    Yang, Z; Chaves, F; Barnes, P F; Burman, W J; Koehler, J; Eisenach, K D; Bates, J H; Cave, M D

    1996-12-01

    Secondary fingerprinting of Mycobacterium tuberculosis DNA with a probe containing the polymorphic GC-rich repetitive sequence present in pTBN12 has been found to have greater discriminating power than fingerprinting with the insertion sequence IS6110 for strains carrying few copies of IS6110. To validate the use of pTBN12 fingerprinting in the molecular epidemiology of tuberculosis, M. tuberculosis isolates from 67 patients in five states in the United States and in Spain were fingerprinted with both IS6110 and pTBN12. Epidemiologic links among the 67 patients were evaluated by patient interview and/or review of medical records. The 67 isolates had 5 IS6110 fingerprint patterns with two to five copies of IS6110 and 18 pTBN12 patterns, of which 10 were shared by more than 1 isolate. Epidemiologic links were consistently found among patients whose isolates had identical pTBN12 patterns, whereas no links were found among patients whose isolates had unique pTBN12 patterns. This suggests that pTBN12 fingerprinting is a useful tool for identifying epidemiologically linked tuberculosis patients whose isolates have identical IS6110 fingerprints containing fewer than six fragments.
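
    The matching step described above can be sketched as a grouping of isolates by their combined fingerprint patterns; isolates sharing both patterns form candidate epidemiologic clusters. The isolate names and pattern labels below are hypothetical, not data from the study.

```python
from collections import defaultdict

def cluster_isolates(isolates):
    """Group isolates by their combined (IS6110, pTBN12) fingerprint
    patterns and keep only groups shared by more than one isolate,
    i.e. candidate epidemiologic clusters."""
    clusters = defaultdict(list)
    for name, is6110, ptbn12 in isolates:
        clusters[(is6110, ptbn12)].append(name)
    return {k: v for k, v in clusters.items() if len(v) > 1}

# Hypothetical isolates: TB01 and TB02 share both patterns; TB03 shares
# only the low-copy IS6110 pattern and is separated by pTBN12.
links = cluster_isolates([
    ("TB01", "A", "p1"),
    ("TB02", "A", "p1"),
    ("TB03", "A", "p2"),
])
```

    This mirrors the study's logic that identical secondary pTBN12 patterns, on top of a shared low-copy IS6110 pattern, are what suggest an epidemiologic link.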

  1. Triparental Families: A New Genetic-Epidemiological Design Applied to Drug Abuse, Alcohol Use Disorders, and Criminal Behavior in a Swedish National Sample

    PubMed Central

    Kendler, Kenneth S.; Ohlsson, Henrik; Sundquist, Jan; Sundquist, Kristina

    2015-01-01

    Objective The authors sought to clarify the sources of parent-offspring resemblance for drug abuse, alcohol use disorders, and criminal behavior, using a novel genetic-epidemiological design. Method Using national registries, the authors identified rates of drug abuse, alcohol use disorders, and criminal behavior in 41,360 Swedish individuals born between 1960 and 1990 and raised in triparental families comprising a biological mother who reared them, a “not-lived-with” biological father, and a stepfather. Results When each syndrome was examined individually, hazard rates for drug abuse in offspring of parents with drug abuse were highest for mothers (2.80, 95% CI=2.23–3.38), intermediate for not-lived-with fathers (2.45, 95% CI=2.14–2.79), and lowest for stepfathers (1.99, 95% CI=1.55–2.56). The same pattern was seen for alcohol use disorders (2.23, 95% CI=1.93–2.58; 1.84, 95% CI=1.69–2.00; and 1.27, 95% CI=1.12–1.43) and criminal behavior (1.55, 95% CI=1.44–1.66; 1.46, 95% CI=1.40–1.52; and 1.30, 95% CI=1.23–1.37). When all three syndromes were examined together, specificity of cross-generational transmission was highest for mothers, intermediate for not-lived-with fathers, and lowest for stepfathers. Analyses of intact families and other not-lived-with parents and stepparents showed similar cross-generation transmission for these syndromes in mothers and fathers, supporting the representativeness of results from triparental families. Conclusions A major strength of the triparental design is its inclusion, within a single family, of parents who provide, to a first approximation, their offspring with genes plus rearing, genes only, and rearing only. For drug abuse, alcohol use disorders, and criminal behavior, the results of this study suggest that parent-offspring transmission involves both genetic and environmental processes, with genetic factors being somewhat more important. These results should be interpreted in the context of the strengths

  2. System Design Support by Optimization Method Using Stochastic Process

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroaki; Yamaguchi, Katsuhito; Ishikawa, Yoshio

    We propose a new optimization method based on a stochastic process. Its characteristic feature is that it obtains an approximation of the optimum solution as an expected value. In the numerical calculation, a kind of Monte Carlo method is used to obtain the solution, since the method relies on a stochastic process. The method can also obtain the probability distribution of the design variables, because they are generated with probability proportional to the evaluation function value. This probability distribution shows the influence of the design variables on the evaluation function value, and it is therefore very useful information for system design. In this paper, we show that the proposed method is useful not only for optimization but also for system design. A flight trajectory optimization problem for a hang-glider is given as a numerical example.
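
    The idea of generating design variables with probability proportional to a function of the evaluation value, then reading off the optimum as an expected value, can be sketched as follows. The objective, sample size, and weighting factor are illustrative assumptions, not values from the article.

```python
import math
import random

def objective(x):
    # Hypothetical evaluation function with its optimum at x = 2
    # (a stand-in for the trajectory cost in the abstract).
    return (x - 2.0) ** 2

def stochastic_design_estimate(n_samples=20000, lo=-5.0, hi=5.0, beta=4.0):
    """Monte Carlo sketch: draw design variables uniformly, weight each
    draw in proportion to a decreasing function of the evaluation value,
    and return the weighted mean (the optimum as an expected value) plus
    the weighted variance (the variable's influence on the value)."""
    random.seed(0)
    xs = [random.uniform(lo, hi) for _ in range(n_samples)]
    ws = [math.exp(-beta * objective(x)) for x in xs]
    total = sum(ws)
    mean = sum(w * x for w, x in zip(ws, xs)) / total
    var = sum(w * (x - mean) ** 2 for w, x in zip(ws, xs)) / total
    return mean, var

mean, var = stochastic_design_estimate()
# mean approximates the optimum; var indicates how sharply the
# evaluation function constrains the design variable
```

    The spread of the weighted samples is exactly the "probability distribution of the design variable" that the abstract highlights as useful design information.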

  3. Inviscid transonic wing design using inverse methods in curvilinear coordinates

    NASA Technical Reports Server (NTRS)

    Gally, Thomas A.; Carlson, Leland A.

    1987-01-01

    An inverse wing design method has been developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  4. Tabu search method with random moves for globally optimal design

    NASA Astrophysics Data System (ADS)

    Hu, Nanfang

    1992-09-01

    Optimum engineering design problems are usually formulated as non-convex optimization problems in continuous variables. Because of the absence of convexity structure, they can have multiple minima, and global optimization becomes difficult. Traditional optimization methods, such as penalty methods, can often be trapped at a local optimum. The tabu search method with random moves is introduced to solve these problems approximately. Its reliability and efficiency are examined with the help of standard test functions. Analysis of the implementations shows that the method is easy to use and requires no derivative information. It outperforms the random search method and a composite genetic algorithm. In particular, it is applied to minimum-weight design examples of a three-bar truss, coil springs, a Z-section, and a channel section. For the channel section, the optimal design obtained with the tabu search method with random moves weighed 26.14 percent less than the design obtained with the SUMT method.
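
    A minimal sketch of tabu search with random moves on a standard multimodal test function is given below. The test function, neighborhood size, tabu radius, and gains are illustrative assumptions; the article's structural examples involve constrained weight minimization, which this sketch does not reproduce.

```python
import math
import random

def f(x):
    # 1-D Rastrigin function: many local minima, global minimum f(0) = 0
    # (a hypothetical stand-in for the non-convex design objectives).
    return x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0

def tabu_search(iters=500, step=1.2, n_cand=20, tabu_len=10, seed=1):
    """Tabu search with random moves: candidate neighbors are drawn at
    random (no derivatives needed), recently visited points are forbidden
    (tabu), and any move improving the best-so-far is allowed (aspiration)."""
    random.seed(seed)
    x = random.uniform(-5.0, 5.0)
    best_x, best_f = x, f(x)
    tabu = []
    for _ in range(iters):
        cands = [x + random.uniform(-step, step) for _ in range(n_cand)]
        def allowed(c):
            return all(abs(c - t) > 1e-2 for t in tabu) or f(c) < best_f
        cands = [c for c in cands if allowed(c)] or [x]
        x = min(cands, key=f)          # greedy move among allowed candidates
        tabu.append(x)
        if len(tabu) > tabu_len:
            tabu.pop(0)                # fixed-length tabu memory
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x, best_f

best_x, best_f = tabu_search()
```

    The tabu list is what distinguishes this from plain random search: by forbidding recently visited points, the search is pushed out of basins it has already explored.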

  5. An inverse method with regularity condition for transonic airfoil design

    NASA Technical Reports Server (NTRS)

    Zhu, Ziqiang; Xia, Zhixun; Wu, Liyi

    1991-01-01

    It is known from Lighthill's exact solution of the incompressible inverse problem that in the inverse design problem, the surface pressure distribution and the free stream speed cannot both be prescribed independently. This implies the existence of a constraint on the prescribed pressure distribution. The same constraint exists at compressible speeds. Presented here is an inverse design method for transonic airfoils. In this method, the target pressure distribution contains a free parameter that is adjusted during the computation to satisfy the regularity condition. Some design results are presented in order to demonstrate the capabilities of the method.

  6. Robust Multivariable Controller Design via Implicit Model-Following Methods.

    DTIC Science & Technology

    1983-12-01

    ROBUST MULTIVARIABLE CONTROLLER DESIGN VIA IMPLICIT MODEL-FOLLOWING METHODS. Thesis, AFIT/GE/EE/83D-48, William G. Miller, Capt, USAF. Air Force Institute of Technology, Wright-Patterson AFB, OH, School of Engineering. Approved for public release; distribution unlimited.

  7. A national cross-sectional study among drug-users in France: epidemiology of HCV and highlight on practical and statistical aspects of the design

    PubMed Central

    2009-01-01

    Background Epidemiology of HCV infection among drug users (DUs) has been widely studied. Prevalence and sociobehavioural data among DUs are therefore available in most countries, but no study has taken into account in the sampling weights one important aspect of the way of life of DUs, namely that they can use one or more specialized services during the study period. In 2004–2005, we conducted a national seroepidemiologic survey of DUs, based on a random sampling design using the Generalised Weight Share Method (GWSM) and on blood testing. Methods A cross-sectional multicenter survey was done among DUs having injected or snorted drugs at least once in their life. We conducted a two-stage random survey of DUs selected to represent the diversity of drug use. The fact that DUs can use more than one structure during the study period has an impact on their inclusion probabilities. To calculate a correct sampling weight, we used the GWSM. A sociobehavioral questionnaire was administered by interviewers. Selected DUs were asked to self-collect a fingerprick blood sample on blotting paper. Results Of all DUs selected, 1462 (75%) accepted to participate. HCV seroprevalence was 59.8% [95% CI: 50.7–68.3]. Of DUs under 30 years, 28% were HCV seropositive. Of HCV-infected DUs, 27% were unaware of their status. In the month prior to interview, 13% of DUs had shared a syringe, 38% other injection paraphernalia, and 81% a crack pipe. In multivariate analysis, factors independently associated with HCV seropositivity were age over 30, HIV seropositivity, having ever injected drugs, opiate substitution treatment (OST), crack use, and precarious housing. Conclusion This is the first time that blood testing combined with the GWSM has been applied to a DU population, which improves the estimate of HCV prevalence. HCV seroprevalence is high, even among the youngest DUs, and a large proportion of DUs are not aware of their status. Our multivariate analysis identifies risk factors such as crack
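
    The weighting problem the survey addresses — a respondent reachable through several services would be over-represented under naive weighting — can be illustrated with a simplified equal-share variant of the GWSM idea. The full method allocates shares over all services linked to a respondent, selected or not; the service weights and link lists below are purely illustrative.

```python
def gwsm_weights(design_weights, links):
    """design_weights: design weight of each selected service (dict).
    links: for each respondent, the indices of the services that the
    respondent used during the study period.
    Returns one sampling weight per respondent, sharing each linked
    service's weight equally across the respondent's links."""
    weights = []
    for used in links:
        shared = sum(design_weights[s] for s in used) / len(used)
        weights.append(shared)
    return weights

# Respondent 0 used one service; respondent 1 used three, so the linked
# services' weights are shared rather than counted at full value:
w = gwsm_weights({0: 8.0, 1: 8.0, 2: 4.0}, [[0], [0, 1, 2]])
```

    Without the sharing step, respondent 1 would effectively be counted once per service used, biasing prevalence estimates toward frequent service attenders.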

  8. Polygenic Epidemiology

    PubMed Central

    2016-01-01

    ABSTRACT Much of the genetic basis of complex traits is present on current genotyping products, but the individual variants that affect the traits have largely not been identified. Several traditional problems in genetic epidemiology have recently been addressed by assuming a polygenic basis for disease and treating it as a single entity. Here I briefly review some of these applications, which collectively may be termed polygenic epidemiology. Methodologies in this area include polygenic scoring, linear mixed models, and linkage disequilibrium scoring. They have been used to establish a polygenic effect, estimate genetic correlation between traits, estimate how many variants affect a trait, stratify cases into subphenotypes, predict individual disease risks, and infer causal effects using Mendelian randomization. Polygenic epidemiology will continue to yield useful applications even while much of the specific variation underlying complex traits remains undiscovered. PMID:27061411
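
    Polygenic scoring, the first methodology listed, reduces to a weighted sum of risk-allele counts. The variants, allele counts, and effect sizes below are purely illustrative assumptions, not values from any study.

```python
def polygenic_score(genotypes, effect_sizes):
    """Weighted sum of risk-allele counts: genotypes are 0/1/2 allele
    counts per variant; effect sizes are per-variant weights, e.g.
    GWAS log odds ratios (numbers below are purely illustrative)."""
    return sum(g * b for g, b in zip(genotypes, effect_sizes))

betas = [0.12, -0.05, 0.30]   # hypothetical per-variant effects
person = [2, 1, 0]            # allele counts at the same three variants
score = polygenic_score(person, betas)
```

    Treating this single score as the exposure is what lets the applications in the abstract — establishing a polygenic effect, risk prediction, Mendelian randomization — proceed without identifying the individual causal variants.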

  9. Design Method for EPS Control System Based on KANSEI Structure

    NASA Astrophysics Data System (ADS)

    Saitoh, Yumi; Itoh, Hideaki; Ozaki, Fuminori; Nakamura, Takenobu; Kawaji, Shigeyasu

    Recently, it has been recognized that KANSEI engineering plays an important role in functional design development for realizing highly sophisticated products. In practice, however, products are designed and optimized by trial and error, so the result depends on the skill of experts. In this paper, we focus on an automobile electric power steering (EPS) system for which a functional design is required. First, the KANSEI structure is determined on the basis of the steering feel of an experienced driver, and an EPS control design based on this KANSEI structure is proposed. Then, the EPS control parameters are adjusted in accordance with the KANSEI index. Finally, by assessing the experimental results obtained from the driver, the effectiveness of the proposed design method is verified.

  10. Stabilizing State-Feedback Design via the Moving Horizon Method.

    DTIC Science & Technology

    1982-01-01

    Keywords: stabilizing control design; linear time-varying systems; fixed-depth horizon; index optimization methods; dual system. Abstract: A stabilizing control design for general linear time-varying systems through the moving horizon method. Approved for public release; distribution unlimited.

  11. An artificial viscosity method for the design of supercritical airfoils

    NASA Technical Reports Server (NTRS)

    Mcfadden, G. B.

    1979-01-01

    A numerical technique is presented for the design of two-dimensional supercritical wing sections with low wave drag. The method is a design mode of the analysis code H which gives excellent agreement with experimental results and is widely used in the aircraft industry. Topics covered include the partial differential equations of transonic flow, the computational procedure and results; the design procedure; a convergence theorem; and description of the code.

  12. Single-Case Designs and Qualitative Methods: Applying a Mixed Methods Research Perspective

    ERIC Educational Resources Information Center

    Hitchcock, John H.; Nastasi, Bonnie K.; Summerville, Meredith

    2010-01-01

    The purpose of this conceptual paper is to describe a design that mixes single-case (sometimes referred to as single-subject) and qualitative methods, hereafter referred to as a single-case mixed methods design (SCD-MM). Minimal attention has been given to the topic of applying qualitative methods to SCD work in the literature. These two…

  13. New directions for Artificial Intelligence (AI) methods in optimum design

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1989-01-01

    Developments and applications of artificial intelligence (AI) methods in the design of structural systems are reviewed. Principal shortcomings of the current approaches are emphasized, and the need for some degree of formalism in the development environment for such design tools is underscored. Emphasis is placed on efforts to integrate algorithmic computations into expert systems.

  14. Two-Method Planned Missing Designs for Longitudinal Research

    ERIC Educational Resources Information Center

    Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.

    2014-01-01

    We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…

  15. Numerical methods for aerothermodynamic design of hypersonic space transport vehicles

    NASA Astrophysics Data System (ADS)

    Wanie, K. M.; Brenneis, A.; Eberle, A.; Heiss, S.

    1993-04-01

    The requirement that the design process of hypersonic vehicles predict the flow past entire configurations with wings, fins, flaps, and propulsion system represents one of the major challenges for aerothermodynamics. In this context, computational fluid dynamics has emerged as a powerful tool to support the experimental work. Several numerical methods developed at MBB to meet the needs of the design process are described. The governing equations and fundamental details of the solution methods are briefly reviewed. Results are given for both geometrically simple test cases and realistic hypersonic configurations. Since there is still a considerable lack of experience with hypersonic flow calculations, extensive testing and verification are essential. This verification is done by comparing results with experimental data and with other numerical methods. The results presented prove that the methods used are robust, flexible, and accurate enough to meet the strong demands of the design process.

  16. Investigating the Use of Design Methods by Capstone Design Students at Clemson University

    ERIC Educational Resources Information Center

    Miller, W. Stuart; Summers, Joshua D.

    2013-01-01

    The authors describe a preliminary study to understand the attitude of engineering students regarding the use of design methods in projects to identify the factors either affecting or influencing the use of these methods by novice engineers. A senior undergraduate capstone design course at Clemson University, consisting of approximately fifty…

  17. Design method for four-reflector type beam waveguide systems

    NASA Technical Reports Server (NTRS)

    Betsudan, S.; Katagi, T.; Urasaki, S.

    1986-01-01

    Discussed is a method for the design of four-reflector beam waveguide feed systems, comprising a conical horn and four focused reflectors, which are widely used as the primary reflector systems for communications satellite Earth station antennas. The design parameters for these systems are clarified, the relations between the parameters are derived from the beam-mode expansion, and the independent design parameters are specified. The characteristics of these systems, namely spillover loss, crosspolarization components, and frequency characteristics, and their relation to the design parameters, are also shown. It is further shown that the design parameters which determine the dimensions of the conical horn and the shape of the focused reflectors can be uniquely established once the design criterion for the system has been selected as either (1) minimizing the crosspolarization component while keeping the spillover loss within acceptable limits, or (2) minimizing the spillover loss while maintaining the crosspolarization components below an acceptable level, and the independent design parameters, such as the sizes of the focused reflectors and the distances between them, have been established according to mechanical restrictions. A sample design is also shown. In addition to clarifying the effects of each design parameter on the system and improving insight into these systems, the design method also increases design efficiency.

  18. New knowledge network evaluation method for design rationale management

    NASA Astrophysics Data System (ADS)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

    Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is a measure for DR knowledge. In this paper, a new knowledge-network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives: design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. A comprehensive value of DR knowledge is also computed by the proposed method. To validate the method, different styles of DR knowledge networks and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that can provide objective metrics and a selection basis for DR knowledge reuse during the product design process. In addition, the method is shown to provide more effective guidance and support for the application and management of DR knowledge.

  19. Approximate method of designing a two-element airfoil

    NASA Astrophysics Data System (ADS)

    Abzalilov, D. F.; Mardanov, R. F.

    2011-09-01

    An approximate method is proposed for designing a two-element airfoil. The method is based on reducing an inverse boundary-value problem in a doubly connected domain to a problem in a singly connected domain located on a multisheet Riemann surface. The essence of the method is replacement of channels between the airfoil elements by channels of flow suction and blowing. The shape of these channels asymptotically tends to the annular shape of channels passing to infinity on the second sheet of the Riemann surface. The proposed method can be extended to designing multielement airfoils.

  20. Molecular Epidemiology of Breast Cancer: Development and Validation of Acetylation Methods for Carcinogen-DNA Adduct Detection

    DTIC Science & Technology

    2001-10-01

    epidemiological studies, and determine adduct levels in relation to metabolizing gene polymorphisms. The originally proposed assay is novel because one uses a...carcinogenic mechanisms. Currently, many ongoing breast cancer studies are exploring risks related to genetic polymorphisms in these genes. Yet these...the surrogate tissue). Finally, in these subjects, we will perform assays for genetic polymorphisms, to assess the association of "at risk" genetic

  1. Molecular epidemiology of human hepatitis A virus defined by an antigen-capture polymerase chain reaction method.

    PubMed Central

    Jansen, R W; Siegl, G; Lemon, S M

    1990-01-01

    We describe an immunoaffinity-linked nucleic acid amplification system (antigen-capture/polymerase chain reaction, or AC/PCR) for detection of viruses in clinical specimens and its application to the study of the molecular epidemiology of a picornavirus, hepatitis A virus (HAV). Immunoaffinity capture of virus, synthesis of viral cDNA, and amplification of cDNA by a polymerase chain reaction (PCR) were carried out sequentially in a single reaction vessel. This approach simplified sample preparation and enhanced the specificity of conventional PCR. AC/PCR detected less than one cell culture infectious unit of virus in 80 microliters of sample. Sequencing of AC/PCR reaction products from 34 virus strains demonstrated remarkable conservation at the nucleotide level among most strains but revealed hitherto unsuspected genetic diversity among human isolates. Epidemiologically related strains were identical or closely related in sequence. Virus strains recovered from epidemics of hepatitis A in the United States and Germany were identical in sequence, providing evidence for a previously unrecognized epidemiologic link between these outbreaks. PMID:2158093

  2. A comparison of methods for DPLL loop filter design

    NASA Technical Reports Server (NTRS)

    Aguirre, S.; Hurd, W. J.; Kumar, R.; Statman, J.

    1986-01-01

    Four design methodologies for loop filters for a class of digital phase-locked loops (DPLLs) are presented. The first design maps an optimum analog filter into the digital domain; the second designs a filter that minimizes in discrete time a weighted combination of the variance of the phase error due to noise and the sum square of the deterministic phase-error component; the third uses Kalman filter estimation theory to design a filter composed of a least-squares fading-memory estimator and a predictor. The last design relies on classical theory, including rules for the design of compensators. Linear analysis is used throughout the article to compare the different designs, covering stability, steady-state performance, and transient behavior of the loops. The design methodology is not critical when the loop update rate can be made high relative to the loop bandwidth, as the performance then approaches that of continuous time. For low update rates, however, the minimization method is significantly superior to the other methods.
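
    The basic discrete-time loop structure that all four methodologies design a filter for can be sketched as a proportional-plus-integral loop filter driving a numerically controlled oscillator (NCO). The gains below are illustrative assumptions, not values from the article.

```python
def dpll_track(phase_inputs, kp=0.3, ki=0.05):
    """Discrete-time phase tracking with a proportional-plus-integral
    loop filter, the digital analogue of a second-order analog loop.
    The gains kp and ki are illustrative, not values from the article."""
    est = 0.0       # NCO phase estimate
    integ = 0.0     # integrator state of the loop filter
    errors = []
    for theta in phase_inputs:
        err = theta - est          # phase detector output
        integ += ki * err          # integral path accumulates the error
        est += kp * err + integ    # filter output drives the NCO
        errors.append(err)
    return errors

errors = dpll_track([1.0] * 200)   # constant phase step at the input
# the integral path drives the steady-state error toward zero
```

    With these gains the loop is stable and the error on a constant phase input decays to zero; the design methods in the article differ in how such gains are chosen (analog mapping, discrete-time minimization, Kalman estimation, or classical compensation).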

  3. Novel parameter-based flexure bearing design method

    NASA Astrophysics Data System (ADS)

    Amoedo, Simon; Thebaud, Edouard; Gschwendtner, Michael; White, David

    2016-06-01

    A parameter study was carried out on the design variables of a flexure bearing to be used in a Stirling engine with a fixed axial displacement and a fixed outer diameter. A design method was developed in order to assist identification of the optimum bearing configuration. This was achieved through a parameter study of the bearing carried out with ANSYS®. The parameters varied were the number and the width of the arms, the thickness of the bearing, the eccentricity, the size of the starting and ending holes, and the turn angle of the spiral. Comparison was made between the different designs in terms of axial and radial stiffness, the natural frequency, and the maximum induced stresses. Moreover, the Finite Element Analysis (FEA) was compared to theoretical results for a given design. The results led to a graphical design method which assists the selection of flexure bearing geometrical parameters based on pre-determined geometric and material constraints.

  4. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    Design knowledge of modern mechatronic products centers on information processing in knowledge-intensive engineering; product design innovation is therefore essentially an innovation in knowledge and information processing. After analyzing the role of mechatronic product design knowledge and the features of information management, a unified model of an XML-based product information processing method is proposed. The information processing model of product design includes functional knowledge, structural knowledge, and their relationships. XML-based models are proposed for expressing product function elements, product structure elements, and the mapping relationships between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that the method is helpful for knowledge-based design systems and product innovation.
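
    The function/structure/mapping model described above can be sketched as a small XML document built with the standard library. The element and attribute names are illustrative assumptions, not the schema from the article.

```python
import xml.etree.ElementTree as ET

# Build a small unified product model: a function element, a structure
# element, and a mapping between them (names are illustrative).
product = ET.Element("product", name="parallel_friction_roller")
function = ET.SubElement(product, "function", id="F1")
function.text = "transmit torque"
structure = ET.SubElement(product, "structure", id="S1")
structure.text = "roller pair"
ET.SubElement(product, "mapping", function="F1", structure="S1")

xml_text = ET.tostring(product, encoding="unicode")
```

    Keeping function, structure, and their mapping in one XML tree is what makes the model "unified": a knowledge-based design system can query either view and follow the mapping to the other.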

  6. A Bright Future for Evolutionary Methods in Drug Design.

    PubMed

    Le, Tu C; Winkler, David A

    2015-08-01

    Most medicinal chemists understand that chemical space is extremely large, essentially infinite. Although high-throughput experimental methods allow exploration of drug-like space more rapidly, they are still insufficient to fully exploit the opportunities that such large chemical space offers. Evolutionary methods can synergistically blend automated synthesis and characterization methods with computational design to identify promising regions of chemical space more efficiently. We describe how evolutionary methods are implemented, and provide examples of published drug development research in which these methods have generated molecules with increased efficacy. We anticipate that evolutionary methods will play an important role in future drug discovery.

  7. On design methods for bolted joints in composite aircraft structures

    NASA Astrophysics Data System (ADS)

    Ireman, Tomas; Nyman, Tonny; Hellbom, Kurt

    The problems related to the determination of the load distribution in a multirow fastener joint using the finite element method are discussed. Both simple and more advanced design methods used at Saab Military Aircraft are presented. The stress distributions obtained with an analytically based method and an FE-based method are compared. Results from failure predictions with a simple analytically based method and the more advanced FE-based method of multi-fastener tension and shear loaded test specimens are compared with experiments. Finally, complicating factors such as three-dimensional effects caused by secondary bending and fastener bending are discussed and suggestions for future research are given.

  8. Design of diffractive optical surfaces within the nonimaging SMS design method

    NASA Astrophysics Data System (ADS)

    Mendes-Lopes, João.; Benítez, Pablo; Miñano, Juan C.

    2015-09-01

    The Simultaneous Multiple Surface (SMS) method was initially developed as a design method in nonimaging optics and was later extended to imaging optics. We show an extension of the SMS method to diffractive surfaces. Using this method, diffractive kinoform surfaces are calculated simultaneously and through a direct method, i.e., one not based on multi-parametric optimization techniques. Using the phase-shift properties of diffractive surfaces as an extra degree of freedom, only N/2 surfaces are needed to perfectly couple N one-parameter wavefronts. Wavefronts of different wavelengths can also be coupled, hence chromatic aberration can be corrected in SMS-based systems. The method can combine reflective, refractive, and diffractive surfaces, calculated simultaneously through direct calculation of the phase and refractive/reflective profiles. Representative diffractive systems designed by the SMS method are presented.

  9. The Design with Intent Method: a design tool for influencing user behaviour.

    PubMed

    Lockton, Dan; Harrison, David; Stanton, Neville A

    2010-05-01

    Using product and system design to influence user behaviour offers potential for improving performance and reducing user error, yet little guidance is available at the concept generation stage for design teams briefed with influencing user behaviour. This article presents the Design with Intent Method, an innovation tool for designers working in this area, illustrated via application to an everyday human-technology interaction problem: reducing the likelihood of a customer leaving his or her card in an automatic teller machine. The example application results in a range of feasible design concepts which are comparable to existing developments in ATM design, demonstrating that the method has potential for development and application as part of a user-centred design process.

  10. INNOVATIVE METHODS FOR THE OPTIMIZATION OF GRAVITY STORM SEWER DESIGN

    EPA Science Inventory

    The purpose of this paper is to describe a new method for optimizing the design of urban storm sewer systems. Previous efforts to optimize gravity sewers have met with limited success because classical optimization methods require that the problem be well behaved, e.g. describ...

  11. Designing, Teaching, and Evaluating Two Complementary Mixed Methods Research Courses

    ERIC Educational Resources Information Center

    Christ, Thomas W.

    2009-01-01

    Teaching mixed methods research is difficult. This longitudinal explanatory study examined how two classes were designed, taught, and evaluated. Curriculum, Research, and Teaching (EDCS-606) and Mixed Methods Research (EDCS-780) used a research proposal generation process to highlight the importance of the purpose, research question and…

  12. Tradeoff methods in multiobjective insensitive design of airplane control systems

    NASA Technical Reports Server (NTRS)

    Schy, A. A.; Giesy, D. P.

    1984-01-01

    The latest results of an ongoing study of computer-aided design of airplane control systems are given. Constrained minimization algorithms are used, with the design objectives in the constraint vector. The concept of Pareto optimality is briefly reviewed, and it is shown how an experienced designer can use it to find designs which are well balanced in all objectives. The problem of finding designs which are insensitive to uncertainty in system parameters is then discussed, introducing a probabilistic vector definition of sensitivity which is consistent with the deterministic Pareto optimal problem. Insensitivity is important in any practical design, but it is particularly important in the design of feedback control systems, since it is considered to be the most important distinctive property of feedback control. Methods of tradeoff between deterministic and stochastic-insensitive (SI) design are described, and tradeoff design results are presented for the example of a Shuttle lateral stability augmentation system. This example is used because careful studies have been made of the uncertainty in Shuttle aerodynamics. Finally, since accurate statistics of uncertain parameters are usually not available, the effects of crude statistical models on SI designs are examined.
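The notion of Pareto optimality reviewed above can be made concrete with a small sketch: given candidate designs scored on several objectives (all minimized), the well-balanced designs are exactly the non-dominated ones. The candidate numbers below are invented for illustration, not taken from the paper.

```python
# Identify Pareto-optimal (non-dominated) designs among candidate
# objective vectors; all objectives are to be minimized.

def dominates(a, b):
    """True if design a is at least as good as b in every objective
    and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Return the subset of designs not dominated by any other design."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other != d)]

# Hypothetical (tracking error, control effort) pairs for five designs.
candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 1.0), (2.5, 3.5), (4.0, 4.0)]
front = pareto_front(candidates)
print(front)  # the dominated designs (2.5, 3.5) and (4.0, 4.0) drop out
```

A designer would then trade off along the surviving front rather than among all candidates.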

  13. Comparison of Optimal Design Methods in Inverse Problems.

    PubMed

    Banks, H T; Holm, Kathleen; Kappel, Franz

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher Information Matrix (FIM). A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model [13], the standard harmonic oscillator model [13] and a popular glucose regulation model [16, 19, 29].
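The FIM-based comparison of sampling schedules can be sketched on a toy one-parameter model (not one of the paper's three examples): for exponential decay y(t) = exp(-theta*t), the scalar FIM sums the squared sensitivities dy/dtheta over the sampling times, and D-optimality for a scalar parameter reduces to maximizing the FIM itself.

```python
import math

# Toy FIM-based design comparison for y(t) = exp(-theta * t).
# The model, noise level and schedules are illustrative assumptions.

def fisher_info(theta, times, sigma=0.1):
    """Scalar FIM: sum of squared sensitivities dy/dtheta = -t*exp(-theta*t)
    over the sampling times, scaled by the observation noise variance."""
    return sum((t * math.exp(-theta * t)) ** 2 for t in times) / sigma ** 2

theta = 1.0
early = [0.1, 0.2, 0.3]    # samples clustered near t = 0
spread = [0.5, 1.0, 1.5]   # samples near the informative region t = 1/theta

# The spread schedule carries more information, hence smaller
# asymptotic standard errors (which scale like 1/sqrt(FIM)).
assert fisher_info(theta, spread) > fisher_info(theta, early)
```

For vector parameters the same comparison would use det(FIM) (D-optimal) or the smallest eigenvalue (E-optimal) in place of the scalar value.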

  14. Against Popperized epidemiology.

    PubMed

    Jacobsen, M

    1976-03-01

    The recommendation that Popper's philosophy of science should be adopted by epidemiologists is disputed. Reference is made to other authors who have shown that the most constructive elements in Popper's ideas have been advocated by earlier philosophers and have been used in epidemiology without abandoning inductive reasoning. It is argued that Popper's denigration of inductive methods is particularly harmful to epidemiology. Inductive reasoning and statistical inference play a key role in the science; it is suggested that unfamiliarity with these ideas contributes to widespread misunderstanding of the function of epidemiology. Attention is drawn to a common fallacy involving correlations between three random variables. The prevalence of the fallacy may be related to confusion between deductive and inductive logic.

  15. Introduction to the use of regression models in epidemiology.

    PubMed

    Bender, Ralf

    2009-01-01

    Regression modeling is one of the most important statistical techniques used in analytical epidemiology. By means of regression models the effect of one or several explanatory variables (e.g., exposures, subject characteristics, risk factors) on a response variable such as mortality or cancer can be investigated. From multiple regression models, adjusted effect estimates can be obtained that take the effect of potential confounders into account. Regression methods can be applied in all epidemiologic study designs so that they represent a universal tool for data analysis in epidemiology. Different kinds of regression models have been developed depending on the measurement scale of the response variable and the study design. The most important methods are linear regression for continuous outcomes, logistic regression for binary outcomes, Cox regression for time-to-event data, and Poisson regression for frequencies and rates. This chapter provides a nontechnical introduction to these regression models with illustrating examples from cancer research.
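Logistic regression for a binary outcome, the workhorse named above, can be sketched with a minimal fit on synthetic data; the exposure/disease counts below are invented so that the exposure odds ratio, exp(b1), comes out to 4.

```python
import math

# Minimal logistic regression sketch: P(y=1|x) = 1/(1+exp(-(b0 + b1*x))),
# fitted by gradient ascent on the log-likelihood. Data are synthetic.

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Return (b0, b1) maximizing the average log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # score for the intercept
            g1 += (y - p) * x    # score for the exposure coefficient
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# Binary exposure and disease status for 12 hypothetical subjects:
# unexposed 2/6 diseased (odds 0.5), exposed 4/6 diseased (odds 2).
exposure = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
disease  = [0, 0, 0, 0, 1, 1, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(exposure, disease)
odds_ratio = math.exp(b1)  # estimates the exposure odds ratio (here 2/0.5 = 4)
print(round(odds_ratio, 2))
```

Adding further covariates to the linear predictor is what yields the adjusted (confounder-controlled) effect estimates the chapter describes.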

  16. Computer method for design of acoustic liners for turbofan engines

    NASA Technical Reports Server (NTRS)

    Minner, G. L.; Rice, E. J.

    1976-01-01

    A design package is presented for the specification of acoustic liners for turbofans. An estimate of the noise generation was made based on modifications of existing noise correlations, for which the inputs are basic fan aerodynamic design variables. The method does not predict multiple pure tones. A target attenuation spectrum was calculated which was the difference between the estimated generation spectrum and a flat annoyance-weighted goal attenuated spectrum. The target spectrum was combined with a knowledge of acoustic liner performance as a function of the liner design variables to specify the acoustic design. The liner design method at present is limited to annular duct configurations. The detailed structure of the liner was specified by combining the required impedance (which is a result of the previous step) with a mathematical model relating impedance to the detailed structure. The design procedure was developed for a liner constructed of perforated sheet placed over honeycomb backing cavities. A sample calculation was carried through in order to demonstrate the design procedure, and experimental results presented show good agreement with the calculated results of the method.

  17. An Integrated Optimization Design Method Based on Surrogate Modeling Applied to Diverging Duct Design

    NASA Astrophysics Data System (ADS)

    Hanan, Lu; Qiushi, Li; Shaobin, Li

    2016-12-01

    This paper presents an integrated optimization design method in which uniform design, response surface methodology and genetic algorithm are used in combination. In detail, uniform design is used to select the experimental sampling points in the experimental domain and the system performance is evaluated by means of computational fluid dynamics to construct a database. After that, response surface methodology is employed to generate a surrogate mathematical model relating the optimization objective and the design variables. Subsequently, genetic algorithm is adopted and applied to the surrogate model to acquire the optimal solution in the case of satisfying some constraints. The method has been applied to the optimization design of an axisymmetric diverging duct, dealing with three design variables including one qualitative variable and two quantitative variables. The method of modeling and optimization design performs well in improving the duct aerodynamic performance and can also be applied to wider fields of mechanical design, serving as a useful tool for engineering designers by reducing design time and computational cost.
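The three-stage workflow (uniform sampling, response surface, evolutionary search on the surrogate) can be sketched in one dimension. The "duct performance" function below is invented, and a tiny mutation-only evolutionary loop stands in for the paper's genetic algorithm.

```python
import random

# 1-D sketch of the surrogate workflow: sample an expensive function at
# uniformly spaced points, fit a quadratic response surface, then search
# the cheap surrogate with a mutation-only evolutionary loop.

def expensive_simulation(x):
    return (x - 0.3) ** 2 + 1.0  # stand-in for a CFD evaluation

# 1. Uniform design: three evenly spaced samples in [0, 1].
xs = [0.0, 0.5, 1.0]
ys = [expensive_simulation(x) for x in xs]

# 2. Response surface: quadratic a*x^2 + b*x + c through the three points.
x0, x1, x2 = xs
y0, y1, y2 = ys
a = ((y2 - y0) / (x2 - x0) - (y1 - y0) / (x1 - x0)) / (x2 - x1)
b = (y1 - y0) / (x1 - x0) - a * (x0 + x1)
c = y0 - a * x0 ** 2 - b * x0
surrogate = lambda x: a * x ** 2 + b * x + c

# 3. Evolutionary search on the surrogate instead of the simulator.
random.seed(0)
best = min((random.uniform(0.0, 1.0) for _ in range(20)), key=surrogate)
for _ in range(200):
    child = min(max(best + random.gauss(0.0, 0.05), 0.0), 1.0)
    if surrogate(child) < surrogate(best):
        best = child
print(round(best, 2))  # near the true optimum x = 0.3
```

The payoff is that the 200 search evaluations hit the cheap surrogate, not the simulator; only the initial design points cost a CFD run each.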

  18. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher order equations representing the design space.

  19. A decentralized linear quadratic control design method for flexible structures

    NASA Technical Reports Server (NTRS)

    Su, Tzu-Jeng; Craig, Roy R., Jr.

    1990-01-01

    A decentralized suboptimal linear quadratic control design procedure which combines substructural synthesis, model reduction, decentralized control design, subcontroller synthesis, and controller reduction is proposed for the design of reduced-order controllers for flexible structures. The procedure starts with a definition of the continuum structure to be controlled. An evaluation model of finite dimension is obtained by the finite element method. Then, the finite element model is decomposed into several substructures by using a natural decomposition called substructuring decomposition. Each substructure, at this point, still has too large a dimension and must be reduced to a size that is Riccati-solvable. Model reduction of each substructure can be performed by using any existing model reduction method, e.g., modal truncation, balanced reduction, Krylov model reduction, or mixed-mode method. Then, based on the reduced substructure model, a subcontroller is designed by an LQ optimal control method for each substructure independently. After all subcontrollers are designed, a controller synthesis method called substructural controller synthesis is employed to synthesize all subcontrollers into a global controller. The assembling scheme used is the same as that employed for the structure matrices. Finally, a controller reduction scheme, called the equivalent impulse response energy controller (EIREC) reduction algorithm, is used to reduce the global controller to a reasonable size for implementation. The EIREC reduced controller preserves the impulse response energy of the full-order controller and has the property of matching low-frequency moments and low-frequency power moments. An advantage of the substructural controller synthesis method is that it relieves the computational burden associated with dimensionality. Besides that, the SCS design scheme is also a highly adaptable controller synthesis method for structures with varying configuration, or varying mass

  20. A method for the design of transonic flexible wings

    NASA Technical Reports Server (NTRS)

    Smith, Leigh Ann; Campbell, Richard L.

    1990-01-01

    Methodology was developed for designing airfoils and wings at transonic speeds which includes a technique that can account for static aeroelastic deflections. This procedure is capable of designing either supercritical or more conventional airfoil sections. Methods for including viscous effects are also illustrated and are shown to give accurate results. The methodology developed is an interactive system containing three major parts. A design module was developed which modifies airfoil sections to achieve a desired pressure distribution. This design module works in conjunction with an aerodynamic analysis module, which for this study is a small perturbation transonic flow code. Additionally, an aeroelastic module is included which determines the wing deformation due to the calculated aerodynamic loads. Because of the modular nature of the method, it can be easily coupled with any aerodynamic analysis code.

  1. Digital Epidemiology

    PubMed Central

    Salathé, Marcel; Bengtsson, Linus; Bodnar, Todd J.; Brewer, Devon D.; Brownstein, John S.; Buckee, Caroline; Campbell, Ellsworth M.; Cattuto, Ciro; Khandelwal, Shashank; Mabry, Patricia L.; Vespignani, Alessandro

    2012-01-01

    Mobile, social, real-time: the ongoing revolution in the way people communicate has given rise to a new kind of epidemiology. Digital data sources, when harnessed appropriately, can provide local and timely information about disease and health dynamics in populations around the world. The rapid, unprecedented increase in the availability of relevant data from various digital sources creates considerable technical and computational challenges. PMID:22844241

  2. Rotordynamics and Design Methods of an Oil-Free Turbocharger

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.

    1999-01-01

    The feasibility of supporting a turbocharger rotor on air foil bearings is investigated based upon predicted rotordynamic stability, load accommodations, and stress considerations. It is demonstrated that foil bearings offer a plausible replacement for oil-lubricated bearings in diesel truck turbochargers. Also, two different rotor configurations are analyzed and the design is chosen which best optimizes the desired performance characteristics. The method of designing machinery for foil bearing use and the assumptions made are discussed.

  3. ERSYS-SPP access method subsystem design specification

    NASA Technical Reports Server (NTRS)

    Weise, R. C. (Principal Investigator)

    1980-01-01

    The STARAN special purpose processor (SPP) is a machine allowing the same operation to be performed on up to 512 different data elements simultaneously. In the ERSYS system, it is to be attached to a 4341 plug compatible machine (PCM) to do certain existing algorithms and, at a later date, to perform other, yet-to-be-specified algorithms. That part of the interface between the 4341 PCM and the SPP located in the 4341 PCM is known as the SPP access method (SPPAM). Access to the SPPAM will be obtained by use of the NQUEUE and DQUEUE commands. The subsystem design specification is to incorporate all applicable design considerations from the ERSYS system design specification and the Level B requirements documents relating to the SPPAM. It is intended as a basis for the preliminary design review and will expand into the subsystem detailed design specification.

  4. Mixed methods research design for pragmatic psychoanalytic studies.

    PubMed

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena.

  5. Design of large Francis turbine using optimal methods

    NASA Astrophysics Data System (ADS)

    Flores, E.; Bornard, L.; Tomas, L.; Liu, J.; Couston, M.

    2012-11-01

    Among a high number of Francis turbine references all over the world, covering the whole market range of heads, Alstom has especially been involved in the development and equipment of the largest power plants in the world: Three Gorges (China - 32×767 MW - 61 to 113 m), Itaipu (Brazil - 20x750 MW - 98.7 m to 127 m) and Xiangjiaba (China - 8x812 MW - 82.5 m to 113.6 m - in erection). Many new projects are under study to equip new power plants with Francis turbines in order to meet an increasing demand for renewable energy. In this context, Alstom Hydro is carrying out many developments to answer those needs, especially for jumbo units such as the planned 1GW-type units in China. The turbine design for such units requires specific care, using the state of the art in computation methods and the latest technologies in model testing as well as the maximum feedback from Jumbo plants already in operation. We present in this paper how a large Francis turbine can be designed using specific design methods, including global and local optimization methods. The spiral case, the tandem cascade profiles, the runner and the draft tube are designed with optimization loops involving a blade design tool, an automatic meshing software and a Navier-Stokes solver, piloted by a genetic algorithm. These automated optimization methods, presented in different papers over the last decade, are nowadays widely used thanks to the growing computation capacity of HPC clusters: the intensive use of such optimization methods at the turbine design stage makes it possible to reach very high levels of performance, while the hydraulic flow characteristics are carefully studied over the whole water passage to avoid any unexpected hydraulic phenomena.

  6. Design of an explosive detection system using Monte Carlo method.

    PubMed

    Hernández-Adame, Pablo Luis; Medina-Castro, Diego; Rodriguez-Ibarra, Johanna Lizbeth; Salas-Luevano, Miguel Angel; Vega-Carrillo, Hector Rene

    2016-11-01

    Regardless of the motivation, terrorism is the most important risk to national security in many countries. Attacks with explosives are the most common method used by terrorists; therefore several procedures to detect explosives are utilized, among them the use of neutrons and photons. In this study an explosive detection system using a (241)AmBe neutron source was designed with the Monte Carlo method. In the design, light water, paraffin, polyethylene, and graphite were used as moderators. The explosive RDX was used, and the gamma rays induced by neutron capture in the explosive were estimated using NaI(Tl) and HPGe detectors. When light water is used as the moderator and HPGe as the detector, the system has the best performance, allowing the explosive to be distinguished from urea. For the final design, the ambient dose equivalents for neutrons and photons were estimated along the radial and axial axes.
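The core Monte Carlo idea behind such a design study, sampling random particle histories to estimate a transport quantity, can be sketched with a deliberately simplified slab model: neutron free paths are drawn from an exponential distribution and counted as transmitted when they exceed the moderator thickness. The cross-section and thickness values are illustrative, not the paper's.

```python
import random

# Toy Monte Carlo transport sketch: estimate the fraction of source
# neutrons transmitted through a moderator slab. Free paths follow an
# exponential distribution with mean 1/sigma_total (uncollided flux only).

def transmitted_fraction(thickness_cm, sigma_total_cm, n=100_000, seed=1):
    """Monte Carlo estimate of exp(-sigma_total * thickness)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.expovariate(sigma_total_cm) > thickness_cm)
    return hits / n

# A thicker moderator slab transmits fewer source neutrons.
thin = transmitted_fraction(2.0, 0.5)   # expect about exp(-1) = 0.368
thick = transmitted_fraction(6.0, 0.5)  # expect about exp(-3) = 0.050
assert thick < thin
print(round(thin, 3), round(thick, 3))
```

A production design code (e.g. MCNP-style) would additionally track scattering, capture and the induced gamma cascade; the sampling principle is the same.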

  7. Computational methods of robust controller design for aerodynamic flutter suppression

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1981-01-01

    The development of Riccati iteration, a tool for the design and analysis of linear control systems, is examined. First, Riccati iteration is applied to the problem of pole placement and order reduction in two-time scale control systems. Order reduction, yielding a good approximation to the original system, is demonstrated using a 16th order linear model of a turbofan engine. Next, a numerical method for solving the Riccati equation is presented and demonstrated for a set of eighth order random examples. A literature review of robust controller design methods follows which includes a number of methods for reducing the trajectory and performance index sensitivity in linear regulators. Lastly, robust controller design for large parameter variations is discussed.
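The flavor of a Riccati iteration can be shown in the scalar case (a one-state toy system, not the paper's 8th- or 16th-order models): Kleinman-style iteration alternates a Lyapunov solve for the current stabilizing gain with a gain update, converging to the algebraic Riccati solution.

```python
# Scalar Riccati iteration (Kleinman's method) for the continuous-time
# LQR equation 2*a*p - (b*p)**2 / r + q = 0. Toy numbers, chosen so the
# analytic solution is p = 1 + sqrt(2).

def riccati_iteration(a, b, q, r, k=2.0, iters=20):
    """Starting from a stabilizing gain k (a - b*k < 0), alternate:
    Lyapunov solve for the closed-loop cost p, then gain update."""
    for _ in range(iters):
        p = (q + r * k * k) / (2.0 * (b * k - a))  # solves the Lyapunov step
        k = b * p / r                              # k = R^{-1} B^T P in 1-D
    return p

p = riccati_iteration(a=1.0, b=1.0, q=1.0, r=1.0)
print(round(p, 4))  # analytic solution: 1 + sqrt(2) ≈ 2.4142
```

The iteration converges quadratically; the same structure (Lyapunov solve plus gain update) carries over to the matrix case.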

  8. Improved method for transonic airfoil design-by-optimization

    NASA Technical Reports Server (NTRS)

    Kennelly, R. A., Jr.

    1983-01-01

    An improved method for use of optimization techniques in transonic airfoil design is demonstrated. FLO6QNM incorporates a modified quasi-Newton optimization package, and is shown to be more reliable and efficient than the method developed previously at NASA-Ames, which used the COPES/CONMIN optimization program. The design codes are compared on a series of test cases with known solutions, and the effects of problem scaling, proximity of initial point to solution, and objective function precision are studied. In contrast to the older method, well-converged solutions are shown to be attainable in the context of engineering design using computational fluid dynamics tools, a new result. The improvements are due to better performance by the optimization routine and to the use of problem-adaptive finite difference step sizes for gradient evaluation.
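The "problem-adaptive finite difference step sizes" credited above with part of the improvement can be illustrated on a toy function: a central-difference step scaled to the objective's evaluation precision and the variable's magnitude balances truncation error against roundoff, whereas an arbitrarily tiny step is roundoff-dominated. The scaling rule and numbers here are a standard textbook heuristic, not FLO6QNM's actual implementation.

```python
# Adaptive central-difference step: balance truncation error (~h^2)
# against roundoff (~eps_f / h), giving h ~ eps_f**(1/3) * max(|x|, 1).

def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2.0 * h)

def adaptive_step(x, eps_f=1e-12):
    """eps_f is the assumed relative precision of the objective."""
    return eps_f ** (1.0 / 3.0) * max(abs(x), 1.0)

f = lambda x: x ** 3          # analytic derivative: 3*x^2
x = 2.0
exact = 3.0 * x ** 2

err_tiny = abs(central_diff(f, x, 1e-13) - exact)            # roundoff-dominated
err_adapt = abs(central_diff(f, x, adaptive_step(x)) - exact)
assert err_adapt < err_tiny
```

In a design-by-optimization loop the objective comes from a CFD solve with limited convergence, so eps_f is far larger than machine epsilon and choosing h well matters even more.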

  9. Methodologic research needs in environmental epidemiology: data analysis.

    PubMed Central

    Prentice, R L; Thomas, D

    1993-01-01

    A brief review is given of data analysis methods for the identification and quantification of associations between environmental exposures and health events of interest. Data analysis methods are outlined for each of the study designs mentioned, with an emphasis on topics in need of further research. Particularly noted are the need for improved methods for accommodating exposure assessment measurement errors in analytic epidemiologic studies and for improved methods for the conduct and analysis of aggregate data (ecologic) studies. PMID:8206041
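The measurement-error problem flagged above has a classic consequence that a short simulation makes concrete: classical error in the measured exposure attenuates a regression slope by the reliability ratio var(X) / (var(X) + var(error)). All numbers below are simulated for illustration.

```python
import random

# Simulated illustration of regression dilution from exposure
# measurement error: true slope 2, reliability ratio 1/(1+1) = 0.5,
# so the naive slope on the noisy exposure is attenuated toward 1.

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

rng = random.Random(42)
true_x = [rng.gauss(0, 1) for _ in range(20000)]
y = [2.0 * x + rng.gauss(0, 0.5) for x in true_x]   # true slope is 2
noisy_x = [x + rng.gauss(0, 1) for x in true_x]     # classical error, var = 1

b_true = slope(true_x, y)    # close to 2.0
b_noisy = slope(noisy_x, y)  # close to 2.0 * 0.5 = 1.0 (attenuated)
print(round(b_true, 2), round(b_noisy, 2))
```

Methods that accommodate measurement error (regression calibration, SIMEX, and the like) aim to undo exactly this attenuation.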

  10. Inverse design of airfoils using a flexible membrane method

    NASA Astrophysics Data System (ADS)

    Thinsurat, Kamon

    The Modified Garabedian-McFadden (MGM) method is used to inversely design airfoils. A Finite Difference Method (FDM) for non-uniform grids was developed to discretize the MGM equation for numerical solution. The FDM for non-uniform grids has the advantage that it can be used flexibly with unstructured airfoil grids. The commercial software FLUENT is used as the flow solver. Several conditions are set in FLUENT, such as subsonic inviscid flow, subsonic viscous flow, transonic inviscid flow, and transonic viscous flow, to test the inverse design code for each condition. A moving-grid program is used to create a mesh for new airfoils prior to importing meshes into FLUENT for the analysis of flows. For validation, an iterative process is used so that the Cp distribution of the initial airfoil, the NACA0011, converges to the Cp distribution of the target airfoil, the NACA2315, for the subsonic inviscid case at M=0.2. Three other cases were carried out to validate the code. After the code validations, the inverse design method was used to design a shock-free airfoil in the transonic condition and a separation-free airfoil at a high angle of attack in the subsonic condition.
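The building block of a non-uniform-grid FDM is the unequal-spacing stencil; the standard three-point second-derivative formula is sketched below on an arbitrary test function (this is the generic stencil, not the thesis's specific discretization of the MGM equation).

```python
# Three-point second-derivative stencil on a non-uniform grid, with
# spacings h1 = x0 - x_minus and h2 = x_plus - x0. Exact for quadratics.

def second_derivative(fm, f0, fp, h1, h2):
    """Approximate f'' at the middle of three samples (fm, f0, fp)."""
    return 2.0 * (fm / (h1 * (h1 + h2))
                  - f0 / (h1 * h2)
                  + fp / (h2 * (h1 + h2)))

f = lambda x: 3.0 * x ** 2 + x + 1.0  # f'' = 6 everywhere
x0, h1, h2 = 1.0, 0.1, 0.25           # deliberately unequal spacings
approx = second_derivative(f(x0 - h1), f(x0), f(x0 + h2), h1, h2)
print(approx)  # exact for a quadratic: 6.0
```

On a uniform grid (h1 = h2 = h) this reduces to the familiar (fm - 2*f0 + fp) / h**2.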

  11. An uncertain multidisciplinary design optimization method using interval convex models

    NASA Astrophysics Data System (ADS)

    Li, Fangyi; Luo, Zhen; Sun, Guangyong; Zhang, Nong

    2013-06-01

    This article proposes an uncertain multi-objective multidisciplinary design optimization methodology, which employs the interval model to represent the uncertainties of uncertain-but-bounded parameters. The interval number programming method is applied to transform each uncertain objective function into two deterministic objective functions, and a satisfaction degree of intervals is used to convert both the uncertain inequality and equality constraints to deterministic inequality constraints. In doing so, an unconstrained deterministic optimization problem will be constructed in association with the penalty function method. The design will be finally formulated as a nested three-loop optimization, a class of highly challenging problems in the area of engineering design optimization. An advanced hierarchical optimization scheme is developed to solve the proposed optimization problem based on the multidisciplinary feasible strategy, which is a well-studied method able to reduce the dimensions of multidisciplinary design optimization problems by using the design variables as independent optimization variables. In the hierarchical optimization system, the non-dominated sorting genetic algorithm II, sequential quadratic programming method and Gauss-Seidel iterative approach are applied to the outer, middle and inner loops of the optimization problem, respectively. Typical numerical examples are used to demonstrate the effectiveness of the proposed methodology.
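The first transformation step, turning one interval-valued objective into two deterministic objectives, can be sketched with a made-up monotone objective: propagate the uncertain-but-bounded parameter through its endpoints and report the interval's midpoint (nominal performance) and radius (spread due to uncertainty).

```python
# Interval-number programming sketch: replace an uncertain objective
# f(x, p), with p in [p_lo, p_hi], by two deterministic objectives.
# f is monotone in p for x >= 0, so its bounds occur at the endpoints.

def interval_objective(design_x, p_lo, p_hi):
    f = lambda p: p * design_x ** 2 + design_x  # invented objective
    lo, hi = f(p_lo), f(p_hi)
    midpoint = 0.5 * (lo + hi)  # deterministic objective 1: nominal value
    radius = 0.5 * (hi - lo)    # deterministic objective 2: robustness
    return midpoint, radius

# Uncertain parameter p in [0.8, 1.2]; compare two candidate designs.
m1, r1 = interval_objective(1.0, 0.8, 1.2)  # -> (2.0, 0.2)
m2, r2 = interval_objective(2.0, 0.8, 1.2)  # -> (6.0, 0.8)
assert r2 > r1  # the larger design is less robust to the uncertainty
```

The article's full method then handles the interval constraints via a satisfaction degree and wraps everything in the nested three-loop optimization; this sketch covers only the objective transformation.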

  12. Comparison of pulsed-field gel electrophoresis & repetitive sequence-based PCR methods for molecular epidemiological studies of Escherichia coli clinical isolates

    PubMed Central

    Bae, Il Kwon; Kim, Juwon; Sun, Je Young Hannah; Jeong, Seok Hoon; Kim, Yong-Rok; Wang, Kang-Kyun; Lee, Kyungwon

    2014-01-01

    Background & objectives: PFGE, rep-PCR, and MLST are widely used to identify related bacterial isolates and determine epidemiologic associations during outbreaks. This study was performed to compare the ability of repetitive sequence-based PCR (rep-PCR) and pulsed-field gel electrophoresis (PFGE) to determine the genetic relationships among Escherichia coli isolates assigned to various sequence types (STs) by two multilocus sequence typing (MLST) schemes. Methods: A total of 41 extended-spectrum β-lactamase- (ESBL-) and/or AmpC β-lactamase-producing E. coli clinical isolates were included in this study. MLST experiments were performed following the Achtman's MLST scheme and the Whittam's MLST scheme, respectively. Rep-PCR experiments were performed using the DiversiLab system. PFGE experiments were also performed. Results: A comparison of the two MLST methods demonstrated that these two schemes yielded compatible results. PFGE correctly segregated E. coli isolates belonging to different STs as different types, but did not group E. coli isolates belonging to the same ST in the same group. Rep-PCR accurately grouped E. coli isolates belonging to the same ST together, but this method demonstrated limited ability to discriminate between E. coli isolates belonging to different STs. Interpretation & conclusions: These results suggest that PFGE would be more effective when investigating outbreaks in a limited space, such as a specialty hospital or an intensive care unit, whereas rep-PCR should be used for nationwide or worldwide epidemiology studies. PMID:25579152

  13. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

    The primary objective of the three year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the area of design decision-making, probabilistic modeling, and optimization. The specific focus of the three year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant has had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In particular, in the final year of research, the focus was on the comparison and contrasting between three methods researched. Specifically, these three are the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regards to identification of research needs in the baseline method through implementation exercises. The end result of Task B was the documentation of the evolution of the method with time and a technology transfer to the sponsor regarding the method, such that an initial capability for execution could be obtained by the sponsor. Specifically, the results of year 3 efforts were the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and
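Of the three Task A approaches, Dempster-Shafer theory is the most self-contained to sketch: Dempster's rule combines two bodies of evidence by multiplying the masses of intersecting focal elements and renormalizing away the conflicting mass. The two mass assignments below are invented for illustration.

```python
# Dempster's rule of combination over focal elements represented as
# frozensets; invented evidence over a two-hypothesis frame {A, B}.

def combine(m1, m2):
    """Combine two basic mass assignments; K is the conflicting mass."""
    raw = {}
    conflict = 0.0
    for fa, wa in m1.items():
        for fb, wb in m2.items():
            inter = fa & fb
            if inter:
                raw[inter] = raw.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb          # mass assigned to the empty set
    scale = 1.0 - conflict                   # renormalization by 1 - K
    return {k: v / scale for k, v in raw.items()}

A, B = frozenset("A"), frozenset("B")
theta = A | B                  # the frame of discernment {A, B}
m1 = {A: 0.6, theta: 0.4}      # evidence source 1
m2 = {B: 0.3, theta: 0.7}      # evidence source 2
m = combine(m1, m2)
print({tuple(sorted(k)): round(v, 3) for k, v in m.items()})
```

The mass left on the full frame theta is what distinguishes D-S from a Bayesian posterior: it represents evidence that remains uncommitted between the hypotheses.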

  14. Review of SMS design methods and real-world applications

    NASA Astrophysics Data System (ADS)

    Dross, Oliver; Mohedano, Ruben; Benitez, Pablo; Minano, Juan Carlos; Chaves, Julio; Blen, Jose; Hernandez, Maikel; Munoz, Fernando

    2004-09-01

    The Simultaneous Multiple Surfaces design method (SMS), proprietary technology of Light Prescription Innovators (LPI), was developed in the early 1990's as a two dimensional method. The first embodiments had either linear or rotational symmetry and found applications in photovoltaic concentrators, illumination optics and optical communications. SMS designed devices perform close to the thermodynamic limit and are compact and simple; features that are especially beneficial in applications with today's high brightness LEDs. The method was extended to 3D "free form" geometries in 1999 that perfectly couple two incoming with two outgoing wavefronts. SMS 3D controls the light emitted by an extended light source much better than single free form surface designs, while reaching very high efficiencies. This has enabled the SMS method to be applied to automotive head lamps, one of the toughest lighting tasks in any application, where high efficiency and small size are required. This article will briefly review the characteristics of both the 2D and 3D methods and will present novel optical solutions that have been developed and manufactured to meet real world problems. These include various ultra compact LED collimators, solar concentrators and highly efficient LED low and high beam headlamp designs.

  15. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Phase 1

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley Multidisciplinary Design Optimization (MDO) method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. This report documents all computational experiments conducted in Phase I of the study. This report is a companion to the paper titled Initial Results of an MDO Method Evaluation Study by N. M. Alexandrov and S. Kodiyalam (AIAA-98-4884).

  16. A PDE Sensitivity Equation Method for Optimal Aerodynamic Design

    NASA Technical Reports Server (NTRS)

    Borggaard, Jeff; Burns, John

    1996-01-01

    The use of gradient based optimization algorithms in inverse design is well established as a practical approach to aerodynamic design. A typical procedure uses a simulation scheme to evaluate the objective function (from the approximate states) and its gradient, then passes this information to an optimization algorithm. Once the simulation scheme (CFD flow solver) has been selected and used to provide approximate function evaluations, there are several possible approaches to the problem of computing gradients. One popular method is to differentiate the simulation scheme and compute design sensitivities that are then used to obtain gradients. Although this black-box approach has many advantages in shape optimization problems, one must compute mesh sensitivities in order to compute the design sensitivity. In this paper, we present an alternative approach using the PDE sensitivity equation to develop algorithms for computing gradients. This approach has the advantage that mesh sensitivities need not be computed. Moreover, when it is possible to use the CFD scheme for both the forward problem and the sensitivity equation, then there are computational advantages. An apparent disadvantage of this approach is that it does not always produce consistent derivatives. However, for a proper combination of discretization schemes, one can show asymptotic consistency under mesh refinement, which is often sufficient to guarantee convergence of the optimal design algorithm. In particular, we show that when asymptotically consistent schemes are combined with a trust-region optimization algorithm, the resulting optimal design method converges. We denote this approach as the sensitivity equation method. The sensitivity equation method is presented, convergence results are given and the approach is illustrated on two optimal design problems involving shocks.
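    The sensitivity-equation idea above (differentiate the governing equation, not the discretized solver) can be illustrated on a toy ODE rather than the paper's CFD setting. The model, objective, and step sizes below are assumptions chosen for illustration only: for u'(t) = -a*u with u(0) = 1 and objective J(a) = u(T)^2, the sensitivity s = du/da satisfies its own companion ODE, and the gradient follows from the chain rule.

```python
# Sensitivity-equation sketch (illustrative, not the paper's CFD problem):
#   state:       u' = -a*u,        u(0) = 1
#   sensitivity: s' = -a*s - u,    s(0) = 0   (differentiate the ODE w.r.t. a)
#   gradient:    dJ/da = 2*u(T)*s(T)   for J(a) = u(T)^2
import math

def gradient_via_sensitivity(a, T=1.0, n=1000):
    dt = T / n
    u, s = 1.0, 0.0
    for _ in range(n):
        # forward Euler on state and sensitivity simultaneously
        # (tuple assignment uses the old u in the s update, as required)
        u, s = u + dt * (-a * u), s + dt * (-a * s - u)
    return 2.0 * u * s

a = 0.5
num = gradient_via_sensitivity(a)
exact = -2.0 * math.exp(-2.0 * a)   # analytic dJ/da = -2*T*exp(-2aT), T = 1
print(num, exact)
```

The numerical gradient agrees with the analytic value to the accuracy of the Euler discretization; no mesh sensitivities are needed, mirroring the advantage claimed in the abstract.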

  17. Methodologic Issues and Approaches to Spatial Epidemiology

    PubMed Central

    Beale, Linda; Abellan, Juan Jose; Hodgson, Susan; Jarup, Lars

    2008-01-01

    Spatial epidemiology is increasingly being used to assess health risks associated with environmental hazards. Risk patterns tend to have both a temporal and a spatial component; thus, spatial epidemiology must combine methods from epidemiology, statistics, and geographic information science. Recent statistical advances in spatial epidemiology include the use of smoothing in risk maps to create an interpretable risk surface, the extension of spatial models to incorporate the time dimension, and the combination of individual- and area-level information. Advances in geographic information systems and the growing availability of modeling packages have led to an improvement in exposure assessment. Techniques drawn from geographic information science are being developed to enable the visualization of uncertainty and ensure more meaningful inferences are made from data. When public health concerns related to the environment arise, it is essential to address such anxieties appropriately and in a timely manner. Tools designed to facilitate the investigation process are being developed, although the availability of complete and clean health data, and appropriate exposure data often remain limiting factors. PMID:18709139

  18. Taguchi method of experimental design in materials education

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.

    1993-01-01

    Some of the advantages and disadvantages of the Taguchi Method of experimental design as applied to Materials Science will be discussed. This is a fractional factorial method that employs the minimum number of experimental trials for the information obtained. The analysis is also very simple to use and teach, which is quite advantageous in the classroom. In addition, the Taguchi loss function can be easily incorporated to emphasize that improvements in reproducibility are often at least as important as optimization of the response. The disadvantages of the Taguchi Method include the fact that factor interactions are normally not accounted for, there are zero degrees of freedom if all of the possible factors are used, and randomization is normally not used to prevent environmental biasing. In spite of these disadvantages it is felt that the Taguchi Method is extremely useful for both teaching experimental design and as a research tool, as will be shown with a number of brief examples.
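    The core Taguchi calculation, estimating main effects from an orthogonal array, is simple enough to show directly. The following sketch uses an L4(2^3) array with made-up response values (not from the article) and computes each factor's main effect as the difference of mean responses between its two levels.

```python
# Taguchi-style main-effect analysis on an L4(2^3) orthogonal array.
# Responses are hypothetical illustration data; levels are coded 0/1
# for three factors A, B, C.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
response = [20.0, 24.0, 30.0, 26.0]  # made-up measured values

def main_effect(factor):
    """Mean response at level 1 minus mean response at level 0."""
    hi = [y for run, y in zip(L4, response) if run[factor] == 1]
    lo = [y for run, y in zip(L4, response) if run[factor] == 0]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(i) for i, name in enumerate("ABC")}
print(effects)  # → {'A': 6.0, 'B': 0.0, 'C': 4.0}
```

Note the trade-off the abstract mentions: with four runs and three factors there are zero residual degrees of freedom, so interactions cannot be separated from main effects.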

  19. A Simple Method for High-Lift Propeller Conceptual Design

    NASA Technical Reports Server (NTRS)

    Patterson, Michael; Borer, Nick; German, Brian

    2016-01-01

    In this paper, we present a simple method for designing propellers that are placed upstream of the leading edge of a wing in order to augment lift. Because the primary purpose of these "high-lift propellers" is to increase lift rather than produce thrust, these props are best viewed as a form of high-lift device; consequently, they should be designed differently than traditional propellers. We present a theory that describes how these props can be designed to provide a relatively uniform axial velocity increase, which is hypothesized to be advantageous for lift augmentation based on a literature survey. Computational modeling indicates that such propellers can generate the same average induced axial velocity while consuming less power and producing less thrust than conventional propeller designs. For an example problem based on specifications for NASA's Scalable Convergent Electric Propulsion Technology and Operations Research (SCEPTOR) flight demonstrator, a propeller designed with the new method requires approximately 15% less power and produces approximately 11% less thrust than one designed for minimum induced loss. Higher-order modeling and/or wind tunnel testing are needed to verify the predicted performance.

  20. System Synthesis in Preliminary Aircraft Design using Statistical Methods

    NASA Technical Reports Server (NTRS)

    DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.

    1996-01-01

    This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to search the design space for optimum configurations more efficiently. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than those generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to conceptual aircraft synthesis and to provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
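    The DOE/RSM step described above amounts to sampling an expensive analysis on a designed grid and fitting a regression polynomial. The sketch below is a minimal stand-in: the "analysis" function, design levels, and basis are all assumptions for illustration, not the paper's HSCT models; the quadratic response surface equation is fit by solving the normal equations with a small Gaussian elimination.

```python
# DOE/RSM sketch: sample a response on a 3-level full factorial design,
# then fit a quadratic response surface by least squares.
def analysis(x1, x2):                 # stand-in for an expensive analysis code
    return 3.0 + 2.0 * x1 - x2 + 0.5 * x1 * x1

def basis(x1, x2):                    # quadratic RSM basis functions
    return [1.0, x1, x2, x1 * x1, x2 * x2, x1 * x2]

levels = [-1.0, 0.0, 1.0]             # coded design-variable levels
X = [basis(a, b) for a in levels for b in levels]
y = [analysis(a, b) for a in levels for b in levels]

n = len(X[0])                         # normal equations: (X^T X) c = X^T y
A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(n)] for i in range(n)]
b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(n)]

def gauss(A, b):                      # solve A c = b with partial pivoting
    m = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        m[i], m[p] = m[p], m[i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            m[r] = [mr - f * mi for mr, mi in zip(m[r], m[i])]
    c = [0.0] * n
    for i in reversed(range(n)):
        c[i] = (m[i][n] - sum(m[i][j] * c[j] for j in range(i + 1, n))) / m[i][i]
    return c

coef = gauss(A, b)                    # recovers the true [3, 2, -1, 0.5, 0, 0]
print([round(c, 6) for c in coef])
```

Because the true response here lies inside the quadratic basis, the fit is exact; in the paper's setting the polynomial is only a surrogate and its lack of fit would be checked.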

  1. An interdisciplinary heuristic evaluation method for universal building design.

    PubMed

    Afacan, Yasemin; Erbug, Cigdem

    2009-07-01

    This study highlights how heuristic evaluation as a usability evaluation method can feed into current building design practice to conform to universal design principles. It provides a definition of universal usability that is applicable to an architectural design context. It takes the seven universal design principles as a set of heuristics and applies an iterative sequence of heuristic evaluation in a shopping mall, aiming to achieve a cost-effective evaluation process. The evaluation was composed of three consecutive sessions. First, five evaluators from different professions were interviewed regarding the construction drawings in terms of universal design principles. Then, each evaluator was asked to perform the predefined task scenarios. In subsequent interviews, the evaluators were asked to re-analyze the construction drawings. The results showed that heuristic evaluation could successfully integrate universal usability into current building design practice in two ways: (i) it promoted an iterative evaluation process combined with multi-sessions rather than relying on one evaluator and on one evaluation session to find the maximum number of usability problems, and (ii) it highlighted the necessity of an interdisciplinary ad hoc committee regarding the heuristic abilities of each profession. A multi-session and interdisciplinary heuristic evaluation method can save both the project budget and the required time, while ensuring a reduced error rate for the universal usage of the built environments.

  2. Function combined method for design innovation of children's bike

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoli; Qiu, Tingting; Chen, Huijuan

    2013-03-01

    As children mature, bike products for children develop alongside them, and product generations are frequently updated. Certain problems occur in their use, such as overlapping product cycles, duplicated functions, and short life cycles, which run against the principles of energy conservation and the environmentally protective, intensive design concept. In this paper, a rational multi-function design method based on functional superposition, transformation, and technical implementation is proposed. An organic combination of a frog-style scooter and a children's tricycle is developed using the multi-function method. From an ergonomic perspective, the paper elaborates on the body sizes of children aged 5 to 12 and effectively extracts data for a multi-function children's bike that can be used for both gliding and riding. By inverting the body of the bike, the handles and the pedals can be interchanged. Finally, the paper provides a detailed analysis of the components and structural design, body material, and processing technology of the bike. The study of industrial product innovation design provides an effective design method that solves the bicycle problems, extends product function, improves the product's market situation, and enhances energy saving while implementing intensive product development effectively.

  3. Molecular library design using multi-objective optimization methods.

    PubMed

    Nicolaou, Christos A; Kannas, Christos C

    2011-01-01

    Advancements in combinatorial chemistry and high-throughput screening technology have enabled the synthesis and screening of large molecular libraries for the purposes of drug discovery. Contrary to initial expectations, the increase in screening library size, typically combined with an emphasis on compound structural diversity, did not result in a comparable increase in the number of promising hits found. In an effort to improve the likelihood of discovering hits with greater optimization potential, more recent approaches attempt to incorporate additional knowledge into the library design process to effectively guide the search. Multi-objective optimization methods capable of taking into account several chemical and biological criteria have been used to design collections of compounds satisfying multiple pharmaceutically relevant objectives simultaneously. In this chapter, we present our efforts to implement a multi-objective optimization method, MEGALib, custom-designed for the library design problem. The method exploits existing knowledge, e.g. from previous biological screening experiments, to identify and profile molecular fragments that are subsequently used to design compounds balancing the various objectives.
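    The common computational core of such multi-objective library design is Pareto dominance: a compound is kept if no other candidate is at least as good on every objective and strictly better on one. The sketch below is illustrative only (MEGALib itself is not reproduced here), with hypothetical objective scores.

```python
# Minimal Pareto-dominance filter of the kind multi-objective library
# design relies on. Each candidate is scored on several objectives,
# all to be minimized; scores here are hypothetical.
def dominates(a, b):
    """True if a is at least as good as b everywhere and better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(scores):
    return [s for s in scores if not any(dominates(t, s) for t in scores)]

# hypothetical (potency, toxicity, cost) scores, lower is better
candidates = [(0.2, 0.9, 3.0), (0.4, 0.4, 2.0), (0.3, 0.95, 3.5), (0.9, 0.1, 1.0)]
front = pareto_front(candidates)
print(front)   # the third candidate is dominated by the first
```

In a real library design loop this filter would sit inside an evolutionary or fragment-assembly search, with objectives computed from screening data and property predictions.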

  4. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

    Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77 De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68 Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
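    The comparison of design criteria above can be made concrete with a small numerical sketch of D-optimality for the Verhulst-Pearl logistic model. Everything below (parameter values, time grids, noise level) is an illustrative assumption, not the paper's computation: the Fisher information matrix is built from finite-difference sensitivities, and its determinant is compared for a sampling design covering the growth phase versus one placed entirely after saturation.

```python
# D-optimality sketch for the logistic model x(t) = K*x0 / (x0 + (K-x0)e^{-rt})
# with unknown parameters (K, r); constants are illustrative.
import math

K, r, x0 = 100.0, 0.3, 10.0

def model(t, K, r):
    return K * x0 / (x0 + (K - x0) * math.exp(-r * t))

def fim(times, h=1e-6, sigma=1.0):
    """2x2 Fisher information from finite-difference sensitivities."""
    M = [[0.0, 0.0], [0.0, 0.0]]
    for t in times:
        dK = (model(t, K + h, r) - model(t, K - h, r)) / (2 * h)
        dr = (model(t, K, r + h) - model(t, K, r - h)) / (2 * h)
        g = (dK, dr)
        for i in range(2):
            for j in range(2):
                M[i][j] += g[i] * g[j] / sigma ** 2
    return M

def det2(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

spread = [1, 3, 5, 7, 9, 11, 13, 15, 17, 19]   # samples across the growth phase
late = [50 + i for i in range(10)]             # samples after saturation
d_spread, d_late = det2(fim(spread)), det2(fim(late))
print(d_spread, d_late)   # the spread design is far more informative
```

After saturation the sensitivity to r nearly vanishes, so the information matrix is close to singular; this is exactly the kind of contrast an optimal sampling distribution is meant to exploit.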

  5. New Methods and Transducer Designs for Ultrasonic Diagnostics and Therapy

    NASA Astrophysics Data System (ADS)

    Rybyanets, A. N.; Naumenko, A. A.; Sapozhnikov, O. A.; Khokhlova, V. A.

    Recent advances in the field of physical acoustics, imaging technologies, piezoelectric materials, and ultrasonic transducer design have led to emerging of novel methods and apparatus for ultrasonic diagnostics, therapy and body aesthetics. The paper presents the results on development and experimental study of different high intensity focused ultrasound (HIFU) transducers. Technological peculiarities of the HIFU transducer design as well as theoretical and numerical models of such transducers and the corresponding HIFU fields are discussed. Several HIFU transducers of different design have been fabricated using different advanced piezoelectric materials. Acoustic field measurements for those transducers have been performed using a calibrated fiber optic hydrophone and an ultrasonic measurement system (UMS). The results of ex vivo experiments with different tissues as well as in vivo experiments with blood vessels are presented that prove the efficacy, safety and selectivity of the developed HIFU transducers and methods.

  6. New displacement-based methods for optimal truss topology design

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin P.; Ben-Tal, Aharon; Haftka, Raphael T.

    1991-01-01

    Two alternate methods for maximum stiffness truss topology design are presented. The ground structure approach is used, and the problem is formulated in terms of displacements and bar areas. This large, nonconvex optimization problem can be solved by a simultaneous analysis and design approach. Alternatively, an equivalent, unconstrained, and convex problem in the displacements only can be formulated, and this problem can be solved by a nonsmooth, steepest descent algorithm. In both methods, the explicit solving of the equilibrium equations and the assembly of the global stiffness matrix are circumvented. A large number of examples have been studied, showing the attractive features of topology design as well as exposing interesting features of optimal topologies.

  7. Multi-objective optimization methods in drug design.

    PubMed

    Nicolaou, Christos A; Brown, Nathan

    2013-09-01

    Drug discovery is a challenging multi-objective problem where numerous pharmaceutically important objectives need to be adequately satisfied for a solution to be found. The problem is characterized by vast, complex solution spaces further perplexed by the presence of conflicting objectives. Multi-objective optimization methods, designed specifically to address such problems, have been introduced to the drug discovery field over a decade ago and have steadily gained in acceptance ever since. This paper reviews the latest multi-objective methods and applications reported in the literature, specifically in quantitative structure–activity modeling, docking, de novo design and library design. Further, the paper reports on related developments in drug discovery research and advances in the multi-objective optimization field.

  8. Database design using NIAM (Nijssen Information Analysis Method) modeling

    SciTech Connect

    Stevens, N.H.

    1989-01-01

    The Nijssen Information Analysis Method (NIAM) is an information modeling technique based on semantics and founded in set theory. A NIAM information model is a graphical representation of the information requirements for some universe of discourse. Information models facilitate data integration and communication within an organization about data semantics. An information model is sometimes referred to as the semantic model or the conceptual schema. It helps in the logical and physical design and implementation of databases. NIAM information modeling is used at Sandia National Laboratories to design and implement relational databases containing engineering information which meet the users' information requirements. The paper focuses on the design of one database which satisfied the data needs of four disjoint but closely related applications. The applications as they existed before did not talk to each other, even though they stored much of the same data redundantly. NIAM was used to determine the information requirements and design the integrated database. 6 refs., 7 figs.

  9. Obtaining Valid Response Rates: Considerations beyond the Tailored Design Method.

    ERIC Educational Resources Information Center

    Huang, Judy Y.; Hubbard, Susan M.; Mulvey, Kevin P.

    2003-01-01

    Reports on the use of the tailored design method (TDM) to achieve high survey response in two separate studies of the dissemination of Treatment Improvement Protocols (TIPs). Findings from these two studies identify six factors that may have influenced nonresponse, and show that use of TDM does not, in itself, guarantee a high response rate. (SLD)

  10. Designs and Methods in School Improvement Research: A Systematic Review

    ERIC Educational Resources Information Center

    Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie

    2016-01-01

    Purpose: The purpose of this paper is to focus on challenges faced by longitudinal quantitative analyses of school improvement processes and to offer a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed designs and methods that are used to analyze the relation between school…

  11. Impact design methods for ceramic components in gas turbine engines

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1991-01-01

    Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.

  12. Using Propensity Score Methods to Approximate Factorial Experimental Designs

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2011-01-01

    The purpose of this study is to compare, through Monte Carlo simulation, several propensity score methods for approximating a factorial experimental design, and to identify the best approaches for reducing bias and mean square error of parameter estimates of the main and interaction effects of two factors. Previous studies focused more on unbiased estimates of…
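    The basic machinery behind any such comparison, fitting a propensity model and reweighting to remove confounding, can be sketched in a few lines. The data-generating model, coefficients, and sample size below are illustrative assumptions, not the study's simulation design: a hand-rolled logistic regression estimates propensity scores, and inverse-propensity weighting recovers a treatment effect from synthetic observational data.

```python
# Propensity-score sketch: logistic regression by gradient ascent, then an
# inverse-propensity-weighted (IPW) treatment-effect estimate. All constants
# are illustrative.
import math, random

random.seed(0)
n = 2000
x = [random.gauss(0, 1) for _ in range(n)]                    # confounder
z = [1 if random.random() < 1 / (1 + math.exp(-0.8 * xi)) else 0 for xi in x]
y = [2.0 * zi + 1.5 * xi + random.gauss(0, 0.5) for zi, xi in zip(z, x)]

# fit P(z = 1 | x) = sigmoid(b0 + b1*x) by plain gradient ascent
b0 = b1 = 0.0
for _ in range(500):
    g0 = g1 = 0.0
    for xi, zi in zip(x, z):
        p = 1 / (1 + math.exp(-(b0 + b1 * xi)))
        g0 += (zi - p) / n
        g1 += (zi - p) * xi / n
    b0 += g0
    b1 += g1

scores = [1 / (1 + math.exp(-(b0 + b1 * xi))) for xi in x]

# IPW estimate of the average treatment effect (true value is 2.0 here)
ate = (sum(zi * yi / s for zi, yi, s in zip(z, y, scores)) / n
       - sum((1 - zi) * yi / (1 - s) for zi, yi, s in zip(z, y, scores)) / n)
print(round(b1, 3), round(ate, 3))   # b1 should land near 0.8, ate near 2.0
```

The study's question is essentially how estimates like this behave, in bias and mean square error, when the propensity machinery is used to mimic a two-factor factorial design rather than a single treatment.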

  13. Transient Analysis Method for HEMi Sabot Structural Design

    DTIC Science & Technology

    2016-06-07

    Transient analysis method for HEMi sabot structural design. F.C. Wong, DRDC Valcartier (Defence R&D Canada – Valcartier), Technical Note. … that can withstand the high g-forces generated by the expanding gas and the inertia of the penetrator. This document discusses the finite element

  14. Rapid species identification and epidemiological analysis of carbapenem-resistant Acinetobacter spp. by a PCR-based open reading frame typing method.

    PubMed

    Yamada, Yuki; Endo, Kentaro; Sawase, Kaori; Anetai, Marie; Narita, Kazuya; Hatakeyama, Yuji; Ishifuji, Katsunori; Kurota, Makiko; Suwabe, Akira

    2016-09-01

    The spread of carbapenem-resistant Acinetobacter spp. has become a global problem. In this study, 18 carbapenem-resistant Acinetobacter calcoaceticus-baumannii (ACB) complexes, identified using a conventional biochemical method at our hospital during 2004-2013, were studied for species identification and epidemiological analyses. Species identification was performed using matrix-assisted laser desorption ionization-time-of-flight MS, a partial sequence analysis of rpoB and a PCR-based ORF typing (POT) method. The POT method can not only identify the species of ACB complexes but also simultaneously determine the international epidemic clones and the genetic identities of Acinetobacter baumannii in several hours. Carbapenem resistance gene detection by PCR, molecular epidemiological analysis by PFGE and Pasteur Institute multilocus sequence typing (MLST) analysis were performed. All three methods identified the 18 isolates as A. baumannii (n=10), Acinetobacter pittii (n=4) and Acinetobacter nosocomialis (n=4). A metallo-β-lactamase gene in all strains of A. pittii and A. nosocomialis, and an ISAba1 gene upstream of the blaOXA-51-like gene in eight strains of A. baumannii, were detected as carbapenemase-related genes. Results from PFGE demonstrated that nine strains of A. baumannii were closely related genetically. Results of MLST analysis showed that A. baumannii is classifiable to sequence type 2. These results were consistent with those obtained using the POT method. The POT method can easily and rapidly identify the international epidemic clones and the identities of A. baumannii. It can be a useful tool for infection control.

  15. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  16. Supersonic/hypersonic aerodynamic methods for aircraft design and analysis

    NASA Technical Reports Server (NTRS)

    Torres, Abel O.

    1992-01-01

    A methodology employed in engineering codes to predict aerodynamic characteristics over arbitrary supersonic/hypersonic configurations is considered. Engineering codes use a combination of simplified methods, based on geometrical impact angle and freestream conditions, to compute the pressure distribution over the vehicle's surface in an efficient and timely manner. These approximate methods are valid for both hypersonic (Mach greater than 4) and lower speeds (Mach down to 2). It is concluded that the proposed methodology enables the user to obtain reasonable estimates of vehicle performance and that engineering methods are valuable in the design process for these types of vehicles.
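    The simplest of the impact-angle methods such codes combine is Newtonian theory, where the panel pressure coefficient depends only on the local flow inclination angle. The sketch below is illustrative (panel angles are made up, and cp_max = 2.0 is the classical Newtonian limit; modified Newtonian would use the stagnation-point value instead).

```python
# Newtonian impact-angle method: Cp = Cp_max * sin^2(theta) on windward
# panels, Cp = 0 on panels shadowed from the freestream.
import math

def newtonian_cp(theta_deg, cp_max=2.0):
    """theta_deg is the local impact (flow inclination) angle of a panel."""
    if theta_deg <= 0.0:          # panel faces away from the freestream
        return 0.0
    return cp_max * math.sin(math.radians(theta_deg)) ** 2

# a crude wedge: stagnation panel, 15-deg windward panel, shadowed panel
for theta in (90.0, 15.0, -5.0):
    print(theta, round(newtonian_cp(theta), 4))
```

Engineering codes tabulate many such local-inclination rules (tangent-wedge, tangent-cone, Prandtl-Meyer expansion) and pick one per panel, which is what makes them fast enough for design iteration.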

  17. Computational methods for aerodynamic design using numerical optimization

    NASA Technical Reports Server (NTRS)

    Peeters, M. F.

    1983-01-01

    Five methods to increase the computational efficiency of aerodynamic design using numerical optimization, by reducing the computer time required to perform gradient calculations, are examined. The most promising method consists of drastically reducing the size of the computational domain on which aerodynamic calculations are made during gradient calculations. Since a gradient calculation requires the solution of the flow about an airfoil whose geometry was slightly perturbed from a base airfoil, the flow about the base airfoil is used to determine boundary conditions on the reduced computational domain. This method worked well in subcritical flow.

  18. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real-world examples of research studies conducted by the authors demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures; and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research.

  19. Application of optical diffraction method in designing phase plates

    NASA Astrophysics Data System (ADS)

    Lei, Ze-Min; Sun, Xiao-Yan; Lv, Feng-Nian; Zhang, Zhen; Lu, Xing-Qiang

    2016-11-01

    Continuous phase plates (CPPs), which perform beam shaping in laser systems, are an important kind of diffractive optic. Based on the Fourier transform of the Gerchberg-Saxton (G-S) algorithm for designing CPPs, we propose an optical diffraction method that reflects real system conditions. A thin lens can perform the Fourier transform of the input signal, and the inverse propagation of light can be implemented in a program. Using both functions together realizes the iterative process of repeatedly calculating the near-field and far-field distributions of light, analogous to the G-S algorithm. The results show that the optical diffraction method can design a CPP for a complicated laser system and give the CPP both beam-shaping ability and phase compensation for the phase aberration of the system. The method can improve the adaptation of the phase plate in systems with phase aberrations.
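    The Gerchberg-Saxton loop underlying such designs can be shown in one dimension with a hand-rolled DFT. Everything here is a toy assumption (grid size, uniform input beam, flat-top far-field target): the iteration alternates between the near field, where the amplitude is fixed and only the phase is kept, and the far field, where the target amplitude is imposed.

```python
# Pure-Python 1-D toy of the Gerchberg-Saxton iteration used for CPP design.
import cmath, math

N = 16
A_in = [1.0] * N                                   # uniform input amplitude
A_tgt = [math.sqrt(2.0) if 4 <= k < 12 else 0.0 for k in range(N)]
# flat-top target over 8 central far-field samples, energy-matched to input

def dft(f, sign):
    """Unitary DFT (sign=-1) or inverse DFT (sign=+1)."""
    return [sum(f[m] * cmath.exp(sign * 2j * math.pi * k * m / N)
                for m in range(N)) / math.sqrt(N) for k in range(N)]

def far_error(F):
    return math.sqrt(sum((abs(Fk) - Ak) ** 2 for Fk, Ak in zip(F, A_tgt)))

phase = [0.1 * m for m in range(N)]                # arbitrary starting phase
errs = []
for _ in range(60):
    F = dft([a * cmath.exp(1j * p) for a, p in zip(A_in, phase)], -1)
    errs.append(far_error(F))
    # impose the target amplitude in the far field, keep the phase
    Fc = [Ak * cmath.exp(1j * cmath.phase(Fk)) for Ak, Fk in zip(A_tgt, F)]
    # propagate back and keep only the phase (the CPP profile)
    g = dft(Fc, +1)
    phase = [cmath.phase(gn) for gn in g]
print(round(errs[0], 3), round(errs[-1], 3))       # far-field error shrinks
```

The paper's contribution is to replace the idealized Fourier transform in this loop with the actual diffraction through the system (thin-lens transform plus inverse propagation), so the retrieved phase also compensates the system's aberrations.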

  20. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into finding the best parameter setting for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, optimal solutions for multiple instances were found efficiently in the preliminary results.
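    The 2^k full factorial piece of such a tuning study is easy to sketch: enumerate every combination of parameter levels and evaluate each setting. The factor names, levels, and the performance function below are hypothetical stand-ins; a real study would run the GA on benchmark instances with replication and fit a regression model rather than evaluate a closed-form function.

```python
# 2^3 full factorial enumeration for tuning three GA parameters.
from itertools import product

factors = {
    "pop_size": (50, 200),
    "crossover": (0.6, 0.9),
    "mutation": (0.01, 0.1),
}

def performance(pop_size, crossover, mutation):
    # synthetic, deterministic stand-in for "average total weighted
    # tardiness of the tuned GA"; lower is better
    return 1000 / pop_size + 5 * abs(crossover - 0.9) + 40 * abs(mutation - 0.05)

design = list(product(*factors.values()))          # 2^3 = 8 runs
best = min(design, key=lambda s: performance(*s))
print(len(design), best)   # → 8 (200, 0.9, 0.01)
```

The factorial design's advantage over OFAT, as the abstract notes, is that interaction effects between parameters can be estimated from the same eight runs.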

  1. Design and descriptive epidemiology of the Infectious Diseases of East African Livestock (IDEAL) project, a longitudinal calf cohort study in western Kenya

    PubMed Central

    2013-01-01

    Background There is a widely recognised lack of baseline epidemiological data on the dynamics and impacts of infectious cattle diseases in east Africa. The Infectious Diseases of East African Livestock (IDEAL) project is an epidemiological study of cattle health in western Kenya with the aim of providing baseline epidemiological data, investigating the impact of different infections on key responses such as growth, mortality and morbidity, the additive and/or multiplicative effects of co-infections, and the influence of management and genetic factors. A longitudinal cohort study of newborn calves was conducted in western Kenya between 2007 and 2009. Calves were randomly selected from all those reported in a two-stage clustered sampling strategy. Calves were recruited between 3 and 7 days old. A team of veterinarians and animal health assistants carried out 5-weekly clinical visits and postmortem visits. Blood and tissue samples were collected at all visits and screened, using a range of laboratory-based diagnostic methods, for over 100 different pathogens or infectious exposures. Results The study followed 548 calves over the first 51 weeks of life or until death, and whenever they were reported clinically ill. The cohort experienced a high all-cause mortality rate of 16%, with at least 13% of deaths due to infectious diseases. Only 307 (6%) of routine visits were classified as clinical episodes, with a further 216 reported by farmers. 54% of calves reached one year without a reported clinical episode. Mortality was mainly due to East Coast fever, haemonchosis, and heartwater. Over 50 pathogens were detected in this population, with exposure to a further 6 viruses and bacteria. Conclusion The IDEAL study has demonstrated that it is possible to mount population-based longitudinal animal studies. The results quantify for the first time in an animal population the high diversity of pathogens a population may have to deal with and the levels of co-infections with key

  2. Non-contact electromagnetic exciter design with linear control method

    NASA Astrophysics Data System (ADS)

    Wang, Lin; Xiong, Xianzhi; Xu, Hua

    2017-01-01

    A non-contact force actuator is necessary for studying the dynamic performance of a high-speed spindle system owing to its high-speed operating conditions. A non-contact electromagnetic exciter is designed for identifying the dynamic coefficients of journal bearings in high-speed grinding spindles. A linear force control method is developed based on a PID controller. The influence of the amplitude and frequency of the current, misalignment, and rotational speed on the magnetic field and excitation force is investigated using two-dimensional finite element analysis. The electromagnetic excitation force is measured with auxiliary coils and calibrated with load cells. The design is validated by the experimental results. Theoretical and experimental investigations show that the proposed design can accurately generate a linear excitation force with sufficiently large amplitude and a high signal-to-noise ratio. Moreover, the fluctuations in force amplitude are substantially reduced with the designed linear control method, even when the air gap changes due to rotor vibration at high speed. Besides, various types of excitation can be applied with the proposed linear control method: constant, synchronous, and non-synchronous excitation forces. This exciter can be used as a linear-force excitation and control system for studying the dynamic performance of different high-speed rotor-bearing systems.
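
The linear force control idea, a PID loop driving the coil current so the measured excitation force tracks a commanded amplitude, can be sketched in miniature. The first-order plant model and gains below are illustrative assumptions, not the paper's actual exciter dynamics:

```python
def track_force(f_cmd=1.0, kp=5.0, ki=50.0, kd=0.0, dt=1e-3, steps=5000, tau=0.05):
    """Discrete PID regulating an assumed first-order force response f' = (u - f)/tau."""
    f, integ, prev_err = 0.0, 0.0, f_cmd
    for _ in range(steps):
        err = f_cmd - f
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # control effort (coil current command)
        prev_err = err
        f += dt * (u - f) / tau                  # hypothetical plant update
    return f

print(track_force())  # settles near the commanded force of 1.0
```

The integral term removes the steady-state error, which is the property that keeps the force amplitude constant as the air gap (and hence the plant gain) drifts.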

  3. Novel TMS coils designed using an inverse boundary element method

    NASA Astrophysics Data System (ADS)

    Cobos Sánchez, Clemente; María Guerrero Rodriguez, Jose; Quirós Olozábal, Ángel; Blanco-Navarro, David

    2017-01-01

    In this work, a new method to design TMS coils is presented. It is based on incorporating the concept of the stream function of a quasi-static electric current into a boundary element method. The proposed TMS coil design approach is a powerful technique to produce stimulators of arbitrary shape, and is remarkably versatile, as it permits prototyping against many different performance requirements and constraints. To illustrate the power of this approach, it has been used to design TMS coils wound on flat rectangular, spherical and hemispherical surfaces, subject to different constraints such as minimum stored magnetic energy or minimum power dissipation. The performance of these coils is additionally described, and the torque experienced by each stimulator in the presence of a static main magnetic field has been found theoretically in order to study the prospect of using them to perform TMS and fMRI concurrently. The obtained results show that the described method is an efficient tool for the design of TMS stimulators, and can be applied to a wide range of coil geometries and performance requirements.

  4. Optimal pulse design in quantum control: A unified computational method

    PubMed Central

    Li, Jr-Shin; Ruths, Justin; Yu, Tsyr-Yan; Arthanari, Haribabu; Wagner, Gerhard

    2011-01-01

    Many key aspects of control of quantum systems involve manipulating a large quantum ensemble exhibiting variation in the values of parameters characterizing the system dynamics. Developing electromagnetic pulses to produce a desired evolution in the presence of such variation is a fundamental and challenging problem in this research area. We pose such robust pulse design as an optimal control problem for a continuum of bilinear systems with a common control function. We map this infinite-dimensional control problem to a problem of polynomial approximation employing tools from geometric control theory. We then adopt this new notion and develop a unified computational method for optimal pulse design using ideas from pseudospectral approximations, by which a continuous-time optimal control problem of pulse design can be discretized into a constrained optimization problem with spectral accuracy. Furthermore, this is a highly flexible and efficient numerical method that requires a low order of discretization and yields inherently smooth solutions. We demonstrate this method by designing effective broadband π/2 and π pulses with reduced rf energy and pulse duration, which show significant sensitivity enhancement at the edge of the spectrum over conventional pulses in 1D and 2D NMR spectroscopy experiments. PMID:21245345
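
The core of the pseudospectral idea, representing continuous functions by their values at Chebyshev nodes so that differentiation becomes a small dense matrix and the control problem becomes finite-dimensional, can be illustrated with the standard Chebyshev differentiation matrix (the generic construction, not the authors' code):

```python
import numpy as np

def cheb(N):
    """Chebyshev-Gauss-Lobatto nodes and the spectral differentiation matrix."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    dX = x[:, None] - x[None, :]
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))      # rows sum to zero (derivative of a constant is 0)
    return D, x

D, x = cheb(8)
# Spectral differentiation is exact (to rounding) for polynomials of degree <= N:
err = np.max(np.abs(D @ x**3 - 3 * x**2))
```

This "spectral accuracy" is why a continuous-time optimal control problem can be discretized with a low order of discretization, as the abstract notes.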

  5. Design method of coaxial reflex hollow beam generator

    NASA Astrophysics Data System (ADS)

    Wang, Jiake; Xu, Jia; Fu, Yuegang; He, Wenjun; Zhu, Qifan

    2016-10-01

    In view of the light energy lost to the central obscuration of a coaxial reflex optical system, the design method for a kind of hollow beam generator is introduced. First, the required physical dimensions of the hollow beam are calculated from the geometrical parameters and obscuration ratio of the front-end coaxial reflex optical system, and the beam-expanding rate of the hollow beam generator is obtained from the parameters of the light source. A suitable enlargement ratio for the initial expanding system is chosen using the relation between the overall beam-expanding rate and that of the initial system; the traditional design method for reflex optical systems is used to design the initial optical system, and the position of the rotation axis of the hollow beam generator is then obtained through the rotation-axis translation formula. The initial system generatrix is intercepted using the translated rotation axis and rotated around that axis through 360°, yielding the two working faces of the hollow beam generator. A hollow beam generator designed by this method produces a hollow beam that matches the front-end coaxial reflex optical system, improving the energy utilization of the beam and effectively reducing the back-scattering of the transmission system.
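
The first step described, sizing the hollow beam from the front-end telescope's geometry and obscuration ratio, amounts to a short calculation; the numbers below are hypothetical:

```python
def hollow_beam_size(d_primary_mm, obscuration_ratio, d_source_mm):
    """Annular beam matching a coaxial reflex telescope: the inner diameter covers
    the central obscuration, the outer diameter fills the primary aperture."""
    d_outer = d_primary_mm
    d_inner = obscuration_ratio * d_primary_mm
    expanding_rate = d_outer / d_source_mm   # required overall beam-expanding rate
    return d_inner, d_outer, expanding_rate

print(hollow_beam_size(200.0, 0.3, 10.0))  # (60.0, 200.0, 20.0)
```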

  6. Material Design, Selection, and Manufacturing Methods for System Sustainment

    SciTech Connect

    David Sowder, Jim Lula, Curtis Marshall

    2010-02-18

    This paper describes a material selection and validation process proven successful for manufacturing high-reliability, long-life products. The National Secure Manufacturing Center business unit of the Kansas City Plant (herein called KCP) designs and manufactures complex electrical and mechanical components used in extreme environments. The material manufacturing heritage is founded in the systems design-to-manufacturing practices that support the U.S. Department of Energy’s National Nuclear Security Administration (DOE/NNSA). Material engineers at KCP work with the systems designers to recommend materials, develop test methods, perform analysis of test data, define cradle-to-grave needs, and present the final selection and fielding. The KCP material engineers typically maintain cost control by utilizing commercial products when possible, but have the resources to develop and produce unique formulations as necessary. This approach is currently being used to mature technologies to manufacture materials with improved characteristics using nano-composite filler materials that will enhance system design and production. For some products the engineers plan and carry out science-based life-cycle material surveillance processes. Recent examples of the approach include refurbished manufacturing of the high-voltage power supplies for cockpit displays in operational aircraft; dry-film lubricant application to improve bearing life for guided munitions gyroscope gimbals; ceramic substrate design for electrical circuit manufacturing; and tailored polymeric materials for various systems. These examples show evidence of KCP concurrent design-to-manufacturing techniques used to achieve system solutions that satisfy or exceed demanding requirements.

  7. Rays inserting method (RIM) to design dielectric optical devices

    NASA Astrophysics Data System (ADS)

    Taskhiri, Mohammad Mahdi; Khalaj Amirhosseini, Mohammad

    2017-01-01

    In this article, a novel approach, called the Rays Inserting Method (RIM), is introduced to design dielectric optical devices. In this approach, some rays are inserted between the two ends of the desired device, and the refractive index at the points along the route of each ray is then obtained. The validity of the introduced approach is verified by designing three types of optical devices: a power splitter, a bend, and a flat lens. The results are confirmed with numerical simulations by means of an FDTD scheme at a frequency of 100 GHz.

  8. Design of transonic compressor cascades using hodograph method

    NASA Technical Reports Server (NTRS)

    Chen, Zuoyi; Guo, Jingrong

    1991-01-01

    The use of the hodograph method in the design of a transonic compressor cascade is discussed. The designed flow in the transonic compressor cascade must satisfy the following: the flow in the nozzle part should be uniform and smooth, the location of the sonic line should be reasonable, and the aerodynamic character of the flow channel in the subsonic region should be met. The flow rate through the cascade may be determined by the velocity distribution in the subsonic region (i.e., by the numerical solution of the Chaplygin equation). The supersonic sections A'C' and AD are determined by the analytical solution of the mixed-type hodograph equation.

  9. The characterization of kerogen-analytical limitations and method design

    SciTech Connect

    Larter, S.R.

    1987-04-01

    Methods suitable for high-resolution total molecular characterization of kerogens and other polymeric SOM are necessary for a quantitative understanding of hydrocarbon maturation and migration phenomena, in addition to being a requirement for a systematic understanding of kerogen-based fuel utilization. Gas chromatographic methods, in conjunction with analytical pyrolysis methods, have proven successful in the rapid superficial characterization of kerogen pyrolysates. Most applications involve qualitative or semi-quantitative assessment of the relative concentration of aliphatic, aromatic, or oxygen-containing species in a kerogen pyrolysate. More recently, the use of alkylated polystyrene internal standards has allowed the direct determination of parameters related to the abundance of, for example, normal alkyl groups or single-ring aromatic species in kerogens. The future of methods of this type for improved kerogen typing is critically discussed. The conceptual design and feasibility of methods suitable for the more complete characterization of complex geopolymers on the molecular level is discussed with practical examples.

  10. Application of the CSCM method to the design of wedge cavities. [Conservative Supra Characteristic Method

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj; Nystrom, G. A.; Bardina, J.; Lombard, C. K.

    1987-01-01

    This paper describes the application of the conservative supra characteristic method (CSCM) to predict the flow around two-dimensional slot-injection-cooled cavities in hypersonic flow. Seven different numerical solutions are presented that model three different experimental designs. The calculations model the outer flow conditions, including the effects of nozzle/lip geometry, angle of attack, nozzle inlet conditions, boundary and shear layer growth, and turbulence on the surrounding flow. The calculations were performed for analysis prior to wind tunnel testing, for sensitivity studies early in the design process. Qualitative and quantitative understanding of the flows for each of the cavity designs, and design recommendations, are provided. The present paper demonstrates the ability of numerical schemes, such as the CSCM method, to play a significant role in the design process.

  11. Unified computational method for design of fluid loop systems

    NASA Astrophysics Data System (ADS)

    Furukawa, Masao

    1991-12-01

    Various kinds of empirical formulas for Nusselt numbers, Fanning friction factors, and pressure loss coefficients were collected and reviewed with the object of constructing a common basis for design calculations of pumped fluid loop systems. The practical expressions obtained after numerical modification are listed in tables with identification numbers corresponding to configurations of the flow passages. Design procedures for a cold plate and for a space radiator are clearly shown in a series of mathematical relations, coupled with a number of detailed expressions which are put in the tables in order of numerical computation. Weight estimate models and several pump characteristics are given in the tables as a result of data regression. A unified computational method based upon the above procedure is presented for preliminary design analyses of a fluid loop system consisting of cold plates, plane radiators, mechanical pumps, valves, and so on.
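
Two classic empirical relations of the kind such tables collect are the Dittus-Boelter Nusselt-number correlation and the Blasius friction factor for turbulent pipe flow. A minimal sketch in their standard textbook forms, not the paper's modified expressions:

```python
def nu_dittus_boelter(Re, Pr):
    """Nu = 0.023 Re^0.8 Pr^0.4 (turbulent pipe flow, fluid being heated)."""
    return 0.023 * Re**0.8 * Pr**0.4

def f_blasius(Re):
    """Darcy friction factor f = 0.316 Re^-0.25 (smooth pipe, turbulent)."""
    return 0.316 * Re**-0.25

def pressure_drop(Re, L, D, rho, v):
    """dp = f (L/D) rho v^2 / 2 for one loop segment."""
    return f_blasius(Re) * (L / D) * 0.5 * rho * v**2
```

Chaining correlations like these segment by segment is exactly the kind of tabulated, ordered computation the abstract describes.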

  12. A simple design method of negative refractive index metamaterials

    NASA Astrophysics Data System (ADS)

    Kim, Dongho; Lee, Wangju; Choi, Jaeick

    2009-11-01

    We propose a very simple design method for negative refractive index (NRI) materials that can overcome some drawbacks of conventional resonant-type NRI materials. The proposed NRI materials consist of single or double metallic patterns printed on a dielectric substrate. Our metamaterials (MTMs) show two properties that differ from other types of MTMs in obtaining effective negative values of permittivity (ε) and permeability (μ) simultaneously: the geometrical outlines of the metallic patterns are not confined to any specific shape, and the metallic patterns are printed on only one side of the dielectric substrate. Therefore, they are very easy to design and fabricate using common printed circuit board (PCB) technology according to the intended application. Excellent agreement between the experimental and predicted data confirms the validity of our design approach.

  13. A Requirements-Driven Optimization Method for Acoustic Treatment Design

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.

    2016-01-01

    Acoustic treatment designers have long been able to target specific noise sources inside turbofan engines. Facesheet porosity and cavity depth are key design variables of perforate-over-honeycomb liners that determine levels of noise suppression as well as the frequencies at which suppression occurs. Layers of these structures can be combined to create a robust attenuation spectrum that covers a wide range of frequencies. Looking to the future, rapidly-emerging additive manufacturing technologies are enabling new liners with multiple degrees of freedom, and new adaptive liners with variable impedance are showing promise. More than ever, there is greater flexibility and freedom in liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. The subject of this paper is an analytical method that derives the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground.

  14. National Tuberculosis Genotyping and Surveillance Network: Design and Methods

    PubMed Central

    Braden, Christopher R.; Schable, Barbara A.; Onorato, Ida M.

    2002-01-01

    The National Tuberculosis Genotyping and Surveillance Network was established in 1996 to perform a 5-year, prospective study of the usefulness of genotyping Mycobacterium tuberculosis isolates to tuberculosis control programs. Seven sentinel sites identified all new cases of tuberculosis, collected information on patients and contacts, and obtained patient isolates. Seven genotyping laboratories performed DNA fingerprinting analysis by the international standard IS6110 method. BioImage Whole Band Analyzer software was used to analyze patterns, and distinct patterns were assigned unique designations. Isolates with six or fewer bands on IS6110 patterns were also spoligotyped. Patient data and genotyping designations were entered in a relational database and merged with selected variables from the national surveillance database. In two related databases, we compiled the results of routine contact investigations and the results of investigations of the relationships of patients who had isolates with matching genotypes. We describe the methods used in the study. PMID:12453342

  15. Optical design and active optics methods in astronomy

    NASA Astrophysics Data System (ADS)

    Lemaitre, Gerard R.

    2013-03-01

    Optical design for astronomy involves the implementation of active optics and adaptive optics from the X-ray to the infrared. Developments and results of active optics methods for telescopes, spectrographs and coronagraphic planet finders are presented. The high accuracy and remarkable smoothness of surfaces generated by active optics methods also allow the elaboration of new optical design types with highly aspheric and/or non-axisymmetric surfaces. Depending on the goal and performance required of a deformable optical surface, analytical investigations are carried out with one of the various facets of elasticity theory: small-deformation thin-plate theory, large-deformation thin-plate theory, shallow spherical shell theory, or weakly conical shell theory. The resulting thickness distribution and associated bending force boundaries can be refined further with finite element analysis.
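
As a reminder of the first elasticity facet listed, small-deformation thin-plate theory reduces the deformable-surface problem to the classical plate equation (standard textbook form, not a result specific to this paper):

```latex
D\,\nabla^4 w(x,y) = q(x,y), \qquad D = \frac{E\,t^3}{12\,(1-\nu^2)}
```

where $w$ is the transverse deflection, $q$ the load per unit area, $E$ Young's modulus, $t$ the plate thickness and $\nu$ Poisson's ratio; active-optics thickness distributions follow from solving such an equation for a prescribed $w$.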

  16. Helicopter flight-control design using an H(2) method

    NASA Technical Reports Server (NTRS)

    Takahashi, Marc D.

    1991-01-01

    Rate-command and attitude-command flight-control designs for a UH-60 helicopter in hover are presented and were synthesized using an H(2) method. Using weight functions, this method allows the direct shaping of the singular values of the sensitivity, complementary sensitivity, and control input transfer-function matrices to give acceptable feedback properties. The designs were implemented on the Vertical Motion Simulator, and four low-speed hover tasks were used to evaluate the control system characteristics. The pilot comments from the accel-decel, bob-up, hovering turn, and side-step tasks indicated good decoupling and quick response characteristics. However, an underlying roll PIO (pilot-induced oscillation) tendency was found to exist away from the hover condition, caused by a flap regressing mode with insufficient damping.

  17. Preliminary demonstration of a robust controller design method

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1980-01-01

    Alternative computational procedures for obtaining a feedback control law which yields a control signal based on measurable quantities are evaluated. The three methods evaluated are: (1) the standard linear quadratic regulator design model; (2) minimization of the norm of the feedback matrix K via nonlinear programming, subject to the constraint that the closed-loop eigenvalues lie in a specified domain in the complex plane; and (3) maximization of the angles between the closed-loop eigenvectors combined with minimization of the norm of K, also via constrained nonlinear programming. The third, or robust, design method was chosen to yield a closed-loop system whose eigenvalues are insensitive to small changes in the A and B matrices. The relationship between the orthogonality of closed-loop eigenvectors and the sensitivity of closed-loop eigenvalues is described. Computer programs are described.
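
The link the abstract draws between eigenvector orthogonality and eigenvalue sensitivity is captured by the standard eigenvalue condition number, 1/|yᵢᴴxᵢ| for unit right (x) and left (y) eigenvectors. A sketch in generic numerical linear algebra, not the report's programs:

```python
import numpy as np

def eig_condition_numbers(A):
    """Condition number of each eigenvalue: 1 / |y_i^H x_i| with unit x_i, y_i."""
    lam, X = np.linalg.eig(A)
    Y = np.linalg.inv(X).conj().T              # columns are the left eigenvectors
    X = X / np.linalg.norm(X, axis=0)
    Y = Y / np.linalg.norm(Y, axis=0)
    return np.array([1.0 / abs(np.vdot(Y[:, i], X[:, i])) for i in range(len(lam))])

# Orthogonal eigenvectors (symmetric A): perfectly conditioned eigenvalues.
sym = eig_condition_numbers(np.array([[2.0, 1.0], [1.0, 3.0]]))
# Nearly parallel eigenvectors: ill-conditioned, highly sensitive eigenvalues.
skew = eig_condition_numbers(np.array([[1.0, 100.0], [0.0, 2.0]]))
```

Maximizing the angles between closed-loop eigenvectors, as in method (3), drives these condition numbers toward 1.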

  18. Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods

    NASA Technical Reports Server (NTRS)

    Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark

    2002-01-01

    Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.

  19. Synthesis of aircraft structures using integrated design and analysis methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    Systematic research to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls is reported. This research program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate the structural sizing and associated active control system that is optimal with respect to a given merit function, constrained by strength and aeroelasticity requirements.

  20. Computational methods for drug design and discovery: focus on China.

    PubMed

    Zheng, Mingyue; Liu, Xian; Xu, Yuan; Li, Honglin; Luo, Cheng; Jiang, Hualiang

    2013-10-01

    In the past decades, China's computational drug design and discovery research has experienced fast development through various novel methodologies. Application of these methods spans a wide range, from drug target identification to hit discovery and lead optimization. In this review, we first provide an overview of China's status in this field and briefly analyze the possible reasons for this rapid advancement. The methodology development is then outlined. For each selected method, a short background precedes an assessment of the method with respect to the needs of drug discovery, and, in particular, work from China is highlighted. Furthermore, several successful applications of these methods are illustrated. Finally, we conclude with a discussion of current major challenges and future directions of the field.

  1. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a system optimization problem into several subtask optimizations that may be executed concurrently, plus a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited to exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response surface construction and updates, are all ideally suited for concurrent processing. Needless to say, algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
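
The decomposition pattern, concurrent subtask optimizations coordinated by a system-level step, can be caricatured with `concurrent.futures`. The quadratic subsystem objectives and the coordination rule here are invented for illustration and are not BLISS's actual sensitivity-based updates:

```python
from concurrent.futures import ThreadPoolExecutor

def subsystem_optimum(a_i, z):
    """Closed-form minimizer of the toy local objective (x - z)^2 + a_i * x."""
    return z - a_i / 2.0

def coordinate(z=0.0, a=(0.2, 0.4, 0.6), sweeps=40):
    for _ in range(sweeps):
        with ThreadPoolExecutor() as pool:       # subtask optimizations run concurrently
            xs = list(pool.map(lambda ai: subsystem_optimum(ai, z), a))
        # Toy system-level step: minimize sum((x_i - z)^2) + z^2 over z.
        z = sum(xs) / (len(xs) + 1)
    return z
```

The alternation converges geometrically here (to z = -0.6 for the given coefficients); real MDO methods differ in how the coordination step is built, not in this basic parallel structure.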

  2. Application of an optimization method to high performance propeller designs

    NASA Technical Reports Server (NTRS)

    Li, K. C.; Stefko, G. L.

    1984-01-01

    The application of an optimization method to determine the propeller blade twist distribution which maximizes propeller efficiency is presented. The optimization employs a previously developed method which has been improved to include the effects of blade drag, camber and thickness. Before the optimization portion of the computer code is used, comparisons of calculated propeller efficiencies and power coefficients are made with experimental data for one NACA propeller at Mach numbers in the range of 0.24 to 0.50 and another NACA propeller at a Mach number of 0.71 to validate the propeller aerodynamic analysis portion of the computer code. Then comparisons of calculated propeller efficiencies for the optimized and the original propellers show the benefits of the optimization method in improving propeller performance. This method can be applied to the aerodynamic design of propellers having straight, swept, or nonplanar propeller blades.

  3. Gradient-based optimum aerodynamic design using adjoint methods

    NASA Astrophysics Data System (ADS)

    Xie, Lei

    2002-09-01

    Continuous adjoint methods and optimal control theory are applied to a pressure-matching inverse design problem of quasi 1-D nozzle flows. Pontryagin's Minimum Principle is used to derive the adjoint system and the reduced gradient of the cost functional. The properties of adjoint variables at the sonic throat and the shock location are studied, revealing a logarithmic singularity at the sonic throat and continuity at the shock location. A numerical method, based on the Steger-Warming flux-vector-splitting scheme, is proposed to solve the adjoint equations. This scheme can finely resolve the singularity at the sonic throat; a non-uniform grid, with points clustered near the throat region, resolves it even better. The analytical solutions to the adjoint equations are also constructed via a Green's function approach for the purpose of comparison with the numerical results. The pressure-matching inverse design is then conducted for a nozzle parameterized by a single geometric parameter. In the second part, the adjoint methods are applied to the problem of minimizing the drag coefficient, at fixed lift coefficient, for 2-D transonic airfoil flows. Reduced gradients of several functionals are derived through application of a Lagrange multiplier theorem. The adjoint system is carefully studied, including the adjoint characteristic boundary conditions at the far-field boundary. A super-reduced design formulation is also explored by treating the angle of attack as an additional state; super-reduced gradients can be constructed either by solving adjoint equations with non-local boundary conditions or by a direct Lagrange multiplier method. In this way, the constrained optimization reduces to an unconstrained design problem. Numerical methods based on Jameson's finite volume scheme are employed to solve the adjoint equations. The same grid system, generated from an efficient hyperbolic grid generator, is adopted in both the Euler flow solver and the adjoint solver. Several
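
The reduced-gradient machinery scales down to a linear toy problem: with state A x = b(u) and cost J = ½‖x − x_t‖², one adjoint solve Aᵀλ = x − x_t gives dJ/du = λᵀ(∂b/∂u), checkable against finite differences. Everything below is a generic illustration, not the dissertation's nozzle or airfoil code:

```python
import numpy as np

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # stand-in for a discretized state operator
x_t = np.array([1.0, 2.0])               # target state ("pressure-matching" analogue)
db_du = np.array([1.0, 2.0])             # b(u) = u * db_du

def J(u):
    x = np.linalg.solve(A, u * db_du)
    return 0.5 * np.sum((x - x_t) ** 2)

def dJ_du_adjoint(u):
    x = np.linalg.solve(A, u * db_du)
    lam = np.linalg.solve(A.T, x - x_t)  # one adjoint solve, independent of parameter count
    return lam @ db_du

u, h = 3.0, 1e-6
fd = (J(u + h) - J(u - h)) / (2 * h)     # finite-difference check of the gradient
```

The payoff of the adjoint approach is that the gradient cost does not grow with the number of design parameters, which is what makes it practical for airfoil shape design.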

  4. Achieving integration in mixed methods designs-principles and practices.

    PubMed

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods.

  5. Design Methods for Load-bearing Elements from Crosslaminated Timber

    NASA Astrophysics Data System (ADS)

    Vilguts, A.; Serdjuks, D.; Goremikins, V.

    2015-11-01

    Cross-laminated timber is an environmentally friendly material which possesses a decreased level of anisotropy in comparison with solid and glued timber. Cross-laminated timber can be used for load-bearing walls and slabs of multi-storey timber buildings as well as for decking structures of pedestrian and road bridges. Design methods for cross-laminated timber elements subjected to bending, and to compression with bending, were considered. The presented methods were experimentally validated and verified by FEM. Two cross-laminated timber slabs were tested under static load. Pine wood was chosen as the board material. A freely supported beam with a span of 1.9 m, loaded by a uniformly distributed load, was the design scheme of the considered plates. The width of the plates was 1 m. The considered cross-laminated timber plates were also analysed by FEM. The comparison of the stresses acting in the edge fibres of the plate and the maximum vertical displacements shows that both considered methods can be used for engineering calculations. The difference between the results obtained experimentally and analytically ranges from 2 to 31%. The difference between the results obtained by the effective strength and stiffness method and the transformed sections method was not significant.
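
For the test configuration described (1 m wide panel, 1.9 m simply supported span, uniformly distributed load), the engineering baseline is the classical beam formula w_max = 5qL⁴/(384EI). The thickness and stiffness numbers below are illustrative guesses, not the measured properties of the tested panels:

```python
def max_deflection_udl(q, L, E, I):
    """Midspan deflection of a simply supported beam under uniform load q [N/m]."""
    return 5.0 * q * L**4 / (384.0 * E * I)

# Hypothetical 100 mm thick, 1 m wide panel of pine-like stiffness (E ~ 11 GPa):
b, h = 1.0, 0.10
I = b * h**3 / 12.0
w = max_deflection_udl(q=1000.0, L=1.9, E=11e9, I=I)   # on the order of 0.2 mm
```

Effective-stiffness CLT methods replace EI here with a reduced bending stiffness that accounts for the weak cross layers.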

  6. A design method for constellation of lifting reentry vehicles

    NASA Astrophysics Data System (ADS)

    Xiang, Yu; Kun, Liu

    2017-03-01

    As the reachable domain of a single lifting reentry vehicle is not large enough to cover the whole globe in a short time, which is disadvantageous for responsive operation, it is of great significance to study how to construct a constellation of several lifting reentry vehicles that can responsively reach any point of the globe. This paper addresses a design method for such a constellation. Firstly, an approach for calculating the reachable domain of a single lifting reentry vehicle is given, using a combination of the Gauss pseudospectral method and an SQP method. Based on that, the entire reachable domain, taking the limit on response time into consideration, is reasonably simplified to reduce the complexity of the problem. Secondly, a Streets-of-Coverage (SOC) method is used to design the constellation, and the parameters of the constellation are optimized through simple analysis and comparison. Lastly, a point-coverage simulation method is utilized to verify the correctness of the optimization result. The verified result shows that 6 lifting reentry vehicles with a maximum lift-to-drag ratio of 1.7 can reach nearly any point on the earth's surface between -50° and 50° latitude in less than 90 minutes.
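
The final verification step, point-coverage simulation, can be sketched by gridding the ±50° latitude band and testing each grid point against idealized circular reachable domains (spherical caps; the real reachable domains are not circular):

```python
import math

def angular_dist_deg(lat1, lon1, lat2, lon2):
    """Great-circle angular distance between two lat/lon points, in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    c = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def covered_fraction(centers, radius_deg, step=10):
    """Fraction of a +-50 deg latitude grid within radius_deg of some cap center."""
    pts = [(la, lo) for la in range(-50, 51, step) for lo in range(-180, 180, step)]
    hits = sum(
        any(angular_dist_deg(la, lo, ca, co) <= radius_deg for ca, co in centers)
        for la, lo in pts)
    return hits / len(pts)

six = [(0.0, lon) for lon in range(0, 360, 60)]   # six caps spaced along the equator
```

With generous caps the band is fully covered; shrinking the radius exposes gaps, which is exactly what the verification step is meant to detect.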

  7. Development of quality-by-design analytical methods.

    PubMed

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities.

  8. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Man Mohan, Rai

    2006-01-01

    A differential evolution (DE) method is first used to solve a relatively difficult problem in extended surface heat transfer, wherein optimal fin geometries are obtained for different safe operating base temperatures. The objective of maximizing the safe operating base temperature range is in direct conflict with the objective of maximizing fin heat transfer. This problem is a good example of achieving robustness in the context of changing operating conditions. The evolutionary method is then used to design a turbine airfoil, the two objectives being reduced sensitivity of the pressure distribution to small changes in the airfoil shape and maximization of the trailing edge wedge angle, with the consequent increase in airfoil thickness and strength. This is a relevant example of achieving robustness to manufacturing tolerances and wear and tear in the presence of other objectives.
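
As a sketch of the optimizer family used here, a minimal single-objective DE/rand/1/bin loop is shown below on a toy quadratic; the multi-objective machinery and the heat-transfer and airfoil models of the study are not reproduced, and all parameter settings are illustrative.

```python
import random

def differential_evolution(f, bounds, pop=20, gens=200, F=0.8, CR=0.9, seed=0):
    """Minimal single-objective DE/rand/1/bin minimiser with bound clipping."""
    rng = random.Random(seed)
    dim = len(bounds)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    fs = [f(x) for x in xs]
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.sample([j for j in range(pop) if j != i], 3)
            trial = []
            for k in range(dim):
                if rng.random() < CR:           # crossover: take mutant gene
                    v = xs[a][k] + F * (xs[b][k] - xs[c][k])
                    lo, hi = bounds[k]
                    trial.append(min(max(v, lo), hi))
                else:                           # otherwise keep parent gene
                    trial.append(xs[i][k])
            ft = f(trial)
            if ft <= fs[i]:                     # greedy one-to-one selection
                xs[i], fs[i] = trial, ft
    best = min(range(pop), key=lambda j: fs[j])
    return xs[best], fs[best]

# toy objective with its minimum at (1, -2)
x, fx = differential_evolution(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2,
                               [(-5.0, 5.0), (-5.0, 5.0)])
print(x, fx)
```

Multi-objective variants replace the greedy scalar selection with Pareto-dominance ranking, but the mutation/crossover core is the same.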

  9. Examination of Different Exposure Metrics in an Epidemiological Study

    EPA Science Inventory

    Epidemiological studies of air pollution have traditionally relied upon measurements of ambient concentration from central-site monitoring stations as surrogates of population exposures. However, depending on the epidemiological study design, this approach may introduce exposure...

  10. Analytical methods for gravity-assist tour design

    NASA Astrophysics Data System (ADS)

    Strange, Nathan J.

    This dissertation develops analytical methods for the design of gravity-assist spacecraft trajectories. Such trajectories are commonly employed by planetary science missions to reach Mercury or the Outer Planets. They may also be used at the Outer Planets for the design of science tours with multiple flybys of those planets' moons. Recent work has also shown applicability to new mission concepts such as NASA's Asteroid Redirect Mission. This work is based on the theory of patched conics. This document applies rigor to the concepts of pumping (i.e. using gravity assists to change orbital energy) and cranking (i.e. using gravity assists to change inclination) to develop several analytic relations involving pump and crank angles. In addition, transformations are developed from pump angle, crank angle, and v-infinity magnitude to classical orbital elements. These transformations are then used to describe the limits on orbits achievable via gravity assists of a planet or moon. This is then extended to develop analytic relations for all possible ballistic gravity-assist transfers and one type of propulsive transfer, the v-infinity leveraging transfer. The results in this dissertation complement existing numerical methods for the design of these trajectories by providing methods that can guide numerical searches to find promising trajectories and even, in some cases, replace numerical searches altogether. In addition, new techniques presented in this dissertation, such as Tisserand Graphs, the V-Infinity Globe, and Non-Tangent V-Infinity Leveraging, provide additional insight into the structure of the gravity-assist trajectory design problem.
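
A small sketch of the patched-conic bookkeeping behind pump angles: the post-flyby heliocentric speed depends only on the pump angle (the crank angle redistributes velocity into inclination without changing speed), and vis-viva then gives the orbit size. The Earth-flyby numbers below are illustrative, assuming a circular planetary orbit.

```python
import math

MU_SUN = 1.32712440018e11  # Sun's gravitational parameter, km^3/s^2

def sc_speed(v_body, v_inf, pump_deg):
    """Heliocentric speed after patching v-infinity onto the body's velocity.
    The pump angle is measured between v-infinity and the body's velocity,
    so speed follows from the law of cosines; crank only tilts the vector."""
    a = math.radians(pump_deg)
    return math.sqrt(v_body ** 2 + v_inf ** 2
                     + 2.0 * v_body * v_inf * math.cos(a))

def semi_major_axis(r_km, v_kms, mu=MU_SUN):
    """Vis-viva rearranged: a = 1 / (2/r - v^2/mu)."""
    return 1.0 / (2.0 / r_km - v_kms ** 2 / mu)

# Earth flyby example: r = 1 AU, v-infinity 5 km/s, pump angle 60 degrees
r_earth = 1.495978707e8                   # km
v_earth = math.sqrt(MU_SUN / r_earth)     # ~29.78 km/s circular speed
v = sc_speed(v_earth, 5.0, pump_deg=60.0)
a = semi_major_axis(r_earth, v)
print(v, a)
```

Sweeping the pump angle from 0° to 180° traces the full range of orbital energies reachable with a fixed v-infinity magnitude, which is exactly the limit the dissertation's relations formalize.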

  11. Phylogenetic Analyses of Shigella and Enteroinvasive Escherichia coli for the Identification of Molecular Epidemiological Markers: Whole-Genome Comparative Analysis Does Not Support Distinct Genera Designation.

    PubMed

    Pettengill, Emily A; Pettengill, James B; Binet, Rachel

    2015-01-01

    As a leading cause of bacterial dysentery, Shigella represents a significant threat to public health and food safety. Related, but often overlooked, enteroinvasive Escherichia coli (EIEC) can also cause dysentery. Current typing methods have limited ability to identify and differentiate between these pathogens despite the need for rapid and accurate identification of pathogens for clinical treatment and outbreak response. We present a comprehensive phylogeny of Shigella and EIEC using whole genome sequencing of 169 samples, constituting unparalleled strain diversity, and observe a lack of monophyly between Shigella and EIEC and among Shigella taxonomic groups. The evolutionary relationships in the phylogeny are supported by analyses of population structure and hierarchical clustering patterns of translated gene homolog abundance. Lastly, we identified a panel of 404 single nucleotide polymorphism (SNP) markers specific to each phylogenetic cluster for more accurate identification of Shigella and EIEC. Our findings show that Shigella and EIEC are not distinct evolutionary groups within the E. coli genus and, thus, EIEC as a group is not the ancestor to Shigella. The multiple analyses presented provide evidence for reconsidering the taxonomic placement of Shigella. The SNP markers offer more discriminatory power to molecular epidemiological typing methods involving these bacterial pathogens.
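
The idea of cluster-specific SNP markers can be sketched as a simple scan over aligned genomes: a position is a marker for a cluster when that cluster is fixed for an allele no other cluster carries. The four-base toy alignment below is invented for illustration; the study's 404-marker panel came from whole-genome data.

```python
def cluster_specific_snps(alignments, clusters):
    """Positions where one cluster is fixed for an allele absent elsewhere.
    alignments: dict name -> aligned sequence; clusters: dict name -> id."""
    length = len(next(iter(alignments.values())))
    markers = {}
    for pos in range(length):
        by_cluster = {}
        for name, seq in alignments.items():
            by_cluster.setdefault(clusters[name], set()).add(seq[pos])
        for cid, alleles in by_cluster.items():
            others = set().union(*(a for c, a in by_cluster.items() if c != cid))
            # fixed within the cluster, and private to it
            if len(alleles) == 1 and alleles - others:
                markers.setdefault(pos, []).append(cid)
    return markers

# toy data: two "Shigella-like" and two "EIEC-like" sequences (hypothetical)
aln = {"s1": "ACGT", "s2": "ACGT", "e1": "ACTT", "e2": "AGTT"}
clu = {"s1": 0, "s2": 0, "e1": 1, "e2": 1}
markers = cluster_specific_snps(aln, clu)
print(markers)
```

Here only position 2 separates the clusters cleanly (G versus T); position 1 fails because cluster 1 is not fixed there, which is the discriminatory-power criterion a marker panel needs.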

  12. Design of time interval generator based on hybrid counting method

    NASA Astrophysics Data System (ADS)

    Yao, Yuan; Wang, Zhaoqi; Lu, Houbing; Chen, Lian; Jin, Ge

    2016-10-01

    Time Interval Generators (TIGs) are frequently used for the characterization or timing operations of instruments in particle physics experiments. Though some "off-the-shelf" TIGs can be employed, the need for custom test or control systems makes TIGs implemented in a programmable device desirable. Nowadays, the feasibility of using Field Programmable Gate Arrays (FPGAs) to implement particle physics instrumentation has been validated in the design of Time-to-Digital Converters (TDCs) for precise time measurement. The FPGA-TDC technique is based on Tapped Delay Line (TDL) architectures, whose delay cells are down to a few tens of picoseconds. Correspondingly, FPGA-based TIGs with fine delay steps are preferable, allowing the implementation of customized particle physics instrumentation and other utilities on the same FPGA device. A hybrid counting method for designing TIGs with both high resolution and wide range is presented in this paper. The combination of two different counting methods realizing an integrable TIG is described in detail. Particular attention is given to a specially designed multiplexer for tap selection, whose structure is devised to minimize the differing additional delays caused by unpredictable routing from different taps to the output. A Kintex-7 FPGA is used for the hybrid counting-based implementation of a TIG, providing a resolution down to 11 ps and an interval range up to 8 s.
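
The hybrid counting idea can be modelled in a few lines: a coarse counter of clock cycles extends the range, while a tapped-delay-line bin resolves the residue. The 11 ps tap matches the quoted resolution; the clock period is an assumed illustrative value.

```python
def encode_interval(t_ps, clk_period_ps=4000, tap_ps=11):
    """Split an interval into coarse clock cycles plus a fine delay-line tap,
    mimicking a counter + tapped-delay-line (TDL) hybrid scheme."""
    coarse, rem = divmod(t_ps, clk_period_ps)
    fine = rem // tap_ps              # index of the delay-line tap that fires
    return coarse, fine

def decode_interval(coarse, fine, clk_period_ps=4000, tap_ps=11):
    """Reconstruct the interval; the error is bounded by one tap delay."""
    return coarse * clk_period_ps + fine * tap_ps

t = 1_234_567                         # interval to generate, in ps
c, f = encode_interval(t)
t_hat = decode_interval(c, f)
print(c, f, t_hat, t - t_hat)
```

The counter width sets the range (a 31-bit counter at 4 ns per cycle already exceeds 8 s) while the tap pitch sets the resolution, which is why the two mechanisms are combined.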

  13. Sequence design in lattice models by graph theoretical methods

    NASA Astrophysics Data System (ADS)

    Sanjeev, B. S.; Patra, S. M.; Vishveshwara, S.

    2001-01-01

    A general strategy has been developed based on graph theoretical methods, for finding amino acid sequences that take up a desired conformation as the native state. This problem of inverse design has been addressed by assigning topological indices for the monomer sites (vertices) of the polymer on a 3×3×3 cubic lattice. This is a simple design strategy, which takes into account only the topology of the target protein and identifies the best sequence for a given composition. The procedure allows the design of a good sequence for a target native state by assigning weights for the vertices on a lattice site in a given conformation. It is seen across a variety of conformations that the predicted sequences perform well both in sequence and in conformation space, in identifying the target conformation as native state for a fixed composition of amino acids. Although the method is tested in the framework of the HP model [K. F. Lau and K. A. Dill, Macromolecules 22, 3986 (1989)] it can be used in any context if proper potential functions are available, since the procedure derives unique weights for all the sites (vertices, nodes) of the polymer chain of a chosen conformation (graph).
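
A toy version of the vertex-weight strategy, assuming the HP model: weight each monomer by its non-bonded lattice contacts in the target conformation and place the hydrophobic (H) residues on the most buried sites. The cube conformation and composition below are illustrative, not from the paper.

```python
def contact_weights(coords):
    """Weight each monomer by its non-bonded nearest-neighbour contacts
    on the cubic lattice (a crude stand-in for a topological index)."""
    n = len(coords)
    pos = {c: i for i, c in enumerate(coords)}
    w = [0] * n
    for i, (x, y, z) in enumerate(coords):
        for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0),
                           (0,-1,0), (0,0,1), (0,0,-1)):
            j = pos.get((x + dx, y + dy, z + dz))
            if j is not None and abs(i - j) > 1:   # skip chain bonds
                w[i] += 1
    return w

def design_sequence(coords, n_h):
    """Place the n_h hydrophobic (H) monomers on the most buried sites."""
    w = contact_weights(coords)
    order = sorted(range(len(coords)), key=lambda i: -w[i])
    h_sites = set(order[:n_h])
    return "".join("H" if i in h_sites else "P" for i in range(len(coords)))

# toy conformation: a 2x2x2 cube traversed as a Hamiltonian path
cube = [(0,0,0), (1,0,0), (1,1,0), (0,1,0),
        (0,1,1), (1,1,1), (1,0,1), (0,0,1)]
print(contact_weights(cube), design_sequence(cube, 4))
```

On this tiny path the chain ends pick up the most non-bonded contacts, so they receive the H residues first; on the 3x3x3 lattice of the paper the interior sites dominate instead.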

  14. The Global Enteric Multicenter Study (GEMS) of Diarrheal Disease in Infants and Young Children in Developing Countries: Epidemiologic and Clinical Methods of the Case/Control Study

    PubMed Central

    Kotloff, Karen L.; Blackwelder, William C.; Nasrin, Dilruba; Nataro, James P.; Farag, Tamer H.; van Eijk, Annemieke; Adegbola, Richard A.; Alonso, Pedro L.; Breiman, Robert F.; Golam Faruque, Abu Syed; Saha, Debasish; Sow, Samba O.; Sur, Dipika; Zaidi, Anita K. M.; Biswas, Kousick; Panchalingam, Sandra; Clemens, John D.; Cohen, Dani; Glass, Roger I.; Mintz, Eric D.; Sommerfelt, Halvor; Levine, Myron M.

    2012-01-01

    Background. Diarrhea is a leading cause of illness and death among children aged <5 years in developing countries. This paper describes the clinical and epidemiological methods used to conduct the Global Enteric Multicenter Study (GEMS), a 3-year, prospective, age-stratified, case/control study to estimate the population-based burden, microbiologic etiology, and adverse clinical consequences of acute moderate-to-severe diarrhea (MSD) among a censused population of children aged 0–59 months seeking care at health centers in sub-Saharan Africa and South Asia. Methods. GEMS was conducted at 7 field sites, each serving a population whose demography and healthcare utilization practices for childhood diarrhea were documented. We aimed to enroll 220 MSD cases per year from selected health centers serving each site in each of 3 age strata (0–11, 12–23, and 24–59 months), along with 1–3 matched community controls. Cases and controls supplied clinical, epidemiologic, and anthropometric data at enrollment and again approximately 60 days later, and provided enrollment stool specimens for identification and characterization of potential diarrheal pathogens. Verbal autopsy was performed if a child died. Analytic strategies will calculate the fraction of MSD attributable to each pathogen and the incidence, financial costs, nutritional consequences, and case fatality overall and by pathogen. Conclusions. When completed, GEMS will provide estimates of the incidence, etiology, and outcomes of MSD among infants and young children in sub-Saharan Africa and South Asia. This information can guide development and implementation of public health interventions to diminish morbidity and mortality from diarrheal diseases. PMID:23169936
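
The pathogen-attributable fraction mentioned in the analytic strategy can be illustrated with Miettinen's case-control formula, AF = p(exposed | case) x (OR - 1)/OR. The counts below are hypothetical, and GEMS itself uses matched, covariate-adjusted analyses rather than this crude 2x2 version.

```python
def odds_ratio(a, b, c, d):
    """2x2 table: a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    return (a * d) / (b * c)

def attributable_fraction(a, b, c, d):
    """Attributable fraction among cases (Miettinen):
    AF = p(exposed | case) * (OR - 1) / OR."""
    orr = odds_ratio(a, b, c, d)
    p_exp_case = a / (a + b)
    return p_exp_case * (orr - 1.0) / orr

# hypothetical counts: pathogen detected in 120/400 cases and 40/400 controls
af = attributable_fraction(120, 280, 40, 360)
print(af)
```

With these counts the odds ratio is about 3.9 and roughly 22% of the case burden is attributed to the pathogen; summing such fractions over pathogens is the kind of output GEMS was designed to produce.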

  15. Optimization design of thumbspica splint using finite element method.

    PubMed

    Huang, Tz-How; Feng, Chi-Kung; Gung, Yih-Wen; Tsai, Mei-Wun; Chen, Chen-Sheng; Liu, Chien-Lin

    2006-12-01

    De Quervain's tenosynovitis is often observed after repetitive flexion of the thumb. In the clinical setting, conservative treatment usually involves applying a thumbspica splint to immobilize the thumb. However, the traditional thumbspica splint is bulky and heavy. Thus, this study used the finite element (FE) method to remove redundant material in order to reduce the splint's weight and increase ventilation. An FE model of a thumbspica splint was constructed using ANSYS9.0 software. A maximum lateral thumb pinch force of 98 N was used as the input loading condition for the FE model. This study implemented topology optimization and design optimization to seek the optimal thickness and shape of the splint. The new design was manufactured and compared with the traditional thumbspica splint. Ten thumbspica splints were tested in a materials testing system, and statistically analyzed using an independent t test. The optimal thickness of the thumbspica splint was 3.2 mm. The new design is not significantly different from the traditional splint in its immobilization effect, while its volume has been reduced by about 35%. This study produced a new thumbspica splint shape with less volume but a similar immobilization effect compared to the traditional shape. In a clinical setting, this result can serve occupational therapists as a reference for manufacturing lighter thumbspica splints for patients with de Quervain's tenosynovitis.

  16. Modified method to improve the design of Petlyuk distillation columns

    PubMed Central

    2014-01-01

    Background A response surface analysis was performed to study the effect of the composition and feed thermal conditions of ternary mixtures on the number of theoretical stages and the energy consumption of Petlyuk columns. A modification of the pre-design algorithm was necessary for this purpose. Results The modified algorithm provided feasible results in 100% of the studied cases, compared with only 8.89% for the current algorithm. The proposed algorithm allowed us to attain the desired separations regardless of the type of mixture and the operating conditions in the feed stream, something that was not possible with the traditional pre-design method. The results showed that the type of mixture had great influence on the number of stages and on energy consumption. A higher number of stages and a lower consumption of energy were attained with mixtures rich in the light component, while higher energy consumption occurred when the mixture was rich in the heavy component. Conclusions The proposed strategy expands the search for an optimal design of Petlyuk columns within a feasible region, which allows us to find a feasible design that meets output specifications with low thermal loads. PMID:25061476

  17. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.
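
The link between sensitivity factors and failure probability can be illustrated with a toy capacity/demand limit state: the variable with the larger scatter (here capacity) dominates the reliability, so halving its scatter lowers the failure probability more than halving the demand's. All distributions below are invented for illustration, not taken from the wing study.

```python
import random

def failure_probability(mu_c, sd_c, mu_d, sd_d, n=200_000, seed=7):
    """Monte Carlo estimate of P(capacity < demand) for independent
    normal capacity and demand variables."""
    rng = random.Random(seed)
    fails = sum(rng.gauss(mu_c, sd_c) < rng.gauss(mu_d, sd_d)
                for _ in range(n))
    return fails / n

base = failure_probability(10.0, 2.0, 6.0, 1.0)
# crude "sensitivity" probe: halve each variable's scatter and compare
p_capacity = failure_probability(10.0, 1.0, 6.0, 1.0)
p_demand = failure_probability(10.0, 2.0, 6.0, 0.5)
print(base, p_capacity, p_demand)
```

Reducing the scatter of the high-sensitivity variable (capacity) drops the failure probability by an order of magnitude, mirroring the paper's conclusion that distribution parameters of the most sensitive random variables are the effective design levers.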

  18. Optimal experimental design with the sigma point method.

    PubMed

    Schenkendorf, R; Kremling, A; Mangold, M

    2009-01-01

    Using mathematical models for a quantitative description of dynamical systems requires the identification of uncertain parameters by minimising the difference between simulation and measurement. Owing to measurement noise, the estimated parameters also possess an uncertainty, expressed by their variances. To obtain highly predictive models, very precise parameters are needed. Optimal experimental design (OED), as a numerical optimisation method, is used to reduce the parameter uncertainty by minimising the parameter variances iteratively. A frequently applied way to define a cost function for OED is based on the inverse of the Fisher information matrix. The application of this traditional method has at least two shortcomings for models that are nonlinear in their parameters: (i) it gives only a lower bound of the parameter variances and (ii) the bias of the estimator is neglected. Here, the authors show that by applying the sigma point (SP) method a better approximation of characteristic values of the parameter statistics can be obtained, which has a direct benefit for OED. An additional advantage of the SP method is that it can also be used to investigate the influence of the parameter uncertainties on the simulation results. The SP method is demonstrated on the example of a widely used biological model.
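
A minimal sketch of the sigma point machinery: propagate 2n+1 deterministically chosen points through the nonlinear map and recombine them, which recovers the mean of a quadratic model exactly where a linearisation (the Fisher-matrix route) would not. The toy map and kappa value below are illustrative assumptions.

```python
import numpy as np

def sigma_points(mean, cov, kappa=0.0):
    """Standard (Julier-style) sigma points for a mean/covariance pair."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)
    pts = ([mean]
           + [mean + L[:, i] for i in range(n)]
           + [mean - L[:, i] for i in range(n)])
    weights = np.array([kappa / (n + kappa)]
                       + [1.0 / (2.0 * (n + kappa))] * (2 * n))
    return np.array(pts), weights

def unscented_transform(f, mean, cov, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear map f via sigma points."""
    pts, w = sigma_points(mean, cov, kappa)
    y = np.array([f(p) for p in pts])
    y_mean = w @ y
    y_cov = sum(wi * np.outer(yi - y_mean, yi - y_mean)
                for wi, yi in zip(w, y))
    return y_mean, y_cov

# nonlinear map of an uncertain 2-parameter vector (toy example)
f = lambda p: np.array([p[0] ** 2 + p[1]])
m, P = unscented_transform(f, np.array([1.0, 0.0]), np.eye(2) * 0.01, kappa=1.0)
print(m, P)
```

For this quadratic map the true output mean is 1 + sigma^2 = 1.01; the sigma points reproduce it exactly, while a first-order linearisation would report 1.0 and thus a biased estimate.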

  19. Epidemiology of epilepsy.

    PubMed

    Abramovici, S; Bagić, A

    2016-01-01

    Modern epidemiology of epilepsy maximizes the benefits of advanced diagnostic methods and sophisticated techniques for case ascertainment in order to increase the diagnostic accuracy and representativeness of the cases and cohorts studied, resulting in better comparability of similarly performed studies. Overall, these advanced epidemiologic methods are expected to yield a better understanding of diverse risk factors, high-risk populations, seizure triggers, multiple and poorly understood causes of epilepsy, including the increasing and complex role of genetics, and establish the natural course of treated and untreated epilepsy and syndromes - all of which form the foundation of an attempt to prevent epileptogenesis as the primary prophylaxis of epilepsy. Although data collection continues to improve, epidemiologists still need to overcome definition and coding variability, insufficient documentation, as well as the interplay of socioeconomic factors and stigma. As most of the 65-70 million people with epilepsy live outside of resource-rich countries, extensive underdiagnosis, misdiagnosis, and undertreatment are likely. Epidemiology will continue to provide the necessary information to the medical community, public, and regulators as the foundation for improved health policies, targeted education, and advanced measures of prevention and prognostication of the most common severe brain disorder.

  20. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicles (RLVs) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase focused on assessing the performance of these models in accurately predicting the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  1. A Generic Method for Design of Oligomer-Specific Antibodies

    PubMed Central

    Brännström, Kristoffer; Lindhagen-Persson, Malin; Gharibyan, Anna L.; Iakovleva, Irina; Vestling, Monika; Sellin, Mikael E.; Brännström, Thomas; Morozova-Roche, Ludmilla; Forsgren, Lars; Olofsson, Anders

    2014-01-01

    Antibodies that preferentially and specifically target pathological oligomeric protein and peptide assemblies, as opposed to their monomeric and amyloid counterparts, provide therapeutic and diagnostic opportunities for protein misfolding diseases. Unfortunately, the molecular properties associated with oligomer-specific antibodies are not well understood, and this limits targeted design and development. We present here a generic method that enables the design and optimisation of oligomer-specific antibodies. The method takes a two-step approach where discrimination between oligomers and fibrils is first accomplished through identification of cryptic epitopes exclusively buried within the structure of the fibrillar form. The second step discriminates between monomers and oligomers based on differences in avidity. We show here that a simple divalent mode of interaction, as within, e.g., the IgG isotype, can increase the binding strength of the antibody up to 1500 times compared to its monovalent counterpart. We show how the ability to bind oligomers is affected by the monovalent affinity and the turnover rate of the binding and, importantly, also how oligomer specificity is only valid within a specific concentration range. We provide an example of the method by creating and characterising a spectrum of different monoclonal antibodies against both the Aβ peptide and α-synuclein that are associated with Alzheimer's and Parkinson's diseases, respectively. The approach is however generic, does not require identification of oligomer-specific architectures, and is, in essence, applicable to all polypeptides that form oligomeric and fibrillar assemblies. PMID:24618582
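
The avidity argument can be put in rough numbers with the standard effective-concentration model: once one arm of a divalent antibody is bound, rebinding of the second arm at local concentration c_eff gives an apparent KD of about KD²/c_eff. The KD and c_eff values below are illustrative assumptions, not measurements from the paper.

```python
def fraction_bound(conc, kd):
    """Equilibrium fraction of sites occupied for simple 1:1 binding."""
    return conc / (kd + conc)

def apparent_divalent_kd(kd_mono, c_eff):
    """Crude avidity model: with one arm bound, the second arm sees an
    effective local concentration c_eff, giving KD_app ~ KD^2 / c_eff."""
    return kd_mono ** 2 / c_eff

kd = 1e-7                         # 100 nM monovalent affinity (illustrative)
kd_app = apparent_divalent_kd(kd, c_eff=1e-4)
print(kd / kd_app)                # avidity enhancement factor
print(fraction_bound(1e-9, kd), fraction_bound(1e-9, kd_app))
```

At 1 nM antigen the monovalent interaction occupies about 1% of sites while the divalent one approaches saturation, which is the concentration-window effect behind oligomer (multivalent) versus monomer (monovalent) discrimination.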

  2. Molecular epidemiology of tuberculosis: achievements and challenges to current knowledge.

    PubMed Central

    Murray, Megan; Nardell, Edward

    2002-01-01

    Over the past 10 years, molecular methods have become available with which to strain-type Mycobacterium tuberculosis. They have allowed researchers to study certain important but previously unresolved issues in the epidemiology of tuberculosis (TB). For example, some unsuspected microepidemics have been revealed and it has been shown that the relative contribution of recently acquired disease to the TB burden in many settings is far greater than had been thought. These findings have led to the strengthening of TB control. Other research has demonstrated the existence and described the frequency of exogenous reinfection in areas of high incidence. Much recent work has focused on the phenotypic variation among strains and has evaluated the relative transmissibility, virulence, and immunogenicity of different lineages of the organism. We summarize the recent achievements in TB epidemiology associated with the introduction of DNA fingerprinting techniques, and consider the implications of this technology for the design and analysis of epidemiological studies. PMID:12132006

  3. Design of braided composite tubes by numerical analysis method

    SciTech Connect

    Hamada, Hiroyuki; Fujita, Akihiro; Maekawa, Zenichiro; Nakai, Asami; Yokoyama, Atsushi

    1995-11-01

    Conventional composite laminates have very poor through-thickness strength and, as a result, are limited in their application to structural parts with complex shapes. In this paper, a design method for braided composite tubes is proposed. The concept of an analysis model ranging from the micro model to the macro model is presented. This method was applied to predict the bending rigidity and initial fracture stress of the braided tube under bending load. The proposed analytical procedure can be included as a unit in a CAE system for braided composites.

  4. Methods to Design and Synthesize Antibody-Drug Conjugates (ADCs)

    PubMed Central

    Yao, Houzong; Jiang, Feng; Lu, Aiping; Zhang, Ge

    2016-01-01

    Antibody-drug conjugates (ADCs) have become a promising targeted therapy strategy that combines the specificity, favorable pharmacokinetics and biodistributions of antibodies with the destructive potential of highly potent drugs. One of the biggest challenges in the development of ADCs is the application of suitable linkers for conjugating drugs to antibodies. Recently, the design and synthesis of linkers are making great progress. In this review, we present the methods that are currently used to synthesize antibody-drug conjugates by using thiols, amines, alcohols, aldehydes and azides. PMID:26848651

  5. Epidemiology of malaria in an area of seasonal transmission in Niger and implications for the design of a seasonal malaria chemoprevention strategy

    PubMed Central

    2013-01-01

    Background Few data are available about the malaria epidemiological situation in Niger. However, implementation of new strategies such as vaccination or seasonal treatment of a target population requires knowledge of the baseline epidemiological features of malaria. A population-based study was conducted to better characterize malaria seasonal variations and the population groups most at risk in this particular area. Methods From July 2007 to December 2009, presumptive cases of malaria among a study population living in a typical Sahelian village of Niger were recorded and confirmed by microscopic examination. In parallel, asymptomatic carriers were actively detected at the end of each dry season in 2007, 2008 and 2009. Results Among the 965 presumptive malaria cases recorded, 29% were confirmed by microscopic examination. The incidence of malaria was found to decrease significantly with age (p < 0.01); the mean annual incidence was 0.254. The results show that the risk of malaria was higher in children under ten years (p < 0.0001). The number of malaria episodes generally followed the temporal pattern of changes in precipitation levels, with a peak of transmission in August and September. One thousand and ninety subjects underwent active detection of asymptomatic carriage, of whom 16% tested positive; asymptomatic carriage decreased with increasing age. A higher prevalence of gametocyte carriage among the asymptomatic population was recorded in children aged two to ten years, though it did not reach significance. Conclusions In Southern Niger, malaria transmission mostly occurs from July to October. Children aged two to ten years are most at risk of malaria, and may also represent the main reservoir for gametocytes. Strategies such as intermittent preventive treatment in children (IPTc) could be of interest in this area, where malaria transmission is highly seasonal. Based on these preliminary data, a pilot study could be implemented

  6. A novel observer design method for neural mass models

    NASA Astrophysics Data System (ADS)

    Liu, Xian; Miao, Dong-Kai; Gao, Qing; Xu, Shi-Yun

    2015-09-01

    Neural mass models can simulate the generation of electroencephalography (EEG) signals with different rhythms, and therefore the observation of the states of these models plays a significant role in brain research. The structure of neural mass models is special in that they can be expressed as Lurie systems. The developed techniques in Lurie system theory are applicable to these models. We here provide a new observer design method for neural mass models by transforming these models and the corresponding error systems into nonlinear systems with Lurie form. The purpose is to establish appropriate conditions which ensure the convergence of the estimation error. The effectiveness of the proposed method is illustrated by numerical simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 61473245, 61004050, and 51207144).
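
Not the Lurie-form construction of the paper, but the underlying observer idea can be sketched with a plain Luenberger observer on a toy linear system: feed the output error back through a gain that makes A - LC stable, and the state estimate converges. All matrices and step sizes below are illustrative.

```python
import numpy as np

# toy linear(ized) system x' = A x with measurement y = C x, Euler-discretized
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
C = np.array([[1.0, 0.0]])
Lg = np.array([[5.0], [6.0]])   # observer gain chosen so A - Lg C is stable

dt, steps = 0.01, 2000
x = np.array([1.0, -1.0])       # true state
xh = np.zeros(2)                # observer starts from a wrong estimate
for _ in range(steps):
    y = C @ x                   # measurement of the current true state
    x = x + dt * (A @ x)        # plant update
    # observer update: copy of the model plus output-error correction
    xh = xh + dt * (A @ xh + Lg @ (y - C @ xh))

err = np.linalg.norm(x - xh)
print(err)
```

The estimation error obeys e' = (A - Lg C)e, whose eigenvalues here are -4 ± j2.6, so the error decays to numerical zero over the simulated 20 seconds; the paper's contribution is establishing such convergence conditions for the nonlinear Lurie-form case.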

  7. A Method of Trajectory Design for Manned Asteroids Exploration

    NASA Astrophysics Data System (ADS)

    Gan, Q. B.; Zhang, Y.; Zhu, Z. F.; Han, W. H.; Dong, X.

    2014-11-01

    A trajectory optimization method for nuclear-propulsion manned asteroid exploration is presented. For launches between 2035 and 2065, based on the Lambert transfer orbit, the phases of departure from and return to the Earth are searched first. Then the optimal flight trajectory within the feasible regions is selected by pruning the flight sequences. Setting the nuclear propulsion flight plan as propel-coast-propel, and taking the minimal departure mass as the index, the nuclear propulsion flight trajectory is optimized phase by phase using a hybrid method. With the optimized local parameters of each of the three phases as initial values, the global parameters are then jointly optimized. Finally, the minimal departure mass trajectory design result is given.

  8. 77 FR 60985 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-05

    ... for Air Pollution Measurement Systems, Volume I,'' EPA/600/R-94/038a and ``Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II, Ambient Air Quality Monitoring Program'' EPA-454/B... AGENCY Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New...

  9. Performance enhancement of a pump impeller using optimal design method

    NASA Astrophysics Data System (ADS)

    Jeon, Seok-Yun; Kim, Chul-Kyu; Lee, Sang-Moon; Yoon, Joon-Yong; Jang, Choon-Man

    2017-04-01

    This paper presents the performance evaluation of a regenerative pump aimed at increasing its efficiency using an optimal design method. Two design parameters, which define the shape of the pump impeller, are introduced and analyzed. Pump performance is evaluated by numerical simulation and design of experiments (DOE). To analyze the three-dimensional flow field in the pump, the general analysis code CFX is used in the present work. The shear stress transport turbulence model is employed to estimate the eddy viscosity. An experimental apparatus with an open-loop facility was set up for measuring the pump performance. Pump performance, efficiency and pressure, obtained from numerical simulation are validated by comparison with the results of experiments. Through the shape optimization of the pump impeller at the operating flow condition, the pump efficiency is successfully increased by 3 percent compared to the reference pump. It is noted that the pressure increase of the optimum pump is mainly caused by the higher momentum force generated inside the blade passage due to the optimal blade shape. Comparisons of the internal flow in the reference and optimum pumps are also investigated and discussed in detail.
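
The DOE step can be sketched with a two-parameter full-factorial design and a quadratic response-surface fit; the analytic "efficiency" function below is a stand-in for the CFD runs, with invented coefficients and an invented optimum.

```python
import numpy as np

# two impeller shape parameters, 3x3 full-factorial "experiments"
levels = [-1.0, 0.0, 1.0]
X, y = [], []
for a in levels:
    for b in levels:
        # stand-in for a CFD run: peak efficiency near (a, b) = (0.5, -0.25)
        eff = 80.0 - 4.0 * (a - 0.5) ** 2 - 2.0 * (b + 0.25) ** 2
        X.append([1.0, a, b, a * a, a * b, b * b])
        y.append(eff)

# least-squares fit of a quadratic response surface
beta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)

# stationary point of the fitted quadratic: solve grad = 0
H = np.array([[2 * beta[3], beta[4]], [beta[4], 2 * beta[5]]])
g = -np.array([beta[1], beta[2]])
opt = np.linalg.solve(H, g)
print(opt)
```

Because the stand-in response is itself quadratic, the fit recovers the optimum exactly; with real CFD data the surface is only an approximation and the predicted optimum is re-checked by a confirmation simulation.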

  10. Sensitivity method for integrated structure/active control law design

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1987-01-01

    The development of an integrated structure/active control law design methodology for aeroelastic aircraft applications is described. A short motivating introduction to aeroservoelasticity is given, along with the need for integrated structures/controls design algorithms. Three alternative approaches to the development of an integrated design method are briefly discussed with regard to complexity, coordination and tradeoff strategies, and the nature of the resulting solutions. This leads to the formulation of the proposed approach, which is based on the concepts of sensitivity of optimum solutions and multi-level decompositions. The concept of sensitivity of the optimum is explained in more detail and compared with traditional sensitivity concepts of classical control theory. The analytical sensitivity expressions for the solution of the linear quadratic Gaussian (LQG) control problem are summarized in terms of the linear regulator solution and the Kalman filter solution. Numerical results for a state-space aeroelastic model of the DAST ARW-II vehicle are given, showing the changes in aircraft responses to variations of a structural parameter, in this case the first wing bending natural frequency.

  11. [Schistosomiasis epidemiology (author's transl)].

    PubMed

    Picq, J J; Roux, J

    1980-01-01

    With some three hundred million people infected, schistosomiasis is the second most widespread endemic disease in the world, after malaria. For each of the four species, the distribution areas, the life cycle and the main epidemiological features are recalled in the first chapter. In the five following chapters, the authors consider the human and animal reservoirs of the parasite, the importance of these diseases for public health, the gastropod molluscs acting as intermediate hosts, and the problems of immunity in man. The concepts of "schistosomal infection" and "schistosomal disease" are presented, as well as the differences among the various strains of schistosomes and of the snail intermediate hosts. The authors emphasize the value of quantitative parasitological techniques and sero-immunological methods for epidemiological surveys. They underline the difficulties met in evaluating the effect of these diseases upon public health. The main causes sustaining the endemic and those responsible for its extension are studied. The value of mathematical models is briefly discussed. Quantitative data compiled through epidemiological surveys should improve the use of the various means presently available for controlling schistosomiasis.

  12. A New Aerodynamic Data Dispersion Method for Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T.

    2011-01-01

    A novel method for implementing aerodynamic data dispersion analysis is introduced. A general mathematical approach, combined with physical modeling tailored to the aerodynamic quantity of interest, enables the generation of more realistic dispersed data and, in turn, more reasonable flight simulation results. The method allows the aerodynamic quantities and their derivatives to be dispersed simultaneously given a set of non-arbitrary constraints, which stresses the controls model in more ways than the traditional bias of the nominal data up or down within the uncertainty bounds. The adoption and implementation of this new method within the NASA Ares I Crew Launch Vehicle Project resulted in significant increases in predicted roll control authority and lowered the induced risks for flight test operations. One direct impact on launch vehicles is a reduced size for auxiliary control systems, and the possibility of an increased payload. This technique has the potential to be applied to problems in multiple areas where nominal data together with uncertainties are used to produce simulations using Monte Carlo-type random sampling methods. It is recommended that a tailored, physics-based dispersion model be delivered with any aerodynamic product that includes nominal data and uncertainties, in order to make flight simulations more realistic and allow for leaner spacecraft designs.
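The contrast drawn above, between a traditional constant bias and a shaped dispersion that varies the quantity and its local slope while staying inside the same uncertainty band, can be sketched loosely. Everything below (the coefficient curve, the uncertainty band, the sinusoidal perturbation basis) is an invented stand-in for illustration, not the NASA method itself:

```python
import numpy as np

def traditional_dispersion(nominal, unc, rng):
    """Bias the whole nominal curve up or down by one uncertainty."""
    return nominal + rng.choice([-1.0, 1.0]) * unc

def shaped_dispersion(nominal, unc, rng, n_modes=3):
    """Draw a smooth random perturbation, renormalized into the bounds,
    so that the dispersed curve also varies in local slope."""
    x = np.linspace(0.0, 1.0, nominal.size)
    pert = np.zeros_like(nominal)
    for k in range(1, n_modes + 1):
        pert += rng.uniform(-1, 1) * np.sin(np.pi * k * x + rng.uniform(0, 2 * np.pi))
    pert *= unc / max(np.max(np.abs(pert)), 1e-12)  # keep within +/- unc
    return nominal + pert

rng = np.random.default_rng(0)
alpha = np.linspace(0, 10, 50)   # angle of attack, deg (made up)
nominal = 0.1 * alpha            # nominal coefficient curve (made up)
unc = 0.2                        # uncertainty band half-width (made up)
trad = traditional_dispersion(nominal, unc, rng)
disp = shaped_dispersion(nominal, unc, rng)
```

In a Monte Carlo simulation campaign, each sample would draw a fresh `disp`, exercising the controls model across many perturbation shapes rather than just the two bias extremes.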

  13. Studies of Aural Nonlinearity and the Mechanisms of Auditory Fatigue. Part 2. Epidemiologic Methods in Noise-Induced Hearing Loss

    DTIC Science & Technology

    1982-04-01

    Other Factors in Study Design 10 Cross-Sectional Study 11 Case-Control Studies 11 The Cohort Study and Survival Data Analysis 12 Sample Size Requirements...different exposure levels. Transformations or interaction terms may also be included. The analysis for matched case-control studies is also developed...such as decibel level, multiple regression methods are suitable...all biases for prevalence data apply. Case-Control Studies Another sampling

  14. Studies of Aural Nonlinearity and the Mechanisms of Auditory Fatigue. Epidemiologic Methods in Noise-Induced Hearing Loss.

    DTIC Science & Technology

    1980-09-30

    or Representative? 8 Sampling Schemes and Other Factors in Study Design 10 Cross-Sectional Study 11 Case-Control Studies 11 The Cohort Study and...matched case-control studies is also developed for outcomes with more than 2 categories, or for subgroups (3, 13). One published example using an...all biases for prevalence data apply. Case-Control Studies Another sampling approach involves selection of cases and of suitable controls. Cases and

  15. An analytical filter design method for guided wave phased arrays

    NASA Astrophysics Data System (ADS)

    Kwon, Hyu-Sang; Kim, Jin-Yeon

    2016-12-01

    This paper presents an analytical method for designing a spatial filter that processes the data from an array of two-dimensional guided wave transducers. An inverse problem is defined in which the spatial filter coefficients are determined such that a prescribed beam shape, i.e., a desired array output, is best approximated in the least-squares sense. Taking advantage of the 2π-periodicity of the generated wave field, a Fourier-series representation is used to derive closed-form expressions for the constituent matrix elements. Special cases in which the desired array output is an ideal delta function or a gate function are considered more explicitly. Numerical simulations are performed to examine the performance of the filters designed by the proposed method. It is shown that the proposed filters can significantly improve the beam quality in general. Most notably, the proposed method does not compromise between the main lobe width and the sidelobe levels; i.e., a narrow main lobe and low sidelobes are achieved simultaneously. It is also shown that the proposed filter can compensate for the effects of nonuniform directivity and sensitivity of the array elements by explicitly taking these into account in the formulation. Using an example of detecting two separate targets, the improvement in angular resolution over the conventional delay-and-sum filter is quantitatively illustrated. Lamb-wave-based imaging of localized defects in an elastic plate using a circular array is also presented as an example of practical applications.
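The least-squares inverse problem described above can be illustrated with a small numerical sketch: build a steering matrix for a circular array, pick a gate-shaped desired output, and solve for the weights that best approximate it. The array geometry, the `ka` value, and the gate width below are assumptions for illustration, not the paper's closed-form Fourier-series formulation:

```python
import numpy as np

def design_weights(n_elem=16, ka=4.0, n_angles=361, gate_halfwidth_deg=30.0):
    """Least-squares array weights approximating a gate-shaped beam."""
    # Element positions equally spaced on a circle
    elem_angles = 2 * np.pi * np.arange(n_elem) / n_elem
    # Look directions over a full revolution (1-degree steps)
    steer = np.deg2rad(np.linspace(-180.0, 180.0, n_angles))
    # Steering matrix: plane-wave phase at each element for each direction
    A = np.exp(1j * ka * np.cos(steer[:, None] - elem_angles[None, :]))
    # Desired output: ideal gate function centered on 0 degrees
    d = (np.abs(np.rad2deg(steer)) <= gate_halfwidth_deg).astype(float)
    # Filter coefficients minimizing ||A w - d||^2
    w, *_ = np.linalg.lstsq(A, d, rcond=None)
    return w, A, d

w, A, d = design_weights()
response = np.abs(A @ w)   # fitted beam pattern magnitude
```

Because the circular wave field is 2π-periodic, the achievable beam is band-limited in angular harmonics, so the least-squares fit returns a smoothed gate: a main lobe over the gate region with suppressed response elsewhere.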

  16. Formal methods in the design of Ada 1995

    NASA Technical Reports Server (NTRS)

    Guaspari, David

    1995-01-01

    Formal, mathematical methods are most useful when applied early in the design and implementation of a software system--that, at least, is the familiar refrain. I will report on a modest effort to apply formal methods at the earliest possible stage, namely, in the design of the Ada 95 programming language itself. This talk is an 'experience report' that provides brief case studies illustrating the kinds of problems we worked on, how we approached them, and the extent (if any) to which the results proved useful. It also derives some lessons and suggestions for those undertaking future projects of this kind. Ada 95 is the first revision of the standard for the Ada programming language. The revision began in 1988, when the Ada Joint Programming Office first asked the Ada Board to recommend a plan for revising the Ada standard. The first step in the revision was to solicit criticisms of Ada 83. A set of requirements for the new language standard, based on those criticisms, was published in 1990. A small design team, the Mapping Revision Team (MRT), became exclusively responsible for revising the language standard to satisfy those requirements. The MRT, from Intermetrics, is led by S. Tucker Taft. The work of the MRT was regularly subject to independent review and criticism by a committee of distinguished Reviewers and by several advisory teams--for example, the two User/Implementor teams, each consisting of an industrial user (attempting to make significant use of the new language on a realistic application) and a compiler vendor (undertaking, experimentally, to modify its current implementation in order to provide the necessary new features). One novel decision established the Language Precision Team (LPT), which investigated language proposals from a mathematical point of view. The LPT applied formal mathematical analysis to help improve the design of Ada 95 (e.g., by clarifying the language proposals) and to help promote its acceptance (e.g., by identifying a

  17. Methods for comparing data across differently designed agronomic studies: examples of different meta-analysis methods used to compare relative composition of plant foods grown using organic or conventional production methods and a protocol for a systematic review.

    PubMed

    Brandt, Kirsten; Srednicka-Tober, Dominika; Barański, Marcin; Sanderson, Roy; Leifert, Carlo; Seal, Chris

    2013-07-31

    Meta-analyses are methods to combine outcomes from different studies to investigate consistent effects of relatively small magnitude, which are difficult to distinguish from random variation within a single study. Several published meta-analyses addressed whether organic and conventional production methods affect the composition of plant foods differently. The meta-analyses were carried out using different options for the methodology and resulted in different conclusions. The types of designs of field trials and farm comparisons widely used in horticultural and agronomic research differ substantially from the clinical trials and epidemiological studies that most meta-analysis methodologies were developed for. Therefore, it is proposed that a systematic review and meta-analysis be carried out with the aim of developing a consolidated methodology. If successful, this methodology can then be used to determine effects of different production systems on plant food composition as well as other comparable factors with small but systematic effects across studies.

  18. PARTIAL RESTRAINING FORCE INTRODUCTION METHOD FOR DESIGNING CONSTRUCTION COUNTERMEASURES BASED ON THE ΔB METHOD

    NASA Astrophysics Data System (ADS)

    Nishiyama, Taku; Imanishi, Hajime; Chiba, Noriyuki; Ito, Takao

    Landslide or slope failure is a three-dimensional movement phenomenon, so a three-dimensional treatment makes it easier to understand stability. The ΔB method (a simplified three-dimensional slope stability analysis method) is based on the limit equilibrium method and is equivalent to an approximate three-dimensional slope stability analysis that extends two-dimensional cross-section stability analysis results to assess stability. This analysis can be conducted using conventional spreadsheets or two-dimensional slope stability computational software. This paper describes the concept of the partial restraining force introduction method for designing construction countermeasures using the distribution of the restraining force found along survey lines, which is based on the distribution of survey-line safety factors derived from the above-stated analysis. This paper also presents the transverse distributive method of restraining force used for planning ground stabilization on the basis of the example analysis.

  19. Basic research on design analysis methods for rotorcraft vibrations

    NASA Astrophysics Data System (ADS)

    Hanagud, S.

    1991-12-01

    The objective of the present work was to develop a method for identifying physically plausible finite element system models of airframe structures from test data. The assumed models were based on linear elastic behavior with general (nonproportional) damping. Physical plausibility of the identified system matrices was ensured by restricting the identification process to designated physical parameters only and not simply to the elements of the system matrices themselves. For example, in a large finite element model the identified parameters might be restricted to the moduli for each of the different materials used in the structure. In the case of damping, a restricted set of damping values might be assigned to finite elements based on the material type and on the fabrication processes used. In this case, different damping values might be associated with riveted, bolted and bonded elements. The method itself is developed first, and several approaches are outlined for computing the identified parameter values. The method is applied first to a simple structure for which the 'measured' response is actually synthesized from an assumed model. Both stiffness and damping parameter values are accurately identified. The true test, however, is the application to a full-scale airframe structure. In this case, a NASTRAN model and actual measured modal parameters formed the basis for the identification of a restricted set of physically plausible stiffness and damping parameters.

  20. Basic research on design analysis methods for rotorcraft vibrations

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1991-01-01

    The objective of the present work was to develop a method for identifying physically plausible finite element system models of airframe structures from test data. The assumed models were based on linear elastic behavior with general (nonproportional) damping. Physical plausibility of the identified system matrices was ensured by restricting the identification process to designated physical parameters only and not simply to the elements of the system matrices themselves. For example, in a large finite element model the identified parameters might be restricted to the moduli for each of the different materials used in the structure. In the case of damping, a restricted set of damping values might be assigned to finite elements based on the material type and on the fabrication processes used. In this case, different damping values might be associated with riveted, bolted and bonded elements. The method itself is developed first, and several approaches are outlined for computing the identified parameter values. The method is applied first to a simple structure for which the 'measured' response is actually synthesized from an assumed model. Both stiffness and damping parameter values are accurately identified. The true test, however, is the application to a full-scale airframe structure. In this case, a NASTRAN model and actual measured modal parameters formed the basis for the identification of a restricted set of physically plausible stiffness and damping parameters.

  1. Designing arrays for modern high-resolution methods

    SciTech Connect

    Dowla, F.U.

    1987-10-01

    A bearing estimation study of seismic wavefields propagating from a strongly heterogeneous medium shows that with the high-resolution MUSIC algorithm the bias of the direction estimate can be reduced by adopting a smaller-aperture sub-array. Further, on this sub-array, the bias of the MUSIC algorithm is less than that of the MLM and Bartlett methods. On the full array, the performances of the three different methods are comparable. The improvement in bearing estimation with MUSIC at a reduced aperture might be attributed to increased signal coherency in the array. For methods with less resolution, the improved signal coherency in the smaller array is possibly being offset by severe loss of resolution and the presence of weak secondary sources. Building upon the characteristics of real seismic wavefields, a design language has been developed to generate, modify, and test other arrays. Eigenstructures of wavefields and arrays have been studied empirically by simulation of a variety of realistic signals. 6 refs., 5 figs.

  2. On the feasibility of a transient dynamic design analysis method

    NASA Astrophysics Data System (ADS)

    Ohara, George J.; Cunniff, Patrick F.

    1992-04-01

    This Annual Report summarizes the progress that was made during the first year of the two-year grant from the Office of Naval Research. The dynamic behavior of structures subjected to mechanical shock loading provides a continuing problem for design engineers concerned with shipboard foundations supporting critical equipment. There are two particular problems associated with shock response that are currently under investigation. The first topic explores the possibilities of developing a transient design analysis method that does not degrade the current level of the Navy's shock-proofness requirements for heavy shipboard equipment. The second topic examines the prospects of developing scaling rules for the shock response of simple internal equipment of submarines subjected to various attack situations. This effort has been divided into two tasks: chemical explosive scaling for a given hull; and scaling of equipment response across different hull sizes. The computer is used as a surrogate shock machine for these studies. Hence, the results of the research can provide trends, ideas, suggestions, and scaling rules to the Navy. In using these results, the shock-hardening program should use measured data rather than calculated data.

  3. Inflammation and Exercise (INFLAME): study rationale, design, and methods

    PubMed Central

    Thompson, Angela; Mikus, Catherine; Rodarte, Ruben Q.; Distefano, Brandy; Priest, Elisa L.; Sinclair, Erin; Earnest, Conrad P.; Blair, Steven N.; Church, Timothy S.

    2008-01-01

    Purpose The INFLAME study is designed to determine the effect of exercise training on elevated high-sensitivity C-Reactive Protein (CRP) concentrations in initially sedentary women and men. Methods INFLAME will recruit 170 healthy, sedentary women and men with elevated CRP (≥2.0 mg/L) to be randomized to either an exercise group or non-exercise control group. Exercising individuals will participate in four months of supervised aerobic exercise with a total energy expenditure of 16 kcal • kg−1 • week−1 (KKW). Exercise intensity will be 60–80% of maximal oxygen consumption (VO2 max). Outcome The primary outcome will be change in plasma CRP concentration. Secondary outcomes include visceral adiposity, the cytokines IL-6 and TNF-α, and heart rate variability (HRV) in order to examine potential biological mechanisms whereby exercise might affect CRP concentrations. Summary INFLAME will help us understand the effects of moderate to vigorous exercise on CRP concentrations in sedentary individuals. To our knowledge this will be the largest training study specifically designed to examine the effect of exercise on CRP concentrations. This study has the potential to influence therapeutic applications since CRP measurement is becoming an important clinical measurement in Coronary Heart Disease risk assessment. This study will also contribute to the limited body of literature examining the effect of exercise on the variables of visceral adiposity, cytokines, and heart rate variability. PMID:18024231

  4. Design of composite laminates by a Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Fang, Chin; Springer, George S.

    1993-01-01

    A Monte Carlo procedure was developed for optimizing symmetric fiber reinforced composite laminates such that the weight is minimum and the Tsai-Wu strength failure criterion is satisfied in each ply. The laminate may consist of several materials including an idealized core, and may be subjected to several sets of combined in-plane and bending loads. The procedure yields the number of plies, the fiber orientation, and the material of each ply and the material and thickness of the core. A user friendly computer code was written for performing the numerical calculations. Laminates optimized by the code were compared to laminates resulting from existing optimization methods. These comparisons showed that the present Monte Carlo procedure is a useful and efficient tool for the design of composite laminates.
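A minimal sketch of the Monte Carlo idea above: draw random candidate layups, keep only those passing a strength check, and retain the lightest. The loads, ply angles, and the crude per-direction strength rule below are invented for illustration and stand in for the Tsai-Wu criterion and ply mechanics of the actual procedure:

```python
import random

ANGLES = [0, 45, -45, 90]  # candidate ply orientations (degrees)

def strength_ok(layup, nx=12.0, ny=6.0):
    """Crude check: enough effective plies to carry loads nx and ny.
    A 0-deg ply carries the x direction, a 90-deg ply the y direction,
    and a +/-45 ply counts half toward each (invented rule)."""
    cap = 2.0  # assumed load carried per effective ply
    n0 = sum(1 for a in layup if a == 0) + 0.5 * sum(1 for a in layup if abs(a) == 45)
    n90 = sum(1 for a in layup if a == 90) + 0.5 * sum(1 for a in layup if abs(a) == 45)
    return n0 * cap >= nx and n90 * cap >= ny

def monte_carlo_design(trials=5000, max_plies=20, seed=1):
    """Random search: ply count is the proxy for laminate weight."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        n = rng.randint(1, max_plies)
        layup = [rng.choice(ANGLES) for _ in range(n)]
        if strength_ok(layup) and (best is None or n < len(best)):
            best = layup
    return best

best = monte_carlo_design()
```

The appeal of the Monte Carlo approach, as in the study above, is that discrete choices (ply count, orientation, material) are handled naturally, where gradient-based optimizers struggle.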

  5. A Design Method for FES Bone Health Therapy in SCI

    PubMed Central

    Andrews, Brian; Shippen, James; Armengol, Monica; Gibbons, Robin; Holderbaum, William; Harwin, William

    2016-01-01

    FES-assisted activities such as standing, walking, cycling and rowing induce forces within the leg bones and have been proposed to reduce osteoporosis in spinal cord injury (SCI). However, details of the applied mechanical stimulus for osteogenesis are often not reported. Typically, comparisons of bone density results are made after costly and time-consuming clinical trials. These studies have produced inconsistent results and are subject to sample size variations. Here we propose a design process that may be used to predict the clinical outcome based on biomechanical simulation and mechanobiology. This method may allow candidate therapies to be optimized and quantitatively compared. To illustrate the approach we have used data obtained from a rower with complete paraplegia using the RowStim (III) system. PMID:28078075

  6. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
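The claim above describes reasoning modules that each process only abstractions of a matching ontology classification type from a shared semantic graph. That dispatch pattern can be sketched as follows; all class, type, and handler names are invented for illustration and are not drawn from the patent:

```python
class ReasoningSystem:
    """Routes abstractions from a semantic graph to reasoning modules
    registered per ontology classification type (illustrative sketch)."""

    def __init__(self):
        self.modules = {}  # classification type -> reasoning module (callable)

    def register(self, cls_type, handler):
        self.modules[cls_type] = handler

    def process(self, semantic_graph):
        results = []
        for abstraction in semantic_graph:
            handler = self.modules.get(abstraction["type"])
            if handler:  # only the matching module sees this abstraction
                results.append(handler(abstraction))
        return results

system = ReasoningSystem()
system.register("Person", lambda a: ("person-module", a["individual"]))
system.register("Event", lambda a: ("event-module", a["individual"]))
graph = [{"type": "Person", "individual": "alice"},
         {"type": "Event", "individual": "login-42"}]
out = system.process(graph)
```

Each module stays ignorant of classification types it was not registered for, mirroring the claim's separation between the first and second reasoning modules.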

  7. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  8. Design method of water jet pump towards high cavitation performances

    NASA Astrophysics Data System (ADS)

    Cao, L. L.; Che, B. X.; Hu, L. J.; Wu, D. Z.

    2016-05-01

    As one of the crucial components for power supply, the propulsion system is of great significance to the advance speed, noise performance, stability and other associated critical performances of underwater vehicles. As they develop towards much higher advance speeds, underwater vehicles make more critical demands on the performance of the propulsion system. Basically, the increased advance speed requires a significantly raised rotation speed of the propulsion system, which would result in deteriorated cavitation performance and consequently limit the thrust and efficiency of the whole system. Compared with the traditional propeller, the water jet pump offers more favourable cavitation, propulsion efficiency and other associated performances. The present research focuses on the cavitation performance of the waterjet pump blade profile, in expectation of enlarging its advantages in high-speed vehicle propulsion. Based on the specifications of a certain underwater vehicle, the design method of the waterjet blade with high cavitation performance was investigated by means of numerical simulation.

  9. Development of impact design methods for ceramic gas turbine components

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1990-01-01

    Impact damage prediction methods are being developed to aid in the design of ceramic gas turbine engine components with improved impact resistance. Two impact damage modes were characterized: local damage near the impact site, and structural damage, usually fast fracture away from the impact site. Local damage to Si3N4 impacted by Si3N4 spherical projectiles consists of ring and/or radial cracks around the impact point. In a mechanistic model being developed, impact damage is characterized as microcrack nucleation and propagation. The extent of damage is measured as the volume fraction of microcracks. Model capability is demonstrated by simulating plate impact tests. Structural failure is caused by tensile stress during impact exceeding the material strength. The EPIC3 code was successfully used to predict blade structural failures for impacts of different particle sizes on radial and axial blades.

  10. Computational methods in metabolic engineering for strain design.

    PubMed

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native enzymes, heterologous enzymes, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve runtime performance have also been developed, which allow more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms.

  11. Unique Method for Generating Design Earthquake Time Histories

    SciTech Connect

    R. E. Spears

    2008-07-01

    A method has been developed which takes a seed earthquake time history and modifies it to match given design response spectra. It is a multi-step process with an initial scaling step followed by multiple refinement steps. It is unique in that both the acceleration and displacement response spectra are considered when performing the fit (which primarily improves the accuracy of the low-frequency acceleration response spectrum). Additionally, no matrix inversion is needed. The features include encouraging the code acceleration, velocity, and displacement ratios and attempting to fit the pseudo-velocity response spectrum. Also, “smoothing” is done to transition the modified time history to the seed time history at its start and end. This is done in the time history regions below a cumulative energy of 5% and above a cumulative energy of 95%. Finally, the modified acceleration, velocity, and displacement time histories are adjusted to start and end with an amplitude of zero (using Fourier transform techniques for integration).
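The initial scaling step can be illustrated with a small sketch: compute the acceleration response spectrum of a seed motion by sweeping damped single-degree-of-freedom oscillators, then scale the record so its spectrum matches a target on average. The seed motion, the flat target, and the semi-implicit Euler integrator below are assumptions for illustration; the actual method's refinement and smoothing steps are not reproduced:

```python
import numpy as np

def response_spectrum(acc, dt, freqs, zeta=0.05):
    """Peak total acceleration of damped SDOF oscillators driven at the base."""
    sa = []
    for f in freqs:
        wn = 2.0 * np.pi * f
        u = v = 0.0
        peak = 0.0
        for ag in acc:  # semi-implicit Euler time stepping
            a_rel = -ag - 2.0 * zeta * wn * v - wn * wn * u
            v += a_rel * dt
            u += v * dt
            # total acceleration = relative + ground = -(2*zeta*wn*v + wn^2*u)
            peak = max(peak, abs(2.0 * zeta * wn * v + wn * wn * u))
        sa.append(peak)
    return np.array(sa)

dt = 0.01
t = np.arange(0.0, 5.0, dt)
seed = np.sin(2 * np.pi * 2 * t) * np.exp(-0.5 * t)  # invented seed motion
freqs = np.array([1.0, 2.0, 5.0])
target = np.array([1.0, 1.0, 1.0])                   # invented flat target spectrum

sa_seed = response_spectrum(seed, dt, freqs)
scale = target.mean() / sa_seed.mean()               # initial scaling step
scaled = seed * scale
sa_scaled = response_spectrum(scaled, dt, freqs)
```

Because the oscillator dynamics are linear, scaling the record scales its spectrum exactly; the refinement steps of the actual method then adjust individual frequency bands, which a single scale factor cannot do.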

  12. Information processing systems, reasoning modules, and reasoning system design methods

    SciTech Connect

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  13. Allergic contact dermatitis: epidemiology, molecular mechanisms, in vitro methods and regulatory aspects. Current knowledge assembled at an international workshop at BfR, Germany.

    PubMed

    Peiser, M; Tralau, T; Heidler, J; Api, A M; Arts, J H E; Basketter, D A; English, J; Diepgen, T L; Fuhlbrigge, R C; Gaspari, A A; Johansen, J D; Karlberg, A T; Kimber, I; Lepoittevin, J P; Liebsch, M; Maibach, H I; Martin, S F; Merk, H F; Platzek, T; Rustemeyer, T; Schnuch, A; Vandebriel, R J; White, I R; Luch, A

    2012-03-01

    Contact allergies are complex diseases, and one of the important challenges for public health and immunology. The German 'Federal Institute for Risk Assessment' hosted an 'International Workshop on Contact Dermatitis'. The scope of the workshop was to discuss new discoveries and developments in the field of contact dermatitis. This included the epidemiology and molecular biology of contact allergy, as well as the development of new in vitro methods. Furthermore, it considered regulatory aspects aiming to reduce exposure to contact sensitisers. An estimated 15-20% of the general population suffers from contact allergy. Workplace exposure, age, sex, use of consumer products and genetic predispositions were identified as the most important risk factors. Research highlights included: advances in understanding of immune responses to contact sensitisers, the importance of autoxidation or enzyme-mediated oxidation for the activation of chemicals, the mechanisms through which hapten-protein conjugates are formed and the development of novel in vitro strategies for the identification of skin-sensitising chemicals. Dendritic cell cultures and structure-activity relationships are being developed to identify potential contact allergens. However, the local lymph node assay (LLNA) presently remains the validated method of choice for hazard identification and characterisation. At the workshop the use of the LLNA for regulatory purposes and for quantitative risk assessment was also discussed.

  14. [A method for studying social security records in epidemiology. Use in a study on the prognosis of chronic bronchitis (author's transl)].

    PubMed

    Kauffmann, F; Bahi, J; Brille, D

    1976-01-01

    A method is presented for studying social security records in epidemiological research. This study is based upon the records of workers affiliated to the French social security general system. To obtain comparable data, it was necessary to take the legislation as a basis; this legislation specifies the data which must appear in the records. A study of laws and rules was done to locate these data in the medical record and in the administrative one. A questionnaire is presented. This basic questionnaire should be modified according to the precise objectives of each study and to the characteristics of the population sample. To illustrate this method, some results of a study of chronic bronchitis risk factors are presented in the second part. These results concern 950 men, born in France, aged 30 to 59 in 1960 and still alive in 1972. The study of long-term reductions in the ability to work, occurring from 1960 to 1971, confirms the disabling character of the group "chronic bronchitis, asthma, emphysema, respiratory insufficiency", which ranks immediately after cardiovascular and rheumatic diseases. The total number of beneficiaries of social security is already very large, and the whole population will soon be covered. The use of social security records as a data source could give very interesting information about morbidity. It thus becomes possible to study representative samples of the general population or of particular groups, which has up to now been done only to a slight extent.

  15. International Evaluation of MIC Distributions and Epidemiological Cutoff Value (ECV) Definitions for Fusarium Species Identified by Molecular Methods for the CLSI Broth Microdilution Method

    PubMed Central

    Colombo, A. L.; Cordoba, S.; Dufresne, P. J.; Fuller, J.; Ghannoum, M.; Gonzalez, G. M.; Guarro, J.; Kidd, S. E.; Melhem, T. M. S. C.; Pelaez, T.; Pfaller, M. A.; Szeszs, M. W.; Takahaschi, J. P.; Wiederhold, N. P.; Turnidge, J.

    2015-01-01

    The CLSI epidemiological cutoff values (ECVs) of antifungal agents are available for various Candida spp., Aspergillus spp., and the Mucorales. However, those categorical endpoints have not been established for Fusarium spp., mostly due to the difficulties associated with collecting sufficient CLSI MICs for clinical isolates identified according to the currently recommended molecular DNA-PCR-based identification methodologies. CLSI MIC distributions were established for 53 Fusarium dimerum species complex (SC), 10 F. fujikuroi, 82 F. proliferatum, 20 F. incarnatum-F. equiseti SC, 226 F. oxysporum SC, 608 F. solani SC, and 151 F. verticillioides isolates originating in 17 laboratories (in Argentina, Australia, Brazil, Canada, Europe, Mexico, and the United States). According to the CLSI guidelines for ECV setting, ECVs encompassing ≥97.5% of pooled statistically modeled MIC distributions were as follows: for amphotericin B, 4 μg/ml (F. verticillioides) and 8 μg/ml (F. oxysporum SC and F. solani SC); for posaconazole, 2 μg/ml (F. verticillioides), 8 μg/ml (F. oxysporum SC), and 32 μg/ml (F. solani SC); for voriconazole, 4 μg/ml (F. verticillioides), 16 μg/ml (F. oxysporum SC), and 32 μg/ml (F. solani SC); and for itraconazole, 32 μg/ml (F. oxysporum SC and F. solani SC). Insufficient data precluded ECV definition for the other species. Although these ECVs could aid in detecting non-wild-type isolates with reduced susceptibility to the agents evaluated, the relationship between molecular mechanisms of resistance (gene mutations) and MICs still needs to be investigated for Fusarium spp. PMID:26643334
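    The ECV definition above (the MIC encompassing ≥97.5% of the pooled, statistically modeled MIC distribution) can be illustrated with a simplified empirical version. The MIC distribution below is hypothetical, and CLSI ECV setting uses statistical modeling rather than raw counts; this is only a sketch of the idea:

```python
# Simplified empirical ECV: the smallest two-fold dilution step that
# captures >= 97.5% of a pooled MIC distribution. (CLSI uses a
# statistically modeled distribution; this sketch uses raw counts.)
def empirical_ecv(mic_counts, coverage=0.975):
    total = sum(mic_counts.values())
    cumulative = 0
    for mic in sorted(mic_counts):       # MICs in ascending order
        cumulative += mic_counts[mic]
        if cumulative / total >= coverage:
            return mic
    return None

# Hypothetical MIC distribution (mg/liter -> isolate count)
dist = {0.5: 2, 1: 10, 2: 40, 4: 30, 8: 15, 16: 2, 32: 1}
print(empirical_ecv(dist))  # -> 16
```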

  16. The Method of Complex Characteristics for Design of Transonic Compressors.

    NASA Astrophysics Data System (ADS)

    Bledsoe, Margaret Randolph

    We calculate shockless transonic flows past two-dimensional cascades of airfoils characterized by a prescribed speed distribution. The approach is to find solutions of the partial differential equation (c² − u²)φ_xx − 2uv φ_xy + (c² − v²)φ_yy = 0 by the method of complex characteristics. Here φ is the velocity potential, so ∇φ = (u, v), and c is the local speed of sound. Our method consists in noting that the coefficients of the equation are analytic, so that we can use analytic continuation, conformal mapping, and a spectral method in the hodograph plane to determine the flow. After complex extension we obtain canonical equations for φ and for the stream function ψ, as well as an explicit map from the hodograph plane to complex characteristic coordinates. In the subsonic case, a new coordinate system is defined in which the flow region corresponds to the interior of an ellipse. We construct special solutions of the flow equations in these coordinates by solving characteristic initial value problems in the ellipse with initial data defined by the complete system of Chebyshev polynomials. The condition ψ = 0 on the boundary of the ellipse is used to determine the series representation of φ and ψ. The map from the ellipse to the complex flow coordinates is found from data specifying the speed q as a function of the arc length s. The transonic problem for shockless flow becomes well posed after appropriate modifications of this procedure. The nonlinearity of the problem is handled by an iterative method that determines the boundary value problem in the ellipse and the map function in sequence. We have implemented this method as a computer code to design two-dimensional cascades of shockless compressor airfoils with gap-to-chord ratios as low as 0.5 and supersonic zones on both the upper and lower surfaces. The method may be extended to solve more general boundary value problems for second-order partial differential equations.
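    The mixed type of the potential equation above can be checked numerically: its discriminant (2uv)² − 4(c² − u²)(c² − v²) simplifies to 4c²(q² − c²) with q² = u² + v², so the equation is elliptic where the flow is subsonic (q < c) and hyperbolic where it is supersonic (q > c). A minimal sketch, with arbitrary illustrative velocity values:

```python
# Classify the potential equation by its discriminant:
# (2uv)^2 - 4(c^2 - u^2)(c^2 - v^2) = 4c^2(q^2 - c^2), q^2 = u^2 + v^2,
# negative (elliptic) for subsonic flow, positive (hyperbolic) for supersonic.
def discriminant(u, v, c):
    return (2 * u * v) ** 2 - 4 * (c**2 - u**2) * (c**2 - v**2)

subsonic = discriminant(u=100.0, v=50.0, c=340.0)     # q ~ 111.8 < c
supersonic = discriminant(u=400.0, v=100.0, c=340.0)  # q ~ 412.3 > c
print(subsonic < 0, supersonic > 0)  # -> True True
```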

  17. Rationale, design and methods of the CASHMERE study.

    PubMed

    Simon, Tabassome; Boutouyrie, Pierre; Gompel, Anne; Christin-Maitre, Sophie; Laurent, Stéphane; Thuillez, Christian; Zannad, Faiez; Bernaud, Corine; Jaillon, Patrice

    2004-02-01

    Carotid intima-media thickness (IMT) measurement is a noninvasive method used for quantification of the early stage of atherosclerosis. Data suggest that the combination of a statin and hormone replacement therapy (HRT) might be useful in reducing the early progression of atherosclerosis in postmenopausal women. The main aim of the study is to compare the effects of 12-month therapy with atorvastatin (80 mg/day), HRT (oral 17beta-estradiol 1 or 2 mg/day, plus cyclic dydrogesterone 10 mg) alone and their combination vs. placebo on the progression of carotid IMT by using a high-definition echotracking device. The secondary objectives are to assess the effects of the treatments vs. placebo on arterial stiffness, lipid profile and C-reactive protein. The CASHMERE trial is a European randomized study with a 2 x 2 factorial design, double-blinded for atorvastatin, with the prospective randomized, open, blinded endpoint evaluation (PROBE) method applied to HRT. The investigators can adjust the dose of estradiol at any time during follow-up if necessary. A total of 800 postmenopausal women with mild hypercholesterolemia and no previous history of cardiovascular disease will be included and followed up by their physicians [general practitioners (GPs) or gynecologists] for 1 year. The CASHMERE trial is the first randomized clinical trial to examine the effects of a statin alone or combined with HRT on the structure and function of the carotid artery as early markers of atherosclerosis in postmenopausal women with mild hypercholesterolemia. Results are expected in 2007.

  18. Infectious Agents and Cancer Epidemiology Research Webinar Series

    Cancer.gov

    Infectious Agents and Cancer Epidemiology Research Webinar Series highlights emerging and cutting-edge research related to infection-associated cancers, shares scientific knowledge about technologies and methods, and fosters cross-disciplinary discussions on infectious agents and cancer epidemiology.

  19. Design optimization methods for genomic DNA tiling arrays

    PubMed Central

    Bertone, Paul; Trifonov, Valery; Rozowsky, Joel S.; Schubert, Falk; Emanuelsson, Olof; Karro, John; Kao, Ming-Yang; Snyder, Michael; Gerstein, Mark

    2006-01-01

    A recent development in microarray research entails the unbiased coverage, or tiling, of genomic DNA for the large-scale identification of transcribed sequences and regulatory elements. A central issue in designing tiling arrays is that of arriving at a single-copy tile path, as significant sequence cross-hybridization can result from the presence of non-unique probes on the array. Due to the fragmentation of genomic DNA caused by the widespread distribution of repetitive elements, the problem of obtaining adequate sequence coverage increases with the sizes of subsequence tiles that are to be included in the design. This becomes increasingly problematic when considering complex eukaryotic genomes that contain many thousands of interspersed repeats. The general problem of sequence tiling can be framed as finding an optimal partitioning of non-repetitive subsequences over a prescribed range of tile sizes, on a DNA sequence comprising repetitive and non-repetitive regions. Exact solutions to the tiling problem become computationally infeasible when applied to large genomes, but successive optimizations are developed that allow their practical implementation. These include an efficient method for determining the degree of similarity of many oligonucleotide sequences over large genomes, and two algorithms for finding an optimal tile path composed of longer sequence tiles. The first algorithm, a dynamic programming approach, finds an optimal tiling in linear time and space; the second applies a heuristic search to reduce the space complexity to a constant requirement. A Web resource has also been developed, accessible at http://tiling.gersteinlab.org, to generate optimal tile paths from user-provided DNA sequences. PMID:16365382
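    The dynamic-programming idea behind the first tiling algorithm can be sketched on a toy version of the problem. This is an illustrative simplification, not the authors' implementation: it considers a single non-repetitive run and chooses non-overlapping tiles with lengths restricted to a prescribed range so as to maximize covered bases, in linear time for fixed tile-size range:

```python
# Toy dynamic-programming tiler: given the length of one non-repetitive
# run, choose non-overlapping tiles with lengths in [tmin, tmax] to
# maximize the number of covered bases.
def max_coverage(run_len, tmin, tmax):
    best = [0] * (run_len + 1)   # best[i] = max coverage of the first i bases
    for i in range(1, run_len + 1):
        best[i] = best[i - 1]    # option: leave base i-1 uncovered
        for t in range(tmin, min(tmax, i) + 1):
            # option: end a tile of length t at position i
            best[i] = max(best[i], best[i - t] + t)
    return best[run_len]

print(max_coverage(10, 4, 6))  # -> 10 (e.g. tiles of length 4 and 6)
```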

  20. Classification of personal exposure to radio frequency electromagnetic fields (RF-EMF) for epidemiological research: Evaluation of different exposure assessment methods.

    PubMed

    Frei, Patrizia; Mohler, Evelyn; Bürgi, Alfred; Fröhlich, Jürg; Neubauer, Georg; Braun-Fahrländer, Charlotte; Röösli, Martin

    2010-10-01

    The use of personal exposure meters (exposimeters) has been recommended for measuring personal exposure to radio frequency electromagnetic fields (RF-EMF) from environmental far-field sources in everyday life. However, it is unclear to what extent exposimeter readings are affected by measurements taken when personal mobile and cordless phones are used. In addition, the use of exposimeters in large epidemiological studies is limited due to high costs and the large effort required of study participants. In the current analysis we aimed to investigate the impact of personal phone use on exposimeter readings and to evaluate different exposure assessment methods potentially useful in epidemiological studies. We collected personal exposimeter measurements during one week and diary data from 166 study participants. Moreover, we collected spot measurements in the participants' bedrooms and data on self-estimated exposure, assessed residential exposure to fixed-site transmitters by calculating the geo-coded distance and mean RF-EMF from a geospatial propagation model, and developed an exposure prediction model based on the propagation model and exposure-relevant behavior. The mean personal exposure was 0.13 mW/m² when measurements during personal phone calls were excluded and 0.15 mW/m² when such measurements were included. The Spearman correlation with personal exposure (without personal phone calls) was 0.42 (95%-CI: 0.29 to 0.55) for the spot measurements, -0.03 (95%-CI: -0.18 to 0.12) for the geo-coded distance, 0.28 (95%-CI: 0.14 to 0.42) for the geospatial propagation model, 0.50 (95%-CI: 0.37 to 0.61) for the full exposure prediction model and 0.06 (95%-CI: -0.10 to 0.21) for self-estimated exposure. In conclusion, personal exposure measured with exposimeters correlated best with the full exposure prediction model and spot measurements. Self-estimated exposure and geo-coded distance turned out to be poor surrogates for personal exposure.
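    The Spearman rank correlations used to compare exposure surrogates against measured personal exposure can be reproduced with a small self-contained implementation (no tie handling; the exposure values below are hypothetical, not the study's measurements):

```python
# Spearman rank correlation = Pearson correlation of the ranks.
def rank(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos + 1.0
    return r  # no tie handling in this sketch

def spearman(x, y):
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

measured  = [0.05, 0.13, 0.22, 0.09, 0.31]   # hypothetical mW/m^2
predicted = [0.06, 0.10, 0.25, 0.11, 0.28]   # hypothetical model output
print(round(spearman(measured, predicted), 2))  # -> 0.9
```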

  1. The American Thoracic Society methods in epidemiologic, clinical, and operations research program. A research capacity-building program in low- and middle-income countries.

    PubMed

    Buist, A Sonia; Parry, Vivienne

    2013-08-01

    Respiratory diseases are a major cause of morbidity and mortality worldwide. The greatest impact of many of these diseases is felt in low- and middle-income countries, but their control and management is hampered by lack of accurate estimates of their prevalence, risk factors, and distribution, and knowledge of the social and cultural setting in which they occur. Providing enough information for cost-effective response to respiratory diseases requires research by trained investigators and public health personnel. The American Thoracic Society (ATS) Methods in Epidemiologic, Clinical, and Operations Research (MECOR) Program was launched in 1994 to provide a sustainable means of increasing local and national research capacity aimed at addressing this need. As of March 2013, approximately 1,015 students have completed at least one level of the training program. Post-MECOR, 64% of participants have published a medical paper, 79% have presented at a scientific or academic meeting, 51% have submitted a research protocol for funding, and 42% have had one funded. One-quarter have been awarded an academic or clinical fellowship, and 78% reported that MECOR had made a significant or extremely important contribution to their professional life and accomplishments. Future challenges include funding, recruitment of local faculty, helping to build the research infrastructure in MECOR countries, and providing ongoing mentoring for research.

  2. A New Method to Predict the Epidemiology of Fungal Keratitis by Monitoring the Sales Distribution of Antifungal Eye Drops in Brazil

    PubMed Central

    Ibrahim, Marlon Moraes; de Angelis, Rafael; Lima, Acacio Souza; Viana de Carvalho, Glauco Dreyer; Ibrahim, Fuad Moraes; Malki, Leonardo Tannus; de Paula Bichuete, Marina; de Paula Martins, Wellington; Rocha, Eduardo Melani

    2012-01-01

    Purpose: Fungi are a major cause of keratitis, although few medications are licensed for their treatment. The aim of this study is to observe the variation in commercialisation of antifungal eye drops, and to predict the seasonal distribution of fungal keratitis in Brazil. Methods: Data were gathered from a retrospective study of antifungal eye drop sales by the only pharmaceutical ophthalmologic laboratory authorized to dispense them in Brazil (Opthalmos). These data were correlated with the geographic and seasonal distribution of fungal keratitis in Brazil between July 2002 and June 2008. Results: A total of 26,087 antifungal eye drop units were sold, with a mean of 2.3 per patient. There was significant variation in antifungal sales during the year (p<0.01). A linear regression model displayed a significant association between reduced relative humidity and antifungal drug sales (R² = 0.17, p<0.01). Conclusions: Antifungal eye drop sales suggest that there is a seasonal distribution of fungal keratitis. A possible interpretation is that the third quarter of the year, a drier period when agricultural activity in Brazil is more intense, is associated with a higher incidence of fungal keratitis. A similar model could be applied to predict epidemiological aspects of other diseases that are managed with a single, or a few, monitorable medications. PMID:22457787
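    A linear regression of sales against relative humidity, as used in the study, can be sketched with ordinary least squares; the monthly figures below are hypothetical (the study itself reported R² = 0.17):

```python
# Ordinary least squares fit of antifungal sales against relative humidity.
def ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)
    alpha = my - beta * mx
    ss_res = sum((b - (alpha + beta * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return alpha, beta, 1 - ss_res / ss_tot   # intercept, slope, R^2

humidity = [30, 45, 60, 75, 85]           # % relative humidity (hypothetical)
sales    = [520, 480, 410, 380, 350]      # eye-drop units sold (hypothetical)
a, b, r2 = ols(humidity, sales)
print(b < 0, round(r2, 2))                # negative slope: drier -> more sales
```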

  3. Methods and Processes of Developing the Strengthening the Reporting of Observational Studies in Epidemiology - Veterinary (STROBE-Vet) Statement.

    PubMed

    Sargeant, J M; O'Connor, A M; Dohoo, I R; Erb, H N; Cevallos, M; Egger, M; Ersbøll, A K; Martin, S W; Nielsen, L R; Pearl, D L; Pfeiffer, D U; Sanchez, J; Torrence, M E; Vigre, H; Waldner, C; Ward, M P

    2016-12-01

    The reporting of observational studies in veterinary research presents many challenges that often are not adequately addressed in published reporting guidelines. A consensus meeting of experts was organized to develop an extension of the STROBE statement to address observational studies in veterinary medicine with respect to animal health, animal production, animal welfare and food safety outcomes. The consensus meeting was held 11-13 May 2014 in Mississauga, Ontario, Canada. Seventeen experts from North America, Europe and Australia attended the meeting. The experts were epidemiologists and biostatisticians, many of whom hold or have held editorial positions with relevant journals. Prior to the meeting, 19 experts completed a survey about whether they felt any of the 22 items of the STROBE statement should be modified and whether items should be added to address unique issues related to observational studies in animal species with health, production, welfare or food safety outcomes. At the meeting, the participants were provided with the survey responses and relevant literature concerning the reporting of veterinary observational studies. During the meeting, each STROBE item was discussed to determine whether or not re-wording was recommended, and whether additions were warranted. Anonymous voting was used to determine whether there was consensus for each item change or addition. The consensus was that six items needed no modifications or additions. Modifications or additions were made to the STROBE items numbered as follows: 1 (title and abstract), 3 (objectives), 5 (setting), 6 (participants), 7 (variables), 8 (data sources/measurement), 9 (bias), 10 (study size), 12 (statistical methods), 13 (participants), 14 (descriptive data), 15 (outcome data), 16 (main results), 17 (other analyses), 19 (limitations) and 22 (funding). Published literature was not always available to support modification to, or inclusion of, an item. The methods and processes used in the

  4. International Lymphoma Epidemiology Consortium

    Cancer.gov

    The InterLymph Consortium, or formally the International Consortium of Investigators Working on Non-Hodgkin's Lymphoma Epidemiologic Studies, is an open scientific forum for epidemiologic research in non-Hodgkin's lymphoma.

  5. Epidemiology of Alcoholism.

    ERIC Educational Resources Information Center

    Helzer, John E.

    1987-01-01

    Reviews the application of epidemiology to alcoholism. Discusses measurement and diagnostic issues and reviews studies of the prevalence of alcoholism, its risk factors, and the contributions of epidemiology to our knowledge of treatment and prevention. (Author/KS)

  6. Epidemiological Cutoff Values for Fluconazole, Itraconazole, Posaconazole, and Voriconazole for Six Candida Species as Determined by the Colorimetric Sensititre YeastOne Method

    PubMed Central

    Pemán, Javier; Iñiguez, Carmen; Hervás, David; Lopez-Hontangas, Jose L.; Pina-Vaz, Cidalia; Camarena, Juan J.; Campos-Herrero, Isolina; García-García, Inmaculada; García-Tapia, Ana M.; Guna, Remedios; Merino, Paloma; Pérez del Molino, Luisa; Rubio, Carmen; Suárez, Anabel

    2013-01-01

    In the absence of clinical breakpoints (CBP), epidemiological cutoff values (ECVs) are useful to separate wild-type (WT) isolates (without mechanisms of resistance) from non-WT isolates (those that can harbor some resistance mechanisms), which is the goal of susceptibility tests. Sensititre YeastOne (SYO) is a widely used method to determine susceptibility of Candida spp. to antifungal agents. The CLSI CBP have been established, but not for the SYO method. The ECVs for four azoles, obtained using MIC distributions determined by the SYO method, were calculated via five methods (three statistical methods and two based on the MIC50 and the modal MIC). Respectively, the median ECVs (in mg/liter) of the five methods for fluconazole, itraconazole, posaconazole, and voriconazole (in parentheses: the percentage of isolates inhibited by MICs equal to or less than the ECVs; the number of isolates tested) were as follows: 2 (94.4%; 944), 0.5 (96.7%; 942), 0.25 (97.6%; 673), and 0.06 (96.7%; 849) for Candida albicans; 4 (86.1%; 642), 0.5 (99.4%; 642), 0.12 (93.9%; 392), and 0.06 (86.9%; 559) for C. parapsilosis; 8 (94.9%; 175), 1 (93.7%; 175), 2 (93.6%; 125), and 0.25 (90.4%; 167) for C. tropicalis; 128 (98.6%; 212), 4 (95.8%; 212), 4 (96.0%; 173), and 2 (98.5%; 205) for C. glabrata; 256 (100%; 53), 1 (98.1%; 53), 1 (100%; 33), and 1 (97.9%; 48) for C. krusei; 4 (89.2%; 93), 0.5 (100%; 93), 0.25 (100%; 33), and 0.06 (87.7%; 73) for C. orthopsilosis. All methods included ≥94% of isolates and yielded similar ECVs (within 1 dilution). These ECVs would be suitable for monitoring emergence of isolates with reduced susceptibility by using the SYO method. PMID:23761155
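    As a rough illustration of ECV setting from a modal MIC, one simple heuristic places the cutoff a fixed number of two-fold dilutions above the mode of the wild-type distribution. This sketch and its MIC data are hypothetical and do not reproduce the five methods used in the study:

```python
# Heuristic ECV from the modal MIC: mode of the wild-type distribution
# shifted up by a fixed number of two-fold dilutions (illustrative only).
from collections import Counter

def ecv_from_mode(mics, dilutions_above=2):
    mode = Counter(mics).most_common(1)[0][0]
    return mode * (2 ** dilutions_above)

fluconazole_mics = [0.5] * 5 + [1] * 20 + [2] * 8 + [4] * 2   # hypothetical
print(ecv_from_mode(fluconazole_mics))  # mode 1 mg/liter -> ECV 4 mg/liter
```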

  7. Restriction endonuclease analysis of clinical Pseudomonas aeruginosa strains: useful epidemiologic data from a simple and rapid method.

    PubMed Central

    Maher, W E; Kobe, M; Fass, R J

    1993-01-01

    Newer genetic techniques have replaced phenotypic methods of subtyping Pseudomonas aeruginosa strains. Widespread application of newer methodologies, however, may be limited by technologic complexity and the cost of equipment. We conducted restriction endonuclease analysis (REA) of sheared genomic DNAs from 48 clinical P. aeruginosa strains using the enzyme SalI and electrophoresis in horizontal, low-concentration (0.3 to 0.6%) agarose gels. Each REA profile consisted of a smear of lower-molecular-mass bands as well as a countable number of well-resolved bands in the 8.3- to 48.5-kbp range which could easily be compared when isolates were run side-by-side on the same gel. In general, the REA patterns of strains recovered from different patients differed by at least seven bands, and those of serial isolates from individual patients were identical or differed by, at most, two bands over this 8.3- to 48.5-kbp range. REA of strains already subtyped by field inversion gel electrophoresis revealed that the two techniques generally paralleled each other. Overall, some unrelated strains had similar REA profiles, but the relative simplicity and low cost of the approach coupled with the ability to demonstrate differences between most unrelated strains should make this type of REA an attractive first step in the investigation of institutional P. aeruginosa problems. PMID:8391021

  8. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    SciTech Connect

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-15

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  9. Molecular Epidemiology of Leptospirosis in Northern Iran by Nested Polymerase Chain Reaction/Restriction Fragment Length Polymorphism and Sequencing Methods

    PubMed Central

    Zakeri, Sedigheh; Sepahian, Neda; Afsharpad, Mandana; Esfandiari, Behzad; Ziapour, Peyman; Djadid, Navid D.

    2010-01-01

    This study was conducted to investigate the prevalence of Leptospira species in Mazandaran Province of Iran by using nested polymerase chain reaction (PCR)/restriction fragment length polymorphism (RFLP) methods and sequencing analysis. Blood samples (n = 119) were collected from humans suspected of having leptospirosis from different parts of the province in 2007. By using an indirect immunofluorescent antibody test (IFAT), we determined that 35 (29.4%) of 119 suspected cases had leptospiral antibody titers ≥ 1:80, which confirmed the diagnosis of leptospirosis. Nested PCR assay also determined that 60 (50.4%) of 119 samples showed Leptospira infection. Furthermore, 44 (73.3%) of 60 confirmed leptospirosis amplified products were subjected to sequencing analysis. Sequence alignment identified L. interrogans, L. kirschneri, and L. wolffii species. All positive cases diagnosed by IFAT or PCR were in patients who reported contact with animals, high-risk occupational activities, and exposure to contaminated water. Therefore, it is important to increase attention about this disease among physicians and to strengthen laboratory capacity for its diagnosis in infected patients in Iran. PMID:20439973

  10. SU-D-16A-01: A Novel Method to Estimate Normal Tissue Dose for Radiotherapy Patients to Support Epidemiologic Studies of Second Cancer Risk

    SciTech Connect

    Lee, C; Jung, J; Pelletier, C; Kim, J; Lee, C

    2014-06-01

    Purpose: Patient cohorts in second cancer studies often include radiotherapy patients for whom no radiological images are available. We developed methods to construct a realistic surrogate anatomy by using computational human phantoms, and tested these phantom images both in a commercial treatment planning system (Eclipse) and in a custom Monte Carlo (MC) transport code. Methods: We used a reference adult male phantom defined by the International Commission on Radiological Protection (ICRP). The hybrid phantom, originally developed in Non-Uniform Rational B-Spline (NURBS) and polygon mesh format, was converted into a more common medical imaging format. Electron density was calculated from the material composition of the organs and tissues and then converted into DICOM format. The DICOM images were imported into the Eclipse system for treatment planning, and the resulting DICOM-RT files were then imported into the MC code for MC-based dose calculation. Normal tissue doses were calculated in Eclipse and in the MC code for an illustrative prostate treatment case and compared to each other. Results: DICOM images were generated from the adult male reference phantom. Densities and volumes of selected organs in the original phantom and in its representation within Eclipse agreed well, differing by less than 0.6%. Mean doses from Eclipse and the MC code agreed within 7%, whereas maximum and minimum doses differed by up to 45%. Conclusion: The methods established in this study will be useful for the reconstruction of organ dose to support epidemiological studies of second cancer in cancer survivors treated by radiotherapy. We are also working on implementing body size-dependent computational phantoms to better represent patient anatomy when the heights and weights of patients are available.
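    The electron-density step described above can be illustrated with the standard composition formula ρe = ρ · NA · Σ(wi · Zi / Ai). The code below applies it to water as a sanity check; it is a sketch of the general formula, not the study's implementation:

```python
# Electron density (electrons/cm^3) from mass density and elemental
# composition: rho_e = rho * N_A * sum(w_i * Z_i / A_i).
N_A = 6.02214e23  # Avogadro's number (1/mol)

def electron_density(rho_g_cm3, composition):
    # composition: {element: (weight_fraction, Z, A)}
    return rho_g_cm3 * N_A * sum(w * Z / A for w, Z, A in composition.values())

water = {"H": (0.1119, 1, 1.008), "O": (0.8881, 8, 15.999)}
print(f"{electron_density(1.0, water):.3e}")  # ~3.34e23 electrons/cm^3
```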

  11. Visual Narrative Research Methods as Performance in Industrial Design Education

    ERIC Educational Resources Information Center

    Campbell, Laurel H.; McDonagh, Deana

    2009-01-01

    This article discusses teaching empathic research methodology as performance. The authors describe their collaboration in an activity to help undergraduate industrial design students learn empathy for others when designing products for use by diverse or underrepresented people. The authors propose that an industrial design curriculum would benefit…

  12. Epidemiology, Science as Inquiry and Scientific Literacy

    ERIC Educational Resources Information Center

    Kaelin, Mark; Huebner, Wendy

    2003-01-01

    The recent worldwide SARS outbreak has put the science of epidemiology into the headlines once again. Epidemiology is "... the study of the distribution and the determinants of health-related states or events and the application of these methods to the control of health problems" (Gordis 2000). In this context, the authors have developed a…

  13. A New Approach to Comparing Several Equating Methods in the Context of the NEAT Design

    ERIC Educational Resources Information Center

    Sinharay, Sandip; Holland, Paul W.

    2010-01-01

    The nonequivalent groups with anchor test (NEAT) design involves missing data that are missing by design. Three equating methods that can be used with a NEAT design are the frequency estimation equipercentile equating method, the chain equipercentile equating method, and the item-response-theory observed-score-equating method. We suggest an…

  14. Assessing the Epidemiological Data and Management Methods of Body Packers Admitted to a Referral Center in Iran.

    PubMed

    Alipour-Faz, Athena; Shadnia, Shahin; Mirhashemi, Seyyed Hadi; Peyvandi, Maryam; Oroei, Mahbobeh; Shafagh, Omid; Peyvandi, Hassan; Peyvandi, Ali Asghar

    2016-05-01

    The incidence of smuggling and transporting illegal substances by internal concealment, also known as body packing, is on the rise. The clinical approach to such patients has changed significantly over the past 2 decades. However, despite a recorded increase in body packing in general, there are controversies in the management of these patients. We aimed to gather data regarding the demographic characteristics, treatment, and outcome of body packers who were referred to Loghman Hakim Hospital, Tehran, Iran. The data of all body packers admitted to Loghman Hakim Hospital during 2010 to 2014 were evaluated retrospectively. Data regarding the demographic characteristics of the patients, findings of clinical imaging, treatment, and outcome were recorded. In this study, 175 individuals with a mean age of 31 ± 10 years were assessed. The most common concealed substances were crack (37%), crystal (17%), opium (13%), and heroin (6%). According to the results of surgery and imaging (abdominal radiography or computed tomography), the most common place for concealment was the stomach, in 33.3% and 12% of cases, respectively. Imaging findings were normal in 18% of the individuals. Forty-eight (27%) patients underwent surgery. The main indications for surgery were clinical manifestations of toxicity (79%) and obstruction of the gastro-intestinal tract (17%). The most common surgical techniques were laparotomy and gastrotomy (50%). The mean duration of hospitalization was 3.8 ± 4 days. The mortality rate was 3%. Conservative treatment of body packers seems to be the best treatment method. Careful monitoring of the patients for possible signs and symptoms of intoxication and gastro-intestinal obstruction is strongly recommended.

  15. Assessing the Epidemiological Data and Management Methods of Body Packers Admitted to a Referral Center in Iran

    PubMed Central

    Alipour-faz, Athena; Shadnia, Shahin; Mirhashemi, Seyyed Hadi; Peyvandi, Maryam; Oroei, Mahbobeh; Shafagh, Omid; Peyvandi, Hassan; Peyvandi, Ali Asghar

    2016-01-01

    The incidence of smuggling and transporting illegal substances by internal concealment, also known as body packing, is on the rise. The clinical approach to such patients has changed significantly over the past 2 decades. However, despite a recorded increase in body packing in general, there are controversies in the management of these patients. We aimed to gather data regarding the demographic characteristics, treatment, and outcome of body packers who were referred to Loghman Hakim Hospital, Tehran, Iran. The data of all body packers admitted to Loghman Hakim Hospital during 2010 to 2014 were evaluated retrospectively. Data regarding the demographic characteristics of the patients, findings of clinical imaging, treatment, and outcome were recorded. In this study, 175 individuals with a mean age of 31 ± 10 years were assessed. The most common concealed substances were crack (37%), crystal (17%), opium (13%), and heroin (6%). According to the results of surgery and imaging (abdominal radiography or computed tomography), the most common place for concealment was the stomach, in 33.3% and 12% of cases, respectively. Imaging findings were normal in 18% of the individuals. Forty-eight (27%) patients underwent surgery. The main indications for surgery were clinical manifestations of toxicity (79%) and obstruction of the gastro-intestinal tract (17%). The most common surgical techniques were laparotomy and gastrotomy (50%). The mean duration of hospitalization was 3.8 ± 4 days. The mortality rate was 3%. Conservative treatment of body packers seems to be the best treatment method. Careful monitoring of the patients for possible signs and symptoms of intoxication and gastro-intestinal obstruction is strongly recommended. PMID:27175693

  16. Pseudo-Sibship Methods in the Case-Parents Design

    PubMed Central

    Yu, Zhaoxia; Deng, Li

    2013-01-01

    Recent evidence suggests that complex traits are likely determined by multiple loci, each of which contributes a weak to moderate individual effect. Although extensive literature exists on multi-locus analysis of unrelated subjects, there are relatively few strategies for jointly analyzing multiple loci using family data. Here we address this issue by evaluating two pseudo-sibship methods: the 1:1 matching, which matches each affected offspring to the pseudo sibling formed by the alleles not transmitted to the affected offspring; and the exhaustive matching, which matches each affected offspring to the pseudo siblings formed by all the other possible combinations of parental alleles. We prove that the two matching strategies use exactly and approximately the same amount of information from data under additive and multiplicative genetic models, respectively. Using numerical calculations under a variety of models and testing assumptions, we show that compared to the exhaustive matching, the 1:1 matching has comparable asymptotic power in detecting multiplicative/additive effects in single-locus analysis and main effects in multi-locus analysis, and it allows association testing of multiple linked loci. These results pave the way for many existing multi-locus analysis methods developed for the case-control (or matched case-control) design to be applied to case-parents data with minor modifications. As an example, with the 1:1 matching, we applied an L1 regularized regression to a Crohn’s disease dataset. Using the multiple loci selected by our approach, we obtained an order-of-magnitude decrease in p-value and an 18.9% increase in prediction accuracy compared with using the most significant individual locus. PMID:21953439
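
The 1:1 matching described above can be sketched in a few lines. This is not the authors' code or data: it simulates transmitted (case) and non-transmitted (pseudo-sibling) allele counts and fits an ordinary L1-penalized logistic regression (scikit-learn assumed) as a simplified stand-in for the conditional-logistic analysis that would strictly apply to 1:1 matched pairs.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trios, n_loci = 300, 10

# Simulated allele counts at each locus for the affected offspring and
# its 1:1-matched pseudo sibling (formed from non-transmitted alleles).
cases = rng.integers(0, 3, size=(n_trios, n_loci)).astype(float)
pseudo = rng.integers(0, 3, size=(n_trios, n_loci)).astype(float)
cases[:, 0] += rng.binomial(1, 0.6, n_trios)   # plant a real effect at locus 0

X = np.vstack([cases, pseudo])
y = np.concatenate([np.ones(n_trios), np.zeros(n_trios)])

# Joint multi-locus analysis with an L1 penalty, which shrinks the
# coefficients of uninformative loci toward zero.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
model.fit(X, y)
coef = model.coef_.ravel()
```

With sparse true effects, most entries of `coef` are driven to exactly zero, which is what makes the L1 approach attractive for selecting loci jointly rather than testing them one at a time.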

  17. Education in epidemiology: "The Times They Are a-Changin'".

    PubMed

    Samet, Jonathan M; Savitz, David A

    2008-03-01

    "The Changing Face of Epidemiology" is a series of symposia sponsored by Epidemiology for the purpose of addressing topical issues that cut across specialty areas. We comment here on 3 papers presented last summer at a symposium ("Education in Epidemiology: Changing needs for changing times") at the 2007 meeting of the Society for Epidemiologic Research. These papers address current challenges in training epidemiologists, including the rise of molecular epidemiology, the ongoing need to redefine core epidemiologic methods and develop optimum approaches, and the increasing difficulty of assuring competency in primary data collection. We offer suggestions for educational programs and professional organizations on approaching these ongoing challenges.

  18. Cancer Epidemiology Matters Blog

    Cancer.gov

    The Cancer Epidemiology Matters blog helps foster a dialogue between the National Cancer Institute's (NCI) Epidemiology and Genomics Research Program (EGRP), extramural researchers, and other individuals, such as clinicians, community partners, and advocates, who are interested in cancer epidemiology and genomics.

  19. A decision-based perspective for the design of methods for systems design

    NASA Technical Reports Server (NTRS)

    Mistree, Farrokh; Muster, Douglas; Shupe, Jon A.; Allen, Janet K.

    1989-01-01

    Organization of material, a definition of decision based design, a hierarchy of decision based design, the decision support problem technique, a conceptual model design that can be manufactured and maintained, meta-design, computer-based design, action learning, and the characteristics of decisions are among the topics covered.

  20. Design and methods of the national Vietnam veterans longitudinal study.

    PubMed

    Schlenger, William E; Corry, Nida H; Kulka, Richard A; Williams, Christianna S; Henn-Haase, Clare; Marmar, Charles R

    2015-09-01

    The National Vietnam Veterans Longitudinal Study (NVVLS) is the second assessment of a representative cohort of US veterans who served during the Vietnam War era, either in Vietnam or elsewhere. The cohort was initially surveyed in the National Vietnam Veterans Readjustment Study (NVVRS) from 1984 to 1988 to assess the prevalence, incidence, and effects of post-traumatic stress disorder (PTSD) and other post-war problems. The NVVLS sought to re-interview the cohort to assess the long-term course of PTSD. NVVLS data collection began July 3, 2012 and ended May 17, 2013, comprising three components: a mailed health questionnaire, a telephone health survey interview, and, for a probability sample of theater veterans, a clinical diagnostic telephone interview administered by licensed psychologists. Excluding decedents, 78.8% completed the questionnaire and/or telephone survey, and 55.0% of selected living veterans participated in the clinical interview. This report provides a description of the NVVLS design and methods. Together, the NVVRS and NVVLS constitute a nationally representative longitudinal study of Vietnam veterans, and extend the NVVRS as a critical resource for scientific and policy analyses for Vietnam veterans, with policy relevance for Iraq and Afghanistan veterans.

  1. An entropy method for floodplain monitoring network design

    NASA Astrophysics Data System (ADS)

    Ridolfi, E.; Yan, K.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.; Russo, F.; Bates, Paul D.

    2012-09-01

    In recent years an increasing number of flood-related fatalities has highlighted the necessity of improving flood risk management to reduce human and economic losses. In this framework, monitoring of flood-prone areas is a key factor for building a resilient environment. In this paper a method for designing a floodplain monitoring network is presented. A redundant network of cheap wireless sensors (GridStix) measuring water depth is considered over a reach of the River Dee (UK), with sensors placed both in the channel and in the floodplain. Through a Three Objective Optimization Problem (TOOP) the best layouts of sensors are evaluated, minimizing their redundancy, maximizing their joint information content and maximizing the accuracy of the observations. A simple raster-based inundation model (LISFLOOD-FP) is used to generate a synthetic GridStix data set of water stages. The Digital Elevation Model (DEM) that is used for hydraulic model building is the globally and freely available SRTM DEM.
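
A minimal sketch of the information-content side of the problem: greedily pick the sensor whose inclusion adds the most joint entropy, i.e. the least redundant information. The stage series below are synthetic stand-ins for LISFLOOD-FP output, and the single greedy objective is a simplification of the paper's three-objective optimization.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic water-stage series at 6 candidate sensor sites (a stand-in
# for LISFLOOD-FP output); a shared signal makes the sites redundant.
signal = rng.normal(size=500)
stages = np.column_stack([signal + rng.normal(scale=0.3 + 0.2 * i, size=500)
                          for i in range(6)])

def joint_entropy(cols, bins=8):
    """Shannon entropy (bits) of the discretised joint distribution."""
    digitised = [np.digitize(stages[:, c],
                             np.histogram_bin_edges(stages[:, c], bins))
                 for c in cols]
    _, counts = np.unique(np.column_stack(digitised), axis=0,
                          return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Greedy layout: repeatedly add the sensor that raises joint entropy
# most, i.e. contributes the most non-redundant information.
selected, remaining = [], list(range(6))
for _ in range(3):
    best = max(remaining, key=lambda c: joint_entropy(selected + [c]))
    selected.append(best)
    remaining.remove(best)
```

Because joint entropy is monotone, each added sensor can only add information; the greedy criterion simply prefers the sensor that adds the most.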

  2. The design method of a dam on gravel stream

    SciTech Connect

    Ni, W.B.; Wu, S.J.; Huang, C.Y.

    1995-12-31

    Due to the intense demand for electricity and water supply in the past decades, a large number of dams, reservoirs and mobile barrages have been completed in Taiwan. These hydraulic structures occupy almost all of the sound rock foundations with little overburden, which means that future structures will have to cope with deep overburden. Special considerations should be taken to overcome the difficulties of water-tightness requirements and structural stability. A case study is presented in this paper: a dam built for hydropower generation and water supply, constructed on a gravel stream with 40 m of overburden. The design method of this dam is discussed. Curtain grouting is performed to reduce the high permeability of the gravel to an acceptable level. Caissons are chosen as the structural foundations in this case study to support the heavy loads of the dam and to reduce the difficulty of curtain grouting. Another problem for a dam built on a gravel stream is abrasion and erosion damage to the stilling basin slabs, the sluiceway aprons and the spillway aprons. Discussions on abrasion-erosion resistant materials are also given in this paper.

  3. Applications of numerical optimization methods to helicopter design problems: A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey is presented of applications of mathematical programming methods used to improve the design of helicopters and their components. Applications of multivariable search techniques in the finite dimensional space are considered. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  4. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey is presented of applications of mathematical programming methods used to improve the design of helicopters and their components. Applications of multivariable search techniques in the finite dimensional space are considered. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  5. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1985-01-01

    A survey is presented of applications of mathematical programming methods used to improve the design of helicopters and their components. Applications of multivariable search techniques in the finite dimensional space are considered. Five categories of helicopter design problems are considered: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  6. Active cooling design for scramjet engines using optimization methods

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Martin, Carl J.; Lucas, Stephen H.

    1988-01-01

    A methodology for using optimization in designing metallic cooling jackets for scramjet engines is presented. The optimal design minimizes the required coolant flow rate subject to temperature, mechanical-stress, and thermal-fatigue-life constraints on the cooling-jacket panels, and Mach-number and pressure constraints on the coolant exiting the panel. The analytical basis for the methodology is presented, and results for the optimal design of panels are shown to demonstrate its utility.

  7. Active cooling design for scramjet engines using optimization methods

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Martin, Carl J.; Lucas, Stephen H.

    1988-01-01

    A methodology for using optimization in designing metallic cooling jackets for scramjet engines is presented. The optimal design minimizes the required coolant flow rate subject to temperature, mechanical-stress, and thermal-fatigue-life constraints on the cooling-jacket panels, and Mach-number and pressure constraints on the coolant exiting the panel. The analytical basis for the methodology is presented, and results for the optimal design of panels are shown to demonstrate its utility.
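
The optimization problem described, minimizing coolant flow rate under temperature and stress constraints, can be sketched with a generic constrained optimizer. The response functions below are toy stand-ins, not the paper's thermal or structural models (SciPy assumed):

```python
from scipy.optimize import minimize

T_MAX, S_MAX = 900.0, 250.0          # allowable wall temperature / stress

def temp(w):                          # toy wall-temperature model, K
    return 600.0 + 800.0 / w

def stress(w):                        # toy panel-stress model, MPa
    return 100.0 + 400.0 / w

# Minimise the coolant flow rate w subject to the two constraints.
res = minimize(lambda x: x[0], x0=[5.0], method="SLSQP",
               bounds=[(0.5, 50.0)],
               constraints=[
                   {"type": "ineq", "fun": lambda x: T_MAX - temp(x[0])},
                   {"type": "ineq", "fun": lambda x: S_MAX - stress(x[0])},
               ])
w_opt = res.x[0]                      # flow is reduced until a constraint binds
```

The optimizer drives the flow rate down until one (here, both) of the constraints becomes active, which is exactly the behaviour the abstract describes for the real design problem.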

  8. Inside multi-disciplinary design in medical informatics: experiences from the use of an argumentative design method.

    PubMed

    Sjøberg, C; Timpka, T

    1995-01-01

    This paper reports on a qualitative study using an argumentation-based design method (Argumentative Design) in the development of clinical software systems. The method, which requires visualization of the underlying design goals, the specific needs-for-change, and the probable consequences of the alternative design measures, caused previously implicit argument structures to be exposed and discussed. This uncovering of hidden agendas also revealed previously implicit coalitions and organizational influences on the design process. Implications for software development practices in medical informatics are discussed.

  9. Overview: Applications of numerical optimization methods to helicopter design problems

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    There are a number of helicopter design problems that are well suited to applications of numerical design optimization techniques. Adequate implementation of this technology will provide high pay-offs. There are a number of numerical optimization programs available, and there are many excellent response/performance analysis programs developed or being developed. But integration of these programs in a form that is usable in the design phase should be recognized as important. It is also necessary to attract the attention of engineers engaged in the development of analysis capabilities and to make them aware that analysis capabilities are much more powerful if integrated into design oriented codes. Frequently, the shortcomings of analysis capabilities are revealed by coupling them with an optimization code. Most of the published work has addressed problems in preliminary system design, rotor system/blade design or airframe design. Very few published results were found in acoustics, aerodynamics and control system design. Currently major efforts are focused on vibration reduction, and aerodynamics/acoustics applications appear to be growing fast. The development of a computer program system to integrate the multiple disciplines required in helicopter design with numerical optimization techniques is needed. Activities in Britain, Germany and Poland are identified, but no published results from France, Italy, the USSR or Japan were found.

  10. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  11. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
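
The "many interacting simple models" idea both versions of the paper point to can be illustrated with an elementary cellular automaton: each cell updates from purely local information, so in principle every cell can be updated on its own processor. A minimal sketch (the rule number and grid size are arbitrary choices, not from the paper):

```python
import numpy as np

def step(cells, rule=110):
    """One update of an elementary cellular automaton: each cell's next
    state depends only on itself and its two neighbours (periodic ends)."""
    left, right = np.roll(cells, 1), np.roll(cells, -1)
    code = 4 * left + 2 * cells + right        # neighbourhood code 0..7
    table = (rule >> np.arange(8)) & 1         # the rule's output bits
    return table[code]

# One seeded cell; every update is purely local, yet complex global
# structure emerges -- the intrinsically parallel style of computation
# the paper anticipates for massively parallel hardware.
cells = np.zeros(64, dtype=int)
cells[32] = 1
history = [cells.copy()]
for _ in range(30):
    cells = step(cells)
    history.append(cells)
```

Because the update touches only a three-cell neighbourhood, its cost per step scales linearly with the number of processors available, which is the scaling property the paper asks of new algorithms.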

  12. Epidemiology of gliomas.

    PubMed

    Ostrom, Quinn T; Gittleman, Haley; Stetson, Lindsay; Virk, Selene M; Barnholtz-Sloan, Jill S

    2015-01-01

    Gliomas are the most common type of primary intracranial tumors. Some glioma subtypes cause significant mortality and morbidity that are disproportionate to their relatively rare incidence. A very small proportion of glioma cases can be attributed to inherited genetic disorders. Many potential risk factors for glioma have been studied to date, but few provide explanation for the number of brain tumors identified. The most significant of these factors includes increased risk due to exposure to ionizing radiation, and decreased risk with history of allergy or atopic disease. The potential effect of exposure to cellular phones has been studied extensively, but the results remain inconclusive. Recent genomic analyses, using the genome-wide association study (GWAS) design, have identified several inherited risk variants that are associated with increased glioma risk. The following chapter provides an overview of the current state of research in the epidemiology of intracranial glioma.

  13. Stillbirth Collaborative Research Network: design, methods and recruitment experience.

    PubMed

    Parker, Corette B; Hogue, Carol J R; Koch, Matthew A; Willinger, Marian; Reddy, Uma M; Thorsten, Vanessa R; Dudley, Donald J; Silver, Robert M; Coustan, Donald; Saade, George R; Conway, Deborah; Varner, Michael W; Stoll, Barbara; Pinar, Halit; Bukowski, Radek; Carpenter, Marshall; Goldenberg, Robert

    2011-09-01

    The Stillbirth Collaborative Research Network (SCRN) has conducted a multisite, population-based, case-control study, with prospective enrollment of stillbirths and livebirths at the time of delivery. This paper describes the general design, methods and recruitment experience. The SCRN attempted to enroll all stillbirths and a representative sample of livebirths occurring to residents of pre-defined geographical catchment areas delivering at 59 hospitals associated with five clinical sites. Livebirths <32 weeks gestation and women of African descent were oversampled. The recruitment hospitals were chosen to ensure access to at least 90% of all stillbirths and livebirths to residents of the catchment areas. Participants underwent a standardised protocol including maternal interview, medical record abstraction, placental pathology, biospecimen testing and, in stillbirths, post-mortem examination. Recruitment began in March 2006 and was completed in September 2008 with 663 women with a stillbirth and 1932 women with a livebirth enrolled, representing 69% and 63%, respectively, of the women identified. Additional surveillance for stillbirths continued until June 2009 and a follow-up of the case-control study participants was completed in December 2009. Among consenting women, there were high consent rates for the various study components. For the women with stillbirths, 95% agreed to a maternal interview, chart abstraction and a placental pathological examination; 91% of the women with a livebirth agreed to all of these components. Additionally, 84% of the women with stillbirths agreed to a fetal post-mortem examination. This comprehensive study is poised to systematically study a wide range of potential causes of, and risk factors for, stillbirths and to better understand the scope and incidence of the problem.

  14. Stillbirth Collaborative Research Network: Design, Methods and Recruitment Experience

    PubMed Central

    Parker, Corette B.; Hogue, Carol J. Rowland; Koch, Matthew A.; Willinger, Marian; Reddy, Uma; Thorsten, Vanessa R.; Dudley, Donald J.; Silver, Robert M.; Coustan, Donald; Saade, George R.; Conway, Deborah; Varner, Michael W.; Stoll, Barbara; Pinar, Halit; Bukowski, Radek; Carpenter, Marshall; Goldenberg, Robert

    2013-01-01

    SUMMARY The Stillbirth Collaborative Research Network (SCRN) has conducted a multisite, population-based, case-control study, with prospective enrollment of stillbirths and live births at the time of delivery. This paper describes the general design, methods, and recruitment experience. The SCRN attempted to enroll all stillbirths and a representative sample of live births occurring to residents of pre-defined geographic catchment areas delivering at 59 hospitals associated with five clinical sites. Live births <32 weeks gestation and women of African descent were oversampled. The recruitment hospitals were chosen to ensure access to at least 90% of all stillbirths and live births to residents of the catchment areas. Participants underwent a standardized protocol including maternal interview, medical record abstraction, placental pathology, biospecimen testing, and, in stillbirths, postmortem examination. Recruitment began in March 2006 and was completed in September 2008 with 663 women with a stillbirth and 1932 women with a live birth enrolled, representing 69% and 63%, respectively, of the women identified. Additional surveillance for stillbirth continued through June 2009 and a follow-up of the case-control study participants was completed in December 2009. Among consenting women, there were high consent rates for the various study components. For the women with stillbirth, 95% agreed to maternal interview, chart abstraction, and placental pathologic examination; 91% of the women with live birth agreed to all of these components. Additionally, 84% of the women with stillbirth agreed to a fetal postmortem examination. This comprehensive study is poised to systematically study a wide range of potential causes of, and risk factors for, stillbirth and to better understand the scope and incidence of the problem. PMID:21819424

  15. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  16. Analysis and Design Methods for Nonlinear Control Systems

    DTIC Science & Technology

    1990-03-01

    entitled "Design of Nonlinear PID Controllers." In this paper it is demonstrated that the extended linearization approach can be applied to standard... Sciences and Systems, Baltimore, Maryland, pp. 675-680, 1987. [3] W.J. Rugh, "Design of Nonlinear PID Controllers," AIChE Journal, Vol. 33, No. 10, pp. 1738

  17. Preliminary design of pseudo satellites: Basic methods and feasibility criteria

    NASA Astrophysics Data System (ADS)

    Klimenko, N. N.

    2016-12-01

    Analytical models of weight and energy balances, aerodynamic models, and solar irradiance models to perform pseudo-satellite preliminary design are presented. Feasibility criteria are determined in accordance with the aim of preliminary design dependent on mission scenario and type of payload.

  18. Teaching Improvement Model Designed with DEA Method and Management Matrix

    ERIC Educational Resources Information Center

    Montoneri, Bernard

    2014-01-01

    This study uses student evaluations of teachers to design a teaching improvement matrix based on teaching efficiency and performance, combining a management matrix with data envelopment analysis. The matrix is designed to formulate suggestions for improving teaching. The research sample consists of 42 classes of freshmen following a course of English…
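
The DEA side of such a matrix can be sketched as a standard input-oriented CCR model solved by linear programming. The class data below are invented for illustration, and the single input/output pair is a simplification of the paper's student-evaluation measures (SciPy assumed):

```python
import numpy as np
from scipy.optimize import linprog

# Invented data: 4 teaching units, one input (contact hours) and one
# output (mean student-evaluation score).
X = np.array([[20.0], [25.0], [30.0], [20.0]])   # inputs, shape (n, m)
Y = np.array([[80.0], [90.0], [96.0], [60.0]])   # outputs, shape (n, s)

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of unit o (1.0 = efficient)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                  # minimise theta over (theta, lambda)
    A_ub, b_ub = [], []
    for i in range(m):                           # inputs: lambda @ X[:, i] <= theta * X[o, i]
        A_ub.append(np.r_[-X[o, i], X[:, i]])
        b_ub.append(0.0)
    for r in range(s):                           # outputs: lambda @ Y[:, r] >= Y[o, r]
        A_ub.append(np.r_[0.0, -Y[:, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0.0, None)] * (n + 1))
    return res.fun

scores = [ccr_efficiency(o) for o in range(len(X))]
```

A unit scoring 1.0 lies on the efficient frontier; a score below 1.0 says by how much its input could be shrunk while a convex mix of peers still matches its output, which is the kind of per-class signal a teaching improvement matrix can act on.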

  19. Developing Baby Bag Design by Using Kansei Engineering Method

    NASA Astrophysics Data System (ADS)

    Janari, D.; Rakhmawati, A.

    2016-01-01

    Consumer preferences and market demand are essential factors for a product's success. Thus, to achieve success, a product should have a design that fulfills consumers' expectations. The purpose of this research is to develop a baby bag product as stipulated by Kansei. The Kansei words obtained are: neat, unique, comfortable, safe, modern, gentle, elegant, antique, attractive, simple, spacious, creative, colorful, durable, stylish, smooth and strong. The identified significance of correlation for the durable attribute is 0.000 < 0.005, which means it is significant for the baby's bag, while the regression coefficient value is 0.812 < 0.005, which means the durable attribute is insignificant for the baby's bag. The final baby bag design, selected on the basis of questionnaire 3, combines elements of all the designs. The space for clothes, diaper space, shoulder grip, side grip, bottle-heater pocket and bottle pocket are derived from design 1. The top grip, space for clothes, shoulder grip, and side grip are derived from design 2. Other elements taken are the space for clothes from design 3, and the diaper space and clothes space from design 4.

  20. Methods for Reachability-based Hybrid Controller Design

    DTIC Science & Technology

    2012-05-10

    Motivated by application scenarios arising in autonomous vehicle control and air traffic management, we provide several design techniques and synthesis algorithms for deterministic reachability problems.

  1. ADHD in the Arab World: A Review of Epidemiologic Studies

    ERIC Educational Resources Information Center

    Farah, Lynn G.; Fayyad, John A.; Eapen, Valsamma; Cassir,Youmna; Salamoun, Mariana M.; Tabet, Caroline C.; Mneimneh, Zeina N.; Karam, Elie G.

    2009-01-01

    Objective: Epidemiological studies on psychiatric disorders are quite rare in the Arab World. This article reviews epidemiological studies on ADHD in all the Arab countries. Method: All epidemiological studies on ADHD conducted from 1966 through the present were reviewed. Samples were drawn from the general community, primary care clinical…

  2. Categorisation of visualisation methods to support the design of Human-Computer Interaction Systems.

    PubMed

    Li, Katie; Tiwari, Ashutosh; Alcock, Jeffrey; Bermell-Garcia, Pablo

    2016-07-01

    During the design of Human-Computer Interaction (HCI) systems, the creation of visual artefacts forms an important part of design. On the one hand, producing a visual artefact has a number of advantages: it helps designers to externalise their thought and acts as a common language between different stakeholders. On the other hand, if an inappropriate visualisation method is employed it could hinder the design process. To support the design of HCI systems, this paper reviews the categorisation of visualisation methods used in HCI. A keyword search is conducted to identify a) current HCI design methods, and b) approaches to selecting these methods. The resulting design methods are filtered to create a list of just visualisation methods. These are then categorised using the approaches identified in (b). As a result, 23 HCI visualisation methods are identified and categorised into 5 selection approaches (The Recipient, Primary Purpose, Visual Archetype, Interaction Type, and The Design Process).

  3. Development of Combinatorial Methods for Alloy Design and Optimization

    SciTech Connect

    Pharr, George M.; George, Easo P.; Santella, Michael L

    2005-07-01

    The primary goal of this research was to develop a comprehensive methodology for designing and optimizing metallic alloys by combinatorial principles. Because conventional techniques for alloy preparation are unavoidably restrictive in the range of alloy composition that can be examined, combinatorial methods promise to significantly reduce the time, energy, and expense needed for alloy design. Combinatorial methods can be developed not only to optimize existing alloys, but to explore and develop new ones as well. The scientific approach involved fabricating an alloy specimen with a continuous distribution of binary and ternary alloy compositions across its surface--an ''alloy library''--and then using spatially resolved probing techniques to characterize its structure, composition, and relevant properties. The three specific objectives of the project were: (1) to devise means by which simple test specimens with a library of alloy compositions spanning the range of interest can be produced; (2) to assess how well the properties of the combinatorial specimen reproduce those of the conventionally processed alloys; and (3) to devise screening tools which can be used to rapidly assess the important properties of the alloys. As proof of principle, the methodology was applied to the Fe-Ni-Cr ternary alloy system that constitutes many commercially important materials such as stainless steels and the H-series and C-series heat and corrosion resistant casting alloys. Three different techniques were developed for making alloy libraries: (1) vapor deposition of discrete thin films on an appropriate substrate and then alloying them together by solid-state diffusion; (2) co-deposition of the alloying elements from three separate magnetron sputtering sources onto an inert substrate; and (3) localized melting of thin films with a focused electron-beam welding system. Each of the techniques was found to have its own advantages and disadvantages.
A new and very powerful technique for

  4. A combinational approach of multilocus sequence typing and other molecular typing methods in unravelling the epidemiology of Erysipelothrix rhusiopathiae strains from poultry and mammals.

    PubMed

    Janßen, Traute; Voss, Matthias; Kühl, Michael; Semmler, Torsten; Philipp, Hans-Christian; Ewers, Christa

    2015-07-21

    Erysipelothrix rhusiopathiae infections re-emerged as a matter of great concern particularly in the poultry industry. In contrast to porcine isolates, molecular epidemiological traits of avian E. rhusiopathiae isolates are less well known. Thus, we aimed to (i) develop a multilocus sequence typing (MLST) scheme for E. rhusiopathiae, (ii) study the congruence of strain grouping based on pulsed-field gel electrophoresis (PFGE) and MLST, (iii) determine the diversity of the dominant immunogenic protein SpaA, and (iv) examine the distribution of genes putatively linked with virulence among field isolates from poultry (120), swine (24) and other hosts (21), including humans (3). Using seven housekeeping genes for MLST analysis we determined 72 sequence types (STs) among 165 isolates. This indicated an overall high diversity, though 34.5% of all isolates belonged to a single predominant ST-complex, STC9, which grouped strains from birds and mammals, including humans, together. PFGE revealed 58 different clusters and congruence with the sequence-based MLST-method was not common. Based on polymorphisms in the N-terminal hyper-variable region of SpaA the isolates were classified into five groups, which followed the phylogenetic background of the strains. More than 90% of the isolates harboured all 16 putative virulence genes tested and only intI, encoding an internalin-like protein, showed infrequent distribution. MLST data determined E. rhusiopathiae as weakly clonal species with limited host specificity. A common evolutionary origin of isolates as well as shared SpaA variants and virulence genotypes obtained from avian and mammalian hosts indicates common reservoirs, pathogenic pathways and immunogenic properties of the pathogen.
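
The core MLST bookkeeping, assigning a sequence type from a seven-locus allele profile, reduces to a lookup table. The allele numbers and ST labels below are invented; real schemes assign allele numbers from exact gene-sequence matches against a curated database:

```python
# Invented seven-locus allele profiles mapped to sequence types (STs).
st_db = {
    (1, 1, 2, 1, 3, 1, 1): "ST9",
    (1, 2, 2, 1, 3, 1, 1): "ST12",
    (2, 2, 2, 1, 3, 1, 1): "ST45",
}

def assign_st(profile):
    """Return the ST for a 7-locus allele profile, or flag a novel one."""
    return st_db.get(tuple(profile), "novel ST")

def locus_differences(st_a, st_b):
    """Count differing loci; closely related STs (few differences) are
    typically grouped into the same clonal complex, such as STC9."""
    return sum(a != b for a, b in zip(st_a, st_b))
```

Grouping STs that differ at only one or two loci is what produces ST-complexes like the STC9 cluster that dominated the isolates in this study.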

  5. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

    The methodology of health risk analysis is, at the present stage of development of Russian society, in demand at all levels of state management. Combined with methods of mathematical modeling, spatial-temporal analysis and economic tools, risk assessment in situation analysis makes it possible to determine the level of safety of the population, workers and consumers, and to select priority resources and threat factors as points for applying effort. At the planning stage, risk assessment is the basis for establishing the most effective measures for minimizing hazards and dangers. At the implementation stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision phase, it permits priorities to be selected so that efforts are concentrated on the objects posing the maximal health risk to the population. Risk assessments, including elements of evolutionary modeling, are incorporated in the system of state hygienic regulation, the formation of the evidence base of harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products and services. Further tasks include: enforcement of the methodology of health risk analysis in the field of assurance of sanitary and epidemiological well-being and workers' health; the development of an informational and analytical base establishing "exposure-response" models for different types and levels of exposure and risk contingents; enhancing the accuracy of exposure estimates; and improvement of the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens.

  6. A new method for designing dual foil electron beam forming systems. I. Introduction, concept of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor-intensive task, as corrections to account for the effects not included in the analytical models have to be calculated separately and accounted for in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry and using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of system performance as a function of the foil parameters. The new method, while being computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real life design problem, as described in Part II of this work.
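The systematic scan described above amounts to evaluating every candidate foil pairing with the Monte Carlo model and keeping the best. A minimal sketch follows; everything in it is illustrative: `beam_flatness` is a smooth surrogate standing in for a full Monte Carlo transport run, and the thickness grids are invented.

```python
import itertools

def beam_flatness(t1, t2):
    # Stand-in for the figure of merit a Monte Carlo transport simulation
    # would return for a given foil pair; a smooth surrogate keeps this runnable.
    return (t1 - 0.3) ** 2 + 2.0 * (t2 - 1.1) ** 2

def scan_foils(primary_grid, secondary_grid):
    """Systematic scan: evaluate every foil pairing, keep the best-performing one."""
    return min(itertools.product(primary_grid, secondary_grid),
               key=lambda pair: beam_flatness(*pair))

primary = [0.1, 0.2, 0.3, 0.4]      # hypothetical primary foil thicknesses (mm)
secondary = [0.9, 1.0, 1.1, 1.2]    # hypothetical secondary foil thicknesses (mm)
print(scan_foils(primary, secondary))  # -> (0.3, 1.1)
```

Because each grid point is independent, the expensive Monte Carlo evaluations parallelize trivially, which is what makes the computationally intensive approach practical.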

  7. Application of optimization methods to helicopter rotor blade design

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, A.; Walsh, J. L.

    1990-01-01

    A procedure for the minimum weight design of helicopter rotor blades with constraints on multiple coupled flap-lag natural frequencies, autorotational inertia, and centrifugal stress is presented. Optimum designs are obtained for blades with both rectangular and tapered planforms and are compared with a reference blade. The effects of higher-frequency constraints and stress constraints on the optimum blade designs are assessed. The results indicate that there is an increase in blade weight and a significant change in the design variable distributions with an increase in the number of frequency constraints. The inclusion of stress constraints has different effects on the wall thickness distributions of rectangular and tapered blades, but tends to increase the magnitude of the nonstructural segment weight distributions for both blade types.
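The constrained-minimization structure of such a design problem can be sketched with a one-variable toy: minimize weight over wall thickness subject to a lower bound on a flap frequency. The surrogate weight and frequency functions below are invented for illustration and bear no relation to the actual blade models of the paper.

```python
def blade_weight(t):
    """Surrogate: blade weight grows linearly with wall thickness (arbitrary units)."""
    return 10.0 * t

def flap_frequency(t):
    """Surrogate: a stiffer (thicker) wall raises the coupled flap-lag frequency."""
    return 4.0 * t ** 0.5

def min_weight_design(thicknesses, f_min):
    """Lightest wall thickness whose frequency stays above the constraint."""
    feasible = [t for t in thicknesses if flap_frequency(t) >= f_min]
    return min(feasible, key=blade_weight) if feasible else None

grid = [i / 20 for i in range(2, 21)]          # candidate thicknesses 0.10 .. 1.00
print(min_weight_design(grid, f_min=2.0))      # -> 0.25
```

Raising `f_min` shrinks the feasible set and forces a heavier design, mirroring the paper's finding that adding frequency constraints increases blade weight.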

  8. Third order TRANSPORT with MAD (Methodical Accelerator Design) input

    SciTech Connect

    Carey, D.C.

    1988-09-20

    This paper describes computer-aided design codes for particle accelerators. Among the topics discussed are: input beam description; parameters and algebraic expressions; the physical elements; beam lines; operations; and third-order transfer matrix. (LSP)

  9. An On-Board Diagnosis Logic and Its Design Method

    NASA Astrophysics Data System (ADS)

    Hiratsuka, Satoshi; Fusaoka, Akira

    In this paper, we propose a design methodology for the on-board diagnosis engine of embedded systems. A Boolean function for the diagnosis circuit can be derived mechanically from the system dynamics, given as a linear differential equation, provided the system is observable and the relation between the set of abnormal physical parameters and the faulty part is given. The diagnosis circuit is small enough to be implemented in an FPGA or fabricated as a simple chip.
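The observability precondition the abstract relies on has a standard mechanical check (the Kalman rank test): stack C, CA, ..., CA^(n-1) and verify full rank. A self-contained sketch with a hypothetical two-state system, not taken from the paper:

```python
def mat_mult(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def rank(M, tol=1e-9):
    """Row-reduction rank of a small dense matrix."""
    M = [row[:] for row in M]
    r = 0
    for col in range(len(M[0])):
        piv = next((i for i in range(r, len(M)) if abs(M[i][col]) > tol), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(len(M)):
            if i != r:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def observable(A, C):
    """Kalman test: stack C, CA, ..., CA^(n-1) and check for full rank n."""
    n = len(A)
    rows, block = [], [row[:] for row in C]
    for _ in range(n):
        rows.extend(block)
        block = mat_mult(block, A)
    return rank(rows) == n

# a hypothetical 2-state system observed through its first state only
A = [[0.0, 1.0], [-2.0, -3.0]]
C = [[1.0, 0.0]]
print(observable(A, C))  # True
```

Only when this test passes can the abnormal-parameter/faulty-part relation be compiled into the diagnosis Boolean function the paper describes.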

  10. Development of panel methods for subsonic analysis and design

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1980-01-01

    Two computer programs, developed for subsonic inviscid analysis and design, are described. The first solves arbitrary mixed analysis/design problems for multielement airfoils in two dimensional flow. The second calculates the pressure distribution for arbitrary lifting or nonlifting three dimensional configurations. In each program, inviscid flow is modelled by using distributed source-doublet singularities on configuration surface panels. Numerical formulations and representative solutions are presented for the programs.
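Panel methods build the flow field by superposing elementary singularities of exactly the kind distributed over the surface panels here. The classic closed-form instance of that superposition, a freestream plus a single 2-D doublet reproducing flow past a unit cylinder, makes a compact sketch (this is the elemental idea only, not the programs described in the abstract):

```python
def velocity(x, y, U=1.0, kappa=1.0):
    """Freestream plus a 2-D doublet at the origin.

    Panel methods distribute source/doublet singularities like this one over
    surface panels; a single doublet already reproduces flow past a cylinder
    of radius sqrt(kappa / U).
    """
    r2 = x * x + y * y
    u = U + kappa * (y * y - x * x) / (r2 * r2)
    v = -2.0 * kappa * x * y / (r2 * r2)
    return u, v

print(velocity(1.0, 0.0))  # stagnation point at the cylinder nose
print(velocity(0.0, 1.0))  # twice the freestream speed at the crest
```

A full panel code solves for the singularity strengths on each panel so that the summed velocity satisfies flow tangency on the actual configuration surface.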

  11. Antimicrobial Susceptibility of Flavobacterium psychrophilum from Chilean Salmon Farms and Their Epidemiological Cut-Off Values Using Agar Dilution and Disk Diffusion Methods

    PubMed Central

    Miranda, Claudio D.; Smith, Peter; Rojas, Rodrigo; Contreras-Lynch, Sergio; Vega, J. M. Alonso

    2016-01-01

    Flavobacterium psychrophilum is the most important bacterial pathogen for freshwater farmed salmonids in Chile. The aims of this study were to determine the susceptibility to antimicrobials used in fish farming of Chilean isolates and to calculate their epidemiological cut-off (COWT) values. A total of 125 Chilean isolates of F. psychrophilum were isolated from reared salmonids presenting clinical symptoms indicative of flavobacteriosis and their identities were confirmed by 16S rRNA polymerase chain reaction. Susceptibility to antibacterials was tested on diluted Mueller-Hinton agar using an agar dilution MIC method and a disk diffusion method. The COWT values calculated by Normalized Resistance Interpretation (NRI) analysis allow isolates to be categorized either as wild-type fully susceptible (WT) or as manifesting reduced susceptibility (NWT). When MIC data was used, NRI analysis calculated a COWT of ≤0.125, ≤2, and ≤0.5 μg mL(-1) for amoxicillin, florfenicol, and oxytetracycline, respectively. For the quinolones, the COWT were ≤1, ≤0.5, and ≤0.125 μg mL(-1) for oxolinic acid, flumequine, and enrofloxacin, respectively. The disk diffusion data sets obtained in this work were extremely diverse and were spread over a wide range. For the quinolones there was a close agreement between the frequencies of NWT isolates calculated using MIC and disk data. For oxolinic acid, flumequine, and enrofloxacin the frequencies were 45, 39, and 38% using MIC data, and 42, 41, and 44%, when disk data were used. There was less agreement with the other antimicrobials, because NWT frequencies obtained using MIC and disk data, respectively, were 24 and 10% for amoxicillin, 8 and 2% for florfenicol, and 70 and 64% for oxytetracycline. Considering that the MIC data was more precise than the disk diffusion data, MIC determination would be the preferred method for susceptibility testing for this species and the NWT frequencies derived from the MIC data sets should be considered.
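Once a COWT cut-off is fixed, classifying isolates and tallying NWT frequencies is a one-liner per agent. The cut-off values below are the MIC-based COWT values reported in the abstract; the MIC list itself is hypothetical.

```python
# MIC-based COWT values (µg/mL) as reported in the abstract
COWT = {"amoxicillin": 0.125, "florfenicol": 2.0, "oxytetracycline": 0.5,
        "oxolinic acid": 1.0, "flumequine": 0.5, "enrofloxacin": 0.125}

def classify(mic, agent):
    """Wild-type (WT) if the MIC does not exceed the cut-off, else NWT."""
    return "WT" if mic <= COWT[agent] else "NWT"

def nwt_frequency(mics, agent):
    return sum(classify(m, agent) == "NWT" for m in mics) / len(mics)

mics_florfenicol = [0.5, 1.0, 2.0, 4.0, 8.0, 1.0, 0.5, 2.0]  # hypothetical isolates
print(nwt_frequency(mics_florfenicol, "florfenicol"))  # 2 of 8 exceed 2 µg/mL -> 0.25
```

Note that an epidemiological cut-off flags reduced susceptibility relative to the wild-type population; it is not a clinical resistance breakpoint.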

  12. Antimicrobial Susceptibility of Flavobacterium psychrophilum from Chilean Salmon Farms and Their Epidemiological Cut-Off Values Using Agar Dilution and Disk Diffusion Methods.

    PubMed

    Miranda, Claudio D; Smith, Peter; Rojas, Rodrigo; Contreras-Lynch, Sergio; Vega, J M Alonso

    2016-01-01

    Flavobacterium psychrophilum is the most important bacterial pathogen for freshwater farmed salmonids in Chile. The aims of this study were to determine the susceptibility to antimicrobials used in fish farming of Chilean isolates and to calculate their epidemiological cut-off (COWT) values. A total of 125 Chilean isolates of F. psychrophilum were isolated from reared salmonids presenting clinical symptoms indicative of flavobacteriosis and their identities were confirmed by 16S rRNA polymerase chain reaction. Susceptibility to antibacterials was tested on diluted Mueller-Hinton agar using an agar dilution MIC method and a disk diffusion method. The COWT values calculated by Normalized Resistance Interpretation (NRI) analysis allow isolates to be categorized either as wild-type fully susceptible (WT) or as manifesting reduced susceptibility (NWT). When MIC data was used, NRI analysis calculated a COWT of ≤0.125, ≤2, and ≤0.5 μg mL(-1) for amoxicillin, florfenicol, and oxytetracycline, respectively. For the quinolones, the COWT were ≤1, ≤0.5, and ≤0.125 μg mL(-1) for oxolinic acid, flumequine, and enrofloxacin, respectively. The disk diffusion data sets obtained in this work were extremely diverse and were spread over a wide range. For the quinolones there was a close agreement between the frequencies of NWT isolates calculated using MIC and disk data. For oxolinic acid, flumequine, and enrofloxacin the frequencies were 45, 39, and 38% using MIC data, and 42, 41, and 44%, when disk data were used. There was less agreement with the other antimicrobials, because NWT frequencies obtained using MIC and disk data, respectively, were 24 and 10% for amoxicillin, 8 and 2% for florfenicol, and 70 and 64% for oxytetracycline. Considering that the MIC data was more precise than the disk diffusion data, MIC determination would be the preferred method for susceptibility testing for this species and the NWT frequencies derived from the MIC data sets should be considered.

  13. Aircraft design for mission performance using nonlinear multiobjective optimization methods

    NASA Technical Reports Server (NTRS)

    Dovi, Augustine R.; Wrenn, Gregory A.

    1990-01-01

    A new technique which converts a constrained optimization problem to an unconstrained one where conflicting figures of merit may be simultaneously considered was combined with a complex mission analysis system. The method is compared with existing single and multiobjective optimization methods. A primary benefit from this new method for multiobjective optimization is the elimination of separate optimizations for each objective, which is required by some optimization methods. A typical wide body transport aircraft is used for the comparative studies.
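The abstract does not name the composite function used to fold constraints and conflicting objectives into one unconstrained scalar; a technique commonly used for exactly this purpose (and the assumption made here) is the Kreisselmeier-Steinhauser (KS) envelope, a smooth, conservative approximation of the maximum:

```python
import math

def ks(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope: a smooth, conservative stand-in for max().

    Folding all objectives and constraint violations into one such scalar turns
    a constrained multiobjective problem into a single unconstrained minimization.
    """
    m = max(values)  # factored out for numerical stability
    return m + math.log(sum(math.exp(rho * (v - m)) for v in values)) / rho

g = [0.2, -0.1, 0.15]          # hypothetical normalized figures of merit
assert ks(g) >= max(g)         # the envelope bounds the true worst case from above
print(round(ks(g, rho=1000.0), 6))  # -> 0.2 (tightens toward max as rho grows)
```

Because the envelope is differentiable everywhere, a single gradient-based optimizer can drive all figures of merit at once, which is what eliminates the separate per-objective optimizations the abstract mentions.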

  14. A Preliminary Rubric Design to Evaluate Mixed Methods Research

    ERIC Educational Resources Information Center

    Burrows, Timothy J.

    2013-01-01

    With the increase in frequency of the use of mixed methods, both in research publications and in externally funded grants, there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed…

  15. Approximation methods for combined thermal/structural design

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.; Shore, C. P.

    1979-01-01

    Two approximation concepts for combined thermal/structural design are evaluated. The first concept is an approximate thermal analysis based on the first derivatives of structural temperatures with respect to design variables. Two commonly used first-order Taylor series expansions are examined. The direct and reciprocal expansions are special members of a general family of approximations, and for some conditions other members of that family of approximations are more accurate. Several examples are used to compare the accuracy of the different expansions. The second approximation concept is the use of critical time points for combined thermal and stress analyses of structures with transient loading conditions. Significant time savings are realized by identifying critical time points and performing the stress analysis for those points only. The design of an insulated panel which is exposed to transient heating conditions is discussed.
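The direct-versus-reciprocal distinction is easy to see on a one-variable example. For a bar in tension the stress is F/A, so a first-order expansion in the reciprocal variable 1/A is exact while the direct expansion in A is not; the numbers below are illustrative, not from the paper.

```python
F, A0 = 1000.0, 2.0                 # load and nominal cross-sectional area
exact = lambda A: F / A             # stress response
s0, ds_dA = exact(A0), -F / A0**2   # value and derivative at the design point

def direct(A):
    """First-order Taylor expansion in the design variable A."""
    return s0 + ds_dA * (A - A0)

def reciprocal(A):
    """First-order expansion in 1/A (exact here, since stress ~ 1/A)."""
    return s0 - ds_dA * A0**2 * (1.0 / A - 1.0 / A0)

A = 3.0
print(exact(A), direct(A), round(reciprocal(A), 6))
```

This is the sense in which "other members of that family of approximations are more accurate" for some conditions: the best choice of expansion variable depends on how the response actually varies with the design variable.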

  16. A design method for an intuitive web site

    SciTech Connect

    Quinniey, M.L.; Diegert, K.V.; Baca, B.G.; Forsythe, J.C.; Grose, E.

    1999-11-03

    The paper describes a methodology for designing a web site for human factors engineers that is applicable to designing a web site for any group of people. Many web pages on the World Wide Web are not organized in a format that allows a user to find information efficiently. Often the information and hypertext links on web pages are not organized into intuitive groups. Intuition implies that a person is able to use their knowledge of a paradigm to solve a problem. Intuitive groups are categories that allow web page users to find information by using their intuition or mental models of categories. In order to improve human factors engineers' efficiency in finding information on the World Wide Web, research was performed to develop a web site that serves as a tool for finding information effectively. The paper describes a methodology for designing a web site for a group of people who perform similar tasks in an organization.

  17. Statistical Methods for Rapid Aerothermal Analysis and Design Technology

    NASA Technical Reports Server (NTRS)

    Morgan, Carolyn; DePriest, Douglas; Thompson, Richard (Technical Monitor)

    2002-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicle (RLV) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to establish statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The research work was focused on establishing the suitable mathematical/statistical models for these purposes. It is anticipated that the resulting models can be incorporated into a software tool to provide rapid, variable-fidelity, aerothermal environments to predict heating along an arbitrary trajectory. This work will support development of an integrated design tool to perform automated thermal protection system (TPS) sizing and material selection.

  18. A method for designing robust multivariable feedback systems

    NASA Technical Reports Server (NTRS)

    Milich, David Albert; Athans, Michael; Valavani, Lena; Stein, Gunter

    1988-01-01

    A new methodology is developed for the synthesis of linear, time-invariant (LTI) controllers for multivariable LTI systems. The aim is to achieve stability and performance robustness of the feedback system in the presence of multiple unstructured uncertainty blocks; i.e., to satisfy a frequency-domain inequality in terms of the structured singular value. The design technique is referred to as the Causality Recovery Methodology (CRM). Starting with an initial (nominally) stabilizing compensator, the CRM produces a closed-loop system whose performance-robustness is at least as good as, and hopefully superior to, that of the original design. The robustness improvement is obtained by solving an infinite-dimensional, convex optimization program. A finite-dimensional implementation of the CRM was developed, and it was applied to a multivariate design example.

  19. Multiple methods integration for structural mechanics analysis and design

    NASA Technical Reports Server (NTRS)

    Housner, J. M.; Aminpour, M. A.

    1991-01-01

    A new research area of multiple methods integration is proposed for joining diverse methods of structural mechanics analysis which interact with one another. Three categories of multiple methods are defined: those in which a physical interface is well defined; those in which a physical interface is not well defined, but selected; and those in which the interface is a mathematical transformation. Two fundamental integration procedures are presented that can be extended to integrate various methods (e.g., finite elements, Rayleigh-Ritz, Galerkin, and integral methods) with one another. Since the finite element method will likely be the major method to be integrated, its enhanced robustness under element distortion is also examined and a new robust shell element is demonstrated.

  20. Preliminary Axial Flow Turbine Design and Off-Design Performance Analysis Methods for Rotary Wing Aircraft Engines. Part 1; Validation

    NASA Technical Reports Server (NTRS)

    Chen, Shu-cheng, S.

    2009-01-01

    For the preliminary design and the off-design performance analysis of axial flow turbines, a pair of intermediate level-of-fidelity computer codes, TD2-2 (design; reference 1) and AXOD (off-design; reference 2), are being evaluated for use in turbine design and performance prediction of modern high performance aircraft engines. TD2-2 employs a streamline curvature method for design, while AXOD approaches the flow analysis with an equal radius-height domain decomposition strategy. Both methods resolve only the flows in the annulus region while modeling the impact introduced by the blade rows. The mathematical formulations and derivations involved in both methods are documented in references 3 and 4 (for TD2-2) and in reference 5 (for AXOD). The focus of this paper is to discuss the fundamental issues of applicability and compatibility of the two codes as a pair of companion pieces, to perform preliminary design and off-design analysis for modern aircraft engine turbines. Two validation cases for the design and the off-design prediction using TD2-2 and AXOD, conducted on two existing high efficiency turbines developed and tested in the NASA/GE Energy Efficient Engine (GE-E3) Program, the High Pressure Turbine (HPT; two stages, air cooled) and the Low Pressure Turbine (LPT; five stages, uncooled), are provided in support of the analysis and discussion presented in this paper.

  1. Advanced 3D inverse method for designing turbomachine blades

    SciTech Connect

    Dang, T.

    1995-10-01

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbo-machine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  2. [Drug design ideas and methods of Chinese herb prescriptions].

    PubMed

    Ren, Jun-guo; Liu, Jian-xun

    2015-09-01

    New drugs based on Chinese herbal prescriptions are the best carrier for the syndrome differentiation and treatment principles of Chinese medicine, and they constitute the main form of new drug research and development, in which they play a very important role. Although prescriptions come from many sources, whether a prescription can become a new drug depends on its necessity, rationality and scientific basis, which are the keys to development. Aiming at these key issues in prescription design, this article discusses the source, classification and compositional design of new drugs based on Chinese herbal prescriptions, and provides a useful reference for new drug research and development.

  3. Relative risk regression analysis of epidemiologic data.

    PubMed

    Prentice, R L

    1985-11-01

    Relative risk regression methods are described. These methods provide a unified approach to a range of data analysis problems in environmental risk assessment and in the study of disease risk factors more generally. Relative risk regression methods are most readily viewed as an outgrowth of Cox's regression and life model. They can also be viewed as a regression generalization of more classical epidemiologic procedures, such as that due to Mantel and Haenszel. In the context of an epidemiologic cohort study, relative risk regression methods extend conventional survival data methods and binary response (e.g., logistic) regression models by taking explicit account of the time to disease occurrence while allowing arbitrary baseline disease rates, general censorship, and time-varying risk factors. This latter feature is particularly relevant to many environmental risk assessment problems wherein one wishes to relate disease rates at a particular point in time to aspects of a preceding risk factor history. Relative risk regression methods also adapt readily to time-matched case-control studies and to certain less standard designs. The uses of relative risk regression methods are illustrated and the state of development of these procedures is discussed. It is argued that asymptotic partial likelihood estimation techniques are now well developed in the important special case in which the disease rates of interest have interpretations as counting process intensity functions. Estimation of relative risk processes corresponding to disease rates falling outside this class has, however, received limited attention. The general area of relative risk regression model criticism has, as yet, not been thoroughly studied, though a number of statistical groups are studying such features as tests of fit, residuals, diagnostics and graphical procedures. Most such studies have been restricted to exponential form relative risks, as have simulation studies of relative risk estimation.
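The partial likelihood estimation mentioned above can be sketched for the exponential-form relative risk exp(beta * x) of the Cox model: each disease occurrence contributes the subject's risk relative to the total risk of everyone still under observation. The toy cohort below is hypothetical.

```python
import math

def log_partial_likelihood(beta, data):
    """Cox-form relative risk exp(beta * x); data: (time, event, covariate) triples."""
    ll = 0.0
    for t_i, event, x_i in data:
        if not event:
            continue                      # censored records enter only via risk sets
        risk_set = [x for t, _, x in data if t >= t_i]
        ll += beta * x_i - math.log(sum(math.exp(beta * x) for x in risk_set))
    return ll

# hypothetical cohort: (follow-up time, disease occurred?, exposure indicator)
cohort = [(1.0, True, 1), (2.0, False, 0), (3.0, True, 0)]
print(round(log_partial_likelihood(0.0, cohort), 4))  # = -ln 3 at beta = 0
```

Maximizing this function over beta gives the relative risk estimate; note that the arbitrary baseline disease rate never appears, which is exactly the "arbitrary baseline disease rates" feature the abstract highlights.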

  4. Epidemiology, molecular epidemiology and evolution of bovine respiratory syncytial virus.

    PubMed

    Sarmiento-Silva, Rosa Elena; Nakamura-Lopez, Yuko; Vaughan, Gilberto

    2012-11-30

    The bovine respiratory syncytial virus (BRSV) is an enveloped, negative sense, single-stranded RNA virus belonging to the pneumovirus genus within the family Paramyxoviridae. BRSV has been recognized as a major cause of respiratory disease in young calves since the early 1970s. The analysis of BRSV infection was originally hampered by its characteristic lability and poor growth in vitro. However, the advent of numerous immunological and molecular methods has facilitated the study of BRSV enormously. The knowledge gained from these studies has also provided the opportunity to develop safe, stable, attenuated virus vaccine candidates. Nonetheless, many aspects of the epidemiology, molecular epidemiology and evolution of the virus are still not fully understood. The natural course of infection is rather complex and further complicates diagnosis, treatment and the implementation of preventive measures aimed to control the disease. Therefore, understanding the mechanisms by which BRSV is able to establish infection is needed to prevent viral and disease spread. This review discusses important information regarding the epidemiology and molecular epidemiology of BRSV worldwide, and it highlights the importance of viral evolution in virus transmission.

  5. Active Learning Methods and Technology: Strategies for Design Education

    ERIC Educational Resources Information Center

    Coorey, Jillian

    2016-01-01

    The demands in higher education are on the rise. Charged with teaching more content, increased class sizes and engaging students, educators face numerous challenges. In design education, educators are often torn between the teaching of technology and the teaching of theory. Learning the formal concepts of hierarchy, contrast and space provide the…

  6. A Prospective Method to Guide Small Molecule Drug Design

    ERIC Educational Resources Information Center

    Johnson, Alan T.

    2015-01-01

    At present, small molecule drug design follows a retrospective path when considering what analogs are to be made around a current hit or lead molecule with the focus often on identifying a compound with higher intrinsic potency. What this approach overlooks is the simultaneous need to also improve the physicochemical (PC) and pharmacokinetic (PK)…

  7. Library Design Analysis Using Post-Occupancy Evaluation Methods.

    ERIC Educational Resources Information Center

    James, Dennis C.; Stewart, Sharon L.

    1995-01-01

    Presents findings of a user-based study of the interior of Rodger's Science and Engineering Library at the University of Alabama. Compared facility evaluations from faculty, library staff, and graduate and undergraduate students. Features evaluated include: acoustics, aesthetics, book stacks, design, finishes/materials, furniture, lighting,…

  8. Design component method for sensitivity analysis of built-up structures

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Seong, Hwai G.

    1986-01-01

    A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.

  9. Overview of control design methods for smart structural system

    NASA Astrophysics Data System (ADS)

    Rao, Vittal S.; Sana, Sridhar

    2001-08-01

    Smart structures are a result of effective integration of control system design and signal processing with structural systems, so as to maximally utilize new advances in materials for structures, actuation and sensing and to obtain the best performance for the application at hand. Research in smart structures is constantly driving towards attaining the self-adaptive and diagnostic capabilities that biological systems possess. This has been manifested in a number of successful applications in many areas of engineering, such as aerospace, civil and automotive systems. Instrumental in the development of such systems are smart materials, such as piezoelectric, shape memory alloy, electrostrictive, magnetostrictive and fiber-optic materials, and various composite materials for use as actuators, sensors and structural members. The need to maximally utilize these smart actuators and sensing materials in highly distributed and highly adaptable controllers has spurred research in smart structural modeling, identification, actuator/sensor design and placement, and control systems design, including adaptive and robust controllers built with tools such as neural networks, fuzzy logic, genetic algorithms and linear matrix inequalities. It has equally driven work on electronics for controller implementation: analog electronics, microcontrollers, digital signal processors (DSPs), and application-specific integrated circuits (ASICs) such as field-programmable gate arrays (FPGAs) and multichip modules (MCMs). In this paper, we give a brief overview of the state of control in smart structures. Different aspects of the development of smart structures, such as applications, technology and theoretical advances, especially in the area of control systems design and implementation, are covered.

  10. Optimal reliability design method for remote solar systems

    NASA Astrophysics Data System (ADS)

    Suwapaet, Nuchida

    A unique optimal reliability design algorithm is developed for remote communication systems. The algorithm deals with either minimizing an unavailability of the system within a fixed cost or minimizing the cost of the system with an unavailability constraint. The unavailability of the system is a function of three possible failure occurrences: individual component breakdown, solar energy deficiency (loss of load probability), and satellite/radio transmission loss. The three mathematical models of component failure, solar power failure, and transmission failure are combined and formulated as a nonlinear programming optimization problem with binary decision variables, such as number and type (or size) of photovoltaic modules, batteries, radios, antennas, and controllers. The three possible failures are identified and integrated in a computer algorithm to generate the parameters for the optimization algorithm. The optimization algorithm is implemented with a branch-and-bound technique in MS Excel Solver. The algorithm is applied to a case study design for an actual system that will be set up in remote mountainous areas of Peru. The automated algorithm is verified with independent calculations. The optimal results from minimizing the unavailability of the system with the cost constraint case and minimizing the total cost of the system with the unavailability constraint case are consistent with each other. The tradeoff feature in the algorithm allows designers to observe results of 'what-if' scenarios of relaxing constraint bounds, thus obtaining the most benefit from the optimization process. An example of this approach applied to an existing communication system in the Andes shows dramatic improvement in reliability for little increase in cost. The algorithm is a real design tool, unlike other existing simulation design tools. The algorithm should be useful for other stochastic systems where component reliability, random supply and demand, and communication are involved.
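The structure of the cost-minimization case can be sketched compactly: combine the three independent failure modes into one unavailability figure, then search the integer design space for the cheapest configuration meeting the constraint. All probabilities and costs below are invented, and plain enumeration stands in for the branch-and-bound solver.

```python
import itertools

def unavailability(n_pv, n_bat, n_radio):
    """Combine three independent failure modes (illustrative probabilities)."""
    p_comp = 0.05 ** n_bat               # every battery has failed
    p_solar = 0.20 * 0.5 ** (n_pv - 1)   # loss-of-load proxy for solar deficiency
    p_link = 0.10 ** n_radio             # every radio link is down
    return 1.0 - (1.0 - p_comp) * (1.0 - p_solar) * (1.0 - p_link)

COST = {"pv": 400, "bat": 150, "radio": 600}  # hypothetical unit costs

def cheapest_design(u_max, max_units=3):
    """Exhaustive version of the cost-minimization case (stand-in for branch-and-bound)."""
    best = None
    for n_pv, n_bat, n_radio in itertools.product(range(1, max_units + 1), repeat=3):
        if unavailability(n_pv, n_bat, n_radio) <= u_max:
            cost = n_pv * COST["pv"] + n_bat * COST["bat"] + n_radio * COST["radio"]
            if best is None or cost < best[0]:
                best = (cost, (n_pv, n_bat, n_radio))
    return best

print(cheapest_design(u_max=0.10))  # -> (2700, (3, 2, 2))
```

Re-running with a tighter or looser `u_max` is exactly the 'what-if' tradeoff exploration described above: relaxing the bound immediately shows how much cost the last increment of availability is buying.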

  11. Consumers' Kansei Needs Clustering Method for Product Emotional Design Based on Numerical Design Structure Matrix and Genetic Algorithms

    PubMed Central

    Chen, Deng-kai; Gu, Rong; Gu, Yu-feng; Yu, Sui-huai

    2016-01-01

    Consumers' Kansei needs reflect their perception about a product and always consist of a large number of adjectives. Reducing the dimension complexity of these needs to extract primary words not only enables the target product to be explicitly positioned, but also provides a convenient design basis for designers engaging in design work. Accordingly, this study employs a numerical design structure matrix (NDSM) by parameterizing a conventional DSM and integrating genetic algorithms to find optimum Kansei clusters. A four-point scale method is applied to assign link weights of every two Kansei adjectives as values of cells when constructing an NDSM. Genetic algorithms are used to cluster the Kansei NDSM and find optimum clusters. Furthermore, the process of the proposed method is presented. The details of the proposed approach are illustrated using an example of electronic scooter for Kansei needs clustering. The case study reveals that the proposed method is promising for clustering Kansei needs adjectives in product emotional design. PMID:27630709
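The NDSM-plus-GA pipeline above can be sketched end to end: a symmetric matrix of four-point-scale link weights between adjectives, a fitness that rewards dense clusters, and a small elitist genetic algorithm searching for the best two-way partition. The adjectives and weights are hypothetical, not the electronic scooter data of the study.

```python
import random

ADJECTIVES = ["sporty", "agile", "fast", "elegant", "refined", "graceful"]  # hypothetical
# NDSM: symmetric link weights on a four-point scale (0 = unrelated .. 3 = strong link)
W = [[0, 3, 3, 1, 0, 1],
     [3, 0, 2, 1, 1, 1],
     [3, 2, 0, 0, 1, 0],
     [1, 1, 0, 0, 3, 2],
     [0, 1, 1, 3, 0, 3],
     [1, 1, 0, 2, 3, 0]]

def fitness(genome):
    """Sum of mean within-cluster link weights (rewards dense, coherent clusters)."""
    score = 0.0
    for c in (0, 1):
        members = [i for i, g in enumerate(genome) if g == c]
        pairs = [(i, j) for i in members for j in members if i < j]
        if pairs:
            score += sum(W[i][j] for i, j in pairs) / len(pairs)
    return score

def cluster_kansei(pop_size=40, generations=80, seed=7):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in ADJECTIVES] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(ADJECTIVES))
            child = a[:cut] + b[cut:]             # one-point crossover
            if rng.random() < 0.2:                # bit-flip mutation
                k = rng.randrange(len(ADJECTIVES))
                child[k] ^= 1
            children.append(child)
        pop = parents + children
    best = max(pop, key=fitness)
    return {c: [w for w, g in zip(ADJECTIVES, best) if g == c] for c in (0, 1)}

print(cluster_kansei())
```

The fitness averages link weight per within-cluster pair rather than summing it, so lumping every adjective into one cluster is not rewarded; for this toy matrix the dense partition {sporty, agile, fast} / {elegant, refined, graceful} scores highest.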

  12. Consumers' Kansei Needs Clustering Method for Product Emotional Design Based on Numerical Design Structure Matrix and Genetic Algorithms.

    PubMed

    Yang, Yan-Pu; Chen, Deng-Kai; Gu, Rong; Gu, Yu-Feng; Yu, Sui-Huai

    2016-01-01

    Consumers' Kansei needs reflect their perception of a product and always consist of a large number of adjectives. Reducing the dimensional complexity of these needs to extract primary words not only enables the target product to be explicitly positioned, but also provides a convenient design basis for designers engaging in design work. Accordingly, this study employs a numerical design structure matrix (NDSM) by parameterizing a conventional DSM and integrating genetic algorithms to find optimum Kansei clusters. A four-point scale method is applied to assign link weights of every two Kansei adjectives as values of cells when constructing an NDSM. Genetic algorithms are used to cluster the Kansei NDSM and find optimum clusters. Furthermore, the process of the proposed method is presented. The details of the proposed approach are illustrated using an example of an electronic scooter for Kansei needs clustering. The case study reveals that the proposed method is promising for clustering Kansei needs adjectives in product emotional design.
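
    The clustering idea can be sketched as follows. The adjectives, the NDSM link weights (on the four-point 0-3 scale the abstract mentions), and the GA parameters are all invented for illustration. The fitness centres the weights on the scale midpoint so that strong links are rewarded inside a cluster and weak links are penalised; otherwise one giant cluster would trivially win:

```python
import random

random.seed(0)

# Hypothetical 6-adjective NDSM: symmetric pairwise link weights, 0-3 scale.
ADJ = ["sporty", "dynamic", "agile", "elegant", "refined", "graceful"]
W = [
    [0, 3, 3, 1, 0, 1],
    [3, 0, 2, 0, 1, 0],
    [3, 2, 0, 1, 0, 1],
    [1, 0, 1, 0, 3, 2],
    [0, 1, 0, 3, 0, 3],
    [1, 0, 1, 2, 3, 0],
]
K = 2  # number of Kansei clusters sought

def fitness(labels):
    """Sum of midpoint-centred weights over pairs sharing a cluster."""
    return sum(W[i][j] - 1.5
               for i in range(len(labels))
               for j in range(i + 1, len(labels))
               if labels[i] == labels[j])

def rand_ind():
    return [random.randrange(K) for _ in ADJ]

def mutate(labels):
    child = list(labels)
    child[random.randrange(len(child))] = random.randrange(K)
    return child

# Tiny elitist GA: keep the best half, refill with mutants and immigrants.
pop = [rand_ind() for _ in range(20)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    elites = pop[:10]
    pop = (elites
           + [mutate(random.choice(elites)) for _ in range(5)]
           + [rand_ind() for _ in range(5)])

best = max(pop, key=fitness)
clusters = {k: [a for a, lab in zip(ADJ, best) if lab == k] for k in range(K)}
print(clusters, fitness(best))
```

    On this toy matrix the GA separates the "sporty/dynamic/agile" triangle from the "elegant/refined/graceful" one; a real application would also use crossover and a data-driven choice of K.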

  13. Inverse airfoil design procedure using a multigrid Navier-Stokes method

    NASA Technical Reports Server (NTRS)

    Malone, J. B.; Swanson, R. C.

    1991-01-01

    The Modified Garabedian McFadden (MGM) design procedure was incorporated into an existing 2-D multigrid Navier-Stokes airfoil analysis method. The resulting design method is an iterative procedure based on a residual correction algorithm and permits the automated design of airfoil sections with prescribed surface pressure distributions. The new design method, Multigrid Modified Garabedian McFadden (MG-MGM), is demonstrated for several different transonic pressure distributions obtained from both symmetric and cambered airfoil shapes. The airfoil profiles generated with the MG-MGM code are compared to the original configurations to assess the capabilities of the inverse design method.
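
    The residual-correction idea can be illustrated with a toy loop in which the surface ordinates are nudged in proportion to the local difference between the prescribed and computed pressure distributions. The "analysis" below is a stand-in linear model, not a Navier-Stokes solver, and all numbers are invented:

```python
# Stand-in "analysis": pressure responds to each ordinate and its
# neighbours (periodic); a real method would call the flow solver here.
N = 20

def solve_pressure(y):
    return [0.5 * y[i] + 0.25 * (y[i - 1] + y[(i + 1) % N])
            for i in range(N)]

target_y = [0.1 * (i % 5) for i in range(N)]   # hypothetical geometry
cp_target = solve_pressure(target_y)           # pressures it induces

# Residual correction: repeatedly update the shape in proportion to
# the pressure residual until the prescribed distribution is matched.
y = [0.0] * N
for _ in range(200):
    residual = [t - c for t, c in zip(cp_target, solve_pressure(y))]
    if max(abs(r) for r in residual) < 1e-12:
        break
    y = [yi + 0.9 * ri for yi, ri in zip(y, residual)]  # relaxation 0.9

print(max(abs(a - b) for a, b in zip(y, target_y)))  # geometry error
```

    With the linear stand-in the loop recovers the target ordinates to near round-off; in MG-MGM the analogous update is coupled to the multigrid Navier-Stokes solution.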

  14. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    NASA Astrophysics Data System (ADS)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods for improving the reliability of a system, but the design often involves the mutual coupling of multiple factors. In this study, the Direct Search Method is introduced into optimum redundancy configuration for design optimization, in which reliability, cost, structural weight and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft critical system are computed. The results show that this method is convenient and workable and, upon appropriate modifications, applicable to the redundancy configuration and optimization of various designs. The method has good practical value.
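
    A direct search over integer redundancy levels can be sketched as below. The subsystem reliabilities, costs, weights and constraint bounds are invented, and the coupling of factors is reduced to a simple feasibility check rather than the paper's full formulation:

```python
# Invented subsystem data: (unit reliability, unit cost, unit weight).
SUBSYSTEMS = [
    (0.90, 4.0, 2.0),
    (0.85, 3.0, 1.5),
    (0.95, 6.0, 2.5),
]
COST_MAX, WEIGHT_MAX = 40.0, 18.0

def reliability(n):
    r = 1.0
    for (ri, _, _), k in zip(SUBSYSTEMS, n):
        r *= 1.0 - (1.0 - ri) ** k   # k redundant units in parallel
    return r

def feasible(n):
    cost = sum(c * k for (_, c, _), k in zip(SUBSYSTEMS, n))
    weight = sum(w * k for (_, _, w), k in zip(SUBSYSTEMS, n))
    return cost <= COST_MAX and weight <= WEIGHT_MAX

def direct_search(n):
    """Explore +/-1 moves along each coordinate; accept improvements."""
    improved = True
    while improved:
        improved = False
        for i in range(len(n)):
            for step in (1, -1):
                trial = list(n)
                trial[i] += step
                if (trial[i] >= 1 and feasible(trial)
                        and reliability(trial) > reliability(n)):
                    n, improved = trial, True
    return n

best = direct_search([1, 1, 1])
print(best, round(reliability(best), 4))
```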

  15. Design and ergonomics. Methods for integrating ergonomics at hand tool design stage.

    PubMed

    Marsot, Jacques; Claudon, Laurent

    2004-01-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute (INRS) launched in 1999 a research project on the topic of integrating ergonomics into hand tool design, and more particularly to a design of a boning knife. After a brief recall of the difficulties of integrating ergonomics at the design stage, the present paper shows how 3 design methodological tools--Functional Analysis, Quality Function Deployment and TRIZ--have been applied to the design of a boning knife. Implementation of these tools enabled us to demonstrate the extent to which they are capable of responding to the difficulties of integrating ergonomics into product design.

  16. Synthesis of calculational methods for design and analysis of radiation shields for nuclear rocket systems

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.; Jordan, T. A.; Soltesz, R. G.; Woodsum, H. C.

    1969-01-01

    Eight computer programs make up a nine volume synthesis containing two design methods for nuclear rocket radiation shields. The first design method is appropriate for parametric and preliminary studies, while the second accomplishes the verification of a final nuclear rocket reactor design.

  17. Grouping design method of catadioptric projection objective for deep ultraviolet lithography

    NASA Astrophysics Data System (ADS)

    Cao, Zhen; Li, Yanqiu; Mao, Shanshan

    2017-02-01

    Choosing an adequate initial design for optimization plays an important role in obtaining high-quality deep ultraviolet (DUV) lithographic objectives. In this paper, the grouping design method is extended to acquire initial configurations of catadioptric projection objective for DUV lithography. In this method, an objective system is first divided into several lens groups. The initial configuration of each lens group is then determined by adjusting and optimizing existing lens design according to respective design requirements. Finally, the lens groups are connected into a feasible initial objective system. Grouping design allocates the complexity of designing a whole system to each of the lens groups, which significantly simplifies the design process. A two-mirror design form serves as an example for illustrating the grouping design principles to this type of system. In addition, it is demonstrated that different initial designs can be generated by changing the design form of each individual lens group.

  18. Applications of Genetic Methods to NASA Design and Operations Problems

    NASA Technical Reports Server (NTRS)

    Laird, Philip D.

    1996-01-01

    We review four recent NASA-funded applications in which evolutionary/genetic methods are important. In the process we survey: the kinds of problems being solved today with these methods; techniques and tools used; problems encountered; and areas where research is needed. The presentation slides are annotated briefly at the top of each page.

  19. Designing a Science Methods Course for Early Childhood Preservice Teachers

    ERIC Educational Resources Information Center

    Akerson, Valarie L.

    2004-01-01

    Preparing early childhood (K-3) teachers to teach science presents special challenges for the science methods instructor. Early childhood preservice teachers typically come to the methods classroom with little science content knowledge; they also lack confidence in their own abilities to teach science. This paper presents a theoretical background,…

  20. [Molecular epidemiology in the epidemiological transition].

    PubMed

    Tapia-Conyer, R

    1997-01-01

    The epidemiological transition describes the changes in the health profile of populations in which infectious diseases are replaced by chronic or non-communicable diseases. Even in industrialized countries, infectious diseases emerge as important public health problems, with very important associations with several types of neoplasm. Molecular epidemiology brings new tools to the study of the epidemiological transition by identifying infectious agents as the etiology of diseases. Much progress has been made in understanding the virulence and resistance mechanisms of different strains, and in improving knowledge of the transmission dynamics and dissemination pathways of infectious diseases. As for non-communicable diseases, molecular epidemiology has enhanced the identification of endogenous risk factors linked to molecular alterations in genetic material, which will allow a more detailed definition of risk and the identification of individuals and groups at risk of several diseases. The potential impact of molecular epidemiology in other areas, such as the environment, lifestyle and nutrition, is illustrated with several examples.

  1. Air pollution, inflammation and preterm birth in Mexico City: study design and methods.

    PubMed

    O'Neill, Marie S; Osornio-Vargas, Alvaro; Buxton, Miatta A; Sánchez, Brisa N; Rojas-Bracho, Leonora; Castillo-Castrejon, Marisol; Mordhukovich, Irina B; Brown, Daniel G; Vadillo-Ortega, Felipe

    2013-03-15

    Preterm birth is one of the leading causes of perinatal mortality and is associated with long-term adverse health consequences for surviving infants. Preterm birth rates are rising worldwide, and no effective means for prevention currently exists. Air pollution exposure may be a significant cause of prematurity, but many published studies lack the individual, clinical data needed to elucidate possible biological mechanisms mediating these epidemiological associations. This paper presents the design of a prospective study now underway to evaluate those mechanisms in a cohort of pregnant women residing in Mexico City. We address how air quality may act together with other factors to induce systemic inflammation and influence the duration of pregnancy. Data collection includes: biomarkers relevant to inflammation in cervico-vaginal exudate and peripheral blood, along with full clinical information, pro-inflammatory cytokine gene polymorphisms and air pollution data to evaluate spatial and temporal variability in air pollution exposure. Samples are collected on a monthly basis and participants are followed for the duration of pregnancy. The data will be used to evaluate whether ambient air pollution is associated with preterm birth, controlling for other risk factors. We will evaluate which time windows during pregnancy are most influential in the air pollution and preterm birth association. In addition, the epidemiological study will be complemented with a parallel in vitro toxicology study, in which monocytic cells will be exposed to air particle samples to evaluate the expression of biomarkers of inflammation.

  2. How to Combine Objectives and Methods of Evaluation in Iterative ILE Design: Lessons Learned from Designing Ambre-Add

    ERIC Educational Resources Information Center

    Nogry, S.; Jean-Daubias, S.; Guin, N.

    2012-01-01

    This article deals with evaluating an interactive learning environment (ILE) during the iterative-design process. Various aspects of the system must be assessed and a number of evaluation methods are available. In designing the ILE Ambre-add, several techniques were combined to test and refine the system. In particular, we point out the merits of…

  3. A hybrid nonlinear programming method for design optimization

    NASA Technical Reports Server (NTRS)

    Rajan, S. D.

    1986-01-01

    Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
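
    The technique-switching idea can be sketched as a simple rule: run fixed-step steepest descent and, when it fails to make progress, hand over to a derivative-free compass search. The test function, step sizes and switching rule below are illustrative only, not the paper's actual rule set:

```python
def f(x):  # Rosenbrock test function
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def grad(x):
    return [-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
            200 * (x[1] - x[0] ** 2)]

def hybrid_minimize(x, max_iter=20000):
    method, step = "gradient", 1e-3
    for _ in range(max_iter):
        if method == "gradient":
            g = grad(x)
            trial = [xi - step * gi for xi, gi in zip(x, g)]
            if f(trial) < f(x):
                x = trial
            else:                       # rule: stalled -> switch technique
                method, step = "compass", 0.25
        else:
            best = x                    # compass (pattern) search round
            for i in range(len(x)):
                for s in (step, -step):
                    trial = list(x)
                    trial[i] += s
                    if f(trial) < f(best):
                        best = trial
            if best is x:               # no improving move: shrink pattern
                step *= 0.5
                if step < 1e-10:
                    break
            else:
                x = best
    return x

x = hybrid_minimize([-1.2, 1.0])
print(x, f(x))
```

    Only improving moves are ever accepted, so the objective is monotone non-increasing regardless of which technique is currently active; the paper's system generalises this kind of rule to several NLP techniques with diagnosis of scaling and convergence problems.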

  4. New Methods for Design and Computation of Freeform Optics

    DTIC Science & Technology

    2015-07-09

    as a partial differential equation (PDE) of second order with nonstandard boundary conditions. The solution to this PDE problem is a scalar function...the exact solution with any a priori given accuracy. By contrast with other approaches, the solution obtained with our approach does not depend on ad hoc...strategy for constructing weak solutions to nonlinear partial differential equations arising in design problems involving freeform optical surfaces[10

  5. Design Method of Fault Detector for Injection Unit

    NASA Astrophysics Data System (ADS)

    Ochi, Kiyoshi; Saeki, Masami

    An injection unit is considered as a speed control system utilizing a reaction-force sensor. Our purpose is to design a fault detector that detects and isolates actuator and sensor faults under the condition that the system is disturbed by a reaction force. First described is the fault detector's general structure. In this system, a disturbance observer that estimates the reaction force is designed for the speed control system in order to obtain the residual signals, and then post-filters that separate the specific frequency elements from the residual signals are applied in order to generate the decision signals. Next, we describe a fault detector designed specifically for a model of the injection unit. It is shown that the disturbance imposed on the decision variables can be made significantly small by appropriate adjustments to the observer bandwidth, and that most of the sensor faults and actuator faults can be detected and some of them can be isolated in the frequency domain by setting the frequency characteristics of the post-filters appropriately. Our result is verified by experiments for an actual injection unit.
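
    The observer-based residual idea can be illustrated with a toy discrete-time speed loop: a disturbance observer reconstructs the reaction force from the drive command and the measured speed, and the residual (the deviation of the filtered estimate from its nominal value) flags an injected sensor fault. All plant numbers and the fault scenario are invented, and a first-order low-pass filter stands in for the paper's post-filters:

```python
DT, J = 0.001, 0.05          # sample time, inertia (invented)
STEPS = 2000

u = [1.0] * STEPS            # constant drive command
d_true = 0.4                 # constant reaction force on the axis

# Simulated speed measurement with an additive sensor fault from k = 1000.
v, vs = 0.0, []
for k in range(STEPS):
    v += DT / J * (u[k] - d_true)
    vs.append(v + (0.3 if k >= 1000 else 0.0))

# Disturbance observer: invert the one-sample model, then low-pass filter
# the raw estimate; the residual is its deviation from the nominal force.
d_hat, residual = None, []
for k in range(1, STEPS):
    d_raw = u[k] - J * (vs[k] - vs[k - 1]) / DT
    d_hat = d_raw if d_hat is None else d_hat + 0.05 * (d_raw - d_hat)
    residual.append(d_hat - d_true)

print(max(abs(r) for r in residual[:900]),   # healthy: essentially zero
      max(abs(r) for r in residual[950:]))   # fault: clear spike
```

    Thresholding the filtered residual gives a decision signal; in the paper, banks of post-filters with different passbands additionally allow some faults to be isolated in the frequency domain.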

  6. Multicenter study of epidemiological cutoff values and detection of resistance in Candida spp. to anidulafungin, caspofungin, and micafungin using the Sensititre YeastOne colorimetric method.

    PubMed

    Espinel-Ingroff, A; Alvarez-Fernandez, M; Cantón, E; Carver, P L; Chen, S C-A; Eschenauer, G; Getsinger, D L; Gonzalez, G M; Govender, N P; Grancini, A; Hanson, K E; Kidd, S E; Klinker, K; Kubin, C J; Kus, J V; Lockhart, S R; Meletiadis, J; Morris, A J; Pelaez, T; Quindós, G; Rodriguez-Iglesias, M; Sánchez-Reus, F; Shoham, S; Wengenack, N L; Borrell Solé, N; Echeverria, J; Esperalba, J; Gómez-G de la Pedrosa, E; García García, I; Linares, M J; Marco, F; Merino, P; Pemán, J; Pérez Del Molino, L; Roselló Mayans, E; Rubio Calvo, C; Ruiz Pérez de Pipaon, M; Yagüe, G; Garcia-Effron, G; Guinea, J; Perlin, D S; Sanguinetti, M; Shields, R; Turnidge, J

    2015-11-01

    Neither breakpoints (BPs) nor epidemiological cutoff values (ECVs) have been established for Candida spp. with anidulafungin, caspofungin, and micafungin when using the Sensititre YeastOne (SYO) broth dilution colorimetric method. In addition, reference caspofungin MICs have so far proven to be unreliable. Candida species wild-type (WT) MIC distributions (for microorganisms in a species/drug combination with no detectable phenotypic resistance) were established for 6,007 Candida albicans, 186 C. dubliniensis, 3,188 C. glabrata complex, 119 C. guilliermondii, 493 C. krusei, 205 C. lusitaniae, 3,136 C. parapsilosis complex, and 1,016 C. tropicalis isolates. SYO MIC data gathered from 38 laboratories in Australia, Canada, Europe, Mexico, New Zealand, South Africa, and the United States were pooled to statistically define SYO ECVs. ECVs for anidulafungin, caspofungin, and micafungin encompassing ≥97.5% of the statistically modeled population were, respectively, 0.12, 0.25, and 0.06 μg/ml for C. albicans, 0.12, 0.25, and 0.03 μg/ml for C. glabrata complex, 4, 2, and 4 μg/ml for C. parapsilosis complex, 0.5, 0.25, and 0.06 μg/ml for C. tropicalis, 0.25, 1, and 0.25 μg/ml for C. krusei, 0.25, 1, and 0.12 μg/ml for C. lusitaniae, 4, 2, and 2 μg/ml for C. guilliermondii, and 0.25, 0.25, and 0.12 μg/ml for C. dubliniensis. Species-specific SYO ECVs for anidulafungin, caspofungin, and micafungin correctly classified 72 (88.9%), 74 (91.4%), and 76 (93.8%), respectively, of 81 Candida isolates with identified fks mutations. SYO ECVs may aid in detecting non-WT isolates with reduced susceptibility to anidulafungin, micafungin, and especially caspofungin, since testing the susceptibilities of Candida spp. to caspofungin by reference methodologies is not recommended.

  7. Multicenter Study of Epidemiological Cutoff Values and Detection of Resistance in Candida spp. to Anidulafungin, Caspofungin, and Micafungin Using the Sensititre YeastOne Colorimetric Method

    PubMed Central

    Alvarez-Fernandez, M.; Cantón, E.; Carver, P. L.; Chen, S. C.-A.; Eschenauer, G.; Getsinger, D. L.; Gonzalez, G. M.; Grancini, A.; Hanson, K. E.; Kidd, S. E.; Klinker, K.; Kubin, C. J.; Kus, J. V.; Lockhart, S. R.; Meletiadis, J.; Morris, A. J.; Pelaez, T.; Rodriguez-Iglesias, M.; Sánchez-Reus, F.; Shoham, S.; Wengenack, N. L.; Borrell Solé, N.; Echeverria, J.; Esperalba, J.; Gómez-G. de la Pedrosa, E.; García García, I.; Linares, M. J.; Marco, F.; Merino, P.; Pemán, J.; Pérez del Molino, L.; Roselló Mayans, E.; Rubio Calvo, C.; Ruiz Pérez de Pipaon, M.; Yagüe, G.; Garcia-Effron, G.; Perlin, D. S.; Sanguinetti, M.; Shields, R.; Turnidge, J.

    2015-01-01

    Neither breakpoints (BPs) nor epidemiological cutoff values (ECVs) have been established for Candida spp. with anidulafungin, caspofungin, and micafungin when using the Sensititre YeastOne (SYO) broth dilution colorimetric method. In addition, reference caspofungin MICs have so far proven to be unreliable. Candida species wild-type (WT) MIC distributions (for microorganisms in a species/drug combination with no detectable phenotypic resistance) were established for 6,007 Candida albicans, 186 C. dubliniensis, 3,188 C. glabrata complex, 119 C. guilliermondii, 493 C. krusei, 205 C. lusitaniae, 3,136 C. parapsilosis complex, and 1,016 C. tropicalis isolates. SYO MIC data gathered from 38 laboratories in Australia, Canada, Europe, Mexico, New Zealand, South Africa, and the United States were pooled to statistically define SYO ECVs. ECVs for anidulafungin, caspofungin, and micafungin encompassing ≥97.5% of the statistically modeled population were, respectively, 0.12, 0.25, and 0.06 μg/ml for C. albicans, 0.12, 0.25, and 0.03 μg/ml for C. glabrata complex, 4, 2, and 4 μg/ml for C. parapsilosis complex, 0.5, 0.25, and 0.06 μg/ml for C. tropicalis, 0.25, 1, and 0.25 μg/ml for C. krusei, 0.25, 1, and 0.12 μg/ml for C. lusitaniae, 4, 2, and 2 μg/ml for C. guilliermondii, and 0.25, 0.25, and 0.12 μg/ml for C. dubliniensis. Species-specific SYO ECVs for anidulafungin, caspofungin, and micafungin correctly classified 72 (88.9%), 74 (91.4%), and 76 (93.8%), respectively, of 81 Candida isolates with identified fks mutations. SYO ECVs may aid in detecting non-WT isolates with reduced susceptibility to anidulafungin, micafungin, and especially caspofungin, since testing the susceptibilities of Candida spp. to caspofungin by reference methodologies is not recommended. PMID:26282428

  8. Advanced Control and Protection system Design Methods for Modular HTGRs

    SciTech Connect

    Ball, Sydney J; Wilson Jr, Thomas L; Wood, Richard Thomas

    2012-06-01

    The project supported the Nuclear Regulatory Commission (NRC) in identifying and evaluating the regulatory implications concerning the control and protection systems proposed for use in the Department of Energy's (DOE) Next-Generation Nuclear Plant (NGNP). The NGNP, using modular high-temperature gas-cooled reactor (HTGR) technology, is to provide commercial industries with electricity and high-temperature process heat for industrial processes such as hydrogen production. Process heat temperatures range from 700 to 950 C, and for the upper range of these operation temperatures, the modular HTGR is sometimes referred to as the Very High Temperature Reactor or VHTR. Initial NGNP designs are for operation in the lower temperature range. The defining safety characteristic of the modular HTGR is that its primary defense against serious accidents is to be achieved through its inherent properties of the fuel and core. Because of its strong negative temperature coefficient of reactivity and the capability of the fuel to withstand high temperatures, fast-acting active safety systems or prompt operator actions should not be required to prevent significant fuel failure and fission product release. The plant is designed such that its inherent features should provide adequate protection despite operational errors or equipment failure. Figure 1 shows an example modular HTGR layout (prismatic core version), where its inlet coolant enters the reactor vessel at the bottom, traversing up the sides to the top plenum, down-flow through an annular core, and exiting from the lower plenum (hot duct). This research provided NRC staff with (a) insights and knowledge about the control and protection systems for the NGNP and VHTR, (b) information on the technologies/approaches under consideration for use in the reactor and process heat applications, (c) guidelines for the design of highly integrated control rooms, (d) consideration for modeling of control and protection system designs for

  9. Problems of epidemiology in malaria eradication

    PubMed Central

    Yekutiel, P.

    1960-01-01

    With an increasing number of malaria eradication programmes approaching or entering the consolidation phase, the epidemiological features of disappearing malaria are getting better known and defined. At the same time, the old classical methods of measuring malaria prevalence have become inadequate and new methods for the epidemiological assessment of the progress of eradication are being developed. In this article the new methods of assessment and epidemiological and statistical criteria for discontinuing residual spraying and for the stability of consolidation are discussed on the basis of field experience in several countries during the past two or three years. Some prominent epidemiological features of malaria at reduced levels of transmission are described, special attention being given to the role of the asymptomatic carrier in malaria eradication. PMID:13846510

  10. Reducing Design Risk Using Robust Design Methods: A Dual Response Surface Approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Yeniay, Ozgur; Lepsch, Roger A. (Technical Monitor)

    2003-01-01

    Space transportation system conceptual design is a multidisciplinary process containing a considerable element of risk. Risk here is defined as the variability in the estimated (output) performance characteristic of interest resulting from the uncertainties in the values of several disciplinary design and/or operational parameters. Uncertainties from one discipline (and/or subsystem) may propagate to another through linking parameters, and the final system output may have a significant accumulation of risk. This variability can result in significant deviations from the expected performance. Therefore, an estimate of variability (which is called design risk in this study) together with the expected performance characteristic value (e.g. mean empty weight) is necessary for multidisciplinary optimization for a robust design. Robust design in this study is defined as a solution that minimizes variability subject to a constraint on mean performance characteristics. Even though multidisciplinary design optimization has gained wide attention and applications, the treatment of uncertainties to quantify and analyze design risk has received little attention. This research effort explores the dual response surface approach to quantify variability (risk) in critical performance characteristics (such as weight) during conceptual design.
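
    The selection rule behind the approach, minimise variability subject to a constraint on the mean, can be reduced to a few lines. The replicated runs below are invented, and per-setting sample statistics stand in for the fitted dual response surfaces (one model for the mean, one for the standard deviation):

```python
import statistics

# Invented replicated weight estimates at four settings of one
# design parameter x; a real study fits mean and std-dev surfaces.
runs = {
    0.0: [10.2, 10.5, 9.9, 10.4],
    0.5: [9.1, 9.6, 9.3, 9.4],
    1.0: [8.8, 9.5, 8.4, 9.7],
    1.5: [8.9, 10.1, 8.2, 10.4],
}
MEAN_MAX = 9.6  # constraint on the expected weight

# Robust choice: minimise variability among settings meeting the mean bound.
feasible = [x for x in runs if statistics.mean(runs[x]) <= MEAN_MAX]
robust_x = min(feasible, key=lambda x: statistics.stdev(runs[x]))

# For contrast: the setting that optimises the mean alone.
mean_opt = min(runs, key=lambda x: statistics.mean(runs[x]))
print(robust_x, mean_opt)
```

    On these numbers the two criteria pick different designs: the mean-optimal setting has the lowest expected weight but much larger scatter, which is exactly the trade-off the dual response surface approach makes explicit.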

  11. Object-oriented design of preconditioned iterative methods

    SciTech Connect

    Bruaset, A.M.

    1994-12-31

    In this talk the author discusses how object-oriented programming techniques can be used to develop a flexible software package for preconditioned iterative methods. The ideas described have been used to implement the linear algebra part of Diffpack, which is a collection of C++ class libraries that provides high-level tools for the solution of partial differential equations. In particular, this software package is aimed at rapid development of PDE-based numerical simulators, primarily using finite element methods.
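
    The object-oriented pattern described can be sketched in Python for brevity (Diffpack itself is C++): the iterative solver is written against an abstract preconditioner interface, so new preconditioners plug in without touching the solver. The Richardson iteration and Jacobi preconditioner below are standard textbook choices, not Diffpack's actual API:

```python
from abc import ABC, abstractmethod

class Preconditioner(ABC):
    """Abstract interface: solvers only ever call apply()."""
    @abstractmethod
    def apply(self, r):
        """Return an approximation to M^-1 r."""

class JacobiPrec(Preconditioner):
    """Diagonal (Jacobi) preconditioner; other variants (ILU, SSOR, ...)
    would implement the same interface."""
    def __init__(self, A):
        self.diag = [A[i][i] for i in range(len(A))]
    def apply(self, r):
        return [ri / d for ri, d in zip(r, self.diag)]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def richardson(A, b, prec, iters=200, omega=0.5):
    """Preconditioned Richardson: x <- x + omega * M^-1 (b - A x)."""
    x = [0.0] * len(b)
    for _ in range(iters):
        r = [bi - ai for bi, ai in zip(b, matvec(A, x))]
        z = prec.apply(r)
        x = [xi + omega * zi for xi, zi in zip(x, z)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = richardson(A, b, JacobiPrec(A))
print(x)  # close to the exact solution (1/11, 7/11)
```

    In C++ the same separation is achieved with an abstract base class and virtual `apply`, which is the design that lets a library like Diffpack combine solvers and preconditioners freely.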

  12. Epidemiology as discourse: the politics of development institutions in the Epidemiological Profile of El Salvador

    PubMed Central

    Aviles, L

    2001-01-01

    STUDY OBJECTIVE—To determine the ways in which institutions devoted to international development influence epidemiological studies.
DESIGN—This article takes a descriptive epidemiological study of El Salvador, Epidemiological Profile, conducted in 1994 by the US Agency for International Development, as a case study. The methods include discourse analysis in order to uncover the ideological basis of the report and its characteristics as a discourse of development.
SETTING—El Salvador.
RESULTS—The Epidemiological Profile's theoretical basis, the epidemiological transition theory, embodies the ethnocentrism of a "colonizer's model of the world." The report follows the logic of a discourse of development by depoliticising development, creating abnormalities, and relying on the development consulting industry. The epidemiological transition theory serves as an ideology that legitimises and dissimulates the international order.
CONCLUSIONS—Even descriptive epidemiological assessments or epidemiological profiles are imbued with theoretical assumptions shaped by the institutional setting under which epidemiological investigations are conducted.


Keywords: El Salvador; politics PMID:11160170

  13. The Checkered History of American Psychiatric Epidemiology

    PubMed Central

    Horwitz, Allan V; Grob, Gerald N

    2011-01-01

    Context American psychiatry has been fascinated with statistics ever since the specialty was created in the early nineteenth century. Initially, psychiatrists hoped that statistics would reveal the benefits of institutional care. Nevertheless, their fascination with statistics was far removed from the growing importance of epidemiology generally. The impetus to create an epidemiology of mental disorders came from the emerging social sciences, whose members were concerned with developing a scientific understanding of individual and social behavior and applying it to a series of pressing social problems. Beginning in the 1920s, the interest of psychiatric epidemiologists shifted to the ways that social environments contributed to the development of mental disorders. This emphasis dramatically changed after 1980 when the policy focus of psychiatric epidemiology became the early identification and prevention of mental illness in individuals. Methods This article reviews the major developments in psychiatric epidemiology over the past century and a half. Findings The lack of an adequate classification system for mental illness has precluded the field of psychiatric epidemiology from providing causal understandings that could contribute to more adequate policies to remediate psychiatric disorders. Because of this gap, the policy influence of psychiatric epidemiology has stemmed more from institutional and ideological concerns than from knowledge about the causes of mental disorders. Conclusion Most of the problems that have bedeviled psychiatric epidemiology since its inception remain unresolved. In particular, until epidemiologists develop adequate methods to measure mental illnesses in community populations, the policy contributions of this field will not be fully realized. PMID:22188350

  14. Structural topology design of container ship based on knowledge-based engineering and level set method

    NASA Astrophysics Data System (ADS)

    Cui, Jin-ju; Wang, De-yu; Shi, Qi-qi

    2015-06-01

    Knowledge-Based Engineering (KBE) is introduced into ship structural design in this paper. From the implementation of KBE, design solutions for both the Rules Design Method (RDM) and the Interpolation Design Method (IDM) are generated, together with the corresponding Finite Element (FE) models. Topological design of the longitudinal structures is studied, with a Gaussian Process (GP) employed to build the surrogate model for FE analysis. Multi-objective optimization methods inspired by the Pareto front are used to reduce the design tank weight and outer surface area simultaneously. Additionally, an enhanced Level Set Method (LSM) employing an implicit algorithm is applied to the topological design of the typical bracket plate used extensively in ship structures. Two different sets of boundary conditions are considered. The proposed methods show satisfactory efficiency and accuracy.

  15. Category's analysis and operational project capacity method of transformation in design

    NASA Astrophysics Data System (ADS)

    Obednina, S. V.; Bystrova, T. Y.

    2015-10-01

    The method of transformation is attracting widespread interest in fields such as contemporary design. However, in design theory little attention has been paid to the categorical status of the term "transformation". This paper presents a conceptual analysis of transformation based on the theory of form employed in the influential essays of Aristotle and Thomas Aquinas. In the present work, transformation as a method of shaping in design is explored, and the potential application of the term in design is demonstrated.

  16. Genetic-evolution-based optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Rao, S. S.; Pan, T. S.; Dhingra, A. K.; Venkayya, V. B.; Kumar, V.

    1990-01-01

    This paper presents the applicability of a biological model, based on genetic evolution, for engineering design optimization. Algorithms embodying the ideas of reproduction, crossover, and mutation are developed and applied to solve different types of structural optimization problems. Both continuous and discrete variable optimization problems are solved. A two-bay truss for maximum fundamental frequency is considered to demonstrate the continuous variable case. The selection of locations of actuators in an actively controlled structure, for minimum energy dissipation, is considered to illustrate the discrete variable case.

  17. Mixing Qualitative and Quantitative Methods: Insights into Design and Analysis Issues

    ERIC Educational Resources Information Center

    Lieber, Eli

    2009-01-01

    This article describes and discusses issues related to research design and data analysis in the mixing of qualitative and quantitative methods. It is increasingly desirable to use multiple methods in research, but questions arise as to how best to design and analyze the data generated by mixed methods projects. I offer a conceptualization for such…

  18. A Proposed Model of Retransformed Qualitative Data within a Mixed Methods Research Design

    ERIC Educational Resources Information Center

    Palladino, John M.

    2009-01-01

    Most models of mixed methods research design provide equal emphasis of qualitative and quantitative data analyses and interpretation. Other models stress one method more than the other. The present article is a discourse about the investigator's decision to employ a mixed method design to examine special education teachers' advocacy and…

  19. Computer control of large accelerators design concepts and methods

    SciTech Connect

    Beck, F.; Gormley, M.

    1984-05-01

    Unlike most of the specialities treated in this volume, control system design is still an art, not a science. These lectures are an attempt to produce a primer for prospective practitioners of this art. A large modern accelerator requires a comprehensive control system for commissioning, machine studies and day-to-day operation. Faced with the requirement to design a control system for such a machine, the control system architect has a bewildering array of technical devices and techniques at his disposal, and it is our aim in the following chapters to lead him through the characteristics of the problems he will have to face and the practical alternatives available for solving them. We emphasize good system architecture using commercially available hardware and software components, but in addition we discuss the actual control strategies which are to be implemented since it is at the point of deciding what facilities shall be available that the complexity of the control system and its cost are implicitly decided. 19 references.

  20. Flight critical system design guidelines and validation methods

    NASA Technical Reports Server (NTRS)

    Holt, H. M.; Lupton, A. O.; Holden, D. G.

    1984-01-01

    Efforts being expended at NASA-Langley to define a validation methodology, techniques for comparing advanced systems concepts, and design guidelines for characterizing fault tolerant digital avionics are described with an emphasis on the capabilities of AIRLAB, an environmentally controlled laboratory. AIRLAB has VAX 11/750 and 11/780 computers with an aggregate of 22 Mb memory and over 650 Mb storage, interconnected at 256 kbaud. An additional computer is programmed to emulate digital devices. Ongoing work is easily accessed at user stations by either chronological or key word indexing. The CARE III program aids in analyzing the capabilities of test systems to recover from faults. An additional code, the semi-Markov unreliability program (SURE) generates upper and lower reliability bounds. The AIRLAB facility is mainly dedicated to research on designs of digital flight-critical systems which must have acceptable reliability before incorporation into aircraft control systems. The digital systems would be too costly to submit to a full battery of flight tests and must be initially examined with the AIRLAB simulation capabilities.

  1. [Epidemiology and public policies].

    PubMed

    Barata, Rita Barradas

    2013-03-01

This essay deals with the relation between epidemiology and public policies, highlighting the position of epidemiology within the public health field, analyzing the impact of public policies on the epidemiological profile, and examining the contributions of epidemiology to the formulation, implementation, and evaluation of public health policies. The first section discusses the links between epidemiology and the public health field, the social determinants of health and the framework for political action proposed by the WHO Commission on Social Determinants of Health, and different approaches to health policy. The second section analyzes the reduction of child stunting in Brazil as an example of public policies that affect the epidemiological profile. The third section presents three strategic topics for the application of public health policies: reduction of social inequalities in health, health promotion, and regulation of products and services that affect health. The fourth section discusses the possibilities and difficulties of incorporating epidemiological knowledge into the formulation, implementation, and evaluation of public policies; finally, concrete examples of the relation between epidemiology and public policies are presented.

  2. Mental health epidemiological research in South America: recent findings

    PubMed Central

    Silva de Lima, Maurício; Garcia de Oliveira Soares, Bernardo; de Jesus Mari, Jair

    2004-01-01

This paper aims to review recent mental health epidemiological research conducted in South America. The Latin American and Caribbean (LILACS) database was searched from 1999 to 2003 using a specific strategy for the identification of cohort, case-control, and cross-sectional population-based studies in South America. The authors screened references and identified relevant studies. Further studies were obtained by contacting local experts in epidemiology. 140 references were identified, and 12 studies were selected. Most selected studies explored the prevalence of and risk factors for common mental disorders, and several of them used sophisticated methods of sample selection and analysis. There is a need to improve the quality of psychiatric journals in Latin America and to increase the distribution of and access to research data. Regionally relevant problems such as violence and substance abuse should be considered in designing future investigations in this area. PMID:16633474

  3. AI/OR computational model for integrating qualitative and quantitative design methods

    NASA Technical Reports Server (NTRS)

    Agogino, Alice M.; Bradley, Stephen R.; Cagan, Jonathan; Jain, Pramod; Michelena, Nestor

    1990-01-01

    A theoretical framework for integrating qualitative and numerical computational methods for optimally-directed design is described. The theory is presented as a computational model and features of implementations are summarized where appropriate. To demonstrate the versatility of the methodology we focus on four seemingly disparate aspects of the design process and their interaction: (1) conceptual design, (2) qualitative optimal design, (3) design innovation, and (4) numerical global optimization.

  4. Cathodic protection design using the regression and correlation method

    SciTech Connect

    Niembro, A.M.; Ortiz, E.L.G.

    1997-09-01

    A computerized statistical method which calculates the current demand requirement based on potential measurements for cathodic protection systems is introduced. The method uses the regression and correlation analysis of statistical measurements of current and potentials of the piping network. This approach involves four steps: field potential measurements, statistical determination of the current required to achieve full protection, installation of more cathodic protection capacity with distributed anodes around the plant and examination of the protection potentials. The procedure is described and recommendations for the improvement of the existing and new cathodic protection systems are given.
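The regression-and-correlation step described in this abstract can be sketched numerically: fit a linear regression of measured pipe-to-soil potential against applied current, then solve for the current that achieves the target protection potential. The measurements below are invented for illustration, and the -0.85 V (vs. Cu/CuSO4) criterion is a common industry target, not a value taken from the paper.

```python
# Illustrative sketch: estimate cathodic protection current demand by
# regressing measured potential on applied current, then solving for
# the current that reaches the protection criterion.
import statistics

currents = [0.0, 5.0, 10.0, 15.0, 20.0]           # applied current, A (illustrative)
potentials = [-0.55, -0.63, -0.70, -0.78, -0.85]  # measured potential, V (illustrative)

mean_i = statistics.mean(currents)
mean_v = statistics.mean(potentials)

# least-squares slope and intercept of potential vs. current
sxy = sum((i - mean_i) * (v - mean_v) for i, v in zip(currents, potentials))
sxx = sum((i - mean_i) ** 2 for i in currents)
slope = sxy / sxx                  # V per A
intercept = mean_v - slope * mean_i

# correlation coefficient, to judge how well the line fits the field data
syy = sum((v - mean_v) ** 2 for v in potentials)
r = sxy / (sxx ** 0.5 * syy ** 0.5)

target = -0.85                     # common protection criterion, V vs. Cu/CuSO4
required_current = (target - intercept) / slope
print(f"slope={slope:.4f} V/A, r={r:.3f}, required current={required_current:.1f} A")
```

With a strong correlation (|r| near 1), the fitted line gives a defensible estimate of the extra rectifier capacity needed before anodes are installed and potentials are re-examined.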

  5. A design method for minimizing sensitivity to plant parameter variations

    NASA Technical Reports Server (NTRS)

    Hadass, Z.; Powell, J. D.

    1974-01-01

A method is described for minimizing the sensitivity of multivariable systems to parameter variations. The variable parameters are treated as random variables, and their effect is included in a quadratic performance index. The performance index is a weighted sum of the state and control covariances that stem from both the random system disturbances and the parameter uncertainties. The numerical solution of the problem is described, and application of the method to several initially sensitive tracking systems is discussed. The sensitivity reduction factor was typically 2 or 3 relative to a design based on random system noise alone, at the cost of an increase in state RMS of only about a factor of two.

  6. Vaccine epidemiology: A review

    PubMed Central

    Lahariya, Chandrakant

    2016-01-01

This review article outlines the key concepts in vaccine epidemiology, such as basic reproduction numbers, force of infection, vaccine efficacy and effectiveness, vaccine failure, herd immunity, herd effect, epidemiological shift, and disease modeling, and describes the application of this knowledge both at the program level and in practice by family physicians, epidemiologists, and pediatricians. A case is made for increased knowledge and understanding of vaccine epidemiology among key stakeholders, including policy makers, immunization program managers, public health experts, pediatricians, family physicians, and other individuals involved in immunization service delivery. It is argued that knowledge of vaccine epidemiology is likely to benefit society through better-informed decision-making and improved vaccination coverage in low- and middle-income countries (LMICs). The article ends with suggestions for the provision of systematic training and learning platforms in vaccine epidemiology to save millions of preventable deaths and improve health outcomes across the life course. PMID:27453836
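Two of the concepts this review names, the basic reproduction number and herd immunity, connect through a standard formula: the herd-immunity threshold is 1 - 1/R0, and the critical vaccination coverage divides that by vaccine effectiveness. A minimal sketch, using illustrative R0 values rather than measurements:

```python
# Standard vaccine-epidemiology relationships: herd-immunity threshold
# from R0, and critical vaccination coverage for an imperfect vaccine.
def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune: 1 - 1/R0."""
    return 1.0 - 1.0 / r0

def critical_coverage(r0: float, effectiveness: float) -> float:
    """Vaccination coverage needed when the vaccine is imperfect."""
    return herd_immunity_threshold(r0) / effectiveness

# Illustrative R0 values only, not estimates for any real outbreak.
for disease, r0 in [("low-R0 pathogen", 2.0), ("high-R0 pathogen", 15.0)]:
    hit = herd_immunity_threshold(r0)
    vc = critical_coverage(r0, effectiveness=0.95)
    print(f"{disease}: R0={r0}, immunity threshold={hit:.0%}, coverage needed={vc:.0%}")
```

The second case shows why highly transmissible pathogens demand near-universal coverage: with R0 = 15, even a 95%-effective vaccine requires coverage close to the whole population.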

  7. Computational Methods for Aerodynamic Design (Inverse) and Optimization

    DTIC Science & Technology

    1990-01-01

Airfoils with Given Velocity Distribution in Incompressible Flow," J. Aircraft, Vol. 10, 1973, pp. 651-659. 7. Polito, L., "Un Metodo Esatto per il Progetto...and the Simpson rule. Using a panel arrangement method with properly increased panel density in regions with comparatively large rv-variations, use of

  8. The Use of Hermeneutics in a Mixed Methods Design

    ERIC Educational Resources Information Center

    von Zweck, Claudia; Paterson, Margo; Pentland, Wendy

    2008-01-01

    Combining methods in a single study is becoming a more common practice because of the limitations of using only one approach to fully address all aspects of a research question. Hermeneutics in this paper is discussed in relation to a large national study that investigated issues influencing the ability of international graduates to work as…

  9. Stochastic Methods in Protective Structure Design: An Integrated Approach

    DTIC Science & Technology

    1988-09-01

Histogram and probability plots for the Monte Carlo method and for response...static Monte Carlo simulation for ACI shear and shear response...static Monte Carlo simulation for direct shear and shear response...problem to a wave-propagation, breaching, or penetration problem. A simple Monte Carlo simulation of the range versus pressure function would

  10. Experimental Evaluation of Design Methods for Hardened Piping Systems.

    DTIC Science & Technology

prediction capabilities of present day computer methods. The basic pipe elements tested included straight pipes, area changes, elbows, valves, a pump, and...surge tanks. The piping system tested was a closed loop system which contained the following elements: elbows, straight pipes, valves, a pump, and an

  11. Power Analysis for Complex Mediational Designs Using Monte Carlo Methods

    ERIC Educational Resources Information Center

    Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.

    2010-01-01

    Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex…
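The Monte Carlo approach to mediation power that this abstract points toward can be sketched for the simplest single-mediator model (X → M → Y): simulate data repeatedly, and count how often the indirect effect is detected. The sketch below uses the joint-significance test for the a and b paths, a normal approximation to the t critical value, and illustrative effect sizes; it is not the authors' framework, only a minimal example of the idea.

```python
# Minimal Monte Carlo power analysis for a single-mediator model,
# using joint significance of the a (X->M) and b (M->Y) paths.
import numpy as np

rng = np.random.default_rng(42)

def path_significant(x, y, covar=None):
    """Two-sided test of the OLS slope for y ~ x (+ optional covariate)."""
    cols = [np.ones_like(x), x] if covar is None else [np.ones_like(x), x, covar]
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - X.shape[1]
    s2 = resid @ resid / dof
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return abs(beta[1] / se) > 1.96   # normal approximation, alpha = 0.05

def mediation_power(n, a=0.3, b=0.3, reps=2000):
    """Fraction of replications in which both mediation paths are detected."""
    hits = 0
    for _ in range(reps):
        x = rng.standard_normal(n)
        m = a * x + rng.standard_normal(n)       # a path
        y = b * m + rng.standard_normal(n)       # b path (no direct effect here)
        if path_significant(x, m) and path_significant(m, y, covar=x):
            hits += 1
    return hits / reps

print(f"estimated power at n=100: {mediation_power(100):.2f}")
```

Repeating this over a grid of sample sizes yields the power curve a researcher would use to plan a study; latent-variable or growth-curve mediation models extend the same simulate-and-count logic with a more elaborate data-generating step.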

  12. The research progress on Hodograph Method of aerodynamic design at Tsinghua University

    NASA Technical Reports Server (NTRS)

    Chen, Zuoyi; Guo, Jingrong

    1991-01-01

Progress in the use of the Hodograph method of aerodynamic design is discussed. Some restrictive conditions were found to apply when Hodograph design is used for transonic turbine and compressor cascades. The Hodograph method is suitable not only for transonic turbine cascades but also for transonic compressor cascades. A three-dimensional Hodograph method will be developed once the basic equations for it have been obtained. As an example, the use of the method to design transonic turbine and compressor cascades is discussed.

  13. The challenges of using epidemiology to inform clinical practice

    PubMed Central

    Kessler, Ronald C.

    2007-01-01

This paper discusses challenges and prospects for increasing the clinical relevance of psychiatric epidemiological research. The discussion begins with a review of the structural reasons why current psychiatric epidemiological research has less clinical relevance than epidemiological research in other areas of medicine. The discussion then turns to ways in which the focus of psychiatric epidemiological research might be changed to increase its clinical relevance. A review is then presented of recent innovations in community psychiatric epidemiological research that were designed to increase clinical relevance. An argument is then made that the full clinical value of psychiatric epidemiology will only be realized when community epidemiology becomes better integrated with clinical epidemiology and the latter takes on a more prominent role than it currently has in psychiatric research. Existing initiatives to integrate community psychiatric epidemiology with clinical epidemiology are then reviewed. Finally, an agenda is proposed for an expansion of clinical psychiatric epidemiology to include a focus on both naturalistic and quasi-experimental studies of illness course and treatment response in diverse clinical samples. PMID:17896231

  14. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, Best Estimate Method and Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations on a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  15. Comparison of deterministic and Monte Carlo methods in shielding design.

    PubMed

    Oliveira, A D; Oliveira, C

    2005-01-01

In shielding calculations, deterministic methods have some advantages and some disadvantages relative to other kinds of codes, such as Monte Carlo. The main advantage is the short computer time needed to find solutions, while the disadvantages are related to the often-used build-up factor, which is extrapolated from high to low energies or applied under unknown geometrical conditions, and can therefore lead to significant errors in shielding results. The aim of this work is to investigate how well some deterministic methods calculate low-energy shielding, using attenuation coefficients and build-up factor corrections. The commercial software MicroShield 5.05 was used as the deterministic code, while MCNP was used as the Monte Carlo code. Point and cylindrical sources with slab shields were defined, allowing comparison of the capabilities of Monte Carlo and deterministic methods in day-to-day shielding calculations using sensitivity analysis of significant parameters, such as energy and geometrical conditions.
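The deterministic point-kernel calculation being benchmarked against Monte Carlo here can be sketched in a few lines: uncollided flux from a point source behind a slab, multiplied by a build-up factor to account for scattered photons. The attenuation coefficient below is roughly that of water near 1 MeV, and the build-up value is an assumed illustration, not output of either MicroShield or MCNP.

```python
# Deterministic point-kernel sketch: attenuated point-source flux with
# a build-up factor correction for scattered radiation.
import math

def point_source_flux(source_strength, mu, thickness, distance, buildup=1.0):
    """Photon flux (photons/cm^2/s) at `distance` cm from a point source
    of `source_strength` photons/s behind a slab of `thickness` cm with
    linear attenuation coefficient `mu` (1/cm)."""
    uncollided = source_strength * math.exp(-mu * thickness) / (4.0 * math.pi * distance ** 2)
    return buildup * uncollided

S = 1e9     # photons/s (illustrative)
mu = 0.07   # 1/cm, roughly water at ~1 MeV (approximate)
t = 10.0    # slab thickness, cm
r = 100.0   # detector distance, cm

bare = point_source_flux(S, mu, t, r)
corrected = point_source_flux(S, mu, t, r, buildup=3.0)  # assumed build-up factor
print(f"uncollided flux: {bare:.3e}, with build-up: {corrected:.3e}")
```

The paper's point is visible even in this toy: the answer is only as good as the build-up factor, which is exactly the quantity that becomes unreliable at low energies or in unusual geometries, where a Monte Carlo code tracks the scattered photons explicitly.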

  16. Matching wind turbine rotors and loads: Computational methods for designers

    NASA Astrophysics Data System (ADS)

    Seale, J. B.

    1983-04-01

A comprehensive method for matching wind energy conversion system (WECS) rotors with the load characteristics of common electrical and mechanical applications is reported, along with a procedure for converting the data into useful results: (1) from turbine efficiency and load torque characteristics, turbine power is predicted as a function of windspeed; (2) it is decided how turbine power is to be governed to ensure the safety of all components; (3) mechanical conversion efficiency comes into play to predict how useful delivered power varies with windspeed; (4) wind statistics are used to predict long-term energy output. Most systems can be treated with a graph-and-calculator approach. The method leads to energy predictions and to insight into the modeled processes. A computer program provides more sophisticated calculations where a highly unusual system is to be modeled, where accuracy is at a premium, or where error analysis is required. The analysis is fleshed out with in-depth case studies for induction generator and inverter utility systems; battery chargers; resistance heaters; positive displacement pumps, including three different load compensation strategies; and centrifugal pumps with unregulated electric power transmission from turbine to pump.
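Steps (1) and (4) of the procedure above can be sketched numerically: predict turbine power from wind speed via a power coefficient, then weight by a Weibull wind-speed distribution to estimate long-term energy output. All parameters below (rotor radius, power coefficient, rated power, cut-in/cut-out speeds, Weibull shape and scale) are illustrative assumptions, not values from the report.

```python
# Sketch of turbine/load matching: power curve from a power coefficient,
# then long-term energy via a Weibull wind-speed distribution.
import math

RHO = 1.225                    # air density, kg/m^3
RADIUS = 10.0                  # rotor radius, m (illustrative)
CP = 0.40                      # assumed overall power coefficient
RATED_KW = 100.0               # assumed rated (governed) power
CUT_IN, CUT_OUT = 4.0, 25.0    # m/s (illustrative governing limits)

def turbine_power_kw(v):
    """Delivered power (kW) at wind speed v (m/s), capped at rated power."""
    if v < CUT_IN or v > CUT_OUT:
        return 0.0
    area = math.pi * RADIUS ** 2
    p = 0.5 * RHO * area * CP * v ** 3 / 1000.0
    return min(p, RATED_KW)

def weibull_pdf(v, k=2.0, c=7.0):
    """Weibull wind-speed density (k=2 is the Rayleigh special case)."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

# Numerically integrate power * probability to get mean power, then
# scale by hours per year for long-term energy output.
dv = 0.1
mean_kw = sum(turbine_power_kw(v) * weibull_pdf(v) * dv
              for v in (i * dv for i in range(1, 400)))
print(f"mean power: {mean_kw:.1f} kW, annual energy: {mean_kw * 8760:.0f} kWh")
```

This is the same integral a graph-and-calculator user evaluates by reading the power curve against wind statistics; a program becomes worthwhile when the load (e.g. a pump with compensation) makes the power curve itself harder to construct.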

  17. A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits

    NASA Astrophysics Data System (ADS)

    Moradi, Behzad; Mirzaei, Abdolreza

    2016-11-01

A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of the automated design is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated into the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that distinguishes between high- and low-fitness areas of the design space. The learning process can detect the right directions of evolution and lead to large steps in the evolution of the individuals. The learning phase shortens the evolution process and markedly reduces the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process in order to reduce the design space as well as the design time. Circuit evaluation is performed by the HSPICE simulator. In order to improve design accuracy, the bsim3v3 CMOS transistor model is adopted. The proposed method is tested on three different operational amplifier circuits, and its performance is verified by comparing it with an evolutionary strategy algorithm and other similar methods.

  18. Comparison of Three Statistical Methods for Establishing Tentative Wild-Type Population and Epidemiological Cutoff Values for Echinocandins, Amphotericin B, Flucytosine, and Six Candida Species as Determined by the Colorimetric Sensititre YeastOne Method

    PubMed Central

    Pemán, Javier; Hervás, David; Iñiguez, Carmen; Navarro, David; Echeverría, Julia; Martínez-Alarcón, José; Fontanals, Dionisia; Gomila-Sard, Bárbara; Buendía, Buenaventura; Torroba, Luis; Ayats, Josefina; Bratos, Angel; Sánchez-Reus, Ferran; Fernández-Natal, Isabel

    2012-01-01

The Sensititre YeastOne (SYO) method is widely used to determine the susceptibility of Candida spp. to antifungal agents. CLSI clinical breakpoints (CBP) have been reported for antifungals, but not for this method. In the absence of CBP, epidemiological cutoff values (ECVs) are useful to separate wild-type (WT) isolates (those without mechanisms of resistance) from non-WT isolates (those that can harbor some resistance mechanisms), which is the goal of any susceptibility test. The ECVs for five agents, obtained using the MIC distributions determined by the SYO test, were calculated by five methods: three statistical methods, plus the MIC50 and the modal MIC, each raised by two 2-fold dilutions. The median ECVs (in mg/liter) (% of isolates inhibited by MICs equal to or less than the ECV; number of isolates tested) of the five methods for anidulafungin, micafungin, caspofungin, amphotericin B, and flucytosine, respectively, were as follows: 0.25 (98.5%; 656), 0.06 (95.1%; 659), 0.25 (98.7%; 747), 2 (100%; 923), and 1 (98.5%; 915) for Candida albicans; 8 (100%; 352), 4 (99.2%; 392), 2 (99.2%; 480), 1 (99.8%; 603), and 0.5 (97.9%; 635) for C. parapsilosis; 1 (99.2%; 123), 0.12 (99.2%; 121), 0.25 (99.2%; 138), 2 (100%; 171), and 0.5 (97.2%; 175) for C. tropicalis; 0.12 (96.6%; 174), 0.06 (96%; 176), 0.25 (98.4%; 188), 2 (100%; 209), and 0.25 (97.6%; 208) for C. glabrata; 0.25 (97%; 33), 0.5 (93.9%; 33), 1 (91.9%; 37), 4 (100%; 51), and 32 (100%; 53) for C. krusei; and 4 (100%; 33), 2 (100%; 33), 2 (100%; 54), 1 (100%; 90), and 0.25 (93.4%; 91) for C. orthopsilosis. The three statistical methods gave similar ECVs (within one dilution) and included ≥95% of isolates. These tentative ECVs would be useful for monitoring the emergence of isolates with reduced susceptibility by use of the SYO method. PMID:23015676
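The two non-statistical ECV estimates named in this abstract, the modal MIC and the MIC50 each raised by two 2-fold dilutions, are simple enough to sketch. The MIC distribution below is invented for illustration (it is not one of the study's distributions), and the dilution series is a typical 2-fold ladder.

```python
# Sketch of two simple ECV estimates from a MIC distribution:
# modal MIC + 2 dilutions and MIC50 + 2 dilutions.
mic_counts = {0.03: 5, 0.06: 40, 0.12: 120, 0.25: 60, 0.5: 10, 1.0: 3, 8.0: 2}
series = [0.03, 0.06, 0.12, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0]  # 2-fold dilution ladder

mics = sorted(mic_counts)
total = sum(mic_counts.values())

# modal MIC: the most frequent dilution
modal = max(mics, key=lambda m: mic_counts[m])

# MIC50: lowest dilution at which the cumulative count reaches 50%
cum = 0
for m in mics:
    cum += mic_counts[m]
    if cum >= total / 2:
        mic50 = m
        break

def plus_two_dilutions(mic):
    """Two 2-fold dilution steps above `mic` on the ladder."""
    return series[series.index(mic) + 2]

for label, base in (("modal", modal), ("MIC50", mic50)):
    ecv = plus_two_dilutions(base)
    covered = sum(c for m, c in mic_counts.items() if m <= ecv) / total
    print(f"ECV from {label} MIC: {ecv} mg/liter covers {covered:.1%} of isolates")
```

The coverage check mirrors the paper's criterion that a reasonable ECV should include at least ~95% of the presumed wild-type population; the two rare high-MIC isolates above the cutoff are the non-WT candidates such an ECV is meant to flag.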

  19. Constructal method to optimize solar thermochemical reactor design

    SciTech Connect

    Tescari, S.; Mazet, N.; Neveu, P.

    2010-09-15

The objective of this study is the geometrical optimization of a thermochemical reactor that works simultaneously as solar collector and reactor. The heat (concentrated solar radiation) is supplied on a small peripheral surface and has to be dispersed through the entire reactive volume in order to activate the reaction throughout the material. The similarity between this problem and the point-to-volume problem analyzed by the constructal approach (Bejan, 2000) is evident. That approach has been successfully applied in several domains, for example to coupled mass and conductive heat transfer (Azoumah et al., 2004). Focusing on solar reactors, this work applies constructal analysis to coupled conductive and radiative heat transfer. As a first step, the chemical reaction is represented by a uniform heat sink inside the material. The objective is to optimize the reactor geometry in order to maximize its efficiency. Under simplifying hypotheses, a closed-form solution is found. A parametric study gives the influence of different technical and operating parameters on the maximal efficiency and on the optimal shape. Different reactor designs (filled cylinder, cavity, and honeycomb reactors) are compared in order to determine the most efficient structure for given operating conditions. Finally, the results are compared with a CFD model in order to validate the assumptions. (author)

  20. Defining Requirements and Related Methods for Designing Sensorized Garments.

    PubMed

    Andreoni, Giuseppe; Standoli, Carlo Emilio; Perego, Paolo

    2016-05-26

Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job-condition analysis, etc. This paper discusses user, textile, and technical issues to be faced in the development of sensorized clothes. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetic. In terms of these requirements, the user's age, the target application, and fashion trends cannot be ignored, because they determine compliance with the wearable system. Regarding textile requirements, the functional factors, which also influence user comfort, are elasticity and washability, while the more technical properties are the stability of chemical treatments, needed to preserve the sensors' efficacy and reliability and to assure the proper duration of the product over its complete life cycle. On the technical side, physiological issues are the most important: skin conductance, tolerance, irritation, and the effects of sweat and perspiration are key factors for reliable sensing. Other technical features, such as battery size and duration and the form factor of the sensor collector, should also be considered, as they affect the aesthetic requirements, which have proven to be crucial, as well as comfort and wearability.