Science.gov

Sample records for design epidemiological methods

  1. Epidemiological designs for vaccine safety assessment: methods and pitfalls.

    PubMed

    Andrews, Nick

    2012-09-01

    Three commonly used designs for vaccine safety assessment post-licensure are cohort, case-control and self-controlled case series. These methods are often used with routine health databases and immunisation registries. This paper considers the issues that may arise when designing an epidemiological study, such as understanding the vaccine safety question, case definition and finding, limitations of data sources, uncontrolled confounding, and pitfalls that apply to the individual designs. The example of MMR and autism, where all three designs have been used, is presented to help consider these issues. PMID:21985898

  2. The Swedish Longitudinal Gambling Study (Swelogs): design and methods of the epidemiological (EP-) track

    PubMed Central

    Romild, Ulla; Volberg, Rachel; Abbott, Max

    2014-01-01

    Swelogs (Swedish Longitudinal Gambling Study) epidemiological (EP-) track is a prospective study with four waves of data collection among Swedish citizens aged 16–84 years at baseline. The major objectives of this track are to provide general population estimates of the prevalence and incidence of problem and at-risk gambling and enable comparisons with the first Swedish national study on gambling and problem gambling (Swegs) conducted in 1997/1998. The overall study (Swelogs) comprises three tracks of data collection; one epidemiological, one in-depth and one follow-up. It is expected to provide information that will inform the development of evidence-based methods and strategies to prevent the development of gambling problems. This paper gives an overview of the design of the epidemiological track, especially of its first two waves. The baseline wave, performed between October 2008 and August 2009, included 8165 subjects, of whom 6021 were re-assessed one year later. A stratified random sampling procedure was applied. Computer-supported telephone interviews were used as the primary method. Postal questionnaires were used to follow-up those not reached by telephone. The response rate was 55% in the first wave and 74% in the second. The interview and questionnaire data are supplemented by register data. © 2014 The Authors. International Journal of Methods in Psychiatric Research published by John Wiley & Sons Ltd. PMID:24942902

  3. Principles of study design in environmental epidemiology.

    PubMed Central

    Morgenstern, H; Thomas, D

    1993-01-01

    This paper discusses the principles of study design and related methodologic issues in environmental epidemiology. Emphasis is given to studies aimed at evaluating causal hypotheses regarding exposures to suspected health hazards. Following background sections on the quantitative objectives and methods of population-based research, we present the major types of observational designs used in environmental epidemiology: first, the three basic designs involving the individual as the unit of analysis (i.e., cohort, cross-sectional, and case-control studies) and a brief discussion of genetic studies for assessing gene-environment interactions; second, various ecologic designs involving the group or region as the unit of analysis. Ecologic designs are given special emphasis in this paper because of our lack of resources or inability to accurately measure environmental exposures in large numbers of individuals. The paper concludes with a section highlighting current design issues in environmental epidemiology and several recommendations for future work. PMID:8206038

  4. Overview of the epidemiology methods and applications: strengths and limitations of observational study designs.

    PubMed

    Colditz, Graham A

    2010-01-01

    The impact of study design on the results of medical research has long been an area of both substantial debate and a smaller body of empirical research. Examples come from many disciplines within clinical and public health research. Among the early major contributions in the 1970s was work by Mosteller and colleagues (Gilbert et al., 1977), who noted that innovations in surgery and anesthesia showed greater gains than standard therapy when nonrandomized, controlled trials were evaluated compared with the gains reported in randomized, controlled trials. More recently, we and others have evaluated the impact of design in medical and surgical research, and concluded that the mean gain comparing new therapies to established therapies was biased by study design in nonrandomized trials (Colditz et al., 1989; Miller et al., 1989). Benson and Hartz (2000) conducted a study in which they focused only on studies reported after 1985. On the basis of 136 reports of 19 diverse treatments, Benson and Hartz concluded that in only 2 of the 19 analyses did the combined data from the observational studies lie outside the 95% confidence interval for the combined data from the randomized trials. A similar study drew only on data reported from 1991 to 1995, which showed remarkably similar results among observational studies and randomized, controlled trials (Concato et al., 2000). These more recent data suggest that advancing the study design and analytic methods may reduce bias in some evaluations of medical and public health interventions. Such methods apply not only to the original studies, but also to the approaches that are taken to quantitatively combine results by using meta-analytic approaches such as random effects meta-regression, Bayesian meta-analysis, and the like (Normand, 1999). By focusing attention on thorough data analysis, design issues can be understood and their impact or bias can be estimated, on average, and then ideally accounted for in the interpretation of…
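
    The meta-analytic machinery mentioned above can be made concrete. Below is a minimal sketch of DerSimonian-Laird random-effects pooling, the simplest relative of the approaches named (random-effects meta-regression and Bayesian meta-analysis build on the same ingredients); the study estimates and variances are hypothetical, not taken from the works cited.

```python
import numpy as np

def random_effects_pool(estimates, variances):
    """DerSimonian-Laird random-effects pooling of per-study estimates."""
    est = np.asarray(estimates, dtype=float)
    var = np.asarray(variances, dtype=float)
    w = 1.0 / var                        # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * est) / np.sum(w)
    q = np.sum(w * (est - fixed) ** 2)   # Cochran's Q heterogeneity statistic
    df = est.size - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)        # between-study variance estimate
    w_re = 1.0 / (var + tau2)            # random-effects weights
    pooled = np.sum(w_re * est) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical log odds ratios and variances from five studies
pooled, se, tau2 = random_effects_pool(
    [0.10, 0.35, -0.05, 0.20, 0.50], [0.04, 0.02, 0.05, 0.03, 0.06])
```

    When the heterogeneity statistic Q falls below its degrees of freedom, as with these illustrative numbers, the between-study variance estimate is truncated at zero and the pooled result coincides with the fixed-effect estimate.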

  5. Epidemiologic study of residential proximity to transmission lines and childhood cancer in California: description of design, epidemiologic methods and study population.

    PubMed

    Kheifets, Leeka; Crespi, Catherine M; Hooper, Chris; Oksuzyan, Sona; Cockburn, Myles; Ly, Thomas; Mezei, Gabor

    2015-01-01

    We conducted a large epidemiologic case-control study in California to examine the association between childhood cancer risk and distance from the home address at birth to the nearest high-voltage overhead transmission line as a replication of the study of Draper et al. in the United Kingdom. We present a detailed description of the study design, methods of case ascertainment, control selection, exposure assessment and data analysis plan. A total of 5788 childhood leukemia cases and 3308 childhood central nervous system cancer cases (included for comparison) and matched controls were available for analysis. Birth and diagnosis addresses of cases and birth addresses of controls were geocoded. Distance from the home to nearby overhead transmission lines was ascertained on the basis of the electric power companies' geographic information system (GIS) databases, additional Google Earth aerial evaluation and site visits to selected residences. We evaluated distances to power lines up to 2000 m and included consideration of lower voltages (60-69 kV). Distance measures based on GIS and Google Earth evaluation showed close agreement (Pearson correlation >0.99). Our three-tiered approach to exposure assessment allowed us to achieve high specificity, which is crucial for studies of rare diseases with low exposure prevalence. PMID:24045429

  6. DESIGN STRATEGIES FOR EPIDEMIOLOGIC STUDIES OF ENVIRONMENTAL IMPACTS ON HEALTH

    EPA Science Inventory

    The paper describes epidemiologic designs and methods in studies of health effects of air pollution, whose implications, however, can be extended to the detection of health effects of other environmental exposures. Recent advances in measurement technology for the assessment of ...

  7. Melanocortin-1 receptor, skin cancer and phenotypic characteristics (M-SKIP) project: study design and methods for pooling results of genetic epidemiological studies

    PubMed Central

    2012-01-01

    Background For complex diseases like cancer, pooled-analysis of individual data represents a powerful tool to investigate the joint contribution of genetic, phenotypic and environmental factors to the development of a disease. Pooled-analysis of epidemiological studies has many advantages over meta-analysis, and preliminary results may be obtained faster and with lower costs than with prospective consortia. Design and methods Based on our experience with the study design of the Melanocortin-1 receptor (MC1R) gene, SKin cancer and Phenotypic characteristics (M-SKIP) project, we describe the most important steps in planning and conducting a pooled-analysis of genetic epidemiological studies. We then present the statistical analysis plan that we are going to apply, giving particular attention to methods of analysis recently proposed to account for between-study heterogeneity and to explore the joint contribution of genetic, phenotypic and environmental factors in the development of a disease. Within the M-SKIP project, data on 10,959 skin cancer cases and 14,785 controls from 31 international investigators were checked for quality and recoded for standardization. We first proposed to fit the aggregated data with random-effects logistic regression models. However, for the M-SKIP project, a two-stage analysis will be preferred to overcome the problem regarding the availability of different study covariates. The joint contribution of MC1R variants and phenotypic characteristics to skin cancer development will be studied via logic regression modeling. Discussion Methodological guidelines to correctly design and conduct pooled-analyses are needed to facilitate application of such methods, thus providing a better summary of the actual findings on specific fields. PMID:22862891

  8. An Introduction to Epidemiologic and Statistical Methods Useful in Environmental Epidemiology

    PubMed Central

    Nitta, Hiroshi; Yamazaki, Shin; Omori, Takashi; Sato, Tosiya

    2010-01-01

    Many developments in the design and analysis of environmental epidemiology have been made in air pollution studies. In the analysis of the short-term effects of particulate matter on daily mortality, Poisson regression models with flexible smoothing methods have been developed for the analysis of time-series data. Another option for such studies is the use of case–crossover designs, and there have been extensive discussions on the selection of control periods. In the Study on Respiratory Disease and Automobile Exhaust project conducted by the Japanese Ministry of the Environment, we adopted a new 2-stage case–control design that is efficient when both exposure and disease are rare. Based on our experience in conducting air pollution epidemiologic studies, we review 2-stage case–control designs, case–crossover designs, generalized linear models, generalized additive models, and generalized estimating equations, all of which are useful approaches in environmental epidemiology. PMID:20431236
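
    As a concrete illustration of the time-series approach described above, the sketch below fits a Poisson log-linear regression to simulated daily mortality counts by iteratively reweighted least squares, with simple harmonic terms standing in for the flexible smoothers used in practice; all rates and data are simulated, not drawn from the project described.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two simulated years of daily data: seasonal mortality plus a pollutant effect.
days = np.arange(730)
pollutant = rng.gamma(2.0, 1.0, size=days.size)          # arbitrary exposure scale
season = np.column_stack([np.sin(2 * np.pi * days / 365.25),
                          np.cos(2 * np.pi * days / 365.25)])
X = np.column_stack([np.ones(days.size), season, pollutant])
beta_true = np.array([3.0, 0.2, 0.1, 0.05])
counts = rng.poisson(np.exp(X @ beta_true))

# Poisson regression fitted by iteratively reweighted least squares (IRLS).
beta = np.zeros(X.shape[1])
beta[0] = np.log(counts.mean())          # start at the right overall scale
for _ in range(50):
    mu = np.exp(X @ beta)                # current fitted means
    z = X @ beta + (counts - mu) / mu    # working response
    w = mu                               # working weights (Poisson variance)
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * z))

pollutant_effect = beta[-1]              # estimated log rate ratio per unit pollutant
```

    With 730 days of data the fitted pollutant coefficient lands close to the simulated value of 0.05; in real analyses the harmonic terms would be replaced by splines or other flexible smoothers of time and weather.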

  9. Epidemiologic methods in analysis of scientific issues

    NASA Astrophysics Data System (ADS)

    Erdreich, Linda S.

    2003-10-01

    Studies of human populations provide much of the information that is used to evaluate compensation cases for hearing loss, including rates of hearing loss by age, and dose-response relationships. The reference data used to make decisions regarding workman's compensation is based on epidemiologic studies of cohorts of workers exposed to various noise levels. Epidemiology and its methods can be used in other ways in the courtroom; to assess the merits of a complaint, to support Daubert criteria, and to explain scientific issues to the trier of fact, generally a layperson. Using examples other than occupational noise induced hearing loss, these methods will be applied to respond to a complaint that hearing loss followed exposure to a sudden noise, a medication, or an occupational chemical, and thus was caused by said exposure. The standard criteria for assessing the weight of the evidence, and epidemiologic criteria for causality show the limits of such anecdotal data and incorporate quantitative and temporal issues. Reports of clusters of cases are also intuitively convincing to juries. Epidemiologic methods provide a scientific approach to assess whether rates of the outcome are indeed increased, and the extent to which increased rates provide evidence for causality.

  10. Using Epidemiologic Methods to Test Hypotheses regarding Causal Influences on Child and Adolescent Mental Disorders

    ERIC Educational Resources Information Center

    Lahey, Benjamin B.; D'Onofrio, Brian M.; Waldman, Irwin D.

    2009-01-01

    Epidemiology uses strong sampling methods and study designs to test refutable hypotheses regarding the causes of important health, mental health, and social outcomes. Epidemiologic methods are increasingly being used to move developmental psychopathology from studies that catalogue correlates of child and adolescent mental health to designs that…

  11. Optimal design for epidemiological studies subject to designed missingness.

    PubMed

    Morara, Michele; Ryan, Louise; Houseman, Andres; Strauss, Warren

    2007-12-01

    In large epidemiological studies, budgetary or logistical constraints will typically preclude study investigators from measuring all exposures, covariates and outcomes of interest on all study subjects. We develop a flexible theoretical framework that incorporates a number of familiar designs such as case control and cohort studies, as well as multistage sampling designs. Our framework also allows for designed missingness and includes the option for outcome dependent designs. Our formulation is based on maximum likelihood and generalizes well known results for inference with missing data to the multistage setting. A variety of techniques are applied to streamline the computation of the Hessian matrix for these designs, facilitating the development of an efficient software tool to implement a wide variety of designs. PMID:18080755

  12. Kinetics methods for clinical epidemiology problems

    PubMed Central

    Corlan, Alexandru Dan; Ross, John

    2015-01-01

    Calculating the probability of each possible outcome for a patient at any time in the future is currently possible only in the simplest cases: short-term prediction in acute diseases of otherwise healthy persons. This problem is to some extent analogous to predicting the concentrations of species in a reactor when knowing initial concentrations and after examining reaction rates at the individual molecule level. The existing theoretical framework behind predicting contagion and the immediate outcome of acute diseases in previously healthy individuals is largely analogous to deterministic kinetics of chemical systems consisting of one or a few reactions. We show that current statistical models commonly used in chronic disease epidemiology correspond to simple stochastic treatment of single reaction systems. The general problem corresponds to stochastic kinetics of complex reaction systems. We attempt to formulate epidemiologic problems related to chronic diseases in chemical kinetics terms. We review methods that may be adapted for use in epidemiology. We show that some reactions cannot fit into the mass-action law paradigm and solutions to these systems would frequently exhibit an antiportfolio effect. We provide a complete example application of stochastic kinetics modeling for a deductive meta-analysis of two papers on atrial fibrillation incidence, prevalence, and mortality. PMID:26578757
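
    The stochastic-kinetics analogy can be illustrated with a Gillespie-style simulation, treating disease-state transitions as "reactions" whose propensities are proportional to the number of people in each state. The states and rates below are a toy construction, not taken from the atrial fibrillation meta-analysis described.

```python
import numpy as np

rng = np.random.default_rng(42)

# "Reactions": Healthy -> AF (incidence), AF -> Dead, Healthy -> Dead.
# Rates are illustrative per-person-year values.
inc, mort_af, mort_bg = 0.02, 0.06, 0.01
H, A, D = 10_000, 0, 0
t, t_max = 0.0, 20.0

while t < t_max and (H + A) > 0:
    rates = np.array([inc * H, mort_af * A, mort_bg * H])
    total = rates.sum()
    t += rng.exponential(1.0 / total)          # waiting time to the next event
    if t >= t_max:
        break
    event = rng.choice(3, p=rates / total)     # which reaction fires
    if event == 0:
        H, A = H - 1, A + 1                    # new AF case
    elif event == 1:
        A, D = A - 1, D + 1                    # death with AF
    else:
        H, D = H - 1, D + 1                    # background death

prevalence = A / (H + A)                       # AF prevalence among survivors
```

    Each event conserves the cohort size, and with 10,000 people the simulated prevalence after 20 years sits close to the deterministic mass-action solution of the same system.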

  14. DESIGN OF EXPOSURE MEASUREMENTS FOR EPIDEMIOLOGIC STUDIES

    EPA Science Inventory

    This presentation will describe the following items: (1) London daily air pollution and deaths that demonstrate how time series epidemiology can indicate that air pollution caused death; (2) Sophisticated statistical models required to establish this relationship for lower pollut...

  15. Methods of Measurement in epidemiology: Sedentary Behaviour

    PubMed Central

    Atkin, Andrew J; Gorely, Trish; Clemes, Stacy A; Yates, Thomas; Edwardson, Charlotte; Brage, Soren; Salmon, Jo; Marshall, Simon J; Biddle, Stuart JH

    2012-01-01

    Background Research examining sedentary behaviour as a potentially independent risk factor for chronic disease morbidity and mortality has expanded rapidly in recent years. Methods We present a narrative overview of the sedentary behaviour measurement literature. Subjective and objective methods of measuring sedentary behaviour suitable for use in population-based research with children and adults are examined. The validity and reliability of each method is considered, gaps in the literature specific to each method identified and potential future directions discussed. Results To date, subjective approaches to sedentary behaviour measurement, e.g. questionnaires, have focused predominantly on TV viewing or other screen-based behaviours. Typically, such measures demonstrate moderate reliability but slight to moderate validity. Accelerometry is increasingly being used for sedentary behaviour assessments; this approach overcomes some of the limitations of subjective methods, but detection of specific postures and postural changes by this method is somewhat limited. Instruments developed specifically for the assessment of body posture have demonstrated good reliability and validity in the limited research conducted to date. Miniaturization of monitoring devices, interoperability between measurement and communication technologies and advanced analytical approaches are potential avenues for future developments in this field. Conclusions High-quality measurement is essential in all elements of sedentary behaviour epidemiology, from determining associations with health outcomes to the development and evaluation of behaviour change interventions. Sedentary behaviour measurement remains relatively under-developed, although new instruments, both objective and subjective, show considerable promise and warrant further testing. PMID:23045206

  16. Combining Usability Techniques to Design Geovisualization Tools for Epidemiology

    PubMed Central

    Robinson, Anthony C.; Chen, Jin; Lengerich, Eugene J.; Meyer, Hans G.; MacEachren, Alan M.

    2009-01-01

    Designing usable geovisualization tools is an emerging problem in GIScience software development. We are often satisfied that a new method provides an innovative window on our data, but functionality alone is insufficient assurance that a tool is applicable to a problem in situ. As extensions of the static methods they evolved from, geovisualization tools are bound to enable new knowledge creation. We have yet to learn how to adapt techniques from interaction designers and usability experts toward our tools in order to maximize this ability. This is especially challenging because there is limited existing guidance for the design of usable geovisualization tools. Their design requires knowledge about the context of work within which they will be used, and should involve user input at all stages, as is the practice in any human-centered design effort. Toward that goal, we have employed a wide range of techniques in the design of ESTAT, an exploratory geovisualization toolkit for epidemiology. These techniques include verbal protocol analysis, card-sorting, focus groups, and an in-depth case study. This paper reports the design process and evaluation results from our experience with the ESTAT toolkit. PMID:19960106

  17. Epidemiological study air disaster in Amsterdam (ESADA): study design

    PubMed Central

    Slottje, Pauline; Huizink, Anja C; Twisk, Jos WR; Witteveen, Anke B; van der Ploeg, Henk M; Bramsen, Inge; Smidt, Nynke; Bijlsma, Joost A; Bouter, Lex M; van Mechelen, Willem; Smid, Tjabe

    2005-01-01

    Background In 1992, a cargo aircraft crashed into apartment buildings in Amsterdam, killing 43 victims and destroying 266 apartments. In the aftermath there were speculations about the cause of the crash, potential exposures to hazardous materials due to the disaster and the health consequences. Starting in 2000, the Epidemiological Study Air Disaster in Amsterdam (ESADA) aimed to assess the long-term health effects of occupational exposure to this disaster on professional assistance workers. Methods/Design Epidemiological study among all the exposed professional fire-fighters and police officers who performed disaster-related task(s), and hangar workers who sorted the wreckage of the aircraft, as well as reference groups of their non-exposed colleagues who did not perform any disaster-related tasks. The study took place, on average, 8.5 years after the disaster. Questionnaires were used to assess details on occupational exposure to the disaster. Health measures comprised laboratory assessments in urine, blood and saliva, as well as self-reported current health measures, including health-related quality of life, and various physical and psychological symptoms. Discussion In this paper we describe and discuss the design of the ESADA. The ESADA will provide additional scientific knowledge on the long-term health effects of technological disasters on professional workers. PMID:15921536

  18. A design framework for exploratory geovisualization in epidemiology

    PubMed Central

    Robinson, Anthony C.

    2009-01-01

    This paper presents a design framework for geographic visualization based on iterative evaluations of a toolkit designed to support cancer epidemiology. The Exploratory Spatio-Temporal Analysis Toolkit (ESTAT), is intended to support visual exploration through multivariate health data. Its purpose is to provide epidemiologists with the ability to generate new hypotheses or further refine those they may already have. Through an iterative user-centered design process, ESTAT has been evaluated by epidemiologists at the National Cancer Institute (NCI). Results of these evaluations are discussed, and a design framework based on evaluation evidence is presented. The framework provides specific recommendations and considerations for the design and development of a geovisualization toolkit for epidemiology. Its basic structure provides a model for future design and evaluation efforts in information visualization. PMID:20390052

  19. Design and analysis of metabolomics studies in epidemiologic research: a primer on -omic technologies.

    PubMed

    Tzoulaki, Ioanna; Ebbels, Timothy M D; Valdes, Ana; Elliott, Paul; Ioannidis, John P A

    2014-07-15

    Metabolomics is the field of "-omics" research concerned with the comprehensive characterization of the small low-molecular-weight metabolites in biological samples. In epidemiology, it represents an emerging technology and an unprecedented opportunity to measure environmental and other exposures with improved precision and far less measurement error than with standard epidemiologic methods. Advances in the application of metabolomics in large-scale epidemiologic research are now being realized through a combination of improved sample preparation and handling, automated laboratory and processing methods, and reduction in costs. The number of epidemiologic studies that use metabolic profiling is still limited, but it is fast gaining popularity in this area. In the present article, we present a roadmap for metabolomic analyses in epidemiologic studies and discuss the various challenges these data pose to large-scale studies. We discuss the steps of data preprocessing, univariate and multivariate data analysis, correction for multiplicity of comparisons with correlated data, and finally the steps of cross-validation and external validation. As data from metabolomic studies accumulate in epidemiology, there is a need for large-scale replication and synthesis of findings, increased availability of raw data, and a focus on good study design, all of which will highlight the potential clinical impact of metabolomics in this field. PMID:24966222
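
    One step listed above, correction for multiplicity of comparisons, can be sketched with the Benjamini-Hochberg false discovery rate procedure, a common choice when testing many metabolite features (the authors may favour other corrections for correlated data); the p-values below are hypothetical.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Boolean mask of tests rejected at false discovery rate alpha."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, p.size + 1) / p.size
    below = p[order] <= thresholds       # step-up rule: p_(i) <= alpha * i / m
    reject = np.zeros(p.size, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()   # largest i satisfying the rule
        reject[order[:k + 1]] = True
    return reject

# Hypothetical p-values from univariate tests of eight metabolite features
mask = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205])
```

    With these eight p-values only the two smallest survive at an FDR of 5%, even though several others fall below the unadjusted 0.05 threshold.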

  20. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship.

    PubMed

    Morgan, Daniel J; Safdar, Nasia; Milstone, Aaron M; Anderson, Deverick J

    2016-06-01

    Research in Healthcare Epidemiology and Antimicrobial Stewardship (HE&AS) is rapidly expanding with the involvement of researchers from varied countries and backgrounds. Researchers must use scientific methods that will provide the strongest evidence to advance healthcare epidemiology, but there are limited resources for information on specific aspects of HE&AS research or easy ways to access examples of studies using specific methods with HE&AS. In response to this need, the SHEA Research Committee has developed a series of white papers on research methods in HE&AS. The objective of this series is to promote rigorous healthcare epidemiology research by summarizing critical components, practical considerations, and pitfalls of commonly used research methods. Infect Control Hosp Epidemiol 2016;37:627-628. PMID:27074955

  1. Epidemiology and Clinical Research Design, Part 2: Principles

    PubMed Central

    Manja, Veena; Lakshminrusimha, Satyan

    2015-01-01

    This is the third article covering core knowledge in scholarly activities for neonatal physicians. In this article, we discuss various principles of epidemiology and clinical research design. A basic knowledge of these principles is necessary for conducting clinical research and for proper interpretation of studies. This article reviews bias and confounding, causation, incidence and prevalence, decision analysis, cost-effectiveness, sensitivity analysis, and measurement. PMID:26236171

  2. Designing ROW Methods

    NASA Technical Reports Server (NTRS)

    Freed, Alan D.

    1996-01-01

    There are many aspects to consider when designing a Rosenbrock-Wanner-Wolfbrandt (ROW) method for the numerical integration of ordinary differential equations (ODE's) solving initial value problems (IVP's). The process can be simplified by constructing ROW methods around good Runge-Kutta (RK) methods. The formulation of a new, simple, embedded, third-order, ROW method demonstrates this design approach.
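
    A minimal sketch of the idea: the two-stage scheme below is the standard second-order ROS2 Rosenbrock method, shown on a linear test problem; it is not the embedded third-order method formulated in the report, and the problem and step size are chosen only for illustration.

```python
import numpy as np

# Two-stage Rosenbrock (ROS2) step for y' = f(y); each stage solves a linear
# system with the matrix (I - gamma*h*J), avoiding Newton iteration entirely.
GAMMA = 1.0 + 1.0 / np.sqrt(2.0)   # this choice makes the scheme L-stable

def ros2_step(f, jac, y, h):
    a = np.eye(y.size) - GAMMA * h * jac(y)
    k1 = np.linalg.solve(a, f(y))
    k2 = np.linalg.solve(a, f(y + h * k1) - 2.0 * k1)
    return y + 1.5 * h * k1 + 0.5 * h * k2

# Test problem: y' = -y, y(0) = 1, integrated to t = 1 (exact answer exp(-1)).
f = lambda y: -y
jac = lambda y: -np.eye(1)
y = np.array([1.0])
h, steps = 0.001, 1000
for _ in range(steps):
    y = ros2_step(f, jac, y, h)

error = abs(y[0] - np.exp(-1.0))
```

    Because the implicit part is linear, a single matrix factorization per step suffices, which is the practical appeal of ROW methods for stiff systems.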

  3. Study design in genetic epidemiology: theoretical and practical considerations.

    PubMed

    Whittemore, A S; Nelson, L M

    1999-01-01

    Recent advances in molecular genetics have created new opportunities and challenges for genetic epidemiologists. Here we review some of the issues that arise when designing a study involving the genetic epidemiology of chronic diseases of late onset, such as cancer. We discuss two considerations that influence the choice of design. The first consideration is the study's goals. We describe the goals of identifying new susceptibility genes for a disease, of estimating important characteristics of known genes, and of learning how to prevent the disease in the genetically susceptible. We indicate how these goals affect the choice of design and present some guidelines for choosing designs that effectively address them. The second consideration is the set of practical constraints to successfully conducting the research. These constraints include problems of potential selection bias, reduced response rates, problems particular to family registries, problems particular to the cultures of various ethnic groups, and ethical issues. We indicate how these constraints affect the choice of design and discuss ways to deal with them. PMID:10854488

  4. Strain Typing Methods and Molecular Epidemiology of Pneumocystis Pneumonia

    PubMed Central

    Roux, Patricia; Nevez, Gilles; Hauser, Philippe M.; Kovacs, Joseph A.; Unnasch, Thomas R.; Lundgren, Bettina

    2004-01-01

    Pneumocystis pneumonia (PCP) caused by the opportunistic fungal agent Pneumocystis jirovecii (formerly P. carinii) continues to cause illness and death in HIV-infected patients. In the absence of a culture system to isolate and maintain live organisms, efforts to type and characterize the organism have relied on polymerase chain reaction–based approaches. Studies using these methods have improved understanding of PCP epidemiology, shedding light on sources of infection, transmission patterns, and potential emergence of antimicrobial resistance. One concern, however, is the lack of guidance regarding the appropriateness of different methods and standardization of these methods, which would facilitate comparing results reported by different laboratories. PMID:15504257

  5. Genetic Epidemiology of Tuberculosis Susceptibility: Impact of Study Design

    PubMed Central

    Stein, Catherine M.

    2011-01-01

    Several candidate gene studies have provided evidence for a role of host genetics in susceptibility to tuberculosis (TB). However, the results of these studies have been very inconsistent, even within a study population. Here, we review the design of these studies from a genetic epidemiological perspective, illustrating important differences in phenotype definition in both cases and controls, consideration of latent M. tuberculosis infection versus active TB disease, population genetic factors such as population substructure and linkage disequilibrium, polymorphism selection, and potential global differences in M. tuberculosis strain. These considerable differences between studies should be accounted for when examining the current literature. Recommendations are made for future studies to further clarify the host genetics of TB. PMID:21283783

  6. [Curricular design of health postgraduate programs: the case of Masters in epidemiology].

    PubMed

    Bobadilla, J L; Lozano, R; Bobadilla, C

    1991-01-01

    This paper discusses the need to create specific programs for the training of researchers in epidemiology, a field that has traditionally been ignored by the graduate programs in public health. This is due, in part, to the emphasis that has been placed on the training of professionals in other areas of public health. The paper also includes the results of a consensus exercise developed during the curricular design of the Masters Program in Epidemiology of the School of Medicine of the National Autonomous University of Mexico. The technique used during the consensus exercise was the TKJ, which allows the presentation of ideas and possible solutions for a specific problem. This is probably the first published experience in the use of such a technique for the design of an academic curriculum. Taking as a base the general characteristics of the students, the substantive, disciplinary and methodological subjects were chosen. The results showed a need for a multidisciplinary approach based on modern methodologies of statistics and epidemiology. The usefulness of the results of the curricular design and the superiority of this method to reach consensus is also discussed. PMID:1948431

  7. Population- and individual-based approaches to the design and analysis of epidemiologic studies of sexually transmitted disease transmission.

    PubMed

    Shiboski, S; Padian, N S

    1996-10-01

    Epidemiologic studies of sexually transmitted disease (STD) transmission present a number of unique challenges in design and analysis. These arise both from the social nature of STD transmission and from inherent difficulties in collecting accurate and informative data on exposure and infection. Risk of acquiring an STD depends on both individual-level factors and the behavior and infectiousness of others. Consequently, study designs and analysis methods developed for studying chronic disease risk in individuals or groups may not apply directly. Simple models of STD transmission were used to investigate these issues, focusing on how the interplay between individual- and population-level factors influences design and interpretation of epidemiologic studies, with particular attention to interpretation of common measures of association and to common sources of bias in epidemiologic data. Existing methods for investigating risk factors can be modified such that these issues may be addressed directly. PMID:8843249

  8. Control system design method

    DOEpatents

    Wilson, David G.; Robinett, III, Rush D.

    2012-02-21

    A control system design method and concomitant control system comprising representing a physical apparatus to be controlled as a Hamiltonian system, determining elements of the Hamiltonian system representation which are power generators, power dissipators, and power storage devices, analyzing stability and performance of the Hamiltonian system based on the results of the determining step and determining necessary and sufficient conditions for stability of the Hamiltonian system, creating a stable control system based on the results of the analyzing step, and employing the resulting control system to control the physical apparatus.

  9. New Saliva DNA Collection Method Compared to Buccal Cell Collection Techniques for Epidemiological Studies

    PubMed Central

    ROGERS, NIKKI L.; COLE, SHELLEY A.; LAN, HAO-CHANG; CROSSA, ALDO; DEMERATH, ELLEN W.

    2009-01-01

    Epidemiological studies may require noninvasive methods for off-site DNA collection. We compared the DNA yield and quality obtained using a whole-saliva collection device (Oragene™ DNA collection kit) to those from three established noninvasive methods (cytobrush, foam swab, and oral rinse). Each method was tested on 17 adult volunteers from our center, using a random crossover collection design and analyzed using repeated-measures statistics. DNA yield and quality were assessed via gel electrophoresis, spectrophotometry, and polymerase chain reaction (PCR) amplification rate. The whole-saliva method provided a significantly greater DNA yield (mean ± SD = 154.9 ± 103.05 μg, median = 181.88) than the other methods (oral rinse = 54.74 ± 41.72 μg, 36.56; swab = 11.44 ± 7.39 μg, 10.72; cytobrush = 12.66 ± 6.19, 13.22 μg) (all pairwise P < 0.05). Oral-rinse and whole-saliva samples provided the best DNA quality, whereas cytobrush and swab samples provided poorer-quality DNA, as shown by lower OD260/OD280 and OD260/OD230 ratios. We conclude that both a 10-ml oral-rinse sample and a 2-ml whole-saliva sample provide sufficient DNA quantity and better-quality DNA for genetic epidemiological studies than do the commonly used buccal swab and brush techniques. PMID:17421001

  11. The Role of Applied Epidemiology Methods in the Disaster Management Cycle

    PubMed Central

    Malilay, Josephine; Heumann, Michael; Perrotta, Dennis; Wolkin, Amy F.; Schnall, Amy H.; Podgornik, Michelle N.; Cruz, Miguel A.; Horney, Jennifer A.; Zane, David; Roisman, Rachel; Greenspan, Joel R.; Thoroughman, Doug; Anderson, Henry A.; Wells, Eden V.; Simms, Erin F.

    2015-01-01

    Disaster epidemiology (i.e., applied epidemiology in disaster settings) presents a source of reliable and actionable information for decision-makers and stakeholders in the disaster management cycle. However, epidemiological methods have yet to be routinely integrated into disaster response and fully communicated to response leaders. We present a framework consisting of rapid needs assessments, health surveillance, tracking and registries, and epidemiological investigations, including risk factor and health outcome studies and evaluation of interventions, which can be practiced throughout the cycle. Applying each method can result in actionable information for planners and decision-makers responsible for preparedness, response, and recovery. Disaster epidemiology, once integrated into the disaster management cycle, can provide the evidence base to inform and enhance response capability within the public health infrastructure. PMID:25211748

  12. [Occupational epidemiology].

    PubMed

    Ahrens, W; Behrens, T; Mester, B; Schmeisser, N

    2008-03-01

    The aim of occupational epidemiology is to describe workplace-related diseases and to identify their underlying causes. Its primary goal is to protect workers from hazardous effects of the working process by applying work-related primary and secondary prevention measures. To assess health risks, different study designs and a wide array of complex study instruments and methods are frequently employed that cannot be replaced by toxicological investigations. This paper primarily addresses health risks from agent exposures. In this context, a central task of occupational epidemiology is careful assessment of exposure. Different data sources, such as work site measurements, register data, archive material, experts' opinions, and the workers' personal estimates of exposure may be used during this process. In addition, biological markers can complement exposure assessment. Since thorough occupational epidemiologic studies allow assessment of disease risks under realistic exposure conditions, their results should be used more frequently to derive workplace-related threshold limit values. PMID:18311483

  13. Development of the residential case-specular epidemiologic investigation method. Final report

    SciTech Connect

    Zaffanella, L.E.; Savitz, D.A.

    1995-11-01

    The residential case-specular method is an innovative approach to epidemiologic studies of the association between wire codes and childhood cancer. This project was designed to further the development of the residential case-specular method, which seeks to help resolve the "wire code paradox". For years, wire codes have been used as surrogate measures of past electric and magnetic field (EMF) exposure. The magnetic field hypothesis suggests that childhood cancer is associated with exposure to magnetic fields, with wire codes as a proxy for these fields. The neighborhood hypothesis suggests that childhood cancer is associated with neighborhood characteristics and exposures other than magnetic fields, with wire codes as a proxy for these characteristics and exposures. The residential case-specular method was designed to discriminate between the magnetic field hypothesis and the neighborhood hypothesis. Two methods were developed for determining the specular of a residence. These methods were tested with 400 randomly selected residences. The main advantage of the residential case-specular method is that it may efficiently confirm or eliminate the suspicion that control selection bias or confounding by neighborhood factors affected the results of case-control studies of childhood cancer and magnetic fields. The method may be applicable to both past and ongoing studies. The main disadvantage is that the method is untried. Consequently, further work is required to verify its validity and to ensure that sufficient statistical power can be obtained in a cost-effective manner.

  14. The FEM-2 design method

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.; Adams, L. M.; Mehrotra, P.; Vanrosendale, J.; Voigt, R. G.; Patrick, M.

    1983-01-01

    The FEM-2 parallel computer is designed using methods differing from those ordinarily employed in parallel computer design. The major distinguishing aspects are: (1) a top-down rather than bottom-up design process; (2) the design considers the entire system structure in terms of layers of virtual machines; and (3) each layer of virtual machine is defined formally during the design process. The result is a complete hardware/software system design. The basic design method is discussed and the advantages of the method are considered. A status report on the FEM-2 design is included.

  15. Complex mixtures and indoor air pollution: overview of epidemiologic methods.

    PubMed Central

    Weiss, N S

    1993-01-01

    The likelihood of an epidemiologic study correctly identifying an adverse health outcome associated with exposure to indoor air pollutants is increased if a) substantial variation exists in the frequency or level of exposure among study subjects otherwise at similar risk of the health outcome; b) the number of study subjects or study communities is large; c) the health outcome can be assessed with accuracy; d) relevant exposure levels can be measured with accuracy; e) an unbiased sample of exposed and nonexposed subjects is selected for study; and f) other determinants of the adverse health outcome can be measured. Nonetheless, given a strong enough impact of exposure to one pollutant or a mixture of pollutants on the risk of illness, it is possible for epidemiologic studies to discern a relation even if only some of the above circumstances are present. PMID:8206026

  16. A rapid method of grading cataract in epidemiological studies and eye surveys.

    PubMed Central

    Mehra, V; Minassian, D C

    1988-01-01

    A rapid method of grading clinically important central lens opacities has been developed for use in eye surveys and in epidemiological studies of cataract and has been field-tested in a specifically designed observer agreement study in a survey of a rural community in Central India. The grading method is based on simple measurement of the area of lens opacity that obscures the red reflex relative to the area of clear red reflex, as visualised through the undilated normal pupil. Good to almost perfect agreements were attained between two ophthalmologists and two trained ophthalmic assistants for overall grades of central lens opacity. Most disagreements were trivial in nature and were concerned with difficulties in distinguishing grade 0 from grade 1, and with hazy appearance of the red reflex in high myopes and in cases of early nuclear sclerosis. Teaching materials including video tape and slides for training survey teams and other workers are in preparation. PMID:3207653
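Inter-observer agreement of the kind reported above is commonly summarized with Cohen's kappa. The abstract does not state which agreement statistic was used, so the following is a generic sketch with hypothetical lens-opacity grades, not the paper's actual analysis:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical grades."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance from each rater's marginal frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical central lens-opacity grades (0-3) from two observers
obs1 = [0, 1, 2, 2, 3, 0, 1, 1, 2, 3]
obs2 = [0, 1, 2, 1, 3, 0, 1, 2, 2, 3]
kappa = cohens_kappa(obs1, obs2)
```

Values above roughly 0.6 are conventionally read as "good" agreement, which is the range the abstract describes.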

  17. Optimal designs for epidemiologic longitudinal studies with binary outcomes.

    PubMed

    Mehtälä, Juha; Auranen, Kari; Kulathinal, Sangita

    2015-12-01

    Alternating presence and absence of a medical condition in human subjects is often modelled as an outcome of underlying process dynamics. Longitudinal studies provide important insights into research questions involving such dynamics. This article concerns optimal designs for studies in which the dynamics are modelled as a binary continuous-time Markov process. Either one or both of the transition rate parameters in the model are to be estimated with maximum precision from a sequence of observations made at discrete times on a number of subjects. The design questions concern the choice of time interval between observations, the initial state of each subject, and the trade-off between the number of subjects and the number of repeated observations per subject. Sequential designs are considered because the optimal designs depend on the model parameters. The optimal time spacing can be approximated by the reciprocal of the sum of the two rates. The initial distribution of the study subjects should be taken into account when relatively few repeated samples per subject are to be collected. A reasonably large study should be designed in more than one phase, because enough observations can then be spent in the first phase to revise the time spacing for the subsequent phases. PMID:22170892
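The spacing heuristic quoted in the abstract (the reciprocal of the sum of the two transition rates) can be written down directly, together with the exact discrete-time transition probabilities of a two-state continuous-time Markov chain observed at that spacing. The rate values below are hypothetical:

```python
import math

def transition_matrix(lam01, lam10, dt):
    """Exact transition probabilities of a two-state continuous-time
    Markov chain (0 = condition absent, 1 = present) observed dt apart."""
    s = lam01 + lam10
    pi1 = lam01 / s                 # stationary probability of state 1
    decay = math.exp(-s * dt)
    p01 = pi1 * (1.0 - decay)
    p10 = (1.0 - pi1) * (1.0 - decay)
    return [[1.0 - p01, p01], [p10, 1.0 - p10]]

def suggested_spacing(lam01, lam10):
    """Heuristic from the abstract: spacing ~ 1 / (sum of the two rates)."""
    return 1.0 / (lam01 + lam10)

lam01, lam10 = 0.5, 1.5             # hypothetical onset / recovery rates per year
dt = suggested_spacing(lam01, lam10)        # 0.5 years between observations
P = transition_matrix(lam01, lam10, dt)
```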

  18. Epidemiology and Clinical Research Design, Part 1: Study Types

    PubMed Central

    Manja, Veena; Lakshminrusimha, Satyan

    2015-01-01

    Selecting the best available preventive and therapeutic measures to avoid disability and death is an important goal for all health care practitioners. To achieve this goal, we need to perform studies that determine the value of these measures. In this article, we discuss the possible study designs that can be used for evaluating new approaches to prevention and treatment. The gold standard study design is a randomized, controlled, double-blind trial. In many instances, a randomized controlled trial may not be ethically or practically feasible. Other study types, such as case series, case-control studies, cohort studies, cross-sectional studies, crossover designs, and open-label studies, may be required to hypothesize and evaluate the link between an exposure or predictor variable and an outcome variable. Various study types pertaining to neonatal-perinatal medicine are reviewed in this article. PMID:25848346

  19. Aircraft digital control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Variations in design methods for aircraft digital flight control are evaluated and compared. The methods fall into two categories; those where the design is done in the continuous domain (or s plane) and those where the design is done in the discrete domain (or z plane). Design method fidelity is evaluated by examining closed loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps except the uncompensated s plane design method which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
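The sample-rate sensitivity of continuous-domain (s-plane) design can be illustrated with a minimal sketch: discretize a first-order continuous pole with the bilinear (Tustin) map and compare the result against the pole-matched value exp(-aT). This is a generic illustration of the fidelity issue the abstract describes, not the paper's procedure; the pole location and sample rates are hypothetical:

```python
import math

def tustin_pole(a, T):
    """Discrete pole from the bilinear (Tustin) map s -> (2/T)(z-1)/(z+1)
    applied to a continuous first-order pole at s = -a."""
    return (1 - a * T / 2) / (1 + a * T / 2)

def matched_pole(a, T):
    """Pole-matched (exact) discretization: z = exp(-a*T)."""
    return math.exp(-a * T)

a = 2.0                                    # rad/s, hypothetical loop pole
# Pole-placement error at a fast and a slow sample rate (samples per second)
errors = {rate: abs(tustin_pole(a, 1.0 / rate) - matched_pole(a, 1.0 / rate))
          for rate in (50.0, 5.0)}
```

The error grows as the sample rate drops, consistent with the abstract's finding that digitized continuous designs degrade at low sample rates.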

  20. Concordance and discordance of sequence survey methods for molecular epidemiology

    PubMed Central

    Hasan, Nur A.; Cebula, Thomas A.; Colwell, Rita R.; Robison, Richard A.; Johnson, W. Evan; Crandall, Keith A.

    2015-01-01

    The post-genomic era is characterized by the direct acquisition and analysis of genomic data with many applications, including the enhancement of the understanding of microbial epidemiology and pathology. However, there are a number of molecular approaches to survey pathogen diversity, and the impact of these different approaches on parameter estimation and inference are not entirely clear. We sequenced whole genomes of bacterial pathogens, Burkholderia pseudomallei, Yersinia pestis, and Brucella spp. (60 new genomes), and combined them with 55 genomes from GenBank to address how different molecular survey approaches (whole genomes, SNPs, and MLST) impact downstream inferences on molecular evolutionary parameters, evolutionary relationships, and trait character associations. We selected isolates for sequencing to represent temporal, geographic origin, and host range variability. We found that substitution rate estimates vary widely among approaches, and that SNP and genomic datasets yielded different but strongly supported phylogenies. MLST yielded poorly supported phylogenies, especially in our low diversity dataset, i.e., Y. pestis. Trait associations showed that B. pseudomallei and Y. pestis phylogenies are significantly associated with geography, irrespective of the molecular survey approach used, while Brucella spp. phylogeny appears to be strongly associated with geography and host origin. We contrast inferences made among monomorphic (clonal) and non-monomorphic bacteria, and between intra- and inter-specific datasets. We also discuss our results in light of underlying assumptions of different approaches. PMID:25737810

  1. From Smallpox to Big Data: The Next 100 Years of Epidemiologic Methods.

    PubMed

    Gange, Stephen J; Golub, Elizabeth T

    2016-03-01

    For more than a century, epidemiology has seen major shifts in both focus and methodology. Taking into consideration the explosion of "big data," the advent of more sophisticated data collection and analytical tools, and the increased interest in evidence-based solutions, we present a framework that summarizes 3 fundamental domains of epidemiologic methods that are relevant for the understanding of both historical contributions and future directions in public health. First, the manner in which populations and their follow-up are defined is expanding, with greater interest in online populations whose definition does not fit the usual classification by person, place, and time. Second, traditional data collection methods, such as population-based surveillance and individual interviews, have been supplemented with advances in measurement. From biomarkers to mobile health, innovations in the measurement of exposures and diseases enable refined accuracy of data collection. Lastly, the comparison of populations is at the heart of epidemiologic methodology. Risk factor epidemiology, prediction methods, and causal inference strategies are areas in which the field is continuing to make significant contributions to public health. The framework presented herein articulates the multifaceted ways in which epidemiologic methods make such contributions and can continue to do so as we embark upon the next 100 years. PMID:26443419

  2. Typing methods used in the molecular epidemiology of microbial pathogens: a how-to guide.

    PubMed

    Ranjbar, Reza; Karami, Ali; Farshad, Shohreh; Giammanco, Giovanni M; Mammina, Caterina

    2014-01-01

    Microbial typing is often employed to determine the source and routes of infections, confirm or rule out outbreaks, trace cross-transmission of healthcare-associated pathogens, recognize virulent strains and evaluate the effectiveness of control measures. Conventional microbial typing methods have occasionally been useful in describing the epidemiology of infectious diseases. However, these methods are generally considered too variable, labour intensive and time-consuming to be of practical value in epidemiological investigations. Moreover, these approaches have proved to be insufficiently discriminatory and poorly reproducible. DNA-based typing methods rely on the analysis of the genetic material of a microorganism. In recent years, several methods have been introduced and developed for investigation of the molecular epidemiology of microbial pathogens. Each of them has advantages and limitations that make them useful in some studies and restrictive in others. The choice of a molecular typing method therefore will depend on the skill level and resources of the laboratory and the aim and scale of the investigation. This study reviews the most popular DNA-based molecular typing methods used in the epidemiology of bacterial pathogens together with their advantages and limitations. PMID:24531166

  3. Influence of DNA extraction methods on relative telomere length measurements and its impact on epidemiological studies

    PubMed Central

    Raschenberger, Julia; Lamina, Claudia; Haun, Margot; Kollerits, Barbara; Coassin, Stefan; Boes, Eva; Kedenko, Ludmilla; Köttgen, Anna; Kronenberg, Florian

    2016-01-01

    Measurement of telomere length is widely used in epidemiologic studies. Insufficient standardization of the measurement processes has, however, complicated the comparison of results between studies. We aimed to investigate whether DNA extraction methods have an influence on measured values of relative telomere length (RTL) and whether this has consequences for epidemiological studies. We performed four experiments with RTL measurement in quadruplicate by qPCR using DNA extracted with different methods: 1) a standardized validation experiment including three extraction methods (magnetic-particle-method EZ1, salting-out-method INV, phenol-chloroform-isoamyl-alcohol PCI) each in the same 20 samples demonstrated pronounced differences in RTL with lowest values with EZ1 followed by INV and PCI-isolated DNA; 2) a comparison of 307 samples from an epidemiological study showing EZ1-measurements 40% lower than INV-measurements; 3) a matching-approach of two similar non-diseased control groups including 143 pairs of subjects revealed significantly shorter RTL in EZ1 than INV-extracted DNA (0.844 ± 0.157 vs. 1.357 ± 0.242); 4) an association analysis of RTL with prevalent cardiovascular disease detected a stronger association with INV than with EZ1-extracted DNA. In summary, DNA extraction methods have a pronounced influence on the measured RTL-values. This might result in spurious or lost associations in epidemiological studies under certain circumstances. PMID:27138987
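The mechanism by which extraction-method differences can produce spurious associations is easy to demonstrate with simulated data: if one comparison group happened to be measured on EZ1-extracted DNA (values roughly 40% lower, per the study) and the other on INV-extracted DNA, a group difference appears even when the true distributions are identical. All numbers below are hypothetical:

```python
import random
import statistics

random.seed(42)

# Hypothetical true relative telomere lengths, identical distribution in
# both groups (no real case-control difference)
true_cases = [random.gauss(1.0, 0.2) for _ in range(200)]
true_controls = [random.gauss(1.0, 0.2) for _ in range(200)]

# Suppose cases were measured on EZ1-extracted DNA (values ~40% lower,
# per the abstract) and controls on INV-extracted DNA.
measured_cases = [x * 0.6 for x in true_cases]
measured_controls = list(true_controls)

# A large difference appears, driven entirely by the extraction method
diff = statistics.mean(measured_controls) - statistics.mean(measured_cases)
```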

  5. Study designs for biobank-based epidemiologic research on chronic diseases.

    PubMed

    Läärä, Esa

    2011-01-01

    A review is given on design options to be considered in epidemiologic studies on cancers or other chronic diseases in relation to risk factors, the measurement of which is based on stored specimens in large biobanks. The two major choices for valid and cost-efficient sampling of risk factor data from large biobank cohorts are provided by the nested case-control design and the case-cohort design. The main features of both designs are outlined and their relative merits are compared. Special issues such as matching, stratification, and statistical analysis are also briefly discussed. It is concluded that the nested case-control design is better suited for studies involving biomarkers that can be influenced by analytic batch, long-term storage, and freeze-thaw cycles. The case-cohort design is useful, especially when several outcomes are of interest, given that the measurements on stored materials remain sufficiently stable during the study. PMID:20949387
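The nested case-control design described above draws, for each case, a set of controls from the cohort members still at risk at the case's event time (incidence-density sampling). A minimal sketch with a hypothetical cohort of (subject_id, exit_time, is_case) tuples:

```python
import random

random.seed(1)

def nested_case_control(cohort, m=2):
    """Incidence-density sampling: for each case, draw m controls at
    random from the subjects still under follow-up at the case's event
    time. cohort is a list of (subject_id, exit_time, is_case) tuples."""
    sampled = []
    for sid, t, is_case in cohort:
        if not is_case:
            continue
        # Risk set: everyone else still at risk at time t (may include
        # subjects who become cases later -- that is intentional)
        risk_set = [other for other, t2, _ in cohort
                    if other != sid and t2 >= t]
        controls = random.sample(risk_set, min(m, len(risk_set)))
        sampled.append((sid, controls))
    return sampled

# Hypothetical cohort: 50 subjects, ~20% become cases during follow-up
cohort = [(i, random.uniform(0, 10), random.random() < 0.2) for i in range(50)]
matched = nested_case_control(cohort, m=2)
```

Only the sampled case-control sets would then need biomarker measurement, which is the cost advantage the review describes.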

  6. Aircraft digital control design methods

    NASA Technical Reports Server (NTRS)

    Tashker, M. G.; Powell, J. D.

    1975-01-01

    Investigations were conducted in two main areas. The first area was control system design, with the goals of defining the limits of 'digitized S-plane design techniques' versus sample rate, showing the results of a 'direct digital design technique', and comparing the two methods. The second area was to evaluate the roughness of autopilot designs parametrically versus sample rate. Goals of the first area were addressed by (1) an analysis of a 2nd-order example using both design methods, (2) a linear analysis of the complete 737 aircraft with an autoland obtained using the digitized S-plane technique, (3) linear analysis of a high-frequency 737 approximation with the autoland from a direct digital design technique, and (4) development of a simulation for evaluation of the autopilots with disturbances and nonlinearities included. Roughness evaluation was studied by defining an experiment to be carried out on the Langley motion simulator and coordinated with analysis at Stanford.

  7. Trends in Citations to Books on Epidemiological and Statistical Methods in the Biomedical Literature

    PubMed Central

    Porta, Miquel; Vandenbroucke, Jan P.; Ioannidis, John P. A.; Sanz, Sergio; Fernandez, Esteve; Bhopal, Raj; Morabia, Alfredo; Victora, Cesar; Lopez, Tomàs

    2013-01-01

    Background There are no analyses of citations to books on epidemiological and statistical methods in the biomedical literature. Such analyses may shed light on how concepts and methods changed while biomedical research evolved. Our aim was to analyze the number and time trends of citations received from biomedical articles by books on epidemiological and statistical methods, and related disciplines. Methods and Findings The data source was the Web of Science. The study books were published between 1957 and 2010. The first year of publication of the citing articles was 1945. We identified 125 books that received at least 25 citations. Books first published in 1980–1989 had the highest total and median number of citations per year. Nine of the 10 most cited texts focused on statistical methods. Hosmer & Lemeshow's Applied logistic regression received the highest number of citations and highest average annual rate. It was followed by books by Fleiss, Armitage, et al., Rothman, et al., and Kalbfleisch and Prentice. Fifth in citations per year was Sackett, et al., Evidence-based medicine. The rise of multivariate methods, clinical epidemiology, or nutritional epidemiology was reflected in the citation trends. Educational textbooks, practice-oriented books, books on epidemiological substantive knowledge, and on theory and health policies were much less cited. None of the 25 top-cited books had the theoretical or sociopolitical scope of works by Cochrane, McKeown, Rose, or Morris. Conclusions Books were mainly cited to reference methods. Books first published in the 1980s continue to be most influential. Older books on theory and policies were rooted in societal and general medical concerns, while the most modern books are almost purely on methods. PMID:23667447

  8. Overview of molecular typing methods for outbreak detection and epidemiological surveillance.

    PubMed

    Sabat, A J; Budimir, A; Nashev, D; Sá-Leão, R; van Dijl, J M; Laurent, F; Grundmann, H; Friedrich, A W

    2013-01-01

    Typing methods for discriminating different bacterial isolates of the same species are essential epidemiological tools in infection prevention and control. Traditional typing systems based on phenotypes, such as serotype, biotype, phage-type, or antibiogram, have been used for many years. However, more recent methods that examine the relatedness of isolates at a molecular level have revolutionised our ability to differentiate among bacterial types and subtypes. Importantly, the development of molecular methods has provided new tools for enhanced surveillance and outbreak detection. This has resulted in better implementation of rational infection control programmes and efficient allocation of resources across Europe. The emergence of benchtop sequencers using next generation sequencing technology makes bacterial whole genome sequencing (WGS) feasible even in small research and clinical laboratories. WGS has already been used for the characterisation of bacterial isolates in several large outbreaks in Europe and, in the near future, is likely to replace currently used typing methodologies due to its ultimate resolution. However, WGS is still too laborious and time-consuming to obtain useful data in routine surveillance. Also, a largely unresolved question is how genome sequences must be examined for epidemiological characterisation. In the coming years, the lessons learnt from currently used molecular methods will allow us to condense the WGS data into epidemiologically useful information. On this basis, we have reviewed current and new molecular typing methods for outbreak detection and epidemiological surveillance of bacterial pathogens in clinical practice, aiming to give an overview of their specific advantages and disadvantages. PMID:23369389

  9. Stochastic Methods for Aircraft Design

    NASA Technical Reports Server (NTRS)

    Pelz, Richard B.; Ogot, Madara

    1998-01-01

    The global stochastic optimization method, simulated annealing (SA), was adapted and applied to various problems in aircraft design. The research was aimed at overcoming the problem of finding an optimal design in a space with multiple minima and roughness ubiquitous to numerically generated nonlinear objective functions. SA was modified to reduce the number of objective function evaluations needed for an optimal design, historically the main criticism of stochastic methods. SA was applied to many CFD/MDO problems including: low sonic-boom bodies, minimum drag on supersonic fore-bodies, minimum drag on supersonic aeroelastic fore-bodies, minimum drag on HSCT aeroelastic wings, the FLOPS preliminary design code, another preliminary aircraft design study with vortex lattice aerodynamics, and HSR complete aircraft aerodynamics. In every case, SA provided a simple, robust and reliable optimization method which found optimal designs in on the order of 100 objective function evaluations. Perhaps most importantly, technology has been successfully transferred from this academic/industrial project; this method is the method of choice for optimization problems at Northrop Grumman.
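A minimal version of the simulated annealing scheme the abstract describes, budgeted at roughly 100 objective-function evaluations, can be sketched as follows. The objective function, neighborhood width, and cooling schedule below are all hypothetical choices for illustration, not those used in the project:

```python
import math
import random

random.seed(0)

def simulated_annealing(f, x0, lo, hi, n_iter=100, t0=1.0, cooling=0.95):
    """Minimal SA for a 1-D objective with multiple local minima.
    Worse moves are accepted with probability exp(-delta / temperature),
    which lets the search escape local minima early on."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    temp = t0
    for _ in range(n_iter):
        cand = min(hi, max(lo, x + random.gauss(0, 0.5)))  # random neighbor
        fc = f(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        temp *= cooling                                    # geometric cooling
    return best, fbest

# Multimodal test function: local minimum near x = 1, global minimum near x = -1
f = lambda x: (x * x - 1.0) ** 2 + 0.3 * x
best, fbest = simulated_annealing(f, x0=2.0, lo=-3.0, hi=3.0)
```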

  10. Imputation method for lifetime exposure assessment in air pollution epidemiologic studies

    PubMed Central

    2013-01-01

against health data should be done as a function of PDI to check for consistency of results. The 1% of study subjects who lived for long durations near heavily trafficked intersections had very high cumulative exposures. Thus, imputation methods must be designed to reproduce non-standard distributions. Conclusions Our approach meets a number of methodological challenges to extending historical exposure reconstruction over a lifetime and shows promise for environmental epidemiology. Application to assessment of breast cancer risks will be reported in a subsequent manuscript. PMID:23919666

  11. An overview of various typing methods for clinical epidemiology of the emerging pathogen Stenotrophomonas maltophilia.

    PubMed

    Gherardi, Giovanni; Creti, Roberta; Pompilio, Arianna; Di Bonaventura, Giovanni

    2015-03-01

    Typing of bacterial isolates has been used for decades to study local outbreaks as well as in national and international surveillances for monitoring newly emerging resistant clones. Despite being recognized as a nosocomial pathogen, the precise modes of transmission of Stenotrophomonas maltophilia in health care settings are unknown. Due to the high genetic diversity observed among S. maltophilia clinical isolates, the typing results might be better interpreted if also environmental strains were included. This could help to identify preventative measures to be designed and implemented for decreasing the possibility of outbreaks and nosocomial infections. In this review, we attempt to provide an overview on the most common typing methods used for clinical epidemiology of S. maltophilia strains, such as PCR-based fingerprinting analyses, pulsed-field gel electrophoresis, multilocus variable number tandem repeat analysis, and multilocus sequence type. Application of the proteomic-based mass spectrometry by matrix-assisted laser desorption ionization-time of flight is also described. Improvements of typing methods already in use have to be achieved to facilitate S. maltophilia infection control at any level. In the near future, when novel Web-based platforms for rapid data processing and analysis will be available, whole genome sequencing technologies will likely become a highly powerful tool for outbreak investigations and surveillance studies in routine clinical practices. PMID:25592000

  12. Outcome modelling strategies in epidemiology: traditional methods and basic alternatives.

    PubMed

    Greenland, Sander; Daniel, Rhian; Pearce, Neil

    2016-04-01

    Controlling for too many potential confounders can lead to or aggravate problems of data sparsity or multicollinearity, particularly when the number of covariates is large in relation to the study size. As a result, methods to reduce the number of modelled covariates are often deployed. We review several traditional modelling strategies, including stepwise regression and the 'change-in-estimate' (CIE) approach to deciding which potential confounders to include in an outcome-regression model for estimating effects of a targeted exposure. We discuss their shortcomings, and then provide some basic alternatives and refinements that do not require special macros or programming. Throughout, we assume the main goal is to derive the most accurate effect estimates obtainable from the data and commercial software. Allowing that most users must stay within standard software packages, this goal can be roughly approximated using basic methods to assess, and thereby minimize, mean squared error (MSE). PMID:27097747

  13. Outcome modelling strategies in epidemiology: traditional methods and basic alternatives

    PubMed Central

    Greenland, Sander; Daniel, Rhian; Pearce, Neil

    2016-01-01

    Controlling for too many potential confounders can lead to or aggravate problems of data sparsity or multicollinearity, particularly when the number of covariates is large in relation to the study size. As a result, methods to reduce the number of modelled covariates are often deployed. We review several traditional modelling strategies, including stepwise regression and the ‘change-in-estimate’ (CIE) approach to deciding which potential confounders to include in an outcome-regression model for estimating effects of a targeted exposure. We discuss their shortcomings, and then provide some basic alternatives and refinements that do not require special macros or programming. Throughout, we assume the main goal is to derive the most accurate effect estimates obtainable from the data and commercial software. Allowing that most users must stay within standard software packages, this goal can be roughly approximated using basic methods to assess, and thereby minimize, mean squared error (MSE). PMID:27097747
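The change-in-estimate (CIE) rule discussed above can be sketched on synthetic data; this hypothetical example uses Frisch-Waugh residualisation to obtain the adjusted exposure coefficient without a matrix solver, and retains the covariate when the coefficient moves by more than 10%:

```python
import random

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (len(a) - 1)

def slope(x, y):
    """Simple-regression slope of y on x (intercept handled via centring)."""
    return cov(x, y) / cov(x, x)

# synthetic data: z confounds the effect of exposure x on outcome y
rng = random.Random(0)
z = [rng.gauss(0, 1) for _ in range(500)]
x = [zi + rng.gauss(0, 1) for zi in z]
y = [1.5 * xi + 2.0 * zi + rng.gauss(0, 0.5) for xi, zi in zip(x, z)]

beta_crude = slope(x, y)                      # z omitted

# Frisch-Waugh: residualise x on z, then regress y on the residual
# to recover the z-adjusted coefficient of x
g = slope(z, x)
rx = [xi - g * zi for xi, zi in zip(x, z)]
beta_adj = slope(rx, y)

# change-in-estimate rule: retain z if the coefficient moves by > 10%
cie = abs(beta_crude - beta_adj) / abs(beta_adj)
keep_z = cie > 0.10
```

With this construction the crude slope is inflated towards 2.5 while the adjusted slope recovers roughly the true 1.5, so the CIE rule correctly flags z for inclusion.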

  14. Design method of supercavitating pumps

    NASA Astrophysics Data System (ADS)

    Kulagin, V.; Likhachev, D.; Li, F. C.

    2016-05-01

The problem of designing an effective supercavitating (SC) pump is solved, and the optimum load distribution along the radius of the blade is found taking into account clearance, degree of cavitation development, influence of the finite number of blades, and centrifugal forces. Sufficient accuracy can be obtained using the equivalent flat SC-grid for the design of any SC-mechanism, applying the “grid effect” coefficient and substituting the skewed flow calculated for grids of flat plates with infinite attached cavitation caverns. This article gives the universal design method and provides an example of SC-pump design.

  15. [Mendelian randomisation - a genetic approach to an epidemiological method].

    PubMed

    Stensrud, Mats Julius

    2016-06-01

BACKGROUND Genetic information is becoming more easily available, and rapid progress is being made in developing methods of illuminating issues of interest. Mendelian randomisation makes it possible to study causes of disease using observational data. The name refers to the random distribution of gene variants in meiosis. The methodology makes use of genes that influence a risk factor for a disease, without influencing the disease itself. In this review article I explain the principles behind Mendelian randomisation and present the areas of application for this methodology. MATERIAL AND METHOD Methodology articles describing Mendelian randomisation were reviewed. The articles were found through a search in PubMed with the combination «mendelian randomization» OR «mendelian randomisation», and a search in McMaster Plus with the combination «mendelian randomization». A total of 15 methodology articles were read in full text. Methodology articles were supplemented by clinical studies found in the PubMed search. RESULTS In contrast to traditional observational studies, Mendelian randomisation studies are not affected by two important sources of error: conventional confounding variables and reverse causation. Mendelian randomisation is therefore a promising tool for studying causality. Mendelian randomisation studies have already provided valuable knowledge on the risk factors for a wide range of diseases. It is nevertheless important to be aware of the limitations of the methodology. As a result of the rapid developments in genetics research, Mendelian randomisation will probably be widely used in future years. INTERPRETATION If Mendelian randomisation studies are conducted correctly, they may help to reveal both modifiable and non-modifiable causes of disease. PMID:27325033
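Two standard estimators used in Mendelian randomisation studies are the per-variant Wald ratio and the inverse-variance-weighted (IVW) combination across independent variants. A minimal sketch, using hypothetical summary statistics and assuming valid instruments:

```python
def wald_ratio(beta_gx, beta_gy):
    """Causal effect implied by one variant: SNP-outcome over SNP-exposure."""
    return beta_gy / beta_gx

def ivw_estimate(betas_gx, betas_gy, ses_gy):
    """Inverse-variance-weighted combination of Wald ratios across variants."""
    num = sum(bx * by / se ** 2 for bx, by, se in zip(betas_gx, betas_gy, ses_gy))
    den = sum(bx ** 2 / se ** 2 for bx, se in zip(betas_gx, ses_gy))
    return num / den

# hypothetical summary statistics where every variant implies an effect of 0.5
bx = [0.10, 0.20, 0.15]   # SNP-exposure associations
by = [0.05, 0.10, 0.075]  # SNP-outcome associations
se = [0.01, 0.02, 0.015]  # standard errors of the SNP-outcome associations
causal = ivw_estimate(bx, by, se)
```

When the Wald ratios disagree across variants, that heterogeneity is itself a diagnostic for violations of the instrument assumptions the abstract warns about.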

  16. Violent crime in San Antonio, Texas: an application of spatial epidemiological methods.

    PubMed

    Sparks, Corey S

    2011-12-01

    Violent crimes are rarely considered a public health problem or investigated using epidemiological methods. But patterns of violent crime and other health conditions are often affected by similar characteristics of the built environment. In this paper, methods and perspectives from spatial epidemiology are used in an analysis of violent crimes in San Antonio, TX. Bayesian statistical methods are used to examine the contextual influence of several aspects of the built environment. Additionally, spatial regression models using Bayesian model specifications are used to examine spatial patterns of violent crime risk. Results indicate that the determinants of violent crime depend on the model specification, but are primarily related to the built environment and neighborhood socioeconomic conditions. Results are discussed within the context of a rapidly growing urban area with a diverse population. PMID:22748228
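A standard building block of such spatial analyses is a global clustering statistic such as Moran's I; a minimal sketch with a hypothetical corridor of six areas and rook adjacency (this is an illustration of the statistic, not the paper's Bayesian models):

```python
def morans_i(values, weights):
    """Global Moran's I; `weights` is an n x n symmetric adjacency matrix."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    s0 = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j] for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / s0) * (num / den)

# six areas along a corridor; violent-crime indicator clustered at one end
counts = [1, 1, 1, 0, 0, 0]
W = [[1 if abs(i - j) == 1 else 0 for j in range(6)] for i in range(6)]
i_stat = morans_i(counts, W)   # positive value indicates spatial clustering
```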

  17. Discriminatory Indices of Typing Methods for Epidemiologic Analysis of Contemporary Staphylococcus aureus Strains

    PubMed Central

    Rodriguez, Marcela; Hogan, Patrick G.; Satola, Sarah W.; Crispell, Emily; Wylie, Todd; Gao, Hongyu; Sodergren, Erica; Weinstock, George M.; Burnham, Carey-Ann D.; Fritz, Stephanie A.

    2015-01-01

Historically, a number of typing methods have been evaluated for Staphylococcus aureus strain characterization. The emergence of contemporary strains of community-associated S. aureus, and the ensuing epidemic with a predominant strain type (USA300), necessitates re-evaluation of the discriminatory power of these typing methods for discerning molecular epidemiology and transmission dynamics, essential to investigations of hospital and community outbreaks. We compared the discriminatory index of 5 typing methods for contemporary S. aureus strain characterization. Children presenting to St. Louis Children's Hospital and community pediatric practices in St. Louis, Missouri (MO), with community-associated S. aureus infections were enrolled. Repetitive sequence-based PCR (repPCR), pulsed-field gel electrophoresis (PFGE), multilocus sequence typing (MLST), staphylococcal protein A (spa), and staphylococcal cassette chromosome (SCC) mec typing were performed on 200 S. aureus isolates. The discriminatory index of each method was calculated using the standard formula for this metric, where a value of 1 is highly discriminatory and a value of 0 is not discriminatory. Overall, we identified 26 distinct strain types by repPCR, 17 strain types by PFGE, 30 strain types by MLST, 68 strain types by spa typing, and 5 strain types by SCCmec typing. RepPCR had the highest discriminatory index (D) of all methods (D = 0.88), followed by spa typing (D = 0.87), MLST (D = 0.84), PFGE (D = 0.76), and SCCmec typing (D = 0.60). The method with the highest D among MRSA isolates was repPCR (D = 0.64) followed by spa typing (D = 0.45) and MLST (D = 0.44). The method with the highest D among MSSA isolates was spa typing (D = 0.98), followed by MLST (D = 0.93), repPCR (D = 0.92), and PFGE (D = 0.89). Among isolates designated USA300 by PFGE, repPCR was most discriminatory, with 10 distinct strain types identified (D = 0.63). We
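The "standard formula" referred to above is Simpson's index of diversity as applied to typing data (the Hunter-Gaston discriminatory index); a minimal sketch:

```python
def discriminatory_index(counts):
    """Hunter-Gaston discriminatory index (Simpson's index of diversity).

    `counts` lists how many isolates fall into each distinct strain type.
    Returns 1.0 when every isolate is a distinct type and 0.0 when all
    isolates share a single type.
    """
    n = sum(counts)
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))
```

For example, four isolates split evenly into two types give D = 1 - 4/12 = 2/3, i.e. two randomly chosen isolates have a two-thirds chance of being typed differently.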

  18. Rationale and Design of the International Lymphoma Epidemiology Consortium (InterLymph) Non-Hodgkin Lymphoma Subtypes Project

    PubMed Central

    Morton, Lindsay M.; Sampson, Joshua N.; Cerhan, James R.; Turner, Jennifer J.; Vajdic, Claire M.; Wang, Sophia S.; Smedby, Karin E.; de Sanjosé, Silvia; Monnereau, Alain; Benavente, Yolanda; Bracci, Paige M.; Chiu, Brian C. H.; Skibola, Christine F.; Zhang, Yawei; Mbulaiteye, Sam M.; Spriggs, Michael; Robinson, Dennis; Norman, Aaron D.; Kane, Eleanor V.; Spinelli, John J.; Kelly, Jennifer L.; Vecchia, Carlo La; Dal Maso, Luigino; Maynadié, Marc; Kadin, Marshall E.; Cocco, Pierluigi; Costantini, Adele Seniori; Clarke, Christina A.; Roman, Eve; Miligi, Lucia; Colt, Joanne S.; Berndt, Sonja I.; Mannetje, Andrea; de Roos, Anneclaire J.; Kricker, Anne; Nieters, Alexandra; Franceschi, Silvia; Melbye, Mads; Boffetta, Paolo; Clavel, Jacqueline; Linet, Martha S.; Weisenburger, Dennis D.; Slager, Susan L.

    2014-01-01

    Background Non-Hodgkin lymphoma (NHL), the most common hematologic malignancy, consists of numerous subtypes. The etiology of NHL is incompletely understood, and increasing evidence suggests that risk factors may vary by NHL subtype. However, small numbers of cases have made investigation of subtype-specific risks challenging. The International Lymphoma Epidemiology Consortium therefore undertook the NHL Subtypes Project, an international collaborative effort to investigate the etiologies of NHL subtypes. This article describes in detail the project rationale and design. Methods We pooled individual-level data from 20 case-control studies (17471 NHL cases, 23096 controls) from North America, Europe, and Australia. Centralized data harmonization and analysis ensured standardized definitions and approaches, with rigorous quality control. Results The pooled study population included 11 specified NHL subtypes with more than 100 cases: diffuse large B-cell lymphoma (N = 4667), follicular lymphoma (N = 3530), chronic lymphocytic leukemia/small lymphocytic lymphoma (N = 2440), marginal zone lymphoma (N = 1052), peripheral T-cell lymphoma (N = 584), mantle cell lymphoma (N = 557), lymphoplasmacytic lymphoma/Waldenström macroglobulinemia (N = 374), mycosis fungoides/Sézary syndrome (N = 324), Burkitt/Burkitt-like lymphoma/leukemia (N = 295), hairy cell leukemia (N = 154), and acute lymphoblastic leukemia/lymphoma (N = 152). Associations with medical history, family history, lifestyle factors, and occupation for each of these 11 subtypes are presented in separate articles in this issue, with a final article quantitatively comparing risk factor patterns among subtypes. Conclusions The International Lymphoma Epidemiology Consortium NHL Subtypes Project provides the largest and most comprehensive investigation of potential risk factors for a broad range of common and rare NHL subtypes to date. The analyses contribute to our understanding of the multifactorial nature of NHL

  19. Endodontic Epidemiology

    PubMed Central

    Shahravan, Arash; Haghdoost, Ali Akbar

    2014-01-01

Epidemiology is the study of disease distribution and factors determining or affecting it. Likewise, endodontic epidemiology can be defined as the science of studying the distribution pattern and determinants of pulp and periapical diseases, especially apical periodontitis. Although different study designs have been used in endodontics, researchers must pay more attention to study designs with a higher level of evidence, such as randomized clinical trials. PMID:24688577

  20. Empirical Evidence of Study Design Biases in Randomized Trials: Systematic Review of Meta-Epidemiological Studies

    PubMed Central

    Page, Matthew J.; Higgins, Julian P. T.; Clayton, Gemma; Sterne, Jonathan A. C.; Hróbjartsson, Asbjørn; Savović, Jelena

    2016-01-01

    Objective To synthesise evidence on the average bias and heterogeneity associated with reported methodological features of randomized trials. Design Systematic review of meta-epidemiological studies. Methods We retrieved eligible studies included in a recent AHRQ-EPC review on this topic (latest search September 2012), and searched Ovid MEDLINE and Ovid EMBASE for studies indexed from Jan 2012-May 2015. Data were extracted by one author and verified by another. We combined estimates of average bias (e.g. ratio of odds ratios (ROR) or difference in standardised mean differences (dSMD)) in meta-analyses using the random-effects model. Analyses were stratified by type of outcome (“mortality” versus “other objective” versus “subjective”). Direction of effect was standardised so that ROR < 1 and dSMD < 0 denotes a larger intervention effect estimate in trials with an inadequate or unclear (versus adequate) characteristic. Results We included 24 studies. The available evidence suggests that intervention effect estimates may be exaggerated in trials with inadequate/unclear (versus adequate) sequence generation (ROR 0.93, 95% CI 0.86 to 0.99; 7 studies) and allocation concealment (ROR 0.90, 95% CI 0.84 to 0.97; 7 studies). For these characteristics, the average bias appeared to be larger in trials of subjective outcomes compared with other objective outcomes. Also, intervention effects for subjective outcomes appear to be exaggerated in trials with lack of/unclear blinding of participants (versus blinding) (dSMD -0.37, 95% CI -0.77 to 0.04; 2 studies), lack of/unclear blinding of outcome assessors (ROR 0.64, 95% CI 0.43 to 0.96; 1 study) and lack of/unclear double blinding (ROR 0.77, 95% CI 0.61 to 0.93; 1 study). The influence of other characteristics (e.g. unblinded trial personnel, attrition) is unclear. Conclusions Certain characteristics of randomized trials may exaggerate intervention effect estimates. The average bias appears to be greatest in trials of
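Pooling log-scale RORs under the random-effects model the abstract describes can be sketched with the DerSimonian-Laird estimator of between-study variance; the study values below are hypothetical:

```python
import math

def dersimonian_laird(effects, ses):
    """Random-effects pooled estimate of log-scale effects (DL tau-squared)."""
    w = [1.0 / s ** 2 for s in ses]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    wr = [1.0 / (s ** 2 + tau2) for s in ses]       # random-effects weights
    return sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)

# three hypothetical meta-epidemiological studies, each reporting ROR = 0.90
log_rors = [math.log(0.90)] * 3
pooled_ror = math.exp(dersimonian_laird(log_rors, [0.05, 0.08, 0.06]))
```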

  1. Methods for measuring utilization of mental health services in two epidemiologic studies

    PubMed Central

    NOVINS, DOUGLAS K.; BEALS, JANETTE; CROY, CALVIN; MANSON, SPERO M.

    2015-01-01

    Objectives of Study Psychiatric epidemiologic studies often include two or more sets of questions regarding service utilization, but the agreement across these different questions and the factors associated with their endorsement have not been examined. The objectives of this study were to describe the agreement of different sets of mental health service utilization questions that were included in the American Indian Service Utilization Psychiatric Epidemiology Risk and Protective Factors Project (AI-SUPERPFP), and compare the results to similar questions included in the baseline National Comorbidity Survey (NCS). Methods Responses to service utilization questions by 2878 AI-SUPERPFP and 5877 NCS participants were examined by calculating estimates of service use and agreement (κ) across the different sets of questions. Logistic regression models were developed to identify factors associated with endorsement of specific sets of questions. Results In both studies, estimates of mental health service utilization varied across the different sets of questions. Agreement across the different question sets was marginal to good (κ = 0.27–0.69). Characteristics of identified service users varied across the question sets. Limitations Neither survey included data to examine the validity of participant responses to service utilization questions. Recommendations for Further Research Question wording and placement appear to impact estimates of service utilization in psychiatric epidemiologic studies. Given the importance of these estimates for policy-making, further research into the validity of survey responses as well as impacts of question wording and context on rates of service utilization is warranted. PMID:18767205
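The agreement statistic reported above (kappa) is Cohen's kappa, computed from a cross-tabulation of responses to the two question sets; the 2x2 table below is hypothetical:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (rows: set A, columns: set B)."""
    n = sum(sum(row) for row in table)
    p_obs = sum(table[i][i] for i in range(len(table))) / n
    p_exp = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (p_obs - p_exp) / (1 - p_exp)

# hypothetical cross-tabulation of "used mental health services" yes/no
# according to two different sets of survey questions
table = [[40, 10],
         [5, 45]]
kappa = cohens_kappa(table)
```

Kappa corrects raw percent agreement for the agreement expected by chance alone, which is why it is preferred over simple concordance when comparing question sets.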

  2. A Review for Detecting Gene-Gene Interactions Using Machine Learning Methods in Genetic Epidemiology

    PubMed Central

    Koo, Ching Lee; Liew, Mei Jing; Mohamad, Mohd Saberi

    2013-01-01

Recently, the greatest statistical computational challenge in genetic epidemiology has been to identify and characterize the genes that interact with other genes and environmental factors to influence complex multifactorial diseases. These gene-gene interactions are also denoted as epistasis, a phenomenon that cannot be resolved by traditional statistical methods due to the high dimensionality of the data and the occurrence of multiple polymorphisms. Hence, several machine learning methods have been applied to identify such susceptibility genes in common multifactorial diseases, namely neural networks (NNs), support vector machines (SVMs), and random forests (RFs). This paper gives an overview of these machine learning methods, describing the methodology of each and its application in detecting gene-gene and gene-environment interactions. Lastly, this paper discusses each machine learning method and presents its strengths and weaknesses in detecting gene-gene interactions in complex human disease. PMID:24228248

  3. A review for detecting gene-gene interactions using machine learning methods in genetic epidemiology.

    PubMed

    Koo, Ching Lee; Liew, Mei Jing; Mohamad, Mohd Saberi; Salleh, Abdul Hakim Mohamed

    2013-01-01

Recently, the greatest statistical computational challenge in genetic epidemiology has been to identify and characterize the genes that interact with other genes and environmental factors to influence complex multifactorial diseases. These gene-gene interactions are also denoted as epistasis, a phenomenon that cannot be resolved by traditional statistical methods due to the high dimensionality of the data and the occurrence of multiple polymorphisms. Hence, several machine learning methods have been applied to identify such susceptibility genes in common multifactorial diseases, namely neural networks (NNs), support vector machines (SVMs), and random forests (RFs). This paper gives an overview of these machine learning methods, describing the methodology of each and its application in detecting gene-gene and gene-environment interactions. Lastly, this paper discusses each machine learning method and presents its strengths and weaknesses in detecting gene-gene interactions in complex human disease. PMID:24228248
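The kind of pure epistasis that defeats single-locus (marginal) tests can be illustrated with a deterministic XOR penetrance model; the toy cohort below is hypothetical and sidesteps any particular machine learning method:

```python
from itertools import product

# deterministic toy penetrance: disease iff exactly one of two loci carries
# the risk allele (XOR) -- pure epistasis with no marginal single-locus effect
cohort = [(a, b, a ^ b) for a, b in product([0, 1], repeat=2) for _ in range(25)]

def disease_rate(rows):
    return sum(case for _, _, case in rows) / len(rows)

# marginal (single-locus) analysis at locus A sees no signal at all
rate_a0 = disease_rate([r for r in cohort if r[0] == 0])
rate_a1 = disease_rate([r for r in cohort if r[0] == 1])

# the joint (two-locus) analysis exposes the interaction immediately
rate_00 = disease_rate([r for r in cohort if r[:2] == (0, 0)])
rate_01 = disease_rate([r for r in cohort if r[:2] == (0, 1)])
```

Because both marginal rates equal 0.5 while the joint rates are 0 and 1, any method that scores loci one at a time misses this pattern, which is exactly the motivation for interaction-aware learners such as RFs and SVMs.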

  4. The Role of DNA Methylation in Cardiovascular Risk and Disease: Methodological Aspects, Study Design, and Data Analysis for Epidemiological Studies.

    PubMed

    Zhong, Jia; Agha, Golareh; Baccarelli, Andrea A

    2016-01-01

    Epidemiological studies have demonstrated that genetic, environmental, behavioral, and clinical factors contribute to cardiovascular disease development. How these risk factors interact at the cellular level to cause cardiovascular disease is not well known. Epigenetic epidemiology enables researchers to explore critical links between genomic coding, modifiable exposures, and manifestation of disease phenotype. One epigenetic link, DNA methylation, is potentially an important mechanism underlying these associations. In the past decade, there has been a significant increase in the number of epidemiological studies investigating cardiovascular risk factors and outcomes in relation to DNA methylation, but many gaps remain in our understanding of the underlying cause and biological implications. In this review, we provide a brief overview of the biology and mechanisms of DNA methylation and its role in cardiovascular disease. In addition, we summarize the current evidence base in epigenetic epidemiology studies relevant to cardiovascular health and disease and discuss the limitations, challenges, and future directions of the field. Finally, we provide guidelines for well-designed epigenetic epidemiology studies, with particular focus on methodological aspects, study design, and analytical challenges. PMID:26837743

  5. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation.

    PubMed

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

    Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), urea breath test (UBT), and stool antigen test (SAT) have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT became the best methods to determine active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicated that no single test can be considered as the gold standard for the diagnosis of H. pylori infection and that one should consider the method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present a sensitivity of 74.2-90.8% and 83.3-86.9% and a specificity of 97.7-98.8% and 95.1-97.2%, respectively, when using IHC as a gold standard. The sensitivity of serology is quite high, but that of the urine test was lower compared with that of the other methods. Thus, indirect test validation is important although some commercial kits propose universal cut-off values. PMID:26904678

  6. Diagnostic Methods of Helicobacter pylori Infection for Epidemiological Studies: Critical Importance of Indirect Test Validation

    PubMed Central

    Miftahussurur, Muhammad; Yamaoka, Yoshio

    2016-01-01

    Among the methods developed to detect H. pylori infection, determining the gold standard remains debatable, especially for epidemiological studies. Due to the decreasing sensitivity of direct diagnostic tests (histopathology and/or immunohistochemistry [IHC], rapid urease test [RUT], and culture), several indirect tests, including antibody-based tests (serology and urine test), urea breath test (UBT), and stool antigen test (SAT) have been developed to diagnose H. pylori infection. Among the indirect tests, UBT and SAT became the best methods to determine active infection. While antibody-based tests, especially serology, are widely available and relatively sensitive, their specificity is low. Guidelines indicated that no single test can be considered as the gold standard for the diagnosis of H. pylori infection and that one should consider the method's advantages and disadvantages. Based on four epidemiological studies, culture and RUT present a sensitivity of 74.2–90.8% and 83.3–86.9% and a specificity of 97.7–98.8% and 95.1–97.2%, respectively, when using IHC as a gold standard. The sensitivity of serology is quite high, but that of the urine test was lower compared with that of the other methods. Thus, indirect test validation is important although some commercial kits propose universal cut-off values. PMID:26904678
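Validating an indirect test against a gold standard such as IHC reduces to computing sensitivity and specificity from paired results; the result vectors below are hypothetical:

```python
def validate_against_gold(index_results, gold_results):
    """Sensitivity and specificity of an index test versus a gold standard.

    Both arguments are parallel sequences of 0/1 results per subject.
    """
    pairs = list(zip(index_results, gold_results))
    tp = sum(1 for i, g in pairs if i and g)
    tn = sum(1 for i, g in pairs if not i and not g)
    fn = sum(1 for i, g in pairs if not i and g)
    fp = sum(1 for i, g in pairs if i and not g)
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical serology results versus an IHC gold standard
gold  = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
index = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]
sens, spec = validate_against_gold(index, gold)
```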

  7. DISPLACEMENT BASED SEISMIC DESIGN METHODS.

    SciTech Connect

Hofmayer, C.; Miller, C.; Wang, Y.; Costello, J.

    2003-07-15

A research effort was undertaken to determine the need for any changes to USNRC's seismic regulatory practice to reflect the move, in the earthquake engineering community, toward using expected displacement rather than force (or stress) as the basis for assessing design adequacy. The research explored the extent to which displacement based seismic design methods, such as given in FEMA 273, could be useful for reviewing nuclear power stations. Two structures common to nuclear power plants were chosen to compare the results of the analysis models used. The first structure is a four-story frame structure with shear walls providing the primary lateral load system, referred to herein as the shear wall model. The second structure is the turbine building of the Diablo Canyon nuclear power plant. The models were analyzed using both displacement based (pushover) analysis and nonlinear dynamic analysis. In addition, for the shear wall model an elastic analysis with ductility factors applied was also performed. The objectives of the work were to compare the results between the analyses, and to develop insights regarding the work that would be needed before the displacement based analysis methodology could be considered applicable to facilities licensed by the NRC. A summary of the research results, which were published in NUREG/CR-6719 in July 2001, is presented in this paper.

  8. Realist explanatory theory building method for social epidemiology: a protocol for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A

    2014-01-01

A recent criticism of social epidemiological studies, and multi-level studies in particular, has been a paucity of theory. We will present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach which is trans-disciplinary, encompassing both quantitative and qualitative traditions, and that assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising: 1) an emergent phase, 2) a construction phase, and 3) a confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design is described. The Emergent Phase uses: interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to: categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: 1) defining stratified levels; 2) analytic resolution; 3) abductive reasoning; 4) comparative analysis (triangulation); 5) retroduction; 6) postulate and proposition development; 7) comparison and assessment of theories; and 8) conceptual frameworks and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology. The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for Theory

  9. A practical method for use in epidemiological studies on enamel hypomineralisation.

    PubMed

    Ghanim, A; Elfrink, M; Weerheijm, K; Mariño, R; Manton, D

    2015-06-01

    With the development of the European Academy of Paediatric Dentistry (EAPD) judgment criteria, there has been increasing interest worldwide in investigation of the prevalence of demarcated opacities in tooth enamel substance, known as molar-incisor hypomineralisation (MIH). However, the lack of a standardised system for the purpose of recording MIH data in epidemiological surveys has contributed greatly to the wide variations in the reported prevalence between studies. The present publication describes the rationale, development, and content of a scoring method for MIH diagnosis in epidemiological studies as well as clinic- and hospital-based studies. The proposed grading method allows separate classification of demarcated hypomineralisation lesions and other enamel defects identical to MIH. It yields an informative description of the severity of MIH-affected teeth in terms of the stage of visible enamel destruction and the area of tooth surface affected (i.e. lesion clinical status and extent, respectively). In order to preserve the maximum amount of information from a clinical examination consistent with the need to permit direct comparisons between prevalence studies, two forms of the charting are proposed, a short form for simple screening surveys and a long form desirable for prospective, longitudinal observational research where aetiological factors in demarcated lesions are to be investigated in tandem with lesions distribution. Validation of the grading method is required, and its reliability and usefulness need to be tested in different age groups and different populations. PMID:25916282

  10. Design of diffractive optical surfaces within the SMS design method

    NASA Astrophysics Data System (ADS)

    Mendes-Lopes, João.; Benítez, Pablo; Miñano, Juan C.

    2015-08-01

The Simultaneous Multiple Surface (SMS) method was initially developed as a design method in Nonimaging Optics and later, the method was extended for designing Imaging Optics. We present the extension of the SMS method to design diffractive optical surfaces. This method involves the simultaneous calculation of N/2 diffractive surfaces, using the phase-shift properties of diffractive surfaces as an extra degree of freedom, such that N one-parameter wavefronts can be perfectly coupled. Moreover, the SMS method for diffractive surfaces is a direct method, i.e., it is not based on multi-parametric optimization techniques. Representative diffractive systems designed by the SMS method are presented.

  11. Measuring socio-economic position for epidemiological studies in low- and middle-income countries: a methods of measurement in epidemiology paper

    PubMed Central

    Howe, Laura D; Galobardes, Bruna; Matijasevich, Alicia; Gordon, David; Johnston, Deborah; Onwujekwe, Obinna; Patel, Rita; Webb, Elizabeth A; Lawlor, Debbie A; Hargreaves, James R

    2012-01-01

    Much has been written about the measurement of socio-economic position (SEP) in high-income countries (HIC). Less has been written for an epidemiology, health systems and public health audience about the measurement of SEP in low- and middle-income countries (LMIC). The social stratification processes in many LMIC—and therefore the appropriate measurement tools—differ considerably from those in HIC. Many measures of SEP have been utilized in epidemiological studies; the aspects of SEP captured by these measures and the pathways through which they may affect health are likely to be slightly different but overlapping. No single measure of SEP will be ideal for all studies and contexts; the strengths and limitations of a given indicator are likely to vary according to the specific research question. Understanding the general properties of different indicators, however, is essential for all those involved in the design or interpretation of epidemiological studies. In this article, we describe the measures of SEP used in LMIC. We concentrate on measures of individual or household-level SEP rather than area-based or ecological measures such as gross domestic product. We describe each indicator in terms of its theoretical basis, interpretation, measurement, strengths and limitations. We also provide brief comparisons between LMIC and HIC for each measure. PMID:22438428

  12. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship: Randomized Controlled Trials.

    PubMed

    Anderson, Deverick J; Juthani-Mehta, Manisha; Morgan, Daniel J

    2016-06-01

    Randomized controlled trials (RCTs) produce the strongest level of clinical evidence when comparing interventions. RCTs are technically difficult and costly to conduct, and they require specific design considerations, including the choice between patient- and cluster-level randomization and the selection of outcomes. In this methods paper, we focus on key considerations for RCT methods in healthcare epidemiology and antimicrobial stewardship (HE&AS) research, including the need for cluster randomization, conduct at multiple sites, behavior-modification interventions, and the difficulty of identifying appropriate outcomes. We review key RCTs in HE&AS with a focus on the advantages and disadvantages of the methods used. A checklist is provided to aid in the development of RCTs in HE&AS. Infect Control Hosp Epidemiol 2016;37:629-634. PMID:27108848
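
    As a toy sketch of the cluster-level randomization discussed above: in HE&AS trials the unit of randomization is often a ward or hospital rather than the individual patient. The ward names and the even split below are hypothetical, not from any cited trial.

```python
import random

random.seed(42)  # reproducible assignment

# Hypothetical clusters (wards) to be randomized as whole units.
units = [f"ward_{i:02d}" for i in range(1, 11)]

# Shuffle the clusters, then assign the first half to the intervention.
shuffled = random.sample(units, k=len(units))
arms = {u: ("intervention" if i < 5 else "control")
        for i, u in enumerate(shuffled)}

print(sum(a == "intervention" for a in arms.values()), "intervention wards")
```

    Every patient in a given ward receives that ward's assigned condition, which is what distinguishes cluster from patient-level randomization.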

  13. [Eco-epidemiology: towards epidemiology of complexity].

    PubMed

    Bizouarn, Philippe

    2016-05-01

    To solve the public health problems posed by an epidemiology of risk factors that is centered on the individual and neglects the causal processes linking risk factors with health outcomes, Mervyn Susser proposed a multilevel epidemiology called eco-epidemiology, which addresses the interdependence of individuals and their connection with the molecular, individual, societal and environmental levels of organization participating in causal disease processes. The aim of this epidemiology is to integrate more than one level of organization in the design, analysis and interpretation of health problems. After presenting the main criticisms of risk-factor epidemiology focused on the individual, we will try to show how eco-epidemiology and its development could help to understand the need for a broader and more integrative epidemiology, in which studies designed to identify risk factors would be balanced by studies designed to answer other questions equally vital to public health. PMID:27225924

  14. Methodologic frontiers in environmental epidemiology.

    PubMed Central

    Rothman, K J

    1993-01-01

    Environmental epidemiology comprises the epidemiologic study of those environmental factors that are outside the immediate control of the individual. Exposures of interest to environmental epidemiologists include air pollution, water pollution, and occupational exposure to physical and chemical agents, as well as psychosocial elements of environmental concern. The main methodologic problem in environmental epidemiology is exposure assessment, a problem that extends through all of epidemiologic research but looms as a towering obstacle in environmental epidemiology. One of the most promising developments for improving exposure assessment in environmental epidemiology is finding exposure biomarkers, which could serve as built-in dosimeters reflecting the biologic footprint left behind by environmental exposures. Beyond exposure assessment, epidemiologists studying environmental exposures face the difficulty of studying small effects that may be distorted by confounding that eludes easy control. This challenge may prompt reliance on new study designs, such as two-stage designs in which exposure and disease information are collected in the first stage, and covariate information is collected on a subset of subjects in stage two. While the analytic methods already available for environmental epidemiology are powerful, analytic methods for ecologic studies need further development. This workshop outlines the range of methodologic issues that environmental epidemiologists must address so that their work meets the goals set by scientists and society at large. PMID:8206029

  15. An airfoil design method for viscous flows

    NASA Technical Reports Server (NTRS)

    Malone, J. B.; Narramore, J. C.; Sankar, L. N.

    1990-01-01

    An airfoil design procedure is described that has been incorporated into an existing two-dimensional Navier-Stokes airfoil analysis method. The resulting design method, an iterative procedure based on a residual-correction algorithm, permits the automated design of airfoil sections with prescribed surface pressure distributions. This paper describes the inverse design method and the technique used to specify target pressure distributions. An example airfoil design problem is described to demonstrate application of the inverse design procedure. It shows that this inverse design method develops useful airfoil configurations with a reasonable expenditure of computer resources.
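
    The residual-correction idea can be sketched with a toy model. The linear curvature-to-pressure relation below is an illustrative stand-in for the actual Navier-Stokes analysis, not the authors' method: the loop alternates an "analysis" step with a correction proportional to the pressure mismatch until the target distribution is matched.

```python
import numpy as np

def toy_analysis(curvature):
    # Hypothetical stand-in for the flow solver: assume the pressure
    # coefficient responds linearly to local surface curvature.
    return -curvature

def inverse_design(cp_target, omega=0.5, tol=1e-8, max_iter=200):
    """Residual-correction loop: nudge surface curvature until the
    computed pressure distribution matches the target."""
    k = np.zeros_like(cp_target)          # initial geometry (flat)
    cp = toy_analysis(k)
    for _ in range(max_iter):
        cp = toy_analysis(k)              # "analysis" step
        residual = cp - cp_target         # pressure mismatch
        if np.max(np.abs(residual)) < tol:
            break
        k += omega * residual             # "design" (correction) step
    return k, cp

x = np.linspace(0.0, 1.0, 50)
cp_target = -0.5 * np.sin(np.pi * x)      # made-up target distribution
k, cp = inverse_design(cp_target)
print(np.allclose(cp, cp_target, atol=1e-6))  # True
```

    In the real method the analysis step is the two-dimensional Navier-Stokes solver, so each outer iteration is expensive; the point of the residual-correction form is that design and analysis converge in the same loop.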

  16. Review of freeform TIR collimator design methods

    NASA Astrophysics Data System (ADS)

    Talpur, Taimoor; Herkommer, Alois

    2016-04-01

    Total internal reflection (TIR) collimators are essential illumination components providing high efficiency and uniformity in a compact geometry. Various illumination design methods have been developed for designing such collimators, including tailoring methods, design via optimization, the mapping and feedback method, and the simultaneous multiple surface (SMS) method. This paper provides an overview of the different methods and compares the performance of the methods along with their advantages and their limitations.

  17. [Application of molecular methods in the diagnosis and epidemiological study of viral respiratory infections].

    PubMed

    Pozo, Francisco; Casas, Inmaculada; Ruiz, Guillermo; Falcón, Ana; Pérez-Breña, Pilar

    2008-07-01

    To date, more than two hundred viruses, belonging to six different taxonomic families, have been associated with human respiratory tract infection. The widespread incorporation of molecular methods into clinical microbiology laboratories has not only led to notable advances in the etiological diagnosis of viral respiratory infections but has also increased insight into the pathology and epidemiological profiles of the causative viruses. Because of their high sensitivity, molecular techniques markedly increase the efficiency of viral detection in respiratory specimens, particularly for viruses that fail to propagate successfully in common cell cultures, thus allowing more rapid etiologic diagnosis. However, there are also some disadvantages in the use of these new technologies, such as detection of viruses that merely colonize the respiratory tract of healthy people, or those found in the nasopharyngeal secretions of patients who have recovered from respiratory infections, due to long-term viral shedding, when the viruses are unlikely to act as pathogens. Additionally, sequencing of the amplification products allows further characterization of detected viruses, including molecular epidemiology, genotyping, or detection of antiviral resistance, to cite only a few examples. PMID:19195443

  18. Comparison of Methods to Account for Implausible Reporting of Energy Intake in Epidemiologic Studies

    PubMed Central

    Rhee, Jinnie J.; Sampson, Laura; Cho, Eunyoung; Hughes, Michael D.; Hu, Frank B.; Willett, Walter C.

    2015-01-01

    In a recent article in the American Journal of Epidemiology by Mendez et al. (Am J Epidemiol. 2011;173(4):448–458), the use of alternative approaches to the exclusion of implausible energy intakes led to significantly different cross-sectional associations between diet and body mass index (BMI), whereas the use of a simpler recommended criterion (<500 and >3,500 kcal/day) yielded no meaningful change. However, these findings might have been due to exclusions based on weight, a primary determinant of BMI. Using data from 52,110 women in the Nurses' Health Study (1990), we reproduced the cross-sectional findings of Mendez et al. and compared the results from the recommended method with those from 2 weight-dependent alternative methods (the Goldberg method and the predicted total energy expenditure method). The same 3 exclusion criteria were then used to examine dietary variables prospectively in relation to change in BMI, which is not a direct function of attained weight. We found similar associations using the 3 methods. In a separate cross-sectional analysis using biomarkers of dietary factors, we found similar correlations for intakes of fatty acids (n = 439) and carotenoids and retinol (n = 1,293) using the 3 methods for exclusions. These results do not support the general conclusion that use of exclusion criteria based on the alternative methods might confer an advantage over the recommended exclusion method. PMID:25656533
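
    A minimal sketch of the two kinds of exclusion rule being compared: the fixed cutoffs are the recommended criterion quoted in the abstract, while the Goldberg-style rule is weight-dependent because basal metabolic rate (BMR) depends on body weight. The ratio bounds used here are illustrative assumptions, not the paper's exact values.

```python
def keep_fixed(ei_kcal, low=500, high=3500):
    """Recommended criterion: exclude implausible absolute intakes."""
    return low <= ei_kcal <= high

def keep_goldberg(ei_kcal, bmr_kcal, lo=1.35, hi=2.4):
    """Goldberg-style criterion (weight-dependent, since BMR is computed
    from body weight): exclude reports whose energy intake is implausible
    relative to BMR. Cutoffs here are illustrative, not authoritative."""
    return lo <= ei_kcal / bmr_kcal <= hi

# A very low report fails the fixed rule; a typical one passes.
print(keep_fixed(420), keep_fixed(2100))
# A typical report passes the ratio rule; an extreme one fails it.
print(keep_goldberg(2100, bmr_kcal=1400), keep_goldberg(4800, bmr_kcal=1400))
```

    The paper's point is that because the second rule depends on weight, applying it can itself induce associations with weight-derived outcomes such as BMI.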

  19. Computational methods for stealth design

    SciTech Connect

    Cable, V.P.

    1992-08-01

    A review is presented of the utilization of computer models for stealth design toward the ultimate goal of designing and fielding an aircraft that remains undetected at any altitude and any range. Attention is given to the advancements achieved in computational tools and their utilization. Consideration is given to the development of supercomputers for large-scale scientific computing and the development of high-fidelity, 3D, radar-signature-prediction tools for complex shapes with nonmetallic and radar-penetrable materials.

  20. Spacesuit Radiation Shield Design Methods

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Anderson, Brooke M.; Cucinotta, Francis A.; Ware, J.; Zeitlin, Cary J.

    2006-01-01

    Meeting radiation protection requirements during EVA is predominantly an operational issue with some potential considerations for temporary shelter. The issue of spacesuit shielding is mainly guided by the potential of accidental exposure when operational and temporary shelter considerations fail to maintain exposures within operational limits. In this case, very high exposure levels are possible which could result in observable health effects and even be life threatening. Under these assumptions, potential spacesuit radiation exposures have been studied using known historical solar particle events to gain insight on the usefulness of modification of spacesuit design in which the control of skin exposure is a critical design issue and reduction of blood forming organ exposure is desirable. Transition to a new spacesuit design including soft upper-torso and reconfigured life support hardware gives an opportunity to optimize the next generation spacesuit for reduced potential health effects during an accidental exposure.

  1. Computational Methods in Nanostructure Design

    NASA Astrophysics Data System (ADS)

    Bellesia, Giovanni; Lampoudi, Sotiria; Shea, Joan-Emma

    Self-assembling peptides can serve as building blocks for novel biomaterials. Replica exchange molecular dynamics simulations are a powerful means to probe the conformational space of these peptides. We discuss the theoretical foundations of this enhanced sampling method and its use in biomolecular simulations. We then apply this method to determine the monomeric conformations of the Alzheimer amyloid-β(12-28) peptide that can serve as initiation sites for aggregation.
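
    The exchange step of replica exchange molecular dynamics can be sketched via the standard Metropolis swap criterion between neighbouring temperature replicas (units with k_B = 1; the energies and inverse temperatures below are made up):

```python
import math

def swap_probability(E_i, E_j, beta_i, beta_j):
    """Standard replica-exchange Metropolis criterion: probability of
    swapping configurations between replicas at inverse temperatures
    beta_i and beta_j (k_B = 1)."""
    return min(1.0, math.exp((beta_i - beta_j) * (E_i - E_j)))

# Handing the hot replica's low-energy configuration to the cold
# replica is always accepted...
print(swap_probability(E_i=5.0, E_j=2.0, beta_i=1.0, beta_j=0.5))
# ...while pushing a higher-energy configuration onto the cold
# replica is accepted only with Boltzmann-weighted probability.
print(round(swap_probability(E_i=2.0, E_j=5.0, beta_i=1.0, beta_j=0.5), 4))
```

    Periodic swaps of this kind let configurations trapped in local minima at low temperature escape via the high-temperature replicas, which is what makes the method an enhanced sampling technique.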

  2. Review of pathogenesis and diagnostic methods of immediate relevance for epidemiology and control of Salmonella Dublin in cattle.

    PubMed

    Nielsen, Liza Rosenbaum

    2013-02-22

    Salmonella enterica subsp. enterica serovar Dublin (S. Dublin) receives increasing attention in cattle production. It is host-adapted to cattle, and leads to unacceptable levels of morbidity, mortality and production losses in both newly and persistently infected herds. Cattle health promoting institutions in several countries are currently constructing active surveillance programmes or voluntary certification programmes, and encourage control and eradication of S. Dublin infected cattle herds. There is a need to understand the underlying pathogenesis of the infection at both animal and herd level to design successful programmes. Furthermore, knowledge about and access to diagnostic tests for use in practice, including information about test accuracy and interpretation of available diagnostic test methods, are requested. The aim is to synthesise the abundant literature on elements of pathogenesis and diagnosis of immediate relevance for epidemiology and control of S. Dublin at animal and herd level. Relatively few in vivo studies on S. Dublin pathogenesis in cattle included more than a few animals, and they often showed varying results. This makes it difficult to draw conclusions about mechanisms that affect dissemination in cattle and that might be targets for control methods directed towards improving resistance against the bacteria, e.g. new vaccines. It is recommended to perform larger studies to elucidate dose-response relationships and age-related and genetic effects on immunity. Furthermore, it is recommended to attempt to develop faster and more sensitive methods for detection of S. Dublin for diagnosis of infectious animals. PMID:22925272

  3. The genetic study of three population microisolates in South Tyrol (MICROS): study design and epidemiological perspectives

    PubMed Central

    Pattaro, Cristian; Marroni, Fabio; Riegler, Alice; Mascalzoni, Deborah; Pichler, Irene; Volpato, Claudia B; Dal Cero, Umberta; De Grandi, Alessandro; Egger, Clemens; Eisendle, Agatha; Fuchsberger, Christian; Gögele, Martin; Pedrotti, Sara; Pinggera, Gerd K; Stefanov, Stefan A; Vogl, Florian D; Wiedermann, Christian J; Meitinger, Thomas; Pramstaller, Peter P

    2007-01-01

    Background There is increasing evidence of the important role that small, isolated populations could play in finding genes involved in the etiology of diseases. For historical and political reasons, South Tyrol, the northernmost Italian region, includes several villages of small dimensions which remained isolated over the centuries. Methods The MICROS study is a population-based survey on three small, isolated villages, characterized by: old settlement; small number of founders; high endogamy rates; slow/null population expansion. During stage-1 (2002/03), genealogical data, screening questionnaires, clinical measurements, blood and urine samples, and DNA were collected for 1175 adult volunteers. Stage-2, concerning trait diagnoses, linkage analysis and association studies, is ongoing. The selection of the traits is being driven by expert clinicians. Preliminary, descriptive statistics were obtained. Power simulations for finding linkage on a quantitative trait locus (QTL) were undertaken. Results Starting from the participants, genealogies were reconstructed for 50,037 subjects, going back to the early 1600s. Within the last five generations, subjects were clustered in one pedigree of 7049 subjects plus 178 smaller pedigrees (3 to 85 subjects each). A significant probability of familial clustering was assessed for many traits, especially among the cardiovascular, neurological and respiratory traits. Simulations showed that the MICROS pedigree has a substantial power to detect a LOD score ≥ 3 when the QTL specific heritability is ≥ 20%. Conclusion The MICROS study is an extensive, ongoing, two-stage survey aimed at characterizing the genetic epidemiology of Mendelian and complex diseases. Our approach, involving different scientific disciplines, is an advantageous strategy to define and to study population isolates. The isolation of the Alpine populations, together with the extensive data collected so far, make the MICROS study a powerful resource for the study

  4. Regression calibration method for correcting measurement-error bias in nutritional epidemiology.

    PubMed

    Spiegelman, D; McDermott, A; Rosner, B

    1997-04-01

    Regression calibration is a statistical method for adjusting point and interval estimates of effect obtained from regression models commonly used in epidemiology for bias due to measurement error in assessing nutrients or other variables. Previous work developed regression calibration for use in estimating odds ratios from logistic regression. We extend this here to estimating incidence rate ratios from Cox proportional hazards models and regression slopes from linear-regression models. Regression calibration is appropriate when a gold standard is available in a validation study and a linear measurement error with constant variance applies or when replicate measurements are available in a reliability study and linear random within-person error can be assumed. In this paper, the method is illustrated by correction of rate ratios describing the relations between the incidence of breast cancer and dietary intakes of vitamin A, alcohol, and total energy in the Nurses' Health Study. An example using linear regression is based on estimation of the relation between ultradistal radius bone density and dietary intakes of caffeine, calcium, and total energy in the Massachusetts Women's Health Study. Software implementing these methods uses SAS macros. PMID:9094918
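
    The attenuation-correction logic can be sketched for the simple linear-regression case: a noisy exposure measurement shrinks the naive slope by the calibration factor, and regressing the gold standard on the noisy measure in a validation subset recovers that factor. All parameters in this simulation are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(0.0, 1.0, n)             # true exposure (gold standard)
z = x + rng.normal(0.0, 1.0, n)         # error-prone measurement
y = 0.5 * x + rng.normal(0.0, 0.2, n)   # outcome; true slope is 0.5

# Naive regression of y on z attenuates the slope toward zero.
naive = np.polyfit(z, y, 1)[0]

# Calibration: in a validation subset where x is known, regress x on z
# to get the calibration slope lambda, then divide the naive slope by it.
val = slice(0, 2000)
lam = np.polyfit(z[val], x[val], 1)[0]
corrected = naive / lam

print(round(naive, 2), round(corrected, 2))  # naive ~0.25, corrected ~0.5
```

    With equal true-exposure and error variances the attenuation factor is about one half, so the naive slope lands near 0.25 and the calibrated estimate returns close to the true 0.5; the paper extends the same idea to logistic, Cox, and linear models.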

  5. Design for validation, based on formal methods

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1990-01-01

    Validation of ultra-reliable systems decomposes into two subproblems: (1) quantification of the probability of system failure due to physical failure; (2) establishing that design errors are not present. Methods of design, testing, and analysis of ultra-reliable software are discussed. It is concluded that design-for-validation based on formal methods is needed for the digital flight control systems problem, and that formal methods will play a major role in the development of future high-reliability digital systems.

  6. Radrue method for reconstruction of external photon doses for Chernobyl liquidators in epidemiological studies.

    PubMed

    Kryuchkov, Victor; Chumak, Vadim; Maceika, Evaldas; Anspaugh, Lynn R; Cardis, Elisabeth; Bakhanova, Elena; Golovanov, Ivan; Drozdovitch, Vladimir; Luckyanov, Nickolas; Kesminiene, Ausrele; Voillequé, Paul; Bouville, André

    2009-10-01

    Between 1986 and 1990, several hundred thousand workers, called "liquidators" or "clean-up workers," took part in decontamination and recovery activities within the 30-km zone around the Chernobyl nuclear power plant in Ukraine, where a major accident occurred in April 1986. The Chernobyl liquidators were mainly exposed to external ionizing radiation levels that depended primarily on their work locations and the time after the accident when the work was performed. Because individual doses were often monitored inadequately or were not monitored at all for the majority of liquidators, a new method of photon (i.e., gamma and x rays) dose assessment, called "RADRUE" (Realistic Analytical Dose Reconstruction with Uncertainty Estimation), was developed to obtain unbiased and reasonably accurate estimates for use in three epidemiologic studies of hematological malignancies and thyroid cancer among liquidators. The RADRUE program implements a time-and-motion dose-reconstruction method that is flexible and conceptually easy to understand. It includes a large exposure rate database and interpolation and extrapolation techniques to calculate exposure rates at places where liquidators lived and worked within approximately 70 km of the destroyed reactor. The RADRUE technique relies on data collected from subjects' interviews conducted by trained interviewers, and on expert dosimetrists to interpret the information and provide supplementary information, when necessary, based upon their own Chernobyl experience. The RADRUE technique was used to estimate doses from external irradiation, as well as uncertainties, to the bone marrow for 929 subjects and to the thyroid gland for 530 subjects enrolled in epidemiologic studies. Individual bone marrow dose estimates were found to range from less than one μGy to 3,300 mGy, with an arithmetic mean of 71 mGy. Individual thyroid dose estimates were lower and ranged from 20 μGy to 507 mGy, with an arithmetic mean of 29 mGy. The

  7. RADRUE METHOD FOR RECONSTRUCTION OF EXTERNAL PHOTON DOSES TO CHERNOBYL LIQUIDATORS IN EPIDEMIOLOGICAL STUDIES

    PubMed Central

    Kryuchkov, Victor; Chumak, Vadim; Maceika, Evaldas; Anspaugh, Lynn R.; Cardis, Elisabeth; Bakhanova, Elena; Golovanov, Ivan; Drozdovitch, Vladimir; Luckyanov, Nickolas; Kesminiene, Ausrele; Voillequé, Paul; Bouville, André

    2010-01-01

    Between 1986 and 1990, several hundred thousand workers, called “liquidators” or “clean-up workers”, took part in decontamination and recovery activities within the 30-km zone around the Chernobyl nuclear power plant in Ukraine, where a major accident occurred in April 1986. The Chernobyl liquidators were mainly exposed to external ionizing radiation levels that depended primarily on their work locations and the time after the accident when the work was performed. Because individual doses were often monitored inadequately or were not monitored at all for the majority of liquidators, a new method of photon (i.e. gamma and x-rays) dose assessment, called “RADRUE” (Realistic Analytical Dose Reconstruction with Uncertainty Estimation) was developed to obtain unbiased and reasonably accurate estimates for use in three epidemiologic studies of hematological malignancies and thyroid cancer among liquidators. The RADRUE program implements a time-and-motion dose reconstruction method that is flexible and conceptually easy to understand. It includes a large exposure rate database and interpolation and extrapolation techniques to calculate exposure rates at places where liquidators lived and worked within ~70 km of the destroyed reactor. The RADRUE technique relies on data collected from subjects’ interviews conducted by trained interviewers, and on expert dosimetrists to interpret the information and provide supplementary information, when necessary, based upon their own Chernobyl experience. The RADRUE technique was used to estimate doses from external irradiation, as well as uncertainties, to the bone-marrow for 929 subjects and to the thyroid gland for 530 subjects enrolled in epidemiologic studies. Individual bone-marrow dose estimates were found to range from less than one μGy to 3,300 mGy, with an arithmetic mean of 71 mGy. Individual thyroid dose estimates were lower and ranged from 20 μGy to 507 mGy, with an arithmetic mean of 29 mGy. The

  8. HTGR analytical methods and design verification

    SciTech Connect

    Neylan, A.J.; Northup, T.E.

    1982-05-01

    Analytical methods for the high-temperature gas-cooled reactor (HTGR) include development, update, verification, documentation, and maintenance of all computer codes for HTGR design and analysis. This paper presents selected nuclear, structural mechanics, seismic, and systems analytical methods related to the HTGR core. This paper also reviews design verification tests in the reactor core, reactor internals, steam generator, and thermal barrier.

  9. Genetic diversity of Bacillus anthracis in Europe: genotyping methods in forensic and epidemiologic investigations.

    PubMed

    Derzelle, Sylviane; Thierry, Simon

    2013-09-01

    Bacillus anthracis, the etiological agent of anthrax, a zoonosis relatively common throughout the world, can be used as an agent of bioterrorism. In naturally occurring outbreaks and in criminal release of this pathogen, a fast and accurate diagnosis is crucial to an effective response. Microbiological forensics and epidemiologic investigations increasingly rely on molecular markers, such as polymorphisms in DNA sequence, to obtain reliable information regarding the identification or source of a suspicious strain. Over the past decade, significant research efforts have been undertaken to develop genotyping methods with increased power to differentiate B. anthracis strains. A growing number of DNA signatures have been identified and used to survey B. anthracis diversity in nature, leading to rapid advances in our understanding of the global population of this pathogen. This article provides an overview of the different phylogenetic subgroups distributed across the world, with a particular focus on Europe. Updated information on the anthrax situation in Europe is reported. A brief description of some of the work in progress in the work package 5.1 of the AniBioThreat project is also presented, including (1) the development of a robust typing tool based on a suspension array technology and multiplexed single nucleotide polymorphisms scoring and (2) the typing of a collection of DNA from European isolates exchanged between the partners of the project. The know-how acquired will contribute to improving the EU's ability to react rapidly when the identity and real origin of a strain need to be established. PMID:23971802

  10. Wastewater-Based Epidemiology of Stimulant Drugs: Functional Data Analysis Compared to Traditional Statistical Methods

    PubMed Central

    Salvatore, Stefania; Bramness, Jørgen Gustav; Reid, Malcolm J.; Thomas, Kevin Victor; Harman, Christopher; Røislien, Jo

    2015-01-01

    Background Wastewater-based epidemiology (WBE) is a new methodology for estimating the drug load in a population. Simple summary statistics and specification tests have typically been used to analyze WBE data, comparing differences between weekday and weekend loads. Such standard statistical methods may, however, overlook important nuanced information in the data. In this study, we apply functional data analysis (FDA) to WBE data and compare the results to those obtained from more traditional summary measures. Methods We analysed temporal WBE data from 42 European cities, using sewage samples collected daily for one week in March 2013. For each city, the main temporal features of two selected drugs were extracted using functional principal component (FPC) analysis, along with simpler measures such as the area under the curve (AUC). The individual cities' scores on each of the temporal FPCs were then used as outcome variables in multiple linear regression analysis with various city and country characteristics as predictors. The results were compared to those of functional analysis of variance (FANOVA). Results The three first FPCs explained more than 99% of the temporal variation. The first component (FPC1) represented the level of the drug load, while the second and third temporal components represented the level and the timing of a weekend peak. AUC was highly correlated with FPC1, but other temporal characteristics were not captured by the simple summary measures. FANOVA was less flexible than the FPCA-based regression, though it showed concordant results. Geographical location was the main predictor for the general level of the drug load. Conclusion FDA of WBE data extracts more detailed information about drug load patterns during the week which are not identified by more traditional statistical methods. Results also suggest that regression based on FPC results is a valuable addition to FANOVA for estimating associations between temporal patterns and covariates.
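
    On a dense, equally spaced daily grid, the FPC step reduces to ordinary PCA of the centred city-by-day matrix. This synthetic sketch (all loads made up) mimics the level-plus-weekend-peak structure the study describes, extracting FPC scores that could then enter a downstream regression:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cities, n_days = 42, 7

# Synthetic daily drug loads: a city-specific overall level plus a
# city-specific weekend bump and a little measurement noise.
level = rng.normal(10.0, 2.0, n_cities)
bump = rng.normal(0.0, 1.0, n_cities)
weekend = np.array([0, 0, 0, 0, 0, 1, 1], dtype=float)
curves = (level[:, None]
          + bump[:, None] * weekend[None, :]
          + rng.normal(0.0, 0.1, (n_cities, n_days)))

# Functional PCA on a common daily grid = PCA of the centred matrix.
centred = curves - curves.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)      # variance share of each FPC
scores = centred @ vt.T              # each city's score on each FPC

# FPC1 captures the overall level, so its scores track the latent city
# levels and can serve as outcomes in a multiple regression.
print(round(explained[0], 2),
      round(abs(np.corrcoef(scores[:, 0], level)[0, 1]), 2))
```

    Real FPC analysis adds basis smoothing of the daily curves before the decomposition; with only seven grid points the plain-PCA shortcut is enough to show the idea.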

  11. Applications of a transonic wing design method

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.; Smith, Leigh A.

    1989-01-01

    A method for designing wings and airfoils at transonic speeds using a predictor/corrector approach was developed. The procedure iterates between an aerodynamic code, which predicts the flow about a given geometry, and the design module, which compares the calculated and target pressure distributions and modifies the geometry using an algorithm that relates differences in pressure to a change in surface curvature. The modular nature of the design method makes it relatively simple to couple it to any analysis method. The iterative approach allows the design process and aerodynamic analysis to converge in parallel, significantly reducing the time required to reach a final design. Viscous and static aeroelastic effects can also be accounted for during the design or as a post-design correction. Results from several pilot design codes indicated that the method accurately reproduced pressure distributions as well as the coordinates of a given airfoil or wing by modifying an initial contour. The codes were applied to supercritical as well as conventional airfoils, forward- and aft-swept transport wings, and moderate-to-highly swept fighter wings. The design method was found to be robust and efficient, even for cases having fairly strong shocks.

  12. Impeller blade design method for centrifugal compressors

    NASA Technical Reports Server (NTRS)

    Jansen, W.; Kirschner, A. M.

    1974-01-01

    The design of a centrifugal impeller with blades that are aerodynamically efficient, easy to manufacture, and mechanically sound is discussed. The blade design method described here satisfies the first two criteria and with a judicious choice of certain variables will also satisfy stress considerations. The blade shape is generated by specifying surface velocity distributions and consists of straight-line elements that connect points at hub and shroud. The method may be used to design radially elemented and backward-swept blades. The background, a brief account of the theory, and a sample design are described.

  13. Model reduction methods for control design

    NASA Technical Reports Server (NTRS)

    Dunipace, K. R.

    1988-01-01

    Several different model reduction methods are developed and detailed implementation information is provided for those methods. Command files to implement the model reduction methods in a proprietary control law analysis and design package are presented. A comparison and discussion of the various reduction techniques is included.
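
    The abstract does not name the specific reduction techniques, so as a hedged illustration here is one widely used method, balanced truncation: compute the controllability and observability Gramians, rank states by their Hankel singular values, and truncate the states that contribute little to the input-output map. The system below is invented for the demonstration.

```python
import numpy as np

def lyap(A, W):
    """Solve A X + X A.T + W = 0 by vectorising with Kronecker products
    (fine for small systems; dedicated solvers scale better)."""
    n = A.shape[0]
    K = np.kron(A, np.eye(n)) + np.kron(np.eye(n), A)
    return np.linalg.solve(K, -W.reshape(-1)).reshape(n, n)

# A stable 4-state example with one weak, nearly unobservable mode.
A = np.diag([-1.0, -2.0, -5.0, -50.0])
B = np.array([[1.0], [1.0], [0.5], [0.01]])
C = np.array([[1.0, 1.0, 0.5, 0.01]])

P = lyap(A, B @ B.T)      # controllability Gramian
Q = lyap(A.T, C.T @ C)    # observability Gramian

# Hankel singular values rank each state's contribution to the
# input-output behaviour; tiny ones mark states safe to truncate.
hsv = np.sort(np.sqrt(np.abs(np.linalg.eigvals(P @ Q))))[::-1]
print(hsv.round(6))
```

    In a control-design package the surviving states would then define the reduced model used for controller synthesis, with the discarded Hankel singular values bounding the approximation error.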

  14. Mixed Methods Research Designs in Counseling Psychology

    ERIC Educational Resources Information Center

    Hanson, William E.; Creswell, John W.; Clark, Vicki L. Plano; Petska, Kelly S.; Creswell, David J.

    2005-01-01

    With the increased popularity of qualitative research, researchers in counseling psychology are expanding their methodologies to include mixed methods designs. These designs involve the collection, analysis, and integration of quantitative and qualitative data in a single or multiphase study. This article presents an overview of mixed methods…

  15. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  16. Age-Based Methods to Explore Time-Related Variables in Occupational Epidemiology Studies

    SciTech Connect

    Watkins, Janice P.; Frome, Edward L.; Cragle, Donna L.

    2005-08-31

    Although age is recognized as the strongest predictor of mortality in chronic disease epidemiology, a calendar-based approach is often employed when evaluating time-related variables. An age-based analysis file, created by determining the value of each time-dependent variable for each age that a cohort member is followed, provides a clear definition of age at exposure and allows development of diverse analytic models. To demonstrate the methods, the relationship between cancer mortality and external radiation was analyzed with Poisson regression for 14,095 Oak Ridge National Laboratory workers. Based on previous analysis of this cohort, a model with ten-year-lagged cumulative radiation doses partitioned by receipt before (dose-young) or after (dose-old) age 45 was examined. Dose-response estimates were similar to calendar-year-based results, with elevated risk for dose-old, but not when film badge readings were weekly before 1957. Complementary results showed increasing risk with older hire ages and earlier birth cohorts, since workers hired after age 45 were born before 1915, and dose-young and dose-old were distributed differently by birth cohort. Risks were generally higher for smoking-related than non-smoking-related cancers. It was difficult to single out specific variables associated with elevated cancer mortality because of (1) birth-cohort differences in hire age and completeness of mortality experience, and (2) time-period differences in working conditions, dose potential, and exposure assessment. This research demonstrated the utility and versatility of the age-based approach.
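    The core data step, constructing an analysis file with one record per attained age carrying the lagged, age-partitioned cumulative dose, can be sketched as follows. This is a hedged illustration: the worker data, field names, and the `age_based_records` helper are invented here, not taken from the actual Oak Ridge analysis.

```python
# Minimal sketch of building an age-based analysis file: one record per
# attained age of follow-up, carrying the ten-year-lagged cumulative
# dose split by age at receipt (before or after 45).

LAG = 10        # years of dose lag
SPLIT_AGE = 45  # dose-young vs. dose-old partition

def age_based_records(hire_age, exit_age, annual_dose):
    """annual_dose maps age-at-exposure -> dose received that year (mSv)."""
    records = []
    for age in range(hire_age, exit_age + 1):
        # only dose received at least LAG years before the current age counts
        dose_young = sum(d for a, d in annual_dose.items()
                         if a <= age - LAG and a < SPLIT_AGE)
        dose_old = sum(d for a, d in annual_dose.items()
                       if a <= age - LAG and a >= SPLIT_AGE)
        records.append({"age": age,
                        "dose_young": dose_young,
                        "dose_old": dose_old})
    return records

# Illustrative worker: hired at 40, followed to 60, 2 mSv/year through age 55
doses = {a: 2.0 for a in range(40, 56)}
rows = age_based_records(40, 60, doses)
```

    Each record would then be joined with person-time and death counts at that age and fed to a Poisson regression with dose-young and dose-old as separate terms.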

  17. Epidemiological causality.

    PubMed

    Morabia, Alfredo

    2005-01-01

    Epidemiological methods, which combine population thinking and group comparisons, can primarily identify causes of disease in populations. There is therefore a tension between our intuitive notion of a cause, which we want to be deterministic and invariant at the individual level, and the epidemiological notion of causes, which are invariant only at the population level. Epidemiologists have heretofore given a pragmatic solution to this tension. Causal inference in epidemiology consists in checking the logical coherence of a causality statement and determining whether what has been found grossly contradicts what we think we already know: How strong is the association? Is there a dose-response relationship? Does the cause precede the effect? Is the effect biologically plausible? And so on. This approach to causal inference can be traced back to the English philosophers David Hume and John Stuart Mill. On the other hand, the mode of establishing causality devised by Jakob Henle and Robert Koch, which has been fruitful in bacteriology, requires that in every instance the effect invariably follows the cause (e.g., inoculation of the Koch bacillus and tuberculosis). This is incompatible with epidemiological causality, which has to deal with probabilistic effects (e.g., smoking and lung cancer) and is therefore invariant only for the population. PMID:16898206

  18. Development and Evaluation for Active Learning Instructional Design of Epidemiology in Nursing Informatics Field.

    PubMed

    Majima, Yukie

    2016-01-01

    Nursing education classes are classifiable into three types: lectures, classroom practice, and clinical practice. In this study, we implemented a class that incorporated elements of active learning, including clickers, minute papers, quizzes, and group work and presentations, in the subject of "epidemiology", which is often positioned in the field of nursing informatics and is usually taught in conventional knowledge-transmission lectures, to help students understand the material and achieve seven class goals. Results revealed that the average scores for class achievement (on a five-level evaluation) were 3.6-3.9, which was good overall. The highest average score in the students' evaluation of teaching materials (five-level evaluation) was 4.6 for quizzes, followed by 4.2 for announcement of test statistics, 4.1 for clickers, and 4.0 for presentation of news related to epidemiology. We regard these as useful tools for increasing student motivation. One problem with the class was the time it took to organize: creating tests, preparing and marking class materials (such as items to be returned and the distribution of clickers), and writing comments on the minute papers. PMID:27332214

  19. Development of a hydraulic turbine design method

    NASA Astrophysics Data System (ADS)

    Kassanos, Ioannis; Anagnostopoulos, John; Papantonis, Dimitris

    2013-10-01

    In this paper a hydraulic turbine parametric design method is presented which is based on the combination of traditional methods and parametric surface modeling techniques. The blade of the turbine runner is described using Bezier surfaces for the definition of the meridional plane as well as the blade angle distribution, and a thickness distribution applied normal to the mean blade surface. In this way, it is possible to define parametrically the whole runner using a relatively small number of design parameters, compared to conventional methods. The above definition is then combined with a commercial CFD software and a stochastic optimization algorithm towards the development of an automated design optimization procedure. The process is demonstrated with the design of a Francis turbine runner.
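    The parametric idea is that a low-order Bezier curve, evaluated with de Casteljau's algorithm, describes an entire distribution (for example, blade angle versus normalized meridional length) through a handful of control values. A minimal sketch, with illustrative control-point values:

```python
# De Casteljau evaluation of a Bezier curve: repeated linear
# interpolation between successive control points until one value remains.

def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1]."""
    pts = list(control_points)
    while len(pts) > 1:
        pts = [(1 - t) * a + t * b for a, b in zip(pts, pts[1:])]
    return pts[0]

# Blade angle (degrees) from inlet (t=0) to outlet (t=1), defined by
# just four control values -- the design parameters of this distribution
ctrl = [18.0, 25.0, 32.0, 28.0]
angles = [de_casteljau(ctrl, i / 20) for i in range(21)]
```

    Moving one control value reshapes the whole distribution smoothly, which is what makes so few design parameters sufficient and makes the geometry convenient for a stochastic optimizer to perturb.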

  20. Preliminary aerothermodynamic design method for hypersonic vehicles

    NASA Technical Reports Server (NTRS)

    Harloff, G. J.; Petrie, S. L.

    1987-01-01

    Preliminary design methods are presented for vehicle aerothermodynamics. Predictions are made for Shuttle orbiter, a Mach 6 transport vehicle and a high-speed missile configuration. Rapid and accurate methods are discussed for obtaining aerodynamic coefficients and heat transfer rates for laminar and turbulent flows for vehicles at high angles of attack and hypersonic Mach numbers.

  1. The semi-individual study in air pollution epidemiology: a valid design as compared to ecologic studies.

    PubMed Central

    Künzli, N; Tager, I B

    1997-01-01

    The assessment of long-term effects of air pollution in humans relies on epidemiologic studies. A widely used design consists of cross-sectional or cohort studies in which ecologic assignment of exposure, based on a fixed-site ambient monitor, is employed. Although health outcome and usually a large number of covariates are measured in individuals, these studies are often called ecologic. We introduce the term semi-individual design for these studies. We review the major properties and limitations, with regard to causal inference, of truly ecologic studies, in which outcome, exposure, and covariates are available on an aggregate level only. Misclassification problems and issues related to confounding and model specification in truly ecologic studies limit etiologic inference to individuals. In contrast, the semi-individual study shares its methodological and inferential properties with typical individual-level study designs. The major caveat relates to the case where too few study areas, e.g., two or three, are used, which renders control of aggregate-level confounding impossible. The issue of exposure misclassification is of general concern in epidemiology and not an exclusive problem of the semi-individual design. In a multicenter setting, the semi-individual study is a valuable tool for approaching long-term effects of air pollution. Knowledge about the error structure of the ecologically assigned exposure allows consideration of its impact on effect estimation. Semi-individual studies, i.e., individual-level air pollution studies with ecologic exposure assignment, more readily permit valid inference to individuals and should not be labeled ecologic studies. PMID:9349825

  2. Combinatorial protein design strategies using computational methods.

    PubMed

    Kono, Hidetoshi; Wang, Wei; Saven, Jeffery G

    2007-01-01

    Computational methods continue to facilitate efforts in protein design. Most of this work has focused on searching sequence space to identify one or a few sequences compatible with a given structure and functionality. Probabilistic computational methods provide information regarding the range of amino acid variability permitted by desired functional and structural constraints. Such methods may be used to guide the construction of both individual sequences and combinatorial libraries of proteins. PMID:17041256

  3. Multidisciplinary Optimization Methods for Aircraft Preliminary Design

    NASA Technical Reports Server (NTRS)

    Kroo, Ilan; Altus, Steve; Braun, Robert; Gage, Peter; Sobieski, Ian

    1994-01-01

    This paper describes a research program aimed at improved methods for multidisciplinary design and optimization of large-scale aeronautical systems. The research involves new approaches to system decomposition, interdisciplinary communication, and methods of exploiting coarse-grained parallelism for analysis and optimization. A new architecture that involves tight coupling between optimization and analysis is intended to improve efficiency while simplifying the structure of multidisciplinary, computation-intensive design problems involving many analysis disciplines and perhaps hundreds of design variables. Work in two areas is described here: system decomposition using compatibility constraints to simplify the analysis structure and take advantage of coarse-grained parallelism; and collaborative optimization, a decomposition of the optimization process that permits parallel design and simplifies interdisciplinary communication requirements.

  4. Methods and Technologies Branch (MTB)

    Cancer.gov

    The Methods and Technologies Branch focuses on methods to address epidemiologic data collection, study design and analysis, and to modify technological approaches to better understand cancer susceptibility.

  5. Methicillin-resistant Staphylococcus aureus in Spain: molecular epidemiology and utility of different typing methods.

    PubMed

    Vindel, Ana; Cuevas, Oscar; Cercenado, Emilia; Marcos, Carmen; Bautista, Verónica; Castellares, Carol; Trincado, Pilar; Boquete, Teresa; Pérez-Vázquez, Maria; Marín, Mercedes; Bouza, Emilio

    2009-06-01

    In a point-prevalence study performed in 145 Spanish hospitals in 2006, we collected 463 isolates of Staphylococcus aureus in a single day. Of these, 135 (29.2%) were methicillin (meticillin)-resistant S. aureus (MRSA) isolates. Susceptibility testing was performed by a microdilution method, and mecA was detected by PCR. The isolates were analyzed by pulsed-field gel electrophoresis (PFGE) after SmaI digestion, staphylococcal chromosomal cassette mec (SCCmec) typing, agr typing, spa typing with BURP (based-upon-repeat-pattern) analysis, and multilocus sequence typing (MLST). The 135 MRSA isolates showed resistance to ciprofloxacin (93.3%), tobramycin (72.6%), gentamicin (20.0%), erythromycin (66.7%), and clindamycin (39.3%). Among the isolates resistant to erythromycin, 27.4% showed the M phenotype. All of the isolates were susceptible to glycopeptides. Twelve resistance patterns were found, of which four accounted for 65% of the isolates. PFGE revealed 36 different patterns, with 13 major clones (including 2 predominant clones with various antibiotypes that accounted for 52.5% of the MRSA isolates) and 23 sporadic profiles. Two genotypes were observed for the first time in Spain. SCCmec type IV accounted for 6.7% of the isolates (70.1% were type IVa, 23.9% were type IVc, 0.9% were type IVd, and 5.1% were type IVh), and SCCmec type I and SCCmec type II accounted for 7.4% and 5.2% of the isolates, respectively. One isolate was nontypeable. Only one of the isolates produced the Panton-Valentine leukocidin. The isolates presented agr type 2 (82.2%), type 1 (14.8%), and type 3 (3.0%). spa typing revealed 32 different types, the predominant ones being t067 (48.9%) and t002 (14.8%), as well as clonal complex 067 (78%) by BURP analysis. The MRSA clone of sequence type 125 and SCCmec type IV was the most prevalent throughout Spain. In our experience, PFGE, spa typing, SCCmec typing, and MLST presented good correlations for the majority of the MRSA strains; we suggest the…

  6. Axisymmetric inlet minimum weight design method

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth

    1995-01-01

    An analytical method for determining the minimum weight design of an axisymmetric supersonic inlet has been developed. The goal of this method development project was to improve the ability to predict the weight of high-speed inlets in conceptual and preliminary design. The initial model was developed using information that was available from inlet conceptual design tools (e.g., the inlet internal and external geometries and pressure distributions). Stiffened shell construction was assumed. Mass properties were computed by analyzing a parametric cubic curve representation of the inlet geometry. Design loads and stresses were developed at analysis stations along the length of the inlet. The equivalent minimum structural thicknesses for both shell and frame structures required to support the maximum loads produced by various load conditions were then determined. Preliminary results indicated that inlet hammershock pressures produced the critical design load condition for a significant portion of the inlet. By improving the accuracy of inlet weight predictions, the method will improve the fidelity of propulsion and vehicle design studies and increase the accuracy of weight versus cost studies.
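    One sizing step implied above can be illustrated with the thin-wall hoop-stress relation, under which the hammershock transient governs the required shell thickness whenever its pressure exceeds the other load conditions. All numbers below are illustrative assumptions, not values from the method:

```python
# Hedged sketch of a single shell-sizing step: the minimum thickness of
# a circular inlet section under internal pressure, from the thin-wall
# hoop-stress relation sigma = p * r / t  ->  t = p * r / sigma_allow.

def min_shell_thickness(pressure_pa, radius_m, sigma_allow_pa):
    """Thin-walled cylinder: required thickness for a given pressure load."""
    return pressure_pa * radius_m / sigma_allow_pa

cruise_p = 80e3        # Pa, nominal internal pressure (illustrative)
hammershock_p = 240e3  # Pa, transient surge pressure (illustrative)

# size the shell for the worst of the candidate load conditions
t_req = max(min_shell_thickness(p, 0.5, 300e6)
            for p in (cruise_p, hammershock_p))
```

    With these numbers the hammershock case sets `t_req`, mirroring the preliminary finding that hammershock pressures produce the critical design load over much of the inlet.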

  7. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design, addressing such variables as performance, up-front costs, downstream operations costs, and reliability. The approach also weighs operational approaches by their effect on upstream design variables, so that linkages between operations and those variables can be established readily yet defensibly. To avoid the large range of problems that have defeated previous methods of dealing with the complexities of transportation design, and to cut down on the inefficient use of resources, the method identifies those areas that are of sufficient promise to warrant a higher grade of analysis, as well as the linkages between operations and other factors. Ultimately, the method is designed to save resources and time, and allows for the evolution of operable space transportation system technology and of design and conceptual system targets.

  8. Optimization methods applied to hybrid vehicle design

    NASA Technical Reports Server (NTRS)

    Donoghue, J. F.; Burghart, J. H.

    1983-01-01

    The use of optimization methods as an effective design tool in the design of hybrid vehicle propulsion systems is demonstrated. Optimization techniques were used to select values for three design parameters (battery weight, heat engine power rating, and power split between the two on-board energy sources) such that various measures of vehicle performance (acquisition cost, life cycle cost, and petroleum consumption) were optimized. The approach produced designs which were often significant improvements over hybrid designs already reported in the literature. The principal conclusions are as follows. First, it was found that the strategy used to split the required power between the two on-board energy sources can have a significant effect on life cycle cost and petroleum consumption. Second, the optimization program should be constructed so that performance measures and design variables can be easily changed. Third, the vehicle simulation program has a significant effect on the computer run time of the overall optimization program; run time can be significantly reduced by proper design of the types of trips the vehicle takes in a one-year period. Fourth, care must be taken in designing the cost and constraint expressions used in the optimization so that they are relatively smooth functions of the design variables. Fifth, proper handling of constraints on battery weight and heat engine rating, variables which must be large enough to meet power demands, is particularly important for the success of an optimization study. Finally, the overall conclusion is that optimization methods provide a practical tool for carrying out the design of a hybrid vehicle propulsion system.
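    The flavor of the parameter selection can be sketched with a toy constrained search over two of the three design variables. The cost surrogate, specific-power figure, and peak-demand constraint below are invented for illustration; the actual study used a full vehicle simulation and carefully smoothed formulations:

```python
# Illustrative sketch: choose battery weight and heat-engine rating to
# minimize a toy life-cycle-cost surrogate subject to the constraint
# that the two on-board sources together meet a peak power demand.

PEAK_DEMAND_KW = 80.0
BATTERY_KW_PER_KG = 0.5   # assumed specific power of the battery pack

def cost(battery_kg, engine_kw):
    # Smooth surrogate: both a heavier battery and a larger engine add cost
    return 0.8 * battery_kg + 2.0 * engine_kw

def feasible(battery_kg, engine_kw):
    # Constraint: combined power capability meets the peak demand
    return battery_kg * BATTERY_KW_PER_KG + engine_kw >= PEAK_DEMAND_KW

# crude grid search over the feasible region (a real study would use a
# gradient-based or other smooth optimizer, as the abstract emphasizes)
best = min(
    ((b, e) for b in range(0, 401, 5) for e in range(0, 101, 5)
     if feasible(b, e)),
    key=lambda be: cost(*be),
)
```

    Because the surrogate is linear, the optimum sits at a constraint corner; the abstract's fourth and fifth conclusions concern keeping such cost and constraint expressions smooth and properly handled so a real optimizer behaves.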

  9. Evaluation of the Epidemiologic Utility of Secondary Typing Methods for Differentiation of Mycobacterium tuberculosis Isolates

    PubMed Central

    Kwara, Awewura; Schiro, Ronald; Cowan, Lauren S.; Hyslop, Newton E.; Wiser, Mark F.; Roahen Harrison, Stephanie; Kissinger, Patricia; Diem, Lois; Crawford, Jack T.

    2003-01-01

    Spoligotyping and mycobacterial interspersed repetitive unit-variable-number tandem repeat analysis (MIRU-VNTR) were evaluated for the ability to differentiate 64 Mycobacterium tuberculosis isolates from 10 IS6110-defined clusters. MIRU-VNTR performed slightly better than spoligotyping in reducing the number of clustered isolates and the sizes of the clusters. All epidemiologically related isolates remained clustered by MIRU-VNTR but not by spoligotyping. PMID:12791904

  10. Environmental epidemiology: challenges and opportunities.

    PubMed Central

    Pekkanen, J; Pearce, N

    2001-01-01

    Epidemiology is struggling increasingly with problems with correlated exposures and small relative risks. As a consequence, some scholars have strongly emphasized molecular epidemiology, whereas others have argued for the importance of the population context and the reintegration of epidemiology into public health. Environmental epidemiology has several unique features that make these debates especially pertinent to it. The very large number of environmental exposures require prioritization, and the relative risks are usually very low. Furthermore, many environmental exposures can be addressed only by comparing populations rather than individuals, and the disruption of both local and global ecosystems requires us to develop new methods of study design. The population context is also very important to consider in risk management decisions because of the involuntary nature of most environmental exposures and the diversity of possible outcomes, both health- and nonhealth-related. Studies at the individual or molecular level tend to focus the research hypotheses and subsequent interventions at that level, even when research and interventions at other levels may be more appropriate. Thus, only by starting from the population and ecosystem levels can we ensure that these are given appropriate consideration. Although better research is needed at all levels, it is crucially important to choose the most appropriate level, or levels, of research for a particular problem. Only by conducting research at all these levels and by developing further methods to combine evidence from these different levels can we hope to address the challenges facing environmental epidemiology today. PMID:11171517

  11. A sociotechnical method for designing work systems.

    PubMed

    Waterson, Patrick E; Older Gray, Melanie T; Clegg, Chris W

    2002-01-01

    The paper describes a new method for allocating work between and among humans and machines. The method consists of a series of stages, which cover how the overall work system should be organized and designed; how tasks within the work system should be allocated (human-human allocations); and how tasks involving the use of technology should be allocated (human-machine allocations). The method makes use of a series of decision criteria that allow end users to consider a range of factors relevant to function allocation, including aspects of job, organizational, and technological design. The method is described in detail using an example drawn from a workshop involving the redesign of a naval command and control (C2) subsystem. We also report preliminary details of the evaluation of the method, based on the views of participants at the workshop. A final section outlines the contribution of the work in terms of current theoretical developments within the domain of function allocation. The method has been applied to the domain of naval C2 systems; however, it is also designed for generic use within function allocation and sociotechnical work systems. PMID:12502156

  12. MAST Propellant and Delivery System Design Methods

    NASA Technical Reports Server (NTRS)

    Nadeem, Uzair; Mc Cleskey, Carey M.

    2015-01-01

    A Mars Aerospace Taxi (MAST) concept and propellant storage and delivery case study is undergoing investigation by NASA's Element Design and Architectural Impact (EDAI) design and analysis forum. The MAST lander concept envisions landing with its ascent propellant storage tanks empty and supplying these reusable Mars landers with propellant that is generated and transferred while on the Mars surface. The report provides an overview of data derived from modeling different methods of propellant line routing (or "lining") and differentiates the resulting design and operations complexity of fluid and gaseous paths for a given set of fluid sources and destinations. The EDAI team desires a rough-order-of-magnitude algorithm for estimating the lining characteristics (i.e., the plumbing mass and complexity) associated with different numbers of vehicle propellant sources and destinations. This paper explores the feasibility of preparing a mathematically sound algorithm for this purpose and offers a method for the EDAI team to implement.

  13. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

    Research conducted by the Langley Research Center through 1995, which resulted in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for the engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code to be available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  14. Acoustic Treatment Design Scaling Methods. Phase 2

    NASA Technical Reports Server (NTRS)

    Clark, L. (Technical Monitor); Parrott, T. (Technical Monitor); Jones, M. (Technical Monitor); Kraft, R. E.; Yu, J.; Kwan, H. W.; Beer, B.; Seybert, A. F.; Tathavadekar, P.

    2003-01-01

    The ability to design, build and test miniaturized acoustic treatment panels on scale model fan rigs representative of full scale engines provides not only cost savings, but also an opportunity to optimize the treatment by allowing multiple tests. To use scale model treatment as a design tool, the impedance of the sub-scale liner must be known with confidence. This study was aimed at developing impedance measurement methods for high frequencies. A normal incidence impedance tube method that extends the upper frequency range to 25,000 Hz without grazing flow effects was evaluated. The free field method was investigated as a potential high frequency technique. The potential of the two-microphone in-situ impedance measurement method was evaluated in the presence of grazing flow. Difficulties in achieving the high frequency goals were encountered in all methods. Results of developing a time-domain finite difference resonator impedance model indicated that a re-interpretation of the empirical fluid mechanical models used in the frequency-domain model for nonlinear resistance and mass reactance may be required. A scale model treatment design that could be tested on the Universal Propulsion Simulator vehicle was proposed.

  15. 3.6 Simplified methods for design

    SciTech Connect

    Nickell, R.E.; Yahr, G.T.

    1981-01-01

    Simplified design analysis methods for elevated temperature construction are classified and reviewed. Because the major impetus for developing elevated temperature design methodology during the past ten years has been the LMFBR program, considerable emphasis is placed upon results from this source. The operating characteristics of the LMFBR are such that cycles of severe transient thermal stresses can be interspersed with normal elevated temperature operational periods of significant duration, leading to a combination of plastic and creep deformation. The various simplified methods are organized into two general categories, depending upon whether it is the material, or constitutive, model that is reduced, or the geometric modeling that is simplified. Because the elastic representation of material behavior is so prevalent, an entire section is devoted to elastic analysis methods. Finally, the validation of the simplified procedures is discussed.

  16. Geometric methods for the design of mechanisms

    NASA Astrophysics Data System (ADS)

    Stokes, Ann Westagard

    1993-01-01

    Challenges posed by the process of designing robotic mechanisms have provided a new impetus to research in the classical subjects of kinematics, elastic analysis, and multibody dynamics. Historically, mechanism designers have considered these areas of analysis to be generally separate and distinct sciences. However, there are significant classes of problems which require a combination of these methods to arrive at a satisfactory solution. For example, both the compliance and the inertia distribution strongly influence the performance of a robotic manipulator. In this thesis, geometric methods are applied to the analysis of mechanisms where kinematics, elasticity, and dynamics play fundamental and interactive roles. Tools for the mathematical analysis, design, and optimization of a class of holonomic and nonholonomic mechanisms are developed. Specific contributions of this thesis include a network theory for elasto-kinematic systems. The applicability of the network theory is demonstrated by employing it to calculate the optimal distribution of joint compliance in a serial manipulator. In addition, the advantage of applying Lie group theoretic approaches to mechanisms requiring specific dynamic properties is demonstrated by extending Brockett's product of exponentials formula to the domain of dynamics. Conditions for the design of manipulators having inertia matrices which are constant in joint angle coordinates are developed. Finally, analysis and design techniques are developed for a class of mechanisms which rectify oscillations into secular motions. These techniques are applied to the analysis of free-floating chains that can reorient themselves in zero angular momentum processes and to the analysis of rattleback tops.
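    The product-of-exponentials idea mentioned above can be illustrated for a planar two-link arm: each revolute joint contributes the exponential of its twist, here written in closed form as a homogeneous rotation about the joint axis. The link lengths and joint placements are illustrative.

```python
import math

def rot_about(px, py, theta):
    """3x3 homogeneous planar rotation by theta about the point (px, py);
    this is the exponential of the corresponding revolute-joint twist."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, px - c * px + s * py],
            [s,  c, py - s * px - c * py],
            [0,  0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Two-link planar arm: joint 1 at the origin, joint 2 at (1, 0),
# tool at (2, 0) in the reference configuration.
def tool_position(theta1, theta2):
    # product of exponentials: compose the joint exponentials in order,
    # then apply the result to the reference tool position
    g = matmul(rot_about(0, 0, theta1), rot_about(1, 0, theta2))
    x0, y0 = 2.0, 0.0
    return (g[0][0] * x0 + g[0][1] * y0 + g[0][2],
            g[1][0] * x0 + g[1][1] * y0 + g[1][2])
```

    Composing the two exponentials and applying the result to the reference tool position gives the forward kinematics; Brockett's formula generalizes this construction to arbitrary serial chains in SE(3).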

  17. Reliability Methods for Shield Design Process

    NASA Technical Reports Server (NTRS)

    Tripathi, R. K.; Wilson, J. W.

    2002-01-01

    Providing protection against the hazards of space radiation is a major challenge to the exploration and development of space. The great cost of added radiation shielding is a potential limiting factor in deep space operations. In this enabling technology, we have developed methods for optimized shield design over multi-segmented missions involving multiple work and living areas in the transport and duty phase of space missions. The total shield mass over all pieces of equipment and habitats is optimized subject to career dose and dose rate constraints. An important component of this technology is the estimation of two most commonly identified uncertainties in radiation shield design, the shielding properties of materials used and the understanding of the biological response of the astronaut to the radiation leaking through the materials into the living space. The largest uncertainty, of course, is in the biological response to especially high charge and energy (HZE) ions of the galactic cosmic rays. These uncertainties are blended with the optimization design procedure to formulate reliability-based methods for shield design processes. The details of the methods will be discussed.

  18. Review of methods of dose estimation for epidemiological studies of the radiological impact of nevada test site and global fallout.

    PubMed

    Beck, Harold L; Anspaugh, Lynn R; Bouville, André; Simon, Steven L

    2006-07-01

    Methods to assess radiation doses from nuclear weapons test fallout have been used to estimate doses to populations and individuals in a number of studies. However, only a few epidemiology studies have relied on fallout dose estimates. Though the methods for assessing doses from local and regional compared to global fallout are similar, there are significant differences in predicted doses and contributing radionuclides depending on the source of the fallout, e.g. whether the nuclear debris originated in Nevada at the U.S. nuclear test site or whether it originated at other locations worldwide. The sparse historical measurement data available are generally sufficient to estimate external exposure doses reasonably well. However, reconstruction of doses to body organs from ingestion and inhalation of radionuclides is significantly more complex and is almost always more uncertain than are external dose estimates. Internal dose estimates are generally based on estimates of the ground deposition per unit area of specific radionuclides and subsequent transport of radionuclides through the food chain. A number of technical challenges to correctly modeling deposition of fallout under wet and dry atmospheric conditions still remain, particularly at close-in locations where sizes of deposited particles vary significantly over modest changes in distance. This paper summarizes the various methods of dose estimation from weapons test fallout and the most important dose assessment and epidemiology studies that have relied on those methods. PMID:16808609

  19. Program for the epidemiological evaluation of stroke in Tandil, Argentina (PREVISTA) study: rationale and design.

    PubMed

    Sposato, Luciano A; Coppola, Mariano L; Altamirano, Juan; Borrego Guerrero, Brenda; Casanova, Jorge; De Martino, Maximiliano; Díaz, Alejandro; Feigin, Valery L; Funaro, Fernando; Gradillone, María E; Lewin, María L; Lopes, Renato D; López, Daniel H; Louge, Mariel; Maccarone, Patricia; Martens, Cecilia; Miguel, Marcelo; Rabinstein, Alejandro; Morasso, Hernán; Riccio, Patricia M; Saposnik, Gustavo; Silva, Damián; Suasnabar, Ramón; Truelsen, Thomas; Uzcudun, Araceli; Viviani, Carlos A; Bahit, M Cecilia

    2013-10-01

    Population-based epidemiological data on the incidence of stroke are very scarce in Argentina and other Latin American countries. In response to the priorities established by the World Health Organization and the United Nations, PREVISTA was envisaged as a population-based program to determine the incidence of, and mortality from, first-ever and recurrent stroke and transient ischemic attack in Tandil, Buenos Aires, Argentina. The study will be conducted according to the Standardized Tools for Stroke Surveillance (STEPS Stroke) methodology and will enroll all new (incident) and recurrent consecutive cases of stroke and transient ischemic attack in the City of Tandil between May 1, 2013 and April 30, 2015. The study will include patients with ischemic stroke, non-traumatic primary intracerebral hemorrhage, subarachnoid hemorrhage, and transient ischemic attack. To ensure the inclusion of every cerebrovascular event during an observation period of two years, we will implement an 'intensive screening program', consisting of comprehensive daily tracking of every potential stroke or transient ischemic attack event using multiple overlapping sources. Mortality will be determined during follow-up for every enrolled patient. In addition, fatal community events will be screened daily through review of death certificates at funeral homes and local offices of vital statistics. All causes of death will be adjudicated by an ad-hoc committee. The closed population of Tandil is representative of a large proportion of Latin American countries with low- and middle-income economies. The findings and conclusions of PREVISTA may provide data that could support future health policy decision-making in the region. PMID:24024917

  20. A survival tree method for the analysis of discrete event times in clinical and epidemiological studies.

    PubMed

    Schmid, Matthias; Küchenhoff, Helmut; Hoerauf, Achim; Tutz, Gerhard

    2016-02-28

    Survival trees are a popular alternative to parametric survival modeling when there are interactions between the predictor variables or when the aim is to stratify patients into prognostic subgroups. A limitation of classical survival tree methodology is that most algorithms for tree construction are designed for continuous outcome variables. Hence, classical methods might not be appropriate if failure time data are measured on a discrete time scale (as is often the case in longitudinal studies where data are collected, e.g., quarterly or yearly). To address this issue, we develop a method for discrete survival tree construction. The proposed technique is based on the result that the likelihood of a discrete survival model is equivalent to the likelihood of a regression model for binary outcome data. Hence, we modify tree construction methods for binary outcomes such that they result in optimized partitions for the estimation of discrete hazard functions. By applying the proposed method to data from a randomized trial in patients with filarial lymphedema, we demonstrate how discrete survival trees can be used to identify clinically relevant patient groups with similar survival behavior. PMID:26358826
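
The key result cited above (the likelihood of a discrete survival model is equivalent to the likelihood of a binary-outcome regression model) rests on expanding each subject into one binary record per discrete period at risk. A minimal sketch of that person-period expansion, with hypothetical subjects and a hypothetical covariate name:

```python
def to_person_period(subjects):
    """Expand discrete survival data into the person-period binary format.

    subjects maps id -> (event_or_censoring_time, event_observed, covariates).
    Each subject contributes one row per period at risk; y = 1 only in the
    period where an observed event occurs.
    """
    rows = []
    for sid, (time, event, covariates) in subjects.items():
        for t in range(1, time + 1):
            rows.append({"id": sid, "period": t,
                         "y": 1 if (event and t == time) else 0,
                         **covariates})
    return rows

# Hypothetical example: subject A has an event in period 3, B is censored after period 2.
data = {"A": (3, True, {"x": 0.5}), "B": (2, False, {"x": 1.2})}
rows = to_person_period(data)
# A contributes periods 1, 2, 3 with y = 0, 0, 1; B contributes periods 1, 2 with y = 0, 0.
```

Any tree learner for binary outcomes fitted to `y` on these rows then estimates the discrete hazard as a function of period and covariates, which is the equivalence the proposed tree-construction method exploits.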

  1. [The mode of infection of tuberculosis--analysis using molecular epidemiologic methods].

    PubMed

    Abe, C

    1995-11-01

    In the 1950s, evidence showed that INH-resistant Mycobacterium tuberculosis organisms were attenuated in both virulence and pathogenicity in animals, and it was postulated that these bacilli might also be attenuated in their virulence to humans. Subsequent studies, however, indicated that not all INH-resistant strains should be considered attenuated in their virulence to humans. The general conclusion was that patients excreting INH-resistant organisms are somewhat less infectious to their contacts than patients excreting INH-susceptible organisms. Insertion sequences are suitable tools for the diagnosis and epidemiology of tuberculosis because of their highly variable copy number and the variability of their insertion sites in the chromosome. This variability allows the subtyping of M. tuberculosis strains by restriction fragment length polymorphism (RFLP) analysis. Patients with the same RFLP pattern constitute an epidemiologically linked cluster. Clustering indicates recent infection and rapid progression to clinical illness. Nearly one-third of new cases of tuberculosis in San Francisco are the result of recent infection. In Thailand, tuberculosis has now emerged as the most common opportunistic disease associated with HIV infection. Cluster analysis showed the risk of progression to active tuberculosis among individuals infected with HIV. There have been numerous outbreaks of multidrug-resistant tuberculosis in the United States. Most such outbreaks have primarily involved persons infected with HIV, who are thought to have been exposed to the strains in medical or correctional facilities. PMID:8656589

  2. Waterflooding injectate design systems and methods

    SciTech Connect

    Brady, Patrick V.; Krumhansl, James L.

    2014-08-19

    A method of designing an injectate to be used in a waterflooding operation is disclosed. One aspect includes specifying data representative of chemical characteristics of a liquid hydrocarbon, a connate, and a reservoir rock, of a subterranean reservoir. Charged species at an interface of the liquid hydrocarbon are determined based on the specified data by evaluating at least one chemical reaction. Charged species at an interface of the reservoir rock are determined based on the specified data by evaluating at least one chemical reaction. An extent of surface complexation between the charged species at the interfaces of the liquid hydrocarbon and the reservoir rock is determined by evaluating at least one surface complexation reaction. The injectate is designed and is operable to decrease the extent of surface complexation between the charged species at interfaces of the liquid hydrocarbon and the reservoir rock. Other methods, apparatus, and systems are disclosed.

  3. The healthy men study: design and recruitment considerations for environmental epidemiologic studies in male reproductive health

    EPA Science Inventory

    Study Objective: To describe study conduct and response and participant characteristics. Design: Prospective cohort study. Setting: Participants were male partners of women enrolled in a community-based study of drinking water disinfection by-products and pregnancy healt...

  4. An improved design method for EPC middleware

    NASA Astrophysics Data System (ADS)

    Lou, Guohuan; Xu, Ran; Yang, Chunming

    2014-04-01

    To address the problems and difficulties that small and medium-sized enterprises currently face when using the EPC (Electronic Product Code) ALE (Application Level Events) specification to implement middleware, an improved design method for EPC middleware is presented, based on an analysis of the principles of EPC middleware. This method exploits the powerful functionality of the MySQL database, using the database to connect reader-writers with the upper application system instead of developing an ALE application program interface, to achieve middleware with general functionality. The structure is simple and easy to implement and maintain. Under this structure, newly added reader-writers of different types can be configured conveniently, improving the expandability of the system.

  5. Cultural epidemiology of pandemic influenza in urban and rural Pune, India: a cross-sectional, mixed-methods study

    PubMed Central

    Sundaram, Neisha; Schaetti, Christian; Purohit, Vidula; Kudale, Abhay; Weiss, Mitchell G

    2014-01-01

    Objective To identify and compare sociocultural features of pandemic influenza with reference to illness-related experience, meaning and behaviour in urban and rural areas of India. Design Cross-sectional, mixed-methods, cultural epidemiological survey with vignette-based interviews. Semistructured explanatory model interviews were used to study community ideas of the 2009 influenza pandemic. In-depth interviews elaborated respondents’ experience during the pandemic. Setting Urban and rural communities, Pune district, western India. Participants Survey of urban (n=215) and rural (n=221) residents aged between 18 and 65 years. In-depth interviews of respondents with a history of 2009 pandemic influenza (n=6). Results More urban (36.7%) than rural respondents (16.3%, p<0.001) identified the illness in the vignette as ‘swine flu’. Over half (56.7%) believed the illness would be fatal without treatment, but with treatment 96% predicted full recovery. Worry (‘tension’) about the illness was reported as more troubling than somatic symptoms. The most common perceived causes, ‘exposure to a dirty environment’ and ‘cough or sneeze of an infected person’, were more prominent in the urban group. Among rural respondents, climatic conditions, drinking contaminated water, tension and cultural ideas on humoral imbalance from heat-producing or cold-producing foods were more prominent. The most widely reported home treatment was herbal remedies; more rural respondents suggested reliance on prayer, and symptom relief was more of a priority for urban respondents. Government health services were preferred in the urban communities, and rural residents relied more than urban residents on private facilities. The important preventive measures emphasised were cleanliness, wholesome lifestyle and vaccines, and more urban respondents reported the use of masks. In-depth interviews indicated treatment delays during the 2009 pandemic, especially among rural patients

  6. The impact of chronic migraine: The Chronic Migraine Epidemiology and Outcomes (CaMEO) Study methods and baseline results

    PubMed Central

    Serrano, Daniel; Buse, Dawn C; Reed, Michael L; Marske, Valerie; Fanning, Kristina M; Lipton, Richard B

    2015-01-01

    Background Longitudinal migraine studies have rarely assessed headache frequency and disability variation over a year. Methods The Chronic Migraine Epidemiology and Outcomes (CaMEO) Study is a cross-sectional and longitudinal Internet study designed to characterize the course of episodic migraine (EM) and chronic migraine (CM). Participants were recruited from a Web-panel using quota sampling in an attempt to obtain a sample demographically similar to the US population. Participants who passed the screener were assessed every three months with the Core (baseline, six, and 12 months) and Snapshot (months three and nine) modules, which assessed headache frequency, headache-related disability, treatments, and treatment satisfaction. The Core also assessed resource use, health-related quality of life, and other features. One-time cross-sectional modules measured family burden, barriers to medical care, and comorbidities/endophenotypes. Results Of 489,537 invitees, we obtained 58,418 (11.9%) usable returns including 16,789 individuals who met ICHD-3 beta migraine criteria (EM (<15 headache days/mo): n = 15,313 (91.2%); CM (≥15 headache days/mo): n = 1476 (8.8%)). At baseline, all qualified respondents (n = 16,789) completed the Screener, Core, and Barriers to Care modules. Subsequent modules showed some attrition (Comorbidities/Endophenotypes, n = 12,810; Family Burden (Proband), n = 13,064; Family Burden (Partner), n = 4022; Family Burden (Child), n = 2140; Snapshot (three months), n = 9741; Core (six months), n = 7517; Snapshot (nine months), n = 6362; Core (12 months), n = 5915). A total of 3513 respondents (21.0%) completed all modules, and 3626 (EM: n = 3303 (21.6%); CM: n = 323 (21.9%)) completed all longitudinal assessments. Conclusions The CaMEO Study provides cross-sectional and longitudinal data that will contribute to our understanding of the course of migraine over one year and quantify variations in

  7. Design methods of rhombic tensegrity structures

    NASA Astrophysics Data System (ADS)

    Feng, Xi-Qiao; Li, Yue; Cao, Yan-Ping; Yu, Shou-Wen; Gu, Yuan-Tong

    2010-08-01

    As a special type of novel flexible structures, tensegrity holds promise for many potential applications in such fields as materials science, biomechanics, civil and aerospace engineering. Rhombic systems are an important class of tensegrity structures, in which each bar constitutes the longest diagonal of a rhombus of four strings. In this paper, we address the design methods of rhombic structures based on the idea that many tensegrity structures can be constructed by assembling one-bar elementary cells. By analyzing the properties of rhombic cells, we first develop two novel schemes, namely, direct enumeration scheme and cell-substitution scheme. In addition, a facile and efficient method is presented to integrate several rhombic systems into a larger tensegrity structure. To illustrate the applications of these methods, some novel rhombic tensegrity structures are constructed.

  8. Foot-and-mouth disease in pigs: current epidemiological situation and control methods.

    PubMed

    León, Emilio A

    2012-03-01

    Foot-and-mouth disease (FMD) is the paradigm of a transboundary animal disease. Beyond any doubt, it is the most serious challenge to livestock health. Official Veterinary Services in free countries invest considerable amounts of money to prevent its introduction, whereas those in endemic countries invest most of their resources in controlling the disease. A substantial volume of scientific research on different aspects of FMD is produced every year, and as a result current knowledge has made diagnosis of the disease considerably easier. However, FMD is still endemic in about two-thirds of the world's countries, and periodically re-emerges in several others. This paper reviews recent publications, focusing mainly on control measures and the current world epidemiological situation, with particular emphasis on pigs. PMID:22225815

  9. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
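
The statistical model checking idea outlined above amounts to repeatedly simulating a stochastic epidemiological model and checking a formal property on each run, so the satisfaction probability is estimated by Monte Carlo. A minimal sketch using a hypothetical binomial-chain SIR model and a peak-infection property (all parameter values are illustrative, not taken from the paper):

```python
import random

def sir_run(n=100, i0=2, beta=0.3, gamma=0.1, steps=60, rng=random):
    """One run of a stochastic (binomial-chain) SIR model; returns the peak
    number of simultaneously infected individuals."""
    s, i = n - i0, i0
    peak = i
    for _ in range(steps):
        p_inf = 1.0 - (1.0 - beta / n) ** i          # per-susceptible infection prob.
        new_i = sum(rng.random() < p_inf for _ in range(s))  # new infections
        new_r = sum(rng.random() < gamma for _ in range(i))  # new recoveries
        s, i = s - new_i, i + new_i - new_r
        peak = max(peak, i)
    return peak

def satisfaction_probability(prop, runs=200, seed=1):
    """Statistical model checking, sketched: estimate the probability that the
    model satisfies prop by checking it on repeated stochastic runs."""
    rng = random.Random(seed)
    return sum(prop(sir_run(rng=rng)) for _ in range(runs)) / runs

# Property: "the epidemic peak never exceeds 60 simultaneous infections".
p = satisfaction_probability(lambda peak: peak <= 60)
```

In the full approach the property would be stated in a spatio-temporal logic and checked by a probabilistic model checker; the Monte Carlo loop here stands in for that machinery.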

  10. Design and methods in a multi-center case-control interview study.

    PubMed Central

    Hartge, P; Cahill, J I; West, D; Hauck, M; Austin, D; Silverman, D; Hoover, R

    1984-01-01

    We conducted a case-control study in ten areas of the United States in which a total of 2,982 bladder cancer patients and 5,782 population controls were interviewed. We employed a variety of existing and new techniques to reduce bias and to monitor the quality of data collected. We review here many of the design elements and field methods that can be generally applied in epidemiologic studies, particularly multi-center interview studies, and explain the reasons for our selection of the methods, instruments, and procedures used. PMID:6689843

  11. Direct optimization method for reentry trajectory design

    NASA Astrophysics Data System (ADS)

    Jallade, S.; Huber, P.; Potti, J.; Dutruel-Lecohier, G.

    The software package called 'Reentry and Atmospheric Transfer Trajectory' (RATT) was developed under ESA contract for the design of atmospheric trajectories. It includes four programs: TOP (Trajectory OPtimization), which optimizes reentry and aeroassisted transfer trajectories; 6FD and 3FD (6 and 3 degrees of freedom Flight Dynamics), which are devoted to the simulation of the trajectory; and SCA (Sensitivity and Covariance Analysis), which performs covariance analysis on a given trajectory with respect to different uncertainties and error sources. TOP provides the optimum guidance law for a three-degree-of-freedom reentry or aeroassisted transfer (AAOT) trajectory. Deorbit and reorbit impulses (if necessary) can be taken into account in the optimization. A wide choice of cost functions is available to the user, such as the integrated heat flux, the sum of the velocity impulses, or a linear combination of both, for trajectory and vehicle design. The crossrange and the downrange can be maximized during the reentry trajectory. Path constraints are available on the load factor, the heat flux and the dynamic pressure. Results on these proposed options are presented. TOPPHY is the part of the TOP software corresponding to the definition and the computation of the physics of the optimization problem. TOPPHY can interface with several optimizers with dynamic solvers: TOPOP and TROPIC, which use direct collocation methods, and PROMIS, which uses a direct multiple shooting method. TOPOP was developed in the frame of this contract; it uses Hermite polynomials for the collocation method and the NPSOL optimizer from the NAG library. Both TROPIC and PROMIS were developed by the DLR (Deutsche Forschungsanstalt fuer Luft- und Raumfahrt) and use the SLSQP optimizer. For the resolution of the dynamic equations, TROPIC uses a collocation method with splines and PROMIS uses a multiple shooting method with finite differences. The three different optimizers including dynamics were tested on the reentry trajectory of the

  12. Methods for structural design at elevated temperatures

    NASA Technical Reports Server (NTRS)

    Ellison, A. M.; Jones, W. E., Jr.; Leimbach, K. R.

    1973-01-01

    A procedure which can be used to design elevated-temperature structures is discussed. The desired goal is to have the same confidence in the structural integrity at elevated temperature as the factor of safety gives on mechanical loads at room temperature. Methods of design and analysis for creep, creep rupture, and creep buckling are presented. Example problems are included to illustrate the analytical methods. Creep data for some common structural materials are presented. Appendix B is a description, user's manual, and listing for the creep analysis program. The program predicts time to a given creep level or to creep rupture for a material subjected to a specified stress-temperature-time spectrum. Fatigue at elevated temperature is discussed. Methods of analysis for high stress-low cycle fatigue, fatigue below the creep range, and fatigue in the creep range are included. The interaction of thermal fatigue and mechanical loads is considered, and a detailed approach to fatigue analysis is given for structures operating below the creep range.
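
For predicting rupture under a stress-temperature-time spectrum, a standard textbook approach (not necessarily the exact method of the report's program) is Robinson's life-fraction rule: damage accumulates as the sum, over spectrum blocks, of the time spent at each condition divided by the rupture life at that condition, with rupture predicted when the sum reaches 1. A sketch with a hypothetical rupture-life table:

```python
def robinson_damage(spectrum, rupture_life):
    """Robinson's life-fraction rule: D = sum over spectrum blocks of
    (time at condition) / (rupture life at that stress and temperature).
    Rupture is predicted when D reaches 1."""
    return sum(t / rupture_life(stress, temp) for stress, temp, t in spectrum)

# Hypothetical rupture-life table: hours to rupture at (stress in MPa, temperature in deg C).
t_r = {(100, 550): 10_000, (150, 550): 2_000, (100, 600): 1_500}
spectrum = [(100, 550, 2_000), (150, 550, 500), (100, 600, 300)]  # (MPa, deg C, hours)
D = robinson_damage(spectrum, lambda s, T: t_r[(s, T)])
# D = 2000/10000 + 500/2000 + 300/1500 = 0.65, below the rupture criterion of 1.
```

The rupture-life lookup would in practice come from creep data such as that tabulated in the report.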

  13. Research design and statistical methods in Pakistan Journal of Medical Sciences (PJMS)

    PubMed Central

    Akhtar, Sohail; Shah, Syed Wadood Ali; Rafiq, M.; Khan, Ajmal

    2016-01-01

    Objective: This article compares the study designs and statistical methods used in the 2005, 2010 and 2015 issues of the Pakistan Journal of Medical Sciences (PJMS). Methods: Only original articles of PJMS were considered for the analysis. The articles were carefully reviewed for statistical methods and designs, and then recorded accordingly. The frequency of each statistical method and research design was estimated and compared with previous years. Results: A total of 429 articles were evaluated (n=74 in 2005, n=179 in 2010, n=176 in 2015), of which 171 (40%) were cross-sectional and 116 (27%) were prospective study designs. A variety of statistical methods was found in the analysis. The most frequent methods included descriptive statistics (n=315, 73.4%), chi-square/Fisher’s exact tests (n=205, 47.8%) and Student’s t-test (n=186, 43.4%). There was a significant increase in the use of statistical methods over the time period: t-test, chi-square/Fisher’s exact test, logistic regression, epidemiological statistics, and non-parametric tests. Conclusion: This study shows that a diverse variety of statistical methods has been used in the research articles of PJMS and that their frequency increased from 2005 to 2015. However, descriptive statistics was the most frequent method of statistical analysis and the cross-sectional design the most common study design in the published articles. PMID:27022365

  14. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume comprises papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25-29, 1993 in Denver, Colorado. The papers were prepared for presentation in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  15. Epidemiologic Methods Lessons Learned from Environmental Public Health Disasters: Chernobyl, the World Trade Center, Bhopal, and Graniteville, South Carolina

    PubMed Central

    Svendsen, Erik R.; Runkle, Jennifer R.; Dhara, Venkata Ramana; Lin, Shao; Naboka, Marina; Mousseau, Timothy A.; Bennett, Charles

    2012-01-01

    Background: Environmental public health disasters involving hazardous contaminants may have devastating effects. While much is known about their immediate devastation, far less is known about long-term impacts of these disasters. Extensive latent and chronic long-term public health effects may occur. Careful evaluation of contaminant exposures and long-term health outcomes within the constraints imposed by limited financial resources is essential. Methods: Here, we review epidemiologic methods lessons learned from conducting long-term evaluations of four environmental public health disasters involving hazardous contaminants at Chernobyl, the World Trade Center, Bhopal, and Graniteville (South Carolina, USA). Findings: We found several lessons learned which have direct implications for the on-going disaster recovery work following the Fukushima radiation disaster or for future disasters. Interpretation: These lessons should prove useful in understanding and mitigating latent health effects that may result from the nuclear reactor accident in Japan or future environmental public health disasters. PMID:23066404

  16. Design Method and Calibration of Moulinet

    NASA Astrophysics Data System (ADS)

    Itoh, Hirokazu; Yamada, Hirokazu; Udagawa, Sinsuke

    The formula for obtaining the absorption horsepower of a Moulinet was rewritten, and the physical meaning of the constant in the formula was clarified. Based on this study, the design method of the Moulinet and the calibration method of the Moulinet that was performed after manufacture were verified experimentally. Consequently, the following was clarified; (1) If the propeller power coefficient was taken to be the proportionality constant, the absorption horsepower of the Moulinet was proportional to the cube of the revolution speed, and the fifth power of the Moulinet diameter. (2) If the Moulinet design was geometrically similar to the standard dimensions of the Aviation Technical Research Center's type-6 Moulinet, the proportionality constant C1 given in the reference could be used, and the absorption horsepower of the Moulinet was proportional to the cube of the revolution speed, the cube of the Moulinet diameter, and the side projection area of the Moulinet. (3) The proportionality constant C1 was proportional to the propeller power coefficient CP.
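
Finding (1) above can be written as P = C1 · n³ · D⁵, with the propeller power coefficient taken as the proportionality constant. A minimal sketch (values hypothetical; C1 here lumps together the coefficient and any unit conversions):

```python
def moulinet_power(c1, n, d):
    """Absorbed power of a Moulinet per finding (1): proportional to the cube
    of the revolution speed n and the fifth power of the diameter d."""
    return c1 * n**3 * d**5

# Doubling the diameter at fixed speed multiplies the absorbed power by 2**5 = 32.
ratio = moulinet_power(1.0, 10.0, 2.0) / moulinet_power(1.0, 10.0, 1.0)
```

The D⁵ scaling is why calibration after manufacture matters: small dimensional deviations from the standard geometry translate into large power deviations.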

  17. The HIV prevention cascade: integrating theories of epidemiological, behavioural, and social science into programme design and monitoring.

    PubMed

    Hargreaves, James R; Delany-Moretlwe, Sinead; Hallett, Timothy B; Johnson, Saul; Kapiga, Saidi; Bhattacharjee, Parinita; Dallabetta, Gina; Garnett, Geoff P

    2016-07-01

    Theories of epidemiology, health behaviour, and social science have changed the understanding of HIV prevention in the past three decades. The HIV prevention cascade is emerging as a new approach to guide the design and monitoring of HIV prevention programmes in a way that integrates these multiple perspectives. This approach recognises that translating the efficacy of direct mechanisms that mediate HIV prevention (including prevention products, procedures, and risk-reduction behaviours) into population-level effects requires interventions that increase coverage. An HIV prevention cascade approach suggests that high coverage can be achieved by targeting three key components: demand-side interventions that improve risk perception and awareness and acceptability of prevention approaches; supply-side interventions that make prevention products and procedures more accessible and available; and adherence interventions that support ongoing adoption of prevention behaviours, including those that do and do not involve prevention products. Programmes need to develop delivery platforms that ensure these interventions reach target populations, to shape the policy environment so that it facilitates implementation at scale with high quality and intensity, and to monitor the programme with indicators along the cascade. PMID:27365206
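
The cascade logic described above is often illustrated by treating population-level effective coverage as the product of the probabilities of clearing each step (demand, supply, adherence). A minimal sketch with hypothetical step probabilities (the paper defines the components, not these numbers):

```python
def cascade_coverage(steps):
    """Effective coverage as the product of the probabilities of clearing
    each cascade step, conditional on clearing the previous ones."""
    coverage = 1.0
    for _name, p in steps:
        coverage *= p
    return coverage

# Hypothetical step probabilities for one prevention product:
steps = [("aware and motivated (demand)", 0.70),
         ("product available and accessible (supply)", 0.80),
         ("adopts and keeps using it (adherence)", 0.60)]
coverage = cascade_coverage(steps)
# 0.70 * 0.80 * 0.60 = 0.336 of the target population effectively covered.
```

Because the steps multiply, a weak link anywhere in the cascade caps the population-level effect, which is why the approach pairs each step with its own class of intervention and monitoring indicator.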

  18. Measuring sun exposure in epidemiological studies: Matching the method to the research question.

    PubMed

    King, Laura; Xiang, Fan; Swaminathan, Ashwin; Lucas, Robyn M

    2015-12-01

    Sun exposure has risks and benefits for health. Testing these associations requires tools for measuring sun exposure that are feasible and relevant to the time-course of the health outcome. Recent sun exposure, e.g. over the last week, is best captured by dosimeters and sun diaries. These can also be used for medium-term sun exposure, e.g. over several weeks, but incur a high participant burden. Self-reported data on "typical time outdoors" for working and non-working days are less detailed and not influenced by day-to-day variation. Over a longer period, e.g. the lifetime, or for particular life stages, proxies of sun exposure, such as latitude of residence or ambient ultraviolet (UV) radiation levels (from satellites or ground-level monitoring), can be used, with additional detail provided by lifetime sun exposure calendars that include locations of residence, usual time outdoors, and details of sunburn episodes. Objective measures of lifetime sun exposure include the microtopography of sun-exposed skin (e.g. using silicone casts) or conjunctival UV autofluorescence. Potential modifiers of the association between sun exposure and the health outcome, such as clothing coverage and skin colour, may also need to be measured. We provide a systematic approach to selecting sun exposure measures for use in epidemiological health research. PMID:26555640

  19. A structural design decomposition method utilizing substructuring

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.

    1994-01-01

    A new method of design decomposition for structural analysis and optimization is described. For this method, the structure is divided into substructures where each substructure has its structural response described by a structural-response subproblem, and its structural sizing determined from a structural-sizing subproblem. The structural responses of substructures that have rigid body modes when separated from the remainder of the structure are further decomposed into displacements that have no rigid body components, and a set of rigid body modes. The structural-response subproblems are linked together through forces determined within a structural-sizing coordination subproblem which also determines the magnitude of any rigid body displacements. Structural-sizing subproblems having constraints local to the substructures are linked together through penalty terms that are determined by a structural-sizing coordination subproblem. All the substructure structural-response subproblems are totally decoupled from each other, as are all the substructure structural-sizing subproblems, thus there is significant potential for use of parallel solution methods for these subproblems.

  20. Review and International Recommendation of Methods for Typing Neisseria gonorrhoeae Isolates and Their Implications for Improved Knowledge of Gonococcal Epidemiology, Treatment, and Biology

    PubMed Central

    Unemo, Magnus; Dillon, Jo-Anne R.

    2011-01-01

    Summary: Gonorrhea, which may become untreatable due to multiple resistance to available antibiotics, remains a public health problem worldwide. Precise methods for typing Neisseria gonorrhoeae, together with epidemiological information, are crucial for an enhanced understanding regarding issues involving epidemiology, test of cure and contact tracing, identifying core groups and risk behaviors, and recommending effective antimicrobial treatment, control, and preventive measures. This review evaluates methods for typing N. gonorrhoeae isolates and recommends various methods for different situations. Phenotypic typing methods, as well as some now-outdated DNA-based methods, have limited usefulness in differentiating between strains of N. gonorrhoeae. Genotypic methods based on DNA sequencing are preferred, and the selection of the appropriate genotypic method should be guided by its performance characteristics and whether short-term epidemiology (microepidemiology) or long-term and/or global epidemiology (macroepidemiology) matters are being investigated. Currently, for microepidemiological questions, the best methods for fast, objective, portable, highly discriminatory, reproducible, typeable, and high-throughput characterization are N. gonorrhoeae multiantigen sequence typing (NG-MAST) or full- or extended-length porB gene sequencing. However, pulsed-field gel electrophoresis (PFGE) and Opa typing can be valuable in specific situations, i.e., extreme microepidemiology, despite their limitations. For macroepidemiological studies and phylogenetic studies, DNA sequencing of chromosomal housekeeping genes, such as multilocus sequence typing (MLST), provides a more nuanced understanding. PMID:21734242

  1. The causal pie model: an epidemiological method applied to evolutionary biology and ecology.

    PubMed

    Wensink, Maarten; Westendorp, Rudi G J; Baudisch, Annette

    2014-05-01

    A general concept for thinking about causality facilitates swift comprehension of results, and the vocabulary that belongs to the concept is instrumental in cross-disciplinary communication. The causal pie model has fulfilled this role in epidemiology and could be of similar value in evolutionary biology and ecology. In the causal pie model, outcomes result from sufficient causes. Each sufficient cause is made up of a "causal pie" of "component causes". Several different causal pies may exist for the same outcome. If and only if all component causes of a sufficient cause are present, that is, a causal pie is complete, does the outcome occur. The effect of a component cause hence depends on the presence of the other component causes that constitute some causal pie. Because all component causes are equally and fully causative for the outcome, the sum of causes for some outcome exceeds 100%. The causal pie model provides a way of thinking that maps into a number of recurrent themes in evolutionary biology and ecology: It charts when component causes have an effect and are subject to natural selection, and how component causes affect selection on other component causes; which partitions of outcomes with respect to causes are feasible and useful; and how to view the composition of a(n apparently homogeneous) population. The diversity of specific results that is directly understood from the causal pie model is a test for both the validity and the applicability of the model. The causal pie model provides a common language in which results across disciplines can be communicated and serves as a template along which future causal analyses can be made. PMID:24963386
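
The model's core rule (the outcome occurs if and only if all component causes of at least one sufficient cause are present) can be stated in a few lines. A minimal sketch with hypothetical component causes:

```python
def outcome_occurs(present, pies):
    """Causal pie model: the outcome occurs iff some sufficient cause (pie)
    is complete, i.e. all of its component causes are present."""
    return any(pie <= present for pie in pies)  # <= is the subset test

# Two hypothetical sufficient causes ("pies") for the same outcome:
pies = [{"genotype", "exposure"},
        {"exposure", "age", "stress"}]
assert outcome_occurs({"genotype", "exposure"}, pies)   # first pie complete
assert not outcome_occurs({"genotype", "age"}, pies)    # no pie complete
```

Note that "exposure" here is a component of both pies, so eliminating it blocks every pathway; and because each component of a complete pie is fully causative for that case, fractions attributed to individual components can sum to more than 100%, as the abstract notes.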

  2. Global Dissemination of Carbapenemase-Producing Klebsiella pneumoniae: Epidemiology, Genetic Context, Treatment Options, and Detection Methods

    PubMed Central

    Lee, Chang-Ro; Lee, Jung Hun; Park, Kwang Seung; Kim, Young Bae; Jeong, Byeong Chul; Lee, Sang Hee

    2016-01-01

The emergence of carbapenem-resistant Gram-negative pathogens poses a serious threat to public health worldwide. In particular, the increasing prevalence of carbapenem-resistant Klebsiella pneumoniae is a major source of concern. K. pneumoniae carbapenemases (KPCs) and carbapenemases of the oxacillinase-48 (OXA-48) type have been reported worldwide. New Delhi metallo-β-lactamase (NDM) carbapenemases were originally identified in Sweden in 2008 and have spread worldwide rapidly. In this review, we summarize the epidemiology of K. pneumoniae producing three carbapenemases (KPCs, NDMs, and OXA-48-like). Although the prevalence of each resistant strain varies geographically, K. pneumoniae producing KPCs, NDMs, and OXA-48-like carbapenemases have become rapidly disseminated. In addition, we used recently published molecular and genetic studies to analyze the mechanisms by which these three carbapenemases, and major K. pneumoniae clones, such as ST258 and ST11, have become globally prevalent. Because carbapenemase-producing K. pneumoniae are often resistant to most β-lactam antibiotics and many other non-β-lactam molecules, the therapeutic options available to treat infection with these strains are limited to colistin, polymyxin B, fosfomycin, tigecycline, and selected aminoglycosides. Although combination therapy has been recommended for the treatment of severe carbapenemase-producing K. pneumoniae infections, the clinical evidence for this strategy is currently limited, and further randomized controlled trials will be required to establish the most effective treatment regimen. Moreover, because rapid and accurate identification of the carbapenemase type found in K. pneumoniae may be difficult to achieve through phenotypic antibiotic susceptibility tests, novel molecular detection techniques are currently being developed. PMID:27379038

  4. Sequence Analysis of lipR: A Good Method for Molecular Epidemiology of Clinical Isolates of Mycobacterium tuberculosis.

    PubMed

    Saedi, Samaneh; Youssefi, Masoud; Safdari, Hadi; Soleimanpour, Saman; Marouzi, Parviz; Ghazvini, Kiarash

    2015-10-01

    Advances in DNA sequencing have greatly enhanced molecular epidemiology studies. To assess the evolutionary and phylogenetic relations of Mycobacterium tuberculosis isolates, several gene targets were evaluated. In this study, appropriate fragments of five highly variable genes (rpsL, mprA, lipR, katG, and fgd1) were sequenced. The sequence data were analyzed with the neighbor-joining method using MEGA and Geneious software. The phylogenetic tree analyses revealed that the discriminatory power of lipR is much stronger than that of the other genes: lipR distinguished more clinical isolates. Therefore, lipR is a promising target for sequence analysis of M. tuberculosis. PMID:26063445

  5. Method for designing gas tag compositions

    DOEpatents

    Gross, K.C.

    1995-04-11

    For use in the manufacture of gas tags such as employed in a nuclear reactor gas tagging failure detection system, a method for designing gas tagging compositions utilizes an analytical approach wherein the final composition of a first canister of tag gas as measured by a mass spectrometer is designated as node No. 1. Lattice locations of tag nodes in multi-dimensional space are then used in calculating the compositions of a node No. 2 and each subsequent node so as to maximize the distance of each node from any combination of tag components which might be indistinguishable from another tag composition in a reactor fuel assembly. Alternatively, the measured compositions of tag gas numbers 1 and 2 may be used to fix the locations of nodes 1 and 2, with the locations of nodes 3-N then calculated for optimum tag gas composition. A single sphere defining the lattice locations of the tag nodes may be used to define approximately 20 tag nodes, while concentric spheres can extend the number of tag nodes to several hundred. 5 figures.

  6. Method for designing gas tag compositions

    DOEpatents

    Gross, Kenny C.

    1995-01-01

    For use in the manufacture of gas tags such as employed in a nuclear reactor gas tagging failure detection system, a method for designing gas tagging compositions utilizes an analytical approach wherein the final composition of a first canister of tag gas as measured by a mass spectrometer is designated as node #1. Lattice locations of tag nodes in multi-dimensional space are then used in calculating the compositions of a node #2 and each subsequent node so as to maximize the distance of each node from any combination of tag components which might be indistinguishable from another tag composition in a reactor fuel assembly. Alternatively, the measured compositions of tag gas numbers 1 and 2 may be used to fix the locations of nodes 1 and 2, with the locations of nodes 3-N then calculated for optimum tag gas composition. A single sphere defining the lattice locations of the tag nodes may be used to define approximately 20 tag nodes, while concentric spheres can extend the number of tag nodes to several hundred.
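The node-placement idea in the two patent records above can be illustrated with a simplified sketch: starting from a fixed first composition, each subsequent node is chosen to maximize its minimum distance to the nodes already placed (a greedy farthest-point rule). This is a simplification of the patent's scheme, which also guards against indistinguishable combinations of tag components; the candidate points below are hypothetical coordinates, not actual tag-gas data.

```python
# Greedy farthest-point placement: pick each new node from a candidate set
# so as to maximize its minimum distance to already-chosen nodes.
# Candidate compositions here are hypothetical points on a coarse 2-D grid.

import itertools
import math

def place_nodes(candidates, first, k):
    """Greedily pick k well-separated nodes, starting from `first`."""
    nodes = [first]
    pool = [c for c in candidates if c != first]
    while len(nodes) < k and pool:
        best = max(pool, key=lambda c: min(math.dist(c, n) for n in nodes))
        nodes.append(best)
        pool.remove(best)
    return nodes

# Hypothetical candidate compositions on a 5x5 grid over the unit square:
grid = [(x / 4, y / 4) for x, y in itertools.product(range(5), repeat=2)]
nodes = place_nodes(grid, (0.0, 0.0), 4)
print(nodes)  # the four corners: mutually maximally separated
```

With four nodes on the unit square, the rule recovers the four corners, mirroring the patent's goal of spreading tag compositions as far apart as possible in composition space.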

  7. Research and Design of Rootkit Detection Method

    NASA Astrophysics Data System (ADS)

    Liu, Leian; Yin, Zuanxing; Shen, Yuli; Lin, Haitao; Wang, Hongjiang

    Rootkits are among the most serious security threats to networked communication systems and directly affect the security and privacy of Internet users. By exploiting back doors in the operating system, an attacker can use a rootkit to invade other people's computers and easily capture passwords and message traffic to and from those machines. As rootkit technology has developed, its applications have become more extensive and rootkits have become increasingly difficult to detect. In addition, for various reasons, such as trade secrets and the difficulty of development, information on rootkit detection technology and effective tools remain relatively scarce. In this paper, based on an in-depth analysis of rootkit detection technology, a new rootkit detection structure is designed and a new method (software), X-Anti, is proposed. Test results show that software based on the proposed structure is more efficient than existing rootkit detection software.

  8. Fast and simple epidemiological typing of Pseudomonas aeruginosa using the double-locus sequence typing (DLST) method.

    PubMed

    Basset, P; Blanc, D S

    2014-06-01

    Although the molecular typing of Pseudomonas aeruginosa is important to understand the local epidemiology of this opportunistic pathogen, it remains challenging. Our aim was to develop a simple typing method based on the sequencing of two highly variable loci. Single-strand sequencing of three highly variable loci (ms172, ms217, and oprD) was performed on a collection of 282 isolates recovered between 1994 and 2007 (from patients and the environment). As expected, the resolution of each locus alone [number of types (NT) = 35-64; index of discrimination (ID) = 0.816-0.964] was lower than the combination of two loci (NT = 78-97; ID = 0.966-0.971). As each pairwise combination of loci gave similar results, we selected the most robust combination of ms172 [reverse; R] and ms217 [R] to constitute the double-locus sequence typing (DLST) scheme for P. aeruginosa. This combination gave: (i) a complete genotype for 276/282 isolates (typability of 98%), (ii) 86 different types, and (iii) an ID of 0.968. Analysis of multiple isolates from the same patients or taps showed that DLST genotypes are generally stable over a period of several months. The high typability, discriminatory power, and ease of use of the proposed DLST scheme make it a method of choice for local epidemiological analyses of P. aeruginosa. Moreover, the ability to define types unambiguously allowed the development of an Internet database ( http://www.dlst.org ) accessible to all. PMID:24326699
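The "index of discrimination" (ID) reported in this abstract is conventionally Simpson's index of discrimination (Hunter-Gaston): one minus the probability that two isolates drawn at random without replacement share the same type. A minimal sketch with hypothetical typing results:

```python
# Simpson's index of discrimination (Hunter-Gaston) from a list of type
# assignments. The isolate data below are hypothetical, not the study's.

from collections import Counter

def discrimination_index(types):
    """1 - probability that two isolates drawn without replacement
    belong to the same type."""
    n = len(types)
    counts = Counter(types)
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Hypothetical DLST results for 10 isolates assigned to 7 types:
typed = ["t1", "t1", "t2", "t3", "t3", "t3", "t4", "t5", "t6", "t7"]
print(round(discrimination_index(typed), 3))  # → 0.911
```

An ID of 0.968 on 282 isolates, as reported above, means two randomly chosen isolates have only about a 3% chance of sharing a DLST type.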

  9. Evolution and social epidemiology.

    PubMed

    Nishi, Akihiro

    2015-11-01

    Evolutionary biology, which aims to explain the dynamic process of shaping the diversity of life, has not yet significantly affected thinking in social epidemiology. Current challenges in social epidemiology include understanding how social exposures can affect our biology, explaining the dynamics of society and health, and designing better interventions that are mindful of the impact of exposures during critical periods. I review how evolutionary concepts and tools, such as fitness gradient in cultural evolution, evolutionary game theory, and contemporary evolution in cancer, can provide helpful insights regarding social epidemiology. PMID:26319950

  10. Bayesian Decision-theoretic Methods for Parameter Ensembles with Application to Epidemiology

    NASA Astrophysics Data System (ADS)

    Gunterman, Haluna Penelope Frances


  11. Epidemiology of Toxoplasmosis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Infection with Toxoplasma gondii is highly prevalent throughout the world. This chapter discusses modes of transmission, the epidemiology of T. gondii infection worldwide and in Brazil, and methods of prevention and control....

  12. [Epidemiology of "sick buildings"].

    PubMed

    Sterling, T D; Collett, C; Rumel, D

    1991-02-01

    The indoor environment of modern buildings, especially those designed for commercial and administrative purposes, constitutes a unique ecological niche with its own biochemical environment, fauna and flora. Sophisticated construction methods and the new materials and machinery required to maintain the indoor environment of these enclosed structures produce a large number of chemical by-products and permit the growth of many different microorganisms. Because modern office buildings are sealed, the regulation of humidification and temperature of ducted air presents a dilemma, since different species of microorganisms flourish at different combinations of humidity and temperature. If the indoor environment of modern office buildings is not properly maintained, it may become harmful to the occupants' health. Such buildings are classified as "sick buildings". A review of the epidemiology of building illness is presented. The etiology of occupant illnesses, sources of toxic substances, and possible methods of maintaining a safe indoor environment are described. PMID:1784964

  13. Game Methodology for Design Methods and Tools Selection

    ERIC Educational Resources Information Center

    Ahmad, Rafiq; Lahonde, Nathalie; Omhover, Jean-françois

    2014-01-01

    Design process optimisation and intelligence are the key words of today's scientific community. A proliferation of methods has made design a convoluted area. Designers are usually afraid of selecting one method/tool over another and even expert designers may not necessarily know which method is the best to use in which circumstances. This…

  14. A Review of the Epidemiological Methods Used to Investigate the Health Impacts of Air Pollution around Major Industrial Areas

    PubMed Central

    Pascal, Laurence; Bidondo, Marie-Laure; Cochet, Amandine; Sarter, Hélène; Stempfelet, Morgane; Wagner, Vérène

    2013-01-01

    We performed a literature review to investigate how epidemiological studies have been used to assess the health consequences of living in the vicinity of industries. A total of 77 papers on the chronic effects of air pollution around major industrial areas were reviewed. Major health themes were cancers (27 studies), morbidity (25 studies), mortality (7 studies), and birth outcome (7 studies). Only 3 studies investigated mental health. While studies were available from many different countries, a majority of papers came from the United Kingdom, Italy, and Spain. Several studies were motivated by concerns from the population or by previous observations of an overincidence of cases. Geographical ecological designs were largely used for studying cancer and mortality, including statistical designs to quantify a relationship between health indicators and exposure. Morbidity was frequently investigated through cross-sectional surveys on the respiratory health of children. Few multicenter studies were performed. In a majority of papers, exposed areas were defined based on the distance to the industry and were located from <2 km to >20 km from the plants. Improving the exposure assessment would be an asset to future studies. Criteria to include industries in multicenter studies should be defined. PMID:23818910

  15. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  16. An inverse design method for 2D airfoil

    NASA Astrophysics Data System (ADS)

    Liang, Zhi-Yong; Cui, Peng; Zhang, Gen-Bao

    2010-03-01

    The computational method for aerodynamic design of aircraft is applied more universally than before, in which the design of an airfoil is a hot problem. The forward problem is discussed by most relative papers, but inverse method is more useful in practical designs. In this paper, the inverse design of 2D airfoil was investigated. A finite element method based on the variational principle was used for carrying out. Through the simulation, it was shown that the method was fit for the design.

  17. MEASUREMENT ERROR ESTIMATION AND CORRECTION METHODS TO MINIMIZE EXPOSURE MISCLASSIFICATION IN EPIDEMIOLOGICAL STUDIES: PROJECT SUMMARY

    EPA Science Inventory

    This project summary highlights recent findings from research undertaken to develop improved methods to assess potential human health risks related to drinking water disinfection byproduct (DBP) exposures.

  18. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  19. History and Impact of Nutritional Epidemiology

    PubMed Central

    Alpers, David H.; Bier, Dennis M.; Carpenter, Kenneth J.; McCormick, Donald B.; Miller, Anthony B.; Jacques, Paul F.

    2014-01-01

    The real and important role of epidemiology was discussed, noting heretofore unknown associations that led to improved understanding of the cause and prevention of individual nutritional deficiencies. However, epidemiology has been less successful in linking individual nutrients to the cause of chronic diseases, such as cancer and cardiovascular disease. Dietary changes, such as decreasing caloric intake to prevent cancer and the Mediterranean diet to prevent diabetes, were confirmed as successful approaches to modifying the incidence of chronic diseases. The role of the epidemiologist was confirmed as a collaborator, not an isolated expert of last resort. The challenge for the future is to decide which epidemiologic methods and study designs are most useful in studying chronic disease, then to determine which associations and the hypotheses derived from them are especially strong and worthy of pursuit, and finally to design randomized studies that are feasible, affordable, and likely to result in confirmation or refutation of these hypotheses. PMID:25469385

  20. Global optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Arora, Jasbir S.

    1990-01-01

    The problem is to find a global minimum of Problem P. Necessary and sufficient conditions are available for local optimality, but a global solution can be assured only under the assumption of convexity of the problem. If the constraint set S is compact and the cost function is continuous on it, the existence of a global minimum is guaranteed. However, because no global optimality conditions are available, a global solution can be found only by an exhaustive search satisfying the global optimality inequality. The exhaustive search can be organized in such a way that the entire design space need not be searched for the solution, which reduces the computational burden somewhat. It is concluded that the zooming algorithm for global optimization appears to be a good alternative to stochastic methods, although more testing is needed and a general, robust, and efficient local minimizer is required. IDESIGN, which is based on a sequential quadratic programming algorithm, was used in all numerical calculations. Since the feasible set keeps shrinking, a good algorithm for finding an initial feasible point is also required; such algorithms need to be developed and evaluated.

  1. [Epidemiology and heterogeny].

    PubMed

    Breilh, J; Granda, E

    1989-01-01

    The innovation of epidemiology plays a crucial role in the development of the health sciences. The authors emphasize the importance of epistemological analysis related to scientific and technical production. They focus on the theoretical and methodological contributions of the principal Latin American groups in the field of epidemiology, stating their main accomplishments, issues and potentials. When reviewing those conceptual and practical innovations, the authors analyze the effects of broader historical conditions on scientific work. To them, Latin American contemporary innovative epidemiological research and production have developed clearly differentiated principles, methods and technical projections which have led to a movement of critical or 'social' epidemiology. The functionalist approach of conventional epidemiology, characterized by an empiricist viewpoint, is being overcome by a more rigorous and analytical approach. This new epidemiological approach, in which the authors as members of CEAS (Health Research and Advisory Center) are working, has selectively incorporated some of the technical instruments of conventional epidemiology, subordinating them to a different theoretical and logical paradigm. The new framework of this group explains the need to consider the people's objective situation and necessities, when constructing scientific interpretations and planning technical action. In order to accomplish this goal, epidemiological reasoning has to reflect the unity of external epidemiological facts and associations, the so-called phenomenological aspect of health, with the underlying determinants and conditioning processes or internal relations, which are the essence of the health-disease production and distribution process. Epidemiological analysis is considered not only as a problem of empirical observation but as a process of theoretical construction, in which there is a dynamic fusion of deductive and inductive reasoning. (ABSTRACT TRUNCATED AT 250 WORDS)

  2. An Efficient Inverse Aerodynamic Design Method For Subsonic Flows

    NASA Technical Reports Server (NTRS)

    Milholen, William E., II

    2000-01-01

    Computational Fluid Dynamics based design methods are maturing to the point that they are beginning to be used in the aircraft design process. Many design methods, however, have demonstrated deficiencies in the leading edge region of airfoil sections. The objective of the present research is to develop an efficient inverse design method which is valid in the leading edge region. The new design method is a streamline curvature method, and a new technique is presented for modeling the variation of the streamline curvature normal to the surface. The new design method allows the surface coordinates to move normal to the surface, and has been incorporated into the Constrained Direct Iterative Surface Curvature (CDISC) design method. The accuracy and efficiency of the design method is demonstrated using both two-dimensional and three-dimensional design cases.

  3. INFLUENCE OF EXPOSURE ASSESSMENT METHOD IN AN EPIDEMIOLOGIC STUDY OF TRIHALOMETHANE EXPOSURE AND SPONTANEOUS ABORTION

    EPA Science Inventory

    Trihalomethanes are common contaminants of chlorinated drinking water. Studies of their health effects have been hampered by exposure misclassification, due in part to limitations inherent in using utility sampling records. We used two exposure assessment methods, one based on ut...

  4. Snippets From the Past: The Evolution of Wade Hampton Frost's Epidemiology as Viewed From the American Journal of Hygiene/Epidemiology

    PubMed Central

    Morabia, Alfredo

    2013-01-01

    Wade Hampton Frost, who was a Professor of Epidemiology at Johns Hopkins University from 1919 to 1938, spurred the development of epidemiologic methods. His 6 publications in the American Journal of Hygiene, which later became the American Journal of Epidemiology, comprise a 1928 Cutter lecture on a theory of epidemics, a survey-based study of tonsillectomy and immunity to Corynebacterium diphtheriae (1931), 2 papers from a longitudinal study of the incidence of minor respiratory diseases (1933 and 1935), an attack rate ratio analysis of the decline of diphtheria in Baltimore (1936), and a 1936 lecture on the age, time, and cohort analysis of tuberculosis mortality. These 6 American Journal of Hygiene /American Journal of Epidemiology papers attest that Frost's personal evolution mirrored that of the emerging “early” epidemiology: The scope of epidemiology extended beyond the study of epidemics of acute infectious diseases, and rigorous comparative study designs and their associated quantitative methods came to light. PMID:24022889

  5. Ecogeographic Genetic Epidemiology

    PubMed Central

    Sloan, Chantel D.; Duell, Eric J.; Shi, Xun; Irwin, Rebecca; Andrew, Angeline S.; Williams, Scott M.; Moore, Jason H.

    2009-01-01

    Complex diseases such as cancer and heart disease result from interactions between an individual's genetics and environment, i.e. their human ecology. Rates of complex diseases have consistently demonstrated geographic patterns of incidence, or spatial “clusters” of increased incidence relative to the general population. Likewise, genetic subpopulations and environmental influences are not evenly distributed across space. Merging appropriate methods from genetic epidemiology, ecology and geography will provide a more complete understanding of the spatial interactions between genetics and environment that result in spatial patterning of disease rates. Geographic Information Systems (GIS), which are tools designed specifically for dealing with geographic data and performing spatial analyses to determine their relationship, are key to this kind of data integration. Here the authors introduce a new interdisciplinary paradigm, ecogeographic genetic epidemiology, which uses GIS and spatial statistical analyses to layer genetic subpopulation and environmental data with disease rates and thereby discern the complex gene-environment interactions which result in spatial patterns of incidence. PMID:19025788

  6. A data-driven epidemiological prediction method for dengue outbreaks using local and remote sensing data

    PubMed Central

    2012-01-01

    Background: Dengue is the most common arboviral disease of humans, with more than one third of the world’s population at risk. Accurate prediction of dengue outbreaks may lead to public health interventions that mitigate the effect of the disease. Predicting infectious disease outbreaks is a challenging task; truly predictive methods are still in their infancy. Methods: We describe a novel prediction method utilizing Fuzzy Association Rule Mining to extract relationships between clinical, meteorological, climatic, and socio-political data from Peru. These relationships are in the form of rules. The best set of rules is automatically chosen and forms a classifier. That classifier is then used to predict future dengue incidence as either HIGH (outbreak) or LOW (no outbreak), where these values are defined as being above and below the mean previous dengue incidence plus two standard deviations, respectively. Results: Our automated method built three different fuzzy association rule models. Using the first two weekly models, we predicted dengue incidence three and four weeks in advance, respectively. The third prediction encompassed a four-week period, specifically four to seven weeks from time of prediction. Using previously unused test data for the period 4–7 weeks from time of prediction yielded a positive predictive value of 0.686, a negative predictive value of 0.976, a sensitivity of 0.615, and a specificity of 0.982. Conclusions: We have developed a novel approach for dengue outbreak prediction. The method is general, could be extended for use in any geographical region, and has the potential to be extended to other environmentally influenced infections. The variables used in our method are widely available for most, if not all countries, enhancing the generalizability of our method. PMID:23126401
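The outbreak labelling and the evaluation metrics reported in this abstract are straightforward to compute. A minimal sketch, with hypothetical incidence counts and confusion-matrix cells (the 0.686/0.976/0.615/0.982 figures above come from the study's own test data, not from these numbers):

```python
# Outbreak labelling (HIGH if incidence exceeds mean of previous incidence
# plus two standard deviations) and the four reported classifier metrics.
# All numbers below are hypothetical illustrations.

from statistics import mean, stdev

def label_outbreak(history, value):
    """HIGH if value exceeds mean(history) + 2 * SD(history), else LOW."""
    return "HIGH" if value > mean(history) + 2 * stdev(history) else "LOW"

def metrics(tp, fp, tn, fn):
    return {
        "ppv": tp / (tp + fp),            # positive predictive value
        "npv": tn / (tn + fn),            # negative predictive value
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

history = [12, 15, 11, 14, 13, 16, 12, 15]   # weekly case counts
print(label_outbreak(history, 30))            # well above mean + 2 SD
print(metrics(tp=24, fp=11, tn=160, fn=15))
```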

  7. Design optimization method for Francis turbine

    NASA Astrophysics Data System (ADS)

    Kawajiri, H.; Enomoto, Y.; Kurosawa, S.

    2014-03-01

    This paper presents a design optimization system coupled with CFD. The optimization algorithm employs particle swarm optimization (PSO). Blade shape design is carried out with a NURBS curve defined by a series of control points. The system was applied to designing the stationary vanes and the runner of a higher-specific-speed Francis turbine. As the first step, single-objective optimization was performed on the stay vane profile; the second step was multi-objective optimization of the runner over a wide operating range. As a result, it was confirmed that the design system is useful for the development of hydro turbines.
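Particle swarm optimization, the search algorithm this abstract relies on, is simple enough to sketch. The version below minimizes a 2-D test function instead of a CFD objective; all constants (inertia weight, acceleration coefficients, swarm size) are illustrative defaults, not values from the paper:

```python
# Minimal particle swarm optimization (PSO): particles move under inertia
# plus attraction to their personal best and the swarm's global best.

import random

def pso(f, dim=2, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # personal bests
    pbest_val = [f(p) for p in pos]
    gbest_val = min(pbest_val)                   # global best
    gbest = pbest[pbest_val.index(gbest_val)][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

def sphere(p):
    return sum(x * x for x in p)   # minimum 0 at the origin

best, best_val = pso(sphere)
print(best_val)                    # converges very close to 0
```

In the paper's setting, `f` would evaluate a CFD run on a blade shape decoded from NURBS control points, which is why each objective evaluation is expensive and the swarm size and iteration budget matter.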

  8. Epidemiology of varicocele

    PubMed Central

    Alsaikhan, Bader; Alrabeeah, Khalid; Delouya, Guila; Zini, Armand

    2016-01-01

    Varicocele is a common problem in reproductive medicine practice. A varicocele is identified in 15% of healthy men and up to 35% of men with primary infertility. The exact pathophysiology of varicoceles is not very well understood, especially regarding its effect on male infertility. We have conducted a systematic review of studies evaluating the epidemiology of varicocele in the general population and in men presenting with infertility. In this article, we have identified some of the factors that can influence the epidemiological aspects of varicoceles. We also recognize that varicocele epidemiology remains incompletely understood, and there is a need for well-designed, large-scale studies to fully define the epidemiological aspects of this condition. PMID:26763551

  9. Investigation of major cattle production constraints in Kembata Tambaro zone of Southern Ethiopia using participatory epidemiology methods.

    PubMed

    Ayele, Birhanu; Tigre, Worku; Deresa, Benti

    2016-01-01

    Ethiopia has enormous livestock resources from which rural households derive their livelihoods. A cross-sectional study based on participatory appraisal methods was conducted in Kembata Tambaro zone to assess major constraints to livestock production and major diseases of cattle and their treatment options. Four districts were selected purposively for this study, and 18 peasant associations were randomly sampled from the selected districts. Focus group discussion, semistructured interviews, simple ranking and scoring, proportional piling, pairwise ranking, and matrix scoring were the participatory epidemiological tools used in the study. Feed and free grazing land shortages and diseases were found to be the major constraints to cattle production in the area. Mastitis was ranked as the most serious disease of cattle. Modern veterinary treatments are used alongside traditional herbal remedies. Matrix scoring showed strong agreement between focus groups in identifying the major diseases using their indicators (clinical signs). Hence, it was concluded that indigenous knowledge complemented with participatory methods and approaches allow community and field researchers to jointly study specific livestock problems and help identify appropriate solutions. PMID:26477032

  10. Novel Microbiological and Spatial Statistical Methods to Improve Strength of Epidemiological Evidence in a Community-Wide Waterborne Outbreak

    PubMed Central

    Jalava, Katri; Rintala, Hanna; Ollgren, Jukka; Maunula, Leena; Gomez-Alvarez, Vicente; Revez, Joana; Palander, Marja; Antikainen, Jenni; Kauppinen, Ari; Räsänen, Pia; Siponen, Sallamaari; Nyholm, Outi; Kyyhkynen, Aino; Hakkarainen, Sirpa; Merentie, Juhani; Pärnänen, Martti; Loginov, Raisa; Ryu, Hodon; Kuusi, Markku; Siitonen, Anja; Miettinen, Ilkka; Santo Domingo, Jorge W.; Hänninen, Marja-Liisa; Pitkänen, Tarja

    2014-01-01

    Failures in the drinking water distribution system cause gastrointestinal outbreaks with multiple pathogens. A water distribution pipe breakage caused a community-wide waterborne outbreak in Vuorela, Finland, in July 2012. We investigated this outbreak with advanced epidemiological and microbiological methods. A total of 473/2931 inhabitants (16%) responded to a web-based questionnaire. Water and patient samples were subjected to analysis of multiple microbial targets, molecular typing and microbial community analysis. Spatial analysis of the water distribution network was done and a spatial logistic regression model was applied. The course of the illness was mild. Drinking untreated tap water from the defined outbreak area was significantly associated with illness (RR 5.6, 95% CI 1.9–16.4), with risk increasing in a dose-response manner. The closer a person lived to the water distribution breakage point, the higher the risk of becoming ill. Sapovirus, enterovirus, single Campylobacter jejuni and EHEC O157:H7 findings, as well as virulence genes for the EPEC, EAEC and EHEC pathogroups, were detected by molecular or culture methods in the faecal samples of the patients. EPEC, EAEC and EHEC virulence genes and faecal indicator bacteria were also detected in water samples. Microbial community sequencing of contaminated tap water revealed an abundance of Arcobacter species. The polyphasic approach improved understanding of the source of the infections and helped define the extent and magnitude of this outbreak. PMID:25147923
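    A relative risk of the kind quoted above (RR 5.6) comes from comparing attack rates between exposed and unexposed groups in a 2×2 table; a minimal sketch with a Wald confidence interval (the counts below are invented for illustration and chosen only so the point estimate matches, not the study's data or its reported CI):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR and Wald 95% CI for a 2x2 table:
    a = ill & exposed, b = well & exposed,
    c = ill & unexposed, d = well & unexposed."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed
    # standard error of log(RR)
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

rr, lo, hi = relative_risk(56, 144, 10, 190)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 5.6 2.94 10.66
```

    The interval is computed on the log scale because log(RR) is approximately normal in large samples; the study's narrower published interval reflects its actual cell counts and model.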

  11. Statistical methods in epidemiology: Karl Pearson, Ronald Ross, Major Greenwood and Austin Bradford Hill, 1900-1945.

    PubMed

    Hardy, Anne; Magnello, M Eileen

    2002-01-01

    The tradition of epidemiological study through observation and the use of vital statistics dates back to the 18th century in Britain. At the close of the 19th century, however, a new and more sophisticated statistical approach emerged, from a base in the discipline of mathematics, which was eventually to transform the practice of epidemiology. This paper traces the evolution of that new analytical approach within English epidemiology through the work of four key contributors to its inception and establishment within the wider discipline. PMID:12134737

  12. Method to Select Metropolitan Areas of Epidemiologic Interest for Enhanced Air Quality Monitoring

    EPA Science Inventory

    The U.S. Environmental Protection Agency’s current Speciation Trends Network (STN) covers most major U.S. metropolitan areas and a wide range of particulate matter (PM) constituents and gaseous co-pollutants. However, using filter-based methods, most PM constituents are measured ...

  13. Alternative methods for the design of jet engine control systems

    NASA Technical Reports Server (NTRS)

    Sain, M. K.; Leake, R. J.; Basso, R.; Gejji, R.; Maloney, A.; Seshadri, V.

    1976-01-01

    Various alternatives to linear quadratic design methods for jet engine control systems are discussed. The main alternatives are classified into two broad categories: nonlinear global mathematical programming methods and linear local multivariable frequency domain methods. Specific studies within these categories include model reduction, the eigenvalue locus method, the inverse Nyquist method, polynomial design, dynamic programming, and conjugate gradient approaches.

  14. Epidemiology: Then and Now.

    PubMed

    Kuller, Lewis H

    2016-03-01

    Twenty-five years ago, on the 75th anniversary of the Johns Hopkins Bloomberg School of Public Health, I noted that epidemiologic research was moving away from the traditional approaches used to investigate "epidemics" and their close relationship with preventive medicine. Twenty-five years later, the role of epidemiology as an important contribution to human population research, preventive medicine, and public health is under substantial pressure because of the emphasis on "big data," phenomenology, and personalized medical therapies. Epidemiology is the study of epidemics. The primary role of epidemiology is to identify the epidemics and parameters of interest of host, agent, and environment and to generate and test hypotheses in search of causal pathways. Almost all diseases have a specific distribution in relation to time, place, and person and specific "causes" with high effect sizes. Epidemiology then uses such information to develop interventions and test (through clinical trials and natural experiments) their efficacy and effectiveness. Epidemiology is dependent on new technologies to evaluate improved measurements of host (genomics), epigenetics, identification of agents (metabolomics, proteomics), new technology to evaluate both physical and social environment, and modern methods of data collection. Epidemiology does poorly in studying anything other than epidemics and collections of numerators and denominators without specific hypotheses even with improved statistical methodologies. PMID:26493266

  15. Demystifying Mixed Methods Research Design: A Review of the Literature

    ERIC Educational Resources Information Center

    Caruth, Gail D.

    2013-01-01

    Mixed methods research evolved in response to the observed limitations of both quantitative and qualitative designs and is a more complex method. The purpose of this paper was to examine mixed methods research in an attempt to demystify the design thereby allowing those less familiar with its design an opportunity to utilize it in future research.…

  16. Genetic epidemiology of bilateral breast cancer: a linkage analysis using the affected-pedigree-member method.

    PubMed

    Haile, R W; Goldstein, A M; Weeks, D E; Sparkes, R S; Paganini-Hill, A

    1990-01-01

    We used the affected-pedigree-member (APM) method to conduct linkage analyses on 19 pedigrees in which the probands had premenopausal bilateral breast cancer. This method analyzes all affected pairs of relatives, as opposed to siblings only, and incorporates into the analyses information on the frequency of marker alleles. Fourteen codominant marker systems were evaluated in two separate analyses. In the first, only premenopausal cases of breast cancer were coded as affected because we assumed that postmenopausal cases were due to a different etiology. In the second analysis, all cases of breast cancer were coded as affected, irrespective of menopausal status. In the premenopausal-cases-only analysis, we observed evidence suggestive of nonindependent segregation for C3 and ESD. In the all-cases analysis, we observed much weaker evidence for C3 and ESD and noted a suggestion of nonindependent segregation for AMY2 and PGM1. PMID:2328913

  17. Evidence-based planning and costing palliative care services for children: novel multi-method epidemiological and economic exemplar

    PubMed Central

    2013-01-01

    Background Children’s palliative care is a relatively new clinical specialty. Its nature is multi-dimensional and its delivery necessarily multi-professional. Numerous diverse public and not-for-profit organisations typically provide services and support. Because services are not centrally coordinated, they are provided in a manner that is inconsistent and incoherent. Since the first children’s hospice opened in 1982, the epidemiology of life-limiting conditions has changed with more children living longer, and many requiring transfer to adult services. Very little is known about the number of children living within any given geographical locality, costs of care, or experiences of children with ongoing palliative care needs and their families. We integrated evidence, and undertook and used novel methodological epidemiological work to develop the first evidence-based and costed commissioning exemplar. Methods Multi-method epidemiological and economic exemplar from a health and not-for-profit organisation perspective, to estimate numbers of children under 19 years with life-limiting conditions, cost current services, determine child/parent care preferences, and cost choice of end-of-life care at home. Results The exemplar locality (North Wales) had important gaps in service provision and the clinical network. The estimated annual total cost of current children’s palliative care was about £5.5 million; average annual care cost per child was £22,771 using 2007 prevalence estimates and £2,437–£11,045 using new 2012/13 population-based prevalence estimates. Using population-based prevalence, we estimate 2271 children with a life-limiting condition in the general exemplar population and around 501 children per year with ongoing palliative care needs in contact with hospital services. Around 24 children with a wide range of life-limiting conditions require end-of-life care per year. Choice of end-of-life care at home was requested, which is not currently

  18. Epidemiology of Candida infection. II. Application of biochemical methods for typing of Candida albicans strains.

    PubMed

    Budak, A

    1990-01-01

    Biochemical profiles of 350 C. albicans isolates from five towns in Poland and from Freiburg in Germany were determined on the basis of the nine biochemical tests of the Odds and Abbott method, the API 20 C AUX system and, additionally, a resistogram. Analysis of the strains according to Odds and Abbott's system showed that the investigated strains could be typed into 9 profile codes of common biochemical patterns. There were some differences among the profiles according to their geographical origin and the anatomical sources of isolation. On the basis of the ability of C. albicans strains to assimilate carbon sources, the 350 isolates were categorised into 13 separate auxotrophic profiles, with the major one, 2,576,174, accounting for 81% of the total. The majority of the investigated isolates were susceptible to antifungal agents (83%). The disproportionate distribution of auxotrophic profiles limited the use of the resistogram method and API 20 C AUX as systems for typing C. albicans strains. On the other hand, the method of Odds and Abbott provides valuable criteria for typing C. albicans. PMID:2130802
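    Profile codes of this kind are typically built by grouping test results in triplets and weighting positive results 1, 2 and 4 within each triplet, so that each triplet yields one digit. A generic sketch of that coding style (the actual test order and groupings of the Odds and Abbott scheme are not reproduced here; the results below are hypothetical):

```python
def profile_code(results):
    """API-style profile code: tests are grouped in triplets,
    positives weighted 1, 2, 4 within each triplet, and the
    weights summed per triplet to give one digit of the code."""
    digits = []
    for i in range(0, len(results), 3):
        triplet = results[i:i + 3]
        digits.append(sum(w for w, positive in zip((1, 2, 4), triplet)
                          if positive))
    return "".join(str(d) for d in digits)

# nine hypothetical test results -> a three-digit profile code
print(profile_code([True, False, True,
                    True, True, False,
                    False, False, True]))  # → 534
```

    Because each digit encodes three independent binary results, two isolates share a profile code exactly when they agree on every test.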

  19. A graph-theory method for pattern identification in geographical epidemiology – a preliminary application to deprivation and mortality

    PubMed Central

    Maheswaran, Ravi; Craigs, Cheryl; Read, Simon; Bath, Peter A; Willett, Peter

    2009-01-01

    Background Graph theoretical methods are extensively used in the field of computational chemistry to search datasets of compounds to see if they contain particular molecular sub-structures or patterns. We describe a preliminary application of a graph theoretical method, developed in computational chemistry, to geographical epidemiology in relation to testing a prior hypothesis. We tested the methodology on the hypothesis that if a socioeconomically deprived neighbourhood is situated in a wider deprived area, then that neighbourhood would experience greater adverse effects on mortality compared with a similarly deprived neighbourhood which is situated in a wider area with generally less deprivation. Methods We used the Trent Region Health Authority area for this study, which contained 10,665 census enumeration districts (CED). Graphs are mathematical representations of objects and their relationships and within the context of this study, nodes represented CEDs and edges were determined by whether or not CEDs were neighbours (shared a common boundary). The overall area in this study was represented by one large graph comprising all CEDs in the region, along with their adjacency information. We used mortality data from 1988–1998, CED level population estimates and the Townsend Material Deprivation Index as an indicator of neighbourhood level deprivation. We defined deprived CEDs as those in the top 20% most deprived in the Region. We then set out to classify these deprived CEDs into seven groups defined by increasing deprivation levels in the neighbouring CEDs. 506 (24.2%) of the deprived CEDs had five adjacent CEDs and we limited pattern development and searching to these CEDs. We developed seven query patterns and used the RASCAL (Rapid Similarity Calculator) program to carry out the search for each of the query patterns. This program used a maximum common subgraph isomorphism method which was modified to handle geographical data. Results Of the 506 deprived CEDs
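    The query-pattern idea can be sketched in miniature: nodes are areas with deprivation scores, edges join adjacent areas, and a pattern asks how many of a deprived node's neighbours are themselves deprived. All scores and adjacencies below are invented, and the study's RASCAL program uses a more general maximum common subgraph method rather than this simple neighbour count:

```python
# Toy area graph: deprivation scores per node, adjacency = shared boundary
deprivation = {"A": 9, "B": 8, "C": 2, "D": 7, "E": 1, "F": 8}
adjacency = {
    "A": ["B", "C", "D"],
    "B": ["A", "C"],
    "C": ["A", "B", "D"],
    "D": ["A", "C", "E", "F"],
    "E": ["D"],
    "F": ["D"],
}
DEPRIVED = 7  # threshold score for classing a node as deprived

def match_pattern(node, n_deprived_neighbours):
    """True if `node` is deprived and exactly `n_deprived_neighbours`
    of its neighbours are deprived too (one simple query pattern)."""
    if deprivation[node] < DEPRIVED:
        return False
    count = sum(deprivation[n] >= DEPRIVED for n in adjacency[node])
    return count == n_deprived_neighbours

# classify every deprived node by the deprivation of its surroundings
groups = {node: sum(deprivation[n] >= DEPRIVED for n in adjacency[node])
          for node in adjacency if deprivation[node] >= DEPRIVED}
print(groups)  # → {'A': 2, 'B': 1, 'D': 2, 'F': 1}
```

    In the study the seven query patterns play the role of `match_pattern` here, grading deprived areas by how deprived their surrounding neighbourhood is.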

  20. Prevalence and epidemiologic characteristics of FASD from various research methods with an emphasis on recent in-school studies.

    PubMed

    May, Philip A; Gossage, J Phillip; Kalberg, Wendy O; Robinson, Luther K; Buckley, David; Manning, Melanie; Hoyme, H Eugene

    2009-01-01

    Researching the epidemiology and estimating the prevalence of fetal alcohol syndrome (FAS) and other fetal alcohol spectrum disorders (FASD) for mainstream populations anywhere in the world has presented a challenge to researchers. Three major approaches have been used in the past: surveillance and record review systems, clinic-based studies, and active case ascertainment methods. The literature on each of these methods is reviewed citing the strengths, weaknesses, prevalence results, and other practical considerations for each method. Previous conclusions about the prevalence of FAS and total FASD in the United States (US) population are summarized. Active approaches which provide clinical outreach, recruitment, and diagnostic services in specific populations have been demonstrated to produce the highest prevalence estimates. We then describe and review studies utilizing in-school screening and diagnosis, a special type of active case ascertainment. Selected results from a number of in-school studies in South Africa, Italy, and the US are highlighted. The particular focus of the review is on the nature of the data produced from in-school methods and the specific prevalence rates of FAS and total FASD which have emanated from them. We conclude that FAS and other FASD are more prevalent in school populations, and therefore the general population, than previously estimated. We believe that the prevalence of FAS in typical, mixed-racial, and mixed-socioeconomic populations of the US is at least 2 to 7 per 1,000. Regarding all levels of FASD, we estimate that the current prevalence of FASD in populations of younger school children may be as high as 2-5% in the US and some Western European countries. PMID:19731384

  1. Airfoil design method using the Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Malone, J. B.; Narramore, J. C.; Sankar, L. N.

    1991-01-01

    An airfoil design procedure is described that was incorporated into an existing 2-D Navier-Stokes airfoil analysis method. The resulting design method, an iterative procedure based on a residual-correction algorithm, permits the automated design of airfoil sections with prescribed surface pressure distributions. The inverse design method and the technique used to specify target pressure distributions are described, and several example problems are presented to demonstrate application of the design procedure. The results show that this inverse design method develops useful airfoil configurations with a reasonable expenditure of computer resources.

  2. A Method of Integrated Description of Design Information for Reusability

    NASA Astrophysics Data System (ADS)

    Tsumaya, Akira; Nagae, Masao; Wakamatsu, Hidefumi; Shirase, Keiichi; Arai, Eiji

    Much of product design is executed concurrently these days. For such concurrent design, a method that lets designers share and reuse various kinds of design information is needed. However, complete understanding of design information among designers has been a difficult issue. In this paper, a design process model based on designers' intentions is proposed, together with a method to combine design process information and design object information. We introduce how to describe designers' intentions by providing several databases. The Keyword Database consists of ontological data related to design objects and activities. Designers select suitable keywords from the Keyword Database and explain the reasons and ideas behind their design activities in descriptions built from those keywords. We also developed an integrated design information management system architecture using this method of integrated description with designers' intentions. The system connects information related to the design process with information related to the design object through designers' intentions. Designers can thereby communicate with each other to understand how others make design decisions, and can reuse both design process information and design object information through the database management sub-system.

  3. [Opportunity and challenge on molecular epidemiology].

    PubMed

    Duan, G C; Chen, S Y

    2016-08-10

    Molecular epidemiology, a branch of epidemiology, combines the theories and methods, both in epidemiology and molecular biology. Molecular epidemiology mainly focuses on biological markers, describing the distribution, occurrence, development and prognosis of diseases at the molecular level. The completion of Human Genome Project and rapid development of Precision Medicine and Big Data not only offer the new development opportunities but also bring about a higher demand and new challenge for molecular epidemiology. PMID:27539332

  4. Epidemiology and changed surgical treatment methods for fractures of the distal radius

    PubMed Central

    2013-01-01

    Background and purpose The incidence of fractures of the distal radius may have changed over the last decade, and operative treatment has become more common during that time. We investigated the incidence of fractures of the distal radius and changing trends in surgical treatment during the period 2004–2010. Patients and methods Registry data on 42,583 patients with a fracture of the distal radius from 2004 to 2010 were evaluated regarding diagnosis, age, sex, and surgical treatment. Results The crude incidence rate was 31 per 10,000 person-years, with a bimodal distribution. After the age of 45 years, the incidence rate in women increased rapidly and did not level off until very old age. The incidence rate in postmenopausal women was lower than previously reported. In men, the incidence was low and increased slowly until the age of 80 years, when it reached 31 per 10,000 person-years. The number of surgical procedures increased by more than 40% despite the reduced incidence during the study period. In patients ≥ 18 years of age, the proportion of fractures treated with plating increased from 16% to 70%, while the use of external fixation decreased by about the same amount. Interpretation The incidence rate of distal radius fractures in postmenopausal women appears to have decreased over the last few decades. There has been a shift in surgical treatment from external fixation to open reduction and plating. PMID:23594225
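    A crude incidence rate of this kind is simply cases divided by person-years at risk, scaled to 10,000; a minimal sketch (the population figure below is an assumed round number chosen for illustration, not taken from the registry):

```python
def crude_incidence_per_10k(cases, population, years):
    """Crude incidence rate per 10,000 person-years,
    assuming a stable population over the study period."""
    person_years = population * years
    return cases * 10_000 / person_years

# 42,583 fractures over the 7-year study window in an assumed
# source population of ~1.96 million
rate = crude_incidence_per_10k(42_583, 1_960_000, 7)
print(round(rate))  # → 31
```

    Age- and sex-specific rates follow the same formula applied within each stratum, which is how the bimodal distribution above would be exhibited.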

  5. JASMINE design and method of data reduction

    NASA Astrophysics Data System (ADS)

    Yamada, Yoshiyuki; Gouda, Naoteru; Yano, Taihei; Kobayashi, Yukiyasu; Niwa, Yoshito

    2008-07-01

    Japan Astrometry Satellite Mission for Infrared Exploration (JASMINE) aims to construct a map of the Galactic bulge with 10 μas (micro-arcsecond) accuracy. We use a z-band CCD to avoid dust absorption, and observe an area of about 10 × 20 degrees around the Galactic bulge region. Because the stellar density is very high, the individual FOVs can be combined with high accuracy. With 5 years of observation, we will construct a map accurate to 10 μas. In this poster, I will show the observation strategy, the design of the JASMINE hardware, the reduction scheme, and the error budget. We have also constructed simulation software named the JASMINE Simulator, and show simulation results and the design of the software.

  6. Lithography aware overlay metrology target design method

    NASA Astrophysics Data System (ADS)

    Lee, Myungjun; Smith, Mark D.; Lee, Joonseuk; Jung, Mirim; Lee, Honggoo; Kim, Youngsik; Han, Sangjun; Adel, Michael E.; Lee, Kangsan; Lee, Dohwa; Choi, Dongsub; Liu, Zephyr; Itzkovich, Tal; Levinski, Vladimir; Levy, Ady

    2016-03-01

    We present a metrology target design (MTD) framework based on co-optimizing lithography and metrology performance. The overlay metrology performance is strongly related to the target design and optimizing the target under different process variations in a high NA optical lithography tool and measurement conditions in a metrology tool becomes critical for sub-20nm nodes. The lithography performance can be quantified by device matching and printability metrics, while accuracy and precision metrics are used to quantify the metrology performance. Based on using these metrics, we demonstrate how the optimized target can improve target printability while maintaining the good metrology performance for rotated dipole illumination used for printing a sub-100nm diagonal feature in a memory active layer. The remaining challenges and the existing tradeoff between metrology and lithography performance are explored with the metrology target designer's perspective. The proposed target design framework is completely general and can be used to optimize targets for different lithography conditions. The results from our analysis are both physically sensible and in good agreement with experimental results.

  7. Traditional epidemiology, modern epidemiology, and public health.

    PubMed Central

    Pearce, N

    1996-01-01

    There have been significant developments in epidemiologic methodology during the past century, including changes in basic concepts, methods of data analysis, and methods of exposure measurement. However, the rise of modern epidemiology has been a mixed blessing, and the new paradigm has major shortcomings, both in public health and in scientific terms. The changes in the paradigm have not been neutral but have rather helped change--and have reflected changes in--the way in which epidemiologists think about health and disease. The key issue has been the shift in the level of analysis from the population to the individual. Epidemiology has largely ceased to function as part of a multidisciplinary approach to understanding the causation of disease in populations and has become a set of generic methods for measuring associations of exposure and disease in individuals. This reductionist approach focuses on the individual, blames the victim, and produces interventions that can be harmful. We seem to be using more and more advanced technology to study more and more trivial issues, while the major causes of disease are ignored. Epidemiology must reintegrate itself into public health and must rediscover the population perspective. PMID:8629719

  8. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate, that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or in deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  9. A recommended epidemiological study design for examining the adverse health effects among emergency workers who experienced the TEPCO fukushima daiichi NPP accident in 2011.

    PubMed

    Yasui, Shojiro

    2016-01-01

    Results from medical examinations conducted in 2012 of workers who were engaged in radiation work in 2012 as a result of the 2011 Fukushima Daiichi Nuclear Power Plant (NPP) accident showed that the prevalence of abnormal findings was 4.21%, 3.23 points higher than the 0.98% that was found prior to the accident in the jurisdiction area of the labor inspection office which holds jurisdiction over the NPP. The Ministry of Health, Labour and Welfare (MHLW) concluded that the 2010 and 2012 data cannot be easily compared because 70% of the enterprises within the jurisdiction of the office that reported the 2012 results were different from those that did so in 2010. In addition, although the radiation workers' estimated average dose weighted by number of workers was 3.66 times higher than decontamination workers' dose, the prevalence among radiation workers was only 1.14 times higher than that among decontamination workers. Based on the results of the medical examinations, however, the MHLW decided to implement an epidemiological study on the health effects of radiation exposure on all emergency workers. This article explains key issues of the basic design of the study recommended by the expert meeting established in the MHLW and also identifies challenges that could not be resolved and thus required further consideration by the study researchers. The major issues included: (a) study methods and target group; (b) evaluation of cumulative doses; (c) health effects (end points); (d) control of confounding factors; and (e) study implementation framework. Identified key challenges that required further deliberation were: (a) preventing arbitrary partisan analysis; (b) ensuring a high participation rate; (c) inquiry about the medical radiation doses; and (d) the preparedness of new analytical technology. The study team formulated and implemented the pilot study in 2014 and started the full-scale study in April 2015 with funding from a research grant from the MHLW. PMID

  10. A comparison of digital flight control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

    Many variations in design methods for aircraft digital flight control have been proposed in the literature. In general, the methods fall into two categories: those where the design is done in the continuous domain (or s-plane), and those where the design is done in the discrete domain (or z-plane). This paper evaluates several variations of each category and compares them for various flight control modes of the Langley TCV Boeing 737 aircraft. Design method fidelity is evaluated by examining closed loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps except the 'uncompensated s-plane design' method which was acceptable above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations for changing sample rates.
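    The s-plane-design-then-discretize step that the compared methods differ on can be illustrated with the bilinear (Tustin) transform applied to a first-order lag H(s) = a/(s + a). This is a generic sketch rather than any of the paper's actual controllers, and the pole and sample period below are arbitrary:

```python
def tustin_first_order_lag(a, T):
    """Discretize H(s) = a / (s + a) via the bilinear (Tustin)
    substitution s = (2/T) * (1 - z^-1) / (1 + z^-1), giving the
    difference equation y[n] = b*(x[n] + x[n-1]) + c*y[n-1]."""
    b = a * T / (2 + a * T)
    c = (2 - a * T) / (2 + a * T)
    return b, c

def step_response(a, T, n_steps):
    """Simulate the discretized lag's response to a unit step."""
    b, c = tustin_first_order_lag(a, T)
    y, x_prev, out = 0.0, 0.0, []
    for _ in range(n_steps):
        x = 1.0  # unit step input
        y = b * (x + x_prev) + c * y
        x_prev = x
        out.append(y)
    return out

resp = step_response(a=5.0, T=0.02, n_steps=200)
print(round(resp[-1], 3))  # settles to the DC gain of 1
```

    Raising the sample period T moves the discrete pole c away from its continuous counterpart, which is the fidelity-versus-sample-rate trade-off the study quantifies.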

  11. Soft Computing Methods in Design of Superalloys

    NASA Technical Reports Server (NTRS)

    Cios, K. J.; Berke, L.; Vary, A.; Sharma, S.

    1996-01-01

    Soft computing techniques of neural networks and genetic algorithms are used in the design of superalloys. The cyclic oxidation attack parameter K(sub a), generated from tests at NASA Lewis Research Center, is modelled as a function of the superalloy chemistry and test temperature using a neural network. This model is then used in conjunction with a genetic algorithm to obtain an optimized superalloy composition resulting in low K(sub a) values.
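    The model-then-optimize loop described above can be sketched with a simple elitist genetic algorithm minimizing a stand-in objective. The quadratic surrogate below replaces the trained neural network, and the "optimum" composition is invented; only the GA structure (selection, crossover, mutation, elitism) is the point:

```python
import random

random.seed(1)

def surrogate_ka(comp):
    """Stand-in for the neural-network model of the cyclic oxidation
    attack parameter Ka; penalizes distance from a hypothetical
    optimum composition (fractions of four alloy constituents)."""
    target = [0.6, 0.2, 0.1, 0.1]
    return sum((c - t) ** 2 for c, t in zip(comp, target))

def normalize(comp):
    s = sum(comp)
    return [c / s for c in comp]

def random_comp():
    return normalize([random.random() for _ in range(4)])

def mutate(comp, rate=0.1):
    return normalize([max(1e-6, c + random.uniform(-rate, rate))
                      for c in comp])

def crossover(p1, p2):
    return normalize([(a + b) / 2 for a, b in zip(p1, p2)])

pop = [random_comp() for _ in range(30)]
for _ in range(100):
    pop.sort(key=surrogate_ka)
    parents = pop[:10]                      # selection (elitism)
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(20)]
    pop = parents + children

best = min(pop, key=surrogate_ka)
print(round(surrogate_ka(best), 4))
```

    Substituting a trained network's prediction for `surrogate_ka` reproduces the study's arrangement: the GA never sees the chemistry directly, only the model's estimate of Ka.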

  12. Statistical Reasoning and Methods in Epidemiology to Promote Individualized Health: In Celebration of the 100th Anniversary of the Johns Hopkins Bloomberg School of Public Health.

    PubMed

    Ogburn, Elizabeth L; Zeger, Scott L

    2016-03-01

    Epidemiology is concerned with determining the distribution and causes of disease. Throughout its history, epidemiology has drawn upon statistical ideas and methods to achieve its aims. Because of the exponential growth in our capacity to measure and analyze data on the underlying processes that define each person's state of health, there is an emerging opportunity for population-based epidemiologic studies to influence health decisions made by individuals in ways that take into account the individuals' characteristics, circumstances, and preferences. We refer to this endeavor as "individualized health." The present article comprises 2 sections. In the first, we describe how graphical, longitudinal, and hierarchical models can inform the project of individualized health. We propose a simple graphical model for informing individual health decisions using population-based data. In the second, we review selected topics in causal inference that we believe to be particularly useful for individualized health. Epidemiology and biostatistics were 2 of the 4 founding departments in the world's first graduate school of public health at Johns Hopkins University, the centennial of which we honor. This survey of a small part of the literature is intended to demonstrate that the 2 fields remain just as inextricably linked today as they were 100 years ago. PMID:26867776

  13. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods different approaches for higher level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  14. The Triton: Design concepts and methods

    NASA Technical Reports Server (NTRS)

    Meholic, Greg; Singer, Michael; Vanryn, Percy; Brown, Rhonda; Tella, Gustavo; Harvey, Bob

    1992-01-01

    During the design of the C & P Aerospace Triton, a few problems were encountered that necessitated changes in the configuration. After the initial concept phase, the aspect ratio was increased from 7 to 7.6 to produce a greater lift to drag ratio (L/D = 13) which satisfied the horsepower requirements (118 hp using the Lycoming O-235 engine). The initial concept had a wing planform area of 134 sq. ft. Detailed wing sizing analysis enlarged the planform area to 150 sq. ft., without changing its layout or location. The most significant changes, however, were made just prior to inboard profile design. The fuselage external diameter was reduced from 54 to 50 inches to reduce drag to meet the desired cruise speed of 120 knots. Also, the nose was extended 6 inches to accommodate landing gear placement. Without the extension, the nosewheel received an unacceptable percentage (25 percent) of the landing weight. The final change in the configuration was made in accordance with the stability and control analysis. In order to reduce the static margin from 20 to 13 percent, the horizontal tail area was reduced from 32.02 to 25.0 sq. ft. The Triton meets all the specifications set forth in the design criteria. If time permitted another iteration of the calculations, two significant changes would be made. The vertical stabilizer area would be reduced to decrease the aircraft lateral stability slope since the current value was too high in relation to the directional stability slope. Also, the aileron size would be decreased to reduce the roll rate below the current 106 deg/second. Doing so would allow greater flap area (increasing CL(sub max)) and thus reduce the overall wing area. C & P would also recalculate the horsepower and drag values to further validate the 120 knot cruising speed.

  15. The emergence of translational epidemiology: from scientific discovery to population health impact.

    PubMed

    Khoury, Muin J; Gwinn, Marta; Ioannidis, John P A

    2010-09-01

    Recent emphasis on translational research (TR) is highlighting the role of epidemiology in translating scientific discoveries into population health impact. The authors present applications of epidemiology in TR through 4 phases designated T1-T4, illustrated by examples from human genomics. In T1, epidemiology explores the role of a basic scientific discovery (e.g., a disease risk factor or biomarker) in developing a "candidate application" for use in practice (e.g., a test used to guide interventions). In T2, epidemiology can help to evaluate the efficacy of a candidate application by using observational studies and randomized controlled trials. In T3, epidemiology can help to assess facilitators and barriers for uptake and implementation of candidate applications in practice. In T4, epidemiology can help to assess the impact of using candidate applications on population health outcomes. Epidemiology also has a leading role in knowledge synthesis, especially using quantitative methods (e.g., meta-analysis). To explore the emergence of TR in epidemiology, the authors compared articles published in selected issues of the Journal in 1999 and 2009. The proportion of articles identified as translational doubled from 16% (11/69) in 1999 to 33% (22/66) in 2009 (P = 0.02). Epidemiology is increasingly recognized as an important component of TR. By quantifying and integrating knowledge across disciplines, epidemiology provides crucial methods and tools for TR. PMID:20688899
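    The abstract reports that the proportion of translational articles rose from 16% (11/69) to 33% (22/66) with P = 0.02. The exact test the authors used is not stated; a pooled two-proportion z-test (a standard choice for this comparison) reproduces the reported figure:

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal tail: p = erfc(|z| / sqrt(2))
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# 1999: 11 of 69 articles translational; 2009: 22 of 66
z, p = two_proportion_z_test(11, 69, 22, 66)
print(f"z = {z:.2f}, p = {p:.3f}")  # p comes out near the reported 0.02
```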

  16. A survey on methods of design features identification

    NASA Astrophysics Data System (ADS)

    Grabowik, C.; Kalinowski, K.; Paprocka, I.; Kempa, W.

    2015-11-01

    It is widely accepted that design features are one of the most attractive means of integrating most fields of engineering activity, such as design modelling, process planning and production scheduling. One of the most important tasks realized in the integration of design and planning functions is design translation, meaning the mapping of design data into data relevant to process planning, that is, manufacturing data. Translation of a design's geometrical shape can be realized with one of the following strategies: (i) designing with a previously prepared library of design features, also known as the DBF (design-by-feature) method, (ii) interactive feature recognition (IFR), or (iii) automatic feature recognition (AFR). In the DBF method the design's geometrical shape is created from design features. There are two basic approaches to design modelling in the DBF method: the classic approach, in which a part is modelled from beginning to end with design features previously stored in a design features database, and the hybrid approach, in which the part is partially created with standard predefined CAD system tools and the rest with suitable design features. Automatic feature recognition consists in an autonomous search of a product model, represented with a specific design representation method, for those model features which might potentially be recognized as design features, manufacturing features, etc. This approach requires a search algorithm that can carry out the whole recognition process without user supervision. Currently there are many AFR methods. These methods most often require the product model to be represented with a B-Rep representation, rarely with CSG, and very rarely with a wireframe. In the IFR method potential features are recognized by a user. This process is most often realized by a user who points out those surfaces which seem to belong to a

  17. Design Methods and Optimization for Morphing Aircraft

    NASA Technical Reports Server (NTRS)

    Crossley, William A.

    2005-01-01

    This report provides a summary of accomplishments made during this research effort. The major accomplishments are in three areas. The first is the use of a multiobjective optimization strategy to help identify potential morphing features; it uses an existing aircraft sizing code to predict the weight, size and performance of several fixed-geometry aircraft that are Pareto-optimal based upon two competing aircraft performance objectives. The second area has been titled morphing as an independent variable and formulates the sizing of a morphing aircraft as an optimization problem in which the amounts of geometric morphing for various aircraft parameters are included as design variables. This second effort consumed most of the overall effort on the project. The third area involved a more detailed sizing study of a commercial transport aircraft that would incorporate a morphing wing to possibly enable transatlantic point-to-point passenger service.
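    The Pareto-optimality idea behind the first accomplishment can be sketched with a minimal non-dominated filter. The (weight, fuel burn) pairs below are hypothetical, not from the sizing code used in the study:

```python
def pareto_front(points):
    """Return the non-dominated subset, minimizing both objectives.

    A point is dominated if some other point is no worse in both
    objectives and differs from it (hence strictly better in one).
    """
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (weight, fuel burn) pairs for fixed-geometry designs
designs = [(10.0, 5.0), (8.0, 7.0), (9.0, 6.0), (12.0, 4.0), (11.0, 6.5)]
print(pareto_front(designs))  # (11.0, 6.5) is dominated by (10.0, 5.0)
```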

  18. A flexible layout design method for passive micromixers.

    PubMed

    Deng, Yongbo; Liu, Zhenyu; Zhang, Ping; Liu, Yongshun; Gao, Qingyong; Wu, Yihui

    2012-10-01

    This paper discusses a flexible layout design method for passive micromixers based on the topology optimization of fluidic flows. Unlike the trial-and-error method, this method obtains the detailed layout of a passive micromixer according to the desired mixing performance by solving a topology optimization problem. The dependence on the experience of the designer is therefore weakened when this method is used to design a passive micromixer with acceptable mixing performance. Several design disciplines for passive micromixers are considered to demonstrate the flexibility of the layout design method. These design disciplines include the approximation of the real 3D micromixer, manufacturing feasibility, spatial periodic design, and the effects of the Péclet number and Reynolds number on the designs obtained by this layout design method. The capability of this design method is validated by several comparisons between the obtained layouts and the optimized designs in recently published literature, where the value of the mixing measurement is improved by up to 40.4% for one cycle of the micromixer. PMID:22736305

  19. Comparison of Traditional Design Nonlinear Programming Optimization and Stochastic Methods for Structural Design

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2010-01-01

    Structural designs generated by the traditional method, the optimization method and the stochastic design concept are compared. In the traditional method, the constraints are manipulated to obtain the design and the weight is back-calculated. In design optimization, the weight of a structure becomes the merit function, constraints are imposed on failure modes, and an optimization algorithm is used to generate the solution. The stochastic design concept accounts for uncertainties in loads, material properties and other parameters, and the solution is obtained by solving a design optimization problem for a specified reliability. Acceptable solutions were produced by all three methods. The variation in the weight calculated by the methods was modest. Some variation was noticed in the designs calculated by the methods; the variation may be attributed to structural indeterminacy. It is prudent to develop a design by all three methods prior to fabrication. The traditional design method can be improved when simplified sensitivities of the behavior constraints are used. Such sensitivities can reduce design calculations and may have the potential to unify the traditional and optimization methods. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to the mean-valued design. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure. Weight can be reduced to a small value for the most failure-prone design. Probabilistic modeling of loads and material properties remained a challenge.
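    The weight-versus-reliability behavior described above can be illustrated with a toy tension member sized against a normally distributed load (all numbers are illustrative, not from the report; only load uncertainty is modeled, so this is a sketch of the trend, not the study's full stochastic design procedure):

```python
from statistics import NormalDist

def required_weight(reliability, load_mean=100.0, load_sd=15.0,
                    strength=250.0, density_length=1.0):
    """Size the cross-sectional area so that P(stress <= strength)
    equals the target reliability, with stress = load / area and
    load ~ Normal(load_mean, load_sd); weight is proportional to area."""
    load_quantile = NormalDist(load_mean, load_sd).inv_cdf(reliability)
    area = load_quantile / strength
    return density_length * area

# Weight rises slowly near the mean-valued design and steeply as the
# target failure rate approaches zero, tracing the curve's upper arm.
for r in (0.5, 0.9, 0.99, 0.999999):
    print(f"reliability {r}: weight {required_weight(r):.4f}")
```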

  20. Analytical techniques for instrument design - matrix methods

    SciTech Connect

    Robinson, R.A.

    1997-09-01

    We take the traditional Cooper-Nathans approach, as has been applied for many years for steady-state triple-axis spectrometers, and consider its generalisation to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, we discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix: diagonalisation (Moller-Nielsen method), coordinate changes (e.g. from (Δk_I, Δk_F) to (ΔE, ΔQ) plus 2 dummy variables), integration of one or more variables (e.g. over such dummy variables), integration subject to linear constraints (e.g. Bragg's Law for analysers), inversion to give the variance-covariance matrix, and so on. We show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the Gaussian approximation. We will argue that a generalised program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. We will also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question.
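    The matrix toolbox described above rests on standard properties of Gaussian quadratic forms. A small NumPy sketch (a 4x4 toy stand-in for the 6-dimensional Cooper-Nathans matrix, not the actual spectrometer calculation) shows three of the operations: inversion to a covariance, marginalization (integrating variables out), and diagonalization:

```python
import numpy as np

rng = np.random.default_rng(0)

# A symmetric positive-definite "resolution matrix" M, as appears in
# a Gaussian quadratic form exp(-x^T M x / 2).
A = rng.normal(size=(4, 4))
M = A @ A.T + 4 * np.eye(4)

# Inversion: the variance-covariance matrix is M^{-1}.
cov = np.linalg.inv(M)

# Integration over variables (e.g. dummy variables): marginalizing a
# Gaussian keeps the corresponding block of the covariance, so the
# resolution matrix of the kept variables is that block's inverse.
keep = [0, 1]
M_marginal = np.linalg.inv(cov[np.ix_(keep, keep)])

# Coordinate change y = T x transforms the quadratic form by
# congruence: M -> (T^{-1})^T M T^{-1}.
T = rng.normal(size=(4, 4)) + 4 * np.eye(4)
M_new = np.linalg.inv(T).T @ M @ np.linalg.inv(T)

# Diagonalization: an eigendecomposition turns the Gaussian into
# independent one-dimensional Gaussians along the eigenvectors.
eigvals, eigvecs = np.linalg.eigh(M)
print(M_marginal)
```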

  1. Design Method for Single-Blade Centrifugal Pump Impeller

    NASA Astrophysics Data System (ADS)

    Nishi, Yasuyuki; Fujiwara, Ryota; Fukutomi, Junichiro

    Sewage pumps are required to combine high efficiency with the ability to pass foreign bodies. The impellers used in these applications therefore require a large passed particle size (the minimum particle size that can pass through the pump). However, because the conventional impeller design method results in a small impeller exit width, it is difficult to apply it to the design of the single-blade centrifugal pump impellers used in sewage pumps. This paper proposes a design method for single-blade centrifugal pump impellers. The head curve of an impeller designed by the proposed method satisfied the design specifications, and pump efficiency exceeded 62%, higher than that of a conventional single-blade centrifugal pump impeller. Comparing design values with CFD analysis values, the suction velocity ratio of the design parameters agreed well, but the relative velocity ratio did not, owing to the influence of backflow at the impeller entrance.

  2. Methods for very high temperature design

    SciTech Connect

    Blass, J.J.; Corum, J.M.; Chang, S.J.

    1989-01-01

    Design rules and procedures for high-temperature, gas-cooled reactor components are being formulated as an ASME Boiler and Pressure Vessel Code Case. A draft of the Case, patterned after Code Case N-47, and limited to Inconel 617 and temperatures of 982°C (1800°F) or less, will be completed in 1989 for consideration by relevant Code committees. The purpose of this paper is to provide a synopsis of the significant differences between the draft Case and N-47, and to provide more complete accounts of the development of allowable stress and stress rupture values and the development of isochronous stress vs strain curves, in both of which Oak Ridge National Laboratory (ORNL) played a principal role. The isochronous curves, which represent average behavior for many heats of Inconel 617, were based in part on a unified constitutive model developed at ORNL. Details are also provided of this model of inelastic deformation behavior, which does not distinguish between rate-dependent plasticity and time-dependent creep, along with comparisons between calculated and observed results of tests conducted on a typical heat of Inconel 617 by the General Electric Company for the Department of Energy. 4 refs., 15 figs., 1 tab.

  3. Triparental Families: A New Genetic-Epidemiological Design Applied to Drug Abuse, Alcohol Use Disorders, and Criminal Behavior in a Swedish National Sample

    PubMed Central

    Kendler, Kenneth S.; Ohlsson, Henrik; Sundquist, Jan; Sundquist, Kristina

    2015-01-01

    Objective The authors sought to clarify the sources of parent-offspring resemblance for drug abuse, alcohol use disorders, and criminal behavior, using a novel genetic-epidemiological design. Method Using national registries, the authors identified rates of drug abuse, alcohol use disorders, and criminal behavior in 41,360 Swedish individuals born between 1960 and 1990 and raised in triparental families comprising a biological mother who reared them, a “not-lived-with” biological father, and a stepfather. Results When each syndrome was examined individually, hazard rates for drug abuse in offspring of parents with drug abuse were highest for mothers (2.80, 95% CI=2.23–3.38), intermediate for not-lived-with fathers (2.45, 95% CI=2.14–2.79), and lowest for stepfathers (1.99, 95% CI=1.55–2.56). The same pattern was seen for alcohol use disorders (2.23, 95% CI=1.93–2.58; 1.84, 95% CI=1.69–2.00; and 1.27, 95% CI=1.12–1.43) and criminal behavior (1.55, 95% CI=1.44–1.66; 1.46, 95% CI=1.40–1.52; and 1.30, 95% CI=1.23–1.37). When all three syndromes were examined together, specificity of cross-generational transmission was highest for mothers, intermediate for not-lived-with fathers, and lowest for stepfathers. Analyses of intact families and other not-lived-with parents and stepparents showed similar cross-generation transmission for these syndromes in mothers and fathers, supporting the representativeness of results from triparental families. Conclusions A major strength of the triparental design is its inclusion, within a single family, of parents who provide, to a first approximation, their offspring with genes plus rearing, genes only, and rearing only. For drug abuse, alcohol use disorders, and criminal behavior, the results of this study suggest that parent-offspring transmission involves both genetic and environmental processes, with genetic factors being somewhat more important. These results should be interpreted in the context of the strengths

  4. Polygenic Epidemiology

    PubMed Central

    2016-01-01

    ABSTRACT Much of the genetic basis of complex traits is present on current genotyping products, but the individual variants that affect the traits have largely not been identified. Several traditional problems in genetic epidemiology have recently been addressed by assuming a polygenic basis for disease and treating it as a single entity. Here I briefly review some of these applications, which collectively may be termed polygenic epidemiology. Methodologies in this area include polygenic scoring, linear mixed models, and linkage disequilibrium scoring. They have been used to establish a polygenic effect, estimate genetic correlation between traits, estimate how many variants affect a trait, stratify cases into subphenotypes, predict individual disease risks, and infer causal effects using Mendelian randomization. Polygenic epidemiology will continue to yield useful applications even while much of the specific variation underlying complex traits remains undiscovered. PMID:27061411
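    Of the methodologies listed above, polygenic scoring is the simplest to sketch: an individual's score is a weighted sum of effect-allele counts across many variants. The genotypes and effect sizes below are simulated for illustration only:

```python
import numpy as np

def polygenic_score(genotypes, weights):
    """Weighted allele count: genotypes are coded 0/1/2 copies of the
    effect allele per variant, weights are per-variant effect sizes
    (e.g. GWAS log odds ratios). Returns one score per individual."""
    return genotypes @ weights

rng = np.random.default_rng(42)
n_people, n_variants = 5, 100
genotypes = rng.integers(0, 3, size=(n_people, n_variants))
weights = rng.normal(0, 0.05, size=n_variants)  # illustrative effect sizes
scores = polygenic_score(genotypes, weights)
print(scores.round(3))
```

    In practice the weights come from an independent discovery sample, and scores are then tested for association with disease or used for risk stratification in the target sample.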

  5. Analytical techniques for instrument design -- Matrix methods

    SciTech Connect

    Robinson, R.A.

    1997-12-31

    The authors take the traditional Cooper-Nathans approach, as has been applied for many years for steady-state triple-axis spectrometers, and consider its generalization to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, they discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix. They show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the gaussian approximation. They will argue that a generalized program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. They also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question.

  6. Perspectives toward the stereotype production method for public symbol design: a case study of novice designers.

    PubMed

    Ng, Annie W Y; Siu, Kin Wai Michael; Chan, Chetwyn C H

    2013-01-01

    This study investigated the practices and attitudes of novice designers toward user involvement in public symbol design at the conceptual design stage, i.e. the stereotype production method. Differences between male and female novice designers were examined. Forty-eight novice designers (24 male, 24 female) were asked to design public symbol referents based on suggestions made by a group of users in a previous study and provide feedback with regard to the design process. The novice designers were receptive to the adoption of user suggestions in the conception of the design, but tended to modify the pictorial representations generated by the users to varying extents. It is also significant that the male and female novice designers appeared to emphasize different aspects of user suggestions, and the female novice designers were more positive toward these suggestions than their male counterparts. The findings should aid the optimization of the stereotype production method for user-involved symbol design. PMID:22632980

  7. Evaluation of method for secondary DNA typing of Mycobacterium tuberculosis with pTBN12 in epidemiologic study of tuberculosis.

    PubMed Central

    Yang, Z; Chaves, F; Barnes, P F; Burman, W J; Koehler, J; Eisenach, K D; Bates, J H; Cave, M D

    1996-01-01

    Secondary fingerprinting of Mycobacterium tuberculosis DNA with a probe containing the polymorphic GC-rich repetitive sequence present in pTBN12 has been found to have greater discriminating power than fingerprinting with the insertion sequence IS6110 for strains carrying few copies of IS6110. To validate the use of pTBN12 fingerprinting in the molecular epidemiology of tuberculosis, M. tuberculosis isolates from 67 patients in five states in the United States and in Spain were fingerprinted with both IS6110 and pTBN12. Epidemiologic links among the 67 patients were evaluated by patient interview and/or review of medical records. The 67 isolates had 5 IS6110 fingerprint patterns with two to five copies of IS6110 and 18 pTBN12 patterns, of which 10 were shared by more than 1 isolate. Epidemiologic links were consistently found among patients whose isolates had identical pTBN12 patterns, whereas no links were found among patients whose isolates had unique pTBN12 patterns. This suggests that pTBN12 fingerprinting is a useful tool to identify epidemiologically linked tuberculosis patients whose isolates have identical IS6110 fingerprints containing fewer than six fragments. PMID:8940446

  8. Prevalence and Epidemiologic Characteristics of FASD From Various Research Methods with an Emphasis on Recent In-School Studies

    ERIC Educational Resources Information Center

    May, Philip A.; Gossage, J. Phillip; Kalberg, Wendy O.; Robinson, Luther K.; Buckley, David; Manning, Melanie; Hoyme, H. Eugene

    2009-01-01

    Researching the epidemiology and estimating the prevalence of fetal alcohol syndrome (FAS) and other fetal alcohol spectrum disorders (FASD) for mainstream populations anywhere in the world has presented a challenge to researchers. Three major approaches have been used in the past: surveillance and record review systems, clinic-based studies, and…

  9. HEALTHY study rationale, design and methods

    PubMed Central

    2009-01-01

    The HEALTHY primary prevention trial was designed and implemented in response to the growing numbers of children and adolescents being diagnosed with type 2 diabetes. The objective was to moderate risk factors for type 2 diabetes. Modifiable risk factors measured were indicators of adiposity and glycemic dysregulation: body mass index ≥85th percentile, fasting glucose ≥5.55 mmol l-1 (100 mg per 100 ml) and fasting insulin ≥180 pmol l-1 (30 μU ml-1). A series of pilot studies established the feasibility of performing data collection procedures and tested the development of an intervention consisting of four integrated components: (1) changes in the quantity and nutritional quality of food and beverage offerings throughout the total school food environment; (2) physical education class lesson plans and accompanying equipment to increase both participation and number of minutes spent in moderate-to-vigorous physical activity; (3) brief classroom activities and family outreach vehicles to increase knowledge, enhance decision-making skills and support and reinforce youth in accomplishing goals; and (4) communications and social marketing strategies to enhance and promote changes through messages, images, events and activities. Expert study staff provided training, assistance, materials and guidance for school faculty and staff to implement the intervention components. A cohort of students was enrolled in sixth grade and followed to the end of eighth grade. They attended a health screening data collection at baseline and at the end of the study that involved measurement of height, weight, blood pressure, waist circumference and a fasting blood draw. Height and weight were also collected at the end of the seventh grade. The study was conducted in 42 middle schools, six at each of seven locations across the country, with 21 schools randomized to receive the intervention and 21 to act as controls (data collection activities only). Middle school was the unit of sample size and

  10. Cognitive epidemiology

    PubMed Central

    Deary, Ian J; Batty, G David

    2007-01-01

    This glossary provides a guide to some concepts, findings and issues of discussion in the new field of research in which intelligence test scores are associated with mortality and morbidity. Intelligence tests are devised and studied by differential psychologists. Some of the major concepts in differential psychology are explained, especially those regarding cognitive ability testing. Some aspects of IQ (intelligence) tests are described and some of the major tests are outlined. A short guide is given to the main statistical techniques used by differential psychologists in the study of human mental abilities. There is a discussion of common epidemiological concepts in the context of cognitive epidemiology. PMID:17435201

  11. A national cross-sectional study among drug-users in France: epidemiology of HCV and highlight on practical and statistical aspects of the design

    PubMed Central

    2009-01-01

    Background Epidemiology of HCV infection among drug users (DUs) has been widely studied. Prevalence and sociobehavioural data among DUs are therefore available in most countries, but no study has taken into account in its sampling weights one important aspect of the way of life of DUs, namely that they can use one or more specialized services during the study period. In 2004–2005, we conducted a national seroepidemiologic survey of DUs, based on a random sampling design using the Generalised Weight Share Method (GWSM) and on blood testing. Methods A cross-sectional multicenter survey was done among DUs who had injected or snorted drugs at least once in their life. We conducted a two-stage random survey of DUs selected to represent the diversity of drug use. The fact that DUs can use more than one structure during the study period has an impact on their inclusion probabilities. To calculate a correct sampling weight, we used the GWSM. A sociobehavioural questionnaire was administered by interviewers. Selected DUs were asked to self-collect a fingerprick blood sample on blotting paper. Results Of all DUs selected, 1462 (75%) agreed to participate. HCV seroprevalence was 59.8% [95% CI: 50.7–68.3]. Of DUs under 30 years, 28% were HCV seropositive. Of HCV-infected DUs, 27% were unaware of their status. In the month prior to interview, 13% of DUs had shared a syringe, 38% other injection paraphernalia and 81% a crack pipe. In multivariate analysis, factors independently associated with HCV seropositivity were age over 30, HIV seropositivity, having ever injected drugs, opiate substitution treatment (OST), crack use, and precarious housing. Conclusion This is the first time that blood testing combined with GWSM has been applied to a DU population, improving the estimate of HCV prevalence. HCV seroprevalence is high, even among the youngest DUs, and a large proportion of DUs are unaware of their status. Our multivariate analysis identifies risk factors such as crack
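    The weight-share idea behind the GWSM can be illustrated with a toy calculation. This is a simplified equal-share sketch, not the survey's exact estimator: a drug user linked to several services would be over-represented if each sampled service contributed its full design weight, so each link contributes only a share:

```python
def gwsm_weight(sampled_service_weights, n_linked_services):
    """Equal-share version of the Generalised Weight Share Method.

    A drug user linked to n services receives 1/n of the design weight
    (inverse inclusion probability) of each *sampled* service they are
    linked to; the sum is the user's sampling weight, so users reachable
    through many services are not over-counted.
    """
    share = 1.0 / n_linked_services
    return share * sum(sampled_service_weights)

# Hypothetical DU attending 3 services, 2 of which were sampled with
# design weights 10 and 6.
print(gwsm_weight([10.0, 6.0], n_linked_services=3))  # 16/3, about 5.33
```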

  12. Translational Epidemiology in Psychiatry

    PubMed Central

    Weissman, Myrna M.; Brown, Alan S.; Talati, Ardesheer

    2012-01-01

    Translational research generally refers to the application of knowledge generated by advances in basic sciences research translated into new approaches for diagnosis, prevention, and treatment of disease. This direction is called bench-to-bedside. Psychiatry has similarly emphasized the basic sciences as the starting point of translational research. This article introduces the term translational epidemiology for psychiatry research as a bidirectional concept in which the knowledge generated from the bedside or the population can also be translated to the benches of laboratory science. Epidemiologic studies are primarily observational but can generate representative samples, novel designs, and hypotheses that can be translated into more tractable experimental approaches in the clinical and basic sciences. This bedside-to-bench concept has not been explicated in psychiatry, although there are an increasing number of examples in the research literature. This article describes selected epidemiologic designs, providing examples and opportunities for translational research from community surveys and prospective, birth cohort, and family-based designs. Rapid developments in informatics, emphases on large sample collection for genetic and biomarker studies, and interest in personalized medicine—which requires information on relative and absolute risk factors—make this topic timely. The approach described has implications for providing fresh metaphors to communicate complex issues in interdisciplinary collaborations and for training in epidemiology and other sciences in psychiatry. PMID:21646577

  13. Experimental design for improved ceramic processing, emphasizing the Taguchi Method

    SciTech Connect

    Weiser, M.W. . Mechanical Engineering Dept.); Fong, K.B. )

    1993-12-01

    Ceramic processing often requires substantial experimentation to produce acceptable product quality and performance. This is a consequence of ceramic processes depending upon a multitude of factors, some of which can be controlled and others that are beyond the control of the manufacturer. Statistical design of experiments is a procedure that allows quick, economical, and accurate evaluation of processes and products that depend upon several variables. Designed experiments are sets of tests in which the variables are adjusted methodically. A well-designed experiment yields unambiguous results at minimal cost. A poorly designed experiment may reveal little information of value even with complex analysis, wasting valuable time and resources. This article will review the most common experimental designs, including both nonstatistical designs and the much more powerful statistical experimental designs. The Taguchi Method developed by Genichi Taguchi will be discussed in some detail. The Taguchi method, based upon fractional factorial experiments, is a powerful tool for optimizing product and process performance.
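    The fractional factorial core of the Taguchi method can be sketched with the smallest orthogonal array, L4, which screens up to three two-level factors in four runs. The responses below are hypothetical, not from any real ceramic process:

```python
# L4 orthogonal array: 4 runs for up to three 2-level factors.
# Every pair of columns contains each (level, level) combination
# exactly once, which is what makes the array orthogonal.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

def main_effects(array, responses):
    """Main effect of each factor: mean response at level 1 minus
    mean response at level 0."""
    n_factors = len(array[0])
    effects = []
    for f in range(n_factors):
        lo = [y for row, y in zip(array, responses) if row[f] == 0]
        hi = [y for row, y in zip(array, responses) if row[f] == 1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical yield responses for the four runs.
responses = [78.0, 85.0, 74.0, 81.0]
print(main_effects(L4, responses))  # [-4.0, 7.0, 0.0]
```

    Reading off the largest-magnitude effects identifies the factors worth controlling, at a quarter of the cost of the full 2^3 factorial replicated design.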

  14. A new interval optimization method considering tolerance design

    NASA Astrophysics Data System (ADS)

    Jiang, C.; Xie, H. C.; Zhang, Z. G.; Han, X.

    2015-12-01

    This study considers the design variable uncertainty in the actual manufacturing process for a product or structure and proposes a new interval optimization method based on tolerance design, which can provide not only an optimal design but also the allowable maximal manufacturing errors that the design can bear. The design variables' manufacturing errors are depicted using the interval method, and an interval optimization model for the structure is constructed. A dimensionless design tolerance index is defined to describe the overall uncertainty of all design variables, and by combining the nominal objective function, a deterministic two-objective optimization model is built. The possibility degree of interval is used to represent the reliability of the constraints under uncertainty, through which the model is transformed to a deterministic optimization problem. Three numerical examples are investigated to verify the effectiveness of the present method.

  15. Artificial Intelligence Methods: Challenge in Computer Based Polymer Design

    NASA Astrophysics Data System (ADS)

    Rusu, Teodora; Pinteala, Mariana; Cartwright, Hugh

    2009-08-01

    This paper deals with the use of Artificial Intelligence Methods (AI) in the design of new molecules possessing desired physical, chemical and biological properties. This is an important and difficult problem in the chemical, material and pharmaceutical industries. Traditional methods involve a laborious and expensive trial-and-error procedure, but computer-assisted approaches offer many advantages in the automation of molecular design.

  16. An analytical method for designing low noise helicopter transmissions

    NASA Technical Reports Server (NTRS)

    Bossler, R. B., Jr.; Bowes, M. A.; Royal, A. C.

    1978-01-01

    The development and experimental validation of a method for analytically modeling the noise mechanism in the helicopter geared power transmission systems is described. This method can be used within the design process to predict interior noise levels and to investigate the noise reducing potential of alternative transmission design details. Examples are discussed.

  17. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  18. Nutritional Epidemiology

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although observations on relationships between diet and health have always been recognized, the systematic science of nutritional epidemiology in populations is relatively recent. Important observations propelling the field of nutrition forward were numerous in the 18th and 19th centuries, as it was...

  19. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
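
    The Monte Carlo approach described above can be illustrated with a toy model. The weight function, design variables, and uncertainty levels below are invented stand-ins for the paper's aircraft analysis code, used only to show how input uncertainties propagate to output statistics:

```python
import random
import statistics

# Hypothetical stand-in for an aircraft analysis code: a toy weight
# model, linear in two configuration design variables (an assumption).
def aircraft_weight(wing_area, aspect_ratio):
    return 5000.0 + 12.0 * wing_area + 80.0 * aspect_ratio

def propagate(n_samples=20000, seed=1):
    """Monte Carlo propagation of input uncertainty to output weight."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        area = rng.gauss(150.0, 5.0)   # wing area: mean 150, sigma 5
        ar = rng.gauss(9.0, 0.3)       # aspect ratio: mean 9, sigma 0.3
        samples.append(aircraft_weight(area, ar))
    return statistics.mean(samples), statistics.stdev(samples)

mean_w, std_w = propagate()
```

    For this linear model the method of moments gives the same spread in closed form, sqrt((12*5)**2 + (80*0.3)**2), about 64.6, which is a useful cross-check on the simulation; the sampling approach remains usable when the design space is discontinuous and the moments approach breaks down.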

  20. An exploratory GIS-based method to identify and characterise landscapes with an elevated epidemiological risk of Rhodesian human African trypanosomiasis

    PubMed Central

    2012-01-01

    Background Specific land cover types and activities have been correlated with Trypanosoma brucei rhodesiense distributions, indicating the importance of landscape for epidemiological risk. However, methods proposed to identify specific areas with elevated epidemiological risk (i.e. where transmission is more likely to occur) tend to be costly and time consuming. This paper proposes an exploratory spatial analysis using geo-referenced human African trypanosomiasis (HAT) cases and matched controls from Serere hospital, Uganda (December 1998 to November 2002) to identify areas with an elevated epidemiological risk of HAT. Methods Buffers 3 km from each case and control were used to represent areas in which village inhabitants would carry out their daily activities. It was hypothesised that selecting areas where several case village buffers overlapped would enable the identification of locations with increased risk of HAT transmission, as these areas were more likely to be frequented by HAT cases in several surrounding villages. The landscape within these overlap areas should relate more closely to the environment in which transmission occurs than the full buffer areas do. The analysis was carried out for each of four annual periods, for both cases and controls, using a series of threshold values (number of overlapping buffers), including a threshold of one, which represented the benchmark (i.e. use of the full buffer area as opposed to the overlap areas). Results A greater proportion of the case overlap areas than of the control overlap areas consisted of seasonally flooding grassland and lake fringe swamp, correlating well with the preferred habitat of the predominant tsetse species within the study area (Glossina fuscipes fuscipes). The use of overlap areas also resulted in a greater difference between case and control landscapes, when compared with the benchmark (using the full buffer area). Conclusions These results indicate that the overlap
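
    A minimal sketch of the buffer-overlap idea, with invented village coordinates and a flat grid; the study itself used geo-referenced hospital records and real 3 km buffers:

```python
import math

BUFFER_KM = 3.0

# Invented case-village coordinates (km), for illustration only.
case_villages = [(0.0, 0.0), (2.0, 1.0), (4.0, 0.5), (10.0, 10.0)]

def overlap_count(x, y, villages, radius=BUFFER_KM):
    """Number of village buffers (circles of `radius` km) covering (x, y)."""
    return sum(math.hypot(x - vx, y - vy) <= radius for vx, vy in villages)

def high_risk_cells(villages, threshold, step=0.5, extent=12.0):
    """Grid cells covered by at least `threshold` overlapping case buffers."""
    cells = []
    n = int(extent / step)
    for i in range(n):
        for j in range(n):
            x, y = i * step, j * step
            if overlap_count(x, y, villages) >= threshold:
                cells.append((x, y))
    return cells

# threshold=1 is the benchmark (full buffer area); higher thresholds
# restrict attention to areas frequented by several case villages.
benchmark = high_risk_cells(case_villages, threshold=1)
overlap3 = high_risk_cells(case_villages, threshold=3)
```

    The landscape composition (e.g. proportion of seasonally flooding grassland) would then be tabulated within `overlap3` rather than within the much larger `benchmark` area.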

  1. Expanding color design methods for architecture and allied disciplines

    NASA Astrophysics Data System (ADS)

    Linton, Harold E.

    2002-06-01

    The color design processes of visual artists, architects, designers, and theoreticians included in this presentation reflect the practical role of color in architecture. What the color design professional brings to the architectural design team is an expertise and rich sensibility made up of a broad awareness and a finely tuned visual perception. This includes a knowledge of design and its history, expertise with industrial color materials and their methods of application, an awareness of design context and cultural identity, a background in physiology and psychology as it relates to human welfare, and an ability to problem-solve and respond creatively to design concepts with innovative ideas. The broadening of the definition of the colorist's role in architectural design provides architects, artists and designers with significant opportunities for continued professional and educational development.

  2. Design methods for fault-tolerant finite state machines

    NASA Technical Reports Server (NTRS)

    Niranjan, Shailesh; Frenzel, James F.

    1993-01-01

    VLSI electronic circuits are increasingly being used in space-borne applications where high levels of radiation may induce faults, known as single event upsets. In this paper we review the classical methods of designing fault tolerant digital systems, with an emphasis on those methods which are particularly suitable for VLSI-implementation of finite state machines. Four methods are presented and will be compared in terms of design complexity, circuit size, and estimated circuit delay.

  3. Aerodynamic design optimization by using a continuous adjoint method

    NASA Astrophysics Data System (ADS)

    Luo, JiaQi; Xiong, JunTao; Liu, Feng

    2014-07-01

    This paper presents the fundamentals of a continuous adjoint method and the applications of this method to the aerodynamic design optimization of both external and internal flows. General formulation of the continuous adjoint equations and the corresponding boundary conditions is derived. With the adjoint method, the complete gradient information needed in the design optimization can be obtained by solving the governing flow equations and the corresponding adjoint equations only once for each cost function, regardless of the number of design parameters. An inverse design of an airfoil is first performed to study the accuracy of the adjoint gradient and the effectiveness of the adjoint method as an inverse design method. The method is then used to perform a series of single- and multiple-point design optimization problems involving the drag reduction of airfoil, wing, and wing-body configurations, and the aerodynamic performance improvement of turbine and compressor blade rows. The results demonstrate that the continuous adjoint method can efficiently and significantly improve the aerodynamic performance of the design in a shape optimization problem.
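
    The key property, one adjoint solve per cost function regardless of the number of design parameters, can be sketched on a toy linear problem. This is a discrete analogue under invented matrices, not the paper's continuous Navier-Stokes formulation:

```python
# Toy "flow" problem: state u solves A u = b(p), cost J = c . u.
# One adjoint solve (A^T lam = c) gives dJ/dp for every parameter.

def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
            (A[0][0] * b[1] - b[0] * A[1][0]) / det]

A = [[4.0, 1.0], [1.0, 3.0]]   # invented symmetric system matrix
c = [1.0, 2.0]                 # invented cost weights

def b_of_p(p):
    # Toy source term: each design parameter feeds both equations.
    return [p[0] + 0.5 * p[1], 2.0 * p[0] - p[1]]

def cost(p):
    u = solve2(A, b_of_p(p))
    return c[0] * u[0] + c[1] * u[1]

# Adjoint solve: A^T lam = c (A is symmetric here, so A^T = A).
lam = solve2(A, c)

# dJ/dp_i = lam . (db/dp_i); db/dp is constant since b(p) is linear.
db_dp = [[1.0, 0.5], [2.0, -1.0]]   # rows: equations, cols: parameters
grad = [lam[0] * db_dp[0][i] + lam[1] * db_dp[1][i] for i in range(2)]
```

    Adding more design parameters only adds columns to `db_dp`; the adjoint solve is performed once, which is what makes the approach attractive for shape optimization with many parameters.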

  4. Tabu search method with random moves for globally optimal design

    NASA Astrophysics Data System (ADS)

    Hu, Nanfang

    1992-09-01

    Optimum engineering design problems are usually formulated as non-convex optimization problems of continuous variables. Because of the absence of convexity structure, they can have multiple minima, and global optimization becomes difficult. Traditional methods of optimization, such as penalty methods, can often be trapped at a local optimum. The tabu search method with random moves is introduced to solve these problems approximately. Its reliability and efficiency are examined with the help of standard test functions. Analysis of the implementations shows that this method is easy to use and requires no derivative information. It outperforms the random search method and a composite genetic algorithm. In particular, it is applied to minimum weight design examples of a three-bar truss, coil springs, a Z-section and a channel section. For the channel section, the optimal design using the tabu search method with random moves saved 26.14 percent over the weight of the SUMT method.
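
    A minimal sketch of tabu search with random moves on a standard multimodal test function; the step size, tabu tenure, and acceptance rule are illustrative assumptions, not the paper's settings:

```python
import math
import random

def objective(x):
    # Rastrigin test function: many local minima, global minimum 0 at origin.
    return sum(xi * xi - 10.0 * math.cos(2.0 * math.pi * xi) + 10.0 for xi in x)

def tabu_search(dim=2, iters=2000, step=0.5, tabu_len=20, seed=3):
    rng = random.Random(seed)
    x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    best, best_f = list(x), objective(x)
    tabu = []                              # recently visited regions
    for _ in range(iters):
        # Random move: no derivative information is needed.
        cand = [xi + rng.uniform(-step, step) for xi in x]
        key = tuple(round(c, 1) for c in cand)
        if key in tabu:
            continue                       # tabu: skip a recently visited region
        tabu.append(key)
        if len(tabu) > tabu_len:
            tabu.pop(0)                    # oldest entry leaves the tabu list
        f = objective(cand)
        if f < objective(x) + 1.0:         # accept mildly worse moves too
            x = cand
        if f < best_f:
            best, best_f = list(cand), f
    return best, best_f

best_x, best_f = tabu_search()
```

    The tabu list discourages cycling back to recently visited regions, while accepting mildly worse moves lets the search climb out of local minima, the failure mode of penalty-type methods noted above.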

  5. An inverse method with regularity condition for transonic airfoil design

    NASA Technical Reports Server (NTRS)

    Zhu, Ziqiang; Xia, Zhixun; Wu, Liyi

    1991-01-01

    It is known from Lighthill's exact solution of the incompressible inverse problem that in the inverse design problem, the surface pressure distribution and the free stream speed cannot both be prescribed independently. This implies the existence of a constraint on the prescribed pressure distribution. The same constraint exists at compressible speeds. Presented here is an inverse design method for transonic airfoils. In this method, the target pressure distribution contains a free parameter that is adjusted during the computation to satisfy the regularity condition. Some design results are presented in order to demonstrate the capabilities of the method.

  6. Digital Epidemiology

    PubMed Central

    Salathé, Marcel; Bengtsson, Linus; Bodnar, Todd J.; Brewer, Devon D.; Brownstein, John S.; Buckee, Caroline; Campbell, Ellsworth M.; Cattuto, Ciro; Khandelwal, Shashank; Mabry, Patricia L.; Vespignani, Alessandro

    2012-01-01

    Mobile, social, real-time: the ongoing revolution in the way people communicate has given rise to a new kind of epidemiology. Digital data sources, when harnessed appropriately, can provide local and timely information about disease and health dynamics in populations around the world. The rapid, unprecedented increase in the availability of relevant data from various digital sources creates considerable technical and computational challenges. PMID:22844241

  7. An artificial viscosity method for the design of supercritical airfoils

    NASA Technical Reports Server (NTRS)

    Mcfadden, G. B.

    1979-01-01

    A numerical technique is presented for the design of two-dimensional supercritical wing sections with low wave drag. The method is a design mode of the analysis code H, which gives excellent agreement with experimental results and is widely used in the aircraft industry. Topics covered include the partial differential equations of transonic flow; the computational procedure and results; the design procedure; a convergence theorem; and a description of the code.

  8. Single-Case Designs and Qualitative Methods: Applying a Mixed Methods Research Perspective

    ERIC Educational Resources Information Center

    Hitchcock, John H.; Nastasi, Bonnie K.; Summerville, Meredith

    2010-01-01

    The purpose of this conceptual paper is to describe a design that mixes single-case (sometimes referred to as single-subject) and qualitative methods, hereafter referred to as a single-case mixed methods design (SCD-MM). Minimal attention has been given to the topic of applying qualitative methods to SCD work in the literature. These two…

  9. 77 FR 55832 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of a New Equivalent Method

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... made under the provisions of 40 CFR part 53, as amended on August 31, 2011 (76 FR 54326-54341). The... AGENCY Ambient Air Monitoring Reference and Equivalent Methods: Designation of a New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of the designation of a new equivalent method...

  10. 77 FR 60985 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-05

    ... 53, as amended on August 31, 2011 (76 FR 54326-54341). The new equivalent methods are automated... AGENCY Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent Methods AGENCY: Environmental Protection Agency. ACTION: Notice of the designation of three new...

  11. Investigating the Use of Design Methods by Capstone Design Students at Clemson University

    ERIC Educational Resources Information Center

    Miller, W. Stuart; Summers, Joshua D.

    2013-01-01

    The authors describe a preliminary study to understand the attitude of engineering students regarding the use of design methods in projects to identify the factors either affecting or influencing the use of these methods by novice engineers. A senior undergraduate capstone design course at Clemson University, consisting of approximately fifty…

  12. Two-Method Planned Missing Designs for Longitudinal Research

    ERIC Educational Resources Information Center

    Garnier-Villarreal, Mauricio; Rhemtulla, Mijke; Little, Todd D.

    2014-01-01

    We examine longitudinal extensions of the two-method measurement design, which uses planned missingness to optimize cost-efficiency and validity of hard-to-measure constructs. These designs use a combination of two measures: a "gold standard" that is highly valid but expensive to administer, and an inexpensive (e.g., survey-based)…

  13. New directions for Artificial Intelligence (AI) methods in optimum design

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1989-01-01

    Developments and applications of artificial intelligence (AI) methods in the design of structural systems are reviewed. Principal shortcomings in the current approach are emphasized, and the need for some degree of formalism in the development environment for such design tools is underscored. Emphasis is placed on efforts to integrate algorithmic computations in expert systems.

  14. Approximate method of designing a two-element airfoil

    NASA Astrophysics Data System (ADS)

    Abzalilov, D. F.; Mardanov, R. F.

    2011-09-01

    An approximate method is proposed for designing a two-element airfoil. The method is based on reducing an inverse boundary-value problem in a doubly connected domain to a problem in a singly connected domain located on a multisheet Riemann surface. The essence of the method is replacement of channels between the airfoil elements by channels of flow suction and blowing. The shape of these channels asymptotically tends to the annular shape of channels passing to infinity on the second sheet of the Riemann surface. The proposed method can be extended to designing multielement airfoils.

  15. New knowledge network evaluation method for design rationale management

    NASA Astrophysics Data System (ADS)

    Jing, Shikai; Zhan, Hongfei; Liu, Jihong; Wang, Kuan; Jiang, Hao; Zhou, Jingtao

    2015-01-01

    Current design rationale (DR) systems have not demonstrated the value of the approach in practice, since little attention has been paid to methods for evaluating DR knowledge. To systematize the knowledge management process for future computer-aided DR applications, a prerequisite is a measure for DR knowledge. In this paper, a new knowledge network evaluation method for DR management is presented. The method characterizes the value of DR knowledge from four perspectives: design rationale structure scale, association knowledge and reasoning ability, degree of design justification support, and degree of knowledge representation conciseness. A comprehensive value for DR knowledge is also computed by the proposed method. To validate the method, different styles of DR knowledge network and the performance of the proposed measure are discussed. The evaluation method has been applied in two realistic design cases and compared with structural measures. The research proposes a DR knowledge evaluation method that provides an objective metric and a selection basis for DR knowledge reuse during the product design process, and that gives more effective guidance and support for the application and management of DR knowledge.

  16. Design method for four-reflector type beam waveguide systems

    NASA Technical Reports Server (NTRS)

    Betsudan, S.; Katagi, T.; Urasaki, S.

    1986-01-01

    Discussed is a method for the design of four-reflector type beam waveguide feed systems, comprised of a conical horn and four focused reflectors, which are used widely as the primary reflector systems for communications satellite Earth station antennas. The design parameters for these systems are clarified, the relations between the parameters are brought out based on beam mode expansion, and the independent design parameters are specified. The characteristics of these systems, namely spillover loss, crosspolarization components, and frequency characteristics, and their relation to the design parameters, are also shown. It is also shown that the design parameters which determine the dimensions of the conical horn or the shape of the focused reflectors can be uniquely established once the design criterion for the system has been selected as either: (1) minimizing the crosspolarization component while keeping the spillover loss within acceptable limits, or (2) minimizing the spillover loss while keeping the crosspolarization components below an acceptable level, and the independent design parameters, such as the respective sizes of the focused reflectors and the distances between them, have been established according to mechanical restrictions. A sample design is also shown. In addition to clarifying the effects of each design parameter on the system and improving insight into these systems, this design method also increases the efficiency of the design process.

  17. Methodologic research needs in environmental epidemiology: data analysis.

    PubMed Central

    Prentice, R L; Thomas, D

    1993-01-01

    A brief review is given of data analysis methods for the identification and quantification of associations between environmental exposures and health events of interest. Data analysis methods are outlined for each of the study designs mentioned, with an emphasis on topics in need of further research. Particularly noted are the need for improved methods for accommodating exposure assessment measurement errors in analytic epidemiologic studies and for improved methods for the conduct and analysis of aggregate data (ecologic) studies. PMID:8206041

  18. A multidisciplinary optimization method for designing boundary layer ingesting inlets

    NASA Astrophysics Data System (ADS)

    Rodriguez, David Leonard

    2001-07-01

    The Blended-Wing-Body is a conceptual aircraft design with rear-mounted, over-wing engines. Two types of engine installations have been considered for this aircraft. One installation is quite conventional with podded engines mounted on pylons. The other installation has partially buried engines with boundary layer ingesting inlets. Although ingesting the low-momentum flow in a boundary layer can improve propulsive efficiency, poor inlet performance can offset and even overwhelm this potential advantage. For both designs, the tight coupling between the aircraft aerodynamics and the propulsion system poses a difficult design integration problem. This dissertation presents a design method that solves the problem using multidisciplinary optimization. A Navier-Stokes flow solver, an engine analysis method, and a nonlinear optimizer are combined into a design tool that correctly addresses the tight coupling of the problem. The method is first applied to a model 2D problem to expedite development and thoroughly test the scheme. The low computational cost of the 2D method allows for several inlet installations to be optimized and analyzed. The method is then upgraded by using a validated 3D Navier-Stokes solver. The two candidate engine installations are analyzed and optimized using this inlet design method. The method is shown to be quite effective at integrating the propulsion and aerodynamic systems of the Blended-Wing-Body for both engine installations by improving overall performance and satisfying any specified design constraints. By comparing the two optimized designs, the potential advantages of ingesting boundary layer flow for this aircraft are demonstrated.

  19. Epidemiology: Cornerstone for Health Education.

    ERIC Educational Resources Information Center

    Markellis, Victoria C.

    1986-01-01

    Epidemiology has been used historically to reduce the incidence of communicable diseases and is used presently to study chronic conditions, environmental conditions, and social conditions. Its analytical method is necessary for health educators to evaluate tactics and recommend programs. (MT)

  20. Design of diffractive optical surfaces within the nonimaging SMS design method

    NASA Astrophysics Data System (ADS)

    Mendes-Lopes, João.; Benítez, Pablo; Miñano, Juan C.

    2015-09-01

    The Simultaneous Multiple Surface (SMS) method was initially developed as a design method in Nonimaging Optics; later, the method was extended to designing Imaging Optics. We show an extension of the SMS method to diffractive surfaces. Using this method, diffractive kinoform surfaces are calculated simultaneously and through a direct method, i.e. one not based on multi-parametric optimization techniques. Using the phase-shift properties of diffractive surfaces as an extra degree of freedom, only N/2 surfaces are needed to perfectly couple N one-parameter wavefronts. Wavefronts of different wavelengths can also be coupled, hence chromatic aberration can be corrected in SMS-based systems. The method can combine and calculate reflective, refractive and diffractive surfaces simultaneously, through direct calculation of the phase and refractive/reflective profiles. Representative diffractive systems designed by the SMS method are presented.

  1. Epidemiological investigation of a Legionnaires' disease outbreak in Christchurch, New Zealand: the value of spatial methods for practical public health.

    PubMed

    White, P S; Graham, F F; Harte, D J G; Baker, M G; Ambrose, C D; Humphrey, A R G

    2013-04-01

    Between April and August 2005 Christchurch, New Zealand experienced an outbreak of Legionnaires' disease. There were 19 laboratory-confirmed cases, including three deaths. Legionella pneumophila serogroup 1 (Lpsg1) was identified as the causative agent for all cases. A case-control study indicated a geographical association between the cases but no specific common exposures. Rapid spatial epidemiological investigation confirmed the association and identified seven spatially significant case clusters. The clusters were all sourced in the same area and exhibited a clear anisotropic process (noticeable direction) revealing a plume effect consistent with aerosol dispersion from a prevailing southwesterly wind. Four out of five cases tested had indistinguishable allele profiles that also matched environmental isolates from a water cooling tower within the centre of the clusters. This tower was considered the most probable source for these clusters. These findings suggest a maximum dispersal distance in this outbreak of 11·6 km. This work illustrated the value of geostatistical techniques for infectious disease epidemiology and for providing timely information during outbreak investigations. PMID:22697112

  2. A Bright Future for Evolutionary Methods in Drug Design.

    PubMed

    Le, Tu C; Winkler, David A

    2015-08-01

    Most medicinal chemists understand that chemical space is extremely large, essentially infinite. Although high-throughput experimental methods allow exploration of drug-like space more rapidly, they are still insufficient to fully exploit the opportunities that such large chemical space offers. Evolutionary methods can synergistically blend automated synthesis and characterization methods with computational design to identify promising regions of chemical space more efficiently. We describe how evolutionary methods are implemented, and provide examples of published drug development research in which these methods have generated molecules with increased efficacy. We anticipate that evolutionary methods will play an important role in future drug discovery. PMID:26059362

  3. A comparison of methods for DPLL loop filter design

    NASA Technical Reports Server (NTRS)

    Aguirre, S.; Hurd, W. J.; Kumar, R.; Statman, J.

    1986-01-01

    Four design methodologies for loop filters for a class of digital phase-locked loops (DPLLs) are presented. The first design maps an optimum analog filter into the digital domain; the second designs a filter that minimizes, in discrete time, a weighted combination of the variance of the phase error due to noise and the sum square of the deterministic phase-error component; the third uses Kalman filter estimation theory to design a filter composed of a least squares fading memory estimator and a predictor. The last design relies on classical theory, including rules for the design of compensators. Linear analysis is used throughout the article to compare the different designs, including stability, steady-state performance and transient behavior of the loops. Design methodology is not critical when the loop update rate can be made high relative to the loop bandwidth, as the performance approaches that of continuous time. For low update rates, however, the minimization method is significantly superior to the other methods.
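
    The behavior such loop filters must deliver can be sketched with a toy type-II DPLL: a proportional-plus-integral loop filter driving an NCO. The gains below are illustrative assumptions, not taken from any of the four designs; the integral path drives the steady-state phase error to zero under a constant frequency offset:

```python
# Toy discrete-time PLL with a proportional-plus-integral loop filter.
def run_dpll(freq_offset=0.02, n_steps=400, kp=0.3, ki=0.02):
    theta_in = 0.0     # input phase: a ramp, i.e. a constant frequency offset
    theta_nco = 0.0    # numerically controlled oscillator (NCO) phase
    integrator = 0.0   # integral path of the loop filter
    err = 0.0
    for _ in range(n_steps):
        theta_in += freq_offset
        err = theta_in - theta_nco           # linearized phase detector
        integrator += ki * err               # integral path learns the frequency
        theta_nco += kp * err + integrator   # loop filter output updates the NCO
    return err

final_err = run_dpll()
```

    With these gains the error dynamics have both poles inside the unit circle, so the transient decays and the integrator settles at the frequency offset; the design methodologies in the paper are, in essence, different ways of choosing such gains against noise and transient criteria.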

  4. XML-based product information processing method for product design

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen Yu

    2012-01-01

    The design knowledge of modern mechatronics products centers on information processing within knowledge-intensive engineering; product design innovation is therefore essentially an innovation in knowledge and information processing. After analysing the role of mechatronics product design knowledge and its information-management features, a unified XML-based product information processing model is proposed. The model covers functional knowledge, structural knowledge and their relationships, with XML-based representations proposed for product function elements, product structure elements, and the mapping between function and structure. The information processing of a parallel friction roller is given as an example, which demonstrates that this method is helpful for knowledge-based design systems and product innovation.
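
    A hedged sketch of such a unified XML model using Python's standard library; the element and attribute names below are invented for illustration, since the abstract does not publish a schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical schema: function elements, structure elements, and a
# mapping between them (names invented, not from the paper).
doc = ET.fromstring("""
<product name="parallel_friction_roller">
  <function id="f1">transmit torque</function>
  <structure id="s1">roller pair</structure>
  <mapping function="f1" structure="s1"/>
</product>
""")

# Index function and structure elements by id, then resolve the mapping.
funcs = {f.get("id"): f.text for f in doc.findall("function")}
structs = {s.get("id"): s.text for s in doc.findall("structure")}
links = [(funcs[m.get("function")], structs[m.get("structure")])
         for m in doc.findall("mapping")]
```

    Keeping function, structure, and their mapping as separate elements lets a knowledge-based design system query either view independently.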

  6. Novel parameter-based flexure bearing design method

    NASA Astrophysics Data System (ADS)

    Amoedo, Simon; Thebaud, Edouard; Gschwendtner, Michael; White, David

    2016-06-01

    A parameter study was carried out on the design variables of a flexure bearing to be used in a Stirling engine with a fixed axial displacement and a fixed outer diameter. A design method was developed in order to assist identification of the optimum bearing configuration. This was achieved through a parameter study of the bearing carried out with ANSYS®. The parameters varied were the number and the width of the arms, the thickness of the bearing, the eccentricity, the size of the starting and ending holes, and the turn angle of the spiral. Comparison was made between the different designs in terms of axial and radial stiffness, the natural frequency, and the maximum induced stresses. Moreover, the Finite Element Analysis (FEA) was compared to theoretical results for a given design. The results led to a graphical design method which assists the selection of flexure bearing geometrical parameters based on pre-determined geometric and material constraints.

  7. A computational design method for transonic turbomachinery cascades

    NASA Technical Reports Server (NTRS)

    Sobieczky, H.; Dulikravich, D. S.

    1982-01-01

    This paper describes a systematic computational procedure for finding the configuration changes necessary to make the flow past turbomachinery cascades, channels and nozzles shock-free at prescribed transonic operating conditions. The method is based on a finite-area transonic analysis technique and the fictitious gas approach. This design scheme has two major areas of application. First, it can be used for the design of supercritical cascades, with applications mainly in compressor blade design. Second, it provides subsonic inlet shapes, including sonic surfaces, with suitable initial data for the design of supersonic (accelerated) exits, such as nozzles and turbine cascade shapes. This fast, accurate and economical method, with a proven potential for application to three-dimensional flows, is illustrated by some design examples.

  8. Risk-based methods applicable to ranking conceptual designs

    SciTech Connect

    Breeding, R.J.; Ortiz, K.; Ringland, J.T.; Lim, J.J.

    1993-11-01

    In Genichi Taguchi's latest book on quality engineering, an emphasis is placed on robust design processes in which quality engineering techniques are brought "upstream," that is, they are utilized as early as possible, preferably in the conceptual design stage. This approach was used in a study of possible future safety system designs for weapons. As an experiment, a method was developed for using probabilistic risk analysis (PRA) techniques to rank conceptual designs for performance against a safety metric, for ultimate incorporation into a Pugh matrix evaluation. This represents a high-level application of PRA methods to weapons. As with most conceptual designs, details of the implementation were not yet developed; many of the components had never been built, let alone tested. Therefore, our application of risk assessment methods was forced to be at such a high level that the entire evaluation could be performed on a spreadsheet. Nonetheless, the method produced numerical estimates of safety in a manner that was consistent, reproducible, and scrutable. The results enabled us to rank designs to identify areas where returns on research efforts would be the greatest. The numerical estimates were calibrated against what is achievable by current weapon safety systems. The use of expert judgement is inescapable, but these judgements are explicit, and the method is easily implemented in a spreadsheet computer program.
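
    The spreadsheet-level evaluation can be sketched as a simple series-system reliability ranking; the designs and per-component failure probabilities below are invented for illustration, not drawn from the study:

```python
# Invented conceptual designs with per-component failure probabilities.
designs = {
    "design_a": [1e-3, 5e-4, 2e-4],
    "design_b": [5e-4, 5e-4, 5e-4],
    "design_c": [2e-3, 1e-4, 1e-4],
}

def system_failure_prob(component_probs):
    """Series system: the design fails if any independent component fails."""
    p_ok = 1.0
    for p in component_probs:
        p_ok *= 1.0 - p
    return 1.0 - p_ok

# Rank designs from safest to least safe; such a ranking could then feed
# a Pugh matrix comparison as described above.
ranked = sorted(designs, key=lambda d: system_failure_prob(designs[d]))
```

    Even at this level of abstraction the estimates are consistent, reproducible, and scrutable: every judgement is an explicit number in the table.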

  9. The use of epidemiology in alcohol research

    PubMed Central

    Rossow, Ingeborg; Norström, Thor

    2013-01-01

    Aims This paper presents examples to illustrate the utility and limitations in the use of epidemiology in alcohol research and discusses some promising new directions. Methods Review of literature, concentrating on epidemiological alcohol research with relevance to public health. Findings and conclusion Epidemiology offers tools for assessment of causes and effects of alcohol consumption as well as the effects of efforts to prevent alcohol consumption and its consequences. Epidemiological studies have made significant contributions to alcohol research with respect to public health and public policy. Fixed-effects modelling, difference-in-differences estimation and integrated qualitative and epidemiological methods are promising but underused methods in epidemiological studies. Many epidemiological studies have limited transferability of knowledge to other cultures and jurisdictions. PMID:23134358

  10. Designing, Teaching, and Evaluating Two Complementary Mixed Methods Research Courses

    ERIC Educational Resources Information Center

    Christ, Thomas W.

    2009-01-01

    Teaching mixed methods research is difficult. This longitudinal explanatory study examined how two classes were designed, taught, and evaluated. Curriculum, Research, and Teaching (EDCS-606) and Mixed Methods Research (EDCS-780) used a research proposal generation process to highlight the importance of the purpose, research question and…

  11. INNOVATIVE METHODS FOR THE OPTIMIZATION OF GRAVITY STORM SEWER DESIGN

    EPA Science Inventory

    The purpose of this paper is to describe a new method for optimizing the design of urban storm sewer systems. Previous efforts to optimize gravity sewers have met with limited success because classical optimization methods require that the problem be well behaved, e.g. describ...

  12. FRP bolted flanged connections -- Modern design and fabrication methods

    SciTech Connect

    Blach, A.E.; Sun, L.

    1995-11-01

    Bolted flanged connections for fiber reinforced plastic (FRP) pipes and pressure vessels are of great importance for any user of FRP material in fluid containment applications. At present, no dimensional standards or design rules exist for FRP flanges. Most often, flanges are fabricated to dimensional standards for metallic flanges without questioning their applicability to FRP materials. This paper discusses both simplified and exact design methods for composite flanges, based on isotropic material design and on laminate theory design. Results of the various design methods are then compared with experimental results from strain gage measurements on test pressure vessels. Methods of flange fabrication, such as hand lay-up, injection molding, and filament winding, are discussed for their relative merits in pressure vessel and piping applications, as are their economic implications. Both integral and bonded flanges are covered as applicable to the various methods of fabrication. Also treated are the problems of gasket selection, bolting and overbolting, gasket stresses, and leakage of flanged connections.

  13. INTERMAP: background, aims, design, methods, and descriptive statistics (nondietary).

    PubMed

    Stamler, J; Elliott, P; Dennis, B; Dyer, A R; Kesteloot, H; Liu, K; Ueshima, H; Zhou, B F

    2003-09-01

    Blood pressure (BP) above optimal (≤120/≤80 mmHg) is established as a major cardiovascular disease (CVD) risk factor. Prevalence of adverse BP is high in most adult populations; until recently research has been sparse on reasons for this. Since the 1980s, epidemiologic studies confirmed that salt, alcohol intake, and body mass relate directly to BP; dietary potassium, inversely. Several other nutrients also probably influence BP. The DASH feeding trials demonstrated that with the multiple modifications in the DASH combination diet, SBP/DBP (SBP: systolic blood pressure, DBP: diastolic blood pressure) was sizably reduced, independent of calorie balance, alcohol intake, and BP reduction with decreased dietary salt. A key challenge for research is to elucidate specific nutrients accounting for this effect. The general aim of the study was to clarify influences of multiple nutrients on SBP/DBP of individuals over and above effects of Na, K, alcohol, and body mass. Specific aims were, in a cross-sectional epidemiologic study of 4680 men and women aged 40-59 years from 17 diverse population samples in China, Japan, UK, and USA, to test 10 prior hypotheses on relations of macronutrients to SBP/DBP and on the role of dietary factors in inverse associations of education with BP; to test four related subgroup hypotheses; and to explore associations with SBP/DBP of multiple other nutrients, urinary metabolites, and foods. For these purposes, for all 4680 participants, with standardized high-quality methods, individual intake of 76 nutrients was assessed from four 24-h dietary recalls/person; 24-h excretion of Na, K, Ca, Mg, creatinine, and amino acids, as well as microalbuminuria, was measured in two timed 24-h urine collections/person; and multiple nutrients and metabolites were measured by nuclear magnetic resonance and high-pressure liquid chromatography. Based on eight SBP/DBP measurements/person, and data on multiple possible confounders, the analyses utilize mainly multiple linear regression and quantile analyses to test prior
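The multiple linear regression machinery referred to above can be illustrated in miniature. The sketch below fits an ordinary least-squares model through the normal equations on synthetic, noise-free data; the variable names and coefficients are hypothetical, not INTERMAP results:

```python
def ols(X, y):
    """Ordinary least squares via the normal equations with Gaussian
    elimination. Each row of X starts with 1.0 for the intercept."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]  # X'X
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]            # X'y
    for col in range(k):                       # forward elimination, pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for i in reversed(range(k)):               # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Synthetic example: SBP = 100 + 0.05*urinary_Na + 0.8*BMI (exact, noise-free)
rows = [(100, 20), (150, 25), (200, 30), (120, 28), (180, 22)]
X = [[1.0, na, bmi] for na, bmi in rows]
y = [100 + 0.05 * na + 0.8 * bmi for na, bmi in rows]
beta = ols(X, y)
print(beta)  # recovers [100.0, 0.05, 0.8] up to rounding
```

In a real analysis the design matrix would carry the many nutrient and confounder columns the abstract describes; the algebra is unchanged.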

  14. Optimal Input Signal Design for Data-Centric Estimation Methods.

    PubMed

    Deshpande, Sunil; Rivera, Daniel E

    2013-01-01

    Data-centric estimation methods such as Model-on-Demand and Direct Weight Optimization form attractive techniques for estimating unknown functions from noisy data. These methods rely on generating a local function approximation from a database of regressors at the current operating point with the process repeated at each new operating point. This paper examines the design of optimal input signals formulated to produce informative data to be used by local modeling procedures. The proposed method specifically addresses the distribution of the regressor vectors. The design is examined for a linear time-invariant system under amplitude constraints on the input. The resulting optimization problem is solved using semidefinite relaxation methods. Numerical examples show the benefits in comparison to a classical PRBS input design. PMID:24317042
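The classical PRBS input used as the benchmark above can be generated with a linear-feedback shift register. A small sketch of the standard PRBS7 sequence (feedback polynomial x^7 + x^6 + 1, period 127):

```python
def prbs7(length, seed=0x40):
    """Maximum-length +/-1 pseudo-random binary sequence from a 7-bit LFSR
    with feedback polynomial x^7 + x^6 + 1 (period 127 for any nonzero seed)."""
    state = seed
    out = []
    for _ in range(length):
        newbit = ((state >> 6) ^ (state >> 5)) & 1   # XOR of taps 7 and 6
        state = ((state << 1) | newbit) & 0x7F       # shift, keep 7 bits
        out.append(1 if state & 1 else -1)           # map bit -> +/-1 level
    return out

seq = prbs7(254)       # two full periods
print(seq[:10])        # +/-1 excitation samples
```

A PRBS of this kind spreads its energy broadly in frequency but, unlike the regressor-distribution-shaping design the paper proposes, does not target where the local models will be evaluated.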

  15. Test methods and design allowables for fibrous composites. Volume 2

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C. (Editor)

    1989-01-01

    Topics discussed include extreme/hostile environment testing, establishing design allowables, and property/behavior specific testing. Papers are presented on environmental effects on the high strain rate properties of graphite/epoxy composite, the low-temperature performance of short-fiber reinforced thermoplastics, the abrasive wear behavior of unidirectional and woven graphite fiber/PEEK, test methods for determining design allowables for fiber reinforced composites, and statistical methods for calculating material allowables for MIL-HDBK-17. Attention is also given to a test method to measure the response of composite materials under reversed cyclic loads, a through-the-thickness strength specimen for composites, the use of torsion tubes to measure in-plane shear properties of filament-wound composites, the influence of test fixture design on the Iosipescu shear test for fiber composite materials, and a method for monitoring in-plane shear modulus in fatigue testing of composites.

  17. Tradeoff methods in multiobjective insensitive design of airplane control systems

    NASA Technical Reports Server (NTRS)

    Schy, A. A.; Giesy, D. P.

    1984-01-01

    The latest results of an ongoing study of computer-aided design of airplane control systems are given. Constrained minimization algorithms are used, with the design objectives in the constraint vector. The concept of Pareto optimality is briefly reviewed. It is shown how an experienced designer can use it to find designs which are well-balanced in all objectives. Then the problem of finding designs which are insensitive to uncertainty in system parameters is discussed, introducing a probabilistic vector definition of sensitivity which is consistent with the deterministic Pareto optimal problem. Insensitivity is important in any practical design, but it is particularly important in the design of feedback control systems, since it is considered to be the most important distinctive property of feedback control. Methods of tradeoff between deterministic and stochastic-insensitive (SI) design are described, and tradeoff design results are presented for the example of a Shuttle lateral stability augmentation system. This example is used because careful studies have been made of the uncertainty in Shuttle aerodynamics. Finally, since accurate statistics of uncertain parameters are usually not available, the effects of crude statistical models on SI designs are examined.
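The Pareto screen, keeping only designs that no other design matches or beats in every objective, can be sketched as follows (the objective values are hypothetical):

```python
def pareto_front(designs):
    """Keep designs not dominated by any other (all objectives minimized):
    a design is dominated if another is at least as good in every objective."""
    front = []
    for a in designs:
        dominated = any(
            b != a and all(b[k] <= a[k] for k in range(len(a)))
            for b in designs
        )
        if not dominated:
            front.append(a)
    return front

# (tracking error, control effort) for four hypothetical candidate controllers
candidates = [(0.10, 5.0), (0.08, 6.0), (0.12, 4.0), (0.10, 6.5)]
front = pareto_front(candidates)
print(front)  # the fourth design is dominated by the first and drops out
```

The designer then trades off along the surviving front, which is where the deterministic-versus-insensitive tradeoff the abstract describes takes place.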

  18. A new method of VLSI conform design for MOS cells

    NASA Astrophysics Data System (ADS)

    Schmidt, K. H.; Wach, W.; Mueller-Glaser, K. D.

    An automated method for the design of specialized SSI/LSI-level MOS cells suitable for incorporation in VLSI chips is described. The method uses the symbolic-layout features of the CABBAGE computer program (Hsueh, 1979; De Man et al., 1982), but restricted by a fixed grid system to facilitate compaction procedures. The techniques used are shown to significantly speed the processes of electrical design, layout, design verification, and description for subsequent CAD/CAM application. In the example presented, a 211-transistor, parallel-load, synchronous 4-bit up/down binary counter cell was designed in 9 days, as compared to 30 days for a manually-optimized-layout version and 3 days for a larger, less efficient cell designed by a programmable logic array; the cell areas were 0.36, 0.21, and 0.79 sq mm, respectively. The primary advantage of the method is seen in the extreme ease with which the cell design can be adapted to new parameters or design rules imposed by improvements in technology.

  19. Computer method for design of acoustic liners for turbofan engines

    NASA Technical Reports Server (NTRS)

    Minner, G. L.; Rice, E. J.

    1976-01-01

    A design package is presented for the specification of acoustic liners for turbofans. An estimate of the noise generation was made based on modifications of existing noise correlations, for which the inputs are basic fan aerodynamic design variables. The method does not predict multiple pure tones. A target attenuation spectrum was calculated which was the difference between the estimated generation spectrum and a flat annoyance-weighted goal attenuated spectrum. The target spectrum was combined with a knowledge of acoustic liner performance as a function of the liner design variables to specify the acoustic design. The liner design method at present is limited to annular duct configurations. The detailed structure of the liner was specified by combining the required impedance (which is a result of the previous step) with a mathematical model relating impedance to the detailed structure. The design procedure was developed for a liner constructed of perforated sheet placed over honeycomb backing cavities. A sample calculation was carried through in order to demonstrate the design procedure, and experimental results presented show good agreement with the calculated results of the method.
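The target-spectrum step, estimated generated noise minus the flat goal, floored at zero, reduces to a one-liner; the band levels below are hypothetical illustrations, not values from the report:

```python
# Target attenuation spectrum = estimated generated spectrum minus a flat
# annoyance-weighted goal, floored at zero (no negative attenuation needed).
# Band levels (dB) are hypothetical.
generated_db = {500: 95.0, 1000: 102.0, 2000: 108.0, 4000: 104.0}  # Hz -> dB
goal_db = 92.0

target_db = {f: max(0.0, level - goal_db) for f, level in generated_db.items()}
print(target_db)  # {500: 3.0, 1000: 10.0, 2000: 16.0, 4000: 12.0}
```

The resulting per-band targets are what the liner-impedance model is then asked to meet.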

  20. Method for Enzyme Design with Genetically Encoded Unnatural Amino Acids.

    PubMed

    Hu, C; Wang, J

    2016-01-01

    We describe methodologies for the design of artificial enzymes with genetically encoded unnatural amino acids. Genetically encoded unnatural amino acids offer great promise for constructing artificial enzymes with novel activities. In our studies, the design of artificial enzymes was divided into two steps. First, we considered the unnatural amino acids and the protein scaffold separately. The scaffold is designed by traditional protein design methods. The unnatural amino acids are inspired by natural structures and organic chemistry, and synthesized by either organic chemistry methods or enzymatic conversion. With the increasing number of published unnatural amino acids with various functions, we describe an unnatural amino acid toolkit containing metal chelators, redox mediators, and click chemistry reagents. These efforts enable a researcher to search the toolkit for appropriate unnatural amino acids, rather than design and synthesize them from the beginning. In the second step, the model enzyme is optimized by computational methods and directed evolution. Lastly, we describe a general method for evolving aminoacyl-tRNA synthetases and expressing proteins with incorporated unnatural amino acids. PMID:27586330

  1. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher order equations representing the design space.
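A screening pass like the first design cycle described often starts from a two-level full-factorial matrix over the geometric variables; a sketch with hypothetical flowpath variables:

```python
import itertools

# Two-level full-factorial design over three hypothetical flowpath variables,
# coded -1/+1; each row is one geometry to run through the propulsion code.
factors = ["inlet_angle", "ramp_length", "cowl_height"]
design = list(itertools.product((-1, 1), repeat=len(factors)))
for run, levels in enumerate(design, start=1):
    print(run, dict(zip(factors, levels)))  # 8 runs in total
```

The propulsion results at these runs are what the regression equations are fitted to before the vehicle-level optimization.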

  2. A decentralized linear quadratic control design method for flexible structures

    NASA Technical Reports Server (NTRS)

    Su, Tzu-Jeng; Craig, Roy R., Jr.

    1990-01-01

    A decentralized suboptimal linear quadratic control design procedure which combines substructural synthesis, model reduction, decentralized control design, subcontroller synthesis, and controller reduction is proposed for the design of reduced-order controllers for flexible structures. The procedure starts with a definition of the continuum structure to be controlled. An evaluation model of finite dimension is obtained by the finite element method. Then, the finite element model is decomposed into several substructures by using a natural decomposition called substructuring decomposition. Each substructure, at this point, still has too large a dimension and must be reduced to a size that is Riccati-solvable. Model reduction of each substructure can be performed by using any existing model reduction method, e.g., modal truncation, balanced reduction, Krylov model reduction, or the mixed-mode method. Then, based on the reduced substructure model, a subcontroller is designed by an LQ optimal control method for each substructure independently. After all subcontrollers are designed, a controller synthesis method called substructural controller synthesis (SCS) is employed to synthesize all subcontrollers into a global controller. The assembling scheme used is the same as that employed for the structure matrices. Finally, a controller reduction scheme, called the equivalent impulse response energy controller (EIREC) reduction algorithm, is used to reduce the global controller to a reasonable size for implementation. The EIREC reduced controller preserves the impulse response energy of the full-order controller and has the property of matching low-frequency moments and low-frequency power moments. An advantage of the SCS method is that it relieves the computational burden associated with dimensionality. In addition, the SCS design scheme is a highly adaptable controller synthesis method for structures with varying configuration, or varying mass

  3. Design and descriptive epidemiology of the Infectious Diseases of East African Livestock (IDEAL) project, a longitudinal calf cohort study in western Kenya

    PubMed Central

    2013-01-01

    Background There is a widely recognised lack of baseline epidemiological data on the dynamics and impacts of infectious cattle diseases in east Africa. The Infectious Diseases of East African Livestock (IDEAL) project is an epidemiological study of cattle health in western Kenya with the aim of providing baseline epidemiological data, investigating the impact of different infections on key responses such as growth, mortality and morbidity, the additive and/or multiplicative effects of co-infections, and the influence of management and genetic factors. A longitudinal cohort study of newborn calves was conducted in western Kenya between 2007-2009. Calves were randomly selected from all those reported in a two-stage clustered sampling strategy. Calves were recruited between 3 and 7 days old. A team of veterinarians and animal health assistants carried out 5-weekly clinical visits as well as postmortem visits. Blood and tissue samples were collected at all visits and screened using a range of laboratory based diagnostic methods for over 100 different pathogens or infectious exposures. Results The study followed the 548 calves over the first 51 weeks of life, or until death, and whenever they were reported clinically ill. The cohort experienced a high all-cause mortality rate of 16%, with at least 13% of deaths due to infectious diseases. Only 307 (6%) of routine visits were classified as clinical episodes, with a further 216 reported by farmers. 54% of calves reached one year without a reported clinical episode. Mortality was mainly due to east coast fever, haemonchosis, and heartwater. Over 50 pathogens were detected in this population, with exposure to a further 6 viruses and bacteria. Conclusion The IDEAL study has demonstrated that it is possible to mount population based longitudinal animal studies. The results quantify for the first time in an animal population the high diversity of pathogens a population may have to deal with and the levels of co-infections with key

  4. The conditional risk probability-based seawall height design method

    NASA Astrophysics Data System (ADS)

    Yang, Xing; Hu, Xiaodong; Li, Zhiqing

    2015-11-01

    The determination of the required seawall height is usually based on the combination of wind speed (or wave height) and still water level according to a specified return period, e.g., 50-year return period wind speed and 50-year return period still water level. In reality, the two variables may be only partially correlated, which can lead to over-design (excess cost) of seawall structures. The return period used for the design of a seawall depends on the economy, society and natural environment of the region, which means that a specified risk level of overtopping or damage of a seawall structure is usually allowed. The aim of this paper is to present a conditional risk probability-based seawall height design method which incorporates the correlation of the two variables. For purposes of demonstration, wind speeds and water levels collected from Jiangsu, China are analyzed. The results show this method can improve seawall height design accuracy.
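The influence of correlation on joint exceedance, the crux of the method, can be illustrated with a small Monte Carlo sketch using standard-normal margins and a hypothetical correlation of 0.6 (not values from the Jiangsu data):

```python
import math
import random

random.seed(42)

def joint_exceedance(rho, q=2.054, n=200_000):
    """Estimate P(X > q and Y > q) for standard normal X, Y with
    correlation rho; q ~ the marginal 98th percentile (illustrative)."""
    hits = 0
    for _ in range(n):
        z1 = random.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
        if z1 > q and z2 > q:
            hits += 1
    return hits / n

p_indep = joint_exceedance(0.0)   # independence: roughly 0.02**2
p_corr = joint_exceedance(0.6)    # positive correlation: much more likely
print(p_indep, p_corr)
```

Treating correlated wind and water level as if each independently took its 50-year value misstates the true joint return period, which is exactly what the conditional-probability design corrects.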

  5. Mixed methods research design for pragmatic psychoanalytic studies.

    PubMed

    Tillman, Jane G; Clemence, A Jill; Stevens, Jennifer L

    2011-10-01

    Calls for more rigorous psychoanalytic studies have increased over the past decade. The field has been divided by those who assert that psychoanalysis is properly a hermeneutic endeavor and those who see it as a science. A comparable debate is found in research methodology, where qualitative and quantitative methods have often been seen as occupying orthogonal positions. Recently, Mixed Methods Research (MMR) has emerged as a viable "third community" of research, pursuing a pragmatic approach to research endeavors through integrating qualitative and quantitative procedures in a single study design. Mixed Methods Research designs and the terminology associated with this emerging approach are explained, after which the methodology is explored as a potential integrative approach to a psychoanalytic human science. Both qualitative and quantitative research methods are reviewed, as well as how they may be used in Mixed Methods Research to study complex human phenomena. PMID:21880844

  6. Rotordynamics and Design Methods of an Oil-Free Turbocharger

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.

    1999-01-01

    The feasibility of supporting a turbocharger rotor on air foil bearings is investigated based upon predicted rotordynamic stability, load accommodations, and stress considerations. It is demonstrated that foil bearings offer a plausible replacement for oil-lubricated bearings in diesel truck turbochargers. Also, two different rotor configurations are analyzed and the design is chosen which best optimizes the desired performance characteristics. The method of designing machinery for foil bearing use and the assumptions made are discussed.

  7. A novel method for reducing the number of agents to be studied in an occupational epidemiologic study.

    PubMed

    Pierce, Jennifer S; Esmen, Nurtan A

    2011-04-01

    A novel screening tool method to select chemicals for exposure reconstruction was developed and validated using data generated for a hypothetical work force consisting of 10 job classes (ranging from 10,000 to 55,000 person-years). To achieve the required efficiency in the reconstruction of exposures, this method treats each product (defined as a part or process) as an "exposure." Exposure to 10 products was assigned to each job class at random using a computer program. The expected rate of a given disease was assumed to be constant throughout the job classes (tested at five levels), and the observed numbers of cases in the job classes were generated based on neutral deviations from background with error rates of ± 1% to 16%. One job class was assigned to be the "excess-class" and the number of cases in that class was increased by a factor of Q, which was set at levels that ranged from 1.25 to 5. All of the experimental conditions were replicated 10,000 times in a Monte Carlo scheme for scenarios in which each job class had been designated as the excess-class. Following each run, significant excesses (if any) were determined using a modified version of Daniel's method, and the percentages of false positive and false negative identifications were tabulated. We found that the sensitivity of the method is largely dependent on the relative risk (Q) associated with the exposure. Specifically, the results indicate that as the relative risk increases, the percentage of false negative identifications of the excesses is reduced to nearly 0% and the percentage of false positive identifications is approximately 13%. When applied to real data, should an association be detected between any product and a health outcome, this preliminary analysis will yield a reduced "product" set that can then be investigated in detail and the agents involved considered further for quantitative reconstruction. The proposed method is highly efficient and has the potential to benefit future
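A drastically simplified analogue of the replication scheme above (flagging the class with the largest count instead of applying the modified Daniel's method; all rates are hypothetical) shows how the miss rate depends on the relative risk Q:

```python
import math
import random

random.seed(1)

def miss_rate(q=3.0, n_classes=10, base_rate=20.0, reps=1000):
    """Fraction of replications in which the inflated 'excess' class is NOT
    the one flagged. Flagging here is simply 'class with the largest count',
    a crude stand-in for the modified Daniel's method."""
    misses = 0
    for _ in range(reps):
        excess = random.randrange(n_classes)
        counts = []
        for j in range(n_classes):
            lam = base_rate * (q if j == excess else 1.0)
            # Poisson draw via Knuth's algorithm (stdlib only)
            limit, k, p = math.exp(-lam), 0, 1.0
            while p > limit:
                k += 1
                p *= random.random()
            counts.append(k - 1)
        if max(range(n_classes), key=counts.__getitem__) != excess:
            misses += 1
    return misses / reps

print(miss_rate(q=3.0))  # near 0: large relative risks are almost never missed
```

As in the paper, sensitivity is driven by Q: at Q near 1 the excess class is indistinguishable from background, while at larger Q the false-negative rate collapses toward zero.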

  8. Design of large Francis turbine using optimal methods

    NASA Astrophysics Data System (ADS)

    Flores, E.; Bornard, L.; Tomas, L.; Liu, J.; Couston, M.

    2012-11-01

    Among a high number of Francis turbine references all over the world, covering the whole market range of heads, Alstom has especially been involved in the development and equipment of the largest power plants in the world: Three Gorges (China - 32×767 MW - 61 to 113 m), Itaipu (Brazil - 20×750 MW - 98.7 to 127 m) and Xiangjiaba (China - 8×812 MW - 82.5 to 113.6 m - in erection). Many new projects are under study to equip new power plants with Francis turbines in order to answer an increasing demand for renewable energy. In this context, Alstom Hydro is carrying out many developments to answer those needs, especially for jumbo units such as the planned 1 GW units in China. The turbine design for such units requires specific care, using the state of the art in computation methods and the latest technologies in model testing, as well as the maximum feedback from operation of jumbo plants already in service. We present in this paper how a large Francis turbine can be designed using specific design methods, including global and local optimization methods. The spiral case, the tandem cascade profiles, the runner and the draft tube are designed with optimization loops involving a blade design tool, an automatic meshing software and a Navier-Stokes solver, piloted by a genetic algorithm. These automated optimization methods, presented in different papers over the last decade, are nowadays widely used thanks to the growing computation capacity of HPC clusters: the intensive use of such optimization methods at the turbine design stage allows very high levels of performance to be reached, while the hydraulic flow characteristics are carefully studied over the whole water passage to avoid any unexpected hydraulic phenomena.
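An optimization loop of the kind described, a genetic algorithm piloting the design tools, can be sketched by replacing the meshing tool and Navier-Stokes solver with a cheap surrogate objective (all names and values below are hypothetical illustrations):

```python
import random

random.seed(0)

def efficiency_loss(x):
    """Hypothetical surrogate for hydraulic losses vs. two normalized blade
    parameters; a real loop would call the mesher and flow solver here."""
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2

def genetic_optimize(fitness, n_pop=40, n_gen=60, n_dim=2):
    pop = [[random.random() for _ in range(n_dim)] for _ in range(n_pop)]
    for _ in range(n_gen):
        pop.sort(key=fitness)
        parents = pop[: n_pop // 2]            # selection: keep the best half
        children = []
        while len(children) < n_pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(n_dim)      # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:          # mutation, clipped to [0, 1]
                i = random.randrange(n_dim)
                child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.05)))
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = genetic_optimize(efficiency_loss)
print(best)  # converges near the surrogate optimum (0.3, 0.7)
```

In the industrial loop each fitness evaluation is a meshed CFD run, which is why the growing HPC capacity the abstract mentions is what made this practice routine.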

  9. The C8 Health Project: Design, Methods, and Participants

    PubMed Central

    Frisbee, Stephanie J.; Brooks, A. Paul; Maher, Arthur; Flensborg, Patsy; Arnold, Susan; Fletcher, Tony; Steenland, Kyle; Shankar, Anoop; Knox, Sarah S.; Pollard, Cecil; Halverson, Joel A.; Vieira, Verónica M.; Jin, Chuanfang; Leyden, Kevin M.; Ducatman, Alan M.

    2009-01-01

    Background The C8 Health Project was created, authorized, and funded as part of the settlement agreement reached in the case of Jack W. Leach, et al. v. E.I. du Pont de Nemours & Company (no. 01-C-608 W.Va., Wood County Circuit Court, filed 10 April 2002). The settlement stemmed from the perfluorooctanoic acid (PFOA, or C8) contamination of drinking water in six water districts in two states near the DuPont Washington Works facility near Parkersburg, West Virginia. Objectives This study reports on the methods and results from the C8 Health Project, a population study created to gather data that would allow class members to know their own PFOA levels and permit subsequent epidemiologic investigations. Methods Final study participation was 69,030, enrolled over a 13-month period in 2005–2006. Extensive data were collected, including demographic data, medical diagnoses (both self-report and medical records review), clinical laboratory testing, and determination of serum concentrations of 10 perfluorocarbons (PFCs). Here we describe the processes used to collect, validate, and store these health data. We also describe survey participants and their serum PFC levels. Results The population geometric mean for serum PFOA was 32.91 ng/mL, 500% higher than previously reported for a representative American population. Serum concentrations for perfluorohexane sulfonate and perfluorononanoic acid were elevated 39% and 73% respectively, whereas perfluorooctanesulfonate was present at levels similar to those in the U.S. population. Conclusions This largest known population study of community PFC exposure permits new evaluations of associations between PFOA, in particular, and a range of health parameters. These will contribute to understanding of the biology of PFC exposure. The C8 Health Project also represents an unprecedented effort to gather basic data on an exposed population; its achievements and limitations can inform future legal settlements for populations exposed to

  10. A finite-difference method for transonic airfoil design.

    NASA Technical Reports Server (NTRS)

    Steger, J. L.; Klineberg, J. M.

    1972-01-01

    This paper describes an inverse method for designing transonic airfoil sections or for modifying existing profiles. Mixed finite-difference procedures are applied to the equations of transonic small disturbance theory to determine the airfoil shape corresponding to a given surface pressure distribution. The equations are solved for the velocity components in the physical domain and flows with embedded shock waves can be calculated. To facilitate airfoil design, the method allows alternating between inverse and direct calculations to obtain a profile shape that satisfies given geometric constraints. Examples are shown of the application of the technique to improve the performance of several lifting airfoil sections. The extension of the method to three dimensions for designing supercritical wings is also indicated.

  11. Improved method for transonic airfoil design-by-optimization

    NASA Technical Reports Server (NTRS)

    Kennelly, R. A., Jr.

    1983-01-01

    An improved method for use of optimization techniques in transonic airfoil design is demonstrated. FLO6QNM incorporates a modified quasi-Newton optimization package, and is shown to be more reliable and efficient than the method developed previously at NASA-Ames, which used the COPES/CONMIN optimization program. The design codes are compared on a series of test cases with known solutions, and the effects of problem scaling, proximity of initial point to solution, and objective function precision are studied. In contrast to the older method, well-converged solutions are shown to be attainable in the context of engineering design using computational fluid dynamics tools, a new result. The improvements are due to better performance by the optimization routine and to the use of problem-adaptive finite difference step sizes for gradient evaluation.

  13. Computational methods of robust controller design for aerodynamic flutter suppression

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1981-01-01

    The development of Riccati iteration, a tool for the design and analysis of linear control systems is examined. First, Riccati iteration is applied to the problem of pole placement and order reduction in two-time scale control systems. Order reduction, yielding a good approximation to the original system, is demonstrated using a 16th order linear model of a turbofan engine. Next, a numerical method for solving the Riccati equation is presented and demonstrated for a set of eighth order random examples. A literature review of robust controller design methods follows which includes a number of methods for reducing the trajectory and performance index sensitivity in linear regulators. Lastly, robust controller design for large parameter variations is discussed.
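
Riccati iteration of the kind described can be sketched with the classical Kleinman (Newton) iteration, which reduces the algebraic Riccati equation to a sequence of Lyapunov solves. The matrices below are made-up illustrative data, not the paper's turbofan or random eighth-order examples.

```python
import numpy as np
from scipy.linalg import solve_continuous_are, solve_continuous_lyapunov

def kleinman_riccati(A, B, Q, R, iters=25):
    """Newton (Kleinman) form of Riccati iteration for
    A'P + PA - P B R^{-1} B' P + Q = 0.
    Each step solves one Lyapunov equation; K = 0 is a valid
    starting gain when A itself is stable."""
    K = np.zeros((B.shape[1], A.shape[0]))
    for _ in range(iters):
        Ac = A - B @ K
        # Lyapunov step: Ac' P + P Ac = -(Q + K' R K)
        P = solve_continuous_lyapunov(Ac.T, -(Q + K.T @ R @ K))
        K = np.linalg.solve(R, B.T @ P)
    return P

# Made-up stable 2nd-order system (the paper uses 8th/16th-order models).
A = np.array([[-1.0, 1.0], [0.0, -2.0]])
B = np.array([[0.0], [1.0]])
Q = np.eye(2)
R = np.array([[1.0]])

P = kleinman_riccati(A, B, Q, R)
P_ref = solve_continuous_are(A, B, Q, R)   # direct solver, for comparison
assert np.allclose(P, P_ref, atol=1e-8)
```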

  14. A method for the probabilistic design assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    A formal procedure for the probabilistic design assessment of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the assessment. The probabilistic assessment consists of design criteria, modeling of composite structures and uncertainties, simulation methods, and the decision making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically assessed with accuracy and efficiency.
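
A minimal sketch of the simulation step of such a probabilistic assessment, using a hypothetical cantilever coupon with made-up uncertainty data (the paper's actual design criteria and composite micromechanics models are far richer):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Hypothetical uncertain inputs (constituent properties, fabrication
# variables, service loads) -- made-up distributions for illustration:
E = rng.normal(70e9, 3.5e9, N)        # modulus [Pa]
t = rng.normal(8e-3, 0.2e-3, N)       # thickness [m], fabrication scatter
P = rng.normal(100.0, 10.0, N)        # tip load [N], service environment
L, w = 0.3, 0.03                      # fixed geometry [m]

I = w * t**3 / 12.0                   # second moment of area
delta = P * L**3 / (3.0 * E * I)      # structural response: tip deflection

# Design criterion: deflection must stay below 12 mm.
p_fail = np.mean(delta > 0.012)
print(f"estimated P(deflection > 12 mm) = {p_fail:.4f}")
```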

  15. Structure design: an artificial intelligence-based method for the design of molecules under geometrical constraints.

    PubMed

    Cohen, A A; Shatzmiller, S E

    1993-09-01

This study presents an algorithm that implements artificial-intelligence techniques for automated, site-directed drug design. The aim of the method is to link two or more predetermined functional groups into a sensible molecular structure. The proposed design process mimics the classical manual design method, in which the drug designer sits in front of the computer screen and, with the aid of computer graphics, attempts to design the new drug. Accordingly, the key principle of the algorithm is the parameterization of the criteria that affect the decision-making process carried out by the drug designer. This parameterization is based on the generation of weighting factors that reflect the knowledge and knowledge-based intuition of the drug designer, and thus add further rationalization to the drug design process. The proposed algorithm has been shown to yield a large variety of different structures, from which the drug designer may choose the most sensible. Performance tests indicate that with the proper set of parameters, the method generates a new structure within a short time. PMID:8110662

  16. Inverse design of airfoils using a flexible membrane method

    NASA Astrophysics Data System (ADS)

    Thinsurat, Kamon

The Modified Garabedian-McFadden (MGM) method is used to inversely design airfoils. A Finite Difference Method (FDM) for non-uniform grids was developed to discretize the MGM equation for numerical solution; it has the advantage that it can be applied flexibly to unstructured airfoil grids. The commercial software FLUENT is used as the flow solver. Several conditions are set in FLUENT, such as subsonic inviscid flow, subsonic viscous flow, transonic inviscid flow, and transonic viscous flow, to test the inverse design code under each condition. A moving-grid program is used to create a mesh for each new airfoil prior to importing meshes into FLUENT for the analysis of flows. For validation, an iterative process is used so that the Cp distribution of the initial airfoil, the NACA0011, achieves the Cp distribution of the target airfoil, the NACA2315, for the subsonic inviscid case at M=0.2. Three other cases were carried out to validate the code. After the code validations, the inverse design method was used to design a shock-free airfoil in the transonic condition and to design a separation-free airfoil at a high angle of attack in the subsonic condition.
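
A second-order first-derivative stencil on a non-uniform grid, of the kind such an FDM discretization relies on, can be sketched as follows (a generic Taylor-series construction, not the thesis's actual code):

```python
import numpy as np

def d1_nonuniform(x, f):
    """Second-order first derivative at interior points of a non-uniform
    grid, from Taylor expansions with unequal spacings h_minus, h_plus."""
    hm = x[1:-1] - x[:-2]    # spacing to the left of node i
    hp = x[2:] - x[1:-1]     # spacing to the right of node i
    return (hm**2 * f[2:] - hp**2 * f[:-2] + (hp**2 - hm**2) * f[1:-1]) / (
        hm * hp * (hm + hp))

x = np.array([0.0, 0.1, 0.25, 0.45, 0.7, 1.0])   # clustered toward one end
f = x**2
dfdx = d1_nonuniform(x, f)
assert np.allclose(dfdx, 2 * x[1:-1])   # the stencil is exact for quadratics
```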

  17. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Phase 1

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas

    1998-01-01

    The NASA Langley Multidisciplinary Design Optimization (MDO) method evaluation study seeks to arrive at a set of guidelines for using promising MDO methods by accumulating and analyzing computational data for such methods. The data are collected by conducting a series of reproducible experiments. This report documents all computational experiments conducted in Phase I of the study. This report is a companion to the paper titled Initial Results of an MDO Method Evaluation Study by N. M. Alexandrov and S. Kodiyalam (AIAA-98-4884).

  18. Exploration of Advanced Probabilistic and Stochastic Design Methods

    NASA Technical Reports Server (NTRS)

    Mavris, Dimitri N.

    2003-01-01

The primary objective of the three year research effort was to explore advanced, non-deterministic aerospace system design methods that may have relevance to designers and analysts. The research pursued emerging areas in design methodology and leveraged current fundamental research in the area of design decision-making, probabilistic modeling, and optimization. The specific focus of the three year investigation was oriented toward methods to identify and analyze emerging aircraft technologies in a consistent and complete manner, and to explore means to make optimal decisions based on this knowledge in a probabilistic environment. The research efforts were classified into two main areas. First, Task A of the grant has had the objective of conducting research into the relative merits of possible approaches that account for both multiple criteria and uncertainty in design decision-making. In particular, in the final year of research, the focus was on the comparison and contrasting between three methods researched. Specifically, these three are the Joint Probabilistic Decision-Making (JPDM) technique, Physical Programming, and Dempster-Shafer (D-S) theory. The next element of the research, as contained in Task B, was focused upon exploration of the Technology Identification, Evaluation, and Selection (TIES) methodology developed at ASDL, especially with regards to identification of research needs in the baseline method through implementation exercises. The end result of Task B was the documentation of the evolution of the method with time and a technology transfer to the sponsor regarding the method, such that an initial capability for execution could be obtained by the sponsor. Specifically, the results of year 3 efforts were the creation of a detailed tutorial for implementing the TIES method. Within the tutorial package, templates and detailed examples were created for learning and understanding the details of each step. For both research tasks, sample files and

  19. A PDE Sensitivity Equation Method for Optimal Aerodynamic Design

    NASA Technical Reports Server (NTRS)

    Borggaard, Jeff; Burns, John

    1996-01-01

    The use of gradient based optimization algorithms in inverse design is well established as a practical approach to aerodynamic design. A typical procedure uses a simulation scheme to evaluate the objective function (from the approximate states) and its gradient, then passes this information to an optimization algorithm. Once the simulation scheme (CFD flow solver) has been selected and used to provide approximate function evaluations, there are several possible approaches to the problem of computing gradients. One popular method is to differentiate the simulation scheme and compute design sensitivities that are then used to obtain gradients. Although this black-box approach has many advantages in shape optimization problems, one must compute mesh sensitivities in order to compute the design sensitivity. In this paper, we present an alternative approach using the PDE sensitivity equation to develop algorithms for computing gradients. This approach has the advantage that mesh sensitivities need not be computed. Moreover, when it is possible to use the CFD scheme for both the forward problem and the sensitivity equation, then there are computational advantages. An apparent disadvantage of this approach is that it does not always produce consistent derivatives. However, for a proper combination of discretization schemes, one can show asymptotic consistency under mesh refinement, which is often sufficient to guarantee convergence of the optimal design algorithm. In particular, we show that when asymptotically consistent schemes are combined with a trust-region optimization algorithm, the resulting optimal design method converges. We denote this approach as the sensitivity equation method. The sensitivity equation method is presented, convergence results are given and the approach is illustrated on two optimal design problems involving shocks.
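
The sensitivity-equation idea can be illustrated on a scalar ODE rather than a PDE: differentiating the state equation with respect to a design parameter yields a second equation that is integrated alongside the flow, giving the gradient without any mesh sensitivities. The model and values below are made up for the sketch.

```python
import numpy as np
from scipy.integrate import solve_ivp

# State equation: x' = -a x, x(0) = x0.  Differentiating w.r.t. the design
# parameter a gives the sensitivity equation for s = dx/da:
#   s' = -a s - x,  s(0) = 0.
a, x0, T = 1.5, 2.0, 1.0   # made-up values

def rhs(t, y):
    x, s = y
    return [-a * x, -a * s - x]   # state and sensitivity, integrated together

sol = solve_ivp(rhs, (0.0, T), [x0, 0.0], rtol=1e-10, atol=1e-12)
x_T, s_T = sol.y[:, -1]

# Analytic check: x(T) = x0 exp(-aT) and dx/da = -T x0 exp(-aT).
assert np.isclose(x_T, x0 * np.exp(-a * T), atol=1e-7)
assert np.isclose(s_T, -T * x0 * np.exp(-a * T), atol=1e-7)
```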

  20. Taguchi method of experimental design in materials education

    NASA Technical Reports Server (NTRS)

    Weiser, Martin W.

    1993-01-01

    Some of the advantages and disadvantages of the Taguchi Method of experimental design as applied to Materials Science will be discussed. This is a fractional factorial method that employs the minimum number of experimental trials for the information obtained. The analysis is also very simple to use and teach, which is quite advantageous in the classroom. In addition, the Taguchi loss function can be easily incorporated to emphasize that improvements in reproducibility are often at least as important as optimization of the response. The disadvantages of the Taguchi Method include the fact that factor interactions are normally not accounted for, there are zero degrees of freedom if all of the possible factors are used, and randomization is normally not used to prevent environmental biasing. In spite of these disadvantages it is felt that the Taguchi Method is extremely useful for both teaching experimental design and as a research tool, as will be shown with a number of brief examples.
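
As a classroom-sized sketch of the fractional-factorial idea, an L4 orthogonal array studies three two-level factors in four trials instead of the eight a full factorial would need. The responses below are made up for illustration.

```python
import numpy as np

# L4 orthogonal array: 3 two-level factors in 4 trials
# (a full factorial would need 2**3 = 8 trials).
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])

# Made-up measured responses for the 4 trials (larger is better):
y = np.array([20.0, 24.0, 28.0, 35.0])

# Taguchi "larger-is-better" signal-to-noise ratio per trial:
sn = -10 * np.log10(1.0 / y**2)

# Main effect of each factor = mean S/N at level 1 minus mean at level 0.
effects = []
for j in range(L4.shape[1]):
    effects.append(sn[L4[:, j] == 1].mean() - sn[L4[:, j] == 0].mean())
    print(f"factor {j}: {effects[-1]:+.2f} dB")
```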

  1. Function combined method for design innovation of children's bike

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoli; Qiu, Tingting; Chen, Huijuan

    2013-03-01

As children mature, bike products for children in the market develop at the same time, and designs are frequently updated. Certain problems occur when using a bike, such as cycle overlapping, repeated functions, and short life cycles, which go against the principles of energy conservation, environmental protection, and the intensive design concept. In this paper, a rational multi-function design method based on functional superposition, transformation, and technical implementation is proposed. An organic combination of a frog-style scooter and a children's tricycle is developed using the multi-function method. From the ergonomic perspective, the paper elaborates on the body sizes of children aged 5 to 12 and effectively extracts data for a multi-function children's bike that can be used for both gliding and riding. By inverting the body, parts can be interchanged between the handles and the pedals of the bike. Finally, the paper provides a detailed analysis of the components and structural design, body material, and processing technology of the bike. This study of industrial product innovation design provides an effective design method that solves the bicycle problems, extends product function, improves the product's market situation, and enhances energy saving while implementing intensive product development effectively.

  2. Novel kind of DSP design method based on IP core

    NASA Astrophysics Data System (ADS)

    Yu, Qiaoyan; Liu, Peng; Wang, Weidong; Hong, Xiang; Chen, Jicheng; Yuan, Jianzhong; Chen, Keming

    2004-04-01

With pressure from design-productivity demands and various special applications, the original design method for DSPs can no longer keep up with the required speed; a novel design method is needed urgently. Intellectual Property (IP) reuse is a trend in DSP design, but simple plug-and-play IP-core approaches almost never work. Therefore, appropriate control strategies are needed to connect all the IP cores used and coordinate the whole DSP. This paper presents a new DSP design procedure, which draws on System-on-a-Chip practice, and then introduces a novel control strategy named DWC to implement the DSP based on IP cores. The most important part of this control strategy, the pipeline control unit (PCU), is described in detail. Because a great number of data hazards occur in most computation-intensive scientific applications, a new effective algorithm for checking data hazards is employed in the PCU. Following this strategy, the design of a general- or special-purpose DSP can be finished in a shorter time, and the DSP has the potential to improve performance with little modification to its basic function units. The DWC strategy has been implemented successfully in a 16-bit fixed-point DSP.
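
The kind of read-after-write (RAW) check a pipeline control unit performs can be sketched as follows; the window depth and instruction encoding here are assumptions for illustration, not the paper's actual PCU algorithm.

```python
def raw_hazards(instructions, depth=3):
    """Flag read-after-write (RAW) hazards between instructions issued
    within `depth` slots of each other (an assumed pipeline overlap).
    Each instruction is (dest_reg, [src_regs])."""
    hazards = []
    for i, (dest, _) in enumerate(instructions):
        for j in range(i + 1, min(i + depth, len(instructions))):
            if dest in instructions[j][1]:
                hazards.append((i, j, dest))
    return hazards

prog = [("r1", ["r2", "r3"]),   # r1 <- r2 + r3
        ("r4", ["r1", "r5"]),   # reads r1 one slot later -> hazard
        ("r6", ["r7", "r8"]),
        ("r9", ["r1", "r6"])]   # r6 read one slot after write -> hazard;
                                # r1 is outside the assumed 3-slot window
hazards = raw_hazards(prog)
print(hazards)   # [(0, 1, 'r1'), (2, 3, 'r6')]
```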

  4. A Simple Method for High-Lift Propeller Conceptual Design

    NASA Technical Reports Server (NTRS)

    Patterson, Michael; Borer, Nick; German, Brian

    2016-01-01

    In this paper, we present a simple method for designing propellers that are placed upstream of the leading edge of a wing in order to augment lift. Because the primary purpose of these "high-lift propellers" is to increase lift rather than produce thrust, these props are best viewed as a form of high-lift device; consequently, they should be designed differently than traditional propellers. We present a theory that describes how these props can be designed to provide a relatively uniform axial velocity increase, which is hypothesized to be advantageous for lift augmentation based on a literature survey. Computational modeling indicates that such propellers can generate the same average induced axial velocity while consuming less power and producing less thrust than conventional propeller designs. For an example problem based on specifications for NASA's Scalable Convergent Electric Propulsion Technology and Operations Research (SCEPTOR) flight demonstrator, a propeller designed with the new method requires approximately 15% less power and produces approximately 11% less thrust than one designed for minimum induced loss. Higher-order modeling and/or wind tunnel testing are needed to verify the predicted performance.
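
The induced axial velocity the method is built around can be estimated from classical actuator-disk momentum theory; the numbers below are illustrative only, not SCEPTOR's specifications.

```python
import math

def induced_velocity(T, V, rho, R):
    """Actuator-disk momentum theory: T = 2 rho A (V + v_i) v_i,
    solved for the induced axial velocity v_i at the disk."""
    A = math.pi * R**2
    return -V / 2.0 + math.sqrt((V / 2.0)**2 + T / (2.0 * rho * A))

# Made-up numbers: 200 N thrust, 25 m/s freestream, sea level, 0.25 m radius.
T, V, rho, R = 200.0, 25.0, 1.225, 0.25
vi = induced_velocity(T, V, rho, R)
P_induced = T * (V + vi)   # ideal induced power
print(f"v_i = {vi:.2f} m/s, ideal induced power = {P_induced:.0f} W")
```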

  5. System Synthesis in Preliminary Aircraft Design Using Statistical Methods

    NASA Technical Reports Server (NTRS)

    DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.

    1996-01-01

    This paper documents an approach to conceptual and early preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically Design of Experiments (DOE) and Response Surface Methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than generally available in early design stages. Next, a regression equation for an Overall Evaluation Criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a High Speed Civil Transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
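
The DOE/RSM step can be sketched generically: run the analysis at designed points, fit a quadratic regression polynomial, and let the system-level optimizer query the cheap surrogate instead. The analysis function here is a made-up stand-in for a disciplinary code.

```python
import numpy as np

# 3x3 grid of design points in two coded variables (a small DOE table).
X = np.array([[x1, x2] for x1 in (-1, 0, 1) for x2 in (-1, 0, 1)], dtype=float)

def analysis(x):
    """Made-up stand-in for an expensive disciplinary analysis."""
    x1, x2 = x
    return 5.0 - (x1 - 0.3)**2 - 2.0 * (x2 + 0.2)**2 + 0.5 * x1 * x2

y = np.array([analysis(x) for x in X])

# Fit a full quadratic response surface: 1, x1, x2, x1^2, x2^2, x1*x2.
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0]**2, X[:, 1]**2, X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# The surrogate reproduces this quadratic test function exactly, so it can
# replace the analysis inside a system-level optimization loop.
assert np.allclose(A @ coef, y, atol=1e-9)
```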

  6. New displacement-based methods for optimal truss topology design

    NASA Technical Reports Server (NTRS)

    Bendsoe, Martin P.; Ben-Tal, Aharon; Haftka, Raphael T.

    1991-01-01

    Two alternate methods for maximum stiffness truss topology design are presented. The ground structure approach is used, and the problem is formulated in terms of displacements and bar areas. This large, nonconvex optimization problem can be solved by a simultaneous analysis and design approach. Alternatively, an equivalent, unconstrained, and convex problem in the displacements only can be formulated, and this problem can be solved by a nonsmooth, steepest descent algorithm. In both methods, the explicit solving of the equilibrium equations and the assembly of the global stiffness matrix are circumvented. A large number of examples have been studied, showing the attractive features of topology design as well as exposing interesting features of optimal topologies.

  7. New Methods and Transducer Designs for Ultrasonic Diagnostics and Therapy

    NASA Astrophysics Data System (ADS)

    Rybyanets, A. N.; Naumenko, A. A.; Sapozhnikov, O. A.; Khokhlova, V. A.

Recent advances in the fields of physical acoustics, imaging technologies, piezoelectric materials, and ultrasonic transducer design have led to the emergence of novel methods and apparatus for ultrasonic diagnostics, therapy and body aesthetics. The paper presents results on the development and experimental study of different high-intensity focused ultrasound (HIFU) transducers. Technological peculiarities of HIFU transducer design, as well as theoretical and numerical models of such transducers and the corresponding HIFU fields, are discussed. Several HIFU transducers of different designs have been fabricated using different advanced piezoelectric materials. Acoustic field measurements for those transducers have been performed using a calibrated fiber-optic hydrophone and an ultrasonic measurement system (UMS). The results of ex vivo experiments with different tissues, as well as in vivo experiments with blood vessels, are presented that prove the efficacy, safety and selectivity of the developed HIFU transducers and methods.

  8. Using Propensity Score Methods to Approximate Factorial Experimental Designs

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2011-01-01

    The purpose of this study is through Monte Carlo simulation to compare several propensity score methods in approximating factorial experimental design and identify best approaches in reducing bias and mean square error of parameter estimates of the main and interaction effects of two factors. Previous studies focused more on unbiased estimates of…

  9. Impact design methods for ceramic components in gas turbine engines

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1991-01-01

    Methods currently under development to design ceramic turbine components with improved impact resistance are presented. Two different modes of impact damage are identified and characterized, i.e., structural damage and local damage. The entire computation is incorporated into the EPIC computer code. Model capability is demonstrated by simulating instrumented plate impact and particle impact tests.

  10. Analytical methods of electrode design for a relativistic electron gun

    SciTech Connect

    Caporaso, G.J.; Cole, A.G.; Boyd, J.K.

    1985-05-09

    The standard paraxial ray equation method for the design of electrodes for an electrostatically focused gun is extended to include relativistic effects and the effects of the beam's azimuthal magnetic field. Solutions for parallel and converging beams are obtained and the predicted currents are compared against those measured on the High Brightness Test Stand. 4 refs., 2 figs.

  11. Designs and Methods in School Improvement Research: A Systematic Review

    ERIC Educational Resources Information Center

    Feldhoff, Tobias; Radisch, Falk; Bischof, Linda Marie

    2016-01-01

    Purpose: The purpose of this paper is to focus on challenges faced by longitudinal quantitative analyses of school improvement processes and offers a systematic literature review of current papers that use longitudinal analyses. In this context, the authors assessed designs and methods that are used to analyze the relation between school…

  12. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150. ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Designation of noise description...

  13. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150. ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Designation of noise description...

  14. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150. ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Designation of noise description...

  15. Comparison of optimal design methods in inverse problems

    NASA Astrophysics Data System (ADS)

    Banks, H. T.; Holm, K.; Kappel, F.

    2011-07-01

Typical optimal design methods for inverse or parameter estimation problems are designed to choose optimal sampling distributions through minimization of a specific cost function related to the resulting error in parameter estimates. It is hoped that the inverse problem will produce parameter estimates with increased accuracy using data collected according to the optimal sampling distribution. Here we formulate the classical optimal design problem in the context of general optimization problems over distributions of sampling times. We present a new Prohorov metric-based theoretical framework that permits one to treat succinctly and rigorously any optimal design criteria based on the Fisher information matrix. A fundamental approximation theory is also included in this framework. A new optimal design, SE-optimal design (standard error optimal design), is then introduced in the context of this framework. We compare this new design criterion with the more traditional D-optimal and E-optimal designs. The optimal sampling distributions from each design are used to compute and compare standard errors; the standard errors for parameters are computed using asymptotic theory or bootstrapping and the optimal mesh. We use three examples to illustrate ideas: the Verhulst-Pearl logistic population model (Banks H T and Tran H T 2009 Mathematical and Experimental Modeling of Physical and Biological Processes (Boca Raton, FL: Chapman and Hall/CRC)), the standard harmonic oscillator model (Banks H T and Tran H T 2009) and a popular glucose regulation model (Bergman R N, Ider Y Z, Bowden C R and Cobelli C 1979 Am. J. Physiol. 236 E667-77; De Gaetano A and Arino O 2000 J. Math. Biol. 40 136-68; Toffolo G, Bergman R N, Finegood D T, Bowden C R and Cobelli C 1980 Diabetes 29 979-90).
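
A sketch of how a Fisher-information-based design criterion is evaluated, using the Verhulst-Pearl logistic model mentioned above with made-up nominal parameters (the paper's Prohorov metric framework is far more general):

```python
import numpy as np

def logistic(t, r, K, x0=1.0):
    """Verhulst-Pearl logistic solution x(t)."""
    return K * x0 * np.exp(r * t) / (K + x0 * (np.exp(r * t) - 1.0))

def fisher_info(times, r, K, sigma=1.0, h=1e-6):
    """FIM for least-squares estimation of (r, K): sum over sampling times
    of outer products of the model sensitivities (central differences)."""
    F = np.zeros((2, 2))
    for t in times:
        g = np.array([
            (logistic(t, r + h, K) - logistic(t, r - h, K)) / (2 * h),
            (logistic(t, r, K + h) - logistic(t, r, K - h)) / (2 * h),
        ])
        F += np.outer(g, g) / sigma**2
    return F

r, K = 0.7, 17.5    # made-up nominal parameter values
designs = {
    "uniform":   np.linspace(0.5, 10.0, 8),
    "clustered": np.concatenate([np.linspace(0.5, 4.0, 6), [8.0, 10.0]]),
}
dets = {name: np.linalg.det(fisher_info(t, r, K)) for name, t in designs.items()}
for name, d in dets.items():
    print(f"{name:9s}: det(FIM) = {d:.3e}   (larger is better for D-optimality)")
```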

  16. Polypharmacology: in silico methods of ligand design and development.

    PubMed

    McKie, Samuel A

    2016-04-01

How to design a ligand to bind multiple targets, rather than a single target, is the focus of this review. Rational polypharmacology draws on knowledge that is both broad ranging and hierarchical. Computer-aided multitarget ligand design methods are described according to their nested knowledge level. Ligand-only and then receptor-ligand strategies are first described; followed by the metabolic network viewpoint. Subsequently strategies that view infectious diseases as multigenomic targets are discussed, and finally the disease level interpretation of medicinal therapy is considered. As yet there is no consensus on how best to proceed in designing a multitarget ligand. The current methodologies are brought together in an attempt to give a practical overview of how polypharmacology design might be best initiated. PMID:27105127

  17. Examination of Different Exposure Metrics in an Epidemiological Study

    EPA Science Inventory

    Epidemiological studies of air pollution have traditionally relied upon measurements of ambient concentration from central-site monitoring stations as surrogates of population exposures. However, depending on the epidemiological study design, this approach may introduce exposure...

  18. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors will demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. PMID:27397810

  19. Computational methods for aerodynamic design using numerical optimization

    NASA Technical Reports Server (NTRS)

    Peeters, M. F.

    1983-01-01

    Five methods to increase the computational efficiency of aerodynamic design using numerical optimization, by reducing the computer time required to perform gradient calculations, are examined. The most promising method consists of drastically reducing the size of the computational domain on which aerodynamic calculations are made during gradient calculations. Since a gradient calculation requires the solution of the flow about an airfoil whose geometry was slightly perturbed from a base airfoil, the flow about the base airfoil is used to determine boundary conditions on the reduced computational domain. This method worked well in subcritical flow.

  20. A robust inverse inviscid method for airfoil design

    NASA Astrophysics Data System (ADS)

    Chaviaropoulos, P.; Dedoussis, V.; Papailiou, K. D.

An irrotational, inviscid, compressible inverse design method for two-dimensional airfoil profiles is described. The method is based on the potential-streamfunction formulation, in which the physical space on which the boundaries of the airfoil are sought is mapped onto the (phi, psi) space via a body-fitted coordinate transformation. A novel procedure based on differential-geometry arguments is employed to derive the governing equations for the inverse problem, by requiring the curvature of the flat 2-D Euclidean space to be zero. An auxiliary coordinate transformation permits the definition of C-type computational grids on the (phi, psi) plane, resulting in a more accurate description of the leading-edge region. The geometry is determined by integrating the Frenet equations along the grid lines. To validate the method, inverse calculation results are compared with direct ('reproduction') calculation results. The design procedure for a new airfoil shape is also presented.

  1. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  2. Phylogenetic Analyses of Shigella and Enteroinvasive Escherichia coli for the Identification of Molecular Epidemiological Markers: Whole-Genome Comparative Analysis Does Not Support Distinct Genera Designation

    PubMed Central

    Pettengill, Emily A.; Pettengill, James B.; Binet, Rachel

    2016-01-01

    As a leading cause of bacterial dysentery, Shigella represents a significant threat to public health and food safety. Related, but often overlooked, enteroinvasive Escherichia coli (EIEC) can also cause dysentery. Current typing methods have limited ability to identify and differentiate between these pathogens despite the need for rapid and accurate identification of pathogens for clinical treatment and outbreak response. We present a comprehensive phylogeny of Shigella and EIEC using whole genome sequencing of 169 samples, constituting unparalleled strain diversity, and observe a lack of monophyly between Shigella and EIEC and among Shigella taxonomic groups. The evolutionary relationships in the phylogeny are supported by analyses of population structure and hierarchical clustering patterns of translated gene homolog abundance. Lastly, we identified a panel of 404 single nucleotide polymorphism (SNP) markers specific to each phylogenetic cluster for more accurate identification of Shigella and EIEC. Our findings show that Shigella and EIEC are not distinct evolutionary groups within the E. coli genus and, thus, EIEC as a group is not the ancestor to Shigella. The multiple analyses presented provide evidence for reconsidering the taxonomic placement of Shigella. The SNP markers offer more discriminatory power to molecular epidemiological typing methods involving these bacterial pathogens. PMID:26834722

  4. Tuning Parameters in Heuristics by Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Arin, Arif; Rabadi, Ghaith; Unal, Resit

    2010-01-01

    With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into finding the best parameter setting for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, preliminary optimal solutions for multiple instances were obtained efficiently.
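
    The 2-level full factorial design mentioned above enumerates every combination of low/high parameter levels. A minimal sketch, with hypothetical GA parameter names and levels and a stand-in response function:

```python
from itertools import product

# Hypothetical GA parameters, each at a low/high level (a 2^k design, k = 3).
factors = {
    "population_size": (50, 200),
    "crossover_rate": (0.6, 0.9),
    "mutation_rate": (0.01, 0.1),
}

# Every combination of levels: 2**3 = 8 experimental runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

def response(setting):
    # Stand-in for "run the GA and measure total weighted tardiness";
    # a real study would execute the heuristic on benchmark instances
    # and fit a regression model to these responses.
    return setting["population_size"] * setting["mutation_rate"]

best = min(runs, key=response)
```

    A full DOE study would fit a regression model to the eight responses rather than simply picking the best run, but the enumeration of the design matrix is the same.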

  5. A Systematic Bayesian Integration of Epidemiological and Genetic Data

    PubMed Central

    Lau, Max S. Y.; Marion, Glenn; Streftaris, George; Gibson, Gavin

    2015-01-01

    Genetic sequence data on pathogens have great potential to inform inference of their transmission dynamics ultimately leading to better disease control. Where genetic change and disease transmission occur on comparable timescales additional information can be inferred via the joint analysis of such genetic sequence data and epidemiological observations based on clinical symptoms and diagnostic tests. Although recently introduced approaches represent substantial progress, for computational reasons they approximate genuine joint inference of disease dynamics and genetic change in the pathogen population, capturing partially the joint epidemiological-evolutionary dynamics. Improved methods are needed to fully integrate such genetic data with epidemiological observations, for achieving a more robust inference of the transmission tree and other key epidemiological parameters such as latent periods. Here, building on current literature, a novel Bayesian framework is proposed that infers simultaneously and explicitly the transmission tree and unobserved transmitted pathogen sequences. Our framework facilitates the use of realistic likelihood functions and enables systematic and genuine joint inference of the epidemiological-evolutionary process from partially observed outbreaks. Using simulated data it is shown that this approach is able to infer accurately joint epidemiological-evolutionary dynamics, even when pathogen sequences and epidemiological data are incomplete, and when sequences are available for only a fraction of exposures. These results also characterise and quantify the value of incomplete and partial sequence data, which has important implications for sampling design, and demonstrate the abilities of the introduced method to identify multiple clusters within an outbreak. The framework is used to analyse an outbreak of foot-and-mouth disease in the UK, enhancing current understanding of its transmission dynamics and evolutionary process. PMID:26599399
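
    The paper's framework performs data-augmented MCMC over transmission trees, which is far beyond a snippet. Purely as an illustration of the Bayesian updating machinery involved, here is conjugate Gamma-Poisson inference of a hypothetical "new infections per day" rate (prior parameters and counts are made up):

```python
# Conjugate Bayesian update: Poisson likelihood with a Gamma(shape, rate)
# prior on the infection rate. Posterior is Gamma(a0 + sum(x), b0 + n).
a0, b0 = 2.0, 1.0                  # hypothetical prior
counts = [3, 1, 4, 2, 5]           # hypothetical daily case counts

a_post = a0 + sum(counts)          # 2 + 15 = 17
b_post = b0 + len(counts)          # 1 + 5 = 6
posterior_mean = a_post / b_post   # posterior mean of the rate
```

    The actual framework has no such closed form: transmission trees and unobserved sequences must be sampled jointly by MCMC, which is precisely what makes the paper's contribution nontrivial.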

  6. Non-Contact Electromagnetic Exciter Design with Linear Control Method

    NASA Astrophysics Data System (ADS)

    Wang, Lin; Xiong, Xianzhi; Xu, Hua

    2016-04-01

    A non-contact type force actuator is necessary for studying the dynamic performance of a high-speed spindle system owing to its high-speed operating conditions. A non-contact electromagnetic exciter is designed for identifying the dynamic coefficients of journal bearings in high-speed grinding spindles. A linear force control method is developed based on a PID controller. The influence of amplitude and frequency of current, misalignment and rotational speed on magnetic field and excitation force is investigated based on two-dimensional finite element analysis. The electromagnetic excitation force is measured with the auxiliary coils and calibrated by load cells. The design is validated by the experimental results. Theoretical and experimental investigations show that the proposed design can accurately generate a linear excitation force with sufficiently large amplitude and a high signal-to-noise ratio. Moreover, the fluctuations in force amplitude are greatly reduced with the designed linear control method, even when the air gap changes due to rotor vibration at high speed. In addition, various types of excitation can be applied based on the proposed linear control method: constant, synchronous, and non-synchronous excitation forces. This exciter can be used as a linear-force excitation and control system for dynamic performance studies of different high-speed rotor-bearing systems.
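
    A minimal sketch of the PID-based linear force control idea, applied to a toy first-order plant. The gains, time constant, and setpoint are illustrative assumptions, not the paper's exciter parameters:

```python
class PID:
    """Minimal discrete PID controller. Gains used below are illustrative,
    not the values used for the exciter in the paper."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Regulate a toy first-order "force" plant toward a 10 N setpoint.
# kd = 0 here: with this crude explicit-Euler plant model, a large
# derivative term would destabilize the discrete loop.
pid = PID(kp=2.0, ki=5.0, kd=0.0, dt=0.001)
force, tau = 0.0, 0.05
for _ in range(5000):
    u = pid.update(10.0, force)
    force += (u - force) * pid.dt / tau   # plant: dF/dt = (u - F) / tau
```

    The integral term drives the steady-state error to zero, which is the property that lets such a controller hold a commanded excitation force as operating conditions drift.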

  7. Applying standard epidemiological methods for investigating foodborne disease outbreak in resource-poor settings: lessons from Vietnam.

    PubMed

    Vo, Thuan Huu; Nguyen, Dat Van; Le, Loan Thi Kim; Phan, Lan Trong; Nuorti, J Pekka; Tran Minh, Nguyen Nhu

    2014-07-01

    An outbreak of gastroenteritis occurred among workers of company X after eating lunch prepared by a catering service. Of 430 workers attending the meal, 56 were hospitalized with abdominal pain, diarrhea, vomiting, and nausea, according to the initial report. We conducted an investigation to identify the extent, vehicle, and source of the outbreak. In our case-control study, a case was a worker who attended the meal and who was hospitalized with acute gastroenteritis; controls were randomly selected from non-ill workers. Cases and controls were interviewed using a standard questionnaire. We used logistic regression to calculate adjusted odds ratios for the consumption of food items. Catering service facilities and food handlers working for the service were inspected. Food samples from the catering service were tested at reference laboratories. Of hospitalized cases, 54 fulfilled the case definition, but no stool specimens were collected for laboratory testing. Of four food items served during lunch, only "squash and pork soup" was significantly associated with gastroenteritis, with an adjusted odds ratio of 9.5 (95 % CI 3.2, 27.7). The caterer did not separate cooked from raw foods but used the same counter for both. Cooked foods were kept at room temperature for about 4 h before serving. Four of 14 food handlers were not trained on basic food safety principles and did not have health certificates. Although no microbiological confirmation was obtained, our epidemiological investigation suggested that squash and pork soup caused the outbreak. Hospitals should be instructed to obtain stool specimens from patients with gastroenteritis. Food catering services should be educated in basic food safety measures. PMID:24988035
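
    For intuition, an unadjusted odds ratio and its Woolf (log-based) 95% confidence interval can be computed directly from a 2x2 table. Note that the study itself reports an adjusted OR of 9.5 from logistic regression; the counts below are hypothetical:

```python
import math

# Hypothetical 2x2 table (NOT the Vietnam outbreak data):
# exposure = ate the implicated dish, outcome = case vs. control.
a, b = 20, 10   # cases: exposed, unexposed
c, d = 5, 40    # controls: exposed, unexposed

odds_ratio = (a * d) / (b * c)

# Woolf's method: standard error of log(OR), then exponentiate the bounds.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

    Adjustment for confounders, as done in the study, requires a logistic regression model rather than this single-table calculation.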

  8. Epidemiological survey of anti-flea IgE in dogs in Japan by using an antigen-specific IgE quantitative measurement method

    PubMed Central

    Ichikawa, Y.; Beugnet, F.

    2012-01-01

    In Japan, an epidemiological survey was performed in dogs from October to December 2008 by using a quantitative measurement method for antigen-specific IgE towards specific Ctenocephalides felis antigens. 214 dogs from 22 veterinary clinics were included. These clinics were located as follows, from North to South: Hokkaido, Aomori, Fukushima, Tochigi, Saitama, Chiba, Tokyo (Tama-City and Ota-ku), Kanagawa, Gifu, Niigata, Kyoto, Nara, Osaka, Hyogo, Kagawa, Ehime, Hiroshima, Yamaguchi, Fukuoka, Kumamoto and Kagoshima. 110 dogs (51.4%) were seropositive for flea-specific IgE. No differences were associated with gender or breed. This survey confirms that flea infestation in dogs is a common problem in Japan. In particular, it shows that infestation also occurs in Northern Japan, where fleas are considered uncommon by veterinarians. PMID:22550629
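
    The reported seroprevalence (110 of 214 dogs, 51.4%) can be given an approximate confidence interval. A simple Wald-interval sketch; the interval itself is an addition here, as the paper reports only the point estimate:

```python
import math

# Seroprevalence point estimate from the survey counts, with an
# approximate Wald 95% confidence interval for a binomial proportion.
positive, n = 110, 214
p = positive / n
half_width = 1.96 * math.sqrt(p * (1 - p) / n)
interval = (p - half_width, p + half_width)
```

    For proportions near 0 or 1, or small n, a Wilson or exact interval would be preferable to the Wald approximation used here.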

  9. Material Design, Selection, and Manufacturing Methods for System Sustainment

    SciTech Connect

    David Sowder, Jim Lula, Curtis Marshall

    2010-02-18

    This paper describes a material selection and validation process proven successful for manufacturing high-reliability, long-life products. The National Secure Manufacturing Center business unit of the Kansas City Plant (herein called KCP) designs and manufactures complex electrical and mechanical components used in extreme environments. The material manufacturing heritage is founded in the design-to-manufacturing practices that support the U.S. Department of Energy's National Nuclear Security Administration (DOE/NNSA). Material engineers at KCP work with the system designers to recommend materials, develop test methods, perform analysis of test data, define cradle-to-grave needs, and present the final selection and fielding. The KCP material engineers typically maintain cost control by utilizing commercial products when possible, but have the resources to develop and produce unique formulations as necessary. This approach is currently being used to mature technologies for manufacturing materials with improved characteristics using nano-composite filler materials that will enhance system design and production. For some products the engineers plan and carry out science-based life-cycle material surveillance processes. Recent examples of the approach include refurbishment manufacturing of high-voltage power supplies for cockpit displays in operational aircraft; dry-film lubricant application to improve bearing life for guided-munition gyroscope gimbals; ceramic substrate design for electrical circuit manufacturing; and tailored polymeric materials for various systems. These examples show evidence of KCP concurrent design-to-manufacturing techniques used to achieve system solutions that satisfy or exceed demanding requirements.

  10. Denoising Sparse Images from GRAPPA using the Nullspace Method (DESIGN)

    PubMed Central

    Weller, Daniel S.; Polimeni, Jonathan R.; Grady, Leo; Wald, Lawrence L.; Adalsteinsson, Elfar; Goyal, Vivek K.

    2011-01-01

    To accelerate magnetic resonance imaging using uniformly undersampled (nonrandom) parallel imaging beyond what is achievable with GRAPPA alone, the Denoising of Sparse Images from GRAPPA using the Nullspace method (DESIGN) is developed. The trade-off between denoising and smoothing the GRAPPA solution is studied for different levels of acceleration. Several brain images reconstructed from uniformly undersampled k-space data using DESIGN are compared against reconstructions using existing methods in terms of difference images (a qualitative measure), PSNR, and noise amplification (g-factors) as measured using the pseudo-multiple replica method. Effects of smoothing, including contrast loss, are studied in synthetic phantom data. In the experiments presented, the contrast loss and spatial resolution are competitive with existing methods. Results for several brain images demonstrate significant improvements over GRAPPA at high acceleration factors in denoising performance with limited blurring or smoothing artifacts. In addition, the measured g-factors suggest that DESIGN mitigates noise amplification better than both GRAPPA and L1 SPIR-iT (the latter limited here by uniform undersampling). PMID:22213069

  11. The Global Enteric Multicenter Study (GEMS) of Diarrheal Disease in Infants and Young Children in Developing Countries: Epidemiologic and Clinical Methods of the Case/Control Study

    PubMed Central

    Kotloff, Karen L.; Blackwelder, William C.; Nasrin, Dilruba; Nataro, James P.; Farag, Tamer H.; van Eijk, Annemieke; Adegbola, Richard A.; Alonso, Pedro L.; Breiman, Robert F.; Golam Faruque, Abu Syed; Saha, Debasish; Sow, Samba O.; Sur, Dipika; Zaidi, Anita K. M.; Biswas, Kousick; Panchalingam, Sandra; Clemens, John D.; Cohen, Dani; Glass, Roger I.; Mintz, Eric D.; Sommerfelt, Halvor; Levine, Myron M.

    2012-01-01

    Background. Diarrhea is a leading cause of illness and death among children aged <5 years in developing countries. This paper describes the clinical and epidemiological methods used to conduct the Global Enteric Multicenter Study (GEMS), a 3-year, prospective, age-stratified, case/control study to estimate the population-based burden, microbiologic etiology, and adverse clinical consequences of acute moderate-to-severe diarrhea (MSD) among a censused population of children aged 0–59 months seeking care at health centers in sub-Saharan Africa and South Asia. Methods. GEMS was conducted at 7 field sites, each serving a population whose demography and healthcare utilization practices for childhood diarrhea were documented. We aimed to enroll 220 MSD cases per year from selected health centers serving each site in each of 3 age strata (0–11, 12–23, and 24–59 months), along with 1–3 matched community controls. Cases and controls supplied clinical, epidemiologic, and anthropometric data at enrollment and again approximately 60 days later, and provided enrollment stool specimens for identification and characterization of potential diarrheal pathogens. Verbal autopsy was performed if a child died. Analytic strategies will calculate the fraction of MSD attributable to each pathogen and the incidence, financial costs, nutritional consequences, and case fatality overall and by pathogen. Conclusions. When completed, GEMS will provide estimates of the incidence, etiology, and outcomes of MSD among infants and young children in sub-Saharan Africa and South Asia. This information can guide development and implementation of public health interventions to diminish morbidity and mortality from diarrheal diseases. PMID:23169936
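
    One analytic step named above, the fraction of MSD attributable to each pathogen, can be sketched with a standard case/control attributable-fraction formula. The odds ratio and exposure proportion below are hypothetical, not GEMS results:

```python
# Sketch of an attributable-fraction calculation for case/control data:
# population attributable fraction approximated from the odds ratio and
# the proportion of cases positive for the pathogen (Miettinen's formula).
# All numbers are hypothetical.

def attributable_fraction(odds_ratio, prop_cases_exposed):
    """PAF for case/control data: p_c * (OR - 1) / OR."""
    return prop_cases_exposed * (odds_ratio - 1.0) / odds_ratio

paf = attributable_fraction(odds_ratio=4.0, prop_cases_exposed=0.30)
```

    GEMS additionally had to handle matched controls and multiple co-detected pathogens, which complicates the real attribution analysis well beyond this one-line formula.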

  12. Molecular epidemiology of tuberculosis: achievements and challenges to current knowledge.

    PubMed Central

    Murray, Megan; Nardell, Edward

    2002-01-01

    Over the past 10 years, molecular methods have become available with which to strain-type Mycobacterium tuberculosis. They have allowed researchers to study certain important but previously unresolved issues in the epidemiology of tuberculosis (TB). For example, some unsuspected microepidemics have been revealed and it has been shown that the relative contribution of recently acquired disease to the TB burden in many settings is far greater than had been thought. These findings have led to the strengthening of TB control. Other research has demonstrated the existence and described the frequency of exogenous reinfection in areas of high incidence. Much recent work has focused on the phenotypic variation among strains and has evaluated the relative transmissibility, virulence, and immunogenicity of different lineages of the organism. We summarize the recent achievements in TB epidemiology associated with the introduction of DNA fingerprinting techniques, and consider the implications of this technology for the design and analysis of epidemiological studies. PMID:12132006

  13. Calculation of Evolutionary Correlation between Individual Genes and Full-Length Genome: A Method Useful for Choosing Phylogenetic Markers for Molecular Epidemiology

    PubMed Central

    Wang, Shuai; Luo, Xuenong; Wei, Wei; Zheng, Yadong; Dou, Yongxi; Cai, Xuepeng

    2013-01-01

    Individual genes or regions are still commonly used to estimate the phylogenetic relationships among viral isolates. Genomic regions that can faithfully provide assessments consistent with those predicted from full-length genome sequences would be preferable candidates for phylogenetic markers in molecular epidemiological studies of many viruses. Here we employed a statistical method to evaluate the evolutionary relationships between individual viral genes and full-length genomes without tree construction, as a way to determine which gene can match the genome well in phylogenetic analyses. This method was performed by calculating linear correlations between the genetic distance matrices of aligned individual gene sequences and aligned genome sequences. We applied this method to phylogenetic analyses of porcine circovirus 2 (PCV2), measles virus (MV), hepatitis E virus (HEV) and Japanese encephalitis virus (JEV). Phylogenetic trees were constructed for comparison, and the possible factors affecting the method's accuracy were also discussed. The results revealed that this method could produce results consistent with those of previous studies regarding the consensus sequences that can successfully be used as phylogenetic markers. Our results also suggested that these evolutionary correlations could provide useful information for identifying genes that can be used effectively to infer genetic relationships. PMID:24312527
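
    The core computation, a linear correlation between two genetic distance matrices, can be sketched in a few lines (a Mantel-style comparison of upper-triangle entries; the toy distances below are invented, and a real analysis would add a permutation test for significance):

```python
# Pearson correlation between the pairwise-distance matrix of a single
# gene and that of the full genome, using only upper-triangle entries
# (each pair of isolates counted once). Distances here are made up.

def upper_triangle(m):
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

gene_dist   = [[0, 2, 4], [2, 0, 6], [4, 6, 0]]
genome_dist = [[0, 1, 2], [1, 0, 3], [2, 3, 0]]
r = pearson(upper_triangle(gene_dist), upper_triangle(genome_dist))
```

    A gene whose distance matrix correlates strongly with the genome's is, by this criterion, a good phylogenetic marker candidate.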

  14. Online Guidance Law of Missile Using Multiple Design Point Method

    NASA Astrophysics Data System (ADS)

    Yamaoka, Seiji; Ueno, Seiya

    This paper deals with the design procedure of an online guidance law for future missiles that are required to have agile maneuverability. For this purpose, the authors propose mounting high-power side-thrusters on the missile. The guidance law for such missiles is discussed from the point of view of optimal control theory. The minimum-time problem is solved for the approximated system. From the necessary conditions for an optimal solution, it is derived that bang-bang control is the optimal input. Feedback guidance without iterative calculation is useful for actual systems. The multiple design point method is applied to design the feedback gains and feedforward inputs of the guidance law. The numerical results show the good performance of the proposed guidance law.

  15. A molecular epidemiology project on diet and cancer: the EPIC-Italy Prospective Study. Design and baseline characteristics of participants.

    PubMed

    Palli, Domenico; Berrino, Franco; Vineis, Paolo; Tumino, Rosario; Panico, Salvatore; Masala, Giovanna; Saieva, Calogero; Salvini, Simonetta; Ceroti, Marco; Pala, Valeria; Sieri, Sabina; Frasca, Graziella; Giurdanella, Maria Concetta; Sacerdote, Carlotta; Fiorini, Laura; Celentano, Egidio; Galasso, Rocco; Decarli, Adriano; Krogh, Vittorio

    2003-01-01

    EPIC-Italy is the Italian section of a larger project known as EPIC (European Prospective Investigation into Cancer and Nutrition), a prospective study on diet and cancer carried out in 10 European countries. In the period 1993-1998, EPIC-Italy completed the recruitment of 47,749 volunteers (15,171 men, 32,578 women, aged 35-65 years) in 4 different areas covered by cancer registries: Varese (12,083 volunteers) and Turin (10,604) in the Northern part of the country; Florence (13,597) and Ragusa (6,403) in Central and Southern Italy, respectively. An associate center in Naples enrolled 5,062 women. Detailed information for each individual volunteer about diet and life-style habits, anthropometric measurements and a blood sample was collected, after signing an informed consent form. A food frequency questionnaire specifically developed for the Italian dietary pattern was tested in a pilot phase. A computerized data base with the dietary and life-style information of each participant was completed. Blood samples were processed in the same day of collection, aliquoted (RBC, WBC, serum and plasma) and stored in liquid nitrogen containers. Follow-up procedures were validated and implemented for the identification of newly diagnosed cancer cases. Cancer incidence was related to dietary habits and biochemical markers of food consumption and individual susceptibility in order to test the role of diet-related exposure in the etiology of cancer and its interaction with other environmental or genetic determinants. The comparability of information in a prospective study design is much higher than in other studies. The availability of such a large biological bank linked to individual data on dietary and life-style exposures also provides the unique opportunity of evaluating the role of selected genotypes involved in the metabolism of chemical compounds and DNA repair, potentially related to the risk of cancer, in residents of geographic areas of Italy characterized by specific…

  16. COMPSIZE - PRELIMINARY DESIGN METHOD FOR FIBER REINFORCED COMPOSITE STRUCTURES

    NASA Technical Reports Server (NTRS)

    Eastlake, C. N.

    1994-01-01

    The Composite Structure Preliminary Sizing program, COMPSIZE, is an analytical tool which structural designers can use when doing approximate stress analysis to select or verify preliminary sizing choices for composite structural members. It is useful in the beginning stages of design concept definition, when it is helpful to have quick and convenient approximate stress analysis tools available so that a wide variety of structural configurations can be sketched out and checked for feasibility. At this stage of the design process the stress/strain analysis does not need to be particularly accurate because any configurations tentatively defined as feasible will later be analyzed in detail by stress analysis specialists. The emphasis is on fast, user-friendly methods so that rough but technically sound evaluation of a broad variety of conceptual designs can be accomplished. Analysis equations used are, in most cases, widely known basic structural analysis methods. All the equations used in this program assume elastic deformation only. The default material selection is intermediate strength graphite/epoxy laid up in a quasi-isotropic laminate. A general flat laminate analysis subroutine is included for analyzing arbitrary laminates. However, COMPSIZE should be sufficient for most users to presume a quasi-isotropic layup and use the familiar basic structural analysis methods for isotropic materials, after estimating an appropriate elastic modulus. Homogeneous materials can be analyzed as simplified cases. The COMPSIZE program is written in IBM BASICA. The program format is interactive. It was designed on an IBM Personal Computer operating under DOS with a central memory requirement of approximately 128K. It has been implemented on an IBM compatible with GW-BASIC under DOS 3.2. COMPSIZE was developed in 1985.

  17. Putting Life into Computer-Based Training: The Creation of an Epidemiologic Case Study.

    ERIC Educational Resources Information Center

    Gathany, Nancy C.; Stehr-Green, Jeanette K.

    1994-01-01

    Describes the design of "Pharyngitis in Louisiana," a computer-based epidemiologic case study that was created to teach students how to conduct disease outbreak investigations. Topics discussed include realistic content portrayals; graphics; interactive teaching methods; interaction between the instructional designer and the medical expert; and…

  18. National Tuberculosis Genotyping and Surveillance Network: Design and Methods

    PubMed Central

    Braden, Christopher R.; Schable, Barbara A.; Onorato, Ida M.

    2002-01-01

    The National Tuberculosis Genotyping and Surveillance Network was established in 1996 to perform a 5-year, prospective study of the usefulness of genotyping Mycobacterium tuberculosis isolates to tuberculosis control programs. Seven sentinel sites identified all new cases of tuberculosis, collected information on patients and contacts, and obtained patient isolates. Seven genotyping laboratories performed DNA fingerprinting analysis by the international standard IS6110 method. BioImage Whole Band Analyzer software was used to analyze patterns, and distinct patterns were assigned unique designations. Isolates with six or fewer bands on IS6110 patterns were also spoligotyped. Patient data and genotyping designations were entered in a relational database and merged with selected variables from the national surveillance database. In two related databases, we compiled the results of routine contact investigations and the results of investigations of the relationships of patients who had isolates with matching genotypes. We describe the methods used in the study. PMID:12453342
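
    Assigning unique designations to distinct fingerprint patterns, as the study's relational database did, is essentially cataloguing. A toy sketch, with invented pattern strings standing in for IS6110 banding profiles and an invented designation prefix:

```python
# Toy sketch: give each distinct fingerprint pattern a unique designation
# and label every isolate with its pattern's designation. Pattern strings
# and the "PCR" prefix are invented; real IS6110 patterns are banding
# profiles compared with image-analysis software, not simple strings.

def assign_designations(patterns, prefix="PCR"):
    designations = {}
    catalog = {}       # pattern -> designation, first-seen order
    next_id = 1
    for isolate, pattern in patterns.items():
        if pattern not in catalog:
            catalog[pattern] = f"{prefix}{next_id:04d}"
            next_id += 1
        designations[isolate] = catalog[pattern]
    return designations

iso = {"TB-01": "A|B|C", "TB-02": "A|B|C", "TB-03": "A|D"}
result = assign_designations(iso)
```

    Isolates sharing a designation (here TB-01 and TB-02) are the clusters that trigger the epidemiological follow-up investigations described above.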

  19. Optical design and active optics methods in astronomy

    NASA Astrophysics Data System (ADS)

    Lemaitre, Gerard R.

    2013-03-01

    Optical designs for astronomy involve implementation of active optics and adaptive optics from the X-ray to the infrared. Developments and results of active optics methods for telescopes, spectrographs and coronagraph planet finders are presented. The high accuracy and remarkable smoothness of surfaces generated by active optics methods also allow the elaboration of new optical design types with highly aspheric and/or non-axisymmetric surfaces. Depending on the goal and the performance required of a deformable optical surface, analytical investigations are carried out with one of the various facets of elasticity theory: small-deformation thin plate theory, large-deformation thin plate theory, shallow spherical shell theory, or weakly conical shell theory. The resulting thickness distribution and associated bending-force boundaries can be refined further with finite element analysis.

  20. Simplified Analysis Methods for Primary Load Designs at Elevated Temperatures

    SciTech Connect

    Carter, Peter; Jetter, Robert I; Sham, Sam

    2011-01-01

    The use of simplified (reference stress) analysis methods is discussed and illustrated for primary-load high-temperature design. Elastic methods are the basis of the ASME Section III, Subsection NH primary load design procedure. There are practical drawbacks with this approach, particularly for complex geometries and temperature gradients. The paper describes an approach which addresses these difficulties through the use of temperature-dependent elastic-perfectly plastic analysis. Correction factors are defined to address difficulties traditionally associated with discontinuity stresses, inelastic strain concentrations and multiaxiality. A procedure is identified to provide insight into how this approach could be implemented, but clearly there is additional work to be done to define and clarify the procedural steps to bring it to the point where it could be adopted into code language.

  1. Numerical design method for thermally loaded plate-cylinder intersections

    SciTech Connect

    Baldur, R.; Laberge, C.A.; Lapointe, D.

    1988-11-01

    This paper is an extension of work on stresses in corner radii described by the authors previously. Whereas the original study concerned itself with pressure effects only and the second reference gave the initial version of the work dealing with thermal effects, this report gives more recent results concerning thermal loads specifically. As before, the results are limited to inside corner radii between cylinders and flat head closures. Similarly, the analysis is based on a systematic series of finite element calculations with the significant parameters covering the field of useful design boundaries. The results are condensed into a rapid method for determining the peak stresses needed to perform fatigue analysis of pressure vessels subjected to significant, variable thermal loads. The paper takes into account the influence of the film coefficient, temporal temperature variations, and material properties. A set of coefficients provides a convenient method of stress evaluation suitable for design purposes.

  2. Preliminary demonstration of a robust controller design method

    NASA Technical Reports Server (NTRS)

    Anderson, L. R.

    1980-01-01

    Alternative computational procedures for obtaining a feedback control law which yields a control signal based on measurable quantities are evaluated. The three methods evaluated are: (1) the standard linear quadratic regulator design model; (2) minimization of the norm of the feedback matrix K via nonlinear programming, subject to the constraint that the closed-loop eigenvalues lie in a specified domain in the complex plane; and (3) maximization of the angles between the closed-loop eigenvectors in combination with minimizing the norm of K, also via constrained nonlinear programming. The third, or robust, design method was chosen to yield a closed-loop system whose eigenvalues are insensitive to small changes in the A and B matrices. The relationship between orthogonality of closed-loop eigenvectors and the sensitivity of closed-loop eigenvalues is described. Computer programs are described.
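
    Method (2)'s constraint that closed-loop eigenvalues lie in a specified domain can be illustrated with the simplest possible case: pole placement for a double integrator (a sketch of the eigenvalue-domain idea only, not the paper's nonlinear-programming formulation):

```python
# Plant: x1' = x2, x2' = u (double integrator). State feedback
# u = -k1*x1 - k2*x2 gives the characteristic polynomial s^2 + k2*s + k1,
# so gains follow directly from the prescribed stable pole locations.

desired_poles = (-2.0, -3.0)
k1 = desired_poles[0] * desired_poles[1]        # product of poles -> 6
k2 = -(desired_poles[0] + desired_poles[1])     # -(sum of poles)  -> 5

# Verify: roots of s^2 + k2*s + k1 via the quadratic formula.
disc = k2 * k2 - 4.0 * k1
r1 = (-k2 + disc ** 0.5) / 2.0
r2 = (-k2 - disc ** 0.5) / 2.0
```

    For general A and B matrices the gains cannot be read off the characteristic polynomial this way, which is why the paper resorts to constrained nonlinear programming.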

  3. Helicopter flight-control design using an H(2) method

    NASA Technical Reports Server (NTRS)

    Takahashi, Marc D.

    1991-01-01

    Rate-command and attitude-command flight-control designs for a UH-60 helicopter in hover are presented and were synthesized using an H(2) method. Using weight functions, this method allows the direct shaping of the singular values of the sensitivity, complementary sensitivity, and control input transfer-function matrices to give acceptable feedback properties. The designs were implemented on the Vertical Motion Simulator, and four low-speed hover tasks were used to evaluate the control system characteristics. The pilot comments from the accel-decel, bob-up, hovering turn, and side-step tasks indicated good decoupling and quick response characteristics. However, an underlying roll PIO tendency was found to exist away from the hover condition, which was caused by a flap regressing mode with insufficient damping.

  4. A Requirements-Driven Optimization Method for Acoustic Treatment Design

    NASA Technical Reports Server (NTRS)

    Berton, Jeffrey J.

    2016-01-01

    Acoustic treatment designers have long been able to target specific noise sources inside turbofan engines. Facesheet porosity and cavity depth are key design variables of perforate-over-honeycomb liners that determine levels of noise suppression as well as the frequencies at which suppression occurs. Layers of these structures can be combined to create a robust attenuation spectrum that covers a wide range of frequencies. Looking to the future, rapidly-emerging additive manufacturing technologies are enabling new liners with multiple degrees of freedom, and new adaptive liners with variable impedance are showing promise. More than ever, there is greater flexibility and freedom in liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. The subject of this paper is an analytical method that derives the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground.

  5. Application of an optimization method to high performance propeller designs

    NASA Technical Reports Server (NTRS)

    Li, K. C.; Stefko, G. L.

    1984-01-01

    The application of an optimization method to determine the propeller blade twist distribution which maximizes propeller efficiency is presented. The optimization employs a previously developed method which has been improved to include the effects of blade drag, camber and thickness. Before the optimization portion of the computer code is used, comparisons of calculated propeller efficiencies and power coefficients are made with experimental data for one NACA propeller at Mach numbers in the range of 0.24 to 0.50 and another NACA propeller at a Mach number of 0.71 to validate the propeller aerodynamic analysis portion of the computer code. Then comparisons of calculated propeller efficiencies for the optimized and the original propellers show the benefits of the optimization method in improving propeller performance. This method can be applied to the aerodynamic design of propellers having straight, swept, or nonplanar propeller blades.

  6. Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2

    NASA Technical Reports Server (NTRS)

    Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)

    2000-01-01

    A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a modular system optimization into several subtask optimizations, which may be executed concurrently, and a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited to exploiting the concurrent-processing capabilities of a multiprocessor machine. Several steps, including the local sensitivity analysis, local optimization, and response-surface construction and updates, are ideally suited for concurrent processing. Algorithms that can effectively exploit the concurrent-processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.

  7. Synthesis of aircraft structures using integrated design and analysis methods

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.; Goetz, R. C.

    1978-01-01

    A systematic research program to develop and validate methods for structural sizing of an airframe designed with the use of composite materials and active controls is reported. This program includes procedures for computing aeroelastic loads, static and dynamic aeroelasticity, analysis and synthesis of active controls, and optimization techniques. Development of the methods is concerned with the most effective ways of integrating and sequencing the procedures in order to generate a structural sizing and an associated active control system that are optimal with respect to a given merit function constrained by strength and aeroelasticity requirements.

  8. Epidemiology of malaria in an area of seasonal transmission in Niger and implications for the design of a seasonal malaria chemoprevention strategy

    PubMed Central

    2013-01-01

    Background Few data are available about the malaria epidemiological situation in Niger. However, implementation of new strategies such as vaccination or seasonal treatment of a target population requires knowledge of the baseline epidemiological features of malaria. A population-based study was conducted to better characterize malaria seasonal variations and the population groups most at risk in this particular area. Methods From July 2007 to December 2009, presumptive cases of malaria among a study population living in a typical Sahelian village of Niger were recorded and confirmed by microscopic examination. In parallel, asymptomatic carriers were actively detected at the end of each dry season in 2007, 2008 and 2009. Results Among the 965 presumptive malaria cases recorded, 29% were confirmed by microscopic examination. The incidence of malaria was found to decrease significantly with age (p < 0.01). The mean annual incidence was 0.254. The results show that the risk of malaria was higher in children under ten years (p < 0.0001). The number of malaria episodes generally followed the temporal pattern of changes in precipitation levels, with a peak of transmission in August and September. One thousand and ninety subjects underwent active detection of asymptomatic carriage, of whom 16% tested positive; asymptomatic carriage decreased with increasing age. A higher prevalence of gametocyte carriage among the asymptomatic population was recorded in children aged two to ten years, though it did not reach significance. Conclusions In Southern Niger, malaria transmission mostly occurs from July to October. Children aged two to ten years are the most at risk of malaria, and may also represent the main reservoir for gametocytes. Strategies such as intermittent preventive treatment in children (IPTc) could be of interest in this area, where malaria transmission is highly seasonal. Based on these preliminary data, a pilot study could be implemented
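
    The crude annual incidence reported above can be reproduced from first principles as confirmed cases per person-year at risk. A minimal sketch — the village population of 441 is an assumed illustrative figure back-solved from the reported incidence (965 presumptive × 29% ≈ 280 confirmed cases over the ~2.5-year study period), not a value taken from the study:

```python
def annual_incidence(cases, population, years):
    """Crude incidence: confirmed cases per person-year at risk."""
    return cases / (population * years)

# ~280 confirmed cases over 2.5 years in an assumed population of 441
# reproduces the mean annual incidence of 0.254 quoted in the abstract.
inc = annual_incidence(280, 441, 2.5)
print(round(inc, 3))  # 0.254
```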

  9. Design Methods for Load-bearing Elements from Crosslaminated Timber

    NASA Astrophysics Data System (ADS)

    Vilguts, A.; Serdjuks, D.; Goremikins, V.

    2015-11-01

    Cross-laminated timber is an environmentally friendly material, which possesses a decreased level of anisotropy in comparison with solid and glued timber. Cross-laminated timber can be used for load-bearing walls and slabs of multi-storey timber buildings as well as decking structures of pedestrian and road bridges. Design methods for cross-laminated timber elements subjected to bending, and to compression with bending, were considered. The presented methods were experimentally validated and verified by FEM. Two cross-laminated timber slabs were tested under static load. Pine wood was chosen as the board material. The design scheme of the considered plates was a simply supported beam with a span of 1.9 m, loaded by a uniformly distributed load. The width of the plates was equal to 1 m. The considered cross-laminated timber plates were also analysed by FEM. Comparison of the stresses acting in the edge fibres of the plate and the maximum vertical displacements shows that both considered methods can be used for engineering calculations. The difference between the results obtained experimentally and analytically ranges from 2 to 31%. The difference between the results obtained by the effective strength and stiffness method and the transformed sections method was not significant.
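
    The experimental-versus-analytical comparison above reduces to a relative-deviation computation. A minimal sketch with invented edge-fibre stress values, chosen only to fall inside the reported 2-31% band:

```python
def percent_difference(experimental, analytical):
    """Deviation of a computed value from the measured one, in percent."""
    return abs(experimental - analytical) / abs(experimental) * 100.0

# Illustrative edge-fibre stresses (MPa); the numbers are invented.
measured, fem = 14.2, 13.9
print(round(percent_difference(measured, fem), 1))  # 2.1
```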

  10. Asymmetric MRI magnet design using a hybrid numerical method.

    PubMed

    Zhao, H; Crozier, S; Doddrell, D M

    1999-12-01

    This paper describes a hybrid numerical method for the design of asymmetric magnetic resonance imaging magnet systems. The problem is formulated as a field synthesis problem: the desired current density on the surface of a cylinder is first calculated by solving a Fredholm equation of the first kind. Nonlinear optimization methods are then invoked to fit practical magnet coils to the desired current density. The field calculations are performed using a semi-analytical method. A new type of asymmetric magnet is proposed in this work. The asymmetric MRI magnet allows the diameter-spherical imaging volume (DSV) to be positioned close to one end of the magnet. The main advantages of making the magnet asymmetric include the potential to reduce the perception of claustrophobia for the patient, better access to the patient by attending physicians, and the potential for reduced peripheral nerve stimulation due to the gradient coil configuration. The results highlight that the method can be used to obtain an asymmetric MRI magnet structure and a very homogeneous magnetic field over the central imaging volume in clinical systems of approximately 1.2 m in length. Unshielded designs are the focus of this work. This method is flexible and may be applied to magnets of other geometries. PMID:10579958
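
    The core numerical step, solving a first-kind Fredholm equation for the surface current density, is ill-posed and is usually stabilized by regularization. A toy sketch of that idea using Tikhonov-regularized least squares — the smooth kernel and target field below are invented for illustration, not the actual magnet field kernel:

```python
import numpy as np

n = 50
z = np.linspace(-1.0, 1.0, n)            # source positions on the cylinder
r = np.linspace(-0.5, 0.5, n)            # field points in the imaging volume
K = 1.0 / (1.0 + (r[:, None] - z[None, :])**2)   # invented smooth kernel
b = np.ones(n)                            # desired field: uniform

# First-kind equations are ill-posed, so solve the Tikhonov-regularised
# normal equations (K^T K + lam*I) w = K^T b rather than inverting K.
lam = 1e-4
w = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ b)

rel_residual = np.linalg.norm(K @ w - b) / np.linalg.norm(b)
print(round(rel_residual, 4))
```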

  11. Development of quality-by-design analytical methods.

    PubMed

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities. PMID:21280050

  12. Mining characteristics of epidemiological studies from Medline: a case study in obesity

    PubMed Central

    2014-01-01

    Background The health sciences literature incorporates a relatively large subset of epidemiological studies that focus on population-level findings, including various determinants, outcomes and correlations. Extracting structured information about those characteristics would be useful for a more complete understanding of diseases and for meta-analyses and systematic reviews. Results We present an information extraction approach that enables users to identify key characteristics of epidemiological studies from MEDLINE abstracts. It extracts six types of epidemiological characteristics: design of the study, population that has been studied, exposure, outcome, covariates and effect size. We have developed a generic rule-based approach designed according to semantic patterns observed in text, and tested it in the domain of obesity. Identified exposure, outcome and covariate concepts are clustered into health-related groups of interest. On a manually annotated test corpus of 60 epidemiological abstracts, the system achieved precision, recall and F-score of 79-100%, 80-100% and 82-96%, respectively. We report the results of applying the method to a large-scale epidemiological corpus related to obesity. Conclusions The experiments suggest that the proposed approach can identify key epidemiological characteristics associated with a complex clinical problem from related abstracts. When integrated over the literature, the extracted data can be used to provide a more complete picture of epidemiological efforts, and thus support understanding via meta-analyses and systematic reviews. PMID:24949194
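
    The rule-based idea can be illustrated in a few lines: surface patterns in an abstract are mapped to study characteristics. The three regex rules below are invented for this sketch and are not the paper's actual rule set:

```python
import re

# Illustrative "semantic pattern" rules, one per characteristic type
# (invented for this sketch, not the published system's rules).
PATTERNS = {
    "design":      r"\b(cohort|case-control|cross-sectional|randomi[sz]ed)\b",
    "exposure":    r"\bexposure to ([a-z ]+?) (?:was|were|and)\b",
    "effect_size": r"\b(?:odds ratio|OR)[ =:]+([\d.]+)",
}

def extract(abstract):
    """Apply each rule; keep the first match per characteristic."""
    found = {}
    for name, pattern in PATTERNS.items():
        m = re.search(pattern, abstract, re.IGNORECASE)
        if m:
            found[name] = m.group(1)
    return found

text = ("In this cohort study, exposure to sugary drinks was associated "
        "with obesity (odds ratio 1.8).")
print(extract(text))
# {'design': 'cohort', 'exposure': 'sugary drinks', 'effect_size': '1.8'}
```

A production system would add concept normalization and clustering on top of matching, as the abstract describes.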

  13. Libration Orbit Mission Design: Applications of Numerical & Dynamical Methods

    NASA Technical Reports Server (NTRS)

    Bauer, Frank (Technical Monitor); Folta, David; Beckman, Mark

    2002-01-01

    Sun-Earth libration point orbits serve as excellent locations for scientific investigations. These orbits are often selected to minimize environmental disturbances and maximize observing efficiency. Trajectory design in support of libration orbits is ever more challenging as more complex missions are envisioned in the next decade. Trajectory design software must be further enabled to incorporate better understanding of the libration orbit solution space and thus improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple libration missions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes algorithm and software development. The recently launched Microwave Anisotropy Probe (MAP) and upcoming James Webb Space Telescope (JWST) and Constellation-X missions are examples of the use of improved numerical methods for attaining constrained orbital parameters and controlling their dynamical evolution at the collinear libration points. This paper presents a history of libration point missions, a brief description of the numerical and dynamical design techniques including software used, and a sample of future GSFC mission designs.

  14. Towards Robust Designs Via Multiple-Objective Optimization Methods

    NASA Technical Reports Server (NTRS)

    Man Mohan, Rai

    2006-01-01

    A differential evolution (DE) method is first used to solve a relatively difficult problem in extended-surface heat transfer, wherein optimal fin geometries are obtained for different safe operating base temperatures. The objective of maximizing the safe operating base-temperature range is in direct conflict with the objective of maximizing fin heat transfer. This problem is a good example of achieving robustness in the context of changing operating conditions. The evolutionary method is then used to design a turbine airfoil, the two objectives being reduced sensitivity of the pressure distribution to small changes in the airfoil shape and maximization of the trailing-edge wedge angle, with the consequent increase in airfoil thickness and strength. This is a relevant example of achieving robustness to manufacturing tolerances and wear and tear in the presence of other objectives.

  15. Methods of compliance evaluation for ocean outfall design and analysis.

    PubMed

    Mukhtasor; Lye, L M; Sharp, J J

    2002-10-01

    Sewage discharge from an ocean outfall is subject to water quality standards, which are often stated in probabilistic terms. Monte Carlo simulation (MCS) has been used in the past to evaluate the ability of a designed outfall to meet water quality standards or compliance guidelines associated with sewage discharges. In this study, simpler and less computer-intensive probabilistic methods are considered: the popular mean first-order second-moment (MFOSM) and the advanced first-order second-moment (AFOSM) methods. Available data from the Spaniard's Bay Outfall, located on the east coast of Newfoundland, Canada, were used as inputs for a case study. Both methods were compared with results given by MCS. It was found that AFOSM gave a good approximation of the failure probability for total coliform concentration at points remote from the outfall. However, MFOSM was found to be better when considering only the initial dilutions between the discharge point and the surface. The reason for the different results may be the difference in complexity of the performance function in the two cases. This study does not recommend the use of AFOSM for failure analysis in ocean outfall design and analysis, because the analysis requires computational effort similar to MCS. With the advancement of computer technology, simulation techniques, available software, and its flexibility in handling complex situations, MCS is still the best choice for failure analysis of ocean outfalls when data or estimates of the parameters involved are available or can be assumed. PMID:12481920
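
    The contrast between a moment method and simulation can be shown on a toy compliance problem. The performance function, distribution and all numbers below are invented (not the Spaniard's Bay data); the point is only that MFOSM linearizes the performance function at the mean, while MCS samples it directly, and the two can disagree when the function is strongly nonlinear:

```python
import math, random

# Toy performance function: failure when simulated coliform
# concentration c0/dilution exceeds the standard c_std.
def g(dilution, c0=1e4, c_std=200.0):
    return c_std - c0 / dilution      # g < 0 means non-compliance

mu, sigma = 80.0, 15.0                # assumed dilution statistics

# MFOSM: linearise g at the mean dilution.
g_mu = g(mu)
dg = (g(mu + 1e-4) - g(mu - 1e-4)) / 2e-4   # numerical derivative
beta = g_mu / (abs(dg) * sigma)             # reliability index
p_mfosm = 0.5 * math.erfc(beta / math.sqrt(2))

# Crude Monte Carlo with the same input distribution.
random.seed(1)
n = 100_000
fails = sum(g(random.gauss(mu, sigma)) < 0 for _ in range(n))
p_mcs = fails / n

print(p_mfosm, p_mcs)   # the two estimates differ markedly here
```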

  16. Improved Method of Design for Folding Inflatable Shells

    NASA Technical Reports Server (NTRS)

    Johnson, Christopher J.

    2009-01-01

    An improved method of designing complexly shaped inflatable shells to be assembled from gores was conceived for original application to the inflatable outer shell of a developmental habitable spacecraft module having a cylindrical mid-length section with toroidal end caps. The method is also applicable to inflatable shells of various shapes for terrestrial use. The method addresses problems associated with the assembly, folding, transport, and deployment of inflatable shells that may comprise multiple layers and have complex shapes that can include such doubly curved surfaces as toroids and spheres. One particularly difficult problem is that of mathematically defining fold lines on a gore pattern in a double-curvature region. Moreover, because the fold lines in a double-curvature region tend to be curved, there is a practical problem of how to implement the folds. Another problem is that of modifying the basic gore shapes and sizes for the various layers so that when they are folded as part of the integral structure, they do not mechanically interfere with each other at the fold lines. Heretofore, it has been a common practice to design an inflatable shell to be assembled in the deployed configuration, without regard for the need to fold it into compact form. Typically, the result has been that folding has been a difficult, time-consuming process.

  17. A geometric design method for side-stream distillation columns

    SciTech Connect

    Rooks, R.E.; Malone, M.F.; Doherty, M.F.

    1996-10-01

    A side-stream distillation column may replace two simple columns for some applications, sometimes at considerable savings in energy and investment. This paper describes a geometric method for the design of side-stream columns; the method provides rapid estimates of equipment size and utility requirements. Unlike previous approaches, the geometric method is applicable to nonideal and azeotropic mixtures. Several example problems for both ideal and nonideal mixtures, including azeotropic mixtures containing distillation boundaries, are given. The authors make use of the fact that azeotropes or pure components whose classification in the residue curve map is a saddle can be removed as side-stream products. Significant process simplifications are found among some alternatives in example problems, leading to flow sheets with fewer units and a substantial savings in vapor rate.

  18. Sequence design in lattice models by graph theoretical methods

    NASA Astrophysics Data System (ADS)

    Sanjeev, B. S.; Patra, S. M.; Vishveshwara, S.

    2001-01-01

    A general strategy has been developed based on graph theoretical methods, for finding amino acid sequences that take up a desired conformation as the native state. This problem of inverse design has been addressed by assigning topological indices for the monomer sites (vertices) of the polymer on a 3×3×3 cubic lattice. This is a simple design strategy, which takes into account only the topology of the target protein and identifies the best sequence for a given composition. The procedure allows the design of a good sequence for a target native state by assigning weights for the vertices on a lattice site in a given conformation. It is seen across a variety of conformations that the predicted sequences perform well both in sequence and in conformation space, in identifying the target conformation as native state for a fixed composition of amino acids. Although the method is tested in the framework of the HP model [K. F. Lau and K. A. Dill, Macromolecules 22, 3986 (1989)] it can be used in any context if proper potential functions are available, since the procedure derives unique weights for all the sites (vertices, nodes) of the polymer chain of a chosen conformation (graph).
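
    The design idea, assigning weights to vertices of the target conformation and choosing the sequence accordingly, can be sketched in HP-model terms. The weight used here (number of non-bonded lattice contacts, i.e. burial) is a simplified stand-in for the paper's topological indices, and the 2x2x2 example path is invented:

```python
# Rank the sites of a target lattice conformation by a simple topological
# weight (non-bonded contacts) and assign H monomers to the most buried
# sites for a fixed composition.
def contacts(coords):
    s = set(coords)
    out = []
    for i, (x, y, z) in enumerate(coords):
        nb = sum((x + dx, y + dy, z + dz) in s
                 for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                    (0, -1, 0), (0, 0, 1), (0, 0, -1)))
        chain = (i > 0) + (i < len(coords) - 1)   # bonded neighbours
        out.append(nb - chain)                    # non-bonded contacts
    return out

def design_sequence(coords, n_h):
    w = contacts(coords)
    order = sorted(range(len(coords)), key=lambda i: -w[i])
    seq = ["P"] * len(coords)
    for i in order[:n_h]:
        seq[i] = "H"
    return "".join(seq)

# A 2x2x2 cube traversed as a Hamiltonian path (8-mer), 4 H monomers.
cube = [(0,0,0), (1,0,0), (1,1,0), (0,1,0), (0,1,1), (1,1,1), (1,0,1), (0,0,1)]
print(design_sequence(cube, 4))   # HHHPPPPH
```

The chain ends carry the highest weight here because their bonded connectivity is lowest, so they receive H first; a real design would use proper topological indices and potential functions, as the abstract notes.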

  19. A Probabilistic Design Method Applied to Smart Composite Structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1995-01-01

    A probabilistic design method is described and demonstrated using a smart composite wing. Probabilistic structural design incorporates naturally occurring uncertainties including those in constituent (fiber/matrix) material properties, fabrication variables, structure geometry and control-related parameters. Probabilistic sensitivity factors are computed to identify those parameters that have a great influence on a specific structural reliability. Two performance criteria are used to demonstrate this design methodology. The first criterion requires that the actuated angle at the wing tip be bounded by upper and lower limits at a specified reliability. The second criterion requires that the probability of ply damage due to random impact load be smaller than an assigned value. When the relationship between reliability improvement and the sensitivity factors is assessed, the results show that a reduction in the scatter of the random variable with the largest sensitivity factor (absolute value) provides the lowest failure probability. An increase in the mean of the random variable with a negative sensitivity factor will reduce the failure probability. Therefore, the design can be improved by controlling or selecting distribution parameters associated with random variables. This can be implemented during the manufacturing process to obtain maximum benefit with minimum alterations.

  20. A geometric method for optimal design of color filter arrays.

    PubMed

    Hao, Pengwei; Li, Yan; Lin, Zhouchen; Dubois, Eric

    2011-03-01

    A color filter array (CFA) used in a digital camera is a mosaic of spectrally selective filters, which allows only one color component to be sensed at each pixel. The missing two components of each pixel have to be estimated by methods known as demosaicking. The demosaicking algorithm and the CFA design are crucial for the quality of the output images. In this paper, we present a CFA design methodology in the frequency domain. The frequency structure, which is shown to be just the symbolic DFT of the CFA pattern (one period of the CFA), is introduced to represent images sampled with any rectangular CFAs in the frequency domain. Based on the frequency structure, the CFA design involves the solution of a constrained optimization problem that aims at minimizing the demosaicking error. To decrease the number of parameters and speed up the parameter searching, the optimization problem is reformulated as the selection of geometric points on the boundary of a convex polygon or the surface of a convex polyhedron. Using our methodology, several new CFA patterns are found, which outperform the currently commercialized and published ones. Experiments demonstrate the effectiveness of our CFA design methodology and the superiority of our new CFA patterns. PMID:20858581
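
    The "frequency structure" can be made concrete for the familiar 2x2 Bayer pattern: the 2-D DFT of each channel's sampling mask shows the baseband (luma) component and the chroma carriers at the corners of the frequency plane, which is the design space the optimization manipulates. A minimal sketch:

```python
import numpy as np

# One period of the Bayer CFA; each mask is 1 where that channel is sensed.
bayer = {
    "R": np.array([[1, 0], [0, 0]]),
    "G": np.array([[0, 1], [1, 0]]),
    "B": np.array([[0, 0], [0, 1]]),
}
# The symbolic DFT of the CFA pattern, evaluated numerically per channel.
freq = {ch: np.fft.fft2(mask).real for ch, mask in bayer.items()}
print(freq["G"])   # [[ 2.  0.]
                   #  [ 0. -2.]]
```

R and B contribute to every frequency bin, while G concentrates at DC and the (pi, pi) corner; linear combinations of these spectra give the luma and chroma components whose placement the CFA design optimizes.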

  1. Modified method to improve the design of Petlyuk distillation columns

    PubMed Central

    2014-01-01

    Background A response-surface analysis was performed to study the effect of the composition and feed thermal condition of ternary mixtures on the number of theoretical stages and the energy consumption of Petlyuk columns. A modification of the pre-design algorithm was necessary for this purpose. Results The modified algorithm provided feasible results in 100% of the studied cases, compared with only 8.89% for the current algorithm. The proposed algorithm allowed us to attain the desired separations regardless of the type of mixture and the operating conditions in the feed stream, something that was not possible with the traditional pre-design method. The results showed that the type of mixture had a great influence on the number of stages and on energy consumption. A higher number of stages and a lower energy consumption were attained with mixtures rich in the light component, while higher energy consumption occurred when the mixture was rich in the heavy component. Conclusions The proposed strategy expands the search for an optimal design of Petlyuk columns within a feasible region, allowing us to find a feasible design that meets output specifications with low thermal loads. PMID:25061476

  2. Sewage-based epidemiology in monitoring the use of new psychoactive substances: Validation and application of an analytical method using LC-MS/MS.

    PubMed

    Kinyua, Juliet; Covaci, Adrian; Maho, Walid; McCall, Ann-Kathrin; Neels, Hugo; van Nuijs, Alexander L N

    2015-09-01

    Sewage-based epidemiology (SBE) employs the analysis of sewage to detect and quantify drug use within a community. While SBE has been applied repeatedly to the estimation of classical illicit drug use, only a few studies have investigated new psychoactive substances (NPS). These compounds mimic the effects of illicit drugs through slight modifications to the chemical structures of controlled illicit drugs. We describe the optimization, validation, and application of an analytical method using liquid chromatography coupled to positive-electrospray tandem mass spectrometry (LC-ESI-MS/MS) for the determination of seven NPS in sewage: methoxetamine (MXE), butylone, ethylone, methylone, methiopropamine (MPA), 4-methoxymethamphetamine (PMMA), and 4-methoxyamphetamine (PMA). Sample preparation was performed using solid-phase extraction (SPE) with Oasis MCX cartridges. The LC separation was performed on a HILIC column (150 × 3 mm, 5 µm), which ensured good resolution of the analytes with a total run time of 19 min. The lower limit of quantification (LLOQ) was between 0.5 and 5 ng/L for all compounds. The method was validated by evaluating the following parameters: sensitivity, selectivity, linearity, accuracy, precision, recoveries and matrix effects. The method was applied to sewage samples collected from sewage treatment plants in Belgium and Switzerland, in which all investigated compounds were detected except MPA and PMA. Furthermore, a consistent presence of MXE was observed in most of the sewage samples at levels higher than the LLOQ. PMID:25655588
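
    The linearity and LLOQ checks mentioned in the validation are calibration-curve exercises. A minimal sketch with invented calibration data (not the paper's): fit a linear calibration by least squares and verify the back-calculated accuracy at the lowest calibration level:

```python
# Invented calibration points: concentration (ng/L) vs. detector response.
conc = [0.5, 1, 5, 10, 50, 100]
area = [0.026, 0.049, 0.252, 0.498, 2.51, 4.98]

# Ordinary least-squares line: area = slope * conc + intercept.
n = len(conc)
sx, sy = sum(conc), sum(area)
sxx = sum(c * c for c in conc)
sxy = sum(c * a for c, a in zip(conc, area))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# Back-calculate the lowest standard and express accuracy in percent;
# guidelines typically require roughly 80-120 % at the LLOQ.
back = (area[0] - intercept) / slope
accuracy = back / conc[0] * 100
print(round(accuracy, 1))
```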

  3. Comparison of Epidemiological Methods for Estimation of Hepatitis B Incidence and Residual Risk for Blood Donors in Southern Brazil

    PubMed Central

    Kupek, Emil; Petry, Andrea

    2011-01-01

    Background and Objective. The objective of this work was to compare three methods for estimating hepatitis B virus (HBV) incidence and residual risk. Methods. Computerized blood-donor records in southern Brazil were examined for the period 2004–2006. The methods for estimating HBV incidence included stand-alone HBsAg, the HBsAg yield method, and an extension of the latter which added recent anti-HBc seroconversions as incident HBV cases. Results. HBV incidences for the above methods were 9.91, 20.09, and 22.93 per 100000 repeat donors, with corresponding residual risks of 1 : 62482, 1 : 30821, and 1 : 47559, respectively. First-time donors had a 52-fold higher HBV incidence than repeat donors. Conclusion. Although the three methods compared produced overlapping 95% confidence intervals, their variation was considerably lower for the method which included recent anti-HBc seroconversions. First-time donors are the primary cause for concern regarding HBV transmission via blood transfusion in southern Brazil. PMID:25346858
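
    Residual risk in such studies is typically estimated with the incidence/window-period model: the risk per donation is the incidence rate multiplied by the length of the pre-seroconversion window. A minimal sketch using the abstract's third incidence estimate; the 34-day window is an assumed illustrative figure, not a value reported by the paper:

```python
# Incidence / window-period model common in transfusion medicine.
def residual_risk(incidence_per_py, window_days):
    """Probability that a donation falls in the infectious window."""
    return incidence_per_py * window_days / 365.25

inc = 22.93 / 100_000            # per person-year, from the abstract
risk = residual_risk(inc, 34)    # assumed 34-day pre-seroconversion window
print(f"about 1 in {round(1 / risk):,}")
```

With a window of roughly a month this lands in the same range as the 1 : 47559 reported above; the exact figure depends on the window length adopted.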

  4. Comparative Performance of Four Methods for High-throughput Glycosylation Analysis of Immunoglobulin G in Genetic and Epidemiological Research*

    PubMed Central

    Huffman, Jennifer E.; Pučić-Baković, Maja; Klarić, Lucija; Hennig, René; Selman, Maurice H. J.; Vučković, Frano; Novokmet, Mislav; Krištić, Jasminka; Borowiak, Matthias; Muth, Thilo; Polašek, Ozren; Razdorov, Genadij; Gornik, Olga; Plomp, Rosina; Theodoratou, Evropi; Wright, Alan F.; Rudan, Igor; Hayward, Caroline; Campbell, Harry; Deelder, André M.; Reichl, Udo; Aulchenko, Yurii S.; Rapp, Erdmann; Wuhrer, Manfred; Lauc, Gordan

    2014-01-01

    The biological and clinical relevance of glycosylation is becoming increasingly recognized, leading to a growing interest in large-scale clinical and population-based studies. In the past few years, several methods for high-throughput analysis of glycans have been developed, but thorough validation and standardization of these methods is required before significant resources are invested in large-scale studies. In this study, we compared liquid chromatography, capillary gel electrophoresis, and two MS methods for quantitative profiling of N-glycosylation of IgG in the same data set of 1201 individuals. To evaluate the accuracy of the four methods we then performed analysis of association with genetic polymorphisms and age. Chromatographic methods with either fluorescent or MS-detection yielded slightly stronger associations than MS-only and multiplexed capillary gel electrophoresis, but at the expense of lower levels of throughput. Advantages and disadvantages of each method were identified, which should inform the selection of the most appropriate method in future studies. PMID:24719452

  5. Airfoil Design and Optimization by the One-Shot Method

    NASA Technical Reports Server (NTRS)

    Kuruvila, G.; Taasan, Shlomo; Salas, M. D.

    1995-01-01

    An efficient numerical approach for the design of optimal aerodynamic shapes is presented in this paper. The objective of any optimization problem is to find the optimum of a cost function subject to a certain state equation (governing equation of the flow field) and certain side constraints. As in classical optimal control methods, the present approach introduces a costate variable (Lagrange multiplier) to evaluate the gradient of the cost function. High efficiency in reaching the optimum solution is achieved by using a multigrid technique and updating the shape in a hierarchical manner such that smooth (low-frequency) changes are done separately from high-frequency changes. Thus, the design variables are changed on a grid where their changes produce nonsmooth (high-frequency) perturbations that can be damped efficiently by the multigrid. The cost of solving the optimization problem is approximately two to three times the cost of the equivalent analysis problem.
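
    The costate idea can be shown on a toy scalar problem, not the flow equations: for a "state equation" a*u = 1 and cost J = (u - u_target)^2, the adjoint variable gives the design gradient dJ/da without differentiating through the solver. All names and numbers below are illustrative:

```python
def gradient_via_adjoint(a, u_target):
    u = 1.0 / a                       # solve the state equation a*u = 1
    lam = 2.0 * (u - u_target) / a    # adjoint equation: a*lam = dJ/du
    return -lam * u                   # dJ/da = -lam * d(a*u - 1)/da

# Cross-check against a finite-difference gradient of the reduced cost.
def J(a, u_target):
    return (1.0 / a - u_target) ** 2

a, ut = 2.0, 0.3
fd = (J(a + 1e-6, ut) - J(a - 1e-6, ut)) / 2e-6
print(gradient_via_adjoint(a, ut), fd)   # both approximately -0.1
```

The payoff, as in the one-shot method, is that one adjoint solve yields the gradient with respect to all design variables at once.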

  6. A Method for Designing CDO Conformed to Investment Parameters

    NASA Astrophysics Data System (ADS)

    Nakae, Tatsuya; Moritsu, Toshiyuki; Komoda, Norihisa

    We propose a method for designing CDOs (Collateralized Debt Obligations) that meet investor requirements on CDO attributes. It is demonstrated that adjusting the attributes of a CDO (credit capability and issue amount) to investors' preferences imposes a capital-loss risk on the agent. We formulate a CDO optimization problem by defining an objective function using this risk and by setting constraints that arise from investor needs and the risk premium paid to the agent. Our prototype experiment, in which fictitious underlying obligations and investor needs are given, verifies that CDOs can be designed without opportunity loss or dead-stock loss, and that the capital loss is no more than one-thousandth of the amount of annual payment under guarantee for small and medium-sized enterprises by a general credit-guarantee institution.

  7. A formal method for early spacecraft design verification

    NASA Astrophysics Data System (ADS)

    Fischer, P. M.; Ludtke, D.; Schaus, V.; Gerndt, A.

    In the early design phase of a spacecraft, various aspects of the system under development are described and modeled using parameters such as masses, power consumption or data rates. Power and data parameters in particular are special, since their values can change depending on the spacecraft's operational mode. These mode-dependent parameters can be easily verified against static requirements such as a maximum data rate. Such quick verifications allow the engineers to check the design after every change they apply. In contrast, requirements concerning the mission lifetime, such as the amount of downlinked data during the whole mission, demand a more complex procedure. We propose an executable model, together with a simulation framework, to evaluate complex mission scenarios. In conjunction with a formalized specification of mission requirements, it allows quick verification by means of formal methods.
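
    The distinction between the two kinds of check can be sketched in a few lines. The modes, rates and timeline below are invented for illustration: a static requirement is checked directly against the mode table, while a lifetime requirement needs the mode timeline to be simulated:

```python
MODES = {"safe": 0.1, "science": 8.0, "downlink": 2.0}   # data rate, Mbit/s
MAX_RATE = 10.0   # static requirement: link capacity (Mbit/s)

def verify_static():
    """Quick check after every design change: no mode exceeds the link."""
    return all(rate <= MAX_RATE for rate in MODES.values())

def downlinked_gbit(timeline):
    """Lifetime check: simulate a mission timeline [(mode, hours), ...]."""
    return sum(MODES[m] * h * 3600 / 1000
               for m, h in timeline if m == "downlink")

timeline = [("science", 10), ("downlink", 4), ("safe", 2)] * 100
print(verify_static(), downlinked_gbit(timeline))
```

A formal-methods framework would replace the ad-hoc simulation loop with an executable model checked against formalized requirements, but the division of labour is the same.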

  8. Collocation methods for distillation design. 2: Applications for distillation

    SciTech Connect

    Huss, R.S.; Westerberg, A.W.

    1996-05-01

The authors present applications of a collocation method for modeling distillation columns that they developed in a companion paper. They discuss implementation of the model, including the ASCEND (Advanced System for Computations in ENgineering Design) system, which enables one to create complex models with simple building blocks and interactively learn to solve them. They first apply the model to compute minimum reflux for a given separation task, solving nonsharp split minimum reflux problems exactly and sharp split problems approximately. They next illustrate the use of the collocation model to optimize the design of a single column capable of carrying out a prescribed set of separation tasks. The optimization picks the best column diameter and total number of trays. It also picks the feed tray for each of the prescribed separations.

  9. Statistical Methods for Rapid Aerothermal Analysis and Design Technology: Validation

    NASA Technical Reports Server (NTRS)

    DePriest, Douglas; Morgan, Carolyn

    2003-01-01

The cost and safety goals for NASA's next generation of reusable launch vehicles (RLV) will require that rapid, high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to identify adequate statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The initial research work focused on establishing suitable candidate models for these purposes. The second phase focused on assessing how accurately these models predict the heat rate for a given candidate data set. This validation work compared models and methods that may be useful in predicting the heat rate.

  10. Conceptual Design Method Developed for Advanced Propulsion Nozzles

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth; Barnhart, Paul J.

    1998-01-01

    As part of a contract with the NASA Lewis Research Center, a simple, accurate method of predicting the performance characteristics of a nozzle design has been developed for use in conceptual design studies. The Nozzle Performance Analysis Code (NPAC) can predict the on- and off-design performance of axisymmetric or two-dimensional convergent and convergent-divergent nozzle geometries. NPAC accounts for the effects of overexpansion or underexpansion, flow divergence, wall friction, heat transfer, and small mass addition or loss across surfaces when the nozzle gross thrust and gross thrust coefficient are being computed. NPAC can be used to predict the performance of a given nozzle design or to develop a preliminary nozzle system design for subsequent analysis. The input required by NPAC consists of a simple geometry definition of the nozzle surfaces, the location of key nozzle stations (entrance, throat, exit), and the nozzle entrance flow properties. NPAC performs three analysis "passes" on the nozzle geometry. First, an isentropic control volume analysis is performed to determine the gross thrust and gross thrust coefficient of the nozzle. During the second analysis pass, the skin friction and heat transfer losses are computed. The third analysis pass couples the effects of wall shear and heat transfer with the initial internal nozzle flow solutions to produce a system of equations that is solved at steps along the nozzle geometry. Small mass additions or losses, such as those resulting from leakage or bleed flow, can be included in the model at specified geometric sections. A final correction is made to account for divergence losses that are incurred if the nozzle exit flow is not purely axial.
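The first NPAC analysis pass is an isentropic control-volume computation of gross thrust. A minimal sketch of that kind of calculation, assuming a fully expanded nozzle and invented flow conditions (gamma, R, mass flow, and pressures are not NPAC inputs from the source), is:

```python
import math

# Sketch of an isentropic control-volume thrust estimate, the kind of
# computation NPAC's first analysis pass performs. All numerical values
# are assumed for illustration.

def ideal_gross_thrust(mdot, Tt, pt, pa, gamma=1.4, R=287.0):
    """Thrust of a fully (isentropically) expanded nozzle: F = mdot * Ve."""
    # Isentropic exit velocity for expansion from total conditions to ambient.
    Ve = math.sqrt(2.0 * gamma / (gamma - 1.0) * R * Tt
                   * (1.0 - (pa / pt) ** ((gamma - 1.0) / gamma)))
    return mdot * Ve

F = ideal_gross_thrust(mdot=50.0, Tt=1500.0, pt=4.0e5, pa=1.0e5)
print(round(F, 1))  # gross thrust in newtons
```

NPAC's later passes then subtract skin friction, heat transfer, and divergence losses from this ideal value; the ratio of corrected to ideal thrust is the gross thrust coefficient.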

  11. Hardware architecture design of a fast global motion estimation method

    NASA Astrophysics Data System (ADS)

    Liang, Chaobing; Sang, Hongshi; Shen, Xubang

    2015-12-01

VLSI implementation of gradient-based global motion estimation (GME) faces two main challenges: irregular data access and a high off-chip memory bandwidth requirement. We previously proposed a fast GME method that reduces computational complexity by choosing a certain number of small patches containing corners and using them in a gradient-based framework. A hardware architecture is designed to implement this method and further reduce the off-chip memory bandwidth requirement. On-chip memories are used to store the coordinates of the corners and the template patches, while the Gaussian pyramids of both the template and reference frames are stored in off-chip SDRAMs. By performing the geometric transform only on the coordinates of the center pixel of a 3-by-3 patch in the template image, a 5-by-5 area containing the warped 3-by-3 patch in the reference image is extracted from the SDRAMs by burst read. Patch-based, burst-mode data access helps keep the off-chip memory bandwidth requirement to a minimum. Although the patch size varies at different pyramid levels, all patches are processed in terms of 3-by-3 patches, so the utilization of the patch-processing circuit reaches 100%. FPGA implementation results show that the design uses 24,080 bits of on-chip memory and, for a sequence with a resolution of 352x288 at 60 Hz, the off-chip bandwidth requirement is only 3.96 Mbyte/s, compared with 243.84 Mbyte/s for the original gradient-based GME method. This design can be used in applications such as video codecs, video stabilization, and super-resolution, where real-time GME is a necessity and a minimum memory bandwidth requirement is appreciated.
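The access pattern at the heart of the architecture, warping only each patch's center coordinate and then burst-reading a 5-by-5 window that is guaranteed to contain the warped 3-by-3 patch, can be sketched in NumPy. The affine parameters and frame contents below are made up; the real design does this per pyramid level in hardware.

```python
import numpy as np

# Sketch of the patch-based access pattern: transform only the center
# coordinate, then fetch one contiguous 5x5 crop per patch (the software
# analogue of an SDRAM burst read). Values are assumed for illustration.

def fetch_windows(reference, centers, affine):
    """Return 5x5 windows of `reference` around affinely warped centers."""
    a, b, tx, c, d, ty = affine
    windows = []
    for (x, y) in centers:
        # Geometric transform applied to the center coordinate only.
        xw = int(round(a * x + b * y + tx))
        yw = int(round(c * x + d * y + ty))
        # One contiguous 5x5 crop per patch, centered on the warped point.
        windows.append(reference[yw - 2:yw + 3, xw - 2:xw + 3])
    return windows

ref = np.arange(100, dtype=np.float32).reshape(10, 10)
wins = fetch_windows(ref, centers=[(4, 4), (6, 5)],
                     affine=(1.0, 0.0, 0.6, 0.0, 1.0, -0.4))
print([w.shape for w in wins])  # → [(5, 5), (5, 5)]
```

Because each fetch is a fixed-size contiguous block rather than a scattered warped footprint, the memory traffic stays regular, which is what makes the burst-mode bandwidth figure possible.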

  12. Infectious Agents and Cancer Epidemiology Research Webinar Series

    Cancer.gov

    Infectious Agents and Cancer Epidemiology Research Webinar Series highlights emerging and cutting-edge research related to infection-associated cancers, shares scientific knowledge about technologies and methods, and fosters cross-disciplinary discussions on infectious agents and cancer epidemiology.

  13. Design of braided composite tubes by numerical analysis method

    SciTech Connect

    Hamada, Hiroyuki; Fujita, Akihiro; Maekawa, Zenichiro; Nakai, Asami; Yokoyama, Atsushi

    1995-11-01

Conventional composite laminates have very poor through-thickness strength and as a result are limited in their application to structural parts with complex shapes. In this paper, a design method for braided composite tubes is proposed. The concept of an analysis model spanning from the micro model to the macro model is presented. The method was applied to predict the bending rigidity and the initial fracture stress under bending load of a braided tube. The proposed analytical procedure can be included as a unit in a CAE system for braided composites.

  14. Methods to Design and Synthesize Antibody-Drug Conjugates (ADCs)

    PubMed Central

    Yao, Houzong; Jiang, Feng; Lu, Aiping; Zhang, Ge

    2016-01-01

    Antibody-drug conjugates (ADCs) have become a promising targeted therapy strategy that combines the specificity, favorable pharmacokinetics and biodistributions of antibodies with the destructive potential of highly potent drugs. One of the biggest challenges in the development of ADCs is the application of suitable linkers for conjugating drugs to antibodies. Recently, the design and synthesis of linkers are making great progress. In this review, we present the methods that are currently used to synthesize antibody-drug conjugates by using thiols, amines, alcohols, aldehydes and azides. PMID:26848651

  15. The Schisto Track: A System for Gathering and Monitoring Epidemiological Surveys by Connecting Geographical Information Systems in Real Time

    PubMed Central

    2014-01-01

    Background Using the Android platform as a notification instrument for diseases and disorders forms a new alternative for computerization of epidemiological studies. Objective The objective of our study was to construct a tool for gathering epidemiological data on schistosomiasis using the Android platform. Methods The developed application (app), named the Schisto Track, is a tool for data capture and analysis that was designed to meet the needs of a traditional epidemiological survey. An initial version of the app was finished and tested in both real situations and simulations for epidemiological surveys. Results The app proved to be a tool capable of automation of activities, with data organization and standardization, easy data recovery (to enable interfacing with other systems), and totally modular architecture. Conclusions The proposed Schisto Track is in line with worldwide trends toward use of smartphones with the Android platform for modeling epidemiological scenarios. PMID:25099881

  16. An alternate method for designing dipole magnet ends

    SciTech Connect

    Pope, W.L.; Green, M.A.; Peters, C.; Caspi, S.; Taylor, C.E.

    1988-08-01

Small bore superconducting dipole magnets, such as those for the SSC, often have problems in the ends. These problems can often be alleviated by spreading out the end windings so that the conductor sees less deformation. This paper presents a new procedure for designing dipole magnet ends, applicable to magnets with either cylindrical or conical bulged ends, so that the integrated field multipoles meet the constraints imposed by the SSC lattice. The method described here couples existing multiparameter optimization routines (i.e., MINUIT with suitable independent parameter constraints) with a computer code, DIPEND, which computes the multipoles, so that one can meet any reasonable objective (e.g., minimizing integrated sextupole and decapole). This paper describes how the computer method was used to analyze the bulged conical ends for an SSC dipole. 6 refs, 6 figs, 2 tabs.

  17. A Method of Trajectory Design for Manned Asteroids Exploration

    NASA Astrophysics Data System (ADS)

    Gan, Q. B.; Zhang, Y.; Zhu, Z. F.; Han, W. H.; Dong, X.

    2014-11-01

A trajectory optimization method for nuclear-propulsion manned asteroid exploration is presented. For launch windows between 2035 and 2065, the Earth departure and return phases are first searched on the basis of Lambert transfer orbits. The optimal flight trajectory within the feasible regions is then selected by pruning the flight sequences. Setting the nuclear propulsion flight plan as propel-coast-propel, and taking the minimal departure mass as the index, the nuclear propulsion flight trajectory of each of the three phases is optimized separately using a hybrid method. With the optimized local parameters of the three phases as initial values, the global parameters are then optimized jointly. Finally, the minimal-departure-mass trajectory design result is given.

  18. An FPGA-based heterogeneous image fusion system design method

    NASA Astrophysics Data System (ADS)

    Song, Le; Lin, Yu-chi; Chen, Yan-hua; Zhao, Mei-rong

    2011-08-01

Taking advantage of the FPGA's low cost and compact structure, an FPGA-based heterogeneous image fusion platform is established in this study. Altera's Cyclone IV series FPGA is adopted as the core processor of the platform, and a visible-light CCD camera and an infrared thermal imager are used as the image-capturing devices in order to obtain dual-channel heterogeneous video images. Tailor-made image fusion algorithms, such as gray-scale weighted averaging, maximum selection, and minimum selection, are analyzed and compared. The VHDL language and a synchronous design method are utilized to produce a reliable RTL-level description. Altera's Quartus II 9.0 software is applied to simulate and implement the algorithm modules. Contrast experiments with the various fusion algorithms show that preferable image quality of heterogeneous image fusion can be obtained with the proposed system. The applicable range of the different fusion algorithms is also discussed.
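The three pixel-level fusion rules the abstract compares are simple enough to sketch in NumPy for clarity (the paper itself implements them in VHDL on the FPGA); the weight and the tiny 8-bit test images below are assumed values.

```python
import numpy as np

# Software sketch of the three fusion rules compared in the paper:
# gray-scale weighted averaging, maximum selection, minimum selection.
# Test images and the weight w are invented for illustration.

def fuse(visible, infrared, rule, w=0.5):
    v = visible.astype(np.float32)
    i = infrared.astype(np.float32)
    if rule == "weighted":        # gray-scale weighted averaging
        out = w * v + (1.0 - w) * i
    elif rule == "maximum":       # maximum selection
        out = np.maximum(v, i)
    elif rule == "minimum":       # minimum selection
        out = np.minimum(v, i)
    else:
        raise ValueError(rule)
    return np.clip(out, 0, 255).astype(np.uint8)

vis = np.array([[100, 200], [0, 50]], dtype=np.uint8)
ir  = np.array([[150,  60], [30, 50]], dtype=np.uint8)
print(fuse(vis, ir, "weighted").tolist())  # → [[125, 130], [15, 50]]
print(fuse(vis, ir, "maximum").tolist())   # → [[150, 200], [30, 50]]
```

Weighted averaging preserves context from both channels, while maximum selection favors the hotter/brighter source per pixel, which is why the paper evaluates their applicable ranges separately.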

  19. A novel observer design method for neural mass models

    NASA Astrophysics Data System (ADS)

    Liu, Xian; Miao, Dong-Kai; Gao, Qing; Xu, Shi-Yun

    2015-09-01

    Neural mass models can simulate the generation of electroencephalography (EEG) signals with different rhythms, and therefore the observation of the states of these models plays a significant role in brain research. The structure of neural mass models is special in that they can be expressed as Lurie systems. The developed techniques in Lurie system theory are applicable to these models. We here provide a new observer design method for neural mass models by transforming these models and the corresponding error systems into nonlinear systems with Lurie form. The purpose is to establish appropriate conditions which ensure the convergence of the estimation error. The effectiveness of the proposed method is illustrated by numerical simulations. Project supported by the National Natural Science Foundation of China (Grant Nos. 61473245, 61004050, and 51207144).

  20. [A method for studying social security records in epidemiology. Use in a study on the prognosis of chronic bronchitis (author's transl)].

    PubMed

    Kauffmann, F; Bahi, J; Brille, D

    1976-01-01

A method is presented for studying social security records in epidemiological research. This study is based upon the records of workers affiliated with the French social security general system. To obtain comparable data, it was necessary to take the legislation as a basis; this legislation specifies the data which must be in the records. A study of laws and rules was carried out to locate these data in the medical record and in the administrative one. A questionnaire is presented. This basic questionnaire should be modified according to the precise objectives of each study and to the characteristics of the population sample. To illustrate the method, some results from a study of chronic bronchitis risk factors are presented in the second part. These results concern 950 men, born in France, aged 30 to 59 in 1960 and still alive in 1972. The study of long-term reductions of the ability to work occurring from 1960 to 1971 confirms the disabling character of the group "chronic bronchitis, asthma, emphysema, respiratory insufficiency", which ranks immediately after cardiovascular and rheumatic diseases. The total number of social security beneficiaries is already very large and the whole population will soon be covered. The use of social security records as a data source could yield very valuable information about morbidity, making it possible to study representative samples of the general population or of particular groups, which has, up to now, been done only to a slight extent. PMID:1019402

  1. METHODS DEVELOPMENT FOR ASSESSING AIR POLLUTION CONTROL BENEFITS. VOLUME I. EXPERIMENTS IN THE ECONOMICS OF AIR POLLUTION EPIDEMIOLOGY

    EPA Science Inventory

    The volume employs the analytical and empirical methods of economics to develop hypotheses on disease etiologies and to value labor productivity and consumer losses due to air pollution-induced mortality and morbidity. In the mortality work, 1970 city-wide mortality rates for maj...

  2. Impact design methods for ceramic components in gas turbine engines

    SciTech Connect

Song, J.; Cuccio, J.; Kington, H. (Garrett Auxiliary Power Division)

    1993-01-01

Garrett Auxiliary Power Division of Allied-Signal Aerospace Company is developing methods to design ceramic turbine components with improved impact resistance. In an ongoing research effort under the DOE/NASA-funded Advanced Turbine Technology Applications Project (ATTAP), two different modes of impact damage have been identified and characterized: local damage and structural damage. Local impact damage to Si3N4 impacted by spherical projectiles usually takes the form of ring and/or radial cracks in the vicinity of the impact point. Baseline data from Si3N4 test bars impacted by 1.588-mm (0.0625-in.) diameter NC-132 projectiles indicate that the critical velocity at which the probability of detecting surface cracks is 50 percent was 130 m/s (426 ft/sec). A microphysics-based model that assumes damage to be in the form of microcracks has been developed to predict local impact damage. Local stress and strain determine microcrack nucleation and propagation, which in turn alter local stress and strain through modulus degradation. Material damage is quantified by a damage parameter related to the volume fraction of microcracks. The entire computation has been incorporated into the EPIC computer code. Model capability is being demonstrated by simulating instrumented plate impact and particle impact tests. Structural impact damage usually occurs in the form of fast fracture caused by bending stresses that exceed the material strength. The EPIC code has been successfully used to predict radial and axial blade failures from impacts by various sized particles. This method is also being used in conjunction with Taguchi experimental methods to investigate the effects of design parameters on turbine blade impact resistance. It has been shown that significant improvement in impact resistance can be achieved by using the configuration recommended by Taguchi methods.

  3. Novel computational methods to design protein-protein interactions

    NASA Astrophysics Data System (ADS)

    Zhou, Alice Qinhua; O'Hern, Corey; Regan, Lynne

    2014-03-01

    Despite the abundance of structural data, we still cannot accurately predict the structural and energetic changes resulting from mutations at protein interfaces. The inadequacy of current computational approaches to the analysis and design of protein-protein interactions has hampered the development of novel therapeutic and diagnostic agents. In this work, we apply a simple physical model that includes only a minimal set of geometrical constraints, excluded volume, and attractive van der Waals interactions to 1) rank the binding affinity of mutants of tetratricopeptide repeat proteins with their cognate peptides, 2) rank the energetics of binding of small designed proteins to the hydrophobic stem region of the influenza hemagglutinin protein, and 3) predict the stability of T4 lysozyme and staphylococcal nuclease mutants. This work will not only lead to a fundamental understanding of protein-protein interactions, but also to the development of efficient computational methods to rationally design protein interfaces with tunable specificity and affinity, and numerous applications in biomedicine. NSF DMR-1006537, PHY-1019147, Raymond and Beverly Sackler Institute for Biological, Physical and Engineering Sciences, and Howard Hughes Medical Institute.

  4. Sensitivity method for integrated structure/active control law design

    NASA Technical Reports Server (NTRS)

    Gilbert, Michael G.

    1987-01-01

The development of an integrated structure/active control law design methodology for aeroelastic aircraft applications is described. A short motivating introduction to aeroservoelasticity is given, along with the need for integrated structures/controls design algorithms. Three alternative approaches to the development of an integrated design method are briefly discussed with regard to complexity, coordination and tradeoff strategies, and the nature of the resulting solutions. This leads to the formulation of the proposed approach, which is based on the concepts of sensitivity of optimum solutions and multi-level decompositions. The concept of sensitivity of the optimum is explained in more detail and compared with the traditional sensitivity concepts of classical control theory. The analytical sensitivity expressions for the solution of the linear quadratic Gaussian (LQG) control problem are summarized in terms of the linear regulator solution and the Kalman filter solution. Numerical results for a state-space aeroelastic model of the DAST ARW-II vehicle are given, showing the changes in aircraft response to variations of a structural parameter, in this case the first wing bending natural frequency.
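The "sensitivity of the optimum" idea, how the optimal control solution shifts when a structural parameter changes, can be illustrated numerically on the regulator half of the LQG problem. The two-state oscillator below is an invented toy standing in for a structural mode (it is not the DAST ARW-II model), and the derivative is taken by finite differences rather than the paper's analytical expressions.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Finite-difference illustration of sensitivity of the optimum: how the
# LQR gain changes as a structural parameter w (a toy bending frequency)
# is perturbed. Model, weights, and numbers are assumed for illustration.

def lqr_gain(w, zeta=0.02):
    A = np.array([[0.0, 1.0], [-w**2, -2.0 * zeta * w]])  # toy mode dynamics
    B = np.array([[0.0], [1.0]])
    Q = np.eye(2)
    R = np.array([[1.0]])
    P = solve_continuous_are(A, B, Q, R)        # Riccati solution
    return np.linalg.solve(R, B.T @ P)          # K = R^{-1} B^T P

w0, dw = 10.0, 1e-4
dK_dw = (lqr_gain(w0 + dw) - lqr_gain(w0 - dw)) / (2.0 * dw)
print(dK_dw.shape)  # → (1, 2)
```

In an integrated structures/controls loop, a derivative like dK/dw tells the structural optimizer how a sizing change will ripple into the control law without re-solving the full control design at every step.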

  5. 77 FR 32632 - Ambient Air Monitoring Reference and Equivalent Methods: Designation of Three New Equivalent Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-01

    ...Notice is hereby given that the Environmental Protection Agency (EPA) has designated, in accordance with 40 CFR Part 53, three new equivalent methods: One for measuring concentrations of nitrogen dioxide (NO2) and two for measuring concentrations of lead (Pb) in the ambient...

  6. Simplified design method for shear-valve magnetorheological dampers

    NASA Astrophysics Data System (ADS)

    Ding, Yang; Zhang, Lu; Zhu, Haitao; Li, Zhongxian

    2014-12-01

Based on the Bingham parallel-plate model, a simplified design method for shear-valve magnetorheological (MR) dampers is proposed that takes magnetic circuit optimization into account. Correspondingly, a new MR damper with a full-length effective damping path is proposed. Prototype dampers were fabricated and studied numerically and experimentally. Based on the test results, the Bingham parallel-plate model is further modified to obtain a damping force prediction model for the proposed MR dampers that accounts for the magnetic saturation phenomenon. The study indicates that the proposed simplified design method is simple, effective, and reliable. The maximum damping force of the proposed MR dampers with a full-length effective damping path is at least twice as large as that of conventional MR dampers, and the dynamic range of the damping force increases by at least 70%. Because the proposed damping force prediction model accounts for magnetic saturation, it reflects the actual characteristics of MR fluids and can predict the actual damping force of the MR dampers precisely.
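A generic Bingham parallel-plate damper force model with a simple saturation law can be sketched as follows. The geometry, fluid properties, and tanh saturation constant are all assumed illustration values, not the paper's identified parameters or its specific modification of the model.

```python
import math

# Minimal Bingham parallel-plate sketch: viscous term plus a
# field-dependent yield (Coulomb-like) term, with a tanh saturation law
# for the yield stress. All numerical values are assumed.

def mr_damper_force(v, I, eta=0.1, L=0.04, b=0.15, h=0.001, Ap=2.0e-3,
                    tau_max=50e3, I_sat=2.0):
    """Damping force [N] for piston velocity v [m/s] and coil current I [A]."""
    # Saturation: yield stress grows with current but levels off.
    tau_y = tau_max * math.tanh(I / I_sat)
    # Viscous (Newtonian) contribution of the parallel-plate flow.
    viscous = 12.0 * eta * L * Ap**2 / (b * h**3) * v
    # Yield-stress contribution, acting against the motion direction.
    coulomb = 3.0 * L * tau_y * Ap / h * math.copysign(1.0, v) if v else 0.0
    return viscous + coulomb

print(round(mr_damper_force(v=0.1, I=1.0), 1))  # force in newtons
```

The two-term structure makes the dynamic range explicit: the ratio of the current-on force to the current-off (purely viscous) force is what the paper reports increasing by at least 70%.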

  7. International Lymphoma Epidemiology Consortium

    Cancer.gov

    The InterLymph Consortium, or formally the International Consortium of Investigators Working on Non-Hodgkin's Lymphoma Epidemiologic Studies, is an open scientific forum for epidemiologic research in non-Hodgkin's lymphoma.

  8. Development of Analysis Methods for Designing with Composites

    NASA Technical Reports Server (NTRS)

    Madenci, E.

    1999-01-01

The project involved the development of new analysis methods to achieve efficient design of composite structures. We developed a complex variational formulation to analyze the in-plane and bending coupling response of an unsymmetrically laminated plate with an elliptical cutout subjected to arbitrary edge loading as shown in Figure 1. This formulation utilizes four independent complex potentials that satisfy the coupled in-plane and bending equilibrium equations, thus eliminating the area integrals from the strain energy expression. The solution to a finite geometry laminate under arbitrary loading is obtained by minimizing the total potential energy function and solving for the unknown coefficients of the complex potentials. The validity of this approach is demonstrated by comparison with finite element analysis predictions for a laminate with an inclined elliptical cutout under bi-axial loading. The geometry and loading of this laminate with a lay-up of [-45/45] are shown in Figure 2. The deformed configuration shown in Figure 3 reflects the presence of bending-stretching coupling. The validity of the present method is established by comparing the out-of-plane deflections along the boundary of the elliptical cutout from the present approach with those of the finite element method. The comparison shown in Figure 4 indicates remarkable agreement. The details of this method are described in a manuscript by Madenci et al. (1998).

  9. A new method of dual FOV optical system design

    NASA Astrophysics Data System (ADS)

    Zhang, Liang

    2009-07-01

With the development of science and technology, infrared imaging has been applied in fields such as industry, medical treatment, and national defense. Infrared detection has the advantage of seeing through smoke, fog, haze, and snow, and can also avoid the effects of battlefield flash. Hence, it can achieve long-distance, all-weather reconnaissance, especially at night and in bad weather conditions. Single-FOV, dual-FOV, multi-FOV, and continuous-zoom optical systems have all seen increasing application as infrared imaging technology has matured, making the study of dual-FOV optical systems ever more important. Such a system belongs to the class of simple zoom optical systems, having two fields of view. Zoom methods include single zoom, rotary zoom, radial zoom, axial zoom, and so on. Based on an analysis of these zoom methods, a new zoom approach has been developed, which realizes a dual-FOV optical system by sharing the secondary imaging lenses. This design method brings the results close to the diffraction limit and improves the precision of the optical axis. It also decreases the number of moving parts and reduces the difficulty of system assembly.

  10. A New Aerodynamic Data Dispersion Method for Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T.

    2011-01-01

    A novel method for implementing aerodynamic data dispersion analysis is herein introduced. A general mathematical approach combined with physical modeling tailored to the aerodynamic quantity of interest enables the generation of more realistically relevant dispersed data and, in turn, more reasonable flight simulation results. The method simultaneously allows for the aerodynamic quantities and their derivatives to be dispersed given a set of non-arbitrary constraints, which stresses the controls model in more ways than with the traditional bias up or down of the nominal data within the uncertainty bounds. The adoption and implementation of this new method within the NASA Ares I Crew Launch Vehicle Project has resulted in significant increases in predicted roll control authority, and lowered the induced risks for flight test operations. One direct impact on launch vehicles is a reduced size for auxiliary control systems, and the possibility of an increased payload. This technique has the potential of being applied to problems in multiple areas where nominal data together with uncertainties are used to produce simulations using Monte Carlo type random sampling methods. It is recommended that a tailored physics-based dispersion model be delivered with any aerodynamic product that includes nominal data and uncertainties, in order to make flight simulations more realistic and allow for leaner spacecraft designs.
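The contrast the abstract draws, dispersing both a quantity and its derivative under a bound, versus simply biasing the whole nominal curve up or down, can be illustrated with a toy Monte Carlo sketch. The nominal curve, uncertainty magnitudes, and constraint below are invented; they are not the Ares I aerodynamic model.

```python
import random

# Toy dispersion sketch: each Monte Carlo sample draws both an offset and
# a slope (derivative) perturbation of a nominal coefficient curve,
# subject to a bound on the total deviation. All numbers are assumed.

def nominal_cn(alpha):          # nominal coefficient vs angle of attack (deg)
    return 0.05 * alpha

def dispersed_curve(rng, d_offset=0.02, d_slope=0.005, bound=0.08):
    off = rng.uniform(-d_offset, d_offset)
    slp = rng.uniform(-d_slope, d_slope)
    def cn(alpha):
        dev = off + slp * alpha               # value AND derivative dispersed
        dev = max(-bound, min(bound, dev))    # non-arbitrary constraint
        return nominal_cn(alpha) + dev
    return cn

rng = random.Random(0)
samples = [dispersed_curve(rng) for _ in range(1000)]
devs = [abs(c(10.0) - nominal_cn(10.0)) for c in samples]
print(max(devs) <= 0.08)  # → True
```

Because each sample perturbs the slope as well as the level, a controls model exercised against these curves sees varied control derivatives, not just shifted trims, which is the stress the abstract says the traditional up/down bias misses.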

  11. Typing multidrug-resistant Staphylococcus aureus: conflicting epidemiological data produced by genotypic and phenotypic methods clarified by phylogenetic analysis.

    PubMed Central

    Jorgensen, M; Givney, R; Pegler, M; Vickery, A; Funnell, G

    1996-01-01

    An outbreak of an unusual tetracycline-sensitive, rifampicin- and ciprofloxacin-resistant, methicillin-resistant Staphylococcus aureus (MRSA) strain at a large teaching hospital was investigated. Two typing methods, phage typing and restriction fragment length polymorphism (RFLP) by pulsed-field gel electrophoresis (RFLP-PFGE), gave conflicting results which were clarified by phylogenetic analysis. Phage typing identified all the "epidemic-associated" strains as identical, while RFLP-PFGE further divided these strains into four pulsotypes. Phylogenetic analysis showed these four pulsotypes were related genetically and also recognized a second strain of MRSA causing a continuing cross-infection problem. Variation in the RFLP-PFGE pattern was shown to occur following lysogenization of phage-sensitive MRSA. These results indicate that in analyzing outbreaks caused by subgroups of clonal organisms like MRSA, it is necessary to use at least two typing methods and that conflicts between these could be resolved by phylogenetic analysis. PMID:8789023

  12. Nanobiological studies on drug design using molecular mechanic method

    PubMed Central

    Ghaheh, Hooria Seyedhosseini; Mousavi, Maryam; Araghi, Mahmood; Rasoolzadeh, Reza; Hosseini, Zahra

    2015-01-01

Background: Influenza H1N1 is very important worldwide, and point mutations that occur in the virus genome are a threat for the World Health Organization (WHO) and for drug developers, since they could make this virus resistant to existing drugs. Influenza epidemics cause severe respiratory illness in 30 to 50 million people and kill 250,000 to 500,000 people worldwide every year. Nowadays, drug design is not done through trial and error because of the cost and time involved; bioinformatics studies are therefore essential for designing drugs. Materials and Methods: This paper presents a study of the binding site of the neuraminidase (NA) enzyme, which is very important in drug design, at a temperature of 310 K and in different dielectrics, for the best drug design. Information on the NA enzyme was extracted from the Protein Data Bank (PDB) and National Center for Biotechnology Information (NCBI) websites. The new sequences of N1 were downloaded from the NCBI influenza virus sequence database. Drug binding sites were modeled by homology using the ArgusLab 4.0, HyperChem 6.0, and Chem3D software packages. Their stability was assessed in different dielectrics and at different temperatures. Results: Measurements of the potential energy (kcal/mol) of the NA binding sites in different dielectrics at 310 K revealed that at time step size = 0 ps the drug binding sites have the maximum energy level, and at time step size = 100 ps they have maximum stability and minimum energy. Conclusions: Drug binding sites are more dependent on dielectric constants than on temperature, and the optimum dielectric constant is 39/78. PMID:26605248

  13. Epidemiology, Science as Inquiry and Scientific Literacy

    ERIC Educational Resources Information Center

    Kaelin, Mark; Huebner, Wendy

    2003-01-01

    The recent worldwide SARS outbreak has put the science of epidemiology into the headlines once again. Epidemiology is "... the study of the distribution and the determinants of health-related states or events and the application of these methods to the control of health problems" (Gordis 2000). In this context, the authors have developed a…

  14. Cancer Epidemiology Matters Blog

    Cancer.gov

    The Cancer Epidemiology Matters blog helps foster a dialogue between the National Cancer Institute's (NCI) Epidemiology and Genomics Research Program (EGRP), extramural researchers, and other individuals, such as clinicians, community partners, and advocates, who are interested in cancer epidemiology and genomics.

  15. Formal methods in the design of Ada 1995

    NASA Technical Reports Server (NTRS)

    Guaspari, David

    1995-01-01

    Formal, mathematical methods are most useful when applied early in the design and implementation of a software system--that, at least, is the familiar refrain. I will report on a modest effort to apply formal methods at the earliest possible stage, namely, in the design of the Ada 95 programming language itself. This talk is an 'experience report' that provides brief case studies illustrating the kinds of problems we worked on, how we approached them, and the extent (if any) to which the results proved useful. It also derives some lessons and suggestions for those undertaking future projects of this kind. Ada 95 is the first revision of the standard for the Ada programming language. The revision began in 1988, when the Ada Joint Programming Office first asked the Ada Board to recommend a plan for revising the Ada standard. The first step in the revision was to solicit criticisms of Ada 83. A set of requirements for the new language standard, based on those criticisms, was published in 1990. A small design team, the Mapping Revision Team (MRT), became exclusively responsible for revising the language standard to satisfy those requirements. The MRT, from Intermetrics, is led by S. Tucker Taft. The work of the MRT was regularly subject to independent review and criticism by a committee of distinguished Reviewers and by several advisory teams--for example, the two User/Implementor teams, each consisting of an industrial user (attempting to make significant use of the new language on a realistic application) and a compiler vendor (undertaking, experimentally, to modify its current implementation in order to provide the necessary new features). One novel decision established the Language Precision Team (LPT), which investigated language proposals from a mathematical point of view. The LPT applied formal mathematical analysis to help improve the design of Ada 95 (e.g., by clarifying the language proposals) and to help promote its acceptance (e.g., by identifying a

  16. Learning physics: A comparative analysis between instructional design methods

    NASA Astrophysics Data System (ADS)

    Mathew, Easow

    The purpose of this research was to determine whether there were differences in academic performance between students who participated in traditional versus collaborative problem-based learning (PBL) instructional design approaches to physics curricula. This study used a quantitative quasi-experimental design to determine the significance of differences in pre- and posttest introductory physics exam performance between students who participated in traditional (i.e., control group) versus collaborative problem-based learning (i.e., experimental group) approaches to physics curricula over a college semester in 2008. There were 42 student participants (N = 42) enrolled in an introductory physics course at the research site in the Spring 2008 semester who agreed to participate in this study after reading and signing informed consent documents. A total of 22 participants were assigned to the experimental group (n = 22), which received a PBL-based teaching methodology alongside traditional lecture methods; the other 20 students were assigned to the control group (n = 20), which received the traditional lecture methodology alone. Both courses were taught by experienced professors with doctoral qualifications. The results indicated statistically significant differences (p < .01) in academic performance between students in the traditional approach (lower physics posttest scores and smaller pre- to posttest gains) and those in the collaborative approach (higher posttest scores and larger gains). Despite some slight differences in control group and experimental group demographic characteristics (gender, ethnicity, and age), there were statistically significant (p = .04) differences between female average academic improvement, which was much higher than male average academic improvement (˜63%) in
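The core comparison in such a pre/post quasi-experiment can be sketched as a Welch t-test on gain scores. The scores below are invented for illustration; they are not the study's data.

```python
import numpy as np

# Invented pre-to-post gain scores for a control (lecture) group and a
# PBL (experimental) group; ten students each, mirroring the study's design only.
control_gain = np.array([4, 6, 5, 7, 3, 5, 6, 4, 5, 6], float)
pbl_gain     = np.array([9, 11, 8, 12, 10, 9, 11, 10, 12, 9], float)

# Welch's t statistic: difference in mean gains over its standard error.
diff = pbl_gain.mean() - control_gain.mean()
se = np.sqrt(pbl_gain.var(ddof=1) / len(pbl_gain)
             + control_gain.var(ddof=1) / len(control_gain))
t = diff / se
print(f"mean gain difference = {diff:.1f}, Welch t = {t:.2f}")
```

A |t| well above the critical value (roughly 2.1 at these sample sizes) corresponds to the kind of p < .01 group difference the abstract reports.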

  17. Classification of personal exposure to radio frequency electromagnetic fields (RF-EMF) for epidemiological research: Evaluation of different exposure assessment methods.

    PubMed

    Frei, Patrizia; Mohler, Evelyn; Bürgi, Alfred; Fröhlich, Jürg; Neubauer, Georg; Braun-Fahrländer, Charlotte; Röösli, Martin

    2010-10-01

    The use of personal exposure meters (exposimeters) has been recommended for measuring personal exposure to radio frequency electromagnetic fields (RF-EMF) from environmental far-field sources in everyday life. However, it is unclear to what extent exposimeter readings are affected by measurements taken while personal mobile and cordless phones are in use. In addition, the use of exposimeters in large epidemiological studies is limited by high costs and the considerable burden on study participants. In the current analysis we aimed to investigate the impact of personal phone use on exposimeter readings and to evaluate different exposure assessment methods potentially useful in epidemiological studies. We collected personal exposimeter measurements during one week, together with diary data, from 166 study participants. Moreover, we collected spot measurements in the participants' bedrooms and data on self-estimated exposure, assessed residential exposure to fixed-site transmitters by calculating the geo-coded distance and the mean RF-EMF from a geospatial propagation model, and developed an exposure prediction model based on the propagation model and exposure-relevant behavior. The mean personal exposure was 0.13 mW/m² when measurements during personal phone calls were excluded and 0.15 mW/m² when such measurements were included. The Spearman correlation with personal exposure (without personal phone calls) was 0.42 (95% CI: 0.29 to 0.55) for the spot measurements, -0.03 (95% CI: -0.18 to 0.12) for the geo-coded distance, 0.28 (95% CI: 0.14 to 0.42) for the geospatial propagation model, 0.50 (95% CI: 0.37 to 0.61) for the full exposure prediction model and 0.06 (95% CI: -0.10 to 0.21) for self-estimated exposure. In conclusion, personal exposure measured with exposimeters correlated best with the full exposure prediction model and the spot measurements. Self-estimated exposure and geo-coded distance turned out to be poor surrogates for personal exposure. PMID:20538340
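The correlation-with-CI reporting above can be sketched as follows: Spearman's rho computed as the Pearson correlation of ranks, with an approximate 95% CI from the Fisher z-transform. The data are simulated stand-ins, not the study's measurements.

```python
import numpy as np

def spearman_ci(x, y, alpha=0.05):
    """Spearman rho with an approximate 95% CI via the Fisher z-transform."""
    rx = np.argsort(np.argsort(x))   # ranks (no ties in this continuous toy data)
    ry = np.argsort(np.argsort(y))
    r = np.corrcoef(rx, ry)[0, 1]    # Pearson correlation of ranks = Spearman rho
    z, se = np.arctanh(r), 1.0 / np.sqrt(len(x) - 3)
    return r, np.tanh(z - 1.96 * se), np.tanh(z + 1.96 * se)

# Invented stand-ins: a modelled exposure and a noisy personal reading of it,
# for 166 "participants" as in the study.
rng = np.random.default_rng(0)
modelled = rng.lognormal(size=166)
measured = modelled * rng.lognormal(sigma=0.8, size=166)
r, lo, hi = spearman_ci(modelled, measured)
print(f"rho = {r:.2f} (95% CI: {lo:.2f} to {hi:.2f})")
```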

  18. Basic research on design analysis methods for rotorcraft vibrations

    NASA Technical Reports Server (NTRS)

    Hanagud, S.

    1991-01-01

    The objective of the present work was to develop a method for identifying physically plausible finite element system models of airframe structures from test data. The assumed models were based on linear elastic behavior with general (nonproportional) damping. Physical plausibility of the identified system matrices was insured by restricting the identification process to designated physical parameters only and not simply to the elements of the system matrices themselves. For example, in a large finite element model the identified parameters might be restricted to the moduli for each of the different materials used in the structure. In the case of damping, a restricted set of damping values might be assigned to finite elements based on the material type and on the fabrication processes used. In this case, different damping values might be associated with riveted, bolted and bonded elements. The method itself is developed first, and several approaches are outlined for computing the identified parameter values. The method is applied first to a simple structure for which the 'measured' response is actually synthesized from an assumed model. Both stiffness and damping parameter values are accurately identified. The true test, however, is the application to a full-scale airframe structure. In this case, a NASTRAN model and actual measured modal parameters formed the basis for the identification of a restricted set of physically plausible stiffness and damping parameters.

  19. Designing arrays for modern high-resolution methods

    SciTech Connect

    Dowla, F.U.

    1987-10-01

    A bearing estimation study of seismic wavefields propagating from a strongly heterogeneous medium shows that, with the high-resolution MUSIC algorithm, the bias of the direction estimate can be reduced by adopting a smaller-aperture sub-array. Further, on this sub-array, the bias of the MUSIC algorithm is less than that of the MLM and Bartlett methods. On the full array, the performances of the three methods are comparable. The improvement in bearing estimation with MUSIC on the reduced aperture might be attributed to increased signal coherency across the array. For the lower-resolution methods, the improved signal coherency in the smaller array is possibly offset by a severe loss of resolution and the presence of weak secondary sources. Building upon the characteristics of real seismic wavefields, a design language has been developed to generate, modify, and test other arrays. Eigenstructures of wavefields and arrays have been studied empirically by simulation of a variety of realistic signals. 6 refs., 5 figs.
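As a concrete reference point for the MUSIC algorithm discussed above, here is a minimal direction-finding sketch on an idealised uniform linear array. The array geometry, source angles, and noise floor are invented for illustration and are unrelated to the seismic arrays in the study.

```python
import numpy as np

M, d = 8, 0.5                          # sensors, spacing in wavelengths
true_deg = np.array([-10.0, 20.0])     # invented source bearings
K = len(true_deg)

def steering(deg):
    """ULA steering vector for a plane wave arriving from angle `deg`."""
    return np.exp(-2j * np.pi * d * np.arange(M) * np.sin(np.radians(deg)))

# Ideal covariance: two uncorrelated unit-power sources plus a small noise floor.
A = np.column_stack([steering(a) for a in true_deg])
R = A @ A.conj().T + 0.01 * np.eye(M)

# Noise subspace = eigenvectors of the M-K smallest eigenvalues.
w, V = np.linalg.eigh(R)               # eigenvalues in ascending order
En = V[:, :M - K]

# MUSIC pseudo-spectrum on a 1-degree grid; peaks where a(theta) ⟂ noise subspace.
grid = np.arange(-90, 91)
spec = np.array([1.0 / (np.linalg.norm(En.conj().T @ steering(g))**2 + 1e-12)
                 for g in grid])
est = np.sort(grid[np.argsort(spec)[-K:]])
print(est)                             # sharp peaks at the true bearings
```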

  20. PCR amplification of rRNA intergenic spacer regions as a method for epidemiologic typing of Clostridium difficile.

    PubMed Central

    Cartwright, C P; Stock, F; Beekmann, S E; Williams, E C; Gill, V J

    1995-01-01

    From January to March 1993, a suspected outbreak of antibiotic-associated diarrhea occurred on a pediatric oncology ward of the Clinical Center Hospital at the National Institutes of Health. Isolates of Clostridium difficile obtained from six patients implicated in this outbreak were typed by both PCR amplification of rRNA intergenic spacer regions (PCR ribotyping) and restriction endonuclease analysis of genomic DNA. Comparable results were obtained with both methods; five of the six patients were infected with the same strain of C. difficile. Subsequent analysis of 102 C. difficile isolates obtained from symptomatic patients throughout the Clinical Center revealed the existence of 41 distinct and reproducible PCR ribotypes. These data suggest that PCR ribotyping provides a discriminatory, reproducible, and simple alternative to conventional molecular approaches for typing strains of C. difficile. PMID:7699038

  1. Evaluation of five susceptibility test methods for detection of tobramycin resistance in a cluster of epidemiologically related Acinetobacter baumannii isolates.

    PubMed

    Moodley, V Mischka; Oliver, Stephen P; Shankland, Iva; Elisha, B Gay

    2013-08-01

    Acinetobacter baumannii is a major nosocomial pathogen causing infections in critically ill patients. This organism has acquired the propensity to rapidly develop resistance to most antibiotics. At several hospitals within Cape Town, South Africa, tobramycin and colistin are frequently the only therapeutic options. Vitek2 automated susceptibility testing (AST) is used in the clinical laboratory to determine selected susceptibility profiles. The suspicion of a possible AST-related technical error when testing for susceptibility to tobramycin in A. baumannii precipitated this study. Thirty-nine A. baumannii strains isolated from clinical specimens (June to December 2006) were included in this prospective study. Tobramycin susceptibility testing results obtained by AST, disc diffusion, the epsilometer test (Etest), and agar dilution were compared to those for broth microdilution (BMD), the reference method. The tobramycin susceptibility results revealed errors in 25/39 (64%) isolates (10 very major and 15 minor errors) when AST was compared to BMD, 12/39 (31%) (2 very major and 10 minor errors) when Etest was compared to BMD, 16/39 (41%) (3 very major and 13 minor errors) when disc diffusion was compared to BMD, and 21/39 (54%) (10 very major and 11 minor errors) when agar dilution was compared to BMD. Using PCR, we detected aac(3)-IIa, which is associated with tobramycin resistance, in 21/25 of the discrepant isolates. Molecular typing (using pulsed-field gel electrophoresis and repetitive sequence-based PCR [rep-PCR]) showed that these isolates were genetically related. Clinical laboratories that routinely use the Vitek2 system should consider an alternative testing method for determining susceptibility to tobramycin. PMID:23698528
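The error grading used in the comparison above can be sketched as below. The S/I/R codes and the standard definitions (very major = resistant by the reference but susceptible by the test, major = the reverse, minor = any disagreement involving the intermediate category) are conventional assumptions, and the example results are invented, not the study's isolates.

```python
def grade_errors(reference, test):
    """Grade disagreements between a test method and the BMD reference."""
    counts = {"agree": 0, "very_major": 0, "major": 0, "minor": 0}
    for ref, res in zip(reference, test):
        if ref == res:
            counts["agree"] += 1
        elif ref == "R" and res == "S":
            counts["very_major"] += 1   # resistant isolate reported susceptible
        elif ref == "S" and res == "R":
            counts["major"] += 1        # susceptible isolate reported resistant
        else:
            counts["minor"] += 1        # disagreement involving intermediate
    return counts

# Invented S/I/R calls for six isolates: BMD reference vs. an automated method.
bmd = ["R", "R", "S", "I", "S", "R"]
ast = ["S", "R", "S", "S", "I", "I"]
print(grade_errors(bmd, ast))
```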

  2. A New Method to Predict the Epidemiology of Fungal Keratitis by Monitoring the Sales Distribution of Antifungal Eye Drops in Brazil

    PubMed Central

    Ibrahim, Marlon Moraes; de Angelis, Rafael; Lima, Acacio Souza; Viana de Carvalho, Glauco Dreyer; Ibrahim, Fuad Moraes; Malki, Leonardo Tannus; de Paula Bichuete, Marina; de Paula Martins, Wellington; Rocha, Eduardo Melani

    2012-01-01

    Purpose Fungi are a major cause of keratitis, although few medications are licensed for their treatment. The aim of this study is to observe the variation in commercialisation of antifungal eye drops and thereby to predict the seasonal distribution of fungal keratitis in Brazil. Methods Data were gathered in a retrospective study of antifungal eye drop sales from Opthalmos, the only pharmaceutical ophthalmologic laboratory authorized to dispense them in Brazil. These data were correlated with the geographic and seasonal distribution of fungal keratitis in Brazil between July 2002 and June 2008. Results A total of 26,087 antifungal eye drop units were sold, with a mean of 2.3 per patient. There was significant variation in antifungal sales during the year (p < 0.01). A linear regression model displayed a significant association between reduced relative humidity and antifungal drug sales (R² = 0.17, p < 0.01). Conclusions Antifungal eye drop sales suggest that there is a seasonal distribution of fungal keratitis. A possible interpretation is that the higher incidence of fungal keratitis coincides with the third quarter of the year, a drier period when agricultural activity in Brazil is more intense. A similar model could be applied to predict epidemiological aspects of other diseases that are managed with unique, or few, monitorable medications. PMID:22457787
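The humidity-sales regression can be sketched with ordinary least squares. The monthly figures below are invented to mimic the reported inverse association; they are not the Opthalmos data, and the real model's R² was far lower (0.17).

```python
import numpy as np

# Invented monthly mean relative humidity (%) and antifungal eye drop sales.
humidity = np.array([78, 74, 70, 62, 55, 50, 48, 52, 60, 68, 73, 77], float)
sales    = np.array([300, 320, 340, 390, 430, 460, 470, 450, 400, 360, 330, 310], float)

# OLS with an intercept: sales = b0 + b1 * humidity.
X = np.column_stack([np.ones_like(humidity), humidity])
beta, *_ = np.linalg.lstsq(X, sales, rcond=None)
pred = X @ beta
r2 = 1 - np.sum((sales - pred) ** 2) / np.sum((sales - sales.mean()) ** 2)
print(f"slope = {beta[1]:.2f} units per % humidity, R² = {r2:.2f}")
```

A negative slope reproduces the paper's direction of effect: drier months, more antifungal sales.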

  3. Epidemiological Cutoff Values for Fluconazole, Itraconazole, Posaconazole, and Voriconazole for Six Candida Species as Determined by the Colorimetric Sensititre YeastOne Method

    PubMed Central

    Pemán, Javier; Iñiguez, Carmen; Hervás, David; Lopez-Hontangas, Jose L.; Pina-Vaz, Cidalia; Camarena, Juan J.; Campos-Herrero, Isolina; García-García, Inmaculada; García-Tapia, Ana M.; Guna, Remedios; Merino, Paloma; Pérez del Molino, Luisa; Rubio, Carmen; Suárez, Anabel

    2013-01-01

    In the absence of clinical breakpoints (CBP), epidemiological cutoff values (ECVs) are useful to separate wild-type (WT) isolates (without mechanisms of resistance) from non-WT isolates (those that can harbor some resistance mechanisms), which is the goal of susceptibility tests. Sensititre YeastOne (SYO) is a widely used method to determine the susceptibility of Candida spp. to antifungal agents. The CLSI CBP have been established, but not for the SYO method. The ECVs for four azoles, obtained using MIC distributions determined by the SYO method, were calculated via five methods (three statistical methods, plus methods based on the MIC50 and on the modal MIC). Respectively, the median ECVs (in mg/liter) of the five methods for fluconazole, itraconazole, posaconazole, and voriconazole (in parentheses: the percentage of isolates inhibited by MICs equal to or less than the ECVs; the number of isolates tested) were as follows: 2 (94.4%; 944), 0.5 (96.7%; 942), 0.25 (97.6%; 673), and 0.06 (96.7%; 849) for Candida albicans; 4 (86.1%; 642), 0.5 (99.4%; 642), 0.12 (93.9%; 392), and 0.06 (86.9%; 559) for C. parapsilosis; 8 (94.9%; 175), 1 (93.7%; 175), 2 (93.6%; 125), and 0.25 (90.4%; 167) for C. tropicalis; 128 (98.6%; 212), 4 (95.8%; 212), 4 (96.0%; 173), and 2 (98.5%; 205) for C. glabrata; 256 (100%; 53), 1 (98.1%; 53), 1 (100%; 33), and 1 (97.9%; 48) for C. krusei; 4 (89.2%; 93), 0.5 (100%; 93), 0.25 (100%; 33), and 0.06 (87.7%; 73) for C. orthopsilosis. All methods included ≥94% of isolates and yielded similar ECVs (within 1 dilution). These ECVs would be suitable for monitoring the emergence of isolates with reduced susceptibility by the SYO method. PMID:23761155
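One simple cutoff heuristic in this family (a percentage-of-distribution rule) can be sketched as follows: take the MIC distribution and pick the lowest two-fold dilution that captures at least 95% of isolates. The MIC counts are invented, and the paper's actual ECV calculations also include statistical fits not shown here.

```python
# Invented wild-type MIC distribution (mg/liter -> isolate count).
mic_counts = {0.12: 5, 0.25: 40, 0.5: 310, 1: 420, 2: 150, 4: 60, 8: 15}

def ecv_95(counts):
    """Lowest MIC whose cumulative share of isolates reaches 95%."""
    total = sum(counts.values())
    running = 0
    for mic in sorted(counts):
        running += counts[mic]
        if running / total >= 0.95:
            return mic
    return max(counts)

print(ecv_95(mic_counts))
```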

  4. How to assess epidemiological studies

    PubMed Central

    Zaccai, J

    2004-01-01

    Assessing the quality of an epidemiological study equates to assessing whether the inferences drawn from it are warranted when account is taken of the methods, the representativeness of the study sample, and the nature of the population from which it is drawn. Bias, confounding, and chance can threaten the quality of an epidemiological study at all its phases. Nevertheless, their presence does not necessarily imply that a study should be disregarded. The reader must first balance any of these threats or missing information with their potential impact on the conclusions of the report. PMID:15016934

  5. SU-D-16A-01: A Novel Method to Estimate Normal Tissue Dose for Radiotherapy Patients to Support Epidemiologic Studies of Second Cancer Risk

    SciTech Connect

    Lee, C; Jung, J; Pelletier, C; Kim, J; Lee, C

    2014-06-01

    Purpose: Patient cohorts in second cancer studies often involve radiotherapy patients for whom no radiological images are available. We developed methods to construct a realistic surrogate anatomy by using computational human phantoms, and tested the phantom images both in a commercial treatment planning system (Eclipse) and in a custom Monte Carlo (MC) transport code. Methods: We used the reference adult male phantom defined by the International Commission on Radiological Protection (ICRP). The hybrid phantom, originally developed in Non-Uniform Rational B-Spline (NURBS) and polygon mesh format, was converted into a more common medical imaging format. Electron density was calculated from the material composition of the organs and tissues and then converted into DICOM format. The DICOM images were imported into the Eclipse system for treatment planning, and the resulting DICOM-RT files were then imported into the MC code for MC-based dose calculation. Normal tissue doses were calculated in Eclipse and in the MC code for an illustrative prostate treatment case and compared to each other. Results: DICOM images were generated from the adult male reference phantom. Densities and volumes of selected organs in the original phantom and as represented within Eclipse showed good agreement, differing by less than 0.6%. Mean doses from Eclipse and the MC code matched to within 7%, whereas maximum and minimum doses differed by up to 45%. Conclusion: The methods established in this study will be useful for the reconstruction of organ dose to support epidemiological studies of second cancer in cancer survivors treated by radiotherapy. We are also working on implementing body-size-dependent computational phantoms to better represent patient anatomy when the height and weight of patients are available.
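The "electron density from material composition" step rests on the standard relation rho_e proportional to rho times the sum of w_i * Z_i / A_i over the elements. The sketch below applies it to water and to an invented soft-tissue-like mixture; the mass fractions are illustrative, not taken from the ICRP phantom data.

```python
# Z/A ratios for the main tissue elements (standard atomic weights).
Z_over_A = {"H": 1 / 1.008, "C": 6 / 12.011, "N": 7 / 14.007, "O": 8 / 15.999}

def electron_density(rho, mass_fractions):
    """Electron density (up to a constant factor N_A) from density and composition."""
    return rho * sum(w * Z_over_A[el] for el, w in mass_fractions.items())

water = electron_density(1.000, {"H": 0.1119, "O": 0.8881})
# Invented soft-tissue-like material, 1.04 g/cm^3.
tissue = electron_density(1.04, {"H": 0.10, "C": 0.20, "N": 0.04, "O": 0.66})
print(round(tissue / water, 3))   # relative electron density vs. water
```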

  6. The finite element method for calculating the marine structural design

    NASA Astrophysics Data System (ADS)

    Ion, A.; Ticu, I.

    2015-11-01

    The aim of this paper is to optimally design and dimension marine structures so that they fulfil both functional and safety requirements. A thorough command of structural mechanics is vital in order to check tests and analyses and to develop new structures. This study can improve the calculation and estimation of the effects of hydrodynamic and other loads: movements, strains and internal forces in fixed and floating platforms and ships. The finite element method (FEM) provides a basic understanding of the finite element model as applied to static cases including beam and plate elements, experience with static analysis of marine structures such as platforms and ships, a basic understanding of the dynamic response of systems with one degree of freedom and of simple continuous beams, and also of how analysis models can be established for real structures by the use of generalized coordinates and superposition.

  7. POWER ANALYSIS FOR COMPLEX MEDIATIONAL DESIGNS USING MONTE CARLO METHODS

    PubMed Central

    Thoemmes, Felix; MacKinnon, David P.; Reiser, Mark R.

    2013-01-01

    Applied researchers often include mediation effects in applications of advanced methods such as latent variable models and linear growth curve models. Guidance on how to estimate statistical power to detect mediation for these models has not yet been addressed in the literature. We describe a general framework for power analyses for complex mediational models. The approach is based on the well known technique of generating a large number of samples in a Monte Carlo study, and estimating power as the percentage of cases in which an estimate of interest is significantly different from zero. Examples of power calculation for commonly used mediational models are provided. Power analyses for the single mediator, multiple mediators, three-path mediation, mediation with latent variables, moderated mediation, and mediation in longitudinal designs are described. Annotated sample syntax for Mplus is appended and tabled values of required sample sizes are shown for some models. PMID:23935262
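The framework described above, generating many samples and counting the share of significant estimates, can be sketched for the single-mediator case. Sample size, effect sizes, and the use of a Sobel z-test are illustrative choices, not the paper's exact setup (the paper's appended syntax targets Mplus).

```python
import numpy as np

rng = np.random.default_rng(42)
n, reps, a_true, b_true = 100, 500, 0.5, 0.5   # invented design parameters

def ols(X, y):
    """Coefficients and standard errors for y = X @ beta + e."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    s2 = resid @ resid / (len(y) - X.shape[1])
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
    return beta, se

hits = 0
for _ in range(reps):
    # Simulate the mediation chain X -> M -> Y with standard normal errors.
    x = rng.normal(size=n)
    m = a_true * x + rng.normal(size=n)
    y = b_true * m + rng.normal(size=n)
    ba, sea = ols(np.column_stack([np.ones(n), x]), m)        # a-path
    bb, seb = ols(np.column_stack([np.ones(n), m, x]), y)     # b-path (X partialled)
    a, sa, b, sb = ba[1], sea[1], bb[1], seb[1]
    z = a * b / np.sqrt(a**2 * sb**2 + b**2 * sa**2)          # Sobel z for ab
    hits += abs(z) > 1.96
print(f"estimated power ≈ {hits / reps:.2f}")
```

Power is simply the fraction of replications in which the indirect effect ab is significant; swapping in a latent-variable or longitudinal model changes only the data-generation and fitting steps.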

  8. Unique Method for Generating Design Earthquake Time Histories

    SciTech Connect

    R. E. Spears

    2008-07-01

    A method has been developed which takes a seed earthquake time history and modifies it to produce given design response spectra. It is a multi-step process with an initial scaling step followed by multiple refinement steps. It is unique in that both the acceleration and displacement response spectra are considered when performing the fit (which primarily improves the accuracy of the low-frequency acceleration response spectrum). Additionally, no matrix inversion is needed. The features include encouraging the code acceleration, velocity, and displacement ratios and attempting to fit the pseudo-velocity response spectrum. Also, “smoothing” is done to transition the modified time history to the seed time history at its start and end. This is done in the time history regions below a cumulative energy of 5% and above a cumulative energy of 95%. Finally, the modified acceleration, velocity, and displacement time histories are adjusted to start and end with an amplitude of zero (using Fourier transform techniques for integration).
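The 5%/95% cumulative-energy bounds used for the smoothing step can be located as follows; the acceleration record is a toy signal, not a seed earthquake.

```python
import numpy as np

# Toy acceleration record: a 2 Hz sine under a Gaussian envelope centred at 8 s.
dt = 0.01
t = np.arange(0, 20, dt)
acc = np.exp(-0.3 * (t - 8) ** 2) * np.sin(2 * np.pi * 2 * t)

# Normalised cumulative energy; the blend-back regions lie below the 5% crossing
# and above the 95% crossing.
frac = np.cumsum(acc ** 2) / np.sum(acc ** 2)
t05 = t[np.searchsorted(frac, 0.05)]
t95 = t[np.searchsorted(frac, 0.95)]
print(f"5% energy at {t05:.2f} s, 95% at {t95:.2f} s")
```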

  9. Development of impact design methods for ceramic gas turbine components

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1990-01-01

    Impact damage prediction methods are being developed to aid in the design of ceramic gas turbine engine components with improved impact resistance. Two impact damage modes were characterized: local, near the impact site, and structural, usually fast fracture away from the impact site. Local damage to Si3N4 impacted by Si3N4 spherical projectiles consists of ring and/or radial cracks around the impact point. In a mechanistic model being developed, impact damage is characterized as microcrack nucleation and propagation. The extent of damage is measured as volume fraction of microcracks. Model capability is demonstrated by simulating late impact tests. Structural failure is caused by tensile stress during impact exceeding material strength. The EPIC3 code was successfully used to predict blade structural failures in different size particle impacts on radial and axial blades.

  10. Design method of water jet pump towards high cavitation performances

    NASA Astrophysics Data System (ADS)

    Cao, L. L.; Che, B. X.; Hu, L. J.; Wu, D. Z.

    2016-05-01

    As one of the crucial components for power supply, the propulsion system is of great significance to the advance speed, noise performance, stability and other critical performances of underwater vehicles. As they develop towards much higher advance speeds, underwater vehicles make more critical demands on the performance of the propulsion system. Basically, the increased advance speed requires a significantly raised rotation speed of the propulsion system, which would result in deteriorated cavitation performance and consequently limit the thrust and efficiency of the whole system. Compared with the traditional propeller, the water jet pump offers more favourable cavitation, propulsion efficiency and other associated performances. The present research focuses on the cavitation performance of the waterjet pump blade profile, with the aim of extending its advantages to high-speed vehicle propulsion. Based on the specifications of a certain underwater vehicle, the design method for a waterjet blade with high cavitation performance was investigated by means of numerical simulation.

  11. Computational methods in metabolic engineering for strain design.

    PubMed

    Long, Matthew R; Ong, Wai Kit; Reed, Jennifer L

    2015-08-01

    Metabolic engineering uses genetic approaches to control microbial metabolism to produce desired compounds. Computational tools can identify new biological routes to chemicals and the changes needed in host metabolism to improve chemical production. Recent computational efforts have focused on exploring what compounds can be made biologically using native enzymes, heterologous enzymes, and/or enzymes with broad specificity. Additionally, computational methods have been developed to suggest different types of genetic modifications (e.g. gene deletion/addition or up/down regulation), as well as to suggest strategies meeting different criteria (e.g. high yield, high productivity, or substrate co-utilization). Strategies to improve runtime performance have also been developed, which allow more complex metabolic engineering strategies to be identified. Future incorporation of kinetic considerations will further improve strain design algorithms. PMID:25576846

  12. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
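A hedged sketch of the dispatch pattern the claim describes: reasoning modules registered per ontology classification type, each processing only the abstractions whose individuals match its type. All class and type names are invented for illustration; this is not the patented implementation.

```python
class ReasoningModule:
    """Processes only abstractions of one ontology classification type."""
    def __init__(self, classification):
        self.classification = classification
        self.processed = []

    def process(self, abstraction):
        self.processed.append(abstraction["individual"])

# Two modules bound to two (invented) classification types.
modules = [ReasoningModule("Person"), ReasoningModule("Event")]

# Working memory: a semantic graph of abstractions, each holding an individual
# defined according to the ontology.
semantic_graph = [
    {"individual": "alice", "type": "Person"},
    {"individual": "login", "type": "Event"},
    {"individual": "bob",   "type": "Person"},
]

# Each module sees only the abstractions of its own classification type.
for abstraction in semantic_graph:
    for module in modules:
        if module.classification == abstraction["type"]:
            module.process(abstraction)

print(modules[0].processed, modules[1].processed)
```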

  13. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  14. Information processing systems, reasoning modules, and reasoning system design methods

    DOEpatents

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  15. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    SciTech Connect

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-15

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  16. Molecular Epidemiology of Leptospirosis in Northern Iran by Nested Polymerase Chain Reaction/Restriction Fragment Length Polymorphism and Sequencing Methods

    PubMed Central

    Zakeri, Sedigheh; Sepahian, Neda; Afsharpad, Mandana; Esfandiari, Behzad; Ziapour, Peyman; Djadid, Navid D.

    2010-01-01

    This study was conducted to investigate the prevalence of Leptospira species in the Mazandaran Province of Iran by using nested polymerase chain reaction (PCR)/restriction fragment length polymorphism (RFLP) methods and sequencing analysis. Blood samples (n = 119) were collected from humans suspected of having leptospirosis from different parts of the province in 2007. By using an indirect immunofluorescent antibody test (IFAT), we determined that 35 (29.4%) of 119 suspected cases had leptospiral antibody titers ≥ 1:80, which confirmed the diagnosis of leptospirosis. The nested PCR assay determined that 60 (50.4%) of 119 samples showed Leptospira infection. Furthermore, 44 (73.3%) of the 60 confirmed leptospirosis amplified products were subjected to sequencing analysis. Sequence alignment identified L. interrogans, L. kirschneri, and L. wolffii species. All positive cases diagnosed by IFAT or PCR were in patients who reported contact with animals, high-risk occupational activities, and exposure to contaminated water. Therefore, it is important to raise awareness of this disease among physicians and to strengthen laboratory capacity for its diagnosis in infected patients in Iran. PMID:20439973

  17. Virtual Design Method for Controlled Failure in Foldcore Sandwich Panels

    NASA Astrophysics Data System (ADS)

    Sturm, Ralf; Fischer, S.

    2015-12-01

    For certification, novel fuselage concepts have to demonstrate crashworthiness standards equivalent to those of the existing metal reference design. Due to the brittle failure behaviour of CFRP, this requirement can only be fulfilled by controlled progressive crash kinematics. Experiments showed that the failure of a twin-walled fuselage panel can be controlled by a local modification of the core's through-thickness compression strength. For folded cores, the required change in core properties can be introduced by a modification of the fold pattern. However, the complexity of folded cores requires a virtual design methodology for tailoring the fold pattern according to all static and crash-relevant requirements. In this context, a foldcore micromodel simulation method is presented to identify the structural response of twin-walled fuselage panels with folded cores under crash-relevant loading conditions. The simulations showed that a high degree of correlation is required before simulation can replace expensive testing. In the presented studies, the necessary correlation quality could only be obtained by including imperfections of the core material in the micromodel simulation approach.

  18. Aims, methods and preliminary findings of the Physical Activity, Nutrition and Allergies in Children Examined in Athens (PANACEA) epidemiological study

    PubMed Central

    Priftis, Kostas N; Panagiotakos, Demosthenes B; Anthracopoulos, Michael B; Papadimitriou, Anastasios; Nicolaidou, Polyxeni

    2007-01-01

Background To determine the prevalence of asthma symptoms in a sample of Greek children aged 10–12 years, and to evaluate these rates in relation to anthropometric, lifestyle characteristics and dietary habits. Methods During 2006, 700 schoolchildren (323 male and 377 female), aged 10–12 years (4th to 6th school grade), were selected from 18 schools located in the greater Athens area. The schools were randomly selected from a list provided by the regional educational offices. To achieve a representative sample, the schools enrolled were selected from various regions of the Athens area. For each child, a questionnaire developed for the purposes of the study was completed to retrieve information on: age, sex, school class, other socio-demographic characteristics, anthropometric measurements, dietary habits (through a semi-quantitative Food Frequency Questionnaire) and physical activity status; the presence of asthma and allergies was assessed by the standard ISAAC questionnaire. Results The prevalence of wheezing in the past was 25% in boys and 19% in girls, while the prevalence of current wheezing was 9.0% in boys and 5.8% in girls. The prevalence of any asthma symptoms was 27.6% in boys and 20.4% in girls. Multiple logistic regression analysis revealed that increased body weight and a sedentary lifestyle are associated with asthma symptoms only in boys. Conclusion The present cross-sectional study cannot establish causal relationships between asthma and increased body weight of schoolchildren; however, our findings underline the associations between asthma, increased body weight, and physical activity at the population level, and urge action by public health policy makers to prevent these conditions among children. PMID:17610743

  19. Assessing the Epidemiological Data and Management Methods of Body Packers Admitted to a Referral Center in Iran.

    PubMed

    Alipour-Faz, Athena; Shadnia, Shahin; Mirhashemi, Seyyed Hadi; Peyvandi, Maryam; Oroei, Mahbobeh; Shafagh, Omid; Peyvandi, Hassan; Peyvandi, Ali Asghar

    2016-05-01

The incidence of smuggling and transporting illegal substances by internal concealment, also known as body packing, is on the rise. The clinical approach to such patients has changed significantly over the past 2 decades. However, despite a recorded increase in body packing in general, there are controversies in the management of these patients. We aimed to gather data regarding the demographic characteristics, treatment, and outcome of body packers who were referred to Loghman Hakim Hospital, Tehran, Iran. The data of all body packers admitted to Loghman Hakim Hospital during 2010 to 2014 were evaluated retrospectively. Data regarding the demographic characteristics of the patients, findings of clinical imaging, treatment, and outcome were recorded. In this study, 175 individuals with a mean age of 31 ± 10 years were assessed. The most common concealed substances were crack (37%), crystal (17%), opium (13%), and heroin (6%). According to the results of surgery and imaging (abdominal radiography or computed tomography), the most common place for concealment was the stomach, in 33.3% and 12% of cases, respectively. Imaging findings were normal in 18% of the individuals. Forty-eight (27%) patients underwent surgery. The main indications for surgery were clinical manifestations of toxicity (79%) and obstruction of the gastro-intestinal tract (17%). The most common surgical techniques were laparotomy and gastrotomy (50%). The mean duration of hospitalization was 3.8 ± 4 days. The mortality rate was 3%. Conservative treatment of body packers seems to be the best treatment method. Careful monitoring of the patients for possible signs and symptoms of intoxication and gastro-intestinal obstruction is strongly recommended. PMID:27175693

  20. Assessing the Epidemiological Data and Management Methods of Body Packers Admitted to a Referral Center in Iran

    PubMed Central

    Alipour-faz, Athena; Shadnia, Shahin; Mirhashemi, Seyyed Hadi; Peyvandi, Maryam; Oroei, Mahbobeh; Shafagh, Omid; Peyvandi, Hassan; Peyvandi, Ali Asghar

    2016-01-01

The incidence of smuggling and transporting illegal substances by internal concealment, also known as body packing, is on the rise. The clinical approach to such patients has changed significantly over the past 2 decades. However, despite a recorded increase in body packing in general, there are controversies in the management of these patients. We aimed to gather data regarding the demographic characteristics, treatment, and outcome of body packers who were referred to Loghman Hakim Hospital, Tehran, Iran. The data of all body packers admitted to Loghman Hakim Hospital during 2010 to 2014 were evaluated retrospectively. Data regarding the demographic characteristics of the patients, findings of clinical imaging, treatment, and outcome were recorded. In this study, 175 individuals with a mean age of 31 ± 10 years were assessed. The most common concealed substances were crack (37%), crystal (17%), opium (13%), and heroin (6%). According to the results of surgery and imaging (abdominal radiography or computed tomography), the most common place for concealment was the stomach, in 33.3% and 12% of cases, respectively. Imaging findings were normal in 18% of the individuals. Forty-eight (27%) patients underwent surgery. The main indications for surgery were clinical manifestations of toxicity (79%) and obstruction of the gastro-intestinal tract (17%). The most common surgical techniques were laparotomy and gastrotomy (50%). The mean duration of hospitalization was 3.8 ± 4 days. The mortality rate was 3%. Conservative treatment of body packers seems to be the best treatment method. Careful monitoring of the patients for possible signs and symptoms of intoxication and gastro-intestinal obstruction is strongly recommended. PMID:27175693

  1. Inquiry into the Practices of Expert Courseware Designers: A Pragmatic Method for the Design of Effective Instructional Systems

    ERIC Educational Resources Information Center

    Rowley, Kurt

    2005-01-01

    A multi-stage study of the practices of expert courseware designers was conducted with the final goal of identifying methods for assisting non-experts with the design of effective instructional systems. A total of 25 expert designers were involved in all stages of the inquiry. A model of the expert courseware design process was created, tested,…

  2. TIR collimator designs based on point source and extended source methods

    NASA Astrophysics Data System (ADS)

    Talpur, T.; Herkommer, A.

    2015-09-01

TIR collimators are essential illumination components that demand high efficiency, accuracy, and uniformity. Various illumination design methods have been developed for different design domains, including the tailoring method, design via optimization, the mapping and feedback method, and the simultaneous multiple surface (SMS) method. This paper summarizes and compares the performance of these methods, along with their advantages and limitations.

  3. Optical Design Methods: Your Head As A Personal Computer

    NASA Astrophysics Data System (ADS)

    Shafer, David

    1985-07-01

    Several design approaches are described which feature the use of your head as a design tool. This involves thinking about the design task at hand, trying to break it into separate, easily understood subtasks, and approaching these in a creative and intelligent fashion, as only humans can do. You and your computer can become a very powerful team when this design philosophy is adopted.

  4. ADHD in the Arab World: A Review of Epidemiologic Studies

    ERIC Educational Resources Information Center

Farah, Lynn G.; Fayyad, John A.; Eapen, Valsamma; Cassir, Youmna; Salamoun, Mariana M.; Tabet, Caroline C.; Mneimneh, Zeina N.; Karam, Elie G.

    2009-01-01

Objective: Epidemiological studies on psychiatric disorders are quite rare in the Arab World. This article reviews epidemiological studies on ADHD in all the Arab countries. Method: All epidemiological studies on ADHD conducted from 1966 through the present were reviewed. Samples were drawn from the general community, primary care clinical…

  5. Design optimization methods for genomic DNA tiling arrays

    PubMed Central

    Bertone, Paul; Trifonov, Valery; Rozowsky, Joel S.; Schubert, Falk; Emanuelsson, Olof; Karro, John; Kao, Ming-Yang; Snyder, Michael; Gerstein, Mark

    2006-01-01

    A recent development in microarray research entails the unbiased coverage, or tiling, of genomic DNA for the large-scale identification of transcribed sequences and regulatory elements. A central issue in designing tiling arrays is that of arriving at a single-copy tile path, as significant sequence cross-hybridization can result from the presence of non-unique probes on the array. Due to the fragmentation of genomic DNA caused by the widespread distribution of repetitive elements, the problem of obtaining adequate sequence coverage increases with the sizes of subsequence tiles that are to be included in the design. This becomes increasingly problematic when considering complex eukaryotic genomes that contain many thousands of interspersed repeats. The general problem of sequence tiling can be framed as finding an optimal partitioning of non-repetitive subsequences over a prescribed range of tile sizes, on a DNA sequence comprising repetitive and non-repetitive regions. Exact solutions to the tiling problem become computationally infeasible when applied to large genomes, but successive optimizations are developed that allow their practical implementation. These include an efficient method for determining the degree of similarity of many oligonucleotide sequences over large genomes, and two algorithms for finding an optimal tile path composed of longer sequence tiles. The first algorithm, a dynamic programming approach, finds an optimal tiling in linear time and space; the second applies a heuristic search to reduce the space complexity to a constant requirement. A Web resource has also been developed, accessible at http://tiling.gersteinlab.org, to generate optimal tile paths from user-provided DNA sequences. PMID:16365382
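    The linear-time dynamic programming the abstract describes can be illustrated with a simplified sketch (hypothetical function and scoring, not the authors' implementation): given a mask marking repetitive bases, choose non-overlapping tiles within a prescribed length range so as to maximize the non-repetitive sequence covered.

```python
def optimal_tiling(repeat_mask, min_len, max_len):
    """Maximize non-repetitive bases covered by non-overlapping tiles.

    repeat_mask: list of bools, True where the base is repetitive.
    Returns (covered_bases, list of (start, end) tiles), end exclusive.
    Simplified illustration of a linear-time/space DP tiling.
    """
    n = len(repeat_mask)
    # clean[i] = length of the non-repetitive run ending at position i-1
    clean = [0] * (n + 1)
    for i in range(1, n + 1):
        clean[i] = 0 if repeat_mask[i - 1] else clean[i - 1] + 1

    best = [0] * (n + 1)        # best coverage using the first i bases
    choice = [None] * (n + 1)   # tile ending at i, if one was placed
    for i in range(1, n + 1):
        best[i] = best[i - 1]   # option 1: no tile ends at position i
        # option 2: place a tile of allowed length ending at position i
        longest = min(max_len, clean[i])
        if longest >= min_len:
            for L in range(min_len, longest + 1):
                if best[i - L] + L > best[i]:
                    best[i] = best[i - L] + L
                    choice[i] = (i - L, i)

    tiles, i = [], n            # backtrack to recover the tile path
    while i > 0:
        if choice[i] is None:
            i -= 1
        else:
            tiles.append(choice[i])
            i = choice[i][0]
    return best[n], tiles[::-1]
```

    A production design would additionally score probe uniqueness across the whole genome, as the abstract notes; this sketch only captures the optimal-partitioning recurrence.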

  6. A Universal Design Method for Reflecting Physical Characteristics Variability: Case Study of a Bicycle Frame.

    PubMed

    Shimada, Masato; Suzuki, Wataru; Yamada, Shuho; Inoue, Masato

    2016-01-01

To achieve a Universal Design, designers must consider diverse users' physical and functional requirements for their products. However, satisfying these requirements and obtaining the information necessary for designing a universal product is very difficult. Therefore, we propose a new design method based on the concept of set-based design to solve these issues. This paper discusses the suitability of the proposed design method by applying it to a bicycle frame design problem. PMID:27534334

  7. Exploring Clinical and Epidemiological Characteristics of Interstitial Lung Diseases: Rationale, Aims, and Design of a Nationwide Prospective Registry--The EXCITING-ILD Registry.

    PubMed

    Kreuter, Michael; Herth, Felix J F; Wacker, Margarethe; Leidl, Reiner; Hellmann, Andreas; Pfeifer, Michael; Behr, Jürgen; Witt, Sabine; Kauschka, Dagmar; Mall, Marcus; Günther, Andreas; Markart, Philipp

    2015-01-01

Despite a number of prospective registries conducted in past years, the current epidemiology of interstitial lung diseases (ILD) is still not well defined, particularly regarding the prevalence and incidence, their management, healthcare utilisation needs, and healthcare-associated costs. To address these issues in Germany, a new prospective ILD registry, "Exploring Clinical and Epidemiological Characteristics of Interstitial Lung Diseases" (EXCITING-ILD), is being conducted by the German Centre for Lung Research in association with ambulatory, inpatient, scientific pulmonology organisations and patient support groups. This multicentre, noninterventional, prospective, and observational ILD registry aims to collect comprehensive and validated data from all healthcare institutions on the incidence, prevalence, characteristics, management, and outcomes regarding all ILD presentations in the real-world setting. Specifically, this registry will collect demographic data, disease-related data such as ILD subtype, treatments, diagnostic procedures (e.g., HRCT, surgical lung biopsy), risk factors (e.g., familial ILD), significant comorbidities, ILD management, and disease outcomes as well as healthcare resource consumption. The EXCITING-ILD registry will include in-patient and out-patient ILD healthcare facilities in more than 100 sites. In summary, this registry will document comprehensive and current epidemiological data as well as important health economic data for ILDs in Germany. PMID:26640781

  8. Exploring Clinical and Epidemiological Characteristics of Interstitial Lung Diseases: Rationale, Aims, and Design of a Nationwide Prospective Registry—The EXCITING-ILD Registry

    PubMed Central

    Kreuter, Michael; Herth, Felix J. F.; Wacker, Margarethe; Leidl, Reiner; Hellmann, Andreas; Pfeifer, Michael; Behr, Jürgen; Witt, Sabine; Kauschka, Dagmar; Mall, Marcus; Günther, Andreas; Markart, Philipp

    2015-01-01

Despite a number of prospective registries conducted in past years, the current epidemiology of interstitial lung diseases (ILD) is still not well defined, particularly regarding the prevalence and incidence, their management, healthcare utilisation needs, and healthcare-associated costs. To address these issues in Germany, a new prospective ILD registry, “Exploring Clinical and Epidemiological Characteristics of Interstitial Lung Diseases” (EXCITING-ILD), is being conducted by the German Centre for Lung Research in association with ambulatory, inpatient, scientific pulmonology organisations and patient support groups. This multicentre, noninterventional, prospective, and observational ILD registry aims to collect comprehensive and validated data from all healthcare institutions on the incidence, prevalence, characteristics, management, and outcomes regarding all ILD presentations in the real-world setting. Specifically, this registry will collect demographic data, disease-related data such as ILD subtype, treatments, diagnostic procedures (e.g., HRCT, surgical lung biopsy), risk factors (e.g., familial ILD), significant comorbidities, ILD management, and disease outcomes as well as healthcare resource consumption. The EXCITING-ILD registry will include in-patient and out-patient ILD healthcare facilities in more than 100 sites. In summary, this registry will document comprehensive and current epidemiological data as well as important health economic data for ILDs in Germany. PMID:26640781

  9. Visual Narrative Research Methods as Performance in Industrial Design Education

    ERIC Educational Resources Information Center

    Campbell, Laurel H.; McDonagh, Deana

    2009-01-01

    This article discusses teaching empathic research methodology as performance. The authors describe their collaboration in an activity to help undergraduate industrial design students learn empathy for others when designing products for use by diverse or underrepresented people. The authors propose that an industrial design curriculum would benefit…

  10. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

The methodology of health risk analysis is, at the present stage of development of Russian society, in demand at all levels of state management. In conjunction with methods of mathematical modelling, spatial-temporal analysis and economic tools, risk assessment makes it possible to determine the level of safety of the population, workers and consumers, and to identify priority resources and threat factors on which efforts should be concentrated. At the planning stage, risk assessment is a basis for establishing the most effective measures for minimizing hazards and dangers. At the realization stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision phase, it permits priorities to be selected so that efforts are concentrated on the objects posing the maximal health risk to the population. Risk assessments, including elements of evolutionary modelling, are incorporated in the system of state hygienic regulation, the formation of the evidence base of harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products and services. Further work appears warranted on: the application of health risk analysis methodology in assuring the sanitary and epidemiological well-being and health of workers; the development of an informational and analytical base, in particular "exposure-response" models for different types and levels of exposure and risk contingents; improving the accuracy of exposure estimates; and improving the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens. PMID:26155657

  11. An entropy method for floodplain monitoring network design

    NASA Astrophysics Data System (ADS)

    Ridolfi, E.; Yan, K.; Alfonso, L.; Di Baldassarre, G.; Napolitano, F.; Russo, F.; Bates, Paul D.

    2012-09-01

    In recent years an increasing number of flood-related fatalities has highlighted the necessity of improving flood risk management to reduce human and economic losses. In this framework, monitoring of flood-prone areas is a key factor for building a resilient environment. In this paper a method for designing a floodplain monitoring network is presented. A redundant network of cheap wireless sensors (GridStix) measuring water depth is considered over a reach of the River Dee (UK), with sensors placed both in the channel and in the floodplain. Through a Three Objective Optimization Problem (TOOP) the best layouts of sensors are evaluated, minimizing their redundancy, maximizing their joint information content and maximizing the accuracy of the observations. A simple raster-based inundation model (LISFLOOD-FP) is used to generate a synthetic GridStix data set of water stages. The Digital Elevation Model (DEM) that is used for hydraulic model building is the globally and freely available SRTM DEM.
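    One ingredient of such an entropy-based network design, the selection of sensors whose records are jointly most informative (so that a redundant gauge adds nothing), can be sketched with a greedy heuristic. The function names and the discretized readings below are illustrative assumptions, not the three-objective (TOOP) formulation used in the paper.

```python
import math
from collections import Counter

def joint_entropy(series_list):
    """Shannon entropy (bits) of the joint distribution of discretized series."""
    tuples = list(zip(*series_list))
    n = len(tuples)
    return -sum(c / n * math.log2(c / n) for c in Counter(tuples).values())

def greedy_network(candidates, k):
    """Greedily pick up to k sensors maximizing joint entropy.

    candidates: dict sensor_id -> list of discretized water-depth readings.
    A sensor duplicating an already chosen one contributes no additional
    joint entropy, so redundancy is naturally penalized.
    """
    chosen = []
    while len(chosen) < min(k, len(candidates)):
        best_id, best_h = None, -1.0
        for sid in candidates:
            if sid in chosen:
                continue
            h = joint_entropy([candidates[s] for s in chosen + [sid]])
            if h > best_h:
                best_id, best_h = sid, h
        chosen.append(best_id)
    return chosen
```

    For example, with sensor B an exact duplicate of sensor A, picking two sensors selects A and the independent sensor C rather than the redundant pair.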

  12. The Chinese American Eye Study: Design and Methods

    PubMed Central

    Varma, Rohit; Hsu, Chunyi; Wang, Dandan; Torres, Mina; Azen, Stanley P.

    2016-01-01

Purpose To summarize the study design, operational strategies and procedures of the Chinese American Eye Study (CHES), a population-based assessment of the prevalence of visual impairment, ocular disease, and visual functioning in Chinese Americans. Methods This population-based, cross-sectional study included 4,570 Chinese participants, 50 years and older, residing in the city of Monterey Park, California. Each eligible participant completed a detailed interview and eye examination. The interview included an assessment of demographic, behavioral, and ocular risk factors and health-related and vision-related quality of life. The eye examination included measurements of visual acuity, intraocular pressure, visual fields, fundus and optic disc photography, a detailed anterior and posterior segment examination, and measurements of blood pressure, glycosylated hemoglobin levels, and blood glucose levels. Results The objectives of the CHES are to obtain prevalence estimates of visual impairment, refractive error, diabetic retinopathy, open-angle and angle-closure glaucoma, lens opacities, and age-related macular degeneration in Chinese-Americans. In addition, outcomes include effect estimates for risk factors associated with eye diseases. Lastly, CHES will investigate the genetic determinants of myopia and glaucoma. Conclusion The CHES will provide information about the prevalence and risk factors of ocular diseases in one of the fastest growing minority groups in the United States. PMID:24044409

  13. Design and methods of the national Vietnam veterans longitudinal study.

    PubMed

    Schlenger, William E; Corry, Nida H; Kulka, Richard A; Williams, Christianna S; Henn-Haase, Clare; Marmar, Charles R

    2015-09-01

    The National Vietnam Veterans Longitudinal Study (NVVLS) is the second assessment of a representative cohort of US veterans who served during the Vietnam War era, either in Vietnam or elsewhere. The cohort was initially surveyed in the National Vietnam Veterans Readjustment Study (NVVRS) from 1984 to 1988 to assess the prevalence, incidence, and effects of post-traumatic stress disorder (PTSD) and other post-war problems. The NVVLS sought to re-interview the cohort to assess the long-term course of PTSD. NVVLS data collection began July 3, 2012 and ended May 17, 2013, comprising three components: a mailed health questionnaire, a telephone health survey interview, and, for a probability sample of theater Veterans, a clinical diagnostic telephone interview administered by licensed psychologists. Excluding decedents, 78.8% completed the questionnaire and/or telephone survey, and 55.0% of selected living veterans participated in the clinical interview. This report provides a description of the NVVLS design and methods. Together, the NVVRS and NVVLS constitute a nationally representative longitudinal study of Vietnam veterans, and extend the NVVRS as a critical resource for scientific and policy analyses for Vietnam veterans, with policy relevance for Iraq and Afghanistan veterans. PMID:26096554

  14. A decision-based perspective for the design of methods for systems design

    NASA Technical Reports Server (NTRS)

    Mistree, Farrokh; Muster, Douglas; Shupe, Jon A.; Allen, Janet K.

    1989-01-01

Topics covered include the organization of material, a definition of decision-based design, a hierarchy of decision-based design, the decision support problem technique, a conceptual model for designs that can be manufactured and maintained, meta-design, computer-based design, action learning, and the characteristics of decisions.

  15. Uses of ecologic analysis in epidemiologic research.

    PubMed Central

    Morgenstern, H

    1982-01-01

    Despite the widespread use of ecologic analysis in epidemiologic research and health planning, little attention has been given by health scientists and practitioners to the methodological aspects of this approach. This paper reviews the major types of ecologic study designs, the analytic methods appropriate for each, the limitations of ecologic data for making causal inferences and what can be done to minimize these problems, and the relative advantages of ecologic analysis. Numerous examples are provided to illustrate the important principles and methods. A careful distinction is made between ecologic studies that generate or test etiologic hypotheses and those that evaluate the impact of intervention programs or policies (given adequate knowledge of disease etiology). Failure to recognize this difference in the conduct of ecologic studies can lead to results that are not very informative or that are misinterpreted by others. PMID:7137430
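    The cross-level inference problem the review warns about can be made concrete with a small hypothetical simulation: within every group the exposure-outcome correlation is negative, yet the ecologic (group-mean) correlation is perfectly positive, so inferring individual-level etiology from the aggregate association would get the direction of effect wrong.

```python
import statistics

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Three hypothetical regions; within each region exposure and outcome
# are negatively related, but the regional means move together.
groups = [
    ([1, 2, 3], [12, 11, 10]),   # region 1
    ([4, 5, 6], [15, 14, 13]),   # region 2
    ([7, 8, 9], [18, 17, 16]),   # region 3
]

individual_x = [v for g in groups for v in g[0]]
individual_y = [v for g in groups for v in g[1]]
group_x = [statistics.fmean(g[0]) for g in groups]
group_y = [statistics.fmean(g[1]) for g in groups]

r_individual = pearson(individual_x, individual_y)  # pooled individual data
r_ecologic = pearson(group_x, group_y)              # +1.0 across group means,
                                                    # despite -1.0 within every group
```

    Nothing here reflects real data; the numbers are constructed purely to exhibit the aggregation bias ("ecological fallacy") discussed in the paper.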

  16. Multidisciplinary Design Optimization (MDO) Methods: Their Synergy with Computer Technology in Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1998-01-01

The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
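    As a toy instance of the "many simple interacting models" idea the paper points to, an elementary cellular automaton update is intrinsically parallel: each cell's next state depends only on its own state and its two neighbours, so all cells can be updated independently. This sketch is illustrative only, not taken from the paper.

```python
def ca_step(cells, rule):
    """One synchronous update of an elementary cellular automaton.

    cells: list of 0/1 states with periodic boundary conditions.
    rule: Wolfram rule number (0-255); bit k of `rule` gives the next
    state for the 3-cell neighbourhood pattern with value k.
    Every cell is computed independently, so the loop is trivially
    parallelizable across processors.
    """
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]
```

    For instance, rule 90 (next state = XOR of the two neighbours) turns a single live cell into its two neighbours on the next step.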

  17. Multidisciplinary Design Optimisation (MDO) Methods: Their Synergy with Computer Technology in the Design Process

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw

    1999-01-01

    The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.

  18. Active cooling design for scramjet engines using optimization methods

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Martin, Carl J.; Lucas, Stephen H.

    1988-01-01

    A methodology for using optimization in designing metallic cooling jackets for scramjet engines is presented. The optimal design minimizes the required coolant flow rate subject to temperature, mechanical-stress, and thermal-fatigue-life constraints on the cooling-jacket panels, and Mach-number and pressure constraints on the coolant exiting the panel. The analytical basis for the methodology is presented, and results for the optimal design of panels are shown to demonstrate its utility.
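    The flavor of constrained minimization described above can be sketched under a deliberately crude, hypothetical heat-balance model: minimize the coolant flow rate subject to a wall-temperature limit. When the temperature model is monotone in the flow rate, the optimum sits where the constraint is active, so one-dimensional bisection recovers it; the actual design problem additionally carries mechanical-stress, thermal-fatigue-life, Mach-number, and pressure constraints, which this sketch omits.

```python
def min_coolant_flow(t_gas, t_allow, cooling, m_lo=0.0, m_hi=10.0, tol=1e-8):
    """Smallest coolant flow rate m with wall_temp(m) <= t_allow.

    wall_temp is a placeholder monotone model: more flow, cooler wall.
    Because the objective (m itself) increases while the wall
    temperature decreases with m, the constrained optimum is where the
    temperature constraint becomes active.
    """
    def wall_temp(m):
        return t_gas - cooling * m   # hypothetical linear heat balance

    if wall_temp(m_lo) <= t_allow:
        return m_lo                   # constraint inactive: no coolant needed
    while m_hi - m_lo > tol:
        mid = 0.5 * (m_lo + m_hi)
        if wall_temp(mid) <= t_allow:
            m_hi = mid                # feasible: try less coolant
        else:
            m_lo = mid                # infeasible: need more coolant
    return m_hi
```

    With multiple coupled constraints, a general nonlinear programming solver would replace the bisection, but the active-constraint structure is the same.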

  19. Active cooling design for scramjet engines using optimization methods

    NASA Technical Reports Server (NTRS)

    Scotti, Stephen J.; Martin, Carl J.; Lucas, Stephen H.

    1988-01-01

A methodology for using optimization in designing metallic cooling jackets for scramjet engines is presented. The optimal design minimizes the required coolant flow rate subject to temperature, mechanical-stress, and thermal-fatigue-life constraints on the cooling-jacket panels, and Mach-number and pressure constraints on the coolant exiting the panel. The analytical basis for the methodology is presented, and results for the optimal design of panels are shown to demonstrate its utility.

  20. Investigating a Method of Scaffolding Student-Designed Experiments

    NASA Astrophysics Data System (ADS)

    Morgan, Kelly; Brooks, David W.

    2012-08-01

The process of designing an experiment is a difficult one. Students often struggle with such tasks because the design process places a large cognitive load on them. Scaffolding is the process of providing support that allows students to complete tasks they would otherwise not have been able to complete. This study investigated backwards design, one form of scaffolding the experimental design process for students. Students were guided through the design process in a backwards manner (designing the results section first and working backwards through typical report components to the materials and safety sections). The use of reflective prompts as a possible scaffold for metacognitive processes was also studied. Scaffolding was provided by a computer application built specifically for this purpose. Four versions of the computer application were randomly assigned to 102 high school chemistry students, who were asked to design an experiment and produce a report. The use of backwards-design scaffolding resulted in significantly higher performance on lab reports. The addition of reflective prompts reduced the effect of backwards-design scaffolding in lower-level students.

  1. Development of Combinatorial Methods for Alloy Design and Optimization

    SciTech Connect

    Pharr, George M.; George, Easo P.; Santella, Michael L

    2005-07-01

The primary goal of this research was to develop a comprehensive methodology for designing and optimizing metallic alloys by combinatorial principles. Because conventional techniques for alloy preparation are unavoidably restrictive in the range of alloy composition that can be examined, combinatorial methods promise to significantly reduce the time, energy, and expense needed for alloy design. Combinatorial methods can be developed not only to optimize existing alloys, but to explore and develop new ones as well. The scientific approach involved fabricating an alloy specimen with a continuous distribution of binary and ternary alloy compositions across its surface--an "alloy library"--and then using spatially resolved probing techniques to characterize its structure, composition, and relevant properties. The three specific objectives of the project were: (1) to devise means by which simple test specimens with a library of alloy compositions spanning the range of interest can be produced; (2) to assess how well the properties of the combinatorial specimen reproduce those of the conventionally processed alloys; and (3) to devise screening tools which can be used to rapidly assess the important properties of the alloys. As proof of principle, the methodology was applied to the Fe-Ni-Cr ternary alloy system that constitutes many commercially important materials such as stainless steels and the H-series and C-series heat and corrosion resistant casting alloys. Three different techniques were developed for making alloy libraries: (1) vapor deposition of discrete thin films on an appropriate substrate and then alloying them together by solid-state diffusion; (2) co-deposition of the alloying elements from three separate magnetron sputtering sources onto an inert substrate; and (3) localized melting of thin films with a focused electron-beam welding system. Each of the techniques was found to have its own advantages and disadvantages.
A new and very powerful technique for

  2. Applications of numerical optimization methods to helicopter design problems: A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey of applications of mathematical programming methods to improve the design of helicopters and their components is presented. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are addressed: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  3. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1985-01-01

    A survey of applications of mathematical programming methods to improve the design of helicopters and their components is presented. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are addressed: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  4. Applications of numerical optimization methods to helicopter design problems - A survey

    NASA Technical Reports Server (NTRS)

    Miura, H.

    1984-01-01

    A survey of applications of mathematical programming methods to improve the design of helicopters and their components is presented. Applications of multivariable search techniques in finite-dimensional space are considered. Five categories of helicopter design problems are addressed: (1) conceptual and preliminary design, (2) rotor-system design, (3) airframe structures design, (4) control system design, and (5) flight trajectory planning. Key technical progress in numerical optimization methods relevant to rotorcraft applications is summarized.

  5. Methods for combining payload parameter variations with input environment. [calculating design limit loads compatible with probabilistic structural design criteria

    NASA Technical Reports Server (NTRS)

    Merchant, D. H.

    1976-01-01

    Methods are presented for calculating design limit loads compatible with probabilistic structural design criteria. The approach is based on the concept that the desired limit load, defined as the largest load occurring in a mission, is a random variable having a specific probability distribution which may be determined from extreme-value theory. The design limit load, defined as a particular value of this random limit load, is the value conventionally used in structural design. Methods are presented for determining the limit load probability distributions from both time-domain and frequency-domain dynamic load simulations. Numerical demonstrations of the method are also presented.
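
    The approach described above can be sketched numerically: fit an extreme-value (Gumbel) distribution to per-mission peak loads and read off a high percentile as the design limit load. The following is a minimal stdlib-only Python sketch; the synthetic load data, the method-of-moments fit, and the 99th-percentile choice are illustrative assumptions, not the paper's actual procedure.

```python
import math
import random

EULER_GAMMA = 0.5772156649015329

def fit_gumbel(maxima):
    """Method-of-moments fit of a Gumbel (Type I extreme-value) distribution."""
    n = len(maxima)
    mean = sum(maxima) / n
    var = sum((x - mean) ** 2 for x in maxima) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale parameter
    mu = mean - EULER_GAMMA * beta          # location parameter
    return mu, beta

def design_limit_load(maxima, p=0.99):
    """Design limit load = p-th percentile of the fitted limit-load distribution."""
    mu, beta = fit_gumbel(maxima)
    return mu - beta * math.log(-math.log(p))

# Hypothetical per-mission peak loads (kN) from a dynamic load simulation
random.seed(1)
peaks = [random.gauss(100.0, 8.0) + max(random.gauss(0, 5) for _ in range(50))
         for _ in range(200)]
print(round(design_limit_load(peaks, 0.99), 1))
```

Because the Gumbel quantile grows with p, a stricter exceedance probability always yields a larger design limit load.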

  6. The Zambia Children's KS-HHV8 Study: Rationale, Study Design, and Study Methods

    PubMed Central

    Minhas, Veenu; Crabtree, Kay L.; Chao, Ann; Wojcicki, Janet M.; Sifuniso, Adrian M.; Nkonde, Catherine; Kankasa, Chipepo; Mitchell, Charles D.; Wood, Charles

    2011-01-01

    The epidemic of human immunodeficiency virus in Zambia has led to a dramatic rise in the incidence of human herpesvirus-8 (HHV-8)–associated Kaposi's sarcoma in both adults and children. However, there is a paucity of knowledge about the routes of HHV-8 transmission to young children. The Zambia Children's KS-HHV8 Study, a large, prospective cohort study in Lusaka, Zambia, was launched in 2004 to investigate the role of household members as a source of HHV-8 infection in young children and social behaviors that may modify the risk of HHV-8 acquisition. This cohort is distinct from other epidemiologic studies designed to investigate HHV-8 incidence and transmission because it recruited and followed complete households in the urban central African context. Between July 2004 and March 2007, 1,600 households were screened; 368 households comprising 464 children and 1,335 caregivers and household members were enrolled. Follow-up of this population continued for 48 months postrecruitment, affording a unique opportunity to study horizontal transmission of HHV-8 and understand the routes and sources of transmission to young children in Zambia. The authors describe the study rationale, design, execution, and characteristics of this cohort, which provides critical data on the epidemiology and transmission of HHV-8 to young children in Zambia. PMID:21447476

  7. Investigating a Method of Scaffolding Student-Designed Experiments

    ERIC Educational Resources Information Center

    Morgan, Kelly; Brooks, David W.

    2012-01-01

    The process of designing an experiment is a difficult one. Students often struggle with such tasks because the design process places a large cognitive load on them. Scaffolding is the process of providing support that allows students to complete tasks they would otherwise not have been able to complete. This study sought to investigate…

  8. Teaching Improvement Model Designed with DEA Method and Management Matrix

    ERIC Educational Resources Information Center

    Montoneri, Bernard

    2014-01-01

    This study uses student evaluation of teachers to design a teaching improvement matrix based on teaching efficiency and performance by combining management matrix and data envelopment analysis. This matrix is designed to formulate suggestions to improve teaching. The research sample consists of 42 classes of freshmen following a course of English…

  9. Developing Baby Bag Design by Using Kansei Engineering Method

    NASA Astrophysics Data System (ADS)

    Janari, D.; Rakhmawati, A.

    2016-01-01

    Consumers' preferences and market demand are essential factors for a product's success. Thus, to succeed, a product should have a design that fulfills consumers' expectations. The purpose of this research is to develop a baby bag product as specified by the Kansei method. The Kansei words representing the results are: neat, unique, comfortable, safe, modern, gentle, elegant, antique, attractive, simple, spacious, creative, colorful, durable, stylish, smooth and strong. The significance of correlation identified for the durable attribute is 0.000 < 0.005, which the study interprets as significant for the baby bag, while the regression coefficient of 0.812 is interpreted as the durable attribute being insignificant for the baby bag. The final baby bag design, selected on the basis of questionnaire 3, combines elements of all four designs: the clothes compartment, diaper compartment, shoulder grip, side grip, bottle-warmer pocket and bottle pocket are derived from design 1; the top grip, clothes compartment, shoulder grip, and side grip from design 2; the clothes compartment from design 3; and the diaper and clothes compartments from design 4.
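
    The attribute-significance testing this abstract alludes to can be illustrated with a small sketch: correlate ratings of one Kansei attribute ("durable") with overall preference and compute a permutation p-value. All ratings below are hypothetical, and the permutation test is a stand-in assumption; the study's own analysis used correlation and regression significance tests.

```python
import random

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length rating lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

def permutation_p(x, y, n_perm=2000, seed=0):
    """Two-sided permutation p-value for the correlation between x and y."""
    rng = random.Random(seed)
    observed = abs(pearson_r(x, y))
    y2 = list(y)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(y2)
        if abs(pearson_r(x, y2)) >= observed:
            hits += 1
    return hits / n_perm

# Hypothetical 1-5 ratings: "durable" attribute vs. overall preference
durable = [4, 5, 3, 4, 5, 2, 4, 5, 3, 4, 5, 4]
overall = [4, 5, 3, 4, 4, 2, 4, 5, 2, 4, 5, 3]
p = permutation_p(durable, overall)
print(p < 0.05)   # attribute judged significant at the 5% level
```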

  10. METHODS FOR INTEGRATING ENVIRONMENTAL CONSIDERATIONS INTO CHEMICAL PROCESS DESIGN DECISIONS

    EPA Science Inventory

    The objective of this cooperative agreement was to postulate a means by which an engineer could routinely include environmental considerations in day-to-day conceptual design problems; a means that could easily integrate with existing design processes, and thus avoid massive retr...

  11. The Chronic Renal Insufficiency Cohort (CRIC) Study: Design and Methods.

    PubMed

    Feldman, Harold I; Appel, Lawrence J; Chertow, Glenn M; Cifelli, Denise; Cizman, Borut; Daugirdas, John; Fink, Jeffrey C; Franklin-Becker, Eunice D; Go, Alan S; Hamm, L Lee; He, Jiang; Hostetter, Tom; Hsu, Chi-Yuan; Jamerson, Kenneth; Joffe, Marshall; Kusek, John W; Landis, J Richard; Lash, James P; Miller, Edgar R; Mohler, Emile R; Muntner, Paul; Ojo, Akinlolu O; Rahman, Mahboob; Townsend, Raymond R; Wright, Jackson T

    2003-07-01

    Insights into end-stage renal disease have emerged from many investigations but less is known about the epidemiology of chronic renal insufficiency (CRI) and its relationship to cardiovascular disease (CVD). The Chronic Renal Insufficiency Cohort (CRIC) Study was established to examine risk factors for progression of CRI and CVD among CRI patients and develop models to identify high-risk subgroups, informing future treatment trials, and increasing application of preventive therapies. CRIC will enroll approximately 3000 individuals at seven sites and follow participants for up to 5 yr. CRIC will include a racially and ethnically diverse group of adults aged 21 to 74 yr with a broad spectrum of renal disease severity, half of whom have diagnosed diabetes mellitus. CRIC will exclude subjects with polycystic kidney disease and those on active immunosuppression for glomerulonephritis. Subjects will undergo extensive clinical evaluation at baseline and at annual clinic visits and via telephone at 6 mo intervals. Data on quality of life, dietary assessment, physical activity, health behaviors, depression, cognitive function, health care resource utilization, as well as blood and urine specimens will be collected annually. (125)I-iothalamate clearances and CVD evaluations including a 12-lead surface electrocardiogram, an echocardiogram, and coronary electron beam or spiral CT will be performed serially. Analyses planned in CRIC will provide important information on potential risk factors for progressive CRI and CVD. Insights from CRIC should lead to the formulation of hypotheses regarding therapy that will serve as the basis for targeted interventional trials focused on reducing the burden of CRI and CVD. PMID:12819321

  12. Parametric design of a Francis turbine runner by means of a three-dimensional inverse design method

    NASA Astrophysics Data System (ADS)

    Daneshkah, K.; Zangeneh, M.

    2010-08-01

    The present paper describes the parametric design of a Francis turbine runner. The runner geometry is parameterized by means of a 3D inverse design method, while CFD analyses were performed to assess the hydrodynamic and suction performance of the different design configurations investigated. An initial runner design was first generated and used as the baseline for the parametric study. The effects of several design parameters, namely stacking condition and blade loading, were then investigated to determine their effect on the suction performance. Blade parameterization using the inverse method offers a major advantage for the design of Francis turbine runners, as the three-dimensional blade shape is described by parameters closely related to the flow field, namely blade loading and stacking condition, which have a direct impact on the hydrodynamics of the flow field. On the basis of this study, an optimum configuration was designed which results in cavitation-free flow in the runner while maintaining a high level of hydraulic efficiency. The paper highlights design guidelines for the application of the inverse design method to Francis turbine runners. The design guidelines have general validity and can be used for similar design applications, since they are based on flow field analyses and on hydrodynamic design parameters.

  13. Categorisation of visualisation methods to support the design of Human-Computer Interaction Systems.

    PubMed

    Li, Katie; Tiwari, Ashutosh; Alcock, Jeffrey; Bermell-Garcia, Pablo

    2016-07-01

    During the design of Human-Computer Interaction (HCI) systems, the creation of visual artefacts forms an important part of design. On the one hand, producing a visual artefact has a number of advantages: it helps designers to externalise their thoughts and acts as a common language between different stakeholders. On the other hand, if an inappropriate visualisation method is employed it can hinder the design process. To support the design of HCI systems, this paper reviews the categorisation of visualisation methods used in HCI. A keyword search is conducted to identify (a) current HCI design methods and (b) approaches for selecting these methods. The resulting design methods are filtered to create a list of just visualisation methods. These are then categorised using the approaches identified in (b). As a result, 23 HCI visualisation methods are identified and categorised into 5 selection approaches (The Recipient, Primary Purpose, Visual Archetype, Interaction Type, and The Design Process). PMID:26995039

  14. A new method for designing dual foil electron beam forming systems. I. Introduction, concept of the method

    NASA Astrophysics Data System (ADS)

    Adrich, Przemysław

    2016-05-01

    In Part I of this work, existing methods and problems in dual foil electron beam forming system design are presented. On this basis, a new method of designing these systems is introduced. The motivation behind this work is to eliminate the shortcomings of the existing design methods and improve the overall efficiency of the dual foil design process. The existing methods are based on approximate analytical models applied in an unrealistically simplified geometry. Designing a dual foil system with these methods is a rather labor-intensive task, as corrections to account for the effects not included in the analytical models have to be calculated separately and accounted for in an iterative procedure. To eliminate these drawbacks, the new design method is based entirely on Monte Carlo modeling in a realistic geometry, using physics models that include all relevant processes. In our approach, an optimal configuration of the dual foil system is found by means of a systematic, automated scan of the system performance as a function of the foil parameters. The new method, while computationally intensive, minimizes the involvement of the designer and considerably shortens the overall design time. The results are of high quality, as all the relevant physics and geometry details are naturally accounted for. To demonstrate the feasibility of practical implementation of the new method, specialized software tools were developed and applied to solve a real-life design problem, as described in Part II of this work.
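
    The systematic, automated scan described above can be sketched as a grid search over the two foil parameters. The figure-of-merit function below is a toy analytic stand-in with assumed coefficients and an assumed target scattering spread; in the actual method each configuration would be scored by a full Monte Carlo transport simulation.

```python
import math
from itertools import product

def uniformity_score(t1_mm, t2_mm):
    """Toy stand-in for a Monte Carlo figure of merit: penalty for missing the
    target beam spread plus an energy-loss penalty (all coefficients assumed)."""
    spread = 1.0 - math.exp(-(0.8 * t1_mm + 1.6 * t2_mm))  # scattering spread
    loss = 0.05 * t1_mm + 0.12 * t2_mm                     # energy-loss proxy
    flatness = abs(spread - 0.9)                           # target spread 0.9
    return flatness + loss

# Systematic, automated scan over the two foil thicknesses (0.1-3.0 mm)
grid = [round(0.1 * i, 1) for i in range(1, 31)]
best = min(product(grid, grid), key=lambda ts: uniformity_score(*ts))
print(best)
```

The same loop structure applies when `uniformity_score` is replaced by a call that launches and post-processes a transport simulation; only the cost per evaluation changes.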

  15. [The contribution of epidemiology to disease control: malaria].

    PubMed

    Osorio, Lyda E

    2013-01-01

    Despite the number of cases and attributable mortality having been reduced, malaria continues to be an important public health problem. This report presents some examples of epidemiology's contribution to malaria control; it also motivates reflection on the converse, i.e. malaria's contribution to the development of epidemiology. Attempting to identify methods for measuring epidemiology's contribution to malaria control led to an in-depth analysis of what exactly epidemiology consists of, whether all its contributions can be considered positive, and to what extent they might have been due to epidemiology alone. PMID:25124242

  16. Airfoil design and optimization methods: recent progress at NLR

    NASA Astrophysics Data System (ADS)

    Soemarwoto, B. I.; Labrujère, Th. E.

    1999-05-01

    The present paper considers the problem of aerodynamic airfoil shape optimization where the shape of an airfoil is to be determined such that a priori specified design criteria will be met to the best possible extent. The design criteria are formulated by defining an objective or cost function, the minimum of which represents the solution to the design problem. A survey is given of developments at NLR applying the adjoint operator approach, utilizing a compressible inviscid flow model based on the Euler equations and a compressible viscous flow model based on the Reynolds-averaged Navier-Stokes equations. Computational results are presented for a two-point drag-reduction design problem.

  17. Third order TRANSPORT with MAD (Methodical Accelerator Design) input

    SciTech Connect

    Carey, D.C.

    1988-09-20

    This paper describes computer-aided design codes for particle accelerators. Among the topics discussed are: input beam description; parameters and algebraic expressions; the physical elements; beam lines; operations; and third-order transfer matrix. (LSP)

  18. Development of panel methods for subsonic analysis and design

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1980-01-01

    Two computer programs developed for subsonic inviscid analysis and design are described. The first solves arbitrary mixed analysis-design problems for multielement airfoils in two-dimensional flow. The second calculates the pressure distribution for arbitrary lifting or nonlifting three-dimensional configurations. In each program, inviscid flow is modelled using distributed source-doublet singularities on configuration surface panels. Numerical formulations and representative solutions are presented for the programs.

  19. Practical design methods for barrier pillars. Information circular/1995

    SciTech Connect

    Koehler, J.R.; Tadolini, S.C.

    1995-11-01

    Effective barrier pillar design is essential for safe and productive underground coal mining. This U.S. Bureau of Mines report presents an overview of available barrier pillar design methodologies that incorporate sound engineering principles while remaining practical for everyday usage. Nomographs and examples are presented to assist in the determination of proper barrier pillar sizing. Additionally, performance evaluation techniques and criteria are included to assist in determining the effectiveness of selected barrier pillar configurations.

  20. The cryogenic balance design and balance calibration methods

    NASA Astrophysics Data System (ADS)

    Ewald, B.; Polanski, L.; Graewe, E.

    1992-07-01

    The current status of a program aimed at the development of a cryogenic balance for the European Transonic Wind Tunnel is reviewed. In particular, attention is given to the cryogenic balance design philosophy, mechanical balance design, reliability and accuracy, the cryogenic balance calibration concept, and the concept of an automatic calibration machine. It is shown that the use of the automatic calibration machine will improve the accuracy of calibration while reducing the manpower and time required for balance calibration.

  1. Multicenter Study of Epidemiological Cutoff Values and Detection of Resistance in Candida spp. to Anidulafungin, Caspofungin, and Micafungin Using the Sensititre YeastOne Colorimetric Method

    PubMed Central

    Alvarez-Fernandez, M.; Cantón, E.; Carver, P. L.; Chen, S. C.-A.; Eschenauer, G.; Getsinger, D. L.; Gonzalez, G. M.; Govender, N. P.; Grancini, A.; Hanson, K. E.; Kidd, S. E.; Klinker, K.; Kubin, C. J.; Kus, J. V.; Lockhart, S. R.; Meletiadis, J.; Morris, A. J.; Pelaez, T.; Quindós, G.; Rodriguez-Iglesias, M.; Sánchez-Reus, F.; Shoham, S.; Wengenack, N. L.; Borrell Solé, N.; Echeverria, J.; Esperalba, J.; Gómez-G. de la Pedrosa, E.; García García, I.; Linares, M. J.; Marco, F.; Merino, P.; Pemán, J.; Pérez del Molino, L.; Roselló Mayans, E.; Rubio Calvo, C.; Ruiz Pérez de Pipaon, M.; Yagüe, G.; Garcia-Effron, G.; Guinea, J.; Perlin, D. S.; Sanguinetti, M.; Shields, R.; Turnidge, J.

    2015-01-01

    Neither breakpoints (BPs) nor epidemiological cutoff values (ECVs) have been established for Candida spp. with anidulafungin, caspofungin, and micafungin when using the Sensititre YeastOne (SYO) broth dilution colorimetric method. In addition, reference caspofungin MICs have so far proven to be unreliable. Candida species wild-type (WT) MIC distributions (for microorganisms in a species/drug combination with no detectable phenotypic resistance) were established for 6,007 Candida albicans, 186 C. dubliniensis, 3,188 C. glabrata complex, 119 C. guilliermondii, 493 C. krusei, 205 C. lusitaniae, 3,136 C. parapsilosis complex, and 1,016 C. tropicalis isolates. SYO MIC data gathered from 38 laboratories in Australia, Canada, Europe, Mexico, New Zealand, South Africa, and the United States were pooled to statistically define SYO ECVs. ECVs for anidulafungin, caspofungin, and micafungin encompassing ≥97.5% of the statistically modeled population were, respectively, 0.12, 0.25, and 0.06 μg/ml for C. albicans, 0.12, 0.25, and 0.03 μg/ml for C. glabrata complex, 4, 2, and 4 μg/ml for C. parapsilosis complex, 0.5, 0.25, and 0.06 μg/ml for C. tropicalis, 0.25, 1, and 0.25 μg/ml for C. krusei, 0.25, 1, and 0.12 μg/ml for C. lusitaniae, 4, 2, and 2 μg/ml for C. guilliermondii, and 0.25, 0.25, and 0.12 μg/ml for C. dubliniensis. Species-specific SYO ECVs for anidulafungin, caspofungin, and micafungin correctly classified 72 (88.9%), 74 (91.4%), and 76 (93.8%), respectively, of 81 Candida isolates with identified fks mutations. SYO ECVs may aid in detecting non-WT isolates with reduced susceptibility to anidulafungin, micafungin, and especially caspofungin, since testing the susceptibilities of Candida spp. to caspofungin by reference methodologies is not recommended. PMID:26282428
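
    The ECV concept used here (the MIC value encompassing at least 97.5% of the wild-type population) can be sketched directly. The sketch below applies the cutoff to a raw empirical distribution with hypothetical MIC counts; the study statistically models the WT distribution before applying the 97.5% criterion.

```python
def ecv(mic_values, coverage=0.975, dilutions=(0.008, 0.015, 0.03, 0.06, 0.12,
                                               0.25, 0.5, 1, 2, 4, 8, 16)):
    """Empirical ECV: the lowest two-fold dilution that encompasses at least
    `coverage` of the wild-type MIC distribution."""
    mics = sorted(mic_values)
    n = len(mics)
    for d in dilutions:
        if sum(1 for m in mics if m <= d) / n >= coverage:
            return d
    return None

# Hypothetical wild-type MIC distribution (ug/ml) for one species/drug pair
wt = [0.03] * 20 + [0.06] * 50 + [0.12] * 25 + [0.25] * 4 + [0.5] * 1
print(ecv(wt))   # -> 0.25
```

Here 95% of isolates fall at or below 0.12 ug/ml, which misses the 97.5% criterion, so the ECV moves up one dilution to 0.25 ug/ml.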

  2. Multicenter study of epidemiological cutoff values and detection of resistance in Candida spp. to anidulafungin, caspofungin, and micafungin using the Sensititre YeastOne colorimetric method.

    PubMed

    Espinel-Ingroff, A; Alvarez-Fernandez, M; Cantón, E; Carver, P L; Chen, S C-A; Eschenauer, G; Getsinger, D L; Gonzalez, G M; Govender, N P; Grancini, A; Hanson, K E; Kidd, S E; Klinker, K; Kubin, C J; Kus, J V; Lockhart, S R; Meletiadis, J; Morris, A J; Pelaez, T; Quindós, G; Rodriguez-Iglesias, M; Sánchez-Reus, F; Shoham, S; Wengenack, N L; Borrell Solé, N; Echeverria, J; Esperalba, J; Gómez-G de la Pedrosa, E; García García, I; Linares, M J; Marco, F; Merino, P; Pemán, J; Pérez Del Molino, L; Roselló Mayans, E; Rubio Calvo, C; Ruiz Pérez de Pipaon, M; Yagüe, G; Garcia-Effron, G; Guinea, J; Perlin, D S; Sanguinetti, M; Shields, R; Turnidge, J

    2015-11-01

    Neither breakpoints (BPs) nor epidemiological cutoff values (ECVs) have been established for Candida spp. with anidulafungin, caspofungin, and micafungin when using the Sensititre YeastOne (SYO) broth dilution colorimetric method. In addition, reference caspofungin MICs have so far proven to be unreliable. Candida species wild-type (WT) MIC distributions (for microorganisms in a species/drug combination with no detectable phenotypic resistance) were established for 6,007 Candida albicans, 186 C. dubliniensis, 3,188 C. glabrata complex, 119 C. guilliermondii, 493 C. krusei, 205 C. lusitaniae, 3,136 C. parapsilosis complex, and 1,016 C. tropicalis isolates. SYO MIC data gathered from 38 laboratories in Australia, Canada, Europe, Mexico, New Zealand, South Africa, and the United States were pooled to statistically define SYO ECVs. ECVs for anidulafungin, caspofungin, and micafungin encompassing ≥97.5% of the statistically modeled population were, respectively, 0.12, 0.25, and 0.06 μg/ml for C. albicans, 0.12, 0.25, and 0.03 μg/ml for C. glabrata complex, 4, 2, and 4 μg/ml for C. parapsilosis complex, 0.5, 0.25, and 0.06 μg/ml for C. tropicalis, 0.25, 1, and 0.25 μg/ml for C. krusei, 0.25, 1, and 0.12 μg/ml for C. lusitaniae, 4, 2, and 2 μg/ml for C. guilliermondii, and 0.25, 0.25, and 0.12 μg/ml for C. dubliniensis. Species-specific SYO ECVs for anidulafungin, caspofungin, and micafungin correctly classified 72 (88.9%), 74 (91.4%), and 76 (93.8%), respectively, of 81 Candida isolates with identified fks mutations. SYO ECVs may aid in detecting non-WT isolates with reduced susceptibility to anidulafungin, micafungin, and especially caspofungin, since testing the susceptibilities of Candida spp. to caspofungin by reference methodologies is not recommended. PMID:26282428

  3. Aircraft design for mission performance using nonlinear multiobjective optimization methods

    NASA Technical Reports Server (NTRS)

    Dovi, Augustine R.; Wrenn, Gregory A.

    1990-01-01

    A new technique which converts a constrained optimization problem to an unconstrained one where conflicting figures of merit may be simultaneously considered was combined with a complex mission analysis system. The method is compared with existing single and multiobjective optimization methods. A primary benefit from this new method for multiobjective optimization is the elimination of separate optimizations for each objective, which is required by some optimization methods. A typical wide body transport aircraft is used for the comparative studies.
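
    One established way to convert a constrained optimization problem with conflicting figures of merit into a single unconstrained one is the Kreisselmeier-Steinhauser (KS) envelope function, a smooth upper bound on the maximum of the objectives and constraint violations. Whether KS is exactly the technique of this paper is an assumption here, and the normalized figures of merit below are hypothetical.

```python
import math

def ks(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope: a smooth, differentiable upper
    bound on max(values), so minimizing it drives all terms down together."""
    m = max(values)  # shift for numerical stability of the exponentials
    return m + math.log(sum(math.exp(rho * (v - m)) for v in values)) / rho

# Hypothetical normalized figures of merit for a transport-aircraft design:
# scaled fuel burn, scaled gross weight, plus one violated constraint (g > 0)
objectives = [0.42, 0.57]
constraints = [0.10]
composite = ks(objectives + constraints)  # single unconstrained objective
print(round(composite, 3))
```

This replaces separate per-objective optimizations with one minimization of `composite`; larger `rho` tightens the envelope toward the true maximum.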

  4. Vaccine epidemiology: A review

    PubMed Central

    Lahariya, Chandrakant

    2016-01-01

    This review article outlines the key concepts in vaccine epidemiology, such as basic reproductive numbers, force of infection, vaccine efficacy and effectiveness, vaccine failure, herd immunity, herd effect, epidemiological shift, and disease modeling, and describes the application of this knowledge both at the program level and in practice by family physicians, epidemiologists, and pediatricians. A case has been made for increased knowledge and understanding of vaccine epidemiology among key stakeholders, including policy makers, immunization program managers, public health experts, pediatricians, family physicians, and other experts/individuals involved in immunization service delivery. It has been argued that knowledge of vaccine epidemiology is likely to benefit society through contributions to informed decision-making and improved vaccination coverage in low and middle income countries (LMICs). The article ends with suggestions for the provision of systematic training and learning platforms in vaccine epidemiology, to save millions of preventable deaths and improve health outcomes across the life-course. PMID:27453836
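
    Two of the key concepts listed, the basic reproductive number and herd immunity, connect through a standard closed form: under homogeneous mixing, the critical vaccination coverage is Vc = (1 - 1/R0) / VE, where VE is vaccine effectiveness. A short sketch with illustrative values follows.

```python
def critical_coverage(r0, vaccine_effectiveness=1.0):
    """Vaccination coverage needed for herd protection under the standard
    homogeneous-mixing approximation: Vc = (1 - 1/R0) / VE."""
    herd_threshold = 1.0 - 1.0 / r0
    return herd_threshold / vaccine_effectiveness

# Measles-like transmission (R0 ~ 15) with a 95%-effective vaccine
vc = critical_coverage(15, 0.95)
print(f"{vc:.0%}")   # -> 98%
```

The formula makes the program-level trade-off explicit: the higher R0 or the lower the vaccine effectiveness, the higher the coverage needed before indirect (herd) protection takes over.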

  5. A Preliminary Rubric Design to Evaluate Mixed Methods Research

    ERIC Educational Resources Information Center

    Burrows, Timothy J.

    2013-01-01

    With the increase in the frequency of the use of mixed methods, both in research publications and in externally funded grants, there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed…

  6. [The epidemiological surveillance of dengue in Mexico].

    PubMed

    Montesano-Castellanos, R; Ruiz-Matus, C

    1995-01-01

    The clinical behavior of dengue fever in Mexico has changed, now with the occurrence of hemorrhagic cases. In response to the emergence of such cases, a specific epidemiologic surveillance system has been designed and implemented. This system includes the means to monitor the factors involved in the evolution of the disease. The identification and analysis of these factors is necessary to implement prevention and control measures. This paper presents the main components and procedures of the epidemiologic surveillance system for common and hemorrhagic dengue fever in Mexico, emphasizing the usefulness of the risk approach to predict the pattern of this disease. The model includes the collaboration of a multidisciplinary group. The Epidemiologic Surveillance State Committee, coordinated by the National Health System, participates in the collection and analysis of epidemiologic data, particularly data related to the population, the individual, the vector, the viruses and the environment. PMID:8599150

  7. Mental health epidemiological research in South America: recent findings

    PubMed Central

    Silva de Lima, Maurício; Garcia de Oliveira Soares, Bernardo; de Jesus Mari, Jair

    2004-01-01

    This paper aims to review recent mental health epidemiological research conducted in South America. The Latin American and Caribbean (LILACS) database was searched from 1999 to 2003 using a specific strategy to identify cohort, case-control and cross-sectional population-based studies in South America. The authors screened references and identified relevant studies; further studies were obtained by contacting local experts in epidemiology. In total, 140 references were identified and 12 studies were selected. Most selected studies explored the prevalence of and risk factors for common mental disorders, and several used sophisticated methods of sample selection and analysis. There is a need to improve the quality of psychiatric journals in Latin America and to increase the distribution of and access to research data. Regionally relevant problems such as violence and substance abuse should be considered in designing future investigations in this area. PMID:16633474

  8. Multiple methods integration for structural mechanics analysis and design

    NASA Technical Reports Server (NTRS)

    Housner, J. M.; Aminpour, M. A.

    1991-01-01

    A new research area of multiple methods integration is proposed for joining diverse methods of structural mechanics analysis which interact with one another. Three categories of multiple methods are defined: those in which a physical interface is well defined; those in which a physical interface is not well defined, but selected; and those in which the interface is a mathematical transformation. Two fundamental integration procedures are presented that can be extended to integrate various methods (e.g., finite elements, Rayleigh-Ritz, Galerkin, and integral methods) with one another. Since the finite element method will likely be the major method to be integrated, its enhanced robustness under element distortion is also examined and a new robust shell element is demonstrated.

  9. The challenges of using epidemiology to inform clinical practice

    PubMed Central

    Kessler, Ronald C.

    2007-01-01

    This paper discusses challenges and prospects for increasing the clinical relevance of psychiatric epidemiological research. The discussion begins with a review of the structural determinants of the fact that current psychiatric epidemiological research has less clinical relevance than epidemiological research in other areas of medicine. The discussion then turns to ways in which the focus of psychiatric epidemiological research might be changed to increase its clinical relevance. A review is then presented of recent innovations in community psychiatric epidemiological research that were designed to increase clinical relevance. An argument is then made that the full clinical value of psychiatric epidemiology will only be realized when community epidemiology becomes better integrated with clinical epidemiology and the latter takes on a more prominent role than it currently has in psychiatric research. Existing initiatives to realize an integration of community psychiatric epidemiology with clinical epidemiology are then reviewed. Finally, an agenda is proposed for an expansion of clinical psychiatric epidemiology to include a focus on both naturalistic and quasi-experimental studies of illness course and treatment response in diverse clinical samples. PMID:17896231

  10. The evolution of disease: anthropological perspectives on epidemiologic transitions

    PubMed Central

    Zuckerman, Molly Kathleen; Harper, Kristin Nicole; Barrett, Ronald; Armelagos, George John

    2014-01-01

    Background: The model of epidemiologic transitions has served as a guiding framework for understanding relationships between patterns of human health and disease and economic development for the past several decades. However, epidemiologic transition theory is infrequently employed in epidemiology. Objective: Moving beyond Omran's original formulation, we discuss critiques and modifications of the theory of epidemiologic transitions and highlight some of the ways in which incorporating epidemiologic transition theory can benefit theory and practice in epidemiology. Design: We focus on two broad contemporary trends in human health that epidemiologic transition theory is useful for conceptualizing: the increased incidence of chronic inflammatory diseases (CIDs), such as allergic and autoimmune diseases, and the emergence and reemergence of infectious disease. Results: Situating these trends within epidemiologic transition theory, we explain the rise in CIDs with the hygiene hypothesis and the rise in emerging and reemerging infections with the concept of a third epidemiologic transition. Conclusions: Contextualizing these trends within epidemiologic transition theory reveals implications for clinical practice, global health policies, and future research within epidemiology. PMID:24848652

  11. Paper vs. electrons. Epidemiologic publishing in a changing world.

    PubMed

    Rothenberg; Frank; Fitzmaurice

    2000-10-01

    PURPOSE: To present the parallel histories of epidemiologic and electronic publishing and consider positive and negative factors that might affect their amalgam. METHODS: We performed a quantitative assessment of the arc of epidemiologic publication from 1966-1999, using major self-designated epidemiologic journals as a sample, and of scholarly electronic publication from 1991-1997, based on a current literature review. We used an online, paperless journal as a case study, and reviewed selected information-technology opinion in the area. RESULTS: By traditional standards, growth in epidemiologic publication has been considerable, with the addition of six new journals since 1966. In contrast, scholarly electronic publication for the period 1991-1997 grew from 27 to 2459 journals (not all exclusively online). Positive features of electronic publishing include flexibility, shortened time to publication, freedom from a fixed publication date, diversity in presentation, and instant linkage to relevant material. A case study of a new online journal illustrates the substantive power of the medium. Negative factors include restriction (or unrestricted expansion) of the audience, the potential for hasty peer review, pitfalls in establishing credibility, an emphasis on style over content, technologic dependence, and additions to the information explosion. Relative cost and archiving are still debated. In assessing the pros and cons, it is important to distinguish electronic mechanisms that facilitate publication from electronic publishing, and to appreciate the difference between moving an existing journal to the electronic medium and creating a new online journal. CONCLUSIONS: The movement from print to internet is probably inexorable, but a headlong rush may be ill-advised. Several models for dual publishing now exist, with the expectation that many journals, including those that serve epidemiology, will adopt them. The ultimate configuration is difficult to predict, but likely to be

  12. GESDB: a platform of simulation resources for genetic epidemiology studies.

    PubMed

    Yao, Po-Ju; Chung, Ren-Hua

    2016-01-01

    Computer simulations are routinely conducted to evaluate new statistical methods, to compare the properties among different methods, and to mimic the observed data in genetic epidemiology studies. Conducting simulation studies can become a complicated task as several challenges can occur, such as the selection of an appropriate simulation tool and the specification of parameters in the simulation model. Although abundant simulated data have been generated for human genetic research, currently there is no public database designed specifically as a repository for these simulated data. With the lack of such a database, for similar studies, similar simulations may have been repeated, which resulted in redundant work. Thus, we created an online platform, the Genetic Epidemiology Simulation Database (GESDB), for simulation data sharing and discussion of simulation techniques for genetic epidemiology studies. GESDB consists of a database for storing simulation scripts, simulated data and documentation from published articles as well as a discussion forum, which provides a platform for discussion of the simulated data and exchanging simulation ideas. Moreover, summary statistics such as the simulation tools that are most commonly used and datasets that are most frequently downloaded are provided. The statistics will be informative for researchers to choose an appropriate simulation tool or select a common dataset for method comparisons. GESDB can be accessed at http://gesdb.nhri.org.tw. Database URL: http://gesdb.nhri.org.tw. PMID:27242038

  13. GESDB: a platform of simulation resources for genetic epidemiology studies

    PubMed Central

    Yao, Po-Ju; Chung, Ren-Hua

    2016-01-01

    Computer simulations are routinely conducted to evaluate new statistical methods, to compare the properties among different methods, and to mimic the observed data in genetic epidemiology studies. Conducting simulation studies can become a complicated task as several challenges can occur, such as the selection of an appropriate simulation tool and the specification of parameters in the simulation model. Although abundant simulated data have been generated for human genetic research, currently there is no public database designed specifically as a repository for these simulated data. With the lack of such a database, for similar studies, similar simulations may have been repeated, which resulted in redundant work. Thus, we created an online platform, the Genetic Epidemiology Simulation Database (GESDB), for simulation data sharing and discussion of simulation techniques for genetic epidemiology studies. GESDB consists of a database for storing simulation scripts, simulated data and documentation from published articles as well as a discussion forum, which provides a platform for discussion of the simulated data and exchanging simulation ideas. Moreover, summary statistics such as the simulation tools that are most commonly used and datasets that are most frequently downloaded are provided. The statistics will be informative for researchers to choose an appropriate simulation tool or select a common dataset for method comparisons. GESDB can be accessed at http://gesdb.nhri.org.tw. Database URL: http://gesdb.nhri.org.tw PMID:27242038

  14. Statistical Methods for Rapid Aerothermal Analysis and Design Technology

    NASA Technical Reports Server (NTRS)

    Morgan, Carolyn; DePriest, Douglas; Thompson, Richard (Technical Monitor)

    2002-01-01

    The cost and safety goals for NASA's next generation of reusable launch vehicles (RLVs) will require that rapid high-fidelity aerothermodynamic design tools be used early in the design cycle. To meet these requirements, it is desirable to establish statistical models that quantify and improve the accuracy, extend the applicability, and enable combined analyses using existing prediction tools. The research work was focused on establishing suitable mathematical/statistical models for these purposes. It is anticipated that the resulting models can be incorporated into a software tool to provide rapid, variable-fidelity aerothermal environments to predict heating along an arbitrary trajectory. This work will support development of an integrated design tool to perform automated thermal protection system (TPS) sizing and material selection.

  15. A method for designing robust multivariable feedback systems

    NASA Technical Reports Server (NTRS)

    Milich, David Albert; Athans, Michael; Valavani, Lena; Stein, Gunter

    1988-01-01

    A new methodology is developed for the synthesis of linear, time-invariant (LTI) controllers for multivariable LTI systems. The aim is to achieve stability and performance robustness of the feedback system in the presence of multiple unstructured uncertainty blocks; i.e., to satisfy a frequency-domain inequality in terms of the structured singular value. The design technique is referred to as the Causality Recovery Methodology (CRM). Starting with an initial (nominally) stabilizing compensator, the CRM produces a closed-loop system whose performance-robustness is at least as good as, and hopefully superior to, that of the original design. The robustness improvement is obtained by solving an infinite-dimensional, convex optimization program. A finite-dimensional implementation of the CRM was developed, and it was applied to a multivariate design example.

  16. A method for designing robust multivariable feedback systems

    NASA Technical Reports Server (NTRS)

    Milich, David A.; Athans, Michael; Valavani, Lena; Stein, Gunter

    1988-01-01

    A new methodology is developed for the synthesis of linear, time-invariant (LTI) controllers for multivariable LTI systems. The aim is to achieve stability and performance robustness of the feedback system in the presence of multiple unstructured uncertainty blocks; i.e., to satisfy a frequency-domain inequality in terms of the structured singular value. The design technique is referred to as the causality recovery methodology (CRM). Starting with an initial (nominally) stabilizing compensator, the CRM produces a closed-loop system whose performance-robustness is at least as good as, and hopefully superior to, that of the original design. The robustness improvement is obtained by solving an infinite-dimensional, convex optimization program. A finite-dimensional implementation of the CRM was developed, and it was applied to a multivariate design example.

  17. Prevalence of Mixed-Methods Sampling Designs in Social Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.

    2006-01-01

    The purpose of this mixed-methods study was to document the prevalence of sampling designs utilised in mixed-methods research and to examine the interpretive consistency between interpretations made in mixed-methods studies and the sampling design used. Classification of studies was based on a two-dimensional mixed-methods sampling model. This…

  18. Advanced 3D inverse method for designing turbomachine blades

    SciTech Connect

    Dang, T.

    1995-10-01

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbo-machine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  19. Advanced 3D inverse method for designing turbomachine blades

    SciTech Connect

    Dang, T.

    1995-12-31

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbo-machine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  20. An overview of reliability methods in mechanical and structural design

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.; Ortiz, K.; Lee, S. J.

    1987-01-01

    An evaluation is made of modern methods of fast probability integration and Monte Carlo treatment for the assessment of structural systems' and components' reliability. Fast probability integration methods are noted to be more efficient than Monte Carlo ones. This is judged to be an important consideration when several point probability estimates must be made in order to construct a distribution function. An example illustrating the relative efficiency of the various methods is included.
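
As a concrete illustration of the Monte Carlo treatment this record contrasts with fast probability integration, the sketch below estimates a component failure probability by direct sampling and checks it against the closed form. This is a minimal illustration, not from the paper: the capacity/load limit state and all distribution parameters are hypothetical.

```python
import math
import random

def monte_carlo_failure_prob(limit_state, sample, n=200_000, seed=1):
    """Estimate P(limit_state(x) < 0) by direct Monte Carlo sampling."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample(rng)) < 0)
    return failures / n

# Hypothetical limit state: capacity R minus load S, both normally distributed.
def g(x):
    r, s = x
    return r - s

def draw(rng):
    return rng.gauss(10.0, 1.5), rng.gauss(6.0, 1.0)

p_mc = monte_carlo_failure_prob(g, draw)

# Closed-form check: R - S ~ N(4, sqrt(1.5^2 + 1.0^2)), so the failure
# probability is Phi(-beta) with reliability index beta = 4 / sqrt(3.25).
beta = 4.0 / math.sqrt(1.5**2 + 1.0**2)
p_exact = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
```

Fast probability integration methods gain their efficiency by approximating the integral around the most probable failure point instead of sampling, which matters when each limit-state evaluation is an expensive structural analysis and a full distribution function must be built from many point estimates.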

  1. An efficient multilevel optimization method for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.; Yang, Y. J.; Kim, D. S.

    1988-01-01

    An efficient multilevel design optimization technique is presented. The proposed method is based on the concept of providing linearized information between the system-level and subsystem-level optimization tasks. The advantages of the method are that it does not require optimum sensitivities, nonlinear equality constraints are not needed, and the method is relatively easy to use. The disadvantage is that the coupling between subsystems is not dealt with in a precise mathematical manner.

  2. Preliminary Axial Flow Turbine Design and Off-Design Performance Analysis Methods for Rotary Wing Aircraft Engines. Part 1; Validation

    NASA Technical Reports Server (NTRS)

    Chen, Shu-cheng S.

    2009-01-01

    For the preliminary design and the off-design performance analysis of axial flow turbines, a pair of intermediate level-of-fidelity computer codes, TD2-2 (design; reference 1) and AXOD (off-design; reference 2), are being evaluated for use in turbine design and performance prediction of modern high-performance aircraft engines. TD2-2 employs a streamline curvature method for design, while AXOD approaches the flow analysis with an equal radius-height domain decomposition strategy. Both methods resolve only the flows in the annulus region while modeling the impact introduced by the blade rows. The mathematical formulations and derivations involved in both methods are documented in references 3 and 4 (for TD2-2) and in reference 5 (for AXOD). The focus of this paper is to discuss the fundamental issues of applicability and compatibility of the two codes as a pair of companion pieces for performing preliminary design and off-design analysis of modern aircraft engine turbines. Two validation cases for design and off-design prediction using TD2-2 and AXOD, conducted on two existing high-efficiency turbines developed and tested in the NASA/GE Energy Efficient Engine (GE-E3) Program, the High Pressure Turbine (HPT; two stages, air cooled) and the Low Pressure Turbine (LPT; five stages, un-cooled), are provided in support of the analysis and discussion presented in this paper.

  3. Improved methods for classification, prediction, and design of antimicrobial peptides.

    PubMed

    Wang, Guangshun

    2015-01-01

    Peptides with diverse amino acid sequences, structures, and functions are essential players in biological systems. The construction of well-annotated databases not only facilitates effective information management, search, and mining but also lays the foundation for developing and testing new peptide algorithms and machines. The antimicrobial peptide database (APD) is an original construction in terms of both database design and peptide entries. The host defense antimicrobial peptides (AMPs) registered in the APD cover the five kingdoms (bacteria, protists, fungi, plants, and animals) or three domains of life (bacteria, archaea, and eukaryota). This comprehensive database ( http://aps.unmc.edu/AP ) provides useful information on peptide discovery timeline, nomenclature, classification, glossary, calculation tools, and statistics. The APD enables effective search, prediction, and design of peptides with antibacterial, antiviral, antifungal, antiparasitic, insecticidal, spermicidal, anticancer activities, chemotactic, immune modulation, or antioxidative properties. A universal classification scheme is proposed herein to unify innate immunity peptides from a variety of biological sources. As an improvement, the upgraded APD makes predictions based on the database-defined parameter space and provides a list of the sequences most similar to natural AMPs. In addition, the powerful pipeline design of the database search engine laid a solid basis for designing novel antimicrobials to combat resistant superbugs, viruses, fungi, or parasites. This comprehensive AMP database is a useful tool for both research and education. PMID:25555720

  4. Categorizing Student Software Designs: Methods, Results, and Implications

    ERIC Educational Resources Information Center

    Eckerdal, Anna; McCartney, Robert; Mostrom, Jan Erik; Ratcliffe, Mark; Zander, Carol

    2006-01-01

    This paper examines the problem of studying and comparing student software designs. We propose semantic categorization as a way to organize widely varying data items. We describe how this was used to organize a particular multi-national, multi-institutional dataset, and present the results of this analysis: most students are unable to effectively…

  5. A Prospective Method to Guide Small Molecule Drug Design

    ERIC Educational Resources Information Center

    Johnson, Alan T.

    2015-01-01

    At present, small molecule drug design follows a retrospective path when considering what analogs are to be made around a current hit or lead molecule with the focus often on identifying a compound with higher intrinsic potency. What this approach overlooks is the simultaneous need to also improve the physicochemical (PC) and pharmacokinetic (PK)…

  6. Study of design and analysis methods for transonic flow

    NASA Technical Reports Server (NTRS)

    Murman, E. M.

    1977-01-01

    An airfoil design program and a boundary layer analysis were developed. Boundary conditions were derived for ventilated transonic wind tunnels and for performing transonic wind-tunnel wall calculations. A computational procedure for rotational transonic flow in engine inlet throats was formulated. Results and conclusions are summarized.

  7. COPTRAN - A method of optimum communication systems design

    NASA Technical Reports Server (NTRS)

    Brinkman, K. L.; Pratt, W. K.; Stokes, L. S.; Weber, J. W.

    1970-01-01

    A single set of mathematical expressions describes system cost and probability of error of data transmission in terms of four basic parameters in the link equation. A Lagrange multiplier sets up equations whose solutions yield the optimum values for system design considerations and for weight and cost values.

  8. Library Design Analysis Using Post-Occupancy Evaluation Methods.

    ERIC Educational Resources Information Center

    James, Dennis C.; Stewart, Sharon L.

    1995-01-01

    Presents findings of a user-based study of the interior of Rodger's Science and Engineering Library at the University of Alabama. Compared facility evaluations from faculty, library staff, and graduate and undergraduate students. Features evaluated include: acoustics, aesthetics, book stacks, design, finishes/materials, furniture, lighting,…

  9. Improved Methods for Classification, Prediction and Design of Antimicrobial Peptides

    PubMed Central

    Wang, Guangshun

    2015-01-01

    Peptides with diverse amino acid sequences, structures and functions are essential players in biological systems. The construction of well-annotated databases not only facilitates effective information management, search and mining, but also lays the foundation for developing and testing new peptide algorithms and machines. The antimicrobial peptide database (APD) is an original construction in terms of both database design and peptide entries. The host defense antimicrobial peptides (AMPs) registered in the APD cover the five kingdoms (bacteria, protists, fungi, plants, and animals) or three domains of life (bacteria, archaea, and eukaryota). This comprehensive database (http://aps.unmc.edu/AP) provides useful information on peptide discovery timeline, nomenclature, classification, glossary, calculation tools, and statistics. The APD enables effective search, prediction, and design of peptides with antibacterial, antiviral, antifungal, antiparasitic, insecticidal, spermicidal, anticancer activities, chemotactic, immune modulation, or anti-oxidative properties. A universal classification scheme is proposed herein to unify innate immunity peptides from a variety of biological sources. As an improvement, the upgraded APD makes predictions based on the database-defined parameter space and provides a list of the sequences most similar to natural AMPs. In addition, the powerful pipeline design of the database search engine laid a solid basis for designing novel antimicrobials to combat resistant superbugs, viruses, fungi or parasites. This comprehensive AMP database is a useful tool for both research and education. PMID:25555720

  10. Overview of control design methods for smart structural system

    NASA Astrophysics Data System (ADS)

    Rao, Vittal S.; Sana, Sridhar

    2001-08-01

    Smart structures are the result of effective integration of control system design and signal processing with structural systems, to maximally utilize new advances in materials for structures, actuation, and sensing and to obtain the best performance for the application at hand. Research in smart structures is constantly driving towards attaining the self-adaptive and diagnostic capabilities that biological systems possess. This has been manifested in a number of successful applications in many areas of engineering, such as aerospace, civil, and automotive systems. Instrumental in the development of such systems are smart materials, such as piezoelectric, shape-memory-alloy, electrostrictive, magnetostrictive, and fiber-optic materials and various composite materials, for use as actuators, sensors, and structural members. The need for control systems that maximally utilize smart actuation and sensing materials to produce highly distributed and highly adaptable controllers has spurred research in smart structural modeling, identification, actuator/sensor design and placement, and control system design, including adaptive and robust controllers built with tools such as neural networks, fuzzy logic, genetic algorithms, and linear matrix inequalities, as well as electronics for controller implementation, such as analog electronics, microcontrollers, digital signal processors (DSPs), application-specific integrated circuits (ASICs), field-programmable gate arrays (FPGAs), and multichip modules (MCMs). In this paper, we give a brief overview of the state of control in smart structures. Different aspects of the development of smart structures, such as applications, technology, and theoretical advances, especially in the area of control systems design and implementation, will be covered.

  11. Optimal reliability design method for remote solar systems

    NASA Astrophysics Data System (ADS)

    Suwapaet, Nuchida

    A unique optimal reliability design algorithm is developed for remote communication systems. The algorithm deals with either minimizing the unavailability of the system within a fixed cost or minimizing the cost of the system subject to an unavailability constraint. The unavailability of the system is a function of three possible failure occurrences: individual component breakdown, solar energy deficiency (loss of load probability), and satellite/radio transmission loss. The three mathematical models of component failure, solar power failure, and transmission failure are combined and formulated as a nonlinear programming optimization problem with binary decision variables, such as the number and type (or size) of photovoltaic modules, batteries, radios, antennas, and controllers. The three possible failures are identified and integrated in a computer algorithm to generate the parameters for the optimization algorithm. The optimization algorithm is implemented with a branch-and-bound solution technique in MS Excel Solver. The algorithm is applied to a case study design for an actual system that will be set up in remote mountainous areas of Peru. The automated algorithm is verified with independent calculations. The optimal results from minimizing the unavailability of the system under the cost constraint and from minimizing the total cost of the system under the unavailability constraint are consistent with each other. The tradeoff feature in the algorithm allows designers to observe the results of 'what-if' scenarios of relaxing constraint bounds, thus obtaining the most benefit from the optimization process. An example of this approach applied to an existing communication system in the Andes shows dramatic improvement in reliability for little increase in cost. The algorithm is a real design tool, unlike other existing simulation design tools. The algorithm should be useful for other stochastic systems where component reliability, random supply and demand, and communication are
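
The record's formulation (binary decisions, an unavailability objective, a budget constraint) can be sketched as follows. All per-unit failure probabilities and costs below are invented for illustration, and exhaustive enumeration stands in for the branch-and-bound search performed in MS Excel Solver.

```python
from itertools import product

# Hypothetical per-unit failure probability and cost for each component type.
components = {
    "pv_module": (0.10, 400),
    "battery":   (0.05, 250),
    "radio":     (0.02, 600),
}

def system_unavailability(counts):
    # Redundant units in parallel: a component type fails only if all of its
    # units fail; the system fails if any component type fails.
    avail = 1.0
    for (p_fail, _), n in zip(components.values(), counts):
        avail *= 1.0 - p_fail ** n
    return 1.0 - avail

def total_cost(counts):
    return sum(cost * n for (_, cost), n in zip(components.values(), counts))

def best_design(budget, max_units=4):
    # Exhaustive search over unit counts; a branch-and-bound solver would
    # prune this same space instead of enumerating all of it.
    feasible = [c for c in product(range(1, max_units + 1), repeat=len(components))
                if total_cost(c) <= budget]
    return min(feasible, key=system_unavailability)

design = best_design(budget=3000)
```

Swapping the objective and the constraint (minimize cost subject to an unavailability bound) uses the same machinery, which is how the two cases in the record can be cross-checked against each other.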

  12. Applications of Genetic Methods to NASA Design and Operations Problems

    NASA Technical Reports Server (NTRS)

    Laird, Philip D.

    1996-01-01

    We review four recent NASA-funded applications in which evolutionary/genetic methods are important. In the process we survey: the kinds of problems being solved today with these methods; techniques and tools used; problems encountered; and areas where research is needed. The presentation slides are annotated briefly at the top of each page.

  13. Consumers' Kansei Needs Clustering Method for Product Emotional Design Based on Numerical Design Structure Matrix and Genetic Algorithms

    PubMed Central

    Chen, Deng-kai; Gu, Rong; Gu, Yu-feng; Yu, Sui-huai

    2016-01-01

    Consumers' Kansei needs reflect their perception about a product and always consist of a large number of adjectives. Reducing the dimensional complexity of these needs to extract primary words not only enables the target product to be explicitly positioned, but also provides a convenient design basis for designers engaging in design work. Accordingly, this study employs a numerical design structure matrix (NDSM), built by parameterizing a conventional DSM, and integrates genetic algorithms to find optimum Kansei clusters. A four-point scale method is applied to assign link weights of every two Kansei adjectives as values of cells when constructing an NDSM. Genetic algorithms are used to cluster the Kansei NDSM and find optimum clusters. Furthermore, the process of the proposed method is presented. The details of the proposed approach are illustrated using the example of an electronic scooter for Kansei needs clustering. The case study reveals that the proposed method is promising for clustering Kansei needs adjectives in product emotional design.
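
A minimal sketch of the clustering step, under stated assumptions: the six adjectives and their four-point-scale link weights below are invented, and the fitness used here (minimize the link weight crossing cluster boundaries, rejecting the degenerate single cluster) is one reasonable proxy for the paper's objective, not its exact formulation.

```python
import random

# Hypothetical 4-point-scale link weights (0-3) between six Kansei adjectives,
# standing in for a numerical design structure matrix (NDSM).
ADJ = ["sleek", "sporty", "dynamic", "calm", "soft", "gentle"]
W = [
    [0, 3, 3, 0, 0, 1],
    [3, 0, 3, 0, 1, 0],
    [3, 3, 0, 1, 0, 0],
    [0, 0, 1, 0, 3, 3],
    [0, 1, 0, 3, 0, 3],
    [1, 0, 0, 3, 3, 0],
]

def cut_weight(assign):
    # Total link weight crossing cluster boundaries: lower means tighter clusters.
    return sum(W[i][j]
               for i in range(len(assign))
               for j in range(i + 1, len(assign))
               if assign[i] != assign[j])

def fitness(assign):
    if len(set(assign)) < 2:          # reject the degenerate single cluster
        return float("-inf")
    return -cut_weight(assign)

def evolve(k=2, pop_size=60, generations=80, seed=3):
    rng = random.Random(seed)
    pop = [[rng.randrange(k) for _ in ADJ] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]       # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(ADJ))   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:             # point mutation
                child[rng.randrange(len(ADJ))] = rng.randrange(k)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
clusters = [sorted(a for a, g in zip(ADJ, best) if g == c) for c in sorted(set(best))]
```

With these toy weights the optimum separates the "sleek/sporty/dynamic" group from the "calm/soft/gentle" group, which is the kind of primary-word extraction the record describes.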

  14. Design and ergonomics. Methods for integrating ergonomics at hand tool design stage.

    PubMed

    Marsot, Jacques; Claudon, Laurent

    2004-01-01

    As a marked increase in the number of musculoskeletal disorders was noted in many industrialized countries, and more specifically in companies that require the use of hand tools, the French National Research and Safety Institute (INRS) launched a research project in 1999 on integrating ergonomics into hand tool design, and more particularly into the design of a boning knife. After a brief recall of the difficulties of integrating ergonomics at the design stage, the present paper shows how three design methodology tools--Functional Analysis, Quality Function Deployment and TRIZ--have been applied to the design of a boning knife. Implementation of these tools enabled us to demonstrate the extent to which they are capable of responding to the difficulties of integrating ergonomics into product design. PMID:15028190

  15. [Epidemiological research in Brazil].

    PubMed

    Guimarães, R; Lourenço-De-Oliveira, R; Cosac, S

    2001-08-01

    The current epidemiological research in Brazil is described. Secondary data sources were consulted, such as the year 2000 database of the Brazilian Directory of Research Groups and the National Board of Scientific and Technological Development (CNPq). A group was classified as a research group based on the existence of at least one research line in the field of epidemiology, as defined by the group leader. After identifying this universe of epidemiological research, comprising 176 groups and 320 distinct research lines, the following issues are presented and discussed: the relationship between research financing and health research, focusing on CAPES (Coordination Center for the Advance of University Professionals) graduate programs; public health research and epidemiological research; the geographic and institutional distribution and outreach of current epidemiological research; the researchers and students directly participating in epidemiological research; research topics and patterns of disseminating research findings; the journals where full papers were published; and the financial support of epidemiological research, focusing on the 23 officially recognized graduate programs in the public health field. PMID:11600921

  16. Networks and the Epidemiology of Infectious Disease

    PubMed Central

    Danon, Leon; Ford, Ashley P.; House, Thomas; Jewell, Chris P.; Keeling, Matt J.; Roberts, Gareth O.; Ross, Joshua V.; Vernon, Matthew C.

    2011-01-01

    The science of networks has revolutionised research into the dynamics of interacting elements. It could be argued that epidemiology in particular has embraced the potential of network theory more than any other discipline. Here we review the growing body of research concerning the spread of infectious diseases on networks, focusing on the interplay between network theory and epidemiology. The review is split into four main sections, which examine: the types of network relevant to epidemiology; the multitude of ways these networks can be characterised; the statistical methods that can be applied to infer the epidemiological parameters on a realised network; and finally simulation and analytical methods to determine epidemic dynamics on a given network. Given the breadth of areas covered and the ever-expanding number of publications, a comprehensive review of all work is impossible. Instead, we provide a personalised overview into the areas of network epidemiology that have seen the greatest progress in recent years or have the greatest potential to provide novel insights. As such, considerable importance is placed on analytical approaches and statistical methods which are both rapidly expanding fields. Throughout this review we restrict our attention to epidemiological issues. PMID:21437001
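
The simulation side of the review can be illustrated with a discrete-time stochastic SIR process on a contact network. The ring network and the per-contact transmission and recovery probabilities below are illustrative choices, not values from the paper.

```python
import random

def sir_on_network(adj, beta, gamma, patient_zero=0, seed=7):
    """Discrete-time stochastic SIR on a contact network given as adjacency lists.

    beta:  per-contact, per-step transmission probability
    gamma: per-step recovery probability
    Returns the final outbreak size (number of nodes ever infected)."""
    rng = random.Random(seed)
    state = {v: "S" for v in adj}
    state[patient_zero] = "I"
    ever_infected = {patient_zero}
    while any(s == "I" for s in state.values()):
        infections, recoveries = [], []
        for v, s in state.items():
            if s != "I":
                continue
            for u in adj[v]:                      # transmit along network edges
                if state[u] == "S" and rng.random() < beta:
                    infections.append(u)
            if rng.random() < gamma:              # recover independently
                recoveries.append(v)
        for u in infections:
            state[u] = "I"
            ever_infected.add(u)
        for v in recoveries:
            state[v] = "R"
    return len(ever_infected)

# Illustrative contact structure: a ring of 20 nodes, each touching two neighbours.
ring = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
final_size = sir_on_network(ring, beta=0.8, gamma=0.1)
```

On networks, the final outbreak size depends on the contact structure as well as on beta and gamma, which is exactly the interplay between network theory and epidemiological parameters that the review examines.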

  17. Design component method for sensitivity analysis of built-up structures

    NASA Technical Reports Server (NTRS)

    Choi, Kyung K.; Seong, Hwai G.

    1986-01-01

    A 'design component method' that provides a unified and systematic organization of design sensitivity analysis for built-up structures is developed and implemented. Both conventional design variables, such as thickness and cross-sectional area, and shape design variables of components of built-up structures are considered. It is shown that design of components of built-up structures can be characterized and system design sensitivity expressions obtained by simply adding contributions from each component. The method leads to a systematic organization of computations for design sensitivity analysis that is similar to the way in which computations are organized within a finite element code.

  18. Inverse airfoil design procedure using a multigrid Navier-Stokes method

    NASA Technical Reports Server (NTRS)

    Malone, J. B.; Swanson, R. C.

    1991-01-01

    The Modified Garabedian McFadden (MGM) design procedure was incorporated into an existing 2-D multigrid Navier-Stokes airfoil analysis method. The resulting design method is an iterative procedure based on a residual correction algorithm and permits the automated design of airfoil sections with prescribed surface pressure distributions. The new design method, Multigrid Modified Garabedian McFadden (MG-MGM), is demonstrated for several different transonic pressure distributions obtained from both symmetric and cambered airfoil shapes. The airfoil profiles generated with the MG-MGM code are compared to the original configurations to assess the capabilities of the inverse design method.

  19. Synthesis of calculational methods for design and analysis of radiation shields for nuclear rocket systems

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.; Jordan, T. A.; Soltesz, R. G.; Woodsum, H. C.

    1969-01-01

    Eight computer programs make up a nine-volume synthesis containing two design methods for nuclear rocket radiation shields. The first design method is appropriate for parametric and preliminary studies, while the second accomplishes the verification of a final nuclear rocket reactor design.

  20. 40 CFR 53.8 - Designation of reference and equivalent methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... satisfy the applicable requirements of this part shall be designated as a FRM or FEM (as applicable) by... notice indicating that the method has been designated as a FRM or FEM shall be sent to the applicant. (c) The Administrator will maintain a current list of methods designated as FRM or FEM in accordance...

  1. 40 CFR 53.8 - Designation of reference and equivalent methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... satisfy the applicable requirements of this part shall be designated as a FRM or FEM (as applicable) by... notice indicating that the method has been designated as a FRM or FEM shall be sent to the applicant. (c) The Administrator will maintain a current list of methods designated as FRM or FEM in accordance...

  2. How to Combine Objectives and Methods of Evaluation in Iterative ILE Design: Lessons Learned from Designing Ambre-Add

    ERIC Educational Resources Information Center

    Nogry, S.; Jean-Daubias, S.; Guin, N.

    2012-01-01

    This article deals with evaluating an interactive learning environment (ILE) during the iterative-design process. Various aspects of the system must be assessed and a number of evaluation methods are available. In designing the ILE Ambre-add, several techniques were combined to test and refine the system. In particular, we point out the merits of…

  3. A hybrid nonlinear programming method for design optimization

    NASA Technical Reports Server (NTRS)

    Rajan, S. D.

    1986-01-01

    Solutions to engineering design problems formulated as nonlinear programming (NLP) problems usually require the use of more than one optimization technique. Moreover, the interaction between the user (analysis/synthesis) program and the NLP system can lead to interface, scaling, or convergence problems. An NLP solution system is presented that seeks to solve these problems by providing a programming system to ease the user-system interface. A simple set of rules is used to select an optimization technique or to switch from one technique to another in an attempt to detect, diagnose, and solve some potential problems. Numerical examples involving finite element based optimal design of space trusses and rotor bearing systems are used to illustrate the applicability of the proposed methodology.
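
    A rule-based switch between optimization techniques, of the kind the abstract describes, can be sketched as follows (the objective, step sizes, and switching rule are all hypothetical; the paper's actual system is far more elaborate):

```python
# Hybrid strategy sketch: start with gradient descent and switch to a
# derivative-free coordinate search when a step fails to improve the
# objective, i.e. a simple rule diagnoses stalled progress.

def f(x):  # hypothetical objective: minimum at (1, 2)
    return (x[0] - 1.0)**2 + 10.0 * (x[1] - 2.0)**2

def grad(x):
    return [2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)]

def hybrid_minimize(x, steps=500):
    mode, lr, h = "gradient", 0.11, 0.5   # lr deliberately aggressive
    for _ in range(steps):
        if mode == "gradient":
            g = grad(x)
            new = [xi - lr * gi for xi, gi in zip(x, g)]
            if f(new) >= f(x):            # overshoot detected: switch technique
                mode = "coordinate"
            else:
                x = new
        else:  # coordinate search: probe +/- h along each axis
            improved = False
            for i in range(len(x)):
                for d in (h, -h):
                    trial = list(x)
                    trial[i] += d
                    if f(trial) < f(x):
                        x, improved = trial, True
            if not improved:
                h *= 0.5                  # shrink the probe step
    return x

x = hybrid_minimize([5.0, -3.0])
print([round(v, 3) for v in x])
```

    The aggressive gradient step overshoots immediately on this ill-scaled objective, the rule detects it, and the coordinate search finishes the job.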

  4. Simple optimization method for EMI mesh pattern design

    NASA Astrophysics Data System (ADS)

    Alpman, Mehmet Erhan; Senger, Tolga

    2014-05-01

    Metallic mesh coatings are widely applied to visible and infrared windows and domes to provide shielding from EMI (Electromagnetic Interference). In this paper, different EMI mesh geometries are compared with each other with respect to various performance parameters. Selecting the EMI mesh geometry best suited to a particular optical system, however, is not straightforward. We therefore present a simple optimization methodology for choosing the EMI mesh design best suited to our high-performance ISR (Intelligence, Surveillance and Reconnaissance) systems.

  5. Design Method of Fault Detector for Injection Unit

    NASA Astrophysics Data System (ADS)

    Ochi, Kiyoshi; Saeki, Masami

    An injection unit is considered as a speed control system utilizing a reaction-force sensor. Our purpose is to design a fault detector that detects and isolates actuator and sensor faults under the condition that the system is disturbed by a reaction force. First described is the fault detector's general structure. In this system, a disturbance observer that estimates the reaction force is designed for the speed control system in order to obtain the residual signals, and then post-filters that separate the specific frequency elements from the residual signals are applied in order to generate the decision signals. Next, we describe a fault detector designed specifically for a model of the injection unit. It is shown that the disturbance imposed on the decision variables can be made significantly small by appropriate adjustments to the observer bandwidth, and that most of the sensor faults and actuator faults can be detected and some of them can be isolated in the frequency domain by setting the frequency characteristics of the post-filters appropriately. Our result is verified by experiments for an actual injection unit.
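
    A minimal residual-generation sketch in this spirit (the plant model, filter time constant, fault size, and threshold are all hypothetical, not the authors' design):

```python
# Sketch: a model-based prediction yields a residual, a first-order
# low-pass post-filter suppresses the fast reaction-force ripple, and
# a threshold flags an additive sensor fault injected at t = 5 s.

import math

dt, a, b = 0.01, -2.0, 1.0          # hypothetical speed dynamics dv/dt = a*v + b*u
v_true, v_model, filt = 0.0, 0.0, 0.0
alpha = dt / (0.2 + dt)             # low-pass post-filter, 0.2 s time constant
detected_at = None

for k in range(1000):
    t = k * dt
    u = 1.0
    disturbance = 0.05 * math.sin(20.0 * t)   # fast reaction-force ripple
    v_true += dt * (a * v_true + b * u + disturbance)
    v_model += dt * (a * v_model + b * u)
    fault = 0.3 if t >= 5.0 else 0.0          # additive sensor fault
    y = v_true + fault                        # measured speed
    residual = y - v_model
    filt += alpha * (residual - filt)         # post-filter the residual
    if detected_at is None and abs(filt) > 0.1:
        detected_at = t

print(detected_at)
```

    Because the disturbance lives at a frequency the plant and post-filter both attenuate, the filtered residual stays small until the fault appears, which is detected within about 0.1 s.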

  6. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150....

  7. 14 CFR 161.9 - Designation of noise description methods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... and methods prescribed under appendix A of 14 CFR part 150; and (b) Use of computer models to create noise contours must be in accordance with the criteria prescribed under appendix A of 14 CFR part 150....

  8. Object-oriented design of preconditioned iterative methods

    SciTech Connect

    Bruaset, A.M.

    1994-12-31

    In this talk the author discusses how object-oriented programming techniques can be used to develop a flexible software package for preconditioned iterative methods. The ideas described have been used to implement the linear algebra part of Diffpack, which is a collection of C++ class libraries that provides high-level tools for the solution of partial differential equations. In particular, this software package is aimed at rapid development of PDE-based numerical simulators, primarily using finite element methods.
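
    The design idea can be sketched in Python rather than C++ (the class and function names are hypothetical, not the Diffpack API): the solver is written against an abstract preconditioner interface, so implementations can be swapped without touching the iteration.

```python
# Object-oriented sketch of preconditioned conjugate gradients: the
# solver depends only on the abstract Preconditioner interface.

class Preconditioner:
    def apply(self, r):
        raise NotImplementedError

class IdentityPrec(Preconditioner):
    def apply(self, r):
        return list(r)

class JacobiPrec(Preconditioner):
    def __init__(self, A):
        self.diag = [A[i][i] for i in range(len(A))]
    def apply(self, r):
        return [ri / di for ri, di in zip(r, self.diag)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def pcg(A, b, prec, tol=1e-10, max_iter=100):
    """Preconditioned conjugate gradients for symmetric positive definite A."""
    x = [0.0] * len(b)
    r = list(b)
    z = prec.apply(r)
    p = list(z)
    rz = dot(r, z)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        if dot(r, r) ** 0.5 < tol:
            break
        z = prec.apply(r)
        rz_new = dot(r, z)
        beta = rz_new / rz
        rz = rz_new
        p = [zi + beta * pi for zi, pi in zip(z, p)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = pcg(A, b, JacobiPrec(A))
x2 = pcg(A, b, IdentityPrec())   # same interface, different preconditioner
print([round(v, 6) for v in x])
```

    Swapping `JacobiPrec` for `IdentityPrec` changes one argument, not the solver, which is exactly the flexibility the talk advocates.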

  9. Reducing Design Risk Using Robust Design Methods: A Dual Response Surface Approach

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Yeniay, Ozgur; Lepsch, Roger A. (Technical Monitor)

    2003-01-01

    Space transportation system conceptual design is a multidisciplinary process containing a considerable element of risk. Risk here is defined as the variability in the estimated (output) performance characteristic of interest resulting from the uncertainties in the values of several disciplinary design and/or operational parameters. Uncertainties from one discipline (and/or subsystem) may propagate to another through linking parameters, and the final system output may have a significant accumulation of risk. This variability can result in significant deviations from the expected performance. Therefore, an estimate of variability (called design risk in this study) together with the expected performance characteristic value (e.g. mean empty weight) is necessary for multidisciplinary optimization for a robust design. Robust design in this study is defined as a solution that minimizes variability subject to a constraint on mean performance characteristics. Even though multidisciplinary design optimization has gained wide attention and applications, the treatment of uncertainties to quantify and analyze design risk has received little attention. This research effort explores the dual response surface approach to quantify variability (risk) in critical performance characteristics (such as weight) during conceptual design.
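
    The dual-response idea can be illustrated with a deliberately tiny example (both response surfaces and the constraint value are hypothetical, not the study's vehicle data): evaluate a mean-response model and a variability-response model over candidate designs, then minimize variability subject to a constraint on the mean.

```python
# Dual-response sketch: one surface models the mean performance, a
# second models its variability; robust design = min variability
# subject to a mean constraint.

def mean_weight(x):          # hypothetical mean-response surface
    return 100.0 + 5.0 * (x - 2.0)**2

def std_weight(x):           # hypothetical variability-response surface
    return 1.0 + abs(x - 3.0)

candidates = [i * 0.1 for i in range(0, 51)]             # x in [0, 5]
feasible = [x for x in candidates if mean_weight(x) <= 110.0]
robust_x = min(feasible, key=std_weight)                 # least variable, feasible

print(round(robust_x, 1), round(mean_weight(robust_x), 2), round(std_weight(robust_x), 2))
```

    Note that the robust choice (x = 3.0) is not the mean-optimal one (x = 2.0): it trades a slightly worse expected weight for much lower variability, which is the essence of the approach.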

  10. Advanced Control and Protection system Design Methods for Modular HTGRs

    SciTech Connect

    Ball, Sydney J; Wilson Jr, Thomas L; Wood, Richard Thomas

    2012-06-01

    The project supported the Nuclear Regulatory Commission (NRC) in identifying and evaluating the regulatory implications concerning the control and protection systems proposed for use in the Department of Energy's (DOE) Next-Generation Nuclear Plant (NGNP). The NGNP, using modular high-temperature gas-cooled reactor (HTGR) technology, is to provide commercial industries with electricity and high-temperature process heat for industrial processes such as hydrogen production. Process heat temperatures range from 700 to 950 C, and for the upper range of these operation temperatures, the modular HTGR is sometimes referred to as the Very High Temperature Reactor or VHTR. Initial NGNP designs are for operation in the lower temperature range. The defining safety characteristic of the modular HTGR is that its primary defense against serious accidents is to be achieved through its inherent properties of the fuel and core. Because of its strong negative temperature coefficient of reactivity and the capability of the fuel to withstand high temperatures, fast-acting active safety systems or prompt operator actions should not be required to prevent significant fuel failure and fission product release. The plant is designed such that its inherent features should provide adequate protection despite operational errors or equipment failure. Figure 1 shows an example modular HTGR layout (prismatic core version), where its inlet coolant enters the reactor vessel at the bottom, traversing up the sides to the top plenum, down-flow through an annular core, and exiting from the lower plenum (hot duct). This research provided NRC staff with (a) insights and knowledge about the control and protection systems for the NGNP and VHTR, (b) information on the technologies/approaches under consideration for use in the reactor and process heat applications, (c) guidelines for the design of highly integrated control rooms, (d) consideration for modeling of control and protection system designs for

  11. Ophthalmic epidemiology in Europe: the "European Eye Epidemiology" (E3) consortium.

    PubMed

    Delcourt, Cécile; Korobelnik, Jean-François; Buitendijk, Gabriëlle H S; Foster, Paul J; Hammond, Christopher J; Piermarocchi, Stefano; Peto, Tunde; Jansonius, Nomdo; Mirshahi, Alireza; Hogg, Ruth E; Bretillon, Lionel; Topouzis, Fotis; Deak, Gabor; Grauslund, Jakob; Broe, Rebecca; Souied, Eric H; Creuzot-Garcher, Catherine; Sahel, José; Daien, Vincent; Lehtimäki, Terho; Hense, Hans-Werner; Prokofyeva, Elena; Oexle, Konrad; Rahi, Jugnoo S; Cumberland, Phillippa M; Schmitz-Valckenberg, Steffen; Fauser, Sascha; Bertelsen, Geir; Hoyng, Carel; Bergen, Arthur; Silva, Rufino; Wolf, Sebastian; Lotery, Andrew; Chakravarthy, Usha; Fletcher, Astrid; Klaver, Caroline C W

    2016-02-01

    The European Eye Epidemiology (E3) consortium is a recently formed consortium of 29 groups from 12 European countries. It already comprises 21 population-based studies and 20 other studies (case-control, cases only, randomized trials), providing ophthalmological data on approximately 170,000 European participants. The aim of the consortium is to promote and sustain collaboration and sharing of data and knowledge in the field of ophthalmic epidemiology in Europe, with particular focus on the harmonization of methods for future research, estimation and projection of frequency and impact of visual outcomes in European populations (including temporal trends and European subregions), identification of risk factors and pathways for eye diseases (lifestyle, vascular and metabolic factors, genetics, epigenetics and biomarkers) and development and validation of prediction models for eye diseases. Coordinating these existing data will allow a detailed study of the risk factors and consequences of eye diseases and visual impairment, including study of international geographical variation which is not possible in individual studies. It is expected that collaborative work on these existing data will provide additional knowledge, despite the fact that the risk factors and the methods for collecting them differ somewhat among the participating studies. Most studies also include biobanks of various biological samples, which will enable identification of biomarkers to detect and predict occurrence and progression of eye diseases. This article outlines the rationale of the consortium, its design and presents a summary of the methodology. PMID:26686680

  12. Symphony: A case study for exploring and describing design methods and guidelines for learner-centered design

    NASA Astrophysics Data System (ADS)

    Quintana, Christopher

    Learner-centered design is an evolving software design perspective addressing the needs of learners---a specific audience trying to work in and understand new work practices in which they have a novice or naive understanding. Learner-centered design involves designing software that incorporates work support features (or scaffolding features) informed by social constructivist learning theories. By adopting a constructivist "learning by doing" perspective, scaffolds should support learners so they can mindfully engage in previously inaccessible work activity, which in turn allows those learners to progressively gain a better understanding of the new work. While there is an intuitive notion of "learner-centered design", there is less specific design information for developing learner-centered software. As a result, learner-centered software results from "educated guesses" and ad-hoc design approaches rather than from systematic design methods. Thus there is a need for specific design guidance to facilitate the development of learner-centered tools that help learners see the tasks, terminology, tools, etc. in the new work context and engage in that work. The research in this dissertation provides a more specific base of learner-centered design descriptions, methods, and guidelines to analyze work practices and design and evaluate scaffolds. The research approach involves using the development of Symphony---a scaffolded integrated tool environment for high-school students learning the work of computational science inquiry---as a case study to develop the learner-centered design approach. Symphony incorporates a variety of science tools with process scaffolding to support students in performing complex air pollution investigations. Six ninth-grade students used Symphony to investigate air quality questions for several weeks in an environmental science class. The student testing helped assess the effectiveness of the software scaffolding and in turn, the learner

  13. The Brazilian Football Association (CBF) model for epidemiological studies on professional soccer player injuries

    PubMed Central

    Arliani, Gustavo Gonçalves; Belangero, Paulo Santoro; Runco, Jose Luiz; Cohen, Moisés

    2011-01-01

    OBJECTIVE: This study aims to establish a national methodological model for epidemiological studies on professional soccer player injuries and to describe the numerous relevant studies previously published on this topic. INTRODUCTION: The risk of injury in professional soccer is high. However, previous studies of injury risk in Brazil and other countries have been characterized by large variations in study design and data collection methods as well as definitions of injury, standardized diagnostic criteria, and recovery times. METHODS: A system developed by the Union of European Football Associations (UEFA) for epidemiological studies on professional soccer players is being used as a starting point to create a methodological model for the Brazilian Football Association. To describe the existing studies on professional soccer player injuries, we developed a search strategy to identify relevant epidemiological studies. We included the Latin American and Caribbean Center on Health Sciences and Medline databases in our study. RESULTS: We considered 60 studies from Medline and 16 studies from the Latin American and Caribbean Center on Health Sciences in the final analysis. Twelve studies were selected for final inclusion in this review: seven from the Latin American and Caribbean Center on Health Sciences and five from Medline. We identified a lack of uniformity in the study design, data collection methods, injury definitions, standardized diagnostic criteria, and the definition of recovery time. Based on the information contained within these articles, we developed a model for epidemiological studies for the Brazilian Football Association. CONCLUSIONS: There is no uniform model for epidemiological studies of professional soccer injuries. Here, we propose a novel model to be applied for epidemiological studies of professional soccer player injuries in Brazil and throughout the world. PMID:22012041

  14. Comparison of Three Statistical Methods for Establishing Tentative Wild-Type Population and Epidemiological Cutoff Values for Echinocandins, Amphotericin B, Flucytosine, and Six Candida Species as Determined by the Colorimetric Sensititre YeastOne Method

    PubMed Central

    Pemán, Javier; Hervás, David; Iñiguez, Carmen; Navarro, David; Echeverría, Julia; Martínez-Alarcón, José; Fontanals, Dionisia; Gomila-Sard, Bárbara; Buendía, Buenaventura; Torroba, Luis; Ayats, Josefina; Bratos, Angel; Sánchez-Reus, Ferran; Fernández-Natal, Isabel

    2012-01-01

    The Sensititre YeastOne (SYO) method is widely used to determine the susceptibility of Candida spp. to antifungal agents. CLSI clinical breakpoints (CBPs) have been reported for antifungals, but not for this method. In the absence of CBPs, epidemiological cutoff values (ECVs) are useful to separate wild-type (WT) isolates (those without mechanisms of resistance) from non-WT isolates (those that can harbor some resistance mechanisms), which is the goal of any susceptibility test. ECVs for five agents were calculated from the MIC distributions determined by the SYO test using five methods: three statistical methods and two empirical ones (the MIC50 and the modal MIC, each plus two 2-fold dilutions). The median ECVs (in mg/liter) (% of isolates inhibited by MICs equal to or less than the ECV; number of isolates tested) of the five methods for anidulafungin, micafungin, caspofungin, amphotericin B, and flucytosine, respectively, were as follows: 0.25 (98.5%; 656), 0.06 (95.1%; 659), 0.25 (98.7%; 747), 2 (100%; 923), and 1 (98.5%; 915) for Candida albicans; 8 (100%; 352), 4 (99.2%; 392), 2 (99.2%; 480), 1 (99.8%; 603), and 0.5 (97.9%; 635) for C. parapsilosis; 1 (99.2%; 123), 0.12 (99.2%; 121), 0.25 (99.2%; 138), 2 (100%; 171), and 0.5 (97.2%; 175) for C. tropicalis; 0.12 (96.6%; 174), 0.06 (96%; 176), 0.25 (98.4%; 188), 2 (100%; 209), and 0.25 (97.6%; 208) for C. glabrata; 0.25 (97%; 33), 0.5 (93.9%; 33), 1 (91.9%; 37), 4 (100%; 51), and 32 (100%; 53) for C. krusei; and 4 (100%; 33), 2 (100%; 33), 2 (100%; 54), 1 (100%; 90), and 0.25 (93.4%; 91) for C. orthopsilosis. The three statistical methods gave similar ECVs (within one dilution) and included ≥95% of isolates. These tentative ECVs would be useful for monitoring the emergence of isolates with reduced susceptibility by use of the SYO method. PMID:23015676
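
    One of the simple empirical estimates mentioned above, the modal MIC plus two 2-fold dilutions, can be sketched as follows (the MIC distribution below is hypothetical, not the study's data):

```python
# Sketch: estimate an ECV as the modal MIC plus two 2-fold dilution
# steps, working in the dilution series itself so rounded dilution
# values (0.12 -> 0.25 -> 0.5) are handled correctly.

dilutions = [0.03, 0.06, 0.12, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0]  # mg/liter
counts = [5, 40, 120, 60, 10, 3, 0, 0, 2]                      # isolates per MIC

mode_idx = counts.index(max(counts))                 # modal MIC = 0.12 here
ecv = dilutions[min(mode_idx + 2, len(dilutions) - 1)]

total = sum(counts)
covered = sum(n for d, n in zip(dilutions, counts) if d <= ecv)
print(ecv, round(100.0 * covered / total, 1))        # ECV and % isolates <= ECV
```

    Working by index in the dilution series, rather than multiplying the MIC by 4, respects the rounding conventions of 2-fold dilution schemes.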

  15. Epidemiology of Enterocytozoon bieneusi Infection in Humans

    PubMed Central

    Matos, Olga; Lobo, Maria Luisa; Xiao, Lihua

    2012-01-01

    A review was conducted to examine published works that focus on the complex epidemiology of Enterocytozoon bieneusi infection in humans. Studies are reviewed on the prevalence of these emerging microsporidian pathogens in humans in developed and developing countries, on the different clinical spectra of E. bieneusi intestinal infection in children in different settings, and on the risk factors associated with E. bieneusi infection. This paper also analyses the impact that the recent application of PCR-based molecular methods for species-specific identification and genotype differentiation has had on knowledge of the molecular epidemiology of E. bieneusi in humans. The advances in the epidemiology of E. bieneusi over the last two decades emphasize the importance of epidemiological control and prevention of E. bieneusi infections, from both the veterinary and human medical perspectives. PMID:23091702

  16. Epidemiology of Substance Use Disorders

    PubMed Central

    Merikangas, Kathleen R.; McClair, Vetisha L.

    2013-01-01

    Epidemiological studies of substance use and substance use disorders (SUDs) have provided an abundance of data on the patterns of substance use in nationally representative samples across the world (Degenhardt et al. 2008; Johnston et al. 2011; SAMHSA 2011). This paper presents a summary of the goals, methods and recent findings on the epidemiology of substance use and disorders in the general population of adults and adolescents and describes the methods and findings on the genetic epidemiology of drug use disorders. The high 12-month prevalence rates of substance dependence in U.S. adults (about 12% for alcohol and 2–3% for illicit drugs) approximate those of other mental disorders as well as chronic physical disorders with major public health impact. New findings from the nationally representative samples of U.S. youth reveal that the lifetime prevalence of alcohol use disorders is approximately 8% and that of illicit drug use disorders is 2–3% (Merikangas et al. 2010; Swendsen et al. in press; SAMHSA 2011). The striking increase in prevalence rates from ages 13 to 18 highlights adolescence as the key period of development of substance use disorders. The application of genetic epidemiological studies has consistently demonstrated that genetic factors have a major influence on progression of substance use to dependence, whereas environmental factors unique to the individual play an important role in exposure and initial use of substances. Identification of specific susceptibility genes and environmental factors that influence exposure and progression of drug use may enhance our ability to prevent and treat substance use disorders. PMID:22543841

  17. Photovoltaic module hot spot durability design and test methods

    NASA Technical Reports Server (NTRS)

    Arnett, J. C.; Gonzalez, C. C.

    1981-01-01

    As part of the Jet Propulsion Laboratory's Low-Cost Solar Array Project, the susceptibility of flat-plate modules to hot-spot problems is investigated. Hot-spot problems arise in modules when the cells become back-biased and operate in the negative-voltage quadrant as a result of short-circuit current mismatch, cell cracking or shadowing. The details of a qualification test for determining the capability of modules to survive field hot-spot problems, and typical results of this test, are presented. In addition, recommended circuit-design techniques for improving module and array reliability with respect to hot-spot problems are presented.

  18. Method of calculating the optimal radioelectronic equipment design

    NASA Astrophysics Data System (ADS)

    Ermolaev, Yu. P.

    1993-05-01

    In designing competitive radio-electronic equipment, the problem inevitably arises of justifying an optimal constructional decision that accounts for many quality indexes. This set of quality indexes can conveniently be represented as a vector in a multidimensional space, on whose coordinate axes the values of all the quality indexes under consideration are laid off. Because a variety of quality indexes must be accounted for when choosing the optimal decision, the design problem becomes one of vector (multi-objective) optimization of the object being developed.
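
    The scalarization step in such vector optimization can be illustrated with a toy example (designs, quality indexes, and weights are hypothetical): each design is a point in the quality space, and a weighted sum collapses the vector to a single criterion.

```python
# Sketch: designs as points in a quality-index space; a weighted sum
# scalarizes the vector so designs become directly comparable.

designs = {
    "A": {"mass_kg": 2.0, "cost": 50.0, "mtbf_h": 9000.0},
    "B": {"mass_kg": 1.5, "cost": 80.0, "mtbf_h": 8000.0},
    "C": {"mass_kg": 2.5, "cost": 40.0, "mtbf_h": 12000.0},
}
# Negative weights for indexes to minimize (mass, cost), positive for
# those to maximize (mean time between failures).
weights = {"mass_kg": -1.0, "cost": -0.05, "mtbf_h": 0.001}

def score(q):
    return sum(weights[k] * q[k] for k in weights)

best = max(designs, key=lambda name: score(designs[name]))
print(best)
```

    The choice of weights is itself a design decision; other scalarizations (goal programming, Pareto ranking) trade off the indexes differently.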

  19. Genetic-evolution-based optimization methods for engineering design

    NASA Technical Reports Server (NTRS)

    Rao, S. S.; Pan, T. S.; Dhingra, A. K.; Venkayya, V. B.; Kumar, V.

    1990-01-01

    This paper presents the applicability of a biological model, based on genetic evolution, for engineering design optimization. Algorithms embodying the ideas of reproduction, crossover, and mutation are developed and applied to solve different types of structural optimization problems. Both continuous and discrete variable optimization problems are solved. A two-bay truss for maximum fundamental frequency is considered to demonstrate the continuous variable case. The selection of locations of actuators in an actively controlled structure, for minimum energy dissipation, is considered to illustrate the discrete variable case.
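
    A toy genetic algorithm showing the reproduction, crossover, and mutation operators named above (the one-max fitness function and all parameters are illustrative, not the paper's structural problems):

```python
# Minimal genetic algorithm: tournament reproduction, one-point
# crossover, and bit-flip mutation, maximizing the number of 1 bits.

import random

random.seed(42)

def fitness(bits):
    return sum(bits)  # "one-max": maximize number of 1 bits

def select(pop):      # tournament reproduction: better of two random parents
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):           # one-point crossover
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def mutate(bits, rate=0.01):     # independent bit-flip mutation
    return [1 - b if random.random() < rate else b for b in bits]

n_bits, pop_size = 20, 30
pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
for _ in range(60):
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(pop_size)]

best = max(pop, key=fitness)
print(fitness(best))
```

    For a structural problem the bit string would instead encode member sizes or actuator locations, and the fitness would call a finite element analysis, but the three operators work exactly as here.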

  20. The epidemiology of recurrent pregnancy loss.

    PubMed

    Cramer, D W; Wise, L A

    2000-01-01

    In reviewing the epidemiology of recurrent abortion (RAB), we believe it is necessary to consider the epidemiology of spontaneous abortion (SAB) as well, since it is clear that even a single pregnancy loss increases the risk for a subsequent abortion. In addition, any attempt to identify epidemiologic risk factors for SAB or RAB must deal with the fact that at least 50% of SABs are associated with genetic abnormalities. Given that most epidemiologic studies have not distinguished karyotypically abnormal abortuses, risk factors are likely to be underestimated. Nevertheless, there is fair agreement that a variety of factors may increase risk for SAB or RAB, including advanced maternal age, single gene mutations such as PKU or G6PD deficiency, structural abnormalities of the uterus, poorly controlled diabetes, antiphospholipid syndrome, and smoking. More controversial is the role of luteal phase defect or hyperandrogenism, alloimmune factors, genital infections, caffeine or alcohol use, and trace element or chemical exposure from tap water or in the workplace. Besides better-designed epidemiologic studies to detect modifiable risk factors for SAB or RAB, there is a clear need for clinical trials of therapy for RAB that meet minimum epidemiologic standards, including randomization, double-blinding (when possible), and placebo control (when ethical). PMID:11355791