Science.gov

Sample records for 95-39 methods development

  1. Medical Research and Evaluation Facility (MREF) and studies supporting the medical chemical defense program: Task 95-39: Methods development and validation of two mouse bioassays for use in quantifying botulinum toxins (a, b, c, d and e) and toxin antibody titers. Final report

    SciTech Connect

    Olson, C.T.; Gelzleichter, T.R.; Myers, M.A.; Menton, R.G.; Neimuth, N.A.

    1997-06-01

    This task was conducted for the U.S. Army Medical Materiel Development Activity (USAMMDA) to validate two mouse bioassays for quantifying botulinum toxin potency and neutralizing antibodies to botulinum toxins. Phase I experiments were designed to validate the mouse potency assay. The coefficients of variation for day-to-day variability were 10, 7, 10, 9, and 13 percent for serotypes A, B, C, D, and E, respectively. Phase II experiments were designed to develop and validate an assay for measuring the neutralizing antibody content of serum. Antibody titers were characterized at three separate test levels, L+/10, L+/33, and L+/100. The coefficients of variation for day-to-day variability were 9, 44, 11, 34, and 13 percent for serotypes A, B, C, D, and E, respectively. Limits of quantitation were approximately 0.02, 0.005, 0.012, 0.026, and 0.013 U/mL for serotypes A, B, C, D, and E, respectively. Phase III consisted of limited studies to develop a model of passive immunity in guinea pigs by intraperitoneal treatment with human botulinum immune globulin (BIG).
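    The day-to-day variability figures above are coefficients of variation (CV), i.e., the sample standard deviation expressed as a percentage of the mean. A minimal sketch of that calculation; the daily potency values below are invented for illustration, not data from the report:

```python
# Coefficient of variation (CV) for day-to-day assay variability.
from statistics import mean, stdev

def cv_percent(values):
    """Return the sample coefficient of variation as a percentage."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical daily potency estimates for one serotype (illustrative only).
daily_potencies = [102.0, 98.0, 110.0, 95.0, 105.0]
print(round(cv_percent(daily_potencies), 1))
```

    A CV near 10 percent, as reported for most serotypes in Phase I, indicates the assay's daily results cluster within roughly a tenth of their mean.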

  2. Radiochemical method development

    SciTech Connect

    Erickson, M.D.; Aldstadt, J.H.; Alvarado, J.S.; Crain, J.S.; Orlandini, K.A.; Smith, L.L.

    1994-09-01

    The authors have developed methods for chemical characterization of the environment under a multitask project that focuses on improvement of radioanalytical methods with an emphasis on faster and cheaper routine methods. The authors have developed improved methods for separation of environmental levels of technetium-99, radium, and actinides from soil and water; separation of actinides from soil and water matrix interferences; and isolation of strontium. They are also developing methods for simultaneous detection of multiple isotopes (including nonradionuclides) by using a new instrumental technique, inductively coupled plasma-mass spectrometry (ICP-MS). The new ICP-MS methods have greater sensitivity and efficiency and could replace many radiometric techniques. They are using flow injection analysis to integrate and automate the separation methods with the ICP-MS methodology. The final product of all activities will be methods that are available (published in the U.S. Department of Energy's analytical methods compendium) and acceptable for use in regulatory situations.

  3. Computational Methods Development at Ames

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Smith, Charles A. (Technical Monitor)

    1998-01-01

    This viewgraph presentation outlines the development at Ames Research Center of advanced computational methods to provide appropriate-fidelity computational analysis and design capabilities. Current thrusts of the Ames research include: 1) methods to enhance and accelerate viscous flow simulation procedures, and the development of hybrid/polyhedral-grid procedures for viscous flow; 2) the development of real-time transonic flow simulation procedures for a production wind tunnel, and intelligent data management technology; and 3) the validation of methods and study of flow physics. The presentation also gives historical precedents for the above research and speculates on its future course.

  4. Biological Methods and Manual Development

    EPA Pesticide Factsheets

    EPA scientists conduct research to develop and evaluate analytical methods for the identification, enumeration, and evaluation of aquatic organisms exposed to environmental stressors, and to correlate exposures with effects on chemical and biological indicators.

  5. New methodical developments for GRANIT

    SciTech Connect

    Baessler, Stefan; Nesvizhevsky, V.; Toperverg, B; Zhernenkov, K.; Gagarski, A; Lychagin, E; Muzychka, A; Strelkov, A; Mietke, A

    2011-01-01

    New methodical developments for the GRANIT spectrometer address further improvements of the critical parameters of this experimental installation, as well as its applications to new fields of research. Keeping in mind the extremely small fraction of ultracold neutrons (UCN) that can be bound in gravitational quantum states, we look for methods to increase statistics by developing UCN sources with maximum phase-space density, counting a large fraction of neutrons simultaneously using position-sensitive detectors, and decreasing detector backgrounds. We also explore a possible application of the GRANIT spectrometer beyond the scope of its initial goals, for instance for reflectometry with UCN.

  6. Methods For Human Resource Development.

    ERIC Educational Resources Information Center

    Conger, D. Stuart

    A description is provided of the training and counseling materials and methods prepared by the Saskatchewan NewStart and the Training Research and Development Station. Following a brief review of the concept of social inventions, summary descriptions are provided of nine adult education courses. These are: 1) Life Skills, which focuses upon…

  7. Space Radiation Transport Methods Development

    NASA Astrophysics Data System (ADS)

    Wilson, J.; Tripathi, R.; Qualls, G.; Cucinotta, F.; Prael, R.; Norbury, J.

    Early space radiation shield code development relied on Monte Carlo methods for proton, neutron, and pion transport and made important contributions to the space program. More recently, the Monte Carlo code LAHET has been upgraded to include high-energy multiply charged light ions for GCR simulations and continues to be expanded in capability. To compensate for low computational efficiency, Monte Carlo methods have resorted to restricted one-dimensional problems, leading to imperfect representations of appropriate boundary conditions. Even so, intensive computational requirements resulted; shield evaluation was made near the end of the design process, and resolving shielding issues usually had a negative impact on the design. We evaluate the implications of these common one-dimensional assumptions on the evaluation of the Shuttle internal radiation field. Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice, enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be utilized in the final design as verification of the deterministic method optimized design.

  8. Space Radiation Transport Methods Development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be utilized in the final design as verification of the deterministic method optimized design.
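    The 14 ms per ray trace quoted above is what makes Monte Carlo coupling to the ISS FEM geometry impractical. A back-of-envelope check, assuming a (hypothetical) run of one million histories with one geometry trace each:

```python
# Cost of coupling Monte Carlo transport to the ISS FEM geometry,
# using the 14 ms per ray trace quoted in the abstract.
# The history count is an illustrative assumption.
ms_per_ray = 14
histories = 1_000_000            # a modest Monte Carlo run
seconds = histories * ms_per_ray / 1000
hours = seconds / 3600
print(f"{seconds:.0f} s ≈ {hours:.1f} h of ray tracing alone")
```

    Nearly four hours of geometry tracing for even a modest run, before any physics is computed, illustrates why the authors pursue a deterministic code and reconfigurable computing instead.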

  9. Developing Scoring Algorithms (Earlier Methods)

    Cancer.gov

    We developed scoring procedures to convert screener responses to estimates of individual dietary intake for fruits and vegetables, dairy, added sugars, whole grains, fiber, and calcium using the What We Eat in America 24-hour dietary recall data from the 2003-2006 NHANES.

  10. GIS Method for Developing Wind Supply Curves

    SciTech Connect

    Kline, D.; Heimiller, D.; Cowlin, S.

    2008-06-01

    This report describes work conducted by the National Renewable Energy Laboratory (NREL) as part of the Wind Technology Partnership (WTP) sponsored by the U.S. Environmental Protection Agency (EPA). This project has developed methods that the National Development and Reform Commission (NDRC) intends to use in the planning and development of China's 30 GW of planned capacity. Because of China's influence within the community of developing countries, the methods and the approaches here may help foster wind development in other countries.

  11. Cochrane methods - twenty years experience in developing systematic review methods

    PubMed Central

    2013-01-01

    This year, The Cochrane Collaboration reached its 20th anniversary. It has played a pivotal role in the scientific development of systematic reviewing and in the development of review methods to synthesize research evidence, primarily from randomized trials, to answer questions about the effects of healthcare interventions. We introduce a series of articles, which form this special issue describing the development of systematic review methods within The Cochrane Collaboration. We also discuss the impact of Cochrane Review methods, and acknowledge the breadth and depth of methods development within The Cochrane Collaboration as part of the wider context of evidence synthesis. We conclude by considering the future development of methods for Cochrane Reviews. PMID:24050381

  12. Development Activities Locator and Assessment Method (DALAM)

    DTIC Science & Technology

    2013-11-01

    Center for Army Analysis, 6001 Goethals Road, Fort Belvoir, VA 22060-5230; report CAA-2012049, Development Activities Locator and Assessment Method (DALAM). From the abstract: ...the problem may be more fundamental, in that the design of our projects is flawed. This study, Development Activities Locator and Assessment Method...

  13. Moral counselling: a method in development.

    PubMed

    de Groot, Jack; Leget, Carlo

    2011-01-01

    This article describes a method of moral counselling developed in the Radboud University Medical Centre Nijmegen (The Netherlands). The authors apply insights of Paul Ricoeur to the non-directive counselling method of Carl Rogers in their work of coaching patients with moral problems in health care. The developed method was shared with other health care professionals in a training course. Experiences in the course and further practice led to further improvement of the method.

  14. Research methods for formal consensus development.

    PubMed

    James, Daphne; Warren-Forward, Helen

    2015-01-01

    This paper reviews three research methods for developing consensus. Consensus statements and guidelines are increasingly used to clarify and standardise practice, and inform health policy, when relevant and rigorous evidence is lacking. Clinicians need to evaluate the quality of practice guidelines to determine whether to incorporate them into clinical practice or reject them. Formal methods of developing consensus provide a scientific method that uses expert panel members to evaluate current evidence and expert opinions to produce consensus statements for clinical problems. Online search for relevant literature was conducted in Medline and CINAHL. A literature review of consensus, consensus development and research methods papers published in English in peer-reviewed journals. The three methods of developing consensus discussed are the Delphi technique, nominal group technique and the consensus development conference. The techniques and their respective advantages are described, and examples from the literature are provided. The three methods are compared and a flowchart to assist researchers selecting an appropriate method is included. Online resources with information on the development and evaluation of clinical guidelines are reviewed. This paper will help researchers to select an appropriate research method for developing consensus statements and guidelines. When developing consensus guidelines for clinical practice, researchers should use a formal research method to ensure rigour and credibility.

  15. Addressing gaps in the contraceptive method mix: methods in development.

    PubMed

    Nanda, Kavita; Callahan, Rebecca; Dorflinger, Laneta

    2015-11-01

    Despite the availability of a variety of contraceptive methods, millions of women still have an unmet need for contraceptive choices. Short-acting methods are plagued by issues with adherence, leading to imperfect or inconsistent use and subsequent unintended pregnancy. Long-acting contraceptive methods such as intrauterine devices and contraceptive implants, while providing highly effective and safe contraception, do not meet the needs of all women, often due to cost, access or acceptability issues. Several new methods are in various stages of development and are designed to address the shortcomings of current methods. Providers should be aware of these future options and how they might better meet women's needs.

  16. Vienna development method: An informal production

    SciTech Connect

    Petrenko, A.K.

    1992-09-01

    This article presents a brief description of the Vienna Development Method (VDM). The article contains the fundamental notions and methodological concepts of VDM, and simple examples. The description of the development method covers the phases of stepwise specification and programming. 2 refs., 2 figs.

  17. Development of Methods for Determination of Aflatoxins.

    PubMed

    Xie, Lijuan; Chen, Min; Ying, Yibin

    2016-12-09

    Aflatoxins can cause damage to the health of humans and animals. Several institutions around the world have established regulations to limit the levels of aflatoxins in food, and numerous analytical methods have been extensively developed for aflatoxin determination. This review covers the currently used analytical methods for the determination of aflatoxins in different food matrices, which includes sampling and sample preparation, sample pretreatment methods including extraction methods and purification methods of aflatoxin extracts, separation and determination methods. Validation for analysis of aflatoxins and safety considerations and precautions when doing the experiments are also discussed.

  18. Product Development by Design Navigation Method

    NASA Astrophysics Data System (ADS)

    Nakazawa, Hiromu

    Manufacturers must be able to develop new products within a specified time period. This paper discusses a method for developing high-performance products from a limited number of experiments, utilizing the concept of “function error”. Unlike conventional methods, where the sequence of design, prototyping, and experiment must be repeated several times, the proposed method can determine optimal design values directly from experimental data obtained from the first prototype. The theoretical basis of the method is presented, and its effectiveness is then proven by applying it to the design of an extrusion machine and a CNC lathe.

  19. Toxicity test method development in southeast Asia

    SciTech Connect

    McPherson, C.A.

    1995-12-31

    Use of aquatic toxicity tests is relatively new in southeast Asia. As part of the ASEAN-Canada Cooperative Programme on Marine Science -- Phase 2, which includes development of marine environmental criteria, a need for tropical toxicity data was identified. A step-wise approach was used for test method development (simple, acute tests and easily measured endpoints first, then more complex short-term chronic methods), for test species selection (using species found throughout the region first, and then considering species with narrower geographic distribution), and for integration of quality assurance/quality control (QA/QC) practices into all laboratory activities. Development of test protocols specifically for tropical species included acute and chronic toxicity tests with marine fish, invertebrates and algae. Criteria for test species selection will be reviewed. Method development was based on procedures and endpoints already widely used in North America and Europe (e.g., 96-h LC50 with fish), but adapted for use with tropical species. For example, a bivalve larval development test can use the same endpoints but the duration is only 24 hours. Test method development included research on culture and holding procedures, determination of test conditions (e.g., duration, test containers), and identification of appropriate endpoints. Acute tests with fish and invertebrates were developed first. The next step was development of short-term chronic tests to measure phytoplankton growth, bivalve and echinoderm embryo or larval development, and larval fish growth. The number of species and types of tests was increased in a staged approach, as laboratories became better equipped and personnel gained practical experience. In most cases, method development coincided with training workshops to introduce the principles of toxicity testing.
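    The 96-h LC50 endpoint mentioned above is the concentration killing half the test organisms in 96 hours. One common textbook estimator, log-linear interpolation between the two concentrations bracketing 50% mortality, can be sketched as follows; this is not necessarily the procedure used in the ASEAN-Canada programme, and the concentrations and mortalities are invented:

```python
# Estimate an LC50 by log-linear interpolation between the two test
# concentrations whose mortalities bracket 50%.
import math

def lc50_interpolated(concs, mortality_fractions):
    """Concentration giving 50% mortality, interpolated on log-concentration.
    concs must be sorted ascending, with mortality crossing 0.5 once."""
    pairs = list(zip(concs, mortality_fractions))
    for (c_lo, m_lo), (c_hi, m_hi) in zip(pairs, pairs[1:]):
        if m_lo <= 0.5 <= m_hi:
            frac = (0.5 - m_lo) / (m_hi - m_lo)
            log_lc50 = math.log10(c_lo) + frac * (math.log10(c_hi) - math.log10(c_lo))
            return 10 ** log_lc50
    raise ValueError("mortality does not bracket 50%")

# Hypothetical 96-h fish test: concentrations in mg/L vs fraction dead.
print(round(lc50_interpolated([1, 2, 4, 8], [0.0, 0.2, 0.7, 1.0]), 2))
```

    Regulatory protocols typically use probit or trimmed Spearman-Karber analysis instead, but the interpolation above conveys what the endpoint measures.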

  20. Development of test methods for textile composites

    NASA Technical Reports Server (NTRS)

    Masters, John E.; Ifju, Peter G.; Fedro, Mark J.

    1993-01-01

    NASA's Advanced Composite Technology (ACT) Program was initiated in 1990 with the purpose of developing less costly composite aircraft structures. A number of innovative materials and processes were evaluated as a part of this effort. Chief among them are composite materials reinforced with textile preforms. These new forms of composite materials bring with them potential testing problems. Methods currently in practice were developed over the years for composite materials made from prepreg tape or simple 2-D woven fabrics. A wide variety of 2-D and 3-D braided, woven, stitched, and knit preforms were suggested for application in the ACT program. The applicability of existing test methods to the wide range of emerging materials bears investigation. The overriding concern is that the values measured are accurate representations of the true material response. The ultimate objective of this work is to establish a set of test methods to evaluate the textile composites developed for the ACT Program.

  1. Methods for the Study of Gonadal Development.

    PubMed

    Piprek, Rafal P

    2016-01-01

    Current knowledge on gonadal development and sex determination is the product of many decades of research involving a variety of scientific methods from different biological disciplines such as histology, genetics, biochemistry, and molecular biology. The earliest embryological investigations, followed by the invention of microscopy and staining methods, were based on histological examinations. The most robust development of histological staining techniques occurred in the second half of the nineteenth century and resulted in structural descriptions of gonadogenesis. These first studies on gonadal development were conducted on domesticated animals; however, currently the mouse is the most extensively studied species. The next key point in the study of gonadogenesis was the advancement of methods allowing for the in vitro culture of fetal gonads. For instance, this led to the description of the origin of cell lines forming the gonads. Protein detection using antibodies and immunolabeling methods and the use of reporter genes were also invaluable for developmental studies, enabling the visualization of the formation of gonadal structure. Recently, genetic and molecular biology techniques, especially gene expression analysis, have revolutionized studies on gonadogenesis and have provided insight into the molecular mechanisms that govern this process. The successive invention of new methods is reflected in the progress of research on gonadal development.

  2. Development of new hole expansion testing method

    NASA Astrophysics Data System (ADS)

    Kim, Hyunok; Shang, Jianhui; Beam, Kevin; Samant, Anoop; Hoschouer, Cliff; Dykeman, Jim

    2016-08-01

    This paper introduces a new hole expansion (HE) testing method that could be more relevant to the edge cracking problem observed in stamping advanced high strength steel (AHSS). The new testing method adopted a large hole diameter of 75 mm, compared to the standard hole diameter of 10 mm. An inline monitoring system was developed to visually monitor hole edge cracking during the test and synchronize the load-displacement data with the recorded video to capture the initial crack. The new testing method was found to be effective in evaluating edge cracking by considering the effects of material properties and trimming methods. It showed a much larger difference in HE ratio between DP980 and TRIP780, up to 11%, compared to the less than 2% difference given by the standard HE testing method.
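    The HE ratio compared above is conventionally defined as the percentage increase of the hole diameter at the onset of edge cracking. A minimal sketch; the expanded diameter below is illustrative, not data from the paper:

```python
# Hole expansion (HE) ratio: percentage increase of hole diameter
# from the initial (as-trimmed) hole to the hole at edge cracking.
def he_ratio(d_initial, d_final):
    """HE ratio in percent: (d_final - d_initial) / d_initial * 100."""
    return 100.0 * (d_final - d_initial) / d_initial

# e.g. the paper's 75 mm hole expanded to a hypothetical 97.5 mm at cracking:
print(he_ratio(75.0, 97.5))  # → 30.0
```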

  3. Benchmarking Learning and Teaching: Developing a Method

    ERIC Educational Resources Information Center

    Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah

    2006-01-01

    Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…

  4. New Developments of the Shared Concern Method.

    ERIC Educational Resources Information Center

    Pikas, Anatol

    2002-01-01

    Reviews and describes new developments in the Shared Concern method (SCm), a tool for tackling group bullying amongst teenagers by individual talks. The psychological mechanisms of healing in the bully group and what hinders the bully therapist in eliciting them have become better clarified. The most important recent advancement of the SCm…

  5. A Framework for Teaching Software Development Methods

    ERIC Educational Resources Information Center

    Dubinsky, Yael; Hazzan, Orit

    2005-01-01

    This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves…

  8. Development of a nonlinear vortex method

    NASA Technical Reports Server (NTRS)

    Kandil, O. A.

    1982-01-01

    A steady and unsteady Nonlinear Hybrid Vortex (NHV) method for low-aspect-ratio wings at large angles of attack is developed. The method uses vortex panels with a first-order vorticity distribution (equivalent to a second-order doublet distribution) to calculate the induced velocity in the near field using closed-form expressions. In the far field, the distributed vorticity is reduced to concentrated vortex lines and the simpler Biot-Savart law is employed. The method is applied to rectangular wings in steady and unsteady flows without any restriction on the order of magnitude of the disturbances in the flow field. The numerical results show that the method accurately predicts the distributed aerodynamic loads and that it is of acceptable computational efficiency.
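    The far-field building block mentioned above, the Biot-Savart law for a straight vortex filament segment, has a standard closed form (see, e.g., low-speed aerodynamics textbooks). A sketch of that formula only; the NHV method's near-field panels use closed-form distributed-vorticity expressions instead:

```python
# Velocity induced at point p by a straight vortex segment from a to b
# with circulation gamma, via the Biot-Savart law:
#   V = gamma/(4*pi) * (r1 x r2)/|r1 x r2|^2 * r0 . (r1/|r1| - r2/|r2|)
import math

def biot_savart_segment(p, a, b, gamma):
    """Induced velocity (3-tuple) at p from segment a->b; points are 3-tuples."""
    r1 = tuple(p[i] - a[i] for i in range(3))
    r2 = tuple(p[i] - b[i] for i in range(3))
    cross = (r1[1]*r2[2] - r1[2]*r2[1],
             r1[2]*r2[0] - r1[0]*r2[2],
             r1[0]*r2[1] - r1[1]*r2[0])
    cross_sq = sum(c * c for c in cross)
    if cross_sq < 1e-12:          # point on the filament axis: no induction
        return (0.0, 0.0, 0.0)
    n1 = math.sqrt(sum(c * c for c in r1))
    n2 = math.sqrt(sum(c * c for c in r2))
    r0 = tuple(b[i] - a[i] for i in range(3))
    k = gamma / (4.0 * math.pi * cross_sq) * sum(
        r0[i] * (r1[i] / n1 - r2[i] / n2) for i in range(3))
    return tuple(k * c for c in cross)
```

    For a very long segment, the result approaches the two-dimensional limit gamma/(2*pi*h) at perpendicular distance h, a useful sanity check.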

  9. Transport Test Problems for Hybrid Methods Development

    SciTech Connect

    Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.; McDonald, Benjamin S.

    2011-12-28

    This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.

  10. Development of a hydraulic turbine design method

    NASA Astrophysics Data System (ADS)

    Kassanos, Ioannis; Anagnostopoulos, John; Papantonis, Dimitris

    2013-10-01

    In this paper a hydraulic turbine parametric design method is presented which is based on the combination of traditional methods and parametric surface modeling techniques. The blade of the turbine runner is described using Bezier surfaces for the definition of the meridional plane as well as the blade angle distribution, and a thickness distribution applied normal to the mean blade surface. In this way, it is possible to define parametrically the whole runner using a relatively small number of design parameters, compared to conventional methods. The above definition is then combined with a commercial CFD software and a stochastic optimization algorithm towards the development of an automated design optimization procedure. The process is demonstrated with the design of a Francis turbine runner.
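    The Bezier surfaces used above to parametrize the meridional plane and blade angle distribution are built from one-dimensional Bezier curves, which de Casteljau's algorithm evaluates by repeated linear interpolation. A minimal sketch with invented control points, not the paper's actual runner parametrization:

```python
# Evaluate a Bezier curve at parameter t by de Casteljau's algorithm:
# repeatedly interpolate adjacent control points until one point remains.
def de_casteljau(points, t):
    """Point on the Bezier curve defined by `points` at t in [0, 1]."""
    pts = [tuple(p) for p in points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * p[i] + t * q[i] for i in range(len(p)))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Quadratic example: midpoint of a 3-control-point curve.
print(de_casteljau([(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)], 0.5))  # → (1.0, 1.0)
```

    Because a handful of control points defines a smooth curve, moving them gives the small set of design parameters the optimization algorithm can search over.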

  11. Report on development of neutron passportisation method

    SciTech Connect

    Antropov, G.P.; Babichev, Yu.B.; Blagin, S.V.

    1994-12-31

    This report describes the results of developing a spatial neutron passportisation method. The method is aimed at controlling the spatial configuration (including the number of sources) of closed objects containing neutron sources. The possible areas of application are: (1) control of the number of warheads inside missile heads for verification of RF-US nuclear disarmament treaties; (2) control of the arrangement of SNM containers in storage vaults; (3) control that complicated assemblies with SNM (and other radioactive materials) remain unchanged. For objects with a complicated structure, such as multiple reentry vehicles, direct interpretation of the observed radiation field configuration is a rather difficult task. Reconstruction of the object structure on the basis of the radiation field configuration usually requires the use of external information and is often not straightforward. Besides, when using such methods of direct reconstruction of the object's internal structure, a contradiction arises between the requirement of determining the arrangement of sources (warheads, in the case of arms control) and the requirement of protecting information about the sources themselves. In this case there may be different limitations on the possible spatial resolution of the method, the use of spectroscopy information, etc.

  12. Prioritizing pesticide compounds for analytical methods development

    USGS Publications Warehouse

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1
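    The screening procedure above bins compounds into three priority tiers from factors such as measured occurrence, predicted concentration, and toxicity. A hypothetical sketch of that tiering step; the scores, thresholds, and compound names are invented for illustration, and the actual USGS criteria are in the report:

```python
# Bin compounds into priority tiers from a combined screening score.
def classify_tier(score):
    """Map a combined screening score (0-10, hypothetical scale) to a tier."""
    if score >= 7:
        return "Tier 1"   # high priority for analytical methods development
    if score >= 4:
        return "Tier 2"   # moderate priority
    return "Tier 3"       # low priority

# Illustrative compounds and scores (not from the study).
compounds = {"compound-a": 9, "compound-b": 5, "compound-c": 2}
print({name: classify_tier(s) for name, s in compounds.items()})
```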

  13. DEVELOPMENT OF MOLECULAR METHODS TO DETECT ...

    EPA Pesticide Factsheets

    A large number of human enteric viruses are known to cause gastrointestinal illness and waterborne outbreaks. Many of these are emerging viruses that do not grow, or grow poorly, in cell culture, so molecular detection methods based on the polymerase chain reaction (PCR) are being developed. Current studies focus on detecting two virus groups, the caliciviruses and the hepatitis E virus strains, both of which have been found to cause significant outbreaks via contaminated drinking water. Once developed, these methods will be used to collect occurrence data for risk assessment studies. Objectives: develop sensitive techniques to detect and identify emerging human waterborne pathogenic viruses and viruses on the CCL; determine the effectiveness of viral indicators for measuring microbial quality in water matrices. Support activities: (a) culture and distribution of mammalian cells for Agency and scientific community research needs; (b) operator expertise for research requiring confocal and electron microscopy; (c) glassware cleaning, sterilization, and biological waste disposal for the Cincinnati EPA facility; (d) operation of the infectious pathogen suite; (e) maintenance of walk-in constant-temperature rooms; and (f) provision of Giardia cysts.

  14. A space radiation transport method development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. Published by Elsevier Ltd on behalf of COSPAR.

  15. A space radiation transport method development.

    PubMed

    Wilson, J W; Tripathi, R K; Qualls, G D; Cucinotta, F A; Prael, R E; Norbury, J W; Heinbockel, J H; Tweed, J

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design.

  17. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  18. Flexible Methods for Future Force Concept Development

    DTIC Science & Technology

    2005-08-01

    this report is included in Appendix A. A second Army method is to stand up a replica of the new system and conduct a unit exercise in simulation, such...information on loading exercises, navigating the map, and deploying and monitoring sensors. [Figure 1. Scaled-world tool: map area and mode selection area.] [Figure 2. Scaled-world tool with sensor feed displayed.] The scaled-world tool and events were developed using Java and open-source software

  19. Probabilistic structural analysis methods development for SSME

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

    The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure, and torque; (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

  20. [Development of identification method for isopropyl citrate].

    PubMed

    Furusho, Noriko; Ohtsuki, Takashi; Tatebe-Sasaki, Chiye; Kubota, Hiroki; Sato, Kyoko; Akiyama, Hiroshi

    2014-01-01

    In Japan's Specifications and Standards for Food Additives, 8th edition, two identification tests involving isopropyl citrate, for detecting isopropyl alcohol and citrate, are stipulated. However, these identification tests use a mercury compound, which is toxic, or require a time-consuming pretreatment process. To solve these problems, an identification test method using GC-FID for detecting isopropyl alcohol was developed. In this test, good linearity was observed in the range of 0.1-40 mg/mL of isopropyl alcohol. While investigating the pretreatment process, we found that isopropyl alcohol could be detected using GC-FID with the distillation step only, without any reflux step. The study also showed that the citrate moiety of isopropyl citrate could be identified using the solution remaining after distillation of the isopropyl alcohol. The developed identification tests for isopropyl citrate are simple and use no toxic materials.
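    The linearity check reported above (0.1-40 mg/mL) is the kind of result verified with an ordinary least-squares calibration fit. The sketch below uses synthetic peak-area data with an assumed response factor, purely to illustrate the fit, not the study's actual calibration values:

```python
# Hedged sketch of a linear calibration fit for a GC-FID assay.
# Concentrations span the reported 0.1-40 mg/mL range; the response factor
# (150 area units per mg/mL) and intercept are invented for the example.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

conc = [0.1, 1.0, 5.0, 10.0, 20.0, 40.0]      # mg/mL standards
area = [c * 150.0 + 2.0 for c in conc]        # synthetic peak areas
slope, intercept = linear_fit(conc, area)
print(round(slope, 3), round(intercept, 3))   # recovers 150.0 and 2.0
```

    With real chromatographic data one would also report the correlation coefficient over the working range to document linearity.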

  1. Using containerless methods to develop amorphous pharmaceuticals.

    PubMed

    Weber, J K R; Benmore, C J; Suthar, K J; Tamalonis, A J; Alderman, O L G; Sendelbach, S; Kondev, V; Yarger, J; Rey, C A; Byrn, S R

    2017-01-01

    Many pipeline drugs have low solubility in their crystalline state and require compounding in special dosage forms to increase bioavailability for oral administration. The use of amorphous formulations increases solubility and uptake of active pharmaceutical ingredients. These forms are rapidly gaining commercial importance for both pre-clinical and clinical use. Synthesis of amorphous drugs was performed using an acoustic levitation containerless processing method and spray drying. The structure of the products was investigated using in-situ high energy X-ray diffraction. Selected solvents for processing drugs were investigated using acoustic levitation. The stability of amorphous samples was measured using X-ray diffraction. Samples processed using both spray drying and containerless synthesis were compared. We review methods for making amorphous pharmaceuticals and present data on materials made by containerless processing and spray drying. It was shown that containerless processing using acoustic levitation can be used to make phase-pure forms of drugs that are known to be difficult to amorphize. The stability and structure of the materials was investigated in the context of developing and making clinically useful formulations. Amorphous compounds are emerging as an important component of drug development and for the oral delivery of drugs with low solubility. Containerless techniques can be used to efficiently synthesize small quantities of pure amorphous forms that are potentially useful in pre-clinical trials and for use in the optimization of clinical products. Developing new pharmaceutical products is an essential enterprise to improve patient outcomes. The development and application of amorphous pharmaceuticals to increase absorption is rapidly gaining importance and it provides opportunities for breakthrough research on new drugs. There is an urgent need to solve problems associated with making formulations that are both stable and that provide high

  2. Methods for developing and validating survivability distributions

    SciTech Connect

    Williams, R.L.

    1993-10-01

    A previous report explored and discussed statistical methods and procedures that may be applied to validate the survivability of a complex system of systems that cannot be tested as an entity. It described a methodology where Monte Carlo simulation was used to develop the system survivability distribution from the component distributions using a system model that registers the logical interactions of the components to perform system functions. This paper discusses methods that can be used to develop the required survivability distributions based upon three sources of knowledge. These are (1) available test results; (2) little or no available test data, but a good understanding of the physical laws and phenomena which can be applied by computer simulation; and (3) neither test data nor adequate knowledge of the physics are known, in which case, one must rely upon, and quantify, the judgement of experts. This paper describes the relationship between the confidence bounds that can be placed on survivability and the number of tests conducted. It discusses the procedure for developing system level survivability distributions from the distributions for lower levels of integration. It demonstrates application of these techniques by defining a communications network for a Hypothetical System Architecture. A logic model for the performance of this communications network is developed, as well as the survivability distributions for the nodes and links based on two alternate data sets, reflecting the effects of increased testing of all elements. It then shows how this additional testing could be optimized by concentrating only on those elements contained in the low-order fault sets which the methodology identifies.
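    The core of the approach above, propagating component survivabilities through a logic model by Monte Carlo simulation, can be sketched briefly. The network below (one node feeding two redundant links) is a made-up example, not the report's Hypothetical System Architecture:

```python
import random

# Hedged sketch: Monte Carlo estimation of system survivability from
# component survivabilities via a simple logic model. The topology
# (node AND (link1 OR link2)) is hypothetical.

def system_survives(p_node, p_link):
    """One trial: the system works if the node survives and at least one link does."""
    node = random.random() < p_node
    link1 = random.random() < p_link
    link2 = random.random() < p_link
    return node and (link1 or link2)

def estimate_survivability(p_node, p_link, trials=100_000):
    hits = sum(system_survives(p_node, p_link) for _ in range(trials))
    return hits / trials

random.seed(1)
# Analytic value for comparison: 0.95 * (1 - 0.20**2) = 0.912
print(round(estimate_survivability(0.95, 0.80), 3))
```

    In the report's methodology the per-component probabilities would themselves be distributions (from test data, physics simulation, or expert judgement), sampled on each trial rather than held fixed as here.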

  3. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ...

    EPA Pesticide Factsheets

    Hepatitis E virus (HEV) is an emerging pathogen that causes significant illness in the developing world. Like the hepatitis A virus, it is transmitted via the fecal-oral route and can cause short-term, acute hepatitis. In addition, hepatitis E has been found to cause a significant rate of mortality in pregnant women. Thus far, a hepatitis E outbreak has not been reported in the U. S. although a swine variant of the virus is common in Midwestern hogs. Since it will be important to identify the presence of this virus in the water supply, we have developed and are testing a reverse transcription-polymerase chain reaction (RT-PCR) method that should be able to identify all of the known HEV strains. Develop sensitive techniques to detect and identify emerging human waterborne pathogenic viruses and viruses on the CCL.Determine effectiveness of viral indicators to measure microbial quality in water matrices.Support activities: (a) culture and distribution of mammalian cells for Agency and scientific community research needs, (b) provide operator expertise for research requiring confocal and electron microscopy, (c) glassware cleaning, sterilization and biological waste disposal for the Cincinnati EPA facility, (d) operation of infectious pathogenic suite, (e) maintenance of walk-in constant temperature rooms and (f) provide Giardia cysts.

  5. DEVELOPMENT OF NDA METHODS FOR NEPTUNIUM METAL

    SciTech Connect

    C. MOSS; ET AL

    2000-10-01

    Many techniques have been developed and applied in the US and other countries for the control of the special nuclear materials (SNM) plutonium and uranium, but no standard methods exist for the determination of neptunium in bulk containers. Such methods are needed because the U.S. Department of Energy requires that all Government-owned ²³⁷Np be treated as if it were SNM, and the International Atomic Energy Agency is considering how to monitor this material. We present the results of the measurements of several samples of neptunium metal with a variety of techniques. Analysis of passive gamma-ray spectra uniquely identifies the material, provides isotopic ratios for contaminants, such as ²⁴³Am, and may provide information about the shielding, mass, and time since processing. Active neutron interrogation, using the delayed neutron technique in a package monitor, provides useful data even if the neptunium is shielded. The tomographic gamma scanner yields a map of the distribution of the neptunium and shielding in a container. Active photon interrogation with pulses from a 10-MeV linac produces delayed neutrons between pulses, even when the container is heavily shielded. Data from one or more of these techniques can be used to identify the material and estimate a mass in a bulk container.

  6. Development of a Radial Deconsolidation Method

    SciTech Connect

    Helmreich, Grant W.; Montgomery, Fred C.; Hunn, John D.

    2015-12-01

    A series of experiments have been initiated to determine the retention or mobility of fission products* in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles, with 20-μm thick carbon-coated kernels whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) at the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested-annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  7. Methods development for total organic carbon accountability

    NASA Technical Reports Server (NTRS)

    Benson, Brian L.; Kilgore, Melvin V., Jr.

    1991-01-01

    This report describes the efforts completed during the contract period beginning November 1, 1990 and ending April 30, 1991. Samples of product hygiene and potable water from WRT 3A were supplied by NASA/MSFC prior to contract award on July 24, 1990. Humidity condensate samples were supplied on August 3, 1990. During the course of this contract, chemical analyses were performed on these samples to qualitatively determine the specific components comprising the measured organic carbon concentration. In addition, these samples and known standard solutions were used to identify and develop methodology useful to future comprehensive characterization of similar samples. Standard analyses including pH, conductivity, and total organic carbon (TOC) were conducted. Colorimetric and enzyme-linked assays for total protein, bile acid, B-hydroxybutyric acid, methylene blue active substances (MBAS), urea nitrogen, ammonia, and glucose were also performed. Gas chromatographic procedures for non-volatile fatty acids and EPA priority pollutants were also performed. Liquid chromatography was used to screen for non-volatile, water-soluble compounds not amenable to GC techniques. Methods development efforts were initiated to separate and quantitate certain chemical classes not classically analyzed in water and wastewater samples. These included carbohydrates, organic acids, and amino acids. Finally, efforts were initiated to identify useful concentration techniques to enhance detection limits and recovery of non-volatile, water-soluble compounds.

  8. Interactive radio instruction: developing instructional methods.

    PubMed

    Friend, J

    1989-01-01

    The USAID has, since 1972, funded the development of a new methodology for educational radio for young children through 3 projects: the Radio Mathematics Project of Nicaragua, the Radio Language Arts Project of Kenya, and the Radio Science Project of Papua New Guinea. These projects developed math programs for grades 1-4 and English as a second language for grades 1-3; programs to teach science in grades 4-6 are now being developed. Appropriate techniques were developed to engage young children actively in the learning process. Lessons are planned as a "conversation" between the children and the radio; scripts are written as one half of a dialogue, with pauses carefully timed so that students can contribute their half. Teaching techniques used in all 3 projects include choral responses, simultaneous individual seatwork, and activities using simple materials such as pebbles and rulers. Certain techniques were specific to the subject being taught, or to the circumstances in which the lessons were to be used. Patterned oral drill was used frequently in the English lessons, including sound-cued drills. "Deferred" oral responses were used often in the math lessons. In this method, the children are instructed to solve a problem silently, not giving the answer aloud until requested, thus allowing time for even the slower children to participate. "One-child" questions were used in both English and science: the radio asks a question to be answered by a single child, who is selected on the spot by the classroom teacher. This allows for open-ended questions, but also requires constant supervision by the classroom teacher. Songs and games were used in all programs, and extensively for didactic purposes in the teaching of English. Instructions for science activities are often more complex than in other courses, particularly when the children are using science apparatus, especially when they work in pairs to share scarce

  9. Methods in Human Development: Theory Manual.

    ERIC Educational Resources Information Center

    Bessell, Harold

    The manual, developed by psychologists at the Human Development Training Institute, describes techniques for understanding and dealing with the behavior and development of young children. The major objective of the book is to help elementary school teachers improve communication with their pupils. The manual offers conceptual tools for fostering…

  10. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has moved from a traditional waterfall process to agile iterative development, which allows for fast response to customer inputs and produces higher-quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  11. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

  13. Child development in developing countries: introduction and methods.

    PubMed

    Bornstein, Marc H; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L

    2012-01-01

    The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles in this Special Section. The articles that follow describe the situations of children with successive foci on nutrition, parenting, discipline and violence, and the home environment. They address 2 common questions: How do developing and underresearched countries in the world vary with respect to these central indicators of children's development? How do key indicators of national development relate to child development in each of these substantive areas? The Special Section concludes with policy implications from the international findings.

  14. Child Development in Developing Countries: Introduction and Methods

    PubMed Central

    Bornstein, Marc H.; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L.

    2011-01-01

    The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This Introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles in this Special Section. The articles that follow describe the situations of children with successive foci on nutrition, parenting, discipline and violence, and the home environment, addressing two common questions: How do developing and underresearched countries in the world vary with respect to these central indicators of children's development? And how do key indicators of national development relate to child development in each of these substantive areas? The Special Section concludes with policy implications from the international findings. PMID:22277004

  15. The Development of Cluster and Histogram Methods

    NASA Astrophysics Data System (ADS)

    Swendsen, Robert H.

    2003-11-01

    This talk will review the history of both cluster and histogram methods for Monte Carlo simulations. Cluster methods are based on the famous exact mapping by Fortuin and Kasteleyn from general Potts models onto a percolation representation. I will discuss the Swendsen-Wang algorithm, as well as its improvement and extension to more general spin models by Wolff. The Replica Monte Carlo method further extended cluster simulations to deal with frustrated systems. The history of histograms is quite extensive, and can only be summarized briefly in this talk. It goes back at least to work by Salsburg et al. in 1959. Since then, it has been forgotten and rediscovered several times. The modern use of the method has exploited its ability to efficiently determine the location and height of peaks in various quantities, which is of prime importance in the analysis of critical phenomena. The extensions of this approach to the multiple histogram method and multicanonical ensembles have allowed information to be obtained over a broad range of parameters. Histogram simulations and analyses have become standard techniques in Monte Carlo simulations.
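    The single-histogram idea summarized above, reweighting samples taken at one temperature to estimate averages at a nearby one, can be shown in a few lines. The energy samples below are synthetic toy data, not output of an actual simulation:

```python
import math
from collections import Counter

# Minimal sketch of single-histogram reweighting: energies sampled at
# inverse temperature beta0 are reweighted by exp(-(beta - beta0) * E)
# to estimate the mean energy at a nearby beta. Toy data only.

def reweight_mean_energy(energies, beta0, beta):
    """Estimate <E> at beta from samples drawn at beta0."""
    hist = Counter(energies)
    e_ref = min(hist)  # subtract a reference energy for numerical stability
    num = den = 0.0
    for e, count in hist.items():
        w = count * math.exp(-(beta - beta0) * (e - e_ref))
        num += e * w
        den += w
    return num / den

samples = [-4, -4, -2, -2, -2, 0, 0, 2]  # toy energy samples at beta0
print(reweight_mean_energy(samples, beta0=0.5, beta=0.5))  # beta = beta0: plain sample mean
```

    Reweighting is reliable only for target temperatures close enough to beta0 that the sampled histogram still covers the relevant energies; the multiple-histogram method mentioned above combines runs at several temperatures to widen that range.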

  16. Development of advanced acreage estimation methods

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1980-01-01

    The use of the AMOEBA clustering/classification algorithm was investigated as a basis for both a color display generation technique and maximum likelihood proportion estimation procedure. An approach to analyzing large data reduction systems was formulated and an exploratory empirical study of spatial correlation in LANDSAT data was also carried out. Topics addressed include: (1) development of multiimage color images; (2) spectral spatial classification algorithm development; (3) spatial correlation studies; and (4) evaluation of data systems.

  17. Child Development in Developing Countries: Introduction and Methods

    ERIC Educational Resources Information Center

    Bornstein, Marc H.; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L.

    2012-01-01

    The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles…

  19. Development of ultrasonic methods for hemodynamic measurements

    NASA Technical Reports Server (NTRS)

    Histand, M. B.; Miller, C. W.; Wells, M. K.; Mcleod, F. D.; Greene, E. R.; Winter, D.

    1975-01-01

    A transcutaneous method to measure instantaneous mean blood flow in peripheral arteries of the human body was defined. Transcutaneous and implanted-cuff ultrasound velocity measurements were evaluated, and the accuracies of velocity, flow, and diameter measurements were assessed for steady flow. Performance criteria were established for the pulsed Doppler velocity meter (PUDVM), and performance tests were conducted. Several improvements are suggested.

  20. Using Qualitative Methods to Inform Scale Development

    ERIC Educational Resources Information Center

    Rowan, Noell; Wulff, Dan

    2007-01-01

    This article describes the process by which one study utilized qualitative methods to create items for a multi dimensional scale to measure twelve step program affiliation. The process included interviewing fourteen addicted persons while in twelve step focused treatment about specific pros (things they like or would miss out on by not being…

  1. Development of advanced acreage estimation methods

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1982-01-01

    The development of an accurate and efficient algorithm for analyzing the structure of MSS data, the application of the Akaiki information criterion to mixture models, and a research plan to delineate some of the technical issues and associated tasks in the area of rice scene radiation characterization are discussed. The AMOEBA clustering algorithm is refined and documented.

  2. Methods & Strategies: Developing Investigative Skills Purposefully

    ERIC Educational Resources Information Center

    Pellathy, Stephen L.; Paul, John; Cartier, Jennifer L.; Wittfeldt, Claudia

    2007-01-01

    Members of a team of educators and university students participating in the project, Pittsburgh Partnership for Energizing Science in Urban Schools, addressed the issue of helping students develop investigative skills within the context of an introductory science unit for fourth graders. The unit focuses on data-collection techniques and is a…

  3. A Method for Developing a Nutrient Guide.

    ERIC Educational Resources Information Center

    Gillespie, Ardyth H.; Roderuck, Charlotte E.

    1982-01-01

    This paper proposes a new approach to developing a tool for teaching nutrition and food selection. It allows adjustments as new information becomes available and takes into account both dietary recommendations and food composition. Steps involve nutrient composition; nutrient density; and ratings for fat, cholesterol, and sodium. (Author/CT)

  4. An Unbalance Adjustment Method for Development Indicators

    ERIC Educational Resources Information Center

    Tarabusi, Enrico Casadio; Guarini, Giulio

    2013-01-01

    This paper analyzes some aggregation aspects of the procedure for constructing a composite index on a multidimensional socio-economic phenomenon such as development, the main focus being on the unbalance among individual dimensions. First a theoretical framework is set up for the unbalance adjustment of the index. Then an aggregation function is…

  7. Development of Thin Conducting Film Fabrication Methods.

    DTIC Science & Technology

    1979-12-01

    Methods investigated for removing pyrolytic carbon films and transferring them intact onto cylindrical electrodes included melted beeswax and intermediate mandrels (beeswax and polymeric resins such as PVA, PVC, PBS, and Saran). Initial attempts at removing the carbon film from the quartz mandrel were made using melted beeswax.

  8. Pilot-in-the-loop Method Development

    DTIC Science & Technology

    2014-05-20

    A full-scale SFS2 grid was generated with Pointwise from the 1/100th-scale wind tunnel grids provided by NAVAIR, with spacing adjusted for differences in the grid generation methods of the software. "Baffle" surfaces were used in Pointwise to control the volume mesh resolution, and a simplified full-scale geometry was created by building triangulated surfaces from the boundary curves in Pointwise.

  9. Current status of fluoride volatility method development

    SciTech Connect

    Uhlir, J.; Marecek, M.; Skarohlid, J.

    2013-07-01

    The Fluoride Volatility Method is based on a separation process that exploits the specific property of uranium, neptunium, and plutonium to form volatile hexafluorides, whereas most fission products (mainly lanthanides) and higher transplutonium elements (americium, curium) present in irradiated fuel form nonvolatile trifluorides. The method itself is based on direct fluorination of the spent fuel; before the fluorination step, however, the cladding material must be removed and the fuel transformed into a powder with a suitable grain size. Fluorination is carried out with fluorine gas in a flame fluorination reactor, where the volatile fluorides (mostly UF{sub 6}) are separated from the nonvolatile ones (trivalent minor actinides and the majority of fission products). The subsequent operations necessary for partitioning the volatile fluorides are condensation and evaporation of the volatile fluorides, thermal decomposition of PuF{sub 6}, and finally distillation and sorption for purification of the uranium product. The Fluoride Volatility Method is considered a promising advanced pyrochemical reprocessing technology, mainly applicable to the reprocessing of oxide spent fuels from future GEN IV fast reactors.

  10. Recommendations for Developing Alternative Test Methods for Developmental Neurotoxicity

    EPA Science Inventory

    There is great interest in developing alternative methods for developmental neurotoxicity testing (DNT) that are cost-efficient, use fewer animals and are based on current scientific knowledge of the developing nervous system. Alternative methods will require demonstration of the...

  12. Developing an interactive microsimulation method in pharmacology.

    PubMed

    Collins, Angela S; Graves, Barbara A; Gullette, Donna; Edwards, Rebecca

    2010-07-01

    Pharmacology decision making requires clinical judgment. The authors created an interactive microsimulation applying drug information to varying patient situations. The theory-based microsimulation requires situational analysis for each scenario and uses an interactive format that allows the participant to navigate through three separate virtual clients' situations. Correct clinical decisions are rewarded by sounds and by video footage of the patient improving; conversely, incorrect choices show video footage of the patient decompensating. The microsimulation was developed to help students learn from the consequences of incorrect medication decisions in the virtual world without harming patients. The feedback of watching an incorrect decision play out helps students associate cause and effect in patient outcomes. The microsimulation reinforces the ease with which medication errors can occur and the extent of possible sequelae. The development process used to incorporate the technology into the nursing curriculum is discussed.

  13. Landfill mining: Developing a comprehensive assessment method.

    PubMed

    Hermann, Robert; Wolfsberger, Tanja; Pomberger, Roland; Sarc, Renato

    2016-11-01

    In Austria, the first basic technological and economic examinations of mass-waste landfills with the aim of recovering secondary raw materials have been carried out by the 'LAMIS - Landfill Mining Österreich' pilot project. A main focus of its research, and the subject of this article, is the first conceptual design of a comprehensive assessment method for landfill mining plans, including not only monetary factors (such as costs and proceeds) but also non-monetary ones, such as the concerns of adjoining owners or the environmental impact. Detailed literature reviews, identification of the influences and system boundaries to be included in planning landfill mining, several expert workshops, and talks with landfill operators were carried out, followed by division of the whole assessment method into a preliminary and a main assessment. The preliminary assessment uses a questionnaire to rate the juridical feasibility, risk, and expenditure of a landfill mining project. The results of this questionnaire are compiled in a portfolio chart that is used to recommend for or against further assessment. If a detailed main assessment is recommended, defined economic criteria are rated by net present value calculations, while ecological and socio-economic criteria are examined in a utility analysis and then transferred into a utility-net present value chart. If this chart does not support a definite statement on the feasibility of the project, the results must be further examined in a cost-effectiveness analysis, in which the benefit of the particular landfill mining project per capital unit (the utility-net present value ratio) is determined to reach a final, distinct statement on the general benefit of the project.
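
    The economic half of the main assessment rests on standard net present value arithmetic, and the final verdict on a utility-NPV ratio. A minimal Python sketch of those two calculations follows; all project figures, criteria ratings, and weights below are hypothetical placeholders, not values from the LAMIS project:

    ```python
    def npv(rate, cashflows):
        """Net present value; cashflows[0] occurs at t = 0."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

    def utility_score(ratings, weights):
        """Weighted utility-analysis score for non-monetary criteria."""
        return sum(r * w for r, w in zip(ratings, weights)) / sum(weights)

    # Hypothetical project: initial excavation cost, then yearly proceeds
    # from recovered secondary raw materials.
    cashflows = [-500_000, 150_000, 180_000, 180_000, 160_000]
    economic_npv = npv(0.05, cashflows)

    # Hypothetical ecological/socio-economic criteria rated 0-10
    # (e.g. adjoining owners' concerns, emissions), with weights.
    utility = utility_score([7, 5, 8], [0.5, 0.3, 0.2])

    # Benefit per capital unit: the utility-NPV ratio used for the
    # final statement when the chart alone is inconclusive.
    ratio = utility / economic_npv
    ```

    A positive NPV together with a high utility score supports the project; the ratio orders projects by benefit per unit of capital.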

  14. Methods and Protocols for Developing Prion Vaccines.

    PubMed

    Marciniuk, Kristen; Taschuk, Ryan; Napper, Scott

    2016-01-01

    Prion diseases denote a distinct form of infectivity that is based in the misfolding of a self-protein (PrP(C)) into a pathological, infectious conformation (PrP(Sc)). Efforts to develop vaccines for prion diseases have been complicated by the potential dangers that are associated with induction of immune responses against a self-protein. As a consequence, there is considerable appeal for vaccines that specifically target the misfolded prion conformation. Such conformation-specific immunotherapy is made possible through the identification of vaccine targets (epitopes) that are exclusively presented as a consequence of misfolding. An immune response directed against these targets, termed disease-specific epitopes (DSEs), has the potential to spare the function of the native form of the protein while clearing, or neutralizing, the infectious isomer. Although identification of DSEs represents a critical first step in the induction of conformation-specific immune responses, substantial efforts are required to translate these targets into functional vaccines. Due to the poor immunogenicity that is inherent to self-proteins, and that is often associated with short peptides, substantial efforts are required to overcome tolerance-to-self and maximize the resultant immune response following DSE-based immunization. This often includes optimization of target sequences in terms of immunogenicity and development of effective formulation and delivery strategies for the associated peptides. Further, these vaccines must satisfy additional criteria from perspectives of specificity (PrP(C) vs. PrP(Sc)) and safety (antibody-induced template-driven misfolding of PrP(C)). The emphasis of this report is on the steps required to translate DSEs into prion vaccines and subsequent evaluation of the resulting immune responses.

  15. Developing Automated Methods of Waste Sorting

    SciTech Connect

    Shurtliff, Rodney Marvin

    2002-08-01

    The U.S. Department of Energy (DOE) analyzed the complex-wide need for remote and automated technologies as they relate to the treatment and disposal of mixed wastes. This analysis revealed that several DOE sites need the capability to open drums containing waste, visually inspect and sort the contents, and finally repackage the containers that are acceptable at a waste disposal facility such as the Waste Isolation Pilot Plant (WIPP) in New Mexico. Conditioning contaminated waste so that it is compatible with the WIPP criteria for storage is an arduous task whether the waste is contact handled (waste having radioactivity levels below 200 mrem/hr) or remote handled. Currently, WIPP non-compliant items are removed from the waste stream manually, at a rate of about one 55-gallon drum per day. Issues relating to contamination-based health hazards as well as repetitive-motion health hazards are steering industry toward a more user-friendly method of conditioning or sorting waste.

  16. Biomarker method validation in anticancer drug development.

    PubMed

    Cummings, J; Ward, T H; Greystoke, A; Ranson, M; Dive, C

    2008-02-01

    Over recent years the role of biomarkers in anticancer drug development has expanded across a spectrum of applications ranging from research tool during early discovery to surrogate endpoint in the clinic. However, in Europe when biomarker measurements are performed on samples collected from subjects entered into clinical trials of new investigational agents, laboratories conducting these analyses become subject to the Clinical Trials Regulations. While these regulations are not specific in their requirements of research laboratories, quality assurance and in particular assay validation are essential. This review, therefore, focuses on a discussion of current thinking in biomarker assay validation. Five categories define the majority of biomarker assays from 'absolute quantitation' to 'categorical'. Validation must therefore take account of both the position of the biomarker in the spectrum towards clinical end point and the level of quantitation inherent in the methodology. Biomarker assay validation should be performed ideally in stages on 'a fit for purpose' basis avoiding unnecessarily dogmatic adherence to rigid guidelines but with careful monitoring of progress at the end of each stage. These principles are illustrated with two specific examples: (a) absolute quantitation of protein biomarkers by mass spectrometry and (b) the M30 and M65 ELISA assays as surrogate end points of cell death.

  17. A Block-Matrix Method for Course Development

    ERIC Educational Resources Information Center

    Greenaway, John

    1977-01-01

    Describes the block-matrix method, a technique used to develop new training programs (commonly involving educational program developers and community representatives). Two examples of the block-matrix application and supplementary diagrams are included. It is noted that this method has been used successfully in the development of new courses for…

  18. Antimicrobial Testing Methods & Procedures Developed by EPA's Microbiology Laboratory

    EPA Pesticide Factsheets

    We develop antimicrobial testing methods and standard operating procedures to measure the effectiveness of hard surface disinfectants against a variety of microorganisms. Find methods and procedures for antimicrobial testing.

  19. CFD methods development considerations for unsteady aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Batina, John T.

    1992-01-01

    The development of computational fluid dynamics (CFD) methods for unsteady aerodynamic analysis is described. Special emphasis is placed on considerations that are required for application of the methods to unsteady aerodynamic flow problems. Two broad categories of topics are presented to illustrate the major points. Although primary application of these CFD methods is to relatively low frequency oscillatory phenomena such as flutter, the ideas that are presented may be of value to developers of computational aeroacoustic methods for predicting high frequency acoustics.

  20. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Development methods and funding. 941.102 Section 941.102 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN DEVELOPMENT (CONTINUED) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND...

  1. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Development methods and funding. 941.102 Section 941.102 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND...

  2. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Development methods and funding. 941.102 Section 941.102 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN DEVELOPMENT (CONTINUED) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND...

  3. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  4. Development of a Research Methods and Statistics Concept Inventory

    ERIC Educational Resources Information Center

    Veilleux, Jennifer C.; Chapman, Kate M.

    2017-01-01

    Research methods and statistics are core courses in the undergraduate psychology major. To assess learning outcomes, it would be useful to have a measure that assesses research methods and statistical literacy beyond course grades. In two studies, we developed and provided initial validation results for a research methods and statistical knowledge…

  5. Development of quality assurance methods for epoxy graphite prepreg

    NASA Technical Reports Server (NTRS)

    Chen, J. S.; Hunter, A. B.

    1982-01-01

    Quality assurance methods for graphite epoxy prepregs were developed. Liquid chromatography, differential scanning calorimetry, and gel permeation chromatography were investigated, and these methods were applied to a second prepreg system. The resin matrix formulation was correlated with mechanical properties, and dynamic mechanical analysis and fracture toughness methods were investigated. The chromatography and calorimetry techniques were all successfully developed as quality assurance methods for graphite epoxy prepregs; the liquid chromatography method was the most sensitive to changes in resin formulation. The methods were also successfully applied to the second prepreg system.

  6. Method Of Predicting Size Of Software Under Development

    NASA Technical Reports Server (NTRS)

    Hihn, Jairus M.; Murthy, Subramanyam K.

    1994-01-01

    Method of estimating size and complexity of large computer program under development based on metric called "function mass." Simplification of Demarco's "function bang" metric. Size of completed program usually expressed in terms of number of lines of code (SLOC), shown in previous research to be highly correlated with amount of development effort and therefore important predictor of development cost. Proposed method of estimating ultimate size of program is intermediate product of continuing research on estimation of size and structured analysis of developmental software.
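
    The report does not reproduce the function-mass formula here, but the prediction step it describes, calibrating a size estimate against completed projects and applying it to a program under development, can be sketched as a least-squares fit. The calibration pairs below are hypothetical, not data from the research:

    ```python
    def fit_line(xs, ys):
        """Ordinary least-squares fit of y ~ a + b*x."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
        return my - b * mx, b

    # Hypothetical calibration data: (function mass, delivered SLOC)
    # pairs from completed projects.
    history = [(12, 4_800), (30, 11_500), (55, 21_000), (80, 30_500)]
    a, b = fit_line([h[0] for h in history], [h[1] for h in history])

    def predict_sloc(function_mass):
        """Predicted source lines of code for a program in analysis."""
        return a + b * function_mass
    ```

    Because SLOC correlates strongly with development effort, such a prediction feeds directly into early cost estimates.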

  7. Development of novel growth methods for halide single crystals

    NASA Astrophysics Data System (ADS)

    Yokota, Yuui; Kurosawa, Shunsuke; Shoji, Yasuhiro; Ohashi, Yuji; Kamada, Kei; Yoshikawa, Akira

    2017-03-01

    We developed novel growth methods for halide scintillator single crystals with hygroscopic nature, Halide micro-pulling-down [H-μ-PD] method and Halide Vertical Bridgman [H-VB] method. The H-μ-PD method with a removable chamber system can grow a single crystal of halide scintillator material with hygroscopicity at faster growth rate than the conventional methods. On the other hand, the H-VB method can grow a large bulk single crystal of halide scintillator without a quartz ampule. CeCl3, LaBr3, Ce:LaBr3 and Eu:SrI2 fiber single crystals could be grown by the H-μ-PD method and Eu:SrI2 bulk single crystals of 1 and 1.5 inch in diameter could be grown by the H-VB method. The grown fiber and bulk single crystals showed comparable scintillation properties to the previous reports using the conventional methods.

  8. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background: Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods: We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results: This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions: Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
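
    The core of the stratified spatial sampling the authors describe, dividing the study region's bounding box into strata and drawing one random point per stratum, can be sketched without GIS software. A minimal Python illustration, assuming a polygon boundary in projected coordinates (the actual survey used GIS/GPS tooling and satellite imagery):

    ```python
    import random

    def point_in_polygon(x, y, poly):
        """Ray-casting test; poly is a list of (x, y) vertices."""
        inside = False
        j = len(poly) - 1
        for i in range(len(poly)):
            xi, yi = poly[i]
            xj, yj = poly[j]
            if (yi > y) != (yj > y) and \
               x < (xj - xi) * (y - yi) / (yj - yi) + xi:
                inside = not inside
            j = i
        return inside

    def stratified_points(poly, nx, ny, seed=0):
        """One random survey point per grid cell (stratum) inside poly."""
        rng = random.Random(seed)
        xs = [p[0] for p in poly]
        ys = [p[1] for p in poly]
        x0, x1, y0, y1 = min(xs), max(xs), min(ys), max(ys)
        dx, dy = (x1 - x0) / nx, (y1 - y0) / ny
        points = []
        for i in range(nx):
            for j in range(ny):
                for _ in range(100):  # reject candidates outside the boundary
                    x = x0 + (i + rng.random()) * dx
                    y = y0 + (j + rng.random()) * dy
                    if point_in_polygon(x, y, poly):
                        points.append((x, y))
                        break
        return points
    ```

    Each returned point would then be located in the field with a GPS receiver and the nearest household selected, which is where the in-the-field challenges the authors note arise.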

  9. EPA Scientists Develop Research Methods for Studying Mold Fact Sheet

    EPA Pesticide Factsheets

    In 2002, U.S. Environmental Protection Agency researchers developed a DNA-based Mold Specific Quantitative Polymerase Chain Reaction method (MSQPCR) for identifying and quantifying over 100 common molds and fungi.

  10. 3-minute diagnosis: Researchers develop new method to recognize pathogens

    ScienceCinema

    Beer, Reg

    2016-07-12

    Imagine knowing precisely why you feel sick ... before the doctor's exam is over. Lawrence Livermore researcher Reg Beer and his engineering colleagues have developed a new method to recognize disease-causing pathogens quicker than ever before.

  11. 3-minute diagnosis: Researchers develop new method to recognize pathogens

    SciTech Connect

    Beer, Reg

    2014-01-06

    Imagine knowing precisely why you feel sick ... before the doctor's exam is over. Lawrence Livermore researcher Reg Beer and his engineering colleagues have developed a new method to recognize disease-causing pathogens quicker than ever before.

  12. Development of aerodynamic prediction methods for irregular planform wings

    NASA Technical Reports Server (NTRS)

    Benepe, D. B., Sr.

    1983-01-01

    A set of empirical methods was developed to predict low-speed lift, drag and pitching-moment variations with angle of attack for a class of low aspect ratio irregular planform wings suitable for application to advanced aerospace vehicles. The data base, an extensive series of wind-tunnel tests accomplished by the Langley Research Center of the National Aeronautics and Space Administration, is summarized. The approaches used to analyze the wind tunnel data, the evaluation of previously existing methods, data correlation efforts, and the development of the selected methods are presented and discussed. A summary of the methods is also presented to document the equations, computational charts and design guides which have been programmed for digital computer solution. Comparisons of predictions and test data are presented which show that the new methods provide a significant improvement in capability for evaluating the landing characteristics of advanced aerospace vehicles during the preliminary design phase of the configuration development cycle.
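
    The report's specific correlations are not reproduced in this abstract. As a generic illustration of the kind of low-aspect-ratio lift estimate such empirical methods start from, the classical Helmbold approximation (a textbook result, not the report's method) gives the lift-curve slope:

    ```python
    import math

    def lift_curve_slope(aspect_ratio):
        """Helmbold approximation for low-aspect-ratio wings (per radian):
        CL_alpha = 2*pi*AR / (2 + sqrt(AR**2 + 4))."""
        ar = aspect_ratio
        return 2.0 * math.pi * ar / (2.0 + math.sqrt(ar * ar + 4.0))

    def lift_coefficient(aspect_ratio, alpha_deg):
        """Linear (pre-stall) lift estimate at angle of attack alpha."""
        return lift_curve_slope(aspect_ratio) * math.radians(alpha_deg)
    ```

    The slope falls well below the two-dimensional value of 2*pi for low aspect ratios; empirical methods like those in the report add planform-specific corrections on top of such baselines.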

  13. Development of a new disintegration method for orally disintegrating tablets.

    PubMed

    Kakutani, Ryo; Muro, Hiroyuki; Makino, Tadashi

    2010-07-01

    Recently, the focus has been on the importance of assessing the oral disintegrative properties of orally disintegrating tablets (ODTs). In particular, in the development stages and the quality control field of ODT products, a physical assessment method which easily measures oral disintegrative properties is desired. For this reason, we developed a new disintegration test method (Kyoto-model disintegration method or KYO method), which is useful to predict the oral disintegrative properties of an ODT easily, and examined the availability of the method. In the KYO method, ODT samples were classified in terms of their water permeability, and a moderate water volume was decided. Subsequently, the disintegrative properties were assessed with the newly proposed method. For 25 commercial prescription ODTs used as samples, a good correlation was shown between the results of a human sensory test by five healthy male volunteers and the results using the KYO method. Furthermore, the KYO method could evaluate time-dependent changes in ODT samples. On the other hand, no correlation was observed between the Japanese Pharmacopeia disintegration test and the human sensory test. These results suggested that the KYO method reflected the disintegration nature of the ODTs in the oral cavity, and could easily be applied to development stages and the quality control field of ODT products.

  14. Measuring through the microscope: development and evolution of stereological methods.

    PubMed

    Weibel, E R

    1989-09-01

    Obtaining, by means of microscopy, meaningful measurements pertaining to spatial structures requires methods which allow three-dimensional quantitative information to be derived from the reduced information available on the two-dimensional flat sections of the structure. The most powerful methods to that effect are those of stereology which are based on mathematical principles. This paper reviews the early invention of these methods, which sought to solve practical problems, and their further evolution as more rigorous mathematical foundations were developed. It is demonstrated that stereological methods are essentially sampling methods and that newer trends provide new and sound solutions to old and elusive problems, such as anisotropy or particle number and size.
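
    One of the oldest stereological results of the kind reviewed here is the Delesse principle: the volume fraction of a phase equals the expected area fraction it occupies on a random section, which point counting estimates. A small Monte Carlo sketch of the principle, using a sphere embedded in a unit cube as the "phase" (an illustration of the principle, not the paper's material):

    ```python
    import math
    import random

    def point_count_area_fraction(disk_r, n, rng):
        """Area fraction of a disk centred in the unit square, by counting
        random test points that land inside it."""
        hits = sum(
            (rng.random() - 0.5) ** 2 + (rng.random() - 0.5) ** 2
            <= disk_r ** 2
            for _ in range(n)
        )
        return hits / n

    def estimate_volume_fraction(sphere_r=0.4, sections=200, points=500,
                                 seed=1):
        """Delesse: the mean area fraction over random planar sections
        estimates the volume fraction of the embedded sphere."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(sections):
            z = rng.random()  # random section height through the cube
            d2 = sphere_r ** 2 - (z - 0.5) ** 2
            if d2 > 0:  # section actually cuts the sphere
                total += point_count_area_fraction(math.sqrt(d2), points, rng)
        return total / sections
    ```

    The estimate converges to the true volume fraction (4/3)*pi*r**3 of the sphere; real stereological practice replaces the random test points with systematic grids overlaid on micrographs.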

  15. A random spatial sampling method in a rural developing nation.

    PubMed

    Kondo, Michelle C; Bream, Kent D W; Barg, Frances K; Branas, Charles C

    2014-04-10

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available.

  16. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  17. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  18. Interdisciplinary Curriculum Development in Hospital Methods Improvement. Final Report.

    ERIC Educational Resources Information Center

    Watt, John R.

    The major purpose of this project was to develop a "package" curriculum of Hospital Methods Improvement techniques for college students in health related majors. The elementary Industrial Engineering methods for simplifying work and saving labor were applied to the hospital environment and its complex of problems. The report's…

  19. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  20. Epistemological Development and Judgments and Reasoning about Teaching Methods

    ERIC Educational Resources Information Center

    Spence, Sarah; Helwig, Charles C.

    2013-01-01

    Children's, adolescents', and adults' (N = 96 7-8, 10-11, and 13-14-year-olds and university students) epistemological development and its relation to judgments and reasoning about teaching methods was examined. The domain (scientific or moral), nature of the topic (controversial or noncontroversial), and teaching method (direct instruction by…

  1. Methods of the Development Strategy of Service Companies: Logistical Approach

    ERIC Educational Resources Information Center

    Toymentseva, Irina A.; Karpova, Natalya P.; Toymentseva, Angelina A.; Chichkina, Vera D.; Efanov, Andrey V.

    2016-01-01

The urgency of the analyzed issue is due to lack of attention of heads of service companies to the theory and methodology of strategic management, methods and models of management decision-making in times of economic instability. The purpose of the article is to develop theoretical positions and methodological recommendations on the formation of the…

  2. Epistemological Development and Judgments and Reasoning about Teaching Methods

    ERIC Educational Resources Information Center

    Spence, Sarah; Helwig, Charles C.

    2013-01-01

Children's, adolescents', and adults' (N = 96; 7-8, 10-11, and 13-14-year-olds, and university students) epistemological development and its relation to judgments and reasoning about teaching methods was examined. The domain (scientific or moral), nature of the topic (controversial or noncontroversial), and teaching method (direct instruction by…

  3. Preferred Methods of Professional Development in Student Affairs

    ERIC Educational Resources Information Center

    Roberts, Darby M.

    2007-01-01

    Continuing professional development is a foundation of the student affairs field. To stay current, practitioners use a variety of methods to learn about areas that they need to master to be successful in their careers. Results of this research indicate that staff use interactive methods such as consulting with colleagues and mentoring more so than…

  4. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

Traditionally, a Management Information System (MIS) has been developed without using formal methods. With informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as unreliable system design specifications. To overcome these problems, a model theory approach was proposed, based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, model-driven development can flexibly accommodate changes in business logic or implementation technologies; in model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies model-driven development to a component of the model theory approach. Experiments showed that the method reduced development effort by more than 30%.

  5. A random spatial sampling method in a rural developing nation

    Treesearch

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...
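The stratified random sampling idea described in the abstract can be sketched in a few lines. The strata names, frame sizes, and sampling fraction below are hypothetical illustrations, not values from the study:

```python
import random

# Hypothetical sampling frame: each stratum maps to a list of candidate
# spatial units (e.g., grid cells covering a settlement pattern).
frame = {
    "village_center": [f"C{i}" for i in range(40)],
    "dispersed_rural": [f"R{i}" for i in range(160)],
}

def stratified_sample(frame, fraction, seed=0):
    """Draw a proportional simple random sample from each stratum."""
    rng = random.Random(seed)
    sample = {}
    for stratum, units in frame.items():
        k = max(1, round(len(units) * fraction))
        sample[stratum] = rng.sample(units, k)
    return sample

sample = stratified_sample(frame, fraction=0.10)
```

Stratifying first guarantees that hard-to-reach strata are represented even when the base population is incompletely enumerated, which is the problem the paper addresses.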

  6. Forestry sector analysis for developing countries: issues and methods.

    Treesearch

    R.W. Haynes

    1993-01-01

    A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their applications in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing...

  7. Quality functions for requirements engineering in system development methods.

    PubMed

    Johansson, M; Timpka, T

    1996-01-01

Based on a grounded theory framework, this paper analyses the quality characteristics of methods to be used for requirements engineering in the development of medical decision support systems (MDSS). The results from a Quality Function Deployment (QFD) used to rank functions connected to user value and a focus group study were presented to a validation focus group. The focus group studies take advantage of a group process to collect data for further analyses. The results describe factors considered by the participants as important in the development of methods for requirements engineering in health care. Based on the findings, the content that, according to the users, a method for MDSS development should support is established.
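The QFD ranking step described above boils down to weighted column sums over a relationship matrix. A toy sketch, with invented requirements, functions, weights, and scores (the common 0/1/3/9 relationship scale):

```python
# Toy Quality Function Deployment (QFD) ranking: user requirements carry
# importance weights, candidate method functions are scored against each
# requirement, and weighted column sums rank the functions.
# All names and numbers are illustrative, not from the study.
weights = [5, 4, 3]                      # importance of each requirement
functions = ["stakeholder interviews", "prototyping", "formal specs"]
rel = [                                  # rows: requirements, cols: functions
    [3, 1, 9],                           # traceability
    [9, 9, 1],                           # clinical fit
    [3, 9, 1],                           # ease of use
]

scores = [sum(w * row[j] for w, row in zip(weights, rel))
          for j in range(len(functions))]
ranking = sorted(zip(scores, functions), reverse=True)
```

The highest-scoring function is the one most connected to user value, which is what the validation focus group then scrutinizes.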

  8. A new method for the development of external pressure charts

    SciTech Connect

    Michalopoulos, E.

    1995-12-01

External pressure charts are used by various Sections of the ASME Boiler and Pressure Vessel and Piping Codes to design components such as cylinders, spheres, formed heads, tubes, piping and other components subjected to external pressure or axial compression loads. These charts are contained in Section II, Part D, Subpart 3 of the Boiler and Pressure Vessel Code and are pseudo stress-strain curves for groups of materials with similar stress-strain shapes and strength levels. Historical background is provided on the traditional graphical method used to develop these charts, which was originally developed in the 1930s. This paper presents a new method to develop external pressure charts using mathematical relationships for the material stress-strain curves and a mathematical formulation for the complete procedure. The new method improves the accuracy of the development of external pressure charts and takes advantage of improvements in testing techniques, which today provide digital data in addition to graphical information. A number of issues that introduced variability into the traditional graphical method are discussed in detail. The method also leads to the development of master material external pressure charts, which could increase the allowable buckling strength of numerous materials in the code. In addition, it provides the capability to extend the external pressure charts to higher temperatures. The new method has been applied successfully to several new materials adopted recently for code construction.
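A small sketch of the kind of step a mathematical (rather than graphical) procedure enables: computing a tangent modulus directly from digitized stress-strain test data instead of reading slopes off a plotted curve. The data points are invented for illustration:

```python
# Central-difference tangent modulus E_t = d(sigma)/d(epsilon) from
# digitized stress-strain data. Values below are illustrative only.
strain = [0.000, 0.001, 0.002, 0.003, 0.004]
stress = [0.0, 200.0, 380.0, 520.0, 610.0]   # MPa

def tangent_modulus(strain, stress):
    """Slope at each interior data point (MPa per unit strain)."""
    return [
        (stress[i + 1] - stress[i - 1]) / (strain[i + 1] - strain[i - 1])
        for i in range(1, len(strain) - 1)
    ]

E_t = tangent_modulus(strain, stress)
```

The decreasing tangent modulus past the proportional limit is exactly the behavior the pseudo stress-strain chart curves encode for buckling design.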

  9. Development of a hospital information system using the TAD method.

    PubMed

    Damij, T

    1998-01-01

    To examine the capability of a new object-oriented method called Tabular Application Development (TAD) in developing a hospital information system for a gastroenterology clinic. TAD has five phases. The first phase identifies the problem to be solved. The second phase defines the business processes and activities involved. The third phase develops the object model. The fourth phase designs the application model. The final phase deals with implementation. Eight requirements for the system were identified by hospital management; 17 specific tasks were grouped into three activity categories. The process model, the object model, and the application model of the system are described. The TAD method is capable of developing such an information system without any problem. Furthermore, the method minimizes the time needed to do this in such a way that the process is completely visible to the analyst.

  10. Development and applications of Krotov method of global control improvement

    SciTech Connect

    Rasina, Irina V.; Trushkova, Ekaterina A.; Baturina, Olga V.; Bulatov, Alexander V.; Guseva, Irina S.

    2016-06-08

This is a survey of works on the main properties, applications, and development of the Krotov method of global control improvement, which is very popular among researchers of modern problems in quantum physics and quantum chemistry who actively apply optimal control methods. The survey includes a brief description of the method in comparison with the well-known gradient method, demonstrating a serious advantage of the Krotov method: the absence of tuning parameters; investigations aimed at making its special version for quantum systems well defined and more effective; and generalizations to wide classes of control systems, including systems of heterogeneous structure.

  11. Validation of analytic methods for biomarkers used in drug development.

    PubMed

    Chau, Cindy H; Rixe, Olivier; McLeod, Howard; Figg, William D

    2008-10-01

    The role of biomarkers in drug discovery and development has gained precedence over the years. As biomarkers become integrated into drug development and clinical trials, quality assurance and, in particular, assay validation become essential with the need to establish standardized guidelines for analytic methods used in biomarker measurements. New biomarkers can revolutionize both the development and use of therapeutics but are contingent on the establishment of a concrete validation process that addresses technology integration and method validation as well as regulatory pathways for efficient biomarker development. This perspective focuses on the general principles of the biomarker validation process with an emphasis on assay validation and the collaborative efforts undertaken by various sectors to promote the standardization of this procedure for efficient biomarker development.
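Assay precision in validation exercises like these is commonly summarized by a coefficient of variation. A minimal sketch, with hypothetical inter-day control-sample values:

```python
import statistics

# Illustrative inter-day precision check for a biomarker assay: the
# coefficient of variation (CV%) of repeated measurements of the same
# quality-control sample. The values are hypothetical.
day_means = [98.2, 101.5, 99.8, 103.0, 97.6]   # e.g., ng/mL on five days

def cv_percent(values):
    """Sample-standard-deviation CV, reported as a percentage."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

cv = cv_percent(day_means)
```

Validation guidelines typically set acceptance limits on CVs like this at each concentration level tested.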

  12. Validation of Analytical Methods for Biomarkers Employed in Drug Development

    PubMed Central

    Chau, Cindy H.; Rixe, Olivier; McLeod, Howard; Figg, William D.

    2008-01-01

The role of biomarkers in drug discovery and development has gained precedence over the years. As biomarkers become integrated into drug development and clinical trials, quality assurance and in particular assay validation becomes essential with the need to establish standardized guidelines for analytical methods used in biomarker measurements. New biomarkers can revolutionize both the development and use of therapeutics, but are contingent upon the establishment of a concrete validation process that addresses technology integration and method validation as well as regulatory pathways for efficient biomarker development. This perspective focuses on the general principles of the biomarker validation process with an emphasis on assay validation and the collaborative efforts undertaken by various sectors to promote the standardization of this procedure for efficient biomarker development. PMID:18829475

  13. Development of Improved Surface Integral Methods for Jet Aeroacoustic Predictions

    NASA Technical Reports Server (NTRS)

    Pilon, Anthony R.; Lyrintzis, Anastasios S.

    1997-01-01

The accurate prediction of aerodynamically generated noise has become an important goal over the past decade. Aeroacoustics must now be an integral part of the aircraft design process. The direct calculation of aerodynamically generated noise with CFD-like algorithms is plausible. However, large computer time and memory requirements often make these predictions impractical. It is therefore necessary to separate the aeroacoustics problem into two parts, one in which aerodynamic sound sources are determined, and another in which the propagating sound is calculated. This idea is applied in acoustic analogy methods. However, in the acoustic analogy, the determination of far-field sound requires the solution of a volume integral. This volume integration again leads to impractical computer requirements. An alternative to the volume integrations can be found in the Kirchhoff method. In this method, Green's theorem for the linear wave equation is used to determine sound propagation based on quantities on a surface surrounding the source region. The change from volume to surface integrals represents a tremendous savings in the computer resources required for an accurate prediction. This work is concerned with the development of enhancements of the Kirchhoff method for use in a wide variety of aeroacoustics problems. This enhanced method, the modified Kirchhoff method, is shown to be a Green's function solution of Lighthill's equation. It is also shown rigorously to be identical to the methods of Ffowcs Williams and Hawkings. This allows for development of versatile computer codes which can easily alternate between the different Kirchhoff and Ffowcs Williams-Hawkings formulations, using the most appropriate method for the problem at hand. The modified Kirchhoff method is developed primarily for use in jet aeroacoustics predictions. Applications of the method are shown for two-dimensional and three-dimensional jet flows. Additionally, the enhancements are generalized so that
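The savings described above come from replacing a volume integral over the source region with an integral over a surrounding surface. One common statement of the classical Kirchhoff formula for a stationary control surface S is (a sketch of the standard result from the aeroacoustics literature, not the modified formulation developed in this report; signs depend on the convention for the normal n):

```latex
p(\mathbf{x},t) \;=\; \frac{1}{4\pi} \oint_S
\left[
  \frac{p}{r^{2}} \frac{\partial r}{\partial n}
  \;-\; \frac{1}{r} \frac{\partial p}{\partial n}
  \;+\; \frac{1}{c\,r} \frac{\partial r}{\partial n}
        \frac{\partial p}{\partial \tau}
\right]_{\tau \,=\, t - r/c} \mathrm{d}S
```

Here r is the distance from the surface element to the observer, c the sound speed, and τ the retarded time; only pressure data on S are required, which is the computational saving the abstract describes.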

  14. Reactions Involved in Fingerprint Development Using the Cyanoacrylate - Fuming Method

    SciTech Connect

    Lewis, L.A.

    2001-07-30

    The Learning Objective is to present the basic chemistry research findings to the forensic community regarding development of latent fingerprints using the cyanoacrylate fuming method. Chemical processes involved in the development of latent fingerprints using the cyanoacrylate fuming method have been studied, and will be presented. Two major types of latent prints have been investigated--clean (eccrine) and oily (sebaceous) prints. Scanning electron microscopy (SEM) was used as a tool for determining the morphology of the polymer developed separately on clean and oily prints after cyanoacrylate fuming. A correlation between the chemical composition of an aged latent fingerprint, prior to development, and the quality of a developed fingerprint was observed in the morphology. The moisture in the print prior to fuming was found to be a critical factor for the development of a useful latent print. In addition, the amount of time required to develop a high quality latent print was found to be minimal. The cyanoacrylate polymerization process is extremely rapid. When heat is used to accelerate the fuming process, typically a period of 2 minutes is required to develop the print. The optimum development time is dependent upon the concentration of cyanoacrylate vapors within the enclosure.

  15. Development of Impurity Profiling Methods Using Modern Analytical Techniques.

    PubMed

    Ramachandra, Bondigalla

    2017-01-02

This review gives a brief introduction to process- and product-related impurities and emphasizes the development of novel analytical methods for their determination. It describes the application of modern analytical techniques, particularly ultra-performance liquid chromatography (UPLC), liquid chromatography-mass spectrometry (LC-MS), high-resolution mass spectrometry (HRMS), gas chromatography-mass spectrometry (GC-MS) and high-performance thin-layer chromatography (HPTLC). In addition, the application of nuclear magnetic resonance (NMR) spectroscopy to the characterization of impurities and degradation products is discussed. The significance of the quality, efficacy and safety of drug substances/products, including the source of impurities, kinds of impurities, adverse effects of the presence of impurities, quality control of impurities, the necessity for the development of impurity profiling methods, identification of impurities and regulatory aspects, is discussed. Other important aspects discussed are forced degradation studies and the development of stability-indicating assay methods.

  16. Epistemological development and judgments and reasoning about teaching methods.

    PubMed

    Spence, Sarah; Helwig, Charles C

    2013-01-01

Children's, adolescents', and adults' (N = 96; 7-8, 10-11, and 13-14-year-olds, and university students) epistemological development and its relation to judgments and reasoning about teaching methods was examined. The domain (scientific or moral), nature of the topic (controversial or noncontroversial), and teaching method (direct instruction by lectures versus class discussions) were systematically varied. Epistemological development was assessed in the aesthetics, values, and physical truth domains. All participants took the domain, nature of the topic, and teaching method into consideration in ways that showed age-related variations. Epistemological development in the value domain alone was predictive of preferences for class discussions and a critical perspective on teacher-centered direct instruction, even when age was controlled in the analysis.

  17. Developing and Understanding Methods for Large Scale Nonlinear Optimization

    DTIC Science & Technology

    2001-12-01

development of new algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with...analysis of tensor and SQP methods for singular constrained optimization", to appear in SIAM Journal on Optimization. Published in peer-reviewed...Mathematica, Vol III, Journal der Deutschen Mathematiker-Vereinigung, 1998. S. Crivelli, B. Bader, R. Byrd, E. Eskow, V. Lamberti, R. Schnabel and T

  18. Recent development on statistical methods for personalized medicine discovery.

    PubMed

    Zhao, Yingqi; Zeng, Donglin

    2013-03-01

    It is well documented that patients can show significant heterogeneous responses to treatments so the best treatment strategies may require adaptation over individuals and time. Recently, a number of new statistical methods have been developed to tackle the important problem of estimating personalized treatment rules using single-stage or multiple-stage clinical data. In this paper, we provide an overview of these methods and list a number of challenges.
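The core idea of a single-stage personalized treatment rule can be sketched on simulated data: estimate the treatment benefit within covariate subgroups and recommend treatment wherever the estimated benefit is positive. This is a deliberately crude stand-in for the regression- and machine-learning-based estimators overviews like this survey:

```python
import random

# Simulated single-stage trial: outcome depends on a covariate x, and the
# treatment effect (-0.2 + 1.5*x) changes sign across patients, so the
# best rule is individualized. Nothing here comes from the paper itself.
rng = random.Random(1)
data = []
for _ in range(2000):
    x = rng.uniform(-1, 1)            # patient covariate
    a = rng.randint(0, 1)             # randomized treatment 0/1
    y = 0.5 * x + a * (-0.2 + 1.5 * x) + rng.gauss(0, 0.5)
    data.append((x, a, y))

def estimated_benefit(subgroup):
    """Difference in mean outcome, treated minus control."""
    treated = [y for x, a, y in subgroup if a == 1]
    control = [y for x, a, y in subgroup if a == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)

benefit_low = estimated_benefit([d for d in data if d[0] < 0])
benefit_high = estimated_benefit([d for d in data if d[0] >= 0])
rule = {"x<0": benefit_low > 0, "x>=0": benefit_high > 0}
```

Even this crude subgroup estimator recovers the qualitative rule (treat only patients with high x); the surveyed methods refine the same logic with flexible models and multi-stage data.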

  19. Formal methods in the development of safety critical software systems

    SciTech Connect

    Williams, L.G.

    1991-11-15

As the use of computers in critical control systems such as aircraft controls, medical instruments, defense systems, missile controls, and nuclear power plants has increased, concern for the safety of those systems has also grown. Much of this concern has focused on the software component of those computer-based systems. This is primarily due to historical experience with software systems that often exhibit larger numbers of errors than their hardware counterparts and the fact that the consequences of a software error may endanger human life, property, or the environment. A number of different techniques have been used to address the issue of software safety. Some are standard software engineering techniques aimed at reducing the number of faults in a software project, such as reviews and walkthroughs. Others, including fault tree analysis, are based on identifying and reducing hazards. This report examines the role of one such technique, formal methods, in the development of software for safety critical systems. The use of formal methods to increase the safety of software systems is based on their role in reducing the possibility of software errors that could lead to hazards. The use of formal methods in the development of software systems is controversial. Proponents claim that the use of formal methods can eliminate errors from the software development process, and produce programs that are provably correct. Opponents claim that they are difficult to learn and that their use increases development costs unacceptably. This report discusses the potential of formal methods for reducing failures in safety critical software systems.

  20. Agile Software Development Methods: A Comparative Review1

    NASA Astrophysics Data System (ADS)

    Abrahamsson, Pekka; Oza, Nilay; Siponen, Mikko T.

    Although agile software development methods have caught the attention of software engineers and researchers worldwide, scientific research still remains quite scarce. The aim of this study is to order and make sense of the different agile approaches that have been proposed. This comparative review is performed from the standpoint of using the following features as the analytical perspectives: project management support, life-cycle coverage, type of practical guidance, adaptability in actual use, type of research objectives and existence of empirical evidence. The results show that agile software development methods cover, without offering any rationale, different phases of the software development life-cycle and that most of these methods fail to provide adequate project management support. Moreover, quite a few methods continue to offer little concrete guidance on how to use their solutions or how to adapt them in different development situations. Empirical evidence after ten years of application remains quite limited. Based on the results, new directions on agile methods are outlined.

  1. A Valuation Method for Multi-Stage Development Projects

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yasuhiro; Kubo, Osamu; Ito, Junko; Ueda, Yoshikatsu

A real-option-based valuation method has been developed for multi-stage development projects that allow flexible stage-wise go/stop judgments. The proposed method measures the economic value of projects from the potential future cash flow they produce, and is characterized by the following four functions: (1) Incorporation of technical and market risks into project valuation, (2) Quantification of a project portfolio value, (3) Modeling of correlation between individual projects in a portfolio, and (4) Control of project portfolio risk with a risk index.
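A minimal backward-induction sketch of the stage-wise go/stop idea: at each gate the firm continues only if the expected continuation value covers the stage cost. The numbers are illustrative, and this toy omits the market-risk and portfolio-correlation machinery of the actual method:

```python
# Backward induction over development stages with an abandonment option.
# stages = [(stage_cost, success_probability), ...] in time order; the
# max(0, ...) at each gate is the stage-wise go/stop judgment.
def project_value(payoff, stages, rate=0.0):
    value = payoff
    for cost, p in reversed(stages):
        value = max(0.0, p * value / (1.0 + rate) - cost)
    return value

# Two-stage project: stage 1 costs 10 with 50% success, stage 2 costs 5
# with 80% success, terminal payoff 100 (all hypothetical).
v = project_value(payoff=100.0, stages=[(10.0, 0.5), (5.0, 0.8)])
```

With a terminal payoff of only 10, backward induction shows the project is abandoned at the first gate and is worth zero, which is exactly the flexibility the real-option framing captures.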

  2. Research and development of a stationary source method for phosgene

    SciTech Connect

    Coppedge, E.A.; Johnson, L.D.; Steger, J.L.

    1996-12-31

Phosgene is listed as one of the hazardous air pollutants in title I of the Clean Air Act. Phosgene is a highly toxic gas at standard temperature and pressure and has been used in military applications and for a variety of industrial uses. Although various methods have been developed for detection of phosgene in ambient air, no method is directly applicable to stationary source emissions. The EPA has an ongoing research project to develop a field-ready protocol for phosgene from stationary source emissions. The results of the derivatization studies, sampling train experiments and other laboratory work will be shown.

  3. Recent Developments in Helioseismic Analysis Methods and Solar Data Assimilation

    NASA Astrophysics Data System (ADS)

    Schad, A.; Jouve, L.; Duvall, T. L.; Roth, M.; Vorontsov, S.

    2015-12-01

We review recent advances and results in enhancing and developing helioseismic analysis methods and in solar data assimilation. The first part of this paper focuses on selected developments in time-distance and global helioseismology. The second part reviews the application of data assimilation methods to solar data. Relating solar surface observations and helioseismic proxies to solar dynamo models by means of data assimilation techniques is a promising new approach for exploring and predicting the magnetic activity cycle of the Sun.

  4. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    SciTech Connect

    Prinn, Ronald; Webster, Mort

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.
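The probabilistic-scenario idea summarized above can be sketched in a few lines: draw an uncertain efficiency parameter from a PDF and examine the spread of projected emissions. The normal distribution and the one-line "model" below are illustrative stand-ins, not the integrated assessment models used in the project:

```python
import random

# Monte Carlo over an uncertain autonomous energy efficiency improvement
# (AEEI) parameter. Distribution parameters and the projection formula
# are hypothetical.
rng = random.Random(42)

def emissions_2050(aeei):
    """Toy projection: 2%/yr baseline growth offset by AEEI, 40 years out."""
    return 10.0 * (1.0 + 0.02 - aeei) ** 40   # arbitrary emission units

draws = sorted(emissions_2050(rng.gauss(0.01, 0.004)) for _ in range(5000))
p05, p50, p95 = (draws[int(q * len(draws))] for q in (0.05, 0.50, 0.95))
```

Selecting a small number of scenarios that span the p05-p95 range is the spirit of the scenario-design task the report describes.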

  5. Pilot-in-the-Loop CFD Method Development

    DTIC Science & Technology

    2017-04-20

Contract # N00014-14-C-0020 Pilot-in-the-Loop CFD Method Development Progress Report (CDRL A001) Progress Report for Period: January...21, 2017 to April 20, 2017 PI: Joseph F. Horn 814-865-6434 joehorn@psu.edu Performing Organization: The Pennsylvania State University...Penn State VLRCOE Flight Simulator. Performance Study and Grid Dependency To quantify the timing performance of the developed coupling tool on

  6. Development of ultrasonic methods for the nondestructive inspection of concrete

    SciTech Connect

    Claytor, T.N.; Ellingson, W.A.

    1983-08-01

    Nondestructive inspection of Portland cement and refractory concrete is conducted to determine strength, thickness, presence of voids or foreign matter, presence of cracks, amount of degradation due to chemical attack, and other properties without the necessity of coring the structure (which is usually accomplished by destructively removing a sample). This paper reviews the state of the art of acoustic nondestructive testing methods for Portland cement and refractory concrete. Most nondestructive work on concrete has concentrated on measuring acoustic velocity by through transmission methods. Development of a reliable pitch-catch or pulse-echo system would provide a method of measuring thickness with access from only one side of the concrete.
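The pulse-echo measurement mentioned at the end of the abstract reduces to a time-of-flight calculation: the round-trip travel time of an echo from the back wall gives thickness from one-sided access. The velocity below is a typical order of magnitude for sound in concrete, used only for illustration:

```python
# Pulse-echo thickness: the pulse crosses the slab twice (out and back),
# so one-way thickness is velocity * round-trip time / 2.
def thickness_from_echo(round_trip_s, velocity_m_per_s):
    return velocity_m_per_s * round_trip_s / 2.0

# A 100-microsecond round trip at ~4000 m/s (illustrative values).
d = thickness_from_echo(round_trip_s=100e-6, velocity_m_per_s=4000.0)
```

The practical difficulty the paper addresses is not this arithmetic but obtaining a clean, reliable echo from a heterogeneous material like concrete.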

  7. Method for Developing Equipment Failure Rate K Factors

    DTIC Science & Technology

    1975-01-03

General Technique 10 2.2 Assumptions 10 2.3 Method Detailed 11 2.4 Application to Reliability 16 3.0 TECHNIQUE VALIDATION 22 3.1 Methods Investigated 22...1 Chi-Square Test 28 3.2-2 Chi-Square Test 31 5 D180-17674-2 I THWA RAWAMA COMPANY 1.0 INTRODUCTION In general, K factors (Logistic Performance...to developing and validating a method for use at the generic system level. However, as data improves and resources become available, it may be

  8. Development of ultrasonic methods for the nondestructive inspection of concrete

    NASA Astrophysics Data System (ADS)

    Claytor, T. M.; Ellingson, W. A.

    1983-08-01

    Nondestructive inspection of Portland cement and refractory concrete is conducted to determine strength, thickness, presence of voids or foreign matter, presence of cracks, amount of degradation due to chemical attack, and other properties without the necessity of coring the structure (which is usually accomplished by destructively removing a sample). The state of the art of acoustic nondestructive testing methods for Portland cement and refractory concrete is reviewed. Most nondestructive work on concrete has concentrated on measuring acoustic velocity by through transmission methods. Development of a reliable pitch-catch or pulse-echo system would provide a method of measuring thickness with access from only one side of the concrete.

  9. Incremental dental development: methods and applications in hominoid evolutionary studies.

    PubMed

    Smith, Tanya M

    2008-02-01

    This survey of dental microstructure studies reviews recent methods used to quantify developmental variables (daily secretion rate, periodicity of long-period lines, extension rate, formation time) and applications to the study of hominoid evolution. While requisite preparative and analytical methods are time consuming, benefits include more precise identification of tooth crown initiation and completion than conventional radiographic approaches. Furthermore, incremental features facilitate highly accurate estimates of the speed and duration of crown and root formation, stress experienced during development (including birth), and age at death. These approaches have provided insight into fossil hominin and Miocene hominoid life histories, and have also been applied to ontogenetic and taxonomic studies of fossil apes and humans. It is shown here that, due to the rapidly evolving nature of dental microstructure studies, numerous methods have been applied over the past few decades to characterize the rate and duration of dental development. Yet, it is often unclear whether data derived from different methods are comparable or which methods are the most accurate. Areas for future research are identified, including the need for validation and standardization of certain methods, and new methods for integrating nondestructive structural and developmental studies are highlighted.
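The formation-time arithmetic behind estimates like these can be sketched with hypothetical values: cuspal enamel formed at the daily secretion rate, plus lateral enamel counted in long-period (Retzius) lines times their periodicity. All numbers below are invented for illustration, not taken from the survey:

```python
# Crown formation time from incremental features (hypothetical values).
cuspal_thickness_um = 1200.0       # cuspal enamel thickness in microns
daily_secretion_rate_um = 4.0      # microns of enamel secreted per day
retzius_line_count = 80            # long-period lines on the lateral crown
periodicity_days = 8               # days (cross-striations) per Retzius line

cuspal_days = cuspal_thickness_um / daily_secretion_rate_um
lateral_days = retzius_line_count * periodicity_days
crown_formation_days = cuspal_days + lateral_days
```

Each input is read directly from thin-section microscopy, which is why incremental methods outperform radiographic staging for timing crown initiation and completion.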

  10. Development of a benchtop baking method for chemically leavened crackers. II. Validation of the method

    USDA-ARS?s Scientific Manuscript database

    A benchtop baking method has been developed to predict the contribution of gluten functionality to overall flour performance for chemically leavened crackers. Using a diagnostic formula and procedure, dough rheology was analyzed to evaluate the extent of gluten development during mixing and machinin...

  11. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    SciTech Connect

    Ravindra, M.K.; Banon, H.

    1992-07-01

In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals; (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  12. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    SciTech Connect

Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals; (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  13. DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.

    ERIC Educational Resources Information Center

    KUSEWITT, J.B.

    THE PURPOSE OF THIS STUDY WAS TO DEVELOP A METHOD FOR DETERMINING OBJECTIVE MEASURES OF TRAINER AIRCRAFT EFFECTIVENESS TO EVALUATE PROGRAM ALTERNATIVES FOR TRAINING PILOTS FOR FLEET FIGHTER AND ATTACK-TYPE AIRCRAFT. THE TRAINING SYLLABUS WAS BASED ON AVERAGE STUDENT ABILITY. THE BASIC PROBLEM WAS TO ESTABLISH QUANTITATIVE TIME-DIFFICULTY…

  14. Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course

    ERIC Educational Resources Information Center

    Lanigan, Katherine C.

    2008-01-01

    Method development and assessment, central components of carrying out chemical research, require problem-solving skills. This article describes a pedagogical approach for teaching these skills through the adaptation of published experiments and application of group-meeting style discussions to the curriculum of an undergraduate instrumental…

  15. EXPOSURE ASSESSMENT METHODS DEVELOPMENT PILOTS FOR THE NATIONAL CHILDREN'S STUDY

    EPA Science Inventory

    Accurate exposure classification tools are needed to link exposure with health effects. EPA began methods development pilot studies in 2000 to address general questions about exposures and outcome measures. Selected pilot studies are highlighted in this poster. The “Literature Re...

  16. 59 FR- Method Development for Airborne Mycobacterium Tuberculosis; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    1994-03-07

    ... Tuberculosis; Meeting The National Institute for Occupational Safety and Health (NIOSH) of the Centers for... Airborne Mycobacterium Tuberculosis. Time and Date: 1 p.m.-5 p.m., March 29, 1994. Place: Alice Hamilton... peer review of a NIOSH project entitled ``Method Development For Airborne Mycobacterium...

  17. DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.

    ERIC Educational Resources Information Center

    KUSEWITT, J.B.

    THE PURPOSE OF THIS STUDY WAS TO DEVELOP A METHOD FOR DETERMINING OBJECTIVE MEASURES OF TRAINER AIRCRAFT EFFECTIVENESS TO EVALUATE PROGRAM ALTERNATIVES FOR TRAINING PILOTS FOR FLEET FIGHTER AND ATTACK-TYPE AIRCRAFT. THE TRAINING SYLLABUS WAS BASED ON AVERAGE STUDENT ABILITY. THE BASIC PROBLEM WAS TO ESTABLISH QUANTITATIVE TIME-DIFFICULTY…

  18. New developments in adaptive methods for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Oden, J. T.; Bass, Jon M.

    1990-01-01

    New developments in a posteriori error estimates, smart algorithms, and h- and h-p adaptive finite element methods are discussed in the context of two- and three-dimensional compressible and incompressible flow simulations. Applications to rotor-stator interaction, rotorcraft aerodynamics, shock and viscous boundary layer interaction and fluid-structure interaction problems are discussed.

  19. Is Mixed Methods Research Used in Australian Career Development Research?

    ERIC Educational Resources Information Center

    Cameron, Roslyn

    2010-01-01

    Mixed methods research has become a substantive methodological force that is growing in popularity within the human and social sciences. This article reports the findings of a study that systematically reviewed articles from the "Australian Journal of Career Development" from 2004 to 2009. The aim of the study was to…

  20. Multidisciplinary Methods in Educational Technology Research and Development

    ERIC Educational Resources Information Center

    Randolph, Justus J.

    2008-01-01

    Over the past thirty years, there has been much dialogue, and debate, about the conduct of educational technology research and development. In this brief volume, the author helps clarify that dialogue by theoretically and empirically charting the research methods used in the field and provides much practical information on how to conduct…

  1. EXPOSURE ASSESSMENT METHODS DEVELOPMENT PILOTS FOR THE NATIONAL CHILDREN'S STUDY

    EPA Science Inventory

    Accurate exposure classification tools are needed to link exposure with health effects. EPA began methods development pilot studies in 2000 to address general questions about exposures and outcome measures. Selected pilot studies are highlighted in this poster. The “Literature Re...

  2. Is Mixed Methods Research Used in Australian Career Development Research?

    ERIC Educational Resources Information Center

    Cameron, Roslyn

    2010-01-01

    Mixed methods research has become a substantive methodological force that is growing in popularity within the human and social sciences. This article reports the findings of a study that systematically reviewed articles from the "Australian Journal of Career Development" from 2004 to 2009. The aim of the study was to…

  3. NEW METHODS FOR MEASURING THE DEVELOPMENT OF ATTITUDES IN CHILDREN.

    ERIC Educational Resources Information Center

    HESS, ROBERT D.; TORNEY, JUDITH V.

    STRUCTURAL (NONCONTENT) DIMENSIONS OF CHILDREN'S POLITICAL ATTITUDES AND THEIR DEVELOPMENT WERE INVESTIGATED USING NEW METHODS DERIVED FROM SELF-REPORT DATA. THE CONSTRUCT, "ATTITUDE-CONCEPT SYSTEM," WAS INTRODUCED TO DESIGNATE EVALUATIONS OF AN ATTITUDE OBJECT AND BELIEFS ASSOCIATED WITH THIS EVALUATION. THE FIVE STRUCTURAL DIMENSIONS…

  4. Development of a Chronic Toxicity Testing Method for Daphnia pulex

    DTIC Science & Technology

    2015-08-01

    Materials (ASTM), United States Environmental Protection Agency (USEPA), Organisation for Economic Cooperation and Development (OECD), and...Environmental Laboratory, modifications were made to the current USEPA (2002) chronic method for Ceriodaphnia dubia and the ASTM (2012) and OECD (2008...ultrapure water) MHRW moderately hard reconstituted water (USEPA 2002) MWF Monday, Wednesday, Friday OECD Organisation for Economic Cooperation and

  5. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

    To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. In traditional cost accounting systems, volume-based costing (VBC) is the most popular cost accounting method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often produces rough and inaccurate results. The activity-based costing (ABC) method introduced in the mid-1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests and obtained results similar in accuracy to those of the ABC method (the largest difference was 2.64%). Simultaneously, this new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to certify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing.
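    The allocation step shared by the ABC and S-ABC approaches can be sketched as follows. This is an illustrative implementation of proportional cost-driver allocation only, not the authors' code; the activity names, test names, and figures are hypothetical.

```python
def abc_allocate(activity_costs, driver_usage):
    """Allocate each activity's cost to cost objects in proportion to
    each object's share of that activity's cost driver.

    activity_costs: {activity: total cost of the activity}
    driver_usage:   {activity: {cost_object: driver units consumed}}
    Returns {cost_object: total allocated cost}.
    """
    allocated = {}
    for activity, cost in activity_costs.items():
        usage = driver_usage[activity]
        total_units = sum(usage.values())
        for obj, units in usage.items():
            allocated[obj] = allocated.get(obj, 0.0) + cost * units / total_units
    return allocated

# Hypothetical example: two activities allocated to two laboratory tests,
# driven by labor hours and machine hours respectively.
costs = {"handling": 900.0, "instrument": 600.0}
usage = {
    "handling":   {"test_A": 2, "test_B": 1},   # labor hours
    "instrument": {"test_A": 1, "test_B": 3},   # machine hours
}
result = abc_allocate(costs, usage)
```

    Reducing the number of cost drivers, as in S-ABC, simply shrinks the outer dictionary while keeping this allocation rule unchanged.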

  6. Development of a transfer function method for dynamic stability measurement

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1977-01-01

    A flutter testing method based on transfer function measurements is developed. The error statistics of several dynamic stability measurement methods are reviewed. It is shown that the transfer function measurement controls the error level by averaging the data and correlating the input and output. The method also gives a direct estimate of the error in the response measurement. An algorithm is developed for obtaining the natural frequency and damping ratio of lightly damped modes of the system, using integrals of the transfer function in the vicinity of a resonant peak. Guidelines are given for selecting the parameters in the transfer function measurement. Finally, the dynamic stability measurement technique is applied to data from a wind tunnel test of a proprotor and wing model.
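    Johnson's algorithm uses integrals of the transfer function near a resonant peak; as a simpler stand-in (not the report's method), the classical half-power bandwidth rule recovers the same two quantities from a measured magnitude curve. A sketch, assuming a single lightly damped mode and a synthetic transfer function:

```python
import math

def half_power_estimate(freq, mag):
    """Estimate natural frequency and damping ratio of a lightly damped
    mode from a transfer function magnitude using the half-power rule:
    zeta ~ (f2 - f1) / (2 * fn), where f1 and f2 are the frequencies at
    which |H| falls to peak / sqrt(2)."""
    i_pk = max(range(len(mag)), key=mag.__getitem__)
    fn = freq[i_pk]
    half = mag[i_pk] / math.sqrt(2.0)
    lo = i_pk
    while lo > 0 and mag[lo] > half:            # lower half-power crossing
        lo -= 1
    hi = i_pk
    while hi < len(mag) - 1 and mag[hi] > half:  # upper half-power crossing
        hi += 1
    zeta = (freq[hi] - freq[lo]) / (2.0 * fn)
    return fn, zeta

# Synthetic single-degree-of-freedom response with fn = 10 Hz, zeta = 0.02
f = [5.0 + 10.0 * i / 20000 for i in range(20001)]
H = [1.0 / math.sqrt((1 - (x / 10.0) ** 2) ** 2 + (2 * 0.02 * x / 10.0) ** 2)
     for x in f]
fn_est, zeta_est = half_power_estimate(f, H)
```

    With noisy measured data, the averaging and input/output correlation described in the abstract would precede this peak-fitting step.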

  7. Quantitative methods for analyzing cell-cell adhesion in development.

    PubMed

    Kashef, Jubin; Franz, Clemens M

    2015-05-01

    During development, cell-cell adhesion is not only crucial to maintain tissue morphogenesis and homeostasis; it also activates signalling pathways important for the regulation of different cellular processes including cell survival, gene expression, collective cell migration and differentiation. Importantly, gene mutations of adhesion receptors can cause developmental disorders and different diseases. Quantitative methods to measure cell adhesion are therefore necessary to understand how cells regulate cell-cell adhesion during development and how aberrations in cell-cell adhesion contribute to disease. Different in vitro adhesion assays have been developed in the past, but not all of them are suitable to study developmentally-related cell-cell adhesion processes, which usually require working with low numbers of primary cells. In this review, we provide an overview of different in vitro techniques to study cell-cell adhesion during development, including a semi-quantitative cell flipping assay, and quantitative single-cell methods based on atomic force microscopy (AFM)-based single-cell force spectroscopy (SCFS) or dual micropipette aspiration (DPA). Furthermore, we review applications of Förster resonance energy transfer (FRET)-based molecular tension sensors to visualize intracellular mechanical forces acting on cell adhesion sites. Finally, we describe a recently introduced method to quantitate cell-generated forces directly in living tissues based on the deformation of oil microdroplets functionalized with adhesion receptor ligands. Together, these techniques provide a comprehensive toolbox to characterize different cell-cell adhesion phenomena during development. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Development of quality-by-design analytical methods.

    PubMed

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities.

  9. Developing new online calibration methods for multidimensional computerized adaptive testing.

    PubMed

    Chen, Ping; Wang, Chun; Xin, Tao; Chang, Hua-Hua

    2017-02-01

    Multidimensional computerized adaptive testing (MCAT) has received increasing attention over the past few years in educational measurement. Like all other formats of CAT, item replenishment is an essential part of MCAT for its item bank maintenance and management, which governs retiring overexposed or obsolete items over time and replacing them with new ones. Moreover, calibration precision of the new items will directly affect the estimation accuracy of examinees' ability vectors. In unidimensional CAT (UCAT) and cognitive diagnostic CAT, online calibration techniques have been developed to effectively calibrate new items. However, there has been very little discussion of online calibration in MCAT in the literature. Thus, this paper proposes new online calibration methods for MCAT based upon some popular methods used in UCAT. Three representative methods, Method A, the 'one EM cycle' method and the 'multiple EM cycles' method, are generalized to MCAT. Three simulation studies were conducted to compare the three new methods by manipulating three factors (test length, item bank design, and level of correlation between coordinate dimensions). The results showed that all the new methods were able to recover the item parameters accurately, and the adaptive online calibration designs showed some improvements compared to the random design under most conditions.
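    The fixed-ability maximum-likelihood idea behind Method A can be illustrated in a deliberately simplified unidimensional Rasch setting. The paper itself works with multidimensional models; the simulation below is only a sketch under that simplifying assumption, with hypothetical parameter values.

```python
import math
import random

def calibrate_difficulty(thetas, responses, iters=25):
    """Method A in miniature: hold examinees' ability estimates fixed and
    fit a new item's Rasch difficulty b by Newton-Raphson maximum
    likelihood, with P(correct) = 1 / (1 + exp(b - theta))."""
    b = 0.0
    for _ in range(iters):
        p = [1.0 / (1.0 + math.exp(b - t)) for t in thetas]
        grad = sum(pi - x for pi, x in zip(p, responses))  # d(logL)/db
        info = sum(pi * (1.0 - pi) for pi in p)            # Fisher information
        b += grad / info                                   # Newton step
    return b

# Simulate 2,000 examinees answering a new item of true difficulty 0.7
random.seed(1)
true_b = 0.7
thetas = [random.gauss(0.0, 1.0) for _ in range(2000)]
responses = [1 if random.random() < 1.0 / (1.0 + math.exp(true_b - t)) else 0
             for t in thetas]
b_hat = calibrate_difficulty(thetas, responses)
```

    The multidimensional generalizations in the paper replace the scalar theta with an ability vector and the single difficulty with an item-parameter vector, but the fix-abilities-then-maximize structure is the same.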

  10. Progress in Development of Methods in Bone Densitometry

    NASA Technical Reports Server (NTRS)

    Whedon, G. D.; Neumann, William F.; Jenkins, Dale W.

    1966-01-01

    The effects of weightlessness and decreased activity on the astronaut's musculoskeletal system during prolonged space flight missions are of concern to NASA. This problem was anticipated from the knowledge that human subjects lose significant quantities of calcium from the skeleton during periods of bedrest, immobilization, and water immersion. An accurate method of measurement of the changes in mineral content of the skeleton is required not only in the space program but also in the biological, medical, and dental fields for mineral metabolism studies and for studying various pathological conditions of the skeleton and teeth. This method is a difficult one requiring the coordinated efforts of physiologists, biophysicists, radiologists, and clinicians. The densitometry methods reported in this conference which have been used or are being developed include X-ray, beta-excited X-rays, radioisotopes, sonic vibration, and neutron activation analysis. Studies in the Gemini, Biosatellite, and Apollo flights use the X-ray bone densitometry method, which requires making X-rays before and after the flights. An in-flight method of bone densitometry would be valuable, and use of radioisotope sources has been suggested. Many advances in bone densitometry have been made in the last five years, and the urgency of the requirement makes this working conference timely and valuable. In such a rapidly developing field with investigators working independently in a variety of scientific disciplines, a working conference is of great value in exchanging information and ideas, critically evaluating approaches and methods, and pointing out new research pathways.

  11. Development of standard method performance requirements for biological threat agent detection methods.

    PubMed

    Coates, Scott G; Brunelle, Sharon L; Davenport, Matthew G

    2011-01-01

    Standards and third-party testing are necessary to demonstrate the performance and limitations of biological threat agent (biothreat) detection technologies to allow appropriate response actions by end-users and responders. In order to address this need, the Department of Homeland Security, Science and Technology Directorate has funded AOAC INTERNATIONAL to develop standards and perform conformity assessment. AOAC formed the Stakeholder Panel on Agent Detection Assays to develop consensus performance criteria (standard method performance requirements; SMPRs) for methods that detect biothreats. This paper documents the development of the first five biothreat SMPRs, including the voluntary consensus process, the components of an SMPR, the use of SMPRs in developing validation protocols, and a description of the development efforts and considerations for each of the current SMPRs.

  12. Development of electron moire method using a scanning electron microscope

    NASA Astrophysics Data System (ADS)

    Kishimoto, Satoshi; Egashira, Mitsuru; Shinya, Norio

    1991-12-01

    A new moire method using a scanning electron microscope (SEM) for the measurement of micro-deformation has been developed. This new method makes it possible to observe the moire fringe pattern and SEM image at the same time. In this method, a fine microgrid prepared by electron lithography is used as a model grid, and scanning exposure of the electron beam in the SEM is used as a master grid. The exposure of the electron beam on the specimen with the model grid produces moire fringes of bright and dark lines formed by differing amounts of secondary electrons. The resulting moire fringe pattern is fine and clear enough to measure the strain distribution in a small area. By this method, concentrated strains around a small hole in polyimide resin specimens and also inhomogeneous micro-deformations such as grain boundary sliding in copper specimens were measured with high accuracy.
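    The geometry behind reading strain from such fringes can be sketched with the standard parallel-grating relations of geometric moire; these are textbook formulas, not taken from this paper, and the numbers are hypothetical.

```python
def fringe_spacing(p_ref, p_spec):
    """Spacing of parallel moire fringes formed by two gratings of
    slightly different pitch: d = p_ref * p_spec / |p_spec - p_ref|."""
    return p_ref * p_spec / abs(p_spec - p_ref)

def moire_strain(grid_pitch, spacing):
    """Uniform strain implied by the fringes: for small strains,
    strain ~ grid pitch / fringe spacing."""
    return grid_pitch / spacing

# A 1.0 um reference grating against a specimen grating stretched by 1%:
d = fringe_spacing(1.0, 1.01)   # about 101 um between fringes
eps = moire_strain(1.0, d)      # about 0.0099, close to the true 0.01
```

    The fine microgrids used in the SEM method make the pitch mismatch, and hence the fringe spacing, sensitive to very small local strains.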

  13. REVIEW: Development of methods for body composition studies

    NASA Astrophysics Data System (ADS)

    Mattsson, Sören; Thomas, Brian J.

    2006-07-01

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease.
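    Two of the classical determinations the review covers, fat from body density and total body water by dilution, can be made concrete. Siri's two-compartment equation is one widely used density-to-fat conversion; the review itself does not prescribe this particular equation, so treat the sketch as illustrative.

```python
def siri_percent_fat(density):
    """Siri's two-compartment equation: percent body fat from whole-body
    density in g/cm^3 (densitometry), %fat = 495 / density - 450."""
    return 495.0 / density - 450.0

def total_body_water(tracer_dose, equilibrium_conc):
    """Dilution principle: total body water = administered tracer dose
    divided by its equilibrium concentration in a body-fluid sample."""
    return tracer_dose / equilibrium_conc

# On Siri's scale a body density of 1.10 g/cm^3 corresponds to ~0% fat,
# and 1.00 g/cm^3 to 45% fat.
lean = siri_percent_fat(1.10)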

  14. Quantifying nonhomogeneous colors in agricultural materials part I: method development.

    PubMed

    Balaban, M O

    2008-11-01

    Measuring the color of food and agricultural materials using machine vision (MV) has advantages not available by other measurement methods such as subjective tests or use of color meters. The perception of consumers may be affected by the nonuniformity of colors. For relatively uniform colors, average color values similar to those given by color meters can be obtained by MV. For nonuniform colors, various image analysis methods (color blocks, contours, and "color change index"[CCI]) can be applied to images obtained by MV. The degree of nonuniformity can be quantified, depending on the level of detail desired. In this article, the development of the CCI concept is presented. For images with a wide range of hue values, the color blocks method quantifies well the nonhomogeneity of colors. For images with a narrow hue range, the CCI method is a better indicator of color nonhomogeneity.
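    Balaban's color change index (CCI) is defined in the article itself; as a generic illustration of quantifying color nonhomogeneity from machine-vision pixel data, one can compute the circular standard deviation of pixel hues. This is a hypothetical stand-in metric, not the CCI.

```python
import colorsys
import math

def hue_nonuniformity(pixels):
    """Circular standard deviation of pixel hues: 0 for a perfectly
    uniform color, larger for more nonhomogeneous images.
    `pixels` is a list of 8-bit (r, g, b) tuples."""
    sin_sum = cos_sum = 0.0
    for r, g, b in pixels:
        h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        angle = 2.0 * math.pi * h        # hue is periodic, treat as an angle
        sin_sum += math.sin(angle)
        cos_sum += math.cos(angle)
    resultant = math.hypot(sin_sum, cos_sum) / len(pixels)
    return math.sqrt(max(0.0, -2.0 * math.log(max(resultant, 1e-12))))

uniform = [(255, 0, 0)] * 100                        # all red
mottled = [(255, 0, 0)] * 50 + [(0, 255, 0)] * 50    # half red, half green
```

    As in the article's comparison of methods, a hue-based statistic like this is most informative when the image spans a narrow hue range; wide hue ranges are better summarized block by block.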

  15. Methods comparison for microsatellite marker development: Different isolation methods, different yield efficiency

    NASA Astrophysics Data System (ADS)

    Zhan, Aibin; Bao, Zhenmin; Hu, Xiaoli; Lu, Wei; Hu, Jingjie

    2009-06-01

    Microsatellite markers have become one of the most important molecular tools used in various research fields. A large number of microsatellite markers are required for whole genome surveys in molecular ecology, quantitative genetics and genomics. Therefore, it is necessary to select several versatile, low-cost, efficient and time- and labor-saving methods to develop a large panel of microsatellite markers. In this study, we used the Zhikong scallop (Chlamys farreri) as the target species to compare the efficiency of five methods derived from three strategies for microsatellite marker development. The results showed that the strategy of constructing a small-insert genomic DNA library resulted in poor efficiency, while the microsatellite-enriched strategy greatly improved the isolation efficiency. Although the public database mining strategy is time- and cost-saving, it is difficult to obtain a large number of microsatellite markers this way, mainly due to the limited sequence data of non-model species deposited in public databases. Based on the results of this study, we recommend two methods, the microsatellite-enriched library construction method and the FIASCO-colony hybridization method, for large-scale microsatellite marker development. Both methods were derived from the microsatellite-enriched strategy. The experimental results obtained from Zhikong scallop also provide a reference for microsatellite marker development in other species with large genomes.

  16. Organic analysis and analytical methods development: FY 1995 progress report

    SciTech Connect

    Clauss, S.A.; Hoopes, V.; Rau, J.

    1995-09-01

    This report describes the status of organic analyses and of the development of analytical methods to account for the organic components in Hanford waste tanks, with particular emphasis on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-103 (Tank 103-SY). The analytical data are to serve as an example of the status of methods development and application. Samples of the convective and nonconvective layers from Tank 103-SY were analyzed for total organic carbon (TOC). The TOC value obtained for the nonconvective layer using the hot persulfate method was 10,500 µg C/g. The TOC value obtained from samples of Tank 101-SY was 11,000 µg C/g. The average value for the TOC of the convective layer was 6400 µg C/g. Chelator and chelator fragments in Tank 103-SY samples were identified using derivatization gas chromatography/mass spectrometry (GC/MS). Organic components were quantified using GC/flame ionization detection. Major components in both the convective and nonconvective-layer samples include ethylenediaminetetraacetic acid (EDTA), nitrilotriacetic acid (NTA), succinic acid, nitrosoiminodiacetic acid (NIDA), citric acid, and ethylenediaminetriacetic acid (ED3A). Preliminary results also indicate the presence of C16 and C18 carboxylic acids in the nonconvective-layer sample. Oxalic acid was one of the major components in the nonconvective layer as determined by derivatization GC/flame ionization detection.

  17. Method Development and Monitoring of Cyanotoxins in Water ...

    EPA Pesticide Factsheets

    Increasing occurrence of cyanobacterial harmful algal blooms (HABs) in ambient waters has become a worldwide concern. Numerous cyanotoxins can be produced during HAB events which are toxic to animals and humans. Validated standardized methods that are rugged, selective and sensitive are needed for these cyanotoxins in drinking and ambient waters. EPA Drinking Water Methods 544 (six microcystins [MCs] and nodularin) and 545 (cylindrospermopsin [CYL] and anatoxin-a [ANA]) have been developed using liquid chromatography/tandem mass spectrometry (LC/MS/MS). This presentation will describe the adaptation of Methods 544 and 545 to ambient waters and application of these ambient water methods to seven bodies of water across the country with visible cyanobacterial blooms.Several changes were made to Method 544 to accommodate the increased complexity of ambient water. The major changes were to reduce the sample volume from 500 to 100 mL for ambient water analyses and to incorporate seven additional MCs in an effort to capture data for more MC congeners in ambient waters. The major change to Method 545 for ambient water analyses was the addition of secondary ion transitions for each of the target analytes for confirmation purposes. Both methods have been ruggedly tested in bloom samples from multiple bodies of water, some with multiple sample locations and sampling days. For ambient water bloom samples spiked with MCs (>800 congener measurements), 97% of the measurements

  18. Quantitative methods for developing C2 system requirement

    SciTech Connect

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods are used in a task analysis and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefits estimates.
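    The paper's quantitative scoring scheme is its own; as a hedged illustration of the general idea of scoring tasks for automation feasibility and payoff, one might combine feasibility, benefit, and relative cost into a single rank. The task names and weights below are hypothetical.

```python
def automation_ranking(tasks):
    """Rank candidate tasks for automation by a simple payoff score:
    feasibility * benefit / cost (higher is better). Illustrative only."""
    scored = sorted(tasks,
                    key=lambda t: t["feasibility"] * t["benefit"] / t["cost"],
                    reverse=True)
    return [t["name"] for t in scored]

tasks = [
    {"name": "terrain analysis",  "feasibility": 0.90, "benefit": 8, "cost": 3},
    {"name": "obstacle planning", "feasibility": 0.70, "benefit": 9, "cost": 2},
    {"name": "report formatting", "feasibility": 0.95, "benefit": 4, "cost": 1},
]
ranking = automation_ranking(tasks)   # highest payoff first
```

    A real analysis would derive the scores from the task analysis and cost-benefit estimates the paper describes, rather than assigning them directly.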

  19. Quantitative methods for developing C2 system requirement

    SciTech Connect

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods are used in a task analysis and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefits estimates.

  20. Analytical Failure Prediction Method Developed for Woven and Braided Composites

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2003-01-01

    Historically, advances in aerospace engine performance and durability have been linked to improvements in materials. Recent developments in ceramic matrix composites (CMCs) have led to increased interest in CMCs to achieve revolutionary gains in engine performance. The use of CMCs promises many advantages for advanced turbomachinery engine development and may be especially beneficial for aerospace engines. The most beneficial aspects of CMC materials may be their ability to maintain strength at over 2500 °F, their internal material damping, and their relatively low density. Ceramic matrix composites reinforced with two-dimensional woven and braided fabric preforms are being considered for NASA's next-generation reusable rocket turbomachinery applications (for example, see the preceding figure). However, the architecture of a textile composite is complex, and therefore, the parameters controlling its strength properties are numerous. This necessitates the development of engineering approaches that combine analytical methods with limited testing to provide effective, validated design analyses for the textile composite structures development.

  1. [Descartes' influence on the development of the anatomoclinical method].

    PubMed

    González Hernández, A; Domínguez Rodríguez, M V; Fabre Pi, O; Cubero González, A

    2010-01-01

    The development of the anatomical-clinical method was a huge advance for modern medicine since it revealed a new approach to understanding diagnostic procedures. This change in medical thinking towards a more scientific basis has gradually evolved over several centuries, reaching its brilliant zenith with the contributions of the French school. There are certain similarities between the guidelines of the anatomical-clinical method and René Descartes' philosophical principles, so it is fair to consider him as one of the major precursors in this new line of thinking that definitely influenced the historical course of medicine.

  2. Development of target allocation methods for LAMOST focal plate

    NASA Astrophysics Data System (ADS)

    Yuan, Hailong; Zhang, Haotong; Zhang, Yanxia; Lei, Yajuan; Dong, Yiqiao

    2014-01-01

    We first introduce the primary target allocation requirements and restrictions for the parallel-controlled multiple-fiber system used in the LAMOST spectroscopic survey. The fiber positioner anti-collision model is incorporated. Then several target allocation methods and their features are discussed in detail, including a network flow algorithm, high priority for fiber units holding fewer targets, a target allocation algorithm for groups, a target allocation method for add-ons, and target reallocation. Their virtues and weaknesses are analyzed for various kinds of scientific research situations. Furthermore, an optimization concept using a simulated annealing algorithm (SAA) is developed to improve the fiber utilization efficiency.
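    The "high priority for fiber units holding fewer targets" heuristic can be sketched as a greedy assignment: fibers that can reach the fewest targets choose first, so scarce options are not consumed by more flexible fibers. This is an illustration only; the fiber and target identifiers are hypothetical, and the anti-collision constraints and scientific priorities of the real survey are omitted.

```python
def allocate_targets(reachable):
    """Greedy fewest-candidates-first allocation.

    reachable: {fiber_id: set of target_ids within the fiber's patrol area}
    Returns {fiber_id: assigned target_id, or None if nothing is left}.
    """
    assigned = {}
    taken = set()
    # Fibers with the fewest reachable targets are served first.
    for fiber in sorted(reachable, key=lambda f: len(reachable[f])):
        options = reachable[fiber] - taken
        if options:
            choice = min(options)   # deterministic stand-in for priority rules
            assigned[fiber] = choice
            taken.add(choice)
        else:
            assigned[fiber] = None
    return assigned

# f2 can only reach t1, so it chooses first and the flexible fibers adapt.
assigned = allocate_targets({"f1": {"t1", "t2"},
                             "f2": {"t1"},
                             "f3": {"t2", "t3"}})
```

    The network flow and simulated annealing approaches in the paper attack the same assignment problem globally rather than greedily.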

  3. Control of irradiated food: Recent developments in analytical detection methods.

    NASA Astrophysics Data System (ADS)

    Delincée, H.

    1993-07-01

    An overview is given of recent international efforts, i.e., the ADMIT programme (FAO/IAEA) and that of the BCR (EC), towards the development of analytical detection methods for radiation-processed foods. Some larger collaborative studies have already taken place, e.g. ESR of bones from chicken, pork, beef, frog legs and fish; thermoluminescence of insoluble minerals isolated from herbs and spices; GC analysis of long-chain hydrocarbons derived from the lipid fraction of chicken and other meats; and the microbiological APC/DEFT procedure for spices. These methods could soon be implemented in international standard protocols.

  4. Unstructured-grid methods development: Lessons learned

    NASA Technical Reports Server (NTRS)

    Batina, John T.

    1991-01-01

    The development of unstructured-grid methods for the solution of the equations of fluid flow is summarized, and some of the lessons learned are shared. The 3-D Euler equations are solved, with attention to spatial discretizations, temporal discretizations, and boundary conditions. An example calculation with an upwind implicit method using a CFL (Courant-Friedrichs-Lewy) number of infinity is presented for the Boeing 747 aircraft. The results were obtained in less than one hour of CPU time on a Cray-2 computer, demonstrating the speed and robustness of the present capability.

  5. Methods to assess Drosophila heart development, function and aging

    PubMed Central

    Ocorr, Karen; Vogler, Georg; Bodmer, Rolf

    2014-01-01

    In recent years the Drosophila heart has become an established model of many different aspects of human cardiac disease. This model has allowed identification of disease-causing mechanisms underlying congenital heart disease and cardiomyopathies and has permitted the study of genetic, metabolic and age-related contributions to heart function. In this review we discuss methods currently employed in the analysis of Drosophila heart structure and function, such as optical methods to infer heart function and performance, and electrophysiological and mechanical approaches to characterize cardiac tissue properties, and conclude with histological techniques used in the study of heart development and adult structure. PMID:24727147

  6. Development of motion control method for laser soldering process

    SciTech Connect

    Yerganian, S.S.

    1997-05-01

    Development of a method to generate the motion control data for sealing an electronic housing using laser soldering is described. The motion required to move the housing under the laser is a nonstandard application and was performed with a four-axis system using the timed data streaming mode capabilities of a Compumotor AT6400 indexer. A Microsoft Excel 5.0 spreadsheet (named Israuto.xls) was created to calculate the movement of the part under the laser, and macros were written into the spreadsheet to allow the user to easily create this data. A data verification method was developed for simulating the motion data. The geometry of the assembly was generated using Parametric Technology Corporation Pro/E version 15. This geometry was then converted using Pro/DADS version 3.1 from Computer Aided Design Software Inc. (CADSI), and the simulation was carried out using DADS version 8.0 from CADSI.

  7. Development of an in vitro cloning method for Cowdria ruminantium.

    PubMed Central

    Perez, J M; Martinez, D; Debus, A; Sheikboudou, C; Bensaid, A

    1997-01-01

    Cowdria ruminantium is a tick-borne rickettsia which causes severe disease in ruminants. All studies with C. ruminantium reported so far were carried out with stocks consisting of infective blood collected from reacting animals or from the same stocks propagated in vitro. Cloned isolates are needed to conduct studies on the immune response of the host, on the genetic diversity of the parasite, and on mechanisms of attenuation and the development of vaccines. A method of cloning based on the particular chlamydia-like life cycle of Cowdria was developed. Instead of cloning extracellular elementary bodies, it appeared more convenient to clone endothelial cells infected by one morula resulting from the infection of the cell by one elementary body of Cowdria. Two hundred and sixteen clones were obtained by limiting dilution of infected cells. The method was experimentally validated by comparing randomly amplified polymorphic DNA fingerprints from individual clones obtained from endothelial cell cultures coinfected with two different stocks of C. ruminantium. PMID:9302217

  8. ASPECTS: an automation-assisted SPE method development system.

    PubMed

    Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu

    2013-07-01

    A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combinations of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting them to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly reduces the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.

  9. In silico machine learning methods in drug development.

    PubMed

    Dobchev, Dimitar A; Pillai, Girinath G; Karelson, Mati

    2014-01-01

    Machine learning (ML) computational methods for predicting compounds with pharmacological activity and specific pharmacodynamic and ADMET (absorption, distribution, metabolism, excretion and toxicity) properties are being increasingly applied in drug discovery and evaluation. Recently, machine learning techniques such as artificial neural networks, support vector machines and genetic programming have been explored for predicting inhibitors, antagonists, blockers, agonists, activators and substrates of proteins related to specific therapeutic targets. These methods are particularly useful for screening compound libraries of diverse chemical structures and "noisy" or high-dimensional data to complement QSAR methods, and, in cases where the receptor 3D structure is unavailable, to complement structure-based methods. A variety of studies have demonstrated the potential of machine-learning methods for predicting compounds as potential drug candidates. The present review is intended to give an overview of the strategies and current progress in using machine learning methods for drug design, and of the potential of the respective model development tools. We also review a number of applications of machine learning algorithms organized by common classes of diseases.

  10. Methods to Develop Inhalation Cancer Risk Estimates for ...

    EPA Pesticide Factsheets

    This document summarizes the approaches and rationale for the technical and scientific considerations used to derive inhalation cancer risks for emissions of chromium and nickel compounds from electric utility steam generating units. The purpose of this document is to discuss the methods used to develop inhalation cancer risk estimates associated with emissions of chromium and nickel compounds from coal- and oil-fired electric utility steam generating units (EGUs) in support of EPA's recently proposed Air Toxics Rule.

  11. Development of Methods for Diagnostics of Discharges in Supersonic Flows

    DTIC Science & Technology

    2004-03-01

    Keywords: flow, diagnostic methods, electric probe, measurement circuit, propane-air mixture, ignition. Participating institution: Department of... 932-88-20. E-mail: dean@phys.msu.su / ershov@ph-elec.phys.msu.su. Partner: The European Office of Aerospace Research and Development (EOARD). Address... of flight. Gas discharges of various types may be a promising tool for solving this problem. The definition of the most effective

  12. Development of characteristic evaluation method on FR cycle system

    SciTech Connect

    Shinoda, Y.; Shiotani, H.; Hirao, K.

    2002-07-01

    The present report explains some results of the characteristic evaluation work on various FR cycle system concepts in the first phase of JNC's 'Feasibility Study on Commercialized Fast Reactor Cycle System' (from 1999 to March 2001). The development of the evaluation method is carried out for six criteria: economics, effective utilization of uranium resources, reduction of environmental impact, safety, proliferation resistance, and technological feasibility. (authors)

  13. Development of Methods for Diagnostics of Discharges in Supersonic Flows

    DTIC Science & Technology

    2001-09-01

    channel and quartz combustor tube and for installation of plasma generators. The centering and fixation of the quartz tube is provided by ribbon seals... unsteady parameters [1-3]. Under these conditions, the main advantage of the probe method (the possibility of local measurements with high temporal... reactions (including ionization and recombination) in the air plasma. The unsteady character of the plasma parameters suggests the necessity of developing

  14. SPF Full-scale emissions test method development status ...

    EPA Pesticide Factsheets

    This is a non-technical presentation that is intended to inform ASTM task group members about our intended approach to full-scale emissions testing that includes the application of spray foam in an environmental chamber. The presentation describes the approach to emissions characterization, types of measurement systems employed, and expected outcomes from the planned tests. Purpose of this presentation is to update the ASTM D22.05 work group regarding status of our full-scale emissions test method development.

  15. Development of modelling method selection tool for health services management: From problem structuring methods to modelling and simulation methods

    PubMed Central

    2011-01-01

    Background There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. Aim The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. Methods This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). Results The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce let alone money and time. Conclusions A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection. PMID:21595946

  16. Viscous wing theory development. Volume 1: Analysis, method and results

    NASA Technical Reports Server (NTRS)

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  17. Development of fluorescent methods for DNA methyltransferase assay

    NASA Astrophysics Data System (ADS)

    Li, Yueying; Zou, Xiaoran; Ma, Fei; Tang, Bo; Zhang, Chun-yang

    2017-03-01

    DNA methylation modified by DNA methyltransferase (MTase) plays an important role in regulating gene transcription, cell growth and proliferation. The aberrant DNA MTase activity may lead to a variety of human diseases including cancers. Therefore, accurate and sensitive detection of DNA MTase activity is crucial to biomedical research, clinical diagnostics and therapy. However, conventional DNA MTase assays often suffer from labor-intensive operations and time-consuming procedures. Alternatively, fluorescent methods have significant advantages of simplicity and high sensitivity, and have been widely applied for DNA MTase assay. In this review, we summarize the recent advances in the development of fluorescent methods for DNA MTase assay. These emerging methods include amplification-free and the amplification-assisted assays. Moreover, we discuss the challenges and future directions of this area.

  18. Carotenoid extraction methods: A review of recent developments.

    PubMed

    Saini, Ramesh Kumar; Keum, Young-Soo

    2018-02-01

    The versatile use of carotenoids in feed, food, cosmetic and pharmaceutical industries has emphasized the optimization of extraction methods to obtain the highest recovery. The choice of method for carotenoid extraction from food matrices is crucial, owing to the presence of diverse carotenoids with varied levels of polarity, and the presence of various physical and chemical barriers in the food matrices. This review highlights the theoretical aspects and recent developments of various conventional and nonconventional methods used for the extraction of carotenoids, including ultrasound-assisted extraction (UAE), pressurized liquid extraction (PLE), and supercritical fluid extraction (SFE). Recent applications of non-toxic and environmentally safe solvents (green solvents) and ionic liquids (IL) for carotenoid extraction are also described. Additionally, future research challenges in the context of carotenoids extractions are also identified. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Development of gait segmentation methods for wearable foot pressure sensors.

    PubMed

    Crea, S; De Rossi, S M M; Donati, M; Reberšek, P; Novak, D; Vitiello, N; Lenzi, T; Podobnik, J; Munih, M; Carrozza, M C

    2012-01-01

    We present an automated segmentation method based on the analysis of plantar pressure signals recorded from two synchronized wireless foot insoles. Given the strict limits on computational power and power consumption typical of wearable electronic components, our aim is to investigate the capability of a Hidden Markov Model machine-learning method to detect gait phases with different levels of complexity in the processing of the wearable pressure sensor signals. Three different datasets are therefore developed: raw voltage values, calibrated sensor signals, and a calibrated estimation of the total ground reaction force and the position of the plantar center of pressure. The method is tested on a pool of 5 healthy subjects through leave-one-out cross-validation. The results show high classification performance achieved using estimated biomechanical variables, averaging 96%. Calibrated signals and raw voltage values show higher delays and dispersions in phase-transition detection, suggesting lower reliability for online applications.
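
    The Hidden Markov Model phase detection described in this abstract can be sketched with a standard Viterbi decoder over a toy two-phase gait model (stance/swing) observing quantized pressure levels. The transition and emission probabilities below are invented for illustration and are not taken from the paper.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for a discrete-observation HMM."""
    # Probability of the best path ending in each state at t = 0.
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for o in obs[1:]:
        V.append({})
        new_path = {}
        for s in states:
            # Best predecessor for state s given observation o.
            prob, prev = max((V[-2][p] * trans_p[p][s] * emit_p[s][o], p)
                             for p in states)
            V[-1][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]

# Toy gait model: stance shows high plantar pressure, swing shows low.
states = ("stance", "swing")
start = {"stance": 0.5, "swing": 0.5}
trans = {"stance": {"stance": 0.8, "swing": 0.2},
         "swing": {"stance": 0.2, "swing": 0.8}}
emit = {"stance": {"high": 0.9, "low": 0.1},
        "swing": {"high": 0.1, "low": 0.9}}
phases = viterbi(["high", "high", "low", "low"], states, start, trans, emit)
# phases == ["stance", "stance", "swing", "swing"]
```

    A real insole pipeline would quantize the calibrated pressure or center-of-pressure signals into the observation alphabet and run the same decoding online.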

  20. Development of acoustic sniper localization methods and models

    NASA Astrophysics Data System (ADS)

    Grasing, David; Ellwood, Benjamin

    2010-04-01

    A method capable of providing situational awareness (SA) of sniper fire from small arms is presented. SA information is extracted by exploiting two distinct sounds created by a small arms discharge: the muzzle blast (created when the bullet leaves the barrel of the gun) and the shockwave (the sound created by a supersonic bullet). The direction of arrival associated with the muzzle blast always points toward the shooter. Range can be estimated from the muzzle blast alone; however, at greater distances, geometric dilution of precision makes obtaining accurate range estimates difficult. To address this issue, additional information obtained from the shockwave is used to estimate range to the shooter. The focus of the paper is the development of a shockwave propagation model, the development of ballistics models (based on empirical measurements), and their subsequent application to methods of determining shooter position. Knowledge of the round's ballistics is required to estimate range to the shooter. Many existing methods rely on extracting information from the shockwave in an attempt to identify the round type and thus the ballistic model to use [1]. In our experience, this information becomes unreliable at greater distances or in high-noise environments. Our method differs from existing solutions in that classification of the round type is not required, making the proposed solution more robust. Additionally, we demonstrate that sufficient accuracy can be achieved without the need to classify the round.
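
    For the simplest geometry, a shot fired directly toward the sensor, the range follows from the arrival-time difference between the supersonic shockwave and the slower muzzle blast. The sketch below assumes a constant bullet speed and straight-line propagation, ignoring the ballistic deceleration that the paper's empirical models account for; the speeds are illustrative values, not the paper's data.

```python
def range_from_arrivals(dt, c=343.0, v=800.0):
    """Shooter range R from dt = t_muzzle_blast - t_shockwave.

    Head-on geometry: the shockwave arrives with the bullet at R/v and the
    muzzle blast at R/c, so dt = R*(1/c - 1/v).
    c: speed of sound (m/s); v: assumed constant supersonic bullet speed (m/s).
    """
    return dt / (1.0 / c - 1.0 / v)

# Forward check: a shooter 500 m away produces dt = 500/343 - 500/800 s.
R = 500.0
dt = R / 343.0 - R / 800.0
estimate = range_from_arrivals(dt)  # recovers ~500.0 m
```

    In practice the bullet slows along its path, which is why the paper couples this timing information to an empirical ballistics model rather than a constant speed.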

  1. Development of Analysis Methods for Designing with Composites

    NASA Technical Reports Server (NTRS)

    Madenci, E.

    1999-01-01

    The project involved the development of new analysis methods to achieve efficient design of composite structures. We developed a complex variational formulation to analyze the in-plane and bending coupling response of an unsymmetrically laminated plate with an elliptical cutout subjected to arbitrary edge loading, as shown in Figure 1. This formulation utilizes four independent complex potentials that satisfy the coupled in-plane and bending equilibrium equations, thus eliminating the area integrals from the strain energy expression. The solution to a finite-geometry laminate under arbitrary loading is obtained by minimizing the total potential energy function and solving for the unknown coefficients of the complex potentials. The validity of this approach is demonstrated by comparison with finite element analysis predictions for a laminate with an inclined elliptical cutout under bi-axial loading. The geometry and loading of this laminate, with a lay-up of [-45/45], are shown in Figure 2. The deformed configuration shown in Figure 3 reflects the presence of bending-stretching coupling. The validity of the present method is established by comparing the out-of-plane deflections along the boundary of the elliptical cutout from the present approach with those of the finite element method. The comparison shown in Figure 4 indicates remarkable agreement. The details of this method are described in a manuscript by Madenci et al. (1998).

  2. The ReaxFF method - new applications and developments

    NASA Astrophysics Data System (ADS)

    van Duin, Adri

    The ReaxFF method provides a highly transferable simulation method for atomistic-scale simulations of chemical reactions at the nanosecond and nanometer scale. It combines concepts of bond-order-based potentials with a polarizable charge distribution. Since its initial development for hydrocarbons in 2001, we have found that this concept is transferable to elements all across the periodic table, including all first-row elements, metals, ceramics and ionic materials. For all these elements and associated materials we have demonstrated that ReaxFF can reproduce quantum-mechanics-based structures, reaction energies and reaction barriers with reasonable accuracy, enabling the method to predict reaction kinetics in complicated, multi-material environments at relatively modest computational expense. This presentation will describe the current concepts of the ReaxFF method and the current status of the various ReaxFF codes, including parallel implementations and recently developed hybrid Grand Canonical Monte Carlo options, which significantly increase its application areas. We will also present an overview of recent applications to a range of materials of increasing complexity, with a focus on combustion, biomaterials, batteries, tribology and catalysis.

  3. Progress and development of analytical methods for gibberellins.

    PubMed

    Pan, Chaozhi; Tan, Swee Ngin; Yong, Jean Wan Hong; Ge, Liya

    2017-01-01

    Gibberellins, as a group of phytohormones, exhibit a wide variety of bio-functions in plant growth and development and have been used to increase crop yields. Many analytical procedures, therefore, have been developed for determining the types and levels of endogenous and exogenous gibberellins. As plant tissues contain gibberellins in trace amounts (usually at the level of nanograms per gram fresh weight or even lower), the sample pre-treatment steps (extraction, pre-concentration, and purification) for gibberellins are reviewed in detail. The primary focus of this comprehensive review is on the various analytical methods designed to meet the requirements for gibberellin analyses in complex matrices, with particular emphasis on high-throughput analytical methods, such as gas chromatography, liquid chromatography, and capillary electrophoresis, mostly combined with mass spectrometry. The advantages and drawbacks of each described analytical method are discussed. The overall aim of this review is to provide a comprehensive and critical view of the different analytical methods nowadays employed to analyze gibberellins in complex sample matrices and their foreseeable trends. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. The development of CFD methods for rotor applications

    NASA Technical Reports Server (NTRS)

    Caradonna, F. X.; Mccroskey, W. J.

    1988-01-01

    The optimum design of the advancing helicopter rotor for high-speed forward flight always involves a tradeoff between transonic and stall limitations. However, the rotor industry was preoccupied primarily with stall until well into the 1970s. This emphasis on stall resulted from the prevalent use of low-solidity rotors with rather outdated airfoil sections. The use of cambered airfoil sections and higher-solidity rotors substantially reduced stall and revealed advancing-blade transonic flow to be a more persistent limitation on high-speed rotor performance. Work in this area was spurred not only by operational necessity but also by the development of a tool for the prediction of these flows: computational fluid dynamics. The development of computational fluid dynamics for these rotor problems was a major Army and NASA achievement. This work is now being extended to other rotor flow problems. The developments are outlined.

  5. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    PubMed

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.
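
    The mechanics of a comparison-and-selection matrix can be sketched as a simple attribute filter. The method names and attribute values below are hypothetical placeholders, not the paper's actual 28-method characterisation; they only show how a user's requirements are matched against a matrix of method attributes.

```python
# Hypothetical slice of a method-characterisation matrix.
METHODS = {
    "discrete-event simulation": {"stage": "operational",
                                  "output": "quantitative",
                                  "data_need": "high"},
    "system dynamics":           {"stage": "strategic",
                                  "output": "quantitative",
                                  "data_need": "medium"},
    "soft systems methodology":  {"stage": "problem structuring",
                                  "output": "qualitative",
                                  "data_need": "low"},
}

def select_methods(requirements, matrix=METHODS):
    """Return methods whose attributes satisfy every stated requirement."""
    return [name for name, attrs in matrix.items()
            if all(attrs.get(k) == v for k, v in requirements.items())]

candidates = select_methods({"output": "qualitative"})
# candidates == ["soft systems methodology"]
```

    A fuller version would score partial matches and report the input resources (time, money, knowledge, data) each candidate demands, as the paper's matrices do.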

  6. User Experience Evaluation Methods in Product Development (UXEM'09)

    NASA Astrophysics Data System (ADS)

    Roto, Virpi; Väänänen-Vainio-Mattila, Kaisa; Law, Effie; Vermeeren, Arnold

    High quality user experience (UX) has become a central competitive factor of product development in mature consumer markets [1]. Although the term UX originated from industry and is a widely used term also in academia, the tools for managing UX in product development are still inadequate. A prerequisite for designing delightful UX in an industrial setting is to understand both the requirements tied to the pragmatic level of functionality and interaction and the requirements pertaining to the hedonic level of personal human needs, which motivate product use [2]. Understanding these requirements helps managers set UX targets for product development. The next phase in a good user-centered design process is to iteratively design and evaluate prototypes [3]. Evaluation is critical for systematically improving UX. In many approaches to UX, evaluation basically needs to be postponed until the product is fully or at least almost fully functional. However, in an industrial setting, it is very expensive to find the UX failures only at this phase of product development. Thus, product development managers and developers have a strong need to conduct UX evaluation as early as possible, well before all the parts affecting the holistic experience are available. Different types of products require evaluation on different granularity and maturity levels of a prototype. For example, due to its multi-user characteristic, a community service or an enterprise resource planning system requires a broader scope of UX evaluation than a microwave oven or a word processor that is meant for a single user at a time. Before systematic UX evaluation can be taken into practice, practical, lightweight UX evaluation methods suitable for different types of products and different phases of product readiness are needed. A considerable amount of UX research is still about the conceptual frameworks and models for user experience [4]. 
Besides, applying existing usability evaluation methods (UEMs) without

  7. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    ERIC Educational Resources Information Center

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  8. The NASA digital VGH program. Exploration of methods and final results. Volume 1: Development of methods

    NASA Technical Reports Server (NTRS)

    Crabill, Norman L.

    1989-01-01

    Two hundred hours of Lockheed L 1011 digital flight data recorder data taken in 1973 were used to develop methods and procedures for obtaining statistical data useful for updating airliner airworthiness design criteria. Five thousand hours of additional data taken in 1978 to 1982 are reported in volumes 2, 3, 4 and 5.

  9. Development of redesign method of production system based on QFD

    NASA Astrophysics Data System (ADS)

    Kondoh, Shinsuke; Umeda, Yasusi; Togawa, Hisashi

    In order to keep up with a rapidly changing market environment, rapid and flexible redesign of production systems is quite important. For effective and rapid redesign, a redesign support system is needed. To this end, this paper proposes a redesign method for production systems based on Quality Function Deployment (QFD). The method represents a designer's intention in the form of QFD, collects experts' knowledge as "Production Method (PM) modules," and formulates redesign guidelines as seven redesign operations to help a designer identify improvement ideas in a systematic manner. This paper also illustrates a redesign support tool we have developed based on this method and demonstrates its feasibility with a practical example: the production system of a contact probe. The result from this example shows that a novice designer can achieve cost reductions comparable to those of veteran designers. From this, we conclude that our redesign method is effective and feasible for supporting the redesign of a production system.

  10. Development of an optomechanical statistical tolerancing method for cost reduction

    NASA Astrophysics Data System (ADS)

    Lamontagne, Frédéric; Doucet, Michel

    2012-10-01

    Optical systems generally require highly precise positioning of optical components, resulting in elevated manufacturing cost. The optomechanical tolerance analysis is usually performed by the optomechanical engineer using his or her personal knowledge of manufacturing precision capabilities. Worst-case or root-sum-square (RSS) tolerance calculation methods are frequently used for their simplicity. In most situations, the chance of encountering the worst-case error is statistically almost nil. On the other hand, the RSS method is generally not an accurate representation of reality, since it assumes centered normal distributions. Moreover, the RSS method is not suitable for multidimensional tolerance analysis that combines translational and rotational variations. An optomechanical tolerance analysis method based on Monte Carlo simulation has been developed at INO to reduce the overdesign caused by pessimistic manufacturing and assembly error predictions. Manufacturing error data have been compiled and computed for use as input to the optomechanical Monte Carlo tolerance model. This results in a more realistic prediction of the optical component positioning errors (decenter, tilt and air gap). Calculated error probabilities were validated on a real lens barrel assembly using a high-precision centering machine. The results show that the statistical error prediction is more accurate and can significantly relax the required precision in comparison to the worst-case method. Manufacturing, inspection, adjustment mechanism and alignment costs can then be reduced considerably.
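
    The contrast between worst-case, RSS, and Monte Carlo stack-up can be sketched for a one-dimensional decenter budget. This is a generic illustration, not INO's tool; the tolerance values and the 3-sigma centered-normal error model are assumptions made for the example.

```python
import math
import random

def stackup(tols, n=100_000, seed=1):
    """Compare worst-case, RSS, and Monte Carlo estimates of a 1-D error stack.

    tols: list of symmetric tolerances (e.g. component decenters, in mm).
    Monte Carlo draws each error from N(0, tol/3) (tol treated as 3-sigma)
    and reports the 99.73rd percentile of the absolute total error.
    """
    worst = sum(tols)                          # all errors aligned at maximum
    rss = math.sqrt(sum(t * t for t in tols))  # root-sum-square combination
    rng = random.Random(seed)
    totals = sorted(abs(sum(rng.gauss(0.0, t / 3.0) for t in tols))
                    for _ in range(n))
    mc = totals[int(0.9973 * n)]               # ~3-sigma coverage of the stack
    return worst, rss, mc

worst, rss, mc = stackup([0.02, 0.01, 0.015, 0.01])
# Both RSS and Monte Carlo predict a much smaller stack than the worst case.
```

    With centered normal inputs the Monte Carlo percentile lands near the RSS value; the advantage of the Monte Carlo approach is that it also accepts measured, non-centered, non-normal distributions and multidimensional (translation plus rotation) error models, where RSS breaks down.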

  11. Development of NDE methods for hot gas filters.

    SciTech Connect

    Deemer, C.; Ellingson, W. A.; Koehl, E. R.; Lee, H.; Spohnholtz, T.; Sun, J. G.

    1999-07-21

    Ceramic hot gas candle filters are currently under development for hot gas particulate cleanup in advanced coal-based power systems. The ceramic materials for these filters include nonoxide monolithics, nonoxide-fiber-reinforced composites, and nonoxide reticulated foams. A concern is the lack of reliable data on which to base decisions for reusing or replacing hot gas filters during plant shutdowns. The work in this project is aimed at developing nondestructive evaluation (NDE) technology to allow detection, and determination of the extent, of life-limiting characteristics such as thermal fatigue, oxidation, damage from ash bridging such as localized cracking, damage from local burning, and elongation at elevated temperature. Although in-situ NDE methods are desirable in order to avoid disassembly of the candle filter vessels, the current vessel designs, the presence of filter cakes and possible ash bridging, and the state of NDE technology prevent this. Candle filter producers use a variety of NDE methods to ensure as-produced quality. While impact acoustic resonance offers initial promise for examining new as-produced filters and for detecting damage in some monolithic filters when removed from service, it presents difficulties in data interpretation, lacks localization capability, and its applicability to composites has yet to be demonstrated. Additional NDE technologies being developed and evaluated in this program, whose applicability to both monolithics and composites has been demonstrated, include (a) full-scale thermal imaging for analyzing thermal property variations; (b) fast, high-spatial-resolution X-ray imaging for detecting density variations and dimensional changes; (c) air-coupled ultrasonic methods for determining through-thickness compositional variations; and (d) acoustic emission technology with mechanical loading for detecting localized bulk damage. New and exposed clay-bonded SiC filters and CVI-SiC composite filters have been tested with

  12. Development of a novel vitrification method for chondrocyte sheets

    PubMed Central

    2013-01-01

    Background There is considerable interest in using cell sheets for the treatment of various lesions as part of regenerative medicine therapy. Cell sheets can be prepared in temperature-responsive culture dishes and applied to injured tissue. For example, cartilage-derived cell sheets are currently under preclinical testing for use in treatment of knee cartilage injuries. The additional use of cryopreservation technology could increase the range and practicality of cell sheet therapies. To date, however, cryopreservation of cell sheets has proved impractical. Results Here we have developed a novel and effective method for cryopreserving fragile chondrocyte sheets. We modified the vitrification method previously developed for cryopreservation of mammalian embryos to vitrify a cell sheet through use of a minimum volume of vitrification solution containing 20% dimethyl sulfoxide, 20% ethylene glycol, 0.5 M sucrose, and 10% carboxylated poly-L-lysine. The principal feature of our method is the coating of the cell sheet with a viscous vitrification solution containing permeable and non-permeable cryoprotectants prior to vitrification in liquid nitrogen vapor. This method prevented fracturing of the fragile cell sheet even after vitrification and rewarming. Both the macro- and microstructures of the vitrified cell sheets were maintained without damage or loss of major components. Cell survival in the vitrified sheets was comparable to that in non-vitrified samples. Conclusions We have shown here that it is feasible to vitrify chondrocyte cell sheets and that these sheets retain their normal characteristics upon thawing. The availability of a practical cryopreservation method should make a significant contribution to the effectiveness and range of applications of cell sheet therapy. PMID:23886356

  13. Development of an Immunoaffinity Method for Purification of Streptokinase

    PubMed Central

    Karimi, Zohreh; Babashamsi, Mohammad; Asgarani, Ezat; Salimi, Ali

    2012-01-01

    Background Streptokinase is a potent activator of the conversion of plasminogen to plasmin, the enzyme that can solubilize the fibrin network in blood clots. Streptokinase is currently used in clinical medicine as a thrombolytic agent and is naturally secreted by β-hemolytic streptococci. Methods To achieve an efficient purification, an immunoaffinity chromatography method was developed that could purify streptokinase in a single step with high yield. In the first stage, a CNBr-activated Sepharose 4B-lysine column was made to purify human blood plasminogen. The purified plasminogen was used to construct a column that could purify streptokinase. A rabbit was immunized with the purified streptokinase, and the anti-streptokinase (IgG) was purified on another streptokinase-substituted Sepharose 4B column. The immunoaffinity column was made by coupling the purified anti-streptokinase (IgG) to Sepharose 6MB-Protein A. The Escherichia coli (E. coli) BL21 (DE3) pLysS strain was transformed with the recombinant construct (the cloned streptokinase gene in the pGEX-4T-2 vector), and gene expression was induced with IPTG. The expressed protein was purified by immunoaffinity chromatography in a single step. Results The immunoaffinity column purified the recombinant GST-SK fusion protein to homogeneity. The purity of streptokinase was confirmed by SDS-PAGE as a single band of about 71 kDa, and its biological activity was determined in a specific streptokinase assay. The yield of the purification was about 94%. Conclusion This method of streptokinase purification is superior to previous conventional methods. PMID:23408770

  14. Developing integrated methods to address complex resource and environmental issues

    USGS Publications Warehouse

    Smith, Kathleen S.; Phillips, Jeffrey D.; McCafferty, Anne E.; Clark, Roger N.

    2016-02-08

    Introduction: This circular provides an overview of selected activities that were conducted within the U.S. Geological Survey (USGS) Integrated Methods Development Project, an interdisciplinary project designed to develop new tools and conduct innovative research requiring integration of geologic, geophysical, geochemical, and remote-sensing expertise. The project was supported by the USGS Mineral Resources Program, and its products and acquired capabilities have broad applications to missions throughout the USGS and beyond. In addressing challenges associated with understanding the location, quantity, and quality of mineral resources, and in investigating the potential environmental consequences of resource development, a number of field and laboratory capabilities and interpretative methodologies evolved from the project that have applications to traditional resource studies as well as to studies related to ecosystem health, human health, disaster and hazard assessment, and planetary science. New or improved tools and research findings developed within the project have been applied to other projects and activities. Specifically, geophysical equipment and techniques have been applied to a variety of traditional and nontraditional mineral- and energy-resource studies, military applications, environmental investigations, and applied research activities that involve climate change, mapping techniques, and monitoring capabilities. Diverse applied geochemistry activities provide a process-level understanding of the mobility, chemical speciation, and bioavailability of elements, particularly metals and metalloids, in a variety of environmental settings. Imaging spectroscopy capabilities maintained and developed within the project have been applied to traditional resource studies as well as to studies related to ecosystem health, human health, disaster assessment, and planetary science. Brief descriptions of capabilities and laboratory facilities and summaries of some

  15. [Development of a Drug Discovery Method Targeted to Stromal Tissue].

    PubMed

    Kamada, Haruhiko

    2016-01-01

    Several diseases are characterized by alterations in the molecular distribution of vascular structures, presenting the opportunity to use monoclonal antibodies for clinical therapies. This pharmaceutical strategy, often referred to as "vascular targeting", has promise in promoting the discovery and development of selective biological drugs to regulate angiogenesis-related diseases such as cancer. Various experimental approaches have been utilized to discover accessible vascular markers of health and disease at the protein level. Our group has developed a new chemical proteomics technology to identify and quantify accessible vascular proteins in normal organs and at disease sites. Our methodology relies on the perfusion of animal models with suitable ester derivatives of biotin, which react with accessible primary amine groups of proteins. This presentation reports biomedical applications based on vascular targeting strategies, as well as methodologies that have been used to discover new vascular targets. The identification of antigens located in the stromal tissue of pathological blood vessels may provide attractive targets for the development of antibody drugs. This method will also provide an efficient discovery platform that could lead to the development of novel antibody drugs.

  16. Development of Cross-Assembly Phage PCR-Based Methods ...

    EPA Pesticide Factsheets

    Technologies that can characterize human fecal pollution in environmental waters offer many advantages over traditional general-indicator approaches. However, many human-associated methods cross-react with non-human animal sources and lack suitable sensitivity for fecal source identification applications. The genome of a newly discovered bacteriophage (~97 kbp), the Cross-Assembly phage or “crAssphage”, assembled from a human gut metagenome DNA sequence library, is predicted to be highly abundant and to occur predominantly in human feces, suggesting that this double-stranded DNA virus may be an ideal human fecal pollution indicator. We report the development of two human-associated crAssphage endpoint PCR methods (crAss056 and crAss064). A shotgun strategy was employed in which 384 candidate primers were designed to cover ~41 kbp of the crAssphage genome deemed favorable for method development on the basis of a series of bioinformatics analyses. Candidate primers were subjected to three rounds of testing to evaluate assay optimization, specificity, limit of detection (LOD95), geographic variability, and performance in environmental water samples. The top two performing candidate primer sets exhibited 100% specificity (n = 70 individual samples from 8 different animal species), >90% sensitivity (n = 10 raw sewage samples from different geographic locations), an LOD95 of 0.01 ng/µL of total DNA per reaction, and successfully detected human fecal pollution in impaired envi
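As a rough illustration of how an LOD95 can be read off a serial-dilution experiment, the sketch below takes the lowest tested concentration at which at least 95% of PCR replicates amplify. The dilution series and replicate counts are hypothetical, and the paper's actual statistical treatment may differ:

```python
def lod95(series):
    """series: iterable of (concentration in ng/uL, n_detected, n_replicates).
    Returns the lowest tested concentration with a >= 95% detection rate,
    or None if no tested level qualifies."""
    qualifying = [c for c, detected, n in series if n > 0 and detected / n >= 0.95]
    return min(qualifying) if qualifying else None

# hypothetical 10-fold dilution series, 20 endpoint-PCR replicates per level
series = [(1.0, 20, 20), (0.1, 20, 20), (0.01, 19, 20), (0.001, 7, 20)]
print(lod95(series))  # 0.01
```

A probit or logistic fit across dilution levels is the more rigorous way to interpolate an exact 95% detection concentration; the step-wise rule above only picks among the tested levels.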

  17. [Development of analytical method for determination nicotine metabolites in urine].

    PubMed

    Piekoszewski, Wojciech; Florek, Ewa; Kulza, Maksymilian; Wilimowska, Jolanta; Loba, Urszula

    2009-01-01

    The assay of biomarkers in biological material is the most popular and reliable method for estimating exposure to tobacco smoke. Nicotine and its metabolites are among the most specific biomarkers of tobacco smoke; currently, the most often used are cotinine and trans-3'-hydroxycotinine. The aim of this study was to develop an easy and quick method for determining nicotine and its main metabolites by high-performance liquid chromatography, a technique available in most laboratories. Nicotine and its metabolites in urine (cotinine, trans-3'-hydroxycotinine, nornicotine, and nicotine N-oxide) were determined by high-performance liquid chromatography with UV detection (HPLC-UV). The compounds were extracted from urine by the liquid-liquid technique before being analysed by HPLC. The developed technique proved useful for assessing nicotine and its four metabolites in smokers, although further research is necessary. Further modification of the procedure is required, because interference of cotinine N-oxide with the matrix prevents its determination. Increasing the extraction efficiency for nicotine and nornicotine could enable determination in people exposed to environmental tobacco smoke (ETS). This study confirms other authors' observations that trans-3'-hydroxycotinine might be a predictor of tobacco smoke exposure equivalent to cotinine, although further studies are required.

  18. Development of DNA-based Identification methods to track the ...

    EPA Pesticide Factsheets

    The ability to track the identity and abundance of larval fish, which are ubiquitous during spawning season, may lead to a greater understanding of fish species distributions in Great Lakes nearshore areas including early-detection of invasive fish species before they become established. However, larval fish are notoriously hard to identify using traditional morphological techniques. While DNA-based identification methods could increase the ability of aquatic resource managers to determine larval fish composition, use of these methods in aquatic surveys is still uncommon and presents many challenges. In response to this need, we have been working with the U. S. Fish and Wildlife Service to develop field and laboratory methods to facilitate the identification of larval fish using DNA-meta-barcoding. In 2012, we initiated a pilot-project to develop a workflow for conducting DNA-based identification, and compared the species composition at sites within the St. Louis River Estuary of Lake Superior using traditional identification versus DNA meta-barcoding. In 2013, we extended this research to conduct DNA-identification of fish larvae collected from multiple nearshore areas of the Great Lakes by the USFWS. The species composition of larval fish generally mirrored that of fish species known from the same areas, but was influenced by the timing and intensity of sampling. Results indicate that DNA-based identification needs only very low levels of biomass to detect pre

  20. Development of nondestructive evaluation methods for structural ceramics.

    SciTech Connect

    Ellingson, W. A.

    1998-08-19

    During the past year, the focus of our work on nondestructive evaluation (NDE) methods was on the development and application of these methods to technologies such as ceramic matrix composite (CMC) hot-gas filters, CMC high-temperature heat exchangers, and CMC ceramic/ceramic joining. Such technologies are critical to the "Vision 21 Energy-Plex Fleet" of modular, high-efficiency, low-emission power systems. Specifically, our NDE work has continued toward faster, higher-sensitivity, volumetric X-ray computed tomographic imaging with new amorphous silicon detectors to detect and measure axial and radial density variations in hot-gas filters and heat exchangers; explored the potential use of high-speed focal-plane-array infrared imaging technology to detect delaminations and variations in the thermal properties of SiC/SiC heat exchangers; and explored various NDE methods to characterize CMC joints in cooperation with various industrial partners. Work this year also addressed support of Southern Company Services' Power Systems Development Facility, where NDE is needed to assess the condition of hot-gas candle filters. This paper presents the results of these efforts.

  1. Physics methods development for the NCSU PULSTAR reactor

    SciTech Connect

    Perez, P.B.; Mayo, C.W.; Giavedoni, E.

    1996-12-31

    The safety analysis reports (SARs) of all university research reactors include analyses that determine reactor physics parameters. The initial SAR analyses utilized numerical models, codes, cross-section libraries, and computing platforms available at the time. Advances and updates in all of these contributing areas make it difficult or impractical to resort to the earlier methodologies for meeting current analysis needs. Many facilities updated their physics methods during the high-enrichment uranium (HEU) to low-enrichment uranium (LEU) conversion effort. These facilities updated their SAR with current methodologies. The North Carolina State University's (NCSU's) PULSTAR research reactor was designed to use low-enrichment (4%) fuel, and as a result, the facility did not update the reactor physics analyses during the HEU-to-LEU conversion program. An effort is currently under way at NCSU to develop new and updated methods for reactor physics calculations. Currently planned physics calculations for the PULSTAR reactor support both reactor licensing and experimental facility development goals. These goals include the following: 1. Increase excess reactivity by introducing beryllium reflector assemblies and a mixed-enrichment core. 2. Characterize various experimental facilities in support of neutron transmutation doping, prompt gamma analysis, and neutron depth profiling. 3. Establish core loading patterns that optimize characteristics for experimental facilities. Two- and three-dimensional multigroup models utilizing the DANT-SYS and MCNP codes have been developed in support of these goals. Results and lessons learned with the DANT-SYS code are presented in this paper.

  2. Development of computational methods for heavy lift launch vehicles

    NASA Technical Reports Server (NTRS)

    Yoon, Seokkwan; Ryan, James S.

    1993-01-01

    The research effort has been focused on the development of an advanced flow solver for complex viscous turbulent flows with shock waves. The three-dimensional Euler and full/thin-layer Reynolds-averaged Navier-Stokes equations for compressible flows are solved on structured hexahedral grids. The Baldwin-Lomax algebraic turbulence model is used for closure. The space discretization is based on a cell-centered finite-volume method augmented by a variety of numerical dissipation models with optional total variation diminishing limiters. The governing equations are integrated in time by an implicit method based on lower-upper factorization and symmetric Gauss-Seidel relaxation. The algorithm is vectorized on diagonal planes of sweep using two-dimensional indices in three dimensions. A new computer program named CENS3D has been developed for viscous turbulent flows with discontinuities. Details of the code are described in Appendix A and Appendix B. With the developments of the numerical algorithm and dissipation model, the simulation of three-dimensional viscous compressible flows has become more efficient and accurate. The results of the research are expected to yield a direct impact on the design process of future liquid fueled launch systems.
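The implicit relaxation named in the abstract can be illustrated on a plain linear system. This is a minimal scalar analogue of symmetric Gauss-Seidel sweeps, not the CENS3D flow solver itself; the small diagonally dominant matrix is an assumed toy example:

```python
def sgs_solve(A, b, sweeps=50):
    """Symmetric Gauss-Seidel relaxation for A x = b: each iteration is a
    forward sweep followed by a backward sweep, analogous to the lower-upper
    sweeps of an implicit LU-SGS scheme (here applied to a dense matrix)."""
    n = len(b)
    x = [0.0] * n
    for _ in range(sweeps):
        for order in (range(n), reversed(range(n))):  # forward, then backward
            for i in order:
                s = sum(A[i][j] * x[j] for j in range(n) if j != i)
                x[i] = (b[i] - s) / A[i][i]
    return x

# small diagonally dominant system, for which SGS is guaranteed to converge
A = [[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]]
b = [2.0, 4.0, 10.0]
x = sgs_solve(A, b)
residual = max(abs(sum(A[i][j] * x[j] for j in range(3)) - b[i]) for i in range(3))
```

In the actual scheme, the "matrix" arises from the implicit linearization of the flux Jacobians, and the sweeps run over diagonal planes of the structured grid so that the updates vectorize.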

  3. Developing a Method to Mask Trees in Commercial Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Becker, S. J.; Daughtry, C. S. T.; Jain, D.; Karlekar, S. S.

    2015-12-01

    The US Army has an increasing focus on using automated remote sensing techniques with commercial multispectral imagery (MSI) to map urban and peri-urban agricultural and vegetative features; however, similar spectral profiles between trees (i.e., forest canopy) and other vegetation result in confusion between these cover classes. Established vegetation indices, like the Normalized Difference Vegetation Index (NDVI), are typically not effective in reliably differentiating between trees and other vegetation. Previous research in tree mapping has included integration of hyperspectral imagery (HSI) and LiDAR for tree detection and species identification, as well as the use of MSI to distinguish tree crowns from non-vegetated features. This project developed a straightforward method to model and also mask out trees from eight-band WorldView-2 (1.85 meter x 1.85 meter resolution at nadir) satellite imagery at the Beltsville Agricultural Research Center in Beltsville, MD spanning 2012 - 2015. The study site included tree cover, a range of agricultural and vegetative cover types, and urban features. The modeling method exploits the product of the red and red edge bands and defines accurate thresholds between trees and other land covers. Results show this method outperforms established vegetation indices including the NDVI, Soil Adjusted Vegetation Index, Normalized Difference Water Index, Simple Ratio, and Normalized Difference Red Edge Index in correctly masking trees while preserving the other information in the imagery. This method is useful when HSI and LiDAR collection are not possible or when using archived MSI.
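The core of the masking method, thresholding the product of the red and red-edge bands, can be sketched on toy reflectance values. The pixel values, WorldView-2 band naming, and threshold below are all illustrative assumptions, not the study's calibrated data:

```python
# Hypothetical surface reflectances (0-1) for four pixels of an 8-band image
# (WorldView-2 naming assumed: band 5 = red, band 6 = red edge, band 7 = NIR).
pixels = [
    {"label": "tree", "red": 0.05, "red_edge": 0.12, "nir": 0.45},
    {"label": "tree", "red": 0.04, "red_edge": 0.10, "nir": 0.50},
    {"label": "soil", "red": 0.30, "red_edge": 0.35, "nir": 0.40},
    {"label": "crop", "red": 0.08, "red_edge": 0.30, "nir": 0.55},
]

def ndvi(p):
    # Established index; note the tree (0.80) and crop (0.75) values here are
    # close together, illustrating the tree/vegetation confusion described above.
    return (p["nir"] - p["red"]) / (p["nir"] + p["red"])

def is_tree(p, threshold=0.015):
    # Trees are dark in both red and red edge, so their band product is small;
    # this threshold fits only the toy data (the study derived its own empirically).
    return p["red"] * p["red_edge"] < threshold

mask = [is_tree(p) for p in pixels]
print(mask)  # [True, True, False, False]
```

On real imagery the same comparison would be applied per pixel across the raster, with the threshold tuned against reference tree-cover data.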

  4. Development of fire test methods for airplane interior materials

    NASA Technical Reports Server (NTRS)

    Tustin, E. A.

    1978-01-01

    Fire tests were conducted in a 737 airplane fuselage at NASA-JSC to characterize jet fuel fires in open steel pans (simulating post-crash fire sources and a ruptured airplane fuselage) and to characterize fires in some common combustibles (simulating in-flight fire sources). Design post-crash and in-flight fire source selections were based on these data. Large panels of airplane interior materials were exposed to closely controlled large-scale heating simulations of the two design fire sources in a Boeing fire test facility utilizing a surplus 707 fuselage section. Small samples of the same airplane materials were tested by several laboratory fire test methods. Large-scale and laboratory-scale data were examined for correlative factors. Published data for dangerous hazard levels in a fire environment were used as the basis for developing a method to select the most desirable material where trade-offs in heat, smoke, and gaseous toxicant evolution must be considered.

  5. Identifying emerging research collaborations and networks: Method development

    PubMed Central

    Dozier, Ann M.; Martina, Camille A.; O’Dell, Nicole L.; Fogg, Thomas T.; Lurie, Stephen J.; Rubinstein, Eric P.; Pearson, Thomas A.

    2014-01-01

    Clinical and translational research is a multidisciplinary, collaborative team process. To evaluate this process, we developed a method to document emerging research networks and collaborations in our medical center to describe their productivity and viability over time. Using an email survey, sent to 1,620 clinical and basic science full-and part-time faculty members, respondents identified their research collaborators. Initial analyses, using Pajek software, assessed the feasibility of using social network analysis (SNA) methods with these data. Nearly 400 respondents identified 1,594 collaborators across 28 medical center departments resulting in 309 networks with 5 or more collaborators. This low-burden approach yielded a rich dataset useful for evaluation using SNA to: a) assess networks at several levels of the organization, including intrapersonal (individuals), interpersonal (social), organizational/institutional leadership (tenure and promotion), and physical/environmental (spatial proximity) and b) link with other data to assess the evolution of these networks. PMID:24019209
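The network-extraction step, building a collaboration graph from survey responses and keeping components with five or more members, can be sketched as follows. The study used Pajek; this stand-in uses only the Python standard library, and the survey rows are hypothetical:

```python
from collections import defaultdict, deque

# hypothetical survey rows: (respondent, list of named collaborators)
responses = [
    ("A", ["B", "C", "D", "E"]),
    ("F", ["G"]),
    ("H", ["I", "J", "K", "L", "M"]),
]

# undirected adjacency: naming a collaborator links both people
adj = defaultdict(set)
for respondent, collaborators in responses:
    for c in collaborators:
        adj[respondent].add(c)
        adj[c].add(respondent)

def components(adj):
    """Connected components of the collaboration graph via breadth-first search."""
    seen, comps = set(), []
    for node in adj:
        if node in seen:
            continue
        comp, queue = set(), deque([node])
        while queue:
            n = queue.popleft()
            if n in comp:
                continue
            comp.add(n)
            queue.extend(adj[n] - comp)
        seen |= comp
        comps.append(comp)
    return comps

# keep networks with 5 or more members, mirroring the paper's cutoff
networks = [c for c in components(adj) if len(c) >= 5]
print(len(networks))  # 2
```

Full SNA (centrality, brokerage, multi-level attributes) would build on the same graph; dedicated tools such as Pajek or networkx take over from here.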

  6. Development of a colloidal lithography method for patterning nonplanar surfaces.

    PubMed

    Bhawalkar, Sarang P; Qian, Jun; Heiber, Michael C; Jia, Li

    2010-11-16

    A colloidal lithography method has been developed for patterning nonplanar surfaces. Hexagonal noncontiguously packed (HNCP) colloidal particles 127 nm-2.7 μm in diameter were first formed at the air-water interface and then adsorbed onto a substrate coated with a layer of polymer adhesive ∼17 nm thick. The adhesive layer plays the critical role of securing the order of the particles against the destructive lateral capillary force generated by a thin film of water after the initial transfer of the particles from the air-water interface. The soft lithography method is robust and very simple to carry out. It is applicable to a variety of surface curvatures and for both inorganic and organic colloidal particles.

  7. Accelerated molecular dynamics methods: introduction and recent developments

    SciTech Connect

    Uberuaga, Blas Pedro; Voter, Arthur F; Perez, Danny; Shim, Y; Amar, J G

    2009-01-01

    A long-standing limitation in the use of molecular dynamics (MD) simulation is that it can be applied directly only to processes that take place on very short timescales: nanoseconds if empirical potentials are employed, or picoseconds if we rely on electronic structure methods. Many processes of interest in chemistry, biochemistry, and materials science require study over microseconds and beyond, owing either to the natural timescale of the evolution or to the duration of the experiment of interest. Setting aside the case of liquids, the dynamics on these timescales is typically characterized by infrequent transitions from state to state, usually involving an energy barrier. There is a long and venerable tradition in chemistry of using transition state theory (TST) [10, 19, 23] to directly compute rate constants for these kinds of activated processes. If needed, dynamical corrections to the TST rate, and even quantum corrections, can be computed to achieve an accuracy suitable for the problem at hand. These rate constants then allow us to understand the system behavior on longer timescales than we can directly reach with MD. For complex systems with many reaction paths, the TST rates can be fed into a stochastic simulation procedure such as kinetic Monte Carlo, and a direct simulation of the advance of the system through its possible states can be obtained in a probabilistically exact way. A problem that has become more evident in recent years, however, is that for many systems of interest there is a complexity that makes it difficult, if not impossible, to determine all the relevant reaction paths to which TST should be applied. This is a serious issue, as omitted transition pathways can have uncontrollable consequences on the simulated long-time kinetics. Over the last decade or so, we have been developing a new class of methods for treating the long-time dynamics in these complex, infrequent-event systems. Rather than trying to guess in advance what
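The kinetic Monte Carlo procedure mentioned above, in which TST rate constants drive a stochastic state-to-state simulation, can be sketched in its simplest residence-time form. The rate constants below are hypothetical:

```python
import math
import random

def kmc_step(rates, rng):
    """One residence-time kinetic Monte Carlo step: choose an escape event
    with probability proportional to its rate constant, then draw an
    exponentially distributed waiting time from the total escape rate."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for event, k in enumerate(rates):
        acc += k
        if r < acc:
            break
    dt = -math.log(rng.random()) / total
    return event, dt

# hypothetical TST rate constants (per second) for three escape pathways
rates = [1.0e3, 2.5e2, 5.0e1]
rng = random.Random(0)
event, dt = kmc_step(rates, rng)
```

Iterating this step, and updating the rate list whenever the system enters a new state, yields the probabilistically exact state-to-state trajectory described in the abstract; the catalog of pathways per state is exactly what is hard to enumerate for complex systems.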

  8. ROOM: A recursive object oriented method for information systems development

    SciTech Connect

    Thelliez, T.; Donahue, S.

    1994-02-09

    Although complementary for the development of complex systems, top-down structured design and the object-oriented approach remain opposed rather than integrated. As the complexity of systems continues to grow, and the so-called software crisis remains unsolved, it is urgent to provide a framework combining the two paradigms. This paper presents an elegant attempt in this direction through our Recursive Object-Oriented Method (ROOM), in which a top-down approach divides the complexity of the system and an object-oriented method studies a given level of abstraction. Illustrating this recursive schema with a simple example, we demonstrate that we achieve the goal of creating loosely coupled and reusable components.

  9. Developing students' qualitative muscles in an introductory methods course.

    PubMed

    SmithBattle, Lee

    2014-08-30

    The exponential growth of qualitative research (QR) has coincided with methodological innovations, the proliferation of qualitative textbooks and journals, and the greater availability of qualitative methods courses. In spite of these advances, the pedagogy for teaching qualitative methods has received little attention. This paper provides a philosophical foundation for teaching QR with active learning strategies and shows how active learning is fully integrated into a one-semester course. The course initiates students into qualitative dispositions and skills as students develop study aims and procedures; enter the field to gather data; analyze the full set of student-generated data; and write results in a final report. Conducting a study in one semester is challenging but has proven feasible and disabuses students of the view that QR is simple, unscientific, or non-rigorous. Student reflections on course assignments are integrated into the paper. The strengths and limitations of this pedagogical approach are also described.

  10. Identifying emerging research collaborations and networks: method development.

    PubMed

    Dozier, Ann M; Martina, Camille A; O'Dell, Nicole L; Fogg, Thomas T; Lurie, Stephen J; Rubinstein, Eric P; Pearson, Thomas A

    2014-03-01

    Clinical and translational research is a multidisciplinary, collaborative team process. To evaluate this process, we developed a method to document emerging research networks and collaborations in our medical center to describe their productivity and viability over time. Using an e-mail survey, sent to 1,620 clinical and basic science full- and part-time faculty members, respondents identified their research collaborators. Initial analyses, using Pajek software, assessed the feasibility of using social network analysis (SNA) methods with these data. Nearly 400 respondents identified 1,594 collaborators across 28 medical center departments resulting in 309 networks with 5 or more collaborators. This low-burden approach yielded a rich data set useful for evaluation using SNA to: (a) assess networks at several levels of the organization, including intrapersonal (individuals), interpersonal (social), organizational/institutional leadership (tenure and promotion), and physical/environmental (spatial proximity) and (b) link with other data to assess the evolution of these networks.

  11. Development of nondestructive evaluation methods for structural ceramics

    SciTech Connect

    Ellingson, W.A.; Koehl, R.D.; Stuckey, J.B.; Sun, J.G.; Engel, H.P.; Smith, R.G.

    1997-06-01

    Development of nondestructive evaluation (NDE) methods for application to fossil energy systems continues in three areas: (a) mapping axial and radial density gradients in hot gas filters, (b) characterizing the quality of continuous fiber ceramic matrix composite (CFCC) joints, and (c) characterizing and detecting defects in thermal barrier coatings. In this work, X-ray computed tomographic imaging was further developed and used to map variations in the axial and radial density of two full-length (2.3-m) hot gas filters. The two filters differed in through-wall density because of the thickness of the coating on the continuous fibers. Differences in axial and through-wall density were clearly detected. Through-transmission infrared imaging with a highly sensitive focal-plane-array camera was used to assess joint quality in two sets of SiC/SiC CFCC joints. High-frame-rate data capture suggests that the infrared imaging method holds potential for the characterization of CFCC joints. Work was undertaken to develop NDE methods that can be used to evaluate electron-beam physical-vapor-deposited coatings with platinum-aluminide (Pt-Al) bond coats. Zirconia coatings with thicknesses of 125 µm (0.005 in.), 190 µm (0.0075 in.), and 254 µm (0.010 in.), with a Pt-Al bond coat on René N5 Ni-based superalloy, were studied by infrared imaging. Currently, it appears that thickness variation, as well as thermal properties, can be assessed by infrared technology.

  12. Method of breast reconstruction and the development of lymphoedema.

    PubMed

    Lee, K-T; Bang, S I; Pyon, J-K; Hwang, J H; Mun, G-H

    2017-02-01

Several studies have demonstrated an association between immediate autologous or implant-based breast reconstruction and a reduced incidence of lymphoedema. However, few of these have focused specifically on whether the reconstruction method affects the development of lymphoedema. The study evaluated the potential impact of breast reconstruction modality on the incidence of lymphoedema. Outcomes of women with breast cancer who underwent mastectomy and immediate reconstruction using an autologous flap or a tissue expander/implant between 2008 and 2013 were reviewed. Arm or hand swelling with pertinent clinical signs of lymphoedema and excess volume compared with that of the contralateral side was diagnosed as lymphoedema. The cumulative incidence of lymphoedema was estimated by the Kaplan-Meier method. Clinicopathological factors associated with the development of lymphoedema were investigated by Cox regression analysis. A total of 429 reconstructions (214 autologous and 215 tissue expander/implant) were analysed; the mean follow-up of patients was 45·3 months. The two groups had similar characteristics, except that women in the autologous group were older, had a higher BMI, and more often had preoperative radiotherapy than women in the tissue expander/implant group. Overall, the 2-year cumulative incidence of lymphoedema was 6·8 per cent (autologous 4·2 per cent, tissue expander/implant 9·3 per cent). Multivariable analysis demonstrated that autologous reconstruction was associated with a significantly reduced risk of lymphoedema compared with that for tissue expander/implant reconstruction. Axillary dissection, a greater number of dissected lymph nodes and postoperative chemotherapy were also independent risk factors for lymphoedema. The method of breast reconstruction may affect subsequent development of lymphoedema. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
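The cumulative incidence figures above come from the Kaplan-Meier estimator: survival S(t) is a product over event times of (1 - d/n), and cumulative incidence is 1 - S(t). A minimal pure-Python sketch on a tiny invented data set (times in months; this is not the study's data):

```python
# (time_months, observed) pairs: observed=1 -> lymphoedema event, 0 -> censored.
data = [(6, 1), (10, 0), (12, 1), (18, 0), (24, 1), (24, 0), (30, 0), (36, 0)]

def km_survival(data, t):
    """Kaplan-Meier survival probability S(t): product of (1 - d_i/n_i)
    over distinct event times up to t, with n_i = number still at risk."""
    s = 1.0
    event_times = sorted({tt for tt, e in data if e == 1 and tt <= t})
    for et in event_times:
        n_at_risk = sum(1 for tt, _ in data if tt >= et)
        d = sum(1 for tt, e in data if tt == et and e == 1)
        s *= 1.0 - d / n_at_risk
    return s

# 2-year cumulative incidence = 1 - S(24 months)
cum_incidence_24m = 1.0 - km_survival(data, 24)
```

With these eight invented records, S(24) = (7/8)(5/6)(3/4), so the 2-year cumulative incidence evaluates to about 0.45; the estimator handles censored follow-up, which a naive event/total fraction would not.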

  13. Bioanalytical method development and validation: Critical concepts and strategies.

    PubMed

    Moein, Mohammad Mahdi; El Beqqali, Aziza; Abdel-Rehim, Mohamed

    2017-02-01

Bioanalysis is an essential part of drug discovery and development. It concerns the analysis of analytes (drugs, metabolites, biomarkers) in biological samples and involves several steps, from sample collection to sample analysis and data reporting. The first step is sample collection from clinical or preclinical studies, after which the samples are sent to the laboratory for analysis. The second step, sample clean-up (sample preparation), is a critical stage of bioanalysis: to obtain reliable results, a robust and stable sample preparation method must be applied. The role of sample preparation is to remove interferences from the sample matrix and to improve analytical system performance; it is often labor-intensive and time-consuming. The last step is sample analysis and detection. For separation and detection, liquid chromatography-tandem mass spectrometry (LC-MS/MS) is the method of choice in bioanalytical laboratories because of its high selectivity and high sensitivity. In addition, the analyte's chemical structure and chemical properties should be known before bioanalytical work begins. This review provides an overview of bioanalytical method development and validation. The main principles of method validation are discussed, GLP and regulated bioanalysis are described, commonly used sample preparation techniques are presented, and the role of LC-MS/MS in modern bioanalysis is discussed. The focus of the present review is on bioanalysis of small molecules. Copyright © 2016 Elsevier B.V. All rights reserved.
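Two of the core validation figures mentioned above, accuracy and precision, are simple to compute from replicate quality-control measurements. A minimal sketch with hypothetical replicate values for a 100 ng/ml QC sample (the numbers are invented, not from any study):

```python
import statistics

# Hypothetical back-calculated concentrations (ng/ml) for replicate
# injections of a QC sample spiked at a nominal 100 ng/ml.
nominal = 100.0
replicates = [96.1, 101.4, 98.7, 99.3, 102.0, 97.5]

mean = statistics.mean(replicates)
accuracy_pct = 100.0 * mean / nominal                  # closeness to nominal
rsd_pct = 100.0 * statistics.stdev(replicates) / mean  # precision, as %RSD
```

Typical bioanalytical acceptance criteria (e.g., accuracy within 85-115 % of nominal, precision below 15 % RSD, tighter at the LLOQ) are applied to figures computed exactly this way at each QC level.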

  14. HPLC method development and validation of chromafenozide in paddy.

    PubMed

    Ditya, Papia; Das, S P; Bhattacharyya, Anjan

    2012-12-01

A simple and efficient HPLC-UV method was developed and validated for the determination of chromafenozide in paddy, for which there was no previous report on record. The residue analysis method for chromafenozide, its dissipation, and its final residues in paddy and soil were also studied after field treatment. Residues of chromafenozide were extracted and purified from paddy and soil by liquid/liquid partitioning and column chromatography, and determined by HPLC equipped with a PDA detector. The separation was performed on a Phenomenex Luna RP C(18) (250 × 4.6 mm i.d., 5 μm particle size) column at room temperature. The mean accuracies of the analytical method were 94.92 %, 95.38 %, 94.67 % and 96.90 % in straw, grain, soil and field water, respectively. The precision (repeatability) was in the range of 1.30 %-9.25 % for straw/grain, 1.27 %-11.19 % in soil and 1.0 %-9.25 % in field water; the precision (reproducibility) ranged from 2.2 % to 12.1 % in straw/grain and from 2.0 % to 11.7 % in soil. The minimum detectable concentration was 0.01 mg kg(-1). The degradation of the chromafenozide formulation in rice, soil and water was determined, and results showed that chromafenozide as a wettable powder formulation degraded with half-lives of about 4.4 and 2.9 days in paddy plant and soil, respectively, at double the recommended dose. The results indicated that the developed method is simple and fast and could meet the requirements for determination of chromafenozide in paddy.
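The half-lives quoted above imply first-order dissipation kinetics, C(t) = C0·e^(-kt) with k = ln 2 / t½. A short sketch computing how long a hypothetical initial deposit would take to fall below the method's 0.01 mg/kg minimum detectable concentration (the initial concentration of 2.0 mg/kg is invented for illustration):

```python
import math

def days_below(c0, t_half, limit):
    """Days until a first-order residue C(t) = c0*exp(-k*t),
    k = ln(2)/t_half, falls below `limit` (same units as c0)."""
    k = math.log(2) / t_half
    return math.log(c0 / limit) / k

# Half-lives from the study: ~4.4 d in paddy plant, ~2.9 d in soil
# (double recommended dose); 0.01 mg/kg is the reported MDL.
t_plant = days_below(2.0, 4.4, 0.01)  # days to reach MDL in plant
t_soil = days_below(2.0, 2.9, 0.01)   # days to reach MDL in soil
```

Equivalently, the time to the limit is t½ · log2(C0/limit); with these assumed numbers the residue stays detectable for roughly a month in the plant and about three weeks in soil.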

  15. Development of impact design methods for ceramic gas turbine components

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1990-01-01

Impact damage prediction methods are being developed to aid in the design of ceramic gas turbine engine components with improved impact resistance. Two impact damage modes were characterized: local, near the impact site, and structural, usually fast fracture away from the impact site. Local damage to Si3N4 impacted by Si3N4 spherical projectiles consists of ring and/or radial cracks around the impact point. In a mechanistic model being developed, impact damage is characterized as microcrack nucleation and propagation. The extent of damage is measured as the volume fraction of microcracks. Model capability is demonstrated by simulating plate impact tests. Structural failure is caused by tensile stress during impact exceeding material strength. The EPIC3 code was successfully used to predict blade structural failures in different size particle impacts on radial and axial blades.

  16. Development of methods to predict agglomeration and deposition in FBCs

    SciTech Connect

    Mann, M.D.; Henderson, A.K.; Swanson, M.K.; Erickson, T.A.

    1995-11-01

    This 3-year, multiclient program is providing the information needed to determine the behavior of inorganic components in FBC units using advanced methods of analysis coupled with bench-scale combustion experiments. The major objectives of the program are as follows: (1) To develop further our advanced ash and deposit characterization techniques to quantify the effects of the liquid-phase components in terms of agglomerate formation and ash deposits, (2) To determine the mechanisms of inorganic transformations that lead to bed agglomeration and ash deposition in FBC systems, and (3) To develop a better means to predict the behavior of inorganic components as a function of coal composition, bed material characteristics, and combustion conditions.

  17. Developments in flow visualization methods for flight research

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.; Obara, Clifford J.; Manuel, Gregory S.; Lee, Cynthia C.

    1990-01-01

    With the introduction of modern airplanes utilizing laminar flow, flow visualization has become an important diagnostic tool in determining aerodynamic characteristics such as surface flow direction and boundary-layer state. A refinement of the sublimating chemical technique has been developed to define both the boundary-layer transition location and the transition mode. In response to the need for flow visualization at subsonic and transonic speeds and altitudes above 20,000 feet, the liquid crystal technique has been developed. A third flow visualization technique that has been used is infrared imaging, which offers non-intrusive testing over a wide range of test conditions. A review of these flow visualization methods and recent flight results is presented for a variety of modern aircraft and flight conditions.

  19. Electromagnetic methods for development and production: State of the art

    SciTech Connect

    Wilt, M.; Alumbaugh, D.

    1997-10-01

Electromagnetic (EM) methods, long used for borehole logging as a formation evaluation tool in developed oil fields, are rarely applied in surface or crosshole configurations or in cased wells. This is largely due to the high levels of cultural noise and the preponderance of steel well casing. However, recent experimental success with crosshole EM systems for water- and steam-flood monitoring using fiberglass-cased wells has shown promise in applying these techniques to development and production (D & P) problems. This paper describes technological solutions that will allow for successful application of EM techniques in oil fields, despite surface noise and steel casing. First, an example cites the application of long-offset logging to map resistivity structure away from the borehole. Next, a successful application of crosshole EM in which one of the wells is steel-cased is described. The potential application of earth's-field nuclear magnetic resonance (NMR) to map fluid saturation at large distances from the boreholes is also discussed.

  20. Development of RNAi Methods for Peregrinus maidis, the Corn Planthopper

    PubMed Central

    Yao, Jianxiu; Rotenberg, Dorith; Afsharifar, Alireza; Barandoc-Alviar, Karen; Whitfield, Anna E.

    2013-01-01

    The corn planthopper, Peregrinus maidis, is a major pest of agronomically-important crops. Peregrinus maidis has a large geographical distribution and transmits Maize mosaic rhabdovirus (MMV) and Maize stripe tenuivirus (MSpV). The objective of this study was to develop effective RNAi methods for P. maidis. Vacuolar-ATPase (V-ATPase) is an essential enzyme for hydrolysis of ATP and for transport of protons out of cells thereby maintaining membrane ion balance, and it has been demonstrated to be an efficacious target for RNAi in other insects. In this study, two genes encoding subunits of P. maidis V-ATPase (V-ATPase B and V-ATPase D) were chosen as RNAi target genes. The open reading frames of V-ATPase B and D were generated and used for constructing dsRNA fragments. Experiments were conducted using oral delivery and microinjection of V-ATPase B and V-ATPase D dsRNA to investigate the effectiveness of RNAi in P. maidis. Real-time quantitative reverse transcriptase-PCR (qRT-PCR) analysis indicated that microinjection of V-ATPase dsRNA led to a minimum reduction of 27-fold in the normalized abundance of V-ATPase transcripts two days post injection, while ingestion of dsRNA resulted in a two-fold reduction after six days of feeding. While both methods of dsRNA delivery resulted in knockdown of target transcripts, the injection method was more rapid and effective. The reduction in V-ATPase transcript abundance resulted in observable phenotypes. Specifically, the development of nymphs injected with 200 ng of either V-ATPase B or D dsRNA was impaired, resulting in higher mortality and lower fecundity than control insects injected with GFP dsRNA. Microscopic examination of these insects revealed that female reproductive organs did not develop normally. The successful development of RNAi in P. 
maidis to target specific genes will enable the development of new insect control strategies and functional analysis of vital genes and genes associated with interactions between P
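The fold-reduction figures reported above are the kind of quantity produced by the comparative-Ct (2^-ΔΔCt) analysis of qRT-PCR data: target Ct values are normalized to a reference gene, compared between treated and control groups, and converted to relative expression. A minimal sketch with entirely hypothetical Ct values chosen to illustrate a roughly 28-fold knockdown (the study does not report raw Ct values):

```python
# Comparative-Ct (2^-ddCt) relative quantification.
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Relative expression of target in treated vs control insects,
    normalized to a reference gene (each ct is a mean of replicates)."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_treated - d_ct_control)

# Hypothetical mean Ct values: V-ATPase target and reference gene,
# dsRNA-injected vs GFP-dsRNA control insects.
knockdown = fold_change(28.5, 20.0, 23.0, 19.3)  # relative expression
fold_reduction = 1.0 / knockdown                  # "x-fold" reduction
```

The exponential form is why a ΔΔCt of ~4.8 cycles corresponds to a ~28-fold difference: each extra amplification cycle doubles the template.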

  1. Development of RNAi methods for Peregrinus maidis, the corn planthopper.

    PubMed

    Yao, Jianxiu; Rotenberg, Dorith; Afsharifar, Alireza; Barandoc-Alviar, Karen; Whitfield, Anna E

    2013-01-01

    The corn planthopper, Peregrinus maidis, is a major pest of agronomically-important crops. Peregrinus maidis has a large geographical distribution and transmits Maize mosaic rhabdovirus (MMV) and Maize stripe tenuivirus (MSpV). The objective of this study was to develop effective RNAi methods for P. maidis. Vacuolar-ATPase (V-ATPase) is an essential enzyme for hydrolysis of ATP and for transport of protons out of cells thereby maintaining membrane ion balance, and it has been demonstrated to be an efficacious target for RNAi in other insects. In this study, two genes encoding subunits of P. maidis V-ATPase (V-ATPase B and V-ATPase D) were chosen as RNAi target genes. The open reading frames of V-ATPase B and D were generated and used for constructing dsRNA fragments. Experiments were conducted using oral delivery and microinjection of V-ATPase B and V-ATPase D dsRNA to investigate the effectiveness of RNAi in P. maidis. Real-time quantitative reverse transcriptase-PCR (qRT-PCR) analysis indicated that microinjection of V-ATPase dsRNA led to a minimum reduction of 27-fold in the normalized abundance of V-ATPase transcripts two days post injection, while ingestion of dsRNA resulted in a two-fold reduction after six days of feeding. While both methods of dsRNA delivery resulted in knockdown of target transcripts, the injection method was more rapid and effective. The reduction in V-ATPase transcript abundance resulted in observable phenotypes. Specifically, the development of nymphs injected with 200 ng of either V-ATPase B or D dsRNA was impaired, resulting in higher mortality and lower fecundity than control insects injected with GFP dsRNA. Microscopic examination of these insects revealed that female reproductive organs did not develop normally. The successful development of RNAi in P. 
maidis to target specific genes will enable the development of new insect control strategies and functional analysis of vital genes and genes associated with interactions between P

  2. Exploring the Application of Community Development Methods on Water Research in Developing Countries

    NASA Astrophysics Data System (ADS)

    Crane, P. E.

    2012-12-01

In research and community development focused on water in developing countries, there is a common focus on issues of water quantity and quality. In the best circumstances, both are innovative, bringing understanding and solutions to resource-poor regions that are appropriate to their unique situations. But the underlying methods and measures of success often differ significantly. Applying critical aspects of community development methods to water research in developing countries could increase the probability of identifying innovative and sustainable solutions. This is examined through two case studies: the first identifies common methods across community development projects in six African countries, and the second examines water quality research performed in Benin, West Africa through the lens of these methods. The first case study is taken from observations gathered between 2008 and 2012 of community development projects focused on water quantity and quality in six sub-Saharan African countries implemented through different non-governmental organizations. These projects took place in rural and peri-urban regions where public utilities were few to none, the incidence of diarrheal disease was high, and most adults had received little formal education. The water projects included drilling of boreholes, building of rain water tanks, oasis rehabilitation, spring protection, and household biosand filters. All solutions were implemented with hygiene and sanitation components. Although these projects occurred in a wide array of cultural, geographical and climatic regions, the most successful projects shared methods of implementation. These methods are: high levels of stakeholder participation, environmental and cultural adaptation of process and product, and implementation over an extended length of time. The second case study focuses on water quality research performed in Benin, West Africa from 2003 to 2008. 
This research combined laboratory and statistical analyses with

  3. Chloroform extraction of iodine in seawater: method development

    NASA Astrophysics Data System (ADS)

    Seidler, H. B.; Glimme, A.; Tumey, S.; Guilderson, T. P.

    2012-12-01

While 129I poses little to no radiological health hazard, the isotopic ratio of 129I to stable iodine is very useful as a nearly conservative tracer for ocean mixing processes. The disaster at the Fukushima Daiichi nuclear power plant released many radioactive materials into the environment, including 129I, and this release allows oceanic processes to be studied by tracking 129I. However, with such low iodine concentrations (~0.5 micromolar) and 129I isotopic ratios (<10^-11), accelerator mass spectrometry (AMS) is needed for accurate measurements. In order to prepare samples of ocean water for analysis by AMS, the iodine must be separated from the various other salts in the seawater. Solvent extraction is the preferred method for preparation of seawater for AMS analysis of 129I. However, given the relatively low background 129I concentrations in the Pacific Ocean, we sought to optimize the recovery of this method, which would minimize both the sample size and the carrier addition required for analysis. We started from a base method described in other research and worked towards maximum efficiency of the process while boosting the recovery of iodine. During development, we assessed each methodological change qualitatively using a color scale (I2 in CHCl3) and quantitatively using inductively coupled plasma mass spectrometry (ICP-MS). The "optimized method" yielded a 20-40 % increase in iodine recovery compared with the base method (80-85 % recovery vs. 60 %). Lastly, the optimized method was tested by AMS for fractionation of the extracted iodine.

  4. Development of the Ion Exchange-Gravimetric Method for Sodium in Serum as a Definitive Method

    PubMed Central

    Moody, John R.; Vetter, Thomas W.

    1996-01-01

    An ion exchange-gravimetric method, previously developed as a National Committee for Clinical Laboratory Standards (NCCLS) reference method for the determination of sodium in human serum, has been re-evaluated and improved. Sources of analytical error in this method have been examined more critically and the overall uncertainties decreased. Additionally, greater accuracy and repeatability have been achieved by the application of this definitive method to a sodium chloride reference material. In this method sodium in serum is ion-exchanged, selectively eluted and converted to a weighable precipitate as Na2SO4. Traces of sodium eluting before or after the main fraction, and precipitate contaminants are determined instrumentally. Co-precipitating contaminants contribute less than 0.1 % while the analyte lost to other eluted ion-exchange fractions contributes less than 0.02 % to the total precipitate mass. With improvements, the relative expanded uncertainty (k = 2) of the method, as applied to serum, is 0.3 % to 0.4 % and is less than 0.1 % when applied to a sodium chloride reference material. PMID:27805122
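The conversion from weighed Na2SO4 precipitate to sodium content uses the standard gravimetric factor 2·M(Na)/M(Na2SO4). A small sketch with a hypothetical precipitate mass (the atomic masses are standard IUPAC values; the weighed mass is invented for illustration):

```python
# Gravimetric factor: fraction of the Na2SO4 precipitate mass that is sodium.
M_NA, M_S, M_O = 22.98977, 32.065, 15.9994       # g/mol, standard values
M_NA2SO4 = 2 * M_NA + M_S + 4 * M_O              # ~142.04 g/mol

grav_factor = 2 * M_NA / M_NA2SO4                # ~0.3237

# Hypothetical: 0.45213 g of Na2SO4 weighed from a serum eluate fraction.
mass_na = 0.45213 * grav_factor                  # g of sodium in the aliquot
```

Because the factor is about 0.32, weighing errors are diluted roughly threefold relative to the sodium result, which is one reason a sulfate precipitate supports the sub-0.1 % uncertainties quoted above (after the instrumental corrections for trace losses and co-precipitated contaminants).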

  5. Multiphysics methods development for high temperature gas reactor analysis

    NASA Astrophysics Data System (ADS)

    Seker, Volkan

    Multiphysics computational methods were developed to perform design and safety analysis of the next generation Pebble Bed High Temperature Gas Cooled Reactors. A suite of code modules was developed to solve the coupled thermal-hydraulics and neutronics field equations. The thermal-hydraulics module is based on the three dimensional solution of the mass, momentum and energy equations in cylindrical coordinates within the framework of the porous media method. The neutronics module is a part of the PARCS (Purdue Advanced Reactor Core Simulator) code and provides a fine mesh finite difference solution of the neutron diffusion equation in three dimensional cylindrical coordinates. Coupling of the two modules was performed by mapping the solution variables from one module to the other. Mapping is performed automatically in the code system by the use of a common material mesh in both modules. The standalone validation of the thermal-hydraulics module was performed with several cases of the SANA experiment and the standalone thermal-hydraulics exercise of the PBMR-400 benchmark problem. The standalone neutronics module was validated by performing the relevant exercises of the PBMR-268 and PBMR-400 benchmark problems. Additionally, the validation of the coupled code system was performed by analyzing several steady state and transient cases of the OECD/NEA PBMR-400 benchmark problem.
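The mapping of solution variables between the two modules described above amounts to a fixed-point (Picard) iteration between the field solvers: the neutronics solve provides a power distribution to the thermal-hydraulics solve, which returns temperatures that feed back into the cross sections, and the pair is repeated until both fields stop changing. A schematic scalar sketch; the feedback coefficients are toy numbers, not from the PBMR benchmarks:

```python
# Schematic Picard iteration between a "neutronics" solve and a
# "thermal-hydraulics" solve, each reduced to a single scalar equation.
def neutronics(temp_k):
    # Toy Doppler-like feedback: power falls as fuel temperature rises.
    return 400.0 * (1.0 - 2.0e-4 * (temp_k - 900.0))  # MW

def thermal_hydraulics(power_mw):
    # Toy heat balance: temperature rises linearly with power.
    return 900.0 + 0.5 * (power_mw - 380.0)  # K

power, temp = 400.0, 900.0     # initial guesses
for _ in range(100):
    new_power = neutronics(temp)
    new_temp = thermal_hydraulics(new_power)
    if abs(new_power - power) < 1e-9 and abs(new_temp - temp) < 1e-9:
        break                  # converged: consistent coupled solution
    power, temp = new_power, new_temp
```

In the real code system the scalars become full 3-D fields mapped across a common material mesh, but the outer convergence loop has this same structure; the negative feedback makes the iteration a contraction, so it converges quickly here.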

  6. Characterization of Developer Application Methods Used in Fluorescent Penetrant Inspection

    NASA Astrophysics Data System (ADS)

    Brasche, L. J. H.; Lopez, R.; Eisenmann, D.

    2006-03-01

Fluorescent penetrant inspection (FPI) is the most widely used inspection method for aviation components, seeing use in production as well as in-service inspection applications. FPI is a multiple-step process requiring attention to the process parameters of each step in order to enable a successful inspection. A multiyear program is underway to evaluate the most important factors affecting the performance of FPI, to determine whether existing industry specifications adequately address control of the process parameters, and to provide the needed engineering data to the public domain. The final step prior to the inspection is the application of developer; typical aviation inspections involve the use of dry powder (form d), usually applied using either a pressure wand or a dust storm chamber. Results from several typical dust storm chambers and wand applications have shown less than optimal performance. Measurements of indication brightness and recording of the UVA image, and in some cases formal probability of detection (POD) studies, were used to assess the developer application methods. Key conclusions and initial recommendations are provided.

  7. Development of new methods for studying nanostructures using neutron scattering

    SciTech Connect

    Pynn, Roger

    2016-03-18

    The goal of this project was to develop improved instrumentation for studying the microscopic structures of materials using neutron scattering. Neutron scattering has a number of advantages for studying material structure but suffers from the well-known disadvantage that neutrons’ ability to resolve structural details is usually limited by the strength of available neutron sources. We aimed to overcome this disadvantage using a new experimental technique, called Spin Echo Scattering Angle Encoding (SESAME) that makes use of the neutron’s magnetism. Our goal was to show that this innovation will allow the country to make better use of the significant investment it has recently made in a new neutron source at Oak Ridge National Laboratory (ORNL) and will lead to increases in scientific knowledge that contribute to the Nation’s technological infrastructure and ability to develop advanced materials and technologies. We were successful in demonstrating the technical effectiveness of the new method and established a baseline of knowledge that has allowed ORNL to start a project to implement the method on one of its neutron beam lines.

  8. System identification methods for aircraft flight control development and validation

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.

    1995-01-01

System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. The extracted models allow the characterization of the response of the overall aircraft or component subsystem behavior, such as actuators and on-board signal processing algorithms. This paper discusses the use of frequency-domain system-identification methods for the development and integration of aircraft flight-control systems. The extraction and analysis of models of varying complexity, from nonparametric frequency responses to transfer functions and high-order state-space representations, are illustrated using the Comprehensive Identification from FrEquency Responses (CIFER) system-identification facility. Results are presented for test data of numerous flight and simulation programs at the Ames Research Center including rotorcraft, fixed-wing aircraft, advanced short takeoff and vertical landing (ASTOVL), vertical/short takeoff and landing (V/STOL), tiltrotor aircraft, and rotor experiments in the wind tunnel. Excellent system characterization and dynamic response prediction are achieved for this wide class of systems. Examples illustrate the role of system-identification technology in providing an integrated flow of dynamic response data around the entire life-cycle of aircraft development from initial specifications, through simulation and bench testing, and into flight-test optimization.
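As a toy illustration of extracting a nonparametric frequency response from input/output records (the core idea behind tools like CIFER, which uses frequency-sweep excitation and spectral averaging rather than this single-tone shortcut), the sketch below excites a first-order lag at one frequency, simulates its output, and recovers H(jw) from single-point Fourier coefficients of input and output. The plant and all numbers are invented for illustration:

```python
import cmath
import math

# "Plant": first-order lag tau*y' + y = u, true response 1/(1 + j*w*tau).
tau, w = 0.5, 2.0
periods = 64                             # integrate over whole periods
T = periods * 2.0 * math.pi / w
steps = 200000
dt = T / steps

y, t = 0.0, 0.0
num = 0.0 + 0.0j                         # integral of y(t) e^{-jwt} dt
den = 0.0 + 0.0j                         # integral of u(t) e^{-jwt} dt
for _ in range(steps):
    u = math.sin(w * t)                  # sinusoidal excitation
    y += dt * (u - y) / tau              # forward-Euler simulation step
    e = cmath.exp(-1j * w * t)
    num += y * e * dt
    den += u * e * dt
    t += dt

H = num / den                            # estimated response at frequency w
H_true = 1.0 / (1.0 + 1j * w * tau)      # analytic value for comparison
```

Sweeping w and repeating this at each frequency yields the nonparametric frequency response from which transfer-function or state-space models are then fitted; averaging over many periods suppresses transients and noise, which is the role spectral averaging plays in practice.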

  9. Characterization, thermal stability studies, and analytical method development of Paromomycin for formulation development.

    PubMed

    Khan, Wahid; Kumar, Neeraj

    2011-06-01

Paromomycin (PM) is an aminoglycoside antibiotic, first isolated in the 1950s and approved in 2006 for the treatment of visceral leishmaniasis. Although isolated six decades ago, sufficient information essential for the development of a pharmaceutical formulation is not available for PM. The purpose of this paper was to determine the thermal stability of PM and to develop a new analytical method for its formulation development. PM was characterized by thermoanalytical (DSC, TGA, and HSM) and spectroscopic (FTIR) techniques, and these techniques were used to establish its thermal stability after heating at 100, 110, 120, and 130 °C for 24 h. The biological activity of the heated samples was also determined by microbiological assay. Subsequently, a simple, rapid and sensitive RP-HPLC method for quantitative determination of PM was developed using pre-column derivatization with 9-fluorenylmethyl chloroformate. The developed method was applied to estimate PM quantitatively in two parenteral dosage forms. These techniques indicated that PM is stable on heating up to 120 °C for 24 h but is liable to degradation when heated at 130 °C; this degradation was also observed in the microbiological assay, where PM lost ~30 % of its biological activity when heated at 130 °C for 24 h. The new analytical method was developed for PM in the concentration range of 25-200 ng/ml with intra-day and inter-day variability of <2 % RSD. Characterization techniques were established and the stability of PM was determined successfully; the developed analytical method was found to be sensitive, accurate, and precise for quantification of PM. Copyright © 2010 John Wiley & Sons, Ltd.
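Quantification over a 25-200 ng/ml range of the sort described above rests on a least-squares calibration line relating detector response to standard concentration, followed by back-calculation of unknowns. A sketch with hypothetical peak areas (the real detector responses are not given in the abstract):

```python
# Least-squares calibration line, area = slope * conc + intercept.
conc = [25.0, 50.0, 100.0, 150.0, 200.0]          # ng/ml standards
area = [1210.0, 2405.0, 4820.0, 7190.0, 9600.0]   # hypothetical peak areas

n = len(conc)
sx, sy = sum(conc), sum(area)
sxx = sum(x * x for x in conc)
sxy = sum(x * y for x, y in zip(conc, area))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

# Back-calculate an unknown sample from its measured peak area.
unknown_area = 3600.0
unknown_conc = (unknown_area - intercept) / slope  # ng/ml
```

In a validated method the same regression is run on each analytical batch, and the back-calculated standards themselves must fall within accuracy limits before unknowns are reported.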

  10. Electromagnetic Differential Measuring Method: Application in Microstrip Sensors Developing

    PubMed Central

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario

    2017-01-01

Electromagnetic radiation is energy that interacts with matter. The interaction process is of great importance to sensing applications that characterize material media. Parameters such as the dielectric constant represent matter characteristics, and they are identified using emission, interaction and reception of electromagnetic radiation under adapted environmental conditions. How the electromagnetic wave responds when it interacts with the material medium depends on the frequency range used and the medium parameters. Different disciplines, including remote sensing, the earth sciences (geology, atmosphere, hydrosphere) and biological or medical fields, use this interaction to provide non-intrusive applications with clear benefits. Electromagnetic waves are transmitted and analyzed in the receiver to determine the interaction produced. In this work a method based on a differential measurement technique is proposed as a novel way of detecting and characterizing electromagnetic matter characteristics using sensors based on a microstrip patch. The experimental results, based on simulations, show that it is possible to obtain benefits from the behavior of the wave-medium interaction by using differential measurement on reception of electromagnetic waves at different frequencies or environmental conditions. The differential method introduces advantages in measurement processes and promotes the development of new sensors. A new microstrip sensor that uses differential time measures is proposed to show the possibilities of this method. PMID:28718804
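The differential principle described above can be reduced to a toy numerical sketch: two matched channels share a common-mode disturbance (temperature drift, source fluctuation), but only the sensing channel responds to the material under test, so subtraction isolates the material-induced shift. All values are invented for illustration:

```python
# Toy differential measurement: common-mode drift cancels in the difference.
drift = 0.35            # shared environmental offset (e.g., temperature)
material_shift = 1.20   # response caused only by the medium under test
baseline = 5.00         # nominal channel reading with no medium, no drift

reference_channel = baseline + drift                   # sees drift only
sensing_channel = baseline + drift + material_shift    # sees drift + medium

differential = sensing_channel - reference_channel     # drift cancels
```

This is why a differential scheme relaxes calibration and environmental-control requirements: any disturbance common to both channels, however large, drops out of the measured quantity, leaving only the wave-medium interaction of interest.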

  11. Electromagnetic Differential Measuring Method: Application in Microstrip Sensors Developing.

    PubMed

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario

    2017-07-18

Electromagnetic radiation is energy that interacts with matter. The interaction process is of great importance to sensing applications that characterize material media. Parameters such as the dielectric constant represent matter characteristics, and they are identified using emission, interaction and reception of electromagnetic radiation under adapted environmental conditions. How the electromagnetic wave responds when it interacts with the material medium depends on the frequency range used and the medium parameters. Different disciplines, including remote sensing, the earth sciences (geology, atmosphere, hydrosphere) and biological or medical fields, use this interaction to provide non-intrusive applications with clear benefits. Electromagnetic waves are transmitted and analyzed in the receiver to determine the interaction produced. In this work a method based on a differential measurement technique is proposed as a novel way of detecting and characterizing electromagnetic matter characteristics using sensors based on a microstrip patch. The experimental results, based on simulations, show that it is possible to obtain benefits from the behavior of the wave-medium interaction by using differential measurement on reception of electromagnetic waves at different frequencies or environmental conditions. The differential method introduces advantages in measurement processes and promotes the development of new sensors. A new microstrip sensor that uses differential time measures is proposed to show the possibilities of this method.

  12. Development of a molecular detection method for naphthalene degrading pseudomonads

    SciTech Connect

    Silva, M.C.

    1993-01-01

    A new combined method for detection of naphthalene-degrading pseudomonads from soil has been developed. After direct extraction from soil using a lysozyme/sodium dodecyl sulfate/freeze-thaw method and rapid purification through gel electrophoresis, DNA was amplified through the polymerase chain reaction (PCR) using primers directed against the nahR regulatory gene present in plasmid NAH7 of Pseudomonas putida G7. The resulting product was detected with a reverse dot blot protocol, improving sensitivity ten-fold over traditional ethidium bromide staining of agarose gel electrophoresis, with positive signals starting at the 10³ CFU/g soil level. This method was successful in detecting indigenous bacteria from subsurface sediment of a naphthalene-contaminated site in New York State, suggesting that similar combined approaches could be developed for other soil-borne genetic markers. A study was also carried out on how culture conditions and other variables that modulate a cell's physiology bias a PCR amplification against generating a representative specimen profile. Two Pseudomonas putida G7 nahR alleles were constructed in pUC19 that differ solely in a 31 bp internal segment whose sequence has been inverted. After PCR amplification, the products could be distinguished. When an Escherichia coli strain carrying one nahR allele is submitted to varying growth conditions, the consequences can be ascertained through co-amplification with a strain carrying the other allele and subsequent restriction analysis. Sublethal levels of tetracycline or growth in minimal medium made the PCR target in these cells relatively less amplifiable. However, cells in stationary phase displayed improved amplifiability, while cells grown at 42 °C were equally amplifiable compared to cells grown at 37 °C. These results suggest that mixed populations containing cells in different physiological states may not be representatively amplified by PCR.

  13. Continuum modeling using granular micromechanics approach: Method development and applications

    NASA Astrophysics Data System (ADS)

    Poorsolhjouy, Payam

    This work presents a constitutive modeling approach for the behavior of granular materials. In the granular micromechanics approach presented here, the material point is assumed to be composed of grains interacting with their neighbors through different inter-granular mechanisms that together produce the material's macroscopic behavior. The present work focuses on (i) developing the method for modeling more complicated material systems as well as more complicated loading scenarios and (ii) applications of the method for modeling various granular materials and granular assemblies. A damage-plasticity model for cementitious and rock-like materials is developed, calibrated, and verified in a thermo-mechanically consistent manner. Grain-pair interactions in normal tension, normal compression, and tangential directions have been defined in a manner that is consistent with the material's macroscopic behavior. The resulting model is able to predict, among other interesting issues, the effects of loading-induced anisotropy. The material's response to loading depends on the loading history of grain-pair interactions in different directions; thus the model predicts load-path-dependent failure. Because first-gradient continuum theories cannot predict phenomena such as shear band width, wave dispersion, and frequency band-gaps, the presented method is enhanced by incorporating non-classical terms in the kinematic analysis. A complete micromorphic theory is presented by incorporating additional terms such as fluctuations, second-gradient terms, and spin fields. Relative deformation of grain pairs is calculated based on the enhanced kinematic analysis. The resulting theory incorporates the deformation and forces of grain-pair interactions due to different kinematic measures into the macroscopic behavior. As a result, non-classical phenomena such as wave dispersion and frequency band-gaps can be predicted. 
Using the grain-scale analysis, a practical approach for

  14. Electrospinning: methods and development of biodegradable nanofibres for drug release.

    PubMed

    Ashammakhi, N; Wimpenny, I; Nikkola, L; Yang, Y

    2009-02-01

    It is clear that nanofibrous structures can be used as tools for many applications. It is already known that electrospinning is a highly versatile method of producing nanofibres, and recent developments in the technique have led to aligned nanofibres and biphasic, core-sheath fibres which can be used to encapsulate materials ranging from molecules to cells. Natural extracellular matrix (ECM) contains fibres at both the micro- and nano-scale and provides a structural scaffold which allows cells to localize, migrate, proliferate and differentiate. Polymer nanofibres can provide the structural cues of ECM. Moreover, current literature gives new hope for further functionalising polymeric nanofibres by using them as drug delivery devices and refining their design to improve control of delivery. By encapsulating active agents within nanofibres (multifunctional nanofibres), a degree of control can be exerted over the release of the encapsulated agents, and the behaviour of cells can therefore be manipulated to develop effective therapies; this is extremely encouraging for the tissue engineering field, where factors such as fibre diameter, alignment and chemistry can be combined in new ways. Such multifunctional nanofibre-based systems are already being investigated in vivo. Experiments have shown significant potential for treatments of disease and for engineering of neural and bone tissues. Further, phase III clinical trials of nanofibrous patches for wound treatment were encouraging. Hopefully, clinical applications of these drug delivery devices will follow, enhancing regenerative medicine applications.

  15. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystems. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, one satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM expert was developed using analysis of variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM expert adapts to new situations and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design whenever experimental or simulation data are available.

  16. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)
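    As one concrete illustration of the kind of practice the paper recommends, the sketch below pairs defensive input checking with an assert-based regression test; the function and its values are our own hypothetical example, not code from the paper.

    ```python
    # Hypothetical example of two recommended practices: defensive input
    # checking and automated regression tests.

    def decay_fraction(half_life: float, t: float) -> float:
        """Fraction of a radionuclide remaining after time t (same units as half_life)."""
        if half_life <= 0:
            raise ValueError("half_life must be positive")
        if t < 0:
            raise ValueError("t must be non-negative")
        return 0.5 ** (t / half_life)

    def test_decay_fraction():
        # Regression tests: run with pytest or any assert-based runner.
        assert decay_fraction(10.0, 0.0) == 1.0
        assert abs(decay_fraction(10.0, 10.0) - 0.5) < 1e-12
        assert abs(decay_fraction(10.0, 20.0) - 0.25) < 1e-12
    ```

    Checks like these catch the "functional errors" the authors describe before they propagate into published results.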

  17. Improved Method Being Developed for Surface Enhancement of Metallic Materials

    NASA Technical Reports Server (NTRS)

    Gabb, Timothy P.; Telesman, Jack; Kantzos, Peter T.

    2001-01-01

    Surface enhancement methods induce a layer of beneficial residual compressive stress to improve the impact (FOD) resistance and fatigue life of metallic materials. A traditional method of surface enhancement often used is shot peening, in which small steel spheres are repeatedly impinged on metallic surfaces. Shot peening is inexpensive and widely used, but the plastic deformation of 20 to 40 percent imparted by the impacts can be harmful. This plastic deformation can damage the microstructure, severely limiting the ductility and durability of the material near the surface. It has also been shown to promote accelerated relaxation of the beneficial compressive residual stresses at elevated temperatures. Low-plasticity burnishing (LPB) is being developed as an improved method for the surface enhancement of metallic materials. LPB is being investigated as a rapid, inexpensive surface enhancement method under NASA Small Business Innovation Research contracts NAS3-98034 and NAS3-99116, with supporting characterization work at NASA. Previously, roller burnishing had been employed to refine surface finish. This concept was adopted and then optimized as a means of producing a layer of compressive stress of high magnitude and depth, with minimal plastic deformation (ref. 1). A simplified diagram of the developed process is given in the following figure. A single pass of a smooth, free-rolling spherical ball under a normal force deforms the surface of the material in tension, creating a compressive layer of residual stress. The ball is supported in a fluid with sufficient pressure to lift the ball off the surface of the retaining spherical socket. The ball is only in mechanical contact with the surface of the material being burnished and is free to roll on the surface. This apparatus is designed to be mounted in the conventional lathes and vertical mills currently used to machine parts. 
The process has been successfully applied to nickel-base superalloys by a team from the

  18. Development of a nondestructive evaluation method for FRP bridge decks

    NASA Astrophysics Data System (ADS)

    Brown, Jeff; Fox, Terra

    2010-05-01

    Open steel grids are typically used on bridges to minimize the weight of the bridge deck and wearing surface. These grids, however, require frequent maintenance and exhibit other durability concerns related to fatigue cracking and corrosion. Bridge decks constructed from composite materials, such as fiber-reinforced polymer (FRP), are strong and lightweight; they also offer improved rideability, reduced noise levels, and less maintenance, and they are relatively easy to install compared to steel grids. This research is aimed at developing an inspection protocol for FRP bridge decks using infrared thermography. The finite element method was used to simulate the heat transfer process and determine optimal heating and data acquisition parameters for inspecting FRP bridge decks in the field. It was demonstrated that thermal imaging could successfully identify features of the FRP bridge deck to depths of 1.7 cm using a phase analysis process.
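    The heat-transfer simulation behind such a protocol can be illustrated with a one-dimensional explicit finite-difference sketch (a deliberate simplification of the paper's finite element model; the material values are assumed, not taken from the study):

    ```python
    import numpy as np

    # 1-D transient conduction after flash heating of an FRP slab (explicit FTCS).
    alpha = 2.0e-7               # assumed thermal diffusivity, m^2/s
    L = 0.017                    # slab thickness, m (1.7 cm, the depth probed above)
    nx = 35
    dx = L / (nx - 1)
    dt = 0.25 * dx**2 / alpha    # satisfies the stability limit dt <= dx^2/(2*alpha)
    r = alpha * dt / dx**2

    T = np.zeros(nx)             # temperature rise above ambient, K
    T[0] = 10.0                  # instantaneous surface heating (flash excitation)

    surface = []
    t = 0.0
    while t < 60.0:              # one minute of cooling
        Tn = T.copy()
        T[1:-1] = Tn[1:-1] + r * (Tn[2:] - 2 * Tn[1:-1] + Tn[:-2])
        T[0] = Tn[0] + 2 * r * (Tn[1] - Tn[0])      # adiabatic front face after flash
        T[-1] = Tn[-1] + 2 * r * (Tn[-2] - Tn[-1])  # adiabatic back face
        surface.append(T[0])
        t += dt
    ```

    Recording `surface` against time (or its Fourier phase, for the phase analysis mentioned above) is what distinguishes sound regions from subsurface defects, which alter the cooling curve.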

  19. Development of the moments method for neutron gauging

    NASA Astrophysics Data System (ADS)

    Ingman, D.; Taviv, E.

    1981-11-01

    In the present investigation, a new methodology for the neutron moisture probe, based on measurements of the spatial moments of the slow-neutron fluxes, is developed. Within the framework of the present work, calibration curves for moments of low orders were calculated, and recursive relations for high-order moments were obtained on the basis of a P-1 approximation and diffusion theory. The neutron flux distributions obtained from a semiempirical method [5] and from three-group diffusion and age theories were investigated for the moments calculation. It is shown that the spatial moments of the neutron flux could serve as a basis for measurements of the volume-weighted moisture and the content of strong neutron absorbers in the medium.
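    The idea of inferring medium properties from spatial moments can be sketched numerically: for a point source in diffusion theory the slow-neutron flux falls off as exp(-r/L_d)/r, and the second spatial moment equals 6*L_d^2, so the moments encode the (moisture-dependent) diffusion length. The sketch below is our illustration with an assumed diffusion length, not the paper's calculation.

    ```python
    import numpy as np

    Ldiff = 2.7                                 # assumed diffusion length, cm
    r = np.linspace(1e-6, 60 * Ldiff, 200_000)  # radial grid, cm
    phi = np.exp(-r / Ldiff) / r                # point-source diffusion flux shape

    def moment(n):
        """n-th spatial moment of the flux over 3-D space (weight 4*pi*r^2)."""
        w = phi * r**2                          # the 4*pi factor cancels in the ratio
        return np.sum(r**n * w) / np.sum(w)

    m1 = moment(1)   # analytically 2*Ldiff
    m2 = moment(2)   # analytically 6*Ldiff**2
    ```

    Inverting the measured moment for Ldiff is then a one-line calibration step, which is the role the paper's calibration curves play.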

  20. Workshop Targets Development of Geodetic Transient Detection Methods

    NASA Astrophysics Data System (ADS)

    Murray-Moraleda, Jessica R.; Lohman, Rowena

    2010-02-01

    2009 SCEC Annual Meeting: Workshop on Transient Anomalous Strain Detection; Palm Springs, California, 12-13 September 2009; The Southern California Earthquake Center (SCEC) is a community of researchers at institutions worldwide working to improve understanding of earthquakes and mitigate earthquake risk. One of SCEC's priority objectives is to “develop a geodetic network processing system that will detect anomalous strain transients.” Given the growing number of continuously recording geodetic networks consisting of hundreds of stations, an automated means for systematically searching data for transient signals, especially in near real time, is critical for network operations, hazard monitoring, and event response. The SCEC Transient Detection Test Exercise began in 2008 to foster an active community of researchers working on this problem, explore promising methods, and combine effective approaches in novel ways. A workshop was held in California to assess what has been learned thus far and discuss areas of focus as the project moves forward.

  1. Development of silicon purification by strong radiation catalysis method

    NASA Astrophysics Data System (ADS)

    Chen, Ying-Tian; Ho, Tso-Hsiu; Lim, Chern-Sing; Lim Boon, Han

    2010-11-01

    Using a new type of solar furnace and a specially designed induction furnace, cost-effective and highly efficient purification of metallurgical silicon into solar-grade silicon can be achieved. It is realized by a new method for extracting boron from silicon with the aid of a photo-chemical effect. In this article, we discuss the postulated principle of strong radiation catalysis and the recent development in practice. Starting from ordinary metallurgical silicon, we achieved a purification result of 0.12 ppmw to 0.3 ppmw of boron impurity in silicon in only a single pass of a low-cost and simple process; the major obstacle to making ‘cheap’ solar-grade silicon feedstock in industry is thus removed.

  2. Structural analysis methods development for turbine hot section components

    NASA Technical Reports Server (NTRS)

    Thompson, R. L.

    1989-01-01

    The structural analysis technologies and activities of the NASA Lewis Research Center's gas turbine engine Hot Section Technology (HOST) program are summarized. The technologies synergistically developed and validated include: time-varying thermal/mechanical load models; component-specific automated geometric modeling and solution strategy capabilities; advanced inelastic analysis methods; inelastic constitutive models; high-temperature experimental techniques and experiments; and nonlinear structural analysis codes. Features of the program that incorporate the new technologies and their application to hot section component analysis and design are described. Improved and, in some cases, first-time 3-D nonlinear structural analyses of hot section components of isotropic and anisotropic nickel-base superalloys are presented.

  3. Structural Analysis Methods Development for Turbine Hot Section Components

    NASA Technical Reports Server (NTRS)

    Thompson, Robert L.

    1988-01-01

    The structural analysis technologies and activities of the NASA Lewis Research Center's gas turbine engine Hot Section Technology (HOST) program are summarized. The technologies synergistically developed and validated include: time-varying thermal/mechanical load models; component-specific automated geometric modeling and solution strategy capabilities; advanced inelastic analysis methods; inelastic constitutive models; high-temperature experimental techniques and experiments; and nonlinear structural analysis codes. Features of the program that incorporate the new technologies and their application to hot section component analysis and design are described. Improved and, in some cases, first-time 3-D nonlinear structural analyses of hot section components of isotropic and anisotropic nickel-base superalloys are presented.

  4. Development of A High Throughput Method Incorporating Traditional Analytical Devices

    PubMed Central

    White, C. C.; Embree, E.; Byrd, W. E; Patel, A. R.

    2004-01-01

    A high-throughput system and a companion informatics system have been developed and implemented. High throughput is defined as the ability to autonomously evaluate large numbers of samples, while the informatics system provides the software control of the physical devices, in addition to the organization and storage of the generated electronic data. This high-throughput system includes both an ultraviolet-visible (UV-Vis) spectrometer and a Fourier transform infrared (FTIR) spectrometer integrated with a multi-sample positioning table. The method is designed to quantify changes in polymeric materials resulting from controlled temperature, humidity, and high-flux UV exposures. The integration of the software control of these analytical instruments within a single computer system is presented. Challenges in extending the system to include additional analytical devices are discussed. PMID:27366626

  5. Development of Porosity Measurement Method in Shale Gas Reservoir Rock

    NASA Astrophysics Data System (ADS)

    Siswandani, Alita; Nurhandoko, BagusEndar B.

    2016-08-01

    Pore scales have an impact on transport mechanisms in shale gas reservoirs. In this research, a digital helium porosity meter is used for porosity measurement under realistic conditions. Accordingly, it is necessary to obtain a good approximation for gas-filled porosity. Shale has a typical effective porosity that changes as a function of time. Effective porosity values for three different shale rocks are analyzed by this proposed measurement. We develop a new measurement method for characterizing porosity phenomena in shale gas as a function of time by measuring porosity over a range of minutes using the digital helium porosity meter. The porosities measured in this experiment are free-gas and adsorbed-gas porosity. The pressure change over time shows that the porosity of shale contains at least two types of porosity: macro-scale (fracture) porosity and fine-scale (nano-scale) porosity. We present the estimation of effective porosity values using the Boyle-Gay-Lussac approximation and the van der Waals approximation.
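    The Boyle-law step behind a helium porosimeter can be sketched as follows (a textbook isothermal-expansion calculation with hypothetical volumes and pressures; the paper's time-dependent analysis adds the van der Waals correction on top of this):

    ```python
    def pore_volume(p1, p2, v_ref, v_cell, v_bulk):
        """Pore volume of a plug from isothermal helium expansion.

        p1: reference-cell pressure before expansion (absolute)
        p2: equilibrium pressure after expansion into the sample cell
        v_ref: reference-cell volume; v_cell: empty sample-cell volume
        v_bulk: bulk volume of the rock plug
        Boyle's law: p1 * v_ref = p2 * (v_ref + v_cell - v_grain)
        """
        v_grain = v_ref + v_cell - p1 * v_ref / p2
        return v_bulk - v_grain

    def porosity(p1, p2, v_ref, v_cell, v_bulk):
        return pore_volume(p1, p2, v_ref, v_cell, v_bulk) / v_bulk
    ```

    Repeating the measurement at successive times, as the abstract describes, lets the fast-filling fracture porosity be separated from the slowly invaded nano-scale porosity.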

  6. New prior sampling methods for nested sampling - Development and testing

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene

    2017-06-01

    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
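    The "central problem" described above can be made concrete with a minimal sketch (ours, not the authors' Mathematica code) that solves the constrained-prior step by plain rejection sampling, on a 1-D Gaussian likelihood with a flat prior on [-5, 5]:

    ```python
    import math
    import random

    def loglike(x):
        return -0.5 * x * x                     # unnormalised Gaussian log-likelihood

    def sample_above(threshold):
        """Draw from the prior restricted to loglike(x) > threshold (rejection)."""
        while True:
            x = random.uniform(-5.0, 5.0)
            if loglike(x) > threshold:
                return x

    def nested_sampling(n_live=100, n_iter=600):
        live = [random.uniform(-5.0, 5.0) for _ in range(n_live)]
        log_terms = []
        for i in range(n_iter):
            worst = min(live, key=loglike)
            logl = loglike(worst)
            # prior volume shrinks by ~exp(-1/n_live) per compression step
            logw = -i / n_live + math.log(1.0 - math.exp(-1.0 / n_live))
            log_terms.append(logl + logw)
            live[live.index(worst)] = sample_above(logl)
        m = max(log_terms)
        return m + math.log(sum(math.exp(t - m) for t in log_terms))  # log-evidence
    ```

    For this toy problem the evidence is Z = sqrt(2*pi)/10, about 0.25, which the sketch recovers to within its sampling noise; real implementations replace the rejection step with smarter likelihood-restricted samplers, which is precisely the topic of the paper.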

  7. Causal inference methods to study nonrandomized, preexisting development interventions

    PubMed Central

    Arnold, Benjamin F.; Khush, Ranjiv S.; Ramaswamy, Padmavathi; London, Alicia G.; Rajkumar, Paramasivan; Ramaprabha, Prabhakar; Durairaj, Natesan; Hubbard, Alan E.; Balakrishnan, Kalpana; Colford, John M.

    2010-01-01

    Empirical measurement of interventions to address significant global health and development problems is necessary to ensure that resources are applied appropriately. Such intervention programs are often deployed at the group or community level. The gold standard design to measure the effectiveness of community-level interventions is the community-randomized trial, but the conditions of these trials often make it difficult to assess their external validity and sustainability. The sheer number of community interventions, relative to randomized studies, speaks to a need for rigorous observational methods to measure their impact. In this article, we use the potential outcomes model for causal inference to motivate a matched cohort design to study the impact and sustainability of nonrandomized, preexisting interventions. We illustrate the method using a sanitation mobilization, water supply, and hygiene intervention in rural India. In a matched sample of 25 villages, we enrolled 1,284 children <5 y old and measured outcomes over 12 mo. Although we found a 33 percentage point difference in new toilet construction [95% confidence interval (CI) = 28%, 39%], we found no impacts on height-for-age Z scores (adjusted difference = 0.01, 95% CI = −0.15, 0.19) or diarrhea (adjusted longitudinal prevalence difference = 0.003, 95% CI = −0.001, 0.008) among children <5 y old. This study demonstrates that matched cohort designs can estimate impacts from nonrandomized, preexisting interventions that are used widely in development efforts. Interpreting the impacts as causal, however, requires stronger assumptions than prospective, randomized studies. PMID:21149699

  8. Recent developments in optical detection methods for microchip separations.

    PubMed

    Götz, Sebastian; Karst, Uwe

    2007-01-01

    This paper summarizes the features and performances of optical detection systems currently applied in order to monitor separations on microchip devices. Fluorescence detection, which delivers very high sensitivity and selectivity, is still the most widely applied method of detection. Instruments utilizing laser-induced fluorescence (LIF) and lamp-based fluorescence along with recent applications of light-emitting diodes (LED) as excitation sources are also covered in this paper. Since chemiluminescence detection can be achieved using extremely simple devices which no longer require light sources and optical components for focusing and collimation, interesting approaches based on this technique are presented, too. Although UV/vis absorbance is a detection method that is commonly used in standard desktop electrophoresis and liquid chromatography instruments, it has not yet reached the same level of popularity for microchip applications. Current applications of UV/vis absorbance detection to microchip separations and innovative approaches that increase sensitivity are described. This article, which contains 85 references, focuses on developments and applications published within the last three years, points out exciting new approaches, and provides future perspectives on this field.

  9. Development of imaging methods to assess adiposity and metabolism.

    PubMed

    Heymsfield, S B

    2008-12-01

    Body composition studies were first recorded around the time of the renaissance, and advances by the mid-twentieth century facilitated growth in the study of physiology, metabolism and pathological states. The field developed during this early period around the 'two-compartment' molecular level model that partitions body weight into fat and fat-free mass. Limited use was also made of X-rays as a means of estimating fat-layer thickness, but the revolutionary advance was brought about by the introduction of three-dimensional images provided by computed tomography (CT) in the mid 1970s, followed soon thereafter by magnetic resonance imaging (MRI). Complete in vivo reconstruction of all major anatomic body compartments and tissues became possible, thus providing major new research opportunities. This imaging revolution has continued to advance with further methodology refinements including functional MRI, diffusion tensor imaging and combined methods such as positron emission tomography+CT or MRI. The scientific advances made possible by these new and innovative methods continue to unfold today and hold enormous promise for the future of obesity research.

  10. Assessing methods for developing crop forecasting in the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Ines, A. V. M.; Capa Morocho, M. I.; Baethgen, W.; Rodriguez-Fonseca, B.; Han, E.; Ruiz Ramos, M.

    2015-12-01

    Seasonal climate prediction may allow predicting crop yield to reduce the vulnerability of agricultural production to climate variability and its extremes. It has already been demonstrated that seasonal climate predictions at the European (or Iberian) scale from ensembles of global coupled climate models have some skill (Palmer et al., 2004). The limited predictability that the atmosphere exhibits in mid-latitudes, and therefore in the Iberian Peninsula (IP), can be managed by a probabilistic approach based on terciles. This study presents an application for the IP of two methods for linking tercile-based seasonal climate forecasts with crop models to improve crop predictability. Two methods for disaggregating seasonal rainfall forecasts into daily weather realizations were evaluated and applied: 1) a stochastic weather generator and 2) a forecast tercile resampler. Both methods were evaluated in a case study in which the impacts of two seasonal rainfall forecasts (a wet forecast for 1998 and a dry forecast for 2015) on rainfed wheat yield and irrigation requirements of maize in the IP were analyzed. Simulated wheat yield and irrigation requirements of maize were computed with the crop models CERES-wheat and CERES-maize, which are included in the Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at several locations in Spain where the crop model was calibrated and validated with independent field data. These methodologies would allow quantifying the benefits and risks of a seasonal climate forecast for potential users such as farmers, agroindustry and insurance companies in the IP. Therefore, we would be able to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse ones. References: Palmer, T. et al., 2004. Development of a European multimodel ensemble system for seasonal-to-interannual prediction (DEMETER). Bulletin of the
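    A forecast tercile resampler of the kind the abstract mentions can be sketched as follows (our simplified illustration, not the authors' implementation): historical seasons are split into rainfall terciles, and years are resampled with the forecast's tercile probabilities to build a weather ensemble for the crop model.

    ```python
    import random

    def tercile_resample(seasonal_rain, probs, n=100, rng=random):
        """seasonal_rain: {year: seasonal rainfall total}
        probs: forecast probabilities for (below, near, above) normal, summing to 1."""
        years = sorted(seasonal_rain, key=seasonal_rain.get)
        k = len(years) // 3
        terciles = [years[:k], years[k:2 * k], years[2 * k:]]
        ensemble = []
        for _ in range(n):
            u, acc = rng.random(), 0.0
            for tercile, p in zip(terciles, probs):
                acc += p
                if u < acc:
                    ensemble.append(rng.choice(tercile))
                    break
            else:
                ensemble.append(rng.choice(terciles[-1]))  # guard against rounding
        return ensemble
    ```

    Each resampled year's daily weather record is then fed to a crop model such as CERES-wheat or CERES-maize, so the ensemble of simulated yields reflects the forecast probabilities.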

  11. Development of an oligonucleotide microarray method for Salmonella serotyping

    PubMed Central

    Tankouo‐Sandjong, B.; Sessitsch, A.; Stralis‐Pavese, N.; Liebana, E.; Kornschober, C.; Allerberger, F.; Hächler, H.; Bodrossy, L.

    2008-01-01

    Summary Adequate identification of Salmonella enterica serovars is a prerequisite for any epidemiological investigation. This is traditionally obtained via a combination of biochemical and serological typing. However, primary strain isolation and traditional serotyping is time‐consuming and faster methods would be desirable. A microarray, based on two housekeeping and two virulence marker genes (atpD, gyrB, fliC and fljB), has been developed for the detection and identification of the two species of Salmonella (S. enterica and S. bongori), the five subspecies of S. enterica (II, IIIa, IIIb, IV, VI) and 43 S. enterica ssp. enterica serovars (covering the most prevalent ones in Austria and the UK). A comprehensive set of probes (n = 240), forming 119 probe units, was developed based on the corresponding sequences of 148 Salmonella strains, successfully validated with 57 Salmonella strains and subsequently evaluated with 35 blind samples including isolated serotypes and mixtures of different serotypes. Results demonstrated a strong discriminatory ability of the microarray among Salmonella serovars. Threshold for detection was 1 colony forming unit per 25 g of food sample following overnight (14 h) enrichment. PMID:21261872

  12. Titanium matrix composite thermomechanical fatigue analysis method development

    NASA Astrophysics Data System (ADS)

    Ball, Dale Leray

    1998-12-01

    The results of complementary experimental and analytical investigations of thermomechanical fatigue of both unidirectional and crossply titanium matrix composite laminates are presented. Experimental results are given for both isothermal and thermomechanical fatigue tests which were based on simple, constant-amplitude mechanical and thermal loading profiles. The discussion of analytical methods includes the development of titanium matrix composite laminate constitutive relationships, the development of damage models and the integration of both into a thermomechanical fatigue analysis algorithm. The technical approach begins with a micro-mechanical formulation of lamina response. Material behavior at the ply level is based on a mechanics-of-materials approach using thermo-elastic fibers and a thermo-elasto-viscoplastic matrix. The effects of several types of distributed damage are included in the material constitutive relationships at the ply level in the manner of continuum damage mechanics. The modified ply constitutive relationships are then used in an otherwise unmodified classical lamination theory treatment of laminate response. Finally, simple models for damage progression are utilized in an analytical framework which recalculates response and increments damage sizes at every load point in an applied thermal/mechanical load history. The model is used for the prediction of isothermal fatigue and thermomechanical fatigue life of unnotched, unidirectional [0°]4 and crossply [0°/90°]s titanium matrix composite laminates. The results of corresponding isothermal and thermomechanical fatigue tests are presented in detail and the correlation between experimental and analytical results is established in certain cases.

  13. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, which were designed to automate functions and decisions associated with a combat aircraft's subsystems, are discussed. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Simulation and comparative workload results for two mission scenarios are given. The scenarios are an inbound surface-to-air missile attack on the aircraft and pilot incapacitation. The methodology used to develop the AUTOCREW knowledge bases is summarized. Issues involved in designing the navigation sensor selection expert in AUTOCREW's NAVIGATOR knowledge base are discussed in detail. The performance of seven navigation systems aiding a medium-accuracy INS was investigated using Kalman filter covariance analyses. A navigation sensor management (NSM) expert system was formulated from covariance simulation data using the analysis of variance (ANOVA) method and the ID3 algorithm. ANOVA results show that statistically different position accuracies are obtained when different navaids are used, when the number of navaids aiding the INS is varied, when the aircraft's trajectory is varied, and when the performance history is varied. The ID3 algorithm determines the NSM expert's classification rules in the form of decision trees. The performance of these decision trees was assessed on two arbitrary trajectories, and the results demonstrate that the NSM expert adapts to new situations and provides reasonable estimates of the expected hybrid performance.

  14. Developing sub-domain verification methods based on GIS tools

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Foley, T. A.; Raby, J. W.

    2014-12-01

    The meteorological community makes extensive use of the Model Evaluation Tools (MET), developed by the National Center for Atmospheric Research, for numerical weather prediction model verification through grid-to-point, grid-to-grid, and object-based domain-level analyses. MET Grid-Stat has been used to perform grid-to-grid neighborhood verification to account for the uncertainty inherent in high resolution forecasting, and the MET Method for Object-based Diagnostic Evaluation (MODE) has been used to develop techniques for object-based spatial verification of high resolution forecast grids for continuous meteorological variables. High resolution modeling requires more focused spatial and temporal verification over parts of the domain. With a Geographical Information System (GIS), researchers can now consider terrain type/slope, land use effects, and other spatial and temporal variables as explanatory metrics in model assessments. GIS techniques, when coupled with high resolution point and gridded observation sets, allow location-based approaches that permit discovery of the spatial and temporal scales at which models do not sufficiently resolve the desired phenomena. In this paper we discuss our initial GIS approach to verify WRF-ARW with a one-kilometer horizontal resolution inner domain centered over Southern California. Southern California contains a mixture of urban, suburban, agricultural, and mountainous terrain types, along with a rich array of observational data with which to illustrate our ability to conduct sub-domain verification.

  15. Various methods and developments for calibrating seismological sensors at EOST

    NASA Astrophysics Data System (ADS)

    JUND, H.; Bès de Berc, M.; Thore, J.

    2013-12-01

    Calibrating seismic sensors is crucial for knowing the quality of a sensor and for generating precise dataless files. We present here three calibration methods that we have developed for the short-period and broadband sensors included in the temporary and permanent seismic networks in France. First, in the case of a short-period sensor with no electronics and no calibration coil, we inject a sine-wave signal into the signal coil. After locking the sensor mass, we connect a signal-wave voltage generator and a series resistor to the coil and send a sinusoidal signal to the sensor's signal coil. We measure both the voltage at the terminals of the resistor, which gives an image of the current entering the signal coil, and the voltage at the terminals of the signal coil. The frequency of the generator is then varied in order to find a phase shift of π/2 between the two signals; the output frequency of the generator at that point corresponds to the natural frequency of the sensor. Second, in the case of all types of sensors provided with a calibration coil, we inject different signals into the calibration coil. We usually apply two signals: a step signal and a sweep (or wobble) signal. A step signal into the calibration coil is equivalent to a Dirac excitation in derived acceleration. The response to this Dirac gives the transfer function of the signal coil, derived twice and without absolute gain. We developed a field module allowing us to always apply the same excitation to various models of seismometers, in order to compare the results from several instruments previously installed in the field. A wobble signal is a signal whose frequency varies. By varying the frequency of the input signal around the sensor's natural frequency, we obtain an immediate response of the sensor in acceleration. This method is particularly suitable for avoiding disturbances which may modify the signal of a permanent station. Finally, for the determination of absolute

  16. Sperm morphology and preparation method affect bovine embryonic development.

    PubMed

    Walters, Anneke H; Eyestone, Willard E; Saacke, Richard G; Pearson, Ronald E; Gwazdauskas, Francis C

    2004-01-01

    This study was conducted to evaluate the effect of sperm separation methods on embryonic development after in vitro fertilization (IVF), using semen samples collected from bulls subjected to scrotal insulation, and to determine whether IVF results would be affected by various heparin concentrations. Morphologically abnormal semen samples were obtained and cryopreserved from Holstein bulls following scrotal insulation for 48 hours. Standard protocols using the Percoll gradient (90%/45%) method and the swim-up method were used to separate spermatozoa fractions in experiment I. The pellet (A(p)) and the 45% layer (B(p)) were isolated from the Percoll separation, while for the swim-up separation, the supernatant (A(s)) and the interphase (B(s)) were isolated. The overall blastocyst rate for our laboratory control semen was 23.1 +/- 2.1% for Percoll separations (A(p) and B(p)) and 18.2 +/- 2.0% for swim-up separations (A(s) and B(s)). This rate was higher (P <.01) than the rate observed for semen collected 5 days before insulation from the bull that had the greatest response to scrotal insult, which was 9.2 +/- 2.1% for the Percoll separation and 20.7 +/- 2.3% for the swim-up separation, while semen from 27 days after scrotal insulation (D +27) resulted in no blastocyst formation for the Percoll separation and a 4.2 +/- 2.1% rate for the swim-up separation. In experiment II, semen was sampled from the bulls that responded in the greatest and least degrees to scrotal insult, 5 days before scrotal insulation (D -5) and on days 23 (D +23) and 34 (D +34) after scrotal insulation. These samples were exposed to IVF media with 3 different heparin concentrations (0.1, 1.0, and 10 microg/mL). There was a significant difference (P <.05) in developmental scores between the D -5 (1.08 +/- 0.08), D +23 (0.9 +/- 0.08), and D +34 (0.8 +/- 0.08) samples, but no differences were observed in blastocyst formation based on the number of cleaved embryos. Increasing the heparin concentration

  17. Development of Stable Solidification Method for Insoluble Ferrocyanides-13170

    SciTech Connect

    Ikarashi, Yuki; Masud, Rana Syed; Mimura, Hitoshi; Ishizaki, Eiji; Matsukura, Minoru

    2013-07-01

    The development of a stable solidification method for insoluble ferrocyanide sludge is an important subject for safe decontamination at Fukushima NPP-1. By using the excellent immobilizing properties of zeolites, such as gas-trapping ability and self-sintering properties, stable solidification of insoluble ferrocyanides was accomplished. The immobilization ratio of Cs for K{sub 2}[CoFe(CN){sub 6}].nH{sub 2}O saturated with Cs{sup +} ions (Cs{sub 2}[CoFe(CN){sub 6}].nH{sub 2}O) was estimated to be less than 0.1% above 1,000 deg. C; the adsorbed Cs{sup +} ions are completely volatilized. In contrast, a novel stable solid form was produced by press-sintering a mixture of Cs{sub 2}[CoFe(CN){sub 6}].nH{sub 2}O and zeolites at higher temperatures of 1,000 deg. C and 1,100 deg. C; Cs volatilization and cyanide release were completely suppressed. The immobilization ratio of Cs, under mixing conditions of Cs{sub 2}[CoFe(CN){sub 6}].nH{sub 2}O:CP = 1:1 and a calcining temperature of 1,000 deg. C, was estimated to be nearly 100%. As for the kinds of zeolites, natural mordenite (NM), clinoptilolite (CP), and chabazite tended to have higher immobilization ratios than zeolite A. This may be due to the difference in phase transformation between the natural zeolites and synthetic zeolite A. In the case of the composites (K{sub 2-X}Ni{sub X/2}[NiFe(CN){sub 6}].nH{sub 2}O loaded on natural mordenite), a relatively high immobilization ratio of Cs was also obtained. This method using zeolite matrices can be applied to the stable solidification of solid wastes of insoluble ferrocyanide sludge. (authors)

  18. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool used to determine whether the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the selected independent variables, to determine which are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher-order equations representing the design space.
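    The regression step above starts from a designed set of runs. As a minimal sketch of the underlying idea, here is main-effect estimation from a full two-level factorial design, with a hypothetical stand-in for the propulsion-code response (the function and its coefficients are invented, not from the paper):

```python
from itertools import product

def main_effects(response, factors=2):
    """Estimate main effects from a full 2^k factorial design.

    `response(x)` maps a tuple of coded levels (-1/+1) to the measured
    output; the main effect of factor i is
    mean(response at level +1) - mean(response at level -1).
    """
    runs = list(product((-1, 1), repeat=factors))
    ys = {x: response(x) for x in runs}
    effects = []
    for i in range(factors):
        hi = [y for x, y in ys.items() if x[i] == 1]
        lo = [y for x, y in ys.items() if x[i] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# Hypothetical stand-in for the propulsion code: a thrust-like metric
# driven strongly by factor 0 and weakly by factor 1.
model = lambda x: 100 + 8 * x[0] + 2 * x[1]
print(main_effects(model))  # → [16.0, 4.0]
```

    Screening designs of this kind are what make the "first pass" above cheap: the least influential geometric variables can be dropped before fitting higher-order regression equations.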

  19. A Scale Development for Teacher Competencies on Cooperative Learning Method

    ERIC Educational Resources Information Center

    Kocabas, Ayfer; Erbil, Deniz Gokce

    2017-01-01

    The cooperative learning method is an active learning method that has been studied for many years, both in Turkey and around the world. Although the cooperative learning method has a place in training programs, it cannot be implemented fully in accordance with its principles. The results of the research point out that teachers have problems with…

  20. Development of Exploration Methods for Engineered Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Iovenitti, J. L.; Tibuleac, I. M.; Hopkins, D.; Cladouhos, T.; Karlin, R. E.; Wannamaker, P. E.; Kennedy, B. M.; Blackwell, D. D.; Clyne, M.

    2010-12-01

    The principal objective of an exploration program is to identify exploration drilling targets that will advance a prospect towards development and full-scale production, or to relinquish interest in the prospect. The key geoscience parameters for Engineered Geothermal Systems (EGS) exploration are temperature, lithology, and stress state at an economically feasible target depth. Our project tests the hypothesis that our proposed exploration methodology will identify potential EGS drilling targets at Dixie Valley. Dixie Valley was chosen as the test calibration site because it is a highly characterized geothermal resource with a sufficiently large database in the public domain. The U.S. Department of Energy Geothermal Technologies Program, under the American Recovery and Reinvestment Act, has awarded funding to AltaRock to develop exploration methods for EGS by integrating geophysical, geological, and geochemical data sets. New seismic, gravity, magnetotellurics (MT), and geochemical data will be collected and integrated into the model to improve model coverage and resolution. Other model inputs will include geology, fault kinematics, fracture characterization, and earthquake fault-plane solutions to provide information on stress state. Where appropriate, additional geochemical measurements will be made to model geothermal temperatures at depth. The resulting integrated data model will be used to predict the EGS parameters of interest (temperature, lithology, and stress state) with greater certainty and a "higher degree of non-uniqueness" across the test area. We hypothesize that successful EGS drilling targets will be identifiable through integration of existing and new geoscience data coupled with geostatistical analysis and Subject Matter Expertise. Both the existing data and the existing plus new data will be integrated into separate data models on a 5 x 5 km grid with 1 km depth slices. The results of each data model will be evaluated for the degree of improvement relative to the parameters of

  1. Development of Combinatorial Methods for Alloy Design and Optimization

    SciTech Connect

    Pharr, George M.; George, Easo P.; Santella, Michael L

    2005-07-01

    The primary goal of this research was to develop a comprehensive methodology for designing and optimizing metallic alloys by combinatorial principles. Because conventional techniques for alloy preparation are unavoidably restrictive in the range of alloy composition that can be examined, combinatorial methods promise to significantly reduce the time, energy, and expense needed for alloy design. Combinatorial methods can be developed not only to optimize existing alloys, but to explore and develop new ones as well. The scientific approach involved fabricating an alloy specimen with a continuous distribution of binary and ternary alloy compositions across its surface--an "alloy library"--and then using spatially resolved probing techniques to characterize its structure, composition, and relevant properties. The three specific objectives of the project were: (1) to devise means by which simple test specimens with a library of alloy compositions spanning the range of interest can be produced; (2) to assess how well the properties of the combinatorial specimen reproduce those of conventionally processed alloys; and (3) to devise screening tools which can be used to rapidly assess the important properties of the alloys. As proof of principle, the methodology was applied to the Fe-Ni-Cr ternary alloy system that constitutes many commercially important materials such as stainless steels and the H-series and C-series heat- and corrosion-resistant casting alloys. Three different techniques were developed for making alloy libraries: (1) vapor deposition of discrete thin films on an appropriate substrate, followed by alloying them together by solid-state diffusion; (2) co-deposition of the alloying elements from three separate magnetron sputtering sources onto an inert substrate; and (3) localized melting of thin films with a focused electron-beam welding system. Each of the techniques was found to have its own advantages and disadvantages.
A new and very powerful technique for

  2. [A non invasive method for assessing sexual development at adolescence].

    PubMed

    Lejarraga, Horacio; Berner, Enrique; del Pino, Mariana; Medina, Viviana; Cameron, Noel

    2009-10-01

    Observational assessments of puberty that invade the adolescent's privacy are not acceptable for research in population groups, and results based on self-assessment have been variable, in many cases poor. Our aim was to evaluate the validity of a questionnaire with simple questions designed to assess early, intermediate, and advanced puberty periods rather than specific stages. In an outpatient clinic at the Service of Adolescence of a public hospital, 188 healthy girls and 142 healthy boys aged 8-18 years, plus 36 girls and boys aged 8-9 years attending a public school, were studied. The adolescents were attending the Service for the first time; those with chronic diseases were excluded from the study. The children answered the questionnaire before entering the doctor's office, where a trained professional clinically assessed their pubertal development (Tanner stage). The questionnaire had previously been tested in 30 adolescents. The highest concordance was found for the questions "Have you started puberty?" with Tanner stages III, IV, or V (kappa value = 0.60); "Have you already had your first menstrual period?" with stages IV-V (K = 0.69); and "Do you shave?" with stages IV-V (K = 0.66). In most cases, these questions showed high (> or = 0.80) sensitivity and specificity for detecting the mentioned puberty periods. The method proved reliable, and its further evaluation in non-medical settings (schools, households, etc.) is recommended.
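    The kappa values quoted above measure agreement between the questionnaire and the clinical Tanner assessment, corrected for chance agreement. A minimal sketch of Cohen's kappa, computed on invented toy ratings rather than the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed - chance agreement) / (1 - chance agreement)."""
    n = len(rater_a)
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed agreement
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum(ca[k] * cb[k] for k in ca) / n**2              # chance agreement
    return (po - pe) / (1 - pe)

# Toy data: questionnaire answer vs. clinical staging (hypothetical labels).
questionnaire = ["late", "late", "early", "early", "late", "early"]
clinical      = ["late", "late", "early", "late",  "late", "early"]
print(round(cohens_kappa(questionnaire, clinical), 2))  # → 0.67
```

    Values near 0.6-0.7, as reported above, are conventionally read as substantial agreement.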

  3. Wavelet Methods Developed to Detect and Control Compressor Stall

    NASA Technical Reports Server (NTRS)

    Le, Dzu K.

    1997-01-01

    A "wavelet" is, by definition, an amplitude-varying, short waveform with a finite bandwidth (e.g., that shown in the first two graphs). Naturally, wavelets are more effective than the sinusoids of Fourier analysis for matching and reconstructing signal features. In wavelet transformation and inversion, all transient or periodic data features (as in compressor-inlet pressures) can be detected and reconstructed by stretching or contracting a single wavelet to generate the matching building blocks. Consequently, wavelet analysis provides many flexible and effective ways to reduce noise and extract signals in ways that surpass classical techniques, making it very attractive for data analysis, modeling, and active control of stall and surge in high-speed turbojet compressors. Therefore, fast and practical wavelet methods are being developed in-house at the NASA Lewis Research Center to assist in these tasks. This includes establishing user-friendly links between some fundamental wavelet analysis ideas and the classical theories (or practices) of system identification, data analysis, and processing.
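    A one-level Haar analysis step, the simplest wavelet transform, illustrates the matching-building-blocks idea: a transient such as a pressure jump concentrates into a single detail coefficient. This sketch is purely illustrative and is not the Lewis in-house code:

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform: pairwise scaled averages
    (coarse approximation) and differences (detail/transient content)."""
    s = 1.0 / math.sqrt(2.0)
    pairs = list(zip(signal[0::2], signal[1::2]))
    approx = [(a + b) * s for a, b in pairs]
    detail = [(a - b) * s for a, b in pairs]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact reconstruction from one Haar analysis step."""
    s = 1.0 / math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

# A step in the signal (as in a surge precursor) shows up in exactly one
# detail coefficient, which is what makes wavelets useful for detection.
approx, detail = haar_step([4.0, 4.0, 4.0, 10.0])
print(detail)                        # only the pair spanning the jump is nonzero
print(haar_inverse(approx, detail))  # recovers the original signal
```

    Fourier analysis would spread the same jump across many sinusoidal coefficients, which is the contrast the abstract draws.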

  4. Development of immunoaffinity chromatographic method for Ara h 2 isolation.

    PubMed

    Wu, Zhihua; Zhang, Ying; Zhan, Shaode; Lian, Jun; Zhao, Ruifang; Li, Kun; Tong, Ping; Li, Xin; Yang, Anshu; Chen, Hongbing

    2017-03-01

    Ara h 2 is considered a major allergen in peanut. Due to the difficulty of its separation, Ara h 2 has not been fully studied. An immunoaffinity chromatography (IAC) column can separate a target protein with high selectivity, which makes it possible to purify Ara h 2 from different samples. In this study, an IAC method was developed to purify Ara h 2 and its performance was evaluated. By coupling polyclonal antibody (pAb) to CNBr-activated Sepharose 4B, a column for specific extraction was constructed. The coupling efficiency of the IAC column was higher than 90%, which brought the column capacity to 0.56 mg per 0.15 g of medium (dry weight). The recovery of Ara h 2 ranged from 93% to 100% for different concentrations of pure Ara h 2 solutions in 15 min. After using a column 10 times, about 88% of the column capacity remained. When applied to extract Ara h 2 from raw peanut protein extract and boiled peanut protein extract, the IAC column recovered 94% and 88% of the target protein from the mixture, respectively. SDS-PAGE and Western blotting analysis confirmed that the purified protein was Ara h 2, with a purity of about 90%. Significantly, the IAC column could capture the dimer of Ara h 2, which makes it feasible to prepare derivatives of the protein after processing. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Developing improved metamodels by combining phenomenological reasoning with statistical methods

    NASA Astrophysics Data System (ADS)

    Bigelow, James H.; Davis, Paul K.

    2002-07-01

    A metamodel is a relatively small, simple model that approximates the behavior of a large, complex model. A common and superficially attractive way to develop a metamodel is to generate data from a number of large-model runs and then use off-the-shelf statistical methods without attempting to understand the model's internal workings. This paper describes research illuminating why it is important and fruitful, in some problems, to improve the quality of such metamodels by using various types of phenomenological knowledge. The benefits are sometimes mathematically subtle, but strategically important, as when one is dealing with a system that could fail if any of several critical components fail. Naive metamodels may fail to reflect the individual criticality of such components and may therefore be quite misleading if used for policy analysis. Naive metamodeling may also give very misleading results on the relative importance of inputs, thereby skewing resource-allocation decisions. By inserting an appropriate dose of theory, however, such problems can be greatly mitigated. Our work is intended to be a contribution to the emerging understanding of multiresolution, multiperspective modeling (MRMPM), as well as a contribution to interdisciplinary work combining virtues of statistical methodology with virtues of more theory-based work. Although the analysis we present is based on a particular experiment with a particular large and complex model, we believe that the insights are more general.
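    A minimal sketch of the motivating case above, with invented numbers: for a serial system whose success probability is the product of component reliabilities, a theory-informed (product-form) metamodel preserves component criticality, while a naive additive surrogate does not:

```python
# A system works only if every critical component works, so P(success) is
# a product of component reliabilities. A phenomenologically informed
# metamodel keeps the product form; a naive additive surrogate fitted to
# the same data averages the inputs instead.
def true_model(p1, p2):
    return p1 * p2  # serial system: both components must survive

def product_metamodel(p1, p2):
    return p1 * p2  # form matches the known system structure

def additive_metamodel(p1, p2):
    return (p1 + p2) / 2 - 0.25  # naive linear fit (illustrative coefficients)

# Near a balanced operating point the two surrogates agree...
print(true_model(0.5, 0.5), additive_metamodel(0.5, 0.5))  # → 0.25 0.25
# ...but when one critical component fails outright, only the product
# form reports certain system failure.
print(true_model(1.0, 0.0), additive_metamodel(1.0, 0.0))  # → 0.0 0.25
```

    This is the mathematically subtle but strategically important failure mode the abstract describes: the additive fit looks accurate on average yet is badly wrong exactly where policy analysis needs it.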

  6. Comparative effectiveness research: Policy context, methods development and research infrastructure.

    PubMed

    Tunis, Sean R; Benner, Joshua; McClellan, Mark

    2010-08-30

    Comparative effectiveness research (CER) has received substantial attention as a potential approach for improving health outcomes while lowering costs of care, and for improving the relevance and quality of clinical and health services research. The Institute of Medicine defines CER as 'the conduct and synthesis of systematic research comparing different interventions and strategies to prevent, diagnose, treat, and monitor health conditions. The purpose of this research is to inform patients, providers, and decision-makers, responding to their expressed needs, about which interventions are most effective for which patients under specific circumstances.' Improving the methods and infrastructure for CER will require sustained attention to the following issues: (1) Meaningful involvement of patients, consumers, clinicians, payers, and policymakers in key phases of CER study design and implementation; (2) Development of methodological 'best practices' for the design of CER studies that reflect decision-maker needs and balance internal validity with relevance, feasibility and timeliness; and (3) Improvements in research infrastructure to enhance the validity and efficiency with which CER studies are implemented. The approach to addressing each of these issues should be informed by the understanding that the primary purpose of CER is to help health care decision makers make informed clinical and health policy decisions. Copyright (c) 2010 John Wiley & Sons, Ltd.

  7. Development of fatigue life evaluation method using small specimen

    NASA Astrophysics Data System (ADS)

    Nogami, Shuhei; Nishimura, Arata; Wakai, Eichi; Tanigawa, Hiroyasu; Itoh, Takamoto; Hasegawa, Akira

    2013-10-01

    For developing the fatigue life evaluation method using small specimen, the effect of specimen size and shape on the fatigue life of the reduced activation ferritic/martensitic steels (F82H-IEA, F82H-BA07 and JLF-1) was investigated by the fatigue test at room temperature in air using round-bar and hourglass specimens with various specimen sizes (test section diameter: 0.85-10 mm). The round-bar specimen showed no specimen size and no specimen shape effects on the fatigue life, whereas the hourglass specimen showed no specimen size effect and obvious specimen shape effect on it. The shorter fatigue life of the hourglass specimen observed under low strain ranges could be attributed to the shorter micro-crack initiation life induced by the stress concentration dependent on the specimen shape. On the basis of this study, the small round-bar specimen was an acceptable candidate for evaluating the fatigue life using small specimen.

  8. Approaches to improve development methods for therapeutic cancer vaccines.

    PubMed

    Ogi, Chizuru; Aruga, Atsushi

    2015-04-01

    Therapeutic cancer vaccines are an immunotherapy that amplify or induce an active immune response against tumors. Notably, limitations in the methodology for existing anti-cancer drugs may subsist while applying them to cancer vaccine therapy. A retrospective analysis was performed using information obtained from ClinicalTrials.gov, PubMed, and published articles. Our research evaluated the optimal methodologies for therapeutic cancer vaccines based on (1) patient populations, (2) immune monitoring, (3) tumor response evaluation, and (4) supplementary therapies. Failure to optimize these methodologies at an early phase may impact development at later stages; thus, we have proposed some points to be considered during the early phase. Moreover, we compared our proposal with the guidance for industry issued by the US Food and Drug Administration in October 2011 entitled "Clinical Considerations for Therapeutic Cancer Vaccines". Consequently, while our research was aligned with the guidance, we hope it provides further insights in order to predict the risks and benefits and facilitate decisions for a new technology. We identified the following points for consideration: (1) include in the selection criteria the immunological stage with a prognostic value, which is as important as the tumor stage; (2) select immunological assays such as phenotype analysis of lymphocytes, based on their features and standardize assay methods; (3) utilize optimal response criteria for immunotherapy in therapeutic cancer vaccine trials; and (4) consider supplementary therapies, including immune checkpoint inhibitors, for future therapeutic cancer vaccines.

  9. Development of a harmonised method for the profiling of amphetamines: III. Development of the gas chromatographic method.

    PubMed

    Andersson, Kjell; Jalava, Kaisa; Lock, Eric; Finnon, Yvonne; Huizer, Henk; Kaa, Elisabet; Lopes, Alvaro; Poortman-van der Meer, Anneke; Cole, Michael D; Dahlén, Johan; Sippola, Erkki

    2007-06-14

    This study focused on gas chromatographic analysis of target compounds found in illicit amphetamine synthesised by the Leuckart reaction, reductive amination of benzyl methyl ketone, and the nitrostyrene route. The analytical method was investigated and optimised with respect to introduction of amphetamine samples into the gas chromatograph and separation and detection of the target substances. Sample introduction using split and splitless injection was tested at different injector temperatures, and the ability of each to transfer the target compounds to the GC column was evaluated using cold on-column injection as a reference. Taking the results from both techniques into consideration, a temperature of 250 degrees C was considered to be the best compromise. The most efficient separation was achieved with a DB-35MS capillary column (35% diphenyl 65% dimethyl silicone; 30 m x 0.25 mm, d(f) 0.25 microm) and an oven temperature program that started at 90 degrees C (1 min) and was increased by 8 degrees C/min to 300 degrees C (10 min). Reproducibility, repeatability, linearity, and limits of determination for the flame ionisation detector (FID), nitrogen phosphorous detector (NPD), and mass spectrometry (MS) in scan mode and selected ion monitoring (SIM) mode were evaluated. In addition, selectivity was studied applying FID and MS in both scan and SIM mode. It was found that reproducibility, repeatability, and limits of determination were similar for FID, NPD, and MS in scan mode. Moreover, the linearity was better when applying FID or NPD, whereas the selectivity was better when utilising the MS. Finally, the introduction of target compounds to the GC column was compared for injection volumes of 0.2 microl, 1 microl, 2 microl, and 4 microl with splitless injection, and 1 microl with split injection (split ratio, 1:40). It was demonstrated that splitless injections of 1 microl, 2 microl, and 4 microl could be employed in the developed method, while split

  10. Development of Probabilistic Methods to Assess Meteotsunami Hazards

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Ten Brink, U. S.

    2014-12-01

    A probabilistic method to assess the hazard from meteotsunamis is developed from both probabilistic tsunami hazard analysis (PTHA) and probabilistic storm-surge forecasting. Meteotsunamis are unusual sea level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation, similar to that used in PTHA, incorporates different meteotsunami sources. A historical record of 116 pressure disturbances recorded between 2000 and 2013 by the U.S. Automated Surface Observing Stations (ASOS) along the U.S. East Coast is used to establish a continuous analytic distribution of each source parameter as well as the overall Poisson rate of occurrence. Initially, atmospheric parameters are considered independently such that the joint probability distribution is given by the product of each marginal distribution. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of pressure disturbances is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a finite-difference hydrodynamic model that solves for the linearized long-wave equations. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using 20 synthetic catalogs of 116 events each, resampled from the parent parameter distributions, yield mean and quantile hazard curves. An example is presented for four Mid-Atlantic sites using ASOS data in which only atmospheric pressure disturbances from squall lines and derechos are considered. Results indicate that site-to-site variations among meteotsunami hazard curves are related to the geometry and width of the adjacent continental shelf. 
The new hazard analysis of meteotsunamis is important for
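    A minimal sketch of the Monte Carlo aggregation described above, with invented parameter distributions and a toy amplitude function standing in for the finite-difference hydrodynamic model (the ~8.3 events/yr rate roughly matches 116 disturbances over 2000-2013):

```python
import random

def hazard_curve(rate_per_year, amplitude_model, thresholds,
                 n_events=50000, seed=1):
    """Monte Carlo hazard curve: sample a synthetic catalog of pressure
    disturbances, convert each to a wave amplitude, and return the
    annualized rate of exceeding each threshold."""
    rng = random.Random(seed)
    amps = []
    for _ in range(n_events):
        speed = rng.gauss(25.0, 5.0)       # disturbance speed, m/s (invented)
        dp = rng.lognormvariate(0.0, 0.5)  # pressure jump, hPa (invented)
        amps.append(amplitude_model(speed, dp))
    return {t: rate_per_year * sum(a > t for a in amps) / n_events
            for t in thresholds}

# Toy stand-in for the hydrodynamic model: amplitude (m) grows with the
# pressure jump and peaks when the disturbance speed nears an assumed
# shelf long-wave speed of 28 m/s (Proudman-resonance-like behavior).
toy_amp = lambda speed, dp: 0.1 * dp / (1.0 + abs(speed - 28.0))

curve = hazard_curve(8.3, toy_amp, thresholds=(0.01, 0.05, 0.1))
for t, rate in sorted(curve.items()):
    print(f"amplitude > {t} m: {rate:.3f} events/yr")
```

    Plotting exceedance rate against threshold gives the hazard curve; resampling the catalog, as the study does with 20 synthetic catalogs, yields mean and quantile curves.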

  11. Retention modeling and method development in hydrophilic interaction chromatography.

    PubMed

    Tyteca, Eva; Périat, Aurélie; Rudaz, Serge; Desmet, Gert; Guillarme, Davy

    2014-04-11

    In the present study, the possibility of retention modeling in the HILIC mode was investigated, testing several different literature relationships over a wide range of analytical conditions (column chemistries and mobile phase pH) and using analytes possessing diverse physico-chemical properties. Furthermore, it was investigated how the retention prediction depends on the number of isocratic or gradient trial or initial scouting runs. The most promising set of scouting runs seems to be a combination of three isocratic runs (95, 90, and 70% ACN) and one gradient run (95 to 65% ACN in 10 min), as the average prediction errors were lower than those obtained using six equally spaced isocratic runs, and because it is common in method development (MD) to perform at least one scouting gradient run in the screening step to find the best column, temperature, and pH conditions. Overall, the retention predictions were much less accurate in HILIC than what is usually experienced in RPLC. This has severe implications for MD, as it restricts the use of commercial software packages that require the simulation of the retention of every peak in the chromatogram. To overcome this problem, the recently proposed predictive elution window shifting and stretching (PEWS(2)) approach can be used. In this computer-assisted MD strategy, only an (approximate) prediction of the retention of the first and the last peak in the chromatogram is required to conduct a well-targeted trial-and-error search, with suggested search conditions uniformly covering the entire possible search and elution space. This strategy was used to optimize the separation of three representative pharmaceutical mixtures possessing diverse physico-chemical properties (pteridines, saccharides, and a cocktail of drugs/metabolites). All problems could be successfully handled in less than 2.5 h of instrument time (including equilibration).
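    As a sketch of the retention-prediction idea, one commonly tested relationship is a log-linear model, ln k = ln k0 - S*phi, fitted from two isocratic scouting runs and then used to predict retention at another mobile-phase composition. The retention factors below are invented, and real HILIC data often deviate from such simple models, which is precisely the paper's point:

```python
import math

def fit_lss(phi1, k1, phi2, k2):
    """Fit the two-parameter model ln k = ln k0 - S*phi from two isocratic
    runs (phi = organic-modifier fraction, k = retention factor)."""
    S = (math.log(k1) - math.log(k2)) / (phi2 - phi1)
    ln_k0 = math.log(k1) + S * phi1
    return ln_k0, S

def predict_k(ln_k0, S, phi):
    """Predicted retention factor at composition phi."""
    return math.exp(ln_k0 - S * phi)

# Hypothetical retention factors for one analyte at 95% and 70% ACN
# (k increases with ACN in HILIC, so the fitted S is negative here).
ln_k0, S = fit_lss(0.95, 8.0, 0.70, 1.2)
print(round(predict_k(ln_k0, S, 0.90), 2))  # interpolated k at 90% ACN
```

    With more than two scouting runs the same model is fitted by least squares; the study's conclusion is that prediction errors from such fits are markedly larger in HILIC than in RPLC.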

  12. Leadership Development Expertise: A Mixed-Method Analysis

    ERIC Educational Resources Information Center

    Okpala, Comfort O.; Hopson, Linda B.; Chapman, Bernadine; Fort, Edward

    2011-01-01

    In this study, the impact of graduate curriculum, experience, and standards in the development of leadership expertise were examined. The major goals of the study were to (1) examine the impact of college content curriculum in the development of leadership expertise, (2) examine the impact of on the job experience in the development of leadership…

  13. [Cognitive functions, their development and modern diagnostic methods].

    PubMed

    Klasik, Adam; Janas-Kozik, Małgorzata; Krupka-Matuszczyk, Irena; Augustyniak, Ewa

    2006-01-01

Cognitive psychology is an interdisciplinary field whose main aim is to study the thinking mechanisms of humans leading to cognizance. Therefore the concept of human cognitive processes encompasses the knowledge related to the mechanisms which determine the way humans acquire information from the environment and utilize their knowledge and experience. There are three basic processes which need to be distinguished when discussing human perception development: acquiring sensations, perceptiveness and attention. Acquiring sensations means the experience arising from the stimulation of a single sense organ, i.e. detection and differentiation of sensory information. Perceptiveness stands for the interpretation of sensations and may include recognition and identification of sensory information. The attention process relates to the selectivity of perception. Mental processes of the higher order used in cognition, thanks to which humans try to understand the world and adapt to it, doubtlessly include the processes of memory, reasoning, learning and problem solving. There is a great difference in human cognitive functioning at different stages of one's life (from infancy to adulthood). The difference is both quantitative and qualitative. There are three main approaches to the development of human cognitive functioning: Jean Piaget's approach, the information processing approach and the psychometric approach. Piaget's ideas continue to form the groundwork of child cognitive psychology. Piaget identified four developmental stages of child cognition: 1. Sensorimotor stage (birth - 2 years old); 2. Preoperational stage (ages 2-7); 3. Concrete operations (ages 7-11); 4. Formal operations (ages 11 and up). The supporters of the information processing approach use a computer metaphor to present the human cognitive processes functioning model. The three important mechanisms involved are: coding, automation and strategy designing, and they often cooperate. This theory has

  14. Development of Infrared Radiation Heating Method for Sustainable Tomato Peeling

    USDA-ARS?s Scientific Manuscript database

    Although lye peeling is the widely industrialized method for producing high quality peeled fruit and vegetable products, the peeling method has resulted in negative impacts by significantly exerting both environmental and economic pressure on the tomato processing industry due to its associated sali...

  15. Developing a multimodal biometric authentication system using soft computing methods.

    PubMed

    Malcangi, Mario

    2015-01-01

Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize its applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision-making.
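The score-level fusion and decision stage can be illustrated with a toy sketch; the weights, thresholds, and function names below are hypothetical, not the chapter's actual DSP/ANN/FLE implementation:

```python
# Toy sketch of multimodal score fusion for two biometric matchers
# (voiceprint + fingerprint). All weights and thresholds are illustrative.

def fuse_scores(voice_score, finger_score, w_voice=0.4, w_finger=0.6):
    """Weighted-sum fusion of two normalized match scores in [0, 1]."""
    return w_voice * voice_score + w_finger * finger_score

def decide(fused, accept=0.7, reject=0.4):
    """Three-way decision: accept, reject, or defer to a fallback check."""
    if fused >= accept:
        return "accept"
    if fused <= reject:
        return "reject"
    return "defer"

print(decide(fuse_scores(0.9, 0.8)))  # strong match on both modalities
```

A fuzzy logic engine like the one described generalizes the hard thresholds above into graded membership functions and rules, which is what makes the fused decision robust to borderline scores.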

  16. The change and development of statistical methods used in research articles in child development 1930-2010.

    PubMed

    Køppe, Simo; Dammeyer, Jesper

    2014-09-01

The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used. During the last 30 years there has been explosive growth in the use of more advanced statistical methods. The absence of statistical methods, or the use of only simple ones, has been all but eliminated.

  17. Developing Scientific Thinking Methods and Applications in Islamic Education

    ERIC Educational Resources Information Center

    Al-Sharaf, Adel

    2013-01-01

    This article traces the early and medieval Islamic scholarship to the development of critical and scientific thinking and how they contributed to the development of an Islamic theory of epistemology and scientific thinking education. The article elucidates how the Qur'an and the Sunna of Prophet Muhammad have also contributed to the…

  18. Sustainable Development Index in Hong Kong: Approach, Method and Findings

    ERIC Educational Resources Information Center

    Tso, Geoffrey K. F.; Yau, Kelvin K. W.; Yang, C. Y.

    2011-01-01

    Sustainable development is a priority area of research in many countries and regions nowadays. This paper illustrates how a multi-stakeholders engagement process can be applied to identify and prioritize the local community's concerns and issues regarding sustainable development in Hong Kong. Ten priority areas covering a wide range of community's…

  1. Adult Education in Development. Methods and Approaches from Changing Societies.

    ERIC Educational Resources Information Center

    McGivney, Veronica; Murray, Frances

    The case studies described in this book provide examples of initiatives illustrating the role of adult education in development and its contribution to the process of change in developing countries. The book is organized in five sections. Case studies in Part 1, "Health Education," illustrate the links between primary health care and…

  2. Graphical programming interface: A development environment for MRI methods.

    PubMed

    Zwart, Nicholas R; Pipe, James G

    2015-11-01

    To introduce a multiplatform, Python language-based, development environment called graphical programming interface for prototyping MRI techniques. The interface allows developers to interact with their scientific algorithm prototypes visually in an event-driven environment making tasks such as parameterization, algorithm testing, data manipulation, and visualization an integrated part of the work-flow. Algorithm developers extend the built-in functionality through simple code interfaces designed to facilitate rapid implementation. This article shows several examples of algorithms developed in graphical programming interface including the non-Cartesian MR reconstruction algorithms for PROPELLER and spiral as well as spin simulation and trajectory visualization of a FLORET example. The graphical programming interface framework is shown to be a versatile prototyping environment for developing numeric algorithms used in the latest MR techniques. © 2014 Wiley Periodicals, Inc.

  3. Method development for optimum recovery of Yersinia pestis ...

    EPA Pesticide Factsheets

The primary goal of this project was to determine the best combination of sampling swab, pre-moistening agent, transport media, and extraction method for high-efficiency recovery of Y. pestis and F. tularensis vegetative cells.

  4. RESEARCH ON THE DEVELOPMENT OF SEDIMENT TOXICITY IDENTIFICATION (TIE) METHODS

    EPA Science Inventory

    A common method for determining whether contaminants in sediments represent an environmental risk is to perform toxicity tests. Toxicity tests indicate if contaminants in sediments are bioavailable and capable of causing adverse biological effects (e.g., mortality, reduced growt...

  5. DEVELOPMENT OF MANUFACTURING METHODS FOR LIGHTWEIGHT METAL FOIL HEAT EXCHANGERS.

    DTIC Science & Technology

MICROSTRUCTURE, TENSILE PROPERTIES, STRESSES, SPOT WELDS, COATINGS, SILICIDES, OXIDATION, TEST METHODS, PHOSPHORUS ALLOYS, ALUMINUM ALLOYS, NIOBIUM ALLOYS, PRESSURE, THERMAL JOINING, AEROSPACE CRAFT, DIFFUSION, BONDING, VACUUM FURNACES, SOLDERED JOINTS

  6. Developments in Methods of Analysis for Naphthalene Sulfonates.

    PubMed

    Hashemi, Sayyed Hossein; Kaykhaii, Massoud

    2017-03-04

Naphthalene sulfonates are highly water-soluble compounds with high mobility in aquatic systems and high temperature stability, and they are important substances in the chemical industry. This review covers analytical methods, instruments and techniques used for the pre-concentration and analysis of naphthalene sulfonates in different matrices. All analytical steps, including extraction from real samples, detection by spectrophotometric and chromatographic techniques, and methods of identification of this class of compounds, are described in detail. The method normally employed for the extraction and pre-concentration of these compounds is solid-phase extraction (including molecularly imprinted polymers and anion exchange), while quantification is performed using high-performance liquid chromatography, capillary electrophoresis, gas chromatography-mass spectrometry, liquid chromatography-mass spectrometry and spectrophotometric techniques. In this review, in addition to chromatographic and spectrophotometric methods, electrochemical innovations appearing in the literature will also be explored.

  7. Development and implementation of a DFT/MIA method

    NASA Astrophysics Data System (ADS)

    Rousseau, Bart

    2001-11-01

In the half century that has passed since the advent of the quantum chemical era, computational quantum chemistry methods have matured into valuable tools for researchers in industry and academia alike. However, for these methods to be competitive with, e.g., the high-throughput experimental techniques used in the pharmaceutical industry, they must not only be accurate but at the same time computationally inexpensive. Most contemporary computational methods succeed in fulfilling only one of these criteria. Therefore the goal of this Ph.D. project was to combine the MIA approach, which allows efficient SCF calculations on large systems, with the DFT method, which takes electron correlation into account at a moderate computational cost. The new method thus obtained, the DFT/MIA method, allows for efficient correlated calculations on large systems. The MIA method, an efficient combination of the Multiplicative Integral Approximation and the direct SCF procedure, is implemented in the ab initio quantum chemical program package BRABO. In the MIA approximation the product of two basis functions is expanded in terms of an auxiliary basis set. This reduces the N^4 four-center two-electron integrals to a sum of N^3 three-center two-electron integrals, and in addition allows for a very fast build-up of the Fock matrix. The MIA approach has already proven its effectiveness in calculations on systems that are among the largest calculated at this level of theory, such as the calculation on the 46-residue peptide crambin. In addition, this method was implemented using the 'Parallel Virtual Machine' method, allowing parallel execution on a heterogeneous cluster of workstations. The MIA approach is applied both to the calculation of the electron density and to the calculation of the exchange-correlation contribution to the Fock matrix. For both these quantities a recursive procedure is used.
Test calculations on water clusters ranging in size from
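The expansion at the heart of the MIA approach, as described above, can be sketched in standard density-fitting notation; the auxiliary functions and coefficients below are generic placeholders, not the thesis's exact notation:

```latex
% Product of two basis functions expanded in an auxiliary basis {xi_a}:
\chi_\mu(\mathbf{r})\,\chi_\nu(\mathbf{r}) \;\approx\; \sum_a c_a^{\mu\nu}\,\xi_a(\mathbf{r})
% A four-center two-electron integral then factorizes into three-center ones:
(\mu\nu|\lambda\sigma) \;\approx\; \sum_a c_a^{\mu\nu}\,(\xi_a|\lambda\sigma)
% which reduces the O(N^4) integral count to O(N^3), as stated in the abstract.
```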

  8. [Current trends in the development of salmonella detection methods].

    PubMed

    Krüger, G

    1989-11-01

Cultural methods require 4 to 7 days for presumptive evidence of Salmonella in foodstuffs. Attempts at shortening this time have resulted in combining pre-enrichment or selective enrichment with time-saving genetic or immunological tests. Advances in enzyme immunoassays for Salmonella screening are important: several commercial Salmonella test kits involve monoclonal antibodies and fluorescent or chemiluminescent substrates. The advantages and disadvantages of rapid EIA methods in particular are discussed.

  9. Development of Discontinuous Galerkin Method for the Linearized Euler Equations

    DTIC Science & Technology

    2003-02-01

Since the LEE are linear, Fj(Uh) is expanded in a natural way, as can be seen from Eq. (7). Furthermore, Atkins and Lockard [5] ... Discontinuous Galerkin Method for Hyperbolic Equations, AIAA Journal, Vol. 36, pp. 775-782, 1998. [5] H.L. Atkins and D.P. Lockard, A High-Order Method Using

  10. The Crawford Slip Method: An Organizational Development Technique

    DTIC Science & Technology

    1987-09-01

Thesis Chairman: John A. Ballard, Ph.D., LTC, USAF, Assistant Professor of Management and Organizational Behavior. ... This study examined the advantages and disadvantages of the Crawford Slip Method relative to attitudinal surveys; investigated the relationships between survey variables and the Crawford Slip Method; and examined the relationships between the content of the Crawford Slip Method and attitudinal variables.

  11. Isentropic Bulk Modulus: Development of a Federal Test Method

    DTIC Science & Technology

    2016-01-01

ranging from 30-80 °C and applied pressures of 1,000-18,000 psi. This method has been applied successfully to aviation turbine fuels and diesel fuels composed of petroleum, synthetic, and alternative feedstocks. Bulk modulus, currently referenced as

  12. Pilot-in-the-Loop CFD Method Development

    DTIC Science & Technology

    2014-08-01

expected later this summer. 5. References: 1. Interpolating Scattered Data - MATLAB & Simulink, website, http://www.mathworks.com/help/matlab ... converted into structured grid type. In order to do that, MATLAB has been used to interpolate the scattered data onto a uniform structured grid ... A linear interpolation method was used for the data conversion. MATLAB uses the Delaunay triangulation method to draw a triangle that encloses the query point and

  13. Development of Image Segmentation Methods for Intracranial Aneurysms

    PubMed Central

    Qian, Yi; Morgan, Michael

    2013-01-01

Though vascular segmentation provides vital means for the visualization, diagnosis, and quantification underlying treatment decisions for vascular pathologies, it remains a process marred by numerous challenges. In this study, we validated segmentations of eight aneurysms using two existing methods: the Region Growing Threshold and the Chan-Vese model. These methods were evaluated by comparing their results with those of manual segmentation. Based upon this validation study, we propose a new Threshold-Based Level Set (TLS) method to overcome the existing problems. With the divergent segmentation methods, the volumes of the aneurysm models differed by up to 24%. The local anatomical shapes of the arteries around the aneurysms were likewise found to significantly influence the results. In contrast, the volume differences obtained with the TLS method remained relatively low, at only around 5%, revealing inherent limitations in current cerebrovascular segmentation. The proposed TLS method holds potential for automatic aneurysm segmentation without setting a seed point or intensity threshold. This technique will further enable the segmentation of anatomically complex cerebrovascular shapes, allowing for more accurate and efficient simulations from medical imagery. PMID:23606905

  14. Development of image segmentation methods for intracranial aneurysms.

    PubMed

    Sen, Yuka; Qian, Yi; Avolio, Alberto; Morgan, Michael

    2013-01-01

Though vascular segmentation provides vital means for the visualization, diagnosis, and quantification underlying treatment decisions for vascular pathologies, it remains a process marred by numerous challenges. In this study, we validated segmentations of eight aneurysms using two existing methods: the Region Growing Threshold and the Chan-Vese model. These methods were evaluated by comparing their results with those of manual segmentation. Based upon this validation study, we propose a new Threshold-Based Level Set (TLS) method to overcome the existing problems. With the divergent segmentation methods, the volumes of the aneurysm models differed by up to 24%. The local anatomical shapes of the arteries around the aneurysms were likewise found to significantly influence the results. In contrast, the volume differences obtained with the TLS method remained relatively low, at only around 5%, revealing inherent limitations in current cerebrovascular segmentation. The proposed TLS method holds potential for automatic aneurysm segmentation without setting a seed point or intensity threshold. This technique will further enable the segmentation of anatomically complex cerebrovascular shapes, allowing for more accurate and efficient simulations from medical imagery.
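The baseline Region Growing Threshold idea that both studies evaluate can be sketched on a toy image; this is a generic illustration of the technique, not the authors' TLS implementation:

```python
import numpy as np
from collections import deque

# Toy region-growing threshold segmentation: grow a region from a seed
# pixel, accepting 4-connected neighbors whose intensity exceeds a threshold.

def region_grow(image, seed, threshold):
    """Return a boolean mask of pixels reachable from `seed` above `threshold`."""
    mask = np.zeros(image.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if mask[y, x] or image[y, x] < threshold:
            continue
        mask[y, x] = True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < image.shape[0] and 0 <= nx < image.shape[1] and not mask[ny, nx]:
                queue.append((ny, nx))
    return mask

img = np.zeros((5, 5))
img[1:4, 1:4] = 1.0                      # bright 3x3 "vessel" on a dark background
seg = region_grow(img, seed=(2, 2), threshold=0.5)
print(int(seg.sum()))                    # 9 pixels segmented
```

The dependence on a seed point and an intensity threshold visible here is exactly what the proposed TLS method aims to remove.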

  15. Development of a Magnetic Attachment Method for Bionic Eye Applications.

    PubMed

    Fox, Kate; Meffin, Hamish; Burns, Owen; Abbott, Carla J; Allen, Penelope J; Opie, Nicholas L; McGowan, Ceara; Yeoh, Jonathan; Ahnood, Arman; Luu, Chi D; Cicione, Rosemary; Saunders, Alexia L; McPhedran, Michelle; Cardamone, Lisa; Villalobos, Joel; Garrett, David J; Nayagam, David A X; Apollo, Nicholas V; Ganesan, Kumaravelu; Shivdasani, Mohit N; Stacey, Alastair; Escudie, Mathilde; Lichter, Samantha; Shepherd, Robert K; Prawer, Steven

    2016-03-01

Successful visual prostheses require stable, long-term attachment. Epiretinal prostheses, in particular, require attachment methods to fix the prosthesis onto the retina. The most common method is fixation with a retinal tack; however, tacks cause retinal trauma, and surgical proficiency is important to ensure optimal placement of the prosthesis near the macula. Accordingly, alternate attachment methods are required. In this study, we detail a novel method of magnetic attachment for an epiretinal prosthesis using two prosthesis components positioned on opposing sides of the retina. The magnetic attachment technique was piloted in a feline animal model (chronic, nonrecovery implantation). We also detail a new method to reliably control the magnet coupling force using heat. It was found that the force exerted upon the tissue separating the two components could be minimized, as the measured force is proportionally smaller at the working distance. We thus detail, for the first time, a surgical method using customized magnets to position and affix an epiretinal prosthesis on the retina. The position of the epiretinal prosthesis is reliable, and its location on the retina is accurately controlled by the placement of a secondary magnet in the suprachoroidal location. The electrode position above the retina is less than 50 microns at the center of the device, although there were pressure points at the two edges due to curvature misalignment. The degree of retinal compression found in this study was unacceptably high; nevertheless, the normal structure of the retina remained intact under the electrodes.

  16. Development of an SPE/CE method for analyzing HAAs

    USGS Publications Warehouse

    Zhang, L.; Capel, P.D.; Hozalski, R.M.

    2007-01-01

    The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.

  17. Development of a Matched Runs Method for VERITAS

    NASA Astrophysics Data System (ADS)

    Flinders, Andrew; VERITAS Collaboration

    2016-03-01

VERITAS is an array of four Imaging Air Cherenkov Telescopes located in southern Arizona. It has been successful in detecting Very High Energy (VHE) radiation from a variety of sources including pulsars, Pulsar Wind Nebulae, Blazars, and High Mass X-Ray Binary systems. Each of these detections has been accomplished using either the standard Ring Background Method or the Reflected Region Method to determine the appropriate background for the source region. For highly extended sources (>1 degree) these background estimation methods become unsuitable due to the possibility of source contamination in the background regions. A new method, called the matched background method, has been implemented for potentially highly extended sources observed by VERITAS. It provides an algorithm for obtaining a suitable gamma-ray background estimate from a different field of view than the source region. By carefully matching cosmic-ray event rates between the source and background sky observations, a suitable gamma-ray background matched data set can be identified. We will describe the matched background method and give examples of its use for several sources including the Crab Nebula and IC443. This research is supported by grants from the U.S. Department of Energy Office of Science, the U.S. National Science Foundation and the Smithsonian Institution, and by NSERC in Canada.
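The run-matching idea can be illustrated with a toy sketch; the run names, rates, and tolerance below are made up, and the actual VERITAS implementation matches on more detailed observing conditions than a single rate:

```python
# Toy sketch of matching source-field runs to background-field runs by
# cosmic-ray event rate. All run IDs and rates (Hz) are illustrative.

source_runs = {"src1": 310.5, "src2": 298.0}
background_runs = {"bkg1": 312.0, "bkg2": 285.0, "bkg3": 299.1}

def match_runs(source, background, tolerance=5.0):
    """Pair each source run with the background run of closest CR rate,
    keeping only pairs whose rates agree within `tolerance`."""
    pairs = {}
    for run, rate in source.items():
        best = min(background, key=lambda b: abs(background[b] - rate))
        if abs(background[best] - rate) <= tolerance:
            pairs[run] = best
    return pairs

print(match_runs(source_runs, background_runs))
```

Matching on cosmic-ray rate is a proxy for matching overall observing conditions, so the background field's gamma-ray-like event distribution can stand in for that of the (possibly source-filled) signal field.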

  18. Methods and challenges in quantitative imaging biomarker development.

    PubMed

    Abramson, Richard G; Burton, Kirsteen R; Yu, John-Paul J; Scalzetti, Ernest M; Yankeelov, Thomas E; Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M

    2015-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This article, drafted by the Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  19. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ASTROVIRUS IN WATER

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus. We have developed a sensitive reverse transcription-polymerase ...

  1. The historical development of the magnetic method in exploration

    USGS Publications Warehouse

    Nabighian, M.N.; Grauch, V.J.S.; Hansen, R.O.; LaFehr, T.R.; Li, Y.; Peirce, J.W.; Phillips, J.D.; Ruder, M.E.

    2005-01-01

The magnetic method, perhaps the oldest of geophysical exploration techniques, blossomed after the advent of airborne surveys in World War II. With improvements in instrumentation, navigation, and platform compensation, it is now possible to map the entire crustal section at a variety of scales, from strongly magnetic basement at regional scale to weakly magnetic sedimentary contacts at local scale. Methods of data filtering, display, and interpretation have also advanced, especially with the availability of low-cost, high-performance personal computers and color raster graphics. The magnetic method is the primary exploration tool in the search for minerals. In other arenas, the magnetic method has evolved from its sole use for mapping basement structure to include a wide range of new applications, such as locating intrasedimentary faults, defining subtle lithologic contacts, mapping salt domes in weakly magnetic sediments, and better defining targets through 3D inversion. These new applications have increased the method's utility in all realms of exploration - in the search for minerals, oil and gas, geothermal resources, and groundwater, and for a variety of other purposes such as natural hazards assessment, mapping impact structures, and engineering and environmental studies. © 2005 Society of Exploration Geophysicists. All rights reserved.

  2. Development of an extraction method for perchlorate in soils.

    PubMed

    Cañas, Jaclyn E; Patel, Rashila; Tian, Kang; Anderson, Todd A

    2006-03-01

Perchlorate originates as a contaminant in the environment from its use in solid rocket fuels and munitions. The current US EPA methods for perchlorate determination via ion chromatography using conductivity detection do not include recommendations for the extraction of perchlorate from soil. This study evaluated and identified appropriate conditions for the extraction of perchlorate from clay loam, loamy sand, and sandy soils. Based on the results of this evaluation, soils should be extracted in a dry, ground (mortar and pestle) state with Milli-Q water at a 1:1 soil:water ratio and diluted no more than 5-fold before analysis. When sandy soils were extracted in this manner, the calculated method detection limit was 3.5 µg/kg. The findings of this study have aided in the establishment of a standardized extraction method for perchlorate in soil.
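The "calculated method detection limit" quoted above is conventionally obtained from replicate low-level spikes; a sketch using the common EPA-style formula MDL = t × s, with made-up replicate values (the paper's own data are not reproduced here):

```python
import statistics

# Illustrative method detection limit (MDL) calculation: MDL = t * s, where
# s is the standard deviation of n replicate low-level spiked samples and t
# is the one-tailed 99% Student's t value for n-1 degrees of freedom.
# The replicate concentrations below (microg/kg) are invented for the sketch.

replicates = [3.1, 3.9, 3.3, 3.6, 2.8, 3.4, 3.7]   # n = 7 spiked samples
t_99_df6 = 3.143                                    # Student's t, df = 6, alpha = 0.01

s = statistics.stdev(replicates)
mdl = t_99_df6 * s
print(round(mdl, 2))
```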

  3. Methods for Modeling Brassinosteroid-Mediated Signaling in Plant Development.

    PubMed

    Frigola, David; Caño-Delgado, Ana I; Ibañes, Marta

    2017-01-01

Mathematical modeling of biological processes is a useful tool to draw conclusions that are contained in the data but not directly reachable, as well as to make predictions and select the most efficient follow-up experiments. Here we outline a method to model systems of a few proteins that interact transcriptionally and/or posttranscriptionally, by representing the system as ordinary differential equations (ODEs) and studying the model's dynamics and stationary states. We exemplify this method by focusing on the regulation by the brassinosteroid (BR) signaling component BRASSINOSTEROID INSENSITIVE1 ETHYL METHYL SULFONATE SUPPRESSOR1 (BES1) of BRAVO, a quiescence-regulating transcription factor expressed in the quiescent cells of Arabidopsis thaliana roots. The method to extract the stationary states and the dynamics is provided as a Mathematica code and requires basic knowledge of the Mathematica software to be executed.
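The general workflow the abstract describes (write the circuit as ODEs, then find its stationary state) can be sketched in Python; the chapter provides Mathematica code, and the toy repression model and parameters below are hypothetical, not the actual BES1/BRAVO equations:

```python
import numpy as np

# Toy two-variable transcriptional circuit: a repressor ("bes1") with
# constitutive turnover represses a target ("bravo") via a Hill function.
# All rate constants are invented for illustration.

def derivatives(state, beta=1.0, K=0.5, n=2, delta=1.0):
    bes1, bravo = state
    dbes1 = 0.2 - 0.2 * bes1                                  # production - decay
    dbravo = beta * K**n / (K**n + bes1**n) - delta * bravo   # Hill repression - decay
    return np.array([dbes1, dbravo])

# Integrate with forward Euler until the system settles at its stationary state.
state = np.array([0.0, 0.0])
dt = 0.01
for _ in range(20000):
    state = state + dt * derivatives(state)

print(np.round(state, 3))  # steady state: bes1 -> 1.0, bravo -> 0.2
```

Setting the derivatives to zero and solving analytically gives the same fixed point (bes1* = 1, bravo* = β·K²/(K² + 1) = 0.2), which is the kind of stationary-state analysis the chapter performs symbolically in Mathematica.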

  4. Development of a rapid assimilable organic carbon method for water.

    PubMed

    Lechevallier, M W; Shaw, N E; Kaplan, L A; Bott, T L

    1993-05-01

    A rapid method for measurement of assimilable organic carbon (AOC) is proposed. The time needed to perform the assay is reduced by increasing the incubation temperature and increasing the inoculum density. The ATP luciferin-luciferase method quickly enumerates the test organisms without the need for plate count media or dilution bottles. There was no significant difference between AOC values determined with strain P17 for the ATP and plate count procedures. For strain NOX, the plate count procedure underestimated bacterial levels in some samples. Comparison of AOC values obtained by the Belleville laboratory (by the ATP technique) and the Stroud Water Research Center (by plate counts) showed that values were significantly correlated and not significantly different. The study concludes that the rapid AOC method can quickly determine the bacterial growth potential of water within 2 to 4 days.

  5. Development method of the motor winding's ultrasonic cleaning equipment

    NASA Astrophysics Data System (ADS)

    Jiang, Yingzhan; Wang, Caiyuan; Ao, Chenyang; Zhang, Haipeng

    2013-03-01

Solving the difficult problem of motor winding cleaning requires new technologies such as ultrasonic cleaning. The mechanisms by which the insulation level of a motor winding degrades over time, and by which the winding becomes damp again soon after treatment, were analyzed. The ultrasonic cleaning method was studied, and an ultrasonic cleaning device was designed. Its safety was verified by destructive testing. Tests show that this device can thoroughly remove deposited dirt from the winding, providing a new approach and method for ensuring the winding's insulation level and realizing its safe and reliable operation.

  6. New Method for Data Treatment Developed at ESO

    NASA Astrophysics Data System (ADS)

    1996-08-01

scientific return from the VLT and other telescopes such as the HST best be optimised? It is exactly for this reason that astronomers and engineers at ESO are now busy developing new methods of telescope operation and data analysis alongside the VLT instrumental hardware itself.

The new solution by means of models. The appropriate strategy for making progress in the inherent conflict between calibration demand and the time available for scientific observations is to obtain a physically correct understanding of the effects exerted on the data by the different instruments. In this way, it is possible to decide which calibration data are actually required and on which timescale they have to be updated. One can then use computer models of these instruments to predict calibration solutions that are valid for the full range of target properties and that handle environmental conditions properly.

Such computer models can also be used to simulate observations, which benefits the entire observational process. First, the astronomer can prepare observations and select instrumental modes and exposure times suited for optimal information return. Secondly, it provides confidence in the validity of the calibration process, and therefore in the cleanliness of the corrected data. Finally, once a theory about the target and its properties has been developed, one may simulate observations of a set of theoretical targets whose properties are slightly modified in order to study their influence on the raw data.

For the observatory there are also advantages. Optimization from the point of view of data analysis can take place already during instrument design; calibration and data-analysis procedures for any observational mode can be tested before real observations are obtained; and the maintenance staff can make sure that the instrument performs as expected and designed.

How far have we come along this road? The present project consists of a close collaboration between

  7. Development of Fingerprinting Method in Sediment Source Studies

    NASA Astrophysics Data System (ADS)

    Du, Pengfei; Ning, Duihu; Huang, Donghao

    2016-04-01

Sediment source studies are valuable for watershed sediment budgets, sediment control in channels, soil erosion model validation, and evaluating the benefits of soil and water conservation. As one of the methods for identifying sediment sources, fingerprinting has proven effective and has been adopted in countries around the world. This paper briefly reviews the fingerprinting method in terms of models, diagnostic sediment properties, regions of application, spatial and temporal scales, and the classification of sediment source types. Combined with environmental radionuclides as time markers (such as 137Cs and 210Pb), the method makes it possible to reconstruct sediment source histories. However, several uncertainties remain to be resolved when applying the fingerprinting technique to sediment studies: efficient sampling strategies linking sediment sources to fingerprint properties need to be clarified, detailed methods for spatial-scale links (up-scaling and down-scaling) should be provided, and model calibration needs to be updated to improve estimation precision. (This paper is a contribution to the project of the National Natural Science Foundation of China (No. 41501299), the non-profit project of the Ministry of Water Resources of China (No. 201501045), and the Youth Scientific Research project of the China Institute of Water Resources and Hydropower Research (using the fingerprinting technique to study sediment sources in a typical small watershed of the black soil region in northeast China).)

  8. Developing and Understanding Methods for Large-Scale Nonlinear Optimization

    DTIC Science & Technology

    2006-07-24

algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with many thousands... "Published in peer-reviewed journals": E. Eskow, B. Bader, R. Byrd, S. Crivelli, T. Head-Gordon, V. Lamberti and R. Schnabel, "An optimization approach to the

  9. Antibody humanization methods for development of therapeutic applications.

    PubMed

    Ahmadzadeh, Vahideh; Farajnia, Safar; Feizi, Mohammad Ali Hosseinpour; Nejad, Ramezan Ali Khavari

    2014-04-01

    Recombinant antibody technologies are rapidly becoming available and showing considerable clinical success. However, the immunogenicity of murine-derived monoclonal antibodies is restrictive in cancer immunotherapy. Humanized antibodies can overcome these problems and are considered to be a promising alternative therapeutic agent. There are several approaches for antibody humanization. In this article we review various methods used in the antibody humanization process.

  10. Integrating Methods and Materials: Developing Trainees' Reading Skills.

    ERIC Educational Resources Information Center

    Jarvis, Jennifer

    1987-01-01

    Explores issues arising from a research project which studied ways of meeting the reading needs of trainee primary school teachers (from Malawi and Tanzania) of English as a foreign language. Topics discussed include: the classroom teaching situation; teaching "quality"; and integration of materials and methods. (CB)

  11. DEVELOPMENT OF MOLECULAR METHODS TO DETECT EMERGING VIRUSES

    EPA Science Inventory

A large number of human enteric viruses are known to cause gastrointestinal illness and waterborne outbreaks. Many of these are emerging viruses that do not grow or grow poorly in cell culture and so molecular detection methods based on the polymerase chain reaction (PCR) are be...

  12. Developing a Brazilian Band Method Book: Phase II.

    ERIC Educational Resources Information Center

    Barbosa, Joel Luis

    1999-01-01

    Relates a pilot test of an elementary band method book for group instruction in Brazilian music education. Focused on the amount of content taught within three one-hour classes per week and studied the quality of learning. Concludes that the group covered 17 pages of the book, learned outside material, and performed four concerts. (CMK)

  13. DEVELOPMENT OF MOLECULAR METHODS TO DETECT EMERGING VIRUSES

    EPA Science Inventory

A large number of human enteric viruses are known to cause gastrointestinal illness and waterborne outbreaks. Many of these are emerging viruses that do not grow or grow poorly in cell culture and so molecular detection methods based on the polymerase chain reaction (PCR) are be...

  14. The Project Method: Its Vocational Education Origin and International Development.

    ERIC Educational Resources Information Center

    Knoll, Michael

    1997-01-01

    Traces the history of the project as a teaching method from the art academies of Renaissance Rome and Paris to European and U.S. technical universities, manual training and industrial arts, and the influence of Kilpatrick and Dewey in the early 20th century. (SK)

  15. Development of a Computerised Method of Determining Aircraft Maintenance Intervals.

    DTIC Science & Technology

    1985-09-01

components, there is no simple method for solving the problem (27:446). Vergin and Scriabin considered dynamic programming for a two-component system... "Repair at Failure and Adjustment Costs," Naval Research Logistics Quarterly, Vol 22, No 2, June 1975. 32. Vergin, Roger C. and Michael Scriabin

  16. Fluorimetric analysis of pesticides: Methods, recent developments and applications.

    PubMed

    Coly, A; Aaron, J J

    1998-08-01

    The fluorimetric analysis of pesticides is reviewed with emphasis on the description of direct and indirect fluorimetric methods, including chemical derivatization, fluorogenic labelling, and photochemically-induced fluorescence. The use of fluorescence detection in TLC, HPLC and FIA as well as applications to environmental samples are discussed in detail.

  17. Standardized development of computer software. Part 1: Methods

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.

  18. Trust in healthcare settings: Scale development, methods, and preliminary determinants

    PubMed Central

    LoCurto, Jamie; Berg, Gina M

    2016-01-01

    The literature contains research regarding how trust is formed in healthcare settings but rarely discusses trust formation in an emergent care population. A literature review was conducted to determine which of the trust determinants are important for this process as well as how to develop a scale to measure trust. A search generated a total of 155 articles, 65 of which met eligibility criteria. Determinants that were important included the following: honesty, confidentiality, dependability, communication, competency, fiduciary responsibility, fidelity, and agency. The process of developing a scale includes the following: a literature review, qualitative analysis, piloting, and survey validation. Results suggest that physician behaviors are important in influencing trust in patients and should be included in scales measuring trust. Next steps consist of interviewing emergent care patients to commence the process of developing a scale. PMID:27635245

  19. Development of a virtual metrology method using plasma harmonics analysis

    NASA Astrophysics Data System (ADS)

    Jun, H.; Shin, J.; Kim, S.; Choi, H.

    2017-07-01

    A virtual metrology technique based on plasma harmonics is developed for predicting semiconductor processes. From a plasma process performed by 300 mm photoresist stripper equipment, a strong correlation is found between optical plasma harmonics intensities and the process results, such as the photoresist strip rate and strip non-uniformity. Based on this finding, a general process prediction model is developed. The developed virtual metrology model shows that the R-squared (R2) values between the measured and predicted process results are 95% and 64% for the photoresist strip rate and photoresist strip non-uniformity, respectively. This is the first research on process prediction based on optical plasma harmonics analysis, and the results can be applied to semiconductor processes such as dry etching and plasma enhanced chemical vapor deposition.
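The abstract does not specify the form of the prediction model, but the reported R-squared values suggest a regression of process results on harmonic intensities. A generic sketch of such a virtual-metrology fit follows; all data are synthetic and the function names are illustrative, not the paper's implementation:

```python
# Generic sketch of a virtual-metrology style fit: predict a process
# result (e.g., photoresist strip rate) from optical harmonic
# intensities via ordinary least squares, then report R^2.
import numpy as np

def fit_ols(X, y):
    """Least-squares coefficients (X carries a leading column of ones
    so the first coefficient is the intercept)."""
    beta, *_ = np.linalg.lstsq(np.asarray(X, float), np.asarray(y, float), rcond=None)
    return beta

def r_squared(X, y, beta):
    """Coefficient of determination for the fitted model."""
    Xa, ya = np.asarray(X, float), np.asarray(y, float)
    resid = ya - Xa @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((ya - ya.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

# Synthetic example: strip rate roughly linear in two harmonic intensities.
X = [[1.0, 0.8, 1.2], [1.0, 1.1, 0.9], [1.0, 1.5, 1.4],
     [1.0, 0.6, 0.7], [1.0, 1.3, 1.1], [1.0, 0.9, 1.5]]
y = [205.0, 212.0, 241.0, 187.0, 228.0, 224.0]
beta = fit_ols(X, y)
print(round(r_squared(X, y, beta), 3))
```

With an intercept included, R-squared lies between 0 and 1, which is how figures like the paper's 95% and 64% would be reported.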

  20. Development and Application of Agglomerated Multigrid Methods for Complex Geometries

    NASA Technical Reports Server (NTRS)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.

    2010-01-01

We report progress in the development of agglomerated multigrid techniques for fully unstructured grids in three dimensions, building upon two previous studies focused on efficiently solving a model diffusion equation. We demonstrate a robust fully-coarsened agglomerated multigrid technique for 3D complex geometries, incorporating the following key developments: consistent and stable coarse-grid discretizations, a hierarchical agglomeration scheme, and line-agglomeration/relaxation using prismatic-cell discretizations in the highly-stretched grid regions. A significant speed-up in computer time is demonstrated for a model diffusion problem, the Euler equations, and the Reynolds-averaged Navier-Stokes equations for 3D realistic complex geometries.

  1. Andragogical and Pedagogical Methods for Curriculum and Program Development

    ERIC Educational Resources Information Center

    Wang, Victor C. X., Ed.; Bryan, Valerie C., Ed.

    2014-01-01

    Today's ever-changing learning environment is characterized by the fast pace of technology that drives our society to move forward, and causes our knowledge to increase at an exponential rate. The need for in-depth research that is bound to generate new knowledge about curriculum and program development is becoming ever more relevant.…

  2. Review of methods for developing probabilistic risk assessments

    Treesearch

    D. A. Weinstein; P.B. Woodbury

    2010-01-01

    We describe methodologies currently in use or those under development containing features for estimating fire occurrence risk assessment. We describe two major categories of fire risk assessment tools: those that predict fire under current conditions, assuming that vegetation, climate, and the interactions between them and fire remain relatively similar to their...

  3. Teachers' Perceptions of Edcamp Professional Development: A Q Method Study

    ERIC Educational Resources Information Center

    Brown, Toby

    2015-01-01

    This study described the subjective opinions of teachers about their experiences at Edcamp, an unconference-style form of teacher professional development (PD). Traditional PD has been maligned for being overly expensive and ineffectual in affecting changes in teacher practice. In order to defend teachers' decisions to partake in Edcamp-style PD,…

  4. Pilot-in-the-Loop CFD Method Development

    DTIC Science & Technology

    2014-10-01

FIFO (File In File Out) approach. Initial fully coupled tests have been performed for two different cases: Hover Case I: In an open domain, Hover... interface approach between CRUNCH and GENHEL-PSU has been developed using File In File Out (FIFO). The results will provide a baseline with which to

  5. Andragogical and Pedagogical Methods for Curriculum and Program Development

    ERIC Educational Resources Information Center

    Wang, Victor C. X., Ed.; Bryan, Valerie C., Ed.

    2014-01-01

    Today's ever-changing learning environment is characterized by the fast pace of technology that drives our society to move forward, and causes our knowledge to increase at an exponential rate. The need for in-depth research that is bound to generate new knowledge about curriculum and program development is becoming ever more relevant.…

  6. Developing Writing-Reading Abilities though Semiglobal Methods

    ERIC Educational Resources Information Center

    Macri, Cecilia; Bocos, Musata

    2013-01-01

    Through this research was intended to underline the importance of the semi-global strategies used within thematic projects for developing writing/reading abilities in the first grade pupils. Four different coordinates were chosen to be the main variables of this research: the level of phonological awareness, the degree in which writing-reading…

  7. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ASTROVIRUS IN WATER.

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...

  8. Methods of Fostering Language Development in Deaf Infants. Final Report.

    ERIC Educational Resources Information Center

    Greenstein, Jules M.

Thirty deaf children admitted to an auditory training program before age 2 were studied longitudinally to age 40 months in an investigation of the effectiveness of early intervention, the relationship between mother-child interaction and language acquisition, and the effectiveness of new devices developed for auditory training. Among findings were…

  9. Classroom Coaching: An Emerging Method of Professional Development.

    ERIC Educational Resources Information Center

    Becker, Joanne Rossi

    This project investigated the efficacy of classroom coaching in improving instruction in elementary mathematics classrooms. The coaches involved in this study were participants in a professional development program. The program includes three major aspects: (1) an intensive 3-week summer institute focusing on mathematics content, pedagogical…

  10. DEVELOPMENT OF A MOLECULAR METHOD TO DETECT ASTROVIRUS

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...

  11. Reflection--A Method for Organisational and Individual Development

    ERIC Educational Resources Information Center

    Randle, Hanne; Tilander, Kristian

    2007-01-01

    This paper presents how organisational development can be the results when politicians, managers, social workers and teaching staff take part in reflection. The results are based on a government-funded initiative in Sweden for lowering sick absenteeism. Three local governments introduced reflection as a strategy to combat work related stress and a…

  12. Development of a computer method for predicting lumber cutting yields.

    Treesearch

    Daniel E. Dunmire; George H. Englerth

    1967-01-01

    A system of locating defects in a board by intersecting coordinate points was developed and a computer program devised that used these points to locate all possible clear areas in the board. The computer determined the yields by placing any given size or sizes of cuttings in these clear areas, and furthermore stated the type, location, and number of saw cuts. The...

  13. Measurement Development in Reflective Supervision: History, Methods, and Next Steps

    ERIC Educational Resources Information Center

    Tomlin, Angela M.; Heller, Sherryl Scott

    2016-01-01

    This issue of the "ZERO TO THREE" journal provides a snapshot of the current state of measurement of reflective supervision within the infant-family field. In this article, the authors introduce the issue by providing a brief history of the development of reflective supervision in the field of infant mental health, with a specific focus…

  14. Measurement Development in Reflective Supervision: History, Methods, and Next Steps

    ERIC Educational Resources Information Center

    Tomlin, Angela M.; Heller, Sherryl Scott

    2016-01-01

    This issue of the "ZERO TO THREE" journal provides a snapshot of the current state of measurement of reflective supervision within the infant-family field. In this article, the authors introduce the issue by providing a brief history of the development of reflective supervision in the field of infant mental health, with a specific focus…

  15. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY HEPATITIS E VIRUS

    EPA Science Inventory

    Hepatitis E virus (HEV) is a waterborne emerging pathogen that causes significant illness in the developing world. Thus far, an HEV outbreak has not been reported in the U.S., although a swine variant of the virus is common in Midwestern hogs. Because viruses isolated from two ...

  16. DEVELOPMENT OF A MOLECULAR METHOD TO DETECT ASTROVIRUS

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...

  17. Developing and evaluating rapid field methods to estimate peat carbon

    Treesearch

    Rodney A. Chimner; Cassandra A. Ott; Charles H. Perry; Randall K. Kolka

    2014-01-01

    Many international protocols (e.g., REDD+) are developing inventories of ecosystem carbon stocks and fluxes at country and regional scales, which can include peatlands. As the only nationally implemented field inventory and remeasurement of forest soils in the US, the USDA Forest Service Forest Inventory and Analysis Program (FIA) samples the top 20 cm of organic soils...

  18. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY HEPATITIS E VIRUS

    EPA Science Inventory

    Hepatitis E virus (HEV) is a waterborne emerging pathogen that causes significant illness in the developing world. Thus far, an HEV outbreak has not been reported in the U.S., although a swine variant of the virus is common in Midwestern hogs. Because viruses isolated from two ...

  19. Developing Principals as Racial Equity Leaders: A Mixed Method Study

    ERIC Educational Resources Information Center

    Raskin, Candace F.; Krull, Melissa; Thatcher, Roberta

    2015-01-01

    This article will present information and research on how a college of education is intentionally developing principals to lead with confidence and racial competence. The nation's student achievement research is sobering: our current school systems widen already existing gaps between white students and students of color, (Darling-Hammond, L. 2004,…

  20. Green methods of lignocellulose pretreatment for biorefinery development.

    PubMed

    Capolupo, Laura; Faraco, Vincenza

    2016-11-01

Lignocellulosic biomass is the most abundant, low-cost, bio-renewable resource and holds enormous importance as an alternative source for the production of biofuels and other biochemicals that can be utilized as building blocks for the production of new materials. Enzymatic hydrolysis is an essential step in the bioconversion of lignocellulose to fermentable monosaccharides. However, to enable enzymatic hydrolysis, a pretreatment step is needed to remove the lignin barrier and break down the crystalline structure of cellulose. The present manuscript reviews the most commonly applied "green" pretreatment processes used in the bioconversion of lignocellulosic biomasses within the "biorefinery" concept. In this frame, the effects of the different pretreatment methods on lignocellulosic biomass are described, along with an in-depth discussion of the benefits and drawbacks of each method, including the generation of compounds potentially inhibitory to enzymatic hydrolysis, the effect on cellulose digestibility, the generation of compounds toxic to the environment, and energy and economic demands.

  1. Development of stroke performance measures: definitions, methods, and current measures.

    PubMed

    Reeves, Mathew J; Parker, Carol; Fonarow, Gregg C; Smith, Eric E; Schwamm, Lee H

    2010-07-01

    In the United States and elsewhere, stroke performance measures have been developed to monitor and improve the quality of care. The process by which these measures are developed, implemented, and evaluated is complex, evolving, and not widely understood. We review the methodological development of stroke performance measures in the United States. A literature search identified articles that addressed the development and endorsement of performance measures for stroke care. Emphasis was given to articles specific to acute stroke, but when these were lacking, other cardiovascular diseases were included. Ten process-based performance measures relevant to acute hospital-based stroke care have now been developed and endorsed. These measures include intravenous thrombolysis, deep vein thrombosis prophylaxis, dysphagia screening, stroke education, and discharge-related medications and assessments. There are currently at least 5 major US-based stroke quality improvement programs implementing stroke measures. Data indicate that rapid improvements in the quality of stroke care can be induced by the systematic collection and evaluation of stroke performance measures. However, current stroke measures are relatively limited, addressing only inpatient care and mostly patients with ischemic stroke. Stroke quality improvement is still in its early stages, but data suggest that large-scale improvements in stroke care can result from the implementation of stroke performance measures. Performance measures that address multidisciplinary stroke unit care, outpatient-based care, and patient-oriented outcomes such as functional recovery should be considered. Ongoing challenges relevant to stroke quality improvement include the role of public reporting and the need to link better stroke care to improved patient outcomes.

  2. MANPRINT Methods Monograph: Aiding the Development of Training Constraints

    DTIC Science & Technology

    1989-06-01

great modification, of the Comparison-Based Prediction (CBP) technique. This methodology was pioneered by Tetmeyer (1976) and has been further refined... must be emphasized at this point that Product Four is not conventional CBP. It is a much more powerful technique which shares some general philosophy... with existing CBP methods. The premise behind the choice of this approach is the availability of data on the characteristics of training provided by

  3. Development of an Analytical Method for Explosive Residues in Soil,

    DTIC Science & Technology

    1987-06-01

most popular approaches have relied on either gas chromatography (GC) using electron capture (ECD), thermal energy analysis (TEA), or mass spectro... the sonic bath method was superior to the shaking procedure using acetone, although it is unclear... Johnsen and Starr (1972) also compared the extrac... and Richard (1986) studied the efficiency, based on batch ultrasonic agitation and Soxhlet extraction, of extraction of polycyclic organics from spiked

  4. Requirements and Methods for Management Development Programmes in the Least Developed Countries in Africa.

    ERIC Educational Resources Information Center

    Perry, Chad

    1993-01-01

    Management development is essential for the economic development of least developed countries (LDCs) in Africa. The collectivist culture of LDCs necessitates development of behavior skills and attitudes and a cyclic, experiential learning approach. (SK)

  5. Development of an ultrasonic cleaning method for fuel assemblies

    SciTech Connect

Heki, H.; Komura, S.; Kato, H.; Sakai, H.; Hattori, T.

    1991-01-01

    Almost all radiation buildup in light water reactors is the result of the deposition of activated corrosion and wear products in out-of-core areas. After operation, a significant quantity of corrosion and wear products is deposited on the fuel rods as crud. At refueling shutdowns, these activation products are available for removal. If they can be quickly and easily removed, buildup of radioactivity on out-of-core surfaces and individual exposure dose can be greatly reduced. After studying various physical cleaning methods (e.g., water jet and ultrasonic), the ultrasonic cleaning method was selected as the most effective for fuel assembly cleaning. The ultrasonic cleaning method is especially able to efficiently clean the fuel without removing the channel box. The removed crud in the channel box would be swept out to the filtration unit. Parameter survey tests were carried out to evaluate the optimum conditions for ultrasonic cleaning using a mock-up of a short section of fuel assembly with the channel box. The ultrasonic device used was a 600-W ultrasonic transducer operating at 26-kHz ultrasonic frequency.

  6. Real space electrostatics for multipoles. I. Development of methods

    NASA Astrophysics Data System (ADS)

    Lamichhane, Madan; Gezelter, J. Daniel; Newman, Kathie E.

    2014-10-01

    We have extended the original damped-shifted force (DSF) electrostatic kernel and have been able to derive three new electrostatic potentials for higher-order multipoles that are based on truncated Taylor expansions around the cutoff radius. These include a shifted potential (SP) that generalizes the Wolf method for point multipoles, and Taylor-shifted force (TSF) and gradient-shifted force (GSF) potentials that are both generalizations of DSF electrostatics for multipoles. We find that each of the distinct orientational contributions requires a separate radial function to ensure that pairwise energies, forces, and torques all vanish at the cutoff radius. In this paper, we present energy, force, and torque expressions for the new models, and compare these real-space interaction models to exact results for ordered arrays of multipoles. We find that the GSF and SP methods converge rapidly to the correct lattice energies for ordered dipolar and quadrupolar arrays, while the TSF is too severe an approximation to provide accurate convergence to lattice energies. Because real-space methods can be made to scale linearly with system size, SP and GSF are attractive options for large Monte Carlo and molecular dynamics simulations, respectively.
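For context, the charge-charge kernels that these multipolar models generalize take the following familiar forms (a sketch of the standard shifted-potential and damped-shifted force expressions from the literature, not equations taken from this paper):

```latex
% Shifted potential (SP), charge-charge case: the pair energy vanishes at r_c
U_{\mathrm{SP}}(r) = q_i q_j \left( \frac{1}{r} - \frac{1}{r_c} \right), \qquad r \le r_c

% Damped-shifted force (DSF): both energy and force vanish at r_c
U_{\mathrm{DSF}}(r) = q_i q_j \Biggl[ \frac{\operatorname{erfc}(\alpha r)}{r}
  - \frac{\operatorname{erfc}(\alpha r_c)}{r_c}
  + \left( \frac{\operatorname{erfc}(\alpha r_c)}{r_c^{2}}
  + \frac{2\alpha}{\sqrt{\pi}}\,\frac{e^{-\alpha^{2} r_c^{2}}}{r_c} \right)(r - r_c) \Biggr],
  \qquad r \le r_c
```

The paper's contribution is the analogous construction for point multipoles, where each orientational contribution needs its own shifted radial function so that energies, forces, and torques all vanish at the cutoff.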

  7. Development of a modified Cohen method of standard setting.

    PubMed

    Taylor, Celia A

    2011-01-01

    A new 'Cohen' approach to standard setting was recently described where the pass mark is calculated as 60% of the score of the student at the 95th percentile, after correcting for guessing. This article considers how two potential criticisms of the Cohen method can be addressed and proposes a modified version, with the assumptions tested using local data. The modified version removes the correction for guessing and uses the score of the 90th, rather than the 95th percentile student as the reference point, based on the cumulative density functions for 32 modules from one medical school; and incorporates an indirect criterion-referenced passing standard by changing the 60% multiplier to the ratio of the cut score to the score of the student at the 90th percentile on exams that have been standard set using modified Angoff. The assumption that the performance of the 90th percentile student is consistent over time holds for multiple choice questions. Applying the modified Cohen method to the 32 modules generally reduced the variation in failure rate across modules, compared to a fixed pass mark of 50%. The results suggest that the modified Cohen method holds much promise as an economical approach to standard setting.
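The arithmetic of the original and modified rules described above can be sketched as follows. This is a minimal illustration: the function names, the example scores, the guessing score, and the Angoff-derived ratio are all hypothetical, and the nearest-rank percentile is one convention among several:

```python
# Sketch of the original and modified Cohen standard-setting rules.
# All inputs are hypothetical; the percentile convention is one choice.

def percentile(scores, p):
    """Nearest-rank percentile (a simple convention for illustration)."""
    s = sorted(scores)
    k = max(0, min(len(s) - 1, round(p / 100 * len(s)) - 1))
    return s[k]

def cohen_pass_mark(scores, guessing_score):
    """Original rule: 60% of the 95th-percentile score, after
    correcting for guessing (one common reading of that correction)."""
    ref = percentile(scores, 95)
    return guessing_score + 0.60 * (ref - guessing_score)

def modified_cohen_pass_mark(scores, angoff_ratio):
    """Modified rule: no guessing correction; the multiplier is the
    ratio of an Angoff cut score to the 90th-percentile score on
    previously standard-set reference exams."""
    return angoff_ratio * percentile(scores, 90)

scores = [42, 55, 58, 61, 64, 66, 70, 73, 78, 85]  # hypothetical exam scores (%)
print(cohen_pass_mark(scores, guessing_score=20))           # -> 59.0
print(modified_cohen_pass_mark(scores, angoff_ratio=0.65))  # ~ 50.7
```

Tying the multiplier to Angoff-derived cut scores is what gives the modified version its indirect criterion-referenced standard while keeping the method's low cost.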

  8. The use of expressive methods for developing empathic skills.

    PubMed

    Ozcan, Neslihan Keser; Bilgin, Hülya; Eracar, Nevin

    2011-01-01

    Empathy is one of the fundamental concepts in nursing, and it is an ability that can be learned. Various education models have been tested for improving empathic skills. Research has focused on using oral presentations, videos, modeling, practiced negotiation based on experiences, and psychodrama methods, such as role playing, as ways to improve empathy in participants. This study looked at the use of expressive arts to improve empathic skills of nursing students. The study was conducted with 48 students who were separated into five different groups. All groups lasted for two hours, and met for 12 weeks. Expressive art and psychodrama methods were used in the group studies. The Scale of Empathic Skill was administered to participants before and after the group studies. Before the group study took place, the average score for empathic skill was 127.97 (SD = 21.26). After the group study, it increased to 138.87 (SD = 20.40). The average score for empathic skill increased after the group (t = 3.996, p = .000). Results show that expressive methods are easier, more effective, and enjoyable processes in nursing training.

  9. PFAS Sampling Studies and Methods Development for Water and Other Environmental Media (Technical Brief)

    EPA Pesticide Factsheets

    EPA's methods for analyzing PFAS in environmental media are in various stages of development. This fact sheet summarizes EPA's analytical methods development for groundwater, surface water, wastewater, and solids, including soils, sediments, and biosolids

  10. Methods Used in Game Development to Foster FLOW

    NASA Technical Reports Server (NTRS)

    Jeppsen, Isaac Ben

    2010-01-01

    Games designed for entertainment have a rich history of providing compelling experiences. From consoles to PCs, games have managed to present intuitive and effective interfaces for a wide range of game styles to successfully allow users to "walk-up-and-play". Once a user is hooked, successful games artfully present challenging experiences just within reach of a user's ability, weaving each task and achievement into a compelling and engaging experience. In this paper, engagement is discussed in terms of the psychological theory of Flow. I argue that engagement should be one of the primary goals when developing a serious game and I discuss the best practices and techniques that have emerged from traditional video game development which help foster the creation of engaging, high Flow experiences.

  11. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  12. Current Development in Elderly Comprehensive Assessment and Research Methods

    PubMed Central

    Jiang, Shantong; Li, Pingping

    2016-01-01

    Comprehensive geriatric assessment (CGA) is a core and essential part of the comprehensive care of the aging population. CGA uses specific tools to summarize elderly status in several domains that may influence the general health and disease outcomes of elderly patients, including assessment of medical, physical, psychological, mental, nutritional, cognitive, social, economic, and environmental status. In this paper, we review different assessment tools used in elderly patients with chronic diseases. The development of comprehensive assessment tools, and of single assessment tools used for a particular dimension of CGA, is discussed. CGA provides substantial insight into the comprehensive management of elderly patients. Developing concise and effective assessment instruments will help CGA to be carried out widely and achieve higher clinical value. PMID:27042661

  13. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods, or methods adopted by the laboratory, be appropriate for the intended use. Validated methods are therefore required, and their use must be agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  14. Pilot-in-the Loop CFD Method Development

    DTIC Science & Technology

    2016-04-27

    simulations. All these efforts have been performed on CRAFT Tech's in-house cluster with 32 nodes, each containing 8 Intel Xeon E5530 processors (2.4... Simulation ... For the Pilot-in-the-Loop CFD (PILCFD) demonstration cases, a network configuration on a CRAFT Tech Linux cluster was developed as shown in Figure 5. The helicopter flight dynamics simulation (GENHEL-PSU) was run on the head node of the cluster, which enabled network communication

  15. Development of panel methods for subsonic analysis and design

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1980-01-01

    Two computer programs, developed for subsonic inviscid analysis and design are described. The first solves arbitrary mixed analysis design problems for multielement airfoils in two dimensional flow. The second calculates the pressure distribution for arbitrary lifting or nonlifting three dimensional configurations. In each program, inviscid flow is modelled by using distributed source doublet singularities on configuration surface panels. Numerical formulations and representative solutions are presented for the programs.
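
    The core idea of such panel codes, superposing elementary singularity solutions on a freestream, can be illustrated with point singularities. This is a deliberate simplification of the distributed source-doublet panels the programs actually use, and all values below are invented for illustration:

```python
import math

def point_source_velocity(q, xs, ys, x, y):
    """Velocity (u, v) induced at (x, y) by a 2-D point source of strength q
    located at (xs, ys): radial outflow decaying as 1/r."""
    dx, dy = x - xs, y - ys
    r2 = dx * dx + dy * dy
    f = q / (2.0 * math.pi * r2)
    return f * dx, f * dy

def total_velocity(u_inf, sources, x, y):
    """Uniform freestream plus superposed source contributions -- the same
    linear superposition a panel method applies, panel by panel."""
    u, v = u_inf, 0.0
    for q, xs, ys in sources:
        du, dv = point_source_velocity(q, xs, ys, x, y)
        u, v = u + du, v + dv
    return u, v

# One source of strength 2*pi in a unit freestream forms a Rankine
# half-body whose stagnation point lies one unit upstream of the source.
u, v = total_velocity(1.0, [(2.0 * math.pi, 0.0, 0.0)], -1.0, 0.0)
```

    In a real panel method, the singularity strengths are not prescribed but solved for, by enforcing the flow-tangency boundary condition at each surface panel.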

  16. Computational Fluid Dynamics. [numerical methods and algorithm development

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This collection of papers was presented at the Computational Fluid Dynamics (CFD) Conference held at Ames Research Center in California on March 12 through 14, 1991. It is an overview of CFD activities at NASA Lewis Research Center. The main thrust of computational work at Lewis is aimed at propulsion systems. Specific issues related to propulsion CFD and associated modeling will also be presented. Examples of results obtained with the most recent algorithm development will also be presented.

  17. Development of an improved method of consolidating fatigue life data

    NASA Technical Reports Server (NTRS)

    Leis, B. N.; Sampath, S. G.

    1978-01-01

    A fatigue data consolidation model that incorporates recent advances in life prediction methodology was developed. A combined analytic and experimental study of fatigue of notched 2024-T3 aluminum alloy under constant amplitude loading was carried out. Because few systematic and complete data sets for 2024-T3 were available, the program generated data for fatigue crack initiation and separation failure for both zero and nonzero mean stresses. Consolidations of these data are presented.

  18. Development of wide area environment accelerator operation and diagnostics method

    NASA Astrophysics Data System (ADS)

    Uchiyama, Akito; Furukawa, Kazuro

    2015-08-01

    Remote operation and diagnostic systems for particle accelerators have been developed for beam operation and maintenance in various situations. Even though fully remote experiments are not necessary, remote diagnosis and maintenance of the accelerator is required. For remote-operation operator interfaces (OPIs), the use of standard protocols such as the hypertext transfer protocol (HTTP) is advantageous, because system-dependent protocols are unnecessary between the remote client and the on-site server. Here, we have developed a client system based on WebSocket, a protocol standardized by the Internet Engineering Task Force for Web-based systems, as a next-generation Web-based OPI using the Experimental Physics and Industrial Control System Channel Access protocol. As a result of this implementation, WebSocket-based client systems have become available for remote operation. In practical application, however, the remote operation of an accelerator via a wide area network (WAN) faces a number of challenges; for example, the accelerator is both an experimental device and a radiation generator, and any error in remote control system operation could result in an immediate breakdown. Therefore, we propose the implementation of an operator intervention system for remote accelerator diagnostics and support that can obviate any differences between the local control room and remote locations. Here, remote-operation Web-based OPIs, which resolve security issues, are developed.
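
    One concrete protocol detail behind a WebSocket-based OPI is the opening handshake defined in RFC 6455: the server proves it speaks WebSocket by hashing the client's Sec-WebSocket-Key with a fixed GUID. A minimal sketch, independent of any EPICS specifics (which the abstract does not detail):

```python
import base64
import hashlib

# Fixed GUID defined by RFC 6455 for the WebSocket opening handshake.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def websocket_accept(sec_websocket_key: str) -> str:
    """Sec-WebSocket-Accept value a server must echo back: base64 of the
    SHA-1 digest of the client key concatenated with the RFC 6455 GUID."""
    digest = hashlib.sha1((sec_websocket_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# Example key/accept pair taken from RFC 6455 itself.
accept = websocket_accept("dGhlIHNhbXBsZSBub25jZQ==")
```

    After this handshake, the connection is upgraded from HTTP to a persistent full-duplex channel, which is what makes WebSocket attractive for pushing live accelerator status to a remote OPI.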

  19. Two-Step Camera Calibration Method Developed for Micro UAV'S

    NASA Astrophysics Data System (ADS)

    Gašparović, M.; Gajski, D.

    2016-06-01

    The development of unmanned aerial vehicles (UAVs) and the continuous price reduction of unmanned systems attracted us to this research. Professional measuring systems are dozens of times more expensive and often heavier than "amateur", non-metric UAVs. For this reason, we tested the DJI Phantom 2 Vision Plus UAV. Because of its smaller mass and lower velocity, the Phantom develops less kinetic energy than professional measurement platforms, which makes it potentially less dangerous for use in populated areas. In this research, we wanted to investigate the capability of such a non-metric UAV and establish the procedures under which this kind of UAV may be used for photogrammetric survey. It is important to emphasize that the UAV is equipped with an ultra-wide-angle camera with a 14 MP sensor. Calibration of such cameras is a complex process. A new two-step process is presented and developed, and the results are compared with the standard one-step camera calibration procedure. The two-step process first removes distortion from all images and then uses these images in phototriangulation with self-calibration. The paper presents statistical indicators which show that the proposed two-step process is a better and more accurate procedure for calibrating these types of cameras than standard one-step calibration. We therefore suggest the two-step calibration process as the standard for ultra-wide-angle cameras on unmanned aircraft.
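
    Step one of the two-step process, removing lens distortion before phototriangulation, can be sketched with a simple radial (Brown) model; the coefficients below are invented for illustration, not the Phantom's actual calibration:

```python
def distort(x, y, k1, k2):
    """Forward radial (Brown) model: map an ideal normalized image point to
    its distorted position using the scale factor 1 + k1*r^2 + k2*r^4."""
    r2 = x * x + y * y
    s = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * s, y * s

def undistort(xd, yd, k1, k2, iterations=20):
    """Invert the radial model by fixed-point iteration: repeatedly divide
    the distorted point by the scale factor evaluated at the current estimate."""
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        s = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = xd / s, yd / s
    return x, y

# Round trip with illustrative coefficients.
xd, yd = distort(0.3, 0.2, -0.2, 0.05)
xu, yu = undistort(xd, yd, -0.2, 0.05)
```

    Real ultra-wide-angle lenses typically need additional terms (higher-order radial and tangential coefficients), which is part of why folding everything into a one-step self-calibration is difficult for these cameras.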

  20. Developing and executing quality improvement projects (concept, methods, and evaluation).

    PubMed

    Likosky, Donald S

    2014-03-01

    Continuous quality improvement, quality assurance, cycles of change--these words are often used to express the process of using data to inform and improve clinical care. Although many of us have been exposed to the theory and practice of experimental work (e.g., the randomized trial), few of us have been similarly exposed to the science underlying quality improvement. Through the lens of a single-center quality improvement study, this article exposes the reader to the methodology for conducting such studies. The reader will gain an understanding of the methods required to embark on such a study.

  1. Developing and Executing Quality Improvement Projects (concept, methods, and evaluation)

    PubMed Central

    Likosky, Donald S.

    2014-01-01

    Abstract: Continuous quality improvement, quality assurance, cycles of change—these words are often used to express the process of using data to inform and improve clinical care. Although many of us have been exposed to the theory and practice of experimental work (e.g., the randomized trial), few of us have been similarly exposed to the science underlying quality improvement. Through the lens of a single-center quality improvement study, this article exposes the reader to the methodology for conducting such studies. The reader will gain an understanding of the methods required to embark on such a study. PMID:24779118

  2. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions or concentration fluctuations, quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.
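
    The post-processing step reduces, at each grid cell, to the standard QRA sum of scenario frequency times conditional probability of fatality; the iso-surface is then a level set of that field. A minimal sketch with invented scenario numbers:

```python
def individual_risk(scenarios):
    """Individual risk at one receptor point: sum over accident scenarios of
    frequency (per year) times the conditional probability of fatality there.
    In the proposed method the probabilities come from CFD consequence fields."""
    return sum(freq * p_fatality for freq, p_fatality in scenarios)

# Two hypothetical release scenarios evaluated at one cell of the 3D grid;
# the 1e-5/yr result would place this cell inside a 1e-6/yr iso-surface.
risk = individual_risk([(1e-4, 0.05), (1e-5, 0.5)])
```

    Repeating this sum over every cell of the CFD domain yields the 3D risk field from which the iso-surfaces are extracted.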

  3. Development of evaluation method for software hazard identification techniques

    SciTech Connect

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-07-01

    This research evaluated the currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis, and then determined evaluation indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. By this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show an advantage in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results could change. However, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)
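
    The evaluation itself amounts to a weighted scoring over the listed indexes. A sketch with placeholder ratings and weights; the study's actual values are not reproduced here:

```python
def evaluate_combination(ratings, weights):
    """Composite score for one hazard-identification technique combination:
    a weighted sum over the evaluation indexes (higher is better)."""
    return sum(weights[index] * ratings[index] for index in weights)

# Placeholder stakeholder weights over the indexes named in the abstract.
weights = {"dynamic capability": 0.25, "completeness": 0.20,
           "achievability": 0.15, "detail": 0.10, "signal/noise ratio": 0.10,
           "complexity": 0.10, "implementation cost": 0.10}

# Hypothetical 1-5 ratings for a DFM-style, system-centric technique.
dfm_ratings = {"dynamic capability": 5, "completeness": 3, "achievability": 4,
               "detail": 4, "signal/noise ratio": 4, "complexity": 2,
               "implementation cost": 2}

score = evaluate_combination(dfm_ratings, weights)
```

    The point of the platform is less the final number than forcing stakeholders to agree on the weights, which makes the trade-offs between technique combinations explicit.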

  4. Durability Methods Development. Volume 8. Test and Fractography Data

    DTIC Science & Technology

    1982-11-01

    fractography effort for the program. J. W. Norris developed the computer software for storing and analyzing the fractography data acquired... [The remainder of this record is residue from scanned data tables (crack size versus number of flights for individual data sets) and is not recoverable.]

  5. Development of alternate methods of determining integrated SMR source terms

    SciTech Connect

    Barry, Kenneth

    2014-06-10

    The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December, 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high importance areas where additional research would have a significant impact on source terms. The highest ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design basis and beyond-design-basis accident source terms with concomitantly smaller dose to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry’s goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment. Diffusiophoresis and enhanced

  6. Development of new ash cooling method for atmospheric fluidized beds

    SciTech Connect

    Li Xuantian; Luo Zhongyang; Ni Mingjiang; Cheng Leming; Gao Xiang; Fang Mengxiang; Cen Kefa

    1995-12-31

    The pollution caused by hot ash drained from the bed is another challenge to atmospheric fluidized bed combustion technology when low-rank, high-ash fuels are used. A new technique was developed for ash cooling and utilization of the waste heat of ash. Results from the demonstration of a 1.5 T/H patented device have shown the potential to use this type of ash cooler for drying and secondary air preheating. Bottom ash sized in the range 0--13 mm can be cooled from 1,650 F (900 C) to temperatures tolerable for conveying machinery, and the cooled ash can be re-utilized for cement production.

  7. Developing a selection method for graduate nursing students.

    PubMed

    Creech, Constance J; Aplin-Kalisz, Christina

    2011-08-01

    The student selection process is an important faculty responsibility that impacts student success in individual courses, retention, and ultimately graduation rates. The purpose of this article is to review the existing research on graduate student selection and describe one university's newly developed selection process. Existing literature and research and one university's selection data. There is limited existing research on graduate student selection to assist faculty in selecting students. One university's process is described in detail for possible replication by others to improve the process. The process highlighted in this article may be useful to other faculty as a model for improvement of student selection processes. 2011 American Academy of Nurse Practitioners.

  8. [Comparison of sustainable development status in Heilongjiang Province based on traditional ecological footprint method and emergy ecological footprint method].

    PubMed

    Chen, Chun-feng; Wang, Hong-yan; Xiao, Du-ning; Wang, Da-qing

    2008-11-01

    By using the traditional ecological footprint method and its modification, the emergy ecological footprint method, the sustainable development status of Heilongjiang Province in 2005 was analyzed. The results showed that the ecological deficits of Heilongjiang Province in 2005 based on the emergy and conventional ecological footprint methods were 1.919 and 0.6256 hm2 x cap(-1), respectively. The ecological footprint values based on both methods exceeded the carrying capacity, which indicated that the social and economic development of the study area was not sustainable. The emergy ecological footprint method was used to discuss the relationships between humans' material demand and ecosystem resource supply, and more stable parameters such as emergy transformity and emergy density were introduced into the method, which overcame some of the shortcomings of the conventional ecological footprint method.
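
    The emergy variant converts demand into area through an emergy density rather than through yield factors. A sketch of the arithmetic only, with invented magnitudes rather than the province's actual data:

```python
def emergy_footprint(total_emergy_sej, population, emergy_density_sej_per_hm2):
    """Emergy-based per-capita ecological footprint: total emergy demand
    (solar emjoules) converted to an area via a regional emergy density."""
    return total_emergy_sej / population / emergy_density_sej_per_hm2

def ecological_deficit(footprint, carrying_capacity):
    """Positive result: demand exceeds supply, i.e., development is not
    sustainable by this indicator (both arguments in hm^2 per capita)."""
    return footprint - carrying_capacity

# Invented numbers for illustration: total emergy use, population,
# and emergy density give about 2.0 hm^2 per capita of demand.
ef = emergy_footprint(7.6e22, 3.8e7, 1.0e15)
deficit = ecological_deficit(ef, 0.5)
```

    Because transformities and emergy densities are more stable than crop yields, footprints computed this way are less sensitive to year-to-year agricultural fluctuations, which is the advantage the abstract claims.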

  9. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    SciTech Connect

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  10. Development of geopolitically relevant ranking criteria for geoengineering methods

    NASA Astrophysics Data System (ADS)

    Boyd, Philip W.

    2016-11-01

    A decade has passed since Paul Crutzen published his editorial essay on the potential for stratospheric geoengineering to cool the climate in the Anthropocene. He synthesized the effects of the 1991 Pinatubo eruption on the planet's radiative budget and used this large-scale event to broaden and deepen the debate on the challenges and opportunities of large-scale geoengineering. Pinatubo had pronounced effects, both in the short and longer term (months to years), on the ocean, land, and the atmosphere. This rich set of data on how a large-scale natural event influences many regional and global facets of the Earth System provides a comprehensive viewpoint to assess the wider ramifications of geoengineering. Here, I use the Pinatubo archives to develop a range of geopolitically relevant ranking criteria for a suite of different geoengineering approaches. The criteria focus on the spatial scales needed for geoengineering and whether large-scale dispersal is a necessary requirement for a technique to deliver significant cooling or carbon dioxide reductions. These categories in turn inform whether geoengineering approaches are amenable to participation (the "democracy of geoengineering") and whether they will lead to transboundary issues that could precipitate geopolitical conflicts. The criteria provide the requisite detail to demarcate different geoengineering approaches in the context of geopolitics. Hence, they offer another tool that can be used in the development of a more holistic approach to the debate on geoengineering.

  11. Discontinuity in pastoral development: time to update the method.

    PubMed

    Krätli, S

    2016-11-01

    Most off-the-shelf basic methodological tools currently used in pastoral development (e.g. technical definitions and conventional scales of observation) retain underlying assumptions about stability and uniformity being the norm (i.e. 'equilibrium thinking'). Such assumptions reflect a theoretical framework which had been questioned since the 1970s and was openly disproved in scientific circles during the 1990s, when it was shown to be fundamentally inadequate. Today, lingering equilibrium assumptions in the methodological legacy of pastoral development get in the way of operationalising state-of-the-art understanding of pastoral systems and drylands. Unless these barriers are identified, unpacked and managed, even increasing the rigour and intensity of data collection will not deliver a realistic representation of pastoral systems in statistics and policymaking. This article provides a range of examples of such 'barriers', where equilibrium assumptions persist in the methodology, including classifications of livestock systems, conventional scales of observation, key parameters in animal production, indicators in the measurement of ecological efficiency, and the concepts of 'fragile environment', natural resources, and pastoral risk.

  12. Magic shotgun methods for developing drugs for CNS disorders.

    PubMed

    Musk, Philip

    2004-10-01

    Extract: The development of novel therapeutic entities that are both more effective and more specific for treating disorders of the central nervous system (CNS) poses significant challenges to the drug discovery industry. The normal focus of drug research is the search for a "magic bullet" that acts on a specific protein (or receptor), ideally with no interactions with other proteins. These are termed "clean" drugs, as they have a single action with few side effects. However, most common CNS disorders are highly polygenic in nature; i.e., they are controlled by complex interactions between numerous gene products. As such, these conditions do not exhibit the single-gene-defect basis that is so attractive for the development of highly specific drugs largely free of major undesirable side effects (the "magic bullet"). Second, the exact nature of the interactions that occur between the numerous gene products typically involved in CNS disorders remains elusive, and the biological mechanisms underlying mental illnesses are poorly understood.

  13. Development of 3-D Ice Accretion Measurement Method

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Broeren, Andy P.; Addy, Harold E., Jr.; Sills, Robert; Pifer, Ellen M.

    2012-01-01

    A research plan is currently being implemented by NASA to develop and validate the use of a commercial laser scanner to record and archive fully three-dimensional (3-D) ice shapes from an icing wind tunnel. The plan focused specifically upon measuring ice accreted in the NASA Icing Research Tunnel (IRT). The plan was divided into two phases. The first phase was the identification and selection of the laser scanning system and the post-processing software to purchase and develop further. The second phase was the implementation and validation of the selected system through a series of icing and aerodynamic tests. Phase I of the research plan has been completed. It consisted of evaluating several scanning hardware and software systems against an established selection criteria through demonstrations in the IRT. The results of Phase I showed that all of the scanning systems that were evaluated were equally capable of scanning ice shapes. The factors that differentiated the scanners were ease of use and the ability to operate in a wide range of IRT environmental conditions.

  14. Development and testing of improved statistical wind power forecasting methods.

    SciTech Connect

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.

    2011-12-06

    (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden, large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events to happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
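
    A deliberately simple fixed-threshold ramp detector illustrates the kind of event definition the report compares, before any probabilistic scenario filtering; the series and thresholds below are invented:

```python
def detect_ramps(power, window, threshold):
    """Start indices t where normalized wind power changes by more than
    `threshold` over `window` time steps -- a basic fixed-threshold ramp
    definition of the kind surveyed before the probabilistic method."""
    return [t for t in range(len(power) - window)
            if abs(power[t + window] - power[t]) > threshold]

# Invented power series (fraction of rated capacity), one step = 10 min.
series = [0.20, 0.22, 0.21, 0.55, 0.60, 0.62, 0.30, 0.28]
events = detect_ramps(series, window=2, threshold=0.25)
```

    The probabilistic method described above goes further: it runs such a detector over many generated wind power scenarios and reports the fraction of scenarios in which a ramp occurs, rather than a single yes/no flag.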

  15. Ranging methods for developing wellbores in subsurface formations

    DOEpatents

    MacDonald, Duncan

    2011-09-06

    A method for forming two or more wellbores in a subsurface formation includes forming a first wellbore in the formation. A second wellbore is directionally drilled in a selected relationship relative to the first wellbore. At least one magnetic field is provided in the second wellbore using one or more magnets in the second wellbore located on a drilling string used to drill the second wellbore. At least one magnetic field is sensed in the first wellbore using at least two sensors in the first wellbore as the magnetic field passes by the at least two sensors while the second wellbore is being drilled. A position of the second wellbore is continuously assessed relative to the first wellbore using the sensed magnetic field. The direction of drilling of the second wellbore is adjusted so that the second wellbore remains in the selected relationship relative to the first wellbore.
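
    The patent record gives no equations, but the physics of ranging from a sensed magnetic field can be sketched with the on-axis dipole falloff B = mu0*m / (2*pi*r^3); the magnet moment and geometry below are invented, and a field tool would also fuse the two sensors' readings rather than use one:

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability, T*m/A

def range_from_dipole_field(b_tesla, moment_a_m2):
    """Distance estimate from the measured on-axis field of a magnetic
    dipole, inverting B = mu0 * m / (2 * pi * r**3) for r."""
    return (MU0 * moment_a_m2 / (2.0 * math.pi * b_tesla)) ** (1.0 / 3.0)

# Round trip with an invented magnet moment at 8 m separation.
b = MU0 * 500.0 / (2.0 * math.pi * 8.0 ** 3)
r = range_from_dipole_field(b, 500.0)
```

    The steep 1/r^3 falloff is what makes this usable for ranging: small changes in separation produce large, easily sensed changes in field strength, and comparing arrival times at the two sensors gives the drill bit's position along the first wellbore.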

  16. Sublimation rates of explosive materials : method development and initial results.

    SciTech Connect

    Phelan, James M.; Patton, Robert Thomas

    2004-08-01

    Vapor detection of explosives continues to be a technological basis for security applications. This study began experimental work to measure the chemical emanation rates of pure explosive materials as a basis for determining emanation rates of security threats containing explosives. Sublimation rates for TNT were determined with thermogravimetric analysis using two different techniques. Data were compared with other literature values to provide sublimation rates from 25 to 70 C. The enthalpy of sublimation for the combined data was found to be 115 kJ/mol, which corresponds well with previously reported data from vapor pressure determinations. A simple Gaussian atmospheric dispersion model was used to estimate downrange concentrations based on continuous, steady-state conditions at 20, 45 and 62 C for a nominal exposed block of TNT under low wind conditions. Recommendations are made for extension of the experimental vapor emanation rate determinations and development of turbulent flow computational fluid dynamics based atmospheric dispersion estimates of standoff vapor concentrations.
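    For a continuous ground-level point source with both source and receptor at ground level, the simple steady-state Gaussian plume estimate reduces to C = Q / (π u σy σz) on the plume centerline. The sketch below evaluates this textbook formula; the emission rate, wind speed, and dispersion coefficients are illustrative placeholders, not measurements from the study.

```python
import math

def plume_centerline_concentration(Q, u, sigma_y, sigma_z):
    """Steady-state ground-level centerline concentration (mass/m^3)
    downwind of a continuous ground-level point source.

    Textbook Gaussian plume result C = Q / (pi * u * sigma_y * sigma_z);
    sigma_y and sigma_z are the lateral and vertical dispersion
    coefficients (m) evaluated at the downwind distance of interest.
    """
    return Q / (math.pi * u * sigma_y * sigma_z)

# Illustrative numbers only: 1e-9 g/s emanation, 1 m/s wind, 2 m sigmas
c = plume_centerline_concentration(Q=1e-9, u=1.0, sigma_y=2.0, sigma_z=2.0)
```

    The dispersion coefficients grow with downwind distance and atmospheric instability, which is what makes standoff vapor concentrations fall off so quickly.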

  17. Development of a fast voltage control method for electrostatic accelerators

    NASA Astrophysics Data System (ADS)

    Lobanov, Nikolai R.; Linardakis, Peter; Tsifakis, Dimitrios

    2014-12-01

    The concept of a novel fast voltage control loop for tandem electrostatic accelerators is described. This control loop utilises high-frequency components of the ion beam current intercepted by the image slits to generate a correction voltage that is applied to the first few gaps of the low- and high-energy acceleration tubes adjoining the high voltage terminal. New techniques for the direct measurement of the transfer function of an ultra-high impedance structure, such as an electrostatic accelerator, have been developed. For the first time, the transfer function for the fast feedback loop has been measured directly. Slow voltage variations are stabilised with a common corona control loop, and the relationship between the transfer functions of the slow and new fast control loops required for optimum operation is discussed. The main source of terminal voltage instabilities, which are due to variation of the charging current caused by mechanical oscillations of charging chains, has been analysed.

  18. Development of Methods Precision Length Measurement Using Transported Laser Interferometer

    NASA Astrophysics Data System (ADS)

    Lavrov, E. A.; Epikhin, V. M.; Mazur, M. M.; Suddenok, Y. A.; Shorin, V. N.

    The paper presents the results of comparing a developed transported laser interferometer (TLI) with a Renishaw XL-80 measurement interferometer over distances of 0-60 meters. Tests of a breadboard model of the TLI showed that the difference between the travel measurements of the two interferometers does not exceed 6 μm. The mean difference in readings between the TLI and the Renishaw travel measurer at a distance of about 58 m is approximately 0.5 μm, and the root-mean-square deviation of the interferometer readings is approximately 3 μm. When the same sections, measured on different days, are compared between the TLI and the Renishaw instrument, the results are repeatable.

  19. Collaboration with academia in the development of post ovulatory methods.

    PubMed

    Thaler, G

    1999-12-01

    The 0.75-mg levonorgestrel-containing 'morning after' contraceptive tablet Postinor was developed by Gedeon Richter Ltd., Hungary. The product was first launched in 1979 and was later registered in approximately 40 countries. In 1994, the World Health Organization offered the company participation in a multinational clinical trial to prove the superiority of the product over existing (Yuzpe-type) emergency contraceptive products. Based on these data, the company was able to redesign the 'morning after' type Postinor into an 'emergency' pill, Postinor-2. During further clinical trials a close working relationship was formed between the Department of Obstetrics and Gynaecology at the Albert Szent-Györgyi Medical University in Szeged, Hungary, and Gedeon Richter. The advantages and challenges of cooperation between public- and private-sector institutions are analyzed in the paper.

  20. The development of MHD energy conversion methods in the USSR

    NASA Astrophysics Data System (ADS)

    Kirillin, V. A.; Sheindlin, A. E.

    1981-12-01

    It is noted that the development of magnetohydrodynamic (MHD) power conversion systems has evolved to the point where it is possible to commercially introduce MHD power plants into industry. Even with the present level of technology, the sharp increase in thermal efficiency of these plants by as much as 50-60% results in fuel economies of 20-35% and in reductions in generation costs of 6-7%. A description is given of an MHD power plant and its various aggregates. Also given are a review of the state of the art of MHD technology and an outline of the Soviet program for its commercial exploitation. The design of MHD electrical power plants, the interrelation between various aggregates, and the problems arising from nonstandard equipment are discussed.

  1. New Research Methods Developed for Studying Diabetic Foot Ulceration

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Dr. Brian Davis, one of the Cleveland Clinic Foundation's researchers, has been investigating the risk factors related to diabetic foot ulceration, a problem that accounts for 20 percent of all hospital admissions for diabetic patients. He had developed a sensor pad to measure the friction and pressure forces under a person's foot when walking. As part of NASA Lewis Research Center's Space Act Agreement with the Cleveland Clinic Foundation, Dr. Davis requested Lewis' assistance in visualizing the data from the sensor pad. As a result, Lewis' Interactive Data Display System (IDDS) was installed at the Cleveland Clinic. This computer graphics program is normally used to visualize the flow of air through aircraft turbine engines, producing color two- and three-dimensional images.

  2. Development of metrological NDE methods for microturbine ceramic components

    SciTech Connect

    Lee, H.-R.; Ellingson, W. A.

    1999-12-23

    In this work, X-ray computed tomographic imaging technology with high spatial resolution has been explored for metrological applications to Si{sub 3}N{sub 4} ceramic turbine wheels. X-ray computed tomography (XCT) data were acquired by a charge-coupled device detector coupled to an image intensifier. Cone-beam XCT reconstruction algorithms were used to allow full-volume data acquisition from the turbine wheels. Special software was developed so that edge detection and complex blade contours could be determined from the XCT data. The feasibility of using the XCT for dimensional analyses was compared with that of a coordinate-measuring machine. Details of the XCT system, data acquisition, and dimensional comparisons will be presented.

  4. The development of episodic foresight: emerging concepts and methods.

    PubMed

    Hudson, Judith A; Mayhew, Estelle M Y; Prabhakar, Janani

    2011-01-01

    Episodic foresight is here defined as the ability to project oneself into the future and mentally simulate situations and outcomes. Tasks used to study the development of episodic foresight in young children are reviewed and compared to tasks used to study other future-oriented abilities (planning, delay of gratification, and prospective memory) in the same age-group. We argue for the importance of accounting for and minimizing the role of other cognitive demands in research tasks. Because episodic foresight is an emerging ability in young children, more research needs to be directed at the contexts in which it emerges and the extent to which episodic foresight is part of a growing ability for mental representation.

  5. Development of fluoroimmunoassay methods for delta-9-tetrahydrocannabinol

    SciTech Connect

    Mason, A.P.

    1986-01-01

    Heterogeneous, competitive, labelled-ligand solid-phase primary antibody fluoroimmunoassay methods for the detection of THC in blood and plasma were proposed, and the required assay components were produced and characterized. These components included polyclonal rabbit antisera and monoclonal antibodies reactive with tetrahydrocannabinols, solid-phase immunoglobulin reagents, a fluoroligand, and protein conjugates of THC for immunization and immunoassay response amplification. The stereoselective rabbit anti-THC antiserum F-444-12 was found to have a high binding titer, a high affinity (K_D = 3.4 × 10⁻¹¹ M for 5'-iodo-¹²⁵I-Δ⁹-THC), and high specificity versus a large number of cannabinoid compounds. Immobilization of the immunoglobulin fraction of the antiserum on hydrophilic polyacrylamide microspheres resulted in only a fourfold increase in K_D and a twofold increase in the concentration of binding sites required for the production of equivalent binding titers. Specificity for small ligands was not affected, but the binding of THC-protein conjugates was reduced in potency. Two monoclonal hybridoma cell lines were produced that secrete monoclonal antibodies which bind the radioligand. The fluoroligand was synthesized from 5'-carboxy-Δ⁹-THC and FITC using a diaminoethane linkage structure. While the compound had the fluorescence properties of FITC, it was bound to the antiserum F-444-12 with a cross-reactive potency 1.4× greater than that of the radioligand and 10× greater than that of THC.

  6. Development of methods to measure virus inactivation in fresh waters.

    PubMed Central

    Ward, R L; Winston, P E

    1985-01-01

    This study concerns the identification and correction of deficiencies in methods used to measure inactivation rates of enteric viruses seeded into environmental waters. It was found that viable microorganisms in an environmental water sample increased greatly after addition of small amounts of nutrients normally present in the unpurified seed virus preparation. This burst of microbial growth was not observed after seeding the water with purified virus. The use of radioactively labeled poliovirus revealed that high percentages of virus particles, sometimes greater than 99%, were lost through adherence to containers, especially in less turbid waters. This effect was partially overcome by the use of polypropylene containers and by the absence of movement during incubation. Adherence to containers clearly demonstrated the need for labeled viruses to monitor losses in this type of study. Loss of viral infectivity in samples found to occur during freezing was avoided by addition of broth. Finally, microbial contamination of the cell cultures during infectivity assays was overcome by the use of gentamicin and increased concentrations of penicillin, streptomycin, and amphotericin B. PMID:3004328

  7. Methods and apparatuses for the development of microstructured nuclear fuels

    DOEpatents

    Jarvinen, Gordon D [Los Alamos, NM; Carroll, David W [Los Alamos, NM; Devlin, David J [Santa Fe, NM

    2009-04-21

    Microstructured nuclear fuel adapted for nuclear power system use includes fissile material structures of micrometer-scale dimension dispersed in a matrix material. In one method of production, fissile material particles are processed in a chemical vapor deposition (CVD) fluidized-bed reactor including a gas inlet for providing controlled gas flow into a particle coating chamber, a lower bed hot zone region to contain powder, and an upper bed region to enable powder expansion. At least one pneumatic or electric vibrator is operationally coupled to the particle coating chamber for causing vibration of the particle coater to promote uniform powder coating within the particle coater during fuel processing. An exhaust associated with the particle coating chamber provides a port for placement and removal of particles and powder. During use of the fuel in a nuclear power reactor, fission products escape from the fissile material structures and come to rest in the matrix material. After a period of use in a nuclear power reactor and subsequent cooling, separation of the fissile material from the matrix containing the embedded fission products will provide an efficient partitioning of the bulk of the fissile material from the fission products. The fissile material can be reused by incorporating it into new microstructured fuel. The fission products and matrix material can be incorporated into a waste form for disposal or processed to separate valuable components from the fission products mixture.

  8. Developing methods for timely and relevant mission impact estimation

    NASA Astrophysics Data System (ADS)

    Grimaila, Michael R.; Fortson, Larry W., Jr.; Sutton, Janet L.; Mills, Robert F.

    2009-05-01

    Military organizations embed information systems and networking technologies into their core mission processes as a means to increase operational efficiency, improve decision making quality, and shorten the "kill chain". Unfortunately, this dependence can place the mission at risk when the loss or degradation of the confidentiality, integrity, availability, non-repudiation, or authenticity of a critical information resource or flow occurs. Since the accuracy, conciseness, and timeliness of the information used in command decision making processes impacts the quality of these decisions and, hence, the operational mission outcome, it is imperative to explicitly recognize, quantify, and document critical mission-information dependencies in order to gain a true appreciation of operational risk. We conjecture that what is needed is a structured process to provide decision makers with real-time awareness of the status of critical information resources and timely notification of estimated mission impact, from the time an information incident is declared until the incident is fully remediated. In this paper, we discuss our initial research towards the development of a mission impact estimation engine which fuses information from subject matter experts, historical mission impacts, and explicit mission models to provide the ability to estimate the mission impacts resulting from an information incident in real-time.

  9. Developing a study method for producing 400 μm spheroids.

    PubMed

    Dupont, G; Flament, M P; Leterme, P; Farah, N; Gayot, A

    2002-10-24

    The aim of this work was to obtain 400 μm spheroids that can be sprinkled on food to improve patient compliance particularly in the case of children and old people. A methodology to select wet masses for extrusion-spheronization through a 400 μm orifice was developed. The first step was to define the parameters that make it possible to assess the qualities required by the wet mass and the extrudates and evaluation norms: plasticity, cohesiveness, brittleness of the mass and the extrudates, and appearance of extrudates. A feasibility assay was then performed on the cylinder extruder, showing that extrusion of the lactose/Avicel PH 101/water (50/50/60) mass is not feasible through the 400 μm orifice. Precirol ATO 5 and Gelucire 50/02 wetted with a sodium lauryl sulfate solution at 0.5% show plastic flow through the 400 μm diameter orifice. The presence of Avicel PH 101 does not improve plasticity for this orifice. Micropellets of 400 μm have been proved feasible as long as excipients with suitable pharmaceutical technological properties are used. After proving the feasibility of 400 μm spheroids of Gelucire 50/02, we considered the association of a drug with it. Copyright 2002 Elsevier Science B.V.

  10. PROGRESS ON GENERIC PHASE-FIELD METHOD DEVELOPMENT

    SciTech Connect

    Biner, Bullent; Tonks, Michael; Millett, Paul C.; Li, Yulan; Hu, Shenyang Y.; Gao, Fei; Sun, Xin; Martinez, E.; Anderson, D.

    2012-09-26

    In this report, we summarize our current collaborative efforts, involving three national laboratories: Idaho National Laboratory (INL), Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory (LANL), to develop a computational framework incorporating homogeneous and heterogeneous nucleation mechanisms into the generic phase-field model. During the studies, the Fe-Cr system was chosen as a model system due to its simplicity and the availability of reliable thermodynamic and kinetic data, as well as the range of applications of low-chromium ferritic steels in nuclear reactors. For homogeneous nucleation, the relevant parameters determined from atomistic studies were used directly to determine the energy functional and parameters in the phase-field model. Interfacial energy, critical nucleus size, nucleation rate, and coarsening kinetics were systematically examined in two- and three-dimensional models. For the heterogeneous nucleation mechanism, we studied the nucleation and growth behavior of chromium precipitates in the presence of dislocations. The results demonstrate that both nucleation schemes can be introduced into a phase-field modeling algorithm with the desired accuracy and computational efficiency.
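    As a generic illustration of the phase-field machinery referred to above, the sketch below advances the 1D Cahn-Hilliard equation, the prototypical conserved phase-field evolution law, by explicit Euler steps. The double-well free energy f(c) = (c² − 1)²/4 and all parameter values are textbook placeholders, not the Fe-Cr thermodynamic data used in the report.

```python
import numpy as np

def cahn_hilliard_step(c, dt=1e-3, dx=1.0, M=1.0, kappa=1.0):
    """One explicit Euler step of the 1D Cahn-Hilliard equation
    c_t = M * lap(mu), mu = f'(c) - kappa * lap(c), periodic boundaries."""
    lap = lambda a: (np.roll(a, -1) - 2.0 * a + np.roll(a, 1)) / dx**2
    mu = c**3 - c - kappa * lap(c)   # f'(c) for f(c) = (c^2 - 1)^2 / 4
    return c + dt * M * lap(mu)      # conserved dynamics: total c is preserved

# Evolve a small random perturbation about c = 0 (early spinodal stage)
rng = np.random.default_rng(0)
c0 = 0.01 * rng.standard_normal(64)
c = c0.copy()
for _ in range(200):
    c = cahn_hilliard_step(c)
```

    Nucleation schemes of the kind studied in the report add stochastic or seeded terms on top of this deterministic evolution; the conservation of the order parameter is what distinguishes Cahn-Hilliard from non-conserved (Allen-Cahn) phase-field dynamics.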

  11. DEVELOPMENT OF A METHOD TO QUANTIFY THE IMPACT ...

    EPA Pesticide Factsheets

    Advances in human health risk assessment, especially for contaminants encountered by the inhalation route, have evolved so that the uncertainty factors (UF) used in the extrapolation of non-cancer effects across species (UFA) have been split into the respective pharmacodynamic (PD) and pharmacokinetic (PK) components. Present EPA default values for these components are divided into two half-logs (e.g., 10 to the 0.5 power or 3.16), so that their multiplication yields the 10-fold UF customarily seen in Agency risk assessments as UFA. The state of the science at present does not support a detailed evaluation of species-dependent and human interindividual variance of PD, but more data exist by which PK variance can be examined and quantified both across species and within the human species. Because metabolism accounts for much of the PK variance, we sought to examine the impact that differences in hepatic enzyme content exerts upon risk-relevant PK outcomes among humans. Because of the age and ethnic diversity expressed in the human organ donor population and the wide availability of tissues from these human organ donors, a program was developed to include information from those tissues in characterizing human interindividual PK variance. An Interagency Agreement with CDC/NIOSH Taft Laboratory, a Cooperative Agreement with CIIT Centers for Health Research, and a collaborative agreement with NHEERL/ETD were established to successfully complete the project. The di
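    The half-log split described above is simple arithmetic worth making explicit: each component equals 10^0.5 ≈ 3.16, and their product recovers the customary 10-fold factor. The sketch below only restates those stated EPA default values.

```python
# EPA default: the 10-fold interspecies uncertainty factor (UFA) is split
# into pharmacokinetic (PK) and pharmacodynamic (PD) half-log components.
uf_pk = 10 ** 0.5     # one half-log, ~3.16
uf_pd = 10 ** 0.5     # one half-log, ~3.16
uf_a = uf_pk * uf_pd  # product recovers the full 10-fold factor
```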

  12. Indexing NASA programs for technology transfer methods development and feasibility

    NASA Technical Reports Server (NTRS)

    Clingman, W. H.

    1972-01-01

    This project was undertaken to evaluate the application of a previously developed indexing methodology to ongoing NASA programs. These programs are comprehended by the NASA Program Approval Documents (PADS). Each PAD contains a technical plan for the area it covers. It was proposed that these could be used to generate an index to the complete NASA program. To test this hypothesis two PADS were selected by the NASA Technology Utilization Office for trial indexing. Twenty-five individuals indexed the two PADS using NASA Thesaurus terms. The results demonstrated the feasibility of indexing ongoing NASA programs using PADS as the source of information. The same indexing methodology could be applied to other documents containing a brief description of the technical plan. Results of this project showed that over 85% of the concepts in the technology should be covered by the indexing. Also over 85% of the descriptors chosen would be accurate. This completeness and accuracy for the indexing is considered satisfactory for application in technology transfer.

  13. Validation and Continued Development of Methods for Spheromak Simulation

    NASA Astrophysics Data System (ADS)

    Benedett, Thomas

    2016-10-01

    The HIT-SI experiment has demonstrated stable sustainment of spheromaks. Determining how the underlying physics extrapolate to larger, higher-temperature regimes is of prime importance in determining the viability of the inductively-driven spheromak. It is thus prudent to develop and validate a computational model that can be used to study current results and study the effect of possible design choices on plasma behavior. A zero-beta Hall-MHD model has shown good agreement with experimental data at 14.5 kHz injector operation. Experimental observations at higher frequency, where the best performance is achieved, indicate pressure effects are important and likely required to attain quantitative agreement with simulations. Efforts to extend the existing validation to high frequency (36-68 kHz) using an extended MHD model implemented in the PSI-TET arbitrary-geometry 3D MHD code will be presented. An implementation of anisotropic viscosity, a feature observed to improve agreement between NIMROD simulations and experiment, will also be presented, along with investigations of flux conserver features and their impact on density control for future SIHI experiments. Work supported by DoE.

  14. Developing a method for customized induction of flowering

    PubMed Central

    2011-01-01

    Background: The ability to induce flowering on demand is of significant biotechnological interest. FT protein has recently been identified as an important component of the mobile flowering hormone, florigen, whose function is conserved across the plant kingdom. We therefore focused on manipulation of both endogenous and heterologous FT genes to develop a floral induction system in which flowering would be inhibited until induced on demand. The concept was tested in the model plant Arabidopsis thaliana (Arabidopsis). Results: Our starting point was plants with strongly delayed flowering due to silencing of FT with an artificial microRNA directed at FT (amiR-FT) [1]. First, we showed that constitutive expression of a heterologous FT gene (FTa1) from the model legume Medicago truncatula (Medicago) was able to rescue the amiR-FT late-flowering phenotype. In order to induce flowering in a controlled way, the FTa1 gene was then expressed under the control of an alcohol-inducible promoter in the late-flowering amiR-FT plants. Upon exposure to ethanol, FTa1 was rapidly upregulated, resulting in the synchronous induction of flowering. Conclusions: We have thus demonstrated a controlled-inducible flowering system using a novel combination of endogenous and heterologous FT genes. The universal florigenic nature of FT suggests that this type of system should be applicable to crops of economic value where flowering control is desirable. PMID:21481273

  15. Ceramic Matrix Composites (CMC) Life Prediction Method Development

    NASA Technical Reports Server (NTRS)

    Levine, Stanley R.; Calomino, Anthony M.; Ellis, John R.; Halbig, Michael C.; Mital, Subodh K.; Murthy, Pappu L.; Opila, Elizabeth J.; Thomas, David J.; Thomas-Ogbuji, Linus U.; Verrilli, Michael J.

    2000-01-01

    Advanced launch systems (e.g., Reusable Launch Vehicle and other Shuttle Class concepts, Rocket-Based Combined Cycle, etc.), and interplanetary vehicles will very likely incorporate fiber reinforced ceramic matrix composites (CMC) in critical propulsion components. The use of CMC is highly desirable to save weight, to improve reuse capability, and to increase performance. CMC candidate applications are mission and cycle dependent and may include turbopump rotors, housings, combustors, nozzle injectors, exit cones or ramps, and throats. For reusable and single mission uses, accurate prediction of life is critical to mission success. The tools to accomplish life prediction are very immature and not oriented toward the behavior of carbon fiber reinforced silicon carbide (C/SiC), the primary system of interest for a variety of space propulsion applications. This paper describes an approach to satisfy the need to develop an integrated life prediction system for CMC that addresses mechanical durability due to cyclic and steady thermomechanical loads, and takes into account the impact of environmental degradation.

  16. Novel lipase purification methods - a review of the latest developments.

    PubMed

    Tan, Chung Hong; Show, Pau Loke; Ooi, Chien Wei; Ng, Eng-Poh; Lan, John Chi-Wei; Ling, Tau Chuan

    2015-01-01

    Microbial lipases are popular biocatalysts due to their ability to catalyse diverse reactions such as hydrolysis, esterification, and acidolysis. Lipases function efficiently on various substrates in aqueous and non-aqueous media. Lipases are chemo-, regio-, and enantio-specific, and are useful in various industries, including those manufacturing food, detergents, and pharmaceuticals. A large number of lipases from fungal and bacterial sources have been isolated and purified to homogeneity. This success is attributed to the development of both conventional and novel purification techniques. This review highlights the use of these techniques in lipase purification, including conventional techniques such as: (i) ammonium sulphate fractionation; (ii) ion-exchange; (iii) gel filtration and affinity chromatography; as well as novel techniques such as (iv) reverse micellar system; (v) membrane processes; (vi) immunopurification; (vii) aqueous two-phase system; and (viii) aqueous two-phase floatation. A summary of the purification schemes for various bacterial and fungal lipases is also provided. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Development of an ASTM Graphite Oxidation Test Method

    SciTech Connect

    Contescu, Cristian I; Baker, Frederick S; Burchell, Timothy D

    2006-01-01

    Oxidation behavior of graphite is of practical interest because of extended use of graphite materials in nuclear reactors. High temperature gas-cooled reactors are expected to become the nuclear reactors of the next generation. The most critical factor in their safe operation is an air-ingress accident, in which case the graphite materials in the moderator and reflector would come in contact with oxygen at a high temperature. Many results on graphite oxidation have been obtained from TGA measurements using commercial instruments, with sample sizes of a few hundred milligrams. They have demonstrated that graphite oxidation is in the kinetic control regime at low temperatures but becomes diffusion-limited at high temperatures. These effects are better understood from measurement results with large samples, on which the shape and structural factors that control diffusion can be more clearly evidenced. An ASTM test for characterization of oxidation resistance of machined carbon and graphite materials is being developed with ORNL participation. The test recommends the use of large machined samples (~ 20 grams) in a dry air flow system. We will report on recent results and progress in this direction.
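    In the kinetically controlled regime noted above, the oxidation rate follows an Arrhenius law r = A·exp(−Ea/RT), which is why small temperature differences matter so much at low temperature. The sketch below evaluates that generic law; the pre-exponential factor and the 200 kJ/mol activation energy are illustrative placeholders, not values from the ASTM test work.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius_rate(T_kelvin, A=1.0, Ea=2.0e5):
    """Relative oxidation rate under kinetic control, r = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T_kelvin))

# Kinetic-control rates are steeply temperature dependent:
ratio = arrhenius_rate(1000.0) / arrhenius_rate(800.0)
```

    At high temperature the measured rate flattens because oxygen diffusion to the sample, not the surface chemistry, becomes limiting, which is part of the motivation for testing large machined samples.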

  18. Development of a Multi-Point Microwave Interferometry (MPMI) Method

    SciTech Connect

    Specht, Paul Elliott; Cooper, Marcia A.; Jilek, Brook Anton

    2015-09-01

    A multi-point microwave interferometer (MPMI) concept was developed for non-invasively tracking a shock, reaction, or detonation front in energetic media. Initially, a single-point, heterodyne microwave interferometry capability was established. The design, construction, and verification of the single-point interferometer provided a knowledge base for the creation of the MPMI concept. The MPMI concept uses an electro-optic (EO) crystal to impart a time-varying phase lag onto a laser at the microwave frequency. Polarization optics converts this phase lag into an amplitude modulation, which is analyzed in a heterodyne interferometer to detect Doppler shifts in the microwave frequency. A version of the MPMI was constructed to experimentally measure the frequency of a microwave source through the EO modulation of a laser. The successful extraction of the microwave frequency proved the underlying physical concept of the MPMI design, and highlighted the challenges associated with the longer microwave wavelength. The frequency measurements made with the current equipment contained too much uncertainty for an accurate velocity measurement. Potential alterations to the current construction are presented to improve the quality of the measured signal and enable multiple accurate velocity measurements.

  19. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    ERIC Educational Resources Information Center

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  20. Analyzing Methods. A Procedural Guide for the Method Specialist. Research & Development Series No. 119-G. Career Planning Support System.

    ERIC Educational Resources Information Center

    Burkhardt, Carolyn M.; And Others

    Information in this brief guide, one of a set of twelve documents describing the Career Planning Support System (CPSS) and its use, is directed to the method specialist (a member of the CPSS steering committee) and provides procedures and a list of questions to aid in analyzing career development methods that may be appropriate for use in career…

  1. 78 FR 22540 - Notice of Public Meeting/Webinar: EPA Method Development Update on Drinking Water Testing Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-16

    ... AGENCY Notice of Public Meeting/Webinar: EPA Method Development Update on Drinking Water Testing Methods...: Notice of public meeting. SUMMARY: The U.S. Environmental Protection Agency (EPA) Office of Ground Water and Drinking Water, Standards and Risk Management Division's Technical Support Center (TSC)...

  2. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Riley, Tom; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity conducted during fiscal year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to that presented in the INL/EXT-??? report, which shows advances in probabilistic risk assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach in assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results regarding the comparison between RELAP5-3D and the new RELAP-7 code for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an offline tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.
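    The contrast drawn above between classical and simulation-based PRA can be caricatured in a few lines: instead of multiplying fixed event-tree probabilities, sequences are sampled and a damage criterion is evaluated per sample. The sketch below is a toy Monte Carlo illustration only; the two failure probabilities and the AND logic are invented for exposition and bear no relation to the RAVEN/RELAP analyses.

```python
import random

def estimate_core_damage_prob(n_samples=100_000, p_dg_fail=0.05,
                              p_battery_depleted=0.2, seed=1):
    """Toy Monte Carlo estimate of core-damage probability for a
    station-blackout-like sequence (hypothetical numbers throughout)."""
    rng = random.Random(seed)
    core_damage = 0
    for _ in range(n_samples):
        dg_fails = rng.random() < p_dg_fail              # emergency diesels fail
        battery_out = rng.random() < p_battery_depleted  # DC power depleted
        if dg_fails and battery_out:                     # no AC and no DC power
            core_damage += 1
    return core_damage / n_samples

p_cd = estimate_core_damage_prob()  # expect roughly 0.05 * 0.2 = 0.01
```

    In an actual RISMC workflow the inner condition would be replaced by a thermal-hydraulic simulation (e.g., RELAP) deciding whether each sampled sequence reaches core damage.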

  3. Methods for the development of a bioregenerative life support system

    NASA Technical Reports Server (NTRS)

    Goldman, Michelle; Gomez, Shawn; Voorhees, Mike

    1990-01-01

    Presented here is a rudimentary approach to designing a life support system based on the utilization of plants and animals. The biggest stumbling block in the initial phases of developing a bioregenerative life support system is encountered in collecting and consolidating the data. If a database existed for the systems engineer so that he or she may have accurate data and a better understanding of biological systems in engineering terms, then the design process would be simplified. Also addressed is a means of evaluating the subsystems chosen. These subsystems are unified into a common metric, kilograms of mass, and normalized in relation to the throughput of a few basic elements. The initial integration of these subsystems is based on input/output masses and eventually balanced to a point of operation within the inherent performance ranges of the organisms chosen. At this point, it becomes necessary to go beyond the simplifying assumptions of simple mass relationships and further define for each organism the processes used to manipulate the throughput matter. Mainly considered here is the fact that these organisms perform input/output functions on differing timescales, thus establishing the need for buffer volumes or appropriate subsystem phasing. At each point in a systematic design it is necessary to disturb the system and discern its sensitivity to the disturbance. This can be done either through the introduction of a catastrophic failure or by applying a small perturbation to the system. One example is increasing the crew size. Here the wide range of performance characteristics once again shows that biological systems have an inherent advantage in responding to systemic perturbations. Since the design of any space-based system depends on mass, power, and volume requirements, each subsystem must be evaluated in these terms.
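The input/output mass-balance integration described above can be sketched minimally in code. The flow values below are rough illustrative figures for human metabolism in kg per person per day, not data from the paper:

```python
# Hypothetical daily mass flows (kg/person/day), for illustration only.
crew = {"in": {"O2": 0.84, "food": 0.62, "water": 3.56},
        "out": {"CO2": 1.00, "water_vapor": 2.28, "waste": 1.74}}

def throughput(subsystem):
    """Total input and output mass for one subsystem (kg/person/day)."""
    return sum(subsystem["in"].values()), sum(subsystem["out"].values())

m_in, m_out = throughput(crew)
print(round(m_in, 2), round(m_out, 2))  # -> 5.02 5.02: the balance closes
```

Balancing each subsystem's totals this way is the first-pass integration step; buffer sizing and subsystem phasing then address the differing timescales noted above.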

  4. Description and Critique of Quantitative Methods for the Allocation of Exploratory Development Resources

    DTIC Science & Technology

The paper analyzes ten methods for planning the allocation of resources among projects within the exploratory development category of the Defense research, development, test and evaluation program. Each method is described in terms of a general framework of planning methods and of the factors that influence the allocation of development resources. A comparative analysis is made of the relative strengths and weaknesses of these methods. The more ...

  5. Development of a new modified Bethesda method for coagulation inhibitors: the Osaka modified Bethesda method.

    PubMed

    Torita, Sumiko; Suehisa, Etsuji; Kawasaki, Tomio; Toku, Masayuki; Takeo, Emi; Tomiyama, Yoshiaki; Nishida, Sumiyuki; Hidaka, Yoh

    2011-04-01

The Nijmegen assay for the factor VIII (F-VIII) inhibitor is recommended by the International Society on Thrombosis and Haemostasis/Scientific and Standardization Committee. However, due to cumbersome and complicated preprocessing, it is presently difficult to introduce this assay into hospital laboratories. We used buffered plasma, made by adding 1 volume of 1 mol/l HEPES buffer at pH 7.35 to 9 volumes of plasma, as the test samples. The inhibitor titer was calculated from the residual rate of F-VIII coagulation activity (F-VIII:C), using the ratio of the actual value to the theoretical value. Five hundred microliters of the buffered test plasma and of the control (30 mmol/l HEPES-buffered saline at pH 7.35) were each mixed with an equal volume (500 μl) of normal pooled plasma in a test tube (11 mm internal diameter, 6.5 ml capacity) and incubated at 37°C for 2 h. In stability tests of our modified Bethesda method, there were no significant changes in pH or F-VIII:C of the control and test mixtures after incubation. With the modified method, the inhibitor titers (mean, SD) from examining three hemophilia A plasma samples (F-VIII:C, <1-3%) and 40 normal samples (F-VIII:C, 34.5-168.3%) were (0.032, 0.057) and (-0.009, 0.057), respectively. By our method, the F-VIII inhibitor titer of type I inhibitor-positive samples was higher than with the Nijmegen method, while for type II inhibitor-positive samples the titer was similar. We believe that our method can be applied not only to the type I inhibitor but also to assays of the type II inhibitor, without cumbersome and complicated preprocessing.
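The titer calculation underlying Bethesda-type assays follows a log-linear relation between residual F-VIII:C and inhibitor units. A minimal sketch of the classical Bethesda relation (not the specific Osaka modification, which changes the preprocessing rather than this formula):

```python
import math

def bethesda_units(residual_activity_pct):
    """Inhibitor titer (BU/ml) from residual F-VIII:C in percent.

    One Bethesda unit is defined as the amount of inhibitor leaving 50%
    residual activity after 2 h at 37 degrees C; titers follow the
    log-linear relation BU = log2(100 / RA%). The relation is considered
    reliable roughly for RA between 25% and 75%; outside that range the
    sample is retested at a different dilution.
    """
    return math.log2(100.0 / residual_activity_pct)

print(bethesda_units(50.0))  # -> 1.0
print(bethesda_units(25.0))  # -> 2.0
```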

  6. Development of Laser-Ion Beam Photodissociation Methods

    SciTech Connect

    David H. Russell

    2004-05-11

stabilized) ions. It is difficult to probe the secondary/tertiary structure of gas-phase ions using fragmentation chemistry, because the energy barriers to inter-conversion of different structural forms lie below the fragmentation threshold; studies of low-internal-energy ions are better suited for this purpose. A major challenge for gas-phase ion research is the design of experimental structural probes that can be used in parallel with computational chemistry, molecular modeling, and/or classical structural diagnostic tools to aid interpretation of the experimental data. Our experimental design and selection of research problems are guided by this philosophy. The following sections of the progress report focus on two main issues: (i) technique and instrument development, and (ii) studies of ion structure and ion chemistry.

  7. Development of Laser-Ion Beam Photodissociation Methods

    SciTech Connect

    David H. Russell

    2004-03-31

stabilized) ions. It is difficult to probe the secondary/tertiary structure of gas-phase ions using fragmentation chemistry, because the energy barriers to inter-conversion of different structural forms lie below the fragmentation threshold; studies of low-internal-energy ions are better suited for this purpose. A major challenge for gas-phase ion research is the design of experimental structural probes that can be used in parallel with computational chemistry, molecular modeling, and/or classical structural diagnostic tools to aid interpretation of the experimental data. Our experimental design and selection of research problems are guided by this philosophy. The following sections of the progress report focus on two main issues: (i) technique and instrument development, and (ii) studies of ion structure and ion chemistry.

  8. Development of Continuous-Energy Eigenvalue Sensitivity Coefficient Calculation Methods in the Shift Monte Carlo Code

    SciTech Connect

    Perfetti, Christopher M; Martin, William R; Rearden, Bradley T; Williams, Mark L

    2012-01-01

    Three methods for calculating continuous-energy eigenvalue sensitivity coefficients were developed and implemented into the SHIFT Monte Carlo code within the Scale code package. The methods were used for several simple test problems and were evaluated in terms of speed, accuracy, efficiency, and memory requirements. A promising new method for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was developed and produced accurate sensitivity coefficients with figures of merit that were several orders of magnitude larger than those from existing methods.
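The figure of merit (FOM) used to compare such Monte Carlo methods is the standard 1/(R²T) measure. A small sketch with purely illustrative (not measured) numbers:

```python
def figure_of_merit(rel_std_dev, cpu_time_s):
    """Monte Carlo figure of merit, FOM = 1 / (R^2 * T).

    R is the relative standard deviation of the tally and T the CPU time.
    Because R^2 decreases as 1/T, the FOM is roughly constant for a given
    method, so a method whose FOM is orders of magnitude larger reaches
    the same precision in correspondingly less time.
    """
    return 1.0 / (rel_std_dev ** 2 * cpu_time_s)

# Illustrative numbers: equal precision reached in 1000x less CPU time.
fom_existing = figure_of_merit(0.01, 1000.0)
fom_new = figure_of_merit(0.01, 1.0)
print(round(fom_existing, 6), round(fom_new, 6))  # -> 10.0 10000.0
```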

9. Development of measurement method of work hardening behavior in large plastic strain for sheet metal forging

    NASA Astrophysics Data System (ADS)

Yoshikawa, Nobuo; Yamashita, Tomohiro; Shirakami, Satoshi; Okada, Osamu; Yoshida, Tohru; Hiwatashi, Shunji

    2016-08-01

For the purpose of improving the accuracy of sheet metal forging FE analysis, we have developed a new method for measuring work hardening behavior at large plastic strains by repeatedly performing a simple shear test on pre-strained steel sheet. With this method, it is possible to measure work hardening behavior beyond an equivalent plastic strain of 2.0. In addition, a comparison between the developed method and a compression test was carried out in order to verify the validity of the results obtained by the developed method. The two results were in good agreement, and the validity of the developed method has been verified.

  10. Development of Improved Microwave Dielectric Materials and Devices using Advanced Experimental and Theoretical Methods

    DTIC Science & Technology

    2008-04-17

Our work has made important progress towards developing a fundamental understanding of the microscopic mechanism that causes... "Electromagnetic Band Gap Filters using advanced ceramic injection molding methods", Semiconductor Research Corporation Packaging and Interconnect Summer...

  11. Four Methods for Completing the Conceptual Development Phase of Applied Theory Building Research in HRD

    ERIC Educational Resources Information Center

    Storberg-Walker, Julia; Chermack, Thomas J.

    2007-01-01

    The purpose of this article is to describe four methods for completing the conceptual development phase of theory building research for single or multiparadigm research. The four methods selected for this review are (1) Weick's method of "theorizing as disciplined imagination" (1989); (2) Whetten's method of "modeling as theorizing" (2002); (3)…

  12. DEVELOPMENT OF AN ELECTROSPRAY MASS SPECTROMETRIC METHOD FOR DETERMINING PERCHLORATE IN FERTILIZERS

    EPA Science Inventory

    An electrospray mass spectrometric method has been developed for application to agricultural and horticultural fertilizers to determine perchlorate. After fertilizers are leached or dissolved in water, the method relies on the formation of stable ion pair complex of the perchlor...

  13. METHOD DEVELOPMENT FOR THE DETERMINATION OF FORMALDEHYDE IN SAMPLES OF ENVIRONMENTAL ORIGIN

    EPA Science Inventory

    An analytical method was developed for the determination of formaldehyde in samples of environmental origin. After a review of the current literature, five candidate methods involving chemical derivatization were chosen for evaluation. The five derivatization reagents studied wer...

  14. METHOD DEVELOPMENT FOR THE DETERMINATION OF FORMALDEHYDE IN SAMPLES OF ENVIRONMENTAL ORIGIN

    EPA Science Inventory

    An analytical method was developed for the determination of formaldehyde in samples of environmental origin. After a review of the current literature, five candidate methods involving chemical derivatization were chosen for evaluation. The five derivatization reagents studied wer...

  15. DEVELOPMENT OF AN ELECTROSPRAY MASS SPECTROMETRIC METHOD FOR DETERMINING PERCHLORATE IN FERTILIZERS

    EPA Science Inventory

    An electrospray mass spectrometric method has been developed for application to agricultural and horticultural fertilizers to determine perchlorate. After fertilizers are leached or dissolved in water, the method relies on the formation of stable ion pair complex of the perchlor...

  16. Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy

    ERIC Educational Resources Information Center

    Olaniran, Bolanle A., Ed.

    2010-01-01

    E-learning has become a significant aspect of training and education in the worldwide information economy as an attempt to create and facilitate a competent global work force. "Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy" provides eclectic accounts of case…

  17. Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy

    ERIC Educational Resources Information Center

    Olaniran, Bolanle A., Ed.

    2010-01-01

    E-learning has become a significant aspect of training and education in the worldwide information economy as an attempt to create and facilitate a competent global work force. "Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy" provides eclectic accounts of case…

  18. Development and validation of simple titrimetric method for the determination of magnesium content in esomeprazole magnesium.

    PubMed

    Haddadin, R N; Issa, A Y

    2011-07-01

    A simple and inexpensive titrimetric method for the determination of magnesium ion in esomeprazole magnesium raw material was developed and validated according to International Conference on Harmonization guidelines and the United States Pharmacopoeia. The method depends on complex formation between EDTA and magnesium ion. The method was proven to be valid, equivalent and useful as an alternative method to the current pharmacopeial methods that are based on atomic absorption spectrometry.
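The arithmetic behind a 1:1 complexometric EDTA titration is simple enough to sketch. The numbers below are hypothetical, chosen only to show the calculation, not values from the paper:

```python
MG_MOLAR_MASS = 24.305  # g/mol

def magnesium_percent(v_edta_ml, m_edta_mol_l, sample_mass_g):
    """Percent Mg (w/w) from a 1:1 EDTA complexometric titration.

    EDTA chelates Mg2+ in a 1:1 ratio, so moles of Mg equal moles of
    EDTA consumed at the end point.
    """
    mol_edta = (v_edta_ml / 1000.0) * m_edta_mol_l
    return mol_edta * MG_MOLAR_MASS / sample_mass_g * 100.0

# Hypothetical numbers: 10.00 ml of 0.0100 M EDTA for a 0.2000 g sample.
print(round(magnesium_percent(10.00, 0.0100, 0.2000), 3))  # -> 1.215
```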

  19. Development and Validation of Simple Titrimetric Method for the Determination of Magnesium Content in Esomeprazole Magnesium

    PubMed Central

    Haddadin, R. N.; Issa, A. Y.

    2011-01-01

    A simple and inexpensive titrimetric method for the determination of magnesium ion in esomeprazole magnesium raw material was developed and validated according to International Conference on Harmonization guidelines and the United States Pharmacopoeia. The method depends on complex formation between EDTA and magnesium ion. The method was proven to be valid, equivalent and useful as an alternative method to the current pharmacopeial methods that are based on atomic absorption spectrometry. PMID:22707837

  20. Development of a Method to Investigate Medical Students' Perceptions of Their Personal and Professional Development

    ERIC Educational Resources Information Center

    Lown, Nick; Davies, Ioan; Cordingley, Lis; Bundy, Chris; Braidman, Isobel

    2009-01-01

    Personal and Professional Development (PPD) is now key to the undergraduate medical curriculum and requires provision of appropriate learning experiences. In order to achieve this, it is essential that we ascertain students' perceptions of what is important in their PPD. We required a methodological approach suitable for a large medical school,…

  1. Development of a Method to Investigate Medical Students' Perceptions of Their Personal and Professional Development

    ERIC Educational Resources Information Center

    Lown, Nick; Davies, Ioan; Cordingley, Lis; Bundy, Chris; Braidman, Isobel

    2009-01-01

    Personal and Professional Development (PPD) is now key to the undergraduate medical curriculum and requires provision of appropriate learning experiences. In order to achieve this, it is essential that we ascertain students' perceptions of what is important in their PPD. We required a methodological approach suitable for a large medical school,…

  2. An exploratory survey of methods used to develop measures of performance

    NASA Astrophysics Data System (ADS)

    Hamner, Kenneth L.; Lafleur, Charles A.

    1993-09-01

Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was the most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metric development method. Those components were incorporated into a proposed metric development method that is based on the OMB Generic Method and should be more likely to produce high-quality metrics that result in continuous process improvement.

  3. RECENT DEVELOPMENTS IN ANALYTICAL METHODS FOR FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION

    EPA Science Inventory

    The U.S. Environmental Protection Agency has developed a test method for the analysis of fibrous amphibole in vermiculite attic insulation. This method was developed to provide the Agency with monitoring tools to study the occurrence and potential for exposure to fibrous amphibo...

  4. INDOOR AIR EMISSIONS FROM OFFICE EQUIPMENT: TEST METHOD DEVELOPMENT AND POLLUTION PREVENTION OPPORTUNITIES

    EPA Science Inventory

    The report describes the development and evaluation of a large chamber test method for measuring emissions from dry-process photocopiers. The test method was developed in two phases. Phase 1 was a single-laboratory evaluation at Research Triangle Institute (RTI) using four, mid-r...

  5. RECENT DEVELOPMENTS IN ANALYTICAL METHODS FOR FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION

    EPA Science Inventory

    The U.S. Environmental Protection Agency has developed a test method for the analysis of fibrous amphibole in vermiculite attic insulation. This method was developed to provide the Agency with monitoring tools to study the occurrence and potential for exposure to fibrous amphibo...

  6. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    SciTech Connect

    Campbell, J.A.; Clauss, S.; Grant, K.; Hoopes, V.; Lerner, B.; Lucke, R.; Mong, G.; Rau, J.; Wahl, K.; Steele, R.

    1994-09-01

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY).

  7. INDOOR AIR EMISSIONS FROM OFFICE EQUIPMENT: TEST METHOD DEVELOPMENT AND POLLUTION PREVENTION OPPORTUNITIES

    EPA Science Inventory

    The report describes the development and evaluation of a large chamber test method for measuring emissions from dry-process photocopiers. The test method was developed in two phases. Phase 1 was a single-laboratory evaluation at Research Triangle Institute (RTI) using four, mid-r...

  8. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    SciTech Connect

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    1994-09-01

The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and Tank T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and the 222-S laboratory. This report is intended as an annual report, not a completed work.

  9. RESEARCH ASSOCIATED WITH THE DEVELOPMENT OF EPA METHOD 552.2

    EPA Science Inventory

The work presented in this paper entails the development of a method for haloacetic acid (HAA) analysis, Environmental Protection Agency (EPA) Method 552.2, that improves the safety and efficiency of previous methods and incorporates three additional trihalogenated acetic acids: b...

  10. Development of a nonlinear vortex method. [steady and unsteady aerodynamic loads of highly sweptback wings

    NASA Technical Reports Server (NTRS)

    Kandil, O. A.

    1981-01-01

    Progress is reported in the development of reliable nonlinear vortex methods for predicting the steady and unsteady aerodynamic loads of highly sweptback wings at large angles of attack. Abstracts of the papers, talks, and theses produced through this research are included. The modified nonlinear discrete vortex method and the nonlinear hybrid vortex method are highlighted.

  11. RESEARCH ASSOCIATED WITH THE DEVELOPMENT OF EPA METHOD 552.2

    EPA Science Inventory

The work presented in this paper entails the development of a method for haloacetic acid (HAA) analysis, Environmental Protection Agency (EPA) Method 552.2, that improves the safety and efficiency of previous methods and incorporates three additional trihalogenated acetic acids: b...

  12. 75 FR 22126 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-27

    ... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods: Designation of One New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of the designation of one new equivalent method for monitoring ambient air quality. SUMMARY: Notice is hereby...

  13. 76 FR 62402 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods; Designation of One New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of the..., Research Triangle Park, North Carolina 27711. Designation of this new equivalent method is intended...

  14. Development of continuous-energy eigenvalue sensitivity coefficient calculation methods in the shift Monte Carlo Code

    SciTech Connect

    Perfetti, C.; Martin, W.; Rearden, B.; Williams, M.

    2012-07-01

    Three methods for calculating continuous-energy eigenvalue sensitivity coefficients were developed and implemented into the Shift Monte Carlo code within the SCALE code package. The methods were used for two small-scale test problems and were evaluated in terms of speed, accuracy, efficiency, and memory requirements. A promising new method for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was developed and produced accurate sensitivity coefficients with figures of merit that were several orders of magnitude larger than those from existing methods. (authors)

  15. An Elementary Introduction to Recently Developed Asymptotic Methods and Nanomechanics in Textile Engineering

    NASA Astrophysics Data System (ADS)

    He, Ji-Huan

This review is an elementary introduction to the concepts of recently developed asymptotic methods and new developments. Particular attention is paid throughout the paper to giving an intuitive grasp of the Lagrange multiplier, the calculus of variations, optimization, the variational iteration method, the parameter-expansion method, the exp-function method, the homotopy perturbation method, and ancient Chinese mathematics as well. Subsequently, nanomechanics in textile engineering and E-infinity theory in high-energy physics, Kleiber's 3/4 law in biology, a possible mechanism of the spider-spinning process, and a fractal approach to carbon nanotubes are briefly introduced. Bubble-electrospinning for mass production of nanofibers is illustrated. There are in total more than 280 references.
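As a flavor of these techniques, the parameter-expansion treatment of the Duffing oscillator, a standard benchmark for such asymptotic methods, gives its first-order frequency in a few lines (a textbook sketch, not taken from the review itself):

```latex
% Duffing oscillator: u'' + u + \varepsilon u^3 = 0, u(0)=A, u'(0)=0.
% Try u \approx A\cos\omega t and expand the cubic:
%   \cos^3\omega t = \tfrac34\cos\omega t + \tfrac14\cos 3\omega t.
% Collecting the \cos\omega t terms and eliminating the secular term
% (-\omega^2 + 1 + \tfrac34\varepsilon A^2)A\cos\omega t gives
\begin{equation}
  \omega = \sqrt{1 + \tfrac{3}{4}\,\varepsilon A^{2}}
  \quad\Longrightarrow\quad
  u(t) \approx A\cos\!\left(\sqrt{1 + \tfrac{3}{4}\,\varepsilon A^{2}}\; t\right).
\end{equation}
```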

  16. Development of a General Method for Determining Leak Rates from Limiting Enclosures

    NASA Technical Reports Server (NTRS)

    Zografos, A. I.; Blackwell, C. C.; Harper, Lynn D. (Technical Monitor)

    1994-01-01

    This paper discusses the development of a general method for the determination of very low leak rates from limiting enclosures. There are many methods that can be used to detect and repair leaks from enclosures. Many methods have also been proposed that allow the estimation of actual leak rates, usually expressed as enclosure volume turnover. The proposed method combines measurements of the state variables (pressure, temperature, and volume) as well as the change in the concentration of a tracer gas to estimate the leak rate. The method was applied to the containment enclosure of the Engineering Development Unit of the CELSS Test Facility, currently undergoing testing at the NASA Ames Research Center.
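The tracer-gas part of such a method reduces to fitting an exponential concentration decay. A minimal sketch with synthetic data (the pressure/temperature state-variable corrections that the full method combines with the tracer data are omitted here):

```python
import math

def leak_rate_per_volume(times_h, concentrations):
    """Estimate enclosure leak rate as volume turnovers per hour.

    Assumes a well-mixed tracer with C(t) = C0 * exp(-(Q/V) * t), so
    ln C is linear in t; the slope of an ordinary least-squares fit
    gives -Q/V.
    """
    n = len(times_h)
    ln_c = [math.log(c) for c in concentrations]
    t_bar = sum(times_h) / n
    y_bar = sum(ln_c) / n
    slope = (sum((t - t_bar) * (y - y_bar) for t, y in zip(times_h, ln_c))
             / sum((t - t_bar) ** 2 for t in times_h))
    return -slope  # Q/V in 1/h

# Synthetic data with a true leak rate of 0.002 volumes per hour.
ts = [0, 24, 48, 72, 96]
cs = [100.0 * math.exp(-0.002 * t) for t in ts]
print(round(leak_rate_per_volume(ts, cs), 4))  # -> 0.002
```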

  17. DEVELOPMENT OF ANALYTICAL METHODS FOR DETERMINING SUPPRESSOR CONCENTRATION IN THE MCU NEXT GENERATION SOLVENT (NGS)

    SciTech Connect

    Taylor-Pashow, K.; Fondeur, F.; White, T.; Diprete, D.; Milliken, C.

    2013-07-31

    Savannah River National Laboratory (SRNL) was tasked with identifying and developing at least one, but preferably two methods for quantifying the suppressor in the Next Generation Solvent (NGS) system. The suppressor is a guanidine derivative, N,N',N"-tris(3,7-dimethyloctyl)guanidine (TiDG). A list of 10 possible methods was generated, and screening experiments were performed for 8 of the 10 methods. After completion of the screening experiments, the non-aqueous acid-base titration was determined to be the most promising, and was selected for further development as the primary method. {sup 1}H NMR also showed promising results from the screening experiments, and this method was selected for further development as the secondary method. Other methods, including {sup 36}Cl radiocounting and ion chromatography, also showed promise; however, due to the similarity to the primary method (titration) and the inability to differentiate between TiDG and TOA (tri-n-ocytlamine) in the blended solvent, {sup 1}H NMR was selected over these methods. Analysis of radioactive samples obtained from real waste ESS (extraction, scrub, strip) testing using the titration method showed good results. Based on these results, the titration method was selected as the method of choice for TiDG measurement. {sup 1}H NMR has been selected as the secondary (back-up) method, and additional work is planned to further develop this method and to verify the method using radioactive samples. Procedures for analyzing radioactive samples of both pure NGS and blended solvent were developed and issued for the both methods.

  18. Lessons from comparative effectiveness research methods development projects funded under the Recovery Act.

    PubMed

    Zurovac, Jelena; Esposito, Dominick

    2014-11-01

    The American Recovery and Reinvestment Act of 2009 (ARRA) directed nearly US$29.2 million to comparative effectiveness research (CER) methods development. To help inform future CER methods investments, we describe the ARRA CER methods projects, identify barriers to this research and discuss the alignment of topics with published methods development priorities. We used several existing resources and held discussions with ARRA CER methods investigators. Although funded projects explored many identified priority topics, investigators noted that much work remains. For example, given the considerable investments in CER data infrastructure, the methods development field can benefit from additional efforts to educate researchers about the availability of new data sources and about how best to apply methods to match their research questions and data.

  19. Methods for assessment of innovative medical technologies during early stages of development.

    PubMed

    Bartelmes, Marc; Neumann, Ulrike; Lühmann, Dagmar; Schönermark, Matthias P; Hagen, Anja

    2009-11-05

Conventional Health Technology Assessment (HTA) is usually conducted at a point in time at which the development of the respective technology can no longer be influenced. By this time, developers and/or purchasers may have misinvested resources. Thus the demand becomes apparent for Technology Assessment (TA) that incorporates appropriate methods during the early development stages of a technology. Against this health policy background, the present report describes methods for the development-accompanying assessment of innovative medical technologies. Furthermore, international research programmes set up to identify or apply such methods are outlined. A systematic literature search as well as an extensive manual literature search were carried out in order to obtain literature and information. The largest groups of identified methods comprise assessment concepts, decision support methods, modelling approaches, and methods focusing on users and their knowledge. Additionally, several general-purpose concepts were identified. The identified research programmes INNO-HTA and MATCH (Multidisciplinary-Assessment-of-Technology-Centre-for-Healthcare) are to be seen as pilot projects which have so far not been able to generate final results. MATCH focuses almost entirely on incorporating the user perspective in the development of non-pharmaceutical technologies, whereas INNO-HTA is essentially concerned with the identification and possible advancement of methods for early, socially-oriented technology assessment. Most references offer only very vague descriptions of the respective method, and the application of the greatly differing methods seldom exceeds the character of a pilot implementation. A standardisation, much less an institutionalisation, of development-accompanying assessment cannot be recognized. It must be noted that there is no single method with which development-accompanying assessment should be carried out. Instead, a technology and

  20. The Development and Evaluation of Training Methods for Group IV Personnel. 1. Orientation and Implementation of the Training Methods Development School (TMDS).

    ERIC Educational Resources Information Center

    Steinemann, John H.

    The investigation is part of continuing Navy research on the Trainability of Group IV (low ability) personnel intended to maximize the utilization and integration of marginal personnel in the fleet. An experimental Training Methods Development School (TMDS) was initiated to provide an experimental training program, with research controls, for…

  1. Using mixed methods to develop and evaluate complex interventions in palliative care research.

    PubMed

    Farquhar, Morag C; Ewing, Gail; Booth, Sara

    2011-12-01

There is increasing interest in combining qualitative and quantitative research methods to provide comprehensiveness and greater knowledge yield. Mixed methods are valuable in the development and evaluation of complex interventions. They are therefore particularly valuable in palliative care research, where the majority of interventions are complex and the identification of outcomes particularly challenging. This paper aims to introduce the role of mixed methods in the development and evaluation of complex interventions in palliative care, and how they may be used in palliative care research. The paper defines mixed methods and outlines why and how mixed methods are used to develop and evaluate complex interventions, with a pragmatic focus on design and data collection issues and data analysis. Useful texts are signposted and illustrative examples provided of mixed method studies in palliative care, including a detailed worked example of the development and evaluation of a complex intervention in palliative care for breathlessness. Key challenges to conducting mixed methods in palliative care research are identified in relation to data collection, data integration in analysis, costs and dissemination, and how these might be addressed. The development and evaluation of complex interventions in palliative care benefit from the application of mixed methods. Mixed methods enable better understanding of whether and how an intervention works (or does not work) and inform the design of subsequent studies. However, they can be challenging: mixed method studies in palliative care will benefit from working with agreed protocols, multidisciplinary teams and engaging staff with appropriate skill sets.

  2. Implementing Expertise-Based Training Methods to Accelerate the Development of Peer Academic Coaches

    ERIC Educational Resources Information Center

    Blair, Lisa

    2016-01-01

    The field of expertise studies offers several models from which to develop training programs that accelerate the development of novice performers in a variety of domains. This research study implemented two methods of expertise-based training in a course to develop undergraduate peer academic coaches through a ten-week program. An existing…

  3. Development of advanced modal methods for calculating transient thermal and structural response

    NASA Technical Reports Server (NTRS)

    Camarda, Charles J.

    1991-01-01

    Higher-order modal methods for predicting thermal and structural response are evaluated. More accurate methods or ones which can significantly reduce the size of complex, transient thermal and structural problems are desirable for analysis and are required for synthesis of real structures subjected to thermal and mechanical loading. A unified method is presented for deriving successively higher-order modal solutions related to previously-developed, lower-order methods such as the mode displacement and mode-acceleration methods. A new method, called the force-derivative method, is used to obtain higher-order modal solutions for both uncoupled (proportionally-damped) structural problems as well as thermal problems and coupled (non-proportionally damped) structural problems. The new method is called the force-derivative method because, analogous to the mode-acceleration method, it produces a term that depends on the forcing function and additional terms that depend on the time derivatives of the forcing function.
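
    The gain from retaining static contributions of truncated modes can be shown numerically. The sketch below is my own toy example, not from the report: a 3-DOF fixed-free spring chain under a low-frequency harmonic load, where a one-mode mode-acceleration-style expansion beats the plain one-mode mode-displacement sum.

```python
import numpy as np

# Toy 3-DOF fixed-free spring chain (unit masses, unit springs).
K = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 1.0]])
f0 = np.array([0.0, 0.0, 1.0])  # harmonic tip-load amplitude
Om2 = 0.1                        # forcing frequency squared, below the first mode

w2, phi = np.linalg.eigh(K)      # modal frequencies^2, orthonormal shapes (M = I)
amp = phi.T @ f0                 # modal force amplitudes

# Exact steady-state amplitude: full modal sum.
exact = sum(amp[i] / (w2[i] - Om2) * phi[:, i] for i in range(3))

kept = 1  # truncate to the lowest mode
# Mode-displacement: plain truncated modal sum.
md = sum(amp[i] / (w2[i] - Om2) * phi[:, i] for i in range(kept))
# Mode-acceleration: static solution K^-1 f plus dynamic correction of kept modes,
# so the truncated modes still contribute their static (quasi-static) part.
ma = np.linalg.solve(K, f0) + sum(
    (amp[i] / (w2[i] - Om2) - amp[i] / w2[i]) * phi[:, i] for i in range(kept))

print(np.linalg.norm(ma - exact) < np.linalg.norm(md - exact))  # -> True
```

    The higher-order (force-derivative) corrections extend this idea by also accounting for time derivatives of the forcing in the truncated modes.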

  4. Development of a hybrid deterministic/stochastic method for 1D nuclear reactor kinetics

    SciTech Connect

    Terlizzi, Stefano; Dulla, Sandra; Ravetto, Piero; Rahnema, Farzad; Zhang, Dingkang

    2015-12-31

    A new method has been implemented for solving the time-dependent neutron transport equation efficiently and accurately. This is accomplished by coupling the hybrid stochastic-deterministic steady-state coarse-mesh radiation transport (COMET) method [1,2] with the new predictor-corrector quasi-static method (PCQM) developed at Politecnico di Torino [3]. In this paper, the coupled method is implemented and tested in 1D slab geometry.

  5. Development of a Panel Method for Modeling Configurations with Unsteady Component Motions. Phase 1

    DTIC Science & Technology

    1988-04-15

    …significant length scales, the methods rely on the results of existing wake modeling techniques to specify the boundary conditions on their solution. [Report cover recovered from OCR residue: Analytical Methods Report 8801, Development of a Panel Method for Modeling Configurations with Unsteady Component Motions, Phase I Final Report; prepared under contract DAAL03-87-C-0011 by David R. Clark and Brian Maskew, Analytical Methods Inc., Redmond, WA.]

  6. Using Models to Develop Measurement Systems: A Method and Its Industrial Use

    NASA Astrophysics Data System (ADS)

    Staron, Miroslaw; Meding, Wilhelm

    Making measurement processes work in large software development organizations requires collecting the right metrics and collecting them automatically. Collecting the right metrics requires developing custom measurement systems that fulfill the actual needs of the company. Effective communication between stakeholders (the persons with the information needs) and the designers of measurement systems is a cornerstone of identifying the right metrics and the right number of them. In this paper we describe a method for developing measurement systems based on models that make this communication more effective. The method supports the designers of measurement systems, and the managers for whom the measurement systems are created, in developing more effective measurement systems based on MS Excel. It comprises platform-independent modeling, platform-specific modeling and automated code generation, and has been used in one of the action research projects at Ericsson. We present the results of its evaluation at Ericsson at the end of this paper.

  7. Development and validation of spectrophotometric methods for estimating amisulpride in pharmaceutical preparations.

    PubMed

    Sharma, Sangita; Neog, Madhurjya; Prajapati, Vipul; Patel, Hiren; Dabhi, Dipti

    2010-01-01

    Five simple, sensitive, accurate and rapid visible spectrophotometric methods (A, B, C, D and E) have been developed for estimating Amisulpride in pharmaceutical preparations. These are based on the diazotization of Amisulpride with sodium nitrite and hydrochloric acid, followed by coupling with N-(1-naphthyl)ethylenediamine dihydrochloride (Method A), diphenylamine (Method B), beta-naphthol in an alkaline medium (Method C), resorcinol in an alkaline medium (Method D) and chromotropic acid in an alkaline medium (Method E) to form a colored chromogen. The absorption maxima, lambda(max), are at 523 nm for Method A, 382 and 490 nm for Method B, 527 nm for Method C, 521 nm for Method D and 486 nm for Method E. Beer's law was obeyed in the concentration range of 2.5-12.5 microg mL(-1) in Method A, 5-25 and 10-50 microg mL(-1) in Method B, 4-20 microg mL(-1) in Method C, 2.5-12.5 microg mL(-1) in Method D and 5-15 microg mL(-1) in Method E. The results obtained for the proposed methods are in good agreement with labeled amounts, when marketed pharmaceutical preparations were analyzed.
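
    Assays like these rest on Beer's law holding over the stated linear ranges: absorbance is proportional to concentration, so a straight-line calibration can be inverted to read off an unknown. A minimal sketch with idealized absorbance values (not data from this study), using Method A's 2.5-12.5 microg/mL range:

```python
# Sketch of a Beer's-law calibration: A = m*c + b over the linear range,
# inverted to recover an unknown concentration. Readings are idealized.

def fit_line(conc, absorbance):
    """Least-squares line through (concentration, absorbance) standards."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(absorbance) / n
    m = sum((x - mx) * (y - my) for x, y in zip(conc, absorbance)) / \
        sum((x - mx) ** 2 for x in conc)
    return m, my - m * mx

def concentration(absorbance, m, b):
    """Invert the calibration line to recover concentration (microg/mL)."""
    return (absorbance - b) / m

# Standards spanning Method A's stated 2.5-12.5 microg/mL linear range.
standards = [2.5, 5.0, 7.5, 10.0, 12.5]
readings = [0.125, 0.250, 0.375, 0.500, 0.625]  # ideal linear response

m, b = fit_line(standards, readings)
print(round(concentration(0.300, m, b), 2))  # -> 6.0
```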

  8. Commercial implementation of NASA-developed ultrasonic imaging methods via technology transfer

    SciTech Connect

    Roth, D.J.; Martin, K.; Hendricks, J.L.; Whalen, M.F.; Bodis, J.R.

    1996-11-01

    This article describes ultrasonic imaging methods developed and refined by the nondestructive evaluation group at NASA Lewis Research Center and their commercial implementation. The methods are based on the measurement of ultrasonic velocity in a material. Two velocity imaging methods were implemented: thickness based (apparent) velocity and thickness independent reflector plate velocity methods. The article demonstrates capabilities of the commercial implementation. The significance of this technology transfer effort is that (1) the commercial implementation of velocity imaging provides a 100x speed increase in scanning and processing over the laboratory methods developed at Lewis, thus making it applicable in industrial quality control application; and (2) the aerospace and other materials development intensive industries which use extensive ultrasonic inspection for process control and failure analysis will now have an alternative, highly quantitative, and revealing imaging method commercially available.

  9. The weight method: a new screening method for estimating pesticide deposition from knapsack sprayers in developing countries.

    PubMed

    García-Santos, Glenda; Scheiben, Dominik; Binder, Claudia R

    2011-03-01

    Investigations of occupational and environmental risk caused by the use of agrochemicals have received considerable interest over the last decades. And yet, in developing countries, the lack of staff and analytical equipment, as well as the costs of chemical analyses, make it difficult, if not impossible, to monitor pesticide contamination and residues in humans, air, water, and soils. A new and simple method is presented here for estimating pesticide deposition on humans and soil after application. The estimate is derived from the water mass balance measured in a given number of high-absorbent papers under low evaporative conditions and an unsaturated atmosphere. The method is presented as a suitable, rapid, low-cost screening tool, complementary to toxicological tests, for assessing occupational and environmental exposure caused by knapsack sprayers where analytical instruments are lacking. This new method, called the "weight method", was tested by obtaining drift deposition on the neighbouring field and on the clothes of the applicator after spraying water with a knapsack sprayer in one of the largest potato-production areas in Colombia. The results were confirmed by experimental data using a tracer and the same set-up used for the weight method. The weight method was able to explain 86% of the airborne drift and deposition variance.
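
    The mass-balance idea reduces to simple arithmetic: the water mass gained by a paper collector approximates the spray volume deposited on it, and multiplying by the tank-mix concentration gives a deposit estimate. A hedged sketch with hypothetical numbers (not values from the Colombian trials):

```python
# Sketch of the weight method's mass balance. Assumptions: water density
# 1.0 g/mL and negligible evaporation during weighing; all numeric
# values are hypothetical, not data from the study.

WATER_DENSITY_G_PER_ML = 1.0

def deposit_per_area(mass_before_g, mass_after_g, conc_ug_per_ml, area_cm2):
    """Pesticide deposit estimate (microg/cm^2) on one absorbent paper."""
    water_ml = (mass_after_g - mass_before_g) / WATER_DENSITY_G_PER_ML
    return water_ml * conc_ug_per_ml / area_cm2

# A 100 cm^2 paper gains 0.25 g while a 2000 microg/mL tank mix is sprayed.
print(round(deposit_per_area(1.20, 1.45, 2000.0, 100.0), 2))  # -> 5.0
```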

  10. Development of quadruped walking locomotion gait generator using a hybrid method

    NASA Astrophysics Data System (ADS)

    Jasni, F.; Shafie, A. A.

    2013-12-01

    In many areas, the terrain is hardly reachable by wheeled or tracked locomotion systems. Walking locomotion is therefore becoming a favoured option for mobile robots, because it can cope with rugged and unlevel terrain. However, developing a walking gait for a robot is not a simple task. The Central Pattern Generator (CPG) method is a biologically inspired approach recently introduced to generate gaits for walking robots and to tackle the issues faced by the conventional pre-designed trajectory method. However, research shows that even the CPG method has some limitations. Thus, in this paper, a hybrid method that combines CPG with the pre-designed trajectory method is introduced to develop a walking gait for a quadruped robot. The 3-D foot trajectories and joint angle trajectories developed using the proposed method are compared with data obtained via the conventional pre-designed trajectory method to confirm its performance.
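
    As an illustration of the CPG idea (not the authors' generator), a single Hopf oscillator settles onto a stable limit cycle whose output can drive one joint's rhythmic trajectory; coupling several such oscillators with phase offsets yields a gait. Parameters below are hypothetical:

```python
import math

def hopf_step(x, y, mu=1.0, omega=2.0 * math.pi, dt=0.001):
    """One explicit-Euler step of a Hopf oscillator (limit-cycle radius sqrt(mu))."""
    r2 = x * x + y * y
    nx = x + dt * ((mu - r2) * x - omega * y)
    ny = y + dt * ((mu - r2) * y + omega * x)
    return nx, ny

x, y = 0.1, 0.0          # start near the unstable origin
for _ in range(20000):   # 20 s of simulated time at dt = 1 ms
    x, y = hopf_step(x, y)

# x oscillates at ~1 Hz and can be mapped to a joint angle; the radius
# converges to sqrt(mu) regardless of the starting point.
print(round(math.hypot(x, y), 1))  # -> 1.0
```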

  11. Progress toward the development and testing of source reconstruction methods for NIF neutron imaging.

    PubMed

    Loomis, E N; Grim, G P; Wilde, C; Wilson, D C; Morgan, G; Wilke, M; Tregillis, I; Merrill, F; Clark, D; Finch, J; Fittinghoff, D; Bower, D

    2010-10-01

    Development of analysis techniques for neutron imaging at the National Ignition Facility is an important and difficult task for the detailed understanding of high-neutron-yield inertial confinement fusion implosions. Once developed, these methods must provide accurate images of the hot and cold fuels so that information about the implosion, such as symmetry and areal density, can be extracted. One method under development involves the numerical inversion of the pinhole image using knowledge of neutron transport through the pinhole aperture from Monte Carlo simulations. In this article we present results of source reconstructions based on simulated images that test the method's effectiveness with regard to pinhole misalignment.

  12. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    Modelling the curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, aims to determine the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a basis of the total error. Modelling the generation process highlights potential errors of the generating tool, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, the method of "relative generating trajectories". The analytical foundation is presented, together with applications to known models of rack-gear type tools used on Maag teething machines.

  13. The EDIC Method: An Engaging and Comprehensive Approach for Creating Health Department Workforce Development Plans.

    PubMed

    Grimm, Brandon L; Brandert, Kathleen; Palm, David; Svoboda, Colleen

    2016-09-29

    In 2013, the Nebraska Department of Health & Human Services, Division of Public Health (Nebraska's State Health Department); and the University of Nebraska Medical Center, College of Public Health developed a comprehensive approach to assess workforce training needs. This article outlines the method used to assess the education and training needs of Division staff, and develop comprehensive workforce development plans to address those needs. The EDIC method (Engage, Develop, Identify, and Create) includes the following four phases: (1) Engage Stakeholders, (2) Develop Assessment, (3) Identify Training Needs, and (4) Create Development Plans. The EDIC method provided a process grounded in science and practice, allowed input, and produced buy-in from staff at all levels throughout the Division of Public Health. This type of process provides greater assurance that the most important gaps in skills and competencies will be identified. Although it is a comprehensive approach, it can be replicated at the state or local level across the country.

  14. On Development of a Problem Based Learning System for Linear Algebra with Simple Input Method

    NASA Astrophysics Data System (ADS)

    Yokota, Hisashi

    2011-08-01

    Learning how to express a matrix using keyboard input takes considerable time for most college students. Therefore, for a problem-based learning system for linear algebra to be accessible to college students, it is essential to develop a simple method for expressing matrices. By studying the two most widely used input methods for expressing matrices, a simpler input method is obtained. Furthermore, using this input method and the educator's knowledge structure as a concept map, a problem-based learning system for linear algebra is developed that is capable of assessing students' knowledge structure and skill.

  15. A Summary of the Development of Integral Aerodynamic Methods for the Computation of Rotor Wake Interactions.

    DTIC Science & Technology

    1986-03-01

    [Report cover recovered from OCR residue: Analytical Methods Report 8605, A Summary of the Development of Integral Aerodynamic Methods for the Computation of Rotor Wake Interactions; Analytical Methods Inc., Redmond, WA, March 1986. Prepared for the Department of…]

  16. Development and Validation of Simultaneous Spectrophotometric Methods for Drotaverine Hydrochloride and Aceclofenac from Tablet Dosage Form

    PubMed Central

    Shah, S. A.; Shah, D. R.; Chauhan, R. S.; Jain, J. R.

    2011-01-01

    Two simple spectrophotometric methods have been developed for simultaneous estimation of drotaverine hydrochloride and aceclofenac from tablet dosage form. Method I is a simultaneous equation method (Vierodt's method), wavelengths selected are 306.5 and 276 nm. Method II is the absorbance ratio method (Q-Analysis), which employs 298.5 nm as λ1 and 276 nm as λ2 (λmax of AF) for formation of equations. Both the methods were found to be linear between the range of 8-32 μg/ml for drotaverine and 10-40 μg/ml for aceclofenac. The accuracy and precision were determined and found to comply with ICH guidelines. Both the methods showed good reproducibility and recovery with % RSD in the desired range. The methods were found to be rapid, specific, precise and accurate and can be successfully applied for the routine analysis of drotaverine and aceclofenac in their combined tablet dosage form. PMID:22457554
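
    The simultaneous-equation (Vierordt's) method treats the two absorbance readings as a 2x2 linear system in the two concentrations, A1 = ax1·Cx + ay1·Cy and A2 = ax2·Cx + ay2·Cy, where the absorptivities a are measured from pure standards. A sketch with illustrative absorptivity values (not the paper's calibration data):

```python
# Sketch of Vierordt's simultaneous-equation method: absorbances at two
# wavelengths form a 2x2 linear system in the two drug concentrations.
# All absorptivity and absorbance values below are illustrative.

def vierordt(a1, a2, ax1, ay1, ax2, ay2):
    """Solve A1 = ax1*Cx + ay1*Cy and A2 = ax2*Cx + ay2*Cy (Cramer's rule)."""
    det = ax1 * ay2 - ay1 * ax2
    cx = (a1 * ay2 - ay1 * a2) / det
    cy = (ax1 * a2 - a1 * ax2) / det
    return cx, cy

# Illustrative absorptivities (AU per microg/mL) at the two wavelengths.
ax1, ay1 = 0.040, 0.010   # drugs X and Y at wavelength 1
ax2, ay2 = 0.005, 0.030   # drugs X and Y at wavelength 2

cx, cy = vierordt(0.90, 0.70, ax1, ay1, ax2, ay2)
print(round(cx, 2), round(cy, 2))  # -> 17.39 20.43
```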

  17. Developing a Measure of Wealth for Primary Student Families in a Developing Country: Comparison of Two Methods of Psychometric Calibration

    ERIC Educational Resources Information Center

    Griffin, Patrick

    2005-01-01

    This article compares the invariance properties of two methods of psychometric instrument calibration for the development of a measure of wealth among families of Grade 5 pupils in five provinces in Vietnam. The measure is based on self-reported lists of possessions in the home. Its stability has been measured over two time periods. The concept of…

  18. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  19. Development of method to characterize emissions from spray polyurethane foam insulation

    EPA Science Inventory

    This presentation updates symposium participants re EPA progress towards development of SPF insulation emissions characterization methods. The presentation highlights evaluation of experiments investigating emissions after application of SPF to substrates in micro chambers and i...

  20. Novel quantitative methods for characterization of chemical induced functional alteration in developing neuronal cultures

    EPA Science Inventory

    ABSTRACT BODY: Thousands of chemicals lack adequate testing for adverse effects on nervous system development, stimulating research into alternative methods to screen chemicals for potential developmental neurotoxicity. Microelectrode arrays (MEA) collect action potential spiking...

  1. Development and validation of a same-day monitoring method for recreational water

    EPA Pesticide Factsheets

    When water is polluted, swimmers can become ill from exposure to waterborne pathogens. EPA scientists have developed a new DNA extraction method for determining the amount of pathogens present in water.

  4. Developing Non-Targeted Measurement Methods to Characterize the Human Exposome

    EPA Science Inventory

    The exposome represents all exposures experienced by an individual during their lifetime. Registered chemicals currently number in the tens-of-thousands, and therefore comprise a significant portion of the human exposome. To date, quantitative monitoring methods have been develop...

  6. DEVELOPMENT AND VALIDATION OF AN ION CHROMATOGRAPHIC METHOD FOR DETERMINING PERCHLORATE IN FERTILIZERS

    EPA Science Inventory

    A method has been developed for the determination of perchlorate in fertilizers. Materials are leached with deionized water to dissolve any soluble perchlorate compounds. Ion chromatographic separation is followed by suppressed conductivity for detection. Perchlorate is retained ...

  10. Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A fast, rigorous method was developed to maximize the extraction efficacy for ten perfluorocarboxylic acids and perfluorooctanesulfonate from wastewater-treatment sludge and to quantitate using liquid chromatography, tandem-mass spectrometry (LC/MS/MS). First, organic solvents w...

  12. Evaluation of an automatic brain segmentation method developed for neonates on adult MR brain images

    NASA Astrophysics Data System (ADS)

    Moeskops, Pim; Viergever, Max A.; Benders, Manon J. N. L.; Išgum, Ivana

    2015-03-01

    Automatic brain tissue segmentation is of clinical relevance in images acquired at all ages. The literature presents a clear distinction between methods developed for MR images of infants, and methods developed for images of adults. The aim of this work is to evaluate a method developed for neonatal images in the segmentation of adult images. The evaluated method employs supervised voxel classification in subsequent stages, exploiting spatial and intensity information. Evaluation was performed using images available within the MRBrainS13 challenge. The obtained average Dice coefficients were 85.77% for grey matter, 88.66% for white matter, 81.08% for cerebrospinal fluid, 95.65% for cerebrum, and 96.92% for intracranial cavity, currently resulting in the best overall ranking. The possibility of applying the same method to neonatal as well as adult images can be of great value in cross-sectional studies that include a wide age range.
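
    The Dice coefficients quoted above are the standard overlap score, Dice = 2|A∩B|/(|A|+|B|), computed per tissue label. A minimal sketch on toy label maps (not MRBrainS13 data):

```python
# Dice overlap (in %) between two segmentations for one tissue label.
# The label maps below are toy arrays for illustration only.

def dice(seg_a, seg_b, label):
    """Dice coefficient (%) for `label` in two flattened label maps."""
    a = [v == label for v in seg_a]
    b = [v == label for v in seg_b]
    inter = sum(x and y for x, y in zip(a, b))
    return 200.0 * inter / (sum(a) + sum(b))

reference = [0, 1, 1, 1, 2, 2, 0, 1]
predicted = [0, 1, 1, 0, 2, 2, 1, 1]
print(dice(reference, predicted, 1))  # -> 75.0
```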

  13. Progress toward developing a monitoring method for hydrogen cyanide in air

    SciTech Connect

    Cassinelli, M.E.

    1986-05-15

    The development of an analytical procedure for measuring free cyanide, the various solid and liquid sorbents tested, and the instability problems encountered during research to establish a monitoring method for hydrogen cyanide (HCN) in air are described. The original goals of the project were the development of improved sampling and analytical methods for hydrogen fluoride, HCN, and particulate cyanides by replacing liquid-impinger samplers with solid sorbent samplers and using an ion-chromatographic analytical technique.

  14. Analysis of Investigational Drugs in Biological Fluids Method Development and Routine Assay

    DTIC Science & Technology

    1988-04-12

    The purpose of work under this contract is to develop and routinely use analytical methods for the determination of the concentration in biological specimens of investigational drugs in support of pharmacokinetic and bioavailability studies undertaken for the purpose of new drug development for the US military establishment. Accepted scientific procedures including normal and reversed phase high-performance liquid chromatographic methods, post column derivatization, and protein precipitation and

  15. Method for developing national quality indicators based on manual data extraction from medical records

    PubMed Central

    Couralet, Melanie; Leleu, Henri; Capuano, Frederic; Marcotte, Leah; Nitenberg, Gérard; Sicotte, Claude; Minvielle, Etienne

    2013-01-01

    Developing quality indicators (QI) for national purposes (e.g., public disclosure, pay-for-performance) highlights the need to find accessible and reliable data sources for collecting standardised data. The most accurate and reliable data source for collecting clinical and organisational information still remains the medical record. Data collection from electronic medical records (EMR) would be far less burdensome than from paper medical records (PMR). However, the development of EMRs is costly and has suffered from low rates of adoption and barriers of usability, even in developed countries. Currently, methods for producing national QIs based on the medical record rely on manual extraction from PMRs. We propose and illustrate such a method. These QIs display feasibility, reliability and discriminative power, and can be used to compare hospitals. They have been implemented nationwide in France since 2006. The method used to develop these QIs could be adapted for use in large-scale programmes of hospital regulation in other countries, including developing ones. PMID:23015098

  16. Using Mixed Methods to Analyze Video Data: A Mathematics Teacher Professional Development Example

    ERIC Educational Resources Information Center

    DeCuir-Gunby, Jessica T.; Marshall, Patricia L.; McCulloch, Allison W.

    2012-01-01

    This article uses data from 65 teachers participating in a K-2 mathematics professional development research project as an example of how to analyze video recordings of teachers' classroom lessons using mixed methods. Through their discussion, the authors demonstrate how using a mixed methods approach to classroom video analysis allows researchers…

  17. The Design, Development, and Evaluation of a Systematic Method for English Diction in Choral Performance.

    ERIC Educational Resources Information Center

    Fisher, Robert E.

    1991-01-01

    Argues that many existing methods for teaching English diction for choral performance are ineffective. Describes a study of the Articulatory Diction Development Method (ADDM), focusing on kinesthetic awareness and speech articulator control. Concludes through research involving three high school choirs that the ADDM can improve choral tone and…

  18. Development of a Simultaneous Extraction and Cleanup Method for Pyrethroid Pesticides from Indoor House Dust Samples

    EPA Science Inventory

    An efficient and reliable analytical method was developed for the sensitive and selective quantification of pyrethroid pesticides (PYRs) in house dust samples. The method is based on selective pressurized liquid extraction (SPLE) of the dust-bound PYRs into dichloromethane (DCM) wi...

  19. Development of a flaw detection material for the magnetic particle method

    NASA Astrophysics Data System (ADS)

    Chesnokova, A. A.; Kalayeva, S. Z.; Ivanova, V. A.

    2017-08-01

    The issues of increasing the effectiveness of the magnetic particle method of nondestructive testing by using a new flaw detection material are considered in this paper. The requirements for flaw detection materials that ensure the effectiveness of the inspection method are determined. A new flaw detection material, magnetic fluids made from iron-containing waste products, has been developed.

  20. The Effectiveness of the Socratic Method in Developing Critical Thinking Skills in English Language Learners

    ERIC Educational Resources Information Center

    Jensen, Roger D., Jr.

    2015-01-01

    Critical thinking skills are an important topic of the United States' education system. This study examines the literature on critical thinking skills and defines them. The study also explores one specific teaching and assessment strategy known as the Socratic Method. The five-week research study used the Socratic Method for developing critical…

  1. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    ERIC Educational Resources Information Center

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  2. METHOD DEVELOPMENT FOR THE DETERMINATION OF PERFLUORINATED ORGANIC COMPOUNDS ( PFCS ) IN SURFACE WATER

    EPA Science Inventory

    The method for the determination of perfluorinated organic compounds (PFCs) in surface water has been developed and applied to natural water. The method shows an adequate sensitivity, precision and accuracy for ten kinds of target compounds. These PFCs were found in most samples...

  3. An overview of recent developments and current status of gluten ELISA methods

    USDA-ARS?s Scientific Manuscript database

    ELISA methods for detecting and quantitating allergens have been around for some time and they are continuously improved. In this context, the development of gluten methods is no exception. Around the turn of the millennium, doubts were raised whether the existing “Skerritt-ELISA” would meet the 20 ...

  4. Development and assessment of disinfectant efficacy test methods for regulatory purposes.

    PubMed

    Tomasino, Stephen F

    2013-05-01

    The United States Environmental Protection Agency regulates pesticidal products, including products with antimicrobial activity. Test guidelines have been established to inform manufacturers of which methodology is appropriate to support a specific efficacy claim. This paper highlights efforts designed to improve current methods and the development and assessment of new test methods.

  5. Wellbeing Research in Developing Countries: Reviewing the Role of Qualitative Methods

    ERIC Educational Resources Information Center

    Camfield, Laura; Crivello, Gina; Woodhead, Martin

    2009-01-01

    The authors review the contribution of qualitative methods to exploring concepts and experiences of wellbeing among children and adults living in developing countries. They provide examples illustrating the potential of these methods for gaining a holistic and contextual understanding of people's perceptions and experiences. Some of these come…

  6. RESEARCH TOWARDS DEVELOPING METHODS FOR SELECTED PHARMACEUTICAL AND PERSONAL CARE PRODUCTS (PPCPS) ADAPTED FOR BIOSOLIDS

    EPA Science Inventory

    Development, standardization, and validation of analytical methods provides state-of-the-science techniques to evaluate the presence, or absence, of select PPCPs in biosolids. This research provides the approaches, methods, and tools to assess the exposures and redu...

  7. Development of a photogrammetric method of measuring tree taper outside bark

    Treesearch

    David R. Larsen

    2006-01-01

    A photogrammetric method is presented for measuring tree diameters outside bark using calibrated control ground-based digital photographs. The method was designed to rapidly collect tree taper information from subject trees for the development of tree taper equations. Software that is commercially available, but designed for a different purpose, can be readily adapted...

  8. Development of a Simultaneous Extraction and Cleanup Method for Pyrethroid Pesticides from Indoor House Dust Samples

    EPA Science Inventory

    An efficient and reliable analytical method was developed for the sensitive and selective quantification of pyrethroid pesticides (PYRs) in house dust samples. The method is based on selective pressurized liquid extraction (SPLE) of the dust-bound PYRs into dichloromethane (DCM) wi...
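    Quantification in methods like this ultimately rests on a calibration curve: instrument response is fitted against standards of known concentration, then inverted for unknowns. The sketch below shows that generic step with fabricated, perfectly linear data; the EPA method's actual calibration design is not reproduced here.

```python
# Generic calibration sketch for quantifying an analyte such as a
# pyrethroid: fit an ordinary least-squares line to standards
# (concentration vs. instrument response), then invert it for an
# unknown sample. All numbers below are made up for illustration.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Standards: concentration (ng/mL) vs. peak area, exactly linear here
# (area = 20 * conc + 5), so the fit recovers slope 20, intercept 5.
conc = [10, 50, 100, 200]
area = [205, 1005, 2005, 4005]

slope, intercept = fit_line(conc, area)
unknown = (1605 - intercept) / slope  # sample with area 1605 -> 80 ng/mL
```

Real method validation adds replicate standards, blanks, and recovery checks around this core computation.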

  9. Using Mixed Methods to Analyze Video Data: A Mathematics Teacher Professional Development Example

    ERIC Educational Resources Information Center

    DeCuir-Gunby, Jessica T.; Marshall, Patricia L.; McCulloch, Allison W.

    2012-01-01

    This article uses data from 65 teachers participating in a K-2 mathematics professional development research project as an example of how to analyze video recordings of teachers' classroom lessons using mixed methods. Through their discussion, the authors demonstrate how using a mixed methods approach to classroom video analysis allows researchers…

  12. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    ERIC Educational Resources Information Center

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  13. Recommendations for Developing Alternative Test Methods for Screening and Prioritization of Chemicals for Developmental Neurotoxicity

    EPA Science Inventory

    Developmental neurotoxicity testing (DNT) is perceived by many stakeholders to be an area in critical need of alternative methods to current animal testing protocols and guidelines. An immediate goal is to develop test methods that are capable of screening large numbers of chemic...

  14. Algorithms for Zonal Methods and Development of Three Dimensional Mesh Generation Procedures.

    DTIC Science & Technology

    1984-02-01

    zonal methods and 3-D grid generation procedures for application of finite difference methods to solve complex aircraft configurations. For the task...this research has been to further develop grid generation procedures and zonal methods so as to extend the applications of nonlinear finite difference... Appendix C: A Chimera Grid Scheme; Appendix D: A Conservative Finite Difference Algorithm for the Unsteady Transonic Potential...
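    The Chimera (overset/zonal) idea named in the appendix titles decomposes a complex domain into overlapping simple grids that each advance a finite-difference scheme, exchanging overlap-boundary values by interpolation. The following 1-D diffusion sketch illustrates that exchange; it is a toy under stated assumptions (uniform grids, explicit FTCS step, linear donor interpolation), far simpler than the report's 3-D schemes.

```python
# Minimal 1-D sketch of a Chimera (overset/zonal) grid exchange: two
# overlapping uniform grids each take an explicit diffusion step, and
# each supplies the other's overlap boundary by linear interpolation.
# Illustrative only -- not the report's 3-D algorithms.

def linspace(a, b, n):
    h = (b - a) / (n - 1)
    return [a + i * h for i in range(n)]

def interp(x, xs, us):
    """Linear interpolation of samples (xs, us) at point x."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return (1 - t) * us[i] + t * us[i + 1]
    raise ValueError("x outside donor grid")

def ftcs_step(u, dx, dt, nu):
    """One explicit (FTCS) diffusion step on interior points."""
    new = u[:]
    for i in range(1, len(u) - 1):
        new[i] = u[i] + nu * dt / dx**2 * (u[i-1] - 2*u[i] + u[i+1])
    return new

# Zone 1 spans [0.0, 0.6], zone 2 spans [0.4, 1.0]; they overlap.
x1, x2 = linspace(0.0, 0.6, 31), linspace(0.4, 1.0, 31)
u1 = [1.0 - x for x in x1]   # initial condition u = 1 - x (steady state)
u2 = [1.0 - x for x in x2]
dx, dt, nu = 0.02, 1e-4, 1.0  # nu*dt/dx^2 = 0.25, within FTCS stability

for _ in range(10):
    u1, u2 = ftcs_step(u1, dx, dt, nu), ftcs_step(u2, dx, dt, nu)
    u1[-1] = interp(x1[-1], x2, u2)  # zone 1 right boundary from zone 2
    u2[0] = interp(x2[0], x1, u1)    # zone 2 left boundary from zone 1
```

Because the initial profile is linear (a steady solution of the diffusion equation), both zones preserve u = 1 - x, which gives a quick correctness check on the interface coupling.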

  15. Development of In-Mold Assembly Methods for Producing Mesoscale Revolute Joints

    DTIC Science & Technology

    2009-01-01

    positioning methods to realize cavity shape change to avoid damage to delicate mesoscale parts created during molding, (3) developing a method to...premolded component, this process may lead to irreparable damage to the first-stage part. As a result, cavity morphing methods are the only feasible...damage to the part. When the mesoscale pin is molded first, there is a concern that the...

  16. Development and evaluation of a source sampling and analysis method for hydrogen cyanide

    SciTech Connect

    Steger, J.L.; Merrill, R.G.; Fuerst, R.G.

    1997-12-31

    Laboratory studies were carried out to develop a method for the sampling and analysis of hydrogen cyanide from stationary source air emissions using a dilute NaOH solution as the collection medium. The method extracts stack gas from the emission source and stabilizes the reactive gas in dilute sodium hydroxide solution for subsequent analysis. A modified Method 0050 sampling train was evaluated by dynamically spiking hydrogen cyanide into the heated probe while sampling simulated or actual source gas.
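    Dynamic spiking checks a sampling train by introducing a known amount of analyte and comparing what is recovered against the native (unspiked) level. A minimal sketch of that recovery calculation follows; the function name and the numbers are illustrative assumptions, not values from the report.

```python
# Hedged sketch of the spike-recovery check behind dynamic spiking:
# recovery compares the analyte measured in a spiked sampling train
# against the native (unspiked) level and the known spiked amount.
# Names and numbers are illustrative, not from the report.

def percent_recovery(measured_spiked_ug: float,
                     native_ug: float,
                     spiked_ug: float) -> float:
    """Spike recovery (%) = (measured - native) / spiked * 100."""
    return (measured_spiked_ug - native_ug) / spiked_ug * 100.0

# Example: 48 ug found in the spiked train, 3 ug native, 50 ug spiked
# gives 90% recovery.
r = percent_recovery(48.0, 3.0, 50.0)
```

Method-validation protocols typically require recoveries within a stated window (often on the order of 70-130%) before a sampling method is judged acceptable.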

  17. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  18. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  19. FODEM: A Multi-Threaded Research and Development Method for Educational Technology

    ERIC Educational Resources Information Center

    Suhonen, Jarkko; de Villiers, M. Ruth; Sutinen, Erkki

    2012-01-01

    Formative development method (FODEM) is a multithreaded design approach that was originated to support the design and development of various types of educational technology innovations, such as learning tools, and online study programmes. The threaded and agile structure of the approach provides flexibility to the design process. Intensive…
