Science.gov

Sample records for 95-39 methods development

  1. Medical Research and Evaluation Facility (MREF) and studies supporting the medical chemical defense program: Task 95-39: Methods development and validation of two mouse bioassays for use in quantifying botulinum toxins (a, b, c, d and e) and toxin antibody titers. Final report

    SciTech Connect

    Olson, C.T.; Gelzleichter, T.R.; Myers, M.A.; Menton, R.G.; Neimuth, N.A.

    1997-06-01

    This task was conducted for the U.S. Army Medical Materiel Development Activity (USAMMDA) to validate two mouse bioassays for quantifying botulinum toxin potency and neutralizing antibodies to botulinum toxins. Phase I experiments were designed to validate the mouse potency assay. The coefficients of variation for day-to-day variability were 10, 7, 10, 9, and 13 percent for serotypes A, B, C, D, and E, respectively. Phase II experiments were designed to develop and validate an assay for measuring the neutralizing antibody content of serum. Avidity results were characterized at three separate test levels, L+/10, L+/33, and L+/100. The coefficients of variation for day-to-day variability were 9, 44, 11, 34, and 13 percent for serotypes A, B, C, D, and E, respectively. Limits of quantitation were approximately 0.02, 0.005, 0.012, 0.026, and 0.013 U/mL for serotypes A, B, C, D, and E, respectively. Phase III consisted of limited studies to develop a model of passive immunity in guinea pigs by intraperitoneal treatment with human botulinum immune globulin (BIG).
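
    The day-to-day variability figures quoted above are coefficients of variation (standard deviation divided by the mean, expressed as a percentage). A minimal sketch of how such a figure is computed from repeated assay runs is shown below; the potency values are hypothetical and not taken from the report.

    ```python
    import statistics

    def coefficient_of_variation(values):
        """Percent CV: sample standard deviation divided by the mean."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    # Hypothetical day-to-day potency results (units/mL) for one serotype.
    daily_potency = [104.0, 96.0, 110.0, 91.0, 99.0]
    print(f"CV = {coefficient_of_variation(daily_potency):.1f}%")
    ```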

  2. Radiochemical method development

    SciTech Connect

    Erickson, M.D.; Aldstadt, J.H.; Alvarado, J.S.; Crain, J.S.; Orlandini, K.A.; Smith, L.L.

    1994-09-01

    The authors have developed methods for chemical characterization of the environment under a multitask project that focuses on improvement of radioanalytical methods with an emphasis on faster and cheaper routine methods. The authors have developed improved methods for separation of environmental levels of technetium-99, radium, and actinides from soil and water; separation of actinides from soil and water matrix interferences; and isolation of strontium. They are also developing methods for simultaneous detection of multiple isotopes (including nonradionuclides) by using a new instrumental technique, inductively coupled plasma-mass spectrometry (ICP-MS). The new ICP-MS methods have greater sensitivity and efficiency and could replace many radiometric techniques. They are using flow injection analysis to integrate and automate the separation methods with the ICP-MS methodology. The final product of all activities will be methods that are available (published in the U.S. Department of Energy`s analytical methods compendium) and acceptable for use in regulatory situations.

  3. Biological Methods and Manual Development

    EPA Pesticide Factsheets

    EPA scientists conduct research to develop and evaluate analytical methods for the identification, enumeration, and evaluation of aquatic organisms exposed to environmental stressors, and to correlate exposures with effects on chemical and biological indicators.

  4. New methodical developments for GRANIT

    SciTech Connect

    Baessler, Stefan; Nesvizhevsky, V.; Toperverg, B.; Zhernenkov, K.; Gagarski, A.; Lychagin, E.; Muzychka, A.; Strelkov, A.; Mietke, A.

    2011-01-01

    New methodical developments for the GRANIT spectrometer address further improvements of the critical parameters of this experimental installation, as well as its application to new fields of research. Keeping in mind the extremely small fraction of ultracold neutrons (UCN) that can be bound in gravitational quantum states, we look for methods to increase statistics by developing UCN sources with maximum phase-space density, counting a large fraction of neutrons simultaneously using position-sensitive detectors, and decreasing detector backgrounds. We also explore a possible application of the GRANIT spectrometer beyond the scope of its initial goals, for instance for reflectometry with UCN.

  5. Space Radiation Transport Methods Development

    NASA Astrophysics Data System (ADS)

    Wilson, J.; Tripathi, R.; Qualls, G.; Cucinotta, F.; Prael, R.; Norbury, J.

    Early space radiation shield code development relied on Monte Carlo methods for proton, neutron, and pion transport and made important contributions to the space program. More recently, the Monte Carlo code LAHET has been upgraded to include high-energy multiple-charged light ions for GCR simulations and continues to be expanded in capability. To compensate for low computational efficiency, Monte Carlo methods have resorted to restricted one-dimensional problems, leading to imperfect representations of appropriate boundary conditions. Even so, intensive computational requirements resulted; shield evaluation was made near the end of the design process, and resolving shielding issues usually had a negative impact on the design. We evaluate the implications of these common one-dimensional assumptions on the evaluation of the Shuttle internal radiation field. Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice, enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be utilized in the final design as verification of the deterministic method optimized design.

  6. Space Radiation Transport Methods Development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be utilized in the final design as verification of the deterministic method optimized design.
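
    The field-mapping step described above amounts to tracing rays from a target point through the shield geometry and converting the material traversed along each ray into a dose contribution. The sketch below illustrates that idea only; it is not the HZETRN or LAHET implementation, and the exponential response and material values are placeholders.

    ```python
    import math

    # Schematic sketch: sum the areal density (g/cm^2) crossed by each ray and
    # apply a dose-versus-depth response, then average over the ray set. The
    # exponential response is a hypothetical stand-in used purely for illustration.

    def areal_density(segments):
        """Total areal density along one ray; segments = (thickness_cm, density_g_cm3)."""
        return sum(t * rho for t, rho in segments)

    def dose_at_point(rays, free_space_dose=1.0, attenuation_length=30.0):
        """Average the shielded dose response over an isotropic set of rays."""
        responses = [
            free_space_dose * math.exp(-areal_density(segs) / attenuation_length)
            for segs in rays
        ]
        return sum(responses) / len(responses)

    # Two hypothetical rays: one through 2 cm of aluminum, one through
    # 5 cm of aluminum plus 1 cm of polyethylene.
    rays = [
        [(2.0, 2.7)],
        [(5.0, 2.7), (1.0, 0.94)],
    ]
    print(f"relative dose ≈ {dose_at_point(rays):.3f}")
    ```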

  7. Developing Scoring Algorithms (Earlier Methods)

    Cancer.gov

    We developed scoring procedures to convert screener responses to estimates of individual dietary intake for fruits and vegetables, dairy, added sugars, whole grains, fiber, and calcium using the What We Eat in America 24-hour dietary recall data from the 2003-2006 NHANES.
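
    A scoring procedure of this kind typically converts a reported screener frequency into a daily frequency and scales it by a portion size estimated from the 24-hour recall data. The sketch below is a hedged illustration of that step; the frequency conversions and portion sizes are placeholders, not the published NHANES-derived values.

    ```python
    # Hypothetical scoring sketch: screener responses are mapped to daily
    # frequencies and scaled by age/sex-specific median portion sizes.
    # All lookup values below are placeholders for illustration only.

    FREQUENCY_PER_DAY = {"never": 0.0, "1-3 per month": 0.07, "1-2 per week": 0.21,
                         "3-4 per week": 0.5, "1 per day": 1.0, "2+ per day": 2.5}

    MEDIAN_PORTION_CUPS = {("fruit", "female", "19-30"): 0.8,
                           ("vegetables", "female", "19-30"): 0.9}

    def estimated_intake(food, sex, age_group, response):
        """Cup-equivalents per day = daily frequency x median portion size."""
        return FREQUENCY_PER_DAY[response] * MEDIAN_PORTION_CUPS[(food, sex, age_group)]

    print(estimated_intake("fruit", "female", "19-30", "3-4 per week"))  # 0.4
    ```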

  8. GIS Method for Developing Wind Supply Curves

    SciTech Connect

    Kline, D.; Heimiller, D.; Cowlin, S.

    2008-06-01

    This report describes work conducted by the National Renewable Energy Laboratory (NREL) as part of the Wind Technology Partnership (WTP) sponsored by the U.S. Environmental Protection Agency (EPA). This project has developed methods that the National Development and Reform Commission (NDRC) intends to use in the planning and development of China's 30 GW of planned capacity. Because of China's influence within the community of developing countries, the methods and approaches described here may help foster wind development in other countries.

  9. Moral counselling: a method in development.

    PubMed

    de Groot, Jack; Leget, Carlo

    2011-01-01

    This article describes a method of moral counselling developed in the Radboud University Medical Centre Nijmegen (The Netherlands). The authors apply insights of Paul Ricoeur to the non-directive counselling method of Carl Rogers in their work of coaching patients with moral problems in health care. The developed method was shared with other health care professionals in a training course. Experiences in the course and further practice led to further improvement of the method.

  10. Addressing gaps in the contraceptive method mix: methods in development.

    PubMed

    Nanda, Kavita; Callahan, Rebecca; Dorflinger, Laneta

    2015-11-01

    Despite the availability of a variety of contraceptive methods, millions of women still have an unmet need for contraceptive choices. Short-acting methods are plagued by issues with adherence, leading to imperfect or inconsistent use and subsequent unintended pregnancy. Long-acting contraceptive methods such as intrauterine devices and contraceptive implants, while providing highly effective and safe contraception, do not meet the needs of all women, often due to cost, access or acceptability issues. Several new methods are in various stages of development and are designed to address the shortcomings of current methods. Providers should be aware of these future options and how they might better meet women's needs.

  11. Development of Methods for Determination of Aflatoxins.

    PubMed

    Xie, Lijuan; Chen, Min; Ying, Yibin

    2016-12-09

    Aflatoxins can cause damage to the health of humans and animals. Several institutions around the world have established regulations to limit the levels of aflatoxins in food, and numerous analytical methods have been extensively developed for aflatoxin determination. This review covers the analytical methods currently used for the determination of aflatoxins in different food matrices, including sampling and sample preparation, sample pretreatment (extraction and purification of aflatoxin extracts), and separation and determination methods. Validation of aflatoxin analyses and the safety considerations and precautions required when performing the experiments are also discussed.

  12. Toxicity test method development in southeast Asia

    SciTech Connect

    McPherson, C.A.

    1995-12-31

    Use of aquatic toxicity tests is relatively new in southeast Asia. As part of the ASEAN-Canada Cooperative Programme on Marine Science -- Phase 2, which includes development of marine environmental criteria, a need for tropical toxicity data was identified. A step-wise approach was used for test method development (simple, acute tests and easily measured endpoints first, then more complex short-term chronic methods), for test species selection (using species found throughout the region first, and then considering species with narrower geographic distribution), and for integration of quality assurance/quality control (QA/QC) practices into all laboratory activities. Development of test protocols specifically for tropical species included acute and chronic toxicity tests with marine fish, invertebrates and algae. Criteria for test species selection will be reviewed. Method development was based on procedures and endpoints already widely used in North America and Europe (e.g., 96-h LC50 with fish), but adapted for use with tropical species. For example, a bivalve larval development test can use the same endpoints but the duration is only 24 hours. Test method development included research on culture and holding procedures, determination of test conditions (e.g., duration, test containers), and identification of appropriate endpoints. Acute tests with fish and invertebrates were developed first. The next step was development of short-term chronic tests to measure phytoplankton growth, bivalve and echinoderm embryo or larval development, and larval fish growth. The number of species and types of tests was increased in a staged approach, as laboratories became better equipped and personnel gained practical experience. In most cases, method development coincided with training workshops to introduce the principles of toxicity testing.

  13. Development of test methods for textile composites

    NASA Technical Reports Server (NTRS)

    Masters, John E.; Ifju, Peter G.; Fedro, Mark J.

    1993-01-01

    NASA's Advanced Composite Technology (ACT) Program was initiated in 1990 with the purpose of developing less costly composite aircraft structures. A number of innovative materials and processes were evaluated as a part of this effort. Chief among them are composite materials reinforced with textile preforms. These new forms of composite materials bring with them potential testing problems. Methods currently in practice were developed over the years for composite materials made from prepreg tape or simple 2-D woven fabrics. A wide variety of 2-D and 3-D braided, woven, stitched, and knit preforms were suggested for application in the ACT program. The applicability of existing test methods to the wide range of emerging materials bears investigation. The overriding concern is that the values measured are accurate representations of the true material response. The ultimate objective of this work is to establish a set of test methods to evaluate the textile composites developed for the ACT Program.

  14. Methods for the Study of Gonadal Development.

    PubMed

    Piprek, Rafal P

    2016-01-01

    Current knowledge on gonadal development and sex determination is the product of many decades of research involving a variety of scientific methods from different biological disciplines such as histology, genetics, biochemistry, and molecular biology. The earliest embryological investigations, followed by the invention of microscopy and staining methods, were based on histological examinations. The most robust development of histological staining techniques occurred in the second half of the nineteenth century and resulted in structural descriptions of gonadogenesis. These first studies on gonadal development were conducted on domesticated animals; however, currently the mouse is the most extensively studied species. The next key point in the study of gonadogenesis was the advancement of methods allowing for the in vitro culture of fetal gonads. For instance, this led to the description of the origin of cell lines forming the gonads. Protein detection using antibodies and immunolabeling methods and the use of reporter genes were also invaluable for developmental studies, enabling the visualization of the formation of gonadal structure. Recently, genetic and molecular biology techniques, especially gene expression analysis, have revolutionized studies on gonadogenesis and have provided insight into the molecular mechanisms that govern this process. The successive invention of new methods is reflected in the progress of research on gonadal development.

  15. Development of new hole expansion testing method

    NASA Astrophysics Data System (ADS)

    Kim, Hyunok; Shang, Jianhui; Beam, Kevin; Samant, Anoop; Hoschouer, Cliff; Dykeman, Jim

    2016-08-01

    This paper introduces a new hole expansion (HE) testing method that could be more relevant to the edge cracking problem observed in stamping advanced high strength steel (AHSS). The new testing method adopted a large hole diameter of 75 mm compared to the standard hole diameter of 10 mm. An inline monitoring system was developed to visually monitor hole edge cracking during the test and to synchronize the load-displacement data with the recorded video for capturing the initial crack. The new hole expansion testing method was found to be effective in evaluating edge cracking by considering the effects of material properties and trimming methods. It showed a much larger difference in HE ratio between DP980 and TRIP780, up to 11%, compared to the standard HE testing method, which gave less than a 2% difference.
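
    The hole expansion ratio compared above is conventionally the percentage increase in hole diameter at the point where an edge crack initiates. A minimal sketch of that calculation, using illustrative diameters rather than the paper's data:

    ```python
    def hole_expansion_ratio(d_initial_mm, d_at_crack_mm):
        """HE ratio (%) = (d_crack - d_initial) / d_initial * 100."""
        return 100.0 * (d_at_crack_mm - d_initial_mm) / d_initial_mm

    # Illustrative values for the large-hole test geometry (75 mm initial diameter).
    print(f"HE = {hole_expansion_ratio(75.0, 97.5):.1f}%")  # 30.0%
    ```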

  16. Benchmarking Learning and Teaching: Developing a Method

    ERIC Educational Resources Information Center

    Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah

    2006-01-01

    Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…

  17. New Developments of the Shared Concern Method.

    ERIC Educational Resources Information Center

    Pikas, Anatol

    2002-01-01

    Reviews and describes new developments in the Shared Concern method (SCm), a tool for tackling group bullying amongst teenagers by individual talks. The psychological mechanisms of healing in the bully group and what hinders the bully therapist in eliciting them have become better clarified. The most important recent advancement of the SCm…

  18. Development of a nonlinear vortex method

    NASA Technical Reports Server (NTRS)

    Kandil, O. A.

    1982-01-01

    A steady and unsteady Nonlinear Hybrid Vortex (NHV) method for low aspect ratio wings at large angles of attack is developed. The method uses vortex panels with a first-order vorticity distribution (equivalent to a second-order doublet distribution) to calculate the induced velocity in the near field using closed-form expressions. In the far field, the distributed vorticity is reduced to concentrated vortex lines and the simpler Biot-Savart law is employed. The method is applied to rectangular wings in steady and unsteady flows without any restriction on the order of magnitude of the disturbances in the flow field. The numerical results show that the method accurately predicts the distributed aerodynamic loads and that it is of acceptable computational efficiency.

  19. Transport Test Problems for Hybrid Methods Development

    SciTech Connect

    Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.; McDonald, Benjamin S.

    2011-12-28

    This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.

  20. Report on development of neutron passportisation method

    SciTech Connect

    Antropov, G.P.; Babichev, Yu.B.; Blagin, S.V.

    1994-12-31

    In this report the results of the development of a spatial neutron passportisation method are described. The method is aimed at controlling the spatial configuration (including the number of sources) of closed objects containing neutron sources. The possible areas of application of the method are: (1) control of the number of warheads inside missile heads for verification of RF-US nuclear disarmament treaties; (2) control of the arrangement of SNM containers in storage vaults; (3) control that complicated assemblies with SNM (and other radioactive materials) remain unchanged. For objects with a complicated structure, such as multiple reentry vehicles, direct interpretation of the observed radiation field configuration is a rather difficult task. Reconstruction of the object structure on the basis of the radiation field configuration usually requires the use of external information and is often not straightforward. Besides, when using such methods of direct reconstruction of the object's internal structure, a contradiction arises between the requirement of determining the arrangement of the sources (warheads in the case of arms control) and the requirement of protecting information concerning the sources themselves. In this case there may be different limitations on the possible spatial resolution of the method, the use of spectroscopy information, etc.

  1. Prioritizing pesticide compounds for analytical methods development

    USGS Publications Warehouse

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1

  2. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
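
    As a hedged illustration of the kind of probability estimate such a module produces (not the NESSUS/FPI algorithm itself), the sketch below evaluates the failure probability for a simple linear limit state with normally distributed resistance and load.

    ```python
    from statistics import NormalDist

    # Illustrative reliability calculation for a linear limit state g = R - S
    # with independent normal resistance R and load effect S. This is a textbook
    # sketch, not the fast probability integration algorithm used in NESSUS.

    def failure_probability(mean_r, std_r, mean_s, std_s):
        """P(g < 0) = Phi(-beta), with beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)."""
        beta = (mean_r - mean_s) / (std_r**2 + std_s**2) ** 0.5
        return NormalDist().cdf(-beta)

    # Illustrative numbers only.
    print(f"Pf ≈ {failure_probability(mean_r=300.0, std_r=30.0, mean_s=200.0, std_s=25.0):.2e}")
    ```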

  3. A space radiation transport method development.

    PubMed

    Wilson, J W; Tripathi, R K; Qualls, G D; Cucinotta, F A; Prael, R E; Norbury, J W; Heinbockel, J H; Tweed, J

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design.

  4. A space radiation transport method development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. Published by Elsevier Ltd on behalf of COSPAR.

  5. [Development of identification method for isopropyl citrate].

    PubMed

    Furusho, Noriko; Ohtsuki, Takashi; Tatebe-Sasaki, Chiye; Kubota, Hiroki; Sato, Kyoko; Akiyama, Hiroshi

    2014-01-01

    In Japan's Specifications and Standards for Food Additives, 8th edition, two identification tests for isopropyl citrate, detecting isopropyl alcohol and citrate, are stipulated. However, these identification tests use a mercury compound, which is toxic, or require a time-consuming pretreatment process. To solve these problems, an identification test method using GC-FID for detecting isopropyl alcohol was developed. In this test, good linearity was observed in the range of 0.1-40 mg/mL of isopropyl alcohol. While investigating the pretreatment process, we found that isopropyl alcohol could be detected using GC-FID with the distillation step only, without any reflux step. The study also showed that the citrate moiety of isopropyl citrate could be identified using the solution remaining after distillation of the isopropyl alcohol. The developed identification tests for isopropyl citrate are simple and use no toxic materials.

  6. Methods for developing and validating survivability distributions

    SciTech Connect

    Williams, R.L.

    1993-10-01

    A previous report explored and discussed statistical methods and procedures that may be applied to validate the survivability of a complex system of systems that cannot be tested as an entity. It described a methodology where Monte Carlo simulation was used to develop the system survivability distribution from the component distributions using a system model that registers the logical interactions of the components to perform system functions. This paper discusses methods that can be used to develop the required survivability distributions based upon three sources of knowledge. These are (1) available test results; (2) little or no available test data, but a good understanding of the physical laws and phenomena which can be applied by computer simulation; and (3) neither test data nor adequate knowledge of the physics are known, in which case, one must rely upon, and quantify, the judgement of experts. This paper describes the relationship between the confidence bounds that can be placed on survivability and the number of tests conducted. It discusses the procedure for developing system level survivability distributions from the distributions for lower levels of integration. It demonstrates application of these techniques by defining a communications network for a Hypothetical System Architecture. A logic model for the performance of this communications network is developed, as well as the survivability distributions for the nodes and links based on two alternate data sets, reflecting the effects of increased testing of all elements. It then shows how this additional testing could be optimized by concentrating only on those elements contained in the low-order fault sets which the methodology identifies.
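
    The core of the approach described above is to sample component survival outcomes from their distributions and propagate them through a system logic model. The sketch below is a minimal illustration with a hypothetical three-node network; the logic model and probabilities are not from the report.

    ```python
    import random

    # Minimal Monte Carlo propagation sketch: draw component survival outcomes
    # from simple Bernoulli distributions and push them through a system logic
    # model. The network and probabilities are hypothetical.

    def system_survives(node_up):
        """Example logic model: the system works if node A and at least one of B, C survive."""
        return node_up["A"] and (node_up["B"] or node_up["C"])

    def simulate(p_survive, trials=100_000):
        hits = 0
        for _ in range(trials):
            node_up = {name: random.random() < p for name, p in p_survive.items()}
            hits += system_survives(node_up)
        return hits / trials

    print(simulate({"A": 0.95, "B": 0.80, "C": 0.75}))
    ```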

  7. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ...

    EPA Pesticide Factsheets

    Hepatitis E virus (HEV) is an emerging pathogen that causes significant illness in the developing world. Like the hepatitis A virus, it is transmitted via the fecal-oral route and can cause short-term, acute hepatitis. In addition, hepatitis E has been found to cause a significant rate of mortality in pregnant women. Thus far, a hepatitis E outbreak has not been reported in the U. S. although a swine variant of the virus is common in Midwestern hogs. Since it will be important to identify the presence of this virus in the water supply, we have developed and are testing a reverse transcription-polymerase chain reaction (RT-PCR) method that should be able to identify all of the known HEV strains. Develop sensitive techniques to detect and identify emerging human waterborne pathogenic viruses and viruses on the CCL.Determine effectiveness of viral indicators to measure microbial quality in water matrices.Support activities: (a) culture and distribution of mammalian cells for Agency and scientific community research needs, (b) provide operator expertise for research requiring confocal and electron microscopy, (c) glassware cleaning, sterilization and biological waste disposal for the Cincinnati EPA facility, (d) operation of infectious pathogenic suite, (e) maintenance of walk-in constant temperature rooms and (f) provide Giardia cysts.

  8. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Development methods and funding... URBAN DEVELOPMENT PUBLIC HOUSING DEVELOPMENT General § 941.102 Development methods and funding. (a... housing units using this method. (b) Funding. A PHA may develop public housing with: (1) Development...

  9. DEVELOPMENT OF NDA METHODS FOR NEPTUNIUM METAL

    SciTech Connect

    C. MOSS; ET AL

    2000-10-01

    Many techniques have been developed and applied in the US and other countries for the control of the special nuclear materials (SNM) plutonium and uranium, but no standard methods exist for the determination of neptunium in bulk containers. Such methods are needed because the U.S. Department of Energy requires all Government-owned ²³⁷Np be treated as if it were SNM and the International Atomic Energy Agency is considering how to monitor this material. We present the results of the measurements of several samples of neptunium metal with a variety of techniques. Analysis of passive gamma-ray spectra uniquely identifies the material, provides isotopic ratios for contaminants, such as ²⁴³Am, and may provide information about the shielding, mass, and time since processing. Active neutron interrogation, using the delayed neutron technique in a package monitor, provides useful data even if the neptunium is shielded. The tomographic gamma scanner yields a map of the distribution of the neptunium and shielding in a container. Active photon interrogation with pulses from a 10-MeV linac produces delayed neutrons between pulses, even when the container is heavily shielded. Data from one or more of these techniques can be used to identify the material and estimate a mass in a bulk container.

  10. Development of a Radial Deconsolidation Method

    SciTech Connect

    Helmreich, Grant W.; Montgomery, Fred C.; Hunn, John D.

    2015-12-01

    A series of experiments have been initiated to determine the retention or mobility of fission products* in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles, with 20-μm thick carbon-coated kernels whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) at the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested-annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  11. Methods development for total organic carbon accountability

    NASA Technical Reports Server (NTRS)

    Benson, Brian L.; Kilgore, Melvin V., Jr.

    1991-01-01

    This report describes the efforts completed during the contract period beginning November 1, 1990 and ending April 30, 1991. Samples of product hygiene and potable water from WRT 3A were supplied by NASA/MSFC prior to contract award on July 24, 1990. Humidity condensate samples were supplied on August 3, 1990. During the course of this contract, chemical analyses were performed on these samples to qualitatively determine specific components comprising the measured organic carbon concentration. In addition, these samples and known standard solutions were used to identify and develop methodology useful to future comprehensive characterization of similar samples. Standard analyses including pH, conductivity, and total organic carbon (TOC) were conducted. Colorimetric and enzyme-linked assays for total protein, bile acid, B-hydroxybutyric acid, methylene blue active substances (MBAS), urea nitrogen, ammonia, and glucose were also performed. Gas chromatographic procedures for non-volatile fatty acids and EPA priority pollutants were also performed. Liquid chromatography was used to screen for non-volatile, water soluble compounds not amenable to GC techniques. Methods development efforts were initiated to separate and quantitate certain chemical classes not classically analyzed in water and wastewater samples. These included carbohydrates, organic acids, and amino acids. Finally, efforts were initiated to identify useful concentration techniques to enhance detection limits and recovery of non-volatile, water soluble compounds.

  12. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Development methods and funding. 941.102 Section 941.102 Housing and Urban Development REGULATIONS RELATING TO HOUSING AND URBAN... URBAN DEVELOPMENT PUBLIC HOUSING DEVELOPMENT General § 941.102 Development methods and funding....

  13. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has gone from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  14. Child development in developing countries: introduction and methods.

    PubMed

    Bornstein, Marc H; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L

    2012-01-01

    The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles in this Special Section. The articles that follow describe the situations of children with successive foci on nutrition, parenting, discipline and violence, and the home environment. They address 2 common questions: How do developing and underresearched countries in the world vary with respect to these central indicators of children's development? How do key indicators of national development relate to child development in each of these substantive areas? The Special Section concludes with policy implications from the international findings.

  15. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimization of conditions. Notes that the analysis of aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

  16. Development of advanced acreage estimation methods

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1980-01-01

    The use of the AMOEBA clustering/classification algorithm was investigated as a basis for both a color display generation technique and maximum likelihood proportion estimation procedure. An approach to analyzing large data reduction systems was formulated and an exploratory empirical study of spatial correlation in LANDSAT data was also carried out. Topics addressed include: (1) development of multiimage color images; (2) spectral spatial classification algorithm development; (3) spatial correlation studies; and (4) evaluation of data systems.

  17. Child Development in Developing Countries: Introduction and Methods

    ERIC Educational Resources Information Center

    Bornstein, Marc H.; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L.

    2012-01-01

    The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles…

  18. The Development of Cluster and Histogram Methods

    NASA Astrophysics Data System (ADS)

    Swendsen, Robert H.

    2003-11-01

    This talk will review the history of both cluster and histogram methods for Monte Carlo simulations. Cluster methods are based on the famous exact mapping by Fortuin and Kasteleyn from general Potts models onto a percolation representation. I will discuss the Swendsen-Wang algorithm, as well as its improvement and extension to more general spin models by Wolff. The Replica Monte Carlo method further extended cluster simulations to deal with frustrated systems. The history of histograms is quite extensive, and can only be summarized briefly in this talk. It goes back at least to work by Salsburg et al. in 1959. Since then, it has been forgotten and rediscovered several times. The modern use of the method has exploited its ability to efficiently determine the location and height of peaks in various quantities, which is of prime importance in the analysis of critical phenomena. The extensions of this approach to the multiple histogram method and multicanonical ensembles have allowed information to be obtained over a broad range of parameters. Histogram simulations and analyses have become standard techniques in Monte Carlo simulations.
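
    The central trick of the histogram methods reviewed here is that a histogram sampled at one temperature can be reweighted to estimate averages at a nearby temperature. The sketch below shows single-histogram reweighting of a mean energy in that spirit; the histogram values are hypothetical, and this is not code from the talk.

    ```python
    import math

    # Single-histogram reweighting sketch: an energy histogram H(E) sampled at
    # inverse temperature beta0 is reweighted to estimate <E> at a nearby beta.

    def reweighted_mean_energy(energies, counts, beta0, beta):
        """<E>_beta = sum E H(E) w(E) / sum H(E) w(E), with w(E) = exp(-(beta - beta0) E)."""
        exponents = [-(beta - beta0) * e for e in energies]
        shift = max(exponents)  # shift exponents for numerical stability
        weights = [c * math.exp(x - shift) for c, x in zip(counts, exponents)]
        return sum(e * w for e, w in zip(energies, weights)) / sum(weights)

    # Hypothetical histogram: energy bins and their observed counts at beta0 = 0.40.
    energies = [-200.0, -190.0, -180.0, -170.0, -160.0]
    counts = [120, 540, 910, 470, 95]
    print(reweighted_mean_energy(energies, counts, beta0=0.40, beta=0.42))
    ```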

  19. Using Qualitative Methods to Inform Scale Development

    ERIC Educational Resources Information Center

    Rowan, Noell; Wulff, Dan

    2007-01-01

    This article describes the process by which one study utilized qualitative methods to create items for a multidimensional scale to measure twelve-step program affiliation. The process included interviewing fourteen addicted persons while in twelve-step focused treatment about specific pros (things they like or would miss out on by not being…

  20. Development of ultrasonic methods for hemodynamic measurements

    NASA Technical Reports Server (NTRS)

    Histand, M. B.; Miller, C. W.; Wells, M. K.; Mcleod, F. D.; Greene, E. R.; Winter, D.

    1975-01-01

    A transcutaneous method to measure instantaneous mean blood flow in peripheral arteries of the human body was defined. Transcutaneous and implanted-cuff ultrasound velocity measurements were evaluated, and the accuracies of velocity, flow, and diameter measurements were assessed for steady flow. Performance criteria were established for the pulsed Doppler velocity meter (PUDVM), and performance tests were conducted. Several improvements are suggested.

  1. Development of advanced acreage estimation methods

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1982-01-01

    The development of an accurate and efficient algorithm for analyzing the structure of MSS data, the application of the Akaike information criterion to mixture models, and a research plan to delineate some of the technical issues and associated tasks in the area of rice scene radiation characterization are discussed. The AMOEBA clustering algorithm is refined and documented.

  2. An Unbalance Adjustment Method for Development Indicators

    ERIC Educational Resources Information Center

    Tarabusi, Enrico Casadio; Guarini, Giulio

    2013-01-01

    This paper analyzes some aggregation aspects of the procedure for constructing a composite index on a multidimensional socio-economic phenomenon such as development, the main focus being on the unbalance among individual dimensions. First a theoretical framework is set up for the unbalance adjustment of the index. Then an aggregation function is…

  3. A Method for Developing a Nutrient Guide.

    ERIC Educational Resources Information Center

    Gillespie, Ardyth H.; Roderuck, Charlotte E.

    1982-01-01

    This paper proposes a new approach to developing a tool for teaching nutrition and food selection. It allows adjustments as new information becomes available and takes into account both dietary recommendations and food composition. Steps involve nutrient composition; nutrient density; and ratings for fat, cholesterol, and sodium. (Author/CT)

  4. Pilot-in-the-loop Method Development

    DTIC Science & Technology

    2014-05-20

    case. 2.1 Full-Scale SFS2 Cases A full-scale SFS2 grid was generated with Pointwise from the wind tunnel grids (1/100th scale) provided by NAVAIR by...spacing given the differences in grid generation methods of the software. “Baffle” surfaces were used in Pointwise to control the volume mesh resolution...simplified full scale geometry was created by creating triangulated surfaces from the boundary curves using Pointwise. Since the intended grid topology

  5. Developing Unconstrained Methods for Enzyme Evolution

    DTIC Science & Technology

    2014-09-19

    methods fail to produce catalytically efficient enzymes. This study has broad application in many technologies from chemical synthesis to human health and the environment. Our work centers around the...minimal media with N-15 labeled ammonia. After several months of screening, we finally identified conditions that allowed us to obtain labeled protein in

  6. Development of Thin Conducting Film Fabrication Methods.

    DTIC Science & Technology

    1979-12-01

    mediate mandrels (beeswax, polymeric resins such as PVA, PVC, PBS, Saran). Efforts to transfer the foils intact from formation on intermediate mandrels...methods investigated for removing and transferring pyrolytic carbon films onto cylindrical electrodes consist of: (1) melted beeswax and other high...attempts at removing the carbon film from the quartz mandrel were made using melted beeswax, as in Figure 4. In the first few attempts, the entire

  7. Current status of fluoride volatility method development

    SciTech Connect

    Uhlir, J.; Marecek, M.; Skarohlid, J.

    2013-07-01

    The Fluoride Volatility Method is based on a separation process, which derives from the specific property of uranium, neptunium, and plutonium to form volatile hexafluorides, whereas most fission products (mainly lanthanides) and the higher transplutonium elements (americium, curium) present in irradiated fuel form non-volatile trifluorides. The Fluoride Volatility Method itself is based on direct fluorination of the spent fuel, but before the fluorination step, removal of the cladding material and subsequent transformation of the fuel into a powdered form with a suitable grain size have to be done. The fluorination is made with fluorine gas in a flame fluorination reactor, where the volatile fluorides (mostly UF₆) are separated from the non-volatile ones (trivalent minor actinides and the majority of fission products). The subsequent operations necessary for partitioning of the volatile fluorides are condensation and evaporation of the volatile fluorides, thermal decomposition of PuF₆, and finally distillation and sorption, used for the purification of the uranium product. The Fluoride Volatility Method is considered to be a promising advanced pyrochemical reprocessing technology, which can mainly be used for the reprocessing of oxide spent fuels coming from future GEN IV fast reactors.

  8. Recommendations for Developing Alternative Test Methods for Developmental Neurotoxicity

    EPA Science Inventory

    There is great interest in developing alternative methods for developmental neurotoxicity testing (DNT) that are cost-efficient, use fewer animals and are based on current scientific knowledge of the developing nervous system. Alternative methods will require demonstration of the...

  9. Developing an interactive microsimulation method in pharmacology.

    PubMed

    Collins, Angela S; Graves, Barbara A; Gullette, Donna; Edwards, Rebecca

    2010-07-01

    Pharmacology decision making requires clinical judgment. The authors created an interactive microsimulation applying drug information to varying patient situations. The theory-based microsimulation requires situational analysis for each scenario. The microsimulation uses an interactive format that allows the participant to navigate through three separate virtual clients' situations. Correct clinical decisions are rewarded by sounds and by video footage of the patient improving. Conversely, incorrect choices show video footage of the patient decompensating. This microsimulation was developed to help students learn from the consequences of incorrect medication decision making in the virtual world without harming patients. The feedback of watching an incorrect decision affect a patient helps students associate cause and effect on patient outcomes. The microsimulation reinforces the ease with which medication errors can occur and the extent of possible sequelae. The development process used to incorporate the technology in the nursing curriculum is discussed.

  10. Landfill mining: Developing a comprehensive assessment method.

    PubMed

    Hermann, Robert; Wolfsberger, Tanja; Pomberger, Roland; Sarc, Renato

    2016-11-01

    In Austria, the first basic technological and economic examinations of mass-waste landfills with the purpose to recover secondary raw materials have been carried out by the 'LAMIS - Landfill Mining Österreich' pilot project. A main focus of its research, and the subject of this article, is the first conceptual design of a comprehensive assessment method for landfill mining plans, including not only monetary factors (like costs and proceeds) but also non-monetary ones, such as the concerns of adjoining owners or the environmental impact. Detailed reviews of references, the identification of influences and system boundaries to be included in planning landfill mining, several expert workshops and talks with landfill operators have been performed followed by a division of the whole assessment method into preliminary and main assessment. Preliminary assessment is carried out with a questionnaire to rate juridical feasibility, the risk and the expenditure of a landfill mining project. The results of this questionnaire are compiled in a portfolio chart that is used to recommend, or not, further assessment. If a detailed main assessment is recommended, defined economic criteria are rated by net present value calculations, while ecological and socio-economic criteria are examined in a utility analysis and then transferred into a utility-net present value chart. If this chart does not support making a definite statement on the feasibility of the project, the results must be further examined in a cost-effectiveness analysis. Here, the benefit of the particular landfill mining project per capital unit (utility-net present value ratio) is determined to make a final distinct statement on the general benefit of a landfill mining project.
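
    The monetary part of the main assessment rests on standard net present value calculations, which are then set against the score from the non-monetary utility analysis. A minimal sketch of that arithmetic is given below; the cash flows, discount rate, and utility score are illustrative placeholders, not LAMIS figures.

    ```python
    # Illustrative NPV and utility-NPV ratio sketch for a landfill mining plan.
    # All inputs are hypothetical placeholders.

    def net_present_value(cash_flows, discount_rate):
        """NPV = sum_t CF_t / (1 + r)^t, with t = 0 for the initial outlay."""
        return sum(cf / (1.0 + discount_rate) ** t for t, cf in enumerate(cash_flows))

    cash_flows = [-2_000_000, 450_000, 600_000, 700_000, 650_000]  # EUR, years 0-4
    npv = net_present_value(cash_flows, discount_rate=0.05)
    utility_score = 72.0  # aggregate non-monetary score from the utility analysis
    ratio = utility_score / (npv / 1_000_000)  # utility points per million EUR of NPV
    print(f"NPV ≈ {npv:,.0f} EUR, utility-NPV ratio ≈ {ratio:.1f} points per million EUR")
    ```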

  11. Methods and Protocols for Developing Prion Vaccines.

    PubMed

    Marciniuk, Kristen; Taschuk, Ryan; Napper, Scott

    2016-01-01

    Prion diseases denote a distinct form of infectivity that is based in the misfolding of a self-protein (PrP(C)) into a pathological, infectious conformation (PrP(Sc)). Efforts to develop vaccines for prion diseases have been complicated by the potential dangers that are associated with induction of immune responses against a self-protein. As a consequence, there is considerable appeal for vaccines that specifically target the misfolded prion conformation. Such conformation-specific immunotherapy is made possible through the identification of vaccine targets (epitopes) that are exclusively presented as a consequence of misfolding. An immune response directed against these targets, termed disease-specific epitopes (DSEs), has the potential to spare the function of the native form of the protein while clearing, or neutralizing, the infectious isomer. Although identification of DSEs represents a critical first step in the induction of conformation-specific immune responses, substantial efforts are required to translate these targets into functional vaccines. Due to the poor immunogenicity that is inherent to self-proteins, and that is often associated with short peptides, substantial efforts are required to overcome tolerance-to-self and maximize the resultant immune response following DSE-based immunization. This often includes optimization of target sequences in terms of immunogenicity and development of effective formulation and delivery strategies for the associated peptides. Further, these vaccines must satisfy additional criteria from perspectives of specificity (PrP(C) vs. PrP(Sc)) and safety (antibody-induced template-driven misfolding of PrP(C)). The emphasis of this report is on the steps required to translate DSEs into prion vaccines and subsequent evaluation of the resulting immune responses.

  12. Developing Automated Methods of Waste Sorting

    SciTech Connect

    Shurtliff, Rodney Marvin

    2002-08-01

    The U.S. Department of Energy (DOE) analyzed the complex-wide need for remote and automated technologies as they relate to the treatment and disposal of mixed wastes. This analysis revealed that several DOE sites need the capability to open drums containing waste, visually inspect and sort the contents, and finally repackage the containers that are acceptable at a waste disposal facility such as the Waste Isolation Pilot Plant (WIPP) in New Mexico. Conditioning contaminated waste so that it is compatible with the WIPP criteria for storage is an arduous task whether the waste is contact handled (waste having radioactivity levels below 200 mrem/hr) or remote handled. Currently, WIPP non-compliant items are removed from the waste stream manually, at a rate of about one 55-gallon drum per day. Issues relating to contamination-based health hazards as well as repetitive-motion health hazards are steering industry towards a more user-friendly method of conditioning or sorting waste.

  13. Antimicrobial Testing Methods & Procedures Developed by EPA's Microbiology Laboratory

    EPA Pesticide Factsheets

    We develop antimicrobial testing methods and standard operating procedures to measure the effectiveness of hard surface disinfectants against a variety of microorganisms. Find methods and procedures for antimicrobial testing.

  14. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Development methods and funding. 941.102 Section 941.102 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING...

  15. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-03-20

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  16. Development of quality assurance methods for epoxy graphite prepreg

    NASA Technical Reports Server (NTRS)

    Chen, J. S.; Hunter, A. B.

    1982-01-01

    Quality assurance methods for graphite epoxy prepregs were developed. Liquid chromatography, differential scanning calorimetry, and gel permeation chromatography were investigated. These methods were applied to a second prepreg system. The resin matrix formulation was correlated with mechanical properties. Dynamic mechanical analysis and fracture toughness methods were investigated. The chromatography and calorimetry techniques were all successfully developed as quality assurance methods for graphite epoxy prepregs. The liquid chromatography method was the most sensitive to changes in resin formulation. The methods were also successfully applied to the second prepreg system.

  17. Development of novel growth methods for halide single crystals

    NASA Astrophysics Data System (ADS)

    Yokota, Yuui; Kurosawa, Shunsuke; Shoji, Yasuhiro; Ohashi, Yuji; Kamada, Kei; Yoshikawa, Akira

    2017-03-01

    We developed novel growth methods for hygroscopic halide scintillator single crystals: the Halide micro-pulling-down [H-μ-PD] method and the Halide Vertical Bridgman [H-VB] method. The H-μ-PD method, with a removable chamber system, can grow a single crystal of a hygroscopic halide scintillator material at a faster growth rate than the conventional methods. On the other hand, the H-VB method can grow a large bulk single crystal of a halide scintillator without a quartz ampule. CeCl3, LaBr3, Ce:LaBr3 and Eu:SrI2 fiber single crystals could be grown by the H-μ-PD method, and Eu:SrI2 bulk single crystals of 1 and 1.5 inches in diameter could be grown by the H-VB method. The grown fiber and bulk single crystals showed scintillation properties comparable to those previously reported for crystals grown by conventional methods.

  18. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
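
    The record above describes the sampling strategy rather than publishing an algorithm. Purely as a rough illustration of drawing random survey points inside a digitized boundary, the sketch below (Python, assuming the shapely package and an invented polygon) uses rejection sampling to generate uniform random candidate household-survey locations; a real application would load the boundary from GIS software and stratify it first.

        # Illustrative sketch only: uniform random points inside a survey
        # boundary polygon (rejection sampling). The polygon coordinates are
        # made up for the example; a real survey would load a digitized
        # boundary exported from GIS software.
        import random
        from shapely.geometry import Point, Polygon

        boundary = Polygon([(0.0, 0.0), (4.0, 0.0), (5.0, 3.0), (1.0, 4.0)])

        def random_points_in_polygon(polygon, n, seed=1):
            rng = random.Random(seed)
            minx, miny, maxx, maxy = polygon.bounds
            points = []
            while len(points) < n:
                p = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
                if polygon.contains(p):      # keep only points inside the boundary
                    points.append(p)
            return points

        for p in random_points_in_polygon(boundary, 10):
            print(round(p.x, 3), round(p.y, 3))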

  19. 3-minute diagnosis: Researchers develop new method to recognize pathogens

    SciTech Connect

    Beer, Reg

    2014-01-06

    Imagine knowing precisely why you feel sick ... before the doctor's exam is over. Lawrence Livermore researcher Reg Beer and his engineering colleagues have developed a new method to recognize disease-causing pathogens quicker than ever before.

  20. 3-minute diagnosis: Researchers develop new method to recognize pathogens

    ScienceCinema

    Beer, Reg

    2016-07-12

    Imagine knowing precisely why you feel sick ... before the doctor's exam is over. Lawrence Livermore researcher Reg Beer and his engineering colleagues have developed a new method to recognize disease-causing pathogens quicker than ever before.

  1. Development of aerodynamic prediction methods for irregular planform wings

    NASA Technical Reports Server (NTRS)

    Benepe, D. B., Sr.

    1983-01-01

    A set of empirical methods was developed to predict low-speed lift, drag and pitching-moment variations with angle of attack for a class of low aspect ratio irregular planform wings suitable for application to advanced aerospace vehicles. The data base, an extensive series of wind-tunnel tests accomplished by the Langley Research Center of the National Aeronautics and Space Administration, is summarized. The approaches used to analyze the wind tunnel data, the evaluation of previously existing methods, data correlation efforts, and the development of the selected methods are presented and discussed. A summary of the methods is also presented to document the equations, computational charts and design guides which have been programmed for digital computer solution. Comparisons of predictions and test data are presented which show that the new methods provide a significant improvement in capability for evaluating the landing characteristics of advanced aerospace vehicles during the preliminary design phase of the configuration development cycle.

  2. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without using formal methods. With such informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as unreliable system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly respond to changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. An experiment showed that the proposed method reduces development effort by more than 30%.

  3. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  4. Interdisciplinary Curriculum Development in Hospital Methods Improvement. Final Report.

    ERIC Educational Resources Information Center

    Watt, John R.

    The major purpose of this project was to develop a "package" curriculum of Hospital Methods Improvement techniques for college students in health related majors. The elementary Industrial Engineering methods for simplifying work and saving labor were applied to the hospital environment and its complex of problems. The report's…

  5. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  6. Epistemological Development and Judgments and Reasoning about Teaching Methods

    ERIC Educational Resources Information Center

    Spence, Sarah; Helwig, Charles C.

    2013-01-01

    Children's, adolescents', and adults' (N = 96 7-8, 10-11, and 13-14-year-olds and university students) epistemological development and its relation to judgments and reasoning about teaching methods was examined. The domain (scientific or moral), nature of the topic (controversial or noncontroversial), and teaching method (direct instruction by…

  7. Methods of the Development Strategy of Service Companies: Logistical Approach

    ERIC Educational Resources Information Center

    Toymentseva, Irina A.; Karpova, Natalya P.; Toymentseva, Angelina A.; Chichkina, Vera D.; Efanov, Andrey V.

    2016-01-01

    The urgency of the analyzed issue is due to lack of attention of heads of service companies to the theory and methodology of strategic management, methods and models of management decision-making in times of economic instability. The purpose of the article is to develop theoretical positions and methodical recommendations on the formation of the…

  8. Preferred Methods of Professional Development in Student Affairs

    ERIC Educational Resources Information Center

    Roberts, Darby M.

    2007-01-01

    Continuing professional development is a foundation of the student affairs field. To stay current, practitioners use a variety of methods to learn about areas that they need to master to be successful in their careers. Results of this research indicate that staff use interactive methods such as consulting with colleagues and mentoring more so than…

  9. Hazards in chromatographic bioanalysis method development and applications.

    PubMed

    Hooshfar, Shirin; Bartlett, Michael G

    2017-01-01

    Bioanalytical methods are employed for the quantitative determination of drugs and their metabolites in biological matrices, in all stages of the drug development process. However, because of the highly complex nature of these matrices there is a wide range of potential biological, chemical and physical hazards that can influence the quality of the data produced by these methods. The present review focuses on the evaluation of the most important and frequent errors that may be encountered during bioanalytical method development/validation and analysis of clinical or preclinical samples mainly using chromatography. Additionally, the most practical ways for avoiding and managing these hazards during routine bioanalysis are presented.

  10. Validation of analytic methods for biomarkers used in drug development.

    PubMed

    Chau, Cindy H; Rixe, Olivier; McLeod, Howard; Figg, William D

    2008-10-01

    The role of biomarkers in drug discovery and development has gained precedence over the years. As biomarkers become integrated into drug development and clinical trials, quality assurance and, in particular, assay validation become essential with the need to establish standardized guidelines for analytic methods used in biomarker measurements. New biomarkers can revolutionize both the development and use of therapeutics but are contingent on the establishment of a concrete validation process that addresses technology integration and method validation as well as regulatory pathways for efficient biomarker development. This perspective focuses on the general principles of the biomarker validation process with an emphasis on assay validation and the collaborative efforts undertaken by various sectors to promote the standardization of this procedure for efficient biomarker development.

  11. Reactions Involved in Fingerprint Development Using the Cyanoacrylate - Fuming Method

    SciTech Connect

    Lewis, L.A.

    2001-07-30

    The Learning Objective is to present the basic chemistry research findings to the forensic community regarding development of latent fingerprints using the cyanoacrylate fuming method. Chemical processes involved in the development of latent fingerprints using the cyanoacrylate fuming method have been studied, and will be presented. Two major types of latent prints have been investigated--clean (eccrine) and oily (sebaceous) prints. Scanning electron microscopy (SEM) was used as a tool for determining the morphology of the polymer developed separately on clean and oily prints after cyanoacrylate fuming. A correlation between the chemical composition of an aged latent fingerprint, prior to development, and the quality of a developed fingerprint was observed in the morphology. The moisture in the print prior to fuming was found to be a critical factor for the development of a useful latent print. In addition, the amount of time required to develop a high quality latent print was found to be minimal. The cyanoacrylate polymerization process is extremely rapid. When heat is used to accelerate the fuming process, typically a period of 2 minutes is required to develop the print. The optimum development time is dependent upon the concentration of cyanoacrylate vapors within the enclosure.

  12. Development of Impurity Profiling Methods Using Modern Analytical Techniques.

    PubMed

    Ramachandra, Bondigalla

    2017-01-02

    This review gives a brief introduction to process- and product-related impurities and emphasizes the development of novel analytical methods for their determination. It describes the application of modern analytical techniques, particularly ultra-performance liquid chromatography (UPLC), liquid chromatography-mass spectrometry (LC-MS), high-resolution mass spectrometry (HRMS), gas chromatography-mass spectrometry (GC-MS) and high-performance thin layer chromatography (HPTLC). In addition, the application of nuclear magnetic resonance (NMR) spectroscopy is discussed for the characterization of impurities and degradation products. The significance of the quality, efficacy and safety of drug substances/products, including the source of impurities, kinds of impurities, adverse effects of the presence of impurities, quality control of impurities, the necessity for the development of impurity profiling methods, identification of impurities and regulatory aspects, is discussed. Other important aspects that are discussed are forced degradation studies and the development of stability-indicating assay methods.

  13. Epistemological development and judgments and reasoning about teaching methods.

    PubMed

    Spence, Sarah; Helwig, Charles C

    2013-01-01

    Children's, adolescents', and adults' (N = 96 7-8, 10-11, and 13-14-year-olds and university students) epistemological development and its relation to judgments and reasoning about teaching methods was examined. The domain (scientific or moral), nature of the topic (controversial or noncontroversial), and teaching method (direct instruction by lectures versus class discussions) were systematically varied. Epistemological development was assessed in the aesthetics, values, and physical truth domains. All participants took the domain, nature of the topic, and teaching method into consideration in ways that showed age-related variations. Epistemological development in the value domain alone was predictive of preferences for class discussions and a critical perspective on teacher-centered direct instruction, even when age was controlled in the analysis.

  14. Development of Improved Surface Integral Methods for Jet Aeroacoustic Predictions

    NASA Technical Reports Server (NTRS)

    Pilon, Anthony R.; Lyrintzis, Anastasios S.

    1997-01-01

    The accurate prediction of aerodynamically generated noise has become an important goal over the past decade. Aeroacoustics must now be an integral part of the aircraft design process. The direct calculation of aerodynamically generated noise with CFD-like algorithms is plausible. However, large computer time and memory requirements often make these predictions impractical. It is therefore necessary to separate the aeroacoustics problem into two parts, one in which aerodynamic sound sources are determined, and another in which the propagating sound is calculated. This idea is applied in acoustic analogy methods. However, in the acoustic analogy, the determination of far-field sound requires the solution of a volume integral. This volume integration again leads to impractical computer requirements. An alternative to the volume integrations can be found in the Kirchhoff method. In this method, Green's theorem for the linear wave equation is used to determine sound propagation based on quantities on a surface surrounding the source region. The change from volume to surface integrals represents a tremendous savings in the computer resources required for an accurate prediction. This work is concerned with the development of enhancements of the Kirchhoff method for use in a wide variety of aeroacoustics problems. This enhanced method, the modified Kirchhoff method, is shown to be a Green's function solution of Lighthill's equation. It is also shown rigorously to be identical to the methods of Ffowcs Williams and Hawkings. This allows for development of versatile computer codes which can easily alternate between the different Kirchhoff and Ffowcs Williams-Hawkings formulations, using the most appropriate method for the problem at hand. The modified Kirchhoff method is developed primarily for use in jet aeroacoustics predictions. Applications of the method are shown for two dimensional and three dimensional jet flows. Additionally, the enhancements are generalized so that
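
    As background for the surface-integral idea described above, one common textbook statement of the classical Kirchhoff formula for a stationary control surface S enclosing the sound sources (a standard form, not necessarily the exact modified formulation developed in this work) gives the far-field pressure from surface quantities evaluated at the retarded time:

        p'(\mathbf{x},t) \;=\; \frac{1}{4\pi}\oint_{S}\left[
            \frac{p'}{r^{2}}\frac{\partial r}{\partial n}
            \;+\;\frac{1}{c\,r}\frac{\partial r}{\partial n}\frac{\partial p'}{\partial \tau}
            \;-\;\frac{1}{r}\frac{\partial p'}{\partial n}
        \right]_{\tau\,=\,t-r/c}\,\mathrm{d}S

    where r is the distance from a surface element to the observer, n is the outward surface normal, and c is the ambient speed of sound. The abstract notes that the modified Kirchhoff method developed in the report is shown to be rigorously identical to the Ffowcs Williams-Hawkings formulation.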

  15. Recent development on statistical methods for personalized medicine discovery.

    PubMed

    Zhao, Yingqi; Zeng, Donglin

    2013-03-01

    It is well documented that patients can show significant heterogeneous responses to treatments so the best treatment strategies may require adaptation over individuals and time. Recently, a number of new statistical methods have been developed to tackle the important problem of estimating personalized treatment rules using single-stage or multiple-stage clinical data. In this paper, we provide an overview of these methods and list a number of challenges.

  16. Developing and Understanding Methods for Large Scale Nonlinear Optimization

    DTIC Science & Technology

    2001-12-01

    development of new algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with... analysis of tensor and SQP methods for singular constrained optimization", to appear in SIAM Journal on Optimization. Published in peer-reviewed... Mathematica, Vol III, Journal der Deutschen Mathematiker-Vereinigung, 1998. S. Crivelli, B. Bader, R. Byrd, E. Eskow, V. Lamberti, R. Schnabel and T

  17. Formal methods in the development of safety critical software systems

    SciTech Connect

    Williams, L.G.

    1991-11-15

    As the use of computers in critical control systems such as aircraft controls, medical instruments, defense systems, missile controls, and nuclear power plants has increased, concern for the safety of those systems has also grown. Much of this concern has focused on the software component of those computer-based systems. This is primarily due to historical experience with software systems that often exhibit larger numbers of errors than their hardware counterparts and the fact that the consequences of a software error may endanger human life, property, or the environment. A number of different techniques have been used to address the issue of software safety. Some are standard software engineering techniques aimed at reducing the number of faults in a software product, such as reviews and walkthroughs. Others, including fault tree analysis, are based on identifying and reducing hazards. This report examines the role of one such technique, formal methods, in the development of software for safety critical systems. The use of formal methods to increase the safety of software systems is based on their role in reducing the possibility of software errors that could lead to hazards. The use of formal methods in the development of software systems is controversial. Proponents claim that the use of formal methods can eliminate errors from the software development process, and produce programs that are provably correct. Opponents claim that they are difficult to learn and that their use increases development costs unacceptably. This report discusses the potential of formal methods for reducing failures in safety critical software systems.

  18. Agile Software Development Methods: A Comparative Review

    NASA Astrophysics Data System (ADS)

    Abrahamsson, Pekka; Oza, Nilay; Siponen, Mikko T.

    Although agile software development methods have caught the attention of software engineers and researchers worldwide, scientific research still remains quite scarce. The aim of this study is to order and make sense of the different agile approaches that have been proposed. This comparative review is performed from the standpoint of using the following features as the analytical perspectives: project management support, life-cycle coverage, type of practical guidance, adaptability in actual use, type of research objectives and existence of empirical evidence. The results show that agile software development methods cover, without offering any rationale, different phases of the software development life-cycle and that most of these methods fail to provide adequate project management support. Moreover, quite a few methods continue to offer little concrete guidance on how to use their solutions or how to adapt them in different development situations. Empirical evidence after ten years of application remains quite limited. Based on the results, new directions on agile methods are outlined.

  19. Recent Developments in Helioseismic Analysis Methods and Solar Data Assimilation

    NASA Astrophysics Data System (ADS)

    Schad, A.; Jouve, L.; Duvall, T. L.; Roth, M.; Vorontsov, S.

    2015-12-01

    We review recent advances and results in enhancing and developing helioseismic analysis methods and in solar data assimilation. In the first part of this paper we will focus on selected developments in time-distance and global helioseismology. In the second part, we review the application of data assimilation methods on solar data. Relating solar surface observations as well as helioseismic proxies with solar dynamo models by means of the techniques from data assimilation is a promising new approach to explore and to predict the magnetic activity cycle of the Sun.

  20. Research and development of a stationary source method for phosgene

    SciTech Connect

    Coppedge, E.A.; Johnson, L.D.; Steger, J.L.

    1996-12-31

    Phosgene is listed as one of the hazardous air pollutants in title I of the Clean Air Act. Phosgene is a highly toxic gas at standard temperature and pressure and has been used in military applications and for a variety of industrial uses. Although various methods have been developed for detection of phosgene in ambient air, no method is directly applicable to stationary source emissions. The EPA has an on-going research project to develop a field ready protocol for phosgene from stationary source emissions. The results of the derivatization studies, sampling train experiments and other laboratory work will be shown.

  1. A Valuation Method for Multi-Stage Development Projects

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yasuhiro; Kubo, Osamu; Ito, Junko; Ueda, Yoshikatsu

    A real-option based valuation method has been developed for multi-stage development projects which allow flexible stage-wise go/stop judgments. The proposed method measures the economic value of projects from the potential future cash flow they produce, and is characterized by the following four functions: (1) Incorporation of technical and market risks into project valuation, (2) Quantification of a project portfolio value, (3) Modeling of correlation between individual projects in a portfolio, and (4) Control of project portfolio risk with a risk index.
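
    The abstract does not give the valuation formulas. Purely as an illustration of the stage-wise go/stop idea, the sketch below (Python, with invented cash-flow figures, success probabilities, payoff distribution and discount rate) values a hypothetical two-stage project by Monte Carlo, abandoning at stage 2 whenever the expected continuation value is negative; it is not the authors' method.

        # Toy real-option valuation of a two-stage development project.
        # All numbers (costs, success probabilities, payoff distribution,
        # discount rate) are invented for illustration only. Market uncertainty
        # is assumed to be resolved after stage 1, so the go/stop judgment can
        # use the revealed payoff.
        import random

        def project_value(n_paths=100_000, seed=7):
            rng = random.Random(seed)
            r = 0.10                            # annual discount rate
            stage1_cost, p1 = 10.0, 0.6         # stage-1 cost and success probability
            stage2_cost, p2 = 30.0, 0.7         # stage-2 cost and success probability
            total = 0.0
            for _ in range(n_paths):
                value = -stage1_cost
                if rng.random() < p1:                      # stage 1 succeeds
                    payoff = rng.lognormvariate(4.0, 0.8)  # uncertain market payoff
                    # go/stop judgment: continue only if expected stage-2 value > 0
                    cont = (-stage2_cost + p2 * payoff / (1 + r)) / (1 + r)
                    if cont > 0:
                        value += cont
                total += value
            return total / n_paths

        print(f"expected project value: {project_value():.2f}")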

  2. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    SciTech Connect

    Prinn, Ronald; Webster, Mort

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.
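
    As a schematic of how a parameter probability density function feeds a scenario ensemble (not the authors' models or fitted distributions), the sketch below samples a hypothetical AEEI distribution and propagates it through a toy exponential emissions-growth relation to produce a spread of outcomes.

        # Schematic only: sample an uncertain AEEI parameter and propagate it
        # through a toy emissions projection. The distribution parameters and
        # the growth relation are placeholders, not values from the report.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 5000
        gdp_growth = 0.025                                   # assumed constant GDP growth rate
        aeei = rng.normal(loc=0.010, scale=0.004, size=n)    # hypothetical AEEI PDF

        years = 50
        e0 = 10.0                                            # initial emissions (arbitrary units)
        emissions_future = e0 * np.exp((gdp_growth - aeei) * years)

        lo, med, hi = np.percentile(emissions_future, [5, 50, 95])
        print(f"5th/50th/95th percentile emissions: {lo:.1f} / {med:.1f} / {hi:.1f}")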

  3. Development of ultrasonic methods for the nondestructive inspection of concrete

    NASA Astrophysics Data System (ADS)

    Claytor, T. M.; Ellingson, W. A.

    1983-08-01

    Nondestructive inspection of Portland cement and refractory concrete is conducted to determine strength, thickness, presence of voids or foreign matter, presence of cracks, amount of degradation due to chemical attack, and other properties without the necessity of coring the structure (which is usually accomplished by destructively removing a sample). The state of the art of acoustic nondestructive testing methods for Portland cement and refractory concrete is reviewed. Most nondestructive work on concrete has concentrated on measuring acoustic velocity by through transmission methods. Development of a reliable pitch-catch or pulse-echo system would provide a method of measuring thickness with access from only one side of the concrete.

  4. Development of ultrasonic methods for the nondestructive inspection of concrete

    SciTech Connect

    Claytor, T.N.; Ellingson, W.A.

    1983-08-01

    Nondestructive inspection of Portland cement and refractory concrete is conducted to determine strength, thickness, presence of voids or foreign matter, presence of cracks, amount of degradation due to chemical attack, and other properties without the necessity of coring the structure (which is usually accomplished by destructively removing a sample). This paper reviews the state of the art of acoustic nondestructive testing methods for Portland cement and refractory concrete. Most nondestructive work on concrete has concentrated on measuring acoustic velocity by through transmission methods. Development of a reliable pitch-catch or pulse-echo system would provide a method of measuring thickness with access from only one side of the concrete.
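
    Both records note that a pitch-catch or pulse-echo system would allow one-sided thickness measurement; the relation involved is simply thickness = (wave speed x two-way travel time) / 2, as in the small sketch below (the P-wave speed is an assumed, typical value for concrete, not a value from these reports).

        # Pulse-echo thickness estimate: one transducer sends a pulse and
        # records the back-wall echo; the two-way travel time gives thickness.
        # The wave speed below is an assumed, typical value for concrete.
        def thickness_from_echo(two_way_time_s, wave_speed_m_s=4000.0):
            return wave_speed_m_s * two_way_time_s / 2.0

        # e.g. a 100 microsecond round trip at ~4000 m/s -> ~0.2 m of concrete
        print(f"{thickness_from_echo(100e-6):.3f} m")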

  5. Development of a benchtop baking method for chemically leavened crackers. II. Validation of the method

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A benchtop baking method has been developed to predict the contribution of gluten functionality to overall flour performance for chemically leavened crackers. Using a diagnostic formula and procedure, dough rheology was analyzed to evaluate the extent of gluten development during mixing and machinin...

  6. New developments in adaptive methods for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Oden, J. T.; Bass, Jon M.

    1990-01-01

    New developments in a posteriori error estimates, smart algorithms, and h- and h-p adaptive finite element methods are discussed in the context of two- and three-dimensional compressible and incompressible flow simulations. Applications to rotor-stator interaction, rotorcraft aerodynamics, shock and viscous boundary layer interaction and fluid-structure interaction problems are discussed.

  7. Is Mixed Methods Research Used in Australian Career Development Research?

    ERIC Educational Resources Information Center

    Cameron, Roslyn

    2010-01-01

    Mixed methods research has become a substantive methodological force that is growing in popularity within the human and social sciences. This article reports the findings of a study that systematically reviewed articles from the "Australian Journal of Career Development" from 2004 to 2009. The aim of the study was to…

  8. NEW METHODS FOR MEASURING THE DEVELOPMENT OF ATTITUDES IN CHILDREN.

    ERIC Educational Resources Information Center

    HESS, ROBERT D.; TORNEY, JUDITH V.

    STRUCTURAL (NONCONTENT) DIMENSIONS OF CHILDREN'S POLITICAL ATTITUDES AND THEIR DEVELOPMENT WERE INVESTIGATED USING NEW METHODS DERIVED FROM SELF-REPORT DATA. THE CONSTRUCT, "ATTITUDE-CONCEPT SYSTEM," WAS INTRODUCED TO DESIGNATE EVALUATIONS OF AN ATTITUDE OBJECT AND BELIEFS ASSOCIATED WITH THIS EVALUATION. THE FIVE STRUCTURAL DIMENSIONS…

  9. DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.

    ERIC Educational Resources Information Center

    KUSEWITT, J.B.

    THE PURPOSE OF THIS STUDY WAS TO DEVELOP A METHOD FOR DETERMINING OBJECTIVE MEASURES OF TRAINER AIRCRAFT EFFECTIVENESS TO EVALUATE PROGRAM ALTERNATIVES FOR TRAINING PILOTS FOR FLEET FIGHTER AND ATTACK-TYPE AIRCRAFT. THE TRAINING SYLLABUS WAS BASED ON AVERAGE STUDENT ABILITY. THE BASIC PROBLEM WAS TO ESTABLISH QUANTITATIVE TIME-DIFFICULTY…

  10. Multidisciplinary Methods in Educational Technology Research and Development

    ERIC Educational Resources Information Center

    Randolph, Justus J.

    2008-01-01

    Over the past thirty years, there has been much dialogue, and debate, about the conduct of educational technology research and development. In this brief volume, the author helps clarify that dialogue by theoretically and empirically charting the research methods used in the field and provides much practical information on how to conduct…

  11. 59 FR- Method Development for Airborne Mycobacterium Tuberculosis; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    1994-03-07

    ... Tuberculosis; Meeting The National Institute for Occupational Safety and Health (NIOSH) of the Centers for... Airborne Mycobacterium Tuberculosis. Time and Date: 1 p.m.-5 p.m., March 29, 1994. Place: Alice Hamilton... peer review of a NIOSH project entitled ``Method Development For Airborne Mycobacterium...

  12. EXPOSURE ASSESSMENT METHODS DEVELOPMENT PILOTS FOR THE NATIONAL CHILDREN'S STUDY

    EPA Science Inventory

    Accurate exposure classification tools are needed to link exposure with health effects. EPA began methods development pilot studies in 2000 to address general questions about exposures and outcome measures. Selected pilot studies are highlighted in this poster. The “Literature Re...

  13. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

    To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. In traditional cost accounting systems, volume-based costing (VBC) is the most popular cost accounting method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often produces rough and inaccurate results. The activity-based costing (ABC) method introduced in the mid-1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests, and the results were similar in accuracy to those of the ABC method (largest difference 2.64%). At the same time, this new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to verify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing.
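
    The abstract describes activity-based costing conceptually; the sketch below (Python, with invented activities, cost pools, driver volumes and cost objects) shows the generic ABC allocation step - pooling indirect costs by activity, computing a rate per driver unit, and allocating to cost objects - which is the mechanism the S-ABC method streamlines by using fewer drivers.

        # Generic activity-based costing allocation with made-up numbers.
        # Each activity has an indirect cost pool and a cost driver; costs are
        # allocated to cost objects (here, two laboratory tests) in proportion
        # to the driver units each object consumes.
        activities = {
            "specimen handling": {"pool": 12000.0,
                                  "driver_units": {"test A": 300, "test B": 100}},
            "instrument time":   {"pool": 30000.0,
                                  "driver_units": {"test A": 150, "test B": 350}},
        }

        allocated = {"test A": 0.0, "test B": 0.0}
        for name, act in activities.items():
            total_units = sum(act["driver_units"].values())
            rate = act["pool"] / total_units            # cost per driver unit
            for obj, units in act["driver_units"].items():
                allocated[obj] += rate * units

        for obj, cost in allocated.items():
            print(f"{obj}: {cost:.2f}")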

  14. Progress in Development of Methods in Bone Densitometry

    NASA Technical Reports Server (NTRS)

    Whedon, G. D.; Neumann, William F.; Jenkins, Dale W.

    1966-01-01

    The effects of weightlessness and decreased activity on the astronaut's musculoskeletal system during prolonged space flight missions are of concern to NASA. This problem was anticipated from the knowledge that human subjects lose significant quantities of calcium from the skeleton during periods of bedrest, immobilization, and water immersion. An accurate method of measurement of the changes in mineral content of the skeleton is required not only in the space program but also in the biological, medical, and dental fields for mineral metabolism studies and for studying various pathological conditions of the skeleton and teeth. This method is a difficult one requiring the coordinated efforts of physiologists, biophysicists, radiologists, and clinicians. The densitometry methods reported in this conference which have been used or are being developed include X-ray, beta-excited X-rays, radioisotopes, sonic vibration, and neutron activation analysis. Studies in the Gemini, Biosatellite, and Apollo flights use the X-ray bone densitometry method, which requires making X-rays before and after the flights. An in-flight method of bone densitometry would be valuable, and use of radioisotope sources has been suggested. Many advances in bone densitometry have been made in the last five years, and the urgency of the requirement makes this working conference timely and valuable. In such a rapidly developing field with investigators working independently in a variety of scientific disciplines, a working conference is of great value in exchanging information and ideas, critically evaluating approaches and methods, and pointing out new research pathways.

  15. Developing new online calibration methods for multidimensional computerized adaptive testing.

    PubMed

    Chen, Ping; Wang, Chun; Xin, Tao; Chang, Hua-Hua

    2017-02-01

    Multidimensional computerized adaptive testing (MCAT) has received increasing attention over the past few years in educational measurement. As in all other formats of CAT, item replenishment is an essential part of MCAT for item bank maintenance and management, which governs retiring overexposed or obsolete items over time and replacing them with new ones. Moreover, the calibration precision of the new items will directly affect the estimation accuracy of examinees' ability vectors. In unidimensional CAT (UCAT) and cognitive diagnostic CAT, online calibration techniques have been developed to effectively calibrate new items. However, there has been very little discussion of online calibration in MCAT in the literature. Thus, this paper proposes new online calibration methods for MCAT based upon some popular methods used in UCAT. Three representative methods, Method A, the 'one EM cycle' method and the 'multiple EM cycles' method, are generalized to MCAT. Three simulation studies were conducted to compare the three new methods by manipulating three factors (test length, item bank design, and level of correlation between coordinate dimensions). The results showed that all the new methods were able to recover the item parameters accurately, and the adaptive online calibration designs showed some improvements compared to the random design under most conditions.
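
    The paper's MCAT procedures are not reproduced here. As a loose, unidimensional illustration of the "Method A" idea - treating examinees' ability estimates as known and fitting the new item's parameters by maximum likelihood - the sketch below calibrates a single 2PL item from simulated responses using scipy; the data, parameter values and the reduction to one dimension are all assumptions for the example.

        # Loose illustration of "Method A"-style online calibration, reduced to
        # a unidimensional 2PL item for brevity: examinee abilities (theta) are
        # treated as known, and the new item's (a, b) are fit by maximum
        # likelihood. Simulated data; not the MCAT procedure from the paper.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(42)
        theta = rng.normal(size=1000)                  # "known" ability estimates
        a_true, b_true = 1.2, -0.3
        p = 1.0 / (1.0 + np.exp(-a_true * (theta - b_true)))
        y = rng.binomial(1, p)                         # simulated item responses

        def neg_log_lik(params):
            a, b = params
            pr = 1.0 / (1.0 + np.exp(-a * (theta - b)))
            pr = np.clip(pr, 1e-9, 1 - 1e-9)
            return -np.sum(y * np.log(pr) + (1 - y) * np.log(1 - pr))

        fit = minimize(neg_log_lik, x0=[1.0, 0.0], method="Nelder-Mead")
        print("estimated a, b:", np.round(fit.x, 3))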

  16. Development of quality-by-design analytical methods.

    PubMed

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities.

  17. Quantifying nonhomogeneous colors in agricultural materials part I: method development.

    PubMed

    Balaban, M O

    2008-11-01

    Measuring the color of food and agricultural materials using machine vision (MV) has advantages not available with other measurement methods such as subjective tests or color meters. The perception of consumers may be affected by the nonuniformity of colors. For relatively uniform colors, average color values similar to those given by color meters can be obtained by MV. For nonuniform colors, various image analysis methods (color blocks, contours, and "color change index" [CCI]) can be applied to images obtained by MV. The degree of nonuniformity can be quantified, depending on the level of detail desired. In this article, the development of the CCI concept is presented. For images with a wide range of hue values, the color blocks method quantifies well the nonhomogeneity of colors. For images with a narrow hue range, the CCI method is a better indicator of color nonhomogeneity.
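
    The CCI itself is not defined in the abstract. As a generic stand-in only, the sketch below measures hue nonuniformity of an RGB image as the circular standard deviation of pixel hues (using matplotlib's RGB-to-HSV conversion); it captures the same idea of "spread of hue values" without reproducing the author's index.

        # Generic hue-nonuniformity measure for an RGB image (values in 0..1).
        # This is NOT the author's "color change index", only an illustration
        # of quantifying how widely pixel hues are spread.
        import numpy as np
        from matplotlib.colors import rgb_to_hsv

        def hue_spread(rgb_image):
            hsv = rgb_to_hsv(rgb_image)                # shape (H, W, 3), hue in 0..1
            hue_angle = hsv[..., 0] * 2.0 * np.pi      # treat hue as a circular variable
            c, s = np.cos(hue_angle).mean(), np.sin(hue_angle).mean()
            resultant = np.hypot(c, s)                 # 1.0 = perfectly uniform hue
            return np.sqrt(-2.0 * np.log(max(resultant, 1e-12)))  # circular std (radians)

        # Example: a synthetic image, half reddish and half greenish.
        img = np.zeros((10, 20, 3))
        img[:, :10] = [0.8, 0.2, 0.2]
        img[:, 10:] = [0.2, 0.8, 0.2]
        print(f"hue spread: {hue_spread(img):.2f} rad")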

  18. REVIEW: Development of methods for body composition studies

    NASA Astrophysics Data System (ADS)

    Mattsson, Sören; Thomas, Brian J.

    2006-07-01

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease.

  19. Methods comparison for microsatellite marker development: Different isolation methods, different yield efficiency

    NASA Astrophysics Data System (ADS)

    Zhan, Aibin; Bao, Zhenmin; Hu, Xiaoli; Lu, Wei; Hu, Jingjie

    2009-06-01

    Microsatellite markers have become one of the most important molecular tools used in various researches. A large number of microsatellite markers are required for the whole genome survey in the fields of molecular ecology, quantitative genetics and genomics. Therefore, it is extremely necessary to select several versatile, low-cost, efficient and time- and labor-saving methods to develop a large panel of microsatellite markers. In this study, we used Zhikong scallop (Chlamys farreri) as the target species to compare the efficiency of the five methods derived from three strategies for microsatellite marker development. The results showed that the strategy of constructing small insert genomic DNA library resulted in poor efficiency, while the microsatellite-enriched strategy highly improved the isolation efficiency. Although the mining public database strategy is time- and cost-saving, it is difficult to obtain a large number of microsatellite markers, mainly due to the limited sequence data of non-model species deposited in public databases. Based on the results in this study, we recommend two methods, microsatellite-enriched library construction method and FIASCO-colony hybridization method, for large-scale microsatellite marker development. Both methods were derived from the microsatellite-enriched strategy. The experimental results obtained from Zhikong scallop also provide the reference for microsatellite marker development in other species with large genomes.

  20. Organic analysis and analytical methods development: FY 1995 progress report

    SciTech Connect

    Clauss, S.A.; Hoopes, V.; Rau, J.

    1995-09-01

    This report describes the status of organic analyses and of developing analytical methods to account for the organic components in Hanford waste tanks, with particular emphasis on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-103 (Tank 103-SY). The analytical data are to serve as an example of the status of methods development and application. Samples of the convective and nonconvective layers from Tank 103-SY were analyzed for total organic carbon (TOC). The TOC value obtained for the nonconvective layer using the hot persulfate method was 10,500 {mu}g C/g. The TOC value obtained from samples of Tank 101-SY was 11,000 {mu}g C/g. The average value for the TOC of the convective layer was 6400 {mu}g C/g. Chelators and chelator fragments in Tank 103-SY samples were identified using derivatization gas chromatography/mass spectrometry (GC/MS). Organic components were quantified using GC/flame ionization detection. Major components in both the convective and nonconvective-layer samples include ethylenediaminetetraacetic acid (EDTA), nitrilotriacetic acid (NTA), succinic acid, nitrosoiminodiacetic acid (NIDA), citric acid, and ethylenediaminetriacetic acid (ED3A). Preliminary results also indicate the presence of C16 and C18 carboxylic acids in the nonconvective-layer sample. Oxalic acid was one of the major components in the nonconvective layer as determined by derivatization GC/flame ionization detection.

  1. Analytical Failure Prediction Method Developed for Woven and Braided Composites

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2003-01-01

    Historically, advances in aerospace engine performance and durability have been linked to improvements in materials. Recent developments in ceramic matrix composites (CMCs) have led to increased interest in CMCs to achieve revolutionary gains in engine performance. The use of CMCs promises many advantages for advanced turbomachinery engine development and may be especially beneficial for aerospace engines. The most beneficial aspects of CMC material may be its ability to maintain its strength to over 2500 °F, its internal material damping, and its relatively low density. Ceramic matrix composites reinforced with two-dimensional woven and braided fabric preforms are being considered for NASA's next-generation reusable rocket turbomachinery applications. However, the architecture of a textile composite is complex, and therefore, the parameters controlling its strength properties are numerous. This necessitates the development of engineering approaches that combine analytical methods with limited testing to provide effective, validated design analyses for textile composite structure development.

  2. Unstructured-grid methods development: Lessons Learned

    NASA Technical Reports Server (NTRS)

    Batina, John T.

    1991-01-01

    The development of unstructured grid methods for the solution of the equations of fluid flow is summarized, and some of the lessons learned are shared. The solution of the 3-D Euler equations is described, including spatial discretizations, temporal discretizations, and boundary conditions. An example calculation with an upwind implicit method using a CFL (Courant Friedricks Lewy) number of infinity is presented for the Boeing 747 aircraft. The results were obtained in less than one hour of CPU time on a Cray-2 computer, demonstrating the speed and robustness of the present capability.

  3. Development of target allocation methods for LAMOST focal plate

    NASA Astrophysics Data System (ADS)

    Yuan, Hailong; Zhang, Haotong; Zhang, Yanxia; Lei, Yajuan; Dong, Yiqiao

    2014-01-01

    We first introduce the primary target allocation requirements and restrictions for the parallel-controlled multiple fiber system used in the LAMOST spectroscopic survey. The fiber positioner anti-collision model is introduced. Then several target allocation methods and features are discussed in detail, including a network flow algorithm, giving higher priority to fiber units that cover fewer targets, a target allocation algorithm for groups, a target allocation method for add-ons, and target reallocation. Their virtues and weaknesses are analyzed for various kinds of scientific research situations. Furthermore, an optimization concept using the Simulated Annealing Algorithm (SAA) is developed to improve the fiber utilization efficiency.
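
    No algorithmic details are given in the abstract. The sketch below is a much-simplified, hypothetical simulated-annealing allocator in which each fiber may take at most one target within its patrol radius and a random reassignment is accepted or rejected by the usual Metropolis rule; the geometry and parameters are invented, and fiber anti-collision and the other allocation modes discussed in the paper are ignored.

        # Highly simplified simulated-annealing target allocation: maximize the
        # number of assigned targets, one target per fiber, target must lie
        # within the fiber's patrol radius. Geometry and parameters are invented;
        # fiber anti-collision is ignored.
        import math, random

        rng = random.Random(0)
        fibers = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(40)]
        targets = [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(60)]
        patrol_radius = 1.5

        def reachable(f, t):
            return math.dist(fibers[f], targets[t]) <= patrol_radius

        # candidate target lists per fiber
        cand = [[t for t in range(len(targets)) if reachable(f, t)]
                for f in range(len(fibers))]

        assign = [None] * len(fibers)          # assign[f] = target index or None

        def score(a):
            return len({t for t in a if t is not None})   # distinct targets observed

        temp = 1.0
        for step in range(20000):
            f = rng.randrange(len(fibers))
            if not cand[f]:
                continue
            old, new = assign[f], rng.choice(cand[f] + [None])
            if new is not None and new in assign:
                continue                       # target already taken by another fiber
            before = score(assign)
            assign[f] = new
            delta = score(assign) - before
            if delta < 0 and rng.random() > math.exp(delta / temp):
                assign[f] = old                # reject the move
            temp = max(0.01, temp * 0.9995)    # geometric cooling

        print("targets allocated:", score(assign))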

  4. [Descartes' influence on the development of the anatomoclinical method].

    PubMed

    González Hernández, A; Domínguez Rodríguez, M V; Fabre Pi, O; Cubero González, A

    2010-01-01

    The development of the anatomical-clinical method was a huge advance for modern medicine since it revealed a new approach to understanding diagnostic procedures. This change in medical thinking towards a more scientific basis has gradually evolved over several centuries, reaching its brilliant zenith with the contributions of the French school. There are certain similarities between the guidelines of the anatomical-clinical method and René Descartes' philosophical principles, so it is fair to consider him as one of the major precursors in this new line of thinking that definitely influenced the historical course of medicine.

  5. Development of an in vitro cloning method for Cowdria ruminantium.

    PubMed Central

    Perez, J M; Martinez, D; Debus, A; Sheikboudou, C; Bensaid, A

    1997-01-01

    Cowdria ruminantium is a tick-borne rickettsia which causes severe disease in ruminants. All studies with C. ruminantium reported so far were carried out with stocks consisting of infective blood collected from reacting animals or from the same stocks propagated in vitro. Cloned isolates are needed to conduct studies on immune response of the host, on genetic diversity of the parasite, and on mechanisms of attenuation and the development of vaccines. A method of cloning based on the particular chlamydia life cycle of Cowdria was developed. Instead of cloning extracellular elementary bodies, it appeared more convenient to clone endothelial cells infected by one morula resulting from the infection of the cell by one elementary body of Cowdria. Two hundred and sixteen clones were obtained by limiting dilution of infected cells. The method was experimentally validated by comparing randomly amplified polymorphic DNA fingerprints from individual clones obtained from endothelial cell cultures coinfected with two different stocks of C. ruminantium. PMID:9302217

  6. Development of motion control method for laser soldering process

    SciTech Connect

    Yerganian, S.S.

    1997-05-01

    Development of a method to generate the motion control data for sealing an electronic housing using laser soldering is described. The motion required to move the housing under the laser is a nonstandard application and was performed with a four-axis system using the timed data streaming mode capabilities of a Compumotor AT6400 indexer. A Microsoft Excel 5.0 spreadsheet (named Israuto.xls) was created to calculate the movement of the part under the laser, and macros were written into the spreadsheet to allow the user to easily create this data. A data verification method was developed for simulating the motion data. The geometry of the assembly was generated using Parametric Technology Corporation Pro/E version 15. This geometry was then converted using Pro/DADS version 3.1 from Computer Aided Design Software Inc. (CADSI), and the simulation was carried out using DADS version 8.0 from CADSI.

  7. Development of characteristic evaluation method on FR cycle system

    SciTech Connect

    Shinoda, Y.; Shiotani, H.; Hirao, K.

    2002-07-01

    The present report is intended to explain some results of the characteristic evaluation work on various FR cycle system concepts in the first phase of JNC's 'Feasibility Study on Commercialized Fast Reactor Cycle System' (1999 to March 2001). The evaluation method is developed for six criteria: economics, effective utilization of uranium resources, reduction of environmental impact, safety, proliferation resistance, and technological feasibility. (authors)

  8. Methods to Develop Inhalation Cancer Risk Estimates for ...

    EPA Pesticide Factsheets

    This document summarizes the approaches and rationale for the technical and scientific considerations used to derive inhalation cancer risks for emissions of chromium and nickel compounds from electric utility steam generating units. The purpose of this document is to discuss the methods used to develop inhalation cancer risk estimates associated with emissions of chromium and nickel compounds from coal- and oil-fired electric utility steam generating units (EGUs) in support of EPA's recently proposed Air Toxics Rule.
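
    For context, EPA inhalation risk characterization generally multiplies a chemical's inhalation unit risk (IUR, per µg/m³) by the long-term exposure concentration. The snippet below only illustrates that arithmetic with placeholder values; the numbers are not taken from the document or the proposed rule.

        # Illustrative inhalation cancer-risk arithmetic with placeholder values:
        # lifetime risk ~= inhalation unit risk (per ug/m3) x exposure concentration.
        iur_per_ug_m3 = 1.2e-2          # hypothetical unit risk, not a rule value
        exposure_conc_ug_m3 = 5.0e-4    # hypothetical modeled long-term concentration

        lifetime_risk = iur_per_ug_m3 * exposure_conc_ug_m3
        print(f"estimated lifetime cancer risk: {lifetime_risk:.1e}")   # ~6e-6 (6 in a million)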

  9. In silico machine learning methods in drug development.

    PubMed

    Dobchev, Dimitar A; Pillai, Girinath G; Karelson, Mati

    2014-01-01

    Machine learning (ML) computational methods for predicting compounds with pharmacological activity, specific pharmacodynamic and ADMET (absorption, distribution, metabolism, excretion and toxicity) properties are being increasingly applied in drug discovery and evaluation. Recently, machine learning techniques such as artificial neural networks, support vector machines and genetic programming have been explored for predicting inhibitors, antagonists, blockers, agonists, activators and substrates of proteins related to specific therapeutic targets. These methods are particularly useful for screening compound libraries of diverse chemical structures, "noisy" and high-dimensional data to complement QSAR methods, and in cases of unavailable receptor 3D structure to complement structure-based methods. A variety of studies have demonstrated the potential of machine-learning methods for predicting compounds as potential drug candidates. The present review is intended to give an overview of the strategies and current progress in using machine learning methods for drug design and the potential of the respective model development tools. We also consider a number of applications of machine learning algorithms to common classes of diseases.
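
    As a minimal, generic example of the classification task described (and not any specific model from the reviewed studies), the sketch below trains a support vector machine on random binary "fingerprint" features to label compounds active or inactive using scikit-learn; the data and the activity rule are synthetic placeholders.

        # Minimal, generic example: SVM classification of compounds described by
        # binary fingerprint-like features. The data are random placeholders,
        # not real chemical descriptors or activity labels.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(3)
        X = rng.integers(0, 2, size=(500, 128))        # 500 compounds x 128 fingerprint bits
        y = (X[:, :10].sum(axis=1) > 5).astype(int)    # synthetic "activity" rule

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
        clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")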

  10. Viscous wing theory development. Volume 1: Analysis, method and results

    NASA Technical Reports Server (NTRS)

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  11. Development of fluorescent methods for DNA methyltransferase assay

    NASA Astrophysics Data System (ADS)

    Li, Yueying; Zou, Xiaoran; Ma, Fei; Tang, Bo; Zhang, Chun-yang

    2017-03-01

    DNA methylation modified by DNA methyltransferase (MTase) plays an important role in regulating gene transcription, cell growth and proliferation. Aberrant DNA MTase activity may lead to a variety of human diseases, including cancers. Therefore, accurate and sensitive detection of DNA MTase activity is crucial to biomedical research, clinical diagnostics and therapy. However, conventional DNA MTase assays often suffer from labor-intensive operations and time-consuming procedures. Alternatively, fluorescent methods have significant advantages of simplicity and high sensitivity, and have been widely applied for DNA MTase assay. In this review, we summarize the recent advances in the development of fluorescent methods for DNA MTase assay. These emerging methods include amplification-free and amplification-assisted assays. Moreover, we discuss the challenges and future directions of this area.

  12. Pu and Am determination in the environment—method development

    NASA Astrophysics Data System (ADS)

    Afonin, M.; Simonoff, M.; Donard, O.; Michel, H.; Ardisson, G.

    2003-01-01

    A high resolution inductively coupled plasma mass spectrometric (HR-ICP-MS) method for the determination of plutonium isotopes, Am and the 240Pu/239Pu isotope ratio was developed, based on modifications of the HASL-300 procedures Pu-02-RC (Plutonium in Soil Samples), Pu-03-RC (Plutonium in Soil Residue—Total Dissolution Method), Pu-11-RC (Plutonium Purification—Ion Exchange Technique) and Pu-12-RC (Plutonium and/or Americium in Soil or Sediments). Total plutonium concentrations (239+240Pu) measured in environmental samples by this HR-ICP-MS method were in good agreement with recommended data obtained from alpha-spectrometry. The time required to analyze the samples was reduced by more than 33%.

  13. Development of gait segmentation methods for wearable foot pressure sensors.

    PubMed

    Crea, S; De Rossi, S M M; Donati, M; Reberšek, P; Novak, D; Vitiello, N; Lenzi, T; Podobnik, J; Munih, M; Carrozza, M C

    2012-01-01

    We present an automated segmentation method based on the analysis of plantar pressure signals recorded from two synchronized wireless foot insoles. Given the strict limits on computational power and power consumption typical of wearable electronic components, our aim is to investigate the capability of a Hidden Markov Model machine-learning method to detect gait phases with different levels of complexity in the processing of the wearable pressure sensor signals. Therefore, three different datasets are developed: raw voltage values, calibrated sensor signals and a calibrated estimation of total ground reaction force and position of the plantar center of pressure. The method is tested on a pool of 5 healthy subjects through a leave-one-out cross validation. The results show high classification performance when estimated biomechanical variables are used, 96% on average. Calibrated signals and raw voltage values show higher delays and dispersions in phase transition detection, suggesting a lower reliability for online applications.
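
    As a compact illustration of how an HMM decodes gait phases from a pressure-derived feature (with hand-specified, made-up model parameters rather than the trained models and features of the paper), the sketch below runs log-domain Viterbi decoding of four hypothetical phases over a 1-D total-force signal.

        # Toy HMM gait-phase decoding: four hypothetical phases, Gaussian
        # emissions over a 1-D "total ground reaction force" feature, decoded
        # with log-domain Viterbi. All parameters are invented for illustration.
        import numpy as np

        phases = ["heel strike", "flat foot", "push off", "swing"]
        log_A = np.log(np.array([      # left-to-right-ish transition matrix
            [0.7, 0.3, 0.0, 0.0],
            [0.0, 0.7, 0.3, 0.0],
            [0.0, 0.0, 0.7, 0.3],
            [0.3, 0.0, 0.0, 0.7],
        ]) + 1e-12)
        means = np.array([0.4, 0.9, 0.6, 0.05])   # normalized force per phase
        stds = np.array([0.15, 0.10, 0.15, 0.05])

        def viterbi(obs):
            def log_emis(x):                       # Gaussian log density (up to a constant)
                return -0.5 * ((x - means) / stds) ** 2 - np.log(stds)
            delta = np.log(np.full(4, 0.25)) + log_emis(obs[0])
            back = []
            for x in obs[1:]:
                scores = delta[:, None] + log_A    # scores[i, j]: prev state i -> state j
                back.append(scores.argmax(axis=0))
                delta = scores.max(axis=0) + log_emis(x)
            path = [int(delta.argmax())]
            for bp in reversed(back):              # backtrack the best state sequence
                path.append(int(bp[path[-1]]))
            return [phases[i] for i in reversed(path)]

        signal = [0.35, 0.5, 0.85, 0.95, 0.9, 0.6, 0.55, 0.1, 0.02, 0.05, 0.4]
        print(viterbi(np.array(signal)))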

  14. Bioanalytical method development and validation: Critical concepts and strategies.

    PubMed

    Moein, Mohammad Mahdi; El Beqqali, Aziza; Abdel-Rehim, Mohamed

    2017-02-01

    Bioanalysis is an essential part of drug discovery and development. It concerns the analysis of analytes (drugs, metabolites, biomarkers) in biological samples and involves several steps, from sample collection to sample analysis and data reporting. The first step is sample collection from clinical or preclinical studies, after which the samples are sent to the laboratory for analysis. The second step, sample clean-up (sample preparation), is critical: to obtain reliable results, a robust and stable sample preparation method must be applied. The role of sample preparation is to remove interferences from the sample matrix and improve analytical system performance, and it is often labor-intensive and time-consuming. The last step is sample analysis and detection; for separation and detection, liquid chromatography-tandem mass spectrometry (LC-MS/MS) is the method of choice in bioanalytical laboratories because of its high selectivity and sensitivity. In addition, the analyte's chemical structure and properties should be known before bioanalytical work begins. This review provides an overview of bioanalytical method development and validation. The main principles of method validation are discussed, GLP and regulated bioanalysis are described, commonly used sample preparation techniques are presented, and the role of LC-MS/MS in modern bioanalysis is discussed, with a focus on the bioanalysis of small molecules.
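
    A minimal sketch of one calculation that recurs in method validation, accuracy (%bias) and precision (%RSD) from quality-control replicates, is given below. The replicate values, nominal level, and acceptance limit in the comment are illustrative assumptions, not data from the review.

    ```python
    import statistics

    def accuracy_precision(measured, nominal):
        """Return (%bias, %RSD) for a set of QC replicates at one nominal concentration."""
        mean = statistics.mean(measured)
        rsd = 100.0 * statistics.stdev(measured) / mean
        bias = 100.0 * (mean - nominal) / nominal
        return bias, rsd

    # hypothetical QC replicates (ng/mL) at a 50 ng/mL nominal level
    qc = [48.9, 51.2, 50.4, 47.8, 52.1, 49.5]
    bias, rsd = accuracy_precision(qc, nominal=50.0)
    print(f"bias = {bias:+.1f}%, precision = {rsd:.1f}% RSD")   # a typical limit is within +/-15%
    ```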

  15. The ReaxFF method - new applications and developments

    NASA Astrophysics Data System (ADS)

    van Duin, Adri

    The ReaxFF method provides a highly transferable simulation method for atomistic-scale simulations of chemical reactions at the nanosecond and nanometer scale. It combines concepts of bond-order-based potentials with a polarizable charge distribution. Since its initial development for hydrocarbons in 2001, the concept has proven transferable to elements across the periodic table, including all first-row elements, metals, ceramics and ionic materials. For all these elements and associated materials we have demonstrated that ReaxFF can reproduce quantum-mechanics-based structures, reaction energies and reaction barriers with reasonable accuracy, enabling the method to predict reaction kinetics in complicated, multi-material environments at a relatively modest computational expense. This presentation will describe the current concepts of the ReaxFF method and the current status of the various ReaxFF codes, including parallel implementations and recently developed hybrid grand canonical Monte Carlo options, which significantly increase its range of applications. We will also present an overview of recent applications to a range of materials of increasing complexity, with a focus on combustion, biomaterials, batteries, tribology and catalysis.

  16. Development of acoustic sniper localization methods and models

    NASA Astrophysics Data System (ADS)

    Grasing, David; Ellwood, Benjamin

    2010-04-01

    A novel method capable of providing situational awareness of sniper fire from small arms is presented. Situational awareness (SA) information is extracted by exploiting two distinct sounds created by a small-arms discharge: the muzzle blast (created when the bullet leaves the barrel of the gun) and the shockwave (the sound created by a supersonic bullet). The direction of arrival associated with the muzzle blast always points toward the shooter. Range can be estimated from the muzzle blast alone; however, at greater distances geometric dilution of precision makes obtaining accurate range estimates difficult. To address this issue, additional information obtained from the shockwave is used to estimate the range to the shooter. The focus of the paper is the development of a shockwave propagation model, the development of ballistics models (based on empirical measurements), and their subsequent application to methods of determining shooter position. Knowledge of the round's ballistics is required to estimate range to the shooter. Many existing methods rely on extracting information from the shockwave in an attempt to identify the round type and thus the ballistic model to use ([1]). In our experience this information becomes unreliable at greater distances or in high-noise environments. Our method differs from existing solutions in that classification of the round type is not required, making the proposed solution more robust. Additionally, we demonstrate that sufficient accuracy can be achieved without the need to classify the round.
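
    A much-simplified worked example of the range estimate is sketched below. It assumes a constant supersonic bullet speed and a shot fired directly toward the sensor, so the shock arrives at roughly R/v and the muzzle blast at R/c; real systems, including the one described above, must account for bullet deceleration and shot geometry via the ballistics and shockwave models discussed in the paper.

    ```python
    # Simplified range estimate from the time gap between shockwave and muzzle-blast
    # arrivals, assuming (i) constant supersonic bullet speed and (ii) a shot fired
    # directly toward the sensor. Under these assumptions
    #   dt = R*(1/c - 1/v_bullet)  =>  R = dt / (1/c - 1/v_bullet).
    # Both assumptions are illustrative simplifications.

    def range_from_time_gap(dt, v_bullet=800.0, c=343.0):
        """Shooter range [m] from the shock-to-muzzle-blast time gap dt [s]."""
        return dt / (1.0 / c - 1.0 / v_bullet)

    print(f"{range_from_time_gap(0.5):.0f} m")   # ~0.5 s gap -> ~300 m for an 800 m/s round
    ```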

  17. The development of CFD methods for rotor applications

    NASA Technical Reports Server (NTRS)

    Caradonna, F. X.; Mccroskey, W. J.

    1988-01-01

    The optimum design of the advancing helicopter rotor for high-speed forward flight always involves a tradeoff between transonic and stall limitations. However, the rotor industry remained preoccupied primarily with stall until well into the 1970s. This emphasis on stall resulted from the prevalent use of low-solidity rotors with rather outdated airfoil sections. The use of cambered airfoil sections and higher-solidity rotors substantially reduced stall and revealed transonic flow on the advancing blade to be a more persistent limitation on high-speed rotor performance. Work in this area was spurred not only by operational necessity but also by the development of a tool for the prediction of these flows: computational fluid dynamics. The development of computational fluid dynamics for these rotor problems was a major Army and NASA achievement. This work is now being extended to other rotor flow problems, and the developments are outlined.

  18. Development of modelling method selection tool for health services management: From problem structuring methods to modelling and simulation methods

    PubMed Central

    2011-01-01

    Background There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. Aim The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. Methods This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). Results The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills, not to mention money and time, are scarce. Conclusions A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision-making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection. PMID:21595946
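
    A minimal sketch of how such a comparison matrix might be encoded and filtered is shown below. The three methods and their attribute values are illustrative placeholders, not the paper's 28-method characterisation.

    ```python
    # Hypothetical encoding of a method-comparison matrix: each method is tagged with
    # the life-cycle stage it suits and the input resources it needs. Entries are
    # illustrative placeholders only.
    methods = {
        "discrete-event simulation": {"stage": {"operations"}, "data_need": "high", "time": "high"},
        "system dynamics":           {"stage": {"strategy"},   "data_need": "medium", "time": "medium"},
        "soft systems methodology":  {"stage": {"problem structuring"}, "data_need": "low", "time": "low"},
    }

    def shortlist(stage, max_data_need):
        """Return methods suited to a stage that do not exceed the available data."""
        order = {"low": 0, "medium": 1, "high": 2}
        return [name for name, attrs in methods.items()
                if stage in attrs["stage"] and order[attrs["data_need"]] <= order[max_data_need]]

    print(shortlist("strategy", "medium"))   # -> ['system dynamics']
    ```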

  19. User Experience Evaluation Methods in Product Development (UXEM'09)

    NASA Astrophysics Data System (ADS)

    Roto, Virpi; Väänänen-Vainio-Mattila, Kaisa; Law, Effie; Vermeeren, Arnold

    High quality user experience (UX) has become a central competitive factor of product development in mature consumer markets [1]. Although the term UX originated from industry and is a widely used term also in academia, the tools for managing UX in product development are still inadequate. A prerequisite for designing delightful UX in an industrial setting is to understand both the requirements tied to the pragmatic level of functionality and interaction and the requirements pertaining to the hedonic level of personal human needs, which motivate product use [2]. Understanding these requirements helps managers set UX targets for product development. The next phase in a good user-centered design process is to iteratively design and evaluate prototypes [3]. Evaluation is critical for systematically improving UX. In many approaches to UX, evaluation basically needs to be postponed until the product is fully or at least almost fully functional. However, in an industrial setting, it is very expensive to find the UX failures only at this phase of product development. Thus, product development managers and developers have a strong need to conduct UX evaluation as early as possible, well before all the parts affecting the holistic experience are available. Different types of products require evaluation on different granularity and maturity levels of a prototype. For example, due to its multi-user characteristic, a community service or an enterprise resource planning system requires a broader scope of UX evaluation than a microwave oven or a word processor that is meant for a single user at a time. Before systematic UX evaluation can be taken into practice, practical, lightweight UX evaluation methods suitable for different types of products and different phases of product readiness are needed. A considerable amount of UX research is still about the conceptual frameworks and models for user experience [4]. Besides, applying existing usability evaluation methods (UEMs) without

  20. Development of an optomechanical statistical tolerancing method for cost reduction

    NASA Astrophysics Data System (ADS)

    Lamontagne, Frédéric; Doucet, Michel

    2012-10-01

    Optical systems generally require a high level of optical component positioning precision, resulting in elevated manufacturing cost. Optomechanical tolerance analysis is usually performed by the optomechanical engineer using personal knowledge of the achievable manufacturing precision. Worst-case or root-sum-square (RSS) tolerance calculation methods are frequently used for their simplicity. In most situations, however, encountering the worst-case error is statistically almost impossible. On the other hand, the RSS method is generally not an accurate representation of reality, since it assumes centered normal distributions; moreover, it is not suitable for multidimensional tolerance analyses that combine translational and rotational variations. An optomechanical tolerance analysis method based on Monte Carlo simulation has been developed at INO to reduce the overdesign caused by pessimistic manufacturing and assembly error predictions. Manufacturing error data have been compiled and used as input for the optomechanical Monte Carlo tolerance model, resulting in a more realistic prediction of optical component positioning errors (decenter, tilt and air gap). The calculated error probabilities were validated on a real lens barrel assembly using a high-precision centering machine. Results show that the statistical error prediction is more accurate and can significantly relax the required precision compared with the worst-case method. Manufacturing, inspection, adjustment-mechanism and alignment costs can then be reduced considerably.
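
    The difference between the three prediction styles can be reproduced with a few lines of code. The sketch below stacks three decenter contributors under worst-case, RSS, and Monte Carlo assumptions; the tolerance values and the sampling distributions are assumed for illustration and are not taken from the paper.

    ```python
    import numpy as np

    # Compare worst-case, RSS, and Monte Carlo predictions of a lens decenter stack-up.
    # Three contributors (e.g. bore clearance, seat runout, cell decenter) with assumed
    # +/- tolerances; the Monte Carlo run samples each from an assumed distribution
    # instead of taking the extreme or a centered normal.
    tols = np.array([0.020, 0.015, 0.010])         # mm, illustrative tolerances

    worst_case = tols.sum()
    rss = np.sqrt((tols ** 2).sum())

    rng = np.random.default_rng(1)
    n = 100_000
    samples = (rng.uniform(-tols[0], tols[0], n)          # uniform within a clearance
               + rng.normal(0.0, tols[1] / 3.0, n)        # normal, 3-sigma = tolerance
               + rng.normal(0.0, tols[2] / 3.0, n))
    mc_95 = np.percentile(np.abs(samples), 95)

    print(f"worst case: {worst_case:.3f} mm, RSS: {rss:.3f} mm, MC 95th pct: {mc_95:.3f} mm")
    ```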

  1. The NASA digital VGH program. Exploration of methods and final results. Volume 1: Development of methods

    NASA Technical Reports Server (NTRS)

    Crabill, Norman L.

    1989-01-01

    Two hundred hours of Lockheed L-1011 digital flight data recorder data taken in 1973 were used to develop methods and procedures for obtaining statistical data useful for updating airliner airworthiness design criteria. Five thousand hours of additional data, taken from 1978 to 1982, are reported in volumes 2, 3, 4 and 5.

  2. Development of an Immunoaffinity Method for Purification of Streptokinase

    PubMed Central

    Karimi, Zohreh; Babashamsi, Mohammad; Asgarani, Ezat; Salimi, Ali

    2012-01-01

    Background Streptokinase is a potent activator of plasminogen to plasmin, the enzyme that can solubilize the fibrin network in blood clots. Streptokinase is currently used in clinical medicine as a thrombolytic agent. It is naturally secreted by β-hemolytic streptococci. Methods To reach an efficient method of purification, an immunoaffinity chromatography method was developed that could purify the streptokinase in a single step with high yield. At the first stage, a CNBr-Activated sepharose 4B-Lysine column was made to purify the human blood plasminogen. The purified plasminogen was utilized to construct a column that could purify the streptokinase. The rabbit was immunized with the purified streptokinase and the anti-streptokinase (IgG) purified on another streptokinase substituted sepharose-4B column. The immunoaffinity column was developed by coupling the purified anti-Streptokinase (IgG) to sepharose 6MB–Protein A. The Escherichia coli (E.coli) BL21 (DE3) pLysS strain was transformed by the recombinant construct (cloned streptokinase gene in pGEX-4T-2 vector) and gene expression was induced by IPTG. The expressed protein was purified by immunoaffinity chromatography in a single step. Results The immunoaffinity column could purify the recombinant fusion GST-SK to homogeneity. The purity of streptokinase was confirmed by SDS-PAGE as a single band of about 71 kD and its biological activity determined in a specific streptokinase assay. The yield of the purification was about 94%. Conclusion This method of streptokinase purification is superior to the previous conventional methods. PMID:23408770

  3. Developing integrated methods to address complex resource and environmental issues

    USGS Publications Warehouse

    Smith, Kathleen S.; Phillips, Jeffrey D.; McCafferty, Anne E.; Clark, Roger N.

    2016-02-08

    IntroductionThis circular provides an overview of selected activities that were conducted within the U.S. Geological Survey (USGS) Integrated Methods Development Project, an interdisciplinary project designed to develop new tools and conduct innovative research requiring integration of geologic, geophysical, geochemical, and remote-sensing expertise. The project was supported by the USGS Mineral Resources Program, and its products and acquired capabilities have broad applications to missions throughout the USGS and beyond.In addressing challenges associated with understanding the location, quantity, and quality of mineral resources, and in investigating the potential environmental consequences of resource development, a number of field and laboratory capabilities and interpretative methodologies evolved from the project that have applications to traditional resource studies as well as to studies related to ecosystem health, human health, disaster and hazard assessment, and planetary science. New or improved tools and research findings developed within the project have been applied to other projects and activities. Specifically, geophysical equipment and techniques have been applied to a variety of traditional and nontraditional mineral- and energy-resource studies, military applications, environmental investigations, and applied research activities that involve climate change, mapping techniques, and monitoring capabilities. Diverse applied geochemistry activities provide a process-level understanding of the mobility, chemical speciation, and bioavailability of elements, particularly metals and metalloids, in a variety of environmental settings. Imaging spectroscopy capabilities maintained and developed within the project have been applied to traditional resource studies as well as to studies related to ecosystem health, human health, disaster assessment, and planetary science. Brief descriptions of capabilities and laboratory facilities and summaries of some

  4. [Development of a Drug Discovery Method Targeted to Stromal Tissue].

    PubMed

    Kamada, Haruhiko

    2016-01-01

    Several diseases are characterized by alterations in the molecular distribution of vascular structures, presenting the opportunity to use monoclonal antibodies for clinical therapies. This pharmaceutical strategy, often referred to as "vascular targeting", has promise in promoting the discovery and development of selective biological drugs to regulate angiogenesis-related diseases such as cancer. Various experimental approaches have been utilized to discover accessible vascular markers of health and disease at the protein level. Our group has developed a new chemical proteomics technology to identify and quantify accessible vascular proteins in normal organs and at disease sites. Our developed methodology relies on the perfusion of animal models with suitable ester derivatives of biotin, which react with the primary amine groups of proteins as soon as the molecules are attached. This presentation reports biomedical applications based on vascular targeting strategies, as well as methodologies that have been used to discover new vascular targets. The identification of antigens located in the stromal tissue of pathological blood vessels may provide attractive targets for the development of antibody drugs. This method will also provide an efficient discovery target that could lead to the development of novel antibody drugs.

  5. Development of DNA-based Identification methods to track the ...

    EPA Pesticide Factsheets

    The ability to track the identity and abundance of larval fish, which are ubiquitous during spawning season, may lead to a greater understanding of fish species distributions in Great Lakes nearshore areas including early-detection of invasive fish species before they become established. However, larval fish are notoriously hard to identify using traditional morphological techniques. While DNA-based identification methods could increase the ability of aquatic resource managers to determine larval fish composition, use of these methods in aquatic surveys is still uncommon and presents many challenges. In response to this need, we have been working with the U. S. Fish and Wildlife Service to develop field and laboratory methods to facilitate the identification of larval fish using DNA-meta-barcoding. In 2012, we initiated a pilot-project to develop a workflow for conducting DNA-based identification, and compared the species composition at sites within the St. Louis River Estuary of Lake Superior using traditional identification versus DNA meta-barcoding. In 2013, we extended this research to conduct DNA-identification of fish larvae collected from multiple nearshore areas of the Great Lakes by the USFWS. The species composition of larval fish generally mirrored that of fish species known from the same areas, but was influenced by the timing and intensity of sampling. Results indicate that DNA-based identification needs only very low levels of biomass to detect pre

  6. [Development of analytical method for determination nicotine metabolites in urine].

    PubMed

    Piekoszewski, Wojciech; Florek, Ewa; Kulza, Maksymilian; Wilimowska, Jolanta; Loba, Urszula

    2009-01-01

    The assay of biomarkers in biological material is the most popular and reliable method for estimating exposure to tobacco smoke. Nicotine and its metabolites are among the most specific biomarkers of tobacco smoke; currently the most often used are cotinine and trans-3'-hydroxycotinine. The aim of this study was to develop an easy and quick method for determining nicotine and its main metabolites with high performance liquid chromatography, which is available in most laboratories. Nicotine and its metabolites in urine (cotinine, trans-3'-hydroxycotinine, nornicotine and nicotine N-oxide) were determined by high performance liquid chromatography with UV detection (HPLC-UV). The compounds were extracted from urine by liquid-liquid extraction before being analysed by HPLC. The developed HPLC technique proved useful for assessing nicotine and its four metabolites in smokers, though further research is necessary. Further modification of the procedure is required because matrix interferences with cotinine N-oxide prevent its determination. Increasing the extraction efficiency for nicotine and nornicotine could enable their determination in people exposed to environmental tobacco smoke (ETS). This study confirms other authors' observations that 3'-hydroxycotinine might be a predictor of tobacco smoke exposure equivalent to cotinine, although further studies are required.

  7. Development of Cross-Assembly Phage PCR-Based Methods ...

    EPA Pesticide Factsheets

    Technologies that can characterize human fecal pollution in environmental waters offer many advantages over traditional general indicator approaches. However, many human-associated methods cross-react with non-human animal sources and lack suitable sensitivity for fecal source identification applications. The genome of a newly discovered bacteriophage (~97 kbp), the Cross-Assembly phage or “crAssphage”, assembled from a human gut metagenome DNA sequence library is predicted to be both highly abundant and predominately occur in human feces suggesting that this double stranded DNA virus may be an ideal human fecal pollution indicator. We report the development of two human-associated crAssphage endpoint PCR methods (crAss056 and crAss064). A shotgun strategy was employed where 384 candidate primers were designed to cover ~41 kbp of the crAssphage genome deemed favorable for method development based on a series of bioinformatics analyses. Candidate primers were subjected to three rounds of testing to evaluate assay optimization, specificity, limit of detection (LOD95), geographic variability, and performance in environmental water samples. The top two performing candidate primer sets exhibited 100% specificity (n = 70 individual samples from 8 different animal species), >90% sensitivity (n = 10 raw sewage samples from different geographic locations), LOD95 of 0.01 ng/µL of total DNA per reaction, and successfully detected human fecal pollution in impaired envi

  8. Development of computational methods for heavy lift launch vehicles

    NASA Technical Reports Server (NTRS)

    Yoon, Seokkwan; Ryan, James S.

    1993-01-01

    The research effort has been focused on the development of an advanced flow solver for complex viscous turbulent flows with shock waves. The three-dimensional Euler and full/thin-layer Reynolds-averaged Navier-Stokes equations for compressible flows are solved on structured hexahedral grids. The Baldwin-Lomax algebraic turbulence model is used for closure. The space discretization is based on a cell-centered finite-volume method augmented by a variety of numerical dissipation models with optional total variation diminishing limiters. The governing equations are integrated in time by an implicit method based on lower-upper factorization and symmetric Gauss-Seidel relaxation. The algorithm is vectorized on diagonal planes of sweep using two-dimensional indices in three dimensions. A new computer program named CENS3D has been developed for viscous turbulent flows with discontinuities. Details of the code are described in Appendix A and Appendix B. With the developments of the numerical algorithm and dissipation model, the simulation of three-dimensional viscous compressible flows has become more efficient and accurate. The results of the research are expected to yield a direct impact on the design process of future liquid fueled launch systems.
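
    As a generic illustration of the relaxation idea mentioned above (and not of the LU-SGS implicit scheme implemented in CENS3D), the sketch below applies symmetric Gauss-Seidel sweeps, one forward and one backward per iteration, to a 1-D Poisson problem with a known solution.

    ```python
    import numpy as np

    # Symmetric Gauss-Seidel (forward then backward sweep) for -u'' = f on (0, 1) with
    # u(0) = u(1) = 0 on a uniform grid. Illustrative only; not the CENS3D algorithm.
    n, sweeps = 32, 500
    h = 1.0 / (n + 1)
    f = np.ones(n)                 # constant source
    u = np.zeros(n)

    for _ in range(sweeps):
        for i in range(n):                         # forward sweep
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            u[i] = 0.5 * (left + right + h * h * f[i])
        for i in range(n - 1, -1, -1):             # backward sweep
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            u[i] = 0.5 * (left + right + h * h * f[i])

    x = (np.arange(n) + 1) * h
    exact = 0.5 * x * (1.0 - x)                    # analytic solution for f = 1
    print(f"max error after {sweeps} symmetric sweeps: {np.abs(u - exact).max():.2e}")
    ```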

  9. Physics methods development for the NCSU PULSTAR reactor

    SciTech Connect

    Perez, P.B.; Mayo, C.W.; Giavedoni, E.

    1996-12-31

    The safety analysis reports (SARs) of all university research reactors include analyses that determine reactor physics parameters. The initial SAR analyses utilized numerical models, codes, cross-section libraries, and computing platforms available at the time. Advances and updates in all of these contributing areas make it difficult or impractical to resort to the earlier methodologies for meeting current analysis needs. Many facilities updated their physics methods during the high-enrichment uranium (HEU) to low-enrichment uranium (LEU) conversion effort. These facilities updated their SAR with current methodologies. The North Carolina State University's (NCSU's) PULSTAR research reactor was designed to use low-enrichment (4%) fuel, and as a result, the facility did not update the reactor physics analyses during the HEU-to-LEU conversion program. An effort is currently under way at NCSU to develop new and updated methods for reactor physics calculations. Currently planned physics calculations for the PULSTAR reactor support both reactor licensing and experimental facility development goals. These goals include the following: 1. Increase excess reactivity by introducing beryllium reflector assemblies and a mixed-enrichment core. 2. Characterize various experimental facilities in support of neutron transmutation doping, prompt gamma analysis, and neutron depth profiling. 3. Establish core loading patterns that optimize characteristics for experimental facilities. Two- and three-dimensional, multigroup models utilizing the DANT-SYS and MCNP codes have been developed in support of these goals. Results and lessons learned with the DANT-SYS code are presented in this paper.

  10. Development of fire test methods for airplane interior materials

    NASA Technical Reports Server (NTRS)

    Tustin, E. A.

    1978-01-01

    Fire tests were conducted in a 737 airplane fuselage at NASA-JSC to characterize jet fuel fires in open steel pans (simulating post-crash fire sources and a ruptured airplane fuselage) and to characterize fires in some common combustibles (simulating in-flight fire sources). Design post-crash and in-flight fire source selections were based on these data. Large panels of airplane interior materials were exposed to closely-controlled large scale heating simulations of the two design fire sources in a Boeing fire test facility utilizing a surplused 707 fuselage section. Small samples of the same airplane materials were tested by several laboratory fire test methods. Large scale and laboratory scale data were examined for correlative factors. Published data for dangerous hazard levels in a fire environment were used as the basis for developing a method to select the most desirable material where trade-offs in heat, smoke and gaseous toxicant evolution must be considered.

  11. Accelerated molecular dynamics methods: introduction and recent developments

    SciTech Connect

    Uberuaga, Blas Pedro; Voter, Arthur F; Perez, Danny; Shim, Y; Amar, J G

    2009-01-01

    A long-standing limitation in the use of molecular dynamics (MD) simulation is that it can only be applied directly to processes that take place on very short timescales: nanoseconds if empirical potentials are employed, or picoseconds if we rely on electronic structure methods. Many processes of interest in chemistry, biochemistry, and materials science require study over microseconds and beyond, due either to the natural timescale for the evolution or to the duration of the experiment of interest. Ignoring the case of liquids xxx, the dynamics on these time scales is typically characterized by infrequent-event transitions, from state to state, usually involving an energy barrier. There is a long and venerable tradition in chemistry of using transition state theory (TST) [10, 19, 23] to directly compute rate constants for these kinds of activated processes. If needed, dynamical corrections to the TST rate, and even quantum corrections, can be computed to achieve an accuracy suitable for the problem at hand. These rate constants then allow us to understand the system behavior on longer time scales than we can directly reach with MD. For complex systems with many reaction paths, the TST rates can be fed into a stochastic simulation procedure such as kinetic Monte Carlo xxx, and a direct simulation of the advance of the system through its possible states can be obtained in a probabilistically exact way. A problem that has become more evident in recent years, however, is that for many systems of interest there is a complexity that makes it difficult, if not impossible, to determine all the relevant reaction paths to which TST should be applied. This is a serious issue, as omitted transition pathways can have uncontrollable consequences on the simulated long-time kinetics. Over the last decade or so, we have been developing a new class of methods for treating the long-time dynamics in these complex, infrequent-event systems. Rather than trying to guess in advance what
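
    A one-line harmonic TST estimate makes the timescale argument concrete. In the sketch below the attempt frequency and barrier are assumed, typical-of-solids values rather than results from the paper; the point is that the resulting waiting time lies many orders of magnitude beyond what direct MD can reach.

    ```python
    import math

    # Harmonic transition-state-theory estimate k = nu0 * exp(-Ea / (kB*T)) for a single
    # activated event. Prefactor and barrier are assumed, illustrative values.
    kB = 8.617e-5          # Boltzmann constant, eV/K
    nu0 = 1.0e13           # attempt frequency, s^-1 (assumed)
    Ea = 0.8               # energy barrier, eV (assumed)
    T = 300.0              # temperature, K

    k = nu0 * math.exp(-Ea / (kB * T))
    print(f"rate = {k:.2e} s^-1, mean waiting time = {1.0/k:.2e} s")
    ```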

  12. Identifying emerging research collaborations and networks: Method development

    PubMed Central

    Dozier, Ann M.; Martina, Camille A.; O’Dell, Nicole L.; Fogg, Thomas T.; Lurie, Stephen J.; Rubinstein, Eric P.; Pearson, Thomas A.

    2014-01-01

    Clinical and translational research is a multidisciplinary, collaborative team process. To evaluate this process, we developed a method to document emerging research networks and collaborations in our medical center to describe their productivity and viability over time. Using an email survey, sent to 1,620 clinical and basic science full-and part-time faculty members, respondents identified their research collaborators. Initial analyses, using Pajek software, assessed the feasibility of using social network analysis (SNA) methods with these data. Nearly 400 respondents identified 1,594 collaborators across 28 medical center departments resulting in 309 networks with 5 or more collaborators. This low-burden approach yielded a rich dataset useful for evaluation using SNA to: a) assess networks at several levels of the organization, including intrapersonal (individuals), interpersonal (social), organizational/institutional leadership (tenure and promotion), and physical/environmental (spatial proximity) and b) link with other data to assess the evolution of these networks. PMID:24019209
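
    The study used Pajek for its network analysis; the sketch below shows the same basic workflow with the networkx library instead, turning hypothetical (respondent, collaborator) survey responses into a graph and summarising it.

    ```python
    import networkx as nx

    # Build a collaboration network from (respondent, collaborators) survey responses
    # and summarize it. Names are hypothetical; the study itself used Pajek.
    responses = [
        ("Ames",  ["Baker", "Chen"]),
        ("Baker", ["Chen", "Diaz"]),
        ("Evans", ["Foster"]),
    ]

    G = nx.Graph()
    for respondent, collaborators in responses:
        for c in collaborators:
            G.add_edge(respondent, c)

    components = [c for c in nx.connected_components(G) if len(c) >= 3]   # networks of 3+
    centrality = nx.degree_centrality(G)

    print(f"{G.number_of_nodes()} people, {G.number_of_edges()} ties")
    print("networks with 3+ members:", components)
    print("most connected:", max(centrality, key=centrality.get))
    ```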

  13. Development of nondestructive evaluation methods for structural ceramics

    SciTech Connect

    Ellingson, W.A.; Koehl, R.D.; Stuckey, J.B.; Sun, J.G.; Engel, H.P.; Smith, R.G.

    1997-06-01

    Development of nondestructive evaluation (NDE) methods for application to fossil energy systems continues in three areas: (a) mapping axial and radial density gradients in hot gas filters, (b) characterization of the quality of continuous fiber ceramic matrix composite (CFCC) joints, and (c) characterization and detection of defects in thermal barrier coatings. In this work, X-ray computed tomographic imaging was further developed and used to map variations in the axial and radial density of two full-length (2.3-m) hot gas filters. The two filters differed in through-wall density because of the thickness of the coating on the continuous fibers. Differences in axial and through-wall density were clearly detected. Through-transmission infrared imaging with a highly sensitive focal plane array camera was used to assess joint quality in two sets of SiC/SiC CFCC joints. High-frame-rate data capture suggests that the infrared imaging method holds potential for the characterization of CFCC joints. Work was undertaken to develop NDE methods that can be used to evaluate electron beam physical vapor deposited coatings with platinum-aluminide (Pt-Al) bonds. Coatings of zirconia with thicknesses of 125 µm (0.005 in.), 190 µm (0.0075 in.), and 254 µm (0.010 in.), with a Pt-Al bond coat on Rene N5 Ni-based superalloy, were studied by infrared imaging. Currently, it appears that thickness variation, as well as thermal properties, can be assessed by infrared technology.

  14. Electromagnetic methods for development and production: State of the art

    SciTech Connect

    Wilt, M.; Alumbaugh, D.

    1997-10-01

    Electromagnetic (EM) methods, long used for borehole logging as a formation evaluation tool in developed oil fields, are rarely applied in surface or crosshole configurations or in cased wells. This is largely due to the high levels of cultural noise and the preponderance of steel well casing. However, recent experimental success with crosshole EM systems for water and steam flood monitoring using fiberglass-cased wells has shown promise in applying these techniques to development and production (D & P) problems. This paper describes technological solutions that will allow for successful application of EM techniques in oil fields, despite surface noise and steel casing. First, an example cites the application of long-offset logging to map resistivity structure away from the borehole. Next, a successful application of crosshole EM where one of the wells is steel cased is described. The potential application of earth's field nuclear magnetic resonance (NMR) to map fluid saturation at large distances from the boreholes is also discussed.

  15. Developments in flow visualization methods for flight research

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.; Obara, Clifford J.; Manuel, Gregory S.; Lee, Cynthia C.

    1990-01-01

    With the introduction of modern airplanes utilizing laminar flow, flow visualization has become an important diagnostic tool in determining aerodynamic characteristics such as surface flow direction and boundary-layer state. A refinement of the sublimating chemical technique has been developed to define both the boundary-layer transition location and the transition mode. In response to the need for flow visualization at subsonic and transonic speeds and altitudes above 20,000 feet, the liquid crystal technique has been developed. A third flow visualization technique that has been used is infrared imaging, which offers non-intrusive testing over a wide range of test conditions. A review of these flow visualization methods and recent flight results is presented for a variety of modern aircraft and flight conditions.

  16. Development of impact design methods for ceramic gas turbine components

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1990-01-01

    Impact damage prediction methods are being developed to aid in the design of ceramic gas turbine engine components with improved impact resistance. Two impact damage modes were characterized: local, near the impact site, and structural, usually fast fracture away from the impact site. Local damage to Si3N4 impacted by Si3N4 spherical projectiles consists of ring and/or radial cracks around the impact point. In a mechanistic model being developed, impact damage is characterized as microcrack nucleation and propagation. The extent of damage is measured as volume fraction of microcracks. Model capability is demonstrated by simulating late impact tests. Structural failure is caused by tensile stress during impact exceeding material strength. The EPIC3 code was successfully used to predict blade structural failures in different size particle impacts on radial and axial blades.

  17. Development of methods to predict agglomeration and deposition in FBCs

    SciTech Connect

    Mann, M.D.; Henderson, A.K.; Swanson, M.K.; Erickson, T.A.

    1995-11-01

    This 3-year, multiclient program is providing the information needed to determine the behavior of inorganic components in FBC units using advanced methods of analysis coupled with bench-scale combustion experiments. The major objectives of the program are as follows: (1) To develop further our advanced ash and deposit characterization techniques to quantify the effects of the liquid-phase components in terms of agglomerate formation and ash deposits, (2) To determine the mechanisms of inorganic transformations that lead to bed agglomeration and ash deposition in FBC systems, and (3) To develop a better means to predict the behavior of inorganic components as a function of coal composition, bed material characteristics, and combustion conditions.

  18. Exploring the Application of Community Development Methods on Water Research in Developing Countries

    NASA Astrophysics Data System (ADS)

    Crane, P. E.

    2012-12-01

    Research and community development work focused on water in developing countries share a common concern with issues of water quantity and quality. In the best circumstances both are innovative, bringing understanding and solutions to resource-poor regions that are appropriate to their unique situations. But the underlying methods and measures of success often differ significantly. Applying critical aspects of community development methods to water research in developing countries could increase the probability of identifying innovative and sustainable solutions. This is examined through two case studies: the first identifies common methods across community development projects in six African countries, and the second examines water quality research performed in Benin, West Africa through the lens of these methods. The first case study is taken from observations, gathered between 2008 and 2012, of community development projects focused on water quantity and quality in six sub-Saharan African countries, implemented through different non-governmental organizations. These projects took place in rural and peri-urban regions where public utilities were few to none, the incidence of diarrheal disease was high, and most adults had received little formal education. The water projects included drilling of boreholes, building of rainwater tanks, oasis rehabilitation, spring protection, and household biosand filters. All solutions were implemented with hygiene and sanitation components. Although these projects occurred in a wide array of cultural, geographical and climatic regions, the most successful projects shared methods of implementation. These methods are: high levels of stakeholder participation, environmental and cultural adaptation of process and product, and implementation over an extended length of time. The second case study focuses on water quality research performed in Benin, West Africa from 2003 to 2008. This research combined laboratory and statistical analyses with

  19. Development of RNAi Methods for Peregrinus maidis, the Corn Planthopper

    PubMed Central

    Yao, Jianxiu; Rotenberg, Dorith; Afsharifar, Alireza; Barandoc-Alviar, Karen; Whitfield, Anna E.

    2013-01-01

    The corn planthopper, Peregrinus maidis, is a major pest of agronomically-important crops. Peregrinus maidis has a large geographical distribution and transmits Maize mosaic rhabdovirus (MMV) and Maize stripe tenuivirus (MSpV). The objective of this study was to develop effective RNAi methods for P. maidis. Vacuolar-ATPase (V-ATPase) is an essential enzyme for hydrolysis of ATP and for transport of protons out of cells thereby maintaining membrane ion balance, and it has been demonstrated to be an efficacious target for RNAi in other insects. In this study, two genes encoding subunits of P. maidis V-ATPase (V-ATPase B and V-ATPase D) were chosen as RNAi target genes. The open reading frames of V-ATPase B and D were generated and used for constructing dsRNA fragments. Experiments were conducted using oral delivery and microinjection of V-ATPase B and V-ATPase D dsRNA to investigate the effectiveness of RNAi in P. maidis. Real-time quantitative reverse transcriptase-PCR (qRT-PCR) analysis indicated that microinjection of V-ATPase dsRNA led to a minimum reduction of 27-fold in the normalized abundance of V-ATPase transcripts two days post injection, while ingestion of dsRNA resulted in a two-fold reduction after six days of feeding. While both methods of dsRNA delivery resulted in knockdown of target transcripts, the injection method was more rapid and effective. The reduction in V-ATPase transcript abundance resulted in observable phenotypes. Specifically, the development of nymphs injected with 200 ng of either V-ATPase B or D dsRNA was impaired, resulting in higher mortality and lower fecundity than control insects injected with GFP dsRNA. Microscopic examination of these insects revealed that female reproductive organs did not develop normally. The successful development of RNAi in P. maidis to target specific genes will enable the development of new insect control strategies and functional analysis of vital genes and genes associated with interactions between P

  20. Development of RNAi methods for Peregrinus maidis, the corn planthopper.

    PubMed

    Yao, Jianxiu; Rotenberg, Dorith; Afsharifar, Alireza; Barandoc-Alviar, Karen; Whitfield, Anna E

    2013-01-01

    The corn planthopper, Peregrinus maidis, is a major pest of agronomically-important crops. Peregrinus maidis has a large geographical distribution and transmits Maize mosaic rhabdovirus (MMV) and Maize stripe tenuivirus (MSpV). The objective of this study was to develop effective RNAi methods for P. maidis. Vacuolar-ATPase (V-ATPase) is an essential enzyme for hydrolysis of ATP and for transport of protons out of cells thereby maintaining membrane ion balance, and it has been demonstrated to be an efficacious target for RNAi in other insects. In this study, two genes encoding subunits of P. maidis V-ATPase (V-ATPase B and V-ATPase D) were chosen as RNAi target genes. The open reading frames of V-ATPase B and D were generated and used for constructing dsRNA fragments. Experiments were conducted using oral delivery and microinjection of V-ATPase B and V-ATPase D dsRNA to investigate the effectiveness of RNAi in P. maidis. Real-time quantitative reverse transcriptase-PCR (qRT-PCR) analysis indicated that microinjection of V-ATPase dsRNA led to a minimum reduction of 27-fold in the normalized abundance of V-ATPase transcripts two days post injection, while ingestion of dsRNA resulted in a two-fold reduction after six days of feeding. While both methods of dsRNA delivery resulted in knockdown of target transcripts, the injection method was more rapid and effective. The reduction in V-ATPase transcript abundance resulted in observable phenotypes. Specifically, the development of nymphs injected with 200 ng of either V-ATPase B or D dsRNA was impaired, resulting in higher mortality and lower fecundity than control insects injected with GFP dsRNA. Microscopic examination of these insects revealed that female reproductive organs did not develop normally. The successful development of RNAi in P. maidis to target specific genes will enable the development of new insect control strategies and functional analysis of vital genes and genes associated with interactions between P

  1. Chloroform extraction of iodine in seawater: method development

    NASA Astrophysics Data System (ADS)

    Seidler, H. B.; Glimme, A.; Tumey, S.; Guilderson, T. P.

    2012-12-01

    While 129I poses little to no radiological health hazard, the isotopic ratio of 129I to stable iodine is very useful as a nearly conservative tracer of ocean mixing processes. The disaster at the Fukushima Daiichi nuclear power plant released many radioactive materials into the environment, including 129I, and this release allows oceanic processes to be studied by tracking 129I. However, with such low iodine (~0.5 micromolar) and 129I concentrations (<10^-11), accelerator mass spectrometry (AMS) is needed for accurate measurements. In order to prepare samples of ocean water for analysis by AMS, the iodine needs to be separated from the various other salts in the seawater. Solvent extraction is the preferred method for preparation of seawater for AMS analysis of 129I. However, given the relatively low background 129I concentrations in the Pacific Ocean, we sought to optimize the recovery of this method, which would minimize both the sample size and the carrier addition required for analysis. We started from a base method described in other research and worked towards maximum efficiency of the process while boosting the recovery of iodine. During development, we assessed each methodological change qualitatively using a color scale (I2 in CHCl3) and quantitatively using inductively coupled plasma mass spectrometry (ICP-MS). The "optimized method" yielded a 20-40% increase in recovery of the iodine compared to the base method (80-85% recovery vs. 60%). Lastly, the "optimized method" was tested by AMS for fractionation of the extracted iodine.

  2. Characterization, thermal stability studies, and analytical method development of Paromomycin for formulation development.

    PubMed

    Khan, Wahid; Kumar, Neeraj

    2011-06-01

    Paromomycin (PM) is an aminoglycoside antibiotic, first isolated in the 1950s and approved in 2006 for the treatment of visceral leishmaniasis. Although it was isolated six decades ago, information essential for the development of a pharmaceutical formulation of PM is not available. The purpose of this work was to determine the thermal stability of PM and to develop a new analytical method to support formulation development. PM was characterized by thermoanalytical (DSC, TGA, and HSM) and spectroscopic (FTIR) techniques, and these techniques were used to establish the thermal stability of PM after heating at 100, 110, 120, and 130 °C for 24 h. The biological activity of the heated samples was also determined by microbiological assay. Subsequently, a simple, rapid and sensitive RP-HPLC method for quantitative determination of PM was developed using pre-column derivatization with 9-fluorenylmethyl chloroformate, and was applied to quantify PM in two parenteral dosage forms. PM was successfully characterized by the stated techniques, which indicated that PM is stable when heated at up to 120 °C for 24 h but degrades when heated at 130 °C. This degradation was also observed in the microbiological assay, where PM lost ∼30% of its biological activity when heated at 130 °C for 24 h. The new analytical method covered the concentration range of 25-200 ng/ml with intra-day and inter-day variability of <2% RSD and was found to be sensitive, accurate, and precise for the quantification of PM. Copyright © 2010 John Wiley & Sons, Ltd.

  3. System identification methods for aircraft flight control development and validation

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.

    1995-01-01

    System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. The extracted models allow the characterization of the response of the overall aircraft or component subsystem behavior, such as actuators and on-board signal processing algorithms. This paper discusses the use of frequency-domain system-identification methods for the development and integration of aircraft flight-control systems. The extraction and analysis of models of varying complexity from nonparametric frequency-responses to transfer-functions and high-order state-space representations is illustrated using the Comprehensive Identification from FrEquency Responses (CIFER) system-identification facility. Results are presented for test data of numerous flight and simulation programs at the Ames Research Center including rotorcraft, fixed-wing aircraft, advanced short takeoff and vertical landing (ASTOVL), vertical/short takeoff and landing (V/STOL), tiltrotor aircraft, and rotor experiments in the wind tunnel. Excellent system characterization and dynamic response prediction is achieved for this wide class of systems. Examples illustrate the role of system-identification technology in providing an integrated flow of dynamic response data around the entire life-cycle of aircraft development from initial specifications, through simulation and bench testing, and into flight-test optimization.
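
    The nonparametric end of this workflow can be illustrated with a generic frequency-response estimate (only a sketch of the idea, unrelated to the CIFER implementation). A known first-order system is excited with broadband noise and its frequency response is estimated as the ratio of Welch-averaged cross- and auto-spectra, the standard H1 estimator; the system and signal parameters are assumed.

    ```python
    import numpy as np
    from scipy import signal

    # Nonparametric frequency-response identification: excite a known first-order lag
    # with broadband noise, then estimate H(f) = Pxy(f) / Pxx(f) from Welch-averaged
    # cross- and auto-spectra. All parameters are illustrative.
    fs = 100.0
    t = np.arange(0, 200, 1 / fs)
    u = np.random.default_rng(0).standard_normal(t.size)      # broadband excitation

    # "truth": first-order lag 1/(s + 1), simulated with lsim
    _, y, _ = signal.lsim(([1.0], [1.0, 1.0]), U=u, T=t)

    f, Pxy = signal.csd(u, y, fs=fs, nperseg=2048)
    _, Pxx = signal.welch(u, fs=fs, nperseg=2048)
    H = Pxy / Pxx                                              # estimated frequency response

    w = 2 * np.pi * f[1:50]
    H_true = 1.0 / (1j * w + 1.0)
    err = np.abs(np.abs(H[1:50]) - np.abs(H_true)).max()
    print(f"max |H| error over the low-frequency bins: {err:.3f}")
    ```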

  4. Development of new methods for studying nanostructures using neutron scattering

    SciTech Connect

    Pynn, Roger

    2016-03-18

    The goal of this project was to develop improved instrumentation for studying the microscopic structures of materials using neutron scattering. Neutron scattering has a number of advantages for studying material structure but suffers from the well-known disadvantage that neutrons’ ability to resolve structural details is usually limited by the strength of available neutron sources. We aimed to overcome this disadvantage using a new experimental technique, called Spin Echo Scattering Angle Encoding (SESAME) that makes use of the neutron’s magnetism. Our goal was to show that this innovation will allow the country to make better use of the significant investment it has recently made in a new neutron source at Oak Ridge National Laboratory (ORNL) and will lead to increases in scientific knowledge that contribute to the Nation’s technological infrastructure and ability to develop advanced materials and technologies. We were successful in demonstrating the technical effectiveness of the new method and established a baseline of knowledge that has allowed ORNL to start a project to implement the method on one of its neutron beam lines.

  5. Multiphysics methods development for high temperature gas reactor analysis

    NASA Astrophysics Data System (ADS)

    Seker, Volkan

    Multiphysics computational methods were developed to perform design and safety analysis of the next generation Pebble Bed High Temperature Gas Cooled Reactors. A suite of code modules was developed to solve the coupled thermal-hydraulics and neutronics field equations. The thermal-hydraulics module is based on the three dimensional solution of the mass, momentum and energy equations in cylindrical coordinates within the framework of the porous media method. The neutronics module is a part of the PARCS (Purdue Advanced Reactor Core Simulator) code and provides a fine mesh finite difference solution of the neutron diffusion equation in three dimensional cylindrical coordinates. Coupling of the two modules was performed by mapping the solution variables from one module to the other. Mapping is performed automatically in the code system by the use of a common material mesh in both modules. The standalone validation of the thermal-hydraulics module was performed with several cases of the SANA experiment and the standalone thermal-hydraulics exercise of the PBMR-400 benchmark problem. The standalone neutronics module was validated by performing the relevant exercises of the PBMR-268 and PBMR-400 benchmark problems. Additionally, the validation of the coupled code system was performed by analyzing several steady state and transient cases of the OECD/NEA PBMR-400 benchmark problem.
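
    The variable-mapping coupling described above can be caricatured with a zero-dimensional fixed-point (Picard) iteration between a "neutronics" power model with temperature feedback and a lumped "thermal-hydraulics" temperature model. All coefficients in the sketch below are invented for illustration and bear no relation to PARCS, the SANA experiment, or the PBMR benchmarks.

    ```python
    # Toy Picard coupling between a 0-D power model with temperature feedback and a
    # lumped thermal model; only the exchange of solution variables is illustrated.
    ALPHA = -2.0e-5    # reactivity feedback per kelvin (assumed)
    RHO0 = 0.002       # externally inserted reactivity (assumed)
    P0 = 100.0e6       # reference power, W
    T_IN = 750.0       # reference temperature, K
    R_TH = 2.0e-6      # lumped thermal resistance, K/W (assumed)

    def neutronics(T):
        """Power response to net reactivity; a crude, monotone stand-in model."""
        rho = RHO0 + ALPHA * (T - T_IN)
        return P0 * (1.0 + 200.0 * rho)

    def thermal_hydraulics(P):
        """Fuel temperature from a simple lumped thermal resistance."""
        return T_IN + R_TH * P

    T, P = T_IN, P0
    for it in range(100):
        P_new = neutronics(T)              # map temperature field -> neutronics module
        T_new = thermal_hydraulics(P_new)  # map power field -> thermal-hydraulics module
        if abs(T_new - T) < 1e-6 and abs(P_new - P) < 1.0:
            break
        T, P = T_new, P_new

    print(f"converged after {it} iterations: P = {P_new/1e6:.2f} MW, T = {T_new:.1f} K")
    ```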

  6. Continuum modeling using granular micromechanics approach: Method development and applications

    NASA Astrophysics Data System (ADS)

    Poorsolhjouy, Payam

    This work presents a constitutive modeling approach for the behavior of granular materials. In the granular micromechanics approach presented here, the material point is assumed to be composed of grains interacting with their neighbors through different inter-granular mechanisms that together represent the material's macroscopic behavior. The present work focuses on (i) developing the method for modeling more complicated material systems as well as more complicated loading scenarios and (ii) applications of the method for modeling various granular materials and granular assemblies. A damage-plasticity model for cementitious and rock-like materials is developed, calibrated, and verified in a thermo-mechanically consistent manner. Grain-pair interactions in normal tension, normal compression, and tangential directions have been defined in a manner that is consistent with the material's macroscopic behavior. The resulting model is able to predict, among other interesting issues, the effects of loading-induced anisotropy. The material's response to loading depends on the loading history of grain-pair interactions in different directions; thus the model predicts load-path-dependent failure. Due to the inadequacies of first-gradient continuum theories in predicting phenomena such as shear band width, wave dispersion, and frequency band gaps, the presented method is enhanced by incorporating non-classical terms in the kinematic analysis. A complete micromorphic theory is presented by incorporating additional terms such as fluctuations, second-gradient terms, and spin fields. The relative deformation of grain pairs is calculated based on the enhanced kinematic analysis. The resulting theory incorporates the deformation and forces in grain-pair interactions due to different kinematic measures into the macroscopic behavior. As a result, non-classical phenomena such as wave dispersion and frequency band gaps can be predicted. Using the grain-scale analysis, a practical approach for

  7. Method development for proteome stabilization in human saliva.

    PubMed

    Xiao, Hua; Wong, David T W

    2012-04-13

    Human saliva is a biological fluid with emerging early-detection and diagnostic potential. However, the salivary proteome suffers from rapid degradation, which compromises its translational and clinical utility. Easy, reliable and practical methods are therefore urgently required for the storage of human saliva samples. In this study, saliva samples from healthy subjects were collected and stored at room temperature (RT) and 4 °C for different lengths of time, with and without specific protein stabilization treatments. SDS-PAGE was run to compare the protein profiles of the samples. Reference proteins, β-actin and interleukin-1β (IL1β), were chosen to evaluate salivary protein stability, and immunoassays were used for the detection of these target proteins. All data were compared with a positive control that had been kept at -80 °C. The results show that the salivary proteome stored at 4 °C with added protease inhibitors was stable for approximately two weeks without significant degradation, and that adding ethanol stabilized the salivary proteome at RT. After optimization, a simple, robust and convenient method was developed for the stabilization of proteins in human saliva that does not affect downstream translational and clinical applications: with added ethanol, the salivary proteome could be stored at RT for about two weeks without significant degradation. This optimized method could greatly accelerate the clinical use of saliva for future diagnostics.

  8. Electrospinning: methods and development of biodegradable nanofibres for drug release.

    PubMed

    Ashammakhi, N; Wimpenny, I; Nikkola, L; Yang, Y

    2009-02-01

    It is clear that nanofibrous structures can be used as tools for many applications. Electrospinning is known to be a highly versatile method of producing nanofibres, and recent developments in the technique have led to aligned nanofibres and biphasic, core-sheath fibres which can be used to encapsulate materials ranging from molecules to cells. Natural extracellular matrix (ECM) contains fibres at both the micro- and nano-scale and provides a structural scaffold which allows cells to localize, migrate, proliferate and differentiate. Polymer nanofibres can provide the structural cues of ECM. Moreover, the current literature gives new hope for further functionalising polymeric nanofibres by using them as drug delivery devices and improving their design for better control of delivery. By encapsulating active agents within nanofibres (multifunctional nanofibres), a degree of control can be exerted over the release of the encapsulated agents; the behaviour of cells can therefore be manipulated to develop effective therapies, and combining factors such as fibre diameter, alignment and chemical composition in new ways is extremely encouraging for the tissue engineering field. Such multifunctional nanofibre-based systems are already being investigated in vivo. Experiments have shown significant potential for the treatment of disease and the engineering of neural and bone tissues. Further, phase III clinical trials of nanofibrous patches for applications in wound treatment were encouraging. Hopefully, clinical applications of these drug delivery devices will follow, to enhance regenerative medicine applications.

  9. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  10. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystem. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base, and to assess the cooperation between the rule-bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, a satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations, and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data is available.
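
    As a rough illustration of the decision metric mentioned above, the sketch below computes a root-sum-square (RSS) position error from the position block of a Kalman filter covariance matrix; the covariance values are purely hypothetical, and the actual NSM Expert compares such values across navaid configurations and trajectories.

```python
import numpy as np

def rss_position_error(P):
    """Root-sum-square 1-sigma position error from the position block of a covariance matrix."""
    return float(np.sqrt(np.trace(P[:3, :3])))

# Hypothetical 6x6 covariance (position + velocity states), units m^2 and (m/s)^2
P = np.diag([25.0, 16.0, 64.0, 0.04, 0.04, 0.09])
print(f"RSS position error: {rss_position_error(P):.1f} m")   # sqrt(25+16+64) ~ 10.2 m
```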

  11. Improved Method Being Developed for Surface Enhancement of Metallic Materials

    NASA Technical Reports Server (NTRS)

    Gabb, Timothy P.; Telesman, Jack; Kantzos, Peter T.

    2001-01-01

    Surface enhancement methods induce a layer of beneficial residual compressive stress to improve the impact (FOD) resistance and fatigue life of metallic materials. A traditional method of surface enhancement often used is shot peening, in which small steel spheres are repeatedly impinged on metallic surfaces. Shot peening is inexpensive and widely used, but the plastic deformation of 20 to 40 percent imparted by the impacts can be harmful. This plastic deformation can damage the microstructure, severely limiting the ductility and durability of the material near the surface. It has also been shown to promote accelerated relaxation of the beneficial compressive residual stresses at elevated temperatures. Low-plasticity burnishing (LPB) is being developed as an improved method for the surface enhancement of metallic materials. LPB is being investigated as a rapid, inexpensive surface enhancement method under NASA Small Business Innovation Research contracts NAS3-98034 and NAS3-99116, with supporting characterization work at NASA. Previously, roller burnishing had been employed to refine surface finish. This concept was adopted and then optimized as a means of producing a layer of compressive stress of high magnitude and depth, with minimal plastic deformation (ref. 1). A simplified diagram of the developed process is given in the following figure. A single pass of a smooth, free-rolling spherical ball under a normal force deforms the surface of the material in tension, creating a compressive layer of residual stress. The ball is supported in a fluid with sufficient pressure to lift the ball off the surface of the retaining spherical socket. The ball is only in mechanical contact with the surface of the material being burnished and is free to roll on the surface. This apparatus is designed to be mounted in the conventional lathes and vertical mills currently used to machine parts. The process has been successfully applied to nickel-base superalloys by a team from the

  12. Structural analysis methods development for turbine hot section components

    NASA Technical Reports Server (NTRS)

    Thompson, R. L.

    1989-01-01

    The structural analysis technologies and activities of the NASA Lewis Research Center's gas turbine engine Hot Section Technology (HOST) program are summarized. The technologies synergistically developed and validated include: time-varying thermal/mechanical load models; component-specific automated geometric modeling and solution strategy capabilities; advanced inelastic analysis methods; inelastic constitutive models; high-temperature experimental techniques and experiments; and nonlinear structural analysis codes. Features of the program that incorporate the new technologies and their application to hot section component analysis and design are described. Improved and, in some cases, first-time 3-D nonlinear structural analyses of hot section components of isotropic and anisotropic nickel-base superalloys are presented.

  13. Development of A High Throughput Method Incorporating Traditional Analytical Devices

    PubMed Central

    White, C. C.; Embree, E.; Byrd, W. E; Patel, A. R.

    2004-01-01

    A high-throughput system and a companion informatics system have been developed and implemented. High throughput is defined as the ability to autonomously evaluate large numbers of samples, while an informatics system provides the software control of the physical devices, in addition to the organization and storage of the generated electronic data. This high-throughput system includes both an ultraviolet-visible spectrometer (UV-Vis) and a Fourier transform infrared spectrometer (FTIR) integrated with a multi-sample positioning table. This method is designed to quantify changes in polymeric materials occurring from controlled temperature, humidity and high-flux UV exposures. The integration of the software control of these analytical instruments within a single computer system is presented. Challenges in enhancing the system to include additional analytical devices are discussed. PMID:27366626

  14. Workshop Targets Development of Geodetic Transient Detection Methods

    NASA Astrophysics Data System (ADS)

    Murray-Moraleda, Jessica R.; Lohman, Rowena

    2010-02-01

    2009 SCEC Annual Meeting: Workshop on Transient Anomalous Strain Detection; Palm Springs, California, 12-13 September 2009; The Southern California Earthquake Center (SCEC) is a community of researchers at institutions worldwide working to improve understanding of earthquakes and mitigate earthquake risk. One of SCEC's priority objectives is to “develop a geodetic network processing system that will detect anomalous strain transients.” Given the growing number of continuously recording geodetic networks consisting of hundreds of stations, an automated means for systematically searching data for transient signals, especially in near real time, is critical for network operations, hazard monitoring, and event response. The SCEC Transient Detection Test Exercise began in 2008 to foster an active community of researchers working on this problem, explore promising methods, and combine effective approaches in novel ways. A workshop was held in California to assess what has been learned thus far and discuss areas of focus as the project moves forward.

  15. Development of the moments method for neutron gauging

    NASA Astrophysics Data System (ADS)

    Ingman, D.; Taviv, E.

    1981-11-01

    In the present investigation a new neutron moisture probe methodology, based on measurements of the spatial moments of the slow-neutron flux, is developed. Within the framework of the present work, calibration curves for low-order moments were calculated, and recursive relations for high-order moments were obtained on the basis of a P-1 approximation and diffusion theory. Neutron flux distributions obtained from a semiempirical method [5], three-group diffusion theory and age theory were investigated for the moment calculations. It is shown that the spatial moments of the neutron flux could serve as a basis for measurements of the volume-weighted moisture and the content of strong neutron absorbers in the medium.
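
    For reference, the spatial moments of a spherically symmetric slow-neutron flux Φ(r) about the source are often defined along the following lines (a generic definition; the normalization used by the authors may differ):

```latex
M_n \;=\; \int_0^{\infty} r^{\,n}\, \Phi(r)\, 4\pi r^{2}\, dr
```

    Ratios such as M_2/M_0 characterize the mean-square distance neutrons travel before detection, a quantity that shrinks as the hydrogen (moisture) content of the medium increases, which is why calibration curves can be built from low-order moments.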

  16. Development of a nondestructive evaluation method for FRP bridge decks

    NASA Astrophysics Data System (ADS)

    Brown, Jeff; Fox, Terra

    2010-05-01

    Open steel grids are typically used on bridges to minimize the weight of the bridge deck and wearing surface. These grids, however, require frequent maintenance and exhibit other durability concerns related to fatigue cracking and corrosion. Bridge decks constructed from composite materials, such as a Fiber-reinforced Polymer (FRP), are strong and lightweight; they also offer improved rideability, reduced noise levels, less maintenance, and are relatively easy to install compared to steel grids. This research is aimed at developing an inspection protocol for FRP bridge decks using Infrared thermography. The finite element method was used to simulate the heat transfer process and determine optimal heating and data acquisition parameters that will be used to inspect FRP bridge decks in the field. It was demonstrated that thermal imaging could successfully identify features of the FRP bridge deck to depths of 1.7 cm using a phase analysis process.

  17. Development of Porosity Measurement Method in Shale Gas Reservoir Rock

    NASA Astrophysics Data System (ADS)

    Siswandani, Alita; Nurhandoko, BagusEndar B.

    2016-08-01

    The pore scales have impacts on transport mechanisms in shale gas reservoirs. In this research, a digital helium porosity meter is used for porosity measurement under realistic conditions. Accordingly, it is necessary to obtain a good approximation of the gas-filled porosity. Shale typically has an effective porosity that changes as a function of time. Effective porosity values for three different shale rocks are analyzed with the proposed measurement. We develop a new measurement method for characterizing porosity phenomena in shale gas as a function of time by measuring porosity at minute-scale intervals using the digital helium porosity meter. The porosities of shale rock measured in this experiment are the free-gas and adsorbed-gas porosities. The pressure change over time shows that the porosity of shale comprises at least two types: macro-scale porosity (fracture porosity) and fine-scale porosity (nano-scale porosity). We present estimates of effective porosity values based on the Boyle-Gay-Lussac approximation and the Van der Waals approximation.
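
    The Boyle's-law arithmetic behind helium (gas-expansion) porosimetry can be sketched as follows; the function name and the numbers are hypothetical and merely stand in for the instrument readings described above.

```python
# Hypothetical sketch of Boyle's-law (gas expansion) porosimetry arithmetic.
# A reference cell of volume V_ref at pressure P1 is opened into a sample cell
# of volume V_cell containing a core plug of bulk volume V_bulk; the equilibrium
# pressure P2 yields the grain volume and hence the porosity.

def helium_porosity(P1, P2, V_ref, V_cell, V_bulk):
    """Return fractional porosity from a single helium expansion (ideal gas assumed)."""
    # Isothermal ideal-gas balance: P1*V_ref = P2*(V_ref + V_cell - V_grain)
    V_grain = V_ref + V_cell - (P1 * V_ref) / P2
    V_pore = V_bulk - V_grain
    return V_pore / V_bulk

# Example with made-up numbers (pressures in kPa, volumes in cm^3):
print(round(helium_porosity(P1=700.0, P2=500.0, V_ref=50.0, V_cell=60.0, V_bulk=45.0), 3))
```

    Repeating such a measurement over minutes, as the authors do, would show the apparent pore volume growing as helium slowly reaches the nano-scale porosity.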

  18. Causal inference methods to study nonrandomized, preexisting development interventions

    PubMed Central

    Arnold, Benjamin F.; Khush, Ranjiv S.; Ramaswamy, Padmavathi; London, Alicia G.; Rajkumar, Paramasivan; Ramaprabha, Prabhakar; Durairaj, Natesan; Hubbard, Alan E.; Balakrishnan, Kalpana; Colford, John M.

    2010-01-01

    Empirical measurement of interventions to address significant global health and development problems is necessary to ensure that resources are applied appropriately. Such intervention programs are often deployed at the group or community level. The gold standard design to measure the effectiveness of community-level interventions is the community-randomized trial, but the conditions of these trials often make it difficult to assess their external validity and sustainability. The sheer number of community interventions, relative to randomized studies, speaks to a need for rigorous observational methods to measure their impact. In this article, we use the potential outcomes model for causal inference to motivate a matched cohort design to study the impact and sustainability of nonrandomized, preexisting interventions. We illustrate the method using a sanitation mobilization, water supply, and hygiene intervention in rural India. In a matched sample of 25 villages, we enrolled 1,284 children <5 y old and measured outcomes over 12 mo. Although we found a 33 percentage point difference in new toilet construction [95% confidence interval (CI) = 28%, 39%], we found no impacts on height-for-age Z scores (adjusted difference = 0.01, 95% CI = −0.15, 0.19) or diarrhea (adjusted longitudinal prevalence difference = 0.003, 95% CI = −0.001, 0.008) among children <5 y old. This study demonstrates that matched cohort designs can estimate impacts from nonrandomized, preexisting interventions that are used widely in development efforts. Interpreting the impacts as causal, however, requires stronger assumptions than prospective, randomized studies. PMID:21149699

  19. Development of imaging methods to assess adiposity and metabolism.

    PubMed

    Heymsfield, S B

    2008-12-01

    Body composition studies were first recorded around the time of the renaissance, and advances by the mid-twentieth century facilitated growth in the study of physiology, metabolism and pathological states. The field developed during this early period around the 'two-compartment' molecular level model that partitions body weight into fat and fat-free mass. Limited use was also made of X-rays as a means of estimating fat-layer thickness, but the revolutionary advance was brought about by the introduction of three-dimensional images provided by computed tomography (CT) in the mid 1970s, followed soon thereafter by magnetic resonance imaging (MRI). Complete in vivo reconstruction of all major anatomic body compartments and tissues became possible, thus providing major new research opportunities. This imaging revolution has continued to advance with further methodology refinements including functional MRI, diffusion tensor imaging and combined methods such as positron emission tomography+CT or MRI. The scientific advances made possible by these new and innovative methods continue to unfold today and hold enormous promise for the future of obesity research.

  20. Recent developments in optical detection methods for microchip separations.

    PubMed

    Götz, Sebastian; Karst, Uwe

    2007-01-01

    This paper summarizes the features and performances of optical detection systems currently applied in order to monitor separations on microchip devices. Fluorescence detection, which delivers very high sensitivity and selectivity, is still the most widely applied method of detection. Instruments utilizing laser-induced fluorescence (LIF) and lamp-based fluorescence along with recent applications of light-emitting diodes (LED) as excitation sources are also covered in this paper. Since chemiluminescence detection can be achieved using extremely simple devices which no longer require light sources and optical components for focusing and collimation, interesting approaches based on this technique are presented, too. Although UV/vis absorbance is a detection method that is commonly used in standard desktop electrophoresis and liquid chromatography instruments, it has not yet reached the same level of popularity for microchip applications. Current applications of UV/vis absorbance detection to microchip separations and innovative approaches that increase sensitivity are described. This article, which contains 85 references, focuses on developments and applications published within the last three years, points out exciting new approaches, and provides future perspectives on this field.

  1. Assessing methods for developing crop forecasting in the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Ines, A. V. M.; Capa Morocho, M. I.; Baethgen, W.; Rodriguez-Fonseca, B.; Han, E.; Ruiz Ramos, M.

    2015-12-01

    Seasonal climate prediction may allow crop yields to be predicted in order to reduce the vulnerability of agricultural production to climate variability and its extremes. It has already been demonstrated that seasonal climate predictions at the European (or Iberian) scale from ensembles of global coupled climate models have some skill (Palmer et al., 2004). The limited predictability that the atmosphere exhibits in mid-latitudes, and therefore over the Iberian Peninsula (IP), can be managed by a probabilistic approach based on terciles. This study presents an application for the IP of two methods for linking tercile-based seasonal climate forecasts with crop models to improve crop predictability. Two methods were evaluated and applied for disaggregating seasonal rainfall forecasts into daily weather realizations: 1) a stochastic weather generator and 2) a forecast tercile resampler. Both methods were evaluated in a case study where the impacts of two seasonal rainfall forecasts (wet and dry forecasts for 1998 and 2015, respectively) on rainfed wheat yield and irrigation requirements of maize in the IP were analyzed. Simulated wheat yield and irrigation requirements of maize were computed with the crop models CERES-wheat and CERES-maize, which are included in the Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at several locations in Spain where the crop model was calibrated and validated with independent field data. These methodologies would allow the benefits and risks of a seasonal climate forecast to be quantified for potential users such as farmers, agroindustry and insurance companies in the IP. Therefore, we would be able to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse ones. References: Palmer, T. et al., 2004. Development of a European multimodel ensemble system for seasonal-to-interannual prediction (DEMETER). Bulletin of the
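
    A forecast tercile resampler of the kind mentioned above can be sketched as follows; the array names and synthetic data are assumptions, not the authors' implementation. Historical seasons are drawn with frequencies matching the forecast tercile probabilities, and each drawn season's daily weather would then drive a DSSAT/CERES run.

```python
import numpy as np

rng = np.random.default_rng(42)

def tercile_resample(hist_years, hist_seasonal_rain, tercile_probs, n_realizations=100):
    """Resample historical years so their frequency matches forecast tercile probabilities.

    hist_years         -- years with daily weather archives available
    hist_seasonal_rain -- seasonal rainfall total for each year (same order)
    tercile_probs      -- (p_below, p_normal, p_above), summing to 1
    """
    low, high = np.percentile(hist_seasonal_rain, [100 / 3, 200 / 3])
    categories = np.digitize(hist_seasonal_rain, [low, high])  # 0=below, 1=normal, 2=above
    # Probability of drawing a year = its tercile's probability shared within that tercile
    weights = np.array([tercile_probs[c] / np.sum(categories == c) for c in categories])
    weights /= weights.sum()
    return rng.choice(hist_years, size=n_realizations, p=weights)

years = np.arange(1981, 2011)
rain = rng.gamma(shape=4.0, scale=50.0, size=years.size)        # synthetic seasonal totals
sampled = tercile_resample(years, rain, tercile_probs=(0.5, 0.35, 0.15))
print(sampled[:10])   # each sampled year's daily weather file would feed the crop model
```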

  2. Titanium matrix composite thermomechanical fatigue analysis method development

    NASA Astrophysics Data System (ADS)

    Ball, Dale Leray

    1998-12-01

    The results of complementary experimental and analytical investigations of thermomechanical fatigue of both unidirectional and crossply titanium matrix composite laminates are presented. Experimental results are given for both isothermal and thermomechanical fatigue tests which were based on simple, constant-amplitude mechanical and thermal loading profiles. The discussion of analytical methods includes the development of titanium matrix composite laminate constitutive relationships, the development of damage models and the integration of both into a thermomechanical fatigue analysis algorithm. The technical approach begins with a micro-mechanical formulation of lamina response. Material behavior at the ply level is based on a mechanics-of-materials approach using thermo-elastic fibers and a thermo-elasto-viscoplastic matrix. The effects of several types of distributed damage are included in the material constitutive relationships at the ply level in the manner of continuum damage mechanics. The modified ply constitutive relationships are then used in an otherwise unmodified classical lamination theory treatment of laminate response. Finally, simple models for damage progression are utilized in an analytical framework which recalculates response and increments damage sizes at every load point in an applied thermal/mechanical load history. The model is used for the prediction of isothermal fatigue and thermomechanical fatigue life of unnotched, unidirectional [0°]_4 and crossply [0°/90°]_s titanium matrix composite laminates. The results of corresponding isothermal and thermomechanical fatigue tests are presented in detail and the correlation between experimental and analytical results is established in certain cases.

  3. Developing sub-domain verification methods based on GIS tools

    NASA Astrophysics Data System (ADS)

    Smith, J. A.; Foley, T. A.; Raby, J. W.

    2014-12-01

    The meteorological community makes extensive use of the Model Evaluation Tools (MET), developed by the National Center for Atmospheric Research, for numerical weather prediction model verification through grid-to-point, grid-to-grid and object-based domain-level analyses. MET Grid-Stat has been used to perform grid-to-grid neighborhood verification to account for the uncertainty inherent in high-resolution forecasting, and the MET Method for Object-based Diagnostic Evaluation (MODE) has been used to develop techniques for object-based spatial verification of high-resolution forecast grids for continuous meteorological variables. High-resolution modeling requires more focused spatial and temporal verification over parts of the domain. With a Geographical Information System (GIS), researchers can now consider terrain type/slope, land-use effects and other spatial and temporal variables as explanatory metrics in model assessments. GIS techniques, when coupled with high-resolution point and gridded observation sets, allow location-based approaches that permit discovery of the spatial and temporal scales where models do not sufficiently resolve the desired phenomena. In this paper we discuss our initial GIS approach to verify WRF-ARW with a one-kilometer horizontal resolution inner domain centered over Southern California. Southern California contains a mixture of urban, suburban, agricultural and mountainous terrain types, along with a rich array of observational data with which to illustrate our ability to conduct sub-domain verification.

  4. Various methods and developments for calibrating seismological sensors at EOST

    NASA Astrophysics Data System (ADS)

    JUND, H.; Bès de Berc, M.; Thore, J.

    2013-12-01

    Calibrating seismic sensors is crucial for knowing the quality of the sensor and generating precise dataless files. We present here three calibration methods that we have developed for the short-period and broadband sensors included in the temporary and permanent seismic networks in France. First, in the case of a short-period sensor with no electronics and no calibration coil, we inject a sine-wave signal into the signal coil. After locking the sensor mass, we connect a signal generator and a series resistor to the coil, and a sinusoidal signal is sent to the sensor signal coil. Both the voltage across the resistor, which gives an image of the current entering the signal coil, and the voltage across the signal coil are measured. The generator frequency is then varied in order to find a phase shift of π/2 between the two signals; at that point the generator frequency corresponds to the natural frequency of the sensor. Second, in the case of all types of sensors provided with a calibration coil, we inject different signals into the calibration coil. We usually apply two signals: a step signal and a sweep (or wobble) signal. A step signal into the calibration coil is equivalent to a Dirac excitation in derived acceleration. The response to this Dirac gives the transfer function of the signal coil, derived two times and without absolute gain. We developed a field module that allows us to always apply the same excitation to various models of seismometers, in order to compare the results from several instruments previously installed in the field. A wobble signal is a signal whose frequency varies. By varying the frequency of the input signal around the sensor's natural frequency, we obtain an immediate response of the sensor in acceleration. This method is particularly suitable for avoiding disturbances which may modify the signal of a permanent station. Finally, for the determination of absolute
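
    A minimal numerical sketch of the π/2 phase-shift criterion, assuming an idealized damped harmonic oscillator rather than the EOST field module: sweep the drive frequency and locate where the measured signals are in quadrature, which occurs at the natural frequency.

```python
import numpy as np

def phase_lag(f_drive, f0, damping):
    """Phase lag (rad) of a damped harmonic oscillator driven at f_drive."""
    r = f_drive / f0
    return np.arctan2(2.0 * damping * r, 1.0 - r**2)

f0_true, damping = 1.0, 0.7          # assumed 1 Hz geophone, ~0.7 of critical damping
freqs = np.linspace(0.2, 3.0, 2801)  # swept generator frequencies (Hz)
lags = phase_lag(freqs, f0_true, damping)

# The natural frequency is where the lag crosses pi/2 (the two voltages in quadrature).
f0_est = freqs[np.argmin(np.abs(lags - np.pi / 2))]
print(f"estimated natural frequency: {f0_est:.3f} Hz")
```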

  5. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher-order equations representing the design space.
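
    A minimal sketch of the regression step described above, with invented design points and performance numbers rather than the NASA design system: a quadratic response surface is fitted to a handful of designed runs and then stands in for the expensive propulsion analysis during optimization.

```python
import numpy as np

# Hypothetical two-level-plus-center design over two flowpath variables (coded -1..+1)
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1], [0, 0], [0.5, -0.5], [-0.5, 0.5]])
# Pretend these are performance results from the propulsion code at each design point
y = np.array([310.0, 322.0, 318.0, 327.0, 321.0, 320.5, 319.0])

def quadratic_features(X):
    """Second-order polynomial features: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

coeffs, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)

# The fitted polynomial now acts as a cheap surrogate for the propulsion analysis:
query = np.array([[0.3, 0.8]])
print("predicted performance:", quadratic_features(query) @ coeffs)
```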

  6. Development of Stable Solidification Method for Insoluble Ferrocyanides-13170

    SciTech Connect

    Ikarashi, Yuki; Masud, Rana Syed; Mimura, Hitoshi; Ishizaki, Eiji; Matsukura, Minoru

    2013-07-01

    The development of a stable solidification method for insoluble ferrocyanide sludge is an important subject for safe decontamination at Fukushima NPP-1. By using the excellent immobilizing properties of zeolites, such as gas-trapping ability and self-sintering properties, the stable solidification of insoluble ferrocyanides was accomplished. The immobilization ratio of Cs for K{sub 2}[CoFe(CN){sub 6}].nH{sub 2}O saturated with Cs{sup +} ions (Cs{sub 2}[CoFe(CN){sub 6}].nH{sub 2}O) was estimated to be less than 0.1% above 1,000 deg. C; the adsorbed Cs{sup +} ions are completely volatilized. In contrast, a novel stable solid form was produced by the press-sintering of a mixture of Cs{sub 2}[CoFe(CN){sub 6}].nH{sub 2}O and zeolites at the higher temperatures of 1,000 deg. C and 1,100 deg. C; Cs volatilization and cyanide release were completely suppressed. The immobilization ratio of Cs, under the mixing conditions of Cs{sub 2}[CoFe(CN){sub 6}].nH{sub 2}O:CP = 1:1 and a calcining temperature of 1,000 deg. C, was estimated to be nearly 100%. As for the kinds of zeolites, natural mordenite (NM), clinoptilolite (CP) and chabazite tended to have higher immobilization ratios compared to zeolite A. This may be due to the difference in phase transformation between the natural zeolites and synthetic zeolite A. In the case of the composites (K{sub 2-X}Ni{sub X/2}[NiFe(CN){sub 6}].nH{sub 2}O loaded natural mordenite), a relatively high immobilization ratio of Cs was also obtained. This method using zeolite matrices can be applied to the stable solidification of solid wastes of insoluble ferrocyanide sludge. (authors)

  7. Development of Combinatorial Methods for Alloy Design and Optimization

    SciTech Connect

    Pharr, George M.; George, Easo P.; Santella, Michael L

    2005-07-01

    The primary goal of this research was to develop a comprehensive methodology for designing and optimizing metallic alloys by combinatorial principles. Because conventional techniques for alloy preparation are unavoidably restrictive in the range of alloy composition that can be examined, combinatorial methods promise to significantly reduce the time, energy, and expense needed for alloy design. Combinatorial methods can be developed not only to optimize existing alloys, but to explore and develop new ones as well. The scientific approach involved fabricating an alloy specimen with a continuous distribution of binary and ternary alloy compositions across its surface--an 'alloy library'--and then using spatially resolved probing techniques to characterize its structure, composition, and relevant properties. The three specific objectives of the project were: (1) to devise means by which simple test specimens with a library of alloy compositions spanning the range of interest can be produced; (2) to assess how well the properties of the combinatorial specimen reproduce those of the conventionally processed alloys; and (3) to devise screening tools which can be used to rapidly assess the important properties of the alloys. As proof of principle, the methodology was applied to the Fe-Ni-Cr ternary alloy system that constitutes many commercially important materials such as stainless steels and the H-series and C-series heat- and corrosion-resistant casting alloys. Three different techniques were developed for making alloy libraries: (1) vapor deposition of discrete thin films on an appropriate substrate and then alloying them together by solid-state diffusion; (2) co-deposition of the alloying elements from three separate magnetron sputtering sources onto an inert substrate; and (3) localized melting of thin films with a focused electron-beam welding system. Each of the techniques was found to have its own advantages and disadvantages. A new and very powerful technique for

  8. Developing improved metamodels by combining phenomenological reasoning with statistical methods

    NASA Astrophysics Data System (ADS)

    Bigelow, James H.; Davis, Paul K.

    2002-07-01

    A metamodel is a relatively small, simple model that approximates the behavior of a large, complex model. A common and superficially attractive way to develop a metamodel is to generate data from a number of large-model runs and then use off-the-shelf statistical methods without attempting to understand the model's internal workings. This paper describes research illuminating why it is important and fruitful, in some problems, to improve the quality of such metamodels by using various types of phenomenological knowledge. The benefits are sometimes mathematically subtle, but strategically important, as when one is dealing with a system that could fail if any of several critical components fail. Naive metamodels may fail to reflect the individual criticality of such components and may therefore be quite misleading if used for policy analysis. Naive metamodeling may also give very misleading results on the relative importance of inputs, thereby skewing resource-allocation decisions. By inserting an appropriate dose of theory, however, such problems can be greatly mitigated. Our work is intended to be a contribution to the emerging understanding of multiresolution, multiperspective modeling (MRMPM), as well as a contribution to interdisciplinary work combining the virtues of statistical methodology with the virtues of more theory-based work. Although the analysis we present is based on a particular experiment with a particular large and complex model, we believe that the insights are more general.
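
    A toy illustration of the point, with made-up data rather than the authors' model: when the true response is limited by the weakest of several critical components, a naive least-squares plane misses the interaction, while a metamodel that keeps the known min() structure captures it.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Large model": overall capability is limited by the weaker of two critical components.
x = rng.uniform(0.0, 1.0, size=(200, 2))
y = np.minimum(x[:, 0], x[:, 1]) + rng.normal(0.0, 0.02, size=len(x))

# Naive metamodel: ordinary least-squares plane fit to the large-model runs.
A = np.column_stack([np.ones(len(x)), x])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
naive_pred = A @ beta

# Phenomenologically informed metamodel: keep the min() structure, fit only a scale factor.
m = np.minimum(x[:, 0], x[:, 1])
scale = np.sum(y * m) / np.sum(m * m)
informed_pred = scale * m

print("naive RMSE   :", np.sqrt(np.mean((naive_pred - y) ** 2)))
print("informed RMSE:", np.sqrt(np.mean((informed_pred - y) ** 2)))
```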

  9. Comparative effectiveness research: Policy context, methods development and research infrastructure.

    PubMed

    Tunis, Sean R; Benner, Joshua; McClellan, Mark

    2010-08-30

    Comparative effectiveness research (CER) has received substantial attention as a potential approach for improving health outcomes while lowering costs of care, and for improving the relevance and quality of clinical and health services research. The Institute of Medicine defines CER as 'the conduct and synthesis of systematic research comparing different interventions and strategies to prevent, diagnose, treat, and monitor health conditions. The purpose of this research is to inform patients, providers, and decision-makers, responding to their expressed needs, about which interventions are most effective for which patients under specific circumstances.' Improving the methods and infrastructure for CER will require sustained attention to the following issues: (1) Meaningful involvement of patients, consumers, clinicians, payers, and policymakers in key phases of CER study design and implementation; (2) Development of methodological 'best practices' for the design of CER studies that reflect decision-maker needs and balance internal validity with relevance, feasibility and timeliness; and (3) Improvements in research infrastructure to enhance the validity and efficiency with which CER studies are implemented. The approach to addressing each of these issues should be informed by the understanding that the primary purpose of CER is to help health care decision makers make informed clinical and health policy decisions.

  10. Development of fatigue life evaluation method using small specimen

    NASA Astrophysics Data System (ADS)

    Nogami, Shuhei; Nishimura, Arata; Wakai, Eichi; Tanigawa, Hiroyasu; Itoh, Takamoto; Hasegawa, Akira

    2013-10-01

    For developing the fatigue life evaluation method using small specimen, the effect of specimen size and shape on the fatigue life of the reduced activation ferritic/martensitic steels (F82H-IEA, F82H-BA07 and JLF-1) was investigated by the fatigue test at room temperature in air using round-bar and hourglass specimens with various specimen sizes (test section diameter: 0.85-10 mm). The round-bar specimen showed no specimen size and no specimen shape effects on the fatigue life, whereas the hourglass specimen showed no specimen size effect and obvious specimen shape effect on it. The shorter fatigue life of the hourglass specimen observed under low strain ranges could be attributed to the shorter micro-crack initiation life induced by the stress concentration dependent on the specimen shape. On the basis of this study, the small round-bar specimen was an acceptable candidate for evaluating the fatigue life using small specimen.

  11. Wavelet Methods Developed to Detect and Control Compressor Stall

    NASA Technical Reports Server (NTRS)

    Le, Dzu K.

    1997-01-01

    A "wavelet" is, by definition, an amplitude-varying, short waveform with a finite bandwidth (e.g., that shown in the first two graphs). Naturally, wavelets are more effective than the sinusoids of Fourier analysis for matching and reconstructing signal features. In wavelet transformation and inversion, all transient or periodic data features (as in compressor-inlet pressures) can be detected and reconstructed by stretching or contracting a single wavelet to generate the matching building blocks. Consequently, wavelet analysis provides many flexible and effective ways to reduce noise and extract signals which surpass classical techniques - making it very attractive for data analysis, modeling, and active control of stall and surge in high-speed turbojet compressors. Therefore, fast and practical wavelet methods are being developed in-house at the NASA Lewis Research Center to assist in these tasks. This includes establishing user-friendly links between some fundamental wavelet analysis ideas and the classical theories (or practices) of system identification, data analysis, and processing.

  12. Approaches to improve development methods for therapeutic cancer vaccines.

    PubMed

    Ogi, Chizuru; Aruga, Atsushi

    2015-04-01

    Therapeutic cancer vaccines are an immunotherapy that amplify or induce an active immune response against tumors. Notably, limitations in the methodology for existing anti-cancer drugs may subsist while applying them to cancer vaccine therapy. A retrospective analysis was performed using information obtained from ClinicalTrials.gov, PubMed, and published articles. Our research evaluated the optimal methodologies for therapeutic cancer vaccines based on (1) patient populations, (2) immune monitoring, (3) tumor response evaluation, and (4) supplementary therapies. Failure to optimize these methodologies at an early phase may impact development at later stages; thus, we have proposed some points to be considered during the early phase. Moreover, we compared our proposal with the guidance for industry issued by the US Food and Drug Administration in October 2011 entitled "Clinical Considerations for Therapeutic Cancer Vaccines". Consequently, while our research was aligned with the guidance, we hope it provides further insights in order to predict the risks and benefits and facilitate decisions for a new technology. We identified the following points for consideration: (1) include in the selection criteria the immunological stage with a prognostic value, which is as important as the tumor stage; (2) select immunological assays such as phenotype analysis of lymphocytes, based on their features and standardize assay methods; (3) utilize optimal response criteria for immunotherapy in therapeutic cancer vaccine trials; and (4) consider supplementary therapies, including immune checkpoint inhibitors, for future therapeutic cancer vaccines.

  13. Development of Probabilistic Methods to Assess Meteotsunami Hazards

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Ten Brink, U. S.

    2014-12-01

    A probabilistic method to assess the hazard from meteotsunamis is developed from both probabilistic tsunami hazard analysis (PTHA) and probabilistic storm-surge forecasting. Meteotsunamis are unusual sea level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation, similar to that used in PTHA, incorporates different meteotsunami sources. A historical record of 116 pressure disturbances recorded between 2000 and 2013 by the U.S. Automated Surface Observing Stations (ASOS) along the U.S. East Coast is used to establish a continuous analytic distribution of each source parameter as well as the overall Poisson rate of occurrence. Initially, atmospheric parameters are considered independently such that the joint probability distribution is given by the product of each marginal distribution. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of pressure disturbances is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a finite-difference hydrodynamic model that solves for the linearized long-wave equations. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using 20 synthetic catalogs of 116 events each, resampled from the parent parameter distributions, yield mean and quantile hazard curves. An example is presented for four Mid-Atlantic sites using ASOS data in which only atmospheric pressure disturbances from squall lines and derechos are considered. Results indicate that site-to-site variations among meteotsunami hazard curves are related to the geometry and width of the adjacent continental shelf. The new hazard analysis of meteotsunamis is important for
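
    A schematic Monte Carlo sketch of the aggregation step; the parameter distributions and the shore-response function below are invented placeholders for the fitted ASOS distributions and the finite-difference long-wave model, so only the workflow, not the numbers, is meaningful.

```python
import numpy as np

rng = np.random.default_rng(7)

rate_per_year = 116 / 14.0          # Poisson rate suggested by ~116 events over 2000-2013
years_simulated = 10_000
n_events = rng.poisson(rate_per_year * years_simulated)

# Placeholder marginal distributions for the pressure-disturbance parameters
dp = rng.lognormal(mean=np.log(1.5), sigma=0.5, size=n_events)       # pressure jump (hPa)
speed = np.clip(rng.normal(25.0, 8.0, size=n_events), 5.0, None)     # translation speed (m/s)

# Placeholder shore response standing in for the hydrodynamic model: amplification
# peaks when the disturbance speed nears an assumed shelf long-wave speed (~22 m/s).
amp = 0.05 * dp / (0.2 + np.abs(speed - 22.0) / 22.0) * rng.lognormal(0.0, 0.3, n_events)

thresholds = np.linspace(0.05, 1.0, 40)                               # event amplitude (m)
annual_rate = [(amp > h).sum() / years_simulated for h in thresholds]
for h, r in zip(thresholds[::10], annual_rate[::10]):
    print(f"amplitude > {h:.2f} m: {r:.3f} /yr")
```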

  14. Retention modeling and method development in hydrophilic interaction chromatography.

    PubMed

    Tyteca, Eva; Périat, Aurélie; Rudaz, Serge; Desmet, Gert; Guillarme, Davy

    2014-04-11

    In the present study, the possibility of retention modeling in the HILIC mode was investigated, testing several different literature relationships over a wide range of analytical conditions (column chemistries and mobile phase pH) and using analytes possessing diverse physico-chemical properties. Furthermore, it was investigated how the retention prediction depends on the number of isocratic or gradient trial or initial scouting runs. The most promising set of scouting runs seems to be a combination of three isocratic runs (95, 90 and 70% ACN) and one gradient run (95 to 65% ACN in 10 min), as the average prediction errors were lower than using six equally spaced isocratic runs and because it is common in method development (MD) to perform at least one scouting gradient run in the screening step to find the best column, temperature and pH conditions. Overall, the retention predictions were much less accurate in HILIC than what is usually experienced in RPLC. This has severe implications for MD, as it restricts the use of commercial software packages that require the simulation of the retention of every peak in the chromatogram. To overcome this problem, the recently proposed predictive elution window shifting and stretching (PEWS(2)) approach can be used. In this computer-assisted MD strategy, only an (approximate) prediction of the retention of the first and the last peak in the chromatogram is required to conduct a well-targeted trial-and-error search, with suggested search conditions uniformly covering the entire possible search and elution space. This strategy was used to optimize the separation of three representative pharmaceutical mixtures possessing diverse physico-chemical properties (pteridins, saccharides and a cocktail of drugs/metabolites). All problems could be successfully handled in less than 2.5 h of instrument time (including equilibration).
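
    A minimal sketch of fitting one simple candidate retention relationship, log k linear in the organic fraction, to three isocratic scouting runs and predicting retention at a new composition; the retention factors are invented. The study's point is precisely that such predictions are far less accurate in HILIC than in RPLC, so this illustrates the workflow rather than its reliability.

```python
import numpy as np

# Isocratic scouting runs: organic fraction (%ACN) and measured retention factors k
phi = np.array([95.0, 90.0, 70.0])
k = np.array([9.8, 6.1, 1.4])                 # invented values for one analyte

# Simple log-linear model: log10(k) = a + b*phi  (one of several candidate relationships)
b, a = np.polyfit(phi, np.log10(k), 1)

def predict_k(phi_new):
    """Predicted retention factor at a new organic fraction."""
    return 10 ** (a + b * phi_new)

print(f"predicted k at 80% ACN: {predict_k(80.0):.2f}")
```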

  15. Leadership Development Expertise: A Mixed-Method Analysis

    ERIC Educational Resources Information Center

    Okpala, Comfort O.; Hopson, Linda B.; Chapman, Bernadine; Fort, Edward

    2011-01-01

    In this study, the impact of graduate curriculum, experience, and standards in the development of leadership expertise were examined. The major goals of the study were to (1) examine the impact of college content curriculum in the development of leadership expertise, (2) examine the impact of on the job experience in the development of leadership…

  16. [Cognitive functions, their development and modern diagnostic methods].

    PubMed

    Klasik, Adam; Janas-Kozik, Małgorzata; Krupka-Matuszczyk, Irena; Augustyniak, Ewa

    2006-01-01

    Cognitive psychology is an interdisciplinary field whose main aim is to study the thinking mechanisms of humans that lead to cognizance. The concept of human cognitive processes therefore encompasses the knowledge of the mechanisms which determine how humans acquire information from the environment and utilize their knowledge and experience. There are three basic processes which need to be distinguished when discussing the development of human perception: acquiring sensations, perceptiveness and attention. Acquiring sensations means the experience arising from the stimulation of a single sense organ, i.e. detection and differentiation of sensory information. Perceptiveness stands for the interpretation of sensations and may include recognition and identification of sensory information. The attention process relates to the selectivity of perception. Higher-order mental processes used in cognition, through which humans try to understand the world and adapt to it, undoubtedly include memory, reasoning, learning and problem solving. There is a great difference in human cognitive functioning at different stages of life (from infancy to adulthood); the difference is both quantitative and qualitative. There are three main approaches to the development of human cognitive functioning: Jean Piaget's approach, the information processing approach and the psychometric approach. Piaget's ideas continue to form the groundwork of child cognitive psychology. Piaget identified four developmental stages of child cognition: 1. Sensorimotor stage (birth - 2 years old); 2. Preoperational stage (ages 2-7); 3. Concrete operations (ages 7-11); 4. Formal operations (ages 11 and up). The supporters of the information processing approach use a computer metaphor to model the functioning of human cognitive processes. The three important mechanisms involved are coding, automation and strategy design, and they often co-operate. This theory has

  17. Development of Infrared Radiation Heating Method for Sustainable Tomato Peeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although lye peeling is the widely industrialized method for producing high quality peeled fruit and vegetable products, the peeling method has resulted in negative impacts by significantly exerting both environmental and economic pressure on the tomato processing industry due to its associated sali...

  18. Adult Education in Development. Methods and Approaches from Changing Societies.

    ERIC Educational Resources Information Center

    McGivney, Veronica; Murray, Frances

    The case studies described in this book provide examples of initiatives illustrating the role of adult education in development and its contribution to the process of change in developing countries. The book is organized in five sections. Case studies in Part 1, "Health Education," illustrate the links between primary health care and…

  19. Developing Scientific Thinking Methods and Applications in Islamic Education

    ERIC Educational Resources Information Center

    Al-Sharaf, Adel

    2013-01-01

    This article traces the early and medieval Islamic scholarship to the development of critical and scientific thinking and how they contributed to the development of an Islamic theory of epistemology and scientific thinking education. The article elucidates how the Qur'an and the Sunna of Prophet Muhammad have also contributed to the…

  20. Sustainable Development Index in Hong Kong: Approach, Method and Findings

    ERIC Educational Resources Information Center

    Tso, Geoffrey K. F.; Yau, Kelvin K. W.; Yang, C. Y.

    2011-01-01

    Sustainable development is a priority area of research in many countries and regions nowadays. This paper illustrates how a multi-stakeholders engagement process can be applied to identify and prioritize the local community's concerns and issues regarding sustainable development in Hong Kong. Ten priority areas covering a wide range of community's…

  1. The change and development of statistical methods used in research articles in child development 1930-2010.

    PubMed

    Køppe, Simo; Dammeyer, Jesper

    2014-09-01

    The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used. During the last 30 years there has been an explosive growth in the use of more advanced statistical methods. Articles using no statistical methods, or only simple methods, have been eliminated.

  2. DEVELOPMENT OF MANUFACTURING METHODS FOR LIGHTWEIGHT METAL FOIL HEAT EXCHANGERS.

    DTIC Science & Technology

    MICROSTRUCTURE, TENSILE PROPERTIES, STRESSES, SPOT WELDS, COATINGS, SILICIDES, OXIDATION, TEST METHODS, PHOSPHORUS ALLOYS, ALUMINUM ALLOYS, NIOBIUM ALLOYS, PRESSURE, THERMAL JOINING, AEROSPACE CRAFT, DIFFUSION, BONDING, VACUUM FURNACES, SOLDERED JOINTS.

  3. Development and implementation of a DFT/MIA method

    NASA Astrophysics Data System (ADS)

    Rousseau, Bart

    2001-11-01

    In the half-century since the advent of the quantum chemical era, computational quantum chemistry methods have matured into valuable tools for researchers in industry and academia alike. However, for these methods to be competitive with, e.g., the high-throughput experimental techniques used in the pharmaceutical industry, they must be not only accurate but also computationally inexpensive. Most contemporary computational methods fulfill only one of these criteria. Therefore the goal of this Ph.D. project was to combine the MIA approach, which allows efficient SCF calculations on large systems, and the DFT method, which takes electron correlation into account at moderate computational cost. The new DFT method thus obtained, the DFT/MIA method, allows for efficient correlated calculations on large systems. The MIA method, an efficient combination of the Multiplicative Integral Approximation and the direct SCF procedure, is implemented in the ab-initio quantum chemical program package BRABO. In the MIA approximation the product of two basis functions is expanded in terms of an auxiliary basis set. This reduces the N^4 four-center two-electron integrals to a sum of N^3 three-center two-electron integrals. In addition this allows for a very fast build-up of the Fock matrix. The MIA approach has already proven its effectiveness in calculations on systems that belong to the largest that have been calculated at this level of theory, such as the calculation on the 46-residue peptide crambin. In addition this method was implemented using the 'Parallel Virtual Machine' (PVM) system, allowing parallel execution on a heterogeneous cluster of workstations. The MIA approach is applied to both the calculation of the electron density and the calculation of the exchange-correlation contribution to the Fock matrix. For both these quantities a recursive procedure is used. Test calculations on water clusters ranging in size from
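
    The integral reduction described above can be written schematically as a density-fitting-style expansion (the notation here is assumed, not taken from the thesis):

```latex
\chi_\mu(\mathbf r)\,\chi_\nu(\mathbf r) \;\approx\; \sum_{k} c_k^{\mu\nu}\, g_k(\mathbf r)
\quad\Longrightarrow\quad
(\mu\nu\,|\,\lambda\sigma) \;\approx\; \sum_{k} c_k^{\mu\nu}\,(g_k\,|\,\lambda\sigma)
```

    so each four-center two-electron integral, of which there are O(N^4), is replaced by a sum over three-center integrals, of which there are only O(N^3), which is what makes the fast Fock-matrix build-up possible.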

  4. Development of Discontinuous Galerkin Method for the Linearized Euler Equations

    DTIC Science & Technology

    2003-02-01

    [The series expansion, Eq. (9), is garbled in the source record.] Since the LEE are linear, Fj(Uh) is expanded in a natural way, as can be seen from Eq. (7). Furthermore, Atkins and Lockard [5] ... Discontinuous Galerkin Method for Hyperbolic Equations, AIAA Journal, Vol. 36, pp. 775-782, 1998. [5] H.L. Atkins and D.P. Lockard, A High-Order Method using

  5. [Current trends in the development of salmonella detection methods].

    PubMed

    Krüger, G

    1989-11-01

    Cultural methods require 4 to 7 days to provide presumptive evidence of Salmonella in foodstuffs. Attempts to shorten this time have resulted in combining pre-enrichment or selective enrichment with time-saving genetic or immunological tests. Enzyme immunoassay procedures for application in Salmonella screening are important. Several commercial Salmonella test kits involve monoclonal antibodies and fluorescent or chemiluminescent substrates. The advantages and disadvantages of rapid EIA methods in particular are discussed.

  6. Pilot-in-the-Loop CFD Method Development

    DTIC Science & Technology

    2014-08-01

    expected later this summer. References: 1. Interpolating Scattered Data – MATLAB & Simulink, Website, http://www.mathworks.com/help/matlab ... converted into a structured grid type. In order to do that, MATLAB has been used to interpolate the scattered data onto a uniform structured grid. The linear interpolation method was used for the data conversion. MATLAB uses the Delaunay triangulation method to draw a triangle that encloses the query point and
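
    A Python/SciPy analogue of the interpolation step described in the excerpt (the report itself uses MATLAB's Delaunay-based machinery; the sample data here are synthetic): scattered samples are interpolated linearly onto a uniform structured grid.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(3)

# Scattered CFD-like samples: random (x, y) points with some field value
pts = rng.uniform(-1.0, 1.0, size=(500, 2))
vals = np.exp(-(pts[:, 0] ** 2 + pts[:, 1] ** 2))

# Uniform structured grid onto which the scattered data are interpolated
xg, yg = np.meshgrid(np.linspace(-1, 1, 101), np.linspace(-1, 1, 101))

# Linear interpolation: SciPy triangulates the scattered points (Delaunay) and
# interpolates within the triangle enclosing each query point, as described above.
field = griddata(pts, vals, (xg, yg), method="linear")
print(field.shape, np.nanmax(field))   # points outside the convex hull are NaN
```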

  7. Development of a Magnetic Attachment Method for Bionic Eye Applications.

    PubMed

    Fox, Kate; Meffin, Hamish; Burns, Owen; Abbott, Carla J; Allen, Penelope J; Opie, Nicholas L; McGowan, Ceara; Yeoh, Jonathan; Ahnood, Arman; Luu, Chi D; Cicione, Rosemary; Saunders, Alexia L; McPhedran, Michelle; Cardamone, Lisa; Villalobos, Joel; Garrett, David J; Nayagam, David A X; Apollo, Nicholas V; Ganesan, Kumaravelu; Shivdasani, Mohit N; Stacey, Alastair; Escudie, Mathilde; Lichter, Samantha; Shepherd, Robert K; Prawer, Steven

    2016-03-01

    Successful visual prostheses require stable, long-term attachment. Epiretinal prostheses, in particular, require attachment methods to fix the prosthesis onto the retina. The most common method is fixation with a retinal tack; however, tacks cause retinal trauma, and surgical proficiency is important to ensure optimal placement of the prosthesis near the macula. Accordingly, alternate attachment methods are required. In this study, we detail a novel method of magnetic attachment for an epiretinal prosthesis using two prostheses components positioned on opposing sides of the retina. The magnetic attachment technique was piloted in a feline animal model (chronic, nonrecovery implantation). We also detail a new method to reliably control the magnet coupling force using heat. It was found that the force exerted upon the tissue that separates the two components could be minimized as the measured force is proportionately smaller at the working distance. We thus detail, for the first time, a surgical method using customized magnets to position and affix an epiretinal prosthesis on the retina. The position of the epiretinal prosthesis is reliable, and its location on the retina is accurately controlled by the placement of a secondary magnet in the suprachoroidal location. The electrode position above the retina is less than 50 microns at the center of the device, although there were pressure points seen at the two edges due to curvature misalignment. The degree of retinal compression found in this study was unacceptably high; nevertheless, the normal structure of the retina remained intact under the electrodes.

  8. Development of an SPE/CE method for analyzing HAAs

    USGS Publications Warehouse

    Zhang, L.; Capel, P.D.; Hozalski, R.M.

    2007-01-01

    The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.

  9. Development of image segmentation methods for intracranial aneurysms.

    PubMed

    Sen, Yuka; Qian, Yi; Avolio, Alberto; Morgan, Michael

    2013-01-01

    Though vascular segmentation provides vital means for the visualization, diagnosis, and quantification that support decision-making in the treatment of vascular pathologies, it remains a process marred by numerous challenges. In this study, we evaluate two existing segmentation methods, the Region Growing Threshold and the Chan-Vese model, on eight aneurysms. These methods were evaluated by comparing their results with a manual segmentation. Based upon this validation study, we propose a new Threshold-Based Level Set (TLS) method in order to overcome the existing problems. Across the divergent methods of segmentation, we found that the volumes of the aneurysm models differed by up to 24%. The local arterial anatomy of the aneurysms was likewise found to significantly influence the results of these simulations. In contrast, the volume differences calculated using the TLS method remained relatively low, at only around 5%, revealing the inherent limitations of existing approaches to cerebrovascular segmentation. The proposed TLS method holds the potential for use in automatic aneurysm segmentation without the setting of a seed point or intensity threshold. This technique will further enable the segmentation of anatomically complex cerebrovascular shapes, thereby allowing for more accurate and efficient simulations of medical imagery.
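
    A minimal sketch of the Region Growing Threshold baseline mentioned above, run on a tiny synthetic image; it is not the authors' TLS implementation, which instead evolves a level set from a threshold-derived initialization.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, threshold):
    """Return a boolean mask grown from `seed` over 4-connected pixels above `threshold`."""
    mask = np.zeros(image.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if mask[r, c] or image[r, c] < threshold:
            continue
        mask[r, c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < image.shape[0] and 0 <= cc < image.shape[1] and not mask[rr, cc]:
                queue.append((rr, cc))
    return mask

# Tiny synthetic "vessel + aneurysm" image: a bright column widening into a blob
img = np.zeros((40, 40))
img[:, 18:22] = 1.0
img[25:35, 12:28] = 1.0
mask = region_grow(img, seed=(0, 19), threshold=0.5)
print("segmented pixels:", int(mask.sum()))
```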

  10. Development of a Matched Runs Method for VERITAS

    NASA Astrophysics Data System (ADS)

    Flinders, Andrew; VERITAS Collaboration

    2016-03-01

    VERITAS is an array of four Imaging Air Cherenkov Telescopes located in southern Arizona. It has been successful in detecting Very High Energy (VHE) radiation from a variety of sources including pulsars, Pulsar Wind Nebulae, Blazars, and High Mass X-Ray Binary systems. Each of these detections has been accomplished using either the standard Ring Background Method or the Reflected Region Method in order to determine the appropriate background for the source region. For highly extended sources (>1 degree) these background estimation methods become unsuitable due to the possibility of source contamination in the background regions. A new method, called the matched background method, has been implemented for potentially highly extended sources observed by VERITAS. It provides an algorithm for identifying a suitable gamma-ray background estimation from a different field of view than the source region. By carefully matching cosmic-ray event rates between the source and the background sky observations, a suitable gamma-ray background matched data set can be identified. We will describe the matched background method and give examples of its use for several sources including the Crab Nebula and IC443. This research is supported by Grants from the U.S. Department of Energy Office of Science, the U.S. National Science Foundation and the Smithsonian Institution, and by NSERC in Canada.
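    The core pairing idea, matching each source (ON) run to an OFF-field run with a similar cosmic-ray event rate, can be sketched as follows. This is an illustrative sketch only, not the VERITAS collaboration's implementation; the run structure, tolerances, and the optional zenith-angle cut are assumptions.

      # Illustrative sketch of the matched-runs pairing idea: for each ON (source)
      # run, pick the OFF-field run whose cosmic-ray event rate (and, optionally,
      # zenith angle) is most similar. Names and tolerances are assumptions.
      def match_runs(on_runs, off_runs, rate_tol=0.03, zenith_tol=5.0):
          """on_runs/off_runs: lists of dicts with 'id', 'cr_rate' (Hz), 'zenith' (deg).
          Returns a list of (on_id, off_id) pairs, with None where no match is found."""
          pairs = []
          for on in on_runs:
              candidates = [off for off in off_runs
                            if abs(off["zenith"] - on["zenith"]) <= zenith_tol]
              if not candidates:
                  pairs.append((on["id"], None))
                  continue
              best = min(candidates,
                         key=lambda off: abs(off["cr_rate"] - on["cr_rate"]) / on["cr_rate"])
              rel_diff = abs(best["cr_rate"] - on["cr_rate"]) / on["cr_rate"]
              pairs.append((on["id"], best["id"] if rel_diff <= rate_tol else None))
          return pairs

      on = [{"id": 1001, "cr_rate": 310.0, "zenith": 20.0}]
      off = [{"id": 2001, "cr_rate": 305.0, "zenith": 22.0},
             {"id": 2002, "cr_rate": 350.0, "zenith": 21.0}]
      print(match_runs(on, off))   # -> [(1001, 2001)]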

  11. Methods and challenges in quantitative imaging biomarker development.

    PubMed

    Abramson, Richard G; Burton, Kirsteen R; Yu, John-Paul J; Scalzetti, Ernest M; Yankeelov, Thomas E; Rosenkrantz, Andrew B; Mendiratta-Lala, Mishal; Bartholmai, Brian J; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M

    2015-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This article, drafted by the Association of University Radiologists Radiology Research Alliance Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field.

  12. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ASTROVIRUS IN WATER

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus. We have developed a sensitive reverse transcription-polymerase ...

  13. The historical development of the magnetic method in exploration

    USGS Publications Warehouse

    Nabighian, M.N.; Grauch, V.J.S.; Hansen, R.O.; LaFehr, T.R.; Li, Y.; Peirce, J.W.; Phillips, J.D.; Ruder, M.E.

    2005-01-01

    The magnetic method, perhaps the oldest of geophysical exploration techniques, blossomed after the advent of airborne surveys in World War II. With improvements in instrumentation, navigation, and platform compensation, it is now possible to map the entire crustal section at a variety of scales, from strongly magnetic basement at regional scale to weakly magnetic sedimentary contacts at local scale. Methods of data filtering, display, and interpretation have also advanced, especially with the availability of low-cost, high-performance personal computers and color raster graphics. The magnetic method is the primary exploration tool in the search for minerals. In other arenas, the magnetic method has evolved from its sole use for mapping basement structure to include a wide range of new applications, such as locating intrasedimentary faults, defining subtle lithologic contacts, mapping salt domes in weakly magnetic sediments, and better defining targets through 3D inversion. These new applications have increased the method's utility in all realms of exploration - in the search for minerals, oil and gas, geothermal resources, and groundwater, and for a variety of other purposes such as natural hazards assessment, mapping impact structures, and engineering and environmental studies. © 2005 Society of Exploration Geophysicists. All rights reserved.

  14. Development of an extraction method for perchlorate in soils.

    PubMed

    Cañas, Jaclyn E; Patel, Rashila; Tian, Kang; Anderson, Todd A

    2006-03-01

    Perchlorate originates as a contaminant in the environment from its use in solid rocket fuels and munitions. The current US EPA methods for perchlorate determination via ion chromatography using conductivity detection do not include recommendations for the extraction of perchlorate from soil. This study evaluated and identified appropriate conditions for the extraction of perchlorate from clay loam, loamy sand, and sandy soils. Based on the results of this evaluation, soils should be extracted in a dry, ground (mortar and pestle) state with Milli-Q water in a 1:1 soil:water ratio and diluted no more than 5-fold before analysis. When sandy soils were extracted in this manner, the calculated method detection limit was 3.5 microg kg(-1). The findings of this study have aided in the establishment of a standardized extraction method for perchlorate in soil.
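    The abstract reports a calculated method detection limit (MDL) of 3.5 microg kg(-1) for sandy soils. A common way to compute an MDL from replicate low-level spikes is the one-sided 99% Student's t value times the standard deviation of the replicates; the sketch below illustrates that generic calculation and is not necessarily the exact procedure the authors used. The spike values are hypothetical.

      # Generic sketch of a replicate-spike method detection limit (MDL)
      # calculation: one-sided 99% Student's t times the standard deviation of
      # low-level spike replicates. Illustration only; the replicate values are
      # made up and do not reproduce the 3.5 microg/kg figure in the abstract.
      import numpy as np
      from scipy import stats

      def mdl(replicates, confidence=0.99):
          reps = np.asarray(replicates, dtype=float)
          s = reps.std(ddof=1)                           # sample standard deviation
          t = stats.t.ppf(confidence, df=len(reps) - 1)  # one-sided Student's t
          return t * s

      # Hypothetical perchlorate recoveries (microg/kg) from seven spiked extracts:
      spikes = [4.1, 3.6, 4.4, 3.9, 4.0, 3.5, 4.2]
      print(f"MDL = {mdl(spikes):.1f} microg/kg")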

  15. Development method of the motor winding's ultrasonic cleaning equipment

    NASA Astrophysics Data System (ADS)

    Jiang, Yingzhan; Wang, Caiyuan; Ao, Chenyang; Zhang, Haipeng

    2013-03-01

    Cleaning motor windings is a difficult problem whose solution requires new technologies such as ultrasonic cleaning. The mechanisms behind two problems were analyzed: the insulation level of the motor winding degrades over time, and the winding becomes damp again soon after conventional processing. The ultrasonic cleaning method was studied and an ultrasonic cleaning device was designed; its safety was verified by a destructive experiment. Tests show that the device thoroughly removes deposited dirt from the winding, providing a new approach to maintaining the insulation level and realizing safe and reliable operation.

  16. New Method for Data Treatment Developed at ESO

    NASA Astrophysics Data System (ADS)

    1996-08-01

    scientific return from the VLT and other telescopes such as the HST best be optimised? It is exactly for this reason that astronomers and engineers at ESO are now busy developing new methods of telescope operation and data analysis alongside the VLT instrumental hardware itself. The new solution by means of models: the appropriate strategy to make progress in the inherent conflict between calibration demand and time available for scientific observations is to obtain a physically correct understanding of the effects exerted on the data by different instruments. In this way, it is possible to decide which calibration data are actually required and on which timescale they have to be updated. One can then use computer models of these instruments to predict calibration solutions which are now valid for the full range of target properties and which handle environmental conditions properly. Such computer models can also be used to simulate observations. This brings a lot of benefits for the entire observational process. First, the astronomer can prepare observations and select instrumental modes and exposure times suited for optimal information return. Secondly, it provides confidence in the validity of the calibration process, and therefore in the cleanliness of the corrected data. Finally, once a theory about the target and its properties has been developed, one may simulate observations of a set of theoretical targets for which the properties are slightly modified in order to study their influence on the raw data. For the observatory there are also advantages. Optimization from the point of view of data analysis can now take place already during instrument design; calibration and data analysis procedures for any observational mode can be tested before real observations are obtained; and the maintenance staff can make sure that the instrument performs as expected and designed. How far have we come along this road? The present project consists of a close collaboration between

  17. Standardized development of computer software. Part 1: Methods

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.

  18. Trust in healthcare settings: Scale development, methods, and preliminary determinants

    PubMed Central

    LoCurto, Jamie; Berg, Gina M

    2016-01-01

    The literature contains research regarding how trust is formed in healthcare settings but rarely discusses trust formation in an emergent care population. A literature review was conducted to determine which of the trust determinants are important for this process as well as how to develop a scale to measure trust. A search generated a total of 155 articles, 65 of which met eligibility criteria. Determinants that were important included the following: honesty, confidentiality, dependability, communication, competency, fiduciary responsibility, fidelity, and agency. The process of developing a scale includes the following: a literature review, qualitative analysis, piloting, and survey validation. Results suggest that physician behaviors are important in influencing trust in patients and should be included in scales measuring trust. Next steps consist of interviewing emergent care patients to commence the process of developing a scale. PMID:27635245

  19. Development of an Analytical Method for Explosive Residues in Soil,

    DTIC Science & Technology

    1987-06-01

    Subject terms: analytical methods; high-performance liquid chromatography; contaminated soils; explosives. Walsh, Marianne E.; United States Army Corps of Engineers, Cold Regions Research and

  20. Development of Fingerprinting Method in Sediment Source Studies

    NASA Astrophysics Data System (ADS)

    Du, Pengfei; Ning, Duihu; Huang, Donghao

    2016-04-01

    Sediment source studies are valuable for watershed sediment budgets, sediment control in channels, soil erosion model validation, and evaluation of the benefits of soil and water conservation. As one of the methods for identifying sediment sources, fingerprinting has proven effective and has been adopted in countries around the world. This paper briefly introduces the fingerprinting method in terms of models, diagnostic sediment properties, applied regions, spatial and temporal scales, and classification of sediment source types. Combined with environmental radionuclides as time markers (such as 137Cs and 210Pb), the method makes it possible to reconstruct the sediment source history. However, some uncertainties remain when applying the fingerprinting technique to sediment-related studies: efficient sampling strategies linking sediment sources and fingerprint properties need to be clarified, detailed methods are needed for spatial scale links (up-scaling and down-scaling), and model calibration needs to be updated to improve estimation precision. (This paper is a contribution to the project of the National Natural Science Foundation of China (No. 41501299), the non-profit project of the Ministry of Water Resources of China (No. 201501045), and the Youth Scientific Research project of the China Institute of Water Resources and Hydropower Research (Using fingerprinting technique to study sediment source in a typical small watershed of black soil region in northeast China).)
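    A typical quantitative step in fingerprinting studies is the un-mixing of a downstream sediment tracer signature into source proportions. The sketch below shows one common formulation, non-negative least squares followed by normalization, with entirely made-up tracer values; it is a generic illustration rather than any specific model reviewed in this paper.

      # Minimal sketch of the un-mixing step common to fingerprinting studies:
      # find non-negative source contributions whose weighted tracer means best
      # reproduce the downstream sediment signature. Generic illustration only;
      # tracer values and source names are made up.
      import numpy as np
      from scipy.optimize import nnls

      # Rows = tracers (e.g., 137Cs, 210Pb_ex, total P), columns = candidate sources
      # (e.g., cultivated topsoil, channel bank, unpaved road).
      source_means = np.array([[12.0,  1.5,  3.0],
                               [45.0, 10.0, 20.0],
                               [800., 300., 500.]])
      sediment = np.array([6.0, 25.0, 520.0])

      # Scale each tracer so no single one dominates the least-squares fit.
      scale = source_means.mean(axis=1)
      A = source_means / scale[:, None]
      b = sediment / scale

      props, residual = nnls(A, b)          # non-negative least squares
      props = props / props.sum()           # normalize to proportions summing to 1
      for name, p in zip(["topsoil", "channel bank", "road"], props):
          print(f"{name}: {p:.2f}")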

  1. DEVELOPMENT OF MOLECULAR METHODS TO DETECT EMERGING VIRUSES

    EPA Science Inventory

    A large number of human enteric viruses are known to cause gastrointestinal illness and waterborne outbreaks. Many of these are emerging viruses that do not grow or grow poorly in cell culture and so molecular detection methods based on the polymerase chain reaction (PCR) are be...

  2. Integrating Methods and Materials: Developing Trainees' Reading Skills.

    ERIC Educational Resources Information Center

    Jarvis, Jennifer

    1987-01-01

    Explores issues arising from a research project which studied ways of meeting the reading needs of trainee primary school teachers (from Malawi and Tanzania) of English as a foreign language. Topics discussed include: the classroom teaching situation; teaching "quality"; and integration of materials and methods. (CB)

  3. Antibody humanization methods for development of therapeutic applications.

    PubMed

    Ahmadzadeh, Vahideh; Farajnia, Safar; Feizi, Mohammad Ali Hosseinpour; Nejad, Ramezan Ali Khavari

    2014-04-01

    Recombinant antibody technologies are rapidly becoming available and showing considerable clinical success. However, the immunogenicity of murine-derived monoclonal antibodies is restrictive in cancer immunotherapy. Humanized antibodies can overcome these problems and are considered to be a promising alternative therapeutic agent. There are several approaches for antibody humanization. In this article we review various methods used in the antibody humanization process.

  4. Developing and Understanding Methods for Large-Scale Nonlinear Optimization

    DTIC Science & Technology

    2006-07-24

    algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with many thousands... Published in peer-reviewed journals: E. Eskow, B. Bader, R. Byrd, S. Crivelli, T. Head-Gordon, V. Lamberti and R. Schnabel, "An optimization approach to the

  5. Development and Application of Agglomerated Multigrid Methods for Complex Geometries

    NASA Technical Reports Server (NTRS)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.

    2010-01-01

    We report progress in the development of agglomerated multigrid techniques for fully unstructured grids in three dimensions, building upon two previous studies focused on efficiently solving a model diffusion equation. We demonstrate a robust fully-coarsened agglomerated multigrid technique for 3D complex geometries, incorporating the following key developments: consistent and stable coarse-grid discretizations, a hierarchical agglomeration scheme, and line-agglomeration/relaxation using prismatic-cell discretizations in the highly-stretched grid regions. A significant speed-up in computer time is demonstrated for a model diffusion problem, the Euler equations, and the Reynolds-averaged Navier-Stokes equations for 3D realistic complex geometries.

  6. Requirements and Methods for Management Development Programmes in the Least Developed Countries in Africa.

    ERIC Educational Resources Information Center

    Perry, Chad

    1993-01-01

    Management development is essential for the economic development of least developed countries (LDCs) in Africa. The collectivist culture of LDCs necessitates development of behavior skills and attitudes and a cyclic, experiential learning approach. (SK)

  7. DEVELOPMENT OF A MOLECULAR METHOD TO DETECT ASTROVIRUS

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...

  8. Classroom Coaching: An Emerging Method of Professional Development.

    ERIC Educational Resources Information Center

    Becker, Joanne Rossi

    This project investigated the efficacy of classroom coaching in improving instruction in elementary mathematics classrooms. The coaches involved in this study were participants in a professional development program. The program includes three major aspects: (1) an intensive 3-week summer institute focusing on mathematics content, pedagogical…

  9. Reflection--A Method for Organisational and Individual Development

    ERIC Educational Resources Information Center

    Randle, Hanne; Tilander, Kristian

    2007-01-01

    This paper presents how organisational development can be the results when politicians, managers, social workers and teaching staff take part in reflection. The results are based on a government-funded initiative in Sweden for lowering sick absenteeism. Three local governments introduced reflection as a strategy to combat work related stress and a…

  10. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ASTROVIRUS IN WATER.

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...

  11. Developing Writing-Reading Abilities through Semiglobal Methods

    ERIC Educational Resources Information Center

    Macri, Cecilia; Bocos, Musata

    2013-01-01

    This research was intended to underline the importance of the semi-global strategies used within thematic projects for developing writing/reading abilities in first-grade pupils. Four different coordinates were chosen as the main variables of this research: the level of phonological awareness, the degree in which writing-reading…

  12. Measurement Development in Reflective Supervision: History, Methods, and Next Steps

    ERIC Educational Resources Information Center

    Tomlin, Angela M.; Heller, Sherryl Scott

    2016-01-01

    This issue of the "ZERO TO THREE" journal provides a snapshot of the current state of measurement of reflective supervision within the infant-family field. In this article, the authors introduce the issue by providing a brief history of the development of reflective supervision in the field of infant mental health, with a specific focus…

  13. Teachers' Perceptions of Edcamp Professional Development: A Q Method Study

    ERIC Educational Resources Information Center

    Brown, Toby

    2015-01-01

    This study described the subjective opinions of teachers about their experiences at Edcamp, an unconference-style form of teacher professional development (PD). Traditional PD has been maligned for being overly expensive and ineffectual in affecting changes in teacher practice. In order to defend teachers' decisions to partake in Edcamp-style PD,…

  14. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY HEPATITIS E VIRUS

    EPA Science Inventory

    Hepatitis E virus (HEV) is a waterborne emerging pathogen that causes significant illness in the developing world. Thus far, an HEV outbreak has not been reported in the U.S., although a swine variant of the virus is common in Midwestern hogs. Because viruses isolated from two ...

  15. Andragogical and Pedagogical Methods for Curriculum and Program Development

    ERIC Educational Resources Information Center

    Wang, Victor C. X., Ed.; Bryan, Valerie C., Ed.

    2014-01-01

    Today's ever-changing learning environment is characterized by the fast pace of technology that drives our society to move forward, and causes our knowledge to increase at an exponential rate. The need for in-depth research that is bound to generate new knowledge about curriculum and program development is becoming ever more relevant.…

  16. Development of a Chronic Toxicity Testing Method for Daphnia pulex

    DTIC Science & Technology

    2015-08-01

    Addition of 4-d-old D. pulex to 50 mL beakers for the remainder of the chronic test to assess survival and reproduction ... A protocol was developed specifically for the use of a D. pulex three-brood chronic toxicity test measuring survival and reproduction as endpoints. This

  17. Developing Principals as Racial Equity Leaders: A Mixed Method Study

    ERIC Educational Resources Information Center

    Raskin, Candace F.; Krull, Melissa; Thatcher, Roberta

    2015-01-01

    This article will present information and research on how a college of education is intentionally developing principals to lead with confidence and racial competence. The nation's student achievement research is sobering: our current school systems widen already existing gaps between white students and students of color, (Darling-Hammond, L. 2004,…

  18. Green methods of lignocellulose pretreatment for biorefinery development.

    PubMed

    Capolupo, Laura; Faraco, Vincenza

    2016-11-01

    Lignocellulosic biomass is the most abundant, low-cost, bio-renewable resource that holds enormous importance as alternative source for production of biofuels and other biochemicals that can be utilized as building blocks for production of new materials. Enzymatic hydrolysis is an essential step involved in the bioconversion of lignocellulose to produce fermentable monosaccharides. However, to allow the enzymatic hydrolysis, a pretreatment step is needed in order to remove the lignin barrier and break down the crystalline structure of cellulose. The present manuscript is dedicated to reviewing the most commonly applied "green" pretreatment processes used in bioconversion of lignocellulosic biomasses within the "biorefinery" concept. In this frame, the effects of different pretreatment methods on lignocellulosic biomass are described along with an in-depth discussion on the benefits and drawbacks of each method, including generation of potentially inhibitory compounds for enzymatic hydrolysis, effect on cellulose digestibility, and generation of compounds toxic for the environment, and energy and economic demand.

  19. Development of an ultrasonic cleaning method for fuel assemblies

    SciTech Connect

    Heki, H.; Komura, S.; Kato, H.; Sakai, H.; Hattori, T.

    1991-01-01

    Almost all radiation buildup in light water reactors is the result of the deposition of activated corrosion and wear products in out-of-core areas. After operation, a significant quantity of corrosion and wear products is deposited on the fuel rods as crud. At refueling shutdowns, these activation products are available for removal. If they can be quickly and easily removed, buildup of radioactivity on out-of-core surfaces and individual exposure dose can be greatly reduced. After studying various physical cleaning methods (e.g., water jet and ultrasonic), the ultrasonic cleaning method was selected as the most effective for fuel assembly cleaning. The ultrasonic cleaning method is especially able to efficiently clean the fuel without removing the channel box. The removed crud in the channel box would be swept out to the filtration unit. Parameter survey tests were carried out to evaluate the optimum conditions for ultrasonic cleaning using a mock-up of a short section of fuel assembly with the channel box. The ultrasonic device used was a 600-W ultrasonic transducer operating at 26-kHz ultrasonic frequency.

  20. The use of expressive methods for developing empathic skills.

    PubMed

    Ozcan, Neslihan Keser; Bilgin, Hülya; Eracar, Nevin

    2011-01-01

    Empathy is one of the fundamental concepts in nursing, and it is an ability that can be learned. Various education models have been tested for improving empathic skills. Research has focused on using oral presentations, videos, modeling, practiced negotiation based on experiences, and psychodrama methods, such as role playing, as ways to improve empathy in participants. This study looked at the use of expressive arts to improve empathic skills of nursing students. The study was conducted with 48 students who were separated into five different groups. The group sessions lasted two hours each, and the groups met for 12 weeks. Expressive art and psychodrama methods were used in the group studies. The Scale of Empathic Skill was administered to participants before and after the group studies. Before the group study took place, the average score for empathic skill was 127.97 (SD = 21.26). After the group study, it increased to 138.87 (SD = 20.40). The average score for empathic skill increased significantly after the group study (t = 3.996, p < .001). Results show that expressive methods are easier, more effective, and more enjoyable processes in nursing training.

  1. Current Development in Elderly Comprehensive Assessment and Research Methods

    PubMed Central

    Jiang, Shantong; Li, Pingping

    2016-01-01

    Comprehensive geriatric assessment (CGA) is a core and an essential part of the comprehensive care of the aging population. CGA uses specific tools to summarize elderly status in several domains that may influence the general health and outcomes of diseases of elderly patients, including assessment of medical, physical, psychological, mental, nutritional, cognitive, social, economic, and environmental status. Here, in this paper, we review different assessment tools used in elderly patients with chronic diseases. The development of comprehensive assessment tools and single assessment tools specially used in a dimension of CGA was discussed. CGA provides substantial insight into the comprehensive management of elderly patients. Developing concise and effective assessment instruments is helpful to carry out CGA widely to create a higher clinical value. PMID:27042661

  2. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  3. Methods Used in Game Development to Foster FLOW

    NASA Technical Reports Server (NTRS)

    Jeppsen, Isaac Ben

    2010-01-01

    Games designed for entertainment have a rich history of providing compelling experiences. From consoles to PCs, games have managed to present intuitive and effective interfaces for a wide range of game styles to successfully allow users to "walk-up-and-play". Once a user is hooked, successful games artfully present challenging experiences just within reach of a user's ability, weaving each task and achievement into a compelling and engaging experience. In this paper, engagement is discussed in terms of the psychological theory of Flow. I argue that engagement should be one of the primary goals when developing a serious game and I discuss the best practices and techniques that have emerged from traditional video game development which help foster the creation of engaging, high Flow experiences.

  4. Development of an improved method of consolidating fatigue life data

    NASA Technical Reports Server (NTRS)

    Leis, B. N.; Sampath, S. G.

    1978-01-01

    A fatigue data consolidation model that incorporates recent advances in life prediction methodology was developed. A combined analytic and experimental study of fatigue of notched 2024-T3 aluminum alloy under constant amplitude loading was carried out. Because few systematic and complete data sets for 2024-T3 were available, the program generated data for fatigue crack initiation and separation failure for both zero and nonzero mean stresses. Consolidations of these data are presented.

  5. Computational Fluid Dynamics. [numerical methods and algorithm development

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This collection of papers was presented at the Computational Fluid Dynamics (CFD) Conference held at Ames Research Center in California on March 12 through 14, 1991. It is an overview of CFD activities at NASA Lewis Research Center. The main thrust of computational work at Lewis is aimed at propulsion systems. Specific issues related to propulsion CFD and associated modeling will also be presented. Examples of results obtained with the most recent algorithm development will also be presented.

  6. Development of panel methods for subsonic analysis and design

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1980-01-01

    Two computer programs, developed for subsonic inviscid analysis and design are described. The first solves arbitrary mixed analysis design problems for multielement airfoils in two dimensional flow. The second calculates the pressure distribution for arbitrary lifting or nonlifting three dimensional configurations. In each program, inviscid flow is modelled by using distributed source doublet singularities on configuration surface panels. Numerical formulations and representative solutions are presented for the programs.

  7. Development of wide area environment accelerator operation and diagnostics method

    NASA Astrophysics Data System (ADS)

    Uchiyama, Akito; Furukawa, Kazuro

    2015-08-01

    Remote operation and diagnostic systems for particle accelerators have been developed for beam operation and maintenance in various situations. Even though fully remote experiments are not necessary, the remote diagnosis and maintenance of the accelerator is required. Considering remote-operation operator interfaces (OPIs), the use of standard protocols such as the hypertext transfer protocol (HTTP) is advantageous, because system-dependent protocols are unnecessary between the remote client and the on-site server. Here, we have developed a client system based on WebSocket, which is a new protocol provided by the Internet Engineering Task Force for Web-based systems, as a next-generation Web-based OPI using the Experimental Physics and Industrial Control System Channel Access protocol. As a result of this implementation, WebSocket-based client systems have become available for remote operation. Also, as regards practical application, the remote operation of an accelerator via a wide area network (WAN) faces a number of challenges, e.g., the accelerator has both experimental device and radiation generator characteristics. Any error in remote control system operation could result in an immediate breakdown. Therefore, we propose the implementation of an operator intervention system for remote accelerator diagnostics and support that can obviate any differences between the local control room and remote locations. Here, remote-operation Web-based OPIs, which resolve security issues, are developed.
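    A minimal sketch of a WebSocket-based remote client is shown below, using the Python websockets package. The gateway URL, JSON message format, and process-variable name are hypothetical placeholders; the paper's actual bridge between WebSocket clients and the EPICS Channel Access protocol is not specified in this abstract.

      # Minimal sketch of a WebSocket-based remote OPI client. The gateway URL,
      # JSON message format, and PV name below are hypothetical; they do not
      # describe the authors' actual WebSocket/Channel Access bridge.
      import asyncio
      import json
      import websockets  # pip install websockets

      GATEWAY = "wss://accelerator-gateway.example.org/pv"   # hypothetical endpoint

      async def monitor_pv(pv_name, n_updates=5):
          async with websockets.connect(GATEWAY) as ws:
              # Ask the (assumed) gateway to subscribe to one process variable.
              await ws.send(json.dumps({"type": "subscribe", "pv": pv_name}))
              for _ in range(n_updates):
                  update = json.loads(await ws.recv())
                  print(f"{update.get('pv')} = {update.get('value')}")

      if __name__ == "__main__":
          asyncio.run(monitor_pv("ACC:BEAM_CURRENT"))  # hypothetical PV name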

  8. Two-Step Camera Calibration Method Developed for Micro UAV'S

    NASA Astrophysics Data System (ADS)

    Gašparović, M.; Gajski, D.

    2016-06-01

    The development of unmanned aerial vehicles (UAVs) and the continuous price reduction of unmanned systems attracted us to this research. Professional measuring systems are dozens of times more expensive and often heavier than "amateur", non-metric UAVs. For this reason, we tested the DJI Phantom 2 Vision Plus UAV. The Phantom's smaller mass and velocity give it less kinetic energy than professional measurement platforms, which makes it potentially less dangerous for use in populated areas. In this research, we wanted to investigate the capability of such a non-metric UAV and find the procedures under which this kind of UAV may be used for photogrammetric surveys. It is important to emphasize that the UAV is equipped with an ultra-wide-angle camera with a 14 MP sensor. Calibration of such cameras is a complex process. In this research, a new two-step process is presented and developed, and the results are compared with the standard one-step camera calibration procedure. The two-step process initially removes distortion from all images and then uses these images in the phototriangulation with self-calibration. The paper presents statistical indicators which prove that the proposed two-step process is a better and more accurate procedure for calibrating these types of cameras than standard one-step calibration. We therefore suggest the two-step calibration process as the standard for ultra-wide-angle cameras on unmanned aircraft.
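    The two-step idea can be approximated with off-the-shelf tools: first estimate and remove lens distortion, then recalibrate on the undistorted images. The OpenCV sketch below follows that pattern under stated assumptions (checkerboard targets, assumed file names and board size); in the paper itself the second step is self-calibration inside the phototriangulation, which a photogrammetric package would perform.

      # Sketch of the two-step idea using OpenCV: step 1 estimates and removes lens
      # distortion; step 2 re-estimates camera parameters from the undistorted
      # images. File names and board size are assumptions; this approximates, but
      # does not reproduce, the paper's self-calibrating phototriangulation step.
      import glob
      import cv2
      import numpy as np

      PATTERN = (9, 6)  # inner corners of an assumed checkerboard
      objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
      objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

      def detect_corners(images):
          obj_pts, img_pts, size = [], [], None
          for fname in images:
              gray = cv2.cvtColor(cv2.imread(fname), cv2.COLOR_BGR2GRAY)
              size = gray.shape[::-1]
              found, corners = cv2.findChessboardCorners(gray, PATTERN)
              if found:
                  obj_pts.append(objp)
                  img_pts.append(corners)
          return obj_pts, img_pts, size

      # Step 1: initial calibration on the raw (distorted) images.
      files = sorted(glob.glob("calib/*.jpg"))
      obj_pts, img_pts, size = detect_corners(files)
      _, K1, dist1, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)

      # Remove the estimated distortion from every image.
      for fname in files:
          cv2.imwrite(fname.replace(".jpg", "_undist.jpg"),
                      cv2.undistort(cv2.imread(fname), K1, dist1))

      # Step 2: calibrate again on the undistorted images; the residual
      # distortion coefficients should now be close to zero.
      undist = sorted(glob.glob("calib/*_undist.jpg"))
      obj_pts2, img_pts2, size2 = detect_corners(undist)
      _, K2, dist2, _, _ = cv2.calibrateCamera(obj_pts2, img_pts2, size2, None, None)
      print("residual distortion coefficients:", dist2.ravel())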

  9. Isentropic Bulk Modulus: Development of a Federal Test Method

    DTIC Science & Technology

    2016-01-01

    Research Institute (SwRI). The current methodology allows the measurement of isentropic bulk modulus via speed of sound and density at temperatures ranging from 30-80 °C and applied pressures of 1,000-18,000 psi. This method has been applied
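    The speed-of-sound/density approach rests on the standard acoustic relation that the isentropic bulk modulus equals density times the square of the speed of sound, K_s = rho * c^2. The sketch below applies that relation with illustrative values that are not taken from the report.

      # The speed-of-sound/density methodology rests on the standard acoustic
      # relation K_s = rho * c^2. The numbers below are illustrative only.
      def isentropic_bulk_modulus(density_kg_m3, speed_of_sound_m_s):
          """Return the isentropic bulk modulus K_s in pascals."""
          return density_kg_m3 * speed_of_sound_m_s ** 2

      rho = 810.0    # kg/m^3, a representative fuel density (illustrative)
      c = 1350.0     # m/s, a representative speed of sound (illustrative)
      ks = isentropic_bulk_modulus(rho, c)
      print(f"K_s = {ks/1e9:.2f} GPa ({ks/6894.76:.0f} psi)")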

  10. Valid methods: the quality assurance of test method development, validation, approval, and transfer for veterinary testing laboratories.

    PubMed

    Wiegers, Ann L

    2003-07-01

    Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required and their use agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.

  11. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have occurred in the process industries. Owing to the calculation complexity, very few software packages, such as SAFETI, can make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and no concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.
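    The post-processing that turns consequence results into individual-risk iso-surfaces reduces, at each grid point, to summing scenario frequency times probability of fatality over all accident scenarios. The sketch below shows that generic aggregation step with synthetic placeholder fields standing in for the CFD consequence results; it is not the authors' code.

      # Generic sketch of the risk-aggregation step behind individual-risk surfaces:
      # at every grid point, individual risk = sum over accident scenarios of
      # (scenario frequency per year) x (probability of fatality at that point).
      # The fatality-probability fields would come from CFD consequence
      # simulations; here they are synthetic placeholders.
      import numpy as np

      nx, ny, nz = 40, 40, 10                      # 3D grid over the plant site
      scenario_freq = [1e-4, 5e-5]                 # events per year (assumed)
      rng = np.random.default_rng(0)
      p_fatality = [rng.random((nx, ny, nz)) * 0.2 for _ in scenario_freq]  # placeholders

      individual_risk = np.zeros((nx, ny, nz))
      for freq, p_fat in zip(scenario_freq, p_fatality):
          individual_risk += freq * p_fat

      # Cells exceeding a 1e-5 per-year criterion would lie inside that iso-surface.
      print("max individual risk [1/yr]:", individual_risk.max())
      print("cells above 1e-5 /yr:", int((individual_risk > 1e-5).sum()))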

  12. Development of evaluation method for software hazard identification techniques

    SciTech Connect

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-07-01

    This research evaluated currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results could change; however, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)

  13. Development of new ash cooling method for atmospheric fluidized beds

    SciTech Connect

    Li Xuantian; Luo Zhongyang; Ni Mingjiang; Cheng Leming; Gao Xiang; Fang Mengxiang; Cen Kefa

    1995-12-31

    The pollution caused by hot ash drained from the bed is another challenge to atmospheric fluidized bed combustion technology when low-rank, high-ash fuels are used. A new technique is developed for ash cooling and utilization of the waste heat of ash. Results from the demonstration of a 1.5 T/H patented device have shown the potential to use this type of ash cooler for drying and secondary air preheating. Bottom ash sized in the range 0-13 mm can be cooled from 1,650 F (900 C) to tolerable temperatures for conveying machinery, and the cooled ash can be re-utilized for cement production.

  14. Development of alternate methods of determining integrated SMR source terms

    SciTech Connect

    Barry, Kenneth

    2014-06-10

    The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December, 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high importance areas where additional research would have a significant impact on source terms. The highest ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design basis and beyond-design-basis accident source terms with concomitantly smaller dose to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry’s goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment. Diffusiophoresis and enhanced

  15. Durability Methods Development. Volume 8. Test and Fractography Data

    DTIC Science & Technology

    1982-11-01

    fractography effort for the program. J. W. Norris developed the computer software for storing and analyzing the fractography data acquired; the remainder of the record consists of tabulated crack-size versus flight data.

  16. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    SciTech Connect

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  17. Development of 3-D Ice Accretion Measurement Method

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Broeren, Andy P.; Addy, Harold E., Jr.; Sills, Robert; Pifer, Ellen M.

    2012-01-01

    A research plan is currently being implemented by NASA to develop and validate the use of a commercial laser scanner to record and archive fully three-dimensional (3-D) ice shapes from an icing wind tunnel. The plan focused specifically upon measuring ice accreted in the NASA Icing Research Tunnel (IRT). The plan was divided into two phases. The first phase was the identification and selection of the laser scanning system and the post-processing software to purchase and develop further. The second phase was the implementation and validation of the selected system through a series of icing and aerodynamic tests. Phase I of the research plan has been completed. It consisted of evaluating several scanning hardware and software systems against an established selection criteria through demonstrations in the IRT. The results of Phase I showed that all of the scanning systems that were evaluated were equally capable of scanning ice shapes. The factors that differentiated the scanners were ease of use and the ability to operate in a wide range of IRT environmental conditions.

  18. Development of geopolitically relevant ranking criteria for geoengineering methods

    NASA Astrophysics Data System (ADS)

    Boyd, Philip W.

    2016-11-01

    A decade has passed since Paul Crutzen published his editorial essay on the potential for stratospheric geoengineering to cool the climate in the Anthropocene. He synthesized the effects of the 1991 Pinatubo eruption on the planet's radiative budget and used this large-scale event to broaden and deepen the debate on the challenges and opportunities of large-scale geoengineering. Pinatubo had pronounced effects, both in the short and longer term (months to years), on the ocean, land, and the atmosphere. This rich set of data on how a large-scale natural event influences many regional and global facets of the Earth System provides a comprehensive viewpoint to assess the wider ramifications of geoengineering. Here, I use the Pinatubo archives to develop a range of geopolitically relevant ranking criteria for a suite of different geoengineering approaches. The criteria focus on the spatial scales needed for geoengineering and whether large-scale dispersal is a necessary requirement for a technique to deliver significant cooling or carbon dioxide reductions. These categories in turn inform whether geoengineering approaches are amenable to participation (the "democracy of geoengineering") and whether they will lead to transboundary issues that could precipitate geopolitical conflicts. The criteria provide the requisite detail to demarcate different geoengineering approaches in the context of geopolitics. Hence, they offer another tool that can be used in the development of a more holistic approach to the debate on geoengineering.

  19. Discontinuity in pastoral development: time to update the method.

    PubMed

    Krätli, S

    2016-11-01

    Most off-the-shelf basic methodological tools currently used in pastoral development (e.g. technical definitions and conventional scales of observation) retain underlying assumptions about stability and uniformity being the norm (i.e. 'equilibrium thinking'). Such assumptions reflect a theoretical framework which had been questioned since the 1970s and was openly disproved in scientific circles during the 1990s, when it was shown to be fundamentally inadequate. Today, lingering equilibrium assumptions in the methodological legacy of pastoral development get in the way of operationalising state-of-the-art understanding of pastoral systems and drylands. Unless these barriers are identified, unpacked and managed, even increasing the rigour and intensity of data collection will not deliver a realistic representation of pastoral systems in statistics and policymaking. This article provides a range of examples of such 'barriers', where equilibrium assumptions persist in the methodology, including classifications of livestock systems, conventional scales of observation, key parameters in animal production, indicators in the measurement of ecological efficiency, and the concepts of 'fragile environment', natural resources, and pastoral risk.

  20. Ranging methods for developing wellbores in subsurface formations

    DOEpatents

    MacDonald, Duncan

    2011-09-06

    A method for forming two or more wellbores in a subsurface formation includes forming a first wellbore in the formation. A second wellbore is directionally drilled in a selected relationship relative to the first wellbore. At least one magnetic field is provided in the second wellbore using one or more magnets in the second wellbore located on a drilling string used to drill the second wellbore. At least one magnetic field is sensed in the first wellbore using at least two sensors in the first wellbore as the magnetic field passes by the at least two sensors while the second wellbore is being drilled. A position of the second wellbore is continuously assessed relative to the first wellbore using the sensed magnetic field. The direction of drilling of the second wellbore is adjusted so that the second wellbore remains in the selected relationship relative to the first wellbore.

  1. Development and testing of improved statistical wind power forecasting methods.

    SciTech Connect

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.

    2011-12-06

    (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood that ramp events will happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
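    As a concrete illustration of the quantile regression benchmark mentioned above, the sketch below fits separate 10%, 50%, and 90% quantile models of wind power against a synthetic wind-speed forecast using scikit-learn's quantile loss. It does not reproduce the report's ITL-trained or kernel density (NW / quantile-copula) estimators; the data and power curve are made up.

      # Minimal sketch of the quantile-regression benchmark for wind power
      # uncertainty forecasting: fit separate models for selected quantiles of
      # power given a (synthetic) wind-speed forecast. Data are made up; this is
      # not the report's ITL-trained or kernel-density estimator.
      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(1)
      wind_speed = rng.uniform(0, 25, size=2000)             # forecast wind speed [m/s]
      power = np.clip((wind_speed / 12.0) ** 3, 0, 1)        # idealized power curve [p.u.]
      power = np.clip(power + rng.normal(0, 0.08, wind_speed.size), 0, 1)  # add error
      X = wind_speed.reshape(-1, 1)

      quantile_models = {}
      for q in (0.1, 0.5, 0.9):
          m = GradientBoostingRegressor(loss="quantile", alpha=q, n_estimators=200)
          quantile_models[q] = m.fit(X, power)

      x_new = np.array([[8.0], [11.0]])
      for q, m in sorted(quantile_models.items()):
          print(f"q{int(q*100):02d}:", np.round(m.predict(x_new), 3))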

  2. [Comparison of sustainable development status in Heilongjiang Province based on traditional ecological footprint method and emergy ecological footprint method].

    PubMed

    Chen, Chun-feng; Wang, Hong-yan; Xiao, Du-ning; Wang, Da-qing

    2008-11-01

    By using the traditional ecological footprint method and its modification, the emergy ecological footprint method, the sustainable development status of Heilongjiang Province in 2005 was analyzed. The results showed that the ecological deficits of Heilongjiang Province in 2005 based on the emergy and conventional ecological footprint methods were 1.919 and 0.6256 hm2 x cap(-1), respectively. The ecological footprint values based on the two methods both exceeded the corresponding carrying capacities, which indicated that the social and economic development of the study area was not sustainable. The emergy ecological footprint method was used to discuss the relationships between human material demand and ecosystem resource supply, and more stable parameters such as emergy transformity and emergy density were introduced into the method, which overcame some of the shortcomings of the conventional ecological footprint method.

  3. Sublimation rates of explosive materials : method development and initial results.

    SciTech Connect

    Phelan, James M.; Patton, Robert Thomas

    2004-08-01

    Vapor detection of explosives continues to be a technological basis for security applications. This study began experimental work to measure the chemical emanation rates of pure explosive materials as a basis for determining emanation rates of security threats containing explosives. Sublimation rates for TNT were determined with thermogravimetric analysis using two different techniques. Data were compared with other literature values to provide sublimation rates from 25 to 70 C. The enthalpy of sublimation for the combined data was found to be 115 kJ/mol, which corresponds well with previously reported data from vapor pressure determinations. A simple Gaussian atmospheric dispersion model was used to estimate downrange concentrations based on continuous, steady-state conditions at 20, 45 and 62 C for a nominal exposed block of TNT under low wind conditions. Recommendations are made for extension of the experimental vapor emanation rate determinations and development of turbulent-flow computational fluid dynamics (CFD)-based atmospheric dispersion estimates of standoff vapor concentrations.
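    Two of the quantitative ingredients mentioned above can be sketched in a few lines: Clausius-Clapeyron temperature scaling of the TNT vapor pressure using the reported 115 kJ/mol enthalpy of sublimation, and a steady-state Gaussian-plume centerline estimate of downwind concentration. The source strength and dispersion coefficients below are illustrative assumptions, not values from the report.

      # Sketch of (1) Clausius-Clapeyron temperature scaling of TNT vapor pressure
      # using the reported 115 kJ/mol enthalpy of sublimation, and (2) a steady-state
      # Gaussian-plume centerline concentration estimate for a ground-level source.
      # Source strength and dispersion coefficients are illustrative assumptions.
      import numpy as np

      R = 8.314            # J/(mol K)
      DH_SUB = 115e3       # J/mol, enthalpy of sublimation reported for TNT

      def vapor_pressure_ratio(t_celsius, t_ref_celsius=25.0):
          """Clausius-Clapeyron ratio p(T)/p(T_ref)."""
          T, Tref = t_celsius + 273.15, t_ref_celsius + 273.15
          return np.exp(-DH_SUB / R * (1.0 / T - 1.0 / Tref))

      def plume_centerline(Q, u, sigma_y, sigma_z):
          """Ground-level centerline concentration of a Gaussian plume with a
          ground-level source: C = Q / (pi * u * sigma_y * sigma_z)."""
          return Q / (np.pi * u * sigma_y * sigma_z)

      print("p(45 C)/p(25 C) =", round(float(vapor_pressure_ratio(45.0)), 1))

      Q = 1e-9          # kg/s emanation rate (illustrative)
      u = 1.0           # m/s wind speed (low-wind case)
      # Illustrative dispersion coefficients at roughly 100 m downwind, stable conditions:
      print("C ~", plume_centerline(Q, u, sigma_y=8.0, sigma_z=4.0), "kg/m^3")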

  4. New Research Methods Developed for Studying Diabetic Foot Ulceration

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Dr. Brian Davis, one of the Cleveland Clinic Foundation's researchers, has been investigating the risk factors related to diabetic foot ulceration, a problem that accounts for 20 percent of all hospital admissions for diabetic patients. He had developed a sensor pad to measure the friction and pressure forces under a person's foot when walking. As part of NASA Lewis Research Center's Space Act Agreement with the Cleveland Clinic Foundation, Dr. Davis requested Lewis' assistance in visualizing the data from the sensor pad. As a result, Lewis' Interactive Data Display System (IDDS) was installed at the Cleveland Clinic. This computer graphics program is normally used to visualize the flow of air through aircraft turbine engines, producing color two- and three-dimensional images.

  5. Development of Methods Precision Length Measurement Using Transported Laser Interferometer

    NASA Astrophysics Data System (ADS)

    Lavrov, E. A.; Epikhin, V. M.; Mazur, M. M.; Suddenok, Y. A.; Shorin, V. N.

    The paper presents the results of a comparison of a newly developed transported laser interferometer (TLI) with a Renishaw XL-80 measurement interferometer over distances of 0-60 meters. Testing of a breadboard model of the TLI showed that the difference between the travel measurements of the two interferometers does not exceed 6 μm. The mean difference of indications between the TLI and the Renishaw travel measurer at a distance of about 58 m is approximately 0.5 μm, and the root-mean-square deviation of the indications of the two interferometers is approximately 3 μm. When the same sections were measured with the TLI and the Renishaw travel measurer on different days, the results for those sections were repeatable.

  6. Collaboration with academia in the development of post ovulatory methods.

    PubMed

    Thaler, G

    1999-12-01

    The 0.75-mg levonorgestrel-containing 'morning after' contraceptive tablet Postinor was developed by Gedeon Richter Ltd., Hungary. The product was first launched in 1979 and registered later in approximately 40 countries. In 1994, the World Health Organization offered the company participation in a multinational clinical trial to prove the superiority of the product over existing (Yuzpe-type) emergency contraceptive products. Based on these data the company was able to redesign the 'morning after' type Postinor into an 'emergency' pill, Postinor-2. During further clinical trials a close working relationship was formed between the Department of Obstetrics and Gynaecology at the Albert Szent-Györgyi Medical University in Szeged, Hungary, and Gedeon Richter. The advantages and challenges of cooperation between public- and private-sector institutions are analyzed in the paper.

  7. Development of metrological NDE methods for microturbine ceramic components

    SciTech Connect

    Lee, H.-R.; Ellingson, W. A.

    1999-12-23

    In this work, X-ray computed tomographic imaging technology with high spatial resolution has been explored for metrological applications to Si{sub 3}N{sub 4} ceramic turbine wheels. X-ray computed tomography (XCT) data were acquired by a charge-coupled device detector coupled to an image intensifier. Cone-beam XCT reconstruction algorithms were used to allow full-volume data acquisition from the turbine wheels. Special software was developed so that edge detection and complex blade contours could be determined from the XCT data. The feasibility of using the XCT for dimensional analyses was compared with that of a coordinate-measuring machine. Details of the XCT system, data acquisition, and dimensional comparisons will be presented.

  8. Development of a fast voltage control method for electrostatic accelerators

    NASA Astrophysics Data System (ADS)

    Lobanov, Nikolai R.; Linardakis, Peter; Tsifakis, Dimitrios

    2014-12-01

    The concept of a novel fast voltage control loop for tandem electrostatic accelerators is described. This control loop utilises high-frequency components of the ion beam current intercepted by the image slits to generate a correction voltage that is applied to the first few gaps of the low- and high-energy acceleration tubes adjoining the high voltage terminal. New techniques for the direct measurement of the transfer function of an ultra-high impedance structure, such as an electrostatic accelerator, have been developed. For the first time, the transfer function for the fast feedback loop has been measured directly. Slow voltage variations are stabilised with a common corona control loop, and the relationship between the transfer functions for the slow and new fast control loops required for optimum operation is discussed. The main source of terminal voltage instabilities, which are due to variation of the charging current caused by mechanical oscillations of charging chains, has been analysed.

  9. The development of episodic foresight: emerging concepts and methods.

    PubMed

    Hudson, Judith A; Mayhew, Estelle M Y; Prabhakar, Janani

    2011-01-01

    Episodic foresight is here defined as the ability to project oneself into the future and mentally simulate situations and outcomes. Tasks used to study the development of episodic foresight in young children are reviewed and compared to tasks used to study other future-oriented abilities (planning, delay of gratification, and prospective memory) in the same age-group. We argue for the importance of accounting for and minimizing the role of other cognitive demands in research tasks. Because episodic foresight is an emerging ability in young children, more research needs to be directed at the contexts in which it emerges and the extent to which episodic foresight is part of a growing ability for mental representation.

  10. Methods and apparatuses for the development of microstructured nuclear fuels

    DOEpatents

    Jarvinen, Gordon D.; Carroll, David W.; Devlin, David J.

    2009-04-21

    Microstructured nuclear fuel adapted for nuclear power system use includes fissile material structures of micrometer-scale dimension dispersed in a matrix material. In one method of production, fissile material particles are processed in a chemical vapor deposition (CVD) fluidized-bed reactor including a gas inlet for providing controlled gas flow into a particle coating chamber, a lower bed hot zone region to contain powder, and an upper bed region to enable powder expansion. At least one pneumatic or electric vibrator is operationally coupled to the particle coating chamber for causing vibration of the particle coater to promote uniform powder coating within the particle coater during fuel processing. An exhaust associated with the particle coating chamber provides a port for placement and removal of particles and powder. During use of the fuel in a nuclear power reactor, fission products escape from the fissile material structures and come to rest in the matrix material. After a period of use in a nuclear power reactor and subsequent cooling, separation of the fissile material from the matrix containing the embedded fission products will provide an efficient partitioning of the bulk of the fissile material from the fission products. The fissile material can be reused by incorporating it into new microstructured fuel. The fission products and matrix material can be incorporated into a waste form for disposal or processed to separate valuable components from the fission products mixture.

  11. Development of fluoroimmunoassay methods for delta-9-tetrahydrocannabinol

    SciTech Connect

    Mason, A.P.

    1986-01-01

    Heterogeneous, competitive, labelled-ligand solid-phase primary antibody fluoroimmunoassay methods for the detection of THC in blood and plasma were proposed, and the required assay components were produced and characterized. These components included polyclonal rabbit antisera and monoclonal antibodies reactive with tetrahydrocannabinols, solid-phase immunoglobulin reagents, a fluoroligand, and protein conjugates of THC for immunization and immunoassay response amplification. The stereoselective rabbit anti-THC antiserum F-444-12 was found to have a high binding titer, a high affinity (K{sub D} = 3.4 x 10{sup -11} M for 5'-iodo-{sup 125}I-{Delta}{sup 2}-THC), and high specificity versus a large number of cannabinoid compounds. Immobilization of the immunoglobulin fraction of the antiserum on hydrophilic polyacrylamide microspheres resulted in only a fourfold increase in K{sub D} and a twofold increase in the concentration of binding sites required for the production of equivalent binding titers. Specificity for small ligands was not affected, but the binding of THC-protein conjugates was reduced in potency. Two monoclonal hybridoma cell lines were produced that secrete monoclonal antibodies which bind the radioligand. The fluoroligand was synthesized from 5'-carboxy-{Delta}{sup 2}-THC and FITC using a diaminoethane linkage structure. While the compound had the fluorescence properties of FITC, it was bound to the antiserum F-444-12 with a cross-reactive potency 1.4x greater than the radioligand and 10x greater than THC.

  12. Development of methods to measure virus inactivation in fresh waters.

    PubMed Central

    Ward, R L; Winston, P E

    1985-01-01

    This study concerns the identification and correction of deficiencies in methods used to measure inactivation rates of enteric viruses seeded into environmental waters. It was found that viable microorganisms in an environmental water sample increased greatly after addition of small amounts of nutrients normally present in the unpurified seed virus preparation. This burst of microbial growth was not observed after seeding the water with purified virus. The use of radioactively labeled poliovirus revealed that high percentages of virus particles, sometimes greater than 99%, were lost through adherence to containers, especially in less turbid waters. This effect was partially overcome by the use of polypropylene containers and by the absence of movement during incubation. Adherence to containers clearly demonstrated the need for labeled viruses to monitor losses in this type of study. Loss of viral infectivity in samples found to occur during freezing was avoided by addition of broth. Finally, microbial contamination of the cell cultures during infectivity assays was overcome by the use of gentamicin and increased concentrations of penicillin, streptomycin, and amphotericin B. PMID:3004328

  13. Ceramic Matrix Composites (CMC) Life Prediction Method Development

    NASA Technical Reports Server (NTRS)

    Levine, Stanley R.; Calomino, Anthony M.; Ellis, John R.; Halbig, Michael C.; Mital, Subodh K.; Murthy, Pappu L.; Opila, Elizabeth J.; Thomas, David J.; Thomas-Ogbuji, Linus U.; Verrilli, Michael J.

    2000-01-01

    Advanced launch systems (e.g., Reusable Launch Vehicle and other Shuttle Class concepts, Rocket-Based Combined Cycle, etc.) and interplanetary vehicles will very likely incorporate fiber reinforced ceramic matrix composites (CMC) in critical propulsion components. The use of CMC is highly desirable to save weight, to improve reuse capability, and to increase performance. CMC candidate applications are mission and cycle dependent and may include turbopump rotors, housings, combustors, nozzle injectors, exit cones or ramps, and throats. For reusable and single mission uses, accurate prediction of life is critical to mission success. The tools to accomplish life prediction are very immature and not oriented toward the behavior of carbon fiber reinforced silicon carbide (C/SiC), the primary system of interest for a variety of space propulsion applications. This paper describes an approach to satisfy the need to develop an integrated life prediction system for CMC that addresses mechanical durability due to cyclic and steady thermomechanical loads, and takes into account the impact of environmental degradation.

  14. DEVELOPMENT OF A METHOD TO QUANTIFY THE IMPACT ...

    EPA Pesticide Factsheets

    Advances in human health risk assessment, especially for contaminants encountered by the inhalation route, have evolved so that the uncertainty factors (UF) used in the extrapolation of non-cancer effects across species (UFA) have been split into the respective pharmacodynamic (PD) and pharmacokinetic (PK) components. Present EPA default values for these components are divided into two half-logs (e.g., 10 to the 0.5 power or 3.16), so that their multiplication yields the 10-fold UF customarily seen in Agency risk assessments as UFA. The state of the science at present does not support a detailed evaluation of species-dependent and human interindividual variance of PD, but more data exist by which PK variance can be examined and quantified both across species and within the human species. Because metabolism accounts for much of the PK variance, we sought to examine the impact that differences in hepatic enzyme content exerts upon risk-relevant PK outcomes among humans. Because of the age and ethnic diversity expressed in the human organ donor population and the wide availability of tissues from these human organ donors, a program was developed to include information from those tissues in characterizing human interindividual PK variance. An Interagency Agreement with CDC/NIOSH Taft Laboratory, a Cooperative Agreement with CIIT Centers for Health Research, and a collaborative agreement with NHEERL/ETD were established to successfully complete the project. The di

  15. PROGRESS ON GENERIC PHASE-FIELD METHOD DEVELOPMENT

    SciTech Connect

    Biner, Bullent; Tonks, Michael; Millett, Paul C.; Li, Yulan; Hu, Shenyang Y.; Gao, Fei; Sun, Xin; Martinez, E.; Anderson, D.

    2012-09-26

    In this report, we summarize our current collaborative efforts, involving three national laboratories: Idaho National Laboratory (INL), Pacific Northwest National Laboratory (PNNL) and Los Alamos National Laboratory (LANL), to develop a computational framework for homogeneous and heterogeneous nucleation mechanisms within the generic phase-field model. During the studies, the Fe-Cr system was chosen as a model system due to its simplicity and the availability of reliable thermodynamic and kinetic data, as well as the range of applications of low-chromium ferritic steels in nuclear reactors. For homogeneous nucleation, the relevant parameters determined from atomistic studies were used directly to determine the energy functional and parameters in the phase-field model. Interfacial energy, critical nucleus size, nucleation rate, and coarsening kinetics were systematically examined in two- and three-dimensional models. For the heterogeneous nucleation mechanism, we studied the nucleation and growth behavior of chromium precipitates in the presence of dislocations. The results demonstrate that both nucleation schemes can be introduced into a phase-field modeling algorithm with the desired accuracy and computational efficiency.
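
    As background for the quantities named above (interfacial energy, critical nucleus size, nucleation rate), the classical-nucleation-theory relations below are often used when parameterizing or benchmarking such nucleation models. They are standard textbook expressions, not formulas taken from the report.

    ```latex
    % Classical nucleation theory (background, not quoted from the report):
    % critical radius and barrier for a spherical nucleus, with interfacial energy \gamma
    % and bulk driving force per unit volume \Delta g_v < 0.
    \[
      r^{*} = \frac{2\gamma}{\lvert \Delta g_v \rvert},
      \qquad
      \Delta G^{*} = \frac{16\pi\gamma^{3}}{3\,\Delta g_v^{2}},
      \qquad
      J = J_{0}\exp\!\left(-\frac{\Delta G^{*}}{k_{B}T}\right).
    \]
    ```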

  16. Development of a Multi-Point Microwave Interferometry (MPMI) Method

    SciTech Connect

    Specht, Paul Elliott; Cooper, Marcia A.; Jilek, Brook Anton

    2015-09-01

    A multi-point microwave interferometer (MPMI) concept was developed for non-invasively tracking a shock, reaction, or detonation front in energetic media. Initially, a single-point, heterodyne microwave interferometry capability was established. The design, construction, and verification of the single-point interferometer provided a knowledge base for the creation of the MPMI concept. The MPMI concept uses an electro-optic (EO) crystal to impart a time-varying phase lag onto a laser at the microwave frequency. Polarization optics converts this phase lag into an amplitude modulation, which is analyzed in a heterodyne interferometer to detect Doppler shifts in the microwave frequency. A version of the MPMI was constructed to experimentally measure the frequency of a microwave source through the EO modulation of a laser. The successful extraction of the microwave frequency proved the underlying physical concept of the MPMI design, and highlighted the challenges associated with the longer microwave wavelength. The frequency measurements made with the current equipment contained too much uncertainty for an accurate velocity measurement. Potential alterations to the current construction are presented to improve the quality of the measured signal and enable multiple accurate velocity measurements.
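
    The velocity extraction referred to above rests on the basic Doppler relation for a reflector moving toward the source, Δf = 2 v f / c (free-space form; inside a dielectric the in-medium wavelength would be used instead). A hedged sketch of that conversion, with illustrative numbers rather than MPMI measurements:

    ```python
    # Hedged sketch of the Doppler relation used in microwave interferometry:
    # a reflector moving toward the source shifts the return frequency by df = 2*v*f/c.
    # Free-space form; numbers are illustrative assumptions, not MPMI data.
    c = 2.998e8        # m/s
    f = 35e9           # Hz, assumed microwave carrier
    df = 1.4e6         # Hz, assumed measured Doppler shift

    v = c * df / (2 * f)
    print(f"inferred front velocity ~ {v:.0f} m/s")
    ```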

  17. Methods for the development of a bioregenerative life support system

    NASA Technical Reports Server (NTRS)

    Goldman, Michelle; Gomez, Shawn; Voorhees, Mike

    1990-01-01

    Presented here is a rudimentary approach to designing a life support system based on the utilization of plants and animals. The biggest stumbling block in the initial phases of developing a bioregenerative life support system is encountered in collecting and consolidating the data. If a database existed for the systems engineer so that he or she may have accurate data and a better understanding of biological systems in engineering terms, then the design process would be simplified. Also addressed is a means of evaluating the subsystems chosen. These subsystems are unified into a common metric, kilograms of mass, and normalized in relation to the throughput of a few basic elements. The initial integration of these subsystems is based on input/output masses and eventually balanced to a point of operation within the inherent performance ranges of the organisms chosen. At this point, it becomes necessary to go beyond the simplifying assumptions of simple mass relationships and further define for each organism the processes used to manipulate the throughput matter. Mainly considered here is the fact that these organisms perform input/output functions on differing timescales, thus establishing the need for buffer volumes or appropriate subsystem phasing. At each point in a systematic design it is necessary to disturb the system and discern its sensitivity to the disturbance. This can be done either through the introduction of a catastrophic failure or by applying a small perturbation to the system. One example is increasing the crew size. Here the wide range of performance characteristics once again shows that biological systems have an inherent advantage in responding to systemic perturbations. Since the design of any space-based system depends on mass, power, and volume requirements, each subsystem must be evaluated in these terms.

  18. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Riley, Tom; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity developed during Fiscal Year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in Probabilistic Risk Assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results regarding the comparison between RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.
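
    The core of a simulation-based margin calculation of this kind is to sample uncertain inputs, run the system simulator, and count how often a safety limit is exceeded. The sketch below is a conceptual stand-in only: the surrogate response function and all numbers are made up for illustration and do not represent RAVEN/RELAP runs or the report's results.

    ```python
    # Conceptual sketch of a simulation-based margin calculation in the RISMC spirit.
    # The surrogate model and all numbers are hypothetical; real analyses use system codes.
    import random

    random.seed(0)
    LIMIT_K = 1477.0  # assumed peak-clad-temperature limit, K

    def surrogate_peak_clad_temp(battery_life_h: float, recovery_time_h: float) -> float:
        # Toy surrogate: temperature rises once DC power is lost before AC recovery.
        overrun = max(0.0, recovery_time_h - battery_life_h)
        return 900.0 + 150.0 * overrun

    n, failures = 10_000, 0
    for _ in range(n):
        battery = random.gauss(6.0, 1.0)        # h, hypothetical battery life
        recovery = random.expovariate(1 / 5.0)  # h, hypothetical AC recovery time
        if surrogate_peak_clad_temp(battery, recovery) > LIMIT_K:
            failures += 1

    print(f"estimated exceedance probability = {failures / n:.4f}")
    ```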

  19. Description and Critique of Quantitative Methods for the Allocation of Exploratory Development Resources

    DTIC Science & Technology

    The paper analyzes ten methods for planning the allocation of resources among projects within the exploratory development category of the Defense research, development, test and evaluation program. Each method is described in terms of a general framework of planning methods and of the factors that influence the allocation of development resources. A comparative analysis is made of the relative strengths and weaknesses of these methods. The more

  20. Analyzing Methods. A Procedural Guide for the Method Specialist. Research & Development Series No. 119-G. Career Planning Support System.

    ERIC Educational Resources Information Center

    Burkhardt, Carolyn M.; And Others

    Information in this brief guide, one of a set of twelve documents describing the Career Planning Support System (CPSS) and its use, is directed to the method specialist (a member of the CPSS steering committee) and provides procedures and a list of questions to aid in analyzing career development methods that may be appropriate for use in career…

  1. 78 FR 22540 - Notice of Public Meeting/Webinar: EPA Method Development Update on Drinking Water Testing Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-16

    ... AGENCY Notice of Public Meeting/Webinar: EPA Method Development Update on Drinking Water Testing Methods...: Notice of public meeting. SUMMARY: The U.S. Environmental Protection Agency (EPA) Office of Ground Water and Drinking Water, Standards and Risk Management Division's Technical Support Center (TSC)...

  2. Development of Laser-Ion Beam Photodissociation Methods

    SciTech Connect

    David H. Russell

    2004-05-11

    stabilized) ions. It is difficult to probe the secondary/tertiary (2o/3o) structure of gas-phase ions using fragmentation chemistry because the energy barriers to inter-conversion of different structural forms lie below the fragmentation threshold; studies of low-internal-energy ions are better suited for this purpose. A major challenge for gas-phase ion research is the design of experimental structural probes that can be used in parallel with computational chemistry, molecular modeling and/or classical structural diagnostic tools to aid interpretation of the experimental data. Our experimental design and selection of research problems is guided by this philosophy. The following section of the progress report focuses on two main areas: (i) technique and instrument development, and (ii) studies of ion structure and ion chemistry.

  3. Development of Laser-Ion Beam Photodissociation Methods

    SciTech Connect

    David H. Russell

    2004-03-31

    stabilized) ions. It is difficult to probe the secondary/tertiary (2o/3o) structure of gas-phase ions using fragmentation chemistry because the energy barriers to inter-conversion of different structural forms lie below the fragmentation threshold; studies of low-internal-energy ions are better suited for this purpose. A major challenge for gas-phase ion research is the design of experimental structural probes that can be used in parallel with computational chemistry, molecular modeling and/or classical structural diagnostic tools to aid interpretation of the experimental data. Our experimental design and selection of research problems is guided by this philosophy. The following section of the progress report focuses on two main areas: (i) technique and instrument development, and (ii) studies of ion structure and ion chemistry.

  4. Development of Continuous-Energy Eigenvalue Sensitivity Coefficient Calculation Methods in the Shift Monte Carlo Code

    SciTech Connect

    Perfetti, Christopher M; Martin, William R; Rearden, Bradley T; Williams, Mark L

    2012-01-01

    Three methods for calculating continuous-energy eigenvalue sensitivity coefficients were developed and implemented into the SHIFT Monte Carlo code within the Scale code package. The methods were used for several simple test problems and were evaluated in terms of speed, accuracy, efficiency, and memory requirements. A promising new method for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was developed and produced accurate sensitivity coefficients with figures of merit that were several orders of magnitude larger than those from existing methods.
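
    The "figure of merit" used to compare Monte Carlo methods such as these is conventionally defined as FOM = 1 / (relative variance × run time), so a larger value means a more efficient method. A minimal illustration with placeholder numbers (not results from the SHIFT/CLUTCH study):

    ```python
    # Conventional Monte Carlo figure of merit: FOM = 1 / (relative_std_err^2 * run_time).
    # Numbers are placeholders, not results from the SHIFT/CLUTCH study.
    def figure_of_merit(rel_std_err: float, run_time_s: float) -> float:
        return 1.0 / (rel_std_err**2 * run_time_s)

    print(figure_of_merit(rel_std_err=0.01,  run_time_s=3600.0))  # reference method (hypothetical)
    print(figure_of_merit(rel_std_err=0.001, run_time_s=3600.0))  # 100x larger FOM at same run time
    ```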

  5. 75 FR 22126 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-27

    .... Environmental Protection Agency, Research Triangle Park, North Carolina 27711. Designation of this new... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods: Designation of One New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of...

  6. METHOD DEVELOPMENT FOR THE DETERMINATION OF FORMALDEHYDE IN SAMPLES OF ENVIRONMENTAL ORIGIN

    EPA Science Inventory

    An analytical method was developed for the determination of formaldehyde in samples of environmental origin. After a review of the current literature, five candidate methods involving chemical derivatization were chosen for evaluation. The five derivatization reagents studied wer...

  7. DEVELOPMENT OF AN ELECTROSPRAY MASS SPECTROMETRIC METHOD FOR DETERMINING PERCHLORATE IN FERTILIZERS

    EPA Science Inventory

    An electrospray mass spectrometric method has been developed for application to agricultural and horticultural fertilizers to determine perchlorate. After fertilizers are leached or dissolved in water, the method relies on the formation of stable ion pair complex of the perchlor...

  8. Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy

    ERIC Educational Resources Information Center

    Olaniran, Bolanle A., Ed.

    2010-01-01

    E-learning has become a significant aspect of training and education in the worldwide information economy as an attempt to create and facilitate a competent global work force. "Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy" provides eclectic accounts of case…

  9. Development of a Method to Investigate Medical Students' Perceptions of Their Personal and Professional Development

    ERIC Educational Resources Information Center

    Lown, Nick; Davies, Ioan; Cordingley, Lis; Bundy, Chris; Braidman, Isobel

    2009-01-01

    Personal and Professional Development (PPD) is now key to the undergraduate medical curriculum and requires provision of appropriate learning experiences. In order to achieve this, it is essential that we ascertain students' perceptions of what is important in their PPD. We required a methodological approach suitable for a large medical school,…

  10. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    SciTech Connect

    Campbell, J.A.; Clauss, S.; Grant, K.; Hoopes, V.; Lerner, B.; Lucke, R.; Mong, G.; Rau, J.; Wahl, K.; Steele, R.

    1994-09-01

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY).

  11. RECENT DEVELOPMENTS IN ANALYTICAL METHODS FOR FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION

    EPA Science Inventory

    The U.S. Environmental Protection Agency has developed a test method for the analysis of fibrous amphibole in vermiculite attic insulation. This method was developed to provide the Agency with monitoring tools to study the occurrence and potential for exposure to fibrous amphibo...

  12. INDOOR AIR EMISSIONS FROM OFFICE EQUIPMENT: TEST METHOD DEVELOPMENT AND POLLUTION PREVENTION OPPORTUNITIES

    EPA Science Inventory

    The report describes the development and evaluation of a large chamber test method for measuring emissions from dry-process photocopiers. The test method was developed in two phases. Phase 1 was a single-laboratory evaluation at Research Triangle Institute (RTI) using four, mid-r...

  13. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    SciTech Connect

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    1994-09-01

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and 222-S laboratory. This report is intended as an annual report, not a completed work.

  14. An Elementary Introduction to Recently Developed Asymptotic Methods and Nanomechanics in Textile Engineering

    NASA Astrophysics Data System (ADS)

    He, Ji-Huan

    This review is an elementary introduction to the concepts of the recently developed asymptotic methods and new developments. Particular attention is paid throughout the paper to giving an intuitive grasp for Lagrange multiplier, calculus of variations, optimization, variational iteration method, parameter-expansion method, exp-function method, homotopy perturbation method, and ancient Chinese mathematics as well. Subsequently, nanomechanics in textile engineering and E-infinity theory in high energy physics, Kleiber's 3/4 law in biology, possible mechanism in spider-spinning process and fractal approach to carbon nanotube are briefly introduced. Bubble-electrospinning for mass production of nanofibers is illustrated. There are in total more than 280 references.

  15. 75 FR 45627 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-03

    ... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods: Designation of One New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of the designation of one new equivalent method for monitoring ambient air quality. SUMMARY: Notice is hereby...

  16. 76 FR 15974 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-22

    ... particulate matter (TSP) (High-Volume Method, 40 CFR Part 50, Appendix B), with a particular extraction and... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods: Designation of Four New Equivalent Methods AGENCY: Environmental Protection Agency. ACTION: Notice of...

  17. RESEARCH ASSOCIATED WITH THE DEVELOPMENT OF EPA METHOD 552.2

    EPA Science Inventory

    The work presented in this paper entails the development of a method for haloacetic acid (HAA) analysis, Environmental Protection Agency (EPA) method 552.2, that improves the safety and efficiency of previous methods and incorporates three additional trihalogenated acetic acids: b...

  18. Development of a nonlinear vortex method. [steady and unsteady aerodynamic loads of highly sweptback wings

    NASA Technical Reports Server (NTRS)

    Kandil, O. A.

    1981-01-01

    Progress is reported in the development of reliable nonlinear vortex methods for predicting the steady and unsteady aerodynamic loads of highly sweptback wings at large angles of attack. Abstracts of the papers, talks, and theses produced through this research are included. The modified nonlinear discrete vortex method and the nonlinear hybrid vortex method are highlighted.

  19. Methods for assessment of innovative medical technologies during early stages of development.

    PubMed

    Bartelmes, Marc; Neumann, Ulrike; Lühmann, Dagmar; Schönermark, Matthias P; Hagen, Anja

    2009-11-05

    Conventional Health Technology Assessment (HTA) is usually conducted at a point in time at which the development of the respective technology may no longer be influenced. By this time developers and/or purchasers may have misinvested resources. Thus the demand becomes apparent for Technology Assessment (TA) which incorporates appropriate methods during early development stages of a technology. Against this health policy background, the present report describes methods for a development-accompanying assessment of innovative medical technologies. Furthermore, international research programmes that set out to identify or apply such methods are outlined. A systematic literature search as well as an extensive manual literature search were carried out in order to obtain literature and information. The largest groups of identified methods consist of assessment concepts, decision support methods, modelling approaches and methods focusing on users and their knowledge. Additionally, several general-purpose concepts have been identified. The identified research programmes INNO-HTA and MATCH (Multidisciplinary-Assessment-of-Technology-Centre-for-Healthcare) are to be seen as pilot projects which so far have not been able to generate final results. MATCH focuses almost entirely on the incorporation of the user perspective regarding the development of non-pharmaceutical technologies, whereas INNO-HTA is basically concerned with the identification and possible advancement of methods for early, socially-oriented technology assessment. Most references offer only very vague descriptions of the respective method, and the application of greatly differing methods seldom exceeds the character of a pilot implementation. Neither a standardisation nor an institutionalisation of development-accompanying assessment can be recognized. It must be noted that there is no single method with which development-accompanying assessment should be carried out. Instead, a technology and

  20. DEVELOPMENT OF ANALYTICAL METHODS FOR DETERMINING SUPPRESSOR CONCENTRATION IN THE MCU NEXT GENERATION SOLVENT (NGS)

    SciTech Connect

    Taylor-Pashow, K.; Fondeur, F.; White, T.; Diprete, D.; Milliken, C.

    2013-07-31

    Savannah River National Laboratory (SRNL) was tasked with identifying and developing at least one, but preferably two methods for quantifying the suppressor in the Next Generation Solvent (NGS) system. The suppressor is a guanidine derivative, N,N',N"-tris(3,7-dimethyloctyl)guanidine (TiDG). A list of 10 possible methods was generated, and screening experiments were performed for 8 of the 10 methods. After completion of the screening experiments, the non-aqueous acid-base titration was determined to be the most promising, and was selected for further development as the primary method. {sup 1}H NMR also showed promising results from the screening experiments, and this method was selected for further development as the secondary method. Other methods, including {sup 36}Cl radiocounting and ion chromatography, also showed promise; however, due to the similarity to the primary method (titration) and the inability to differentiate between TiDG and TOA (tri-n-octylamine) in the blended solvent, {sup 1}H NMR was selected over these methods. Analysis of radioactive samples obtained from real waste ESS (extraction, scrub, strip) testing using the titration method showed good results. Based on these results, the titration method was selected as the method of choice for TiDG measurement. {sup 1}H NMR has been selected as the secondary (back-up) method, and additional work is planned to further develop this method and to verify the method using radioactive samples. Procedures for analyzing radioactive samples of both pure NGS and blended solvent were developed and issued for both methods.
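
    The arithmetic behind an acid-base titration assay for a basic analyte such as a guanidine is simply that moles of titrant at the endpoint equal moles of base. A minimal sketch, with hypothetical volumes and molarity rather than the SRNL procedure's values, assuming 1:1 stoichiometry:

    ```python
    # Sketch of the titration arithmetic: analyte concentration from endpoint volume.
    # Values are hypothetical; they only illustrate the calculation, not the SRNL method.
    def titration_concentration(v_titrant_mL: float, titrant_molarity: float,
                                v_sample_mL: float) -> float:
        """Analyte concentration in mol/L, assuming 1:1 acid:base stoichiometry."""
        moles = v_titrant_mL / 1000.0 * titrant_molarity
        return moles / (v_sample_mL / 1000.0)

    print(f"{titration_concentration(2.35, 0.01, 5.0) * 1000:.2f} mmol/L")  # hypothetical endpoint
    ```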

  1. The Development and Evaluation of Training Methods for Group IV Personnel. 1. Orientation and Implementation of the Training Methods Development School (TMDS).

    ERIC Educational Resources Information Center

    Steinemann, John H.

    The investigation is part of continuing Navy research on the Trainability of Group IV (low ability) personnel intended to maximize the utilization and integration of marginal personnel in the fleet. An experimental Training Methods Development School (TMDS) was initiated to provide an experimental training program, with research controls, for…

  2. Development of a Panel Method for Modeling Configurations with Unsteady Component Motions. Phase 1

    DTIC Science & Technology

    1988-04-15

    significant length scales, the methods rely on the results of existing wake modeling techniques to specify the boundary conditions on their solution... Analytical Methods Report 8801: Development of a Panel Method for Modeling Configurations with Unsteady Component Motions, Phase I Final Report; prepared under SBIR contract DAAL03-87-C-0011 by David R. Clark and Brian Maskew, Analytical Methods Inc., 2133 152nd Avenue N.E., Redmond.

  3. Development of a hybrid deterministic/stochastic method for 1D nuclear reactor kinetics

    SciTech Connect

    Terlizzi, Stefano; Dulla, Sandra; Ravetto, Piero; Rahnema, Farzad; Zhang, Dingkang

    2015-12-31

    A new method has been implemented for solving the time-dependent neutron transport equation efficiently and accurately. This is accomplished by coupling the hybrid stochastic-deterministic steady-state coarse-mesh radiation transport (COMET) method [1,2] with the new predictor-corrector quasi-static method (PCQM) developed at Politecnico di Torino [3]. In this paper, the coupled method is implemented and tested in 1D slab geometry.

  4. Development of advanced modal methods for calculating transient thermal and structural response

    NASA Technical Reports Server (NTRS)

    Camarda, Charles J.

    1991-01-01

    Higher-order modal methods for predicting thermal and structural response are evaluated. More accurate methods or ones which can significantly reduce the size of complex, transient thermal and structural problems are desirable for analysis and are required for synthesis of real structures subjected to thermal and mechanical loading. A unified method is presented for deriving successively higher-order modal solutions related to previously-developed, lower-order methods such as the mode displacement and mode-acceleration methods. A new method, called the force-derivative method, is used to obtain higher-order modal solutions for both uncoupled (proportionally-damped) structural problems as well as thermal problems and coupled (non-proportionally damped) structural problems. The new method is called the force-derivative method because, analogous to the mode-acceleration method, it produces a term that depends on the forcing function and additional terms that depend on the time derivatives of the forcing function.
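
    For orientation, the mode-displacement and mode-acceleration solutions compared above are usually written in the textbook forms below; the force-derivative method extends the series with terms involving time derivatives of the forcing function. These are standard expressions, not equations quoted from the paper.

    ```latex
    % Textbook forms (background, not quoted from the paper), for a truncated set of n modes
    % \phi_i with natural frequencies \omega_i, damping ratios \zeta_i, modal coordinates q_i(t),
    % stiffness matrix K, and forcing vector f(t):
    \[
      \text{mode displacement:}\quad
      u(t) \;\approx\; \sum_{i=1}^{n} \phi_i\, q_i(t),
    \]
    \[
      \text{mode acceleration:}\quad
      u(t) \;\approx\; K^{-1} f(t)
      \;-\; \sum_{i=1}^{n} \frac{\phi_i}{\omega_i^{2}}
            \left[\ddot q_i(t) + 2\zeta_i\omega_i\,\dot q_i(t)\right].
    \]
    ```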

  5. Development and validation of spectrophotometric methods for estimating amisulpride in pharmaceutical preparations.

    PubMed

    Sharma, Sangita; Neog, Madhurjya; Prajapati, Vipul; Patel, Hiren; Dabhi, Dipti

    2010-01-01

    Five simple, sensitive, accurate and rapid visible spectrophotometric methods (A, B, C, D and E) have been developed for estimating Amisulpride in pharmaceutical preparations. These are based on the diazotization of Amisulpride with sodium nitrite and hydrochloric acid, followed by coupling with N-(1-naphthyl)ethylenediamine dihydrochloride (Method A), diphenylamine (Method B), beta-naphthol in an alkaline medium (Method C), resorcinol in an alkaline medium (Method D) and chromotropic acid in an alkaline medium (Method E) to form a colored chromogen. The absorption maxima, lambda(max), are at 523 nm for Method A, 382 and 490 nm for Method B, 527 nm for Method C, 521 nm for Method D and 486 nm for Method E. Beer's law was obeyed in the concentration range of 2.5-12.5 microg mL(-1) in Method A, 5-25 and 10-50 microg mL(-1) in Method B, 4-20 microg mL(-1) in Method C, 2.5-12.5 microg mL(-1) in Method D and 5-15 microg mL(-1) in Method E. The results obtained for the proposed methods are in good agreement with labeled amounts, when marketed pharmaceutical preparations were analyzed.
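
    Methods of this type rest on a Beer's-law calibration: absorbance is fitted against concentration over the stated linear range and the fit is inverted for unknowns. A minimal sketch with synthetic data (not the published validation data), using the Method A range quoted above:

    ```python
    # Sketch of a Beer's-law calibration: fit absorbance vs. concentration, then invert.
    # Data points are synthetic placeholders, not the published validation results.
    import numpy as np

    conc = np.array([2.5, 5.0, 7.5, 10.0, 12.5])   # ug/mL, within the Method A range
    absorbance = 0.052 * conc + 0.003               # synthetic, near-ideal Beer's-law data

    slope, intercept = np.polyfit(conc, absorbance, 1)
    unknown_abs = 0.41
    print(f"estimated concentration: {(unknown_abs - intercept) / slope:.2f} ug/mL")
    ```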

  6. Development of quadruped walking locomotion gait generator using a hybrid method

    NASA Astrophysics Data System (ADS)

    Jasni, F.; Shafie, A. A.

    2013-12-01

    In many areas, the terrain can hardly be traversed by wheeled or tracked locomotion systems. Walking locomotion is therefore becoming a favoured option for mobile robots, because of its ability to move over rugged and uneven terrain. However, developing a walking gait for a robot is not a simple task. The Central Pattern Generator (CPG) method is a biologically inspired approach that has recently been introduced for developing walking-robot gaits, to tackle the issues faced by the conventional pre-designed trajectory-based method. However, research shows that the CPG method also has some limitations. Thus, in this paper, a hybrid method that combines the CPG and pre-designed trajectory-based approaches is introduced to develop a walking gait for a quadruped robot. The 3-D foot trajectories and the joint angle trajectories developed using the proposed method are compared with data obtained via the conventional pre-designed trajectory method to confirm the performance.
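
    To make the CPG idea concrete, the sketch below runs four phase oscillators, one per leg, held at fixed phase offsets to produce a walk-like stepping sequence. The offsets, frequency, and trajectory shaping are purely illustrative and are not the authors' hybrid design.

    ```python
    # Very small sketch of the CPG idea: four phase oscillators, one per leg, at fixed
    # phase offsets producing a walk-like sequence. All parameters are illustrative.
    import math

    N_LEGS = 4
    phase_offsets = [0.0, 0.5, 0.25, 0.75]   # fraction of a cycle per leg (assumed walk-like order)
    freq_hz, dt, steps = 1.0, 0.01, 300
    phases = [2 * math.pi * p for p in phase_offsets]

    for step in range(steps):
        feet_z = []
        for i in range(N_LEGS):
            phases[i] = (phases[i] + 2 * math.pi * freq_hz * dt) % (2 * math.pi)
            # swing during the first half of the cycle, stance (z = 0) otherwise
            z = max(0.0, math.sin(phases[i]))
            feet_z.append(round(z, 2))
        if step % 100 == 0:
            print(f"t={step * dt:.2f}s  foot heights: {feet_z}")
    ```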

  7. Progress toward the development and testing of source reconstruction methods for NIF neutron imaging.

    PubMed

    Loomis, E N; Grim, G P; Wilde, C; Wilson, D C; Morgan, G; Wilke, M; Tregillis, I; Merrill, F; Clark, D; Finch, J; Fittinghoff, D; Bower, D

    2010-10-01

    Development of analysis techniques for neutron imaging at the National Ignition Facility is an important and difficult task for the detailed understanding of high-neutron yield inertial confinement fusion implosions. Once developed, these methods must provide accurate images of the hot and cold fuels so that information about the implosion, such as symmetry and areal density, can be extracted. One method under development involves the numerical inversion of the pinhole image using knowledge of neutron transport through the pinhole aperture from Monte Carlo simulations. In this article we present results of source reconstructions based on simulated images that test the method's effectiveness with regard to pinhole misalignment.

  8. The EDIC Method: An Engaging and Comprehensive Approach for Creating Health Department Workforce Development Plans.

    PubMed

    Grimm, Brandon L; Brandert, Kathleen; Palm, David; Svoboda, Colleen

    2016-09-29

    In 2013, the Nebraska Department of Health & Human Services, Division of Public Health (Nebraska's State Health Department); and the University of Nebraska Medical Center, College of Public Health developed a comprehensive approach to assess workforce training needs. This article outlines the method used to assess the education and training needs of Division staff, and develop comprehensive workforce development plans to address those needs. The EDIC method (Engage, Develop, Identify, and Create) includes the following four phases: (1) Engage Stakeholders, (2) Develop Assessment, (3) Identify Training Needs, and (4) Create Development Plans. The EDIC method provided a process grounded in science and practice, allowed input, and produced buy-in from staff at all levels throughout the Division of Public Health. This type of process provides greater assurance that the most important gaps in skills and competencies will be identified. Although it is a comprehensive approach, it can be replicated at the state or local level across the country.

  9. The weight method: a new screening method for estimating pesticide deposition from knapsack sprayers in developing countries.

    PubMed

    García-Santos, Glenda; Scheiben, Dominik; Binder, Claudia R

    2011-03-01

    Investigations of occupational and environmental risk caused by the use of agrochemicals have received considerable interest over the last decades. And yet, in developing countries, the lack of staff and analytical equipment as well as the costs of chemical analyses make it difficult, if not impossible, to monitor pesticide contamination and residues in humans, air, water, and soils. A new and simple method is presented here for estimation of pesticide deposition in humans and soil after application. The estimate is derived on the basis of a water mass balance measured in a given number of highly absorbent papers under low evaporative conditions and an unsaturated atmosphere. The method is presented as a suitable, rapid, low-cost screening tool, complementary to toxicological tests, to assess occupational and environmental exposure caused by knapsack sprayers, where there is a lack of analytical instruments. This new method, called the "weight method", was tested to obtain drift deposition on the neighbouring field and the clothes of the applicator after spraying water with a knapsack sprayer in one of the largest areas of potato production in Colombia. The results were confirmed by experimental data using a tracer and the same setup used for the weight method. The weight method was able to explain 86% of the airborne drift and deposition variance.
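
    The mass-balance idea is that the liquid deposited on an absorbent paper is obtained by weighing the paper before and after spraying, then expressing the deposit per unit area. A minimal sketch with hypothetical weighings (not field data from the study):

    ```python
    # Sketch of the weight-method mass balance: deposit per unit area from a before/after weighing.
    # All numbers are hypothetical, not field data.
    def deposition_uL_per_cm2(mass_before_g: float, mass_after_g: float,
                              paper_area_cm2: float, liquid_density_g_per_mL: float = 1.0) -> float:
        volume_uL = (mass_after_g - mass_before_g) / liquid_density_g_per_mL * 1000.0
        return volume_uL / paper_area_cm2

    print(f"{deposition_uL_per_cm2(1.2000, 1.2750, 100.0):.3f} uL/cm2")  # hypothetical weighing
    ```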

  10. A Summary of the Development of Integral Aerodynamic Methods for the Computation of Rotor Wake Interactions.

    DTIC Science & Technology

    1986-03-01


  11. On Development of a Problem Based Learning System for Linear Algebra with Simple Input Method

    NASA Astrophysics Data System (ADS)

    Yokota, Hisashi

    2011-08-01

    Learning how to express a matrix using keyboard input requires a lot of time for most college students. Therefore, for a problem-based learning system for linear algebra to be accessible to college students, it is essential to develop a simple method for expressing matrices. By studying the two most widely used input methods for expressing matrices, a simpler input method is obtained. Furthermore, using this input method and the educator's knowledge structure as a concept map, a problem-based learning system for linear algebra capable of assessing students' knowledge structure and skill is developed.

  12. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a family of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, aims to determine the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a component of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile prior to using the tool in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundation is presented, together with applications to known models of rack-gear type tools used on Maag gear-cutting machines.

  13. Developing a Measure of Wealth for Primary Student Families in a Developing Country: Comparison of Two Methods of Psychometric Calibration

    ERIC Educational Resources Information Center

    Griffin, Patrick

    2005-01-01

    This article compares the invariance properties of two methods of psychometric instrument calibration for the development of a measure of wealth among families of Grade 5 pupils in five provinces in Vietnam. The measure is based on self-reported lists of possessions in the home. Its stability has been measured over two time periods. The concept of…

  14. Development and Validation of Simultaneous Spectrophotometric Methods for Drotaverine Hydrochloride and Aceclofenac from Tablet Dosage Form

    PubMed Central

    Shah, S. A.; Shah, D. R.; Chauhan, R. S.; Jain, J. R.

    2011-01-01

    Two simple spectrophotometric methods have been developed for simultaneous estimation of drotaverine hydrochloride and aceclofenac from tablet dosage form. Method I is a simultaneous equation method (Vierordt's method); the wavelengths selected are 306.5 and 276 nm. Method II is the absorbance ratio method (Q-Analysis), which employs 298.5 nm as λ1 and 276 nm as λ2 (λmax of AF) for formation of equations. Both the methods were found to be linear in the range of 8-32 μg/ml for drotaverine and 10-40 μg/ml for aceclofenac. The accuracy and precision were determined and found to comply with ICH guidelines. Both the methods showed good reproducibility and recovery with % RSD in the desired range. The methods were found to be rapid, specific, precise and accurate and can be successfully applied for the routine analysis of drotaverine and aceclofenac in their combined tablet dosage form. PMID:22457554
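
    In the simultaneous equation (Vierordt's) method, absorbances measured at two wavelengths give two linear equations in the two component concentrations, which are then solved. A minimal sketch follows; the absorptivity coefficients and mixture absorbances are hypothetical, not the published calibration data.

    ```python
    # Sketch of Vierordt's simultaneous-equation method for a two-component mixture.
    # Absorptivities (per ug/mL, 1 cm path) and absorbances below are hypothetical.
    import numpy as np

    # rows = wavelengths (306.5 nm, 276 nm), columns = components (drotaverine, aceclofenac)
    A = np.array([[0.040, 0.010],    # absorptivities at 306.5 nm (hypothetical)
                  [0.012, 0.035]])   # absorptivities at 276 nm   (hypothetical)
    absorbances = np.array([0.65, 0.80])   # measured mixture absorbances (hypothetical)

    c_drot, c_acf = np.linalg.solve(A, absorbances)
    print(f"drotaverine ~ {c_drot:.1f} ug/mL, aceclofenac ~ {c_acf:.1f} ug/mL")
    ```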

  15. Development and validation of a same-day monitoring method for recreational water

    EPA Pesticide Factsheets

    When water is polluted, swimmers can become ill from exposure to waterborne pathogens. EPA scientists have developed a new DNA extraction method for determining the amount of pathogens present in water.

  16. DEVELOPMENT AND VALIDATION OF AN ION CHROMATOGRAPHIC METHOD FOR DETERMINING PERCHLORATE IN FERTILIZERS

    EPA Science Inventory

    A method has been developed for the determination of perchlorate in fertilizers. Materials are leached with deionized water to dissolve any soluble perchlorate compounds. Ion chromatographic separation is followed by suppressed conductivity for detection. Perchlorate is retained ...

  17. Development of method to characterize emissions from spray polyurethane foam insulation

    EPA Science Inventory

    This presentation updates symposium participants regarding EPA progress towards the development of SPF insulation emissions characterization methods. The presentation highlights evaluation of experiments investigating emissions after application of SPF to substrates in micro chambers and i...

  18. Developing Non-Targeted Measurement Methods to Characterize the Human Exposome

    EPA Science Inventory

    The exposome represents all exposures experienced by an individual during their lifetime. Registered chemicals currently number in the tens-of-thousands, and therefore comprise a significant portion of the human exposome. To date, quantitative monitoring methods have been develop...

  19. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  20. Novel quantitative methods for characterization of chemical induced functional alteration in developing neuronal cultures

    EPA Science Inventory

    Thousands of chemicals lack adequate testing for adverse effects on nervous system development, stimulating research into alternative methods to screen chemicals for potential developmental neurotoxicity. Microelectrode arrays (MEA) collect action potential spiking

  1. Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A fast, rigorous method was developed to maximize the extraction efficacy for ten perfluorocarboxylic acids and perfluorooctanesulfonate from wastewater-treatment sludge and to quantitate using liquid chromatography, tandem-mass spectrometry (LC/MS/MS). First, organic solvents w...

  2. Evaluation of an automatic brain segmentation method developed for neonates on adult MR brain images

    NASA Astrophysics Data System (ADS)

    Moeskops, Pim; Viergever, Max A.; Benders, Manon J. N. L.; Išgum, Ivana

    2015-03-01

    Automatic brain tissue segmentation is of clinical relevance in images acquired at all ages. The literature presents a clear distinction between methods developed for MR images of infants, and methods developed for images of adults. The aim of this work is to evaluate a method developed for neonatal images in the segmentation of adult images. The evaluated method employs supervised voxel classification in subsequent stages, exploiting spatial and intensity information. Evaluation was performed using images available within the MRBrainS13 challenge. The obtained average Dice coefficients were 85.77% for grey matter, 88.66% for white matter, 81.08% for cerebrospinal fluid, 95.65% for cerebrum, and 96.92% for intracranial cavity, currently resulting in the best overall ranking. The possibility of applying the same method to neonatal as well as adult images can be of great value in cross-sectional studies that include a wide age range.
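
    The evaluation metric quoted above is the Dice coefficient, which measures the overlap between a segmentation and a reference labelling. A minimal sketch of its computation on toy binary masks (not MRBrainS13 data):

    ```python
    # Dice coefficient between a binary segmentation and a reference labelling (toy arrays).
    import numpy as np

    segmentation = np.array([[0, 1, 1], [0, 1, 0], [1, 1, 0]], dtype=bool)
    reference    = np.array([[0, 1, 1], [1, 1, 0], [1, 0, 0]], dtype=bool)

    intersection = np.logical_and(segmentation, reference).sum()
    dice = 2.0 * intersection / (segmentation.sum() + reference.sum())
    print(f"Dice = {dice:.2%}")
    ```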

  3. Analysis of Investigational Drugs in Biological Fluids Method Development and Routine Assay

    DTIC Science & Technology

    1988-04-12

    The purpose of work under this contract is to develop and routinely use analytical methods for the determination of the concentration in biological specimens of investigational drugs in support of pharmacokinetic and bioavailability studies undertaken for the purpose of new drug development for the US military establishment. Accepted scientific procedures including normal and reversed phase high-performance liquid chromatographic methods, post column derivatization, and protein precipitation and

  4. Wellbeing Research in Developing Countries: Reviewing the Role of Qualitative Methods

    ERIC Educational Resources Information Center

    Camfield, Laura; Crivello, Gina; Woodhead, Martin

    2009-01-01

    The authors review the contribution of qualitative methods to exploring concepts and experiences of wellbeing among children and adults living in developing countries. They provide examples illustrating the potential of these methods for gaining a holistic and contextual understanding of people's perceptions and experiences. Some of these come…

  5. Development and assessment of disinfectant efficacy test methods for regulatory purposes.

    PubMed

    Tomasino, Stephen F

    2013-05-01

    The United States Environmental Protection Agency regulates pesticidal products, including products with antimicrobial activity. Test guidelines have been established to inform manufacturers of which methodology is appropriate to support a specific efficacy claim. This paper highlights efforts designed to improve current methods and the development and assessment of new test methods.

  6. Measuring Baseline Cortisol Levels in Cetaceans: Developing a Novel Non-Invasive Analysis Method

    DTIC Science & Technology

    2012-09-30

    The proposed study will develop a method for extracting cortisol from cetacean skin using a two-step inverse... first real step toward establishing such a method.

  7. The Effectiveness of the Socratic Method in Developing Critical Thinking Skills in English Language Learners

    ERIC Educational Resources Information Center

    Jensen, Roger D., Jr.

    2015-01-01

    Critical thinking skills are an important topic of the United States' education system. This study examines the literature on critical thinking skills and defines them. The study also explores one specific teaching and assessment strategy known as the Socratic Method. The five-week research study used the Socratic Method for developing critical…

  8. An overview of recent developments and current status of gluten ELISA methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    ELISA methods for detecting and quantitating allergens have been around for some time and they are continuously improved. In this context, the development of gluten methods is no exception. Around the turn of the millennium, doubts were raised whether the existing “Skerritt-ELISA” would meet the 20 ...

  9. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    ERIC Educational Resources Information Center

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  10. RESEARCH TOWARDS DEVELOPING METHODS FOR SELECTED PHARMACEUTICAL AND PERSONAL CARE PRODUCTS (PPCPS) ADAPTED FOR BIOSOLIDS

    EPA Science Inventory

    Development, standardization, and validation of analytical methods provides state-of-the-science techniques to evaluate the presence, or absence, of select PPCPs in biosolids. This research provides the approaches, methods, and tools to assess the exposures and redu...

  11. Development of a Simultaneous Extraction and Cleanup Method for Pyrethroid Pesticides from Indoor House Dust Samples

    EPA Science Inventory

    An efficient and reliable analytical method was developed for the sensitive and selective quantification of pyrethroid pesticides (PYRs) in house dust samples. The method is based on selective pressurized liquid extraction (SPLE) of the dust-bound PYRs into dichloromethane (DCM) wi...

  12. Using Mixed Methods to Analyze Video Data: A Mathematics Teacher Professional Development Example

    ERIC Educational Resources Information Center

    DeCuir-Gunby, Jessica T.; Marshall, Patricia L.; McCulloch, Allison W.

    2012-01-01

    This article uses data from 65 teachers participating in a K-2 mathematics professional development research project as an example of how to analyze video recordings of teachers' classroom lessons using mixed methods. Through their discussion, the authors demonstrate how using a mixed methods approach to classroom video analysis allows researchers…

  13. Algorithm Development and Application of High Order Numerical Methods for Shocked and Rapid Changing Solutions

    DTIC Science & Technology

    2007-12-06

    The problems studied in this project involve numerically solving partial differential equations with either discontinuous or rapidly changing solutions... discontinuous Galerkin finite element methods for solving partial differential equations with discontinuous or rapidly changing solutions.

  14. Recommendations for Developing Alternative Test Methods for Screening and Prioritization of Chemicals for Developmental Neurotoxicity

    EPA Science Inventory

    Developmental neurotoxicity testing (DNT) is perceived by many stakeholders to be an area in critical need of alternative methods to current animal testing protocols and guidelines. An immediate goal is to develop test methods that are capable of screening large numbers of chemic...

  15. Development of In-Mold Assembly Methods for Producing Mesoscale Revolute Joints

    DTIC Science & Technology

    2009-01-01

    positioning methods to realize cavity shape change to avoid damage to delicate mesoscale parts created during molding, (3) developing a method to...premolded component, this process may lead to irreparable damage to the first-stage part. As a result, cavity morphing methods are the only feasible...damage to the part.

  16. DEVELOPMENT OF LOW-DIFFUSION FLUX-SPLITTING METHODS FOR DENSE GAS-SOLID FLOWS

    EPA Science Inventory

    The development of a class of low-diffusion upwinding methods for computing dense gas-solid flows is presented in this work. An artificial compressibility/low-Mach preconditioning strategy is developed for a hyperbolic two-phase flow equation system consisting of separate solids ...

  17. Methods for Studying Innovation Development in the Minnesota Innovation Research Program.

    ERIC Educational Resources Information Center

    Van de Ven, Andrew H.; Poole, Marshall Scott

    1990-01-01

    Focuses on the methods being used to examine processes of innovation development that pertain to the selection of cases and concepts, observing change, coding and analyzing event data to identify process patterns, and developing theories to explain observed innovation processes. (42 references) (MLF)

  18. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  19. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  20. Algorithms for Zonal Methods and Development of Three Dimensional Mesh Generation Procedures.

    DTIC Science & Technology

    1984-02-01

    For three-dimensional grid generation, both elliptic and hyperbolic methods were developed. A chimera grid scheme, that is, the use of overset multiple grid systems, was also tested in two dimensions. In our study of zonal...

  1. New methods in mammary gland development and cancer: proteomics, epigenetics, symmetric division and metastasis

    PubMed Central

    2012-01-01

    The European Network for Breast Development and Cancer (ENBDC) meeting on 'Methods in Mammary Gland Development and Cancer' has become an annual international rendezvous for scientists with interests in the normal and neoplastic breast. The fourth meeting in this series, held in April in Weggis, Switzerland, focused on proteomics, epigenetics, symmetric division, and metastasis. PMID:22809213

  2. Online Learning Communities and Teacher Professional Development: Methods for Improved Education Delivery

    ERIC Educational Resources Information Center

    Lindberg, J. Ola, Ed.; Olofsson, Anders D., Ed.

    2009-01-01

    In today's society, the professional development of teachers is urgent due to the constant change in working conditions and the impact that information and communication technologies have in teaching practices. "Online Learning Communities and Teacher Professional Development: Methods for Improved Education Delivery" features innovative…

  3. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  4. An Auxiliary Method To Reduce Potential Adverse Impacts Of Projected Land Developments: Subwatershed Prioritization

    EPA Science Inventory

    An index based method is developed that ranks the subwatersheds of a watershed based on their relative impacts on watershed response to anticipated land developments, and then applied to an urbanizing watershed in Eastern Pennsylvania. Simulations with a semi-distributed hydrolo...

  5. FODEM: A Multi-Threaded Research and Development Method for Educational Technology

    ERIC Educational Resources Information Center

    Suhonen, Jarkko; de Villiers, M. Ruth; Sutinen, Erkki

    2012-01-01

    Formative development method (FODEM) is a multithreaded design approach that was originated to support the design and development of various types of educational technology innovations, such as learning tools, and online study programmes. The threaded and agile structure of the approach provides flexibility to the design process. Intensive…

  6. Linking Faculty Development to Community College Student Achievement: A Mixed Methods Approach

    ERIC Educational Resources Information Center

    Elliott, Robert W.; Oliver, Diane E.

    2016-01-01

    Using a mixed methods, multilevel research design, this pilot inquiry explored the relationship between college faculty professional development and the academic achievement of diverse students by coupling two separate links: (a) the effects that professional development activities have on improving teaching strategies, and (b) the effects these…

  7. Studies on the development of latent fingerprints by the method of solid-medium ninhydrin.

    PubMed

    Yang, Ruiqin; Lian, Jie

    2014-09-01

    A new series of fingerprint-developing membranes was prepared using ninhydrin as the developing agent and pressure-sensitive emulsifiers as the encapsulated chemicals. The type of emulsifier, plastic film, concentration of the developing agent, modifying ions, and thickness of the membrane were studied to optimize the fingerprint-developing effect. The membrane can be successfully applied to both latent sweat fingerprints and blood fingerprints on many different surfaces. The sensitivity of the method toward latent sweat fingerprints is 0.1 mg/L amino acid. The membrane can be applied to both porous and non-porous surfaces. Fingerprints that are difficult to develop with traditional chemical methods on surfaces such as leather, glass, and heat-sensitive paper can be successfully developed with this membrane.

  8. Method of obtaining intensified image from developed photographic films and plates

    NASA Technical Reports Server (NTRS)

    Askins, B. S. (Inventor)

    1978-01-01

    A method of obtaining intensified images from silver images on developed photographic films and plates is described. The steps involve converting silver of the developed film or plate to a radioactive compound by treatment with an aqueous alkaline solution of an organo-S35 compound; placing the treated film or plate in direct contact with a receiver film, which is then exposed by radiation from the activated film; and developing and fixing the resulting intensified image on the receiver film.

  9. Forestry sector analysis for developing countries: Issues and methods. Forest Service general technical report

    SciTech Connect

    Haynes, R.W.

    1993-10-01

    A satellite meeting of the 10th Forestry World Congress focused on the methods used for forest sector analysis and their application in both developed and developing countries. The results of that meeting are summarized, and a general approach for forest sector modeling is proposed. The approach includes models derived from the existing literature and can be used as a structure for applying forest sector analysis in developing countries.

  10. Development of gas chromatographic methods for the analyses of organic carbonate-based electrolytes

    NASA Astrophysics Data System (ADS)

    Terborg, Lydia; Weber, Sascha; Passerini, Stefano; Winter, Martin; Karst, Uwe; Nowak, Sascha

    2014-01-01

    In this work, novel methods based on gas chromatography (GC) for the investigation of common organic carbonate-based electrolyte systems used in lithium ion batteries are presented. The methods were developed for flame ionization detection (FID) and mass spectrometric detection (MS). Further, headspace (HS) sampling for the investigation of solid samples such as electrodes is reported. Limits of detection are reported for FID. Finally, the developed methods were applied to the electrolyte system of commercially available lithium ion batteries as well as to in-house assembled cells.

  11. Development and validation of a new fallout transport method using variable spectral winds. Doctoral thesis

    SciTech Connect

    Hopkins, A.T.

    1984-09-01

    The purpose of this research was to develop and validate a fallout prediction method using variable transport calculations. The new method uses National Meteorological Center (NMC) spectral coefficients to compute wind vectors along the space- and time-varying trajectories of falling particles. The method was validated by comparing computed and actual cloud trajectories from a Mount St. Helens volcanic eruption and a high dust cloud. In summary, this research demonstrated the feasibility of using spectral coefficients for fallout transport calculations, developed a two-step smearing model to treat variable winds, and showed that uncertainties in spectral winds do not contribute significantly to the error in computed dose rate.

  12. Analytical Methods Development in Support of the Caustic Side Solvent Extraction System

    SciTech Connect

    Maskarinec, M.P.

    2001-07-17

    The goal of the project reported herein was to develop and apply methods for the analysis of the major components of the solvent system used in the Caustic-Side Solvent Extraction Process (CSSX). These include the calix(4)arene, the modifier, 1-(2,2,3,3-tetrafluoropropoxy)-3-(4-sec-butylphenoxy)-2-propanol, and tri-n-octylamine. In addition, it was an objective to develop methods that would allow visualization of other components under process conditions. These analyses would include quantitative laboratory methods for each of the components, quantitative analysis of expected breakdown products (4-sec-butylphenol and di-n-octylamine), and qualitative investigations of possible additional breakdown products under a variety of process extremes. These methods would also provide a framework for process analysis should a pilot facility be developed. Two methods were implemented for sample preparation of aqueous phases. The first involves solid-phase extraction and produces quantitative recovery of the solvent components and degradation products from the various aqueous streams. This method can be automated and is suitable for use in radiation-shielded facilities. The second is a variation of an established EPA liquid-liquid extraction procedure. This method is also quantitative and results in a final extract amenable to virtually any instrumental analysis. Two HPLC methods were developed for quantitative analysis. The first is a reverse-phase system with variable-wavelength UV detection. This method is excellent from a quantitative point of view. The second method is a size-exclusion method coupled with dual UV and evaporative light scattering detectors. This method is much faster than the reverse-phase method and allows for qualitative analysis of other components of the waste. For tri-n-octylamine and other degradation products, a GC method was developed and subsequently extended to GC/MS. All methods have precision better than 5%. The combination of these methods

  13. Development of a Suitable Dissolution Method for the Combined Tablet Formulation of Atorvastatin and Ezetimibe by RP-LC Method.

    PubMed

    Ozkan Cansel, Kose; Ozgur, Esim; Sevinc, Kurbanoglu; Ayhan, Savaser; Ozkan, Sibel A; Yalcin, Ozkan

    2016-01-01

    Pharmaceutical preparations of ezetimibe and atorvastatin are generally used to regulate the lipid level in blood. The combination decreases secondary events, such as non-fatal or fatal heart attack, for patients with high cholesterol and clinical cardiovascular disease. There is no pharmacopoeia method available for the dissolution testing recommended by the FDA. Development of a dissolution test method is an especially critical step for pharmaceutical preparations that contain Class II drugs (slightly soluble, highly permeable). In the proposed method, the effects of pH and surfactant on the in vitro dissolution of a poorly water-soluble combined drug therapy with different pKa values are investigated, and the study was designed to answer these open-ended questions. The optimized test conditions were achieved under sink conditions with USP apparatus 2 at a paddle rotation speed of 75 rpm and 900 mL of 0.01 M acetate buffer (pH 6.8) containing 0.45% SDS as the dissolution medium. Dissolution samples were quantified with a new, fully validated RP-LC method with UV detection at 242 nm.

  14. Development of a generalized perturbation theory method for sensitivity analysis using continuous-energy Monte Carlo methods

    SciTech Connect

    Perfetti, Christopher M.; Rearden, Bradley T.

    2016-03-01

    The sensitivity and uncertainty analysis tools of the ORNL SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertainty in the eigenvalue of critical systems, but cannot quantify uncertainty in important neutronic parameters such as multigroup cross sections, fuel fission rates, activation rates, and neutron fluence rates with realistic three-dimensional Monte Carlo simulations. A more complete understanding of the sources of uncertainty in these design-limiting parameters could lead to improvements in process optimization, reactor safety, and help inform regulators when setting operational safety margins. A novel approach for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was recently explored as academic research and has been found to accurately and rapidly calculate sensitivity coefficients in criticality safety applications. The work presented here describes a new method, known as the GEAR-MC method, which extends the CLUTCH theory for calculating eigenvalue sensitivity coefficients to enable sensitivity coefficient calculations and uncertainty analysis for a generalized set of neutronic responses using high-fidelity continuous-energy Monte Carlo calculations. Here, several criticality safety systems were examined to demonstrate proof of principle for the GEAR-MC method, and GEAR-MC was seen to produce response sensitivity coefficients that agreed well with reference direct perturbation sensitivity coefficients.
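
    The reference "direct perturbation" sensitivity coefficients mentioned above are, in the usual definition, relative changes in k-eff per relative change in a cross section. As a hedged illustration only (the eigenvalues below are placeholders, not results from this work), a forward-difference estimate looks like:

        # S = (dk/k) / (d_sigma/sigma), estimated from two eigenvalue calculations
        def direct_perturbation_sensitivity(k_base, k_pert, sigma_base, sigma_pert):
            dk_rel = (k_pert - k_base) / k_base
            dsigma_rel = (sigma_pert - sigma_base) / sigma_base
            return dk_rel / dsigma_rel

        # e.g., a 1% increase in a cross section that raises k-eff from 1.0000 to 1.0025
        print(direct_perturbation_sensitivity(1.0000, 1.0025, 1.00, 1.01))  # ~0.25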

  15. Development of a generalized perturbation theory method for sensitivity analysis using continuous-energy Monte Carlo methods

    DOE PAGES

    Perfetti, Christopher M.; Rearden, Bradley T.

    2016-03-01

    The sensitivity and uncertainty analysis tools of the ORNL SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertainty in the eigenvalue of critical systems, but cannot quantify uncertainty in important neutronic parameters such as multigroup cross sections, fuel fission rates, activation rates, and neutron fluence rates with realistic three-dimensional Monte Carlo simulations. A more complete understanding of the sources of uncertainty in these design-limiting parameters could lead to improvements in process optimization, reactor safety, and help inform regulators when setting operational safety margins. A novel approach for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was recently explored as academic research and has been found to accurately and rapidly calculate sensitivity coefficients in criticality safety applications. The work presented here describes a new method, known as the GEAR-MC method, which extends the CLUTCH theory for calculating eigenvalue sensitivity coefficients to enable sensitivity coefficient calculations and uncertainty analysis for a generalized set of neutronic responses using high-fidelity continuous-energy Monte Carlo calculations. Here, several criticality safety systems were examined to demonstrate proof of principle for the GEAR-MC method, and GEAR-MC was seen to produce response sensitivity coefficients that agreed well with reference direct perturbation sensitivity coefficients.

  16. X-RAY FLUORESCENCE ANALYSIS OF HANFORD LOW ACTIVITY WASTE SIMULANTS METHOD DEVELOPMENT

    SciTech Connect

    Jurgensen, A; David Missimer, D; Ronny Rutherford, R

    2007-08-08

    The x-ray fluorescence laboratory (XRF) in the Analytical Development Directorate (ADD) of the Savannah River National Laboratory (SRNL) was requested to develop an x-ray fluorescence spectrometry method for elemental characterization of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) pretreated low activity waste (LAW) stream to the LAW Vitrification Plant. The WTP is evaluating the potential for using XRF as a rapid turnaround technique to support LAW product compliance and glass former batching. The overall objective of this task was to develop an XRF analytical method that provides rapid turnaround time (<8 hours), while providing sufficient accuracy and precision to determine variations in waste.

  17. Development and validation of event-specific quantitative PCR method for genetically modified maize MIR604.

    PubMed

    Mano, Junichi; Furui, Satoshi; Takashima, Kaori; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2012-01-01

    A GM maize event, MIR604, has been widely distributed and an analytical method to quantify its content is required to monitor the validity of food labeling. Here we report a novel real-time PCR-based quantitation method for MIR604 maize. We developed real-time PCR assays specific for MIR604 using event-specific primers designed by the trait developer, and for maize endogenous starch synthase IIb gene (SSIIb). Then, we determined the conversion factor, which is required to calculate the weight-based GM maize content from the copy number ratio of MIR604-specific DNA to the endogenous reference DNA. Finally, to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind samples containing MIR604 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The reproducibility (RSDr) of the developed method was evaluated to be less than 25%. The limit of quantitation of the method was estimated to be 0.5% based on the ISO 24276 guideline. These results suggested that the developed method would be suitable for practical quantitative analyses of MIR604 maize.
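
    For readers unfamiliar with the conversion-factor approach described above, the weight-based GM content is commonly obtained by dividing the measured copy-number ratio (event/endogenous) by Cf, the same ratio measured in pure GM material. The sketch below is an illustration under that assumption; the Cf and copy numbers are hypothetical, not values from this study.

        def gm_content_percent(event_copies, ssiib_copies, cf):
            # (MIR604 / SSIIb copy ratio) divided by the conversion factor, as a percentage
            return (event_copies / ssiib_copies) / cf * 100.0

        # hypothetical qPCR copy numbers and an assumed Cf
        print(gm_content_percent(event_copies=1.2e3, ssiib_copies=4.8e4, cf=0.5))  # 5.0 (%)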

  18. The updated clinical guideline development process in Estonia is an efficient method for developing evidence-based guidelines.

    PubMed

    Bero, Lisa A; Hill, Suzanne; Habicht, Jarno; Mathiesen, Mari; Starkopf, Joel

    2013-02-01

    Clinical practice guidelines are one of the tools available to improve the quality of health care. However, it may be difficult for countries to develop their own national guidelines "from scratch" because of limitations in time, expertise, and financial resources. The Estonian Health Insurance Fund (EHIF), in collaboration with other stakeholders, has launched a national effort to develop and implement evidence-based clinical practice guidelines aimed at improving the quality of care. Although the first EHIF handbook for preparing guidelines was published in 2004, there has been wide variation in the format and quality of guidelines prepared by medical specialty societies, EHIF, and other organizations in Estonia. An additional challenge to guideline development in Estonia is that it is a country with limited human resources. Therefore, revision of the Estonian guideline process was aimed at developing an efficient method for adapting current high-quality guidelines to the Estonian setting without compromising their quality. In 2010, a comprehensive assessment of guideline development in Estonia was made by the World Health Organization, EHIF, the Medical Faculty at the University of Tartu, and selected national and international experts in an effort to streamline and harmonize the principles and processes of guideline development in Estonia. This study summarizes the evaluation of and revisions to the process. Estonia has made substantial changes in its processes of clinical practice guideline development and implementation as part of an overall program aiming for systematic quality improvement in health care. This experience may be relevant to other small or resource-limited countries.

  19. Best Practices in Stability Indicating Method Development and Validation for Non-clinical Dose Formulations.

    PubMed

    Henry, Teresa R; Penn, Lara D; Conerty, Jason R; Wright, Francesca E; Gorman, Gregory; Pack, Brian W

    2016-11-01

    Non-clinical dose formulations (also known as pre-clinical or GLP formulations) play a key role in early drug development. These formulations are used to introduce active pharmaceutical ingredients (APIs) into test organisms for both pharmacokinetic and toxicological studies. Since these studies are ultimately used to support dose and safety ranges in human studies, it is important not only to understand the concentration and PK/PD of the active ingredient but also to generate safety data for likely process impurities and degradation products of the active ingredient. As such, many in the industry have chosen to develop and validate methods which can accurately detect and quantify the active ingredient along with impurities and degradation products. Such methods often provide trendable results which are predictive of stability, thus leading to the name: stability-indicating methods. This document provides an overview of best practices for those choosing to include development and validation of such methods as part of their non-clinical drug development program. This document is intended to support teams who are either new to stability-indicating method development and validation or who are less familiar with the requirements of validation due to their position within the product development life cycle.

  20. Development and application of quantitative detection method for viral hemorrhagic septicemia virus (VHSV) genogroup IVa.

    PubMed

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-05-23

    Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of Korean VHSV isolate (Genogroup IVa). The slope and R² values of the primer set developed in this study were -0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by traditional quantifying method (TCID₅₀) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID₅₀, making it a very useful tool for VHSV diagnosis.
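
    As a side note on the efficiency figure quoted above: for the usual Ct-versus-log10(copies) standard curve, amplification efficiency is E = 10^(-1/slope) - 1; the quoted slope of -0.2928 reads as the inverse regression (log10(copies) versus Ct), for which E = 10^|slope| - 1 reproduces the stated ~96%. That reading is our interpretation, not something stated in the record. A minimal sketch:

        def efficiency_from_ct_slope(slope):
            # standard curve: Ct versus log10(copy number)
            return 10 ** (-1.0 / slope) - 1.0

        def efficiency_from_inverse_slope(slope):
            # inverse regression: log10(copy number) versus Ct
            return 10 ** abs(slope) - 1.0

        print(efficiency_from_inverse_slope(-0.2928))  # ~0.96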

  1. Development and Evaluation of Event-Specific Quantitative PCR Method for Genetically Modified Soybean MON87701.

    PubMed

    Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined Cf for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and RSDr values were less than 30% and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable for practical analyses for the detection and quantification of MON87701.
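
    The trueness and precision figures above can be illustrated with a simplified calculation; a full collaborative-trial analysis (ISO 5725-style) separates repeatability and between-laboratory variance, so the pooled version below is only a sketch with hypothetical numbers.

        import numpy as np

        def bias_percent(measured, reference):
            return 100.0 * (np.mean(measured) - reference) / reference

        def rsd_percent(measured):
            return 100.0 * np.std(measured, ddof=1) / np.mean(measured)

        lab_results = [0.93, 1.10, 1.05, 0.88, 1.02, 0.97]  # % GM at a 1.0% level (hypothetical)
        print(bias_percent(lab_results, reference=1.0), rsd_percent(lab_results))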

  2. Development and application of NDE methods for monolithic and continuous fiber ceramic matrix composites.

    SciTech Connect

    Ellingson, W. A.

    1999-05-21

    Monolithic structural ceramics and continuous fiber ceramic matrix composites (CMCs) are being developed for application in many thermally and chemically aggressive environments where structural reliability is paramount. We have recently developed advanced nondestructive evaluation (NDE) methods that can detect distributed "defects" such as density gradients and machining-induced damage in monolithic materials, as well as delamination, porosity, and through-wall cracks in CMC materials. These advanced NDE methods utilize (a) high-resolution, high-sensitivity thermal imaging; (b) high-resolution X-ray imaging; (c) laser-based elastic optical scattering; (d) acoustic resonance; (e) air-coupled ultrasonic methods; and (f) high-sensitivity fluorescent penetrant technology. This paper discusses the development and application of these NDE methods relative to ceramic processing and ceramic components used in large-scale industrial gas turbines and hot gas filters for gas stream particulate cleanup.

  3. Comparison of methods for developing the dynamics of rigid-body systems

    NASA Technical Reports Server (NTRS)

    Ju, M. S.; Mansour, J. M.

    1989-01-01

    Several approaches for developing the equations of motion for a three-degree-of-freedom PUMA robot were compared on the basis of computational efficiency (i.e., the number of additions, subtractions, multiplications, and divisions). Of particular interest was the investigation of the use of computer algebra as a tool for developing the equations of motion. Three approaches were implemented algebraically: Lagrange's method, Kane's method, and Wittenburg's method. Each formulation was developed in absolute and relative coordinates. These six cases were compared to each other and to a recursive numerical formulation. The results showed that all of the formulations implemented algebraically required fewer calculations than the recursive numerical algorithm. The algebraic formulations required fewer calculations in absolute coordinates than in relative coordinates. Each of the algebraic formulations could be simplified, using patterns from Kane's method, to yield the same number of calculations in a given coordinate system.

  4. Development of a Hybrid RANS/LES Method for Compressible Mixing Layer Simulations

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli

    2001-01-01

    A hybrid method has been developed for simulations of compressible turbulent mixing layers. Such mixing layers dominate the flows in exhaust systems of modern-day aircraft and also those of hypersonic vehicles currently under development. The hybrid method uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall-bounded regions entering a mixing section, and a Large Eddy Simulation (LES) procedure to calculate the mixing-dominated regions. A numerical technique was developed to enable the use of the hybrid RANS/LES method on stretched, non-Cartesian grids. The hybrid RANS/LES method is applied to a benchmark compressible mixing layer experiment. Preliminary two-dimensional calculations are used to investigate the effects of axial grid density and boundary conditions. Actual LES calculations, performed in three spatial directions, indicated an initial vortex shedding followed by rapid transition to turbulence, which is in agreement with experimental observations.

  5. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1989-01-01

    An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  6. Development and Application of Quantitative Detection Method for Viral Hemorrhagic Septicemia Virus (VHSV) Genogroup IVa

    PubMed Central

    Kim, Jong-Oh; Kim, Wi-Sik; Kim, Si-Woo; Han, Hyun-Ja; Kim, Jin Woo; Park, Myoung Ae; Oh, Myung-Joo

    2014-01-01

    Viral hemorrhagic septicemia virus (VHSV) is a problematic pathogen in olive flounder (Paralichthys olivaceus) aquaculture farms in Korea. Thus, it is necessary to develop a rapid and accurate diagnostic method to detect this virus. We developed a quantitative RT-PCR (qRT-PCR) method based on the nucleocapsid (N) gene sequence of Korean VHSV isolate (Genogroup IVa). The slope and R2 values of the primer set developed in this study were −0.2928 (96% efficiency) and 0.9979, respectively. Its comparison with viral infectivity calculated by traditional quantifying method (TCID50) showed a similar pattern of kinetic changes in vitro and in vivo. The qRT-PCR method reduced detection time compared to that of TCID50, making it a very useful tool for VHSV diagnosis. PMID:24859343

  7. Further Development of Selective Dyeing Method for Detecting Chrysotile Asbestos in Building Materials

    NASA Astrophysics Data System (ADS)

    Oke, Y.; Yamasaki, N.; Maeta, N.; Fujimaki, H.; Hashida, T.

    2008-02-01

    Extensive usage of chrysotile asbestos has left large numbers of chrysotile asbestos-containing buildings that remain to be surveyed. We have recently developed a simple dyeing method for detecting chrysotile asbestos in building materials, which involves pretreatment with a calcium-chelating agent and a dyeing treatment with magnesium-chelating organic dyes. In this study, we further developed a method which eliminates dyed asbestos substitutes containing magnesium that are potentially present in building materials. In the new method, post-treatment with formic acid was conducted to dissolve the non-chrysotile asbestos materials in order to delineate dyed chrysotile asbestos. The calcium-masking process was also shown to be essential even when the post-treatment was conducted. It was shown that the new method developed in this study may enable us to dye only chrysotile asbestos, without detecting asbestos substitutes in building materials.

  8. Development and validation of stability-indicating HPLC method for determination of cefpirome sulfate.

    PubMed

    Zalewski, Przemysław; Skibiński, Robert; Cielecka-Piontek, Judyta; Bednarek-Rajewska, Katarzyna

    2014-01-01

    The stability-indicating LC assay method was developed and validated for quantitative determination of cefpirome sulfate (CPS) in the presence of degradation products formed during forced degradation studies. An isocratic HPLC method was developed with a Lichrospher RP-18 column (5 μm particle size, 125 mm x 4 mm) and 12 mM ammonium acetate-acetonitrile (90:10 v/v) as the mobile phase. The flow rate of the mobile phase was 1.0 mL/min, the detection wavelength was 270 nm, and the temperature was 30 degrees C. Cefpirome sulfate, like other cephalosporins, was subjected to stress degradation conditions in aqueous solutions, including hydrolysis, oxidation, photolysis, and thermal degradation. The developed method was validated with regard to linearity, accuracy, precision, selectivity, and robustness. The method was applied successfully for identification and determination of cefpirome sulfate in pharmaceuticals and during kinetic studies.

  9. Pathways to lean software development: An analysis of effective methods of change

    NASA Astrophysics Data System (ADS)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software; these attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome those barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  10. HPLC method development for evolving applications in the pharmaceutical industry and nanoscale chemistry

    NASA Astrophysics Data System (ADS)

    Castiglione, Steven Louis

    As scientific research trends towards trace levels and smaller architectures, the analytical chemist is often faced with the challenge of quantitating said species in a variety of matrices. The challenge is heightened when the analytes prove to be potentially toxic or possess physical or chemical properties that make traditional analytical methods problematic. In such cases, the successful development of an acceptable quantitative method plays a critical role in the ability to further develop the species under study. This is particularly true for pharmaceutical impurities and nanoparticles (NP). The first portion of the research focuses on the development of a part-per-billion level HPLC method for a substituted phenazine-class pharmaceutical impurity. The development of this method was required due to the need for a rapid methodology to quantitatively determine levels of a potentially toxic phenazine moiety in order to ensure patient safety. As the synthetic pathway for the active ingredient was continuously refined to produce progressively lower amounts of the phenazine impurity, increasingly sensitive quantitative methods were required. The approaches evolved across four discrete methods, each employing a unique scheme for analyte detection. All developed methods were evaluated with regard to accuracy, precision, and linear adherence, as well as ancillary benefits and detriments -- e.g., one method in this evolution demonstrated the ability to resolve and detect other species from the phenazine class. The second portion of the research focuses on the development of an HPLC method for the quantitative determination of NP size distributions. The current methodology for the determination of NP sizes employs tunneling electron microscopy (TEM), which requires sample drying without particle size alteration and which, in many cases, may prove infeasible due to cost or availability. The feasibility of an HPLC method for NP size characterizations evolved

  11. Development and validation of an extraction method for the analysis of perfluoroalkyl substances in human hair.

    PubMed

    Kim, Da-Hye; Oh, Jeong-Eun

    2017-05-01

    Human hair has many advantages as a non-invasive sample; however, analytical methods for detecting perfluoroalkyl substances (PFASs) in human hair are still in the development stage. Therefore, the aim of this study was to develop and validate a method for monitoring 11 PFASs in human hair. Solid-phase extraction (SPE), ion-pairing extraction (IPE), a combined method (SPE+IPE) and solvent extraction with ENVI-carb clean-up were compared to develop an optimal extraction method using two types of hair sample (powder and piece forms). Analysis of PFASs was performed using liquid chromatography and tandem mass spectrometry. Among the four different extraction procedures, the SPE method using powdered hair showed the best extraction efficiency and recoveries ranged from 85.8 to 102%. The method detection limits for the SPE method were 0.114-0.796 ng/g and good precision (below 10%) and accuracy (66.4-110%) were obtained. In light of these results, SPE is considered the optimal method for PFAS extraction from hair. It was also successfully used to detect PFASs in human hair samples.
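
    Two of the figures of merit reported above (recoveries and method detection limits) have standard textbook forms; the sketch below assumes the common EPA-style MDL estimate from low-level spiked replicates, MDL = t(n-1, 0.99) x s, which may or may not be the exact procedure the authors used. The replicate values are hypothetical.

        import numpy as np
        from scipy import stats

        def recovery_percent(measured, spiked):
            return 100.0 * measured / spiked

        def mdl(replicates):
            reps = np.asarray(replicates, dtype=float)
            t99 = stats.t.ppf(0.99, df=len(reps) - 1)  # one-sided 99% Student's t
            return t99 * reps.std(ddof=1)

        print(mdl([0.21, 0.18, 0.25, 0.20, 0.23, 0.19, 0.22]))  # ng/g, hypothetical spikes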

  12. How to develop and validate a total organic carbon method for cleaning applications.

    PubMed

    Clark, K

    2001-01-01

    Good Manufacturing Practices require that the cleaning of drug manufacturing equipment be validated. Common analytical techniques used in the validation process include HPLC, UV/Vis, and Total Organic Carbon (TOC). HPLC and UV/Vis are classified as specific methods that identify and measure appropriate active substances. TOC is classified as a non-specific method and can detect all carbon-containing compounds, including active substances, excipients, and cleaning agents. The disadvantage of specific methods is that a new procedure must be developed for every active drug substance that is manufactured. This development process can be very time consuming and tedious. In contrast, one TOC method can potentially be used for all products. A TOC method is sensitive to the ppb range and is less time consuming than HPLC or UV/Vis. USP TOC methods are standard for Water for Injection and Purified Water, and simple modifications of these methods can be used for cleaning validation. The purpose of this study is to demonstrate how to develop and validate a TOC method for cleaning applications. Performance parameters evaluated in this study include linearity, MDL, LOQ, accuracy, precision, and swab recovery.

  13. Development of Advanced Methods of Structural and Trajectory Analysis for Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1996-01-01

    In this report the author describes: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of flight path optimization. A method of estimating the load-bearing fuselage weight and wing weight of transport aircraft based on fundamental structural principles has been developed. This method of weight estimation represents a compromise between the rapid assessment of component weight using empirical methods based on actual weights of existing aircraft and detailed, but time-consuming, analysis using the finite element method. The method was applied to eight existing subsonic transports for validation and correlation. The resulting computer program, PDCYL, has been integrated into the weights-calculating module of the AirCraft SYNThesis (ACSYNT) computer program. ACSYNT has traditionally used only empirical weight estimation methods; PDCYL adds to ACSYNT a rapid, accurate means of assessing the fuselage and wing weights of unconventional aircraft. PDCYL also allows flexibility in the choice of structural concept, as well as a direct means of determining the impact of advanced materials on structural weight.

  14. Development of a bioautographic method for the detection of lipase inhibitors.

    PubMed

    Bayineni, Venkata Krishna; Suresh, Sukrutha; Singh, Gurmeet; Kadeppagari, Ravi-Kumar

    2014-10-31

    A bioautographic method based on thin-layer chromatography was developed using a chemical system comprising p-nitrophenyl butyrate and bromothymol blue for detecting lipase inhibitors. Lipase inhibitory zones were visualized as blue spots against the greenish-yellow background. This method could detect the well-known lipase inhibitor orlistat down to 1 ng, which is better than the earlier method. The method could also detect lipase inhibition activities from unexplored species of Streptomyces. The developed method can be used not only for screening unknown samples for lipase inhibitors but also for purification of lipase inhibitors from such samples.

  15. Preliminary Tests For Development Of A Non-Pertechnetate Analysis Method

    SciTech Connect

    Diprete, D.; McCabe, D.

    2016-09-28

    The objective of this task was to develop a non-pertechnetate analysis method that the 222-S laboratory could easily implement. The initial scope involved working with 222-S laboratory personnel to adapt the existing Tc analytical method to fractionate the non-pertechnetate and pertechnetate. SRNL then developed and tested a method using commercial sorbents containing Aliquat® 336 to extract the pertechnetate (thereby separating it from non-pertechnetate), followed by oxidation, extraction, and stripping steps, and finally analysis by beta counting and mass spectrometry. Several additional items were partially investigated, including impacts of a 137Cs removal step. The method was initially tested on SRS tank waste samples to determine its viability. Although SRS tank waste does not contain non-pertechnetate, testing with it was useful to investigate the compatibility, separation efficiency, interference removal efficacy, and method sensitivity.

  16. Development of an Innovative Algorithm for Aerodynamics-Structure Interaction Using Lattice Boltzmann Method

    NASA Technical Reports Server (NTRS)

    Mei, Ren-Wei; Shyy, Wei; Yu, Da-Zhi; Luo, Li-Shi; Rudy, David (Technical Monitor)

    2001-01-01

    The lattice Boltzmann equation (LBE) is a kinetic formulation which offers an alternative computational method capable of solving fluid dynamics for various systems. Major advantages of the method are owing to the fact that the solution for the particle distribution functions is explicit, easy to implement, and the algorithm is natural to parallelize. In this final report, we summarize the works accomplished in the past three years. Since most works have been published, the technical details can be found in the literature. Brief summary will be provided in this report. In this project, a second-order accurate treatment of boundary condition in the LBE method is developed for a curved boundary and tested successfully in various 2-D and 3-D configurations. To evaluate the aerodynamic force on a body in the context of LBE method, several force evaluation schemes have been investigated. A simple momentum exchange method is shown to give reliable and accurate values for the force on a body in both 2-D and 3-D cases. Various 3-D LBE models have been assessed in terms of efficiency, accuracy, and robustness. In general, accurate 3-D results can be obtained using LBE methods. The 3-D 19-bit model is found to be the best one among the 15-bit, 19-bit, and 27-bit LBE models. To achieve desired grid resolution and to accommodate the far field boundary conditions in aerodynamics computations, a multi-block LBE method is developed by dividing the flow field into various blocks each having constant lattice spacing. Substantial contribution to the LBE method is also made through the development of a new, generalized lattice Boltzmann equation constructed in the moment space in order to improve the computational stability, detailed theoretical analysis on the stability, dispersion, and dissipation characteristics of the LBE method, and computational studies of high Reynolds number flows with singular gradients. Finally, a finite difference-based lattice Boltzmann method is
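
    The record above assumes familiarity with the basic lattice Boltzmann update. The sketch below is a minimal D2Q9 BGK collide-and-stream step on a periodic grid, included only as orientation; it is not the multi-block, curved-boundary, or moment-space scheme described in the report.

        import numpy as np

        # D2Q9 lattice velocities and weights
        e = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
        w = np.array([4/9] + [1/9]*4 + [1/36]*4)

        def equilibrium(rho, u):
            # second-order equilibrium distribution f_i^eq
            eu = np.tensordot(e, u, axes=([1], [0]))          # shape (9, nx, ny)
            usq = u[0]**2 + u[1]**2
            return w[:, None, None] * rho * (1 + 3*eu + 4.5*eu**2 - 1.5*usq)

        def step(f, tau):
            # one BGK collision followed by streaming (periodic boundaries)
            rho = f.sum(axis=0)
            u = np.tensordot(e.T, f, axes=([1], [0])) / rho   # shape (2, nx, ny)
            f = f - (f - equilibrium(rho, u)) / tau
            for i, (ex, ey) in enumerate(e):
                f[i] = np.roll(np.roll(f[i], ex, axis=0), ey, axis=1)
            return f

        nx = ny = 32
        rho0 = np.ones((nx, ny))
        u0 = np.zeros((2, nx, ny)); u0[0, nx//2, :] = 0.05    # small shear perturbation
        f = equilibrium(rho0, u0)
        for _ in range(100):
            f = step(f, tau=0.6)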

  17. Development of a method for optimal maneuver analysis of complex space missions

    NASA Technical Reports Server (NTRS)

    Mcadoo, S. F., Jr.; Jezewski, D. J.; Dawkins, G. S.

    1975-01-01

    A system that allows mission planners to find optimal multiple-burn space trajectories easily is described. Previously developed methods with different gravity assumptions perform the optimization function. The power of these programs is extended by a method of costate estimation. A penalty function method of constraining coast arc times to be positive is included. The capability of the method is demonstrated by finding the optimal control for three different space missions. These include a shuttle abort-once-around mission and two- and three-burn geosynchronous satellite-placement missions.
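
    The abstract mentions a penalty-function treatment of the requirement that coast-arc times be positive. A hedged toy illustration of that idea follows; the cost function and decision-vector layout below are placeholders, not the actual trajectory problem.

        import numpy as np
        from scipy.optimize import minimize

        def penalized_cost(x, mu=1.0e3):
            burns, coasts = x[:2], x[2:]                      # hypothetical split of the decision vector
            cost = np.sum(burns**2) + 0.1 * np.sum(coasts)    # stand-in for the mission cost
            penalty = np.sum(np.maximum(0.0, -coasts)**2)     # active only when a coast time < 0
            return cost + mu * penalty

        x0 = np.array([1.0, 1.0, -0.5, 2.0])                  # start with one infeasible coast time
        res = minimize(penalized_cost, x0, method="Nelder-Mead")
        print(res.x)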

  18. Development of surface profile measurement method for ellipsoidal x-ray mirrors using phase retrieval

    NASA Astrophysics Data System (ADS)

    Saitou, Takahiro; Takei, Yoshinori; Mimura, Hidekazu

    2012-09-01

    An ellipsoidal mirror is a promising type of X-ray mirror, because it can focus X-rays to nanometer size with a very large aperture and no chromatic aberration. However, ideal ellipsoidal mirrors have not yet been realized by any manufacturing method. This is partly because there is no evaluation method for its surface figure profile. In this paper, we propose and develop a method for measuring surface figure profile of ellipsoidal mirrors using phase retrieval. An optical design for soft X-ray focusing, the employed phase retrieval method and an experimental optical system specialized for wavefront measurement using a He-Ne laser are reported.
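
    For orientation, "phase retrieval" here refers to iteratively recovering a phase map from measured intensities. The sketch below is a generic two-plane Gerchberg-Saxton loop, not the paper's specific algorithm or optical model for the ellipsoidal mirror.

        import numpy as np

        def gerchberg_saxton(amp_pupil, amp_focus, n_iter=200):
            # recover a pupil-plane phase consistent with measured amplitudes in
            # the pupil and focal planes (far-field propagation modeled by an FFT)
            field = amp_pupil.astype(complex)
            for _ in range(n_iter):
                focus = np.fft.fft2(field)
                focus = amp_focus * np.exp(1j * np.angle(focus))   # impose focal-plane amplitude
                field = np.fft.ifft2(focus)
                field = amp_pupil * np.exp(1j * np.angle(field))   # impose pupil-plane amplitude
            return np.angle(field)

        # toy usage with synthetic amplitudes
        pupil = np.ones((64, 64))
        phase = gerchberg_saxton(pupil, np.abs(np.fft.fft2(pupil)))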

  19. Development of minimized mixing molecular orbital method for designing organic ferromagnets.

    PubMed

    Zhu, Xun; Aoki, Yuriko

    2015-06-15

    Predicting the high spin stability of organic radicals correctly for designing organic ferromagnets remains a significant challenge. We have developed a method with an index (L(min)) for predicting the high spin stability of conjugated organic radicals at the restricted open-shell Hartree-Fock level. Unitary transformations were performed for localizing the coefficients of nonbonding molecular orbitals, and subsequently the localized coefficients were used to calculate L(min) that indicates the high spin stability of conjugated organic radicals. This method can be combined with the elongation method to treat huge high spin open-shell systems. Thus, this method is useful for designing organic ferromagnets.

  20. Automation Analysis System of Child's Development Test and Multi-Viewpoints Input Method

    NASA Astrophysics Data System (ADS)

    Sugiura, Akihiko; Kirana, Rini Pura; Yonemura, Keiichi

    This study presents a simple method for examining a child's development by automatically analyzing the child's reactions while feedback images are displayed for 2 min. This computer analysis recognizes some features of an image of the child's reactions. This system can perform that analysis in a shorter time than conventional methods, as typified by the Enjoji-method. Moreover, evaluation results of our system verify its validity compared to conventional methods. This system is applicable to achieve automatic analyses through a network to thereby realize telemedicine as part of a Multimedia Communication System.

  1. Emerging analytical methods to determine gluten markers in processed foods--method development in support of standard setting.

    PubMed

    Weber, Dorcas; Cléroux, Chantal; Godefroy, Samuel Benrejeb

    2009-09-01

    The availability of analytical methods to detect and determine levels of markers of priority allergens in foods is of the utmost importance to support standard setting initiatives, the development of compliance and enforcement activities, as well as to provide guidance to industry on implementation of quality control practices, ensuring the effectiveness of allergen-related sanitation techniques. This paper describes the development and implementation of a mass-spectrometry-based technique to determine markers for individual sources of gluten in beer products. This methodology was shown to answer the requirements of Health Canada's proposed labeling standard for individual gluten source declaration, in order to achieve its policy objectives (i.e., protection of sensitive consumers, while promoting choice). Minimal sample work-up was required and the results obtained by ELISA were further complemented using the LC-MS/MS method. This paper aims to demonstrate the feasibility of alternative techniques to ELISA-based methodologies to determine allergen and gluten markers in food.

  2. An alternative method to quantify 2-MIB producing cyanobacteria in drinking water reservoirs: Method development and field applications.

    PubMed

    Chiu, Yi-Ting; Yen, Hung-Kai; Lin, Tsair-Fuh

    2016-11-01

    2-Methylisoborneol (2-MIB) is a commonly detected cyanobacterial odorant in drinking water sources in many countries. To provide safe and high-quality water, development of a monitoring method for the detection of 2-MIB-synthesis (mibC) genes is very important. In this study, new primers MIBS02F/R intended specifically for the mibC gene were developed and tested. Experimental results show that the MIBS02F/R primer set was able to capture 13 2-MIB producing cyanobacterial strains grown in the laboratory, and to effectively amplify the targeted DNA region from 17 2-MIB-producing cyanobacterial strains listed in the literature. The primers were further coupled with a TaqMan probe to detect 2-MIB producers in 29 drinking water reservoirs (DWRs). The results showed statistically significant correlations between mibC genes and 2-MIB concentrations for the data from each reservoir (R(2)=0.413-0.998; p<0.05), from all reservoirs in each of the three islands (R(2)=0.302-0.796; p<0.01), and from all data of the three islands (R(2)=0.473-0.479; p<0.01). The results demonstrate that the real-time PCR can be an alternative method to provide information to managers of reservoirs and water utilities facing 2-MIB-related incidents.
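
    The R^2 values quoted above come from regressions of mibC gene abundance against 2-MIB concentration. A minimal sketch of that calculation on made-up data follows; the exact transformation and model used in the study are not reproduced here.

        import numpy as np
        from scipy import stats

        mibc_copies = np.array([1.2e3, 5.6e3, 2.1e4, 8.4e4, 3.0e5])  # copies/mL (hypothetical)
        mib_conc = np.array([4.0, 11.0, 35.0, 120.0, 410.0])         # 2-MIB, ng/L (hypothetical)

        # log-transform both variables, a common choice when data span several orders of magnitude
        slope, intercept, r, p, se = stats.linregress(np.log10(mibc_copies), np.log10(mib_conc))
        print(f"R^2 = {r**2:.3f}, p = {p:.3g}")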

  3. Trimming Line Design using New Development Method and One Step FEM

    NASA Astrophysics Data System (ADS)

    Chung, Wan-Jin; Park, Choon-Dal; Yang, Dong-yol

    2005-08-01

    In most automobile panel manufacturing, trimming is generally performed prior to flanging, and finding a feasible trimming line is crucial for obtaining an accurate edge profile after flanging. The section-based method develops the blank along section planes and finds the trimming line by generating a loop of end points; it suffers from inaccurate results in regions with out-of-section motion. The simulation-based method, on the other hand, can produce a more accurate trimming line through an iterative strategy, but due to time limitations and the lack of information in initial die design it is still not widely accepted in the industry. In this study, a new fast method to find a feasible trimming line is proposed. One-step FEM is used to analyze the flanging process because the desired final shape after flanging can be defined and most strain paths in flanging are simple. When one-step FEM is used, the main obstacle is the generation of the initial guess. A robust initial-guess generation method is developed to handle badly shaped meshes, widely varying mesh sizes, and undercut parts. The new method develops the 3D triangular mesh in a propagational way from the final mesh onto the drawing tool surface, and an energy minimization technique is utilized to remedy mesh distortion during development. The trimming line is extracted from the outer boundary after the one-step FEM simulation. This method offers many benefits, since the trimming line can be obtained at an early design stage. The developed method has been successfully applied to complex industrial applications such as flanging of fender and door outer panels.

  4. MO-DE-BRA-05: Developing Effective Medical Physics Knowledge Structures: Models and Methods

    SciTech Connect

    Sprawls, P

    2015-06-15

    Purpose: Develop a method and supporting online resources to be used by medical physics educators for teaching medical imaging professionals and trainees so they develop highly-effective physics knowledge structures that can contribute to improved diagnostic image quality on a global basis. Methods: The different types of mental knowledge structures were analyzed and modeled with respect to both the learning and teaching process for their development and the functions or tasks that can be performed with the knowledge. While symbolic verbal and mathematical knowledge structures are very important in medical physics for many purposes, the tasks of applying physics in clinical imaging--especially to optimize image quality and diagnostic accuracy--require a sensory conceptual knowledge structure, specifically, an interconnected network of visually based concepts. This type of knowledge supports tasks such as analysis, evaluation, problem solving, interacting, and creating solutions. Traditional educational methods including lectures, online modules, and many texts are serial procedures and limited with respect to developing interconnected conceptual networks. A method consisting of the synergistic combination of on-site medical physics teachers and the online resource, CONET (Concept network developer), has been developed and made available for the topic Radiographic Image Quality. This was selected as the inaugural topic, others to follow, because it can be used by medical physicists teaching the large population of medical imaging professionals, such as radiology residents, who can apply the knowledge. Results: Tutorials for medical physics educators on developing effective knowledge structures are being presented and published and CONET is available with open access for all to use. Conclusion: An adjunct to traditional medical physics educational methods with the added focus on sensory concept development provides opportunities for medical physics teachers to share

  5. Basic study to develop an electromagnetic drive method for the rotary undulation pump.

    PubMed

    Abe, Yusuke; Chinzei, Tsuneo; Isoyama, Takashi; Saito, Itsuro; Ono, Toshiya; Mochizuki, Shuichi; Kouno, Akimasa; Imachi, Kou

    2003-10-01

    The rotary undulation pump, which is composed of a disk with a convex shape on both sides and a pump housing with one narrow side and one wide side, is a unique continuous-flow pump based on a new principle. The concept of a levitation drive method for this pump was proposed. An electromagnetic driver model and drive circuit were developed to examine its feasibility and the differences among the delta-wired, Y-wired, and repulsion methods; in the repulsion method, the disk is driven by magnetic repulsion. The model could be driven with each of the methods, and the repulsion method was thereby demonstrated to be possible as well. With every method, however, the output was insufficient at high load owing to the wide gap between the permanent magnets and the coils. The efficiency was almost the same for the delta-wired and Y-wired methods, whereas in the repulsion method it was less than 50% of that of the other two. From these results, the delta-wired and Y-wired methods with active control of the gap distance were considered better than the repulsion method, which requires no active gap control.

  6. Development of convective testing methods for low-rise multifamily buildings. Final report

    SciTech Connect

    Stiles, M.R.

    1996-08-01

    This report describes convective testing methods and protocols developed for use in weatherizing low-rise multifamily buildings. The methods can lead to controlling internal air movement and preventing leakage to the exterior by estimating magnitudes of air leakage pathways in garden and town house apartments. The 4 methods cited are: After-a-Retrofit; Equivalent Interfaces; Open-a-Door; and Add-a-Pathway. It is found that, because of modern interior finishing practices, convective problems tend to be more associated with indoor air quality than loss of space conditioning energy. The After-a-Retrofit method is the easiest to integrate into current diagnostic practices. In some cases, the Equivalent Interfaces method may be used on a production basis. The methods are an advance on current field practices that do not quantify the leakage pathways and research practices that require extensive equipment.

  7. Developments of entropy-stable residual distribution methods for conservation laws I: Scalar problems

    NASA Astrophysics Data System (ADS)

    Ismail, Farzad; Chizari, Hossain

    2017-02-01

    This paper presents preliminary developments of entropy-stable residual distribution methods for scalar problems. Entropy generation is controlled by formulating an entropy-conservative signal distribution coupled with an entropy-stable signal distribution. Numerical results of the entropy-stable residual distribution methods are accurate and comparable with classic residual distribution methods for steady-state problems. High-order accurate extensions of the new method for steady-state problems are also demonstrated. Moreover, the new method preserves second-order accuracy on unsteady problems using an explicit time integration scheme. The idea of the multi-dimensional entropy-stable residual distribution method is generic enough to be extended to systems of hyperbolic equations, which will be presented in the sequel of this paper.
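
    The abstract gives no formulas, so the following sketch only illustrates the general "entropy-conservative part plus entropy-stable dissipation" construction in the much simpler 1D finite-volume setting for Burgers' equation with the square entropy; it is not the residual distribution scheme of the paper, and all names are hypothetical.

        # Hypothetical sketch: entropy-conservative and entropy-stable fluxes for
        # Burgers' equation u_t + (u^2/2)_x = 0 on a periodic grid.
        import numpy as np

        def flux_ec(uL, uR):
            """Tadmor's entropy-conservative two-point flux for Burgers' equation."""
            return (uL * uL + uL * uR + uR * uR) / 6.0

        def flux_es(uL, uR):
            """Entropy-stable flux: conservative part plus local dissipation."""
            lam = np.maximum(np.abs(uL), np.abs(uR))    # local wave-speed estimate
            return flux_ec(uL, uR) - 0.5 * lam * (uR - uL)

        def step(u, dx, dt):
            """One explicit finite-volume update; f[i] is the flux at interface i+1/2."""
            f = flux_es(u, np.roll(u, -1))
            return u - dt / dx * (f - np.roll(f, 1))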

  8. Comparative Proteomic Analysis of Cotton Fiber Development and Protein Extraction Method Comparison in Late Stage Fibers

    PubMed Central

    Mujahid, Hana; Pendarvis, Ken; Reddy, Joseph S.; Nallamilli, Babi Ramesh Reddy; Reddy, K. R.; Nanduri, Bindu; Peng, Zhaohua

    2016-01-01

    The distinct stages of cotton fiber development and maturation serve as a single-celled model for studying the molecular mechanisms of plant cell elongation, cell wall development and cellulose biosynthesis. However, this model system of plant cell development is compromised for proteomic studies due to a lack of an efficient protein extraction method during the later stages of fiber development, because of a recalcitrant cell wall and the presence of abundant phenolic compounds. Here, we compared the quality and quantities of proteins extracted from 25 dpa (days post anthesis) fiber with multiple protein extraction methods and present a comprehensive quantitative proteomic study of fiber development from 10 dpa to 25 dpa. Comparative analysis using a label-free quantification method revealed 287 differentially-expressed proteins in the 10 dpa to 25 dpa fiber developmental period. Proteins involved in cell wall metabolism and regulation, cytoskeleton development and carbohydrate metabolism among other functional categories in four fiber developmental stages were identified. Our studies provide protocols for protein extraction from maturing fiber tissues for mass spectrometry analysis and expand knowledge of the proteomic profile of cotton fiber development. PMID:28248216

  9. Development and Evaluation of the Method with an Affective Interface for Promoting Employees' Morale

    NASA Astrophysics Data System (ADS)

    Fujino, Hidenori; Ishii, Hirotake; Shimoda, Hiroshi; Yoshikawa, Hidekazu

    For a sustainable society, organizational management is needed that is not based on mass production and mass consumption but has the flexibility to meet diverse social needs precisely. Realizing such management requires employees' work morale; recently, however, employees' work morale has tended to decline. In this study, therefore, the authors developed a model of a method for promoting and maintaining employees' work morale effectively and efficiently. In particular, the authors regarded "work morale" as an "attitude toward the work." Based on this idea, the theory of persuasion psychology and various persuasion techniques can be applied. A model of the method employing a character agent was therefore developed, based on forced compliance, one of the persuasion techniques derived from the theory of cognitive dissonance. An evaluation experiment with human subjects confirmed that the developed method could improve workers' work morale effectively.

  10. Development of a turbomachinery design optimization procedure using a multiple-parameter nonlinear perturbation method

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.

    1984-01-01

    An investigation was carried out to complete the preliminary development of a combined perturbation/optimization procedure and associated computational code for designing optimized blade-to-blade profiles of turbomachinery blades. The overall purpose of the procedures developed is to provide demonstration of a rapid nonlinear perturbation method for minimizing the computational requirements associated with parametric design studies of turbomachinery flows. The method combines the multiple parameter nonlinear perturbation method, successfully developed in previous phases of this study, with the NASA TSONIC blade-to-blade turbomachinery flow solver, and the COPES-CONMIN optimization procedure into a user's code for designing optimized blade-to-blade surface profiles of turbomachinery blades. Results of several design applications and a documented version of the code together with a user's manual are provided.

  11. Development of Implicit Methods in CFD NASA Ames Research Center 1970's - 1980's

    NASA Technical Reports Server (NTRS)

    Pulliam, Thomas H.

    2010-01-01

    The focus here is on the early development (mid 1970's-1980's) at NASA Ames Research Center of implicit methods in Computational Fluid Dynamics (CFD). A class of implicit finite difference schemes of the Beam and Warming approximate factorization type will be addressed. The emphasis will be on the Euler equations. A review of material pertinent to the solution of the Euler equations within the framework of implicit methods will be presented. The eigensystem of the equations will be used extensively in developing a framework for various methods applied to the Euler equations. The development and analysis of various aspects of this class of schemes will be given along with the motivations behind many of the choices. Various acceleration and efficiency modifications such as matrix reduction, diagonalization and flux split schemes will be presented.
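
    As a toy illustration of the kind of scalar implicit operator that Beam-Warming-type approximate factorization reduces to along each coordinate direction (an assumption-laden sketch, not the Euler scheme discussed in the record), consider backward-Euler time integration with central differencing for 1D linear advection, which leads to a tridiagonal system solvable by the Thomas algorithm.

        # Hypothetical sketch: implicit central-difference step for u_t + a u_x = 0.
        import numpy as np

        def thomas(lower, diag, upper, rhs):
            """Solve a tridiagonal system by forward elimination and back substitution.
            lower[i], diag[i], upper[i] are the coefficients of row i."""
            n = len(diag)
            d, r = diag.astype(float), np.asarray(rhs, float).copy()
            for i in range(1, n):
                w = lower[i] / d[i - 1]
                d[i] = d[i] - w * upper[i - 1]
                r[i] = r[i] - w * r[i - 1]
            x = np.empty(n)
            x[-1] = r[-1] / d[-1]
            for i in range(n - 2, -1, -1):
                x[i] = (r[i] - upper[i] * x[i + 1]) / d[i]
            return x

        def implicit_advection_step(u, a, dt, dx):
            """One backward-Euler step; the two boundary values are held fixed."""
            n = len(u)
            sigma = a * dt / (2.0 * dx)
            lower = np.full(n, -sigma)
            diag = np.ones(n)
            upper = np.full(n, sigma)
            upper[0] = 0.0    # identity row pins the left boundary value
            lower[-1] = 0.0   # identity row pins the right boundary value
            return thomas(lower, diag, upper, u)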

  12. [Development and optimization of the methods for determining activity of plasminogen activator inhibitor-1 in plasma].

    PubMed

    Roka-Moĭia, Ia M; Zhernosiekov, D D; Kondratiuk, A S; Hrynenko, T V

    2013-01-01

    The activity and content of plasminogen activator inhibitor-1 (PAI-1) are important indicators of pathological processes, because its content in plasma increases in acute myocardial infarction, tumors, diabetes mellitus, etc. The present work is dedicated to the development and optimization of methods for determining PAI-1 activity that can be used in clinical practice. We have proposed a modification of the COATEST PAI method using the chromogenic substrate S2251, in which the cyanogen bromide fragments of human fibrinogen are replaced with bovine desAB-fibrin. We have also developed a method using fibrin films, in which fibrin serves both as a stimulator of the activation reaction and as the substrate. Using fibrin, the native substrate of plasmin, provides high specificity of the reaction and excludes cross-reactions with other plasma enzymes.

  13. Development of a size exclusion chromatography method for the determination of molar mass for poloxamers.

    PubMed

    Erlandsson, Bengt; Wittgren, Bengt; Brinkmalm, Gunnar

    2003-04-01

    An aqueous size exclusion chromatography (SEC) method for determination of the molar mass of poloxamers 188 and 407 has been developed as an alternative to the pharmacopoeia methods. During the development work, two different columns and several different eluent compositions were investigated. With a PL-aquagel-OH column, non-exclusion behaviour was obtained. A TSKgel column gave good separation of both poloxamers. The best separation was obtained with an eluent consisting of sodium chloride (0.01 M)-methanol (90:10, v/v) on the TSKgel column. The method was shown to be linear within the elution time of the two poloxamers and to have acceptable precision. The results from the SEC method were compared with results obtained using SEC with online multi-angle light scattering detection (MALS) and with results obtained with matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS).

  14. Development of Matched (migratory Analytical Time Change Easy Detection) Method for Satellite-Tracked Migratory Birds

    NASA Astrophysics Data System (ADS)

    Doko, Tomoko; Chen, Wenbo; Higuchi, Hiroyoshi

    2016-06-01

    Satellite tracking technology has been used to reveal the migration patterns and flyways of migratory birds. In general, bird migration can be classified according to migration status: the wintering period, spring migration, breeding period, and autumn migration. To determine migration status, the periods of these statuses should be determined individually, but there is no objective method to define 'a threshold date' at which an individual bird changes its status. The research objective is to develop an effective and objective method to determine threshold dates of migration status based on satellite-tracked data. The developed method was named the MATCHED (Migratory Analytical Time Change Easy Detection) method. In order to demonstrate the method, data acquired from satellite-tracked Tundra Swans were used. The MATCHED method is composed of six steps: 1) dataset preparation, 2) time frame creation, 3) automatic identification, 4) visualization of change points, 5) interpretation, and 6) manual correction. Accuracy was tested. In general, the MATCHED method proved powerful in identifying the change points between migration statuses as well as stopovers. Nevertheless, identifying "exact" threshold dates is still challenging. Limitations and applications of this method are discussed.
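
    As a hypothetical illustration of automatic change-point identification on tracking data (not the published MATCHED implementation; threshold values and names are assumptions), candidate threshold dates can be flagged wherever the smoothed day-to-day displacement crosses a movement threshold, so that runs of "moving" days separate stationary periods such as wintering, breeding, and stopovers.

        # Hypothetical sketch: flag candidate threshold dates from daily positions.
        import numpy as np

        def candidate_threshold_dates(lat, lon, move_km=25.0, window=3):
            """lat, lon: arrays of daily fixes in degrees; returns indices where the
            smoothed daily displacement crosses move_km."""
            R = 6371.0  # mean Earth radius, km
            p1 = np.radians([lat[:-1], lon[:-1]])
            p2 = np.radians([lat[1:], lon[1:]])
            dphi, dlmb = p2[0] - p1[0], p2[1] - p1[1]
            a = (np.sin(dphi / 2) ** 2
                 + np.cos(p1[0]) * np.cos(p2[0]) * np.sin(dlmb / 2) ** 2)
            step_km = 2 * R * np.arcsin(np.sqrt(a))      # haversine displacement
            smooth = np.convolve(step_km, np.ones(window) / window, mode="same")
            moving = smooth > move_km
            # a candidate change point is any day where the state flips
            return np.flatnonzero(np.diff(moving.astype(int)) != 0) + 1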

  15. Development of the residential case-specular epidemiologic investigation method. Final report

    SciTech Connect

    Zaffanella, L.E.; Savitz, D.A.

    1995-11-01

    The residential case-specular method is an innovative approach to epidemiologic studies of the association between wire codes and childhood cancer. This project was designed to further the development of the residential case-specular method, which seeks to help resolve the "wire code paradox". For years, wire codes have been used as surrogate measures of past electric and magnetic field (EMF) exposure. There is a magnetic field hypothesis that suggests childhood cancer is associated with exposure to magnetic fields, with wire codes as a proxy for these fields. The neighborhood hypothesis suggests that childhood cancer is associated with neighborhood characteristics and exposures other than magnetic fields, with wire codes as a proxy for these characteristics and exposures. The residential case-specular method was designed to discriminate between the magnetic field and the neighborhood hypothesis. Two methods were developed for determining the specular of a residence. These methods were tested with 400 randomly selected residences. The main advantage of the residential case-specular method is that it may efficiently confirm or eliminate the suspicion that control selection bias or confounding by neighborhood factors affected the results of case-control studies of childhood cancer and magnetic fields. The method may be applicable to both past and ongoing studies. The main disadvantage is that the method is untried. Consequently, further work is required to verify its validity and to ensure that sufficient statistical power can be obtained in a cost-effective manner.

  16. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but scientists have not truly focused on the quantitative performance of this technique. In order to prove the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method: from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization demonstrates a guarantee of quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even if UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC).

  17. Development of computational methods for unsteady aerodynamics at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Yates, E. Carson, Jr.; Whitlow, Woodrow, Jr.

    1987-01-01

    The current scope, recent progress, and plans for research and development of computational methods for unsteady aerodynamics at the NASA Langley Research Center are reviewed. Both integral equations and finite difference methods for inviscid and viscous flows are discussed. Although the great bulk of the effort has focused on finite difference solution of the transonic small perturbation equation, the integral equation program is given primary emphasis here because it is less well known.

  18. Development of computational methods for unsteady aerodynamics at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Yates, E. Carson, Jr.; Whitlow, Woodrow, Jr.

    1987-01-01

    The current scope, recent progress, and plans for research and development of computational methods for unsteady aerodynamics at the NASA Langley Research Center are reviewed. Both integral-equation and finite-difference methods for inviscid and viscous flows are discussed. Although the great bulk of the effort has focused on finite-difference solution of the transonic small-perturbation equation, the integral-equation program is given primary emphasis here because it is less well known.

  19. Development of Theoretical Methods for Predicting Solvent Effects on Reaction Rates in Supercritical Water Oxidation Processes

    DTIC Science & Technology

    2007-11-02

    Final report covering 7/1/99 - 12/31/02. The indexed record text is fragmentary: the project developed theoretical and computational methods for predicting how reaction rate constants vary with thermodynamic condition in supercritical water (SCW), and it cites a manuscript in preparation by Tucker and co-workers, "Examination of Nonequilibrium Solvent Effects on an SN2 Reaction in Supercritical Water" (first author R. Behera; the remaining author list is truncated in the record).

  20. Human-System Safety Methods for Development of Advanced Air Traffic Management Systems

    SciTech Connect

    Nelson, W.R.

    1999-05-24

    The Idaho National Engineering and Environmental Laboratory (INEEL) is supporting the National Aeronautics and Space Administration in the development of advanced air traffic management (ATM) systems as part of the Advanced Air Transportation Technologies program. As part of this program INEEL conducted a survey of human-system safety methods that have been applied to complex technical systems, to identify lessons learned from these applications and provide recommendations for the development of advanced ATM systems. The domains that were surveyed included offshore oil and gas, commercial nuclear power, commercial aviation, and military. The survey showed that widely different approaches are used in these industries, and that the methods used range from very high-level, qualitative approaches to very detailed quantitative methods such as human reliability analysis (HRA) and probabilistic safety assessment (PSA). In addition, the industries varied widely in how effectively they incorporate human-system safety assessment in the design, development, and testing of complex technical systems. In spite of the lack of uniformity in the approaches and methods used, it was found that methods are available that can be combined and adapted to support the development of advanced air traffic management systems.

  1. Method of Power System Sustainable Development Optimization in Liberalized Market Conditions

    NASA Astrophysics Data System (ADS)

    Turcik, M.; Oleinikova, I.; Krishans, Z.

    2011-01-01

    The paper is focused on the development of the Baltic Sea region, taking into account the new EU energy policy. The authors elucidate the current situation and the power system infrastructure projects of the region. For the economic analysis and optimization of the development plans, a method is proposed that takes into account the outlook for the upcoming 20-50 years and the uncertainty of the initial information. The method makes possible the estimation of the technical and economic state, including market conditions, for a given power system.

  2. The development of field-based measurement methods for radioactive fallout assessment.

    PubMed

    Miller, Kevin M; Larsen, Richard J

    2002-05-01

    An overview is provided on the development of field equipment, instrument systems, and methods of analyses that were used to assess the impact of radioactive fallout from atmospheric weapons tests. Included in this review are developments in fallout collection, aerosols measurements in surface air, and high-altitude sampling with aircraft and balloons. In addition, developments in radiation measurements are covered in such areas as survey and monitoring instruments, in situ gamma-ray spectrometry, and aerial measurement systems. The history of these developments and the interplay with the general advances in the field of radiation and radioactivity metrology are highlighted. An emphasis is given as to how the modifications and improvements in the instruments and methods over time led to their adaptation to present-day applications to radiation and radioactivity measurements.

  3. Development of Sequential Optimization Method for CNC Turning Based on In-Process Tool Wear Monitoring

    NASA Astrophysics Data System (ADS)

    Moriwaki, Toshimichi; Tangjitsitcharoen, Somkiat; Shibasaka, Toshiroh

    A system and procedures are developed to optimize the cutting speed for CNC turning. The current amount of tool wear is estimated from the in-process cutting force measurement by applying the method developed and reported previously. Once the tool wear is estimated for the different cutting speeds, the coefficients of Taylor's tool life equation are determined or successively modified based on the estimated tool wear data. The optimum cutting speed is then obtained by referring to the criterion of either minimum production cost or maximum production rate. The system is applied to actual turning of carbon steel with coated carbide tools, and it has been proved that the system runs satisfactorily. The method developed here can be readily applied to unknown combinations of work material and tool, as it searches for the optimum cutting conditions automatically while the process is going on.
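
    For illustration only (textbook machining-economics relations are assumed here; the symbols and example numbers are not taken from the record): once the Taylor coefficients n and C in V*T**n = C have been estimated from the in-process wear data, the economic tool life and the corresponding optimum cutting speed follow directly.

        # Hypothetical sketch: optimum cutting speed from Taylor's tool-life equation.
        def optimum_cutting_speed(n, C, tool_change_time, tool_cost_per_edge,
                                  machine_rate, criterion="min_cost"):
            """Return (optimum tool life, optimum cutting speed) in the units of C.

            machine_rate       : machine + operator cost per unit time
            tool_cost_per_edge : tooling cost per cutting edge
            tool_change_time   : time needed to index or replace the edge
            """
            if criterion == "min_cost":
                T_opt = (1.0 / n - 1.0) * (tool_change_time
                                           + tool_cost_per_edge / machine_rate)
            else:  # maximum production rate ignores the tooling cost
                T_opt = (1.0 / n - 1.0) * tool_change_time
            return T_opt, C / T_opt ** n

        # e.g. n = 0.25, C = 400 (V in m/min, T in min), 2 min tool change,
        # tooling cost of 4 cost units per edge, machine rate of 1 unit/min
        print(optimum_cutting_speed(0.25, 400.0, 2.0, 4.0, 1.0))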

  4. Development, testing, and application of a new Multi-Receptor (MURA) Trajectory Source Apportionment (TSA) method

    NASA Astrophysics Data System (ADS)

    Lee, Stephanie J.

    Trajectory Source Apportionment (TSA) methods are statistical techniques used to identify sources of pollution at a sampling site (receptor). TSA methods have traditionally been applied to a single receptor (Ashbaugh et al., 1985; Seibert et al., 1994; Lui et al., 2003), with some exploration of using more than one receptor (Stohl, 1996; Zeng and Hopke, 1989). A new Multi-Receptor (MURA) method was developed here. It utilizes a two-step process that first identifies Potential Source Regions (PSRs) and then examines them to see how often they affect each receptor. The MURA method was first tested against the conditional probability method developed by Ashbaugh et al. (1985) to determine each method's ability to detect known sources. Two artificial data sets were used, one containing a single source and one containing four sources; the MURA method outperformed the conditional probability method. Next, the MURA method was compared to an improved version of the conditional probability method (SIRA). This test utilized three sets of artificial data in the western and eastern U.S. Although the SIRA method was an improvement over the conditional probability method, the MURA method still performed better in the four-source simulation located in the western United States, while in the two eastern simulations the MURA and SIRA methods performed similarly. The third test evaluated the impact of trajectory starting heights from 10 m to 500 m on the MURA method using the three simulations from the SIRA comparison. In the western simulation, the starting height had little to no impact on the accuracy of the method; in the two eastern simulations, the 10 m, 50 m, and 250 m starting heights performed more consistently over both simulations. The MURA method was then applied to two groups of IMPROVE receptors to identify sources of sulfate and nitrate. The southwest, the western Great Plains, and the eastern Midwest affect the south central United States group for high sulfate or nitrate
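
    The sketch below illustrates the single-receptor conditional-probability baseline in the spirit of Ashbaugh et al. (1985), against which the MURA method is compared; it is not the MURA algorithm itself, and the grid and variable names are hypothetical. Grid cells visited disproportionately often by back-trajectories arriving on high-concentration days are flagged as potential source regions, and a multi-receptor extension would intersect such maps across receptors.

        # Hypothetical sketch: residence-time conditional probability field.
        import numpy as np

        def conditional_probability_field(traj_lat, traj_lon, conc, threshold,
                                          lat_edges, lon_edges):
            """traj_lat, traj_lon : lists of back-trajectory point arrays, one per sample
            conc : receptor concentration for each sample; threshold : 'high' cut-off"""
            shape = (len(lat_edges) - 1, len(lon_edges) - 1)
            n_all = np.zeros(shape)     # trajectory points per cell, all samples
            n_high = np.zeros(shape)    # trajectory points per cell, high samples only
            for la, lo, c in zip(traj_lat, traj_lon, conc):
                h, _, _ = np.histogram2d(la, lo, bins=[lat_edges, lon_edges])
                n_all += h
                if c > threshold:
                    n_high += h
            with np.errstate(invalid="ignore", divide="ignore"):
                return np.where(n_all > 0, n_high / n_all, np.nan)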

  5. A method for developing biomechanical response corridors based on principal component analysis.

    PubMed

    Sun, W; Jin, J H; Reed, M P; Gayzik, F S; Danelson, K A; Bass, C R; Zhang, J Y; Rupp, J D

    2016-10-03

    The standard method for specifying target responses for human surrogates, such as crash test dummies and human computational models, involves developing a corridor based on the distribution of a set of empirical mechanical responses. These responses are commonly normalized to account for the effects of subject body shape, size, and mass on impact response. Limitations of this method arise from the normalization techniques, which are based on the assumption that human geometry scales linearly with size and, in some cases, on simple mechanical models. To address these limitations, a new method was developed for corridor generation that applies principal component (PC) analysis to align response histories. Rather than use normalization techniques to account for the effects of subject size on impact response, linear regression models are used to model the relationship between PC features and subject characteristics. Corridors are generated using Monte Carlo simulation based on estimated distributions of PC features for each PC. This method is applied to pelvis impact force data from a recent series of lateral impact tests to develop corridor bounds for a group of signals associated with a particular subject size. Compared with the two most common methods for response normalization, the corridors generated by the new method are narrower and better retain the features in the signals that are related to subject size and body shape.
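
    A minimal sketch of the overall idea follows (hypothetical names; the paper regresses PC features on subject characteristics, whereas the sketch simply samples independent normal PC scores): aligned response histories are decomposed into principal-component scores, the scores are resampled by Monte Carlo, signals are reconstructed, and percentile envelopes give the corridor bounds.

        # Hypothetical sketch: PCA-based Monte Carlo corridor generation.
        import numpy as np

        def pc_corridor(signals, n_pc=5, n_mc=2000, lo=5, hi=95, seed=0):
            """signals: (n_subjects, n_time) array of time-aligned response histories."""
            rng = np.random.default_rng(seed)
            mean = signals.mean(axis=0)
            X = signals - mean
            U, s, Vt = np.linalg.svd(X, full_matrices=False)  # rows of Vt are PC shapes
            scores = U[:, :n_pc] * s[:n_pc]                   # PC features per subject
            mu = scores.mean(axis=0)
            sd = scores.std(axis=0, ddof=1)
            draws = rng.normal(mu, sd, size=(n_mc, n_pc))     # Monte Carlo PC features
            sims = mean + draws @ Vt[:n_pc]                   # reconstructed histories
            return np.percentile(sims, [lo, hi], axis=0)      # lower and upper bounds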

  6. Development of laser-ion beam photodissociation methods. Progress report, December 1, 1992--November 30, 1993

    SciTech Connect

    Russell, D.H.

    1992-08-01

    Research efforts were concentrated on developing the tandem magnetic sector (EB)/reflection-time-of-flight (TOF) instrument, preliminary experiments with tandem TOF/TOF instruments, developing methods for performing photodissociation with pulsed lasers, experiments with laser ionization of aerosol particles, matrix-assisted laser desorption ionization (MALDI), and ion-molecule reaction chemistry of ground- and excited-state transition metal ions. This progress report is divided into three areas: photodissociation, MALDI (including aerosols), and ion chemistry fundamentals.

  7. Improving Protection against Viral Aerosols Through Development of Novel Decontamination Methods and Characterization of Viral Aerosol

    DTIC Science & Technology

    2012-04-01

    Technical paper (thesis), report AFRL-RX-TY-TP-2012-0040, dated 16 April 2012 and covering 15 September 2007 to 30 April 2012. The indexed record text is fragmentary: it reports that artificial saliva (AS) and beef serum extract (BE) media produce a protective effect against UV compared with deionized (DI) water; the remark on relative humidity (RH) is cut off.

  8. Development of a simplified analytical method for representing material cyclic response

    NASA Technical Reports Server (NTRS)

    Moreno, V.

    1983-01-01

    Development of a simplified method for estimating structural inelastic stress and strain response to cyclic thermal loading is presented. The method assumes that the high-temperature structural response is the sum of a time-independent plastic component and a time-dependent elastic/creep component. The local structural stress and strain response predicted by linear elastic analysis is modified by the simplified method to predict the inelastic response. The results are compared with simulations by a nonlinear finite element analysis that used time-independent plasticity and a unified time-dependent material model.

  9. [Biological safety in dentistry: development of a useful method for quality control of sterilization].

    PubMed

    Arancegui, N; Lucena, P H

    1994-01-01

    A total of 534 autoclaves from dental offices in Rosario were controlled by a method developed in our laboratory, which consists of: a) a procedures instruction; b) a survey; c) a colorimetric control; d) a biological control. With this method it is possible to detect faults in autoclave function in a single step. The results showed that 86.90% of the autoclaves lacked thermometers, 76.60% lacked manual thermostats, 83.33% were automatic, and 58.80% did not sterilize. It can be concluded that periodic control by this method is necessary, that commercial quality control of the sterilizers is important, and that continuing education on biosafety concepts is urgently needed.

  10. [Development method of healthcare information system integration based on business collaboration model].

    PubMed

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is key to hospital information construction because of the complexity of the healthcare environment. Currently, during healthcare information system integration, the people participating in an integration project usually communicate through free-format documents, which impairs the efficiency and adaptability of the integration. A method utilizing business process model and notation (BPMN) to model the integration requirement and automatically transform it into an executable integration configuration is proposed in this paper. Based on the method, a tool was developed to model the integration requirement and transform it into the integration configuration. In addition, an integration case in a radiology scenario was used to verify the method.

  11. Formic Acid: Development of an Analytical Method and Use as Process Indicator in Anaerobic Systems

    DTIC Science & Technology

    1992-03-01

    AD-A250 668. Formic Acid: Development of an Analytical Method and Use as a Process Indicator in Anaerobic Systems. A Special Research Problem report by Sharon L. Perkins, 1992. The remainder of the scanned cover-page text in the record is illegible.

  12. Development of a structured observational method for the systematic assessment of school food-choice architecture.

    PubMed

    Ozturk, Orgul D; McInnes, Melayne M; Blake, Christine E; Frongillo, Edward A; Jones, Sonya J

    2016-01-01

    The objective of this study is to develop a structured observational method for the systematic assessment of the food-choice architecture that can be used to identify key points for behavioral economic intervention intended to improve the health quality of children's diets. We use an ethnographic approach with observations at twelve elementary schools to construct our survey instrument. Elements of the structured observational method include decision environment, salience, accessibility/convenience, defaults/verbal prompts, number of choices, serving ware/method/packaging, and social/physical eating environment. Our survey reveals important "nudgeable" components of the elementary school food-choice architecture, including precommitment and default options on the lunch line.

  13. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. A quantitative and simple distortion evaluation method is therefore needed by both the endoscope industry and medical device regulatory agencies, but no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models that are difficult to understand. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion, and based on it we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning over the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
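
    One plausible reading of the local-magnification idea, offered only as an illustration and not as the exact formulas of the method: ML at each point of a dot-grid target is the measured image spacing of neighbouring dots divided by the known object spacing, and a distortion figure follows by comparing ML across the field with ML at the image centre.

        # Hypothetical sketch: local magnification map from a detected dot grid.
        import numpy as np

        def local_magnification(img_pts, obj_spacing):
            """img_pts: (rows, cols, 2) detected dot centres in image units;
            obj_spacing: dot pitch of the physical target in object units."""
            dx = np.linalg.norm(img_pts[:, 1:] - img_pts[:, :-1], axis=-1)  # to right neighbour
            dy = np.linalg.norm(img_pts[1:, :] - img_pts[:-1, :], axis=-1)  # to lower neighbour
            return 0.5 * (dx[:-1, :] + dy[:, :-1]) / obj_spacing

        def percent_distortion(ml):
            """Distortion relative to the local magnification at the image centre."""
            centre = ml[ml.shape[0] // 2, ml.shape[1] // 2]
            return 100.0 * (ml - centre) / centre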

  14. Challenges in the analytical method development and validation for an unstable active pharmaceutical ingredient.

    PubMed

    Sajonz, Peter; Wu, Yan; Natishan, Theresa K; McGachy, Neil T; Detora, David

    2006-03-01

    A sensitive high-performance liquid chromatography (HPLC) impurity profile method for the antibiotic ertapenem is developed and subsequently validated. The method utilizes an Inertsil phenyl column at ambient temperature, gradient elution with aqueous sodium phosphate buffer at pH 8, and acetonitrile as the mobile phase. The linearity, method precision, method ruggedness, limit of quantitation, and limit of detection of the impurity profile HPLC method are found to be satisfactory. The method is determined to be specific, as judged by resolving ertapenem from in-process impurities in crude samples and degradation products that arise from solid state thermal and light stress, acid, base, and oxidative stressed solutions. In addition, evidence is obtained by photodiode array detection studies that no degradate or impurity having a different UV spectrum coeluted with the major component in stressed or unstressed samples. The challenges during the development and validation of the method are discussed. The difficulties of analyzing an unstable active pharmaceutical ingredient (API) are addressed. Several major impurities/degradates of the API have very different UV response factors from the API. These impurities/degradates are synthesized or prepared by controlled degradation and the relative response factors are determined.

  15. Development of a test method for carbonyl compounds from stationary source emissions

    SciTech Connect

    Zhihua Fan; Peterson, M.R.; Jayanty, R.K.M.

    1997-12-31

    Carbonyl compounds have received increasing attention because of their important role in ground-level ozone formation. The common method used for the measurement of aldehydes and ketones is 2,4-dinitrophenylhydrazine (DNPH) derivatization followed by high-performance liquid chromatography with ultraviolet detection (HPLC-UV). One of the problems associated with this method is the low recovery of certain compounds such as acrolein. This paper presents a study on the development of a test method for the collection and measurement of carbonyl compounds from stationary source emissions. The method involves collection of carbonyl compounds in impingers, conversion of the carbonyl compounds to a stable derivative with O-2,3,4,5,6-pentafluorobenzyl hydroxylamine hydrochloride (PFBHA), and separation and measurement by electron capture gas chromatography (GC-ECD). Eight compounds were selected for the evaluation of this method: formaldehyde, acetaldehyde, acrolein, acetone, butanal, methyl ethyl ketone (MEK), methyl isobutyl ketone (MIBK), and hexanal.

  16. Development of a new method for determination of total haem protein in fish muscle.

    PubMed

    Chaijan, Manat; Undeland, Ingrid

    2015-04-15

    Using classic haem protein quantification methods, the extraction step in buffer or acid acetone often becomes limiting if muscle is oxidised and/or stored; haem-proteins then tend to bind to muscle components like myofibrils and/or biomembranes. The objective of this study was to develop a new haem protein determination method for fish muscle overcoming such extractability problems. The principle was to homogenise and heat samples in an SDS-containing phosphate buffer to dissolve major muscle components and convert ferrous/ferric haem proteins to hemichromes with a unique absorption peak at 535 nm. Hb-recovery tests with the new and classic methods showed that the new method and Hornsey's method performed significantly better on fresh Hb-enriched cod mince than Brown's and Drabkin's methods; recovery was ⩾98%. However, in highly oxidised samples and in cod protein isolates made with acid pH-shift processing, the new method performed better than Hornsey's method (63% and 87% vs. 50% and 68% recovery). Further, the new method performed well in fish muscle with ⩽30% lipid, <5% NaCl and pH 5.5-7.0; it was also unaffected by freezing/frozen storage.

  17. Development of a sampling method for qualification of a ceramic high-level waste form.

    SciTech Connect

    O'Holleran, T. P.

    2002-07-02

    A ceramic waste form has been developed to immobilize the salt waste stream from electrometallurgical treatment of spent nuclear fuel. The ceramic waste form was originally prepared in a hot isostatic press (HIP). Small HIP capsules called witness tubes were used to obtain representative samples of material for process monitoring, waste form qualification, and archiving. Since installation of a full-scale HIP in existing facilities proved impractical, a new fabrication process was developed. This process fabricates waste forms inside a stainless steel container using a conventional furnace. Progress in developing a new method of obtaining representative samples is reported.

  18. Chemometric approach for development, optimization, and validation of different chromatographic methods for separation of opium alkaloids.

    PubMed

    Acevska, J; Stefkov, G; Petkovska, R; Kulevanova, S; Dimitrovska, A

    2012-05-01

    The excessive and continuously growing interest in the simultaneous determination of poppy alkaloids calls for the development and optimization of convenient high-throughput methods for assessing the qualitative and quantitative alkaloid profile of poppy straw. Systematic optimization of two chromatographic methods (gas chromatography (GC)/flame ionization detector (FID)/mass spectrometry (MS) and reversed-phase (RP)-high-performance liquid chromatography (HPLC)/diode array detector (DAD)) for the separation of alkaloids from Papaver somniferum L. (Papaveraceae) was carried out. The effects of various conditions on the predefined chromatographic descriptors were investigated using chemometrics. A full factorial linear design of experiments was used to determine the relationship between chromatographic conditions and the retention behavior of the analytes, and a central composite circumscribed design was utilized for the final method optimization. By conducting the optimization of the methods in this rational manner, a great deal of excessive and unproductive laboratory work was avoided. The developed chromatographic methods were validated and compared in terms of resolving power, sensitivity, accuracy, speed, cost, ecological aspects, and compatibility with the poppy straw extraction procedure. The separation of the opium alkaloids using the GC/FID/MS method was achieved within 10 min, avoiding any derivatization step. This method has stronger resolving power, a shorter analysis time, and a better cost-effectiveness factor than the RP-HPLC/DAD method, and it is in line with the "green" trend in analysis; the RP-HPLC/DAD method, on the other hand, displayed better sensitivity for all tested alkaloids. The proposed methods provide both fast screening and an accurate content assessment of the six alkaloids in the poppy samples obtained from the selection program of Papaver strains.
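
    For illustration, a short sketch of generating the coded points of a central composite circumscribed design of the kind used for the final optimization; the number of factors and the centre-point count are hypothetical.

        # Hypothetical sketch: coded central composite circumscribed (CCC) design.
        import itertools
        import numpy as np

        def ccc_design(n_factors, n_center=3):
            """Return coded points: two-level factorial core, +/-alpha axial, centre."""
            alpha = (2 ** n_factors) ** 0.25    # axial distance for a rotatable design
            factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=n_factors)))
            axial = np.zeros((2 * n_factors, n_factors))
            for i in range(n_factors):
                axial[2 * i, i] = -alpha
                axial[2 * i + 1, i] = alpha
            center = np.zeros((n_center, n_factors))
            return np.vstack([factorial, axial, center])

        # e.g. three coded factors (such as temperature, buffer pH, organic fraction)
        print(ccc_design(3).round(3))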

  19. Report: Decline In EPA Particulate Matter Methods Development Activities May Hamper Timely Achievement of Program Goals

    EPA Pesticide Factsheets

    Report #2003-P-00016, September 30, 2003. EPA has not supported PM2.5 methods development activities to the extent necessary to fully achieve the short- and long-range goals of the PM2.5 program in a timely manner.

  20. Development of Assimilation Methods for Near-Shore Spectral Wave Models

    DTIC Science & Technology

    2001-09-30

    The indexed record text is fragmentary. The proposed program develops the methods necessary to assimilate in-situ observations of the near-shore wave spectrum; the assimilation procedure is based on a variational approach. Work completed during FY 01 included a variational approach for assimilating the non-directional frequency spectrum.

  1. A Method for User Centering Systematic Product Development Aimed at Industrial Design Students

    ERIC Educational Resources Information Center

    Coelho, Denis A.

    2010-01-01

    Instead of limiting the introduction and stimulus for new concept creation to lists of specifications, industrial design students seem to prefer to be encouraged by ideas in context. A new method that specifically tackles human activity to foster the creation of user centered concepts of new products was developed and is presented in this article.…

  2. Ventilation of mines developed by the combined method of coal mining

    NASA Astrophysics Data System (ADS)

    Senkus, Val V.; Ermakov, A. Yu; Senkus, V. V.

    2016-10-01

    The paper considers the features of ventilation of mines which are developed by the combined method of coal mining. It also provides recommendations for placing the flank and central ventilation holes while mining flat and steep seams from open pit sides, as well as anticlinal and synclinal deposits.

  3. Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking

    ERIC Educational Resources Information Center

    Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla

    2014-01-01

    In this study, we investigate pre-service science and technology teachers' opinions about the methods that develop reflective thinking and determine their level of reflective thinking. This is a descriptive study. Open-ended questions were used to determine the views of the pre-service teachers. Questions used in the statistical analysis of…

  4. Functional Assessment: A Method for Developing Classroom-Based Accommodations and Interventions for Children with ADHD.

    ERIC Educational Resources Information Center

    Reid, Robert; Maag, John W.

    1998-01-01

    Describes functional assessment as a method teachers can use to develop classroom accommodations and interventions for children with attention deficit hyperactivity disorder (ADHD). Presents a rationale for adopting a functional approach and describes the basic stages and steps of functional assessment and various accommodations and interventions…

  5. Sequential Pattern Analysis: Method and Application in Exploring How Students Develop Concept Maps

    ERIC Educational Resources Information Center

    Chiu, Chiung-Hui; Lin, Chien-Liang

    2012-01-01

    Concept mapping is a technique that represents knowledge in graphs. It has been widely adopted in science education and cognitive psychology to aid learning and assessment. To realize the sequential manner in which students develop concept maps, most research relies upon human-dependent, qualitative approaches. This article proposes a method for…

  6. Development of Preservice Teachers' Value Orientations during a Secondary Methods Course and Early Field Experience

    ERIC Educational Resources Information Center

    Sofo, Seidu; Curtner-Smith, Matthew D.

    2010-01-01

    Few studies have examined the value orientations of physical education preservice teachers (PTs). The purposes of this study were to: (1) describe the extent to which one cohort of PTs' value orientations changed and developed during a secondary methods course and early field experience (EFE); and (2) determine why PTs' value orientations changed…

  7. An Observational Analysis of Coaching Behaviors for Career Development Event Teams: A Mixed Methods Study

    ERIC Educational Resources Information Center

    Ball, Anna L.; Bowling, Amanda M.; Sharpless, Justin D.

    2016-01-01

    School Based Agricultural Education (SBAE) teachers can use coaching behaviors, along with their agricultural content knowledge to help their Career Development Event (CDE) teams succeed. This mixed methods, collective case study observed three SBAE teachers preparing multiple CDEs throughout the CDE season. The teachers observed had a previous…

  8. Development of ISO/IEC 29112: test charts and methods for measuring monochrome printer resolution

    NASA Astrophysics Data System (ADS)

    Zeise, Eric K.; Kim, Sang Ho; Sigg, Franz

    2010-01-01

    Several measurable image quality attributes contribute to the perceived resolution of a printing system. These contributing attributes include addressability, sharpness, raggedness, spot size, and detail rendition capability. This paper summarizes the development of evaluation methods that will become the basis of ISO 29112, a standard for the objective measurement of monochrome printer resolution.

  9. Development of a new noninvasive method to determine the integrity of bone in vivo

    NASA Technical Reports Server (NTRS)

    Saha, S.

    1980-01-01

    An electromagnetic sensor for monitoring elastic waves in bone was developed. It does not require the use of traction pins and the output is not affected by soft tissue properties, a difficulty commonly encountered when using ultrasonic and vibration methods to determine in vivo properties of bone.

  10. 2D and 3D Method of Characteristic Tools for Complex Nozzle Development

    NASA Technical Reports Server (NTRS)

    Rice, Tharen

    2003-01-01

    This report details the development of a 2D and 3D Method of Characteristic (MOC) tool for the design of complex nozzle geometries. These tools are GUI driven and can be run on most Windows-based platforms. The report provides a user's manual for these tools as well as explains the mathematical algorithms used in the MOC solutions.

  11. Connecting Practice, Theory and Method: Supporting Professional Doctoral Students in Developing Conceptual Frameworks

    ERIC Educational Resources Information Center

    Kumar, Swapna; Antonenko, Pavlo

    2014-01-01

    From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…

  12. The slice culture method for following development of tooth germs in explant culture.

    PubMed

    Alfaqeeh, Sarah A; Tucker, Abigail S

    2013-11-13

    Explant culture allows manipulation of developing organs at specific time points and is therefore an important method for the developmental biologist. For many organs it is difficult to access developing tissue to allow monitoring during ex vivo culture. The slice culture method allows access to tissue so that morphogenetic movements can be followed and specific cell populations can be targeted for manipulation or lineage tracing. In this paper we describe a method of slice culture that has been very successful for culture of tooth germs in a range of species. The method provides excellent access to the tooth germs, which develop at a similar rate to that observed in vivo, surrounded by the other jaw tissues. This allows tissue interactions between the tooth and surrounding tissue to be monitored. Although this paper concentrates on tooth germs, the same protocol can be applied to follow development of a number of other organs, such as salivary glands, Meckel's cartilage, nasal glands, tongue, and ear.

  13. Development of an MCG/MEG system for small animals and its noise reduction method

    NASA Astrophysics Data System (ADS)

    Miyamoto, M.; Kawai, J.; Adachi, Y.; Haruta, Y.; Komamura, K.; Uehara, G.

    2008-02-01

    Accurate capture of biomagnetic signals from a rat or a mouse greatly benefits the development of new medicine and pathology. In order to improve the efficiency and accuracy of biomagnetic measurement of small animals, we developed a biomagnetic measurement system specific to small-animal measurement. A superconducting quantum interference device (SQUID) sensor array and a table for the system were newly developed and integrated into a transportable chassis, 1.3 m wide × 0.7 m deep × 1.8 m high, housing all principal components of the system. The integrated 9-channel low-Tc SQUID magnetometer array, designed to improve spatial resolution, covers an 8 mm × 8 mm measurement area. We have also developed a real-time noise-canceling method suitable for this system; its advantage is that the noise reduction process is carried out in real time. We confirmed the efficacy of this method using the measurement system installed in a typical laboratory environment, where the noise reduction was roughly 16 dB at the power-line frequency and its harmonics. We measured a magnetocardiogram (MCG) of a mouse using the system with the real-time noise-canceling method, and the feasibility of small-animal MCG measurement was demonstrated.
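
    The record does not spell out the cancellation algorithm, so the sketch below shows one common real-time choice (an assumption, not the authors' method): an LMS adaptive filter that predicts the interference seen by the signal channel from reference magnetometer channels and subtracts it sample by sample; the channel layout and step size are hypothetical.

        # Hypothetical sketch: reference-channel LMS adaptive noise cancellation.
        import numpy as np

        def lms_cancel(signal, references, mu=1e-3):
            """signal: (n_samples,) measurement channel; references: (n_samples, n_ref)
            noise reference channels placed away from the animal; mu: LMS step size."""
            n, n_ref = references.shape
            w = np.zeros(n_ref)                 # adaptive filter weights
            cleaned = np.empty(n)
            for t in range(n):
                x = references[t]
                e = signal[t] - w @ x           # error = signal minus predicted noise
                w += 2.0 * mu * e * x           # LMS weight update
                cleaned[t] = e
            return cleaned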

  14. Development of a Solid Phase Extraction Method for Agricultural Pesticides in Large-Volume Water Samples

    EPA Science Inventory

    An analytical method using solid phase extraction (SPE) and analysis by gas chromatography/mass spectrometry (GC/MS) was developed for the trace determination of a variety of agricultural pesticides and selected transformation products in large-volume high-elevation lake water sa...

  15. Development and validation of a Daphnia magna four-day survival and growth test method

    EPA Science Inventory

    Zooplankton are an important part of the aquatic ecology of all lakes and streams. As a result, numerous methods have been developed to assess the quality of waterbodies using various zooplankton species. Included in these is the freshwater species Daphnia magna. Current test me...

  16. Developing Critical Understanding in HRM Students: Using Innovative Teaching Methods to Encourage Deep Approaches to Study

    ERIC Educational Resources Information Center

    Butler, Michael J. R.; Reddy, Peter

    2010-01-01

    Purpose: This paper aims to focus on developing critical understanding in human resource management (HRM) students in Aston Business School, UK. The paper reveals that innovative teaching methods encourage deep approaches to study, an indicator of students reaching their own understanding of material and ideas. This improves student employability…

  17. Conference report: the 5th cell-based assay and bioanalytical method development conference.

    PubMed

    Ma, Mark

    2011-01-01

    Approximately 80 participants met at the Marriot Hotel, San Francisco, CA, USA, between the 4th and 6th October 2010 to share novel techniques and discuss the emerging approaches in the evolving field of cell-based assay and bioanalytical method development. This report highlights the discussion and summary of the meeting.

  18. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY HEPATITIS E VIRUS IN WATER

    EPA Science Inventory

    Hepatitis E virus (HEV) causes an infectious form of hepatitis associated with contaminated water. By analyzing the sequence of several HEV isolates, a reverse transciption-polymerase chain reaction method was developed and optimized that should be able to identify all of the kn...

  19. A Mixed Methods Analysis of Learning in Online Teacher Professional Development: A Case Report

    ERIC Educational Resources Information Center

    Lebec, Michael; Luft, Julie

    2007-01-01

    Web-based learning has been proposed as a convenient way to provide professional development experiences. Despite quantitative evidence that online instruction is equivalent to traditional methods (Russell, 2001), the efficiency of this approach has not been extensively studied among teachers. This case report describes learning in an online…

  20. History and Development of the Schmidt-Hunter Meta-Analysis Methods

    ERIC Educational Resources Information Center

    Schmidt, Frank L.

    2015-01-01

    In this article, I provide answers to the questions posed by Will Shadish about the history and development of the Schmidt-Hunter methods of meta-analysis. In the 1970s, I headed a research program on personnel selection at the US Office of Personnel Management (OPM). After our research showed that validity studies have low statistical power, OPM…

  1. Effect of storage method on manure as a substrate for filth fly development

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Numerous studies have been conducted using manure as a substrate for filth fly development. In these experiments, the manure is sometimes frozen for use at a later date. The purpose of this study was to determine the effects of various manure storage methods on subsequent house and stable fly develo...

  2. 2005 Nobel Prize in Chemistry: Development of the Olefin Metathesis Method in Organic Synthesis

    ERIC Educational Resources Information Center

    Casey, Charles P.

    2006-01-01

    The 2005 Nobel Prize in Chemistry was awarded "for the development of the metathesis method in organic synthesis". The discoveries of the laureates provided a chemical reaction used daily in the chemical industry for the efficient and more environmentally friendly production of important pharmaceuticals, fuels, synthetic fibers, and many other…

  3. Some recent developments of the discontinuous Galerkin method for time-domain electromagnetics

    NASA Astrophysics Data System (ADS)

    Durochat, C.; Lanteri, S.; Moya, L.; Viquerat, J.; Descombes, S.; Scheid, C.

    2012-09-01

    We discuss recent developments aiming at improving the accuracy, the flexibility, and the computational performance of a high-order discontinuous Galerkin time-domain (DGTD) method for the simulation of time-domain electromagnetic wave propagation problems involving irregularly shaped objects and complex propagation media.

  4. Development of a Neuroscience-Oriented "Methods" Course for Graduate Students of Pharmacology and Toxicology

    ERIC Educational Resources Information Center

    Surratt, Christopher K.; Witt-Enderby, Paula A.; Johnson, David A.; Anderson, Carl A.; Bricker, J. Douglas; Davis, Vicki L.; Firestine, Steven M.; Meng, Wilson S.

    2006-01-01

    To provide graduate students in pharmacology/toxicology exposure to, and cross-training in, a variety of relevant laboratory skills, the Duquesne University School of Pharmacy developed a "methods" course as part of the core curriculum. Because some of the participating departmental faculty are neuroscientists, this course often applied…

  5. Development of the CODER System: A Testbed for Artificial Intelligence Methods in Information Retrieval.

    ERIC Educational Resources Information Center

    Fox, Edward A.

    1987-01-01

    Discusses the CODER system, which was developed to investigate the application of artificial intelligence methods to increase the effectiveness of information retrieval systems, particularly those involving heterogeneous documents. Highlights include the use of PROLOG programing, blackboard-based designs, knowledge engineering, lexicological…

  6. Contexts That Matter to the Leadership Development of Latino Male College Students: A Mixed Methods Perspective

    ERIC Educational Resources Information Center

    Garcia, Gina A.; Huerta, Adrian H.; Ramirez, Jenesis J.; Patrón, Oscar E.

    2017-01-01

    As the number of Latino males entering college increases, there is a need to understand their unique leadership experiences. This study used a convergent parallel mixed methods design to understand what contexts contribute to Latino male undergraduate students' leadership development, capacity, and experiences. Quantitative data were gathered by…

  7. Development of Methods for the Determination of pKa Values

    PubMed Central

    Reijenga, Jetse; van Hoof, Arno; van Loon, Antonie; Teunissen, Bram

    2013-01-01

    The acid dissociation constant (pKa) is among the most frequently used physicochemical parameters, and its determination is of interest to a wide range of research fields. We present a brief introduction on the conceptual development of pKa as a physical parameter and its relationship to the concept of the pH of a solution. This is followed by a general summary of the historical development and current state of the techniques of pKa determination and an attempt to develop insight into future developments. Fourteen methods of determining the acid dissociation constant are placed in context and are critically evaluated to make a fair comparison and to determine their applications in modern chemistry. Additionally, we have studied these techniques in light of present trends in science and technology and attempt to determine how these trends might affect future developments in the field. PMID:23997574
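
    As a minimal reminder of the relationship between pKa and solution pH referred to in this record (standard textbook relations for a monoprotic acid HA, assumed here rather than quoted from the review):

        K_a = \frac{[\mathrm{A}^-][\mathrm{H}^+]}{[\mathrm{HA}]}, \qquad
        \mathrm{p}K_a = -\log_{10} K_a, \qquad
        \mathrm{pH} = \mathrm{p}K_a + \log_{10}\frac{[\mathrm{A}^-]}{[\mathrm{HA}]}

    so pKa equals the pH at which the acid is half dissociated, which is the quantity the fourteen surveyed methods estimate from potentiometric, spectrophotometric, electrophoretic or computational data.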

  8. Novel methods to collect meaningful data from adolescents for the development of health interventions.

    PubMed

    Hieftje, Kimberly; Duncan, Lindsay R; Fiellin, Lynn E

    2014-09-01

    Health interventions are increasingly focused on young adolescents, and as a result, discussions with this population have become a popular method in qualitative research. Traditional methods used to engage adults in discussions do not translate well to this population, who may have difficulty conceptualizing abstract thoughts and opinions and communicating them to others. As part of a larger project to develop and evaluate a video game for risk reduction and HIV prevention in young adolescents, we were seeking information and ideas from the priority audience that would help us create authentic story lines and character development in the video game. To accomplish this authenticity, we conducted in-depth interviews and focus groups with young adolescents aged 10 to 15 years and employed three novel methods: Storytelling Using Graphic Illustration, My Life, and Photo Feedback Project. These methods helped provide a thorough understanding of the adolescents' experiences and perspectives regarding their environment and future aspirations, which we translated into active components of the video game intervention. This article describes the processes we used and the valuable data we generated using these three engaging methods. These three activities are effective tools for eliciting meaningful data from young adolescents for the development of health interventions.

  9. Developments in Methods for Measuring the Intestinal Absorption of Nanoparticle-Bound Drugs

    PubMed Central

    Liu, Wei; Pan, Hao; Zhang, Caiyun; Zhao, Liling; Zhao, Ruixia; Zhu, Yongtao; Pan, Weisan

    2016-01-01

    With the rapid development of nanotechnology, novel drug delivery systems comprising orally administered nanoparticles (NPs) have been paid increasing attention in recent years. The bioavailability of orally administered drugs has significant influence on drug efficacy and therapeutic dosage, and it is therefore imperative that the intestinal absorption of oral NPs be investigated. This review examines the various literature on the oral absorption of polymeric NPs, and provides an overview of the intestinal absorption models that have been developed for the study of oral nanoparticles. Three major categories of models including a total of eight measurement methods are described in detail (in vitro: dialysis bag, rat gut sac, Ussing chamber, cell culture model; in situ: intestinal perfusion, intestinal loops, intestinal vascular cannulation; in vivo: the blood/urine drug concentration method), and the advantages and disadvantages of each method are contrasted and elucidated. In general, in vitro and in situ methods are relatively convenient but lack accuracy, while the in vivo method is troublesome but can provide a true reflection of drug absorption in vivo. This review summarizes the development of intestinal absorption experiments in recent years and provides a reference for the systematic study of the intestinal absorption of nanoparticle-bound drugs. PMID:27455239

  10. Developments in Methods for Measuring the Intestinal Absorption of Nanoparticle-Bound Drugs.

    PubMed

    Liu, Wei; Pan, Hao; Zhang, Caiyun; Zhao, Liling; Zhao, Ruixia; Zhu, Yongtao; Pan, Weisan

    2016-07-21

    With the rapid development of nanotechnology, novel drug delivery systems comprising orally administered nanoparticles (NPs) have been paid increasing attention in recent years. The bioavailability of orally administered drugs has significant influence on drug efficacy and therapeutic dosage, and it is therefore imperative that the intestinal absorption of oral NPs be investigated. This review examines the various literature on the oral absorption of polymeric NPs, and provides an overview of the intestinal absorption models that have been developed for the study of oral nanoparticles. Three major categories of models including a total of eight measurement methods are described in detail (in vitro: dialysis bag, rat gut sac, Ussing chamber, cell culture model; in situ: intestinal perfusion, intestinal loops, intestinal vascular cannulation; in vivo: the blood/urine drug concentration method), and the advantages and disadvantages of each method are contrasted and elucidated. In general, in vitro and in situ methods are relatively convenient but lack accuracy, while the in vivo method is troublesome but can provide a true reflection of drug absorption in vivo. This review summarizes the development of intestinal absorption experiments in recent years and provides a reference for the systematic study of the intestinal absorption of nanoparticle-bound drugs.

  11. Fast method development of rooibos tea phenolics using a variable column length strategy.

    PubMed

    Cabooter, Deirdre; Broeckhoven, Ken; Kalili, Kathithileni M; de Villiers, André; Desmet, Gert

    2011-10-14

    The development of a method for the separation of standard compounds of the 15 main phenolics found in rooibos tea is presented. The separation of these compounds in a single HPLC analysis is particularly challenging due to the similarity of rooibos phenolics. As a result, multiple methods are often required to analyze all major phenolics in rooibos tea samples. The method development process is significantly enhanced in this study by using the recently introduced automated column coupler in combination with the variable column length strategy. This strategy consists of performing the initial scouting runs, wherein the best separation conditions are determined, on a short column and subsequently fine-tuning the separation on longer columns to benefit from their higher separation performance. It is demonstrated that the method development process can be further expedited by operating each column length at the maximum pressure, in this case 1000 bar. Although this holds in general, it is even more the case for the presently considered sample, since for this sample the selectivity is more pressure- than temperature-dependent. Applying the optimized method to unfermented and fermented aqueous rooibos tea extracts in combination with Q-TOF mass spectrometry, some 30 phenolic compounds are tentatively identified.
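
    As background to the variable-column-length strategy, the standard first-order relations for a packed column run at the maximum available pressure (these are generic chromatography relations, not taken from the paper) show the trade-off being exploited:

        u_0 = \frac{K_{v0}\,\Delta P_{\max}}{\eta\,L}, \qquad
        t_0 = \frac{L}{u_0} = \frac{\eta\,L^2}{K_{v0}\,\Delta P_{\max}}, \qquad
        N \approx \frac{L}{H}

    where K_{v0} is the column permeability, \eta the mobile-phase viscosity, L the column length and H the plate height. Roughly, doubling L at constant maximum pressure doubles the plate count but quadruples the dead time, which is why scouting on a short column before transferring to a longer one saves development time.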

  12. Novel Methods to Collect Meaningful Data From Adolescents for the Development of Health Interventions

    PubMed Central

    Hieftje, Kimberly; Duncan, Lindsay R.; Fiellin, Lynn E.

    2014-01-01

    Health interventions are increasingly focused on young adolescents, and as a result, discussions with this population have become a popular method in qualitative research. Traditional methods used to engage adults in discussions do not translate well to this population, who may have difficulty conceptualizing abstract thoughts and opinions and communicating them to others. As part of a larger project to develop and evaluate a video game for risk reduction and HIV prevention in young adolescents, we were seeking information and ideas from the priority audience that would help us create authentic story lines and character development in the video game. To accomplish this authenticity, we conducted in-depth interviews and focus groups with young adolescents aged 10 to 15 years and employed three novel methods: Storytelling Using Graphic Illustration, My Life, and Photo Feedback Project. These methods helped provide a thorough understanding of the adolescents’ experiences and perspectives regarding their environment and future aspirations, which we translated into active components of the video game intervention. This article describes the processes we used and the valuable data we generated using these three engaging methods. These three activities are effective tools for eliciting meaningful data from young adolescents for the development of health interventions. PMID:24519998

  13. Using Integrated Mixed Methods to Develop Behavioral Measures of Factors Associated With Microbicide Acceptability

    PubMed Central

    Morrow, Kathleen M.; Rosen, Rochelle K.; Salomon, Liz; Woodsong, Cynthia; Severy, Lawrence; Fava, Joseph L.; Vargas, Sara; Barroso, Candelaria

    2015-01-01

    Our current understanding of factors associated with microbicide acceptability and consistent use typically has been derived from separate and distinct qualitative or quantitative studies. Specifically, rarely have investigators used mixed methods to both develop and validate behavioral measures. We utilized an integrated mixed methods design, including qualitative metasyntheses, cognitive interviews and expert reviews, psychometric evaluation, and confirmatory qualitative analyses of the correspondence between quantitative items and original qualitative data to develop and validate measures of factors associated with microbicide acceptability and use. We describe this methodology and use the development of the Relationship Context Scale to illustrate it. As a result of independent confirmatory analyses of qualitative passages corresponding to survey items, we demonstrated that items from the same subscales are frequently double coded within a particular textual passage, and thematically related, suggesting associations that resulted in a unique factor structure within the subscale. This integrated mixed method design was critical to the development of this psychometrically validated behavioral measure, and could serve as a model for future measure development. PMID:21447804

  14. Development and Evaluation of Nursing User Interface Screens Using Multiple Methods

    PubMed Central

    Hyun, Sookyung; Johnson, Stephen B.; Stetson, Peter D.; Bakken, Suzanne

    2009-01-01

    Building upon the foundation of the Structured Narrative electronic health record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses’ perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses’ perspectives, and assess nurses’ perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs. PMID:19460464

  15. Development of high performance liquid chromatography method for miconazole analysis in powder sample

    NASA Astrophysics Data System (ADS)

    Hermawan, D.; Suwandri; Sulaeman, U.; Istiqomah, A.; Aboul-Enein, H. Y.

    2017-02-01

    A simple high performance liquid chromatography (HPLC) method has been developed in this study for the analysis of miconazole, an antifungal drug, in powder sample. The optimized HPLC system using C8 column was achieved using mobile phase composition containing methanol:water (85:15, v/v), a flow rate of 0.8 mL/min, and UV detection at 220 nm. The calibration graph was linear in the range from 10 to 50 mg/L with r² of 0.9983. The limit of detection (LOD) and limit of quantitation (LOQ) obtained were 2.24 mg/L and 7.47 mg/L, respectively. The present HPLC method is applicable for the determination of miconazole in the powder sample with a recovery of 101.28% (RSD = 0.96%, n = 3). The developed HPLC method provides short analysis time, high reproducibility and high sensitivity.
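
    For reference, the ICH Q2(R1) relations commonly used to derive LOD and LOQ from a calibration line (standard formulas assumed here, not quoted from the paper) are:

        \mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{S}

    where \sigma is the standard deviation of the response (e.g., the residual standard deviation of the regression) and S is the slope of the calibration curve; applied to a 10-50 mg/L calibration they yield figures of the same kind as the 2.24 and 7.47 mg/L reported above.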

  16. Emerging operando and x-ray pair distribution function methods for energy materials development

    SciTech Connect

    Chapman, Karena W.

    2016-03-01

    Our energy needs drive widespread materials research. Advances in materials characterization are critical to this research effort. Using new characterization tools that allow us to probe the atomic structure of energy materials in situ as they operate, we can identify how their structure is linked to their functional properties and performance. These fundamental insights serve as a roadmap to enhance performance in the next generation of advanced materials. In the last decade, developments in synchrotron instrumentation have made the pair distribution function (PDF) method and operando x-ray studies more readily accessible tools capable of providing valuable insights into complex materials systems. Here, the emergence of the PDF method as a versatile structure characterization tool and the further enhancement of this method through developments in operando capabilities and multivariate data analytics are described. These advances in materials characterization are demonstrated by several highlighted studies focused on energy storage in batteries.

  17. Model correlation and damage location for large space truss structures: Secant method development and evaluation

    NASA Technical Reports Server (NTRS)

    Smith, Suzanne Weaver; Beattie, Christopher A.

    1991-01-01

    On-orbit testing of a large space structure will be required to complete the certification of any mathematical model for the structure dynamic response. The process of establishing a mathematical model that matches measured structure response is referred to as model correlation. Most model correlation approaches have an identification technique to determine structural characteristics from the measurements of the structure response. This problem is approached with one particular class of identification techniques - matrix adjustment methods - which use measured data to produce an optimal update of the structure property matrix, often the stiffness matrix. New methods were developed for identification to handle problems of the size and complexity expected for large space structures. Further development and refinement of these secant-method identification algorithms were undertaken. Also, evaluation of these techniques as an approach for model correlation and damage location was initiated.

  18. Method development for measuring biodegradable dissolved organic nitrogen in treated wastewater.

    PubMed

    Khan, Eakalak; Awobamise, Mayo; Jones, Kimberly; Murthy, Sudhir

    2009-08-01

    A method for determining biodegradable dissolved organic nitrogen (BDON) in treated wastewater was developed. The method adopts the approaches used in the biochemical oxygen demand and biodegradable dissolved organic carbon tests to make it usable as a routine procedure at wastewater treatment plants (WWTPs). The development focused on various aspects of the procedure, including inoculum type and concentration, incubation period, and the need for sample filtration after incubation. The method was tested with filtered effluent samples from two nutrient removal WWTPs and standard organic nitrogen solutions. Accurate and precise BDON results were obtained with 2 mL of acclimated mixed-liquor suspended solids diluted to a concentration of 240 mg/L as an inoculum and an incubation period of 20 days. Sample filtration after incubation was not required.
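
    A minimal sketch of the bookkeeping behind a BDON-type assay, assuming (by analogy with the BDOC test) that BDON is the drop in dissolved organic nitrogen over the incubation and that DON is obtained by difference from total dissolved nitrogen and inorganic nitrogen; the function names and example values below are illustrative only, not the paper's data:

        def dissolved_organic_n(tdn, nh3_n, no2_n, no3_n):
            """Dissolved organic N by difference (all values in mg N/L)."""
            return tdn - (nh3_n + no2_n + no3_n)

        def bdon(don_initial, don_final):
            """Biodegradable DON = DON consumed during the incubation (mg N/L)."""
            return don_initial - don_final

        # Illustrative numbers only (not from the paper):
        don0 = dissolved_organic_n(tdn=6.5, nh3_n=1.2, no2_n=0.1, no3_n=3.0)   # before incubation
        don20 = dissolved_organic_n(tdn=6.4, nh3_n=2.0, no2_n=0.1, no3_n=3.2)  # after 20 d with inoculum
        print(f"BDON = {bdon(don0, don20):.2f} mg N/L")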

  19. Integration of QFD, AHP, and LPP methods in supplier development problems under uncertainty

    NASA Astrophysics Data System (ADS)

    Shad, Zahra; Roghanian, Emad; Mojibian, Fatemeh

    2014-04-01

    Quality function deployment (QFD) is a customer-driven approach widely used to develop new products or processes so as to maximize customer satisfaction. Previous research has used the linear physical programming (LPP) procedure to optimize QFD; however, QFD problems involve uncertainty, or fuzziness, which must be taken into account for a more realistic study. In this paper, a set of fuzzy data is used to address linguistic values parameterized by triangular fuzzy numbers. An integrated approach combining the analytic hierarchy process (AHP), QFD, and LPP is proposed to maximize overall customer satisfaction under uncertain conditions and is applied to the supplier development problem. The fuzzy AHP approach is adopted as a powerful method to obtain the relationships between the customer requirements and engineering characteristics (ECs) used to construct the house of quality in the QFD method. LPP is used to obtain the optimal achievement level of the ECs, and subsequently the customer satisfaction level, under different degrees of uncertainty. The effectiveness of the proposed method is illustrated by an example.
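
    A minimal sketch of one common way triangular fuzzy numbers enter an AHP-style weighting step (Buckley's geometric-mean method with centroid defuzzification); the pairwise judgments below are invented for illustration, and the paper's actual integration with QFD and LPP is more elaborate:

        import numpy as np

        # Triangular fuzzy pairwise comparisons (l, m, u) for 3 customer requirements.
        # Judgments are illustrative only.
        M = [
            [(1, 1, 1), (2, 3, 4), (4, 5, 6)],
            [(1/4, 1/3, 1/2), (1, 1, 1), (1, 2, 3)],
            [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
        ]

        n = len(M)
        # Fuzzy geometric mean of each row (component-wise on l, m, u).
        geo = [tuple(np.prod([M[i][j][k] for j in range(n)]) ** (1 / n) for k in range(3))
               for i in range(n)]
        # Fuzzy weights: divide each row mean by the total (bounds reversed for l and u).
        tot = tuple(sum(g[k] for g in geo) for k in range(3))
        fuzzy_w = [(g[0] / tot[2], g[1] / tot[1], g[2] / tot[0]) for g in geo]
        # Defuzzify by the centroid and normalize to crisp weights.
        crisp = np.array([(l + m + u) / 3 for l, m, u in fuzzy_w])
        weights = crisp / crisp.sum()
        print(np.round(weights, 3))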

  20. Development of a real-time PCR method for the identification of Atlantic mackerel (Scomber scombrus).

    PubMed

    Velasco, Amaya; Sánchez, Ana; Martínez, Icíar; Santaclara, Francisco J; Pérez-Martín, Ricardo I; Sotelo, Carmen G

    2013-12-01

    A real-time PCR method based on TaqMan technology for the identification of Scomber scombrus has been developed. A system of specific primers and a Minor Groove Binding (MGB) TaqMan probe based on sequences of the mitochondrial cytochrome b region was designed. The method was successfully tested on 81 specimens of S. scombrus and related species and validated on 26 different commercial samples. An average threshold cycle (Ct) value of 15.3 was obtained with S. scombrus DNA. With the other species tested, the fluorescence signal was either not detected or the Ct was significantly higher (P<0.001). The efficiency of the assay was estimated to be 92.41%, with 100% specificity, and no cross-reactivity was detected with any other species. These results show that the developed method is a rapid and efficient tool to unequivocally identify S. scombrus and may aid in the prevention of fraud or mislabelling in mackerel products.
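
    For context, amplification efficiencies of the kind quoted above are conventionally estimated from the slope of a standard curve of Ct versus log10 of the template amount (standard qPCR practice, assumed here rather than taken from the paper):

        E = 10^{-1/\text{slope}} - 1

    so a slope near -3.52 corresponds to the reported efficiency of roughly 92%, while a slope of -3.32 (E = 1) would represent perfect doubling each cycle.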

  1. Development and application of QM/MM methods to study the solvation effects and surfaces

    SciTech Connect

    Dibya, Pooja Arora

    2010-01-01

    Quantum mechanical (QM) calculations have the advantage of attaining high-level accuracy; however, QM calculations become computationally inefficient as the size of the system grows. Solving complex molecular problems on large systems and ensembles by using quantum mechanics still poses a challenge in terms of the computational cost. Methods that are based on classical mechanics are an inexpensive alternative, but they lack accuracy. A good trade-off between accuracy and efficiency is achieved by combining QM methods with molecular mechanics (MM) methods, using the robustness of the QM methods in terms of accuracy and the MM methods to minimize the computational cost. Two types of QM combined with MM (QM/MM) methods are the main focus of the present dissertation: the application and development of QM/MM methods for solvation studies and reactions on the Si(100) surface. The solvation studies were performed using a discrete solvation model that is largely based on first principles, called the effective fragment potential method (EFP). The main idea of combining the EFP method with quantum mechanics is to accurately treat the solute-solvent and solvent-solvent interactions, such as electrostatic, polarization, dispersion and charge transfer, that are important in correctly calculating solvent effects on systems of interest. A second QM/MM method called SIMOMM (surface integrated molecular orbital molecular mechanics) is a hybrid QM/MM embedded cluster model that mimics the real surface. This method was employed to calculate the potential energy surfaces for reactions of atomic O on the Si(100) surface. The hybrid QM/MM method is a computationally inexpensive approach for studying reactions on larger surfaces in a reasonably accurate and efficient manner. This thesis comprises four chapters: Chapter 1 describes the general overview and motivation of the dissertation and gives a broad background of the computational methods that have been employed in this work

  2. Qualitative properties of roasting defect beans and development of its classification methods by hyperspectral imaging technology.

    PubMed

    Cho, Jeong-Seok; Bae, Hyung-Jin; Cho, Byoung-Kwan; Moon, Kwang-Deog

    2017-04-01

    Qualitative properties of roasting-defect coffee beans and their classification methods were studied using hyperspectral imaging (HSI). The roasting-defect beans were divided into 5 groups: medium roasting (Cont), under-developed (RD-1), over-roasted (RD-2), interior under-developed (RD-3), and interior scorching (RD-4). The following qualitative properties were assayed: browning index (BI), moisture content (MC), chlorogenic acid (CA), trigonelline (TG), and caffeine (CF) content. Their HSI spectra (1000-1700 nm) were also analysed to develop classification methods for roasting-defect beans. RD-2 showed the highest BI and the lowest MC, CA, and TG content. The accuracy of the partial least-squares discriminant analysis classification model was 86.2%. The most powerful wavelength for classifying the defective beans was approximately 1420 nm (related to the OH bond). The HSI reflectance values at 1420 nm showed a tendency similar to MC, enabling the use of this technology to classify roasting-defect beans.
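
    A minimal sketch of a PLS-DA-style classifier of the kind described above, using scikit-learn's PLSRegression on one-hot class labels; the spectra here are random stand-ins, not HSI data, and the authors' preprocessing is not reproduced:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        n_samples, n_bands, n_classes = 60, 200, 5         # e.g. Cont, RD-1 ... RD-4
        X = rng.normal(size=(n_samples, n_bands))          # stand-in reflectance spectra (1000-1700 nm)
        y = rng.integers(0, n_classes, size=n_samples)     # true defect class per sample
        Y = np.eye(n_classes)[y]                           # one-hot encode for PLS-DA

        pls = PLSRegression(n_components=8)
        pls.fit(X, Y)
        y_pred = pls.predict(X).argmax(axis=1)             # class = column with largest predicted score
        print(f"training accuracy: {(y_pred == y).mean():.1%}")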

  3. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    PubMed

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected in monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database for the papaya genome sequence. Transgenic construct- and event-specific sequences were identified as belonging to a GM papaya developed to resist infection by Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled the identification of unknown transgenic construct- and event-specific sequences in GM papaya and the development of a reliable method for detecting them in papaya food commodities.

  4. Development of Finite Elements for Two-Dimensional Structural Analysis Using the Integrated Force Method

    NASA Technical Reports Server (NTRS)

    Kaljevic, Igor; Patnaik, Surya N.; Hopkins, Dale A.

    1996-01-01

    The Integrated Force Method has been developed in recent years for the analysis of structural mechanics problems. This method treats all independent internal forces as unknown variables that can be calculated by simultaneously imposing equations of equilibrium and compatibility conditions. In this paper a finite element library for analyzing two-dimensional problems by the Integrated Force Method is presented. Triangular- and quadrilateral-shaped elements capable of modeling arbitrary domain configurations are presented. The element equilibrium and flexibility matrices are derived by discretizing the expressions for potential and complementary energies, respectively. The displacement and stress fields within the finite elements are independently approximated. The displacement field is interpolated as it is in the standard displacement method, and the stress field is approximated by using complete polynomials of the correct order. A procedure that uses the definitions of stress components in terms of an Airy stress function is developed to derive the stress interpolation polynomials. Such derived stress fields identically satisfy the equations of equilibrium. Moreover, the resulting element matrices are insensitive to the orientation of local coordinate systems. A method is devised to calculate the number of rigid body modes, and the present elements are shown to be free of spurious zero-energy modes. A number of example problems are solved by using the present library, and the results are compared with corresponding analytical solutions and with results from the standard displacement finite element method. The Integrated Force Method not only gives results that agree well with analytical and displacement method results but also outperforms the displacement method in stress calculations.
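
    As a compact reminder of the structure the Integrated Force Method imposes (written in the generic notation used in the IFM literature and assuming no initial deformations; the element-level derivations in the paper are more detailed), the m equilibrium equations and the r = n - m compatibility conditions on the n independent internal forces F are solved together:

        [B]\{F\} = \{P\}, \qquad [C][G]\{F\} = \{0\}, \qquad
        \begin{bmatrix} B \\ C\,G \end{bmatrix}\{F\} = \begin{Bmatrix} P \\ 0 \end{Bmatrix}

    where [B] is the equilibrium matrix, [G] the flexibility matrix and [C] the compatibility matrix; nodal displacements are then recovered from the computed forces.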

  5. Diffuse reflectance near infrared-chemometric methods development and validation of amoxicillin capsule formulations

    PubMed Central

    Khan, Ahmed Nawaz; Khar, Roop Krishen; Ajayakumar, P. V.

    2016-01-01

    Objective: The aim of the present study was to establish near infrared-chemometric methods that could be used effectively for quality profiling through identification and quantification of amoxicillin (AMOX) in formulated capsules similar to commercial products. The methods were modeled so that a large number of market products could be evaluated easily and quickly. Materials and Methods: A Thermo Scientific Antaris II near infrared analyzer with TQ Analyst chemometric software was used for the development and validation of the identification and quantification models. Several AMOX formulations were composed with four excipients: microcrystalline cellulose, magnesium stearate, croscarmellose sodium and colloidal silicon dioxide. Development included quadratic mixture formulation design, near infrared spectrum acquisition, spectral pretreatment and outlier detection. Following the guidelines prescribed by the International Conference on Harmonisation (ICH) and the European Medicines Agency (EMA), the developed methods were validated in terms of specificity, accuracy, precision, linearity, and robustness. Results: In diffuse reflectance mode, an identification model based on discriminant analysis was successfully processed with 76 formulations; the same samples were also used for quantitative analysis using a partial least squares algorithm with four latent variables, a correlation coefficient of 0.9937, a root mean square error of calibration (RMSEC) of 2.17%, a root mean square error of prediction (RMSEP) of 2.38% and a root mean square error of cross-validation (RMSECV) of 2.43%. Conclusion: The proposed model established a good relationship between the spectral information and AMOX identity as well as content. The results show the performance of the proposed models, which offer an alternative to the well-established high-performance liquid chromatography method for AMOX capsule evaluation. Ultimately three commercial products were successfully evaluated using developed
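
    A minimal sketch of how calibration and prediction error statistics of the kind quoted above (RMSEC, RMSEP) are typically computed from reference and NIR-predicted contents; the arrays are placeholders, not the study's data, and RMSECV would use cross-validated predictions in the same way:

        import numpy as np

        def rmse(y_ref, y_pred):
            """Root mean square error between reference and predicted values (same units as y)."""
            y_ref, y_pred = np.asarray(y_ref, float), np.asarray(y_pred, float)
            return float(np.sqrt(np.mean((y_ref - y_pred) ** 2)))

        # Placeholder AMOX contents (% of label claim): calibration and prediction sets.
        y_cal, y_cal_hat = [98.0, 101.0, 99.5, 100.5], [97.5, 100.2, 100.1, 101.3]
        y_val, y_val_hat = [99.0, 100.0, 102.0], [97.8, 100.9, 100.4]
        print("RMSEC:", round(rmse(y_cal, y_cal_hat), 2))
        print("RMSEP:", round(rmse(y_val, y_val_hat), 2))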

  6. Development and validation of scale nuclear analysis methods for high temperature gas-cooled reactors

    SciTech Connect

    Gehin, Jess C; Jessee, Matthew Anderson; Williams, Mark L; Lee, Deokjung; Goluoglu, Sedat; Ilas, Germina; Ilas, Dan; Bowman, Steve A

    2010-01-01

    In support of the U.S. Nuclear Regulatory Commission, ORNL is updating the nuclear analysis methods and data in the SCALE code system to support modeling of HTGRs. Development activities include methods used for reactor physics, criticality safety, and radiation shielding. This paper focuses on the nuclear methods in support of reactor physics, which primarily include lattice physics for cross-section processing of both prismatic and pebble-bed designs, Monte Carlo depletion methods and efficiency improvements for double heterogeneous fuels, and validation against relevant experiments. These methods enhancements are being validated using available experimental data from the HTTR and HTR-10 startup and initial criticality experiments. Results obtained with three-dimensional Monte Carlo models of the HTTR initial core critical configurations with SCALE6/KENO show excellent agreement between the continuous energy and multigroup methods and the results are consistent with results obtained by others. A three-dimensional multigroup Monte Carlo model for the initial critical core of the HTR-10 has been developed with SCALE6/KENO based on the benchmark specifications included in the IRPhE Handbook. The core eigenvalue obtained with this model is in very good agreement with the corresponding value obtained with a consistent continuous energy MCNP5 core model.

  7. Development and validation of RP-HPLC method for estimation of Cefotaxime sodium in marketed formulations.

    PubMed

    Lalitha, N; Pai, Pn Sanjay

    2009-12-01

    An RP-HPLC assay method has been developed and validated for cefotaxime. An isocratic RP-HPLC method was developed on an SS Wakosil II-C8 column (250 mm × 4.6 mm i.d., 5 μm) utilizing a mobile phase of ammonium acetate buffer (pH 6.8) and acetonitrile (85:15 v/v) with UV detection at 252 nm and a flow rate of 0.8 ml/min. The proposed method was validated for sensitivity, selectivity, linearity, accuracy, precision, ruggedness, robustness and solution stability. The response of the drug was linear in the concentration range of 10-70 μg/ml. The limit of detection and limit of quantification were found to be 0.3 μg/ml and 0.6 μg/ml, respectively. The recovery ranged from 97 to 102%. Method, system, interday and intraday precision were found to be within the limits of the acceptance criteria. The method was found to be rugged when the analysis was carried out by different analysts. The method was found to be sensitive and efficient, with 2216 theoretical plates, an HETP of 0.1128 mm and a tailing factor of 1. The method is suitable for the quality control of cefotaxime in injection formulations.
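
    For reference, column-efficiency figures of this kind are conventionally obtained from the standard USP-style relations (assumed here rather than taken from the paper):

        N = 5.54\left(\frac{t_R}{w_{1/2}}\right)^2, \qquad
        \mathrm{HETP} = \frac{L}{N}, \qquad
        T = \frac{W_{0.05}}{2f}

    where t_R is the retention time, w_{1/2} the peak width at half height, L the column length (here 250 mm, consistent with 2216 plates giving an HETP of about 0.11 mm), W_{0.05} the peak width at 5% height and f the front half-width at that height.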

  8. Development of Decision Making Algorithm for Control of Sea Cargo Containers by "Tagged" Neutron Method

    NASA Astrophysics Data System (ADS)

    Anan'ev, A. A.; Belichenko, S. G.; Bogolyubov, E. P.; Bochkarev, O. V.; Petrov, E. V.; Polishchuk, A. M.; Udaltsov, A. Yu.

    2009-12-01

    Several research groups in Russia and abroad are currently developing systems based on the "tagged" neutron method (API method) intended for the detection of dangerous materials, including high explosives (HE). Particular attention is paid to the possibility of detecting dangerous objects inside a sea cargo container. The energy gamma-spectrum registered from the object under inspection is used to determine the oxygen/carbon and nitrogen/carbon chemical ratios, according to which a dangerous object is distinguished from a non-dangerous one. The material of a filled container, however, gives rise to additional effects: rescattering and moderation of the 14 MeV primary neutrons from the generator and attenuation of the secondary gamma-radiation from inelastic neutron scattering on the objects under inspection. These effects distort the energy gamma-response from the examined object and therefore prevent correct recognition of the chemical ratios. These difficulties are taken into account in the analytical method presented in the paper. The method has been validated against experimental data obtained with the system for HE detection in sea cargo based on the API method and developed at VNIIA. The influence of shielding materials on the results of HE detection and identification is considered; wood and iron were used as shielding materials. Results of applying the method to the analysis of experimental data on HE simulator measurements (tetryl, trotyl, hexogen) are presented.
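
    A toy sketch of the kind of ratio-based decision rule the abstract describes, in which relative gamma yields attributed to O, C and N are turned into O/C and N/C ratios and compared against an explosive-like region; the yields, thresholds and window below are purely illustrative placeholders and are not the VNIIA algorithm:

        def classify_by_ratios(y_o, y_c, y_n, oc_window=(0.5, 2.5), nc_min=0.15):
            """Flag an object whose O/C and N/C ratios fall in an HE-like region.
            y_* are relative gamma yields for oxygen, carbon, nitrogen (arbitrary units).
            The window and threshold are hypothetical placeholders."""
            if y_c <= 0:
                return "undetermined"
            oc, nc = y_o / y_c, y_n / y_c
            he_like = oc_window[0] <= oc <= oc_window[1] and nc >= nc_min
            return "suspect (HE-like)" if he_like else "not suspect"

        print(classify_by_ratios(y_o=1.8, y_c=1.0, y_n=0.4))   # HE-like ratios -> suspect
        print(classify_by_ratios(y_o=0.1, y_c=1.0, y_n=0.02))  # N-poor organic -> not suspect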

  9. DEVELOPMENT OF DECISION MAKING ALGORITHM FOR CONTROL OF SEA CARGO CONTAINERS BY 'TAGGED' NEUTRON METHOD

    SciTech Connect

    Anan'ev, A. A.; Belichenko, S. G.; Bogolyubov, E. P.; Bochkarev, O. V.; Petrov, E. V.; Polishchuk, A. M.; Udaltsov, A. Yu.

    2009-12-02

    Several research groups in Russia and abroad are currently developing systems based on the 'tagged' neutron method (API method) intended for the detection of dangerous materials, including high explosives (HE). Particular attention is paid to the possibility of detecting dangerous objects inside a sea cargo container. The energy gamma-spectrum registered from the object under inspection is used to determine the oxygen/carbon and nitrogen/carbon chemical ratios, according to which a dangerous object is distinguished from a non-dangerous one. The material of a filled container, however, gives rise to additional effects: rescattering and moderation of the 14 MeV primary neutrons from the generator and attenuation of the secondary gamma-radiation from inelastic neutron scattering on the objects under inspection. These effects distort the energy gamma-response from the examined object and therefore prevent correct recognition of the chemical ratios. These difficulties are taken into account in the analytical method presented in the paper. The method has been validated against experimental data obtained with the system for HE detection in sea cargo based on the API method and developed at VNIIA. The influence of shielding materials on the results of HE detection and identification is considered; wood and iron were used as shielding materials. Results of applying the method to the analysis of experimental data on HE simulator measurements (tetryl, trotyl, hexogen) are presented.

  10. Development of three-dimensional optical correction method for reconstruction of flow field in droplet

    NASA Astrophysics Data System (ADS)

    Ko, Han Seo; Gim, Yeonghyeon; Kang, Seung-Hwan

    2015-11-01

    A three-dimensional optical correction method was developed to reconstruct droplet-based flow fields. For a numerical simulation, synthetic phantoms were reconstructed by a simultaneous multiplicative algebraic reconstruction technique using three projection images positioned at an offset angle of 45°. If the synthetic phantom lies inside a conical object whose refractive index differs from that of the surrounding atmosphere, the image can be distorted because light is refracted at the surface of the conical object. Thus, the direction of the projection ray was replaced by the refracted ray produced at the surface of the conical object. To verify the method under this distortion effect, reconstruction results of the developed method were compared with the original phantom. As a result, the reconstruction obtained with the method showed a smaller error than that obtained without it. The method was applied to a Taylor cone, generated by a high voltage between a droplet and a substrate, to reconstruct the three-dimensional flow fields and analyze the characteristics of the droplet. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean government (MEST) (No. 2013R1A2A2A01068653).
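
    A minimal sketch of the ray-refraction step such a correction relies on, using the vector form of Snell's law to replace a projection ray direction with its refracted direction at an interface; the refractive indices and geometry below are placeholders, not the paper's configuration:

        import numpy as np

        def refract(d, n, n1, n2):
            """Refract unit ray direction d at a surface with unit normal n (pointing toward
            the incoming ray), going from refractive index n1 to n2. Returns the refracted
            unit direction, or None for total internal reflection."""
            d = d / np.linalg.norm(d)
            n = n / np.linalg.norm(n)
            eta = n1 / n2
            cos_i = -np.dot(n, d)
            sin2_t = eta**2 * (1.0 - cos_i**2)
            if sin2_t > 1.0:
                return None                       # total internal reflection
            cos_t = np.sqrt(1.0 - sin2_t)
            return eta * d + (eta * cos_i - cos_t) * n

        # Example: ray in air hitting a water-like medium (n ~ 1.33) at 45 degrees.
        d = np.array([1.0, -1.0, 0.0]) / np.sqrt(2.0)
        n = np.array([0.0, 1.0, 0.0])
        print(refract(d, n, n1=1.0, n2=1.33))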

  11. Assessing physical activity of women of childbearing age. Ongoing work to develop and evaluate simple methods.

    PubMed

    Löf, Marie; Hannestad, Ulf; Forsum, Elisabet

    2002-09-01

    Simple methods were developed and evaluated to assess total energy expenditure in 24 healthy, Swedish women planning pregnancy. Total energy expenditure was measured by the doubly-labeled water method (reference method) and three simple methods: heart rate recording, movement registration by an accelerometer, and a questionnaire. Mean total energy expenditure obtained by the four methods varied between 2,530 kcal per 24 hours (10,570 kJ/24 hours) and 2,730 kcal per 24 hours (11,420 kJ/24 hours). No significant difference between the results obtained by the different methods was found. The mean difference between the simple method and the reference method was for the questionnaire 204 +/- 508 kcal per 24 hours (853 +/- 2,124 kJ/24 hours), for the heart rate recorder 58 +/- 338 kcal per 24 hours (241 +/- 1,416 kJ/24 hours) and for the accelerometer 6 +/- 325 kcal per 24 hours (25 +/- 1,360 kJ/24 hours). The heart rate recorder and the questionnaire overestimated high and underestimated low energy expenditures. The accelerometer and the heart rate recorder were able to assess mean total energy expenditure of groups. No systematic bias was found when the accelerometer was used.

  12. Development of an integrated method for long-term water quality prediction using seasonal climate forecast

    NASA Astrophysics Data System (ADS)

    Cho, Jaepil; Shin, Chang-Min; Choi, Hwan-Kyu; Kim, Kyong-Hyeon; Choi, Ji-Yong

    2016-10-01

    The APEC Climate Center (APCC) produces climate prediction information utilizing a multi-climate model ensemble (MME) technique. In this study, four different downscaling methods, in accordance with the degree of utilizing the seasonal climate prediction information, were developed in order to improve predictability and to refine the spatial scale. These methods include: (1) the Simple Bias Correction (SBC) method, which directly uses APCC's dynamic prediction data with a 3 to 6 month lead time; (2) the Moving Window Regression (MWR) method, which indirectly utilizes dynamic prediction data; (3) the Climate Index Regression (CIR) method, which predominantly uses observation-based climate indices; and (4) the Integrated Time Regression (ITR) method, which uses predictors selected from both CIR and MWR. Then, a sampling-based temporal downscaling was conducted using the Mahalanobis distance method in order to create daily weather inputs to the Soil and Water Assessment Tool (SWAT) model. Long-term predictability of water quality within the Wecheon watershed of the Nakdong River Basin was evaluated. According to the Korean Ministry of Environment's Provisions of Water Quality Prediction and Response Measures, modeling-based predictability was evaluated by using 3-month lead prediction data issued in February, May, August, and November as model input of SWAT. Finally, an integrated approach, which takes into account various climate information and downscaling methods for water quality prediction, was presented. This integrated approach can be used to prevent potential problems caused by extreme climate in advance.
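
    A minimal sketch of a generic mean-and-variance bias correction of a seasonal forecast against observations, in the spirit of the "Simple Bias Correction" step described above (the scaling form, values and variable are placeholders; the paper's SBC, MWR, CIR and ITR methods are more involved):

        import numpy as np

        def bias_correct(fcst, fcst_clim, obs_clim):
            """Rescale forecasts so their calibration-period mean/std match observations."""
            fcst, fcst_clim, obs_clim = (np.asarray(a, float) for a in (fcst, fcst_clim, obs_clim))
            scale = obs_clim.std(ddof=1) / fcst_clim.std(ddof=1)
            return obs_clim.mean() + scale * (fcst - fcst_clim.mean())

        # Placeholder hindcast and observed seasonal-mean temperatures (deg C) for one grid cell.
        hindcast = [21.0, 22.5, 20.8, 23.1, 21.9]
        observed = [19.5, 21.0, 19.2, 21.8, 20.4]
        new_forecast = [22.7, 21.2]
        print(np.round(bias_correct(new_forecast, hindcast, observed), 2))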

  13. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    PubMed Central

    Musmade, Kranti P.; Trilok, M.; Dengale, Swapnil J.; Bhat, Krishnamurthy; Reddy, M. S.; Musmade, Prashant B.; Udupa, N.

    2014-01-01

    A simple, precise, accurate, rapid, and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most of the citrus plants having variety of pharmacological activities. Method optimization was carried out by considering the various parameters such as effect of pH and column. The analyte was separated by employing a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature in isocratic conditions using phosphate buffer pH 3.5: acetonitrile (75 : 25% v/v) as mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guidelines Q2(R1). The method was found to be precise and accurate on statistical evaluation with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust. The proposed liquid chromatographic method was successfully employed for the routine analysis of said compound in developed novel nanopharmaceuticals. The presence of excipients did not show any interference on the determination of NAR, indicating method specificity. PMID:26556205

  14. Capillary isoelectric focusing method development and validation for investigation of recombinant therapeutic monoclonal antibody.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2015-10-10

    Capillary isoelectric focusing (cIEF) is a basic and highly accurate routine analytical tool to prove the identity of protein drugs in quality control (QC) and release tests in the biopharmaceutical industry. Some "out-of-the-box" applications are commercially available that provide easy and rapid isoelectric focusing solutions for investigating monoclonal antibody drug proteins; however, the use of these kits in routine testing entails high costs. A capillary isoelectric focusing method was developed and validated for identification testing of monoclonal antibody drug products with isoelectric points between 7.0 and 9.0. The developed method provides a good pH gradient for internal calibration (R²>0.99) and good resolution between all of the isoform peaks (R=2), while minimizing the time and complexity of sample preparation (no urea or salt used). The method is highly reproducible and is suitable for validation and method transfer to any QC laboratory. Another advantage of the method is that it uses commercially available chemicals that can be purchased from any supplier. Interaction with the capillary walls (to avoid precipitation and adsorption as far as possible) was minimized, and synthetic small-molecule isoelectric markers were used instead of peptide- or protein-based markers. The developed method was validated according to the current ICH guideline (Q2(R1)). Relative standard deviations were below 0.2% for isoelectric points and below 4% for normalized migration times. The method is robust to buffer components from different lots and to neutral capillaries with different types of inner coating. The fluorocarbon-coated column was chosen for reasons of cost-effectiveness.

  15. Development of a Hybrid RANS/LES Method for Turbulent Mixing Layers

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli

    2001-01-01

    Significant research has been underway for several years in NASA Glenn Research Center's nozzle branch to develop advanced computational methods for simulating turbulent flows in exhaust nozzles. The primary efforts of this research have concentrated on improving our ability to calculate the turbulent mixing layers that dominate flows both in the exhaust systems of modern-day aircraft and in those of hypersonic vehicles under development. As part of these efforts, a hybrid numerical method was recently developed to simulate such turbulent mixing layers. The method developed here is intended for configurations in which a dominant structural feature provides an unsteady mechanism to drive the turbulent development in the mixing layer. Interest in Large Eddy Simulation (LES) methods has increased in recent years, but applying an LES method to calculate the wide range of turbulent scales from small eddies in the wall-bounded regions to large eddies in the mixing region is not yet possible with current computers. As a result, the hybrid method developed here uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall-bounded regions entering a mixing section and uses a LES procedure to calculate the mixing-dominated regions. A numerical technique was developed to enable the use of the hybrid RANS-LES method on stretched, non-Cartesian grids. With this technique, closure for the RANS equations is obtained by using the Cebeci-Smith algebraic turbulence model in conjunction with the wall-function approach of Ota and Goldberg. The LES equations are closed using the Smagorinsky subgrid scale model. Although the function of the Cebeci-Smith model to replace all of the turbulent stresses is quite different from that of the Smagorinsky subgrid model, which only replaces the small subgrid turbulent stresses, both are eddy viscosity models and both are derived at least in part from mixing-length theory. The similar formulation of these two models enables the RANS
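
    For reference, the Smagorinsky closure mentioned above models the subgrid stresses through an eddy viscosity (standard form, quoted here for context rather than from the report):

        \nu_t = (C_s\,\Delta)^2\,|\bar{S}|, \qquad
        |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}, \qquad
        \bar{S}_{ij} = \tfrac{1}{2}\left(\frac{\partial \bar{u}_i}{\partial x_j} + \frac{\partial \bar{u}_j}{\partial x_i}\right)

    where \Delta is the filter width and C_s the Smagorinsky constant; the Cebeci-Smith model used on the RANS side likewise supplies an eddy viscosity, which is the structural similarity the hybrid coupling exploits.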

  16. An overview of recent developments in genomics and associated statistical methods.

    PubMed

    Bickel, Peter J; Brown, James B; Huang, Haiyan; Li, Qunhua

    2009-11-13

    The landscape of genomics has changed drastically in the last two decades. Increasingly inexpensive sequencing has shifted the primary focus from the acquisition of biological sequences to the study of biological function. Assays have been developed to study many intricacies of biological systems, and publicly available databases have given rise to integrative analyses that combine information from many sources to draw complex conclusions. Such research was the focus of the recent workshop at the Isaac Newton Institute, 'High dimensional statistics in biology'. Many computational methods from modern genomics and related disciplines were presented and discussed. Using, as much as possible, the material from these talks, we give an overview of modern genomics: from the essential assays that make data-generation possible, to the statistical methods that yield meaningful inference. We point to current analytical challenges, where novel methods, or novel applications of extant methods, are presently needed.

  17. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy.

    PubMed

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2016-02-10

    The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH-Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model was focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristics of very short analysis times (∼2.5s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study.
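
    A minimal sketch of generating a 3-level, 2-factor full factorial calibration design of the kind mentioned above (the factor names and level values are invented placeholders, not the study's design):

        from itertools import product

        # Hypothetical factors and levels for the calibration tablets.
        levels = {
            "niacinamide_pct": [90, 100, 110],   # % of target API content
            "compression_kN": [5, 10, 15],       # tablet compression force
        }
        design = [dict(zip(levels, combo)) for combo in product(*levels.values())]
        for run, point in enumerate(design, 1):   # 3 x 3 = 9 design points
            print(run, point)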

  18. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  19. Development of an Estimation Method for Damper Design in Static Elastography

    NASA Astrophysics Data System (ADS)

    Sato, Takayuki; Sato, Shizuka; Watanabe, Yasuaki; Goka, Shigeyoshi; Sekimoto, Hitoshi

    2010-07-01

    The estimation of elasticity distribution in static elastography methods strongly depends on the shape of the transducer head. It is known that the insertion of a damper layer is effective in reducing the spatial nonuniformity of the applied strain distribution. We have developed a new estimation method for assessing the effectiveness of this technique through structural analysis using the finite element method and acoustic analysis using the finite-difference time-domain method with multilayered tissues. The results show that the strain flatness inside the compressed tissue is markedly improved, especially near the tissue surface, by inserting a damper layer between the transducer and the tissue surface. The elasticity of an inclusion and the deeper region of a three-layer tissue are also adequately imaged.

  20. Space-Time Conservation Element and Solution Element Method Being Developed

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Himansu, Ananda; Jorgenson, Philip C. E.; Loh, Ching-Yuen; Wang, Xiao-Yen; Yu, Sheng-Tao

    1999-01-01

    The engineering research and design requirements of today pose great computer-simulation challenges to engineers and scientists who are called on to analyze phenomena in continuum mechanics. The future will bring even more daunting challenges, when increasingly complex phenomena must be analyzed with increased accuracy. Traditionally used numerical simulation methods have evolved to their present state by repeated incremental extensions to broaden their scope. They are reaching the limits of their applicability and will need to be radically revised, at the very least, to meet future simulation challenges. At the NASA Lewis Research Center, researchers have been developing a new numerical framework for solving conservation laws in continuum mechanics, namely, the Space-Time Conservation Element and Solution Element Method, or the CE/SE method. This method has been built from fundamentals and is not a modification of any previously existing method. It has been designed with generality, simplicity, robustness, and accuracy as cornerstones. The CE/SE method has thus far been applied in the fields of computational fluid dynamics, computational aeroacoustics, and computational electromagnetics. Computer programs based on the CE/SE method have been developed for calculating flows in one, two, and three spatial dimensions. Results have been obtained for numerous problems and phenomena, including various shock-tube problems, ZND detonation waves, an implosion and explosion problem, shocks over a forward-facing step, a blast wave discharging from a nozzle, various acoustic waves, and shock/acoustic-wave interactions. The method can clearly resolve shock/acoustic-wave interactions, wherein the difference of the magnitude between the acoustic wave and shock could be up to six orders. In two-dimensional flows, the reflected shock is as crisp as the leading shock. CE/SE schemes are currently being used for advanced applications to jet and fan noise prediction and to chemically

  1. Balancing nurses' workload in hospital wards: study protocol of developing a method to manage workload

    PubMed Central

    van den Oetelaar, W F J M; van Stel, H F; van Rhenen, W; Stellato, R K; Grolman, W

    2016-01-01

    Introduction: Hospitals pursue different goals at the same time: excellent service to their patients, good quality care, operational excellence and retaining employees. This requires a good balance between patient needs and nursing staff. One way to ensure a proper fit between patient needs and nursing staff is to work with a workload management method. In our view, a nursing workload management method needs to have the following characteristics: easy to interpret; limited additional registration; applicable to different types of hospital wards; supported by nurses; covering all activities of nurses; and suitable for prospective planning of nursing staff. At present, no such method is available. Methods/analysis: The research follows several steps to arrive at a workload management method for staff nurses. First, a list of patient characteristics relevant to care time will be composed by performing a Delphi study among staff nurses. Next, a time study of nurses' activities will be carried out. The two can be combined to estimate care time per patient group and the time nurses spend on non-patient-related activities. These two estimates can then be combined and compared with the available nursing resources; this gives an estimate of nurses' workload. The research will take place in an academic hospital in the Netherlands. Six surgical wards with a capacity of 15-30 beds will be included. Ethical considerations: The study protocol was submitted to the Medical Ethical Review Board of the University Medical Center (UMC) Utrecht and received a positive advice, protocol number 14-165/C. Discussion: This method will be developed in close cooperation with staff nurses and ward management. The strong involvement of the end users will contribute to broader support for the results. The method we will develop may also be useful for planning purposes; this is a strong advantage compared with existing methods, which tend to focus on retrospective analysis. PMID:28186931
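
    A toy sketch of the arithmetic the protocol describes: combining estimated care time per patient group with non-patient-related time and comparing the total against available nursing hours. The patient groups, times and staffing figures below are invented for illustration only:

        # Hypothetical care-time estimates per patient group (hours per patient per shift).
        care_time = {"low_complexity": 1.0, "medium_complexity": 2.0, "high_complexity": 3.5}
        # Hypothetical census on one ward for one shift.
        census = {"low_complexity": 8, "medium_complexity": 6, "high_complexity": 2}

        patient_related = sum(care_time[g] * census[g] for g in census)  # hours needed for patients
        non_patient_related = 6.0       # meetings, admin, walking distances, etc. (illustrative)
        available = 4 * 8.0             # four nurses on an 8-hour shift

        workload_ratio = (patient_related + non_patient_related) / available
        print(f"required {patient_related + non_patient_related:.1f} h vs available {available:.1f} h "
              f"-> workload ratio {workload_ratio:.2f}")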

  2. Impact of the emulsification-diffusion method on the development of pharmaceutical nanoparticles.

    PubMed

    Quintanar-Guerrero, David; Zambrano-Zaragoza, María de la Luz; Gutierrez-Cortez, Elsa; Mendoza-Munoz, Nestor

    2012-12-01

    Nanotechnology is having a profound impact in many scientific fields and has become one of the most important and exciting disciplines. Like all technological advances, nanotechnology has its own scientific basis with a broad interdisciplinary effect. We are witnessing an exponential growth of nanotechnology; a reflection of this is the marked increase in the number of patents, scientific papers and specialized "nano" meetings and journals. The impact in the pharmaceutical area is related to the use of colloidal drug delivery systems as carriers for bioactive agents, in particular nanoparticle technology. The term nanoparticles designates solid submicron particles formed of acceptable materials (e.g. polymers, lipids, etc.) containing an active substance. It includes both nanospheres (matrix systems) and nanocapsules (membrane systems). Knowledge of nanoparticle preparation methods is a key issue for the formulator involved in drug-delivery research and development. In general, methods based on preformed polymers, in particular biodegradable polymers, are preferred because of their easy implementation and lower potential toxicity. One of the most widely used methods to prepare polymeric nanoparticles is emulsification-diffusion. This method has been discussed in some reviews that compile research works but appears in only a small number of patents. In this review, the emulsification-diffusion method is discussed from a technological point of view in order to show the operating conditions and formulation variables from data extracted from recent patents and experimental works. The main idea is to provide the reader with a general guide for making decisions about the usefulness of this method to develop specific nanoparticulate systems. The first part of this review provides an overview of the emulsification-diffusion method to prepare polymeric nanoparticles, while the second part evaluates the influence of preparative variables on the

  3. Research on Assessment Methods for Urban Public Transport Development in China

    PubMed Central

    Zou, Linghong; Guo, Hongwei

    2014-01-01

    In recent years, with the rapid increase in urban population, urban travel demand in Chinese cities has been increasing dramatically. As a result, developing comprehensive urban transport systems becomes an inevitable choice to meet the growing urban travel demand. In urban transport systems, public transport plays the leading role in promoting sustainable urban development. This paper aims to establish an assessment index system for the development level of urban public transport consisting of a target layer, a criterion layer, and an index layer. A review of the existing literature shows that the methods used in evaluating urban public transport structure are predominantly qualitative. To overcome this shortcoming, fuzzy mathematics is used to describe qualitative issues quantitatively, and the AHP (analytic hierarchy process) is used to quantify experts' subjective judgments. The assessment model is established based on the fuzzy AHP. The weight of each index is determined through the AHP and the degree of membership of each index through the fuzzy assessment method to obtain the fuzzy synthetic assessment matrix. Finally, a case study is conducted to verify the rationality and practicability of the assessment system and the proposed assessment method. PMID:25530756
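
    A minimal sketch of the crisp AHP weighting step referred to above — principal-eigenvector weights and Saaty's consistency ratio for a pairwise comparison matrix. The example judgments are invented, and the fuzzy-membership stage of the paper's model is not reproduced here:

        import numpy as np

        def ahp_weights(A):
            """Principal-eigenvector weights and consistency ratio of a pairwise matrix A."""
            A = np.asarray(A, float)
            n = A.shape[0]
            eigvals, eigvecs = np.linalg.eig(A)
            k = np.argmax(eigvals.real)
            w = np.abs(eigvecs[:, k].real)
            w /= w.sum()
            ci = (eigvals[k].real - n) / (n - 1)
            ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # commonly cited Saaty values
            return w, (ci / ri if ri else 0.0)

        # Illustrative judgments for three criteria (e.g. coverage, service level, efficiency).
        A = [[1, 3, 5],
             [1/3, 1, 2],
             [1/5, 1/2, 1]]
        w, cr = ahp_weights(A)
        print(np.round(w, 3), f"CR = {cr:.3f}")   # CR < 0.1 is usually taken as acceptable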

  4. Development and validation of an HPLC-MS/MS method for the early diagnosis of aspergillosis.

    PubMed

    Cerqueira, Letícia B; de Francisco, Thais M G; Gasparetto, João C; Campos, Francinete R; Pontarolo, Roberto

    2014-01-01

    Invasive aspergillosis is an opportunistic infection that is mainly caused by Aspergillus fumigatus, which is known to produce several secondary metabolites, including gliotoxin, the most abundant metabolite produced during hyphal growth. The diagnosis of invasive aspergillosis is often made late in the infection because of the lack of reliable and feasible diagnostic techniques; therefore, early detection is critical to begin treatment and avoid more serious complications. The present work reports the development and validation of an HPLC-MS/MS method for the detection of gliotoxin in the serum of patients with suspected aspergillosis. Chromatographic separation was achieved using an XBridge C18 column (150 × 2.1 mm id; 5 µm particle size) maintained at 25 °C with the corresponding guard column (XBridge C18, 10 × 2.1 mm id, 5 µm particle size). The mobile phase was composed of a gradient of water and acetonitrile/water (95:5 v/v), both containing 1 mM ammonium formate, at a flow rate of 0.45 mL/min. Data from the validation studies demonstrate that this new method is highly sensitive, selective, linear, precise, accurate and free from matrix interference. The developed method was successfully applied to samples from patients suspected of having aspergillosis. Therefore, the developed method has considerable potential as a diagnostic technique for aspergillosis.

  5. Polymorphism in nimodipine raw materials: development and validation of a quantitative method through differential scanning calorimetry.

    PubMed

    Riekes, Manoela Klüppel; Pereira, Rafael Nicolay; Rauber, Gabriela Schneider; Cuffini, Silvia Lucia; de Campos, Carlos Eduardo Maduro; Silva, Marcos Antonio Segatto; Stulzer, Hellen Karine

    2012-11-01

    Because of the physicochemical and therapeutic impacts of polymorphism, its monitoring in raw materials is necessary. The purpose of this study was to develop and validate a quantitative method to determine the polymorphic content of nimodipine (NMP) raw materials based on differential scanning calorimetry (DSC). The polymorphs required for the development of the method were characterized through DSC, X-ray powder diffraction (XRPD) and Raman spectroscopy, and their polymorphic identity was confirmed. The developed method was found to be linear, robust, precise, accurate and specific. Three samples obtained from distinct suppliers (NMP 1, NMP 2 and NMP 3) were first characterized through XRPD and DSC as polymorphic mixtures. Determination of their polymorphic identity revealed that all samples contained Modification I (Mod I), the metastable form, in greatest proportion. Since the commercial polymorph is Mod I, the polymorphic characteristics of such samples need to be investigated. Thus, the proposed method provides a useful tool for monitoring the polymorphic content of NMP raw materials.

  6. Integrating design science theory and methods to improve the development and evaluation of health communication programs.

    PubMed

    Neuhauser, Linda; Kreps, Gary L

    2014-12-01

    Traditional communication theory and research methods provide valuable guidance about designing and evaluating health communication programs. However, efforts to use health communication programs to educate, motivate, and support people to adopt healthy behaviors often fail to meet the desired goals. One reason for this failure is that health promotion issues are complex, changeable, and highly related to the specific needs and contexts of the intended audiences. It is a daunting challenge to effectively influence health behaviors, particularly culturally learned and reinforced behaviors concerning lifestyle factors related to diet, exercise, and substance (such as alcohol and tobacco) use. Too often, program development and evaluation are not adequately linked to provide rapid feedback to health communication program developers so that important revisions can be made to design the most relevant and personally motivating health communication programs for specific audiences. Design science theory and methods commonly used in engineering, computer science, and other fields can address such program and evaluation weaknesses. Design science researchers study human-created programs using tightly connected build-and-evaluate loops in which they use intensive participatory methods to understand problems and develop solutions concurrently and throughout the duration of the program. Such thinking and strategies are especially relevant to address complex health communication issues. In this article, the authors explore the history, scientific foundation, methods, and applications of design science and its potential to enhance health communication programs and their evaluation.

  7. Systemic Sclerosis Classification Criteria: Developing methods for multi-criteria decision analysis with 1000Minds

    PubMed Central

    Johnson, Sindhu R.; Naden, Raymond P.; Fransen, Jaap; van den Hoogen, Frank; Pope, Janet E.; Baron, Murray; Tyndall, Alan; Matucci-Cerinic, Marco; Denton, Christopher P.; Distler, Oliver; Gabrielli, Armando; van Laar, Jacob M.; Mayes, Maureen; Steen, Virginia; Seibold, James R.; Clements, Phillip; Medsger, Thomas A.; Carreira, Patricia E.; Riemekasten, Gabriela; Chung, Lorinda; Fessler, Barri J.; Merkel, Peter A.; Silver, Richard; Varga, John; Allanore, Yannick; Mueller-Ladner, Ulf; Vonk, Madelon C.; Walker, Ulrich A.; Cappelli, Susanna; Khanna, Dinesh

    2014-01-01

    Objective Classification criteria for systemic sclerosis (SSc) are being developed. The objectives were to: develop an instrument for collating case data and evaluate its sensibility; use forced-choice methods to reduce and weight criteria; and explore agreement between experts on the probability that cases would be classified as SSc. Study Design and Setting A standardized instrument was tested for sensibility. The instrument was applied to 20 cases covering a range of probabilities that each had SSc. Experts rank-ordered cases from highest to lowest probability; reduced and weighted the criteria using forced-choice methods; and re-ranked the cases. Consistency in rankings was evaluated using intraclass correlation coefficients (ICC). Results Experts endorsed clarity (83%), comprehensibility (100%), and face and content validity (100%). Criteria were weighted (points): finger skin thickening (14–22), finger-tip lesions (9–21), friction rubs (21), finger flexion contractures (16), pulmonary fibrosis (14), SSc-related antibodies (15), Raynaud's phenomenon (13), calcinosis (12), pulmonary hypertension (11), renal crisis (11), telangiectasia (10), abnormal nailfold capillaries (10), esophageal dilation (7) and puffy fingers (5). The ICC across experts was 0.73 (95% CI 0.58–0.86) and improved to 0.80 (95% CI 0.68–0.90). Conclusions Using a sensible instrument and forced-choice methods, the number of criteria was reduced by 39% (from 23 to 14) and the criteria were weighted. Our methods reflect the rigors of measurement science and serve as a template for developing classification criteria. PMID:24721558
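
    Because the criteria above are additively weighted, classifying a case reduces to summing the points of the items present and comparing the total with a cut-off. The sketch below uses illustrative point values drawn from the ranges quoted above and a purely hypothetical threshold (the actual threshold was fixed later in the criteria-development process).

```python
# Illustrative point values drawn from the weight ranges quoted above
# (single values chosen within each range; not the final published weights).
points = {
    "finger_skin_thickening": 18,
    "finger_tip_lesions": 15,
    "friction_rubs": 21,
    "finger_flexion_contractures": 16,
    "pulmonary_fibrosis": 14,
    "ssc_related_antibodies": 15,
    "raynauds_phenomenon": 13,
    "calcinosis": 12,
    "pulmonary_hypertension": 11,
    "renal_crisis": 11,
    "telangiectasia": 10,
    "abnormal_nailfold_capillaries": 10,
    "esophageal_dilation": 7,
    "puffy_fingers": 5,
}

def classify(case_features, threshold=25):
    """Sum the points of the features present in a case and compare the
    total with a hypothetical classification threshold."""
    total = sum(points[f] for f in case_features)
    return total, total >= threshold

# Example case with three positive items
total, is_ssc = classify({"raynauds_phenomenon", "puffy_fingers",
                          "abnormal_nailfold_capillaries"})
print(f"total score = {total}, classified as SSc: {is_ssc}")
```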

  8. Research on assessment methods for urban public transport development in China.

    PubMed

    Zou, Linghong; Dai, Hongna; Yao, Enjian; Jiang, Tian; Guo, Hongwei

    2014-01-01

    In recent years, with the rapid increase in urban population, travel demands in Chinese cities have been growing dramatically. As a result, developing comprehensive urban transport systems has become an inevitable choice to meet these growing travel demands. In urban transport systems, public transport plays the leading role in promoting sustainable urban development. This paper aims to establish an assessment index system for the development level of urban public transport consisting of a target layer, a criterion layer, and an index layer. A review of the existing literature shows that the methods used to evaluate urban public transport structure are predominantly qualitative. To overcome this shortcoming, a fuzzy mathematics method is used to describe qualitative issues quantitatively, and AHP (analytic hierarchy process) is used to quantify experts' subjective judgments. The assessment model is established based on the fuzzy AHP: the weight of each index is determined through the AHP and the degree of membership of each index through the fuzzy assessment method, yielding the fuzzy synthetic assessment matrix. Finally, a case study is conducted to verify the rationality and practicability of the assessment system and the proposed assessment method.

  9. Development of a capillary electrophoresis method for the characterization of "palo azul" (Eysenhardtia polystachya).

    PubMed

    Salinas-Hernández, Pastora; López-Bermúdez, Francisco J; Rodríguez-Barrientos, Damaris; Ramírez-Silva, María Teresa; Romero-Romo, Mario A; Morales-Anzures, Fernando; Rojas-Hernández, Alberto

    2008-03-01

    The tree Eysenhardtia polystachya (Ortega) Sarg. is widely used in traditional Mexican medicine as an herbal remedy. Such popular practices provide a relevant basis for designing optimal analytical methods to determine the active principles of medicinal plants, and characterization of these products fundamentally requires the development of an efficient and reliable separation method. This work presents the results of the development and optimization of a novel CE method for the separation of components from water/ethanol (1:1) extracts of E. polystachya, using the following conditions, considered the best obtained: 10 mM phosphate buffer, 20 kV and pH 8.1 with detection at 214 nm; and 50 mM phosphate buffer, 12.5 kV and pH 8.1 with detection at 426 nm. The optimization takes into account parameters obtained from the resulting electropherograms, such as the number of peaks, the migration times, and the Δtm of neighboring peaks. Under optimal conditions the intended separation was attained within 15 and 20 min at 214 and 426 nm, respectively. The characterization method developed was applied to the analysis of diverse extracts of E. polystachya.

  10. Rural school nurse perception of book studies as an effective method for professional development.

    PubMed

    Gray, Lorali

    2014-05-01

    School nurses who serve public school districts in rural Northwest Washington face barriers in accessing continuing education (CE) for professional development, as they often practice in remote, isolated school communities. Acknowledging these barriers, the author discusses the inclusion of book studies within an existing training structure as an innovative method of providing professional development. By utilizing training that rural school nurses already attend, CE can be enhanced without incurring additional travel, cost, or training time. School nurses' perception of the effectiveness of book studies as a CE method was examined through a descriptive, qualitative program evaluation. Over a period of 5 years, evaluation and feedback data from 12 rural school nurses were compiled from nine individual school nurse book study evaluations and one general satisfaction survey. Findings indicated overall school nurse satisfaction and the belief that school nurse book studies are an effective and beneficial method for the delivery of professional development, one that promotes collaborative learning and collegiality, informs practice, and provides insight into the broader health and social issues affecting today's students.

  11. Modeling methods for identifying critical source areas of bacteria: recent developments and future perspectives.

    PubMed

    Tong, Yangbin; Deng, Zhiqiang

    2013-03-01

    Identification of critical source areas of bacteria in a watershed is essential to environmental management and restoration. As a result of the nonpoint and distributed nature of bacterial pollution in watersheds, it is often difficult to identify specific source areas of bacteria for remediation because bacteria collected from different sampling sites might display similar fingerprints. Over the past decade, extensive efforts have been made to identify microbial pollution sources, especially in watersheds. The primary objective of this study was to identify effective methods that can be applied to tracking critical source areas of bacteria in a watershed by a review of recent developments in several modeling methods. Comparisons of the models and their applications revealed that comprehensive watershed-scale source area tracking primarily involves two steps: geographical tracking and mathematical tracking. In terms of geographical tracking, bacterial source locations must be identified to prepare structural best management practices or low impact development for site treatments. For mathematical tracking, the quantity (strength) or release history of bacterial sources must be computed to develop total maximum daily loads (TMDLs) for bacterial load reduction and water quality restoration. Mathematically, source tracking is essentially an inverse modeling issue under uncertainty, requiring inverse modeling combined with a geostatistical method or an optimization algorithm. Consequently, combining biological methods, mathematical models, and sensor technologies (including remote sensing and in-situ sensing) provides an effective approach to identifying critical source locations of bacteria at the watershed scale.
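
    As a concrete illustration of the "mathematical tracking" step, source strengths can be estimated by inverting a modeled transport relationship against observed concentrations. The sketch below uses non-negative least squares on synthetic data; the transport matrix and observations are hypothetical stand-ins for the output of a watershed model, not the specific methods reviewed in the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical transport matrix: element [i, j] is the concentration at
# monitoring site i produced by a unit release from candidate source
# area j (in practice this would come from a calibrated watershed model).
A = np.array([
    [0.8, 0.1, 0.0],
    [0.3, 0.6, 0.1],
    [0.0, 0.2, 0.9],
])

true_strengths = np.array([5.0, 1.0, 2.0])   # hypothetical "unknown" sources
y = A @ true_strengths                       # synthetic observations

# Non-negative least squares recovers the source strengths; areas with
# large estimated strengths are the critical source areas.
strengths, residual = nnls(A, y)
print("estimated source strengths:", strengths.round(2))
```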

  12. Methods to identify, study and understand End-user participation in HIT development

    PubMed Central

    2011-01-01

    Background Experience has shown that for new health information technology (HIT) to be successful, clinicians must obtain positive clinical benefits as a result of its implementation and joint ownership of the decisions made during the development process. A prerequisite for achieving both success criteria is real end-user participation. Experience has also shown that further research into developing improved methods to collect more detailed information on the social groups participating in HIT development is needed in order to support, facilitate and improve real end-user participation. Methods A case study of an EHR planning process in a Danish county from October 2003 until April 2006 was conducted using process analysis. Three social groups (physicians, IT professionals and administrators) were identified and studied in the local, present perspective. In order to understand the interactions between the three groups, the national, historic perspective was included through a literature study. Data were collected through observations, interviews, insight gathered from documents and relevant literature. Results In the local, present perspective, the administrators' strategy for the EHR planning process meant that there was no clinical workload reduction. This was seen as one of the main barriers to the physicians achieving real influence. In the national, historic perspective, physicians and administrators have had/have different perceptions of the purpose of the patient record, and both have struggled to influence its definition. To date, the administrators have won the battle. This explains the conditions made available for the physicians' participation in this case, which led to their role being reduced to that of clinical consultants rather than real participants. Conclusion In HIT development the interests of, and the balance of power between, the different social groups involved are decisive in determining whether or not the end-users become real

  13. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    SciTech Connect

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A.

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  14. A generic simulation cell method for developing extensible, efficient and readable parallel computational models

    NASA Astrophysics Data System (ADS)

    Honkonen, I.

    2015-03-01

    I present a method for developing extensible and modular computational models without sacrificing serial or parallel performance or source code readability. By using a generic simulation cell method I show that it is possible to combine several distinct computational models to run in the same computational grid without requiring modification of existing code. This is an advantage for the development and testing of, e.g., geoscientific software as each submodel can be developed and tested independently and subsequently used without modification in a more complex coupled program. An implementation of the generic simulation cell method presented here, generic simulation cell class (gensimcell), also includes support for parallel programming by allowing model developers to select which simulation variables of, e.g., a domain-decomposed model to transfer between processes via a Message Passing Interface (MPI) library. This allows the communication strategy of a program to be formalized by explicitly stating which variables must be transferred between processes for the correct functionality of each submodel and the entire program. The generic simulation cell class requires a C++ compiler that supports a version of the language standardized in 2011 (C++11). The code is available at https://github.com/nasailja/gensimcell for everyone to use, study, modify and redistribute; those who do are kindly requested to acknowledge and cite this work.
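
    The core idea, a cell that aggregates the variables of several submodels and records which of them must be communicated between processes, can be illustrated conceptually as follows. This is only a sketch of the concept in Python; the actual gensimcell implementation is a C++11 template class and its API differs.

```python
class Cell:
    """Conceptual analogue of a generic simulation cell: stores the variables
    of any number of submodels and records, per variable, whether it must be
    transferred between processes."""

    def __init__(self):
        self._data = {}
        self._transfer = {}

    def set(self, name, value, transfer=False):
        self._data[name] = value
        self._transfer[name] = transfer

    def get(self, name):
        return self._data[name]

    def transfer_payload(self):
        # In an MPI program, this dictionary would be serialized and
        # exchanged for cells on process boundaries.
        return {k: v for k, v in self._data.items() if self._transfer[k]}

# Two hypothetical submodels share one cell without knowing about each
# other's variables.
cell = Cell()
cell.set("advection_density", 1.2, transfer=True)   # needed by neighbours
cell.set("chemistry_rate", 0.05, transfer=False)    # purely local
print(cell.transfer_payload())
```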

  15. Development of correction methods for variable pinhole single-photon emission computed tomography

    NASA Astrophysics Data System (ADS)

    Bae, S.; Bae, J.; Lee, H.; Lee, K.

    2016-02-01

    We propose a novel pinhole collimator in which the pinhole shape can be changed in real-time, and a new single-photon emission computed tomography (SPECT) system that utilizes this variable pinhole (VP) collimator. The acceptance angle and distance between the collimator and the object of VP SPECT are varied so that the optimum value of the region-of-interest (ROI) can be obtained for each rotation angle. Because of these geometrical variations, new correction methods are required for image reconstruction. In this study, we developed two correction methods. The first is the sensitivity-correction algorithm, which minimizes the variation of a system matrix caused by varying the acceptance angle for each rotation angle. The second is the acquisition-time-correction method, which reduces the variation of uniformity caused by varying the distance between the collimator and the object for each rotation angle. A 3D maximum likelihood expectation maximization (MLEM) algorithm was applied to image reconstruction, and two digital phantoms were studied to evaluate the resolution and sensitivity of the images obtained using the proposed methods. The images obtained by using the proposed correction methods show higher uniformity and resolution than those obtained without using these methods. In particular, the results of the resolution phantom study show that hot rods (0.8-mm-diameter) can be clearly distinguished using the proposed correction methods. A quantitative analysis of the ROI phantom revealed that the mean square error (MSE) was 0.42 without the acquisition-time-correction method, and 0.04 with the acquisition-time-correction method. The MSEs of the resolution phantom without and with the acquisition-time-correction method were calculated as 55.14 and 14.69, respectively.
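
    For reference, the MLEM update used in such reconstructions multiplies the current estimate by the back-projected ratio of measured to modeled projections. The minimal sketch below uses a small random system matrix and noiseless synthetic data; it includes neither the VP SPECT geometry nor the proposed correction methods.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((50, 20))            # hypothetical system matrix (bins x voxels)
x_true = rng.random(20)             # hypothetical activity distribution
y = A @ x_true                      # noiseless projections for the demo

x = np.ones(20)                     # uniform initial estimate
sensitivity = A.sum(axis=0)         # column sums (back-projection of ones)
for _ in range(100):
    ratio = y / (A @ x + 1e-12)     # measured / modeled projections
    x *= (A.T @ ratio) / sensitivity

mse = np.mean((x - x_true) ** 2)    # mean square error, as reported above
print(f"MSE after 100 MLEM iterations: {mse:.2e}")
```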

  16. Simple and effective HPLC method development and its validation for Cilnidipine in drug-free human plasma.

    PubMed

    Muralidharan, Selvadurai; Kumar, Jaya raja; Dhanaraj, Sokkalingam Arumugam

    2015-01-01

    A simple and effective high performance liquid chromatography (HPLC) method was developed for the estimation of cilnidipine in drug-free human blank plasma, with nifedipine as the internal standard (IS). The method used protein-precipitation extraction of cilnidipine from blank plasma. Separation was achieved on a reversed-phase C18 column (25 cm × 4.6 mm, 5 μm) and detection was performed with a UV detector at 260 nm. The optimized mobile phase was acetonitrile:5 mM potassium dihydrogen orthophosphate (pH 4.5) in the ratio 60:40 (v/v) at a flow rate of 1.0 mL/min. Linearity was achieved over the range 10.0-125.0 ng/mL with a regression coefficient of 0.99. The method proved precise, accurate and specific during the study. Its simplicity allows application in laboratories that lack sophisticated analytical instruments such as LC-MS/MS or GC-MS/MS, which are complicated, costly and time-consuming compared with a simple HPLC-UV method. The method was successfully applied to pharmacokinetic studies.
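
    The reported linearity over 10.0-125.0 ng/mL implies a standard least-squares calibration of the analyte/IS peak-area ratio against concentration, from which unknowns are back-calculated. The sketch below illustrates that generic step with hypothetical calibration data, not data from the paper.

```python
import numpy as np

# Hypothetical calibration standards (ng/mL) and analyte/IS peak-area ratios
conc = np.array([10, 25, 50, 75, 100, 125], dtype=float)
ratio = np.array([0.081, 0.196, 0.402, 0.594, 0.805, 0.998])

# Least-squares line: ratio = slope * conc + intercept
slope, intercept = np.polyfit(conc, ratio, 1)
r = np.corrcoef(conc, ratio)[0, 1]
print(f"slope = {slope:.5f}, intercept = {intercept:.5f}, r^2 = {r**2:.4f}")

# Back-calculate the concentration of an unknown plasma sample
unknown_ratio = 0.45
print("back-calculated conc (ng/mL):", round((unknown_ratio - intercept) / slope, 1))
```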

  17. Development and validation of a novel RP-HPLC method for the analysis of reduced glutathione.

    PubMed

    Sutariya, Vijaykumar; Wehrung, Daniel; Geldenhuys, Werner J

    2012-03-01

    The objective of this study was the development, optimization, and validation of a novel reverse-phase high-pressure liquid chromatography (RP-HPLC) method for the quantification of reduced glutathione in pharmaceutical formulations utilizing simple UV detection. The separation utilized a C18 column at room temperature, and UV absorption was measured at 215 nm. The mobile phase was an isocratic flow of a 50/50 (v/v) mixture of water (pH 7.0) and acetonitrile at 1.0 mL/min. Validation assessed the method's performance in seven categories: linearity, range, limit of detection, limit of quantification, accuracy, precision, and selectivity. Analysis of the system suitability showed acceptable levels of suitability in all categories. Likewise, the method displayed an acceptable degree of linearity (r(2) = 0.9994) over a concentration range of 2.5-60 µg/mL. The detection limit and quantification limit were 0.6 and 1.8 µg/mL, respectively. The percent recovery of the method was 98.80-100.79%. Following validation, the method was employed in the determination of glutathione in pharmaceutical formulations in the form of a conjugate and a nanoparticle. The proposed method offers a simple, accurate, and inexpensive way to quantify reduced glutathione.
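
    The detection and quantification limits quoted above are conventionally estimated from the residual standard deviation of the calibration line as LOD = 3.3σ/S and LOQ = 10σ/S. A sketch of that calculation with hypothetical calibration data follows.

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/mL) versus peak area
conc = np.array([2.5, 5, 10, 20, 40, 60], dtype=float)
area = np.array([51.0, 99.5, 201.0, 398.0, 803.0, 1199.0])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)           # residual standard deviation of the fit

lod = 3.3 * sigma / slope               # signal-to-slope estimates (ICH style)
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```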

  18. Development of a conceptual flight vehicle design weight estimation method library and documentation

    NASA Astrophysics Data System (ADS)

    Walker, Andrew S.

    The state of the art in estimating the volumetric size and mass of flight vehicles is held today by an elite group of engineers in the aerospace conceptual design industry. This is not a skill readily accessible or taught in academia. To estimate flight vehicle mass properties, many aerospace engineering students are encouraged to read the latest design textbooks, learn how to use a few basic statistical equations, and plunge into the details of parametric mass properties analysis. Specifications for, and a prototype of, a standardized engineering "tool-box" of conceptual and preliminary design weight estimation methods were developed to manage the growing and ever-changing body of weight estimation knowledge. This also bridges a gap in mass properties education for aerospace engineering students. The Weight Method Library will also serve as a living document for future aerospace students. This "tool-box" consists of a weight estimation method bibliography containing unclassified, open-source literature for the conceptual and preliminary flight vehicle design phases. Transport aircraft validation cases have been applied to each entry in the AVD Weight Method Library in order to provide a sense of context and applicability for each method. The weight methodology validation results indicate consensus and agreement among the individual methods. This generic specification of a method library will be applicable for use by other disciplines within the AVD Lab, post-graduate design labs, and engineering design professionals.

  19. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    NASA Technical Reports Server (NTRS)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process-based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight OPtimization System) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this contract in order to gain insight into the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.
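
    Treating a cost estimating relationship probabilistically amounts to propagating input distributions through the CER by Monte Carlo sampling and reporting cost percentiles. The toy sketch below uses a hypothetical weight-based CER with made-up parameter distributions, not the actual ALCCA relationships.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical CER: cost = a * empty_weight ** b, with uncertain
# coefficients and an uncertain design weight (triangular distribution).
a = rng.normal(2.5e3, 0.2e3, N)
b = rng.normal(0.92, 0.03, N)
empty_weight = rng.triangular(60_000, 65_000, 75_000, N)   # lb

cost = a * empty_weight ** b
p10, p50, p90 = np.percentile(cost, [10, 50, 90])
print(f"cost percentiles: P10={p10:.3e}  P50={p50:.3e}  P90={p90:.3e}")
```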

  20. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize.
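
    In this family of real-time PCR methods, the GMO amount is typically computed from the ratio of event-specific to endogenous-gene copy numbers divided by the conversion factor (Cf) determined from 100% GM material. The sketch below assumes that conventional formula and uses hypothetical copy numbers; it is not extracted from the paper.

```python
def gmo_percent(event_copies, endogenous_copies, cf):
    """Weight-based GMO content (%) from measured copy numbers, assuming the
    conventional conversion-factor formula: the copy-number ratio of the
    sample divided by the ratio observed in pure GM material (Cf)."""
    return (event_copies / endogenous_copies) / cf * 100.0

# Hypothetical values: 1,500 event-specific copies, 120,000 endogenous
# copies, Cf = 0.25 determined from 100% LY038 seeds.
print(f"GMO content: {gmo_percent(1_500, 120_000, 0.25):.2f} %")
```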

  1. Selection of reference standard during method development using the analytical hierarchy process.

    PubMed

    Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun

    2015-03-25

    A reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of a reference standard during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of that priority, as the second choice. The determination results verified the evaluation ability of this model. The AHP allowed the benefits and risks of the alternatives to be considered comprehensively and proved an effective and practical tool for selecting reference standards during method development.
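
    In standard AHP analysis, the priority of each alternative or criterion is obtained from a pairwise comparison matrix, usually as its principal eigenvector, with a consistency ratio checked against Saaty's random index. A generic sketch with a hypothetical 3 x 3 comparison matrix follows; it does not use the comparison data from the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria
# (Saaty's 1-9 scale; element [i, j] = importance of i relative to j).
M = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(M)
k = int(np.argmax(eigvals.real))
priorities = np.abs(eigvecs[:, k].real)
priorities /= priorities.sum()               # priority vector

n = M.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
cr = ci / 0.58                               # Saaty random index for n = 3
print("priorities:", priorities.round(3), "consistency ratio:", round(cr, 3))
```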

  2. Developing a monitoring method facilitating continual improvements in the sorting of waste at recycling centres.

    PubMed

    Krook, Joakim; Eklund, Mats

    2010-01-01

    Beneficial use of waste relies on efficient systems for collection and separation. In Sweden, a bring system involving recycling centres has been introduced for the collection of bulky, electr(on)ic and hazardous waste. A significant share of this waste is incorrectly sorted, causing downstream environmental problems. At present, however, there is a lack of affordable and accurate monitoring methods that could provide recycling centres with the facts needed to improve the sorting of waste. The aim of this study was therefore to evaluate the usability of a simplified and potentially more suitable waste monitoring method for recycling centres. This method is based on standardised observations in which the occurrence of incorrect sorting is monitored by taking digital pictures of the waste, which are then analysed according to certain guidelines. The results show that the developed monitoring method could offer a resource-efficient and useful tool for proactive quality work at recycling centres, involving continuous efforts to develop and evaluate measures for improved sorting of waste. More research is needed, however, to determine to what extent the results obtained from the monitoring method are reliable.

  3. Development and validation of a fast SFC method for the analysis of flavonoids in plant extracts.

    PubMed

    Huang, Yang; Feng, Ying; Tang, Guangyun; Li, Minyi; Zhang, Tingting; Fillet, Marianne; Crommen, Jacques; Jiang, Zhengjin

    2017-03-12

    Flavonoids from plants show a wide range of biological activities. In the present study, a rapid and highly efficient supercritical fluid chromatography (SFC) method was developed for the separation of 12 flavonoids. After careful optimization, the 12 flavonoids were baseline separated on a ZORBAX RX-SIL column using gradient elution. A 0.1% phosphoric acid solution in methanol was found to be the most suitable polar mobile phase component for the separation of flavonoids. From the viewpoint of retention and resolution, a backpressure of 200 bar and a temperature of 40°C gave the best results. Compared with a previously developed reversed-phase liquid chromatography method, the SFC method provided flavonoid separations that were about three times faster, while maintaining good peak shape and comparable peak efficiency. The SFC method was validated and applied to the analysis of five flavonoids (kaempferol, luteolin, quercetin, luteoloside, buddleoside) present in Chrysanthemum morifolium Ramat. from different cultivars (Chuju, Gongju, Hangju, Boju). The results indicated good repeatability and sensitivity for the quantification of the five analytes, with RSDs for overall precision lower than 3%. The limits of detection ranged from 0.73 to 2.34 μg/mL, while the limits of quantification were between 2.19 and 5.86 μg/mL. The study shows that SFC can be employed as a useful tool for the quality assessment of Traditional Chinese Medicines (TCMs) containing flavonoids as active components.

  4. A Method for Developing 3D User Interfaces of Information Systems

    NASA Astrophysics Data System (ADS)

    Calleros, Juan Manuel González; Vanderdonckt, Jean; Arteaga, Jaime Muñoz

    A transformational method for developing three-dimensional user interfaces of interactive information systems is presented that starts from a task model and a domain model to progressively derive a final user interface. This method consists of three steps: deriving one or many abstract user interfaces from a task model and a domain model, deriving one or many concrete user interfaces from each abstract interface, and producing the code of the final user interfaces corresponding to each concrete interface. To carry out the first two steps, transformations are encoded as graph transformations performed on the involved models expressed in their graph equivalent. In addition, a graph grammar gathers the relevant graph transformations for accomplishing the sub-steps involved in each step. Once a concrete user interface results from these first two steps, it is imported into a development environment for 3D user interfaces where it can be edited for fine tuning and personalization. From this environment, the user interface code is automatically generated. The method is defined by its steps and input/output, and is exemplified on a case study. By expressing the steps of the method through transformations between models, the method adheres to the Model-Driven Engineering paradigm, where models and transformations are explicitly defined and used.

  5. NASA Perspective on Requirements for Development of Advanced Methods Predicting Unsteady Aerodynamics and Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    2008-01-01

    Over the past three years, the National Aeronautics and Space Administration (NASA) has initiated design, development, and testing of a new human-rated space exploration system under the Constellation Program. Initial designs within the Constellation Program are scheduled to replace the present Space Shuttle, which is slated for retirement within the next three years. The development of vehicles for the Constellation system has encountered several unsteady aerodynamics challenges that have bearing on more traditional unsteady aerodynamic and aeroelastic analysis. This paper focuses on the synergy between the present NASA challenges and the ongoing challenges that have historically been the subject of research and method development. There are specific similarities in the flows required to be analyzed for the space exploration problems and those required for some of the more nonlinear unsteady aerodynamic and aeroelastic problems encountered on aircraft. The aggressive schedule, significant technical challenge, and high-priority status of the exploration system development is forcing engineers to implement existing tools and techniques in a design and application environment that is significantly stretching the capability of their methods. While these methods afford the users with the ability to rapidly turn around designs and analyses, their aggressive implementation comes at a price. The relative immaturity of the techniques for specific flow problems and the inexperience with their broad application to them, particularly on manned spacecraft flight system, has resulted in the implementation of an extensive wind tunnel and flight test program to reduce uncertainty and improve the experience base in the application of these methods. This provides a unique opportunity for unsteady aerodynamics and aeroelastic method developers to test and evaluate new analysis techniques on problems with high potential for acquisition of test and even flight data against which they

  6. National health accounts in developing countries: appropriate methods and recent applications.

    PubMed

    Berman, P A

    1997-01-01

    Better information on the financing of the health sector is an essential basis for wise policy change in the area of health sector reform. Analysis of health care financing should begin with sound estimates of national health expenditure--total spending, the contributions to spending from different sources and the claims on spending by different uses of the funds. The member countries of the OECD have successfully established such comparative health expenditure accounts in terms of standardized definitions of the uses of funds and breakdowns by public and private sector sources. This has resulted in important research on health system differences which could explain variations in the level and composition of financing. The United States has developed a more detailed approach called National Health Accounts, which expands the OECD method into a more disaggregated 'sources and uses' matrix. In the developing countries, analysis of health expenditures has been much less systematic, despite several decades of calls by international researchers for more attention. This paper reviews previous work done in developing countries and proposes renewed attention to national health expenditures, adapting the recent experience of the United States. Because most developing countries have more pluralistic health financing structures than are found in most industrialized countries, an enhanced and adapted version of the 'sources and uses' matrix method is proposed. This method should be modified to address the relevant categories of expenditures prevalent in the developing countries. Examples of recent applications of such 'national health accounts' from the Philippines, Egypt, India, Mexico, Colombia and Zambia are presented. Experience to date suggests that development of sound estimates using this method in low and middle income countries is feasible and affordable. National health accounts estimates can significantly influence policy. They provide decision makers with a

  7. Development of wireless coupling methods in ultrasonic instruments for determining the strength of materials

    NASA Astrophysics Data System (ADS)

    Korolev, M. V.; Starikov, B. P.; Konovalov, A. A.; Karpelson, A. E.

    Two methods of wireless coupling in ultrasonic instruments for determining the strength of materials are described: radio coupling and acoustic coupling through the object being tested. Particular attention is given to the latter; this method is used to develop an instrument consisting of two miniaturized electronic units with built-in transmitting and receiving transducers. These units are electrically and structurally autonomous, with information being passed from one unit to the other through the acoustic channel, i.e., via the object being tested.

  8. Development of the mathematical model for design and verification of acoustic modal analysis methods

    NASA Astrophysics Data System (ADS)

    Siner, Alexander; Startseva, Maria

    2016-10-01

    To reduce turbofan noise, it is necessary to develop methods for analyzing the sound field generated by the blade machinery; such methods are called modal analysis. Because modal analysis methods are complex, and testing them against full-scale measurements is expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. In this work, a model is presented that allows single modes to be set in the duct and the generated sound field to be analyzed. Modal analysis of the sound generated by a ring array of point sound sources is performed, and a comparison of experimental and numerical modal analysis results is presented.
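
    For a ring array of equally spaced microphones, azimuthal mode amplitudes are recovered by a discrete Fourier transform of the measured pressures over the array angle. The minimal sketch below synthesizes a single spinning mode and recovers it; the array size and mode number are hypothetical, and the sketch is not the authors' model.

```python
import numpy as np

N = 32                                    # microphones equally spaced on a ring
theta = 2 * np.pi * np.arange(N) / N      # microphone angles
m_true = 3                                # synthetic single spinning mode

# Complex pressure measured on the ring for mode m: p_n = exp(i * m * theta_n)
p = np.exp(1j * m_true * theta)

# Modal amplitudes a_m = (1/N) * sum_n p_n * exp(-i * m * theta_n)
modes = np.arange(-N // 2, N // 2)
a = np.array([(p * np.exp(-1j * m * theta)).mean() for m in modes])

dominant = modes[np.argmax(np.abs(a))]
print("dominant azimuthal mode:", int(dominant),
      "amplitude:", round(float(np.abs(a).max()), 3))
```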

  9. Development of a Rapid and Confirmatory Method to Identify Ganoderic Acids in Ganoderma Mushrooms

    PubMed Central

    Qi, Ying; Zhao, Lingling; Sun, Hao H.

    2012-01-01

    To examine the composition of lanostanoids in Ganoderma lucidum, we have developed a liquid chromatography–mass spectrometry (LC–MS) method by using the ganoderic acids isolated in our laboratory as reference standards. The identity of 14 peaks in the high performance liquid chromatogram (HPLC) of G. lucidum has been confirmed. By using the HPLC retention times of these ganoderic acids and their mass fragmentation patterns established in this paper, one can use LC–MS to analyze G. lucidum without requiring the reference standards of these 14 ganoderic acids. Subsequently, only the HPLC–UV method would be needed to analyze routine samples of G. lucidum. PMID:22586399

  10. In situ methods for Li-ion battery research: A review of recent developments

    NASA Astrophysics Data System (ADS)

    Harks, P. P. R. M. L.; Mulder, F. M.; Notten, P. H. L.

    2015-08-01

    A considerable amount of research is being directed towards improving lithium-ion batteries in order to meet today's market demands. In particular in situ investigations of Li-ion batteries have proven extremely insightful, but require the electrochemical cell to be fully compatible with the conditions of the testing method and are therefore often challenging to execute. Advantageously, in the past few years significant progress has been made with new, more advanced, in situ techniques. Herein, a comprehensive overview of in situ methods for studying Li-ion batteries is given, with the emphasis on new developments and reported experimental highlights.

  11. Polymerase Chain Reaction/Rapid Methods Are Gaining a Foothold in Developing Countries.

    PubMed

    Ragheb, Suzan Mohammed; Jimenez, Luis

    Detection of microbial contamination in pharmaceutical raw materials and finished products is a critical factor in guaranteeing their safety, stability, and potency. Rapid microbiological methods, such as polymerase chain reaction, have been widely applied to clinical and food quality control analysis. However, polymerase chain reaction applications in pharmaceutical quality control have been rather slow and sporadic. Successful implementation of these methods in pharmaceutical companies in developing countries requires important considerations to provide sensitive and robust assays that will comply with good manufacturing practices.

  12. Expert Elicitation Methods in Quantifying the Consequences of Acoustic Disturbance from Offshore Renewable Energy Developments.

    PubMed

    Donovan, Carl; Harwood, John; King, Stephanie; Booth, Cormac; Caneco, Bruno; Walker, Cameron

    2016-01-01

    There are many developments for offshore renewable energy around the United Kingdom whose installation typically produces large amounts of far-reaching noise, potentially disturbing many marine mammals. The potential to affect the favorable conservation status of many species means extensive environmental impact assessment requirements for the licensing of such installation activities. Quantification of such complex risk problems is difficult and much of the key information is not readily available. Expert elicitation methods can be employed in such pressing cases. We describe the methodology used in an expert elicitation study conducted in the United Kingdom for combining expert opinions based on statistical distributions and copula-like methods.
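
    One simple way to combine expert opinions expressed as statistical distributions is a weighted linear opinion pool (a mixture of the experts' distributions). The sketch below illustrates that idea with hypothetical elicited quantiles and weights; the study itself used more elaborate copula-like methods.

```python
import numpy as np

# Hypothetical elicited values per expert: (median, half-width of the
# central 90% interval), here for some disturbance consequence of interest.
experts = [(500, 200), (800, 300), (650, 150)]
weights = np.array([0.4, 0.3, 0.3])          # hypothetical expert weights

rng = np.random.default_rng(1)
samples = []
for w, (median, half_width) in zip(weights, experts):
    sigma = half_width / 1.645               # 90% interval -> standard deviation
    samples.append(rng.normal(median, sigma, int(w * 100_000)))

pooled = np.concatenate(samples)             # linear opinion pool (mixture)
print("pooled median:", round(float(np.median(pooled))),
      "90% interval:", np.percentile(pooled, [5, 95]).round())
```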

  13. Development and validation of spectrophotometric, atomic absorption and kinetic methods for determination of moxifloxacin hydrochloride.

    PubMed

    Abdellaziz, Lobna M; Hosny, Mervat M

    2011-01-01

    Three simple spectrophotometric and atomic absorption spectrometric methods were developed and validated for the determination of moxifloxacin HCl in pure form and in pharmaceutical formulations. Method (A) is a kinetic method based on the oxidation of moxifloxacin HCl by Fe(3+) ion in the presence of 1,10-phenanthroline (o-phen). Method (B) describes spectrophotometric procedures for the determination of moxifloxacin HCl based on its ability to reduce Fe(III) to Fe(II), which is rapidly converted to the corresponding stable coloured complex after reacting with 2,2'-bipyridyl (bipy). The tris-complexes formed in methods (A) and (B) were carefully studied and their absorbances were measured at 510 and 520 nm, respectively. Method (C) is based on the formation of an ion-pair associate between the drug and bismuth(III) tetraiodide in acidic medium, giving orange-red ion-pair associates that can be quantitatively determined by three different procedures. The formed precipitate is either filtered off, dissolved in acetone and quantified spectrophotometrically at 462 nm (Procedure 1), or decomposed by hydrochloric acid, and the bismuth content is determined by direct atomic absorption spectrometry (Procedure 2). The residual unreacted metal complex in the filtrate is also determined through its metal content using an indirect atomic absorption spectrometric technique (Procedure 3). All the proposed methods were validated according to the International Conference on Harmonization (ICH) guidelines; they permit the determination of moxifloxacin HCl in the ranges of 0.8-6 and 0.8-4 for methods A and B, and 16-96, 16-96 and 16-72 for Procedures 1-3 of method C. The limits of detection and quantitation were calculated, and the precision of the methods was satisfactory; the values of relative standard deviations did not exceed 2%. The proposed methods were successfully applied to determine the drug in its pharmaceutical formulations.

  14. Development and Validation of Spectrophotometric, Atomic Absorption and Kinetic Methods for Determination of Moxifloxacin Hydrochloride

    PubMed Central

    Abdellaziz, Lobna M.; Hosny, Mervat M.

    2011-01-01

    Three simple spectrophotometric and atomic absorption spectrometric methods were developed and validated for the determination of moxifloxacin HCl in pure form and in pharmaceutical formulations. Method (A) is a kinetic method based on the oxidation of moxifloxacin HCl by Fe3+ ion in the presence of 1,10-phenanthroline (o-phen). Method (B) describes spectrophotometric procedures for the determination of moxifloxacin HCl based on its ability to reduce Fe(III) to Fe(II), which is rapidly converted to the corresponding stable coloured complex after reacting with 2,2′-bipyridyl (bipy). The tris-complexes formed in methods (A) and (B) were carefully studied and their absorbances were measured at 510 and 520 nm, respectively. Method (C) is based on the formation of an ion-pair associate between the drug and bismuth(III) tetraiodide in acidic medium, giving orange-red ion-pair associates that can be quantitatively determined by three different procedures. The formed precipitate is either filtered off, dissolved in acetone and quantified spectrophotometrically at 462 nm (Procedure 1), or decomposed by hydrochloric acid, and the bismuth content is determined by direct atomic absorption spectrometry (Procedure 2). The residual unreacted metal complex in the filtrate is also determined through its metal content using an indirect atomic absorption spectrometric technique (Procedure 3). All the proposed methods were validated according to the International Conference on Harmonization (ICH) guidelines; they permit the determination of moxifloxacin HCl in the ranges of 0.8–6 and 0.8–4 for methods A and B, and 16–96, 16–96 and 16–72 for Procedures 1–3 of method C. The limits of detection and quantitation were calculated, and the precision of the methods was satisfactory; the values of relative standard deviations did not exceed 2%. The proposed methods were successfully applied to determine the drug in its pharmaceutical

  15. Using qualitative methods to develop a contextually tailored instrument: Lessons learned

    PubMed Central

    Lee, Haeok; Kiang, Peter; Kim, Minjin; Semino-Asaro, Semira; Colten, Mary Ellen; Tang, Shirley S.; Chea, Phala; Peou, Sonith; Grigg-Saito, Dorcas C.

    2015-01-01

    Objective: To develop a population-specific instrument to inform hepatitis B virus (HBV) and human papilloma virus (HPV) prevention education and intervention based on data and evidence obtained from the targeted population of Khmer mothers reflecting their socio-cultural and health behaviors. Methods: The principles of community-based participatory research (CBPR) guided the development of a standardized survey interview. Four stages of development and testing of the survey instrument took place in order to inform the quantitative health survey used to collect data in stage five of the project. This article reports only on Stages 1-4. Results: This process created a new quantitative measure of HBV and HPV prevention behavior based on the revised Network Episode Model and informed by the targeted population. The CBPR method facilitated the application and translation of abstract theoretical ideas of HBV and HPV prevention behavior into culturally-relevant words and expressions of Cambodian Americans (CAs). Conclusions: The design of an instrument development process that accounts for distinctive socio-cultural backgrounds of CA refugee/immigrant women provides a model for use in developing future health surveys that are intended to aid minority-serving health care professionals and researchers as well as targeted minority populations. PMID:27981114

  16. Formative research methods for designing culturally appropriate, integrated child nutrition and development interventions: an overview.

    PubMed

    Bentley, Margaret E; Johnson, Susan L; Wasser, Heather; Creed-Kanashiro, Hilary; Shroff, Monal; Fernandez Rao, Sylvia; Cunningham, Melissa

    2014-01-01

    Nutritional and developmental insults in the first few years of life have profound public health implications, including substantial contributions to neonatal, infant, and early childhood morbidity and mortality, as well as longer term effects on cognitive development, school achievement, and worker productivity. Optimal development that can lead to the attainment of an individual's fullest potential, therefore, requires a combination of genetic capacity, adequate nutrition, psychosocial stimulation, and safe, clean physical environments. Researchers and policymakers have called for integrated child nutrition and development interventions for more than 20 years, yet there are only a handful of efficacy trials and even fewer examples of integrated interventions that have been taken to scale. While a critical component in the design of such interventions is formative research, there is a dearth of information in both the literature and policy arenas to guide this phase of the process. To move the field forward, this paper first provides an overview of formative research methods with a focus on qualitative inquiry, a description of the critical domains to be assessed (infant and young child feeding, responsive feeding, and child development), and currently available resources. Application of these methods is provided through a real-world case study--the design of an integrated nutrition and child development efficacy trial in Andhra Pradesh, India. Recommendations for next steps are discussed, the most important of which is the need for a comprehensive set of formative guidelines for designing locally tailored, culturally appropriate, integrated interventions.

  17. Formative research methods for designing culturally appropriate, integrated child nutrition and development interventions: An overview

    PubMed Central

    Bentley, Margaret E.; Johnson, Susan L.; Wasser, Heather; Creed-Kanashiro, Hilary; Shroff, Monal; Fernandez-Rao, Sylvia; Cunningham, Melissa

    2014-01-01

    Nutritional and developmental insults in the first few years of life have profound public health implications, including substantial contributions to neonatal, infant, and early childhood morbidity and mortality, as well as longer term impacts on cognitive development, school achievement, and worker productivity. Optimal development that can lead to the attainment of the individual's fullest potential therefore requires a combination of genetic capacity, adequate nutrition, psychosocial stimulation, and safe, clean physical environments. Researchers and policymakers have called for integrated child nutrition and development interventions for more than twenty years, yet there are only a handful of efficacy trials and even fewer examples of integrated interventions that have been taken to scale. While a critical component to the design of such interventions is formative research, there is a dearth of information in both the literature and policy arenas to guide this phase of the process. To move the field forward, this paper first provides an overview of formative research methods with a focus on qualitative inquiry, a description of the critical domains to be assessed (infant and young child feeding, responsive feeding, and child development), and currently available resources. Application of these methods is provided through a real-world case study—the design of an integrated nutrition and child development efficacy trial in Andhra Pradesh, India. Recommendations for next steps are discussed, the most important of which is the need for a comprehensive set of formative guidelines for designing locally tailored, culturally appropriate integrated interventions. PMID:24673167

  18. Development of methods for nuclear power plant personnel qualifications and training

    SciTech Connect

    Jorgensen, C.C.; Carter, R.J.

    1984-01-01

    The Nuclear Regulatory Commission (NRC) has proposed additions and revisions to Title 10 of the Code of Federal Regulations, Parts 50 and 55, and to Regulatory Guides 1.8 and 1.149. ORNL is developing methods and some aspects of the technical basis for the implementation and assessment of training programs, personnel qualifications, and simulation facilities. The paper describes the three methodologies which were developed during the FY-1984 research: a task sort procedure (TSORT); a simulation facility evaluation methodology; and a task analysis profiling system (TAPS).

  19. Methodical Approach to Developing a Decision Support System for Well Interventions Planning

    NASA Astrophysics Data System (ADS)

    Silich, V. A.; Savelev, A. O.; Isaev, A. N.

    2016-04-01

    The paper addresses aspects of developing a decision support system for well intervention planning within the process of oil production engineering. The approach described by the authors is based on system analysis methods and an object model for system design. A number of problem-solving principles are declared: the principle of a consolidated information area, the principle of integrated control, and the principle of development-process transparency. A set of models (class model, object model, attribute interdependence model, component model, coordination model) specified for designing the decision support system for well intervention planning is also described.

  20. Development of actual EUV mask observation method for micro coherent EUV scatterometry microscope

    NASA Astrophysics Data System (ADS)

    Harada, T.; Hashimoto, H.; Watanabe, T.

    2016-10-01

    To review phase and amplitude defects on extreme ultraviolet (EUV) masks with EUV intensity and phase contrast, we have developed the micro coherent EUV scatterometry microscope (micro-CSM). A coherent EUV beam was focused on a defect using a Fresnel zone plate, with an illumination spot 140 nm in diameter. Diffraction from the defect was captured directly by an EUV CCD camera. The diffraction signal depended on the zone plate focus, and the defect signal was detected most efficiently at the best focus position. To review an actual EUV mask, which has no focus-alignment pattern on its surface, we developed a focusing method that uses a speckle signal.