Science.gov

Sample records for 95-39 methods development

  1. Medical Research and Evaluation Facility (MREF) and studies supporting the medical chemical defense program: Task 95-39: Methods development and validation of two mouse bioassays for use in quantifying botulinum toxins (a, b, c, d and e) and toxin antibody titers. Final report

    SciTech Connect

    Olson, C.T.; Gelzleichter, T.R.; Myers, M.A.; Menton, R.G.; Niemuth, N.A.

    1997-06-01

    This task was conducted for the U.S. Army Medical Materiel Development Activity (USAMMDA) to validate two mouse bioassays for quantifying botulinum toxin potency and neutralizing antibodies to botulinum toxins. Phase I experiments were designed to validate the mouse potency assay. The coefficients of variation for day-to-day variability were 10, 7, 10, 9, and 13 percent for serotypes A, B, C, D, and E, respectively. Phase II experiments were designed to develop and validate an assay for measuring the neutralizing antibody content of serum. Assay performance was characterized at three separate test levels, L+/10, L+/33, and L+/100. The coefficients of variation for day-to-day variability were 9, 44, 11, 34, and 13 percent for serotypes A, B, C, D, and E, respectively. Limits of quantitation were approximately 0.02, 0.005, 0.012, 0.026, and 0.013 U/mL for serotypes A, B, C, D, and E, respectively. Phase III consisted of limited studies to develop a model of passive immunity in guinea pigs by intraperitoneal treatment with human botulinum immune globulin (BIG).
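
    For reference, the day-to-day coefficients of variation quoted above are simply the standard deviation of repeated potency estimates divided by their mean, expressed as a percentage. A minimal sketch of that calculation follows; the daily potency values and function name are illustrative assumptions, not data from the report.

      import statistics

      def coefficient_of_variation(values):
          """Return the coefficient of variation (percent) of repeated assay estimates."""
          return 100.0 * statistics.stdev(values) / statistics.mean(values)

      # Hypothetical day-to-day potency estimates for one serotype (arbitrary units).
      daily_potency = [10.2, 9.8, 10.5, 9.6, 10.1]
      print(f"day-to-day CV: {coefficient_of_variation(daily_potency):.1f}%")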

  2. 10 CFR 95.39 - External transmission of documents and material.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... transfer, the recipient and the person transferring the document must be enclosed within the inner envelope... Secret document is transferred. This receipt process is at the option of the sender for Confidential information. (c) Methods of transportation. (1) Secret matter may be transported only by one of the...

  3. 10 CFR 95.39 - External transmission of documents and material.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... transfer, the recipient and the person transferring the document must be enclosed within the inner envelope... Secret document is transferred. This receipt process is at the option of the sender for Confidential information. (c) Methods of transportation. (1) Secret matter may be transported only by one of the...

  4. 10 CFR 95.39 - External transmission of documents and material.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... transfer, the recipient and the person transferring the document must be enclosed within the inner envelope... Secret document is transferred. This receipt process is at the option of the sender for Confidential information. (c) Methods of transportation. (1) Secret matter may be transported only by one of the...

  5. 10 CFR 95.39 - External transmission of documents and material.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... transfer, the recipient and the person transferring the document must be enclosed within the inner envelope... Secret document is transferred. This receipt process is at the option of the sender for Confidential information. (c) Methods of transportation. (1) Secret matter may be transported only by one of the...

  6. 10 CFR 95.39 - External transmission of documents and material.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... transfer, the recipient and the person transferring the document must be enclosed within the inner envelope... Secret document is transferred. This receipt process is at the option of the sender for Confidential information. (c) Methods of transportation. (1) Secret matter may be transported only by one of the...

  7. Radiochemical method development

    SciTech Connect

    Erickson, M.D.; Aldstadt, J.H.; Alvarado, J.S.; Crain, J.S.; Orlandini, K.A.; Smith, L.L.

    1994-09-01

    The authors have developed methods for chemical characterization of the environment under a multitask project that focuses on improvement of radioanalytical methods with an emphasis on faster and cheaper routine methods. The authors have developed improved methods for separation of environmental levels of technetium-99, radium, and actinides from soil and water; separation of actinides from soil and water matrix interferences; and isolation of strontium. They are also developing methods for simultaneous detection of multiple isotopes (including nonradionuclides) by using a new instrumental technique, inductively coupled plasma-mass spectrometry (ICP-MS). The new ICP-MS methods have greater sensitivity and efficiency and could replace many radiometric techniques. They are using flow injection analysis to integrate and automate the separation methods with the ICP-MS methodology. The final product of all activities will be methods that are available (published in the U.S. Department of Energy's analytical methods compendium) and acceptable for use in regulatory situations.

  8. Computational Methods Development at Ames

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Smith, Charles A. (Technical Monitor)

    1998-01-01

    This viewgraph presentation outlines the development at Ames Research Center of advanced computational methods to provide appropriate-fidelity computational analysis/design capabilities. Current thrusts of the Ames research include: 1) methods to enhance/accelerate viscous flow simulation procedures, and the development of hybrid/polyhedral-grid procedures for viscous flow; 2) the development of real-time transonic flow simulation procedures for a production wind tunnel, and intelligent data management technology; and 3) the validation of methods and flow physics studies. The presentation gives historical precedents for the above research and speculates on its future course.

  9. Methods For Human Resource Development.

    ERIC Educational Resources Information Center

    Conger, D. Stuart

    A description is provided of the training and counseling materials and methods prepared by the Saskatchewan NewStart and the Training Research and Development Station. Following a brief review of the concept of social inventions, summary descriptions are provided of nine adult education courses. These are: 1) Life Skills, which focuses upon…

  10. ANALYTICAL METHOD DEVELOPMENT FOR PHENOLS

    EPA Science Inventory

    This project focused on the development of an analytical method for the analysis of phenols in drinking water. The need for this project is associated with the recently published Contaminant Candidate List (CCL). The following phenolic compounds are listed on the current CCL, a...

  11. Space Radiation Transport Methods Development

    NASA Astrophysics Data System (ADS)

    Wilson, J.; Tripathi, R.; Qualls, G.; Cucinotta, F.; Prael, R.; Norbury, J.

    Early space radiation shield code development relied on Monte Carlo methods for proton, neutron and pion transport and made important contributions to the space program. More recently Monte Carlo code LAHET has been upgraded to include high-energy multiple-charged light ions for GCR simulations and continues to be expanded in capability. To compensate for low computational efficiency, Monte Carlo methods have resorted to restricted one-dimensional problems leading to imperfect representations of appropriate boundary conditions. Even so, intensive computational requirements resulted and shield evaluation was made near the end of the design process and resolving shielding issues usually had a negative impact on the design. We evaluate the implications of these common one-dimensional assumptions on the evaluation of the Shuttle internal radiation field. Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be

  12. Space Radiation Transport Methods Development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2002-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of reconfigurable computing and could be utilized in the final design as verification of the deterministic method optimized design.
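
    To see why the 14-millisecond ray-trace cost quoted above limits Monte Carlo coupling to the ISS FEM model, a rough cost estimate helps. The sketch below uses history counts and boundary crossings that are illustrative assumptions, not values from the paper.

      # Back-of-the-envelope cost of geometry tracking for a Monte Carlo run
      # coupled to the ISS finite element model.
      ray_trace_seconds = 14e-3      # 14 ms per ray trace (quoted in the abstract)
      histories = 1_000_000          # assumed number of Monte Carlo histories
      traces_per_history = 10        # assumed geometry ray traces per history

      total_hours = ray_trace_seconds * histories * traces_per_history / 3600.0
      print(f"geometry tracking alone: about {total_hours:.0f} CPU-hours")  # ~39 hours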

  13. Advanced probabilistic method of development

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.

    1987-01-01

    Advanced structural reliability methods are utilized on the Probabilistic Structural Analysis Methods (PSAM) project to provide a tool for analysis and design of space propulsion system hardware. The role of the effort at the University of Arizona is to provide reliability technology support to this project. PSAM computer programs will provide a design tool for analyzing uncertainty associated with thermal and mechanical loading, material behavior, geometry, and the analysis methods used. Specifically, reliability methods are employed to perform sensitivity analyses, to establish the distribution of a critical response variable (e.g., stress, deflection), to perform reliability assessment, and ultimately to produce a design which will minimize cost and/or weight. Uncertainties in the design factors of space propulsion hardware are described by probability models constructed using statistical analysis of data. Statistical methods are employed to produce a probability model, i.e., a statistical synthesis or summary of each design variable in a format suitable for reliability analysis and ultimately, design decisions.
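
    The core reliability task described above, establishing the distribution of a critical response variable from probability models of uncertain design factors, can be pictured with a minimal Monte Carlo sketch. The stress model, distributions, and parameter values below are assumptions for illustration, not PSAM models.

      import random

      def sample_stress(n=100_000, seed=1):
          """Sample a simple stress response from an uncertain load and cross-section."""
          rng = random.Random(seed)
          samples = []
          for _ in range(n):
              load = rng.gauss(1000.0, 100.0)    # assumed axial load, N
              area = rng.gauss(2.0e-4, 1.0e-5)   # assumed cross-sectional area, m^2
              samples.append(load / area)        # stress = load / area, Pa
          return sorted(samples)

      stress = sample_stress()
      print("median stress:", stress[len(stress) // 2])
      print("99th percentile:", stress[int(0.99 * len(stress))])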

  14. Methods development for electron transport

    NASA Astrophysics Data System (ADS)

    Ganapol, Barry D.

    1992-04-01

    This report consists of two code manuals and an article recently published in the proceedings of the American Nuclear Society Mathematics and Computation Topical Meeting held in Pittsburgh. In these presentations, deterministic calculational methods simulating electron transport in solids are detailed. The first method presented (Section 2) is for the solution of the Spencer-Lewis equation in which electron motion is characterized by continuous slowing down theory and a pathlength formulation. The FN solution to the standard monoenergetic transport equation for electron transport with isotropic scattering in finite media is given in Section 3. For both codes, complete flow charts, operational instructions and sample problems are included. Finally, in Section 4, an application of the multigroup formulation of electron transport in an infinite medium is used to verify an equivalent SN formulation. For this case, anisotropic scattering is also included.

  15. GIS Method for Developing Wind Supply Curves

    SciTech Connect

    Kline, D.; Heimiller, D.; Cowlin, S.

    2008-06-01

    This report describes work conducted by the National Renewable Energy Laboratory (NREL) as part of the Wind Technology Partnership (WTP) sponsored by the U.S. Environmental Protection Agency (EPA). This project has developed methods that the National Development and Reform Commission (NDRC) intends to use in the planning and development of China's 30 GW of planned capacity. Because of China's influence within the community of developing countries, the methods and the approaches here may help foster wind development in other countries.

  16. TEMPERATURE SCENARIO DEVELOPMENT USING REGRESSION METHODS

    EPA Science Inventory

    A method of developing scenarios of future temperature conditions resulting from climatic change is presented. The method is straightforward and can be used to provide information about daily temperature variations and diurnal ranges, monthly average high and low temperatures, an...

  17. Model and method for systems development

    NASA Astrophysics Data System (ADS)

    Behl, Erich; Rittel, Michael

    1988-11-01

    A method for systems development was developed with a view to increasing productivity and quality. The basic approaches are a unified treatment of software and hardware and a rapid prototyping procedure. The methodological procedure is strongly characterized by the reuse of available concepts as well as of hardware and software components. The method is supported by a systems development environment which contains tailored aids and automates a series of activities.

  18. Development of Methods for Determination of Aflatoxins.

    PubMed

    Xie, Lijuan; Chen, Min; Ying, Yibin

    2016-12-01

    Aflatoxins can cause damage to the health of humans and animals. Several institutions around the world have established regulations to limit the levels of aflatoxins in food, and numerous analytical methods have been extensively developed for aflatoxin determination. This review covers the currently used analytical methods for the determination of aflatoxins in different food matrices, including sampling and sample preparation, sample pretreatment (extraction and purification of aflatoxin extracts), and separation and determination methods. Validation of aflatoxin analyses and safety considerations and precautions when performing the experiments are also discussed. PMID:25840003

  19. A framework for teaching software development methods

    NASA Astrophysics Data System (ADS)

    Dubinsky, Yael; Hazzan, Orit

    2005-12-01

    This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves cycles of data collection, examination, evaluation, and application of results. The research uses several research tools for data gathering, as well as several research methods for data interpretation. The article describes in detail the research background, the research method, and the gradual emergence process of a framework for teaching software development methods. As part of the comprehensive teaching framework, a set of measures is developed to assess, monitor, and improve the teaching and the actual process of software development projects.

  20. PM MASS METHODS RESEARCH AND DEVELOPMENT

    EPA Science Inventory

    This task supports research into methodologies for determining particulate matter (PM) mass concentrations. Due to the complexity of PM (composition, size distribution, and concentration), developing PM methods that perform acceptably under most weather conditions at most U.S. l...

  1. Toxicity test method development in southeast Asia

    SciTech Connect

    McPherson, C.A.

    1995-12-31

    Use of aquatic toxicity tests is relatively new in southeast Asia. As part of the ASEAN-Canada Cooperative Programme on Marine Science -- Phase 2, which includes development of marine environmental criteria, a need for tropical toxicity data was identified. A step-wise approach was used for test method development (simple, acute tests and easily measured endpoints first, then more complex short-term chronic methods), for test species selection (using species found throughout the region first, and then considering species with narrower geographic distribution), and for integration of quality assurance/quality control (QA/QC) practices into all laboratory activities. Development of test protocols specifically for tropical species included acute and chronic toxicity tests with marine fish, invertebrates and algae. Criteria for test species selection will be reviewed. Method development was based on procedures and endpoints already widely used in North America and Europe (e.g., 96-h LC50 with fish), but adapted for use with tropical species. For example, a bivalve larval development test can use the same endpoints but the duration is only 24 hours. Test method development included research on culture and holding procedures, determination of test conditions (e.g., duration, test containers), and identification of appropriate endpoints. Acute tests with fish and invertebrates were developed first. The next step was development of short-term chronic tests to measure phytoplankton growth, bivalve and echinoderm embryo or larval development, and larval fish growth. The number of species and types of tests was increased in a staged approach, as laboratories became better equipped and personnel gained practical experience. In most cases, method development coincided with training workshops to introduce the principles of toxicity testing.

  2. Development of test methods for textile composites

    NASA Technical Reports Server (NTRS)

    Masters, John E.; Ifju, Peter G.; Fedro, Mark J.

    1993-01-01

    NASA's Advanced Composite Technology (ACT) Program was initiated in 1990 with the purpose of developing less costly composite aircraft structures. A number of innovative materials and processes were evaluated as a part of this effort. Chief among them are composite materials reinforced with textile preforms. These new forms of composite materials bring with them potential testing problems. Methods currently in practice were developed over the years for composite materials made from prepreg tape or simple 2-D woven fabrics. A wide variety of 2-D and 3-D braided, woven, stitched, and knit preforms were suggested for application in the ACT program. The applicability of existing test methods to the wide range of emerging materials bears investigation. The overriding concern is that the values measured are accurate representations of the true material response. The ultimate objective of this work is to establish a set of test methods to evaluate the textile composites developed for the ACT Program.

  3. Methods for the Study of Gonadal Development.

    PubMed

    Piprek, Rafal P

    2016-01-01

    Current knowledge on gonadal development and sex determination is the product of many decades of research involving a variety of scientific methods from different biological disciplines such as histology, genetics, biochemistry, and molecular biology. The earliest embryological investigations, followed by the invention of microscopy and staining methods, were based on histological examinations. The most robust development of histological staining techniques occurred in the second half of the nineteenth century and resulted in structural descriptions of gonadogenesis. These first studies on gonadal development were conducted on domesticated animals; however, currently the mouse is the most extensively studied species. The next key point in the study of gonadogenesis was the advancement of methods allowing for the in vitro culture of fetal gonads. For instance, this led to the description of the origin of cell lines forming the gonads. Protein detection using antibodies and immunolabeling methods and the use of reporter genes were also invaluable for developmental studies, enabling the visualization of the formation of gonadal structure. Recently, genetic and molecular biology techniques, especially gene expression analysis, have revolutionized studies on gonadogenesis and have provided insight into the molecular mechanisms that govern this process. The successive invention of new methods is reflected in the progress of research on gonadal development. PMID:27300186

  4. Developing a Primary Science Methods Classroom

    ERIC Educational Resources Information Center

    Veal, William R.; Jackson, Zachary

    2006-01-01

    The purpose of this paper is to describe how and why a primary science methods classroom was conceived, designed, and developed for preservice and inservice teachers. Just as science educators believe that students learn best by constructing their knowledge of the natural world with the aid of a teacher and colleagues, science educators also…

  5. New Developments of the Shared Concern Method.

    ERIC Educational Resources Information Center

    Pikas, Anatol

    2002-01-01

    Reviews and describes new developments in the Shared Concern method (SCm), a tool for tackling group bullying amongst teenagers by individual talks. The psychological mechanisms of healing in the bully group and what hinders the bully therapist in eliciting them have become better clarified. The most important recent advancement of the SCm…

  6. A Framework for Teaching Software Development Methods

    ERIC Educational Resources Information Center

    Dubinsky, Yael; Hazzan, Orit

    2005-01-01

    This article presents a study that aims at constructing a teaching framework for software development methods in higher education. The research field is a capstone project-based course, offered by the Technion's Department of Computer Science, in which Extreme Programming is introduced. The research paradigm is an Action Research that involves…

  7. Benchmarking Learning and Teaching: Developing a Method

    ERIC Educational Resources Information Center

    Henderson-Smart, Cheryl; Winning, Tracey; Gerzina, Tania; King, Shalinie; Hyde, Sarah

    2006-01-01

    Purpose: To develop a method for benchmarking teaching and learning in response to an institutional need to validate a new program in Dentistry at the University of Sydney, Australia. Design/methodology/approach: After a collaborative partner, University of Adelaide, was identified, the areas of teaching and learning to be benchmarked, PBL…

  8. Development of semiclassical molecular dynamics simulation method.

    PubMed

    Nakamura, Hiroki; Nanbu, Shinkoh; Teranishi, Yoshiaki; Ohta, Ayumi

    2016-04-28

    Various quantum mechanical effects such as nonadiabatic transitions, quantum mechanical tunneling and coherence play crucial roles in a variety of chemical and biological systems. In this paper, we propose a method to incorporate tunneling effects into the molecular dynamics (MD) method, which is purely based on classical mechanics. Caustics, which define the boundary between classically allowed and forbidden regions, are detected along classical trajectories and the optimal tunneling path with minimum action is determined by starting from each appropriate caustic. The real phase associated with tunneling can also be estimated. A numerical demonstration using a simple collinear chemical reaction O + HCl → OH + Cl is presented in order to help the reader comprehend the method proposed here. Generalization to the on-the-fly ab initio version is rather straightforward. By treating the nonadiabatic transitions at conical intersections with the Zhu-Nakamura theory, new semiclassical MD methods can be developed. PMID:27067383
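
    For orientation, the barrier-penetration factor associated with a tunneling path between classical turning points is commonly written in the standard WKB form below; this is textbook background, not necessarily the exact expression used in the semiclassical MD method described above.

      P \approx \exp\!\left( -\frac{2}{\hbar} \int_{x_1}^{x_2} \sqrt{2m\,[V(x) - E]}\; dx \right)

    where x_1 and x_2 are the turning points bounding the classically forbidden region, V(x) is the potential, and E the energy.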

  9. Development of a nonlinear vortex method

    NASA Technical Reports Server (NTRS)

    Kandil, O. A.

    1982-01-01

    A steady and unsteady Nonlinear Hybrid Vortex (NHV) method for low-aspect-ratio wings at large angles of attack is developed. The method uses vortex panels with first-order vorticity distribution (equivalent to second-order doublet distribution) to calculate the induced velocity in the near field using closed-form expressions. In the far field, the distributed vorticity is reduced to concentrated vortex lines and the simpler Biot-Savart law is employed. The method is applied to rectangular wings in steady and unsteady flows without any restriction on the order of magnitude of the disturbances in the flow field. The numerical results show that the method accurately predicts the distributed aerodynamic loads and that it is of acceptable computational efficiency.
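
    In the far field the method replaces distributed vorticity with concentrated vortex lines evaluated by the Biot-Savart law. The sketch below computes the velocity induced by a single straight vortex segment using the standard finite-segment Biot-Savart expression; the endpoints, circulation, and field point are illustrative assumptions, not the method's panel formulation.

      import numpy as np

      def segment_induced_velocity(p, a, b, gamma):
          """Velocity induced at point p by a straight vortex segment a -> b
          of circulation gamma (standard finite-segment Biot-Savart formula)."""
          r0, r1, r2 = b - a, p - a, p - b
          cross = np.cross(r1, r2)
          denom = np.dot(cross, cross)
          if denom < 1e-12:           # numerical guard: p lies on the segment axis
              return np.zeros(3)
          k = gamma / (4.0 * np.pi * denom)
          return k * cross * (np.dot(r0, r1) / np.linalg.norm(r1)
                              - np.dot(r0, r2) / np.linalg.norm(r2))

      # Unit-strength segment along x; field point one unit above its midpoint.
      v = segment_induced_velocity(np.array([0.5, 0.0, 1.0]),
                                   np.array([0.0, 0.0, 0.0]),
                                   np.array([1.0, 0.0, 0.0]), gamma=1.0)
      print(v)   # expected direction: -y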

  10. Transport Test Problems for Hybrid Methods Development

    SciTech Connect

    Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.; McDonald, Benjamin S.

    2011-12-28

    This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.

  11. Development of a hydraulic turbine design method

    NASA Astrophysics Data System (ADS)

    Kassanos, Ioannis; Anagnostopoulos, John; Papantonis, Dimitris

    2013-10-01

    In this paper a hydraulic turbine parametric design method is presented which is based on the combination of traditional methods and parametric surface modeling techniques. The blade of the turbine runner is described using Bezier surfaces for the definition of the meridional plane as well as the blade angle distribution, and a thickness distribution applied normal to the mean blade surface. In this way, it is possible to define parametrically the whole runner using a relatively small number of design parameters, compared to conventional methods. The above definition is then combined with a commercial CFD software and a stochastic optimization algorithm towards the development of an automated design optimization procedure. The process is demonstrated with the design of a Francis turbine runner.
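
    The parametric description above builds the runner from Bezier surfaces; the one-dimensional building block is a Bezier curve, which can be evaluated by de Casteljau's repeated linear interpolation as sketched below. The control points stand in for, say, a blade-angle distribution and are placeholders, not turbine data.

      def de_casteljau(control_points, t):
          """Evaluate a Bezier curve at parameter t in [0, 1] by repeated
          linear interpolation of the control polygon (de Casteljau's algorithm)."""
          pts = [tuple(p) for p in control_points]
          while len(pts) > 1:
              pts = [tuple((1.0 - t) * ai + t * bi for ai, bi in zip(a, b))
                     for a, b in zip(pts[:-1], pts[1:])]
          return pts[0]

      # Placeholder control points (meridional coordinate, blade angle in degrees).
      ctrl = [(0.0, 60.0), (0.35, 45.0), (0.7, 30.0), (1.0, 18.0)]
      print([de_casteljau(ctrl, t) for t in (0.0, 0.5, 1.0)])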

  12. Report on development of neutron passportisation method

    SciTech Connect

    Antropov, G.P.; Babichev, Yu.B.; Blagin, S.V.

    1994-12-31

    In this report, the results of the development of a spatial neutron passportisation method are described. The method is aimed at controlling the spatial configuration (including the number of sources) of closed objects containing neutron sources. The possible areas of application of the method are: (1) control of the number of warheads inside missile heads for verification of RF-US nuclear disarmament treaties; (2) control of the arrangement of SNM containers in storage vaults; (3) control that complicated assemblies containing SNM (and other radioactive materials) remain unchanged. For objects with a complicated structure, such as multiple reentry vehicles, direct interpretation of the observed radiation field configuration is a rather difficult task. Reconstruction of the object structure on the basis of the radiation field configuration usually requires the use of external information and is often not straightforward. Besides, when using such methods of direct reconstruction of an object's internal structure, a contradiction arises between the requirement of determining the arrangement of the sources (warheads in the case of arms control) and the requirement of protecting information about the sources themselves. In this case there may be different limitations on the possible spatial resolution of the method, the use of spectroscopic information, etc.

  13. A space radiation transport method development

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. Published by Elsevier Ltd on behalf of COSPAR.

  14. A space radiation transport method development.

    PubMed

    Wilson, J W; Tripathi, R K; Qualls, G D; Cucinotta, F A; Prael, R E; Norbury, J W; Heinbockel, J H; Tweed, J

    2004-01-01

    Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. PMID:15880919

  15. Structural reliability methods: Code development status

    NASA Technical Reports Server (NTRS)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-01-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
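
    As a minimal illustration of the kind of quantity a fast probability integration module estimates, the sketch below computes the classical reliability index and failure probability for a linear limit state with independent normal resistance and load; the distributions and numbers are illustrative assumptions, not NESSUS output or SSME data.

      import math

      def normal_cdf(x):
          """Standard normal cumulative distribution function via the error function."""
          return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

      # Limit state g = R - S with independent normal resistance R and load effect S.
      mu_r, sigma_r = 500.0, 40.0   # assumed resistance statistics (MPa)
      mu_s, sigma_s = 350.0, 60.0   # assumed load-effect statistics (MPa)

      beta = (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)   # reliability index
      print(f"beta = {beta:.2f}, P(failure) = {normal_cdf(-beta):.2e}")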

  16. Probabilistic structural analysis methods development for SSME

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

    The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure, and torque; (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

  17. [Development of identification method for isopropyl citrate].

    PubMed

    Furusho, Noriko; Ohtsuki, Takashi; Tatebe-Sasaki, Chiye; Kubota, Hiroki; Sato, Kyoko; Akiyama, Hiroshi

    2014-01-01

    In Japan's Specifications and Standards for Food Additives, 8th edition, two identification tests for isopropyl citrate, detecting isopropyl alcohol and citrate, are stipulated. However, these identification tests use a mercury compound, which is toxic, or require a time-consuming pretreatment process. To solve these problems, an identification test method using GC-FID for detecting isopropyl alcohol was developed. In this test, good linearity was observed in the range of 0.1-40 mg/mL of isopropyl alcohol. While investigating the pretreatment process, we found that isopropyl alcohol could be detected by GC-FID using the distillation step only, without any reflux step. The study also showed that the citrate moiety of isopropyl citrate could be identified using the solution remaining after distillation of the isopropyl alcohol. The developed identification tests for isopropyl citrate are simple and use no toxic materials. PMID:25707204

  18. Methods for developing and validating survivability distributions

    SciTech Connect

    Williams, R.L.

    1993-10-01

    A previous report explored and discussed statistical methods and procedures that may be applied to validate the survivability of a complex system of systems that cannot be tested as an entity. It described a methodology where Monte Carlo simulation was used to develop the system survivability distribution from the component distributions using a system model that registers the logical interactions of the components to perform system functions. This paper discusses methods that can be used to develop the required survivability distributions based upon three sources of knowledge. These are (1) available test results; (2) little or no available test data, but a good understanding of the physical laws and phenomena which can be applied by computer simulation; and (3) neither test data nor adequate knowledge of the physics are known, in which case, one must rely upon, and quantify, the judgement of experts. This paper describes the relationship between the confidence bounds that can be placed on survivability and the number of tests conducted. It discusses the procedure for developing system level survivability distributions from the distributions for lower levels of integration. It demonstrates application of these techniques by defining a communications network for a Hypothetical System Architecture. A logic model for the performance of this communications network is developed, as well as the survivability distributions for the nodes and links based on two alternate data sets, reflecting the effects of increased testing of all elements. It then shows how this additional testing could be optimized by concentrating only on those elements contained in the low-order fault sets which the methodology identifies.
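
    A minimal version of the simulation step described above, propagating component survivability distributions through a system logic model, might look like the sketch below. The two-series-node, redundant-link topology and the beta-distributed component survivabilities are purely illustrative assumptions, not the Hypothetical System Architecture of the report.

      import random

      def system_survivability(p_a, p_b, p_link):
          """Logic model: series nodes A and B joined by two redundant, identical links."""
          return p_a * p_b * (1.0 - (1.0 - p_link) ** 2)

      def sample_system_distribution(draws=50_000, seed=7):
          """Propagate uncertain component survivabilities (assumed beta models)
          through the logic model to build a system survivability sample."""
          rng = random.Random(seed)
          return sorted(system_survivability(rng.betavariate(9, 1),   # node A
                                             rng.betavariate(9, 1),   # node B
                                             rng.betavariate(4, 2))   # links
                        for _ in range(draws))

      samples = sample_system_distribution()
      print("median system survivability:", samples[len(samples) // 2])
      print("5th-percentile lower bound:", samples[int(0.05 * len(samples))])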

  19. Development of a Radial Deconsolidation Method

    SciTech Connect

    Helmreich, Grant W.; Montgomery, Fred C.; Hunn, John D.

    2015-12-01

    A series of experiments have been initiated to determine the retention or mobility of fission products in AGR fuel compacts [Petti, et al. 2010]. This information is needed to refine fission product transport models. The AGR-3/4 irradiation test involved half-inch-long compacts that each contained twenty designed-to-fail (DTF) particles, with 20-μm-thick carbon-coated kernels whose coatings were deliberately fabricated such that they would crack under irradiation, providing a known source of post-irradiation isotopes. The DTF particles in these compacts were axially distributed along the compact centerline so that the diffusion of fission products released from the DTF kernels would be radially symmetric [Hunn, et al. 2012; Hunn et al. 2011; Kercher, et al. 2011; Hunn, et al. 2007]. Compacts containing DTF particles were irradiated at Idaho National Laboratory (INL) at the Advanced Test Reactor (ATR) [Collin, 2015]. Analysis of the diffusion of these various post-irradiation isotopes through the compact requires a method to radially deconsolidate the compacts so that nested-annular volumes may be analyzed for post-irradiation isotope inventory in the compact matrix, TRISO outer pyrolytic carbon (OPyC), and DTF kernels. An effective radial deconsolidation method and apparatus appropriate to this application has been developed and parametrically characterized.

  20. DEVELOPMENT OF NDA METHODS FOR NEPTUNIUM METAL

    SciTech Connect

    C. MOSS; ET AL

    2000-10-01

    Many techniques have been developed and applied in the US and other countries for the control of the special nuclear materials (SNM) plutonium and uranium, but no standard methods exist for the determination of neptunium in bulk containers. Such methods are needed because the U.S. Department of Energy requires all Government-owned ²³⁷Np be treated as if it were SNM and the International Atomic Energy Agency is considering how to monitor this material. We present the results of the measurements of several samples of neptunium metal with a variety of techniques. Analysis of passive gamma-ray spectra uniquely identifies the material, provides isotopic ratios for contaminants, such as ²⁴³Am, and may provide information about the shielding, mass, and time since processing. Active neutron interrogation, using the delayed neutron technique in a package monitor, provides useful data even if the neptunium is shielded. The tomographic gamma scanner yields a map of the distribution of the neptunium and shielding in a container. Active photon interrogation with pulses from a 10-MeV linac produces delayed neutrons between pulses, even when the container is heavily shielded. Data from one or more of these techniques can be used to identify the material and estimate a mass in a bulk container.

  1. Method development for fecal lipidomics profiling.

    PubMed

    Gregory, Katherine E; Bird, Susan S; Gross, Vera S; Marur, Vasant R; Lazarev, Alexander V; Walker, W Allan; Kristal, Bruce S

    2013-01-15

    Robust methodologies for the analysis of fecal material will facilitate the understanding of gut (patho)physiology and its role in health and disease and will help improve care for individual patients, especially high-risk populations, such as premature infants. Because lipidomics offers a biologically and analytically attractive approach, we developed a simple, sensitive, and quantitatively precise method for profiling intact lipids in fecal material. The method utilizes two separate, complementary extraction chemistries, dichloromethane (DCM) and a methyl tert-butyl ether/hexafluoroisopropanol (MTBE) mixture, alone or with high pressure cycling. Extracts were assessed by liquid chromatography-high-resolution mass spectrometry-based profiling with all ion higher energy collisional dissociation fragmentation in both positive and negative ionization modes. This approach provides both class-specific and lipid-specific fragments, enhancing lipid characterization. Solvents preferentially extracted lipids based on hydrophobicity. More polar species preferred MTBE; more hydrophobic compounds preferred DCM. Pressure cycling differentially increased the yield of some lipids. The platform enabled analysis of >500 intact lipophilic species with over 300 lipids spanning 6 LIPID MAPS categories identified in the fecal matter from premature infants. No previous report exists that provides these data; thus, this study represents a new paradigm for assessing nutritional health, inflammation, and infectious disease in vulnerable populations. PMID:23210743

  2. Methods development for total organic carbon accountability

    NASA Technical Reports Server (NTRS)

    Benson, Brian L.; Kilgore, Melvin V., Jr.

    1991-01-01

    This report describes the efforts completed during the contract period beginning November 1, 1990 and ending April 30, 1991. Samples of product hygiene and potable water from WRT 3A were supplied by NASA/MSFC prior to contract award on July 24, 1990. Humidity condensate samples were supplied on August 3, 1990. During the course of this contract, chemical analyses were performed on these samples to qualitatively determine specific components comprising the measured organic carbon concentration. In addition, these samples and known standard solutions were used to identify and develop methodology useful for future comprehensive characterization of similar samples. Standard analyses including pH, conductivity, and total organic carbon (TOC) were conducted. Colorimetric and enzyme-linked assays for total protein, bile acid, B-hydroxybutyric acid, methylene blue active substances (MBAS), urea nitrogen, ammonia, and glucose were also performed. Gas chromatographic procedures for non-volatile fatty acids and EPA priority pollutants were also performed. Liquid chromatography was used to screen for non-volatile, water soluble compounds not amenable to GC techniques. Methods development efforts were initiated to separate and quantitate certain chemical classes not classically analyzed in water and wastewater samples. These included carbohydrates, organic acids, and amino acids. Finally, efforts were initiated to identify useful concentration techniques to enhance detection limits and recovery of non-volatile, water soluble compounds.

  3. Interactive radio instruction: developing instructional methods.

    PubMed

    Friend, J

    1989-01-01

    The USAID has, since 1972, funded the development of a new methodology for educational radio for young children through 3 projects: the Radio Mathematics Project of Nicaragua, the Radio Language Arts Project of Kenya, and the Radio Science Project of Papua New Guinea. These projects developed math programs for grades 1-4 and English as a second language for grades 1-3; programs to teach science in grades 4-6 are now being developed. Appropriate techniques were developed to engage young children actively in the learning process. Lessons are planned as a "conversation" between the children and the radio; scripts are written as one half of a dialogue, with pauses carefully timed so that students can contribute their half. Teaching techniques used in all 3 projects include choral responses, simultaneous individual seatwork, and activities using simple materials such as pebbles and rulers. Certain techniques were specific to the subject being taught, or to the circumstances in which the lessons were to be used. Patterned oral drill was used frequently in the English lessons, including sound-cued drills. "Deferred" oral responses were used often in the math lessons. In this method, the children are instructed to solve a problem silently, not giving the answer aloud until requested, thus allowing time for even the slower children to participate. "One-child" questions were used in both English and science: the radio asks a question to be answered by a single child, who is selected on the spot by the classroom teacher. This allows for open-ended questions, but also requires constant supervision by the classroom teacher. Songs and games were used in all programs, and extensively for didactic purposes in the teaching of English. Instructions for science activities are often more complex than in other courses, particularly when the children are using science apparatus, especially when they work in pairs to share scarce

  4. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-01

    A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase with octadecyl-grafted silica was studied, covering various graftings and related column parameters such as particle size, core-shell and monolith formats. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 μm, the benefit of keeping efficiency over a large range of flow rates was not obtained with the ethanol-based mobile phase compared to an acetonitrile-based one. Therefore, the strategy of shortening analysis by increasing the flow rate induced a decrease in efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 μm, was selected, which showed the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40°C. To reduce solvent consumption for sample preparation, a concentration of 0.5 mg/mL of each statin was found to be the highest that respected detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL for hydro-alcoholic binary mixtures between 35% and 55% of ethanol in water. Using atorvastatin instead of its calcium salt, solubility was improved. Highly concentrated solutions of statins offer a potential fluid for Buccal Per-Mucous(®) administration with the advantages of rapid and easy passage of drugs. PMID:25582487
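
    Detector linearity checks like the one reported above come down to fitting a straight calibration line of peak area against concentration and inspecting the fit. The sketch below does this with an ordinary least-squares fit; the concentration and peak-area values are invented for illustration and are not data from the study.

      # Ordinary least-squares calibration line: area = slope * conc + intercept.
      conc = [0.05, 0.1, 0.2, 0.3, 0.4, 0.5]          # mg/mL (illustrative)
      area = [12.1, 24.5, 48.7, 73.9, 98.2, 122.6]    # arbitrary units (illustrative)

      n = len(conc)
      mean_x, mean_y = sum(conc) / n, sum(area) / n
      sxx = sum((x - mean_x) ** 2 for x in conc)
      sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(conc, area))
      slope = sxy / sxx
      intercept = mean_y - slope * mean_x

      # Coefficient of determination as a quick linearity check.
      ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, area))
      ss_tot = sum((y - mean_y) ** 2 for y in area)
      print(f"slope={slope:.1f}, intercept={intercept:.2f}, R^2={1.0 - ss_res / ss_tot:.4f}")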

  5. HEASD PM RESEARCH METHODS: PARTICLE METHODS EVALUATION AND DEVELOPMENT

    EPA Science Inventory

    The FRM developed by NERL forms the backbone of the EPA's national monitoring strategy. It is the measurement that defines attainment of the new standard. However, the agency has numerous other needs in assessing the physical and chemical characteristics of ambient fine particl...

  6. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has gone from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  7. Development in reliability models and methods

    SciTech Connect

    Vaurio, J.K.

    1983-01-01

    This paper reviews analytical developments in modeling reliability characteristics for components and systems. Modeling involves definition of failure modes, relevant probability and timing parameters for the modes, and derivation of explicit equations for component and system unavailabilities and failure intensities. Some but not all developments to be discussed were carried out within the DOE-sponsored LMFBR safety program.

  8. Child Development in Developing Countries: Introduction and Methods

    PubMed Central

    Bornstein, Marc H.; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L.

    2011-01-01

    The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This Introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles in this Special Section. The articles that follow describe the situations of children with successive foci on nutrition, parenting, discipline and violence, and the home environment addressing two common questions: How do developing and underresearched countries in the world vary with respect to these central indicators of children's development? and How do key indicators of national development relate to child development in each of these substantive areas? The Special Section concludes with policy implications from the international findings. PMID:22277004

  9. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

  10. Development of advanced acreage estimation methods

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1980-01-01

    The use of the AMOEBA clustering/classification algorithm was investigated as a basis for both a color display generation technique and a maximum likelihood proportion estimation procedure. An approach to analyzing large data reduction systems was formulated and an exploratory empirical study of spatial correlation in LANDSAT data was also carried out. Topics addressed include: (1) development of multi-image color images; (2) spectral-spatial classification algorithm development; (3) spatial correlation studies; and (4) evaluation of data systems.

  11. Child Development in Developing Countries: Introduction and Methods

    ERIC Educational Resources Information Center

    Bornstein, Marc H.; Britto, Pia Rebello; Nonoyama-Tarumi, Yuko; Ota, Yumiko; Petrovic, Oliver; Putnick, Diane L.

    2012-01-01

    The Multiple Indicator Cluster Survey (MICS) is a nationally representative, internationally comparable household survey implemented to examine protective and risk factors of child development in developing countries around the world. This introduction describes the conceptual framework, nature of the MICS3, and general analytic plan of articles…

  12. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... site. The PHA awards a construction contract in accordance with 24 CFR part 85. The contractor receives... 24 Housing and Urban Development 4 2011-04-01 2011-04-01 false Development methods and funding... URBAN DEVELOPMENT PUBLIC HOUSING DEVELOPMENT General § 941.102 Development methods and funding....

  13. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... site. The PHA awards a construction contract in accordance with 24 CFR part 85. The contractor receives... 24 Housing and Urban Development 4 2013-04-01 2013-04-01 false Development methods and funding... URBAN DEVELOPMENT PUBLIC HOUSING DEVELOPMENT General § 941.102 Development methods and funding....

  14. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... site. The PHA awards a construction contract in accordance with 24 CFR part 85. The contractor receives... 24 Housing and Urban Development 4 2012-04-01 2012-04-01 false Development methods and funding... URBAN DEVELOPMENT PUBLIC HOUSING DEVELOPMENT General § 941.102 Development methods and funding....

  15. Using Qualitative Methods to Inform Scale Development

    ERIC Educational Resources Information Center

    Rowan, Noell; Wulff, Dan

    2007-01-01

    This article describes the process by which one study utilized qualitative methods to create items for a multi dimensional scale to measure twelve step program affiliation. The process included interviewing fourteen addicted persons while in twelve step focused treatment about specific pros (things they like or would miss out on by not being…

  16. Development of ultrasonic methods for hemodynamic measurements

    NASA Technical Reports Server (NTRS)

    Histand, M. B.; Miller, C. W.; Wells, M. K.; Mcleod, F. D.; Greene, E. R.; Winter, D.

    1975-01-01

    A transcutaneous method to measure instantaneous mean blood flow in peripheral arteries of the human body was defined. Transcutaneous and implanted-cuff ultrasound velocity measurements were evaluated, and the accuracies of velocity, flow, and diameter measurements were assessed for steady flow. Performance criteria were established for the pulsed Doppler velocity meter (PUDVM), and performance tests were conducted. Several improvements are suggested.
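
    The velocity estimate in a pulsed Doppler velocity meter such as the PUDVM rests on the standard Doppler shift relation shown below (textbook form, not an instrument-specific detail from the report):

      v = \frac{c \, \Delta f}{2 f_0 \cos\theta}

    where c is the speed of sound in tissue (about 1540 m/s), f_0 the transmitted frequency, \Delta f the measured Doppler shift, and \theta the angle between the ultrasound beam and the flow direction.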

  17. DEVELOPMENT AND EVALUATION OF COMPOSITE RECEPTOR METHODS

    EPA Science Inventory

    A composite receptor method for PM-10 apportionment was evaluated to determine the stability of its solutions and to devise cost-effective measurement strategies. Aerosol samples used in the evaluation were collected during summer, 1982, by dichotomous samplers at three sites in ...

  18. Methods for generating hydroelectric power development alternatives

    SciTech Connect

    Chang, Shoou-yuh; Liaw, Shu-liang; Sale, M.J.; Railsback, S.F.

    1989-01-01

    Hydropower development on large rivers can result in a number of environmental impacts, including potential reductions in dissolved oxygen (DO) concentrations. This study presents a methodology for generating different hydropower development alternatives for evaluation. This methodology employs a Streeter-Phelps model to simulate DO, and the Bounded Implicit Enumeration algorithm to solve an optimization model formulated to maximize hydroelectric energy production subject to acceptable DO limits. The upper Ohio River basin was used to illustrate the use and characteristics of the methodology. The results indicate that several alternatives which meet the specified DO constraints can be generated efficiently, meeting both power and environmental objectives. 17 refs., 2 figs., 1 tab.
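
    The Streeter-Phelps model referenced above predicts the dissolved-oxygen deficit downstream of an oxygen-demanding discharge. A minimal sketch of the classical sag equation follows; the rate constants, initial conditions, and saturation value are chosen only for illustration and are not taken from the Ohio River study.

      import math

      def streeter_phelps_deficit(t, L0, D0, kd, kr):
          """Classical Streeter-Phelps DO deficit (mg/L) at travel time t (days),
          for initial BOD L0, initial deficit D0, deoxygenation rate kd and
          reaeration rate kr (kd != kr)."""
          return (kd * L0 / (kr - kd)) * (math.exp(-kd * t) - math.exp(-kr * t)) \
                 + D0 * math.exp(-kr * t)

      DO_SAT = 9.1   # assumed saturation DO, mg/L
      for t in (0.0, 0.5, 1.0, 2.0, 4.0):
          d = streeter_phelps_deficit(t, L0=20.0, D0=1.0, kd=0.3, kr=0.6)
          print(f"t = {t:3.1f} d   DO = {DO_SAT - d:5.2f} mg/L")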

  19. Development of advanced acreage estimation methods

    NASA Technical Reports Server (NTRS)

    Guseman, L. F., Jr. (Principal Investigator)

    1982-01-01

    The development of an accurate and efficient algorithm for analyzing the structure of MSS data, the application of the Akaike information criterion to mixture models, and a research plan to delineate some of the technical issues and associated tasks in the area of rice scene radiation characterization are discussed. The AMOEBA clustering algorithm is refined and documented.

  20. A Method for Developing a Nutrient Guide.

    ERIC Educational Resources Information Center

    Gillespie, Ardyth H.; Roderuck, Charlotte E.

    1982-01-01

    This paper proposes a new approach to developing a tool for teaching nutrition and food selection. It allows adjustments as new information becomes available and takes into account both dietary recommendations and food composition. Steps involve nutrient composition; nutrient density; and ratings for fat, cholesterol, and sodium. (Author/CT)

  1. An Unbalance Adjustment Method for Development Indicators

    ERIC Educational Resources Information Center

    Tarabusi, Enrico Casadio; Guarini, Giulio

    2013-01-01

    This paper analyzes some aggregation aspects of the procedure for constructing a composite index on a multidimensional socio-economic phenomenon such as development, the main focus being on the unbalance among individual dimensions. First a theoretical framework is set up for the unbalance adjustment of the index. Then an aggregation function is…

  2. Recommendations for Developing Alternative Test Methods for Developmental Neurotoxicity

    EPA Science Inventory

    There is great interest in developing alternative methods for developmental neurotoxicity testing (DNT) that are cost-efficient, use fewer animals and are based on current scientific knowledge of the developing nervous system. Alternative methods will require demonstration of the...

  3. Current status of fluoride volatility method development

    SciTech Connect

    Uhlir, J.; Marecek, M.; Skarohlid, J.

    2013-07-01

    The Fluoride Volatility Method is based on a separation process that exploits the specific property of uranium, neptunium and plutonium to form volatile hexafluorides, whereas most fission products (mainly lanthanides) and the higher transplutonium elements (americium, curium) present in irradiated fuel form nonvolatile tri-fluorides. The Fluoride Volatility Method itself is based on direct fluorination of the spent fuel, but before the fluorination step, the cladding material must be removed and the fuel transformed into a powdered form with a suitable grain size. The fluorination is carried out with fluorine gas in a flame fluorination reactor, where the volatile fluorides (mostly UF{sub 6}) are separated from the non-volatile ones (trivalent minor actinides and the majority of fission products). The subsequent operations necessary for partitioning the volatile fluorides are the condensation and evaporation of the volatile fluorides, the thermal decomposition of PuF{sub 6}, and finally the distillation and sorption steps used for purification of the uranium product. The Fluoride Volatility Method is considered to be a promising advanced pyrochemical reprocessing technology, which can mainly be used for the reprocessing of oxide spent fuels coming from future GEN IV fast reactors.

  4. Developing Automated Methods of Waste Sorting

    SciTech Connect

    Shurtliff, Rodney Marvin

    2002-08-01

    The U.S. Department of Energy (DOE) analyzed the complex-wide need for remote and automated technologies as they relate to the treatment and disposal of mixed wastes. This analysis revealed that several DOE sites need the capability to open drums containing waste, visually inspect and sort the contents, and finally repackage the containers that are acceptable at a waste disposal facility such as the Waste Isolation Pilot Plant (WIPP) in New Mexico. Conditioning contaminated waste so that it is compatible with the WIPP criteria for storage is an arduous task whether the waste is contact handled (waste having radioactivity levels below 200 mrem/hr) or remote handled. Currently, WIPP non-compliant items are removed from the waste stream manually, at a rate of about one 55-gallon drum per day. Issues relating to contamination-based health hazards as well as repetitive-motion health hazards are steering industry toward a more user-friendly method of conditioning or sorting waste.

  5. 24 CFR 941.102 - Development methods and funding.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Development methods and funding. 941.102 Section 941.102 Housing and Urban Development Regulations Relating to Housing and Urban Development (Continued) OFFICE OF ASSISTANT SECRETARY FOR PUBLIC AND INDIAN HOUSING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT PUBLIC HOUSING...

  6. A Block-Matrix Method for Course Development

    ERIC Educational Resources Information Center

    Greenaway, John

    1977-01-01

    Describes the block-matrix method, a technique used to develop new training programs (commonly involving educational program developers and community representatives). Two examples of the block-matrix application and supplementary diagrams are included. It is noted that this method has been used successfully in the development of new courses for…

  7. Method Development and Monitoring of Cyanotoxins in Water

    EPA Science Inventory

    This presentation describes method development of two ambient water LC/MS/MS methods for microcystins, cylindrospermopsin and anatoxin-a. Ruggedness of the methods will be demonstrated by evaluation of quality control samples derived from various water bodies across the country.

  8. Development of quality assurance methods for epoxy graphite prepreg

    NASA Technical Reports Server (NTRS)

    Chen, J. S.; Hunter, A. B.

    1982-01-01

    Quality assurance methods for graphite epoxy prepregs were developed. Liquid chromatography, differential scanning calorimetry, and gel permeation chromatography were investigated. These methods were applied to a second prepreg system. The resin matrix formulation was correlated with mechanical properties. Dynamic mechanical analysis and fracture toughness methods were investigated. The chromatography and calorimetry techniques were all successfully developed as quality assurance methods for graphite epoxy prepregs. The liquid chromatography method was the most sensitive to changes in resin formulation. They were also successfully applied to the second prepreg system.

  9. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background: Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods: We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results: This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions: Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
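    As a rough illustration of the spatial point-selection idea (not the authors' actual GIS/GPS workflow), the sketch below draws uniform random survey points inside a study-area polygon by rejection sampling within its bounding box. The boundary coordinates are hypothetical and the use of the shapely library is an assumption for the example.

        import random
        from shapely.geometry import Point, Polygon

        def random_points_in_region(boundary_coords, n_points, seed=42):
            """Draw n_points uniform random survey points inside a study-area polygon
            by rejection sampling within its bounding box."""
            random.seed(seed)
            region = Polygon(boundary_coords)
            minx, miny, maxx, maxy = region.bounds
            points = []
            while len(points) < n_points:
                candidate = Point(random.uniform(minx, maxx), random.uniform(miny, maxy))
                if region.contains(candidate):
                    points.append((candidate.x, candidate.y))
            return points

        # Hypothetical study-area boundary (longitude, latitude pairs)
        boundary = [(-91.55, 14.95), (-91.45, 14.95), (-91.45, 15.05), (-91.55, 15.05)]
        for lon, lat in random_points_in_region(boundary, n_points=5):
            print(f"survey point: {lat:.4f} N, {lon:.4f} W")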

  10. EPA (ENVIRONMENTAL PROTECTION AGENCY) DEVELOPING METHODS TO ASSESS ENVIRONMENTAL RELEASE

    EPA Science Inventory

    The EPA has invested considerable research effort--intended to meet regulatory needs--toward developing methods for assessing the environmental effects of genetically engineered microorganisms (GEMs). Preliminary investigations centered on the fate, survival, accidental release, ...

  11. XENOBIOTIC METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT RESEARCH

    EPA Science Inventory

    Biomarkers from blood, breath, urine, and other physiological matrices can provide useful information regarding exposures to environmental pollutants. Once developed and applied appropriately, specific and sensitive methods can often provide definitive data identifying the vario...

  12. 3-minute diagnosis: Researchers develop new method to recognize pathogens

    ScienceCinema

    Beer, Reg

    2014-05-30

    Imagine knowing precisely why you feel sick ... before the doctor's exam is over. Lawrence Livermore researcher Reg Beer and his engineering colleagues have developed a new method to recognize disease-causing pathogens quicker than ever before.

  13. 3-minute diagnosis: Researchers develop new method to recognize pathogens

    SciTech Connect

    Beer, Reg

    2014-01-06

    Imagine knowing precisely why you feel sick ... before the doctor's exam is over. Lawrence Livermore researcher Reg Beer and his engineering colleagues have developed a new method to recognize disease-causing pathogens quicker than ever before.

  14. Development of methods for orderly growth of nanowires

    NASA Astrophysics Data System (ADS)

    Reznik, R. R.; Kotlyar, K. P.; Khrebtov, A. I.; Samsonenko, Yu B.; Soshnikov, I. P.; Dyakonov, V.; Zadiranov, U. M.; Tankelevskaya, E. M.; Kudryashov, D. A.; Shevchuk, D. S.; Cirlin, G. E.

    2015-12-01

    A method of manufacturing substrates for self-catalyzed/catalyst-free ordered growth of nanowires has been developed. Experiments show the possibility of autocatalytic growth of ordered GaAs NWs on the substrates produced during the research.

  15. Development of aerodynamic prediction methods for irregular planform wings

    NASA Technical Reports Server (NTRS)

    Benepe, D. B., Sr.

    1983-01-01

    A set of empirical methods was developed to predict low-speed lift, drag and pitching-moment variations with angle of attack for a class of low aspect ratio irregular planform wings suitable for application to advanced aerospace vehicles. The data base, an extensive series of wind-tunnel tests accomplished by the Langley Research Center of the National Aeronautics and Space Administration, is summarized. The approaches used to analyze the wind tunnel data, the evaluation of previously existing methods, data correlation efforts, and the development of the selected methods are presented and discussed. A summary of the methods is also presented to document the equations, computational charts and design guides which have been programmed for digital computer solution. Comparisons of predictions and test data are presented which show that the new methods provide a significant improvement in capability for evaluating the landing characteristics of advanced aerospace vehicles during the preliminary design phase of the configuration development cycle.

  16. AN APPROACH TO METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT STUDIES

    EPA Science Inventory

    Human exposure assessment studies require methods that are rapid, cost-effective and have a high sample through-put. The development of analytical methods for exposure studies should be based on specific information for individual studies. Human exposure studies suggest that di...

  17. What Can Mixed Methods Designs Offer Professional Development Program Evaluators?

    ERIC Educational Resources Information Center

    Giordano, Victoria; Nevin, Ann

    2007-01-01

    In this paper, the authors describe the benefits and pitfalls of mixed methods designs. They argue that mixed methods designs may be preferred when evaluating professional development programs for p-K-12 education given the new call for accountability in making data-driven decisions. They summarize and critique the studies in terms of limitations…

  18. DEVELOPMENT OF SAMPLING METHODS FOR SOURCE PM10 EMISSIONS

    EPA Science Inventory

    The report describes an investigation of the needs and available techniques for in-stack PM-10 sampling. Discussion includes the conceptualization, development, documentation, and testing of two candidate methods. The first method, Constant Sampling Rate (CSR), is a procedural ap...

  19. Epistemological Development and Judgments and Reasoning about Teaching Methods

    ERIC Educational Resources Information Center

    Spence, Sarah; Helwig, Charles C.

    2013-01-01

    Children's, adolescents', and adults' (N = 96; 7-8, 10-11, and 13-14-year-olds and university students) epistemological development and its relation to judgments and reasoning about teaching methods was examined. The domain (scientific or moral), nature of the topic (controversial or noncontroversial), and teaching method (direct…

  20. Recent developments in synthetic methods for benzo[b]heteroles.

    PubMed

    Wu, Bin; Yoshikai, Naohiko

    2016-06-28

    Benzo[b]heteroles containing heteroatoms other than nitrogen and oxygen have received considerable attention for their potential applications in materials science. This poses an increasing demand for efficient, selective, and broad-scope methods for their synthesis. This review article summarizes the recent developments in synthetic methods and approaches to access representative members of the benzoheterole family. PMID:26892101

  1. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without using formal methods. With such informal methods, the MIS is developed over its lifecycle without any models, which causes many problems such as a lack of reliability in system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly respond to changes in business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems, applying the model-driven development method to a component of the model theory approach. The experiment showed that the method reduces development effort by more than 30%.

  2. A Study on Reclaimed Photoresist Developer Using an Electrodialysis Method

    NASA Astrophysics Data System (ADS)

    Sugawara, Hiroshi; Tajima, Yoshinori; Ohmi, Tadahiro

    2002-04-01

    Photoresist developer reclamation technology and systems have been investigated and characteristics of the reclaimed developer were evaluated. High-purity tetramethylammonium hydroxide (TMAH) aqueous solution adjusted to an appropriate concentration is generally used as the photoresist developer in the lithography step for large scale integration (LSI) and liquid crystal display (LCD) manufacturing processes. TMAH was recovered by an electrodialysis (ED) method from developer waste (spent developer), and purified by ion exchange (IE) technologies. The reclaimed developer was analyzed and found to feature the same purity as fresh commercial developer and no differences in their characteristics as developer material could be determined. Our experimental reclamation system recovered more than 80% of stable TMAH from waste and, moreover, achieved large reductions in operating costs, which amounted to total cost reductions of 55-75% for LSI and 65-85% for LCD compared with the conventional systems without reclamation studied by us. In addition, environmental load was reduced.

  3. Development and applications of Krotov method of global control improvement

    NASA Astrophysics Data System (ADS)

    Rasina, Irina V.; Trushkova, Ekaterina A.; Baturina, Olga V.; Bulatov, Alexander V.; Guseva, Irina S.

    2016-06-01

    This is a survey of works on the main properties, applications and development of the Krotov method of global control improvement, which is very popular among researchers of modern problems in quantum physics and quantum chemistry who actively apply optimal control methods. The survey includes a brief description of the method in comparison with the well-known gradient method, demonstrating its serious advantage of requiring no tuning parameters; investigations aimed at making its special version for quantum systems well defined and more effective; and its generalization to wide classes of control systems, including systems of heterogeneous structure.

  4. Validation of Analytical Methods for Biomarkers Employed in Drug Development

    PubMed Central

    Chau, Cindy H.; Rixe, Olivier; McLeod, Howard; Figg, William D.

    2008-01-01

    The role of biomarkers in drug discovery and development has gained precedence over the years. As biomarkers become integrated into drug development and clinical trials, quality assurance and in particular assay validation become essential, with the need to establish standardized guidelines for analytical methods used in biomarker measurements. New biomarkers can revolutionize both the development and use of therapeutics, but this is contingent upon the establishment of a concrete validation process that addresses technology integration and method validation as well as regulatory pathways for efficient biomarker development. This perspective focuses on the general principles of the biomarker validation process with an emphasis on assay validation and the collaborative efforts undertaken by various sectors to promote the standardization of this procedure for efficient biomarker development. PMID:18829475

  5. Development of Improved Surface Integral Methods for Jet Aeroacoustic Predictions

    NASA Technical Reports Server (NTRS)

    Pilon, Anthony R.; Lyrintzis, Anastasios S.

    1997-01-01

    The accurate prediction of aerodynamically generated noise has become an important goal over the past decade. Aeroacoustics must now be an integral part of the aircraft design process. The direct calculation of aerodynamically generated noise with CFD-like algorithms is plausible. However, large computer time and memory requirements often make these predictions impractical. It is therefore necessary to separate the aeroacoustics problem into two parts, one in which aerodynamic sound sources are determined, and another in which the propagating sound is calculated. This idea is applied in acoustic analogy methods. However, in the acoustic analogy, the determination of far-field sound requires the solution of a volume integral. This volume integration again leads to impractical computer requirements. An alternative to the volume integrations can be found in the Kirchhoff method. In this method, Green's theorem for the linear wave equation is used to determine sound propagation based on quantities on a surface surrounding the source region. The change from volume to surface integrals represents a tremendous savings in the computer resources required for an accurate prediction. This work is concerned with the development of enhancements of the Kirchhoff method for use in a wide variety of aeroacoustics problems. This enhanced method, the modified Kirchhoff method, is shown to be a Green's function solution of Lighthill's equation. It is also shown rigorously to be identical to the methods of Ffowcs Williams and Hawkings. This allows for development of versatile computer codes which can easily alternate between the different Kirchhoff and Ffowcs Williams-Hawkings formulations, using the most appropriate method for the problem at hand. The modified Kirchhoff method is developed primarily for use in jet aeroacoustics predictions. Applications of the method are shown for two dimensional and three dimensional jet flows. Additionally, the enhancements are generalized so that
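    For readers unfamiliar with the surface-integral idea referred to above, the frequency-domain Kirchhoff-Helmholtz relation is one standard textbook form, quoted here only as general background rather than from this report; the signs depend on the normal convention, taken here with the surface normal n pointing away from the source region toward the observer:

        p(\mathbf{x}) = \oint_S \Big[ p(\mathbf{y})\,\frac{\partial G(\mathbf{x},\mathbf{y})}{\partial n} - G(\mathbf{x},\mathbf{y})\,\frac{\partial p(\mathbf{y})}{\partial n} \Big]\, dS(\mathbf{y}),
        \qquad G(\mathbf{x},\mathbf{y}) = \frac{e^{ik|\mathbf{x}-\mathbf{y}|}}{4\pi|\mathbf{x}-\mathbf{y}|}.

    The time-domain Kirchhoff formula used in aeroacoustics has the same structure, with the surface pressure data evaluated at the retarded time t - |x - y|/c.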

  6. To develop a geometric matching method for precision mold alignment

    NASA Astrophysics Data System (ADS)

    Chen, Chun-Jen; Chang, Chun-Li; Jywe, Wenyuh

    2014-09-01

    In order to develop a high-accuracy optical alignment system for a precision molding machine, a geometric matching method was developed in this paper. The alignment system includes 4 high-magnification lenses, 4 CCD cameras and 4 LED light sources. In the precision molding machine, a bottom metal mold and a top glass mold are used to produce a micro lens. The combination of the two molds does not use any pin or other alignment part; alignment relies solely on the optical alignment system. In this optical alignment system, the off-axis alignment method was used. The alignment accuracy of the alignment system is about 0.5 μm. There are 2 cross marks on the top glass mold and 2 cross marks on the bottom metal mold. Edge detection was not used to recognize the mask center because the mask wears easily as the number of mold combinations increases. Therefore, this paper develops a geometric matching method to recognize the mask center.
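    As a generic illustration of locating a cross mark in a camera image (a plain normalized cross-correlation template match, not the paper's geometric matching algorithm), a minimal sketch might look like the following; the file names are hypothetical.

        import cv2

        # Hypothetical file names; in practice the frames come from the 4 alignment cameras.
        image = cv2.imread("mold_view.png", cv2.IMREAD_GRAYSCALE)
        template = cv2.imread("cross_mark_template.png", cv2.IMREAD_GRAYSCALE)

        # Normalized cross-correlation is robust to overall brightness changes of the mark.
        result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(result)

        # Centre of the best match = top-left corner of the match + half the template size.
        h, w = template.shape
        center = (max_loc[0] + w // 2, max_loc[1] + h // 2)
        print("match score %.2f, estimated mark centre at pixel %s" % (max_val, center))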

  7. Reactions Involved in Fingerprint Development Using the Cyanoacrylate - Fuming Method

    SciTech Connect

    Lewis, L.A.

    2001-07-30

    The Learning Objective is to present the basic chemistry research findings to the forensic community regarding development of latent fingerprints using the cyanoacrylate fuming method. Chemical processes involved in the development of latent fingerprints using the cyanoacrylate fuming method have been studied, and will be presented. Two major types of latent prints have been investigated--clean (eccrine) and oily (sebaceous) prints. Scanning electron microscopy (SEM) was used as a tool for determining the morphology of the polymer developed separately on clean and oily prints after cyanoacrylate fuming. A correlation between the chemical composition of an aged latent fingerprint, prior to development, and the quality of a developed fingerprint was observed in the morphology. The moisture in the print prior to fuming was found to be a critical factor for the development of a useful latent print. In addition, the amount of time required to develop a high quality latent print was found to be minimal. The cyanoacrylate polymerization process is extremely rapid. When heat is used to accelerate the fuming process, typically a period of 2 minutes is required to develop the print. The optimum development time is dependent upon the concentration of cyanoacrylate vapors within the enclosure.

  8. PM: RESEARCH METHODS FOR PM TOXIC COMPOUNDS - PARTICLE METHODS EVALUATION AND DEVELOPMENT

    EPA Science Inventory

    The Federal Reference Method (FRM) for Particulate Matter (PM) developed by EPA's National Exposure Research Laboratory (NERL) forms the backbone of the EPA's national monitoring strategy. It is the measurement that defines attainment of the National Ambient Air Quality Standard...

  9. Recent developments in the methods of estimating shooting distance.

    PubMed

    Zeichner, Arie; Glattstein, Baruch

    2002-03-01

    A review of developments during the past 10 years in the methods of estimating shooting distance is provided. This review discusses the examination of clothing targets, cadavers, and exhibits that cannot be processed in the laboratory. The methods include visual/microscopic examinations, color tests, and instrumental analysis of the gunshot residue deposits around the bullet entrance holes. The review does not cover shooting distance estimation from shotguns that fired pellet loads. PMID:12805985

  10. Formal methods in the development of safety critical software systems

    SciTech Connect

    Williams, L.G.

    1991-11-15

    As the use of computers in critical control systems such as aircraft controls, medical instruments, defense systems, missile controls, and nuclear power plants has increased, concern for the safety of those systems has also grown. Much of this concern has focused on the software component of those computer-based systems. This is primarily due to historical experience with software systems that often exhibit larger numbers of errors than their hardware counterparts and the fact that the consequences of a software error may endanger human life, property, or the environment. A number of different techniques have been used to address the issue of software safety. Some are standard software engineering techniques aimed at reducing the number of faults in a software product, such as reviews and walkthroughs. Others, including fault tree analysis, are based on identifying and reducing hazards. This report examines the role of one such technique, formal methods, in the development of software for safety critical systems. The use of formal methods to increase the safety of software systems is based on their role in reducing the possibility of software errors that could lead to hazards. The use of formal methods in the development of software systems is controversial. Proponents claim that the use of formal methods can eliminate errors from the software development process, and produce programs that are provably correct. Opponents claim that they are difficult to learn and that their use increases development costs unacceptably. This report discusses the potential of formal methods for reducing failures in safety critical software systems.

  11. Agile Software Development Methods: A Comparative Review

    NASA Astrophysics Data System (ADS)

    Abrahamsson, Pekka; Oza, Nilay; Siponen, Mikko T.

    Although agile software development methods have caught the attention of software engineers and researchers worldwide, scientific research still remains quite scarce. The aim of this study is to order and make sense of the different agile approaches that have been proposed. This comparative review is performed from the standpoint of using the following features as the analytical perspectives: project management support, life-cycle coverage, type of practical guidance, adaptability in actual use, type of research objectives and existence of empirical evidence. The results show that agile software development methods cover, without offering any rationale, different phases of the software development life-cycle and that most of these methods fail to provide adequate project management support. Moreover, quite a few methods continue to offer little concrete guidance on how to use their solutions or how to adapt them in different development situations. Empirical evidence after ten years of application remains quite limited. Based on the results, new directions on agile methods are outlined.

  12. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    SciTech Connect

    Prinn, Ronald; Webster, Mort

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  13. Development of ultrasonic methods for the nondestructive inspection of concrete

    NASA Astrophysics Data System (ADS)

    Claytor, T. M.; Ellingson, W. A.

    1983-08-01

    Nondestructive inspection of Portland cement and refractory concrete is conducted to determine strength, thickness, presence of voids or foreign matter, presence of cracks, amount of degradation due to chemical attack, and other properties without the necessity of coring the structure (which is usually accomplished by destructively removing a sample). The state of the art of acoustic nondestructive testing methods for Portland cement and refractory concrete is reviewed. Most nondestructive work on concrete has concentrated on measuring acoustic velocity by through transmission methods. Development of a reliable pitch-catch or pulse-echo system would provide a method of measuring thickness with access from only one side of the concrete.

  14. Development of ultrasonic methods for the nondestructive inspection of concrete

    SciTech Connect

    Claytor, T.N.; Ellingson, W.A.

    1983-08-01

    Nondestructive inspection of Portland cement and refractory concrete is conducted to determine strength, thickness, presence of voids or foreign matter, presence of cracks, amount of degradation due to chemical attack, and other properties without the necessity of coring the structure (which is usually accomplished by destructively removing a sample). This paper reviews the state of the art of acoustic nondestructive testing methods for Portland cement and refractory concrete. Most nondestructive work on concrete has concentrated on measuring acoustic velocity by through transmission methods. Development of a reliable pitch-catch or pulse-echo system would provide a method of measuring thickness with access from only one side of the concrete.
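    The pulse-echo thickness measurement mentioned in these two records reduces to a one-line calculation: the pulse traverses the wall twice, so thickness is velocity times half the round-trip time. The velocity and transit-time values below are illustrative only, not measured data.

        def pulse_echo_thickness(round_trip_time_s, velocity_m_per_s):
            """Thickness from a pulse-echo measurement: the pulse travels the wall twice."""
            return velocity_m_per_s * round_trip_time_s / 2.0

        # Illustrative numbers only: ~4000 m/s longitudinal velocity in concrete,
        # 50 microsecond round-trip time  ->  0.1 m wall
        print(pulse_echo_thickness(50e-6, 4000.0))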

  15. Development of a benchtop baking method for chemically leavened crackers. II. Validation of the method

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A benchtop baking method has been developed to predict the contribution of gluten functionality to overall flour performance for chemically leavened crackers. Using a diagnostic formula and procedure, dough rheology was analyzed to evaluate the extent of gluten development during mixing and machinin...

  16. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    SciTech Connect

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  17. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    SciTech Connect

    Ravindra, M.K.; Banon, H. )

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  18. FEASIBILITY OF DEVELOPING SOURCE SAMPLING METHODS FOR ASBESTOS EMISSIONS

    EPA Science Inventory

    The objective of this program was to determine the feasibility of developing methods for sampling asbestos in the emissions of major asbestos sources: (1) ore production and taconite production, (2) asbestos-cement production, (3) asbestos felt and paper production, and (4) the p...

  19. Development of a rapid detection method for Yellow Dwarf Viruses

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Barley and Cereal yellow dwarf viruses (B/CYDVs) constitute the most economically important group of oat viruses. A multiplex reverse transcription polymerase chain reaction method was developed for the simultaneous detection and discrimination of five B/CYDVs. The protocol uses specific pr...

  20. EXPOSURE ASSESSMENT METHODS DEVELOPMENT PILOTS FOR THE NATIONAL CHILDREN'S STUDY

    EPA Science Inventory

    Accurate exposure classification tools are needed to link exposure with health effects. EPA began methods development pilot studies in 2000 to address general questions about exposures and outcome measures. Selected pilot studies are highlighted in this poster. The “Literature Re...

  1. DEVELOPMENT OF CRITERIA AND METHODS FOR EVALUATING TRAINER AIRCRAFT EFFECTIVENESS.

    ERIC Educational Resources Information Center

    KUSEWITT, J.B.

    The purpose of this study was to develop a method for determining objective measures of trainer aircraft effectiveness to evaluate program alternatives for training pilots for fleet fighter and attack-type aircraft. The training syllabus was based on average student ability. The basic problem was to establish quantitative time-difficulty…

  2. MONITORING METHODS DEVELOPMENT IN THE BEAUMONT-LAKE CHARLES AREA

    EPA Science Inventory

    In 1978, the U.S. Environmental Protection Agency initiated a study in the Beaumont, Texas-Lake Charles, Louisiana area (BLCA) as a preliminary step to develop, demonstrate and test methods for monitoring levels of chemicals, primarily in air and water, and to measure effects of ...

  3. Is Mixed Methods Research Used in Australian Career Development Research?

    ERIC Educational Resources Information Center

    Cameron, Roslyn

    2010-01-01

    Mixed methods research has become a substantive methodological force that is growing in popularity within the human and social sciences. This article reports the findings of a study that systematically reviewed articles from the "Australian Journal of Career Development" from 2004 to 2009. The aim of the study was to provide a…

  4. Multidisciplinary Methods in Educational Technology Research and Development

    ERIC Educational Resources Information Center

    Randolph, Justus J.

    2008-01-01

    Over the past thirty years, there has been much dialogue, and debate, about the conduct of educational technology research and development. In this brief volume, the author helps clarify that dialogue by theoretically and empirically charting the research methods used in the field and provides much practical information on how to conduct…

  5. Teaching Analytical Method Development in an Undergraduate Instrumental Analysis Course

    ERIC Educational Resources Information Center

    Lanigan, Katherine C.

    2008-01-01

    Method development and assessment, central components of carrying out chemical research, require problem-solving skills. This article describes a pedagogical approach for teaching these skills through the adaptation of published experiments and application of group-meeting style discussions to the curriculum of an undergraduate instrumental…

  6. Methods of high throughput biophysical characterization in biopharmaceutical development.

    PubMed

    Razinkov, Vladimir I; Treuheit, Michael J; Becker, Gerald W

    2013-03-01

    Discovery and successful development of biopharmaceutical products depend on a thorough characterization of the molecule both before and after formulation. Characterization of a formulated biotherapeutic, typically a protein or large peptide, requires a rigorous assessment of the molecule's physical stability. Stability of a biotherapeutic includes not only chemical stability, i.e., degradation of the molecule to form undesired modifications, but also structural stability, including the formation of aggregates. In this review, high throughput biophysical characterization techniques are described according to their specific applications during biopharmaceutical discovery, development and manufacturing. The methods presented here are classified according to these attributes, and include spectroscopic assays based on absorbance, polarization, and intrinsic and extrinsic fluorescence, surface plasmon resonance instrumentation, calorimetric methods, dynamic and static light scattering techniques, several visible particle counting and sizing methods, a new viscosity assay based on light scattering, and mass spectrometry. Several techniques presented here are already implemented in industry, but many high throughput biophysical methods are still in the initial stages of implementation or even in the prototype stage. Each technique in this report is judged by the specific application of the method through the biopharmaceutical development process. PMID:22725690

  7. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

    To realize an effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. In traditional cost accounting systems, volume-based costing (VBC) is the most popular cost accounting method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often produces rough and inaccurate results. The activity-based costing (ABC) method introduced in the mid 1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of the laboratory tests, and as a result, similarly accurate results were obtained with the ABC method (largest difference was 2.64%). Simultaneously, this new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to certify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing. PMID
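    A minimal sketch of the activity-based allocation step described above is shown below; the activities, cost-driver volumes and test names are hypothetical and are not taken from the study.

        # Hypothetical activities, cost-driver volumes, and cost-object usage
        activity_costs = {"specimen handling": 600_000, "instrument runs": 900_000, "reporting": 300_000}
        driver_totals  = {"specimen handling": 120_000,   # specimens/year
                          "instrument runs":   150_000,   # runs/year
                          "reporting":          90_000}   # reports/year
        usage_by_test  = {"glucose":   {"specimen handling": 40_000, "instrument runs": 60_000, "reporting": 30_000},
                          "CBC panel": {"specimen handling": 30_000, "instrument runs": 45_000, "reporting": 25_000}}

        def abc_allocate(activity_costs, driver_totals, usage):
            """Allocate each activity's cost to a cost object in proportion to its driver usage."""
            rates = {a: activity_costs[a] / driver_totals[a] for a in activity_costs}
            return {obj: sum(rates[a] * use.get(a, 0) for a in rates) for obj, use in usage.items()}

        for test, cost in abc_allocate(activity_costs, driver_totals, usage_by_test).items():
            print(f"{test}: allocated indirect cost = {cost:,.0f}")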

  8. Recent developments in methods for identifying reaction coordinates

    PubMed Central

    Li, Wenjin; Ma, Ao

    2014-01-01

    In the study of rare events in complex systems with many degrees of freedom, a key element is to identify the reaction coordinates of a given process. Over recent years, a number of methods and protocols have been developed to extract the reaction coordinates based on limited information from molecular dynamics simulations. In this review, we provide a brief survey over a number of major methods developed in the past decade, some of which are discussed in greater detail, to provide an overview of the problems that are partially solved and challenges that still remain. A particular emphasis has been placed on methods for identifying reaction coordinates that are related to the committor. PMID:25197161
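    For background, the committor mentioned above is conventionally defined (a general definition, not specific to this review) as

        p_B(\mathbf{x}) = \mathrm{Prob}\big[\,\text{a trajectory started at configuration } \mathbf{x}\text{, with Maxwell--Boltzmann velocities, reaches the product state } B \text{ before the reactant state } A\,\big],

    and a good reaction coordinate is one that parameterizes p_B, with the transition-state ensemble located near p_B = 1/2.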

  9. Quality functions for requirements engineering in system development methods.

    PubMed

    Johansson, M; Timpka, T

    1996-01-01

    Based on a grounded theory framework, this paper analyses the quality characteristics for methods to be used for requirements engineering in the development of medical decision support systems (MDSS). The results from a Quality Function Deployment (QFD) used to rank functions connected to user value and a focus group study were presented to a validation focus group. The focus group studies take advantage of a group process to collect data for further analyses. The results describe factors considered by the participants as important in the development of methods for requirements engineering in health care. Based on the findings, the content that, according to the users, an MDSS method should support is established. PMID:8947891

  10. Development of a transfer function method for dynamic stability measurement

    NASA Technical Reports Server (NTRS)

    Johnson, W.

    1977-01-01

    A flutter-testing method based on transfer function measurements is developed. The error statistics of several dynamic stability measurement methods are reviewed. It is shown that the transfer function measurement controls the error level by averaging the data and correlating the input and output. The method also gives a direct estimate of the error in the response measurement. An algorithm is developed for obtaining the natural frequency and damping ratio of low damped modes of the system, using integrals of the transfer function in the vicinity of a resonant peak. Guidelines are given for selecting the parameters in the transfer function measurement. Finally, the dynamic stability measurement technique is applied to data from a wind tunnel test of a proprotor and wing model.
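    The report's estimator works from integrals of the transfer function near a resonant peak; as a simpler, related illustration (the standard half-power bandwidth method, not the report's algorithm), natural frequency and damping of an isolated, lightly damped mode can be read off a measured frequency-response magnitude as sketched below. The synthetic FRF parameters are invented for the example.

        import numpy as np

        def half_power_damping(freq, H_mag):
            """Estimate natural frequency and damping ratio of an isolated, lightly damped
            mode from the magnitude of a measured transfer function (half-power method)."""
            i_peak = int(np.argmax(H_mag))
            f_n = freq[i_peak]
            target = H_mag[i_peak] / np.sqrt(2.0)
            above = np.where(H_mag >= target)[0]        # assumes a single resonant peak
            f1, f2 = freq[above[0]], freq[above[-1]]
            zeta = (f2 - f1) / (2.0 * f_n)
            return f_n, zeta

        # Synthetic single-degree-of-freedom FRF with f_n = 10 Hz, zeta = 0.03
        f = np.linspace(5.0, 15.0, 2001)
        fn, zeta = 10.0, 0.03
        H = 1.0 / np.sqrt((1 - (f / fn) ** 2) ** 2 + (2 * zeta * f / fn) ** 2)
        print(half_power_damping(f, np.abs(H)))         # approximately (10.0, 0.03)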

  11. Quantitative methods for analyzing cell-cell adhesion in development.

    PubMed

    Kashef, Jubin; Franz, Clemens M

    2015-05-01

    During development cell-cell adhesion is not only crucial to maintain tissue morphogenesis and homeostasis, it also activates signalling pathways important for the regulation of different cellular processes including cell survival, gene expression, collective cell migration and differentiation. Importantly, gene mutations of adhesion receptors can cause developmental disorders and different diseases. Quantitative methods to measure cell adhesion are therefore necessary to understand how cells regulate cell-cell adhesion during development and how aberrations in cell-cell adhesion contribute to disease. Different in vitro adhesion assays have been developed in the past, but not all of them are suitable to study developmentally-related cell-cell adhesion processes, which usually requires working with low numbers of primary cells. In this review, we provide an overview of different in vitro techniques to study cell-cell adhesion during development, including a semi-quantitative cell flipping assay, and quantitative single-cell methods based on atomic force microscopy (AFM)-based single-cell force spectroscopy (SCFS) or dual micropipette aspiration (DPA). Furthermore, we review applications of Förster resonance energy transfer (FRET)-based molecular tension sensors to visualize intracellular mechanical forces acting on cell adhesion sites. Finally, we describe a recently introduced method to quantitate cell-generated forces directly in living tissues based on the deformation of oil microdroplets functionalized with adhesion receptor ligands. Together, these techniques provide a comprehensive toolbox to characterize different cell-cell adhesion phenomena during development. PMID:25448695

  12. Development of quality-by-design analytical methods.

    PubMed

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities. PMID:21280050

  13. Quantifying nonhomogeneous colors in agricultural materials part I: method development.

    PubMed

    Balaban, M O

    2008-11-01

    Measuring the color of food and agricultural materials using machine vision (MV) has advantages not available by other measurement methods such as subjective tests or use of color meters. The perception of consumers may be affected by the nonuniformity of colors. For relatively uniform colors, average color values similar to those given by color meters can be obtained by MV. For nonuniform colors, various image analysis methods (color blocks, contours, and "color change index"[CCI]) can be applied to images obtained by MV. The degree of nonuniformity can be quantified, depending on the level of detail desired. In this article, the development of the CCI concept is presented. For images with a wide range of hue values, the color blocks method quantifies well the nonhomogeneity of colors. For images with a narrow hue range, the CCI method is a better indicator of color nonhomogeneity. PMID:19021817
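    The color blocks and CCI measures are defined in the article itself; as a generic illustration of quantifying color nonuniformity from a machine-vision image, the sketch below computes the circular spread of the hue channel. This is a hypothetical score, not the article's CCI, and the synthetic test patches are invented for the example.

        import numpy as np
        from matplotlib.colors import rgb_to_hsv

        def hue_nonuniformity(rgb_image):
            """A simple, hypothetical nonuniformity score: the circular standard deviation
            of the hue channel (0 ~ uniform colour, larger ~ more nonhomogeneous)."""
            hsv = rgb_to_hsv(rgb_image.astype(float) / 255.0)
            angles = 2.0 * np.pi * hsv[..., 0].ravel()   # hue mapped onto the unit circle
            R = np.hypot(np.cos(angles).mean(), np.sin(angles).mean())
            return float(np.sqrt(-2.0 * np.log(max(R, 1e-12))))

        # Synthetic check: a uniform orange patch vs. a patch fading from orange to green
        uniform = np.tile(np.array([220, 120, 40], dtype=np.uint8), (64, 64, 1))
        fade = np.zeros((64, 64, 3), dtype=np.uint8)
        fade[..., 0] = np.linspace(220, 60, 64).astype(np.uint8)[None, :]
        fade[..., 1] = np.linspace(120, 200, 64).astype(np.uint8)[None, :]
        fade[..., 2] = 40
        print(hue_nonuniformity(uniform), "<", hue_nonuniformity(fade))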

  14. REVIEW: Development of methods for body composition studies

    NASA Astrophysics Data System (ADS)

    Mattsson, Sören; Thomas, Brian J.

    2006-07-01

    This review is focused on experimental methods for determination of the composition of the human body, its organs and tissues. It summarizes the development and current status of fat determinations from body density, total body water determinations through the dilution technique, whole and partial body potassium measurements for body cell mass estimates, in vivo neutron activation analysis for body protein measurements, dual-energy absorptiometry (DEXA), computed tomography (CT) and magnetic resonance imaging (MRI, fMRI) and spectroscopy (MRS) for body composition studies on tissue and organ levels, as well as single- and multiple-frequency bioimpedance (BIA) and anthropometry as simple easily available methods. Methods for trace element analysis in vivo are also described. Using this wide range of measurement methods, together with gradually improved body composition models, it is now possible to quantify a number of body components and follow their changes in health and disease.
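    As one concrete example of the "fat from body density" approach mentioned above, Siri's classic two-compartment equation converts whole-body density to percent fat; it is quoted here as general background, and the example density is illustrative only.

        def percent_body_fat_siri(body_density_g_per_cm3):
            """Siri's two-compartment equation: %fat = 495/density - 450 (density in g/cm^3)."""
            return 495.0 / body_density_g_per_cm3 - 450.0

        # Example: a body density of 1.060 g/cm^3 corresponds to roughly 17% body fat
        print(round(percent_body_fat_siri(1.060), 1))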

  15. Development of Integration and Adjustment Method for Sequential Range Images

    NASA Astrophysics Data System (ADS)

    Nagara, K.; Fuse, T.

    2015-05-01

    With the increasing widespread use of three-dimensional data, the demand for simplified data acquisition is also increasing. The range camera, which is a simplified sensor, can acquire a dense range image in a single shot; however, its measuring coverage is narrow and its measuring accuracy is limited. The former drawback can be overcome by registering sequential range images. This approach, however, assumes that the point cloud is error-free. In this paper, we develop an integration method for sequential range images with error adjustment of the point cloud. The proposed method consists of the ICP (Iterative Closest Point) algorithm and self-calibration bundle adjustment. The ICP result serves as the initial estimate for the bundle adjustment. By applying the bundle adjustment, the coordinates of the point cloud are modified and the camera poses are updated. Through experiments on real data, the efficiency of the proposed method has been confirmed.
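    A minimal point-to-point ICP sketch is shown below to illustrate the registration step; it omits the self-calibration bundle adjustment the paper adds, and the synthetic point cloud and parameters are invented for the example.

        import numpy as np
        from scipy.spatial import cKDTree

        def best_rigid_transform(src, dst):
            """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
            c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
            H = (src - c_src).T @ (dst - c_dst)
            U, _, Vt = np.linalg.svd(H)
            R = Vt.T @ U.T
            if np.linalg.det(R) < 0:            # guard against reflections
                Vt[-1, :] *= -1
                R = Vt.T @ U.T
            return R, c_dst - R @ c_src

        def icp(source, target, n_iter=30):
            """Basic point-to-point ICP: align `source` to `target` (both N x 3 arrays)."""
            tree = cKDTree(target)
            current = source.copy()
            for _ in range(n_iter):
                _, idx = tree.query(current)    # closest-point correspondences
                R, t = best_rigid_transform(current, target[idx])
                current = current @ R.T + t
            return current

        # Synthetic check: recover a small known rotation and translation
        rng = np.random.default_rng(0)
        target = rng.uniform(-1, 1, size=(500, 3))
        angle = np.deg2rad(5.0)
        Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
                       [np.sin(angle),  np.cos(angle), 0],
                       [0, 0, 1]])
        source = target @ Rz.T + np.array([0.05, -0.02, 0.01])
        aligned = icp(source, target)
        print("mean residual:", np.linalg.norm(aligned - target, axis=1).mean())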

  16. Arsenic extraction and speciation in plants: Method comparison and development.

    PubMed

    Zhao, Di; Li, Hong-Bo; Xu, Jia-Yi; Luo, Jun; Ma, Lena Qiying

    2015-08-01

    We compared four methods to extract arsenic (As) from three different plants containing different As levels for As speciation, with the goal of developing a more efficient method: the As-hyperaccumulator Pteris vittata at 459-7714 mg kg(-1), rice seedlings at 53.4-574 mg kg(-1), and tobacco leaf at 0.32-0.35 mg kg(-1). The four methods included heating with dilute HNO3, and sonication with phosphate buffered solution, methanol/water, and ethanol/water, with As being analyzed using high-performance liquid chromatography coupled with inductively-coupled plasma mass spectrometry (HPLC-ICP-MS). Among the four methods, the ethanol/water method produced the most satisfactory extraction efficiency (~80% for the roots and >85% for the fronds) without changing As species, based on P. vittata. The lower extraction efficiency from P. vittata roots was attributed to its dominance by arsenate (82%), whereas arsenite dominated in the fronds (89%). The ethanol/water method used a sample:solution ratio of 1:200 (0.05 g:10 mL) with 50% ethanol and 2 h of sonication. Based on different extraction times (0.5-2 h), ethanol concentrations (25-100%) and sample:solution ratios (1:50-1:300), the optimized ethanol/water method used less ethanol (25%) and time (0.5 h for the fronds and 2 h for the roots). Satisfactory extraction was also obtained for tobacco leaf (78-92%) and rice seedlings (~70%) using the optimized method, which was better than the other three methods. Based on the satisfactory extraction efficiency with little change in As species during extraction from three plants containing different As levels, the optimized method has the potential to be used for As speciation in other plants. PMID:25863504

  17. Development of nondestructive evaluation methods for structural ceramics

    SciTech Connect

    Ellingson, W.A.; Koehl, R.D.; Wilson, J.A.; Stuckey, J.B.; Engel, H.P.

    1996-04-01

    Nondestructive evaluation (NDE) methods using three-dimensional microfocus X-ray computed tomographic imaging (3D XCT) were employed to map axial and radial density variations in hot-gas filters and heat exchanger tubes. 3D XCT analysis was conducted on (a) two 38-mm-OD, 6.5-mm wall, SiC/SiC heat exchanger tubes infiltrated by CVI; (b) eight 10-cm-diam. oxide/oxide heat exchanger tubes; and (c) one 26-cm-long Nextel fiber/SiC matrix hot-gas filter. The results show that radial and axial density uniformity, as well as porosity, can be assessed by 3D XCT. NDE methods are also under development to assess thermal barrier coatings, which are being developed to protect gas-turbine first-stage hot-section metallic substrates. Further, because both shop and field joining of CFCC materials will be necessary, work is now beginning on development of NDE methods for joining.

  18. Organic analysis and analytical methods development: FY 1995 progress report

    SciTech Connect

    Clauss, S.A.; Hoopes, V.; Rau, J.

    1995-09-01

    This report describes the status of organic analyses and developing analytical methods to account for the organic components in Hanford waste tanks, with particular emphasis on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-103 (Tank 103-SY). The analytical data are to serve as an example of the status of methods development and application. Samples of the convective and nonconvective layers from Tank 103-SY were analyzed for total organic carbon (TOC). The TOC value obtained for the nonconvective layer using the hot persulfate method was 10,500 {mu}g C/g. The TOC value obtained from samples of Tank 101-SY was 11,000 {mu}g C/g. The average value for the TOC of the convective layer was 6400 {mu}g C/g. Chelators and chelator fragments in Tank 103-SY samples were identified using derivatization gas chromatography/mass spectrometry (GC/MS). Organic components were quantified using GC/flame ionization detection. Major components in both the convective and nonconvective-layer samples include ethylenediaminetetraacetic acid (EDTA), nitrilotriacetic acid (NTA), succinic acid, nitrosoiminodiacetic acid (NIDA), citric acid, and ethylenediaminetriacetic acid (ED3A). Preliminary results also indicate the presence of C16 and C18 carboxylic acids in the nonconvective-layer sample. Oxalic acid was one of the major components in the nonconvective layer as determined by derivatization GC/flame ionization detection.

  19. Analytical Failure Prediction Method Developed for Woven and Braided Composites

    NASA Technical Reports Server (NTRS)

    Min, James B.

    2003-01-01

    Historically, advances in aerospace engine performance and durability have been linked to improvements in materials. Recent developments in ceramic matrix composites (CMCs) have led to increased interest in CMCs to achieve revolutionary gains in engine performance. The use of CMCs promises many advantages for advanced turbomachinery engine development and may be especially beneficial for aerospace engines. The most beneficial aspects of CMC material may be its ability to maintain its strength to over 2500 F, its internal material damping, and its relatively low density. Ceramic matrix composites reinforced with two-dimensional woven and braided fabric preforms are being considered for NASA s next-generation reusable rocket turbomachinery applications (for example, see the preceding figure). However, the architecture of a textile composite is complex, and therefore, the parameters controlling its strength properties are numerous. This necessitates the development of engineering approaches that combine analytical methods with limited testing to provide effective, validated design analyses for the textile composite structures development.

  20. Development of a spatial method for weed detection and localization

    NASA Astrophysics Data System (ADS)

    Vioix, Jean-Baptiste; Douzals, Jean-Paul; Truchetet, Frédéric

    2004-02-01

    This paper presents an algorithm specifically developed for filtering low-frequency signals. The application is weed detection in aerial images, where crop lines appear as repetitive structures. Theoretical bases of this work are presented first. Then, two methods for selecting low-frequency signals are compared and their limitations are described. A decomposition based on wavelet packets is used to combine the advantages of both methods. This algorithm allows high selectivity of low-frequency signals at a reasonable computation time. Finally, a complete algorithm for weed/crop classification is explained and a few results are shown.
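    As a generic illustration of separating a slowly varying crop-row pattern from high-frequency "weed" speckle (not the authors' wavelet-packet algorithm), the sketch below zeroes the detail sub-bands of a 2-D wavelet decomposition; the synthetic field image, wavelet choice and threshold are assumptions for the example.

        import numpy as np
        import pywt

        def low_frequency_component(image, wavelet="db2", level=2):
            """Keep only the low-frequency (approximation) content of a greyscale image
            by zeroing every detail sub-band of a 2-D wavelet decomposition."""
            coeffs = pywt.wavedec2(image, wavelet=wavelet, level=level)
            approx, details = coeffs[0], coeffs[1:]
            zeroed = [tuple(np.zeros_like(band) for band in trio) for trio in details]
            return pywt.waverec2([approx] + zeroed, wavelet=wavelet)

        # Synthetic field image: smooth periodic "crop rows" plus sparse "weed" speckle
        cols = np.indices((128, 128))[1]
        crop_rows = 0.5 + 0.5 * np.sin(2 * np.pi * cols / 16.0)
        weeds = (np.random.default_rng(1).random((128, 128)) > 0.995).astype(float)
        field = crop_rows + weeds

        residual = field - low_frequency_component(field)[:128, :128]
        print("candidate weed pixels:", int((residual > 0.5).sum()),
              "actual weed pixels:", int(weeds.sum()))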

  1. Control of irradiated food: Recent developments in analytical detection methods.

    NASA Astrophysics Data System (ADS)

    Delincée, H.

    1993-07-01

    An overview of recent international efforts, i.e. programmes of "ADMIT" (FAO/IAEA) and of BCR (EC), towards the development of analytical detection methods for radiation-processed foods will be given. Some larger collaborative studies have already taken place, e.g. ESR of bones from chicken, pork, beef, frog legs and fish, thermoluminescence of insoluble minerals isolated from herbs and spices, GC analysis of long-chain hydrocarbons derived from the lipid fraction of chicken and other meats, and the microbiological APC/DEFT procedure for spices. These methods could soon be implemented in international standard protocols.

  2. Development of target allocation methods for LAMOST focal plate

    NASA Astrophysics Data System (ADS)

    Yuan, Hailong; Zhang, Haotong; Zhang, Yanxia; Lei, Yajuan; Dong, Yiqiao

    2014-01-01

    We first introduce the primary target allocation requirements and restrictions for the parallel-controlled multiple-fiber system used in the LAMOST spectroscopic survey, and the fiber positioner anti-collision model is incorporated. Several target allocation methods and their features are then discussed in detail, including a network flow algorithm, assignment of higher priority to fiber units holding fewer targets, a target allocation algorithm for groups, a target allocation method for add-ons, and target reallocation. Their virtues and weaknesses are analyzed for various kinds of scientific research situations. Furthermore, an optimization step based on simulated annealing (SAA) is developed to improve fiber utilization efficiency.
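
    The survey's allocation algorithms themselves are not reproduced here, but the simulated-annealing idea mentioned above can be sketched on a toy problem: fibers with a limited patrol radius, targets with priorities, and random re-assignment moves accepted with a temperature-dependent probability. All geometry, scores, and cooling parameters below are hypothetical and are not taken from the LAMOST pipeline.

```python
# Toy simulated-annealing sketch for assigning targets to fibers.
# Geometry, scores, and cooling schedule are illustrative assumptions,
# not the LAMOST survey's actual allocation pipeline.
import math
import random

random.seed(1)

N_FIBERS, N_TARGETS, RADIUS = 30, 80, 0.15
fibers = [(random.random(), random.random()) for _ in range(N_FIBERS)]
targets = [(random.random(), random.random()) for _ in range(N_TARGETS)]
priority = [random.randint(1, 5) for _ in range(N_TARGETS)]

def reachable(f, t):
    return math.dist(fibers[f], targets[t]) <= RADIUS

# Targets reachable by each fiber.
candidates = [[t for t in range(N_TARGETS) if reachable(f, t)] for f in range(N_FIBERS)]

def score(assign):
    # Reward the priorities of uniquely assigned targets, penalize collisions
    # (two fibers pointing at the same target).
    used = [t for t in assign if t is not None]
    return sum(priority[t] for t in set(used)) - 10 * (len(used) - len(set(used)))

assign = [random.choice(c) if c else None for c in candidates]
temperature = 5.0
for step in range(20000):
    f = random.randrange(N_FIBERS)
    if not candidates[f]:
        continue
    proposal = assign[:]
    proposal[f] = random.choice(candidates[f] + [None])  # None = leave fiber idle
    delta = score(proposal) - score(assign)
    if delta >= 0 or random.random() < math.exp(delta / temperature):
        assign = proposal
    temperature = max(0.01, temperature * 0.9995)

print("assigned fibers:", sum(a is not None for a in assign), "score:", score(assign))
```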

  3. A newly developed wrapping method for scintillator detectors

    NASA Astrophysics Data System (ADS)

    Stuhl, L.; Krasznahorkay, A.; Csatlós, M.; Algora, A.; Gulyás, J.; Kalinka, G.; Kertész, Zs I.; Timár, J.

    2016-01-01

    A neutron spectrometer, the European Low-Energy Neutron Spectrometer (ELENS), has been constructed to study exotic nuclei in inverse kinematics experiments. The spectrometer, consisting of scintillator bars, can be used in the neutron energy range of 100 keV to 10 MeV. To increase the light collection efficiency, a special wrapping method was developed for each bar of ELENS. By using specially heat-treated reflector foil, 15-20% better light collection is achieved. The development of the wrapping process and the results of the test experiments are also presented.

  4. Internal R and D task summary report: analytical methods development

    SciTech Connect

    Schweighardt, F.K.

    1983-07-01

    International Coal Refining Company (ICRC) conducted two research programs to develop analytical procedures for characterizing the feed, intermediates, and products of the proposed SRC-I Demonstration Plant. The major conclusion is that standard analytical methods must be defined and assigned statistical error limits of precision and reproducibility early in development. Comparing all SRC-I data or data from different processes is complex and expensive if common data correlation procedures are not followed. ICRC recommends that processes be audited analytically and statistical analyses generated as quickly as possible, in order to quantify process-dependent and -independent variables. 16 references, 10 figures, 20 tables.

  5. Development of an in vitro cloning method for Cowdria ruminantium.

    PubMed Central

    Perez, J M; Martinez, D; Debus, A; Sheikboudou, C; Bensaid, A

    1997-01-01

    Cowdria ruminantium is a tick-borne rickettsia which causes severe disease in ruminants. All studies with C. ruminantium reported so far were carried out with stocks consisting of infective blood collected from reacting animals or from the same stocks propagated in vitro. Cloned isolates are needed to conduct studies on the immune response of the host, on the genetic diversity of the parasite, and on mechanisms of attenuation and the development of vaccines. A method of cloning based on the chlamydia-like life cycle of Cowdria was developed. Instead of cloning extracellular elementary bodies, it appeared more convenient to clone endothelial cells infected by one morula resulting from the infection of the cell by one elementary body of Cowdria. Two hundred and sixteen clones were obtained by limiting dilution of infected cells. The method was experimentally validated by comparing randomly amplified polymorphic DNA fingerprints from individual clones obtained from endothelial cell cultures coinfected with two different stocks of C. ruminantium. PMID:9302217

  6. Development of motion control method for laser soldering process

    SciTech Connect

    Yerganian, S.S.

    1997-05-01

    Development of a method to generate the motion control data for sealing an electronic housing using laser soldering is described. The motion required to move the housing under the laser is a nonstandard application and was performed with a four-axis system using the timed data streaming mode capabilities of a Compumotor AT6400 indexer. A Microsoft Excel 5.0 spreadsheet (named Israuto.xls) was created to calculate the movement of the part under the laser, and macros were written into the spreadsheet to allow the user to easily create this data. A data verification method was developed for simulating the motion data. The geometry of the assembly was generated using Parametric Technology Corporation Pro/E version 15. This geometry was then converted using Pro/DADS version 3.1 from Computer Aided Design Software Inc. (CADSI), and the simulation was carried out using DADS version 8.0 from CADSI.

  7. In silico machine learning methods in drug development.

    PubMed

    Dobchev, Dimitar A; Pillai, Girinath G; Karelson, Mati

    2014-01-01

    Machine learning (ML) computational methods for predicting compounds with pharmacological activity, specific pharmacodynamic and ADMET (absorption, distribution, metabolism, excretion and toxicity) properties are being increasingly applied in drug discovery and evaluation. Recently, machine learning techniques such as artificial neural networks, support vector machines and genetic programming have been explored for predicting inhibitors, antagonists, blockers, agonists, activators and substrates of proteins related to specific therapeutic targets. These methods are particularly useful for screening compound libraries of diverse chemical structures, "noisy" and high-dimensional data to complement QSAR methods, and in cases of unavailable receptor 3D structure to complement structure-based methods. A variety of studies have demonstrated the potential of machine-learning methods for predicting compounds as potential drug candidates. The present review is intended to give an overview of the strategies and current progress in using machine learning methods for drug design and of the potential of the respective model development tools. We also review a number of applications of machine learning algorithms to common classes of diseases. PMID:25262800
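
    As a concrete example of the kind of ML screening model the review surveys, the following is a minimal scikit-learn sketch of a support vector machine classifying compounds as active or inactive from a descriptor matrix. The descriptor values and labels are random placeholders; in practice they would be computed from chemical structures.

```python
# Minimal sketch of an ML activity classifier of the kind surveyed in the
# review: an SVM on molecular descriptors. The descriptor matrix here is a
# random placeholder, not real chemistry data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
X = rng.standard_normal((200, 50))          # 200 compounds x 50 descriptors
y = (X[:, :5].sum(axis=1) > 0).astype(int)  # stand-in "active"/"inactive" labels

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(model, X, y, cv=5)
print("mean CV accuracy:", scores.mean())
```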

  8. Development of Nondestructive Inspection Methods for Composite Repair

    NASA Astrophysics Data System (ADS)

    Hsu, D. K.; Barnard, D. J.; Peters, J. J.; Dayal, V.

    2003-03-01

    This paper describes the development and implementation of two complementary nondestructive inspection methods for repairs made on aircraft composite honeycomb structures: computer aided tap testing (CATT) and air-coupled ultrasonic testing (AC-UT). The CATT, being a semi-automated and quantitative technique, is exploited to map out the interior conditions of a repaired part. The same repair is also imaged with air-coupled ultrasound, and both results are compared with those from destructive sectioning.

  9. Lattice Boltzmann method on unstructured grids: further developments.

    PubMed

    Ubertini, S; Bella, G; Succi, S

    2003-07-01

    We discuss further developments of the finite-volume lattice Boltzmann formulation on unstructured grids. It is shown that the method tolerates significant grid distortions without showing any appreciable numerical viscosity effects at second order in the mesh size. A theoretical argument of plausibility for such a property is presented. In addition, a set of boundary conditions that permits handling flows with open boundaries is introduced and numerically demonstrated for the case of channel flows and driven cavity flows. PMID:12935281

  10. Development of gait segmentation methods for wearable foot pressure sensors.

    PubMed

    Crea, S; De Rossi, S M M; Donati, M; Reberšek, P; Novak, D; Vitiello, N; Lenzi, T; Podobnik, J; Munih, M; Carrozza, M C

    2012-01-01

    We present an automated segmentation method based on the analysis of plantar pressure signals recorded from two synchronized wireless foot insoles. Given the strict limits on computational power and power consumption typical of wearable electronic components, our aim is to investigate the capability of a Hidden Markov Model machine-learning method to detect gait phases using different levels of complexity in the processing of the wearable pressure sensor signals. Three different datasets are therefore developed: raw voltage values, calibrated sensor signals, and a calibrated estimation of total ground reaction force and position of the plantar center of pressure. The method is tested on a pool of 5 healthy subjects through leave-one-out cross validation. The results show high classification performance when the estimated biomechanical variables are used, averaging 96%. Calibrated signals and raw voltage values show higher delays and dispersions in phase transition detection, suggesting lower reliability for online applications. PMID:23367055
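
    A minimal sketch of the HMM decoding step is shown below, using the hmmlearn package (an assumption; the paper does not name its implementation) and synthetic features standing in for the estimated ground reaction force and center-of-pressure channels.

```python
# Minimal sketch of HMM-based gait-phase segmentation. Uses the hmmlearn
# package (an assumption; the paper does not name its implementation) and
# synthetic features standing in for total ground reaction force and
# center-of-pressure position.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)
n_samples = 2000
phase = (np.arange(n_samples) // 25) % 4                     # fake cyclic gait phases
features = np.column_stack([
    phase + 0.3 * rng.standard_normal(n_samples),            # "GRF-like" channel
    np.sin(phase) + 0.3 * rng.standard_normal(n_samples),    # "CoP-like" channel
])

# One hidden state per gait phase; unsupervised fit, then decode.
model = GaussianHMM(n_components=4, covariance_type="diag", n_iter=100, random_state=0)
model.fit(features)
decoded = model.predict(features)
print(decoded[:50])
```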

  11. Viscous wing theory development. Volume 1: Analysis, method and results

    NASA Technical Reports Server (NTRS)

    Chow, R. R.; Melnik, R. E.; Marconi, F.; Steinhoff, J.

    1986-01-01

    Viscous transonic flows at large Reynolds numbers over 3-D wings were analyzed using a zonal viscid-inviscid interaction approach. A new numerical AFZ scheme was developed in conjunction with the finite volume formulation for the solution of the inviscid full-potential equation. A special far-field asymptotic boundary condition was developed and a second-order artificial viscosity included for an improved inviscid solution methodology. The integral method was used for the laminar/turbulent boundary layer and 3-D viscous wake calculation. The interaction calculation included the coupling conditions of the source flux due to the wing surface boundary layer, the flux jump due to the viscous wake, and the wake curvature effect. A method was also devised incorporating the 2-D trailing edge strong interaction solution for the normal pressure correction near the trailing edge region. A fully automated computer program was developed to perform the proposed method with one scalar version to be used on an IBM-3081 and two vectorized versions on Cray-1 and Cyber-205 computers.

  12. Development of Analysis Methods for Designing with Composites

    NASA Technical Reports Server (NTRS)

    Madenci, E.

    1999-01-01

    The project involved the development of new analysis methods to achieve efficient design of composite structures. We developed a complex variational formulation to analyze the in-plane and bending coupling response of an unsymmetrically laminated plate with an elliptical cutout subjected to arbitrary edge loading as shown in Figure 1. This formulation utilizes four independent complex potentials that satisfy the coupled in-plane and bending equilibrium equations, thus eliminating the area integrals from the strain energy expression. The solution to a finite geometry laminate under arbitrary loading is obtained by minimizing the total potential energy function and solving for the unknown coefficients of the complex potentials. The validity of this approach is demonstrated by comparison with finite element analysis predictions for a laminate with an inclined elliptical cutout under bi-axial loading. The geometry and loading of this laminate with a lay-up of [-45/45] are shown in Figure 2. The deformed configuration shown in Figure 3 reflects the presence of bending-stretching coupling. The validity of the present method is established by comparing the out-of-plane deflections along the boundary of the elliptical cutout from the present approach with those of the finite element method. The comparison shown in Figure 4 indicates remarkable agreement. The details of this method are described in a manuscript by Madenci et al. (1998).

  13. New developments in the multiscale hybrid energy density computational method

    NASA Astrophysics Data System (ADS)

    Min, Sun; Shanying, Wang; Dianwu, Wang; Chongyu, Wang

    2016-01-01

    Further developments in the hybrid multiscale energy density method are proposed on the basis of our previous papers. The key points are as follows. (i) The theoretical method for the determination of the weight parameter in the energy coupling equation of the transition region in the multiscale model is given via constructing underdetermined equations. (ii) By applying the developed mathematical method, the weight parameters have been given and used to treat some problems in homogeneous charge density systems, which are directly related to multiscale science. (iii) A theoretical algorithm has also been presented for treating non-homogeneous systems of charge density. The key to the theoretical computational methods is the decomposition of the electrostatic energy in the total energy of density functional theory for probing the spanning characteristic at the atomic scale, layer by layer, by which the choice of chemical elements and the defect complex effect can be understood deeply. (iv) The numerical computational program and design have also been presented. Project supported by the National Basic Research Program of China (Grant No. 2011CB606402) and the National Natural Science Foundation of China (Grant No. 51071091).

  14. The ReaxFF method - new applications and developments

    NASA Astrophysics Data System (ADS)

    van Duin, Adri

    The ReaxFF method provides a highly transferable simulation method for atomistic scale simulations of chemical reactions at the nanosecond and nanometer scale. It combines concepts of bond-order based potentials with a polarizable charge distribution. Since its initial development for hydrocarbons in 2001, we have found that this concept is transferable to elements all across the periodic table, including all first-row elements, metals, ceramics and ionic materials. For all these elements and associated materials we have demonstrated that ReaxFF can reproduce quantum mechanics-based structures, reaction energies and reaction barriers with reasonable accuracy, enabling the method to predict reaction kinetics in complicated, multi-material environments at a relatively modest computational expense. This presentation will describe the current concepts of the ReaxFF method and the current status of the various ReaxFF codes, including parallel implementations and recently developed hybrid Grand Canonical Monte Carlo options, which significantly increase its application areas. Also, we will present an overview of recent applications to a range of materials of increasing complexity, with a focus on combustion, biomaterials, batteries, tribology and catalysis.

  15. Development of acoustic sniper localization methods and models

    NASA Astrophysics Data System (ADS)

    Grasing, David; Ellwood, Benjamin

    2010-04-01

    A novel method capable of providing situational awareness of sniper fire from small arms is presented. Situational awareness (SA) information is extracted by exploiting two distinct sounds created by a small arms discharge: the muzzle blast (created when the bullet leaves the barrel of the gun) and the shockwave (the sound created by a supersonic bullet). The direction of arrival associated with the muzzle blast will always point in the direction of the shooter. Range can be estimated from the muzzle blast alone; however, at greater distances, geometric dilution of precision makes obtaining accurate range estimates difficult. To address this issue, additional information obtained from the shockwave is utilized in order to estimate range to the shooter. The focus of the paper is the development of a shockwave propagation model, the development of ballistics models (based on empirical measurements), and their subsequent application to methods of determining shooter position. Knowledge of the round's ballistics is required to estimate range to the shooter. Many existing methods rely on extracting information from the shockwave in an attempt to identify the round type and thus the ballistic model to use ([1]). It has been our experience that this information becomes unreliable at greater distances or in high-noise environments. Our method differs from existing solutions in that classification of the round type is not required, making the proposed solution more robust. Additionally, we demonstrate that sufficient accuracy can be achieved without the need to classify the round.
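
    The authors' shockwave and ballistics models are not reproduced here, but the muzzle-blast direction-of-arrival step can be illustrated with a simplified far-field calculation for a two-microphone pair; the sensor spacing and sound speed below are assumed values.

```python
# Simplified far-field direction-of-arrival (DOA) estimate from the time
# difference of arrival (TDOA) of a muzzle blast at a two-microphone pair.
# This illustrates only the DOA step, not the paper's shockwave or ballistics
# models; the sensor spacing and sound speed are assumed values.
import math

SOUND_SPEED = 343.0   # m/s, assumed
MIC_SPACING = 0.5     # m, assumed

def doa_from_tdoa(tdoa_seconds):
    """Angle from broadside (radians) for a plane wave: tdoa = d*sin(theta)/c."""
    s = SOUND_SPEED * tdoa_seconds / MIC_SPACING
    s = max(-1.0, min(1.0, s))   # guard against measurement noise
    return math.asin(s)

# Example: the blast reaches the second microphone 0.7 ms after the first.
print(math.degrees(doa_from_tdoa(0.7e-3)))
```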

  16. Accuracy of age estimation of radiographic methods using developing teeth.

    PubMed

    Maber, M; Liversidge, H M; Hector, M P

    2006-05-15

    Developing teeth are used to assess maturity and estimate age in a number of disciplines; however, the accuracy of different methods has not been systematically investigated. The aim of this study was to determine the accuracy of several methods. Tooth formation was assessed from radiographs of healthy children attending a dental teaching hospital. The sample was 946 children (491 boys, 455 girls, aged 3-16.99 years) with similar numbers of children of Bangladeshi and British Caucasian ethnic origin. Panoramic radiographs were examined and seven mandibular teeth staged according to Demirjian's dental maturity scale [A. Demirjian, Dental development, CD-ROM, Silver Platter Education, University of Montreal, Montreal, 1993-1994; A. Demirjian, H. Goldstein, J.M. Tanner, A new system of dental age assessment, Hum. Biol. 45 (1973) 211-227; A. Demirjian, H. Goldstein, New systems for dental maturity based on seven and four teeth, Ann. Hum. Biol. 3 (1976) 411-421], Nolla [C.M. Nolla, The development of the permanent teeth, J. Dent. Child. 27 (1960) 254-266] and Haavikko [K. Haavikko, The formation and the alveolar and clinical eruption of the permanent teeth. An orthopantomographic study. Proc. Finn. Dent. Soc. 66 (1970) 103-170]. Dental age was calculated for each method, including an adaptation of Demirjian's method with updated scoring [G. Willems, A. Van Olmen, B. Spiessens, C. Carels, Dental age estimation in Belgian children: Demirjian's technique revisited, J. Forensic Sci. 46 (2001) 893-895]. The mean difference (+/-S.D. in years) between dental and real age was calculated for each method and, in the case of Haavikko, each tooth type, and tested using a t-test. The mean difference was also calculated for the age group 3-13.99 years for Haavikko (mean and individual teeth). Results show that the most accurate method was that of Willems [G. Willems, A. Van Olmen, B. Spiessens, C. Carels, Dental age estimation in Belgian children: Demirjian's technique revisited, J. Forensic Sci

  17. Development of nondestructive evaluation methods for ceramic coatings.

    SciTech Connect

    Sun, J. G.

    2007-01-01

    Various nondestructive evaluation (NDE) technologies are being developed to advance the knowledge of ceramic coatings for components in the hot gas-path of advanced, low-emission gas-fired turbine engines. The ceramic coating systems being studied by NDE include thermal barrier coatings (TBCs) and environmental barrier coatings (EBCs). TBCs are under development for vanes, blades and combustor liners to allow hotter gas-path temperatures, and EBCs are under development to reduce environmental damage to high-temperature components made of ceramic matrix composites (CMCs). Data provided by NDE methods will be used to: (a) assess the reliability of new coating application processes, (b) identify defective components that could cause unscheduled outages, (c) track growth rates of defects during use in engines, and (d) allow rational judgement for replace/repair/re-use decisions on components.

  18. User Experience Evaluation Methods in Product Development (UXEM'09)

    NASA Astrophysics Data System (ADS)

    Roto, Virpi; Väänänen-Vainio-Mattila, Kaisa; Law, Effie; Vermeeren, Arnold

    High quality user experience (UX) has become a central competitive factor of product development in mature consumer markets [1]. Although the term UX originated from industry and is a widely used term also in academia, the tools for managing UX in product development are still inadequate. A prerequisite for designing delightful UX in an industrial setting is to understand both the requirements tied to the pragmatic level of functionality and interaction and the requirements pertaining to the hedonic level of personal human needs, which motivate product use [2]. Understanding these requirements helps managers set UX targets for product development. The next phase in a good user-centered design process is to iteratively design and evaluate prototypes [3]. Evaluation is critical for systematically improving UX. In many approaches to UX, evaluation basically needs to be postponed until the product is fully or at least almost fully functional. However, in an industrial setting, it is very expensive to find the UX failures only at this phase of product development. Thus, product development managers and developers have a strong need to conduct UX evaluation as early as possible, well before all the parts affecting the holistic experience are available. Different types of products require evaluation on different granularity and maturity levels of a prototype. For example, due to its multi-user characteristic, a community service or an enterprise resource planning system requires a broader scope of UX evaluation than a microwave oven or a word processor that is meant for a single user at a time. Before systematic UX evaluation can be taken into practice, practical, lightweight UX evaluation methods suitable for different types of products and different phases of product readiness are needed. A considerable amount of UX research is still about the conceptual frameworks and models for user experience [4]. Besides, applying existing usability evaluation methods (UEMs) without

  19. Evaluating a physician leadership development program - a mixed methods approach.

    PubMed

    Throgmorton, Cheryl; Mitchell, Trey; Morley, Tom; Snyder, Marijo

    2016-05-16

    Purpose - With the extent of change in healthcare today, organizations need strong physician leaders. To compensate for the lack of physician leadership education, many organizations are sending physicians to external leadership programs or developing in-house leadership programs targeted specifically to physicians. The purpose of this paper is to outline the evaluation strategy and outcomes of the inaugural year of a Physician Leadership Academy (PLA) developed and implemented at a Michigan-based regional healthcare system. Design/methodology/approach - The authors applied the theoretical framework of Kirkpatrick's four levels of evaluation and used surveys, observations, activity tracking, and interviews to evaluate the program outcomes. The authors applied grounded theory techniques to the interview data. Findings - The program met targeted outcomes across all four levels of evaluation. Interview themes focused on the significance of increasing self-awareness, building relationships, applying new skills, and building confidence. Research limitations/implications - While only one example, this study illustrates the importance of developing the evaluation strategy as part of the program design. Qualitative research methods, often lacking from learning evaluation design, uncover rich themes of impact. The study supports how a PLA program can enhance physician learning, engagement, and relationship building throughout and after the program. Physician leaders' partnership with organization development and learning professionals yields results with impact for individuals, groups, and the organization. Originality/value - Few studies provide an in-depth review of evaluation methods and outcomes of physician leadership development programs. Healthcare organizations seeking to develop similar in-house programs may benefit from applying the evaluation strategy outlined in this study. PMID:27119393

  20. The NASA digital VGH program. Exploration of methods and final results. Volume 1: Development of methods

    NASA Technical Reports Server (NTRS)

    Crabill, Norman L.

    1989-01-01

    Two hundred hours of Lockheed L 1011 digital flight data recorder data taken in 1973 were used to develop methods and procedures for obtaining statistical data useful for updating airliner airworthiness design criteria. Five thousand hours of additional data taken in 1978 to 1982 are reported in volumes 2, 3, 4 and 5.

  1. Development of NDE methods for hot gas filters.

    SciTech Connect

    Deemer, C.; Ellingson, W. A.; Koehl, E. R.; Lee, H.; Spohnholtz, T.; Sun, J. G.

    1999-07-21

    Ceramic hot gas candle filters are currently under development for hot gas particulate cleanup in advanced coal-based power systems. The ceramic materials for these filters include nonoxide monolithics, nonoxide-fiber-reinforced composites, and nonoxide reticulated foam. A concern is the lack of reliable data on which to base decisions for reusing or replacing hot gas filters during plant shutdowns. The work in this project is aimed at developing nondestructive evaluation (NDE) technology to allow detection, and determination of extent, of life-limiting characteristics such as thermal fatigue, oxidation, damage from ash bridging such as localized cracking, damage from local burning, and elongation at elevated temperature. Although in-situ NDE methods are desirable in order to avoid disassembly of the candle filter vessels, the current vessel designs, the presence of filter cakes and possible ash bridging, and the state of NDE technology prevent this. Candle filter producers use a variety of NDE methods to ensure as-produced quality. While impact acoustic resonance offers initial promise for examining new as-produced filters and for detecting damage in some monolithic filters when removed from service, it presents difficulties in data interpretation, it lacks localization capability, and its applicability to composites has yet to be demonstrated. Additional NDE technologies being developed and evaluated in this program, and whose applicability to both monolithics and composites has been demonstrated, include (a) full-scale thermal imaging for analyzing thermal property variations; (b) high-spatial-resolution X-ray imaging for detecting density variations and dimensional changes; (c) air-coupled ultrasonic methods for determining through-thickness compositional variations; and (d) acoustic emission technology with mechanical loading for detecting localized bulk damage. New and exposed clay-bonded SiC filters and CVI-SiC composite filters have been tested with

  2. Development of an Immunoaffinity Method for Purification of Streptokinase

    PubMed Central

    Karimi, Zohreh; Babashamsi, Mohammad; Asgarani, Ezat; Salimi, Ali

    2012-01-01

    Background: Streptokinase is a potent activator of plasminogen to plasmin, the enzyme that can solubilize the fibrin network in blood clots. Streptokinase is currently used in clinical medicine as a thrombolytic agent. It is naturally secreted by β-hemolytic streptococci. Methods: To reach an efficient method of purification, an immunoaffinity chromatography method was developed that could purify the streptokinase in a single step with high yield. At the first stage, a CNBr-activated sepharose 4B-lysine column was made to purify human blood plasminogen. The purified plasminogen was utilized to construct a column that could purify the streptokinase. A rabbit was immunized with the purified streptokinase and the anti-streptokinase (IgG) was purified on another streptokinase-substituted sepharose-4B column. The immunoaffinity column was developed by coupling the purified anti-streptokinase (IgG) to sepharose 6MB-Protein A. The Escherichia coli (E. coli) BL21 (DE3) pLysS strain was transformed with the recombinant construct (cloned streptokinase gene in the pGEX-4T-2 vector) and gene expression was induced by IPTG. The expressed protein was purified by immunoaffinity chromatography in a single step. Results: The immunoaffinity column could purify the recombinant fusion GST-SK to homogeneity. The purity of the streptokinase was confirmed by SDS-PAGE as a single band of about 71 kDa and its biological activity was determined in a specific streptokinase assay. The yield of the purification was about 94%. Conclusion: This method of streptokinase purification is superior to previous conventional methods. PMID:23408770

  3. Development of nondestructive evaluation methods for structural ceramics.

    SciTech Connect

    Ellingson, W. A.

    1998-08-19

    During the past year, the focus of our work on nondestructive evaluation (NDE) methods was on the development and application of these methods to technologies such as ceramic matrix composite (CMC) hot-gas filters, CMC high-temperature heat exchangers, and CMC ceramic/ceramic joining. Such technologies are critical to the "Vision 21 Energy-Plex Fleet" of modular, high-efficiency, low-emission power systems. Specifically, our NDE work has continued toward faster, higher-sensitivity, volumetric X-ray computed tomographic imaging with new amorphous silicon detectors to detect and measure axial and radial density variations in hot-gas filters and heat exchangers; explored the potential use of high-speed focal-plane-array infrared imaging technology to detect delaminations and variations in the thermal properties of SiC/SiC heat exchangers; and explored various NDE methods to characterize CMC joints in cooperation with various industrial partners. Work this year also supported the Southern Company Services Inc. Power Systems Development Facility, where NDE is needed to assess the condition of hot-gas candle filters. This paper presents the results of these efforts.

  4. Development of nondestructive evaluation methods for ceramic coatings.

    SciTech Connect

    Ellingson, W. A.; Deemer, C.; Sun, J. G.; Erdman, S.; Muliere, D.; Wheeler, B.

    2002-04-29

    Various nondestructive evaluation (NDE) technologies are being developed to study the use of ceramic coatings on components in the hot-gas path of advanced low-emission gas-fired turbines. The types of ceramic coatings include thermal barrier coatings (TBCs) and environmental barrier coatings (EBCs). TBCs are under development for vanes, blades, and combustor liners to allow hotter gas-path temperatures, and EBCs are under development to reduce environmental damage to high-temperature components made of ceramic matrix composites. The NDE methods will be used to (a) provide data to assess the reliability of new coating application processes, (b) identify defective components that could cause unscheduled outages, (c) track growth rates of defects during component use in engines, and (d) allow rational judgment for replace/repair/re-use decisions regarding components. Advances in TBC application, both electron beam-physical vapor deposition (EB-PVD) and air plasma spraying (APS), are allowing higher temperatures in the hot-gas path. However, as TBCs become "prime reliant," their condition at scheduled or unscheduled outages must be known. NDE methods are under development to assess the condition of the TBC for pre-spall conditions. EB-PVD test samples with up to 70 thermal cycles have been studied by a newly developed method involving polarized laser back-scatter NDE. Results suggest a correlation between the NDE laser data and the TBC/bond-coat topography. This finding is important because several theories directed toward understanding the pre-spall condition suggest that the topography in the thermally grown oxide layer changes significantly as a function of the number of thermal cycles. Tests have also been conducted with this NDE method on APS TBCs. Results suggest that the pre-spall condition is detected for these coatings. One-sided, high-speed thermal imaging also has shown promise for NDE of APS coatings. Testing of SiC/SiC composites for combustor liners

  5. Development of computational methods for heavy lift launch vehicles

    NASA Technical Reports Server (NTRS)

    Yoon, Seokkwan; Ryan, James S.

    1993-01-01

    The research effort has been focused on the development of an advanced flow solver for complex viscous turbulent flows with shock waves. The three-dimensional Euler and full/thin-layer Reynolds-averaged Navier-Stokes equations for compressible flows are solved on structured hexahedral grids. The Baldwin-Lomax algebraic turbulence model is used for closure. The space discretization is based on a cell-centered finite-volume method augmented by a variety of numerical dissipation models with optional total variation diminishing limiters. The governing equations are integrated in time by an implicit method based on lower-upper factorization and symmetric Gauss-Seidel relaxation. The algorithm is vectorized on diagonal planes of sweep using two-dimensional indices in three dimensions. A new computer program named CENS3D has been developed for viscous turbulent flows with discontinuities. Details of the code are described in Appendix A and Appendix B. With the developments of the numerical algorithm and dissipation model, the simulation of three-dimensional viscous compressible flows has become more efficient and accurate. The results of the research are expected to yield a direct impact on the design process of future liquid fueled launch systems.
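
    The abstract mentions an implicit scheme built on lower-upper factorization with symmetric Gauss-Seidel relaxation. The sketch below shows only the generic symmetric Gauss-Seidel idea (a forward sweep followed by a backward sweep) applied to a small linear system; it is not the CENS3D implementation.

```python
# Minimal sketch of symmetric Gauss-Seidel relaxation (a forward sweep
# followed by a backward sweep) on a small linear system Ax = b. This only
# illustrates the relaxation concept; it is not the CENS3D implementation.
import numpy as np

def symmetric_gauss_seidel(A, b, x0=None, sweeps=50):
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(sweeps):
        for order in (range(n), reversed(range(n))):   # forward, then backward
            for i in order:
                sigma = A[i, :] @ x - A[i, i] * x[i]   # off-diagonal contribution
                x[i] = (b[i] - sigma) / A[i, i]
    return x

# Diagonally dominant test system.
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([3.0, 2.0, 3.0])
x = symmetric_gauss_seidel(A, b)
print(x, np.allclose(A @ x, b, atol=1e-8))
```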

  6. Recent developments in methods for analysis of perfluorinated persistent pollutants.

    PubMed

    Trojanowicz, Marek; Koc, Mariusz

    2013-01-01

    Perfluoroalkyl substances (PFASs) have proliferated in the environment on a global scale and are present in the organisms of animals and humans even in remote locations. Persistent organic pollutants of this kind have therefore stimulated substantial improvement in analytical methods. The aim of this review is to present recent achievements in PFASs determination in various matrices with different methods and to compare them with measurements of Total Organic Fluorine (TOF). Analytical methods used for PFASs determinations are dominated by chromatography, mostly in combination with mass spectrometric detection. However, HPLC may also be hyphenated with conductivity or fluorimetric detection, and gas chromatography may be combined with flame ionization or electron capture detection. The presence of a large number of PFAS species in environmental and biological samples necessitates parallel attempts to develop a total PFASs index that reflects the total content of PFASs in various matrices. Increasing attention is currently paid to the determination of branched isomers of PFASs, and to their determination in food. PMID:23913984

  7. Developing a Method to Mask Trees in Commercial Multispectral Imagery

    NASA Astrophysics Data System (ADS)

    Becker, S. J.; Daughtry, C. S. T.; Jain, D.; Karlekar, S. S.

    2015-12-01

    The US Army has an increasing focus on using automated remote sensing techniques with commercial multispectral imagery (MSI) to map urban and peri-urban agricultural and vegetative features; however, similar spectral profiles between trees (i.e., forest canopy) and other vegetation result in confusion between these cover classes. Established vegetation indices, like the Normalized Difference Vegetation Index (NDVI), are typically not effective in reliably differentiating between trees and other vegetation. Previous research in tree mapping has included integration of hyperspectral imagery (HSI) and LiDAR for tree detection and species identification, as well as the use of MSI to distinguish tree crowns from non-vegetated features. This project developed a straightforward method to model and also mask out trees from eight-band WorldView-2 (1.85 meter x 1.85 meter resolution at nadir) satellite imagery at the Beltsville Agricultural Research Center in Beltsville, MD spanning 2012 - 2015. The study site included tree cover, a range of agricultural and vegetative cover types, and urban features. The modeling method exploits the product of the red and red edge bands and defines accurate thresholds between trees and other land covers. Results show this method outperforms established vegetation indices including the NDVI, Soil Adjusted Vegetation Index, Normalized Difference Water Index, Simple Ratio, and Normalized Difference Red Edge Index in correctly masking trees while preserving the other information in the imagery. This method is useful when HSI and LiDAR collection are not possible or when using archived MSI.
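
    The band-product thresholding idea can be sketched in a few lines of NumPy. The synthetic reflectance arrays, the threshold value, and its direction are illustrative assumptions; in the study the bands come from the WorldView-2 image and the thresholds were derived empirically.

```python
# Minimal sketch of the band-product masking idea: threshold the product of
# the red and red-edge bands to flag tree pixels. The synthetic reflectance
# arrays, the threshold value, and its direction are illustrative assumptions;
# in practice the bands would be read from the WorldView-2 image.
import numpy as np

rng = np.random.default_rng(7)
red      = rng.uniform(0.02, 0.30, size=(100, 100))   # stand-in red-band reflectance
red_edge = rng.uniform(0.10, 0.50, size=(100, 100))   # stand-in red-edge reflectance

product = red * red_edge
TREE_THRESHOLD = 0.03            # assumed value; derived empirically in the study
tree_mask = product < TREE_THRESHOLD

masked_red = np.where(tree_mask, np.nan, red)   # mask tree pixels out of the band
print("tree pixels flagged:", int(tree_mask.sum()))
```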

  8. Priorities for development of research methods in occupational cancer.

    PubMed Central

    Ward, Elizabeth M; Schulte, Paul A; Bayard, Steve; Blair, Aaron; Brandt-Rauf, Paul; Butler, Mary Ann; Dankovic, David; Hubbs, Ann F; Jones, Carol; Karstadt, Myra; Kedderis, Gregory L; Melnick, Ronald; Redlich, Carrie A; Rothman, Nathaniel; Savage, Russell E; Sprinker, Michael; Toraason, Mark; Weston, Ainsley; Olshan, Andrew F; Stewart, Patricia; Zahm, Sheila Hoar

    2003-01-01

    Occupational cancer research methods was identified in 1996 as 1 of 21 priority research areas in the National Occupational Research Agenda (NORA). To implement NORA, teams of experts from various sectors were formed and given the charge to further define research needs and develop strategies to enhance or augment research in each priority area. This article is a product of that process. Focus on occupational cancer research methods is important both because occupational factors play a significant role in a number of cancers, resulting in significant morbidity and mortality, and also because occupational cohorts (because of higher exposure levels) often provide unique opportunities to evaluate health effects of environmental toxicants and understand the carcinogenic process in humans. Despite an explosion of new methods for cancer research in general, these have not been widely applied to occupational cancer research. In this article we identify needs and gaps in occupational cancer research methods in four broad areas: identification of occupational carcinogens, design of epidemiologic studies, risk assessment, and primary and secondary prevention. Progress in occupational cancer will require interdisciplinary research involving epidemiologists, industrial hygienists, toxicologists, and molecular biologists. PMID:12524210

  9. Development of Stability-Indicating Methods for Cefquinome Sulphate

    PubMed Central

    Shantier, Shaza W.; Gadkariem, Elrasheed A.; Adam, Mohamed O.; Mohamed, Magdi A.

    2013-01-01

    The degradation behavior of cefquinome sulphate in alkaline medium at different temperatures was investigated using both first-derivative spectrophotometric and HPLC methods. The drug degradation was found to be pH and temperature dependent. The pH-rate profile indicated a first-order dependence of Kobs on [OH-] at pH values ranging between 9 and 11. The Arrhenius plot obtained at pH 10 was linear between 65 and 100 °C. The estimated activation energy of the hydrolysis was found to be 21.1 kcal mol-1. A stability-indicating thin-layer chromatographic method for the separation of the drug and its alkaline hydrolysis product has been developed. PMID:24170991
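
    The activation energy reported above comes from an Arrhenius treatment; the sketch below shows how such a value is extracted from a linear fit of ln k versus 1/T. The rate constants are hypothetical and are not the measured cefquinome data.

```python
# Sketch of extracting an activation energy from an Arrhenius plot:
# ln k is linear in 1/T with slope -Ea/R. The rate constants below are
# hypothetical, not the measured cefquinome data.
import numpy as np

R = 1.987e-3                      # gas constant, kcal mol^-1 K^-1
temps_C = np.array([65.0, 75.0, 85.0, 100.0])
temps_K = temps_C + 273.15
k_obs = np.array([2.1e-4, 5.5e-4, 1.4e-3, 5.0e-3])   # hypothetical rate constants

slope, intercept = np.polyfit(1.0 / temps_K, np.log(k_obs), 1)
activation_energy = -slope * R
print(f"Ea ~ {activation_energy:.1f} kcal/mol")
```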

  10. Accelerated molecular dynamics methods: introduction and recent developments

    SciTech Connect

    Uberuaga, Blas Pedro; Voter, Arthur F; Perez, Danny; Shim, Y; Amar, J G

    2009-01-01

    A long-standing limitation in the use of molecular dynamics (MD) simulation is that it can only be applied directly to processes that take place on very short timescales: nanoseconds if empirical potentials are employed, or picoseconds if we rely on electronic structure methods. Many processes of interest in chemistry, biochemistry, and materials science require study over microseconds and beyond, due either to the natural timescale for the evolution or to the duration of the experiment of interest. Ignoring the case of liquids xxx, the dynamics on these time scales is typically characterized by infrequent-event transitions from state to state, usually involving an energy barrier. There is a long and venerable tradition in chemistry of using transition state theory (TST) [10, 19, 23] to directly compute rate constants for these kinds of activated processes. If needed, dynamical corrections to the TST rate, and even quantum corrections, can be computed to achieve an accuracy suitable for the problem at hand. These rate constants then allow us to understand the system behavior on longer time scales than we can directly reach with MD. For complex systems with many reaction paths, the TST rates can be fed into a stochastic simulation procedure such as kinetic Monte Carlo xxx, and a direct simulation of the advance of the system through its possible states can be obtained in a probabilistically exact way. A problem that has become more evident in recent years, however, is that for many systems of interest there is a complexity that makes it difficult, if not impossible, to determine all the relevant reaction paths to which TST should be applied. This is a serious issue, as omitted transition pathways can have uncontrollable consequences on the simulated long-time kinetics. Over the last decade or so, we have been developing a new class of methods for treating the long-time dynamics in these complex, infrequent-event systems. Rather than trying to guess in advance what
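
    As a small illustration of the kinetic Monte Carlo step that TST-derived rates feed into, the following is a minimal residence-time (Gillespie-type) sketch: one escape pathway is chosen with probability proportional to its rate, and the clock advances by an exponentially distributed waiting time. The rate values are placeholders.

```python
# Minimal residence-time kinetic Monte Carlo (KMC) step of the kind that
# TST-derived rate constants feed into. The escape rates are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def kmc_step(rates):
    """Pick one escape pathway with probability r_i / R and advance the clock."""
    rates = np.asarray(rates, dtype=float)
    total = rates.sum()
    pathway = rng.choice(len(rates), p=rates / total)
    dt = -np.log(1.0 - rng.random()) / total     # exponentially distributed wait
    return pathway, dt

time, trajectory = 0.0, []
for _ in range(5):
    path, dt = kmc_step([1.0e6, 2.5e5, 4.0e4])   # placeholder rates in 1/s
    time += dt
    trajectory.append(path)
print(trajectory, f"elapsed time ~ {time:.2e} s")
```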

  11. Identifying emerging research collaborations and networks: method development.

    PubMed

    Dozier, Ann M; Martina, Camille A; O'Dell, Nicole L; Fogg, Thomas T; Lurie, Stephen J; Rubinstein, Eric P; Pearson, Thomas A

    2014-03-01

    Clinical and translational research is a multidisciplinary, collaborative team process. To evaluate this process, we developed a method to document emerging research networks and collaborations in our medical center to describe their productivity and viability over time. Using an e-mail survey, sent to 1,620 clinical and basic science full- and part-time faculty members, respondents identified their research collaborators. Initial analyses, using Pajek software, assessed the feasibility of using social network analysis (SNA) methods with these data. Nearly 400 respondents identified 1,594 collaborators across 28 medical center departments resulting in 309 networks with 5 or more collaborators. This low-burden approach yielded a rich data set useful for evaluation using SNA to: (a) assess networks at several levels of the organization, including intrapersonal (individuals), interpersonal (social), organizational/institutional leadership (tenure and promotion), and physical/environmental (spatial proximity) and (b) link with other data to assess the evolution of these networks. PMID:24019209
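
    A minimal sketch of the network-construction step is shown below using the networkx package (a substitution for illustration; the authors used Pajek). The survey responses are hypothetical respondent-collaborator pairs, and components with five or more members are counted as networks.

```python
# Minimal sketch of the network-building step using networkx (a substitution;
# the authors used Pajek). Survey rows list a respondent and a named
# collaborator; components with five or more members are counted as networks.
import networkx as nx

# Hypothetical survey responses: (respondent, collaborator) pairs.
edges = [
    ("fac_01", "fac_02"), ("fac_01", "fac_03"), ("fac_02", "fac_04"),
    ("fac_03", "fac_05"), ("fac_06", "fac_07"), ("fac_08", "fac_09"),
    ("fac_09", "fac_10"), ("fac_10", "fac_08"),
]

G = nx.Graph()
G.add_edges_from(edges)

components = [c for c in nx.connected_components(G) if len(c) >= 5]
print("networks with >= 5 collaborators:", len(components))
print("highest-degree member:", max(G.degree, key=lambda kv: kv[1]))
```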

  12. Development of fire test methods for airplane interior materials

    NASA Technical Reports Server (NTRS)

    Tustin, E. A.

    1978-01-01

    Fire tests were conducted in a 737 airplane fuselage at NASA-JSC to characterize jet fuel fires in open steel pans (simulating post-crash fire sources and a ruptured airplane fuselage) and to characterize fires in some common combustibles (simulating in-flight fire sources). Design post-crash and in-flight fire source selections were based on these data. Large panels of airplane interior materials were exposed to closely-controlled large scale heating simulations of the two design fire sources in a Boeing fire test facility utilizing a surplused 707 fuselage section. Small samples of the same airplane materials were tested by several laboratory fire test methods. Large scale and laboratory scale data were examined for correlative factors. Published data for dangerous hazard levels in a fire environment were used as the basis for developing a method to select the most desirable material where trade-offs in heat, smoke and gaseous toxicant evolution must be considered.

  13. ROOM: A recursive object oriented method for information systems development

    SciTech Connect

    Thelliez, T.; Donahue, S.

    1994-02-09

    Although complementary for the development of complex systems, top-down structured design and the object-oriented approach are still treated as opposed and are not integrated. As the complexity of systems continues to grow, and the so-called software crisis is still not solved, it is urgent to provide a framework that mixes the two paradigms. This paper presents an elegant attempt in this direction through our Recursive Object-Oriented Method (ROOM), in which a top-down approach divides the complexity of the system and an object-oriented method studies a given level of abstraction. Illustrating this recursive schema with a simple example, we demonstrate that we achieve the goal of creating loosely coupled and reusable components.

  14. Development of a Simple Method for Concentrating Enteroviruses from Oysters

    PubMed Central

    Sobsey, Mark D.; Wallis, Craig; Melnick, Joseph L.

    1975-01-01

    The development of a simple method for concentrating enteroviruses from oysters is described. In this method viruses in homogenized oyster tissues are efficiently adsorbed to oyster solids at pH 5.5 and low salt concentration. After low-speed centrifugation, the supernatant is discarded and viruses are eluted from the sedimented oyster solids by resuspending them in pH 3.5 glycine-buffered saline. The solids are then removed by low-speed centrifugation, and the virus-containing supernatant is filtered through a 0.2-μm porosity filter to remove bacteria and other small particulates without removing viruses. The virus-containing filtrate is then concentrated to a volume of a few milliliters by ultrafiltration, and the concentrate obtained is inoculated directly into cell cultures for virus assay. When tested with pools of oysters experimentally contaminated with small amounts of different enteroviruses, virus recovery efficiency averaged 63%. PMID:234154

  15. Identifying emerging research collaborations and networks: Method development

    PubMed Central

    Dozier, Ann M.; Martina, Camille A.; O’Dell, Nicole L.; Fogg, Thomas T.; Lurie, Stephen J.; Rubinstein, Eric P.; Pearson, Thomas A.

    2014-01-01

    Clinical and translational research is a multidisciplinary, collaborative team process. To evaluate this process, we developed a method to document emerging research networks and collaborations in our medical center to describe their productivity and viability over time. Using an email survey, sent to 1,620 clinical and basic science full- and part-time faculty members, respondents identified their research collaborators. Initial analyses, using Pajek software, assessed the feasibility of using social network analysis (SNA) methods with these data. Nearly 400 respondents identified 1,594 collaborators across 28 medical center departments resulting in 309 networks with 5 or more collaborators. This low-burden approach yielded a rich dataset useful for evaluation using SNA to: a) assess networks at several levels of the organization, including intrapersonal (individuals), interpersonal (social), organizational/institutional leadership (tenure and promotion), and physical/environmental (spatial proximity) and b) link with other data to assess the evolution of these networks. PMID:24019209

  16. Development of nondestructive evaluation methods for structural ceramics

    SciTech Connect

    Ellingson, W.A.; Koehl, R.D.; Stuckey, J.B.; Sun, J.G.; Engel, H.P.; Smith, R.G.

    1997-06-01

    Development of nondestructive evaluation (NDE) methods for application to fossil energy systems continues in three areas: (a) mapping axial and radial density gradients in hot gas filters, (b) characterization of the quality of continuous fiber ceramic matrix composite (CFCC) joints, and (c) characterization and detection of defects in thermal barrier coatings. In this work, X-ray computed tomographic imaging was further developed and used to map variations in the axial and radial density of two full-length (2.3-m) hot gas filters. The two filters differed in through-wall density because of the thickness of the coating on the continuous fibers. Differences in axial and through-wall density were clearly detected. Through-transmission infrared imaging with a highly sensitive focal-plane-array camera was used to assess joint quality in two sets of SiC/SiC CFCC joints. High-frame-rate data capture suggests that the infrared imaging method holds potential for the characterization of CFCC joints. Work to develop NDE methods that can be used to evaluate electron beam physical vapor deposited coatings with platinum-aluminide (Pt-Al) bonds was undertaken. Coatings of zirconia with thicknesses of 125 µm (0.005 in.), 190 µm (0.0075 in.), and 254 µm (0.010 in.) with a Pt-Al bond coat on Rene N5 Ni-based superalloy were studied by infrared imaging. Currently, it appears that thickness variation, as well as thermal properties, can be assessed by infrared technology.

  17. Electromagnetic methods for development and production: State of the art

    SciTech Connect

    Wilt, M.; Alumbaugh, D.

    1997-10-01

    Electromagnetic (EM) methods, long used for borehole logging as a formation evaluation tool in developed oil fields, are rarely applied in surface or crosshole configurations or in cased wells. This is largely due to the high levels of cultural noise and the preponderance of steel well casing. However, recent experimental success with crosshole EM systems for water and steam flood monitoring using fiberglass-cased wells has shown promise for applying these techniques to development and production (D & P) problems. This paper describes technological solutions that will allow for successful application of EM techniques in oil fields, despite surface noise and steel casing. First, an example cites the application of long-offset logging to map resistivity structure away from the borehole. Next, a successful application of crosshole EM where one of the wells is steel cased is described. The potential application of Earth's-field nuclear magnetic resonance (NMR) to map fluid saturation at large distances from the boreholes is also discussed.

  18. Developments in flow visualization methods for flight research

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.; Obara, Clifford J.; Manuel, Gregory S.; Lee, Cynthia C.

    1990-01-01

    With the introduction of modern airplanes utilizing laminar flow, flow visualization has become an important diagnostic tool in determining aerodynamic characteristics such as surface flow direction and boundary-layer state. A refinement of the sublimating chemical technique has been developed to define both the boundary-layer transition location and the transition mode. In response to the need for flow visualization at subsonic and transonic speeds and altitudes above 20,000 feet, the liquid crystal technique has been developed. A third flow visualization technique that has been used is infrared imaging, which offers non-intrusive testing over a wide range of test conditions. A review of these flow visualization methods and recent flight results is presented for a variety of modern aircraft and flight conditions.

  19. Development of impact design methods for ceramic gas turbine components

    NASA Technical Reports Server (NTRS)

    Song, J.; Cuccio, J.; Kington, H.

    1990-01-01

    Impact damage prediction methods are being developed to aid in the design of ceramic gas turbine engine components with improved impact resistance. Two impact damage modes were characterized: local, near the impact site, and structural, usually fast fracture away from the impact site. Local damage to Si3N4 impacted by Si3N4 spherical projectiles consists of ring and/or radial cracks around the impact point. In a mechanistic model being developed, impact damage is characterized as microcrack nucleation and propagation. The extent of damage is measured as volume fraction of microcracks. Model capability is demonstrated by simulating late impact tests. Structural failure is caused by tensile stress during impact exceeding material strength. The EPIC3 code was successfully used to predict blade structural failures in different size particle impacts on radial and axial blades.

  20. Development of methods to predict agglomeration and disposition in FBCs

    SciTech Connect

    Mann, M.D.; Henderson, A.K.; Swanson, M.K.; Erickson, T.A.

    1995-11-01

    This 3-year, multiclient program is providing the information needed to determine the behavior of inorganic components in FBC units using advanced methods of analysis coupled with bench-scale combustion experiments. The major objectives of the program are as follows: (1) To develop further our advanced ash and deposit characterization techniques to quantify the effects of the liquid-phase components in terms of agglomerate formation and ash deposits, (2) To determine the mechanisms of inorganic transformations that lead to bed agglomeration and ash deposition in FBC systems, and (3) To develop a better means to predict the behavior of inorganic components as a function of coal composition, bed material characteristics, and combustion conditions.

  1. Methods to Characterize Ricin for the Development of Reference Materials

    PubMed Central

    Kim, Sook-Kyung; Hancock, Diane K.; Wang, Lili; Cole, Kenneth D.; Reddy, Prasad T.

    2006-01-01

    Ricin is an abundant protein from the castor bean plant Ricinus communis. Because of its high toxicity and the simplicity of producing mass quantities, ricin is considered a biological terrorism agent. We have characterized ricin extensively with a view to develop Reference Materials that could be used to test and calibrate detection devices. The characterization of ricin includes: 1) purity test of a commercial batch of ricin using electrophoresis in polyacrylamide gels, 2) biological activity assay by measuring its ability to inhibit protein synthesis, 3) quantitation of protein concentration by amino acid analysis, 4) detection of ricin by an immunoassay using a flow cytometer, and 5) detection of ricin genomic DNA by polymerase chain reaction using nine different primer sets. By implementing these five methods of characterization, we are in a position to develop a reference material for ricin. PMID:27274935

  2. Exploring the Application of Community Development Methods on Water Research in Developing Countries

    NASA Astrophysics Data System (ADS)

    Crane, P. E.

    2012-12-01

    In research and community development focused on water in developing countries, there is a common focus on issues of water quantity and quality. In the best circumstances both are innovative - bringing understanding and solutions to resource-poor regions that are appropriate to their unique situations. But the underlying methods and measures for success often differ significantly. Applying critical aspects of community development methods to water research in developing countries could increase the probability of identifying innovative and sustainable solutions. This is examined through two case studies: the first identifies common methods across community development projects in six African countries, and the second examines water quality research performed in Benin, West Africa through the lens of these methods. The first case study is taken from observations gathered between 2008 and 2012 of community development projects focused on water quantity and quality in six sub-Saharan African countries implemented through different non-governmental organizations. These projects took place in rural and peri-urban regions where public utilities were few to none, the incidence of diarrheal disease was high, and most adults had received little formal education. The water projects included drilling of boreholes, building of rainwater tanks, oasis rehabilitation, spring protection, and household biosand filters. All solutions were implemented with hygiene and sanitation components. Although these projects occurred in a wide array of cultural, geographical and climatic regions, the most successful projects shared methods of implementation. These methods are: high levels of stakeholder participation, environmental and cultural adaptation of process and product, and implementation over an extended length of time. The second case study focuses on water quality research performed in Benin, West Africa from 2003 to 2008. This research combined laboratory and statistical analyses with

  3. Developing integrated methods to address complex resource and environmental issues

    USGS Publications Warehouse

    2016-01-01

    IntroductionThis circular provides an overview of selected activities that were conducted within the U.S. Geological Survey (USGS) Integrated Methods Development Project, an interdisciplinary project designed to develop new tools and conduct innovative research requiring integration of geologic, geophysical, geochemical, and remote-sensing expertise. The project was supported by the USGS Mineral Resources Program, and its products and acquired capabilities have broad applications to missions throughout the USGS and beyond.In addressing challenges associated with understanding the location, quantity, and quality of mineral resources, and in investigating the potential environmental consequences of resource development, a number of field and laboratory capabilities and interpretative methodologies evolved from the project that have applications to traditional resource studies as well as to studies related to ecosystem health, human health, disaster and hazard assessment, and planetary science. New or improved tools and research findings developed within the project have been applied to other projects and activities. Specifically, geophysical equipment and techniques have been applied to a variety of traditional and nontraditional mineral- and energy-resource studies, military applications, environmental investigations, and applied research activities that involve climate change, mapping techniques, and monitoring capabilities. Diverse applied geochemistry activities provide a process-level understanding of the mobility, chemical speciation, and bioavailability of elements, particularly metals and metalloids, in a variety of environmental settings. Imaging spectroscopy capabilities maintained and developed within the project have been applied to traditional resource studies as well as to studies related to ecosystem health, human health, disaster assessment, and planetary science. Brief descriptions of capabilities and laboratory facilities and summaries of some

  4. Chloroform extraction of iodine in seawater: method development

    NASA Astrophysics Data System (ADS)

    Seidler, H. B.; Glimme, A.; Tumey, S.; Guilderson, T. P.

    2012-12-01

    While 129I poses little to no radiological health hazard, the isotopic ratio of 129I to stable iodine is very useful as a nearly conservative tracer for ocean mixing processes. The disaster at the Fukushima Daiichi nuclear power plant released many radioactive materials into the environment, including 129I. The release allows the study of oceanic processes through the tracking of 129I. However, with such low iodine (~0.5 micromolar) and 129I concentrations (<10^-11), accelerator mass spectrometry (AMS) is needed for accurate measurements. In order to prepare the samples of ocean water for analysis by AMS, the iodine needs to be separated from the various other salts in the seawater. Solvent extraction is the preferred method for preparation of seawater for AMS analysis of 129I. However, given the relatively low background 129I concentrations in the Pacific Ocean, we sought to optimize the recovery of this method, which would minimize both the sample size and the carrier addition required for analysis. We started from a base method described in other research and worked towards maximum efficiency of the process while boosting the recovery of iodine. During development, we assessed each methodological change qualitatively using a color scale (I2 in CHCl3) and quantitatively using Inductively Coupled Plasma Mass Spectrometry (ICP-MS). The "optimized method" yielded a 20-40% increase in recovery of the iodine compared to the base method (80-85% recovery vs. 60%). Lastly, the "optimized method" was tested by AMS for fractionation of the extracted iodine.

  5. Multiphysics methods development for high temperature gas reactor analysis

    NASA Astrophysics Data System (ADS)

    Seker, Volkan

    Multiphysics computational methods were developed to perform design and safety analysis of the next generation Pebble Bed High Temperature Gas Cooled Reactors. A suite of code modules was developed to solve the coupled thermal-hydraulics and neutronics field equations. The thermal-hydraulics module is based on the three dimensional solution of the mass, momentum and energy equations in cylindrical coordinates within the framework of the porous media method. The neutronics module is a part of the PARCS (Purdue Advanced Reactor Core Simulator) code and provides a fine mesh finite difference solution of the neutron diffusion equation in three dimensional cylindrical coordinates. Coupling of the two modules was performed by mapping the solution variables from one module to the other. Mapping is performed automatically in the code system by the use of a common material mesh in both modules. The standalone validation of the thermal-hydraulics module was performed with several cases of the SANA experiment and the standalone thermal-hydraulics exercise of the PBMR-400 benchmark problem. The standalone neutronics module was validated by performing the relevant exercises of the PBMR-268 and PBMR-400 benchmark problems. Additionally, the validation of the coupled code system was performed by analyzing several steady state and transient cases of the OECD/NEA PBMR-400 benchmark problem.
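
    The coupling described above can be pictured with a short sketch. The following Python fragment is only an illustrative Picard-style fixed-point loop with made-up placeholder physics; solve_neutronics, solve_thermal_hydraulics, and all constants are hypothetical stand-ins, not the PARCS/porous-media module pair itself.

        # Hedged sketch: two single-physics "solves" exchange fields on a shared
        # material mesh until the temperature field stops changing, mimicking the
        # mapping-based coupling described in the abstract.
        import numpy as np

        def solve_neutronics(fuel_temperature):
            # Placeholder: power density falls slightly as fuel temperature rises
            # (Doppler-like feedback), standing in for a diffusion-solver update.
            return 1.0e7 * (1.0 - 1.0e-5 * (fuel_temperature - 900.0))

        def solve_thermal_hydraulics(power_density):
            # Placeholder: porous-media energy balance reduced to an algebraic update.
            return 600.0 + 3.0e-5 * power_density

        def coupled_solve(n_cells=1000, tol=1e-6, max_iters=50):
            temperature = np.full(n_cells, 900.0)          # initial fuel temperature [K]
            for it in range(max_iters):
                power = solve_neutronics(temperature)       # map T -> power on shared mesh
                new_temperature = solve_thermal_hydraulics(power)  # map power -> T
                change = np.max(np.abs(new_temperature - temperature))
                temperature = new_temperature
                if change < tol:                            # fixed-point convergence check
                    return temperature, it + 1
            return temperature, max_iters

        if __name__ == "__main__":
            T, iters = coupled_solve()
            print(f"converged in {iters} iterations, mean T = {T.mean():.1f} K")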

  6. System identification methods for aircraft flight control development and validation

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.

    1995-01-01

    System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. The extracted models allow the characterization of the response of the overall aircraft or component subsystem behavior, such as actuators and on-board signal processing algorithms. This paper discusses the use of frequency-domain system-identification methods for the development and integration of aircraft flight-control systems. The extraction and analysis of models of varying complexity from nonparametric frequency-responses to transfer-functions and high-order state-space representations is illustrated using the Comprehensive Identification from FrEquency Responses (CIFER) system-identification facility. Results are presented for test data of numerous flight and simulation programs at the Ames Research Center including rotorcraft, fixed-wing aircraft, advanced short takeoff and vertical landing (ASTOVL), vertical/short takeoff and landing (V/STOL), tiltrotor aircraft, and rotor experiments in the wind tunnel. Excellent system characterization and dynamic response prediction is achieved for this wide class of systems. Examples illustrate the role of system-identification technology in providing an integrated flow of dynamic response data around the entire life-cycle of aircraft development from initial specifications, through simulation and bench testing, and into flight-test optimization.
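
    To make the frequency-domain extraction step concrete, the sketch below estimates a nonparametric frequency response from input/output records using spectral density ratios. It is a generic illustration on a synthetic second-order plant, not CIFER itself; the plant, sweep parameters, and noise level are assumptions.

        # Hedged sketch: H(f) = Pxy(f) / Pxx(f) from a frequency-sweep input and the
        # measured output, plus coherence as a data-quality metric.
        import numpy as np
        from scipy import signal

        fs = 100.0                                   # sample rate [Hz]
        t = np.arange(0, 60, 1 / fs)
        u = signal.chirp(t, f0=0.1, f1=10.0, t1=60)  # frequency-sweep input
        # Simulated plant: second-order actuator-like response plus measurement noise.
        plant = signal.TransferFunction([25.0], [1.0, 4.0, 25.0])
        _, y, _ = signal.lsim(plant, U=u, T=t)
        y = y + 0.01 * np.random.default_rng(0).standard_normal(y.shape)

        f, Puu = signal.welch(u, fs=fs, nperseg=1024)
        _, Puy = signal.csd(u, y, fs=fs, nperseg=1024)
        H = Puy / Puu                                # nonparametric frequency response
        _, coh = signal.coherence(u, y, fs=fs, nperseg=1024)

        magnitude_db = 20 * np.log10(np.abs(H))
        phase_deg = np.degrees(np.angle(H))
        print(magnitude_db[:5], phase_deg[:5], coh[:5])

    Transfer-function or state-space models would then be fitted to such frequency responses in a subsequent step.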

  7. Development of Image Selection Method Using Graph Cuts

    NASA Astrophysics Data System (ADS)

    Fuse, T.; Harada, R.

    2016-06-01

    3D models have come into wide use with the spread of freely available software. In addition, enormous numbers of images can now be acquired easily, and these images are increasingly used to create 3D models. Creating 3D models from a huge number of images, however, takes considerable time and effort, so efficient 3D measurement is required. An efficient strategy must also preserve measurement accuracy. This paper develops an image selection method based on network design, in the sense of surveying network construction. The proposed method uses an image connectivity graph. The image connectivity graph consists of nodes and edges. The nodes correspond to the images to be used. The edges between nodes represent image relationships, with costs given by the accuracies of the orientation elements. For efficiency, the image connectivity graph should be constructed with a smaller number of edges. Once the image connectivity graph is built, the image selection problem is regarded as a combinatorial optimization problem and the graph cuts technique can be applied. In the process of 3D reconstruction, low-quality images and similar images are also extracted and removed. Experiments confirm the significance of the proposed method and imply its potential for efficient and accurate 3D measurement.
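
    A minimal illustration of the graph formulation follows. It is not the authors' implementation: the images, pairwise costs, and choice of anchor nodes are hypothetical, and a standard minimum s-t cut stands in for whatever graph-cut formulation the paper uses.

        # Hedged sketch: images as nodes, edge capacities as link-accuracy costs,
        # and a minimum s-t cut to partition the image connectivity graph.
        import networkx as nx

        G = nx.DiGraph()
        # Hypothetical pairwise costs: higher capacity = stronger/more accurate link.
        edges = [
            ("img_A", "img_B", 3.0), ("img_B", "img_C", 1.0),
            ("img_A", "img_C", 2.0), ("img_C", "img_D", 4.0),
            ("img_B", "img_D", 2.5),
        ]
        for u, v, cap in edges:
            G.add_edge(u, v, capacity=cap)
            G.add_edge(v, u, capacity=cap)   # image relationships are symmetric

        # Cut between two anchor images; weak (low-capacity) links crossing the cut
        # mark candidate images or relations to drop from the reconstruction.
        cut_value, (keep_side, drop_side) = nx.minimum_cut(G, "img_A", "img_D")
        print(cut_value, sorted(keep_side), sorted(drop_side))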

  8. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystems. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, one satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) method and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations, and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data are available.
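
    As a hedged analogue of the ID3 step, the sketch below fits an entropy-based decision tree that maps flight condition and sensor availability to a navigation-strategy class. scikit-learn's tree is CART with an entropy criterion, used here as a stand-in for ID3, and the feature names, labels, and data are fabricated for illustration only.

        # Hedged sketch: derive classification rules (a decision tree) from
        # simulation-style tabular data, in the spirit of the ANOVA/ID3 approach.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        rng = np.random.default_rng(1)
        # Features: [number of navaids, trajectory id, RSS position error from a covariance run]
        X = np.column_stack([
            rng.integers(1, 5, 200),
            rng.integers(0, 3, 200),
            rng.uniform(5.0, 50.0, 200),
        ])
        # Hypothetical label: preferred sensor-management strategy (0, 1, or 2).
        y = (X[:, 2] > 25).astype(int) + (X[:, 0] >= 3).astype(int)

        tree = DecisionTreeClassifier(criterion="entropy", max_depth=3).fit(X, y)
        print(export_text(tree, feature_names=["n_navaids", "trajectory", "rss_error"]))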

  9. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  10. Improved Method Being Developed for Surface Enhancement of Metallic Materials

    NASA Technical Reports Server (NTRS)

    Gabb, Timothy P.; Telesman, Jack; Kantzos, Peter T.

    2001-01-01

    Surface enhancement methods induce a layer of beneficial residual compressive stress to improve the impact (FOD) resistance and fatigue life of metallic materials. A traditional method of surface enhancement often used is shot peening, in which small steel spheres are repeatedly impinged on metallic surfaces. Shot peening is inexpensive and widely used, but the plastic deformation of 20 to 40 percent imparted by the impacts can be harmful. This plastic deformation can damage the microstructure, severely limiting the ductility and durability of the material near the surface. It has also been shown to promote accelerated relaxation of the beneficial compressive residual stresses at elevated temperatures. Low-plasticity burnishing (LPB) is being developed as an improved method for the surface enhancement of metallic materials. LPB is being investigated as a rapid, inexpensive surface enhancement method under NASA Small Business Innovation Research contracts NAS3-98034 and NAS3-99116, with supporting characterization work at NASA. Previously, roller burnishing had been employed to refine surface finish. This concept was adopted and then optimized as a means of producing a layer of compressive stress of high magnitude and depth, with minimal plastic deformation (ref. 1). A simplified diagram of the developed process is given in the following figure. A single pass of a smooth, free-rolling spherical ball under a normal force deforms the surface of the material in tension, creating a compressive layer of residual stress. The ball is supported in a fluid with sufficient pressure to lift the ball off the surface of the retaining spherical socket. The ball is only in mechanical contact with the surface of the material being burnished and is free to roll on the surface. This apparatus is designed to be mounted in the conventional lathes and vertical mills currently used to machine parts. The process has been successfully applied to nickel-base superalloys by a team from the

  11. Development of the EEG measurement method under exercising.

    PubMed

    Dobashi, Noriyuki; Magatani, Kazushige

    2009-01-01

    It is said that the outcome of a sports game is strongly influenced by the player's mental state, and in particular by the player's concentration. Therefore, we think that if a player's mental state during exercise could be evaluated, it would become possible to guide the player appropriately. Mental state can be assessed by analyzing the EEG (electroencephalogram); in particular, changes in the alpha and beta rhythms of the EEG are said to indicate changes in mental state. Therefore, if the EEG of an athlete can be measured during exercise, it becomes possible to evaluate the athlete's mental state. However, EEG is usually measured at rest, and measuring EEG during exercise is difficult: the amplitude of the EEG is very small, so high amplification is necessary to obtain an observable signal, and body movement causes electrode vibration that produces artifacts in the EEG. The objective of this study is therefore to develop a new method for measuring EEG during exercise. This paper describes the EEG measurement system we developed for athletes. The system measures the EEG and the acceleration of the athlete's body. The measured data are sent to a receiver by an FM transmitter. The received data are analyzed on a personal computer, and the EEG and the noise are separated. Several normal subjects were tested with the developed system. These experiments showed that the system still has some problems; however, EEG with little noise was obtained in all cases. We therefore expect that, once these problems are addressed, the developed system will be useful for measuring EEG during exercise. PMID:19964931

  12. Development of exposure assessment method with the chamber

    NASA Astrophysics Data System (ADS)

    Kato, N.; Koyama, Y.; Yokoyama, H.; Matsui, Y.; Yoneda, M.

    2015-05-01

    This study aims to develop a method for measuring nanoparticle concentration and to obtain a representative value of the uniform nanoparticle concentration produced by chamber ventilation. We constructed a chamber equipped with a HEPA filter and controlled the background nanoparticle concentration using adequate ventilation. A particle generator was then used to evaluate the uniformity of the concentration in the chamber. We measured the background value and the source counts of the particle size distribution with an SMPS. In addition, we performed numerical analysis with the CFD code OpenFOAM. As a result, we found no aggregation under the experimental conditions of this study. Although we confirmed that it is difficult to make the nanoparticle concentration uniform, the simulation results showed high reproducibility. Therefore, we could assess nanoparticle size distribution and concentration in our chamber at this stage.

  13. Development of a nondestructive evaluation method for FRP bridge decks

    NASA Astrophysics Data System (ADS)

    Brown, Jeff; Fox, Terra

    2010-05-01

    Open steel grids are typically used on bridges to minimize the weight of the bridge deck and wearing surface. These grids, however, require frequent maintenance and exhibit other durability concerns related to fatigue cracking and corrosion. Bridge decks constructed from composite materials, such as fiber-reinforced polymer (FRP), are strong and lightweight; they also offer improved rideability, reduced noise levels, and less maintenance, and they are relatively easy to install compared to steel grids. This research is aimed at developing an inspection protocol for FRP bridge decks using infrared thermography. The finite element method was used to simulate the heat transfer process and determine the optimal heating and data acquisition parameters to be used when inspecting FRP bridge decks in the field. It was demonstrated that thermal imaging could successfully identify features of the FRP bridge deck to depths of 1.7 cm using a phase analysis process.
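
    A stripped-down version of such a heat-transfer simulation is sketched below: an explicit 1-D finite-difference model of flash heating through an FRP skin, comparing surface cooling over sound material with cooling over a shallow void. It is a generic illustration, not the study's finite element model; the diffusivity, cell size, flash magnitude, and thicknesses are assumed values.

        # Hedged sketch: explicit 1-D heat diffusion after an instantaneous flash.
        import numpy as np

        def surface_cooling(thickness_m, alpha=2.5e-7, dx=2.5e-4, t_end=30.0, flash=1000.0):
            """Front-surface temperature history after an instantaneous flash heating."""
            n = int(round(thickness_m / dx)) + 1
            r = 0.4                                   # dimensionless stability number
            dt = r * dx**2 / alpha
            T = np.zeros(n)
            T[0] = flash                              # flash energy deposited in the surface cell
            history = []
            for _ in range(int(t_end / dt)):
                T_new = T.copy()
                T_new[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
                T_new[0] = T[0] + 2 * r * (T[1] - T[0])      # adiabatic front face
                T_new[-1] = T[-1] + 2 * r * (T[-2] - T[-1])  # adiabatic back face (void/air gap)
                T = T_new
                history.append(T[0])
            return np.array(history)

        sound = surface_cooling(0.017)      # full 1.7 cm deck section
        over_void = surface_cooling(0.005)  # thin skin above a shallow defect
        print(sound[::50][:6])
        print(over_void[::50][:6])          # the defect area stays hotter for longer

    Phase analysis would then be applied to such surface-temperature histories (in practice, to the measured thermograms) to highlight subsurface features.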

  14. Developed Adomian method for quadratic Kaluza-Klein relativity

    NASA Astrophysics Data System (ADS)

    Azreg-Aïnou, Mustapha

    2010-01-01

    We develop and modify the Adomian decomposition method (ADecM) to work for a new type of nonlinear matrix differential equations (MDE's) which arise in general relativity (GR) and possibly in other applications. The approach consists in modifying both the ADecM linear operator with highest order derivative and ADecM polynomials. We specialize in the case of a 4 × 4 nonlinear MDE along with a scalar one describing stationary cylindrically symmetric metrics in quadratic five-dimensional GR, derive some of their properties using ADecM and construct the most general unique power series solutions. However, because of the constraint imposed on the MDE by the scalar one, the series solutions terminate in closed forms exhausting all possible solutions.
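
    As a hedged, schematic illustration of the Adomian recursion for a scalar equation u' = N(u) with u(0) given (not the 4 x 4 matrix case treated in the paper), the decomposition proceeds as

        u(t) = \sum_{n=0}^{\infty} u_n(t), \qquad N(u) = \sum_{n=0}^{\infty} A_n(u_0, \dots, u_n)
        A_0 = N(u_0), \quad A_1 = u_1 N'(u_0), \quad A_2 = u_2 N'(u_0) + \tfrac{1}{2} u_1^2 N''(u_0)
        u_0(t) = u(0), \qquad u_{n+1}(t) = \int_0^t A_n(s)\, ds

    Truncating the series after a few terms gives power-series approximations of the kind referred to above; the modification described in the abstract concerns the choice of the linear operator with the highest-order derivative and the construction of the polynomials for matrix arguments.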

  15. Structural Analysis Methods Development for Turbine Hot Section Components

    NASA Technical Reports Server (NTRS)

    Thompson, Robert L.

    1988-01-01

    The structural analysis technologies and activities of the NASA Lewis Research Center's gas turbine engine Hot Section Technology (HOST) program are summarized. The technologies synergistically developed and validated include: time-varying thermal/mechanical load models; component-specific automated geometric modeling and solution strategy capabilities; advanced inelastic analysis methods; inelastic constitutive models; high-temperature experimental techniques and experiments; and nonlinear structural analysis codes. Features of the program that incorporate the new technologies and their application to hot section component analysis and design are described. Improved and, in some cases, first-time 3-D nonlinear structural analyses of hot section components of isotropic and anisotropic nickel-base superalloys are presented.

  16. Development of silicon purification by strong radiation catalysis method

    NASA Astrophysics Data System (ADS)

    Chen, Ying-Tian; Ho, Tso-Hsiu; Lim, Chern-Sing; Lim Boon, Han

    2010-11-01

    Using a new type of solar furnace and a specially designed induction furnace, cost-effective and highly efficient purification of metallurgical silicon into solar-grade silicon can be achieved. It is realized by a new method for extracting boron from silicon with the aid of a photochemical effect. In this article, we discuss the postulated principle of strong radiation catalysis and recent developments in practice. Starting from ordinary metallurgical silicon, we achieved a purification result of 0.12 ppmw to 0.3 ppmw of boron impurity in silicon with only a single pass of a low-cost, simple process; the major obstacle to making 'cheap' solar-grade silicon feedstock in industry is thus removed.

  17. Development of a heat-mediated protein blotting method.

    PubMed

    O'Sullivan, Jack; McMahon, Hilary E M

    2016-04-15

    Western blotting is a significant tool employed for the detection of cellular proteins. High-molecular-weight proteins have proven a challenge to detect by western blotting, but even proteins of 100 kDa can still present difficulties in detection. This work reports the development of a heat transfer method that is suitable for both low- and high-molecular-weight proteins. The procedure involves the use of a constant temperature of 78 °C in a dedicated heat transfer module. Through the use of this protocol, the neuronal adaptor protein X11α (120 kDa), which prior to this methodology was undetectable endogenously in the neuroblastoma cell line (N2a), was successfully detected in the N2a cell line. The procedure provides a reproducible protocol that can be adapted for other high-molecular-weight proteins, and it offers the advantage that low-molecular-weight proteins are not sacrificed by the methodology. PMID:26869081

  18. Development of Combinatorial Methods for Alloy Design and Optimization

    SciTech Connect

    Pharr, George M.; George, Easo P.; Santella, Michael L

    2005-07-01

    The primary goal of this research was to develop a comprehensive methodology for designing and optimizing metallic alloys by combinatorial principles. Because conventional techniques for alloy preparation are unavoidably restrictive in the range of alloy composition that can be examined, combinatorial methods promise to significantly reduce the time, energy, and expense needed for alloy design. Combinatorial methods can be developed not only to optimize existing alloys, but to explore and develop new ones as well. The scientific approach involved fabricating an alloy specimen with a continuous distribution of binary and ternary alloy compositions across its surface, an "alloy library", and then using spatially resolved probing techniques to characterize its structure, composition, and relevant properties. The three specific objectives of the project were: (1) to devise means by which simple test specimens with a library of alloy compositions spanning the range of interest can be produced; (2) to assess how well the properties of the combinatorial specimen reproduce those of the conventionally processed alloys; and (3) to devise screening tools which can be used to rapidly assess the important properties of the alloys. As proof of principle, the methodology was applied to the Fe-Ni-Cr ternary alloy system that constitutes many commercially important materials such as stainless steels and the H-series and C-series heat- and corrosion-resistant casting alloys. Three different techniques were developed for making alloy libraries: (1) vapor deposition of discrete thin films on an appropriate substrate and then alloying them together by solid-state diffusion; (2) co-deposition of the alloying elements from three separate magnetron sputtering sources onto an inert substrate; and (3) localized melting of thin films with a focused electron-beam welding system. Each of the techniques was found to have its own advantages and disadvantages. A new and very powerful technique for

  19. Current developments using emerging transdermal technologies in physical enhancement methods.

    PubMed

    Nanda, A; Nanda, S; Ghilzai, N M Khan

    2006-07-01

    Transdermal drug delivery using patches offers many advantages, but is limited primarily by the stratum corneum barrier. Amongst the various methods to overcome this barrier, physical methods are gaining in popularity and in commercial device development. Macroflux, MTS and Silex are based on microporation, involving the use of microneedles that pierce, and thereby bypass, the stratum corneum. Intraject, Powderject and Helios are based on needleless jet injectors wherein very fine, solid particulate drug is fired directly into the skin using high-pressure gas. Med-Tats incorporate the use of modified drug-containing tattoos, which bind to the skin, from which the drug is absorbed. CHADD is based on the use of heat, which increases skin permeation of drugs. High-power, pulsed lasers transmit positive mechanical forces to the skin and transiently create intercellular channels into the skin. Sonophoresis involves the use of ultrasound, which transiently disrupts the stratum corneum barrier; this technique also offers non-invasive transdermal extraction of interstitial fluid for sampling body fluids. Modified liposomes include ethosomes (containing alcohol) and transferosomes (containing surfactants), which have enhanced skin permeability. Pulsed magnetic fields may create transient pores in cell membranes, including skin, resulting in increased permeation. Iontophoresis is based on the application of an electric potential to enhance the movement of substances to and from the body. Dupel, Ionzyme, Liposite, ETrans, Phoresor and Drionic are based on iontophoresis. GlucoWatch offers non-invasive blood glucose monitoring based on reverse iontophoresis. This review outlines recent commercial developments in physical transdermal drug delivery technology and the specific devices and applications being targeted by the pharmaceutical industry. PMID:16848725

  20. Two new methods for simulating photolithography development in 3D

    SciTech Connect

    Helmsen, J.; Colella, P.; Dorr, M.; Puckett, E.G.

    1997-01-30

    Two methods are presented for simulating the development of photolithographic profiles during the resist dissolution phase. These algorithms are the volume-of-fluid algorithm, and the steady level-set algorithm. They are compared with the ray-trace, cell, and level-set techniques employed in SAMPLE-3D. The volume-of-fluid algorithm employs an Euclidean Grid with volume fractions. At each time step, the surface is reconstructed by computing an approximation of the tangent plane of the surface in each cell that contains a value between 0 and 1. The geometry constructed in this manner is used to determine flow velocity vectors and the flux across each edge. The material is then advanced by a split advection scheme. The steady Level Set algorithm is an extension of the Iterative Level Set algorithm. The steady Level Set algorithm combines Fast Level Set concepts and a technique for finding zero residual solutions to the ( ) function. The etch time for each cell is calculated in a time ordered manner. Use of heap sorting data structures allows the algorithm to execute extremely quickly. Comparisons of the methods have been performed and results shown.
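
    The heap-sorted, time-ordered idea behind the steady level-set approach can be sketched with a Dijkstra-like arrival-time computation on a grid. This is a generic illustration under assumed unit cell size and made-up dissolution rates, not the SAMPLE-3D implementation.

        # Hedged sketch: cells are finalized in order of increasing etch time using a heap.
        import heapq
        import numpy as np

        rate = np.full((50, 50), 1.0)          # local dissolution rate per cell (illustrative)
        rate[20:30, 20:30] = 0.2               # a slow-dissolving (weakly exposed) region

        etch_time = np.full(rate.shape, np.inf)
        heap = [(0.0, (0, j)) for j in range(rate.shape[1])]   # resist surface = top row
        for _, cell in heap:
            etch_time[cell] = 0.0
        heapq.heapify(heap)

        while heap:
            t, (i, j) = heapq.heappop(heap)
            if t > etch_time[i, j]:
                continue                        # stale heap entry
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rate.shape[0] and 0 <= nj < rate.shape[1]:
                    # Time to etch through the neighboring cell (unit cell size assumed).
                    cand = t + 1.0 / rate[ni, nj]
                    if cand < etch_time[ni, nj]:
                        etch_time[ni, nj] = cand
                        heapq.heappush(heap, (cand, (ni, nj)))

        # The developed profile after time T is the set of cells with etch_time <= T.
        print(etch_time[::10, ::10].round(1))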

  1. Task analysis method for procedural training curriculum development.

    PubMed

    Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan

    2014-06-01

    A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments. PMID:24366759

  2. Development of a method to analyze orthopaedic practice expenses.

    PubMed

    Brinker, M R; Pierce, P; Siegel, G

    2000-03-01

    The purpose of the current investigation was to present a standard method by which an orthopaedic practice can analyze its practice expenses. To accomplish this, a five-step process was developed to analyze practice expenses using a modified version of activity-based costing. In this method, general ledger expenses were assigned to 17 activities that encompass all the tasks and processes typically performed in an orthopaedic practice. These 17 activities were identified in a practice expense study conducted for the American Academy of Orthopaedic Surgeons. To calculate the cost of each activity, financial data were used from a group of 19 orthopaedic surgeons in Houston, Texas. The activities that consumed the largest portion of the employee work force (person hours) were service patients in office (25.0% of all person hours), maintain medical records (13.6% of all person hours), and resolve collection disputes and rebill charges (12.3% of all person hours). The activities that comprised the largest portion of the total expenses were maintain facility (21.4%), service patients in office (16.0%), and sustain business by managing and coordinating practice (13.8%). The five-step process of analyzing practice expenses was relatively easy to perform and it may be used reliably by most orthopaedic practices. PMID:10738440
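
    The allocation arithmetic behind activity-based costing can be shown in a few lines. The activity names echo those in the abstract, but the dollar amounts and person-hours below are invented for illustration and do not reproduce the study's data.

        # Hedged sketch: assign ledger expenses to activities, then report each
        # activity's share of total cost and of total person-hours.
        activities = {
            # activity: (assigned expense in $, person-hours)
            "maintain facility":               (214_000, 1_200),
            "service patients in office":      (160_000, 9_500),
            "maintain medical records":        ( 85_000, 5_170),
            "resolve disputes and rebill":     ( 70_000, 4_670),
            "manage and coordinate practice":  (138_000, 2_300),
        }

        total_cost = sum(cost for cost, _ in activities.values())
        total_hours = sum(hours for _, hours in activities.values())

        for name, (cost, hours) in activities.items():
            print(f"{name:32s} {cost / total_cost:6.1%} of cost, "
                  f"{hours / total_hours:6.1%} of person-hours")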

  3. Assessing methods for developing crop forecasting in the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Ines, A. V. M.; Capa Morocho, M. I.; Baethgen, W.; Rodriguez-Fonseca, B.; Han, E.; Ruiz Ramos, M.

    2015-12-01

    Seasonal climate prediction may allow predicting crop yield to reduce the vulnerability of agricultural production to climate variability and its extremes. It has already been demonstrated that seasonal climate predictions at the European (or Iberian) scale from ensembles of global coupled climate models have some skill (Palmer et al., 2004). The limited predictability exhibited by the atmosphere in mid-latitudes, and therefore in the Iberian Peninsula (IP), can be managed by a probabilistic approach based on terciles. This study presents an application for the IP of two methods for linking tercile-based seasonal climate forecasts with crop models to improve crop predictability. Two methods were evaluated and applied for disaggregating seasonal rainfall forecasts into daily weather realizations: 1) a stochastic weather generator and 2) a forecast tercile resampler. Both methods were evaluated in a case study where the impacts of two seasonal rainfall forecasts (a wet and a dry forecast, for 1998 and 2015 respectively) on rainfed wheat yield and irrigation requirements of maize in the IP were analyzed. Simulated wheat yield and irrigation requirements of maize were computed with the crop models CERES-wheat and CERES-maize, which are included in the Decision Support System for Agrotechnology Transfer (DSSAT v.4.5, Hoogenboom et al., 2010). Simulations were run at several locations in Spain where the crop model was calibrated and validated with independent field data. These methodologies would allow quantifying the benefits and risks of a seasonal climate forecast to potential users such as farmers, agroindustry and insurance companies in the IP. Therefore, we would be able to establish early warning systems and to design crop management adaptation strategies that take advantage of favorable conditions or reduce the effect of adverse ones. References: Palmer, T. et al., 2004. Development of a European multimodel ensemble system for seasonal-to-interannual prediction (DEMETER). Bulletin of the
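
    The tercile-resampling idea (one of the two disaggregation approaches named above, as we understand it) is sketched below with synthetic rainfall data. The historical record, forecast probabilities, and ensemble size are invented for illustration only.

        # Hedged sketch: classify historical seasons into rainfall terciles, then sample
        # years with the forecast tercile probabilities to build daily-weather realizations.
        import numpy as np

        rng = np.random.default_rng(0)
        years = np.arange(1980, 2010)
        seasonal_rain = rng.gamma(shape=4.0, scale=60.0, size=years.size)  # mm, synthetic

        # Classify each historical season into below / near / above normal terciles.
        t1, t2 = np.quantile(seasonal_rain, [1 / 3, 2 / 3])
        tercile = np.digitize(seasonal_rain, [t1, t2])            # 0, 1, 2

        forecast_probs = [0.15, 0.25, 0.60]                       # e.g. a "wet" forecast
        n_members = 50
        sampled_terciles = rng.choice(3, size=n_members, p=forecast_probs)
        ensemble_years = np.array([
            rng.choice(years[tercile == k]) for k in sampled_terciles
        ])
        # Each selected year's observed daily weather would then drive CERES-wheat/maize.
        print(np.bincount(sampled_terciles), ensemble_years[:10])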

  4. Development and qualification of an antibody rapid deglycosylation method.

    PubMed

    Cook, K Steven; Bullock, Kevin; Sullivan, Timothy

    2012-03-01

    N-linked glycosylation can influence the biological activity and safety of an antibody as well as be a measure of the consistency of the production process. Analysis of the released N-glycans is an important part of the development of a therapeutic antibody. The traditional method for N-glycan analysis requires complex and laborious sample preparation and lengthy analysis time. Two preparation steps with limited control are removal of the antibody backbone by ice-cold ethanol precipitation and water removal before 2-AB fluorescent dye labeling. Simplification of the sample preparation and better control of key steps, to allow characterization and quantitation of glycans during all stages of development of a therapeutic antibody, are desired. Recently Prozyme introduced a rapid deglycosylation kit and a rapid tagging kit that address some of these issues. The deglycosylation kit immobilizes the antibody on a membrane, thereby eliminating the precipitation step. An instant fluorescent tag kit eliminates the water removal before the 2-AB labeling step. In addition, use of a new chromatography column can improve the glycan resolution and shorten the analysis time. The evaluation and qualification of the Rapid Deglycosylation Kit (RDK) and instant 2-AB tagging with the improved chromatography are highlighted. PMID:22257749

  5. Development of a method for assessing flood vulnerability.

    PubMed

    Connor, R F; Hiroki, K

    2005-01-01

    Over the past few decades, a growing number of studies have been conducted on the mechanisms responsible for climate change and the elaboration of future climate scenarios. More recently, studies have emerged examining the potential effects of climate change on human societies, including how variations in hydrological regimes impact water resources management. According to the Intergovernmental Panel on Climate Change's third assessment report, climate change will lead to an intensification of the hydrological cycle, resulting in greater variability in precipitation patterns and an increase in the intensity and frequency of severe storms and other extreme events. In other words, climate change will likely increase the risks of flooding in many areas. Structural and non-structural countermeasures are available to reduce flood vulnerability, but implementing new measures can be a lengthy process requiring political and financial support. In order to help guide such policy decisions, a method for assessing flood vulnerability due to climate change is proposed. In this preliminary study, multivariate analysis has been used to develop a Flood Vulnerability Index (FVI), which allows for a comparative analysis of flood vulnerability between different basins. Once fully developed, the FVI will also allow users to identify the main factors responsible for a basin's vulnerability, making it a valuable tool to assist in priority setting within decision-making processes. PMID:15918359
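
    One common way to build such a composite index by multivariate analysis is to standardize the indicators and weight them with the first principal component; the sketch below shows that mechanics with invented basin names, indicators, and values, and is not the FVI formulation itself.

        # Hedged sketch: a PCA-weighted composite vulnerability score per basin.
        import numpy as np

        basins = ["basin_1", "basin_2", "basin_3", "basin_4"]
        # Columns: population density, fraction of floodplain urbanized, drainage capacity.
        X = np.array([
            [850.0, 0.40, 0.30],
            [120.0, 0.10, 0.70],
            [430.0, 0.25, 0.55],
            [990.0, 0.60, 0.20],
        ])
        # Higher drainage capacity reduces vulnerability, so flip its sign before scaling.
        X[:, 2] = -X[:, 2]

        Z = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize indicators
        _, _, Vt = np.linalg.svd(Z, full_matrices=False)
        weights = Vt[0] * np.sign(Vt[0].sum())          # first principal component
        fvi = Z @ weights
        for name, score in zip(basins, fvi):
            print(f"{name}: FVI = {score:+.2f}")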

  6. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, which were designed to automate functions and decisions associated with a combat aircraft's subsystems, are discussed. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Simulation and comparative workload results for two mission scenarios are given. The scenarios are inbound surface-to-air-missile attack on the aircraft and pilot incapacitation. The methodology used to develop the AUTOCREW knowledge bases is summarized. Issues involved in designing the navigation sensor selection expert in AUTOCREW's NAVIGATOR knowledge base are discussed in detail. The performance of seven navigation systems aiding a medium-accuracy INS was investigated using Kalman filter covariance analyses. A navigation sensor management (NSM) expert system was formulated from covariance simulation data using the analysis of variance (ANOVA) method and the ID3 algorithm. ANOVA results show that statistically different position accuracies are obtained when different navaids are used, the number of navaids aiding the INS is varied, the aircraft's trajectory is varied, and the performance history is varied. The ID3 algorithm determines the NSM expert's classification rules in the form of decision trees. The performance of these decision trees was assessed on two arbitrary trajectories, and the results demonstrate that the NSM expert adapts to new situations and provides reasonable estimates of the expected hybrid performance.

  7. Recent developments in the direct-current geoelectrical imaging method

    NASA Astrophysics Data System (ADS)

    Loke, M. H.; Chambers, J. E.; Rucker, D. F.; Kuras, O.; Wilkinson, P. B.

    2013-08-01

    There have been major improvements in instrumentation, field survey design and data inversion techniques for the geoelectrical method over the past 25 years. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. Continued developments in computer technology, as well as fast data inversion techniques and software, have made it possible to carry out the interpretation on commonly available microcomputers. Multi-dimensional geoelectrical surveys are now widely used in environmental, engineering, hydrological and mining applications. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Large areas on land and water can be surveyed rapidly with computerized dynamic towed resistivity acquisition systems. The use of existing metallic wells as long electrodes has improved the detection of targets in areas where they are masked by subsurface infrastructure. A number of PC controlled monitoring systems are also available to measure and detect temporal changes in the subsurface. There have been significant advancements in techniques to automatically generate optimized electrodes array configurations that have better resolution and depth of investigation than traditional arrays. Other areas of active development include the translation of electrical values into geological parameters such as clay and moisture content, new types of sensors, estimation of fluid or ground movement from time-lapse images and joint inversion techniques. In this paper, we investigate the recent developments in geoelectrical imaging and provide a brief look into the future of where the science may be heading.

  8. Various methods and developments for calibrating seismological sensors at EOST

    NASA Astrophysics Data System (ADS)

    JUND, H.; Bès de Berc, M.; Thore, J.

    2013-12-01

    Calibrating seismic sensors is crucial for knowing the quality of the sensor and for generating precise dataless files. We present here three calibration methods that we have developed for the short-period and broadband sensors included in the temporary and permanent seismic networks in France. First, in the case of a short-period sensor with no electronics and no calibration coil, we inject a sine-wave signal into the signal coil. After locking the sensor mass, we connect a signal generator and a series resistor to the coil and send a sinusoidal signal into the signal coil. We measure both the voltage across the resistor, which gives an image of the current entering the signal coil, and the voltage across the signal coil. The frequency of the generator is then varied in order to find a phase shift of π/2 between the two signals; the generator frequency at that point corresponds to the natural frequency of the sensor. Second, in the case of all types of sensors provided with a calibration coil, we inject different signals into the calibration coil. We usually apply two signals: a step signal and a sweep (or wobble) signal. A step signal into the calibration coil is equivalent to a Dirac excitation in derived acceleration. The response to this Dirac gives the transfer function of the signal coil, differentiated twice and without absolute gain. We developed a field module that allows us to always apply the same excitation to various models of seismometers, in order to compare the results from several instruments previously installed in the field. A wobble signal is a signal whose frequency varies. By varying the frequency of the input signal around the sensor's natural frequency, we obtain an immediate response of the sensor in acceleration. This method is particularly suitable for avoiding disturbances which may modify the signal of a permanent station. Finally, for the determination of absolute
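
    The π/2 phase-shift criterion can be illustrated numerically. The sketch below sweeps the drive frequency of a generic damped second-order oscillator (a hypothetical 1 Hz sensor, not any particular instrument) and picks the frequency at which the response lags the drive by π/2.

        # Hedged sketch: locate the natural frequency from the 90-degree phase condition.
        import numpy as np

        f0_true, damping = 1.0, 0.3                    # hypothetical 1 Hz short-period sensor

        def phase_shift(f_drive):
            """Phase of a 2nd-order oscillator's response relative to the drive."""
            w, w0 = 2 * np.pi * f_drive, 2 * np.pi * f0_true
            return np.angle(1.0 / (w0**2 - w**2 + 2j * damping * w0 * w))

        freqs = np.linspace(0.2, 3.0, 2801)
        phases = np.array([phase_shift(f) for f in freqs])
        # At resonance the response lags the drive by exactly pi/2.
        f_natural = freqs[np.argmin(np.abs(np.abs(phases) - np.pi / 2))]
        print(f"estimated natural frequency: {f_natural:.3f} Hz")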

  9. Development of acoustic observation method for seafloor hydrothermal flows

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Tamura, H.; Asada, A.; Kinoshita, M.; Tamaki, K.

    2012-12-01

    In October 2009, we conducted a seafloor reconnaissance using the manned deep-sea submersible Shinkai 6500 on the Central Indian Ridge at 18-20°S, where hydrothermal plume signatures had previously been detected. The acoustic video camera DIDSON was mounted on top of Shinkai 6500 in order to obtain acoustic video images of hydrothermal plumes. Acoustic video images of the hydrothermal plumes were captured in three of seven dives. We could identify shadings inside the acoustic video images of the hydrothermal plumes. The silhouettes of the hydrothermal plumes varied from second to second, and the shadings inside them also varied; these variations corresponded to internal structures and flows of the plumes. DIDSON (Dual-Frequency IDentification SONar) is an acoustic lens-based sonar. It has sufficiently high resolution and a rapid refresh rate that it can substitute for an optical system in turbid or dark water where optical systems fail. The Institute of Industrial Science, University of Tokyo, recognizing DIDSON's superior performance, has sought to develop a new observation method based on DIDSON for hydrothermal discharge from seafloor vents. We expected DIDSON to reveal the whole image of a hydrothermal plume as well as details of its interior. The proposed method for observing and measuring hydrothermal flow utilizes a sheet-like acoustic beam. Scanning with a concentrated acoustic beam gives distances to the edges of the hydrothermal flows, and the shapes of the flows can then be identified even in low- and zero-visibility conditions. A tank experiment was conducted. The purposes of this experiment were to test the proposed method for delineating underwater hydrothermal flows and to understand the relationships among the acoustic video image, flow rate and water temperature. Water was heated in a hot tub and pumped to the water tank through a silicone tube. We observed the water flows discharging from the tip of the tube with DIDSON. The flow rate was controlled and temperatures of the

  10. Development and Applications of Advanced Electronic Structure Methods

    NASA Astrophysics Data System (ADS)

    Bell, Franziska

    This dissertation contributes to three different areas in electronic structure theory. The first part of this thesis advances the fundamentals of orbital active spaces. Orbital active spaces are not only essential in multi-reference approaches, but have also become of interest in single-reference methods as they allow otherwise intractably large systems to be studied. However, despite their great importance, the optimal choice and, more importantly, their physical significance are still not fully understood. In order to address this problem, we studied the higher-order singular value decomposition (HOSVD) in the context of electronic structure methods. We were able to gain a physical understanding of the resulting orbitals and proved a connection to unrelaxed natural orbitals in the case of Moller-Plesset perturbation theory to second order (MP2). In the quest to find the optimal choice of the active space, we proposed a HOSVD for energy-weighted integrals, which yielded the fastest convergence in MP2 correlation energy for small- to medium-sized active spaces to date, and is also potentially transferable to coupled-cluster theory. In the second part, we studied monomeric and dimeric glycerol radical cations and their photo-induced dissociation in collaboration with Prof. Leone and his group. Understanding the mechanistic details involved in these processes are essential for further studies on the combustion of glycerol and carbohydrates. To our surprise, we found that in most cases, the experimentally observed appearance energies arise from the separation of product fragments from one another rather than rearrangement to products. The final chapters of this work focus on the development, assessment, and application of the spin-flip method, which is a single-reference approach, but capable of describing multi-reference problems. Systems exhibiting multi-reference character, which arises from the (near-) degeneracy of orbital energies, are amongst the most
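
    The HOSVD referred to above can be written down generically as SVDs of the tensor unfoldings. The numpy sketch below is such a generic implementation on a random 3-way array, not the electronic-structure code described in the dissertation.

        # Hedged sketch: higher-order SVD via mode-n unfoldings, with a reconstruction check.
        import numpy as np

        def unfold(T, mode):
            """Matricize tensor T along `mode` (mode-n unfolding)."""
            return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

        def hosvd(T):
            """Return factor matrices U and the core tensor S of the HOSVD of T."""
            U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0] for n in range(T.ndim)]
            S = T.copy()
            for n, Un in enumerate(U):
                S = np.moveaxis(np.tensordot(Un.T, np.moveaxis(S, n, 0), axes=1), 0, n)
            return U, S

        rng = np.random.default_rng(0)
        T = rng.standard_normal((6, 5, 4))
        U, S = hosvd(T)

        # Applying the factor matrices to the core recovers the original tensor.
        R = S.copy()
        for n, Un in enumerate(U):
            R = np.moveaxis(np.tensordot(Un, np.moveaxis(R, n, 0), axes=1), 0, n)
        print(np.allclose(R, T))

    Truncating the factor matrices to a few leading columns is what defines a reduced active space in this picture.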

  11. Developing Conceptual Hypersonic Airbreathing Engines Using Design of Experiments Methods

    NASA Technical Reports Server (NTRS)

    Ferlemann, Shelly M.; Robinson, Jeffrey S.; Martin, John G.; Leonard, Charles P.; Taylor, Lawrence W.; Kamhawi, Hilmi

    2000-01-01

    Designing a hypersonic vehicle is a complicated process due to the multi-disciplinary synergy that is required. The greatest challenge involves propulsion-airframe integration. In the past, a two-dimensional flowpath was generated based on the engine performance required for a proposed mission. A three-dimensional CAD geometry was produced from the two-dimensional flowpath for aerodynamic analysis, structural design, and packaging. The aerodynamics, engine performance, and mass properties are inputs to the vehicle performance tool to determine if the mission goals were met. If the mission goals were not met, then a flowpath and vehicle redesign would begin. This design process might have to be performed several times to produce a "closed" vehicle. This paper will describe an attempt to design a hypersonic cruise vehicle propulsion flowpath using a Design of Experiments method to reduce the resources necessary to produce a conceptual design with fewer iterations of the design cycle. These methods also allow for more flexible mission analysis and incorporation of additional design constraints at any point. A design system was developed using an object-based software package that would quickly generate each flowpath in the study given the values of the geometric independent variables. These flowpath geometries were put into a hypersonic propulsion code and the engine performance was generated. The propulsion results were loaded into statistical software to produce regression equations that were combined with an aerodynamic database to optimize the flowpath at the vehicle performance level. For this example, the design process was executed twice. The first pass was a cursory look at the independent variables selected to determine which variables are the most important and to test all of the inputs to the optimization process. The second cycle is a more in-depth study with more cases and higher-order equations representing the design space.
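
    The Design-of-Experiments-plus-regression workflow can be illustrated in miniature. In the sketch below, flowpath_analysis, the two design variables, their levels, and the quadratic basis are all hypothetical stand-ins for the propulsion code and geometry parameters used in the study.

        # Hedged sketch: run a small factorial design through an analysis function,
        # fit a quadratic response surface, and query the surrogate instead of the code.
        import itertools
        import numpy as np

        def flowpath_analysis(inlet_ramp_angle, combustor_area_ratio):
            # Stand-in for the hypersonic propulsion code: returns a notional thrust metric.
            return (10.0 + 1.5 * inlet_ramp_angle - 0.08 * inlet_ramp_angle**2
                    + 4.0 * combustor_area_ratio - 0.9 * combustor_area_ratio**2
                    + 0.05 * inlet_ramp_angle * combustor_area_ratio)

        # Three-level full factorial design over the two independent variables.
        levels_angle = [4.0, 8.0, 12.0]
        levels_area = [1.2, 1.6, 2.0]
        cases = list(itertools.product(levels_angle, levels_area))
        y = np.array([flowpath_analysis(a, r) for a, r in cases])

        # Quadratic response-surface basis: 1, a, r, a^2, r^2, a*r.
        A = np.array([[1, a, r, a**2, r**2, a * r] for a, r in cases])
        coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

        # The fitted regression can now stand in for the analysis during optimization.
        a_test, r_test = 10.0, 1.8
        basis = np.array([1, a_test, r_test, a_test**2, r_test**2, a_test * r_test])
        print(basis @ coeffs, flowpath_analysis(a_test, r_test))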

  12. Development of Stable Solidification Method for Insoluble Ferrocyanides-13170

    SciTech Connect

    Ikarashi, Yuki; Masud, Rana Syed; Mimura, Hitoshi; Ishizaki, Eiji; Matsukura, Minoru

    2013-07-01

    The development of a stable solidification method for insoluble ferrocyanide sludge is an important subject for safe decontamination at Fukushima NPP-1. By using the excellent immobilizing properties of zeolites, such as their gas-trapping ability and self-sintering properties, stable solidification of insoluble ferrocyanides was accomplished. The immobilization ratio of Cs for K2[CoFe(CN)6]·nH2O saturated with Cs+ ions (Cs2[CoFe(CN)6]·nH2O) was estimated to be less than 0.1% above 1,000 °C; the adsorbed Cs+ ions are completely volatilized. In contrast, a novel stable solid form was produced by press-sintering a mixture of Cs2[CoFe(CN)6]·nH2O and zeolites at the higher temperatures of 1,000 °C and 1,100 °C; Cs volatilization and cyanide release were completely suppressed. The immobilization ratio of Cs, under the mixing conditions of Cs2[CoFe(CN)6]·nH2O:CP = 1:1 and a calcining temperature of 1,000 °C, was estimated to be nearly 100%. As for the kinds of zeolites, natural mordenite (NM), clinoptilolite (CP) and chabazite tended to have higher immobilization ratios compared to zeolite A. This may be due to the difference in phase transformation between the natural zeolites and synthetic zeolite A. In the case of the composites (K2-xNix/2[NiFe(CN)6]·nH2O loaded on natural mordenite), a relatively high immobilization ratio of Cs was also obtained. This method using zeolite matrices can be applied to the stable solidification of solid wastes of insoluble ferrocyanide sludge. (authors)

  13. COOPERATION BETWEEN AACC AND ICC FOR STANDARD METHODS DEVELOPMENT

    Technology Transfer Automated Retrieval System (TEKTRAN)

    American Association of Cereal Chemists (AACC) Standard Methods and International Association for Cereal Science and Technology (ICC) are cooperating to harmonize certain of their methods. The harmonized methods will employ the same procedures so that the analytical results of either method (such a...

  14. Conceptual Design Method Developed for Advanced Propulsion Nozzles

    NASA Technical Reports Server (NTRS)

    Nadell, Shari-Beth; Barnhart, Paul J.

    1998-01-01

    As part of a contract with the NASA Lewis Research Center, a simple, accurate method of predicting the performance characteristics of a nozzle design has been developed for use in conceptual design studies. The Nozzle Performance Analysis Code (NPAC) can predict the on- and off-design performance of axisymmetric or two-dimensional convergent and convergent-divergent nozzle geometries. NPAC accounts for the effects of overexpansion or underexpansion, flow divergence, wall friction, heat transfer, and small mass addition or loss across surfaces when the nozzle gross thrust and gross thrust coefficient are being computed. NPAC can be used to predict the performance of a given nozzle design or to develop a preliminary nozzle system design for subsequent analysis. The input required by NPAC consists of a simple geometry definition of the nozzle surfaces, the location of key nozzle stations (entrance, throat, exit), and the nozzle entrance flow properties. NPAC performs three analysis "passes" on the nozzle geometry. First, an isentropic control volume analysis is performed to determine the gross thrust and gross thrust coefficient of the nozzle. During the second analysis pass, the skin friction and heat transfer losses are computed. The third analysis pass couples the effects of wall shear and heat transfer with the initial internal nozzle flow solutions to produce a system of equations that is solved at steps along the nozzle geometry. Small mass additions or losses, such as those resulting from leakage or bleed flow, can be included in the model at specified geometric sections. A final correction is made to account for divergence losses that are incurred if the nozzle exit flow is not purely axial.
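
    The quasi-one-dimensional isentropic "first pass" can be sketched as follows. This is a generic textbook calculation with illustrative numbers, not NPAC itself: total conditions, ambient pressure, and the area ratio below are assumed values.

        # Hedged sketch: solve the area-Mach relation for the supersonic exit Mach number
        # and form the ideal gross thrust of a convergent-divergent nozzle.
        import numpy as np
        from scipy.optimize import brentq

        g = 1.4                      # ratio of specific heats
        R = 287.0                    # gas constant [J/(kg K)]
        Pt, Tt = 300e3, 1600.0       # nozzle entrance total pressure [Pa] and temperature [K]
        P_amb = 26.5e3               # ambient pressure [Pa]
        A_throat, area_ratio = 0.10, 4.0
        A_exit = area_ratio * A_throat

        def area_mach(M):
            """Isentropic area ratio A/A* as a function of Mach number."""
            return (1.0 / M) * ((2 / (g + 1)) * (1 + (g - 1) / 2 * M**2))**((g + 1) / (2 * (g - 1)))

        M_exit = brentq(lambda M: area_mach(M) - area_ratio, 1.001, 10.0)  # supersonic root
        T_exit = Tt / (1 + (g - 1) / 2 * M_exit**2)
        P_exit = Pt * (T_exit / Tt)**(g / (g - 1))
        V_exit = M_exit * np.sqrt(g * R * T_exit)

        # Choked mass flow through the throat.
        mdot = Pt * A_throat / np.sqrt(R * Tt) * np.sqrt(g) * (2 / (g + 1))**((g + 1) / (2 * (g - 1)))
        gross_thrust = mdot * V_exit + (P_exit - P_amb) * A_exit
        print(f"M_exit = {M_exit:.2f}, gross thrust = {gross_thrust / 1e3:.1f} kN")

    The later passes described above would then subtract friction, heat-transfer, and divergence losses from this ideal figure.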

  15. Wavelet Methods Developed to Detect and Control Compressor Stall

    NASA Technical Reports Server (NTRS)

    Le, Dzu K.

    1997-01-01

    A "wavelet" is, by definition, an amplitude-varying, short waveform with a finite bandwidth (e.g., that shown in the first two graphs). Naturally, wavelets are more effective than the sinusoids of Fourier analysis for matching and reconstructing signal features. In wavelet transformation and inversion, all transient or periodic data features (as in compressor-inlet pressures) can be detected and reconstructed by stretching or contracting a single wavelet to generate the matching building blocks. Consequently, wavelet analysis provides many flexible and effective ways to reduce noise and extract signals which surpass classical techniques - making it very attractive for data analysis, modeling, and active control of stall and surge in high-speed turbojet compressors. Therefore, fast and practical wavelet methods are being developed in-house at the NASA Lewis Research Center to assist in these tasks. This includes establishing user-friendly links between some fundamental wavelet analysis ideas and the classical theories (or practices) of system identification, data analysis, and processing.

  16. Development of CCD Imaging System Using Thermoelectric Cooling Method

    NASA Astrophysics Data System (ADS)

    Park, Youngsik; Lee, Ho Jin; Han, Wonyong; Nam, Uk-Won; Lee, Yong-Sam

    2000-06-01

    We developed a low-light CCD imaging system using a thermoelectric cooling method, in collaboration with a company, to design a commercial model. It consists of a Kodak KAF-0401E (768x512 pixels) CCD chip and a thermoelectric module manufactured by Thermotek. The TEC system can reach an operating temperature of -25 °C. We employed a Uniblitz VS25S shutter, which is capable of a minimum exposure time of 80 ms. The system components include an interface card using a Korea Astronomy Observatory (hereafter KAO) ISA-bus controller and image acquisition with an AD9816 chip, a 12-bit video processor. The performance test of this imaging system showed good operation within the initial specification of our design. It shows a dark current of less than 0.4 e-/pixel/sec at a temperature of -10 °C, a linearity of 99.9+/-0.1%, a gain of 4.24 e-/ADU, and a system noise of 25.3 e- (rms). For low-temperature CCD operation, we designed a TEC which uses a one-stage Peltier module and a forced-air heat exchanger. This TEC imaging system enables accurate photometry (+/-0.01 mag) even though the CCD is not at 'conventional' cryogenic temperatures (140 K). The system can be a useful instrument for other imaging applications. Finally, with this system, we obtained several images of astronomical objects for system performance tests.

  17. Approaches to improve development methods for therapeutic cancer vaccines.

    PubMed

    Ogi, Chizuru; Aruga, Atsushi

    2015-04-01

    Therapeutic cancer vaccines are an immunotherapy that amplifies or induces an active immune response against tumors. Notably, the methodologies established for existing anti-cancer drugs may have limitations when applied to cancer vaccine therapy. A retrospective analysis was performed using information obtained from ClinicalTrials.gov, PubMed, and published articles. Our research evaluated the optimal methodologies for therapeutic cancer vaccines based on (1) patient populations, (2) immune monitoring, (3) tumor response evaluation, and (4) supplementary therapies. Failure to optimize these methodologies at an early phase may impact development at later stages; thus, we have proposed some points to be considered during the early phase. Moreover, we compared our proposal with the guidance for industry issued by the US Food and Drug Administration in October 2011 entitled "Clinical Considerations for Therapeutic Cancer Vaccines". Consequently, while our research was aligned with the guidance, we hope it provides further insights in order to predict the risks and benefits and facilitate decisions for a new technology. We identified the following points for consideration: (1) include in the selection criteria the immunological stage with a prognostic value, which is as important as the tumor stage; (2) select immunological assays, such as phenotype analysis of lymphocytes, based on their features, and standardize assay methods; (3) utilize optimal response criteria for immunotherapy in therapeutic cancer vaccine trials; and (4) consider supplementary therapies, including immune checkpoint inhibitors, for future therapeutic cancer vaccines. PMID:25746315

  18. DEVELOPING METHODS FOR ANALYZING OIL DISPERSANTS IN SEAWATER

    EPA Science Inventory

    An analytical method was sought for determining the concentrations of dispersants in seawater contaminated with oil in both field and laboratory situations. Methods of analysis for surfactants found in the literature included spectrophotometry, gas chromatography (GC), thin-layer...

  19. 40 CFR 766.16 - Developing the analytical test method.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... meet the requirements of the chemical matrix. (d) Analysis. The method of choice is High Resolution Gas... analytical test method. Because of the matrix differences of the chemicals listed for testing, no one method for sample selection, preparation, extraction and clean up is prescribed. For analysis,...

  20. 40 CFR 766.16 - Developing the analytical test method.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... meet the requirements of the chemical matrix. (d) Analysis. The method of choice is High Resolution Gas... analytical test method. Because of the matrix differences of the chemicals listed for testing, no one method for sample selection, preparation, extraction and clean up is prescribed. For analysis,...

  1. 40 CFR 766.16 - Developing the analytical test method.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... meet the requirements of the chemical matrix. (d) Analysis. The method of choice is High Resolution Gas... analytical test method. Because of the matrix differences of the chemicals listed for testing, no one method for sample selection, preparation, extraction and clean up is prescribed. For analysis,...

  2. Development of a Benchtop Baking Method for Chemically Leavened Crackers

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Traditionally, the baking performance of soft wheat flours has been evaluated by well-established benchtop cookie-baking methods. In contrast, a benchtop cracker-baking method has not been widely explored or implemented as an official method, due to hurdles including the difficulty in finding ideal...

  3. Development of numerical methods to problems of micromechanics

    NASA Astrophysics Data System (ADS)

    Garcia-Martinez, Jose Ramon

    In this dissertation we utilize the finite element method to investigate three micromechanical problems. In Chapter 2, we study the compliance contribution tensor H of multiple branched cracks. The cracks grow from a deltoid pore at their center into a triple crack. For plane strain conditions, two-dimensional models of the branched crack are constructed and solved in ABAQUS. The displacement field over the surface of the branched crack and the deltoid is curve-fitted to carry out the surface integral of the compliance contribution tensor H. The predicted values are in good agreement with the analytical solution. In Chapter 3, a three-dimensional finite element program using an unaligned mesh with an eight-node isoparametric element is developed to study the compliance contribution tensor H of cavities with superellipsoid shapes. A mesh clustering algorithm is used to increase the number of elements inside and near the superellipsoid surface and obtain a mesh-independent solution. The numerical results are compared with the analytical solution for a sphere; the error of the numerical approximation varied from 8 to 11%. It is found that the number of elements inside the superellipsoid is insufficient. An algorithm to mesh the volumes inside and outside the cube independently is proposed to increase the accuracy of the calculation of H. As n1 and n2 increase, the numerical solutions show that H1111 → 0 and H2211 → 0. Although no analytical solution exists for these concave shapes, a bound of 0 for the terms H1111 and H2211 is suggested. Finally, in Chapter 4, a numerical verification of the cross-property connection between the effective fluid permeability and the effective electrical conductivity is studied. A molecular dynamics algorithm is used to generate a set of different microstructural patterns. The volumetric average over a cubic volume is used to obtain the effective electrical conductivity and the effective fluid permeability. The tortuosity of the porous phase

  4. Leadership Development Expertise: A Mixed-Method Analysis

    ERIC Educational Resources Information Center

    Okpala, Comfort O.; Hopson, Linda B.; Chapman, Bernadine; Fort, Edward

    2011-01-01

    In this study, the impact of graduate curriculum, experience, and standards on the development of leadership expertise was examined. The major goals of the study were to (1) examine the impact of college content curriculum in the development of leadership expertise, (2) examine the impact of on-the-job experience in the development of leadership…

  5. The development of accurate and efficient methods of numerical quadrature

    NASA Technical Reports Server (NTRS)

    Feagin, T.

    1973-01-01

    Some new methods for performing numerical quadrature of an integrable function over a finite interval are described. Each method provides a sequence of approximations of increasing order to the value of the integral. Each approximation makes use of all previously computed values of the integrand. The points at which new values of the integrand are computed are selected in such a way that the order of the approximation is maximized. The methods are compared with the quadrature methods of Clenshaw and Curtis, Gauss, Patterson, and Romberg using several examples.
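
    The defining feature described above, a sequence of approximations of increasing order in which every previously computed integrand value is reused, can be pictured with the classical Romberg scheme sketched below. This is only a generic illustration of the reuse idea, not Feagin's actual point-selection rule.

      # Illustration of the reuse idea (not the paper's method): trapezoid
      # refinement reuses every previously computed integrand value, and
      # Richardson extrapolation raises the order (classic Romberg scheme).
      import math

      def romberg(f, a, b, levels=6):
          """Return a table R; R[k][k] is the highest-order estimate at level k."""
          R = [[0.5 * (b - a) * (f(a) + f(b))]]
          h = b - a
          for k in range(1, levels):
              h *= 0.5
              # Only NEW midpoints are evaluated; old values enter via R[k-1][0].
              new_pts = sum(f(a + (2 * i - 1) * h) for i in range(1, 2 ** (k - 1) + 1))
              row = [0.5 * R[k - 1][0] + h * new_pts]
              for j in range(1, k + 1):
                  row.append(row[j - 1] + (row[j - 1] - R[k - 1][j - 1]) / (4 ** j - 1))
              R.append(row)
          return R

      table = romberg(math.sin, 0.0, math.pi)
      print(table[-1][-1])   # approaches 2.0 as the number of levels increases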

  6. Ten Years of GLAPHI Method Developing Scientific Research Abilities

    NASA Astrophysics Data System (ADS)

    Vega-Carrillo, Hector R.

    2006-12-01

    During the past ten years we have applied our method, GLAPHI, to teach how to do scientific research. The method has been applied from freshman students up to PhD professionals. The method is based on the search and analysis of scientific literature, the scientific question or problem, the formulation of the hypothesis and objective, the estimation of the project cost, and the timetable. It also includes statistics for research, author rights, ethics in research, publication of scientific papers, writing scientific reports and meeting presentations. In this work, successes and failures of the GLAPHI method will be discussed. Work partially supported by CONACyT (Mexico) under contract: SEP-2004-C01-46893

  7. Development of an automatic evaluation method for patient positioning error.

    PubMed

    Kubota, Yoshiki; Tashiro, Mutsumi; Shinohara, Ayaka; Abe, Satoshi; Souda, Saki; Okada, Ryosuke; Ishii, Takayoshi; Kanai, Tatsuaki; Ohno, Tatsuya; Nakano, Takashi

    2015-01-01

    Highly accurate radiotherapy needs highly accurate patient positioning. At our facility, patient positioning is manually performed by radiology technicians. After the positioning, positioning error is measured by manually comparing some positions on a digital radiography image (DR) to the corresponding positions on a digitally reconstructed radiography image (DRR). This method is prone to error and can be time-consuming because of its manual nature. Therefore, we propose an automated measuring method for positioning error to improve patient throughput and achieve higher reliability. The error between a position on the DR and a position on the DRR was calculated to determine the best matched position using the block-matching method. The zero-mean normalized cross correlation was used as our evaluation function, and a Gaussian weight function was used to increase importance as the pixel position approached the isocenter. The accuracy of the calculation method was evaluated using pelvic phantom images, and the method's effectiveness was evaluated on images of prostate cancer patients before positioning, comparing the results with the radiology technicians' measurements. The root mean square error (RMSE) of the calculation method for the pelvic phantom was 0.23 ± 0.05 mm. The correlation coefficients between the calculation method and the technicians' measurements were 0.989 for the phantom images and 0.980 for the patient images. The RMSE of the total evaluation results of positioning for prostate cancer patients using the calculation method was 0.32 ± 0.18 mm. Using the proposed method, we successfully measured residual positioning errors. The accuracy and effectiveness of the method were evaluated for pelvic phantom images and images of prostate cancer patients. In the future, positioning for cancer patients at other sites will be evaluated using the calculation method. Consequently, we expect an improvement in treatment throughput for these other sites
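
    A sketch of the evaluation function described above is given below: a zero-mean normalized cross correlation between a DR patch and a DRR patch, weighted by a Gaussian that emphasizes pixels near the isocenter, with the best integer shift found by exhaustive search. The patch handling, search range, and sigma are illustrative assumptions rather than the authors' exact implementation.

      # Sketch: Gaussian-weighted zero-mean normalized cross correlation (ZNCC)
      # block matching between a DR image and a DRR image. Illustrative only.
      import numpy as np

      def gaussian_weight(shape, center, sigma):
          yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
          r2 = (yy - center[0]) ** 2 + (xx - center[1]) ** 2
          return np.exp(-0.5 * r2 / sigma ** 2)

      def weighted_zncc(patch_a, patch_b, w):
          """Weighted ZNCC of two equal-size patches."""
          a = patch_a - np.average(patch_a, weights=w)
          b = patch_b - np.average(patch_b, weights=w)
          num = np.sum(w * a * b)
          den = np.sqrt(np.sum(w * a * a) * np.sum(w * b * b))
          return num / den if den > 0 else 0.0

      def best_shift(dr, drr, isocenter, search=10, sigma=50.0):
          """Exhaustively search integer shifts; return the shift maximizing weighted ZNCC."""
          h, w_ = drr.shape
          weights = gaussian_weight((h - 2 * search, w_ - 2 * search),
                                    (isocenter[0] - search, isocenter[1] - search), sigma)
          ref = drr[search:h - search, search:w_ - search]
          best = (0, 0)
          best_score = -np.inf
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  cand = dr[search + dy:h - search + dy, search + dx:w_ - search + dx]
                  score = weighted_zncc(cand, ref, weights)
                  if score > best_score:
                      best_score, best = score, (dy, dx)
          return best, best_score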

  8. Development of a harmonised method for the profiling of amphetamines: III. Development of the gas chromatographic method.

    PubMed

    Andersson, Kjell; Jalava, Kaisa; Lock, Eric; Finnon, Yvonne; Huizer, Henk; Kaa, Elisabet; Lopes, Alvaro; Poortman-van der Meer, Anneke; Cole, Michael D; Dahlén, Johan; Sippola, Erkki

    2007-06-14

    This study focused on gas chromatographic analysis of target compounds found in illicit amphetamine synthesised by the Leuckart reaction, reductive amination of benzyl methyl ketone, and the nitrostyrene route. The analytical method was investigated and optimised with respect to introduction of amphetamine samples into the gas chromatograph and separation and detection of the target substances. Sample introduction using split and splitless injection was tested at different injector temperatures, and their ability to transfer the target compounds to the GC column was evaluated using cold on-column injection as a reference. Taking the results from both techniques into consideration, a temperature of 250 degrees C was considered to be the best compromise. The most efficient separation was achieved with a DB-35MS capillary column (35% diphenyl 65% dimethyl silicone; 30 m x 0.25 mm, d(f) 0.25 microm) and an oven temperature program that started at 90 degrees C (1 min) and was increased by 8 degrees C/min to 300 degrees C (10 min). Reproducibility, repeatability, linearity, and limits of determination for the flame ionisation detector (FID), nitrogen phosphorous detector (NPD), and mass spectrometry (MS) in scan mode and selected ion monitoring (SIM) mode were evaluated. In addition, selectivity was studied applying FID and MS in both scan and SIM mode. It was found that reproducibility, repeatability, and limits of determination were similar for FID, NPD, and MS in scan mode. Moreover, the linearity was better when applying FID or NPD, whereas the selectivity was better when utilising the MS. Finally, the introduction of target compounds to the GC column was compared for injection volumes of 0.2 microl, 1 microl, 2 microl, and 4 microl with splitless injection and 1 microl with split injection (split ratio 1:40). It was demonstrated that splitless injections of 1 microl, 2 microl, and 4 microl could be employed in the developed method, while split

  9. METHOD DEVELOPMENT FOR DETERMINATION OF POLYCHLORINATED HYDROCARBONS IN MUNICIPAL SLUDGE

    EPA Science Inventory

    The method provides a procedure for analysis of pesticides and PCB's in municipal sludge. The method includes extraction by a centrifuge technique of the chlorinated compounds from the sludge matrix; clean-up of the extract to remove interferences by sulfur precipitation as mercu...

  10. [Cognitive functions, their development and modern diagnostic methods].

    PubMed

    Klasik, Adam; Janas-Kozik, Małgorzata; Krupka-Matuszczyk, Irena; Augustyniak, Ewa

    2006-01-01

    Cognitive psychology is an interdisciplinary field whose main aim is to study the thinking mechanisms of humans leading to cognizance. The concept of human cognitive processes therefore encompasses the knowledge related to the mechanisms which determine the way humans acquire information from the environment and utilize their knowledge and experience. There are three basic processes which need to be distinguished when discussing human perception development: acquiring sensations, perceptiveness and attention. Acquiring sensations means the experience arising from the stimulation of a single sense organ, i.e. detection and differentiation of sensory information. Perceptiveness stands for the interpretation of sensations and may include recognition and identification of sensory information. The attention process relates to the selectivity of perception. Higher-order mental processes used in cognition, through which humans try to understand the world and adapt to it, undoubtedly include the processes of memory, reasoning, learning and problem solving. There is a great difference in human cognitive functioning at different stages of one's life (from infancy to adulthood). The difference is both quantitative and qualitative. There are three main approaches to the development of human cognitive functioning: Jean Piaget's approach, the information processing approach and the psychometric approach. Piaget's ideas continue to form the groundwork of child cognitive psychology. Piaget identified four developmental stages of child cognition: 1. Sensorimotor stage (birth - 2 years old); 2. Preoperational stage (ages 2-7); 3. Concrete operations (ages 7-11); 4. Formal operations (ages 11 and older). The supporters of the information processing approach use a computer metaphor to present a model of human cognitive functioning. The three important mechanisms involved are coding, automation and strategy design, and they often co-operate. This theory has

  11. Developing a multimodal biometric authentication system using soft computing methods.

    PubMed

    Malcangi, Mario

    2015-01-01

    Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometrics offers several advantages, mainly in embedded system applications. Hard and soft multi-biometrics, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize its applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision. PMID:25502384
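
    The fusion-and-decision stage described above can be pictured with a very small score-level sketch: each matcher's score is fuzzified with a membership function, and the memberships are aggregated into an accept/reject decision. The membership shapes, weights, and threshold below are illustrative assumptions, not the chapter's actual fuzzy logic engine.

      # Minimal score-level fusion sketch for two biometric matchers.
      # Membership functions, weights, and threshold are illustrative only.
      def trapezoid(x, a, b, c, d):
          """Trapezoidal fuzzy membership on [a, d] with plateau [b, c]."""
          if x <= a or x >= d:
              return 0.0
          if b <= x <= c:
              return 1.0
          return (x - a) / (b - a) if x < b else (d - x) / (d - c)

      def fuse_and_decide(voice_score, finger_score, threshold=0.6):
          """Fuzzify each matcher score as 'genuine', aggregate, and decide."""
          genuine_voice = trapezoid(voice_score, 0.3, 0.6, 1.0, 1.01)
          genuine_finger = trapezoid(finger_score, 0.4, 0.7, 1.0, 1.01)
          # Simple equal-weight aggregation of the two memberships.
          confidence = 0.5 * genuine_voice + 0.5 * genuine_finger
          return confidence >= threshold, confidence

      print(fuse_and_decide(0.72, 0.81))   # (True, 1.0) for these sample scores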

  12. Measurement Practices: Methods for Developing Content-Valid Student Examinations.

    ERIC Educational Resources Information Center

    Bridge, Patrick D.; Musial, Joseph; Frank, Robert; Roe, Thomas; Sawilowsky, Shlomo

    2003-01-01

    Reviews the fundamental principles associated with achieving a high level of content validity when developing tests for students. Suggests that the short-term efforts necessary to develop and integrate measurement theory into practice will lead to long-term gains for students, faculty, and academic institutions. (Includes 21 references.)…

  13. Developing Scientific Thinking Methods and Applications in Islamic Education

    ERIC Educational Resources Information Center

    Al-Sharaf, Adel

    2013-01-01

    This article traces the early and medieval Islamic scholarship to the development of critical and scientific thinking and how they contributed to the development of an Islamic theory of epistemology and scientific thinking education. The article elucidates how the Qur'an and the Sunna of Prophet Muhammad have also contributed to the…

  14. Adult Education in Development. Methods and Approaches from Changing Societies.

    ERIC Educational Resources Information Center

    McGivney, Veronica; Murray, Frances

    The case studies described in this book provide examples of initiatives illustrating the role of adult education in development and its contribution to the process of change in developing countries. The book is organized in five sections. Case studies in Part 1, "Health Education," illustrate the links between primary health care and adult…

  15. Sustainable Development Index in Hong Kong: Approach, Method and Findings

    ERIC Educational Resources Information Center

    Tso, Geoffrey K. F.; Yau, Kelvin K. W.; Yang, C. Y.

    2011-01-01

    Sustainable development is a priority area of research in many countries and regions nowadays. This paper illustrates how a multi-stakeholders engagement process can be applied to identify and prioritize the local community's concerns and issues regarding sustainable development in Hong Kong. Ten priority areas covering a wide range of community's…

  16. METHOD DEVELOPMENT, EVALUATION, REFINEMENT, AND ANALYSIS FOR FIELD STUDIES

    EPA Science Inventory

    Manufacturers routinely introduce new pesticides into the marketplace and discontinue manufacturing older pesticides that may be more toxic to humans. Analytical methods and environmental data are needed for current use residential pesticides (e.g., pyrethrins, synthetic pyrethr...

  17. RESEARCH ON THE DEVELOPMENT OF SEDIMENT TOXICITY IDENTIFICATION (TIE) METHODS

    EPA Science Inventory

    A common method for determining whether contaminants in sediments represent an environmental risk is to perform toxicity tests. Toxicity tests indicate if contaminants in sediments are bioavailable and capable of causing adverse biological effects (e.g., mortality, reduced growt...

  18. Development of a Matched Runs Method for VERITAS

    NASA Astrophysics Data System (ADS)

    Flinders, Andrew; VERITAS Collaboration

    2016-03-01

    VERITAS is an array of four Imaging Air Cherenkov Telescopes located in southern Arizona. It has been successful in detecting Very High Energy (VHE) radiation from a variety of sources including pulsars, Pulsar Wind Nebulae, Blazars, and High Mass X-Ray Binary systems. Each of these detections been accomplished using either the standard Ring Background Method or the Reflected Region Method in order to determine the appropriate background for the source region. For highly extended sources (>1 degree) these background estimation methods become unsuitable due to the possibility of source contamination in the background regions. A new method, called the matched background method, has been implemented for potentially highly extended sources observed by VERITAS. It provides and algorithm for identifying a suitable gamma-ray background estimation from a different field of view than the source region. By carefully matching cosmic-ray event rates between the source and the background sky observations, a suitable gamma-ray background matched data set can be identified. We will describe the matched background method and give examples of its use for several sources including the Crab Nebula and IC443. This research is supported by Grants from the U.S. Department of Energy Office of Science, the U.S. National Science Foundation and the Smithsonian Institution, and by NSERC in Canada.
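
    The core matching step can be pictured with the sketch below, which pairs each source (ON) run with the background-field (OFF) run whose cosmic-ray rate is closest, subject to a maximum fractional mismatch. The 5 percent tolerance and the single-variable matching are illustrative assumptions; the actual procedure matches on additional observables as well.

      # Sketch of the run-matching idea: for each ON (source) run, pick the OFF
      # (background-field) run with the closest cosmic-ray rate, within a
      # maximum fractional mismatch. Run ids, rates, and tolerance are made up.
      def match_runs(on_runs, off_runs, max_frac_diff=0.05):
          """on_runs / off_runs: dicts mapping run id -> cosmic-ray rate (Hz)."""
          pairs = {}
          for on_id, on_rate in on_runs.items():
              best_id, best_diff = None, None
              for off_id, off_rate in off_runs.items():
                  diff = abs(off_rate - on_rate) / on_rate
                  if best_diff is None or diff < best_diff:
                      best_id, best_diff = off_id, diff
              if best_diff is not None and best_diff <= max_frac_diff:
                  pairs[on_id] = best_id
          return pairs

      on = {"run_81234": 310.0, "run_81235": 295.0}
      off = {"run_80990": 305.0, "run_81002": 260.0}
      print(match_runs(on, off))   # both ON runs pair with run_80990 here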

  19. Development of an SPE/CE method for analyzing HAAs

    USGS Publications Warehouse

    Zhang, L.; Capel, P.D.; Hozalski, R.M.

    2007-01-01

    The haloacetic acid (HAA) analysis methods approved by the US Environmental Protection Agency involve extraction and derivatization of HAAs (typically to their methyl ester form) and analysis by gas chromatography (GC) with electron capture detection (ECD). Concerns associated with these methods include the time and effort of the derivatization process, use of potentially hazardous chemicals or conditions during methylation, poor recoveries because of low extraction efficiencies for some HAAs or matrix effects from sulfate, and loss of tribromoacetic acid because of decarboxylation. The HAA analysis method introduced here uses solid-phase extraction (SPE) followed by capillary electrophoresis (CE) analysis. The method is accurate, reproducible, sensitive, relatively safe, and easy to perform, and avoids the use of large amounts of solvent for liquid-liquid extraction and the potential hazards and hassles of derivatization. The cost of analyzing HAAs using this method should be lower than the currently approved methods, and utilities with a GC/ECD can perform the analysis in-house.

  20. Development of a Magnetic Attachment Method for Bionic Eye Applications.

    PubMed

    Fox, Kate; Meffin, Hamish; Burns, Owen; Abbott, Carla J; Allen, Penelope J; Opie, Nicholas L; McGowan, Ceara; Yeoh, Jonathan; Ahnood, Arman; Luu, Chi D; Cicione, Rosemary; Saunders, Alexia L; McPhedran, Michelle; Cardamone, Lisa; Villalobos, Joel; Garrett, David J; Nayagam, David A X; Apollo, Nicholas V; Ganesan, Kumaravelu; Shivdasani, Mohit N; Stacey, Alastair; Escudie, Mathilde; Lichter, Samantha; Shepherd, Robert K; Prawer, Steven

    2016-03-01

    Successful visual prostheses require stable, long-term attachment. Epiretinal prostheses, in particular, require attachment methods to fix the prosthesis onto the retina. The most common method is fixation with a retinal tack; however, tacks cause retinal trauma, and surgical proficiency is important to ensure optimal placement of the prosthesis near the macula. Accordingly, alternate attachment methods are required. In this study, we detail a novel method of magnetic attachment for an epiretinal prosthesis using two prosthesis components positioned on opposing sides of the retina. The magnetic attachment technique was piloted in a feline animal model (chronic, nonrecovery implantation). We also detail a new method to reliably control the magnet coupling force using heat. It was found that the force exerted upon the tissue that separates the two components could be minimized, as the measured force is proportionally smaller at the working distance. We thus detail, for the first time, a surgical method using customized magnets to position and affix an epiretinal prosthesis on the retina. The position of the epiretinal prosthesis is reliable, and its location on the retina is accurately controlled by the placement of a secondary magnet in the suprachoroidal location. The electrode position above the retina is less than 50 microns at the center of the device, although there were pressure points seen at the two edges due to curvature misalignment. The degree of retinal compression found in this study was unacceptably high; nevertheless, the normal structure of the retina remained intact under the electrodes. PMID:26416723

  1. Methods and Challenges in Quantitative Imaging Biomarker Development

    PubMed Central

    Abramson, Richard G.; Burton, Kirsteen R.; Yu, John-Paul J.; Scalzetti, Ernest M.; Yankeelov, Thomas E.; Rosenkrantz, Andrew B.; Mendiratta-Lala, Mishal; Bartholmai, Brian J.; Ganeshan, Dhakshinamoorthy; Lenchik, Leon; Subramaniam, Rathan M.

    2014-01-01

    Academic radiology is poised to play an important role in the development and implementation of quantitative imaging (QI) tools. This manuscript, drafted by the Association of University Radiologists (AUR) Radiology Research Alliance (RRA) Quantitative Imaging Task Force, reviews current issues in QI biomarker research. We discuss motivations for advancing QI, define key terms, present a framework for QI biomarker research, and outline challenges in QI biomarker development. We conclude by describing where QI research and development is currently taking place and discussing the paramount role of academic radiology in this rapidly evolving field. PMID:25481515

  2. The historical development of the magnetic method in exploration

    USGS Publications Warehouse

    Nabighian, M.N.; Grauch, V.J.S.; Hansen, R.O.; LaFehr, T.R.; Li, Y.; Peirce, J.W.; Phillips, J.D.; Ruder, M.E.

    2005-01-01

    The magnetic method, perhaps the oldest of geophysical exploration techniques, blossomed after the advent of airborne surveys in World War II. With improvements in instrumentation, navigation, and platform compensation, it is now possible to map the entire crustal section at a variety of scales, from strongly magnetic basement at regional scale to weakly magnetic sedimentary contacts at local scale. Methods of data filtering, display, and interpretation have also advanced, especially with the availability of low-cost, high-performance personal computers and color raster graphics. The magnetic method is the primary exploration tool in the search for minerals. In other arenas, the magnetic method has evolved from its sole use for mapping basement structure to include a wide range of new applications, such as locating intrasedimentary faults, defining subtle lithologic contacts, mapping salt domes in weakly magnetic sediments, and better defining targets through 3D inversion. These new applications have increased the method's utility in all realms of exploration - in the search for minerals, oil and gas, geothermal resources, and groundwater, and for a variety of other purposes such as natural hazards assessment, mapping impact structures, and engineering and environmental studies. © 2005 Society of Exploration Geophysicists. All rights reserved.

  3. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ASTROVIRUS IN WATER

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus. We have developed a sensitive reverse transcription-polymerase ...

  4. Development of a rapid assimilable organic carbon method for water.

    PubMed

    Lechevallier, M W; Shaw, N E; Kaplan, L A; Bott, T L

    1993-05-01

    A rapid method for measurement of assimilable organic carbon (AOC) is proposed. The time needed to perform the assay is reduced by increasing the incubation temperature and increasing the inoculum density. The ATP luciferin-luciferase method quickly enumerates the test organisms without the need for plate count media or dilution bottles. There was no significant difference between AOC values determined with strain P17 for the ATP and plate count procedures. For strain NOX, the plate count procedure underestimated bacterial levels in some samples. Comparison of AOC values obtained by the Belleville laboratory (by the ATP technique) and the Stroud Water Research Center (by plate counts) showed that values were significantly correlated and not significantly different. The study concludes that the rapid AOC method can quickly determine the bacterial growth potential of water within 2 to 4 days. PMID:16348936

  5. Development of a dynamically adaptive grid method for multidimensional problems

    NASA Astrophysics Data System (ADS)

    Holcomb, J. E.; Hindman, R. G.

    1984-06-01

    An approach to solution adaptive grid generation for use with finite difference techniques, previously demonstrated on model problems in one space dimension, has been extended to multidimensional problems. The method is based on the popular elliptic steady grid generators, but is 'dynamically' adaptive in the sense that a grid is maintained at all times satisfying the steady grid law driven by a solution-dependent source term. Testing has been carried out on Burgers' equation in one and two space dimensions. Results appear encouraging both for inviscid wave propagation cases and viscous boundary layer cases, suggesting that application to practical flow problems is now possible. In the course of the work, obstacles relating to grid correction, smoothing of the solution, and elliptic equation solvers have been largely overcome. Concern remains, however, about grid skewness, boundary layer resolution and the need for implicit integration methods. Also, the method in 3-D is expected to be very demanding of computer resources.
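
    A one-dimensional sketch of the "steady grid law driven by a solution-dependent source" is given below: interior grid points are relaxed so that they equidistribute a weight based on the local solution gradient. The weight definition, relaxation loop, and tanh test profile are illustrative assumptions; the paper itself works with multidimensional elliptic grid generators.

      # 1-D sketch of solution-adaptive grid equidistribution: relax the grid
      # law (w x_xi)_xi = 0 with w tied to the solution gradient. Illustrative
      # only; the weight and relaxation strategy are assumed, not the paper's.
      import numpy as np

      def adapt_grid(x, u_of_x, sweeps=200, alpha=5.0):
          """Relax interior points toward equidistribution of w = 1 + alpha*|du/dx|."""
          x = x.copy()
          for _ in range(sweeps):
              u = u_of_x(x)
              dudx = np.gradient(u, x)
              w = 1.0 + alpha * np.abs(dudx)
              w_half = 0.5 * (w[:-1] + w[1:])          # weight at cell midpoints
              # Gauss-Seidel update of interior points from (w x_xi)_xi = 0
              for i in range(1, x.size - 1):
                  x[i] = (w_half[i] * x[i + 1] + w_half[i - 1] * x[i - 1]) / (w_half[i] + w_half[i - 1])
          return x

      # Example: cluster points near the steep front of a tanh (Burgers-like) profile
      x0 = np.linspace(-1.0, 1.0, 41)
      x_adapted = adapt_grid(x0, lambda x: np.tanh(20.0 * x))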

  6. Development method of the motor winding's ultrasonic cleaning equipment

    NASA Astrophysics Data System (ADS)

    Jiang, Yingzhan; Wang, Caiyuan; Ao, Chenyang; Zhang, Haipeng

    2013-03-01

    Solving the complicated problem of motor winding cleaning requires new technologies such as ultrasonic cleaning. The mechanisms of two problems, the degradation of the motor winding's insulation level over time and the winding becoming damp again soon after processing, were analyzed. The ultrasonic cleaning method was studied and an ultrasonic cleaning device was designed. Its safety was verified by a destructive experiment. The tests show that this device can thoroughly clear away the deposited dirt in the winding, which provides a new idea and method to ensure the winding's insulation level and realize its safe and reliable operation.

  7. Diagnostic Methods for Platelet Bacteria Screening: Current Status and Developments

    PubMed Central

    Störmer, Melanie; Vollmer, Tanja

    2014-01-01

    Summary Bacterial contamination of blood components and the prevention of transfusion-associated bacterial infection still remains a major challenge in transfusion medicine. Over the past few decades, a significant reduction in the transmission of viral infections has been achieved due to the introduction of mandatory virus screening. Platelet concentrates (PCs) represent one of the highest risks for bacterial infection. This is due to the required storage conditions for PCs in gas-permeable containers at room temperature with constant agitation, which support bacterial proliferation from low contamination levels to high titers. In contrast to virus screening, since 1997 in Germany bacterial testing of PCs is only performed as a routine quality control or, since 2008, to prolong the shelf life to 5 days. In general, bacterial screening of PCs by cultivation methods is implemented by the various blood services. Although these culturing systems will remain the gold standard, the significance of rapid methods for screening for bacterial contamination has increased over the last few years. These new methods provide powerful tools for increasing the bacterial safety of blood components. This article summarizes the course of policies and provisions introduced to increase bacterial safety of blood components in Germany. Furthermore, we give an overview of the different diagnostic methods for bacterial screening of PCs and their current applicability in routine screening processes. PMID:24659944

  8. DEVELOPMENT OF MOLECULAR METHODS TO DETECT EMERGING VIRUSES

    EPA Science Inventory

    A large number of human enteric viruses are known to cause gastrointestinal illness and waterborne outbreaks. Many of these are emerging viruses that do not grow or grow poorly in cell culture, and so molecular detection methods based on the polymerase chain reaction (PCR) are be...

  9. Method development for determination of fluroxypyr in water.

    PubMed

    Halimah, M; Tan, Y A; Aini, K; Ismail, B S

    2003-07-01

    Improved methods for extraction and clean-up of fluroxypyr residue in water have been established. Two methods of fluroxypyr extraction were used, namely direct measurement of fluroxypyr and concentration of fluroxypyr onto a solid-phase extraction (SPE) adsorbent, followed by elution with solvent before determination of fluroxypyr. The recovery for direct measurement of fluroxypyr in water containing 8-100 microg L(-1) ranged from 86 to 110%, with relative standard deviation of 0.7 to 2.15%. For the second method, three types of SPE sorbent were used, viz. C18, C18 end-capped and polyvinyl dibenzene (ISOLUTE ENV+). The procedure involved concentrating the analyte from fluroxypyr-spiked water at pH 3, followed by elution of the analyte with 4 mL of acetonitrile. The recovery of fluroxypyr from the spiked sample at 1 to 50 microg L(-1) after eluting through either C18 or C18 end-capped ranged from 40-64% (with relative standard deviation of 0.7 to 2.15) and 41-65% (with standard deviation of 1.52 to 11.9), respectively. The use of ISOLUTE ENV+ gave better results than the C18, C18 end-capped or the direct measurement methods. The recovery and standard deviation of fluroxypyr from spiked water using ISOLUTE ENV+ ranged from 91-102% and 2.5 to 5.3, respectively. PMID:12856925
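
    For reference, recovery percentages and relative standard deviations of the kind quoted above are conventionally computed from replicate spiked samples as sketched below; the replicate values in the example are made up for illustration.

      # Conventional percent-recovery and RSD calculation from replicate spikes.
      # Replicate values below are invented for illustration only.
      import statistics

      def percent_recovery(measured_ug_per_L, spiked_ug_per_L):
          return 100.0 * measured_ug_per_L / spiked_ug_per_L

      def relative_std_dev(values):
          """RSD (%) = 100 * sample standard deviation / mean."""
          return 100.0 * statistics.stdev(values) / statistics.mean(values)

      replicates = [47.2, 48.9, 46.5, 48.1]      # measured, microg/L, for a 50 microg/L spike
      recoveries = [percent_recovery(m, 50.0) for m in replicates]
      print(statistics.mean(recoveries), relative_std_dev(replicates))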

  10. Development of Fingerprinting Method in Sediment Source Studies

    NASA Astrophysics Data System (ADS)

    Du, Pengfei; Ning, Duihu; Huang, Donghao

    2016-04-01

    Sediment source study is valuable for watershed sediment budgets, sediment control in channels, soil erosion model validation and evaluation of the benefits of soil and water conservation. As one of the methods for identifying sediment sources, fingerprinting has proven effective and has been adopted in different countries around the world. This paper briefly introduces the fingerprinting method in terms of models, diagnostic sediment properties, regions of application, spatial and temporal scales, and classification of sediment source types. Combined with environmental radionuclides as time markers (such as 137Cs and 210Pb), reconstruction of the sediment source history has become possible with this method. However, some uncertainties await confirmation when introducing the fingerprinting technique to sediment-related studies: efficient sampling strategies linking sediment sources and fingerprint properties need to be made clearer, detailed methods should be provided for spatial-scale links (up-scaling and down-scaling), and model calibration needs to be updated to improve estimation precision. (This paper is a contribution to the project of National Natural Science Foundation of China (No. 41501299), the non-profit project of Ministry of Water Resources of China (No. 201501045), and the project of Youth Scientific Research of China Institute of Water Resources and Hydropower Research (Using fingerprinting technique to study sediment source in a typical small watershed of black soil region in northeast China))
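
    A generic form of the fingerprinting mixing model used in many such studies (not a specific model from this review) is sketched below: source proportions are found that minimize the relative misfit between the mixture's tracer values and the proportion-weighted source means, subject to non-negativity and a unit-sum constraint. The tracer values in the toy example are invented.

      # Generic fingerprinting mixing model: estimate source proportions p that
      # minimize the relative misfit to the mixture tracer values, with p >= 0
      # and sum(p) = 1. A common literature formulation, shown here as a sketch.
      import numpy as np
      from scipy.optimize import minimize

      def unmix(mixture, source_means):
          """mixture: (n_tracers,); source_means: (n_sources, n_tracers)."""
          n_sources = source_means.shape[0]

          def misfit(p):
              predicted = p @ source_means
              return np.sum(((mixture - predicted) / mixture) ** 2)

          result = minimize(
              misfit,
              x0=np.full(n_sources, 1.0 / n_sources),
              method="SLSQP",
              bounds=[(0.0, 1.0)] * n_sources,
              constraints=[{"type": "eq", "fun": lambda p: np.sum(p) - 1.0}],
          )
          return result.x

      # Toy example with three tracers and two sources (values are made up)
      sources = np.array([[12.0, 0.8, 30.0],    # e.g. hillslope tracer means
                          [ 4.0, 2.5, 55.0]])   # e.g. channel-bank tracer means
      mixture = np.array([8.0, 1.6, 42.0])
      print(unmix(mixture, sources))            # roughly equal contributions here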

  11. DEVELOPMENT AND VALIDATION OF A TEST METHOD FOR ACRYLONITRILE EMISSIONS

    EPA Science Inventory

    Acrylonitrile (AN) has been identified as a suspected carcinogen and may be regulated in the future as a hazardous air pollutant under Section 112 of the Clean Air Act. A method was validated that utilizes a midget impinger containing methanol for trapping AN vapors followed by a...

  12. DEVELOP NEW TOC/DOC METHOD WITH EXPANDED QUALITY CONTROL

    EPA Science Inventory

    The purpose of this project is to provide a total organic carbon (TOC)/specific ultraviolet absorbance (SUVA) method that will be used by the Office of Ground Water and Drinking Water (OGWDW) to support monitoring requirements of the Stage 2 Disinfectant/Disinfection By-products ...

  13. The Development of a Qualitative Analyzing Method for Concept Maps

    ERIC Educational Resources Information Center

    Ozgun Koca, S. Asli; Sen, Ahmet Ilhan

    2004-01-01

    Concept mapping has been widely used as one of the most efficient methods of revealing the cognitive structure of an individual on any concept. There are not only different concept mapping techniques but also different ways of analysis. It has been suggested that concept maps provide valuable and rich information which becomes disoriented when…

  14. New Method for Data Treatment Developed at ESO

    NASA Astrophysics Data System (ADS)

    1996-08-01

    How can the scientific return from the VLT and other telescopes such as the HST best be optimised? It is exactly for this reason that astronomers and engineers at ESO are now busy developing new methods of telescope operation and data analysis alongside the VLT instrumental hardware itself. The new solution by means of models: the appropriate strategy for making progress in the inherent conflict between calibration demand and time available for scientific observations is to obtain a physically correct understanding of the effects exerted on the data by different instruments. In this way, it is possible to decide which calibration data are actually required and on which timescale they have to be updated. One can then use computer models of these instruments to predict calibration solutions which are valid for the full range of target properties and which handle environmental conditions properly. Such computer models can also be used to simulate observations. This brings many benefits for the entire observational process. First, the astronomer can prepare observations and select instrumental modes and exposure times suited for optimal information return. Secondly, it provides confidence in the validity of the calibration process, and therefore in the cleanliness of the corrected data. Finally, once a theory about the target and its properties has been developed, one may simulate observations of a set of theoretical targets for which the properties are slightly modified in order to study their influence on the raw data. For the observatory there are also advantages. Optimization from the point of view of data analysis can now take place during instrument design; calibration and data analysis procedures for any observational mode can be tested before real observations are obtained; and the maintenance staff can make sure that the instrument performs as expected and designed. How far have we come along this road? The present project consists of a close collaboration between

  15. Trust in healthcare settings: Scale development, methods, and preliminary determinants

    PubMed Central

    LoCurto, Jamie; Berg, Gina M

    2016-01-01

    The literature contains research regarding how trust is formed in healthcare settings but rarely discusses trust formation in an emergent care population. A literature review was conducted to determine which of the trust determinants are important for this process as well as how to develop a scale to measure trust. A search generated a total of 155 articles, 65 of which met eligibility criteria. Determinants that were important included the following: honesty, confidentiality, dependability, communication, competency, fiduciary responsibility, fidelity, and agency. The process of developing a scale includes the following: a literature review, qualitative analysis, piloting, and survey validation. Results suggest that physician behaviors are important in influencing trust in patients and should be included in scales measuring trust. Next steps consist of interviewing emergent care patients to commence the process of developing a scale.

  16. Standardized development of computer software. Part 1: Methods

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.

  17. Nickel Catalysis: Synergy between Method Development and Total Synthesis

    PubMed Central

    Standley, Eric A.; Tasker, Sarah Z.; Jensen, Kim L.; Jamison, Timothy F.

    2015-01-01

    Natural products are a continual source of inspiration for chemists, particularly for organic chemists engaged in reaction development and methodology. In the early stages of our research program, we were drawn to macrocyclic natural products containing allylic alcohol moieties, such as (−)-terpestacin (1, Figure 1). We envisioned that, in an ideal case, an intramolecular reductive coupling (a field still in its infancy at the time) could be developed to join an alkyne and an aldehyde to yield this allylic alcohol, simultaneously closing the macrocycle. For this reason we began studying reductive coupling as a tool for C–C bond formation. Additionally, as our program developed, it became clear that a number of other natural products, such as amphidinolide T1 (2), although they do not contain allylic alcohols, could be produced in an analogous fashion after modification of the allylic alcohol formed in such a macrocyclization. PMID:25905431

  18. Development and Application of Agglomerated Multigrid Methods for Complex Geometries

    NASA Technical Reports Server (NTRS)

    Nishikawa, Hiroaki; Diskin, Boris; Thomas, James L.

    2010-01-01

    We report progress in the development of agglomerated multigrid techniques for fully unstructured grids in three dimensions, building upon two previous studies focused on efficiently solving a model diffusion equation. We demonstrate a robust fully-coarsened agglomerated multigrid technique for 3D complex geometries, incorporating the following key developments: consistent and stable coarse-grid discretizations, a hierarchical agglomeration scheme, and line-agglomeration/relaxation using prismatic-cell discretizations in the highly-stretched grid regions. A significant speed-up in computer time is demonstrated for a model diffusion problem, the Euler equations, and the Reynolds-averaged Navier-Stokes equations for 3D realistic complex geometries.

  19. The mechanics of development: models and methods for tissue morphogenesis

    PubMed Central

    Gjorevski, Nikolce; Nelson, Celeste M.

    2011-01-01

    Embryonic development is a physical process during which masses of cells are sculpted into functional organs. The mechanical properties of tissues and the forces exerted on them serve as epigenetic regulators of morphogenesis. Understanding these mechanobiological effects in the embryo requires new experimental approaches. Here we focus on branching of the lung airways and bending of the heart tube to describe examples of mechanical and physical cues that guide cell fate decisions and organogenesis. We highlight recent technological advances to measure tissue elasticity and endogenous mechanical stresses in real time during organ development. We also discuss recent progress in manipulating forces in intact embryos. PMID:20860059

  20. Development of a method to investigate medical students' perceptions of their personal and professional development.

    PubMed

    Lown, Nick; Davies, Ioan; Cordingley, Lis; Bundy, Chris; Braidman, Isobel

    2009-10-01

    Personal and Professional Development (PPD) is now key to the undergraduate medical curriculum and requires provision of appropriate learning experiences. In order to achieve this, it is essential that we ascertain students' perceptions of what is important in their PPD. We required a methodological approach suitable for a large medical school, which defines constructs used by the students to describe their PPD and is not constrained by a researcher's predetermined line of questioning. It should also quantify the saliencies of these constructs in the student population and indicate how students gauge their own PPD. The instrument should also be suitable for administration at key stages of the students' learning experience. Here we describe the first stages in developing a novel method which fulfils these requirements. It is based on a modified self repertory grid, the "Ideal Self" Inventory. All first-year students (N = 379) provided five descriptors of a "good medical student" and of a "not very good medical student", which generated 1,531 'ideal' qualities. To define underlying themed constructs, 49 randomly selected descriptors were grouped together by self-selected students (n = 55), using commonly held assumptions. Frequency of item co-occurrence was tabulated by multidimensional scaling. Themed clusters of 'ideal' qualities, defined by hierarchical cluster analysis, were overlaid onto the multidimensional scaling to generate a concept map. This revealed seven themed constructs: Personal Welfare, Time and Self Management, Committed Work Ethic, Learning Skills, Personal Development/Reflection, Personal and Professional Conduct, and Teamwork. We then analysed the 1,531 'ideal' qualities by determining the frequency with which students used each construct and the proportion of students who used a construct at least once. Personal and Professional Conduct, Committed Work Ethic and Time and Self Management were the most frequently used, implying that they were the

  1. DEVELOPMENT OF A MOLECULAR METHOD TO DETECT ASTROVIRUS

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...

  2. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY ASTROVIRUS IN WATER.

    EPA Science Inventory

    Astrovirus is a common cause of gastroenteritis that has been determined to be responsible for several outbreaks. Since astrovirus can be waterborne, there is interest in testing environmental water for astrovirus and we have developed a sensitive RT-PCR assay that is designed t...

  3. Andragogical and Pedagogical Methods for Curriculum and Program Development

    ERIC Educational Resources Information Center

    Wang, Victor C. X., Ed.; Bryan, Valerie C., Ed.

    2014-01-01

    Today's ever-changing learning environment is characterized by the fast pace of technology that drives our society to move forward, and causes our knowledge to increase at an exponential rate. The need for in-depth research that is bound to generate new knowledge about curriculum and program development is becoming ever more relevant.…

  4. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY HEPATITIS E VIRUS

    EPA Science Inventory

    Hepatitis E virus (HEV) is a waterborne emerging pathogen that causes significant illness in the developing world. Thus far, an HEV outbreak has not been reported in the U.S., although a swine variant of the virus is common in Midwestern hogs. Because viruses isolated from two ...

  5. Methods of Fostering Language Development in Deaf Infants. Final Report.

    ERIC Educational Resources Information Center

    Greenstein, Jules M.

    Thirty deaf children admitted to an auditory training program before age 2 were studied longitudinally to age 40 months in an investigation of the effectiveness of early intervention, the relationship between mother-child interaction and language acquisition, and the effectiveness of new devices developed for auditory training. Among findings were…

  6. Developing Principals as Racial Equity Leaders: A Mixed Method Study

    ERIC Educational Resources Information Center

    Raskin, Candace F.; Krull, Melissa; Thatcher, Roberta

    2015-01-01

    This article will present information and research on how a college of education is intentionally developing principals to lead with confidence and racial competence. The nation's student achievement research is sobering: our current school systems widen already existing gaps between white students and students of color, (Darling-Hammond, L. 2004,…

  7. Developing Writing-Reading Abilities through Semiglobal Methods

    ERIC Educational Resources Information Center

    Macri, Cecilia; Bocos, Musata

    2013-01-01

    This research was intended to underline the importance of the semi-global strategies used within thematic projects for developing writing/reading abilities in first-grade pupils. Four different coordinates were chosen to be the main variables of this research: the level of phonological awareness, the degree to which writing-reading…

  8. Impact of NDE reliability developments on risk-informed methods

    SciTech Connect

    Walker, S.M.; Ammirato, F.V.

    1996-12-01

    Risk informed inspection procedures are being developed to more effectively and economically manage degradation in plant piping systems. A key element of this process is applying nondestructive examination (NDE) procedures capable of detecting specific damage mechanisms that may be operative in particular locations. Thus, the needs of risk informed analysis are closely coupled with a firm understanding of the capability of NDE.

  9. Reflection--A Method for Organisational and Individual Development

    ERIC Educational Resources Information Center

    Randle, Hanne; Tilander, Kristian

    2007-01-01

    This paper presents how organisational development can be the result when politicians, managers, social workers and teaching staff take part in reflection. The results are based on a government-funded initiative in Sweden for lowering sick absenteeism. Three local governments introduced reflection as a strategy to combat work-related stress and a…

  10. Development of laser excited atomic fluorescence and ionization methods

    SciTech Connect

    Winefordner, J.D.

    1991-01-01

    Progress report: May 1, 1988 to December 31, 1991. The research supported by DE-FG05-88ER13881 during the past (nearly) 3 years can be divided into the following four categories: (1) theoretical considerations of the ultimate detection powers of laser fluorescence and laser ionization methods; (2) experimental evaluation of laser excited atomic fluorescence; (3) fundamental studies of atomic and molecular parameters in flames and plasmas; (4) other studies.

  11. Development of an ultrasonic cleaning method for fuel assemblies

    SciTech Connect

    Heki, H.; Komura, S.; Kato, H.; Sakai, H. ); Hattori, T. )

    1991-01-01

    Almost all radiation buildup in light water reactors is the result of the deposition of activated corrosion and wear products in out-of-core areas. After operation, a significant quantity of corrosion and wear products is deposited on the fuel rods as crud. At refueling shutdowns, these activation products are available for removal. If they can be quickly and easily removed, buildup of radioactivity on out-of-core surfaces and individual exposure dose can be greatly reduced. After studying various physical cleaning methods (e.g., water jet and ultrasonic), the ultrasonic cleaning method was selected as the most effective for fuel assembly cleaning. The ultrasonic cleaning method is especially able to efficiently clean the fuel without removing the channel box. The removed crud in the channel box would be swept out to the filtration unit. Parameter survey tests were carried out to evaluate the optimum conditions for ultrasonic cleaning using a mock-up of a short section of fuel assembly with the channel box. The ultrasonic device used was a 600-W ultrasonic transducer operating at 26-kHz ultrasonic frequency.

  12. Real space electrostatics for multipoles. I. Development of methods

    NASA Astrophysics Data System (ADS)

    Lamichhane, Madan; Gezelter, J. Daniel; Newman, Kathie E.

    2014-10-01

    We have extended the original damped-shifted force (DSF) electrostatic kernel and have been able to derive three new electrostatic potentials for higher-order multipoles that are based on truncated Taylor expansions around the cutoff radius. These include a shifted potential (SP) that generalizes the Wolf method for point multipoles, and Taylor-shifted force (TSF) and gradient-shifted force (GSF) potentials that are both generalizations of DSF electrostatics for multipoles. We find that each of the distinct orientational contributions requires a separate radial function to ensure that pairwise energies, forces, and torques all vanish at the cutoff radius. In this paper, we present energy, force, and torque expressions for the new models, and compare these real-space interaction models to exact results for ordered arrays of multipoles. We find that the GSF and SP methods converge rapidly to the correct lattice energies for ordered dipolar and quadrupolar arrays, while the TSF is too severe an approximation to provide accurate convergence to lattice energies. Because real-space methods can be made to scale linearly with system size, SP and GSF are attractive options for large Monte Carlo and molecular dynamics simulations, respectively.
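
    For orientation, the sketch below gives the original charge-charge damped-shifted force (DSF) pair energy in the Fennell-Gezelter form, together with the simpler shifted potential (SP, Wolf-style); the paper's multipolar SP, TSF, and GSF expressions generalize these and are not reproduced here. Energies are in reduced units with the Coulomb constant set to 1; the damping parameter and cutoff in the example call are illustrative.

      # Charge-charge DSF and SP pair energies (reference sketch only; the
      # paper's point-multipole kernels are more involved and not shown here).
      import math

      def dsf_pair_energy(qi, qj, r, alpha, rc):
          """Damped-shifted force energy: both energy and force vanish at r = rc."""
          if r >= rc:
              return 0.0
          erfc = math.erfc
          shift = erfc(alpha * rc) / rc
          force_shift = erfc(alpha * rc) / rc**2 \
              + (2.0 * alpha / math.sqrt(math.pi)) * math.exp(-(alpha * rc) ** 2) / rc
          return qi * qj * (erfc(alpha * r) / r - shift + force_shift * (r - rc))

      def sp_pair_energy(qi, qj, r, alpha, rc):
          """Shifted potential (Wolf-style): energy vanishes at rc, force does not."""
          if r >= rc:
              return 0.0
          return qi * qj * (math.erfc(alpha * r) / r - math.erfc(alpha * rc) / rc)

      print(dsf_pair_energy(1.0, -1.0, 5.0, alpha=0.2, rc=9.0))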

  13. Real space electrostatics for multipoles. I. Development of methods.

    PubMed

    Lamichhane, Madan; Gezelter, J Daniel; Newman, Kathie E

    2014-10-01

    We have extended the original damped-shifted force (DSF) electrostatic kernel and have been able to derive three new electrostatic potentials for higher-order multipoles that are based on truncated Taylor expansions around the cutoff radius. These include a shifted potential (SP) that generalizes the Wolf method for point multipoles, and Taylor-shifted force (TSF) and gradient-shifted force (GSF) potentials that are both generalizations of DSF electrostatics for multipoles. We find that each of the distinct orientational contributions requires a separate radial function to ensure that pairwise energies, forces, and torques all vanish at the cutoff radius. In this paper, we present energy, force, and torque expressions for the new models, and compare these real-space interaction models to exact results for ordered arrays of multipoles. We find that the GSF and SP methods converge rapidly to the correct lattice energies for ordered dipolar and quadrupolar arrays, while the TSF is too severe an approximation to provide accurate convergence to lattice energies. Because real-space methods can be made to scale linearly with system size, SP and GSF are attractive options for large Monte Carlo and molecular dynamics simulations, respectively. PMID:25296786

  14. Methods Used in Game Development to Foster FLOW

    NASA Technical Reports Server (NTRS)

    Jeppsen, Isaac Ben

    2010-01-01

    Games designed for entertainment have a rich history of providing compelling experiences. From consoles to PCs, games have managed to present intuitive and effective interfaces for a wide range of game styles to successfully allow users to "walk-up-and-play". Once a user is hooked, successful games artfully present challenging experiences just within reach of a user's ability, weaving each task and achievement into a compelling and engaging experience. In this paper, engagement is discussed in terms of the psychological theory of Flow. I argue that engagement should be one of the primary goals when developing a serious game and I discuss the best practices and techniques that have emerged from traditional video game development which help foster the creation of engaging, high Flow experiences.

  15. Current Development in Elderly Comprehensive Assessment and Research Methods

    PubMed Central

    Jiang, Shantong; Li, Pingping

    2016-01-01

    Comprehensive geriatric assessment (CGA) is a core and an essential part of the comprehensive care of the aging population. CGA uses specific tools to summarize elderly status in several domains that may influence the general health and outcomes of diseases of elderly patients, including assessment of medical, physical, psychological, mental, nutritional, cognitive, social, economic, and environmental status. Here, in this paper, we review different assessment tools used in elderly patients with chronic diseases. The development of comprehensive assessment tools and single assessment tools specially used in a dimension of CGA was discussed. CGA provides substantial insight into the comprehensive management of elderly patients. Developing concise and effective assessment instruments is helpful to carry out CGA widely to create a higher clinical value. PMID:27042661

  16. Translating Vision into Design: A Method for Conceptual Design Development

    NASA Technical Reports Server (NTRS)

    Carpenter, Joyce E.

    2003-01-01

    One of the most challenging tasks for engineers is the definition of design solutions that will satisfy high-level strategic visions and objectives. Even more challenging is the need to demonstrate how a particular design solution supports the high-level vision. This paper describes a process and set of system engineering tools that have been used at the Johnson Space Center to analyze and decompose high-level objectives for future human missions into design requirements that can be used to develop alternative concepts for vehicles, habitats, and other systems. Analysis and design studies of alternative concepts and approaches are used to develop recommendations for strategic investments in research and technology that support the NASA Integrated Space Plan. In addition to a description of system engineering tools, this paper includes a discussion of collaborative design practices for human exploration mission architecture studies used at the Johnson Space Center.

  17. Development of an improved method of consolidating fatigue life data

    NASA Technical Reports Server (NTRS)

    Leis, B. N.; Sampath, S. G.

    1978-01-01

    A fatigue data consolidation model that incorporates recent advances in life prediction methodology was developed. A combined analytic and experimental study of fatigue of notched 2024-T3 aluminum alloy under constant amplitude loading was carried out. Because few systematic and complete data sets for 2024-T3 were available, the program generated data for fatigue crack initiation and separation failure for both zero and nonzero mean stresses. Consolidations of these data are presented.

  18. Development of panel methods for subsonic analysis and design

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1980-01-01

    Two computer programs, developed for subsonic inviscid analysis and design, are described. The first solves arbitrary mixed analysis/design problems for multielement airfoils in two-dimensional flow. The second calculates the pressure distribution for arbitrary lifting or nonlifting three-dimensional configurations. In each program, inviscid flow is modeled using distributed source-doublet singularities on configuration surface panels. Numerical formulations and representative solutions are presented for both programs.

  19. Computational Fluid Dynamics. [numerical methods and algorithm development

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This collection of papers was presented at the Computational Fluid Dynamics (CFD) Conference held at Ames Research Center in California on March 12 through 14, 1991. It provides an overview of CFD activities at NASA Lewis Research Center, where the main thrust of computational work is aimed at propulsion systems. Specific issues related to propulsion CFD and associated modeling are addressed, along with examples of results obtained with the most recent algorithm developments.

  20. Development of Methods to Evaluate Safer Flight Characteristics

    NASA Technical Reports Server (NTRS)

    Basciano, Thomas E., Jr.; Erickson, Jon D.

    1997-01-01

    The goal of the proposed research is to begin development of a simulation that models the flight characteristics of the Simplified Aid For EVA Rescue (SAFER) pack. Development of such a simulation was initiated to ultimately study the effect an Orbital Replacement Unit (ORU) has on SAFER dynamics. A major function of this program will be to calculate fuel consumption for many ORUs with different masses and locations. This will ultimately determine the maximum ORU mass an astronaut can carry and still perform a self-rescue without jettisoning the unit. A second primary goal is to eventually simulate relative motion (vibration) between the ORU and astronaut. After relative motion is accurately modeled it will be possible to evaluate the robustness of the control system and optimize performance as needed. The first stage in developing the simulation is the ability to model a standardized, total, self-rescue scenario, making it possible to accurately compare different program runs. In orbit an astronaut has only limited data and will not be able to follow the most fuel efficient trajectory; therefore, it is important to correctly model the procedures an astronaut would use in orbit so that good fuel consumption data can be obtained. Once this part of the program is well tested and verified, the vibration (relative motion) of the ORU with respect to the astronaut can be studied.

  1. Development of wide area environment accelerator operation and diagnostics method

    NASA Astrophysics Data System (ADS)

    Uchiyama, Akito; Furukawa, Kazuro

    2015-08-01

    Remote operation and diagnostic systems for particle accelerators have been developed for beam operation and maintenance in various situations. Even though fully remote experiments are not necessary, remote diagnosis and maintenance of the accelerator are required. Considering remote-operation operator interfaces (OPIs), the use of standard protocols such as the hypertext transfer protocol (HTTP) is advantageous, because system-dependent protocols are unnecessary between the remote client and the on-site server. Here, we have developed a client system based on WebSocket, a new protocol provided by the Internet Engineering Task Force for Web-based systems, as a next-generation Web-based OPI using the Experimental Physics and Industrial Control System Channel Access protocol. As a result of this implementation, WebSocket-based client systems have become available for remote operation. In practical application, however, the remote operation of an accelerator via a wide area network (WAN) faces a number of challenges; for example, the accelerator has the characteristics of both an experimental device and a radiation generator, and any error in remote control system operation could result in an immediate breakdown. Therefore, we propose the implementation of an operator intervention system for remote accelerator diagnostics and support that can obviate any differences between the local control room and remote locations. Here, remote-operation Web-based OPIs, which resolve security issues, are developed.
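
    To make the client architecture concrete, the sketch below shows a minimal asyncio WebSocket monitor of the kind such a Web-based OPI might use. It assumes the third-party Python "websockets" package; the endpoint URL, JSON message schema, and process-variable name are hypothetical placeholders, not the actual EPICS Channel Access/WebSocket gateway protocol described in the paper.

```python
# Minimal sketch of a WebSocket-based OPI client. The "websockets" package is a
# third-party dependency; the URL and message format below are invented for
# illustration and do not reflect the paper's gateway implementation.
import asyncio
import json
import websockets

async def monitor_pv(url="wss://example.org/epics-ws", pv="ACC:BEAM:CURRENT"):
    async with websockets.connect(url) as ws:
        # subscribe to a process variable; the JSON schema here is illustrative only
        await ws.send(json.dumps({"type": "monitor", "pv": pv}))
        async for message in ws:
            update = json.loads(message)
            print(f"{update.get('pv', pv)} = {update.get('value')}")

if __name__ == "__main__":
    asyncio.run(monitor_pv())
```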

  2. Two-Step Camera Calibration Method Developed for Micro UAV'S

    NASA Astrophysics Data System (ADS)

    Gašparović, M.; Gajski, D.

    2016-06-01

    The development of unmanned aerial vehicles (UAVs) and the continuous price reduction of unmanned systems attracted us to this research. Professional measuring systems are dozens of times more expensive and often heavier than "amateur", non-metric UAVs. For this reason, we tested the DJI Phantom 2 Vision Plus UAV. The Phantom's smaller mass and velocity give it less kinetic energy than professional measurement platforms, which makes it potentially less dangerous for use in populated areas. In this research, we wanted to investigate the capability of such a non-metric UAV and find the procedures under which this kind of UAV may be used for photogrammetric survey. It is important to emphasize that the UAV is equipped with an ultra-wide-angle camera with a 14 MP sensor, and calibration of such cameras is a complex process. In this research, a new two-step calibration process is presented and developed, and the results are compared with the standard one-step camera calibration procedure. The two-step process first removes distortion from all images and then uses these undistorted images in phototriangulation with self-calibration. The paper presents statistical indicators showing that the proposed two-step process is a better and more accurate procedure for calibrating these types of cameras than standard one-step calibration. We therefore suggest the two-step calibration process as the standard for ultra-wide-angle cameras on unmanned aircraft.
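
    As an illustration of the two-step idea (distortion removal first, then self-calibrating phototriangulation on the undistorted images), here is a hedged OpenCV sketch. The chessboard target, image directories, and the placeholder downstream triangulation step are assumptions and do not reproduce the authors' actual workflow or test field.

```python
# Sketch of the "two-step" idea with OpenCV: (1) estimate intrinsics and lens
# distortion from a calibration target and undistort every survey image,
# (2) hand the undistorted images to phototriangulation with self-calibration
# (placeholder here). Paths and board size are invented for illustration.
import glob
import cv2
import numpy as np

board = (9, 6)                                # inner chessboard corners (assumed target)
objp = np.zeros((board[0] * board[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_points, img_points, size = [], [], None
for path in glob.glob("calib/*.jpg"):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

# Step 1: estimate camera matrix + distortion, then remove distortion from survey images.
_, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, size, None, None)
for path in glob.glob("flight/*.jpg"):
    undistorted = cv2.undistort(cv2.imread(path), K, dist)
    cv2.imwrite(path.replace("flight", "flight_undistorted"), undistorted)

# Step 2 (not shown): run phototriangulation with self-calibration on the
# undistorted images and compare against the one-step calibration results.
```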

  3. Development of evaluation method for software hazard identification techniques

    SciTech Connect

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-07-01

    This research evaluated currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis, and then determined evaluation indexes based on their characteristics: dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive because of the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; their disadvantages are completeness, complexity, and implementation cost. This evaluation method can serve as a platform for reaching common consensus among stakeholders. As software hazard identification techniques evolve, the evaluation results may change; however, the insight into the techniques is more important than the numbers obtained by the evaluation. (authors)
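
    The index-based comparison described above amounts to a weighted decision matrix. The sketch below shows only the general form; the 1-5 scores, equal weights, and candidate groupings are invented for illustration and are not taken from the paper.

```python
# Illustrative weighted-sum scoring of hazard-identification technique combinations
# against the indexes listed above. Scores are a hypothetical 1-5 scale (5 = most
# favorable; complexity and cost are inverted so 5 = low complexity/cost).
criteria = ["dynamic capability", "completeness", "achievability", "detail",
            "signal/noise ratio", "complexity (inverted)", "implementation cost (inverted)"]
weights = {c: 1.0 for c in criteria}   # equal weights as a neutral default

candidates = {
    "PHA + FMEA + FTA + Markov": [2, 4, 2, 3, 3, 4, 4],
    "DFM":                       [5, 3, 4, 4, 4, 2, 2],
    "Simulation-based analysis": [5, 3, 4, 4, 4, 2, 2],
}

for name, scores in candidates.items():
    total = sum(weights[c] * s for c, s in zip(criteria, scores))
    print(f"{name:28s} weighted score = {total:.1f}")
```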

  4. Development of alternate methods of determining integrated SMR source terms

    SciTech Connect

    Barry, Kenneth

    2014-06-10

    The Nuclear Energy Institute (NEI) Small Modular Reactor (SMR) Licensing Task Force (TF) has been evaluating licensing issues unique and important to iPWRs, ranking these issues, and developing NEI position papers for submittal to the U.S. Nuclear Regulatory Commission (NRC) during the past three years. Papers have been developed and submitted to the NRC in a range of areas including: Price-Anderson Act, NRC annual fees, security, modularity, and staffing. In December, 2012, NEI completed a draft position paper on SMR source terms and participated in an NRC public meeting presenting a summary of this paper, which was subsequently submitted to the NRC. One important conclusion of the source term paper was the evaluation and selection of high importance areas where additional research would have a significant impact on source terms. The highest ranked research area was iPWR containment aerosol natural deposition. The NRC accepts the use of existing aerosol deposition correlations in Regulatory Guide 1.183, but these were developed for large light water reactor (LWR) containments. Application of these correlations to an iPWR design has resulted in greater than a ten-fold reduction of containment airborne aerosol inventory as compared to large LWRs. Development and experimental justification of containment aerosol natural deposition correlations specifically for the unique iPWR containments is expected to result in a large reduction of design basis and beyond-design-basis accident source terms with concomitantly smaller dose to workers and the public. Therefore, NRC acceptance of iPWR containment aerosol natural deposition correlations will directly support the industry’s goal of reducing the Emergency Planning Zone (EPZ) for SMRs. Based on the results in this work, it is clear that thermophoresis is relatively unimportant for iPWRs. Gravitational settling is well understood, and may be the dominant process for a dry environment. Diffusiophoresis and enhanced

  5. Development of 3-D Ice Accretion Measurement Method

    NASA Technical Reports Server (NTRS)

    Lee, Sam; Broeren, Andy P.; Addy, Harold E., Jr.; Sills, Robert; Pifer, Ellen M.

    2012-01-01

    A research plan is currently being implemented by NASA to develop and validate the use of a commercial laser scanner to record and archive fully three-dimensional (3-D) ice shapes from an icing wind tunnel. The plan focused specifically upon measuring ice accreted in the NASA Icing Research Tunnel (IRT). The plan was divided into two phases. The first phase was the identification and selection of the laser scanning system and the post-processing software to purchase and develop further. The second phase was the implementation and validation of the selected system through a series of icing and aerodynamic tests. Phase I of the research plan has been completed. It consisted of evaluating several scanning hardware and software systems against established selection criteria through demonstrations in the IRT. The results of Phase I showed that all of the scanning systems that were evaluated were equally capable of scanning ice shapes. The factors that differentiated the scanners were ease of use and the ability to operate in a wide range of IRT environmental conditions.

  6. Enrichment Assay Methods Development for the Integrated Cylinder Verification System

    SciTech Connect

    Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.; Curtis, Michael M.

    2009-10-22

    International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) Develop a preliminary design for a prototype NDA system, (2) Refine PNNL's MCNP models of the NDA system, and (3) Procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.

  7. Development and testing of improved statistical wind power forecasting methods.

    SciTech Connect

    Mendes, J.; Bessa, R.J.; Keko, H.; Sumaili, J.; Miranda, V.; Ferreira, C.; Gama, J.; Botterud, A.; Zhou, Z.; Wang, J.

    2011-12-06

    (with spatial and/or temporal dependence). Statistical approaches to uncertainty forecasting basically consist of estimating the uncertainty based on observed forecasting errors. Quantile regression (QR) is currently a commonly used approach in uncertainty forecasting. In Chapter 3, we propose new statistical approaches to the uncertainty estimation problem by employing kernel density forecast (KDF) methods. We use two estimators in both offline and time-adaptive modes, namely, the Nadaraya-Watson (NW) and Quantile-copula (QC) estimators. We conduct detailed tests of the new approaches using QR as a benchmark. One of the major issues in wind power generation is the occurrence of sudden and large changes of wind power output over a short period of time, namely ramping events. In Chapter 4, we perform a comparative study of existing definitions and methodologies for ramp forecasting. We also introduce a new probabilistic method for ramp event detection. The method starts with a stochastic algorithm that generates wind power scenarios, which are passed through a high-pass filter for ramp detection and estimation of the likelihood of ramp events to happen. The report is organized as follows: Chapter 2 presents the results of the application of ITL training criteria to deterministic WPF; Chapter 3 reports the study on probabilistic WPF, including new contributions to wind power uncertainty forecasting; Chapter 4 presents a new method to predict and visualize ramp events, comparing it with state-of-the-art methodologies; Chapter 5 briefly summarizes the main findings and contributions of this report.
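
    As a minimal illustration of the Nadaraya-Watson building block mentioned for Chapter 3, the sketch below uses a Gaussian kernel to form a conditional estimate of observed power given a point forecast. The bandwidth and the synthetic data are assumptions; the report's actual KDF and quantile-copula implementations are considerably more elaborate.

```python
# Minimal Nadaraya-Watson kernel estimator: a kernel-weighted average of past
# observations, weighted by how close their point forecasts are to the query.
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, bandwidth=0.05):
    """Kernel-weighted conditional mean of y_train given x_query (Gaussian kernel)."""
    u = (x_train - x_query) / bandwidth
    w = np.exp(-0.5 * u**2)
    return np.sum(w * y_train) / np.sum(w)

# synthetic normalized forecasts and observations, for illustration only
rng = np.random.default_rng(0)
point_forecast = rng.uniform(0, 1, 500)
observed = np.clip(point_forecast + rng.normal(0, 0.08, 500), 0, 1)

print(nadaraya_watson(0.6, point_forecast, observed))  # conditional estimate near a 0.6 forecast
```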

  8. Polypharmacology: in silico methods of ligand design and development.

    PubMed

    McKie, Samuel A

    2016-04-01

    How to design a ligand to bind multiple targets, rather than a single target, is the focus of this review. Rational polypharmacology draws on knowledge that is both broad ranging and hierarchical. Computer-aided multitarget ligand design methods are described according to their nested knowledge level. Ligand-only and then receptor-ligand strategies are described first, followed by the metabolic network viewpoint. Subsequently, strategies that view infectious diseases as multigenomic targets are discussed, and finally the disease-level interpretation of medicinal therapy is considered. As yet there is no consensus on how best to proceed in designing a multitarget ligand. The current methodologies are brought together in an attempt to give a practical overview of how polypharmacology design might best be initiated. PMID:27105127

  9. UCNA Systematic Uncertainties: Developments in Analysis and Method

    NASA Astrophysics Data System (ADS)

    Zeck, Bryan

    2012-10-01

    The UCNA experiment is an effort to measure the beta-decay asymmetry parameter A of the correlation between the electron momentum and the neutron spin, using bottled polarized ultracold neutrons in a homogeneous 1 T magnetic field. Continued improvements in both analysis and method are helping to push the measurement uncertainty to the limits of the current statistical sensitivity (less than 0.4%). The implementation of thinner decay trap windows will be discussed, as will the use of a tagged beta particle calibration source to measure angle-dependent scattering effects and energy loss. Additionally, improvements in position reconstruction and polarization measurements using a new shutter system will be introduced. A full accounting of the current systematic uncertainties will be given.

  10. Ranging methods for developing wellbores in subsurface formations

    DOEpatents

    MacDonald, Duncan

    2011-09-06

    A method for forming two or more wellbores in a subsurface formation includes forming a first wellbore in the formation. A second wellbore is directionally drilled in a selected relationship relative to the first wellbore. At least one magnetic field is provided in the second wellbore using one or more magnets in the second wellbore located on a drilling string used to drill the second wellbore. At least one magnetic field is sensed in the first wellbore using at least two sensors in the first wellbore as the magnetic field passes by the at least two sensors while the second wellbore is being drilled. A position of the second wellbore is continuously assessed relative to the first wellbore using the sensed magnetic field. The direction of drilling of the second wellbore is adjusted so that the second wellbore remains in the selected relationship relative to the first wellbore.
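
    The scaling law underlying this kind of magnetic ranging is the dipole fall-off of field strength with distance. The sketch below inverts the on-axis dipole formula to turn a measured field magnitude into a rough range; it illustrates only the physics, not the patented two-sensor method, and the magnet moment and field reading are invented numbers.

```python
# Highly simplified ranging illustration: on the axis of a magnetic dipole of
# moment m, |B| = mu0 * m / (2 * pi * r**3), so a measured field magnitude gives
# a rough distance. The real method fuses two sensors with the known drilling
# geometry; this shows only the underlying inverse-cube scaling.
from math import pi

MU0 = 4e-7 * pi  # vacuum permeability, T*m/A

def range_from_axial_field(b_tesla, moment_am2):
    """Distance (m) at which an axial dipole of the given moment produces b_tesla."""
    return (MU0 * moment_am2 / (2.0 * pi * b_tesla)) ** (1.0 / 3.0)

# e.g. a 500 A*m^2 magnet producing a 2 microtesla reading is a few meters away
print(range_from_axial_field(2e-6, 500.0))
```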

  11. Chemical recycling of polyhydroxyalkanoates as a method towards sustainable development.

    PubMed

    Ariffin, Hidayah; Nishida, Haruo; Hassan, Mohd Ali; Shirai, Yoshihito

    2010-05-01

    Chemical recycling of the bio-based polymers polyhydroxyalkanoates (PHAs) by thermal degradation was investigated from the viewpoint of biorefinery. The thermal degradation resulted in successful transformation of PHAs into vinyl monomers using alkali earth compound (AEC) catalysts. Poly(3-hydroxybutyrate-co-3-hydroxyvalerate)s (PHBVs) were smoothly and selectively depolymerized into crotonic (CA) and 2-pentenoic (2-PA) acids at lower degradation temperatures in the presence of CaO and Mg(OH)₂ as catalysts. CA obtained from the 3-hydroxybutyrate sequences in PHBV was copolymerized with acrylic acid to produce useful water-soluble copolymers, poly(crotonic acid-co-acrylic acid), that have high glass-transition temperatures. The copolymerization of CA derived from PHA pyrolysis is an example of cascade utilization of PHAs, which meets the idea of sustainable development. PMID:20408140

  12. New Research Methods Developed for Studying Diabetic Foot Ulceration

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Dr. Brian Davis, one of the Cleveland Clinic Foundation's researchers, has been investigating the risk factors related to diabetic foot ulceration, a problem that accounts for 20 percent of all hospital admissions for diabetic patients. He had developed a sensor pad to measure the friction and pressure forces under a person's foot when walking. As part of NASA Lewis Research Center's Space Act Agreement with the Cleveland Clinic Foundation, Dr. Davis requested Lewis' assistance in visualizing the data from the sensor pad. As a result, Lewis' Interactive Data Display System (IDDS) was installed at the Cleveland Clinic. This computer graphics program is normally used to visualize the flow of air through aircraft turbine engines, producing color two- and three-dimensional images.

  13. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluate the partial failure effect on current Probability Risk Assessments (PRAs). An integrated methodology of the thermal-hydraulic analysis and fuzzy logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis used in this approach is to identify partial operation effect of any PRA system function in a plant model. The DMLD is used to simulate the system performance of the partial failure effect and inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full scope PRA to reduce core damage frequency. An example of this application of the approach is presented. The partial failure data used in the example is from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  14. Sublimation rates of explosive materials : method development and initial results.

    SciTech Connect

    Phelan, James M.; Patton, Robert Thomas

    2004-08-01

    Vapor detection of explosives continues to be a technological basis for security applications. This study began experimental work to measure the chemical emanation rates of pure explosive materials as a basis for determining emanation rates of security threats containing explosives. Sublimation rates for TNT were determined with thermogravimetric analysis using two different techniques. Data were compared with other literature values to provide sublimation rates from 25 to 70 °C. The enthalpy of sublimation for the combined data was found to be 115 kJ/mol, which corresponds well with previously reported data from vapor pressure determinations. A simple Gaussian atmospheric dispersion model was used to estimate downrange concentrations based on continuous, steady-state conditions at 20, 45, and 62 °C for a nominal exposed block of TNT under low wind conditions. Recommendations are made for extension of the experimental vapor emanation rate determinations and development of turbulent-flow computational fluid dynamics based atmospheric dispersion estimates of standoff vapor concentrations.
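
    For orientation, the sketch below evaluates the kind of steady-state Gaussian plume estimate the report describes, for a ground-level continuous point source and a centerline receptor. The emission rate, wind speed, and the Briggs-type neutral-stability dispersion fits are illustrative assumptions rather than the report's parameters.

```python
# Steady-state Gaussian plume sketch: ground-level continuous point source,
# ground-level centerline receptor. Dispersion coefficients use approximate
# Briggs-type rural fits for neutral (class D) conditions as an assumption.
from math import pi, sqrt

def sigma_y(x):      # lateral spread (m)
    return 0.08 * x / sqrt(1.0 + 0.0001 * x)

def sigma_z(x):      # vertical spread (m)
    return 0.06 * x / sqrt(1.0 + 0.0015 * x)

def centerline_conc(q_g_per_s, u_m_per_s, x_m):
    """Ground-level centerline concentration (g/m^3) at downwind distance x_m."""
    return q_g_per_s / (pi * u_m_per_s * sigma_y(x_m) * sigma_z(x_m))

# e.g. an emanation of 1e-9 g/s of TNT vapor in a 1 m/s wind, 10 m downrange
print(centerline_conc(1e-9, 1.0, 10.0))
```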

  15. Development of metrological NDE methods for microturbine ceramic components

    SciTech Connect

    Lee, H.-R.; Ellingson, W. A.

    1999-12-23

    In this work, X-ray computed tomographic imaging technology with high spatial resolution has been explored for metrological applications to Si₃N₄ ceramic turbine wheels. X-ray computed tomography (XCT) data were acquired by a charge-coupled device detector coupled to an image intensifier. Cone-beam XCT reconstruction algorithms were used to allow full-volume data acquisition from the turbine wheels. Special software was developed so that edge detection and complex blade contours could be determined from the XCT data. The feasibility of using the XCT for dimensional analyses was compared with that of a coordinate-measuring machine. Details of the XCT system, data acquisition, and dimensional comparisons will be presented.

  16. Novel lipase purification methods - a review of the latest developments.

    PubMed

    Tan, Chung Hong; Show, Pau Loke; Ooi, Chien Wei; Ng, Eng-Poh; Lan, John Chi-Wei; Ling, Tau Chuan

    2015-01-01

    Microbial lipases are popular biocatalysts due to their ability to catalyse diverse reactions such as hydrolysis, esterification, and acidolysis. Lipases function efficiently on various substrates in aqueous and non-aqueous media. Lipases are chemo-, regio-, and enantio-specific, and are useful in various industries, including those manufacturing food, detergents, and pharmaceuticals. A large number of lipases from fungal and bacterial sources have been isolated and purified to homogeneity. This success is attributed to the development of both conventional and novel purification techniques. This review highlights the use of these techniques in lipase purification, including conventional techniques such as (i) ammonium sulphate fractionation, (ii) ion-exchange, and (iii) gel filtration and affinity chromatography, as well as novel techniques such as (iv) reverse micellar systems, (v) membrane processes, (vi) immunopurification, (vii) aqueous two-phase systems, and (viii) aqueous two-phase flotation. A summary of the purification schemes for various bacterial and fungal lipases is also provided. PMID:25273633

  17. Development of fluoroimmunoassay methods for delta-9-tetrahydrocannabinol

    SciTech Connect

    Mason, A.P.

    1986-01-01

    Heterogeneous, competitive, labelled-ligand solid-phase primary antibody fluoroimmunoassay methods for the detection of THC in blood and plasma were proposed, and the required assay components were produced and characterized. These components included polyclonal rabbit antisera and monoclonal antibodies reactive with tetrahydrocannabinols, solid-phase immunoglobulin reagents, a fluoroligand, and protein conjugates of THC for immunization and immunoassay response amplification. The stereoselective rabbit anti-THC antiserum F-444-12 was found to have a high binding titer, a high affinity (K_D = 3.4 × 10⁻¹¹ M for 5'-iodo-¹²⁵I-Δ²-THC), and high specificity versus a large number of cannabinoid compounds. Immobilization of the immunoglobulin fraction of the antiserum on hydrophilic polyacrylamide microspheres resulted in only a four-fold increase in K_D and a two-fold increase in the concentration of binding sites required to produce equivalent binding titers. Specificity for small ligands was not affected, but the binding of THC-protein conjugates was reduced in potency. Two monoclonal hybridoma cell lines were produced that secrete monoclonal antibodies which bind the radioligand. The fluoroligand was synthesized from 5'-carboxy-Δ²-THC and FITC using a diaminoethane linkage structure. While the compound had the fluorescence properties of FITC, it was bound by the antiserum F-444-12 with a cross-reactive potency 1.4x greater than the radioligand and 10x greater than THC.

  18. Development of Active Control Method for Supercooling Releasing of Water

    NASA Astrophysics Data System (ADS)

    Mito, Daisuke; Kozawa, Yoshiyuki; Tanino, Masayuki; Inada, Takaaki

    We have tested a prototype ice-slurry generator that enables both the production of supercooled water (-2°C) and the release of its supercooling, simultaneously and continuously, in a closed piping system. In the experiment, we adopted irradiation with ultrasonic waves as an active method of triggering the release of supercooling, and evaluated its reliability for practical use in comparison with the seed ice-crystal trigger. The results confirmed that the ultrasonic trigger acts reliably at the same degree of supercooling as the seed ice-crystal trigger. Moreover, the ultrasonic trigger has the added advantage of removing the ice crystals growing on the pipe wall. Finally, we specified the ultrasonic bombardment conditions sufficient for continuous ice-slurry generation in a closed system: an output surface power density > 31.4 kW/m² and a superficial bombardment time > 4.1 s. We also demonstrated continuous ice-slurry production for more than 6 hours using a refrigerator system with a practical scale of 88 kW.

  19. Methods and apparatuses for the development of microstructured nuclear fuels

    DOEpatents

    Jarvinen, Gordon D.; Carroll, David W.; Devlin, David J.

    2009-04-21

    Microstructured nuclear fuel adapted for nuclear power system use includes fissile material structures of micrometer-scale dimension dispersed in a matrix material. In one method of production, fissile material particles are processed in a chemical vapor deposition (CVD) fluidized-bed reactor including a gas inlet for providing controlled gas flow into a particle coating chamber, a lower bed hot zone region to contain powder, and an upper bed region to enable powder expansion. At least one pneumatic or electric vibrator is operationally coupled to the particle coating chamber for causing vibration of the particle coater to promote uniform powder coating within the particle coater during fuel processing. An exhaust associated with the particle coating chamber provides a port for placement and removal of particles and powder. During use of the fuel in a nuclear power reactor, fission products escape from the fissile material structures and come to rest in the matrix material. After a period of use in a nuclear power reactor and subsequent cooling, separation of the fissile material from the matrix containing the embedded fission products will provide an efficient partitioning of the bulk of the fissile material from the fission products. The fissile material can be reused by incorporating it into new microstructured fuel. The fission products and matrix material can be incorporated into a waste form for disposal or processed to separate valuable components from the fission products mixture.

  20. Study on development of chemical measurement method utilizing plasmas

    NASA Astrophysics Data System (ADS)

    Saito, Morimasa; Hirose, Fumio

    1993-01-01

    Research was performed on the influence of parameters on ion intensity, the quantification of trace impurities in highly pure Mo, sample solution analysis, and ionization mechanisms in glow discharge. The ion intensity was affected by the sample location for discharge, the discharge current, and the discharge voltage, but not by the sample shape. The maximum ion intensity was obtained when the sample was located 8 mm from the ion slit. After confirming that the other conditions had to be held constant, these results were applied to the quantification of impurities in highly pure Mo. Relative sensitivity coefficients were determined by spark ion source mass spectrometry, and satisfactory results were obtained. A solution sampling method was investigated to reduce segregation influences from matrices and elements. Solution samples were quantified by dipping a highly pure graphite bar in the solution and doping. The Penning ionization ratio was determined by investigating the elements' ion intensities obtained using Ar, Kr, and Xe gases. It was determined that there is no difference in the relative sensitivity coefficients between these discharge gases. The Penning ionization ratio is 75 to 85 percent.

  1. Development of an ASTM Graphite Oxidation Test Method

    SciTech Connect

    Contescu, Cristian I; Baker, Frederick S; Burchell, Timothy D

    2006-01-01

    Oxidation behavior of graphite is of practical interest because of the extended use of graphite materials in nuclear reactors. High-temperature gas-cooled reactors are expected to become the nuclear reactors of the next generation. The most critical factor in their safe operation is an air-ingress accident, in which case the graphite materials in the moderator and reflector would come in contact with oxygen at high temperature. Many results on graphite oxidation have been obtained from TGA measurements using commercial instruments, with sample sizes of a few hundred milligrams. They have demonstrated that graphite oxidation is in the kinetic-control regime at low temperatures but becomes diffusion-limited at high temperatures. These effects are better understood from measurements with large samples, for which the shape and structural factors that control diffusion can be more clearly evidenced. An ASTM test for characterization of the oxidation resistance of machined carbon and graphite materials is being developed with ORNL participation. The test recommends the use of large machined samples (~20 grams) in a dry air flow system. We report on recent results and progress in this direction.

  2. PROGRESS ON GENERIC PHASE-FIELD METHOD DEVELOPMENT

    SciTech Connect

    Biner, Bullent; Tonks, Michael; Millett, Paul C.; Li, Yulan; Hu, Shenyang Y.; Gao, Fei; Sun, Xin; Martinez, E.; Anderson, D.

    2012-09-26

    In this report, we summarize our current collaborative efforts, involving three national laboratories: Idaho National Laboratory (INL), Pacific Northwest National Laboratory (PNNL), and Los Alamos National Laboratory (LANL), to develop a computational framework that incorporates homogeneous and heterogeneous nucleation mechanisms into the generic phase-field model. During these studies, the Fe-Cr system was chosen as a model system because of its simplicity and the availability of reliable thermodynamic and kinetic data, as well as the range of applications of low-chromium ferritic steels in nuclear reactors. For homogeneous nucleation, the relevant parameters determined from atomistic studies were used directly to determine the energy functional and parameters in the phase-field model. Interfacial energy, critical nucleus size, nucleation rate, and coarsening kinetics were systematically examined in two- and three-dimensional models. For the heterogeneous nucleation mechanism, we studied the nucleation and growth behavior of chromium precipitates in the presence of dislocations. The results demonstrate that both nucleation schemes can be introduced into a phase-field modeling algorithm with the desired accuracy and computational efficiency.

  3. Development of a Multi-Point Microwave Interferometry (MPMI) Method

    SciTech Connect

    Specht, Paul Elliott; Cooper, Marcia A.; Jilek, Brook Anton

    2015-09-01

    A multi-point microwave interferometer (MPMI) concept was developed for non-invasively tracking a shock, reaction, or detonation front in energetic media. Initially, a single-point, heterodyne microwave interferometry capability was established. The design, construction, and verification of the single-point interferometer provided a knowledge base for the creation of the MPMI concept. The MPMI concept uses an electro-optic (EO) crystal to impart a time-varying phase lag onto a laser at the microwave frequency. Polarization optics converts this phase lag into an amplitude modulation, which is analyzed in a heterodyne interferometer to detect Doppler shifts in the microwave frequency. A version of the MPMI was constructed to experimentally measure the frequency of a microwave source through the EO modulation of a laser. The successful extraction of the microwave frequency proved the underlying physical concept of the MPMI design, and highlighted the challenges associated with the longer microwave wavelength. The frequency measurements made with the current equipment contained too much uncertainty for an accurate velocity measurement. Potential alterations to the current construction are presented to improve the quality of the measured signal and enable multiple accurate velocity measurements.
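
    The velocity extraction at the heart of microwave interferometry reduces, for free space and normal incidence, to the Doppler relation f_b = 2v/λ between the measured beat frequency and the front velocity (in a dielectric the refractive index enters as well). A minimal sketch follows; the source frequency and beat frequency are illustrative numbers only.

```python
# Free-space, normal-incidence Doppler relation used in microwave interferometry:
# a surface moving toward the antenna at velocity v produces a beat frequency
# f_b = 2 * v / lambda, so v = 0.5 * f_b * lambda.
C = 299_792_458.0  # speed of light, m/s

def velocity_from_beat(f_beat_hz, f_source_hz):
    """Front velocity (m/s) implied by a measured beat frequency."""
    wavelength = C / f_source_hz
    return 0.5 * f_beat_hz * wavelength

# e.g. a 35 GHz source and a 1 MHz beat correspond to roughly 4.3 km/s
print(velocity_from_beat(1.0e6, 35.0e9))
```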

  4. Ceramic Matrix Composites (CMC) Life Prediction Method Development

    NASA Technical Reports Server (NTRS)

    Levine, Stanley R.; Calomino, Anthony M.; Ellis, John R.; Halbig, Michael C.; Mital, Subodh K.; Murthy, Pappu L.; Opila, Elizabeth J.; Thomas, David J.; Thomas-Ogbuji, Linus U.; Verrilli, Michael J.

    2000-01-01

    Advanced launch systems (e.g., Reusable Launch Vehicle and other Shuttle Class concepts, Rocket-Based Combined Cycle, etc.), and interplanetary vehicles will very likely incorporate fiber reinforced ceramic matrix composites (CMC) in critical propulsion components. The use of CMC is highly desirable to save weight, to improve reuse capability, and to increase performance. CMC candidate applications are mission and cycle dependent and may include turbopump rotors, housings, combustors, nozzle injectors, exit cones or ramps, and throats. For reusable and single mission uses, accurate prediction of life is critical to mission success. The tools to accomplish life prediction are very immature and not oriented toward the behavior of carbon fiber reinforced silicon carbide (C/SiC), the primary system of interest for a variety of space propulsion applications. This paper describes an approach to satisfy the need to develop an integrated life prediction system for CMC that addresses mechanical durability due to cyclic and steady thermomechanical loads, and takes into account the impact of environmental degradation.

  5. Interim methods for development of inhalation reference concentrations. Draft report

    SciTech Connect

    Blackburn, K.; Dourson, M.; Erdreich, L.; DeRose, C.; Graham, J.A.

    1990-08-01

    An inhalation reference concentration (RfC) is an estimate of continuous inhalation exposure over a human lifetime that is unlikely to pose significant risk of adverse noncancer health effects and serves as a benchmark value for assisting in risk management decisions. Derivation of an RfC involves dose-response assessment of animal data to determine the exposure levels at which no significant increase in the frequency or severity of adverse effects between the exposed population and its appropriate control exists. The assessment requires an interspecies dose extrapolation from a no-observed-adverse-effect level (NOAEL) exposure concentration of an animal to a human equivalent NOAEL (NOAEL(HEC)). The RfC is derived from the NOAEL(HEC) by the application of generally order-of-magnitude uncertainty factors. Intermittent exposure scenarios in animals are extrapolated to chronic continuous human exposures. Relationships between external exposures and internal doses depend upon complex simultaneous and consecutive processes of absorption, distribution, metabolism, storage, detoxification, and elimination. To estimate NOAEL(HEC)s when chemical-specific physiologically based pharmacokinetic models are not available, a dosimetric extrapolation procedure based on anatomical and physiological parameters of the exposed human and animal and the physical parameters of the toxic chemical has been developed, which gives equivalent or more conservative exposure concentration values than those that would be obtained with a PB-PK model.
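
    The arithmetic of the derivation described above can be sketched as follows. The duration adjustment, the placeholder dosimetric ratio standing in for the NOAEL(HEC) conversion, and the uncertainty factors are all illustrative assumptions, not values from the interim methods document.

```python
# Sketch of the RfC arithmetic: adjust an intermittent animal NOAEL to continuous
# exposure, convert to a human equivalent concentration via a placeholder
# dosimetric ratio, then divide by order-of-magnitude uncertainty factors.
def rfc_estimate(noael_mg_m3, hours_per_day, days_per_week,
                 dosimetric_ratio=1.0, uncertainty_factors=(10, 10, 3)):
    noael_adj = noael_mg_m3 * (hours_per_day / 24.0) * (days_per_week / 7.0)
    noael_hec = noael_adj * dosimetric_ratio   # regional gas-dose or PB-PK ratio would go here
    uf = 1
    for f in uncertainty_factors:
        uf *= f
    return noael_hec / uf

# e.g. a 50 mg/m^3 NOAEL from a 6 h/day, 5 day/week study with a combined UF of 300
print(rfc_estimate(50.0, 6, 5))
```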

  6. Development of methods to estimate beryllium exposure. Final report

    SciTech Connect

    Rice, C.H.

    1988-06-30

    The project was designed to access data, provide preliminary exposure rankings, and delineate the process for detailing retrospective exposure assessments for beryllium among workers at processing facilities. A literature review was conducted, and walk-through surveys were conducted at two facilities still in operation. More than 8000 environmental records were entered into a computer file. Descriptive statistics were then generated and the process of rank ordering exposures across facilities was begun. In efforts to formulate crude indices of exposure, job titles of persons in the NIOSH mortality study were reviewed and categorized for any beryllium exposure, chemical form of beryllium exposure, and exposure to acid mists. Daily Weighted Average exposure estimates were reviewed by job title, across all facilities. The mean exposure at each facility was calculated. The strategy developed for retrospective exposure assessment is described. Tasks included determination of the usefulness of the Pennsylvania Workers' Compensation files; cataloging the numbers of samples available from company sources; investigating data holdings at Oak Ridge National Laboratory; and obtaining records from the Department of Energy Library.

  7. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Tom Riley; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity developed during Fiscal Year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in Probabilistic Risk Assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 that addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results regarding the comparison between RELAP5-3D and the new code RELAP-7 for a simplified Pressurized Water Reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities could be used to assist and inform reactor operators during real accident scenarios.

  8. Methods for the development of a bioregenerative life support system

    NASA Technical Reports Server (NTRS)

    Goldman, Michelle; Gomez, Shawn; Voorhees, Mike

    1990-01-01

    Presented here is a rudimentary approach to designing a life support system based on the utilization of plants and animals. The biggest stumbling block in the initial phases of developing a bioregenerative life support system is encountered in collecting and consolidating the data. If a database existed for the systems engineer so that he or she may have accurate data and a better understanding of biological systems in engineering terms, then the design process would be simplified. Also addressed is a means of evaluating the subsystems chosen. These subsystems are unified into a common metric, kilograms of mass, and normalized in relation to the throughput of a few basic elements. The initial integration of these subsystems is based on input/output masses and eventually balanced to a point of operation within the inherent performance ranges of the organisms chosen. At this point, it becomes necessary to go beyond the simplifying assumptions of simple mass relationships and further define for each organism the processes used to manipulate the throughput matter. Mainly considered here is the fact that these organisms perform input/output functions on differing timescales, thus establishing the need for buffer volumes or appropriate subsystem phasing. At each point in a systematic design it is necessary to disturb the system and discern its sensitivity to the disturbance. This can be done either through the introduction of a catastrophic failure or by applying a small perturbation to the system. One example is increasing the crew size. Here the wide range of performance characteristics once again shows that biological systems have an inherent advantage in responding to systemic perturbations. Since the design of any space-based system depends on mass, power, and volume requirements, each subsystem must be evaluated in these terms.

  9. There's Madness in These Methods: Teaching Secondary Methods Students to Develop Interdisciplinary Units.

    ERIC Educational Resources Information Center

    Combs, Dorie; White, Rodney

    2000-01-01

    Discusses interdisciplinary instruction at the high school level, examines the imperatives of statewide reform in Kentucky, and outlines how and when interdisciplinary instruction improves learning. Describes an interdisciplinary learning project taught in secondary methods classes at Eastern Kentucky University in which students experience…

  10. Development of Continuous-Energy Eigenvalue Sensitivity Coefficient Calculation Methods in the Shift Monte Carlo Code

    SciTech Connect

    Perfetti, Christopher M; Martin, William R; Rearden, Bradley T; Williams, Mark L

    2012-01-01

    Three methods for calculating continuous-energy eigenvalue sensitivity coefficients were developed and implemented into the SHIFT Monte Carlo code within the Scale code package. The methods were used for several simple test problems and were evaluated in terms of speed, accuracy, efficiency, and memory requirements. A promising new method for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was developed and produced accurate sensitivity coefficients with figures of merit that were several orders of magnitude larger than those from existing methods.
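
    For reference, the quantity these methods compute is the relative eigenvalue sensitivity coefficient S = (dk/k)/(dΣ/Σ). The sketch below defines it by brute-force central differencing of two perturbed calculations, which is exactly the expensive multi-run approach that single-run estimators such as CLUTCH are designed to avoid; the example numbers are invented.

```python
# Central-difference definition of the eigenvalue sensitivity coefficient
# S = (dk/k) / (dSigma/Sigma), shown only to define the target quantity.
def sensitivity_central_difference(k_plus, k_minus, k_nominal, rel_perturbation):
    """S from two runs with the cross section perturbed by +/- rel_perturbation."""
    dk_over_k = (k_plus - k_minus) / (2.0 * k_nominal)
    return dk_over_k / rel_perturbation

# e.g. a +/-1% perturbation in a cross section shifting k-eff by +/-0.0005
print(sensitivity_central_difference(1.0005, 0.9995, 1.0000, 0.01))  # -> 0.05
```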

  11. Development of Laser-Ion Beam Photodissociation Methods

    SciTech Connect

    David H. Russell

    2004-05-11

    stabilized) ions. It is difficult to probe the 2°/3° structure of gas-phase ions using fragmentation chemistry; because the energy barriers to inter-conversion of different structural forms lie below the fragmentation threshold, studies of low-internal-energy ions are better suited for this purpose. A major challenge for gas-phase ion research is the design of experimental structural probes that can be used in parallel with computational chemistry, molecular modeling, and/or classical structural diagnostic tools to aid interpretation of the experimental data. Our experimental design and selection of research problems are guided by this philosophy. The following sections of the progress report focus on three main issues: (i) technique and instrument development, and (ii) studies of ion structure and ion chemistry.

  12. Development of Laser-Ion Beam Photodissociation Methods

    SciTech Connect

    David H. Russell

    2004-03-31

    stabilized) ions. It is difficult to probe the 2°/3° structure of gas-phase ions using fragmentation chemistry; because the energy barriers to inter-conversion of different structural forms lie below the fragmentation threshold, studies of low-internal-energy ions are better suited for this purpose. A major challenge for gas-phase ion research is the design of experimental structural probes that can be used in parallel with computational chemistry, molecular modeling, and/or classical structural diagnostic tools to aid interpretation of the experimental data. Our experimental design and selection of research problems are guided by this philosophy. The following sections of the progress report focus on three main issues: (i) technique and instrument development, and (ii) studies of ion structure and ion chemistry.

  13. 75 FR 22126 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-27

    ... November 12, 2008 (73 FR 67057-67059). The new equivalent method for O 3 is an automated method that... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods: Designation of One New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of...

  14. Towards standard methods for the detection of Cryptosporidium parvum on lettuce and raspberries. Part 1: development and optimization of methods.

    PubMed

    Cook, N; Paton, C A; Wilkinson, N; Nichols, R A B; Barker, K; Smith, H V

    2006-06-15

    No standard method is available for detecting protozoan parasites on foods such as soft fruit and salad vegetables. We report on optimizing methods for detecting Cryptosporidium parvum on lettuce and raspberries. These methods are based on four basic stages: extraction of oocysts from the foodstuffs, concentration of the extract and separation of the oocysts from food materials, staining of the oocysts to allow their visualization, and identification of oocysts by microscopy. The concentration and separation steps are performed by centrifugation, followed by immunomagnetic separation using proprietary kits. Oocyst staining is also performed using proprietary reagents. The performance parameters of the extraction steps were extensively optimized, using artificially contaminated samples. The fully developed methods were tested several times to determine their reliability. The method to detect C. parvum on lettuce recovered 59.0+/-12.0% (n=30) of artificially contaminated oocysts. The method to detect C. parvum on raspberries recovered 41.0+/-13.0% (n=30) of artificially contaminated oocysts. PMID:16529835

  15. DEVELOPMENT OF AN ELECTROSPRAY MASS SPECTROMETRIC METHOD FOR DETERMINING PERCHLORATE IN FERTILIZERS

    EPA Science Inventory

    An electrospray mass spectrometric method has been developed for application to agricultural and horticultural fertilizers to determine perchlorate. After fertilizers are leached or dissolved in water, the method relies on the formation of stable ion pair complex of the perchlor...

  16. DEVELOPMENT OF A RAPID ANALYTICAL METHOD FOR DETERMINING ASBESTOS IN WATER

    EPA Science Inventory

    The development of a rapid analytical method for determining chrysotile asbestos in water that requires substantially less time per analysis than electron microscopy methods is described. Based on the proposition that separation of chrysotile from other waterborne particulate wou...

  17. METHOD DEVELOPMENT FOR THE DETERMINATION OF FORMALDEHYDE IN SAMPLES OF ENVIRONMENTAL ORIGIN

    EPA Science Inventory

    An analytical method was developed for the determination of formaldehyde in samples of environmental origin. After a review of the current literature, five candidate methods involving chemical derivatization were chosen for evaluation. The five derivatization reagents studied wer...

  18. Observation of swelling behavior of ArF resist during development by using QCM method

    NASA Astrophysics Data System (ADS)

    Sekiguchi, Atsushi; Konishi, Hiroko; Isono, Mariko

    2012-03-01

    Many reports have discussed the swelling behavior of photoresists during development, as observed by the QCM method. Previously, we reported on the development of resist-development analysis equipment based on the QCM method. In this paper, we report on a high-precision resist development analyzer, also based on the QCM method, that incorporates a high-precision developer-temperature controller and a high-precision air-conditioning function for the measurement chamber. We also measured swelling behavior during development using a TBAH developer solution, whose molecules are larger than those of TMAH, and compared these results with those obtained with TMAH. The measurements indicate that the extent of resist swelling during development is less with the TBAH developer solution than with the TMAH developer solution. This result is consistent with the results of a study by Itani et al. using high-speed AFM, supporting the suitability of the measurement equipment used in our experiments.
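
    The QCM measurement rests on the Sauerbrey relation between resonator frequency shift and areal mass change, which is how developer uptake (swelling) by the resist film is quantified under the usual rigid-film assumption. A minimal sketch follows; the crystal fundamental frequency and the example frequency shift are assumptions, not values from the paper.

```python
# Sauerbrey relation for a QCM: a frequency drop of the quartz resonator maps to
# an areal mass gain, here interpreted as developer uptake by the resist film.
from math import sqrt

RHO_Q = 2648.0      # density of quartz, kg/m^3
MU_Q = 2.947e10     # shear modulus of AT-cut quartz, Pa

def areal_mass_change(delta_f_hz, f0_hz=5.0e6):
    """Mass change per unit area (kg/m^2); a negative delta_f means mass uptake."""
    return -delta_f_hz * sqrt(RHO_Q * MU_Q) / (2.0 * f0_hz**2)

dm = areal_mass_change(-50.0)     # a -50 Hz shift on an assumed 5 MHz crystal
print(dm * 1e5)                   # in micrograms/cm^2, roughly 0.88
```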

  19. 24 CFR Appendix II to Subpart C of... - Development of Standards; Calculation Methods

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 24 Housing and Urban Development 1 2014-04-01 2014-04-01 false Development of Standards; Calculation Methods II Appendix II to Subpart C of Part 51 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development ENVIRONMENTAL CRITERIA AND STANDARDS Siting of...

  20. 24 CFR Appendix II to Subpart C of... - Development of Standards; Calculation Methods

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 24 Housing and Urban Development 1 2012-04-01 2012-04-01 false Development of Standards; Calculation Methods II Appendix II to Subpart C of Part 51 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development ENVIRONMENTAL CRITERIA AND STANDARDS Siting of...

  1. 24 CFR Appendix II to Subpart C of... - Development of Standards; Calculation Methods

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 1 2010-04-01 2010-04-01 false Development of Standards; Calculation Methods II Appendix II to Subpart C of Part 51 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development ENVIRONMENTAL CRITERIA AND STANDARDS Siting of...

  2. 24 CFR Appendix II to Subpart C of... - Development of Standards; Calculation Methods

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 24 Housing and Urban Development 1 2013-04-01 2013-04-01 false Development of Standards; Calculation Methods II Appendix II to Subpart C of Part 51 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development ENVIRONMENTAL CRITERIA AND STANDARDS Siting of...

  3. 24 CFR Appendix II to Subpart C of... - Development of Standards; Calculation Methods

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Development of Standards; Calculation Methods II Appendix II to Subpart C of Part 51 Housing and Urban Development Office of the Secretary, Department of Housing and Urban Development ENVIRONMENTAL CRITERIA AND STANDARDS Siting of...

  4. Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy

    ERIC Educational Resources Information Center

    Olaniran, Bolanle A., Ed.

    2010-01-01

    E-learning has become a significant aspect of training and education in the worldwide information economy as an attempt to create and facilitate a competent global work force. "Cases on Successful E-Learning Practices in the Developed and Developing World: Methods for the Global Information Economy" provides eclectic accounts of case studies in…

  5. 78 FR 22540 - Notice of Public Meeting/Webinar: EPA Method Development Update on Drinking Water Testing Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-16

    ...'s method development work and to provide public comments. In addition, EPA will be posting the... CCL (CCL 3) containing 116 contaminants on October 8, 2009 (74 FR 51850) and the third UCMR (UCMR 3) on May 2, 2012 (77 FR 26072). Monitoring under UCMR 3 began January 1, 2013, and will continue...

  6. Development of a Method to Investigate Medical Students' Perceptions of Their Personal and Professional Development

    ERIC Educational Resources Information Center

    Lown, Nick; Davies, Ioan; Cordingley, Lis; Bundy, Chris; Braidman, Isobel

    2009-01-01

    Personal and Professional Development (PPD) is now key to the undergraduate medical curriculum and requires provision of appropriate learning experiences. In order to achieve this, it is essential that we ascertain students' perceptions of what is important in their PPD. We required a methodological approach suitable for a large medical school,…

  7. An exploratory survey of methods used to develop measures of performance

    NASA Astrophysics Data System (ADS)

    Hamner, Kenneth L.; Lafleur, Charles A.

    1993-09-01

    Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommended improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metrics development method. Those components were incorporated into a proposed metric development method, based on the OMB Generic Method, that should be more likely to produce high-quality metrics resulting in continuous process improvement.

  8. RECENT DEVELOPMENTS IN ANALYTICAL METHODS FOR FIBROUS AMPHIBOLE IN VERMICULITE ATTIC INSULATION

    EPA Science Inventory

    The U.S. Environmental Protection Agency has developed a test method for the analysis of fibrous amphibole in vermiculite attic insulation. This method was developed to provide the Agency with monitoring tools to study the occurrence and potential for exposure to fibrous amphibo...

  9. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    SciTech Connect

    Campbell, J.A.; Clauss, S.; Grant, K.; Hoopes, V.; Lerner, B.; Lucke, R.; Mong, G.; Rau, J.; Wahl, K.; Steele, R.

    1994-09-01

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY).

  10. INDOOR AIR EMISSIONS FROM OFFICE EQUIPMENT: TEST METHOD DEVELOPMENT AND POLLUTION PREVENTION OPPORTUNITIES

    EPA Science Inventory

    The report describes the development and evaluation of a large chamber test method for measuring emissions from dry-process photocopiers. The test method was developed in two phases. Phase 1 was a single-laboratory evaluation at Research Triangle Institute (RTI) using four, mid-r...

  11. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    SciTech Connect

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    1994-09-01

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and Tank T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and the 222-S laboratory. This report is intended as an annual report, not a completed work.

  12. RESEARCH ASSOCIATED WITH THE DEVELOPMENT OF EPA METHOD 552.2

    EPA Science Inventory

    The work presented in this paper entails the development of a method for haloacetic acid (HAA) analysis, Environmental Protection Agency (EPA) Method 552.2, that improves the safety and efficiency of previous methods and incorporates three additional trihalogenated acetic acids: b...

  13. The Development of Students' Use of Additive and Proportional Methods along Primary and Secondary School

    ERIC Educational Resources Information Center

    Fernandez, Ceneida; Llinares, Salvador; Van Dooren, Wim; De Bock, Dirk; Verschaffel, Lieven

    2012-01-01

    This study investigates the development of proportional and additive methods along primary and secondary school. In particular, it simultaneously investigates the use of additive methods in proportional word problems and the use of proportional methods in additive word problems. We have also studied the role played by integer and non-integer…

  14. 75 FR 45627 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-03

    ... of 40 CFR part 53, as amended on November 12, 2008 (73 FR 67057-67059). The new equivalent method for... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods: Designation of One New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of...

  15. 76 FR 62402 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ... provisions of 40 CFR part 53, as amended on June 22, 2010 (75 FR 35597). The new O 3 equivalent method is an... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods; Designation of One New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of...

  16. Development of continuous-energy eigenvalue sensitivity coefficient calculation methods in the Shift Monte Carlo code

    SciTech Connect

    Perfetti, C.; Martin, W.; Rearden, B.; Williams, M.

    2012-07-01

    Three methods for calculating continuous-energy eigenvalue sensitivity coefficients were developed and implemented into the Shift Monte Carlo code within the SCALE code package. The methods were used for two small-scale test problems and were evaluated in terms of speed, accuracy, efficiency, and memory requirements. A promising new method for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was developed and produced accurate sensitivity coefficients with figures of merit that were several orders of magnitude larger than those from existing methods. (authors)
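
    The figures of merit quoted above follow the usual Monte Carlo convention FOM = 1/(R^2 * T), where R is the relative standard error of the estimate and T is the run time. The sketch below only illustrates that convention with made-up numbers; it is not the CLUTCH implementation, which the abstract names but does not reproduce.

```python
# Figure-of-merit (FOM) comparison for Monte Carlo sensitivity methods.
# FOM = 1 / (R^2 * T), where R is the relative standard error of the
# estimate and T is the run time. Values below are illustrative
# placeholders, not data from the cited study.

def figure_of_merit(rel_std_error: float, run_time_s: float) -> float:
    """Return the Monte Carlo figure of merit 1/(R^2 * T)."""
    return 1.0 / (rel_std_error**2 * run_time_s)

# Hypothetical comparison: a new sensitivity method vs. an existing one.
fom_new = figure_of_merit(rel_std_error=0.01, run_time_s=600.0)
fom_old = figure_of_merit(rel_std_error=0.05, run_time_s=3600.0)

print(f"new method FOM: {fom_new:.2f}")
print(f"old method FOM: {fom_old:.2f}")
print(f"improvement factor: {fom_new / fom_old:.0f}x")
```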

  17. Development of a General Method for Determining Leak Rates from Limiting Enclosures

    NASA Technical Reports Server (NTRS)

    Zografos, A. I.; Blackwell, C. C.; Harper, Lynn D. (Technical Monitor)

    1994-01-01

    This paper discusses the development of a general method for the determination of very low leak rates from limiting enclosures. There are many methods that can be used to detect and repair leaks from enclosures. Many methods have also been proposed that allow the estimation of actual leak rates, usually expressed as enclosure volume turnover. The proposed method combines measurements of the state variables (pressure, temperature, and volume) as well as the change in the concentration of a tracer gas to estimate the leak rate. The method was applied to the containment enclosure of the Engineering Development Unit of the CELSS Test Facility, currently undergoing testing at the NASA Ames Research Center.
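
    A common way to express a leak rate derived from tracer-gas data (an assumption here; the paper's exact formulation is not given in the abstract) is exponential dilution in a well-mixed enclosure, C(t) = C0*exp(-Q*t/V), which gives Q = (V/t)*ln(C0/C) and can be reported as enclosure volume turnovers. A minimal sketch with hypothetical numbers:

```python
import math

def leak_rate_from_tracer(c0: float, c_t: float, elapsed_h: float,
                          volume_m3: float) -> tuple[float, float]:
    """Estimate an enclosure leak rate from tracer-gas decay.

    Assumes a well-mixed enclosure at constant pressure and temperature,
    so the tracer follows C(t) = C0 * exp(-Q t / V). Returns
    (Q in m^3/h, leakage in enclosure volumes per day)."""
    q = (volume_m3 / elapsed_h) * math.log(c0 / c_t)
    turnovers_per_day = q * 24.0 / volume_m3
    return q, turnovers_per_day

# Hypothetical numbers, not taken from the CELSS Test Facility report:
q, turnover = leak_rate_from_tracer(c0=100.0, c_t=97.5, elapsed_h=48.0,
                                    volume_m3=20.0)
print(f"leak rate: {q:.4f} m^3/h  ({turnover:.4f} volumes/day)")
```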

  18. DEVELOPMENT OF ANALYTICAL METHODS FOR DETERMINING SUPPRESSOR CONCENTRATION IN THE MCU NEXT GENERATION SOLVENT (NGS)

    SciTech Connect

    Taylor-Pashow, K.; Fondeur, F.; White, T.; Diprete, D.; Milliken, C.

    2013-07-31

    Savannah River National Laboratory (SRNL) was tasked with identifying and developing at least one, but preferably two methods for quantifying the suppressor in the Next Generation Solvent (NGS) system. The suppressor is a guanidine derivative, N,N',N"-tris(3,7-dimethyloctyl)guanidine (TiDG). A list of 10 possible methods was generated, and screening experiments were performed for 8 of the 10 methods. After completion of the screening experiments, the non-aqueous acid-base titration was determined to be the most promising, and was selected for further development as the primary method. {sup 1}H NMR also showed promising results from the screening experiments, and this method was selected for further development as the secondary method. Other methods, including {sup 36}Cl radiocounting and ion chromatography, also showed promise; however, due to the similarity to the primary method (titration) and the inability to differentiate between TiDG and TOA (tri-n-ocytlamine) in the blended solvent, {sup 1}H NMR was selected over these methods. Analysis of radioactive samples obtained from real waste ESS (extraction, scrub, strip) testing using the titration method showed good results. Based on these results, the titration method was selected as the method of choice for TiDG measurement. {sup 1}H NMR has been selected as the secondary (back-up) method, and additional work is planned to further develop this method and to verify the method using radioactive samples. Procedures for analyzing radioactive samples of both pure NGS and blended solvent were developed and issued for the both methods.

  19. Methods for assessment of innovative medical technologies during early stages of development

    PubMed Central

    Bartelmes, Marc; Neumann, Ulrike; Lühmann, Dagmar; Schönermark, Matthias P.; Hagen, Anja

    2009-01-01

    Conventional Health Technology Assessment (HTA) is usually conducted at a point in time at which the development of the respective technology may no longer be influenced. By this time developers and/or purchasers may have misinvested resources. Thus the demand for Technology Assessment (TA) which incorporates appropriate methods during early development stages of a technology becomes apparent. Against this health political background, the present report describes methods for a development-accompanying assessment of innovative medical technologies. Furthermore, international research programmes set out to identify or apply such methods will be outlined. A systematic literature search as well as an extensive manual literature search are carried out in order to obtain literature and information. The greatest units of the identified methods consist of assessment concepts, decision support methods, modelling approaches and methods focusing on users and their knowledge. Additionally, several general-purpose concepts have been identified. The identified research programmes INNO-HTA and MATCH (Multidisciplinary-Assessment-of-Technology-Centre-for-Healthcare) are to be seen as pilot projects which so far have not been able to generate final results. MATCH focuses almost entirely on the incorporation of the user-perspective regarding the development of non-pharmaceutical technologies, whereas INNO-HTA is basically concerned with the identification and possible advancement of methods for the early, socially-oriented technology assessment. Most references offer only very vague descriptions of the respective method and the application of greatly differing methods seldom exceeds the character of a pilot implementation. A standardisation much less an institutionalisation of development-accompanying assessment cannot be recognized. It must be noted that there is no singular method with which development-accompanying assessment should be carried out. Instead, a technology and

  20. The Development and Evaluation of Training Methods for Group IV Personnel. 1. Orientation and Implementation of the Training Methods Development School (TMDS).

    ERIC Educational Resources Information Center

    Steinemann, John H.

    The investigation is part of continuing Navy research on the Trainability of Group IV (low ability) personnel intended to maximize the utilization and integration of marginal personnel in the fleet. An experimental Training Methods Development School (TMDS) was initiated to provide an experimental training program, with research controls, for…

  1. Developing Methods to Test the Influence of Critical Zone Development on Watershed Hydrology and Biogeochemistry

    NASA Astrophysics Data System (ADS)

    Anderson, S. P.; Blum, A. E.; Dethier, D. P.; Murphy, S. F.; Williams, M. W.; McKnight, D.; Fierer, N.; Tucker, G.; Wobus, C.; Anderson, R. S.; Caine, N.; Loague, K.; Leopold, M.; Voelkel, J.; Sheehan, A.

    2007-12-01

    The Boulder Creek Critical Zone Observatory in the Front Range of Colorado, USA, is designed to study the development and function of the near-surface weathered profile. The critical zone is the interface between bedrock and the atmosphere, where water and terrestrial ecosystems drive chemical transformations, and where weathering and erosion transform landscapes and shape the critical zone itself. In Boulder Creek catchment, erosion rates and processes vary dramatically over the 2600 m elevation range from the Colorado piedmont to the headwaters at the continental divide. The topographic, climatic, erosional and ecologic variations result in a critical zone that ranges from thin, fracture-dominated, weathered profiles truncated by glacial erosion to slowly eroding, deeply weathered mantles. Quantifying these variations in critical zone development, and understanding how erosion and weathering processes produce these variations, are primary goals of the Boulder Creek CZO. Against this backdrop, we will use hydrochemistry to examine how critical zone development influences fluxes of water, solutes, and nutrients to streams. We expect that reaction progress will be low in glacially truncated critical zone profiles in the headwaters, since water residence times are expected to be low in the thin fracture-dominated weathered zone. In contrast, we expect deeply weathered profiles on post-Laramide low-relief surfaces to yield long residence time water that approaches saturation with respect to minerals present. Dissolved organic matter (DOM) will likely show the most evidence of microbial processing in the deeply weathered profiles. We will use a suite of tools to test these expectations. Water will be collected from streams, wells and soil water samplers arrayed in three subcatchments. Our headwater site is coincident with the Niwot Ridge LTER high elevation site and will be managed jointly. In the case of dry regolith, we will use laboratory soil water extracts to test

  2. Development of advanced modal methods for calculating transient thermal and structural response

    NASA Technical Reports Server (NTRS)

    Camarda, Charles J.

    1991-01-01

    Higher-order modal methods for predicting thermal and structural response are evaluated. More accurate methods or ones which can significantly reduce the size of complex, transient thermal and structural problems are desirable for analysis and are required for synthesis of real structures subjected to thermal and mechanical loading. A unified method is presented for deriving successively higher-order modal solutions related to previously-developed, lower-order methods such as the mode displacement and mode-acceleration methods. A new method, called the force-derivative method, is used to obtain higher-order modal solutions for both uncoupled (proportionally-damped) structural problems as well as thermal problems and coupled (non-proportionally damped) structural problems. The new method is called the force-derivative method because, analogous to the mode-acceleration method, it produces a term that depends on the forcing function and additional terms that depend on the time derivatives of the forcing function.
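
    A minimal illustration of why such higher-order corrections matter (generic modal analysis, not the force-derivative code itself): with a truncated modal basis, the mode-displacement solution omits the quasi-static contribution of the discarded modes, whereas the mode-acceleration form recovers the exact static response K^-1 f. The 3-DOF system below is an assumed example used purely for illustration.

```python
import numpy as np

# Generic illustration (not the paper's force-derivative code): with a
# truncated modal basis, the mode-displacement solution misses the static
# contribution of the omitted modes, while the mode-acceleration form
# (pseudo-static term K^-1 f plus kept-mode dynamics) is exact for a
# static load.

# 3-DOF spring-mass chain with unit masses and unit spring stiffnesses.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  1.0]])
f = np.array([0.0, 0.0, 1.0])          # static load at the free end

w2, phi = np.linalg.eigh(K)            # eigenvalues w_i^2, mass-normalized modes
x_exact = np.linalg.solve(K, f)

kept = [0]                             # retain only the lowest mode
x_mode_disp = sum(phi[:, i] * (phi[:, i] @ f) / w2[i] for i in kept)
x_mode_accel = np.linalg.solve(K, f)   # dynamic correction vanishes for a static load

print("exact solution   :", x_exact)
print("mode-displacement:", x_mode_disp)
print("mode-acceleration:", x_mode_accel)
```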

  3. Development of a hybrid deterministic/stochastic method for 1D nuclear reactor kinetics

    NASA Astrophysics Data System (ADS)

    Terlizzi, Stefano; Rahnema, Farzad; Zhang, Dingkang; Dulla, Sandra; Ravetto, Piero

    2015-12-01

    A new method has been implemented for solving the time-dependent neutron transport equation efficiently and accurately. This is accomplished by coupling the hybrid stochastic-deterministic steady-state coarse-mesh radiation transport (COMET) method [1,2] with the new predictor-corrector quasi-static method (PCQM) developed at Politecnico di Torino [3]. In this paper, the coupled method is implemented and tested in 1D slab geometry.

  4. Development of a hybrid deterministic/stochastic method for 1D nuclear reactor kinetics

    SciTech Connect

    Terlizzi, Stefano; Dulla, Sandra; Ravetto, Piero; Rahnema, Farzad; Zhang, Dingkang

    2015-12-31

    A new method has been implemented for solving the time-dependent neutron transport equation efficiently and accurately. This is accomplished by coupling the hybrid stochastic-deterministic steady-state coarse-mesh radiation transport (COMET) method [1,2] with the new predictor-corrector quasi-static method (PCQM) developed at Politecnico di Torino [3]. In this paper, the coupled method is implemented and tested in 1D slab geometry.

  5. Using Models to Develop Measurement Systems: A Method and Its Industrial Use

    NASA Astrophysics Data System (ADS)

    Staron, Miroslaw; Meding, Wilhelm

    Making the measurement processes work in large software development organizations requires collecting the right metrics and collecting them automatically. Collecting the right metrics requires developing custom measurement systems which fulfill the actual needs of the company. Effective communication between stakeholders (persons who have the information needs) and the designers of measurement systems is a cornerstone in identifying the right metrics and the right amount of them. In this paper we describe a method for developing measurement systems based on models which make this communication more effective. The method supports the designers of measurement systems and the managers for whom the measurement systems are created in developing more effective measurement systems based on MS Excel. The method comprises platform-independent modeling, platform-specific modeling, and automated code generation. This method has been used in one of the action research projects at Ericsson. We present the results of the evaluation of this method at Ericsson at the end of this paper.

  6. Gas-generator pressurization system experimental development method of the LV propellant tanks

    NASA Astrophysics Data System (ADS)

    Logvinenko, A.

    2009-01-01

    A proven, efficient method of experimental development is presented, drawing on accumulated experience with the gas-generator pressurization systems of launch-vehicle propellant tanks. To date, adequate calculation methods have not been established because of the complexity of the thermal and mass-transfer processes involved. The development of such systems therefore centres on ground experimental testing, which requires special test benches, suitably qualified organisations, and considerable time and material expenditure. The proposed method of experimental development for gas-generator pressurization systems is based on an energy analysis of the influencing factors and on selection of the system's limiting operating modes. In practice, it significantly reduces the test volume, material expenditure, and time required for experimental development of the pressurization system while ensuring its optimal main characteristics.

  7. Development and Validation of Simultaneous Spectrophotometric Methods for Drotaverine Hydrochloride and Aceclofenac from Tablet Dosage Form

    PubMed Central

    Shah, S. A.; Shah, D. R.; Chauhan, R. S.; Jain, J. R.

    2011-01-01

    Two simple spectrophotometric methods have been developed for simultaneous estimation of drotaverine hydrochloride and aceclofenac from tablet dosage form. Method I is a simultaneous equation method (Vierordt's method), with 306.5 and 276 nm selected as the analytical wavelengths. Method II is the absorbance ratio method (Q-Analysis), which employs 298.5 nm as λ1 and 276 nm as λ2 (λmax of AF) for the formation of equations. Both methods were found to be linear over the range of 8-32 μg/ml for drotaverine and 10-40 μg/ml for aceclofenac. The accuracy and precision were determined and found to comply with ICH guidelines. Both methods showed good reproducibility and recovery with % RSD in the desired range. The methods were found to be rapid, specific, precise and accurate and can be successfully applied for the routine analysis of drotaverine and aceclofenac in their combined tablet dosage form. PMID:22457554
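
    Vierordt's simultaneous-equation method reduces to solving a 2x2 linear system built from the absorptivities of the two drugs at the two analytical wavelengths. The sketch below shows the generic calculation; the absorptivity and absorbance values are placeholders, not the calibration data of this study.

```python
import numpy as np

# Vierordt's simultaneous-equation method for a two-component mixture:
#   A(l1) = a11*Cx + a12*Cy
#   A(l2) = a21*Cx + a22*Cy
# where aij are absorptivities (e.g., A 1%/1 cm) of each drug at each
# wavelength. The numbers below are illustrative placeholders.

A = np.array([0.512, 0.430])               # measured absorbances at l1, l2
absorptivity = np.array([[310.0, 120.0],   # drug X, drug Y at l1
                         [150.0, 285.0]])  # drug X, drug Y at l2

conc_g_per_100ml = np.linalg.solve(absorptivity, A)
cx, cy = conc_g_per_100ml * 1e4            # convert g/100 mL to ug/mL
print(f"drug X: {cx:.2f} ug/mL, drug Y: {cy:.2f} ug/mL")
```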

  8. Development of a harmonised method for the profiling of amphetamines V: Determination of the variability of the optimised method.

    PubMed

    Lock, Eric; Aalberg, Laura; Andersson, Kjell; Dahlén, Johan; Cole, Michael D; Finnon, Yvonne; Huizer, Henk; Jalava, Kaisa; Kaa, Elisabet; Lopes, Alvaro; Poortman-van der Meer, Anneke; Sippola, Erkki

    2007-06-14

    This paper is the fifth in a series of six in relation to the development of a harmonised method for the profiling of amphetamine [L. Aalberg, K. Andersson, C. Bertler, H. Borén, M.D. Cole, J. Dahlén, Y. Finnon, H. Huizer, K. Jalava, E. Kaa, E. Lock, A. Lopes, A. Poortman-van der Meer, E. Sippola, Development of a harmonised method for the profiling of amphetamines I. Synthesis of standards and compilation of analytical data, Forensic Sci. Int. 149 (2005) 219-229; L. Aalberg, K. Andersson, C. Bertler, M.D. Cole, Y. Finnon, H. Huizer, K. Jalava, E. Kaa, E. Lock, A. Lopes, A. Poortman-van der Meer, E. Sippola, J. Dahlén, Development of a harmonised method for the profiling of amphetamines II. Stability of impurities in organic solvents, Forensic Sci. Int. 149 (2005) 231-241]. The third paper [K. Andersson, K. Jalava, E. Lock, L. Aalberg, Y. Finnon, H. Huizer, E. Kaa, A. Lopes, A. Poortman-van der Meer, M.D. Cole, J. Dahlén, E. Sippola, Development of a harmonised method for the profiling of amphetamines III. Development of the gas chromatographic method, Forensic Sci. Int., in press] dealt with the optimisation of the gas chromatographic and detection methods whereas the fourth paper [K. Andersson, K. Jalava, E. Lock, Y. Finnon, S. Stevenson, L. Aalberg, H. Huizer, E. Kaa, A. Lopes, A. Poortman-van der Meer, M.D. Cole, J. Dahlén, E. Sippola, Development of a harmonised method for the profiling of amphetamines IV. Optimisation of sample preparation, Forensic Sci. Int., in press] concerned the optimisation of the extraction method prior to GC analysis. This paper is a study of the optimised method in order to determine its stability. Investigations of within and between day variations were carried out in four laboratories. Moreover, variations between laboratories were also determined. Both flame ionisation detector (FID) and MS detection were used. One laboratory studied nitrogen-phosphorous detector (NPD) detection as well. For this task, 12 batches of

  9. Development of method to characterize emissions from spray polyurethane foam insulation

    EPA Science Inventory

    This presentation updates symposium participants on EPA progress towards development of SPF insulation emissions characterization methods. The presentation highlights evaluation of experiments investigating emissions after application of SPF to substrates in micro chambers and i...

  10. Analysis of Perfluorinated Chemicals in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A fast, rigorous method was developed to maximize the extraction efficacy for ten perfluorocarboxylic acids and perfluorooctanesulfonate from wastewater-treatment sludge and to quantitate using liquid chromatography, tandem-mass spectrometry (LC/MS/MS). First, organic solvents w...

  11. Analysis of Perfluorinated Chemicals and Their Fluorinated Precursors in Sludge: Method Development and Initial Results

    EPA Science Inventory

    A rigorous method was developed to maximize the extraction efficacy for perfluorocarboxylic acids (PFCAs), perfluorosulfonates (PFSAs), fluorotelomer alcohols (FTOHs), fluorotelomer acrylates (FTAc), perfluorosulfonamides (FOSAs), and perfluorosulfonamidoethanols (FOSEs) from was...

  12. DEVELOPMENT AND VALIDATION OF AN ION CHROMATOGRAPHIC METHOD FOR DETERMINING PERCHLORATE IN FERTILIZERS

    EPA Science Inventory

    A method has been developed for the determination of perchlorate in fertilizers. Materials are leached with deionized water to dissolve any soluble perchlorate compounds. Ion chromatographic separation is followed by suppressed conductivity for detection. Perchlorate is retained ...

  13. Developing Non-Targeted Measurement Methods to Characterize the Human Exposome

    EPA Science Inventory

    The exposome represents all exposures experienced by an individual during their lifetime. Registered chemicals currently number in the tens-of-thousands, and therefore comprise a significant portion of the human exposome. To date, quantitative monitoring methods have been develop...

  14. Magnetron sputtering as a method of thin-film catalyst development for electrochemical sensors

    NASA Astrophysics Data System (ADS)

    Medvedeva, E. A.

    2016-07-01

    The aim of this work was to develop a thin-film Pt/C catalyst on fluoroplastic substrates by means of the magnetron sputtering method, for use as reference and working electrodes in electrochemical cells.

  15. PREDICTING THE EFFECTIVENESS OF CHEMICAL-PROTECTIVE CLOTHING MODEL AND TEST METHOD DEVELOPMENT

    EPA Science Inventory

    A predictive model and test method were developed for determining the chemical resistance of protective polymeric gloves exposed to liquid organic chemicals. The prediction of permeation through protective gloves by solvents was based on theories of the solution thermodynamics of...

  16. Novel quantitative methods for characterization of chemical induced functional alteration in developing neuronal cultures

    EPA Science Inventory

    Thousands of chemicals lack adequate testing for adverse effects on nervous system development, stimulating research into alternative methods to screen chemicals for potential developmental neurotoxicity. Microelectrode arrays (MEA) collect action potential spiking...

  17. Evaluation of an automatic brain segmentation method developed for neonates on adult MR brain images

    NASA Astrophysics Data System (ADS)

    Moeskops, Pim; Viergever, Max A.; Benders, Manon J. N. L.; Išgum, Ivana

    2015-03-01

    Automatic brain tissue segmentation is of clinical relevance in images acquired at all ages. The literature presents a clear distinction between methods developed for MR images of infants, and methods developed for images of adults. The aim of this work is to evaluate a method developed for neonatal images in the segmentation of adult images. The evaluated method employs supervised voxel classification in subsequent stages, exploiting spatial and intensity information. Evaluation was performed using images available within the MRBrainS13 challenge. The obtained average Dice coefficients were 85.77% for grey matter, 88.66% for white matter, 81.08% for cerebrospinal fluid, 95.65% for cerebrum, and 96.92% for intracranial cavity, currently resulting in the best overall ranking. The possibility of applying the same method to neonatal as well as adult images can be of great value in cross-sectional studies that include a wide age range.
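
    The Dice coefficients quoted above are computed as 2|A∩B|/(|A|+|B|), expressed as a percentage. A minimal sketch with synthetic masks (not MRBrainS13 data):

```python
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentations,
    expressed as a percentage: 200 * |A intersect B| / (|A| + |B|)."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 100.0  # both masks empty: define as perfect agreement
    return 200.0 * np.logical_and(a, b).sum() / denom

# Toy example with synthetic masks (not MRBrainS13 data):
auto = np.zeros((10, 10), dtype=bool); auto[2:8, 2:8] = True
ref  = np.zeros((10, 10), dtype=bool); ref[3:9, 2:8]  = True
print(f"Dice: {dice(auto, ref):.2f}%")
```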

  18. Method for developing national quality indicators based on manual data extraction from medical records.

    PubMed

    Couralet, Melanie; Leleu, Henri; Capuano, Frederic; Marcotte, Leah; Nitenberg, Gérard; Sicotte, Claude; Minvielle, Etienne

    2013-02-01

    Developing quality indicators (QI) for national purposes (eg, public disclosure, paying-for-performance) highlights the need to find accessible and reliable data sources for collecting standardised data. The most accurate and reliable data source for collecting clinical and organisational information still remains the medical record. Data collection from electronic medical records (EMR) would be far less burdensome than from paper medical records (PMR). However, the development of EMRs is costly and has suffered from low rates of adoption and barriers of usability even in developed countries. Currently, methods for producing national QIs based on the medical record rely on manual extraction from PMRs. We propose and illustrate such a method. These QIs display feasibility, reliability and discriminative power, and can be used to compare hospitals. They have been implemented nationwide in France since 2006. The method used to develop these QIs could be adapted for use in large-scale programmes of hospital regulation in other, including developing, countries. PMID:23015098

  19. Calculating development parameters for chemically amplified resists by the film-reducing method

    NASA Astrophysics Data System (ADS)

    Sekiguchi, Atsushi; Sensu, Yoshihisa

    2013-03-01

    We obtained development parameters for a chemically amplified resist from calculations involving the conversion of the relationship between exposure dose and development rate to the relationship between protection ratio and development rate using the conventional ABC parameter [1] and development rate data (RDA data) [2]. However, calculations by this method require the ABC parameter. Since chemically amplified resists have no bleaching effect, the C parameter must be measured by the FT-IR [3-5] or coumarin addition method [6-8]. Given this constraint, we examined a method of obtaining development parameters based on the film reduction observed in the exposed resist or the film reduction observed after PEB, without using the ABC parameter. This paper presents the results.

  20. DEVELOPMENT OF CHROMATOGRAPHIC METHOD FOR DETERMINATION OF DRUGS REDUCING CHOLESTEROL LEVEL--STATINS AND EZETIMIBE.

    PubMed

    Kublin, Elżbieta; Malanowicz, Ewa; Kaczmarska-Graczyk, Barbara; Czerwińska, Krystyna; Wyszomirska, Elżbieta; Mazurek, Aleksander P

    2015-01-01

    The presented developed HPLC method and GC method may be used to separate and determine all analyzed 3-hydroxy-3-methylglutaryl-coenzyme A reductase inhibitors (statins) and ezetimibe using a single column and a uniform methodology. For qualitative and quantitative testing of statins and ezetimibe, the HPLC method used a Symmetry C18 column (250 mm x 4.6 mm, 5 µm), a mobile phase of acetonitrile:water (70:30, v/v) adjusted to pH 2.5, and spectrophotometric detection. For the GC method, an HP-1 column (30 m x 0.25 mm x 0.25 µm) and an FID detector were selected. All results and statistical data obtained indicate good method sensitivity and precision. The RSD values are appropriate for both newly developed methods. PMID:26642651
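
    The RSD values cited as evidence of precision are conventionally computed as 100 x (sample standard deviation / mean) of replicate determinations. A small sketch with hypothetical replicate peak areas:

```python
import statistics

def percent_rsd(values: list[float]) -> float:
    """Relative standard deviation (%) = 100 * sample std dev / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate peak areas from six injections of one standard:
replicates = [10512, 10488, 10533, 10476, 10501, 10520]
print(f"%RSD = {percent_rsd(replicates):.2f}")
```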

  1. Development of a Simultaneous Extraction and Cleanup Method for Pyrethroid Pesticides from Indoor House Dust Samples

    EPA Science Inventory

    An efficient and reliable analytical method was developed for the sensitive and selective quantification of pyrethroid pesticides (PYRs) in house dust samples. The method is based on selective pressurized liquid extraction (SPLE) of the dust-bound PYRs into dichloromethane (DCM) wi...

  2. An overview of recent developments and current status of gluten ELISA methods

    Technology Transfer Automated Retrieval System (TEKTRAN)

    ELISA methods for detecting and quantitating allergens have been around for some time and they are continuously improved. In this context, the development of gluten methods is no exception. Around the turn of the millennium, doubts were raised whether the existing “Skerritt-ELISA” would meet the 20 ...

  3. Recommendations for Developing Alternative Test Methods for Screening and Prioritization of Chemicals for Developmental Neurotoxicity

    EPA Science Inventory

    Developmental neurotoxicity testing (DNT) is perceived by many stakeholders to be an area in critical need of alternative methods to current animal testing protocols and guidelines. An immediate goal is to develop test methods that are capable of screening large numbers of chemic...

  4. The Effectiveness of the Socratic Method in Developing Critical Thinking Skills in English Language Learners

    ERIC Educational Resources Information Center

    Jensen, Roger D., Jr.

    2015-01-01

    Critical thinking skills are an important topic of the United States' education system. This study examines the literature on critical thinking skills and defines them. The study also explores one specific teaching and assessment strategy known as the Socratic Method. The five-week research study used the Socratic Method for developing critical…

  5. 76 FR 9534 - Development of Technical Guidelines and Scientific Methods for Quantifying GHG Emissions and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-18

    ...Section 2709 of the Food, Conservation and Energy Act of 2008 states that: USDA shall prepare technical guidelines that outline science-based methods to measure the carbon benefits from conservation and land management activities. In accordance with Section 2709 of the 2008 Farm Bill, USDA is developing technical guidelines and science- based methods to quantify greenhouse gas sources and......

  6. DEVELOPMENT AND APPLICATION OF METHODS TO ASSESS HUMAN EXPOSURE TO PESTICIDES

    EPA Science Inventory

    Note: this task is scheduled to end September 2003. Two tasks will take its place: method development for emerging pesticides including chiral chemistry applications, and in-house laboratory operations. Field sampling methods are covered under a new task proposed this year.

  7. Using Mixed Methods to Analyze Video Data: A Mathematics Teacher Professional Development Example

    ERIC Educational Resources Information Center

    DeCuir-Gunby, Jessica T.; Marshall, Patricia L.; McCulloch, Allison W.

    2012-01-01

    This article uses data from 65 teachers participating in a K-2 mathematics professional development research project as an example of how to analyze video recordings of teachers' classroom lessons using mixed methods. Through their discussion, the authors demonstrate how using a mixed methods approach to classroom video analysis allows researchers…

  8. Pathways to Lean Software Development: An Analysis of Effective Methods of Change

    ERIC Educational Resources Information Center

    Hanson, Richard D.

    2014-01-01

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…

  9. 75 FR 9894 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    .... This designation is made under the provisions of 40 CFR part 53, as amended on November 12, 2008 (73 FR... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods: Designation of One New Equivalent Method AGENCY: Environmental Protection Agency. ACTION: Notice of...

  10. 75 FR 51039 - Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-18

    ... provisions of 40 CFR Part 53, as amended on November 12, 2008 (73 FR 67057-67059). The new PM 10 equivalent... AGENCY Office of Research and Development; Ambient Air Monitoring Reference and Equivalent Methods: Designation of Two New Equivalent Methods AGENCY: Environmental Protection Agency. ACTION: Notice of...

  11. RESEARCH TOWARDS DEVELOPING METHODS FOR SELECTED PHARMACEUTICAL AND PERSONAL CARE PRODUCTS (PPCPS) ADAPTED FOR BIOSOLIDS

    EPA Science Inventory

    Development, standardization, and validation of analytical methods provides state-of-the-science techniques to evaluate the presence, or absence, of select PPCPs in biosolids. This research provides the approaches, methods, and tools to assess the exposures and redu...

  12. Wellbeing Research in Developing Countries: Reviewing the Role of Qualitative Methods

    ERIC Educational Resources Information Center

    Camfield, Laura; Crivello, Gina; Woodhead, Martin

    2009-01-01

    The authors review the contribution of qualitative methods to exploring concepts and experiences of wellbeing among children and adults living in developing countries. They provide examples illustrating the potential of these methods for gaining a holistic and contextual understanding of people's perceptions and experiences. Some of these come…

  13. Development of a two-step touch method for website navigation on smartphones.

    PubMed

    Jung, Kihyo; Jang, Jinah

    2015-05-01

    The touch method for hyperlink selection in smartphones can often create usability problems because a hyperlink is typically smaller than the finger contact area and is visually occluded by the finger while pressing. In this study, we developed a two-step touch method (called the Press and Flick method) and comprehensively examined its effectiveness using the goals, operators, methods, and selection rules (GOMS) model and user testing. The two-step touch method consisted of finger press and flick motions; a target hyperlink was selected by a finger press motion, and a finger flick motion was subsequently performed for error correction if the initial interaction (press) failed. We compared the two-step touch method with the current touch method through the GOMS model and user testing. As a result, the two-step touch method was significantly superior to the current touch method in terms of error rate and subjective satisfaction score; however, its superiority in terms of number of interactions and touch time depended strongly on the error rate. The two-step touch method developed in this study can improve the usability and user experience of website navigation using smartphones. PMID:25683542
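
    A keystroke-level way to see the trade-off the GOMS comparison captures is to weigh the extra step of the press-and-flick interaction against the cost of correcting a missed single tap. The operator times and error rates below are illustrative assumptions, not values reported in the study:

```python
# Keystroke-level, GOMS-style comparison of expected hyperlink-selection
# time. Operator times and error rates are illustrative assumptions,
# not values from the cited study.

def expected_time(base_s: float, error_rate: float, correction_s: float) -> float:
    """Expected time per successful selection when a failed first attempt
    costs an extra correction step."""
    return base_s + error_rate * correction_s

tap_base, tap_err, tap_fix = 1.10, 0.15, 2.50   # single tap; re-navigate on a miss
pf_base,  pf_err,  pf_fix  = 1.35, 0.03, 0.40   # press + quick flick correction

print(f"single tap : {expected_time(tap_base, tap_err, tap_fix):.2f} s")
print(f"press+flick: {expected_time(pf_base, pf_err, pf_fix):.2f} s")
```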

  14. Online Learning Communities and Teacher Professional Development: Methods for Improved Education Delivery

    ERIC Educational Resources Information Center

    Lindberg, J. Ola, Ed.; Olofsson, Anders D., Ed.

    2009-01-01

    In today's society, the professional development of teachers is urgent due to the constant change in working conditions and the impact that information and communication technologies have in teaching practices. "Online Learning Communities and Teacher Professional Development: Methods for Improved Education Delivery" features innovative…

  15. FODEM: A Multi-Threaded Research and Development Method for Educational Technology

    ERIC Educational Resources Information Center

    Suhonen, Jarkko; de Villiers, M. Ruth; Sutinen, Erkki

    2012-01-01

    Formative development method (FODEM) is a multithreaded design approach that was originated to support the design and development of various types of educational technology innovations, such as learning tools, and online study programmes. The threaded and agile structure of the approach provides flexibility to the design process. Intensive…

  16. New methods in mammary gland development and cancer: proteomics, epigenetics, symmetric division and metastasis

    PubMed Central

    2012-01-01

    The European Network for Breast Development and Cancer (ENBDC) meeting on 'Methods in Mammary Gland Development and Cancer' has become an annual international rendezvous for scientists with interests in the normal and neoplastic breast. The fourth meeting in this series, held in April in Weggis, Switzerland, focused on proteomics, epigenetics, symmetric division, and metastasis. PMID:22809213

  17. DEVELOPMENT OF LOW-DIFFUSION FLUX-SPLITTING METHODS FOR DENSE GAS-SOLID FLOWS

    EPA Science Inventory

    The development of a class of low-diffusion upwinding methods for computing dense gas-solid flows is presented in this work. An artificial compressibility/low-Mach preconditioning strategy is developed for a hyperbolic two-phase flow equation system consisting of separate solids ...

  18. Linking Faculty Development to Community College Student Achievement: A Mixed Methods Approach

    ERIC Educational Resources Information Center

    Elliott, Robert W.; Oliver, Diane E.

    2016-01-01

    Using a mixed methods, multilevel research design, this pilot inquiry explored the relationship between college faculty professional development and the academic achievement of diverse students by coupling two separate links: (a) the effects that professional development activities have on improving teaching strategies, and (b) the effects these…

  19. Agile methods in biomedical software development: a multi-site experience report

    PubMed Central

    Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A

    2006-01-01

    Background Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. Results We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. Conclusion We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods. PMID:16734914

  20. Development of direct-inverse 3-D method for applied aerodynamic design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1987-01-01

    The primary tasks performed were the continued development of inverse design procedures for the TAWFIVE code, the development of corresponding relofting and trailing edge closure procedures, and the testing of the methods for a variety of cases. The period from July 1, 1986 through December 31, 1986 is covered.

  1. STANDARD OPERATING PROCEDURE FOR QUALITY ASSURANCE IN ANALYTICAL CHEMISTRY METHODS DEVELOPMENT

    EPA Science Inventory

    The Environmental Protection Agency's (EPA) Office of Research and Development (ORD) is engaged in the development, demonstration, and validation of new or newly adapted methods of analysis for environmentally related samples. Recognizing that a "one size fits all" approach to qu...

  2. A Framework for Mixing Methods in Quantitative Measurement Development, Validation, and Revision: A Case Study

    ERIC Educational Resources Information Center

    Luyt, Russell

    2012-01-01

    A framework for quantitative measurement development, validation, and revision that incorporates both qualitative and quantitative methods is introduced. It extends and adapts Adcock and Collier's work, and thus, facilitates understanding of quantitative measurement development, validation, and revision as an integrated and cyclical set of…

  3. Cross Sectional Study of Agile Software Development Methods and Project Performance

    ERIC Educational Resources Information Center

    Lambert, Tracy

    2011-01-01

    Agile software development methods, characterized by delivering customer value via incremental and iterative time-boxed development processes, have moved into the mainstream of the Information Technology (IT) industry. However, despite a growing body of research which suggests that a predictive manufacturing approach, with big up-front…

  4. An Auxiliary Method To Reduce Potential Adverse Impacts Of Projected Land Developments: Subwatershed Prioritization

    EPA Science Inventory

    An index-based method is developed that ranks the subwatersheds of a watershed by their relative impact on the watershed's response to anticipated land developments; the method is then applied to an urbanizing watershed in Eastern Pennsylvania. Simulations with a semi-distributed hydrolo...

  5. Development of a Suitable Dissolution Method for the Combined Tablet Formulation of Atorvastatin and Ezetimibe by RP-LC Method.

    PubMed

    Ozkan Cansel, Kose; Ozgur, Esim; Sevinc, Kurbanoglu; Ayhan, Savaser; Ozkan, Sibel A; Yalcin, Ozkan

    2016-01-01

    Pharmaceutical preparations of ezetimibe and atorvastatin are generally used to regulate the lipid level in blood. They decrease secondary events, such as non-fatal or fatal heart attack, for patients with high cholesterol and clinical cardiovascular disease. There is no pharmacopoeia method available for the dissolution testing recommended by the FDA. Development of a dissolution test method is a critical step, especially for pharmaceutical preparations that contain Class II drugs (slightly soluble, good permeability). In the proposed method, the effects of pH and surfactant on the in vitro dissolution of this poorly water-soluble combined drug therapy, whose components have different pKa values, were investigated. Our study was designed to answer these open-ended questions. The optimized test conditions were achieved under sink conditions with USP apparatus 2 at a paddle rotation speed of 75 rpm and 900 ml of 0.01 M acetate buffer (pH 6.8) containing 0.45% SDS as the dissolution medium. Dissolution samples were quantified with a new, fully validated RP-LC method with UV detection at 242 nm. PMID:26638976
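
    Dissolution profiles from apparatus 2 runs such as this are normally reported as cumulative percent dissolved, with a correction for the drug removed in earlier sampling aliquots. The sketch below shows that standard bookkeeping under the assumption that each aliquot is replaced with fresh medium; the concentration values are placeholders, not data from this study.

```python
def cumulative_percent_dissolved(concs_ug_ml, vessel_ml=900.0,
                                 sample_ml=5.0, label_claim_mg=10.0):
    """Cumulative % dissolved, corrected for drug removed in earlier
    sampling aliquots (fresh medium assumed to replace each aliquot)."""
    results, removed_ug = [], 0.0
    for c in concs_ug_ml:
        amount_ug = c * vessel_ml + removed_ug
        results.append(100.0 * amount_ug / (label_claim_mg * 1000.0))
        removed_ug += c * sample_ml
    return results

# Hypothetical concentrations (ug/mL) at 5, 10, 15, 30, 45, 60 min:
profile = cumulative_percent_dissolved([2.1, 4.9, 7.4, 9.6, 10.5, 10.9])
print([f"{p:.1f}%" for p in profile])
```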

  6. Development of a generalized perturbation theory method for sensitivity analysis using continuous-energy Monte Carlo methods

    DOE PAGESBeta

    Perfetti, Christopher M.; Rearden, Bradley T.

    2016-03-01

    The sensitivity and uncertainty analysis tools of the ORNL SCALE nuclear modeling and simulation code system that have been developed over the last decade have proven indispensable for numerous application and design studies for nuclear criticality safety and reactor physics. SCALE contains tools for analyzing the uncertainty in the eigenvalue of critical systems, but cannot quantify uncertainty in important neutronic parameters such as multigroup cross sections, fuel fission rates, activation rates, and neutron fluence rates with realistic three-dimensional Monte Carlo simulations. A more complete understanding of the sources of uncertainty in these design-limiting parameters could lead to improvements in process optimization, reactor safety, and help inform regulators when setting operational safety margins. A novel approach for calculating eigenvalue sensitivity coefficients, known as the CLUTCH method, was recently explored as academic research and has been found to accurately and rapidly calculate sensitivity coefficients in criticality safety applications. The work presented here describes a new method, known as the GEAR-MC method, which extends the CLUTCH theory for calculating eigenvalue sensitivity coefficients to enable sensitivity coefficient calculations and uncertainty analysis for a generalized set of neutronic responses using high-fidelity continuous-energy Monte Carlo calculations. Here, several criticality safety systems were examined to demonstrate proof of principle for the GEAR-MC method, and GEAR-MC was seen to produce response sensitivity coefficients that agreed well with reference direct perturbation sensitivity coefficients.
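
    Once response sensitivity coefficients are available, uncertainty is typically propagated with the "sandwich rule", relative variance = S^T C S, where S holds relative sensitivities and C is the relative covariance matrix of the underlying nuclear data. The sketch below shows only that final step with placeholder numbers; it is not the GEAR-MC implementation.

```python
import numpy as np

# "Sandwich rule" uncertainty propagation: given relative sensitivity
# coefficients S of a response to a set of nuclear-data parameters and a
# relative covariance matrix C for those parameters, the relative variance
# of the response is S^T C S. Numbers are illustrative placeholders.

S = np.array([0.45, -0.20, 0.10])             # relative sensitivities
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-3]])      # relative covariance matrix

rel_var = S @ C @ S
print(f"relative std. dev. of response: {np.sqrt(rel_var) * 100:.3f} %")
```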

  7. Development of gas chromatographic methods for the analyses of organic carbonate-based electrolytes

    NASA Astrophysics Data System (ADS)

    Terborg, Lydia; Weber, Sascha; Passerini, Stefano; Winter, Martin; Karst, Uwe; Nowak, Sascha

    2014-01-01

    In this work, novel methods based on gas chromatography (GC) are presented for the investigation of the common organic carbonate-based electrolyte systems used in lithium ion batteries. The methods were developed for flame ionization detection (FID) and mass spectrometric detection (MS). Further, headspace (HS) sampling for the investigation of solid samples like electrodes is reported. Limits of detection are reported for FID. Finally, the developed methods were applied to the electrolyte systems of commercially available lithium ion batteries as well as to in-house assembled cells.
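
    One common convention (an assumption here; the cited work may define its detection limits differently) estimates the limit of detection from a calibration line as 3.3*sigma/slope and the limit of quantification as 10*sigma/slope, where sigma is the standard error of the regression. A sketch with placeholder calibration points:

```python
import numpy as np

# ICH-style estimate of LOD/LOQ from a calibration line. The calibration
# points below are illustrative placeholders, not data from the cited work.

conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])       # ug/mL
area = np.array([52.0, 101.0, 262.0, 515.0, 1030.0])

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = np.sqrt(np.sum(residuals**2) / (len(conc) - 2))  # std error of the fit

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD ~ {lod:.2f} ug/mL, LOQ ~ {loq:.2f} ug/mL")
```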

  8. Development and validation of a new fallout transport method using variable spectral winds. Doctoral thesis

    SciTech Connect

    Hopkins, A.T.

    1984-09-01

    The purpose of this research was to develop and validate a fallout prediction method using variable transport calculations. The new method uses National Meteorological Center (NMC) spectral coefficients to compute wind vectors along the space- and time-varying trajectories of falling particles. The method was validated by comparing computed and actual cloud trajectories from a Mount St. Helens volcanic eruption and a high dust cloud. In summary, this research demonstrated the feasibility of using spectral coefficients for fallout transport calculations, developed a two-step smearing model to treat variable winds, and showed that uncertainties in spectral winds do not contribute significantly to the error in computed dose rate.
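
    The core of a variable-wind fallout transport calculation is integrating each falling particle's trajectory while sampling the wind at the particle's current altitude and time. The toy sketch below uses a made-up analytic wind field and settling velocity in place of NMC spectral winds, purely to illustrate the bookkeeping:

```python
import math

# Toy trajectory integration for a falling particle advected by a wind
# field that varies with altitude and time. The wind model and settling
# velocity are illustrative placeholders, not NMC spectral winds.

def wind(z_m: float, t_s: float) -> tuple[float, float]:
    """(u, v) wind components in m/s as a toy function of height and time."""
    u = 10.0 + 0.002 * z_m + 2.0 * math.sin(t_s / 3600.0)
    v = 3.0 + 0.001 * z_m
    return u, v

def trajectory(z0_m=10000.0, settling_m_s=1.5, dt_s=60.0):
    x = y = 0.0
    z, t = z0_m, 0.0
    while z > 0.0:
        u, v = wind(z, t)      # wind sampled along the falling trajectory
        x += u * dt_s
        y += v * dt_s
        z -= settling_m_s * dt_s
        t += dt_s
    return x / 1000.0, y / 1000.0, t / 3600.0   # km, km, hours

x_km, y_km, hours = trajectory()
print(f"ground position: ({x_km:.1f} km, {y_km:.1f} km) after {hours:.1f} h")
```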

  9. Development and elaboration of numerical method for simulating gas–liquid–solid three-phase flows based on particle method

    NASA Astrophysics Data System (ADS)

    Takahashi, Ryohei; Mamori, Hiroya; Yamamoto, Makoto

    2016-02-01

    A numerical method for simulating gas-liquid-solid three-phase flows based on the moving particle semi-implicit (MPS) approach was developed in this study. Computational instability often occurs in multiphase flow simulations if the deformations of the free surfaces between different phases are large, among other reasons. To avoid this instability, this paper proposes an improved coupling procedure between different phases in which the physical quantities of particles in different phases are calculated independently. We performed numerical tests on two illustrative problems: a dam-break problem and a solid-sphere impingement problem. The former problem is a gas-liquid two-phase problem, and the latter is a gas-liquid-solid three-phase problem. The computational results agree reasonably well with the experimental results. Thus, we confirmed that the proposed MPS method reproduces the interaction between different phases without inducing numerical instability.
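
    Two generic building blocks of any MPS scheme are the kernel weight w(r) = r_e/r - 1 (for 0 < r < r_e) and the particle number density n_i = sum_j w(|x_j - x_i|), from which the gradient and Laplacian models are constructed. The sketch below shows only these standard pieces, not the coupled gas-liquid-solid procedure of this paper:

```python
import numpy as np

# Standard MPS kernel weight and particle number density. This is generic
# MPS bookkeeping, not the coupled three-phase scheme of the cited paper.

def weight(r: float, r_e: float) -> float:
    """Standard MPS kernel: w(r) = r_e/r - 1 for 0 < r < r_e, else 0."""
    return r_e / r - 1.0 if 0.0 < r < r_e else 0.0

def number_density(positions: np.ndarray, i: int, r_e: float) -> float:
    """n_i = sum over neighbours j of w(|x_j - x_i|)."""
    xi = positions[i]
    return sum(weight(np.linalg.norm(xj - xi), r_e)
               for j, xj in enumerate(positions) if j != i)

# Particles on a small 2D lattice with spacing l0 and r_e = 2.1*l0:
l0 = 0.01
xs, ys = np.meshgrid(np.arange(5) * l0, np.arange(5) * l0)
pts = np.column_stack([xs.ravel(), ys.ravel()])
print(f"n0 at the centre particle: {number_density(pts, 12, 2.1 * l0):.3f}")
```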

  10. Method of obtaining intensified image from developed photographic films and plates

    NASA Technical Reports Server (NTRS)

    Askins, B. S. (Inventor)

    1978-01-01

    A method of obtaining intensified images from silver images on developed photographic films and plates is explained. The steps involve converting silver of the developed film or plate to a radioactive compound by treatment with an aqueous alkaline solution of an organo-S35 compound; placing the treated film or plate in direct contact with a receiver film, which is then exposed by radiation from the activated film; and developing and fixing the resulting intensified image on the receiver film.

  11. The implementation method and the development tendency of infrared stealth technology

    NASA Astrophysics Data System (ADS)

    Lu, Jianhua; Wang, Ruifeng

    2015-10-01

    The theory of infrared stealth is briefly introduced, and two kinds of methods for achieving infrared stealth are derived. Infrared stealth measures adopted around the world are then described in detail. It is pointed out that the development of infrared stealth materials is the basis for the development of infrared stealth technology, and the performance characteristics of these materials are described in detail. Finally, the development trends of infrared stealth technology are analyzed.

  12. X-RAY FLUORESCENCE ANALYSIS OF HANFORD LOW ACTIVITY WASTE SIMULANTS METHOD DEVELOPMENT

    SciTech Connect

    Jurgensen, A; David Missimer, D; Ronny Rutherford, R

    2007-08-08

    The x-ray fluorescence laboratory (XRF) in the Analytical Development Directorate (ADD) of the Savannah River National Laboratory (SRNL) was requested to develop an x-ray fluorescence spectrometry method for elemental characterization of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) pretreated low activity waste (LAW) stream to the LAW Vitrification Plant. The WTP is evaluating the potential for using XRF as a rapid turnaround technique to support LAW product compliance and glass former batching. The overall objective of this task was to develop an XRF analytical method that provides rapid turnaround time (<8 hours), while providing sufficient accuracy and precision to determine variations in waste.

  13. Unstructured-grid methods development for unsteady aerodynamic and aeroelastic analyses

    NASA Technical Reports Server (NTRS)

    Batina, John T.; Lee, Elizabeth M.; Kleb, William L.; Rausch, Russ D.

    1991-01-01

    The current status of unstructured grid methods development in the Unsteady Aerodynamics Branch at NASA-Langley is described. These methods are being developed for unsteady aerodynamic and aeroelastic analyses. The flow solvers are highlighted which were developed for the solution of the unsteady Euler equations and selected results are given which show various features of the capability. The results demonstrate 2-D and 3-D applications for both steady and unsteady flows. Comparisons are also made with solutions obtained using a structured grid code and with experimental data to determine the accuracy of the unstructured grid methodology. These comparisons show good agreement which thus verifies the accuracy.

  14. Unstructured-grid methods development for unsteady aerodynamic and aeroelastic analyses

    NASA Technical Reports Server (NTRS)

    Batina, John T.; Lee, Elizabeth M.; Kleb, William L.; Rausch, Russ D.

    1992-01-01

    The current status of unstructured grid methods developed in the Unsteady Aerodynamics Branch at NASA Langley Research Center is described. These methods are being developed for unsteady aerodynamic and aeroelastic analyses. Flow solvers that have been developed for the solution of unsteady Euler equations are highlighted. The results demonstrate two and three dimensional applications for both steady and unsteady flows. Comparisons are also made with solutions obtained using a structured grid code and with experimental data to determine the accuracy of the unstructured grid methodology. These comparisons show good agreement which thus verifies the accuracy.

  15. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1989-01-01

    An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  16. Development of a Hybrid RANS/LES Method for Compressible Mixing Layer Simulations

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli

    2001-01-01

    A hybrid method has been developed for simulations of compressible turbulent mixing layers. Such mixing layers dominate the flows in exhaust systems of modern-day aircraft and also those of hypersonic vehicles currently under development. The hybrid method uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall bounded regions entering a mixing section, and a Large Eddy Simulation (LES) procedure to calculate the mixing dominated regions. A numerical technique was developed to enable the use of the hybrid RANS/LES method on stretched, non-Cartesian grids. The hybrid RANS/LES method is applied to a benchmark compressible mixing layer experiment. Preliminary two-dimensional calculations are used to investigate the effects of axial grid density and boundary conditions. Actual LES calculations, performed in three spatial directions, indicated an initial vortex shedding followed by rapid transition to turbulence, which is in agreement with experimental observations.

  17. Improving measurement methods in rehabilitation: core concepts and recommendations for scale development.

    PubMed

    Velozo, Craig A; Seel, Ronald T; Magasi, Susan; Heinemann, Allen W; Romero, Sergio

    2012-08-01

    Validated measurement scales are essential to evaluating clinical outcomes and conducting meaningful and reliable research. The purpose of this article is to present the clinician and researcher with a contemporary 8-stage framework for measurement scale development based on a mixed-methods qualitative and quantitative approach. Core concepts related to item response theory are presented. Qualitative methods are described to conceptualize scale constructs; obtain patient, family, and other stakeholder perspectives; and develop item pools. Item response theory statistical methodologies are presented, including approaches for testing the assumptions of unidimensionality, local independence, monotonicity, and indices of model fit. Lastly, challenges faced by scale developers in implementing these methodologies are discussed. While rehabilitation research has recently started to apply mixed-methods qualitative and quantitative methodologies to scale development, these approaches show considerable promise in advancing rehabilitation measurement. PMID:22840881
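
    The item response theory concepts mentioned above (unidimensionality, monotonicity, model fit) are easiest to see in the simplest IRT model, the one-parameter logistic (Rasch) item response function P(theta, b) = 1/(1 + exp(-(theta - b))). The sketch below evaluates it for a few arbitrary item difficulties to illustrate monotonicity in ability; it is not the scale-development workflow described in the article:

```python
import math

# One-parameter logistic (Rasch) item response function:
#   P(correct | theta, b) = 1 / (1 + exp(-(theta - b)))
# shown only to illustrate the monotonicity assumption; item
# difficulties below are arbitrary examples.

def rasch_p(theta: float, difficulty: float) -> float:
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

item_difficulties = [-1.0, 0.0, 1.5]
for theta in (-2.0, 0.0, 2.0):
    probs = [f"{rasch_p(theta, b):.2f}" for b in item_difficulties]
    print(f"ability {theta:+.1f}: item probabilities {probs}")
```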

  18. Pathways to lean software development: An analysis of effective methods of change

    NASA Astrophysics Data System (ADS)

    Hanson, Richard D.

    This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software, and these attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Proponents of the new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.

  19. Development of Advanced Methods of Structural and Trajectory Analysis for Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Ardema, Mark D.

    1996-01-01

    In this report the author describes: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of flight path optimization. A method of estimating the load-bearing fuselage weight and wing weight of transport aircraft based on fundamental structural principles has been developed. This method of weight estimation represents a compromise between the rapid assessment of component weight using empirical methods based on actual weights of existing aircraft and detailed, but time-consuming, analysis using the finite element method. The method was applied to eight existing subsonic transports for validation and correlation. The resulting computer program, PDCYL, has been integrated into the weights-calculating module of the AirCraft SYNThesis (ACSYNT) computer program. ACSYNT has traditionally used only empirical weight estimation methods; PDCYL adds to ACSYNT a rapid, accurate means of assessing the fuselage and wing weights of unconventional aircraft. PDCYL also allows flexibility in the choice of structural concept, as well as a direct means of determining the impact of advanced materials on structural weight.

  20. A story/dialogue method for health promotion knowledge development and evaluation.

    PubMed

    Labonte, R; Feather, J; Hills, M

    1999-02-01

    Arguments have been made in favour of a constructivist or postpositivist approach to health promotion knowledge development and program evaluation, but little has been articulated about what such an approach would look like. This article describes a 'story/dialogue method' that was created with and for practitioners in response to their concerns that much of their practice did not lend itself to a positivist, or conventional, methodology. Derived from constructivist, feminist and critical pedagogical theory, and with roots in qualitative methods, the method structures group dialogue around case stories addressing particular generative practice themes. While intended for practitioner training, organizational development and evaluation, the method to date has been used primarily for training purposes. This article describes the method, provides an example of its application, and discusses its strengths, weaknesses and relevance to health promotion. PMID:10537946

  1. HPLC method development for evolving applications in the pharmaceutical industry and nanoscale chemistry

    NASA Astrophysics Data System (ADS)

    Castiglione, Steven Louis

    As scientific research trends towards trace levels and smaller architectures, the analytical chemist is often faced with the challenge of quantitating such species in a variety of matrices. The challenge is heightened when the analytes prove to be potentially toxic or possess physical or chemical properties that make traditional analytical methods problematic. In such cases, the successful development of an acceptable quantitative method plays a critical role in the ability to further develop the species under study. This is particularly true for pharmaceutical impurities and nanoparticles (NP). The first portion of the research focuses on the development of a part-per-billion level HPLC method for a substituted phenazine-class pharmaceutical impurity. The development of this method was required due to the need for a rapid methodology to quantitatively determine levels of a potentially toxic phenazine moiety in order to ensure patient safety. As the synthetic pathway for the active ingredient was continuously refined to produce progressively lower amounts of the phenazine impurity, increasingly sensitive quantitative methods were required. The approaches evolved across four discrete methods, each employing a unique scheme for analyte detection. All developed methods were evaluated with regard to accuracy, precision, and linear adherence, as well as ancillary benefits and detriments -- e.g., one method in this evolution demonstrated the ability to resolve and detect other species from the phenazine class. The second portion of the research focuses on the development of an HPLC method for the quantitative determination of NP size distributions. The current methodology for the determination of NP sizes employs transmission electron microscopy (TEM), which requires sample drying without particle size alteration and which, in many cases, may prove infeasible due to cost or availability. The feasibility of an HPLC method for NP size characterizations evolved

  2. Development of an Innovative Algorithm for Aerodynamics-Structure Interaction Using Lattice Boltzmann Method

    NASA Technical Reports Server (NTRS)

    Mei, Ren-Wei; Shyy, Wei; Yu, Da-Zhi; Luo, Li-Shi; Rudy, David (Technical Monitor)

    2001-01-01

    The lattice Boltzmann equation (LBE) is a kinetic formulation which offers an alternative computational method capable of solving fluid dynamics for various systems. Major advantages of the method are owing to the fact that the solution for the particle distribution functions is explicit, easy to implement, and the algorithm is natural to parallelize. In this final report, we summarize the works accomplished in the past three years. Since most works have been published, the technical details can be found in the literature. Brief summary will be provided in this report. In this project, a second-order accurate treatment of boundary condition in the LBE method is developed for a curved boundary and tested successfully in various 2-D and 3-D configurations. To evaluate the aerodynamic force on a body in the context of LBE method, several force evaluation schemes have been investigated. A simple momentum exchange method is shown to give reliable and accurate values for the force on a body in both 2-D and 3-D cases. Various 3-D LBE models have been assessed in terms of efficiency, accuracy, and robustness. In general, accurate 3-D results can be obtained using LBE methods. The 3-D 19-bit model is found to be the best one among the 15-bit, 19-bit, and 27-bit LBE models. To achieve desired grid resolution and to accommodate the far field boundary conditions in aerodynamics computations, a multi-block LBE method is developed by dividing the flow field into various blocks each having constant lattice spacing. Substantial contribution to the LBE method is also made through the development of a new, generalized lattice Boltzmann equation constructed in the moment space in order to improve the computational stability, detailed theoretical analysis on the stability, dispersion, and dissipation characteristics of the LBE method, and computational studies of high Reynolds number flows with singular gradients. Finally, a finite difference-based lattice Boltzmann method is
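
    Since this summary does not reproduce the solvers developed in the project, the following is only a minimal, self-contained D2Q9 BGK sketch (grid size, relaxation time, obstacle placement, and periodic boundaries are arbitrary choices) illustrating two ideas mentioned above: the explicit collide-and-stream update of the distribution functions and a momentum-exchange evaluation of the force on a body.

    ```python
    # Minimal D2Q9 BGK lattice Boltzmann sketch (illustrative only; not the
    # solver developed in this project).  It shows the explicit collide/stream
    # update and a momentum-exchange force evaluation on an obstacle.
    # Boundaries are periodic and all parameters are arbitrary lattice units.
    import numpy as np

    c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
                  [1, 1], [-1, 1], [-1, -1], [1, -1]])      # D2Q9 velocities
    w = np.array([4/9] + [1/9]*4 + [1/36]*4)                # lattice weights
    opp = np.array([0, 3, 4, 1, 2, 7, 8, 5, 6])             # opposite directions

    def equilibrium(rho, u):
        cu = np.einsum('qd,dxy->qxy', c, u)
        usq = np.einsum('dxy,dxy->xy', u, u)
        return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

    nx, ny, tau = 80, 40, 0.8
    solid = np.zeros((nx, ny), bool)
    solid[30:36, 17:23] = True                              # square obstacle

    rho = np.ones((nx, ny))
    u = np.zeros((2, nx, ny))
    u[0] += 0.05                                            # small uniform flow
    f = equilibrium(rho, u)

    for step in range(200):
        rho = f.sum(axis=0)
        u = np.einsum('qd,qxy->dxy', c, f) / rho
        f_post = f - (f - equilibrium(rho, u)) / tau        # BGK collision

        # streaming
        f = np.stack([np.roll(f_post[q], tuple(c[q]), axis=(0, 1)) for q in range(9)])

        # halfway bounce-back plus momentum-exchange force on the obstacle
        force = np.zeros(2)
        for q in range(9):
            hits = (~solid) & np.roll(solid, tuple(-c[q]), axis=(0, 1))
            f[opp[q]][hits] = f_post[q][hits]               # reflect back into the fluid
            force += 2.0 * f_post[q][hits].sum() * c[q]     # momentum given to the wall

    print("force on obstacle (x, y), lattice units:", force)
    ```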

  3. Emerging analytical methods to determine gluten markers in processed foods—method development in support of standard setting

    PubMed Central

    Weber, Dorcas; Cléroux, Chantal

    2009-01-01

    The availability of analytical methods to detect and determine levels of markers of priority allergens in foods is of the utmost importance to support standard setting initiatives, the development of compliance and enforcement activities, as well as to provide guidance to industry on implementation of quality control practices, ensuring the effectiveness of allergen-related sanitation techniques. This paper describes the development and implementation of a mass-spectrometry-based technique to determine markers for individual sources of gluten in beer products. This methodology was shown to answer the requirements of Health Canada’s proposed labeling standard for individual gluten source declaration, in order to achieve its policy objectives (i.e., protection of sensitive consumers, while promoting choice). Minimal sample work-up was required and the results obtained by ELISA were further complemented using the LC-MS/MS method. This paper aims to demonstrate the feasibility of alternative techniques to ELISA-based methodologies to determine allergen and gluten markers in food. PMID:19636545

  4. Basic study to develop an electromagnetic drive method for the rotary undulation pump.

    PubMed

    Abe, Yusuke; Chinzei, Tsuneo; Isoyama, Takashi; Saito, Itsuro; Ono, Toshiya; Mochizuki, Shuichi; Kouno, Akimasa; Imachi, Kou

    2003-10-01

    The rotary undulation pump, which is composed of a disk with a convex shape on both sides and a pump housing with one narrow side and one wide side, is a unique continuous-flow pump based on a new principle. The concept of a levitation drive method for this pump was proposed. An electromagnetic driver model and drive circuit were developed to examine the feasibility of, and the differences among, the delta-wired, Y-wired, and repulsion methods. In the repulsion method, the disk is driven by magnetic repulsion. The model could be driven with each of the methods, demonstrating that the repulsion method is also feasible. With all three methods, owing to the wide gap between the permanent magnets and coils, the output was not sufficient when the load was high. The efficiency was almost the same for the delta-wired and Y-wired methods; in the repulsion method, however, it was less than 50% of that in the other two methods. From these results, the delta-wired and Y-wired methods with active control of the gap distance were considered better than the repulsion method, which requires no active gap control. PMID:14616528

  5. Development of a Probabilistic Component Mode Synthesis Method for the Analysis of Non-Deterministic Substructures

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1995-01-01

    Standard methods of structural dynamic analysis assume that the structural characteristics are deterministic. Recognizing that these characteristics are actually statistical in nature, researchers have recently developed a variety of methods that use this information to determine probabilities of a desired response characteristic, such as natural frequency, without using expensive Monte Carlo simulations. One of the problems in these methods is correctly identifying the statistical properties of primitive variables such as geometry, stiffness, and mass. This paper presents a method where the measured dynamic properties of substructures are used instead as the random variables. The residual flexibility method of component mode synthesis is combined with the probabilistic methods to determine the cumulative distribution function of the system eigenvalues. A simple cantilever beam test problem is presented that illustrates the theory.
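
    The paper's contribution is an analytical combination of residual flexibility component mode synthesis with probabilistic methods, so that expensive Monte Carlo simulations are avoided. Purely to illustrate what a cumulative distribution function of a system eigenvalue means, the sketch below runs the brute-force Monte Carlo baseline on a toy two-degree-of-freedom spring-mass system with randomly perturbed stiffnesses; all numbers are invented and the paper's actual formulation is not reproduced.

    ```python
    # Brute-force Monte Carlo illustration only (the paper's method obtains the
    # eigenvalue CDF analytically, without Monte Carlo): lowest eigenvalue of a
    # toy 2-DOF spring-mass system with random stiffnesses.  All numbers are
    # invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = 5000
    lam1 = np.empty(n_samples)

    for i in range(n_samples):
        k1, k2 = rng.normal(1000.0, 50.0), rng.normal(800.0, 40.0)  # random stiffnesses
        M = np.diag([2.0, 1.5])                                     # deterministic masses
        K = np.array([[k1 + k2, -k2], [-k2, k2]])
        lam1[i] = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)[0]

    # empirical cumulative distribution function P(lambda_1 <= x)
    xs = np.linspace(lam1.min(), lam1.max(), 5)
    cdf = [(lam1 <= x).mean() for x in xs]
    print(list(zip(np.round(xs, 1), cdf)))
    ```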

  6. Development and Evaluation of the Method with an Affective Interface for Promoting Employees' Morale

    NASA Astrophysics Data System (ADS)

    Fujino, Hidenori; Ishii, Hirotake; Shimoda, Hiroshi; Yoshikawa, Hidekazu

    A sustainable society requires organizational management that is not based on mass production and mass consumption but has the flexibility to meet various social needs precisely. Realizing such management requires employees' work morale; recently, however, employees' work morale has tended to decrease. In this study, therefore, the authors developed a model of a method for promoting and maintaining employees' work morale effectively and efficiently. In particular, the authors treated "work morale" as an "attitude toward the work". Based on this idea, the theory of persuasion psychology and various persuasion techniques could be applied. A model of the method employing a character agent was therefore developed, based on forced compliance, a persuasion technique grounded in the theory of cognitive dissonance. An evaluation experiment with human subjects confirmed that the developed method could improve workers' work morale effectively.

  7. Development of Implicit Methods in CFD NASA Ames Research Center 1970's - 1980's

    NASA Technical Reports Server (NTRS)

    Pulliam, Thomas H.

    2010-01-01

    The focus here is on the early development (mid 1970's-1980's) at NASA Ames Research Center of implicit methods in Computational Fluid Dynamics (CFD). A class of implicit finite difference schemes of the Beam and Warming approximate factorization type will be addressed. The emphasis will be on the Euler equations. A review of material pertinent to the solution of the Euler equations within the framework of implicit methods will be presented. The eigensystem of the equations will be used extensively in developing a framework for various methods applied to the Euler equations. The development and analysis of various aspects of this class of schemes will be given along with the motivations behind many of the choices. Various acceleration and efficiency modifications such as matrix reduction, diagonalization and flux split schemes will be presented.
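
    The Beam-Warming approximate-factorization schemes discussed above reduce the multidimensional implicit operator to sequences of (block-)tridiagonal solves along each coordinate direction. As a much-simplified scalar analogue, not the Euler scheme itself, the sketch below applies backward-Euler time integration with central differencing to 1-D linear advection and solves the resulting tridiagonal system with the Thomas algorithm, using a time step above the explicit CFL limit.

    ```python
    # A scalar 1-D analogue of an implicit (tridiagonal) scheme, not the
    # Beam-Warming Euler solver itself: backward-Euler time integration and
    # central differences for u_t + a u_x = 0, solved with the Thomas algorithm.
    import numpy as np

    def thomas(sub, main, sup, rhs):
        """Solve a tridiagonal system (sub-, main, super-diagonal, right-hand side)."""
        n = len(main)
        cp, dp = np.empty(n), np.empty(n)
        cp[0], dp[0] = sup[0] / main[0], rhs[0] / main[0]
        for i in range(1, n):
            m = main[i] - sub[i] * cp[i - 1]
            cp[i] = sup[i] / m
            dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
        x = np.empty(n)
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    nx, a = 101, 1.0
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    dt = 2.0 * dx / a                    # larger than the explicit CFL limit
    s = a * dt / (2.0 * dx)

    u = np.exp(-200.0 * (x - 0.3) ** 2)  # Gaussian pulse; end values held fixed
    for _ in range(20):
        n_int = nx - 2
        rhs = u[1:-1].copy()
        rhs[0] += s * u[0]               # fold the fixed boundary values into the RHS
        rhs[-1] -= s * u[-1]
        u[1:-1] = thomas(np.full(n_int, -s), np.ones(n_int), np.full(n_int, s), rhs)

    print("pulse peak is now near x =", x[np.argmax(u)])
    ```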

  8. Development of a turbomachinery design optimization procedure using a multiple-parameter nonlinear perturbation method

    NASA Technical Reports Server (NTRS)

    Stahara, S. S.

    1984-01-01

    An investigation was carried out to complete the preliminary development of a combined perturbation/optimization procedure and associated computational code for designing optimized blade-to-blade profiles of turbomachinery blades. The overall purpose of the procedures developed is to provide demonstration of a rapid nonlinear perturbation method for minimizing the computational requirements associated with parametric design studies of turbomachinery flows. The method combines the multiple parameter nonlinear perturbation method, successfully developed in previous phases of this study, with the NASA TSONIC blade-to-blade turbomachinery flow solver, and the COPES-CONMIN optimization procedure into a user's code for designing optimized blade-to-blade surface profiles of turbomachinery blades. Results of several design applications and a documented version of the code together with a user's manual are provided.

  9. Evaluation and development plan of NRTA measurement methods for the Rokkasho Reprocessing Plant

    SciTech Connect

    Li, T.K.; Hakkila, E.A.; Flosterbuer, S.F.

    1995-08-01

    Near-real-time accounting (NRTA) has been proposed as a safeguards method at the Rokkasho Reprocessing Plant (RRP), a large-scale commercial facility for reprocessing spent fuel from boiling water and pressurized water reactors. NRTA for RRP requires material balance closures every month. To develop a more effective and practical NRTA system for RRP, we have evaluated NRTA measurement techniques and systems that might be implemented in both the main process and the co-denitration process areas at RRP to analyze the concentrations of plutonium in solutions and mixed oxide powder. Based on the comparative evaluation, including performance, reliability, design criteria, operation methods, maintenance requirements, and estimated costs for each possible measurement method, recommendations for development were formulated. This paper discusses the evaluations and reports the recommended NRTA development plan for potential implementation at RRP.

  10. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers about SFC has increased drastically, but little of this work has focused on the quantitative performance of the technique. In order to demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method, from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of a robust method. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Even though UHPLC showed better precision and sensitivity, the UHPSFC method is able to give accurate results in a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and could be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC). PMID:24513349

  11. Development of Matched (migratory Analytical Time Change Easy Detection) Method for Satellite-Tracked Migratory Birds

    NASA Astrophysics Data System (ADS)

    Doko, Tomoko; Chen, Wenbo; Higuchi, Hiroyoshi

    2016-06-01

    Satellite tracking technology has been used to reveal the migration patterns and flyways of migratory birds. In general, bird migration can be classified according to migration status; these statuses include the wintering period, spring migration, breeding period, and autumn migration. To determine migration status, the periods of these statuses must be determined individually, but there is no objective method to define a 'threshold date' at which an individual bird changes its status. The research objective was to develop an effective and objective method to determine threshold dates of migration status based on satellite-tracked data. The developed method was named the MATCHED (Migratory Analytical Time Change Easy Detection) method. To demonstrate the method, data acquired from satellite-tracked Tundra Swans were used. The MATCHED method is composed of six steps: 1) dataset preparation, 2) time frame creation, 3) automatic identification, 4) visualization of change points, 5) interpretation, and 6) manual correction. Its accuracy was tested. In general, the MATCHED method proved powerful for identifying change points between migration statuses as well as stopovers. Nevertheless, identifying "exact" threshold dates is still challenging. Limitations and applications of this method are discussed.
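
    The abstract lists the six steps of MATCHED without algorithmic detail, so the sketch below is only an assumed illustration of the "automatic identification" idea: compute daily displacement from tracked positions, smooth it over a sliding time frame, and flag the dates where the bird switches between sedentary and migratory behaviour. The data format, the 5-day window, and the 50 km/day threshold are all hypothetical.

    ```python
    # Illustrative change-point sketch for satellite-tracked positions.
    # Data layout, window length, and the 50 km/day threshold are assumptions.
    import math
    from datetime import date, timedelta

    def haversine_km(p, q):
        """Great-circle distance between two (lat, lon) points in km."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371.0 * math.asin(math.sqrt(a))

    def threshold_dates(track, window=5, step_km=50.0):
        """track: list of (date, (lat, lon)) sorted by date."""
        days = [d for d, _ in track]
        disp = [haversine_km(track[i - 1][1], track[i][1]) for i in range(1, len(track))]
        smooth = [sum(disp[max(0, i - window):i + 1]) / len(disp[max(0, i - window):i + 1])
                  for i in range(len(disp))]
        moving = [s > step_km for s in smooth]
        # report the dates where the moving/sedentary state flips
        return [days[i + 1] for i in range(1, len(moving)) if moving[i] != moving[i - 1]]

    # toy track: stationary, then a rapid northward movement, then stationary again
    track = [(date(2016, 3, 1) + timedelta(days=i),
              (35.0 + (0.0 if i < 20 else min(i - 20, 15) * 1.0), 140.0))
             for i in range(45)]
    print(threshold_dates(track))
    ```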

  12. Development of the residential case-specular epidemiologic investigation method. Final report

    SciTech Connect

    Zaffanella, L.E.; Savitz, D.A.

    1995-11-01

    The residential case-specular method is an innovative approach to epidemiologic studies of the association between wire codes and childhood cancer. This project was designed to further the development of the residential case-specular method, which seeks to help resolve the "wire code paradox". For years, wire codes have been used as surrogate measures of past electric and magnetic field (EMF) exposure. There is a magnetic field hypothesis that suggests childhood cancer is associated with exposure to magnetic fields, with wire codes as a proxy for these fields. The neighborhood hypothesis suggests that childhood cancer is associated with neighborhood characteristics and exposures other than magnetic fields, with wire codes as a proxy for these characteristics and exposures. The residential case-specular method was designed to discriminate between the magnetic field and the neighborhood hypothesis. Two methods were developed for determining the specular of a residence. These methods were tested with 400 randomly selected residences. The main advantage of the residential case-specular method is that it may efficiently confirm or eliminate the suspicion that control selection bias or confounding by neighborhood factors affected the results of case-control studies of childhood cancer and magnetic fields. The method may be applicable to both past and ongoing studies. The main disadvantage is that the method is untried. Consequently, further work is required to verify its validity and to ensure that sufficient statistical power can be obtained in a cost-effective manner.

  13. Development and Validation of Stability Indicating RP-HPLC Method for Voriconazole.

    PubMed

    Khetre, A B; Sinha, P K; Damle, Mrinalini C; Mehendre, R

    2009-09-01

    This study describes the development and validation of a stability-indicating HPLC method for voriconazole, an antifungal drug. Voriconazole was subjected to stress degradation under different conditions recommended by the International Conference on Harmonization. The samples so generated were used to develop a stability-indicating high-performance liquid chromatographic method for voriconazole. The peak for voriconazole was well resolved from the peaks of degradation products using a Hypersil C18 (250x4.6 mm) column and a mobile phase comprising acetonitrile:water (40:60, v/v) at a flow rate of 1 ml/min. Detection was carried out using a photodiode array detector. A linear response (r > 0.99) was observed in the range of 5-25 μg/ml. The method showed good recoveries (average 100.06%), and the relative standard deviations for intra- and inter-day precision were determined. The method was also validated for specificity and robustness. PMID:20502568

  14. A data-gathering method for use in modeling energy research, development and demonstration programs

    NASA Astrophysics Data System (ADS)

    Meyer, M. A.; Booker, J. M.; Cullingford, H. S.; Peaslee, A. T., Jr.

    The development and testing of a data-gathering method for use in a computer program designed to model energy research, development, and demonstration programs for decisionmakers are described. The data-gathering method consists of face-to-face interviews with the scientists working on the projects that will be modeled by the computer program. The basic information gained from an interview includes time estimates for reaching certain project goals and the probability of achieving those goals within the times estimated. The interview method is based on decision analysis techniques. The Magnetic Fusion Energy program of the US Department of Energy was selected as the test case. The data-gathering method was used at five fusion projects to determine whether it could meet its design criteria. Extensive statistical analysis was performed to learn how much the experts' answers agreed, what factors were likely to enter into their estimates, and how their estimates corresponded.

  15. Development of computational methods for unsteady aerodynamics at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Yates, E. Carson, Jr.; Whitlow, Woodrow, Jr.

    1987-01-01

    The current scope, recent progress, and plans for research and development of computational methods for unsteady aerodynamics at the NASA Langley Research Center are reviewed. Both integral equations and finite difference methods for inviscid and viscous flows are discussed. Although the great bulk of the effort has focused on finite difference solution of the transonic small perturbation equation, the integral equation program is given primary emphasis here because it is less well known.

  17. Human-System Safety Methods for Development of Advanced Air Traffic Management Systems

    SciTech Connect

    Nelson, W.R.

    1999-05-24

    The Idaho National Engineering and Environmental Laboratory (INEEL) is supporting the National Aeronautics and Space Administration in the development of advanced air traffic management (ATM) systems as part of the Advanced Air Transportation Technologies program. As part of this program INEEL conducted a survey of human-system safety methods that have been applied to complex technical systems, to identify lessons learned from these applications and provide recommendations for the development of advanced ATM systems. The domains that were surveyed included offshore oil and gas, commercial nuclear power, commercial aviation, and military. The survey showed that widely different approaches are used in these industries, and that the methods used range from very high-level, qualitative approaches to very detailed quantitative methods such as human reliability analysis (HRA) and probabilistic safety assessment (PSA). In addition, the industries varied widely in how effectively they incorporate human-system safety assessment in the design, development, and testing of complex technical systems. In spite of the lack of uniformity in the approaches and methods used, it was found that methods are available that can be combined and adapted to support the development of advanced air traffic management systems.

  18. Method of Power System Sustainable Development Optimization in Liberalized Market Conditions

    NASA Astrophysics Data System (ADS)

    Turcik, M.; Oleinikova, I.; Krishans, Z.

    2011-01-01

    The paper is focused on the development of the Baltic Sea region, taking into account the new EU energy policy. The authors describe the current situation and the power system infrastructure projects of the region. For the economic analysis and optimization of development plans, a method is proposed that takes into account outlooks for the coming 20-50 years and the uncertainty of the initial information. The method makes it possible to estimate the technical and economic state of a given power system, including market conditions.

  19. Recent developments in quasi-Newton methods for structural analysis and synthesis

    NASA Technical Reports Server (NTRS)

    Kamat, M. P.; Hayduk, R. J.

    1981-01-01

    Unlike the Newton-Raphson method, quasi-Newton methods, by virtue of their updates and step-length control procedures, are globally convergent and hence better suited for the solution of nonlinear problems of structural analysis and synthesis. Extension of quasi-Newton algorithms to large-scale problems has led to the development of sparse update algorithms and to economical strategies for evaluating sparse Hessians. Ill-conditioning problems have led to the development of self-scaled variable metric and conjugate gradient algorithms, as well as the use of singular perturbation theory. This paper emphasizes the effectiveness of such quasi-Newton algorithms for nonlinear structural analysis and synthesis.
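
    As a concrete reminder of what a quasi-Newton update looks like, the sketch below implements a plain dense BFGS iteration with a backtracking (Armijo) line search. It is generic textbook material, not the sparse, self-scaled, or conjugate-gradient variants surveyed in the paper.

    ```python
    # Plain dense BFGS with a backtracking (Armijo) line search -- generic
    # textbook material, not the sparse or self-scaled variants surveyed here.
    import numpy as np

    def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
        x = np.asarray(x0, float)
        n = len(x)
        H = np.eye(n)                          # inverse-Hessian approximation
        g = grad(x)
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            p = -H @ g                         # quasi-Newton search direction
            t = 1.0                            # backtracking line search
            while t > 1e-10 and f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
                t *= 0.5
            s = t * p
            g_new = grad(x + s)
            y = g_new - g
            sy = s @ y
            if sy > 1e-12:                     # curvature condition; skip update otherwise
                I = np.eye(n)
                H = (I - np.outer(s, y) / sy) @ H @ (I - np.outer(y, s) / sy) \
                    + np.outer(s, s) / sy
            x, g = x + s, g_new
        return x

    # usage: minimize the Rosenbrock function
    f = lambda v: (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2
    grad = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0]**2),
                               200 * (v[1] - v[0]**2)])
    print(bfgs(f, grad, [-1.2, 1.0]))          # converges to approximately [1, 1]
    ```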

  20. Methods for Characterizing the Co-development of Biofilm and Habitat Heterogeneity

    PubMed Central

    Li, Xiaobao; Song, Jisun L.; Culotti, Alessandro; Zhang, Wei; Chopp, David L.; Lu, Nanxi; Packman, Aaron I.

    2016-01-01

    Biofilms are surface-attached microbial communities that have complex structures and produce significant spatial heterogeneities. Biofilm development is strongly regulated by the surrounding flow and nutritional environment. Biofilm growth also increases the heterogeneity of the local microenvironment by generating complex flow fields and solute transport patterns. To investigate the development of heterogeneity in biofilms and interactions between biofilms and their local micro-habitat, we grew mono-species biofilms of Pseudomonas aeruginosa and dual-species biofilms of P. aeruginosa and Escherichia coli under nutritional gradients in a microfluidic flow cell. We provide detailed protocols for creating nutrient gradients within the flow cell and for growing and visualizing biofilm development under these conditions. We also present protocols for a series of optical methods to quantify spatial patterns in biofilm structure, flow distributions over biofilms, and mass transport around and within biofilm colonies. These methods support comprehensive investigations of the co-development of biofilm and habitat heterogeneity. PMID:25866914

  2. Development of quantitative duplex real-time PCR method for screening analysis of genetically modified maize.

    PubMed

    Oguchi, Taichi; Onishi, Mari; Minegishi, Yasutaka; Kurosawa, Yasunori; Kasahara, Masaki; Akiyama, Hiroshi; Teshima, Reiko; Futo, Satoshi; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2009-06-01

    A duplex real-time PCR method was developed for quantitative screening analysis of GM maize. The duplex real-time PCR simultaneously detected two GM-specific segments, namely the cauliflower mosaic virus (CaMV) 35S promoter (P35S) segment and an event-specific segment for GA21 maize which does not contain P35S. Calibration was performed with a plasmid calibrant specially designed for the duplex PCR. The result of an in-house evaluation suggested that the analytical precision of the developed method was almost equivalent to those of simplex real-time PCR methods, which have been adopted as ISO standard methods for the analysis of GMOs in foodstuffs and have also been employed for the analysis of GMOs in Japan. In addition, this method will reduce both the cost and time requirement of routine GMO analysis by half. The high analytical performance demonstrated in the current study would be useful for the quantitative screening analysis of GM maize. We believe the developed method will be useful for practical screening analysis of GM maize, although interlaboratory collaborative studies should be conducted to confirm this. PMID:19602858
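
    The abstract does not spell out the calibration arithmetic, so the sketch below shows one common way such quantitative real-time PCR data are reduced: fit a log-linear standard curve (Ct versus log10 copy number) from a calibrant dilution series for each target, convert sample Ct values to copy numbers, and report the GM target as a copy-number ratio to an endogenous reference. All Ct values are invented, and the paper's plasmid calibrant design and reporting units may differ.

    ```python
    # Hypothetical calibration arithmetic for quantitative real-time PCR
    # screening (not the study's data): standard curves from a plasmid
    # calibrant dilution series, then a copy-number ratio for the sample.
    import numpy as np

    def standard_curve(copies, ct):
        """Return (slope, intercept) of Ct = slope*log10(copies) + intercept."""
        return np.polyfit(np.log10(copies), ct, 1)

    def copies_from_ct(ct, slope, intercept):
        return 10 ** ((ct - intercept) / slope)

    # plasmid calibrant dilution series (copies per reaction) and measured Ct
    calib_copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
    ct_p35s = np.array([19.8, 23.2, 26.6, 30.1, 33.5])   # P35S channel
    ct_ref  = np.array([20.1, 23.5, 26.9, 30.3, 33.8])   # endogenous reference channel

    s1, i1 = standard_curve(calib_copies, ct_p35s)
    s2, i2 = standard_curve(calib_copies, ct_ref)

    gm_copies  = copies_from_ct(27.9, s1, i1)            # sample Ct, P35S channel
    ref_copies = copies_from_ct(21.4, s2, i2)            # sample Ct, reference channel
    print(f"GM content ~ {100.0 * gm_copies / ref_copies:.2f} % (copy-number ratio)")
    ```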

  3. Development of laser-ion beam photodissociation methods. Progress report, December 1, 1992--November 30, 1993

    SciTech Connect

    Russell, D.H.

    1992-08-01

    Research efforts were concentrated on developing the tandem magnetic sector (EB)/reflection time-of-flight (TOF) instrument, preliminary experiments with tandem TOF/TOF instruments, developing methods for performing photodissociation with pulsed lasers, experiments with laser ionization of aerosol particles, matrix-assisted laser desorption ionization (MALDI), and ion-molecule reaction chemistry of ground- and excited-state transition metal ions. This progress report is divided into three parts: photodissociation, MALDI (including aerosols), and ion chemistry fundamentals.

  4. Method and tool for generating and managing image quality allocations through the design and development process

    NASA Astrophysics Data System (ADS)

    Sparks, Andrew W.; Olson, Craig; Theisen, Michael J.; Addiego, Chris J.; Hutchins, Tiffany G.; Goodman, Timothy D.

    2016-05-01

    Performance models for infrared imaging systems require image quality parameters; optical design engineers need image quality design goals; systems engineers develop image quality allocations to test imaging systems against. It is a challenge to maintain consistency and traceability amongst the various expressions of image quality. We present a method and parametric tool for generating and managing expressions of image quality during the system modeling, requirements specification, design, and testing phases of an imaging system design and development project.

  5. Development of a simplified analytical method for representing material cyclic response

    NASA Technical Reports Server (NTRS)

    Moreno, V.

    1983-01-01

    Development of a simplified method for estimating structural inelastic stress and strain response to cyclic thermal loading is presented. The method assumes that high-temperature structural response is the sum of time-independent plastic and time-dependent elastic/creep components. The local structural stress and strain response predicted by linear elastic analysis is modified by the simplified method to predict the inelastic response. The results are compared with simulations by a nonlinear finite element analysis that used a time-independent plasticity model and a unified time-dependent material model.

  6. Development of a test method for carbonyl compounds from stationary source emissions

    SciTech Connect

    Zhihua Fan; Peterson, M.R.; Jayanty, R.K.M.

    1997-12-31

    Carbonyl compounds have received increasing attention because of their important role in ground-level ozone formation. The common method used for the measurement of aldehydes and ketones is 2,4-dinitrophenylhydrazine (DNPH) derivatization followed by high-performance liquid chromatography with ultraviolet detection (HPLC-UV). One of the problems associated with this method is the low recovery for certain compounds such as acrolein. This paper presents a study on the development of a test method for the collection and measurement of carbonyl compounds from stationary source emissions. The method involves collection of carbonyl compounds in impingers, conversion of the carbonyl compounds to a stable derivative with O-2,3,4,5,6-pentafluorobenzyl hydroxylamine hydrochloride (PFBHA), and separation and measurement by electron capture gas chromatography (GC-ECD). Eight compounds were selected for the evaluation of this method: formaldehyde, acetaldehyde, acrolein, acetone, butanal, methyl ethyl ketone (MEK), methyl isobutyl ketone (MIBK), and hexanal.

  7. Development of the local magnification method for quantitative evaluation of endoscope geometric distortion

    NASA Astrophysics Data System (ADS)

    Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong

    2016-05-01

    With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscope industry and medical device regulatory agencies. However, no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, which are difficult to understand. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to describe the distortion accurately or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on this method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has a clear physical meaning over the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
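
    The exact definition of the local magnification (ML) used in the paper is not given here, so the following is an assumed illustration of the general idea: for matched points of a square grid target, estimate ML as the ratio of image-plane spacing to object-plane spacing between neighbouring points, then express distortion as the percentage deviation from the central magnification. The synthetic barrel-distortion model is arbitrary.

    ```python
    # Assumed illustration of a local-magnification calculation (the paper's
    # exact definition may differ): ML at each grid point is the ratio of
    # image-plane spacing to object-plane spacing to a neighbouring point.
    import numpy as np

    def local_magnification(obj_pts, img_pts):
        """obj_pts, img_pts: (N, N, 2) arrays of matched grid-target coordinates."""
        n_rows, n_cols = obj_pts.shape[:2]
        ml = np.zeros((n_rows, n_cols))
        for i in range(n_rows):
            for j in range(n_cols):
                k = j + 1 if j + 1 < n_cols else j - 1     # nearest horizontal neighbour
                d_img = np.linalg.norm(img_pts[i, j] - img_pts[i, k])
                d_obj = np.linalg.norm(obj_pts[i, j] - obj_pts[i, k])
                ml[i, j] = d_img / d_obj
        return ml

    # synthetic example: a square grid imaged through simple barrel distortion
    n = 9
    xs = (np.arange(n) - n // 2) * 1.0
    obj = np.stack(np.meshgrid(xs, xs, indexing="ij"), axis=-1)
    r = np.linalg.norm(obj, axis=-1, keepdims=True)
    img = obj * (1.0 - 0.005 * r**2)                       # arbitrary radial model

    ml = local_magnification(obj, img)
    distortion_pct = 100.0 * (ml - ml[n // 2, n // 2]) / ml[n // 2, n // 2]
    print("corner distortion (%):", round(float(distortion_pct[0, 0]), 1))
    ```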

  8. Challenges in the analytical method development and validation for an unstable active pharmaceutical ingredient.

    PubMed

    Sajonz, Peter; Wu, Yan; Natishan, Theresa K; McGachy, Neil T; Detora, David

    2006-03-01

    A sensitive high-performance liquid chromatography (HPLC) impurity profile method for the antibiotic ertapenem is developed and subsequently validated. The method utilizes an Inertsil phenyl column at ambient temperature, gradient elution with aqueous sodium phosphate buffer at pH 8, and acetonitrile as the mobile phase. The linearity, method precision, method ruggedness, limit of quantitation, and limit of detection of the impurity profile HPLC method are found to be satisfactory. The method is determined to be specific, as judged by resolving ertapenem from in-process impurities in crude samples and degradation products that arise from solid state thermal and light stress, acid, base, and oxidative stressed solutions. In addition, evidence is obtained by photodiode array detection studies that no degradate or impurity having a different UV spectrum coeluted with the major component in stressed or unstressed samples. The challenges during the development and validation of the method are discussed. The difficulties of analyzing an unstable active pharmaceutical ingredient (API) are addressed. Several major impurities/degradates of the API have very different UV response factors from the API. These impurities/degradates are synthesized or prepared by controlled degradation and the relative response factors are determined. PMID:16620508

  9. Analytical methods for detection of gluten in food--method developments in support of food labeling legislation.

    PubMed

    Haraszi, Reka; Chassaigne, Hubert; Maquet, Alain; Ulberth, Franz

    2011-01-01

    The current essential therapy of celiac disease is a strict adherence to a gluten-free diet. Besides food products that are naturally gluten-free, "very low gluten" and "gluten-free" bakery products have become available. The availability of immunochemical and other analytical methods to determine gluten markers in foods is of utmost importance to ensure the well being of gluten-sensitive individuals. The aim of this review was to evaluate if currently available methodologies are suitable to meet the requirements of food labeling standards for individual gluten source declaration, in order to achieve policy objectives. Codex Alimentarius and European Union (EU) legislation and gluten detection methodologies applicable at present have been summarized and compared. In 2009, the European Commission issued Regulation No. 41/2009 concerning the composition and labeling of foodstuffs suitable for people intolerant to gluten. This review constitutes a basis to investigate the possibility to develop a proteomic-based method for the specific detection of gluten-containing cereals in food products, especially at or around the limits specified in EU legislation. PMID:21919334

  10. Development of a harmonised method for the profiling of amphetamines VI: Evaluation of methods for comparison of amphetamine.

    PubMed

    Andersson, Kjell; Lock, Eric; Jalava, Kaisa; Huizer, Henk; Jonson, Sten; Kaa, Elisabet; Lopes, Alvaro; Poortman-van der Meer, Anneke; Sippola, Erkki; Dujourdy, Laurence; Dahlén, Johan

    2007-06-14

    Amphetamine samples were analysed by gas chromatography-mass spectrometry (GC-MS), and the peak areas of 33 target compounds were transformed by applying various pretreatment techniques. The objective was to optimise the ability of a number of distance metrics to establish links between samples of amphetamine originating from the same batch (henceforth referred to as linked distances). Furthermore, partial least squares discriminant analysis (PLS-DA) was used to evaluate the effects of various pretreatment methods on separation of amphetamine batches synthesised by the Leuckart reaction, reductive amination of benzyl methyl ketone, and the nitrostyrene route. The most efficient way to pretreat GC-MS data varied for the different distance metrics, although best results were obtained when data were normalised to the sum of peak areas, and either the fourth root or a logarithm was applied to the normalised data. When pretreating normalised data by fourth root transformation, Pearson correlation was the distance metric that was most successful at finding linked samples. Normalisation and the use of fourth root also represented the best method of pretreating data when employing PLS-DA to separate samples synthesised by different routes. To achieve a faster and more user-friendly procedure for evaluating chromatograms, experiments were performed in which the number of target compounds used to compare samples was reduced. The effect of each compound that was removed was studied by applying PLS-DA and by using Pearson correlation to calculate linked distances as well as unlinked distances (between samples from different batches of amphetamine). Considering both links between samples from the same batch and separation of samples synthesised by different routes, the best results were obtained with the data set comprising 26 compounds. Finally, it was found that the profiling method developed in this work was superior to an existing technique with respect to separating linked
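
    The pretreatment and comparison steps described above are simple to express in code. The sketch below normalises peak areas to their sum, applies a fourth-root transform, and computes a Pearson-correlation distance between two profiles; the peak areas are made-up numbers, not data from the study.

    ```python
    # Sketch of the data pretreatment and comparison described above:
    # normalise peak areas to their sum, apply a fourth-root transform, and
    # use a Pearson-correlation distance.  The peak areas are hypothetical.
    import numpy as np

    def pretreat(peak_areas):
        areas = np.asarray(peak_areas, float)
        normalised = areas / areas.sum()      # normalise to the sum of peak areas
        return normalised ** 0.25             # fourth-root transform

    def pearson_distance(profile_a, profile_b):
        r = np.corrcoef(pretreat(profile_a), pretreat(profile_b))[0, 1]
        return 1.0 - r                        # 0 for identical profiles

    batch_1a = [120, 45, 800, 10, 5, 230]     # hypothetical target-compound areas
    batch_1b = [118, 50, 790, 12, 4, 240]     # same batch, second sample
    batch_2  = [300, 5, 150, 90, 60, 20]      # different synthesis batch

    print("linked distance  :", pearson_distance(batch_1a, batch_1b))
    print("unlinked distance:", pearson_distance(batch_1a, batch_2))
    ```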

  11. Development of a new method for determination of total haem protein in fish muscle.

    PubMed

    Chaijan, Manat; Undeland, Ingrid

    2015-04-15

    Using classic haem protein quantification methods, the extraction step in buffer or acid acetone often becomes limiting if muscle is oxidised and/or stored; haem-proteins then tend to bind to muscle components like myofibrils and/or biomembranes. The objective of this study was to develop a new haem protein determination method for fish muscle overcoming such extractability problems. The principle was to homogenise and heat samples in an SDS-containing phosphate buffer to dissolve major muscle components and convert ferrous/ferric haem proteins to hemichromes with a unique absorption peak at 535 nm. Hb-recovery tests with the new and classic methods showed that the new method and Hornsey's method performed significantly better on fresh Hb-enriched cod mince than Brown's and Drabkin's methods; recovery was ⩾98%. However, in highly oxidised samples and in cod protein isolates made with acid pH-shift processing, the new method performed better than Hornsey's method (63% and 87% vs. 50% and 68% recovery). Further, the new method performed well in fish muscle with ⩽30% lipid, <5% NaCl and pH 5.5-7.0; it was also unaffected by freezing/frozen storage. PMID:25466135
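
    The abstract does not state how the 535 nm hemichrome absorbance is converted to a haem-protein concentration, so the sketch below assumes a simple standard-curve calibration against known haemoglobin additions; the standard and sample values are hypothetical.

    ```python
    # Assumed standard-curve quantification for the 535 nm reading described
    # above; all values are hypothetical.
    import numpy as np

    # hypothetical standard curve: absorbance at 535 nm vs. haemoglobin added
    std_conc = np.array([0.0, 2.5, 5.0, 10.0, 20.0])     # mg Hb per g muscle
    std_a535 = np.array([0.02, 0.11, 0.21, 0.40, 0.79])
    slope, intercept = np.polyfit(std_conc, std_a535, 1)

    def haem_protein(a535, dilution_factor=1.0):
        """Convert a 535 nm hemichrome absorbance to mg Hb equivalents per g."""
        return dilution_factor * (a535 - intercept) / slope

    print(round(haem_protein(0.33), 1), "mg/g (hypothetical sample)")
    ```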

  12. CE-C(4)D method development and validation for the assay of ciprofloxacin.

    PubMed

    Paul, Prasanta; Van Laeken, Christophe; Sänger-van de Griend, Cari; Adams, Erwin; Van Schepdael, Ann

    2016-09-10

    A capillary electrophoresis method with capacitively coupled contactless conductivity detection (CE-C(4)D) has been developed, optimized and validated for the determination of ciprofloxacin. Ciprofloxacin is a member of the fluoroquinolone antibiotics with a broad spectrum bactericidal activity and recommended for complicated respiratory infections, sexually transmitted diseases, tuberculosis, bacterial diarrhea etc. Method development was conducted with major focus on the quality by design (QbD) approach. During method development, multiple buffers were tried at different ionic strength. However, the optimized method finally involved a very simple background electrolyte, monosodium citrate at a concentration of 10mM without pH adjustment. The optimized CE-C(4)D method involved an uncoated fused silica capillary (59/39cm, 50μm i.d.) and hydrodynamic sample injection at a pressure of 0.5 p.s.i. for 5s. The actual separation was conducted for 10min at normal polarity with a voltage of 20kV corresponding to 5.9μA current. LiCl (1mg/mL) was used as an internal standard. The optimized method is robust and accurate (recovery >98%) which rendered the ciprofloxacin peak within five minutes with good linearity (R(2)>0.999) in the concentration range of 0.0126-0.8mg/mL. The repeatability is expressed by percentage relative standard deviation (%RSD) of the relative peak areas (RPA) and it showed good repeatability both intra-day (<3%) and inter-day (3.1%). This method, proven to be free of matrix interference, showed that the estimated percent content of ciprofloxacin (102%) was within the official requirements. Moreover, due to its ease of use and robustness, the method should also be applicable in less well controlled laboratory environments. PMID:27386824
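
    The quantitation step implied above, with peak areas expressed relative to the LiCl internal standard (relative peak area, RPA) and read against a calibration line, can be sketched as follows; all areas, concentrations, and the nominal sample value are hypothetical.

    ```python
    # Hypothetical internal-standard (RPA) calibration sketch; not data from
    # the study.
    import numpy as np

    def rpa(analyte_area, internal_standard_area):
        """Relative peak area with respect to the LiCl internal standard."""
        return analyte_area / internal_standard_area

    # hypothetical calibration: ciprofloxacin concentration (mg/mL) vs. RPA
    conc = np.array([0.0126, 0.05, 0.1, 0.2, 0.4, 0.8])
    rpas = np.array([0.031, 0.124, 0.249, 0.502, 0.998, 2.004])
    slope, intercept = np.polyfit(conc, rpas, 1)

    sample = rpa(analyte_area=15300.0, internal_standard_area=12150.0)
    found = (sample - intercept) / slope      # mg/mL found in the sample
    nominal = 0.5                             # hypothetical nominal concentration
    print(f"estimated content: {100.0 * found / nominal:.1f} % of nominal")
    ```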

  13. It's all in the details: methods in breast development and cancer

    PubMed Central

    Bentires-Alj, Mohamed; Clarke, Robert B; Jonkers, Jos; Smalley, Matthew; Stein, Torsten

    2009-01-01

    The inaugural European Network for Breast Development and Cancer (ENBDC) meeting on 'Methods in Mammary Gland Development and Cancer' was held in Weggis, Switzerland last April. The goal was to discuss the details of techniques used to study mammary gland biology and tumourigenesis. Highlights of this meeting included the use of four-colour fluorescence for protein co-localisation in tissue microarrays, genome analysis at single cell resolution, technical issues in the isolation of normal and tumour stem cells, and the use of mouse models and mammary gland transplantations to elucidate gene function in mammary development and to study drug resistance in breast cancer. PMID:19691817

  14. Development of Preservice Teachers' Value Orientations during a Secondary Methods Course and Early Field Experience

    ERIC Educational Resources Information Center

    Sofo, Seidu; Curtner-Smith, Matthew D.

    2010-01-01

    Few studies have examined the value orientations of physical education preservice teachers (PTs). The purposes of this study were to: (1) describe the extent to which one cohort of PTs' value orientations changed and developed during a secondary methods course and early field experience (EFE); and (2) determine why PTs' value orientations changed…

  15. Analysis of Pre-Service Science Teachers' Views about the Methods Which Develop Reflective Thinking

    ERIC Educational Resources Information Center

    Töman, Ufuk; Odabasi Çimer, Sabiha; Çimer, Atilla

    2014-01-01

    In this study, we investigate science and technology pre-service teachers' opinions about the methods that develop reflective thinking, and we determine their level of reflective thinking. This study is a descriptive study. Open-ended questions were used to determine the views of the pre-service teachers. Questions used in the statistical analysis of…

  16. Sequential Pattern Analysis: Method and Application in Exploring How Students Develop Concept Maps

    ERIC Educational Resources Information Center

    Chiu, Chiung-Hui; Lin, Chien-Liang

    2012-01-01

    Concept mapping is a technique that represents knowledge in graphs. It has been widely adopted in science education and cognitive psychology to aid learning and assessment. To realize the sequential manner in which students develop concept maps, most research relies upon human-dependent, qualitative approaches. This article proposes a method for…

  17. Effect of storage method on manure as a substrate for filth fly development

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Numerous studies have been conducted using manure as a substrate for filth fly development. In these experiments, the manure is sometimes frozen for use at a later date. The purpose of this study was to determine the effects of various manure storage methods on subsequent house and stable fly develo...

  18. Development of a Neuroscience-Oriented "Methods" Course for Graduate Students of Pharmacology and Toxicology

    ERIC Educational Resources Information Center

    Surratt, Christopher K.; Witt-Enderby, Paula A.; Johnson, David A.; Anderson, Carl A.; Bricker, J. Douglas; Davis, Vicki L.; Firestine, Steven M.; Meng, Wilson S.

    2006-01-01

    To provide graduate students in pharmacology/toxicology exposure to, and cross-training in, a variety of relevant laboratory skills, the Duquesne University School of Pharmacy developed a "methods" course as part of the core curriculum. Because some of the participating departmental faculty are neuroscientists, this course often applied…

  19. Development and validation of a Daphnia magna four-day survival and growth test method

    EPA Science Inventory

    Zooplankton are an important part of the aquatic ecology of all lakes and streams. As a result, numerous methods have been developed to assess the quality of waterbodies using various zooplankton species. Included in these is the freshwater species Daphnia magna. Current test me...

  20. Developing Critical Understanding in HRM Students: Using Innovative Teaching Methods to Encourage Deep Approaches to Study

    ERIC Educational Resources Information Center

    Butler, Michael J. R.; Reddy, Peter

    2010-01-01

    Purpose: This paper aims to focus on developing critical understanding in human resource management (HRM) students in Aston Business School, UK. The paper reveals that innovative teaching methods encourage deep approaches to study, an indicator of students reaching their own understanding of material and ideas. This improves student employability…

  1. DEVELOPMENT OF A MOLECULAR METHOD TO IDENTIFY HEPATITIS E VIRUS IN WATER

    EPA Science Inventory

    Hepatitis E virus (HEV) causes an infectious form of hepatitis associated with contaminated water. By analyzing the sequence of several HEV isolates, a reverse transciption-polymerase chain reaction method was developed and optimized that should be able to identify all of the kn...

  2. The Concious Synthesis of Development: A Clinical Method for Treatment of Stuttering. Final Report.

    ERIC Educational Resources Information Center

    Brajovic, Cvjetko; And Others

    Presented are the results of a project to test and evaluate the Conscious Synthesis of Development (CSD) method for treating stuttering which concentrates on both psychological and physiological aspects of the disorder. The report is organized into the following divisions: definition and psychophysiology of stuttering; historical overview; review…

  3. The Slice Culture Method for Following Development of Tooth Germs In Explant Culture

    PubMed Central

    Alfaqeeh, Sarah A.; Tucker, Abigail S.

    2013-01-01

    Explant culture allows manipulation of developing organs at specific time points and is therefore an important method for the developmental biologist. For many organs it is difficult to access developing tissue to allow monitoring during ex vivo culture. The slice culture method allows access to tissue so that morphogenetic movements can be followed and specific cell populations can be targeted for manipulation or lineage tracing. In this paper we describe a method of slice culture that has been very successful for culture of tooth germs in a range of species. The method provides excellent access to the tooth germs, which develop at a similar rate to that observed in vivo, surrounded by the other jaw tissues. This allows tissue interactions between the tooth and surrounding tissue to be monitored. Although this paper concentrates on tooth germs, the same protocol can be applied to follow development of a number of other organs, such as salivary glands, Meckel's cartilage, nasal glands, tongue, and ear. PMID:24300332

  4. 2D and 3D Method of Characteristic Tools for Complex Nozzle Development

    NASA Technical Reports Server (NTRS)

    Rice, Tharen

    2003-01-01

    This report details the development of a 2D and 3D Method of Characteristic (MOC) tool for the design of complex nozzle geometries. These tools are GUI driven and can be run on most Windows-based platforms. The report provides a user's manual for these tools as well as explains the mathematical algorithms used in the MOC solutions.
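
    The report's 2D and 3D MOC algorithms are not reproduced here; as a generic building block of 2-D method-of-characteristics nozzle design, the sketch below evaluates the Prandtl-Meyer function, which relates flow turning angle to Mach number along characteristics, and inverts it numerically by bisection.

    ```python
    # Generic MOC building block (not the report's algorithm): the
    # Prandtl-Meyer function nu(M) and its numerical inverse.
    import math

    GAMMA = 1.4

    def prandtl_meyer(mach):
        """Prandtl-Meyer angle nu(M) in radians, for M >= 1."""
        g = (GAMMA + 1.0) / (GAMMA - 1.0)
        return (math.sqrt(g) * math.atan(math.sqrt((mach**2 - 1.0) / g))
                - math.atan(math.sqrt(mach**2 - 1.0)))

    def mach_from_nu(nu, lo=1.0, hi=50.0, tol=1e-10):
        """Invert nu(M) by bisection (nu is monotonically increasing in M)."""
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if prandtl_meyer(mid) < nu:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # e.g. the turning angle accumulated when expanding from M = 1 to M = 2
    nu2 = prandtl_meyer(2.0)
    print(math.degrees(nu2))            # approximately 26.4 degrees
    print(mach_from_nu(nu2))            # recovers M = 2
    ```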

  5. REGIONAL METHODS INITIATIVE: DEVELOPMENT OF LARGE RIVER BIOASSESSMENT PROTOCOLS (LRBP) FOR BENTHIC MACROINVERTEBRATES

    EPA Science Inventory

    We are developing the Large River Bioassessment Protocol (LRBP) for assessment of benthic macroinvertebrate assemblages. This multi-habitat method is currently being used in support of a REMAP project for probabilistic assessment of large rivers in USEPA Region 5. Six rivers, r...

  6. Connecting Practice, Theory and Method: Supporting Professional Doctoral Students in Developing Conceptual Frameworks

    ERIC Educational Resources Information Center

    Kumar, Swapna; Antonenko, Pavlo

    2014-01-01

    From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…

  7. Development of Novel Analytical Method for Ab Initio Powder Structural Analysis

    NASA Astrophysics Data System (ADS)

    Sakata, Makoto; Nishibori, Eiji; Sawa, Hiroshi

    The application of a Genetic Algorithm (GA) to ab initio structure determination from synchrotron powder diffraction data is described. The GA appears to have an advantage over other real-space methods for ab initio structure determination because of the schema theorem. As an example, the case of Prednisolone Succinate is shown in some detail. Future development of GA in crystallography is briefly described.

  8. COMPARISONS OF ACUTE REFERENCE VALUES IN DEVELOPING AN ACUTE INHALATION ASSESSMENT METHOD

    EPA Science Inventory

    A method is being developed for performing assessments of human health risk from acute (less than 24 hour) inhalation exposures. The methodology will be flexible in its ability to utilize variously robust data sets of dose-response information. A supporting task is a comparati...

  9. A Method for User Centering Systematic Product Development Aimed at Industrial Design Students

    ERIC Educational Resources Information Center

    Coelho, Denis A.

    2010-01-01

    Instead of limiting the introduction and stimulus for new concept creation to lists of specifications, industrial design students seem to prefer to be encouraged by ideas in context. A new method that specifically tackles human activity to foster the creation of user centered concepts of new products was developed and is presented in this article.…

  10. Development of methods for characterizing fetal and adult somatic mutations detected in human erythroid precursor

    SciTech Connect

    Langlois, R.G.; Manchester, D.K.

    1994-12-31

    The glycophorin A (GPA) assay was developed to quantify somatic mutations in humans by measuring the frequency of peripheral erythrocytes with mutant phenotypes that are presumed to be progeny of mutated erythroid precursor cells. This assay has been used to identify GPA variant cells in unexposed individuals at a frequency of approximately 10 per million erythrocytes, and to demonstrate significant increases in variant frequency after mutagenic exposures. Characterization of the mutations responsible for these variant cells requires that the assay be modified to allow flow analysis and sorting of variant erythroid precursor cells that contain nucleic acids. Cord blood samples contain low levels of both reticulocytes and nucleated erythrocytes. We have developed enrichment methods using centrifugation that yield samples containing up to 30% nucleated erythrocytes, and immunomagnetic separation methods that yield samples containing up to 90% reticulocytes. Enrichment methods for these two cell types are also being developed for adult bone marrow samples. We have confirmed that enrichment and labeling with a nucleic acid-specific dye are compatible with GPA analysis of erythrocytes, reticulocytes, and nucleated erythrocytes. Enriched samples have been successfully used for flow cytometric detection of GPA variant reticulocytes in cord blood. PCR-based analysis methods are being developed for molecular characterization of sorted variant cells at the mRNA level.

  11. Trade-Offs in the Study of Culture and Development: Theories, Methods, and Values.

    ERIC Educational Resources Information Center

    Rothbaum, Fred; Pott, Martha; Azuma, Hiroshi; Miyake, Kazuo; Weisz, John

    2000-01-01

    Notes that commentators unanimously support Rothbaum et al.'s general orientation to culture and development and their developmental pathways. Views commentators' suggestions as relating to trade-offs: between theories that highlight generalization or exceptions; between methods that rely on one-, two-, or multiculture studies; and between values…

  12. 2005 Nobel Prize in Chemistry: Development of the Olefin Metathesis Method in Organic Synthesis

    ERIC Educational Resources Information Center

    Casey, Charles P.

    2006-01-01

    The 2005 Nobel Prize in Chemistry was awarded "for the development of the metathesis method in organic synthesis". The discoveries of the laureates provided a chemical reaction used daily in the chemical industry for the efficient and more environmentally friendly production of important pharmaceuticals, fuels, synthetic fibers, and many other…

  13. Development of a Hybrid Method for Dimensionality Identification Incorporating an Angle-Based Approach

    ERIC Educational Resources Information Center

    Zeng, Ji

    2010-01-01

    Correct dimensionality identification (i.e., a correct decision on the number of factors to retain) is crucial not only in educational and psychological measurement, but also in various fields such as medicine and sociology that use exploratory factor analysis (EFA) in developing theories. However, to date, no single method has been endorsed for…

  14. POLYURETHANE FOAM AS TRAPPING AGENT FOR AIRBORNE PESTICIDES: ANALYTICAL METHOD DEVELOPMENT

    EPA Science Inventory

    A method for determining levels of organochlorine, organophosphorus, and N-methyl carbamate insecticides in air was developed using 4.4 cm-diameter plugs of polyurethane foam as traps and a modified Sherma-Shafik multiresidue procedure for analysis of foam extracts. With this met...

  15. DEVELOPMENT OF A TEST METHOD FOR THE MEASUREMENT OF GASEOUS METHANOL EMISSIONS FROM STATIONARY SOURCES

    EPA Science Inventory

    Methanol was designated under Title III of the Clean Air Act Amendments of 1990 as a pollutant to be regulated. The U.S. EPA, through a contract with Research Triangle Institute, has developed a test method for the measurement of methanol emissions from stationary sources. The meth...

  16. Development of a Solid Phase Extraction Method for Agricultural Pesticides in Large-Volume Water Samples

    EPA Science Inventory

    An analytical method using solid phase extraction (SPE) and analysis by gas chromatography/mass spectrometry (GC/MS) was developed for the trace determination of a variety of agricultural pesticides and selected transformation products in large-volume high-elevation lake water sa...

  17. A Four-Stage Method for Developing Early Interventions for Alcohol among Aboriginal Adolescents

    ERIC Educational Resources Information Center

    Mushquash, Christopher J.; Comeau, M. Nancy; McLeod, Brian D.; Stewart, Sherry H.

    2010-01-01

    This paper details a four-stage methodology for developing early alcohol interventions for at-risk Aboriginal youth. Stage 1 was an integrative approach to Aboriginal education that upholds Aboriginal traditional wisdom supporting respectful relationships to the Creator, to the land and to each other. Stage 2 used quantitative methods to…

  18. Development of a new noninvasive method to determine the integrity of bone in vivo

    NASA Technical Reports Server (NTRS)

    Saha, S.

    1980-01-01

    An electromagnetic sensor for monitoring elastic waves in bone was developed. It does not require the use of traction pins and the output is not affected by soft tissue properties, a difficulty commonly encountered when using ultrasonic and vibration methods to determine in vivo properties of bone.

  19. Developments in Methods for Measuring the Intestinal Absorption of Nanoparticle-Bound Drugs.

    PubMed

    Liu, Wei; Pan, Hao; Zhang, Caiyun; Zhao, Liling; Zhao, Ruixia; Zhu, Yongtao; Pan, Weisan

    2016-01-01

    With the rapid development of nanotechnology, novel drug delivery systems comprising orally administered nanoparticles (NPs) have been paid increasing attention in recent years. The bioavailability of orally administered drugs has significant influence on drug efficacy and therapeutic dosage, and it is therefore imperative that the intestinal absorption of oral NPs be investigated. This review examines the various literature on the oral absorption of polymeric NPs, and provides an overview of the intestinal absorption models that have been developed for the study of oral nanoparticles. Three major categories of models including a total of eight measurement methods are described in detail (in vitro: dialysis bag, rat gut sac, Ussing chamber, cell culture model; in situ: intestinal perfusion, intestinal loops, intestinal vascular cannulation; in vivo: the blood/urine drug concentration method), and the advantages and disadvantages of each method are contrasted and elucidated. In general, in vitro and in situ methods are relatively convenient but lack accuracy, while the in vivo method is troublesome but can provide a true reflection of drug absorption in vivo. This review summarizes the development of intestinal absorption experiments in recent years and provides a reference for the systematic study of the intestinal absorption of nanoparticle-bound drugs. PMID:27455239

  20. Novel methods to collect meaningful data from adolescents for the development of health interventions.

    PubMed

    Hieftje, Kimberly; Duncan, Lindsay R; Fiellin, Lynn E

    2014-09-01

    Health interventions are increasingly focused on young adolescents, and as a result, discussions with this population have become a popular method in qualitative research. Traditional methods used to engage adults in discussions do not translate well to this population, who may have difficulty conceptualizing abstract thoughts and opinions and communicating them to others. As part of a larger project to develop and evaluate a video game for risk reduction and HIV prevention in young adolescents, we were seeking information and ideas from the priority audience that would help us create authentic story lines and character development in the video game. To accomplish this authenticity, we conducted in-depth interviews and focus groups with young adolescents aged 10 to 15 years and employed three novel methods: Storytelling Using Graphic Illustration, My Life, and Photo Feedback Project. These methods helped provide a thorough understanding of the adolescents' experiences and perspectives regarding their environment and future aspirations, which we translated into active components of the video game intervention. This article describes the processes we used and the valuable data we generated using these three engaging methods. These three activities are effective tools for eliciting meaningful data from young adolescents for the development of health interventions. PMID:24519998

  1. Developments in Methods for Measuring the Intestinal Absorption of Nanoparticle-Bound Drugs

    PubMed Central

    Liu, Wei; Pan, Hao; Zhang, Caiyun; Zhao, Liling; Zhao, Ruixia; Zhu, Yongtao; Pan, Weisan

    2016-01-01

    With the rapid development of nanotechnology, novel drug delivery systems comprising orally administered nanoparticles (NPs) have been paid increasing attention in recent years. The bioavailability of orally administered drugs has significant influence on drug efficacy and therapeutic dosage, and it is therefore imperative that the intestinal absorption of oral NPs be investigated. This review examines the various literature on the oral absorption of polymeric NPs, and provides an overview of the intestinal absorption models that have been developed for the study of oral nanoparticles. Three major categories of models including a total of eight measurement methods are described in detail (in vitro: dialysis bag, rat gut sac, Ussing chamber, cell culture model; in situ: intestinal perfusion, intestinal loops, intestinal vascular cannulation; in vivo: the blood/urine drug concentration method), and the advantages and disadvantages of each method are contrasted and elucidated. In general, in vitro and in situ methods are relatively convenient but lack accuracy, while the in vivo method is troublesome but can provide a true reflection of drug absorption in vivo. This review summarizes the development of intestinal absorption experiments in recent years and provides a reference for the systematic study of the intestinal absorption of nanoparticle-bound drugs. PMID:27455239

  2. Novel Methods to Collect Meaningful Data From Adolescents for the Development of Health Interventions

    PubMed Central

    Hieftje, Kimberly; Duncan, Lindsay R.; Fiellin, Lynn E.

    2014-01-01

    Health interventions are increasingly focused on young adolescents, and as a result, discussions with this population have become a popular method in qualitative research. Traditional methods used to engage adults in discussions do not translate well to this population, who may have difficulty conceptualizing abstract thoughts and opinions and communicating them to others. As part of a larger project to develop and evaluate a video game for risk reduction and HIV prevention in young adolescents, we were seeking information and ideas from the priority audience that would help us create authentic story lines and character development in the video game. To accomplish this authenticity, we conducted in-depth interviews and focus groups with young adolescents aged 10 to 15 years and employed three novel methods: Storytelling Using Graphic Illustration, My Life, and Photo Feedback Project. These methods helped provide a thorough understanding of the adolescents’ experiences and perspectives regarding their environment and future aspirations, which we translated into active components of the video game intervention. This article describes the processes we used and the valuable data we generated using these three engaging methods. These three activities are effective tools for eliciting meaningful data from young adolescents for the development of health interventions. PMID:24519998

  3. Development and application of QM/MM methods to study the solvation effects and surfaces

    SciTech Connect

    Dibya, Pooja Arora

    2010-01-01

    Quantum mechanical (QM) calculations have the advantage of attaining high-level accuracy; however, QM calculations become computationally inefficient as the size of the system grows. Solving complex molecular problems on large systems and ensembles by using quantum mechanics still poses a challenge in terms of the computational cost. Methods that are based on classical mechanics are an inexpensive alternative, but they lack accuracy. A good trade-off between accuracy and efficiency is achieved by combining QM methods with molecular mechanics (MM) methods, using the robustness of the QM methods in terms of accuracy and the MM methods to minimize the computational cost. Two types of QM combined with MM (QM/MM) methods are the main focus of the present dissertation: the application and development of QM/MM methods for solvation studies and reactions on the Si(100) surface. The solvation studies were performed using a discrete solvation model that is largely based on first principles, called the effective fragment potential (EFP) method. The main idea of combining the EFP method with quantum mechanics is to accurately treat the solute-solvent and solvent-solvent interactions, such as electrostatic, polarization, dispersion and charge transfer, that are important in correctly calculating solvent effects on systems of interest. A second QM/MM method called SIMOMM (surface integrated molecular orbital molecular mechanics) is a hybrid QM/MM embedded cluster model that mimics the real surface. This method was employed to calculate the potential energy surfaces for reactions of atomic O on the Si(100) surface. The hybrid QM/MM method is a computationally inexpensive approach for studying reactions on larger surfaces in a reasonably accurate and efficient manner. This thesis comprises four chapters: Chapter 1 describes the general overview and motivation of the dissertation and gives a broad background of the computational methods that have been employed in this work
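
    As general background only (not the dissertation's working equations), hybrid QM/MM energies are usually written in one of two standard forms, an additive partition or an IMOMM/ONIOM-style subtractive combination:

      E_{\mathrm{QM/MM}}^{\mathrm{add}} = E_{\mathrm{QM}}(\mathrm{I}) + E_{\mathrm{MM}}(\mathrm{II}) + E_{\mathrm{int}}(\mathrm{I,II})

      E_{\mathrm{QM/MM}}^{\mathrm{sub}} = E_{\mathrm{MM}}(\mathrm{whole}) + E_{\mathrm{QM}}(\mathrm{model}) - E_{\mathrm{MM}}(\mathrm{model})

    where region I (the solute or the chemisorption cluster) is treated quantum mechanically and region II (the solvent fragments or the extended surface) is treated with molecular mechanics; the interaction term collects contributions such as electrostatics, polarization, dispersion, and charge transfer of the kind the EFP method is designed to capture.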

  4. Development of a numerical method for the prediction of turbulent flows in dump diffusers

    NASA Astrophysics Data System (ADS)

    Ando, Yasunori; Kawai, Masafumi; Sato, Yukinori; Toh, Hidemi

    1987-01-01

    In order to obtain an effective tool to design dump diffusers for gas turbine combustors, a finite-volume numerical calculation method has been developed for the solution of two-dimensional/axisymmetric incompressible steady Navier-Stokes equation in general curvilinear coordinate system. This method was applied to the calculations of turbulent flows in a two-dimensional dump diffuser with uniform and distorted inlet velocity profiles as well as an annular dump diffuser with uniform inlet velocity profile, and the calculated results were compared with experimental data. The numerical results showed a good agreement with experimental data in case of both inlet velocity profiles; eventually, the numerical method was confirmed to be an effective tool for the development of dump diffusers which can predict the flow pattern, velocity distribution and the pressure loss.

  5. Development and Application of Multidimensional HPLC Mapping Method for O-linked Oligosaccharides

    PubMed Central

    Yagi, Hirokazu; Ohno, Erina; Kondo, Sachiko; Yoshida, Atsuhiro; Kato, Koichi

    2011-01-01

    Glycosylation improves the solubility and stability of proteins, contributes to the structural integrity of protein functional sites, and mediates biomolecular recognition events involved in cell-cell communications and viral infections. The first step toward understanding the molecular mechanisms underlying these carbohydrate functionalities is a detailed characterization of glycan structures. Recently developed glycomic approaches have enabled comprehensive analyses of N-glycosylation profiles in a quantitative manner. However, there are only a few reports describing detailed O-glycosylation profiles primarily because of the lack of a widespread standard method to identify O-glycan structures. Here, we developed an HPLC mapping method for detailed identification of O-glycans including neutral, sialylated, and sulfated oligosaccharides. Furthermore, using this method, we were able to quantitatively identify isomeric products from an in vitro reaction catalyzed by N-acetylglucosamine-6O-sulfotransferases and obtain O-glycosylation profiles of serum IgA as a model glycoprotein. PMID:24970123

  6. Integration of QFD, AHP, and LPP methods in supplier development problems under uncertainty

    NASA Astrophysics Data System (ADS)

    Shad, Zahra; Roghanian, Emad; Mojibian, Fatemeh

    2014-04-01

    Quality function deployment (QFD) is a customer-driven approach widely used to develop new products or processes to maximize customer satisfaction. Previous research used the linear physical programming (LPP) procedure to optimize QFD; however, the QFD problem involves uncertainties, or fuzziness, which must be taken into account for a more realistic study. In this paper, a set of fuzzy data is used to address linguistic values parameterized by triangular fuzzy numbers. The proposed integrated approach combines the analytic hierarchy process (AHP), QFD, and LPP to maximize overall customer satisfaction under uncertain conditions and applies them to the supplier development problem. The fuzzy AHP approach is adopted as a powerful method to obtain the relationship between the customer requirements and engineering characteristics (ECs) and to construct the house of quality in the QFD method. LPP is used to obtain the optimal achievement level of the ECs and subsequently the customer satisfaction level under different degrees of uncertainty. The effectiveness of the proposed method is illustrated by an example.
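
    As a hedged illustration of the fuzzy weighting step only, the Python sketch below implements one common, simplified variant (Buckley-style fuzzy geometric means of a triangular-fuzzy-number comparison matrix, followed by centroid defuzzification). It is not claimed to be the paper's exact procedure, and the matrix entries are hypothetical placeholders.

      import math

      # Triangular fuzzy numbers (TFNs) are (l, m, u) triples with l <= m <= u.

      def tfn_geomean(tfns):
          """Componentwise geometric mean of a row of TFNs."""
          n = len(tfns)
          return tuple(math.prod(t[i] for t in tfns) ** (1.0 / n) for i in range(3))

      def defuzzify(t):
          """Centroid of a TFN."""
          return sum(t) / 3.0

      # Hypothetical 3x3 fuzzy pairwise-comparison matrix of customer requirements
      M = [
          [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
          [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
          [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
      ]

      # Fuzzy geometric mean per row, then defuzzify and normalize to crisp weights
      r = [tfn_geomean(row) for row in M]
      crisp = [defuzzify(t) for t in r]
      weights = [c / sum(crisp) for c in crisp]
      print(weights)   # relative importance of the requirements, summing to 1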

  7. Complementary methods of system usability evaluation: surveys and observations during software design and development cycles.

    PubMed

    Horsky, Jan; McColgan, Kerry; Pang, Justine E; Melnikas, Andrea J; Linder, Jeffrey A; Schnipper, Jeffrey L; Middleton, Blackford

    2010-10-01

    Poor usability of clinical information systems delays their adoption by clinicians and limits potential improvements to the efficiency and safety of care. Recurring usability evaluations are therefore integral to the system design process. We compared four methods employed during the development of outpatient clinical documentation software: clinician email response, online survey, observations and interviews. Results suggest that no single method identifies all or most problems. Rather, each approach is optimal for evaluations at a different stage of design and characterizes a different usability aspect. Email responses elicited from clinicians and surveys report mostly technical, biomedical, terminology and control problems and are most effective when a working prototype has been completed. Observations of clinical work and interviews inform conceptual and workflow-related problems and are best performed early in the cycle. Appropriate use of these methods consistently during development may significantly improve system usability and contribute to higher adoption rates among clinicians and to improved quality of care. PMID:20546936

  8. Development of the smooth orthogonal decomposition method to derive the modal parameters of vehicle suspension system

    NASA Astrophysics Data System (ADS)

    Rezaee, Mousa; Shaterian-Alghalandis, Vahid; Banan-Nojavani, Ali

    2013-04-01

    In this paper, the smooth orthogonal decomposition (SOD) method is extended to lightly damped systems in which the inputs are time-shifted functions of one or more random processes. A practical example is the vehicle suspension system, in which the random inputs due to road roughness applied to the rear wheels are shifted functions of the same random inputs on the front wheels, with a time lag depending on the vehicle wheelbase as well as its velocity. The developed SOD method is applied to determine the natural frequencies and mode shapes of a certain vehicle suspension system, and the results are compared with the true values obtained from the structural eigenvalue problem. The consistency of the results indicates that the SOD method can be applied with a high degree of accuracy to calculate the modal parameters of vibrating systems in which the system inputs are shifted functions of one or more random processes.
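
    As a hedged sketch of the basic SOD eigenproblem (the standard formulation, not the paper's extension to time-shifted random inputs), the code below forms covariance matrices of a measured response and of its time derivative, solves the generalized eigenvalue problem, and converts the smooth orthogonal values to natural-frequency estimates. The two-channel signal is synthetic.

      import numpy as np
      from scipy.linalg import eig

      def sod_modes(X, dt):
          """Smooth orthogonal decomposition of a response ensemble.

          X  : (n_samples, n_dof) array of measured displacements
          dt : sampling interval
          Returns estimated natural frequencies (Hz) and smooth orthogonal modes.
          """
          X = X - X.mean(axis=0)                 # remove the mean
          V = np.gradient(X, dt, axis=0)         # time derivative of the responses
          Sxx = X.T @ X / X.shape[0]             # displacement covariance
          Svv = V.T @ V / V.shape[0]             # velocity covariance
          lam, Phi = eig(Sxx, Svv)               # generalized eigenproblem Sxx p = lam Svv p
          lam = np.real(lam)
          order = np.argsort(lam)[::-1]          # larger lam -> lower frequency
          freqs = 1.0 / (2.0 * np.pi * np.sqrt(lam[order]))
          return freqs, np.real(Phi[:, order])

      # Toy check: two uncoupled oscillations at 1 Hz and 3 Hz plus measurement noise
      t = np.arange(0.0, 60.0, 0.01)
      X = np.column_stack([np.sin(2 * np.pi * 1.0 * t), np.sin(2 * np.pi * 3.0 * t)])
      X += 0.01 * np.random.default_rng(0).standard_normal(X.shape)
      print(sod_modes(X, 0.01)[0])               # roughly [1.0, 3.0]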

  9. Model correlation and damage location for large space truss structures: Secant method development and evaluation

    NASA Technical Reports Server (NTRS)

    Smith, Suzanne Weaver; Beattie, Christopher A.

    1991-01-01

    On-orbit testing of a large space structure will be required to complete the certification of any mathematical model for the structure dynamic response. The process of establishing a mathematical model that matches measured structure response is referred to as model correlation. Most model correlation approaches have an identification technique to determine structural characteristics from the measurements of the structure response. This problem is approached with one particular class of identification techniques - matrix adjustment methods - which use measured data to produce an optimal update of the structure property matrix, often the stiffness matrix. New methods were developed for identification to handle problems of the size and complexity expected for large space structures. Further development and refinement of these secant-method identification algorithms were undertaken. Also, evaluation of these techniques as an approach for model correlation and damage location was initiated.

  10. Development of Methods for the Determination of pKa Values.

    PubMed

    Reijenga, Jetse; van Hoof, Arno; van Loon, Antonie; Teunissen, Bram

    2013-01-01

    The acid dissociation constant (pKa) is among the most frequently used physicochemical parameters, and its determination is of interest to a wide range of research fields. We present a brief introduction on the conceptual development of pKa as a physical parameter and its relationship to the concept of the pH of a solution. This is followed by a general summary of the historical development and current state of the techniques of pKa determination and an attempt to develop insight into future developments. Fourteen methods of determining the acid dissociation constant are placed in context and are critically evaluated to make a fair comparison and to determine their applications in modern chemistry. Additionally, we have studied these techniques in light of present trends in science and technology and attempt to determine how these trends might affect future developments in the field. PMID:23997574
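
    As a hedged, worked illustration of the potentiometric idea behind several of the fourteen methods surveyed (not an example taken from the review itself), the sketch below recovers a pKa from the measured pH and the conjugate base/acid ratio via the Henderson-Hasselbalch relation; at the half-equivalence point of a titration the measured pH equals the pKa.

      import math

      def pka_from_ratio(ph, conc_base, conc_acid):
          """Henderson-Hasselbalch rearranged: pKa = pH - log10([A-]/[HA])."""
          return ph - math.log10(conc_base / conc_acid)

      # Acetic acid example: at pH 4.76 the acid is half neutralised, so pKa ~= 4.76
      print(pka_from_ratio(4.76, 0.05, 0.05))      # 4.76
      # Away from half-neutralisation the same relation still recovers the pKa
      print(pka_from_ratio(5.76, 0.09, 0.009))     # ~4.76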

  11. Development of Methods for the Determination of pKa Values

    PubMed Central

    Reijenga, Jetse; van Hoof, Arno; van Loon, Antonie; Teunissen, Bram

    2013-01-01

    The acid dissociation constant (pKa) is among the most frequently used physicochemical parameters, and its determination is of interest to a wide range of research fields. We present a brief introduction on the conceptual development of pKa as a physical parameter and its relationship to the concept of the pH of a solution. This is followed by a general summary of the historical development and current state of the techniques of pKa determination and an attempt to develop insight into future developments. Fourteen methods of determining the acid dissociation constant are placed in context and are critically evaluated to make a fair comparison and to determine their applications in modern chemistry. Additionally, we have studied these techniques in light of present trends in science and technology and attempt to determine how these trends might affect future developments in the field. PMID:23997574

  12. Development of Finite Elements for Two-Dimensional Structural Analysis Using the Integrated Force Method

    NASA Technical Reports Server (NTRS)

    Kaljevic, Igor; Patnaik, Surya N.; Hopkins, Dale A.

    1996-01-01

    The Integrated Force Method has been developed in recent years for the analysis of structural mechanics problems. This method treats all independent internal forces as unknown variables that can be calculated by simultaneously imposing equations of equilibrium and compatibility conditions. In this paper a finite element library for analyzing two-dimensional problems by the Integrated Force Method is presented. Triangular- and quadrilateral-shaped elements capable of modeling arbitrary domain configurations are presented. The element equilibrium and flexibility matrices are derived by discretizing the expressions for potential and complementary energies, respectively. The displacement and stress fields within the finite elements are independently approximated. The displacement field is interpolated as it is in the standard displacement method, and the stress field is approximated by using complete polynomials of the correct order. A procedure that uses the definitions of stress components in terms of an Airy stress function is developed to derive the stress interpolation polynomials. Such derived stress fields identically satisfy the equations of equilibrium. Moreover, the resulting element matrices are insensitive to the orientation of local coordinate systems. A method is devised to calculate the number of rigid body modes, and the present elements are shown to be free of spurious zero-energy modes. A number of example problems are solved by using the present library, and the results are compared with corresponding analytical solutions and with results from the standard displacement finite element method. The Integrated Force Method not only gives results that agree well with analytical and displacement method results but also outperforms the displacement method in stress calculations.
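
    The abstract notes that stress fields derived from an Airy stress function identically satisfy the equilibrium equations. The small symbolic check below (an illustration, not code from the paper) makes that point concrete for an arbitrary polynomial Airy function.

      import sympy as sp

      x, y = sp.symbols('x y')

      # Any sufficiently smooth Airy stress function; a polynomial chosen as an example
      phi = 3 * x**3 * y - 2 * x * y**3 + x**2 * y**2

      # Plane stress components derived from the Airy function
      sxx = sp.diff(phi, y, 2)
      syy = sp.diff(phi, x, 2)
      sxy = -sp.diff(phi, x, 1, y, 1)

      # Equilibrium without body forces: both residuals reduce to zero identically
      eq_x = sp.simplify(sp.diff(sxx, x) + sp.diff(sxy, y))
      eq_y = sp.simplify(sp.diff(sxy, x) + sp.diff(syy, y))
      print(eq_x, eq_y)   # 0 0

    The same cancellation holds for any twice-differentiable phi, which is why stress fields interpolated this way satisfy equilibrium by construction.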

  13. Development and Validation of New Spectrophotometric Methods to Determine Enrofloxacin in Pharmaceuticals

    NASA Astrophysics Data System (ADS)

    Rajendraprasad, N.; Basavaiah, K.

    2015-07-01

    Four spectrophotometric methods, based on oxidation with cerium(IV), are investigated and developed to determine EFX in pure form and in dosage forms. The first and second methods (A and B) are direct: after the oxidation of EFX with cerium(IV) in acid medium, the absorbance of the reduced and unreacted oxidant is measured at 275 and 320 nm, respectively. In the third (C) and fourth (D) methods, after the reaction between EFX and the oxidant is complete, the surplus oxidant is treated with either N-phenylanthranilic acid (NPA) or Alizarin Red S (ARS) dye and the absorbance of the oxidized NPA or ARS is measured at 440 or 420 nm. The methods showed good linearity over the concentration ranges of 0.5-5.0, 1.25-12.5, 10.0-100.0, and 6.0-60.0 μg/mL for methods A, B, C, and D, respectively, with apparent molar absorptivity values of 4.42 × 10^4, 8.7 × 10^3, 9.31 × 10^2, and 2.28 × 10^3 L/(mol·cm). The limits of detection (LOD), quantification (LOQ), and Sandell's sensitivity values and other validation results are also reported. The proposed methods are successfully applied to determine EFX in pure form and in dosage forms.
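
    As a hedged illustration of how such figures are used (not code or data from the paper), the sketch below converts a measured absorbance to a concentration via the Beer-Lambert law, assuming a 1 cm path length and the molar mass of enrofloxacin, and computes ICH-style LOD/LOQ estimates from placeholder calibration statistics.

      # Assumed constants (not stated in the abstract)
      EFX_MOLAR_MASS = 359.4   # g/mol, approximate molar mass of enrofloxacin
      PATH_LENGTH_CM = 1.0     # assumed cuvette path length

      def concentration_ug_per_ml(absorbance, molar_absorptivity):
          """Beer-Lambert: A = eps * c * l  ->  c in mol/L, converted to ug/mL."""
          c_mol_per_l = absorbance / (molar_absorptivity * PATH_LENGTH_CM)
          return c_mol_per_l * EFX_MOLAR_MASS * 1000.0   # g/L -> ug/mL

      def lod_loq(sd_blank_response, calibration_slope):
          """ICH Q2(R1)-style estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
          return (3.3 * sd_blank_response / calibration_slope,
                  10.0 * sd_blank_response / calibration_slope)

      # Example with placeholder values for a method with eps ~ 4.42e4 L/(mol*cm)
      print(concentration_ug_per_ml(0.45, 4.42e4))            # ~3.7 ug/mL
      print(lod_loq(sd_blank_response=0.002, calibration_slope=0.12))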

  14. Development of three-dimensional optical correction method for reconstruction of flow field in droplet

    NASA Astrophysics Data System (ADS)

    Ko, Han Seo; Gim, Yeonghyeon; Kang, Seung-Hwan

    2015-11-01

    A three-dimensional optical correction method was developed to reconstruct droplet-based flow fields. For a numerical simulation, synthetic phantoms were reconstructed by a simultaneous multiplicative algebraic reconstruction technique using three projection images positioned at an offset angle of 45°. If the synthetic phantom lies inside a conical object whose refractive index differs from that of the atmosphere, the image can be distorted because light is refracted at the surface of the conical object. Thus, the direction of the projection ray was replaced by the refracted ray arising at the surface of the conical object. To validate the method in the presence of this distortion effect, reconstruction results of the developed method were compared with the original phantom; the reconstruction with the correction showed a smaller error than that without it. The method was then applied to a Taylor cone, formed by a high voltage between a droplet and a substrate, to reconstruct the three-dimensional flow fields and analyze the characteristics of the droplet. This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Korean government (MEST) (No. 2013R1A2A2A01068653).
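
    As a hedged illustration of the correction idea (replacing a projection ray with its refracted direction at the interface), the sketch below applies the vector form of Snell's law; the refractive indices and geometry are placeholder values, not those of the reported experiment.

      import numpy as np

      def refract(incident, normal, n1, n2):
          """Refracted unit direction from the vector form of Snell's law.

          incident : unit vector of the incoming ray
          normal   : unit surface normal pointing toward the incoming medium
          n1, n2   : refractive indices of the incoming and transmitting media
          Returns None on total internal reflection.
          """
          i = incident / np.linalg.norm(incident)
          n = normal / np.linalg.norm(normal)
          eta = n1 / n2
          cos_i = -np.dot(n, i)
          sin2_t = eta**2 * (1.0 - cos_i**2)
          if sin2_t > 1.0:
              return None                      # total internal reflection
          cos_t = np.sqrt(1.0 - sin2_t)
          return eta * i + (eta * cos_i - cos_t) * n

      # Air-to-liquid example with placeholder indices (n_air = 1.0, n_liquid = 1.33)
      ray = np.array([0.0, -np.sin(np.radians(30)), -np.cos(np.radians(30))])
      surface_normal = np.array([0.0, 0.0, 1.0])
      print(refract(ray, surface_normal, 1.0, 1.33))   # bends toward the normal, ~22 deg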

  15. Development of Decision Making Algorithm for Control of Sea Cargo Containers by "Tagged" Neutron Method

    NASA Astrophysics Data System (ADS)

    Anan'ev, A. A.; Belichenko, S. G.; Bogolyubov, E. P.; Bochkarev, O. V.; Petrov, E. V.; Polishchuk, A. M.; Udaltsov, A. Yu.

    2009-12-01

    Nowadays, in Russia and abroad, several groups of scientists are engaged in the development of systems based on the "tagged" neutron method (API method), intended for the detection of dangerous materials, including high explosives (HE). Particular attention is paid to the possibility of detecting dangerous objects inside a sea cargo container. The energy gamma-spectrum registered from the object under inspection is used to determine the oxygen/carbon and nitrogen/carbon chemical ratios, according to which a dangerous object is distinguished from a non-dangerous one. The material of a filled container, however, gives rise to additional effects: rescattering and moderation of the 14 MeV primary neutrons of the generator, and attenuation of the secondary gamma-radiation from reactions of inelastic neutron scattering on the objects under inspection. These effects distort the energy gamma-response from the examined object and therefore prevent correct recognition of the chemical ratios. These difficulties are taken into account in the analytical method presented in the paper. The method has been validated against experimental data obtained with the system for HE detection in sea cargo, based on the API method and developed at VNIIA. The influence of shielding materials on the results of HE detection and identification is considered; wood and iron were used as shielding materials. Results of applying the method to the analysis of experimental data on HE simulator measurements (tetryl, trotyl, hexogen) are presented.

  16. The role of the Standard Days Method in modern family planning services in developing countries

    PubMed Central

    2012-01-01

    Background The mere availability of family planning (FP) services is not sufficient to improve reproductive health; services must also be of adequate quality. The introduction of new contraceptive methods is a means of improving quality of care. The Standard Days Method (SDM) is a new fertility-awareness-based contraceptive method that has been successfully added to reproductive health care services around the world. Content Framed by the Bruce-Jain quality-of-care paradigm, this paper describes how the introduction of SDM in developing country settings can improve the six elements of quality while contributing to the intrinsic variety of available methods. SDM meets the needs of women and couples who opt not to use other modern methods. SDM providers are sensitised to the potential of fertility-awareness-based contraception as an appropriate choice for these clients. SDM requires the involvement of both partners and thus offers a natural entry point for providers to further explore partner communication, intimate partner violence, condoms, and HIV/STIs. Conclusion SDM introduction broadens the range of FP methods available to couples in developing countries. SDM counselling presents an opportunity for FP providers to discuss important interpersonal and reproductive health issues with potential users. PMID:22681177

  17. DEVELOPMENT OF DECISION MAKING ALGORITHM FOR CONTROL OF SEA CARGO CONTAINERS BY 'TAGGED' NEUTRON METHOD

    SciTech Connect

    Anan'ev, A. A.; Belichenko, S. G.; Bogolyubov, E. P.; Bochkarev, O. V.; Petrov, E. V.; Polishchuk, A. M.; Udaltsov, A. Yu.

    2009-12-02

    Nowadays, in Russia and abroad, several groups of scientists are engaged in the development of systems based on the 'tagged' neutron method (API method), intended for the detection of dangerous materials, including high explosives (HE). Particular attention is paid to the possibility of detecting dangerous objects inside a sea cargo container. The energy gamma-spectrum registered from the object under inspection is used to determine the oxygen/carbon and nitrogen/carbon chemical ratios, according to which a dangerous object is distinguished from a non-dangerous one. The material of a filled container, however, gives rise to additional effects: rescattering and moderation of the 14 MeV primary neutrons of the generator, and attenuation of the secondary gamma-radiation from reactions of inelastic neutron scattering on the objects under inspection. These effects distort the energy gamma-response from the examined object and therefore prevent correct recognition of the chemical ratios. These difficulties are taken into account in the analytical method presented in the paper. The method has been validated against experimental data obtained with the system for HE detection in sea cargo, based on the API method and developed at VNIIA. The influence of shielding materials on the results of HE detection and identification is considered; wood and iron were used as shielding materials. Results of applying the method to the analysis of experimental data on HE simulator measurements (tetryl, trotyl, hexogen) are presented.

  18. Diffuse reflectance near infrared-chemometric methods development and validation of amoxicillin capsule formulations

    PubMed Central

    Khan, Ahmed Nawaz; Khar, Roop Krishen; Ajayakumar, P. V.

    2016-01-01

    Objective: The aim of the present study was to establish near infrared-chemometric methods that could be effectively used for quality profiling through identification and quantification of amoxicillin (AMOX) in formulated capsules similar to commercial products. These methods were modeled so that a large number of market products could be evaluated easily and quickly. Materials and Methods: A Thermo Scientific Antaris II near infrared analyzer with TQ Analyst chemometric software was used for the development and validation of the identification and quantification models. Several AMOX formulations were composed with four excipients: microcrystalline cellulose, magnesium stearate, croscarmellose sodium, and colloidal silicon dioxide. Development included quadratic mixture formulation design, near infrared spectrum acquisition, spectral pretreatment, and outlier detection. The developed methods were validated in terms of specificity, accuracy, precision, linearity, and robustness, following the guidelines prescribed by the International Conference on Harmonization (ICH) and the European Medicines Agency (EMA). Results: In diffuse reflectance mode, an identification model based on discriminant analysis was successfully built with 76 formulations; the same samples were also used for quantitative analysis using a partial least squares algorithm with four latent variables, giving a correlation coefficient of 0.9937, 2.17% root mean square error of calibration (RMSEC), 2.38% root mean square error of prediction (RMSEP), and 2.43% root mean square error of cross-validation (RMSECV). Conclusion: The proposed models established a good relationship between the spectral information and AMOX identity as well as content. The results demonstrate the performance of the proposed models, which offer an alternative to the well-established high-performance liquid chromatography method for AMOX capsule evaluation. Ultimately three commercial products were successfully evaluated using the developed
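
    As a hedged illustration of the quantitative step (a partial least squares calibration with four latent variables and RMSEC/RMSEP figures of merit), the Python sketch below runs the same kind of workflow on synthetic spectra; it uses scikit-learn's PLSRegression and is not the authors' TQ Analyst model or data.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(1)

      # Synthetic "spectra": 76 samples x 200 wavelength points, content as % of label claim
      n_samples, n_wavelengths = 76, 200
      content = rng.uniform(70, 130, n_samples)
      pure_component = rng.normal(size=n_wavelengths)
      X = np.outer(content, pure_component) + rng.normal(scale=5.0, size=(n_samples, n_wavelengths))

      X_cal, X_val, y_cal, y_val = train_test_split(X, content, test_size=0.3, random_state=0)

      pls = PLSRegression(n_components=4)        # four latent variables, as in the abstract
      pls.fit(X_cal, y_cal)

      def rmse(y_true, y_pred):
          return float(np.sqrt(np.mean((y_true - np.ravel(y_pred)) ** 2)))

      print("RMSEC:", rmse(y_cal, pls.predict(X_cal)))   # calibration error
      print("RMSEP:", rmse(y_val, pls.predict(X_val)))   # prediction error on held-out samples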

  19. Development of unstructured grid methods for steady and unsteady aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Batina, John T.

    1990-01-01

    The current status of the development of unstructured grid methods in the Unsteady Aerodynamics Branch at NASA-Langley is described. These methods are being developed for steady and unsteady aerodynamic applications. The flow solvers that were developed for the solution of the unsteady Euler and Navier-Stokes equations are highlighted and selected results are given which demonstrate various features of the capability. The results demonstrate 2-D and 3-D applications for both steady and unsteady flows. Comparisons are also made with solutions obtained using a structured grid code and with experimental data to determine the accuracy of the unstructured grid methodology. These comparisons show good agreement which thus verifies the accuracy.

  20. The development of an automatic method of safety monitoring at Pelican crossings.

    PubMed

    Malkhamah, Siti; Tight, Miles; Montgomery, Frank

    2005-09-01

    This paper reports on the development of a method for automatic monitoring of safety at Pelican crossings. Historically, safety monitoring has typically been carried out using accident data, though given the rarity of such events it is difficult to quickly detect change in accident risk at a particular site. An alternative indicator sometimes used is traffic conflicts, though this data can be time consuming and expensive to collect. The method developed in this paper uses vehicle speeds and decelerations collected using standard in situ loops and tubes, to determine conflicts using vehicle decelerations and to assess the possibility of automatic safety monitoring at Pelican crossings. Information on signal settings, driver crossing behaviour, pedestrian crossing behaviour and delays, and pedestrian-vehicle conflicts was collected synchronously through a combination of direct observation, video analysis, and analysis of output from tube and loop detectors. Models were developed to predict safety, i.e. pedestrian-vehicle conflicts using vehicle speeds and decelerations. PMID:15919048
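
    As a hedged illustration of a deceleration-based conflict indicator of the kind described (not the authors' actual model), the sketch below computes the constant deceleration a vehicle would need, given its detected speed and distance upstream of the crossing, and flags a potential conflict when that exceeds a placeholder braking threshold.

      def required_deceleration(speed_m_s, distance_m):
          """Constant deceleration needed to stop before the crossing: a = v^2 / (2d)."""
          if distance_m <= 0:
              raise ValueError("vehicle has already reached the crossing")
          return speed_m_s ** 2 / (2.0 * distance_m)

      def is_potential_conflict(speed_m_s, distance_m, threshold_m_s2=3.0):
          """Flag a conflict when the required deceleration exceeds an assumed threshold."""
          return required_deceleration(speed_m_s, distance_m) > threshold_m_s2

      # Vehicle detected at 13.9 m/s (~50 km/h), 25 m upstream while pedestrians cross
      print(required_deceleration(13.9, 25.0))   # ~3.86 m/s^2
      print(is_potential_conflict(13.9, 25.0))   # True with the assumed 3 m/s^2 threshold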

  1. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    PubMed

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected in monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database for papaya genome sequence. Transgenic construct- and event-specific sequences were identified as a GM papaya developed to resist infection from a Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled identifying unknown transgenic construct- and event-specific sequences in GM papaya and development of a reliable method for detecting them in papaya food commodities. PMID:27006240

  2. Modeling of the evolution of steppe chernozems and development of the method of pedogenetic chronology

    NASA Astrophysics Data System (ADS)

    Lisetskii, F. N.; Stolba, V. F.; Goleusov, P. V.

    2016-08-01

    Geoarchaeological methods were used to study chronosequences of surface soils in the steppe zone and to trace soil evolution during the Late Holocene in northwestern Crimea. It was found that the morphological and functional "maturity" of the humus horizons in steppe chernozems of the Late Holocene was reached in about 1600-1800 yrs. After this, their development decelerated irreversibly. The maximum concentration of trace elements accumulated in these horizons in the course of pedogenesis was reached in 1400 yrs. A new method of pedogenetic chronology based on the model chronofunction of the development of irreversible results of pedogenesis over time is suggested. Original pedochronological data and growth functions—the most suitable models for simulating pedogenesis over the past three thousand years—suggest that the development of morphological features of soil as an organomineral natural body follows growth patterns established for biological systems.

  3. Development of unstructured grid methods for steady and unsteady aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Batina, John T.

    1990-01-01

    The current status of the development of unstructured grid methods in the Unsteady Aerodynamic Branch at NASA-Langley is described. These methods are being developed for steady and unsteady aerodynamic applications. The flow solvers that were developed for the solution of the unsteady Euler and Navier-Stokes equations are highlighted and selected results are given which demonstrate various features of the capability. The results demonstrate 2-D and 3-D applications for both steady and unsteady flows. Comparisons are also made with solutions obtained using a structured grid code and with experimental data to determine the accuracy of the unstructured grid methodology. These comparisons show good agreement which thus verifies the accuracy.

  4. Development and Validation of Liquid Chromatographic Method for Estimation of Naringin in Nanoformulation

    PubMed Central

    Musmade, Kranti P.; Trilok, M.; Dengale, Swapnil J.; Bhat, Krishnamurthy; Reddy, M. S.; Musmade, Prashant B.; Udupa, N.

    2014-01-01

    A simple, precise, accurate, rapid, and sensitive reverse phase high performance liquid chromatography (RP-HPLC) method with UV detection has been developed and validated for quantification of naringin (NAR) in novel pharmaceutical formulation. NAR is a polyphenolic flavonoid present in most of the citrus plants having variety of pharmacological activities. Method optimization was carried out by considering the various parameters such as effect of pH and column. The analyte was separated by employing a C18 (250.0 × 4.6 mm, 5 μm) column at ambient temperature in isocratic conditions using phosphate buffer pH 3.5: acetonitrile (75 : 25% v/v) as mobile phase pumped at a flow rate of 1.0 mL/min. UV detection was carried out at 282 nm. The developed method was validated according to ICH guidelines Q2(R1). The method was found to be precise and accurate on statistical evaluation with a linearity range of 0.1 to 20.0 μg/mL for NAR. The intra- and interday precision studies showed good reproducibility with coefficients of variation (CV) less than 1.0%. The mean recovery of NAR was found to be 99.33 ± 0.16%. The proposed method was found to be highly accurate, sensitive, and robust. The proposed liquid chromatographic method was successfully employed for the routine analysis of said compound in developed novel nanopharmaceuticals. The presence of excipients did not show any interference on the determination of NAR, indicating method specificity. PMID:26556205

  5. Capillary isoelectric focusing method development and validation for investigation of recombinant therapeutic monoclonal antibody.

    PubMed

    Suba, Dávid; Urbányi, Zoltán; Salgó, András

    2015-10-10

    Capillary isoelectric focusing (cIEF) is a basic and highly accurate routine analytical tool for proving the identity of protein drugs in quality control (QC) and release tests in the biopharmaceutical industry. Some "out-of-the-box" applications are commercially available that provide easy and rapid isoelectric focusing solutions for investigating monoclonal antibody drug proteins; however, using these kits in routine testing is costly. A capillary isoelectric focusing method was developed and validated for identification testing of monoclonal antibody drug products with isoelectric points between 7.0 and 9.0. The method provides a good pH gradient for internal calibration (R^2 > 0.99) and good resolution between all of the isoform peaks (R = 2), while minimizing the time and complexity of sample preparation (no urea or salt used). The method is highly reproducible and is suitable for validation and method transfer to any QC laboratory. Another advantage is that it operates with commercially available chemicals that can be purchased from any supplier. Interaction with the capillary walls was minimized (to avoid precipitation and adsorption as far as possible), and synthetic small-molecule isoelectric markers were used instead of peptide- or protein-based markers. The developed method was validated according to the recent ICH guideline Q2(R1). Relative standard deviation results were below 0.2% for isoelectric points and below 4% for the normalized migration times. The method is robust to buffer components with different lot numbers and to neutral capillaries with different types of inner coating. The fluorocarbon-coated column was chosen for cost-effectiveness reasons. PMID:26025812
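
    As a hedged illustration of the internal calibration step described (synthetic pI markers defining a linear pH gradient), the sketch below fits marker pI against migration time and interpolates the pI of a sample peak; all marker values and times are placeholders, not the paper's data.

      import numpy as np

      # Placeholder migration times (min) and pI values of synthetic markers
      marker_times = np.array([8.2, 11.5, 14.9, 18.6])
      marker_pis   = np.array([10.0, 9.0, 8.0, 7.0])

      # Linear pH gradient: fit pI as a function of migration time (R^2 should be > 0.99)
      slope, intercept = np.polyfit(marker_times, marker_pis, 1)
      fit = slope * marker_times + intercept
      r_squared = 1.0 - (np.sum((marker_pis - fit) ** 2)
                         / np.sum((marker_pis - marker_pis.mean()) ** 2))

      def estimate_pi(migration_time_min):
          """Interpolate the isoelectric point of a sample peak from the marker calibration."""
          return slope * migration_time_min + intercept

      print(round(r_squared, 4))
      print(round(estimate_pi(13.0), 2))   # pI of a hypothetical mAb main isoform peak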

  6. Development of a Hybrid RANS/LES Method for Turbulent Mixing Layers

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli

    2001-01-01

    Significant research has been underway for several years in NASA Glenn Research Center's nozzle branch to develop advanced computational methods for simulating turbulent flows in exhaust nozzles. The primary efforts of this research have concentrated on improving our ability to calculate the turbulent mixing layers that dominate flows both in the exhaust systems of modern-day aircraft and in those of hypersonic vehicles under development. As part of these efforts, a hybrid numerical method was recently developed to simulate such turbulent mixing layers. The method developed here is intended for configurations in which a dominant structural feature provides an unsteady mechanism to drive the turbulent development in the mixing layer. Interest in Large Eddy Simulation (LES) methods have increased in recent years, but applying an LES method to calculate the wide range of turbulent scales from small eddies in the wall-bounded regions to large eddies in the mixing region is not yet possible with current computers. As a result, the hybrid method developed here uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall-bounded regions entering a mixing section and uses a LES procedure to calculate the mixing-dominated regions. A numerical technique was developed to enable the use of the hybrid RANS-LES method on stretched, non-Cartesian grids. With this technique, closure for the RANS equations is obtained by using the Cebeci-Smith algebraic turbulence model in conjunction with the wall-function approach of Ota and Goldberg. The LES equations are closed using the Smagorinsky subgrid scale model. Although the function of the Cebeci-Smith model to replace all of the turbulent stresses is quite different from that of the Smagorinsky subgrid model, which only replaces the small subgrid turbulent stresses, both are eddy viscosity models and both are derived at least in part from mixing-length theory. The similar formulation of these two models enables the RANS
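
    For reference, the Smagorinsky subgrid-scale model named in the abstract closes the LES equations with an eddy viscosity of the form nu_t = (C_s * Delta)^2 * |S|, where |S| = sqrt(2 S_ij S_ij) is the resolved strain-rate magnitude. The Python sketch below evaluates that expression for a given velocity-gradient tensor; the coefficient and filter width are placeholder values, and this is general background rather than the hybrid method's actual implementation.

      import numpy as np

      def smagorinsky_viscosity(grad_u, delta, c_s=0.17):
          """Subgrid eddy viscosity nu_t = (C_s * Delta)^2 * |S|.

          grad_u : 3x3 array of resolved velocity gradients du_i/dx_j
          delta  : filter width (e.g. cube root of the cell volume)
          c_s    : Smagorinsky coefficient (typical values ~0.1-0.2; 0.17 is a placeholder)
          """
          grad_u = np.asarray(grad_u, dtype=float)
          S = 0.5 * (grad_u + grad_u.T)                  # resolved strain-rate tensor
          S_mag = np.sqrt(2.0 * np.sum(S * S))           # |S|
          return (c_s * delta) ** 2 * S_mag

      # Example: simple shear du/dy = 100 1/s with a 1 mm filter width
      grad_u = [[0.0, 100.0, 0.0],
                [0.0,   0.0, 0.0],
                [0.0,   0.0, 0.0]]
      print(smagorinsky_viscosity(grad_u, delta=1e-3))   # eddy viscosity in m^2/s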

  7. Development of loop-mediated isothermal amplification methods for detecting Taylorella equigenitalis and Taylorella asinigenitalis

    PubMed Central

    KINOSHITA, Yuta; NIWA, Hidekazu; KATAYAMA, Yoshinari; HARIU, Kazuhisa

    2015-01-01

    Taylorella equigenitalis is a causative bacterium of contagious equine metritis (CEM), and Taylorella asinigenitalis is a species belonging to the genus Taylorella. The authors developed two loop-mediated isothermal amplification (LAMP) methods, Te-LAMP and Ta-LAMP, for detecting T. equigenitalis and T. asinigenitalis, respectively. Using experimentally spiked samples, Te-LAMP was as sensitive as a published semi-nested PCR method, and Ta-LAMP was more sensitive than conventional PCR. Multiplex LAMP worked well without nonspecific reactions, and the analytical sensitivities of multiplex LAMP in the spiked samples were almost equivalent to those of Te-LAMP and Ta-LAMP. Therefore, the LAMP methods are considered useful tools to detect T. equigenitalis and/or T. asinigenitalis, and preventive measures can be rapidly implemented if the occurrence of CEM is confirmed by the LAMP methods. PMID:25829868

  8. Development of an ellipse fitting method with which to analyse selected area electron diffraction patterns.

    PubMed

    Mitchell, D R G; Van den Berg, J A

    2016-01-01

    A software method has been developed which uses ellipse fitting to analyse electron diffraction patterns from polycrystalline materials. The method, which requires minimal user input, can determine the pattern centre and the diameter of diffraction rings with sub-pixel precision. This enables accurate crystallographic information to be obtained in a rapid and consistent manner. Since the method fits ellipses, it can detect, quantify and correct any elliptical distortion introduced by the imaging system. Distortion information derived from polycrystalline patterns as a function of camera length can be subsequently recalled and applied to single crystal patterns, resulting in improved precision and accuracy. The method has been implemented as a plugin for the DigitalMicrograph software by Gatan, and is freely available via the internet. PMID:26495808
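
    As a hedged illustration of the underlying idea (a generic algebraic least-squares conic fit, not the plugin's actual algorithm), the sketch below fits a general conic to ring points by SVD and recovers the pattern centre from the conic coefficients; the ring coordinates are synthetic.

      import numpy as np

      def fit_conic(x, y):
          """Algebraic least-squares fit of a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0."""
          D = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
          _, _, Vt = np.linalg.svd(D)
          return Vt[-1]                      # coefficients (a, b, c, d, e, f), up to scale

      def conic_centre(coeffs):
          """Centre of the conic: the point where the gradient of the quadratic form vanishes."""
          a, b, c, d, e, _ = coeffs
          A = np.array([[2 * a, b], [b, 2 * c]])
          return np.linalg.solve(A, [-d, -e])

      # Synthetic, slightly elliptical diffraction ring centred at (260, 250) with pixel noise
      theta = np.linspace(0, 2 * np.pi, 400)
      x = 260 + 100 * np.cos(theta) + np.random.default_rng(0).normal(0, 0.5, 400)
      y = 250 + 97 * np.sin(theta) + np.random.default_rng(1).normal(0, 0.5, 400)

      coeffs = fit_conic(x, y)
      print(conic_centre(coeffs))            # ~ [260, 250], recovered with sub-pixel precision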

  9. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  10. Method development and validation for pharmaceutical tablets analysis using transmission Raman spectroscopy.

    PubMed

    Li, Yi; Igne, Benoît; Drennen, James K; Anderson, Carl A

    2016-02-10

    The objective of the study is to demonstrate the development and validation of a transmission Raman spectroscopic method using the ICH-Q2 Guidance as a template. Specifically, Raman spectroscopy was used to determine niacinamide content in tablet cores. A 3-level, 2-factor full factorial design was utilized to generate a partial least-squares model for active pharmaceutical ingredient quantification. Validation of the transmission Raman model was focused on figures of merit from three independent batches manufactured at pilot scale. The resultant model statistics were evaluated along with the linearity, accuracy, precision and robustness assessments. Method specificity was demonstrated by accurate determination of niacinamide in the presence of niacin (an expected related substance). The method was demonstrated as fit for purpose and had the desirable characteristics of very short analysis times (∼2.5s per tablet). The resulting method was used for routine content uniformity analysis of single dosage units in a stability study. PMID:26656945

  11. [Microscopic diagnosis of amebiasis: an obsolete method but necessary in the developing world].

    PubMed

    Chacín-Bonilla, Leonor

    2011-12-01

    Molecular biology-based diagnosis offers the best approach to detect amebiasis, but remains impractical in clinical laboratories in the developing world. In these areas, microscopic diagnosis remains the routine method. It is imperative that a series of fresh stool specimens be examined, and the use of a concentration method should become a routine procedure. Permanent stained smears are the most critical and reliable diagnostic method for the microscopic detection of intestinal protozoa. If the direct or concentrated wet mounts were considered a preliminary examination, and the use of iron-hematoxylin stained smears became a routine procedure, many of the misdiagnoses that frequently occur could be avoided. The iron-hematoxylin stained preparation is the method of choice for the microscopic detection of E. histolytica/E. dispar and other intestinal protozoa. PMID:22523839

  12. A comparison of methods to estimate seasonal phenological development from BBCH scale recording.

    PubMed

    Cornelius, Christine; Petermeier, Hannes; Estrella, Nicole; Menzel, Annette

    2011-11-01

    The BBCH scale is a two-digit key of growth stages in plants that is based on standardised definitions of plant development stages. The extended BBCH scale, used in this paper, enables the coding of the entire development cycle of all mono- and dicotyledonous plants. Using this key, the frequency distribution of phenological stages was recorded which required a less intense sampling frequency. The onset dates of single events were later estimated from the frequency distribution of BBCH codes. The purpose of this study was to present four different methods from which those onset dates can be estimated. Furthermore, the effects of (1) a less detailed observation key and (2) changes in the sampling frequency on estimates of onset dates were assessed. For all analyses, phenological data from the entire development cycle of four grass species were used. Estimates of onset dates determined by Weighted Plant Development (WPD), Pooled pre-/post-Stage Development (PSD), Cumulative Stage Development (CSD) and Ordinal Logistic Regression (OLR) methods can all be used to determine the phenological progression of plants. Moreover, results show that a less detailed observation key still resulted in similar onset dates, unless more than two consecutive stages were omitted. Further results reveal that the simulation of a less intense sampling frequency had only small impacts on estimates of onset dates. Thus, especially in remote areas where an observation interval of a week is not feasible, estimates derived from the frequency distribution of BBCH codes appear to be appropriate. PMID:21479619
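
    As a hedged illustration of how onset dates can be estimated from recorded frequency distributions of BBCH codes, the sketch below computes a frequency-weighted mean stage per observation date and interpolates the date at which a target code is reached. This is only one plausible estimator in the spirit of a weighted-development index; it is not claimed to reproduce the authors' exact WPD, PSD, CSD, or OLR formulations, and the observation data are invented placeholders.

      import numpy as np

      # Day of year of each observation and the recorded frequency distribution of BBCH codes
      days = np.array([100, 107, 114, 121])
      observations = [
          {51: 0.7, 55: 0.3},          # day 100: mostly stage 51, some stage 55
          {55: 0.6, 59: 0.4},
          {59: 0.5, 61: 0.5},
          {61: 0.2, 65: 0.8},
      ]

      # Frequency-weighted mean BBCH code per date
      weighted_stage = np.array([sum(code * freq for code, freq in obs.items())
                                 for obs in observations])

      def onset_day(target_stage):
          """Linearly interpolate the day on which the weighted stage reaches the target."""
          return float(np.interp(target_stage, weighted_stage, days))

      print(weighted_stage)       # e.g. [52.2, 56.6, 60.0, 64.2]
      print(onset_day(61.0))      # estimated onset of BBCH 61 (beginning of flowering)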

  13. Impact of the emulsification-diffusion method on the development of pharmaceutical nanoparticles.

    PubMed

    Quintanar-Guerrero, David; Zambrano-Zaragoza, María de la Luz; Gutierrez-Cortez, Elsa; Mendoza-Munoz, Nestor

    2012-12-01

    Nanotechnology is having a profound impact in many scientific fields and has become one of the most important and exciting disciplines. Like all technological advances, nanotechnology has its own scientific basis with a broad interdisciplinary effect. We are witnessing an exponential growth of nanotechnology; a reflection of this is the marked increase in the number of patents, scientific papers, and specialized "nano" meetings and journals. The impact in the pharmaceutical area is related to the use of colloidal drug delivery systems as carriers for bioactive agents, in particular, nanoparticle technology. The term nanoparticles designates solid submicron particles formed of acceptable materials (e.g. polymers, lipids, etc.) containing an active substance. It includes both nanospheres (matricial systems) and nanocapsules (membrane systems). Knowledge of the nanoparticle preparation methods is a key issue for the formulator involved in drug-delivery research and development. In general, methods based on preformed polymers, in particular biodegradable polymers, are preferred due to their easy implementation and lower potential toxicity. One of the most widely used methods to prepare polymeric nanoparticles is emulsification-diffusion. This method has been discussed in some reviews that compile research works but appears in only a small number of patents. In this review, the emulsification-diffusion method is discussed from a technological point of view in order to show the operating conditions and formulation variables, using data extracted from recent patents and experimental works. The main idea is to provide the reader with a general guide for formulators to make decisions about the usefulness of this method to develop specific nanoparticulate systems. The first part of this review provides an overview of the emulsification-diffusion method to prepare polymeric nanoparticles, while the second part evaluates the influence of preparative variables on the

  14. Space-Time Conservation Element and Solution Element Method Being Developed

    NASA Technical Reports Server (NTRS)

    Chang, Sin-Chung; Himansu, Ananda; Jorgenson, Philip C. E.; Loh, Ching-Yuen; Wang, Xiao-Yen; Yu, Sheng-Tao

    1999-01-01

    The engineering research and design requirements of today pose great computer-simulation challenges to engineers and scientists who are called on to analyze phenomena in continuum mechanics. The future will bring even more daunting challenges, when increasingly complex phenomena must be analyzed with increased accuracy. Traditionally used numerical simulation methods have evolved to their present state by repeated incremental extensions to broaden their scope. They are reaching the limits of their applicability and will need to be radically revised, at the very least, to meet future simulation challenges. At the NASA Lewis Research Center, researchers have been developing a new numerical framework for solving conservation laws in continuum mechanics, namely, the Space-Time Conservation Element and Solution Element Method, or the CE/SE method. This method has been built from fundamentals and is not a modification of any previously existing method. It has been designed with generality, simplicity, robustness, and accuracy as cornerstones. The CE/SE method has thus far been applied in the fields of computational fluid dynamics, computational aeroacoustics, and computational electromagnetics. Computer programs based on the CE/SE method have been developed for calculating flows in one, two, and three spatial dimensions. Results have been obtained for numerous problems and phenomena, including various shock-tube problems, ZND detonation waves, an implosion and explosion problem, shocks over a forward-facing step, a blast wave discharging from a nozzle, various acoustic waves, and shock/acoustic-wave interactions. The method can clearly resolve shock/acoustic-wave interactions in which the magnitudes of the acoustic wave and the shock differ by up to six orders. In two-dimensional flows, the reflected shock is as crisp as the leading shock. CE/SE schemes are currently being used for advanced applications to jet and fan noise prediction and to chemically

  15. Methods for the guideline-based development of quality indicators--a systematic review

    PubMed Central

    2012-01-01

    Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067

  16. Development of correction methods for variable pinhole single-photon emission computed tomography

    NASA Astrophysics Data System (ADS)

    Bae, S.; Bae, J.; Lee, H.; Lee, K.

    2016-02-01

    We propose a novel pinhole collimator in which the pinhole shape can be changed in real-time, and a new single-photon emission computed tomography (SPECT) system that utilizes this variable pinhole (VP) collimator. The acceptance angle and distance between the collimator and the object of VP SPECT are varied so that the optimum value of the region-of-interest (ROI) can be obtained for each rotation angle. Because of these geometrical variations, new correction methods are required for image reconstruction. In this study, we developed two correction methods. The first is the sensitivity-correction algorithm, which minimizes the variation of a system matrix caused by varying the acceptance angle for each rotation angle. The second is the acquisition-time-correction method, which reduces the variation of uniformity caused by varying the distance between the collimator and the object for each rotation angle. A 3D maximum likelihood expectation maximization (MLEM) algorithm was applied to image reconstruction, and two digital phantoms were studied to evaluate the resolution and sensitivity of the images obtained using the proposed methods. The images obtained by using the proposed correction methods show higher uniformity and resolution than those obtained without using these methods. In particular, the results of the resolution phantom study show that hot rods (0.8-mm-diameter) can be clearly distinguished using the proposed correction methods. A quantitative analysis of the ROI phantom revealed that the mean square error (MSE) was 0.42 without the acquisition-time-correction method, and 0.04 with the acquisition-time-correction method. The MSEs of the resolution phantom without and with the acquisition-time-correction method were calculated as 55.14 and 14.69, respectively.
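
    The reconstruction described above combines a 3D MLEM algorithm with sensitivity and acquisition-time corrections. The sketch below is a minimal, generic MLEM iteration on a toy 2D problem with the usual sensitivity normalization; the VP SPECT geometry and the paper's specific correction terms are not modeled, and a random non-negative matrix stands in for the projector.

```python
# Minimal MLEM sketch (toy system matrix). The VP SPECT geometry and the
# paper's sensitivity/acquisition-time corrections are NOT reproduced here;
# a random non-negative system matrix stands in for the projector.
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_bins = 64, 96
A = rng.random((n_bins, n_pix))          # system matrix: image -> projections
x_true = rng.random(n_pix)               # toy activity distribution
y = rng.poisson(A @ x_true * 50.0)       # noisy measured projections

x = np.ones(n_pix)                       # uniform initial estimate
sens = A.T @ np.ones(n_bins)             # sensitivity image (backprojected ones)
for _ in range(50):
    proj = A @ x                                  # forward projection
    ratio = y / np.maximum(proj, 1e-12)           # measured / estimated
    x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # multiplicative MLEM update

print("relative shape error:",
      np.linalg.norm(x / x.sum() - x_true / x_true.sum()))
```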

  17. A Unified Development of Basis Reduction Methods for Rotor Blade Analysis

    NASA Technical Reports Server (NTRS)

    Ruzicka, Gene C.; Hodges, Dewey H.; Rutkowski, Michael (Technical Monitor)

    2001-01-01

    The axial foreshortening effect plays a key role in rotor blade dynamics, but approximating it accurately in reduced basis models has long posed a difficult problem for analysts. Recently, though, several methods have been shown to be effective in obtaining accurate, reduced basis models for rotor blades. These methods are the axial elongation method, the mixed finite element method, and the nonlinear normal mode method. The main objective of this paper is to demonstrate the close relationships among these methods, which are seemingly disparate at first glance. First, the difficulties inherent in obtaining reduced basis models of rotor blades are illustrated by examining the modal reduction accuracy of several blade analysis formulations. It is shown that classical, displacement-based finite elements are ill-suited for rotor blade analysis because they cannot accurately represent the axial strain in modal space, and that this problem may be solved by employing the axial force as a variable in the analysis. It is shown that the mixed finite element method is a convenient means for accomplishing this, and the derivation of a mixed finite element for rotor blade analysis is outlined. A shortcoming of the mixed finite element method is that it increases the number of variables in the analysis. It is demonstrated that this problem may be rectified by solving for the axial displacements in terms of the axial forces and the bending displacements. Effectively, this procedure constitutes a generalization of the widely used axial elongation method to blades of arbitrary topology. The procedure is developed first for a single element, and then extended to an arbitrary assemblage of elements of arbitrary type. Finally, it is shown that the generalized axial elongation method is essentially an approximate solution for an invariant manifold that can be used as the basis for a nonlinear normal mode.
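
    As a generic illustration of basis (modal) reduction only, and not of the paper's axial-elongation, mixed finite element, or nonlinear normal mode formulations, the sketch below projects an invented spring-mass chain onto its lowest vibration modes via Galerkin projection.

```python
# Generic modal (basis) reduction sketch on an invented spring-mass chain.
# This shows Galerkin projection onto a truncated modal basis only; it does
# not implement the axial-elongation or mixed finite element methods.
import numpy as np
from scipy.linalg import eigh

n = 10
M = np.eye(n)                                          # lumped masses
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # spring stiffness chain

# Solve K phi = w^2 M phi and keep the three lowest modes as the reduced basis.
w2, Phi = eigh(K, M)
Phi_r = Phi[:, :3]

M_r = Phi_r.T @ M @ Phi_r    # reduced mass matrix (identity if mass-normalized)
K_r = Phi_r.T @ K @ Phi_r    # reduced stiffness matrix (diagonal of kept w^2)

print("reduced natural frequencies:", np.sqrt(np.diag(K_r) / np.diag(M_r)))
print("full lowest frequencies:    ", np.sqrt(w2[:3]))
```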

  18. Development of a Practical Method to Detect Noroviruses Contamination in Composite Meals.

    PubMed

    Saito, Hiroyuki; Toho, Miho; Tanaka, Tomoyuki; Noda, Mamoru

    2015-09-01

    Various methods to detect foodborne viruses including norovirus (NoV) in contaminated food have been developed. However, a practical method suitable for routine examination that can be applied to the detection of NoVs in oily, fatty, or emulsive food has not been established. In this study, we developed a new extraction and concentration method for detecting NoVs in contaminated composite meals. We spiked NoV-GI.4 or -GII.4 stool suspension into potato salad and stir-fried noodles. The food samples were suspended in homogenizing buffer and centrifuged to obtain a food emulsion. Then, anti-NoV-GI.4 or anti-NoV-GII.4 rabbit serum raised against recombinant virus-like particles, or commercially available human gamma globulin, and Staphylococcus aureus fixed with formalin as a source of protein A were added to the food emulsion. NoV-IgG-protein A-containing bacterial complexes were collected by centrifugation, and viral RNA was extracted. The detection limits of NoV RNA were 10-35 copies/g food for spiked NoVs in potato salad and stir-fried noodles. Human gamma globulin could also concentrate other NoV genotypes as well as other foodborne viruses, including sapovirus, hepatitis A virus, and adenovirus. This newly developed method can be used to identify NoV contamination in composite foods and is also potentially applicable to other foodborne viruses. PMID:25796206

  19. Development of extraction method of pharmaceuticals and their occurrences found in Japanese wastewater treatment plants.

    PubMed

    Okuda, Takashi; Yamashita, Naoyuki; Tanaka, Hiroaki; Matsukawa, Hiroshi; Tanabe, Kaoru

    2009-07-01

    In this study, the occurrence of 66 PPCPs (pharmaceuticals and personal care products) in the liquid and solid phases of sewage sludge was elucidated. Extraction methods for the PPCPs from sludge were newly developed employing Pressurized Liquid Extraction (PLE) and Ultrasonic Solvent Extraction (USE). The combination of PLE using water (pH 2), PLE using methanol (pH 4), and USE using a methanol/water mixture (1/9, v/v, pH 11) was found most effective, as the total recoveries of most of the PPCPs were 40 to 130%. The developed extraction method, together with a previously developed method for liquid-phase analysis, was applied to a field survey at wastewater treatment plants (WWTPs) in Japan. Fifty-six compounds were detected in the primary sludge and 61 compounds in the excess sludge, at concentrations ranging from several ng/g to several microg/g. The solid-water distribution coefficient (log Kd, with Kd in L/kg) ranged from 0.9 (caffeine) to 3.7 (levofloxacin) for primary sludge and from 1.4 (sulpiride) to 4.3 (mefenamic acid) for excess sludge. PMID:19201472
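
    The solid-water distribution coefficient reported above is the ratio of the sludge-phase to the liquid-phase concentration. The sketch below computes Kd and log Kd from invented concentrations, using the unit convention that yields Kd in L/kg (ng/g in sludge over ng/L in water).

```python
# Hypothetical sketch: solid-water distribution coefficient Kd = C_solid / C_liquid
# and its log10. Concentrations below are invented for illustration only.
import math

c_solid_ng_per_g = 120.0      # concentration in sludge, ng/g (dry weight)
c_liquid_ng_per_L = 950.0     # concentration in the water phase, ng/L

# (ng/g) / (ng/L) gives L/g; multiply by 1000 g/kg to express Kd in L/kg
kd_L_per_kg = c_solid_ng_per_g / c_liquid_ng_per_L * 1000.0
print(f"Kd     = {kd_L_per_kg:.1f} L/kg")
print(f"log Kd = {math.log10(kd_L_per_kg):.2f}")
```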

  20. Development of the new physical method for real time spot weld quality evaluation using ultrasound

    NASA Astrophysics Data System (ADS)

    Chertov, Andriy M.

    Since the invention of resistance spot welding, manufacturers have been concerned about the quality assurance of the joints. One of the most promising directions in quality inspection is real-time ultrasonic nondestructive evaluation. In such a system, acoustic signals are sent through the spot weld during welding and then analyzed to characterize the quality of the joint. Many research groups are currently working to develop a reliable inspection method. In this dissertation, a new physical method of resistance spot weld quality monitoring is presented. It differs from all other ultrasonic methods in the physical principles of inspection. The multilayered structure of the spot weld, with varying physical properties, is investigated with short pulses of longitudinal ultrasonic waves. Unlike other methods, the developed technology works in reflection mode. The waves bring back information which, after careful analysis, can be used to evaluate the weld quality. The complex structure of the weldment modifies the waves in different ways, which makes it hard to accurately measure the physical properties of the weldment. The frequency-dependent attenuation of the sound, diffraction, and beam divergence all contribute to signal distortion. These factors are fully studied, and ways to minimize them are presented. After application of pattern recognition routines, the weld characteristics are submitted to a fuzzy logic algorithm, and the weld is characterized. The current level of system development allowed the installation of two prototype machines at one assembly plant. The technology is now under thorough evaluation for robustness and accuracy in an industrial environment.

  1. Research on Assessment Methods for Urban Public Transport Development in China

    PubMed Central

    Zou, Linghong; Guo, Hongwei

    2014-01-01

    In recent years, with the rapid increase in urban population, the urban travel demands in Chinese cities have been increasing dramatically. As a result, developing comprehensive urban transport systems has become an inevitable choice to meet the growing urban travel demands. In urban transport systems, public transport plays the leading role in promoting sustainable urban development. This paper aims to establish an assessment index system for the development level of urban public transport consisting of a target layer, a criterion layer, and an index layer. A review of the existing literature shows that the methods used to evaluate urban public transport structure are predominantly qualitative. To overcome this shortcoming, the fuzzy mathematics method is used to describe qualitative issues quantitatively, and AHP (analytic hierarchy process) is used to quantify experts' subjective judgments. The assessment model is established based on the fuzzy AHP. The weight of each index is determined through the AHP and the degree of membership of each index through the fuzzy assessment method to obtain the fuzzy synthetic assessment matrix. Finally, a case study is conducted to verify the rationality and practicability of the assessment system and the proposed assessment method. PMID:25530756
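
    The fuzzy synthetic assessment step described above composes AHP-derived index weights with a membership matrix. The sketch below shows that composition for an invented three-index, four-grade example; the weights are assumed to come from a separate AHP pairwise-comparison step that is not reproduced here.

```python
# Fuzzy synthetic evaluation sketch with invented numbers. Index weights are
# assumed to come from an AHP pairwise-comparison step (not shown); membership
# degrees per grade (poor/fair/good/excellent) are invented for illustration.
import numpy as np

w = np.array([0.5, 0.3, 0.2])          # AHP weights of 3 indices (sum to 1)
R = np.array([                         # membership of each index in 4 grades
    [0.1, 0.2, 0.4, 0.3],
    [0.0, 0.3, 0.5, 0.2],
    [0.2, 0.4, 0.3, 0.1],
])

B = w @ R                              # weighted-average composition operator
B /= B.sum()                           # normalize the fuzzy assessment vector
grades = ["poor", "fair", "good", "excellent"]
print(dict(zip(grades, np.round(B, 3))))
print("overall grade:", grades[int(np.argmax(B))])
```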

  2. Empirical methods for identifying specific peptide-protein interactions for smart reagent development

    NASA Astrophysics Data System (ADS)

    Kogot, Joshua M.; Sarkes, Deborah A.; Stratis-Cullum, Dimitra N.; Pellegrino, Paul M.

    2012-06-01

    The current state of the art in the development of antibody alternatives is fraught with difficulties including mass production, robustness, and overall cost of production. The isolation of synthetic alternatives using peptide libraries offers great potential for recognition elements that are more stable and have improved binding affinity and target specificity. Although recent advances in rapid and automated discovery and synthetic library engineering continue to show promise for this emerging science, there remains a critical need for an improved fundamental understanding of the mechanisms of recognition. To better understand these mechanisms, it is critical to be able to accurately assess binding between peptide reagents and protein targets. The development of empirical methods to analyze peptide-protein interactions is often overlooked, since it is often assumed that peptides can easily substitute for antibodies in antibody-derived immunoassays. The physico-chemical difference between peptides and antibodies represents a major challenge for developing peptides as capture or detection reagents in standard immunoassays. Peptide analysis presents a unique challenge, since the peptide has to be soluble, must be capable of target recognition, and must be capable of ELISA plate or SPR chip binding. First, incorporating a plate-binding, hydrophilic peptide fusion (PS-tag) improves both the solubility and the plate-binding capability in a direct peptide ELISA format. Second, affinity capillary electrophoresis (ACE) is presented as a solution-based affinity determination method that can be used to determine both association constants and binding kinetics.

  3. Development of a capillary electrophoresis method for the characterization of "palo azul" (Eysenhardtia polystachya).

    PubMed

    Salinas-Hernández, Pastora; López-Bermúdez, Francisco J; Rodríguez-Barrientos, Damaris; Ramírez-Silva, María Teresa; Romero-Romo, Mario A; Morales-Anzures, Fernando; Rojas-Hernández, Alberto

    2008-03-01

    The tree Eysenhardtia polystachya (Ortega) Sarg. is widely used within traditional Mexican medicine as an herbal remedy. Such popular practices constitute a sufficient basis for designing optimal analytical methods to determine the active principles of diverse medicinal plants. This has become essential for characterizing such products, for which it is fundamentally important to develop an efficient and reliable separation method. This work presents the results concerning the development and optimization of a novel CE method for the separation of components from water/ethanol (1:1) extracts of E. polystachya, using the following conditions, considered the best obtained: 10 mM phosphate buffer, 20 kV and pH 8.1 with detection at 214 nm, and 50 mM phosphate buffer, 12.5 kV and pH 8.1 with detection at 426 nm. The optimization takes into account the parameters associated with the resulting electropherograms, such as the number of peaks, the migration times, and the Deltat(m) of neighboring peaks. Under optimal conditions the intended separation was attained within 15 and 20 min for 214 and 426 nm, respectively. The characterization method developed was applied to the analysis of diverse extracts of E. polystachya. PMID:18266292

  4. Systemic Sclerosis Classification Criteria: Developing methods for multi-criteria decision analysis with 1000Minds

    PubMed Central

    Johnson, Sindhu R.; Naden, Raymond P.; Fransen, Jaap; van den Hoogen, Frank; Pope, Janet E.; Baron, Murray; Tyndall, Alan; Matucci-Cerinic, Marco; Denton, Christopher P.; Distler, Oliver; Gabrielli, Armando; van Laar, Jacob M.; Mayes, Maureen; Steen, Virginia; Seibold, James R.; Clements, Phillip; Medsger, Thomas A.; Carreira, Patricia E.; Riemekasten, Gabriela; Chung, Lorinda; Fessler, Barri J.; Merkel, Peter A.; Silver, Richard; Varga, John; Allanore, Yannick; Mueller-Ladner, Ulf; Vonk, Madelon C.; Walker, Ulrich A.; Cappelli, Susanna; Khanna, Dinesh

    2014-01-01

    Objective Classification criteria for systemic sclerosis (SSc) are being developed. The objectives were to: develop an instrument for collating case data and evaluate its sensibility; use forced-choice methods to reduce and weight criteria; and explore agreement between experts on the probability that cases were classified as SSc. Study Design and Setting A standardized instrument was tested for sensibility. The instrument was applied to 20 cases covering a range of probabilities that each had SSc. Experts rank-ordered cases from highest to lowest probability, reduced and weighted the criteria using forced-choice methods, and re-ranked the cases. Consistency in rankings was evaluated using intraclass correlation coefficients (ICC). Results Experts endorsed clarity (83%), comprehensibility (100%), and face and content validity (100%). Criteria were weighted (points): finger skin thickening (14–22), finger-tip lesions (9–21), friction rubs (21), finger flexion contractures (16), pulmonary fibrosis (14), SSc-related antibodies (15), Raynaud's phenomenon (13), calcinosis (12), pulmonary hypertension (11), renal crisis (11), telangiectasia (10), abnormal nailfold capillaries (10), esophageal dilation (7) and puffy fingers (5). The ICC across experts was 0.73 (95% CI 0.58, 0.86) and improved to 0.80 (95% CI 0.68, 0.90). Conclusions Using a sensible instrument and forced-choice methods, the number of criteria was reduced by 39% (23 to 14) and the criteria were weighted. Our methods reflect the rigors of measurement science and serve as a template for developing classification criteria. PMID:24721558

  5. Gestational age assessment by nurses in a developing country using the Ballard method, external criteria only.

    PubMed

    Verhoeff, F H; Milligan, P; Brabin, B J; Mlanga, S; Nakoma, V

    1997-12-01

    The aim of this study was to evaluate postnatal examination of the newborn by nurses in a developing country, using a modified Ballard method scoring for the six external criteria only (Ballard-ext). The applicability of gestational age estimates with the Ballard-ext was assessed by calculating their agreement with gestational age derived from the last menstrual period (LMP), fundal height, and the Dubowitz method. The smallest difference in gestational age and the narrowest limits of agreement were found between the Ballard-ext and the Dubowitz method. No reliable gestational age could be obtained from LMP or fundal height. At low gestational ages, the Ballard-ext tended to give lower gestational ages than the Dubowitz method. At an average gestational age of more than 251 days, the Ballard-ext gave higher values than the Dubowitz method. Both the Ballard-ext and the Dubowitz method identified 48% of low-birthweight babies as growth-retarded (gestational age ≥ 37 weeks). No significant difference in gestational age assessment of newborns between nurses was observed. The Ballard method, scoring for external criteria alone, compared favourably with the Dubowitz method. The test is simple to perform and can be reliably used routinely by nurses. PMID:9578793
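
    Agreement between the Ballard-ext and Dubowitz estimates is expressed above as mean differences and limits of agreement. The sketch below computes Bland-Altman 95% limits of agreement from invented paired gestational-age estimates; it illustrates the calculation only and uses none of the study's data.

```python
# Bland-Altman limits-of-agreement sketch with invented paired gestational
# ages (days) from two assessment methods; not the study's actual data.
import numpy as np

method_a = np.array([259, 266, 252, 273, 280, 245, 262, 270])  # e.g. Ballard-ext
method_b = np.array([261, 263, 255, 270, 282, 249, 260, 268])  # e.g. Dubowitz

diff = method_a - method_b
bias = diff.mean()                      # mean difference between methods
sd = diff.std(ddof=1)                   # SD of the differences
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"mean difference (bias): {bias:.1f} days")
print(f"95% limits of agreement: {loa_low:.1f} to {loa_high:.1f} days")
```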

  6. Development of a conceptual flight vehicle design weight estimation method library and documentation

    NASA Astrophysics Data System (ADS)

    Walker, Andrew S.

    The state of the art in estimating the volumetric size and mass of flight vehicles is held today by an elite group of engineers in the aerospace conceptual design industry. This is not a skill readily accessible or taught in academia. To estimate flight vehicle mass properties, many aerospace engineering students are encouraged to read the latest design textbooks, learn how to use a few basic statistical equations, and plunge into the details of parametric mass properties analysis. Specifications for, and a prototype of, a standardized engineering "tool-box" of conceptual and preliminary design weight estimation methods were developed to manage the growing and ever-changing body of weight estimation knowledge. This also bridges the gap in mass properties education for aerospace engineering students. The Weight Method Library will also serve as a living document for future aerospace students. This "tool-box" consists of a weight estimation method bibliography containing unclassified, open-source literature for the conceptual and preliminary flight vehicle design phases. Transport aircraft validation cases have been applied to each entry in the AVD Weight Method Library in order to provide a sense of context and applicability to each method. The weight methodology validation results indicate consensus and agreement among the individual methods. This generic specification of a method library will be applicable for use by other disciplines within the AVD Lab, post-graduate design labs, or engineering design professionals.

  7. The Challenge of Automated Change Detection: Developing a Method for the Updating of Land Parcels

    NASA Astrophysics Data System (ADS)

    Matikainen, L.; Karila, K.; Litkey, P.; Ahokas, E.; Munck, A.; Karjalainen, M.; Hyyppä, J.

    2012-07-01

    Development of change detection methods that are functional and reliable enough for operational work is still a demanding task. This article discusses automated change detection from the viewpoint of one case study: the Finnish Land Parcel Identification System (FLPIS). The objective of the study is to develop a change detection method that could be used as an aid in the updating of the FLPIS. The method is based on object-based interpretation, and it uses existing parcel boundaries and new aerial ortho images as input data. Rules for classifying field and non-field objects are defined automatically by using the classification tree method and training data. Additional, manually created rules are used to improve the results. Classification tests carried out during the development work suggest that real changes can be detected relatively well. According to a recent visual evaluation, 96% of changes larger than 100 m2 were detected, at least partly. The overall accuracy of the change detection results was 93% when compared with reference data pixel-by-pixel. On the other hand, there are also missing changes and numerous false alarms. The main challenges encountered in the method development include the wide diversity of agricultural fields and other land cover objects locally, across the country, and at different times of the spring and summer, variability in the digital numbers (DNs) of the aerial images, the different nature of visual and automatic interpretation, and the small percentage of the total field area that has really changed. These challenges and possible solutions are discussed in the article.
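
    The classification rules for field and non-field objects are learned from training data with a classification tree. The sketch below shows that step with scikit-learn's DecisionTreeClassifier on invented per-object features (mean band values and a texture measure); the study's actual features, training data, and additional manual rules are not reproduced.

```python
# Classification-tree sketch for labeling image objects as field / non-field.
# Feature names and training samples are invented for illustration only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
# invented per-object features: [mean_red, mean_nir, texture]
field     = rng.normal([0.10, 0.45, 0.05], 0.03, size=(50, 3))
non_field = rng.normal([0.25, 0.20, 0.15], 0.03, size=(50, 3))
X = np.vstack([field, non_field])
y = np.array([1] * 50 + [0] * 50)          # 1 = field, 0 = non-field

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["mean_red", "mean_nir", "texture"]))

new_object = [[0.12, 0.40, 0.06]]
print("predicted class:", "field" if tree.predict(new_object)[0] else "non-field")
```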

  8. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    SciTech Connect

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A.

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  9. A generic simulation cell method for developing extensible, efficient and readable parallel computational models

    NASA Astrophysics Data System (ADS)

    Honkonen, I.

    2015-03-01

    I present a method for developing extensible and modular computational models without sacrificing serial or parallel performance or source code readability. By using a generic simulation cell method I show that it is possible to combine several distinct computational models to run in the same computational grid without requiring modification of existing code. This is an advantage for the development and testing of, e.g., geoscientific software as each submodel can be developed and tested independently and subsequently used without modification in a more complex coupled program. An implementation of the generic simulation cell method presented here, generic simulation cell class (gensimcell), also includes support for parallel programming by allowing model developers to select which simulation variables of, e.g., a domain-decomposed model to transfer between processes via a Message Passing Interface (MPI) library. This allows the communication strategy of a program to be formalized by explicitly stating which variables must be transferred between processes for the correct functionality of each submodel and the entire program. The generic simulation cell class requires a C++ compiler that supports a version of the language standardized in 2011 (C++11). The code is available at https://github.com/nasailja/gensimcell for everyone to use, study, modify and redistribute; those who do are kindly requested to acknowledge and cite this work.

  10. Improved Method for Ex Ovo-Cultivation of Developing Chicken Embryos for Human Stem Cell Xenografts

    PubMed Central

    Schomann, Timo; Qunneis, Firas; Widera, Darius; Kaltschmidt, Christian; Kaltschmidt, Barbara

    2013-01-01

    The characterization of human stem cells for the usability in regenerative medicine is particularly based on investigations regarding their differentiation potential in vivo. In this regard, the chicken embryo model represents an ideal model organism. However, the access to the chicken embryo is only achievable by windowing the eggshell resulting in limited visibility and accessibility in subsequent experiments. On the contrary, ex ovo-culture systems avoid such negative side effects. Here, we present an improved ex ovo-cultivation method enabling the embryos to survive 13 days in vitro. Optimized cultivation of chicken embryos resulted in a normal development regarding their size and weight. Our ex ovo-approach closely resembles the development of chicken embryos in ovo, as demonstrated by properly developed nervous system, bones, and cartilage at expected time points. Finally, we investigated the usability of our method for trans-species transplantation of adult stem cells by injecting human neural crest-derived stem cells into late Hamburger and Hamilton stages (HH26–HH28/E5–E6) of ex ovo-incubated embryos. We demonstrated the integration of human cells allowing experimentally easy investigation of the differentiation potential in the proper developmental context. Taken together, this ex ovo-method supports the prolonged cultivation of properly developing chicken embryos enabling integration studies of xenografted mammalian stem cells at late developmental stages. PMID:23554818

  11. [Development and validation of event-specific quantitative PCR method for genetically modified maize LY038].

    PubMed

    Mano, Junichi; Masubuchi, Tomoko; Hatano, Shuko; Futo, Satoshi; Koiwa, Tomohiro; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Takabatake, Reona; Kitta, Kazumi

    2013-01-01

    In this article, we report a novel real-time PCR-based analytical method for quantitation of the GM maize event LY038. We designed LY038-specific and maize endogenous reference DNA-specific PCR amplifications. After confirming the specificity and linearity of the LY038-specific PCR amplification, we determined the conversion factor required to calculate the weight-based content of GM organism (GMO) in a multilaboratory evaluation. Finally, in order to validate the developed method, an interlaboratory collaborative trial according to the internationally harmonized guidelines was performed with blind DNA samples containing LY038 at the mixing levels of 0, 0.5, 1.0, 5.0 and 10.0%. The precision of the method was evaluated as the RSD of reproducibility (RSDR), and the values obtained were all less than 25%. The limit of quantitation of the method was judged to be 0.5% based on the definition of ISO 24276 guideline. The results from the collaborative trial suggested that the developed quantitative method would be suitable for practical testing of LY038 maize. PMID:23470871
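
    Event-specific quantitation of this kind converts the ratio of event-specific to endogenous-gene copy numbers into a weight-based GM content using the conversion factor. The sketch below shows that arithmetic with invented copy numbers and an invented conversion factor; it follows the general copy-number-ratio approach described above, not the validated assay's calibrants.

```python
# Sketch of weight-based GMO content from real-time PCR copy numbers.
# Copy numbers and the conversion factor below are invented for illustration.
event_copies = 1.8e3        # event-specific amplicon copies in the sample DNA
endogenous_copies = 4.0e4   # maize endogenous reference gene copies
cf = 0.45                   # conversion factor determined for the event/assay

copy_ratio = event_copies / endogenous_copies
gmo_percent = copy_ratio / cf * 100.0     # weight-based GM content (%)
print(f"copy-number ratio: {copy_ratio:.4f}")
print(f"GMO content: {gmo_percent:.2f} % (w/w)")
```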

  12. Data-gathering method for use in modeling energy research, development, and demonstration programs

    SciTech Connect

    Meyer, M.A.; Booker, J.M.; Cullingford, H.S.; Peaslee, A.T. Jr.

    1981-01-01

    The development and testing of a data-gathering method for use in a computer program designed to model energy research, development, and demonstration programs for decisionmakers are described. The data-gathering method consists of face-to-face interviews with the scientists working on the projects to be modeled by the computer program. The basic information gained from an interview includes time estimates for reaching certain project goals and the probability of achieving those goals within the times estimated. The interview method is based on decision analysis techniques. The Magnetic Fusion Energy program of the US Department of Energy was selected as the test case for this project. The data-gathering method was used at five fusion projects to determine whether it could meet its design criteria. Extensive statistical analysis was performed to learn how closely the experts' answers agreed, what factors were likely to enter into their estimates, and how their estimates corresponded to the officially scheduled dates and to the dates on which the project goals were actually achieved. The interview method was considered to have met its design criteria and to be a valid tool for planning.

  13. Development of complementary HPLC-DAD/APCI MS methods for chemical characterization of pharmaceutical packaging materials.

    PubMed

    Petruševski, V; Jolevska, S T; Ribarska, J T; Chachorovska, M; Petkovska, A; Ugarković, S

    2016-05-30

    The chemical characterization of plastics for pharmaceutical packaging has been subject to ever-increasing regulatory scrutiny, the reasons being that (a) plastic additives and degradation products can be extremely hazardous to patients' health (especially patients on chronic therapy) and (b) they offer no therapeutic or formulatory benefit whatsoever. The last decade has seen the issuing of several books, monographs and guidelines dealing with extractables and leachables; however, the amount of scientific work done so far is still fairly small (the majority of it performed by only a few research groups), with only a small number of methods published in the literature. This work focuses on developing a set of two complementary HPLC-DAD/APCI MS methods for the simultaneous separation, detection, identification and quantification of a wide variety of packaging additives and degradants, with the second method specifically targeting a group of compounds known as polymeric hindered amine light stabilizers (HALS), which are notoriously difficult to separate and analyze with standard analytical techniques. The methods are capable of detecting plastic additives present at low ppb concentrations in samples extracted in solvents with various polarities and pH values. Both methods were developed and optimized using system suitability mixtures comprising 9 additives commonly encountered in plastic materials, and their practical applicability was tested on a variety of extracts from low-density polyethylene (LDPE) and polypropylene (PP), where several additives were successfully separated, detected and identified. PMID:26966896

  14. Selection of reference standard during method development using the analytical hierarchy process.

    PubMed

    Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun

    2015-03-25

    A reference standard is critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are often not directly measurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standards during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia miltiorrhiza and its preparations by ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of that priority, the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to consider the benefits and risks of the alternatives comprehensively. It was an effective and practical tool for the selection of reference standards during method development. PMID:25636165
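
    The priorities referred to above come from the standard AHP procedure of deriving weights from pairwise-comparison matrices. The sketch below computes the principal-eigenvector priorities and the consistency ratio for an invented 3x3 comparison matrix; it is generic AHP, not the paper's six-criterion model.

```python
# Standard AHP sketch: principal-eigenvector priorities and consistency ratio
# for an invented 3x3 pairwise-comparison matrix (Saaty 1-9 scale).
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))               # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # priority vector (sums to 1)

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]            # Saaty's random index
print("priorities:", np.round(w, 3))
print("consistency ratio:", round(ci / ri, 3))
```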

  15. A method for studying decision-making by guideline development groups

    PubMed Central

    Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan

    2009-01-01

    Background Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best suited to capturing influences on GDG decision-making. Methods A research team comprising three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Results Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. Conclusion This method is currently being applied to study the meetings of three NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other healthcare and decision-making groups. PMID:19656366

  16. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    NASA Technical Reports Server (NTRS)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process-based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight OPtimization System) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused into a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost-estimating relationships) within ALCCA will also be carried out under this contract in order to gain insight into the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.
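
    The probabilistic version of ALCCA/FLOPS propagates uncertainty in cost-estimating relationships through to cost. As a generic illustration only (the CER form, coefficients, and ranges below are hypothetical, not ALCCA's), the sketch Monte Carlo samples the parameters of a weight-based development-cost relationship and reports percentiles of the resulting cost distribution.

```python
# Monte Carlo sketch of a probabilistic cost-estimating relationship (CER).
# The CER form, its coefficients, and their uncertainty ranges are all
# hypothetical; this only illustrates propagating CER uncertainty to cost.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
empty_weight_lb = 120_000.0                     # assumed vehicle empty weight

# hypothetical CER: cost [$M] = a * (weight in klb) ** b
a = rng.triangular(1.5, 2.0, 2.8, size=n)       # uncertain coefficient
b = rng.triangular(0.80, 0.90, 1.00, size=n)    # uncertain exponent
cost = a * (empty_weight_lb / 1000.0) ** b

p10, p50, p90 = np.percentile(cost, [10, 50, 90])
print(f"development cost [$M]: P10={p10:.0f}, P50={p50:.0f}, P90={p90:.0f}")
```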

  17. Development of TRU waste mobile analysis methods for RCRA-regulated metals

    SciTech Connect

    Mahan, C.A.; Villarreal, R.; Drake, L.; Figg, D.; Wayne, D.; Goldstein, S.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Glow-discharge mass spectrometry (GD-MS), laser-induced breakdown spectroscopy (LIBS), dc-arc atomic-emission spectroscopy (DC-ARC-AES), laser-ablation inductively-coupled-plasma mass spectrometry (LA-ICP-MS), and energy-dispersive x-ray fluorescence (EDXRF) were identified as potential solid-sample analytical techniques for mobile characterization of TRU waste. Each technology developers was provided with surrogate TRU waste samples in order to develop an analytical method. Following successful development of the analytical method, five performance evaluation samples were distributed to each of the researchers in a blind round-robin format. Results of the round robin were compared to known values and Transuranic Waste Characterization Program (TWCP) data quality objectives. Only two techniques, DC-ARC-AES and EDXRF, were able to complete the entire project. Methods development for GD-MS and LA-ICP-MS was halted due to the stand-down at the CMR facility. Results of the round-robin analysis are given for the EDXRF and DCARC-AES techniques. While DC-ARC-AES met several of the data quality objectives, the performance of the EDXRF technique by far surpassed the DC-ARC-AES technique. EDXRF is a simple, rugged, field portable instrument that appears to hold great promise for mobile characterization of TRU waste. The performance of this technique needs to be tested on real TRU samples in order to assess interferences from actinide constituents. In addition, mercury and beryllium analysis will require another analytical technique because the EDXRF method failed to meet the TWCP data quality objectives. Mercury analysis is easily accomplished on solid samples by cold vapor atomic fluorescence (CVAFS). Beryllium can be analyzed by any of a variety of emission techniques.

  18. Efficient method development strategy for challenging separation of pharmaceutical molecules using advanced chromatographic technologies.

    PubMed

    Xiao, Kang Ping; Xiong, Yuan; Liu, Fang Zhu; Rustum, Abu M

    2007-09-01

    In this paper, we describe a strategy that can be used to efficiently develop a high-performance liquid chromatography (HPLC) separation of challenging pharmaceutical molecules. This strategy involves use of advanced chromatographic technologies, such as a computer-assisted chromatographic method development tool (ChromSword) and an automated column switching system (LC Spiderling). This process significantly enhances the probability of achieving adequate separations and can be a large time saver for bench analytical scientists. In our study, the ChromSword was used for mobile phase screening and separation optimization, and the LC Spiderling was used to identify the most appropriate HPLC columns. For proof of concept, the analytes employed in this study are the structural epimers betamethylepoxide and alphamethylepoxide (also known as 16-beta methyl epoxide and 16-alpha methyl epoxide). Both of these compounds are used in the synthesis of various active pharmaceutical ingredients that are part of the steroid pharmaceutical products. While these molecules are relatively large in size and contain various polar functional groups and non-polar cyclic carbon chains, their structures differ only in the orientation of one methyl group. To our knowledge, there is no reported HPLC separation of these two molecules. A simple gradient method was quickly developed on a 5 cm YMC Hydrosphere C(18) column that separated betamethylepoxide and alphamethylepoxide in 10 min with a resolution factor of 3.0. This high resolution provided a true baseline separation even when the concentration ratio between these two epimers was 10,000:1. Although outside of the scope of this paper, stability-indicating assay and impurity profile methods for betamethylepoxide and for alphamethylepoxide have also been developed by our group based on a similar method development strategy. PMID:17628579
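
    The separation quality quoted above is expressed as a resolution factor. The sketch below evaluates the standard expression Rs = 2(t2 - t1)/(w1 + w2) from hypothetical retention times and baseline peak widths; the values are not those of the betamethylepoxide/alphamethylepoxide separation.

```python
# Chromatographic resolution sketch: Rs = 2 * (t2 - t1) / (w1 + w2),
# using hypothetical retention times and baseline peak widths (minutes).
t1, w1 = 6.8, 0.40    # first-eluting peak (hypothetical)
t2, w2 = 8.3, 0.45    # second-eluting peak (hypothetical)

rs = 2.0 * (t2 - t1) / (w1 + w2)
print(f"resolution factor Rs = {rs:.2f}")   # Rs >= 1.5 indicates baseline separation
```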

  19. Development and Validation of Spectrophotometric, Atomic Absorption and Kinetic Methods for Determination of Moxifloxacin Hydrochloride

    PubMed Central

    Abdellaziz, Lobna M.; Hosny, Mervat M.

    2011-01-01

    Three simple spectrophotometric and atomic absorption spectrometric methods were developed and validated for the determination of moxifloxacin HCl in pure form and in pharmaceutical formulations. Method (A) is a kinetic method based on the oxidation of moxifloxacin HCl by Fe3+ ion in the presence of 1,10-phenanthroline (o-phen). Method (B) describes spectrophotometric procedures for the determination of moxifloxacin HCl based on its ability to reduce Fe(III) to Fe(II), which is rapidly converted to the corresponding stable coloured complex after reacting with 2,2'-bipyridyl (bipy). The tris-complexes formed in methods (A) and (B) were carefully studied, and their absorbances were measured at 510 and 520 nm, respectively. Method (C) is based on the formation of an ion-pair associate between the drug and bismuth(III) tetraiodide in acidic medium, giving orange-red ion-pair associates. This associate can be quantitatively determined by three different procedures: the formed precipitate is either filtered off, dissolved in acetone and quantified spectrophotometrically at 462 nm (Procedure 1), or decomposed by hydrochloric acid with the bismuth content determined by direct atomic absorption spectrometry (Procedure 2); alternatively, the residual unreacted metal complex in the filtrate is determined through its metal content using an indirect atomic absorption spectrometric technique (Procedure 3). All the proposed methods were validated according to the International Conference on Harmonization (ICH) guidelines. The three proposed methods permit the determination of moxifloxacin HCl in the ranges of 0.8–6 and 0.8–4 for methods A and B, and 16–96, 16–96 and 16–72 for Procedures 1–3 of method C. The limits of detection and quantitation were calculated, and the precision of the methods was satisfactory; the values of the relative standard deviations did not exceed 2%. The proposed methods were successfully applied to determine the drug in its pharmaceutical
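
    The limits of detection and quantitation mentioned above can be estimated from calibration data using the ICH relationships LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma is the residual standard deviation of the regression and S its slope. The sketch below applies those relationships to an invented calibration set; it is not the paper's data.

```python
# ICH-style LOD/LOQ sketch from an invented calibration curve:
# LOD = 3.3 * sigma / slope, LOQ = 10 * sigma / slope, where sigma is the
# residual standard deviation of the regression and slope its sensitivity.
import numpy as np

conc = np.array([0.8, 1.6, 2.4, 3.2, 4.0, 4.8, 6.0])      # concentration units
signal = np.array([0.082, 0.161, 0.239, 0.322, 0.401, 0.484, 0.598])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                              # 2 fitted parameters

print(f"LOD = {3.3 * sigma / slope:.3f}, LOQ = {10 * sigma / slope:.3f} (conc. units)")
```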

  20. NASA Perspective on Requirements for Development of Advanced Methods Predicting Unsteady Aerodynamics and Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    2008-01-01

    Over the past three years, the National Aeronautics and Space Administration (NASA) has initiated design, development, and testing of a new human-rated space exploration system under the Constellation Program. Initial designs within the Constellation Program are scheduled to replace the present Space Shuttle, which is slated for retirement within the next three years. The development of vehicles for the Constellation system has encountered several unsteady aerodynamics challenges that have bearing on more traditional unsteady aerodynamic and aeroelastic analysis. This paper focuses on the synergy between the present NASA challenges and the ongoing challenges that have historically been the subject of research and method development. There are specific similarities between the flows required to be analyzed for the space exploration problems and those required for some of the more nonlinear unsteady aerodynamic and aeroelastic problems encountered on aircraft. The aggressive schedule, significant technical challenge, and high-priority status of the exploration system development are forcing engineers to implement existing tools and techniques in a design and application environment that significantly stretches the capability of their methods. While these methods afford users the ability to rapidly turn around designs and analyses, their aggressive implementation comes at a price. The relative immaturity of the techniques for specific flow problems, and the inexperience with their broad application, particularly to manned spacecraft flight systems, has resulted in the implementation of an extensive wind tunnel and flight test program to reduce uncertainty and improve the experience base in the application of these methods. This provides a unique opportunity for unsteady aerodynamics and aeroelastic method developers to test and evaluate new analysis techniques on problems with high potential for acquisition of test and even flight data against which they