Sample records for apply methods developed

  1. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    ERIC Educational Resources Information Center

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  2. [Development of selective determination methods for quinones with fluorescence and chemiluminescence detection and their application to environmental and biological samples].

    PubMed

    Kishikawa, Naoya

    2010-10-01

    Quinones play diverse roles as biological electron transporters, industrial products, and harmful environmental pollutants. Therefore, effective determination methods for quinones are required in many fields. This review describes the development of sensitive and selective determination methods for quinones based on several detection principles and their application to analyses of environmental, pharmaceutical and biological samples. Firstly, a fluorescence method was developed based on fluorogenic derivatization of quinones and applied to environmental analysis. Secondly, a luminol chemiluminescence method was developed based on the generation of reactive oxygen species through the redox cycle of quinones and applied to pharmaceutical analysis. Thirdly, a photo-induced chemiluminescence method was developed based on the formation of reactive oxygen species and a fluorophore or chemiluminescence enhancer by the photoreaction of quinones and applied to biological and environmental analyses.

  3. Development of quality assurance methods for epoxy graphite prepreg

    NASA Technical Reports Server (NTRS)

    Chen, J. S.; Hunter, A. B.

    1982-01-01

    Quality assurance methods for graphite epoxy prepregs were developed. Liquid chromatography, differential scanning calorimetry, and gel permeation chromatography were investigated. These methods were applied to a second prepreg system. The resin matrix formulation was correlated with mechanical properties. Dynamic mechanical analysis and fracture toughness methods were investigated. The chromatography and calorimetry techniques were all successfully developed as quality assurance methods for graphite epoxy prepregs. The liquid chromatography method was the most sensitive to changes in resin formulation. These methods were also successfully applied to the second prepreg system.

  4. Incorporating an Applied Economic Development Component into a Geography Curriculum.

    ERIC Educational Resources Information Center

    Kale, Steven R.

    1989-01-01

    Discusses how applied economic development has been integrated into the economic geography curriculum at Oregon State University (Corvallis). States that coursework in applied economic development should lead to greater understanding of the causes of economic change, the problems associated with growth or decline, and methods for achieving…

  5. Applied Cognitive Task Analysis (ACTA) Methodology

    DTIC Science & Technology

    1997-11-01

    experience-based cognitive skills. The primary goal of this project was to develop streamlined methods of Cognitive Task Analysis that would fill this need...We have made important progress in this direction. We have developed streamlined methods of Cognitive Task Analysis. Our evaluation study indicates...developed a CD-based stand-alone instructional package, which will make the Applied Cognitive Task Analysis (ACTA) tools widely accessible. A survey of the

  6. Portraits of Benvenuto Cellini and Anthropological Methods of Their Identification

    ERIC Educational Resources Information Center

    Nasobin, Oleg

    2016-01-01

    Modern methods of biometric identification are increasingly applied to attribute works of art. They build on anthropological methods developed in the 19th century. This article describes how successive anthropological methods were applied to the identification of Benvenuto Cellini's portraits. Objective comparison of…

  7. The transfer function method for gear system dynamics applied to conventional and minimum excitation gearing designs

    NASA Technical Reports Server (NTRS)

    Mark, W. D.

    1982-01-01

    A transfer function method for predicting the dynamic responses of gear systems with more than one gear mesh is developed and applied to the NASA Lewis four-square gear fatigue test apparatus. Methods for computing bearing-support force spectra and temporal histories of the total force transmitted by a gear mesh, the force transmitted by a single pair of teeth, and the maximum root stress in a single tooth are developed. Dynamic effects arising from other gear meshes in the system are included. A profile modification design method to minimize the vibration excitation arising from a pair of meshing gears is reviewed and extended. Families of tooth loading functions required for such designs are developed and examined for potential excitation of individual tooth vibrations. The profile modification design method is applied to a pair of test gears.

  8. 24 CFR 1000.54 - What procedures apply to complaints arising out of any of the methods of providing for Indian...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false What procedures apply to complaints arising out of any of the methods of providing for Indian preference? 1000.54 Section 1000.54 Housing and... ACTIVITIES General § 1000.54 What procedures apply to complaints arising out of any of the methods of...

  9. An Aural Learning Project: Assimilating Jazz Education Methods for Traditional Applied Pedagogy

    ERIC Educational Resources Information Center

    Gamso, Nancy M.

    2011-01-01

    The Aural Learning Project (ALP) was developed to incorporate jazz method components into the author's classical practice and her applied woodwind lesson curriculum. The primary objective was to place a more focused pedagogical emphasis on listening and hearing than is traditionally used in the classical applied curriculum. The components of the…

  10. Solid Phase Extraction (SPE) for Biodiesel Processing and Analysis

    DTIC Science & Technology

    2017-12-13

    ...sources. There are several methods that can be applied to the development of separation techniques that may replace necessary water wash steps in...biodiesel refinement. Unfortunately, the most common methods are poorly suited or face high costs when applied to diesel purification. Distillation is

  11. Development of new maskless manufacturing method for anti-reflection structure and application to large-area lens with curved surface

    NASA Astrophysics Data System (ADS)

    Yamamoto, Kazuya; Takaoka, Toshimitsu; Fukui, Hidetoshi; Haruta, Yasuyuki; Yamashita, Tomoya; Kitagawa, Seiichiro

    2016-03-01

    In general, thin-film coatings are widely applied to optical lens surfaces to provide an anti-reflection function. In the normal production process, the lens is first manufactured by molding, and anti-reflection is then added by thin-film coating. In recent years, instead of thin-film coating, sub-wavelength structures added to the surface of the molding die have been widely studied and developed to achieve anti-reflection performance. Applying a sub-wavelength structure makes the coating process unnecessary, which reduces man-hours and costs. In addition to the cost merit, there are technical advantages in this approach: the adhesion of a coating depends on the plastic material, so an anti-reflection function cannot be applied to an arbitrary surface, whereas a sub-wavelength structure solves both problems. Manufacturing methods for anti-reflection structures can be divided into two main types: one uses resist patterning, and the other is a maskless method that does not require patterning. What we have developed is a new maskless method that needs no resist patterning, can impart an anti-reflection structure to large-area and curved lens surfaces, and can be expected to apply to various market segments. We report the developed technique and the characteristics of production lenses.

  12. A method of calculating the ultimate strength of continuous beams

    NASA Technical Reports Server (NTRS)

    Newlin, J A; Trayer, George W

    1931-01-01

    The purpose of this study was to investigate the strength of continuous beams after the elastic limit has been passed. As a result, a method of calculation, which is applicable to maximum load conditions, has been developed. The method is simpler than the methods now in use and it applies properly to conditions where the present methods fail to apply.

  13. Applying Program Theory-Driven Approach to Design and Evaluate a Teacher Professional Development Program

    ERIC Educational Resources Information Center

    Lin, Su-ching; Wu, Ming-sui

    2016-01-01

    This study was the first year of a two-year project which applied a program theory-driven approach to evaluating the impact of teachers' professional development interventions on students' learning by using a mix of methods, qualitative inquiry, and quasi-experimental design. The current study was to show the results of using the method of…

  14. Isentropic Bulk Modulus: Development of a Federal Test Method

    DTIC Science & Technology

    2016-01-01

    ranging from 30-80 °C and applied pressures of 1,000-18,000 psi. This method has been applied successfully to aviation turbine fuels and diesel fuels composed of petroleum, synthetic

  15. Differential electrophoretic separation of cells and its effect on cell viability

    NASA Technical Reports Server (NTRS)

    Leise, E. M.; Lesane, F.

    1974-01-01

    An electrophoretic separation method was applied to the separation of cells. To determine the efficiency of the separation, it was necessary to apply existing methodology and develop new methods to assess the characteristics and functions of the separated subpopulations. Through appropriate application of the widely used isoelectric focusing procedure, a reproducible separation method was developed. Cells accumulated at defined pH and 70-80% remained viable. The cells were suitable for further biologic, biochemical and immunologic studies.

  16. A multispectral imaging approach for diagnostics of skin pathologies

    NASA Astrophysics Data System (ADS)

    Lihacova, Ilze; Derjabo, Aleksandrs; Spigulis, Janis

    2013-06-01

    A noninvasive multispectral imaging method was applied to the diagnostics of different skin pathologies such as nevus, basal cell carcinoma, and melanoma. A melanoma diagnostic parameter, developed using three spectral bands (540 nm, 650 nm and 950 nm), was calculated for nevus, melanoma and basal cell carcinoma. A simple multispectral diagnostic device was established and applied for skin assessment. The development and application of the multispectral diagnostics method are described further in this article.

  17. Developing an OD-Intervention Metric System with the Use of Applied Theory-Building Methodology: A Work/Life-Intervention Example

    ERIC Educational Resources Information Center

    Morris, Michael Lane; Storberg-Walker, Julia; McMillan, Heather S.

    2009-01-01

    This article presents a new model, generated through applied theory-building research methods, that helps human resource development (HRD) practitioners evaluate the return on investment (ROI) of organization development (OD) interventions. This model, called organization development human-capital accounting system (ODHCAS), identifies…

  18. RESIDENTIAL INDOOR EXPOSURES OF CHILDREN TO PESTICIDES FOLLOWING LAWN APPLICATIONS

    EPA Science Inventory

    Methods have been developed to estimate children's residential exposures to pesticide residues and applied in a small field study of indoor exposures resulting from the intrusion of lawn-applied herbicide into the home. Sampling methods included size-selective indoor air sampli...

  19. Air Quality Management Alternatives: United States Air Force Firefighter Training Facilities

    DTIC Science & Technology

    1988-01-01

    Pollution at LAX, JFK, and ORD," Impact of Aircraft Emissions on Air Quality in the Vicinity of Airports, Volume II, FAA-EE-80-09B, Federal...A methodology utilizing questionnaires, interviews, and site visits is developed and applied. This method enabled fire prevention and environmental management experts and professionals to provide data, opinions, and to...

  20. A simplified method of performance indicators development for epidemiological surveillance networks--application to the RESAPATH surveillance network.

    PubMed

    Sorbe, A; Chazel, M; Gay, E; Haenni, M; Madec, J-Y; Hendrikx, P

    2011-06-01

    Developing and calculating performance indicators allows the operation of an epidemiological surveillance network to be followed continuously. This is an internal evaluation method, implemented by the coordinators in collaboration with all the actors of the network. Its purpose is to detect weak points in order to optimize management. A method for the development of performance indicators of epidemiological surveillance networks was developed in 2004 and applied to several networks. Its implementation requires a thorough description of the network environment and all its activities to define priority indicators. Since this method is considered complex, our objective was to develop a simplified approach and apply it to an epidemiological surveillance network. We applied the initial method to a theoretical network model to obtain a list of generic indicators that can be adapted to any surveillance network. We obtained a list of 25 generic performance indicators, intended to be reformulated and described according to the specificities of each network. It was used to develop performance indicators for RESAPATH, an epidemiological surveillance network of antimicrobial resistance in pathogenic bacteria of animal origin in France. This application allowed us to validate the simplified method, its value in terms of practical implementation, and its level of user acceptance. Its ease of use and speed of application compared to the initial method argue in favor of its use on a broader scale. Copyright © 2011 Elsevier Masson SAS. All rights reserved.

  1. Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.

    PubMed

    Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan

    2013-01-01

    In this chapter, an overview of experimental designs to develop chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. In the method optimization part, often two phases can be distinguished, i.e., a screening and an optimization phase. In method validation, the method is evaluated on its fit for purpose. A validation item, also applying experimental designs, is robustness testing. In the screening phase and in robustness testing, screening designs are applied. During the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated by examples of chiral CE and CEC methods.

  2. Studies on the development of latent fingerprints by the method of solid-medium ninhydrin.

    PubMed

    Yang, Ruiqin; Lian, Jie

    2014-09-01

    A new series of fingerprint-developing membranes was prepared using ninhydrin as the developing agent and pressure-sensitive emulsifiers as the encapsulated chemicals. The type of emulsifier, plastic film, concentration of the developing agent, modifying ions and thickness of the membrane were studied in order to obtain the optimal fingerprint-developing effect. The membrane can be successfully applied to both latent sweat fingerprints and blood fingerprints on many different surfaces. The sensitivity of the method toward latent sweat fingerprints is 0.1 mg/L amino acid. The membrane can be applied to both porous and non-porous surfaces. Fingerprints that are difficult to develop on surfaces such as leather, glass and heat-sensitive paper using traditional chemical methods can be successfully developed with this membrane. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. A Method for Application of Classification Tree Models to Map Aquatic Vegetation Using Remotely Sensed Images from Different Sensors and Dates

    PubMed Central

    Jiang, Hao; Zhao, Dehua; Cai, Ying; An, Shuqing

    2012-01-01

    In previous attempts to identify aquatic vegetation from remotely-sensed images using classification trees (CT), the images used to apply CT models to different times or locations necessarily originated from the same satellite sensor as that from which the original images used in model development came, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from 2009 ground-truth data and images from Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7% and 74.0% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. However, the optimal CT thresholds for different sensor images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (Method of 0.1% index scaling) normalized the SI images using tailored percentages of extreme pixel values. Using the images normalized by Method of 0.1% index scaling, CT models for a particular sensor in which thresholds were replaced by those from the models developed for images originating from other sensors provided average classification accuracies of 76.0%, 82.8% and 68.9% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%–93.3%) and overall (92.0%–93.1%) accuracies. Our results suggest that Method of 0.1% index scaling provides a feasible way to apply CT models directly to images from sensors or time periods that differ from those of the images used to develop the original models.
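
    The key transferable step in this record is the normalization that makes fixed CT thresholds comparable across sensors. The following minimal Python sketch shows one plausible reading of percentile-based "0.1% index scaling"; the function name, the clipping rule, and the 0-1 target range are illustrative assumptions, not the authors' published code.

    ```python
    # Minimal sketch (assumptions noted above): clip the extreme "tail" percent
    # of pixel values at each end of a spectral-index image and rescale the
    # result to [0, 1] so that fixed classification-tree thresholds become
    # comparable across sensors and dates.
    import numpy as np

    def scale_spectral_index(si_image: np.ndarray, tail: float = 0.1) -> np.ndarray:
        """Percentile-clipped min-max scaling of a spectral-index image."""
        lo = np.nanpercentile(si_image, tail)           # lower 0.1% cut-off
        hi = np.nanpercentile(si_image, 100.0 - tail)   # upper 0.1% cut-off
        clipped = np.clip(si_image, lo, hi)
        return (clipped - lo) / (hi - lo)
    ```

    With indices normalized this way, thresholds learned from one sensor's images could, in principle, be reused on images from another sensor, which is the behavior the record reports.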

  4. A Simple and Useful Method to Apply Exogenous NO Gas to Plant Systems: Bell Pepper Fruits as a Model.

    PubMed

    Palma, José M; Ruiz, Carmelo; Corpas, Francisco J

    2018-01-01

    Nitric oxide (NO) is involved in many physiological plant processes, including germination, growth and development of roots, flower setting and development, senescence, and fruit ripening. In the latter process, NO has been reported to play a role opposite to that of ethylene. Thus, treatment of fruits with NO may delay ripening, independently of whether they are climacteric or nonclimacteric. In many cases, different methods have been reported to apply NO to plant systems, involving sodium nitroprusside, NONOates, DETANO, or GSNO, to investigate the physiological and molecular consequences. In this chapter a method to treat plant materials with NO is provided using bell pepper fruits as a model. This method is cheap, free of side effects, and easy to apply since it only requires common chemicals and tools available in any biology laboratory.

  5. Infinitely many symmetries and conservation laws for quad-graph equations via the Gardner method

    NASA Astrophysics Data System (ADS)

    Rasin, Alexander G.

    2010-06-01

    The application of the Gardner method for the generation of conservation laws to all the ABS equations is considered. It is shown that all the necessary information for the application of the Gardner method, namely Bäcklund transformations and initial conservation laws, follows from the multidimensional consistency of ABS equations. We also apply the Gardner method to an asymmetric equation which is not included in the ABS classification. An analog of the Gardner method for the generation of symmetries is developed and applied to the discrete Korteweg-de Vries equation. It can also be applied to all the other ABS equations.

  6. Application of Quality by Design Approach to Bioanalysis: Development of a Method for Elvitegravir Quantification in Human Plasma.

    PubMed

    Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo

    2017-10-01

    The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied for the first time the QbD concept in the development of a method for drug quantification in human plasma using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced in the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix applied in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.

  7. A simple method for the enrichment of bisphenols using boron nitride.

    PubMed

    Fischnaller, Martin; Bakry, Rania; Bonn, Günther K

    2016-03-01

    A simple solid-phase extraction method for the enrichment of 5 bisphenol derivatives using hexagonal boron nitride (BN) was developed. BN was applied to concentrate bisphenol derivatives in spiked water samples and the compounds were analyzed using HPLC coupled to fluorescence detection. The effect of pH and organic solvents on the extraction efficiency was investigated. An enrichment factor up to 100 was achieved without evaporation and reconstitution. The developed method was applied for the determination of bisphenol A migrated from some polycarbonate plastic products. Furthermore, bisphenol derivatives were analyzed in spiked and non-spiked canned food and beverages. None of the analyzed samples exceeded the migration limit set by the European Union of 0.6 mg/kg food. The method showed good recovery rates ranging from 80% to 110%. Validation of the method was performed in terms of accuracy and precision. The applied method is robust, fast, efficient and easily adaptable to different analytical problems. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Pressure algorithm for elliptic flow calculations with the PDF method

    NASA Technical Reports Server (NTRS)

    Anand, M. S.; Pope, S. B.; Mongia, H. C.

    1991-01-01

    An algorithm to determine the mean pressure field for elliptic flow calculations with the probability density function (PDF) method is developed and applied. The PDF method is a most promising approach for the computation of turbulent reacting flows. Previous computations of elliptic flows with the method were in conjunction with conventional finite volume based calculations that provided the mean pressure field. The algorithm developed and described here permits the mean pressure field to be determined within the PDF calculations. The PDF method incorporating the pressure algorithm is applied to the flow past a backward-facing step. The results are in good agreement with data for the reattachment length, mean velocities, and turbulence quantities including triple correlations.

  9. Shuffling cross-validation-bee algorithm as a new descriptor selection method for retention studies of pesticides in biopartitioning micellar chromatography.

    PubMed

    Zarei, Kobra; Atabati, Morteza; Ahmadi, Monire

    2017-05-04

    The bee algorithm (BA) is an optimization algorithm inspired by the natural foraging behaviour of honey bees; it searches for an optimal solution and can be applied to feature selection. In this paper, shuffling cross-validation-BA (CV-BA) was applied to select the best descriptors that could describe the retention factor (log k) in the biopartitioning micellar chromatography (BMC) of 79 heterogeneous pesticides. Six descriptors were obtained using BA, and the selected descriptors were then applied to model development using multiple linear regression (MLR). Descriptor selection was also performed using stepwise, genetic algorithm and simulated annealing methods, MLR was applied to model development, and the results were compared with those obtained from shuffling CV-BA. The results showed that shuffling CV-BA can be applied as a powerful descriptor selection method. A support vector machine (SVM) was also applied to model development using the six descriptors selected by BA. The statistical results obtained using SVM were better than those obtained using MLR: the root mean square error (RMSE) and correlation coefficient (R) for the whole data set (training and test) using shuffling CV-BA-MLR were 0.1863 and 0.9426, respectively, while the corresponding values for the shuffling CV-BA-SVM method were 0.0704 and 0.9922.
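
    The core of such a wrapper-style selection scheme is the fitness score assigned to each candidate descriptor subset. The Python sketch below illustrates, under stated assumptions, how a shuffled cross-validated RMSE of an MLR model could serve as that score; the bee-algorithm search itself is omitted and all names are illustrative, not taken from the paper.

    ```python
    # Minimal sketch of the fitness score a wrapper-style selector (such as a
    # bee algorithm) could minimize: shuffled 5-fold cross-validated RMSE of a
    # multiple linear regression model built on a candidate descriptor subset.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import KFold, cross_val_predict

    def cv_rmse(X: np.ndarray, y: np.ndarray, subset: list, seed: int = 0) -> float:
        """Shuffled-CV RMSE for an MLR model restricted to the chosen descriptors."""
        cv = KFold(n_splits=5, shuffle=True, random_state=seed)
        y_hat = cross_val_predict(LinearRegression(), X[:, subset], y, cv=cv)
        return float(np.sqrt(np.mean((y - y_hat) ** 2)))
    ```

    A bee-algorithm search would repeatedly propose descriptor subsets and keep those that minimize this score, eventually reporting a small set such as the six descriptors mentioned above.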

  10. HPLC-MS/MS method for dexmedetomidine quantification with Design of Experiments approach: application to pediatric pharmacokinetic study.

    PubMed

    Szerkus, Oliwia; Struck-Lewicka, Wiktoria; Kordalewska, Marta; Bartosińska, Ewa; Bujak, Renata; Borsuk, Agnieszka; Bienert, Agnieszka; Bartkowska-Śniatkowska, Alicja; Warzybok, Justyna; Wiczling, Paweł; Nasal, Antoni; Kaliszan, Roman; Markuszewski, Michał Jan; Siluk, Danuta

    2017-02-01

    The purpose of this work was to develop and validate a rapid and robust LC-MS/MS method for the determination of dexmedetomidine (DEX) in plasma, suitable for analysis of a large number of samples. A systematic approach, Design of Experiments, was applied to optimize ESI source parameters and to evaluate method robustness; therefore, a rapid, stable and cost-effective assay was developed. The method was validated according to US FDA guidelines. The LLOQ was determined at 5 pg/ml, and the assay was linear over the examined concentration range (5-2500 pg/ml; R2 > 0.98). The accuracies and intra- and interday precisions were less than 15%. The stability data confirmed reliable behavior of DEX under the tested conditions. Application of the Design of Experiments approach allowed for fast and efficient analytical method development and validation as well as for reduced usage of chemicals necessary for regular method optimization. The proposed technique was applied to the determination of DEX pharmacokinetics in pediatric patients undergoing long-term sedation in the intensive care unit.

  11. Computational structural mechanics methods research using an evolving framework

    NASA Technical Reports Server (NTRS)

    Knight, N. F., Jr.; Lotts, C. G.; Gillian, R. E.

    1990-01-01

    Advanced structural analysis and computational methods that exploit high-performance computers are being developed in a computational structural mechanics research activity sponsored by the NASA Langley Research Center. These new methods are developed in an evolving framework and applied to representative complex structural analysis problems from the aerospace industry. An overview of the methods development environment is presented, and methods research areas are described. Selected application studies are also summarized.

  12. A Triangulation Method for Identifying Hydrostratigraphic Locations of Well Screens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteside, T. S.

    2015-01-31

    A method to identify the hydrostratigraphic location of well screens was developed using triangulation with known locations. This method was applied to all of the monitor wells being used to develop the new GSA groundwater model. Results from this method are closely aligned with those from an alternate method which uses a mesh surface.

  13. Four Methods for Completing the Conceptual Development Phase of Applied Theory Building Research in HRD

    ERIC Educational Resources Information Center

    Storberg-Walker, Julia; Chermack, Thomas J.

    2007-01-01

    The purpose of this article is to describe four methods for completing the conceptual development phase of theory building research for single or multiparadigm research. The four methods selected for this review are (1) Weick's method of "theorizing as disciplined imagination" (1989); (2) Whetten's method of "modeling as theorizing" (2002); (3)…

  14. Heuristics Applied in the Development of Advanced Space Mission Concepts

    NASA Technical Reports Server (NTRS)

    Nilsen, Erik N.

    1998-01-01

    Advanced mission studies are the first step in determining the feasibility of a given space exploration concept. A space scientist develops a science goal in the exploration of space. This may be a new observation method, a new instrument or a mission concept to explore a solar system body. In order to determine the feasibility of a deep space mission, a concept study is convened to determine the technology needs and estimated cost of performing that mission. Heuristics are one method of defining viable mission and systems architectures that can be assessed for technology readiness and cost. Developing a viable architecture depends to a large extent upon extending the existing body of knowledge, and applying it in new and novel ways. These heuristics have evolved over time to include methods for estimating technical complexity, technology development, cost modeling and mission risk in the unique context of deep space missions. This paper examines the processes involved in performing these advanced concepts studies, and analyzes the application of heuristics in the development of an advanced in-situ planetary mission. The Venus Surface Sample Return mission study provides a context for the examination of the heuristics applied in the development of the mission and systems architecture. This study is illustrative of the effort involved in the initial assessment of an advanced mission concept, and the knowledge and tools that are applied.

  15. XENOBIOTIC METHODS DEVELOPMENT FOR HUMAN EXPOSURE ASSESSMENT RESEARCH

    EPA Science Inventory

    Biomarkers from blood, breath, urine, and other physiological matrices can provide useful information regarding exposures to environmental pollutants. Once developed and applied appropriately, specific and sensitive methods can often provide definitive data identifying the vario...

  16. Aircraft operability methods applied to space launch vehicles

    NASA Astrophysics Data System (ADS)

    Young, Douglas

    1997-01-01

    The commercial space launch market requirement for low vehicle operations costs necessitates the application of methods and technologies developed and proven for complex aircraft systems. The "building in" of reliability and maintainability, which is applied extensively in the aircraft industry, has yet to be applied to the maximum extent possible on launch vehicles. Use of vehicle system and structural health monitoring, automated ground systems and diagnostic design methods derived from aircraft applications support the goal of achieving low cost launch vehicle operations. Transforming these operability techniques to space applications where diagnostic effectiveness has significantly different metrics is critical to the success of future launch systems. These concepts will be discussed with reference to broad launch vehicle applicability. Lessons learned and techniques used in the adaptation of these methods will be outlined drawing from recent aircraft programs and implementation on phase 1 of the X-33/RLV technology development program.

  17. The Intertextual Method for Art Education Applied in Japanese Paper Theatre--A Study on Discovering Intercultural Differences

    ERIC Educational Resources Information Center

    Paatela-Nieminen, Martina

    2008-01-01

    In art education we need methods for studying works of art and visual culture interculturally because there are many multicultural art classes and little consensus as to how to interpret art in different cultures. In this article my central aim was to apply the intertextual method that I developed in my doctoral thesis for Western art education to…

  18. Integration of optical measurement methods with flight parameter measurement systems

    NASA Astrophysics Data System (ADS)

    Kopecki, Grzegorz; Rzucidlo, Pawel

    2016-05-01

    During the AIM (advanced in-flight measurement techniques) and AIM2 projects, innovative modern techniques were developed. The purpose of the AIM project was to develop optical measurement techniques dedicated to flight tests. Such methods give information about the deformation of aircraft elements, thermal loads, pressure distribution, etc. In AIM2 the development of optical methods for flight testing was continued. In particular, this project aimed at the development of methods that could be easily applied in flight tests in an industrial setting. Another equally important task was to guarantee the synchronization of the classical measuring system with cameras. The PW-6U glider used in flight tests was provided by the Rzeszów University of Technology. The glider had all the equipment necessary for testing the IPCT (image pattern correlation technique) and IRT (infrared thermometry) methods. Additionally, equipment adequate for the measurement, registration and analysis of typical flight parameters has been developed. This article describes the designed system, as well as presenting the system's application during flight tests. Additionally, the results obtained in flight tests show certain limitations of the IRT method as applied.

  19. Nuclear Forensics Applications of Principal Component Analysis on Micro X-ray Fluorescence Images

    DTIC Science & Technology

    analysis on quantified micro x-ray fluorescence intensity values. This method is then applied to address goals of nuclear forensics. The first...researchers in the development and validation of nuclear forensics methods. A method for determining material homogeneity is developed and demonstrated

  20. The Finnish healthcare services lean management.

    PubMed

    Hihnala, Susanna; Kettunen, Lilja; Suhonen, Marjo; Tiirinki, Hanna

    2018-02-05

    Purpose: The purpose of this paper is to discuss health services managers' experiences of management in a special health-care unit and development efforts from the point of view of the Lean method. Additionally, the aim is to deepen the knowledge of the managers' work and the nature of the Lean method development processes in the workplace. The research focuses on those aspects and results of the Lean method that are currently being used in health-care environments. Design/methodology/approach: These data were collected through a number of thematic interviews. The participants were nurse managers (n = 7) and medical managers (n = 7) who applied Lean management in their work at the University Hospital in the Northern Ostrobothnia Health Care District. The data were analysed with a qualitative content analysis. Findings: The findings concern a common set of values in specialized health-care services, the development of activities, and challenges for management in the use of the Lean manager development model to improve personal management skills. Practical implications: Managers in specialized health-care services can develop and systematically manage with the help of the Lean method. This emphasizes assumptions, from the point of view of management, about systems development when the organization uses the Lean method. The research outcomes originate from specialized health-care settings in Finland in which the Lean method and its associated management principles have been implemented and applied to the delivery of health care. Originality/value: The study shows that the research results and in-depth knowledge of Lean method principles can be applied to health-care management and development processes. The research also describes health services managers' experiences of using the Lean method. In the future, these results can be used to improve Lean management skills, identify personal professional competencies and develop skills required in development processes. Also, the research findings can be used in the training of health services managers in the health-care industry worldwide and to help them survive the pressure to change repeatedly.

  1. New human biomonitoring methods for chemicals of concern-the German approach to enhance relevance.

    PubMed

    Kolossa-Gehring, Marike; Fiddicke, Ulrike; Leng, Gabriele; Angerer, Jürgen; Wolz, Birgit

    2017-03-01

    In Germany, strong efforts have been made within the last years to develop new methods for human biomonitoring (HBM). The German Federal Ministry for the Environment, Nature Conservation, Building and Nuclear Safety (BMUB) and the German Chemical Industry Association e. V. (VCI) have cooperated since 2010 to increase the knowledge on the internal exposure of the general population to chemicals. The project's aim is to promote human biomonitoring by developing new analytical methods. A key partner of the cooperation is the German Environment Agency (UBA), which has been entrusted with the scientific coordination. Another key partner is the "HBM Expert Panel", which each year puts together a list of chemicals of interest to the project, from which the Steering Committee of the project chooses up to five substances for which method development will be started. Emphasis is placed on substances with either a potential health relevance or substances to which the general population is potentially exposed to a considerable extent. The HBM Expert Panel also advises on method development. Once a method is developed, it is usually first applied to about 40 non-occupationally exposed individuals. A next step is applying the methods to different samples: either, if the time trend is of major interest, to samples from the German Environmental Specimen Bank, or, in case exposure sources and the distribution of exposure levels in the general population are the focus, to samples from children and adolescents from the population-representative 5th German Environmental Survey (GerES V). Results are expected in late 2018. This article describes the challenges faced during method development and the solutions found. An overview presents the 34 selected substances, the 14 methods developed and the 7 HBM-I values derived in the period from 2010 to mid-2016. Copyright © 2016 The Authors. Published by Elsevier GmbH. All rights reserved.

  2. The use of experimental design for the development of a capillary zone electrophoresis method for the quantitation of captopril.

    PubMed

    Mukozhiwa, S Y; Khamanga, S M M; Walker, R B

    2017-09-01

    A capillary zone electrophoresis (CZE) method for the quantitation of captopril (CPT) using UV detection was developed. The influence of electrolyte concentration and system variables on electrophoretic separation was evaluated, and a central composite design (CCD) was used to optimize the method. The variables investigated were pH, molarity, applied voltage and capillary length. The influence of sodium metabisulphite on the stability of test solutions was also investigated; the use of sodium metabisulphite prevented degradation of CPT over 24 hours. A fused uncoated silica capillary of 67.5 cm total and 57.5 cm effective length was used for analysis. The applied voltage and capillary length affected the migration time of CPT significantly. A 20 mM phosphate buffer adjusted to pH 7.0 was used as the running buffer, and an applied voltage of 23.90 kV was suitable to effect a separation. The optimized electrophoretic conditions produced sharp, well-resolved peaks for CPT and sodium metabisulphite. Linear regression analysis of the response for CPT standards revealed the method was linear (R2 = 0.9995) over the range 5-70 μg/mL. The limits of quantitation and detection were 5 and 1.5 μg/mL, respectively. A simple, rapid and reliable CZE method has been developed and successfully applied to the analysis of commercially available CPT products.
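
    As a small illustration of the calibration step reported above, the Python sketch below fits a response-versus-concentration line over 5-70 μg/mL, reports R2, and estimates detection and quantitation limits from the residual standard deviation. The response values and the 3.3·s/m and 10·s/m estimates are generic assumptions for illustration, not the authors' data or procedure.

    ```python
    # Generic calibration-curve check for illustration only: fit response vs.
    # concentration, report R2, and estimate LOD/LOQ as 3.3*s/m and 10*s/m
    # (s = residual standard deviation, m = slope). Response values are
    # hypothetical.
    import numpy as np

    conc = np.array([5, 10, 20, 30, 40, 50, 60, 70], dtype=float)   # ug/mL
    resp = np.array([1.1, 2.0, 4.1, 6.0, 8.2, 10.1, 11.9, 14.0])    # hypothetical peak areas

    slope, intercept = np.polyfit(conc, resp, 1)
    pred = slope * conc + intercept
    ss_res = np.sum((resp - pred) ** 2)
    r2 = 1.0 - ss_res / np.sum((resp - resp.mean()) ** 2)
    s = np.sqrt(ss_res / (len(conc) - 2))        # residual standard deviation
    lod, loq = 3.3 * s / slope, 10 * s / slope
    print(f"R2 = {r2:.4f}, LOD ~ {lod:.1f} ug/mL, LOQ ~ {loq:.1f} ug/mL")
    ```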

  3. The application of generalized, cyclic, and modified numerical integration algorithms to problems of satellite orbit computation

    NASA Technical Reports Server (NTRS)

    Chesler, L.; Pierce, S.

    1971-01-01

    Generalized, cyclic, and modified multistep numerical integration methods are developed and evaluated for application to problems of satellite orbit computation. Generalized methods are compared with the presently utilized Cowell methods; new cyclic methods are developed for special second-order differential equations; and several modified methods are developed and applied to orbit computation problems. Special computer programs were written to generate coefficients for these methods, and subroutines were written which allow use of these methods with NASA's GEOSTAR computer program.

  4. Moral counselling: a method in development.

    PubMed

    de Groot, Jack; Leget, Carlo

    2011-01-01

    This article describes a method of moral counselling developed in the Radboud University Medical Centre Nijmegen (The Netherlands). The authors apply insights of Paul Ricoeur to the non-directive counselling method of Carl Rogers in their work of coaching patients with moral problems in health care. The developed method was shared with other health care professionals in a training course. Experiences in the course and further practice led to further improvement of the method.

  5. Micro Dot Patterning on the Light Guide Panel Using Powder Blasting

    PubMed Central

    Jang, Ho Su; Cho, Myeong Woo; Park, Dong Sam

    2008-01-01

    This study aims to develop a micromachining technology for a light guide panel (LGP) mold, whereby micro dot patterns are formed on the LGP surface by a single injection process instead of existing screen printing processes. The micro powder blasting technique is applied to form micro dot patterns on the LGP mold surface. The optimal conditions for the masking, laminating, exposure, and developing processes used to form the micro dot patterns are first experimentally investigated. An LGP mold with masked micro patterns is then machined using the micro powder blasting method, and the machinability of the micro dot patterns is verified. A prototype LGP is test-injected using the developed LGP mold, and a shape analysis of the patterns and performance testing of the injected LGP are carried out. As an additional approach, matte finishing, a special surface treatment method, is applied to the mold surface to improve the light diffusion characteristics, uniformity and brightness of the LGP. The results of this study show that the applied powder blasting method can be successfully used to manufacture LGPs with micro patterns by a single injection using the developed mold and can thereby replace existing screen printing methods. PMID:27879740

  6. Participatory Design in Gerontechnology: A Systematic Literature Review.

    PubMed

    Merkel, Sebastian; Kucharski, Alexander

    2018-05-19

    Participatory design (PD) is widely used within gerontechnology but there is no common understanding about which methods are used for what purposes. This review aims to examine what different forms of PD exist in the field of gerontechnology and how these can be categorized. We conducted a systematic literature review covering several databases. The search strategy was based on 3 elements: (1) participatory methods and approaches with (2) older persons aiming at developing (3) technology for older people. Our final review included 26 studies representing a variety of technologies designed/developed and methods/instruments applied. According to the technologies, the publications reviewed can be categorized in 3 groups: Studies that (1) use already existing technology with the aim to find new ways of use; (2) aim at creating new devices; (3) test and/or modify prototypes. The implementation of PD depends on the questions: Why a participatory approach is applied, who is involved as future user(s), when those future users are involved, and how they are incorporated into the innovation process. There are multiple ways, methods, and instruments to integrate users into the innovation process. Which methods should be applied, depends on the context. However, most studies do not evaluate if participatory approaches will lead to a better acceptance and/or use of the co-developed products. Therefore, participatory design should follow a comprehensive strategy, starting with the users' needs and ending with an evaluation if the applied methods have led to better results.

  7. Transient analysis of an adaptive system for optimization of design parameters

    NASA Technical Reports Server (NTRS)

    Bayard, D. S.

    1992-01-01

    Averaging methods are applied to analyzing and optimizing the transient response associated with the direct adaptive control of an oscillatory second-order minimum-phase system. The analytical design methods developed for a second-order plant can be applied with some approximation to a MIMO flexible structure having a single dominant mode.

  8. Generalized theoretical method for the interaction between arbitrary nonuniform electric field and molecular vibrations: Toward near-field infrared spectroscopy and microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iwasa, Takeshi, E-mail: tiwasa@mail.sci.hokudai.ac.jp; Takenaka, Masato; Taketsugu, Tetsuya

    A theoretical method to compute infrared absorption spectra when a molecule is interacting with an arbitrary nonuniform electric field such as near-fields is developed and numerically applied to simple model systems. The method is based on the multipolar Hamiltonian, where the light-matter interaction is described by a spatial integral of the inner product of the molecular polarization and the applied electric field. The computation scheme is developed under the harmonic approximation for the molecular vibrations and the framework of modern electronic structure calculations such as density functional theory. Infrared reflection absorption and near-field infrared absorption are considered as model systems. The obtained IR spectra successfully reflect the spatial structure of the applied electric field and the corresponding vibrational modes, demonstrating the applicability of the present method to analyze modern nanovibrational spectroscopy using near-fields. The present method can use arbitrary electric fields and thus can integrate two fields such as computational chemistry and electromagnetics.

  9. Generalized theoretical method for the interaction between arbitrary nonuniform electric field and molecular vibrations: Toward near-field infrared spectroscopy and microscopy.

    PubMed

    Iwasa, Takeshi; Takenaka, Masato; Taketsugu, Tetsuya

    2016-03-28

    A theoretical method to compute infrared absorption spectra when a molecule is interacting with an arbitrary nonuniform electric field such as near-fields is developed and numerically applied to simple model systems. The method is based on the multipolar Hamiltonian, where the light-matter interaction is described by a spatial integral of the inner product of the molecular polarization and the applied electric field. The computation scheme is developed under the harmonic approximation for the molecular vibrations and the framework of modern electronic structure calculations such as density functional theory. Infrared reflection absorption and near-field infrared absorption are considered as model systems. The obtained IR spectra successfully reflect the spatial structure of the applied electric field and the corresponding vibrational modes, demonstrating the applicability of the present method to analyze modern nanovibrational spectroscopy using near-fields. The present method can use arbitrary electric fields and thus can integrate two fields such as computational chemistry and electromagnetics.
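
    Both of these records describe the light-matter coupling in words only. Written out, the multipolar-form interaction they refer to can be stated compactly as below; this is an assumed standard form consistent with the abstracts, not an equation quoted from the papers.

    ```latex
    % Assumed multipolar-form coupling: the interaction energy is the spatial
    % overlap of the molecular polarization density P(r) with the applied,
    % possibly nonuniform, electric field E(r, t).
    \begin{equation}
      \hat{H}_{\mathrm{int}}(t) = -\int \hat{\mathbf{P}}(\mathbf{r}) \cdot \mathbf{E}(\mathbf{r}, t)\, \mathrm{d}\mathbf{r}
    \end{equation}
    ```

    For a uniform field this reduces to the familiar dipole coupling, which is why a nonuniform near-field requires the spatially resolved form above.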

  10. Interactive and Hands-on Methods for Professional Development of Undergraduate Researchers

    NASA Astrophysics Data System (ADS)

    Pressley, S. N.; LeBeau, J. E.

    2016-12-01

    Professional development workshops for undergraduate research programs can range from communicating science (i.e. oral, technical writing, poster presentations), applying for fellowships and scholarships, applying to graduate school, and learning about careers, among others. Novel methods of presenting the information on the above topics can result in positive outcomes beyond the obvious of transferring knowledge. Examples of innovative methods to present professional development information include 1) An interactive session on how to write an abstract where students are given an opportunity to draft an abstract from a short technical article, followed by discussion amongst a group of peers, and comparison with the "published" abstract. 2) Using the Process Oriented Guided Inquiry Learning (POGIL) method to evaluate and critique a research poster. 3) Inviting "experts" such as a Fulbright scholar graduate student to present on applying for fellowships and scholarships. These innovative methods of delivery provide more hands-on activities that engage the students, and in some cases (abstract writing) provide practice for the student. The methods also require that students develop team work skills, communicate amongst their peers, and develop networks with their cohort. All of these are essential non-technical skills needed for success in any career. Feedback from students on these sessions are positive and most importantly, the students walk out of the session with a smile on their face saying how much fun it was. Evaluating the impact of these sessions is more challenging and under investigation currently.

  11. A Mixed Prioritization Operators Strategy Using A Single Measurement Criterion For AHP Application Development

    NASA Astrophysics Data System (ADS)

    Yuen, Kevin Kam Fung

    2009-10-01

    The most appropriate prioritization method is still one of the unsettled issues of the Analytic Hierarchy Process, although many studies have been made and applied. Interestingly, many AHP applications apply only Saaty's Eigenvector method, even though many studies have found that this method may produce rank reversals and have proposed various prioritization methods as alternatives. Some methods have been proved to be better than the Eigenvector method; however, these methods seem not to attract the attention of researchers. In this paper, eight important prioritization methods are reviewed. A Mixed Prioritization Operators Strategy (MPOS) is developed to select a vector that is prioritized by the most appropriate prioritization operator. To verify this new method, a case study of high school selection is revisited using the proposed method. The contribution is that MPOS is useful for solving prioritization problems in the AHP.
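
    For context, the baseline operator this record compares against, Saaty's Eigenvector method, reduces to a short computation: the priority vector is the normalized principal eigenvector of the pairwise comparison matrix, with a consistency ratio as a sanity check. The Python sketch below illustrates that baseline only; it is not the proposed MPOS strategy, and the example matrix is illustrative.

    ```python
    # Illustrative sketch of Saaty's Eigenvector prioritization: priorities are
    # the normalized principal eigenvector of a pairwise comparison matrix A,
    # and the consistency ratio flags unreliable judgments.
    import numpy as np

    def eigenvector_priorities(A: np.ndarray):
        eigvals, eigvecs = np.linalg.eig(A)
        k = int(np.argmax(eigvals.real))                 # principal eigenvalue index
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                     # priorities sum to 1
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)             # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)     # tabulated random index
        return w, ci / ri

    # Illustrative pairwise comparisons for three criteria:
    A = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]], dtype=float)
    priorities, cr = eigenvector_priorities(A)
    print(priorities, cr)
    ```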

  12. An Auxiliary Method To Reduce Potential Adverse Impacts Of Projected Land Developments: Subwatershed Prioritization

    EPA Science Inventory

    An index based method is developed that ranks the subwatersheds of a watershed based on their relative impacts on watershed response to anticipated land developments, and then applied to an urbanizing watershed in Eastern Pennsylvania. Simulations with a semi-distributed hydrolo...

  13. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  14. METHOD DEVELOPMENT FOR THE DETERMINATION OF PERFLUORINATED ORGANIC COMPOUNDS (PFCS) IN SURFACE WATER

    EPA Science Inventory

    The method for the determination of perfluorinated organic compounds (PFCs) in surface water has been developed and applied to natural water. The method shows an adequate sensitivity, precision and accuracy for ten kinds of target compounds. These PFCs were found in most samples...

  15. Interdisciplinary Curriculum Development in Hospital Methods Improvement. Final Report.

    ERIC Educational Resources Information Center

    Watt, John R.

    The major purpose of this project was to develop a "package" curriculum of Hospital Methods Improvement techniques for college students in health related majors. The elementary Industrial Engineering methods for simplifying work and saving labor were applied to the hospital environment and its complex of problems. The report's…

  16. A new open tubular capillary microextraction and sweeping for the analysis of super low concentration of hydrophobic compounds.

    PubMed

    Xia, Zhining; Gan, Tingting; Chen, Hua; Lv, Rui; Wei, Weili; Yang, Fengqing

    2010-10-01

    A sample pre-concentration method based on the in-line coupling of in-tube solid-phase microextraction and electrophoretic sweeping was developed for the analysis of hydrophobic compounds. The sample pre-concentration and electrophoretic separation processes were simply and sequentially carried out with a (35%-phenyl)-methylpolysiloxane-coated capillary. The developed method was validated and applied to enrich and separate several pharmaceuticals, including loratadine, indomethacin, ibuprofen and doxazosin. Several parameters of the microextraction, such as temperature, pH and eluant, were investigated. The concentration of microemulsion, which influences both separation efficiency and microextraction efficiency, was also studied. Central composite design was applied for the optimization of sampling flow rate and sampling time, which interact with each other in a very complex way. The precision, sensitivity and recovery of the method were investigated. Under the optimal conditions, the maximum enrichment factors for loratadine, indomethacin, ibuprofen and doxazosin in aqueous solutions are 1355, 571, 523 and 318, respectively. In addition, the developed method was applied to determine loratadine in a rabbit blood sample.

  17. The Expanding Role of Applications in the Development and Validation of CFD at NASA

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    2010-01-01

    This paper focuses on the recent escalation in the application of CFD to manned and unmanned flight projects at NASA and the need to often apply these methods to problems for which little or no previous validation data directly applies. The paper discusses the evolution of NASA's CFD development from a strict Develop, Validate, Apply strategy to sometimes allowing for a Develop, Apply, Validate approach. The risks of this approach and some of its unforeseen benefits are discussed and tied to specific operational examples. There are distinct advantages for the CFD developer who is able to operate in this paradigm, and recommendations are provided for those inclined and willing to work in this environment.

  18. Digital Signal Processing Based on a Clustering Algorithm for Ir/Au TES Microcalorimeter

    NASA Astrophysics Data System (ADS)

    Zen, N.; Kunieda, Y.; Takahashi, H.; Hiramoto, K.; Nakazawa, M.; Fukuda, D.; Ukibe, M.; Ohkubo, M.

    2006-02-01

    In recent years, cryogenic microcalorimeters using their superconducting transition edge have been under development for possible application to astronomical X-ray observations. To improve the energy resolution of superconducting transition edge sensors (TES), several correction methods have been developed. Among them, a clustering method based on digital signal processing has recently been proposed. In this paper, we applied the clustering method to an Ir/Au bilayer TES; the method resulted in almost a 10% improvement in energy resolution. From the point of view of imaging X-ray spectroscopy, we also applied the clustering method to pixellated Ir/Au-TES devices. We thus show how a clustering method that sorts signals by their shapes is also useful for position identification.

  19. Development and validation of an event-specific quantitative PCR method for genetically modified maize MIR162.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2014-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500) for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSDr). The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggested that the limit of quantitation of the method was 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162.
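
    As a rough illustration of how such a conversion factor is used, the sketch below computes a GMO percentage from event-specific and taxon-specific copy-number estimates. The formula, function name and copy numbers are assumptions for illustration (the conventional event/taxon ratio normalized by Cf), not code or data from the study; only the Cf value 0.697 is taken from the abstract.

```python
def gmo_percent(event_copies: float, taxon_copies: float, cf: float) -> float:
    """Estimate GMO content (%) from real-time PCR copy-number estimates.

    Assumes the conventional relation GMO% = (event/taxon) / Cf * 100,
    where Cf is the event/taxon ratio measured for 100% GM material.
    """
    return (event_copies / taxon_copies) / cf * 100.0

# Illustrative numbers only (not from the study); Cf = 0.697 is the ABI7900 value.
print(gmo_percent(event_copies=150.0, taxon_copies=43000.0, cf=0.697))
```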

  20. 32 CFR 22.105 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... applying existing technology to new products and processes in a general way. Advanced research is most... Category 6.3A) programs within Research, Development, Test and Evaluation (RDT&E). Applied research... technology such as new materials, devices, methods and processes. It typically is funded in Applied Research...

  1. How Qualitative Methods Can be Used to Inform Model Development.

    PubMed

    Husbands, Samantha; Jowett, Susan; Barton, Pelham; Coast, Joanna

    2017-06-01

    Decision-analytic models play a key role in informing healthcare resource allocation decisions. However, there are ongoing concerns with the credibility of models. Modelling methods guidance can encourage good practice within model development, but its value is dependent on its ability to address the areas that modellers find most challenging. Further, it is important that modelling methods and related guidance are continually updated in light of any new approaches that could potentially enhance model credibility. The objective of this article was to highlight the ways in which qualitative methods have been used and recommended to inform decision-analytic model development and enhance modelling practices. With reference to the literature, the article discusses two key ways in which qualitative methods can be, and have been, applied. The first approach involves using qualitative methods to understand and inform general and future processes of model development, and the second, using qualitative techniques to directly inform the development of individual models. The literature suggests that qualitative methods can improve the validity and credibility of modelling processes by providing a means to understand existing modelling approaches that identifies where problems are occurring and further guidance is needed. It can also be applied within model development to facilitate the input of experts to structural development. We recommend that current and future model development would benefit from the greater integration of qualitative methods, specifically by studying 'real' modelling processes, and by developing recommendations around how qualitative methods can be adopted within everyday modelling practice.

  2. Micro Dot Patterning on the Light Guide Panel Using Powder Blasting.

    PubMed

    Jang, Ho Su; Cho, Myeong Woo; Park, Dong Sam

    2008-02-08

    This study is to develop a micromachining technology for a light guide panel (LGP) mold, whereby micro dot patterns are formed on a LGP surface by a single injection process instead of existing screen printing processes. The micro powder blasting technique is applied to form micro dot patterns on the LGP mold surface. The optimal conditions for masking, laminating, exposure, and developing processes to form the micro dot patterns are first experimentally investigated. A LGP mold with masked micro patterns is then machined using the micro powder blasting method and the machinability of the micro dot patterns is verified. A prototype LGP is test-injected using the developed LGP mold and a shape analysis of the patterns and performance testing of the injected LGP are carried out. As an additional approach, matte finishing, a special surface treatment method, is applied to the mold surface to improve the light diffusion characteristics, uniformity and brightness of the LGP. The results of this study show that the applied powder blasting method can be successfully used to manufacture LGPs with micro patterns by just single injection using the developed mold and thereby replace existing screen printing methods.

  3. Dynamic Loads Generation for Multi-Point Vibration Excitation Problems

    NASA Technical Reports Server (NTRS)

    Shen, Lawrence

    2011-01-01

    A random-force method has been developed to predict dynamic loads produced by rocket-engine random vibrations for new rocket-engine designs. The method develops random forces at multiple excitation points based on random vibration environments scaled from accelerometer data obtained during hot-fire tests of existing rocket engines. This random-force method applies random forces to the model and creates expected dynamic response in a manner that simulates the way the operating engine applies self-generated random vibration forces (random pressure acting on an area) with the resulting responses that we measure with accelerometers. This innovation includes the methodology (implementation sequence), the computer code, two methods to generate the random-force vibration spectra, and two methods to reduce some of the inherent conservatism in the dynamic loads. This methodology would be implemented to generate the random-force spectra at excitation nodes without requiring the use of artificial boundary conditions in a finite element model. More accurate random dynamic loads than those predicted by current industry methods can then be generated using the random force spectra. The scaling method used to develop the initial power spectral density (PSD) environments for deriving the random forces for the rocket engine case is based on the Barrett Criteria developed at Marshall Space Flight Center in 1963. This invention approach can be applied in the aerospace, automotive, and other industries to obtain reliable dynamic loads and responses from a finite element model for any structure subject to multipoint random vibration excitations.
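
    A common generic way to realize a force time history from a target power spectral density is a random-phase sum of cosines; the sketch below illustrates that textbook approach. The frequency range, PSD level and function are hypothetical and do not represent the NASA methodology or code described above.

```python
import numpy as np

def force_time_history(freqs, psd, duration, fs, seed=0):
    """Synthesize a zero-mean random force whose spectrum approximates a
    target one-sided force PSD [N^2/Hz].  Each component has amplitude
    sqrt(2 * S(f_k) * df) and a random phase.  Illustrative only."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, 1.0 / fs)
    df = freqs[1] - freqs[0]
    phases = rng.uniform(0.0, 2.0 * np.pi, size=len(freqs))
    amps = np.sqrt(2.0 * psd * df)
    x = np.zeros_like(t)
    for a, f, p in zip(amps, freqs, phases):
        x += a * np.cos(2.0 * np.pi * f * t + p)
    return t, x

freqs = np.arange(20.0, 2000.0, 5.0)   # Hz (made-up band)
psd = np.full_like(freqs, 0.1)         # flat 0.1 N^2/Hz, made-up level
t, f_t = force_time_history(freqs, psd, duration=2.0, fs=8192.0)
print("target RMS:", np.sqrt(np.trapz(psd, freqs)), "realized RMS:", f_t.std())
```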

  4. Evaluation of evidence-based literature and formulation of recommendations for the clinical preventive guidelines for immigrants and refugees in Canada

    PubMed Central

    Tugwell, Peter; Pottie, Kevin; Welch, Vivian; Ueffing, Erin; Chambers, Andrea; Feightner, John

    2011-01-01

    Background: This article describes the evidence review and guideline development method developed for the Clinical Preventive Guidelines for Immigrants and Refugees in Canada by the Canadian Collaboration for Immigrant and Refugee Health Guideline Committee. Methods: The Appraisal of Guidelines for Research and Evaluation (AGREE) best-practice framework was combined with the recently developed Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to produce evidence-based clinical guidelines for immigrants and refugees in Canada. Results: A systematic approach was designed to produce the evidence reviews and apply the GRADE approach, including building on evidence from previous systematic reviews, searching for and comparing evidence between general and specific immigrant populations, and applying the GRADE criteria for making recommendations. This method was used for priority health conditions that had been selected by practitioners caring for immigrants and refugees in Canada. Interpretation: This article outlines the 14-step method that was defined to standardize the guideline development process for each priority health condition. PMID:20573711

  5. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
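
    For readers unfamiliar with the original Beta Factor method that the new model extends, a standard textbook statement is shown below; this is the generic formulation, not the report's extended model.

```latex
% Generic Beta Factor partitioning (textbook form, not the report's extension):
\lambda_{\mathrm{total}} = \lambda_{\mathrm{ind}} + \lambda_{\mathrm{CCF}},
\qquad \beta = \frac{\lambda_{\mathrm{CCF}}}{\lambda_{\mathrm{total}}}
% For a one-out-of-three redundant system with component unavailability Q,
% the system unavailability is then approximately
Q_{\mathrm{sys}} \approx \bigl[(1-\beta)\,Q\bigr]^{3} + \beta\, Q
```

    Here β is the fraction of the total failure rate attributed to common cause; the additive βQ term is why CCFs dominate the unavailability of highly redundant trains.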

  6. A Mixed-Methods Analysis in Assessing Students' Professional Development by Applying an Assessment for Learning Approach.

    PubMed

    Peeters, Michael J; Vaidya, Varun A

    2016-06-25

    Objective. To describe an approach for assessing the Accreditation Council for Pharmacy Education's (ACPE) doctor of pharmacy (PharmD) Standard 4.4, which focuses on students' professional development. Methods. This investigation used mixed methods with triangulation of qualitative and quantitative data to assess professional development. Qualitative data came from an electronic developmental portfolio of professionalism and ethics, completed by PharmD students during their didactic studies. Quantitative confirmation came from the Defining Issues Test (DIT)-an assessment of pharmacists' professional development. Results. Qualitatively, students' development reflections described growth through this course series. Quantitatively, the 2015 PharmD class's DIT N2-scores illustrated positive development overall; the lower 50% had a large initial improvement compared to the upper 50%. Subsequently, the 2016 PharmD class confirmed these average initial improvements of students and also showed further substantial development among students thereafter. Conclusion. Applying an assessment for learning approach, triangulation of qualitative and quantitative assessments confirmed that PharmD students developed professionally during this course series.

  7. Applied Counterfactual Reasoning

    NASA Astrophysics Data System (ADS)

    Hendrickson, Noel

    This chapter addresses two goals: The development of a structured method to aid intelligence and security analysts in assessing counterfactuals, and forming a structured method to educate (future) analysts in counterfactual reasoning. In order to pursue these objectives, I offer here an analysis of the purposes, problems, parts, and principles of applied counterfactual reasoning. In particular, the ways in which antecedent scenarios are selected and the ways in which scenarios are developed constitute essential (albeit often neglected) aspects of counterfactual reasoning. Both must be addressed to apply counterfactual reasoning effectively. Naturally, further issues remain, but these should serve as a useful point of departure. They are the beginning of a path to more rigorous and relevant counterfactual reasoning in intelligence analysis and counterterrorism.

  8. Applied Epistemology and Understanding in Information Studies

    ERIC Educational Resources Information Center

    Gorichanaz, Tim

    2017-01-01

    Introduction: Applied epistemology allows information studies to benefit from developments in philosophy. In information studies, epistemic concepts are rarely considered in detail. This paper offers a review of several epistemic concepts, focusing on understanding, as a call for further work in applied epistemology in information studies. Method:…

  9. Critical path method applied to research project planning: Fire Economics Evaluation System (FEES)

    Treesearch

    Earl B. Anderson; R. Stanton Hales

    1986-01-01

    The critical path method (CPM) of network analysis (a) depicts precedence among the many activities in a project by a network diagram; (b) identifies critical activities by calculating their starting, finishing, and float times; and (c) displays possible schedules by constructing time charts. CPM was applied to the development of the Forest Service's Fire...
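
    The forward/backward pass at the heart of CPM is compact enough to show directly. The toy activity network below is hypothetical and has no relation to the FEES project plan; it only illustrates how earliest and latest times yield float and mark the critical path.

```python
# Minimal critical-path calculation on a toy activity network.
durations = {"A": 3, "B": 2, "C": 4, "D": 2, "E": 3}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"], "E": ["D"]}

est, eft = {}, {}                      # earliest start / finish
for act in ["A", "B", "C", "D", "E"]:  # topological order
    est[act] = max((eft[p] for p in preds[act]), default=0)
    eft[act] = est[act] + durations[act]

project_end = max(eft.values())
succs = {a: [b for b in preds if a in preds[b]] for a in durations}
lft, lst = {}, {}                      # latest finish / start
for act in ["E", "D", "C", "B", "A"]:  # reverse topological order
    lft[act] = min((lst[s] for s in succs[act]), default=project_end)
    lst[act] = lft[act] - durations[act]

for act in durations:
    total_float = lst[act] - est[act]
    print(act, "float =", total_float, "(critical)" if total_float == 0 else "")
```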

  10. Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.

    1989-01-01

    The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  11. Stability indicating methods for the analysis of cefprozil in the presence of its alkaline induced degradation product

    NASA Astrophysics Data System (ADS)

    Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed

    2016-04-01

    Three simple, specific, accurate and precise spectrophotometric methods were developed for the determination of cefprozil (CZ) in the presence of its alkaline induced degradation product (DCZ). The first method was the bivariate method, while the two other multivariate methods were partial least squares (PLS) and spectral residual augmented classical least squares (SRACLS). The multivariate methods were applied with and without variable selection procedure (genetic algorithm GA). These methods were tested by analyzing laboratory prepared mixtures of the above drug with its alkaline induced degradation product and they were applied to its commercial pharmaceutical products.

  12. A method for the analysis of nonlinearities in aircraft dynamic response to atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Sidwell, K.

    1976-01-01

    An analytical method is developed which combines the equivalent linearization technique for the analysis of the response of nonlinear dynamic systems with the amplitude modulated random process (Press model) for atmospheric turbulence. The method is initially applied to a bilinear spring system. The analysis of the response shows good agreement with exact results obtained by the Fokker-Planck equation. The method is then applied to an example of control-surface displacement limiting in an aircraft with a pitch-hold autopilot.
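
    For reference, the standard statement of equivalent (statistical) linearization is given below: the nonlinear restoring force g(x) is replaced by an equivalent linear stiffness chosen to minimize the mean-square error of the random response. This is the generic form; the report's exact formulation for the bilinear spring and the Press turbulence model may differ.

```latex
% Standard statistical (equivalent) linearization of a nonlinear restoring force g(x):
k_{\mathrm{eq}} \;=\; \arg\min_{k}\; E\!\left[\bigl(g(x) - k x\bigr)^{2}\right]
\;=\; \frac{E\!\left[x\,g(x)\right]}{E\!\left[x^{2}\right]}
```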

  13. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  14. A Finite Difference Method for Modeling Migration of Impurities in Multilayer Systems

    NASA Astrophysics Data System (ADS)

    Tosa, V.; Kovacs, Katalin; Mercea, P.; Piringer, O.

    2008-09-01

    A finite difference method to solve the one-dimensional diffusion of impurities in a multilayer system was developed for the special case in which a partition coefficient K imposes a ratio of the concentrations at the interface between two adjacent layers. The fictitious point method was applied to derive the algebraic equations for the mesh points at the interface, while a combined method was used for the non-uniform mesh points within the layers. The method was tested and then applied to calculate the migration of impurities from multilayer systems into liquid or solid samples, in migration experiments performed for quality testing purposes. An application was developed in the field of impurity migration from multilayer plastic packaging into food, a problem of increasing importance in the food industry.
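
    A minimal sketch of this kind of scheme is given below, assuming an explicit FTCS update inside each layer and a one-sided flux balance at the interface where the partition condition c2 = K·c1 holds. All material parameters are made up, and the paper's fictitious-point treatment is only approximated.

```python
import numpy as np

# 1-D diffusion across a two-layer laminate with a partition coefficient K.
D1, D2 = 1e-10, 5e-11          # diffusivities, m^2/s (hypothetical)
L1, L2 = 50e-6, 100e-6         # layer thicknesses, m
K = 2.0                        # partition coefficient at the interface
n1, n2 = 51, 101
dx1, dx2 = L1 / (n1 - 1), L2 / (n2 - 1)
dt = 0.4 * min(dx1**2 / D1, dx2**2 / D2)   # FTCS stability limit

c1 = np.ones(n1)               # layer 1 starts loaded with impurity
c2 = np.zeros(n2)              # layer 2 starts clean

for _ in range(20000):
    # interior updates (FTCS)
    c1[1:-1] += D1 * dt / dx1**2 * (c1[2:] - 2 * c1[1:-1] + c1[:-2])
    c2[1:-1] += D2 * dt / dx2**2 * (c2[2:] - 2 * c2[1:-1] + c2[:-2])
    # outer boundaries: zero flux
    c1[0] = c1[1]
    c2[-1] = c2[-2]
    # interface: one-sided flux continuity with c2 = K * c1
    a = (D1 * c1[-2] / dx1 + D2 * c2[1] / dx2) / (D1 / dx1 + K * D2 / dx2)
    c1[-1], c2[0] = a, K * a

print("impurity mass migrated into layer 2:", np.trapz(c2, dx=dx2))
```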

  15. Award for Distinguished Professional Contributions to Applied Research: Luciano L'Abate

    ERIC Educational Resources Information Center

    American Psychologist, 2009

    2009-01-01

    Luciano L'Abate, recipient of the Award for Distinguished Professional Contributions to Applied Research, contributed to applied research through the introduction of the laboratory method in clinical psychology assessment and intervention, leading to the development of the first automated playroom, linking play therapy with research in child…

  16. Evaluating a physician leadership development program - a mixed methods approach.

    PubMed

    Throgmorton, Cheryl; Mitchell, Trey; Morley, Tom; Snyder, Marijo

    2016-05-16

    Purpose - With the extent of change in healthcare today, organizations need strong physician leaders. To compensate for the lack of physician leadership education, many organizations are sending physicians to external leadership programs or developing in-house leadership programs targeted specifically to physicians. The purpose of this paper is to outline the evaluation strategy and outcomes of the inaugural year of a Physician Leadership Academy (PLA) developed and implemented at a Michigan-based regional healthcare system. Design/methodology/approach - The authors applied the theoretical framework of Kirkpatrick's four levels of evaluation and used surveys, observations, activity tracking, and interviews to evaluate the program outcomes. The authors applied grounded theory techniques to the interview data. Findings - The program met targeted outcomes across all four levels of evaluation. Interview themes focused on the significance of increasing self-awareness, building relationships, applying new skills, and building confidence. Research limitations/implications - While only one example, this study illustrates the importance of developing the evaluation strategy as part of the program design. Qualitative research methods, often lacking from learning evaluation design, uncover rich themes of impact. The study supports how a PLA program can enhance physician learning, engagement, and relationship building throughout and after the program. Physician leaders' partnership with organization development and learning professionals yields results with impact to individuals, groups, and the organization. Originality/value - Few studies provide an in-depth review of evaluation methods and outcomes of physician leadership development programs. Healthcare organizations seeking to develop similar in-house programs may benefit from applying the evaluation strategy outlined in this study.

  17. Diagnostics and Active Control of Aircraft Interior Noise

    NASA Technical Reports Server (NTRS)

    Fuller, C. R.

    1998-01-01

    This project deals with developing advanced methods for investigating and controlling interior noise in aircraft. The work concentrates on developing and applying the techniques of Near Field Acoustic Holography (NAH) and Principal Component Analysis (PCA) to the aircraft interior noise dynamic problem. This involves investigating the current state of the art, developing new techniques and then applying them to the particular problem being studied. The knowledge gained under the first part of the project was then used to develop and apply new, advanced noise control techniques for reducing interior noise. A new fully active control approach based on the PCA was developed and implemented on a test cylinder. Finally an active-passive approach based on tunable vibration absorbers was to be developed and analytically applied to a range of test structures from simple plates to aircraft fuselages.

  18. Development of a multivariate tool to reject background in a WZ diboson search for the CDF experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cremonesi, Matteo

    In the framework of the strong ongoing data analysis effort of the CDF collaboration at Fermilab, a method was developed by the candidate to improve the background rejection efficiency in the search for associated pair production of electroweak W, Z bosons. The performances of the method for vetoing the tt background in a WZ/ZZ → fνq$\bar{q}$ diboson search are reported. The method was developed in the inclusive 2-jets sample and applied to the "tag-2 jets" region, the subsample defined by the request that the two jets carry beauty flavor. In this region the tt production is one of the largest backgrounds. The tt veto proceeds in two steps: first, a set of pre-selection cuts are applied in a candidate sample where up to two leptons are accepted in addition to a jet pair, and the ZZ component of the signal is thus preserved; next, a Neural Network is trained to indicate the probability that the event be top-pair production. To validate the method as developed in the inclusive 2-jets sample, it is applied to the veto region, providing a significant rejection of this important background.

  19. Simultaneous determination of mebeverine hydrochloride and chlordiazepoxide in their binary mixture using novel univariate spectrophotometric methods via different manipulation pathways.

    PubMed

    Lotfy, Hayam M; Fayez, Yasmin M; Michael, Adel M; Nessim, Christine K

    2016-02-15

    Smart, sensitive, simple and accurate spectrophotometric methods were developed and validated for the quantitative determination of a binary mixture of mebeverine hydrochloride (MVH) and chlordiazepoxide (CDZ) without prior separation steps via different manipulating pathways. These pathways were applied either on zero order absorption spectra namely, absorbance subtraction (AS) or based on the recovered zero order absorption spectra via a decoding technique namely, derivative transformation (DT) or via ratio spectra namely, ratio subtraction (RS) coupled with extended ratio subtraction (EXRS), spectrum subtraction (SS), constant multiplication (CM) and constant value (CV) methods. The manipulation steps applied on the ratio spectra are namely, ratio difference (RD) and amplitude modulation (AM) methods or applying a derivative to these ratio spectra namely, derivative ratio (DD(1)) or second derivative (D(2)). Finally, the pathway based on the ratio spectra of derivative spectra is namely, derivative subtraction (DS). The specificity of the developed methods was investigated by analyzing the laboratory mixtures and was successfully applied for their combined dosage form. The proposed methods were validated according to ICH guidelines. These methods exhibited linearity in the range of 2-28μg/mL for mebeverine hydrochloride and 1-12μg/mL for chlordiazepoxide. The obtained results were statistically compared with those of the official methods using Student t-test, F-test, and one way ANOVA, showing no significant difference with respect to accuracy and precision. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Simultaneous determination of mebeverine hydrochloride and chlordiazepoxide in their binary mixture using novel univariate spectrophotometric methods via different manipulation pathways

    NASA Astrophysics Data System (ADS)

    Lotfy, Hayam M.; Fayez, Yasmin M.; Michael, Adel M.; Nessim, Christine K.

    2016-02-01

    Smart, sensitive, simple and accurate spectrophotometric methods were developed and validated for the quantitative determination of a binary mixture of mebeverine hydrochloride (MVH) and chlordiazepoxide (CDZ) without prior separation steps via different manipulating pathways. These pathways were applied either on zero order absorption spectra namely, absorbance subtraction (AS) or based on the recovered zero order absorption spectra via a decoding technique namely, derivative transformation (DT) or via ratio spectra namely, ratio subtraction (RS) coupled with extended ratio subtraction (EXRS), spectrum subtraction (SS), constant multiplication (CM) and constant value (CV) methods. The manipulation steps applied on the ratio spectra are namely, ratio difference (RD) and amplitude modulation (AM) methods or applying a derivative to these ratio spectra namely, derivative ratio (DD1) or second derivative (D2). Finally, the pathway based on the ratio spectra of derivative spectra is namely, derivative subtraction (DS). The specificity of the developed methods was investigated by analyzing the laboratory mixtures and was successfully applied for their combined dosage form. The proposed methods were validated according to ICH guidelines. These methods exhibited linearity in the range of 2-28 μg/mL for mebeverine hydrochloride and 1-12 μg/mL for chlordiazepoxide. The obtained results were statistically compared with those of the official methods using Student t-test, F-test, and one way ANOVA, showing no significant difference with respect to accuracy and precision.

  1. HPTLC Method for the Determination of Paracetamol, Pseudoephedrine and Loratidine in Tablets and Human Plasma

    PubMed Central

    Farid, Nehal Fayek; Abdelaleem, Eglal A.

    2016-01-01

    A sensitive, accurate and selective high performance thin layer chromatography (HPTLC) method was developed and validated for the simultaneous determination of paracetamol (PAR), its toxic impurity 4-aminophenol (4-AP), pseudoephedrine HCl (PSH) and loratidine (LOR). The proposed chromatographic method has been developed using HPTLC aluminum plates precoated with silica gel 60 F254 using acetone–hexane–ammonia (4:5:0.1, by volume) as a developing system followed by densitometric measurement at 254 nm for PAR, 4-AP and LOR, while PSH was scanned at 208 nm. System suitability testing parameters were calculated to ascertain the quality performance of the developed chromatographic method. The method was validated with respect to USP guidelines regarding accuracy, precision and specificity. The method was successfully applied for the determination of PAR, PSH and LOR in ATSHI® tablets. The three drugs were also determined in plasma by applying the proposed method in the ranges of 0.5–6 µg/band, 1.6–12 µg/band and 0.4–2 µg/band for PAR, PSH and LOR, respectively. The results obtained by the proposed method were compared with those obtained by a reported HPLC method, and there was no significance difference between both methods regarding accuracy and precision. PMID:26762956

  2. PHONICS WITH CONTEXT CLUES AS APPLIED TO LANGUAGE ARTS.

    ERIC Educational Resources Information Center

    GUFFEY, MARY DEMAREE

    A SIMPLIFIED METHOD OF PHONICS UTILIZING THE GESTALT METHOD OF LEARNING IS PRESENTED. THE WORDS IN THIS COURSE IN PHONICS ARE TO BE TAUGHT AT A TIME DIFFERENT FROM THE READING CLASSES, BUT THE PRINCIPLES DEVELOPED ARE TO BE APPLIED WITHIN THE READING CLASSES. THE COURSE CAN BE USED WITH ANY BASIC TEXT AND STRESSES THE ABILITY OF CHILDREN TO…

  3. MO-DE-BRA-05: Developing Effective Medical Physics Knowledge Structures: Models and Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprawls, P

    Purpose: Develop a method and supporting online resources to be used by medical physics educators for teaching medical imaging professionals and trainees so they develop highly-effective physics knowledge structures that can contribute to improved diagnostic image quality on a global basis. Methods: The different types of mental knowledge structures were analyzed and modeled with respect to both the learning and teaching process for their development and the functions or tasks that can be performed with the knowledge. While symbolic verbal and mathematical knowledge structures are very important in medical physics for many purposes, the tasks of applying physics in clinical imaging--especially to optimize image quality and diagnostic accuracy--requires a sensory conceptual knowledge structure, specifically, an interconnected network of visually based concepts. This type of knowledge supports tasks such as analysis, evaluation, problem solving, interacting, and creating solutions. Traditional educational methods including lectures, online modules, and many texts are serial procedures and limited with respect to developing interconnected conceptual networks. A method consisting of the synergistic combination of on-site medical physics teachers and the online resource, CONET (Concept network developer), has been developed and made available for the topic Radiographic Image Quality. This was selected as the inaugural topic, others to follow, because it can be used by medical physicists teaching the large population of medical imaging professionals, such as radiology residents, who can apply the knowledge. Results: Tutorials for medical physics educators on developing effective knowledge structures are being presented and published and CONET is available with open access for all to use. Conclusion: An adjunct to traditional medical physics educational methods with the added focus on sensory concept development provides opportunities for medical physics teachers to share their knowledge and experience at a higher cognitive level and produce medical professionals with the enhanced ability to apply physics to clinical procedures.

  4. Development and evaluation of nursing user interface screens using multiple methods.

    PubMed

    Hyun, Sookyung; Johnson, Stephen B; Stetson, Peter D; Bakken, Suzanne

    2009-12-01

    Building upon the foundation of the Structured Narrative Electronic Health Record (EHR) model, we applied theory-based (combined Technology Acceptance Model and Task-Technology Fit Model) and user-centered methods to explore nurses' perceptions of functional requirements for an electronic nursing documentation system, design user interface screens reflective of the nurses' perspectives, and assess nurses' perceptions of the usability of the prototype user interface screens. The methods resulted in user interface screens that were perceived to be easy to use, potentially useful, and well-matched to nursing documentation tasks associated with Nursing Admission Assessment, Blood Administration, and Nursing Discharge Summary. The methods applied in this research may serve as a guide for others wishing to implement user-centered processes to develop or extend EHR systems. In addition, some of the insights obtained in this study may be informative to the development of safe and efficient user interface screens for nursing document templates in EHRs.

  5. Effects of a Program for Developing Creative Thinking Skills

    ERIC Educational Resources Information Center

    Rabanos, Natalia Larraz; Torres, Pedro Allueva

    2012-01-01

    Introduction: The aim of this study is to present an intervention program for creative skills development applied to a group of students of lower Secondary Education. Method: This program was applied in a school in Zaragoza (Spain) during the 2008-09 academic year. The study used a repeated-measures, quasi-experimental design with non-equivalent…

  6. Mass spectrometry-based protein identification by integrating de novo sequencing with database searching.

    PubMed

    Wang, Penghao; Wilson, Susan R

    2013-01-01

    Mass spectrometry-based protein identification is a very challenging task. The main identification approaches include de novo sequencing and database searching. Both approaches have shortcomings, so an integrative approach has been developed. The integrative approach firstly infers partial peptide sequences, known as tags, directly from tandem spectra through de novo sequencing, and then puts these sequences into a database search to see if a close peptide match can be found. However the current implementation of this integrative approach has several limitations. Firstly, simplistic de novo sequencing is applied and only very short sequence tags are used. Secondly, most integrative methods apply an algorithm similar to BLAST to search for exact sequence matches and do not accommodate sequence errors well. Thirdly, by applying these methods the integrated de novo sequencing makes a limited contribution to the scoring model which is still largely based on database searching. We have developed a new integrative protein identification method which can integrate de novo sequencing more efficiently into database searching. Evaluated on large real datasets, our method outperforms popular identification methods.

  7. FIELD ANALYTICAL METHODS: ADVANCED FIELD MONITORING METHODS DEVELOPMENT AND EVALUATION OF NEW AND INNOVATIVE TECHNOLOGIES THAT SUPPORT THE SITE CHARACTERIZATION AND MONITORING REQUIREMENTS OF THE SUPERFUND PROGRAM.

    EPA Science Inventory

    The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...

  8. Methods for assessing the stability of slopes during earthquakes-A retrospective

    USGS Publications Warehouse

    Jibson, R.W.

    2011-01-01

    During the twentieth century, several methods to assess the stability of slopes during earthquakes were developed. Pseudostatic analysis was the earliest method; it involved simply adding a permanent body force representing the earthquake shaking to a static limit-equilibrium analysis. Stress-deformation analysis, a later development, involved much more complex modeling of slopes using a mesh in which the internal stresses and strains within elements are computed based on the applied external loads, including gravity and seismic loads. Stress-deformation analysis provided the most realistic model of slope behavior, but it is very complex and requires a high density of high-quality soil-property data as well as an accurate model of soil behavior. In 1965, Newmark developed a method that effectively bridges the gap between these two types of analysis. His sliding-block model is easy to apply and provides a useful index of co-seismic slope performance. Subsequent modifications to sliding-block analysis have made it applicable to a wider range of landslide types. Sliding-block analysis provides perhaps the greatest utility of all the types of analysis. It is far easier to apply than stress-deformation analysis, and it yields much more useful information than does pseudostatic analysis. © 2010.
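
    Newmark's sliding-block calculation is simple enough to sketch: the block accumulates displacement whenever the ground acceleration exceeds the critical (yield) acceleration, and sliding stops when the relative velocity returns to zero. The input motion below is synthetic and the implementation considers downslope sliding only; it is an illustration of the classical rigid-block analysis, not any specific published code.

```python
import numpy as np

def newmark_displacement(acc, dt, ac):
    """Cumulative downslope displacement (m) of a rigid block.

    acc : ground acceleration time history in g
    dt  : time step in s
    ac  : critical (yield) acceleration in g
    """
    g = 9.81
    vel, disp = 0.0, 0.0
    for a in acc:
        if vel > 0.0 or a > ac:          # block is sliding
            vel += (a - ac) * g * dt     # relative velocity, m/s
            if vel < 0.0:                # block re-locks to the ground
                vel = 0.0
            disp += vel * dt
    return disp

dt = 0.01
t = np.arange(0.0, 10.0, dt)
acc = 0.3 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.2 * t)   # made-up record, in g
print("Newmark displacement (m):", newmark_displacement(acc, dt, ac=0.1))
```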

  9. Algebraic Algorithm Design and Local Search

    DTIC Science & Technology

    1996-12-01

    A method for performing algorithm design that is more purely algebraic than that of KIDS was developed and then applied to local search. The approach was to follow KIDS in spirit, but to adopt a pure algebraic formalism supported by Kestrel's SPECWARE environment (79). A general theory of local search was developed as part of this work.

  10. Water supply management using an extended group fuzzy decision-making method: a case study in north-eastern Iran

    NASA Astrophysics Data System (ADS)

    Minatour, Yasser; Bonakdari, Hossein; Zarghami, Mahdi; Bakhshi, Maryam Ali

    2015-09-01

    The purpose of this study was to develop a group fuzzy multi-criteria decision-making method to be applied in rating problems associated with water resources management. Chen's group fuzzy TOPSIS method is extended here by a difference technique to handle the uncertainties of group decision making, and the extended method is combined with a consistency check. In the presented method, linguistic judgments are first screened through a consistency checking process, and these judgments are then used in the extended Chen's fuzzy TOPSIS method. Each expert's opinion is turned into precise mathematical numbers and, to incorporate uncertainties, the opinions of the group are turned into fuzzy numbers using three mathematical operators. The proposed method is applied to select the optimal strategy for the rural water supply of Nohoor village in north-eastern Iran, as a case study and illustrated example. Sensitivity analyses of the results and a comparison with project reality showed that the proposed method offers good results for water resources projects.
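
    The ranking core of Chen's fuzzy TOPSIS (triangular fuzzy ratings, vertex distances to the fuzzy ideal solutions, closeness coefficients) can be sketched as below. The alternatives, ratings and weights are hypothetical, and the consistency check and difference-technique extension described in the abstract are deliberately omitted.

```python
import numpy as np

# Triangular fuzzy ratings (l, m, u): alternatives x criteria x 3 (hypothetical).
ratings = np.array([
    [[5, 7, 9], [3, 5, 7], [7, 9, 10]],   # strategy A
    [[3, 5, 7], [7, 9, 10], [5, 7, 9]],   # strategy B
    [[7, 9, 10], [5, 7, 9], [3, 5, 7]],   # strategy C
], dtype=float)
weights = np.array([[0.3, 0.4, 0.5], [0.2, 0.3, 0.4], [0.2, 0.3, 0.4]])  # fuzzy weights

u_star = ratings[:, :, 2].max(axis=0)              # max upper value per criterion
norm = ratings / u_star[None, :, None]             # benefit-criterion normalization
weighted = norm * weights[None, :, :]              # fuzzy-weighted matrix

def vertex(a, b):
    """Vertex distance between triangular fuzzy numbers: sqrt(mean of squared diffs)."""
    return np.sqrt(((a - b) ** 2).mean(axis=-1))

d_plus = vertex(weighted, np.ones(3)).sum(axis=1)   # distance to fuzzy positive ideal
d_minus = vertex(weighted, np.zeros(3)).sum(axis=1) # distance to fuzzy negative ideal
closeness = d_minus / (d_plus + d_minus)
for name, cc in zip("ABC", closeness):
    print(f"strategy {name}: closeness coefficient {cc:.3f}")
```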

  11. Developing integrated methods to address complex resource and environmental issues

    USGS Publications Warehouse

    Smith, Kathleen S.; Phillips, Jeffrey D.; McCafferty, Anne E.; Clark, Roger N.

    2016-02-08

    Introduction: This circular provides an overview of selected activities that were conducted within the U.S. Geological Survey (USGS) Integrated Methods Development Project, an interdisciplinary project designed to develop new tools and conduct innovative research requiring integration of geologic, geophysical, geochemical, and remote-sensing expertise. The project was supported by the USGS Mineral Resources Program, and its products and acquired capabilities have broad applications to missions throughout the USGS and beyond. In addressing challenges associated with understanding the location, quantity, and quality of mineral resources, and in investigating the potential environmental consequences of resource development, a number of field and laboratory capabilities and interpretative methodologies evolved from the project that have applications to traditional resource studies as well as to studies related to ecosystem health, human health, disaster and hazard assessment, and planetary science. New or improved tools and research findings developed within the project have been applied to other projects and activities. Specifically, geophysical equipment and techniques have been applied to a variety of traditional and nontraditional mineral- and energy-resource studies, military applications, environmental investigations, and applied research activities that involve climate change, mapping techniques, and monitoring capabilities. Diverse applied geochemistry activities provide a process-level understanding of the mobility, chemical speciation, and bioavailability of elements, particularly metals and metalloids, in a variety of environmental settings. Imaging spectroscopy capabilities maintained and developed within the project have been applied to traditional resource studies as well as to studies related to ecosystem health, human health, disaster assessment, and planetary science. Brief descriptions of capabilities and laboratory facilities and summaries of some applications of project products and research findings are included in this circular. The work helped support the USGS mission to "provide reliable scientific information to describe and understand the Earth; minimize loss of life and property from natural disasters; manage water, biological, energy, and mineral resources; and enhance and protect our quality of life." Activities within the project: spanned scales from microscopic to planetary; demonstrated broad applications across disciplines; included life-cycle studies of mineral resources; incorporated specialized areas of expertise in applied geochemistry including mineralogy, hydrogeology, analytical chemistry, aqueous geochemistry, biogeochemistry, microbiology, aquatic toxicology, and public health; and incorporated specialized areas of expertise in geophysics including magnetics, gravity, radiometrics, electromagnetics, seismic, ground-penetrating radar, borehole radar, and imaging spectroscopy. This circular consists of eight sections that contain summaries of various activities under the project.
    The eight sections are listed below: Laboratory Facilities and Capabilities, which includes brief descriptions of the various types of laboratories and capabilities used for the project; Method and Software Development, which includes summaries of remote-sensing, geophysical, and mineralogical methods developed or enhanced by the project; Instrument Development, which includes descriptions of geophysical instruments developed under the project; Minerals, Energy, and Climate, which includes summaries of research that applies to mineral or energy resources, environmental processes and monitoring, and carbon sequestration by earth materials; Element Cycling, Toxicity, and Health, which includes summaries of several process-oriented geochemical and biogeochemical studies and health-related research activities; Hydrogeology and Water Quality, which includes descriptions of innovative geophysical, remote-sensing, and geochemical research pertaining to hydrogeology and water-quality applications; Hazards and Disaster Assessment, which includes summaries of research and method development that were applied to natural hazards, human-caused hazards, and disaster assessments; and Databases and Framework Studies, which includes descriptions of fundamental applications of geophysical studies and of the importance of archived data.

  12. Suggestopedia to SALT and a New Awareness in Education.

    ERIC Educational Resources Information Center

    Herr, Kay U.

    SALT, suggestive-accelerative learning and teaching, is the Americanized version of a pedagogy developed in Bulgaria. While most extensively applied to foreign language teaching, the methodology may be applied to any discipline, particularly one based upon a foundation of learned facts. This document applies the method to ESL classes. The teacher…

  13. Evaluating the efficiency of spectral resolution of univariate methods manipulating ratio spectra and comparing to multivariate methods: An application to ternary mixture in common cold preparation

    NASA Astrophysics Data System (ADS)

    Moustafa, Azza Aziz; Salem, Hesham; Hegazy, Maha; Ali, Omnia

    2015-02-01

    Simple, accurate, and selective methods have been developed and validated for simultaneous determination of a ternary mixture of Chlorpheniramine maleate (CPM), Pseudoephedrine HCl (PSE) and Ibuprofen (IBF) in tablet dosage form. Four univariate methods manipulating ratio spectra were applied: method A is the double divisor-ratio difference spectrophotometric method (DD-RD); method B is the double divisor-derivative ratio spectrophotometric method; method C is the derivative ratio spectrum-zero crossing method (DRZC); and method D is mean centering of ratio spectra (MCR). Two multivariate methods, E and F, were also developed and validated: Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods have the advantage of simultaneous determination of the mentioned drugs without prior separation steps. They were successfully applied to laboratory-prepared mixtures and to a commercial pharmaceutical preparation without any interference from additives. The proposed methods were validated according to the ICH guidelines. The obtained results were statistically compared with the official methods where no significant difference was observed regarding both accuracy and precision.
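
    For readers unfamiliar with multivariate calibration of overlapping spectra, the sketch below fits a PLS model to synthetic three-component mixtures and predicts the composition of an unseen mixture. The bands and concentration ranges are invented and bear no relation to the CPM/PSE/IBF data; scikit-learn's PLSRegression stands in for the chemometric software actually used.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wavelengths = np.linspace(200, 400, 201)

def band(center, width):
    # Gaussian absorption band (arbitrary units)
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

pure = np.vstack([band(230, 12), band(260, 15), band(320, 20)])   # pure-component spectra

conc = rng.uniform(1, 20, size=(40, 3))                            # training concentrations
spectra = conc @ pure + rng.normal(0, 0.01, (40, len(wavelengths)))  # Beer-Lambert mixtures + noise

pls = PLSRegression(n_components=3).fit(spectra, conc)
unknown = np.array([[5.0, 10.0, 2.0]]) @ pure                      # "unknown" mixture spectrum
print("predicted concentrations:", pls.predict(unknown).round(2))
```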

  14. Applied Use Value of Scientific Information for Management of Ecosystem Services

    NASA Astrophysics Data System (ADS)

    Raunikar, R. P.; Forney, W.; Bernknopf, R.; Mishra, S.

    2012-12-01

    The U.S. Geological Survey has developed and applied methods for quantifying the value of scientific information (VOI) that are based on the applied use value of the information. In particular, the applied use value of U.S. Geological Survey information often includes efficient management of ecosystem services. The economic nature of U.S. Geological Survey scientific information is largely equivalent to that of any information, but we focus the application of our VOI quantification methods on the information products provided freely to the public by the U.S. Geological Survey. We describe VOI economics in general and illustrate by referring to previous studies that use the evolving applied use value methods, including examples of the siting of landfills in Louden County, the mineral exploration efficiencies of finer-resolution geologic maps in Canada, and improved agricultural production and groundwater protection in eastern Iowa made possible with Landsat moderate-resolution satellite imagery. Finally, we describe the adaptation of the applied use value method to the case of streamgage information used to improve the efficiency of water markets in New Mexico.

  15. The application of contraction theory to an iterative formulation of electromagnetic scattering

    NASA Technical Reports Server (NTRS)

    Brand, J. C.; Kauffman, J. F.

    1985-01-01

    Contraction theory is applied to an iterative formulation of electromagnetic scattering from periodic structures and a computational method for insuring convergence is developed. A short history of spectral (or k-space) formulation is presented with an emphasis on application to periodic surfaces. To insure a convergent solution of the iterative equation, a process called the contraction corrector method is developed. Convergence properties of previously presented iterative solutions to one-dimensional problems are examined utilizing contraction theory and the general conditions for achieving a convergent solution are explored. The contraction corrector method is then applied to several scattering problems including an infinite grating of thin wires with the solution data compared to previous works.
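
    The convergence requirement that contraction theory supplies is the Banach fixed-point condition; it is stated generically below (this is the standard estimate, not the specific contraction corrector operator of the paper).

```latex
% Contraction condition for the iteration x_{n+1} = T(x_n):
\left\| T(x) - T(y) \right\| \;\le\; q \left\| x - y \right\|, \qquad 0 \le q < 1,
% which guarantees convergence to the unique fixed point x* with the a priori bound
\left\| x_{n} - x^{*} \right\| \;\le\; \frac{q^{\,n}}{1-q}\,\left\| x_{1} - x_{0} \right\| .
```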

  16. On the effect of boundary layer growth on the stability of compressible flows

    NASA Technical Reports Server (NTRS)

    El-Hady, N. M.

    1981-01-01

    The method of multiple scales is used to describe a formally correct method based on the nonparallel linear stability theory, that examines the two and three dimensional stability of compressible boundary layer flows. The method is applied to the supersonic flat plate layer at Mach number 4.5. The theoretical growth rates are in good agreement with experimental results. The method is also applied to the infinite-span swept wing transonic boundary layer with suction to evaluate the effect of the nonparallel flow on the development of crossflow disturbances.

  17. A New View of Earthquake Ground Motion Data: The Hilbert Spectral Analysis

    NASA Technical Reports Server (NTRS)

    Huang, Norden; Busalacchi, Antonio J. (Technical Monitor)

    2000-01-01

    A brief description of the newly developed Empirical Mode Decomposition (EMD) and Hilbert Spectral Analysis (HSA) method will be given. The decomposition is adaptive and can be applied to both nonlinear and nonstationary data. An example of the method applied to a sample earthquake record will be given. The results indicate that low-frequency components, totally missed by the Fourier analysis, are clearly identified by the new method. Comparisons with wavelet and windowed Fourier analysis show the new method offers much better temporal and frequency resolutions.
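
    A minimal sketch of the EMD plus Hilbert workflow on a synthetic nonstationary signal follows. It assumes the third-party PyEMD package (installable as EMD-signal) for the sifting step and uses scipy's analytic signal for instantaneous frequency; the signal is made up and is not a strong-motion record.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD   # assumed third-party package for the sifting step

fs = 200.0
t = np.arange(0.0, 10.0, 1.0 / fs)
# Synthetic nonstationary signal: a fixed tone plus a chirp-like component.
signal = np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.sin(2 * np.pi * (5 + 0.5 * t) * t)

imfs = EMD().emd(signal)                   # adaptive decomposition into IMFs
for k, imf in enumerate(imfs):
    analytic = hilbert(imf)                # analytic signal of each IMF
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    print(f"IMF {k}: mean instantaneous frequency {inst_freq.mean():.2f} Hz")
```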

  18. Validation of a pulsed electric field process to pasteurize strawberry puree

    USDA-ARS?s Scientific Manuscript database

    An inexpensive data acquisition method was developed to validate the exact number and shape of the pulses applied during pulsed electric fields (PEF) processing. The novel validation method was evaluated in conjunction with developing a pasteurization PEF process for strawberry puree. Both buffered...

  19. Sparse QSAR modelling methods for therapeutic and regenerative medicine

    NASA Astrophysics Data System (ADS)

    Winkler, David A.

    2018-02-01

    The quantitative structure-activity relationships method was popularized by Hansch and Fujita over 50 years ago. The usefulness of the method for drug design and development has been shown in the intervening years. As it was developed initially to elucidate which molecular properties modulated the relative potency of putative agrochemicals, and at a time when computing resources were scarce, there is much scope for applying modern mathematical methods to improve the QSAR method and to extending the general concept to the discovery and optimization of bioactive molecules and materials more broadly. I describe research over the past two decades where we have rebuilt the unit operations of the QSAR method using improved mathematical techniques, and have applied this valuable platform technology to new important areas of research and industry such as nanoscience, omics technologies, advanced materials, and regenerative medicine. This paper was presented as the 2017 ACS Herman Skolnik lecture.
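
    One common sparse-regression realization of this idea is LASSO feature selection over a descriptor matrix; the sketch below uses random stand-in descriptors and activities and is not the author's specific modelling pipeline.

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Sparse QSAR sketch: LASSO selects the few descriptors that explain activity.
rng = np.random.default_rng(7)
X = rng.normal(size=(60, 200))                 # 60 molecules, 200 descriptors (synthetic)
true_coef = np.zeros(200)
true_coef[[3, 17, 42]] = [1.5, -2.0, 0.8]      # only three descriptors actually matter
y = X @ true_coef + rng.normal(0, 0.1, size=60)

model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(np.abs(model.coef_) > 1e-3)
print("descriptors retained by the sparse model:", selected)
```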

  20. Aiming for the Singing Teacher: An Applied Study on Preservice Kindergarten Teachers' Singing Skills Development within a Music Methods Course

    ERIC Educational Resources Information Center

    Neokleous, Rania

    2015-01-01

    This study examined the effects of a music methods course offered at a Cypriot university on the singing skills of 33 female preservice kindergarten teachers. To systematically measure and analyze student progress, the research design was both experimental and descriptive. As an applied study which was carried out "in situ," the normal…

  1. Recent developments in ejector technology in the Air Force: An overview

    NASA Technical Reports Server (NTRS)

    Nagaraja, K. S.

    1979-01-01

    Basic and applied studies in thrust augmentation conducted at the Aerospace Research Laboratory at Wright-Patterson AFB which led to an effective configuration of the jet flap diffuser ejector, are reviewed. A method for compressible ejector flow analysis, developed in support of the preliminary design of an ejector thrust aircraft, is discussed and applied to single- and two-stage ejectors.

  2. Semi-automatic version of the potentiometric titration method for characterization of uranium compounds.

    PubMed

    Cristiano, Bárbara F G; Delgado, José Ubiratan; da Silva, José Wanderley S; de Barros, Pedro D; de Araújo, Radier M S; Dias, Fábio C; Lopes, Ricardo T

    2012-09-01

    The potentiometric titration method was used for characterization of uranium compounds to be applied in intercomparison programs. The method is applied with traceability assured using a potassium dichromate primary standard. A semi-automatic version was developed to reduce the analysis time and the operator variation. The standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization and compatible with those obtained by manual techniques. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Non-Adiabatic Molecular Dynamics Methods for Materials Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furche, Filipp; Parker, Shane M.; Muuronen, Mikko J.

    2017-04-04

    The flow of radiative energy in light-driven materials such as photosensitizer dyes or photocatalysts is governed by non-adiabatic transitions between electronic states and cannot be described within the Born-Oppenheimer approximation commonly used in electronic structure theory. The non-adiabatic molecular dynamics (NAMD) methods based on Tully surface hopping and time-dependent density functional theory developed in this project have greatly extended the range of molecular materials that can be tackled by NAMD simulations. New algorithms to compute molecular excited state and response properties efficiently were developed. Fundamental limitations of common non-linear response methods were discovered and characterized. Methods for accurate computations of vibronic spectra of materials such as black absorbers were developed and applied. It was shown that open-shell TDDFT methods capture bond breaking in NAMD simulations, a longstanding challenge for single-reference molecular dynamics simulations. The methods developed in this project were applied to study the photodissociation of acetaldehyde and revealed that non-adiabatic effects are experimentally observable in fragment kinetic energy distributions. Finally, the project enabled the first detailed NAMD simulations of photocatalytic water oxidation by titania nanoclusters, uncovering the mechanism of this fundamentally important reaction for fuel generation and storage.

  4. Prediction of Solvent Physical Properties using the Hierarchical Clustering Method

    EPA Science Inventory

    Recently a QSAR (Quantitative Structure Activity Relationship) method, the hierarchical clustering method, was developed to estimate acute toxicity values for large, diverse datasets. This methodology has now been applied to estimate solvent physical properties including sur...

  5. Method of characteristics for three-dimensional axially symmetrical supersonic flows.

    NASA Technical Reports Server (NTRS)

    Sauer, R

    1947-01-01

    An approximation method for three-dimensional axially symmetrical supersonic flows is developed; it is based on the characteristics theory (represented partly graphically, partly analytically). Thereafter this method is applied to the construction of rotationally symmetrical nozzles. (author)

  6. Scattering from very rough layers under the geometric optics approximation: further investigation.

    PubMed

    Pinel, Nicolas; Bourlier, Christophe

    2008-06-01

    Scattering from very rough homogeneous layers is studied in the high-frequency limit (under the geometric optics approximation) by taking the shadowing effect into account. To do so, the iterated Kirchhoff approximation, recently developed by Pinel et al. [Waves Random Complex Media 17, 283 (2007)] and reduced to the geometric optics approximation, is used and investigated in more detail. The contributions from the higher orders of scattering inside the rough layer are calculated under the iterated Kirchhoff approximation. The method can be applied to rough layers of either very rough or perfectly flat lower interfaces, separating either lossless or lossy media. The results are compared with the PILE (propagation-inside-layer expansion) method, recently developed by Déchamps et al. [J. Opt. Soc. Am. A 23, 359 (2006)], and accelerated by the forward-backward method with spectral acceleration. They highlight that there is very good agreement between the developed method and the reference numerical method for all scattering orders and that the method can be applied to root-mean-square (RMS) heights at least down to 0.25λ.

  7. HPTLC Method for the Determination of Paracetamol, Pseudoephedrine and Loratidine in Tablets and Human Plasma.

    PubMed

    Farid, Nehal Fayek; Abdelaleem, Eglal A

    2016-04-01

    A sensitive, accurate and selective high performance thin layer chromatography (HPTLC) method was developed and validated for the simultaneous determination of paracetamol (PAR), its toxic impurity 4-aminophenol (4-AP), pseudoephedrine HCl (PSH) and loratidine (LOR). The proposed chromatographic method has been developed using HPTLC aluminum plates precoated with silica gel 60 F254 using acetone-hexane-ammonia (4:5:0.1, by volume) as a developing system followed by densitometric measurement at 254 nm for PAR, 4-AP and LOR, while PSH was scanned at 208 nm. System suitability testing parameters were calculated to ascertain the quality performance of the developed chromatographic method. The method was validated with respect to USP guidelines regarding accuracy, precision and specificity. The method was successfully applied for the determination of PAR, PSH and LOR in ATSHI(®) tablets. The three drugs were also determined in plasma by applying the proposed method in the ranges of 0.5-6 µg/band, 1.6-12 µg/band and 0.4-2 µg/band for PAR, PSH and LOR, respectively. The results obtained by the proposed method were compared with those obtained by a reported HPLC method, and there was no significant difference between the two methods regarding accuracy and precision. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. Shining Light on Higher Education's Newest Baccalaureate Degrees and the Research Needed to Understand Their Impact

    ERIC Educational Resources Information Center

    Bragg, Debra D.; Soler, Maria Claudia

    2016-01-01

    This chapter discusses methods and measures that are needed to conduct research on newly developing Applied Baccalaureate degrees that enable students to transfer applied college credits heretofore considered terminal to bachelor's degree programs.

  9. [International trends of applied ecology and its future development in China].

    PubMed

    Zhou, Qixing; Sun, Shunjiang

    2002-07-01

    Applied ecology emerged internationally some 25-40 years ago to serve the need to mitigate the increasing environmental pollution and ecological destruction in developed western countries at that time. As the subject has developed, applied ecological principles have continued to underpin most efforts at solving the increasing deterioration of natural resources and serious eco-environmental problems, which remain its keystone and research kernel. At the advent of the 21st century, human beings are entering the age of applied ecology. Five international features of applied ecology can be identified: more attention to many-sided applications, special emphasis on the intersection with engineering, strong continuing links with basic ecology, omnidirectional adoption of new methods and new technology, and side-by-side trends of microcosmic mechanisms and macroscopic regulation. Although China must connect with international applied ecology and absorb the distillates of the subject from developed western countries, the future development of applied ecology in China, particularly at the beginning of the 21st century, should not deviate from solving the increasing environmental pollution and ecological destruction that constitute one of the most important basic situations of the country.

  10. Provider payment in community-based health insurance schemes in developing countries: a systematic review

    PubMed Central

    Robyn, Paul Jacob; Sauerborn, Rainer; Bärnighausen, Till

    2013-01-01

    Objectives Community-based health insurance (CBI) is a common mechanism to generate financial resources for health care in developing countries. We review for the first time provider payment methods used in CBI in developing countries and their impact on CBI performance. Methods We conducted a systematic review of the literature on provider payment methods used by CBI in developing countries published up to January 2010. Results Information on provider payment was available for a total of 32 CBI schemes in 34 reviewed publications: 17 schemes in South Asia, 10 in sub-Saharan Africa, 4 in East Asia and 1 in Latin America. Various types of provider payment were applied by the CBI schemes: 17 used fee-for-service, 12 used salaries, 9 applied a coverage ceiling, 7 used capitation and 6 applied a co-insurance. The evidence suggests that provider payment impacts CBI performance through provider participation and support for CBI, population enrolment and patient satisfaction with CBI, quantity and quality of services provided and provider and patient retention. Lack of provider participation in designing and choosing a CBI payment method can lead to reduced provider support for the scheme. Conclusion CBI schemes in developing countries have used a wide range of provider payment methods. The existing evidence suggests that payment methods are a key determinant of CBI performance and sustainability, but the strength of this evidence is limited since it is largely based on observational studies rather than on trials or on quasi-experimental research. According to the evidence, provider payment can affect provider participation, satisfaction and retention in CBI; the quantity and quality of services provided to CBI patients; patient demand of CBI services; and population enrollment, risk pooling and financial sustainability of CBI. CBI schemes should carefully consider how their current payment methods influence their performance, how changes in the methods could improve performance, and how such effects could be assessed with scientific rigour to increase the strength of evidence on this topic. PMID:22522770

  11. A novel method of utilizing permeable reactive kiddle (PRK) for the remediation of acid mine drainage.

    PubMed

    Lee, Woo-Chun; Lee, Sang-Woo; Yun, Seong-Taek; Lee, Pyeong-Koo; Hwang, Yu Sik; Kim, Soon-Oh

    2016-01-15

    Numerous technologies have been developed and applied to remediate AMD, but each has specific drawbacks. To overcome the limitations of existing methods and improve their effectiveness, we propose a novel method utilizing permeable reactive kiddle (PRK). This manuscript explores the performance of the PRK method. In line with the concept of green technology, the PRK method recycles industrial waste, such as steel slag and waste cast iron. Our results demonstrate that the PRK method can be applied to remediate AMD under optimal operational conditions. In particular, this method allows for simple installation and low cost compared with established technologies. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Application of TOPSIS and VIKOR improved versions in a multi criteria decision analysis to develop an optimized municipal solid waste management model.

    PubMed

    Aghajani Mir, M; Taherei Ghazvinei, P; Sulaiman, N M N; Basri, N E A; Saheri, S; Mahmood, N Z; Jahan, A; Begum, R A; Aghamohammadi, N

    2016-01-15

    Selecting a suitable Multi Criteria Decision Making (MCDM) method is a crucial stage in establishing a Solid Waste Management (SWM) system. The main objective of the current study is to demonstrate and evaluate a proposed method using Multiple Criteria Decision Making methods (MCDM). An improved version of the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) was applied to obtain the best municipal solid waste management method by comparing and ranking the scenarios; applying this method to rank treatment methods is introduced as one contribution of the study. In addition, the Viekriterijumsko Kompromisno Rangiranje (VIKOR) compromise solution method was applied for sensitivity analyses. The proposed method can assist urban decision makers in prioritizing and selecting an optimized Municipal Solid Waste (MSW) treatment system, and a logical and systematic scientific method was proposed to guide appropriate decision-making. A modified TOPSIS methodology, superior to existing methods, was applied to MSW problems for the first time. Next, 11 scenarios of MSW treatment methods are defined and compared environmentally and economically based on the waste management conditions. Results show that integrating a sanitary landfill (18.1%), RDF (3.1%), composting (2%), anaerobic digestion (40.4%), and recycling (36.4%) was an optimized model of integrated waste management. The applied decision-making structure provides the opportunity for optimum decision-making. Therefore, the mix of recycling and anaerobic digestion and a sanitary landfill with Electricity Production (EP) are the preferred options for MSW management. Copyright © 2015 Elsevier Ltd. All rights reserved.
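
    To make the ranking step concrete, the following is a minimal Python sketch of the standard TOPSIS procedure applied to a small, invented decision matrix; the scenario scores, criteria and weights below are purely illustrative and are not taken from the study.

        import numpy as np

        def topsis_rank(decision_matrix, weights, benefit_criteria):
            """Closeness coefficients of the standard TOPSIS procedure (higher is better)."""
            X = np.asarray(decision_matrix, dtype=float)
            norm = X / np.sqrt((X ** 2).sum(axis=0))   # vector-normalize each criterion column
            v = norm * weights                          # weighted normalized matrix
            ideal_best = np.where(benefit_criteria, v.max(axis=0), v.min(axis=0))
            ideal_worst = np.where(benefit_criteria, v.min(axis=0), v.max(axis=0))
            d_best = np.sqrt(((v - ideal_best) ** 2).sum(axis=1))
            d_worst = np.sqrt(((v - ideal_worst) ** 2).sum(axis=1))
            return d_worst / (d_best + d_worst)

        # Hypothetical treatment scenarios (rows) scored on cost and emissions (lower is
        # better) and energy recovery (higher is better); weights are illustrative only.
        scores = [[120, 40, 55], [95, 60, 70], [140, 25, 80]]
        weights = np.array([0.4, 0.3, 0.3])
        benefit = np.array([False, False, True])
        print(topsis_rank(scores, weights, benefit))    # rank scenarios by closeness coefficient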

  13. Evaluating methods of inferring gene regulatory networks highlights their lack of performance for single cell gene expression data.

    PubMed

    Chen, Shuonan; Mar, Jessica C

    2018-06-19

    A fundamental fact in biology states that genes do not operate in isolation, and yet methods that infer regulatory networks for single cell gene expression data have been slow to emerge. With single cell sequencing methods now becoming accessible, general network inference algorithms that were initially developed for data collected from bulk samples may not be suitable for single cells. Meanwhile, although methods that are specific for single cell data are now emerging, whether they have improved performance over general methods is unknown. In this study, we evaluate the applicability of five general methods and three single cell methods for inferring gene regulatory networks from both experimental single cell gene expression data and in silico simulated data. Standard evaluation metrics using ROC curves and Precision-Recall curves against reference sets sourced from the literature demonstrated that most of the methods performed poorly when they were applied to either experimental single cell data or simulated single cell data, which demonstrates their lack of performance for this task. Using default settings, network methods were applied to the same datasets. Comparisons of the learned networks highlighted the uniqueness of some predicted edges for each method. The fact that different methods infer networks that vary substantially reflects the underlying mathematical rationale and assumptions that distinguish network methods from each other. This study provides a comprehensive evaluation of network modeling algorithms applied to experimental single cell gene expression data and in silico simulated datasets where the network structure is known. Comparisons demonstrate that most of these assessed network methods are not able to predict network structures from single cell expression data accurately, even if they were specifically developed for single cell data. Also, single cell methods, which usually depend on more elaborate algorithms, in general have less similarity to each other in the sets of edges detected. The results from this study emphasize the importance of developing more accurate, optimized network modeling methods that are compatible with single cell data. Newly-developed single cell methods may uniquely capture particular features of potential gene-gene relationships, and caution should be taken when we interpret these results.

  14. Applying operational research and data mining to performance based medical personnel motivation system.

    PubMed

    Niaksu, Olegas; Zaptorius, Jonas

    2014-01-01

    This paper presents a methodology suitable for creation of a performance related remuneration system in the healthcare sector, which would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking and a posteriori evaluation has been proposed and discussed. The Priority Distribution Method is applied for unbiased performance criteria weighting. Data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare specific criteria selection consisting of 8 steps, and proposed and demonstrated application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluation of the motivational system outcomes was proposed. The described methodology for calculating performance related payment needs practical validation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators. The final step would be validation of the methodology in a healthcare facility.

  15. Nonequilibrium radiative heating prediction method for aeroassist flowfields with coupling to flowfield solvers. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Hartung, Lin C.

    1991-01-01

    A method for predicting radiation absorption and emission coefficients in thermochemical nonequilibrium flows is developed. The method is called the Langley optimized radiative nonequilibrium code (LORAN). It applies the smeared band approximation for molecular radiation to produce moderately detailed results and is intended to fill the gap between detailed but costly prediction methods and very fast but highly approximate methods. The optimization of the method to provide efficient solutions allowing coupling to flowfield solvers is discussed. Representative results are obtained and compared to previous nonequilibrium radiation methods, as well as to ground- and flight-measured data. Reasonable agreement is found in all cases. A multidimensional radiative transport method is also developed for axisymmetric flows. Its predictions for wall radiative flux are 20 to 25 percent lower than those of the tangent slab transport method, as expected, though additional investigation of the symmetry and outflow boundary conditions is indicated. The method was applied to the peak heating condition of the aeroassist flight experiment (AFE) trajectory, with results comparable to predictions from other methods. The LORAN method was also applied in conjunction with the computational fluid dynamics (CFD) code LAURA to study the sensitivity of the radiative heating prediction to various models used in nonequilibrium CFD. This study suggests that radiation measurements can provide diagnostic information about the detailed processes occurring in a nonequilibrium flowfield because radiation phenomena are very sensitive to these processes.

  16. Tracking Preservice Kindergarten Teachers' Development of Singing Skills and Confidence: An Applied Study

    ERIC Educational Resources Information Center

    Neokleous, Rania

    2010-01-01

    The purpose of this study was to (a) examine the effects of a music methods course on the singing skills of preservice kindergarten teachers, (b) document the nature and development of their skills during the course, and (c) trace any changes in their confidence levels toward singing as a result of the course. As an applied study which was carried…

  17. Ratio manipulating spectrophotometry versus chemometry as stability indicating methods for cefquinome sulfate determination

    NASA Astrophysics Data System (ADS)

    Yehia, Ali M.; Arafa, Reham M.; Abbas, Samah S.; Amer, Sawsan M.

    2016-01-01

    Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were performed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering methods are ratio manipulating spectrophotometric methods that were satisfactorily applied for selective determination of CFQ within a linear range of 5.0-40.0 μg mL⁻¹. Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. These developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method.
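
    As an illustration of the multivariate calibration step, the sketch below fits a Partial Least Squares model to synthetic mixture spectra with scikit-learn; the simulated spectra, concentration ranges and number of components are assumptions for demonstration and do not reproduce the CFQ data.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        # Hypothetical calibration set: 25 synthetic mixtures x 200 spectral points,
        # built from three stand-in component spectra (drug plus two degradants).
        concentrations = rng.uniform(5.0, 40.0, size=(25, 3))
        pure_spectra = rng.random((3, 200))
        absorbance = concentrations @ pure_spectra + rng.normal(0, 0.01, (25, 200))

        pls = PLSRegression(n_components=3)
        pls.fit(absorbance, concentrations)

        # Predict the composition of a new, unseen mixture.
        test_conc = np.array([[20.0, 5.0, 10.0]])
        test_spec = test_conc @ pure_spectra
        print(pls.predict(test_spec))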

  18. Verifying Hybrid Systems Modeled as Timed Automata: A Case Study

    DTIC Science & Technology

    1997-03-01

    Introduction: Researchers have proposed many innovative formal methods for developing real-time systems [9]. Such methods can give system developers and ... customers greater confidence that real-time systems satisfy their requirements, especially their critical requirements. However, applying formal methods ... specifying and reasoning about real-time systems that is designed to address these challenging problems. Our approach is to build formal reasoning tools

  19. Lessons from comparative effectiveness research methods development projects funded under the Recovery Act.

    PubMed

    Zurovac, Jelena; Esposito, Dominick

    2014-11-01

    The American Recovery and Reinvestment Act of 2009 (ARRA) directed nearly US$29.2 million to comparative effectiveness research (CER) methods development. To help inform future CER methods investments, we describe the ARRA CER methods projects, identify barriers to this research and discuss the alignment of topics with published methods development priorities. We used several existing resources and held discussions with ARRA CER methods investigators. Although funded projects explored many identified priority topics, investigators noted that much work remains. For example, given the considerable investments in CER data infrastructure, the methods development field can benefit from additional efforts to educate researchers about the availability of new data sources and about how best to apply methods to match their research questions and data.

  20. Automated Segmentation of High-Resolution Photospheric Images of Active Regions

    NASA Astrophysics Data System (ADS)

    Yang, Meng; Tian, Yu; Rao, Changhui

    2018-02-01

    Due to the development of ground-based, large-aperture solar telescopes with adaptive optics (AO) resulting in increasing resolving ability, more accurate sunspot identifications and characterizations are required. In this article, we have developed a set of automated segmentation methods for high-resolution solar photospheric images. Firstly, a local-intensity-clustering level-set method is applied to roughly separate solar granulation and sunspots. Then reinitialization-free level-set evolution is adopted to adjust the boundaries of the photospheric patch; an adaptive intensity threshold is used to discriminate between umbra and penumbra; light bridges are selected according to their regional properties from candidates produced by morphological operations. The proposed method is applied to the solar high-resolution TiO 705.7-nm images taken by the 151-element AO system and Ground-Layer Adaptive Optics prototype system at the 1-m New Vacuum Solar Telescope of the Yunnan Observatory. Experimental results show that the method achieves satisfactory robustness and efficiency with low computational cost on high-resolution images. The method could also be applied to full-disk images, and the calculated sunspot areas correlate well with the data given by the National Oceanic and Atmospheric Administration (NOAA).

  1. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    NASA Astrophysics Data System (ADS)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on reducing benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on the masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy the best. This paper presents current results of applying statistical analysis for automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.
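
    The correlation analysis can be pictured with a short sketch like the one below, which compares two hypothetical automated scoring methods against known biopsy outcomes; the scores and outcomes are invented and the statistics are generic Pearson/Spearman correlations rather than the study's exact analysis.

        import numpy as np
        from scipy.stats import pearsonr, spearmanr

        # Hypothetical biopsy outcomes (1 = malignant, 0 = benign) and automated
        # LOS scores (0-5) produced by two candidate quantification methods.
        biopsy = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0, 1])
        method_a = np.array([1, 2, 4, 1, 5, 4, 2, 5, 0, 3])
        method_b = np.array([2, 3, 3, 2, 4, 3, 3, 4, 2, 3])

        for name, scores in [("method A", method_a), ("method B", method_b)]:
            r, _ = pearsonr(scores, biopsy)
            rho, _ = spearmanr(scores, biopsy)
            print(f"{name}: Pearson r = {r:.2f}, Spearman rho = {rho:.2f}")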

  2. Designing stellarator coils by a modified Newton method using FOCUS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Caoxiang; Hudson, Stuart R.; Song, Yuntao

    To find the optimal coils for stellarators, nonlinear optimization algorithms are applied in existing coil design codes. However, none of these codes have used the information from the second-order derivatives. In this paper, we present a modified Newton method in the recently developed code FOCUS. The Hessian matrix is calculated with analytically derived equations. Its inverse is approximated by a modified Cholesky factorization and applied in the iterative scheme of a classical Newton method. Using this method, FOCUS is able to recover the W7-X modular coils starting from a simple initial guess. Results demonstrate significant advantages.
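
    The sketch below illustrates the general pattern of such a Newton iteration: the Hessian is made positive definite by adding a diagonal shift until a Cholesky factorization succeeds, the factors are used to solve for the step, and a backtracking line search damps it. This is a generic minimal example on a stand-in objective (the Rosenbrock function) with assumed tolerances; it is not the FOCUS implementation or its coil objective.

        import numpy as np

        def modified_newton_minimize(f, grad, hess, x0, iters=100, tau0=1e-6):
            """Damped Newton iteration with a Cholesky-based Hessian modification."""
            x = np.asarray(x0, dtype=float)
            for _ in range(iters):
                g, H = grad(x), hess(x)
                # Add tau*I until the Cholesky factorization succeeds, guaranteeing
                # a positive-definite model and hence a descent direction.
                tau = 0.0
                while True:
                    try:
                        L = np.linalg.cholesky(H + tau * np.eye(len(x)))
                        break
                    except np.linalg.LinAlgError:
                        tau = max(2.0 * tau, tau0)
                p = np.linalg.solve(L.T, np.linalg.solve(L, -g))
                # Simple backtracking (Armijo) line search for robustness.
                t = 1.0
                while f(x + t * p) > f(x) + 1e-4 * t * g.dot(p):
                    t *= 0.5
                x = x + t * p
            return x

        # Stand-in objective: the Rosenbrock function (not the coil objective).
        f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
        grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                                   200*(x[1] - x[0]**2)])
        hess = lambda x: np.array([[2 - 400*(x[1] - 3*x[0]**2), -400*x[0]],
                                   [-400*x[0], 200.0]])

        print(modified_newton_minimize(f, grad, hess, [-1.2, 1.0]))  # converges near [1, 1]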

  3. Designing stellarator coils by a modified Newton method using FOCUS

    NASA Astrophysics Data System (ADS)

    Zhu, Caoxiang; Hudson, Stuart R.; Song, Yuntao; Wan, Yuanxi

    2018-06-01

    To find the optimal coils for stellarators, nonlinear optimization algorithms are applied in existing coil design codes. However, none of these codes have used the information from the second-order derivatives. In this paper, we present a modified Newton method in the recently developed code FOCUS. The Hessian matrix is calculated with analytically derived equations. Its inverse is approximated by a modified Cholesky factorization and applied in the iterative scheme of a classical Newton method. Using this method, FOCUS is able to recover the W7-X modular coils starting from a simple initial guess. Results demonstrate significant advantages.

  4. Designing stellarator coils by a modified Newton method using FOCUS

    DOE PAGES

    Zhu, Caoxiang; Hudson, Stuart R.; Song, Yuntao; ...

    2018-03-22

    To find the optimal coils for stellarators, nonlinear optimization algorithms are applied in existing coil design codes. However, none of these codes have used the information from the second-order derivatives. In this paper, we present a modified Newton method in the recently developed code FOCUS. The Hessian matrix is calculated with analytically derived equations. Its inverse is approximated by a modified Cholesky factorization and applied in the iterative scheme of a classical Newton method. Using this method, FOCUS is able to recover the W7-X modular coils starting from a simple initial guess. Results demonstrate significant advantages.

  5. Applying scrum methods to ITS projects.

    DOT National Transportation Integrated Search

    2017-08-01

    The introduction of new technology generally brings new challenges and new methods to help with deployments. Agile methodologies have been introduced in the information technology industry to potentially speed up development. The Federal Highway Admi...

  6. Self-calibrating models for dynamic monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1994-01-01

    The present goal in qualitative reasoning is to develop methods for automatically building qualitative and semiquantitative models of dynamic systems and to use them for monitoring and fault diagnosis. The qualitative approach to modeling provides a guarantee of coverage while our semiquantitative methods support convergence toward a numerical model as observations are accumulated. We have developed and applied methods for automatic creation of qualitative models, developed two methods for obtaining tractable results on problems that were previously intractable for qualitative simulation, and developed more powerful methods for learning semiquantitative models from observations and deriving semiquantitative predictions from them. With these advances, qualitative reasoning comes significantly closer to realizing its aims as a practical engineering method.

  7. Method Development and Application to Determine Potential Plant Uptake of Antibiotics and Other Drugs in Irrigated Crop Production Systems

    EPA Science Inventory

    Recent studies have shown the detection of pharmaceuticals in surface waters across the United States. The objective of this study was to develop methods, and apply them, to evaluate the potential for food chain transfer when pharmaceutical containing wastewaters are used for cr...

  8. A discussion on the origin of quantum probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holik, Federico, E-mail: olentiev2@gmail.com; Departamento de Matemática - Ciclo Básico Común, Universidad de Buenos Aires - Pabellón III, Ciudad Universitaria, Buenos Aires; Sáenz, Manuel

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox’s method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases.

  9. Feasibility study for automatic reduction of phase change imagery

    NASA Technical Reports Server (NTRS)

    Nossaman, G. O.

    1971-01-01

    The feasibility of automatically reducing a form of pictorial aerodynamic heating data is discussed. The imagery, depicting the melting history of a thin coat of fusible temperature indicator painted on an aerodynamically heated model, was previously reduced by manual methods. Careful examination of various lighting theories and approaches led to an experimentally verified illumination concept capable of yielding high-quality imagery. Both digital and video image processing techniques were applied to reduction of the data, and it was demonstrated that either method can be used to develop superimposed contours. Mathematical techniques were developed to find the model-to-image and the inverse image-to-model transformation using six conjugate points, and methods were developed using these transformations to determine heating rates on the model surface. A video system was designed which is able to reduce the imagery rapidly, economically and accurately. Costs for this system were estimated. A study plan was outlined whereby the mathematical transformation techniques developed to produce model coordinate heating data could be applied to operational software, and methods were discussed and costs estimated for obtaining the digital information necessary for this software.

  10. Development of gas chromatographic methods for the analyses of organic carbonate-based electrolytes

    NASA Astrophysics Data System (ADS)

    Terborg, Lydia; Weber, Sascha; Passerini, Stefano; Winter, Martin; Karst, Uwe; Nowak, Sascha

    2014-01-01

    In this work, novel methods based on gas chromatography (GC) for the investigation of common organic carbonate-based electrolyte systems, which are used in lithium ion batteries, are presented. The methods were developed for flame ionization detection (FID) and mass spectrometric detection (MS). Further, headspace (HS) sampling for the investigation of solid samples like electrodes is reported. Limits of detection are reported for FID. Finally, the developed methods were applied to the electrolyte system of commercially available lithium ion batteries as well as to in-house assembled cells.

  11. SEGMA: An Automatic SEGMentation Approach for Human Brain MRI Using Sliding Window and Random Forests

    PubMed Central

    Serag, Ahmed; Wilkinson, Alastair G.; Telford, Emma J.; Pataky, Rozalia; Sparrow, Sarah A.; Anblagan, Devasuda; Macnaught, Gillian; Semple, Scott I.; Boardman, James P.

    2017-01-01

    Quantitative volumes from brain magnetic resonance imaging (MRI) acquired across the life course may be useful for investigating long term effects of risk and resilience factors for brain development and healthy aging, and for understanding early life determinants of adult brain structure. Therefore, there is an increasing need for automated segmentation tools that can be applied to images acquired at different life stages. We developed an automatic segmentation method for human brain MRI, where a sliding window approach and a multi-class random forest classifier were applied to high-dimensional feature vectors for accurate segmentation. The method performed well on brain MRI data acquired from 179 individuals, analyzed in three age groups: newborns (38–42 weeks gestational age), children and adolescents (4–17 years) and adults (35–71 years). As the method can learn from partially labeled datasets, it can be used to segment large-scale datasets efficiently. It could also be applied to different populations and imaging modalities across the life course. PMID:28163680
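
    A minimal sketch of the sliding-window plus random-forest idea is shown below on a toy two-dimensional image; the window size, features (raw patch intensities) and training-set size are assumptions for illustration and are far simpler than the high-dimensional feature vectors used by SEGMA.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        def window_features(image, half=2):
            """Stack the raw intensities of a (2*half+1)^2 sliding window per pixel."""
            padded = np.pad(image, half, mode='reflect')
            feats = []
            for i in range(image.shape[0]):
                for j in range(image.shape[1]):
                    patch = padded[i:i + 2*half + 1, j:j + 2*half + 1]
                    feats.append(patch.ravel())
            return np.array(feats)

        rng = np.random.default_rng(0)
        # Toy "image": bright square (class 1) on a dark background (class 0).
        image = rng.normal(0.2, 0.05, (32, 32))
        image[8:24, 8:24] += 0.6
        labels = np.zeros((32, 32), dtype=int)
        labels[8:24, 8:24] = 1

        X = window_features(image)
        y = labels.ravel()
        # Train on a partially labeled subset, as the method permits.
        train_idx = rng.choice(X.shape[0], size=300, replace=False)
        clf = RandomForestClassifier(n_estimators=50, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        segmentation = clf.predict(X).reshape(image.shape)
        print((segmentation == labels).mean())   # fraction of correctly labeled pixels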

  12. Development of Boundary Condition Independent Reduced Order Thermal Models using Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Raghupathy, Arun; Ghia, Karman; Ghia, Urmila

    2008-11-01

    Compact Thermal Models (CTMs) representing IC packages have traditionally been developed using the DELPHI-based (DEvelopment of Libraries of PHysical models for an Integrated design) methodology. The drawbacks of this method are presented, and an alternative method is proposed. A reduced-order model that provides the complete thermal information accurately with fewer computational resources can be used effectively in system level simulations. Proper Orthogonal Decomposition (POD), a statistical method, can be used to reduce the number of degrees of freedom or variables in the computations for such a problem. POD along with the Galerkin projection allows us to create reduced-order models that reproduce the characteristics of the system with a considerable reduction in computational resources while maintaining a high level of accuracy. The goal of this work is to show that this method can be applied to obtain a boundary condition independent reduced-order thermal model for complex components. The methodology is applied to the 1D transient heat equation.
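
    The core POD step can be sketched in a few lines: collect snapshots of the field, take a singular value decomposition, keep the leading modes, and represent each snapshot by a handful of modal coefficients. The example below uses synthetic snapshot data and an assumed 99.9% energy criterion, and it omits the Galerkin projection onto the governing equations.

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical snapshot matrix: 500 spatial points x 40 transient snapshots.
        x = np.linspace(0.0, 1.0, 500)[:, None]
        t = np.linspace(0.0, 1.0, 40)[None, :]
        snapshots = (np.exp(-x) * np.sin(5 * t) + 0.3 * x**2 * np.cos(9 * t)
                     + 0.01 * rng.normal(size=(500, 40)))

        # POD modes are the left singular vectors of the snapshot matrix.
        U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(energy, 0.999)) + 1    # modes retaining 99.9% of the energy
        print("retained modes:", r)

        # Reduced-order representation: r coefficients per snapshot instead of 500 DOF.
        coeffs = U[:, :r].T @ snapshots
        reconstruction = U[:, :r] @ coeffs
        print("max reconstruction error:", np.abs(reconstruction - snapshots).max())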

  13. Development of intron length polymorphism markers in genes encoding diketide-CoA synthase and curcumin synthase for discriminating Curcuma species.

    PubMed

    Kita, Tomoko; Komatsu, Katsuko; Zhu, Shu; Iida, Osamu; Sugimura, Koji; Kawahara, Nobuo; Taguchi, Hiromu; Masamura, Noriya; Cai, Shao-Qing

    2016-03-01

    Various Curcuma rhizomes have been used as medicines or spices in Asia since ancient times. It is very difficult to distinguish them morphologically, especially when they are boiled and dried, which causes misidentification leading to a loss of efficacy. We developed a method for discriminating Curcuma species by intron length polymorphism markers in genes encoding diketide-CoA synthase and curcumin synthase. This method can be applied to the identification not only of fresh plants but also of samples of crude drugs or edible spices. By applying this method to Curcuma specimens and samples, and constructing a dendrogram based on these markers, seven Curcuma species were clearly distinguishable. Moreover, Curcuma longa specimens were geographically distinguishable. On the other hand, Curcuma kwangsiensis (gl type) specimens also showed intraspecies polymorphism, which may have occurred as a result of hybridization with other Curcuma species. The molecular method we developed is a potential tool for global classification of the genus Curcuma. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Practical techniques for enhancing the high-frequency MASW method

    USDA-ARS?s Scientific Manuscript database

    For soil exploration in the vadose zone, a high-frequency multi-channel analysis of surface waves (HF-MASW) method has been developed. In the study, several practical techniques were applied to enhance the overtone image of the HF-MASW method. They included (1) the self-adaptive MASW method using a ...

  15. Validation of the ULCEAT methodology by applying it in retrospect to the Roboticbed.

    PubMed

    Nakamura, Mio; Suzurikawa, Jun; Tsukada, Shohei; Kume, Yohei; Kawakami, Hideo; Inoue, Kaoru; Inoue, Takenobu

    2015-01-01

    In answer to the increasing demand for care among the oldest portion of the Japanese population, an extensive programme of life support robots is under development, advocated by the Japanese government. Roboticbed® (RB) was developed to help patients make independent transfers from and to the bed in daily life. The bed is intended both for elderly persons and for persons with a disability. The purpose of this study is to examine the validity of the user and user's life centred clinical evaluation of assistive technology (ULCEAT) methodology. The ULCEAT method was developed to support user centred development of life support robots. By means of the ULCEAT method the target users and the use environment were re-established in an earlier study. The validity of the method is tested by re-evaluating the development of RB in retrospect. Six participants used the first prototype of RB (RB1) and eight participants used the second prototype of RB (RB2). The results indicated that the functionality was improved owing to the end-user evaluations. Therefore, we confirmed the content validity of the proposed ULCEAT method. In this study we confirmed the validity of the ULCEAT methodology by applying it in retrospect to the RB development process. This method will be used for the development of life-support robots and prototype assistive technologies.

  16. Applying Various Methods of Communicating Science for Community Decision-Making and Public Awareness: A NASA DEVELOP National Program Case Study

    NASA Astrophysics Data System (ADS)

    Miller, T. N.; Brumbaugh, E. J.; Barker, M.; Ly, V.; Schick, R.; Rogers, L.

    2015-12-01

    The NASA DEVELOP National Program conducts over eighty Earth science projects every year. Each project applies NASA Earth observations to impact decision-making related to a local or regional community concern. Small, interdisciplinary teams create a methodology to address the specific issue, and then pass on the results to partner organizations, as well as providing them with instruction to continue using remote sensing for future decisions. Many different methods are used by individual teams, and the program as a whole, to communicate results and research accomplishments to decision-makers, stakeholders, alumni, and the general public. These methods vary in scope from formal publications to more informal venues, such as social media. This presentation will highlight the communication techniques used by the DEVELOP program. Audiences, strategies, and outlets will be discussed, including a newsletter, microjournal, video contest, and several others.

  17. Assessment of higher order structure comparability in therapeutic proteins using nuclear magnetic resonance spectroscopy.

    PubMed

    Amezcua, Carlos A; Szabo, Christina M

    2013-06-01

    In this work, we applied nuclear magnetic resonance (NMR) spectroscopy to rapidly assess higher order structure (HOS) comparability in protein samples. Using a variation of the NMR fingerprinting approach described by Panjwani et al. [2010. J Pharm Sci 99(8):3334-3342], three nonglycosylated proteins spanning a molecular weight range of 6.5-67 kDa were analyzed. A simple statistical method termed easy comparability of HOS by NMR (ECHOS-NMR) was developed. In this method, HOS similarity between two samples is measured via the correlation coefficient derived from linear regression analysis of binned NMR spectra. Applications of this method include HOS comparability assessment during new product development, manufacturing process changes, supplier changes, next-generation products, and the development of biosimilars to name just a few. We foresee ECHOS-NMR becoming a routine technique applied to comparability exercises used to complement data from other analytical techniques. Copyright © 2013 Wiley Periodicals, Inc.
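
    A minimal sketch of the binned-spectrum correlation idea behind ECHOS-NMR is given below: two synthetic 1-D spectra are integrated into bins, and the correlation coefficient is taken from a linear regression of one binned spectrum against the other. The spectra, bin count and noise level are assumptions for illustration and do not reproduce the published procedure's parameters.

        import numpy as np
        from scipy.stats import linregress

        def bin_spectrum(intensities, n_bins=100):
            """Integrate a 1-D spectrum into equally sized bins."""
            edges = np.linspace(0, len(intensities), n_bins + 1, dtype=int)
            return np.array([intensities[a:b].sum() for a, b in zip(edges[:-1], edges[1:])])

        rng = np.random.default_rng(2)
        points = np.arange(4096)
        # Synthetic "reference" spectrum: a sum of Gaussian peaks at random positions.
        reference = np.exp(-((points[:, None] - rng.uniform(0, 4096, 30)) / 8.0) ** 2).sum(axis=1)
        comparator = reference + rng.normal(0, 0.02, reference.shape)   # a "comparable" sample

        fit = linregress(bin_spectrum(reference), bin_spectrum(comparator))
        print(f"R^2 between binned spectra: {fit.rvalue**2:.4f}")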

  18. A review of recent developments in parametric based acoustic emission techniques applied to concrete structures

    NASA Astrophysics Data System (ADS)

    Vidya Sagar, R.; Raghu Prasad, B. K.

    2012-03-01

    This article presents a review of recent developments in parametric based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers including various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parametric based AE techniques of concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published and considerable attention has been given to those publications. Some recent studies such as AE energy analysis and b-value analysis used to assess damage of concrete bridge beams have also been discussed. The formation of fracture process zone and the AE energy released during the fracture process in concrete beam specimens have been summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parametric based AE techniques applied to concrete structures may be helpful to the concerned researchers and engineers to better understand the failure mechanism of concrete and evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.

  19. Prediction of sound fields in acoustical cavities using the boundary element method. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Kipp, C. R.; Bernhard, R. J.

    1985-01-01

    A method was developed to predict sound fields in acoustical cavities. The method is based on the indirect boundary element method. An isoparametric quadratic boundary element is incorporated. Pressure, velocity and/or impedance boundary conditions may be applied to a cavity by using this method. The capability to include acoustic point sources within the cavity is implemented. The method is applied to the prediction of sound fields in spherical and rectangular cavities. All three boundary condition types are verified. Cases with a point source within the cavity domain are also studied. Numerically determined cavity pressure distributions and responses are presented. The numerical results correlate well with available analytical results.

  20. Statistical Model Selection for TID Hardness Assurance

    NASA Technical Reports Server (NTRS)

    Ladbury, R.; Gorelick, J. L.; McClure, S.

    2010-01-01

    Radiation Hardness Assurance (RHA) methodologies against Total Ionizing Dose (TID) degradation impose rigorous statistical treatments for data from a part's Radiation Lot Acceptance Test (RLAT) and/or its historical performance. However, no similar methods exist for using "similarity" data - that is, data for similar parts fabricated in the same process as the part under qualification. This is despite the greater difficulty and potential risk in interpreting similarity data. In this work, we develop methods to disentangle part-to-part, lot-to-lot and part-type-to-part-type variation. The methods we develop apply not just for qualification decisions, but also for quality control and detection of process changes and other "out-of-family" behavior. We begin by discussing the data used in the study and the challenges of developing a statistic providing a meaningful measure of degradation across multiple part types, each with its own performance specifications. We then develop analysis techniques and apply them to the different data sets.
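
    As one way to picture the disentangling of variation sources, the sketch below estimates lot-to-lot and within-lot (part-to-part) variance components from invented, balanced TID degradation data using a one-way random-effects decomposition; it is a generic method-of-moments illustration, not the statistic developed in the paper.

        import numpy as np

        # Hypothetical log-degradation measurements, grouped by wafer lot.
        lots = {
            "lot_A": np.array([1.10, 1.15, 1.08, 1.12]),
            "lot_B": np.array([1.30, 1.27, 1.33, 1.29]),
            "lot_C": np.array([1.05, 1.02, 1.09, 1.07]),
        }

        values = np.concatenate(list(lots.values()))
        lot_means = np.array([v.mean() for v in lots.values()])
        n_per_lot = np.array([len(v) for v in lots.values()])

        # One-way ANOVA sums of squares.
        grand_mean = values.mean()
        ss_between = (n_per_lot * (lot_means - grand_mean) ** 2).sum()
        ss_within = sum(((v - v.mean()) ** 2).sum() for v in lots.values())
        ms_between = ss_between / (len(lots) - 1)
        ms_within = ss_within / (len(values) - len(lots))

        # Method-of-moments estimates of the variance components (balanced design).
        var_within = ms_within                                   # part-to-part within a lot
        var_between = (ms_between - ms_within) / n_per_lot[0]    # lot-to-lot
        print(f"part-to-part variance: {var_within:.5f}, lot-to-lot variance: {var_between:.5f}")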

  1. Symetrica Measurements at PNNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kouzes, Richard T.; Mace, Emily K.; Redding, Rebecca L.

    2009-01-26

    Symetrica is a small company based in Southampton, England, that has developed an algorithm for processing gamma ray spectra obtained from a variety of scintillation detectors. Their analysis method applied to NaI(Tl), BGO, and LaBr spectra results in deconvoluted spectra with the “resolution” improved by about a factor of three to four. This method has also been applied by Symetrica to plastic scintillator with the result that full energy peaks are produced. If this method is valid and operationally viable, it could lead to a significantly improved plastic scintillator based radiation portal monitor system.

  2. Factorization and reduction methods for optimal control of distributed parameter systems

    NASA Technical Reports Server (NTRS)

    Burns, J. A.; Powers, R. K.

    1985-01-01

    A Chandrasekhar-type factorization method is applied to the linear-quadratic optimal control problem for distributed parameter systems. An aeroelastic control problem is used as a model example to demonstrate that if computationally efficient algorithms, such as those of Chandrasekhar-type, are combined with the special structure often available to a particular problem, then an abstract approximation theory developed for distributed parameter control theory becomes a viable method of solution. A numerical scheme based on averaging approximations is applied to hereditary control problems. Numerical examples are given.

  3. Fungicidal seed coatings exert minor effects on arbuscular mycorrhizal fungi and plant nutrient content

    USDA-ARS?s Scientific Manuscript database

    Aims: Determine if contemporary, seed-applied fungicidal formulations inhibit colonization of plant roots by arbuscular mycorrhizal (AM) fungi, plant development, or plant nutrient content during early vegetative stages of several commodity crops. Methods: We evaluated seed-applied commercial fungic...

  4. Identification of informative features for predicting proinflammatory potentials of engine exhausts.

    PubMed

    Wang, Chia-Chi; Lin, Ying-Chi; Lin, Yuan-Chung; Jhang, Syu-Ruei; Tung, Chun-Wei

    2017-08-18

    The immunotoxicity of engine exhausts is of high concern to human health due to the increasing prevalence of immune-related diseases. However, the evaluation of immunotoxicity of engine exhausts is currently based on expensive and time-consuming experiments. It is desirable to develop efficient methods for immunotoxicity assessment. To accelerate the development of safe alternative fuels, this study proposed a computational method for identifying informative features for predicting proinflammatory potentials of engine exhausts. A principal component regression (PCR) algorithm was applied to develop prediction models. The informative features were identified by a sequential backward feature elimination (SBFE) algorithm. A total of 19 informative chemical and biological features were successfully identified by SBFE algorithm. The informative features were utilized to develop a computational method named FS-CBM for predicting proinflammatory potentials of engine exhausts. FS-CBM model achieved a high performance with correlation coefficient values of 0.997 and 0.943 obtained from training and independent test sets, respectively. The FS-CBM model was developed for predicting proinflammatory potentials of engine exhausts with a large improvement on prediction performance compared with our previous CBM model. The proposed method could be further applied to construct models for bioactivities of mixtures.
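
    A minimal sketch of combining principal component regression with sequential backward feature elimination is shown below on synthetic data; the data dimensions, number of retained components, cross-validation scheme and stopping rule are assumptions for illustration and do not correspond to the chemical and biological features of the study.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        # Hypothetical data: 40 exhaust samples x 30 candidate features,
        # of which only the first 5 actually drive the response.
        X = rng.normal(size=(40, 30))
        y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=40)

        def pcr_score(X_sub, y):
            """Cross-validated R^2 of a principal component regression model."""
            model = make_pipeline(PCA(n_components=min(5, X_sub.shape[1])), LinearRegression())
            return cross_val_score(model, X_sub, y, cv=5, scoring="r2").mean()

        # Sequential backward feature elimination: drop the feature whose removal
        # hurts the cross-validated score least, as long as the score does not drop.
        features = list(range(X.shape[1]))
        best = pcr_score(X, y)
        improved = True
        while improved and len(features) > 1:
            improved = False
            scores = [(pcr_score(X[:, [f for f in features if f != j]], y), j) for j in features]
            score, worst = max(scores)
            if score >= best:
                features.remove(worst)
                best, improved = score, True

        print(f"retained {len(features)} informative features, CV R^2 = {best:.3f}")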

  5. Provider payment in community-based health insurance schemes in developing countries: a systematic review.

    PubMed

    Robyn, Paul Jacob; Sauerborn, Rainer; Bärnighausen, Till

    2013-03-01

    Community-based health insurance (CBI) is a common mechanism to generate financial resources for health care in developing countries. We review for the first time provider payment methods used in CBI in developing countries and their impact on CBI performance. We conducted a systematic review of the literature on provider payment methods used by CBI in developing countries published up to January 2010. Information on provider payment was available for a total of 32 CBI schemes in 34 reviewed publications: 17 schemes in South Asia, 10 in sub-Saharan Africa, 4 in East Asia and 1 in Latin America. Various types of provider payment were applied by the CBI schemes: 17 used fee-for-service, 12 used salaries, 9 applied a coverage ceiling, 7 used capitation and 6 applied a co-insurance. The evidence suggests that provider payment impacts CBI performance through provider participation and support for CBI, population enrolment and patient satisfaction with CBI, quantity and quality of services provided and provider and patient retention. Lack of provider participation in designing and choosing a CBI payment method can lead to reduced provider support for the scheme. CBI schemes in developing countries have used a wide range of provider payment methods. The existing evidence suggests that payment methods are a key determinant of CBI performance and sustainability, but the strength of this evidence is limited since it is largely based on observational studies rather than on trials or on quasi-experimental research. According to the evidence, provider payment can affect provider participation, satisfaction and retention in CBI; the quantity and quality of services provided to CBI patients; patient demand of CBI services; and population enrollment, risk pooling and financial sustainability of CBI. CBI schemes should carefully consider how their current payment methods influence their performance, how changes in the methods could improve performance, and how such effects could be assessed with scientific rigour to increase the strength of evidence on this topic.

  6. Simultaneous determination of domperidone and Itopride in pharmaceuticals and human plasma using RP-HPLC/UV detection: Method development, validation and application of the method in in-vivo evaluation of fast dispersible tablets.

    PubMed

    Khan, Amjad; Iqbal, Zafar; Khadra, Ibrahim; Ahmad, Lateef; Khan, Abad; Khan, Muhammad Imran; Ullah, Zia; Ismail

    2016-03-20

    Domperidone and Itopride are prokinetic agents, regulating gastric motility, and are commonly prescribed as antiemetic drugs. In the present study a simple, rapid and sensitive RP-HPLC/UV method was developed for the simultaneous determination of Domperidone and Itopride in pharmaceutical samples and human plasma, using Tenofovir as internal standard. Experimental conditions were optimized and the method was validated according to the standard guidelines. A combination of water (pH 3.0) and acetonitrile (65:35 v/v) was used as the mobile phase, pumped at a flow rate of 1.5 ml/min. The detector wavelength was set at 210 nm and the column oven temperature was 40°C. Unlike conventional liquid-liquid extraction, a simple precipitation technique was applied for drug extraction from human plasma using acetonitrile for deproteination. The method showed adequate separation of both analytes and the best resolution was achieved using a Hypersil BDS C8 column (150 mm × 4.6 mm, 5 μm). The method was linear in the range of 20-600 ng/ml. Recovery of the method was 92.31% and 89.82% for Domperidone and Itopride, respectively. The retention time of both analytes and the internal standard was below 15 min. The lower limit of detection (LLOD) and lower limit of quantification (LLOQ) were 5 and 10 ng/ml for Domperidone and 12 and 15 ng/ml for Itopride, respectively. The developed method was successfully applied for in-vivo analysis of fast dispersible tablets of Domperidone in healthy human volunteers. The proposed method was part of a formulation development study and was efficiently applied for determination of the two drugs in various pharmaceutical products and human plasma. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Non-invasive imaging methods applied to neo- and paleo-ontological cephalopod research

    NASA Astrophysics Data System (ADS)

    Hoffmann, R.; Schultz, J. A.; Schellhorn, R.; Rybacki, E.; Keupp, H.; Gerden, S. R.; Lemanis, R.; Zachow, S.

    2014-05-01

    Several non-invasive methods are common practice in natural sciences today. Here we present how they can be applied and contribute to current topics in cephalopod (paleo-) biology. Different methods will be compared in terms of time necessary to acquire the data, amount of data, accuracy/resolution, minimum/maximum size of objects that can be studied, the degree of post-processing needed and availability. The main application of the methods is seen in morphometry and volumetry of cephalopod shells. In particular we present a method for precise buoyancy calculation. For this, cephalopod shells were scanned together with different reference bodies, an approach developed in the medical sciences. It is necessary to know the volume of the reference bodies, which should have absorption properties similar to those of the object of interest. Exact volumes can be obtained from surface scanning. Depending on the dimensions of the study object, different computed tomography techniques were applied.

  8. The dream interview method in addiction recovery. A treatment guide.

    PubMed

    Flowers, L K; Zweben, J E

    1996-01-01

    The Dream Interview Method is a recently developed tool for dream interpretation that can facilitate work on addiction issues at all stages of recovery. This paper describes the method in detail and discusses examples of its application in a group composed of individuals in varying stages of the recovery process. It permits the therapist to accelerate the development of insight, and once the method is learned, it can be applied in self-help formats.

  9. Adaptive Wiener filter super-resolution of color filter array images.

    PubMed

    Karch, Barry K; Hardie, Russell C

    2013-08-12

    Digital color cameras using a single detector array with a Bayer color filter array (CFA) require interpolation or demosaicing to estimate missing color information and provide full-color images. However, demosaicing does not specifically address fundamental undersampling and aliasing inherent in typical camera designs. Fast non-uniform interpolation based super-resolution (SR) is an attractive approach to reduce or eliminate aliasing and its relatively low computational load is amenable to real-time applications. The adaptive Wiener filter (AWF) SR algorithm was initially developed for grayscale imaging and has not previously been applied to color SR demosaicing. Here, we develop a novel fast SR method for CFA cameras that is based on the AWF SR algorithm and uses global channel-to-channel statistical models. We apply this new method as a stand-alone algorithm and also as an initialization image for a variational SR algorithm. This paper presents the theoretical development of the color AWF SR approach and applies it in performance comparisons to other SR techniques for both simulated and real data.

  10. Development and application of LC–APCI–MS method for biomonitoring of animal and human exposure to imidacloprid.

    PubMed

    Kavvalakis, Matthaios P; Tzatzarakis, Manolis N; Theodoropoulou, Eleftheria P; Barbounis, Emmanouil G; Tsakalof, Andreas K; Tsatsakis, Aristidis M

    2013-11-01

    Imidacloprid (IMI) is a relatively new neuro-active neonicotinoid insecticide and nowadays one of the largest selling insecticides worldwide. In the present study a LC–APCI–MS based method was developed and validated for the quantification of imidacloprid and its main metabolite 6-chloronicotinic acid (6-ClNA) in urine and hair specimens. The method was tested in biomonitoring of intentionally exposed animals and subsequently applied for biomonitoring of the Cretan urban and rural population. The developed analytical method comprises two main steps: isolation of the analytes from the specimen (solid–liquid extraction with methanol for hair, liquid–liquid extraction with methanol for urine) and subsequent instrumental analysis by LC–APCI–MS. The developed method was applied for the monitoring of IMI and 6-ClNA in hair and urine of laboratory animals (rabbits) intentionally fed with the insecticide at low or high doses (40 and 80 mg kg⁻¹ body weight d⁻¹, respectively) for 24 weeks. The analytes were detected in the regularly acquired hair and urine specimens and the levels found were proportional to the feeding dose and time of exposure, with the exception of a slight decline of IMI levels in high-dose fed rabbits after 24 weeks of feeding. This decline can be explained by the induction of IMI metabolizing enzymes by the substrate. After testing on animal models the method was applied for pilot biomonitoring of the Cretan urban (n = 26) and rural (n = 32) populations. The rural but not the urban population was exposed to IMI, with 21 positive samples (65.6%) and a median concentration of 0.03 ng mg⁻¹. The maximum concentration detected was 27 ng mg⁻¹.

  11. Edge enhancement algorithm for low-dose X-ray fluoroscopic imaging.

    PubMed

    Lee, Min Seok; Park, Chul Hee; Kang, Moon Gi

    2017-12-01

    Low-dose X-ray fluoroscopy has continually evolved to reduce radiation risk to patients during clinical diagnosis and surgery. However, the reduction in dose exposure causes quality degradation of the acquired images. In general, an X-ray device has a time-average pre-processor to remove the generated quantum noise. However, this pre-processor causes blurring and artifacts within the moving edge regions, and noise remains in the image. During high-pass filtering (HPF) to enhance edge detail, this noise in the image is amplified. In this study, a 2D edge enhancement algorithm comprising region adaptive HPF with the transient improvement (TI) method, as well as artifacts and noise reduction (ANR), was developed for degraded X-ray fluoroscopic images. The proposed method was applied in a static scene pre-processed by a low-dose X-ray fluoroscopy device. First, the sharpness of the X-ray image was improved using region adaptive HPF with the TI method, which facilitates sharpening of edge details without overshoot problems. Then, an ANR filter that uses an edge directional kernel was developed to remove the artifacts and noise that can occur during sharpening, while preserving edge details. The quantitative and qualitative results obtained by applying the developed method to low-dose X-ray fluoroscopic images and visually and numerically comparing the final images with images improved using conventional edge enhancement techniques indicate that the proposed method outperforms existing edge enhancement methods in terms of objective criteria and subjective visual perception of the actual X-ray fluoroscopic image. The developed edge enhancement algorithm performed well when applied to actual low-dose X-ray fluoroscopic images, not only by improving the sharpness, but also by removing artifacts and noise, including overshoot. Copyright © 2017 Elsevier B.V. All rights reserved.
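
    The basic sharpening idea of adding back a limited high-pass component can be sketched as below; this generic unsharp-mask example with a clipped detail signal only hints at overshoot control and is not the paper's region-adaptive HPF, TI or ANR processing.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def sharpen(image, sigma=1.5, gain=1.2, overshoot_limit=0.1):
            """Unsharp masking with a clipped high-pass component to curb overshoot."""
            low_pass = gaussian_filter(image, sigma)
            high_pass = image - low_pass
            # Clip the detail signal so edges are not over-amplified (crude stand-in
            # for transient improvement).
            high_pass = np.clip(high_pass, -overshoot_limit, overshoot_limit)
            return np.clip(image + gain * high_pass, 0.0, 1.0)

        rng = np.random.default_rng(3)
        # Toy fluoroscopy-like frame: a soft edge plus quantum-like noise.
        frame = np.zeros((64, 64))
        frame[:, 32:] = 1.0
        frame = gaussian_filter(frame, 2.0) + rng.normal(0, 0.02, frame.shape)
        enhanced = sharpen(frame)
        print(enhanced.shape, float(enhanced.min()), float(enhanced.max()))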

  12. UNDERSTANDING AND APPLYING ENVIRONMENTAL RELATIVE MOLDINESS INDEX - ERMI

    EPA Science Inventory

    This study compared two binary classification methods to evaluate the mold condition in 271 homes of infants, 144 of which later developed symptoms of respiratory illness. A method using on-site visual mold inspection was compared to another method using a quantitative index of ...

  13. 78 FR 23961 - Request for Steering Committee Nominations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-23

    ... development of a methods research agenda and coordination of methods research in support of using electronic... surveillance, and methods research and application for scientific professionals. 3. IMEDS-Evaluation: Applies... transparent way to create exciting new research projects to advance regulatory science. The Foundation acts as...

  14. High-Resolution Wind Measurements for Offshore Wind Energy Development

    NASA Technical Reports Server (NTRS)

    Nghiem, Son V.; Neumann, Gregory

    2011-01-01

    A mathematical transform, called the Rosette Transform, together with a new method, called the Dense Sampling Method, has been developed. The Rosette Transform applies to both the mean part and the fluctuating part of a targeted radar signature, using the Dense Sampling Method to construct the data on a high-resolution grid at 1-km posting for wind measurements over water surfaces such as oceans or lakes.

  15. Mass spectrometry applied to the identification of Mycobacterium tuberculosis and biomarker discovery.

    PubMed

    López-Hernández, Y; Patiño-Rodríguez, O; García-Orta, S T; Pinos-Rodríguez, J M

    2016-12-01

    An adequate and effective tuberculosis (TB) diagnosis system has been identified by the World Health Organization as a priority in the fight against this disease. Over the years, several methods have been developed to identify the bacillus, but bacterial culture remains one of the most affordable methods for most countries. For rapid and accurate identification, however, it is more feasible to implement molecular techniques, taking advantage of the availability of public databases containing protein sequences. Mass spectrometry (MS) has become an interesting technique for the identification of TB. Here, we review some of the most widely employed methods for identifying Mycobacterium tuberculosis and present an update on MS applied for the identification of mycobacterial species. © 2016 The Society for Applied Microbiology.

  16. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    NASA Astrophysics Data System (ADS)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail, in the hope of providing experience for other civil jet product designs.

  17. Second-order small disturbance theory for hypersonic flow over power-law bodies. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Townsend, J. C.

    1974-01-01

    A mathematical method for determining the flow field about power-law bodies in hypersonic flow conditions is developed. The second-order solutions, which reflect the effects of the second-order terms in the equations, are obtained by applying the method of small perturbations in terms of body slenderness parameter to the zeroth-order solutions. The method is applied by writing each flow variable as the sum of a zeroth-order and a perturbation function, each multiplied by the axial variable raised to a power. The similarity solutions are developed for infinite Mach number. All results obtained are for no flow through the body surface (as a boundary condition), but the derivation indicates that small amounts of blowing or suction through the wall can be accommodated.

  18. Physical interpretation and development of ultrasonic nondestructive evaluation techniques applied to the quantitative characterization of textile composite materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1993-01-01

    In this Progress Report, we describe our current research activities concerning the development and implementation of advanced ultrasonic nondestructive evaluation methods applied to the characterization of stitched composite materials and bonded aluminum plate specimens. One purpose of this investigation is to identify and characterize specific features of polar backscatter interrogation which enhance the ability of ultrasound to detect flaws in a stitched composite laminate. Another focus is to explore the feasibility of implementing medical linear array imaging technology as a viable ultrasonic-based nondestructive evaluation method to inspect and characterize bonded aluminum lap joints. As an approach to implementing quantitative ultrasonic inspection methods to both of these materials, we focus on the physics that underlies the detection of flaws in such materials.

  19. A Phase-Only technique for enhancing the high-frequency MASW method

    USDA-ARS?s Scientific Manuscript database

    For soil exploration in the vadose zone, a high-frequency multi-channel analysis of surface waves (HF-MASW) method has been developed. In the study, several practical techniques were applied to enhance the overtone image of the HF-MASW method. They included (1) the self-adaptive MASW method using a ...

  20. Capillary zone electrophoresis method for a highly glycosylated and sialylated recombinant protein: development, characterization and application for process development.

    PubMed

    Zhang, Le; Lawson, Ken; Yeung, Bernice; Wypych, Jette

    2015-01-06

    A purity method based on capillary zone electrophoresis (CZE) has been developed for the separation of isoforms of a highly glycosylated protein. The separation was found to be driven by the number of sialic acids attached to each isoform. The method has been characterized using orthogonal assays and shown to have excellent specificity, precision and accuracy. We have demonstrated the CZE method is a useful in-process assay to support cell culture and purification development of this glycoprotein. Compared to isoelectric focusing (IEF), the CZE method provides more quantitative results and higher sample throughput with excellent accuracy, qualities that are required for process development. In addition, the CZE method has been applied in the stability testing of purified glycoprotein samples.

  1. Development of the Ion Exchange-Gravimetric Method for Sodium in Serum as a Definitive Method

    PubMed Central

    Moody, John R.; Vetter, Thomas W.

    1996-01-01

    An ion exchange-gravimetric method, previously developed as a National Committee for Clinical Laboratory Standards (NCCLS) reference method for the determination of sodium in human serum, has been re-evaluated and improved. Sources of analytical error in this method have been examined more critically and the overall uncertainties decreased. Additionally, greater accuracy and repeatability have been achieved by the application of this definitive method to a sodium chloride reference material. In this method sodium in serum is ion-exchanged, selectively eluted and converted to a weighable precipitate as Na2SO4. Traces of sodium eluting before or after the main fraction, and precipitate contaminants are determined instrumentally. Co-precipitating contaminants contribute less than 0.1 % while the analyte lost to other eluted ion-exchange fractions contributes less than 0.02 % to the total precipitate mass. With improvements, the relative expanded uncertainty (k = 2) of the method, as applied to serum, is 0.3 % to 0.4 % and is less than 0.1 % when applied to a sodium chloride reference material. PMID:27805122

  2. Development of spectrofluorimetric method for determination of certain aminoglycoside drugs in dosage forms and human plasma through condensation with ninhydrin and phenyl acetaldehyde.

    PubMed

    Omar, Mahmoud A; Hammad, Mohamed A; Nagy, Dalia M; Aly, Alshymaa A

    2015-02-05

    A simple and sensitive spectrofluorimetric method has been developed and validated for determination of amikacin sulfate, neomycin sulfate and tobramycin in pure forms, pharmaceutical formulations and human plasma. The method was based on the condensation reaction of the cited drugs with ninhydrin and phenylacetaldehyde in a buffered medium (pH 6), resulting in the formation of fluorescent products that exhibit excitation and emission maxima at 395 and 470 nm, respectively. The different experimental parameters affecting the development and stability of the reaction products were carefully studied and optimized. The calibration plots were constructed with good correlation coefficients (0.9993 for tobramycin and 0.9996 for both neomycin and amikacin). The proposed method was successfully applied to the analysis of the cited drugs in dosage forms with high accuracy (98.33-101.7)±(0.80-1.26)%. The results show excellent agreement with the reference method, indicating no significant difference in accuracy and precision. Due to its high sensitivity, the proposed method was also applied successfully to the determination of amikacin in real human plasma. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Rapid-estimation method for assessing scour at highway bridges

    USGS Publications Warehouse

    Holnbeck, Stephen R.

    1998-01-01

    A method was developed by the U.S. Geological Survey for rapid estimation of scour at highway bridges using limited site data and analytical procedures to estimate pier, abutment, and contraction scour depths. The basis for the method was a procedure recommended by the Federal Highway Administration for conducting detailed scour investigations, commonly referred to as the Level 2 method. Using pier, abutment, and contraction scour results obtained from Level 2 investigations at 122 sites in 10 States, envelope curves and graphical relations were developed that enable determination of scour-depth estimates at most bridge sites in a matter of a few hours. Rather than using complex hydraulic variables, surrogate variables more easily obtained in the field were related to calculated scour-depth data from Level 2 studies. The method was tested by having several experienced individuals apply the method in the field, and results were compared among the individuals and with previous detailed analyses performed for the sites. Results indicated that the variability in predicted scour depth among individuals applying the method generally was within an acceptable range, and that conservatively greater scour depths generally were obtained by the rapid-estimation method compared to the Level 2 method. The rapid-estimation method is considered most applicable for conducting limited-detail scour assessments and as a screening tool to determine those bridge sites that may require more detailed analysis. The method is designed to be applied only by a qualified professional possessing knowledge and experience in the fields of bridge scour, hydraulics, and flood hydrology, and having specific expertise with the Level 2 method.

  4. Dynamic Centralized and Decentralized Control Systems

    DOT National Transportation Integrated Search

    1977-09-01

    This report develops a systematic method for designing suboptimal decentralized control systems. The method is then applied to the design of a decentralized controller for a freeway-corridor system. A freeway corridor is considered to be a system of ...

  5. Diagnosis-Prescription in the Context of Instructional Management

    ERIC Educational Resources Information Center

    Besel, Ronald

    1973-01-01

    The author argues that individual assessment of the student's learning style should precede the decision of which teaching method is appropriate. He applies the medical terminology of diagnosis-prescription to this method of instructional development for management. (HB)

  6. A Method for Cognitive Task Analysis

    DTIC Science & Technology

    1992-07-01

    A method for cognitive task analysis is described based on the notion of 'generic tasks'. The method distinguishes three layers of analysis. At the...model for applied areas such as the development of knowledge-based systems and training, are discussed. Problem solving, Cognitive Task Analysis, Knowledge, Strategies.

  7. Improving Method-in-Use through Classroom Observation

    ERIC Educational Resources Information Center

    Nunn, Roger

    2011-01-01

    Method-in-use (Nunn, Describing classroom interaction in intercultural curricular research and development, University of Reading, 1996, International Review of Applied Linguistics in Language Teaching 37: 23-42, 1999) is a description of the method actually being enacted through classroom interaction in a particular context. The description is…

  8. Forest Herbicide Washoff From Foliar Applications

    Treesearch

    J.L. Michael; Kevin L. Talley; H.C. Fishburn

    1992-01-01

    Field and laboratory experiments were conducted to develop and test methods for determining washoff of foliar-applied herbicides typically used in forestry in the South. Preliminary results show good agreement between results of laboratory methods used and observations from field experiments on actual precipitation events. Methods included application of...

  9. An integrated bioanalytical method development and validation approach: case studies.

    PubMed

    Xue, Y-J; Melo, Brian; Vallejo, Martha; Zhao, Yuwen; Tang, Lina; Chen, Yuan-Shek; Keller, Karin M

    2012-10-01

    We proposed an integrated bioanalytical method development and validation approach: (1) method screening based on the analyte's physicochemical properties and metabolism information to determine the most appropriate extraction/analysis conditions; (2) preliminary stability evaluation using both quality control and incurred samples to establish sample collection, storage and processing conditions; (3) mock validation to examine method accuracy and precision and incurred sample reproducibility; and (4) method validation to confirm the results obtained during method development. This integrated approach was applied to the determination of compound I in rat plasma and compound II in rat and dog plasma. The effectiveness of the approach was demonstrated by the superior quality of three method validations: (1) a zero run failure rate; (2) >93% of quality control results within 10% of nominal values; and (3) 99% of incurred samples within 9.2% of the original values. In addition, rat and dog plasma methods for compound II were successfully applied to analyze more than 900 plasma samples obtained from Investigational New Drug (IND) toxicology studies in rats and dogs with near-perfect results: (1) a zero run failure rate; (2) excellent accuracy and precision for standards and quality controls; and (3) 98% of incurred samples within 15% of the original values. Copyright © 2011 John Wiley & Sons, Ltd.

  10. Multiplexed LC-MS/MS analysis of horse plasma proteins to study doping in sport.

    PubMed

    Barton, Chris; Beck, Paul; Kay, Richard; Teale, Phil; Roberts, Jane

    2009-06-01

    The development of protein biomarkers for the indirect detection of doping in horse is a potential solution to doping threats such as gene and protein doping. A method for biomarker candidate discovery in horse plasma is presented using targeted analysis of proteotypic peptides from horse proteins. These peptides were first identified in a novel list of the abundant proteins in horse plasma. To monitor these peptides, an LC-MS/MS method using multiple reaction monitoring was developed to study the quantity of 49 proteins in horse plasma in a single run. The method was optimised and validated, and then applied to a population of race-horses to study protein variance within a population. The method was finally applied to longitudinal time courses of horse plasma collected after administration of an anabolic steroid to demonstrate utility for hypothesis-driven discovery of doping biomarker candidates.

  11. Advanced spectrophotometric chemometric methods for resolving the binary mixture of doxylamine succinate and pyridoxine hydrochloride.

    PubMed

    Katsarov, Plamen; Gergov, Georgi; Alin, Aylin; Pilicheva, Bissera; Al-Degs, Yahya; Simeonov, Vasil; Kassarova, Margarita

    2018-03-01

    The prediction power of partial least squares (PLS) and multivariate curve resolution-alternating least squares (MCR-ALS) methods has been studied for the simultaneous quantitative analysis of the binary drug combination doxylamine succinate and pyridoxine hydrochloride. Analysis of first-order overlapped UV spectra was performed using different PLS models: classical PLS1 and PLS2 as well as partial robust M-regression (PRM). These linear models were compared to MCR-ALS with equality and correlation constraints (MCR-ALS-CC). All techniques operated within the full spectral region and extracted maximum information for the drugs analysed. The developed chemometric methods were validated on external sample sets and were applied to the analyses of pharmaceutical formulations. The obtained statistical parameters were satisfactory for the calibration and validation sets. All developed methods can be successfully applied for simultaneous spectrophotometric determination of doxylamine and pyridoxine both in laboratory-prepared mixtures and in commercial dosage forms.
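    The core of the workflow above is multivariate calibration of overlapped spectra against known concentrations. A minimal PLS calibration sketch in scikit-learn is shown below for orientation only; the data shapes, component count, and validation split are illustrative assumptions, and the study's PRM and MCR-ALS-CC variants are not reproduced.

    ```python
    # Minimal PLS calibration sketch for a two-component mixture from overlapped UV spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    X = rng.random((25, 200))          # placeholder for measured absorbance spectra (samples x wavelengths)
    Y = rng.random((25, 2))            # placeholder for designed concentrations of the two drugs

    pls = PLSRegression(n_components=3)           # number of latent variables chosen by cross-validation
    Y_cv = cross_val_predict(pls, X, Y, cv=5)     # cross-validated predictions for RMSECV
    rmsecv = np.sqrt(((Y - Y_cv) ** 2).mean(axis=0))
    pls.fit(X, Y)
    unknown = rng.random((1, 200))                # spectrum of a dosage-form extract (placeholder)
    print(pls.predict(unknown), rmsecv)
    ```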

  12. A Generalized Fast Frequency Sweep Algorithm for Coupled Circuit-EM Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rockway, J D; Champagne, N J; Sharpe, R M

    2004-01-14

    Frequency domain techniques are popular for analyzing electromagnetics (EM) and coupled circuit-EM problems. These techniques, such as the method of moments (MoM) and the finite element method (FEM), are used to determine the response of the EM portion of the problem at a single frequency. Since only one frequency is solved at a time, it may take a long time to calculate the parameters for wideband devices. In this paper, a fast frequency sweep based on the Asymptotic Wave Expansion (AWE) method is developed and applied to generalized mixed circuit-EM problems. The AWE method, which was originally developed for lumped-load circuit simulations, has recently been shown to be effective at quasi-static and low frequency full-wave simulations. Here it is applied to a full-wave MoM solver, capable of solving for metals, dielectrics, and coupled circuit-EM problems.
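    For orientation, the heart of an AWE-style sweep is a moment recursion about an expansion frequency, so that many nearby frequencies can be evaluated from a single factorization. The sketch below shows that recursion for a generic linear system (G + s·C)x = b; the MoM/FEM assembly and the Padé conversion step are omitted, and the matrices are small placeholders.

    ```python
    # Sketch of the moment recursion behind a fast frequency sweep for (G + s*C) x = b,
    # expanded about s0. A full AWE implementation would convert the moments to a Pade
    # approximant; here the truncated power series is used directly.
    import numpy as np

    def awe_moments(G, C, b, s0, n_moments):
        A_inv = np.linalg.inv(G + s0 * C)     # in practice: one LU factorization, reused
        moments = [A_inv @ b]
        for _ in range(1, n_moments):
            moments.append(-A_inv @ (C @ moments[-1]))
        return moments                        # x(s) ~ sum_k m_k * (s - s0)**k

    def sweep(G, C, b, s0, freqs, n_moments=6):
        m = awe_moments(G, C, b, s0, n_moments)
        return [sum(mk * (s - s0) ** k for k, mk in enumerate(m)) for s in freqs]
    ```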

  13. Study of swelling behavior in ArF resist during development by the QCM method (3): observations of swelling layer elastic modulus

    NASA Astrophysics Data System (ADS)

    Sekiguchi, Atsushi

    2013-03-01

    The QCM method allows measurements of impedance, an index of swelling layer viscosity in a photoresist during development. While impedance is sometimes used as a qualitative index of change in the viscosity of the swelling layer, it has to date not been used quantitatively for data analysis. We explored a method for converting impedance values to elastic modulus (Pa), a coefficient expressing viscosity. Applying this method, we compared changes in the viscosity of the swelling layer in an ArF resist generated during development in a TMAH developing solution and in a TBAH developing solution. This paper reports the results of this comparative study.

  14. Applying integrals of motion to the numerical solution of differential equations

    NASA Technical Reports Server (NTRS)

    Jezewski, D. J.

    1980-01-01

    A method is developed for using the integrals of systems of nonlinear, ordinary differential equations in a numerical integration process to control the local errors in these integrals and reduce the global errors of the solution. The method is general and can be applied to either scalar or vector integrals. A number of example problems, with accompanying numerical results, are used to verify the analysis and support the conjecture of global error reduction.
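    One simple realization of this idea is to use a known integral of motion to correct the numerical state after each step. The sketch below projects an explicit Euler solution of a unit harmonic oscillator back onto its energy surface; it illustrates the general concept only and is not the report's vector-integral formulation.

    ```python
    # After each explicit step, rescale the state so a known integral of motion
    # (here the energy of a unit harmonic oscillator) is restored.
    import numpy as np

    def rhs(y):                       # y = [x, v], with x'' = -x
        return np.array([y[1], -y[0]])

    def step_euler(y, h):
        return y + h * rhs(y)

    def project_energy(y, E0):
        E = 0.5 * (y[0] ** 2 + y[1] ** 2)
        return y * np.sqrt(E0 / E)    # scale the state back onto the energy surface

    y, E0, h = np.array([1.0, 0.0]), 0.5, 0.01
    for _ in range(1000):
        y = project_energy(step_euler(y, h), E0)
    print(y, 0.5 * (y[0] ** 2 + y[1] ** 2))   # energy stays at E0; phase error remains
    ```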

  15. Applying integrals of motion to the numerical solution of differential equations

    NASA Technical Reports Server (NTRS)

    Jezewski, D. J.

    1979-01-01

    A method is developed for using the integrals of systems of nonlinear, ordinary differential equations in a numerical integration process to control the local errors in these integrals and reduce the global errors of the solution. The method is general and can be applied to either scalar or vector integrals. A number of example problems, with accompanying numerical results, are used to verify the analysis and support the conjecture of global error reduction.

  16. A comparison of high-frequency cross-correlation measures

    NASA Astrophysics Data System (ADS)

    Precup, Ovidiu V.; Iori, Giulia

    2004-12-01

    On a high-frequency scale the time series are not homogeneous; therefore, standard correlation measures cannot be applied directly to the raw data. There are two ways to deal with this problem. The time series can be homogenised through an interpolation method (An Introduction to High-Frequency Finance, Academic Press, NY, 2001) (linear or previous tick) and then the Pearson correlation statistic computed. Recently, methods that can handle raw non-synchronous time series have been developed (Int. J. Theor. Appl. Finance 6(1) (2003) 87; J. Empirical Finance 4 (1997) 259). This paper compares two traditional methods that use interpolation with an alternative method applied directly to the actual time series.
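    The interpolation-based measures mentioned above amount to synchronising the raw ticks onto a common grid and then computing a standard Pearson correlation. A minimal previous-tick sketch follows; the timestamps, prices, and grid spacing are illustrative only.

    ```python
    # "Previous tick" homogenisation followed by a Pearson correlation of log returns.
    import numpy as np

    def previous_tick(times, values, grid):
        idx = np.searchsorted(times, grid, side="right") - 1   # last tick at or before each grid time
        idx = np.clip(idx, 0, len(values) - 1)
        return values[idx]

    t1 = np.array([0.0, 1.3, 2.1, 4.8]); p1 = np.array([100.0, 100.2, 99.9, 100.4])
    t2 = np.array([0.5, 2.2, 3.9, 4.5]); p2 = np.array([50.0, 50.1, 49.8, 50.2])
    grid = np.arange(0.5, 5.0, 0.5)                            # homogeneous sampling grid
    r1 = np.diff(np.log(previous_tick(t1, p1, grid)))
    r2 = np.diff(np.log(previous_tick(t2, p2, grid)))
    print(np.corrcoef(r1, r2)[0, 1])                           # Pearson correlation of returns
    ```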

  17. Analysis of pressure distortion testing

    NASA Technical Reports Server (NTRS)

    Koch, K. E.; Rees, R. L.

    1976-01-01

    The development of a distortion methodology, method D, was documented, and its application to steady state and unsteady data was demonstrated. Three methodologies based upon DIDENT, a NASA-LeRC distortion methodology based upon the parallel compressor model, were investigated by applying them to a set of steady state data. The best formulation was then applied to an independent data set. The good correlation achieved with this data set showed that method E, one of the above methodologies, is a viable concept. Unsteady data were analyzed by using the method E methodology. This analysis pointed out that the method E sensitivities are functions of pressure defect level as well as corrected speed and pattern.

  18. Quantification of arrestin-rhodopsin binding stoichiometry.

    PubMed

    Lally, Ciara C M; Sommer, Martha E

    2015-01-01

    We have developed several methods to quantify arrestin-1 binding to rhodopsin in the native rod disk membrane. These methods can be applied to study arrestin interactions with all functional forms of rhodopsin, including dark-state rhodopsin, light-activated metarhodopsin II (Meta II), and the products of Meta II decay, opsin and all-trans-retinal. When used in parallel, these methods report both the actual amount of arrestin bound to the membrane surface and the functional aspects of arrestin binding, such as which arrestin loops are engaged and whether Meta II is stabilized. Most of these methods can also be applied to recombinant receptor reconstituted into liposomes, bicelles, and nanodisks.

  19. A GPU-accelerated implicit meshless method for compressible flows

    NASA Astrophysics Data System (ADS)

    Zhang, Jia-Le; Ma, Zhi-Hua; Chen, Hong-Quan; Cao, Cheng

    2018-05-01

    This paper develops a recently proposed GPU-based two-dimensional explicit meshless method (Ma et al., 2014) by devising and implementing an efficient parallel LU-SGS implicit algorithm to further improve the computational efficiency. The capability of the original 2D meshless code is extended to deal with complex 3D compressible flow problems. To resolve the inherent data dependency of the standard LU-SGS method, which causes race conditions that destabilize the numerical computation, a generic rainbow coloring method is presented and applied to organize the computational points into different groups by painting neighboring points with different colors. The original LU-SGS method is modified and parallelized accordingly to perform calculations in a color-by-color manner. The CUDA Fortran programming model is employed to develop the key kernel functions to apply boundary conditions, calculate time steps, evaluate residuals, and advance and update the solution in the temporal space. A series of two- and three-dimensional test cases, including compressible flows over single- and multi-element airfoils and an M6 wing, are carried out to verify the developed code. The obtained solutions agree well with experimental data and other computational results reported in the literature. Detailed analysis of the performance of the developed code reveals that the CPU-based implicit meshless method is at least four to eight times faster than its explicit counterpart. The computational efficiency of the implicit method could be further improved by ten to fifteen times on the GPU.
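    The coloring step described above assigns neighboring points different colors so that all points of one color can be swept in parallel without data races. A greedy coloring sketch on a toy point cloud is shown below; the connectivity is an assumed example, not the paper's point distribution.

    ```python
    # Greedy "rainbow" coloring: neighbouring points get different colors, so each color
    # group can be processed fully in parallel during the modified LU-SGS sweeps.
    def color_points(neighbors):
        """neighbors: dict mapping point id -> iterable of adjacent point ids."""
        color = {}
        for p in sorted(neighbors):                        # deterministic visiting order
            used = {color[q] for q in neighbors[p] if q in color}
            c = 0
            while c in used:                               # smallest color not used by neighbours
                c += 1
            color[p] = c
        return color

    nbrs = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}    # toy 2x2 structured neighbourhood
    print(color_points(nbrs))   # e.g. {0: 0, 1: 1, 2: 1, 3: 0} -> two sweeps, each fully parallel
    ```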

  20. A predictive estimation method for carbon dioxide transport by data-driven modeling with a physically-based data model

    NASA Astrophysics Data System (ADS)

    Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun

    2017-11-01

    In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., Ogata-Banks solution) is found to be most representative for the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations with the reference estimation by the Ogata-Banks solution, where a part of earlier data is used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems.
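    The physically-based data model referenced above is the Ogata-Banks solution of one-dimensional advection-dispersion under steady flow. A short sketch of that closed-form solution is given below; the parameter values are placeholders rather than fitted values from the EIT site.

    ```python
    # Ogata-Banks solution for a continuous source in 1-D steady flow:
    # C(x,t)/C0 = 0.5*[erfc((x - v t)/(2 sqrt(D t))) + exp(v x / D) erfc((x + v t)/(2 sqrt(D t)))]
    import numpy as np
    from scipy.special import erfc

    def ogata_banks(x, t, c0, v, D):
        a = erfc((x - v * t) / (2.0 * np.sqrt(D * t)))
        b = np.exp(v * x / D) * erfc((x + v * t) / (2.0 * np.sqrt(D * t)))
        return 0.5 * c0 * (a + b)

    t = np.linspace(1.0, 200.0, 100)                 # elapsed time (placeholder units)
    c = ogata_banks(x=5.0, t=t, c0=1.0, v=0.1, D=0.05)
    # c(t) would be fitted to observed CO2 concentrations to form the data model
    ```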

  1. Application of an innovative design space optimization strategy to the development of liquid chromatographic methods to combat potentially counterfeit nonsteroidal anti-inflammatory drugs.

    PubMed

    Mbinze, J K; Lebrun, P; Debrus, B; Dispas, A; Kalenda, N; Mavar Tayey Mbay, J; Schofield, T; Boulanger, B; Rozet, E; Hubert, Ph; Marini, R D

    2012-11-09

    In the context of the battle against counterfeit medicines, an innovative methodology has been used to develop rapid and specific high performance liquid chromatographic methods to detect and determine 18 non-steroidal anti-inflammatory drugs, 5 pharmaceutical preservatives, paracetamol, chlorzoxazone, caffeine and salicylic acid. These molecules are commonly encountered alone or in combination on the market. Regrettably, a significant proportion of these consumed medicines are counterfeit or substandard, with a strong negative impact in countries of Central Africa. In this context, an innovative design space optimization strategy was successfully applied to the development of LC screening methods allowing the detection of substandard or counterfeit medicines. Using the results of a single experimental design, the design spaces (DS) of 5 potentially relevant HPLC methods were developed and transferred to an ultra high performance liquid chromatographic system to evaluate the robustness of the predicted DS while providing rapid methods of analysis. Moreover, one of the methods was fully validated using the accuracy profile as a decision tool, and was then used for the quantitative determination of three active ingredients and one impurity in a common and widely used pharmaceutical formulation. The method was applied to 5 pharmaceuticals sold in the Democratic Republic of Congo. None of these pharmaceuticals was found compliant with the European Medicines Agency specifications. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Efficient methods and readily customizable libraries for managing complexity of large networks.

    PubMed

    Dogrusoz, Ugur; Karacelik, Alper; Safarli, Ilkin; Balci, Hasan; Dervishi, Leonard; Siper, Metin Can

    2018-01-01

    One common problem in visualizing real-life networks, including biological pathways, is the large size of these networks. Oftentimes, users find themselves facing slow, non-scaling operations due to network size, if not a "hairball" network, hindering effective analysis. One extremely useful method for reducing the complexity of large networks is hierarchical clustering and nesting, with expand-collapse operations applied on demand during analysis. Another such method is hiding currently unnecessary details, to be gradually revealed later on demand. Major challenges when applying complexity reduction operations to large networks include efficiency and maintaining the user's mental map of the drawing. We developed specialized incremental layout methods for preserving a user's mental map while managing the complexity of large networks through expand-collapse and hide-show operations. We also developed open-source JavaScript libraries as plug-ins to the web-based graph visualization library Cytoscape.js to implement these methods as complexity management operations. Through efficient specialized algorithms provided by these extensions, one can collapse or hide desired parts of a network, yielding potentially much smaller networks that are more suitable for interactive visual analysis. This work fills an important gap by making efficient implementations of some already known complexity management techniques freely available to tool developers through a couple of open-source, customizable software libraries, and by introducing some heuristics which can be applied alongside such complexity management techniques to ensure the user's mental map is preserved.

  3. Towards the development of universal, fast and highly accurate docking/scoring methods: a long way to go

    PubMed Central

    Moitessier, N; Englebienne, P; Lee, D; Lawandi, J; Corbeil, C R

    2008-01-01

    Accelerating the drug discovery process requires predictive computational protocols capable of reducing or simplifying the synthetic and/or combinatorial challenge. Docking-based virtual screening methods have been developed and successfully applied to a number of pharmaceutical targets. In this review, we first present the current status of docking and scoring methods, with exhaustive lists of these. We next discuss reported comparative studies, outlining criteria for their interpretation. In the final section, we describe some of the remaining developments that would potentially lead to a universally applicable docking/scoring method. PMID:18037925

  4. Analytical Methods to Distinguish the Positive and Negative Spectra of Mineral and Environmental Elements Using Deep Ablation Laser-Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Kim, Dongyoung; Yang, Jun-Ho; Choi, Soojin; Yoh, Jack J

    2018-01-01

    Environments affect mineral surfaces, and the resulting surface contamination or alteration can provide information for understanding the minerals' regional environments. However, when investigating mineral surfaces, mineral and environmental elements appear mixed in the data, which makes it difficult to determine their atomic compositions independently. In this research, we developed four analytical methods to separate mineral and environmental elements into positive and negative spectra based on depth-profiling data obtained using laser-induced breakdown spectroscopy (LIBS). The principle of the methods is to use how intensity varies with depth to create a new spectrum. The methods were applied to five mineral samples exposed to four environmental conditions: seawater, crude oil, sulfuric acid, and air as a control. The proposed methods were then validated by applying the resultant spectra to principal component analysis, and the data were classified by the environmental conditions and the atomic compositions of the minerals. By applying the methods, the atomic information of the minerals and the environmental conditions was successfully inferred from the resultant spectra.

  5. Engineering Design Theory: Applying the Success of the Modern World to Campaign Creation

    DTIC Science & Technology

    2009-05-21

    and school of thought) to the simple methods of design.6 This progression is analogous to Peter Senge’s levels of learning disciplines.7 Senge...iterative learning and adaptive action that develops and employs critical and creative thinking, enabling leaders to apply the necessary logic to...overcome mental rigidity and develop group insight, the Army must learn to utilize group learning and thinking, through a fluid and creative open process

  6. Elwood Murray: Pioneering Methodologist in Communication

    ERIC Educational Resources Information Center

    Brownell, Judi

    2014-01-01

    Elwood Murray (1897-1988) was a pioneer in communication education. Beginning in the 1930s, he applied nontraditional methods in the speech classroom to encourage students to internalize and apply what they learned, and to view knowledge holistically. Drawing on the work of Kunkel, Moreno, Lewin, and Korzybski, Murray focused on developing skills…

  7. A Comparison of Various MRA Methods Applied to Longitudinal Evaluation Studies in Vocational Education.

    ERIC Educational Resources Information Center

    Kapes, Jerome T.; And Others

    Three models of multiple regression analysis (MRA): single equation, commonality analysis, and path analysis, were applied to longitudinal data from the Pennsylvania Vocational Development Study. Variables influencing weekly income of vocational education students one year after high school graduation were examined: grade point averages (grades…

  8. Applying the Bootstrap to Taxometric Analysis: Generating Empirical Sampling Distributions to Help Interpret Results

    ERIC Educational Resources Information Center

    Ruscio, John; Ruscio, Ayelet Meron; Meron, Mati

    2007-01-01

    Meehl's taxometric method was developed to distinguish categorical and continuous constructs. However, taxometric output can be difficult to interpret because expected results for realistic data conditions and differing procedural implementations have not been derived analytically or studied through rigorous simulations. By applying bootstrap…

  9. Ratio manipulating spectrophotometry versus chemometry as stability indicating methods for cefquinome sulfate determination.

    PubMed

    Yehia, Ali M; Arafa, Reham M; Abbas, Samah S; Amer, Sawsan M

    2016-01-15

    Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were developed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering methods are ratio manipulating spectrophotometric methods that were satisfactorily applied for selective determination of CFQ within a linear range of 5.0-40.0 μg mL(-1). Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. The developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Investigation of High-Angle-of-Attack Maneuver-Limiting Factors. Part 1. Analysis and Simulation

    DTIC Science & Technology

    1980-12-01

    useful, are not so satisfying or instructive as the more positive identification of causal factors offered by the methods developed in Reference 5...same methods be applied to additional high-performance fighter aircraft having widely differing high AOA handling characteristics to see if further...predictions and the nonlinear model results were resolved. The second task involved development of methods, criteria, and an associated pilot rating scale, for

  11. A multi-method approach to curriculum development for in-service training in China's newly established health emergency response offices.

    PubMed

    Wang, Yadong; Li, Xiangrui; Yuan, Yiwen; Patel, Mahomed S

    2014-01-01

    To describe an innovative approach for developing and implementing an in-service curriculum in China for staff of the newly established health emergency response offices (HEROs), one that is generalisable to other settings. The multi-method training needs assessment included reviews of the competency domains needed to implement the International Health Regulations (2005) as well as China's policies and emergency regulations. The reviews, iterative interviews and workshops with experts in government, academia, the military, and with HERO staff were critically appraised by an expert technical advisory panel. Over 1600 participants contributed to curriculum development. Of the 18 competency domains identified as essential for HERO staff, nine were developed into priority in-service training modules to be conducted over 2.5 weeks. Experts from academia and experienced practitioners prepared and delivered each module through lectures followed by interactive problem-solving exercises and desktop simulations to help trainees apply, experiment with, and consolidate newly acquired knowledge and skills. This study adds to the emerging literature on China's enduring efforts to strengthen its emergency response capabilities since the outbreak of SARS in 2003. The multi-method approach to curriculum development in partnership with senior policy-makers, researchers, and experienced practitioners can be applied in other settings to ensure training is responsive and customized to local needs, resources and priorities. Ongoing curriculum development should reflect international standards and be coupled with the development of appropriate performance support systems at the workplace to motivate staff to apply their newly acquired knowledge and skills effectively and creatively.

  12. Development and evaluation of modified envelope correlation method for deep tectonic tremor

    NASA Astrophysics Data System (ADS)

    Mizuno, N.; Ide, S.

    2017-12-01

    We develop a new location method for deep tectonic tremor, improving on the widely used envelope correlation method, and apply it to construct a tremor catalog for western Japan. Using the cross-correlation functions as objective functions and weighting the components of the data by the inverse of their error variances, the envelope cross-correlation method is redefined as a maximum likelihood method. This method is also capable of multiple source detection, because when several events occur almost simultaneously, they appear as local maxima of the likelihood. The average of the weighted cross-correlation functions, defined as ACC, is a nonlinear function whose variable is the position of the deep tectonic tremor. The optimization method has two steps. First, we fix the source depth at 30 km and use a grid search with 0.2 degree intervals to find the maxima of ACC, which are candidate event locations. Then, using each of the candidate locations as an initial value, we apply a gradient method to determine the horizontal and vertical components of the hypocenter. Sometimes, several source locations are determined within a single 5-minute time window. We estimate that the resolution, defined as the distance at which sources can be detected separately by the location method, is about 100 km. The validity of this estimate is confirmed by a numerical test using synthetic waveforms. Applied to continuous seismograms in western Japan spanning more than 10 years, the new method detected 27% more tremors than a previous method, owing to multiple-event detection and the improved accuracy of the weighting scheme.
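    The basic objective in an envelope-correlation grid search is to shift station envelopes by predicted travel times and score each trial source by the average pairwise correlation. The sketch below shows only that bare skeleton; the inverse-variance weighting, depth optimization, and multi-event detection of the record above are omitted, and the geometry, velocity, and data handling are toy assumptions.

    ```python
    # Heavily simplified envelope-correlation grid search over trial epicentres.
    # stations: list of np.array([x, y]) in km; grid: iterable of candidate np.array([x, y]);
    # envelopes: list of equal-length envelope traces sampled at interval dt (s); vs in km/s.
    import numpy as np

    def locate(envelopes, stations, grid, vs=3.5, dt=1.0):
        best, best_score = None, -np.inf
        for src in grid:
            shifted = []
            for env, sta in zip(envelopes, stations):
                lag = int(round(np.hypot(*(src - sta)) / vs / dt))   # predicted S travel time in samples
                shifted.append(np.roll(env, -lag))                   # crude alignment to origin time
            score = np.mean([np.corrcoef(a, b)[0, 1]
                             for i, a in enumerate(shifted)
                             for b in shifted[i + 1:]])              # average pairwise correlation
            if score > best_score:
                best, best_score = src, score
        return best, best_score
    ```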

  13. Computational flow development for unsteady viscous flows: Foundation of the numerical method

    NASA Technical Reports Server (NTRS)

    Bratanow, T.; Spehert, T.

    1978-01-01

    A procedure is presented for effective consideration of viscous effects in computational development of high Reynolds number flows. The procedure is based on the interpretation of the Navier-Stokes equations as vorticity transport equations. The physics of the flow was represented in a form suitable for numerical analysis. Lighthill's concept for flow development for computational purposes was adapted. The vorticity transport equations were cast in a form convenient for computation. A statement for these equations was written using the method of weighted residuals and applying the Galerkin criterion. An integral representation of the induced velocity was applied on the basis of the Biot-Savart law. Distribution of new vorticity, produced at wing surfaces over small computational time intervals, was assumed to be confined to a thin region around the wing surfaces.

  14. Mathematical Formulation used by MATLAB Code to Convert FTIR Interferograms to Calibrated Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Armstrong, Derek Elswick

    This report discusses the mathematical procedures used to convert raw interferograms from Fourier transform infrared (FTIR) sensors to calibrated spectra. The work discussed in this report was completed as part of the Helios project at Los Alamos National Laboratory. MATLAB code was developed to convert the raw interferograms to calibrated spectra. The report summarizes the developed MATLAB scripts and functions, along with a description of the mathematical methods used by the code. The first step in working with raw interferograms is to convert them to uncalibrated spectra by applying an apodization function to the raw data and then by performing a Fourier transform. The developed MATLAB code also addresses phase error correction by applying the Mertz method. This report provides documentation for the MATLAB scripts.
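    The report's code is in MATLAB; for orientation only, the sketch below expresses the same basic pipeline (apodize, transform, Mertz-style phase correction from a short double-sided region around the centre burst) in Python. The window choice, array sizes, and the final calibration step are illustrative assumptions, not the Helios implementation.

    ```python
    # Interferogram -> uncalibrated spectrum with a simplified Mertz phase correction.
    import numpy as np

    def to_spectrum(ifg, zpd, short_half=256):
        # assumes short_half <= zpd and zpd + short_half <= len(ifg)
        n = len(ifg)
        full = np.fft.rfft(ifg * np.hanning(n))                  # apodize, then Fourier transform
        # low-resolution phase from a short symmetric interval about the zero-path-difference point
        seg = ifg[zpd - short_half: zpd + short_half] * np.hanning(2 * short_half)
        phase = np.angle(np.fft.rfft(seg))
        # interpolate the coarse phase onto the full spectral grid and remove it
        phase_full = np.interp(np.linspace(0, 1, len(full)),
                               np.linspace(0, 1, len(phase)), phase)
        return np.real(full * np.exp(-1j * phase_full))

    # uncalibrated = to_spectrum(raw_interferogram, zpd_index)
    # calibrated   = (uncalibrated - background) * responsivity_correction   # schematic only
    ```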

  15. Dynamic game balancing implementation using adaptive algorithm in mobile-based Safari Indonesia game

    NASA Astrophysics Data System (ADS)

    Yuniarti, Anny; Nata Wardanie, Novita; Kuswardayan, Imam

    2018-03-01

    In developing a game there is one method that should be applied to maintain the interest of players, namely dynamic game balancing. Dynamic game balancing is a process of matching the game's behaviour, attributes, and environment to a player's playing style. This study applies dynamic game balancing using an adaptive algorithm in a scrolling-shooter game called Safari Indonesia, developed using Unity. In this type of game, the player controls a fighter aircraft trying to defend itself from persistent enemy attacks. This classic game type was chosen for implementing adaptive algorithms because it has sufficiently complex attributes to be developed with dynamic game balancing. Tests conducted by distributing questionnaires to a number of players indicate that this method managed to reduce frustration and increase the pleasure factor in playing.
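    Adaptive dynamic game balancing generally amounts to nudging a challenge parameter after each evaluation window so the player's success rate stays inside a target band. The sketch below shows only that generic loop; the thresholds, step size, and performance measure are assumptions, not the Safari Indonesia implementation.

    ```python
    # Generic adaptive difficulty update used in dynamic game balancing.
    def update_difficulty(difficulty, hits_taken, enemies_destroyed,
                          low=0.35, high=0.65, step=0.1):
        total = hits_taken + enemies_destroyed
        if total == 0:
            return difficulty
        success = enemies_destroyed / total
        if success > high:        # player dominating -> raise the challenge
            difficulty += step
        elif success < low:       # player struggling -> ease off
            difficulty -= step
        return min(max(difficulty, 0.1), 1.0)

    # The game loop would map `difficulty` onto enemy spawn rate, speed, or health.
    ```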

  16. Simultaneous determination of ezetimibe and simvastatin in pharmaceutical preparations by MEKC.

    PubMed

    Yardimci, Ceren; Ozaltin, Nuran

    2010-02-01

    A micellar electrokinetic capillary chromatography method was developed and validated for the simultaneous determination of ezetimibe and simvastatin in pharmaceutical preparations. The influence of buffer concentration, buffer pH, sodium dodecyl sulphate (SDS) concentration, organic modifier, capillary temperature, applied voltage, and injection time was investigated, and the method validation studies were performed. The optimum separation for these analytes was achieved in less than 10 min at 30 degrees C with a fused-silica capillary column (56 cm x 50 microm i.d.) and a 25 mM borate buffer at pH 9.0 containing 25 mM SDS and 10% (v/v) acetonitrile. The samples were injected hydrodynamically for 3 s at 50 mbar, and the applied voltage was +30.0 kV. Detection wavelength was set at 238 nm. Diflunisal was used as internal standard. The method was suitably validated with respect to stability, specificity, linearity, limits of detection and quantification, accuracy, precision, and robustness. The limits of detection and quantification were 1.0 and 2.0 microg/mL for both ezetimibe and simvastatin, respectively. The method developed was successfully applied to the simultaneous determination of ezetimibe and simvastatin in pharmaceutical preparations.

  17. Using Policy-Capturing to Measure Attitudes in Organizational Diagnosis.

    ERIC Educational Resources Information Center

    Madden, Joseph M.

    1981-01-01

    Discusses an indirect method of attitude measurement, policy-capturing, that can be applied on an individual basis. In three experiments this method detected prejudicial attitudes toward females not detected with traditional methods. Can be used as a self-improvement diagnostic tool for developing awareness of behavior influences. (JAC)

  18. Identifying Advanced Technologies for Education's Future.

    ERIC Educational Resources Information Center

    Moore, Gwendolyn B.; Yin, Robert K.

    A study to determine how three advanced technologies might be applied to the needs of special education students helped inspire the development of a new method for identifying such applications. This new method, named the "Hybrid Approach," combines features of the two traditional methods: technology-push and demand-pull. Technology-push involves…

  19. Joint the Center for Applied Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, Todd; Bremer, Timo; Van Essen, Brian

    The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.

  20. Development Research of a Teachers' Educational Performance Support System: The Practices of Design, Development, and Evaluation

    ERIC Educational Resources Information Center

    Hung, Wei-Chen; Smith, Thomas J.; Harris, Marian S.; Lockard, James

    2010-01-01

    This study adopted design and development research methodology (Richey & Klein, "Design and development research: Methods, strategies, and issues," 2007) to systematically investigate the process of applying instructional design principles, human-computer interaction, and software engineering to a performance support system (PSS) for behavior…

  1. Non-destructive research methods applied on materials for the new generation of nuclear reactors

    NASA Astrophysics Data System (ADS)

    Bartošová, I.; Slugeň, V.; Veterníková, J.; Sojak, S.; Petriska, M.; Bouhaddane, A.

    2014-06-01

    The paper is aimed at non-destructive experimental techniques applied to materials for the new generation of nuclear reactors (GEN IV). With the development of these reactors, materials must also be developed in order to guarantee the high-standard properties needed for construction. These properties are high temperature resistance, radiation resistance and resistance to other negative effects, while the changes in mechanical properties should remain only minimal. Materials that fulfil these requirements are analysed in this work. The ferritic-martensitic (FM) steels and ODS steels are studied in detail. Microstructural defects, which can occur in structural materials and can also accumulate during irradiation due to neutron flux or alpha, beta and gamma radiation, were analysed using different spectroscopic methods, namely positron annihilation spectroscopy and Barkhausen noise, which were applied to measurements of three different FM steels (T91, P91 and E97) as well as one ODS steel (ODS Eurofer).

  2. Investigating the use of a rational Runge Kutta method for transport modelling

    NASA Astrophysics Data System (ADS)

    Dougherty, David E.

    An unconditionally stable explicit time integrator has recently been developed for parabolic systems of equations. This rational Runge Kutta (RRK) method, proposed by Wambecq [1] and Hairer [2], has been applied by Liu et al. [3] to linear heat conduction problems in a time-partitioned solution context. An important practical question is whether the method has application for the solution of (nearly) hyperbolic equations as well. In this paper the RRK method is applied to a nonlinear heat conduction problem, the advection-diffusion equation, and the hyperbolic Buckley-Leverett problem. The method is, indeed, found to be unconditionally stable for the linear heat conduction problem and performs satisfactorily for the nonlinear heat flow case. A heuristic limitation on the utility of RRK for the advection-diffusion equation arises in the Courant number; for the second-order accurate one-step two-stage RRK method, a limiting Courant number of 2 applies. First order upwinding is not as effective when used with RRK as with Euler one-step methods. The method is found to perform poorly for the Buckley-Leverett problem.
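    For orientation, a two-stage, second-order rational Runge-Kutta step can be written in scalar form as y_{n+1} = y_n + 2h g1^2/(3 g1 - g2) with g1 = f(y_n), g2 = f(y_n + h g1), which reproduces the (1,1) Padé stability function for linear decay. The sketch below generalizes this to systems with the usual vector "pseudo-division"; the exact coefficients used in the cited references may differ, so this is an assumed illustrative variant.

    ```python
    # Assumed two-stage rational Runge-Kutta step (second order for smooth problems),
    # demonstrated on a single stiff linear mode where explicit Euler would blow up.
    import numpy as np

    def rrk_step(f, y, h):
        g1 = f(y)
        g2 = f(y + h * g1)
        b = 0.5 * (3.0 * g1 - g2)                              # "denominator" vector
        num = 2.0 * np.dot(g1, b) * g1 - np.dot(g1, g1) * b    # vector pseudo-division g1^2 / b
        return y + h * num / np.dot(b, b)

    A = np.array([[-100.0]])             # y' = -100*y
    f = lambda y: A @ y
    y = np.array([1.0])
    for _ in range(50):
        y = rrk_step(f, y, 0.1)          # h*lambda = -10, well past the explicit Euler limit
    print(y)                              # decays toward zero instead of blowing up
    ```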

  3. Development, validation and determination of multiclass pesticide residues in cocoa beans using gas chromatography and liquid chromatography tandem mass spectrometry.

    PubMed

    Zainudin, Badrul Hisyam; Salleh, Salsazali; Mohamed, Rahmat; Yap, Ken Choy; Muhamad, Halimah

    2015-04-01

    An efficient and rapid method for the analysis of pesticide residues in cocoa beans using gas and liquid chromatography-tandem mass spectrometry was developed, validated and applied to imported and domestic cocoa bean samples collected over 2 years from smallholders and Malaysian ports. The method is based on solvent extraction and covers 26 pesticides (insecticides, fungicides, and herbicides) of different chemical classes. The recoveries for all pesticides at 10 and 50 μg/kg were in the range of 70-120% with relative standard deviations of less than 20%. Good selectivity and sensitivity were obtained, with a method limit of quantification of 10 μg/kg. The expanded measurement uncertainties were in the range of 4-25%. Finally, the proposed method was successfully applied to the routine analysis of pesticide residues in cocoa beans in a monitoring study, where 10% of the samples were found positive for chlorpyrifos, ametryn and metalaxyl. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Development and validation of an MEKC method for determination of nitrogen-containing drugs in pharmaceutical preparations.

    PubMed

    Buiarelli, Francesca; Coccioli, Franco; Jasionowska, Renata; Terracciano, Alessandro

    2008-09-01

    A fast and accurate micellar electrokinetic capillary chromatography method was developed for the quality control of pharmaceutical preparations containing cold remedies such as acetaminophen, salicylamide, caffeine, phenylephrine, pseudoephedrine, norephedrine and chlorpheniramine. The method optimization was carried out on a Beckman P/ACE System MDQ instrument. Baseline separation of the seven analytes was achieved in an uncoated fused-silica capillary (internal diameter 50 microm) using a tris-borate BGE (20 mM, pH 8.5) containing 30 mM sodium dodecyl sulphate. On-line UV detection at 214 nm was performed and the applied voltage was 10 kV. The operating temperature was 25 degrees C. After optimization of the experimental conditions, the proposed method was validated. The evaluated parameters were: precision of migration time and of the corrected peak area ratio, linearity range, limit of detection, limit of quantification, accuracy (recovery), ruggedness and applicability. The method was then successfully applied to the analysis of three pharmaceutical preparations containing some of the analytes listed above.

  5. A forward model-based validation of cardiovascular system identification

    NASA Technical Reports Server (NTRS)

    Mukkamala, R.; Cohen, R. J.

    2001-01-01

    We present a theoretical evaluation of a cardiovascular system identification method that we previously developed for the analysis of beat-to-beat fluctuations in noninvasively measured heart rate, arterial blood pressure, and instantaneous lung volume. The method provides a dynamical characterization of the important autonomic and mechanical mechanisms responsible for coupling the fluctuations (inverse modeling). To carry out the evaluation, we developed a computational model of the cardiovascular system capable of generating realistic beat-to-beat variability (forward modeling). We applied the method to data generated from the forward model and compared the resulting estimated dynamics with the actual dynamics of the forward model, which were either precisely known or easily determined. We found that the estimated dynamics corresponded to the actual dynamics and that this correspondence was robust to forward model uncertainty. We also demonstrated the sensitivity of the method in detecting small changes in parameters characterizing autonomic function in the forward model. These results provide confidence in the performance of the cardiovascular system identification method when applied to experimental data.

  6. QSAR and 3D-QSAR studies applied to compounds with anticonvulsant activity.

    PubMed

    Garro Martinez, Juan C; Vega-Hissi, Esteban G; Andrada, Matías F; Estrada, Mario R

    2015-01-01

    Quantitative structure-activity relationships (QSAR and 3D-QSAR) have been applied in the last decade to obtain reliable statistical models for the prediction of the anticonvulsant activities of new chemical entities. However, despite the large amount of information on QSAR, no recent review has been published that discusses these data in detail. In this review, the authors provide a detailed discussion of QSAR studies applied to compounds with anticonvulsant activity published between the years 2003 and 2013. They also evaluate the mathematical approaches and the main software used to develop the QSAR and 3D-QSAR models. QSAR methodologies continue to attract the attention of researchers and provide valuable information for the development of new potentially active compounds, including those with anticonvulsant activity. This has been helped in part by improvements in the size and performance of computers, the development of specific software, and the development of novel molecular descriptors, which have given rise to new and more predictive QSAR models. The extensive development of descriptors, and the way in which descriptor values are derived, have allowed the QSAR methods to evolve. This evolution could strengthen QSAR methods as an important tool in the research and development of new and more potent anticonvulsant agents.

  7. Valuing national effects of digital health investments: an applied method.

    PubMed

    Hagens, Simon; Zelmer, Jennifer; Frazer, Cassandra; Gheorghiu, Bobby; Leaver, Chad

    2015-01-01

    This paper describes an approach that has been applied to value national outcomes of investments by federal, provincial and territorial governments, clinicians and healthcare organizations in digital health. Hypotheses are used to develop a model, which is revised and populated based upon the available evidence. Quantitative national estimates and qualitative findings are produced and validated through structured peer review processes. This methodology has been applied in four studies since 2008.

  8. On the prediction of far field computational aeroacoustics of advanced propellers

    NASA Technical Reports Server (NTRS)

    Jaeger, Stephen M.; Korkan, Kenneth D.

    1990-01-01

    A numerical method for determining the acoustic far field generated by a high-speed subsonic aircraft propeller was developed. The approach used in this method was to generate the entire three-dimensional pressure field about the propeller (using an Euler flowfield solver) and then to apply a solution of the wave equation on a cylindrical surface enveloping the propeller. The method is applied to generate the three-dimensional flowfield between two blades of an advanced propeller. The results are compared with experimental data obtained in a wind-tunnel test at a Mach number of 0.6.

  9. [Promoting parental competence by video counselling: The Marte Meo method].

    PubMed

    Bünder, Peter; Sirringhaus-Bünder, Annegret

    2008-01-01

    Marte Meo is a low-threshold outpatient form of counselling for parents and other persons to whom children relate most closely, aimed at strengthening their educational competence and helping them to assume responsibility for the development of children. The article gives a short summary of how this method has been developed and applied in the field of youth care.

  10. Effects of Multiple Intelligences Activities on Writing Skill Development in an EFL Context

    ERIC Educational Resources Information Center

    Gündüz, Zennure Elgün; Ünal, Ismail Dogan

    2016-01-01

    This study aims at exploring the effects of multiple intelligences activities versus traditional method on English writing development of the sixth grade students in Turkey. A quasi-experimental research method with a pre-test post-test design was applied. The participants were 50 sixth grade students at a state school in Ardahan in Turkey. The…

  11. Modeling of polymer networks for application to solid propellant formulating

    NASA Technical Reports Server (NTRS)

    Marsh, H. E.

    1979-01-01

    Methods for predicting the network structural characteristics formed by the curing of pourable elastomers are presented, as well as the logic applied in developing the mathematical models. A universal approach to modeling was developed and verified by comparison with other methods in application to a complex system. Several applications of network models to practical problems are described.

  12. Developing a Blended Learning-Based Method for Problem-Solving in Capability Learning

    ERIC Educational Resources Information Center

    Dwiyogo, Wasis D.

    2018-01-01

    The main objectives of the study were to develop and investigate the implementation of blended learning based method for problem-solving. Three experts were involved in the study and all three had stated that the model was ready to be applied in the classroom. The implementation of the blended learning-based design for problem-solving was…

  13. A contracting-interval program for the Danilewski method. Ph.D. Thesis - Va. Univ.

    NASA Technical Reports Server (NTRS)

    Harris, J. D.

    1971-01-01

    The concept of contracting-interval programs is applied to finding the eigenvalues of a matrix. The development is a three-step process in which (1) a program is developed for the reduction of a matrix to Hessenberg form, (2) a program is developed for the reduction of a Hessenberg matrix to colleague form, and (3) the characteristic polynomial with interval coefficients is readily obtained from the interval of colleague matrices. This interval polynomial is then factored into quadratic factors so that the eigenvalues may be obtained. To develop a contracting-interval program for factoring this polynomial with interval coefficients it is necessary to have an iteration method which converges even in the presence of controlled rounding errors. A theorem is stated giving sufficient conditions for the convergence of Newton's method when both the function and its Jacobian cannot be evaluated exactly but errors can be made proportional to the square of the norm of the difference between the previous two iterates. This theorem is applied to prove the convergence of the generalization of the Newton-Bairstow method that is used to obtain quadratic factors of the characteristic polynomial.
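
    The final step described above, extracting quadratic factors of the characteristic polynomial with a Bairstow-type generalization of Newton's method, can be illustrated with a minimal floating-point sketch. The interval arithmetic and controlled-rounding machinery of the contracting-interval program are not reproduced here; this is only the underlying iteration, for a polynomial of degree three or higher.

```python
import numpy as np

def bairstow(a, r=0.5, s=0.5, tol=1e-10, max_iter=200):
    """Find one quadratic factor x^2 - r*x - s of the polynomial
    a[0] + a[1]*x + ... + a[n]*x^n by Bairstow's Newton-type iteration."""
    a = np.asarray(a, dtype=float)
    n = len(a) - 1
    for _ in range(max_iter):
        b = np.zeros(n + 1)
        c = np.zeros(n + 1)
        # First synthetic division by x^2 - r*x - s: b holds the quotient
        # coefficients; b[1], b[0] determine the remainder.
        b[n] = a[n]
        b[n - 1] = a[n - 1] + r * b[n]
        for i in range(n - 2, -1, -1):
            b[i] = a[i] + r * b[i + 1] + s * b[i + 2]
        # Second synthetic division: c holds the partial derivatives
        # needed for the Newton step on (r, s).
        c[n] = b[n]
        c[n - 1] = b[n - 1] + r * c[n]
        for i in range(n - 2, 0, -1):
            c[i] = b[i] + r * c[i + 1] + s * c[i + 2]
        # Solve the 2x2 Newton system for the corrections (dr, ds).
        det = c[2] * c[2] - c[3] * c[1]
        dr = (-b[1] * c[2] + b[0] * c[3]) / det
        ds = (-b[0] * c[2] + b[1] * c[1]) / det
        r, s = r + dr, s + ds
        if abs(b[0]) < tol and abs(b[1]) < tol:
            break
    # Two eigenvalue estimates are the roots of x^2 - r*x - s.
    return np.roots([1.0, -r, -s]), r, s
```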

  14. A new LC-MS based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous system

    PubMed Central

    Wang, Shunhai; Bobst, Cedric E.; Kaltashov, Igor A.

    2018-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein which is viewed as a promising drug carrier to target the central nervous system due to its ability to penetrate the blood-brain barrier (BBB). Among the many challenges during the development of Tf-based therapeutics, sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult due to the presence of abundant endogenous Tf. Herein, we describe the development of a new LC-MS based method for sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous hTf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed O18-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation. PMID:26307718
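
    As a rough illustration of the internal-standard principle underlying the O18-labeling step, the sketch below computes exogenous transferrin concentrations from light/heavy peak-area ratios. It assumes equal response for the labeled and unlabeled forms and uses hypothetical numbers; it is not the paper's validated workflow.

```python
import numpy as np

def quantify_with_labeled_is(area_analyte, area_is, conc_is_spiked):
    """Quantify an analyte from LC-MS peak areas using an isotopically
    labeled internal standard spiked at a known concentration.

    Assumes equal ionization/recovery for the light and heavy forms, so
    concentration scales with the light/heavy peak-area ratio."""
    ratio = np.asarray(area_analyte, float) / np.asarray(area_is, float)
    return ratio * conc_is_spiked

# Hypothetical example: heavy-labeled standard spiked at 5 ug/mL.
print(quantify_with_labeled_is([1.2e6, 0.8e6], [1.0e6, 1.0e6], 5.0))
```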

  15. Final Technical Report for "Applied Mathematics Research: Simulation Based Optimization and Application to Electromagnetic Inverse Problems"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haber, Eldad

    2014-03-17

    The focus of the research was: developing adaptive meshes for the solution of Maxwell's equations; developing a parallel framework for time-dependent inverse Maxwell's equations; developing multilevel methods for optimization problems with inequality constraints; a new inversion code for inverse Maxwell's equations at zero frequency (DC resistivity); and a new inversion code for inverse Maxwell's equations in the low-frequency regime. Although the research concentrated on electromagnetic forward and inverse problems, the results were also applied to the problem of image registration.

  16. Development of a process-oriented vulnerability concept for water travel time in karst aquifers-case study of Tanour and Rasoun springs catchment area.

    NASA Astrophysics Data System (ADS)

    Hamdan, Ibraheem; Sauter, Martin; Ptak, Thomas; Wiegand, Bettina; Margane, Armin; Toll, Mathias

    2017-04-01

    Key words: karst aquifer, water travel time, vulnerability assessment, Jordan. Understanding groundwater pathways and movement through karst aquifers, and the response of karst aquifers to precipitation events, especially in arid to semi-arid areas, is fundamental for evaluating pollution risks from point and non-point sources. In spite of their great importance for drinking-water supply, karst aquifers are highly sensitive to contamination events because of the fast connections between the land surface and the groundwater (through karst features), which makes groundwater quality issues within karst systems very complicated. Within this study, different methods and approaches were developed and applied in order to characterise the karst aquifer system of the Tanour and Rasoun springs (NW Jordan) and the flow dynamics within the aquifer, and to develop a process-oriented method for vulnerability assessment based on the monitoring of different spatially variable parameters of water travel time in the karst aquifer. In general, this study aims to achieve two main objectives: 1. characterization of the karst aquifer system and flow dynamics; 2. development of a process-oriented method for vulnerability assessment based on spatially variable parameters of travel time. To achieve these aims, different approaches and methods were applied, starting from an understanding of the geological and hydrogeological characteristics of the karst aquifer and its vulnerability to pollutants, through the use of different methods, procedures and monitored parameters to determine the water travel time within the aquifer and investigate its response to precipitation events, and, finally, to the study of the aquifer response to pollution events. The integrated breakthrough signal obtained from the applied methods and procedures, including the use of stable isotopes of oxygen and hydrogen, the monitoring of multiple qualitative and quantitative parameters using automated probes and data loggers, and the development of a travel-time, physics-based vulnerability assessment method, shows good agreement, confirming these as applicable methods for determining the water travel time in karst aquifers and for investigating their response to precipitation and pollution events.

  17. Human ergology that promotes participatory approach to improving safety, health and working conditions at grassroots workplaces: achievements and actions.

    PubMed

    Kawakami, Tsuyoshi

    2011-12-01

    Participatory approaches are increasingly applied to improve safety, health and working conditions in grassroots workplaces in Asia. The core concepts and methods of human ergology research, such as promoting real work-life studies, relying on the positive efforts of local people (daily life-technology), promoting the active participation of local people to identify practical solutions, and learning from local human networks to reach grassroots workplaces, have provided useful viewpoints for devising such participatory training programmes. This study aimed to analyze how human ergology approaches were applied in the actual development and application of three typical participatory training programmes: WISH (Work Improvement for Safe Home) with home workers in Cambodia, WISCON (Work Improvement in Small Construction Sites) with construction workers in Thailand, and WARM (Work Adjustment for Recycling and Managing Waste) with waste collectors in Fiji. The results revealed that all three programmes, in the course of their development, applied direct observation of the work of the target workers before devising the training programmes, learned from existing local good examples and efforts, and emphasized local human networks for cooperation. These methods and approaches were repeatedly applied in grassroots workplaces, taking advantage of their sustainability and impact. It was concluded that human ergology approaches contributed substantially to the development and expansion of participatory training programmes and could continue to support the self-help initiatives of local people in promoting human-centred work.

  18. A method for determining electrophoretic and electroosmotic mobilities using AC and DC electric field particle displacements.

    PubMed

    Oddy, M H; Santiago, J G

    2004-01-01

    We have developed a method for measuring the electrophoretic mobility of submicrometer, fluorescently labeled particles and the electroosmotic mobility of a microchannel. We derive explicit expressions for the unknown electrophoretic and the electroosmotic mobilities as a function of particle displacements resulting from alternating current (AC) and direct current (DC) applied electric fields. Images of particle displacements are captured using an epifluorescent microscope and a CCD camera. A custom image-processing code was developed to determine image streak lengths associated with AC measurements, and a custom particle tracking velocimetry (PTV) code was devised to determine DC particle displacements. Statistical analysis was applied to relate mobility estimates to measured particle displacement distributions.
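
    A toy calculation in the spirit of the approach is sketched below, under two loudly simplifying assumptions that are not taken from the paper: (i) the AC streak length reflects purely electrophoretic oscillation (electroosmosis taken as negligible at the AC frequency), and (ii) the DC displacement over the exposure time reflects the combined electrophoretic and electroosmotic motion. The explicit expressions derived in the paper are not reproduced, and all numbers are hypothetical.

```python
import numpy as np

def mobilities_from_displacements(streak_len, E_ac, f_ac,
                                  dc_disp, E_dc, t_dc):
    """Toy estimate of electrophoretic and electroosmotic mobilities.

    Simplifying assumptions (not the paper's full derivation):
      * AC streak length = 2 * amplitude of purely electrophoretic
        oscillation, i.e. streak_len = mu_ep * E_ac / (pi * f_ac).
      * DC displacement over exposure time t_dc reflects combined motion,
        dc_disp = (mu_ep + mu_eo) * E_dc * t_dc.
    """
    mu_ep = streak_len * np.pi * f_ac / E_ac   # from the AC streak length
    mu_total = dc_disp / (E_dc * t_dc)         # observed DC mobility
    mu_eo = mu_total - mu_ep                   # electroosmotic contribution
    return mu_ep, mu_eo

# Hypothetical numbers in SI units (m, V/m, Hz, s).
mu_ep, mu_eo = mobilities_from_displacements(
    streak_len=20e-6, E_ac=5e3, f_ac=10.0,
    dc_disp=50e-6, E_dc=2e3, t_dc=0.5)
```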

  19. ECO-ITS : Intelligent Transportation System Applications to Improve Environmental Performance

    DOT National Transportation Integrated Search

    2012-05-01

    This report describes recent research supported by the US DOT's AERIS program, building upon existing work through developing and improving data collection methods, developing new data fusion techniques to improve estimates, and applying appropriat...

  20. Development of indirect EFBEM for radiating noise analysis including underwater problems

    NASA Astrophysics Data System (ADS)

    Kwon, Hyun-Wung; Hong, Suk-Yoon; Song, Jee-Hun

    2013-09-01

    For the analysis of radiating noise problems in medium-to-high frequency ranges, the Energy Flow Boundary Element Method (EFBEM) was developed. EFBEM is an analysis technique that applies the Boundary Element Method (BEM) to Energy Flow Analysis (EFA). Fundamental solutions representing the spherical wave property for radiating noise problems in the open field, and accounting for the free-surface effect underwater, are developed. A directivity factor is also developed to express the directivity patterns of waves in medium-to-high frequency ranges. Indirect EFBEM, using the fundamental solutions and fictitious sources, was successfully applied to open-field and underwater noise problems. Through numerical applications, the acoustic energy density distributions due to vibration of a simple plate model and a sphere model were compared with those of a commercial code, and the comparison showed good agreement in the level and pattern of the energy density distributions.

  1. Data Assimilation in the Solar Wind: Challenges and First Results

    NASA Astrophysics Data System (ADS)

    Lang, Matthew; Browne, Phil; van Leeuwen, Peter Jan; Owens, Matt

    2017-04-01

    Data assimilation (DA) is currently underused in the solar wind field as a means of improving modelled variables using observations. Data assimilation has been used in Numerical Weather Prediction (NWP) models with great success, and the improvement of DA methods in NWP modelling has led to improvements in forecasting skill over the past 20-30 years. The state-of-the-art DA methods developed for NWP modelling have never been applied to space weather models, so it is important to implement the improvements that can be gained from these methods to improve our understanding of the solar wind and how to model it. The ENLIL solar wind model has been coupled to the EMPIRE data assimilation library in order to apply these advanced data assimilation methods to a space weather model. This coupling allows multiple data assimilation methods to be applied to ENLIL with relative ease. I shall discuss twin experiments that have been undertaken, applying the LETKF to the ENLIL model when a CME occurs in the observation and when it does not. These experiments show that there is potential in the application of advanced data assimilation methods to the solar wind field; however, there is still a long way to go before they can be applied effectively. I shall discuss these issues and suggest potential avenues for future research in this area.
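
    For readers unfamiliar with ensemble data assimilation, the sketch below shows a generic stochastic ensemble Kalman filter analysis step. The study itself uses the LETKF (a localized, deterministic ensemble transform variant) within the EMPIRE library; the shapes, linear observation operator and stochastic-perturbation choice here are illustrative textbook assumptions, not the study's implementation.

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng=np.random.default_rng(0)):
    """Generic stochastic ensemble Kalman filter analysis step.

    X : (n_state, n_ens) forecast ensemble (e.g. solar-wind speed on a grid)
    y : (n_obs,) observation vector
    H : (n_obs, n_state) linear observation operator
    R : (n_obs, n_obs) observation-error covariance
    Returns the analysis ensemble.
    """
    n_state, n_ens = X.shape
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    Pf_Ht = A @ (H @ A).T / (n_ens - 1)            # Pf H^T from the ensemble
    S = H @ Pf_Ht + R                              # innovation covariance
    K = Pf_Ht @ np.linalg.inv(S)                   # Kalman gain
    # Perturb the observations so the analysis spread stays consistent.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)
```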

  2. Statistically qualified neuro-analytic failure detection method and system

    DOEpatents

    Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.

    2002-03-02

    An apparatus and method for monitoring a process involve the development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. Illustrative of the method and apparatus, the method is applied to a peristaltic pump system.
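
    The on-line validation step relies on the sequential probability ratio test (SPRT), a standard algorithm. A minimal sketch is given below for Gaussian residuals with a hypothesized mean shift; the residual model, fault magnitude and error rates are illustrative assumptions, not the patented SQNA implementation.

```python
import numpy as np

def sprt(residuals, sigma, mean_fault, alpha=0.01, beta=0.01):
    """Sequential probability ratio test on model residuals.

    Tests H0: residuals ~ N(0, sigma^2) (model agrees with the process)
    against H1: residuals ~ N(mean_fault, sigma^2) (a shift indicating change).
    Returns 'H0', 'H1', or 'continue'."""
    A = np.log(beta / (1 - alpha))        # lower decision threshold
    B = np.log((1 - beta) / alpha)        # upper decision threshold
    llr = 0.0
    for r in residuals:
        # Log-likelihood ratio increment for a Gaussian mean shift.
        llr += (mean_fault * r - 0.5 * mean_fault**2) / sigma**2
        if llr >= B:
            return "H1"    # declare a process change
        if llr <= A:
            return "H0"    # accept the no-change hypothesis
    return "continue"      # more data needed to decide
```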

  3. Activity analysis: contributions to the innovation of projects for aircrafts cabins.

    PubMed

    Rossi, N T; Greghi, F M; Menegon, L N; Souza, G B J

    2012-01-01

    This article presents results obtained from an ergonomics intervention in a project for the design of aircraft cabins. The study's aim is to analyze the contribution of the method adopted for analyzing passengers' activities in reference situations (real-use situations in aircraft cabins), applied to typical activities performed by people in their own environment. Within this perspective, the study presents two analyses that highlight the use of an electronic device. The first was recorded by filming during a real commercial flight; in the second, the device is used within a domestic environment. The same method, based on activity analysis, was applied in both contexts: filming the activity, analyzing postures and actions, conducting self-confrontation interviews, reconstructing the course of action, and elaborating posture envelopes. The results indicate that the developed method can be applied to different contexts, revealing different ways of occupying space to meet personal needs while performing an activity, which can help anticipate users' needs and point to possibilities for innovation.

  4. Integrated Multiscale Modeling of Molecular Computing Devices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory Beylkin

    2012-03-23

    Significant advances were made on all objectives of the research program. We have developed fast multiresolution methods for performing electronic structure calculations, with emphasis on constructing efficient representations of functions and operators. We extended our approach to problems of scattering in solids, i.e. constructing fast algorithms for computing above the Fermi energy level. Part of the work was done in collaboration with Robert Harrison and George Fann at ORNL. Specific results (in part supported by this grant) are listed here and are described in greater detail. (1) We have implemented a fast algorithm to apply the Green's function for the free space (oscillatory) Helmholtz kernel. The algorithm maintains its speed and accuracy when the kernel is applied to functions with singularities. (2) We have developed a fast algorithm for applying periodic and quasi-periodic, oscillatory Green's functions and those with boundary conditions on simple domains. Importantly, the algorithm maintains its speed and accuracy when applied to functions with singularities. (3) We have developed a fast algorithm for obtaining and applying multiresolution representations of periodic and quasi-periodic Green's functions and Green's functions with boundary conditions on simple domains. (4) We have implemented modifications to improve the speed of adaptive multiresolution algorithms for applying operators which are represented via a Gaussian expansion. (5) We have constructed new nearly optimal quadratures for the sphere that are invariant under the icosahedral rotation group. (6) We obtained new results on approximation of functions by exponential sums and/or rational functions, one of the key methods that allows us to construct separated representations for Green's functions. (7) We developed a new fast and accurate reduction algorithm for obtaining optimal approximation of functions by exponential sums and/or their rational representations.

  5. Aerospace reliability applied to biomedicine.

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  6. Microbial sequencing methods for monitoring of anaerobic treatment of antibiotics to optimize performance and prevent system failure.

    PubMed

    Aydin, Sevcan

    2016-06-01

    As a result of developments in molecular technologies and the use of sequencing technologies, analysis of the anaerobic microbial community in biological treatment processes has become increasingly prevalent. This review examines the ways in which microbial sequencing methods can be applied to achieve an extensive understanding of the phylogenetic and functional characteristics of microbial assemblages in anaerobic reactors when the substrate is contaminated by antibiotics, which are among the most important toxic compounds. It discusses some of the advantages and disadvantages associated with the more commonly employed microbial sequencing techniques and assesses how a combination of the existing methods may be applied to develop a more comprehensive understanding of microbial communities and improve the validity and depth of the results, enhancing the stability of anaerobic reactors.

  7. 76 FR 77563 - Florida Power & Light Company; St. Lucie Plant, Unit No. 1; Exemption

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ....2, because the P-T limits developed for St. Lucie, Unit 1, use a finite element method to determine... Code for calculating K Im factors, and instead applies FEM [finite element modeling] methods for...

  8. Life Prediction/Reliability Data of Glass-Ceramic Material Determined for Radome Applications

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.

    2002-01-01

    Brittle ceramic materials are candidates for a variety of structural applications over a wide range of temperatures. However, the process of slow crack growth, occurring in any loading configuration, limits the service life of structural components. Therefore, it is important to accurately determine the slow crack growth parameters required for component life prediction using an appropriate test methodology. This test methodology should also be useful in determining the influence of component processing and composition variables on the slow crack growth behavior of newly developed or existing materials, thereby allowing the component processing and composition to be tailored and optimized to specific needs. Through the American Society for Testing and Materials (ASTM), the authors recently developed two test methods to determine the life prediction parameters of ceramics. The two test standards, ASTM C 1368 for room temperature and ASTM C 1465 for elevated temperatures, were published in the 2001 Annual Book of ASTM Standards, Vol. 15.01. Briefly, the test method employs constant stress-rate (or dynamic fatigue) testing to determine flexural strengths as a function of the applied stress rate. The merit of this test method lies in its simplicity: strengths are measured in a routine manner in flexure at four or more applied stress rates with an appropriate number of test specimens at each applied stress rate. The slow crack growth parameters necessary for life prediction are then determined from a simple relationship between the strength and the applied stress rate. Extensive life prediction testing was conducted at the NASA Glenn Research Center using the developed ASTM C 1368 test method to determine the life prediction parameters of a glass-ceramic material that the Navy will use for radome applications.
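
    The relationship referred to above is commonly written as log(strength) = [1/(n+1)] log(stress rate) + log D, so the slow crack growth exponent n follows from the slope of a log-log fit. The sketch below shows that fit with entirely hypothetical strength data; it is not the Glenn test data.

```python
import numpy as np

# Hypothetical constant stress-rate (dynamic fatigue) data:
# applied stress rates (MPa/s) and mean flexural strengths (MPa).
stress_rate = np.array([0.1, 1.0, 10.0, 100.0])
strength = np.array([180.0, 195.0, 212.0, 230.0])

# Linear fit in log-log space: log(strength) = slope*log(rate) + intercept.
slope, intercept = np.polyfit(np.log10(stress_rate), np.log10(strength), 1)
n = 1.0 / slope - 1.0     # slow crack growth exponent
D = 10.0 ** intercept     # strength at unit stress rate
print(f"SCG parameter n = {n:.1f}, D = {D:.1f} MPa")
```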

  9. Ultrahigh-performance liquid chromatography/electrospray ionization linear ion trap Orbitrap mass spectrometry of antioxidants (amines and phenols) applied in lubricant engineering.

    PubMed

    Kassler, Alexander; Pittenauer, Ernst; Doerr, Nicole; Allmaier, Guenter

    2014-01-15

    For the qualification and quantification of antioxidants (aromatic amines and sterically hindered phenols), most of them applied as lubricant additives, two ultrahigh-performance liquid chromatography (UHPLC) electrospray ionization mass spectrometric methods, applying the positive and negative ion modes, have been developed for lubricant design and engineering, thus allowing, for example, the study of the degradation of lubricants. Based on the different chemical properties of the two groups of antioxidants, two methods offering a fast separation (10 min) without prior derivatization were developed. To meet these requirements, UHPLC was coupled with an LTQ Orbitrap hybrid tandem mass spectrometer with positive and negative ion electrospray ionization for simultaneous detection of spectra from UHPLC-high-resolution (HR)-MS (full scan mode) and UHPLC-low-resolution linear ion trap MS(2) (LITMS(2)), which we term UHPLC/HRMS-LITMS(2). All 20 analytes investigated could be qualified by a UHPLC/HRMS-LITMS(2) approach consisting of simultaneous UHPLC/HRMS (elemental composition) and UHPLC/LITMS(2) (diagnostic product ions) according to EC guidelines. Quantification was based on a UHPLC/LITMS(2) approach because of its increased sensitivity and selectivity compared to UHPLC/HRMS. Absolute quantification was only feasible for seven analytes with well-specified purity of reference standards, whereas relative quantification was obtainable for another nine antioxidants. All of them showed good standard deviation and repeatability. The combined methods allow qualitative and quantitative determination of a wide variety of antioxidants, including aminic/phenolic compounds applied in lubricant engineering. These data show that the developed methods will be versatile tools for further research on identification and characterization of the thermo-oxidative degradation products of antioxidants in lubricants. Copyright © 2013 John Wiley & Sons, Ltd.

  10. Development and validation of dissolution study of sustained release dextromethorphan hydrobromide tablets.

    PubMed

    Rajan, Sekar; Colaco, Socorrina; Ramesh, N; Meyyanathan, Subramania Nainar; Elango, K

    2014-02-01

    This study describes the development and validation of dissolution tests for sustained-release dextromethorphan hydrobromide tablets using an HPLC method. Chromatographic separation was achieved on a C18 column utilizing 0.5% triethylamine (pH 7.5) and acetonitrile in the ratio of 50:50. The detection wavelength was 280 nm. The method was validated and the response was found to be linear in the drug concentration range of 10-80 microg mL(-1). The suitable conditions were decided after testing sink conditions, dissolution medium and agitation intensity. The best dissolution conditions tested for dextromethorphan hydrobromide were applied to appraise the dissolution profiles. The method was established to have sufficient intermediate precision, as similar separation was achieved on another instrument handled by different operators. Mean recovery was 101.82%. Intra-run precisions (% RSD) for three different concentrations were 1.23, 1.10, 0.72 and 1.57, 1.69, 0.95, and inter-run precisions were 0.83, 1.36 and 1.57%, respectively. The method was successfully applied to dissolution study of the developed dextromethorphan hydrobromide tablets.

  11. Development of achiral and chiral 2D HPLC methods for analysis of albendazole metabolites in microsomal fractions using multivariate analysis for the in vitro metabolism.

    PubMed

    Belaz, Kátia Roberta A; Pereira-Filho, Edenir Rodrigues; Oliveira, Regina V

    2013-08-01

    In this work, the development of two multidimensional liquid chromatography methods coupled to a fluorescence detector is described for the direct analysis of microsomal fractions obtained from rat livers. The chiral multidimensional method was then applied to the optimization of the in vitro metabolism of albendazole by experimental design. Albendazole was selected as a model drug because of its anthelmintic properties and recent potential for cancer treatment. The development of two fully automated achiral-chiral and chiral-chiral high performance liquid chromatography (HPLC) methods for the determination of albendazole (ABZ) and its metabolites albendazole sulphoxide (ABZ-SO), albendazole sulphone (ABZ-SO2) and albendazole 2-aminosulphone (ABZ-SO2NH2) in microsomal fractions is described. These methods involve the use of a phenyl (RAM-phenyl-BSA) or octyl (RAM-C8-BSA) restricted access media bovine serum albumin column for sample clean-up, followed by an achiral phenyl column (15.0 × 0.46 cm I.D.) or a chiral amylose tris(3,5-dimethylphenylcarbamate) column (15.0 × 0.46 cm I.D.). The chiral 2D HPLC method was applied to the development of a compromise condition for the in vitro metabolism of ABZ by means of experimental design involving multivariate analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Evaluation of qPCR curve analysis methods for reliable biomarker discovery: bias, resolution, precision, and implications.

    PubMed

    Ruijter, Jan M; Pfaffl, Michael W; Zhao, Sheng; Spiess, Andrej N; Boggy, Gregory; Blom, Jochen; Rutledge, Robert G; Sisti, Davide; Lievens, Antoon; De Preter, Katleen; Derveaux, Stefaan; Hellemans, Jan; Vandesompele, Jo

    2013-01-01

    RNA transcripts such as mRNA or microRNA are frequently used as biomarkers to determine disease state or response to therapy. Reverse transcription (RT) in combination with quantitative PCR (qPCR) has become the method of choice to quantify small amounts of such RNA molecules. In parallel with the democratization of RT-qPCR and its increasing use in biomedical research or biomarker discovery, we witnessed a growth in the number of gene expression data analysis methods. Most of these methods are based on the principle that the position of the amplification curve with respect to the cycle-axis is a measure for the initial target quantity: the later the curve, the lower the target quantity. However, most methods differ in the mathematical algorithms used to determine this position, as well as in the way the efficiency of the PCR reaction (the fold increase of product per cycle) is determined and applied in the calculations. Moreover, there is dispute about whether the PCR efficiency is constant or continuously decreasing. Together, this has led to the development of different methods to analyze amplification curves. In published comparisons of these methods, available algorithms were typically applied in a restricted or outdated way, which does not do them justice. Therefore, we aimed at the development of a framework for robust and unbiased assessment of curve analysis performance whereby various publicly available curve analysis methods were thoroughly compared using a previously published large clinical data set (Vermeulen et al., 2009) [11]. The original developers of these methods applied their algorithms and are co-authors of this study. We assessed the curve analysis methods' impact on transcriptional biomarker identification in terms of expression level, statistical significance, and patient-classification accuracy. The concentration series per gene, together with data sets from unpublished technical performance experiments, were analyzed in order to assess the algorithms' precision, bias, and resolution. While large differences exist between methods when considering the technical performance experiments, most methods perform relatively well on the biomarker data. The data and the analysis results per method are made available to serve as a benchmark for further development and evaluation of qPCR curve analysis methods (http://qPCRDataMethods.hfrc.nl). Copyright © 2012 Elsevier Inc. All rights reserved.
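
    The shared principle described above (curve position measures starting quantity) reduces to a one-line back-calculation once a quantification cycle and a PCR efficiency have been estimated. The sketch below illustrates only that principle with made-up values; the compared methods differ precisely in how those two quantities are derived from the raw fluorescence curve.

```python
import numpy as np

def target_quantity(cq, efficiency, fluorescence_at_cq=1.0):
    """Back-calculate the relative initial target quantity from a qPCR curve.

    cq         : quantification cycle (position of the curve on the cycle axis)
    efficiency : fold increase of product per cycle (2.0 = perfect doubling)
    Returns N0 such that N0 * efficiency**cq equals the threshold fluorescence.
    """
    return fluorescence_at_cq / efficiency ** np.asarray(cq)

# A curve appearing one cycle later corresponds to ~half the starting material
# when the efficiency is 2 (perfect doubling).
print(target_quantity([20.0, 21.0], efficiency=2.0))
```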

  13. A Framework Applied Three Ways: Responsive Methods of Co-Developing and Implementing Community Science Solutions for Local Impact

    NASA Astrophysics Data System (ADS)

    Goodwin, M.; Pandya, R.; Udu-gama, N.; Wilkins, S.

    2017-12-01

    While one-size-fits-all may work for most hats, it rarely does for communities. Research products, methods and knowledge may be usable at a local scale, but applying them often presents a challenge due to issues like availability, accessibility, awareness, lack of trust, and time. However, in an environment with diminishing federal investment in issues related to climate change, natural hazards, and natural resource use and management, the ability of communities to access and leverage science has never been more urgent. Established yet responsive frameworks and methods can help scientists and communities work together to identify and address specific challenges and leverage science to make a local impact. Through the launch of over 50 community science projects since 2013, the Thriving Earth Exchange (TEX) has created a living framework consisting of a set of milestones by which teams of scientists and community leaders navigate the challenges of working together. Central to the framework are context, trust, project planning and refinement, relationship management and community impact. We find that careful and respectful partnership management results in trust and an open exchange of information. Community science partnerships grounded in local priorities result in the development and exchange of stronger decision-relevant tools, resources and knowledge. This presentation will explore three methods TEX uses to apply its framework to community science partnerships: cohort-based collaboration, online dialogues, and one-on-one consultation. The choice of method should be responsive to a community's needs and working style. For example, a community may require customized support, desire the input and support of peers, or require consultation with multiple experts before deciding on a course of action. Knowing and applying the method of engagement best suited to achieve the community's objectives will ensure that the science is most effectively translated and applied.

  14. Developing a model for the adequate description of electronic communication in hospitals.

    PubMed

    Saboor, Samrend; Ammenwerth, Elske

    2011-01-01

    Adequate information and communication technology (ICT) systems can help to improve communication in hospitals. Changes to the ICT infrastructure of hospitals must be planned carefully. In order to support comprehensive planning, we presented a classification of 81 common errors of electronic communication at the MIE 2008 congress. Our objective now was to develop a data model that defines specific requirements for an adequate description of electronic communication processes. We first applied the method of explicating qualitative content analysis to the error categorization in order to determine the essential process details. After this, we applied the method of subsuming qualitative content analysis to the results of the first step. The result is a data model for the adequate description of electronic communication, comprising 61 entities and 91 relationships. The data model comprises and organizes all details that are necessary for the detection of the respective errors. It can either be used to extend the capabilities of existing modeling methods or serve as a basis for the development of a new approach.

  15. Spectrophotometric methods for the determination of benazepril hydrochloride in its single and multi-component dosage forms.

    PubMed

    El-Yazbi, F A; Abdine, H H; Shaalan, R A

    1999-06-01

    Three sensitive and accurate methods are presented for the determination of benazepril in its dosage forms. The first method uses derivative spectrophotometry to resolve the interference due to the formulation matrix. The second method depends on the color formed by the reaction of the drug with bromocresol green (BCG). The third utilizes the reaction of benazepril, after alkaline hydrolysis, with 3-methyl-2-benzothiazolinone hydrazone (MBTH), where the produced color is measured at 593 nm. The latter method was extended to develop a stability-indicating method for this drug. Moreover, the derivative method was applied to the determination of benazepril in its combination with hydrochlorothiazide. The proposed methods were applied to the analysis of benazepril in the pure form and in tablets. The coefficient of variation was less than 2%.

  16. Experimental Validation of the Dynamic Inertia Measurement Method to Find the Mass Properties of an Iron Bird Test Article

    NASA Technical Reports Server (NTRS)

    Chin, Alexander W.; Herrera, Claudia Y.; Spivey, Natalie D.; Fladung, William A.; Cloutier, David

    2015-01-01

    The mass properties of an aerospace vehicle are required by multiple disciplines in the analysis and prediction of flight behavior. Pendulum oscillation methods have been developed and employed for almost a century as a means to measure mass properties. However, these oscillation methods are costly, time consuming, and risky. The NASA Armstrong Flight Research Center has been investigating the Dynamic Inertia Measurement, or DIM method as a possible alternative to oscillation methods. The DIM method uses ground test techniques that are already applied to aerospace vehicles when conducting modal surveys. Ground vibration tests would require minimal additional instrumentation and time to apply the DIM method. The DIM method has been validated on smaller test articles, but has not yet been fully proven on large aerospace vehicles.

  17. COMPARISON OF MONTE CARLO METHODS FOR NONLINEAR RADIATION TRANSPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    W. R. MARTIN; F. B. BROWN

    2001-03-01

    Five Monte Carlo methods for solving the nonlinear thermal radiation transport equations are compared. The methods include the well-known Implicit Monte Carlo method (IMC) developed by Fleck and Cummings, an alternative to IMC developed by Carter and Forest, an ''exact'' method recently developed by Ahrens and Larsen, and two methods recently proposed by Martin and Brown. The five Monte Carlo methods are developed and applied to the radiation transport equation in a medium assuming local thermodynamic equilibrium. Conservation of energy is derived and used to define appropriate material energy update equations for each of the methods. Details of the Monte Carlo implementation are presented, both for the random walk simulation and the material energy update. Simulation results for all five methods are obtained for two infinite medium test problems and a 1-D test problem, all of which have analytical solutions. Conclusions regarding the relative merits of the various schemes are presented.

  18. Developing Creativity through Collaborative Problem Solving

    ERIC Educational Resources Information Center

    Albert, Lillie R.; Kim, Rina

    2013-01-01

    This paper discusses an alternative approach for developing problem solving experiences for students. The major argument is that students can develop their creativity by engaging in collaborative problem solving activities in which they apply a variety of mathematical methods creatively to solve problems. The argument is supported by: considering…

  19. Reliability of tanoak volume equations when applied to different areas

    Treesearch

    Norman H. Pillsbury; Philip M. McDonald; Victor Simon

    1995-01-01

    Tree volume equations for tanoak (Lithocarpus densiflorus) were developed for seven stands throughout its natural range and compared by a volume prediction and a parameter difference method. The objective was to test if volume estimates from a species growing in a local, relatively uniform habitat could be applied more widely. Results indicated...

  20. Updated generalized biomass equations for North American tree species

    Treesearch

    David C. Chojnacky; Linda S. Heath; Jennifer C. Jenkins

    2014-01-01

    Historically, tree biomass at large scales has been estimated by applying dimensional analysis techniques and field measurements such as diameter at breast height (dbh) in allometric regression equations. Equations often have been developed using differing methods and applied only to certain species or isolated areas. We previously had compiled and combined (in meta-...

  1. Semantics of User Interface for Image Retrieval: Possibility Theory and Learning Techniques.

    ERIC Educational Resources Information Center

    Crehange, M.; And Others

    1989-01-01

    Discusses the need for a rich semantics for the user interface in interactive image retrieval and presents two methods for building such interfaces: possibility theory applied to fuzzy data retrieval, and a machine learning technique applied to learning the user's deep need. Prototypes developed using videodisks and knowledge-based software are…

  2. Simultaneous determination of binary mixture of amlodipine besylate and atenolol based on dual wavelengths

    NASA Astrophysics Data System (ADS)

    Lamie, Nesrine T.

    2015-10-01

    Four accurate, precise, and sensitive spectrophotometric methods are developed for the simultaneous determination of a binary mixture of amlodipine besylate (AM) and atenolol (AT). AM is determined at its λmax 360 nm (0D), while atenolol can be determined by four different methods. Method (A) is absorption factor (AF). Method (B) is the new ratio difference method (RD), which measures the difference in amplitudes between 210 and 226 nm. Method (C) is the novel constant center spectrophotometric method (CC). Method (D) is mean centering of the ratio spectra (MCR) at 284 nm. The methods are tested by analyzing synthetic mixtures of the cited drugs and are applied to their commercial pharmaceutical preparation. The validity of the results is assessed by applying the standard addition technique. The results obtained are found to agree statistically with those obtained by official methods, showing no significant difference with respect to accuracy and precision.

  3. Path Integral Monte Carlo Simulations of Warm Dense Matter and Plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Militzer, Burkhard

    2018-01-13

    New path integral Monte Carlo (PIMC) simulation techniques will be developed and applied to derive the equation of state (EOS) for the regime of warm dense matter and dense plasmas where existing first-principles methods cannot be applied. While standard density functional theory has been used to accurately predict the structure of many solids and liquids up to temperatures on the order of 10,000 K, this method is not applicable at much higher temperatures where electronic excitations become important, because the number of partially occupied electronic orbitals reaches intractably large numbers and, more importantly, the use of zero-temperature exchange-correlation functionals introduces an uncontrolled approximation. Here we focus on PIMC methods that become more and more efficient with increasing temperature and still include all electronic correlation effects. In this approach, electronic excitations increase the efficiency rather than reduce it. While it has commonly been assumed that such methods can only be applied to elements without core electrons, like hydrogen and helium, we recently showed how to extend PIMC to heavier elements by performing the first PIMC simulations of carbon and water plasmas [Driver, Militzer, Phys. Rev. Lett. 108 (2012) 115502]. Here we propose to continue this important development to extend the reach of PIMC simulations to yet heavier elements and also lower temperatures. The goal is to provide a robust first-principles simulation method that can accurately and efficiently study materials with excited electrons at solid-state densities in order to access parts of the phase diagram, such as the regime of warm dense matter and plasmas, where so far only more approximate, semi-analytical methods could be applied.

  4. A robust quantitative near infrared modeling approach for blend monitoring.

    PubMed

    Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A

    2018-01-30

    This study demonstrates a material-sparing near-infrared modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development approach (blender-scale method) were compared by simultaneously monitoring a 1 kg batch-size blend run. Both models demonstrated similar performance. The small-scale strategy significantly reduces the total resources expended to develop near-infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.

  5. Parent Training: A Review of Methods for Children with Developmental Disabilities

    ERIC Educational Resources Information Center

    Matson, Johnny L.; Mahan, Sara; LoVullo, Santino V.

    2009-01-01

    Great strides have been made in the development of skills and procedures to aid children with developmental disabilities to establish maximum independence and quality of life. Paramount among the treatment methods that have empirical support are treatments based on applied behavior analysis. These methods are often very labor intensive. Thus,…

  6. Optimal Stratification of Item Pools in a-Stratified Computerized Adaptive Testing.

    ERIC Educational Resources Information Center

    Chang, Hua-Hua; van der Linden, Wim J.

    2003-01-01

    Developed a method based on 0-1 linear programming to stratify an item pool optimally for use in alpha-stratified adaptive testing. Applied the method to a previous item pool from the computerized adaptive test of the Graduate Record Examinations. Results show the new method performs well in practical situations. (SLD)

  7. Probabilistic framework for product design optimization and risk management

    NASA Astrophysics Data System (ADS)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practice, but it is currently still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process so that it is applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
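
    A minimal Monte Carlo load-resistance calculation of the kind the recommended process builds on is sketched below. The distributions, parameters and units are invented for illustration; a real application would use calibrated load and strength models for the component in question.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Hypothetical load-resistance model: both lognormally distributed (MPa).
load = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)
resistance = rng.lognormal(mean=np.log(450.0), sigma=0.08, size=n)

# Failure occurs whenever the applied load exceeds the component's resistance.
p_fail = np.mean(load > resistance)
print(f"Estimated failure probability: {p_fail:.2e}")
```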

  8. An Autonomous Sensor Tasking Approach for Large Scale Space Object Cataloging

    NASA Astrophysics Data System (ADS)

    Linares, R.; Furfaro, R.

    The field of Space Situational Awareness (SSA) has progressed over the last few decades with new sensors coming online, the development of new approaches for making observations, and new algorithms for processing them. Although there has been success in the development of new approaches, a missing piece is the translation of SSA goals into sensor and resource allocation, otherwise known as the Sensor Management Problem (SMP). This work solves the SMP using an artificial intelligence approach called Deep Reinforcement Learning (DRL). Stable methods for training DRL approaches based on neural networks exist, but most of these approaches are not suitable for high-dimensional systems. The Asynchronous Advantage Actor-Critic (A3C) method is a recently developed and effective approach for high-dimensional systems, and this work leverages these results and applies the approach to decision making in SSA. The decision space for SSA problems can be high dimensional, even for the tasking of a single telescope: since the number of SOs in space is relatively high, each sensor will have a large number of possible actions at a given time. Therefore, efficient DRL approaches are required when solving the SMP for SSA. This work develops an A3C-based DRL method for SSA sensor tasking. One of the key benefits of DRL approaches is the ability to handle high-dimensional data. For example, DRL methods have been applied to image processing for autonomous driving, where a 256x256 RGB image has 196,608 input values (256*256*3 = 196,608), which is very high dimensional, and deep learning approaches routinely take images like this as inputs. Therefore, when applied to the whole catalog, the DRL approach offers the ability to solve this high-dimensional problem. This work has the potential to solve, for the first time, the non-myopic sensor tasking problem for the whole SO catalog (over 22,000 objects), providing a truly revolutionary result.

  9. An Evaluation of the Effectiveness of U.S. Naval Aviation Crew Resource Management Training Programs: A Reassessment for the Twenty-First Century Operating Environment

    DTIC Science & Technology

    2009-06-01

    ...development within each aviation community. Kirkpatrick's (1976) hierarchy of training evaluation technique was applied to examine three levels of... Applying methods and techniques used in previous CRM evaluation research, this thesis provided an updated evaluation of the Naval CRM program to fill

  10. A predictive estimation method for carbon dioxide transport by data-driven modeling with a physically-based data model.

    PubMed

    Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young; Jun, Seong-Chun; Choung, Sungwook; Yun, Seong-Taek; Oh, Junho; Kim, Hyun-Jun

    2017-11-01

    In this study, a data-driven method for predicting CO2 leaks and associated concentrations from geological CO2 sequestration is developed. Several candidate models are compared based on their reproducibility and predictive capability for CO2 concentration measurements from the Environment Impact Evaluation Test (EIT) site in Korea. Based on the data mining results, a one-dimensional solution of the advective-dispersive equation for steady flow (i.e., the Ogata-Banks solution) is found to be most representative of the test data, and this model is adopted as the data model for the developed method. In the validation step, the method is applied to estimate future CO2 concentrations against the reference estimation by the Ogata-Banks solution, with part of the earlier data used as the training dataset. From the analysis, it is found that the ensemble mean of multiple estimations based on the developed method shows high prediction accuracy relative to the reference estimation. In addition, the majority of the data to be predicted are included in the proposed quantile interval, which suggests adequate representation of the uncertainty by the developed method. Therefore, the incorporation of a reasonable physically-based data model enhances the prediction capability of the data-driven model. The proposed method is not confined to estimations of CO2 concentration and may be applied to various real-time monitoring data from subsurface sites to develop automated control, management or decision-making systems. Copyright © 2017 Elsevier B.V. All rights reserved.
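
    For reference, the Ogata-Banks solution adopted as the physically based data model has the closed form C(x,t) = (C0/2)[erfc((x - vt)/(2*sqrt(Dt))) + exp(vx/D)*erfc((x + vt)/(2*sqrt(Dt)))]. The sketch below evaluates it with hypothetical parameter values; the data-driven ensemble layer built around it in the study is not shown.

```python
import numpy as np
from scipy.special import erfc

def ogata_banks(x, t, v, D, c0):
    """1-D advective-dispersive transport under steady flow with a
    constant-concentration inlet (Ogata-Banks solution).

    x : distance from the source (m), t : time (s),
    v : seepage velocity (m/s), D : dispersion coefficient (m^2/s),
    c0: source concentration."""
    x, t = np.asarray(x, float), np.asarray(t, float)
    term1 = erfc((x - v * t) / (2.0 * np.sqrt(D * t)))
    term2 = np.exp(v * x / D) * erfc((x + v * t) / (2.0 * np.sqrt(D * t)))
    return 0.5 * c0 * (term1 + term2)

# Example breakthrough curve at x = 10 m with hypothetical parameters.
t = np.linspace(1.0, 2e6, 200)
c = ogata_banks(x=10.0, t=t, v=1e-5, D=1e-4, c0=1.0)
```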

  11. Real-time determination of sarcomere length of a single cardiomyocyte during contraction

    PubMed Central

    Kalda, Mari; Vendelin, Marko

    2013-01-01

    Sarcomere length of a cardiomyocyte is an important control parameter for physiology studies at the single-cell level; for instance, its accurate determination in real time is essential for performing single-cardiomyocyte contraction experiments. The aim of this work is to develop an efficient and accurate method for estimating the mean sarcomere length of a contracting cardiomyocyte using microscopy images as input. The novelty of the developed method lies in 1) using an unbiased measure of similarities to eliminate systematic errors of conventional autocorrelation function (ACF)-based methods when applied to a region of interest of an image, 2) using a semianalytical, seminumerical approach for evaluating the similarity measure to take into account the spatial dependence of neighboring image pixels, and 3) using a detrend algorithm to extract the sarcomere striation pattern content from the microscopy images. The developed sarcomere length estimation procedure has superior computational efficiency and estimation accuracy compared with the conventional ACF and spectral analysis-based methods using the fast Fourier transform. As shown by analyzing synthetic images with known periodicity, the estimates obtained by the developed method are more accurate at the subpixel level than those obtained using ACF analysis. When applied in practice to rat cardiomyocytes, our method was found to be robust to the choice of the region of interest, which may 1) include projections of carbon fibers and the nucleus, 2) have uneven background, and 3) be slightly disoriented with respect to the average direction of the sarcomere striation pattern. The developed method is implemented in open-source software. PMID:23255581
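
    For orientation, the sketch below shows the conventional ACF-based periodicity estimate that the paper improves upon (it is not the paper's unbiased, subpixel method): detrend a 1-D intensity profile taken along the striation direction, autocorrelate it, and read the striation period from the first periodic peak.

```python
import numpy as np

def sarcomere_length_acf(profile, pixel_size_um):
    """Estimate mean sarcomere length from a 1-D intensity profile using the
    conventional autocorrelation approach (baseline method only)."""
    p = np.asarray(profile, float)
    idx = np.arange(p.size)
    p = p - np.polyval(np.polyfit(idx, p, 1), idx)   # remove linear trend
    # Autocorrelation of the detrended profile, lags 0 .. N-1.
    acf = np.correlate(p, p, mode="full")[p.size - 1:]
    acf /= acf[0]
    # Locate the first valley, then take the strongest peak after it as the
    # striation period (in pixels).
    d = np.diff(acf)
    valleys = np.where((d[:-1] <= 0) & (d[1:] > 0))[0]
    start = valleys[0] + 1 if valleys.size else 1
    period_px = start + np.argmax(acf[start:])
    return period_px * pixel_size_um
```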

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weiss, Paul

    Spectroscopic imaging tools and methods, based on scanning tunneling microscopes (STMs), are being developed and applied to examine buried layers and interfaces with ultrahigh resolution. These new methods measure buried contacts, molecule-substrate bonds, buried dipoles in molecular layers, and key structural aspects of adsorbed molecules, such as tilt angles. We are developing the ability to locate lateral projections of molecular parts as a means of determining the structures of molecular layers. We are developing the ability to measure the orientation of buried functionality.

  13. Synthesis of coupled resonator optical waveguides by cavity aggregation.

    PubMed

    Muñoz, Pascual; Doménech, José David; Capmany, José

    2010-01-18

    In this paper, the layer aggregation method is applied to coupled resonator optical waveguides. Starting from the frequency transfer function, the method yields the coupling constants between the resonators. The convergence of the algorithm developed is examined and the related parameters discussed.

  14. METHODS FOR MONITORING THE EFFECTS OF ENVIRONMENTAL TOXINS ON THE VISUAL SYSTEM.

    EPA Science Inventory

    A high percentage of neurotoxic compounds adversely affect the visual system. Our goal is to apply the tools of vision science to problems of toxicological import: exposure-related alterations in visual physiology, psychophysical function, and ocular development. Methods can ...

  15. EVALUATION OF METHODS FOR SAMPLING, RECOVERY, AND ENUMERATION OF BACTERIA APPLIED TO THE PHYLLOPLANE

    EPA Science Inventory

    Determining the fate and survival of genetically engineered microorganisms released into the environment requires the development and application of accurate and practical methods of detection and enumeration. Several experiments were performed to examine quantitative recovery met...

  16. Electron-driven processes in polyatomic molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKoy, Vincent

    2017-03-20

    This project developed and applied scalable computational methods to obtain information about low-energy electron collisions with larger polyatomic molecules. Such collisions are important in modeling radiation damage to living systems, in spark ignition and combustion, and in plasma processing of materials. The focus of the project was to develop efficient methods that could be used to obtain both fundamental scientific insights and data of practical value to applications.

  17. Calibrating reaction rates for the CREST model

    NASA Astrophysics Data System (ADS)

    Handley, Caroline A.; Christie, Michael A.

    2017-01-01

    The CREST reactive-burn model uses entropy-dependent reaction rates that, until now, have been manually tuned to fit shock-initiation and detonation data in hydrocode simulations. This paper describes the initial development of an automatic method for calibrating CREST reaction-rate coefficients, using particle swarm optimisation. The automatic method is applied to EDC32, to help develop the first CREST model for this conventional high explosive.
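
    As a rough illustration of the calibration loop, the sketch below runs a generic particle swarm optimisation over a toy misfit function; the misfit, bounds, and swarm settings are placeholders, not the CREST/EDC32 hydrocode workflow.

        # Generic particle swarm optimisation; `misfit` stands in for a
        # hydrocode-versus-experiment error measure and is purely illustrative.
        import numpy as np

        def pso(misfit, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = np.array(bounds).T
            pos = rng.uniform(lo, hi, size=(n_particles, lo.size))
            vel = np.zeros_like(pos)
            pbest, pbest_val = pos.copy(), np.array([misfit(p) for p in pos])
            gbest = pbest[pbest_val.argmin()].copy()
            for _ in range(n_iter):
                r1, r2 = rng.random((2, n_particles, lo.size))
                vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
                pos = np.clip(pos + vel, lo, hi)
                val = np.array([misfit(p) for p in pos])
                better = val < pbest_val
                pbest[better], pbest_val[better] = pos[better], val[better]
                gbest = pbest[pbest_val.argmin()].copy()
            return gbest, pbest_val.min()

        # Toy misfit: recover the coefficients (2.0, 0.5)
        best, err = pso(lambda c: (c[0] - 2.0) ** 2 + (c[1] - 0.5) ** 2,
                        bounds=[(0.0, 5.0), (0.0, 1.0)])
        print(best, err)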

  18. Finding False Paths in Sequential Circuits

    NASA Astrophysics Data System (ADS)

    Matrosova, A. Yu.; Andreeva, V. V.; Chernyshov, S. V.; Rozhkova, S. V.; Kudin, D. V.

    2018-02-01

    A method for finding false paths in sequential circuits is developed. In contrast with the heuristic approaches in current use, a precise method is suggested, based on applying operations to Reduced Ordered Binary Decision Diagrams (ROBDDs) extracted from the combinational part of a sequential controlling logic circuit. The method allows finding false paths when the transfer sequence length is not more than a given value and obviates the need to investigate combinational circuit equivalents of the given lengths. The possibilities of using the developed method for more complicated circuits are discussed.

  19. Application of remote sensing to reconnaissance geologic mapping and mineral exploration

    NASA Technical Reports Server (NTRS)

    Birnie, R. W.; Dykstra, J. D.

    1978-01-01

    A method of mapping geology at a reconnaissance scale and locating zones of possible hydrothermal alteration has been developed. This method is based on principal component analysis of Landsat digital data and is applied to the desert area of the Chagai Hills, Baluchistan, Pakistan. A method for airborne spectrometric detection of geobotanical anomalies associated with porphyry Cu-Mo mineralization at Heddleston, Montana has also been developed. This method is based on discriminants in the 0.67 micron and 0.79 micron regions of the spectrum.
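
    A minimal sketch of the principal component transform on multiband pixels is given below; the four-band scene is synthetic, and the sketch does not reproduce the full Landsat processing chain.

        # Principal component transform of a (rows, cols, bands) image.
        import numpy as np

        def principal_components(image):
            rows, cols, bands = image.shape
            pixels = image.reshape(-1, bands).astype(float)
            pixels -= pixels.mean(axis=0)              # centre each band
            cov = np.cov(pixels, rowvar=False)         # band covariance matrix
            eigval, eigvec = np.linalg.eigh(cov)       # eigenvalues in ascending order
            order = eigval.argsort()[::-1]             # largest variance first
            pcs = pixels @ eigvec[:, order]            # project pixels onto components
            return pcs.reshape(rows, cols, bands), eigval[order]

        scene = np.random.default_rng(1).normal(size=(100, 100, 4))   # synthetic 4-band scene
        pc_images, variances = principal_components(scene)
        print(variances)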

  20. Use of screening tests to assess cancer risk and to estimate the risk of adult T-cell leukemia/lymphoma.

    PubMed Central

    Yanagawa, T; Tokudome, S

    1990-01-01

    We developed methods to assess cancer risks by screening tests. These methods estimate the size of the high-risk group adjusted for the characteristics of the screening tests and estimate the incidence rates of cancer among the high-risk group adjusted for the characteristics of the tests. A method was also developed for selecting the cut-off point of a screening test. Finally, the methods were applied to estimate the risk of adult T-cell leukemia/lymphoma. PMID:2269244

  1. A Model-Driven Development Method for Management Information Systems

    NASA Astrophysics Data System (ADS)

    Mizuno, Tomoki; Matsumoto, Keinosuke; Mori, Naoki

    Traditionally, a Management Information System (MIS) has been developed without using formal methods. With such informal methods, the MIS is developed over its lifecycle without any models, which causes many problems, such as a lack of reliability of system design specifications. In order to overcome these problems, a model theory approach was proposed. The approach is based on the idea that a system can be modeled by automata and set theory. However, it is very difficult to generate automata of the system to be developed right from the start. On the other hand, there is a model-driven development method that can flexibly accommodate changes of business logic or implementation technologies. In model-driven development, a system is modeled using a modeling language such as UML. This paper proposes a new development method for management information systems that applies the model-driven development method to a component of the model theory approach. The experiment showed that the proposed method reduced development effort by more than 30%.

  2. Resolving individual Shockley partials of a dissociated dislocation by STEM

    NASA Astrophysics Data System (ADS)

    Iwata, Hiroyuki; Saka, Hiroyasu

    2017-02-01

    A practical method was developed to image detailed features of defects in a crystal using STEM. This method is essentially a STEM version of the conventional CTEM g/3g weak beam dark field (WBDF) method. The method was successfully applied to resolving individual Shockley partials of a dissociated dislocation in a Cu-6.44at.%Al alloy.

  3. Rapid quantification of viable Legionella in nuclear cooling tower waters using filter cultivation, fluorescent in situ hybridization and solid-phase cytometry.

    PubMed

    Baudart, J; Guillaume, C; Mercier, A; Lebaron, P; Binet, M

    2015-05-01

    To develop a rapid and sensitive method to quantify viable Legionella spp. in cooling tower water samples. A rapid, culture-based method capable of quantifying as few as 600 Legionella microcolonies per litre within 2 days in industrial waters was developed. The method combines a short cultivation step of microcolonies on GVPC agar plate, specific detection of Legionella cells by a fluorescent in situ hybridization (FISH) approach, and a sensitive enumeration using a solid-phase cytometer. Following optimization of the cultivation conditions, the qualitative and quantitative performance of the method was assessed and the method was applied to 262 nuclear power plant cooling water samples. The performance of this method was in accordance with the culture method (NF-T 90-431) for Legionella enumeration. The rapid detection of viable Legionella in water is a major concern to the effective monitoring of this pathogenic bacterium in the main water sources involved in the transmission of legionellosis infection (Legionnaires' disease). The new method proposed here appears to be a robust, efficient and innovative means for rapidly quantifying cultivable Legionella in cooling tower water samples within 48 h. © 2015 The Society for Applied Microbiology.

  4. Development of a REBCO HTS magnet for Maglev - repeated bending tests of HTS pancake coils -

    NASA Astrophysics Data System (ADS)

    Sugino, Motohiko; Mizuno, Katsutoshi; Tanaka, Minoru; Ogata, Masafumi

    2018-01-01

    In a past study, two manufacturing methods were developed for producing pancake coils from REBCO coated conductors, and it was confirmed that the conductors suffer no electrical degradation caused by the manufacturing methods. In the present study, durability evaluation tests of the pancake coils were conducted as the final evaluation of the coil manufacturing methods, with repeated bending deformation applied to the manufactured coils. The tests confirmed that the pancake coils manufactured by both methods withstand the repeated bending deformation and maintain appropriate mechanical and electrical performance. The fusion bonding method was adopted as the coil manufacturing method for the HTS magnet. Furthermore, a repeated bending test under the excited condition was conducted using a prototype pancake coil manufactured by the fusion bonding method as a test sample. It was thus confirmed that the coil manufactured by the fusion bonding method shows no degradation of electrical performance or mechanical properties even when repeated bending deformation is applied under the excited condition.

  5. A New Green Method for the Quantitative Analysis of Enrofloxacin by Fourier-Transform Infrared Spectroscopy.

    PubMed

    Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes

    2018-05-18

    Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.

  6. Discovery and Development of ATP-Competitive mTOR Inhibitors Using Computational Approaches.

    PubMed

    Luo, Yao; Wang, Ling

    2017-11-16

    The mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. This protein is an attractive target for new anticancer drug development. Significant progress has been made in hit discovery, lead optimization, drug candidate development and determination of the three-dimensional (3D) structure of mTOR. Computational methods have been applied to accelerate the discovery and development of mTOR inhibitors, helping to model the structure of mTOR, screen compound databases, uncover structure-activity relationships (SAR) and optimize the hits, mine the privileged fragments and design focused libraries. In addition, computational approaches have been applied to study protein-ligand interaction mechanisms and in natural product-driven drug discovery. Herein, we survey the most recent progress on the application of computational approaches to advance the discovery and development of compounds targeting mTOR. Future directions in the discovery of new mTOR inhibitors using computational methods are also discussed. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  7. Mending the Gap, An Effort to Aid the Transfer of Formal Methods Technology

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly

    2009-01-01

    Formal methods can be applied to many of the development and verification activities required for civil avionics software. RTCA/DO-178B, Software Considerations in Airborne Systems and Equipment Certification, gives a brief description of using formal methods as an alternate method of compliance with the objectives of that standard. Despite this, the avionics industry at large has been hesitant to adopt formal methods, with few developers having actually used formal methods for certification credit. Why is this so, given the volume of evidence of the benefits of formal methods? This presentation will explore some of the challenges to using formal methods in a certification context and describe the effort by the Formal Methods Subgroup of RTCA SC-205/EUROCAE WG-71 to develop guidance to make the use of formal methods a recognized approach.

  8. Chemometric methods for the simultaneous determination of some water-soluble vitamins.

    PubMed

    Mohamed, Abdel-Maaboud I; Mohamed, Horria A; Mohamed, Niveen A; El-Zahery, Marwa R

    2011-01-01

    Two spectrophotometric methods, derivative and multivariate methods, were applied for the determination of binary, ternary, and quaternary mixtures of the water-soluble vitamins thiamine HCl (I), pyridoxine HCl (II), riboflavin (III), and cyanocobalamin (IV). The first method is divided into first derivative and first derivative of ratio spectra methods, and the second into classical least squares and principal components regression methods. Both methods are based on spectrophotometric measurements of the studied vitamins in 0.1 M HCl solution in the range of 200-500 nm for all components. The linear calibration curves were obtained from 2.5-90 microg/mL, and the correlation coefficients ranged from 0.9991 to 0.9999. These methods were applied for the analysis of the following mixtures: (I) and (II); (I), (II), and (III); (I), (II), and (IV); and (I), (II), (III), and (IV). The described methods were successfully applied for the determination of vitamin combinations in synthetic mixtures and dosage forms from different manufacturers. The recovery ranged from 96.1 +/- 1.2 to 101.2 +/- 1.0% for derivative methods and 97.0 +/- 0.5 to 101.9 +/- 1.3% for multivariate methods. The results of the developed methods were compared with those of reported methods, and gave good accuracy and precision.
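
    As an illustration of the multivariate step, the sketch below applies classical least squares to a synthetic mixture spectrum over 200-500 nm built from hypothetical pure-component bands; the band shapes and concentrations are stand-ins, not the published calibration data.

        # Classical least squares (CLS): model a mixture spectrum as a linear
        # combination of pure-component spectra; everything here is synthetic.
        import numpy as np

        wavelengths = np.linspace(200, 500, 301)       # nm

        def band(center, width, height):
            return height * np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

        K = np.column_stack([
            band(246, 20, 1.0),    # stand-in for thiamine HCl
            band(290, 25, 0.8),    # stand-in for pyridoxine HCl
            band(445, 30, 0.6),    # stand-in for riboflavin
        ])

        true_conc = np.array([10.0, 25.0, 5.0])        # micrograms per mL
        mixture = K @ true_conc + np.random.default_rng(0).normal(0, 1e-3, wavelengths.size)

        conc_est, *_ = np.linalg.lstsq(K, mixture, rcond=None)
        print(conc_est)                                # close to [10, 25, 5]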

  9. Knowledge Consolidation Analysis: Toward a Methodology for Studying the Role of Argument in Technology Development

    ERIC Educational Resources Information Center

    Dyehouse, Jeremiah

    2007-01-01

    Researchers studying technology development often examine how rhetorical activity contributes to technologies' design, implementation, and stabilization. This article offers a possible methodology for studying one role of rhetorical activity in technology development: knowledge consolidation analysis. Applying this method to an exemplar case, the…

  10. Developing Skills in Years 11 and 12 Secondary School Economics

    ERIC Educational Resources Information Center

    Stokes, Anthony; Wright, Sarah

    2013-01-01

    This paper explores different approaches for developing skills in economics in schools. It considers the different preferred learning styles of students through the VARK method and applies a contextual learning approach to engage students and develop skills. The key skills that are considered are literacy, numeracy, information and communication…

  11. Socioeconomic Status and Child Development: A Meta-Analysis

    ERIC Educational Resources Information Center

    Letourneau, Nicole Lyn; Duffett-Leger, Linda; Levac, Leah; Watson, Barry; Young-Morris, Catherine

    2013-01-01

    Lower socioeconomic status (SES) is widely accepted to have deleterious effects on the well-being and development of children and adolescents. However, rigorous meta-analytic methods have not been applied to determine the degree to which SES supports or limits children's and adolescents' behavioural, cognitive and language development. While…

  12. Toward an applied technology for quality measurement in health care.

    PubMed

    Berwick, D M

    1988-01-01

    Cost containment, financial incentives to conserve resources, the growth of for-profit hospitals, an aggressive malpractice environment, and demands from purchasers are among the forces today increasing the need for improved methods that measure quality in health care. At the same time, increasingly sophisticated databases and the existence of managed care systems yield new opportunities to observe and correct quality problems. Research on targets of measurement (structure, process, and outcome) and methods of measurement (implicit, explicit, and sentinel methods) has not yet produced managerially useful applied technology for quality measurement in real-world settings. Such an applied technology would have to be cheaper, faster, more flexible, better reported, and more multidimensional than the majority of current research on quality assurance. In developing a new applied technology for the measurement of health care quality, quantitative disciplines have much to offer, such as decision support systems, criteria based on rigorous decision analyses, utility theory, tools for functional status measurement, and advances in operations research.

  13. A Systematic Method of Integrating BIM and Sensor Technology for Sustainable Construction Design

    NASA Astrophysics Data System (ADS)

    Liu, Zhen; Deng, Zhiyu

    2017-10-01

    Building Information Modeling (BIM) has received considerable attention in the construction field, and sensor technology has been applied to construction data collection. This paper developed a method to integrate BIM and sensor technology for sustainable construction design. A brief literature review was conducted to clarify the current development of BIM and sensor technology; then a systematic method for integrating BIM and sensor technology to realize sustainable construction design was put forward; finally a brief discussion and conclusion were given.

  14. How Shall I Live? Constructing a Life Story in the College Years

    ERIC Educational Resources Information Center

    McAdams, Dan P.; Guo, Jennifer

    2014-01-01

    This chapter applies the concept of narrative identity to college student development. The authors describe a narrative interview method that can be used to promote the development of a purposeful life story in the college years.

  15. Applying systematic review search methods to the grey literature: a case study examining guidelines for school-based breakfast programs in Canada.

    PubMed

    Godin, Katelyn; Stapleton, Jackie; Kirkpatrick, Sharon I; Hanning, Rhona M; Leatherdale, Scott T

    2015-10-22

    Grey literature is an important source of information for large-scale review syntheses. However, there are many characteristics of grey literature that make it difficult to search systematically. Further, there is no 'gold standard' for rigorous systematic grey literature search methods and few resources on how to conduct this type of search. This paper describes systematic review search methods that were developed and applied to complete a case study systematic review of grey literature that examined guidelines for school-based breakfast programs in Canada. A grey literature search plan was developed to incorporate four different searching strategies: (1) grey literature databases, (2) customized Google search engines, (3) targeted websites, and (4) consultation with contact experts. These complementary strategies were used to minimize the risk of omitting relevant sources. Since abstracts are often unavailable in grey literature documents, items' abstracts, executive summaries, or tables of contents (whichever was available) were screened. Screening of publications' full-text followed. Data were extracted on the organization, year published, who they were developed by, intended audience, goal/objectives of document, sources of evidence/resources cited, meals mentioned in the guidelines, and recommendations for program delivery. The search strategies for identifying and screening publications for inclusion in the case study review were found to be manageable, comprehensive, and intuitive when applied in practice. The four search strategies of the grey literature search plan yielded 302 potentially relevant items for screening. Following the screening process, 15 publications that met all eligibility criteria remained and were included in the case study systematic review. The high-level findings of the case study systematic review are briefly described. This article demonstrated a feasible and seemingly robust method for applying systematic search strategies to identify web-based resources in the grey literature. The search strategy we developed and tested is amenable to adaptation to identify other types of grey literature from other disciplines and to answer a wide range of research questions. This method should be further adapted and tested in future research syntheses.

  16. Priority Determination of Underwater Tourism Site Development in Gorontalo Province using Analytical Hierarchy Process (AHP)

    NASA Astrophysics Data System (ADS)

    Rohandi, M.; Tuloli, M. Y.; Jassin, R. T.

    2018-02-01

    This research aims to determine the development priority of underwater tourism sites in Gorontalo province using the Analytical Hierarchy Process (AHP), a decision support system (DSS) method applying Multi-Attribute Decision Making (MADM). The method used 5 criteria and 28 alternatives to determine the best priority for underwater tourism site development in Gorontalo province. Based on the AHP calculation, the top development priority is Pulau Cinta, with a total AHP score of 0.489, or 48.9%. The DSS produced a reliable result quickly and at low cost, helping decision makers select the best underwater tourism site to be developed.
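
    The criterion-weighting step of AHP can be sketched as follows; the 4x4 pairwise comparison matrix is hypothetical (the study's five criteria and 28 alternatives are not reproduced), and the consistency check uses Saaty's usual random-index value for n = 4.

        # AHP weights from a pairwise comparison matrix via the principal
        # eigenvector, plus the consistency ratio; the judgments are hypothetical.
        import numpy as np

        A = np.array([
            [1.0, 3.0, 5.0, 7.0],
            [1/3, 1.0, 3.0, 5.0],
            [1/5, 1/3, 1.0, 3.0],
            [1/7, 1/5, 1/3, 1.0],
        ])

        eigval, eigvec = np.linalg.eig(A)
        k = eigval.real.argmax()                        # dominant eigenvalue
        weights = np.abs(eigvec[:, k].real)
        weights /= weights.sum()

        n = A.shape[0]
        ci = (eigval.real[k] - n) / (n - 1)             # consistency index
        ri = 0.90                                       # Saaty random index for n = 4
        print("weights:", weights, "CR:", ci / ri)      # CR < 0.1 is considered consistent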

  17. Spatial weighting approach in numerical method for disaggregation of MDGs indicators

    NASA Astrophysics Data System (ADS)

    Permai, S. D.; Mukhaiyar, U.; Satyaning PP, N. L. P.; Soleh, M.; Aini, Q.

    2018-03-01

    Disaggregation is used to separate and classify data based on certain characteristics or administrative levels. Disaggregated data are very important because some indicators are not measured for all characteristics. Detailed disaggregation of development indicators is important to ensure that everyone benefits from development and to support better development-related policymaking. This paper aims to explore different methods to disaggregate the national employment-to-population ratio indicator to the province and city level. A numerical approach is applied to overcome the unavailability of disaggregated data by constructing several spatial weight matrices based on neighbourhood, Euclidean distance and correlation. These methods can potentially be used and further developed to disaggregate development indicators to lower spatial levels, even by several demographic characteristics.
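
    A minimal sketch of one such spatial weight matrix (inverse Euclidean distance, row-standardised) is given below; the centroids and indicator values are hypothetical, not the study's data.

        # Distance-based, row-standardised spatial weight matrix and a simple
        # spatial-lag estimate; coordinates and values are hypothetical.
        import numpy as np

        centroids = np.array([[0.0, 0.0], [1.0, 0.5], [2.5, 1.0], [3.0, 3.0], [0.5, 2.0]])

        d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
        W = np.where(d > 0, 1.0 / np.where(d > 0, d, 1.0), 0.0)   # inverse distance, zero diagonal
        W /= W.sum(axis=1, keepdims=True)                          # rows sum to one

        values = np.array([0.62, 0.58, 0.71, 0.65, 0.60])          # indicator per unit
        print(W @ values)                                          # neighbour-weighted estimates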

  18. Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis

    NASA Technical Reports Server (NTRS)

    Carlson, Leland A.

    1989-01-01

    An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.

  19. An ultra-high pressure liquid chromatography-tandem mass spectrometry method for the quantification of teicoplanin in plasma of neonates.

    PubMed

    Begou, O; Kontou, A; Raikos, N; Sarafidis, K; Roilides, E; Papadoyannis, I N; Gika, H G

    2017-03-15

    The development and validation of an ultra-high pressure liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method was performed with the aim of quantifying plasma teicoplanin concentrations in neonates. Pharmacokinetic data on teicoplanin in the neonatal population are very limited; therefore, a sensitive and reliable method for the determination of all isoforms of teicoplanin in a low volume of sample is of real importance. Teicoplanin main components were extracted by a simple acetonitrile precipitation step and analysed on a C18 chromatographic column by a triple quadrupole MS with electrospray ionization. The method provides quantitative data over a linear range of 25-6400 ng/mL with an LOD of 8.5 ng/mL and an LOQ of 25 ng/mL for total teicoplanin. The method was applied to plasma samples from neonates to support pharmacokinetic data and proved to be a reliable and fast method for the quantification of teicoplanin concentration levels in plasma of infants during therapy in the Intensive Care Unit. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Mutation Clusters from Cancer Exome.

    PubMed

    Kakushadze, Zura; Yu, Willie

    2017-08-15

    We apply our statistically deterministic machine learning/clustering algorithm *K-means (recently developed in https://ssrn.com/abstract=2908286) to 10,656 published exome samples for 32 cancer types. A majority of cancer types exhibit a mutation clustering structure. Our results are in-sample stable. They are also out-of-sample stable when applied to 1389 published genome samples across 14 cancer types. In contrast, we find in- and out-of-sample instabilities in cancer signatures extracted from exome samples via nonnegative matrix factorization (NMF), a computationally-costly and non-deterministic method. Extracting stable mutation structures from exome data could have important implications for speed and cost, which are critical for early-stage cancer diagnostics, such as novel blood-test methods currently in development.
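
    For illustration only, the sketch below clusters synthetic 96-channel mutation spectra with ordinary k-means; the statistically deterministic *K-means variant used in the paper is not reproduced here.

        # Ordinary k-means on synthetic 96-channel mutation spectra (illustrative only).
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(42)
        n_samples, n_channels, n_clusters = 200, 96, 3

        base = rng.random((n_clusters, n_channels))            # three underlying spectra
        labels_true = rng.integers(0, n_clusters, n_samples)
        counts = rng.poisson(50 * base[labels_true])           # noisy mutation counts

        spectra = counts / counts.sum(axis=1, keepdims=True)   # normalise to fractions

        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(spectra)
        print(np.bincount(km.labels_))                         # cluster sizes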

  1. Green preparation of carbon dots with mangosteen pulp for the selective detection of Fe3+ ions and cell imaging

    NASA Astrophysics Data System (ADS)

    Yang, Rui; Guo, Xiangfeng; Jia, Lihua; Zhang, Yu; Zhao, Zhenlong; Lonshakov, Fedor

    2017-11-01

    A simple method was developed for the synthesis of fluorescent carbon dots (referred to as M-CDs) by calcination of mangosteen pulp in air, without the assistance of any chemical reagent. The M-CDs possess good solubility and satisfactory chemical stability and can be applied as a fluorescent temperature probe. More strikingly, the fluorescence of M-CDs can be rapidly and selectively quenched by Fe3+ ions. This phenomenon was used to develop a fluorescent method for facile detection of Fe3+ with a linear range of 0-0.18 mM and a detection limit of 52 nM. Eventually, the M-CDs were applied for cell imaging, demonstrating their potential toward diverse applications.

  2. Mutation Clusters from Cancer Exome

    PubMed Central

    Kakushadze, Zura; Yu, Willie

    2017-01-01

    We apply our statistically deterministic machine learning/clustering algorithm *K-means (recently developed in https://ssrn.com/abstract=2908286) to 10,656 published exome samples for 32 cancer types. A majority of cancer types exhibit a mutation clustering structure. Our results are in-sample stable. They are also out-of-sample stable when applied to 1389 published genome samples across 14 cancer types. In contrast, we find in- and out-of-sample instabilities in cancer signatures extracted from exome samples via nonnegative matrix factorization (NMF), a computationally-costly and non-deterministic method. Extracting stable mutation structures from exome data could have important implications for speed and cost, which are critical for early-stage cancer diagnostics, such as novel blood-test methods currently in development. PMID:28809811

  3. Incorporating ITS into transportation improvement planning : the Seattle Case Study using PRUEVIIN

    DOT National Transportation Integrated Search

    1998-01-01

    This project explored methods to analyze ITS strategies within Major Investment Study (MIS) studies and to apply them in a case study. The case study developed methods to define alternatives, and to estimate impacts and costs at the level required fo...

  4. Construction and evaluation of ion selective electrodes for nitrate with a summing operational amplifier. Application to tobacco analysis.

    PubMed

    Pérez-Olmos, R; Rios, A; Fernández, J R; Lapa, R A; Lima, J L

    2001-01-05

    In this paper, the construction and evaluation of an electrode selective to nitrate with improved sensitivity, constructed like a conventional electrode (ISE) but using an operational amplifier to sum the potentials supplied by four membranes (ESOA), is described. The two types of electrodes, without an inner reference solution, were constructed using tetraoctylammonium bromide as the sensor, dibutylphthalate as the solvent mediator and PVC as the plastic matrix, and the membranes obtained were applied directly onto a conductive epoxy resin support. After a comparative evaluation of their working characteristics, they were used in the determination of nitrate in different types of tobacco. The limit of detection of the direct potentiometric method developed was found to be 0.18 g kg(-1), and the precision and accuracy of the method, when applied to eight different samples of tobacco, expressed in terms of mean R.S.D. and average percentage of spike recovery, were 0.6 and 100.3%, respectively. The comparison of variances showed, on all occasions, that the results obtained by the ESOA were similar to those obtained by the conventional ISE, but with higher precision. Linear regression analysis showed good agreement (r=0.9994) between the results obtained by the developed potentiometric method and those of a spectrophotometric method based on brucine, adopted as the reference method, when applied simultaneously to 32 samples of different types of tobacco.

  5. Chemical Differentiation of Dendrobium officinale and Dendrobium devonianum by Using HPLC Fingerprints, HPLC-ESI-MS, and HPTLC Analyses

    PubMed Central

    Ye, Zi; Dai, Jia-Rong; Zhang, Cheng-Gang; Lu, Ye; Wu, Lei-Lei; Gong, Amy G. W.; Wang, Zheng-Tao

    2017-01-01

    The stems of Dendrobium officinale Kimura et Migo (Dendrobii Officinalis Caulis) have a high medicinal value as a traditional Chinese medicine (TCM). Because of the limited supply, D. officinale is a high priced TCM, and therefore adulterants are commonly found in the herbal market. The dried stems of a closely related Dendrobium species, Dendrobium devonianum Paxt., are commonly used as the substitute; however, there is no effective method to distinguish the two Dendrobium species. Here, a high performance liquid chromatography (HPLC) method was successfully developed and applied to differentiate D. officinale and D. devonianum by comparing the chromatograms according to the characteristic peaks. A HPLC coupled with electrospray ionization multistage mass spectrometry (HPLC-ESI-MS) method was further applied for structural elucidation of 15 flavonoids, 5 phenolic acids, and 1 lignan in D. officinale. Among these flavonoids, 4 flavonoid C-glycosides were firstly reported in D. officinale, and violanthin and isoviolanthin were identified to be specific for D. officinale compared with D. devonianum. Then, two representative components were used as chemical markers. A rapid and reliable high performance thin layer chromatography (HPTLC) method was applied in distinguishing D. officinale from D. devonianum. The results of this work have demonstrated that these developed analytical methods can be used to discriminate D. officinale and D. devonianum effectively and conveniently. PMID:28769988

  6. On the convergence of an iterative formulation of the electromagnetic scattering from an infinite grating of thin wires

    NASA Technical Reports Server (NTRS)

    Brand, J. C.

    1985-01-01

    Contraction theory is applied to an iterative formulation of electromagnetic scattering from periodic structures and a computational method for ensuring convergence is developed. A short history of spectral (or k-space) formulation is presented with an emphasis on application to periodic surfaces. The mathematical background for formulating an iterative equation is covered using straightforward single variable examples including an extension to vector spaces. To ensure a convergent solution of the iterative equation, a process called the contraction corrector method is developed. Convergence properties of previously presented iterative solutions to one-dimensional problems are examined utilizing contraction theory and the general conditions for achieving a convergent solution are explored. The contraction corrector method is then applied to several scattering problems including an infinite grating of thin wires with the solution data compared to previous works.
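
    The underlying idea can be illustrated with a generic fixed-point iteration that monitors the ratio of successive step sizes as an empirical contraction estimate; the scalar toy problem below is a stand-in, not the contraction corrector method or the scattering operator itself.

        # Fixed-point iteration with a contraction-style convergence check.
        import numpy as np

        def fixed_point(g, x0, tol=1e-10, max_iter=200):
            x_prev, x = x0, g(x0)
            step_prev = abs(x - x_prev)
            for _ in range(max_iter):
                x_next = g(x)
                step = abs(x_next - x)
                # the ratio of successive steps estimates the contraction constant,
                # which must stay below 1 for the Banach fixed-point theorem to apply
                if step_prev > 0 and step / step_prev >= 1.0:
                    raise RuntimeError("iteration is not contracting")
                if step < tol:
                    return x_next
                x, step_prev = x_next, step
            raise RuntimeError("no convergence within max_iter")

        # Toy example: x = cos(x) has a unique fixed point near 0.739
        print(fixed_point(np.cos, x0=0.5))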

  7. Accuracy assessment of the Precise Point Positioning method applied for surveys and tracking moving objects in GIS environment

    NASA Astrophysics Data System (ADS)

    Ilieva, Tamara; Gekov, Svetoslav

    2017-04-01

    The Precise Point Positioning (PPP) method gives users the opportunity to determine point locations using a single GNSS receiver. The accuracy of point locations determined by PPP is better in comparison to standard point positioning, due to the precise satellite orbit and clock corrections that are developed and maintained by the International GNSS Service (IGS). The aim of our current research is the accuracy assessment of the PPP method applied for surveys and tracking moving objects in a GIS environment. The PPP data are collected using a software application that we developed previously, which allows different sets of attribute data for the measurements and their accuracy to be used. The results from the PPP measurements are directly compared within the geospatial database to different other sets of terrestrial data - measurements obtained by total stations, real time kinematic and static GNSS.

  8. Comparison of field-enhanced and pressure-assisted field-enhanced sample injection techniques for the analysis of water-soluble vitamins using CZE.

    PubMed

    Liu, Qingqing; Liu, Yaling; Guan, Yu; Jia, Li

    2009-04-01

    A new online concentration method, namely pressure-assisted field-enhanced sample injection (PA-FESI), was developed and compared with FESI for the analysis of water-soluble vitamins by CZE with UV detection. In PA-FESI, negative voltage and positive pressure were simultaneously applied to initialize PA-FESI. PA-FESI uses the hydrodynamic flow generated by the positive pressure to counterbalance the reverse EOF in the capillary column during electrokinetic sample injection, which allowed a longer injection time than usual FESI mode without compromising the separation efficiency. Using the PA-FESI method, the LODs of the vitamins were at ng/mL level based on the S/N of 3 and the RSDs of migration time and peak area for each vitamin (1 microg/mL) were less than 5.1%. The developed method was applied to the analysis of water-soluble vitamins in corns.

  9. Software ``Best'' Practices: Agile Deconstructed

    NASA Astrophysics Data System (ADS)

    Fraser, Steven

    This workshop will explore the intersection of agility and software development in a world of legacy code-bases and large teams. Organizations with hundreds of developers and code-bases exceeding a million or tens of millions of lines of code are seeking new ways to expedite development while retaining and attracting staff who desire to apply “agile” methods. This is a situation where specific agile practices may be embraced outside of their usual zone of applicability. Here is where practitioners must understand both what “best practices” already exist in the organization - and how they might be improved or modified by applying “agile” approaches.

  10. Developing and Applying a Protocol for a Systematic Review in the Social Sciences

    ERIC Educational Resources Information Center

    Campbell, Allison; Taylor, Brian; Bates, Jessica; O'Connor-Bones, Una

    2018-01-01

    The article reports on a systematic method of undertaking a literature search on the educational impact of being a young carer (16-24 years old). The search methodology applied and described in detail will be of value to academic librarians and to other education researchers who undertake systematic literature searches. Seven bibliographic…

  11. Recovery and purification process development for monoclonal antibody production

    PubMed Central

    Ma, Junfen; Winter, Charles; Bayer, Robert

    2010-01-01

    Hundreds of therapeutic monoclonal antibodies (mAbs) are currently in development, and many companies have multiple antibodies in their pipelines. Current methodology used in recovery processes for these molecules are reviewed here. Basic unit operations such as harvest, Protein A affinity chromatography and additional polishing steps are surveyed. Alternative processes such as flocculation, precipitation and membrane chromatography are discussed. We also cover platform approaches to purification methods development, use of high throughput screening methods, and offer a view on future developments in purification methodology as applied to mAbs. PMID:20647768

  12. Parametric Cost and Schedule Modeling for Early Technology Development

    DTIC Science & Technology

    2018-04-02

    This work received the Best Paper in the Analysis Methods Category and 2017 Best Paper Overall awards. It was also presented at the 2017 NASA Cost and Schedule Symposium. ... information contribute to the lack of data, objective models, and methods that can be broadly applied in early planning stages. Scientific

  13. The Use of Invariance and Bootstrap Procedures as a Method to Establish the Reliability of Research Results.

    ERIC Educational Resources Information Center

    Sandler, Andrew B.

    Statistical significance is misused in educational and psychological research when it is applied as a method to establish the reliability of research results. Other techniques have been developed which can be correctly utilized to establish the generalizability of findings. Methods that do provide such estimates are known as invariance or…

  14. Domain identification in impedance computed tomography by spline collocation method

    NASA Technical Reports Server (NTRS)

    Kojima, Fumio

    1990-01-01

    A method for estimating an unknown domain in elliptic boundary value problems is considered. The problem is formulated as an inverse problem of integral equations of the second kind. A computational method is developed using a spline collocation scheme. The results can be applied to the inverse problem of impedance computed tomography (ICT) for image reconstruction.

  15. Success Factors for Using Case Method in Teaching and Learning Software Engineering

    ERIC Educational Resources Information Center

    Razali, Rozilawati; Zainal, Dzulaiha Aryanee Putri

    2013-01-01

    The Case Method (CM) has long been used effectively in Social Science education. Its potential use in Applied Science such as Software Engineering (SE) however has yet to be further explored. SE is an engineering discipline that concerns the principles, methods and tools used throughout the software development lifecycle. In CM, subjects are…

  16. Measuring forest evapotranspiration--theory and problems

    Treesearch

    Anthony C. Federer

    1970-01-01

    A satisfactory general method of measuring forest evapotranspiration has yet to be developed. Many procedures have been tried, but only the soil-water budget method and the micrometeorological methods offer any degree of success. This paper is a discussion of these procedures and the problems that arise in applying them. It is designed as a reference for scientists and...

  17. Commander’s Handbook for Unit Leader Development

    DTIC Science & Technology

    2007-07-02

    ...development tools, job aides, or other on-the-job leader development interventions. Implicitly, the handbook employs adult learning theory to engage...most effective and efficient methods of leader development for a unit environment. Principles of adult learning theory were then applied to

  18. Developments in hydrogenation technology for fine-chemical and pharmaceutical applications.

    PubMed

    Machado, R M; Heier, K R; Broekhuis, R R

    2001-11-01

    The continuous innovation in hydrogenation technology is testimony to its growing importance in the manufacture of specialty and fine chemicals. New developments in equipment, process intensification and catalysis represent major themes that have undergone recent advances. Developments in chiral catalysis, methods to support and fix homogeneous catalysts, novel reactor and mixing technology, high-throughput screening, supercritical processing, spectroscopic and electrochemical online process monitoring, monolithic and structured catalysts, and sonochemical activation methods illustrate the scope and breadth of evolving technology applied to hydrogenation.

  19. Combining large number of weak biomarkers based on AUC.

    PubMed

    Yan, Li; Tian, Lili; Liu, Song

    2015-12-20

    Combining multiple biomarkers to improve diagnosis and/or prognosis accuracy is a common practice in clinical medicine. Both parametric and non-parametric methods have been developed for finding the optimal linear combination of biomarkers to maximize the area under the receiver operating characteristic curve (AUC), primarily focusing on the setting with a small number of well-defined biomarkers. This problem becomes more challenging when the number of observations is not an order of magnitude greater than the number of variables, especially when the involved biomarkers are relatively weak. Such settings are not uncommon in certain applied fields. The first aim of this paper is to empirically evaluate the performance of existing linear combination methods under such settings. The second aim is to propose a new combination method, namely, the pairwise approach, to maximize AUC. Our simulation studies demonstrated that the performance of several existing methods can become unsatisfactory as the number of markers becomes large, while the newly proposed pairwise method performs reasonably well. Furthermore, we apply all the combination methods to real datasets used for the development and validation of MammaPrint. The implication of our study for the design of optimal linear combination methods is discussed. Copyright © 2015 John Wiley & Sons, Ltd.
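
    For context, the sketch below computes the empirical (Mann-Whitney) AUC of a simple linear combination of synthetic weak markers; the Fisher-style weights are a generic stand-in, not the authors' pairwise approach.

        # Empirical AUC of a linear marker combination; data and weights are synthetic.
        import numpy as np

        def empirical_auc(scores_pos, scores_neg):
            """Probability that a positive case scores higher than a negative case."""
            diff = scores_pos[:, None] - scores_neg[None, :]
            return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

        rng = np.random.default_rng(7)
        n_markers = 50
        pos = rng.normal(0.2, 1.0, size=(100, n_markers))    # many weak markers
        neg = rng.normal(0.0, 1.0, size=(100, n_markers))

        # pooled within-class covariance gives a Fisher-style combination direction
        pooled_cov = 0.5 * (np.cov(pos, rowvar=False) + np.cov(neg, rowvar=False))
        w = np.linalg.solve(pooled_cov, pos.mean(axis=0) - neg.mean(axis=0))

        print(empirical_auc(pos @ w, neg @ w))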

  20. Combining large number of weak biomarkers based on AUC

    PubMed Central

    Yan, Li; Tian, Lili; Liu, Song

    2018-01-01

    Combining multiple biomarkers to improve diagnosis and/or prognosis accuracy is a common practice in clinical medicine. Both parametric and non-parametric methods have been developed for finding the optimal linear combination of biomarkers to maximize the area under the receiver operating characteristic curve (AUC), primarily focusing on the setting with a small number of well-defined biomarkers. This problem becomes more challenging when the number of observations is not an order of magnitude greater than the number of variables, especially when the involved biomarkers are relatively weak. Such settings are not uncommon in certain applied fields. The first aim of this paper is to empirically evaluate the performance of existing linear combination methods under such settings. The second aim is to propose a new combination method, namely, the pairwise approach, to maximize AUC. Our simulation studies demonstrated that the performance of several existing methods can become unsatisfactory as the number of markers becomes large, while the newly proposed pairwise method performs reasonably well. Furthermore, we apply all the combination methods to real datasets used for the development and validation of MammaPrint. The implication of our study for the design of optimal linear combination methods is discussed. PMID:26227901

  1. Smart manipulation of ratio spectra for resolving a pharmaceutical mixture of Methocarbamol and Paracetamol

    NASA Astrophysics Data System (ADS)

    Essam, Hebatallah M.; Abd-El Rahman, Mohamed K.

    2015-04-01

    Two smart, specific, accurate and precise spectrophotometric methods manipulating ratio spectra are developed for simultaneous determination of Methocarbamol (METH) and Paracetamol (PAR) in their combined pharmaceutical formulation without preliminary separation. Method A is an extended ratio subtraction method (EXRSM) coupled with the ratio subtraction method (RSM), which depends on subtraction of the plateau values from the ratio spectrum. Method B is a ratio difference spectrophotometric method (RDM), which measures the difference in amplitudes of ratio spectra between 278 and 286 nm for METH and between 247 and 260 nm for PAR. The calibration curves are linear over the concentration ranges of 10-100 μg mL-1 and 2-20 μg mL-1 for METH and PAR, respectively. The specificity of the developed methods was investigated by analyzing different laboratory prepared mixtures of the two drugs. Both methods were applied successfully for the determination of the selected drugs in their combined dosage form. Furthermore, validation was performed according to ICH guidelines; accuracy, precision and repeatability were found to be within the acceptable limits. Statistical studies showed that both methods can be competitively applied in quality control laboratories.
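
    The ratio-difference step can be illustrated numerically: dividing a mixture spectrum by a divisor spectrum of one component turns that component into a constant, so the amplitude difference between two wavelengths depends only on the other component. The spectra, divisor, and concentrations in the sketch below are synthetic stand-ins, not the validated METH/PAR calibration.

        # Ratio-difference idea on synthetic spectra; the divisor component cancels.
        import numpy as np

        wl = np.linspace(230, 320, 181)                 # nm

        def gauss(center, width):
            return np.exp(-0.5 * ((wl - center) / width) ** 2)

        eps_meth = 0.020 * gauss(278, 12)               # stand-in METH spectrum (per unit conc.)
        eps_par = 0.050 * gauss(248, 18)                # stand-in PAR spectrum (per unit conc.)

        def ratio_difference(mixture, divisor, wl1, wl2):
            ratio = mixture / divisor
            i1, i2 = np.argmin(np.abs(wl - wl1)), np.argmin(np.abs(wl - wl2))
            return ratio[i1] - ratio[i2]

        conc = np.array([10.0, 40.0, 70.0, 100.0])      # METH calibration levels
        signal = [ratio_difference(c * eps_meth + 15.0 * eps_par, eps_par, 278, 286) for c in conc]
        slope, intercept = np.polyfit(conc, signal, 1)
        print(slope, intercept)                          # intercept ~ 0: PAR contribution cancels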

  2. Material Development to Raise Awareness of Using Smart Boards: An Example Design and Development Research

    ERIC Educational Resources Information Center

    Günaydin, Serpil; Karamete, Aysen

    2016-01-01

    This study aims to develop training material that will help raise awareness in prospective teachers regarding the benefits of using smart boards in the classroom. In this study, a Type 2 design and development research method (DDR) was used. The material was developed by applying phases of ADDIE--an instructional systems design model. The…

  3. A novel method for estimating soybean herbivory in western corn rootworm (Coleoptera: Chrysomelidae).

    PubMed

    Seiter, Nicholas J; Richmond, Douglas S; Holland, Jeffrey D; Krupke, Christian H

    2010-08-01

    The western corn rootworm, Diabrotica virgifera virgifera LeConte (Coleoptera: Chrysomelidae), is the key pest of corn, Zea mays L., in North America. The western corn rootworm variant is a strain found in some parts of the United States that oviposits in soybean, Glycine max (L.) Merr., thereby circumventing crop rotation. Soybean herbivory is closely associated with oviposition; therefore, evidence of herbivory could serve as a proxy for rotation resistance. A digital image analysis method based on the characteristic green abdominal coloration of rootworm adults with soybean foliage in their guts was developed to estimate soybean herbivory rates of adult females. Image analysis software was used to develop and apply threshold limits that allowed only colors within the range that is characteristic of soybean herbivory to be displayed. When this method was applied to adult females swept from soybean fields in an area with high levels of rotation resistance, 54.3 +/- 2.1% were estimated to have fed on soybean. This is similar to a previously reported estimate of 54.8%. Results when laboratory-generated negative controls were analyzed showed an acceptably low frequency of false positives. This method could be developed into a management tool if user-friendly software were developed for its implementation. In addition, researchers may find the method useful as a rapid, standardized screen for measuring frequencies of soybean herbivory.
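
    A minimal sketch of this kind of colour-threshold analysis is shown below; the HSV limits and the five percent feeding cut-off are hypothetical choices applied to a synthetic image, not the study's calibrated thresholds.

        # Count pixels whose hue falls in a green range; all thresholds are hypothetical.
        import colorsys
        import numpy as np

        def green_fraction(rgb_image, hue_range=(0.22, 0.45), min_sat=0.25, min_val=0.15):
            """rgb_image: (rows, cols, 3) floats in [0, 1]; returns the fraction of green pixels."""
            flat = rgb_image.reshape(-1, 3)
            hsv = np.array([colorsys.rgb_to_hsv(*px) for px in flat])
            h, s, v = hsv[:, 0], hsv[:, 1], hsv[:, 2]
            green = (h >= hue_range[0]) & (h <= hue_range[1]) & (s >= min_sat) & (v >= min_val)
            return green.mean()

        rng = np.random.default_rng(3)
        img = np.tile(np.array([0.45, 0.30, 0.15]), (60, 60, 1))   # brown background
        img[rng.random((60, 60)) < 0.08] = [0.25, 0.60, 0.20]      # ~8% green pixels
        frac = green_fraction(img)
        print(frac, "-> fed on soybean" if frac > 0.05 else "-> no soybean feeding")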

  4. Determination of sulfur compounds in hydrotreated transformer base oil by potentiometric titration.

    PubMed

    Chao, Qiu; Sheng, Han; Cheng, Xingguo; Ren, Tianhui

    2005-06-01

    A method was developed to analyze the distribution of sulfur compounds in model sulfur compounds by potentiometric titration, and applied to analyze hydrotreated transformer base oil. Model thioethers were oxidized to corresponding sulfoxides by tetrabutylammonium periodate and sodium metaperiodate, respectively, and the sulfoxides were titrated by perchloric acid titrant in acetic anhydride. The contents of aliphatic thioethers and total thioethers were then determined from that of sulfoxides in solution. The method was applied to determine the organic sulfur compounds in hydrotreated transformer base oil.

  5. Strip Yield Model Numerical Application to Different Geometries and Loading Conditions

    NASA Technical Reports Server (NTRS)

    Hatamleh, Omar; Forman, Royce; Shivakumar, Venkataraman; Lyons, Jed

    2006-01-01

    A new numerical method based on the strip-yield analysis approach was developed for calculating the Crack Tip Opening Displacement (CTOD). This approach can be applied for different crack configurations having infinite and finite geometries, and arbitrary applied loading conditions. The new technique adapts the boundary element / dislocation density method to obtain crack-face opening displacements at any point on a crack, and succeeds by obtaining requisite values as a series of definite integrals, the functional parts of each being evaluated exactly in a closed form.

  6. Methods for the Analysis of Protein Phosphorylation-Mediated Cellular Signaling Networks

    NASA Astrophysics Data System (ADS)

    White, Forest M.; Wolf-Yadlin, Alejandro

    2016-06-01

    Protein phosphorylation-mediated cellular signaling networks regulate almost all aspects of cell biology, including the responses to cellular stimulation and environmental alterations. These networks are highly complex and comprise hundreds of proteins and potentially thousands of phosphorylation sites. Multiple analytical methods have been developed over the past several decades to identify proteins and protein phosphorylation sites regulating cellular signaling, and to quantify the dynamic response of these sites to different cellular stimulation. Here we provide an overview of these methods, including the fundamental principles governing each method, their relative strengths and weaknesses, and some examples of how each method has been applied to the analysis of complex signaling networks. When applied correctly, each of these techniques can provide insight into the topology, dynamics, and regulation of protein phosphorylation signaling networks.

  7. The Development of an Environmentally Compliant, Multi-Functional Aerospace Coating Using Molecular- and Nano-Engineering Methods

    DTIC Science & Technology

    2006-10-02

    (Al-TM-RE) alloy which could be spray applied using various deposition routes or deposited as a powder that is... corrosion properties of various spray deposited alloys from their properties as defective coatings on 2024-T3. HVOF spray deposited and cold spray ...layer. A method has been developed to distinguish the intrinsic corrosion properties of various spray deposited

  8. Projection-free approximate balanced truncation of large unstable systems

    NASA Astrophysics Data System (ADS)

    Flinois, Thibault L. B.; Morgans, Aimee S.; Schmid, Peter J.

    2015-08-01

    In this article, we show that the projection-free, snapshot-based, balanced truncation method can be applied directly to unstable systems. We prove that even for unstable systems, the unmodified balanced proper orthogonal decomposition algorithm theoretically yields a converged transformation that balances the Gramians (including the unstable subspace). We then apply the method to a spatially developing unstable system and show that it results in reduced-order models of similar quality to the ones obtained with existing methods. Due to the unbounded growth of unstable modes, a practical restriction on the final impulse response simulation time appears, which can be adjusted depending on the desired order of the reduced-order model. Recommendations are given to further reduce the cost of the method if the system is large and to improve the performance of the method if it does not yield acceptable results in its unmodified form. Finally, the method is applied to the linearized flow around a cylinder at Re = 100 to show that it actually is able to accurately reproduce impulse responses for more realistic unstable large-scale systems in practice. The well-established approximate balanced truncation numerical framework therefore can be safely applied to unstable systems without any modifications. Additionally, balanced reduced-order models can readily be obtained even for large systems, where the computational cost of existing methods is prohibitive.

  9. A new liquid chromatography-mass spectrometry-based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous systems.

    PubMed

    Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A

    2015-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed (18)O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.

  10. Direct determination of GSK-3β activity and inhibition by UHPLC-UV-vis diode arrays detector (DAD).

    PubMed

    D'Urzo, Annalisa; De Simone, Angela; Fiori, Jessica; Naldi, Marina; Milelli, Andrea; Andrisano, Vincenza

    2016-05-30

    Altered GSK-3β activity can contribute to a number of pathological processes including Alzheimer's disease (AD). Indeed, GSK-3β catalyzes the hyperphosphorylation of tau protein by transferring a phosphate moiety from ATP to the protein substrate serine residue causing the formation of the toxic insoluble neurofibrillary tangles; for this reason it represents a key target for the development of new therapeutic agents for AD treatment. Herein we describe a new selective UHPLC methodology developed for the direct characterization of GSK-3β kinase activity and for the determination of its inhibition, which could be crucial in AD drug discovery. The UHPLC-UV (DAD) based method was validated for the very fast determination of ATP as reactant and ADP as product, and applied for the analysis of the enzymatic reaction between a phosphate primed peptide substrate (GSM), resembling tau protein sequence, ATP and GSK-3β, with/without inhibitors. Analysis time was ten times improved, when compared with previously published chromatographic methods. The method was also validated by determining enzyme reaction kinetic constants (KM and vmax) for GSM and ATP and by analyzing well known GSK-3β inhibitors. Inhibition potency (IC50) values for SB-415286 (81 ± 6 nM) and for Tideglusib (251 ± 17 nM), found by the newly developed UHPLC method, were in good agreement with the luminescence method taken as independent reference method. Further on, the UHPLC method was applied to the elucidation of Tideglusib mechanism of action by determining its inhibition constants (Ki). In agreement with literature data, Tideglusib resulted a GSM competitive inhibitor, whereas SB-415286 was found inhibiting GSK-3β in an ATP competitive manner. This method was applied to the determination of the potency of a new lead compound and was found potentially scalable to inhibitor screening of large compounds collections. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Quality by design in the chiral separation strategy for the determination of enantiomeric impurities: development of a capillary electrophoresis method based on dual cyclodextrin systems for the analysis of levosulpiride.

    PubMed

    Orlandini, S; Pasquini, B; Del Bubba, M; Pinzauti, S; Furlanetto, S

    2015-02-06

    Quality by design (QbD) concepts, in accordance with International Conference on Harmonisation Pharmaceutical Development guideline Q8(R2), represent an innovative strategy for the development of analytical methods. In this paper QbD principles have been comprehensively applied to the set-up of a capillary electrophoresis method aimed at quantifying enantiomeric impurities. The test compound was the chiral drug substance levosulpiride (S-SUL), and the developed method was intended to be used for routine analysis of the pharmaceutical product. The target of the analytical QbD approach is to establish a design space (DS) of critical process parameters (CPPs) within which the critical quality attributes (CQAs) of the method are assured to fulfil the desired requirements with a selected probability. QbD can improve the understanding of the enantioseparation process, including both the electrophoretic behavior of the enantiomers and their separation, thereby enabling its control. The CQAs were represented by enantioresolution and analysis time. The scouting phase made it possible to select a separation system consisting of sulfated-β-cyclodextrin and a neutral cyclodextrin, operating in reverse polarity mode. The type of neutral cyclodextrin was included among the other CPPs, both instrumental and related to background electrolyte composition, which were evaluated in a screening phase by an asymmetric screening matrix. Response surface methodology was carried out by a Doehlert design and allowed the contour plots to be drawn, highlighting significant interactions between some of the CPPs. The DS was defined by applying Monte-Carlo simulations and corresponded to the following intervals: sulfated-β-cyclodextrin concentration, 9-12 mM; methyl-β-cyclodextrin concentration, 29-38 mM; Britton-Robinson buffer pH, 3.24-3.50; voltage, 12-14 kV. Robustness of the method was examined by a Plackett-Burman matrix and the obtained results, together with system repeatability data, led to the definition of a method control strategy. The method was validated and finally applied to determine the enantiomeric purity of S-SUL in pharmaceutical dosage forms. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Implementing Quality Criteria in Designing and Conducting a Sequential Quan → Qual Mixed Methods Study of Student Engagement with Learning Applied Research Methods Online

    ERIC Educational Resources Information Center

    Ivankova, Nataliya V.

    2014-01-01

    In spite of recent methodological developments related to quality assurance in mixed methods research, practical examples of how to implement quality criteria in designing and conducting sequential QUAN → QUAL mixed methods studies to ensure the process is systematic and rigorous remain scarce. This article discusses a three-step…

  13. Fluorometric method for inorganic pyrophosphatase activity detection and inhibitor screening based on click chemistry.

    PubMed

    Xu, Kefeng; Chen, Zhonghui; Zhou, Ling; Zheng, Ou; Wu, Xiaoping; Guo, Longhua; Qiu, Bin; Lin, Zhenyu; Chen, Guonan

    2015-01-06

    A fluorometric method for pyrophosphatase (PPase) activity detection was developed based on click chemistry. Cu(II) can coordinate with pyrophosphate (PPi); the addition of PPase to this system destroys the coordination compound because PPase catalyzes the hydrolysis of PPi into inorganic phosphate and releases free Cu(II). The free Cu(II) is then reduced by sodium ascorbate (SA) to Cu(I), which in turn initiates the ligation reaction between nonfluorescent 3-azidocoumarins and terminal alkynes to produce a highly fluorescent triazole complex. On this basis, a simple and sensitive turn-on fluorometric method for PPase was developed. The fluorescence intensity of the system has a linear relationship with the logarithm of the PPase concentration in the range of 0.5-10 mU, with a detection limit down to 0.2 mU (S/N = 3). The method is cost-effective and convenient, requiring no labels or complicated operations. The proposed system was applied to screen potential PPase inhibitors with high efficiency, and the method can be applied to the diagnosis of PPase-related diseases.

  14. Independent component analysis-based algorithm for automatic identification of Raman spectra applied to artistic pigments and pigment mixtures.

    PubMed

    González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio

    2015-03-01

    A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
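
    The core unmixing idea described above can be illustrated with a short, self-contained sketch: simulated mixture spectra are decomposed with PCA (to estimate the number of components) and FastICA (to recover the component spectra), and each recovered component is matched to a small reference library by correlation. The band positions, library names and thresholds below are illustrative assumptions, not the authors' data or code.

    ```python
    # Minimal PCA + ICA unmixing sketch on synthetic "Raman-like" spectra.
    import numpy as np
    from sklearn.decomposition import PCA, FastICA

    rng = np.random.default_rng(0)
    wavenumber = np.linspace(200, 1800, 800)

    def band(center, width):
        return np.exp(-0.5 * ((wavenumber - center) / width) ** 2)

    # Hypothetical single-pigment reference spectra.
    library = {
        "pigment_A": band(450, 15) + 0.6 * band(1100, 20),
        "pigment_B": band(800, 18) + 0.8 * band(1500, 25),
    }
    sources = np.vstack(list(library.values()))

    # Simulated measurements: random mixtures of the two pigments plus noise.
    mixing = rng.uniform(0.2, 1.0, size=(10, 2))
    measured = mixing @ sources + 0.01 * rng.standard_normal((10, wavenumber.size))

    # PCA suggests how many independent components the data contain.
    pca = PCA().fit(measured)
    n_components = int(np.sum(pca.explained_variance_ratio_ > 0.01))

    # ICA recovers the component spectra (up to sign and scale).
    recovered = FastICA(n_components=n_components, random_state=0).fit_transform(measured.T).T

    # Match each recovered component against the reference library by correlation.
    for comp in recovered:
        scores = {name: abs(np.corrcoef(comp, ref)[0, 1]) for name, ref in library.items()}
        best = max(scores, key=scores.get)
        print(f"component matched to {best} (|r| = {scores[best]:.3f})")
    ```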

  15. Covariate Selection for Multilevel Models with Missing Data

    PubMed Central

    Marino, Miguel; Buxton, Orfeu M.; Li, Yi

    2017-01-01

    Missing covariate data hamper variable selection in multilevel regression settings. Current variable selection techniques for multiply-imputed data commonly address missingness in the predictors through list-wise deletion and stepwise-selection methods, which are problematic. Moreover, most variable selection methods are developed for independent linear regression models and do not accommodate multilevel mixed-effects regression models with incomplete covariate data. We develop a novel methodology that is able to perform covariate selection across multiply-imputed data for multilevel random-effects models when missing data are present. Specifically, we propose to stack the multiply-imputed data sets from a multiple imputation procedure and to apply a group variable selection procedure through group lasso regularization to assess the overall impact of each predictor on the outcome across the imputed data sets. Simulations confirm the advantageous performance of the proposed method compared with competing methods. We applied the method to reanalyze the Healthy Directions-Small Business cancer prevention study, which evaluated a behavioral intervention program targeting multiple risk-related behaviors in a working-class, multi-ethnic population. PMID:28239457

  16. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide

    NASA Astrophysics Data System (ADS)

    Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include the Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully to the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated in the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively.
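
    As a rough illustration of the multivariate-calibration side of such work, the sketch below fits a PLS model to simulated three-component absorbance spectra (Beer's law mixtures with Gaussian bands). The band shapes, noise level and concentration ranges are assumptions made for the example; only the ranges echo those quoted above.

    ```python
    # PLS calibration sketch on simulated three-component UV spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    wavelength = np.linspace(200, 400, 300)

    def spectrum(center, width):
        return np.exp(-0.5 * ((wavelength - center) / width) ** 2)

    # Stand-ins for the three pure-component spectra (AML, VAL, HCT analogues).
    pure = np.vstack([spectrum(240, 12), spectrum(280, 15), spectrum(320, 10)])

    # Random mixtures spanning the calibration ranges quoted above (ug/mL).
    conc = rng.uniform([2, 4, 2], [32, 44, 20], size=(60, 3))
    absorbance = conc @ pure + 0.002 * rng.standard_normal((60, wavelength.size))

    X_train, X_test, y_train, y_test = train_test_split(absorbance, conc, random_state=0)
    pls = PLSRegression(n_components=3).fit(X_train, y_train)
    print("R^2 on held-out mixtures:", round(pls.score(X_test, y_test), 4))
    ```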

  17. [An Introduction to Methods for Evaluating Health Care Technology].

    PubMed

    Lee, Ting-Ting

    2015-06-01

    The rapid and continual advance of healthcare technology makes ensuring that this technology is used effectively to achieve its original goals a critical issue. This paper presents three methods that may be applied by healthcare professionals in the evaluation of healthcare technology. These methods include: the perception/experiences of users, user work-pattern changes, and chart review or data mining. The first method includes two categories: using interviews to explore the user experience and using theory-based questionnaire surveys. The second method applies work sampling to observe the work pattern changes of users. The last method conducts chart reviews or data mining to analyze the designated variables. In conclusion, while evaluative feedback may be used to improve the design and development of healthcare technology applications, the informatics competency and informatics literacy of users may be further explored in future research.

  18. Recent archaeomagnetic studies in Slovakia: Comparison of methodological approaches

    NASA Astrophysics Data System (ADS)

    Kubišová, Lenka

    2016-03-01

    We review recent archaeomagnetic studies carried out on the territory of Slovakia, focusing on a comparison of methodological approaches and discussing the pros and cons of the individual methods from the perspective of our experience. The most widely used methods for determining the intensity and direction of the archaeomagnetic field by demagnetisation of the sample material are alternating field (AF) demagnetisation and the Thellier double-heating method. These methods are used not only for archaeomagnetic studies but also help to solve some geological problems. The two methods were applied to samples collected recently at several sites in Slovakia, where archaeological prospection prompted by earthwork or reconstruction work on development projects called for archaeomagnetic dating. We then discuss the advantages and weaknesses of the investigated methods from different perspectives, based on several examples and our recent experience.

  19. Different spectrophotometric methods applied for the analysis of simeprevir in the presence of its oxidative degradation product: A comparative study

    NASA Astrophysics Data System (ADS)

    Attia, Khalid A. M.; El-Abasawi, Nasr M.; El-Olemy, Ahmed; Serag, Ahmed

    2018-02-01

    Five simple spectrophotometric methods were developed for the determination of simeprevir in the presence of its oxidative degradation product, namely ratio difference, mean centering, derivative ratio using Savitzky-Golay filters, second derivative and continuous wavelet transform. These methods are linear in the range of 2.5-40 μg/mL and were validated according to the ICH guidelines. The obtained results of accuracy, repeatability and precision were found to be within the acceptable limits. The specificity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. Furthermore, these methods were statistically comparable to a reference RP-HPLC method, and good results were obtained. They can therefore be used for the routine analysis of simeprevir in quality-control laboratories.
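
    One of the listed approaches, derivative spectrophotometry, is easy to sketch: a Savitzky-Golay filter computes a smoothed second-derivative spectrum, which sharpens narrow drug bands and suppresses broad degradant contributions. The simulated bands below are assumptions made for illustration, not simeprevir data.

    ```python
    # Second-derivative spectrum via a Savitzky-Golay filter (illustrative data).
    import numpy as np
    from scipy.signal import savgol_filter

    wavelength = np.linspace(220, 400, 500)
    drug = np.exp(-0.5 * ((wavelength - 280) / 12) ** 2)              # narrow drug band
    degradant = 0.4 * np.exp(-0.5 * ((wavelength - 300) / 30) ** 2)   # broad degradant band
    mixture = drug + degradant

    # The second derivative emphasises the narrow band relative to the broad one.
    second_derivative = savgol_filter(mixture, window_length=21, polyorder=3, deriv=2)
    print(f"second-derivative minimum near {wavelength[np.argmin(second_derivative)]:.1f} nm")
    ```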

  20. Quantum supergroups and solutions of the Yang-Baxter equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bracken, A.J.; Gould, M.D.; Zhang, R.B.

    1990-05-10

    A method is developed for systematically constructing trigonometric and rational solutions of the Yang-Baxter equation using the representation theory of quantum supergroups. New quantum R-matrices are obtained by applying the method to the vector representations of quantum osp(1/2) and gl(m/n).
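
    For reference, the equation these R-matrices solve is the quantum Yang-Baxter equation, written here in its standard spectral-parameter form on the triple tensor product (the subscripts indicate on which pair of factors R acts):

    ```latex
    % Quantum Yang-Baxter equation with spectral parameters u, v on V \otimes V \otimes V.
    R_{12}(u)\, R_{13}(u+v)\, R_{23}(v) \;=\; R_{23}(v)\, R_{13}(u+v)\, R_{12}(u)
    ```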

  1. A RAPID DNA EXTRACTION METHOD FOR PCR IDENTIFICATION OF FUNGAL INDOOR AIR CONTAMINANTS

    EPA Science Inventory

    Following air sampling, fungal DNA needs to be extracted and purified to a state suitable for laboratory use. Our laboratory has developed a simple method of extraction and purification of fungal DNA appropriate for enzymatic manipulation and polymerase chain reaction (PCR) appli...

  2. Application of an energy balance method for estimating evapotranspiration in cropping systems

    USDA-ARS?s Scientific Manuscript database

    Accurate quantification of evapotranspiration (ET, consumptive water use) from planting through harvest is critical for managing the limited water resources for crop irrigation. Our objective was to develop and apply an improved land-crop surface residual energy balance (EB) method for quantifying E...

  3. Self-calibrating models for dynamic monitoring and diagnosis

    NASA Technical Reports Server (NTRS)

    Kuipers, Benjamin

    1996-01-01

    A method for automatically building qualitative and semi-quantitative models of dynamic systems, and using them for monitoring and fault diagnosis, is developed and demonstrated. The qualitative approach and semi-quantitative method are applied to monitoring observation streams, and to design of non-linear control systems.

  4. Measurement of turbulent spatial structure and kinetic energy spectrum by exact temporal-to-spatial mapping

    NASA Astrophysics Data System (ADS)

    Buchhave, Preben; Velte, Clara M.

    2017-08-01

    We present a method for converting a time record of turbulent velocity measured at a point in a flow to a spatial velocity record consisting of consecutive convection elements. The spatial record allows computation of dynamic statistical moments such as turbulent kinetic wavenumber spectra and spatial structure functions in a way that completely bypasses the need for Taylor's hypothesis. The spatial statistics agree with the classical counterparts, such as the total kinetic energy spectrum, at least for spatial extents up to the Taylor microscale. The requirements for applying the method are access to the instantaneous velocity magnitude, in addition to the desired flow quantity, and a high temporal resolution in comparison to the relevant time scales of the flow. We map, without distortion and bias, notoriously difficult developing turbulent high intensity flows using three main aspects that distinguish these measurements from previous work in the field: (1) The measurements are conducted using laser Doppler anemometry and are therefore not contaminated by directional ambiguity (in contrast to, e.g., frequently employed hot-wire anemometers); (2) the measurement data are extracted using a correctly and transparently functioning processor and are analysed using methods derived from first principles to provide unbiased estimates of the velocity statistics; (3) the exact mapping proposed herein has been applied to the high turbulence intensity flows investigated to avoid the significant distortions caused by Taylor's hypothesis. The method is first confirmed to produce the correct statistics using computer simulations and later applied to measurements in some of the most difficult regions of a round turbulent jet—the non-equilibrium developing region and the outermost parts of the developed jet. The proposed mapping is successfully validated using corresponding directly measured spatial statistics in the fully developed jet, even in the difficult outer regions of the jet where the average convection velocity is negligible and turbulence intensities increase dramatically. The measurements in the developing region reveal interesting features of an incomplete Richardson-Kolmogorov cascade under development.

  5. Molecular testing for clinical diagnosis and epidemiological investigations of intestinal parasitic infections.

    PubMed

    Verweij, Jaco J; Stensvold, C Rune

    2014-04-01

    Over the past few decades, nucleic acid-based methods have been developed for the diagnosis of intestinal parasitic infections. Advantages of nucleic acid-based methods are numerous; typically, these include increased sensitivity and specificity and simpler standardization of diagnostic procedures. DNA samples can also be stored and used for genetic characterization and molecular typing, providing a valuable tool for surveys and surveillance studies. A variety of technologies have been applied, and some specific and general pitfalls and limitations have been identified. This review provides an overview of the multitude of methods that have been reported for the detection of intestinal parasites and offers some guidance in applying these methods in the clinical laboratory and in epidemiological studies.

  6. Minimizing Higgs potentials via numerical polynomial homotopy continuation

    NASA Astrophysics Data System (ADS)

    Maniatis, M.; Mehta, D.

    2012-08-01

    The study of models with extended Higgs sectors requires minimizing the corresponding Higgs potentials, which is in general very difficult. Here, we apply a recently developed method, called numerical polynomial homotopy continuation (NPHC), which is guaranteed to find all the stationary points of Higgs potentials with polynomial-like non-linearity. The detection of all stationary points reveals the structure of the potential, with maxima, metastable minima and saddle points in addition to the global minimum. We apply the NPHC method to the most general Higgs potential having two complex Higgs-boson doublets and up to five real Higgs-boson singlets. Moreover, the method is applicable to even more involved potentials. Hence the NPHC method allows one to go far beyond the limits of the Gröbner basis approach.
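
    The underlying task, finding all stationary points of a polynomial potential and classifying them, can be illustrated on a toy two-field quartic potential. The sketch below uses exact algebra (sympy) rather than the NPHC method itself, and the potential and its coefficients are illustrative assumptions, not a physical Higgs potential.

    ```python
    # Find and classify all stationary points of a toy quartic potential V(x, y).
    import sympy as sp

    x, y = sp.symbols("x y", real=True)
    V = -2 * x**2 - 3 * y**2 + (x**2 + y**2) ** 2 + 2 * x**2 * y**2

    grad = [sp.diff(V, v) for v in (x, y)]
    stationary_points = sp.solve(grad, [x, y], dict=True)   # all real stationary points

    for pt in stationary_points:
        hess = sp.Matrix([[sp.diff(g, v) for v in (x, y)] for g in grad]).subs(pt)
        eigs = [float(e) for e in hess.eigenvals()]
        kind = ("minimum" if all(e > 0 for e in eigs)
                else "maximum" if all(e < 0 for e in eigs) else "saddle")
        print(pt, "V =", round(float(V.subs(pt)), 3), kind)
    ```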

  7. Molecular Testing for Clinical Diagnosis and Epidemiological Investigations of Intestinal Parasitic Infections

    PubMed Central

    Stensvold, C. Rune

    2014-01-01

    SUMMARY Over the past few decades, nucleic acid-based methods have been developed for the diagnosis of intestinal parasitic infections. Advantages of nucleic acid-based methods are numerous; typically, these include increased sensitivity and specificity and simpler standardization of diagnostic procedures. DNA samples can also be stored and used for genetic characterization and molecular typing, providing a valuable tool for surveys and surveillance studies. A variety of technologies have been applied, and some specific and general pitfalls and limitations have been identified. This review provides an overview of the multitude of methods that have been reported for the detection of intestinal parasites and offers some guidance in applying these methods in the clinical laboratory and in epidemiological studies. PMID:24696439

  8. Coordinated development of leading biomass pretreatment technologies.

    PubMed

    Wyman, Charles E; Dale, Bruce E; Elander, Richard T; Holtzapple, Mark; Ladisch, Michael R; Lee, Y Y

    2005-12-01

    For the first time, a single source of cellulosic biomass was pretreated by leading technologies using identical analytical methods to provide comparative performance data. In particular, ammonia explosion, aqueous ammonia recycle, controlled pH, dilute acid, flowthrough, and lime approaches were applied to prepare corn stover for subsequent biological conversion to sugars through a Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) among Auburn University, Dartmouth College, Michigan State University, the National Renewable Energy Laboratory, Purdue University, and Texas A&M University. An Agricultural and Industrial Advisory Board provided guidance to the project. Pretreatment conditions were selected based on the extensive experience of the team with each of the technologies, and the resulting fluid and solid streams were characterized using standard methods. The data were used to close material balances, and energy balances were estimated for all processes. The digestibilities of the solids by a controlled supply of cellulase enzyme and the fermentability of the liquids were also assessed and used to guide selection of optimum pretreatment conditions. Economic assessments were applied based on the performance data to estimate each pretreatment cost on a consistent basis. Through this approach, comparative data were developed on sugar recovery from hemicellulose and cellulose by the combined pretreatment and enzymatic hydrolysis operations when applied to corn stover. This paper introduces the project and summarizes the shared methods for papers reporting results of this research in this special edition of Bioresource Technology.

  9. Development of a NIR-based blend uniformity method for a drug product containing multiple structurally similar actives by using the quality by design principles.

    PubMed

    Lin, Yiqing; Li, Weiyong; Xu, Jin; Boulas, Pierre

    2015-07-05

    The aim of this study is to develop an at-line near infrared (NIR) method for the rapid and simultaneous determination of four structurally similar active pharmaceutical ingredients (APIs) in powder blends intended for the manufacturing of tablets. Two of the four APIs in the formula are present in relatively small amounts, one at 0.95% and the other at 0.57%. Such small amounts in addition to the similarity in structures add significant complexity to the blend uniformity analysis. The NIR method is developed using spectra from six laboratory-created calibration samples augmented by a small set of spectra from a large-scale blending sample. Applying the quality by design (QbD) principles, the calibration design included concentration variations of the four APIs and a main excipient, microcrystalline cellulose. A bench-top FT-NIR instrument was used to acquire the spectra. The obtained NIR spectra were analyzed by applying principal component analysis (PCA) before calibration model development. Score patterns from the PCA were analyzed to reveal relationship between latent variables and concentration variations of the APIs. In calibration model development, both PLS-1 and PLS-2 models were created and evaluated for their effectiveness in predicting API concentrations in the blending samples. The final NIR method shows satisfactory specificity and accuracy. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Conformational Space Annealing explained: A general optimization algorithm, with diverse applications

    NASA Astrophysics Data System (ADS)

    Joung, InSuk; Kim, Jong Yun; Gross, Steven P.; Joo, Keehyoung; Lee, Jooyoung

    2018-02-01

    Many problems in science and engineering can be formulated as optimization problems. One way to solve these problems is to develop tailored problem-specific approaches. As such development is challenging, an alternative is to develop good generally-applicable algorithms. Such algorithms are easy to apply, typically function robustly, and reduce development time. Here we provide a description of one such algorithm, called Conformational Space Annealing (CSA), along with its python version, PyCSA. We previously applied it to many optimization problems including protein structure prediction and graph community detection. To demonstrate its utility, we have applied PyCSA to two continuous test functions, namely the Ackley and Eggholder functions. In addition, to demonstrate the generality of PyCSA for any type of objective function, we show how PyCSA can be applied to a discrete objective function, namely a parameter optimization problem. Based on the benchmarking results for the three problems, the performance of CSA is shown to be better than or similar to that of the most popular optimization method, simulated annealing. For continuous objective functions, we found that L-BFGS-B was the best-performing local optimization method, while for a discrete objective function Nelder-Mead was the best. The current version of PyCSA can be run in parallel at the coarse-grained level by calculating multiple independent local optimizations separately. The source code of PyCSA is available from http://lee.kias.re.kr.
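
    The flavour of the benchmark comparison can be reproduced with SciPy's general-purpose optimizers on the Ackley function. This is a sketch only: it uses scipy.optimize rather than PyCSA, and the dimension, bounds and starting point are arbitrary choices.

    ```python
    # Ackley test function minimized globally (dual annealing) and locally (L-BFGS-B).
    import numpy as np
    from scipy.optimize import dual_annealing, minimize

    def ackley(x):
        x = np.asarray(x)
        n = x.size
        return (-20.0 * np.exp(-0.2 * np.sqrt(np.sum(x**2) / n))
                - np.exp(np.sum(np.cos(2 * np.pi * x)) / n) + 20.0 + np.e)

    bounds = [(-32.768, 32.768)] * 5                 # global minimum is 0 at the origin

    global_result = dual_annealing(ackley, bounds, seed=0)
    local_result = minimize(ackley, x0=np.full(5, 20.0), method="L-BFGS-B", bounds=bounds)

    print("dual_annealing:", round(global_result.fun, 6))
    print("L-BFGS-B only :", round(local_result.fun, 6))  # typically stuck in a local minimum
    ```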

  11. Safety assessment of foods from genetically modified crops in countries with developing economies.

    PubMed

    Delaney, Bryan

    2015-12-01

    Population growth, particularly in countries with developing economies, will result in a need to increase food production by 70% by the year 2050. Biotechnology has been utilized to produce genetically modified (GM) crops for insect and weed control, with benefits including increased crop yield, and will also be used in emerging countries. A multicomponent safety assessment paradigm has been applied to individual GM crops to determine whether they are as safe as foods from non-GM crops. This paper reviews the methods used to assess the safety of foods from the first generation of GM crops. The methods can readily be applied to new products developed within a country, and this paper emphasizes the concept of data portability: safety data produced in one geographic location are suitable for safety assessment regardless of where they are utilized. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Simulations of Chemical Reactions with the Frozen Domain Formulation of the Fragment Molecular Orbital Method.

    PubMed

    Nakata, Hiroya; Fedorov, Dmitri G; Nagata, Takeshi; Kitaura, Kazuo; Nakamura, Shinichiro

    2015-07-14

    The fully analytic first and second derivatives of the energy in the frozen domain formulation of the fragment molecular orbital (FMO) method were developed and applied to locate transition states and determine vibrational contributions to free energies. The development is focused on the frozen domain with dimers (FDD) model. The intrinsic reaction coordinate method was interfaced with FMO. Simulations of IR and Raman spectra were enabled using FMO/FDD by developing the calculation of intensities. The accuracy is evaluated for SN2 reactions in explicit solvent and for the binding free energies of a protein-ligand complex of the Trp cage protein (PDB: 1L2Y). FMO/FDD is applied to study the keto-enol tautomeric reaction of phosphoglycolohydroxamic acid in triosephosphate isomerase (PDB: 7TIM), and the role of amino acid residue fragments in the reaction is discussed.

  13. Face Liveness Detection Using Defocus

    PubMed Central

    Kim, Sooyeon; Ban, Yuseok; Lee, Sangyoun

    2015-01-01

    In order to develop security systems for identity authentication, face recognition (FR) technology has been applied. One of the main problems of applying FR technology is that the systems are especially vulnerable to attacks with spoofing faces (e.g., 2D pictures). To defend from these attacks and to enhance the reliability of FR systems, many anti-spoofing approaches have been recently developed. In this paper, we propose a method for face liveness detection using the effect of defocus. From two images sequentially taken at different focuses, three features, focus, power histogram and gradient location and orientation histogram (GLOH), are extracted. Afterwards, we detect forged faces through the feature-level fusion approach. For reliable performance verification, we develop two databases with a handheld digital camera and a webcam. The proposed method achieves a 3.29% half total error rate (HTER) at a given depth of field (DoF) and can be extended to camera-equipped devices, like smartphones. PMID:25594594

  14. Taking Charge of Professional Development: A Practical Model for Your School

    ERIC Educational Resources Information Center

    Semadeni, Joseph

    2009-01-01

    Overcome budget cuts, lack of leadership, top-down mandates, and other obstacles to professional development by using this book's take-charge approach. Joseph H. Semadeni guides you through a systemic method to professional development that: (1) Motivates teachers to continuously learn and apply best practices; (2) Makes adult learning activities…

  15. Knowledge Management Model: Practical Application for Competency Development

    ERIC Educational Resources Information Center

    Lustri, Denise; Miura, Irene; Takahashi, Sergio

    2007-01-01

    Purpose: This paper seeks to present a knowledge management (KM) conceptual model for competency development and a case study in a law service firm, which implemented the KM model in a competencies development program. Design/methodology/approach: The case study method was applied according to Yin (2003) concepts, focusing a six-professional group…

  16. Three novel approaches to structural identifiability analysis in mixed-effects models.

    PubMed

    Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D

    2016-05-06

    Structural identifiability is a concept that considers whether the structure of a model, together with a set of input-output relations, uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As the development of structural identifiability techniques for mixed-effects models has received very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. Simultaneous Quantitation of Advanced Glycation End Products in Soy Sauce and Beer by Liquid Chromatography-Tandem Mass Spectrometry without Ion-Pair Reagents and Derivatization.

    PubMed

    Nomi, Yuri; Annaka, Hironori; Sato, Shinji; Ueta, Etsuko; Ohkura, Tsuyoshi; Yamamoto, Kazuhiro; Homma, Seiichi; Suzuki, Emiko; Otsuka, Yuzuru

    2016-11-09

    The aim of this study was to develop a simple and sensitive method to analyze several advanced glycation end products (AGEs) simultaneously using liquid chromatography-tandem mass spectrometry (LC-MS/MS), and to apply this method to the quantitation of AGEs in brown-colored foods. The developed method enabled the simultaneous separation and quantitation of seven AGEs and was applied to the determination of free AGEs contained in various kinds of soy sauce and beer. The major AGEs in soy sauce and beer were Nε-carboxymethyllysine (CML), Nε-carboxyethyllysine (CEL), and Nδ-(5-hydro-5-methyl-4-imidazolon-2-yl)ornithine (MG-H1). Using the developed LC-MS/MS method, a recovery test on soy sauce and beer samples showed recovery values of 85.3-103.9% for CML, 95.9-107.4% for CEL, and 69.5-123.2% for MG-H1. In particular, this is the first report that free CML, CEL, and MG-H1 are present in beer. Furthermore, long-term storage and the heating process of soy sauce increased CML and MG-H1 levels.

  18. Modeling the human development index and the percentage of poor people using quantile smoothing splines

    NASA Astrophysics Data System (ADS)

    Mulyani, Sri; Andriyana, Yudhie; Sudartianto

    2017-03-01

    Mean regression is a statistical method that explains the relationship between a response variable and predictor variables on the basis of the central tendency (mean) of the response variable. Parameter estimation in mean regression (with Ordinary Least Squares, OLS) is problematic if we apply it to data that are asymmetric, fat-tailed, or contain outliers. Hence, an alternative method is necessary for that kind of data, for example quantile regression. Quantile regression is a technique that is robust to outliers. This model can explain the relationship between the response variable and the predictor variables not only at the central tendency of the data (the median) but also at various quantiles, in order to obtain complete information about that relationship. In this study, a quantile regression model is developed with a nonparametric approach, namely smoothing splines. A nonparametric approach is used when the model is difficult to prespecify and the relation between the two variables follows an unknown function. We apply the proposed method to poverty data, estimating the Percentage of Poor People as the response variable with the Human Development Index (HDI) as the predictor variable.
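
    A minimal sketch of the quantile-regression idea is given below, using a linear QuantReg fit from statsmodels as a stand-in for the quantile smoothing splines described above, on simulated HDI/poverty data with heteroscedastic noise (all numbers are invented).

    ```python
    # Linear quantile regression of a simulated "% poor vs HDI" relationship.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    hdi = rng.uniform(0.55, 0.80, 200)                                # simulated HDI values
    poor = 60 - 55 * hdi + rng.standard_normal(200) * (8 - 6 * hdi)   # simulated % poor

    X = sm.add_constant(hdi)
    for q in (0.25, 0.50, 0.75):
        fit = sm.QuantReg(poor, X).fit(q=q)
        print(f"quantile {q}: intercept = {fit.params[0]:.2f}, slope = {fit.params[1]:.2f}")
    ```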

  19. Optimization of o-phtaldialdehyde/2-mercaptoethanol postcolumn reaction for the hydrophilic interaction liquid chromatography determination of memantine utilizing a silica hydride stationary phase.

    PubMed

    Douša, Michal; Pivoňková, Veronika; Sýkora, David

    2016-08-01

    A rapid procedure for the determination of memantine based on hydrophilic interaction chromatography with fluorescence detection was developed. Fluorescence detection after postcolumn derivatization with o-phtaldialdehyde/2-mercaptoethanol was performed at excitation and emission wavelengths of 345 and 450 nm, respectively. The postcolumn reaction conditions, such as reaction temperature, derivatization reagent flow rate, and reagent concentrations, were studied because of the steric hindrance of the amino group of memantine. The derivatization reaction was applied to a hydrophilic interaction liquid chromatography method based on a Cogent Silica-C stationary phase with a mobile phase consisting of a mixture of 10 mmol/L citric acid and 10 mmol/L o-phosphoric acid (pH 6.0) with acetonitrile at an isocratic composition of 2:8 v/v. The benefits of the reported approach are a simple sample pretreatment and a quick, sensitive hydrophilic interaction chromatography method. The developed method was validated in terms of linearity, accuracy, precision, and selectivity according to the International Conference on Harmonisation guidelines and was successfully applied to the analysis of commercial memantine tablets. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Measuring Health Information Dissemination and Identifying Target Interest Communities on Twitter: Methods Development and Case Study of the @SafetyMD Network.

    PubMed

    Kandadai, Venk; Yang, Haodong; Jiang, Ling; Yang, Christopher C; Fleisher, Linda; Winston, Flaura Koplin

    2016-05-05

    Little is known about the ability of individual stakeholder groups to achieve health information dissemination goals through Twitter. This study aimed to develop and apply methods for the systematic evaluation and optimization of health information dissemination by stakeholders through Twitter. Tweet content from 1790 followers of @SafetyMD (July-November 2012) was examined. User emphasis, a new indicator of Twitter information dissemination, was defined and applied to retweets across two levels of retweeters originating from @SafetyMD. User interest clusters were identified based on principal component analysis (PCA) and hierarchical cluster analysis (HCA) of a random sample of 170 followers. User emphasis of keywords remained across levels but decreased by 9.5 percentage points. PCA and HCA identified 12 statistically unique clusters of followers within the @SafetyMD Twitter network. This study is one of the first to develop methods for use by stakeholders to evaluate and optimize their use of Twitter to disseminate health information. Our new methods provide preliminary evidence that individual stakeholders can evaluate the effectiveness of health information dissemination and create content-specific clusters for more specific targeted messaging.

  1. The Value of Satellite Early Warning Systems in Kenya and Guatemala: Results and Lessons Learned from Contingent Valuation and Loss Avoidance Approaches

    NASA Astrophysics Data System (ADS)

    Morrison, I.; Berenter, J. S.

    2017-12-01

    SERVIR, the joint USAID and NASA initiative, conducted two studies to assess the value of two distinctly different Early Warning Systems (EWS) in Guatemala and Kenya. Each study applied a unique method to assess EWS value. The evaluation team conducted a Contingent Valuation (CV) choice experiment to measure the value of a near-real-time VIIRS- and MODIS-based hot-spot mapping tool for forest management professionals targeting seasonal forest fires in Northern Guatemala. The team also conducted a survey-based Damage and Loss Avoidance (DaLA) exercise to calculate the monetary benefits of a MODIS-derived frost forecasting system for farmers in the tea-growing highlands of Kenya. This presentation compares and contrasts the use and utility of these two valuation approaches for assessing EWS value. Although interest in these methods is growing, few empirical studies have applied them to benefit and value assessment for EWS. Furthermore, the application of CV and DaLA methods is much less common outside of the developed world. Empirical findings from these two studies indicated significant value for two substantially different beneficiary groups: natural resource management specialists and smallholder tea farmers. Additionally, the valuation processes generated secondary information that can help improve the format and delivery of both types of EWS outputs for user and beneficiary communities in Kenya and Guatemala. Based on lessons learned from the two studies, this presentation will also compare and contrast the methodological and logistical advantages, challenges, and limitations of applying the CV and DaLA methods in developing countries. By reviewing these two valuation methods alongside each other, the authors will outline conditions where they can be applied - individually or jointly - to other early warning systems and delivery contexts.

  2. Spatial scan statistics for detection of multiple clusters with arbitrary shapes.

    PubMed

    Lin, Pei-Sheng; Kung, Yi-Hung; Clayton, Murray

    2016-12-01

    In applying scan statistics for public health research, it would be valuable to develop a detection method for multiple clusters that accommodates spatial correlation and covariate effects in an integrated model. In this article, we connect the concepts of the likelihood ratio (LR) scan statistic and the quasi-likelihood (QL) scan statistic to provide a series of detection procedures sufficiently flexible to apply to clusters of arbitrary shape. First, we use an independent scan model for detection of clusters and then a variogram tool to examine the existence of spatial correlation and regional variation based on residuals of the independent scan model. When the estimate of regional variation is significantly different from zero, a mixed QL estimating equation is developed to estimate coefficients of geographic clusters and covariates. We use the Benjamini-Hochberg procedure (1995) to find a threshold for p-values to address the multiple testing problem. A quasi-deviance criterion is used to regroup the estimated clusters to find geographic clusters with arbitrary shapes. We conduct simulations to compare the performance of the proposed method with other scan statistics. For illustration, the method is applied to enterovirus data from Taiwan. © 2016, The International Biometric Society.
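
    The Benjamini-Hochberg step used above to threshold the cluster p-values is simple enough to state as a short, self-contained sketch (the p-values below are made up for illustration):

    ```python
    # Benjamini-Hochberg step-up procedure at false-discovery rate alpha.
    import numpy as np

    def benjamini_hochberg(pvalues, alpha=0.05):
        """Return a boolean array marking which p-values are rejected."""
        p = np.asarray(pvalues, dtype=float)
        order = np.argsort(p)
        m = p.size
        passed = p[order] <= alpha * np.arange(1, m + 1) / m
        rejected = np.zeros(m, dtype=bool)
        if passed.any():
            k = np.max(np.nonzero(passed))      # largest rank meeting the step-up criterion
            rejected[order[: k + 1]] = True
        return rejected

    pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.3, 0.74]
    print(benjamini_hochberg(pvals, alpha=0.05))
    ```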

  3. Development of method for experimental determination of wheel-rail contact forces and contact point position by using instrumented wheelset

    NASA Astrophysics Data System (ADS)

    Bižić, Milan B.; Petrović, Dragan Z.; Tomić, Miloš C.; Djinović, Zoran V.

    2017-07-01

    This paper presents the development of a unique method for experimental determination of wheel-rail contact forces and contact point position by using the instrumented wheelset (IWS). Solutions of key problems in the development of IWS are proposed, such as the determination of optimal locations, layout, number and way of connecting strain gauges as well as the development of an inverse identification algorithm (IIA). The base for the solution of these problems is the wheel model and results of FEM calculations, while IIA is based on the method of blind source separation using independent component analysis. In the first phase, the developed method was tested on a wheel model and a high accuracy was obtained (deviations of parameters obtained with IIA and really applied parameters in the model are less than 2%). In the second phase, experimental tests on the real object or IWS were carried out. The signal-to-noise ratio was identified as the main influential parameter on the measurement accuracy. The obtained results have shown that the developed method enables measurement of vertical and lateral wheel-rail contact forces Q and Y and their ratio Y/Q with estimated errors of less than 10%, while the estimated measurement error of contact point position is less than 15%. At flange contact and higher values of ratio Y/Q or Y force, the measurement errors are reduced, which is extremely important for the reliability and quality of experimental tests of safety against derailment of railway vehicles according to the standards UIC 518 and EN 14363. The obtained results have shown that the proposed method can be successfully applied in solving the problem of high accuracy measurement of wheel-rail contact forces and contact point position using IWS.

  4. A simple, rapid and sensitive RP-HPLC-UV method for the simultaneous determination of sorafenib & paclitaxel in plasma and pharmaceutical dosage forms: Application to pharmacokinetic study.

    PubMed

    Khan, Ismail; Iqbal, Zafar; Khan, Abad; Hassan, Muhammad; Nasir, Fazle; Raza, Abida; Ahmad, Lateef; Khan, Amjad; Akhlaq Mughal, Muhammad

    2016-10-15

    A simple, economical, fast, and sensitive RP-HPLC-UV method has been developed for the simultaneous quantification of sorafenib and paclitaxel in biological samples and formulations using piroxicam as an internal standard. The experimental conditions were optimized and the method was validated according to the standard guidelines. The separation of both analytes and the internal standard was achieved on a Discovery HS C18 column (250 mm × 4.6 mm, 5 μm) using acetonitrile and TFA (0.025%) in the ratio of 65:35 v/v as the mobile phase in isocratic mode at a flow rate of 1 ml/min, with detection at 245 nm and a column oven temperature of 25 °C, in a short run time of 12 min. The limits of detection (LLOD) were 5 and 10 ng/ml, while the limits of quantification (LLOQ) were 10 and 15 ng/ml for sorafenib and paclitaxel, respectively. Sorafenib, paclitaxel, and piroxicam (IS) were extracted from biological samples using acetonitrile as a precipitating and extraction solvent. The method is linear in the ranges of 15-20,000 ng/ml for paclitaxel and 10-5000 ng/ml for sorafenib, respectively. The method is sensitive and reliable, as indicated by its intra-day and inter-day coefficients of variation. The method was successfully applied to the quantification of the above-mentioned drugs in plasma. The developed method will be applied to sorafenib and paclitaxel pharmacokinetic studies in animal models. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Extension of biomass estimates to pre-assessment periods using density dependent surplus production approach.

    PubMed

    Horbowy, Jan; Tomczak, Maciej T

    2017-01-01

    Biomass reconstructions to pre-assessment periods for commercially important and exploitable fish species are important tools for understanding long-term processes and fluctuations at the stock and ecosystem levels. For some stocks, only fisheries statistics and fishery-dependent data are available for the periods before surveys were conducted. Methods for the backward extension of analytical biomass assessments to years for which only total catch volumes are available were developed and tested in this paper. Two of the approaches developed apply the concept of the surplus production rate (SPR), which is shown to be stock-density dependent if stock dynamics are governed by classical stock-production models. The other approach uses a modified form of the Schaefer production model that allows for backward biomass estimation. The performance of the methods was tested on the Arctic cod and North Sea herring stocks, for which analytical biomass estimates extend back to the late 1940s. Next, the methods were applied to extend biomass estimates of the North-east Atlantic mackerel from the 1970s (when analytical biomass estimates become available) back to the 1950s, for which only total catch volumes were available. For comparison, a method that employs a constant SPR, estimated as an average of the observed values, was also applied. The analyses showed that the performance of the methods is stock and data specific; methods that work well for one stock may fail for others. The constant SPR method is not recommended in cases where the SPR is relatively high and the catch volumes in the reconstructed period are low.
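
    The backward-extension idea based on a Schaefer-type model can be sketched as follows: given next year's assessed biomass and this year's catch, last year's biomass is a root of the quadratic obtained from B[t+1] = B[t] + r*B[t]*(1 - B[t]/K) - C[t]. The growth rate, carrying capacity, catch series and starting biomass below are illustrative values only, not estimates for any of the stocks above.

    ```python
    # Backward iteration of a Schaefer surplus-production model (illustrative values).
    import numpy as np

    r, K = 0.4, 1000.0                       # intrinsic growth rate and carrying capacity (assumed)
    catches = np.array([80.0, 95.0, 110.0, 90.0, 70.0])   # catches in the pre-assessment years
    B_first_assessed = 600.0                 # earliest analytically assessed biomass

    def previous_biomass(B_next, catch):
        """Solve (r/K)*B**2 - (1 + r)*B + (B_next + catch) = 0 for last year's biomass."""
        a, b, c = r / K, -(1 + r), B_next + catch
        disc = b * b - 4 * a * c
        if disc < 0:
            raise ValueError("no real solution: parameters inconsistent with the catch")
        # take the smaller root, i.e. the biomass on the ascending limb of the production curve
        return (-b - np.sqrt(disc)) / (2 * a)

    biomass = [B_first_assessed]
    for catch in reversed(catches):
        biomass.append(previous_biomass(biomass[-1], catch))
    print("reconstructed biomass, oldest year first:", [round(b, 1) for b in reversed(biomass)])
    ```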

  6. Extension of biomass estimates to pre-assessment periods using density dependent surplus production approach

    PubMed Central

    Horbowy, Jan

    2017-01-01

    Biomass reconstructions to pre-assessment periods for commercially important and exploitable fish species are important tools for understanding long-term processes and fluctuations at the stock and ecosystem levels. For some stocks, only fisheries statistics and fishery-dependent data are available for the periods before surveys were conducted. Methods for the backward extension of analytical biomass assessments to years for which only total catch volumes are available were developed and tested in this paper. Two of the approaches developed apply the concept of the surplus production rate (SPR), which is shown to be stock-density dependent if stock dynamics are governed by classical stock-production models. The other approach uses a modified form of the Schaefer production model that allows for backward biomass estimation. The performance of the methods was tested on the Arctic cod and North Sea herring stocks, for which analytical biomass estimates extend back to the late 1940s. Next, the methods were applied to extend biomass estimates of the North-east Atlantic mackerel from the 1970s (when analytical biomass estimates become available) back to the 1950s, for which only total catch volumes were available. For comparison, a method that employs a constant SPR, estimated as an average of the observed values, was also applied. The analyses showed that the performance of the methods is stock and data specific; methods that work well for one stock may fail for others. The constant SPR method is not recommended in cases where the SPR is relatively high and the catch volumes in the reconstructed period are low. PMID:29131850

  7. Non-invasive body temperature measurement of wild chimpanzees using fecal temperature decline.

    PubMed

    Jensen, Siv Aina; Mundry, Roger; Nunn, Charles L; Boesch, Christophe; Leendertz, Fabian H

    2009-04-01

    New methods are required to increase our understanding of pathologic processes in wild mammals. We developed a noninvasive field method to estimate the body temperature of wild-living chimpanzees habituated to humans, based on statistically fitting the temperature decline of feces after defecation. The method was established with the use of control measurements of human rectal temperature and subsequent changes in fecal temperature over time. The method was then applied to temperature data collected from wild chimpanzee feces. In humans, we found good correspondence between the temperature estimated by the method and the actual rectal temperature that was measured (maximum deviation 0.22 °C). The method was successfully applied and the average estimated temperature of the chimpanzees was 37.2 °C. This simple-to-use field method reliably estimates the body temperature of wild chimpanzees and probably also other large mammals.
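
    The statistical fitting step can be illustrated with a short sketch: Newton's law of cooling fitted to the first minutes of (simulated) fecal temperature readings and extrapolated back to the moment of defecation. All readings and the ambient temperature below are made-up numbers, not the study's data, and the exact model used by the authors may differ.

    ```python
    # Fit an exponential cooling curve and extrapolate to t = 0 (defecation time).
    import numpy as np
    from scipy.optimize import curve_fit

    t = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 7.0, 10.0])           # minutes after defecation
    temp = np.array([36.1, 35.2, 34.4, 33.7, 33.1, 32.1, 31.0])  # fecal temperature, deg C
    ambient = 24.0                                                # ambient temperature, deg C

    def cooling(t, T0, k):
        """Newton's law of cooling towards a known ambient temperature."""
        return ambient + (T0 - ambient) * np.exp(-k * t)

    (T0_hat, k_hat), _ = curve_fit(cooling, t, temp, p0=(37.0, 0.1))
    print(f"estimated body temperature at t = 0: {T0_hat:.1f} C (cooling constant {k_hat:.3f}/min)")
    ```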

  8. Algorithm Summary and Evaluation: Automatic Implementation of Ringdown Analysis for Electromechanical Mode Identification from Phasor Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Ning; Huang, Zhenyu; Tuffner, Francis K.

    2010-02-28

    Small signal stability problems are one of the major threats to grid stability and reliability. Prony analysis has been successfully applied to ringdown data to monitor electromechanical modes of a power system using phasor measurement unit (PMU) data. To facilitate an on-line application of mode estimation, this paper develops a recursive algorithm for implementing Prony analysis and proposes an oscillation detection method to detect ringdown data in real time. By automatically detecting ringdown data, the proposed method helps guarantee that Prony analysis is applied properly and in a timely manner to the ringdown data, so that mode estimation can be performed reliably. The proposed method is tested using Monte Carlo simulations based on a 17-machine model and is shown to be able to properly identify the oscillation data for on-line application of Prony analysis. In addition, the proposed method is applied to field measurement data from WECC to show the performance of the proposed algorithm.
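
    A compact sketch of classical batch Prony analysis on a synthetic ringdown signal is given below; it illustrates the kind of mode estimation discussed above rather than the paper's recursive algorithm, and the sample rate, model order and mode parameters are assumed for the example.

    ```python
    # Batch Prony analysis of a synthetic single-mode ringdown signal.
    import numpy as np

    fs, p = 30.0, 2                               # sample rate (Hz) and Prony model order
    t = np.arange(0, 10, 1 / fs)
    signal = np.exp(-0.13 * t) * np.cos(2 * np.pi * 0.4 * t)   # 0.4 Hz mode, damping -0.13 1/s

    # 1) Fit linear-prediction coefficients: y[n] = -(a[0]*y[n-1] + ... + a[p-1]*y[n-p]).
    N = signal.size
    A = np.column_stack([signal[p - 1 - k : N - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, -signal[p:], rcond=None)

    # 2) Discrete poles are the roots of the characteristic polynomial.
    poles = np.roots(np.concatenate(([1.0], a)))

    # 3) Convert to continuous-time damping (1/s) and frequency (Hz).
    s = np.log(poles) * fs
    for sigma, omega in zip(s.real, s.imag):
        if omega > 0:                              # report each conjugate pair once
            print(f"mode: {omega / (2 * np.pi):.3f} Hz, damping {sigma:.3f} 1/s")
    ```

    In practice the model order is chosen larger than the number of expected modes, and weak or spurious modes are discarded after the fit.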

  9. Applications of rule-induction in the derivation of quantitative structure-activity relationships.

    PubMed

    A-Razzak, M; Glen, R C

    1992-08-01

    Recently, methods have been developed in the field of Artificial Intelligence (AI), specifically in the expert systems area using rule-induction, designed to extract rules from data. We have applied these methods to the analysis of molecular series with the objective of generating rules which are predictive and reliable. The input to rule-induction consists of a number of examples with known outcomes (a training set) and the output is a tree-structured series of rules. Unlike most other analysis methods, the results of the analysis are in the form of simple statements which can be easily interpreted. These are readily applied to new data giving both a classification and a probability of correctness. Rule-induction has been applied to in-house generated and published QSAR datasets and the methodology, application and results of these analyses are discussed. The results imply that in some cases it would be advantageous to use rule-induction as a complementary technique in addition to conventional statistical and pattern-recognition methods.
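
    The flavour of rule induction from a training set with known outcomes can be shown with a small, generic decision-tree sketch; the descriptors, the activity rule and the cut-offs below are random stand-ins, not a real QSAR dataset or the expert-system software used in the paper.

    ```python
    # Rule induction on a toy training set using a decision-tree learner.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(7)
    # Hypothetical molecular descriptors: logP and polar surface area (PSA).
    X = np.column_stack([rng.uniform(-1, 6, 80), rng.uniform(10, 140, 80)])
    # Toy outcome: "active" when the compound is lipophilic with modest PSA.
    y = ((X[:, 0] > 2.5) & (X[:, 1] < 90)).astype(int)

    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

    # The fitted model prints as a tree-structured series of simple rules ...
    print(export_text(tree, feature_names=["logP", "PSA"]))

    # ... which can be applied to new data, giving a class and a probability.
    new_compound = [[3.1, 75.0]]
    print("class:", tree.predict(new_compound)[0],
          " probability:", tree.predict_proba(new_compound)[0].max())
    ```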

  10. Applications of rule-induction in the derivation of quantitative structure-activity relationships

    NASA Astrophysics Data System (ADS)

    A-Razzak, Mohammed; Glen, Robert C.

    1992-08-01

    Recently, methods have been developed in the field of Artificial Intelligence (AI), specifically in the expert systems area using rule-induction, designed to extract rules from data. We have applied these methods to the analysis of molecular series with the objective of generating rules which are predictive and reliable. The input to rule-induction consists of a number of examples with known outcomes (a training set) and the output is a tree-structured series of rules. Unlike most other analysis methods, the results of the analysis are in the form of simple statements which can be easily interpreted. These are readily applied to new data giving both a classification and a probability of correctness. Rule-induction has been applied to in-house generated and published QSAR datasets and the methodology, application and results of these analyses are discussed. The results imply that in some cases it would be advantageous to use rule-induction as a complementary technique in addition to conventional statistical and pattern-recognition methods.

  11. Split luciferase complementation assay to detect regulated protein-protein interactions in rice protoplasts in a large-scale format

    PubMed Central

    2014-01-01

    Background: The rice interactome, in which a network of protein-protein interactions has been elucidated in rice, is a useful resource to identify functional modules of rice signal transduction pathways. Protein-protein interactions occur in cells in two ways, constitutive and regulative. While a yeast-based high-throughput method has been widely used to identify the constitutive interactions, a method to detect the regulated interactions is rarely developed for a large-scale analysis. Results: A split luciferase complementation assay was applied to detect the regulated interactions in rice. A transformation method of rice protoplasts in a 96-well plate was first established for a large-scale analysis. In addition, an antibody that specifically recognizes a carboxyl-terminal fragment of Renilla luciferase was newly developed. A pair of antibodies that recognize amino- and carboxyl-terminal fragments of Renilla luciferase, respectively, was then used to monitor the quality and quantity of interacting recombinant proteins accumulated in the cells. For a proof-of-concept, the method was applied to detect the gibberellin-dependent interaction between GIBBERELLIN INSENSITIVE DWARF1 and SLENDER RICE 1. Conclusions: A method to detect regulated protein-protein interactions was developed towards establishment of the rice interactome. PMID:24987490

  12. Monitoring of platinum surface contamination in seven Dutch hospital pharmacies using inductively coupled plasma mass spectrometry

    PubMed Central

    Huitema, A. D. R.; Bakker, E. N.; Douma, J. W.; Schimmel, K. J. M.; van Weringh, G.; de Wolf, P. J.; Schellens, J. H. M.; Beijnen, J. H.

    2007-01-01

    Objective: To develop, validate, and apply a method for the determination of platinum contamination originating from cisplatinum, oxaliplatinum, and carboplatinum. Methods: Inductively coupled plasma mass spectrometry (ICP-MS) was used to determine platinum in wipe samples. The sampling procedure and the analytical conditions were optimised and the assay was validated. The method was applied to measure surface contamination in seven Dutch hospital pharmacies. Results: The developed method allowed reproducible quantification of 0.50 ng/l platinum (5 pg/wipe sample). Recoveries for stainless steel and linoleum surfaces ranged between 50.4 and 81.4% for the different platinum compounds tested. Platinum contamination was reported in 88% of the wipe samples. Although a substantial variation in surface contamination of the pharmacies was noticed, in most pharmacies the laminar-airflow (LAF) hoods, the floor in front of the LAF hoods, door handles, and handles of service hatches showed positive results. This demonstrates that contamination is spread throughout the preparation rooms. Conclusion: We developed and validated an ultrasensitive and reliable ICP-MS method for the determination of platinum in surface samples. Surface contamination with platinum was observed in all hospital pharmacies sampled. The interpretation of these results is, however, complicated. PMID:17377802

  13. Development of a Self-Rated Mixed Methods Skills Assessment: The National Institutes of Health Mixed Methods Research Training Program for the Health Sciences.

    PubMed

    Guetterman, Timothy C; Creswell, John W; Wittink, Marsha; Barg, Fran K; Castro, Felipe G; Dahlberg, Britt; Watkins, Daphne C; Deutsch, Charles; Gallo, Joseph J

    2017-01-01

    Demand for training in mixed methods is high, with little research on faculty development or assessment in mixed methods. We describe the development of a self-rated mixed methods skills assessment and provide validity evidence. The instrument taps six research domains: "Research question," "Design/approach," "Sampling," "Data collection," "Analysis," and "Dissemination." Respondents are asked to rate their ability to define or explain concepts of mixed methods under each domain, their ability to apply the concepts to problems, and the extent to which they need to improve. We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire using the Cronbach alpha to assess reliability and an analysis of variance that compared a mixed methods experience index with assessment scores to assess criterion relatedness. Internal consistency reliability was high for the total set of items (0.95) and adequate (≥0.71) for all but one subscale. Consistent with establishing criterion validity, respondents who had more professional experiences with mixed methods (eg, published a mixed methods article) rated themselves as more skilled, which was statistically significant across the research domains. This self-rated mixed methods assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning.
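
    The abstract reports internal consistency via Cronbach's alpha for blocks of self-rated items. As a reminder of what that statistic computes, here is a minimal sketch using the standard formula; the respondent data are synthetic and the item structure is invented, not taken from the study.

```python
# Minimal sketch of the internal-consistency check described above:
# Cronbach's alpha for a block of self-rated items (data are hypothetical).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix; alpha = k/(k-1) * (1 - sum(var_i)/var_total)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
skill = rng.normal(size=(145, 1))                   # latent self-rated skill
ratings = skill + 0.5 * rng.normal(size=(145, 6))   # six items in one domain
print(round(cronbach_alpha(ratings), 2))
```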

  14. The burden of disease from indoor air pollution in developing countries: comparison of estimates.

    PubMed

    Smith, Kirk R; Mehta, Sumi

    2003-08-01

    Four different methods have been applied to estimate the burden of disease due to indoor air pollution from household solid fuel use in developing countries (LDCs). The largest number of estimates involves applying exposure-response information from urban ambient air pollution studies to estimate indoor exposure concentrations of particulate air pollution. Another approach is to construct child survival curves using the results of large-scale household surveys, as has been done for India. A third approach involves cross-national analyses of child survival and household fuel use. The fourth method, referred to as the 'fuel-based' approach, which is explored in more depth here, involves applying relative risk estimates from epidemiological studies that use exposure surrogates, such as fuel type, to estimates of household solid fuel use to determine population attributable fractions by disease and age group. With this method and conservative assumptions about relative risks, 4-5 percent of the global LDC totals for both deaths and DALYs (disability adjusted life years) from acute respiratory infections, chronic obstructive pulmonary disease, tuberculosis, asthma, lung cancer, ischaemic heart disease, and blindness can be attributed to solid fuel use in developing countries. Acute respiratory infections in children under five years of age are the largest single category of deaths (64%) and DALYs (81%) from indoor air pollution, apparently being responsible globally for about 1.2 million premature deaths annually in the early 1990s.
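
    The 'fuel-based' approach combines an exposure prevalence with a relative risk to obtain a population attributable fraction. The worked example below uses Levin's standard formula with made-up numbers purely for illustration; it does not reproduce the paper's estimates.

```python
# Illustrative calculation of a population attributable fraction (PAF) as used
# in the 'fuel-based' approach; prevalence, relative risk, and the death count
# below are invented numbers, not values from the paper.
def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's formula: PAF = p(RR - 1) / (p(RR - 1) + 1)."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

paf = attributable_fraction(prevalence=0.6, relative_risk=2.3)
attributable_deaths = paf * 2_000_000   # hypothetical disease-specific deaths
print(f"PAF = {paf:.2f}, attributable deaths ~ {attributable_deaths:,.0f}")
```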

  15. [Study on the detection of active ingredient contents of Paecilomyces hepiali mycelium via near infrared spectroscopy].

    PubMed

    Teng, Wei-Zhuo; Song, Jia; Meng, Fan-Xin; Meng, Qing-Fan; Lu, Jia-Hui; Hu, Shuang; Teng, Li-Rong; Wang, Di; Xie, Jing

    2014-10-01

    Partial least squares (PLS) and radial basis function neural network (RBFNN) modelling combined with near infrared spectroscopy (NIR) were applied to develop models for cordycepic acid, polysaccharide and adenosine analysis in Paecilomyces hepiali fermentation mycelium. The developed models possess good generalization and predictive ability and can be applied to the determination of crude drugs and related products. During the experiment, 214 Paecilomyces hepiali mycelium samples were obtained via chemical mutagenesis combined with submerged fermentation. The contents of cordycepic acid, polysaccharide and adenosine were determined via traditional methods, and the near infrared spectroscopy data were collected. Outliers were removed and the composition of the calibration set was confirmed via the Monte Carlo partial least squares (MCPLS) method. Based on the values of the degree of approach (Da), both moving window partial least squares (MWPLS) and moving window radial basis function neural network (MWRBFNN) were applied to optimize the characteristic wavelength variables, the preprocessing methods and other important variables in the models. After comparison, RBFNN, RBFNN and PLS models were developed successfully for cordycepic acid, polysaccharide and adenosine detection, respectively, and the correlations between reference and predicted values in the calibration set (R2c) and validation set (R2p) of the optimum models were 0.9417 and 0.9663, 0.9803 and 0.9850, and 0.9761 and 0.9728, respectively. All the data suggest that these models possess good fitness and predictive ability.
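
    As a generic illustration of the PLS calibration step described above (relating NIR spectra to a reference assay value), here is a minimal sketch on synthetic spectra. It is not the paper's MCPLS/MWPLS workflow, and the sample and wavelength counts are arbitrary.

```python
# Minimal sketch of a PLS calibration of the kind described above, relating
# NIR spectra to a reference assay value; the spectra here are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
spectra = rng.normal(size=(214, 700))            # 214 samples x 700 wavelengths
reference = spectra[:, 100:110].mean(axis=1) + 0.05 * rng.normal(size=214)

X_cal, X_val, y_cal, y_val = train_test_split(spectra, reference, random_state=0)
pls = PLSRegression(n_components=5).fit(X_cal, y_cal)

print("R2 (calibration):", round(pls.score(X_cal, y_cal), 3))
print("R2 (validation):", round(pls.score(X_val, y_val), 3))
```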

  16. DEVELOPMENT OF ANALYTICAL METHODS FOR SPECIFIC LAWN- APPLIED PESTICIDES IN HOUSE DUST

    EPA Science Inventory

    Many pesticides have been developed for residential outdoor application, particularly for lawn care. Residues from these applications may be tracked into the home, where they become incorporated with house dust and persist for long periods of time. Consequently, potential human...

  17. Recent Developments and Applications of the MMPBSA Method

    PubMed Central

    Wang, Changhao; Greene, D'Artagnan; Xiao, Li; Qi, Ruxi; Luo, Ray

    2018-01-01

    The Molecular Mechanics Poisson-Boltzmann Surface Area (MMPBSA) approach has been widely applied as an efficient and reliable free energy simulation method to model molecular recognition, such as for protein-ligand binding interactions. In this review, we focus on recent developments and applications of the MMPBSA method. The methodology review covers solvation terms, the entropy term, extensions to membrane proteins and high-speed screening, and new automation toolkits. Recent applications in various important biomedical and chemical fields are also reviewed. We conclude with a few future directions aimed at making MMPBSA a more robust and efficient method. PMID:29367919
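
    For orientation, the decomposition usually written for an MMPBSA binding free energy is sketched below in a common textbook notation; it is a summary of the standard form (molecular-mechanics energy plus polar and nonpolar solvation terms minus an entropy term), not an equation quoted from this review.

```latex
% Common form of the MMPBSA binding free energy decomposition (standard
% notation; a sketch, not an equation taken from the review itself):
\Delta G_{\mathrm{bind}} \approx
    \langle \Delta E_{\mathrm{MM}} \rangle
  + \langle \Delta G_{\mathrm{PB}} \rangle
  + \langle \Delta G_{\mathrm{SA}} \rangle
  - T\,\Delta S
```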

  18. A Review of User-Centered Design for Diabetes-Related Consumer Health Informatics Technologies

    PubMed Central

    LeRouge, Cynthia; Wickramasinghe, Nilmini

    2013-01-01

    User-centered design (UCD) is well recognized as an effective human factor engineering strategy for designing ease of use in the total customer experience with products and information technology that has been applied specifically to health care information technology systems. We conducted a literature review to analyze the current research regarding the use of UCD methods and principles to support the development or evaluation of diabetes-related consumer health informatics technology (CHIT) initiatives. Findings indicate that (1) UCD activities have been applied across the technology development life cycle stages, (2) there are benefits to incorporating UCD to better inform CHIT development in this area, and (3) the degree of adoption of the UCD process is quite uneven across diabetes CHIT studies. In addition, few to no studies report on methods used across all phases of the life cycle with process detail. To address that void, the Appendix provides an illustrative case study example of UCD techniques across development stages. PMID:23911188

  19. Unfolding and unfoldability of digital pulses in the z-domain

    NASA Astrophysics Data System (ADS)

    Regadío, Alberto; Sánchez-Prieto, Sebastián

    2018-04-01

    The unfolding (or deconvolution) technique is used in the development of digital pulse processing systems applied to particle detection. It is applied to digital signals obtained by digitization of analog signals that represent the combined response of the particle detectors and the associated signal conditioning electronics. This work describes a technique to determine whether a signal is unfoldable. For unfoldable signals, the characteristics of the unfolding system (unfolder) are presented. Finally, examples of the method applied to a real experimental setup are discussed.
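
    To make the idea concrete, here is a minimal sketch of unfolding for the simplest case of a single-pole exponential detector response, where the unfolder is the inverse system in the z-domain. The pulse shape, decay constant and event positions are invented, and this is not the unfoldability criterion developed in the paper.

```python
# Minimal sketch of pulse unfolding (deconvolution) in the z-domain for a
# single-pole exponential detector response; the decay constant is made up.
import numpy as np
from scipy.signal import lfilter

d = np.exp(-1.0 / 50.0)          # pole of the pulse shape h[n] = d**n
impulses = np.zeros(400)
impulses[[40, 120, 125, 300]] = [1.0, 0.7, 0.4, 1.2]   # true particle events

# Folded (measured) signal: impulses filtered by H(z) = 1 / (1 - d z^-1).
measured = lfilter([1.0], [1.0, -d], impulses)

# Unfolder: the inverse system 1 - d z^-1, a short FIR filter that
# recovers the original impulses sample by sample.
unfolded = lfilter([1.0, -d], [1.0], measured)
print(np.allclose(unfolded, impulses))   # True
```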

  20. Facility-specific radiation exposure risks and their implications for radiation workers at Department of Energy laboratories

    NASA Astrophysics Data System (ADS)

    Davis, Adam Christopher

    This research develops a new framework for evaluating the occupational risks of exposure to hazardous substances in any setting where As Low As Reasonably Achievable (ALARA) practices are mandated or used. The evaluation is performed by developing a hypothesis-test-based procedure for evaluating the homogeneity of various epidemiological cohorts, and thus the appropriateness of the application of aggregate data-pooling techniques to those cohorts. A statistical methodology is then developed as an alternative to aggregate pooling for situations in which individual cohorts show heterogeneity between them and are thus unsuitable for pooled analysis. These methods are then applied to estimate the all-cancer mortality risks incurred by workers at four Department-of-Energy nuclear weapons laboratories. Both linear, no-threshold and dose-bin averaged risks are calculated and it is further shown that aggregate analysis tends to overestimate the risks with respect to those calculated by the methods developed in this work. The risk estimates developed in Chapter 2 are, in Chapter 3, applied to assess the risks to workers engaged in americium recovery operations at Los Alamos National Laboratory. The work described in Chapter 3 develops a full radiological protection assessment for the new americium recovery project, including development of exposure cases, creation and modification of MCNP5 models, development of a time-and-motion study, and the final synthesis of all data. This work also develops a new risk-based method of determining whether administrative controls, such as staffing increases, are ALARA-optimized. The EPA's estimate of the value of statistical life is applied to these risk estimates to determine a monetary value for risk. The rate of change of this "risk value" (marginal risk) is then compared with the rate of change of workers' compensations as additional workers are added to the project to reduce the dose (and therefore, presumably, risk) to each individual.

  1. Boundary-integral methods in elasticity and plasticity. [solutions of boundary value problems

    NASA Technical Reports Server (NTRS)

    Mendelson, A.

    1973-01-01

    Recently developed methods that use boundary-integral equations applied to elastic and elastoplastic boundary value problems are reviewed. Direct, indirect, and semidirect methods using potential functions, stress functions, and displacement functions are described. Examples of the use of these methods for torsion problems, plane problems, and three-dimensional problems are given. It is concluded that the boundary-integral methods represent a powerful tool for the solution of elastic and elastoplastic problems.

  2. Objective comparison of particle tracking methods.

    PubMed

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R; Godinez, William J; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E G; Jaldén, Joakim; Blau, Helen M; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P; Dan, Han-Wei; Tsai, Yuh-Show; Ortiz de Solórzano, Carlos; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-03-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Because manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized an open competition in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to notable practical conclusions for users and developers.

  3. A strategy to apply quantitative epistasis analysis on developmental traits.

    PubMed

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

    Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from the phenotypic measurements. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performance comparable to that of methods used in single-cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
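
    For readers unfamiliar with quantitative epistasis scores, the sketch below shows the commonly used multiplicative null model (double-perturbation phenotype compared with the product of the single-perturbation phenotypes). The numbers are invented, and the paper's own statistical procedure is more elaborate than this.

```python
# Minimal sketch of a multiplicative epistasis score on a quantitative
# phenotype (e.g. body length normalized to wild type); values are made up.
def epistasis_score(w_a: float, w_b: float, w_ab: float) -> float:
    """epsilon = W_ab - W_a * W_b under the multiplicative null model."""
    return w_ab - w_a * w_b

w_a = 0.85    # single perturbation a, phenotype relative to wild type
w_b = 0.90    # single perturbation b
w_ab = 0.60   # double perturbation

eps = epistasis_score(w_a, w_b, w_ab)
print(f"epsilon = {eps:.3f}  ({'aggravating' if eps < 0 else 'alleviating'})")
```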

  4. A review of methods for assessment of the rate of gastric emptying in the dog and cat: 1898-2002.

    PubMed

    Wyse, C A; McLellan, J; Dickie, A M; Sutton, D G M; Preston, T; Yam, P S

    2003-01-01

    Gastric emptying is the process by which food is delivered to the small intestine at a rate and in a form that optimizes intestinal absorption of nutrients. The rate of gastric emptying is subject to alteration by physiological, pharmacological, and pathological conditions. Gastric emptying of solids is of greater clinical significance because disordered gastric emptying rarely is detectable in the liquid phase. Imaging techniques have the disadvantage of requiring restraint of the animal and access to expensive equipment. Radiographic methods require administration of test meals that are not similar to food. Scintigraphy is the gold standard method for assessment of gastric emptying but requires administration of a radioisotope. Magnetic resonance imaging has not yet been applied for assessment of gastric emptying in small animals. Ultrasonography is a potentially useful, but subjective, method for assessment of gastric emptying in dogs. Gastric tracer methods require insertion of gastric or intestinal cannulae and are rarely applied outside of the research laboratory. The paracetamol absorption test has been applied for assessment of liquid phase gastric emptying in the dog, but requires IV cannulation. The gastric emptying breath test is a noninvasive method for assessment of gastric emptying that has been applied in dogs and cats. This method can be carried out away from the veterinary hospital, but the effects of physiological and pathological abnormalities on the test are not known. Advances in technology will facilitate the development of reliable methods for assessment of gastric emptying in small animals.

  5. Introducing 3D U-statistic method for separating anomaly from background in exploration geochemical data with associated software development

    NASA Astrophysics Data System (ADS)

    Ghannadpour, Seyyed Saeed; Hezarkhani, Ardeshir

    2016-03-01

    The U-statistic method is one of the most important structural methods for separating an anomaly from the background. It considers the locations of the samples, carries out the statistical analysis of the data without judging from a geochemical point of view, and tries to separate subpopulations and determine anomalous areas. In the present study, to use the U-statistic method under three-dimensional (3D) conditions, the U-statistic is applied to the grades of two ideal test examples while also considering the sample Z values (elevation). This is the first time that this method has been applied in a 3D setting. To evaluate the performance of the 3D U-statistic method and to compare it with a non-structural method, a threshold-assessment method based on the mean and standard deviation (MSD method) is applied to the two test examples. Results show that the samples indicated as anomalous by the U-statistic method are more regular and less dispersed than those indicated by the MSD method, so that, based on the locations of the anomalous samples, their denser clusters can be delineated as promising zones. Moreover, the results show that at a threshold of U = 0, the total misclassification error of the U-statistic method is much smaller than that of the x̄ + n × s criterion. Finally, 3D models of the two test examples, separating anomaly from background using the 3D U-statistic method, are provided. The source code for a software program, developed in the MATLAB programming language to perform the calculations of the 3D U-spatial statistic method, is additionally provided. This software is compatible with all geochemical variables and can be used in similar exploration projects.
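
    For reference, the non-structural MSD criterion quoted above (x̄ + n × s) is simple enough to sketch directly; the code below only illustrates that baseline on synthetic grades and is not an implementation of the spatially weighted 3D U-statistic itself.

```python
# Minimal sketch of the non-structural MSD threshold that the 3D U-statistic
# is compared against: anomalous if grade > mean + n * standard deviation.
import numpy as np

def msd_anomalies(grades: np.ndarray, n: float = 2.0) -> np.ndarray:
    """Boolean mask of samples exceeding x_bar + n * s."""
    threshold = grades.mean() + n * grades.std(ddof=1)
    return grades > threshold

rng = np.random.default_rng(3)
background = rng.lognormal(mean=0.0, sigma=0.4, size=500)   # synthetic grades
print(msd_anomalies(background, n=2.0).sum(), "samples flagged as anomalous")
```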

  6. Evaluation of an automatic brain segmentation method developed for neonates on adult MR brain images

    NASA Astrophysics Data System (ADS)

    Moeskops, Pim; Viergever, Max A.; Benders, Manon J. N. L.; Išgum, Ivana

    2015-03-01

    Automatic brain tissue segmentation is of clinical relevance in images acquired at all ages. The literature presents a clear distinction between methods developed for MR images of infants, and methods developed for images of adults. The aim of this work is to evaluate a method developed for neonatal images in the segmentation of adult images. The evaluated method employs supervised voxel classification in subsequent stages, exploiting spatial and intensity information. Evaluation was performed using images available within the MRBrainS13 challenge. The obtained average Dice coefficients were 85.77% for grey matter, 88.66% for white matter, 81.08% for cerebrospinal fluid, 95.65% for cerebrum, and 96.92% for intracranial cavity, currently resulting in the best overall ranking. The possibility of applying the same method to neonatal as well as adult images can be of great value in cross-sectional studies that include a wide age range.
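
    The evaluation above is reported as Dice coefficients. As a reminder of what that overlap measure computes, here is a minimal sketch on two synthetic binary masks; it is not MRBrainS13 data or the challenge's scoring code.

```python
# Minimal sketch of the Dice overlap used to score the segmentations above;
# the two binary masks here are synthetic, not MRBrainS13 data.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice = 2 |A intersect B| / (|A| + |B|) for boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

reference = np.zeros((64, 64), dtype=bool)
reference[16:48, 16:48] = True
automatic = np.roll(reference, shift=3, axis=0)   # slightly shifted result
print(f"Dice = {100 * dice(reference, automatic):.2f}%")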

  7. DETERMINATION OF PERCHLORATE AT PARTS-PER-BILLION LEVELS IN PLANTS BY ION CHROMATOGRAPHY

    EPA Science Inventory

    A method for the analysis of perchlorate in plants was developed, based on dry weight, and applied to the analysis of plant organs, foodstuffs, and plant products. The method reduced greatly the ionic interferences in water extracts of plant materials. The high background conduct...

  8. METHOD FOR THE DETERMINATION OF PERCHLORATE ANION IN PLANT AND SOLID MATRICES BY ION CHROMATOGRAPHY

    EPA Science Inventory

    A standardized method for the analysis of perchlorate in plants was developed, based on dry weight, and applied to the analysis of plant organs, foodstuffs, and plant products. The procedure greatly reduced the ionic interferences in water extracts of plant materials. Ion chro...

  9. On Systems Thinking and Ways of Building It in Learning

    ERIC Educational Resources Information Center

    Abdyrova, Aitzhan; Galiyev, Temir; Yessekeshova, Maral; Aldabergenova, Saule; Alshynbayeva, Zhuldyz

    2016-01-01

    The article focuses on the issue of shaping learners' systems thinking skills in the context of traditional education using specially elaborated system methods that are implemented based on the standard textbook. Applying these methods naturally complements the existing learning process and contributes to an efficient development of learners'…

  10. ESP 2.0: Improved method for projecting U.S. GHG and air pollution emissions through 2055

    EPA Science Inventory

    The Emission Scenario Projection (ESP) method is used to develop multi-decadal projections of U.S. Greenhouse Gas (GHG) and criteria pollutant emissions. The resulting future-year emissions can then be translated into an emissions inventory and applied in climate and air quality mod...

  11. Asphalt in Pavement Maintenance.

    ERIC Educational Resources Information Center

    Asphalt Inst., College Park, MD.

    Maintenance methods that can be used equally well in all regions of the country have been developed for the use of asphalt in pavement maintenance. Specific information covering methods, equipment and terminology that applies to the use of asphalt in the maintenance of all types of pavement structures, including shoulders, is provided. In many…

  12. Selection of Suitable DNA Extraction Methods for Genetically Modified Maize 3272, and Development and Evaluation of an Event-Specific Quantitative PCR Method for 3272.

    PubMed

    Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi

    2016-01-01

    A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genome DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. However, lowering of DNA extraction yields was not observed with GM quicker or Genomic-tip 20/G. We chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (the ABI 7900) and the Applied Biosystems 7500 (the ABI7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. The trueness and precision were evaluated as the bias and reproducibility of the relative standard deviation (RSDr). The determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method would be suitable and practical for detection and quantification of 3272.
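
    The abstract notes that a conversion factor (Cf) is needed to turn qPCR copy-number measurements into a GMO amount. The sketch below shows the way such a factor is commonly applied (event-to-endogenous copy ratio divided by Cf); the copy numbers are invented and the exact formula used in the cited study may differ in detail.

```python
# Hedged sketch of how a conversion factor (Cf) is typically used to turn a
# qPCR copy-number ratio into a GMO percentage; the copy numbers below are
# invented, and the precise formula in the cited study may differ.
def gmo_percent(event_copies: float, endogenous_copies: float, cf: float) -> float:
    """GMO (%) ~= (event / endogenous copy-number ratio) / Cf * 100."""
    return (event_copies / endogenous_copies) / cf * 100.0

print(round(gmo_percent(event_copies=120.0, endogenous_copies=40_000.0, cf=0.60), 2))
```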

  13. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three-dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method reduces both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes that benefit from this approach through reduced development and design cycle time include creation of analysis models for the aerodynamics discipline, vehicle-to-ground interface development, and documentation development for the vehicle assembly.

  14. Differentiation of tea varieties using UV-Vis spectra and pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Palacios-Morillo, Ana; Alcázar, Ángela.; de Pablos, Fernando; Jurado, José Marcos

    2013-02-01

    Tea, one of the most widely consumed beverages in the world, is of great importance to the economies of a number of countries. Several methods have been developed to classify tea varieties or origins based on pattern recognition techniques applied to chemical data, such as metal profiles, amino acids, catechins and volatile compounds. Some of these analytical methods are tedious and expensive to apply in routine work. The use of UV-Vis spectral data, which are highly influenced by the chemical composition, as discriminant variables can be an alternative to these methods. UV-Vis spectra of methanol-water extracts of tea have been obtained in the interval 250-800 nm, and the absorbances have been used as input variables. Principal component analysis was used to reduce the number of variables, and several pattern recognition methods, such as linear discriminant analysis, support vector machines and artificial neural networks, have been applied in order to differentiate the most common tea varieties. A successful classification model was built by combining principal component analysis and multilayer perceptron artificial neural networks, allowing the differentiation between tea varieties. This rapid and simple methodology can be applied to solve classification problems in the food industry, saving economic resources.
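
    A minimal sketch of the PCA-plus-neural-network pipeline described above is given below, applied to synthetic UV-Vis absorbances; the class structure, number of components and network size are invented and are not the authors' tuned model.

```python
# Minimal sketch of the PCA + multilayer-perceptron classification described
# above, applied to synthetic UV-Vis absorbances; class structure is invented.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n_per_class, n_wavelengths = 30, 551          # 250-800 nm at ~1 nm steps
classes = np.repeat([0, 1, 2], n_per_class)   # e.g. three tea varieties
spectra = rng.normal(size=(classes.size, n_wavelengths)) + classes[:, None] * 0.3

model = make_pipeline(PCA(n_components=5),
                      MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                                    random_state=0))
print("CV accuracy:", cross_val_score(model, spectra, classes, cv=5).mean())
```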

  15. Negotiating a Systems Development Method

    NASA Astrophysics Data System (ADS)

    Karlsson, Fredrik; Hedström, Karin

    Systems development methods (or simply methods) are often applied in a tailored version to fit the actual situation. In most of the existing literature, method tailoring is viewed as either (a) a highly rational process driven by the method engineer, in which project members are passive information providers, or (b) an unstructured process in which the systems developer makes individual choices, a selection process without any driver. The purpose of this chapter is to illustrate that important design decisions during method tailoring are made by project members through negotiation. The study has been carried out using the perspective of actor-network theory. Our narratives depict method tailoring as more complex than (a) and (b): the driver role rotates between the project members, and design decisions are based on influences from several project members. However, these design decisions are not consensus decisions.

  16. Box-Cox transformation for QTL mapping.

    PubMed

    Yang, Runqing; Yi, Nengjun; Xu, Shizhong

    2006-01-01

    The maximum likelihood method of QTL mapping assumes that the phenotypic values of a quantitative trait follow a normal distribution. If the assumption is violated, some forms of transformation should be taken to make the assumption approximately true. The Box-Cox transformation is a general transformation method which can be applied to many different types of data. The flexibility of the Box-Cox transformation is due to a variable, called transformation factor, appearing in the Box-Cox formula. We developed a maximum likelihood method that treats the transformation factor as an unknown parameter, which is estimated from the data simultaneously along with the QTL parameters. The method makes an objective choice of data transformation and thus can be applied to QTL analysis for many different types of data. Simulation studies show that (1) Box-Cox transformation can substantially increase the power of QTL detection; (2) Box-Cox transformation can replace some specialized transformation methods that are commonly used in QTL mapping; and (3) applying the Box-Cox transformation to data already normally distributed does not harm the result.
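
    The sketch below illustrates the Box-Cox transformation itself with a maximum-likelihood choice of the transformation factor, using SciPy on synthetic skewed trait values. It is only the transformation step, not the paper's joint estimation of the factor together with the QTL parameters.

```python
# Minimal sketch of the Box-Cox transformation with a maximum-likelihood
# estimate of the transformation factor lambda (SciPy); data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
phenotype = rng.lognormal(mean=1.0, sigma=0.6, size=300)   # skewed trait values

# y(lambda) = (y**lambda - 1) / lambda  (log(y) when lambda == 0);
# scipy.stats.boxcox picks lambda by maximizing the log-likelihood.
transformed, lam = stats.boxcox(phenotype)
print("estimated lambda:", round(lam, 3))
print("skewness before/after:",
      round(stats.skew(phenotype), 2), round(stats.skew(transformed), 2))
```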

  17. The crowding factor method applied to parafoveal vision

    PubMed Central

    Ghahghaei, Saeideh; Walker, Laura

    2016-01-01

    Crowding increases with eccentricity and is most readily observed in the periphery. During natural, active vision, however, central vision plays an important role. Measures of critical distance to estimate crowding are difficult in central vision, as these distances are small. Any overlap of flankers with the target may create an overlay masking confound. The crowding factor method avoids this issue by simultaneously modulating target size and flanker distance and using a ratio to compare crowded to uncrowded conditions. This method was developed and applied in the periphery (Petrov & Meleshkevich, 2011b). In this work, we apply the method to characterize crowding in parafoveal vision (<3.5 visual degrees) with spatial uncertainty. We find that eccentricity and hemifield have less impact on crowding than in the periphery, yet radial/tangential asymmetries are clearly preserved. There are considerable idiosyncratic differences observed between participants. The crowding factor method provides a powerful tool for examining crowding in central and peripheral vision, which will be useful in future studies that seek to understand visual processing under natural, active viewing conditions. PMID:27690170

  18. Applications of Semiconductor Fabrication Methods to Nanomedicine: A Review of Recent Inventions and Techniques

    PubMed Central

    Rajasekhar, Achanta; Gimi, Barjor; Hu, Walter

    2013-01-01

    We live in a world of convergence where scientific techniques from a variety of seemingly disparate fields are being applied cohesively to the study and solution of biomedical problems. For instance, the semiconductor processing field has been primarily developed to cater to the needs of ever decreasing transistor size and cost while increasing the functionality of electronic circuits. In recent years, pioneers in this field have equipped themselves with a powerful understanding of how the same techniques can be applied in the biomedical field to develop new and efficient systems for the diagnosis, analysis and treatment of various conditions in the human body. In this paper, we review the major inventions and experimental methods which have been developed for nano/micro fluidic channels, nanoparticles fabricated by top-down methods, and in-vivo nanoporous microcages for effective drug delivery. This paper focuses on the information contained in patents as well as the corresponding technical publications. The goal of the paper is to help emerging scientists understand and improve upon these inventions. PMID:24312161

  19. Asthma management simulation for children: translating theory, methods, and strategies to effect behavior change.

    PubMed

    Shegog, Ross; Bartholomew, L Kay; Gold, Robert S; Pierrel, Elaine; Parcel, Guy S; Sockrider, Marianna M; Czyzewski, Danita I; Fernandez, Maria E; Berlin, Nina J; Abramson, Stuart

    2006-01-01

    Translating behavioral theories, models, and strategies to guide the development and structure of computer-based health applications is well recognized, although a continued challenge for program developers. A stepped approach to translate behavioral theory in the design of simulations to teach chronic disease management to children is described. This includes the translation steps to: 1) define target behaviors and their determinants, 2) identify theoretical methods to optimize behavioral change, and 3) choose educational strategies to effectively apply these methods and combine these into a cohesive computer-based simulation for health education. Asthma is used to exemplify a chronic health management problem and a computer-based asthma management simulation (Watch, Discover, Think and Act) that has been evaluated and shown to effect asthma self-management in children is used to exemplify the application of theory to practice. Impact and outcome evaluation studies have indicated the effectiveness of these steps in providing increased rigor and accountability, suggesting their utility for educators and developers seeking to apply simulations to enhance self-management behaviors in patients.

  20. A method of monitoring contact (pointed) welding

    NASA Astrophysics Data System (ADS)

    Bessonov, V. B.; Staroverov, N. E.; Larionov, I. A.; Guk, K. K.; Obodovskiy, A. V.

    2018-02-01

    The technology for welding parts of different thicknesses and of various materials keeps improving, which is why the range of welding types and methods in use is constantly expanding. In this regard, the issue of monitoring welded joints is particularly acute. The goal was to develop a method of non-destructive radiographic inspection of spot welds that rates weld quality with high accuracy.

  1. Estimation of Cellulose Crystallinity of Lignocelluloses Using Near-IR FT-Raman Spectroscopy and Comparison of the Raman and Segal-WAXS Methods

    Treesearch

    Umesh P. Agarwal; Richard R. Reiner; Sally A. Ralph

    2013-01-01

    Of the recently developed univariate and multivariate near-IR FT-Raman methods for estimating cellulose crystallinity, the former method was applied to a variety of lignocelluloses: softwoods, hardwoods, wood pulps, and agricultural residues/fibers. The effect of autofluorescence on the crystallinity estimation was minimized by solvent extraction or chemical treatment...

  2. An electrochemical method for determining hydrogen concentrations in metals and some applications

    NASA Technical Reports Server (NTRS)

    Danford, M. D.

    1983-01-01

    An electrochemical method was developed for the determination of hydrogen in metals using the EG&G-PARC Model 350A Corrosion Measurement Console. The method was applied to hydrogen uptake, both during electrolysis and electroplating, and to studies of hydrogen elimination and the effect of heat treatment on elimination times. Results from these studies are presented.

  3. The Effect of Coordinated Teaching Method Practices on Some Motor Skills of 6-Year-Old Children

    ERIC Educational Resources Information Center

    Altinkok, Mustafa

    2017-01-01

    Purpose: This study was designed to examine the effects of Coordinated Teaching Method activities applied for 10 weeks on 6-year-old children, and to examine the effects of these activities on the development of some motor skills in children. Research Methods: The "Experimental Research Model with Pre-test and Post-test Control Group"…

  4. Applied dendroecology and environmental forensics. Characterizing and age dating environmental releases: fundamentals and case studies

    Treesearch

    Jean-Christophe Balouet; Gil Oudijk; Kevin T. Smith; Ioana Petrisor; Hakan Grudd; Bengt Stocklassa

    2007-01-01

    Dendroecology, or the use of ring patterns to assess the age of trees and environmental factors controlling their growth, is a well-developed method in climatologic studies. This method holds great potential as a forensic tool for age dating, contamination assessment, and characterization of releases. Moreover, the method is independent of the physical presence of...

  5. The Effect of Schooling and Ability on Achievement Test Scores. NBER Working Paper Series.

    ERIC Educational Resources Information Center

    Hansen, Karsten; Heckman, James J.; Mullen, Kathleen J.

    This study developed two methods for estimating the effect of schooling on achievement test scores that control for the endogeneity of schooling by postulating that both schooling and test scores are generated by a common unobserved latent ability. The methods were applied to data on schooling and test scores. Estimates from the two methods are in…

  6. Modal analysis applied to circular, rectangular, and coaxial waveguides

    NASA Technical Reports Server (NTRS)

    Hoppe, D. J.

    1988-01-01

    Recent developments in the analysis of various waveguide components and feedhorns using Modal Analysis (Mode Matching Method) are summarized. A brief description of the theory is presented, and the important features of the method are pointed out. Specific examples in circular, rectangular, and coaxial waveguides are included, with comparisons between the theory and experimental measurements. Extensions to the methods are described.

  7. Analytical method development for the determination of emerging contaminants in water using supercritical-fluid chromatography coupled with diode-array detection.

    PubMed

    Del Carmen Salvatierra-Stamp, Vilma; Ceballos-Magaña, Silvia G; Gonzalez, Jorge; Ibarra-Galván, Valentin; Muñiz-Valencia, Roberto

    2015-05-01

    An analytical method using supercritical-fluid chromatography coupled with diode-array detection for the determination of seven emerging contaminants, namely two pharmaceuticals (carbamazepine and glyburide), three endocrine disruptors (17α-ethinyl estradiol, bisphenol A, and 17β-estradiol), one bactericide (triclosan), and one pesticide (diuron), was developed and validated. These contaminants were chosen because of their frequency of use and their toxic effects on both humans and the environment. The optimized chromatographic separation on a Viridis BEH 2-EP column achieved baseline resolution for all compounds in less than 10 min. This separation was applied to environmental water samples after sample preparation. The optimized sample treatment involved a preconcentration step by means of solid-phase extraction using C18-OH cartridges. The proposed method was validated, finding recoveries higher than 94% and limits of detection and limits of quantification in the range of 0.10-1.59 μg L⁻¹ and 0.31-4.83 μg L⁻¹, respectively. Method validation established the proposed method to be selective, linear, accurate, and precise. Finally, the method was successfully applied to environmental water samples.

  8. A modular modulation method for achieving increases in metabolite production.

    PubMed

    Acerenza, Luis; Monzon, Pablo; Ortega, Fernando

    2015-01-01

    Increasing the production of overproducing strains represents a great challenge. Here, we develop a modular modulation method to determine the key steps for genetic manipulation to increase metabolite production. The method consists of three steps: (i) modularization of the metabolic network into two modules connected by linking metabolites, (ii) change in the activity of the modules using auxiliary rates producing or consuming the linking metabolites in appropriate proportions and (iii) determination of the key modules and steps to increase production. The mathematical formulation of the method in matrix form shows that it may be applied to metabolic networks of any structure and size, with reactions showing any kind of rate laws. The results are valid for any type of conservation relationships in the metabolite concentrations or interactions between modules. The activity of the module may, in principle, be changed by any large factor. The method may be applied recursively or combined with other methods devised to perform fine searches in smaller regions. In practice, it is implemented by integrating to the producer strain heterologous reactions or synthetic pathways producing or consuming the linking metabolites. The new procedure may contribute to develop metabolic engineering into a more systematic practice. © 2015 American Institute of Chemical Engineers.

  9. Development and validation of chemometrics-assisted spectrophotometric and liquid chromatographic methods for the simultaneous determination of two multicomponent mixtures containing bronchodilator drugs.

    PubMed

    El-Gindy, Alaa; Emara, Samy; Shaaban, Heba

    2007-02-19

    Three methods are developed for the determination of two multicomponent mixtures containing guaiphenesine (GU) with salbutamol sulfate (SL), methylparaben (MP) and propylparaben (PP) [mixture 1]; and acephylline piperazine (AC) with bromhexine hydrochloride (BX), methylparaben (MP) and propylparaben (PP) [mixture 2]. The resolution of the two multicomponent mixtures has been accomplished by using numerical spectrophotometric methods such as partial least squares (PLS-1) and principal component regression (PCR) applied to UV absorption spectra of the two mixtures. In addition an HPLC method was developed using a RP 18 column at ambient temperature with mobile phase consisting of acetonitrile-0.05 M potassium dihydrogen phosphate, pH 4.3 (60:40, v/v), with UV detection at 243 nm for mixture 1, and mobile phase consisting of acetonitrile-0.05 M potassium dihydrogen phosphate, pH 3 (50:50, v/v), with UV detection at 245 nm for mixture 2. The methods were validated in terms of accuracy, specificity, precision and linearity in the range of 20-60 μg ml⁻¹ for GU, 1-3 μg ml⁻¹ for SL, 20-80 μg ml⁻¹ for AC, 0.2-1.8 μg ml⁻¹ for PP and 1-5 μg ml⁻¹ for BX and MP. The proposed methods were successfully applied for the determination of the two multicomponent combinations in laboratory prepared mixtures and commercial syrups.

  10. Multilayer ultra thick resist development for MEMS

    NASA Astrophysics Data System (ADS)

    Washio, Yasushi; Senzaki, Takahiro; Masuda, Yasuo; Saito, Koji; Obiya, Hiroyuki

    2005-05-01

    MEMS (Micro-Electro-Mechanical Systems) devices are produced through a process technology called micro-machining. There are two distinct methods of manufacturing a MEMS product. One method is to form a permanent film through photolithography; the other is to form a non-permanent resist film that is removed after photolithography and the subsequent etching or plating process. This three-dimensional ultra-fine processing technology is based on photolithography and is assembled from processes such as anodic bonding and post-lithography steps such as etching and plating. Currently, ORDYL PR-100 (dry film type) is used for the permanent resist process. TOK has also developed TMMR S2000 (liquid type) and TMMF S2000 (dry film type), together with a new process utilizing these resists. The electro-forming method based on photolithography has been developed as one of the methods for enabling high-resolution, high-aspect formation. In recent years, it has become possible to manufacture previously difficult multilayer structures through our joint material and equipment (M&E) development. As a material for electro-forming, chemically amplified resist was confirmed to be optimal from its reaction mechanism, as it is easily removed by the cleaning solution. Moreover, multiple plating formations were enabled with this resist through a new process. On the equipment side, TOK developed an applicator capable of coating films of 500 μm or more, and a developer that achieves high throughput and quality. Detailed plating formations with differing paths, as well as air wiring, are realizable through M&E. From the above results, compared with metallic mold plating, the electro-forming method using resist enables the formation of high-resolution, high-aspect patterns at low cost. It is thought that broad possibilities open up by applying this process.

  11. Applying electric field to charged and polar particles between metallic plates: extension of the Ewald method.

    PubMed

    Takae, Kyohei; Onuki, Akira

    2013-09-28

    We develop an efficient Ewald method of molecular dynamics simulation for calculating the electrostatic interactions among charged and polar particles between parallel metallic plates, where we may apply an electric field with an arbitrary size. We use the fact that the potential from the surface charges is equivalent to the sum of those from image charges and dipoles located outside the cell. We present simulation results on boundary effects of charged and polar fluids, formation of ionic crystals, and formation of dipole chains, where the applied field and the image interaction are crucial. For polar fluids, we find a large deviation of the classical Lorentz-field relation between the local field and the applied field due to pair correlations along the applied field. As general aspects, we clarify the difference between the potential-fixed and the charge-fixed boundary conditions and examine the relationship between the discrete particle description and the continuum electrostatics.

  12. The order and priority of research and design method application within an assistive technology new product development process: a summative content analysis of 20 case studies.

    PubMed

    Torrens, George Edward

    2018-01-01

    Summative content analysis was used to define methods and heuristics from each case study. The review process was in two parts: (1) a literature review to identify conventional research methods, and (2) a summative content analysis of published case studies, based on the identified methods and heuristics, to suggest an order and priority of where and when they were used. Over 200 research and design methods and design heuristics were identified. From the review of the 20 case studies, 42 were identified as being applied. The majority of methods and heuristics were applied in phase two, market choice. There appeared to be a disparity between the limited number of methods frequently used, under 10 within the 20 case studies, and the hundreds available. Implications for Rehabilitation: The communication highlights a number of issues that have implications for those involved in assistive technology new product development: • The study defined over 200 well-established research and design methods and design heuristics that are available for use by those who specify and design assistive technology products, which provides a comprehensive reference list for practitioners in the field; • The review within the study suggests that only a limited number of research and design methods are regularly used by industrial-design-focused assistive technology new product developers; and • Debate is required among practitioners working in this field to reflect on how a wider range of potentially more effective methods and heuristics may be incorporated into daily working practice.

  13. Inversion methods for interpretation of asteroid lightcurves

    NASA Technical Reports Server (NTRS)

    Kaasalainen, Mikko; Lamberg, L.; Lumme, K.

    1992-01-01

    We have developed methods of inversion that can be used in the determination of the three-dimensional shape or the albedo distribution of the surface of a body from disk-integrated photometry, assuming the shape to be strictly convex. In addition to the theory of inversion methods, we have studied the practical aspects of the inversion problem and applied our methods to lightcurve data of 39 Laetitia and 16 Psyche.

  14. Inverse problems in quantum chemistry

    NASA Astrophysics Data System (ADS)

    Karwowski, Jacek

    Inverse problems constitute a branch of applied mathematics with well-developed methodology and formalism. A broad family of tasks met in theoretical physics, in civil and mechanical engineering, as well as in various branches of medical and biological sciences has been formulated as specific implementations of the general theory of inverse problems. In this article, it is pointed out that a number of approaches met in quantum chemistry can (and should) be classified as inverse problems. Consequently, the methodology used in these approaches may be enriched by applying ideas and theorems developed within the general field of inverse problems. Several examples, including the RKR method for the construction of potential energy curves, determining parameter values in semiempirical methods, and finding external potentials for which the pertinent Schrödinger equation is exactly solvable, are discussed in detail.

  15. Understanding Genetic Toxicity Through Data Mining: The ...

    EPA Pesticide Factsheets

    This paper demonstrates the usefulness of representing a chemical by its structural features and the use of these features to profile a battery of tests rather than relying on a single toxicity test of a given chemical. This paper presents data mining/profiling methods applied in a weight-of-evidence approach to assess potential for genetic toxicity, and to guide the development of intelligent testing strategies.

  16. Advances in spatial epidemiology and geographic information systems.

    PubMed

    Kirby, Russell S; Delmelle, Eric; Eberth, Jan M

    2017-01-01

    The field of spatial epidemiology has evolved rapidly in the past 2 decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied research in epidemiology. We highlight technical developments and highlight opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Coupling-parameter expansion in thermodynamic perturbation theory.

    PubMed

    Ramana, A Sai Venkata; Menon, S V G

    2013-02-01

    An approach to the coupling-parameter expansion in the liquid state theory of simple fluids is presented by combining the ideas of thermodynamic perturbation theory and integral equation theories. This hybrid scheme avoids the problems of the latter in the two-phase region. A method to compute the perturbation series to any arbitrary order is developed and applied to square well fluids. Apart from the Helmholtz free energy, the method also gives the radial distribution function and the direct correlation function of the perturbed system. The theory is applied to square well fluids of variable range and compared with simulation data. While the convergence of the perturbation series and the overall performance of the theory are good, improvements are needed for potentials with shorter ranges. Possible directions for further developments in the coupling-parameter expansion are indicated.

  18. Wave propagation simulation in the upper core of sodium-cooled fast reactors using a spectral-element method for heterogeneous media

    NASA Astrophysics Data System (ADS)

    Nagaso, Masaru; Komatitsch, Dimitri; Moysan, Joseph; Lhuillier, Christian

    2018-01-01

    The ASTRID project, a French fourth-generation sodium-cooled nuclear reactor, is currently under development by the Alternative Energies and Atomic Energy Commission (CEA). In this project, the development of monitoring techniques for the reactor during operation is identified as a major issue for improving plant safety. Ultrasonic measurement techniques (e.g. thermometry, visualization of internal objects) are regarded as powerful inspection tools for sodium-cooled fast reactors (SFR), including ASTRID, because liquid sodium is opaque. Inside a sodium cooling circuit, the medium becomes heterogeneous owing to the complex flow state, especially during operation, and the effect of this heterogeneity on acoustic propagation is not negligible. Verification experiments are therefore necessary for developing the component technologies, but such experiments with liquid sodium tend to be relatively large-scale. This is why numerical simulation methods are essential, either to precede real experiments or to supplement the limited number of experimental results. Although various numerical methods have been applied to wave propagation in liquid sodium, no method has yet been verified for three-dimensional heterogeneity. Moreover, the interior of a reactor core is a complex acousto-elastically coupled region, and it has been difficult to simulate such problems with conventional methods. The objective of this study is to address these two points by applying a three-dimensional spectral-element method. In this paper, our initial results from a three-dimensional simulation study on a heterogeneous medium (the first point) are shown. To represent the heterogeneity of the liquid sodium, a four-dimensional temperature field (three spatial dimensions and one temporal dimension) calculated by computational fluid dynamics (CFD) with large-eddy simulation was applied instead of the conventional approach (a Gaussian random field). This three-dimensional numerical experiment allowed us to assess the effects of the heterogeneity of the propagation medium on waves in liquid sodium.

  19. HYPOTHESIS SETTING AND ORDER STATISTIC FOR ROBUST GENOMIC META-ANALYSIS.

    PubMed

    Song, Chi; Tseng, George C

    2014-01-01

    Meta-analysis techniques have been widely developed and applied in genomic applications, especially for combining multiple transcriptomic studies. In this paper, we propose an order statistic of p-values (the rth ordered p-value, rOP) across combined studies as the test statistic. We illustrate different hypothesis settings that detect gene markers differentially expressed (DE) "in all studies", "in the majority of studies", or "in one or more studies", and specify rOP as a suitable method for detecting DE genes "in the majority of studies". We develop methods to estimate the parameter r in rOP for real applications. Statistical properties such as its asymptotic behavior and a one-sided testing correction for detecting markers of concordant expression changes are explored. Power calculations and simulation show better performance of rOP compared to the classical Fisher's method, Stouffer's method, minimum p-value method and maximum p-value method under the focused hypothesis setting. Theoretically, rOP is found to be connected to the naïve vote counting method and can be viewed as a generalized form of vote counting with better statistical properties. The method is applied to three microarray meta-analysis examples including major depressive disorder, brain cancer and diabetes. The results demonstrate rOP as a more generalizable, robust and sensitive statistical framework to detect disease-related markers.
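
    The core of the rOP statistic is that, under the null, the rth smallest of K uniform p-values follows a Beta(r, K - r + 1) distribution, so the meta-analysis p-value is the corresponding beta CDF evaluated at the observed ordered p-value. The sketch below shows only this basic computation with invented gene-level p-values; the paper's estimation of r and its corrections are not reproduced here.

```python
# Minimal sketch of the rOP statistic: take the r-th smallest p-value across
# K studies and evaluate it against its null Beta(r, K - r + 1) distribution.
# Gene-level p-values below are invented.
import numpy as np
from scipy import stats

def rop_pvalue(pvalues, r: int) -> float:
    """Meta-analysis p-value for the r-th ordered p-value across studies."""
    p = np.sort(np.asarray(pvalues))
    k = p.size
    # Under the null, the r-th order statistic of k uniforms ~ Beta(r, k - r + 1).
    return stats.beta.cdf(p[r - 1], r, k - r + 1)

study_pvalues = [0.002, 0.01, 0.04, 0.30, 0.65]    # one gene, five studies
print(round(rop_pvalue(study_pvalues, r=3), 4))    # "DE in the majority of studies"
```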

  20. Practical human abdominal fat imaging utilizing electrical impedance tomography.

    PubMed

    Yamaguchi, T; Maki, K; Katashima, M

    2010-07-01

    The fundamental cause of metabolic syndrome is thought to be abdominal obesity. Accurate diagnosis of abdominal obesity can be done by an x-ray computed tomography (CT) scan. But CT is expensive, bulky and entails the risks involved with radiation. To overcome such disadvantages, we attempted to develop a measuring device that could apply electrical impedance tomography to abdominal fat imaging. The device has 32 electrodes that can be attached to a subject's abdomen by a pneumatic mechanism. That way, electrode position data can be acquired simultaneously. An applied alternating current of 1.0 mA rms was used at a frequency of 500 kHz. Sensed voltage data were carefully filtered to remove noise and processed to satisfy the reciprocal theorem. The image reconstruction software was developed concurrently, applying standard finite element methods and the Marquardt method to solve the mathematical inverse problem. The results of preliminary experiments showed that abdominal subcutaneous fat and the muscle surrounding the viscera could be imaged in humans. While our imaging of visceral fat was not of sufficient quality, it was suggested that we will be able to develop a safe and practical abdominal fat scanner through future improvements.

  1. Exact traveling wave solutions for system of nonlinear evolution equations.

    PubMed

    Khan, Kamruzzaman; Akbar, M Ali; Arnous, Ahmed H

    2016-01-01

    In this work, the recently developed generalized Kudryashov method is applied to the variant Boussinesq equations and the (2 + 1)-dimensional breaking soliton equations. As a result, a range of explicit exact traveling wave solutions is obtained for these equations, which motivates the development, in the near future, of a new approach for obtaining unsteady solutions of autonomous nonlinear evolution equations that arise in mathematical physics and engineering. It is straightforward to extend this method to higher-order nonlinear evolution equations in mathematical physics, and it should be possible to apply the same method to nonlinear evolution equations with more general forms of nonlinearity by utilizing the traveling wave hypothesis.
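
    For orientation, the traveling wave hypothesis mentioned above reduces a partial differential equation to an ordinary one; a generic statement of that reduction (not the specific ansatz of the cited paper) is:

        u(x,t) = U(\xi), \qquad \xi = x - V t, \qquad u_t = -V\,U'(\xi), \qquad u_x = U'(\xi),

    so a nonlinear evolution equation P(u, u_t, u_x, u_{xx}, \dots) = 0 collapses to an ordinary differential equation Q(U, U', U'', \dots) = 0 in the single variable \xi, which expansion techniques such as the generalized Kudryashov method can then solve in closed form.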

  2. Test method development for structural characterization of fiber composites at high temperatures

    NASA Technical Reports Server (NTRS)

    Mandell, J. F.; Grande, D. H.; Edwards, B.

    1985-01-01

    Test methods used for structural characterization of polymer matrix composites can be applied to glass and ceramic matrix composites only at low temperatures. New test methods are required for tensile, compressive, and shear properties of fiber composites at high temperatures. A tensile test which should be useful to at least 1000 C has been developed and used to characterize the properties of a Nicalon/glass composite up to the matrix limiting temperature of 600 C. Longitudinal and transverse unidirectional composite data are presented and discussed.

  3. Comparison of thruster configurations in attitude control systems. M.S. Thesis. Progress Report

    NASA Technical Reports Server (NTRS)

    Boland, J. S., III; Drinkard, D. M., Jr.; White, L. R.; Chakravarthi, K. R.

    1973-01-01

    Several aspects concerning reaction control jet systems as used to govern the attitude of a spacecraft were considered. A thruster configuration currently in use was compared to several new configurations developed in this study. The method of determining the error signals which control the firing of the thrusters was also investigated. The current error determination procedure is explained and a new method is presented. Both of these procedures are applied to each of the thruster configurations which are developed and comparisons of the two methods are made.

  4. Aircraft Dynamic Modeling in Turbulence

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.; Cunningham, Kevin

    2012-01-01

    A method for accurately identifying aircraft dynamic models in turbulence was developed and demonstrated. The method uses orthogonal optimized multisine excitation inputs and an analytic method for enhancing signal-to-noise ratio for dynamic modeling in turbulence. A turbulence metric was developed to accurately characterize the turbulence level using flight measurements. The modeling technique was demonstrated in simulation, then applied to a subscale twin-engine jet transport aircraft in flight. Comparisons of modeling results obtained in turbulent air to results obtained in smooth air were used to demonstrate the effectiveness of the approach.
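
    A rough illustration of the orthogonal multisine idea (not the authors' optimization) is sketched below: each control input is assigned its own, non-overlapping set of harmonics of a common base frequency so the inputs remain mutually orthogonal over the record length; the Schroeder phase rule is used here only as a simple stand-in for the relative-peak-factor optimization used in flight testing.

        import numpy as np

        def multisine(harmonics, T, fs):
            """Sum of cosines at the given harmonic indices of the base frequency 1/T,
            with Schroeder phases to keep the peak factor low."""
            t = np.arange(0.0, T, 1.0 / fs)
            n = len(harmonics)
            u = np.zeros_like(t)
            for j, k in enumerate(harmonics):
                phase = -np.pi * j * (j + 1) / n        # Schroeder phase schedule
                u += np.cos(2.0 * np.pi * k * t / T + phase)
            return t, u / np.max(np.abs(u))             # normalize amplitude

        # Interleaved, non-overlapping harmonics for three control inputs,
        # orthogonal over the record length T.
        T, fs = 20.0, 50.0
        t, elevator = multisine(range(1, 20, 3), T, fs)
        _, aileron = multisine(range(2, 20, 3), T, fs)
        _, rudder = multisine(range(3, 21, 3), T, fs)
        print(np.dot(elevator, aileron) / len(t))       # ~0: inputs are orthogonal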

  5. Program for fundamental and applied research of fuel cells in VNIIEF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anisin, A.V.; Borisseonock, V.A.; Novitskii, Y.Z.

    1996-04-01

    According to VNIIEF, fundamental and applied research is an integral part of the development of fuel cell power plants. This paper describes areas of research on molten carbonate fuel cells. Topics include the development of mathematical models for porous electrodes, thin-film electrolytes, the possibility of solid nickel anodes, a model of anode activation polarization, and electrolytes with high oxygen solubility. Other areas include research on the stationary mode of stack operation, anticorrosion coatings, impedance diagnostic methods, ultrasound diagnostics, radiation treatments, an air-aluminium cell, and alternative catalysts for low-temperature fuel cells.

  6. Surveillance theory applied to virus detection: a case for targeted discovery

    USGS Publications Warehouse

    Bogich, Tiffany L.; Anthony, Simon J.; Nichols, James D.

    2013-01-01

    Virus detection and mathematical modeling have gone through rapid developments in the past decade. Both offer new insights into the epidemiology of infectious disease and characterization of future risk; however, modeling has not yet been applied to designing the best surveillance strategies for viral and pathogen discovery. We review recent developments and propose methods to integrate viral and pathogen discovery and mathematical modeling through optimal surveillance theory, arguing for a more targeted approach to novel virus detection guided by the principles of adaptive management and structured decision-making.

  7. Goal-based angular adaptivity applied to a wavelet-based discretisation of the neutral particle transport equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goffin, Mark A., E-mail: mark.a.goffin@gmail.com; Buchan, Andrew G.; Dargaville, Steven

    2015-01-15

    A method for applying goal-based adaptive methods to the angular resolution of the neutral particle transport equation is presented. The methods are applied to an octahedral wavelet discretisation of the spherical angular domain which allows for anisotropic resolution. The angular resolution is adapted across both the spatial and energy dimensions. The spatial domain is discretised using an inner-element sub-grid scale finite element method. The goal-based adaptive methods optimise the angular discretisation to minimise the error in a specific functional of the solution. The goal-based error estimators require the solution of an adjoint system to determine the importance to the specified functional. The error estimators and the novel methods to calculate them are described. Several examples are presented to demonstrate the effectiveness of the methods. It is shown that the methods can significantly reduce the number of unknowns and computational time required to obtain a given error. The novelty of the work is the use of goal-based adaptive methods to obtain anisotropic resolution in the angular domain for solving the transport equation. -- Highlights: •Wavelet angular discretisation used to solve transport equation. •Adaptive method developed for the wavelet discretisation. •Anisotropic angular resolution demonstrated through the adaptive method. •Adaptive method provides improvements in computational efficiency.

  8. Entrepreneurial Education at University Level and Entrepreneurship Development

    ERIC Educational Resources Information Center

    Hasan, Sk. Mahmudul; Khan, Eijaz Ahmed; Nabi, Md. Noor Un

    2017-01-01

    Purpose: The purpose of this paper is to contribute to the literature on effectiveness of entrepreneurship education by empirically assessing the role of university entrepreneurial education in entrepreneurship development and reporting the results. Design/methodology/approach: A quantitative method was applied for this study. This research was…

  9. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    NASA Technical Reports Server (NTRS)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

    In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general, data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.
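
    A minimal sketch of the kind of comparison described, with synthetic data standing in for FOQA-derived parameters (the scikit-learn estimators named below are assumptions for illustration, not the tools used in the study):

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.neural_network import MLPRegressor
        from sklearn.tree import DecisionTreeRegressor
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for FOQA predictors (altitude, speed, weight, ...).
        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 5))
        fuel_flow = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.8 * X[:, 2] ** 2 + 0.3 * rng.normal(size=2000)

        X_tr, X_te, y_tr, y_te = train_test_split(X, fuel_flow, random_state=0)
        models = {
            "multiple regression": LinearRegression(),
            "regression tree": DecisionTreeRegressor(max_depth=8, random_state=0),
            "MLP neural network": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
        }
        for name, m in models.items():
            m.fit(X_tr, y_tr)
            print(name, "R^2 =", round(m.score(X_te, y_te), 3))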

  10. Numerical Analysis of Effectiveness of Strengthening Concrete Slab in Tension of the Steel-Concrete Composite Beam Using Pretensioned CFRP Strips

    NASA Astrophysics Data System (ADS)

    Jankowiak, Iwona; Madaj, Arkadiusz

    2017-12-01

    One of the methods of increasing the load-carrying capacity of a reinforced concrete (RC) structure is strengthening it with carbon fiber (CFRP) strips. There are two methods of strengthening with CFRP strips: a passive method and an active method. In the passive method a strip is applied to the concrete surface without initial strain, whereas in the active method the strip is pretensioned before its application. In the case of a steel-concrete composite beam, strips may be used to strengthen the concrete slab located in the tension zone (in the parts of beams with negative bending moments). A finite element model has been developed and validated by experimental tests to evaluate the strengthening efficiency of a composite girder with pretensioned CFRP strips applied to the concrete slab in its tension zone.

  11. Comparative study between derivative spectrophotometry and multivariate calibration as analytical tools applied for the simultaneous quantitation of Amlodipine, Valsartan and Hydrochlorothiazide.

    PubMed

    Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A

    2013-09-01

    Four simple, accurate and specific methods were developed and validated for the simultaneous estimation of Amlodipine (AML), Valsartan (VAL) and Hydrochlorothiazide (HCT) in commercial tablets. The derivative spectrophotometric methods include Derivative Ratio Zero Crossing (DRZC) and Double Divisor Ratio Spectra-Derivative Spectrophotometry (DDRS-DS) methods, while the multivariate calibrations used are Principal Component Regression (PCR) and Partial Least Squares (PLS). The proposed methods were applied successfully in the determination of the drugs in laboratory-prepared mixtures and in commercial pharmaceutical preparations. The validity of the proposed methods was assessed using the standard addition technique. The linearity of the proposed methods was investigated in the ranges of 2-32, 4-44 and 2-20 μg/mL for AML, VAL and HCT, respectively. Copyright © 2013 Elsevier B.V. All rights reserved.
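
    A small sketch of the multivariate calibration step (PCR and PLS) on invented spectra; the matrix sizes, wavelengths and concentrations are placeholders and do not reproduce the validated methods:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.pipeline import make_pipeline

        # Hypothetical calibration set: absorbance spectra of mixtures (rows) and the
        # corresponding concentrations of the three analytes (AML, VAL, HCT).
        rng = np.random.default_rng(1)
        pure = rng.random((3, 120))                  # invented pure-component spectra
        conc = rng.uniform(2.0, 40.0, size=(25, 3))  # invented calibration design
        spectra = conc @ pure + 0.01 * rng.normal(size=(25, 120))

        pcr = make_pipeline(PCA(n_components=3), LinearRegression()).fit(spectra, conc)
        pls = PLSRegression(n_components=3).fit(spectra, conc)

        unknown = np.array([[10.0, 20.0, 8.0]]) @ pure   # a laboratory-prepared mixture
        print("PCR:", pcr.predict(unknown).round(1))
        print("PLS:", pls.predict(unknown).round(1))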

  12. Developing Language Learning Textbooks Enriched with Sense of Literacy: The Case of Junior High School in Indonesia

    ERIC Educational Resources Information Center

    Sodiq, Syamsul

    2015-01-01

    This research is aimed at developing an Indonesian course-books integrated with the materials for life skill education (LSE). It can support effective learning through literacy models and results qualified book on Indonesian language learning. By applying Fenrich's method on development model (1997) include five phases of analysis, planning,…

  13. Value Forming Education of Prospective Primary School Teachers in Kazakhstan and Germany

    ERIC Educational Resources Information Center

    Utyupova, Gulnara Ye.; Baiseitova, Zhanar B.; Mukhamadiyeva, Aizhan A.

    2016-01-01

    Value education is one of the most effective forms of education. However, this system is applied only in developed countries due to a number of factors. The purpose of this study is to develop a method for training primary school teachers capable of implementing the value education system in developing countries. Teachers not only conveys…

  14. Implementing the Zone of Proximal Development: From the Pedagogical Experiment to the Developmental Education System of Leonid Zankov

    ERIC Educational Resources Information Center

    Guseva, Liudmila G.; Solomonovich, Mark

    2017-01-01

    This article overviews the theoretical and applied works of the psychologist and pedagogue Leonid Zankov. Zankov's model of teaching is based on Vygotsky's theory that appropriate teaching methods stimulate cognitive development, whose core notion is the Zone of Proximal Development. This educational psychology research was verified by large scale…

  15. Development of single chain variable fragment (scFv) antibodies against surface proteins of ‘Ca. Liberibacter asiaticus’

    USDA-ARS?s Scientific Manuscript database

    ‘Ca. Liberibacter asiaticus’ is the causal agent of citrus huanglongbing, the most serious disease of citrus worldwide. We have developed and applied immunization and affinity screening methods to develop a primary library of recombinant single chain variable fragment (scFv) antibodies in an M13 vec...

  16. A numerical homogenization method for heterogeneous, anisotropic elastic media based on multiscale theory

    DOE PAGES

    Gao, Kai; Chung, Eric T.; Gibson, Richard L.; ...

    2015-06-05

    The development of reliable methods for upscaling fine scale models of elastic media has long been an important topic for rock physics and applied seismology. Several effective medium theories have been developed to provide elastic parameters for materials such as finely layered media or randomly oriented or aligned fractures. In such cases, the analytic solutions for upscaled properties can be used for accurate prediction of wave propagation. However, such theories cannot be applied directly to homogenize elastic media with more complex, arbitrary spatial heterogeneity. We therefore propose a numerical homogenization algorithm based on multiscale finite element methods for simulating elastic wave propagation in heterogeneous, anisotropic elastic media. Specifically, our method used multiscale basis functions obtained from a local linear elasticity problem with appropriately defined boundary conditions. Homogenized, effective medium parameters were then computed using these basis functions, and the approach applied a numerical discretization that is similar to the rotated staggered-grid finite difference scheme. Comparisons of the results from our method and from conventional, analytical approaches for finely layered media showed that the homogenization reliably estimated elastic parameters for this simple geometry. Additional tests examined anisotropic models with arbitrary spatial heterogeneity where the average size of the heterogeneities ranged from several centimeters to several meters, and the ratio between the dominant wavelength and the average size of the arbitrary heterogeneities ranged from 10 to 100. Comparisons to finite-difference simulations proved that the numerical homogenization was equally accurate for these complex cases.

  17. Topical dissolved oxygen penetrates skin: model and method.

    PubMed

    Roe, David F; Gibbins, Bruce L; Ladizinsky, Daniel A

    2010-03-01

    It has been commonly perceived that skin receives its oxygen supply from the internal circulation. However, recent investigations have shown that a significant amount of oxygen may enter skin from the external overlying surface. A method has been developed for measuring the transcutaneous penetration of human skin by oxygen, as described herein. This method was used to determine both the depth and magnitude of penetration of skin by topically applied oxygen. The apparatus consisted of human skin samples interposed between a topical oxygen source and a fluid-filled chamber that registered changes in dissolved oxygen. Viable human skin samples of variable thicknesses, with and without epidermis, were used to evaluate the depth and magnitude of oxygen penetration from either topical dissolved oxygen (TDO) or topical gaseous oxygen (TGO) devices. This model effectively demonstrates transcutaneous penetration of topically applied oxygen. Topically applied dissolved oxygen penetrates through >700 μm of human skin. Topically applied oxygen penetrates better through dermis than epidermis, and TDO devices deliver oxygen more effectively than TGO devices. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  18. A binary linear programming formulation of the graph edit distance.

    PubMed

    Justice, Derek; Hero, Alfred

    2006-08-01

    A binary linear programming formulation of the graph edit distance for unweighted, undirected graphs with vertex attributes is derived and applied to a graph recognition problem. A general formulation for editing graphs is used to derive a graph edit distance that is proven to be a metric, provided the cost function for individual edit operations is a metric. Then, a binary linear program is developed for computing this graph edit distance, and polynomial time methods for determining upper and lower bounds on the solution of the binary program are derived by applying solution methods for standard linear programming and the assignment problem. A recognition problem of comparing a sample input graph to a database of known prototype graphs in the context of a chemical information system is presented as an application of the new method. The costs associated with various edit operations are chosen by using a minimum normalized variance criterion applied to pairwise distances between nearest neighbors in the database of prototypes. The new metric is shown to perform quite well in comparison to existing metrics when applied to a database of chemical graphs.
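
    As a toy illustration of one bounding idea mentioned above (reducing part of the problem to an assignment problem), the sketch below pairs the vertices of two small labelled graphs with the Hungarian algorithm; the graphs and costs are invented, and this is a bound-style heuristic, not the paper's binary linear program:

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        def assignment_edit_bound(labels_a, labels_b, sub_cost=1.0, indel_cost=1.0):
            """Bound-style estimate of edit cost between two vertex-labelled graphs,
            obtained by optimally assigning vertices of A to vertices of B; the smaller
            graph is padded with dummy vertices so insertions/deletions are represented."""
            n = max(len(labels_a), len(labels_b))
            cost = np.full((n, n), indel_cost)
            for i, la in enumerate(labels_a):
                for j, lb in enumerate(labels_b):
                    cost[i, j] = 0.0 if la == lb else sub_cost
            rows, cols = linear_sum_assignment(cost)
            return cost[rows, cols].sum()

        # Two small "chemical" graphs described only by their atom labels.
        print(assignment_edit_bound(["C", "C", "O", "N"], ["C", "O", "O"]))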

  19. Pathfinder: applying graph theory to consistent tracking of daytime mixed layer height with backscatter lidar

    NASA Astrophysics Data System (ADS)

    de Bruine, Marco; Apituley, Arnoud; Donovan, David Patrick; Klein Baltink, Hendrik; Jorrit de Haij, Marijn

    2017-05-01

    The height of the atmospheric boundary layer or mixing layer is an important parameter for understanding the dynamics of the atmosphere and the dispersion of trace gases and air pollution. The height of the mixing layer (MLH) can be retrieved, among other methods, from lidar or ceilometer backscatter data. These instruments use the vertical backscatter lidar signal to infer MLHL, which is feasible because the main sources of aerosols are situated at the surface and vertical gradients are expected to go from the aerosol-loaded mixing layer close to the ground to the cleaner free atmosphere above. Various lidar/ceilometer algorithms are currently applied, but the temporal development of the MLH is not always handled well. As a result, MLHL retrievals may jump between different atmospheric layers, rather than reliably track true MLH development over time. This hampers the usefulness of MLHL time series, e.g. for process studies, model validation/verification and climatology. Here, we introduce a new method, pathfinder, which applies graph theory to simultaneously evaluate time frames that are consistent with the scales of MLH dynamics, leading to coherent tracking of the MLH. Starting from a grid of gradients in the backscatter profiles, MLH development is followed using Dijkstra's shortest path algorithm (Dijkstra, 1959). Locations of strong gradients are connected under the condition that subsequent points on the path are limited to a restricted vertical range. The search is further guided by rules based on the presence of clouds and residual layers. When applied to backscatter lidar data from Cabauw, the method shows excellent agreement with wind profiler retrievals for a 12-day period in 2008 (R² = 0.90) and with visual judgment of lidar data during a full year in 2010 (R² = 0.96). These values compare favourably to other MLHL methods applied to the same lidar data set and corroborate more consistent MLH tracking by pathfinder.
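
    The core idea, tracking the mixing-layer top as a lowest-cost path through a time-height grid of backscatter-gradient strengths, can be sketched as follows (hypothetical cost definition and grid; the real pathfinder algorithm adds cloud and residual-layer rules that are omitted here):

        import heapq
        import numpy as np

        def track_layer(gradient, max_jump=2):
            """Dijkstra's shortest path through a (time x height) grid. Strong negative
            backscatter gradients mark the mixing-layer top, so step cost is low where
            the gradient magnitude is high; consecutive points of the path may differ by
            at most `max_jump` height bins (restricted vertical range)."""
            n_t, n_z = gradient.shape
            cost = 1.0 / (np.abs(gradient) + 1e-6)      # cheap where the gradient is strong
            best = {(0, z): cost[0, z] for z in range(n_z)}
            queue = [(cost[0, z], 0, z, (z,)) for z in range(n_z)]
            heapq.heapify(queue)
            while queue:
                c, t, z, path = heapq.heappop(queue)
                if t == n_t - 1:
                    return np.array(path)               # height index per time step
                for dz in range(-max_jump, max_jump + 1):
                    nz = z + dz
                    if 0 <= nz < n_z:
                        nc = c + cost[t + 1, nz]
                        if nc < best.get((t + 1, nz), np.inf):
                            best[(t + 1, nz)] = nc
                            heapq.heappush(queue, (nc, t + 1, nz, path + (nz,)))

        # toy grid: a gradient ridge that rises by one bin per time step
        grid = np.zeros((10, 30))
        for step in range(10):
            grid[step, 5 + step] = -5.0
        print(track_layer(grid))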

  20. Comparison of different wind data interpolation methods for a region with complex terrain in Central Asia

    NASA Astrophysics Data System (ADS)

    Reinhardt, Katja; Samimi, Cyrus

    2018-01-01

    While climatological data of high spatial resolution are largely available in most developed countries, the network of climatological stations in many other regions of the world still has large gaps. Especially for those regions, interpolation methods are important tools to fill these gaps and to improve the data base that is indispensable for climatological research. Over the last years, new hybrid methods of machine learning and geostatistics have been developed which provide innovative prospects in spatial predictive modelling. This study focuses on evaluating the performance of 12 different interpolation methods for the wind components u and v in a mountainous region of Central Asia, with a special focus on applying new hybrid methods to the spatial interpolation of wind data. This study is the first to evaluate and compare the performance of several of these hybrid methods. The overall aim is to determine whether an optimal interpolation method exists that can equally be applied to all pressure levels, or whether different interpolation methods have to be used for the different pressure levels. Deterministic (inverse distance weighting) and geostatistical interpolation methods (ordinary kriging) were explored, which take into account only the initial values of u and v. In addition, more complex methods (generalized additive model, support vector machine and neural networks, as single methods and as hybrid methods, as well as regression-kriging) that consider additional variables were applied. The analysis of the error indices revealed that regression-kriging provided the most accurate interpolation results for both wind components and all pressure heights. At 200 and 500 hPa, regression-kriging is followed by the different kinds of neural networks and support vector machines, and for 850 hPa it is followed by the different types of support vector machine and ordinary kriging. Overall, explanatory variables improve the interpolation results.
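
    For reference, the simplest of the deterministic methods named above, inverse distance weighting, can be written in a few lines (a generic textbook formulation; the station coordinates and power parameter are arbitrary, not those of the study):

        import numpy as np

        def idw(stations, values, targets, power=2.0):
            """Inverse distance weighting: each target point receives a weighted mean of
            the station values, with weights proportional to 1/distance**power."""
            stations, values, targets = map(np.asarray, (stations, values, targets))
            d = np.linalg.norm(targets[:, None, :] - stations[None, :, :], axis=2)
            d = np.maximum(d, 1e-12)            # avoid division by zero at a station
            w = 1.0 / d ** power
            return (w * values).sum(axis=1) / w.sum(axis=1)

        # toy example: the u wind component at three stations, interpolated to two points
        stations = [[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]]
        u_component = [3.2, -1.5, 0.7]
        print(idw(stations, u_component, [[2.0, 2.0], [8.0, 1.0]]))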

  1. Resolution of overlapped spectra for the determination of ternary mixture using different and modified spectrophotometric methods

    NASA Astrophysics Data System (ADS)

    Moussa, Bahia Abbas; El-Zaher, Asmaa Ahmed; Mahrouse, Marianne Alphonse; Ahmed, Maha Said

    2016-08-01

    Four new spectrophotometric methods were developed and applied to resolve the overlapped spectra of a ternary mixture of [aliskiren hemifumarate (ALS)-amlodipine besylate (AM)-hydrochlorothiazide (HCT)] and to determine the three drugs in pure form and in combined dosage form. Method A depends on simultaneous determination of ALS, AM and HCT using principal component regression and partial least squares chemometric methods. In Method B, a modified isosbestic spectrophotometric method was applied for the determination of the total concentration of ALS and HCT by measuring the absorbance at 274.5 nm (isosbestic point, Aiso). On the other hand, the concentration of HCT in ternary mixture with ALS and AM could be calculated without interference using a first derivative spectrophotometric method by measuring the amplitude at 279 nm (zero crossing of ALS and zero value of AM). Thus, the content of ALS was calculated by subtraction. Method C, double divisor first derivative ratio spectrophotometry (double divisor 1DD method), was based on the principle that, for the determination of one drug, the ratio spectra are obtained by dividing the absorption spectra of its different concentrations by the sum of the absorption spectra of the other two drugs as a double divisor. The first derivative of the obtained ratio spectra was then recorded using the appropriate smoothing factor. The amplitudes at 291 nm, 380 nm and 274.5 nm were selected for the determination of ALS, AM and HCT in their ternary mixture, respectively. Method D was based on mean centering of ratio spectra. The mean centered values at 287, 295.5 and 269 nm were recorded and used for the determination of ALS, AM and HCT, respectively. The developed methods were validated according to ICH guidelines and proved to be accurate, precise and selective. Satisfactory results were obtained by applying the proposed methods to the analysis of the pharmaceutical dosage form.

  2. Resolution of overlapped spectra for the determination of ternary mixture using different and modified spectrophotometric methods.

    PubMed

    Moussa, Bahia Abbas; El-Zaher, Asmaa Ahmed; Mahrouse, Marianne Alphonse; Ahmed, Maha Said

    2016-08-05

    Four new spectrophotometric methods were developed and applied to resolve the overlapped spectra of a ternary mixture of [aliskiren hemifumarate (ALS)-amlodipine besylate (AM)-hydrochlorothiazide (HCT)] and to determine the three drugs in pure form and in combined dosage form. Method A depends on simultaneous determination of ALS, AM and HCT using principal component regression and partial least squares chemometric methods. In Method B, a modified isosbestic spectrophotometric method was applied for the determination of the total concentration of ALS and HCT by measuring the absorbance at 274.5 nm (isosbestic point, Aiso). On the other hand, the concentration of HCT in ternary mixture with ALS and AM could be calculated without interference using a first derivative spectrophotometric method by measuring the amplitude at 279 nm (zero crossing of ALS and zero value of AM). Thus, the content of ALS was calculated by subtraction. Method C, double divisor first derivative ratio spectrophotometry (double divisor 1DD method), was based on the principle that, for the determination of one drug, the ratio spectra are obtained by dividing the absorption spectra of its different concentrations by the sum of the absorption spectra of the other two drugs as a double divisor. The first derivative of the obtained ratio spectra was then recorded using the appropriate smoothing factor. The amplitudes at 291 nm, 380 nm and 274.5 nm were selected for the determination of ALS, AM and HCT in their ternary mixture, respectively. Method D was based on mean centering of ratio spectra. The mean centered values at 287, 295.5 and 269 nm were recorded and used for the determination of ALS, AM and HCT, respectively. The developed methods were validated according to ICH guidelines and proved to be accurate, precise and selective. Satisfactory results were obtained by applying the proposed methods to the analysis of the pharmaceutical dosage form. Copyright © 2016. Published by Elsevier B.V.

  3. An electromagnetic induction method for underground target detection and characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartel, L.C.; Cress, D.H.

    1997-01-01

    An improved capability for subsurface structure detection is needed to support military and nonproliferation requirements for inspection and for surveillance of activities of threatening nations. As part of the DOE/NN-20 program to apply geophysical methods to detect and characterize underground facilities, Sandia National Laboratories (SNL) initiated an electromagnetic induction (EMI) project to evaluate low frequency electromagnetic (EM) techniques for subsurface structure detection. Low frequency, in this case, extended from kilohertz to hundreds of kilohertz. An EMI survey procedure had already been developed for borehole imaging of coal seams and had successfully been applied in a surface mode to detect a drug smuggling tunnel. The SNL project has focused on building upon the success of that procedure and applying it to surface and low altitude airborne platforms. Part of SNL's work has focused on improving that technology through improved hardware and data processing. The improved hardware development has been performed utilizing Laboratory Directed Research and Development (LDRD) funding. In addition, SNL's effort focused on: (1) improvements in modeling of the basic geophysics of the illuminating electromagnetic field and its coupling to the underground target (partially funded using LDRD funds) and (2) development of techniques for phase-based and multi-frequency processing and spatial processing to support subsurface target detection and characterization. The products of this project are: (1) an evaluation of an improved EM gradiometer, (2) an improved gradiometer concept for possible future development, (3) an improved modeling capability, (4) demonstration of an EM wave migration method for target recognition, and a demonstration that the technology is capable of detecting targets to depths exceeding 25 meters.

  4. How can we value an environmental asset that very few have visited or heard of? Lessons learned from applying contingent and inferred valuation in an Australian wetlands case study.

    PubMed

    Gregg, Daniel; Wheeler, Sarah Ann

    2018-08-15

    To date, the majority of environmental assets studied in the economic valuation literature clearly have high amenity and recreational use values. However there are many cases where small, but nevertheless unique and important, ecosystems survive as islands amongst large areas of modified, productive, or urban, landscapes. Development encroaches on the landscape and as urban landscapes become more concentrated these types of conservation islands will become increasingly more important. Previous experience with economic valuation suggests that lower total values for smaller contributions to conservation are more liable to be swamped by survey and hypothetical bias measures. Hence there needs to be more understanding of approaches to economic valuation for small and isolated environmental assets, in particular regarding controlling stated preference biases. This study applied the recently developed method of Inferred Valuation (IV) to a small private wetland in South-East Australia, and compared willingness to pay values with estimates from a standard Contingent Valuation (CV) approach. We found that hypothetical bias did seem to be slightly lower with the IV method. However, other methods such as the use of log-normal transformations and median measures, significantly mitigate apparent hypothetical biases and are easier to apply and allow use of the well-tested CV method. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Error analysis of motion correction method for laser scanning of moving objects

    NASA Astrophysics Data System (ADS)

    Goel, S.; Lohani, B.

    2014-05-01

    A limitation of conventional laser scanning methods is that the objects being scanned must be static. The need to scan moving objects has resulted in the development of new methods capable of generating correct 3D geometry of moving objects. Only a few such methods are described in the literature, each using its own models or sensors, and studies on error modelling or analysis of any of these motion correction methods are lacking. In this paper, we develop the error budget and present the analysis of one such "motion correction" method. This method assumes the availability of position and orientation information for the moving object, which in general can be obtained by installing a POS system on board or by using tracking devices. It then uses this information along with the laser scanner data to correct the laser data, resulting in correct geometry despite the object being mobile during scanning. The major application of this method lies in the shipping industry, to scan ships either moving or moored at sea, and to scan other objects such as hot air balloons or aerostats. The other "motion correction" methods described in the literature cannot be applied to the objects mentioned here, which makes the chosen method quite unique. This paper presents insights into the functioning of the "motion correction" method as well as a detailed account of the behavior and variation of the error due to different sensor components, alone and in combination with each other. The analysis can be used to obtain insights into the optimal utilization of available components for achieving the best results.
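
    The correction itself, re-expressing each scan point in the object's frame using the object's pose at the instant the point was acquired, can be sketched as follows (hypothetical pose source and a yaw-only rotation for brevity; the method's sensor models and error terms are not reproduced):

        import numpy as np

        def rot_z(yaw):
            c, s = np.cos(yaw), np.sin(yaw)
            return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

        def motion_correct(points, times, pose_of):
            """Map scanner-frame points of a moving object into the object frame, using
            the object's pose (position and yaw here) at each point's acquisition time,
            as reported by an on-board POS or tracking device."""
            corrected = []
            for p, t in zip(points, times):
                position, yaw = pose_of(t)                  # object pose at time t
                corrected.append(rot_z(yaw).T @ (p - position))
            return np.array(corrected)

        # toy example: the object drifts along x and slowly rotates while being scanned
        pose = lambda t: (np.array([0.5 * t, 0.0, 0.0]), 0.05 * t)
        pts = np.array([[10.0, 2.0, 1.0], [10.2, 2.0, 1.0], [10.4, 2.1, 1.0]])
        print(motion_correct(pts, times=[0.0, 0.5, 1.0], pose_of=pose))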

  6. Development of a Self-Rated Mixed Methods Skills Assessment: The NIH Mixed Methods Research Training Program for the Health Sciences

    PubMed Central

    Guetterman, Timothy C.; Creswell, John W.; Wittink, Marsha; Barg, Fran K.; Castro, Felipe G.; Dahlberg, Britt; Watkins, Daphne C.; Deutsch, Charles; Gallo, Joseph J.

    2017-01-01

    Introduction Demand for training in mixed methods is high, with little research on faculty development or assessment in mixed methods. We describe the development of a Self-Rated Mixed Methods Skills Assessment and provide validity evidence. The instrument taps six research domains: “Research question,” “Design/approach,” “Sampling,” “Data collection,” “Analysis,” and “Dissemination.” Respondents are asked to rate their ability to define or explain concepts of mixed methods under each domain, their ability to apply the concepts to problems, and the extent to which they need to improve. Methods We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire using Cronbach’s alpha to assess reliability and an ANOVA that compared a mixed methods experience index with assessment scores to assess criterion-relatedness. Results Internal consistency reliability was high for the total set of items (.95) and adequate (>=.71) for all but one subscale. Consistent with establishing criterion validity, respondents who had more professional experiences with mixed methods (e.g., published a mixed methods paper) rated themselves as more skilled, which was statistically significant across the research domains. Discussion This Self-Rated Mixed Methods Assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning. PMID:28562495
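
    The reliability figures quoted above are Cronbach's alpha values; for readers who want the computation, a minimal version is sketched below (the item-response matrix is invented, not the survey data):

        import numpy as np

        def cronbach_alpha(items):
            """Cronbach's alpha for a (respondents x items) score matrix:
            alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_var = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1.0 - item_var / total_var)

        # toy example: 6 respondents rating 4 items on a 1-5 scale
        scores = [[4, 5, 4, 4], [2, 3, 2, 3], [5, 5, 4, 5],
                  [3, 3, 3, 2], [4, 4, 5, 4], [1, 2, 2, 1]]
        print(round(cronbach_alpha(scores), 2))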

  7. Community Detection in Complex Networks via Clique Conductance.

    PubMed

    Lu, Zhenqi; Wahlström, Johan; Nehorai, Arye

    2018-04-13

    Network science plays a central role in understanding and modeling complex systems in many areas including physics, sociology, biology, computer science, economics, politics, and neuroscience. One of the most important features of networks is community structure, i.e., clustering of nodes that are locally densely interconnected. Communities reveal the hierarchical organization of nodes, and detecting communities is of great importance in the study of complex systems. Most existing community-detection methods consider low-order connection patterns at the level of individual links. But high-order connection patterns, at the level of small subnetworks, are generally not considered. In this paper, we develop a novel community-detection method based on cliques, i.e., local complete subnetworks. The proposed method overcomes the deficiencies of previous similar community-detection methods by considering the mathematical properties of cliques. We apply the proposed method to computer-generated graphs and real-world network datasets. When applied to networks with known community structure, the proposed method detects the structure with high fidelity and sensitivity. When applied to networks with no a priori information regarding community structure, the proposed method yields insightful results revealing the organization of these complex networks. We also show that the proposed method is guaranteed to detect near-optimal clusters in the bipartition case.
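
    To make the notion of a higher-order (clique-based) cut concrete, the sketch below scores a candidate split of a small graph with ordinary edge conductance and with a triangle-based variant; this is only an illustration of the concept, not the objective or optimization of the cited method:

        from itertools import combinations
        import networkx as nx

        def edge_conductance(G, S):
            """Standard conductance: cut edges divided by the smaller side's edge volume."""
            S = set(S)
            cut = sum(1 for u, v in G.edges() if (u in S) != (v in S))
            vol_s = sum(d for n, d in G.degree() if n in S)
            vol_rest = sum(d for n, d in G.degree() if n not in S)
            return cut / min(vol_s, vol_rest)

        def triangle_conductance(G, S):
            """Same idea using triangles (3-cliques): a triangle is cut when its
            vertices do not all lie on the same side of the partition."""
            S = set(S)
            tris = [t for t in combinations(G.nodes(), 3)
                    if G.has_edge(t[0], t[1]) and G.has_edge(t[1], t[2]) and G.has_edge(t[0], t[2])]
            cut = sum(1 for t in tris if 0 < len(S.intersection(t)) < 3)
            vol_s = sum(1 for t in tris for n in t if n in S)
            vol_rest = sum(1 for t in tris for n in t if n not in S)
            return cut / min(vol_s, vol_rest)

        # two triangles joined by a single bridge edge
        G = nx.Graph([(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)])
        print(edge_conductance(G, {0, 1, 2}), triangle_conductance(G, {0, 1, 2}))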

  8. Method and apparatus for imparting strength to a material using sliding loads

    DOEpatents

    Hughes, Darcy Anne; Dawson, Daniel B.; Korellis, John S.

    1999-01-01

    A method of enhancing the strength of metals by affecting subsurface zones developed during the application of large sliding loads. Stresses which develop locally within the near surface zone can be many times larger than those predicted from the applied load and the friction coefficient. These stress concentrations arise from two sources: 1) asperity interactions and 2) local and momentary bonding between the two surfaces. By controlling these parameters more desirable strength characteristics can be developed in weaker metals to provide much greater strength to rival that of steel, for example.

  9. Method And Apparatus For Imparting Strength To Materials Using Sliding Loads

    DOEpatents

    Hughes, Darcy Anne; Dawson, Daniel B.; Korellis, John S.

    1999-03-16

    A method of enhancing the strength of metals by affecting subsurface zones developed during the application of large sliding loads. Stresses which develop locally within the near surface zone can be many times larger than those predicted from the applied load and the friction coefficient. These stress concentrations arise from two sources: 1) asperity interactions and 2) local and momentary bonding between the two surfaces. By controlling these parameters more desirable strength characteristics can be developed in weaker metals to provide much greater strength to rival that of steel, for example.

  10. Method for Implementing Subsurface Solid Derived Concentration Guideline Levels (DCGL) - 12331

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lively, J.W.

    2012-07-01

    The U.S. Nuclear Regulatory Commission (NRC) and other federal agencies currently approve the Multi-Agency Radiation Site Survey and Investigation Manual (MARSSIM) as guidance for licensees who are conducting final radiological status surveys in support of decommissioning. MARSSIM provides a method to demonstrate compliance with the applicable regulation by comparing residual radioactivity in surface soils with derived concentration guideline levels (DCGLs), but specifically discounts its applicability to subsurface soils. Many sites and facilities undergoing decommissioning contain subsurface soils that are potentially impacted by radiological constituents. In the absence of specific guidance designed to address the derivation of subsurface soil DCGLs and compliance demonstration, decommissioning facilities have attempted to apply DCGLs and final status survey techniques designed specifically for surface soils to subsurface soils. The decision to apply surface soil limits and surface soil compliance metrics to subsurface soils typically results in significant over-excavation with associated cost escalation. MACTEC, Inc. has developed the overarching concepts and principles found in recent NRC decommissioning guidance in NUREG 1757 to establish a functional method to derive dose-based subsurface soil DCGLs. The subsurface soil method developed by MACTEC also establishes a rigorous set of criterion-based data evaluation metrics (with analogs to the MARSSIM methodology) that can be used to demonstrate compliance with the developed subsurface soil DCGLs. The method establishes a continuum of volume factors that relate the size and depth of a volume of subsurface soil having elevated concentrations of residual radioactivity with its ability to produce dose. The method integrates the subsurface soil sampling regime with the derivation of the subsurface soil DCGL such that a self-regulating optimization is naturally sought by both the responsible party and regulator. This paper describes the concepts and basis used by MACTEC to develop the dose-based subsurface soil DCGL method. The paper will show how MACTEC's method can be used to demonstrate that higher concentrations of residual radioactivity in subsurface soils (as compared with surface soils) can meet the NRC's dose-based regulations. MACTEC's method has been used successfully to obtain the NRC's radiological release at a site with known radiological impacts to subsurface soils exceeding the surface soil DCGL, saving both time and cost. Having considered the current NRC guidance for consideration of residual radioactivity in subsurface soils during decommissioning, MACTEC has developed a technically based approach to the derivation of and demonstration of compliance with subsurface soil DCGLs for radionuclides. In fact, the process uses the already accepted concepts and metrics approved for surface soils as the foundation for deriving scaling factors used to calculate subsurface soil DCGLs that are at least equally protective of the decommissioning annual dose standard. Each of the elements identified for consideration in the current NRC guidance is addressed in this proposed method. Additionally, there is considerable conservatism built into the assumptions and techniques used to arrive at subsurface soil scaling factors and DCGLs. The degree of conservatism embodied in the approach used is such that risk managers and decision makers approving and using subsurface soil DCGLs derived in accordance with this method can be confident that the future exposures will be well below permissible and safe levels. The technical basis for the method can be applied to a broad variety of sites with residual radioactivity in subsurface soils. Given the costly nature of soil surveys, excavation, and disposal of soils as low-level radioactive waste, MACTEC's method for deriving and demonstrating compliance with subsurface soil DCGLs offers the possibility of significant cost savings over the traditional approach of applying surface soil DCGLs to subsurface soils. Furthermore, while yet untested, MACTEC believes that the concepts and methods embodied in this approach could readily be applied to other types of contamination found in subsurface soils. (author)

  11. Adult Learning Principles and Presentation Pearls

    PubMed Central

    Palis, Ana G.; Quiros, Peter A.

    2014-01-01

    Although lectures are one of the most common methods of knowledge transfer in medicine, their effectiveness has been questioned. Passive formats, lack of relevance and disconnection from the student's needs are some of the arguments supporting this apparent lack of efficacy. However, many authors have suggested that applying adult learning principles (i.e., relevance, congruence with students' needs, interactivity, connection to students' previous knowledge and experience) to this method increases learning and the effectiveness of lectures. This paper presents recommendations for applying adult learning principles during the planning, creation and development of lectures to make them more effective. PMID:24791101

  12. Linear and nonlinear dynamic analysis of redundant load path bearingless rotor systems

    NASA Technical Reports Server (NTRS)

    Murthy, V. R.; Shultz, Louis A.

    1994-01-01

    The goal of this research is to develop the transfer matrix method to treat nonlinear autonomous boundary value problems with multiple branches. The application is the complete nonlinear aeroelastic analysis of multiple-branched rotor blades. Once the development is complete, it can be incorporated into the existing transfer matrix analyses. There are several difficulties to be overcome in reaching this objective. The conventional transfer matrix method is limited in that it is applicable only to linear branch chain-like structures, but consideration of multiple branch modeling is important for bearingless rotors. Also, hingeless and bearingless rotor blade dynamic characteristics (particularly their aeroelasticity problems) are inherently nonlinear. The nonlinear equations of motion and the multiple-branched boundary value problem are treated together using a direct transfer matrix method. First, the formulation is applied to a nonlinear single-branch blade to validate the nonlinear portion of the formulation. The nonlinear system of equations is iteratively solved using a form of Newton-Raphson iteration scheme developed for differential equations of continuous systems. The formulation is then applied to determine the nonlinear steady state trim and aeroelastic stability of a rotor blade in hover with two branches at the root. A comprehensive computer program is developed and is used to obtain numerical results for the (1) free vibration, (2) nonlinearly deformed steady state, (3) free vibration about the nonlinearly deformed steady state, and (4) aeroelastic stability tasks. The numerical results obtained by the present method agree with results from other methods.
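
    The iteration named above is, at its core, the standard Newton-Raphson update for a nonlinear system; a generic version is sketched below (a toy algebraic system stands in for the discretized blade equations, which are far larger in practice):

        import numpy as np

        def newton_raphson(residual, jacobian, x0, tol=1e-10, max_iter=50):
            """Solve residual(x) = 0 by repeated linearization: x <- x - J(x)^-1 r(x)."""
            x = np.asarray(x0, dtype=float)
            for _ in range(max_iter):
                r = residual(x)
                if np.linalg.norm(r) < tol:
                    break
                x = x - np.linalg.solve(jacobian(x), r)
            return x

        # toy nonlinear "trim" system of two coupled equations in two unknowns
        res = lambda x: np.array([x[0] ** 3 + x[1] - 1.0, x[1] ** 3 - x[0] + 1.0])
        jac = lambda x: np.array([[3.0 * x[0] ** 2, 1.0], [-1.0, 3.0 * x[1] ** 2]])
        print(newton_raphson(res, jac, x0=[0.5, 0.5]))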

  13. [Development of POCT and medical digital assistant for primary medical healthcare].

    PubMed

    Shi, Jun; Yan, Zhuang-Zhi; Pan, Zhi-Hao

    2008-01-01

    In this paper, we discuss the meaning, advantages and methods of applying the point of care testing (POCT) and medical digital assistant (MDA) to primary healthcare services. We also introduce the development of the POCT and MDA based on the electronic health record(EHR) system.

  14. Research and Development Services: Methods Development

    DTIC Science & Technology

    1982-07-23

    At an applied potential of -1.15 volts, the minimum detectable amount was 500 ng, which was not very sensitive. From Hammett linear free energy... Equation 1, the value of N was optimized by using two columns. The other factors which can influence resolution are the capacity factor, k, and the

  15. Steps in the open space planning process

    Treesearch

    Stephanie B. Kelly; Melissa M. Ryan

    1995-01-01

    This paper presents the steps involved in developing an open space plan. The steps are generic in that the methods may be applied to communities of various sizes. The intent is to provide a framework for developing an open space plan that meets Massachusetts requirements for funding of open space acquisition.

  16. The development of local calibration factors - phase II : Maryland freeways and ramps : final report.

    DOT National Transportation Integrated Search

    2016-11-01

    The goal of the study was to develop local calibration factors (LCFs) for Maryland freeways in order to apply the predictive methods of the Highway Safety Manual (HSM) to the state. LCFs were computed for freeway segments, speed-change lanes, and sig...

  17. Usability Methods for Ensuring Health Information Technology Safety: Evidence-Based Approaches. Contribution of the IMIA Working Group Health Informatics for Patient Safety.

    PubMed

    Borycki, E; Kushniruk, A; Nohr, C; Takeda, H; Kuwata, S; Carvalho, C; Bainbridge, M; Kannry, J

    2013-01-01

    Issues related to lack of system usability and potential safety hazards continue to be reported in the health information technology (HIT) literature. Usability engineering methods are increasingly used to ensure improved system usability and they are also beginning to be applied more widely for ensuring the safety of HIT applications. These methods are being used in the design and implementation of many HIT systems. In this paper we describe evidence-based approaches to applying usability engineering methods. A multi-phased approach to ensuring system usability and safety in healthcare is described. Usability inspection methods are first described including the development of evidence-based safety heuristics for HIT. Laboratory-based usability testing is then conducted under artificial conditions to test if a system has any base level usability problems that need to be corrected. Usability problems that are detected are corrected and then a new phase is initiated where the system is tested under more realistic conditions using clinical simulations. This phase may involve testing the system with simulated patients. Finally, an additional phase may be conducted, involving a naturalistic study of system use under real-world clinical conditions. The methods described have been employed in the analysis of the usability and safety of a wide range of HIT applications, including electronic health record systems, decision support systems and consumer health applications. It has been found that at least usability inspection and usability testing should be applied prior to the widespread release of HIT. However, wherever possible, additional layers of testing involving clinical simulations and a naturalistic evaluation will likely detect usability and safety issues that may not otherwise be detected prior to widespread system release. The framework presented in the paper can be applied in order to develop more usable and safer HIT, based on multiple layers of evidence.

  18. Microfabricated X-Ray Optics Technology Development for the Constellation-X Mission

    NASA Technical Reports Server (NTRS)

    Schattenburg, Mark L.

    2003-01-01

    During the period of this Cooperative Agreement, MIT developed advanced methods for applying silicon micro-structures to the precision assembly of foil x-ray optics in support of the Constellation-X Spectroscopy X-ray Telescope (SXT) development effort at Goddard Space Flight Center (GSFC). MIT developed improved methods for fabricating and characterizing the precision silicon micro-combs. MIT also developed and characterized assembly tools and several types of metrology tools in order to characterize and reduce the errors associated with precision assembly of foil optics. Results of this effort were published and presented to the scientific community and the GSFC SXT team.

  19. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

    This project demonstrates newly invented, biobased construction materials developed by applying low-carbon, biomass waste sources through the authors' engineered fiber processes and technology. If manufactured and applied at large scale, the project's inventions can divert large volumes of cellulose waste into high-performance, low-embodied-energy, environmental construction...

  20. Fast method to compute scattering by a buried object under a randomly rough surface: PILE combined with FB-SA.

    PubMed

    Bourlier, Christophe; Kubické, Gildas; Déchamps, Nicolas

    2008-04-01

    A fast, exact numerical method based on the method of moments (MM) is developed to calculate the scattering from an object below a randomly rough surface. Déchamps et al. [J. Opt. Soc. Am. A23, 359 (2006)] have recently developed the PILE (propagation-inside-layer expansion) method for a stack of two one-dimensional rough interfaces separating homogeneous media. From the inversion of the impedance matrix by block (in which two impedance matrices of each interface and two coupling matrices are involved), this method allows one to calculate separately and exactly the multiple-scattering contributions inside the layer in which the inverses of the impedance matrices of each interface are involved. Our purpose here is to apply this method for an object below a rough surface. In addition, to invert a matrix of large size, the forward-backward spectral acceleration (FB-SA) approach of complexity O(N) (N is the number of unknowns on the interface) proposed by Chou and Johnson [Radio Sci.33, 1277 (1998)] is applied. The new method, PILE combined with FB-SA, is tested on perfectly conducting circular and elliptic cylinders located below a dielectric rough interface obeying a Gaussian process with Gaussian and exponential height autocorrelation functions.

  1. Description of a user-oriented geographic information system - The resource analysis program

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Mokma, D. L.

    1980-01-01

    This paper describes the Resource Analysis Program, an applied geographic information system. Several applications are presented which utilized soil and other natural resource data to develop integrated maps and data analyses. These applications demonstrate the methods of analysis and the philosophy of approach used in the mapping system. The applications are evaluated in reference to four major needs of a functional mapping system: data capture, data libraries, data analysis, and mapping and data display. These four criteria are then used to describe an effort to develop the next generation of applied mapping systems. This approach uses inexpensive microcomputers for field applications and should prove to be a viable entry point for users heretofore unable or unwilling to venture into applied computer mapping.

  2. Measuring Metal Thickness With an Electric Probe

    NASA Technical Reports Server (NTRS)

    Shumka, A.

    1986-01-01

    Thickness of metal parts measured from one side with aid of Kelvin probe. Method developed for measuring thickness of end plate on sealed metal bellows from outside. Suitable for thicknesses of a few thousandths of an inch (a few hundred micrometers). Method also used to determine thickness of metal coatings applied by sputtering, electroplating, and flame spraying.

  3. A RAPID DNA EXTRACTION METHOD IS SUCCESSFULLY APPLIED TO ITS-RFLP ANALYSIS OF MYCORRHIZAL ROOT TIPS

    EPA Science Inventory

    A rapid method for extracting DNA from intact, single root tips using a Xanthine solution was developed to handle very large numbers of analyses of ectomycorrhizas. By using an extraction without grinding we have attempted to bias the extraction towards the fungal DNA in the man...

  4. Case-Study of the High School Student's Family Values Formation

    ERIC Educational Resources Information Center

    Valeeva, Roza A.; Korolyeva, Natalya E.; Sakhapova, Farida Kh.

    2016-01-01

    The aim of the research is the theoretical justification and experimental verification of content, complex forms and methods to ensure effective development of the high school students' family values formation. 93 lyceum students from Kazan took part in the experiment. To study students' family values we have applied method of studying personality…

  5. Next Generation Science Standards: A National Mixed-Methods Study on Teacher Readiness

    ERIC Educational Resources Information Center

    Haag, Susan; Megowan, Colleen

    2015-01-01

    Next Generation Science Standards (NGSS) science and engineering practices are ways of eliciting the reasoning and applying foundational ideas in science. As research has revealed barriers to states and schools adopting the NGSS, this mixed-methods study attempts to identify characteristics of professional development (PD) that will support NGSS…

  6. Methods of Selecting Industries for Depressed Areas--An Introduction to Feasibility Studies. Developing Job Opportunities 2.

    ERIC Educational Resources Information Center

    Klaassen, Leo H.

    This report presents severl alternative methods which may be employed by local authorities in identifying likely prospects for local industrialization, and describes a specialized input-output technique to define inter-industry relations and inter-regional relations of industries. This technique is applied, for illustrative purposes, to three…

  7. An Empirical Derivation of the Run Time of the Bubble Sort Algorithm.

    ERIC Educational Resources Information Center

    Gonzales, Michael G.

    1984-01-01

    Suggests a moving pictorial tool to help teach principles in the bubble sort algorithm. Develops such a tool applied to an unsorted list of numbers and describes a method to derive the run time of the algorithm. The method can be modified to derive the run times of various other algorithms. (JN)
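
    A small sketch of the empirical approach described, counting bubble-sort comparisons for lists of increasing length and checking the quadratic growth (generic classroom code, not the article's pictorial tool):

        import random

        def bubble_sort_comparisons(values):
            """Plain bubble sort that returns how many comparisons it made."""
            a = list(values)
            comparisons = 0
            for i in range(len(a) - 1):
                for j in range(len(a) - 1 - i):
                    comparisons += 1
                    if a[j] > a[j + 1]:
                        a[j], a[j + 1] = a[j + 1], a[j]
            return comparisons

        # Empirical run-time derivation: comparisons grow as n(n-1)/2, i.e. O(n^2).
        for n in (10, 20, 40, 80):
            data = [random.random() for _ in range(n)]
            print(n, bubble_sort_comparisons(data), n * (n - 1) // 2)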

  8. Translation of the Marlowe-Crowne Social Desirability Scale into an Equivalent Spanish Version

    ERIC Educational Resources Information Center

    Collazo, Andres A.

    2005-01-01

    A Spanish version of the Marlowe-Crowne Social Desirability Scale (MCSDS) was developed by applying a method derived from the cross-cultural and psychometric literature. The method included five sequenced studies: (a) translation and back-translation, (b) comprehension assessment, (c) psychometric equivalence study of two mixed-language versions,…

  9. A Brain-Computer Interface Project Applied in Computer Engineering

    ERIC Educational Resources Information Center

    Katona, Jozsef; Kovari, Attila

    2016-01-01

    Keeping up with novel methods and keeping abreast of new applications are crucial issues in engineering education. In brain research, one of the most significant research areas in recent decades, many developments have application in both modern engineering technology and education. New measurement methods in the observation of brain activity open…

  10. Methods and Techniques for Clinical Text Modeling and Analytics

    ERIC Educational Resources Information Center

    Ling, Yuan

    2017-01-01

    This study focuses on developing and applying methods/techniques in different aspects of the system for clinical text understanding, at both corpus and document level. We deal with two major research questions: First, we explore the question of "How to model the underlying relationships from clinical notes at corpus level?" Documents…

  11. Developing Health Indicators for People with Intellectual Disabilities. The Method of the Pomona Project

    ERIC Educational Resources Information Center

    van Schrojenstein Lantman-de Valk, H.; Linehan, C.; Kerr, M.; Noonan-Walsh, P.

    2007-01-01

    Aim: Recently, attention has focused on the health inequalities experienced by people with intellectual disabilities (ID) when compared with the general population. To inform policies aimed at equalizing health opportunities, comparable evidence is needed about the aspects of their health that may be amenable to intervention. Method: Applying the…

  12. Development of the Assessment Items of Debris Flow Using the Delphi Method

    NASA Astrophysics Data System (ADS)

    Byun, Yosep; Seong, Joohyun; Kim, Mingi; Park, Kyunghan; Yoon, Hyungkoo

    2016-04-01

    In recent years in Korea, typhoons and localized extreme rainfall caused by abnormal climate have increased. Accordingly, debris flow is becoming one of the most dangerous natural disasters. This study aimed to develop assessment items that can be used for conducting damage investigations of debris flows. The Delphi method was applied to classify the realms of the assessment items. As a result, 29 assessment items, classified into 6 groups, were determined.

  13. Using exact solutions to develop an implicit scheme for the baroclinic primitive equations

    NASA Technical Reports Server (NTRS)

    Marchesin, D.

    1984-01-01

    The exact solutions presently obtained by means of a novel method for nonlinear initial value problems are used in the development of numerical schemes for the computer solution of these problems. The method is applied to a new, fully implicit scheme on a vertical slice of the isentropic baroclinic equations. It was not possible to find a global scale phenomenon that could be simulated by the baroclinic primitive equations on a vertical slice.

  14. Computational Electromagnetic Modeling of SansEC(Trade Mark) Sensors

    NASA Technical Reports Server (NTRS)

    Smith, Laura J.; Dudley, Kenneth L.; Szatkowski, George N.

    2011-01-01

    This paper describes the preliminary effort to apply computational design tools to aid in the development of an electromagnetic SansEC resonant sensor composite materials damage detection system. The computational methods and models employed on this research problem will evolve in complexity over time and will lead to the development of new computational methods and experimental sensor systems that demonstrate the capability to detect, diagnose, and monitor the damage of composite materials and structures on aerospace vehicles.

  15. A new tritiated water measurement method with plastic scintillator pellets.

    PubMed

    Furuta, Etsuko; Iwasaki, Noriko; Kato, Yuka; Tomozoe, Yusuke

    2016-01-01

    A new tritiated water measurement method with plastic scintillator pellets (PS-pellets) using a conventional liquid scintillation counter was developed. The PS-pellets used were 3 mm in both diameter and length. A low-potassium glass vial was filled completely with the pellets, and 5 to 100 μl of tritiated water was applied to the vial. The sample solution was then dispersed in the interstices of the pellets in the vial. This method needs no liquid scintillator, so no organic liquid waste is generated. The counting efficiency with the pellets was approximately 48% when a 5 μl solution was used, which was higher than that of conventional measurement using a liquid scintillator. The relationship between count rate and activity showed good linearity. The pellets could be used repeatedly, so few solid wastes are generated with this method. The PS-pellets are useful for tritiated water measurement; however, it is necessary to develop a new device that can handle larger volumes and measure low-level concentrations, as in environmental applications.
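
    As a minimal illustration of the efficiency figure quoted above, counting efficiency is simply the background-corrected count rate divided by the known disintegration rate of the spike; the numbers in the sketch below are illustrative and are not taken from the paper.

    ```python
    def counting_efficiency(net_count_rate_cpm, activity_dpm):
        """Counting efficiency = net count rate / true disintegration rate."""
        return net_count_rate_cpm / activity_dpm

    # Illustrative numbers only (not the paper's data): a 5 microlitre
    # tritiated-water spike with a known activity of 1000 dpm that yields a
    # background-corrected 480 cpm corresponds to ~48 % counting efficiency.
    eff = counting_efficiency(net_count_rate_cpm=480.0, activity_dpm=1000.0)
    print(f"counting efficiency = {eff:.1%}")
    ```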

  16. Analytical detection and method development of anticancer drug Gemcitabine HCl using gold nanoparticles.

    PubMed

    Menon, Shobhana K; Mistry, Bhoomika R; Joshi, Kuldeep V; Sutariya, Pinkesh G; Patel, Ravindra V

    2012-08-01

    A simple, rapid, cost-effective and extractive UV spectrophotometric method was developed for the determination of Gemcitabine HCl (GMCT) in bulk drug and pharmaceutical formulation. It is based on UV spectrophotometric measurements in which the drug reacts with gold nanoparticles (AuNP), changing the original colour of the AuNP and forming a dark blue coloured solution that exhibits an absorption maximum at 688 nm. The apparent molar absorptivity and Sandell's sensitivity coefficient were found to be 3.95 × 10⁻⁵ l mol⁻¹ cm⁻¹ and 0.060 μg cm⁻², respectively. Beer's law was obeyed in the concentration range of 2.0-40 μg ml⁻¹. This method was tested and validated for various parameters according to ICH guidelines. The proposed method was successfully applied for the determination of GMCT in pharmaceutical formulation (parenteral formulation). The results demonstrated that the procedure is accurate, precise and reproducible (relative standard deviation <2%). As it is simple, cheap and less time consuming, it can be suitably applied for the estimation of GMCT in dosage forms. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Production of Landsat ETM+ reference imagery of burned areas within Southern African savannahs: comparison of methods and application to MODIS

    Treesearch

    A. M. S. Smith; N. A. Drake; M. J. Wooster; A. T. Hudak; Z. A. Holden; C. J. Gibbons

    2007-01-01

    Accurate production of regional burned area maps is necessary to reduce uncertainty in emission estimates from African savannah fires. Numerous methods have been developed that map burned and unburned surfaces. These methods are typically applied to coarse spatial resolution (1 km) data to produce regional estimates of the area burned, while higher spatial resolution...

  18. Some New Mathematical Methods for Variational Objective Analysis

    NASA Technical Reports Server (NTRS)

    Wahba, G.; Johnson, D. R.

    1984-01-01

    New and/or improved variational methods for simultaneously combining forecast, heterogeneous observational data, a priori climatology, and physics to obtain improved estimates of the initial state of the atmosphere for the purpose of numerical weather prediction are developed. Cross validated spline methods are applied to atmospheric data for the purpose of improved description and analysis of atmospheric phenomena such as the tropopause and frontal boundary surfaces.
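
    The abstract does not give the cross-validation procedure itself; as a rough sketch of the idea behind cross-validated spline smoothing, the example below selects the smoothing factor of a one-dimensional spline by leave-one-out cross-validation on synthetic noisy data (the data, candidate values, and use of SciPy's UnivariateSpline are assumptions for illustration only).

    ```python
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 10.0, 80)
    y = np.sin(x) + 0.2 * rng.standard_normal(x.size)   # noisy "observations"

    def loo_cv_error(s):
        """Leave-one-out CV error of a smoothing spline with smoothing factor s."""
        errs = []
        for i in range(x.size):
            mask = np.arange(x.size) != i
            spl = UnivariateSpline(x[mask], y[mask], s=s)
            errs.append((spl(x[i]) - y[i]) ** 2)
        return float(np.mean(errs))

    candidates = [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]
    scores = {s: loo_cv_error(s) for s in candidates}
    best = min(scores, key=scores.get)
    print("CV scores:", {s: round(v, 4) for s, v in scores.items()})
    print("selected smoothing factor:", best)
    ```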

  19. Development and psychometric evaluation of the Premarital Sexual Behavior Assessment Scale for Young Women (PSAS-YW): an exploratory mixed method study

    PubMed Central

    2014-01-01

    Background Premarital sexual behaviors are an important issue for women's health. The present study was designed to develop and examine the psychometric properties of a scale to identify young women who are at greater risk of premarital sexual behavior. Method This was an exploratory mixed method investigation conducted in two phases. In the first phase, qualitative methods (focus group discussion and individual interview) were applied to generate items and develop the questionnaire. In the second phase, the psychometric properties (validity and reliability) of the questionnaire were assessed. Results In the first phase an item pool containing 53 statements related to premarital sexual behavior was generated. In the second phase item reduction was applied and the final version of the questionnaire, containing 26 items, was developed. The psychometric properties of this final version were assessed and the results showed that the instrument has good structure and reliability. The results from exploratory factor analysis indicated a 5-factor solution for the instrument that jointly accounted for 57.4% of the observed variance. The Cronbach's alpha coefficient for the instrument was 0.87. Conclusion This study provided a valid and reliable scale to identify premarital sexual behavior in young women. Assessment of premarital sexual behavior might help to improve women's sexual abstinence. PMID:24924696
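
    The reliability figure reported above is Cronbach's alpha, which can be computed directly from an item-score matrix; the sketch below shows the standard formula on toy Likert-type data (the data are invented and unrelated to the PSAS-YW).

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)

    # Toy data: 6 respondents answering 4 Likert-type items (illustrative only).
    scores = [[3, 4, 3, 4],
              [2, 2, 3, 2],
              [5, 4, 5, 5],
              [4, 4, 4, 3],
              [1, 2, 2, 1],
              [3, 3, 4, 3]]
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```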

  20. Development of Physics-Based Hurricane Wave Response Functions: Application to Selected Sites on the U.S. Gulf Coast

    NASA Astrophysics Data System (ADS)

    McLaughlin, P. W.; Kaihatu, J. M.; Irish, J. L.; Taylor, N. R.; Slinn, D.

    2013-12-01

    Recent hurricane activity in the Gulf of Mexico has led to a need for accurate, computationally efficient prediction of hurricane damage so that communities can better assess the risk of local socio-economic disruption. This study focuses on developing robust, physics-based non-dimensional equations that accurately predict maximum significant wave height at different locations near a given hurricane track. These equations (denoted as Wave Response Functions, or WRFs) were developed from presumed physical dependencies between wave heights and hurricane characteristics and fit with data from numerical models of waves and surge under hurricane conditions. After curve fitting, constraints that correct for fully developed sea state were used to limit the wind wave growth. When applied to the region near Gulfport, MS, back prediction of maximum significant wave height yielded root mean square errors of 0.22-0.42 m at open coast stations and 0.07-0.30 m at bay stations when compared to the numerical model data. The WRF method was also applied to Corpus Christi, TX and Panama City, FL with similar results. Back prediction errors will be included in uncertainty evaluations connected to risk calculations using joint probability methods. These methods require thousands of simulations to quantify extreme value statistics, thus requiring the use of reduced methods such as the WRF to represent the relevant physical processes.

  1. Reference materials for cellular therapeutics.

    PubMed

    Bravery, Christopher A; French, Anna

    2014-09-01

    The development of cellular therapeutics (CTP) takes place over many years, and, where successful, the developer will anticipate the product to be in clinical use for decades. Successful demonstration of manufacturing and quality consistency is dependent on the use of complex analytical methods; thus, the risk of process and method drift over time is high. The use of reference materials (RM) is an established scientific principle and as such also a regulatory requirement. The various uses of RM in the context of CTP manufacturing and quality are discussed, along with why they are needed for living cell products and the analytical methods applied to them. Relatively few consensus RM exist that are suitable for even common methods used by CTP developers, such as flow cytometry. Others have also identified this need and made proposals; however, great care will be needed to ensure any consensus RM that result are fit for purpose. Such consensus RM probably will need to be applied to specific standardized methods, and the idea that a single RM can have wide applicability is challenged. Written standards, including standardized methods, together with appropriate measurement RM are probably the most appropriate way to define specific starting cell types. The characteristics of a specific CTP will to some degree deviate from those of the starting cells; consequently, a product RM remains the best solution where feasible. Each CTP developer must consider how and what types of RM should be used to ensure the reliability of their own analytical measurements. Copyright © 2014 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  2. Development and validation of a UPLC-MS/MS method for simultaneous determination of fotagliptin and its two major metabolites in human plasma and urine.

    PubMed

    Wang, Zhenlei; Jiang, Ji; Hu, Pei; Zhao, Qian

    2017-02-01

    Fotagliptin is a novel dipeptidyl peptidase IV inhibitor under clinical development for the treatment of Type II diabetes mellitus. The objective of this study was to develop and validate a specific and sensitive ultra-performance liquid chromatography (UPLC)-MS/MS method for simultaneous determination of fotagliptin and its two major metabolites in human plasma and urine. Methodology & results: After being pretreated using an automated procedure, the plasma and urine samples were separated and detected using a UPLC-ESI-MS/MS method, which was validated following international guidelines. A selective and sensitive UPLC-MS/MS method was thus developed and validated for the first time for quantifying fotagliptin and its metabolites in human plasma and urine. The method was successfully applied to support the clinical study of fotagliptin in healthy Chinese subjects.

  3. Development of a Hybrid RANS/LES Method for Compressible Mixing Layer Simulations

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli

    2001-01-01

    A hybrid method has been developed for simulations of compressible turbulent mixing layers. Such mixing layers dominate the flows in exhaust systems of modern-day aircraft and also those of hypersonic vehicles currently under development. The hybrid method uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall-bounded regions entering a mixing section, and a Large Eddy Simulation (LES) procedure to calculate the mixing-dominated regions. A numerical technique was developed to enable the use of the hybrid RANS/LES method on stretched, non-Cartesian grids. The hybrid RANS/LES method is applied to a benchmark compressible mixing layer experiment. Preliminary two-dimensional calculations are used to investigate the effects of axial grid density and boundary conditions. Actual LES calculations, performed in three spatial directions, indicated an initial vortex shedding followed by rapid transition to turbulence, which is in agreement with experimental observations.

  4. A Special Investigation to Develop a General Method for Three-dimensional Photoelastic Stress Analysis

    NASA Technical Reports Server (NTRS)

    Frocht, M. M.; Guernsey, R., Jr.

    1953-01-01

    The method of strain measurement after annealing is reviewed and found to be satisfactory for the materials available in this country. A new general method is described for the photoelastic determination of the principal stresses at any point of a general body subjected to arbitrary load. The method has been applied to a sphere subjected to diametrical compressive loads. The results show possibilities of high accuracy.

  5. Two smart spectrophotometric methods for the simultaneous estimation of Simvastatin and Ezetimibe in combined dosage form

    NASA Astrophysics Data System (ADS)

    Magdy, Nancy; Ayad, Miriam F.

    2015-02-01

    Two simple, accurate, precise, sensitive and economic spectrophotometric methods were developed for the simultaneous determination of Simvastatin and Ezetimibe in fixed dose combination products without prior separation. The first method depends on a new chemometrics-assisted ratio spectra derivative approach using moving-window polynomial least-squares fitting (Savitzky-Golay filters). The second method is based on a simple modification of the ratio subtraction method. The suggested methods were validated according to USP guidelines and can be applied for routine quality control testing.
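
    The moving-window polynomial least-squares (Savitzky-Golay) step mentioned above amounts to a smoothed numerical derivative of a spectrum; the sketch below applies SciPy's savgol_filter to a synthetic noisy band (the wavelengths, band shape, and window settings are illustrative assumptions, not the published method parameters).

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    # Synthetic noisy absorbance band (illustrative only): a Gaussian centred
    # at 265 nm plus measurement noise.
    rng = np.random.default_rng(1)
    wl = np.linspace(220.0, 320.0, 501)                       # nm, 0.2 nm step
    spectrum = np.exp(-0.5 * ((wl - 265.0) / 12.0) ** 2)
    noisy = spectrum + 0.01 * rng.standard_normal(wl.size)

    # Savitzky-Golay filter: moving-window polynomial least-squares fit,
    # here returning the smoothed first derivative of the spectrum.
    deriv1 = savgol_filter(noisy, window_length=21, polyorder=3,
                           deriv=1, delta=wl[1] - wl[0])

    # The first derivative crosses zero at the band maximum.
    zero_idx = np.argmin(np.abs(deriv1[200:300])) + 200
    print(f"derivative zero crossing near {wl[zero_idx]:.1f} nm")
    ```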

  6. Assessing the Assessment Methods: Climate Change and Hydrologic Impacts

    NASA Astrophysics Data System (ADS)

    Brekke, L. D.; Clark, M. P.; Gutmann, E. D.; Mizukami, N.; Mendoza, P. A.; Rasmussen, R.; Ikeda, K.; Pruitt, T.; Arnold, J. R.; Rajagopalan, B.

    2014-12-01

    The Bureau of Reclamation, the U.S. Army Corps of Engineers, and other water management agencies have an interest in developing reliable, science-based methods for incorporating climate change information into longer-term water resources planning. Such assessments must quantify projections of future climate and hydrology, typically relying on some form of spatial downscaling and bias correction to produce watershed-scale weather information that subsequently drives hydrology and other water resource management analyses (e.g., water demands, water quality, and environmental habitat). Water agencies continue to face challenging method decisions in these endeavors: (1) which downscaling method should be applied and at what resolution; (2) what observational dataset should be used to drive downscaling and hydrologic analysis; (3) what hydrologic model(s) should be used and how should these models be configured and calibrated? There is a critical need to understand the ramification of these method decisions, as they affect the signal and uncertainties produced by climate change assessments and, thus, adaptation planning. This presentation summarizes results from a three-year effort to identify strengths and weaknesses of widely applied methods for downscaling climate projections and assessing hydrologic conditions. Methods were evaluated from two perspectives: historical fidelity, and tendency to modulate a global climate model's climate change signal. On downscaling, four methods were applied at multiple resolutions: statistically using Bias Correction Spatial Disaggregation, Bias Correction Constructed Analogs, and Asynchronous Regression; dynamically using the Weather Research and Forecasting model. Downscaling results were then used to drive hydrologic analyses over the contiguous U.S. using multiple models (VIC, CLM, PRMS), with added focus placed on case study basins within the Colorado Headwaters. The presentation will identify which types of climate changes are expressed robustly across methods versus those that are sensitive to method choice; which method choices seem relatively more important; and where strategic investments in research and development can substantially improve guidance on climate change provided to water managers.

  7. The anesthetic action of some polyhalogenated ethers-Monte Carlo method based QSAR study.

    PubMed

    Golubović, Mlađan; Lazarević, Milan; Zlatanović, Dragan; Krtinić, Dane; Stoičkov, Viktor; Mladenović, Bojan; Milić, Dragan J; Sokolović, Dušan; Veselinović, Aleksandar M

    2018-04-13

    To date, there has been an ongoing debate about the mode of action of general anesthetics, with many biological sites postulated as targets for their action. Postoperative nausea and vomiting are common problems in whose development inhalational agents may play a role. When a mode of action is unknown, QSAR modelling is essential in drug development. To investigate aspects of their anesthetic action, QSAR models based on the Monte Carlo method were developed for a set of polyhalogenated ethers. Until now, their anesthetic action has not been completely defined, although some hypotheses have been suggested. Therefore, a QSAR model should be developed on molecular fragments that contribute to anesthetic action. The QSAR models were built on the basis of optimal molecular descriptors derived from the SMILES notation and local graph invariants, and the Monte Carlo optimization method with three random splits into training and test sets was applied for model development. Different methods, including the novel index of ideality of correlation, were applied to determine the robustness of the model and its predictive potential. The Monte Carlo optimization process proved to be an efficient in silico tool for building a robust model of good statistical quality. Molecular fragments with both positive and negative influences on anesthetic action were determined. The presented study can be useful in the search for novel anesthetics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Rtop - an R package for interpolation along the stream network

    NASA Astrophysics Data System (ADS)

    Skøien, J. O.

    2009-04-01

    Geostatistical methods have been used only to a limited extent for estimation along stream networks, with a few exceptions (Gottschalk, 1993; Gottschalk et al., 2006; Sauquet et al., 2000; Skøien et al., 2006). Interpolation of runoff characteristics is more complicated than for the traditional random variables estimated by geostatistical methods, as the measurements have a more complicated support and many catchments are nested. Skøien et al. (2006) presented the Top-kriging model, which takes these effects into account for interpolation of stream flow characteristics (exemplified by the 100-year flood). The method has here been implemented as a package in the statistical environment R (R Development Core Team, 2004). Taking advantage of the existing methods in R for working with spatial objects, and the extensive possibilities for visualizing the results, this makes it considerably easier to apply the method to new data sets than earlier implementations. References: Gottschalk, L. 1993. Interpolation of runoff applying objective methods. Stochastic Hydrology and Hydraulics, 7, 269-281. Gottschalk, L., I. Krasovskaia, E. Leblois, and E. Sauquet. 2006. Mapping mean and variance of runoff in a river basin. Hydrology and Earth System Sciences, 10, 469-484. R Development Core Team. 2004. R: A language and environment for statistical computing. Vienna, Austria: R Foundation for Statistical Computing. Sauquet, E., L. Gottschalk, and E. Leblois. 2000. Mapping average annual runoff: a hierarchical approach applying a stochastic interpolation scheme. Hydrological Sciences Journal, 45 (6), 799-815. Skøien, J. O., R. Merz, and G. Blöschl. 2006. Top-kriging - geostatistics on stream networks. Hydrology and Earth System Sciences, 10, 277-287.

  9. A detailed protocol for chromatin immunoprecipitation in the yeast Saccharomyces cerevisiae.

    PubMed

    Grably, Melanie; Engelberg, David

    2010-01-01

    Critical cellular processes such as DNA replication, DNA damage repair, and transcription are mediated and regulated by DNA-binding proteins. Many efforts have therefore been invested in developing methods that monitor the dynamics of protein-DNA association. As older techniques such as DNA footprinting and electrophoretic mobility shift assays (EMSA) could be applied mostly in vitro, the development of the chromatin immunoprecipitation (ChIP) method, which allows quantitative measurement of protein-bound DNA most accurately in vivo, revolutionized our ability to understand the mechanisms underlying the aforementioned processes. Furthermore, this powerful tool can be applied at the genomic scale, providing a global picture of protein-DNA complexes across the entire genome. The procedure is conceptually simple: it involves rapid crosslinking of proteins to DNA by the addition of formaldehyde to the culture, shearing the DNA, and immunoprecipitating the protein of interest while covalently bound to its DNA targets. Following decrosslinking, the coimmunoprecipitated DNA can be amplified by PCR or can serve as a probe of a genomic microarray to identify all DNA fragments that were bound to the protein. Although simple in principle, the method is not trivial to implement, and the results might be misleading if proper controls are not included in the experiment. In this chapter, we therefore provide a highly detailed protocol of the ChIP assay as applied successfully in our laboratory. We pay special attention to every small detail so that any investigator can readily and successfully apply this important and powerful technology.

  10. Reliability analysis of composite structures

    NASA Technical Reports Server (NTRS)

    Kan, Han-Pin

    1992-01-01

    A probabilistic static stress analysis methodology has been developed to estimate the reliability of a composite structure. Closed form stress analysis methods are the primary analytical tools used in this methodology. These structural mechanics methods are used to identify independent variables whose variations significantly affect the performance of the structure. Once these variables are identified, scatter in their values is evaluated and statistically characterized. The scatter in applied loads and the structural parameters are then fitted to appropriate probabilistic distribution functions. Numerical integration techniques are applied to compute the structural reliability. The predicted reliability accounts for scatter due to variability in material strength, applied load, fabrication and assembly processes. The influence of structural geometry and mode of failure are also considerations in the evaluation. Example problems are given to illustrate various levels of analytical complexity.
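
    As a minimal numerical illustration of the reliability calculation described above, the sketch below treats strength and applied stress as independent normal variables and evaluates P(strength > stress) both in closed form and by Monte Carlo sampling; the distributions and values are assumed for illustration and are not the report's data.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Illustrative values only: material strength R and applied stress S are
    # treated as independent normal variables fitted from scatter data.
    mu_R, sd_R = 450.0, 40.0     # MPa
    mu_S, sd_S = 300.0, 35.0     # MPa

    # Closed-form reliability for the normal-normal case:
    # P(R > S) = Phi((mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2))
    beta = (mu_R - mu_S) / np.hypot(sd_R, sd_S)
    reliability_closed_form = norm.cdf(beta)

    # Monte Carlo cross-check of the same integral.
    rng = np.random.default_rng(2)
    n = 200_000
    samples_R = rng.normal(mu_R, sd_R, n)
    samples_S = rng.normal(mu_S, sd_S, n)
    reliability_mc = np.mean(samples_R > samples_S)

    print(f"safety index beta     = {beta:.2f}")
    print(f"reliability (closed)  = {reliability_closed_form:.4f}")
    print(f"reliability (MC)      = {reliability_mc:.4f}")
    ```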

  11. [Learning from errors: applying aviation safety concepts to medicine].

    PubMed

    Sommer, K-J

    2012-11-01

    Health care safety levels fall below those of other complex industries. Throughout its history, civil aviation has developed methods and concepts that have made the airplane one of the safest means of mass transport. Key elements are accident investigations that focus on cause instead of blame, human-centered design of machinery and processes, continuous training of all personnel, and a shared safety culture. These methods and concepts can in principle be applied to medicine, which has been achieved successfully in certain areas; however, a comprehensive implementation remains to be completed. This applies particularly to including the topic of safety in relevant curricula. Physicians are obliged by the oath "primum nil nocere" to act, but economic as well as political pressure will eventually confine professional freedom if initiative is not taken soon.

  12. Development and application of a local linearization algorithm for the integration of quaternion rate equations in real-time flight simulation problems

    NASA Technical Reports Server (NTRS)

    Barker, L. E., Jr.; Bowles, R. L.; Williams, L. H.

    1973-01-01

    High angular rates encountered in real-time flight simulation problems may require a more stable and accurate integration method than the classical methods normally used. A study was made to develop a general local linearization procedure of integrating dynamic system equations when using a digital computer in real-time. The procedure is specifically applied to the integration of the quaternion rate equations. For this application, results are compared to a classical second-order method. The local linearization approach is shown to have desirable stability characteristics and gives significant improvement in accuracy over the classical second-order integration methods.
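
    A common form of local linearization for the quaternion rate equations holds the body rates constant over a step and propagates with the exact matrix exponential, which preserves the quaternion norm; the sketch below contrasts that update with a classical second-order (Heun) step at a high constant rate. The rates, step size, and conventions are illustrative assumptions, not the report's algorithm.

    ```python
    import numpy as np

    def omega_matrix(w):
        """4x4 skew-symmetric matrix so that qdot = 0.5 * Omega(w) @ q (scalar-first)."""
        wx, wy, wz = w
        return np.array([[0.0, -wx, -wy, -wz],
                         [wx,  0.0,  wz, -wy],
                         [wy, -wz,  0.0,  wx],
                         [wz,  wy, -wx,  0.0]])

    def step_local_linearization(q, w, dt):
        """Exact propagation for body rates held constant over the step."""
        wn = np.linalg.norm(w)
        half = 0.5 * wn * dt
        return (np.cos(half) * np.eye(4) + np.sin(half) / wn * omega_matrix(w)) @ q

    def step_second_order(q, w, dt):
        """Classical second-order (Heun) integration of the quaternion rate equation."""
        k1 = 0.5 * omega_matrix(w) @ q
        k2 = 0.5 * omega_matrix(w) @ (q + dt * k1)
        return q + 0.5 * dt * (k1 + k2)

    # High constant body rate (rad/s), illustrative of the regime discussed.
    w = np.array([6.0, -4.0, 3.0])
    dt, steps = 0.05, 2000
    q_ll = np.array([1.0, 0.0, 0.0, 0.0])
    q_2nd = np.array([1.0, 0.0, 0.0, 0.0])
    for _ in range(steps):
        q_ll = step_local_linearization(q_ll, w, dt)
        q_2nd = step_second_order(q_2nd, w, dt)

    # The exact (locally linearized) update keeps the norm at 1; the classical
    # second-order method drifts at high rates.
    print("norm after 100 s, local linearization:", np.linalg.norm(q_ll))
    print("norm after 100 s, 2nd-order method   :", np.linalg.norm(q_2nd))
    ```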

  13. Increasing role of arthropod bites in tularaemia transmission in Poland - case reports and diagnostic methods.

    PubMed

    Formińska, Kamila; Zasada, Aleksandra A; Rastawicki, Waldemar; Śmietańska, Karolina; Bander, Dorota; Wawrzynowicz-Syczewska, Marta; Yanushevych, Mariya; Niścigórska-Olsen, Jolanta; Wawszczak, Marek

    2015-01-01

    The study describes four cases of tularaemia - one developed after contact with rabbits and three developed after an arthropod bite. Due to non-specific clinical symptoms, accurate diagnosis of tularaemia may be difficult. The increasing contribution of the arthropod vectors in the transmission of the disease indicates that special effort should be made to apply sensitive and specific diagnostic methods for tularaemia, and to remind health-care workers about this route of Francisella tularensis infections. The advantages and disadvantages of various diagnostic methods - molecular, serological and microbiological culture - are discussed. The PCR as a rapid and proper diagnostic method for ulceroglandular tularaemia is presented.

  14. Spectrophotometric Method for the Determination of Two Coformulated Drugs with Highly Different Concentrations. Application on Vildagliptin and Metformin Hydrochloride

    NASA Astrophysics Data System (ADS)

    Zaazaa, H. E.; Elzanfaly, E. S.; Soudi, A. T.; Salem, M. Y.

    2016-03-01

    A new, smart, simple, validated spectrophotometric method was developed for the determination of two drugs, one of which is present at a very low concentration compared to the other. The method is based on spiking and dilution followed by simple mathematical manipulation of the absorbance spectra. This method was applied to the determination of a binary mixture of vildagliptin and metformin hydrochloride in the ratio 50:850, in laboratory-prepared mixtures containing both drugs in this ratio and in a pharmaceutical dosage form, with good recoveries. The developed method was validated according to ICH guidelines and can be used for routine quality control testing.

  15. Objective comparison of particle tracking methods

    PubMed Central

    Chenouard, Nicolas; Smal, Ihor; de Chaumont, Fabrice; Maška, Martin; Sbalzarini, Ivo F.; Gong, Yuanhao; Cardinale, Janick; Carthel, Craig; Coraluppi, Stefano; Winter, Mark; Cohen, Andrew R.; Godinez, William J.; Rohr, Karl; Kalaidzidis, Yannis; Liang, Liang; Duncan, James; Shen, Hongying; Xu, Yingke; Magnusson, Klas E. G.; Jaldén, Joakim; Blau, Helen M.; Paul-Gilloteaux, Perrine; Roudot, Philippe; Kervrann, Charles; Waharte, François; Tinevez, Jean-Yves; Shorte, Spencer L.; Willemse, Joost; Celler, Katherine; van Wezel, Gilles P.; Dan, Han-Wei; Tsai, Yuh-Show; de Solórzano, Carlos Ortiz; Olivo-Marin, Jean-Christophe; Meijering, Erik

    2014-01-01

    Particle tracking is of key importance for quantitative analysis of intracellular dynamic processes from time-lapse microscopy image data. Since manually detecting and following large numbers of individual particles is not feasible, automated computational methods have been developed for these tasks by many groups. Aiming to perform an objective comparison of methods, we gathered the community and organized, for the first time, an open competition, in which participating teams applied their own methods independently to a commonly defined data set including diverse scenarios. Performance was assessed using commonly defined measures. Although no single method performed best across all scenarios, the results revealed clear differences between the various approaches, leading to important practical conclusions for users and developers. PMID:24441936

  16. Development of a nonlinear vortex method

    NASA Technical Reports Server (NTRS)

    Kandil, O. A.

    1982-01-01

    A steady and unsteady Nonlinear Hybrid Vortex (NHV) method for low-aspect-ratio wings at large angles of attack is developed. The method uses vortex panels with first-order vorticity distribution (equivalent to second-order doublet distribution) to calculate the induced velocity in the near field using closed-form expressions. In the far field, the distributed vorticity is reduced to concentrated vortex lines and the simpler Biot-Savart law is employed. The method is applied to rectangular wings in steady and unsteady flows without any restriction on the order of magnitude of the disturbances in the flow field. The numerical results show that the method accurately predicts the distributed aerodynamic loads and that it is of acceptable computational efficiency.
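
    The far-field step described above reduces distributed vorticity to concentrated vortex lines evaluated with the Biot-Savart law; the sketch below implements the standard closed-form induced velocity of a straight vortex segment (Katz and Plotkin form) and checks it against the two-dimensional point-vortex limit. It is a generic illustration, not the NHV code.

    ```python
    import numpy as np

    def vortex_segment_velocity(p, a, b, gamma):
        """Induced velocity at point p from a straight vortex segment a->b
        carrying circulation gamma (Biot-Savart law, Katz & Plotkin form)."""
        r0 = b - a
        r1 = p - a
        r2 = p - b
        cross = np.cross(r1, r2)
        denom = np.dot(cross, cross)
        if denom < 1e-12:                      # point lies on the filament axis
            return np.zeros(3)
        k = gamma / (4.0 * np.pi * denom)
        return k * cross * np.dot(r0, r1 / np.linalg.norm(r1) - r2 / np.linalg.norm(r2))

    # Illustrative check: a long segment along x induces ~ gamma/(2*pi*h) at
    # distance h from its midpoint, i.e. the 2D point-vortex result.
    a = np.array([-50.0, 0.0, 0.0])
    b = np.array([50.0, 0.0, 0.0])
    p = np.array([0.0, 1.0, 0.0])
    v = vortex_segment_velocity(p, a, b, gamma=1.0)
    print(np.linalg.norm(v), "vs", 1.0 / (2.0 * np.pi))
    ```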

  17. Laser processing for manufacturing nanocarbon materials

    NASA Astrophysics Data System (ADS)

    Van, Hai Hoang

    CNTs have been considered an excellent candidate to revolutionize a broad range of applications. Many methods have been developed to manipulate the chemistry and structure of CNTs. Lasers, with their non-contact treatment capability, offer many processing advantages, including solid-state treatment, extremely fast processing rates, and high processing resolution. In addition, the outstanding monochromatic, coherent, and directional beam generates powerful energy absorption and the resulting extreme processing conditions. In my research, a unique laser scanning method was developed to process CNTs, controlling both oxidation and graphitization. The controllability achieved with this method was applied to address important issues of current CNT processing methods in three applications.
    The controllable oxidation of CNTs by the laser scanning method was applied to cut CNT films to produce high-performance cathodes for field emission (FE) devices. The production method includes two important self-developed techniques to produce the cold cathodes: the production of highly oriented and uniformly distributed CNT sheets and a precise laser trimming process. Laser cutting was the only method able to produce cathodes with this combination of remarkable features, including an ultrathin freestanding structure (~200 nm), a very high aspect ratio, hybrid CNT-GNR emitter arrays, even emitter separation, and directional emitter alignment. This unique cathode structure was unachievable by other methods. The developed FE devices successfully solved the screening-effect issue encountered by current FE devices.
    The laser-controlled oxidation method was further developed to sequentially remove graphitic walls of CNTs. The laser oxidation process was directed to occur along the CNT axes by the laser scanning direction. Additionally, the oxidation was assisted by the curvature stress and the thermal expansion of the graphitic nanotubes, ultimately opening (namely unzipping) the tubular structure to produce GNRs. The developed laser scanning method therefore optimally exploited the thermal laser-CNT interaction, successfully transforming CNTs into 2D GNRs. The solid-state laser unzipping process effectively addressed the issues of contamination and scalability encountered by current unzipping methods. Additionally, the produced GNRs were uniquely characterized by a freestanding structure and smooth surfaces.
    If the scanning process was performed in an inert environment without oxygen, the oxidation of CNTs would not happen. Instead, the highly mobile carbon atoms of the heated CNTs would reorganize the crystal structure, inducing graphitization and improving the crystallinity. Many observations showing the structural improvement of CNTs under laser irradiation have been reported, confirming the capability of lasers to heal graphitic defects. Laser methods are more time- and energy-efficient than other annealing methods because a laser can heat CNTs quickly enough to generate graphitization in less than one second. This subsecond heating was also more effective than other heating methods because it avoided the undesired coalescence of CNTs. In my research, the laser scanning method was applied to generate graphitization and heal the structural defects of CNTs. Different from previously reported laser methods, the laser scanning directed the locally annealed areas to move along the CNT axes, migrating and coalescing the graphitic defects to achieve better healing results.
    The critical information describing the CNT structural transformation caused by the moving laser irradiation was obtained from the successful applications of the developed laser method. This knowledge inspires an important method to modify the general graphitic structure for important applications, such as carbon fiber production, CNT self-assembly, and CNT welding. This method will be effective, facile, versatile, and adaptable for laboratory and industrial facilities.

  18. Development of an oximeter for neurology

    NASA Astrophysics Data System (ADS)

    Aleinik, A.; Serikbekova, Z.; Zhukova, N.; Zhukova, I.; Nikitina, M.

    2016-06-01

    Cerebral desaturation can occur during surgical manipulation while other parameters vary insignificantly. Prolonged intervals of cerebral anoxia can cause serious damage to the nervous system. A commonly used method for measuring cerebral blood flow uses invasive catheters. Other techniques include single photon emission computed tomography (SPECT), positron emission tomography (PET), and magnetic resonance imaging (MRI). Tomographic methods frequently require isotope administration, which may result in anaphylactic reactions to contrast media and associated nerve diseases. Moreover, the high cost and the need for continuous monitoring make it difficult to apply these techniques in clinical practice. Cerebral oximetry is a method for measuring oxygen saturation using infrared spectrometry, and reflectance pulse oximetry can additionally detect sudden changes in sympathetic tone. For this purpose, a reflectance pulse oximeter for use in neurology was developed. Reflectance oximetry has a definite advantage in that it can be used to measure oxygen saturation in any part of the body. Preliminary results indicate that the device has good resolution and high reliability. The modern schematics applied have improved the device characteristics compared with existing ones.
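
    Reflectance pulse oximetry estimates saturation from the red and infrared photoplethysmogram amplitudes via the "ratio of ratios"; the sketch below uses the widely quoted textbook linear calibration SpO2 ~ 110 - 25R, which is an assumption for illustration and not the calibration of the device described.

    ```python
    def spo2_from_ratio_of_ratios(ac_red, dc_red, ac_ir, dc_ir):
        """Estimate SpO2 from the red/infrared 'ratio of ratios'.

        Uses the widely quoted linear approximation SpO2 ~ 110 - 25*R; a real
        device is calibrated empirically against reference measurements."""
        r = (ac_red / dc_red) / (ac_ir / dc_ir)
        return 110.0 - 25.0 * r

    # Illustrative photoplethysmogram amplitudes (arbitrary units).
    print(f"SpO2 ~ {spo2_from_ratio_of_ratios(0.01, 1.0, 0.03, 1.5):.1f} %")
    ```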

  19. Promotion of a healthy work life at small enterprises in Thailand by participatory methods.

    PubMed

    Krungkraiwong, Sudthida; Itani, Toru; Amornratanapaichit, Ratanaporn

    2006-01-01

    The major problems of small enterprises include unfavourable working conditions and environment that affect safety and health of workers. The WISE (Work Improvement in Small Enterprises) methodology developed by the ILO has been widely applied to improve occupational safety and health in small enterprises in Thailand. The participatory methods building on local good practices and focusing on practicable improvements have proven effective in controlling the occupational hazards in these enterprises at their sources. As a result of applying the methods in small-scale industries, the frequency of occupational accidents was reduced and the working environment actually improved in the cases studied. The results prove that the participatory approach taken by the WISE activities is a useful and effective tool to make owner/managers and workers in small enterprises voluntarily improve their own working conditions and environment. In promoting a healthy work life at small enterprises in Thailand, it is important to further develop and spread the approach.

  20. Digital Archiving of People Flow by Recycling Large-Scale Social Survey Data of Developing Cities

    NASA Astrophysics Data System (ADS)

    Sekimoto, Y.; Watanabe, A.; Nakamura, T.; Horanont, T.

    2012-07-01

    Data on people flow have become increasingly important in business, including marketing and public services. Although mobile phones enable a person's position to be located to a certain degree, it is a challenge to acquire sufficient data from people with mobile phones. In order to grasp people flow in its entirety, it is important to establish a practical method of reconstructing people flow from various kinds of existing fragmentary spatio-temporal data, such as social survey data. For example, although typical Person Trip Survey data collected by the public sector record only fragmentary spatio-temporal positions, the data are attractive given a sample size sufficiently large to estimate the entire flow of people. In this study, we apply our proposed basic method to Japan International Cooperation Agency (JICA) PT data pertaining to developing cities around the world, and we propose some correction methods to resolve the difficulties in applying it stably to many cities and to infrastructure data.

  1. Metabolite profiling on apple volatile content based on solid phase microextraction and gas-chromatography time of flight mass spectrometry.

    PubMed

    Aprea, Eugenio; Gika, Helen; Carlin, Silvia; Theodoridis, Georgios; Vrhovsek, Urska; Mattivi, Fulvio

    2011-07-15

    A headspace SPME GC-TOF-MS method was developed for the acquisition of metabolite profiles of apple volatiles. As a first step, an experimental design was applied to find out the most appropriate conditions for the extraction of apple volatile compounds by SPME. The selected SPME method was applied in profiling of four different apple varieties by GC-EI-TOF-MS. Full scan GC-MS data were processed by MarkerLynx software for peak picking, normalisation, alignment and feature extraction. Advanced chemometric/statistical techniques (PCA and PLS-DA) were used to explore data and extract useful information. Characteristic markers of each variety were successively identified using the NIST library thus providing useful information for variety classification. The developed HS-SPME sampling method is fully automated and proved useful in obtaining the fingerprint of the volatile content of the fruit. The described analytical protocol can aid in further studies of the apple metabolome. Copyright © 2011 Elsevier B.V. All rights reserved.
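
    The chemometric step described above projects the aligned peak table onto a few principal components so that varieties separate in the score plot; the sketch below applies autoscaling and PCA to an invented feature matrix (scikit-learn is assumed; the data are not the apple measurements).

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Toy feature matrix standing in for aligned peak areas: rows are apple
    # samples (three per hypothetical variety), columns are volatile features.
    rng = np.random.default_rng(3)
    variety_means = np.array([[5.0, 1.0, 0.5, 2.0],
                              [1.0, 4.0, 2.5, 0.5],
                              [2.0, 2.0, 5.0, 1.0],
                              [0.5, 1.5, 1.0, 5.0]])
    X = np.vstack([m + 0.3 * rng.standard_normal((3, 4)) for m in variety_means])

    # Autoscale, then project onto the first two principal components; samples
    # from the same variety should cluster together in the score plot.
    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
    for i, (pc1, pc2) in enumerate(scores):
        print(f"sample {i:2d} (variety {i // 3}): PC1={pc1:+.2f}  PC2={pc2:+.2f}")
    ```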

  2. Flexible methods for segmentation evaluation: results from CT-based luggage screening.

    PubMed

    Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry

    2014-01-01

    Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms' behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of segmentation algorithms. Our objective was to develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors; the methods must also measure feature recovery and allow us to prioritize segments. We developed two complementary evaluation methods using statistical techniques and information theory, and we created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening and validated them with synthetic problems and an observer evaluation. Both methods selected the same best segmentation algorithm, and human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms.
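
    The authors' statistical and information-theoretic measures are not given in the abstract; as a simpler stand-in for the kinds of quantities involved, the sketch below computes the Dice overlap together with crude over- and under-segmentation fractions for a pair of binary masks (the masks and the specific measures are illustrative assumptions).

    ```python
    import numpy as np

    def dice(seg, truth):
        """Dice overlap between two binary masks."""
        seg, truth = seg.astype(bool), truth.astype(bool)
        inter = np.logical_and(seg, truth).sum()
        return 2.0 * inter / (seg.sum() + truth.sum())

    def over_under_fractions(seg, truth):
        """Fraction of segmented voxels outside truth (over-segmentation) and
        fraction of truth voxels missed (under-segmentation)."""
        seg, truth = seg.astype(bool), truth.astype(bool)
        over = np.logical_and(seg, ~truth).sum() / max(seg.sum(), 1)
        under = np.logical_and(~seg, truth).sum() / max(truth.sum(), 1)
        return over, under

    # Toy 2D example: ground-truth square vs a shifted segmentation.
    truth = np.zeros((20, 20), dtype=bool)
    truth[5:15, 5:15] = True
    seg = np.zeros_like(truth)
    seg[7:17, 5:15] = True
    print("Dice:", round(float(dice(seg, truth)), 3))
    print("over/under:", over_under_fractions(seg, truth))
    ```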

  3. Water surface temperature estimation from Landsat 7 ETM+ thermal infrared data using the generalized single-channel method: Case study of Embalse del Río Tercero (Córdoba, Argentina)

    NASA Astrophysics Data System (ADS)

    Lamaro, Anabel Alejandra; Mariñelarena, Alejandro; Torrusio, Sandra Edith; Sala, Silvia Estela

    2013-02-01

    Monitoring the distribution of heat in water is fundamental to understanding the performance and functioning of reservoirs and lakes. Surface water temperature is a key parameter in the physics of aquatic system processes, since it is closely related to the energy fluxes through the water-atmosphere interface. Remote sensing applied to water quality studies in inland waterbodies is a powerful tool that can provide additional information that is difficult to obtain by other means. The combination of good real-time coverage, spatial resolution and free availability of data makes the Landsat system a suitable alternative. Many papers have developed algorithms to retrieve surface temperature (principally land surface temperature) from at-sensor and surface emissivity data. The aim of this study is to apply the single-channel generalized method (SCGM) developed by Jiménez-Muñoz and Sobrino (2003) for the estimation of water surface temperature from Landsat 7 ETM+ thermal bands. We consider a constant water emissivity value (0.9885) and compare the results with the classic radiative transfer method (RTM). We chose Embalse del Río Tercero (Córdoba, Argentina) as a case study because it is a reservoir affected by the outlet of the cooling system of a nuclear power plant, whose thermal plume could influence the biota's distribution and biodiversity. These characteristics and the existence of long-term studies make it an adequate place to test the methodology. Values of estimated and observed water surface temperatures obtained by the two compared methods were correlated by applying a simple regression model. Correlation coefficients were significant (R2 = 0.9498 for the SCGM and R2 = 0.9584 for the RTM), while the standard errors were acceptable in both cases (SCGM: RMS = 1.2250; RTM: RMS = 1.0426). Nevertheless, the SCGM could estimate rather small differences in temperature between sites, consistent with the results obtained in field measurements. Besides, it has the advantage that it only requires values of atmospheric water vapor and can be applied to different thermal sensors using the same equation and coefficients.

  4. Methods for Estimating Environmental Effects and Constraints on NexGen: High Density Case Study

    NASA Technical Reports Server (NTRS)

    Augustine, S.; Ermatinger, C.; Graham, M.; Thompson, T.

    2010-01-01

    This document provides a summary of the current methods developed by Metron Aviation for the estimate of environmental effects and constraints on the Next Generation Air Transportation System (NextGen). This body of work incorporates many of the key elements necessary to achieve such an estimate. Each section contains the background and motivation for the technical elements of the work, a description of the methods used, and possible next steps. The current methods described in this document were selected in an attempt to provide a good balance between accuracy and fairly rapid turn around times to best advance Joint Planning and Development Office (JPDO) System Modeling and Analysis Division (SMAD) objectives while also supporting the needs of the JPDO Environmental Working Group (EWG). In particular this document describes methods applied to support the High Density (HD) Case Study performed during the spring of 2008. A reference day (in 2006) is modeled to describe current system capabilities while the future demand is applied to multiple alternatives to analyze system performance. The major variables in the alternatives are operational/procedural capabilities for airport, terminal, and en route airspace along with projected improvements to airframe, engine and navigational equipment.

  5. Applicability of a gene expression based prediction method to SD and Wistar rats: an example of CARCINOscreen®.

    PubMed

    Matsumoto, Hiroshi; Saito, Fumiyo; Takeyoshi, Masahiro

    2015-12-01

    Recently, the development of several gene expression-based prediction methods has been attempted in the field of toxicology. CARCINOscreen® is a gene expression-based screening method to predict, with high accuracy, the carcinogenicity of chemicals that target the liver. In this study, we investigated the applicability of the gene expression-based screening method to SD and Wistar rats by using CARCINOscreen®, originally developed with F344 rats, with two carcinogens, 2,4-diaminotoluene and thioacetamide, and two non-carcinogens, 2,6-diaminotoluene and sodium benzoate. After a 28-day repeated dose test was conducted with each chemical in SD and Wistar rats, microarray analysis was performed using total RNA extracted from each liver. The obtained gene expression data were applied to CARCINOscreen®. Prediction scores obtained by CARCINOscreen® for the known carcinogens were > 2 in all strains of rats, while the non-carcinogens gave prediction scores below 0.5. These results suggest that the gene expression-based screening method CARCINOscreen® can be applied to SD and Wistar rats, strains widely used in toxicological studies, by setting an appropriate boundary line for the prediction score to classify chemicals into carcinogens and non-carcinogens.

  6. Improved HPLC method with the aid of chemometric strategy: determination of loxoprofen in pharmaceutical formulation.

    PubMed

    Venkatesan, P; Janardhanan, V Sree; Muralidharan, C; Valliappan, K

    2012-06-01

    Loxoprofen belongs to the class of nonsteroidal anti-inflammatory drugs and acts by inhibiting the cyclo-oxygenase isoforms 1 and 2. In this study an improved RP-HPLC method was developed for the quantification of loxoprofen in a pharmaceutical dosage form. For that purpose an experimental design approach was employed. Three factors (independent variables: organic modifier, pH of the mobile phase and flow rate) were selected from the preliminary study, and three responses (dependent variables: loxoprofen retention factor, resolution between loxoprofen and probenecid, and retention time of probenecid) were chosen. To improve the method development and optimization step, Derringer's desirability function was applied to simultaneously optimize the three chosen responses. The procedure allowed deduction of the optimal conditions: acetonitrile:water (53:47, v/v), with the pH of the mobile phase adjusted to 2.9 with orthophosphoric acid. The separation was achieved in less than 4 minutes. The method was applied in the quality control of commercial tablets and showed good agreement between the experimental data and predicted values throughout the studied parameter space. The optimized assay conditions were validated according to International Conference on Harmonisation guidelines to confirm specificity, linearity, accuracy and precision.
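
    Derringer's desirability approach maps each response onto a 0-1 desirability and combines them by a geometric mean; the sketch below shows the standard maximize/minimize transformations on invented chromatographic responses (the limits and values are assumptions, not the optimized conditions reported).

    ```python
    import numpy as np

    def d_maximize(y, low, high, s=1.0):
        """Derringer desirability for a response to be maximized."""
        if y <= low:
            return 0.0
        if y >= high:
            return 1.0
        return ((y - low) / (high - low)) ** s

    def d_minimize(y, low, high, s=1.0):
        """Derringer desirability for a response to be minimized."""
        if y <= low:
            return 1.0
        if y >= high:
            return 0.0
        return ((high - y) / (high - low)) ** s

    def overall_desirability(ds):
        """Geometric mean of the individual desirabilities."""
        ds = np.asarray(ds, dtype=float)
        return float(np.prod(ds) ** (1.0 / ds.size))

    # Illustrative chromatographic responses (not the paper's data): resolution
    # to be maximized, retention factor and run time to be minimized.
    d1 = d_maximize(y=2.8, low=1.5, high=3.0)        # resolution
    d2 = d_minimize(y=2.1, low=1.0, high=5.0)        # retention factor
    d3 = d_minimize(y=3.8, low=2.0, high=8.0)        # run time (min)
    print(f"individual d: {d1:.2f}, {d2:.2f}, {d3:.2f}")
    print(f"overall D = {overall_desirability([d1, d2, d3]):.2f}")
    ```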

  7. Identification of an EMS-induced causal mutation in a gene required for boron-mediated root development by low-coverage genome re-sequencing in Arabidopsis

    PubMed Central

    Tabata, Ryo; Kamiya, Takehiro; Shigenobu, Shuji; Yamaguchi, Katsushi; Yamada, Masashi; Hasebe, Mitsuyasu; Fujiwara, Toru; Sawa, Shinichiro

    2013-01-01

    Next-generation sequencing (NGS) technologies enable the rapid production of an enormous quantity of sequence data. These powerful new technologies allow the identification of mutations by whole-genome sequencing. However, most reported NGS-based mapping methods, which are based on bulked segregant analysis, are costly and laborious. To address these limitations, we designed a versatile NGS-based mapping method that consists of a combination of low- to medium-coverage multiplex SOLiD (Sequencing by Oligonucleotide Ligation and Detection) sequencing and classical genetic rough mapping. Using only low to medium coverage reduces the SOLiD sequencing costs and, since just 10 to 20 mutant F2 plants are required for rough mapping, the operation is simple enough to handle in a laboratory with limited space and funding. As a proof of principle, we successfully applied this method to identify CTR1, which is involved in boron-mediated root development, from among a population of high-boron-requiring Arabidopsis thaliana mutants. Our work demonstrates that this NGS-based mapping method is a moderately priced and versatile method that can readily be applied to other model organisms. PMID:23104114

  8. Instrumentation development for In Situ 40Ar/39Ar planetary geochronology

    USGS Publications Warehouse

    Morgan, Leah; Munk, Madicken; Davidheiser-Kroll, Brett; Warner, Nicholas H.; Gupta, Sanjeev; Slaybaugh, Rachel; Harkness, Patrick; Mark, Darren

    2017-01-01

    The chronology of the Solar System, particularly the timing of formation of extra-terrestrial bodies and their features, is an outstanding problem in planetary science. Although various chronological methods for in situ geochronology have been proposed (e.g., Rb-Sr, K-Ar), and even applied (K-Ar), the reliability, accuracy, and applicability of the 40Ar/39Ar method makes it by far the most desirable chronometer for dating extra-terrestrial bodies. The method however relies on the neutron irradiation of samples, and thus a neutron source. Herein, we discuss the challenges and feasibility of deploying a passive neutron source to planetary surfaces for the in situ application of the 40Ar/39Ar chronometer. Requirements in generating and shielding neutrons, as well as analysing samples are described, along with an exploration of limitations such as mass, power and cost. Two potential solutions for the in situ extra-terrestrial deployment of the 40Ar/39Ar method are presented. Although this represents a challenging task, developing the technology to apply the 40Ar/39Ar method on planetary surfaces would represent a major advance towards constraining the timescale of solar system formation and evolution.

  9. Metrics in method engineering

    NASA Astrophysics Data System (ADS)

    Brinkkemper, S.; Rossi, M.

    1994-12-01

    As customizable computer-aided software engineering (CASE) tools, or CASE shells, have been introduced in academia and industry, there has been growing interest in the systematic construction of methods and their support environments, i.e. method engineering. To aid method developers and method selectors in their tasks, we propose two sets of metrics, which measure the complexity of diagrammatic specification techniques on the one hand, and of complete systems development methods on the other. The proposed metrics provide a relatively fast and simple way to analyze technique (or method) properties and, when accompanied by other selection criteria, can be used for estimating the cost of learning a technique and its relative complexity compared to others. To demonstrate the applicability of the proposed metrics, we have applied them to 34 techniques and 15 methods.

  10. Optimal Spatial Design of Capacity and Quantity of Rainwater Catchment Systems for Urban Flood Mitigation

    NASA Astrophysics Data System (ADS)

    Huang, C.; Hsu, N.

    2013-12-01

    This study incorporates Low-Impact Development (LID) rainwater catchment technology into the Storm Water Management Model (SWMM) to design the spatial capacity and quantity of rain barrels for urban flood mitigation. We propose a simulation-optimization model for effectively searching for the optimal design. In the simulation method, we design a series of regular spatial distributions of the capacity and quantity of rainwater catchment facilities, so that the flood reduction achieved by a variety of design forms can be simulated by SWMM. We further calculate the net benefit, equal to the decrease in inundation loss minus the facility cost, and the best solution of the simulation method is taken as the initial solution of the optimization model. In the optimization method, we first use the simulation results and a Back-Propagation Neural Network (BPNN) to develop a water-level simulation model of the urban drainage system, replacing SWMM, whose operation is based on a graphical user interface and is hard to combine with an optimization model and method. We then embed the BPNN-based simulation model into the optimization model, in which the objective function minimizes the negative net benefit. Finally, we establish a tabu search-based algorithm to optimize the planning solution. The developed method is applied to Zhonghe Dist., Taiwan. Results showed that applying tabu search and the BPNN-based simulation model in the optimization model not only finds solutions 12.75% better than the simulation method, but also resolves the limitations of previous studies. Furthermore, the optimized spatial rain barrel design can reduce inundation loss by 72% for historical flood events.

  11. Development of Implicit Methods in CFD NASA Ames Research Center 1970's - 1980's

    NASA Technical Reports Server (NTRS)

    Pulliam, Thomas H.

    2010-01-01

    The focus here is on the early development (mid 1970's-1980's) at NASA Ames Research Center of implicit methods in Computational Fluid Dynamics (CFD). A class of implicit finite difference schemes of the Beam and Warming approximate factorization type will be addressed. The emphasis will be on the Euler equations. A review of material pertinent to the solution of the Euler equations within the framework of implicit methods will be presented. The eigensystem of the equations will be used extensively in developing a framework for various methods applied to the Euler equations. The development and analysis of various aspects of this class of schemes will be given along with the motivations behind many of the choices. Various acceleration and efficiency modifications such as matrix reduction, diagonalization and flux split schemes will be presented.

  12. Quantitative Evaluation of Management Courses: Part 1

    ERIC Educational Resources Information Center

    Cunningham, Cyril

    1973-01-01

    The author describes how he developed a method of evaluating and comparing management courses of different types and lengths by applying an ordinal system of relative values using a process of transmutation. (MS)

  13. Quantitative Imaging in Cancer Clinical Trials

    PubMed Central

    Yankeelov, Thomas E.; Mankoff, David A.; Schwartz, Lawrence H.; Lieberman, Frank S.; Buatti, John M.; Mountz, James M.; Erickson, Bradley J.; Fennessy, Fiona M.M.; Huang, Wei; Kalpathy-Cramer, Jayashree; Wahl, Richard L.; Linden, Hannah M.; Kinahan, Paul; Zhao, Binsheng; Hylton, Nola M.; Gillies, Robert J.; Clarke, Laurence; Nordstrom, Robert; Rubin, Daniel L.

    2015-01-01

    As anti-cancer therapies designed to target specific molecular pathways have been developed, it has become critical to develop methods to assess the response induced by such agents. While traditional, anatomic CT and MRI exams are useful in many settings, there is increasing evidence that these methods cannot answer the fundamental biological and physiological questions essential for assessment and, eventually, prediction of treatment response in the clinical trial setting, especially in the critical period soon after treatment is initiated. To optimally apply advances in quantitative imaging methods to trials of targeted cancer therapy, new infrastructure improvements are needed that incorporate these emerging techniques into the settings where they are most likely to have impact. In this review, we first elucidate the needs for therapeutic response assessment in the era of molecularly targeted therapy and describe how quantitative imaging can most effectively provide scientifically and clinically relevant data. We then describe the tools and methods required to apply quantitative imaging and provide concrete examples of work making these advances practically available for routine application in clinical trials. We conclude by proposing strategies to surmount barriers to wider incorporation of these quantitative imaging methods into clinical trials and, eventually, clinical practice. Our goal is to encourage and guide the oncology community to deploy standardized quantitative imaging techniques in clinical trials to further personalize care for cancer patients, and to provide a more efficient path for the development of improved targeted therapies. PMID:26773162

  14. Comprehensive validation scheme for in situ fiber optics dissolution method for pharmaceutical drug product testing.

    PubMed

    Mirza, Tahseen; Liu, Qian Julie; Vivilecchia, Richard; Joshi, Yatindra

    2009-03-01

    There has been a growing interest during the past decade in the use of fiber optics dissolution testing. Use of this novel technology is mainly confined to research and development laboratories. It has not yet emerged as a tool for end product release testing despite its ability to generate in situ results and to improve efficiency. One potential reason may be the lack of clear validation guidelines that can be applied to assess the suitability of fiber optics. This article describes a comprehensive validation scheme and the development of a reliable, robust, reproducible and cost-effective dissolution test using fiber optics technology. The test was successfully applied to characterize the dissolution behavior of a 40-mg immediate-release tablet dosage form under development at Novartis Pharmaceuticals, East Hanover, New Jersey. The method was validated for the following parameters: linearity, precision, accuracy, specificity, and robustness. In particular, robustness was evaluated in terms of probe sampling depth and probe orientation. The in situ fiber optic method was found to be comparable to the existing manual sampling dissolution method. Finally, the fiber optic dissolution test was successfully performed by different operators on different days, to further enhance the validity of the method. The results demonstrate that the fiber optics technology can be successfully validated for end product dissolution/release testing. (c) 2008 Wiley-Liss, Inc. and the American Pharmacists Association
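
    The abstract reports that the in situ fiber optic method was comparable to the manual sampling method. One common way to quantify such comparability, though not necessarily the criterion used in this work, is the f2 similarity factor; the sketch below uses invented profile values.

    ```python
    # Sketch: f2 similarity factor, a standard way to compare two
    # dissolution profiles. The profile values are made up; the abstract
    # does not state that f2 was the comparison criterion used.
    import numpy as np

    def f2_similarity(reference, test):
        """f2 = 50 * log10(100 / sqrt(1 + mean squared difference)).
        Profiles are % dissolved at matching time points; f2 >= 50 is
        conventionally read as 'similar'."""
        r, t = np.asarray(reference, float), np.asarray(test, float)
        msd = np.mean((r - t) ** 2)
        return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

    manual      = [22, 45, 68, 83, 92, 97]   # % dissolved, manual sampling
    fiber_optic = [24, 47, 66, 85, 93, 96]   # % dissolved, in situ fiber optics
    print("f2 =", round(f2_similarity(manual, fiber_optic), 1))
    ```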

  15. Quantification of febuxostat polymorphs using powder X-ray diffraction technique.

    PubMed

    Qiu, Jing-bo; Li, Gang; Sheng, Yue; Zhu, Mu-rong

    2015-03-25

    Febuxostat is a pharmaceutical compound with more than 20 polymorphs, of which form A is the most widely used and usually exists in a mixed polymorphic form with form G. In the present study, a quantification method for polymorphic form A and form G of febuxostat (FEB) has been developed using powder X-ray diffraction (PXRD). Prior to development of the quantification method, pure polymorphic forms A and G are characterized. A continuous scan with a scan rate of 3° min(-1) over an angular range of 3-40° 2θ is applied for the construction of the calibration curve using the characteristic peaks of form A at 12.78° 2θ (I/I0 = 100%) and form G at 11.72° 2θ (I/I0 = 100%). The linear regression analysis data for the calibration plots show a good linear relationship, with R(2)=0.9985 with respect to peak area, in the concentration range 10-60 wt.%. The method is validated for precision, recovery and ruggedness. The limits of detection and quantitation are 1.5% and 4.6%, respectively. The obtained results prove that the method is repeatable, sensitive and accurate. The developed PXRD method can be applied for the quantitative analysis of mixtures of febuxostat polymorphs (forms A and G). Copyright © 2015 Elsevier B.V. All rights reserved.
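
    A minimal sketch of the calibration-curve step: linear regression of the form A peak response against its weight fraction in a binary mixture. The peak-area values are synthetic, and the LOD/LOQ estimates use the common 3.3σ/slope and 10σ/slope formulas, which may differ from the approach taken in the paper.

    ```python
    # Sketch: calibration curve for polymorph quantification by PXRD.
    # Peak areas are synthetic; LOD/LOQ use 3.3*sigma/slope and
    # 10*sigma/slope as illustrative estimates.
    import numpy as np

    conc = np.array([10, 20, 30, 40, 50, 60], dtype=float)          # wt.% form A
    peak_area = np.array([205, 398, 612, 795, 1010, 1190], float)   # synthetic

    slope, intercept = np.polyfit(conc, peak_area, 1)
    pred = slope * conc + intercept
    ss_res = np.sum((peak_area - pred) ** 2)
    ss_tot = np.sum((peak_area - peak_area.mean()) ** 2)
    r_squared = 1.0 - ss_res / ss_tot

    sigma = np.sqrt(ss_res / (len(conc) - 2))   # residual standard deviation
    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope

    print(f"area = {slope:.2f} * wt% + {intercept:.2f},  R^2 = {r_squared:.4f}")
    print(f"LOD ~ {lod:.2f} wt.%,  LOQ ~ {loq:.2f} wt.%")
    ```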

  16. Simultaneous quantification of methiocarb and its metabolites, methiocarb sulfoxide and methiocarb sulfone, in five food products of animal origin using tandem mass spectrometry.

    PubMed

    Rahman, Md Musfiqur; Abd El-Aty, A M; Na, Tae-Woong; Park, Joon-Seong; Kabir, Md Humayun; Chung, Hyung Suk; Lee, Han Sol; Shin, Ho-Chul; Shim, Jae-Han

    2017-08-15

    A simultaneous analytical method was developed for the determination of methiocarb and its metabolites, methiocarb sulfoxide and methiocarb sulfone, in five livestock products (chicken, pork, beef, table egg, and milk) using liquid chromatography-tandem mass spectrometry. Due to the rapid degradation of methiocarb and its metabolites, a quick sample preparation method was developed using acetonitrile and salts, followed by purification via dispersive solid-phase extraction (d-SPE). Seven-point calibration curves were constructed separately in each matrix, and good linearity was observed in each matrix-matched calibration curve, with a coefficient of determination (R(2)) ≥ 0.991. The limits of detection and quantification were 0.0016 and 0.005 mg/kg, respectively, for all tested analytes in the various matrices. The method was validated in triplicate at three fortification levels (equivalent to 1, 2, and 10 times the limit of quantification), with a recovery rate ranging from 76.4 to 118.0% and a relative standard deviation ≤ 10.0%. The developed method was successfully applied to market samples, and no residues of methiocarb and/or its metabolites were observed in the tested samples. In sum, this method can be applied for the routine analysis of methiocarb and its metabolites in foods of animal origin. Copyright © 2017 Elsevier B.V. All rights reserved.
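
    A minimal sketch of the recovery/precision check described above: triplicate spikes at 1, 2, and 10 times the LOQ. The measured concentrations are invented, and the acceptance limits (recovery 70-120%, RSD ≤ 10%) follow common residue-analysis guidelines rather than anything stated in this paper.

    ```python
    # Sketch: recovery and RSD at three fortification levels. Measured
    # values are invented; acceptance limits are assumed conventions.
    import numpy as np

    LOQ = 0.005  # mg/kg, as reported for all analytes
    fortifications = {1: [0.0042, 0.0047, 0.0051],
                      2: [0.0093, 0.0101, 0.0089],
                      10: [0.048, 0.052, 0.047]}   # mg/kg measured (invented)

    for factor, measured in fortifications.items():
        spiked = factor * LOQ
        m = np.asarray(measured)
        recovery = 100.0 * m.mean() / spiked
        rsd = 100.0 * m.std(ddof=1) / m.mean()
        ok = 70.0 <= recovery <= 120.0 and rsd <= 10.0
        print(f"{factor:>2}x LOQ: recovery {recovery:5.1f}%  RSD {rsd:4.1f}%  "
              f"{'pass' if ok else 'check'}")
    ```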

  17. Sea level side loads in high-area-ratio rocket engines

    NASA Technical Reports Server (NTRS)

    Nave, L. H.; Coffey, G. A.

    1973-01-01

    An empirical separation and side-load model for obtaining applied aerodynamic loads has been developed, based on data from testing of the full-scale J-2S engine (a 265K-pound-thrust engine with an area ratio of 40:1) and from model testing. Experimental data include visual observations of the separation patterns that show the dynamic nature of the separation phenomenon. Comparisons between measured and applied side loads are made. Correlations relating the separation location to the applied side loads, and the methods used to determine the separation location, are given.

  18. The SOBANE strategy for the management of risk, as applied to whole-body or hand-arm vibration.

    PubMed

    Malchaire, J; Piette, A

    2006-06-01

    The objective was to develop a coherent set of methods to be used effectively in industry to prevent and manage the risks associated with exposure to vibration, by coordinating the progressive intervention of the workers, their management, the occupational health and safety (OHS) professionals and the experts. The methods were developed separately for exposure to whole-body and hand-arm vibration. The SOBANE strategy of risk prevention includes four levels of intervention: level 1, Screening; level 2, Observation; level 3, Analysis; and level 4, Expertise. The methods making it possible to apply this strategy were developed for 14 types of risk factors. The article presents the methods specific to the prevention of the risks associated with exposure to vibration. The strategy is similar to those published for the risks associated with exposure to noise, heat and musculoskeletal disorders. It explicitly recognizes the qualifications of the workers and their management with regard to the work situation and shares the principle that measuring the exposure of the workers is not necessarily the first step in order to improve these situations. It attempts to optimize the recourse to the competences of the OHS professionals and the experts, in order to arrive more rapidly, effectively and economically at practical control measures.

  19. Solid-phase microfibers based on polyethylene glycol modified single-walled carbon nanotubes for the determination of chlorinated organic carriers in textiles.

    PubMed

    Zhang, Wei-Ya; Sun, Yin; Wang, Cheng-Ming; Wu, Cai-Ying

    2011-09-01

    Based on polyethylene glycol modified single-walled carbon nanotubes, a novel sol-gel fiber coating was prepared and applied to the headspace microextraction of chlorinated organic carriers (COCs) in textiles, with detection by gas chromatography-electron capture detection. The preparation of the polyethylene glycol modified single-walled carbon nanotubes and the sol-gel fiber coating process were described and confirmed by infrared spectra, Raman spectroscopy, and scanning electron microscopy. Several parameters affecting headspace microextraction, including extraction temperature, extraction time, salting-out effect, and desorption time, were optimized by detecting 11 COCs in simulated sweat samples. Compared with commercial solid-phase microextraction fibers, the sol-gel polyethylene glycol modified single-walled carbon nanotube fiber showed higher extraction efficiency, better thermal stability, and a longer life span. The method detection limits for COCs were in the range of 0.02 to 7.5 ng L(-1) (S/N = 3). The linear range of the developed method was 0.001 to 50 μg L(-1) for all analytes, with coefficients of correlation greater than 0.974. The developed method was successfully applied to the analysis of trace COCs in textiles; the recoveries of the analytes indicated that the developed method is considerably useful for the determination of COCs in ecological textile samples.

  20. Discrete-event system simulation on small and medium enterprises productivity improvement

    NASA Astrophysics Data System (ADS)

    Sulistio, J.; Hidayah, N. A.

    2017-12-01

    Small and medium industries in Indonesia are currently developing. The problem faced by SMEs is the difficulty of meeting the growing demand coming into the company. Therefore, SMEs need to analyze and evaluate their production processes in order to meet all orders. The purpose of this research is to increase the productivity of the SME production floor by applying discrete-event system simulation. This method is preferred because it can handle complex problems arising from the dynamic and stochastic nature of the system. To increase the credibility of the simulation, the model was validated by comparing the means of two trials, the variances of two trials, and a chi-square test. Afterwards, the Bonferroni method was applied to develop and compare several alternatives. The article concludes that the productivity of the SME production floor can be increased by up to 50% by adding capacity to the dyeing and drying machines.
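
    A minimal sketch of a discrete-event model of a two-stage production floor (dyeing followed by drying), in the spirit of the study above. The station counts, processing times, arrival rate, and shift length are invented, and the SimPy library is used only for illustration; the point is to show how added machine capacity can be compared across scenarios.

    ```python
    # Sketch: discrete-event simulation of a dyeing-then-drying line with
    # invented parameters, comparing throughput before/after adding capacity.
    import random
    import simpy

    def job(env, dyeing, drying, done):
        with dyeing.request() as req:
            yield req
            yield env.timeout(random.expovariate(1 / 30))   # dyeing ~30 min
        with drying.request() as req:
            yield req
            yield env.timeout(random.expovariate(1 / 45))   # drying ~45 min
        done.append(env.now)

    def arrivals(env, dyeing, drying, done):
        while True:
            yield env.timeout(random.expovariate(1 / 20))   # a job every ~20 min
            env.process(job(env, dyeing, drying, done))

    def run(n_dyeing, n_drying, horizon=8 * 60, seed=1):
        random.seed(seed)
        env = simpy.Environment()
        dyeing = simpy.Resource(env, capacity=n_dyeing)
        drying = simpy.Resource(env, capacity=n_drying)
        done = []
        env.process(arrivals(env, dyeing, drying, done))
        env.run(until=horizon)
        return len(done)   # jobs completed in one shift

    print("base case  (1 dyeing, 2 drying):", run(1, 2))
    print("added cap. (2 dyeing, 3 drying):", run(2, 3))
    ```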
