24 CFR 213.260 - Allowable methods of premium payment.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 24 Housing and Urban Development 2 2012-04-01 2012-04-01 false Allowable methods of premium payment. 213.260 Section 213.260 Housing and Urban Development Regulations Relating to Housing and Urban... Allowable methods of premium payment. Premiums shall be payable in cash or in debentures at par plus accrued...
24 CFR 213.260 - Allowable methods of premium payment.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 24 Housing and Urban Development 2 2011-04-01 2011-04-01 false Allowable methods of premium payment. 213.260 Section 213.260 Housing and Urban Development Regulations Relating to Housing and Urban... Allowable methods of premium payment. Premiums shall be payable in cash or in debentures at par plus accrued...
24 CFR 213.260 - Allowable methods of premium payment.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 2 2010-04-01 2010-04-01 false Allowable methods of premium payment. 213.260 Section 213.260 Housing and Urban Development Regulations Relating to Housing and Urban... Allowable methods of premium payment. Premiums shall be payable in cash or in debentures at par plus accrued...
24 CFR 213.260 - Allowable methods of premium payment.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 24 Housing and Urban Development 2 2013-04-01 2013-04-01 false Allowable methods of premium payment. 213.260 Section 213.260 Housing and Urban Development Regulations Relating to Housing and Urban... Allowable methods of premium payment. Premiums shall be payable in cash or in debentures at par plus accrued...
24 CFR 213.260 - Allowable methods of premium payment.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 24 Housing and Urban Development 2 2014-04-01 2014-04-01 false Allowable methods of premium payment. 213.260 Section 213.260 Housing and Urban Development Regulations Relating to Housing and Urban... Allowable methods of premium payment. Premiums shall be payable in cash or in debentures at par plus accrued...
13 CFR 305.6 - Allowable methods of procurement for construction services.
Code of Federal Regulations, 2013 CFR
2013-01-01
... design/build, construction management at risk and force account. If an alternate method is used, the... for construction services. 305.6 Section 305.6 Business Credit and Assistance ECONOMIC DEVELOPMENT... Approved Projects § 305.6 Allowable methods of procurement for construction services. (a) Recipients may...
13 CFR 305.6 - Allowable methods of procurement for construction services.
Code of Federal Regulations, 2012 CFR
2012-01-01
... design/build, construction management at risk and force account. If an alternate method is used, the... for construction services. 305.6 Section 305.6 Business Credit and Assistance ECONOMIC DEVELOPMENT... Approved Projects § 305.6 Allowable methods of procurement for construction services. (a) Recipients may...
13 CFR 305.6 - Allowable methods of procurement for construction services.
Code of Federal Regulations, 2014 CFR
2014-01-01
... design/build, construction management at risk and force account. If an alternate method is used, the... for construction services. 305.6 Section 305.6 Business Credit and Assistance ECONOMIC DEVELOPMENT... Approved Projects § 305.6 Allowable methods of procurement for construction services. (a) Recipients may...
Flexible Wing Base Micro Aerial Vehicles: Composite Materials for Micro Air Vehicles
NASA Technical Reports Server (NTRS)
Ifju, Peter G.; Ettinger, Scott; Jenkins, David; Martinez, Luis
2002-01-01
This paper will discuss the development of the University of Florida's Micro Air Vehicle concept. A series of flexible-wing-based aircraft that possess highly desirable flight characteristics was developed. Since computational methods to accurately model flight at the low Reynolds numbers associated with this scale are still under development, our effort has relied heavily on trial and error. Hence, a time-efficient method was developed to rapidly produce prototype designs. The airframe and wings are fabricated using a unique process that incorporates carbon fiber composite construction. Prototypes can be fabricated in around five man-hours, allowing many design revisions to be tested in a short period of time. The resulting aircraft are far more durable, yet lighter, than their conventional counterparts. This process allows for thorough testing of each design in order to determine what changes are required for the next prototype. The use of carbon fiber allows for wing flexibility without sacrificing durability. The construction methods developed for this project were the enabling technology that allowed us to implement our designs. The resulting aircraft were the winning entries in the International Micro Air Vehicle Competition for the past two years. Details of the construction method are provided in this paper along with a background on our flexible wing concept.
NASA Astrophysics Data System (ADS)
Pritykin, F. N.; Nefedov, D. I.; Rogoza, Yu A.; Zinchenko, Yu V.
2018-03-01
The article presents findings related to the development of a module for automatic detection of collisions between the manipulator and restricted zones during virtual motion modeling. It proposes a parametric method for specifying the area of allowable joint configurations. The authors study cases in which restricted zones are specified using the horizontal plane or front-projection planes. The joint coordinate space is specified by rectangular axes along which the angles defining the displacements in the turning pairs are laid off. The authors present the results of modeling that enabled the development of a parametric method for specifying a set of cross-sections defining the shape and position of allowable configurations for different positions of a restricted zone. All joint points that define allowable configurations belong to the indicated sections. The area of allowable configurations is specified analytically by using several kinematic surfaces that bound it. A geometric analysis is developed based on the use of the area of allowable configurations, which characterizes the position of the manipulator and the reported restricted zones. The paper presents numerical calculations related to virtual simulation of the manipulator path performed by the mobile robot Varan when using the developed algorithm and restricted zones. The obtained analytical dependencies allow the area of allowable configurations to be defined, which serves as a knowledge base for intelligent control of the manipulator path in a predefined environment. Using the obtained region to synthesize a joint trajectory makes it possible to correct the manipulator path and to foresee and eliminate deadlocks when synthesizing motions along the velocity vector.
Use of EPANET solver to manage water distribution in Smart City
NASA Astrophysics Data System (ADS)
Antonowicz, A.; Brodziak, R.; Bylka, J.; Mazurkiewicz, J.; Wojtecki, S.; Zakrzewski, P.
2018-02-01
The paper presents a method of using the EPANET solver to support management of a water distribution system in a Smart City. The main task is to develop an application that allows remote access to the simulation model of the water distribution network developed in the EPANET environment. The application allows both single and cyclic simulations to be performed, with a specified step for changing the values of the selected process variables. The architecture of the application is shown in the paper. The application supports the selection of the best device control algorithm using optimization methods. Optimization procedures are possible with the following methods: brute force, SLSQP (Sequential Least SQuares Programming), and the Modified Powell Method. The article is supplemented by an example of using the developed computer tool.
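As a minimal, hedged sketch of the pattern described (an EPANET simulation wrapped in an optimizer), the snippet below drives an EPANET model through the open-source wntr Python package and tunes a single control variable with SciPy's SLSQP method, one of the optimizers named above. The network file, pump and junction names, and the pump-speed attribute are assumptions for illustration, not details from the paper.

```python
# Sketch: wrap an EPANET hydraulic simulation in an objective function and
# search for a control setting with SciPy's SLSQP optimizer.
import wntr
from scipy.optimize import minimize

TARGET = 30.0  # target mean pressure at the monitored junction [m]

def run_scenario(pump_speed):
    """One EPANET run for a given relative pump speed (names hypothetical)."""
    wn = wntr.network.WaterNetworkModel('network.inp')   # hypothetical .inp file
    pump = wn.get_link('PU1')                            # hypothetical pump name
    pump.base_speed = float(pump_speed)                  # attribute assumed; adapt to the wntr version in use
    results = wntr.sim.EpanetSimulator(wn).run_sim()
    return results.node['pressure']['J1'].mean()         # hypothetical junction name

def objective(x):
    # Squared deviation from the pressure set point.
    return (run_scenario(x[0]) - TARGET) ** 2

res = minimize(objective, x0=[1.0], method='SLSQP', bounds=[(0.5, 1.5)])
print('best relative pump speed:', res.x[0])
```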
Hidden asymmetry and forward-backward correlations
NASA Astrophysics Data System (ADS)
Bialas, A.; Zalewski, K.
2010-09-01
A model-independent method of studying the forward-backward correlations in symmetric high-energy processes is developed. The method allows a systematic study of the properties of various particle sources and allows one to uncover asymmetric structures hidden in symmetric hadron-hadron and nucleus-nucleus inelastic reactions.
Comparing the performance of biomedical clustering methods.
Wiwie, Christian; Baumbach, Jan; Röttger, Richard
2015-11-01
Identifying groups of similar objects is a popular first step in biomedical data analysis, but it is error-prone and impossible to perform manually. Many computational methods have been developed to tackle this problem. Here we assessed 13 well-known methods using 24 data sets ranging from gene expression to protein domains. Performance was judged on the basis of 13 common cluster validity indices. We developed a clustering analysis platform, ClustEval (http://clusteval.mpi-inf.mpg.de), to promote streamlined evaluation, comparison and reproducibility of clustering results in the future. This allowed us to objectively evaluate the performance of all tools on all data sets with up to 1,000 different parameter sets each, resulting in a total of more than 4 million calculated cluster validity indices. We observed that there was no universal best performer, but on the basis of this wide-ranging comparison we were able to develop a short guideline for biomedical clustering tasks. ClustEval allows biomedical researchers to pick the appropriate tool for their data type and allows method developers to compare their tool to the state of the art.
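ClustEval itself is the Java platform described above; purely as an illustration of the underlying comparison idea (run several clustering tools over parameter settings and rank them with a validity index), here is a small scikit-learn sketch on synthetic data. The methods, parameter grid, and index are arbitrary choices, not the paper's protocol.

```python
# Minimal illustration of comparing clustering tools with a validity index.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

# Candidate tools and parameter settings to compare.
candidates = {f'kmeans(k={k})': KMeans(n_clusters=k, n_init=10, random_state=0)
              for k in (2, 3, 4, 5)}
candidates.update({f'agglomerative(k={k})': AgglomerativeClustering(n_clusters=k)
                   for k in (2, 3, 4, 5)})

# Score every candidate with one internal validity index.
scores = {name: silhouette_score(X, model.fit_predict(X))
          for name, model in candidates.items()}

for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f'{name:22s} silhouette = {score:.3f}')
```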
NASA Astrophysics Data System (ADS)
Khe Sun, Pak; Vorona-Slivinskaya, Lubov; Voskresenskay, Elena
2017-10-01
The article highlights the necessity of a complex approach to assessing the economic security of municipalities, one that considers the specifics of municipal management. The approach allows the economic security levels of municipalities to be compared, but it does not describe parameter differences between the compared municipalities. Therefore, a second method is suggested: the parameter rank order method. Applying these methods made it possible to identify the leaders and laggards in economic security among the municipalities and to rank all economic security parameters according to their significance level. A complex assessment of the economic security of municipalities, based on the combination of the two approaches, allowed the security level to be assessed more accurately. In order to assure economic security and equalize its threshold values, special attention should be paid to transportation system development in municipalities. Strategic aims of projects in the area of transportation infrastructure development in municipalities include the following issues: contribution to creating and elaborating transportation logistics and manufacturing transport complexes, development of transportation infrastructure with account of internal and external functions of the region, public transport development, and improvement of transport security and reduction of its negative influence on the environment.
Gel-based methods in redox proteomics.
Charles, Rebecca; Jayawardhana, Tamani; Eaton, Philip
2014-02-01
The key to understanding the full significance of oxidants in health and disease is the development of tools and methods that allow the study of proteins that sense and transduce changes in cellular redox. Oxidant-reactive deprotonated thiols commonly operate as redox sensors in proteins and a variety of methods have been developed that allow us to monitor their oxidative modification. This outline review specifically focuses on gel-based methods used to detect, quantify and identify protein thiol oxidative modifications. The techniques we discuss fall into one of two broad categories. Firstly, methods that allow oxidation of thiols in specific proteins or the global cellular pool to be monitored are discussed. These typically utilise thiol-labelling reagents that add a reporter moiety (e.g. affinity tag, fluorophore, chromophore), in which loss of labelling signifies oxidation. Secondly, we outline methods that allow specific thiol oxidation states of proteins (e.g. S-sulfenylation, S-nitrosylation, S-thionylation and interprotein disulfide bond formation) to be investigated. A variety of different gel-based methods for identifying thiol proteins that are sensitive to oxidative modifications have been developed. These methods can aid the detection and quantification of thiol redox state, as well as identifying the sensor protein. By understanding how cellular redox is sensed and transduced to a functional effect by protein thiol redox sensors, this will help us better appreciate the role of oxidants in health and disease. This article is part of a Special Issue entitled Current methods to study reactive oxygen species - pros and cons and biophysics of membrane proteins. Guest Editor: Christine Winterbourn. Copyright © 2013 Elsevier B.V. All rights reserved.
Flexible methods for segmentation evaluation: results from CT-based luggage screening.
Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry
2014-01-01
Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms' behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. The objective was to develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors; the methods must also measure feature recovery and allow segments to be prioritized. We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. Both methods selected the same best segmentation algorithm. Human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms.
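The paper's own statistical and information-theoretic measures are not reproduced here; as a hedged illustration of scoring a labelled segmentation against a labelled ground truth with generic information-theoretic indices, consider the sketch below (the toy label images are invented for the example).

```python
# Hedged sketch: generic agreement measures between a segmentation and a
# ground-truth labelling. These are standard indices, not the specific
# over-/under-segmentation statistics developed in the paper.
import numpy as np
from sklearn.metrics import adjusted_rand_score, normalized_mutual_info_score

ground_truth = np.array([[1, 1, 2, 2],
                         [1, 1, 2, 2],
                         [3, 3, 3, 3]])
segmentation = np.array([[1, 1, 2, 2],
                         [1, 2, 2, 2],
                         [3, 3, 4, 4]])   # one boundary error and one split

gt, seg = ground_truth.ravel(), segmentation.ravel()
print('adjusted Rand index   :', adjusted_rand_score(gt, seg))
print('normalized mutual info:', normalized_mutual_info_score(gt, seg))
```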
Cross-Proportions: A Conceptual Method for Developing Quantitative Problem-Solving Skills
ERIC Educational Resources Information Center
Cook, Elzbieta; Cook, Stephen L.
2005-01-01
The cross-proportion method allows both the instructor and the student to easily determine where an error is made during problem solving. The C-P method supports a strong cognitive foundation upon which students can develop other diagnostic methods as they advance in chemistry and scientific careers.
McCarthy, Jason R.; Weissleder, Ralph
2007-01-01
Background: Probes that allow site-specific protein labeling have become critical tools for visualizing biological processes. Methods: Here we used phage display to identify a novel peptide sequence with nanomolar affinity for near infrared (NIR) (benz)indolium fluorochromes. The developed peptide sequence (“IQ-tag”) allows detection of NIR dyes in a wide range of assays including ELISA, flow cytometry, high throughput screens, microscopy, and optical in vivo imaging. Significance: The described method is expected to have broad utility in numerous applications, namely site-specific protein imaging, target identification, cell tracking, and drug development. PMID:17653285
Item Pocket Method to Allow Response Review and Change in Computerized Adaptive Testing
ERIC Educational Resources Information Center
Han, Kyung T.
2013-01-01
Most computerized adaptive testing (CAT) programs do not allow test takers to review and change their responses because it could seriously deteriorate the efficiency of measurement and make tests vulnerable to manipulative test-taking strategies. Several modified testing methods have been developed that provide restricted review options while…
Integrating Poetry and "To Kill a Mockingbird."
ERIC Educational Resources Information Center
Jolley, Susan Arpajian
2002-01-01
Outlines a method of teaching "To Kill a Mockingbird" along with the study of poetry. Notes that this method allows students to consider the themes of courage and developing compassion. Concludes that teaching such a multigenre unit allows students to look for connections among fact and fiction, the past and present, their own lives and…
NASA Technical Reports Server (NTRS)
Greenwood, Eric, II; Schmitz, Fredric H.
2010-01-01
A new physics-based parameter identification method for rotor harmonic noise sources is developed using an acoustic inverse simulation technique. This new method allows for the identification of individual rotor harmonic noise sources and allows them to be characterized in terms of their individual non-dimensional governing parameters. This new method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor Blade-Vortex Interaction (BVI) noise, allowing accurate estimates of BVI noise to be made for operating conditions based on a small number of measurements taken at different operating conditions.
Towards efficient multi-scale methods for monitoring sugarcane aphid infestations in sorghum
USDA-ARS?s Scientific Manuscript database
We discuss approaches and issues involved with developing optimal monitoring methods for sugarcane aphid infestations (SCA) in grain sorghum. We discuss development of sequential sampling methods that allow for estimation of the number of aphids per sample unit, and statistical decision making rela...
Flexible methods for segmentation evaluation: Results from CT-based luggage screening
Karimi, Seemeen; Jiang, Xiaoqian; Cosman, Pamela; Martz, Harry
2017-01-01
BACKGROUND Imaging systems used in aviation security include segmentation algorithms in an automatic threat recognition pipeline. The segmentation algorithms evolve in response to emerging threats and changing performance requirements. Analysis of segmentation algorithms’ behavior, including the nature of errors and feature recovery, facilitates their development. However, evaluation methods from the literature provide limited characterization of the segmentation algorithms. OBJECTIVE To develop segmentation evaluation methods that measure systematic errors such as oversegmentation and undersegmentation, outliers, and overall errors. The methods must measure feature recovery and allow us to prioritize segments. METHODS We developed two complementary evaluation methods using statistical techniques and information theory. We also created a semi-automatic method to define ground truth from 3D images. We applied our methods to evaluate five segmentation algorithms developed for CT luggage screening. We validated our methods with synthetic problems and an observer evaluation. RESULTS Both methods selected the same best segmentation algorithm. Human evaluation confirmed the findings. The measurement of systematic errors and prioritization helped in understanding the behavior of each segmentation algorithm. CONCLUSIONS Our evaluation methods allow us to measure and explain the accuracy of segmentation algorithms. PMID:24699346
NASA Technical Reports Server (NTRS)
Chesler, L.; Pierce, S.
1971-01-01
Generalized, cyclic, and modified multistep numerical integration methods are developed and evaluated for application to problems of satellite orbit computation. Generalized methods are compared with the presently utilized Cowell methods; new cyclic methods are developed for special second-order differential equations; and several modified methods are developed and applied to orbit computation problems. Special computer programs were written to generate coefficients for these methods, and subroutines were written which allow use of these methods with NASA's GEOSTAR computer program.
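The report's generalized, cyclic, and modified methods (and the GEOSTAR coefficient programs) are not reproduced here; purely as a hedged illustration of the multistep idea applied to orbit computation, the sketch below integrates a planar Keplerian two-body problem with a plain two-step Adams-Bashforth scheme. The initial state and step size are arbitrary example values.

```python
# Illustrative two-step Adams-Bashforth integrator for a planar two-body orbit
# (a generic multistep scheme, not the report's special methods).
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter [km^3/s^2]

def deriv(state):
    """State = [x, y, vx, vy]; returns its time derivative."""
    r, v = state[:2], state[2:]
    a = -MU * r / np.linalg.norm(r) ** 3
    return np.concatenate([v, a])

def adams_bashforth2(state0, dt, n_steps):
    states = [state0]
    f_prev = deriv(state0)
    states.append(state0 + dt * f_prev)          # bootstrap with one Euler step
    for _ in range(n_steps - 1):
        f_curr = deriv(states[-1])
        states.append(states[-1] + dt * (1.5 * f_curr - 0.5 * f_prev))
        f_prev = f_curr
    return np.array(states)

# Roughly circular low-Earth orbit: r = 7000 km, v = sqrt(mu/r).
state0 = np.array([7000.0, 0.0, 0.0, np.sqrt(MU / 7000.0)])
trajectory = adams_bashforth2(state0, dt=10.0, n_steps=600)
print('final radius [km]:', np.linalg.norm(trajectory[-1, :2]))
```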
Immunocytochemical detection of astrocytes in brain slices in combination with Nissl staining.
Korzhevskii, D E; Otellin, V A
2005-07-01
The present study was performed to develop a simple and reliable method for combined staining of specimens, allowing the advantages of immunocytochemical detection of astrocytes and of assessment of the functional state of neurons by the Nissl method to be exploited simultaneously. The protocol suggested for processing paraffin sections preserves tissue structure at high quality and allows the selective identification of astrocytes with counterstaining of neurons by the Nissl method. The protocol can be used without modification for processing brain specimens from humans and various mammals, except mice and rabbits.
Navarrete, Alexander; Muñoz, Sergio; Sanz-Moral, Luis M; Brandner, Juergen J; Pfeifer, Peter; Martín, Ángel; Dittmeyer, Roland; Cocero, María J
2015-01-01
A novel plasmonic reactor concept is proposed and tested, working as a visible-light energy harvesting device while allowing reactions that transform CO2 to be carried out. In particular, the reverse water gas shift (RWGS) reaction has been tested as a means of introducing renewable energy into the economy. The development of the new reactor concept involved the synthesis of a new composite capable of plasmonic activation with light, the development of an impregnation method to create a single catalyst-reactor entity, and finally the assembly of a reaction system to test the reaction. The composite developed was based on a Cu/ZnO catalyst dispersed in transparent aerogels. This allows efficient light transmission and a high surface area for the catalyst. An effective yet simple impregnation method was developed that allowed the composites to be introduced into glass microchannels. The reaction was activated using LEDs that covered all sides of the reactor, allowing high power delivery. The results of the reaction show a stable process capable of low-temperature transformations.
Method of the active contour for segmentation of bone systems on bitmap images
NASA Astrophysics Data System (ADS)
Vu, Hai Anh; Safonov, Roman A.; Kolesnikova, Anna S.; Kirillova, Irina V.; Kossovich, Leonid U.
2018-02-01
Within the framework of active contour methods, an approach is developed that allows the contour of an object in an image to be separated out during segmentation. This approach exceeds the parametric method in speed while not conceding to it in accuracy. The proposed approach allows the contour of an object to be extracted from the image with high accuracy and more quickly than the parametric active contour method.
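The authors' specific algorithm is not reproduced here; as a hedged, generic illustration of a non-parametric (level-set style) active contour applied to a bitmap image, scikit-image's morphological Chan-Vese implementation can be used. The synthetic image and parameter values below are arbitrary choices, not values from the paper.

```python
# Hedged sketch: level-set style active contour on a synthetic bright region.
import numpy as np
from skimage.segmentation import morphological_chan_vese

# Synthetic bitmap: a bright elliptical object on a noisy dark background.
yy, xx = np.mgrid[0:128, 0:128]
image = ((xx - 64) ** 2 / 40 ** 2 + (yy - 64) ** 2 / 25 ** 2 < 1).astype(float)
image += 0.2 * np.random.default_rng(0).standard_normal(image.shape)

# Evolve the level set for a fixed number of iterations.
mask = morphological_chan_vese(image, 60, init_level_set='checkerboard', smoothing=2)
print('segmented pixels:', int(mask.sum()))
```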
[Solubilization Specificities Interferon beta-1b from Inclusion Bodies].
Zhuravko, A S; Kononova, N V; Bobruskin, A I
2015-01-01
A new solubilization method for recombinant interferon beta-1b (IFNβ-1b) from inclusion bodies was developed. The method allows the target protein to be selectively extracted into solutions of different alcohols, such as ethanol, propanol and isopropanol. It was shown that the most effective IFNβ-1b solubilization was achieved in the 55% propanol solution. This method allowed around 85-90% of the target protein to be extracted from the inclusion bodies and significantly reduced the Escherichia coli content of the solubilizate in comparison with standard methods.
Non-perturbative background field calculations
NASA Astrophysics Data System (ADS)
Stephens, C. R.
1988-01-01
New methods are developed for calculating one loop functional determinants in quantum field theory. Instead of relying on a calculation of all the eigenvalues of the small fluctuation equation, these techniques exploit the ability of the proper time formalism to reformulate an infinite dimensional field theoretic problem into a finite dimensional covariant quantum mechanical analog, thereby allowing powerful tools such as the method of Jacobi fields to be used advantageously in a field theory setting. More generally the methods developed herein should be extremely valuable when calculating quantum processes in non-constant background fields, offering a utilitarian alternative to the two standard methods of calculation—perturbation theory in the background field or taking the background field into account exactly. The formalism developed also allows for the approximate calculation of covariances of partial differential equations from a knowledge of the solutions of a homogeneous ordinary differential equation.
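The abstract's closing remark, that fluctuation determinants (covariances) can be obtained from solutions of a homogeneous ordinary differential equation, is in the spirit of the standard Gelfand-Yaglom result. The statement below is that standard result, quoted as context rather than from the paper itself:

```latex
% Standard Gelfand--Yaglom-type statement (context, not quoted from the paper):
% the ratio of one-dimensional fluctuation determinants on [0,T] with
% Dirichlet boundary conditions follows from the endpoint value of solutions
% of the corresponding homogeneous ODEs.
\[
  \frac{\det\!\left(-\partial_t^{2} + V_1(t)\right)}
       {\det\!\left(-\partial_t^{2} + V_2(t)\right)}
  = \frac{\psi_1(T)}{\psi_2(T)},
  \qquad
  \left(-\partial_t^{2} + V_i(t)\right)\psi_i(t) = 0,\quad
  \psi_i(0) = 0,\ \dot{\psi}_i(0) = 1 .
\]
```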
Development, history, and future of automated cell counters.
Green, Ralph; Wachsmann-Hogiu, Sebastian
2015-03-01
Modern automated hematology instruments use either optical methods (light scatter), impedance-based methods based on the Coulter principle (changes in electrical current induced by blood cells flowing through an electrically charged opening), or a combination of both optical and impedance-based methods. Progressive improvement in these instruments has allowed the enumeration and evaluation of blood cells with great accuracy, precision, and speed at very low cost. Future directions of hematology instrumentation include the addition of new parameters and the development of point-of-care instrumentation. In the future, in-vivo analysis of blood cells may allow noninvasive and near-continuous measurements. Copyright © 2015 Elsevier Inc. All rights reserved.
SDF technology in location and navigation procedures: a survey of applications
NASA Astrophysics Data System (ADS)
Kelner, Jan M.; Ziółkowski, Cezary
2017-04-01
The basis for the development of the Doppler location method, also called the signal Doppler frequency (SDF) method or technology, is the analytical solution of the wave equation for a mobile source. This paper presents an overview of simulations, numerical analyses and empirical studies of the possibilities and the range of applications of the SDF method. The various applications from numerous publications are collected and described. They mainly focus on the use of the SDF method in emitter positioning, electronic warfare, crisis management, search and rescue, and navigation. The developed method is characterized by a property that is innovative and unique among location methods: it allows the simultaneous location of many radio emitters. Moreover, this is the first Doppler-based method that allows transmitters to be positioned using a single mobile platform.
NASA Astrophysics Data System (ADS)
Sizov, Gennadi Y.
In this dissertation, a model-based multi-objective optimal design of permanent magnet ac machines, supplied by sine-wave current regulated drives, is developed and implemented. The design procedure uses an efficient electromagnetic finite element-based solver to accurately model nonlinear material properties and complex geometric shapes associated with magnetic circuit design. Application of an electromagnetic finite element-based solver allows for accurate computation of intricate performance parameters and characteristics. The first contribution of this dissertation is the development of a rapid computational method that allows accurate and efficient exploration of large multi-dimensional design spaces in search of optimum design(s). The computationally efficient finite element-based approach developed in this work provides a framework of tools that allow rapid analysis of synchronous electric machines operating under steady-state conditions. In the developed modeling approach, major steady-state performance parameters such as, winding flux linkages and voltages, average, cogging and ripple torques, stator core flux densities, core losses, efficiencies and saturated machine winding inductances, are calculated with minimum computational effort. In addition, the method includes means for rapid estimation of distributed stator forces and three-dimensional effects of stator and/or rotor skew on the performance of the machine. The second contribution of this dissertation is the development of the design synthesis and optimization method based on a differential evolution algorithm. The approach relies on the developed finite element-based modeling method for electromagnetic analysis and is able to tackle large-scale multi-objective design problems using modest computational resources. Overall, computational time savings of up to two orders of magnitude are achievable, when compared to current and prevalent state-of-the-art methods. These computational savings allow one to expand the optimization problem to achieve more complex and comprehensive design objectives. The method is used in the design process of several interior permanent magnet industrial motors. The presented case studies demonstrate that the developed finite element-based approach practically eliminates the need for using less accurate analytical and lumped parameter equivalent circuit models for electric machine design optimization. The design process and experimental validation of the case-study machines are detailed in the dissertation.
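The dissertation couples a finite-element machine model with a differential evolution optimizer; as a hedged sketch of the optimization layer only, the snippet below uses SciPy's differential_evolution with a toy surrogate objective standing in for the finite-element analysis. The design variables, bounds, and cost terms are illustrative assumptions, not values from the work.

```python
# Hedged sketch of the optimization layer: SciPy differential evolution over a
# small design space. evaluate_design() is a hypothetical surrogate for the
# electromagnetic finite-element evaluation used in the dissertation.
from scipy.optimize import differential_evolution

def evaluate_design(x):
    """Toy surrogate: x = (magnet thickness, slot opening) in arbitrary units.
    Returns a single penalized cost (e.g., -efficiency plus a ripple penalty)."""
    magnet, slot = x
    efficiency = 0.95 - 0.02 * (magnet - 4.0) ** 2 - 0.01 * (slot - 2.5) ** 2
    torque_ripple = 0.05 + 0.03 * abs(slot - 2.0)
    return -(efficiency - 0.5 * torque_ripple)

bounds = [(2.0, 6.0),   # magnet thickness
          (1.0, 4.0)]   # slot opening
result = differential_evolution(evaluate_design, bounds, seed=1, maxiter=50)
print('best design:', result.x, 'cost:', result.fun)
```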
Developing Sustainable Life Support System Concepts
NASA Technical Reports Server (NTRS)
Thomas, Evan A.
2010-01-01
Sustainable spacecraft life support concepts may allow the development of more reliable technologies for long duration space missions. Currently, life support technologies at different levels of development are not well evaluated against each other, and evaluation methods do not account for long term reliability and sustainability of the hardware. This paper presents point-of-departure sustainability evaluation criteria for life support systems, that may allow more robust technology development, testing and comparison. An example sustainable water recovery system concept is presented.
Using Mixed Methods to Analyze Video Data: A Mathematics Teacher Professional Development Example
ERIC Educational Resources Information Center
DeCuir-Gunby, Jessica T.; Marshall, Patricia L.; McCulloch, Allison W.
2012-01-01
This article uses data from 65 teachers participating in a K-2 mathematics professional development research project as an example of how to analyze video recordings of teachers' classroom lessons using mixed methods. Through their discussion, the authors demonstrate how using a mixed methods approach to classroom video analysis allows researchers…
Development of the mathematical model for design and verification of acoustic modal analysis methods
NASA Astrophysics Data System (ADS)
Siner, Alexander; Startseva, Maria
2016-10-01
To reduce turbofan noise it is necessary to develop methods, called modal analysis, for analyzing the sound field generated by the blade machinery. Because modal analysis methods are very complicated and testing them against full-scale measurements is very expensive and tedious, it is necessary to construct mathematical models that allow modal analysis algorithms to be tested quickly and cheaply. In this work, a model is presented that allows single modes to be set in the channel and the generated sound field to be analyzed. Modal analysis of the sound generated by a ring array of point sound sources is performed. A comparison of experimental and numerical modal analysis results is also presented.
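As a hedged illustration of the basic idea (impose a single mode on a ring array, then recover it from sampled pressures), the sketch below sets one azimuthal mode on a ring of point sources and identifies it with a discrete Fourier transform over the azimuth. The source/microphone counts and mode order are arbitrary choices, not values from the paper.

```python
# Hedged sketch: set a single azimuthal mode on a ring array and recover it
# from microphone samples by a DFT over the azimuth.
import numpy as np

m_set = 3                    # azimuthal mode order imposed on the ring array
n_mics = 16                  # microphones equally spaced around the duct wall
phi = 2 * np.pi * np.arange(n_mics) / n_mics

# Complex pressure sampled on the microphone ring for a single mode m_set.
pressure = np.exp(1j * m_set * phi)

# Modal amplitudes from the normalized DFT over azimuth.
modes = np.fft.fft(pressure) / n_mics
dominant = int(np.argmax(np.abs(modes)))
print('detected azimuthal mode order:',
      dominant if dominant <= n_mics // 2 else dominant - n_mics)
```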
Risk assessment for construction projects of transport infrastructure objects
NASA Astrophysics Data System (ADS)
Titarenko, Boris
2017-10-01
The paper analyzes and compares different methods of risk assessment for construction projects of transport infrastructure objects. The management of such projects demands the application of special probabilistic methods due to the large level of uncertainty in their implementation. Risk management in these projects requires the use of probabilistic and statistical methods. The aim of the work is to develop a methodology for using traditional methods in combination with robust methods that allow reliable risk assessments to be obtained. The robust approach is based on the principle of maximum likelihood and, in assessing risk, allows the researcher to obtain reliable results in situations of great uncertainty. The application of robust procedures allows a quantitative assessment of the main risk indicators to be carried out when solving the tasks of managing innovation-investment projects. The damage from the onset of a risk event can be calculated by any competent specialist, but assessing the probability of occurrence of a risk event requires special probabilistic methods based on the proposed robust approaches. Practice shows the effectiveness and reliability of the results. The methodology developed in the article can be used to create information technologies and to apply them in automated control systems for complex projects.
An integrated miRNA functional screening and target validation method for organ morphogenesis.
Rebustini, Ivan T; Vlahos, Maryann; Packer, Trevor; Kukuruzinska, Maria A; Maas, Richard L
2016-03-16
The relative ease of identifying microRNAs and their increasing recognition as important regulators of organogenesis motivate the development of methods to efficiently assess microRNA function during organ morphogenesis. In this context, embryonic organ explants provide a reliable and reproducible system that recapitulates some of the important early morphogenetic processes during organ development. Here we present a method to target microRNA function in explanted mouse embryonic organs. Our method combines the use of peptide-based nanoparticles to transfect specific microRNA inhibitors or activators into embryonic organ explants, with a microRNA pulldown assay that allows direct identification of microRNA targets. This method provides effective assessment of microRNA function during organ morphogenesis, allows prioritization of multiple microRNAs in parallel for subsequent genetic approaches, and can be applied to a variety of embryonic organs.
NREL'S Zunger Receives Top Scientific Honors
Zunger's research endeavors, specifically the development of pioneering theoretical methods for quantum-mechanical computations and predictions of the properties of solids. These methods allow the prediction of
Flexible Method for Developing Tactics, Techniques, and Procedures for Future Capabilities
2009-02-01
levels of ability, military experience, and motivation, (b) number and type of significant events, and (c) other sources of natural variability...research has developed a number of specific instruments designed to aid in this process. Second, the iterative, feed-forward nature of the method allows...FLEX method), but still lack the structured KE approach and iterative, feed-forward nature of the FLEX method. To facilitate decision making
Tools for in silico target fishing.
Cereto-Massagué, Adrià; Ojeda, María José; Valls, Cristina; Mulero, Miquel; Pujadas, Gerard; Garcia-Vallve, Santiago
2015-01-01
Computational target fishing methods are designed to identify the most probable target of a query molecule. This process may allow the prediction of the bioactivity of a compound, the identification of the mode of action of known drugs, the detection of drug polypharmacology, drug repositioning or the prediction of the adverse effects of a compound. The large amount of information regarding the bioactivity of thousands of small molecules now allows the development of these types of methods. In recent years, we have witnessed the emergence of many methods for in silico target fishing. Most of these methods are based on the similarity principle, i.e., that similar molecules might bind to the same targets and have similar bioactivities. However, the difficult validation of target fishing methods hinders comparisons of the performance of each method. In this review, we describe the different methods developed for target prediction, the bioactivity databases most frequently used by these methods, and the publicly available programs and servers that enable non-specialist users to obtain these types of predictions. It is expected that target prediction will have a large impact on drug development and on the functional food industry. Copyright © 2014 Elsevier Inc. All rights reserved.
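Most of the tools reviewed rest on the similarity principle; purely as a hedged, minimal sketch of that idea (not any specific tool from the review), the snippet below ranks two reference ligands by Tanimoto similarity to a query molecule using RDKit Morgan fingerprints. The SMILES strings and target labels are illustrative only.

```python
# Hedged sketch of the similarity principle behind ligand-based target fishing:
# rank reference ligands (with known targets) by similarity to a query molecule.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

references = {
    'aspirin-like (COX)': 'CC(=O)Oc1ccccc1C(=O)O',
    'caffeine-like (adenosine receptor)': 'Cn1cnc2c1c(=O)n(C)c(=O)n2C',
}
query = Chem.MolFromSmiles('CC(=O)Oc1ccccc1C(=O)OC')  # hypothetical query compound

def fingerprint(mol):
    # Morgan (circular) fingerprint, radius 2, 2048 bits.
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

query_fp = fingerprint(query)
for label, smiles in references.items():
    fp = fingerprint(Chem.MolFromSmiles(smiles))
    print(label, round(DataStructs.TanimotoSimilarity(query_fp, fp), 3))
```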
Biofilm thickness measurement using an ultrasound method in a liquid phase.
Maurício, R; Dias, C J; Jubilado, N; Santana, F
2013-10-01
In this report, the development of an online, noninvasive method for measuring biofilm thickness in a liquid phase is presented. The method is based on the analysis of the pulse-echo behavior of an ultrasound wave in a liquid phase reproducing the real reactor conditions. It does not require removal of the biomass from the support or any kind of intervention in the support (pipes) to detect and perform the measurements (non-invasiveness). The sensor of the developed method can be easily and quickly mounted and unmounted at any location along a pipe or reactor wall. Finally, this method is an important innovation because it allows biofilm thickness to be measured under liquid-phase conditions; this can be used in monitoring programs to help schedule cleaning actions to remove unwanted biofilm in several application areas, notably potable water supply pipes.
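The abstract does not give the working equation; a generic pulse-echo relation consistent with this kind of measurement (an assumption, not a quotation from the paper) is:

```latex
% Generic pulse-echo relation (assumed, not quoted from the paper): thickness
% follows from the delay between the echoes reflected at the biofilm surface
% and at the substratum, and from the sound speed in the film.
\[
  d = \frac{c_{\text{biofilm}}\,\Delta t}{2},
\]
% where \Delta t is the time between the two echoes and the factor 2 accounts
% for the round trip of the ultrasound pulse.
```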
Gentile, T. R.; Nacher, P. J.; Saam, B.; Walker, T. G.
2018-01-01
This article reviews the physics and technology of producing large quantities of highly spin-polarized 3He nuclei using spin-exchange (SEOP) and metastability-exchange (MEOP) optical pumping. Both technical developments and deeper understanding of the physical processes involved have led to substantial improvements in the capabilities of both methods. For SEOP, the use of spectrally narrowed lasers and K-Rb mixtures has substantially increased the achievable polarization and polarizing rate. For MEOP nearly lossless compression allows for rapid production of polarized 3He and operation in high magnetic fields has likewise significantly increased the pressure at which this method can be performed, and revealed new phenomena. Both methods have benefitted from development of storage methods that allow for spin-relaxation times of hundreds of hours, and specialized precision methods for polarimetry. SEOP and MEOP are now widely applied for spin-polarized targets, neutron spin filters, magnetic resonance imaging, and precision measurements. PMID:29503479
Gentile, T R; Nacher, P J; Saam, B; Walker, T G
2017-01-01
This article reviews the physics and technology of producing large quantities of highly spin-polarized 3He nuclei using spin-exchange (SEOP) and metastability-exchange (MEOP) optical pumping. Both technical developments and deeper understanding of the physical processes involved have led to substantial improvements in the capabilities of both methods. For SEOP, the use of spectrally narrowed lasers and K-Rb mixtures has substantially increased the achievable polarization and polarizing rate. For MEOP nearly lossless compression allows for rapid production of polarized 3He and operation in high magnetic fields has likewise significantly increased the pressure at which this method can be performed, and revealed new phenomena. Both methods have benefitted from development of storage methods that allow for spin-relaxation times of hundreds of hours, and specialized precision methods for polarimetry. SEOP and MEOP are now widely applied for spin-polarized targets, neutron spin filters, magnetic resonance imaging, and precision measurements.
Grabowska-Polanowska, Beata; Miarka, Przemysław; Skowron, Monika; Sułowicz, Joanna; Wojtyna, Katarzyna; Moskal, Karolina; Śliwka, Ireneusz
2017-10-01
Studies of volatile organic compounds emitted from skin are of interest to chemists, biologists and physicians due to their role in the development of different scientific areas, including medical diagnostics, forensic medicine and perfume design. This paper presents a proposal of two sampling methods applied to skin odor collection: the first uses a bag of cellulose film, the second uses cellulose sachets filled with active carbon. Volatile organic compounds were adsorbed on a carbon sorbent, removed via thermal desorption and analyzed using a gas chromatograph with a mass spectrometer. The first sampling method allowed identification of more compounds (52) compared to the second one (30). Quantitative analyses for acetone, butanal, pentanal and hexanal were performed. The skin odor sampling method using a bag of cellulose film allowed the identification of many more compounds when compared with the method using a sachet filled with active carbon.
NASA Astrophysics Data System (ADS)
Gentile, T. R.; Nacher, P. J.; Saam, B.; Walker, T. G.
2017-10-01
This article reviews the physics and technology of producing large quantities of highly spin-polarized 3He nuclei using spin-exchange (SEOP) and metastability-exchange (MEOP) optical pumping. Both technical developments and deeper understanding of the physical processes involved have led to substantial improvements in the capabilities of both methods. For SEOP, the use of spectrally narrowed lasers and K-Rb mixtures has substantially increased the achievable polarization and polarizing rate. For MEOP nearly lossless compression allows for rapid production of polarized 3He and operation in high magnetic fields has likewise significantly increased the pressure at which this method can be performed, and revealed new phenomena. Both methods have benefitted from development of storage methods that allow for spin-relaxation times of hundreds of hours, and specialized precision methods for polarimetry. SEOP and MEOP are now widely applied for spin-polarized targets, neutron spin filters, magnetic resonance imaging, and precision measurements.
Farnum, C E; Wilsman, N J
1984-06-01
A postembedment method for the localization of lectin-binding glycoconjugates was developed using Epon-embedded growth plate cartilage from Yucatan miniature swine. By testing a variety of etching, blocking, and incubation procedures, a standard protocol was developed for 1 micron thick sections that allowed visualization of both intracellular and extracellular glycoconjugates with affinity for wheat germ agglutinin and concanavalin A. Both fluorescent and peroxidase techniques were used, and comparisons were made between direct methods and indirect methods using the biotin-avidin bridging system. Differential extracellular lectin binding allowed visualization of interterritorial, territorial, and pericellular matrices. Double labeling experiments showed the precision with which intracellular binding could be localized to specific cytoplasmic compartments, with resolution of binding to the Golgi apparatus, endoplasmic reticulum, and nuclear membrane at the light microscopic level. This method allows the localization of both intracellular and extracellular lectin-binding glycoconjugates using fixation and embedment procedures that are compatible with simultaneous ultrastructural analysis. As such it should have applicability both to the morphological analysis of growth plate organization during normal endochondral ossification, as well as to the diagnostic pathology of matrix abnormalities in disease states of growing cartilage.
Acoustic Treatment Design Scaling Methods. Volume 1; Overview, Results, and Recommendations
NASA Technical Reports Server (NTRS)
Kraft, R. E.; Yu, J.
1999-01-01
Scale model fan rigs that simulate new generation ultra-high-bypass engines at about 1/5-scale are achieving increased importance as development vehicles for the design of low-noise aircraft engines. Testing at small scale allows the tests to be performed in existing anechoic wind tunnels, which provides an accurate simulation of the important effects of aircraft forward motion on the noise generation. The ability to design, build, and test miniaturized acoustic treatment panels on scale model fan rigs representative of the full-scale engine provides not only a cost savings, but also an opportunity to optimize the treatment by allowing tests of different designs. The primary objective of this study was to develop methods that will allow scale model fan rigs to be successfully used as acoustic treatment design tools. The study focuses on finding methods to extend the upper limit of the frequency range of impedance prediction models and acoustic impedance measurement methods for subscale treatment liner designs, and on confirming the predictions by correlation with measured data. This phase of the program had as a goal doubling the upper limit of impedance measurement from 6 kHz to 12 kHz. The program utilizes combined analytical and experimental methods to achieve the objectives.
Canine Hip Dysplasia: Diagnostic Imaging.
Butler, J Ryan; Gambino, Jennifer
2017-07-01
Diagnostic imaging is the principal method used to screen for and diagnose hip dysplasia in the canine patient. Multiple techniques are available, each having advantages, disadvantages, and limitations. Hip-extended radiography is the most used method and is best used as a screening tool and for assessment for osteoarthritis. Distraction radiographic methods such as the PennHip method allow for improved detection of laxity and improved ability to predict future osteoarthritis development. More advanced techniques such as MRI, although expensive and not widely available, may improve patient screening and allow for improved assessment of cartilage health. Copyright © 2017 Elsevier Inc. All rights reserved.
Jensen, Jacob S; Egebo, Max; Meyer, Anne S
2008-05-28
Fast tannin measurement is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from the spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions that would allow quantification of tannins from the spectra using partial least-squares regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm(-1)), which were therefore concluded to be particularly important for tannin quantification. The spectral regions identified by the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) as compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.
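The paper's specific selection algorithms (including the new iterative backward elimination method) are not reproduced here; purely as a hedged sketch of interval-based variable selection for a PLS calibration, the snippet below scores contiguous spectral windows by cross-validated PLS error on synthetic data. The window size, number of components, and synthetic response are arbitrary illustration choices.

```python
# Hedged sketch: score contiguous spectral windows with cross-validated PLS
# regression and keep the best one (a simple stand-in for interval-PLS style
# variable selection; data are synthetic).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 60, 200
X = rng.standard_normal((n_samples, n_wavenumbers))
# Synthetic "tannin" response driven by two narrow spectral regions.
y = (X[:, 40:50].sum(axis=1) - 0.5 * X[:, 150:160].sum(axis=1)
     + 0.1 * rng.standard_normal(n_samples))

window, best = 20, None
for start in range(0, n_wavenumbers, window):
    cols = slice(start, start + window)
    score = cross_val_score(PLSRegression(n_components=3), X[:, cols], y,
                            cv=5, scoring='neg_root_mean_squared_error').mean()
    if best is None or score > best[0]:
        best = (score, start)

print(f'best window starts at variable {best[1]}, CV-RMSEP = {-best[0]:.3f}')
```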
NASA Technical Reports Server (NTRS)
Brown, Andrew M.
2014-01-01
Numerical and Analytical methods developed to determine damage accumulation in specific engine components when speed variation included. Dither Life Ratio shown to be well over factor of 2 for specific example. Steady-State assumption shown to be accurate for most turbopump cases, allowing rapid calculation of DLR. If hot-fire speed data unknown, Monte Carlo method developed that uses speed statistics for similar engines. Application of techniques allow analyst to reduce both uncertainty and excess conservatism. High values of DLR could allow previously unacceptable part to pass HCF criteria without redesign. Given benefit and ease of implementation, recommend that any finite life turbomachine component analysis adopt these techniques. Probability Values calculated, compared, and evaluated for several industry-proposed methods for combining random and harmonic loads. Two new excel macros written to calculate combined load for any specific probability level. Closed form Curve fits generated for widely used 3(sigma) and 2(sigma) probability levels. For design of lightweight aerospace components, obtaining accurate, reproducible, statistically meaningful answer critical.
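The specific industry methods evaluated above are not reproduced; as a hedged sketch of the Monte Carlo idea for combining a harmonic load with a random load at a chosen probability level, consider the snippet below. The amplitude, standard deviation, and 3-sigma-equivalent percentile are illustrative assumptions.

```python
# Hedged sketch of Monte Carlo load combination: sample a sinusoidal load with
# random phase plus a Gaussian random load, then read the combined value at a
# 3-sigma-equivalent one-sided probability level (~99.865th percentile).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

harmonic_amplitude = 1.0   # peak of the sinusoidal (harmonic) load
random_sigma = 0.5         # standard deviation of the random load

phase = rng.uniform(0.0, 2.0 * np.pi, n)
combined = harmonic_amplitude * np.sin(phase) + rng.normal(0.0, random_sigma, n)

p_3sigma = 0.99865         # one-sided probability matching +3 sigma
print('combined load at the 3-sigma level:', np.quantile(combined, p_3sigma))
```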
ERIC Educational Resources Information Center
Newton, Robert; Marcella, Rita; Middleton, Iain; McConnell, Michael
This paper reports on ReMOTE (Research Methods Online Teaching Environment), a Robert Gordon University (Scotland) project focusing on the development of a World Wide Web (WWW) site devoted to the teaching of research methods. The aim of ReMOTE is to provide an infrastructure that allows direct links to specialist sources in order to enable the…
Herrlinger, Stephanie A; Shao, Qiang; Ma, Li; Brindley, Melinda; Chen, Jian-Fu
2018-04-26
The Zika virus (ZIKV) is a flavivirus currently endemic in North, Central, and South America. It is now established that the ZIKV can cause microcephaly and additional brain abnormalities. However, the mechanism underlying the pathogenesis of ZIKV in the developing brain remains unclear. Intracerebral surgical methods are frequently used in neuroscience research to address questions about both normal and abnormal brain development and brain function. This protocol utilizes classical surgical techniques and describes methods that allow one to model ZIKV-associated human neurological disease in the mouse nervous system. While direct brain inoculation does not model the normal mode of virus transmission, the method allows investigators to ask targeted questions concerning the consequence after ZIKV infection of the developing brain. This protocol describes embryonic, neonatal, and adult stages of intraventricular inoculation of ZIKV. Once mastered, this method can become a straightforward and reproducible technique that only takes a few hours to perform.
Current techniques for the real-time processing of complex radar signatures
NASA Astrophysics Data System (ADS)
Clay, E.
A real-time processing technique has been developed for the microwave receiver of the Brahms radar station. The method allows such target signatures as the radar cross section (RCS) of the airframes and rotating parts, the one-dimensional tomography of aircraft, and the RCS of electromagnetic decoys to be characterized. The method allows optimization of experimental parameters including the analysis frequency band, the receiver gain, and the wavelength range of EM analysis.
USDA-ARS?s Scientific Manuscript database
Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...
Quantifying Three-Dimensional Morphology and RNA from Individual Embryos
Green, Rebecca M.; Leach, Courtney L.; Hoehn, Natasha; Marcucio, Ralph S.; Hallgrímsson, Benedikt
2017-01-01
Quantitative analysis of morphogenesis aids our understanding of developmental processes by providing a method to link changes in shape with cellular and molecular processes. Over the last decade many methods have been developed for 3D imaging of embryos using microCT scanning to quantify the shape of embryos during development. These methods generally involve a powerful, cross-linking fixative such as paraformaldehyde to limit shrinkage during the CT scan. However, the extended time frames that these embryos are incubated in such fixatives prevent use of the tissues for molecular analysis after microCT scanning. This is a significant problem because it limits the ability to correlate variation in molecular data with morphology at the level of individual embryos. Here, we outline a novel method that allows RNA, DNA or protein isolation following CT scan while also allowing imaging of different tissue layers within the developing embryo. We show shape differences early in craniofacial development (E11.5) between common mouse genetic backgrounds, and demonstrate that we are able to generate RNA from these embryos after CT scanning that is suitable for downstream RT-PCR and RNAseq analyses. PMID:28152580
Fundamental Rotorcraft Acoustic Modeling From Experiments (FRAME)
NASA Technical Reports Server (NTRS)
Greenwood, Eric
2011-01-01
A new methodology is developed for the construction of helicopter source noise models for use in mission planning tools from experimental measurements of helicopter external noise radiation. The models are constructed by employing a parameter identification method to an assumed analytical model of the rotor harmonic noise sources. This new method allows for the identification of individual rotor harmonic noise sources and allows them to be characterized in terms of their individual non-dimensional governing parameters. The method is applied to both wind tunnel measurements and ground noise measurements of two-bladed rotors. The method is shown to match the parametric trends of main rotor harmonic noise, allowing accurate estimates of the dominant rotorcraft noise sources to be made for operating conditions based on a small number of measurements taken at different operating conditions. The ability of this method to estimate changes in noise radiation due to changes in ambient conditions is also demonstrated.
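The FRAME non-dimensional source models are not given in the abstract; as a hedged illustration of the parameter-identification step only, the sketch below fits the coefficients of an assumed noise model to synthetic measurements by linear least squares. The model form, governing variables, and data are assumptions for illustration, not the method's actual formulation.

```python
# Hedged sketch: identify coefficients of an assumed harmonic-noise model
# (a simple linear model in advance ratio and tip-path-plane angle) from
# synthetic sound-level measurements, then predict an unmeasured condition.
import numpy as np

rng = np.random.default_rng(1)
mu = rng.uniform(0.1, 0.3, 20)       # advance ratio of each measurement
alpha = rng.uniform(-4.0, 6.0, 20)   # tip-path-plane angle [deg]
spl = 100 + 30 * mu + 0.8 * alpha + rng.normal(0, 0.5, 20)  # synthetic SPL [dB]

# Design matrix for SPL ~ c0 + c1*mu + c2*alpha.
A = np.column_stack([np.ones_like(mu), mu, alpha])
coeffs, *_ = np.linalg.lstsq(A, spl, rcond=None)
print('identified coefficients:', coeffs)

# Estimate noise at an unmeasured operating condition (mu = 0.25, alpha = 2 deg).
print('predicted SPL:', coeffs @ np.array([1.0, 0.25, 2.0]))
```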
Solution and reasoning reuse in space planning and scheduling applications
NASA Technical Reports Server (NTRS)
Verfaillie, Gerard; Schiex, Thomas
1994-01-01
In the space domain, as in other domains, CSP (constraint satisfaction problem) techniques are increasingly used to represent and solve planning and scheduling problems. But these techniques have been developed to solve CSPs that are composed of fixed sets of variables and constraints, whereas many planning and scheduling problems are dynamic. It is therefore important to develop methods that allow a new solution to be rapidly found, as close as possible to the previous one, when some variables or constraints are added or removed. After presenting some existing approaches, this paper proposes a simple and efficient method, which has been developed on the basis of the dynamic backtracking algorithm. This method allows the previous solution and reasoning to be reused in the framework of a CSP that is close to the previous one. Some experimental results on general random CSPs and on operation scheduling problems for remote sensing satellites are given.
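To make the CSP framing concrete, here is a hedged sketch of a plain chronological backtracking solver on a toy scheduling problem; the paper's actual contribution (dynamic backtracking with reuse of the previous solution when variables or constraints change) is not reproduced.

```python
# Hedged sketch: plain chronological backtracking over a CSP (not the paper's
# dynamic-backtracking / solution-reuse method).
def solve(variables, domains, constraints, assignment=None):
    """variables: list of names; domains: dict name -> iterable of values;
    constraints: list of (scope_tuple, predicate) pairs."""
    assignment = dict(assignment or {})
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        # Check every constraint whose scope is fully assigned so far.
        if all(pred(*(assignment[v] for v in scope))
               for scope, pred in constraints
               if all(v in assignment for v in scope)):
            result = solve(variables, domains, constraints, assignment)
            if result is not None:
                return result
        del assignment[var]
    return None

# Toy scheduling example: three observations, no two in the same time slot.
variables = ['obs1', 'obs2', 'obs3']
domains = {v: [1, 2, 3] for v in variables}
constraints = [(('obs1', 'obs2'), lambda a, b: a != b),
               (('obs2', 'obs3'), lambda a, b: a != b),
               (('obs1', 'obs3'), lambda a, b: a != b)]
print(solve(variables, domains, constraints))
```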
Complete diagnostics of pyroactive structures for smart systems of optoelectronics
NASA Astrophysics Data System (ADS)
Bravina, Svetlana L.; Morozovsky, Nicholas V.
1998-04-01
The results of a study of pyroelectric phenomena in ferroelectric materials, examining the possibility of embodying functions promising for the creation of smart systems for optoelectronic applications, are presented. Designing such systems requires the development of methods for non-destructive complete diagnostics, preferably by developing the self-diagnostic ability inherent in materials with the features of smart/intelligent ones. The complex method of complete non-destructive qualification of pyroactive materials, based on the method of the dynamic photopyroelectric effect, allows the determination of pyroelectric, piezoelectric, ferroelectric, dielectric and thermophysical characteristics. The measuring system, which allows the study of these characteristics as well as memory effects, switching effects, fatigue and degradation processes, self-repair processes and others, is presented. Sample pyroactive systems with increased intelligence, such as systems with a built-in adaptive controllable domain structure promising for functional optics, are developed and the peculiarities of their characterization are discussed.
FLARE STARS—A FAVORABLE OBJECT FOR STUDYING MECHANISMS OF NONTHERMAL ASTROPHYSICAL PHENOMENA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oks, E.; Gershberg, R. E.
2016-03-01
We present a spectroscopic method for diagnosing a low-frequency electrostatic plasma turbulence (LEPT) in plasmas of flare stars. This method had been previously developed by one of us and successfully applied to diagnosing the LEPT in solar flares. In distinction to our previous applications of the method, here we use the latest advances in the theory of the Stark broadening of hydrogen spectral lines. By analyzing observed emission Balmer lines, we show that it is very likely that the LEPT was developed in several flares of AD Leo, as well as in one flare of EV Lac. We found the LEPT (though of different field strengths) both in the explosive/impulsive phase and at the phase of the maximum, as well as at the gradual phase of the stellar flares. While for solar flares our method allows diagnosing the LEPT only in the most powerful flares, for the flare stars it seems that the method allows revealing the LEPT practically in every flare. It should be important to obtain new and better spectrograms of stellar flares, allowing their analysis by the method outlined in the present paper. This can be the most favorable way to the detailed understanding of the nature of nonthermal astrophysical phenomena.
Rapid Radiochemical Method for Radium-226 in Building ...
Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Radium-226 in building materials Method Selected for: SAM lists this method for qualitative analysis of radium-226 in concrete or brick building materials. A summary of the subject analytical method will be posted to the SAM website to allow access to the method.
Rapid Radiochemical Method for Americium-241 in Building ...
Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Americium-241 in building materials Method Selected for: SAM lists this method for qualitative analysis of americium-241 in concrete or brick building materials. A summary of the subject analytical method will be posted to the SAM website to allow access to the method.
Computational techniques for flows with finite-rate condensation
NASA Technical Reports Server (NTRS)
Candler, Graham V.
1993-01-01
A computational method to simulate the inviscid two-dimensional flow of a two-phase fluid was developed. This computational technique treats the gas phase and each of a prescribed number of particle sizes as separate fluids which are allowed to interact with one another. Thus, each particle-size class is allowed to move through the fluid at its own velocity at each point in the flow field. Mass, momentum, and energy are exchanged between each particle class and the gas phase. It is assumed that the particles do not collide with one another, so that there is no inter-particle exchange of momentum and energy. However, the particles are allowed to grow, and therefore, they may change from one size class to another. Appropriate rates of mass, momentum, and energy exchange between the gas and particle phases and between the different particle classes were developed. A numerical method was developed for use with this equation set. Several test cases were computed and show qualitative agreement with previous calculations.
QGene 4.0, an extensible Java QTL-analysis platform.
Joehanes, Roby; Nelson, James C
2008-12-01
Of many statistical methods developed to date for quantitative trait locus (QTL) analysis, only a limited subset are available in public software allowing their exploration, comparison and practical application by researchers. We have developed QGene 4.0, a plug-in platform that allows execution and comparison of a variety of modern QTL-mapping methods and supports third-party addition of new ones. The software accommodates line-cross mating designs consisting of any arbitrary sequence of selfing, backcrossing, intercrossing and haploid-doubling steps; includes map, population, and trait simulators; and is scriptable. Software and documentation are available at http://coding.plantpath.ksu.edu/qgene. Source code is available on request.
EMAGE mouse embryo spatial gene expression database: 2010 update
Richardson, Lorna; Venkataraman, Shanmugasundaram; Stevenson, Peter; Yang, Yiya; Burton, Nicholas; Rao, Jianguo; Fisher, Malcolm; Baldock, Richard A.; Davidson, Duncan R.; Christiansen, Jeffrey H.
2010-01-01
EMAGE (http://www.emouseatlas.org/emage) is a freely available online database of in situ gene expression patterns in the developing mouse embryo. Gene expression domains from raw images are extracted and integrated spatially into a set of standard 3D virtual mouse embryos at different stages of development, which allows data interrogation by spatial methods. An anatomy ontology is also used to describe sites of expression, which allows data to be queried using text-based methods. Here, we describe recent enhancements to EMAGE including: the release of a completely re-designed website, which offers integration of many different search functions in HTML web pages, improved user feedback and the ability to find similar expression patterns at the click of a button; back-end refactoring from an object oriented to relational architecture, allowing associated SQL access; and the provision of further access by standard formatted URLs and a Java API. We have also increased data coverage by sourcing from a greater selection of journals and developed automated methods for spatial data annotation that are being applied to spatially incorporate the genome-wide (∼19 000 gene) ‘EURExpress’ dataset into EMAGE. PMID:19767607
Abdulmawjood, Amir; Grabowski, Nils; Fohler, Svenja; Kittler, Sophie; Nagengast, Helga; Klein, Guenter
2014-01-01
Animal species identification is one of the primary duties of official food control. Since ostrich meat is difficult to differentiate macroscopically from beef, new analytical methods are needed. To enforce labeling regulations for the authentication of ostrich meat, it is of importance to develop and evaluate a rapid and reliable assay. In the present study, a loop-mediated isothermal amplification (LAMP) assay based on the cytochrome b gene of the mitochondrial DNA of the species Struthio camelus was developed. The LAMP assay was used in combination with a real-time fluorometer. The developed system allowed the detection of 0.01% ostrich meat in meat products. In parallel, a direct swab method without nucleic acid extraction using the HYPLEX LPTV buffer was also evaluated. This rapid processing method allowed detection of ostrich meat without major incubation steps. In summary, the LAMP assay had excellent sensitivity and specificity for detecting ostrich meat and could provide a sampling-to-result identification time of 15 to 20 minutes. PMID:24963709
Sorbe, A; Chazel, M; Gay, E; Haenni, M; Madec, J-Y; Hendrikx, P
2011-06-01
Developing and calculating performance indicators allows the operation of an epidemiological surveillance network to be followed continuously. This is an internal evaluation method, implemented by the coordinators in collaboration with all the actors of the network. Its purpose is to detect weak points in order to optimize management. A method for the development of performance indicators for epidemiological surveillance networks was developed in 2004 and has been applied to several networks. Its implementation requires a thorough description of the network environment and all its activities in order to define priority indicators. Since this method is considered complex, our objective was to develop a simplified approach and to apply it to an epidemiological surveillance network. We applied the initial method to a theoretical network model to obtain a list of generic indicators that can be adapted to any surveillance network. We obtained a list of 25 generic performance indicators, intended to be reformulated and described according to the specificities of each network. It was used to develop performance indicators for RESAPATH, an epidemiological surveillance network for antimicrobial resistance in pathogenic bacteria of animal origin in France. This application allowed us to validate the simplified method, its value in terms of practical implementation, and its level of user acceptance. Its ease of use and speed of application compared to the initial method argue in favor of its use on a broader scale. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
Practical Use of Computationally Frugal Model Analysis Methods
Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...
2015-03-21
Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require tens of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
NASA Astrophysics Data System (ADS)
Restaino, Stephen M.; White, Ian M.
2017-03-01
Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multi-analyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS has a low financial and spatial footprint compared with common fluorescence-based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, because miRNAs commonly exist at relatively low concentrations, amplification methods (e.g., PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.
Snipas, Mindaugas; Pranevicius, Henrikas; Pranevicius, Mindaugas; Pranevicius, Osvaldas; Paulauskas, Nerijus; Bukauskas, Feliksas F
2015-01-01
The primary goal of this work was to study the advantages of numerical methods used for the creation of continuous time Markov chain (CTMC) models of voltage gating of gap junction (GJ) channels composed of connexin protein. This task was accomplished by describing the gating of GJs using the formalism of stochastic automata networks (SANs), which allowed very efficient building and storing of the infinitesimal generator of the CTMC and produced model matrices with a distinct block structure. This in turn allowed us to develop efficient numerical methods for the steady-state solution of CTMC models and to reduce the CPU time necessary to solve them by a factor of ~20.
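For readers unfamiliar with the underlying computation, the sketch below shows the generic steady-state step for an arbitrary 3-state generator; it is illustrative only and does not use the SAN-based block structure or the gap-junction gating model from the paper.

```python
# Illustrative only: a direct steady-state solve (pi @ Q = 0, sum(pi) = 1) for an
# arbitrary 3-state continuous-time Markov chain. It does not use the SAN-based
# block structure or the gap-junction gating model described in the paper.
import numpy as np

Q = np.array([[-0.5,  0.3,  0.2],
              [ 0.1, -0.4,  0.3],
              [ 0.2,  0.2, -0.4]])        # infinitesimal generator, rows sum to zero

n = Q.shape[0]
A = np.vstack([Q.T[:-1], np.ones(n)])     # replace one balance equation by normalization
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)
print(pi, pi @ Q)                         # pi @ Q should be ~0
```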
Software development for teleroentgenogram analysis
NASA Astrophysics Data System (ADS)
Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.
2017-09-01
A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates teleroentgenograms by an original method developed in this department, and it also allows users to design their own methods for calculating teleroentgenograms. It is planned to incorporate machine learning (neural networks) into the software; this will make the calculation of teleroentgenograms easier because methodological points will be placed automatically.
Store Separation Lessons Learned During the Last 30 Years
2010-01-01
... the same time period the Influence Function Method (IFM) was also developed [5]. This method allowed for a straight ... developed at Grumman under an Air Force contract. The Influence Function Method (IFM) [5,6,7] was used to determine the effect of the aircraft flowfield on ...
Study of phase clustering method for analyzing large volumes of meteorological observation data
NASA Astrophysics Data System (ADS)
Volkov, Yu. V.; Krutikov, V. A.; Botygin, I. A.; Sherstnev, V. S.; Sherstneva, A. I.
2017-11-01
The article describes an iterative parallel phase-grouping algorithm for temperature field classification. The algorithm is based on a modified method of structure formation using the analytic signal. The developed method makes it possible to solve climate classification and climatic zoning tasks at any temporal or spatial scale. When applied to surface temperature measurement series, the algorithm identifies climatic structures with correlated changes of the temperature field, supports conclusions about climate uniformity in a given area, and allows climate changes to be tracked over time by analyzing shifts within the type groups. The information on climate type groups specific to selected geographical areas is supplemented by a genetic scheme of class distribution depending on the change in the mutual correlation level of ground temperature monthly averages.
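A minimal sketch of the core idea is given below: instantaneous phase from the analytic signal (Hilbert transform), followed by grouping of phase-synchronized series. The synthetic station series, periods and threshold are assumptions made for this example; the paper's iterative parallel algorithm is considerably more elaborate.

```python
# Minimal sketch of the core idea (the paper's iterative parallel algorithm is far
# more elaborate): instantaneous phase from the analytic signal (Hilbert transform),
# then grouping of phase-synchronized series. Station series, periods and the 0.8
# threshold are assumptions made for this example.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
t = np.arange(240.0)                                  # 20 years of monthly values
periods = (12.0, 12.0, 14.0, 14.0)                    # two synthetic "climate groups"
stations = np.array([np.sin(2 * np.pi * t / p) + 0.2 * rng.standard_normal(t.size)
                     for p in periods])

phase = np.unwrap(np.angle(hilbert(stations, axis=1)), axis=1)

# Phase-synchronization index: ~1 for locked phases, ~0 for drifting phases.
sync = np.abs(np.mean(np.exp(1j * (phase[:, None, :] - phase[None, :, :])), axis=2))

groups = []
for i in range(len(stations)):                        # naive threshold grouping
    for g in groups:
        if sync[i, g[0]] > 0.8:
            g.append(i)
            break
    else:
        groups.append([i])
print(groups)                                         # expected: [[0, 1], [2, 3]]
```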
Demonstrating Enzyme Activation: Calcium/Calmodulin Activation of Phosphodiesterase
ERIC Educational Resources Information Center
Porta, Angela R.
2004-01-01
Demonstrating the steps of a signal transduction cascade usually involves radioactive materials and thus precludes its use in undergraduate teaching labs. Developing labs that allow the visual demonstration of these steps without the use of radioactivity is important for giving students hands-on methods of illustrating each step of a signal…
A cryopreservation method for Pasteurella multocida from wetland samples
Moore, Melody K.; Shadduck, D.J.; Goldberg, Diana R.; Samuel, M.D.
1998-01-01
A cryopreservation method and improved isolation techniques for detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory where they were subsequently thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed for consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed for preservation of the samples until they could be processed in the laboratory.
Christien, F; Telling, M T F; Knight, K S; Le Gall, R
2015-05-01
A method is proposed for the monitoring of metal recrystallization using neutron diffraction that is based on the measurement of stored energy. Experiments were performed using deformed metal specimens heated in situ while mounted at the sample position of the High Resolution Powder Diffractometer, HRPD (ISIS Facility), UK. Monitoring the breadth of the resulting Bragg lines during heating allows not only the time dependence (or temperature dependence) of the stored energy to be determined but also the recrystallized fraction. The analysis method presented here was developed using pure nickel (Ni270) specimens with different deformation levels from 0.29 to 0.94. In situ temperature ramping as well as isothermal annealing was undertaken. The method developed in this work allows accurate and quantitative monitoring of the recrystallization process. The results from neutron diffraction compare satisfactorily with data obtained from calorimetry and hardness measurements.
Rapid Radiochemical Method for Total Radiostrontium (Sr-90) ...
Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Beta counting Method Developed for: Strontium-89 and strontium-90 in building materials Method Selected for: SAM lists this method for qualitative analysis of strontium-89 and strontium-90 in concrete or brick building materials. A summary of the subject analytical method will be posted to the SAM website to allow access to the method.
Agasti, Sarit S; Liong, Monty; Peterson, Vanessa M; Lee, Hakho; Weissleder, Ralph
2012-11-14
DNA barcoding is an attractive technology, as it allows sensitive and multiplexed target analysis. However, DNA barcoding of cellular proteins remains challenging, primarily because barcode amplification and readout techniques are often incompatible with the cellular microenvironment. Here we describe the development and validation of a photocleavable DNA barcode-antibody conjugate method for rapid, quantitative, and multiplexed detection of proteins in single live cells. Following target binding, this method allows DNA barcodes to be photoreleased in solution, enabling easy isolation, amplification, and readout. As a proof of principle, we demonstrate sensitive and multiplexed detection of protein biomarkers in a variety of cancer cells.
Diagnostic emulation: Implementation and user's guide
NASA Technical Reports Server (NTRS)
Becher, Bernice
1987-01-01
The Diagnostic Emulation Technique was developed within the System Validation Methods Branch as a part of the development of methods for the analysis of the reliability of highly reliable, fault tolerant digital avionics systems. This is a general technique which allows for the emulation of a digital hardware system. The technique is general in the sense that it is completely independent of the particular target hardware which is being emulated. Parts of the system are described and emulated at the logic or gate level, while other parts of the system are described and emulated at the functional level. This algorithm allows for the insertion of faults into the system, and for the observation of the response of the system to these faults. This allows for controlled and accelerated testing of system reaction to hardware failures in the target machine. This document describes in detail how the algorithm was implemented at NASA Langley Research Center and gives instructions for using the system.
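The sketch below illustrates the general idea of gate-level emulation with fault insertion on a toy circuit; the one-bit full adder netlist and the stuck-at fault model are assumptions for the example, and the code is not drawn from the NASA Langley implementation.

```python
# A toy illustration of gate-level emulation with fault insertion (a one-bit full
# adder and a stuck-at fault model are assumptions for the example; this is not
# the NASA Langley implementation).
from itertools import product

GATES = {"AND": lambda a, b: a & b, "OR": lambda a, b: a | b, "XOR": lambda a, b: a ^ b}

NETLIST = {              # net name -> (gate, input net, input net), topologically ordered
    "s1":   ("XOR", "a", "b"),
    "sum":  ("XOR", "s1", "cin"),
    "c1":   ("AND", "a", "b"),
    "c2":   ("AND", "s1", "cin"),
    "cout": ("OR",  "c1", "c2"),
}

def emulate(inputs, stuck_at=None):
    """Evaluate the netlist; stuck_at = (net, value) forces one internal net to 0 or 1."""
    nets = dict(inputs)
    for net, (gate, x, y) in NETLIST.items():
        nets[net] = GATES[gate](nets[x], nets[y])
        if stuck_at and net == stuck_at[0]:
            nets[net] = stuck_at[1]
    return nets["sum"], nets["cout"]

for a, b, cin in product((0, 1), repeat=3):
    good = emulate({"a": a, "b": b, "cin": cin})
    bad = emulate({"a": a, "b": b, "cin": cin}, stuck_at=("c1", 0))
    if good != bad:
        print(f"a={a} b={b} cin={cin}: fault-free {good}, c1 stuck-at-0 {bad}")
```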
Espino, Daniel M; Shepherd, Duncan E T; Hukins, David W L
2014-01-01
A transient multi-physics model of the mitral heart valve has been developed, which allows simultaneous calculation of fluid flow and structural deformation. A recently developed contact method has been applied to enable simulation of systole (the stage when blood pressure is elevated within the heart to pump blood to the body). The geometry was simplified to represent the mitral valve within the heart walls in two dimensions, with only the mitral valve undergoing deformation. A moving arbitrary Lagrangian-Eulerian mesh is used to allow true fluid-structure interaction (FSI). The FSI model requires blood flow to induce valve closure by inducing strains in the region of 10-20%. Model predictions were found to be consistent with the existing literature and will undergo further development.
Development of V/STOL methodology based on a higher order panel method
NASA Technical Reports Server (NTRS)
Bhateley, I. C.; Howell, G. A.; Mann, H. W.
1983-01-01
The development of a computational technique to predict the complex flowfields of V/STOL aircraft was initiated, in which a number of modules and a potential flow aerodynamic code were combined in a comprehensive computer program. The modules were developed in a building-block approach to assist the user in preparing the geometric input and to compute parameters needed to simulate certain flow phenomena that cannot be handled directly within a potential flow code. The PAN AIR aerodynamic code, which is a higher-order panel method, forms the nucleus of this program. PAN AIR's extensive support for generalized boundary conditions allows the modules to interact with the aerodynamic code through the input and output files, thereby requiring no changes to the basic code and allowing easy replacement of updated modules.
Analytical chemistry in water quality monitoring during manned space missions
NASA Astrophysics Data System (ADS)
Artemyeva, Anastasia A.
2016-09-01
Water quality monitoring during human spaceflights is essential. However, most of the traditional methods require sample collection with subsequent ground analysis because of limitations in volume, power, safety and gravity. Space missions are becoming longer; hence, methods suitable for in-flight monitoring are in demand. Since 2009, water quality has been monitored in flight with colorimetric methods allowing for the detection of iodine and ionic silver. Organic compounds in water have been monitored with a second-generation total organic carbon analyzer, which has provided information on the amount of carbon in water at both the U.S. and Russian segments of the International Space Station since 2008. The disadvantage of this approach is the lack of compound-specific information. Recently developed methods and tools may allow more detailed information on water quality to be obtained in flight. Namely, microanalyzers based on potentiometric measurements have been designed for online detection of chloride, potassium, nitrate ions and ammonia. The recent application of the current, highly developed air quality monitoring system to water analysis was a logical step because most of the target analytes are the same in air and water. An electro-thermal vaporizer was designed, manufactured and coupled with the air quality control system. This development allows the analytes to be liberated from the aqueous matrix for further compound-specific analysis in the gas phase.
NASA Astrophysics Data System (ADS)
Sekiguchi, Atsushi
2013-03-01
The QCM method allows measurements of impedance, an index of the viscosity of the swelling layer in a photoresist during development. While impedance is sometimes used as a qualitative index of change in the viscosity of the swelling layer, it has to date not been used quantitatively for data analysis. We explored a method for converting impedance values to elastic modulus (Pa), a coefficient expressing viscosity. Applying this method, we compared changes in the viscosity of the swelling layer of an ArF resist generated during development in a TMAH developing solution and in a TBAH developing solution. This paper reports the results of this comparative study.
NASA Astrophysics Data System (ADS)
Granade, Christopher; Combes, Joshua; Cory, D. G.
2016-03-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we address all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby (2012 Phys. Rev. A 85 052120) and by Ferrie (2014 New J. Phys. 16 093035), to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first priors on quantum states and channels that allow for including useful experimental insight. Finally, we develop a method that allows tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
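As a rough classical stand-in for the kind of sequential Bayesian tracking described (not the authors' implementation, and with a single drifting parameter in place of a quantum state or channel), the sketch below uses a particle filter to track a slowly drifting measurement probability; all numbers are illustrative.

```python
# A rough classical stand-in (not the authors' implementation, and a single drifting
# parameter instead of a quantum state): sequential Monte Carlo tracking of the
# probability p(t) that a two-outcome measurement returns 1. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_particles, diffusion = 2000, 0.01
particles = rng.uniform(0, 1, n_particles)            # prior samples of p
weights = np.full(n_particles, 1.0 / n_particles)

p_true = 0.3
for step in range(300):
    p_true = min(p_true + 0.002, 1.0)                  # slow deterministic drift
    outcome = rng.random() < p_true                    # one measurement shot
    weights *= particles if outcome else 1.0 - particles   # Bayes update by likelihood
    weights /= weights.sum()
    if 1.0 / np.sum(weights ** 2) < n_particles / 2:   # resample + diffuse on ESS collapse
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = np.clip(particles[idx] + diffusion * rng.standard_normal(n_particles), 0, 1)
        weights.fill(1.0 / n_particles)

print(f"true p = {p_true:.3f}, posterior mean = {np.sum(weights * particles):.3f}")
```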
Inverse transonic airfoil design methods including boundary layer and viscous interaction effects
NASA Technical Reports Server (NTRS)
Carlson, L. A.
1979-01-01
The development and incorporation into TRANDES of a fully conservative analysis method utilizing the artificial compressibility approach is described. The method allows for lifting cases and finite thickness airfoils and utilizes a stretched coordinate system. Wave drag and massive separation studies are also discussed.
Schwaninger, Andrea E.; Meyer, Markus R.; Huestis, Marilyn A.; Maurer, Hans H.
2013-01-01
3,4-Methylenedioxymethamphetamine (MDMA) is a racemic drug of abuse, and its R- and S-enantiomers are known to differ in their dose-response curves. The S-enantiomer was shown to be eliminated at a higher rate than the R-enantiomer, most likely explained by the stereoselective metabolism observed in various in vitro experiments. The aim of this work was the development and validation of methods for evaluating the stereoselective elimination of phase I and particularly phase II metabolites of MDMA in human urine. Urine samples were analyzed by three different methods. Method A allowed stereoselective determination of the 4-hydroxy-3-methoxymethamphetamine (HMMA) glucuronides and only achiral determination of the intact sulfate conjugates of HMMA and 3,4-dihydroxymethamphetamine (DHMA) after C18 solid-phase extraction by liquid chromatography–high-resolution mass spectrometry with electrospray ionization. Method B allowed the determination of the enantiomer ratios of DHMA and HMMA sulfate conjugates after selective enzymatic cleavage and chiral analysis of the corresponding deconjugated metabolites after chiral derivatization with S-heptafluorobutyrylprolyl chloride using gas chromatography–mass spectrometry with negative-ion chemical ionization. Method C allowed the chiral determination of MDMA and its unconjugated metabolites using method B without sulfate cleavage. The validation process, including specificity, recovery, matrix effects, process efficiency, accuracy and precision, stabilities, and limits of quantification and detection, showed that all methods were selective, sensitive, accurate and precise for all tested analytes. PMID:21656610
Yi, H T; Chen, Y; Czelen, K; Podzorov, V
2011-12-22
A novel vacuum lamination approach to the fabrication of high-performance single-crystal organic field-effect transistors has been developed. The non-destructive nature of this method allows a direct comparison of field-effect mobilities achieved with various gate dielectrics using the same single-crystal sample. The method also allows gating of delicate systems, such as n-type crystals and SAM-coated surfaces, without perturbation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chanbi, Daoud; Ogam, Erick; Amara, Sif Eddine; Fellah, Z E A
2018-05-07
Precise but simple experimental and inverse methods allowing the recovery of mechanical material parameters are necessary for the exploration of materials with novel crystallographic structures and elastic properties, particularly for new materials and those existing only in theory. The alloys studied herein are of new atomic compositions. This paper reports an experimental study involving the synthesis and development of methods for the determination of the elastic properties of binary (Fe-Al, Fe-Ti and Ti-Al) and ternary (Fe-Ti-Al) intermetallic alloys with different concentrations of their individual constituents. The alloys studied were synthesized from high purity metals using an arc furnace with argon flow to ensure their uniformity and homogeneity. Precise but simple methods for the recovery of the elastic constants of the isotropic metals from resonant ultrasound vibration data were developed. These methods allowed the fine analysis of the relationships between the atomic concentration of a given constituent and the Young’s modulus or alloy density.
Space Radiation Transport Methods Development
NASA Technical Reports Server (NTRS)
Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.
2002-01-01
Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from preliminary design concepts through to the final design. In particular, we discuss progress toward a fully three-dimensional and computationally efficient deterministic code, for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution of the Boltzmann equation that allows field mapping within the International Space Station (ISS) in tens of minutes using standard Finite Element Method (FEM) geometry common to engineering design practice, enabling the development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 milliseconds, which severely limits the application of Monte Carlo methods to such engineering models. A potential means of improving Monte Carlo efficiency in coupling to spacecraft geometry, based on reconfigurable computing, is given and could be utilized in the final design as verification of the design optimized with the deterministic method.
Chanbi, Daoud; Amara, Sif Eddine; Fellah, Z. E. A.
2018-01-01
Precise but simple experimental and inverse methods allowing the recovery of mechanical material parameters are necessary for the exploration of materials with novel crystallographic structures and elastic properties, particularly for new materials and those existing only in theory. The alloys studied herein are of new atomic compositions. This paper reports an experimental study involving the synthesis and development of methods for the determination of the elastic properties of binary (Fe-Al, Fe-Ti and Ti-Al) and ternary (Fe-Ti-Al) intermetallic alloys with different concentrations of their individual constituents. The alloys studied were synthesized from high purity metals using an arc furnace with argon flow to ensure their uniformity and homogeneity. Precise but simple methods for the recovery of the elastic constants of the isotropic metals from resonant ultrasound vibration data were developed. These methods allowed the fine analysis of the relationships between the atomic concentration of a given constituent and the Young’s modulus or alloy density. PMID:29735946
SU-F-T-504: Non-Divergent Planning Method for Craniospinal Irradiation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sperling, N; Bogue, J; Parsai, E
2016-06-15
Purpose: Traditional Craniospinal Irradiation (CSI) planning techniques require careful field placement to allow optimal divergence and field overlap at depth, and measurement of skin gap. The result is a necessary field overlap producing dose heterogeneity in the spinal canal. A novel, non-divergent field matching method has been developed to allow simple treatment planning and delivery without the need to measure skin gap. Methods: The CSI patient was simulated in the prone position, and a plan was developed. Bilateral cranial fields were designed with couch and collimator rotation to eliminate divergence with the upper spine field and minimize anterior divergence into the lenses. Spinal posterior-to-anterior fields were designed with the couch rotated to 90 degrees to allow gantry rotation to eliminate divergence at the match line, and the collimator rotated to 90 degrees to allow appropriate field blocking with the MLCs. A match line for the two spinal fields was placed and the gantry rotated to equal angles in opposite directions about the match line. Jaw positions were then defined to allow 1 mm overlap at the match line to avoid cold spots. A traditional CSI plan was generated using diverging spinal fields, and a comparison between the two techniques was generated. Results: The non-divergent treatment plan was able to deliver a highly uniform dose to the spinal cord, with a cold spot of only 95% and a maximum point dose of 115.8%, as compared to traditional plan cold spots of 87% and hot spots of 132% of the prescription dose. Conclusion: A non-divergent method for planning CSI patients has been developed and clinically implemented. Planning requires some geometric manipulation in order to achieve an adequate dose distribution; however, it can help to manage cold spots and simplify the shifts needed between spinal fields.
Pranevicius, Henrikas; Pranevicius, Mindaugas; Pranevicius, Osvaldas; Bukauskas, Feliksas F.
2015-01-01
The primary goal of this work was to study the advantages of numerical methods used for the creation of continuous time Markov chain (CTMC) models of voltage gating of gap junction (GJ) channels composed of connexin protein. This task was accomplished by describing the gating of GJs using the formalism of stochastic automata networks (SANs), which allowed very efficient building and storing of the infinitesimal generator of the CTMC and produced model matrices with a distinct block structure. This in turn allowed us to develop efficient numerical methods for the steady-state solution of CTMC models and to reduce the CPU time necessary to solve them by a factor of ∼20. PMID:25705700
A community resource benchmarking predictions of peptide binding to MHC-I molecules.
Peters, Bjoern; Bui, Huynh-Hoa; Frankild, Sune; Nielson, Morten; Lundegaard, Claus; Kostem, Emrah; Basch, Derek; Lamberth, Kasper; Harndahl, Mikkel; Fleri, Ward; Wilson, Stephen S; Sidney, John; Lund, Ole; Buus, Soren; Sette, Alessandro
2006-06-09
Recognition of peptides bound to major histocompatibility complex (MHC) class I molecules by T lymphocytes is an essential part of immune surveillance. Each MHC allele has a characteristic peptide binding preference, which can be captured in prediction algorithms, allowing for the rapid scan of entire pathogen proteomes for peptides likely to bind MHC. Here we make public a large set of 48,828 quantitative peptide-binding affinity measurements relating to 48 different mouse, human, macaque, and chimpanzee MHC class I alleles. We use these data to establish a set of benchmark predictions with one neural network method and two matrix-based prediction methods extensively utilized in our groups. In general, the neural network outperforms the matrix-based predictions, mainly due to its ability to generalize even on a small amount of data. We also retrieved predictions from tools publicly available on the internet. While differences in the data used to generate these predictions hamper direct comparisons, we do conclude that tools based on combinatorial peptide libraries perform remarkably well. The transparent prediction evaluation on this dataset provides tool developers with a benchmark for comparison of newly developed prediction methods. In addition, to generate and evaluate our own prediction methods, we have established an easily extensible web-based prediction framework that allows automated side-by-side comparisons of prediction methods implemented by experts. This is an advance over the current practice of tool developers having to generate reference predictions themselves, which can lead to underestimating the performance of prediction methods they are not as familiar with as their own. The overall goal of this effort is to provide a transparent prediction evaluation allowing bioinformaticians to identify promising features of prediction methods and providing guidance to immunologists regarding the reliability of prediction tools.
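To make the matrix-based style of predictor concrete, here is a minimal sketch that builds a position-specific log-odds matrix from a toy set of 9-mer "binders" and scores new peptides; the peptides, pseudocount and uniform background are invented for the example and are unrelated to the benchmarked tools.

```python
# A minimal sketch of a matrix-based predictor of the general kind benchmarked
# (toy peptides, pseudocount and uniform background are invented; this is not one
# of the published tools or matrices).
import math
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
binders = ["SIINFEKLA", "SMINFEKLV", "SIINYEKLA", "SLINFEKLV"]   # toy 9-mer "binders"

pseudo = 0.1
matrix = []
for pos in range(len(binders[0])):
    counts = Counter(p[pos] for p in binders)
    total = len(binders) + pseudo * len(AMINO_ACIDS)
    # position-specific log-odds against a uniform 1/20 background
    matrix.append({aa: math.log(((counts[aa] + pseudo) / total) / (1 / 20))
                   for aa in AMINO_ACIDS})

def score(peptide):
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

for pep in ("SIINFEKLA", "GILGFVFTL"):
    print(pep, round(score(pep), 2))       # the toy "binder" scores higher
```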
1988-01-01
... ignored but the Volkersen model is extended to include adherend deformations will be discussed. STATISTICAL METHODOLOGY FOR DESIGN ALLOWABLES [15-17] ... structure. In the certification methodology, the development test program and the calculation of composite design allowables is orchestrated to support ... Development of design methodology of thick composites and their test methods. (b) Role of interface in emerging composite systems.
Salivary biomarker development using genomic, proteomic and metabolomic approaches
2012-01-01
The use of saliva as a diagnostic sample provides a non-invasive, cost-efficient method of sample collection for disease screening without the need for highly trained professionals. Saliva collection is far more practical and safe compared with invasive methods of sample collection, because of the infection risk from contaminated needles during, for example, blood sampling. Furthermore, the use of saliva could increase the availability of accurate diagnostics for remote and impoverished regions. However, the development of salivary diagnostics has required technical innovation to allow stabilization and detection of analytes in the complex molecular mixture that is saliva. The recent development of cost-effective room temperature analyte stabilization methods, nucleic acid pre-amplification techniques and direct saliva transcriptomic analysis have allowed accurate detection and quantification of transcripts found in saliva. Novel protein stabilization methods have also facilitated improved proteomic analyses. Although candidate biomarkers have been discovered using epigenetic, transcriptomic, proteomic and metabolomic approaches, transcriptomic analyses have so far achieved the most progress in terms of sensitivity and specificity, and progress towards clinical implementation. Here, we review recent developments in salivary diagnostics that have been accomplished using genomic, transcriptomic, proteomic and metabolomic approaches. PMID:23114182
Ahmed, Sameh; Alqurshi, Abdulmalik; Mohamed, Abdel-Maaboud Ismail
2018-07-01
A new, robust and reliable high-performance liquid chromatography (HPLC) method with a multi-criteria decision-making (MCDM) approach was developed to allow simultaneous quantification of atenolol (ATN) and nifedipine (NFD) in content uniformity testing. Felodipine (FLD) was used as an internal standard (I.S.) in this study. A novel marriage between a new interactive response optimizer and an HPLC method was proposed for multiple response optimization of the target responses. The interactive response optimizer was used as a decision and prediction tool for the optimal settings of the target responses, according to specified criteria, based on Derringer's desirability. Four independent variables were considered in this study: acetonitrile percentage, buffer pH, buffer concentration and column temperature. Eight responses were optimized: the retention times of ATN, NFD, and FLD; the resolutions between ATN/NFD and NFD/FLD; and the plate numbers for ATN, NFD, and FLD. Multiple regression analysis was applied in order to assess the influences of the most significant variables in the regression models. The experimental design was set to give minimum retention times and maximum resolution and plate numbers. The interactive response optimizer allowed prediction of optimum conditions according to these criteria, with a good composite desirability value of 0.98156. The developed method was validated according to the International Conference on Harmonization (ICH) guidelines with the aid of the experimental design. The developed MCDM-HPLC method showed superior robustness and resolution in a short analysis time, allowing successful simultaneous content uniformity testing of ATN and NFD in marketed capsules. The current work presents an interactive response optimizer as an efficient platform to optimize, predict responses, and validate HPLC methodology with a tolerable design space for assays in quality control laboratories. Copyright © 2018 Elsevier B.V. All rights reserved.
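Derringer's desirability, on which the composite value reported above is based, combines individually scaled responses by a geometric mean; the sketch below shows the computation with hypothetical retention-time, resolution and plate-number targets, not the published chromatographic data.

```python
# Derringer's desirability with hypothetical targets (not the published data): each
# response is rescaled to [0, 1] and the composite value is their geometric mean.
import numpy as np

def smaller_is_better(y, best, worst):        # e.g. retention time
    return float(np.clip((worst - y) / (worst - best), 0, 1))

def larger_is_better(y, worst, best):         # e.g. resolution, plate number
    return float(np.clip((y - worst) / (best - worst), 0, 1))

# Hypothetical responses predicted for one candidate set of HPLC conditions.
d = [
    smaller_is_better(4.2, best=2.0, worst=10.0),    # retention time (min)
    larger_is_better(2.4, worst=1.5, best=3.0),      # resolution
    larger_is_better(7800, worst=2000, best=10000),  # plate number
]
composite = np.prod(d) ** (1 / len(d))               # composite desirability
print(round(float(composite), 3))
```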
Efficient hemodynamic event detection utilizing relational databases and wavelet analysis
NASA Technical Reports Server (NTRS)
Saeed, M.; Mark, R. G.
2001-01-01
Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
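A minimal sketch of the storage-and-query idea follows: wavelet detail coefficients of a synthetic trend are computed, stored in a relational table, and queried for large magnitudes as crude event candidates. sqlite3 stands in for the MySQL database described, and the signal, wavelet choice and threshold are assumptions.

```python
# A minimal sketch of the storage-and-query idea: wavelet detail coefficients of a
# synthetic trend are stored in a relational table and large coefficients are queried
# as crude event candidates. sqlite3 stands in for the MySQL database described;
# the signal, wavelet choice and threshold are assumptions.
import sqlite3
import numpy as np
import pywt

rng = np.random.default_rng(0)
trend = 80.0 + rng.normal(0, 1, 512)
trend[300:] -= 15.0                                    # abrupt, step-like change

coeffs = pywt.wavedec(trend, "db4", level=4)           # [approx, detail_4, ..., detail_1]
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE wavelet (level INTEGER, idx INTEGER, value REAL)")
for level, arr in enumerate(coeffs[1:], start=1):      # store detail coefficients only
    db.executemany("INSERT INTO wavelet VALUES (?, ?, ?)",
                   [(level, i, float(v)) for i, v in enumerate(arr)])

rows = db.execute("SELECT level, idx, value FROM wavelet "
                  "WHERE ABS(value) > 10 ORDER BY ABS(value) DESC LIMIT 5").fetchall()
print(rows)   # the largest detail coefficients flag the abrupt change
```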
de Oliveira, Gilberto Santos; Kawahara, Rebeca; Rosa-Fernandes, Livia; Avila, Carla Cristi; Teixeira, Marta M. G.; Larsen, Martin R.
2018-01-01
Background: Chagas disease, also known as American trypanosomiasis, is caused by the protozoan Trypanosoma cruzi. Over the last 30 years, Chagas disease has expanded from a neglected parasitic infection of the rural population to an urbanized chronic disease, becoming a potentially emergent global health problem. T. cruzi strains were assigned to seven genetic groups (TcI-TcVI and TcBat), named discrete typing units (DTUs), which represent sets of isolates that differ in virulence, pathogenicity and immunological features. Indeed, attempts have been made to relate the diverse clinical manifestations (from asymptomatic to highly severe disease) to T. cruzi genetic variability. Due to that, several DTU typing methods have been introduced. Each method has its own advantages and drawbacks, such as high complexity and analysis time, and all of them are based on genetic signatures. Recently, a novel method discriminated bacterial strains using a peptide identification-free, genome sequence-independent shotgun proteomics workflow. Here, we aimed to develop a Trypanosoma cruzi Strain Typing Assay using MS/MS peptide spectral libraries, named Tc-STAMS2. Methods/Principal Findings: The Tc-STAMS2 method uses shotgun proteomics combined with spectral library search to assign and discriminate T. cruzi strains independently of genome knowledge. The method is based on the construction of a library of MS/MS peptide spectra built using genotyped T. cruzi reference strains. For identification, the MS/MS peptide spectra of unknown T. cruzi cells are identified using the spectral matching algorithm SpectraST. The Tc-STAMS2 method allowed correct identification of all DTUs with high confidence. The method was robust towards different sample preparations, lengths of chromatographic gradient and fragmentation techniques. Moreover, a pilot inter-laboratory study showed the applicability to different MS platforms. Conclusions and Significance: This is the first study to develop an MS-based platform for T. cruzi strain typing. Indeed, the Tc-STAMS2 method allows T. cruzi strain typing using MS/MS spectra as discriminatory features and allows the differentiation of TcI-TcVI DTUs. Similar to genomic-based strategies, the Tc-STAMS2 method allows identification of strains within DTUs. Its robustness towards different experimental and biological variables makes it a valuable complementary strategy to the current T. cruzi genotyping assays. Moreover, this method can be used to identify DTU-specific features correlated with the strain phenotype. PMID:29608573
Kovács, István A.; Palotai, Robin; Szalay, Máté S.; Csermely, Peter
2010-01-01
Background: Network communities help the functional organization and evolution of complex networks. However, the development of a method that is both fast and accurate and that provides modular overlaps and partitions of a heterogeneous network has proven to be rather difficult. Methodology/Principal Findings: Here we introduce the novel concept of ModuLand, an integrative method family determining overlapping network modules as hills of an influence-function-based, centrality-type community landscape, and including several widely used modularization methods as special cases. As various adaptations of the method family, we developed several algorithms, which provide an efficient analysis of weighted and directed networks, and (1) determine pervasively overlapping modules with high resolution; (2) uncover a detailed hierarchical network structure allowing an efficient, zoom-in analysis of large networks; (3) allow the determination of key network nodes and (4) help to predict network dynamics. Conclusions/Significance: The concept opens a wide range of possibilities to develop new approaches and applications including network routing, classification, comparison and prediction. PMID:20824084
Hatch, Christine E; Fisher, Andrew T.; Revenaugh, Justin S.; Constantz, Jim; Ruehl, Chris
2006-01-01
We present a method for determining streambed seepage rates using time series thermal data. The new method is based on quantifying changes in phase and amplitude of temperature variations between pairs of subsurface sensors. For a reasonable range of streambed thermal properties and sensor spacings, the time series method should allow reliable estimation of seepage rates over a range of at least ±10 m d−1 (±1.2 × 10−4 m s−1), with amplitude variations being most sensitive at low flow rates and phase variations retaining sensitivity out to much higher rates. Compared to forward modeling, the new method requires less observational data and less setup and data handling, and it is faster, particularly when interpreting many long data sets. The time series method is insensitive to streambed scour and sedimentation, which allows for application under a wide range of flow conditions and allows time series estimation of variable streambed hydraulic conductivity. This new approach should facilitate wider use of thermal methods and improve understanding of the complex spatial and temporal dynamics of surface water–groundwater interactions.
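The two observables the method relies on, the amplitude ratio and phase lag of the diurnal signal between paired sensors, can be extracted as in the sketch below; the sensor records are synthetic, and converting these quantities into seepage rates additionally requires the analytical heat-transport solutions and streambed thermal properties discussed in the paper.

```python
# Synthetic illustration of the two observables the method uses: the amplitude ratio
# and phase lag of the diurnal component between a shallow and a deep sensor. The
# records below are invented; turning these numbers into seepage rates additionally
# needs the analytical solutions and streambed thermal properties from the paper.
import numpy as np

dt_hours = 0.25
t = np.arange(0, 14 * 24, dt_hours)                    # two weeks at 15-minute sampling
omega = 2 * np.pi / 24                                 # diurnal angular frequency (rad/h)
shallow = 15 + 4.0 * np.sin(omega * t)
deep    = 15 + 1.6 * np.sin(omega * (t - 5.5))         # damped and delayed at depth

def diurnal_component(series):
    spec = np.fft.rfft(series - series.mean())
    freqs = np.fft.rfftfreq(series.size, d=dt_hours)   # cycles per hour
    return spec[np.argmin(np.abs(freqs - 1 / 24))]     # bin nearest one cycle per day

c_shallow, c_deep = diurnal_component(shallow), diurnal_component(deep)
amplitude_ratio = np.abs(c_deep) / np.abs(c_shallow)
phase_lag_hours = np.angle(c_shallow / c_deep) / omega
print(round(float(amplitude_ratio), 3), round(float(phase_lag_hours), 2))   # ~0.4 and ~5.5 h
```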
Hydrocarbon Rocket Technology Impact Forecasting
NASA Technical Reports Server (NTRS)
Stuber, Eric; Prasadh, Nishant; Edwards, Stephen; Mavris, Dimitri N.
2012-01-01
Ever since the Apollo program ended, the development of launch propulsion systems in the US has fallen drastically, with only two new booster engine developments, the SSME and the RS-68, occurring in the past few decades [1]. In recent years, however, there has been an increased interest in pursuing more effective launch propulsion technologies in the U.S., exemplified by the NASA Office of the Chief Technologist's inclusion of Launch Propulsion Systems as the first technological area in the Space Technology Roadmaps [2]. One area of particular interest to both government agencies and commercial entities has been the development of hydrocarbon engines; NASA and the Air Force Research Lab [3] have expressed interest in the use of hydrocarbon fuels for their respective SLS Booster and Reusable Booster System concepts, and two major commercially developed launch vehicles, SpaceX's Falcon 9 and Orbital Sciences' Antares, feature engines that use RP-1 kerosene fuel. Compared to engines powered by liquid hydrogen, hydrocarbon-fueled engines have a greater propellant density (usually resulting in a lighter overall engine), produce greater propulsive force, allow easier fuel handling and loading, and for reusable vehicle concepts can provide a shorter turnaround time between launches. These benefits suggest that a hydrocarbon-fueled launch vehicle would allow for a cheap and frequent means of access to space [1]. However, the time and money required for the development of a new engine still present a major challenge. Long and costly design, development, testing and evaluation (DDT&E) programs underscore the importance of identifying critical technologies and prioritizing investment efforts. Trade studies must be performed on engine concepts examining the affordability, operability, and reliability of each concept, and quantifying the impacts of proposed technologies. These studies can be performed through use of the Technology Impact Forecasting (TIF) method. The Technology Impact Forecasting method is a normative forecasting technique that allows the designer to quantify the effects of adding new technologies on a given design. This method can be used to assess and identify the technological improvements needed to close the gap that exists between the current design and one that satisfies all constraints imposed on the design. The TIF methodology allows more design knowledge to be brought to the earlier phases of the design process, making use of tools such as Quality Function Deployments, Morphological Matrices, Response Surface Methodology, and Monte Carlo Simulations [2]. This increased knowledge allows more informed decisions to be made earlier in the design process, resulting in shortened design cycle time. This paper investigates applying the TIF method, which has been widely used in aircraft applications, to the conceptual design of a hydrocarbon rocket engine. In order to reinstate a manned presence in space, the U.S. must develop an affordable and sustainable launch capability. Hydrocarbon-fueled rockets have drawn interest from numerous major government and commercial entities because they offer a low-cost heavy-lift option that would allow for frequent launches [1]. However, the development of effective new hydrocarbon rockets would likely require new technologies in order to overcome certain design constraints.
The use of advanced design methods, such as the TIF method, enables the designer to identify key areas in need of improvement, allowing one to dial in a proposed technology and assess its impact on the system. Through analyses such as this one, a conceptual design for a hydrocarbon-fueled vehicle that meets all imposed requirements can be achieved.
NASA Astrophysics Data System (ADS)
Alshakova, E. L.
2017-01-01
A program in the AutoLISP language allows parametric drawings to be generated automatically while working in the AutoCAD software product. Students study the development of AutoLISP programs using a methodical complex containing methodical instructions in which real examples of the creation of images and drawings are implemented. The methodical instructions contain the reference information necessary to perform the offered tasks. Step-by-step development of the program is the basis for training in AutoLISP programming: the program draws the elements of a part drawing by means of specially created functions whose argument values are entered in the same sequence in which AutoCAD issues prompts when the corresponding command is performed in the editor. The process of program design is thus reduced to the step-by-step formation of functions and of the sequence of their calls. The author considers the development of AutoLISP programs for the creation of parametric drawings of parts of a defined design, in which the user enters the dimensions of the part elements. These programs generate variants of the tasks for the graphic works performed in the educational process of the "Engineering Graphics" and "Engineering and Computer Graphics" disciplines. Individual tasks allow students to develop skills of independent work in reading and creating drawings, as well as in 3D modeling.
BetaTPred: prediction of beta-TURNS in a protein using statistical algorithms.
Kaur, Harpreet; Raghava, G P S
2002-03-01
Beta-turns play an important role from a structural and functional point of view. They are the most common type of non-repetitive structure in proteins and comprise, on average, 25% of the residues. In the past, numerous methods have been developed to predict beta-turns in a protein, most of them based on statistical approaches. In order to utilize the full potential of these methods, there is a need to develop a web server. This paper describes a web server called BetaTPred, developed for predicting beta-TURNS in a protein from its amino acid sequence. BetaTPred allows the user to predict turns in a protein using existing statistical algorithms. It also allows prediction of different types of beta-TURNS, e.g. type I, I', II, II', VI, VIII and non-specific. This server assists users in predicting the consensus beta-TURNS in a protein. The server is accessible from http://imtech.res.in/raghava/betatpred/
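A toy example of the statistical, propensity-based style of prediction the server builds on is sketched below; the propensity values and cutoff are illustrative placeholders, not BetaTPred's published parameters.

```python
# A toy propensity-based turn predictor in the statistical spirit described (the
# propensity values and cutoff are illustrative placeholders, not BetaTPred's
# published parameters): a 4-residue window is flagged when the product of its
# residue propensities exceeds the cutoff.
TURN_PROPENSITY = {aa: 0.7 for aa in "ACDEFGHIKLMNPQRSTVWY"}
TURN_PROPENSITY.update({"G": 1.6, "P": 1.5, "N": 1.5, "D": 1.4, "S": 1.4})  # turn formers

def predict_turns(sequence, cutoff=1.0):
    """Return (start, window, score) for 4-residue windows predicted as beta-turns."""
    hits = []
    for i in range(len(sequence) - 3):
        window = sequence[i:i + 4]
        score = 1.0
        for aa in window:
            score *= TURN_PROPENSITY.get(aa, 0.7)
        if score >= cutoff:
            hits.append((i, window, round(score, 2)))
    return hits

print(predict_turns("MKVLAGNPDGSTWQERLI"))
```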
Coelho Graça, Didia; Hartmer, Ralf; Jabs, Wolfgang; Beris, Photis; Clerici, Lorella; Stoermer, Carsten; Samii, Kaveh; Hochstrasser, Denis; Tsybin, Yury O; Scherl, Alexander; Lescuyer, Pierre
2015-04-01
Hemoglobin disorder diagnosis is a complex procedure combining several analytical steps. Due to the lack of specificity of the currently used protein analysis methods, the identification of uncommon hemoglobin variants (proteoforms) can become a hard task to accomplish. The aim of this work was to develop a mass spectrometry-based approach to quickly identify mutated protein sequences within globin chain variants. To reach this goal, a top-down electron transfer dissociation mass spectrometry method was developed for hemoglobin β chain analysis. A diagnostic product ion list was established with a color-code strategy allowing a mutation to be quickly and specifically localized in the hemoglobin β chain sequence. The method was applied to the analysis of rare hemoglobin β chain variants and an (A)γ-β fusion protein. The results showed that the developed data analysis process allows fast and reliable interpretation of top-down electron transfer dissociation mass spectra by non-expert users in the clinical area.
A synchronization method for wireless acquisition systems, application to brain computer interfaces.
Foerster, M; Bonnet, S; van Langhenhove, A; Porcherot, J; Charvet, G
2013-01-01
A synchronization method for wireless acquisition systems has been developed and implemented on a wireless ECoG recording implant and on a wireless EEG recording helmet. The presented algorithm and hardware implementation allow the precise synchronization of several data streams from several sensor nodes for applications where timing is critical, such as event-related potential (ERP) studies. The proposed method has been successfully applied to obtain visual evoked potentials and compared with a reference biosignal amplifier. Control over the exact sampling frequency reduces synchronization errors that would otherwise accumulate during a recording. The method is scalable to several sensor nodes communicating with a shared base station.
A Rapid Method for Engineering Recombinant Polioviruses or Other Enteroviruses.
Bessaud, Maël; Pelletier, Isabelle; Blondel, Bruno; Delpeyroux, Francis
2016-01-01
The cloning of large enterovirus RNA sequences is labor-intensive because of the frequent instability in bacteria of plasmid vectors containing the corresponding cDNAs. In order to circumvent this issue, we have developed a PCR-based method that allows the generation of highly modified or chimeric full-length enterovirus genomes. This method relies on fusion PCR, which enables the concatenation of several overlapping cDNA amplicons produced separately. A T7 promoter sequence added upstream of the fusion PCR products allows their transcription into infectious genomic RNAs directly in transfected cells constitutively expressing the phage T7 RNA polymerase. This method permits the rapid recovery of modified viruses that can be subsequently amplified on adequate cell lines.
A stochastic Markov chain approach for tennis: Monte Carlo simulation and modeling
NASA Astrophysics Data System (ADS)
Aslam, Kamran
This dissertation describes the computational formulation of probability density functions (pdfs) that facilitate head-to-head match simulations in tennis, along with ranking systems developed from their use. A background on the statistical method used to develop the pdfs, the Monte Carlo method, and the resulting rankings is included, along with a discussion of ranking methods currently being used both in professional sports and in other applications. Using an analytical theory developed by Newton and Keller in [34] that defines a tennis player's probability of winning a game, set, match and single-elimination tournament, a computational simulation has been developed in Matlab that allows further modeling not previously possible with the analytical theory alone. Such experimentation consists of the exploration of non-iid effects, considers the concept of the varying importance of points in a match, and allows an unlimited number of matches to be simulated between unlikely opponents. The results of these studies have provided pdfs that accurately model an individual tennis player's ability, along with a realistic, fair and mathematically sound platform for ranking them.
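The basic building block of such simulations, estimating a server's probability of winning a game from an i.i.d. point-winning probability, can be sketched as follows (an illustration in Python, not the dissertation's Matlab code); the Monte Carlo estimate is checked against the standard closed-form result.

```python
# Illustration in Python (not the dissertation's Matlab code): Monte Carlo estimate
# of the probability that the server wins a game, given an i.i.d. point-winning
# probability p, checked against the standard closed-form result.
import random

def simulate_game(p, rng):
    server, returner = 0, 0
    while True:                                  # first to 4 points, win by 2
        if rng.random() < p:
            server += 1
        else:
            returner += 1
        if server >= 4 and server - returner >= 2:
            return True
        if returner >= 4 and returner - server >= 2:
            return False

def game_win_probability(p, trials=200_000, seed=0):
    rng = random.Random(seed)
    return sum(simulate_game(p, rng) for _ in range(trials)) / trials

p, q = 0.62, 0.38
analytic = p**4 * (1 + 4*q + 10*q*q) + 20 * p**3 * q**3 * p**2 / (1 - 2*p*q)
print(round(game_win_probability(p), 4), round(analytic, 4))   # agree to within MC error
```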
Functional Techniques for Data Analysis
NASA Technical Reports Server (NTRS)
Tomlinson, John R.
1997-01-01
This dissertation develops a new general method of solving Prony's problem. Two special cases of this new method have been developed previously: the Matrix Pencil and Osculatory Interpolation. The dissertation shows that they are instances of a more general solution type that allows a wide-ranging class of linear functionals to be used in the solution of the problem. This class provides a continuum of functionals, yielding new methods for solving Prony's problem.
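For orientation, the sketch below implements the classical two-step (linear prediction plus root finding) formulation of Prony's problem in Python, fitting a sum of complex exponentials to uniformly sampled data; it illustrates the underlying problem only and is not the generalized functional method developed in the dissertation.

```python
import numpy as np

def prony(signal, n_terms):
    """Classical Prony fit: approximate uniformly sampled data by a sum of n_terms complex exponentials.

    Returns (amplitudes, poles) such that signal[k] ~ sum_j amplitudes[j] * poles[j]**k.
    """
    x = np.asarray(signal, dtype=complex)
    N, p = len(x), n_terms
    # Step 1: linear prediction -- solve for the characteristic polynomial coefficients.
    A = np.column_stack([x[p - 1 - j:N - 1 - j] for j in range(p)])
    b = -x[p:N]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    # Step 2: roots of the characteristic polynomial give the exponential poles.
    poles = np.roots(np.concatenate(([1.0], coeffs)))
    # Step 3: linear least squares for the amplitudes.
    V = np.vander(poles, N, increasing=True).T   # V[k, j] = poles[j]**k
    amps, *_ = np.linalg.lstsq(V, x, rcond=None)
    return amps, poles

if __name__ == "__main__":
    k = np.arange(40)
    data = 2.0 * 0.9**k + 0.5 * (-0.7)**k        # known two-term exponential signal
    amps, poles = prony(data, 2)
    print(np.round(poles, 3), np.round(amps.real, 3))   # expect poles ~ {0.9, -0.7}, amplitudes ~ {2, 0.5}
```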
Land management planning: a method of evaluating alternatives
Andres Weintraub; Richard Adams; Linda Yellin
1982-01-01
A method is described for developing and evaluating alternatives in land management planning. A structured set of 15 steps provides a framework for such an evaluation when multiple objectives and uncertainty must be considered in the planning process. The method is consistent with other processes used in organizational evaluation, and allows for the interaction of...
Ikebe, Jinzen; Umezawa, Koji; Higo, Junichi
2016-03-01
Molecular dynamics (MD) simulations using all-atom and explicit solvent models provide valuable information on the detailed behavior of protein-partner substrate binding at the atomic level. As the power of computational resources increases, MD simulations are being used more widely and easily. However, it is still difficult to investigate the thermodynamic properties of protein-partner substrate binding and protein folding with conventional MD simulations. Enhanced sampling methods have been developed to sample conformations that reflect equilibrium conditions more efficiently than conventional MD simulations, thereby allowing the construction of accurate free-energy landscapes. In this review, we discuss these enhanced sampling methods using a series of case-by-case examples. In particular, we review enhanced sampling methods based on trivial trajectory parallelization, virtual-system coupled multicanonical MD, and adaptive lambda square dynamics. These methods have been developed recently from the existing multicanonical MD simulation method. Their applications are reviewed with an emphasis on practical implementation. In our concluding remarks we explore extensions of the enhanced sampling methods that may allow for even more efficient sampling.
ERIC Educational Resources Information Center
Herba, Catherine; Phillips, Mary
2004-01-01
Background: Intact emotion processing is critical for normal emotional development. Recent advances in neuroimaging have facilitated the examination of brain development, and have allowed for the exploration of the relationships between the development of emotion processing abilities, and that of associated neural systems. Methods: A literature…
de Oliveira, Gilberto Santos; Kawahara, Rebeca; Rosa-Fernandes, Livia; Mule, Simon Ngao; Avila, Carla Cristi; Teixeira, Marta M G; Larsen, Martin R; Palmisano, Giuseppe
2018-04-01
Chagas disease, also known as American trypanosomiasis, is caused by the protozoan Trypanosoma cruzi. Over the last 30 years, Chagas disease has expanded from a neglected parasitic infection of the rural population to an urbanized chronic disease, becoming a potentially emergent global health problem. T. cruzi strains have been assigned to seven genetic groups (TcI-TcVI and TcBat), named discrete typing units (DTUs), each representing a set of isolates that differ in virulence, pathogenicity and immunological features. Indeed, attempts have been made to relate the diverse clinical manifestations (from asymptomatic to highly severe disease) to T. cruzi genetic variability. As a result, several DTU typing methods have been introduced. Each method has its own advantages and drawbacks, such as high complexity and analysis time, and all of them are based on genetic signatures. Recently, a novel method discriminated bacterial strains using a peptide identification-free, genome sequence-independent shotgun proteomics workflow. Here, we aimed to develop a Trypanosoma cruzi Strain Typing Assay using MS/MS peptide spectral libraries, named Tc-STAMS2. The Tc-STAMS2 method uses shotgun proteomics combined with spectral library search to assign and discriminate T. cruzi strains independently of genome knowledge. The method is based on the construction of a library of MS/MS peptide spectra built using genotyped T. cruzi reference strains. For identification, the MS/MS peptide spectra of unknown T. cruzi cells are identified using the spectral matching algorithm SpectraST. The Tc-STAMS2 method allowed correct identification of all DTUs with high confidence. The method was robust towards different sample preparations, lengths of chromatographic gradients and fragmentation techniques. Moreover, a pilot inter-laboratory study showed its applicability to different MS platforms. This is the first study to develop an MS-based platform for T. cruzi strain typing. Indeed, the Tc-STAMS2 method allows T. cruzi strain typing using MS/MS spectra as discriminatory features and allows the differentiation of TcI-TcVI DTUs. Similar to genomic-based strategies, the Tc-STAMS2 method allows identification of strains within DTUs. Its robustness towards different experimental and biological variables makes it a valuable complementary strategy to current T. cruzi genotyping assays. Moreover, this method can be used to identify DTU-specific features correlated with the strain phenotype.
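As a rough illustration of the spectral-matching principle that underlies such library searches (not the SpectraST algorithm itself), the Python sketch below bins MS/MS peak lists into vectors and scores a query spectrum against a library by normalized dot product; the peak lists and strain labels are hypothetical.

```python
import numpy as np

def bin_spectrum(peaks, mz_min=100.0, mz_max=1500.0, bin_width=1.0):
    """Convert a list of (m/z, intensity) peaks into a fixed-length, unit-norm intensity vector."""
    n_bins = int((mz_max - mz_min) / bin_width)
    vec = np.zeros(n_bins)
    for mz, intensity in peaks:
        idx = int((mz - mz_min) / bin_width)
        if 0 <= idx < n_bins:
            vec[idx] += intensity
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def best_library_match(query_peaks, library):
    """Return the library entry whose spectrum has the highest normalized dot product with the query."""
    q = bin_spectrum(query_peaks)
    scores = {name: float(np.dot(q, bin_spectrum(peaks))) for name, peaks in library.items()}
    return max(scores, key=scores.get), scores

# Hypothetical toy library keyed by strain/DTU label.
library = {
    "TcI_ref":  [(245.1, 30.0), (389.2, 100.0), (512.3, 45.0)],
    "TcVI_ref": [(245.1, 10.0), (402.2, 100.0), (640.4, 60.0)],
}
query = [(245.1, 25.0), (389.2, 90.0), (512.3, 50.0)]
print(best_library_match(query, library)[0])   # expected: "TcI_ref"
```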
Culturing Chick Embryos--A Simplification of New's Method.
ERIC Educational Resources Information Center
Downie, J. R.
1979-01-01
Describes a simplified version of New's method for culturing early chick embryos. The technique allows continuous observation of the critical first three days of development and the conditions for setting up successful cultures are also presented to help both teachers and students. (HM)
NASA Astrophysics Data System (ADS)
Steiner, Matthias
A statistically proven series injection molding technique for ceramic components was developed for the construction of engines and gas turbines. The flow behavior of silicon injection-molding materials was characterized and improved. Hot-isostatic-pressing reaction-bonded silicon nitride (HIPRBSN) was developed, along with a nondestructive component evaluation method. An injection molding line for the HIPRBSN engine components (precombustion chamber, flame spreader, and valve guide) was developed. This line allows the production of small series for engine tests.
Rapid Radiochemical Method for Isotopic Uranium in Building ...
Technical Fact Sheet. Analysis purpose: qualitative analysis. Technique: alpha spectrometry. Method developed for: uranium-234, uranium-235, and uranium-238 in concrete and brick samples. Method selected for: SAM lists this method for qualitative analysis of uranium-234, uranium-235, and uranium-238 in concrete or brick building materials. A summary of the subject analytical method will be posted to the SAM website to allow access to the method.
ERIC Educational Resources Information Center
Zou, Junhua; Liu, Qingtang; Yang, Zongkai
2012-01-01
Based on Competence Motivation Theory (CMT), a Moodle course for schoolchildren's table tennis learning was developed (The URL is http://www.bssepp.com, and this course allows guest access). The effects of the course on students' knowledge, perceived competence and interest were evaluated through quantitative methods. The sample of the study…
NASA Technical Reports Server (NTRS)
Abolhassani, Jamshid S.; Everton, Eric L.
1990-01-01
An interactive grid adaption method is developed, discussed and applied to the unsteady flow about an oscillating airfoil. The user is allowed to have direct interaction with the adaption of the grid as well as the solution procedure. Grid points are allowed to adapt simultaneously to several variables. In addition to the theory and results, the hardware and software requirements are discussed.
ERIC Educational Resources Information Center
Skorikova, Tatyana Petrovna; Khromova, Sergey Sergeevich; Dneprovskaya, Natalia Vitalievna
2016-01-01
The current level of information technology development allows the authors of educational courses to reduce their dependence on technical specialists and to independently develop distance-learning courses and their separate online components, which requires special methodological training. The aim of the present study is to develop a distance-learning…
Development of Methods and Equipment for Sheet Stamping
NASA Astrophysics Data System (ADS)
Botashev, A. Yu; Bisilov, N. U.; Malsugenov, R. S.
2018-03-01
New methods of sheet stamping were developed: gas forming with double-sided heating of the blank part and gas molding with backpressure. In the first method, the blank part is heated to the set temperature by a double-sided impact of the combustion products of gas mixtures, after which a stamping process is performed under the influence of gas pressure. In gas molding with backpressure, the blank part is heated to the set temperature by a one-sided impact of the combustion products, while backpressure is created on the opposite side of the blank part by compressed air. In both methods the deformation takes place in the temperature range of warm or hot treatment due to the heating of the blank part. This allows parts of complicated shape to be formed within one technological operation, which significantly reduces the cost of production. To implement these methods, original devices were designed and produced, which are new types of forging and stamping equipment. Using these devices, experimental research on the stamping process was carried out and high-quality parts were obtained, which makes it possible to recommend the developed stamping methods for industrial production. Their application in small-scale production will allow the cost price of stamped parts to be reduced by a factor of 2 or 3.
Finding False Paths in Sequential Circuits
NASA Astrophysics Data System (ADS)
Matrosova, A. Yu.; Andreeva, V. V.; Chernyshov, S. V.; Rozhkova, S. V.; Kudin, D. V.
2018-02-01
A method of finding false paths in sequential circuits is developed. In contrast to the heuristic approaches currently used abroad, a precise method is suggested, based on applying operations on Reduced Ordered Binary Decision Diagrams (ROBDDs) extracted from the combinational part of a sequential controlling logic circuit. The method allows false paths to be found when the transfer sequence length is not more than a given value, and obviates the need to investigate combinational circuit equivalents of the given lengths. The possibility of applying the developed method to more complicated circuits is discussed.
OpenMM 7: Rapid development of high performance algorithms for molecular dynamics
Swails, Jason; Zhao, Yutong; Beauchamp, Kyle A.; Wang, Lee-Ping; Stern, Chaya D.; Brooks, Bernard R.; Pande, Vijay S.
2017-01-01
OpenMM is a molecular dynamics simulation toolkit with a unique focus on extensibility. It allows users to easily add new features, including forces with novel functional forms, new integration algorithms, and new simulation protocols. Those features automatically work on all supported hardware types (including both CPUs and GPUs) and perform well on all of them. In many cases they require minimal coding, just a mathematical description of the desired function. They also require no modification to OpenMM itself and can be distributed independently of OpenMM. This makes it an ideal tool for researchers developing new simulation methods, and also allows those new methods to be immediately available to the larger community. PMID:28746339
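A minimal sketch of the kind of extensibility described above: a custom bonded force defined purely by an algebraic energy expression, evaluated in a two-particle system. The import path assumes a recent OpenMM release (older OpenMM 7.x versions used the simtk.openmm namespace), and the parameter values are placeholders.

```python
import openmm as mm
import openmm.unit as unit

# Two particles connected by a user-defined harmonic bond, expressed as plain algebra.
system = mm.System()
system.addParticle(12.0)   # mass in amu
system.addParticle(12.0)

force = mm.CustomBondForce("0.5*k*(r-r0)^2")   # energy as a mathematical description, no new C++ code
force.addPerBondParameter("k")                 # kJ/mol/nm^2 (placeholder value below)
force.addPerBondParameter("r0")                # nm
force.addBond(0, 1, [1000.0, 0.15])
system.addForce(force)

integrator = mm.VerletIntegrator(1.0 * unit.femtoseconds)
context = mm.Context(system, integrator)
context.setPositions([(0.0, 0.0, 0.0), (0.2, 0.0, 0.0)] * unit.nanometers)

state = context.getState(getEnergy=True)
print(state.getPotentialEnergy())   # 0.5 * 1000 * (0.2 - 0.15)^2 = 1.25 kJ/mol
```

The same custom force runs unchanged on CPU or GPU platforms, which is the portability property emphasized in the abstract.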
A robust quantitative near infrared modeling approach for blend monitoring.
Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A
2018-01-30
This study demonstrates a material-sparing near-infrared modeling approach for powder blend monitoring. In this new approach, gram-scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by traditional method development (blender-scale method) were compared by simultaneously monitoring a 1 kg batch-size blend run. Both models demonstrated similar performance. The small-scale strategy significantly reduces the total resources expended to develop near-infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.
High-resolution melting-curve (HRM) analysis for C. meleagridis identification in stool samples.
Chelbi, Hanen; Essid, Rym; Jelassi, Refka; Bouzekri, Nesrine; Zidi, Ines; Ben Salah, Hamza; Mrad, Ilhem; Ben Sghaier, Ines; Abdelmalek, Rym; Aissa, Sameh; Bouratbine, Aida; Aoun, Karim
2018-02-01
Cryptosporidiosis represents a major public health problem. This infection, caused by a protozoan parasite of the genus Cryptosporidium, has been reported worldwide as a frequent cause of diarrhoea. In the immunocompetent host, the typical watery diarrhoea can be self-limiting. In the immunocompromised host, however, it is severe and chronic and may cause death. Cryptosporidium spp. are coccidians, which complete their life cycle in both humans and animals. The two species C. hominis and C. parvum are the major cause of human infection. Compared to studies on C. hominis and C. parvum, only a few studies have developed methods to identify C. meleagridis. The aim was to develop a new real-time PCR-coupled high-resolution melting assay allowing the detection of C. meleagridis in addition to the other dominant species (C. hominis and C. parvum). The polymorphic sequence of the dihydrofolate reductase gene (DHFR) of the three species was sequenced to design a primer pair and establish a sensitive real-time PCR coupled to a high-resolution melting-curve (HRM) analysis method, allowing the detection of Cryptosporidium sp. and discrimination between the three species prevalent in Tunisia. We analyzed a collection of 42 archived human isolates of the three studied species. Real-time PCR coupled to the HRM assay allowed detection of Cryptosporidium using the newly designed primers and, based on the melting profile, allowed C. meleagridis to be distinguished in addition to C. parvum and C. hominis. We developed a qPCR-HRM assay that allows Cryptosporidium genotyping. This method is sensitive and able to distinguish three Cryptosporidium species. Copyright © 2017. Published by Elsevier Ltd.
Reflexion measurements for inverse characterization of steel diffusion bond mechanical properties
NASA Astrophysics Data System (ADS)
Le Bourdais, Florian; Cachon, Lionel; Rigal, Emmanuel
2017-02-01
The present work describes a non-destructive testing method aimed at securing high manufacturing quality of the innovative compact heat exchanger developed under the framework of the CEA R&D program dedicated to the Advanced Sodium Technological Reactor for Industrial Demonstration (ASTRID). The heat exchanger assembly procedure currently proposed involves high temperature and high pressure diffusion welding of stainless steel plates. The aim of the non-destructive method presented herein is to characterize the quality of the welds obtained through this assembly process. Based on a low-frequency model developed by Baik and Thompson [1], pulse-echo normal incidence measurements are calibrated according to a specific procedure and allow the determination of the welding interface stiffness using a nonlinear fitting procedure in the frequency domain. Performing the characterization of plates after diffusion welding using this method allows a useful assessment of the material state as a function of the diffusion bonding process.
Cognition in action: imaging brain/body dynamics in mobile humans.
Gramann, Klaus; Gwin, Joseph T; Ferris, Daniel P; Oie, Kelvin; Jung, Tzyy-Ping; Lin, Chin-Teng; Liao, Lun-De; Makeig, Scott
2011-01-01
We have recently developed a mobile brain imaging method (MoBI) that allows simultaneous recording of the brain and body dynamics of humans actively behaving in and interacting with their environment. A mobile imaging approach was needed to study cognitive processes that are inherently based on the use of the human physical structure to obtain behavioral goals. This review gives examples of the tight coupling between human physical structure and cognitive processing, and of the role of supraspinal activity during control of human stance and locomotion. Existing brain imaging methods for actively behaving participants are described, and new sensor technology allowing mobile recordings of different behavioral states in humans is introduced. Finally, we review recent work demonstrating the feasibility of a MoBI system developed at the Swartz Center for Computational Neuroscience at the University of California, San Diego, demonstrating the range of behavior that can be investigated with this method.
Jia, Xin; Fontaine, Benjamin M.; Strobel, Fred; Weinert, Emily E.
2014-01-01
A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs) using LC-MS/MS, including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems has been developed. Problems, such as matrix effects from complex biological samples, are addressed and have been optimized. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMPs levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways. PMID:25513747
Validated method for the analysis of goji berry, a rich source of zeaxanthin dipalmitate.
Karioti, Anastasia; Bergonzi, Maria Camilla; Vincieri, Franco F; Bilia, Anna Rita
2014-12-31
In the present study an HPLC-DAD method was developed for the determination of the main carotenoid, zeaxanthin dipalmitate, in the fruits of Lycium barbarum. The aim was to develop and optimize an extraction protocol to allow fast, exhaustive, and repeatable extraction, suitable for labile carotenoid content. Use of liquid N2 allowed the grinding of the fruit. A step of ultrasonication with water removed efficiently the polysaccharides and enabled the exhaustive extraction of carotenoids by hexane/acetone 50:50. The assay was fast and simple and permitted the quality control of a large number of commercial samples including fruits, juices, and a jam. The HPLC method was validated according to ICH guidelines and satisfied the requirements. Finally, the overall method was validated for precision (% RSD ranging between 3.81 and 4.13) and accuracy at three concentration levels. The recovery was between 94 and 107% with RSD values <2%, within the acceptable limits, especially if the difficulty of the matrix is taken into consideration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Christien, F., E-mail: frederic.christien@univ-nantes.fr; Le Gall, R.; Telling, M. T. F.
2015-05-15
A method is proposed for the monitoring of metal recrystallization using neutron diffraction that is based on the measurement of stored energy. Experiments were performed using deformed metal specimens heated in situ while mounted at the sample position of the High Resolution Powder Diffractometer, HRPD (ISIS Facility), UK. Monitoring the breadth of the resulting Bragg lines during heating allows not only the time-dependence (or temperature-dependence) of the stored energy but also the recrystallized fraction to be determined. The analysis method presented here was developed using pure nickel (Ni270) specimens with different deformation levels from 0.29 to 0.94. In situ temperature ramping as well as isothermal annealing was undertaken. The method developed in this work allows accurate and quantitative monitoring of the recrystallization process. The results from neutron diffraction compare satisfactorily with data obtained from calorimetry and hardness measurements.
Nanoscale methods for single-molecule electrochemistry.
Mathwig, Klaus; Aartsma, Thijs J; Canters, Gerard W; Lemay, Serge G
2014-01-01
The development of experiments capable of probing individual molecules has led to major breakthroughs in fields ranging from molecular electronics to biophysics, allowing direct tests of knowledge derived from macroscopic measurements and enabling new assays that probe population heterogeneities and internal molecular dynamics. Although still somewhat in their infancy, such methods are also being developed for probing molecular systems in solution using electrochemical transduction mechanisms. Here we outline the present status of this emerging field, concentrating in particular on optical methods, metal-molecule-metal junctions, and electrochemical nanofluidic devices.
Characterization of an Isolated Kidney's Vasculature for Use in Bio-Thermal Modeling
NASA Astrophysics Data System (ADS)
Payne, Allison H.; Parker, Dennis L.; Moellmer, Jeff; Roemer, Robert B.; Clifford, Sarah
2007-05-01
Accurate bio-thermal modeling requires site-specific modeling of discrete vascular anatomy. Presented herewith are several steps that have been developed to describe the vessel network of isolated canine and bovine kidneys. These perfused, isolated kidneys provide an environment in which to repeatedly test and improve acquisition methods for visualizing the vascular anatomy, as well as a means to experimentally validate discrete vasculature thermal models. The organs are preserved using a previously developed methodology that keeps the vasculature intact, allowing the organ to be perfused. It also allows repeated fixation and re-hydration of the same organ, permitting the comparison of various methods and models. The organ extraction, alcohol preservation, and perfusion of the organ are described. The vessel locations were obtained through a high-resolution time-of-flight (TOF) magnetic resonance angiography (MRA) technique. Sequential improvements of both the experimental setup used for this acquisition and the MR sequence development are presented. The improvements in MR acquisition and experimental setup increased the number of vessels seen in both the raw data and segmented images by 50%. An automatic vessel centerline extraction algorithm describes both vessel location and genealogy. Centerline descriptions also allow vessel diameter and flow rate determination, providing valuable input parameters for the discrete vascular thermal model. Characterized vessel networks of both canine and bovine kidneys are presented. While these tools have been developed in an ex vivo environment, all steps can be applied to in vivo applications.
Statistical methods for analysing responses of wildlife to human disturbance.
Haiganoush K. Preisler; Alan A. Ager; Michael J. Wisdom
2006-01-01
1. Off-road recreation is increasing rapidly in many areas of the world, and effects on wildlife can be highly detrimental. Consequently, we have developed methods for studying wildlife responses to off-road recreation with the use of new technologies that allow frequent and accurate monitoring of human-wildlife interactions. To illustrate these methods, we studied the...
Spacecraft maximum allowable concentrations for selected airborne contaminants, volume 1
NASA Technical Reports Server (NTRS)
1994-01-01
As part of its efforts to promote safe conditions aboard spacecraft, NASA requested the National Research Council (NRC) to develop guidelines for establishing spacecraft maximum allowable concentrations (SMAC's) for contaminants, and to review SMAC's for various spacecraft contaminants to determine whether NASA's recommended exposure limits are consistent with the guidelines recommended by the subcommittee. In response to NASA's request, the NRC organized the Subcommittee on Guidelines for Developing Spacecraft Maximum Allowable Concentrations for Space Station Contaminants within the Committee on Toxicology (COT). In the first phase of its work, the subcommittee developed the criteria and methods for preparing SMAC's for spacecraft contaminants. The subcommittee's report, entitled Guidelines for Developing Spacecraft Maximum Allowable Concentrations for Space Station Contaminants, was published in 1992. The executive summary of that report is reprinted as Appendix A of this volume. In the second phase of the study, the Subcommittee on Spacecraft Maximum Allowable Concentrations reviewed reports prepared by NASA scientists and contractors recommending SMAC's for 35 spacecraft contaminants. The subcommittee sought to determine whether the SMAC reports were consistent with the 1992 guidelines. Appendix B of this volume contains the first 11 SMAC reports that have been reviewed for their application of the guidelines developed in the first phase of this activity and approved by the subcommittee.
Hexographic Method of Complex Town-Planning Terrain Estimate
NASA Astrophysics Data System (ADS)
Khudyakov, A. Ju
2017-11-01
The article deals with the vital problem of complex town-planning analysis based on the "hexographic" graphic-analytic method, compares it with conventional terrain estimate methods and contains examples of the method's application. It describes the author's procedure for estimating restrictions and building a mathematical model that reflects not only conventional town-planning restrictions but also social and aesthetic aspects of the analyzed territory. The method allows one to quickly get an idea of a territory's potential, and an unlimited number of estimated factors can be used. The method can be used for the integrated assessment of urban areas. In addition, it can be used for preliminary evaluation of a territory's commercial attractiveness in the preparation of investment projects. Applying the technique results in simple, informative graphics. Graphical interpretation is straightforward for experts, and a definite advantage is that the results can be readily perceived by non-professionals. Thus, it is possible to build a dialogue between professionals and the public on a new level, allowing the interests of various parties to be taken into account. At the moment, the method is used as a tool for the preparation of integrated urban development projects at the Department of Architecture of the Federal State Autonomous Educational Institution of Higher Education "South Ural State University (National Research University)", FSAEIHE SUSU (NRU). The methodology is included in a course of lectures as material on architectural and urban design for architecture students. The same methodology was successfully tested in the preparation of business strategies for the development of some territories in the Chelyabinsk region. This publication is the first in a series of planned activities developing and describing the methodology of hexographical analysis in urban and architectural practice. It is also planned to create a software product that automates the process of site assessment on the basis of the methodology.
Evaluation of the structures size in the liquid-gas flow by gamma-ray absorption
NASA Astrophysics Data System (ADS)
Zych, Marcin; Hanus, Robert; Jaszczur, Marek; Świsulski, Dariusz; Petryka, Leszek; Jodłowski, Paweł; Zych, Piotr
2018-06-01
The rapid development of tomographic methods, particularly electrical, X-ray and gamma-ray tomography, provides a wide range of information about flow structure. However, all of these methods are quite complicated. At the same time, much simpler systems, such as a gamma-ray absorption measuring system, allow all the key information describing a two-phase flow to be obtained. This article presents the results of analyses of the radiometric signal that allow not only the type of flow to be recognized but also the forming structures to be assessed. Calculation and interpretation of the data were based on the cross-correlation and cross-spectral density functions. In order to verify the calculations, the photographic documentation made during the measurements was used.
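A minimal sketch of the cross-correlation step used in such gamma-absorption measurements: the transit time between two detector signals is taken from the lag that maximizes their cross-correlation, from which a velocity follows once the detector spacing is known. The signals below are synthetic.

```python
import numpy as np

def transit_time(upstream, downstream, dt):
    """Estimate the transit time between two detector signals from the peak of their cross-correlation."""
    x = upstream - np.mean(upstream)
    y = downstream - np.mean(downstream)
    corr = np.correlate(y, x, mode="full")          # lag axis: -(N-1) ... +(N-1)
    lags = np.arange(-len(x) + 1, len(x))
    return lags[np.argmax(corr)] * dt

if __name__ == "__main__":
    dt = 0.001                                      # 1 kHz sampling
    t = np.arange(0.0, 2.0, dt)
    rng = np.random.default_rng(0)
    base = np.sin(2 * np.pi * 3 * t) + 0.3 * rng.standard_normal(len(t))
    delay_samples = 37                              # true transit: 37 ms
    upstream, downstream = base, np.roll(base, delay_samples)
    tau = transit_time(upstream, downstream, dt)
    print(f"estimated transit time: {tau * 1000:.1f} ms")   # ~37 ms
    # velocity = detector_spacing / tau  (detector spacing known from the rig geometry)
```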
Piedra, Jose; Ontiveros, Maria; Miravet, Susana; Penalva, Cristina; Monfar, Mercè; Chillon, Miguel
2015-02-01
Recombinant adeno-associated viruses (rAAVs) are promising vectors in preclinical and clinical assays for the treatment of diseases with gene therapy strategies. Recent technological advances in amplification and purification have allowed the production of highly purified rAAV vector preparations. Although quantitative polymerase chain reaction (qPCR) is the current method of choice for titrating rAAV genomes, it shows high variability. In this work, we report a rapid and robust rAAV titration method based on the quantitation of encapsidated DNA with the fluorescent dye PicoGreen®. This method allows detection from 3×10^10 viral genomes/ml up to 2.4×10^13 viral genomes/ml in a linear range. Contrasted with dot blot or qPCR, the PicoGreen-based assay has less intra- and interassay variability. Moreover, quantitation is rapid, does not require specific primers or probes, and is independent of the rAAV pseudotype analyzed. In summary, development of this universal rAAV-titering method may have substantive implications in rAAV technology.
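A sketch of how fluorescence readings of encapsidated DNA might be converted to vector genome titers via a linear DNA standard curve; the standard-curve values, genome length and nucleotide mass used here are placeholders, not values from the cited work.

```python
import numpy as np

# Hypothetical DNA standard curve: fluorescence (RFU) vs DNA concentration (ng/mL).
std_conc_ng_ml = np.array([0.0, 5.0, 10.0, 25.0, 50.0, 100.0])
std_rfu       = np.array([12.0, 260.0, 515.0, 1290.0, 2560.0, 5120.0])

slope, intercept = np.polyfit(std_conc_ng_ml, std_rfu, 1)   # linear fit of the standard curve

def dna_ng_per_ml(rfu):
    """Interpolate a sample's fluorescence back to DNA concentration using the standard curve."""
    return (rfu - intercept) / slope

def vg_per_ml(rfu, genome_length_nt=4700, dilution_factor=1.0):
    """Convert encapsidated ssDNA concentration to vector genomes per mL.

    Assumes a single-stranded genome of ~4.7 kb and an average ssDNA nucleotide mass of ~330 g/mol.
    """
    ng_ml = dna_ng_per_ml(rfu) * dilution_factor
    grams_per_genome = genome_length_nt * 330.0 / 6.022e23
    return (ng_ml * 1e-9) / grams_per_genome

print(f"{vg_per_ml(1800.0, dilution_factor=100.0):.2e} vg/mL")   # lands within the stated linear range
```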
A space radiation transport method development
NASA Technical Reports Server (NTRS)
Wilson, J. W.; Tripathi, R. K.; Qualls, G. D.; Cucinotta, F. A.; Prael, R. E.; Norbury, J. W.; Heinbockel, J. H.; Tweed, J.
2004-01-01
Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from the preliminary design concepts to the final design. In particular, we will discuss the progress towards a full three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation allowing field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice enabling development of integrated multidisciplinary design optimization methods. A single ray trace in ISS FEM geometry requires 14 ms and severely limits application of Monte Carlo methods to such engineering models. A potential means of improving the Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing and could be utilized in the final design as verification of the deterministic method optimized design. Published by Elsevier Ltd on behalf of COSPAR.
New Geophysical Techniques for Offshore Exploration.
ERIC Educational Resources Information Center
Talwani, Manik
1983-01-01
New seismic techniques have been developed recently that borrow theory from academic institutions and technology from industry, allowing scientists to explore deeper into the earth with much greater precision than possible with older seismic methods. Several of these methods are discussed, including the seismic reflection common-depth-point…
Solution of Thermoelectricity Problems Energy Method
NASA Astrophysics Data System (ADS)
Niyazbek, Muheyat; Nogaybaeva, M. O.; Talp, Kuenssaule; Kudaikulov, A. A.
2018-06-01
On the basis of the fundamental law of conservation of energy, in conjunction with local quadratic spline functions, a universal computing algorithm, method and associated software were developed that allow the investigation of a thermally insulated rod of limited length influenced by local heat flow, heat transfer and temperature.
Validation of a Spectral Method for Quantitative Measurement of Color in Protein Drug Solutions.
Yin, Jian; Swartz, Trevor E; Zhang, Jian; Patapoff, Thomas W; Chen, Bartolo; Marhoul, Joseph; Shih, Norman; Kabakoff, Bruce; Rahimi, Kimia
2016-01-01
A quantitative spectral method has been developed to precisely measure the color of protein solutions. In this method, a spectrophotometer is utilized for capturing the visible absorption spectrum of a protein solution, which can then be converted to color values (L*a*b*) that represent human perception of color in a quantitative three-dimensional space. These quantitative values (L*a*b*) allow for calculating the best match of a sample's color to a European Pharmacopoeia reference color solution. In order to qualify this instrument and assay for use in clinical quality control, a technical assessment was conducted to evaluate the assay suitability and precision. Setting acceptance criteria for this study required development and implementation of a unique statistical method for assessing precision in 3-dimensional space. Different instruments, cuvettes, protein solutions, and analysts were compared in this study. The instrument accuracy, repeatability, and assay precision were determined. The instrument and assay are found suitable for use in assessing color of drug substances and drug products and is comparable to the current European Pharmacopoeia visual assessment method. In the biotechnology industry, a visual assessment is the most commonly used method for color characterization, batch release, and stability testing of liquid protein drug solutions. Using this method, an analyst visually determines the color of the sample by choosing the closest match to a standard color series. This visual method can be subjective because it requires an analyst to make a judgment of the best match of color of the sample to the standard color series, and it does not capture data on hue and chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we developed a quantitative spectral method for color determination that greatly reduces the variability in measuring color and allows for a more precise understanding of color differences. In this study, we established a statistical method for assessing precision in 3-dimensional space and demonstrated that the quantitative spectral method is comparable with respect to precision and accuracy to the current European Pharmacopoeia visual assessment method. © PDA, Inc. 2016.
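Once L*a*b* values have been computed from the spectrum, the closest reference color solution can be selected by the smallest color difference. The sketch below uses the simple CIE76 ΔE*ab (Euclidean distance in L*a*b* space) and placeholder reference values; it illustrates the matching step only and is not the validated assay.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 color difference: Euclidean distance in L*a*b* space."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

def closest_reference(sample_lab, references):
    """Return the reference color solution with the smallest dE*ab to the sample."""
    return min(references.items(), key=lambda kv: delta_e_ab(sample_lab, kv[1]))

# Placeholder L*a*b* values standing in for a brownish-yellow reference series (not actual Ph. Eur. data).
references = {
    "BY5": (99.0, -0.5, 3.0),
    "BY4": (98.0, -1.0, 6.5),
    "BY3": (96.5, -1.5, 12.0),
    "BY2": (94.0, -1.0, 20.0),
}

sample_lab = (97.2, -1.2, 10.4)   # L*a*b* computed from the measured spectrum
name, ref_lab = closest_reference(sample_lab, references)
print(f"closest reference: {name}, dE*ab = {delta_e_ab(sample_lab, ref_lab):.2f}")
```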
20170308 - Higher Throughput Toxicokinetics to Allow ...
As part of "Ongoing EDSP Directions & Activities" I will present CSS research on high throughput toxicokinetics, including in vitro data and models to allow rapid determination of the real world doses that may cause endocrine disruption. This is a presentation as part of the U.S. Environmental Protection Agency – Japan Ministry of the Environment 12th Bilateral Meeting on Endocrine Disruption Test Methods Development.
Flight Control in Complex Environments
2016-10-24
...specialisations that allow insects, with their miniature brains and limited sensory systems, to fly safely through cluttered natural environments. ... bees have developed more accurate or effective methods for flying safely through gaps than species from less complex environments.
Structured Case Analysis: Developing Critical Thinking Skills in a Marketing Case Course
ERIC Educational Resources Information Center
Klebba, Joanne M.; Hamilton, Janet G.
2007-01-01
Structured case analysis is a hybrid pedagogy that flexibly combines diverse instructional methods with comprehensive case analysis as a mechanism to develop critical thinking skills. An incremental learning framework is proposed that allows instructors to develop and monitor content-specific theory and the corresponding critical thinking skills.…
ERIC Educational Resources Information Center
Luckett, S.; Luckett, K.
1999-01-01
A South African university's community development program attempted to integrate Checkland's soft-systems method into Kolb's learning-cycle theory. Evaluation revealed shortcomings in the curriculum design, including the assumption of learner autonomy, necessity of assessing students individually, and difficulty of allowing learners to construct…
Measuring Service Quality in Higher Education: Development of a Hierarchical Model (HESQUAL)
ERIC Educational Resources Information Center
Teeroovengadum, Viraiyan; Kamalanabhan, T. J.; Seebaluck, Ashley Keshwar
2016-01-01
Purpose: This paper aims to develop and empirically test a hierarchical model for measuring service quality in higher education. Design/methodology/approach: The first phase of the study consisted of qualitative research methods and a comprehensive literature review, which allowed the development of a conceptual model comprising 53 service quality…
NASA Astrophysics Data System (ADS)
Abdel-Ghany, Maha F.; Hussein, Lobna A.; Ayad, Miriam F.; Youssef, Menatallah M.
2017-01-01
New, simple, accurate and sensitive UV spectrophotometric and chemometric methods have been developed and validated for the determination of Entacapone (ENT), Levodopa (LD) and Carbidopa (CD) in a ternary mixture. Method A is a derivative ratio spectra zero-crossing spectrophotometric method which allows the determination of ENT in the presence of both LD and CD by measuring the peak amplitude at 249.9 nm in the range of 1-20 μg mL−1. Method B is a double divisor-first derivative of ratio spectra method, used for the determination of ENT, LD and CD at 245, 239 and 293 nm, respectively. Method C is a mean centering of ratio spectra method which allows their determination at 241, 241.6 and 257.1 nm, respectively. Methods B and C could successfully determine the studied drugs in concentration ranges of 1-20 μg mL−1 for ENT and 10-90 μg mL−1 for both LD and CD. Methods D and E are principal component regression and partial least-squares, respectively, used for the simultaneous determination of the studied drugs using seventeen mixtures as the calibration set and eight mixtures as the validation set. The developed methods have the advantage of simultaneous determination of the cited components without any pre-treatment. All the results were statistically compared with the reported methods, and no significant difference was observed. The developed methods were satisfactorily applied to the analysis of the investigated drugs in their pure form and in pharmaceutical dosage forms.
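As an illustration of the partial least-squares step (method E), the following Python sketch fits a scikit-learn PLS model on a synthetic calibration set of mixture spectra and predicts the three concentrations of an unseen mixture; all spectra and concentration values are simulated, not the reported data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Synthetic stand-ins: 17 calibration mixtures, absorbance at 200 wavelengths,
# and concentrations of the three analytes (ENT, LD, CD) in ug/mL.
n_cal, n_wl = 17, 200
wavelengths = np.linspace(220, 320, n_wl)
pure = np.stack([np.exp(-((wavelengths - c) / 15.0) ** 2) for c in (250, 265, 285)])  # pseudo pure-component spectra
C_cal = rng.uniform([1, 10, 10], [20, 90, 90], size=(n_cal, 3))
X_cal = C_cal @ pure + 0.002 * rng.standard_normal((n_cal, n_wl))                     # Beer-Lambert-like mixing + noise

pls = PLSRegression(n_components=3)
pls.fit(X_cal, C_cal)

# Predict an unseen validation mixture.
c_true = np.array([[12.0, 45.0, 60.0]])
x_val = c_true @ pure + 0.002 * rng.standard_normal((1, n_wl))
print(np.round(pls.predict(x_val), 2), "vs true", c_true)
```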
Current Lead Design for the Accelerator Project for Upgrade of LHC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandt, Jeffrey S.; Cheban, Sergey; Feher, Sandor
2010-01-01
The Accelerator Project for Upgrade of LHC (APUL) is a U.S. project participating in and contributing to CERN's Large Hadron Collider (LHC) upgrade program. In collaboration with Brookhaven National Laboratory, Fermilab is developing sub-systems for an upgrade of the LHC final focus magnet systems. A concept of main and auxiliary helium flow was developed that allows the superconductor to remain cold while the lead body warms up to prevent upper section frosting. The auxiliary flow will subsequently cool the thermal shields of the feed box and the transmission line cryostats. A thermal analysis of the current lead central heat exchange section was performed using analytic and FEA techniques. A method of remote soldering was developed that allows the current leads to be field replaceable. The remote solder joint was designed to be made without flux or additional solder, and to be able to be remade up to ten full cycles. A method of upper section attachment was developed that allows high-pressure sealing of the helium volume. Test fixtures for both remote soldering and upper section attachment for the 13 kA lead were produced. The cooling concept, thermal analyses, and test results from both remote soldering and upper section attachment fixtures are presented.
Development of Methodologies Evaluating Emissions from Metal-Containing Explosives and Propellants
Experiments were performed to develop methodologies that will allow determination of pollutant emission factors for gases and particles produced by...micrometer, 16 by weight). Although not included here, the analysis methods described will be directly applicable to the study of pyrotechnics.
An exploratory survey of methods used to develop measures of performance
NASA Astrophysics Data System (ADS)
Hamner, Kenneth L.; Lafleur, Charles A.
1993-09-01
Nonmanufacturing organizations are being challenged to provide high-quality products and services to their customers, with an emphasis on continuous process improvement. Measures of performance, referred to as metrics, can be used to foster process improvement. The application of performance measurement to nonmanufacturing processes can be very difficult. This research explored methods used to develop metrics in nonmanufacturing organizations. Several methods were formally defined in the literature, and the researchers used a two-step screening process to determine that the OMB Generic Method was most likely to produce high-quality metrics. The OMB Generic Method was then used to develop metrics. A few other metric development methods were found in use at nonmanufacturing organizations. The researchers interviewed participants in metric development efforts to determine their satisfaction and to have them identify the strengths and weaknesses of, and recommend improvements to, the metric development methods used. Analysis of participants' responses allowed the researchers to identify the key components of a sound metric development method. Those components were incorporated into a proposed metric development method that was based on the OMB Generic Method and should be more likely to produce high-quality metrics that will result in continuous process improvement.
An overview of NSPCG: A nonsymmetric preconditioned conjugate gradient package
NASA Astrophysics Data System (ADS)
Oppe, Thomas C.; Joubert, Wayne D.; Kincaid, David R.
1989-05-01
The most recent research-oriented software package developed as part of the ITPACK Project is called "NSPCG" since it contains many nonsymmetric preconditioned conjugate gradient procedures. It is designed to solve large sparse systems of linear algebraic equations by a variety of different iterative methods. One of the main purposes for the development of the package is to provide a common modular structure for research on iterative methods for nonsymmetric matrices. Another purpose for the development of the package is to investigate the suitability of several iterative methods for vector computers. Since the vectorizability of an iterative method depends greatly on the matrix structure, NSPCG allows great flexibility in the operator representation. The coefficient matrix can be passed in one of several different matrix data storage schemes. These sparse data formats allow matrices with a wide range of structures from highly structured ones such as those with all nonzeros along a relatively small number of diagonals to completely unstructured sparse matrices. Alternatively, the package allows the user to call the accelerators directly with user-supplied routines for performing certain matrix operations. In this case, one can use the data format from an application program and not be required to copy the matrix into one of the package formats. This is particularly advantageous when memory space is limited. Some of the basic preconditioners that are available are point methods such as Jacobi, Incomplete LU Decomposition and Symmetric Successive Overrelaxation as well as block and multicolor preconditioners. The user can select from a large collection of accelerators such as Conjugate Gradient (CG), Chebyshev (SI, for semi-iterative), Generalized Minimal Residual (GMRES), Biconjugate Gradient Squared (BCGS) and many others. The package is modular so that almost any accelerator can be used with almost any preconditioner.
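For readers unfamiliar with the accelerator/preconditioner split around which the package is organized, the following NumPy sketch shows a Jacobi-preconditioned conjugate gradient on a symmetric positive definite test matrix; it illustrates one accelerator-preconditioner pairing only and is unrelated to NSPCG's Fortran interface (for nonsymmetric systems the package offers accelerators such as GMRES and BCGS instead of CG).

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=500):
    """Preconditioned conjugate gradient: solve A x = b, with M_inv(r) applying the preconditioner."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for k in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, k + 1
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

if __name__ == "__main__":
    n = 200
    # Symmetric positive definite test matrix (1-D Laplacian); Jacobi preconditioner = inverse diagonal.
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    diag = np.diag(A).copy()
    x, iters = pcg(A, b, lambda r: r / diag)
    print(iters, np.linalg.norm(A @ x - b))
```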
Mitrowska, Kamila; Vincent, Ursula; von Holst, Christoph
2012-04-13
The manuscript presents the development of a new reverse-phase high performance liquid chromatography (RP-HPLC) photodiode array detection method allowing the separation and quantification of 15 carotenoids (adonirubin, adonixanthin, astaxanthin, astaxanthin dimethyl disuccinate, asteroidenone, beta-apo-8'-carotenal, beta-apo-8'-carotenoic acid ethyl ester, beta-carotene, canthaxanthin, capsanthin, citranaxanthin, echinenone, lutein, lycopene, and zeaxanthin), 10 of which are feed additives authorised within the European Union. The developed method allows the reliable determination of the total carotenoid content in one run using the corresponding E-isomer as calibration standard while taking into account the E/Z-isomer composition. This is a key criterion for the application of the method, since for most of the analytes included in this study analytical standards are only available for the E-isomers. This goal was achieved by applying the isosbestic concept in order to identify specific wavelengths at which the absorption coefficients are identical for all stereoisomers concerned. The second target was the optimisation of the LC conditions. By means of an experimental design, an optimised RP-HPLC method was developed allowing a sufficient chromatographic separation of all carotenoids. The selected method uses a Suplex pKb-100 HPLC column and applies a gradient with a mixture of acetonitrile, tert-butyl-methyl ether and water as mobile phases. The limits of detection and limits of quantification ranged from 0.06 mg L−1 to 0.14 mg L−1 and from 0.20 mg L−1 to 0.48 mg L−1, respectively. Copyright © 2012 Elsevier B.V. All rights reserved.
Design of the algorithm of photons migration in the multilayer skin structure
NASA Astrophysics Data System (ADS)
Bulykina, Anastasiia B.; Ryzhova, Victoria A.; Korotaev, Valery V.; Samokhin, Nikita Y.
2017-06-01
The design of approaches and methods for diagnosing oncological diseases is of special significance, since it allows tumors of any kind to be detected at early stages. The development of optical and laser technologies has increased the number of methods available for diagnostic studies of oncological diseases. A promising area of biomedical diagnostics is the development of automated nondestructive testing systems for the study of the skin's polarizing properties based on backscattered radiation detection. Characterizing the polarizing properties of the examined tissue allows changes in structural properties caused by various pathologies to be studied. Consequently, measurement and analysis of the polarizing properties of the scattered optical radiation are relevant to the development of methods for diagnosis and imaging of skin in vivo. The purpose of this research is to design an algorithm for photon migration in a multilayer skin structure. In this research, such an algorithm was designed, based on the Monte Carlo method. The implemented Monte Carlo method tracks the paths of photons that experience random discrete direction changes until they leave the analyzed area or their intensity decreases to a negligible level. The modeling algorithm consists of generating the medium and source characteristics, generating a photon with spatial coordinates and polar and azimuthal angles, calculating the photon weight reduction due to specular and diffuse reflection, determining the photon mean free path, determining the photon direction after random scattering with a Henyey-Greenstein phase function, and calculating the medium's absorption. Biological tissue is modeled as a homogeneous scattering sheet characterized by absorption, scattering and anisotropy coefficients.
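Two core steps of such a photon-migration algorithm, sketched in Python under illustrative optical coefficients: the free path is sampled from the total attenuation coefficient, the scattering angle from the Henyey-Greenstein phase function, and the photon weight is reduced by the absorbed fraction at each interaction (full direction bookkeeping and boundary handling are omitted).

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_free_path(mu_t):
    """Photon free path between interactions, exponentially distributed with mean 1/mu_t."""
    return -np.log(rng.random()) / mu_t

def sample_hg_cos_theta(g):
    """Sample the cosine of the scattering angle from the Henyey-Greenstein phase function."""
    xi = rng.random()
    if abs(g) < 1e-6:
        return 2.0 * xi - 1.0            # isotropic scattering limit
    tmp = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
    return (1.0 + g * g - tmp * tmp) / (2.0 * g)

def propagate(mu_a=0.1, mu_s=10.0, g=0.9, w_min=1e-4, max_steps=10_000):
    """Track a single photon's weight through a homogeneous layer (direction updates omitted)."""
    mu_t = mu_a + mu_s
    weight, path = 1.0, 0.0
    for _ in range(max_steps):
        path += sample_free_path(mu_t)
        weight *= mu_s / mu_t            # fraction surviving absorption at this interaction
        _cos_theta = sample_hg_cos_theta(g)   # would update the direction vector in a full model
        if weight < w_min:
            break
    return weight, path

print(propagate())
```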
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulchin, Yu N; Vitrik, O B; Kuchmizhak, A A
2014-10-31
It is shown theoretically that the use of the spectral registration of the dipole local plasmon resonance (DLPR) displacement in a single spherical gold nanoantenna, placed near the surface of a homogeneous dielectric medium, allows the mapping of extremely small variations (down to 5 × 10^−4) of the refractive index (RI) of this medium. Using the quasi-static approximation, we have developed an analytic model that allows evaluation of the spectral displacement of the nanoantenna DLPR depending on the variation in the medium refractive index. A point probe based on a fibre microaxicon with a gold spherical nanoantenna attached to its top is proposed that allows practical implementation of the developed RI scanning method. Numerical calculations of the probe characteristics using the time-domain finite-difference method are presented, and it is shown that for a gold spherical nanoantenna of small size, comparable with the skin-layer thickness in gold, the relative spectral shift value is in good agreement with the results obtained using the developed analytic model. (Laser applications and other topics in quantum electronics)
ERIC Educational Resources Information Center
Dzhamalova, Bika B.; Timonin, Andrey I.; Kolesov, Vladimir I.; Pavlov, Vladimir V.; Evstegneeva, Anastasiia A.
2016-01-01
This article is focused on the development of the structure and content of the consolidating orientation of the pedagogical functions of university teachers in international students' training. The leading research method is modeling, which allows the established structure and content of the consolidating orientation to be justified…
A fluorescence-quenching method was developed to assess the hydrophobic organic pollutant binding potential of organic colloids (OC) in unaltered natural waters. This method allows (1) direct assessment of the importance of OC-enhanced pollutant transport for environmental samples…
Mainali, Laxman; Camenisch, Theodore G; Hyde, James S; Subczynski, Witold K
2017-12-01
The presence of integral membrane proteins induces the formation of distinct domains in the lipid bilayer portion of biological membranes. Qualitative application of both continuous wave (CW) and saturation recovery (SR) electron paramagnetic resonance (EPR) spin-labeling methods allowed discrimination of the bulk, boundary, and trapped lipid domains. A recently developed method, which is based on the CW EPR spectra of phospholipid (PL) and cholesterol (Chol) analog spin labels, allows evaluation of the relative amount of PLs (% of total PLs) in the boundary plus trapped lipid domain and the relative amount of Chol (% of total Chol) in the trapped lipid domain [M. Raguz, L. Mainali, W. J. O'Brien, and W. K. Subczynski (2015), Exp. Eye Res., 140:179-186]. Here, a new method is presented that, based on SR EPR spin-labeling, allows quantitative evaluation of the relative amounts of PLs and Chol in the trapped lipid domain of intact membranes. This new method complements the existing one, allowing acquisition of more detailed information about the distribution of lipids between domains in intact membranes. The methodological transition of the SR EPR spin-labeling approach from qualitative to quantitative is demonstrated. The abilities of this method are illustrated for intact cortical and nuclear fiber cell plasma membranes from porcine eye lenses. Statistical analysis (Student's t-test) of the data allowed determination of the separations of mean values above which differences can be treated as statistically significant (P ≤ 0.05) and can be attributed to sources other than preparation/technique.
Wikipedia mining of hidden links between political leaders
NASA Astrophysics Data System (ADS)
Frahm, Klaus M.; Jaffrès-Runser, Katia; Shepelyansky, Dima L.
2016-12-01
We describe a new method of the reduced Google matrix which allows direct and hidden links between a subset of nodes of a large directed network to be established. This approach uses parallels with quantum scattering theory, developed for processes in nuclear and mesoscopic physics and quantum chaos. The method is applied to the Wikipedia networks in different language editions, analyzing several groups of political leaders of the USA, UK, Germany, France, Russia and the G20. We demonstrate that this approach allows direct and hidden links among political leaders to be recovered reliably. We argue that the reduced Google matrix method can form the mathematical basis for studies in the social and political sciences analyzing Leader-Member eXchange (LMX).
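A NumPy sketch of the reduced Google matrix construction reported in this line of work, G_R = G_rr + G_rs (1 − G_ss)^{-1} G_sr, applied to a toy random directed network; the node subset and network are illustrative, and the spectral decompositions used in the full analysis are omitted.

```python
import numpy as np

def google_matrix(adj, alpha=0.85):
    """Column-stochastic Google matrix: G[i, j] = probability of jumping from node j to node i."""
    n = adj.shape[0]
    S = adj.astype(float).T.copy()            # S[i, j] = 1 if j links to i
    col_sums = S.sum(axis=0)
    S[:, col_sums == 0] = 1.0 / n             # dangling nodes: uniform outgoing probability
    S = S / S.sum(axis=0)
    return alpha * S + (1.0 - alpha) / n

def reduced_google_matrix(G, subset):
    """G_R = G_rr + G_rs (1 - G_ss)^{-1} G_sr for the selected subset of nodes."""
    r = np.array(subset)
    s = np.array([i for i in range(G.shape[0]) if i not in set(subset)])
    G_rr, G_rs = G[np.ix_(r, r)], G[np.ix_(r, s)]
    G_sr, G_ss = G[np.ix_(s, r)], G[np.ix_(s, s)]
    return G_rr + G_rs @ np.linalg.inv(np.eye(len(s)) - G_ss) @ G_sr

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    adj = (rng.random((50, 50)) < 0.08).astype(int)   # toy directed network
    np.fill_diagonal(adj, 0)
    G_R = reduced_google_matrix(google_matrix(adj), subset=[0, 1, 2, 3, 4])
    print(np.round(G_R, 3))
    print("column sums:", np.round(G_R.sum(axis=0), 6))   # G_R remains column-stochastic
```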
Head-target tracking control of well drilling
NASA Astrophysics Data System (ADS)
Agzamov, Z. V.
2018-05-01
A method of directional drilling trajectory control for oil and gas wells using predictive models is considered in the paper. The developed method does not rely on optimization and therefore does not require high-performance computing. Nevertheless, it allows the well plan to be followed with high precision while taking process input saturation into account. The controller output is calculated both from the current target reference point of the well plan and from a prediction of the well trajectory using the analytical model. This method allows a well plan to be followed not only in angular but also in Cartesian coordinates. Simulation of the control system has confirmed high precision and operational performance under a wide range of random disturbances.
Development of design and analysis methodology for composite bolted joints
NASA Astrophysics Data System (ADS)
Grant, Peter; Sawicki, Adam
1991-05-01
This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.
Correlative Stochastic Optical Reconstruction Microscopy and Electron Microscopy
Kim, Doory; Deerinck, Thomas J.; Sigal, Yaron M.; Babcock, Hazen P.; Ellisman, Mark H.; Zhuang, Xiaowei
2015-01-01
Correlative fluorescence light microscopy and electron microscopy allows the imaging of spatial distributions of specific biomolecules in the context of cellular ultrastructure. Recent development of super-resolution fluorescence microscopy allows the location of molecules to be determined with nanometer-scale spatial resolution. However, correlative super-resolution fluorescence microscopy and electron microscopy (EM) still remains challenging because the optimal specimen preparation and imaging conditions for super-resolution fluorescence microscopy and EM are often not compatible. Here, we have developed several experimental protocols for correlative stochastic optical reconstruction microscopy (STORM) and EM methods, both for un-embedded samples, by applying EM-specific sample preparations after STORM imaging, and for embedded and sectioned samples, by optimizing the fluorescence under EM fixation, staining and embedding conditions. We demonstrated these methods using a variety of cellular targets. PMID:25874453
Inverse analysis of water profile in starch by non-contact photopyroelectric method
NASA Astrophysics Data System (ADS)
Frandas, A.; Duvaut, T.; Paris, D.
2000-07-01
The photopyroelectric (PPE) method in a non-contact configuration was proposed to study water migration in starch sheets used for biodegradable packaging. A 1-D theoretical model was developed, allowing the study of samples having a water profile characterized by an arbitrary continuous function. An experimental setup was designed for this purpose, which included the choice of excitation source, detection of signals, signal and data processing, and cells for conditioning the samples. We report here the development of an inversion procedure allowing for the determination of the parameters that influence the PPE signal. This procedure led to the optimization of experimental conditions in order to identify the parameters related to the water profile in the sample, and to monitor the dynamics of the process.
Rösner, Benedikt; Döring, Florian; Ribič, Primož R; Gauthier, David; Principi, Emiliano; Masciovecchio, Claudio; Zangrando, Marco; Vila-Comamala, Joan; De Ninno, Giovanni; David, Christian
2017-11-27
High resolution metrology of beam profiles is presently a major challenge at X-ray free electron lasers. We demonstrate a characterization method based on beam imprints in poly(methyl methacrylate). By immersing the imprints formed at 47.8 eV into organic solvents, the regions exposed to the beam are removed, similar to resist development in grayscale lithography. This allows for extending the sensitivity of the method by more than an order of magnitude compared to the established analysis of imprints created solely by ablation. Applying the Beer-Lambert law for absorption, the intensity distribution in a micron-sized focus can be reconstructed from one single shot with a high dynamic range exceeding 10³. The procedure described here allows for beam characterization at free electron lasers revealing even faint beam tails, which are not accessible when using ablation imprint methods. We demonstrate the greatly extended dynamic range on developed imprints taken in focus of conventional Fresnel zone plates and spiral zone plates producing beams with a topological charge.
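A minimal sketch of the Beer-Lambert inversion step is given below, assuming that development removes material down to the depth at which the locally deposited dose falls to a threshold value; under that assumption a measured depth d maps back to surface fluence as F = D_th * exp(d / l_att). The attenuation length and threshold used are illustrative placeholders, not values from the paper.

```python
import numpy as np

L_ATT = 0.05   # assumed attenuation length in PMMA at the photon energy, um (illustrative)
D_TH = 1.0     # assumed development threshold dose, arbitrary units (illustrative)

def fluence_from_depth(depth_map_um):
    """Invert a developed-imprint depth map to a relative fluence map.

    Beer-Lambert: dose(z) = F * exp(-z / L_ATT).  Development stops where the
    dose drops to D_TH, so a measured depth d gives F = D_TH * exp(d / L_ATT).
    """
    return D_TH * np.exp(np.asarray(depth_map_um) / L_ATT)

# synthetic Gaussian focus: depth profile as would be measured e.g. by AFM
x = np.linspace(-2, 2, 201)                       # um
depth = 0.4 * np.exp(-x**2 / (2 * 0.3**2))        # um
fluence = fluence_from_depth(depth)
print("dynamic range of reconstructed fluence: %.1e" % (fluence.max() / fluence.min()))
```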
2015-01-01
The use of mixed methods (combining quantitative and qualitative data) is developing in a variety of forms, especially in the health field. Our own research has adopted this perspective from the outset. We have sought all along to innovate in various ways and especially to develop an equal partnership, in the sense of not allowing any single approach to dominate. After briefly describing mixed methods, in this article we explain and illustrate how we have exploited both qualitative and quantitative methods to answer our research questions, ending with a reflective analysis of our experiment. PMID:26559730
Thematic Analysis of the Children's Drawings on Museum Visit: Adaptation of the Kuhn's Method
ERIC Educational Resources Information Center
Kisovar-Ivanda, Tamara
2014-01-01
Researchers are using techniques that allow children to express their perspectives. In 2003, Kuhn developed a method of data collection and analysis which combined thematic drawing and a focused, episodic interview. In this article, Kuhn's method is adapted using the draw-and-write technique as a research methodology. Reflections on the…
Invasive pulmonary aspergillosis: current diagnostic methodologies and a new molecular approach.
Moura, S; Cerqueira, L; Almeida, A
2018-05-13
The fungus Aspergillus fumigatus is the main pathogenic agent responsible for invasive pulmonary aspergillosis. Immunocompromised patients are more likely to develop this pathology due to a decrease in the immune system's defense capacity. Despite the low occurrence of invasive pulmonary aspergillosis, this pathology presents high rates of mortality, mostly due to late and unspecific diagnosis. Currently, the diagnostic methods used to detect this fungal infection are conventional mycological examination (direct microscopic examination, histological examination, and culture), imaging, non-culture-based tests for the detection of galactomannan, β(1,3)-glucan and an extracellular glycoprotein, and molecular tests based on PCR. However, most of these methods do not detect the species A. fumigatus; they only allow identification of the genus Aspergillus. The development of more specific detection methods is therefore of great importance. Fluorescence in situ hybridization-based molecular methods can be a good alternative for this purpose. In this review, we point out that most of the methods used for the diagnosis of invasive pulmonary aspergillosis do not allow detection of the fungus at the species level and that fluorescence in situ hybridization-based molecular methods are a promising approach for A. fumigatus detection.
Wen, Zhimou; Chen, Jeng Shong
2018-05-26
We report here a simple and sensitive plant-based western corn rootworm, Diabrotica virgifera virgifera LeConte (Coleoptera: Chrysomelidae), bioassay method that allows for examination of multiple parameters for both plants and insects in a single experimental setup within a short duration. For plants, injury to roots can be visually examined, fresh root weight can be measured, and expression of trait protein in plant roots can be analyzed. For insects, in addition to survival, larval growth and development can be evaluated in several aspects including body weight gain, body length, and head capsule width. We demonstrated using the method that eCry3.1Ab-expressing 5307 corn was very effective against western corn rootworm by eliciting high mortality and significantly inhibiting larval growth and development. We also validated that the method allowed determination of resistance in an eCry3.1Ab-resistant western corn rootworm strain. While data presented in this paper demonstrate the usefulness of the method for selection of events of protein traits and for determination of resistance in laboratory populations, we envision that the method can be applied in much broader applications.
Jacquin, Laval; Cao, Tuong-Vi; Ahmadi, Nourollah
2016-01-01
One objective of this study was to provide readers with a clear and unified understanding of parametric statistical and kernel methods used for genomic prediction, and to compare some of these in the context of rice breeding for quantitative traits. Furthermore, another objective was to provide a simple and user-friendly R package, named KRMM, which allows users to perform RKHS regression with several kernels. After introducing the concept of regularized empirical risk minimization, the connections between well-known parametric and kernel methods such as Ridge regression [i.e., genomic best linear unbiased predictor (GBLUP)] and reproducing kernel Hilbert space (RKHS) regression were reviewed. Ridge regression was then reformulated so as to show and emphasize the advantage of the kernel "trick" concept, exploited by kernel methods in the context of epistatic genetic architectures, over parametric frameworks used by conventional methods. Several parametric and kernel methods, namely the least absolute shrinkage and selection operator (LASSO), GBLUP, support vector machine regression (SVR) and RKHS regression, were then compared for their genomic predictive ability in the context of rice breeding using three real data sets. Among the compared methods, RKHS regression and SVR were often the most accurate methods for prediction, followed by GBLUP and LASSO. An R function which allows users to perform RR-BLUP of marker effects, GBLUP and RKHS regression, with a Gaussian, Laplacian, polynomial or ANOVA kernel, in a reasonable computation time has been developed. Moreover, a modified version of this function, which allows users to tune kernels for RKHS regression, has also been developed and parallelized for HPC Linux clusters. The corresponding KRMM package and all scripts have been made publicly available.
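To make the kernel "trick" concrete, here is a minimal, self-contained kernel ridge regression sketch with a Gaussian kernel on simulated marker data carrying a pairwise-epistatic signal. It is a generic illustration of RKHS-type regression, not a reimplementation of the KRMM package, and all sizes and hyperparameters are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X1, X2, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between two marker matrices."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * bandwidth ** 2))

def krr_fit(X, y, lam=1.0, bandwidth=1.0):
    """Kernel ridge regression: alpha = (K + lam*I)^(-1) y."""
    K = gaussian_kernel(X, X, bandwidth)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def krr_predict(X_train, alpha, X_new, bandwidth=1.0):
    return gaussian_kernel(X_new, X_train, bandwidth) @ alpha

# toy genomic data: 100 lines x 50 markers coded 0/1/2, with an epistatic signal
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(100, 50)).astype(float)
y = X[:, 0] * X[:, 1] - X[:, 2] + rng.normal(scale=0.5, size=100)   # pairwise epistasis
alpha = krr_fit(X[:80], y[:80], lam=1.0, bandwidth=5.0)
pred = krr_predict(X[:80], alpha, X[80:], bandwidth=5.0)
print("predictive correlation:", np.corrcoef(pred, y[80:])[0, 1])
```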
Bacteriophage Amplification-Coupled Detection and Identification of Bacterial Pathogens
NASA Astrophysics Data System (ADS)
Cox, Christopher R.; Voorhees, Kent J.
Current methods of species-specific bacterial detection and identification are complex, time-consuming, and often require expensive specialized equipment and highly trained personnel. Numerous biochemical and genotypic identification methods have been applied to bacterial characterization, but all rely on tedious microbiological culturing practices and/or costly sequencing protocols which render them impractical for deployment as rapid, cost-effective point-of-care or field detection and identification methods. With a view towards addressing these shortcomings, we have exploited the evolutionarily conserved interactions between a bacteriophage (phage) and its bacterial host to develop species-specific detection methods. Phage amplification-coupled matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was utilized to rapidly detect phage propagation resulting from species-specific in vitro bacterial infection. This novel signal amplification method allowed for bacterial detection and identification in as little as 2 h, and when combined with disulfide bond reduction methods developed in our laboratory to enhance MALDI-TOF-MS resolution, was observed to lower the limit of detection by several orders of magnitude over conventional spectroscopy and phage typing methods. Phage amplification has been combined with lateral flow immunochromatography (LFI) to develop rapid, easy-to-operate, portable, species-specific point-of-care (POC) detection devices. Prototype LFI detectors have been developed and characterized for Yersinia pestis and Bacillus anthracis, the etiologic agents of plague and anthrax, respectively. Comparable sensitivity and rapidity was observed when phage amplification was adapted to a species-specific handheld LFI detector, thus allowing for rapid, simple, POC bacterial detection and identification while eliminating the need for bacterial culturing or DNA isolation and amplification techniques.
Automatic classification of fluorescence and optical diffusion spectroscopy data in neuro-oncology
NASA Astrophysics Data System (ADS)
Savelieva, T. A.; Loshchenov, V. B.; Goryajnov, S. A.; Potapov, A. A.
2018-04-01
The complexity of spectroscopic analysis of biological tissue, caused by the overlap of the absorption spectra of biological molecules, multiple scattering, and the in vivo measurement geometry, motivates this work. In neuro-oncology, the problem of tumor boundary delineation is especially acute and requires the development of new methods of intraoperative diagnosis. Optical spectroscopy methods allow various diagnostically significant parameters to be detected non-invasively. 5-ALA induced protoporphyrin IX is frequently used as a fluorescent tumor marker in neuro-oncology. At the same time, the haemoglobin concentration and oxygenation level and the marked changes in light scattering in tumor tissue have high diagnostic value. This paper presents an original method for the simultaneous registration of backward diffuse reflectance and fluorescence spectra, which allows all of the parameters listed above to be determined simultaneously. Clinical studies involving 47 patients with intracranial glial tumors of Grades II-IV were carried out in the N.N. Burdenko National Medical Research Center of Neurosurgery. To register the spectral dependences, the spectroscopic system LESA-01-BIOSPEC was used with a specially developed w-shaped diagnostic fiber-optic probe. An original algorithm for combined spectroscopic signal processing was developed. We developed software and hardware that, compared with the methods currently used in neurosurgical practice, increased the sensitivity of intraoperative demarcation of intracranial tumors from 78% to 96% and the specificity from 60% to 82%. A comparison of different automatic classification techniques shows that, in our case, the most appropriate is the k-nearest-neighbors algorithm with a cubic metric.
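The final classification step can be illustrated with a short, generic sketch: k-nearest-neighbors with a cubic (Minkowski, p = 3) distance on standardized spectroscopy-derived features. The feature set and the simulated class distributions below are purely illustrative assumptions, not the clinical data.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical feature table: per-measurement PpIX fluorescence index, haemoglobin
# concentration, oxygenation and a scattering estimate, with tumour/normal labels.
rng = np.random.default_rng(1)
n = 200
X_normal = rng.normal([1.0, 40.0, 0.70, 1.0], [0.3, 8.0, 0.05, 0.2], size=(n, 4))
X_tumour = rng.normal([3.5, 55.0, 0.60, 1.6], [0.8, 10.0, 0.07, 0.3], size=(n, 4))
X = np.vstack([X_normal, X_tumour])
y = np.array([0] * n + [1] * n)

# k-NN with a cubic (Minkowski, p = 3) distance on standardized features
clf = make_pipeline(StandardScaler(),
                    KNeighborsClassifier(n_neighbors=5, metric="minkowski", p=3))
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```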
Integrate Evaluation into the Planning Process.
ERIC Educational Resources Information Center
Camp, William
1985-01-01
In an attempt to correct for limitations in the Program Evaluation and Review Technique-Critical Path Method (PERT-CPM), the Graphical Evaluation and Review Technique (GERT) has been developed. This management tool allows for evaluation during the facilities' development process. Two figures and two references are provided. (DCS)
Batch Isolation of Microsatellites for Tropical Plant Species Pyrosequencing
USDA-ARS?s Scientific Manuscript database
Microsatellites were developed for ten tropical species using a method recently developed in our laboratory that involves a combination of two adapters at the SSR-enrichment stage and allows for cost saving and simultaneous loading of samples. The species for which microsatellites were isolated are...
Ian T. Schmidt; John F. O' Leary; Douglas A. Stow; Kellie A. Uyeda; Philip Riggan
2016-01-01
Development of methods that more accurately estimate spatial distributions of fuel loads in shrublands allows for improved understanding of ecological processes such as wildfire behavior and postburn recovery. The goal of this study is to develop and test
Roger, B; Fernandez, X; Jeannot, V; Chahboun, J
2010-01-01
The essential oil obtained from iris rhizomes is one of the most precious raw materials for the perfume industry. Its fragrance is due to irones that are gradually formed by oxidative degradation of iridals during rhizome ageing. The objective was the development of an alternative method allowing irone quantification in iris rhizomes using HS-SPME-GC. The development of the HS-SPME-GC method was based on the results obtained from a conventional method, i.e. a solid-liquid extraction (SLE) followed by irone quantification by GC. Among several calibration methods tested, internal calibration gave the best results and was the least sensitive to the matrix effect. The proposed method using HS-SPME-GC is as accurate and reproducible as the conventional one using SLE. These two methods were used to monitor and compare irone concentrations in iris rhizomes that had been stored for 6 months to 9 years. Irone quantification in iris rhizomes can be achieved using HS-SPME-GC. This method can thus be used for the quality control of iris rhizomes. It offers the advantage of combining extraction and analysis with an automated device and thus allows a large number of rhizome batches to be analysed and compared in a limited amount of time. Copyright © 2010 John Wiley & Sons, Ltd.
Analysis of methods. [information systems evolution environment
NASA Technical Reports Server (NTRS)
Mayer, Richard J. (Editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.
1991-01-01
Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, S.; Jones, V.
2009-05-27
A new rapid separation method that allows separation and preconcentration of actinides in urine samples was developed for the measurement of longer lived actinides by inductively coupled plasma mass spectrometry (ICP-MS) and short-lived actinides by alpha spectrometry; a hybrid approach. This method uses stacked extraction chromatography cartridges and vacuum box technology to facilitate rapid separations. Preconcentration, if required, is performed using a streamlined calcium phosphate precipitation. Similar technology has been applied to separate actinides prior to measurement by alpha spectrometry, but this new method has been developed with elution reagents now compatible with ICP-MS as well. Purified solutions are split between ICP-MS and alpha spectrometry so that long- and short-lived actinide isotopes can be measured successfully. The method allows for simultaneous extraction of 24 samples (including QC samples) in less than 3 h. Simultaneous sample preparation can offer significant time savings over sequential sample preparation. For example, sequential sample preparation of 24 samples taking just 15 min each requires 6 h to complete. The simplicity and speed of this new method makes it attractive for radiological emergency response. If preconcentration is applied, the method is applicable to larger sample aliquots for occupational exposures as well. The chemical recoveries are typically greater than 90%, in contrast to other reported methods using flow injection separation techniques for urine samples where plutonium yields were 70-80%. This method allows measurement of both long-lived and short-lived actinide isotopes. 239Pu, 242Pu, 237Np, 243Am, 234U, 235U and 238U were measured by ICP-MS, while 236Pu, 238Pu, 239Pu, 241Am, 243Am and 244Cm were measured by alpha spectrometry. The method can also be adapted so that the separation of uranium isotopes for assay is not required, if uranium assay by direct dilution of the urine sample is preferred instead. Multiple vacuum box locations may be set up to supply several ICP-MS units with purified sample fractions such that a high sample throughput may be achieved, while still allowing for rapid measurement of short-lived actinides by alpha spectrometry.
Anderson, Eric C
2012-11-08
Advances in genotyping that allow tens of thousands of individuals to be genotyped at a moderate number of single nucleotide polymorphisms (SNPs) permit parentage inference to be pursued on a very large scale. The intergenerational tagging this capacity allows is revolutionizing the management of cultured organisms (cows, salmon, etc.) and is poised to do the same for scientific studies of natural populations. Currently, however, there are no likelihood-based methods of parentage inference which are implemented in a manner that allows them to quickly handle a very large number of potential parents or parent pairs. Here we introduce an efficient likelihood-based method applicable to the specialized case of cultured organisms in which both parents can be reliably sampled. We develop a Markov chain representation for the cumulative number of Mendelian incompatibilities between an offspring and its putative parents and we exploit it to develop a fast algorithm for simulation-based estimates of statistical confidence in SNP-based assignments of offspring to pairs of parents. The method is implemented in the freely available software SNPPIT. We describe the method in detail, then assess its performance in a large simulation study using known allele frequencies at 96 SNPs from ten hatchery salmon populations. The simulations verify that the method is fast and accurate and that 96 well-chosen SNPs can provide sufficient power to identify the correct pair of parents from amongst millions of candidate pairs.
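The core quantity being tallied, the number of Mendelian incompatibilities between an offspring and a putative parent pair, can be illustrated with a short sketch. This is just the basic trio-exclusion check at biallelic SNPs, not SNPPIT's Markov-chain machinery for simulating the distribution of that count; the genotype coding below is an illustrative convention.

```python
def mendelian_incompatibilities(offspring, mother, father):
    """Count loci at which an offspring genotype is impossible given both parents.

    Genotypes are coded as 0/1/2 copies of the reference allele; -1 marks missing data.
    """
    gametes = {0: {0}, 1: {0, 1}, 2: {1}}       # alleles a parent can transmit
    count = 0
    for o, m, f in zip(offspring, mother, father):
        if -1 in (o, m, f):
            continue                             # skip loci with missing genotypes
        possible = {a + b for a in gametes[m] for b in gametes[f]}
        if o not in possible:
            count += 1
    return count

# toy check: locus 1 is incompatible (both parents homozygous 2, offspring 0)
print(mendelian_incompatibilities([0, 1, 2], [2, 1, 2], [2, 0, 1]))   # -> 1
```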
Nonparametric methods for doubly robust estimation of continuous treatment effects.
Kennedy, Edward H; Ma, Zongming; McHugh, Matthew D; Small, Dylan S
2017-09-01
Continuous treatments (e.g., doses) arise often in practice, but many available causal effect estimators are limited by either requiring parametric models for the effect curve, or by not allowing doubly robust covariate adjustment. We develop a novel kernel smoothing approach that requires only mild smoothness assumptions on the effect curve, and still allows for misspecification of either the treatment density or outcome regression. We derive asymptotic properties and give a procedure for data-driven bandwidth selection. The methods are illustrated via simulation and in a study of the effect of nurse staffing on hospital readmissions penalties.
Periodic response of nonlinear systems
NASA Technical Reports Server (NTRS)
Nataraj, C.; Nelson, H. D.
1988-01-01
A procedure is developed to determine approximate periodic solutions of autonomous and non-autonomous systems. The trigonometric collocation method (TCM) is formalized to allow for the analysis of relatively small order systems directly in physical coordinates. The TCM is extended to large order systems by utilizing modal analysis in a component mode synthesis strategy. The procedure was coded and verified by several check cases. Numerical results for two small order mechanical systems and one large order rotor dynamic system are presented. The method allows for the possibility of approximating periodic responses for large order forced and self-excited nonlinear systems.
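A minimal sketch of the collocation idea on a single-degree-of-freedom forced Duffing oscillator is shown below: the response is approximated by a truncated Fourier series and the equation residual is driven to zero at equally spaced collocation points over one period. This is a generic illustration with assumed coefficients, not the paper's formulation or its component-mode-synthesis extension.

```python
import numpy as np
from scipy.optimize import fsolve

# Forced Duffing oscillator: x'' + c x' + k x + b x^3 = F cos(w t)  (illustrative values)
c, k, b, F, w = 0.1, 1.0, 0.5, 0.3, 1.2
H = 5                                          # harmonics retained in the series

def series(coeffs, t):
    """Evaluate x, x', x'' for x(t) = a0 + sum_n a_n cos(n w t) + b_n sin(n w t)."""
    a0, a, bs = coeffs[0], coeffs[1:H + 1], coeffs[H + 1:]
    n = np.arange(1, H + 1)
    cos = np.cos(np.outer(n, w * t))           # shape (H, len(t))
    sin = np.sin(np.outer(n, w * t))
    x = a0 + a @ cos + bs @ sin
    xd = -(a * n * w) @ sin + (bs * n * w) @ cos
    xdd = -(a * (n * w) ** 2) @ cos - (bs * (n * w) ** 2) @ sin
    return x, xd, xdd

def residual(coeffs):
    """Equation residual at 2H+1 collocation points spread over one period."""
    t = np.linspace(0.0, 2 * np.pi / w, 2 * H + 1, endpoint=False)
    x, xd, xdd = series(coeffs, t)
    return xdd + c * xd + k * x + b * x ** 3 - F * np.cos(w * t)

coeffs = fsolve(residual, 0.1 * np.ones(2 * H + 1))
print("fundamental harmonic amplitude:", np.hypot(coeffs[1], coeffs[H + 1]))
```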
Sokoliess, Torsten; Köller, Gerhard
2005-06-01
A chiral capillary electrophoresis system allowing the determination of the enantiomeric purity of an investigational new drug was developed using a generic method development approach for basic analytes. The method was optimized in terms of type and concentration of both cyclodextrin (CD) and electrolyte, buffer pH, temperature, voltage, and rinsing procedure. Optimal chiral separation of the analyte was obtained using an electrolyte with 2.5% carboxymethyl-beta-CD in 25 mM NaH2PO4 (pH 4.0). Interchanging the inlet and outlet vials after each run improved the method's precision. To assure the method's suitability for the control of enantiomeric impurities in pharmaceutical quality control, its specificity, linearity, precision, accuracy, and robustness were validated according to the requirements of the International Conference on Harmonization. The usefulness of our generic method development approach for the validation of robustness was demonstrated.
Can the electronegativity equalization method predict spectroscopic properties?
Verstraelen, T; Bultinck, P
2015-02-05
The electronegativity equalization method is classically used as a method allowing the fast generation of atomic charges using a set of calibrated parameters and provided knowledge of the molecular structure. Recently, it has started being used for the calculation of other reactivity descriptors and for the development of polarizable and reactive force fields. For such applications, it is of interest to know whether the method, through the inclusion of the molecular geometry in the Taylor expansion of the energy, would also allow sufficiently accurate predictions of spectroscopic data. In this work, relevant quantities for IR spectroscopy are considered, namely the dipole derivatives and the Cartesian Hessian. Despite careful calibration of parameters for this specific task, it is shown that the current models yield insufficiently accurate results. Copyright © 2013 Elsevier B.V. All rights reserved.
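For concreteness, a minimal sketch of the classical charge calculation is given below: the equalization conditions plus charge conservation form a linear system in the atomic charges and the common molecular electronegativity. The atomic parameters and geometry used here are illustrative placeholders, not a calibrated EEM parameter set.

```python
import numpy as np

def eem_charges(chi, eta, coords, total_charge=0.0):
    """Solve the electronegativity-equalization linear system for atomic charges.

    chi, eta : atomic electronegativity and hardness parameters (calibrated in practice)
    coords   : atomic positions; off-diagonal coupling taken as 1/r_ij
    Returns charges q such that all effective electronegativities are equal
    and the charges sum to total_charge.
    """
    n = len(chi)
    A = np.zeros((n + 1, n + 1))
    rhs = np.zeros(n + 1)
    r = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    for i in range(n):
        for j in range(n):
            A[i, j] = 2.0 * eta[i] if i == j else 1.0 / r[i, j]
        A[i, n] = -1.0                 # common (equalized) molecular electronegativity
        rhs[i] = -chi[i]
    A[n, :n] = 1.0                     # charge-conservation constraint
    rhs[n] = total_charge
    sol = np.linalg.solve(A, rhs)
    return sol[:n]                     # last entry is the equalized electronegativity

# illustrative (not calibrated) parameters for a water-like geometry: O, then two H
chi = np.array([8.5, 4.5, 4.5])
eta = np.array([6.0, 8.0, 8.0])
coords = np.array([[0.0, 0.0, 0.0], [0.96, 0.0, 0.0], [-0.24, 0.93, 0.0]])
print(eem_charges(chi, eta, coords))   # expect a negative O charge and two positive H charges
```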
Universal DNA-based methods for assessing the diet of grazing livestock and wildlife from feces.
Pegard, Anthony; Miquel, Christian; Valentini, Alice; Coissac, Eric; Bouvier, Frédéric; François, Dominique; Taberlet, Pierre; Engel, Erwan; Pompanon, François
2009-07-08
Because of the demand for controlling livestock diets, two methods that characterize the DNA of plants present in feces were developed. After DNA extraction from fecal samples, a short fragment of the chloroplastic trnL intron was amplified by PCR using a universal primer pair for plants. The first method generates a signature that is the electrophoretic migration pattern of the PCR product. The second method consists of sequencing several hundred DNA fragments from the PCR product through pyrosequencing. These methods were validated with a blind analysis of feces from concentrate- and pasture-fed lambs. The signature method allowed differentiation of the two diets and confirmed the presence of concentrate in one of them. The pyrosequencing method allowed the identification of up to 25 taxa in a diet. These methods are complementary to the chemical methods already used. They could be applied to the control of diets and the study of food preferences.
ERIC Educational Resources Information Center
Webber, Dana E.
2013-01-01
Using technology to develop a collaborative-reflective teaching practice in a world language education methods course block for teaching certification creates unique opportunities for world language education undergraduates to learn to develop synthecultural competence for education. Such a program allows undergraduates to expand their capacity to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berk, Herbert L.
2018-02-15
The study of this project focused on developing a reduced nonlinear model to describe chirping processes in a fusion plasma. A successful method was developed, with results clear enough to allow an analytic theory to be developed that replicates the long-term response of a nonlinear phase space structure immersed in the MHD continuum.
Characterization of Pulsed Vortex Generator Jets for Active Flow Control
2003-10-01
[Abstract garbled in extraction; recoverable fragments: parameters playing a significant role in the developing vortex were assessed; most of the data processing used Amtec Engineering's Tecplot and was incorporated in a single Tecplot macro; the system allowed experimental data to be obtained very close to ...; ... provided the net momentum flux rate is maintained.]
Bayesian Methods and Confidence Intervals for Automatic Target Recognition of SAR Canonical Shapes
2014-03-27
... and DirectX [22]. The CUDA platform was developed by the NVIDIA Corporation to allow programmers access to the computational capabilities of the GPU, which were used for the intense repetitive computations. Developing CUDA software requires writing code for specialized compilers provided by NVIDIA.
Real Time Network Monitoring and Reporting System
ERIC Educational Resources Information Center
Massengale, Ricky L., Sr.
2009-01-01
With modern system developers able to create intelligent programs that allow machines to learn, modify and evolve themselves, current reactionary methods of detecting and eradicating malicious software code from infected machines are proving to be too costly. Addressing malicious software after an attack is the current methodology…
Infrared Spectroscopic Analysis of Linkage Isomerism in Metal-Thiocyanate Complexes
ERIC Educational Resources Information Center
Baer, Carl; Pike, Jay
2010-01-01
We developed an experiment suitable for an advanced inorganic chemistry laboratory that utilizes a cooperative learning environment, which allows students to develop an empirical method of determining the bonding mode of a series of unknown metal-thiocyanate complexes. Students synthesize the metal-thiocyanate complexes and obtain the FT-IR…
Direct Instruction with Playful Skill Extensions: Action Research in Emergent Literacy Development
ERIC Educational Resources Information Center
Keaton, Jean M.; Palmer, Barbara C.; Nicholas, Karen R.; Lake, Vickie E.
2007-01-01
Direct instruction teaching methods have been found to promote the acquisition of literacy in developing readers. Equally important, learning strategies that allow children to construct knowledge through active participation increase their motivation for reading and writing. This action research was designed to explore the effectiveness of direct…
Injector element characterization methodology
NASA Technical Reports Server (NTRS)
Cox, George B., Jr.
1988-01-01
Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.
Building Reflection with Word Clouds for Online RN to BSN Students.
Volkert, Delene R
Reflection allows students to integrate learning with their personal context, developing deeper knowledge and promoting critical thinking. Word clouds help students develop themes/concepts beyond traditional methods, introducing visual aspects to an online learning environment. Students created word clouds and captions, then responded to those created by peers for a weekly discussion assignment. Students indicated overwhelming support for the use of word clouds to develop deeper understanding of the subject matter. This reflection assignment could be utilized in asynchronous, online undergraduate nursing courses for creative methods of building reflection and developing knowledge for the undergraduate RN to BSN student.
USDA-ARS?s Scientific Manuscript database
Magnetic separation has great advantages over traditional bioseparation methods and has become popular in the development of methods for the detection of bacterial pathogens, viruses, and transgenic crops. Functionalization of magnetic nanoparticles is a key factor in allowing efficient capture of t...
An Educational Model for Disruption of Bacteria for Protein Studies.
ERIC Educational Resources Information Center
Bhaduri, Saumya; Demchick, Paul H.
1984-01-01
A simple, rapid, and safe method has been developed for disrupting bacterial cells for protein studies. The method involved stepwise treatment of cells with acetone and with sodium dodecyl sulfate solution to allow extraction of cellular proteins for analysis by polyacrylamide gel electrophoresis. Applications for instructional purposes are noted.…
Surveying Assessment in Experiential Learning: A Single Campus Study
ERIC Educational Resources Information Center
Yates, Thomas; Wilson, Jay; Purton, Kendra
2015-01-01
The purpose of this study was to determine the methods of experiential assessment in use at a Canadian university and the extent to which they are used. Exploring experiential assessment will allow identification of commonly used methods and facilitate the development of best practices of assessment in the context of experiential learning (EL) at…
Statistical methods for analysing responses of wildlife to human disturbance
Haiganoush K. Preisler; Alan A. Ager; Michael J. Wisdom
2006-01-01
Off-road recreation is increasing rapidly in many areas of the world, and effects on wildlife can be highly detrimental. Consequently, we have developed methods for studying wildlife responses to off-road recreation with the use of new technologies that allow frequent and accurate monitoring of human-wildlife interactions. To...
Erynia radicans as a mycoinsecticide for spruce budworm control
Richard S. Soper
1985-01-01
The entomopathogenic fungus Erynia radicans has been under investigation for several years as a possible alternative to chemical control of the eastern spruce budworm. A commercial production method has been developed which allows the formulation of this pathogen as a mycoinsecticide. A standardized bioassay method was used to select strain RS141 as...
The relationship between total mercury (Hg) concentration in fish scales and in tissues of largemouth bass (Micropterus salmoides) from 20 freshwater sites was developed and evaluated to determine whether scale analysis would allow a non-lethal and convenient method for predicti...
Seasonal trends in response to inoculation of coast live oak with Phytophthora ramorum
Richard S. Dodd; Daniel Hüberli; Tamar Y. Harnik; Brenda O' Dell; Matteo Garbelotto
2006-01-01
We developed a branch cutting inoculation method to provide a controlled system for studying variation in response to inoculation of coast live oak (Quercus agrifolia) with Phytophthora ramorum. This method has advantages over inoculations of trees in the field, in containing the inoculum and in allowing high levels of replication...
Fast Multipole Methods for Three-Dimensional N-body Problems
NASA Technical Reports Server (NTRS)
Koumoutsakos, P.
1995-01-01
We are developing computational tools for the simulations of three-dimensional flows past bodies undergoing arbitrary motions. High resolution viscous vortex methods have been developed that allow for extended simulations of two-dimensional configurations such as vortex generators. Our objective is to extend this methodology to three dimensions and develop a robust computational scheme for the simulation of such flows. A fundamental issue in the use of vortex methods is the ability to employ large numbers of computational elements efficiently to resolve the large range of scales that exist in complex flows. The traditional cost of the method scales as O(N²) as the N computational elements/particles induce velocities at each other, making the method unacceptable for simulations involving more than a few tens of thousands of particles. In the last decade fast methods have been developed that have operation counts of O(N log N) or O(N) (referred to as BH and GR respectively) depending on the details of the algorithm. These methods are based on the observation that the effect of a cluster of particles at a certain distance may be approximated by a finite series expansion. In order to exploit this observation we need to decompose the element population spatially into clusters of particles and build a hierarchy of clusters (a tree data structure): smaller neighboring clusters combine to form a cluster of the next size up in the hierarchy and so on. This hierarchy of clusters allows one to determine efficiently when the approximation is valid. This algorithm is an N-body solver that appears in many fields of engineering and science. Some examples of its diverse use are in astrophysics, molecular dynamics, micro-magnetics, boundary element simulations of electromagnetic problems, and computer animation. More recently these N-body solvers have been implemented and applied in simulations involving vortex methods. Koumoutsakos and Leonard (1995) implemented the GR scheme in two dimensions for vector computer architectures allowing for simulations of bluff body flows using millions of particles. Winckelmans presented three-dimensional, viscous simulations of interacting vortex rings, using vortons and an implementation of a BH scheme for parallel computer architectures. Bhatt presented a vortex filament method to perform inviscid vortex ring interactions, with an alternative implementation of a BH scheme for a Connection Machine parallel computer architecture.
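The tree-code idea can be illustrated with a minimal 2-D sketch: particles are recursively grouped into quadtree cells, each cell stores its total strength and strength-weighted centroid, and a cell is treated as a single "monopole" whenever it is well separated from the evaluation point (opening-angle criterion). This is a generic Barnes-Hut-style illustration with a logarithmic 2-D kernel and an assumed opening angle, not the GR/FMM multipole expansions or the authors' vortex code.

```python
import numpy as np

THETA = 0.5                                   # opening-angle acceptance criterion (illustrative)

class Cell:
    """Quadtree cell storing total strength and strength-weighted centroid (monopole only)."""
    def __init__(self, pts, q, center, size):
        self.size, self.n = size, len(pts)
        self.q = q.sum()
        self.com = (q[:, None] * pts).sum(axis=0) / self.q
        self.children = []
        if self.n > 1:                        # split into four quadrants
            left, down = pts[:, 0] < center[0], pts[:, 1] < center[1]
            for mask, sx, sy in [(left & down, -1, -1), (left & ~down, -1, 1),
                                 (~left & down, 1, -1), (~left & ~down, 1, 1)]:
                if mask.any():
                    c = center + (size / 4.0) * np.array([sx, sy])
                    self.children.append(Cell(pts[mask], q[mask], c, size / 2.0))

def potential(cell, x):
    """2-D logarithmic potential at x; a cell is opened only when it is too close."""
    d = np.linalg.norm(x - cell.com)
    if cell.n == 1 or cell.size / d < THETA:
        return -cell.q * np.log(d)            # far field: single monopole term
    return sum(potential(child, x) for child in cell.children)

rng = np.random.default_rng(0)
pts, q = rng.random((2000, 2)), rng.random(2000)
root = Cell(pts, q, np.array([0.5, 0.5]), 1.0)
x = np.array([2.0, 2.0])                      # evaluation point outside the particle cloud
direct = -(q * np.log(np.linalg.norm(pts - x, axis=1))).sum()
print("tree approximation:", potential(root, x), " direct sum:", direct)
```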
NASA Astrophysics Data System (ADS)
Dushkin, A. V.; Kasatkina, T. I.; Novoseltsev, V. I.; Ivanov, S. V.
2018-03-01
The article proposes a forecasting method that allows the admissible forecasting horizon for the development of the characteristic parameters of a complex information system to be determined from given values of entropy and of the error levels of the first and second kind. The main feature of the method under consideration is that changes in the characteristic parameters of the development of the information system are expressed as the magnitude of the increment in the ratios of its entropy. When a predetermined value of the prediction error ratio, that is, of the entropy of the system, is reached, the characteristic parameters of the system and the depth of the prediction in time are estimated. The resulting values of the characteristics will be optimal, since at that moment the system possesses the best ratio of entropy as a measure of the degree of organization and orderliness of the structure of the system. To construct a method for estimating the depth of prediction, it is expedient to use the principle of maximum entropy.
Probabilistic biological network alignment.
Todor, Andrei; Dobra, Alin; Kahveci, Tamer
2013-01-01
Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method showing that, without sacrificing the running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.
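One ingredient of this setting can be sketched compactly: if each probabilistic interaction is treated as an independent Bernoulli event, the expected number of conserved interactions under a fixed node mapping is simply a sum of interaction probabilities. The sketch below illustrates only that expectation for a given mapping; it is not the authors' alignment algorithm, and the toy networks are illustrative.

```python
def expected_conserved_edges(edges_det, edges_prob, mapping):
    """Expected number of conserved interactions for a fixed node mapping.

    edges_det  : set of (u, v) edges of the deterministic network
    edges_prob : dict {(a, b): p} of interaction probabilities in the probabilistic network
    mapping    : dict from deterministic-network nodes to probabilistic-network nodes
    Each probabilistic interaction is an independent Bernoulli event, so the expectation
    is the sum of the probabilities of the images of the mapped edges.
    """
    total = 0.0
    for u, v in edges_det:
        a, b = mapping.get(u), mapping.get(v)
        if a is None or b is None:
            continue
        total += edges_prob.get((a, b), edges_prob.get((b, a), 0.0))
    return total

# toy example: two mapped edges with probabilities 0.9 and 0.4
edges_det = {("p1", "p2"), ("p2", "p3")}
edges_prob = {("q1", "q2"): 0.9, ("q2", "q3"): 0.4}
mapping = {"p1": "q1", "p2": "q2", "p3": "q3"}
print(expected_conserved_edges(edges_det, edges_prob, mapping))   # 0.9 + 0.4 = 1.3
```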
De Micco, Veronica; Ruel, Katia; Joseleau, Jean-Paul; Aronne, Giovanna
2010-08-01
During cell wall formation and degradation, it is possible to detect cellulose microfibrils assembled into thicker and thinner lamellar structures, respectively, following inverse parallel patterns. The aim of this study was to analyse such patterns of microfibril aggregation and cell wall delamination. The thickness of microfibrils and lamellae was measured on digital images of both growing and degrading cell walls viewed by means of transmission electron microscopy. To objectively detect, measure and classify microfibrils and lamellae into thickness classes, a method based on computerized image analysis combined with graphical and statistical methods was developed. The method allowed common classes of microfibrils and lamellae to be identified in cell walls from different origins. During both the formation and degradation of cell walls, preferential formation of structures with specific thicknesses was evidenced. The results obtained with the developed method allowed objective analysis of patterns of microfibril aggregation and evidenced a trend of doubling/halving of lamellar structures during cell wall formation/degradation in materials of different origins that had undergone different treatments.
Aguirre-Junco, Angel-Ricardo; Colombet, Isabelle; Zunino, Sylvain; Jaulent, Marie-Christine; Leneveut, Laurence; Chatellier, Gilles
2004-01-01
The initial step in the computerization of guidelines is the specification of knowledge from the prose text of the guidelines. We describe a method of knowledge specification based on a structured and systematic analysis of the text allowing detailed specification of a decision tree. We use decision tables to validate the decision algorithm and decision trees to specify and represent this algorithm, along with elementary messages of recommendation. Editing tools are also necessary to facilitate the process of validation and the workflow between the expert physicians who validate the specified knowledge and the computer scientists who encode it in a guideline model. Applied to eleven different guidelines issued by an official agency, the method allows quick and valid computerization and integration into a larger decision support system called EsPeR (Personalized Estimate of Risks). The quality of the text guidelines, however, still needs to be improved. The method used for computerization could help to define a framework usable at the initial step of guideline development in order to produce guidelines ready for electronic implementation.
Successfully Implementing Net-Zero Energy Policy through the Air Force Military Construction Program
2013-03-01
[Table and text fragments garbled in extraction. The recoverable table compared energy options ("Renewable Farms", "On-Site (Distributed Generation)") against several unnamed criteria with "Meets"/"Does not meet" ratings. Recoverable prose: ... independence, nor does it allow for net-zero energy installations. Developing centralized renewable energy farms is another method for obtaining ... a combination of centralized renewable energy farms and distributed generation methods. The specific combination of methods an installation will utilize ...]
Optimization of aerodynamic form of projectile for solving the problem of shooting range increasing
NASA Astrophysics Data System (ADS)
Lipanov, Alexey M.; Korolev, Stanislav A.; Rusyak, Ivan G.
2017-10-01
The article is devoted to the development of methods for solving the problem of external ballistics using a more complete system of motion equations that takes into account rotation and oscillation about the center of mass, and using aerodynamic force and moment coefficients calculated by modeling the hydrodynamics of the flow around the projectile. The developed methods allow the basic ways of increasing the shooting range of artillery to be studied.
Project risk management in the construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya
2018-03-01
This paper presents project risk management methods that make it possible to better identify risks in the construction of high-rise buildings and to manage them throughout the life cycle of the project. One of the project risk management processes is a quantitative analysis of risks. The quantitative analysis usually includes the assessment of the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over the traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the regression analysis of project data. The suggested algorithms used to estimate the parameters of the statistical models allow reliable estimates to be obtained. The theoretical problems of developing robust models based on the methodology of minimax estimation were reviewed, and an algorithm for the situation of asymmetric "contamination" was developed.
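A minimal sketch of a Huber-type robust regression fit is shown below, contrasted with ordinary least squares on data containing a few asymmetrically "contaminated" records. The tuning constant, the toy data and the optimizer choice are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def huber_loss(r, delta=1.345):
    """Huber's rho: quadratic for small residuals, linear beyond delta."""
    small = np.abs(r) <= delta
    return np.where(small, 0.5 * r ** 2, delta * (np.abs(r) - 0.5 * delta))

def robust_fit(X, y, delta=1.345):
    """M-estimate of regression coefficients by minimizing the summed Huber loss."""
    X1 = np.column_stack([np.ones(len(y)), X])        # intercept column
    beta0 = np.linalg.lstsq(X1, y, rcond=None)[0]     # least-squares starting point
    obj = lambda b: huber_loss(y - X1 @ b, delta).sum()
    return minimize(obj, beta0, method="Nelder-Mead").x

# toy project data: cost overrun vs. planned duration, with a few contaminated records
rng = np.random.default_rng(2)
x = rng.uniform(5, 30, 60)
y = 0.8 * x + 2 + rng.normal(0, 1, 60)
y[:5] += 25                                           # asymmetric "contamination"
print("OLS slope:   ", np.polyfit(x, y, 1)[0])
print("robust slope:", robust_fit(x[:, None], y)[1])  # much closer to the true 0.8
```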
Step-Climbing Power Wheelchairs: A Literature Review
Sundaram, S. Andrea; Wang, Hongwu; Ding, Dan
2017-01-01
Background: Power wheelchairs capable of overcoming environmental barriers, such as uneven terrain, curbs, or stairs, have been under development for more than a decade. Method: We conducted a systematic review of the scientific and engineering literature to identify these devices, and we provide brief descriptions of the mechanism and method of operation for each. We also present data comparing their capabilities in terms of step climbing and standard wheelchair functions. Results: We found that all the devices presented allow for traversal of obstacles that cannot be accomplished with traditional power wheelchairs, but the slow speeds and small wheel diameters of some designs make them only moderately effective in the basic area of efficient transport over level ground, and the size and configuration of some others limit maneuverability in tight spaces. Conclusion: We propose that safety and performance test methods more comprehensive than the International Organization for Standardization (ISO) testing protocols be developed for measuring the capabilities of advanced wheelchairs with step-climbing and other environment-negotiating features to allow comparison of their clinical effectiveness. PMID:29339886
Imperial Valley's proposal to develop a guide for geothermal development within its county
NASA Technical Reports Server (NTRS)
Pierson, D. E.
1974-01-01
A plan to develop the geothermal resources of the Imperial Valley of California is presented. The plan consists of development policies and includes text and graphics setting forth the objectives, principles, standards, and proposals. The plan allows developers to know the goals of the surrounding community and provides a method for decision making to be used by county representatives. A summary impact statement for the geothermal development aspects is provided.
Inouye, David I.; Ravikumar, Pradeep; Dhillon, Inderjit S.
2016-01-01
We develop Square Root Graphical Models (SQR), a novel class of parametric graphical models that provides multivariate generalizations of univariate exponential family distributions. Previous multivariate graphical models (Yang et al., 2015) did not allow positive dependencies for the exponential and Poisson generalizations. However, in many real-world datasets, variables clearly have positive dependencies. For example, the airport delay time in New York—modeled as an exponential distribution—is positively related to the delay time in Boston. With this motivation, we give an example of our model class derived from the univariate exponential distribution that allows for almost arbitrary positive and negative dependencies with only a mild condition on the parameter matrix—a condition akin to the positive definiteness of the Gaussian covariance matrix. Our Poisson generalization allows for both positive and negative dependencies without any constraints on the parameter values. We also develop parameter estimation methods using node-wise regressions with ℓ1 regularization and likelihood approximation methods using sampling. Finally, we demonstrate our exponential generalization on a synthetic dataset and a real-world dataset of airport delay times. PMID:27563373
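The node-wise estimation step can be illustrated generically: regress each variable on all the others with an ℓ1 penalty and read the graph off the nonzero coefficients (neighborhood selection). The sketch below uses sklearn's Lasso on simulated, positively dependent delay-like data; it is not the authors' node-conditional SQR objective, and all sizes and penalties are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

def neighborhood_selection(X, alpha=0.05):
    """Estimate graph structure by regressing each variable on all others with an l1 penalty."""
    n, p = X.shape
    support = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = np.delete(np.arange(p), j)
        coef = Lasso(alpha=alpha, max_iter=10000).fit(X[:, others], X[:, j]).coef_
        support[j, others] = np.abs(coef) > 1e-8
    return support | support.T          # symmetrize with the "or" rule

# toy positively dependent data, e.g. paired delay times sharing a common component
rng = np.random.default_rng(3)
z = rng.exponential(1.0, size=(500, 1))
X = np.hstack([z + rng.exponential(0.3, size=(500, 1)),   # delay at airport A
               z + rng.exponential(0.3, size=(500, 1)),   # delay at airport B (shares z with A)
               rng.exponential(1.0, size=(500, 1))])      # unrelated variable
print(neighborhood_selection(X, alpha=0.05))   # an edge is expected between the two delay variables
```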
Quantum chemistry in environmental pesticide risk assessment.
Villaverde, Juan J; López-Goti, Carmen; Alcamí, Manuel; Lamsabhi, Al Mokhtar; Alonso-Prados, José L; Sandín-España, Pilar
2017-11-01
The scientific community and regulatory bodies worldwide currently promote the development of non-experimental tests that produce reliable data for pesticide risk assessment. The use of standard quantum chemistry methods could allow the development of tools to perform a first screening of compounds to be considered for the experimental studies, improving the risk assessment. This results in a better distribution of resources and in better planning, allowing a more exhaustive study of the pesticides and their metabolic products. The current paper explores the potential of quantum chemistry in modelling the toxicity and environmental behaviour of pesticides and their by-products by using electronic descriptors obtained computationally. Quantum chemistry has the potential to estimate the physico-chemical properties of pesticides, including certain chemical reaction mechanisms and their degradation pathways, allowing modelling of the environmental behaviour of both pesticides and their by-products. In this sense, theoretical methods can contribute to a more focused risk assessment of the pesticides used in the market, and may lead to higher quality and safer agricultural products. © 2017 Society of Chemical Industry.
The influence of the infrastructure characteristics in urban road accidents occurrence.
Vieira Gomes, Sandra
2013-11-01
This paper summarizes the results of a study regarding the creation of tools that can be used in intervention methods in the planning and management of urban road networks in Portugal. The first tool is a geocoded database of road accidents that occurred in Lisbon between 2004 and 2007, which allowed the definition of digital maps with the possibility of a wide range of queries and cross-referencing of information. The second tool concerns the development of models to estimate the frequency of accidents on urban networks, according to different disaggregations: road element (intersections and segments); type of accident (accidents with and without pedestrians); and inclusion of explanatory variables related to the road environment. Several methods were used to assess the goodness of fit of the developed models, allowing more robust conclusions. This work aims to contribute to the scientific knowledge of the accident phenomenon in Portugal, with detailed and accurate information on the factors affecting its occurrence. This makes it possible to explicitly include safety aspects in planning and road management tasks. Copyright © 2013 Elsevier Ltd. All rights reserved.
Fu, Liezhen; Wen, Luan; Luu, Nga; Shi, Yun-Bo
2016-01-01
Genome editing with designer nucleases such as TALEN and CRISPR/Cas enzymes has broad applications. Delivery of these designer nucleases into organisms induces various genetic mutations including deletions, insertions and nucleotide substitutions. Characterizing those mutations is critical for evaluating the efficacy and specificity of targeted genome editing. While a number of methods have been developed to identify the mutations, none other than sequencing allows the identification of the most desired mutations, i.e., out-of-frame insertions/deletions that disrupt genes. Here we report a simple and efficient method to visualize and quantify the efficiency of genomic mutations induced by genome-editing. Our approach is based on the expression of a two-color fusion protein in a vector that allows the insertion of the edited region in the genome in between the two color moieties. We show that our approach not only easily identifies developing animals with desired mutations but also efficiently quantifies the mutation rate in vivo. Furthermore, by using LacZα and GFP as the color moieties, our approach can even eliminate the need for a fluorescent microscope, allowing the analysis with simple bright field visualization. Such an approach will greatly simplify the screen for effective genome-editing enzymes and identify the desired mutant cells/animals. PMID:27748423
Final Report, DE-FG01-06ER25718 Domain Decomposition and Parallel Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Widlund, Olof B.
2015-06-09
The goal of this project is to develop and improve domain decomposition algorithms for a variety of partial differential equations such as those of linear elasticity and electromagnetics. These iterative methods are designed for massively parallel computing systems and allow the fast solution of the very large systems of algebraic equations that arise in large scale and complicated simulations. A special emphasis is placed on problems arising from Maxwell's equations. The approximate solvers, the preconditioners, are combined with the conjugate gradient method and must always include a solver for a coarse model in order to have a performance which is independent of the number of processors used in the computer simulation. A recent development allows for an adaptive construction of this coarse component of the preconditioner.
Ex vivo culture of mouse embryonic skin and live-imaging of melanoblast migration.
Mort, Richard L; Keighren, Margaret; Hay, Leonard; Jackson, Ian J
2014-05-19
Melanoblasts are the neural crest derived precursors of melanocytes; the cells responsible for producing the pigment in skin and hair. Melanoblasts migrate through the epidermis of the embryo where they subsequently colonize the developing hair follicles(1,2). Neural crest cell migration is extensively studied in vitro but in vivo methods are still not well developed, especially in mammalian systems. One alternative is to use ex vivo organotypic culture(3-6). Culture of mouse embryonic skin requires the maintenance of an air-liquid interface (ALI) across the surface of the tissue(3,6). High resolution live-imaging of mouse embryonic skin has been hampered by the lack of a good method that not only maintains this ALI but also allows the culture to be inverted and therefore compatible with short working distance objective lenses and most confocal microscopes. This article describes recent improvements to a method that uses a gas permeable membrane to overcome these problems and allow high-resolution confocal imaging of embryonic skin in ex vivo culture(6). By using a melanoblast specific Cre-recombinase expressing mouse line combined with the R26YFPR reporter line we are able to fluorescently label the melanoblast population within these skin cultures. The technique allows live-imaging of melanoblasts and observation of their behavior and interactions with the tissue in which they develop. Representative results are included to demonstrate the capability to live-image 6 cultures in parallel.
NASA Astrophysics Data System (ADS)
Dobrowolska, M.; Velthuis, J.; Frazão, L.; Kikoła, D.
2018-05-01
Nuclear waste is deposited for many years in concrete- or bitumen-filled containers. Over time, hydrogen gas is produced, which can accumulate in bubbles. These pockets of gas may cause bitumen to overflow out of the waste containers and could result in a spread of radioactivity. Muon Scattering Tomography is a non-invasive scanning method developed to examine the unknown content of nuclear waste drums. Here we present a method which allows us to successfully detect bubbles larger than 2 litres and determine their size with a relative uncertainty of 1.55 ± 0.77%. Furthermore, the method allows a distinction to be made between a conglomeration of bubbles and a few smaller gas volumes in different locations.
NASA Astrophysics Data System (ADS)
Fomina, E. V.; Kozhukhova, N. I.; Sverguzova, S. V.; Fomin, A. E.
2018-05-01
In this paper, the regression equations method for the design of a construction material was studied. Regression and polynomial equations representing the correlation between the studied parameters were proposed. The logic design and software interface of the regression equations method are focused on parameter optimization to provide an energy-saving effect at the design stage of autoclaved aerated concrete, considering the replacement of traditionally used quartz sand by a coal-mining by-product such as argillite. A mathematical model, represented by a quadratic polynomial for the design of experiments, was obtained using calculated and experimental data. This allowed the estimation of the relationship between the composition and the final properties of the aerated concrete. The response surface, graphically presented in a nomogram, allowed concrete properties to be estimated in response to variation of the composition within the x-space. The optimal range of argillite content was obtained, leading to a reduction of raw-material demand, development of the target plastic strength of the aerated concrete, and a reduction of the curing time before autoclave treatment. In general, this method allows the design of autoclaved aerated concrete with the required performance without additional resource and time costs.
Use of the method of biosphere compatibility for the assessment of environmental protection methods
NASA Astrophysics Data System (ADS)
Vorobyov, Sergey
2018-01-01
The article is devoted to the question of using the indicator of biosphere compatibility for assessing the effectiveness of environmental protection methods. The indicator of biosphere compatibility was proposed by the vice-president of RAASN (Russian Academy of Architecture and Building Sciences), Doctor of Technical Sciences, Professor V.I. Ilyichev. This indicator allows one to assess, not only qualitatively but also quantitatively, the degree of development of urban areas from the standpoint of preserving the biosphere in urban ecosystems while the city's main functions are realized. The integral indicator of biosphere compatibility allows us to assess not only the current ecological situation in the territory under consideration, but also to forecast its changes for new construction projects or for the reconstruction of existing ones. The indicator of biosphere compatibility, which is a mathematical expression of the tripartite balance (technosphere, biosphere and population of the area), allows us to quantify the effectiveness of different methods of protecting the environment in order to choose the most effective one for the given conditions.
A block-based algorithm for the solution of compressible flows in rotor-stator combinations
NASA Technical Reports Server (NTRS)
Akay, H. U.; Ecer, A.; Beskok, A.
1990-01-01
A block-based solution algorithm is developed for the solution of compressible flows in rotor-stator combinations. The method allows concurrent solution of multiple solution blocks in parallel machines. It also allows a time averaged interaction at the stator-rotor interfaces. Numerical results are presented to illustrate the performance of the algorithm. The effect of the interaction between the stator and rotor is evaluated.
Fundamentals, achievements and challenges in the electrochemical sensing of pathogens.
Monzó, Javier; Insua, Ignacio; Fernandez-Trillo, Francisco; Rodriguez, Paramaconi
2015-11-07
Electrochemical sensors are powerful tools widely used in industrial, environmental and medical applications. The versatility of electrochemical methods allows for the investigation of chemical composition in real time and in situ. Electrochemical detection of specific biological molecules is a powerful means for detecting disease-related markers. In the last 10 years, highly-sensitive and specific methods have been developed to detect waterborne and foodborne pathogens. In this review, we classify the different electrochemical techniques used for the qualitative and quantitative detection of pathogens. The robustness of electrochemical methods allows for accurate detection even in heterogeneous and impure samples. We present a fundamental description of the three major electrochemical sensing methods used in the detection of pathogens and the advantages and disadvantages of each of these methods. In each section, we highlight recent breakthroughs, including the utilisation of microfluidics, immunomagnetic separation and multiplexing for the detection of multiple pathogens in a single device. We also include recent studies describing new strategies for the design of future immunosensing systems and protocols. The high sensitivity and selectivity, together with the portability and the cost-effectiveness of the instrumentation, enhances the demand for further development in the electrochemical detection of microbes.
NASA Astrophysics Data System (ADS)
Mayhew, Christopher A.; Mayhew, Craig M.
2009-02-01
Vision III Imaging, Inc. (the Company) has developed Parallax Image Display (PIDTM) software tools to critically align and display aerial images with parallax differences. Terrain features are rendered obvious to the viewer when critically aligned images are presented alternately at 4.3 Hz. The recent inclusion of digital elevation models in geographic data browsers now allows true three-dimensional parallax to be acquired from virtual globe programs like Google Earth. The authors have successfully developed PID methods and code that allow three-dimensional geographical terrain data to be visualized using temporal parallax differences.
Krettek, Christian; El Naga, Ashraf
2017-10-01
Segmental transport is an effective method of treatment for segmental defects, but the need for external fixation during the transport phase is a disadvantage. To avoid external fixation, we have developed a Cylinder-Kombi-Tube Segmental Transport (CKTST) module for combination with a commercially available motorized lengthening nail. This CKTST module allows for an all-internal segmental bone transport and also allows for optional lengthening if needed. The concept and surgical technique of CKTST are described and illustrated with a clinical case.
Roda, Aldo; Mirasoli, Mara; Venturoli, Simona; Cricca, Monica; Bonvicini, Francesca; Baraldini, Mario; Pasini, Patrizia; Zerbini, Marialuisa; Musiani, Monica
2002-10-01
To allow multianalyte binding assays, we have developed a novel polystyrene microtiter plate containing 24 main wells, each divided into 7 subwells. We explored its clinical potential by developing a PCR-chemiluminescent immunoassay (PCR-CLEIA) for simultaneous detection and typing of seven high oncogenic risk human papillomavirus (HPV) DNAs in one well. Seven different oligonucleotide probes, each specific for a high-risk HPV genotype, were separately immobilized in the subwells. Subsequently, a digoxigenin-labeled consensus PCR amplification product was added to the main well. The PCR product hybridized to the immobilized probe corresponding to its genotype and was subsequently detected by use of a peroxidase-labeled anti-digoxigenin antibody and chemiluminescence imaging with an ultrasensitive charge-coupled device camera. Results obtained for 50 cytologic samples were compared with those obtained with a conventional colorimetric PCR-ELISA. The method was specific and allowed detection of 50 genome copies of HPV 16, 18, 33, and 58, and 100 genome copies of HPV 31, 35, and 45. Intra- and interassay CVs for the method were 5.6% and 7.9%, respectively. All results obtained for clinical samples were confirmed by the conventional PCR-ELISA. PCR-CLEIA allows rapid, single-tube simultaneous detection and typing of seven high-risk HPV DNAs with small reagent volumes. The principle appears applicable to the development of other single-tube panels of tests.
Confocal Imaging of Early Heart Development in Xenopus laevis
Kolker, Sandra J.; Tajchman, Urszula; Weeks, Daniel L.
2013-01-01
Xenopus laevis provides a number of advantages for studies on cardiovascular development. The embryos are fairly large, easy to obtain, and can develop at ambient temperature in simple buffer solutions. Although classic descriptions of heart development exist, the ability to use whole mount immunohistochemical methods and confocal microscopy may enhance the ability to understand both normal and experimentally perturbed cardiovascular development. We have started to examine the early stages of cardiac development in Xenopus, seeking to identify antibodies and fixatives that allow easy examination of the developing heart. We have used monoclonal antibodies (mAbs) raised against bovine cardiac troponin T and chicken tropomyosin to visualize cardiac muscle, a goat antibody recognizing bovine type VI collagen to stain the lining of vessels, and the JB3 mAb raised against chicken fibrillin which allows the visualization of a variety of cardiovascular tissues during early development. Results from embryonic stages 24–46 are presented. PMID:10644411
Using a Density-Management Diagram to Develop Thinning Schedules for Loblolly Pine Plantations
Thomas J. Dean; V. Clark Baldwin
1993-01-01
A method for developing thinning schedules using a density-management diagram is presented. A density-management diagram is a form of stocking chart based on patterns of natural stand development. The diagram allows rotation diameter and the upper and lower limits of growing stock to be easily transformed into before and after thinning densities. Site height lines on...
6-D, A Process Framework for the Design and Development of Web-based Systems.
ERIC Educational Resources Information Center
Christian, Phillip
2001-01-01
Explores how the 6-D framework can form the core of a comprehensive systemic strategy and help provide a supporting structure for more robust design and development while allowing organizations to support whatever methods and models best suit their purpose. 6-D stands for the phases of Web design and development: Discovery, Definition, Design,…
Model Robust Calibration: Method and Application to Electronically-Scanned Pressure Transducers
NASA Technical Reports Server (NTRS)
Walker, Eric L.; Starnes, B. Alden; Birch, Jeffery B.; Mays, James E.
2010-01-01
This article presents the application of a recently developed statistical regression method to the controlled instrument calibration problem. The statistical method of Model Robust Regression (MRR), developed by Mays, Birch, and Starnes, is shown to improve instrument calibration by reducing the reliance of the calibration on a predetermined parametric (e.g. polynomial, exponential, logarithmic) model. This is accomplished by allowing fits from the predetermined parametric model to be augmented by a certain portion of a fit to the residuals from the initial regression using a nonparametric (locally parametric) regression technique. The method is demonstrated for the absolute scale calibration of silicon-based pressure transducers.
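The augmentation step that defines MRR — a parametric fit plus a portion of a nonparametric fit to its residuals — can be sketched as follows. This is a minimal illustration under assumptions: a Gaussian kernel smoother stands in for the local regression, the mixing weight lam is fixed by hand rather than selected as in the published method, and the calibration data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration data: a slightly non-polynomial response with noise (assumed).
x = np.linspace(0.0, 1.0, 40)
y = 2.0 + 3.0 * x + 0.3 * np.sin(6.0 * x) + rng.normal(0.0, 0.05, x.size)

# Step 1: predetermined parametric model (here a quadratic polynomial).
p = np.polyfit(x, y, 2)
y_par = np.polyval(p, x)
resid = y - y_par

# Step 2: nonparametric (kernel) smooth of the residuals.
def kernel_smooth(x_train, r_train, x_eval, bandwidth=0.08):
    w = np.exp(-0.5 * ((x_eval[:, None] - x_train[None, :]) / bandwidth) ** 2)
    return (w @ r_train) / w.sum(axis=1)

r_hat = kernel_smooth(x, resid, x)

# Step 3: augment the parametric fit with a portion lam of the residual fit.
lam = 0.5  # mixing weight in [0, 1]; its data-driven selection is not shown here
y_mrr = y_par + lam * r_hat

print("RMSE parametric only:", np.sqrt(np.mean((y - y_par) ** 2)))
print("RMSE model-robust   :", np.sqrt(np.mean((y - y_mrr) ** 2)))
```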
Real-time application of knowledge-based systems
NASA Technical Reports Server (NTRS)
Brumbaugh, Randal W.; Duke, Eugene L.
1989-01-01
The Rapid Prototyping Facility (RPF) was developed to meet a need for a facility which allows flight systems concepts to be prototyped in a manner which allows for real-time flight test experience with a prototype system. This need was focused during the development and demonstration of the expert system flight status monitor (ESFSM). The ESFSM was a prototype system developed on a LISP machine, but lack of a method for progressive testing and problem identification led to an impractical system. The RPF concept was developed, and the ATMS designed to exercise its capabilities. The ATMS Phase 1 demonstration provided a practical vehicle for testing the RPF, as well as a useful tool. ATMS Phase 2 development continues. A dedicated F-18 is expected to be assigned for facility use in late 1988, with RAV modifications. A knowledge-based autopilot is being developed using the RPF. This is a system which provides elementary autopilot functions and is intended as a vehicle for testing expert system verification and validation methods. An expert system propulsion monitor is being prototyped. This system provides real-time assistance to an engineer monitoring a propulsion system during a flight.
Steponas Kolupaila's contribution to hydrological science development
NASA Astrophysics Data System (ADS)
Valiuškevičius, Gintaras
2017-08-01
Steponas Kolupaila (1892-1964) was an important figure in 20th century hydrology and one of the pioneers of scientific water gauging in Europe. His research on the reliability of hydrological data and measurement methods was particularly important and contributed to the development of empirical hydrological calculation methods. Kolupaila was one of the first who standardised water-gauging methods internationally. He created several original hydrological and hydraulic calculation methods (his discharge assessment method for winter period was particularly significant). His innate abilities and frequent travel made Kolupaila a universal specialist in various fields and an active public figure. He revealed his multilayered scientific and cultural experiences in his most famous book, Bibliography of Hydrometry. This book introduced the unique European hydrological-measurement and computation methods to the community of world hydrologists at that time and allowed the development and adaptation of these methods across the world.
NASA Astrophysics Data System (ADS)
Belyaev, V. P.; Mishchenko, S. V.; Belyaev, P. S.
2018-01-01
Ensuring non-destructive testing of products in industry is an urgent task. Most modern methods for determining the diffusion coefficient in porous materials have been developed for bodies of a given configuration and size, which makes it necessary to destroy finished products in order to prepare experimental samples from them. The purpose of this study is to develop a dynamic method that allows the diffusion coefficient in finished products made of porous materials to be determined quickly and without destroying them. The method is designed to investigate the solvent diffusion coefficient in building constructions made of materials with a porous structure: brick, concrete and aerated concrete, gypsum, cement, gypsum or silicate mortars, gas silicate blocks, heat insulators, etc. A mathematical model of the method is constructed. The influence of the design and operating parameters of the measuring device on the accuracy of the method is studied. Results of applying the developed method to structural porous products are presented.
NASA Astrophysics Data System (ADS)
Bakaeva, N. V.; Vorobyov, S. A.; Chernyaeva, I. V.
2017-11-01
The article is devoted to the issue of using the biosphere compatibility indicator to assess the effectiveness of environmental protection methods. The indicator biosphere compatibility was proposed by the vice-president of RAASN (Russian Academy of Architecture and Building Sciences), Doctor of Technical Sciences, Professor V.I. Ilyichev. This indicator allows one to assess not only qualitatively but also quantitatively the degree of urban areas development from the standpoint of preserving the biosphere in urban ecosystems while performing the city’s main functions. The integral biosphere compatibility indicator allows us to assess not only the current ecological situation in the territory under consideration but also to plan the forecast of its changes for the new construction projects implementation or for the reconstruction of the existing ones. The biosphere compatibility indicator, which is a mathematical expression of the tripartite balance (technosphere, biosphere and population of this area), allows us to quantify the effectiveness degree of different methods for environment protection to choose the most effective one under these conditions.
Multigrid methods with space–time concurrency
Falgout, R. D.; Friedhoff, S.; Kolev, Tz. V.; ...
2017-10-06
Here, we consider the comparison of multigrid methods for parabolic partial differential equations that allow space–time concurrency. With current trends in computer architectures leading towards systems with more, but not faster, processors, space–time concurrency is crucial for speeding up time-integration simulations. In contrast, traditional time-integration techniques impose serious limitations on parallel performance due to the sequential nature of the time-stepping approach, allowing spatial concurrency only. This paper considers the three basic options of multigrid algorithms on space–time grids that allow parallelism in space and time: coarsening in space and time, semicoarsening in the spatial dimensions, and semicoarsening in the temporal dimension. We develop parallel software and performance models to study the three methods at scales of up to 16K cores and introduce an extension of one of them for handling multistep time integration. We then discuss advantages and disadvantages of the different approaches and their benefit compared to traditional space-parallel algorithms with sequential time stepping on modern architectures.
Method of Calculating the Correction Factors for Cable Dimensioning in Smart Grids
NASA Astrophysics Data System (ADS)
Simutkin, M.; Tuzikova, V.; Tlusty, J.; Tulsky, V.; Muller, Z.
2017-04-01
One of the main causes of overloading of electrical equipment by higher-harmonic currents is the large increase in the number of non-linear electricity consumers. Non-sinusoidal voltages and currents affect the operation of electrical equipment, reducing its lifetime, increasing voltage and power losses in the network and reducing its capacity. Existing standards that limit the emission of higher-harmonic currents cannot guarantee that interference in the power grid is kept to a safe level. The article presents a method for determining a correction factor to the long-term allowable current of a cable that accounts for this influence. Using mathematical models in the Elcut software, the thermal processes in a cable carrying non-sinusoidal current were described. The theoretical principles, methods and mathematical models developed in the article make it possible to calculate the correction factor accounting for the effect of higher harmonics in the current spectrum for network equipment under any type of non-linear load.
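The paper obtains its correction factor from finite-element thermal models of the cable built in Elcut; that workflow is not reproduced here. As a much simpler stand-in, the sketch below derives a derating factor from an assumed harmonic current spectrum and assumed per-harmonic AC-resistance ratios (an RMS-equivalent-heating argument); every number is a placeholder.

```python
import numpy as np

# Assumed harmonic spectrum of the load current, as fractions of the fundamental.
harmonics = np.array([1, 3, 5, 7, 11, 13])
i_h = np.array([1.00, 0.20, 0.15, 0.10, 0.05, 0.04])

# Assumed AC/DC resistance ratios per harmonic order (skin and proximity effects
# make conductor resistance grow with frequency; these values are placeholders).
r_ratio = np.array([1.00, 1.05, 1.12, 1.20, 1.35, 1.45])

# Heat generated relative to a purely sinusoidal current of the same fundamental RMS.
heating = np.sum(r_ratio * i_h**2) / (r_ratio[0] * i_h[0]**2)

# Correction (derating) factor applied to the cable's long-term allowable current
# so that total conductor losses do not exceed the sinusoidal design case.
k_correction = 1.0 / np.sqrt(heating)
print(f"relative heating: {heating:.3f}, correction factor: {k_correction:.3f}")
```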
Azevedo de Brito, Wanessa; Gomes Dantas, Monique; Andrade Nogueira, Fernando Henrique; Ferreira da Silva-Júnior, Edeildo; Xavier de Araújo-Júnior, João; Aquino, Thiago Mendonça de; Adélia Nogueira Ribeiro, Êurica; da Silva Solon, Lilian Grace; Soares Aragão, Cícero Flávio; Barreto Gomes, Ana Paula
2017-08-30
Guanylhydrazones are molecules with great pharmacological potential in various therapeutic areas, including antitumoral activity. Factorial design is an excellent tool for the optimization of a chromatographic method, because factors such as temperature, mobile phase composition, mobile phase pH and column length, among others, can be changed quickly to establish the optimal conditions of analysis. The aim of the present work was to develop and validate HPLC and UHPLC methods for the simultaneous determination of guanylhydrazones with anticancer activity employing experimental design. Precise, accurate, linear and robust HPLC and UHPLC methods were developed and validated for the simultaneous quantification of the guanylhydrazones LQM10, LQM14 and LQM17. The UHPLC method was more economical, with four times lower solvent consumption and a 20 times smaller injection volume, which allowed better column performance. Comparing the empirical approach employed in the HPLC method development with the DoE approach employed in the UHPLC method development, we conclude that the factorial design made method development faster, more practical and more rational. This resulted in methods that can be employed in the analysis, evaluation and quality control of these new synthetic guanylhydrazones.
NASA Astrophysics Data System (ADS)
Berthias, F.; Feketeová, L.; Della Negra, R.; Dupasquier, T.; Fillol, R.; Abdoul-Carime, H.; Farizon, B.; Farizon, M.; Märk, T. D.
2017-08-01
In the challenging field of imaging molecular dynamics, a novel method has been developed and implemented that allows the measurement of the velocity of neutral fragments produced in collision induced dissociation experiments on an event-by-event basis. This has been made possible by combining a correlated ion and neutral time of flight method with a velocity map imaging technique. This new method relies on a multiparametric correlated detection of the neutral and charged fragments from collision induced dissociation on one single detector. Its implementation on the DIAM device (Device for irradiation of biomolecular clusters) (Dispositif d'Irradiation d'Agrégats bioMoléculaires) allowed us to measure the velocity distribution of water molecules evaporated from collision induced dissociation of mass- and energy-selected protonated water clusters.
KazRAM: Build Your Own Raman Spectrometer for Environmental Science Education in Kazakhstan
NASA Astrophysics Data System (ADS)
Redfern, S. A. T.; Seitkan, A.
2016-12-01
The development of field-based spectroscopic investigations in Eastern Kazakhstan has been held back by the lack of access to spectroscopic methods and technologies. This has been addressed in this project, in which we use a modular system of construction to allow a Raman spectrometer to be built in the university classroom. In collaboration with scientists at East Kazakhstan State University, the team at Cambridge University has designed and developed an instrument that can be replicated in the near-field environment in Central Asia. This allows students to gain a first-hand understanding of the principles and practice of Raman spectroscopy by constructing their own instrument. The project will then allow measurement of key samples in biological ecology settings as well as in geological and mining exploration contexts.
Evaluation of flaws in carbon steel piping. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zahoor, A.; Gamble, R.M.; Mehta, H.S.
1986-10-01
The objective of this program was to develop flaw evaluation procedures and allowable flaw sizes for ferritic piping used in light water reactor (LWR) power generation facilities. The program results provide relevant ASME Code groups with the information necessary to define flaw evaluation procedures, allowable flaw sizes, and their associated bases for Section XI of the code. Because there are several possible flaw-related failure modes for ferritic piping over the LWR operating temperature range, three analysis methods were employed to develop the evaluation procedures. These include limit load analysis for plastic collapse, elastic plastic fracture mechanics (EPFM) analysis for ductile tearing, and linear elastic fracture mechanics (LEFM) analysis for non ductile crack extension. To ensure the appropriate analysis method is used in an evaluation, a step by step procedure also is provided to identify the relevant acceptance standard or procedure on a case by case basis. The tensile strength and toughness properties required to complete the flaw evaluation for any of the three analysis methods are included in the evaluation procedure. The flaw evaluation standards are provided in tabular form for the plastic collapse and ductile tearing modes, where the allowable part through flaw depth is defined as a function of load and flaw length. For non ductile crack extension, linear elastic fracture mechanics analysis methods, similar to those in Appendix A of Section XI, are defined. Evaluation flaw sizes and procedures are developed for both longitudinal and circumferential flaw orientations and normal/upset and emergency/faulted operating conditions. The tables are based on margins on load of 2.77 and 1.39 for circumferential flaws and 3.0 and 1.5 for longitudinal flaws for normal/upset and emergency/faulted conditions, respectively.
Paths of Improving the Technological Process of Manufacture of GTE Turbine Blades
NASA Astrophysics Data System (ADS)
Vdovin, R. A.; Smelov, V. G.; Bolotov, M. A.; Pronichev, N. D.
2016-08-01
The article analyses the problems encountered in the manufacture of turbine blades for gas-turbine engines and power stations and proposes ways of improving the blade manufacturing process. The main systems used for locating blades during machining and the methods used at manufacturing plants to inspect the machined blades are analysed, with their merits and shortcomings indicated. Criteria in the form of mathematical models of the spatial distribution of the machining allowance, accounting for a uniform distribution of the allowance over the aerofoil profile, are developed. The considered methods make it possible to reduce the reject rate and to reduce the labour required for polishing the aerofoil portion of turbine blades.
Quiet Sonic Booms: A NASA and Industry Progress Report
NASA Technical Reports Server (NTRS)
Larson, David Nils; Martin, Roy; Haering, Edward A.
2011-01-01
The purpose of this Oral Presentation is to present a progress report on NASA and Industry efforts related to Quiet Sonic Boom Program activities. This presentation will review changes in aircraft shaping to produce quiet supersonic booms and associated supersonic flight test methods and results. In addition, new flight test profiles have been recently developed that have allowed for the generation of sonic booms of varying intensity. These new flight test profiles have allowed for ground testing of the response of various building structures to sonic booms and the associated public acceptability to various sonic boom intensities. The new flight test profiles and associated ground measurement test methods will be reviewed. Finally, this Oral Presentation will review the International Regulatory requirements that would be involved to change aviation regulation and allow for overland quiet supersonic flight.
Alcohol-related hot-spot analysis and prediction : final report.
DOT National Transportation Integrated Search
2017-05-01
This project developed methods to more accurately identify alcohol-related crash hot spots, ultimately allowing for more effective and efficient enforcement and safety campaigns. Advancements in accuracy came from improving the calculation of spatial...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-31
..., which allows for demonstration and pilot projects for the purpose of developing and implementing techniques and approaches, and demonstrating the effectiveness of specialized methods, in addressing...
Assessment of eutrophication in estuaries: Pressure-state-response and source apportionment
David Whitall; Suzanne Bricker
2006-01-01
The National Estuarine Eutrophication Assessment (NEEA) Update Program is a management oriented program designed to improve monitoring and assessment efforts through the development of type specific classification of estuaries that will allow improved assessment methods and development of analytical and research models and tools for managers which will help guide and...
An Educational Development Tool Based on Principles of Formal Ontology
ERIC Educational Resources Information Center
Guzzi, Rodolfo; Scarpanti, Stefano; Ballista, Giovanni; Di Nicolantonio, Walter
2005-01-01
Computer science provides us with virtual laboratories, places where one can merge real experiments with the formalism of algorithms and mathematics and where, with the advent of multimedia, sounds and movies can also be added. In this paper we present a method, based on principles of formal ontology, allowing one to develop interactive educational…
Developing design methods of concrete mix with microsilica additives for road construction
NASA Astrophysics Data System (ADS)
Dmitrienko, Vladimir; Shrivel, Igor; Kokunko, Irina; Pashkova, Olga
2017-10-01
Based on laboratory test results, regression equations relating the standard-cone slump and the concrete strength to the required amounts of cement, water and microsilica were obtained. The joint solution of these equations allowed the researchers to develop an algorithm for designing heavy concrete compositions with microsilica additives for road construction.
What's Working Memory Got to Do with It? A Case Study on Teenagers
ERIC Educational Resources Information Center
Price, Andrew; Oliver, Mary; McGrane, Joshua
2015-01-01
This paper presents the results of a small-scale study concerned with the development of working memory during adolescence. The working memory of adolescent students was examined with a novel method, electroencephalography, which allowed insight into the neurological development of the students. Results showed that: electroencephalography is a…
NASA Technical Reports Server (NTRS)
Cornelius, Michael; Smartt, Ziba; Henrie, Vaughn; Johnson, Mont
2003-01-01
The recent developments in Fabry-Perot fiber optic instruments have resulted in accurate transducers with some of the physical characteristics required for obtaining internal data from solid rocket motors. These characteristics include small size, non-electrical excitation, and immunity to electromagnetic interference. These transducers have not previously been utilized in this environment due to the high temperatures typically encountered. A series of tests was conducted using an 11-inch hybrid test bed to develop installation techniques that allow the fiber optic instruments to survive and obtain data for a short period of time following motor ignition. The installation methods developed during this test series have the potential to allow data to be acquired in the motor chamber, propellant bore, and nozzle during the ignition transient. These measurements would prove very useful in the characterization of current motor designs and provide insight into the requirements for further refinements. The process of developing these protective methods and the installation techniques used to apply them is summarized.
1995-09-12
DCAM, developed by MSFC, grows crystals by the dialysis and liquid-liquid diffusion methods. In both methods, protein crystal growth is induced by changing conditions in the protein. In dialysis, a semipermeable membrane retains the protein solution in one compartment, while allowing molecules of precipitant to pass freely through the membrane from an adjacent compartment. As the precipitant concentration increases within the protein compartment, crystallization begins. In liquid-liquid diffusion, a protein solution and a precipitant solution are layered in a container and allowed to diffuse into each other. This leads to conditions which may induce crystallization of the protein. Liquid-liquid diffusion is difficult on Earth because density and temperature differences cause the solutions to mix rapidly.
A Numerical Optimization Approach for Tuning Fuzzy Logic Controllers
NASA Technical Reports Server (NTRS)
Woodard, Stanley E.; Garg, Devendra P.
1998-01-01
This paper develops a method to tune fuzzy controllers using numerical optimization. The main attribute of this approach is that it allows fuzzy logic controllers to be tuned to achieve global performance requirements. Furthermore, this approach allows design constraints to be implemented during the tuning process. The method tunes the controller by parameterizing the membership functions for error, change-in-error and control output. The resulting parameters form a design vector which is iteratively changed to minimize an objective function. The minimal objective function results in an optimal performance of the system. A spacecraft mounted science instrument line-of-sight pointing control is used to demonstrate results.
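The tuning loop the paper describes — parameterize the membership functions, then drive the parameters with a numerical optimizer to minimize an objective — can be illustrated with a toy regulator. The sketch below is not the paper's spacecraft pointing problem: the single-input controller, the first-order plant, the Nelder-Mead optimizer and all numerical values are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fuzzy_control(err, params):
    """Single-input controller: three triangular sets on the error, singleton outputs."""
    w, uo = params[:3], params[3:]                  # w = set half-widths, uo = output levels
    mu = np.array([tri(err, -2 * w[0], -w[0], 0.0),  # Negative
                   tri(err, -w[1], 0.0, w[1]),       # Zero
                   tri(err, 0.0, w[2], 2 * w[2])])   # Positive
    if mu.sum() < 1e-9:
        return 0.0
    return float(mu @ uo / mu.sum())                 # weighted-average defuzzification

def cost(params, setpoint=1.0, dt=0.02, steps=300):
    """Integral-squared error of a first-order plant dx/dt = -x + u under the controller."""
    x, J = 0.0, 0.0
    for _ in range(steps):
        e = setpoint - x
        u = fuzzy_control(e, params)
        x += dt * (-x + u)
        J += dt * e * e
    return J

# Initial membership half-widths and output singletons (assumed starting guess).
p0 = np.array([1.0, 1.0, 1.0, -1.0, 0.0, 1.0])
res = minimize(cost, p0, method="Nelder-Mead")
print("tuned parameters:", np.round(res.x, 3), " cost:", round(res.fun, 4))
```

Design constraints of the kind mentioned in the abstract could be handled in this sketch by adding penalty terms to the cost function or by switching to a constrained optimizer.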
On the theory of evolution of particulate systems
NASA Astrophysics Data System (ADS)
Buyevich, Yuri A.; Alexandrov, Dmitri V.
2017-04-01
An analytical method for the description of particulate systems at sufficiently long times is developed. This method allows us to obtain very simple analytical expressions for the particle distribution function. The method under consideration can be applied to a number of practically important problems including evaporation of a polydisperse mist, dissolution of dispersed solids, combustion of dispersed propellants, physical and chemical transformation of powders and phase transitions in metastable materials.
ERIC Educational Resources Information Center
Petrovskaya, Maria V.; Larionova, Anna A.; Zaitseva, Natalia A.; Bondarchuk, Natalya V.; Grigorieva, Elena M.
2016-01-01
The relevance of the problem stated in the article is that in conditions of nonstationary economy the modification of existing approaches and methods is necessary during the formation of the capital. These methods allow taking into account the heterogeneity of factors' change in time and the purpose of the development of a particular company,…
A distance limited method for sampling downed coarse woody debris
Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams
2012-01-01
A new sampling method for down coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...
Mapping ecological systems with a random forest model: tradeoffs between errors and bias
Emilie Grossmann; Janet Ohmann; James Kagan; Heather May; Matthew Gregory
2010-01-01
New methods for predictive vegetation mapping allow improved estimations of plant community composition across large regions. Random Forest (RF) models limit over-fitting problems of other methods, and are known for making accurate classification predictions from noisy, nonnormal data, but can be biased when plot samples are unbalanced. We developed two contrasting...
7 CFR 400.712 - Research and development reimbursement, maintenance reimbursement, and user fees.
Code of Federal Regulations, 2012 CFR
2012-01-01
.... Documentation of actual costs allowed under this section will be used to determine any reimbursement. (c) To be... requested, and all supporting documentation, must be submitted to FCIC by electronic method or by hard copy... supporting documentation, must be submitted to FCIC by electronic method or by hard copy and received by FCIC...
7 CFR 400.712 - Research and development reimbursement, maintenance reimbursement, and user fees.
Code of Federal Regulations, 2014 CFR
2014-01-01
.... Documentation of actual costs allowed under this section will be used to determine any reimbursement. (c) To be... requested, and all supporting documentation, must be submitted to FCIC by electronic method or by hard copy... supporting documentation, must be submitted to FCIC by electronic method or by hard copy and received by FCIC...
7 CFR 400.712 - Research and development reimbursement, maintenance reimbursement, and user fees.
Code of Federal Regulations, 2013 CFR
2013-01-01
.... Documentation of actual costs allowed under this section will be used to determine any reimbursement. (c) To be... requested, and all supporting documentation, must be submitted to FCIC by electronic method or by hard copy... supporting documentation, must be submitted to FCIC by electronic method or by hard copy and received by FCIC...
Peptide and protein biomarkers for type 1 diabetes mellitus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Qibin; Metz, Thomas O.
A method for identifying persons with increased risk of developing type 1 diabetes mellitus, or having type I diabetes mellitus, utilizing selected biomarkers described herein either alone or in combination. The present disclosure allows for broad based, reliable, screening of large population bases. Also provided are arrays and kits that can be used to perform such methods.
Developing operation algorithms for vision subsystems in autonomous mobile robots
NASA Astrophysics Data System (ADS)
Shikhman, M. V.; Shidlovskiy, S. V.
2018-05-01
The paper analyzes algorithms for selecting keypoints on the image for the subsequent automatic detection of people and obstacles. The algorithm is based on the histogram of oriented gradients and the support vector method. The combination of these methods allows successful selection of dynamic and static objects. The algorithm can be applied in various autonomous mobile robots.
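The HOG-plus-SVM combination named above is available off the shelf in OpenCV, so a hedged sketch of a people-detection pass might look as follows. It uses OpenCV's pretrained pedestrian detector rather than anything trained by the authors, and the image path is a placeholder.

```python
import cv2

# Load a test frame from the robot's camera (placeholder path).
frame = cv2.imread("frame.png")

# Histogram-of-oriented-gradients descriptor with OpenCV's pretrained
# linear-SVM people detector.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

# Sliding-window detection over an image pyramid.
rects, weights = hog.detectMultiScale(frame, winStride=(8, 8), padding=(8, 8), scale=1.05)

# Draw the detections and report their SVM confidence weights.
for (x, y, w, h), score in zip(rects, weights):
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    print("person candidate at", (x, y, w, h), "weight", score)

cv2.imwrite("frame_detections.png", frame)
```

Static obstacles would need a separately trained detector or a different cue; the pretrained model above covers only the pedestrian case.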
Spatial Visualization Learning in Engineering: Traditional Methods vs. a Web-Based Tool
ERIC Educational Resources Information Center
Pedrosa, Carlos Melgosa; Barbero, Basilio Ramos; Miguel, Arturo Román
2014-01-01
This study compares an interactive learning manager for graphic engineering to develop spatial vision (ILMAGE_SV) to traditional methods. ILMAGE_SV is an asynchronous web-based learning tool that allows the manipulation of objects with a 3D viewer, self-evaluation, and continuous assessment. In addition, student learning may be monitored, which…
Peptide and protein biomarkers for type 1 diabetes mellitus
Zhang, Qibin; Metz, Thomas O.
2014-06-10
A method for identifying persons with increased risk of developing type 1 diabetes mellitus, or having type I diabetes mellitus, utilizing selected biomarkers described herein either alone or in combination. The present disclosure allows for broad based, reliable, screening of large population bases. Also provided are arrays and kits that can be used to perform such methods.
A smoothed residual based goodness-of-fit statistic for nest-survival models
Rodney X. Sturdivant; Jay J. Rotella; Robin E. Russell
2008-01-01
Estimating nest success and identifying important factors related to nest-survival rates is an essential goal for many wildlife researchers interested in understanding avian population dynamics. Advances in statistical methods have led to a number of estimation methods and approaches to modeling this problem. Recently developed models allow researchers to include a...
Thrust modeling for hypersonic engines
NASA Technical Reports Server (NTRS)
Riggins, D. W.; Mcclinton, C. R.
1995-01-01
Expressions for the thrust losses of a scramjet engine are developed in terms of irreversible entropy increases and the degree of incomplete combustion. A method is developed which allows the calculation of the lost vehicle thrust due to different loss mechanisms within a given flow-field. This analysis demonstrates clearly the trade-off between mixing enhancement and resultant increased flow losses in scramjet combustors. An engine effectiveness parameter is defined in terms of thrust loss. Exergy and the thrust-potential method are related and compared.
Multiplex Immunoassay Profiling.
Stephen, Laurie
2017-01-01
Multiplex immunoassays allow for the rapid profiling of biomarker proteins in biological fluids, using less sample and labor than single immunoassays. This chapter details the methods to develop and manufacture multiplex assays for the Luminex ® platform. Although assay development is not included here, the same methods can be used to covalently couple antibodies to the Luminex beads and to label antibodies for the screening of sandwich pairs, if needed. The assay optimization, detection of cross-reactivity, and minimizing antibody interactions and matrix interferences will be addressed.
MRI EVALUATION OF KNEE CARTILAGE
Rodrigues, Marcelo Bordalo; Camanho, Gilberto Luís
2015-01-01
Because magnetic resonance imaging (MRI) can characterize soft tissue noninvasively, it has become an excellent method for evaluating cartilage. The development of new and faster techniques has increased the resolution and contrast available for evaluating chondral structure, with greater diagnostic accuracy. In addition, physiological techniques for cartilage assessment have been developed that can detect early changes before the appearance of cracks and erosion. In this updating article, the various techniques for chondral assessment using knee MRI will be discussed and demonstrated. PMID:27022562
3D printed optical phantoms and deep tissue imaging for in vivo applications including oral surgery
NASA Astrophysics Data System (ADS)
Bentz, Brian Z.; Costas, Alfonso; Gaind, Vaibhav; Garcia, Jose M.; Webb, Kevin J.
2017-03-01
Progress in developing optical imaging for biomedical applications requires customizable and often complex objects known as "phantoms" for testing, evaluation, and calibration. This work demonstrates that 3D printing is an ideal method for fabricating such objects, allowing intricate inhomogeneities to be placed at exact locations in complex or anatomically realistic geometries, a process that is difficult or impossible using molds. We show printed mouse phantoms we have fabricated for developing deep tissue fluorescence imaging methods, and measurements of both their optical and mechanical properties. Additionally, we present a printed phantom of the human mouth that we use to develop an artery localization method to assist in oral surgery.
An analytical method to predict efficiency of aircraft gearboxes
NASA Technical Reports Server (NTRS)
Anderson, N. E.; Loewenthal, S. H.; Black, J. D.
1984-01-01
A spur gear efficiency prediction method previously developed by the authors was extended to include power loss of planetary gearsets. A friction coefficient model was developed for MIL-L-7808 oil based on disc machine data. This combined with the recent capability of predicting losses in spur gears of nonstandard proportions allows the calculation of power loss for complete aircraft gearboxes that utilize spur gears. The method was applied to the T56/501 turboprop gearbox and compared with measured test data. Bearing losses were calculated with large scale computer programs. Breakdowns of the gearbox losses point out areas for possible improvement.
NASA Technical Reports Server (NTRS)
Childs, A. G.
1971-01-01
A discrete steepest ascent method which allows controls that are not piecewise constant (for example, it allows all continuous piecewise linear controls) was derived for the solution of optimal programming problems. This method is based on the continuous steepest ascent method of Bryson and Denham and new concepts introduced by Kelley and Denham in their development of compatible adjoints for taking into account the effects of numerical integration. The method is a generalization of the algorithm suggested by Canon, Cullum, and Polak with the details of the gradient computation given. The discrete method was compared with the continuous method for an aerodynamics problem for which an analytic solution is given by Pontryagin's maximum principle, and numerical results are presented. The discrete method converges more rapidly than the continuous method at first, but then, for some undetermined reason, loses its exponential convergence rate. A comparison was also made with the algorithm of Canon, Cullum, and Polak using piecewise constant controls. This algorithm is very competitive with the continuous algorithm.
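A compact illustration of steepest ascent on a control that is piecewise linear rather than piecewise constant is given below. It is not the report's algorithm (no adjoint or compatible-adjoint machinery is used; the gradient is taken by finite differences), and the plant, objective and step size are assumed.

```python
import numpy as np

T, N = 2.0, 200
dt = T / N
t_nodes = np.linspace(0.0, T, 6)          # breakpoints of the piecewise-linear control
t_grid = np.linspace(0.0, T, N + 1)

def objective(u_nodes):
    """Euler-integrate dx/dt = -x + u(t), with u piecewise linear in its node values,
    and return the payoff: reach x(T) = 1 while penalizing control effort."""
    u = np.interp(t_grid, t_nodes, u_nodes)
    x = 0.0
    for k in range(N):
        x += dt * (-x + u[k])
    return -(x - 1.0) ** 2 - 0.05 * dt * np.sum(u ** 2)

u_nodes = np.zeros(6)
step, eps = 0.2, 1e-4
for _ in range(300):                       # steepest-ascent iterations
    J0 = objective(u_nodes)
    grad = np.array([(objective(u_nodes + eps * e) - J0) / eps for e in np.eye(6)])
    u_nodes += step * grad                 # move along the finite-difference gradient

print("tuned control nodes:", np.round(u_nodes, 3))
print("final objective:", round(objective(u_nodes), 5))
```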
Modelisations et inversions tri-dimensionnelles en prospections gravimetrique et electrique
NASA Astrophysics Data System (ADS)
Boulanger, Olivier
The aim of this thesis is the application of gravity and resistivity methods for mining prospecting. The objectives of the present study are: (1) to build a fast gravity inversion method to interpret surface data; (2) to develop a tool for modelling the electrical potential acquired at surface and in boreholes when the resistivity distribution is heterogeneous; and (3) to define and implement a stochastic inversion scheme allowing the estimation of the subsurface resistivity from electrical data. The first technique concerns the elaboration of a three-dimensional (3D) inversion program allowing the interpretation of gravity data using a selection of constraints such as the minimum distance, the flatness, the smoothness and the compactness. These constraints are integrated in a Lagrangian formulation. A multi-grid technique is also implemented to resolve separately large and short gravity wavelengths. The subsurface in the survey area is divided into juxtaposed rectangular prismatic blocks. The problem is solved by calculating the model parameters, i.e. the densities of each block. Weights are given to each block depending on depth, a priori information on density, and the density range allowed for the region under investigation. The present code is tested on synthetic data. Advantages and behaviour of each method are compared in the 3D reconstruction. Recovery of the geometry (depth, size) and density distribution of the original model is dependent on the set of constraints used. The best combination of constraints for multiple bodies appears to be flatness together with minimum volume. The inversion method is also tested on real gravity data. The second tool developed in this thesis is a three-dimensional electrical resistivity modelling code to interpret surface and subsurface data. Based on the integral equation, it calculates the charge density caused by conductivity gradients at each interface of the mesh, allowing an exact estimation of the potential. Modelling generates a huge matrix made of Green's functions, which is stored using the method of pyramidal compression. The third method consists of interpreting electrical potential measurements with a non-linear geostatistical approach that includes new constraints. This method estimates an analytical covariance model for the resistivity parameters from the potential data. (Abstract shortened by UMI.)
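The core of the constrained gravity inversion — a depth-weighted, regularized least-squares solve for block densities — can be sketched in a few lines. The example below is an illustration under assumptions, not the thesis code: the sensitivity matrix is synthetic rather than computed from prism formulas, and only a smallest-model constraint with depth weighting is shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic setup (assumed): n_d surface gravity stations, n_m prismatic cells
# arranged in n_layers layers of n_per cells each.
n_d, n_layers, n_per = 30, 8, 20
n_m = n_layers * n_per

# Stand-in sensitivity matrix: each column decays with cell depth, mimicking the
# falloff of prism responses (a real code would evaluate prism kernels).
depths = np.repeat(np.arange(1, n_layers + 1, dtype=float), n_per)
G = rng.normal(1.0, 0.2, (n_d, n_m)) / depths[None, :] ** 2

# Synthetic "true" density anomaly: one compact block at mid depth.
m_true = np.zeros(n_m)
m_true[(depths == 4) | (depths == 5)] = 0.5
d_obs = G @ m_true + rng.normal(0.0, 1e-3, n_d)

# Depth weighting counteracts the natural decay of sensitivity with depth,
# so the recovered density is not all concentrated near the surface.
beta = 2.0
W = np.diag(depths ** (beta / 2.0))

# Regularized smallest-model solution: min ||G m - d||^2 + lam ||W m||^2.
lam = 1e-4
A = G.T @ G + lam * W.T @ W
m_rec = np.linalg.solve(A, G.T @ d_obs)

print("recovered mean density per layer:",
      np.round([m_rec[depths == z].mean() for z in range(1, n_layers + 1)], 3))
```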
[Enzymatic methods in the analysis of musts and wines].
Lafon-Lafourcade, S
1978-01-01
The enzymatic methods are based on the property of enzymes to catalyse specifically and reversibly the conversion of certain metabolites. These methods, developed thanks to the industrial preparation of enzymes, can be applied with no major modification to the analysis of drinks. About 15 constituents of musts and wines can now be determined by these methods. If their cost were not relatively high, their specificity, sensitivity and rapidity would enable them to compete with the most precise chemical methods. This is why they are only used in analytical oenology when chemical analysis is not specific enough or too laborious. Enzymatic measurement allows one, by its specificity, to determine the amount of residual fermentable sugar in a dry wine and, by its sensitivity, to verify the total disappearance of the malic acid of the wine. Its rapidity makes it preferable to the long and not very specific chemical measurement, especially concerning the determination of citric acid. But glycerol, ethanol and acetic acid can be measured by chemical or chromatographic means with sufficient precision and at a more modest price. In oenology the enzymatic methods are essentially used for research. They have permitted the study of the combinations of sulphur dioxide in wines (measurement of ketonic acids). The determination of the isomeric nature of the lactic acid produced from sugars by lactic bacteria is based on their application; this determination is a criterion for the identification and classification of these microorganisms. The measurement of lactic acid during vinification allows the early detection of the first effects of a bacterial development; conversely, it permits ruling out lactic spoilage, which a high volatile acidity might otherwise suggest. Lastly, the enzymatic measurement of gluconic acid allows the health of the crop to be checked.
Suba, Dávid; Urbányi, Zoltán; Salgó, András
2016-10-01
Capillary electrophoresis techniques are widely used in analytical biotechnology. Different electrophoretic techniques are very adequate tools to monitor size- and charge heterogeneities of protein drugs. Method descriptions and development studies of capillary zone electrophoresis (CZE) have been described in the literature. Most of them are performed based on the classical one-factor-at-a-time (OFAT) approach. In this study a very simple method development approach is described for capillary zone electrophoresis: a "two-phase-four-step" approach is introduced which allows a rapid, iterative method development process and can be a good platform for CZE method development. In every step the current analytical target profile and an appropriate control strategy were established to monitor the current stage of development. A very good platform was established to investigate intact and digested protein samples. A commercially available monoclonal antibody was chosen as the model protein for the method development study. The CZE method was qualified after the development process and the results are presented. The analytical system stability was represented by the calculated RSD% values of the area percentage and migration time of the selected peaks (<0.8% and <5%) during the intermediate precision investigation. Copyright © 2016 Elsevier B.V. All rights reserved.
Howard, Jeremy T; Pryce, Jennie E; Baes, Christine; Maltecca, Christian
2017-08-01
Traditionally, pedigree-based relationship coefficients have been used to manage the inbreeding and degree of inbreeding depression that exists within a population. The widespread incorporation of genomic information in dairy cattle genetic evaluations allows for the opportunity to develop and implement methods to manage populations at the genomic level. As a result, the realized proportion of the genome that 2 individuals share can be more accurately estimated instead of using pedigree information to estimate the expected proportion of shared alleles. Furthermore, genomic information allows genome-wide relationship or inbreeding estimates to be augmented to characterize relationships for specific regions of the genome. Region-specific stretches can be used to more effectively manage areas of low genetic diversity or areas that, when homozygous, result in reduced performance across economically important traits. The use of region-specific metrics should allow breeders to more precisely manage the trade-off between the genetic value of the progeny and undesirable side effects associated with inbreeding. Methods tailored toward more effectively identifying regions affected by inbreeding and their associated use to manage the genome at the herd level, however, still need to be developed. We have reviewed topics related to inbreeding, measures of relatedness, genetic diversity and methods to manage populations at the genomic level, and we discuss future challenges related to managing populations through implementing genomic methods at the herd and population levels. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
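One widely used way to obtain the realized genomic relationships and genomic inbreeding coefficients discussed above is VanRaden's first method. The sketch below applies that standard construction to simulated genotypes; it is a generic illustration, not a procedure taken from this review.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated SNP genotypes coded 0/1/2 for n animals at m loci (assumed data).
n, m = 50, 1000
p = rng.uniform(0.05, 0.95, m)                 # allele frequencies
M = rng.binomial(2, p, size=(n, m)).astype(float)

# VanRaden (method 1) genomic relationship matrix:
# center the genotypes by 2p and scale by the sum of 2p(1-p).
Z = M - 2.0 * p
G = Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))

# Genomic inbreeding coefficient of each animal and the relationship of one pair.
F_genomic = np.diag(G) - 1.0
print("mean genomic inbreeding:", round(F_genomic.mean(), 4))
print("genomic relationship of animals 0 and 1:", round(G[0, 1], 4))
```

Region-specific metrics of the kind the review advocates would restrict the same calculation to windows of consecutive loci, or use runs of homozygosity, rather than the whole-genome sum shown here.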
Automatic age-related macular degeneration detection and staging
NASA Astrophysics Data System (ADS)
van Grinsven, Mark J. J. P.; Lechanteur, Yara T. E.; van de Ven, Johannes P. H.; van Ginneken, Bram; Theelen, Thomas; Sánchez, Clara I.
2013-03-01
Age-related macular degeneration (AMD) is a degenerative disorder of the central part of the retina, which mainly affects older people and leads to permanent loss of vision in advanced stages of the disease. AMD grading of non-advanced AMD patients allows risk assessment for the development of advanced AMD and enables timely treatment of patients, to prevent vision loss. AMD grading is currently performed manually on color fundus images, which is time consuming and expensive. In this paper, we propose a supervised classification method to distinguish patients at high risk to develop advanced AMD from low risk patients and provide an exact AMD stage determination. The method is based on the analysis of the number and size of drusen on color fundus images, as drusen are the early characteristics of AMD. An automatic drusen detection algorithm is used to detect all drusen. A weighted histogram of the detected drusen is constructed to summarize the drusen extension and size and fed into a random forest classifier in order to separate low risk from high risk patients and to allow exact AMD stage determination. Experiments showed that the proposed method achieved similar performance as human observers in distinguishing low risk from high risk AMD patients, obtaining areas under the Receiver Operating Characteristic curve of 0.929 and 0.934. A weighted kappa agreement of 0.641 and 0.622 versus two observers were obtained for AMD stage evaluation. Our method allows for quick and reliable AMD staging at low costs.
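The feature-and-classifier stage described above (a weighted drusen histogram fed to a random forest) can be sketched with scikit-learn. The drusen detector itself is not reproduced; the per-eye histograms below are simulated, and the size-bin weights are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Simulated per-eye features (assumed): counts of detected drusen in a few size
# bins, weighted so that larger drusen contribute more to the feature vector.
size_bin_weights = np.array([1.0, 2.0, 4.0, 8.0])

def make_eyes(n, high_risk):
    counts = rng.poisson(lam=(8 if high_risk else 2), size=(n, 4))
    return counts * size_bin_weights           # weighted drusen histogram per eye

X = np.vstack([make_eyes(200, False), make_eyes(200, True)])
y = np.array([0] * 200 + [1] * 200)             # 0 = low risk, 1 = high risk

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]
print("ROC AUC on held-out eyes:", round(roc_auc_score(y_te, scores), 3))
```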
Learning challenges and sustainable development: A methodological perspective.
Seppänen, Laura
2017-01-01
Sustainable development requires learning, but the contents of learning are often complex and ambiguous. This requires new integrated approaches from research. It is argued that investigation of people's learning challenges in every-day work is beneficial for research on sustainable development. The aim of the paper is to describe a research method for examining learning challenges in promoting sustainable development. This method is illustrated with a case example from organic vegetable farming in Finland. The method, based on Activity Theory, combines historical analysis with qualitative analysis of need expressions in discourse data. The method linking local and subjective need expressions with general historical analysis is a promising way to overcome the gap between the individual and society, so much needed in research for sustainable development. Dialectically informed historical frameworks have practical value as tools in collaborative negotiations and participatory designs for sustainable development. The simultaneous use of systemic and subjective perspectives allows researchers to manage the complexity of practical work activities and to avoid too simplistic presumptions about sustainable development.
Who watches the watchers?: preventing fault in a fault tolerance library
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanavige, C. D.
The Scalable Checkpoint/Restart library (SCR) was developed and is used by researchers at Lawrence Livermore National Laboratory to provide a fast and efficient method of saving and recovering large applications during runtime on high-performance computing (HPC) systems. Though SCR protects other programs, up until June 2017, nothing was actively protecting SCR. The goal of this project was to automate the building and testing of this library on the varying HPC architectures on which it is used. Our methods centered around the use of a continuous integration tool called Bamboo that allowed for automation agents to be installed on the HPC systems themselves. These agents provided a way for us to establish a new and unique way to automate and customize the allocation of resources and running of tests with CMake's unit testing framework, CTest, as well as integration testing scripts through an HPC package manager called Spack. These methods provided a parallel environment in which to test the more complex features of SCR. As a result, SCR is now automatically built and tested on several HPC architectures any time changes are made by developers to the library's source code. The results of these tests are then communicated back to the developers for immediate feedback, allowing them to fix functionality of SCR that may have broken. Hours of developers' time are now being saved from the tedious process of manually testing and debugging, which saves money and allows the SCR project team to focus their efforts towards development. Thus, HPC system users can use SCR in conjunction with their own applications to efficiently and effectively checkpoint and restart as needed with the assurance that SCR itself is functioning properly.
Development of a Native Fractionation Antigen Microarray for Autoantibody Profiling in Breast Cancer
2011-10-01
PRINCIPAL INVESTIGATOR: Brian C.-S. Liu, Ph.D. GRANT NUMBER: W81XWH-09-1-0684. ABSTRACT: The humoral response of a cancer patient may allow earlier detection of cancer than current methods allow. If so, the serum…
Versatile Synthesis of Stable, Functional Polypeptides via Reaction with Epoxides.
Gharakhanian, Eric G; Deming, Timothy J
2015-06-08
Methodology was developed for efficient alkylation of methionine residues using epoxides as a general strategy to introduce a wide range of functional groups onto polypeptides. Use of a spacer between epoxide and functional groups further allowed addition of sterically demanding functionalities. Contrary to other methods to alkylate methionine residues, epoxide alkylations allow the reactions to be conducted in wet protic media and give sulfonium products that are stable against dealkylation. These functionalizations are notable since they are chemoselective, utilize stable and readily available epoxides, and allow facile incorporation of an unprecedented range of functional groups onto simple polypeptides using stable linkages.
Medendorp, Joseph; Bric, John; Connelly, Greg; Tolton, Kelly; Warman, Martin
2015-08-10
The purpose of this manuscript is to present the intended use and long-term maintenance strategy of an online laser diffraction particle size method used for process control in a spray drying process. A Malvern Insitec was used for online particle size measurements and a Malvern Mastersizer was used for offline particle size measurements. The two methods were developed in parallel with the Mastersizer serving as the reference method. Despite extensive method development across a range of particle sizes, the two instruments demonstrated different sensitivities to material and process changes over the product lifecycle. This paper will describe the procedure used to ensure consistent alignment of the two methods, thus allowing for continued use of online real-time laser diffraction as a surrogate for the offline system over the product lifecycle. Copyright © 2015 Elsevier B.V. All rights reserved.
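A minimal sketch of how one might check the alignment between an online and an offline particle-size method is given below; the paired D50 values are invented for illustration and the linear fit is only one possible alignment check, not the manuscript's actual procedure for the Insitec and Mastersizer instruments.

import numpy as np

# Hypothetical paired median particle sizes (D50, micrometres) measured on the same
# lots by the offline reference method and the online method.
offline_d50 = np.array([18.2, 20.1, 22.5, 25.0, 27.8, 30.4])
online_d50 = np.array([17.5, 19.6, 22.0, 24.1, 27.0, 29.3])

# Least-squares line: online = slope * offline + intercept. A slope near 1 and a
# small intercept indicate the two methods stay aligned over the size range.
slope, intercept = np.polyfit(offline_d50, online_d50, 1)
residuals = online_d50 - (slope * offline_d50 + intercept)

print(f"slope = {slope:.3f}, intercept = {intercept:.2f} um")
print(f"max residual = {np.max(np.abs(residuals)):.2f} um")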
Development of Advanced Methods of Structural and Trajectory Analysis for Transport Aircraft
NASA Technical Reports Server (NTRS)
Ardema, Mark D.
1996-01-01
In this report the author describes: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of flight path optimization. A method of estimating the load-bearing fuselage weight and wing weight of transport aircraft based on fundamental structural principles has been developed. This method of weight estimation represents a compromise between the rapid assessment of component weight using empirical methods based on actual weights of existing aircraft and detailed, but time-consuming, analysis using the finite element method. The method was applied to eight existing subsonic transports for validation and correlation. The resulting computer program, PDCYL, has been integrated into the weights-calculating module of the AirCraft SYNThesis (ACSYNT) computer program. ACSYNT has traditionally used only empirical weight estimation methods; PDCYL adds to ACSYNT a rapid, accurate means of assessing the fuselage and wing weights of unconventional aircraft. PDCYL also allows flexibility in the choice of structural concept, as well as a direct means of determining the impact of advanced materials on structural weight.
Genetics-based methods for detection of Salmonella spp. in foods.
Mozola, Mark A
2006-01-01
Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.
Rapid Method for Sodium Hydroxide Fusion of Concrete and ...
Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Americium-241, plutonium-238, plutonium-239, radium-226, strontium-90, uranium-234, uranium-235 and uranium-238 in concrete and brick samples Method Selected for: SAM lists this method for qualitative analysis of americium-241, plutonium-238, plutonium-239, radium-226, strontium-90, uranium-234, uranium-235 and uranium-238 in concrete or brick building materials. Summary of subject analytical method which will be posted to the SAM website to allow access to the method.
Orfanidis, Leonidas; Bamidis, Panagiotis; Eaglestone, Barry
2006-01-01
This paper is concerned with modelling national approaches towards electronic health record systems (NEHRS) development. A model framework is produced stepwise that allows the preparedness and readiness of a country to develop an NEHRS to be characterised. Secondary data from published reports are considered for the creation of the model. Such sources are identified as mostly originating from within a sample of five developed countries. Factors arising from these sources are identified, coded and scaled so as to allow a quantitative application of the model. Instantiation of the model for the case of the five developed countries is contrasted with the set of countries from South East Europe (SEE). The likely importance and validity of this modelling approach are discussed, using the Delphi method.
Communicating spatial uncertainty to non-experts using R
NASA Astrophysics Data System (ADS)
Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze
2016-04-01
Effective visualisation methods are important for the efficient use of uncertainty information for various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to effectively communicate the uncertainty information to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we surveyed a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R package included a collation of the plotting functions that were evaluated in the survey. The static visualisations were implemented via calls to the 'ggplot2' package, giving the user control over the content, legend, colours, axes and titles. The interactive methods were implemented using the 'shiny' package, allowing users to activate the visualisation of statistical descriptions of uncertainty through interaction with a plotted map of means. This research brings uncertainty visualisation to a broader audience through the development of tools for visualising uncertainty using open source software.
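The package described above is written in R; the Python sketch below merely illustrates the underlying Monte Carlo idea on a toy grid (the model, grid size and error standard deviation are invented), producing the ensemble mean and standard deviation maps that the static visualisations summarise.

import numpy as np

rng = np.random.default_rng(42)

# Toy "measured" DEM on a 50 x 50 grid with an assumed measurement error of 2 m (1 sigma).
dem = rng.uniform(100.0, 200.0, size=(50, 50))
error_sd = 2.0
n_realizations = 500

def model(elevation):
    """Toy spatial model: a quantity that depends nonlinearly on elevation."""
    return np.sqrt(elevation) + 0.01 * elevation

# Monte Carlo uncertainty propagation: perturb the input, run the model,
# and collect the ensemble of outputs.
outputs = np.empty((n_realizations,) + dem.shape)
for i in range(n_realizations):
    realization = dem + rng.normal(0.0, error_sd, size=dem.shape)
    outputs[i] = model(realization)

mean_map = outputs.mean(axis=0)           # ensemble mean per cell
sd_map = outputs.std(axis=0, ddof=1)      # ensemble standard deviation per cell
lower, upper = np.percentile(outputs, [2.5, 97.5], axis=0)  # 95% prediction interval

print("mean of output:", mean_map.mean().round(3))
print("typical output sd:", sd_map.mean().round(4))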
Environmentally evaluated HPLC-ELSD method to monitor enzymatic synthesis of a non-ionic surfactant.
Gaber, Yasser; Akerman, Cecilia Orellana; Hatti-Kaul, Rajni
2014-01-01
N-Lauroyl-N-methylglucamide is a biodegradable surfactant derived from renewable resources. In an earlier study, we presented an enzymatic solvent-free method for synthesis of this compound. In the present report, the HPLC method developed to follow the reaction between lauric acid/methyl laurate and N-methyl glucamine (MEG) and its environmental assessment are described. Use of ultraviolet (UV) absorption or refractive index (RI) detectors did not allow the detection of N-methyl glucamine (MEG). With an evaporative light scattering detector (ELSD), it was possible to apply gradient elution and detect MEG with a limit of detection, LOD = 0.12 μg. A good separation of the peaks (MEG, lauric acid, product (amide) and by-product (amide-ester)) was achieved with the gradient program, with a run time of 40 min. The ELSD settings were optimized using methyl laurate as the analyte. LC-MS/MS was used to confirm the amide and amide-ester peaks. We evaluated the greenness of the developed method using the freely available software HPLC-Environmental Assessment Tool (HPLC-EAT); the method scored 73 HPLC-EAT units, implying that the analytical procedure is more environmentally benign than some other methods reported in the literature, whose HPLC-EAT scores reached 182. Use of the ELSD allowed the detection and quantification of the substrates and the reaction products of enzymatic synthesis of the surfactant, N-lauroyl-N-methylglucamide. The developed HPLC method has an acceptable environmental profile based on the HPLC-EAT evaluation. Copyright © 2015 Elsevier B.V. All rights reserved.
Environmentally evaluated HPLC-ELSD method to monitor enzymatic synthesis of a non-ionic surfactant
2014-01-01
Background N-Lauroyl-N-methylglucamide is a biodegradable surfactant derived from renewable resources. In an earlier study, we presented an enzymatic solvent-free method for synthesis of this compound. In the present report, the HPLC method developed to follow the reaction between lauric acid/methyl laurate and N-methyl glucamine (MEG) and its environmental assessment are described. Results Use of ultraviolet (UV) absorption or refractive index (RI) detectors did not allow the detection of N-methyl glucamine (MEG). With an evaporative light scattering detector (ELSD), it was possible to apply gradient elution and detect MEG with a limit of detection, LOD = 0.12 μg. A good separation of the peaks (MEG, lauric acid, product (amide) and by-product (amide-ester)) was achieved with the gradient program, with a run time of 40 min. The ELSD settings were optimized using methyl laurate as the analyte. LC-MS/MS was used to confirm the amide and amide-ester peaks. We evaluated the greenness of the developed method using the freely available software HPLC-Environmental Assessment Tool (HPLC-EAT); the method scored 73 HPLC-EAT units, implying that the analytical procedure is more environmentally benign than some other methods reported in the literature, whose HPLC-EAT scores reached 182. Conclusion Use of the ELSD allowed the detection and quantification of the substrates and the reaction products of enzymatic synthesis of the surfactant, N-lauroyl-N-methylglucamide. The developed HPLC method has an acceptable environmental profile based on the HPLC-EAT evaluation. PMID:24914404
Prediction of unsteady transonic flow around missile configurations
NASA Technical Reports Server (NTRS)
Nixon, D.; Reisenthel, P. H.; Torres, T. O.; Klopfer, G. H.
1990-01-01
This paper describes the preliminary development of a method for predicting the unsteady transonic flow around missiles at transonic and supersonic speeds, with the final goal of developing a computer code for use in aeroelastic calculations or during maneuvers. The basic equations derived for this method are an extension of those derived by Klopfer and Nixon (1989) for steady flow and are a subset of the Euler equations. In this approach, the five Euler equations are reduced to an equation similar to the three-dimensional unsteady potential equation, and a two-dimensional Poisson equation. In addition, one of the equations in this method is almost identical to the potential equation for which there are well tested computer codes, allowing the development of a prediction method based in part on proved technology.
A method for multi-codon scanning mutagenesis of proteins based on asymmetric transposons.
Liu, Jia; Cropp, T Ashton
2012-02-01
Random mutagenesis followed by selection or screening is a commonly used strategy to improve protein function. Despite many available methods for random mutagenesis, nearly all generate mutations at the nucleotide level. An ideal mutagenesis method would allow for the generation of 'codon mutations' to change protein sequence with defined or mixed amino acids of choice. Herein we report a method that allows for mutations of one, two or three consecutive codons. Key to this method is the development of a Mu transposon variant with asymmetric terminal sequences. As a demonstration of the method, we performed multi-codon scanning on the gene encoding superfolder GFP (sfGFP). Characterization of 50 randomly chosen clones from each library showed that more than 40% of the mutants in these three libraries contained seamless, in-frame mutations with low site preference. By screening only 500 colonies from each library, we successfully identified several spectra-shift mutations, including a S205D variant that was found to bear a single excitation peak in the UV region.
Forensic Facial Reconstruction: The Final Frontier.
Gupta, Sonia; Gupta, Vineeta; Vij, Hitesh; Vij, Ruchieka; Tyagi, Nutan
2015-09-01
Forensic facial reconstruction can be used to identify unknown human remains when other techniques fail. Through this article, we attempt to review the different methods of facial reconstruction reported in the literature. There are several techniques for facial reconstruction, varying from two-dimensional drawings to three-dimensional clay models. With the advancement in 3D technology, a rapid, efficient and cost-effective computerized 3D forensic facial reconstruction method has been developed which has brought down the degree of error previously encountered. There are several methods of manual facial reconstruction, but the combination Manchester method has been reported to be the best and most accurate method for the positive recognition of an individual. Recognition allows the involved government agencies to make a list of suspected victims. This list can then be narrowed down and a positive identification may be given by the more conventional methods of forensic medicine. Facial reconstruction allows visual identification by the individual's family and associates to become easy and more definite.
A review of modern instrumental techniques for measurements of ice cream characteristics.
Bahram-Parvar, Maryam
2015-12-01
There is increasing demand from the food industry and research institutes for means of measurement allowing the characterization of foods. Ice cream, as a complex food system, consists of a frozen matrix containing air bubbles, fat globules, ice crystals, and an unfrozen serum phase. Some deficiencies in conventional methods for testing this product encourage the use of alternative techniques such as rheometry, spectroscopy, X-ray, electro-analytical techniques, ultrasound, and laser. Despite the development of novel instrumental applications in food science, the use of some of them in ice cream testing is still limited, but has shown promising results. Developing these novel methods should increase our understanding of the characteristics of ice cream and may allow online testing of the product. This review article discusses the potential of destructive and non-destructive methodologies in determining the quality and characteristics of ice cream and similar products. Copyright © 2015. Published by Elsevier Ltd.
Primate comparative neuroscience using magnetic resonance imaging: promises and challenges
Mars, Rogier B.; Neubert, Franz-Xaver; Verhagen, Lennart; Sallet, Jérôme; Miller, Karla L.; Dunbar, Robin I. M.; Barton, Robert A.
2014-01-01
Primate comparative anatomy is an established field that has made rich and substantial contributions to neuroscience. However, the labor-intensive techniques employed mean that most comparisons are often based on a small number of species, which limits the conclusions that can be drawn. In this review we explore how new developments in magnetic resonance imaging have the potential to apply comparative neuroscience to a much wider range of species, allowing it to realize an even greater potential. We discuss (1) new advances in the types of data that can be acquired, (2) novel methods for extracting meaningful measures from such data that can be compared between species, and (3) methods to analyse these measures within a phylogenetic framework. Together these developments will allow researchers to characterize the relationship between different brains, the ecological niche they occupy, and the behavior they produce in more detail than ever before. PMID:25339857
Teleautonomous guidance for mobile robots
NASA Technical Reports Server (NTRS)
Borenstein, J.; Koren, Y.
1990-01-01
Teleautonomous guidance (TG), a technique for the remote guidance of fast mobile robots, has been developed and implemented. With TG, the mobile robot follows the general direction prescribed by an operator. However, if the robot encounters an obstacle, it autonomously avoids collision with that obstacle while trying to match the prescribed direction as closely as possible. This type of shared control is completely transparent and transfers control between teleoperation and autonomous obstacle avoidance gradually. TG allows the operator to steer vehicles and robots at high speeds and in cluttered environments, even without visual contact. TG is based on the virtual force field (VFF) method, which was developed earlier for autonomous obstacle avoidance. The VFF method is especially suited to the accommodation of inaccurate sensor data (such as that produced by ultrasonic sensors) and sensor fusion, and allows the mobile robot to travel quickly without stopping for obstacles.
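A highly simplified sketch of the virtual force field idea is shown below (the obstacle map, gains and force law are illustrative assumptions, not the published VFF parameters): occupied cells of a simple obstacle map push the robot away while the operator's prescribed direction pulls it forward, and the vector sum gives the steering direction.

import math

def vff_steering(robot_xy, occupied_cells, target_direction,
                 repulse_gain=0.5, attract_gain=1.0):
    """Return a steering direction (radians) from a simple virtual force field.

    robot_xy         -- (x, y) position of the robot
    occupied_cells   -- list of (x, y) centres of occupied map cells
    target_direction -- direction (radians) prescribed by the operator
    """
    fx = attract_gain * math.cos(target_direction)  # attractive force toward the prescribed heading
    fy = attract_gain * math.sin(target_direction)
    for ox, oy in occupied_cells:
        dx, dy = robot_xy[0] - ox, robot_xy[1] - oy
        dist = math.hypot(dx, dy)
        if dist < 1e-6:
            continue
        # Repulsive force decays with the square of the distance to the obstacle cell.
        magnitude = repulse_gain / dist**2
        fx += magnitude * dx / dist
        fy += magnitude * dy / dist
    return math.atan2(fy, fx)

# Example: an obstacle slightly ahead and to the left deflects the heading to the
# right (a negative angle), away from the obstacle, while roughly keeping the
# operator's prescribed direction.
heading = vff_steering((0.0, 0.0), [(1.0, 0.5)], target_direction=0.0)
print(f"steering direction: {math.degrees(heading):.1f} degrees")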
Prischi, Filippo; Pastore, Annalisa
2016-01-01
The current main challenge of Structural Biology is to undertake the structure determination of increasingly complex systems in the attempt to better understand their biological function. As systems become more challenging, however, there is an increasing demand for the parallel use of more than one independent technique to allow pushing the frontiers of structure determination and, at the same time, obtaining independent structural validation. The combination of different Structural Biology methods has been named hybrid approaches. The aim of this review is to critically discuss the most recent examples and new developments that have allowed structure determination or experimentally-based modelling of various molecular complexes selecting them among those that combine the use of nuclear magnetic resonance and small angle scattering techniques. We provide a selective but focused account of some of the most exciting recent approaches and discuss their possible further developments.
Label-Free Immuno-Sensors for the Fast Detection of Listeria in Food.
Morlay, Alexandra; Roux, Agnès; Templier, Vincent; Piat, Félix; Roupioz, Yoann
2017-01-01
Foodborne diseases are a major concern for both the food industry and health organizations due to the economic costs and potential threats to human lives. For these reasons, specific regulations mandate testing for pathogenic bacteria in food products. Nevertheless, current methods, both reference and alternative, take up to several days and require many handling steps. In order to improve pathogen detection in food, we developed an immuno-sensor, based on Surface Plasmon Resonance imaging (SPRi) and bacterial growth, which allows the detection of a very low number of Listeria monocytogenes in a food sample in one day. Adequate sensitivity is achieved by the deposition of several antibodies in a micro-array format allowing real-time detection. This label-free method thus reduces handling and time to result compared with current methods.
Rapid Method for Sodium Hydroxide/Sodium Peroxide Fusion ...
Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Plutonium-238 and plutonium-239 in water and air filters Method Selected for: SAM lists this method as a pre-treatment technique supporting analysis of refractory radioisotopic forms of plutonium in drinking water and air filters using the following qualitative techniques: • Rapid methods for acid or fusion digestion • Rapid Radiochemical Method for Plutonium-238 and Plutonium 239/240 in Building Materials for Environmental Remediation Following Radiological Incidents. Summary of subject analytical method which will be posted to the SAM website to allow access to the method.
Quantitative analysis of random migration of cells using time-lapse video microscopy.
Jain, Prachi; Worthylake, Rebecca A; Alahari, Suresh K
2012-05-13
Cell migration is a dynamic process, which is important for embryonic development, tissue repair, immune system function, and tumor invasion (1, 2). During directional migration, cells move rapidly in response to an extracellular chemotactic signal, or in response to intrinsic cues (3) provided by the basic motility machinery. Random migration occurs when a cell possesses low intrinsic directionality, allowing the cells to explore their local environment. Cell migration is a complex process; in the initial response, the cell undergoes polarization and extends protrusions in the direction of migration (2). Traditional methods to measure migration, such as the Boyden chamber migration assay, provide an easy way to measure chemotaxis in vitro as an end-point result. However, this approach neither allows measurement of individual migration parameters nor allows visualization of the morphological changes that the cell undergoes during migration. Here, we present a method that allows us to monitor migrating cells in real time using time-lapse video microscopy. Since cell migration and invasion are hallmarks of cancer, this method will be applicable to studying cancer cell migration and invasion in vitro. Random migration of platelets has been considered one of the parameters of platelet function (4), hence this method could also be helpful in studying platelet functions. This assay has the advantage of being rapid, reliable, and reproducible, and does not require optimization of cell numbers. In order to maintain physiologically suitable conditions for cells, the microscope is equipped with a CO(2) supply and a temperature thermostat. Cell movement is monitored by taking pictures with a camera fitted to the microscope at regular intervals. Cell migration can be quantified by measuring average speed and average displacement, which are calculated by the Slidebook software.
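To make the two quantitative readouts concrete, the sketch below computes average speed and a displacement rate from one tracked trajectory; the coordinate list and frame interval are invented, and the calculation is a generic one rather than the Slidebook implementation.

import math

# Hypothetical (x, y) positions of one cell (micrometres), one point per frame.
track = [(0.0, 0.0), (2.1, 0.8), (3.9, 2.0), (5.2, 4.1), (6.0, 6.5)]
frame_interval_min = 10.0  # assumed time between frames, in minutes

# Path length: sum of step lengths between consecutive frames.
path_length = sum(
    math.dist(track[i], track[i + 1]) for i in range(len(track) - 1)
)
total_time = frame_interval_min * (len(track) - 1)

average_speed = path_length / total_time            # um/min along the path
net_displacement = math.dist(track[0], track[-1])   # straight-line start-to-end distance
average_displacement_rate = net_displacement / total_time

print(f"average speed:      {average_speed:.2f} um/min")
print(f"net displacement:   {net_displacement:.2f} um")
print(f"displacement rate:  {average_displacement_rate:.2f} um/min")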
Identification of sewage leaks by active remote-sensing methods
NASA Astrophysics Data System (ADS)
Goldshleger, Naftaly; Basson, Uri
2016-04-01
The increasing length of sewage pipelines, and the concomitant risk of leaks due to urban and industrial growth and development, is exposing the surrounding land to contamination risk and environmental harm. It is therefore important to locate such leaks in a timely manner to minimize the damage. Advances in active remote sensing, namely Ground Penetrating Radar (GPR) and Frequency Domain Electromagnetic (FDEM) technologies, were used to identify leaks potentially responsible for pollution and to identify minor spills before they cause widespread damage. This study focused on the development of these electromagnetic methods to replace conventional acoustic methods for the identification of leaks along sewage pipes. Electromagnetic methods provide an additional advantage in that they allow mapping of the fluid-transport system in the subsurface. Leak-detection systems using GPR and FDEM are not limited to large amounts of water, but enable detecting leaks of tens of liters per hour, because they can locate increases in environmental moisture content of only a few percent along the pipes. The importance and uniqueness of this research lie in the development of practical tools to provide a snapshot and monitoring of the spatial changes in soil moisture content down to depths of about 3-4 m, in open and paved areas, at relatively low cost, in real time or close to real time. Spatial measurements performed using GPR and FDEM systems allow monitoring of many tens of thousands of measurement points per hectare, thus providing a picture of the spatial situation along pipelines and their surroundings. The main purpose of this study was to develop a method for detecting sewage leaks using the above-proposed geophysical methods, since their contaminants can severely affect public health. We focused on identifying, locating and characterizing such leaks in sewage pipes in residential and industrial areas.
Kopf, Thomas; Schmitz, Gerd
2013-11-01
The determination of the fatty acid (FA) profile of lipid classes is essential for lipidomic analysis. We recently developed a GC/MS method for the analysis of the FA profile of total FAs, i.e. the totality of bound and unbound FAs, in any given biological sample (TOFAs). Here, we present a method for the analysis of non-esterified fatty acids (NEFAs) in biological samples, i.e. the fraction that is present as extractable free fatty acids. Lipid extraction is performed according to Dole using 80/20 2-propanol/n-hexane (v/v) with 0.1% H2SO4. The fatty acid-species composition of this NEFA fraction is determined as FAME after derivatization with our GC/MS method on a BPX column (Shimadzu). Validation of the NEFA method presented was performed in human plasma samples. The validated method has been used with human plasma, cells and tissues, as well as mammalian body fluids and tissue samples. The newly developed solid-phase-extraction (SPE)-GC-MS method allows the rapid separation of the NEFA fraction from a neutral lipid extract of plasma samples. As a major advantage compared to GC-FID methods, GC-MS allows the use of stable isotope labeled fatty acid precursors to monitor fatty acid metabolism. Copyright © 2013 Elsevier B.V. All rights reserved.
Methodology to Define Delivery Accuracy Under Current Day ATC Operations
NASA Technical Reports Server (NTRS)
Sharma, Shivanjli; Robinson, John E., III
2015-01-01
In order to enable arrival management concepts and solutions in a NextGen environment, ground-based sequencing and scheduling functions have been developed to support metering operations in the National Airspace System. These sequencing and scheduling algorithms and tools are designed to aid air traffic controllers in developing an overall arrival strategy. The ground systems being developed will support the management of aircraft to their Scheduled Times of Arrival (STAs) at flow-constrained meter points. This paper presents a methodology for determining the undelayed delivery accuracy for current-day air traffic control operations. This new method analyzes the undelayed delivery accuracy at meter points in order to understand changes in desired flow rates, and it enables the definition of metrics that will allow near-future ground automation tools to achieve the desired separation at the meter points. This enables aircraft to meet their STAs while performing high-precision arrivals. The research presents a possible implementation that would allow the delivery performance of current tools to be estimated and delivery accuracy requirements for future tools to be defined, which allows analysis of Estimated Time of Arrival (ETA) accuracy for Time-Based Flow Management (TBFM) and the FAA's Traffic Management Advisor (TMA). TMA is a deployed system that generates scheduled time-of-arrival constraints for en-route air traffic controllers in the US. This new method of automated analysis provides a repeatable evaluation of the delay metrics for current-day traffic, new releases of TMA, implementation of different tools, and across different airspace environments. The method utilizes a wide set of data from the Operational TMA-TBFM Repository (OTTR) system, which processes raw data collected by the FAA from operational TMA systems at all ARTCCs in the nation. The OTTR system generates daily reports concerning ATC status, intent and actions. Because of its availability, ease of use, and vast collection of data across several airspaces, the OTTR data set was determined to be the best basis for this analysis. The variables needed for further analysis were identified along with the necessary OTTR reports; by working closely with the repository team, additional analysis reports were developed that provide key ETA and STA information at the freeze horizon. One major benefit of the OTTR data is that, with the correct reports, data across several airports can be analyzed over large periods of time. OTTR processes the TBFM data daily, and the results are stored in various formats across several airspaces. This allowed us to develop our own parsing methods and raw data processing that do not rely on other computationally expensive tools that perform more in-depth analysis of similar data sets. The majority of this work consisted of developing the ability to filter flights into a subset that could be considered undelayed, defined as flights whose ETA and STA at the freeze horizon differ minimally or not at all. This broad approach allowed the consideration of a large data set consisting of all traffic across a two-month period in 2013 (the hottest and coldest months) arriving into four airports: George Bush Intercontinental, Denver International, Los Angeles International, and Phoenix Sky Harbor.
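A schematic version of the filtering step described above is sketched below; the record fields, the 60-second tolerance and the summary statistics are illustrative assumptions, not the actual OTTR report format or the paper's thresholds.

import statistics

# Hypothetical per-flight records extracted from daily scheduling reports:
# ETA and STA at the freeze horizon, and the actual time over the meter point (seconds).
flights = [
    {"id": "AAL12", "eta": 36000.0, "sta": 36020.0, "actual": 36055.0},
    {"id": "UAL34", "eta": 36600.0, "sta": 36900.0, "actual": 36880.0},  # delayed at the freeze horizon
    {"id": "SWA56", "eta": 37200.0, "sta": 37190.0, "actual": 37230.0},
    {"id": "DAL78", "eta": 37800.0, "sta": 37795.0, "actual": 37760.0},
]

UNDELAYED_TOLERANCE_S = 60.0  # assumed |ETA - STA| cutoff for "undelayed" flights

undelayed = [f for f in flights if abs(f["eta"] - f["sta"]) <= UNDELAYED_TOLERANCE_S]

# Delivery error at the meter point: actual crossing time minus scheduled time.
errors = [f["actual"] - f["sta"] for f in undelayed]

print(f"{len(undelayed)} of {len(flights)} flights classified as undelayed")
print(f"mean delivery error: {statistics.mean(errors):.1f} s")
print(f"delivery error spread (stdev): {statistics.stdev(errors):.1f} s")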
NASA Astrophysics Data System (ADS)
Quiers, M.; Perrette, Y.; Etienne, D.; Develle, A. L.; Jacq, K.
2017-12-01
The use of organic proxies is increasing in paleoenvironmental reconstructions from natural archives. Major advances have been achieved by the development of new, highly informative molecular proxies usually linked to specific compounds. While studies focused on targeted compounds offer a high degree of information, advances on bulk organic matter are limited. However, this bulk is the main contributor to the carbon cycle and has been shown to be a driver of the transfer and recording of many mineral and organic compounds. The development of target proxies needs complementary information on bulk organic matter to understand biases linked to controlling factors or analytical methods, and to provide a robust interpretation. Fluorescence methods have often been employed to characterize and quantify organic matter. However, these techniques are mainly developed for liquid samples, inducing loss of material and resolution when working on natural archives (either stalagmites or sediments). High-resolution solid phase fluorescence (SPF) was developed on speleothems. This method now allows organic matter quality and quantity to be analysed, provided procedures to constrain the optical density are adopted. In fact, a calibration method using liquid phase fluorescence (LPF) was developed for speleothems, allowing organic carbon to be quantified at high resolution. We report here an application of such an SPF/LPF measurement procedure to lake sediments. In order to avoid sediment matrix effects on the fluorescence signal, a calibration using LPF measurements was realised. First results using this method provided a high-resolution record of the quality of different organic matter compounds (humic-like, protein-like and chlorophyll-like compounds) for the sediment core. High-resolution organic matter fluxes are then obtained by applying pragmatic chemometric models (non-linear models, partial least squares models) to the high-resolution fluorescence data. The SPF method can be considered a promising tool for high-resolution records of organic matter quality and quantity. Potential applications of this method (lake ecosystem dynamics, changes in trophic levels) will be discussed.
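As an illustration of the chemometric step mentioned above, the sketch below fits a partial least squares model relating synthetic fluorescence spectra to organic carbon values; the data, the number of components and the preprocessing are all placeholders, not the calibration actually built for the core.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins: 40 calibration samples x 120 emission wavelengths, plus a
# hidden "organic carbon" value that drives two broad fluorescence bands.
n_samples, n_wavelengths = 40, 120
wavelengths = np.linspace(300, 600, n_wavelengths)
carbon = rng.uniform(0.5, 5.0, n_samples)  # mg C / g, invented
band1 = np.exp(-((wavelengths - 420) / 30.0) ** 2)
band2 = np.exp(-((wavelengths - 520) / 40.0) ** 2)
spectra = (np.outer(carbon, band1) + 0.4 * np.outer(carbon, band2)
           + rng.normal(0.0, 0.05, (n_samples, n_wavelengths)))

# Partial least squares regression: spectra -> organic carbon.
pls = PLSRegression(n_components=3)
pls.fit(spectra, carbon)
predicted = pls.predict(spectra).ravel()

rmse = np.sqrt(np.mean((predicted - carbon) ** 2))
print(f"calibration RMSE: {rmse:.3f} (same units as the carbon values)")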
Femtosecond laser dissection in C. elegans neural circuits
NASA Astrophysics Data System (ADS)
Samuel, Aravinthan D. T.; Chung, Samuel H.; Clark, Damon A.; Gabel, Christopher V.; Chang, Chieh; Murthy, Venkatesh; Mazur, Eric
2006-02-01
The nematode C. elegans, a millimeter-long roundworm, is a well-established model organism for studies of neural development and behavior; however, physiological methods to manipulate and monitor the activity of its neural network have lagged behind the development of powerful methods in genetics and molecular biology. The small size and transparency of C. elegans make the worm an ideal test-bed for the development of physiological methods derived from optics and microscopy. We present the development and application of a new physiological tool: femtosecond laser dissection, which allows us to selectively ablate segments of individual neural fibers within live C. elegans. Femtosecond laser dissection provides a scalpel with submicrometer resolution, and we discuss its application in studies of neural growth, regenerative growth, and the neural basis of behavior.
ERIC Educational Resources Information Center
Edmund, Norman W.
This booklet introduces a new and general approach to the scientific method for everyone. Teaching the scientific method to all students allows them to develop their own talents and is necessary to prevent the loss of jobs. Many job areas that require scientific methodology are listed. Harmful results that may occur because of not teaching the…
Improvement of calculation method for electrical parameters of short network of ore-thermal furnaces
NASA Astrophysics Data System (ADS)
Aliferov, A. I.; Bikeev, R. A.; Goreva, L. P.
2017-10-01
The paper describes a new method for calculating the active and inductive resistance of split interleaved current-lead packages in ore-thermal electric furnaces. The method is developed on the basis of regression analysis of the dependence of the active and inductive resistances of the packages on their geometrical parameters, mutual disposition and interleaving pattern. These multi-parametric calculations have been performed with ANSYS software. The proposed method allows the problems of minimizing and balancing the electrical parameters of split current leads in ore-thermal furnaces to be solved.
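The regression step described above can be pictured with the small sketch below; the geometric parameters, the functional form and the data are invented placeholders (the actual dependencies were derived from ANSYS simulations), so this only shows the mechanics of fitting a resistance against package geometry.

import numpy as np

# Invented data: for each package configuration, the bus-bar width w (m), the gap
# between packages g (m), and the simulated active resistance R (micro-ohm).
w = np.array([0.10, 0.10, 0.15, 0.15, 0.20, 0.20, 0.25, 0.25])
g = np.array([0.05, 0.10, 0.05, 0.10, 0.05, 0.10, 0.05, 0.10])
R = np.array([42.0, 44.5, 35.1, 37.0, 30.2, 31.8, 27.0, 28.4])

# Multiple linear regression R ~ b0 + b1*(1/w) + b2*g via least squares.
design = np.column_stack([np.ones_like(w), 1.0 / w, g])
coeffs, residuals, rank, _ = np.linalg.lstsq(design, R, rcond=None)

b0, b1, b2 = coeffs
print(f"R approx. {b0:.2f} + {b1:.2f}/w + {b2:.1f}*g  (micro-ohm)")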
Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole; Buus, Søren; Nielsen, Morten
2011-01-01
Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale thereby enabling entirely new “omics”-based approaches towards the analysis of complex biological processes. However, the amount and complexity of data that even a single experiment can produce seriously challenges researchers with limited bioinformatics expertise, who need to handle, analyze and interpret the data before it can be understood in a biological context. Thus, there is an unmet need for tools allowing non-bioinformatics users to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can be used as prediction method and applied to unknown proteins/peptides. We have successfully applied this method to several different data sets including peptide microarray-derived sets containing more than 100,000 data points. NNAlign is available online at http://www.cbs.dtu.dk/services/NNAlign. PMID:22073191
Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole; Buus, Søren; Nielsen, Morten
2011-01-01
Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale thereby enabling entirely new "omics"-based approaches towards the analysis of complex biological processes. However, the amount and complexity of data that even a single experiment can produce seriously challenges researchers with limited bioinformatics expertise, who need to handle, analyze and interpret the data before it can be understood in a biological context. Thus, there is an unmet need for tools allowing non-bioinformatics users to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can be used as prediction method and applied to unknown proteins/peptides. We have successfully applied this method to several different data sets including peptide microarray-derived sets containing more than 100,000 data points. NNAlign is available online at http://www.cbs.dtu.dk/services/NNAlign.
NASA Astrophysics Data System (ADS)
Jung, C. C.; Stumpe, J.
2005-02-01
A new method, immersion transmission ellipsometry (ITE) [1], has been developed. It allows the highly accurate determination of the absolute three-dimensional (3D) refractive indices of anisotropic thin films. The method is combined with conventional ellipsometry in transmission and reflection, and the thickness determination of anisotropic films solely by optical methods also becomes more accurate. The method is applied to the determination of the 3D refractive indices of thin spin-coated films of an azobenzene-containing liquid-crystalline copolymer. The development of the anisotropy in these films by photo-orientation and subsequent annealing is demonstrated. Depending on the annealing temperature, oblate or prolate order is generated.
Principles and Applications of Liquid Chromatography-Mass Spectrometry in Clinical Biochemistry
Pitt, James J
2009-01-01
Liquid chromatography-mass spectrometry (LC-MS) is now a routine technique with the development of electrospray ionisation (ESI) providing a simple and robust interface. It can be applied to a wide range of biological molecules and the use of tandem MS and stable isotope internal standards allows highly sensitive and accurate assays to be developed although some method optimisation is required to minimise ion suppression effects. Fast scanning speeds allow a high degree of multiplexing and many compounds can be measured in a single analytical run. With the development of more affordable and reliable instruments, LC-MS is starting to play an important role in several areas of clinical biochemistry and compete with conventional liquid chromatography and other techniques such as immunoassay. PMID:19224008
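A minimal numerical illustration of the stable-isotope internal-standard approach mentioned above is given below; the peak areas and calibration points are hypothetical, and a real assay would use a properly validated calibration curve and quality controls.

import numpy as np

# Hypothetical calibration: known analyte concentrations (nmol/L) and the measured
# analyte / labelled-internal-standard peak-area ratios from the LC-MS/MS run.
cal_conc = np.array([10.0, 25.0, 50.0, 100.0, 250.0])
cal_ratio = np.array([0.21, 0.52, 1.05, 2.08, 5.15])

# Fit the response ratio as a straight line in concentration.
slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

# Unknown sample: measured peak areas for the analyte and the internal standard.
area_analyte, area_internal_standard = 84200.0, 65100.0
sample_ratio = area_analyte / area_internal_standard

# Back-calculate the concentration from the calibration line.
sample_conc = (sample_ratio - intercept) / slope
print(f"estimated concentration: {sample_conc:.1f} nmol/L")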
Methodology for determining the investment attractiveness of construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Nezhnikova, Ekaterina; Kashirin, Valentin; Davydova, Yana; Kazakova, Svetlana
2018-03-01
The article presents an analysis of the existing methods for assessing the investment attractiveness of high-rise construction. The authors determined and justified the primary choice of objects and territories that are the most attractive for the development of high-rise construction. A system of risk indicators has been developed that allows a quantitative adjustment to be made for a particular project when evaluating the efficiency of investment projects. The study is aimed at developing basic methodological concepts for a comparative evaluation of the prospects of construction of high-rise facilities, concepts that take into consideration the features of investment in construction and enable quantitative evaluation of investment effectiveness in high-rise construction.
1999-01-01
contaminating the surface. Research efforts to develop an improved sampling method have previously been limited to deposits made from solutions of explosives...explosive per fingerprint calculated in this way has too much variation to allow determination of sampling efficiency or to use this method to prepare...crystals is put into suspension, the actual amount is determined by usual methods including high-performance liquid chromatography (HPLC), gas
Value and Methods for Molecular Subtyping of Bacteria
NASA Astrophysics Data System (ADS)
Moorman, Mark; Pruett, Payton; Weidman, Martin
Tracking sources of microbial contaminants has been a concern since the early days of commercial food processing; however, recent advances in the development of molecular subtyping methods have provided tools that allow more rapid and highly accurate determinations of these sources. Only individuals with an understanding of the molecular subtyping methods, and the epidemiological techniques used, can evaluate the reliability of a link between a food-manufacturing plant, a food, and a foodborne disease outbreak.
Frequency-dependent FDTD methods using Z transforms
NASA Technical Reports Server (NTRS)
Sullivan, Dennis M.
1992-01-01
While the frequency-dependent finite-difference time-domain, or (FD)2TD, method can correctly calculate EM propagation through media whose dielectric properties are frequency-dependent, more elaborate applications lead to greater (FD)2TD complexity. Z-transform theory is presently used to develop the mathematical bases of the (FD)2TD method, simultaneously obtaining a clearer formulation and allowing researchers to draw on the existing literature of systems analysis and signal-processing.
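As a hedged illustration of how a frequency-dependent constitutive relation becomes a recursive time-domain update under the Z-transform, consider a single-pole Debye medium; the notation and sampling rule below are a generic textbook-style treatment, not a reproduction of the paper's derivation.

D(\omega) = \varepsilon_0 \varepsilon_\infty E(\omega) + \varepsilon_0 \, \frac{\varepsilon_s - \varepsilon_\infty}{1 + j\omega\tau} \, E(\omega)

Sampling the Debye susceptibility with step \Delta t and taking the Z-transform turns the dispersive term into an auxiliary accumulator,

I^n = e^{-\Delta t/\tau} I^{n-1} + \frac{(\varepsilon_s - \varepsilon_\infty)\,\Delta t}{\tau} \, E^n,

so that D^n = \varepsilon_0 \varepsilon_\infty E^n + \varepsilon_0 I^n, and the field update at each time step solves

E^n = \frac{D^n/\varepsilon_0 - e^{-\Delta t/\tau} I^{n-1}}{\varepsilon_\infty + (\varepsilon_s - \varepsilon_\infty)\Delta t/\tau}.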
Hardisty, Frank; Robinson, Anthony C.
2010-01-01
In this paper we present the GeoViz Toolkit, an open-source, internet-delivered program for geographic visualization and analysis that features a diverse set of software components which can be flexibly combined by users who do not have programming expertise. The design and architecture of the GeoViz Toolkit allows us to address three key research challenges in geovisualization: allowing end users to create their own geovisualization and analysis component set on-the-fly, integrating geovisualization methods with spatial analysis methods, and making geovisualization applications sharable between users. Each of these tasks necessitates a robust yet flexible approach to inter-tool coordination. The coordination strategy we developed for the GeoViz Toolkit, called Introspective Observer Coordination, leverages and combines key advances in software engineering from the last decade: automatic introspection of objects, software design patterns, and reflective invocation of methods. PMID:21731423
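As a loose illustration of the coordination idea named above (not the GeoViz Toolkit's actual Java implementation; the component names and event are invented), the sketch below introspects two toy components, discovers a matching setter for an event the other publishes, and wires them together reflectively in an observer style.

import inspect

class SelectionPublisher:
    """Toy component that publishes a 'selection' event to its observers."""
    def __init__(self):
        self._observers = []  # (object, bound setter) pairs discovered by introspection
    def subscribe(self, observer, setter):
        self._observers.append((observer, setter))
    def select(self, ids):
        for _, setter in self._observers:
            setter(ids)       # reflective invocation of the discovered method

class MapView:
    def set_selection(self, ids):
        print("MapView highlighting:", ids)

class ScatterPlot:
    def set_selection(self, ids):
        print("ScatterPlot highlighting:", ids)

def wire(publisher, component, event_name="selection"):
    """Introspect the component for a 'set_<event>' method and subscribe it."""
    for name, member in inspect.getmembers(component, predicate=inspect.ismethod):
        if name == f"set_{event_name}":
            publisher.subscribe(component, member)

publisher = SelectionPublisher()
for view in (MapView(), ScatterPlot()):
    wire(publisher, view)
publisher.select([3, 7, 42])  # both views receive the selection without hard-coded links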
Controlled assembly of jammed colloidal shells on fluid droplets.
Subramaniam, Anand Bala; Abkarian, Manouk; Stone, Howard A
2005-07-01
Assembly of colloidal particles on fluid interfaces is a promising technique for synthesizing two-dimensional microcrystalline materials useful in fields as diverse as biomedicine, materials science, mineral flotation and food processing. Current approaches rely on bulk emulsification methods, require further chemical and thermal treatments, and are restrictive with respect to the materials used. The development of methods that exploit the great potential of interfacial assembly for producing tailored materials have been hampered by the lack of understanding of the assembly process. Here we report a microfluidic method that allows direct visualization and understanding of the dynamics of colloidal crystal growth on curved interfaces. The crystals are periodically ejected to form stable jammed shells, which we refer to as colloidal armour. We propose that the energetic barriers to interfacial crystal growth and organization can be overcome by targeted delivery of colloidal particles through hydrodynamic flows. Our method allows an unprecedented degree of control over armour composition, size and stability.
Controlled assembly of jammed colloidal shells on fluid droplets
NASA Astrophysics Data System (ADS)
Subramaniam, Anand Bala; Abkarian, Manouk; Stone, Howard A.
2005-07-01
Assembly of colloidal particles on fluid interfaces is a promising technique for synthesizing two-dimensional microcrystalline materials useful in fields as diverse as biomedicine, materials science, mineral flotation and food processing. Current approaches rely on bulk emulsification methods, require further chemical and thermal treatments, and are restrictive with respect to the materials used. The development of methods that exploit the great potential of interfacial assembly for producing tailored materials have been hampered by the lack of understanding of the assembly process. Here we report a microfluidic method that allows direct visualization and understanding of the dynamics of colloidal crystal growth on curved interfaces. The crystals are periodically ejected to form stable jammed shells, which we refer to as colloidal armour. We propose that the energetic barriers to interfacial crystal growth and organization can be overcome by targeted delivery of colloidal particles through hydrodynamic flows. Our method allows an unprecedented degree of control over armour composition, size and stability.
An Automated Method for Navigation Assessment for Earth Survey Sensors Using Island Targets
NASA Technical Reports Server (NTRS)
Patt, F. S.; Woodward, R. H.; Gregg, W. W.
1997-01-01
An automated method has been developed for performing navigation assessment on satellite-based Earth sensor data. The method utilizes islands as targets which can be readily located in the sensor data and identified with reference locations. The essential elements are an algorithm for classifying the sensor data according to source, a reference catalogue of island locations, and a robust pattern-matching algorithm for island identification. The algorithms were developed and tested for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), an ocean colour sensor. This method will allow navigation error statistics to be automatically generated for large numbers of points, supporting analysis over large spatial and temporal ranges.
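A toy version of the matching and error-statistics step is sketched below; the island catalogue, the detected centroids and the distance tolerance are invented, and the matching is a plain nearest-neighbour search rather than the robust pattern-matching algorithm developed for SeaWiFS.

import math

# Hypothetical reference catalogue of island locations (lat, lon in degrees)
# and island centroids detected in the sensor data after classification.
catalogue = {
    "island_A": (10.00, 150.00),
    "island_B": (12.50, 151.25),
    "island_C": (15.10, 153.40),
}
detected = [(10.02, 150.03), (12.47, 151.22), (15.13, 153.36)]

MATCH_TOLERANCE_DEG = 0.2  # assumed maximum separation for a valid identification

errors = []
for lat, lon in detected:
    # Nearest catalogue island (small-area approximation: treat degrees as planar).
    name, (clat, clon) = min(
        catalogue.items(), key=lambda kv: math.hypot(kv[1][0] - lat, kv[1][1] - lon)
    )
    separation = math.hypot(clat - lat, clon - lon)
    if separation <= MATCH_TOLERANCE_DEG:
        errors.append((name, clat - lat, clon - lon))

lat_bias = sum(e[1] for e in errors) / len(errors)
lon_bias = sum(e[2] for e in errors) / len(errors)
print(f"matched {len(errors)} islands; mean error: {lat_bias:+.3f} deg lat, {lon_bias:+.3f} deg lon")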
Evaluation of the flexibility of protective gloves.
Harrabi, Lotfi; Dolez, Patricia I; Vu-Khanh, Toan; Lara, Jaime
2008-01-01
Two mechanical methods have been developed for the characterization of the flexibility of protective gloves, a key factor affecting their degree of usefulness for workers. The principle of the first method is similar to the ASTM D 4032 standard relative to fabric stiffness and simulates the deformations encountered by gloves that are not tight fitted to the hand. The second method characterizes the flexibility of gloves that are worn tight fitted. Its validity was theoretically verified for elastomer materials. Both methods should prove themselves as valuable tools for protective glove manufacturers, allowing for the characterization of their existing products in terms of flexibility and the development of new ones better fitting workers' needs.
Survey Methods for Educators: Collaborative Survey Development (Part 1 of 3). REL 2016-163
ERIC Educational Resources Information Center
Irwin, Clare W.; Stafford, Erin T.
2016-01-01
This guide describes a five-step collaborative process that educators can use with other educators, researchers, and content experts to write or adapt questions and develop surveys for education contexts. This process allows educators to leverage the expertise of individuals within and outside of their organization to ensure a high-quality survey…
Code of Federal Regulations, 2014 CFR
2014-01-01
..., Specifications, Contract Documents and Other Documentation Part D—Inspection of Development Work Part A..., installation, device, arrangement, or method of work is at least equivalent to that prescribed in this exhibit..., within allowable stress and settlement limitations, all applicable loads. Any foundation and anchorage...
Code of Federal Regulations, 2011 CFR
2011-01-01
..., Specifications, Contract Documents and Other Documentation Part D—Inspection of Development Work Part A..., installation, device, arrangement, or method of work is at least equivalent to that prescribed in this exhibit..., within allowable stress and settlement limitations, all applicable loads. Any foundation and anchorage...
Code of Federal Regulations, 2012 CFR
2012-01-01
..., Specifications, Contract Documents and Other Documentation Part D—Inspection of Development Work Part A..., installation, device, arrangement, or method of work is at least equivalent to that prescribed in this exhibit..., within allowable stress and settlement limitations, all applicable loads. Any foundation and anchorage...
ERIC Educational Resources Information Center
Khan, Natalya N.; Kolumbayeva, Sholpan Zh.; Karsybayeva, Raissa K.; Nabuova, Roza A.; Kurmanbekova, Manshuk B.; Syzdykbayeva, Aigul Dzh.
2016-01-01
To develop research competence in prospective elementary school teachers, a system of methods for diagnosing and forming this competence during the training process is designed. To diagnose the research competence, a series of techniques was used that allows subtle evaluation of each component of research competence:…
Middle School Students' Reasoning about 3-Dimensional Objects: A Case Study
ERIC Educational Resources Information Center
Okumus, Samet
2016-01-01
According to the National Council of Teacher of Mathematics (NCTM) (2000), K-12 students should be given an opportunity to develop their spatial reasoning abilities. One of the topics that may allow students to develop their spatial skills is forming 3-dimensional objects using spinning and extrusion methods. Also, extrusion and spinning methods…
ERIC Educational Resources Information Center
Mirzaei, Maryam Sadat; Meshgi, Kourosh; Akita, Yuya; Kawahara, Tatsuya
2017-01-01
This paper introduces a novel captioning method, partial and synchronized captioning (PSC), as a tool for developing second language (L2) listening skills. Unlike conventional full captioning, which provides the full text and allows comprehension of the material merely by reading, PSC promotes listening to the speech by presenting a selected…
Guyennon, Nicolas; Cerretto, Giancarlo; Tavella, Patrizia; Lahaye, François
2009-08-01
In recent years, many national timing laboratories have installed geodetic Global Positioning System receivers together with their traditional GPS/GLONASS Common View receivers and Two Way Satellite Time and Frequency Transfer equipment. Many of these geodetic receivers operate continuously within the International GNSS Service (IGS), and their data are regularly processed by IGS Analysis Centers. From its global network of over 350 stations and its Analysis Centers, the IGS generates precise combined GPS ephemerides and station and satellite clock time series referred to the IGS Time Scale. A processing method called Precise Point Positioning (PPP) is in use in the geodetic community, allowing precise recovery of GPS antenna position, clock phase, and atmospheric delays by taking advantage of these IGS precise products. Previous assessments, carried out at Istituto Nazionale di Ricerca Metrologica (INRiM; formerly IEN) with a PPP implementation developed at Natural Resources Canada (NRCan), showed that PPP clock solutions have better stability over the short/medium term than the GPS CV and GPS P3 methods and significantly reduce day-boundary discontinuities when used in multi-day continuous processing, allowing time-limited, campaign-style time-transfer experiments. This paper reports on follow-on work performed at INRiM and NRCan to further characterize and develop the PPP method for time transfer applications, using data from some of the National Metrology Institutes. We develop a processing procedure that takes advantage of the improved stability of the phase-connected multi-day PPP solutions while allowing the generation of continuous clock time series, more applicable to continuous operation/monitoring of timing equipment.
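To make the day-boundary effect concrete, the sketch below compares independent daily clock solutions at the day joins; the clock values are synthetic, and the simple "last epoch of day k versus first epoch of day k+1" difference is only an illustrative metric, not the processing procedure of the papers above.

import numpy as np

rng = np.random.default_rng(7)

# Synthetic receiver clock solutions (nanoseconds), 288 epochs per day (5-min spacing),
# solved independently for each of 5 days so each day carries its own small offset.
epochs_per_day = 288
days = 5
true_clock = np.cumsum(rng.normal(0.0, 0.02, size=days * epochs_per_day))  # slow random walk
daily_offsets = rng.normal(0.0, 0.3, size=days)                            # per-day solution bias
daily_solutions = [
    true_clock[d * epochs_per_day:(d + 1) * epochs_per_day] + daily_offsets[d]
    for d in range(days)
]

# Day-boundary discontinuity: jump between the last epoch of one daily solution
# and the first epoch of the next one.
jumps = [
    daily_solutions[d + 1][0] - daily_solutions[d][-1]
    for d in range(days - 1)
]
print("day-boundary jumps (ns):", np.round(jumps, 3))
print("RMS jump (ns):", round(float(np.sqrt(np.mean(np.square(jumps)))), 3))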
The Reintegration of Military Families Following Long Term Separation
2005-05-01
qualitative and quantitative research methods, allowed the researcher to maintain the importance of the research question as well as to better... discovering how families succeed and developing effective social work interventions is more critical than ever. Using a mixed methods design, this...
An interactive programme for weighted Steiner trees
NASA Astrophysics Data System (ADS)
Zanchetta do Nascimento, Marcelo; Ramos Batista, Valério; Raffa Coimbra, Wendhel
2015-01-01
We introduce a fully programmed code, with a supervised method, for generating weighted Steiner trees. Our choice of programming language, and the use of well-known theorems from Geometry and Complex Analysis, allowed this method to be implemented with only 764 lines of effective source code. This eases the understanding and handling of this beta version for future developments.
NASA Astrophysics Data System (ADS)
Richert, Hendryk; Surzhenko, Oleksy; Wangemann, Sebastian; Heinrich, Jochen; Görnert, Peter
2005-05-01
A method for active drug delivery inside the human digestive system is proposed. This method allows a magnetically marked capsule to be localised on its natural passage through the digestive system and opened at a desired position. Thus, the procedure comprises two important components: magnetic monitoring and active drug release.
Computer technique for simulating the combustion of cellulose and other fuels
Andrew M. Stein; Brian W. Bauske
1971-01-01
A computer method has been developed for simulating the combustion of wood and other cellulosic fuels. The products of combustion are used as input for a convection model that simulates real fires. The method allows the chemical process to proceed to equilibrium and then examines the effects of mass addition and repartitioning on the fluid mechanics of the convection...
ERIC Educational Resources Information Center
Ballon, Bruce C.; Silver, Ivan; Fidler, Donald
2007-01-01
Objective: Headspace Theater has been developed to allow small group learning of psychiatric conditions by creating role-play situations in which participants are placed in a scenario that simulates the experience of the condition. Method: The authors conducted a literature review of role-playing techniques, interactive teaching, and experiential…
The geography of maternal and newborn health: the state of the art.
Ebener, Steeve; Guerra-Arias, Maria; Campbell, James; Tatem, Andrew J; Moran, Allisyn C; Amoako Johnson, Fiifi; Fogstad, Helga; Stenberg, Karin; Neal, Sarah; Bailey, Patricia; Porter, Reid; Matthews, Zoe
2015-05-27
As the deadline for the millennium development goals approaches, it has become clear that the goals linked to maternal and newborn health are the least likely to be achieved by 2015. It is therefore critical to ensure that all possible data, tools and methods are fully exploited to help address this gap. Among the methods that are under-used, mapping has always represented a powerful way to 'tell the story' of a health problem in an easily understood way. In addition to this, the advanced analytical methods and models now being embedded into Geographic Information Systems allow a more in-depth analysis of the causes behind adverse maternal and newborn health (MNH) outcomes. This paper examines the current state of the art in mapping the geography of MNH as a starting point to unleashing the potential of these under-used approaches. Using a rapid literature review and the description of the work currently in progress, this paper allows the identification of methods in use and describes a framework for methodological approaches to inform improved decision-making. The paper is aimed at health metrics and geography of health specialists, the MNH community, as well as policy-makers in developing countries and international donor agencies.
Low-Dead-Volume Inlet for Vacuum Chamber
NASA Technical Reports Server (NTRS)
Naylor, Guy; Arkin, C.
2010-01-01
Gas introduction from near-ambient pressures to high vacuum traditionally is accomplished either by multi-stage differential pumping, which allows for very rapid response, or by a capillary method, which allows for a simple, single-stage introduction but often has a delayed response. Another means of introducing the gas sample is to use the multi-stage design with only a single stage. This is accomplished by using a very small conductance limit. The problem with this method is that a small conductance limit will amplify issues associated with dead volume. As a result, a high-vacuum gas inlet was developed with low dead volume, allowing the use of a very low conductance limit interface. Gas flows through the ConFlat flange at a flow rate orders of magnitude greater than through the conductance limit. The small flow goes through a conductance limit that is a double-sided ConFlat.
Low-Dead-Volume Inlet for Vacuum Chamber
NASA Technical Reports Server (NTRS)
Naylor, Guy; Arkin, C.
2011-01-01
Gas introduction from near-ambient pressures to high vacuum traditionally is accomplished either by multi-stage differential pumping that allows for very rapid response, or by a capillary method that allows for a simple, single-stage introduction, but which often has a delayed response. Another means to introduce the gas sample is to use the multi-stage design with only a single stage. This is accomplished by using a very small conductance limit. The problem with this method is that a small conductance limit will amplify issues associated with dead-volume. As a result, a high-vacuum gas inlet was developed with low dead-volume, allowing the use of a very low conductance limit interface. Gas flows through the ConFlat flange at a relatively high flow rate, orders of magnitude greater than that through the conductance limit. The small flow goes through a conductance limit that is a double-sided ConFlat.
Ageing airplane repair assessment program for Airbus A300
NASA Technical Reports Server (NTRS)
Gaillardon, J. M.; Schmidt, Hans-J.; Brandecker, B.
1992-01-01
This paper describes the current status of the repair categorization activities and includes details of the methodologies developed for determining the inspection program for the skin of pressurized fuselages. For inspection threshold determination, two methods based on a fatigue life approach are defined: a simplified and a detailed method. The detailed method considers 15 different parameters to assess the influences of material, geometry, size, location, aircraft usage, and workmanship on the fatigue life of the repair and the original structure. For the definition of the inspection intervals, a general method is developed which applies to all concerned repairs. For this, the initial flaw concept is used, considering 6 parameters and the detectable flaw sizes depending on the proposed nondestructive inspection methods. An alternative method is provided for small repairs, allowing visual inspection with shorter intervals.
Kim, Shokaku; Shoji, Takao; Kitano, Yoshikazu; Chiba, Kazuhiro
2013-07-25
We have developed a highly efficient synthetic method for azanucleosides using a lithium perchlorate-nitromethane reaction medium, allowing direct and exclusive installation of various nucleophiles, including protected nucleobases, into prolinol derivatives at the preferred 5-position.
APPLYING TOXICITY IDENTIFICATION PROCEDURES TO FIELD COLLECTED SEDIMENTS
Identification of specific causes of sediment toxicity can allow for much more focused risk assessment and management decision making. We have been developing toxicity identification evaluation (TIE) methods for contaminated sediments and focusing on three toxicant groups (ammoni...
Method and apparatus for phase and amplitude detection
Cernosek, Richard W.; Frye, Gregory C.; Martin, Stephen J.
1998-06-09
A new class of techniques has been developed that allows inexpensive application of SAW-type chemical sensor devices while retaining high sensitivity (ppm) for chemical detection. The new techniques do not require that the sensor be part of an oscillatory circuit, allowing large concentrations of, e.g., chemical vapors in air, to be accurately measured without compromising the capacity to measure trace concentrations. Such devices have numerous potential applications in environmental monitoring, from manufacturing environments to environmental restoration.
Muniroh, M S; Sariah, M; Zainal Abidin, M A; Lima, N; Paterson, R R M
2014-05-01
Detection of basal stem rot (BSR) of oil palms caused by Ganoderma has been based on foliar symptoms and the production of basidiomata. Enzyme-Linked Immunosorbent Assay-Polyclonal Antibody (ELISA-PAB) and PCR have been proposed as early detection methods for the disease. These techniques are complex, time consuming and have accuracy limitations. An ergosterol method was developed which correlated well with the degree of infection in oil palms, including samples growing in plantations. However, the method was capable of being optimised. The current study was designed to develop a simpler, more rapid and efficient ergosterol method with utility in the field that involved the use of microwave extraction. The optimised procedure involved extracting a small amount of Ganoderma, or Ganoderma-infected oil palm, suspended in low volumes of solvent, followed by irradiation in a conventional microwave oven at 70°C and medium-high power for 30 s, resulting in simultaneous extraction and saponification. Ergosterol was detected by thin layer chromatography (TLC) and quantified using high performance liquid chromatography with diode array detection. The TLC method was novel and provided a simple, inexpensive method with utility in the field. The new method was particularly effective at extracting high yields of ergosterol from infected oil palm and enables rapid analysis of field samples on site, allowing infected oil palms to be treated or culled very rapidly. Some limitations of the method are discussed herein. The procedures lend themselves to controlling the disease more effectively and allowing more effective use of land currently employed to grow oil palms, thereby reducing pressure to develop new plantations. Copyright © 2014 Elsevier B.V. All rights reserved.
The use of generalised additive models (GAM) in dentistry.
Helfenstein, U; Steiner, M; Menghini, G
1997-12-01
Ordinary multiple regression and logistic multiple regression are widely applied statistical methods which allow a researcher to 'explain' or 'predict' a response variable from a set of explanatory variables or predictors. In these models it is usually assumed that quantitative predictors such as age enter linearly into the model. During recent years these methods have been further developed to allow more flexibility in the way explanatory variables 'act' on a response variable. The methods are called 'generalised additive models' (GAM). The rigid linear terms characterising the association between response and predictors are replaced in an optimal way by flexible curved functions of the predictors (the 'profiles'). Plotting the 'profiles' allows the researcher to visualise easily the shape by which predictors 'act' over the whole range of values. The method facilitates detection of particular shapes such as 'bumps', 'U-shapes', 'J-shapes', 'threshold values', etc. Information about the shape of the association is not revealed by traditional methods. The shapes of the profiles may be checked by performing a Monte Carlo simulation ('bootstrapping'). After the presentation of the GAM, a relevant case study is presented in order to demonstrate application and use of the method. The dependence of caries in primary teeth on a set of explanatory variables is investigated. Since GAMs may not be easily accessible to dentists, this article presents them in an introductory condensed form. It was thought that a nonmathematical summary and a worked example might encourage readers to consider the methods described. GAMs may be of great value to dentists in allowing visualisation of the shape by which predictors 'act' and obtaining a better understanding of the complex relationships between predictors and response.
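As a rough illustration of the idea described above (not the authors' analysis, which used dedicated GAM software), the sketch below fits a flexible spline 'profile' for a single predictor with a logistic link and plots it. The simulated age/caries data and the use of scikit-learn's SplineTransformer in place of a GAM package are assumptions made purely for the example.

```python
# Minimal sketch of the GAM idea: replace a linear term in age with a flexible
# spline "profile", then plot the fitted profile. Data and variable names are
# hypothetical, not the paper's dental dataset.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
age = rng.uniform(2, 8, 500)                                   # child age in years (simulated)
p_caries = 1 / (1 + np.exp(-(-3 + 1.5 * np.abs(age - 5))))     # U-shaped "true" risk
caries = rng.binomial(1, p_caries)                             # 0/1 response

# Additive model with a smooth term for age (logistic link)
gam_like = make_pipeline(SplineTransformer(n_knots=6, degree=3),
                         LogisticRegression(C=1.0, max_iter=1000))
gam_like.fit(age.reshape(-1, 1), caries)

# Plot the fitted "profile" of age on the probability scale
grid = np.linspace(2, 8, 200).reshape(-1, 1)
plt.plot(grid, gam_like.predict_proba(grid)[:, 1])
plt.xlabel("age (years)"); plt.ylabel("fitted probability of caries")
plt.show()
```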
Prediction of essential oil content of oregano by hand-held and Fourier transform NIR spectroscopy.
Camps, Cédric; Gérard, Marianne; Quennoz, Mélanie; Brabant, Cécile; Oberson, Carine; Simonnet, Xavier
2014-05-01
In the framework of a breeding programme, the analysis of hundreds of oregano samples to determine their essential oil content (EOC) is time-consuming and expensive in terms of labour. Therefore developing a new method that is rapid, accurate and less expensive to use would be an asset to breeders. The aim of the present study was to develop a method based on near-infrared (NIR) spectroscopy to determine the EOC of oregano dried powder. Two spectroscopic approaches were compared, the first using a hand-held NIR device and the second a Fourier transform (FT) NIR spectrometer. Hand-held NIR (1000-1800 nm) measurements and partial least squares regression allowed the determination of EOC with R² and SEP values of 0.58 and 0.81 mL per 100 g dry matter (DM) respectively. Measurements with FT-NIR (1000-2500 nm) allowed the determination of EOC with R² and SEP values of 0.91 and 0.68 mL per 100 g DM respectively. RPD, RER and RPIQ values for the model implemented with FT-NIR data were satisfactory for screening application, while those obtained with hand-held NIR data were below the level required to consider the model as accurate enough for screening application. The FT-NIR approach allowed the development of an accurate model for EOC prediction. Although the hand-held NIR approach is promising, it needs additional development before it can be used in practice. © 2013 Society of Chemical Industry.
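The calibration step described above (PLS regression of EOC on NIR spectra, reported via R², SEP and RPD) can be sketched as follows. The spectra and oil contents are simulated stand-ins, and the choice of ten latent variables is an arbitrary assumption rather than the study's optimised value.

```python
# Illustrative PLS calibration sketch; not the study's data or tuned model.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 200, 500
X = rng.normal(size=(n_samples, n_wavelengths))             # NIR absorbance spectra (simulated)
true_coef = rng.normal(size=n_wavelengths) * 0.05
y = X @ true_coef + rng.normal(scale=0.5, size=n_samples)   # EOC, mL / 100 g DM (simulated)

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

pls = PLSRegression(n_components=10)        # number of latent variables to tune in practice
pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()

r2 = r2_score(y_val, y_pred)
sep = np.sqrt(mean_squared_error(y_val, y_pred))   # standard error of prediction
rpd = np.std(y_val) / sep                          # ratio of performance to deviation
print(f"R2={r2:.2f}  SEP={sep:.2f}  RPD={rpd:.2f}")
```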
Mahajan, Rishi; Chatterjee, Subhankar
2018-05-05
Indiscriminate use of two broad-spectrum pesticides, profenofos and fenthion, in agricultural systems often results in their accumulation in non-target niches and leaching into water bodies. The present study therefore aims at developing a simple and rapid HPLC method that allows simultaneous extraction and detection of these two pesticides, especially in run-off water. Extraction of the two pesticides from spiked water samples using dichloromethane resulted in recoveries ranging between 80 and 90%. An HPLC run of 20 min under optimized chromatographic parameters (mobile phase: methanol (75%) and water (25%); flow rate of 0.8 ml min⁻¹; diode array detector at wavelength 210 nm) resulted in a significant difference in the retention times of the two pesticides (4.593 min), which provides a window of opportunity to study any possible intermediates/transformants of the parent compounds while evaluating run-off waters from agricultural fields. The HPLC method developed allowed simultaneous detection of profenofos and fenthion with a single injection into the HPLC system, with 0.0328 mg l⁻¹ (32.83 ng ml⁻¹) being the limit of detection (LOD) and 0.0995 mg l⁻¹ (99.5 ng ml⁻¹) the limit of quantification (LOQ) for fenthion; for profenofos, the LOD and LOQ were 0.104 mg l⁻¹ (104.50 ng ml⁻¹) and 0.316 mg l⁻¹ (316.65 ng ml⁻¹), respectively. The findings were further validated using a soil microcosm experiment that allowed simultaneous detection and quantification of profenofos and fenthion. The findings indicate the practical significance of the methodology developed, as the soil microcosm experiment closely mimics agricultural run-off water under natural environmental conditions.
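For readers unfamiliar with how LOD and LOQ values such as those above are typically derived, the following sketch applies the common ICH-style estimates (3.3·σ/slope and 10·σ/slope) to a calibration line. The concentrations and peak areas are invented placeholders, and the paper may have used a different estimation approach.

```python
# Hedged sketch of one common LOD/LOQ estimate from a calibration curve.
import numpy as np

conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])          # mg/L, spiked standards (hypothetical)
area = np.array([12.1, 24.5, 48.3, 99.0, 196.4, 401.2])   # HPLC-DAD peak areas (hypothetical)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # approximate standard error of the regression

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"LOD = {lod:.4f} mg/L, LOQ = {loq:.4f} mg/L")
```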
Identification of spider-mite species and their endosymbionts using multiplex PCR.
Zélé, Flore; Weill, Mylène; Magalhães, Sara
2018-02-01
Spider mites of the family Tetranychidae are severe crop pests. In the Mediterranean a few species coexist, but they are difficult to identify based on morphological characters. Additionally, spider mites often harbour several species of endosymbiotic bacteria, which may affect the biology of their hosts. Here, we propose novel, cost-effective, multiplex diagnostic methods allowing a quick identification of spider-mite species as well as of the endosymbionts they carry. First, we developed, and successfully multiplexed in a single PCR, primers to identify Tetranychus urticae, T. evansi and T. ludeni, some of the most common tetranychids found in southwest Europe. Moreover, we demonstrated that this method allows detecting multiple species in a single pool, even at low frequencies (up to 1/100), and can be used on entire mites without DNA extraction. Second, we developed another set of primers to detect spider-mite endosymbionts, namely Wolbachia, Cardinium and Rickettsia, in a multiplex PCR, along with a generalist spider-mite primer to control for potential failure of DNA amplification in each PCR. Overall, our method represents a simple, cost-effective and reliable way to identify spider-mite species and their symbionts in natural field populations, as well as to detect contaminations in laboratory rearings. This method may easily be extended to other species.
NASA Astrophysics Data System (ADS)
Vlasov, V. M.; Novikov, A. N.; Novikov, I. A.; Shevtsova, A. G.
2018-03-01
In highly developed urban agglomerations, one of the main problems is the inability of the road network to cope with a high level of motorization. The introduction of intelligent transport systems helps solve this problem, but the main issue in their implementation remains open: to what extent a given method of improving the transport network will be effective, and whether it can cope with vehicle growth, especially over the long term. The main goal of this work was the development of an approach to forecasting the increase in traffic flow intensity for a long-term period using the population and the level of motorization. The developed approach makes it possible to determine the projected population and, taking into account the level of motorization, to determine the growth factor of the traffic flow intensity, which allows the intensity value for a long-term period to be calculated with high accuracy. The main methods for predicting the characteristics of the traffic flow are analysed, and the basic values and parameters necessary for their use are established. The analysis of the urban settlement is carried out and the level of motorization characteristic of the given locality is determined. A new approach to predicting the intensity of the traffic flow has been developed, which makes it possible to predict the change in the transport situation in the long term with high accuracy. Calculations of the magnitude of the intensity increase on the basis of the developed forecasting method are made and the errors in the data obtained are determined. The main recommendations on the use of the developed forecasting approach for the long-term functioning of the road network are formulated.
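A minimal arithmetic sketch of the forecasting idea as summarised above: project the population, combine it with the motorization level to obtain a traffic growth factor, and scale the present intensity. All numbers are hypothetical, and the formula is only one plausible reading of the described approach.

```python
# Toy long-term traffic intensity forecast from population and motorization level.
years = 15
population_now = 350_000
population_growth_rate = 0.012            # per year (assumed)
motorization_now = 310                    # vehicles per 1000 inhabitants (assumed)
motorization_future = 390                 # projected level (assumed)

population_future = population_now * (1 + population_growth_rate) ** years
vehicles_now = population_now * motorization_now / 1000
vehicles_future = population_future * motorization_future / 1000

growth_factor = vehicles_future / vehicles_now
intensity_now = 1800                      # vehicles per hour on a key link (assumed)
intensity_future = intensity_now * growth_factor
print(f"growth factor = {growth_factor:.2f}, forecast intensity = {intensity_future:.0f} veh/h")
```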
Radiation Hard Bandpass Filters for Mid- to Far-IR Planetary Instruments
NASA Technical Reports Server (NTRS)
Brown, Ari D.; Aslam, Shahid; Chervenack, James A.; Huang, Wei-Chung; Merrell, Willie C.; Quijada, Manuel; Steptoe-Jackson, Rosalind; Wollack, Edward J.
2012-01-01
We present a novel method to fabricate compact metal mesh bandpass filters for use in mid- to far-infrared planetary instruments operating in the 20-600 micron wavelength spectral regime. Our target applications include thermal mapping instruments on ESA's JUICE as well as on a de-scoped JEO. These filters are novel because they are compact, customizable, free-standing copper mesh resonant bandpass filters with micromachined silicon support frames. The filters are well suited for thermal mapping missions to the outer planets and their moons because the filter material is radiation hard. Furthermore, the silicon support frame allows for effective hybridization with sensors made on silicon substrates. Using a Fourier Transform Spectrometer, we have demonstrated high transmittance within the passband as well as good out-of-band rejection [1]. In addition, we have developed a unique method of filter stacking in order to increase the bandwidth and sharpen the roll-off of the filters. This method allows one to reliably control the spacing between filters to within 2 microns. Furthermore, our method allows for reliable control over the relative position and orientation between the shared faces of the filters.
A NEW APPROACH TO THE STUDY OF MUCOADHESIVENESS OF POLYMERIC MEMBRANES USING SILICONE DISCS.
Nowak, Karolina Maria; Szterk, Arkadiusz; Fiedor, Piotr; Bodek, Kazimiera Henryka
2016-01-01
The introduction of new test methods and the modification of existing ones are crucial for obtaining reliable results, which contributes to the development of innovative materials that may have clinical applications. Today, silicone is commonly used in medicine and the diversity of its applications is continually growing. The aim of this study is to evaluate the mucoadhesiveness of polymeric membranes by a method that modifies existing test methods through the introduction of silicone discs. The matrices were designed for clinical application in the management of diseases within the oral cavity. The use of silicone discs allows reliable and reproducible results to be obtained and makes various tensometric measurements possible. In this study, different types of polymeric matrices and their crosslinking were examined, and matrices containing the active pharmaceutical ingredient were compared with the pure dosage form. Lidocaine hydrochloride (Lid(HCl)) was used as a model active substance, owing to its use in dentistry and its clinical safety. The results were characterized by high repeatability (RSD < 10.6%). The advantages of the silicone material, namely its mechanical strength and its chemical and physical resistance, allowed a new test method using a texture analyzer to be proposed.
Development of a Training Model for Laparoscopic Common Bile Duct Exploration
Rodríguez, Omaira; Benítez, Gustavo; Sánchez, Renata; De la Fuente, Liliana
2010-01-01
Background: Training and experience of the surgical team are fundamental for the safety and success of complex surgical procedures, such as laparoscopic common bile duct exploration. Methods: We describe an inert, simple, very low-cost, and readily available training model. Created using a “black box” and basic medical and surgical material, it allows training in the fundamental steps necessary for laparoscopic biliary tract surgery, namely (1) intraoperative cholangiography, (2) transcystic exploration, and (3) laparoscopic choledochotomy and T-tube insertion. Results: The proposed model has allowed for the development of the skills necessary for performing these procedures, contributing to their development and diminishing surgery time as the trainee advances down the learning curve. Further studies are directed towards objectively determining the impact of the model on skill acquisition. Conclusion: The described model is simple and readily available, allowing for accurate reproduction of the main steps and maneuvers that take place during laparoscopic common bile duct exploration, with the purpose of reducing failure and complications. PMID:20529526
Systematic evaluation of non-animal test methods for skin sensitisation safety assessment.
Reisinger, Kerstin; Hoffmann, Sebastian; Alépée, Nathalie; Ashikaga, Takao; Barroso, Joao; Elcombe, Cliff; Gellatly, Nicola; Galbiati, Valentina; Gibbs, Susan; Groux, Hervé; Hibatallah, Jalila; Keller, Donald; Kern, Petra; Klaric, Martina; Kolle, Susanne; Kuehnl, Jochen; Lambrechts, Nathalie; Lindstedt, Malin; Millet, Marion; Martinozzi-Teissier, Silvia; Natsch, Andreas; Petersohn, Dirk; Pike, Ian; Sakaguchi, Hitoshi; Schepky, Andreas; Tailhardat, Magalie; Templier, Marie; van Vliet, Erwin; Maxwell, Gavin
2015-02-01
The need for non-animal data to assess skin sensitisation properties of substances, especially cosmetics ingredients, has spawned the development of many in vitro methods. As it is widely believed that no single method can provide a solution, the Cosmetics Europe Skin Tolerance Task Force has defined a three-phase framework for the development of a non-animal testing strategy for skin sensitization potency prediction. The results of the first phase – systematic evaluation of 16 test methods – are presented here. This evaluation involved generation of data on a common set of ten substances in all methods and systematic collation of information including the level of standardisation, existing test data, potential for throughput, transferability and accessibility in cooperation with the test method developers. A workshop was held with the test method developers to review the outcome of this evaluation and to discuss the results. The evaluation informed the prioritisation of test methods for the next phase of the non-animal testing strategy development framework. Ultimately, the testing strategy – combined with bioavailability and skin metabolism data and exposure consideration – is envisaged to allow establishment of a data integration approach for skin sensitisation safety assessment of cosmetic ingredients.
PC_Eyewitness: evaluating the New Jersey method.
MacLin, Otto H; Phelan, Colin M
2007-05-01
One important variable in eyewitness identification research is lineup administration procedure. Lineups administered sequentially (one at a time) have been shown to reduce the number of false identifications in comparison with those administered simultaneously (all at once). As a result, some policymakers have adopted sequential administration. However, they have made slight changes to the method used in psychology laboratories. Eyewitnesses in the field are allowed to take multiple passes through a lineup, whereas participants in the laboratory are allowed only one pass. PC_Eyewitness (PCE) is a computerized system used to construct and administer simultaneous or sequential lineups in both the laboratory and the field. It is currently being used in laboratories investigating eyewitness identification in the United States, Canada, and abroad. A modified version of PCE is also being developed for a local police department. We developed a new module for PCE, the New Jersey module, to examine the effects of a second pass. We found that the sequential advantage was eliminated when the participants were allowed to view the lineup a second time. The New Jersey module, and steps we are taking to improve on the module, are presented here and are being made available to the research and law enforcement communities.
Fast nucleation for silica nanoparticle synthesis using a sol-gel method.
Dixit, Chandra K; Bhakta, Snehasis; Kumar, Ajeet; Suib, Steven L; Rusling, James F
2016-12-01
We have developed a method that for the first time allowed us to synthesize silica particles in 20 minutes using a sol-gel preparation. With such rapid synthesis, it is critically important to understand the synthesis mechanism and kinetic behavior in order to achieve a higher degree of fine-tuning ability during the synthesis. In this study, we have exploited our ability to modulate the physical nature of the reaction medium from sol-gel to emulsion, which has allowed us to halt the reaction at a particular time; this has allowed us to precisely understand the mechanism and chemistry of the silica polymerization. The synthesis medium is kept quite simple, with tetraethyl orthosilicate (TEOS) as a precursor in an equi-volumetric ethanol-water system and with sodium hydroxide as a catalyst. Synthesis is performed under ambient conditions at 20 °C for 20 minutes, followed by phasing out of any unreacted TEOS and polysilicic acid chains via their emulsification with supersaturated water. We have also demonstrated that the developed particles of various sizes can be used as seeds for further particle growth and other applications. Luminol, a chemiluminescent molecule, has been successfully entrapped between the layers of silica, demonstrating the chemiluminescence of these particles.
Multiscale Methods for Nuclear Reactor Analysis
NASA Astrophysics Data System (ADS)
Collins, Benjamin S.
The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady state and transient operation. In the research presented here, methods are developed to improve the local solution using high order methods with boundary conditions from a low order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques, that use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed; the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale methods use the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution to provide an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface: however the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few group nodal diffusion are typically large. The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly interface, the fuel/reflector interface, and assemblies where control rods are inserted. The embedded method also allows for multiple solution levels to be applied in a single calculation. The addition of intermediate levels to the solution improves the accuracy of the method. Both multiscale methods considered here have benefits and drawbacks, but both can provide improvements over the current PPR methodology.
A methodology for the semi-automatic digital image analysis of fragmental impactites
NASA Astrophysics Data System (ADS)
Chanou, A.; Osinski, G. R.; Grieve, R. A. F.
2014-04-01
A semi-automated digital image analysis method is developed for the comparative textural study of impact melt-bearing breccias. This method uses the freeware software ImageJ developed by the National Institute of Health (NIH). Digital image analysis is performed on scans of hand samples (10-15 cm across), based on macroscopic interpretations of the rock components. All image processing and segmentation are done semi-automatically, with the least possible manual intervention. The areal fraction of components is estimated and modal abundances can be deduced, where the physical optical properties (e.g., contrast, color) of the samples allow it. Other parameters that can be measured include, for example, clast size, clast-preferred orientations, average box-counting dimension or fragment shape complexity, and nearest neighbor distances (NnD). This semi-automated method allows the analysis of a larger number of samples in a relatively short time. Textures, granulometry, and shape descriptors are of considerable importance in rock characterization. The methodology is used to determine the variations of the physical characteristics of some examples of fragmental impactites.
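The workflow described above uses ImageJ; purely as an analogous sketch, the Python/scikit-image snippet below thresholds a hand-sample scan, labels the clasts and reports their areal fraction and sizes. The file name and the assumption that clasts are brighter than the matrix are placeholders, not details from the paper.

```python
# Analogous semi-automated segmentation sketch (not the authors' ImageJ workflow).
import numpy as np
from skimage import io, color, filters, measure

image = io.imread("hand_sample_scan.png")          # scan of the slabbed sample (placeholder path)
gray = color.rgb2gray(image)

threshold = filters.threshold_otsu(gray)           # automatic global threshold
clast_mask = gray > threshold                      # assumes clasts are brighter than matrix

areal_fraction = clast_mask.mean()                 # fraction of pixels classified as clasts
labels = measure.label(clast_mask)
props = measure.regionprops(labels)
clast_areas = np.array([p.area for p in props])    # pixel areas; convert with the image scale

print(f"clast areal fraction: {areal_fraction:.2%}")
print(f"number of clasts: {len(props)}, median area: {np.median(clast_areas):.0f} px")
```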
Modeling chemical reactions for drug design.
Gasteiger, Johann
2007-01-01
Chemical reactions are involved at many stages of the drug design process. This starts with the analysis of biochemical pathways that are controlled by enzymes that might be downregulated in certain diseases. In the lead discovery and lead optimization process compounds have to be synthesized in order to test them for their biological activity. And finally, the metabolism of a drug has to be established. A better understanding of chemical reactions could strongly help in making the drug design process more efficient. We have developed methods for quantifying the concepts an organic chemist is using in rationalizing reaction mechanisms. These methods allow a comprehensive modeling of chemical reactivity and thus are applicable to a wide variety of chemical reactions, from gas phase reactions to biochemical pathways. They are empirical in nature and therefore allow the rapid processing of large sets of structures and reactions. We will show here how methods have been developed for the prediction of acidity values and of the regioselectivity in organic reactions, for designing the synthesis of organic molecules and of combinatorial libraries, and for furthering our understanding of enzyme-catalyzed reactions and of the metabolism of drugs.
3D printed rapid disaster response
NASA Astrophysics Data System (ADS)
Lacaze, Alberto; Murphy, Karl; Mottern, Edward; Corley, Katrina; Chu, Kai-Dee
2014-05-01
Under the Department of Homeland Security-sponsored Sensor-smart Affordable Autonomous Robotic Platforms (SAARP) project, Robotic Research, LLC is developing an affordable and adaptable method to provide disaster response robots developed with 3D printer technology. The SAARP Store contains a library of robots, a developer storefront, and a user storefront. The SAARP Store allows the user to select, print, assemble, and operate the robot. In addition to the SAARP Store, two platforms are currently being developed. They use a set of common non-printed components that will allow the later design of other platforms that share non-printed components. During disasters, new challenges are faced that require customized tools or platforms. Instead of prebuilt and prepositioned supplies, a library of validated robots will be catalogued to satisfy various challenges at the scene. 3D printing components will allow these customized tools to be deployed in a fraction of the time that would normally be required. While the current system is focused on supporting disaster response personnel, this system will be expandable to a range of customers, including domestic law enforcement, the armed services, universities, and research facilities.
Extension of FRI for modeling of electrocardiogram signals.
Quick, R Frank; Crochiere, Ronald E; Hong, John H; Hormati, Ali; Baechler, Gilles
2012-01-01
Recent work has developed a modeling method applicable to certain types of signals having a "finite rate of innovation" (FRI). Such signals contain a sparse collection of time- or frequency-limited pulses having a restricted set of allowable pulse shapes. A limitation of past work on FRI is that all of the pulses must have the same shape. Many real signals, including electrocardiograms, consist of pulses with varying widths and asymmetry, and therefore are not well fit by the past FRI methods. We present an extension of FRI allowing pulses having variable pulse width (VPW) and asymmetry. We show example results for electrocardiograms and discuss the possibility of application to signal compression and diagnostics.
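The sketch below is not the authors' FRI reconstruction; it only illustrates the signal model the extension relies on, a pulse whose left and right widths differ, fitted here by ordinary nonlinear least squares on a synthetic beat.

```python
# Variable-pulse-width (VPW), asymmetric pulse model fitted to a synthetic beat.
import numpy as np
from scipy.optimize import curve_fit

def vpw_pulse(t, amplitude, center, width_left, width_right):
    """Asymmetric pulse: different Gaussian widths left and right of the peak."""
    width = np.where(t < center, width_left, width_right)
    return amplitude * np.exp(-0.5 * ((t - center) / width) ** 2)

t = np.linspace(0, 1, 500)
clean = vpw_pulse(t, 1.0, 0.42, 0.03, 0.08)            # asymmetric synthetic "beat"
noisy = clean + 0.02 * np.random.default_rng(2).normal(size=t.size)

popt, _ = curve_fit(vpw_pulse, t, noisy, p0=[1.0, 0.5, 0.05, 0.05])
amplitude, center, wl, wr = popt
print(f"center={center:.3f}, left width={wl:.3f}, right width={wr:.3f}")
```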
Fuegemann, Christopher J; Samraj, Ajoy K; Walsh, Stuart; Fleischmann, Bernd K; Jovinge, Stefan; Breitbach, Martin
2010-12-01
Herein, we describe two protocols for the in vitro differentiation of mouse embryonic stem cells (mESCs) into cardiomyocytes. mESCs are pluripotent and can be differentiated into cells of all three germ layers, including cardiomyocytes. The methods described here facilitate the differentiation of mESCs into the different cardiac subtypes (atrial-, ventricular-, nodal-like cells). The duration of cell culture determines whether early- or late-developmental-stage cardiomyocytes are preferentially obtained. This approach allows the investigation of cardiomyocyte development and differentiation in vitro, and also allows for the enrichment and isolation of physiologically intact cardiomyocytes for transplantation purposes. © 2010 by John Wiley & Sons, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, Matthew
The purpose of this CRADA is to develop an abrasion-resistant coating, suitable for use on polymeric-based reflective films (e.g., the ReflecTech reflective film), that allows for improved scratch resistance and enables the use of aggressive cleaning techniques (e.g., direct contact methods like brushing) without damaging the specular reflectance properties of the reflective film.
Ammonia Analysis by Gas Chromatograph/Infrared Detector (GC/IRD)
NASA Technical Reports Server (NTRS)
Scott, Joseph P.; Whitfield, Steve W.
2003-01-01
Methods are being developed at Marshall Space Flight Center's Toxicity Lab on a GC/IRD system that will be used to detect ammonia at low part per million (ppm) levels. These methods will allow analysis of gas samples by syringe injection. The GC is equipped with a unique cryogenically cooled inlet system that will enable our lab to make large injections of a gas sample. Although the initial focus of the work will be the analysis of ammonia, this instrument could identify other compounds on a molecular level. If proper methods can be developed, the IRD could serve as a powerful addition to our offgassing capabilities.
Ortega, Nàdia; Macià, Alba; Romero, Maria-Paz; Trullols, Esther; Morello, Jose-Ramón; Anglès, Neus; Motilva, Maria-Jose
2009-08-26
An improved chromatographic method was developed using ultra-performance liquid chromatography-tandem mass spectrometry to identify and quantify phenolic compounds and alkaloids, theobromine and caffeine, in carob flour samples. The developed method has been validated in terms of speed, sensitivity, selectivity, peak efficiency, linearity, reproducibility, limits of detection, and limits of quantification. The chromatographic method allows the identification and quantification of 20 phenolic compounds, that is, phenolic acids, flavonoids, and their aglycone and glucoside forms, together with the determination of the alkaloids, caffeine and theobromine, at low concentration levels all in a short analysis time of less than 20 min.
Methods to assess Drosophila heart development, function and aging
Ocorr, Karen; Vogler, Georg; Bodmer, Rolf
2014-01-01
In recent years the Drosophila heart has become an established model of many different aspects of human cardiac disease. This model has allowed identification of disease-causing mechanisms underlying congenital heart disease and cardiomyopathies and has permitted the study of the underlying genetic, metabolic and age-related contributions to heart function. In this review we discuss methods currently employed in the analysis of the Drosophila heart structure and function, such as optical methods to infer heart function and performance, electrophysiological and mechanical approaches to characterize cardiac tissue properties, and conclude with histological techniques used in the study of heart development and adult structure. PMID:24727147
Method of identification of patent trends based on descriptions of technical functions
NASA Astrophysics Data System (ADS)
Korobkin, D. M.; Fomenkov, S. A.; Golovanchikov, A. B.
2018-05-01
The use of the global patent space to determine scientific and technological priorities for technical systems development (identifying patent trends) allows one to forecast the direction of technical systems development and, accordingly, to select patents on priority technical subjects as a source for updating the technical functions database and the physical effects database. The authors propose an original method that uses as trend terms not individual unigrams or n-grams (as is usual for existing methods and systems), but structured descriptions of technical functions in the form “Subject-Action-Object” (SAO), which in the authors’ opinion form the basis of the invention.
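As a hedged illustration of how SAO trend terms might be pulled from patent text, the snippet below uses a dependency parser to collect subject-verb-object triples. spaCy, the model name and the example sentence are assumptions for the sketch, not the authors' implementation, and a real patent-trend system would add far more normalisation.

```python
# Toy Subject-Action-Object (SAO) extraction with a dependency parser.
import spacy

nlp = spacy.load("en_core_web_sm")      # assumes this English model is installed
doc = nlp("The invention compresses the exhaust gas with a rotary compressor.")

triples = []
for token in doc:
    if token.pos_ == "VERB":
        subjects = [c for c in token.children if c.dep_ in ("nsubj", "nsubjpass")]
        objects = [c for c in token.children if c.dep_ in ("dobj", "obj")]
        for s in subjects:
            for o in objects:
                triples.append((s.text, token.lemma_, o.text))

print(triples)   # e.g. [('invention', 'compress', 'gas')]
```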
Hasler, F; Krapf, R; Brenneisen, R; Bourquin, D; Krähenbühl, S
1993-10-22
Methods have been developed and characterized allowing rapid isolation and quantification of 18 beta-glycyrrhetinic acid (GRA) in biological fluids from both humans and rats. Sample preparation includes extraction with urea-methanol for plasma samples, and solid-phase extraction (SPE) for urine and bile samples. Hydrolysis of GRA glucuronides in urine and bile was performed by treatment with beta-glucuronidase. MGRA, the 3-O-methyl derivative of GRA was synthesized as an internal standard resistant to hydrolysis. High-performance liquid chromatography (HPLC) was performed with an isocratic system using methanol-water-acetic acid (83:16.8:0.2, v/v/v) as solvent on a Lichrocart RP-18 column at 30 degrees C with ultraviolet detection. The methods allowed base line separation of GRA and MGRA from all biological fluids tested, with a detection limit of 0.15 mg/l. Validation of the methods included determination of recovery, accuracy and precision in plasma, bile and urine from humans and rats. The methods were further evaluated by investigating the pharmacokinetics of GRA in normal rats and in rats with a bile fistula. Following an intravenous dose of 10 mg/kg, the plasma concentration-time curve of GRA could be fitted to a one compartment model both in control and bile fistula rats. The elimination half life averaged 15.0 +/- 2.2 versus 16.8 +/- 2.4 min in control and bile fistula rats (difference not significant). Within 90 min following administration of GRA, urinary elimination of GRA and GRA glucuronides was less than 1% in both groups whereas biliary elimination averaged 51.3 +/- 3.1%. The results show that the methods developed allow pharmacokinetic studies of GRA in humans and rats.
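The one-compartment fit and half-life calculation mentioned above can be sketched as follows; the concentration-time points are invented and only illustrate C(t) = C0·exp(-kt) with t1/2 = ln 2 / k.

```python
# One-compartment pharmacokinetic fit on invented plasma concentration data.
import numpy as np
from scipy.optimize import curve_fit

def one_compartment(t, c0, k):
    return c0 * np.exp(-k * t)

t_min = np.array([5, 10, 20, 30, 45, 60, 90])               # minutes after i.v. dose
conc = np.array([42.0, 33.5, 21.0, 13.8, 7.1, 3.9, 1.1])    # mg/L plasma GRA (hypothetical)

(c0, k), _ = curve_fit(one_compartment, t_min, conc, p0=[50.0, 0.05])
half_life = np.log(2) / k
print(f"C0 = {c0:.1f} mg/L, k = {k:.3f} 1/min, t1/2 = {half_life:.1f} min")
```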
Developing and validating risk prediction models in an individual participant data meta-analysis
2014-01-01
Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
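A compact sketch of the 'internal-external cross-validation' idea described above: develop the model on all but one study, test it in the omitted study, and rotate. The pooled IPD, predictors and logistic model below are simulated placeholders; a real analysis would also consider study-specific intercepts and calibration, not only discrimination.

```python
# Rotating leave-one-study-out ("internal-external") validation on simulated IPD.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
ipd = pd.DataFrame({
    "study": rng.integers(0, 5, 2000),                 # 5 contributing studies (simulated)
    "age": rng.normal(60, 10, 2000),
    "biomarker": rng.normal(0, 1, 2000),
})
logit = -5 + 0.06 * ipd["age"] + 0.8 * ipd["biomarker"]
ipd["outcome"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

for held_out in sorted(ipd["study"].unique()):
    train = ipd[ipd["study"] != held_out]
    test = ipd[ipd["study"] == held_out]
    model = LogisticRegression(max_iter=1000)
    model.fit(train[["age", "biomarker"]], train["outcome"])
    auc = roc_auc_score(test["outcome"],
                        model.predict_proba(test[["age", "biomarker"]])[:, 1])
    print(f"held-out study {held_out}: c-statistic = {auc:.2f}")
```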
A rational approach to heavy-atom derivative screening
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joyce, M. Gordon; Radaev, Sergei; Sun, Peter D., E-mail: psun@nih.gov
2010-04-01
Despite the development in recent times of a range of techniques for phasing macromolecules, the conventional heavy-atom derivatization method still plays a significant role in protein structure determination. However, this method has become less popular in modern high-throughput oriented crystallography, mostly owing to its trial-and-error nature, which often results in lengthy empirical searches requiring large numbers of well diffracting crystals. In addition, the phasing power of heavy-atom derivatives is often compromised by lack of isomorphism or even loss of diffraction. In order to overcome the difficulties associated with the ‘classical’ heavy-atom derivatization procedure, an attempt has been made to develop a rational crystal-free heavy-atom derivative-screening method and a quick-soak derivatization procedure which allows heavy-atom compound identification. The method includes three basic steps: (i) the selection of likely reactive compounds for a given protein and specific crystallization conditions based on pre-defined heavy-atom compound reactivity profiles, (ii) screening of the chosen heavy-atom compounds for their ability to form protein adducts using mass spectrometry and (iii) derivatization of crystals with selected heavy-metal compounds using the quick-soak method to maximize diffraction quality and minimize non-isomorphism. Overall, this system streamlines the process of heavy-atom compound identification and minimizes the problem of non-isomorphism in phasing.
Agile Development Methods for Space Operations
NASA Technical Reports Server (NTRS)
Trimble, Jay; Webster, Chris
2012-01-01
Mainstream industry software development practice has gone from a traditional waterfall process to agile iterative development that allows for fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructure, chemical plants and nuclear power plants. For many applications, besides the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information that is necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class, based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, which allows different probabilistic models of earthquake occurrence to be used and advanced physical models to be incorporated into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production interruption losses of a nuclear power plant during its residual lifetime.
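A compact numerical sketch of steps (1)-(4) as described: magnitude classes with annual occurrence frequencies, a conditional probability of exceeding the design parameter for each class's bounding scenario, and aggregation over the structure's lifetime. All frequencies and fragilities are illustrative, not taken from any site study.

```python
# Toy aggregation of class frequencies and conditional exceedance probabilities.
import numpy as np

annual_freq = np.array([1e-2, 2e-3, 3e-4, 4e-5])     # events/yr per magnitude class (assumed)
p_exceed = np.array([0.001, 0.02, 0.15, 0.45])       # P(exceed design | bounding scenario) (assumed)

lam = np.sum(annual_freq * p_exceed)                 # annual frequency of exceedance
lifetime = 20                                        # residual lifetime, years (assumed)
p_lifetime = 1.0 - np.exp(-lam * lifetime)           # Poisson occurrence assumption

print(f"annual exceedance frequency = {lam:.2e} /yr")
print(f"probability of at least one exceedance in {lifetime} yr = {p_lifetime:.2%}")
```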
Analytical quality by design: a tool for regulatory flexibility and robust analytics.
Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and pharmaceutical analytical technology (PAT).
Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics
Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper has been communicated to discuss different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also to correlate it with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723
Application of multi-grid method on the simulation of incremental forging processes
NASA Astrophysics Data System (ADS)
Ramadan, Mohamad; Khaled, Mahmoud; Fourment, Lionel
2016-10-01
Numerical simulation is becoming essential in the manufacture of large parts by incremental forging processes. It is a splendid tool for revealing the underlying physical phenomena; behind the scenes, however, an expensive bill must be paid: the computational time. That is why many techniques have been developed to decrease the computational time of numerical simulation. The Multi-Grid method is a numerical procedure that reduces the computational time by solving the system of equations on several meshes of decreasing size, which allows both the low-frequency and the high-frequency components of the solution to be smoothed more quickly. In this paper a Multi-Grid method is applied to a cogging process in the software Forge 3. The study is carried out using an increasing number of degrees of freedom. The results show that the calculation time is divided by two for a mesh of 39,000 nodes. The method is promising, especially if coupled with the Multi-Mesh method.
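To make the mechanism concrete, here is a textbook two-grid correction (the building block of multigrid) for a 1-D Poisson problem. It is a Python toy, not the Forge 3 implementation, but it shows how the coarse grid removes the low-frequency error that the smoother handles poorly.

```python
# Two-grid correction scheme on a 1-D Poisson problem with zero Dirichlet boundaries.
import numpy as np

def laplacian(n, h):
    """Standard second-order finite-difference Laplacian on n interior points."""
    return (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
            - np.diag(np.ones(n - 1), -1)) / h**2

def jacobi(A, u, f, sweeps=3, omega=2/3):
    """Weighted Jacobi smoothing (damps high-frequency error components)."""
    d = np.diag(A)
    for _ in range(sweeps):
        u = u + omega * (f - A @ u) / d
    return u

def two_grid(n=63, cycles=10):
    h = 1.0 / (n + 1)
    A = laplacian(n, h)
    x = np.linspace(h, 1 - h, n)
    f = np.sin(np.pi * x)                          # right-hand side
    u = np.zeros(n)
    nc = (n - 1) // 2                              # coarse grid size
    Ac = laplacian(nc, 2 * h)
    for _ in range(cycles):
        u = jacobi(A, u, f)                        # pre-smooth
        r = f - A @ u                              # fine-grid residual
        rc = 0.25 * (r[0:-2:2] + 2 * r[1:-1:2] + r[2::2])   # full-weighting restriction
        ec = np.linalg.solve(Ac, rc)               # exact coarse-grid correction
        e = np.zeros(n)                            # linear interpolation back to fine grid
        e[1:-1:2] = ec
        e[0:-2:2] += 0.5 * ec
        e[2::2] += 0.5 * ec
        u = jacobi(A, u + e, f)                    # post-smooth
        print(f"residual norm = {np.linalg.norm(f - A @ u):.2e}")
    return u

two_grid()
```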
An interactive website for analytical method comparison and bias estimation.
Bahar, Burak; Tuncel, Ayse F; Holmes, Earle W; Holmes, Daniel T
2017-12-01
Regulatory standards mandate laboratories to perform studies to ensure accuracy and reliability of their test results. Method comparison and bias estimation are important components of these studies. We developed an interactive website for evaluating the relative performance of two analytical methods using R programming language tools. The website can be accessed at https://bahar.shinyapps.io/method_compare/. The site has an easy-to-use interface that allows both copy-pasting and manual entry of data. It also allows selection of a regression model and creation of regression and difference plots. Available regression models include Ordinary Least Squares, Weighted-Ordinary Least Squares, Deming, Weighted-Deming, Passing-Bablok and Passing-Bablok for large datasets. The server processes the data and generates downloadable reports in PDF or HTML format. Our website provides clinical laboratories a practical way to assess the relative performance of two analytical methods. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
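Deming regression is one of the models the site offers; as a sketch of what such a fit involves, the snippet below implements the unweighted closed-form estimate with an assumed 1:1 error-variance ratio. The paired method results are invented example data, not output from the website.

```python
# Unweighted Deming regression for method comparison (closed-form estimate).
import numpy as np

def deming(x, y, delta=1.0):
    """Deming regression slope/intercept; delta = var(err_y) / var(err_x)."""
    xbar, ybar = x.mean(), y.mean()
    sxx = np.sum((x - xbar) ** 2)
    syy = np.sum((y - ybar) ** 2)
    sxy = np.sum((x - xbar) * (y - ybar))
    slope = ((syy - delta * sxx) +
             np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = ybar - slope * xbar
    return slope, intercept

x = np.array([1.1, 2.3, 3.2, 4.1, 5.0, 6.2, 7.1, 8.3])   # method X results (hypothetical)
y = np.array([1.3, 2.2, 3.5, 4.0, 5.3, 6.1, 7.4, 8.2])   # method Y results (hypothetical)
slope, intercept = deming(x, y)
print(f"y = {intercept:.3f} + {slope:.3f} x")             # constant and proportional bias
```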
Korany, Mohamed A; Abdine, Heba H; Ragab, Marwa A A; Aboras, Sara I
2015-05-15
This paper discusses a general method for the use of orthogonal polynomials for unequal intervals (OPUI) to eliminate interferences in two-component spectrophotometric analysis. In this paper, a new approach was developed in which the first-derivative (D1) curve, instead of the absorbance curve, is convoluted using the OPUI method for the determination of metronidazole (MTR) and nystatin (NYS) in their mixture. After derivative treatment of the absorption data, many maxima and minima appeared, giving a characteristic shape for each drug and allowing a different number of points to be selected for the OPUI method for each drug. This allows the specific and selective determination of each drug in the presence of the other and in the presence of any matrix interference. The method is particularly useful when the two absorption spectra overlap considerably. The results obtained are encouraging and suggest that the method can be widely applied to similar problems. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dag, Serkan; Yildirim, Bora; Sabuncuoglu, Baris
The objective of this study is to develop crack growth analysis methods for functionally graded materials (FGMs) subjected to mode I cyclic loading. The study presents finite-element-based computational procedures for both two- and three-dimensional problems to examine fatigue crack growth in functionally graded materials. The developed methods allow the computation of crack length and the generation of the crack front profile for a graded medium subjected to fluctuating stresses. The results, presented for an elliptical crack embedded in a functionally graded medium, illustrate the competing effects of ellipse aspect ratio and material property gradation on the fatigue crack growth behavior.
NASA Technical Reports Server (NTRS)
Wu, Cathy; Taylor, Pam; Whitson, George; Smith, Cathy
1990-01-01
This paper describes the building of a corn disease diagnostic expert system using CLIPS, and the development of a neural expert system using the fact representation method of CLIPS for automated knowledge acquisition. The CLIPS corn expert system diagnoses 21 diseases from 52 symptoms and signs with certainty factors. CLIPS has several unique features. It allows the facts in rules to be broken down into object-attribute-value (OAV) triples, allows rule grouping, and fires rules based on pattern matching. These features, combined with the chained inference engine, result in a natural user query system and speedy execution. In order to develop a method for automated knowledge acquisition, an Artificial Neural Expert System (ANES) was developed by a direct mapping from the CLIPS system. The ANES corn expert system uses the same OAV triples as the CLIPS system for its facts. The LHS and RHS facts of the CLIPS rules are mapped into the input and output layers of the ANES, respectively, and the inference engine of the rules is embedded in the hidden layer. The fact representation by OAV triples gives a natural grouping of the rules. These features allow the ANES system to automate rule generation, and make it efficient to execute and easy to expand for a large and complex domain.
Height and the normal distribution: evidence from Italian military data.
A'Hearn, Brian; Peracchi, Franco; Vecchi, Giovanni
2009-02-01
Researchers modeling historical heights have typically relied on the restrictive assumption of a normal distribution, only the mean of which is affected by age, income, nutrition, disease, and similar influences. To avoid these restrictive assumptions, we develop a new semiparametric approach in which covariates are allowed to affect the entire distribution without imposing any parametric shape. We apply our method to a new database of height distributions for Italian provinces, drawn from conscription records, of unprecedented length and geographical disaggregation. Our method allows us to standardize distributions to a single age and calculate moments of the distribution that are comparable through time. Our method also allows us to generate counterfactual distributions for a range of ages, from which we derive age-height profiles. These profiles reveal how the adolescent growth spurt (AGS) distorts the distribution of stature, and they document the earlier and earlier onset of the AGS as living conditions improved over the second half of the nineteenth century. Our new estimates of provincial mean height also reveal a previously unnoticed "regime switch" from regional convergence to divergence in this period.
Progress Toward Efficient Laminar Flow Analysis and Design
NASA Technical Reports Server (NTRS)
Campbell, Richard L.; Campbell, Matthew L.; Streit, Thomas
2011-01-01
A multi-fidelity system of computer codes for the analysis and design of vehicles having extensive areas of laminar flow is under development at the NASA Langley Research Center. The overall approach consists of the loose coupling of a flow solver, a transition prediction method and a design module using shell scripts, along with interface modules to prepare the input for each method. This approach allows the user to select the flow solver and transition prediction module, as well as run mode for each code, based on the fidelity most compatible with the problem and available resources. The design module can be any method that designs to a specified target pressure distribution. In addition to the interface modules, two new components have been developed: 1) an efficient, empirical transition prediction module (MATTC) that provides n-factor growth distributions without requiring boundary layer information; and 2) an automated target pressure generation code (ATPG) that develops a target pressure distribution that meets a variety of flow and geometry constraints. The ATPG code also includes empirical estimates of several drag components to allow the optimization of the target pressure distribution. The current system has been developed for the design of subsonic and transonic airfoils and wings, but may be extendable to other speed ranges and components. Several analysis and design examples are included to demonstrate the current capabilities of the system.
Defining an additivity framework for mixture research in inducible whole-cell biosensors
NASA Astrophysics Data System (ADS)
Martin-Betancor, K.; Ritz, C.; Fernández-Piñas, F.; Leganés, F.; Rodea-Palomares, I.
2015-11-01
A novel additivity framework for mixture effect modelling in the context of whole-cell inducible biosensors has been mathematically developed and implemented in R. The proposed method is a multivariate extension of the effective dose (EDp) concept. Specifically, the extension accounts for differential maximal effects among analytes and for response inhibition beyond the maximum permissive concentrations. This allows a multivariate extension of Loewe additivity, enabling direct application in a biphasic dose-response framework. The proposed additivity definition was validated, and its applicability illustrated, by studying the response of the cyanobacterial biosensor Synechococcus elongatus PCC 7942 pBG2120 to binary mixtures of Zn, Cu, Cd, Ag, Co and Hg. The novel method allowed, for the first time, complete dose-response profiles of an inducible whole-cell biosensor to mixtures to be modelled. In addition, the approach allowed identification and quantification of departures from additivity (interactions) among analytes. The biosensor was found to respond in a near-additive way to heavy metal mixtures except when Hg, Co and Ag were present, in which case strong interactions occurred. The method is a useful contribution to the whole-cell biosensor discipline and related areas, allowing appropriate assessment of mixture effects in non-monotonic dose-response frameworks.
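For orientation, the snippet below computes the classical (monotonic, shared-maximum) Loewe-additive prediction that the published framework generalises: given Hill curves for two analytes, it solves d1/EDy(1) + d2/EDy(2) = 1 for the mixture effect y. The curve parameters and doses are illustrative assumptions; handling differing maxima and biphasic responses requires the paper's extension.

```python
# Classical Loewe-additivity prediction from two Hill dose-response curves.
import numpy as np
from scipy.optimize import brentq

def ed(effect, ec50, hill, emax=100.0):
    """Dose of a single analyte producing `effect` (% of max), from a Hill curve."""
    return ec50 * (effect / (emax - effect)) ** (1.0 / hill)

def loewe_effect(d1, d2, p1, p2, emax=100.0):
    """Predicted effect of the mixture (d1, d2) under Loewe additivity."""
    g = lambda y: d1 / ed(y, *p1, emax) + d2 / ed(y, *p2, emax) - 1.0
    return brentq(g, 1e-9, emax - 1e-9)

analyte_1 = (0.8, 1.5)    # (EC50 in mg/L, Hill slope), assumed values
analyte_2 = (0.3, 2.0)    # (EC50 in mg/L, Hill slope), assumed values

predicted = loewe_effect(d1=0.4, d2=0.15, p1=analyte_1, p2=analyte_2)
print(f"Loewe-additive predicted induction: {predicted:.1f} % of maximum")
```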
ERIC Educational Resources Information Center
American School and University, 1981
1981-01-01
Recent developments in chemical technology can control graffiti by using chemicals that emulsify and soften the paint and are then rinsed with water under pressure. Protective coatings are applied that allow the easy removal of spray paint by a variety of methods. (Author/MLF)
RESULTS OF APPLYING TOXICITY IDENTIFICATION PROCEDURES TO FIELD COLLECTED SEDIMENTS
Identification of specific causes of sediment toxicity can allow for much more focused risk assessment and management decision making. We have been developing toxicity identification evaluation (TIE) methods for contaminated sediments and are focusing on three toxicant groups (amm...
CASE STUDIES EXAMINING LCA STREAMLINING TECHNIQUES
Pressure is mounting for more streamlined Life Cycle Assessment (LCA) methods that allow for evaluations that are quick and simple, but accurate. As part of an overall research effort to develop and demonstrate streamlined LCA, the U.S. Environmental Protection Agency has funded ...
METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (MFFPPT)
The Metal Finishing Facility Pollution Prevention Tool (MFFPPT) is being developed to give the metal finishing industry an easy method for evaluating potential pollution prevention options. In order to reduce the quantity of pollutants generated by a process, the sources of pollutants within ...
Single Wall Carbon Nanotube Alignment Mechanisms for Non-Destructive Evaluation
NASA Technical Reports Server (NTRS)
Hong, Seunghun
2002-01-01
As proposed in our original proposal, we developed a new innovative method to assemble millions of single wall carbon nanotube (SWCNT)-based circuit components as fast as conventional microfabrication processes. This method is based on surface template assembly strategy. The new method solves one of the major bottlenecks in carbon nanotube based electrical applications and, potentially, may allow us to mass produce a large number of SWCNT-based integrated devices of critical interests to NASA.
Response Surface Methods For Spatially-Resolved Optical Measurement Techniques
NASA Technical Reports Server (NTRS)
Danehy, P. M.; Dorrington, A. A.; Cutler, A. D.; DeLoach, R.
2003-01-01
Response surface methods (or methodology), RSM, have been applied to improve data quality for two vastly different spatially-resolved optical measurement techniques. In the first application, modern design of experiments (MDOE) methods, including RSM, are employed to map the temperature field in a direct-connect supersonic combustion test facility at NASA Langley Research Center. The laser-based measurement technique known as coherent anti-Stokes Raman spectroscopy (CARS) is used to measure temperature at various locations in the combustor. RSM is then used to develop temperature maps of the flow. Even though the temperature fluctuations at a single point in the flowfield have a standard deviation on the order of 300 K, RSM provides analytic fits to the data having 95% confidence interval half width uncertainties in the fit as low as +/- 30 K. Methods of optimizing future CARS experiments are explored. The second application of RSM is to quantify the shape of a 5-meter diameter, ultra-lightweight, inflatable space antenna at NASA Langley Research Center. Photogrammetry is used to simultaneously measure the shape of the antenna at approximately 500 discrete spatial locations. RSM allows an analytic model to be developed that describes the shape of the majority of the antenna with an uncertainty of 0.4 mm, with 95% confidence. This model would allow a quantitative comparison between the actual shape of the antenna and the original design shape. Accurately determining this shape also allows confident interpolation between the measured points. Such a model could, for example, be used for ray tracing of radio-frequency waves up to 95 GHz to predict the performance of the antenna.
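The core of such a response surface analysis is an ordinary least-squares fit of a low-order polynomial to scattered point measurements, from which confidence half-widths of the fitted surface follow. The sketch below illustrates this with synthetic temperature data; it is not the NASA code or the CARS dataset.

```python
# RSM sketch: fit a second-order response surface T(x, y) to scattered point
# measurements and report the 95% confidence half-width of the fitted mean.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200
x, y = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
T = 1800 + 300 * x - 200 * y**2 + 150 * x * y + rng.normal(0, 300, n)  # K

def design(x, y):
    # Full quadratic model: 1, x, y, x^2, y^2, xy
    return np.column_stack([np.ones_like(x), x, y, x**2, y**2, x * y])

X = design(x, y)
beta = np.linalg.lstsq(X, T, rcond=None)[0]
resid = T - X @ beta
dof = n - X.shape[1]
sigma2 = resid @ resid / dof                     # residual variance
cov = sigma2 * np.linalg.inv(X.T @ X)            # covariance of coefficients

# 95% confidence half-width of the fitted surface at a query point
xq = design(np.array([0.2]), np.array([-0.4]))
se_fit = np.sqrt((xq @ cov @ xq.T)[0, 0])
half_width = stats.t.ppf(0.975, dof) * se_fit
print(f"fitted T = {(xq @ beta)[0]:.0f} K, 95% CI half-width = {half_width:.0f} K")
```

Even with a large point-to-point scatter, the half-width of the fitted surface can be far smaller than the raw standard deviation, which is the effect the abstract describes.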
Life cycle design metrics for energy generation technologies: Method, data, and case study
NASA Astrophysics Data System (ADS)
Cooper, Joyce; Lee, Seung-Jin; Elter, John; Boussu, Jeff; Boman, Sarah
A method to assist in the rapid preparation of Life Cycle Assessments of emerging energy generation technologies is presented and applied to distributed proton exchange membrane fuel cell systems. The method develops life cycle environmental design metrics and allows variations in hardware materials, transportation scenarios, assembly energy use, operating performance and consumables, and fuels and fuel production scenarios to be modeled and comparisons to competing systems to be made. Data and results are based on publicly available U.S. Life Cycle Assessment data sources and are formulated to allow the environmental impact weighting scheme to be specified. A case study evaluates improvements in efficiency and in materials recycling and compares distributed proton exchange membrane fuel cell systems to other distributed generation options. The results reveal the importance of sensitivity analysis and system efficiency in interpreting case studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oosthoek, J. L. M.; Schuitema, R. W.; Brink, G. H. ten
2015-03-15
An imaging method has been developed based on charge collection in a scanning electron microscope (SEM) that allows discrimination between the amorphous and crystalline states of Phase-change Random Access Memory (PRAM) line cells. During imaging, the cells are electrically connected and can be switched between the states and the resistance can be measured. This allows for electrical characterization of the line cells in-situ in the SEM. Details on sample and measurement system requirements are provided which turned out to be crucial for the successful development of this method. Results show that the amorphous or crystalline state of the line cells can be readily discerned, but the spatial resolution is relatively poor. Nevertheless, it is still possible to estimate the length of the amorphous mark, and also for the first time, we could directly observe the shift of the amorphous mark from one side of the line cell to the other side when the polarity of the applied (50 ns) RESET pulse was reversed.
Sebinger, David D. R.; Unbekandt, Mathieu; Ganeva, Veronika V.; Ofenbauer, Andreas; Werner, Carsten; Davies, Jamie A.
2010-01-01
Here, we present a novel method for culturing kidneys in low volumes of medium that offers more organotypic development compared to conventional methods. Organ culture is a powerful technique for studying renal development. It recapitulates many aspects of early development very well, but the established techniques have some disadvantages: in particular, they require relatively large volumes (1–3 ml) of culture medium, which can make high-throughput screens expensive; they require porous (filter) substrates, which are difficult to modify chemically; and the organs produced do not achieve good cortico-medullary zonation. Here, we present a technique for growing kidney rudiments in very low volumes of medium (around 85 microliters) using silicone chambers. In this system, kidneys grow directly on glass, grow larger than in conventional culture, and develop a clear anatomical cortico-medullary zonation with extended loops of Henle. PMID:20479933
ERIC Educational Resources Information Center
Gold, Stephanie
2005-01-01
The concept of data-driven professional development is both straight-forward and sensible. Implementing this approach is another story, which is why many administrators are turning to sophisticated tools to help manage data collection and analysis. These tools allow educators to assess and correlate student outcomes, instructional methods, and…
Hull Form Design and Optimization Tool Development
2012-07-01
global minimum. The algorithm accomplishes this by using a method known as metaheuristics, which allows the algorithm to examine a large area by... further development of these tools, including the implementation and testing of a new optimization algorithm and the improvement of a rapid hull form... under the 2012 Naval Research Enterprise Intern Program. Subject terms: hydrodynamic, hull form, generation, optimization, algorithm
ERIC Educational Resources Information Center
David, Shannon L.; Hitchcock, John H.; Ragan, Brian; Brooks, Gordon; Starkey, Chad
2018-01-01
Developing psychometrically sound instruments can be difficult, especially if little is known about the constructs of interest. When constructs of interest are unclear, a mixed methods approach can be useful. Qualitative inquiry can be used to explore a construct's meaning in a way that informs item writing and allows the strengths of one analysis…
Learning from Our Own Lessons: Pre-Service Teachers' Narratives of Teaching as an Experiment
ERIC Educational Resources Information Center
Wickstrom, Megan H.; Wilm, Stephanie; Mills, Emily; Johnson, Alexis; Leonard, Nicole; Larberg, Raegan
2018-01-01
Pre-service teachers need to develop habits of mind that allow them to grow as new teachers. This article describes an elementary mathematics methods course in which teaching as an experiment was used as a framework for pre-service teachers to participate in action research by developing learning goals, observing and analyzing student thinking,…
Distributed Sensor Fusion for Scalar Field Mapping Using Mobile Sensor Networks.
La, Hung Manh; Sheng, Weihua
2013-04-01
In this paper, autonomous mobile sensor networks are deployed to measure a scalar field and build its map. We develop a novel method for multiple mobile sensor nodes to build this map using noisy sensor measurements. Our method consists of two parts. First, we develop a distributed sensor fusion algorithm by integrating two different distributed consensus filters to achieve cooperative sensing among sensor nodes. This fusion algorithm has two phases. In the first phase, the weighted average consensus filter is developed, which allows each sensor node to find an estimate of the value of the scalar field at each time step. In the second phase, the average consensus filter is used to allow each sensor node to find a confidence of the estimate at each time step. The final estimate of the value of the scalar field is iteratively updated during the movement of the mobile sensors via weighted average. Second, we develop the distributed flocking-control algorithm to drive the mobile sensors to form a network and track the virtual leader moving along the field when only a small subset of the mobile sensors know the information of the leader. Experimental results are provided to demonstrate our proposed algorithms.
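The two-phase fusion idea described above can be illustrated with a few lines of consensus iteration. The sketch below uses a toy ring network and a standard trick for weighted average consensus (running plain average consensus on the confidence-weighted measurements and on the confidences in parallel); it is not the authors' filter or their flocking controller.

```python
# Toy consensus-fusion sketch: each node holds a noisy measurement of the same
# scalar-field value plus a confidence; averaging w*z and w over the network
# gives every node the weighted average sum(w*z)/sum(w).
import numpy as np

rng = np.random.default_rng(2)
true_value = 5.0
n = 6
# Ring communication graph: adjacency A and graph Laplacian L
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1.0
L = np.diag(A.sum(axis=1)) - A

noise_std = rng.uniform(0.2, 1.0, n)
z = true_value + rng.normal(0, noise_std)   # noisy local measurements
w = 1.0 / noise_std**2                      # local confidences

num, den = w * z, w.copy()
eps = 0.1                                   # step size < 2 / lambda_max(L)
for _ in range(300):
    num = num - eps * (L @ num)             # consensus on weighted measurements
    den = den - eps * (L @ den)             # consensus on confidences

print("fused estimates:", np.round(num / den, 3), " target:", true_value)
```

After enough iterations every node carries the same confidence-weighted estimate, which is the role the two consensus filters play in the mapping algorithm.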
Investigation of the technology of conductive yarns manufacturing
NASA Astrophysics Data System (ADS)
Ryklin, Dzmitry; Medvetski, Sergey
2017-10-01
The paper is devoted to the development of a technology for producing electrically conductive yarns. This technology allows conductive yarns to be manufactured from copper wire and polyester filament yarns. A method for predicting the breaking force of the conductive yarn was developed based on analysis of the load-elongation curves of each strand of the yarn. A method for predicting the conductive yarn diameter is also offered. The investigation shows that conductive yarns can be integrated into textile structures using sewing or embroidery equipment. An application of the developed conductive yarn is the creation of wearable electronics with a wide range of functions, for example for monitoring specific health issues, navigation tools or communication gadgets.
The role of finite-difference methods in design and analysis for supersonic cruise
NASA Technical Reports Server (NTRS)
Townsend, J. C.
1976-01-01
Finite-difference methods for analysis of steady, inviscid supersonic flows are described, and their present state of development is assessed with particular attention to their applicability to vehicles designed for efficient cruise flight. Current work is described which will allow greater geometric latitude, improve treatment of embedded shock waves, and relax the requirement that the axial velocity must be supersonic.
Development of PET projection data correction algorithm
NASA Astrophysics Data System (ADS)
Bazhanov, P. V.; Kotina, E. D.
2017-12-01
Positron emission tomography is a modern nuclear medicine method used to examine metabolism and the function of internal organs. The method allows diseases to be diagnosed at an early stage. Mathematical algorithms are widely used not only for image reconstruction but also for PET data correction. In this paper, implementations of random-coincidence and scatter correction algorithms are considered, as well as an algorithm for modeling PET projection data acquisition for verification of the corrections.
An experimental method to simulate incipient decay of wood basidiomycete fungi
Simon Curling; Jerrold E. Winandy; Carol A. Clausen
2000-01-01
At very early stages of decay of wood by basidiomycete fungi, strength loss can be measured from wood before any measurable weight loss. Therefore, strength loss is a more efficient measure of incipient decay than weight loss. However, common standard decay tests (e.g. EN 113 or ASTM D2017) use weight loss as the measure of decay. A method was developed that allowed...
Fujihara, Hidehiko; Hino, Mika; Takashita, Hideharu; Kajiwara, Yasuhiro; Okamoto, Keiko; Furukawa, Kensuke
2014-01-01
We developed an efficient screening method for Saccharomyces cerevisiae strains from environmental isolates. MultiPlex PCR was performed targeting four brewing S. cerevisiae genes (SSU1, AWA1, BIO6, and FLO1). At least three genes among the four were amplified from all S. cerevisiae strains. The use of this method allowed us to successfully obtain S. cerevisiae strains.
Hubert, C; Lebrun, P; Houari, S; Ziemons, E; Rozet, E; Hubert, Ph
2014-01-01
The understanding of the method is a major concern when developing a stability-indicating method and even more so when dealing with impurity assays from complex matrices. In the presented case study, a Quality-by-Design approach was applied in order to optimize a routinely used method. An analytical issue occurring at the last stage of a long-term stability study involving unexpected impurities perturbing the monitoring of characterized impurities needed to be resolved. A compliant Quality-by-Design (QbD) methodology based on a Design of Experiments (DoE) approach was evaluated within the framework of a Liquid Chromatography (LC) method. This approach allows the investigation of Critical Process Parameters (CPPs), which have an impact on Critical Quality Attributes (CQAs) and, consequently, on LC selectivity. Using polynomial regression response modeling as well as Monte Carlo simulations for error propagation, Design Space (DS) was computed in order to determine robust working conditions for the developed stability-indicating method. This QbD compliant development was conducted in two phases allowing the use of the Design Space knowledge acquired during the first phase to define the experimental domain of the second phase, which constitutes a learning process. The selected working condition was then fully validated using accuracy profiles based on statistical tolerance intervals in order to evaluate the reliability of the results generated by this LC/ESI-MS stability-indicating method. A comparison was made between the traditional Quality-by-Testing (QbT) approach and the QbD strategy, highlighting the benefit of this QbD strategy in the case of an unexpected impurities issue. On this basis, the advantages of a systematic use of the QbD methodology were discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
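The Design Space computation described above rests on two generic ingredients: a polynomial response model fitted to a small designed experiment, and Monte Carlo propagation of the model uncertainty to a probability of meeting the specification over the factor region. The sketch below illustrates that workflow with made-up factor names (pH, gradient time), a made-up attribute (peak resolution Rs) and a made-up specification; it is not the authors' model or data.

```python
# Hedged DoE/Design-Space sketch: quadratic model of a CQA on two CPPs, then
# Monte Carlo error propagation to map P(Rs >= 1.5) over the factor grid.
import numpy as np

rng = np.random.default_rng(3)
# 3x3 design in coded units with synthetic responses
pts = np.array([[i, j] for i in (-1, 0, 1) for j in (-1, 0, 1)], float)

def model_matrix(p):
    x1, x2 = p[:, 0], p[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

true_beta = np.array([1.8, 0.4, -0.3, -0.2, -0.1, 0.15])   # synthetic truth
Rs = model_matrix(pts) @ true_beta + rng.normal(0, 0.1, len(pts))

X = model_matrix(pts)
beta = np.linalg.lstsq(X, Rs, rcond=None)[0]
resid = Rs - X @ beta
s2 = resid @ resid / (len(pts) - X.shape[1])
cov = s2 * np.linalg.inv(X.T @ X)

# Monte Carlo propagation of coefficient uncertainty over a fine factor grid
g = np.linspace(-1, 1, 41)
grid = np.array([[a, b] for a in g for b in g])
G = model_matrix(grid)
draws = rng.multivariate_normal(beta, cov, size=2000)
p_meet = ((G @ draws.T) >= 1.5).mean(axis=1)          # P(Rs >= 1.5) per point
design_space = p_meet >= 0.9                          # quality level 0.9
print("fraction of factor grid inside the Design Space:", design_space.mean())
```

Working conditions are then chosen inside the high-probability region, which is what makes the selected method robust rather than merely optimal at a single point.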
Vapor port and groundwater sampling well
Hubbell, Joel M.; Wylie, Allan H.
1996-01-01
A method and apparatus have been developed for combining groundwater monitoring wells with unsaturated-zone vapor sampling ports. The apparatus allows concurrent monitoring of both the unsaturated and the saturated zone from the same well at contaminated areas. The innovative well design allows for concurrent sampling of groundwater and volatile organic compounds (VOCs) in the vadose (unsaturated) zone from a single well, saving considerable time and money. The sample tubes are banded to the outer well casing during installation of the well casing.
Vapor port and groundwater sampling well
Hubbell, J.M.; Wylie, A.H.
1996-01-09
A method and apparatus have been developed for combining groundwater monitoring wells with unsaturated-zone vapor sampling ports. The apparatus allows concurrent monitoring of both the unsaturated and the saturated zone from the same well at contaminated areas. The innovative well design allows for concurrent sampling of groundwater and volatile organic compounds (VOCs) in the vadose (unsaturated) zone from a single well, saving considerable time and money. The sample tubes are banded to the outer well casing during installation of the well casing. 10 figs.
Method and apparatus for phase and amplitude detection
Cernosek, R.W.; Frye, G.C.; Martin, S.J.
1998-06-09
A new class of techniques has been developed which allow inexpensive application of SAW-type chemical sensor devices while retaining high sensitivity (ppm) to chemical detection. The new techniques do not require that the sensor be part of an oscillatory circuit, allowing large concentrations of, e.g., chemical vapors in air, to be accurately measured without compromising the capacity to measure trace concentrations. Such devices have numerous potential applications in environmental monitoring, from manufacturing environments to environmental restoration. 12 figs.
Long Term Ex Vivo Culture and Live Imaging of Drosophila Larval Imaginal Discs.
Tsao, Chia-Kang; Ku, Hui-Yu; Lee, Yuan-Ming; Huang, Yu-Fen; Sun, Yi Henry
Continuous imaging of live tissues provides a clear temporal sequence of biological events. The Drosophila imaginal discs have been popular experimental subjects for the study of a wide variety of biological phenomena, but long term culture that allows normal development has not been satisfactory. Here we report a culture method that can sustain normal development for 18 hours and allows live imaging. The method is validated in multiple discs and for cell proliferation, differentiation and migration. However, it does not support disc growth and cannot support cell proliferation for more than 7 to 12 hr. We monitored the cellular behavior of retinal basal glia in the developing eye disc and found that distinct glia types have distinct properties of proliferation and migration. The live imaging provided direct proof that wrapping glia differentiated from existing glia after migrating to the anterior front, and unexpectedly found that they undergo endoreplication before wrapping axons, and their nuclei migrate up and down along the axons. UV-induced specific labeling of a single carpet glia also showed that the two carpet glia membranes do not overlap, suggesting a tiling or repulsion mechanism between the two cells. These findings demonstrated the usefulness of an ex vivo culture method and live imaging.
Determination of plasma density from data on the ion current to cylindrical and planar probes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Voloshin, D. G., E-mail: dvoloshin@mics.msu.su; Vasil’eva, A. N.; Kovalev, A. S.
2016-12-15
To improve probe methods of plasma diagnostics, special probe measurements were performed and numerical models describing ion transport to a probe with allowance for collisions were developed. The current–voltage characteristics of cylindrical and planar probes were measured in an RF capacitive discharge in argon at a frequency of 81 MHz and plasma densities of 10^10–10^11 cm^-3, typical of modern RF reactors. 1D and 2D numerical models based on the particle-in-cell method with Monte Carlo collisions for simulating ion motion and the Boltzmann equilibrium for electrons are developed to describe current collection by a probe. The models were used to find the plasma density from the ion part of the current–voltage characteristic, study the effect of ion collisions, and verify simplified approaches to determining the plasma density. A 1D hydrodynamic model of the ion current to a cylindrical probe with allowance for ion collisions is proposed. For a planar probe, a method to determine the plasma density from the averaged numerical results is developed. A comparative analysis of different approaches to calculating the plasma density from the ion current to a probe is performed.
Royer, Heather R; Fernandez-Lambert, Katherin M; Moreno, Megan A
2013-09-01
Sexually transmitted diseases are common among young women and effective self-management is foundational to improving health outcomes and preventing negative sequelae. Advances in technology create the opportunity for innovative delivery methods of self-management interventions. However, it is essential to conduct formative research with the target population to identify both the needs and the preferences for the content and delivery method of a sexually transmitted disease self-management intervention prior to intervention development. Eight focus groups were conducted with 35 young women between 18 and 24 years of age. We found that young women strongly support the use of a Web-based intervention to provide sexually transmitted disease self-management guidance. Women were interested in receiving comprehensive management information from the perspective of both clinicians and other women who have experienced a sexually transmitted disease. There was a clear interest in incorporating new media into the Web-based intervention to allow for communication with providers as well as to create opportunities for social networking between women. This formative research provides critical information about the content and delivery method of a self-management intervention and gives direction for intervention development that is inclusive of varying types of new media to allow for connectivity among users, their peers, and clinicians.
Hu, Hao; Yang, Weitao
2013-01-01
Determining the free energies and mechanisms of chemical reactions in solution and enzymes is a major challenge. For such complex reaction processes, combined quantum mechanics/molecular mechanics (QM/MM) method is the most effective simulation method to provide an accurate and efficient theoretical description of the molecular system. The computational costs of ab initio QM methods, however, have limited the application of ab initio QM/MM methods. Recent advances in ab initio QM/MM methods allowed the accurate simulation of the free energies for reactions in solution and in enzymes and thus paved the way for broader application of the ab initio QM/MM methods. We review here the theoretical developments and applications of the ab initio QM/MM methods, focusing on the determination of reaction path and the free energies of the reaction processes in solution and enzymes. PMID:24146439
A modified Finite Element-Transfer Matrix for control design of space structures
NASA Technical Reports Server (NTRS)
Tan, T.-M.; Yousuff, A.; Bahar, L. Y.; Konstandinidis, M.
1990-01-01
The Finite Element-Transfer Matrix (FETM) method was developed for reducing the computational efforts involved in structural analysis. While being widely used by structural analysts, this method does, however, have certain limitations, particularly when used for the control design of large flexible structures. In this paper, a new formulation based on the FETM method is presented. The new method effectively overcomes the limitations in the original FETM method, and also allows an easy construction of reduced models that are tailored for the control design. Other advantages of this new method include the ability to extract open loop frequencies and mode shapes with less computation, and simplification of the design procedures for output feedback, constrained compensation, and decentralized control. The development of this new method and the procedures for generating reduced models using this method are described in detail and the role of the reduced models in control design is discussed through an illustrative example.
Semeraro, Michela; Rizzo, Cristiano; Boenzi, Sara; Cappa, Marco; Bertini, Enrico; Antonetti, Giacomo; Dionisi-Vici, Carlo
2016-07-01
Peroxisomal disorders (PDs) present with wide phenotypic variability. An appropriate diagnosis requires a complete analysis of peroxisomal metabolites. We developed a multiplex LC-MS/MS method, using atmospheric pressure chemical ionization, allowing the simultaneous determination in plasma of very-long-chain fatty acids, phytanic, pristanic and docosahexaenoic acids, and di- and tri-hydroxycolestanoic bile acids. Two hundred microliters of plasma extracted with acetonitrile and 200μl extracted with hexane after an acid hydrolysis were combined, evaporated, dissolved in 10μl of methanol and analyzed. The acquisition was in negative-ion mode using multiple reaction monitoring. The method was validated analytically and clinically. Linearity was 0.1-200μmol/l for docosanoic, cis-13-docosenoic, tetracosanoic, cis-15-tetracosenoic and phytanic acids; 0.01-10μmol/l for hexacosanoic acid; 0.02-20μmol/l for di-hydroxycolestanoic, tri-hydroxycolestanoic and pristanic acids; 0.3-300μmol/l for docosahexaenoic acid. Intra-day and inter-day CVs were below 3.88 and 3.98, respectively, for all compounds. Samples from patients with known peroxisomal disorders were compared with controls, and the method allowed the diagnosis to be confirmed in all subjects with 100% sensitivity. The advantage of this multiplex method is that it allows the simultaneous determination of a large number of peroxisome biomarkers in a single chromatographic run, with a simple preparative phase and without derivatization. Copyright © 2016 Elsevier B.V. All rights reserved.
Scalable Production of Glioblastoma Tumor-initiating Cells in 3 Dimension Thermoreversible Hydrogels
NASA Astrophysics Data System (ADS)
Li, Qiang; Lin, Haishuang; Wang, Ou; Qiu, Xuefeng; Kidambi, Srivatsan; Deleyrolle, Loic P.; Reynolds, Brent A.; Lei, Yuguo
2016-08-01
There is growing interest in developing drugs that specifically target glioblastoma tumor-initiating cells (TICs). Current cell culture methods, however, cannot cost-effectively produce the large numbers of glioblastoma TICs required for drug discovery and development. In this paper we report a new method that encapsulates patient-derived primary glioblastoma TICs and grows them in 3 dimension thermoreversible hydrogels. Our method allows long-term culture (~50 days, 10 passages tested, accumulative ~>10^10-fold expansion) with both high growth rate (~20-fold expansion/7 days) and high volumetric yield (~2.0 × 10^7 cells/ml) without the loss of stemness. The scalable method can be used to produce sufficient, affordable glioblastoma TICs for drug discovery.
Development of an ELA-DRA gene typing method based on pyrosequencing technology.
Díaz, S; Echeverría, M G; It, V; Posik, D M; Rogberg-Muñoz, A; Pena, N L; Peral-García, P; Vega-Pla, J L; Giovambattista, G
2008-11-01
The polymorphism of the equine lymphocyte antigen (ELA) class II DRA gene has previously been detected by polymerase chain reaction-single-strand conformational polymorphism (PCR-SSCP) and reference strand-mediated conformation analysis. These methodologies allowed the identification of 11 ELA-DRA exon 2 sequences, three of which are widely distributed among domestic horse breeds. Herein, we describe the development of a pyrosequencing-based method applicable to ELA-DRA typing, by screening samples from eight different horse breeds previously typed by PCR-SSCP. This sequence-based method would be useful in high-throughput genotyping of major histocompatibility complex genes in horses and other animal species, making this system interesting as a rapid screening method for animal genotyping of immune-related genes.
Evaluation of laser ablation crater relief by white light micro interferometer
NASA Astrophysics Data System (ADS)
Gurov, Igor; Volkov, Mikhail; Zhukova, Ekaterina; Ivanov, Nikita; Margaryants, Nikita; Potemkin, Andrey; Samokhvalov, Andrey; Shelygina, Svetlana
2017-06-01
A multi-view scanning method is suggested for assessing complicated surface relief with a white light interferometer. Peculiarities of the method are demonstrated on a special object in the form of a quadrangular pyramidal cavity, which is formed during measurement of the micro-hardness of materials using a hardness gauge. An algorithm for joint processing of the multi-view scanning results is developed that allows correct relief values to be recovered. Laser ablation craters were studied experimentally, and their relief was recovered using the developed method. It is shown that the multi-view scanning reduces ambiguity when determining the local depth of the micro-relief of laser ablation craters. Results of experimental studies of the multi-view scanning method and the data processing algorithm are presented.
Evaluation of Nonlinear Constitutive Properties of Concrete
1990-02-01
34 " \\ 19:#BSTRACT (Continue on reverse if necesuar 4nd identify by block number) 3his report describes the development of a methodology that allows for...Continued). The method of evaluation, as developed herein, consists of the following steps: 1. The design and execution of a series of material... developed in Step L. 3. Design and execution of the series of verification tests which provide data suffi- cient for defining key complex material
MODELING LEACHING OF VIRUSES BY THE MONTE CARLO METHOD
A predictive screening model was developed for fate and transport of viruses in the unsaturated zone. A database of input parameters allowed Monte Carlo analysis with the model. The resulting kernel densities of predicted attenuation during percolation indicated very ...
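The Monte Carlo step of such a screening analysis can be illustrated with a deliberately simplified attenuation model: first-order virus inactivation over the advective travel time through the unsaturated zone, with uncertain inputs sampled from assumed distributions. The parameter ranges below are placeholders, not the EPA model or its database.

```python
# Illustrative Monte Carlo sketch (not the EPA screening model): sample
# uncertain inputs, compute log10 attenuation during percolation, summarize.
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
depth_m = rng.uniform(1.0, 10.0, n)                      # depth to water table
velocity_m_per_d = rng.lognormal(np.log(0.1), 0.7, n)    # pore-water velocity
inactivation_per_d = rng.lognormal(np.log(0.3), 0.5, n)  # first-order rate

travel_time_d = depth_m / velocity_m_per_d
log10_attenuation = inactivation_per_d * travel_time_d / np.log(10)

for q in (5, 50, 95):
    print(f"{q}th percentile: {np.percentile(log10_attenuation, q):.1f} log10 units")
```

The distribution of predicted attenuation (here summarized by percentiles) is the quantity a kernel density estimate would smooth in the screening model's output.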
Fast sweeping methods for hyperbolic systems of conservation laws at steady state II
NASA Astrophysics Data System (ADS)
Engquist, Björn; Froese, Brittany D.; Tsai, Yen-Hsi Richard
2015-04-01
The idea of using fast sweeping methods for solving stationary systems of conservation laws has previously been proposed for efficiently computing solutions with sharp shocks. We further develop these methods to allow for a more challenging class of problems including problems with sonic points, shocks originating in the interior of the domain, rarefaction waves, and two-dimensional systems. We show that fast sweeping methods can produce higher-order accuracy. Computational results validate the claims of accuracy, sharp shock curves, and optimal computational efficiency.
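The basic sweeping mechanism is easiest to see on the classical 1D eikonal problem rather than on the systems of conservation laws treated in the paper. The sketch below uses the standard Godunov upwind update inside Gauss-Seidel sweeps of alternating direction; it is an illustration of fast sweeping in general, not the authors' solver.

```python
# Fast sweeping illustration on the 1D eikonal equation |u'(x)| = f(x) on
# [0, 1] with u(0) = u(1) = 0: Godunov upwind updates in alternating sweeps.
import numpy as np

n, h = 201, 1.0 / 200
f = np.ones(n)                          # right-hand side (slowness)
u = np.full(n, 1e10)                    # large initial guess
u[0] = u[-1] = 0.0                      # boundary conditions

for sweep in range(4):
    order = range(1, n - 1) if sweep % 2 == 0 else range(n - 2, 0, -1)
    for i in order:
        a = min(u[i - 1], u[i + 1])     # upwind neighbour
        u[i] = min(u[i], a + f[i] * h)  # 1D Godunov update, kept monotone

# For f = 1 the exact solution is the distance to the nearest boundary point
x = np.linspace(0, 1, n)
print("max error:", np.abs(u - np.minimum(x, 1 - x)).max())
```

A small, fixed number of sweeps suffices because information propagates along characteristics in each sweep direction, which is the source of the optimal computational efficiency claimed for these methods.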
Ranking of options of real estate use by expert assessments mathematical processing
NASA Astrophysics Data System (ADS)
Lepikhina, O. Yu; Skachkova, M. E.; Mihaelyan, T. A.
2018-05-01
The article is devoted to the development of a real estate assessment concept. For conditions of multivariate use of real estate, a method based on calculating an integral indicator of the efficiency of each variant is proposed. In order to calculate the weights of the efficiency criteria, an expert method, the Analytic Hierarchy Process, and its mathematical support are used. The method allows ranking of alternative types of real estate use according to their efficiency. The method was applied to one of the land parcels located in the Primorsky district of Saint Petersburg.
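The Analytic Hierarchy Process step can be sketched directly: criteria weights come from the principal eigenvector of a pairwise comparison matrix, a consistency ratio checks the expert judgments, and alternatives are ranked by a weighted score. The pairwise judgments and alternative scores below are hypothetical, not the study's expert data.

```python
# AHP sketch: weights from the principal eigenvector of a pairwise comparison
# matrix, consistency check, and ranking of land-use alternatives.
import numpy as np

# Saaty-scale pairwise comparisons of 3 efficiency criteria (hypothetical)
P = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(P)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix)
ci = (eigvals.real[k] - len(P)) / (len(P) - 1)
cr = ci / 0.58

# Criterion scores of 4 alternative uses (rows), normalized per column
scores = np.array([[0.40, 0.20, 0.10],
                   [0.30, 0.30, 0.20],
                   [0.20, 0.10, 0.40],
                   [0.10, 0.40, 0.30]])
ranking = np.argsort(scores @ weights)[::-1]
print("weights:", np.round(weights, 3), " CR:", round(cr, 3), " ranking:", ranking)
```

A consistency ratio below about 0.1 is conventionally taken to mean the expert judgments are usable; the integral indicator is then the weighted score used for the ranking.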
Dichotomisation using a distributional approach when the outcome is skewed.
Sauzet, Odile; Ofuya, Mercy; Peacock, Janet L
2015-04-24
Dichotomisation of continuous outcomes has been rightly criticised by statisticians because of the loss of information incurred. However, to communicate a comparison of risks, dichotomised outcomes may be necessary. Peacock et al. developed a distributional approach to the dichotomisation of normally distributed outcomes, allowing the presentation of a comparison of proportions with a measure of precision which reflects the comparison of means. Many common health outcomes are skewed, so the distributional method for the dichotomisation of continuous outcomes may not apply. We present a methodology to obtain dichotomised outcomes for skewed variables, illustrated with data from several observational studies. We also report the results of a simulation study which tests the robustness of the method to deviation from normality and assesses the validity of the newly developed method. The review showed that the pattern of dichotomisation varied between outcomes. Birthweight, blood pressure and BMI can either be transformed to normal, so that normal distributional estimates for a comparison of proportions can be obtained, or, better, the skew-normal method can be used. For gestational age, no satisfactory transformation is available and only the skew-normal method is reliable. The normal distributional method is also reliable when there are small deviations from normality. The distributional method, with its applicability to common skewed data, allows researchers to provide both continuous and dichotomised estimates without losing information or precision. This will have the effect of providing a practical understanding of the difference in means in terms of proportions.
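The distributional idea is that the proportion beyond a clinical cutpoint is computed from the fitted distribution rather than by counting observations. The sketch below contrasts the normal and skew-normal versions on simulated, left-skewed birthweight-like data; the numbers are illustrative and not from the studies reviewed.

```python
# Distributional dichotomisation sketch: proportion below a cutpoint from a
# fitted normal model and from a fitted skew-normal model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
birthweight = stats.skewnorm.rvs(a=-3, loc=3.8, scale=0.7, size=2000,
                                 random_state=rng)   # left-skewed, in kg
cutpoint = 2.5                                       # low birthweight, kg

# Normal distributional estimate: Phi((c - mean) / sd)
p_normal = stats.norm.cdf(cutpoint, birthweight.mean(), birthweight.std(ddof=1))

# Skew-normal distributional estimate: fit shape/location/scale, then CDF
a, loc, scale = stats.skewnorm.fit(birthweight)
p_skewnorm = stats.skewnorm.cdf(cutpoint, a, loc, scale)

p_empirical = (birthweight < cutpoint).mean()
print(f"empirical {p_empirical:.3f}  normal {p_normal:.3f}  skew-normal {p_skewnorm:.3f}")
```

When the data are genuinely skewed, the skew-normal estimate tracks the empirical proportion while retaining the precision advantage of a model-based estimate, which is the point made for gestational age above.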
Variational optical flow computation in real time.
Bruhn, Andrés; Weickert, Joachim; Feddern, Christian; Kohlberger, Timo; Schnörr, Christoph
2005-05-01
This paper investigates the usefulness of bidirectional multigrid methods for variational optical flow computations. Although these numerical schemes are among the fastest methods for solving equation systems, they are rarely applied in the field of computer vision. We demonstrate how to employ those numerical methods for the treatment of variational optical flow formulations and show that the efficiency of this approach even allows for real-time performance on standard PCs. As a representative for variational optic flow methods, we consider the recently introduced combined local-global method. It can be considered as a noise-robust generalization of the Horn and Schunck technique. We present a decoupled, as well as a coupled, version of the classical Gauss-Seidel solver, and we develop several multigrid implementations based on a discretization coarse grid approximation. In contrast with standard bidirectional multigrid algorithms, we take advantage of intergrid transfer operators that allow for nondyadic grid hierarchies. As a consequence, no restrictions concerning the image size or the number of traversed levels have to be imposed. In the experimental section, we juxtapose the developed multigrid schemes and demonstrate their superior performance when compared to unidirectional multigrid methods and nonhierarchical solvers. For the well-known 316 x 252 Yosemite sequence, we succeeded in computing the complete set of dense flow fields in three quarters of a second on a 3.06-GHz Pentium4 PC. This corresponds to a frame rate of 18 flow fields per second which outperforms the widely used Gauss-Seidel method by almost three orders of magnitude.
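The point-wise solver that such multigrid schemes accelerate is easiest to show for the plain Horn and Schunck model, of which the combined local-global method is a generalization. The sketch below runs Jacobi-type iterations on a synthetic, periodically shifted test image; it is a simplification for illustration, not the paper's coupled/decoupled solvers or its multigrid hierarchy.

```python
# Toy Horn-Schunck solver: the kind of point-wise relaxation that bidirectional
# multigrid accelerates in variational optical flow computation.
import numpy as np

n = 64
xx, yy = np.meshgrid(np.linspace(0, 2 * np.pi, n, endpoint=False),
                     np.linspace(0, 2 * np.pi, n, endpoint=False))
I1 = 10 * np.sin(xx) * np.cos(yy)
I2 = np.roll(I1, 1, axis=1)                 # pattern shifted by one pixel in x

Ix = (np.roll(I1, -1, axis=1) - np.roll(I1, 1, axis=1)) / 2
Iy = (np.roll(I1, -1, axis=0) - np.roll(I1, 1, axis=0)) / 2
It = I2 - I1

alpha2 = 0.5                                # smoothness weight (squared)
u = np.zeros_like(I1)
v = np.zeros_like(I1)

def nbr_mean(f):
    # 4-neighbour average with periodic boundaries
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
            np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4

for _ in range(500):
    ub, vb = nbr_mean(u), nbr_mean(v)
    common = (Ix * ub + Iy * vb + It) / (alpha2 + Ix**2 + Iy**2)
    u = ub - Ix * common
    v = vb - Iy * common

print("mean recovered flow:", round(u.mean(), 2), round(v.mean(), 2), "(truth 1, 0)")
```

Hundreds of such relaxation sweeps are needed on a single grid; the paper's contribution is to replace this slow single-level iteration with a hierarchy of grids so that the same system is solved in real time.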
Laser-based methods for the analysis of low molecular weight compounds in biological matrices.
Kiss, András; Hopfgartner, Gérard
2016-07-15
Laser-based desorption and/or ionization methods play an important role in the field of the analysis of low molecular weight compounds (LMWCs) because they allow direct analysis with high-throughput capabilities. In recent years there have been several improvements in ionization methods, with the emergence of novel atmospheric ion sources such as laser ablation electrospray ionization or laser diode thermal desorption and atmospheric pressure chemical ionization, and in sample preparation methods, with the development of new matrix compounds for matrix-assisted laser desorption/ionization (MALDI). Also, the combination of ion mobility separation with laser-based ionization methods is starting to gain popularity with access to commercial systems. These developments have been driven mainly by the emergence of new application fields such as MS imaging and non-chromatographic analytical approaches for quantification. This review aims to present these new developments in laser-based methods for the analysis of low molecular weight compounds by MS, together with several potential applications. Copyright © 2016 Elsevier Inc. All rights reserved.
Near Infrared Imaging As a Method of Studying Tsetse Fly (Diptera: Glossinidae) Pupal Development
Moran, Zelda R.; Parker, Andrew G.
2016-01-01
Near infrared (NIR) photography and video was investigated as a method for observing and recording intrapuparial development in the tsetse fly Glossina palpalis gambiensis and other Muscomorpha (Cyclorrhapha) Diptera. We showed that NIR light passes through the puparium, permitting images of the true pupae and pharate adult to be captured. Various wavelengths of NIR light from 880 to 1060 nm were compared to study the development of tsetse fly pupae from larviposition to emergence, using time-lapse videos and photographs. This study was carried out to advance our understanding of tsetse pupal development, specifically with the goal of improving a sorting technique which could separate male from female tsetse flies several days before emergence. Separation of the sexes at this stage is highly desirable for operational tsetse sterile insect technique control programmes, as it would permit the easy retention of females for the colony while allowing the males to be handled, irradiated and shipped in the pupal stage when they are less sensitive to vibration. In addition, it presents a new methodology for studying the pupal stage of many coarctate insects for many applications. NIR imaging permits observation of living pupae, allowing the entire development process to be observed without disruption. PMID:27402791
Schneider, Lon S.; Mangialasche, Francesca; Andreasen, Niels; Feldman, Howard; Giacobini, Ezio; Jones, Roy; Mantua, Valentina; Mecocci, Patrizia; Pani, Luca; Winblad, Bengt; Kivipelto, Miia
2014-01-01
The modern era of drug development for Alzheimer’s disease began with the proposal of the cholinergic hypothesis of memory impairment and the 1984 research criteria for Alzheimer’s disease. Since then, despite the evaluation of numerous potential treatments in clinical trials, only four cholinesterase inhibitors and memantine have shown sufficient safety and efficacy to allow marketing approval at an international level. Although this is probably because the other drugs tested were ineffective, inadequate clinical development methods have also been blamed for the failures. Here we review the development of treatments for Alzheimer’s disease during the past 30 years, considering the drugs, potential targets, late-stage clinical trials, development methods, emerging use of biomarkers and evolution of regulatory considerations in order to summarize advances and anticipate future developments. We have considered late-stage Alzheimer’s disease drug development from 1984 to 2013, including individual clinical trials, systematic and qualitative reviews, meta-analyses, methods, commentaries, position papers and guidelines. We then review the evolution of drugs in late clinical development, methods, biomarkers and regulatory issues. Although a range of small molecules and biological products against many targets have been investigated in clinical trials, the predominant drug targets have been the cholinergic system and the amyloid cascade. Trial methods have evolved incrementally: inclusion criteria have largely remained focused on mild to moderate Alzheimer’s disease criteria, recently extending to early or prodromal Alzheimer disease or ‘mild cognitive impairment due to Alzheimer’s disease’, for drugs considered to be disease modifying. The duration of trials has remained at 6 to 12 months for drugs intended to improve symptoms; 18- to 24-month trials have been established for drugs expected to attenuate clinical course. Cognitive performance, activities of daily living, global change and severity ratings have persisted as the primary clinically relevant outcomes. Regulatory guidance and oversight have evolved to allow for enrichment of early-stage Alzheimer’s disease trial samples by using biomarkers and phase-specific outcomes. In conclusion, validated drug targets for Alzheimer’s disease remain to be developed. Only drugs that affect an aspect of cholinergic function have shown consistent, but modest, clinical effects in late-phase trials. There is opportunity for substantial improvements in drug discovery and clinical development methods. PMID:24605808
Li, Yumei; Xiang, Yang; Xu, Chao; Shen, Hui; Deng, Hongwen
2018-01-15
The development of next-generation sequencing technologies has facilitated the identification of rare variants. Family-based design is commonly used to effectively control for population admixture and substructure, which is more prominent for rare variants. Case-parents studies, as typical strategies in family-based design, are widely used in rare variant-disease association analysis. Current methods in case-parents studies are based on complete case-parents data; however, parental genotypes may be missing in case-parents trios, and removing these data may lead to a loss in statistical power. The present study focuses on testing for rare variant-disease association in case-parents studies while allowing for missing parental genotypes. In this report, we extended the collapsing method for rare variant association analysis in case-parents studies to allow for missing parental genotypes, and investigated the performance of two methods that use the difference of genotypes between affected offspring and their corresponding "complements" in case-parent trios within the TDT framework. Using simulations, we showed that, compared with methods using only complete case-parents data, the proposed strategy allowing for missing parental genotypes, or even adding unrelated affected individuals, can greatly improve the statistical power and is meanwhile not affected by population stratification. We conclude that adding case-parents data with missing parental genotypes to the complete case-parents data set can greatly improve the power of our strategy for rare variant-disease association.
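For orientation, the classical transmission/disequilibrium test on which such case-parents analyses build, combined with simple burden-style pooling across rare variants, looks as follows. The counts are hypothetical, and this is only the standard complete-trio TDT, not the authors' extension to missing parental genotypes.

```python
# Classical TDT with burden-style collapsing over rare variants in a region:
# compare transmitted vs. non-transmitted rare-allele counts from heterozygous
# parents with a 1-df McNemar-type chi-square statistic.
import numpy as np
from scipy import stats

# Hypothetical per-variant counts from heterozygous parents
transmitted = np.array([4, 2, 3, 1])
untransmitted = np.array([1, 0, 1, 1])

# Collapse (pool) the counts across the rare variants in the gene region
b, c = transmitted.sum(), untransmitted.sum()
chi2 = (b - c) ** 2 / (b + c)
p_value = stats.chi2.sf(chi2, df=1)
print(f"collapsed TDT: chi2 = {chi2:.2f}, p = {p_value:.3f}")
```

Because transmissions are compared within families, the statistic is robust to population stratification, which is the property the proposed strategy preserves while recovering trios with missing parents.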
NASA Astrophysics Data System (ADS)
Ghosh, Pratik
1992-01-01
The investigations focussed on in vivo NMR imaging studies of magnetic particles with and within neural cells. NMR imaging methods, both Fourier transform and projection reconstruction, were implemented and new protocols were developed to perform "Neuronal Tracing with Magnetic Labels" on small animal brains. Having performed the preliminary experiments with neuronal tracing, new optimized coils and experimental set-up were devised. A novel gradient coil technology along with new rf-coils were implemented, and optimized for future use with small animals in them. A new magnetic labelling procedure was developed that allowed labelling of billions of cells with ultra -small magnetite particles in a short time. The relationships among the viability of such cells, the amount of label and the contrast in the images were studied as quantitatively as possible. Intracerebral grafting of magnetite labelled fetal rat brain cells made it possible for the first time to attempt monitoring in vivo the survival, differentiation, and possible migration of both host and grafted cells in the host rat brain. This constituted the early steps toward future experiments that may lead to the monitoring of human brain grafts of fetal brain cells. Preliminary experiments with direct injection of horse radish peroxidase-conjugated magnetite particles into neurons, followed by NMR imaging, revealed a possible non-invasive alternative, allowing serial study of the dynamic transport pattern of tracers in single living animals. New gradient coils were built by using parallel solid-conductor ribbon cables that could be wrapped easily and quickly. Rapid rise times provided by these coils allowed implementation of fast imaging methods. Optimized rf-coil circuit development made it possible to understand better the sample-coil properties and the associated trade -offs in cases of small but conducting samples.
The Global Survey Method Applied to Ground-level Cosmic Ray Measurements
NASA Astrophysics Data System (ADS)
Belov, A.; Eroshenko, E.; Yanke, V.; Oleneva, V.; Abunin, A.; Abunina, M.; Papaioannou, A.; Mavromichalaki, H.
2018-04-01
The global survey method (GSM) technique unites simultaneous ground-level observations of cosmic rays in different locations and allows us to obtain the main characteristics of cosmic-ray variations outside of the atmosphere and magnetosphere of Earth. This technique has been developed and applied in numerous studies over many years by the Institute of Terrestrial Magnetism, Ionosphere and Radiowave Propagation (IZMIRAN). We here describe the IZMIRAN version of the GSM in detail. With this technique, the hourly data of the world-wide neutron-monitor network from July 1957 until December 2016 were processed, and further processing is enabled upon the receipt of new data. The result is a database of homogeneous and continuous hourly characteristics of the density variations (an isotropic part of the intensity) and the 3D vector of the cosmic-ray anisotropy. It includes all of the effects that could be identified in galactic cosmic-ray variations that were caused by large-scale disturbances of the interplanetary medium in more than 50 years. These results in turn became the basis for a database on Forbush effects and interplanetary disturbances. This database allows correlating various space-environment parameters (the characteristics of the Sun, the solar wind, et cetera) with cosmic-ray parameters and studying their interrelations. We also present features of the coupling coefficients for different neutron monitors that enable us to make a connection from ground-level measurements to primary cosmic-ray variations outside the atmosphere and the magnetosphere. We discuss the strengths and weaknesses of the current version of the GSM as well as further possible developments and improvements. The method developed allows us to minimize the problems of the neutron-monitor network, which are typical for experimental physics, and to considerably enhance its advantages.
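At its core, a global-survey-style analysis fits, for each hour, an isotropic density variation plus a low-order angular harmonic to the count-rate variations of many stations, each viewed along its asymptotic direction. The toy sketch below uses a drastically simplified one-angle geometry and synthetic data; real GSM processing uses full coupling coefficients and asymptotic directions, so this is only a schematic of the fitting step, not the IZMIRAN implementation.

```python
# Toy global-survey fit for one hour: isotropic term A0 plus first angular
# harmonic (Ax, Ay) regressed on simplified station asymptotic longitudes.
import numpy as np

rng = np.random.default_rng(7)
n_stations = 30
asym_lon = rng.uniform(0, 2 * np.pi, n_stations)   # simplified directions

# Synthetic "truth": 1% density decrease plus a 0.5% anisotropy toward 120 deg
A0_true, amp_true, phase_true = -1.0, 0.5, np.deg2rad(120)
v = (A0_true + amp_true * np.cos(asym_lon - phase_true)
     + rng.normal(0, 0.1, n_stations))             # observed variations, %

# Linear least squares: v ~ A0 + Ax*cos(lon) + Ay*sin(lon)
X = np.column_stack([np.ones(n_stations), np.cos(asym_lon), np.sin(asym_lon)])
A0, Ax, Ay = np.linalg.lstsq(X, v, rcond=None)[0]
print(f"density {A0:.2f}%  anisotropy {np.hypot(Ax, Ay):.2f}% "
      f"toward {np.rad2deg(np.arctan2(Ay, Ax)) % 360:.0f} deg")
```

Repeating such a fit hour by hour over decades yields the kind of continuous density and anisotropy database described above.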
GEM detector performance with innovative micro-TPC readout in high magnetic field
NASA Astrophysics Data System (ADS)
Garzia, I.; Alexeev, M.; Amoroso, A.; Baldini Ferroli, R.; Bertani, M.; Bettoni, D.; Bianchi, F.; Calcaterra, A.; Canale, N.; Capodiferro, M.; Cassariti, V.; Cerioni, S.; Chai, J. Y.; Chiozzi, S.; Cibinetto, G.; Cossio, F.; Cotta Ramusino, A.; De Mori, F.; Destefanis, M.; Dong, J.; Evangelisti, F.; Evangelisti, F.; Farinelli, R.; Fava, L.; Felici, G.; Fioravanti, E.; Gatta, M.; Greco, M.; Lavezzi, L.; Leng, C. Y.; Li, H.; Maggiora, M.; Malaguti, R.; Marcello, S.; Melchiorri, M.; Mezzadri, G.; Mignone, M.; Morello, G.; Pacetti, S.; Patteri, P.; Pellegrino, J.; Pelosi, A.; Rivetti, A.; Rolo, M. D.; Savrié, M.; Scodeggio, M.; Soldani, E.; Sosio, S.; Spataro, S.; Tskhadadze, E.; Verma, S.; Wheadon, R.; Yan, L.
2018-01-01
Gas detector development is one of the pillars of research in fundamental physics. In recent years, a new class of detectors, called Micro Pattern Gas Detectors (MPGDs), has made it possible to overcome several problems related to other commonly used detectors, such as drift chambers and micro-strip detectors, reducing the rate of discharges and providing better radiation tolerance. Among the most used MPGDs are the Gas Electron Multipliers (GEMs). Invented by Sauli in 1997, GEMs have nowadays become an important reality for particle detectors in high energy physics. Commonly deployed as fast timing detectors and triggers, their fast response, high rate capability and high radiation hardness make them also suitable as tracking detectors. The readout scheme is one of the most important features in tracking technology. Analog readout based on the center-of-gravity calculation allows the limit imposed by digital pads, whose spatial resolution is limited by the pitch dimensions, to be overcome. However, the presence of high external magnetic fields can distort the electron cloud and affect the performance. The development of the micro-TPC reconstruction method brings GEM detectors into a new perspective, significantly improving the spatial resolution in the presence of high magnetic fields. This innovative technique allows the 3-dimensional particle position to be reconstructed, as in a Time Projection Chamber, but within a drift gap of a few millimeters. In this report, the charge centroid and micro-TPC methods are described in detail. We discuss the results of several test beams performed with planar chambers in magnetic field. These results are among the first developments of the micro-TPC technique for GEM detectors, which allows unprecedented performance to be reached in a high magnetic field of 1 T.
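The two readout ideas compared above can be sketched with a handful of strip signals: the charge centroid is a charge-weighted mean strip position, while the micro-TPC approach turns each strip's drift time into a depth coordinate and fits a short track segment. The strip charges, drift times, pitch and drift velocity below are toy values, not the experiment's calibration or reconstruction code.

```python
# Toy sketch of charge-centroid vs. micro-TPC position reconstruction.
import numpy as np

pitch_mm = 0.65                       # strip pitch (illustrative)
strips = np.arange(8)                 # fired strip indices
charge = np.array([2., 10., 24., 30., 22., 9., 3., 1.])    # ADC counts
t_drift_ns = np.array([55., 48., 41., 35., 29., 23., 18., 12.])
v_drift = 0.04                        # mm/ns, assumed drift velocity

# 1) Charge centroid: charge-weighted mean strip position
x_cc = pitch_mm * np.sum(strips * charge) / np.sum(charge)

# 2) micro-TPC: each strip gives a point (x_i, z_i) inside the few-mm drift
#    gap, with z from the drift time; a straight-line fit gives the track
#    position at a reference plane (z = 0) and its inclination.
x_i = pitch_mm * strips
z_i = v_drift * t_drift_ns
slope, intercept = np.polyfit(z_i, x_i, 1)
print(f"centroid x = {x_cc:.3f} mm, micro-TPC x(z=0) = {intercept:.3f} mm, "
      f"tan(theta) = {slope:.3f}")
```

When a magnetic field skews the electron cloud, the centroid is biased, while the segment fit retains the positional information, which is why the micro-TPC readout performs better at 1 T.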
The current status and future prospects of computer-assisted hip surgery.
Inaba, Yutaka; Kobayashi, Naomi; Ike, Hiroyuki; Kubota, So; Saito, Tomoyuki
2016-03-01
The advances in computer assistance technology have allowed detailed three-dimensional preoperative planning and simulation of preoperative plans. The use of a navigation system as an intraoperative assistance tool allows more accurate execution of the preoperative plan, compared to manual operation without assistance of the navigation system. In total hip arthroplasty using CT-based navigation, three-dimensional preoperative planning with computer software allows the surgeon to determine the optimal angle of implant placement at which implant impingement is unlikely to occur in the range of hip joint motion necessary for daily activities of living, and to determine the amount of three-dimensional correction for leg length and offset. With the use of computer navigation for intraoperative assistance, the preoperative plan can be precisely executed. In hip osteotomy using CT-based navigation, the navigation allows three-dimensional preoperative planning, intraoperative confirmation of osteotomy sites, safe performance of osteotomy even under poor visual conditions, and a reduction in exposure doses from intraoperative fluoroscopy. Positions of the tips of chisels can be displayed on the computer monitor during surgery in real time, and staff other than the operator can also be aware of the progress of surgery. Thus, computer navigation also has an educational value. On the other hand, its limitations include the need for placement of trackers, increased radiation exposure from preoperative CT scans, and prolonged operative time. Moreover, because the position of a bone fragment cannot be traced after osteotomy, methods to find its precise position after its movement need to be developed. Despite the need to develop methods for the postoperative evaluation of accuracy for osteotomy, further application and development of these systems are expected in the future. Copyright © 2016 The Japanese Orthopaedic Association. Published by Elsevier B.V. All rights reserved.
Methods Used for Monitoring Cure Reactions in Real-time in an Autoclave
NASA Technical Reports Server (NTRS)
Cooper, John B.; Wise, Kent L.; Jensen, Brian J. (Technical Monitor)
2000-01-01
The goal of the research was to investigate methods for monitoring cure reactions in real-time in an autoclave. This is of particular importance to NASA Langley Research Center because polyimides were proposed for use in the High Speed Civil Transport (HSCT) program. Understanding the cure chemistry behind the polyimides would allow for intelligent processing of the composites made from their use. This work has led to two publications in peer-reviewed journals and a patent. The journal articles are listed as Appendix A which is on the instrument design of the research and Appendix B which is on the cure chemistry. Also, a patent has been awarded for the instrumental design developed under this grant which is given as Appendix C. There has been a significant amount of research directed at developing methods for monitoring cure reactions in real-time within the autoclave. The various research efforts can be categorized as methods providing either direct chemical bonding information or methods that provide indirect chemical bonding information. Methods falling into the latter category are fluorescence, dielectric loss, ultrasonic and similar type methods. Correlation of such measurements with the underlying chemistry is often quite difficult since these techniques do not allow monitoring of the curing chemistry which is ultimately responsible for material properties. Direct methods such as vibrational spectroscopy, however, can often be easily correlated with the underlying chemistry of a reaction. Such methods include Raman spectroscopy, mid-IR absorbance, and near-IR absorbance. With the recent advances in fiber-optics, these spectroscopic techniques can be applied to remote on-line monitoring.
2016-01-01
Semiempirical (SE) methods can be derived from either Hartree–Fock or density functional theory by applying systematic approximations, leading to efficient computational schemes that are several orders of magnitude faster than ab initio calculations. Such numerical efficiency, in combination with modern computational facilities and linear scaling algorithms, allows application of SE methods to very large molecular systems with extensive conformational sampling. To reliably model the structure, dynamics, and reactivity of biological and other soft matter systems, however, good accuracy for the description of noncovalent interactions is required. In this review, we analyze popular SE approaches in terms of their ability to model noncovalent interactions, especially in the context of describing biomolecules, water solution, and organic materials. We discuss the most significant errors and proposed correction schemes, and we review their performance using standard test sets of molecular systems for quantum chemical methods and several recent applications. The general goal is to highlight both the value and limitations of SE methods and stimulate further developments that allow them to effectively complement ab initio methods in the analysis of complex molecular systems. PMID:27074247
Preparation of biodiesel with the help of ultrasonic and hydrodynamic cavitation.
Ji, Jianbing; Wang, Jianli; Li, Yongchao; Yu, Yunliang; Xu, Zhichao
2006-12-22
An alkali-catalyzed biodiesel production method with power ultrasound (19.7 kHz) has been developed that allows a short reaction time and high yield because of emulsification and cavitation of the liquid-liquid immiscible system. Orthogonal experiments were employed to evaluate the effects of the synthesis parameters. Furthermore, hydrodynamic cavitation was used for biodiesel production in comparison to the ultrasonic method. Both methods proved to be efficient and to save time and energy in the preparation of biodiesel by transesterification of soybean oil.
NASA Technical Reports Server (NTRS)
Russell, S. S.; Lansing, M. D.
1997-01-01
The goal of this research effort was the development of methods for shearographic and thermographic inspection of coatings, bonds, or laminates inside rocket fuel or oxidizer tanks, fuel lines, and other closed structures. The endoscopic methods allow imaging and inspection inside cavities that are traditionally inaccessible with shearography or thermography cameras. The techniques are demonstrated and suggestions for practical application are made in this report. Drawings of the experimental setups, detailed procedures, and experimental data are included.
Varma, Hari M.; Valdes, Claudia P.; Kristoffersen, Anna K.; Culver, Joseph P.; Durduran, Turgut
2014-01-01
A novel tomographic method based on laser speckle contrast, speckle contrast optical tomography (SCOT), is introduced that allows us to reconstruct the three-dimensional distribution of blood flow in deep tissues. This method is analogous to diffuse optical tomography (DOT) but for deep tissue blood flow. We develop a reconstruction algorithm based on the first Born approximation to generate three-dimensional distributions of flow using experimental data obtained from tissue-simulating phantoms. PMID:24761306
Binder model system to be used for determination of prepolymer functionality
NASA Technical Reports Server (NTRS)
Martinelli, F. J.; Hodgkin, J. H.
1971-01-01
Development of a method for determining the functionality distribution of prepolymers used for rocket binders is discussed. Research has been concerned with accurately determining the gel point of a model polyester system containing a single trifunctional crosslinker, and the application of these methods to more complicated model systems containing a second trifunctional crosslinker, monofunctional ingredients, or a higher functionality crosslinker. Correlations of observed with theoretical gel points for these systems would allow the methods to be applied directly to prepolymers.
NASA Technical Reports Server (NTRS)
Lansing, Matthew D.; Bullock, Michael W.
1996-01-01
The goal of this research effort was the development of methods for shearography and thermography inspection of coatings, bonds, or laminates inside rocket fuel or oxidizer tanks, fuel lines, and other closed structures. The endoscopic methods allow imaging and inspection inside cavities which are traditionally inaccessible with shearography or thermography cameras. The techniques are demonstrated and suggestions for practical application are made in this report. Drawings of the experimental setups, detailed procedures, and experimental data are included.
Architecture for one-shot compressive imaging using computer-generated holograms.
Macfaden, Alexander J; Kindness, Stephen J; Wilkinson, Timothy D
2016-09-10
We propose a synchronous implementation of compressive imaging. This method is mathematically equivalent to prevailing sequential methods, but uses a static holographic optical element to create a spatially distributed spot array from which the image can be reconstructed with an instantaneous measurement. We present the holographic design requirements and demonstrate experimentally that the linear algebra of compressed imaging can be implemented with this technique. We believe this technique can be integrated with optical metasurfaces, which will allow the development of new compressive sensing methods.
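The "linear algebra of compressed imaging" mentioned above amounts to recovering a sparse image x from a small number of linear measurements y = Φx. The sketch below uses orthogonal matching pursuit, one standard recovery algorithm (not necessarily the one used by the authors), with a random matrix standing in for the holographically generated spot-array measurement operator.

import numpy as np

def omp(Phi, y, k):
    # Orthogonal matching pursuit: recover a k-sparse x from y = Phi @ x.
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))  # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
Phi = rng.normal(size=(64, 256)) / np.sqrt(64)    # measurement matrix (placeholder)
x_true = np.zeros(256)
x_true[rng.choice(256, 5, replace=False)] = rng.normal(size=5)
y = Phi @ x_true                                  # one-shot measurement vector
x_rec = omp(Phi, y, k=5)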
Metamorphosis revealed: time-lapse three-dimensional imaging inside a living chrysalis.
Lowe, Tristan; Garwood, Russell J; Simonsen, Thomas J; Bradley, Robert S; Withers, Philip J
2013-07-06
Studies of model insects have greatly increased our understanding of animal development. Yet, they are limited in scope to this small pool of model species: a small number of representatives for a hyperdiverse group with highly varied developmental processes. One factor behind this narrow scope is the challenging nature of traditional methods of study, such as histology and dissection, which can preclude quantitative analysis and do not allow the development of a single individual to be followed. Here, we use high-resolution X-ray computed tomography (CT) to overcome these issues, and three-dimensionally image numerous lepidopteran pupae throughout their development. The resulting models are presented in the electronic supplementary material, as are figures and videos, documenting a single individual throughout development. They provide new insight and details of lepidopteran metamorphosis, and allow the measurement of tracheal and gut volume. Furthermore, this study demonstrates early and rapid development of the tracheae, which become visible in scans just 12 h after pupation. This suggests that there is less remodelling of the tracheal system than previously expected, and is methodologically important because the tracheal system is an often-understudied character system in development. In the future, this form of time-lapse CT-scanning could allow faster and more detailed developmental studies on a wider range of taxa than is presently possible.
Structural Embeddings: Mechanization with Method
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Rushby, John
1999-01-01
The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.
Tedeschi, L O; Seo, S; Fox, D G; Ruiz, R
2006-12-01
Current ration formulation systems used to formulate diets on farms and to evaluate experimental data estimate metabolizable energy (ME)-allowable and metabolizable protein (MP)-allowable milk production from the intake above animal requirements for maintenance, pregnancy, and growth. The changes in body reserves, measured via the body condition score (BCS), are not accounted for in predicting ME and MP balances. This paper presents 2 empirical models developed to adjust predicted diet-allowable milk production based on changes in BCS. Empirical reserves model 1 was based on the reserves model described by the 2001 National Research Council (NRC) Nutrient Requirements of Dairy Cattle, whereas empirical reserves model 2 was developed based on published data of body weight and composition changes in lactating dairy cows. A database containing 134 individually fed lactating dairy cows from 3 trials was used to evaluate these adjustments in milk prediction based on predicted first-limiting ME or MP by the 2001 Dairy NRC and Cornell Net Carbohydrate and Protein System models. The analysis of first-limiting ME or MP milk production without adjustments for BCS changes indicated that the predictions of both models were consistent (r(2) of the regression between observed and model-predicted values of 0.90 and 0.85), had mean biases different from zero (12.3 and 5.34%), and had moderate but different roots of mean square errors of prediction (5.42 and 4.77 kg/d) for the 2001 NRC model and the Cornell Net Carbohydrate and Protein System model, respectively. The adjustment of first-limiting ME- or MP-allowable milk to BCS changes improved the precision and accuracy of both models. We further investigated 2 methods of adjustment; the first method used only the first and last BCS values, whereas the second method used the mean of weekly BCS values to adjust ME- and MP-allowable milk production. The adjustment to BCS changes based on first and last BCS values was more accurate than the adjustment to BCS based on the mean of all BCS values, suggesting that adjusting milk production for mean weekly variations in BCS added more variability to model-predicted milk production. We concluded that both models adequately predicted the first-limiting ME- or MP-allowable milk after adjusting for changes in BCS.
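The evaluation statistics quoted above (the r2 of the observed-versus-predicted regression, the mean bias, and the root mean square error of prediction) can be reproduced with a short routine such as the sketch below. The milk-yield values are placeholders, not the trial data, and expressing bias as a percentage of the observed mean is an assumption about the convention used.

import numpy as np

def evaluation_stats(observed, predicted):
    # Mean bias (absolute and as % of the observed mean), RMSEP,
    # and r^2 of the observed-vs-predicted regression.
    obs = np.asarray(observed, dtype=float)
    pred = np.asarray(predicted, dtype=float)
    mean_bias = np.mean(pred - obs)
    bias_pct = 100.0 * mean_bias / np.mean(obs)
    rmsep = np.sqrt(np.mean((pred - obs) ** 2))
    r2 = np.corrcoef(obs, pred)[0, 1] ** 2
    return mean_bias, bias_pct, rmsep, r2

# Placeholder milk yields (kg/d)
observed = [28.1, 31.4, 25.6, 35.2, 30.0, 27.3]
predicted = [29.0, 33.1, 24.8, 36.5, 31.2, 28.9]
mean_bias, bias_pct, rmsep, r2 = evaluation_stats(observed, predicted)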
Salas, Desirée; Le Gall, Antoine; Fiche, Jean-Bernard; Valeri, Alessandro; Ke, Yonggang; Bron, Patrick; Bellot, Gaetan
2017-01-01
Superresolution light microscopy allows the imaging of labeled supramolecular assemblies at a resolution surpassing the classical diffraction limit. A serious limitation of the superresolution approach is sample heterogeneity and the stochastic character of the labeling procedure. To increase the reproducibility and the resolution of the superresolution results, we apply multivariate statistical analysis methods and 3D reconstruction approaches originally developed for cryogenic electron microscopy of single particles. These methods allow for the reference-free 3D reconstruction of nanomolecular structures from two-dimensional superresolution projection images. Since these 2D projection images all show the structure in high-resolution directions of the optical microscope, the resulting 3D reconstructions have the best possible isotropic resolution in all directions. PMID:28811371
Athanasiou, Lambros; Sakellarios, Antonis I; Bourantas, Christos V; Tsirka, Georgia; Siogkas, Panagiotis; Exarchos, Themis P; Naka, Katerina K; Michalis, Lampros K; Fotiadis, Dimitrios I
2014-07-01
Optical coherence tomography and intravascular ultrasound are the most widely used methodologies in clinical practice as they provide high resolution cross-sectional images that allow comprehensive visualization of the lumen and plaque morphology. Several methods have been developed in recent years to process the output of these imaging modalities, which allow fast, reliable and reproducible detection of the luminal borders and characterization of plaque composition. These methods have proven useful in the study of the atherosclerotic process as they have facilitated analysis of a vast amount of data. This review presents currently available intravascular ultrasound and optical coherence tomography processing methodologies for segmenting and characterizing the plaque area, highlighting their advantages and disadvantages, and discusses the future trends in intravascular imaging.
[Application of cryogenic stimulation in treatment of chronic wounds].
Vinnik, Iu S; Karapetian, G E; Iakimov, S V; Sychev, A G
2008-01-01
The authors studied alterations occurring both in the ultrastructure of the cell matrix and in the microcirculatory bed of the chronic wound after local exposure to a cryoagent. Up-to-date methods, including laser Doppler flowmetry, were used, followed by appropriate statistical processing of the data obtained. Cryogenic stimulation of the wound was shown to result in considerably improved perfusion of the microcirculatory bed, epithelization, and remodeling of the scar. It allowed transformation of a chronic process into an acute one and thus led to a considerably accelerated process of regeneration. The developed method of cryogenic treatment of chronic wounds was used in 35 patients; it allowed quicker healing of the chronic wounds and shortened ambulatory treatment by 3 weeks.
Innovations in air sampling to detect plant pathogens
West, JS; Kimber, RBE
2015-01-01
Many innovations in the development and use of air sampling devices have occurred in plant pathology since the first description of the Hirst spore trap. These include improvements in capture efficiency at relatively high air-volume collection rates, methods to enhance the ease of sample processing with downstream diagnostic methods and even full automation of sampling, diagnosis and wireless reporting of results. Other innovations have been to mount air samplers on mobile platforms such as UAVs and ground vehicles to allow sampling at different altitudes and locations in a short space of time to identify potential sources and population structure. Geographical Information Systems and the application to a network of samplers can allow a greater prediction of airborne inoculum and dispersal dynamics. This field of technology is now developing quickly as novel diagnostic methods allow increasingly rapid and accurate quantifications of airborne species and genetic traits. Sampling and interpretation of results, particularly action-thresholds, is improved by understanding components of air dispersal and dilution processes and can add greater precision in the application of crop protection products as part of integrated pest and disease management decisions. The applications of air samplers are likely to increase, with much greater adoption by growers or industry support workers to aid in crop protection decisions. The same devices are likely to improve information available for detection of allergens causing hay fever and asthma or provide valuable metadata for regional plant disease dynamics. PMID:25745191
New techniques for test development for tactical auto-pilots using microprocessors
NASA Astrophysics Data System (ADS)
Shemeta, E. H.
1980-07-01
This paper reports on a demonstration of the application of the method to generate system-level tests for a typical tactical missile autopilot. The test algorithms are based on the autopilot control law. When loaded on the tester with appropriate control information, the complete autopilot is tested to establish whether the specified control law requirements are met. Thus, the test procedure not only checks that the hardware is functional, but also checks the operational software. The technique also uses a 'learning' mode to allow minor timing or functional deviations from the expected responses to be incorporated in the test procedures. A potential application of this test development technique is the extraction of production test data for the various subassemblies. The technique will 'learn' the input-output patterns forming the basis for development and production tests. If successful, these new techniques should allow the test development process to keep pace with semiconductor progress.
NASA Astrophysics Data System (ADS)
Sanchez, P.; Hinojosa, J.; Ruiz, R.
2005-06-01
Recently, neuromodeling methods for microwave devices have been developed. These methods are suitable for generating models of novel devices and allow fast and accurate simulations and optimizations. However, developing model libraries with these methods is a formidable task, since it requires massive input-output data provided by an electromagnetic simulator or measurements and repeated artificial neural network (ANN) training. This paper presents a strategy that reduces the cost of library development while retaining the advantages of the neuromodeling methods: high accuracy, a large range of geometrical and material parameters, and reduced CPU time. The library models are developed from a set of base prior knowledge input (PKI) models, which take into account the characteristics common to all the models in the library, and high-level ANNs which give the library model outputs from the base PKI models. This technique is illustrated for a microwave multiconductor tunable phase shifter using anisotropic substrates. Closed-form relationships have been developed and are presented in this paper. The results show good agreement with the expected ones.
Nano-sized Contrast Agents to Non-Invasively Detect Renal Inflammation by Magnetic Resonance Imaging
Thurman, Joshua M.; Serkova, Natalie J.
2013-01-01
Several molecular imaging methods have been developed that employ nano-sized contrast agents to detect markers of inflammation within tissues. Renal inflammation contributes to disease progression in a wide range of autoimmune and inflammatory diseases, and a biopsy is currently the only method of definitively diagnosing active renal inflammation. However, the development of new molecular imaging methods that employ contrast agents capable of detecting particular immune cells or protein biomarkers will allow clinicians to evaluate inflammation throughout the kidneys, and to assess a patient's response to immunomodulatory drugs. These imaging tools will improve our ability to validate new therapies and to optimize the treatment of individual patients with existing therapies. This review describes the clinical need for new methods of monitoring renal inflammation, and recent advances in the development of nano-sized contrast agents for detection of inflammatory markers of renal disease. PMID:24206601
Grape colour phenotyping: development of a method based on the reflectance spectrum.
Rustioni, Laura; Basilico, Roberto; Fiori, Simone; Leoni, Alessandra; Maghradze, David; Failla, Osvaldo
2013-01-01
The colour of fruit is an important quality factor for cultivar classification and phenotyping techniques. Besides subjective visual evaluation, new instruments and techniques can be used. This work aims at developing an objective, fast, easy and non-destructive method as a useful support for evaluating grape colour under different cultural and environmental conditions, as well as for breeding programmes and germplasm evaluation, supporting plant characterization and biodiversity preservation. The colours of 120 grape varieties were studied using reflectance spectra. The classification was carried out using cluster and discriminant analysis. Reflectance of the whole berry surface was also compared with the absorption properties of single skin extracts. A phenotyping method based on the reflectance spectra was developed, producing reliable colour classifications. A cultivar-independent index for pigment content evaluation has also been obtained. This work allowed the classification of berry colour using an objective method. Copyright © 2013 John Wiley & Sons, Ltd.
Fingerprinting of music scores
NASA Astrophysics Data System (ADS)
Irons, Jonathan; Schmucker, Martin
2004-06-01
Publishers of sheet music are generally reluctant to distribute their content via the Internet. Although the advantages of online sheet music distribution are numerous, the potential risk of Intellectual Property Rights (IPR) infringement, e.g. illegal online distribution, deters any propensity to innovate. While active protection techniques only deter external risk factors, additional technology is necessary to adequately treat further risk factors. For several media types, including music scores, watermarking technology has been developed, which embeds information in data by suitable data modifications. Furthermore, fingerprinting or perceptual hashing methods have been developed and are being applied especially for audio. These methods allow the identification of content without prior modifications. In this article we motivate the development of watermarking and fingerprinting technologies for sheet music. Proceeding from potential limitations of watermarking methods, we explain why fingerprinting methods are important for sheet music and address potential applications. Finally, we introduce a concept for fingerprinting of sheet music.
Methods for the Study of Gonadal Development.
Piprek, Rafal P
2016-01-01
Current knowledge on gonadal development and sex determination is the product of many decades of research involving a variety of scientific methods from different biological disciplines such as histology, genetics, biochemistry, and molecular biology. The earliest embryological investigations, followed by the invention of microscopy and staining methods, were based on histological examinations. The most robust development of histological staining techniques occurred in the second half of the nineteenth century and resulted in structural descriptions of gonadogenesis. These first studies on gonadal development were conducted on domesticated animals; however, currently the mouse is the most extensively studied species. The next key point in the study of gonadogenesis was the advancement of methods allowing for the in vitro culture of fetal gonads. For instance, this led to the description of the origin of cell lines forming the gonads. Protein detection using antibodies and immunolabeling methods and the use of reporter genes were also invaluable for developmental studies, enabling the visualization of the formation of gonadal structure. Recently, genetic and molecular biology techniques, especially gene expression analysis, have revolutionized studies on gonadogenesis and have provided insight into the molecular mechanisms that govern this process. The successive invention of new methods is reflected in the progress of research on gonadal development.
Temperature Profile in Fuel and Tie-Tubes for Nuclear Thermal Propulsion Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vishal Patel
A finite element method to calculate temperature profiles in heterogeneous geometries of tie-tube moderated LEU nuclear thermal propulsion systems and HEU designs with tie-tubes is developed and implemented in MATLAB. This new method is compared to previous methods to demonstrate shortcomings in those methods. Typical methods to analyze peak fuel centerline temperature in hexagonal geometries rely on spatial homogenization to derive an analytical expression. These methods are not applicable to cores with tie-tube elements because conduction to tie-tubes cannot be accurately modeled with the homogenized models. The fuel centerline temperature directly impacts safety and performance so it must be predicted carefully. The temperature profile in tie-tubes is also important when high temperatures are expected in the fuel because conduction to the tie-tubes may cause melting in tie-tubes, which may set maximum allowable performance. Estimations of maximum tie-tube temperature can be found from equivalent tube methods, however this method tends to be approximate and overly conservative. A finite element model of heat conduction on a unit cell can model spatial dependence and non-linear conductivity for fuel and tie-tube systems allowing for higher design fidelity of Nuclear Thermal Propulsion.
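As a much-simplified, hypothetical illustration of the kind of conduction problem involved, the sketch below assembles a one-dimensional linear finite element system for steady heat conduction with uniform volumetric heating and a fixed edge temperature. The geometry, conductivity and heating rate are placeholder values; the actual analysis described above is multidimensional, with non-linear conductivity and heterogeneous fuel/tie-tube geometry.

import numpy as np

def fem_1d_conduction(L=0.01, n_el=50, k=30.0, q=5e7, T_edge=500.0):
    # Steady 1D conduction -d/dx(k dT/dx) = q on [0, L] with
    # zero flux at x = 0 (symmetry) and T(L) = T_edge (cooled edge).
    n_node = n_el + 1
    h = L / n_el
    K = np.zeros((n_node, n_node))
    f = np.zeros(n_node)
    ke = (k / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])   # element stiffness
    fe = (q * h / 2.0) * np.array([1.0, 1.0])             # element load (uniform heating)
    for e in range(n_el):
        idx = [e, e + 1]
        K[np.ix_(idx, idx)] += ke
        f[idx] += fe
    K[-1, :] = 0.0; K[-1, -1] = 1.0; f[-1] = T_edge       # Dirichlet condition at the edge
    return np.linalg.solve(K, f)                          # nodal temperatures; peak at node 0

T = fem_1d_conduction()   # T[0] approximates the centerline temperature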
Theory and computation of optimal low- and medium-thrust transfers
NASA Technical Reports Server (NTRS)
Chuang, C.-H.
1994-01-01
This report presents two numerical methods considered for the computation of fuel-optimal, low-thrust orbit transfers in large numbers of burns. The origins of these methods are observations made with the extremal solutions of transfers in small numbers of burns; there seems to be a trend that the longer the time allowed to perform an optimal transfer, the less fuel is used. These longer transfers are obviously of interest since they require a motor of low thrust; however, we also find a trend that the longer the time allowed to perform the optimal transfer, the more burns are required to satisfy optimality. Unfortunately, this usually increases the difficulty of computation. Both of the methods described use small-numbered burn solutions to determine solutions in large numbers of burns. One method is a homotopy method that corrects for problems that arise when a solution requires a new burn or coast arc for optimality. The other method is to simply patch together long transfers from smaller ones. An orbit correction problem is solved to develop this method. This method may also lead to a good guidance law for transfer orbits with long transfer times.
SEPHYDRO: An Integrated Multi-Filter Web-Based Tool for Baseflow Separation
NASA Astrophysics Data System (ADS)
Serban, D.; MacQuarrie, K. T. B.; Popa, A.
2017-12-01
Knowledge of baseflow contributions to streamflow is important for understanding watershed scale hydrology, including groundwater-surface water interactions, impact of geology and landforms on baseflow, estimation of groundwater recharge rates, etc. Baseflow (or hydrograph) separation methods can be used as supporting tools in many areas of environmental research, such as the assessment of the impact of agricultural practices, urbanization and climate change on surface water and groundwater. Over the past few decades various digital filtering and graphically-based methods have been developed in an attempt to improve the assessment of the dynamics of the various sources of streamflow (e.g. groundwater, surface runoff, subsurface flow); however, these methods are not available under an integrated platform and, individually, often require significant effort for implementation. Here we introduce SEPHYDRO, an open access, customizable web-based tool, which integrates 11 algorithms allowing for separation of streamflow hydrographs. The streamlined interface incorporates a reference guide as well as additional information that allows users to import their own data, customize the algorithms, and compare, visualise and export results. The tool includes one-, two- and three-parameter digital filters as well as graphical separation methods and has been successfully applied in Atlantic Canada, in studies dealing with nutrient loading to fresh water and coastal water ecosystems. Future developments include integration of additional separation algorithms as well as incorporation of geochemical separation methods. SEPHYDRO has been developed through a collaborative research effort between the Canadian Rivers Institute, University of New Brunswick (Fredericton, New Brunswick, Canada), Agriculture and Agri-Food Canada and Environment and Climate Change Canada and is currently available at http://canadianriversinstitute.com/tool/
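As an illustration of the recursive digital filters this class of tool implements, the sketch below codes a one-parameter filter of the Lyne-Hollick type, one of the most widely used baseflow separation algorithms; whether this exact formulation is among the 11 algorithms integrated in SEPHYDRO is an assumption, and the streamflow values are placeholders.

import numpy as np

def lyne_hollick_baseflow(q, alpha=0.925):
    # One-parameter recursive digital filter (Lyne-Hollick type).
    # q: streamflow series; alpha: filter parameter, typically 0.9-0.95.
    # Returns the baseflow series (quickflow = q - baseflow).
    q = np.asarray(q, dtype=float)
    quick = np.zeros_like(q)
    quick[0] = q[0]
    for t in range(1, len(q)):
        quick[t] = alpha * quick[t - 1] + 0.5 * (1 + alpha) * (q[t] - q[t - 1])
        quick[t] = min(max(quick[t], 0.0), q[t])   # keep 0 <= baseflow <= streamflow
    return q - quick

baseflow = lyne_hollick_baseflow([5.0, 5.2, 9.8, 22.4, 14.1, 9.0, 6.5, 5.8])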
Hybrid Particle-Element Simulation of Impact on Composite Orbital Debris Shields
NASA Technical Reports Server (NTRS)
Fahrenthold, Eric P.
2004-01-01
This report describes the development of new numerical methods and new constitutive models for the simulation of hypervelocity impact effects on spacecraft. The research has included parallel implementation of the numerical methods and material models developed under the project. Validation work has included both one dimensional simulations, for comparison with exact solutions, and three dimensional simulations of published hypervelocity impact experiments. The validated formulations have been applied to simulate impact effects in a velocity and kinetic energy regime outside the capabilities of current experimental methods. The research results presented here allow for the expanded use of numerical simulation, as a complement to experimental work, in future design of spacecraft for hypervelocity impact effects.
Automated navigation assessment for earth survey sensors using island targets
NASA Technical Reports Server (NTRS)
Patt, Frederick S.; Woodward, Robert H.; Gregg, Watson W.
1997-01-01
An automated method has been developed for performing navigation assessment on satellite-based Earth sensor data. The method utilizes islands as targets which can be readily located in the sensor data and identified with reference locations. The essential elements are an algorithm for classifying the sensor data according to source, a reference catalog of island locations, and a robust pattern-matching algorithm for island identification. The algorithms were developed and tested for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), an ocean color sensor. This method will allow navigation error statistics to be automatically generated for large numbers of points, supporting analysis over large spatial and temporal ranges.
Instructional Development for Clinical Settings.
ERIC Educational Resources Information Center
Cranton, P. A.
Clinical teaching involves instruction in a natural health-related environment which allows students to observe and participate in the actual practice of the profession. The use of objectives, the sequence of instruction, the instructional methods and materials, and the evaluation of student performance constitute the components studied in…
Structure-property study of the Raman spectroscopy detection of fusaric acid and analogs
USDA-ARS?s Scientific Manuscript database
Food security can benefit from the development of selective methods to detect toxins. Fusaric acid is a mycotoxin produced by certain fungi occasionally found in agricultural commodities. Raman spectroscopy allows selective detection of analytes associated with certain spectral characteristics relat...
Novel on-demand bioadhesion to soft tissue in wet environments.
Mogal, Vishal; Papper, Vladislav; Chaurasia, Alok; Feng, Gao; Marks, Robert; Steele, Terry
2014-04-01
Current methods of tissue fixation rely on mechanical-related technologies developed from the clothing and carpentry industries. Herein, a novel bioadhesive method that allows tuneable adhesion and is also applicable to biodegradable polyester substrates is described. Diazirine is the key functional group that allows strong soft tissue crosslinking and on-demand adhesion based on a free radical mechanism. Plasma post-irradiation grafting makes it possible to graft diazirine onto PLGA substrates. When the diazirine-PLGA films, placed on wetted ex vivo swine aortas, are activated with low intensity UV light, lap shear strength of up to 450 ± 50 mN cm(-2) is observed, which is one order of magnitude higher than hydrogel bioadhesives placed on similar soft tissues. The diazirine-modified PLGA thin films could be added on top of previously developed technologies for minimally invasive surgeries. The present work is focused on the chemistry, grafting, and lap shear strength of the alkyl diazirine-modified PLGA bioadhesive films. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ionization potential depression in an atomic-solid-plasma picture
NASA Astrophysics Data System (ADS)
Rosmej, F. B.
2018-05-01
Exotic solid-density matter such as heated hollow crystals allows extended material studies, while its physical properties and models, such as the famous ionization potential depression, are presently under renewed controversial discussion. Here we develop an atomic-solid-plasma (ASP) model that permits ionization potential depression studies also for single and multiple core hole states. Numerical calculations show very good agreement with recently available data, not only in absolute values but also for Z-scaled properties, where currently employed methods fail. For compression well above solid density, the ASP model predicts increased K-edge energies that are related to a rising Fermi surface. This is in good agreement with recent quantum molecular dynamics simulations. For hot dense matter, a quantum-number-dependent optical electron finite temperature ion sphere model is developed that fits well with line shift and line disappearance data from dense laser-produced plasma experiments. Finally, the physical transparency of the ASP picture allows a critical discussion of current methods.
Developing Discontinuous Galerkin Methods for Solving Multiphysics Problems in General Relativity
NASA Astrophysics Data System (ADS)
Kidder, Lawrence; Field, Scott; Teukolsky, Saul; Foucart, Francois; SXS Collaboration
2016-03-01
Multi-messenger observations of the merger of black hole-neutron star and neutron star-neutron star binaries, and of supernova explosions will probe fundamental physics inaccessible to terrestrial experiments. Modeling these systems requires a relativistic treatment of hydrodynamics, including magnetic fields, as well as neutrino transport and nuclear reactions. The accuracy, efficiency, and robustness of current codes that treat all of these problems is not sufficient to keep up with the observational needs. We are building a new numerical code that uses the Discontinuous Galerkin method with a task-based parallelization strategy, a promising combination that will allow multiphysics applications to be treated both accurately and efficiently on petascale and exascale machines. The code will scale to more than 100,000 cores for efficient exploration of the parameter space of potential sources and allowed physics, and the high-fidelity predictions needed to realize the promise of multi-messenger astronomy. I will discuss the current status of the development of this new code.
Sound source measurement by using a passive sound insulation and a statistical approach
NASA Astrophysics Data System (ADS)
Dragonetti, Raffaele; Di Filippo, Sabato; Mercogliano, Francesco; Romano, Rosario A.
2015-10-01
This paper describes a measurement technique developed by the authors that allows acoustic measurements to be carried out inside noisy environments while reducing background noise effects. The proposed method is based on the integration of a traditional passive noise insulation system with a statistical approach. The latter is applied to signals picked up by the usual sensors (microphones and accelerometers) equipping the passive sound insulation system. The statistical approach improves, at low frequency, the sound insulation given by the passive system alone. The developed measurement technique has been validated by means of numerical simulations and measurements carried out inside a real noisy environment. For the case studies reported here, an average improvement of about 10 dB has been obtained in a frequency range up to about 250 Hz. Considerations on the lowest sound pressure level that can be measured by applying the proposed method, and on the measurement error related to its application, are reported as well.
Kubicek, Jan; Schlesinger, Ramona; Baeken, Christian; Büldt, Georg; Schäfer, Frank; Labahn, Jörg
2012-01-01
We investigated in meso crystallization of membrane proteins to develop a fast screening technology which combines features of the well established classical vapor diffusion experiment with the batch meso phase crystallization, but without premixing of protein and monoolein. It inherits the advantages of both methods, namely (i) the stabilization of membrane proteins in the meso phase, (ii) the control of hydration level and additive concentration by vapor diffusion. The new technology (iii) significantly simplifies in meso crystallization experiments and allows the use of standard liquid handling robots suitable for 96 well formats. CIMP crystallization furthermore allows (iv) direct monitoring of phase transformation and crystallization events. Bacteriorhodopsin (BR) crystals of high quality and diffraction up to 1.3 Å resolution have been obtained in this approach. CIMP and the developed consumables and protocols have been successfully applied to obtain crystals of sensory rhodopsin II (SRII) from Halobacterium salinarum for the first time. PMID:22536388
Time-dependent limited penetrable visibility graph analysis of nonstationary time series
NASA Astrophysics Data System (ADS)
Gao, Zhong-Ke; Cai, Qing; Yang, Yu-Xuan; Dang, Wei-Dong
2017-06-01
Recent years have witnessed the development of visibility graph theory, which allows us to analyze a time series from the perspective of a complex network. In this paper we develop a novel time-dependent limited penetrable visibility graph (TDLPVG). Two examples using nonstationary time series from RR intervals and gas-liquid flows are provided to demonstrate the effectiveness of our approach. The results of the first example suggest that our TDLPVG method allows characterizing time-varying behaviors and classifying the heart states of healthy, congestive heart failure and atrial fibrillation subjects from RR interval time series. For the second example, we infer TDLPVGs from gas-liquid flow signals and, interestingly, find that the deviation of the node degree of TDLPVGs makes it possible to effectively uncover the time-varying dynamical flow behaviors of gas-liquid slug and bubble flow patterns. All these results render our TDLPVG method particularly powerful for characterizing the time-varying features underlying realistic complex systems from time series.
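The construction underlying this family of methods is the limited penetrable visibility criterion: two samples of the series are linked if the straight line between them is blocked by at most rho intermediate samples (rho = 0 recovers the ordinary natural visibility graph). The sketch below implements that criterion; the time-dependent variant of the paper presumably applies it over successive windows of the series, and the sample values shown are placeholders.

import numpy as np

def limited_penetrable_visibility_graph(y, rho=1):
    # Adjacency matrix of a limited penetrable visibility graph (LPVG).
    y = np.asarray(y, dtype=float)
    n = len(y)
    A = np.zeros((n, n), dtype=int)
    for i in range(n - 1):
        for j in range(i + 1, n):
            blocked = 0
            for k in range(i + 1, j):
                # Height of the sight line between samples i and j at position k
                line = y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                if y[k] >= line:
                    blocked += 1
            if blocked <= rho:
                A[i, j] = A[j, i] = 1
    return A

A = limited_penetrable_visibility_graph([2.0, 1.0, 3.5, 0.5, 2.2, 4.0, 1.1], rho=1)
degrees = A.sum(axis=0)   # node degrees, the quantity examined for the flow signals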
ERIC Educational Resources Information Center
Nadal, Gloria Claveria; Lancis, Carlos Sanchez
1997-01-01
Notes that the employment of databases to the study of the history of a language is a method that allows for substantial improvement in investigative quality. Illustrates this with the example of the application of this method to two studies of the history of Spanish developed in the Language and Information Seminary of the Independent University…
2010-11-01
estimate the pharmacokinetics of potential drugs (Horning and Klamt 2005). QSPR/QSARs also have potential applications in the fuel science field...group contribution methods, and (2) quantitative structure-property/activity relationships (QSPR/QSAR). The group contribution methods are primarily...development of QSPR/QSARs is the identification of the appropriate set of descriptors that allow the desired attribute of the compound to be adequately
Layer by Layer Growth of 2D Quantum Superlattices (NBIT III)
2017-02-28
building quantum superlattices using 2D materials as the building blocks. Specifically, we develop methods that allow i) large-scale growth of aligned...superlattice and heterostructures, iii) lateral and clean patterning of 2D materials for atomically-thin circuitry and iv) novel physical properties...high precision and flexibility beyond conventional methods. Moreover, it provides solutions for the current major barrier for 2D materials (e.g
Recent advances in the sequencing of relevant water intrusion fungi by the EPA, combined with the development of probes and primers, have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.
In this pilot study, quantitative...
A New Method for Determining Permethrin Level on Military Uniform Fabrics
2017-06-01
A new desorption-gas chromatography–mass spectrometry based screening tool for permethrin content in military fabrics was developed. The method allows...and the permethrin contained in the specimens is extracted with solvent with a recovery rate of at least 95%. Samples are analyzed using a gas
1987-04-01
2. Comparison of Immunofluorescent Staining in Formaldehyde-Fixed Pichinde Virus-Infected Cells That Had Been either Dried prior to Reaction with...was undertaken. B. Experimental Methods: General Procedures and Instrumentation. When required, reactions and...period, the reaction mixture was red and efficient stirring became very difficult. After the addition was complete, the reaction mixture was allowed
Modelling and Computation in the Valuation of Carbon Derivatives with Stochastic Convenience Yields
Chang, Shuhua; Wang, Xinyu
2015-01-01
The anthropogenic greenhouse gas (GHG) emission has risen dramatically during the last few decades, which mainstream researchers believe to be the main cause of climate change, especially global warming. The mechanism of market-based carbon emission trading is regarded as a policy instrument to deal with global climate change. Although several empirical studies of carbon allowances and their derivative prices have been made, theoretical results seem to be sparse. In this paper, we theoretically develop a mathematical model to price CO2 emission allowance derivatives with stochastic convenience yields by the principle of absence of arbitrage opportunities. In the case of American options, we formulate the pricing problem as a linear parabolic variational inequality (VI) in two spatial dimensions and develop a power penalty method to solve it. Then, a fitted finite volume method is designed to solve the nonlinear partial differential equation (PDE) resulting from the power penalty method and governing the futures, European and American option valuation. Moreover, some numerical results are presented to illustrate the efficiency and usefulness of this method. We find that the stochastic convenience yield does affect the valuation of carbon emission derivatives. In addition, some sensitivity analyses are also made to examine the effects of some parameters on the valuation results. PMID:26010900
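To make the penalty idea concrete, the sketch below applies it to a much simpler, hypothetical one-dimensional analogue: a standard Black-Scholes American put priced with an implicit finite-difference scheme plus a penalty term that enforces the early-exercise constraint V >= payoff. This is not the paper's two-dimensional fitted finite volume scheme with stochastic convenience yields, and all parameter values are placeholders.

import numpy as np

def american_put_penalty(K=50.0, r=0.05, sigma=0.3, T=1.0,
                         S_max=150.0, N=150, M=100, lam=1e6, tol=1e-6):
    # Implicit finite differences in time-to-maturity, with a penalty
    # iteration (linearized penalty term) enforcing V >= payoff.
    dt = T / M
    S = np.linspace(0.0, S_max, N + 1)
    payoff = np.maximum(K - S, 0.0)
    V = payoff.copy()
    i = np.arange(1, N)                                  # interior nodes
    a = -0.5 * dt * (sigma**2 * i**2 - r * i)            # sub-diagonal
    b = 1.0 + dt * (sigma**2 * i**2 + r)                 # diagonal
    c = -0.5 * dt * (sigma**2 * i**2 + r * i)            # super-diagonal
    A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
    for _ in range(M):                                   # march backwards in time
        rhs0 = V[1:-1].copy()
        rhs0[0] -= a[0] * K                              # boundary value V(0, t) = K
        v = V[1:-1].copy()
        for _ in range(50):                              # penalty (Picard) iterations
            active = (payoff[1:-1] > v).astype(float)
            P = dt * lam * np.diag(active)
            v_new = np.linalg.solve(A + P, rhs0 + P @ payoff[1:-1])
            if np.max(np.abs(v_new - v)) < tol:
                v = v_new
                break
            v = v_new
        V[1:-1] = v
        V[0], V[-1] = K, 0.0                             # boundary conditions
    return S, V

S, V = american_put_penalty()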
The Protein/Peptide Direct Virus Inactivation During Chromatographic Process: Developing Approaches.
Volkov, Georgii L; Havryliuk, Sergiy P; Krasnobryzha, Ievgenia M; Havryliuk, Olena S
2017-01-01
Virus clearance is required for pharmaceutical preparations derived from animal or human sources such as blood products, vaccines, recombinant proteins produced in mammalian cell lines, etc. High cost and substantial protein losses during virus inactivation are significant problems for protein/peptide manufacturing. The goal of this project was to develop a method to perform virus inactivation in the course of chromatographic protein purification. Another goal was to show that the chromatographic adsorbent can serve as a reliable "sieve" for mechanically washing away infecting viruses. Using chromatographic, photometric, IFA, and RT-PCR approaches, it was found that the high, temperature-dependent dynamic capacity of the adsorbent allowed virus inactivation to be performed directly in a chromatographic column by solvent/detergent treatment. The peptide/protein biological activity was completely preserved. Using this new approach, enveloped and nonenveloped viruses were effectively removed from the protein preparation. In addition, it was shown that the RT-PCR method gives more precise, reproducible and robust results for the assessment of virus reduction than virus titer followed by infectivity studies. The presented method yielded factor of virus concentration decrease (FVD) values that were higher than those provided by known technologies and sufficient for full inactivation of viruses. The method is recommended for use in the pharmaceutical industry.
Vierheilig, J.; Savio, D.; Ley, R. E.; Mach, R. L.; Farnleitner, A. H.
2016-01-01
The applicability of next generation DNA sequencing (NGS) methods for water quality assessment has so far not been broadly investigated. This study set out to evaluate the potential of an NGS-based approach in a complex catchment with importance for drinking water abstraction. In this multicompartment investigation, total bacterial communities in water, faeces, soil, and sediment samples were investigated by 454 pyrosequencing of bacterial 16S rRNA gene amplicons to assess the capabilities of this NGS method for (i) the development and evaluation of environmental molecular diagnostics, (ii) direct screening of the bulk bacterial communities, and (iii) the detection of faecal pollution in water. Results indicate that NGS methods can highlight potential target populations for diagnostics and will prove useful for the evaluation of existing and the development of novel DNA-based detection methods in the field of water microbiology. The used approach allowed unveiling of dominant bacterial populations but failed to detect populations with low abundances such as faecal indicators in surface waters. In combination with metadata, NGS data will also allow the identification of drivers of bacterial community composition during water treatment and distribution, highlighting the power of this approach for monitoring of bacterial regrowth and contamination in technical systems. PMID:26606090
Flexible, Photopatterned, Colloidal CdSe Semiconductor Nanocrystal Integrated Circuits
NASA Astrophysics Data System (ADS)
Stinner, F. Scott
As semiconductor manufacturing pushes towards smaller and faster transistors, a parallel goal exists to create transistors which are not nearly as small. These transistors are not intended to match the performance of traditional crystalline semiconductors; they are designed to be significantly lower in cost and manufactured using methods that can make them physically flexible for applications where form is more important than speed. One of the developing technologies for this application is semiconductor nanocrystals. We first explore methods to develop CdSe nanocrystal semiconducting "inks" into large-scale, high-speed integrated circuits. We demonstrate photopatterned transistors with mobilities of 10 cm2/Vs on Kapton substrates. We develop new methods for vertical interconnect access holes to demonstrate multi-device integrated circuits including inverting amplifiers with 7 kHz bandwidths, ring oscillators with <10 μs stage delays, and NAND and NOR logic gates. In order to produce higher performance and more consistent transistors, we develop a new hybrid procedure for processing the CdSe nanocrystals. This procedure produces transistors with repeatable performance exceeding 40 cm2/Vs when fabricated on silicon wafers and 16 cm2/Vs when fabricated as part of photopatterned integrated circuits on Kapton substrates. In order to demonstrate the full potential of these transistors, methods to create high-frequency oscillators were developed. These methods allow for transistors to operate at higher voltages as well as provide a means for wirebonding to the Kapton substrate, both of which are required for operating and probing high-frequency oscillators. Simulations of this system show the potential for operation at MHz frequencies. Demonstration of these transistors in this frequency range would open the door for development of CdSe integrated circuits for high-performance sensor, display, and audio applications. To develop further applications of electronics on flexible substrates, procedures are developed for the integration of polychromatic displays on polyethylene terephthalate (PET) substrates and a commercial near field communication (NFC) link. The device draws its power from the NFC transmitter common on smartphones and eliminates the need for a fixed battery. This allows for the mass deployment of flexible, interactive displays on product packaging.
A further use for the Harvest plot: a novel method for the presentation of data synthesis.
Crowther, Mark; Avenell, Alison; MacLennan, Graeme; Mowatt, Graham
2011-06-01
When performing a systematic review, whether or not a meta-analysis is performed, graphical displays can be useful. Data do still need to be described, ideally in graphical form. The Harvest plot has been developed to display combined data from several studies that allows demonstration of not only effect but also study quality. We describe a modification to the Harvest plot that allows the presentation of data that normally could not be included in a forest plot meta-analysis and allows extra information to be displayed. Using specific examples, we describe how the arrangement of studies, height of the bars and additional information can be used to enhance the plot. This is an important development, which by fulfilling Tufte's nine requirements for graphical presentation, allows researchers to display evidence in a flexible way. This means readers can follow an argument in a clear and efficient manner without the need for large volumes of descriptive text. Copyright © 2011 John Wiley & Sons, Ltd.
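As a rough illustration of the layout being described (studies grouped into columns by the direction of their result, with bar height encoding study quality), the matplotlib sketch below draws a minimal Harvest-style plot. The studies, quality scores and groupings are hypothetical, and this is not the authors' specific modification.

import matplotlib.pyplot as plt

# Hypothetical studies: (label, quality score 1-3, direction of result)
studies = [("A", 3, "benefit"), ("B", 2, "benefit"), ("C", 1, "no effect"),
           ("D", 2, "harm"), ("E", 3, "no effect"), ("F", 1, "benefit")]
columns = ["harm", "no effect", "benefit"]

fig, ax = plt.subplots(figsize=(6, 2.5))
x, ticks, labels = 0, [], []
for col in columns:
    group = [s for s in studies if s[2] == col]
    for label, quality, _ in group:
        ax.bar(x, quality, width=0.6, color="0.4")   # bar height encodes study quality
        ax.text(x, quality + 0.1, label, ha="center")
        x += 1
    ticks.append(x - (len(group) + 1) / 2.0)         # centre of this outcome column
    labels.append(col)
    x += 1                                           # gap between outcome columns
ax.set_xticks(ticks)
ax.set_xticklabels(labels)
ax.set_yticks([1, 2, 3])
ax.set_ylabel("study quality")
plt.tight_layout()
plt.show()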
Methods to Detect Nitric Oxide and its Metabolites in Biological Samples
Bryan, Nathan S.; Grisham, Matthew B.
2007-01-01
Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes, including regulation of blood pressure, immune response and neural communication. Therefore its accurate detection and quantification is critical to understanding health and disease. Because of the extremely short physiological half-life of this gaseous free radical, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The methods described in this review are not an exhaustive or comprehensive discussion of all methods available for the detection of NO, but rather a description of the most commonly used and practical methods, which allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129
A Combinatorial Approach to Detecting Gene-Gene and Gene-Environment Interactions in Family Studies
Lou, Xiang-Yang; Chen, Guo-Bo; Yan, Lei; Ma, Jennie Z.; Mangold, Jamie E.; Zhu, Jun; Elston, Robert C.; Li, Ming D.
2008-01-01
Widespread multifactor interactions present a significant challenge in determining risk factors of complex diseases. Several combinatorial approaches, such as the multifactor dimensionality reduction (MDR) method, have emerged as a promising tool for better detecting gene-gene (G × G) and gene-environment (G × E) interactions. We recently developed a general combinatorial approach, namely the generalized multifactor dimensionality reduction (GMDR) method, which can entertain both qualitative and quantitative phenotypes and allows for both discrete and continuous covariates to detect G × G and G × E interactions in a sample of unrelated individuals. In this article, we report the development of an algorithm that can be used to study G × G and G × E interactions for family-based designs, called pedigree-based GMDR (PGMDR). Compared to the available method, our proposed method has several major improvements, including allowing for covariate adjustments and being applicable to arbitrary phenotypes, arbitrary pedigree structures, and arbitrary patterns of missing marker genotypes. Our Monte Carlo simulations provide evidence that the PGMDR method is superior in performance to identify epistatic loci compared to the MDR-pedigree disequilibrium test (PDT). Finally, we applied our proposed approach to a genetic data set on tobacco dependence and found a significant interaction between two taste receptor genes (i.e., TAS2R16 and TAS2R38) in affecting nicotine dependence. PMID:18834969
NASA Astrophysics Data System (ADS)
Preziosi, E.; Sánchez, S.; González, A. J.; Pani, R.; Borrazzo, C.; Bettiol, M.; Rodriguez-Alvarez, M. J.; González-Montoro, A.; Moliner, L.; Benlloch, J. M.
2016-12-01
One of the technical objectives of the MindView project is developing a brain-dedicated PET insert based on monolithic scintillation crystals. It will be inserted in MRI systems with the purpose of obtaining simultaneous PET and MRI brain images. High sensitivity, high image quality and accurate detection of the Depth-of-Interaction (DoI) of the 511 keV photons are required. We have developed a DoI estimation method dedicated to monolithic scintillators, allowing continuous DoI estimation, and a DoI-dependent algorithm for the estimation of the photon planar impact position, able to improve the single-module imaging capabilities. In this work, through experimental measurements, the proposed methods have been used for the estimation of the impact positions within the monolithic crystal block. We have evaluated the PET system performance following the NEMA NU 4-2008 protocol by reconstructing the images using the STIR 3D platform. The results obtained with two different methods, providing discrete and continuous DoI information, are compared with those obtained from an algorithm without DoI capabilities and with the ideal response of the detector. The proposed DoI-dependent imaging methods show clear improvements in the spatial resolution (FWHM) of reconstructed images, allowing values from 2 mm (at the centre of the FoV) to 3 mm (at the FoV edges) to be obtained.
Jee, Samuel D; Schafheutle, Ellen I; Noyce, Peter R
2017-05-01
Recent longitudinal investigations of professional socialisation and development of professional behaviours during work-based training are lacking. Using longitudinal mixed methods, this study aimed to explore the development of professional behaviours during a year of intensive work-based (pre-registration) training in pharmacy. Twenty trainee pharmacists and their tutors completed semi-structured interview and professional behaviour questionnaires at four time points during 2011/2012: months 1, 4 and 9 during training and 4 months after registration; tutors participated in months 1 and 9. Interviews were analysed thematically using template analysis, and questionnaires were analysed using ANOVA and t-tests. Self-assessed (trainee) and tutor ratings of all elements of professional behaviours measured in questionnaires (appearance, interpersonal/social skills, responsibility, communication skills) increased significantly from the start of pre-registration training to post-registration. Some elements, for example, communication skills, showed more change over time compared with others, such as appearance, and continued to improve post-registration. Qualitative findings highlighted the changing roles of trainees and learning experiences that appeared to facilitate the development of professional behaviours. Trainees' colleagues, and particularly tutors, played an essential part in trainees' development through offering support and role modelling. Trainees noted that they would have benefited from more responsibilities during training to ease the transition into practising as a responsible pharmacist. Longitudinal mixed methods can unpack the way in which professional behaviours develop during work-based training and allow researchers to examine changes in the demonstration of professional behaviours and how they occur. Identifying areas less prone to change allows for more focus to be given to supporting trainees in areas where there is a development need, such as communication skills and holding increasing responsibility. © 2016 John Wiley & Sons Ltd.
In-situ protein determination to monitor contamination in a centrifugal partition chromatograph.
Bouiche, Feriel; Faure, Karine
2017-05-15
Centrifugal partition chromatography (CPC) works with biphasic liquid systems, including aqueous two-phase systems. Metallic rotors can retain an aqueous stationary phase suitable for purifying proteins. However, the adhesion of proteins to solid surfaces may pose a cross-contamination risk during downstream processes, so it is of utmost importance to ensure the cleanliness of the equipment and detect possible protein contamination in a timely manner. Therefore, a direct method that allows the determination of the effective presence of proteins and the extent of contamination in metallic CPC rotors was developed. This in-situ method is derived from the Amino Density Estimation by Colorimetric Assay (ADECA), which is based on the affinity of a dye, Coomassie Brilliant Blue (CBB), for the protonated N+ groups of proteins. In this paper, the ADECA method was applied dynamically, on a 25 mL stainless-steel rotor, to various extents of protein contamination using bovine serum albumin (BSA) as a fouling model. The eluted CBB dye was quantified and found to respond linearly to BSA contamination up to 70 mg injected. Limits of detection and quantification were recorded as 0.9 mg and 3.1 mg, respectively. While the non-specific interactions between the dye and the rotor cannot currently be neglected, this method allows for in situ determination of protein contamination and should contribute to the development of CPC as a separation tool in protein purification processes. Copyright © 2017 Elsevier Inc. All rights reserved.
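The linear calibration and detection limits reported above can be computed with a routine like the sketch below, using the common 3.3·sigma/slope and 10·sigma/slope conventions; whether the authors used exactly these definitions is an assumption, and the BSA amounts and absorbance readings are placeholders.

import numpy as np

def calibration_with_limits(amount, response):
    # Least-squares calibration line plus LOD/LOQ from the residual scatter.
    amount = np.asarray(amount, dtype=float)
    response = np.asarray(response, dtype=float)
    slope, intercept = np.polyfit(amount, response, 1)
    residuals = response - (slope * amount + intercept)
    sigma = residuals.std(ddof=2)          # residual standard deviation
    lod = 3.3 * sigma / slope              # limit of detection
    loq = 10.0 * sigma / slope             # limit of quantification
    return slope, intercept, lod, loq

# Placeholder BSA amounts (mg) vs. eluted-dye absorbance readings
bsa_mg = [0, 10, 20, 30, 40, 50, 60, 70]
absorbance = [0.02, 0.11, 0.20, 0.31, 0.40, 0.52, 0.61, 0.70]
slope, intercept, lod, loq = calibration_with_limits(bsa_mg, absorbance)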
Targeted Quantitation of Proteins by Mass Spectrometry
2013-01-01
Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332
Analysis of Formation Flying in Eccentric Orbits Using Linearized Equations of Relative Motion
NASA Technical Reports Server (NTRS)
Lane, Christopher; Axelrad, Penina
2004-01-01
Geometrical methods for formation flying design based on the analytical solution to Hill's equations have been previously developed and used to specify desired relative motions in near-circular orbits. By generating relationships between the vehicles that are intuitive, these approaches offer valuable insight into the relative motion and allow for the rapid design of satellite configurations to achieve mission-specific requirements, such as vehicle separation at perigee or apogee, minimum separation, or a specific geometrical shape. Furthermore, the results obtained using geometrical approaches can be used to better constrain numerical optimization methods, allowing those methods to converge to optimal satellite configurations faster. This paper presents a set of geometrical relationships for formations in eccentric orbits, where Hill's equations are not valid, and shows how these relationships can be used to investigate formation designs and how they evolve with time.
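As background to the geometrical relationships discussed above, the sketch below evaluates the standard closed-form Clohessy-Wiltshire (Hill) solution for a circular reference orbit; this is the baseline the paper generalizes to eccentric orbits. The orbit period and initial conditions are assumed values, chosen only to give a bounded (drift-free) example.

```python
import math

# Closed-form Clohessy-Wiltshire (Hill) solution for relative motion about a
# circular reference orbit. x is radial, y is along-track, z is cross-track;
# n is the mean motion of the reference orbit.
def cw_relative_position(t, x0, y0, z0, vx0, vy0, vz0, n):
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = (6 * (s - n * t) * x0 + y0
         - (2 / n) * (1 - c) * vx0 + (1 / n) * (4 * s - 3 * n * t) * vy0)
    z = c * z0 + (s / n) * vz0
    return x, y, z

# Example: a ~92-minute LEO reference orbit and an assumed initial offset.
# Choosing vy0 = -2*n*x0 cancels the along-track secular drift (bounded motion).
n = 2 * math.pi / (92 * 60)          # mean motion, rad/s
print(cw_relative_position(t=600.0, x0=100.0, y0=0.0, z0=50.0,
                           vx0=0.0, vy0=-2 * n * 100.0, vz0=0.0, n=n))
```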
Analysis of possible designs of processing units with radial plasma flows
NASA Astrophysics Data System (ADS)
Kolesnik, V. V.; Zaitsev, S. V.; Vashilin, V. S.; Limarenko, M. V.; Prochorenkov, D. S.
2018-03-01
Analysis of plasma-ion methods of obtaining thin-film coatings shows that their development follows the path of increasing use of sputter deposition processes, which allow one to obtain multicomponent coatings with a varying percentage of particular components. One of the methods that allows one to form multicomponent coatings with virtually any composition of elementary components is coating deposition using quasi-magnetron sputtering systems [1]. This requires the creation of an axial magnetic field of a defined configuration with a flux density in the range of 0.01-0.1 T [2]. In order to compare and analyze various configurations of processing-unit magnetic systems, it is necessary to obtain the following dependencies: the dependence of the magnetic core cross-section on the input power to the inductors, the distribution of magnetic induction within the equatorial plane in the corresponding sections, and the distribution of the magnetic induction value in the area of the cathode target location.
Targeted quantitation of proteins by mass spectrometry.
Liebler, Daniel C; Zimmerman, Lisa J
2013-06-04
Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.
Tehrani, Farshad; Bavarian, Behzad
2016-01-01
A novel and highly sensitive disposable glucose sensor strip was developed using direct laser engraved graphene (DLEG) decorated with pulse-deposited copper nanocubes (CuNCs). The high reproducibility (96.8%), stability (97.4%) and low cost demonstrated by this 3-step fabrication method indicate that it could be used for high-volume manufacturing of disposable glucose strips. The fabrication method also allows for a high degree of flexibility, enabling control of the electrode size, design, and functionalization method. Additionally, the excellent selectivity, high sensitivity (4,532.2 μA/(mM·cm²)), low detection limit (250 nM), and suitable linear range of 25 μM–4 mM suggest that these sensors may provide a promising platform for glucose detection within the physiological range for tear, saliva, and/or sweat. PMID:27306706
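A minimal sketch of how a sensor's reported sensitivity could be used to turn a measured current into a glucose concentration. Only the sensitivity value comes from the abstract; the electrode area and the blank (background) current below are assumptions made purely for illustration.

```python
# Hypothetical conversion of a measured amperometric current to glucose concentration
# using the reported sensitivity (4532.2 uA per mM per cm^2). The electrode area and
# the blank current are assumed values, not parameters taken from the paper.
SENSITIVITY_uA_per_mM_cm2 = 4532.2
ELECTRODE_AREA_cm2 = 0.071          # assumed geometric area (~3 mm diameter disc)
BLANK_CURRENT_uA = 1.5              # assumed background current at 0 mM glucose

def glucose_mM(measured_current_uA: float) -> float:
    """Convert a measured current (uA) to an estimated glucose concentration (mM)."""
    current_density = (measured_current_uA - BLANK_CURRENT_uA) / ELECTRODE_AREA_cm2
    return current_density / SENSITIVITY_uA_per_mM_cm2

print(f"{glucose_mM(350.0):.3f} mM")   # example reading within the linear range
```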
High-throughput imaging method for direct assessment of GM1 ganglioside levels in mammalian cells
Acosta, Walter; Martin, Reid; Radin, David N.; Cramer, Carole L.
2016-01-01
GM1-gangliosidosis is an inherited autosomal recessive disorder caused by mutations in the gene GLB1, which encodes acid β-galactosidase (β-gal). The lack of activity in this lysosomal enzyme leads to accumulation of GM1 gangliosides (GM1) in cells. We have developed a high-content-imaging method to assess GM1 levels in fibroblasts that can be used to evaluate substrate reduction in treated GLB1−/− cells [1]. This assay allows fluorescent quantification in a multi-well system which generates unbiased and statistically significant data. Fluorescently labeled Cholera Toxin B subunit (CTXB), which specifically binds to GM1 gangliosides, was used to detect in situ GM1 levels in a fixed monolayer of fibroblasts. This sensitive, rapid, and inexpensive method facilitates in vitro drug screening in a format that allows a high number of replicates using low working volumes. PMID:26958633
High-throughput imaging method for direct assessment of GM1 ganglioside levels in mammalian cells.
Acosta, Walter; Martin, Reid; Radin, David N; Cramer, Carole L
2016-03-01
GM1-gangliosidosis is an inherited autosomal recessive disorder caused by mutations in the gene GLB1, which encodes acid β-galactosidase (β-gal). The lack of activity in this lysosomal enzyme leads to accumulation of GM1 gangliosides (GM1) in cells. We have developed a high-content-imaging method to assess GM1 levels in fibroblasts that can be used to evaluate substrate reduction in treated GLB1(-/-) cells [1]. This assay allows fluorescent quantification in a multi-well system which generates unbiased and statistically significant data. Fluorescently labeled Cholera Toxin B subunit (CTXB), which specifically binds to GM1 gangliosides, was used to detect in situ GM1 levels in a fixed monolayer of fibroblasts. This sensitive, rapid, and inexpensive method facilitates in vitro drug screening in a format that allows a high number of replicates using low working volumes.
Devolatilization Analysis in a Twin Screw Extruder by using the Flow Analysis Network (FAN) Method
NASA Astrophysics Data System (ADS)
Tomiyama, Hideki; Takamoto, Seiji; Shintani, Hiroaki; Inoue, Shigeki
We derived the theoretical formulas for three mechanisms of devolatilization in a twin screw extruder: flash, surface refreshment and forced expansion. The method for flash devolatilization is based on the equilibrium-concentration equation, which shows that volatiles break off from the polymer when it is relieved from a high-pressure condition. For surface-refreshment devolatilization, we applied Latinen's model to allow estimation of polymer behavior in the unfilled screw-conveying condition. Forced-expansion devolatilization is based on expansion theory, in which foams are generated under reduced pressure and volatiles diffuse to the exposed surface layer after mixing with the injected devolatilization agent. Based on these models, we developed simulation software for twin-screw extrusion using the FAN method; it allows us to quantitatively estimate volatile concentration and polymer temperature with high accuracy in the actual multi-vent extrusion process for LDPE + n-hexane.
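Latinen-type surface-renewal treatments are commonly built on penetration (Higbie) theory. The sketch below shows that generic form only, under a simple well-mixed approximation for a single vent zone; the diffusivity, exposure time, areas and concentrations are assumed values, and the authors' full FAN-based implementation is not reproduced here.

```python
import math

# Generic penetration-theory (Higbie) surface-renewal estimate, in the spirit of
# Latinen's treatment of devolatilization in partially filled screw channels.
# All numerical values are illustrative assumptions, not the authors' data.
D   = 1.0e-9      # diffusivity of the volatile in the melt, m^2/s (assumed)
t_e = 2.0         # film exposure time between surface renewals, s (assumed)
A   = 0.05        # exposed melt surface area in the vent zone, m^2 (assumed)
V   = 1.0e-3      # melt volume in the vent zone, m^3 (assumed)
t_r = 60.0        # residence time in the vent zone, s (assumed)

# Penetration-theory mass-transfer coefficient
k_L = 2.0 * math.sqrt(D / (math.pi * t_e))

# Well-mixed, first-order approach of the volatile concentration toward equilibrium
C_in, C_eq = 5000.0, 50.0          # ppm (assumed inlet and equilibrium levels)
C_out = C_eq + (C_in - C_eq) * math.exp(-k_L * A * t_r / V)
print(f"k_L = {k_L:.2e} m/s, outlet concentration ~ {C_out:.0f} ppm")
```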
NASA Astrophysics Data System (ADS)
Tehrani, Farshad; Bavarian, Behzad
2016-06-01
A novel and highly sensitive disposable glucose sensor strip was developed using direct laser engraved graphene (DLEG) decorated with pulse-deposited copper nanocubes (CuNCs). The high reproducibility (96.8%), stability (97.4%) and low cost demonstrated by this 3-step fabrication method indicate that it could be used for high-volume manufacturing of disposable glucose strips. The fabrication method also allows for a high degree of flexibility, enabling control of the electrode size, design, and functionalization method. Additionally, the excellent selectivity, high sensitivity (4,532.2 μA/(mM·cm²)), low detection limit (250 nM), and suitable linear range of 25 μM–4 mM suggest that these sensors may provide a promising platform for glucose detection within the physiological range for tear, saliva, and/or sweat.
Magnetically-refreshable receptor platform structures for reusable nano-biosensor chips
NASA Astrophysics Data System (ADS)
Yoo, Haneul; Lee, Dong Jun; Cho, Dong-guk; Park, Juhun; Nam, Ki Wan; Tak Cho, Young; Park, Jae Yeol; Chen, Xing; Hong, Seunghun
2016-01-01
We developed a magnetically-refreshable receptor platform structure which can be integrated with a wide variety of nano-biosensor structures to build reusable nano-biosensor chips. This structure allows one to easily remove used receptor molecules from a biosensor surface and reuse the biosensor for repeated sensing operations. Using this structure, we demonstrated reusable immunofluorescence biosensors. Significantly, since our method allows one to place receptor molecules very close to a nano-biosensor surface, it can be utilized to build reusable carbon nanotube transistor-based biosensors, which require receptor molecules within a Debye length of the sensor surface. Furthermore, we also show that a single sensor chip can be utilized to detect two different target molecules simply by replacing the receptor molecules using our method. Since this method does not rely on any chemical reaction to refresh sensor chips, it can be utilized for versatile biosensor structures and virtually any receptor molecular species.
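Since the Debye length sets how close receptors must sit to a charge-sensitive (e.g. carbon nanotube transistor) surface, a short sketch of the standard Debye-length formula for a symmetric 1:1 electrolyte follows. The physical constants are standard; the chosen ionic strengths are illustrative.

```python
import math

# Debye screening length in a 1:1 electrolyte. Receptors (and the charges they
# capture) must lie within roughly this distance of a FET-type sensor surface to
# be "seen" electrostatically.
EPS0 = 8.854e-12      # vacuum permittivity, F/m
KB   = 1.381e-23      # Boltzmann constant, J/K
NA   = 6.022e23       # Avogadro constant, 1/mol
E    = 1.602e-19      # elementary charge, C

def debye_length_nm(ionic_strength_mM: float, eps_r: float = 80.0, T: float = 298.0) -> float:
    I = ionic_strength_mM                     # 1 mmol/L equals 1 mol/m^3
    lam = math.sqrt(eps_r * EPS0 * KB * T / (2.0 * NA * E**2 * I))
    return lam * 1e9

for c in (150.0, 15.0, 1.5):                  # roughly 1x, 0.1x, 0.01x PBS ionic strength
    print(f"I = {c:6.1f} mM  ->  Debye length ~ {debye_length_nm(c):.2f} nm")
```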
A novel method for trajectory planning of cooperative mobile manipulators.
Bolandi, Hossein; Ehyaei, Amir Farhad
2011-01-01
We have designed a two-stage scheme to consider the trajectory planning problem of two mobile manipulators for cooperative transportation of a rigid body in the presence of static obstacles. In the first stage, with regard to the static obstacles, we develop a method that searches the workspace for the shortest possible path between the start and goal configurations, by constructing a graph on a portion of the configuration space that satisfies the collision and closure constraints. The final stage is to calculate a sequence of time-optimal trajectories to go between the consecutive points of the path, with regard to the nonholonomic constraints and the maximum allowed joint accelerations. This approach allows geometric constraints such as joint limits and closed-chain constraints, along with differential constraints such as nonholonomic velocity constraints and acceleration limits, to be incorporated into the planning scheme. The simulation results illustrate the effectiveness of the proposed method.
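The first stage described above amounts to a shortest-path search over a graph of feasible configurations. A minimal Dijkstra sketch over an abstract, hand-coded graph is shown below; how the nodes and edge weights are generated (sampling, collision and closure checks) is assumed away and is not the authors' specific construction.

```python
import heapq

# Minimal Dijkstra shortest-path search over a pre-built graph whose nodes are
# feasible configurations (collision-free, closure-satisfying) and whose edge
# weights are distances between configurations. The tiny graph is illustrative.
def shortest_path(graph, start, goal):
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue                              # stale heap entry
        for nbr, w in graph[node]:
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    path, node = [goal], goal                     # walk back from goal to start
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[goal]

graph = {                                         # node -> [(neighbour, cost), ...]
    "q_start": [("q1", 1.0), ("q2", 2.5)],
    "q1": [("q2", 1.0), ("q_goal", 3.0)],
    "q2": [("q_goal", 1.2)],
    "q_goal": [],
}
print(shortest_path(graph, "q_start", "q_goal"))
```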
A Novel Method for Trajectory Planning of Cooperative Mobile Manipulators
Bolandi, Hossein; Ehyaei, Amir Farhad
2011-01-01
We have designed a two-stage scheme to consider the trajectory planning problem of two mobile manipulators for cooperative transportation of a rigid body in the presence of static obstacles. In the first stage, with regard to the static obstacles, we develop a method that searches the workspace for the shortest possible path between the start and goal configurations, by constructing a graph on a portion of the configuration space that satisfies the collision and closure constraints. The final stage is to calculate a sequence of time-optimal trajectories to go between the consecutive points of the path, with regard to the nonholonomic constraints and the maximum allowed joint accelerations. This approach allows geometric constraints such as joint limits and closed-chain constraints, along with differential constraints such as nonholonomic velocity constraints and acceleration limits, to be incorporated into the planning scheme. The simulation results illustrate the effectiveness of the proposed method. PMID:22606656
Molecular Imprinting of Macromolecules for Sensor Applications
Saylan, Yeşeren; Yilmaz, Fatma; Özgür, Erdoğan; Derazshamshir, Ali; Yavuz, Handan; Denizli, Adil
2017-01-01
Molecular recognition has an important role in numerous living systems. One of the most important molecular recognition methods is molecular imprinting, which allows host compounds to recognize and detect several molecules rapidly, sensitively and selectively. Compared with natural systems, molecular imprinting methods have some important features, such as low cost, robustness, high recognition ability and long-term durability, which allow molecularly imprinted polymers to be used in various biotechnological applications, such as chromatography, drug delivery, nanotechnology, and sensor technology. Sensors are important tools because of their ability to address a potentially large number of analytical difficulties in various areas with different macromolecular targets. Proteins, enzymes, nucleic acids, antibodies, viruses and cells are macromolecules with a wide range of functions, and their detection has therefore gained great attention in many studies. Applications of macromolecule-imprinted sensors are expected to expand widely owing to their low cost, high specificity and stability. In this review, macromolecules for molecularly imprinted sensor applications are structured according to the definition of molecular imprinting methods, developments in macromolecular imprinting methods, macromolecular imprinted sensors, and conclusions and future perspectives. The review follows the latter strategies and focuses on the applications of macromolecular imprinted sensors, allowing discussion of how sensor strategies are applied to the problem of macromolecule imprinting. PMID:28422082
Molecular Imprinting of Macromolecules for Sensor Applications.
Saylan, Yeşeren; Yilmaz, Fatma; Özgür, Erdoğan; Derazshamshir, Ali; Yavuz, Handan; Denizli, Adil
2017-04-19
Molecular recognition has an important role in numerous living systems. One of the most important molecular recognition methods is molecular imprinting, which allows host compounds to recognize and detect several molecules rapidly, sensitively and selectively. Compared with natural systems, molecular imprinting methods have some important features, such as low cost, robustness, high recognition ability and long-term durability, which allow molecularly imprinted polymers to be used in various biotechnological applications, such as chromatography, drug delivery, nanotechnology, and sensor technology. Sensors are important tools because of their ability to address a potentially large number of analytical difficulties in various areas with different macromolecular targets. Proteins, enzymes, nucleic acids, antibodies, viruses and cells are macromolecules with a wide range of functions, and their detection has therefore gained great attention in many studies. Applications of macromolecule-imprinted sensors are expected to expand widely owing to their low cost, high specificity and stability. In this review, macromolecules for molecularly imprinted sensor applications are structured according to the definition of molecular imprinting methods, developments in macromolecular imprinting methods, macromolecular imprinted sensors, and conclusions and future perspectives. The review follows the latter strategies and focuses on the applications of macromolecular imprinted sensors, allowing discussion of how sensor strategies are applied to the problem of macromolecule imprinting.
Difference structures from time-resolved small-angle and wide-angle x-ray scattering
NASA Astrophysics Data System (ADS)
Nepal, Prakash; Saldin, D. K.
2018-05-01
Time-resolved small-angle x-ray scattering/wide-angle x-ray scattering (SAXS/WAXS) is capable of recovering difference structures directly from difference SAXS/WAXS curves. It does so by means of the theory described here, because the structural changes probed in a typical pump-probe time-resolved experiment are generally small enough to be confined to a single residue or group in close proximity, which is identified by a method akin to the difference Fourier method of time-resolved crystallography. If it is assumed, as is usual with time-resolved structures, that the moved atoms lie within the residue, the 100-fold reduction in the search space (assuming a typical protein has about 100 residues) allows the extraction of the structure by a simulated annealing algorithm with a huge reduction in computing time, and leads to greater resolution by varying the positions of atoms only within that residue. This reduction in the number of potential moved atoms allows us to identify the actual motions of the individual atoms. In the case of a crystal, time-resolved calculations are normally performed using the difference Fourier method, which is, of course, not directly applicable to SAXS/WAXS. The method developed in this paper may be thought of as a substitute for that method which allows SAXS/WAXS (and hence disordered molecules) to also be used for time-resolved structural work.
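A schematic of the annealing loop implied above: only the atoms of one candidate residue are perturbed, and moves are scored against the measured difference curve. The scattering calculation is replaced by a placeholder function, so this is a structural sketch rather than the authors' algorithm; all values are synthetic.

```python
import math, random

def difference_curve(coords, q_values):
    # Placeholder for a real Debye-formula calculation of I_perturbed(q) - I_ground(q).
    # Here: a toy smooth function of the coordinates, just so the loop runs.
    return [sum(math.cos(q * (x + y + z)) for x, y, z in coords) for q in q_values]

def chi2(calc, meas):
    return sum((c - m) ** 2 for c, m in zip(calc, meas))

def anneal(residue_coords, measured, q_values, steps=5000, T0=1.0, cooling=0.999, step=0.05):
    coords = [list(a) for a in residue_coords]
    cur = chi2(difference_curve(coords, q_values), measured)
    T = T0
    for _ in range(steps):
        i = random.randrange(len(coords))                 # move one atom of the residue only
        old = coords[i][:]
        coords[i] = [c + random.gauss(0.0, step) for c in old]
        new = chi2(difference_curve(coords, q_values), measured)
        if new < cur or random.random() < math.exp(-(new - cur) / T):
            cur = new                                     # accept the move
        else:
            coords[i] = old                               # reject the move
        T *= cooling                                      # cool the temperature
    return coords, cur

# Synthetic example: recover perturbed coordinates from a "measured" difference curve.
q = [0.1 * i for i in range(1, 51)]
true_coords = [(1.0, 0.5, -0.2), (0.8, -0.3, 0.4), (-0.5, 0.9, 0.1)]
measured = difference_curve(true_coords, q)
start = [(1.2, 0.4, 0.0), (0.6, -0.1, 0.3), (-0.3, 0.7, 0.2)]
fitted, final_chi2 = anneal(start, measured, q)
print("final chi^2:", round(final_chi2, 4))
```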
Per- and Polyfluoroalkyl Substances (PFAS): Sampling ...
Per- and polyfluoroalkyl substances (PFAS) are a large group of manufactured compounds used in a variety of industries, such as aerospace, automotive, textiles, and electronics, and are used in some food packaging and firefighting materials. For example, they may be used to make products more resistant to stains, grease and water. In the environment, some PFAS break down very slowly, if at all, allowing bioaccumulation (concentration) to occur in humans and wildlife. Some have been found to be toxic to laboratory animals, producing reproductive, developmental, and systemic effects in laboratory tests. The U.S. Environmental Protection Agency's (EPA) methods for analyzing PFAS in environmental media are in various stages of development. This technical brief summarizes the work being done to develop robust analytical methods for groundwater, surface water, wastewater, and solids, including soils, sediments, and biosolids.
Schmidberger, Andreas; Durner, Bernhard; Gehrmeyer, David; Schupfner, Robert
2018-06-19
The age determination of elephant ivory provides necessary and crucial information for all criminal prosecution authorities enforcing the Convention on International Trade in Endangered Species of Wild Fauna and Flora. Knowledge of the age of ivory allows one to distinguish between pre-convention, hence legal, material and ivory deriving from recent, illegal poaching incidents. The commonly applied method to determine the age of ivory is radiocarbon dating in the form of bomb-pulse dating, which, however, will soon fade out. This work provides an enhancement of the radiocarbon dating method by supplementary determination of the isotope profile of Sr-90 and the two thorium isotopes Th-228 and Th-232. This combined analysis allows for a precise and unambiguous age determination of ivory. We provide calibration curves for all involved radionuclides, obtained by analyzing ivory samples of known age, and we investigated a new method for the extraction of strontium from ivory. Copyright © 2018 Elsevier B.V. All rights reserved.
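For orientation, the age of a sample follows from a measured activity once an initial activity is fixed (for example by a calibration curve), via the exponential decay law. The sketch below uses the Sr-90 half-life; the activity values are illustrative assumptions, not the paper's calibration data.

```python
import math

# Generic decay-based age estimate: given an initial specific activity A0 (fixed by a
# calibration curve) and a measured activity A, the elapsed time follows from
# A(t) = A0 * exp(-lambda * t). The Sr-90 half-life (~28.8 a) is a known constant;
# the activity values below are illustrative.
T_HALF_SR90_YEARS = 28.8
LAMBDA = math.log(2.0) / T_HALF_SR90_YEARS

def age_years(activity_now: float, activity_initial: float) -> float:
    """Elapsed time in years from the exponential decay law."""
    return math.log(activity_initial / activity_now) / LAMBDA

print(f"{age_years(activity_now=0.35, activity_initial=1.0):.1f} years")
```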
Determination of the equilibrium constant of C60 fullerene binding with drug molecules.
Mosunov, Andrei A; Pashkova, Irina S; Sidorova, Maria; Pronozin, Artem; Lantushenko, Anastasia O; Prylutskyy, Yuriy I; Parkinson, John A; Evstigneev, Maxim P
2017-03-01
We report a new analytical method that allows the determination of the magnitude of the equilibrium constant of complexation, Kh, of small molecules to C60 fullerene in aqueous solution. The developed method is based on the up-scaled model of C60 fullerene-ligand complexation and contains the full set of equations needed to fit titration datasets arising from different experimental methods (UV-Vis spectroscopy, 1H NMR spectroscopy, diffusion-ordered NMR spectroscopy, DLS). The up-scaled model takes into consideration the specificity of C60 fullerene aggregation in aqueous solution and allows the highly dispersed nature of the C60 fullerene cluster distribution to be accounted for. It also takes into consideration the complexity of the fullerene-ligand dynamic equilibrium in solution, formed by various types of self- and hetero-complexes. These features make the suggested method superior to standard Langmuir-type analysis, the approach used to date for obtaining quantitative information on ligand binding with different nanoparticles.
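As a point of reference for the standard Langmuir-type analysis the authors improve upon, the sketch below fits a simple 1:1 binding isotherm to a synthetic titration curve. The data points, starting guesses and the assumption of excess ligand are illustrative; this is not the up-scaled model itself.

```python
import numpy as np
from scipy.optimize import curve_fit

# Simple 1:1 Langmuir-type fit of a titration curve, shown only as the baseline
# analysis that the up-scaled model goes beyond. The data are synthetic.
def langmuir(ligand_conc, K, dS_max):
    """Observed signal change for 1:1 binding with ligand in large excess."""
    return dS_max * K * ligand_conc / (1.0 + K * ligand_conc)

conc = np.array([1e-6, 3e-6, 1e-5, 3e-5, 1e-4, 3e-4, 1e-3])     # ligand concentration, M
signal = np.array([0.09, 0.24, 0.51, 0.73, 0.89, 0.95, 0.99])   # normalized response

popt, pcov = curve_fit(langmuir, conc, signal, p0=[1e4, 1.0])
K_fit, dS_max_fit = popt
print(f"K ~ {K_fit:.3g} 1/M, dS_max ~ {dS_max_fit:.2f}")
```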
NASA Astrophysics Data System (ADS)
da Silva Oliveira, C. I.; Martinez-Martinez, D.; Al-Rjoub, A.; Rebouta, L.; Menezes, R.; Cunha, L.
2018-04-01
In this paper, we present a statistical method that allows the degree of transparency of a thin film to be evaluated. To do so, the color coordinates are measured on different substrates and their standard deviation is evaluated. In the case of low values, the color depends on the film and not on the substrate, and intrinsic colors are obtained. In contrast, transparent films lead to high values of the standard deviation, since the color coordinates depend on the substrate. Between both extremes, colored films with a certain degree of transparency can be found. This method allows an objective and simple evaluation of the transparency of any film, improving on subjective visual inspection and avoiding the thickness-related problems of optical spectroscopy evaluation. Zirconium oxynitride films deposited on three different substrates (Si, steel and glass) are used to test the validity of this method; the results have been validated with optical spectroscopy and agree with the visual impression of the samples.
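The core computation is simple enough to sketch directly: measure the CIELAB coordinates of the same film on several substrates and look at their spread. The coordinates and the decision threshold below are assumed values for illustration only.

```python
import numpy as np

# CIELAB coordinates (L*, a*, b*) of the same coating measured on three substrates.
# Numbers and the threshold are illustrative assumptions; the idea is simply that a
# large spread across substrates means the substrate shows through (transparency).
lab_on_substrates = np.array([
    [62.1, 4.3, 18.7],   # film on Si
    [60.8, 4.0, 17.9],   # film on steel
    [61.5, 4.5, 18.2],   # film on glass
])

std_per_coord = lab_on_substrates.std(axis=0, ddof=1)
spread = float(np.linalg.norm(std_per_coord))      # single scalar spread in Lab space

print("std(L*, a*, b*):", np.round(std_per_coord, 2), " overall spread:", round(spread, 2))
print("film looks", "opaque (intrinsic colour)" if spread < 2.0 else "partly transparent")
```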
Organotypic Slice Cultures for Studies of Postnatal Neurogenesis
Mosa, Adam J.; Wang, Sabrina; Tan, Yao Fang; Wojtowicz, J. Martin
2015-01-01
Here we describe a technique for studying hippocampal postnatal neurogenesis in the rodent brain using the organotypic slice culture technique. This method maintains the characteristic topographical morphology of the hippocampus while allowing direct application of pharmacological agents to the developing hippocampal dentate gyrus. Additionally, slice cultures can be maintained for up to 4 weeks and thus, allow one to study the maturation process of newborn granule neurons. Slice cultures allow for efficient pharmacological manipulation of hippocampal slices while excluding complex variables such as uncertainties related to the deep anatomic location of the hippocampus as well as the blood brain barrier. For these reasons, we sought to optimize organotypic slice cultures specifically for postnatal neurogenesis research. PMID:25867138
2011-01-01
Background: This paper reports the development of an in-vitro technique allowing quantification of relative (not absolute) deformations measured at the level of the cancellous bone of the tibial proximal epiphysis (CBTPE) during knee flexion-extension. This method has been developed to allow a future study of the effects of low femoral osteotomies on the CBTPE. Methods: Six strain gages were encapsulated in an epoxy resin solution to form, after resin polymerisation, six measurement elements (ME). The latter were inserted into the CBTPE of six unembalmed specimens, just below the tibial plateau. Knee motion data were collected by three-dimensional (3D) electrogoniometry during several cycles of knee flexion-extension. Intra- and inter-observer reproducibility was estimated on one specimen for all MEs. Intra-specimen repeatability was calculated to determine specimen variability and the error of measurement. A varus and valgus surgical procedure was performed on another specimen to observe CBTPE deformation after this kind of procedure. Results: Average intra-observer variation of the deformation ranged from 8% to 9% (mean coefficient of variation, MCV) for extension and flexion movements, respectively. The coefficient of multiple correlation (CMC) ranged from 0.93 to 0.96 for flexion and extension. No phase shift of maximum strain peaks was observed. Inter-observer MCV averaged 23% and 28% for flexion and extension. The CMC values were 0.82 and 0.87 for extension and flexion, respectively. For intra-specimen repeatability, the average of the mean RMS difference and the mean ICC were calculated only for the flexion movement. The mean RMS variability ranged from 7% to 10% and the mean ICC was 0.98 (0.95 - 0.99). A Pearson's correlation coefficient was calculated, showing that the RMS was independent of signal intensity. For the surgical procedure, the valgus and varus deviations appear to agree with the frontal misalignment theory. Conclusions: The results show that the methodology is reproducible within a range of 10%. This method has been developed to allow analysis of the indirect reflection of deformation variations in the CBTPE before and after distal femoral osteotomies. The first results of the valgus and varus deformation show that our methodology allows this kind of measurement and are encouraging for later studies. It will therefore allow quantification and enhance the understanding of the effects of this kind of surgery on CBTPE loading. PMID:21371297
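A small sketch of two of the repeatability quantities reported above, a mean coefficient of variation across repeated cycles and an RMS difference between observers, computed on synthetic strain traces. The data and noise level are invented for illustration.

```python
import numpy as np

# Simple repeatability metrics of the kind reported: mean coefficient of variation
# across repeated flexion-extension cycles, and an RMS difference between two
# observers' strain curves. The strain traces below are synthetic placeholders.
rng = np.random.default_rng(1)
angle = np.linspace(0, 120, 61)                              # knee flexion angle, deg
cycles = np.array([200.0 + 100.0 * np.sin(np.radians(angle)) + rng.normal(0, 5, angle.size)
                   for _ in range(5)])                       # microstrain, 5 repeated cycles

cv_per_angle = cycles.std(axis=0, ddof=1) / cycles.mean(axis=0)
print(f"mean coefficient of variation: {100 * cv_per_angle.mean():.1f}%")

observer_a, observer_b = cycles[0], cycles[1]                # stand-ins for two observers
rms_diff = np.sqrt(np.mean((observer_a - observer_b) ** 2))
print(f"RMS difference between observers: {rms_diff:.1f} microstrain")
```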
NASA Astrophysics Data System (ADS)
Ho, R. J.; Yusoff, M. Z.; Palanisamy, K.
2013-06-01
Stringent emission policies have pushed automotive research and development toward high-efficiency, low-pollutant powertrains. The conventional direct-injection diesel engine with a diffusion flame has reached its limits, which has driven R&D to explore other fields of combustion. Low temperature combustion (LTC) and homogeneous charge compression ignition (HCCI) have been proven to be effective methods for decreasing combustion pollutant emissions: nitrogen oxide (NOx) and particulate matter (PM) formation from combustion can be greatly suppressed. A review of each method is given to identify the conditions and processes that result in these reductions. The critical parameters that allow such combustion to take place are highlighted and serve to indicate the direction for developing future diesel engine systems. This paper explores the potential of present numerical and experimental methods for optimizing diesel engine design through adoption of the new combustion technology.
The method of selecting an integrated development territory for the high-rise unique constructions
NASA Astrophysics Data System (ADS)
Sheina, Svetlana; Shevtsova, Elina; Sukhinin, Alexander; Priss, Elena
2018-03-01
On the basis of data provided by the Department of Architecture and Urban Planning of the city of Rostov-on-Don, the problem of choosing a territory for complex development that is a priority for the construction of high-rise and unique buildings is solved. The objective of the study was to develop a methodology for selecting such an area and to apply the proposed method to the evaluation of four territories for complex development. The developed method, along with standard indicators of complex evaluation, considers additional indicators that assess the territory from the standpoint of high-rise unique construction. The final result of the study is a ranking of the functional priority areas that takes into account the construction of both residential and public/business objects of unique high-rise construction. The use of the developed methodology will allow investors and customers to assess the investment attractiveness of a future unique construction project on the proposed site.
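One simple way such a multi-indicator ranking can be assembled is a weighted sum of normalized indicator scores. The sketch below illustrates only that generic idea; the territory names, indicator names, scores and weights are invented and do not reproduce the methodology's actual indicator set.

```python
# Hypothetical weighted-sum ranking of candidate territories. Scores are on a 0-10
# scale and weights sum to 1; all values are invented for illustration.
territories = {
    "Territory A": {"transport": 7, "utilities": 6, "geology": 8, "social_infra": 5},
    "Territory B": {"transport": 5, "utilities": 8, "geology": 6, "social_infra": 7},
    "Territory C": {"transport": 9, "utilities": 5, "geology": 7, "social_infra": 6},
    "Territory D": {"transport": 6, "utilities": 7, "geology": 5, "social_infra": 8},
}
weights = {"transport": 0.3, "utilities": 0.25, "geology": 0.25, "social_infra": 0.2}

scores = {name: sum(weights[k] * v for k, v in indicators.items())
          for name, indicators in territories.items()}

for rank, (name, score) in enumerate(sorted(scores.items(), key=lambda x: -x[1]), start=1):
    print(f"{rank}. {name}: {score:.2f}")
```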
Efficient Inversion of Mult-frequency and Multi-Source Electromagnetic Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gary D. Egbert
2007-03-22
The project covered by this report focused on development of efficient but robust non-linear inversion algorithms for electromagnetic induction data, in particular for data collected with multiple receivers and multiple transmitters, a situation extremely common in geophysical EM subsurface imaging methods. A key observation is that for such multi-transmitter problems, each step in commonly used linearized iterative limited-memory search schemes such as conjugate gradients (CG) requires solution of forward and adjoint EM problems for each of the N frequencies or sources, essentially generating data sensitivities for an N-dimensional data subspace. These multiple sensitivities allow a good approximation to the full Jacobian of the data mapping to be built up in many fewer search steps than would be required by application of textbook optimization methods, which take no account of the multiplicity of forward problems that must be solved for each search step. We have applied this idea to develop a hybrid inversion scheme that combines features of the iterative limited-memory type methods with a Newton-type approach using a partial calculation of the Jacobian. Initial tests on 2D problems show that the new approach produces results essentially identical to a Newton-type Occam minimum-structure inversion, while running more rapidly than an iterative (fixed regularization parameter) CG-style inversion. Memory requirements, while greater than for something like CG, are modest enough that even in 3D the scheme should allow 3D inverse problems to be solved on a common desktop PC, at least for modest (~100 sites, 15-20 frequencies) data sets. A secondary focus of the research has been development of a modular system for EM inversion, using an object-oriented approach. This system has proven useful for more rapid prototyping of inversion algorithms, in particular allowing initial development and testing to be conducted with two-dimensional example problems before approaching more computationally cumbersome three-dimensional problems.
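To make the linear-algebra core concrete, the sketch below runs a plain conjugate-gradient solve of one regularized, linearized inverse step with a Jacobian stacked from per-source blocks. It illustrates only the generic CG machinery, not the hybrid scheme of the report; the matrices and regularization parameter are random or assumed.

```python
import numpy as np

# Generic conjugate-gradient solve of one regularized, linearized inverse step,
#   (J^T J + lam * I) m = J^T d,
# where J stacks the sensitivities of all frequencies/sources as row blocks.
rng = np.random.default_rng(0)
n_data_per_src, n_src, n_model = 40, 5, 30
J = rng.normal(size=(n_data_per_src * n_src, n_model))   # stacked per-source blocks
d = rng.normal(size=n_data_per_src * n_src)              # stacked data residuals
lam = 1.0                                                # fixed regularization parameter

A = J.T @ J + lam * np.eye(n_model)
b = J.T @ d

def conjugate_gradient(A, b, tol=1e-8, max_iter=500):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

m_update = conjugate_gradient(A, b)
print("model update norm:", np.linalg.norm(m_update))
```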
A Versatile Mounting Method for Long Term Imaging of Zebrafish Development.
Hirsinger, Estelle; Steventon, Ben
2017-01-26
Zebrafish embryos offer an ideal experimental system for studying complex morphogenetic processes due to their accessibility and optical transparency. In particular, posterior body elongation is an essential process in embryonic development by which multiple tissue deformations act together to direct the formation of a large part of the body axis. In order to observe this process by long-term time-lapse imaging, it is necessary to use a mounting technique that provides sufficient support to maintain samples in the correct orientation during transfer to the microscope and acquisition. In addition, the mounting must also provide sufficient freedom of movement for the outgrowth of the posterior body region without affecting its normal development. Finally, the mounting method must be versatile enough to allow imaging on diverse imaging set-ups. Here, we present a mounting technique for imaging the development of posterior body elongation in the zebrafish D. rerio. This technique involves mounting embryos such that the head and yolk sac regions are almost entirely embedded in agarose, while leaving the posterior body region free to elongate and develop normally. We show how this can be adapted for upright, inverted and vertical light-sheet microscopy set-ups. While this protocol focuses on mounting embryos for imaging of the posterior body, it could easily be adapted for live imaging of many other aspects of zebrafish development.
Development and current applications of assisted fertilization.
Palermo, Gianpiero D; Neri, Queenie V; Monahan, Devin; Kocent, Justin; Rosenwaks, Zev
2012-02-01
Since the very early establishment of in vitro insemination, it became clear that one of the limiting steps is the achievement of fertilization. Among the different assisted fertilization methods, intracytoplasmic sperm injection emerged as the ultimate technique to allow fertilization with ejaculated, epididymal, and testicular spermatozoa. This work describes the early steps that brought forth the development of intracytoplasmic sperm injection and its role in assisted reproductive techniques. The current methods to select the preferential male gamete will be elucidated and the concerns related to the offspring of severe male factor couples will be discussed. Copyright © 2012. Published by Elsevier Inc.
Brain transcriptome atlases: a computational perspective.
Mahfouz, Ahmed; Huisman, Sjoerd M H; Lelieveldt, Boudewijn P F; Reinders, Marcel J T
2017-05-01
The immense complexity of the mammalian brain is largely reflected in the underlying molecular signatures of its billions of cells. Brain transcriptome atlases provide valuable insights into gene expression patterns across different brain areas throughout the course of development. Such atlases allow researchers to probe the molecular mechanisms which define neuronal identities, neuroanatomy, and patterns of connectivity. Despite the immense effort put into generating such atlases, to answer fundamental questions in neuroscience, an even greater effort is needed to develop methods to probe the resulting high-dimensional multivariate data. We provide a comprehensive overview of the various computational methods used to analyze brain transcriptome atlases.
Geo-spatial Informatics in International Public Health Nursing Education.
Kerr, Madeleine J; Honey, Michelle L L; Krzyzanowski, Brittany
2016-01-01
This poster describes results of an undergraduate nursing informatics experience. Students applied geo-spatial methods to community assessments in two urban regions of New Zealand and the United States. Students used the Omaha System standardized language to code their observations during a brief community assessment activity and entered their data into a mapping program developed in Esri ArcGIS Online, a geographic information system. Results will be displayed in tables and maps to allow comparison among the communities. The next generation of nurses can employ geo-spatial informatics methods to contribute to innovative community assessment, planning and policy development.
NASA Astrophysics Data System (ADS)
Busarev, Vladimir V.; Prokof'eva-Mikhailovskaya, Valentina V.; Bochkov, Valerii V.
2007-06-01
A method of reflectance spectrophotometry of atmosphereless bodies of the Solar system, its specificity, and the means of eliminating basic spectral noise are considered. As a development, joining the method of reflectance spectrophotometry with frequency analysis of observational data series is proposed. The combined spectral-frequency method allows identification of formations with distinctive spectral features, and estimation of their sizes and distribution on the surface of atmosphereless celestial bodies. As applied to investigations of asteroids 21 Lutetia and 4 Vesta, the spectral-frequency method has made it possible to obtain fundamentally new information about these minor planets.
Development and validation of a notational system to study the offensive process in football.
Sarmento, Hugo; Anguera, Teresa; Campaniço, Jorge; Leitão, José
2010-01-01
The most striking change in the development of football is the application of science to its problems, and in particular the use of increasingly sophisticated technology that, supported by scientific data, allows us to establish a "code" for reading the reality of the game. This study therefore describes the development and validation of an ad hoc categorization system that allows the different methods of offensive play in football, and their interactions, to be analyzed. Through an exploratory phase of the study, we identified 10 core criteria and the respective behaviors observed for each of these criteria. A panel of five experts was consulted for content validation. The resulting instrument is characterized by a combination of field formats and category systems. The reliability of the instrument was calculated from intra-observer agreement, and values above 0.95 were achieved for all criteria. Two FC Barcelona games were coded and analyzed, which allowed the detection of various T-patterns. The results show that the instrument serves the purpose for which it was developed and can provide important information for understanding game interaction in football.
Detecting and estimating errors in 3D restoration methods using analog models.
NASA Astrophysics Data System (ADS)
José Ramón, Ma; Pueyo, Emilio L.; Briz, José Luis
2015-04-01
Some geological scenarios may be important for a number of socio-economic reasons, such as water or energy resources, but the available underground information is often limited, scarce and heterogeneous. A truly 3D reconstruction, which is still necessary during the decision-making process, may have important social and economic implications. For this reason, restoration methods were developed. By honoring some geometric or mechanical laws, they help build a reliable image of the subsurface. Pioneering methods were first applied in 2D (balanced and restored cross-sections) during the sixties and seventies. Later on, owing to improvements in computational capabilities, they were extended to 3D. Currently, there are several academic and commercial restoration solutions: Unfold by the Université de Grenoble, Move by Midland Valley Exploration, Kine3D (on gOcad code) by Paradigm, and Dynel3D by igeoss-Schlumberger. We have developed our own restoration method, Pmag3Drest (IGME-Universidad de Zaragoza), which is designed to tackle complex geometrical scenarios using paleomagnetic vectors as a pseudo-3D indicator of deformation. However, all these methods have limitations arising from the assumptions they need to establish. For this reason, detecting and estimating uncertainty in 3D restoration methods is of key importance for trusting the reconstructions. Checking the reliability and internal consistency of every method, as well as comparing results among restoration tools, is a critical issue never tackled so far because of the impossibility of testing the results in nature. To overcome this problem we have developed a technique using analog models. We built complex geometric models inspired by real cases of superposed and/or conical folding at laboratory scale. The stratigraphic volumes were modeled using EVA (ethylene vinyl acetate) sheets. Their rheology (tensile and tear strength, elongation, density, etc.) and thickness can be chosen from a large number of values, allowing many geologic settings to be simulated. Besides, we also developed a novel technique to reconstruct the deformation ellipsoid. It consists of screen-printing an orthogonal net on every EVA plate. A CT scan of the stack of plates allows the nodes to be numbered in 3D. Then, the geologic geometry is simulated and scanned again. Comparing the nets before and after deformation allows the distribution of strain ellipsoids to be computed in 3D. After extracting the principal axes, we can calculate dilation, total anisotropy, etc., with a density proportional to the mesh size. The resultant geometry is perfectly known, and thus so is the expected result if we apply any restoration method. In this contribution we show the first results obtained after testing some restoration methods with this stress test.
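A sketch of how a best-fit deformation gradient, and from it the strain-ellipsoid axes and dilation, could be recovered from the printed grid nodes before and after deformation. The node coordinates below are synthetic; in the real workflow they would come from the CT scans, and a homogeneous deformation is assumed over the fitted patch.

```python
import numpy as np

# Recover a best-fit homogeneous deformation gradient F from the 3D coordinates of
# grid nodes before (X) and after (x) deformation, then read the principal stretches
# of the strain ellipsoid from the SVD of F. Coordinates here are synthetic.
X = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1],
              [1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]], float)
F_true = np.array([[1.3, 0.2, 0.0],
                   [0.0, 0.9, 0.1],
                   [0.0, 0.0, 0.8]])
x = X @ F_true.T + np.array([5.0, -2.0, 1.0])           # deformed nodes + rigid translation

# Center both point sets so the translation drops out, then least-squares fit F.
Xc, xc = X - X.mean(axis=0), x - x.mean(axis=0)
F_T, *_ = np.linalg.lstsq(Xc, xc, rcond=None)            # solves Xc @ F^T ~ xc
F = F_T.T

stretches = np.linalg.svd(F, compute_uv=False)           # semi-axes of the strain ellipsoid
dilation = np.linalg.det(F)                              # volume change
print("principal stretches:", np.round(stretches, 3), " dilation:", round(float(dilation), 3))
```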
An Overview of Virtual Acoustic Simulation of Aircraft Flyover Noise
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.
2013-01-01
Methods for testing human subject response to aircraft flyover noise have greatly advanced in recent years as a result of advances in simulation technology. Capabilities have been developed which now allow subjects to be immersed both visually and aurally in a three-dimensional, virtual environment. While suitable for displaying recorded aircraft noise, the true potential is found when synthesizing aircraft flyover noise, because synthesis allows the flexibility and freedom to study sounds from aircraft not yet flown. A virtual acoustic simulation method is described which is built upon prediction-based source noise synthesis, engineering-based propagation modeling, and empirically-based receiver modeling. This source-path-receiver paradigm allows complete control over all aspects of flyover auralization. With this capability, it is now possible to assess human response to flyover noise by systematically evaluating source noise reductions within the context of a system-level simulation. Examples of auralized flyover noise and movie clips representative of an immersive aircraft flyover environment are included in the presentation.
Microfabricated Patch Clamp Electrodes for Improved Ion Channel Protein Measurements
NASA Astrophysics Data System (ADS)
Klemic, James; Klemic, Kathryn; Reed, Mark; Sigworth, Frederick
2002-03-01
Ion channels are trans-membrane proteins that underlie many cell functions including hormone and neurotransmitter release, muscle contraction and cell signaling cascades. Ion channel proteins are commonly characterized via the patch clamp method in which an extruded glass tube containing ionic solution, manipulated by an expert technician, is brought into contact with a living cell to record ionic current through the cell membrane. Microfabricated planar patch electrodes, micromolded in the silicone elastomer poly-dimethylsiloxane (PDMS) from microlithographically patterned structures, have been developed that improve on this method. Microfabrication techniques allow arrays of patch electrodes to be fabricated, increasing the throughput of the measurement technique. Planar patch electrodes readily allow the automation of cell sealing, further increasing throughput. Microfabricated electrode arrays may be readily integrated with microfluidic structures to allow fast, in situ solution exchange. Miniaturization of the electrode geometry should increase both the signal to noise and the bandwidth of the measurement. Microfabricated patch electrode arrays have been fabricated and measurements have been taken.
Fundamental research in the area of high temperature fuel cells in Russia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyomin, A.K.
1996-04-01
Research in the area of molten carbonate and solid oxide fuel cells has been conducted in Russia since the late 1960s. The Institute of High Temperature Electrochemistry is the lead organisation in this area. Research on materials used in fuel cells has allowed us to identify compositions of electrolytes, electrodes, current paths, and transmitting, sealing and structural materials appropriate for long-term fuel cell applications. Studies of electrode processes resulted in a better understanding of the basic patterns of electrode reactions and in the development of a foundation for electrode structure optimization. We have developed methods to increase electrode activity levels that allowed us to reach current densities of up to 1 A/cm2. Development of mathematical models of processes in high temperature fuel cells has allowed us to optimize their structure. The results of fundamental studies have been tested on laboratory mockups. MCFC mockups with up to 100 W capacity and SOFC mockups with up to 1 kW capacity have been manufactured and tested at IHTE. There are three SOFC structural options: tubular, planar and modular.