Science.gov

Sample records for high performance analysis

  1. Failure analysis of high performance ballistic fibers

    NASA Astrophysics Data System (ADS)

    Spatola, Jennifer S.

    High performance fibers have a high tensile strength and modulus, good wear resistance, and a low density, making them ideal for applications in ballistic impact resistance, such as body armor. However, the observed ballistic performance of these fibers is much lower than the predicted values. Since the predictions assume only tensile stress failure, it is safe to assume that the stress state is affecting fiber performance. The purpose of this research was to determine if there are failure mode changes in the fiber fracture when transversely loaded by indenters of different shapes. An experimental design mimicking transverse impact was used to determine any such effects. Three different indenters were used: round, FSP, and razor blade. The indenter height was varied to change the angle of failure tested. Five high performance fibers were examined: Kevlar® KM2, Spectra® 130d, Dyneema® SK-62 and SK-76, and Zylon® 555. Failed fibers were analyzed using an SEM to determine failure mechanisms. The results show that the round and razor blade indenters produced a constant failure strain, as well as failure mechanisms independent of testing angle. The FSP indenter produced a decrease in failure strain as the angle increased. Fibrillation was the dominant failure mechanism at all angles for the round indenter, while through thickness shearing was the failure mechanism for the razor blade. The FSP indenter showed a transition from fibrillation at low angles to through thickness shearing at high angles, indicating that the round and razor blade indenters are extreme cases of the FSP indenter. The failure mechanisms observed with the FSP indenter at various angles correlated with the experimental strain data obtained during fiber testing. This indicates that geometry of the indenter tip in compression is a contributing factor in lowering the failure strain of the high performance fibers. TEM analysis of the fiber failure mechanisms was also attempted, though without

  2. High Performance Data Analysis via Coordinated Caches

    NASA Astrophysics Data System (ADS)

    Fischer, M.; Metzlaff, C.; Kühn, E.; Giffels, M.; Quast, G.; Jung, C.; Hauth, T.

    2015-12-01

    With the second run period of the LHC, high energy physics collaborations will have to face increasing computing infrastructural needs. Opportunistic resources are expected to absorb many computationally expensive tasks, such as Monte Carlo event simulation. This leaves dedicated HEP infrastructure with an increased load of analysis tasks that in turn will need to process an increased volume of data. In addition to storage capacities, a key factor for future computing infrastructure is therefore input bandwidth available per core. Modern data analysis infrastructure relies on one of two paradigms: data is kept on dedicated storage and accessed via network or distributed over all compute nodes and accessed locally. Dedicated storage allows data volume to grow independently of processing capacities, whereas local access allows processing capacities to scale linearly. However, with the growing data volume and processing requirements, HEP will require both of these features. For enabling adequate user analyses in the future, the KIT CMS group is merging both paradigms: popular data is spread over a local disk layer on compute nodes, while any data is available from an arbitrarily sized background storage. This concept is implemented as a pool of distributed caches, which are loosely coordinated by a central service. A Tier 3 prototype cluster is currently being set up for performant user analyses of both local and remote data.
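
    The coordination concept sketched above (popular data pinned onto worker-node disk caches, everything else served from the background storage) can be illustrated roughly as follows. The class names and the simple popularity policy are hypothetical assumptions for illustration, not the KIT CMS implementation.

```python
# Rough sketch of a coordinated cache pool: a central coordinator counts file
# accesses and pins popular files onto worker-node disk caches; misses fall
# back to the arbitrarily sized background storage. Class names and the
# popularity policy are hypothetical, not the KIT CMS implementation.

class Coordinator:
    def __init__(self, node_caches, hot_threshold=3):
        self.node_caches = node_caches          # one set of cached files per node
        self.access_counts = {}
        self.hot_threshold = hot_threshold

    def record_access(self, filename):
        self.access_counts[filename] = self.access_counts.get(filename, 0) + 1
        if self.access_counts[filename] >= self.hot_threshold:
            # loose coordination: place popular data on the least-loaded node cache
            min(self.node_caches, key=len).add(filename)


def read(filename, node_cache, coordinator):
    coordinator.record_access(filename)
    if filename in node_cache:
        return f"{filename}: served from local disk cache"
    return f"{filename}: served from background storage"


caches = [set(), set()]
coordinator = Coordinator(caches)
for _ in range(4):
    print(read("/store/user/analysis.root", caches[0], coordinator))
```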

  3. Performance analysis of memory hierarchies in high performance systems

    SciTech Connect

    Yogesh, A.

    1993-07-01

    This thesis studies memory bandwidth as a performance predictor of programs. The focus of this work is on computationally intensive programs. These programs are the most likely to access large amounts of data, stressing the memory system. Computationally intensive programs are also likely to use highly optimizing compilers to produce the fastest executables possible. Methods to reduce the amount of data traffic by increasing the average number of references to each item while it resides in the cache are explored. Increasing the average number of references to each cache item reduces the number of memory requests. Chapter 2 describes the DLX architecture. This is the architecture on which all the experiments were performed. Chapter 3 studies memory moves as a performance predictor for a group of application programs. Chapter 4 introduces a model to study the performance of programs in the presence of memory hierarchies. Chapter 5 explores some compiler optimizations that can help increase the references to each item while it resides in the cache.
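
    As a hedged illustration of the optimizations explored in Chapter 5 (increasing the number of references to each item while it is cache resident), the sketch below contrasts a naive matrix multiply with a blocked (tiled) version. It is not code from the thesis; the block size B is a hypothetical tuning parameter chosen to fit the cache.

```python
# Illustrative loop tiling (cache blocking): each B x B block of the operands is
# reused many times while it is likely still cache resident, reducing memory
# traffic relative to the naive triple loop. Not code from the thesis; the block
# size B is a hypothetical tuning parameter.
import random

def matmul_naive(A, M, n):
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * M[k][j]
    return C

def matmul_tiled(A, M, n, B=32):
    C = [[0.0] * n for _ in range(n)]
    for ii in range(0, n, B):
        for jj in range(0, n, B):
            for kk in range(0, n, B):
                # work on one block while its data stays in cache
                for i in range(ii, min(ii + B, n)):
                    for k in range(kk, min(kk + B, n)):
                        a_ik = A[i][k]
                        for j in range(jj, min(jj + B, n)):
                            C[i][j] += a_ik * M[k][j]
    return C

n = 64
A = [[random.random() for _ in range(n)] for _ in range(n)]
M = [[random.random() for _ in range(n)] for _ in range(n)]
assert abs(matmul_naive(A, M, n)[0][0] - matmul_tiled(A, M, n)[0][0]) < 1e-9
```

    In practice this blocking transformation is applied by an optimizing compiler rather than written by hand.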

  4. SIMS analysis of high-performance accelerator niobium

    SciTech Connect

    Maheshwari, P.; Stevie, F. A.; Myneni, Ganapati Rao; Rigsbee, J. M.; Dhakal, Pashupati; Ciovati, Gianluigi; Griffis, D. P.

    2014-11-01

    Niobium is used to fabricate superconducting radio frequency accelerator modules because of its high critical temperature, high critical magnetic field, and easy formability. Recent experiments have shown a very significant improvement in performance (over 100%) after a high-temperature bake at 1400 degrees C for 3h. SIMS analysis of this material showed the oxygen profile was significantly deeper than the native oxide with a shape that is indicative of diffusion. Positive secondary ion mass spectra showed the presence of Ti with a depth profile similar to that of O. It is suspected that Ti is associated with the performance improvement. The source of Ti contamination in the anneal furnace has been identified, and a new furnace was constructed without Ti. Initial results from the new furnace do not show the yield improvement. Further analyses should determine the relationship of Ti to cavity performance.

  5. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    SciTech Connect

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
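
    As a minimal sketch of one of the basic analyses named above, the following treats an RDF triple store as an undirected graph over subjects and objects and extracts connected components with a simple union-find. It is illustrative only, not the multi-threaded Cray XMT implementation described in the paper; the example triples are hypothetical.

```python
# Minimal sketch: connected components of an RDF triple store treated as an
# undirected graph over subjects and objects. Illustrative only; not the
# multi-threaded Cray XMT implementation described in the paper.

def connected_components(triples):
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for s, p, o in triples:
        union(s, o)

    components = {}
    for node in parent:
        components.setdefault(find(node), []).append(node)
    return list(components.values())

# Hypothetical example triples
triples = [("ex:alice", "foaf:knows", "ex:bob"),
           ("ex:bob", "foaf:knows", "ex:carol"),
           ("ex:dave", "foaf:knows", "ex:erin")]
print(connected_components(triples))  # two components
```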

  6. [An analysis of maicaodi by high performance liquid chromatography].

    PubMed

    Yang, H; Chen, R; Jiang, M

    1997-05-01

    Maicaodi has recently been developed and produced by the pesticide plant of Nanjing Agricultural University. The quantitative analysis of its effective components--tribenuron methyl and R(-)-napropamide--in Maicaodi wettable powder was carried out by a high performance liquid chromatographic method with a Lichrosorb Si-60 column (20 cm x 0.46 cm i.d.), a mobile phase of petroleum ether/isopropanol/methanol/acetonitrile/chloroform mixed solvent (80:5:5:5:5), and diisooctyl phthalate as internal standard. The sample was detected by ultraviolet absorption at 254 nm. The retention times of tribenuron methyl and R(-)-napropamide were 10-11 min and 6-7 min, respectively. The coefficient of variation of this analysis was 0.34% with a recovery of 99.51%-100.32%. The coefficient of linear correlation was 0.9999. PMID:15739379
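
    For reference, internal-standard quantitation of the kind used here follows the usual response-factor relation; the expressions below are the generic textbook form, not equations reproduced from the paper.

```latex
% Generic internal-standard quantitation (textbook form, not quoted from the paper).
% A are peak areas, m are amounts; the response factor RF is calibrated from a
% standard containing known amounts of analyte and internal standard:
\[
  RF \;=\; \frac{A_{\mathrm{analyte}}/A_{\mathrm{IS}}}{m_{\mathrm{analyte}}/m_{\mathrm{IS}}},
  \qquad
  m_{\mathrm{analyte}}^{\mathrm{sample}}
    \;=\; \frac{A_{\mathrm{analyte}}^{\mathrm{sample}}}{A_{\mathrm{IS}}^{\mathrm{sample}}}
          \cdot \frac{m_{\mathrm{IS}}^{\mathrm{sample}}}{RF}.
\]
```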

  7. An Analysis of a High Performing School District's Culture

    ERIC Educational Resources Information Center

    Corum, Kenneth D.; Schuetz, Todd B.

    2012-01-01

    This report describes a problem based learning project focusing on the cultural elements of a high performing school district. Current literature on school district culture provides numerous cultural elements that are present in high performing school districts. With the current climate in education placing pressure on school districts to perform…

  8. Analysis and Performance of a 12-Pulse High Power Regulator

    NASA Technical Reports Server (NTRS)

    Silva, Arnold; Daeges, John

    1994-01-01

    Under work being performed to upgrade the 20 Kilowatt CW uplink transmitters of the NASA Deep Space Network (DSN), the high voltage regulator has been revisited in order to optimize its performance (long-term stability and regulation), and enhance field reliability.

  9. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer control parameters were noted as shapes, dimensions, probability range factors, and cost. Structural failure concept is presented, and first-order reliability and deterministic methods, benefits, and limitations are discussed. A deterministic reliability technique combining benefits of both is proposed for static structures which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  10. High-performance liquid chromatographic analysis of ampicillin.

    PubMed

    Tsuji, K; Robertson, J H

    1975-09-01

    A high-pressure liquid chromatographic method for the analysis of ampicillin is described. The method uses a 1-m long stainless steel column packed with anionic exchange resin, with a mobile phase of 0.02 M NaNO3 in 0.01 M pH 9.15 borate buffer at a flow rate of 0.45 ml/min. The degradation products of ampicillin, penicillenic and penicilloic acids of ampicillin, can be separated and quantitated in less than 12 min of chromatographic time. The relative standard deviation for the analysis of ampicillin is less than 1%, and the method is sensitive to approximately 20 ng of ampicillin/sample injected. The method was applied to the analysis of various pharmaceutical preparations of ampicillin. It is also applicable, with a slight modification, for the analysis of penicillins G and V. PMID:1185575

  11. Theoretical performance analysis for CMOS based high resolution detectors.

    PubMed

    Jain, Amit; Bednarek, Daniel R; Rudin, Stephen

    2013-03-01

    High resolution imaging capabilities are essential for accurately guiding successful endovascular interventional procedures. Present x-ray imaging detectors are not always adequate due to their inherent limitations. The newly-developed high-resolution micro-angiographic fluoroscope (MAF-CCD) detector has demonstrated excellent clinical image quality; however, further improvement in performance and physical design may be possible using CMOS sensors. We have thus calculated the theoretical performance of two proposed CMOS detectors which may be used as a successor to the MAF. The proposed detectors have a 300 μm thick HL-type CsI phosphor, a 50 μm-pixel CMOS sensor with and without a variable gain light image intensifier (LII), and are designated MAF-CMOS-LII and MAF-CMOS, respectively. For the performance evaluation, linear cascade modeling was used. The detector imaging chains were divided into individual stages characterized by one of the basic processes (quantum gain, binomial selection, stochastic and deterministic blurring, additive noise). Ranges of readout noise and exposure were used to calculate the detectors' MTF and DQE. The MAF-CMOS showed slightly better MTF than the MAF-CMOS-LII, but the MAF-CMOS-LII showed far better DQE, especially for lower exposures. The proposed detectors can have improved MTF and DQE compared with the present high resolution MAF detector. The performance of the MAF-CMOS is excellent for the angiography exposure range; however it is limited at fluoroscopic levels due to additive instrumentation noise. The MAF-CMOS-LII, having the advantage of the variable LII gain, can overcome the noise limitation and hence may perform exceptionally for the full range of required exposures; however, it is more complex and hence more expensive. PMID:24353390
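
    For reference, the quantities evaluated in the cascade model above have standard linear-systems definitions; the relations below are hedged textbook forms, not equations quoted from the paper.

```latex
% Standard cascaded linear-systems relations (textbook forms, not quoted from the paper).
% DQE as a ratio of squared signal-to-noise ratios, with incident fluence \bar{q},
% overall gain \bar{g}, and output noise power spectrum NPS:
\[
  \mathrm{DQE}(f)
    \;=\; \frac{\mathrm{SNR}_{\mathrm{out}}^{2}(f)}{\mathrm{SNR}_{\mathrm{in}}^{2}(f)}
    \;=\; \frac{\bar{q}\,\bar{g}^{2}\,\mathrm{MTF}^{2}(f)}{\mathrm{NPS}_{\mathrm{out}}(f)}.
\]
% A stochastic gain stage with mean gain \bar{g} and gain variance \sigma_g^2
% propagates the mean signal and the noise power spectrum as
\[
  \bar{q}_{\mathrm{out}} \;=\; \bar{g}\,\bar{q}_{\mathrm{in}},
  \qquad
  \mathrm{NPS}_{\mathrm{out}}(f) \;=\; \bar{g}^{2}\,\mathrm{NPS}_{\mathrm{in}}(f) + \sigma_{g}^{2}\,\bar{q}_{\mathrm{in}}.
\]
```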

  12. Performance analysis of two high actuator count MEMS deformable mirrors

    NASA Astrophysics Data System (ADS)

    Ryan, Peter J.; Cornelissen, Steven A.; Lam, Charlie V.; Bierden, Paul A.

    2013-03-01

    Two new MEMS deformable mirrors have been designed and fabricated, one having a continuous facesheet with an active aperture of 20 mm and 2040 actuators and the other a similarly sized segmented tip-tilt-piston DM containing 1021 elements and 3063 actuators. The surface figures, electromechanical performance, and actuator yield of these devices, with statistical information, are reported here. The statistical distributions of these measurements directly illustrate the surface variance of Boston Micromachines deformable mirrors. Measurements of the surface figure were also performed with the elements at different actuation states. Also presented here are deviations of the surface figure under actuation versus at its rest state, the electromechanical distribution, and a dynamic analysis.

  13. CytoSPADE: high-performance analysis and visualization of high-dimensional cytometry data

    PubMed Central

    Linderman, Michael D.; Simonds, Erin F.; Qiu, Peng; Bruggner, Robert V.; Sheode, Ketaki; Meng, Teresa H.; Plevritis, Sylvia K.; Nolan, Garry P.

    2012-01-01

    Motivation: Recent advances in flow cytometry enable simultaneous single-cell measurement of 30+ surface and intracellular proteins. CytoSPADE is a high-performance implementation of an interface for the Spanning-tree Progression Analysis of Density-normalized Events algorithm for tree-based analysis and visualization of this high-dimensional cytometry data. Availability: Source code and binaries are freely available at http://cytospade.org and via Bioconductor version 2.10 onwards for Linux, OSX and Windows. CytoSPADE is implemented in R, C++ and Java. Contact: michael.linderman@mssm.edu Supplementary Information: Additional documentation available at http://cytospade.org. PMID:22782546

  14. Structural analysis of amorphous phosphates using high performance liquid chromatography

    SciTech Connect

    Sales, B.C.; Boatner, L.A.; Chakoumakos, B.C.; McCallum, J.C.; Ramey, J.O.; Zuhr, R.A.

    1993-12-31

    Determining the atomic-scale structure of amorphous solids has proven to be a formidable scientific and technological problem for the past 100 years. The technique of high-performance liquid chromatography (HPLC) provides unique detailed information regarding the structure of partially disordered or amorphous phosphate solids. Applications of the experimental technique of HPLC to phosphate solids are reviewed, and examples of the type of information that can be obtained with HPLC are presented. Inorganic phosphates encompass a large class of important materials whose applications include: catalysts, ion-exchange media, solid electrolytes for batteries, linear and nonlinear optical components, chelating agents, synthetic replacements for bone and teeth, phosphors, detergents, and fertilizers. Phosphate ions also represent a unique link between living systems and the inorganic world.

  15. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    This report describes the work conducted by the Building Science Corporation (BSC) Building America Research Team's 'Energy Efficient Housing Research Partnerships' project. Based on past experience in the Building America program, they have found that combinations of materials and approaches---in other words, systems--usually provide optimum performance. No single manufacturer typically provides all of the components for an assembly, nor has the specific understanding of all the individual components necessary for optimum performance.

  16. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  17. Transmission System Performance Analysis for High-Penetration Photovoltaics

    SciTech Connect

    Achilles, S.; Schramm, S.; Bebic, J.

    2008-02-01

    This study is an assessment of the potential impact of high levels of penetration of photovoltaic (PV) generation on transmission systems. The effort used stability simulations of a transmission system with different levels of PV generation and load.

  18. Moisture and Structural Analysis for High Performance Hybrid Wall Assemblies

    SciTech Connect

    Grin, A.; Lstiburek, J.

    2012-09-01

    Based on past experience in the Building America program, BSC has found that combinations of materials and approaches—in other words, systems—usually provide optimum performance. Integration is necessary, as described in this research project. The hybrid walls analyzed utilize a combination of exterior insulation, diagonal metal strapping, and spray polyurethane foam and leave room for cavity-fill insulation. These systems can provide effective thermal, air, moisture, and water barrier systems in one assembly and provide structure.

  19. Algorithms and architectures for high performance analysis of semantic graphs.

    SciTech Connect

    Hendrickson, Bruce Alan

    2005-09-01

    Semantic graphs offer one promising avenue for intelligence analysis in homeland security. They provide a mechanism for describing a wide variety of relationships between entities of potential interest. The vertices are nouns of various types, e.g. people, organizations, events, etc. Edges in the graph represent different types of relationships between entities, e.g. 'is friends with', 'belongs-to', etc. Semantic graphs offer a number of potential advantages as a knowledge representation system. They allow information of different kinds, and collected in differing ways, to be combined in a seamless manner. A semantic graph is a very compressed representation of some of the relationship information. It has been reported that the semantic graph can be two orders of magnitude smaller than the processed intelligence data. This allows for much larger portions of the data universe to be resident in computer memory. Many intelligence queries that are relevant to the terrorist threat are naturally expressed in the language of semantic graphs. One example is the search for 'interesting' relationships between two individuals or between an individual and an event, which can be phrased as a search for short paths in the graph. Another example is the search for an analyst-specified threat pattern, which can be cast as an instance of subgraph isomorphism. It is important to note that many kinds of analysis are not relationship based, so these are not good candidates for semantic graphs. Thus, a semantic graph should always be used in conjunction with traditional knowledge representation and interface methods. Operations that involve looking for chains of relationships (e.g. friend of a friend) are not efficiently executable in a traditional relational database. However, the semantic graph can be thought of as a pre-join of the database, and it is ideally suited for these kinds of operations. Researchers at Sandia National Laboratories are working to facilitate semantic graph
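
    The short-path query described above can be sketched as a breadth-first search over an adjacency list of (relation, neighbor) pairs. This is a hedged illustration with a hypothetical toy graph, not code or data from the report.

```python
# Hedged sketch of the "interesting relationship" query described above: a
# breadth-first search for a short path between two entities in a semantic
# graph stored as an adjacency list of (relation, neighbor) pairs.
# The tiny example graph is hypothetical, not data from the report.
from collections import deque

def shortest_relationship_path(graph, source, target):
    queue = deque([(source, [source])])
    visited = {source}
    while queue:
        node, path = queue.popleft()
        if node == target:
            return path
        for relation, neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, path + [f"--{relation}-->", neighbor]))
    return None

graph = {
    "personA": [("is friends with", "personB")],
    "personB": [("belongs-to", "orgX")],
    "orgX": [("organized", "eventY")],
}
print(shortest_relationship_path(graph, "personA", "eventY"))
```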

  20. High Performance Parallel Analysis of Coupled Problems for Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.

    1994-01-01

    In order to predict the dynamic response of a flexible structure in a fluid flow, the equations of motion of the structure and the fluid must be solved simultaneously. In this paper, we present several partitioned procedures for time-integrating this coupled problem and discuss their merits in terms of accuracy, stability, heterogeneous computing, I/O transfers, subcycling, and parallel processing. All theoretical results are derived for a one-dimensional piston model problem with a compressible flow, because the complete three-dimensional aeroelastic problem is difficult to analyze mathematically. However, the insight gained from the analysis of the coupled piston problem and the conclusions drawn from its numerical investigation are confirmed with the numerical simulation of the two-dimensional transient aeroelastic response of a flexible panel in a transonic nonlinear Euler flow regime.

  1. Similarity spectra analysis of high-performance jet aircraft noise.

    PubMed

    Neilsen, Tracianne B; Gee, Kent L; Wall, Alan T; James, Michael M

    2013-04-01

    Noise measured in the vicinity of an F-22A Raptor has been compared to similarity spectra found previously to represent mixing noise from large-scale and fine-scale turbulent structures in laboratory-scale jet plumes. Comparisons have been made for three engine conditions using ground-based sideline microphones, which covered a large angular aperture. Even though the nozzle geometry is complex and the jet is nonideally expanded, the similarity spectra do agree with large portions of the measured spectra. Toward the sideline, the fine-scale similarity spectrum is used, while the large-scale similarity spectrum provides a good fit to the area of maximum radiation. Combinations of the two similarity spectra are shown to match the data in between those regions. Surprisingly, a combination of the two is also shown to match the data at the farthest aft angle. However, at high frequencies the degree of congruity between the similarity and the measured spectra changes with engine condition and angle. At the higher engine conditions, there is a systematically shallower measured high-frequency slope, with the largest discrepancy occurring in the regions of maximum radiation. PMID:23556581

  2. Application of high performance capillary electrophoresis on toxic alkaloids analysis.

    PubMed

    Zhang, Li; Wang, Rong; Zhang, Yurong; Yu, Yunqiu

    2007-06-01

    We employed CE to identify mixtures of the toxic alkaloids lappaconitine, bullatine A, atropine sulfate, atropine methobromide, scopolamine hydrobromide, anisodamine hydrobromide, brucine, strychnine, quinine sulfate, and chloroquine in human blood and urine, using procaine hydrochloride as an internal standard. The separation employed a fused-silica capillary of 75 microm id x 60 cm length (effective length: 50.2 cm) and a buffer containing 100 mM phosphate and 5% ACN (pH 4.0). The sample was injected in a pressure mode and the separation was performed at a voltage of 16 kV and a temperature of 25 degrees C. The compounds were detected by UV absorbance at wavelengths of 195 and 235 nm. All the ten alkaloids were separated within 16 min. The method was validated with regard to precision (RSD), accuracy, sensitivity, linear range, LOD, and LOQ. In blood and urine samples, the detection limits were 5-40 ng/mL and linear calibration curves were obtained over the range of 0.02-10 microg/mL. The precision of intra- and interday measurements was less than 15%. Electrophoretic peaks could be identified either by the relative migration time or by their UV spectrum. PMID:17623479

  3. Performance analysis of high quality parallel preconditioners applied to 3D finite element structural analysis

    SciTech Connect

    Kolotilina, L.; Nikishin, A.; Yeremin, A.

    1994-12-31

    The solution of large systems of linear equations is a crucial bottleneck when performing 3D finite element analysis of structures. Also, in many cases the reliability and robustness of iterative solution strategies, and their efficiency when exploiting hardware resources, fully determine the scope of industrial applications which can be solved on a particular computer platform. This is especially true for modern vector/parallel supercomputers with large vector length and for modern massively parallel supercomputers. Preconditioned iterative methods have been successfully applied to industrial class finite element analysis of structures. The construction and application of high quality preconditioners constitutes a high percentage of the total solution time. Parallel implementation of high quality preconditioners on such architectures is a formidable challenge. Two common types of existing preconditioners are the implicit preconditioners and the explicit preconditioners. The implicit preconditioners (e.g. incomplete factorizations of several types) are generally high quality but require solution of lower and upper triangular systems of equations per iteration which are difficult to parallelize without deteriorating the convergence rate. The explicit preconditioners (e.g. polynomial or Jacobi-like preconditioners) require sparse matrix-vector multiplications and can be parallelized, but their preconditioning qualities are less than desirable. The authors present results of numerical experiments with Factorized Sparse Approximate Inverses (FSAI) for symmetric positive definite linear systems. These are high quality preconditioners that possess a large resource of parallelism by construction without increasing the serial complexity.

  4. ANALYSIS OF DRUG INTERACTIONS WITH HIGH DENSITY LIPOPROTEIN BY HIGH-PERFORMANCE AFFINITY CHROMATOGRAPHY

    PubMed Central

    Chen, Sike; Sobansky, Matthew R.; Hage, David S.

    2009-01-01

    Columns containing immobilized lipoproteins were prepared for the analysis of drug interactions with these particles by high-performance affinity chromatography. This approach was evaluated by using it to examine the binding of high density lipoprotein (HDL) to the drugs propranolol or verapamil. HDL was immobilized by the Schiff base method onto silica and gave HPLC columns with reproducible binding to propranolol over four to five days of continuous operation at pH 7.4. Frontal analysis experiments indicated that two types of interactions were occurring between R/S-propranolol and HDL at 37°C: saturable binding with an association equilibrium constant (Ka) of 1.1–1.9 × 10^5 M^-1, and non-saturable binding with an overall affinity constant (n Ka) of 3.7–4.1 × 10^4 M^-1. Similar results were found at 4 and 27°C. Verapamil also gave similar behavior, with a Ka of 6.0 × 10^4 M^-1 at 37°C for the saturable sites and an n Ka value for the non-saturable sites of 2.5 × 10^4 M^-1. These measured affinities gave good agreement with solution-phase values. The results indicated HPAC can be used to study drug interactions with HDL, providing information that should be valuable in obtaining a better description of how drugs are transported within the body. PMID:19833090
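
    The two interaction types reported (one saturable site class plus non-saturable binding) correspond to a generic mixed binding isotherm; the form below is a hedged standard model rather than the exact expression fitted in the paper.

```latex
% Generic mixed-mode binding model (standard form, not the exact expression fitted in the paper):
% r is moles of drug bound per mole of HDL, [D] is the free drug concentration,
% n_1 and K_a describe the saturable site class, and (n K_a)_ns the non-saturable term.
\[
  r \;=\; \frac{n_{1}\,K_{a}\,[D]}{1 + K_{a}\,[D]} \;+\; (n K_{a})_{\mathrm{ns}}\,[D].
\]
```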

  5. The NetLogger Methodology for High Performance Distributed Systems Performance Analysis

    SciTech Connect

    Tierney, Brian; Johnston, William; Crowley, Brian; Hoo, Gary; Brooks, Chris; Gunter, Dan

    1999-12-23

    The authors describe a methodology that enables the real-time diagnosis of performance problems in complex high-performance distributed systems. The methodology includes tools for generating precision event logs that can be used to provide detailed end-to-end application and system level monitoring; a Java agent-based system for managing the large amount of logging data; and tools for visualizing the log data and real-time state of the distributed system. The authors developed these tools for analyzing a high-performance distributed system centered around the transfer of large amounts of data at high speeds from a distributed storage server to a remote visualization client. However, this methodology should be generally applicable to any distributed system. This methodology, called NetLogger, has proven invaluable for diagnosing problems in networks and in distributed systems code. This approach is novel in that it combines network, host, and application-level monitoring, providing a complete view of the entire system.

  6. BioGraphE: High-performance bionetwork analysis using the Biological Graph Environment

    SciTech Connect

    Chin, George; Chavarría-Miranda, Daniel; Nakamura, Grant C.; Sofia, Heidi J.

    2008-05-28

    Graphs and networks are common analysis representations for biological systems. Many traditional graph algorithms such as k-clique, k-coloring, and subgraph matching have great potential as analysis techniques for newly available data in biology. Yet, as the amount of genomic and bionetwork information rapidly grows, scientists need advanced new computational strategies and tools for dealing with the complexities of the bionetwork analysis and the volume of the data. We introduce a computational framework for graph analysis called the Biological Graph Environment (BioGraphE), which provides a general, scalable integration platform for connecting graph problems in biology to optimized computational solvers and high-performance systems. This framework enables biology researchers and computational scientists to identify and deploy network analysis applications and to easily connect them to efficient and powerful computational software and hardware that are specifically designed and tuned to solve complex graph problems. In our particular application of BioGraphE to support network analysis in genome biology, we investigate the use of a Boolean satisfiability solver known as Survey Propagation as a core computational solver and high-performance parallel systems that utilize multi-threaded architectures. In our application of BioGraphE to conduct bionetwork analysis of homology networks, we found that BioGraphE and a custom, parallel implementation of the Survey Propagation SAT solver were capable of solving very large bionetwork problems at high rates of execution on different high-performance computing platforms.

  7. Driven To Succeed: High-Performing, High-Poverty, Turnaround Middle Schools. Volume I: Cross-Case Analysis of High-Performing, High-Poverty, Turnaround Middle Schools.

    ERIC Educational Resources Information Center

    Picucci, Ali Callicoatte; Brownson, Amanda; Kahlert, Rahel; Sobel, Andrew

    This study investigated how seven high-poverty middle schools demonstrated strong academic improvement so they were performing at levels consistent with, and often better than, higher-income schools in their states. Schools ranged in enrollment from 291-1,010 and represented several community types and ethnic groups. Among the characteristics…

  8. ANALYSIS OF CHLORINATED HERBICIDES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY/MASS SPECTROMETRY

    EPA Science Inventory

    A method that uses high performance liquid chromatography/mass spectrometry (HPLC/MS) for the analysis of chlorinated phenoxyacid herbicides is described. During method development different techniques were used to increase both the sensitivity and the specificity of thermospray H...

  9. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as the multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share the common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available.
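
    The scalability analysis mentioned above applies a multicore variant of Amdahl's law; its classical form, stated here for reference rather than reproduced from the paper, gives the speedup on N cores for a workload with parallelizable fraction p.

```latex
% Classical Amdahl's law (stated for reference; the paper applies a multicore variant):
\[
  S(N) \;=\; \frac{1}{(1 - p) + p/N},
  \qquad
  \lim_{N \to \infty} S(N) \;=\; \frac{1}{1 - p}.
\]
% Arithmetic example: with p = 0.99 the best possible 12-core speedup is
% 1/(0.01 + 0.99/12) \approx 10.8, so the reported 12-fold gain implies p very close to 1.
```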

  10. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy.

    PubMed

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-03-19

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as the multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share the common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl's law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available. PMID:24910506

  11. Analysis of scalability of high-performance 3D image processing platform for virtual colonoscopy

    PubMed Central

    Yoshida, Hiroyuki; Wu, Yin; Cai, Wenli

    2014-01-01

    One of the key challenges in three-dimensional (3D) medical imaging is to enable the fast turn-around time, which is often required for interactive or real-time response. This inevitably requires not only high computational power but also high memory bandwidth due to the massive amount of data that need to be processed. For this purpose, we previously developed a software platform for high-performance 3D medical image processing, called HPC 3D-MIP platform, which employs increasingly available and affordable commodity computing systems such as the multicore, cluster, and cloud computing systems. To achieve scalable high-performance computing, the platform employed size-adaptive, distributable block volumes as a core data structure for efficient parallelization of a wide range of 3D-MIP algorithms, supported task scheduling for efficient load distribution and balancing, and consisted of layered parallel software libraries that allow image processing applications to share the common functionalities. We evaluated the performance of the HPC 3D-MIP platform by applying it to computationally intensive processes in virtual colonoscopy. Experimental results showed a 12-fold performance improvement on a workstation with 12-core CPUs over the original sequential implementation of the processes, indicating the efficiency of the platform. Analysis of performance scalability based on Amdahl’s law for symmetric multicore chips showed the potential for high performance scalability of the HPC 3D-MIP platform when a larger number of cores is available. PMID:24910506

  12. High-performance equation solvers and their impact on finite element analysis

    NASA Technical Reports Server (NTRS)

    Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. Dale, Jr.

    1990-01-01

    The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
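
    As a hedged sketch of the second solver family described above (a preconditioned conjugate gradient with a diagonal preconditioner), the following generic dense-matrix implementation is for illustration only; it is not the variable-band or sparse-storage code of the paper.

```python
# Generic Jacobi (diagonal) preconditioned conjugate gradient for a symmetric
# positive definite system A x = b. Textbook sketch for illustration, not the
# paper's variable-band or sparse-storage implementation.
import numpy as np

def pcg_jacobi(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    M_inv = 1.0 / np.diag(A)          # diagonal (Jacobi) preconditioner
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small usage example on a random SPD matrix
rng = np.random.default_rng(0)
Q = rng.standard_normal((50, 50))
A = Q @ Q.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x = pcg_jacobi(A, b)
print(np.allclose(A @ x, b, atol=1e-6))
```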

  13. An analysis for high speed propeller-nacelle aerodynamic performance prediction. Volume 1: Theory and application

    NASA Technical Reports Server (NTRS)

    Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.

    1988-01-01

    A computer program, the Propeller Nacelle Aerodynamic Performance Prediction Analysis (PANPER), was developed for the prediction and analysis of the performance and airflow of propeller-nacelle configurations operating over a forward speed range inclusive of high speed flight typical of recent propfan designs. A propeller lifting line, wake program was combined with a compressible, viscous center body interaction program, originally developed for diffusers, to compute the propeller-nacelle flow field, blade loading distribution, propeller performance, and the nacelle forebody pressure and viscous drag distributions. The computer analysis is applicable to single and coaxial counterrotating propellers. The blade geometries can include spanwise variations in sweep, droop, taper, thickness, and airfoil section type. In the coaxial mode of operation the analysis can treat both equal and unequal blade number and rotational speeds on the propeller disks. The nacelle portion of the analysis can treat both free air and tunnel wall configurations including wall bleed. The analysis was applied to many different sets of flight conditions using selected aerodynamic modeling options. The influence of different propeller nacelle-tunnel wall configurations was studied. Comparisons with available test data for both single and coaxial propeller configurations are presented along with a discussion of the results.

  14. Viewpoints: A High-Performance High-Dimensional Exploratory Data Analysis Tool

    NASA Astrophysics Data System (ADS)

    Gazis, P. R.; Levit, C.; Way, M. J.

    2010-12-01

    Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets, but as the price of commodity graphics hardware has dropped and its capability has increased, it is now possible, in principle, to view large complex data sets on a single workstation. To do this in practice, an investigator will need software that is written to take advantage of the relevant graphics hardware. The Viewpoints visualization package described herein is an example of such software. Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It leverages the capabilities of modern graphics boards (GPUs) to run on a single workstation or laptop. Viewpoints is minimalist: it attempts to do a small set of useful things very well (or at least very quickly) in comparison with similar packages today. Its basic feature set includes linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal. Viewpoints was originally designed for astrophysicists, but it has since been used in a variety of fields that range from astronomy, quantum chemistry, fluid dynamics, machine learning, bioinformatics, and finance to information technology server log mining. In this article, we describe the Viewpoints package and show examples of its usage.

  15. A validated high performance liquid chromatographic method for the analysis of Goldenseal.

    PubMed

    Li, Wenkui; Fitzloff, John F

    2002-03-01

    Goldenseal (Hydrastis canadensis L.) has emerged as one of the top ten herbal supplements on the worldwide market. A rapid, simple and validated high performance liquid chromatographic method, with photodiode array detection, has been developed for the analysis of commercial Goldenseal products. Samples were treated by sonication with acidified methanol/water. The method was validated for LOD, LOQ, linearity, reproducibility and recovery with good results. PMID:11902811

  16. RECENT ADVANCES IN ULTRA-HIGH PERFORMANCE LIQUID CHROMATOGRAPHY FOR THE ANALYSIS OF TRADITIONAL CHINESE MEDICINE

    PubMed Central

    Huang, Huilian; Liu, Min; Chen, Pei

    2014-01-01

    Traditional Chinese medicine has been widely used for the prevention and treatment of various diseases for thousands of years in China. Ultra-high performance liquid chromatography (UHPLC) is a relatively new technique offering new possibilities. This paper reviews recent developments in UHPLC in the separation and identification, fingerprinting, quantification, and metabolism of traditional Chinese medicine. Recently, the combination of UHPLC with MS has improved the efficiency of the analysis of these materials. PMID:25045170

  17. An analysis for high speed propeller-nacelle aerodynamic performance prediction. Volume 2: User's manual

    NASA Technical Reports Server (NTRS)

    Egolf, T. Alan; Anderson, Olof L.; Edwards, David E.; Landgrebe, Anton J.

    1988-01-01

    A user's manual for the computer program developed for the prediction of propeller-nacelle aerodynamic performance reported in, An Analysis for High Speed Propeller-Nacelle Aerodynamic Performance Prediction: Volume 1 -- Theory and Application, is presented. The manual describes the computer program's mode of operation, requirements, input structure, input data requirements, and the program output. In addition, it provides the user with documentation of the internal program structure and the software used in the computer program as it relates to the theory presented in Volume 1. Sample input data setups are provided along with selected printout of the program output for one of the sample setups.

  18. Performance, Performance System, and High Performance System

    ERIC Educational Resources Information Center

    Jang, Hwan Young

    2009-01-01

    This article proposes needed transitions in the field of human performance technology. The following three transitions are discussed: transitioning from training to performance, transitioning from performance to performance system, and transitioning from learning organization to high performance system. A proposed framework that comprises…

  19. The High Performance and Wide Area Analysis and Mining of Scientific & Engineering Data

    SciTech Connect

    Grossman, R.

    2002-12-01

    This final report summarizes our accomplishments and findings and includes recent publications occurring in the final period of this award. One of our research goals was to develop algorithms and services for remote data analysis and distributed data mining which scaled from the commodity internet to high performance networks. When we began the project there were no effective mechanisms to achieve high end-to-end performance for data intensive applications over wide area, high bandwidth networks. For this reason, we developed algorithms and services for Layers 2, 3, and 4 in the simple data web application stack below. We describe our research accomplishments for each of these layers in turn: Layer 4--Data Web Applications; Layer 3--Data Web Services; Layer 2--Network Protocol Services; Layer 1--IP.

  20. Core-Shell Columns in High-Performance Liquid Chromatography: Food Analysis Applications

    PubMed Central

    Preti, Raffaella

    2016-01-01

    The increased separation efficiency provided by the new technology of columns packed with core-shell particles in high-performance liquid chromatography (HPLC) has resulted in their widespread diffusion in several analytical fields: pharmaceutical, biological, environmental, and toxicological. The present paper presents their most recent applications in food analysis. Their use has proved to be particularly advantageous for the determination of compounds at trace levels or when a large number of samples must be analyzed quickly using reliable and solvent-saving apparatus. The literature described here shows how the outstanding performance provided by core-shell particle columns on traditional HPLC instruments is comparable to that obtained with costly UHPLC instrumentation, making this novel column a promising key tool in food analysis. PMID:27143972

  1. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    SciTech Connect

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
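
    For context, the quoted contingency-analysis speedup corresponds to a parallel efficiency of about 98% (simple arithmetic on the figures above, not a number stated in the abstract).

```latex
% Parallel efficiency implied by the quoted speedup (simple arithmetic only):
\[
  E \;=\; \frac{S}{N} \;=\; \frac{9{,}800}{10{,}000} \;=\; 0.98 \;\; (98\%).
\]
```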

  2. Linear Static Structural and Vibration Analysis on High-Performance Computers

    NASA Technical Reports Server (NTRS)

    Baddourah, Majdi; Storaasli, Olaf O.; Bostic, Susan

    1993-01-01

    Parallel computers offer the opportunity to significantly reduce the computation time necessary to analyze large-scale aerospace structures. This paper presents algorithms developed for and implemented on massively parallel computers, hereafter referred to as Scalable High Performance Computers (SHPC), for the most computationally intensive tasks involved in structural analysis, namely, generation and assembly of system matrices, solution of systems of equations, and calculation of the eigenvalues and eigenvectors. Results on SHPC are presented for large-scale structural problems (i.e., models of high speed civil transport). The goal of this research is to develop new efficient techniques which extend structural analysis to SHPC and make large-scale structural analyses tractable.

  3. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Maman, N.; Piperno, S.; Gumaste, U.

    1994-01-01

    This research program deals with the application of high-performance computing methods for the analysis of complete jet engines. We have initiated this program by applying the two-dimensional parallel aeroelastic codes to the interior gas flow problem of a bypass jet engine. The fluid mesh generation, domain decomposition, and solution capabilities were successfully tested. We then focused attention on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion that results from these structural displacements. This is treated by a new arbitrary Lagrangian-Eulerian (ALE) technique that models the fluid mesh motion as that of a fictitious mass-spring network. New partitioned analysis procedures to treat this coupled three-component problem are developed. These procedures involved delayed corrections and subcycling. Preliminary results on the stability, accuracy, and MPP computational efficiency are reported.

  4. Electrical performance analysis of IC package for the high-end memory device

    NASA Astrophysics Data System (ADS)

    Lee, Dong H.; Han, Chan M.

    1997-08-01

    Developments in processing technology and design make it possible to increase the clock speed and the number of input/outputs (I/Os) in memory devices. The interconnections of the IC package are considered an important factor in determining the performance of memory devices. In order to overcome the limitations of the conventional package, new types of package such as Ball Grid Array (BGA), chip scale package, or flip chip bonding are adopted by many IC manufacturers. The present work compares the electrical performance of three different packages to provide a design guide for IC packages of future high performance memory devices. The packages are designed for the same memory device to meet the diversity of memory market demand. The conventional package using a lead frame, a wire-bonded BGA using a printed circuit board substrate, and a flip-chip-bonded BGA are analyzed. Their electrical performances are compared in terms of signal delay and coupling effects between signal interconnections. The electrical package model is built by extracting the parasitics of the interconnections in the IC package through electromagnetic simulations. The analysis of electrical behavior is performed using a SPICE model constructed to represent the real situation. The methodology presented is also capable of determining the most suitable memory package for a particular device based on its electrical performance.

  5. Performance Analysis of Two Early NACA High Speed Propellers with Application to Civil Tiltrotor Configurations

    NASA Technical Reports Server (NTRS)

    Harris, Franklin D.

    1996-01-01

    The helicopter industry is vigorously pursuing development of civil tiltrotors. One key to efficient high speed performance of this rotorcraft is prop-rotor performance. Of equal, if not greater, importance is assurance that the flight envelope is free of aeroelastic instabilities well beyond currently envisioned cruise speeds. This latter condition requires study at helical tip Mach numbers well in excess of 1.0. Two 1940's 'supersonic' propeller experiments conducted by NACA have provided an immensely valuable data bank with which to study prop-rotor behavior at transonic and supersonic helical tip Mach numbers. Very accurate 'blades alone' data were obtained by using a nearly infinite hub. Tabulated data were recreated from the many thrust and power figures and are included in two Appendices to this report. This data set is exceptionally well suited to re-evaluating classical blade element theories as well as evolving computational fluid dynamic (CFD) analyses. A limited comparison of one propeller's experimental results to a modern rotorcraft CFD code is made. This code, referred to as TURNS, gives very encouraging results. Detailed analysis of the performance data from both propellers is provided in Appendix A. This appendix quantifies the minimum power required to produce usable prop-rotor thrust. The dependence of minimum profile power on Reynolds number is quantified. First order compressibility power losses are quantified as well and a first approximation to design airfoil thickness ratio to avoid compressibility losses is provided. Appendix A's results are applied to study high speed civil tiltrotor cruise performance. Predicted tiltrotor performance is compared to two turboprop commercial transports. The comparison shows that there is no fundamental aerodynamic reason why the rotorcraft industry could not develop civil tiltrotor aircraft which have competitive cruise performance with today's regional, turboprop airlines. Recommendations for future study

  6. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs of abuse testing, and forensic cases. Because variations in benzodiazepine concentrations in biological samples caused by bleeding, postmortem changes, and redistribution could bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this main drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by a high-performance liquid chromatography method coupled to a diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL with a coefficient of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for the four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances in the samples was observed. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory.

  7. Ariel Performance Analysis System

    NASA Astrophysics Data System (ADS)

    Ariel, Gideon B.; Penny, M. A.; Saar, Dany

    1990-08-01

    The Ariel Performance Analysis System is a computer-based system for the measurement, analysis and presentation of human performance. The system is based on a proprietary technique for processing multiple high-speed film and video recordings of a subject's performance. It is noninvasive, and does not require wires, sensors, markers or reflectors. In addition, it is portable and does not require modification of the performing environment. The scale and accuracy of measurement can be set to whatever levels are required by the activity being performed.

  8. Performance improvement in high-speed random accessibility of Brillouin optical correlation domain analysis

    NASA Astrophysics Data System (ADS)

    Kohno, Yuta; Kishi, Masato; Hotate, Kazuo

    2016-05-01

    Brillouin Optical Correlation Domain Analysis (BOCDA) offers high-speed random accessibility along a sensing fiber, because it can localize stimulated Brillouin scattering at an arbitrary fiber position. Using this function, simultaneous dynamic strain measurement at arbitrarily selected multiple points along the fiber was achieved. However, measurement accuracy was restricted by the performance limitations of the lock-in amplifier in the system. This paper reports a new system that uses an I/Q demodulator instead of the lock-in amplifier. Measurement accuracy was improved.
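
    The substitution described above hinges on the I/Q (in-phase/quadrature) demodulation step. The minimal sketch below, with illustrative signal and sampling parameters that are not taken from the paper, shows how mixing a detected tone with two quadrature references and low-pass filtering (here, averaging over an integer number of modulation periods) recovers the amplitude and phase that a lock-in amplifier would otherwise provide.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, f_mod = 1.0e6, 10.0e3                    # sample rate and modulation frequency (Hz)
t = np.arange(0, 20e-3, 1.0 / fs)            # 20 ms record = 200 modulation periods
signal = 0.8 * np.cos(2 * np.pi * f_mod * t + 0.6) + 0.05 * rng.standard_normal(t.size)

i_ref = np.cos(2 * np.pi * f_mod * t)        # in-phase reference
q_ref = -np.sin(2 * np.pi * f_mod * t)       # quadrature reference
I = 2.0 * np.mean(signal * i_ref)            # averaging acts as the low-pass filter
Q = 2.0 * np.mean(signal * q_ref)

print("amplitude ~", np.hypot(I, Q))         # close to 0.8
print("phase (rad) ~", np.arctan2(Q, I))     # close to 0.6
```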

  9. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    PubMed

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  10. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    PubMed Central

    Simonyan, Vahan; Mazumder, Raja

    2014-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis. PMID:25271953

  11. [Quantitative analysis of (-)-epigallocatechin gallate in tea leaves by high-performance liquid chromatography].

    PubMed

    Sakata, I; Ikeuchi, M; Maruyama, I; Okuda, T

    1991-12-01

    The quantitative analysis of (-)-epigallocatechin gallate (EGCG) in tea (Camellia sinensis L.) was performed by high-performance liquid chromatography (HPLC) with a C-18 reversed-phase column. EGCG was eluted within 20 min using methanol-water-acetic acid (20:75:5 (v/v/v)) as the eluent. Tryptophan was used as the internal standard. The content of EGCG in five kinds of green tea (sencha, gyokuro, bancha, matsucha and oolong tea), and in a cup of each, was determined by both the extraction method with 50% (v/v) methanol and the infusion method with water. The largest amount of EGCG was obtained from matsucha by the extraction method and from sencha by the infusion method. Furthermore, the EGCG contents in various parts of the tea plant were examined. The first leaf had the highest concentration of EGCG, and the concentration decreased with the aging of the leaf. PMID:1806661

  12. [Analysis of amines in water samples by high performance liquid chromatography-laser induced fluorescence detection].

    PubMed

    Liu, Fan; Gao, Fangyuan; Tang, Tao; Sun, Yuanshe; Li, Tong; Zhang, Weibing

    2013-11-01

    A sensitive high performance liquid chromatography (HPLC)-laser induced fluorescence detection (LIFD) method was developed for the determination of amines. The derivatization and separation conditions were investigated. Under the optimized conditions, spermidine, putrescine and histamine were analyzed. The limits of detection (LODs) of the three biogenic amines (S/N = 3) were as low as 10⁻¹⁰ mol/L. The method showed excellent stability: the RSDs of the retention times and peak areas of the three biogenic amines were lower than 0.3% and 3%, respectively. The method was applied to biogenic amine analysis in water samples, and the average recoveries were in the range of 94.99%-104.7%. Furthermore, the amines in seven tea samples were analyzed by this method, and satisfactory results were achieved. The developed assay has excellent sensitivity and good reproducibility, and can be used for the analysis of amines in water samples. PMID:24558849

  13. Preliminary Evaluation of MapReduce for High-Performance Climate Data Analysis

    NASA Technical Reports Server (NTRS)

    Duffy, Daniel Q.; Schnase, John L.; Thompson, John H.; Freeman, Shawn M.; Clune, Thomas L.

    2012-01-01

    MapReduce is an approach to high-performance analytics that may be useful to data intensive problems in climate research. It offers an analysis paradigm that uses clusters of computers and combines distributed storage of large data sets with parallel computation. We are particularly interested in the potential of MapReduce to speed up basic operations common to a wide range of analyses. In order to evaluate this potential, we are prototyping a series of canonical MapReduce operations over a test suite of observational and climate simulation datasets. Our initial focus has been on averaging operations over arbitrary spatial and temporal extents within Modern-Era Retrospective Analysis for Research and Applications (MERRA) data. Preliminary results suggest this approach can improve efficiencies within data intensive analytic workflows.
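
    As a concrete illustration of the averaging operation described above, the sketch below expresses a spatial mean as a map step (assign each record to a 1-degree bin and emit a partial sum) followed by a reduce step (combine partial sums into per-bin means). The record fields and values are illustrative placeholders, not the MERRA variable layout.

```python
from collections import defaultdict

# Toy records: latitude, longitude, 2 m air temperature in K (invented values).
records = [
    {"lat": 10.2, "lon": 40.2, "t2m": 288.1},
    {"lat":  9.9, "lon": 40.3, "t2m": 289.5},
    {"lat": 35.1, "lon": 40.3, "t2m": 275.0},
]

def map_phase(rec):
    """Emit (bin, (partial_sum, count)) for a 1-degree spatial bin."""
    key = (round(rec["lat"]), round(rec["lon"]))
    return key, (rec["t2m"], 1)

def reduce_phase(pairs):
    """Combine the partial sums per bin into means."""
    acc = defaultdict(lambda: [0.0, 0])
    for key, (s, n) in pairs:
        acc[key][0] += s
        acc[key][1] += n
    return {key: s / n for key, (s, n) in acc.items()}

# Averages: ~288.8 K in bin (10, 40) and 275.0 K in bin (35, 40).
print(reduce_phase(map(map_phase, records)))
```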

  14. A high-performance computing toolset for relatedness and principal component analysis of SNP data.

    PubMed

    Zheng, Xiuwen; Levine, David; Shen, Jess; Gogarten, Stephanie M; Laurie, Cathy; Weir, Bruce S

    2012-12-15

    Genome-wide association studies are widely used to investigate the genetic basis of diseases and traits, but they pose many computational challenges. We developed gdsfmt and SNPRelate (R packages for multi-core symmetric multiprocessing computer architectures) to accelerate two key computations on SNP data: principal component analysis (PCA) and relatedness analysis using identity-by-descent measures. The kernels of our algorithms are written in C/C++ and highly optimized. Benchmarks show the uniprocessor implementations of PCA and identity-by-descent are ∼8-50 times faster than the implementations provided in the popular EIGENSTRAT (v3.0) and PLINK (v1.07) programs, respectively, and can be sped up to 30-300-fold by using eight cores. SNPRelate can analyse tens of thousands of samples with millions of SNPs. For example, our package was used to perform PCA on 55 324 subjects from the 'Gene-Environment Association Studies' consortium studies. PMID:23060615
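
    The following numpy sketch illustrates, conceptually, the PCA computation that SNPRelate accelerates; it is not the package's API. A sample-by-SNP genotype matrix is standardized by allele frequency, a genetic relationship matrix is formed, and its leading eigenvectors give the principal components.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_snps = 100, 5000
genotypes = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)   # 0/1/2 allele counts

p = genotypes.mean(axis=0) / 2.0                    # per-SNP allele frequency
std = np.sqrt(2.0 * p * (1.0 - p))                  # expected SD under Hardy-Weinberg
z = (genotypes - 2.0 * p) / std                     # standardized genotype matrix
grm = z @ z.T / n_snps                              # genetic relationship matrix (samples x samples)

eigvals, eigvecs = np.linalg.eigh(grm)              # eigh returns ascending eigenvalues
pcs = eigvecs[:, ::-1][:, :2]                       # top two principal components
print(pcs.shape)                                    # (100, 2)
```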

  15. Analysis of Polyamines in Higher Plants by High Performance Liquid Chromatography 1

    PubMed Central

    Flores, Hector E.; Galston, Arthur W.

    1982-01-01

    A sensitive (0.01-1 nmol) method has been developed for the analysis of polyamines in higher plant extracts based on high performance liquid chromatography (HPLC) of their benzoyl derivatives (Redmond, Tseng 1979 J Chromatogr 170: 479-481). Putrescine, cadaverine, agmatine, spermidine, spermine, and the less common polyamines nor-spermidine and homospermidine can be completely resolved by reverse phase HPLC, isocratic elution with methanol:water (64%, v/v) through a 5-μm C18 column, and detection at 254 nm. The method can be directly applied to crude plant extracts, and it is not subject to interference by carbohydrates and phenolics. A good quantitative correlation was found between HPLC analysis of benzoylpolyamines and thin layer chromatography of their dansyl derivatives. With the HPLC method, polyamine titers have been reproducibly estimated for various organs of amaranth, Lemna, oat, pea, Pharbitis, and potato. The analyses correlate well with results of thin layer chromatography determinations. PMID:16662279

  16. A time-motion analysis of turns performed by highly ranked viennese waltz dancers.

    PubMed

    Prosen, Jerneja; James, Nic; Dimitriou, Lygeri; Perš, Janez; Vučković, Goran

    2013-01-01

    Twenty-four dance couples performing at the 2011 IDSF (International DanceSport Federation) International Slovenia Open were divided into two groups: the first twelve placed couples (top ranked) and the last twelve placed couples (lower ranked). Video recordings were processed automatically using computer vision tracking algorithms under operator supervision to calculate movement parameters. Time and speed of movement were analysed during single natural (right) and reverse (left) turns performed during the Viennese waltz. Both top and lower ranked dancers tended to perform similar proportionate frequencies of reverse (≈ 35%) and natural (≈ 65%) turns. Analysis of reverse turns showed that the top ranked dancers performed fewer turns on a curved trajectory (16%) than the lower ranked dancers (33%). The top ranked couples performed all turns at similar speeds (F = 1.31, df = 3, p = 0.27; mean = 2.09 m/s), all of which were significantly quicker than those of the lower ranked couples (mean = 1.94 m/s), with the greatest differences found for reverse turns (12.43% faster for curved trajectories, 8.42% for straight trajectories). This suggests that the ability to maintain a high speed in the more difficult turns, particularly the reverse turns on a curved trajectory, results in the overall dance appearing more fluent, as the speed of movement does not fluctuate as much. This aspect of performance needs to be improved by lower ranked dancers if they wish to improve the rating of their performance. Future research should determine which factors relate to the speed of turns. PMID:24146705

  17. Design and performance analysis of high-order optical temporal differentiator with twin-core fiber

    NASA Astrophysics Data System (ADS)

    You, Haidong; Ning, Tigang; Li, Jing; Jian, Wei; Wen, Xiaodong; Pei, Li

    2013-08-01

    A simple and general approach for implementing an all-fiber high-order optical temporal differentiator based on twin-core fiber (TCF) is presented and demonstrated. Specifically, core 2 (or core 1) of the TCF is cut into N sections of equal length to achieve an Nth-order optical temporal differentiator, which can be considered to consist of N cascaded first-order optical temporal differentiators based on TCF. Our simulations show that the proposed approach can provide optical operation bandwidths in the several-THz regime, which is capable of accurately processing subpicosecond temporal features. Performance analysis results show good accuracy in calculating the high-order time differentiation of the optical signal launched into core 2 (or core 1).
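
    Numerically, an Nth-order temporal differentiator is a filter whose baseband frequency response is (jω)^N. The sketch below applies that response to a Gaussian pulse via the FFT and checks it against a finite-difference derivative; the pulse width and sampling are illustrative and unrelated to the TCF device parameters.

```python
import numpy as np

N_ORDER = 2                                        # differentiation order
t = np.linspace(-10e-12, 10e-12, 4096)             # time axis, seconds
pulse = np.exp(-(t / 1e-12) ** 2)                  # ~1 ps Gaussian envelope

w = 2.0 * np.pi * np.fft.fftfreq(t.size, d=t[1] - t[0])
deriv = np.fft.ifft((1j * w) ** N_ORDER * np.fft.fft(pulse)).real

fd = np.gradient(np.gradient(pulse, t), t)         # finite-difference sanity check
print("max relative mismatch:", np.max(np.abs(deriv - fd)) / np.max(np.abs(fd)))
```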

  18. BIOINTERACTION ANALYSIS BY HIGH-PERFORMANCE AFFINITY CHROMATOGRAPHY: KINETIC STUDIES OF IMMOBILIZED ANTIBODIES

    PubMed Central

    Nelson, Mary Anne; Moser, Annette; Hage, David S.

    2009-01-01

    A system based on high-performance affinity chromatography was developed for characterizing the binding, elution and regeneration kinetics of immobilized antibodies and immunoaffinity supports. This information was provided by using a combination of frontal analysis, split-peak analysis and peak decay analysis to determine the rate constants for antibody-antigen interactions under typical sample application and elution conditions. This technique was tested using immunoaffinity supports that contained monoclonal antibodies for 2,4-dichlorophenoxyacetic acid (2,4-D). Association equilibrium constants measured by frontal analysis for 2,4-D and related compounds with the immobilized antibodies were 1.7–12 × 10⁶ M⁻¹ at pH 7.0 and 25°C. Split-peak analysis gave association rate constants of 1.4–12 × 10⁵ M⁻¹ s⁻¹ and calculated dissociation rate constants of 0.01–0.4 s⁻¹ under the application conditions. Elution at pH 2.5 for the analytes from the antibodies was examined by peak decay analysis and gave dissociation rate constants of 0.056–0.17 s⁻¹. A comparison of frontal analysis results after various periods of column regeneration allowed the rate of antibody regeneration to be examined, with the results giving a first-order regeneration rate constant of 2.4 × 10⁻⁴ s⁻¹. This combined approach and the information it provides should be useful in the design and optimization of immunoaffinity chromatography and other analytical methods that employ immobilized antibodies. The methods described are not limited to the particular analytes and antibodies employed in this study but should be useful in characterizing other targets, ligands and supports. PMID:19394281
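
    As an illustration of how a first-order regeneration rate constant of the kind quoted above can be extracted, the sketch below fits ln(1 − Q(t)/Qmax) versus regeneration time; the recovered-capacity data points are synthetic values consistent with a rate constant of roughly 2.4 × 10⁻⁴ s⁻¹, not measurements from the study.

```python
import numpy as np

t = np.array([600.0, 1800.0, 3600.0, 7200.0])      # regeneration time, s (synthetic)
q_frac = np.array([0.13, 0.35, 0.58, 0.82])        # fraction of binding capacity recovered

slope, _ = np.polyfit(t, np.log(1.0 - q_frac), 1)  # linearized first-order model
print(f"first-order regeneration rate constant ~ {-slope:.2e} 1/s")
```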

  19. National cyber defense high performance computing and analysis : concepts, planning and roadmap.

    SciTech Connect

    Hamlet, Jason R.; Keliiaa, Curtis M.

    2010-09-01

    There is a national cyber dilemma that threatens the very fabric of government, commercial and private operations worldwide. Much is written about 'what' the problem is; although the basis for this paper is an assessment of the problem space, we target the 'how' solution space of the wide-area national information infrastructure through the advancement of science, technology, evaluation and analysis, with actionable results intended to produce a more secure national information infrastructure and a comprehensive national cyber defense capability. This cybersecurity High Performance Computing (HPC) analysis concepts, planning and roadmap activity was conducted as an assessment of cybersecurity analysis as a fertile area of research and investment for high-value, wide-area cybersecurity solutions. This report, together with the related report SAND2010-4765, Assessment of Current Cybersecurity Practices in the Public Domain: Cyber Indications and Warnings Domain, is intended to provoke discussion throughout a broad audience about developing a cohesive, HPC-centric solution to wide-area cybersecurity problems.

  20. The design, performance and analysis of a high work capacity transonic turbine

    SciTech Connect

    Bryce, J.D.; Leversuch, N.P.; Litchfield, M.R.

    1985-10-01

    This paper describes the design and testing of a high work capacity single-stage transonic turbine of aerodynamic duty tailored to the requirements of driving the high-pressure core of a low cost turbofan engine. Aerodynamic loading was high for this duty (ΔH/U² = 2.1) and a major objective in the design was the control of the resulting transonic flow to achieve good turbine performance. Practical and coolable blading was a design requirement. At the design point (pressure ratio = 4.48), a turbine total to total efficiency of 87.0 percent was measured - this being based on measured shaft power and a tip clearance of 1.4 percent of blade height. In addition, the turbine was comprehensively instrumented to allow measurement of aerofoil surface static pressures on both stator and rotor - the latter being expedited via a rotating scanivalve system. Downstream area traverses were also conducted. Analysis of these measurements indicates that the turbine operates at overall reaction levels lower than design but the rotor blade performs efficiently.

  1. Group-type hydrocarbon standards for high-performance liquid chromatographic analysis of middistillate fuels

    NASA Technical Reports Server (NTRS)

    Otterson, D. A.; Seng, G. T.

    1984-01-01

    A new high-performance liquid chromatographic (HPLC) method for group-type analysis of middistillate fuels is described. It uses a refractive index detector and standards that are prepared by reacting a portion of the fuel sample with sulfuric acid. A complete analysis of a middistillate fuel for saturates and aromatics (including the preparation of the standard) requires about 15 min if standards for several fuels are prepared simultaneously. From model fuel studies, the method was found to be accurate to within 0.4 vol% saturates or aromatics, and provides a precision of ±0.4 vol%. Olefin determinations require an additional 15 min of analysis time. However, this determination is needed only for those fuels displaying a significant olefin response at 200 nm (obtained routinely during the saturates/aromatics analysis procedure). The olefin determination uses the responses of the olefins and the corresponding saturates, as well as the average value of their refractive index sensitivity ratios (1.1). Studies indicated that, although the relative error in the olefin results could reach 10 percent by using this average sensitivity ratio, it was 5 percent for the fuels used in this study. Olefin concentrations as low as 0.1 vol% have been determined using this method.
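
    One plausible arithmetic reading of the olefin step described above is sketched below: the olefin detector response is converted to a saturate-equivalent response using the average sensitivity ratio (1.1) before normalizing against the total. The response values are invented for illustration, and the exact normalization used in the report may differ.

```python
# Detector responses in arbitrary units (invented for illustration).
R_SAT, R_ARO, R_OLE = 72.0, 25.0, 2.2
SENS_RATIO = 1.1                       # average olefin/saturate sensitivity ratio

ole_equiv = R_OLE / SENS_RATIO         # olefin response on a saturate-equivalent scale
total = R_SAT + R_ARO + ole_equiv
print(f"estimated olefins: {100.0 * ole_equiv / total:.2f} vol%")
```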

  2. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Chen, P.-S.; Gumaste, U.; Lesoinne, M.; Stern, P.

    1995-01-01

    This research program deals with the application of high-performance computing methods to the numerical simulation of complete jet engines. The program was initiated in 1993 by applying two-dimensional parallel aeroelastic codes to the interior gas flow problem of a by-pass jet engine. The fluid mesh generation, domain decomposition and solution capabilities were successfully tested. Attention was then focused on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion driven by these structural displacements. The latter is treated by an ALE technique that models the fluid mesh motion as that of a fictitious mechanical network laid along the edges of near-field fluid elements. New partitioned analysis procedures to treat this coupled 3-component problem were developed in 1994. These procedures involved delayed corrections and subcycling, and have been successfully tested on several massively parallel computers. For the global steady-state axisymmetric analysis of a complete engine we have decided to use the NASA-sponsored ENG10 program, which uses a regular FV-multiblock-grid discretization in conjunction with circumferential averaging to include effects of blade forces, loss, combustor heat addition, blockage, bleeds and convective mixing. A load-balancing preprocessor for parallel versions of ENG10 has been developed. It is planned to use the steady-state global solution provided by ENG10 as input to a localized three-dimensional FSI analysis for engine regions where aeroelastic effects may be important.

  3. High speed spherical roller-bearing analysis and comparison with experimental performance

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Dyba, G.

    1983-01-01

    The capabilities of a spherical roller bearing analysis/design tool, Spherbean (spherical bearing analysis) are described. Capabilities of the analysis are demonstrated and verified by comparison with experimental data. A practical design problem is presented where the computer program is used to improve a particular bearing's performance.

  4. An Analysis of Factors Affecting Teacher Attrition in High Performing and Low Performing Elementary Rural Schools in South Carolina

    ERIC Educational Resources Information Center

    Carter-Blocker, Vickie R.

    2012-01-01

    The purpose of this study was to examine the factors impacting teacher attrition in high-performing and low-performing elementary rural schools in South Carolina. Several factors were identified that interfered with teachers returning to the teaching profession. School districts in rural areas need to be better informed of the factors that affect…

  5. Quantitative analysis of mitragynine in human urine by high performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Lu, Shijun; Tran, Buu N; Nelsen, Jamie L; Aldous, Kenneth M

    2009-08-15

    Mitragynine is the primary active alkaloid extracted from the leaves of Mitragyna speciosa Korth, a plant that originates in South-East Asia and is commonly known as kratom in Thailand. Kratom has been used for many centuries for its medicinal and psychoactive qualities, which are comparable to those of opiate-based drugs. Kratom abuse can lead to detectable mitragynine residues in urine. Ultra-trace amounts of mitragynine in human urine were determined by high performance liquid chromatography coupled to electrospray tandem mass spectrometry (HPLC-ESI/MS/MS). Mitragynine was extracted with methyl t-butyl ether (MTBE) and separated on a HILIC column. The ESI/MS/MS was accomplished using a triple quadrupole mass spectrometer in positive ion detection and multiple reaction monitoring (MRM) mode. Ajmalicine, a structural analog of mitragynine, was selected as the internal standard (IS) for method development. Quality control (QC) performed at three levels, 0.1, 1 and 5 ng/ml of mitragynine in urine, gave mean recoveries of 90, 109, and 98% with average relative standard deviations of 22, 12 and 16%, respectively. Linearity of the mitragynine calibration over the range 0.01 to 5.0 ng/ml was achieved with a correlation coefficient greater than 0.995. A detection limit of 0.02 ng/ml and high precision for both within-day and between-day analyses were obtained. PMID:19577523
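
    The quantitation scheme described above is a standard internal-standard calibration. The sketch below, with invented peak-area ratios rather than the study's data, regresses the mitragynine/ajmalicine area ratio against concentration and inverts the fit for an unknown.

```python
import numpy as np

conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 5.0])               # calibrators, ng/mL
area_ratio = np.array([0.004, 0.021, 0.041, 0.20, 0.41, 2.02])  # mitragynine / ajmalicine peak areas

slope, intercept = np.polyfit(conc, area_ratio, 1)
r = np.corrcoef(conc, area_ratio)[0, 1]
print(f"slope = {slope:.3f}, intercept = {intercept:.4f}, r = {r:.4f}")

unknown_ratio = 0.33                                            # measured ratio for an unknown sample
print(f"estimated concentration: {(unknown_ratio - intercept) / slope:.2f} ng/mL")
```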

  6. Analysis of starch in food systems by high-performance size exclusion chromatography.

    PubMed

    Ovando-Martínez, Maribel; Whitney, Kristin; Simsek, Senay

    2013-02-01

    Starch has unique physicochemical characteristics among food carbohydrates. Starch contributes to the physicochemical attributes of food products made from roots, legumes, cereals, and fruits. It occurs naturally as distinct particles, called granules. Most starch granules are a mixture of 2 sugar polymers: a highly branched polysaccharide named amylopectin and a basically linear polysaccharide named amylose. The starch contained in food products undergoes changes during processing, which causes changes in the starch molecular weight and the amylose to amylopectin ratio. The objective of this study was to develop a new, simple, one-step, and accurate method for simultaneous determination of the amylose to amylopectin ratio as well as the weight-averaged molecular weight of starch in food products. Starch from bread flour, canned peas, corn flake cereal, snack crackers, canned kidney beans, pasta, potato chips, and white bread was extracted by dissolution in KOH and urea followed by precipitation with ethanol. Starch samples were solubilized and analyzed on a high-performance size exclusion chromatography (HPSEC) system. To verify the identity of the peaks, fractions were collected and soluble starch and beta-glucan assays were performed in addition to gas chromatography analysis. We found that all fractions contained only glucose and that the soluble starch assay correlated with the HPSEC fractionation. This new method can be used to determine the amylose to amylopectin ratio and the weight-averaged molecular weight of starch from various food products using as little as 25 mg of dry sample. PMID:23330715

  7. [Separation of tannins in Rhubarb and its analysis by high performance liquid chromatography-mass spectrometry].

    PubMed

    Ding, Mingyu; Ni, Weiwei

    2004-11-01

    In order to investigate the pharmaceutical actions of rhubarb, a method for extracting, separating and analyzing the tannin components in rhubarb was studied. First, a procedure for the group separation of tannins from the water-ethanol extract of rhubarb was established, based on the formation of a tannin-caffeine precipitate. Then, a high performance liquid chromatographic (HPLC) method for the analysis of tannins in rhubarb was developed. This HPLC method is based on a reversed-phase C18 column and a polar mobile phase of water and methanol with gradient elution, and the tannins can be well separated. Finally, the identification of the tannin components in rhubarb was carried out by high performance liquid chromatography-mass spectrometry (HPLC-MS). The structures of the main tannin components in rhubarb (gallic acid, catechin, and the dimer, trimer, tetramer and pentamer of catechin) are proposed, and their fragmentation patterns are summarized. Compared with previous methods, this one is simple and free from interference by co-existing compounds. PMID:15807111

  8. High-Performance Parallel Analysis of Coupled Problems for Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Park, K. C.; Gumaste, U.; Chen, P.-S.; Lesoinne, M.; Stern, P.

    1997-01-01

    Applications are described of high-performance computing methods to the numerical simulation of complete jet engines. The methodology focuses on the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion driven by structural displacements. The latter is treated by an ALE technique that models the fluid mesh motion as that of a fictitious mechanical network laid along the edges of near-field elements. New partitioned analysis procedures to treat this coupled three-component problem were developed. These procedures involved delayed corrections and subcycling, and have been successfully tested on several massively parallel computers, including the iPSC-860, Paragon XP/S and the IBM SP2. The NASA-sponsored ENG10 program was used for the global steady state analysis of the whole engine. This program uses a regular FV-multiblock-grid discretization in conjunction with circumferential averaging to include effects of blade forces, loss, combustor heat addition, blockage, bleeds and convective mixing. A load-balancing preprocessor for parallel versions of ENG10 was developed as well as the capability for the first full 3D aeroelastic simulation of a multirow engine stage. This capability was tested on the IBM SP2 parallel supercomputer at NASA Ames.
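
    The staggered, subcycled coupling idea referred to above can be illustrated on a toy single-degree-of-freedom problem: a quasi-steady "fluid" load is transferred once per coupling step, while the structural solver subcycles with a smaller time step against that frozen load. All parameters and the load model below are invented for illustration and bear no relation to the engine codes.

```python
m, c, k = 1.0, 0.2, 40.0            # structural mass, damping, stiffness (toy values)
rho, v_inf = 1.2, 5.0               # toy "fluid" density and free-stream speed
DT_COUPLE, N_SUB = 1.0e-2, 10       # coupling step and structural subcycles per step
dt = DT_COUPLE / N_SUB

def fluid_force(x_dot):
    """Quasi-steady drag-like load on the moving structure (illustrative model)."""
    v_rel = v_inf - x_dot
    return 0.5 * rho * v_rel * abs(v_rel)

x, x_dot = 0.0, 0.0
for _ in range(200):                # 200 coupling steps = 2 s of simulated time
    f = fluid_force(x_dot)          # load transferred once per coupling step
    for _ in range(N_SUB):          # structural subcycling against the frozen load
        x_ddot = (f - c * x_dot - k * x) / m
        x_dot += dt * x_ddot        # semi-implicit (symplectic) Euler update
        x += dt * x_dot

print(f"displacement after 2.0 s: {x:.4f}")
```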

  9. Quantitative analysis of oxytetracycline and related substances by high-performance liquid chromatography.

    PubMed

    Khan, N H; Roets, E; Hoogmartens, J; Vanderhaeghe, H

    1987-09-18

    Isocratic high-performance liquid chromatography on PLRP-S 8-microns poly(styrene-divinylbenzene) copolymer allows complete separation of oxytetracycline, 4-epioxytetracycline, tetracycline, anhydrooxytetracycline, alpha- and beta-apooxytetracycline. The mobile phase was tert.-butanol-0.2 M phosphate buffer pH 8.0-0.02 M tetrabutylammonium sulphate pH 8-0.0001 M sodium ethylenediaminetetraacetate pH 8.0-water (5.9:10:5:10:78.1, m/v/v/v/v). With this isocratic method, 2-acetyl-2-decarboxamidooxytetracycline is only partly resolved from oxytetracycline. The separation and the detection limits can be improved by the use of gradient elution. Gradient elution was used for the comparison of official standards and for the analysis of a number of commercial samples, and to monitor the stability of oxytetracycline hydrochloride during storage in the solid state for about 6 years at various temperatures. PMID:3693465

  10. HIVE-Hexagon: High-Performance, Parallelized Sequence Alignment for Next-Generation Sequencing Data Analysis

    PubMed Central

    Santana-Quintero, Luis; Dingerdissen, Hayley; Thierry-Mieg, Jean; Mazumder, Raja; Simonyan, Vahan

    2014-01-01

    Due to the size of Next-Generation Sequencing data, the computational challenge of sequence alignment has been vast. Inexact alignments can take up to 90% of total CPU time in bioinformatics pipelines. High-performance Integrated Virtual Environment (HIVE), a cloud-based environment optimized for storage and analysis of extra-large data, presents an algorithmic solution: the HIVE-hexagon DNA sequence aligner. HIVE-hexagon implements novel approaches to exploit both characteristics of sequence space and CPU, RAM and Input/Output (I/O) architecture to quickly compute accurate alignments. Key components of HIVE-hexagon include non-redundification and sorting of sequences; floating diagonals of linearized dynamic programming matrices; and consideration of cross-similarity to minimize computations. Availability https://hive.biochemistry.gwu.edu/hive/ PMID:24918764
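
    The "non-redundification and sorting" idea mentioned above can be sketched in a few lines (this is not the HIVE-hexagon implementation): identical reads are collapsed to unique sequences with multiplicities so each unique sequence is aligned only once, and its count is reapplied to the resulting hits.

```python
from collections import Counter

reads = ["ACGTAC", "TTGACA", "ACGTAC", "GGCATT", "ACGTAC", "TTGACA"]

unique = Counter(reads)                      # sequence -> multiplicity
for seq, count in sorted(unique.items()):    # sorted unique sequences
    # align(seq) would run once here; 'count' weights the resulting hit
    print(f"{seq}\tx{count}")
```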

  11. High-performance parallel analysis of coupled problems for aircraft propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Lanteri, S.; Gumaste, U.; Ronaghi, M.

    1994-01-01

    Applications are described of high-performance parallel computation for the analysis of complete jet engines, treated as a multidisciplinary coupled problem. The coupled problem involves the interaction of structures with gas dynamics, heat conduction and heat transfer in aircraft engines. The methodology issues addressed include: consistent discrete formulation of coupled problems with emphasis on coupling phenomena; the effect of partitioning strategies, augmentation and temporal solution procedures; sensitivity of the response to problem parameters; and methods for interfacing multiscale discretizations in different single fields. The computer implementation issues addressed include: parallel treatment of coupled systems; domain decomposition and mesh partitioning strategies; data representation in object-oriented form and mapping to hardware-driven representation; and tradeoff studies between partitioning schemes and fully coupled treatment.

  12. An Analysis of High School Students' Performance on Five Integrated Science Process Skills.

    ERIC Educational Resources Information Center

    Beaumont-Walters, Yvonne; Soyibo, Kola

    2001-01-01

    This study determined Jamaican high school students' (n=305) level of performance on five integrated science process skills with performance linked to gender, grade level, school location, school type, student type, and socio-economic background (SEB). Statistically significant differences in performance based on grade level, school type, student…

  13. EXTRACTION AND QUANTITATIVE ANALYSIS OF ELEMENTAL SULFUR FROM SULFIDE MINERAL SURFACES BY HIGH-PERFORMANCE LIQUID CHROMATOGRAPHY. (R826189)

    EPA Science Inventory

    A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...

  14. High Performance Liquid Chromatography-mass Spectrometry Analysis of High Antioxidant Australian Fruits with Antiproliferative Activity Against Cancer Cells

    PubMed Central

    Sirdaarta, Joseph; Maen, Anton; Rayan, Paran; Matthews, Ben; Cock, Ian Edwin

    2016-01-01

    g/mL). All other extracts were nontoxic. A total of 145 unique mass signals were detected in the lemon aspen methanolic and aqueous extracts by nonbiased high-performance liquid chromatography-mass spectrometry analysis. Of these, 20 compounds were identified as being of particular interest due to their reported antioxidant and/or anticancer activities. Conclusions: The lack of toxicity and antiproliferative activity of the high antioxidant plant extracts against HeLa and CaCo2 cancer cell lines indicates their potential in the treatment and prevention of some cancers. SUMMARY: Australian fruit extracts with high antioxidant contents were potent inhibitors of CaCo2 and HeLa carcinoma cell proliferation. Methanolic lemon aspen extract was particularly potent, with IC50 values of 480 μg/mL (HeLa) and 769 μg/mL (CaCo2). High-performance liquid chromatography-mass spectrometry-quadrupole time-of-flight analysis highlighted and putatively identified 20 compounds in the antiproliferative lemon aspen extracts. In contrast, lower antioxidant content extracts stimulated carcinoma cell proliferation. All extracts with antiproliferative activity were nontoxic in the Artemia nauplii assay. Abbreviations used: DPPH: di (phenyl)- (2,4,6-trinitrophenyl) iminoazanium; HPLC: high-performance liquid chromatography; IC50: the concentration required to inhibit by 50%; LC50: the concentration required to achieve 50% mortality; MS: mass spectrometry. PMID:27279705

  15. High-Performance Computing for Real-Time Grid Analysis and Operation

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu; Chavarría-Miranda, Daniel

    2013-10-31

    Power grids worldwide are undergoing an unprecedented transition as a result of grid evolution meeting information revolution. The grid evolution is largely driven by the desire for green energy. Emerging grid technologies such as renewable generation, smart loads, plug-in hybrid vehicles, and distributed generation provide opportunities to generate energy from green sources and to manage energy use for better system efficiency. With utility companies actively deploying these technologies, a high level of penetration of these new technologies is expected in the next 5-10 years, bringing in a level of intermittency, uncertainties, and complexity that the grid has not seen and was not designed for. On the other hand, the information infrastructure in the power grid is being revolutionized with large-scale deployment of sensors and meters in both the transmission and distribution networks. The future grid will have two-way flows of both electrons and information. The challenge is how to take advantage of the information revolution: pull the large amount of data in, process it in real time, and put information out to manage grid evolution. Without addressing this challenge, the opportunities in grid evolution will remain unfulfilled. This transition poses grand challenges in grid modeling, simulation, and information presentation. The computational complexity of underlying power grid modeling and simulation will significantly increase in the next decade due to an increased model size and a decreased time window allowed to compute model solutions. High-performance computing is essential to enable this transition. The essential technical barrier is to vastly increase the computational speed so operation response time can be reduced from minutes to seconds and sub-seconds. The speed at which key functions such as state estimation and contingency analysis are conducted (typically every 3-5 minutes) needs to be dramatically increased so that the analysis of contingencies is both
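
    One reason contingency analysis parallelizes well, as implied above, is that each outage case is an independent solve. The toy sketch below screens single-line outages of a hypothetical 3-bus DC power-flow model across worker processes; the network and injections are invented and unrelated to the tools described in the report.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

# Hypothetical 3-bus system: (from_bus, to_bus, susceptance) and bus injections (p.u.).
LINES = [(0, 1, 10.0), (1, 2, 8.0), (0, 2, 5.0)]
P_INJ = np.array([0.0, 1.2, -1.2])        # bus 0 is the slack bus

def dc_power_flow(lines):
    """Solve the reduced B' * theta = P system and return per-line flows."""
    B = np.zeros((3, 3))
    for i, j, b in lines:
        B[i, i] += b; B[j, j] += b
        B[i, j] -= b; B[j, i] -= b
    theta = np.zeros(3)
    theta[1:] = np.linalg.solve(B[1:, 1:], P_INJ[1:])   # slack angle fixed at zero
    return [b * (theta[i] - theta[j]) for i, j, b in lines]

def screen(outage_idx):
    """Independent contingency case: drop one line, re-solve, report flows."""
    remaining = [l for k, l in enumerate(LINES) if k != outage_idx]
    return outage_idx, dc_power_flow(remaining)

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:                  # cases run in parallel
        for idx, flows in pool.map(screen, range(len(LINES))):
            print(f"outage of line {idx}: flows {np.round(flows, 3)}")
```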

  16. Sources of Variability in Chlorophyll Analysis by Fluorometry and by High Performance Liquid Chromatography. Chapter 22

    NASA Technical Reports Server (NTRS)

    VanHeukelem, Laurie; Thomas, Crystal S.; Glibert, Patricia M.

    2001-01-01

    The need for accurate determination of chlorophyll a (chl a) is of interest for numerous reasons. From the need for ground-truth data for remote sensing to pigment detection for laboratory experimentation, it is essential to know the accuracy of the analyses and the factors potentially contributing to variability and error. Numerous methods and instrument techniques are currently employed in the analyses of chl a. These methods range from spectrophotometric quantification, to fluorometric analysis and determination by high performance liquid chromatography. Even within the application of HPLC techniques, methods vary. Here we provide the results of a comparison among methods and provide some guidance for improving the accuracy of these analyses. These results are based on a round-robin conducted among numerous investigators, including several in the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) and HyCODE Programs. Our purpose here is not to present the full results of the laboratory intercalibration; those results will be presented elsewhere. Rather, here we highlight some of the major factors that may contribute to the variability observed. Specifically, we aim to assess the comparability of chl a analyses performed by fluorometry and HPLC, and we identify several factors in the analyses which may contribute disproportionately to this variability.

  17. Analysis of creams. IV. Application of high performance liquid chromatography. Part I.

    PubMed

    Lake, O A; Hulshoff, A; Van De Vaart, F J; Indemans, A W

    1983-02-25

    The possibilities of applying reversed-phase high performance liquid chromatography to the analysis of o/w emulsion type creams without preceding sample clean-up were investigated. The chromatographic behaviour of cream base components and active compounds in reversed phase systems consisting of methanol-water mixtures as the mobile phase and a chemically bonded octadecyl stationary phase was studied. A number of active compounds and the preservative (sorbic acid) could be determined--often in one chromatographic run--without complications, by simply dissolving the sample in a suitable solvent mixture and injecting an aliquot of the solution into the chromatograph. Separation was achieved by the proper choice of methanol content, pH and ionic strength of the eluent. The compounds were detected by UV absorption. Some of the lipophilic cream base components could easily be determined in the same manner, with methanol as the eluent and with refractive index detection. The developed procedure was applied to the analysis of a number of creams. Some of the results are presented as examples, demonstrating the suitability of the method for quality control purposes. PMID:6844121

  18. A high-performance computing toolset for relatedness and principal component analysis of SNP data

    PubMed Central

    Zheng, Xiuwen; Levine, David; Shen, Jess; Gogarten, Stephanie M.; Laurie, Cathy; Weir, Bruce S.

    2012-01-01

    Summary: Genome-wide association studies are widely used to investigate the genetic basis of diseases and traits, but they pose many computational challenges. We developed gdsfmt and SNPRelate (R packages for multi-core symmetric multiprocessing computer architectures) to accelerate two key computations on SNP data: principal component analysis (PCA) and relatedness analysis using identity-by-descent measures. The kernels of our algorithms are written in C/C++ and highly optimized. Benchmarks show the uniprocessor implementations of PCA and identity-by-descent are ∼8–50 times faster than the implementations provided in the popular EIGENSTRAT (v3.0) and PLINK (v1.07) programs, respectively, and can be sped up to 30–300-fold by using eight cores. SNPRelate can analyse tens of thousands of samples with millions of SNPs. For example, our package was used to perform PCA on 55 324 subjects from the ‘Gene-Environment Association Studies’ consortium studies. Availability and implementation: gdsfmt and SNPRelate are available from R CRAN (http://cran.r-project.org), including a vignette. A tutorial can be found at https://www.genevastudy.org/Accomplishments/software. Contact: zhengx@u.washington.edu PMID:23060615

  19. Analysis of Amadori compounds by high-performance cation exchange chromatography coupled to tandem mass spectrometry.

    PubMed

    Davidek, Tomas; Kraehenbuehl, Karin; Devaud, Stéphanie; Robert, Fabien; Blank, Imre

    2005-01-01

    High-performance cation exchange chromatography coupled to tandem mass spectrometry or electrochemical detection was found to be an efficient tool for analyzing Amadori compounds derived from hexose and pentose sugars. The method allows rapid separation and identification of Amadori compounds, while benefiting from the well-known advantages of mass spectrometry, such as specificity and sensitivity. Glucose- and xylose-derived Amadori compounds of several amino acids, such as glycine, alanine, valine, leucine/isoleucine, methionine, proline, phenylalanine, and glutamic acid, were separated or discriminated using this new method. The method is suitable for the analysis of both model reaction mixtures and food products. Fructosylglutamate was found to be the major Amadori compound in dried tomatoes (approximately 1.5 g/100 g) and fructosylproline in dried apricots (approximately 0.2 g/100 g). Reaction of xylose and glycine at 90 degrees C (pH 6) for 2 h showed rapid formation of xylulosylglycine (approximately 12 mol %, 15 min) followed by slow decrease over time. Analysis of pentose-derived Amadori compounds is shown for the first time, which represents a major breakthrough in studying occurrence, formation, and decomposition of these labile Maillard intermediates. PMID:15623289

  20. Geometrically nonlinear design sensitivity analysis on parallel-vector high-performance computers

    NASA Technical Reports Server (NTRS)

    Baddourah, Majdi A.; Nguyen, Duc T.

    1993-01-01

    Parallel-vector solution strategies for generation and assembly of element matrices, solution of the resulting system of linear equations, calculations of the unbalanced loads, displacements, stresses, and design sensitivity analysis (DSA) are all incorporated into the Newton-Raphson (NR) procedure for nonlinear finite element analysis and DSA. Numerical results are included to show the performance of the proposed method for structural analysis and DSA in a parallel-vector computer environment.
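
    A one-degree-of-freedom sketch of the idea, using an invented cubic spring model rather than the finite element formulation above: Newton-Raphson iterations drive the residual to zero, and the converged tangent is reused for a direct design-sensitivity solve du/dp = -(dR/dp)/(dR/du).

```python
a, F, p = 2.0, 10.0, 5.0          # cubic coefficient, applied load, design variable (linear stiffness)

def residual(u):
    return p * u + a * u**3 - F   # equilibrium residual R(u, p)

def tangent(u):
    return p + 3.0 * a * u**2     # dR/du, the tangent stiffness

u = 0.0
for _ in range(20):               # Newton-Raphson iterations
    du = -residual(u) / tangent(u)
    u += du
    if abs(du) < 1e-12:
        break

# Direct differentiation at equilibrium: dR/dp + (dR/du) * du/dp = 0.
du_dp = -u / tangent(u)           # dR/dp equals u for this model
print(f"u = {u:.6f}, du/dp = {du_dp:.6f}")
```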

  1. High Performance Liquid Chromatography

    NASA Astrophysics Data System (ADS)

    Talcott, Stephen

    High performance liquid chromatography (HPLC) has many applications in food chemistry. Food components that have been analyzed with HPLC include organic acids, vitamins, amino acids, sugars, nitrosamines, certain pesticides, metabolites, fatty acids, aflatoxins, pigments, and certain food additives. Unlike gas chromatography, it is not necessary for the compound being analyzed to be volatile. It is necessary, however, for the compounds to have some solubility in the mobile phase. It is important that the solubilized samples for injection be free from all particulate matter, so centrifugation and filtration are common procedures. Also, solid-phase extraction is used commonly in sample preparation to remove interfering compounds from the sample matrix prior to HPLC analysis.

  2. WEP: a high-performance analysis pipeline for whole-exome data

    PubMed Central

    2013-01-01

    Background: The advent of massively parallel sequencing technologies (Next Generation Sequencing, NGS) has profoundly modified the landscape of human genetics. In particular, Whole Exome Sequencing (WES) is the NGS branch that focuses on the exonic regions of eukaryotic genomes; exomes are ideal for understanding high-penetrance allelic variation and its relationship to phenotype. A complete WES analysis involves several steps which need to be suitably designed and arranged into an efficient pipeline. Managing an NGS analysis pipeline and its huge amount of produced data requires non-trivial IT skills and computational power. Results: Our web resource WEP (Whole-Exome sequencing Pipeline web tool) performs a complete WES pipeline and provides easy access through its interface to intermediate and final results. The WEP pipeline is composed of several steps: 1) verification of input integrity and quality checks, read trimming and filtering; 2) gapped alignment; 3) BAM conversion, sorting and indexing; 4) duplicates removal; 5) alignment optimization around insertion/deletion (indel) positions; 6) recalibration of quality scores; 7) single nucleotide and deletion/insertion polymorphism (SNP and DIP) variant calling; 8) variant annotation; 9) result storage into custom databases to allow cross-linking and intersections, statistics and much more. In order to overcome the challenge of managing large amounts of data and to maximize the biological information extracted from them, our tool restricts the number of final results by filtering data with customizable thresholds, facilitating the identification of functionally significant variants. Default threshold values, tuned to the values most commonly used in the recent literature, are provided when the analysis completes. Conclusions: Through our tool a user can perform the whole analysis without knowing the underlying hardware and software architecture, handling both paired-end and single-end data. The interface
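
    The chained structure of such a pipeline, together with the threshold-based filtering of final results, can be sketched as below. The step functions, file names and variant records are placeholders; the real WEP tool wraps external aligners and variant callers at each stage.

```python
def trim_and_filter(state): state["reads"] = "trimmed.fastq"; return state
def align(state):           state["bam"] = "aligned.bam";     return state
def sort_and_index(state):  state["bam"] = "sorted.bam";      return state
def mark_duplicates(state): state["bam"] = "dedup.bam";       return state

def call_variants(state):
    # Placeholder records standing in for a real variant caller's output.
    state["variants"] = [{"pos": 101, "qual": 48.0, "depth": 35},
                         {"pos": 202, "qual": 12.0, "depth": 4}]
    return state

def filter_variants(state, min_qual=30.0, min_depth=10):
    # Customizable thresholds keep only the likely-significant calls.
    state["variants"] = [v for v in state["variants"]
                         if v["qual"] >= min_qual and v["depth"] >= min_depth]
    return state

PIPELINE = [trim_and_filter, align, sort_and_index, mark_duplicates,
            call_variants, filter_variants]

state = {"sample": "sample_01"}      # hypothetical sample identifier
for step in PIPELINE:                # each stage consumes the previous stage's output
    state = step(state)
print(state["variants"])             # only the pos=101 call passes both thresholds
```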

  3. A high performance liquid radiochromatographic assay for the simultaneous analysis of iloprost and misoprostol.

    PubMed

    Womack, I M; Lee, A S; Kamath, B; Agrawal, K C; Kishore, V

    1996-10-01

    A high-performance liquid chromatographic (HPLC) method utilizing ultraviolet absorbance coupled with radioisotope detection was developed for the precise and simultaneous determination of iloprost and misoprostol. This assay allows complete resolution of iloprost diastereoisomers and has a total run time of approximately twenty minutes. Samples were prepared for chromatographic analysis by extracting a mixture of tritiated drugs from rat plasma with acetonitrile. The resulting solutions were chromatographed on a reversed phase Zorbax Rx-C8 column using 0.02M potassium phosphate (pH 3.0), acetonitrile, and methanol (46:30:24, v/v/v) at a flow rate of 1.7 mL/min. 2-Naphthoic acid was employed as an internal standard. The correlation coefficient for varying concentrations of tritiated iloprost (12.7 Ci/mmol specific activity) from 2.18 ng/mL to 21.8 ng/mL was 0.995, and the correlation coefficient for concentrations of tritiated misoprostol (50 Ci/mmol specific activity) from 0.617 ng/mL to 6.17 ng/mL was 0.993. The high selectivity and sensitivity of this assay make it useful for the simultaneous quantitation of iloprost and misoprostol. PMID:8936581

  4. Analysis of Drug Interactions with Lipoproteins by High-Performance Affinity Chromatography

    PubMed Central

    Sobansky, Matthew R.; Hage, David S.

    2013-01-01

    Lipoproteins such as high-density lipoprotein (HDL) and low-density lipoprotein (LDL) are known to interact with drugs and other solutes in blood. These interactions have been examined in the past by methods such as equilibrium dialysis and capillary electrophoresis. This chapter describes an alternative approach that has recently been developed for examining these interactions by using high-performance affinity chromatography. In this method, lipoproteins are covalently immobilized to a solid support and used within a column as a stationary phase for binding studies. This approach allows the same lipoprotein preparation to be used for a large number of binding studies, leading to precise estimates of binding parameters. This chapter will discuss how this technique can be applied to the identification of interaction models and be used to differentiate between systems that have interactions based on partitioning, adsorption or mixed-mode interactions. It is also shown how this approach can then be used for the measurement of binding parameters for HDL and LDL with drugs. Examples of these studies are provided, with particular attention being given to the use of frontal analysis to examine the interactions of R- and S-propranolol with HDL and LDL. The advantages and possible limitations of this method are described. The extension of this approach to other types of drug-lipoprotein interactions is also considered. PMID:25392741

  5. High-Performance Parallel Analysis of Coupled Problems for Aircraft Propulsion

    NASA Technical Reports Server (NTRS)

    Felippa, C. A.; Farhat, C.; Park, K. C.; Gumaste, U.; Chen, P.-S.; Lesoinne, M.; Stern, P.

    1996-01-01

    This research program dealt with the application of high-performance computing methods to the numerical simulation of complete jet engines. The program was initiated in January 1993 by applying two-dimensional parallel aeroelastic codes to the interior gas flow problem of a bypass jet engine. The fluid mesh generation, domain decomposition and solution capabilities were successfully tested. Attention was then focused on methodology for the partitioned analysis of the interaction of the gas flow with a flexible structure and with the fluid mesh motion driven by these structural displacements. The latter is treated by an ALE technique that models the fluid mesh motion as that of a fictitious mechanical network laid along the edges of near-field fluid elements. New partitioned analysis procedures to treat this coupled three-component problem were developed during 1994 and 1995. These procedures involved delayed corrections and subcycling, and have been successfully tested on several massively parallel computers, including the iPSC-860, Paragon XP/S and the IBM SP2. For the global steady-state axisymmetric analysis of a complete engine we have decided to use the NASA-sponsored ENG10 program, which uses a regular FV-multiblock-grid discretization in conjunction with circumferential averaging to include effects of blade forces, loss, combustor heat addition, blockage, bleeds and convective mixing. A load-balancing preprocessor for parallel versions of ENG10 was developed. During 1995 and 1996 we developed the capability for the first full 3D aeroelastic simulation of a multirow engine stage. This capability was tested on the IBM SP2 parallel supercomputer at NASA Ames. Benchmark results were presented at the 1996 Computational Aeroscience meeting.

  6. Determination of plasma protein binding of positron emission tomography radioligands by high-performance frontal analysis.

    PubMed

    Amini, Nahid; Nakao, Ryuji; Schou, Magnus; Halldin, Christer

    2014-09-01

    Positron emission tomography (PET) is an imaging technique based on the use of radioligands labeled with short lived radionuclides, such as ¹¹C (t½ = 20.4 min) and ¹⁸F (t½ = 109.8 min), which as a consequence often requires rapid plasma protein binding analysis methods. In addition, PET radioligands can suffer from non-specific binding to the membrane when ultrafiltration, which is the most commonly used method for measuring protein binding in PET, is employed. In this study a high-performance frontal analysis (HPFA) method based on incorporation of a gel filtration column (Discovery® BIO GFC 100, 50 mm × 4.6 mm, 5 μm, 100 Å) into a radio-LC system with phosphate buffered saline (PBS, pH 7.4) at a flow rate of 3 ml/min as mobile phase was developed and investigated for four PET radioligands. The minimum injection volume (MIV) of plasma, which is a crucial factor in HPFA, was determined to be 200 μl (human), 500 μl (monkey), 700 μl (human) and 1000 μl (monkey) for these four radioligands. The MIV values increased as a higher fraction of the radioligand was present in the protein-free form. The protein binding results obtained were in good agreement with ultrafiltration, and the method did not suffer from non-specific binding. The short analysis time (<12 min) allowed multiple protein binding measurements during the time course of a human [¹¹C]PBR28 PET study. PMID:24922085

  7. Comparison of ultra-high performance supercritical fluid chromatography and ultra-high performance liquid chromatography for the analysis of pharmaceutical compounds.

    PubMed

    Grand-Guillaume Perrenoud, Alexandre; Veuthey, Jean-Luc; Guillarme, Davy

    2012-11-30

    Currently, columns packed with sub-2 μm particles are widely employed in liquid chromatography but are scarcely used in supercritical fluid chromatography. The goal of the present study was to compare the performance, possibilities and limitations of both ultra-high performance liquid chromatography (UHPLC) and ultra-high performance supercritical fluid chromatography (UHPSFC) using columns packed with sub-2 μm particles. For this purpose, a kinetic evaluation was first performed, and van Deemter curves and pressure plots were constructed and compared for columns packed with hybrid silica stationary phases composed of 1.7 and 3.5 μm particles. As expected, the kinetic performance of the UHPSFC method was significantly better than that of the UHPLC. Indeed, the h(min) values were in the same range with both strategies and were between 2.2 and 2.8, but u(opt) was increased by a factor of >4 in UHPSFC conditions. Another obvious advantage of UHPSFC over UHPLC is related to the generated backpressure, which is significantly lower in the presence of a supercritical or subcritical fluid. However, the upper pressure limit of the UHPSFC system was only ∼400 bar vs. ∼1000 bar in the UHPLC system, which prevents the use of highly organic mobile phases at high flow rates in UHPSFC. Second, the impact of reducing the particle size (from 3.5 to 1.7 μm) was evaluated in both UHPLC and UHPSFC conditions. The effect of frictional heating on the selectivity was demonstrated in UHPLC and that of fluid density or decompression cooling was highlighted in UHPSFC. However, in both cases, a change in selectivity was observed for only a limited number of compounds. Third, various types of column chemistries packed with 1.7 μm particles were evaluated in both UHPLC and UHPSFC conditions using a model mixture of acidic, neutral and basic compounds. It has been shown that more drastic changes in selectivity were obtained using UHPSFC columns compared to those obtained by changing
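
    The kinetic comparison above can be illustrated with the van Deemter relation H(u) = A + B/u + C·u, whose optimum is at u_opt = sqrt(B/C) with H_min = A + 2·sqrt(B·C). The coefficients below are invented placeholders chosen only to reproduce the qualitative finding that a weaker mass-transfer (C) term, together with a stronger diffusion (B) term, leaves h_min similar while pushing u_opt several times higher.

```python
import numpy as np

def van_deemter_optimum(A, B, C):
    """Return (u_opt, H_min) for H(u) = A + B/u + C*u (reduced coordinates)."""
    return np.sqrt(B / C), A + 2.0 * np.sqrt(B * C)

cases = {
    "UHPLC-like ": (1.0, 3.0, 0.12),    # stronger mass-transfer (C) term
    "UHPSFC-like": (1.0, 12.0, 0.03),   # faster diffusion: larger B, smaller C
}
for label, (A, B, C) in cases.items():
    u_opt, h_min = van_deemter_optimum(A, B, C)
    print(f"{label}: u_opt = {u_opt:5.1f}, h_min = {h_min:4.2f}")
```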

  8. An Analysis of Jamaican High School Students' Integrated Science Process Skills Performance.

    ERIC Educational Resources Information Center

    Soyibo, Kola; Beaumont-Walters, Yvonne

    This study determined Jamaican high school students' level of performance on five integrated science process skills and if there were significant differences in their performance linked to their gender, grade level, school location, school-type, student-type, and socioeconomic background (SEB). The 305 subjects comprised 133 males, 172 females,…

  9. Analysis of some selected catechins and caffeine in green tea by high performance liquid chromatography.

    PubMed

    El-Shahawi, M S; Hamza, A; Bahaffi, S O; Al-Sibaai, A A; Abduljabbar, T N

    2012-10-15

    Green tea seems to have a positive impact on health due to the catechins, found as flavanols. Thus, the present study aimed to develop a low cost reversed phase high performance liquid chromatographic (HPLC) method for simultaneous determination of flavanol contents, namely catechin (C), epicatechin (EC), epigallocatechin (EGC), epicatechin 3-gallate (ECG) and epigallocatechin 3-gallate (EGCG), and caffeine in 29 commercial green tea samples available in a Saudi Arabian local market. A C-18 reversed-phase column, acetonitrile-trifluoroacetic acid as a mobile phase, coupled with a UV detector at 205 nm, was successfully used for precise analysis of the tested analytes in boiled water of digested tea leaves. The average values of N (number of theoretical plates), HETP (height equivalent of theoretical plates) and Rs (separation factor) (at 10 μg ml⁻¹ of the catechins EC, EGC, EGCG and ECG) were 2.6×10³ ± 1.2×10³, 1.7×10⁻³ ± 4.7×10⁻⁴ cm and 1.7 ± 5.53×10⁻², respectively. The developed HPLC method demonstrated excellent performance, with low limits of detection (LOD) and quantification (LOQ) of the tested catechins of 0.004-0.05 μg ml⁻¹ and 0.01-0.17 μg ml⁻¹, respectively, and recovery percentages of 96-101%. The influence of infusion time (5-30 min) and temperature on the content of the flavanols was investigated by HPLC. After a 5 min infusion of the tea leaves, the average concentrations of caffeine, catechin, EC, EGC, ECG and EGCG were found to be in the ranges 0.086-2.23, 0.113-2.94, 0.58-10.22, 0.19-24.9, 0.22-13.9 and 1.01-43.3 mg g⁻¹, respectively. The contents of caffeine and catechins followed the sequence: EGCG>EGC>ECG>EC>C>caffeine. The method was applied satisfactorily for the analysis of (+)-catechin, even at trace and ultra-trace concentrations of catechins. The method was rapid, accurate, reproducible and ideal for routine analysis. PMID:23442685
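
    The column-performance figures quoted above follow from standard chromatographic formulas: N = 16·(tR/w)², HETP = L/N and Rs = 2·(tR2 − tR1)/(w1 + w2). The retention times, peak widths and column length below are illustrative, not taken from the paper's chromatograms.

```python
L_COLUMN_CM = 25.0                         # assumed column length, cm

def plate_number(t_r, w):
    return 16.0 * (t_r / w) ** 2           # N from retention time and baseline peak width

def resolution(t1, w1, t2, w2):
    return 2.0 * (t2 - t1) / (w1 + w2)

t1, w1 = 6.2, 0.50                         # peak 1: retention time and width, min
t2, w2 = 7.4, 0.55                         # peak 2

n = plate_number(t1, w1)
print(f"N ~ {n:.0f}, HETP ~ {L_COLUMN_CM / n:.4f} cm, Rs ~ {resolution(t1, w1, t2, w2):.2f}")
```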

  10. Performance analysis of InSb based QWFET for ultra high speed applications

    NASA Astrophysics Data System (ADS)

    Subash, T. D.; Gnanasekaran, T.; Divya, C.

    2015-01-01

    An indium antimonide based QWFET (quantum well field effect transistor) with a gate length down to 50 nm has been designed and investigated for the first time for L-band radar applications at 230 GHz. QWFETs are designed at the high performance node of the International Technology Road Map for Semiconductors (ITRS) requirements of drive current (Semiconductor Industry Association 2010). The performance of the device is investigated using the SYNOPSYS TCAD software. The InSb based QWFET could be a promising device technology for very low power and ultra-high speed performance, with 5-10 times lower DC power dissipation.

  11. An Analysis of High School Students' Performance on Five Integrated Science Process Skills

    NASA Astrophysics Data System (ADS)

    Beaumont-Walters, Yvonne; Soyibo, Kola

    2001-02-01

    This study determined Jamaican high school students' level of performance on five integrated science process skills and whether there were statistically significant differences in their performance linked to their gender, grade level, school location, school type, student type and socio-economic background (SEB). The 305 subjects comprised 133 males, 172 females, 146 ninth graders, 159 10th graders, 150 traditional and 155 comprehensive high school students, 164 students from the Reform of Secondary Education (ROSE) project and 141 non-ROSE students, 166 urban and 139 rural students and 110 students from a high SEB and 195 from a low SEB. Data were collected with the authors' constructed integrated science process skills test. The results indicated that the subjects' mean score was low and unsatisfactory; their performance in decreasing order was: interpreting data, recording data, generalising, formulating hypotheses and identifying variables; there were statistically significant differences in their performance based on their grade level, school type, student type, and SEB in favour of the 10th graders, traditional high school students, ROSE students and students from a high SEB. There was a positive, statistically significant and fairly strong relationship between their performance and school type, but weak relationships among their student type, grade level and SEB and performance.

  12. Thermodynamic performance analysis of a molten carbonate fuel cell at very high current densities

    NASA Astrophysics Data System (ADS)

    Ramandi, M. Y.; Dincer, I.

    2011-10-01

    This study is composed of two sections. In the first section, a CFD analysis is used to provide better insight into molten carbonate fuel cell operation and performance characteristics at very high current densities. Therefore, a mathematical model is developed by employing mass and momentum conservation, electrochemical reaction mechanisms and electric charges. The model results are then compared with the available data for an MCFC unit, and a good agreement is observed. In addition, the model is applied to predict the unit cell behaviour at various operating pressures, temperatures, and cathode gas stoichiometric ratios. In the second section, a thermodynamic model is utilized to examine energy efficiency, exergy efficiency and entropy generation of the MCFC. At low current densities, no considerable difference in output voltage and power is observed; however, for greater values of current densities, the difference is not negligible. If the molten carbonate fuel cell is to operate at current densities smaller than 2500 A m-2, there is no point in pressurizing the system. If the fuel cell operates at pressures greater than atmospheric pressure, the unit cell cost could be minimized. In addition, various partial pressure ratios at the cathode side demonstrated nearly the same effect on the performance of the fuel cell. With a 60 K change in operating temperature, almost 10% improvement in energy and exergy efficiencies is obtained. Both efficiencies initially increase at lower current densities, reach their maximum values and ultimately decrease with the increase of current density. Elevating the pressure enhances both the energy and exergy efficiencies of the cell. In addition, higher operating pressure and temperature decrease the unit cell entropy generation.
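
    As a rough, hedged illustration of the first- and second-law bookkeeping referred to above, the sketch below computes energy and exergy efficiencies from cell voltage and current density. The fuel (hydrogen), electron count, heating value and Gibbs-energy figures are generic assumptions, not the authors' model.

```python
# Hypothetical sketch: first- and second-law efficiencies of a fuel cell from
# cell voltage and current density; property values below are rough assumptions.
F = 96485.0          # Faraday constant, C/mol
n_e = 2.0            # electrons transferred per mole of H2 (assumption)
dH = 241.8e3         # lower heating value of H2, J/mol (assumption)
dG = 228.6e3         # Gibbs energy change of oxidation, J/mol (approx., standard conditions)

def efficiencies(V_cell, i, area):
    """Return (energy efficiency, exergy efficiency) for electrical output only."""
    P_el = V_cell * i * area                 # electrical power, W
    n_fuel = i * area / (n_e * F)            # mol/s of fuel consumed (ideal, no slip)
    return P_el / (n_fuel * dH), P_el / (n_fuel * dG)

eta_I, eta_II = efficiencies(V_cell=0.75, i=2500.0, area=1.0)   # 2500 A/m^2, 1 m^2
print(f"energy efficiency = {eta_I:.2f}, exergy efficiency = {eta_II:.2f}")
```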

  13. High performance polymer development

    NASA Technical Reports Server (NTRS)

    Hergenrother, Paul M.

    1991-01-01

    The term high performance as applied to polymers is generally associated with polymers that operate at high temperatures. High performance is used to describe polymers that perform at temperatures of 177 C or higher. In addition to temperature, other factors obviously influence the performance of polymers such as thermal cycling, stress level, and environmental effects. Some recent developments at NASA Langley in polyimides, poly(arylene ethers), and acetylenic terminated materials are discussed. The high performance/high temperature polymers discussed are representative of the type of work underway at NASA Langley Research Center. Further improvement in these materials as well as the development of new polymers will provide technology to help meet NASA future needs in high performance/high temperature applications. In addition, because of the combination of properties offered by many of these polymers, they should find use in many other applications.

  14. High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software

    PubMed Central

    Fabregat-Traver, Diego; Sharapov, Sodbo Zh.; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo

    2014-01-01

    To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed model based tests. When large samples are used, and when multiple traits are to be studied in the ’omics’ context, this approach becomes computationally challenging. Here we consider the problem of mixed-model based GWAS for an arbitrary number of traits, and demonstrate that for the analysis of single-trait and multiple-trait scenarios different computational algorithms are optimal. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL. PMID:25717363

  15. High-Performance Mixed Models Based Genome-Wide Association Analysis with omicABEL software.

    PubMed

    Fabregat-Traver, Diego; Sharapov, Sodbo Zh; Hayward, Caroline; Rudan, Igor; Campbell, Harry; Aulchenko, Yurii; Bientinesi, Paolo

    2014-01-01

    To raise the power of genome-wide association studies (GWAS) and avoid false-positive results in structured populations, one can rely on mixed model based tests. When large samples are used, and when multiple traits are to be studied in the 'omics' context, this approach becomes computationally challenging. Here we consider the problem of mixed-model based GWAS for an arbitrary number of traits, and demonstrate that for the analysis of single-trait and multiple-trait scenarios different computational algorithms are optimal. We implement these optimal algorithms in a high-performance computing framework that uses state-of-the-art linear algebra kernels, incorporates optimizations, and avoids redundant computations, increasing throughput while reducing memory usage and energy consumption. We show that, compared to existing libraries, our algorithms and software achieve considerable speed-ups. The OmicABEL software described in this manuscript is available under the GNU GPL v. 3 license as part of the GenABEL project for statistical genomics at http://www.genabel.org/packages/OmicABEL. PMID:25717363
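
    To make the computational structure of a mixed-model scan more concrete, the sketch below shows a common single-trait approach: rotate the data by the eigenvectors of the kinship matrix so the covariance becomes diagonal, then run a weighted least-squares test per SNP. This illustrates the general technique only; it is not the OmicABEL algorithm, and the fixed heritability is an assumption.

```python
# Hypothetical sketch of a single-trait mixed-model association scan via
# eigendecomposition of the kinship matrix; not the OmicABEL implementation.
import numpy as np

def mixed_model_scan(y, G, K, h2=0.5):
    """y: (n,) trait; G: (n, m) genotypes; K: (n, n) kinship; h2: assumed heritability."""
    n = len(y)
    vals, U = np.linalg.eigh(K)                   # K = U diag(vals) U^T
    d = h2 * vals + (1.0 - h2)                    # eigenvalues of V = h2*K + (1-h2)*I
    w = 1.0 / d                                   # per-sample weights after rotation
    y_r = U.T @ y
    ones_r = U.T @ np.ones(n)
    betas, zscores = [], []
    for g in G.T:                                 # one SNP (column of G) at a time
        x = np.column_stack([ones_r, U.T @ g])    # rotated intercept + genotype
        xtwx_inv = np.linalg.inv(x.T @ (w[:, None] * x))
        beta = xtwx_inv @ (x.T @ (w * y_r))
        resid = y_r - x @ beta
        sigma2 = (w * resid ** 2).sum() / (n - 2) # residual variance estimate
        se = np.sqrt(sigma2 * xtwx_inv[1, 1])
        betas.append(beta[1]); zscores.append(beta[1] / se)
    return np.array(betas), np.array(zscores)
```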

  16. DETECTION OF HETEROGENEOUS DRUG-PROTEIN BINDING BY FRONTAL ANALYSIS AND HIGH-PERFORMANCE AFFINITY CHROMATOGRAPHY

    PubMed Central

    Tong, Zenghan; Joseph, K.S.; Hage, David S.

    2011-01-01

    This study examined the use of frontal analysis and high-performance affinity chromatography for detecting heterogeneous binding in biomolecular interactions, using the binding of acetohexamide with human serum albumin (HSA) as a model. It was found through the use of this model system and chromatographic theory that double-reciprocal plots could be used more easily than traditional isotherms for the initial detection of binding site heterogeneity. The deviations from linearity that were seen in double-reciprocal plots as a result of heterogeneity were a function of the analyte concentration, the relative affinities of the binding sites in the system and the amount of each type of site that was present. The size of these deviations was determined and compared under various conditions. Plots were also generated to show what experimental conditions would be needed to observe these deviations for general heterogeneous systems or for cases in which some preliminary information was available on the extent of binding heterogeneity. The methods developed in this work for the detection of binding heterogeneity are not limited to drug interactions with HSA but could be applied to other types of drug-protein binding or to additional biological systems with heterogeneous binding. PMID:21612784
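
    A hedged sketch of the double-reciprocal idea described above: a one-site isotherm gives an exactly linear 1/(moles bound) versus 1/[analyte] plot, while a two-site (heterogeneous) system deviates from linearity. The binding capacities and association constants below are illustrative, not the acetohexamide-HSA values.

```python
# Hypothetical sketch: frontal-analysis style double-reciprocal plots for one-site
# versus two-site binding; deviation from linearity signals heterogeneity.
import numpy as np

A = np.logspace(-7, -4, 20)                 # applied analyte concentration, M

def bound_one_site(A, mL=1e-7, Ka=1e5):
    return mL * Ka * A / (1.0 + Ka * A)

def bound_two_site(A, mL1=6e-8, Ka1=1e6, mL2=4e-8, Ka2=1e4):
    return (mL1 * Ka1 * A / (1.0 + Ka1 * A) +
            mL2 * Ka2 * A / (1.0 + Ka2 * A))

for label, q in [("one-site", bound_one_site(A)), ("two-site", bound_two_site(A))]:
    x, y = 1.0 / A, 1.0 / q                         # double-reciprocal coordinates
    slope, intercept = np.polyfit(x, y, 1)          # straight-line fit
    deviation = np.max(np.abs(y - (slope * x + intercept))) / np.ptp(y)
    print(f"{label}: relative deviation from linearity = {deviation:.3f}")
```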

  17. A validated method for analysis of Swerchirin in Swertia longifolia Boiss. by high performance liquid chromatography

    PubMed Central

    Shekarchi, M.; Hajimehdipoor, H.; Khanavi, M.; Adib, N.; Bozorgi, M.; Akbari-Adergani, B.

    2010-01-01

    Swertia spp. (Gentianaceae) grow widely in the eastern and southern Asian countries and are used as traditional medicine for gastrointestinal disorders. Swerchirin, one of the xanthones in Swertia spp., has many pharmacological properties, such as antimalarial, antihepatotoxic, and hypoglycemic effects. Because of the pharmacological importance of Swerchirin, in this investigation it was purified from Swertia longifolia Boiss. as one of the main components and quantified by means of a validated high performance liquid chromatography (HPLC) technique. Aerial parts of the plant were extracted with acetone 80%. Phenolic and non-phenolic constituents of the extract were separated from each other during several processes. The phenolic fraction was injected into the semi-preparative HPLC system, which consisted of a C18 column and a gradient methanol: 0.1% formic acid mode. Using this method, we were able to purify six xanthones from the plant, in order to use them as standard materials. The analytical method was validated for Swerchirin, as one of the most important and pharmacologically active components of the plant, according to validation parameters such as selectivity, linearity (r2 > 0.9998), precision (≤3.3), and accuracy, which was measured by the determination of recovery (98-107%). The limits of detection and quantitation were found to be 2.1 and 6.3 μg/mL, respectively. On account of the speed and accuracy, the UV-HPLC method may be used for quantitative analysis of Swerchirin. PMID:20548931
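
    The detection and quantitation limits quoted above are typically derived from a calibration line. The sketch below uses the common 3.3·σ/S and 10·σ/S convention; the concentrations and peak areas are made-up placeholders, not the validation data.

```python
# Hypothetical sketch: LOD/LOQ estimation from a calibration line using the common
# 3.3*sigma/S and 10*sigma/S convention; concentrations and peak areas are made up.
import numpy as np

conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])       # ug/mL (hypothetical standards)
area = np.array([51.0, 103.0, 198.0, 405.0, 812.0])  # peak areas (hypothetical)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                         # SD of regression residuals

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
r2 = np.corrcoef(conc, area)[0, 1] ** 2
print(f"r^2 = {r2:.4f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
```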

  18. College Performance and Retention: A Meta-Analysis of the Predictive Validities of ACT® Scores, High School Grades, and SES

    ERIC Educational Resources Information Center

    Westrick, Paul A.; Le, Huy; Robbins, Steven B.; Radunzel, Justine M. R.; Schmidt, Frank L.

    2015-01-01

    This meta-analysis examines the strength of the relationships of ACT® Composite scores, high school grades, and socioeconomic status (SES) with academic performance and persistence into the 2nd and 3rd years at 4-year colleges and universities. Based upon a sample of 189,612 students at 50 institutions, ACT Composite scores and high school grade…

  19. High Performance Data Clustering: A Comparative Analysis of Performance for GPU, RASC, MPI, and OpenMP Implementations.

    PubMed

    Yang, Luobin; Chiu, Steve C; Liao, Wei-Keng; Thomas, Michael A

    2014-10-01

    Compared to Beowulf clusters and shared-memory machines, GPU and FPGA are emerging alternative architectures that provide massive parallelism and great computational capabilities. These architectures can be utilized to run compute-intensive algorithms to analyze ever-enlarging datasets and provide scalability. In this paper, we present four implementations of K-means data clustering algorithm for different high performance computing platforms. These four implementations include a CUDA implementation for GPUs, a Mitrion C implementation for FPGAs, an MPI implementation for Beowulf compute clusters, and an OpenMP implementation for shared-memory machines. The comparative analyses of the cost of each platform, difficulty level of programming for each platform, and the performance of each implementation are presented. PMID:25309040

  20. High Performance Data Clustering: A Comparative Analysis of Performance for GPU, RASC, MPI, and OpenMP Implementations*

    PubMed Central

    Yang, Luobin; Chiu, Steve C.; Liao, Wei-Keng; Thomas, Michael A.

    2013-01-01

    Compared to Beowulf clusters and shared-memory machines, GPU and FPGA are emerging alternative architectures that provide massive parallelism and great computational capabilities. These architectures can be utilized to run compute-intensive algorithms to analyze ever-enlarging datasets and provide scalability. In this paper, we present four implementations of K-means data clustering algorithm for different high performance computing platforms. These four implementations include a CUDA implementation for GPUs, a Mitrion C implementation for FPGAs, an MPI implementation for Beowulf compute clusters, and an OpenMP implementation for shared-memory machines. The comparative analyses of the cost of each platform, difficulty level of programming for each platform, and the performance of each implementation are presented. PMID:25309040
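
    For reference, the kernel that the CUDA, Mitrion C, MPI and OpenMP versions parallelize in different ways is Lloyd's K-means iteration. The NumPy sketch below is a minimal serial version for illustration only; the random data are placeholders.

```python
# Minimal NumPy sketch of Lloyd's K-means iteration (assignment + update steps).
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assignment step: nearest center for every point
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # update step: recompute centers as cluster means
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return centers, labels

X = np.random.default_rng(1).normal(size=(1000, 4))
centers, labels = kmeans(X, k=3)
```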

  1. High Performance Computing for probabilistic distributed slope stability analysis, an early example

    NASA Astrophysics Data System (ADS)

    Rossi, Guglielmo; Catani, Filippo

    2010-05-01

    The term shallow landslides is widely used in the literature to describe a slope movement of limited size that mainly develops in soils up to a maximum of a few meters thick. Shallow landslides are usually triggered by heavy rainfall because, as the water starts to infiltrate into the soil, the pore-water pressure increases so that the shear strength of the soil is reduced, leading to slope failure. We have developed a distributed hydrological-geotechnical model for forecasting the temporal and spatial distribution of shallow landslides to be used as a real time warning system for civil protection purposes. The stability simulator is developed to use High Performance Computing (HPC) resources and in this way can manage large areas, with high spatial and temporal resolution, at useful computational time for a warning system. The output of the model is a probabilistic value of slope instability. In its current stage the model applied for predicting the expected location of shallow landslides involves several stand-alone components. The base solution suggested by Iverson for the Richards equation is adapted to be used in a real time simulator to estimate the probabilistic distribution of the transient groundwater pressure head according to radar detected rainfall intensity. The use of radar detected rainfall intensity as the input for the hydrological simulation of the infiltration allows a more accurate computation of the redistribution of the groundwater pressure associated with transient infiltration of rain. A soil depth prediction scheme and a limit-equilibrium infinite slope stability algorithm are used to calculate the distributed factor of safety (FS) at different depths and to record the probability distribution of slope instability in the final output file. The additional ancillary data required have been collected during fieldwork and with laboratory standard tests. The model deals with both saturated and unsaturated conditions taking into account the effect of…
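
    A hedged sketch of the kind of infinite-slope factor of safety evaluated per grid cell in such models is given below. The formulation (cohesion, friction angle, pore pressure from the pressure head) is the generic limit-equilibrium relation, not necessarily the authors' exact one, and the soil parameters are illustrative assumptions.

```python
# Hypothetical sketch of an infinite-slope factor of safety with pore pressure,
# the kind of limit-equilibrium relation a distributed model evaluates per cell.
import numpy as np

def factor_of_safety(slope_deg, z, psi, c=5e3, phi_deg=32.0,
                     gamma_s=19e3, gamma_w=9.81e3):
    """
    slope_deg : slope angle (degrees)
    z         : failure depth (m)
    psi       : pressure head at depth z (m of water), from the infiltration model
    c, phi    : effective cohesion (Pa) and friction angle (deg), assumptions
    gamma_s/w : unit weights of soil and water (N/m^3)
    """
    a = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    tau = gamma_s * z * np.sin(a) * np.cos(a)                 # driving shear stress
    sigma_n = gamma_s * z * np.cos(a) ** 2                    # total normal stress
    u = gamma_w * psi                                         # pore water pressure
    return (c + (sigma_n - u) * np.tan(phi)) / tau

print(factor_of_safety(slope_deg=35.0, z=1.5, psi=0.5))       # FS close to 1: near failure
```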

  2. Factors Affecting University Entrants' Performance in High-Stakes Tests: A Multiple Regression Analysis

    ERIC Educational Resources Information Center

    Uy, Chin; Manalo, Ronaldo A.; Cabauatan, Ronaldo R.

    2015-01-01

    In the Philippines, students seeking admission to a university are usually required to meet certain entrance requirements, including passing the entrance examinations with questions on IQ and English, mathematics, and science. This paper aims to determine the factors that affect the performance of entrants into business programmes in high-stakes…

  3. Ion-pair high-performance liquid chromatographic analysis of aspartame and related products.

    PubMed

    Verzella, G; Bagnasco, G; Mangia, A

    1985-12-01

    A simple and accurate quantitative determination of aspartame (L-alpha-aspartyl-L-phenylalanine methyl ester), a new artificial sweetener, is described. The method, which is based on ion-pair high-performance liquid chromatography, allows the determination of aspartame in finished bulk and dosage forms, and the detection of a few related products at levels down to 0.1%. PMID:4086646

  4. High Performance Liquid Chromatographic Analysis of Phytoplankton Pigments Using a C16-Amide Column

    EPA Science Inventory

    A reverse-phase high performance liquid chromatographic (RP-HPLC) method was developed to analyze in a single run, most polar and non-polar chlorophylls and carotenoids from marine phytoplankton. The method is based on a RP-C16-Amide column and a ternary gradient system consistin...

  5. Application of High-performance Visual Analysis Methods to Laser Wakefield Particle Acceleration Data

    SciTech Connect

    Rubel, Oliver; Prabhat, Mr.; Wu, Kesheng; Childs, Hank; Meredith, Jeremy; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Ahern, Sean; Weber, Gunther H.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2008-08-28

    Our work combines and extends techniques from high-performance scientific data management and visualization to enable scientific researchers to gain insight from extremely large, complex, time-varying laser wakefield particle accelerator simulation data. We extend histogram-based parallel coordinates for use in visual information display as well as an interface for guiding and performing data mining operations, which are based upon multi-dimensional and temporal thresholding and data subsetting operations. To achieve very high performance on parallel computing platforms, we leverage FastBit, a state-of-the-art index/query technology, to accelerate data mining and multi-dimensional histogram computation. We show how these techniques are used in practice by scientific researchers to identify, visualize and analyze a particle beam in a large, time-varying dataset.
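
    The FastBit bitmap-index machinery itself is not reproduced here, but the multi-dimensional thresholding and conditional-histogram idea can be sketched with plain NumPy boolean masks. The particle arrays and the momentum cut below are hypothetical.

```python
# Sketch of multi-dimensional thresholding plus a conditional 2D histogram
# (NumPy masks stand in for FastBit bitmap indexes); data are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0, 100, n)                    # longitudinal position (arbitrary units)
y = rng.uniform(-5, 5, n)                     # transverse position
px = rng.exponential(1.0, n)                  # longitudinal momentum

beam = px > 5.0                               # "beam" particles above a momentum cut
hist, xedges, yedges = np.histogram2d(x[beam], y[beam], bins=(200, 100))
print(f"{beam.sum()} particles selected; densest bin holds {int(hist.max())}")
```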

  6. Relativistic performance analysis of a high current density magnetron injection gun

    SciTech Connect

    Barnett, L. R.; Luhmann, N. C. Jr.; Chiu, C. C.; Chu, K. R.

    2009-09-15

    Electron beam quality is essential to the performance of millimeter-wave gyroamplifiers, particularly the gyrotron traveling-wave tube amplifier, which is extremely sensitive to the electron velocity spread and emission uniformity. As one moves up in power and frequency, the quality of the electron beam becomes even more critical. One aspect of the electron beam formation technology which has received relatively little attention has been the performance analysis of the electron beam itself. In this study, a 100 kV, 8 A magnetron injection gun with a calculated perpendicular-to-parallel velocity ratio of 1.4 and axial velocity spread of 3.5% has been designed, tested, and analyzed. It is shown that the equipment precision and a fully relativistic data analysis model afford sufficient resolution to allow a verification of the theoretical predictions as well as a quantitative inference to the surface roughness of the cathode used.

  7. Relativistic performance analysis of a high current density magnetron injection gun

    NASA Astrophysics Data System (ADS)

    Barnett, L. R.; Luhmann, N. C.; Chiu, C. C.; Chu, K. R.

    2009-09-01

    Electron beam quality is essential to the performance of millimeter-wave gyroamplifiers, particularly the gyrotron traveling-wave tube amplifier, which is extremely sensitive to the electron velocity spread and emission uniformity. As one moves up in power and frequency, the quality of the electron beam becomes even more critical. One aspect of the electron beam formation technology which has received relatively little attention has been the performance analysis of the electron beam itself. In this study, a 100 kV, 8 A magnetron injection gun with a calculated perpendicular-to-parallel velocity ratio of 1.4 and axial velocity spread of 3.5% has been designed, tested, and analyzed. It is shown that the equipment precision and a fully relativistic data analysis model afford sufficient resolution to allow a verification of the theoretical predictions as well as a quantitative inference to the surface roughness of the cathode used.

  8. The NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform to Support the Analysis of Petascale Environmental Data Collections

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Pugh, T.; Wyborn, L. A.; Porter, D.; Allen, C.; Smillie, J.; Antony, J.; Trenham, C.; Evans, B. J.; Beckett, D.; Erwin, T.; King, E.; Hodge, J.; Woodcock, R.; Fraser, R.; Lescinsky, D. T.

    2014-12-01

    The National Computational Infrastructure (NCI) has co-located a priority set of national data assets within a HPC research platform. This powerful in-situ computational platform has been created to help serve and analyse the massive amounts of data across the spectrum of environmental collections - in particular the climate, observational data and geoscientific domains. This paper examines the infrastructure, innovation and opportunity for this significant research platform. NCI currently manages nationally significant data collections (10+ PB) categorised as 1) earth system sciences, climate and weather model data assets and products, 2) earth and marine observations and products, 3) geosciences, 4) terrestrial ecosystem, 5) water management and hydrology, and 6) astronomy, social science and biosciences. The data is largely sourced from the NCI partners (who include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. By co-locating these large valuable data assets, new opportunities have arisen by harmonising the data collections, making a powerful transdisciplinary research platform. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. New scientific software, cloud-scale techniques, server-side visualisation and data services have been harnessed and integrated into the platform, so that analysis is performed seamlessly across the traditional boundaries of the underlying data domains. Characterisation of the techniques along with performance profiling ensures scalability of each software component, all of which can either be enhanced or replaced through future improvements. A Development-to-Operations (DevOps) framework has also been implemented to manage the scale of the software complexity alone. This ensures that…

  9. Solar Total Energy Project (STEP) Performance Analysis of High Temperature Energy Storage Subsystem

    NASA Technical Reports Server (NTRS)

    Moore, D. M.

    1984-01-01

    The 1982 milestones and lessons learned; performance in 1983; a typical day's operation; collector field performance and thermal losses; and formal testing are highlighted. An initial test that involves characterizing the high temperature storage (HTS) subsystem is emphasized. The primary element is an 11,000 gallon storage tank that provides energy to the steam generator during transient solar conditions or extends operating time. Overnight thermal losses were analyzed. The length of time the system is operated at various levels of cogeneration using stored energy is reviewed.

  10. Comparative analysis of steroidal saponins in four Dioscoreae herbs by high performance liquid chromatography coupled with mass spectrometry.

    PubMed

    Guo, Long; Zeng, Su-Ling; Zhang, Yu; Li, Ping; Liu, E-Hu

    2016-01-01

    Steroidal saponins, which exhibit multiple pharmacological effects, are the major bioactive constituents in herbal medicines from Dioscoreae species. In this study, a sensitive method based on high performance liquid chromatography-mass spectrometry (HPLC-MS) was established and validated for qualitative and quantitative analysis of steroidal saponins in four Dioscoreae herbs, including Dioscoreae Nipponica Rhizome (DNR), Dioscoreae Hypoglaucae Rhizome (DHR), Dioscoreae Spongiosae Rhizome (DSR) and Dioscoreae Rhizome (DR). A total of eleven steroidal saponins were identified by high performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (HPLC-QTOF/MS). Furthermore, seven major steroidal saponins were simultaneously quantified using high performance liquid chromatography coupled with triple quadrupole mass spectrometry (HPLC-QQQ/MS). The qualitative and quantitative analysis results indicated that the chemical composition of DNR, DHR and DSR samples exhibited a high level of global similarity, while the ingredients in DR varied greatly from the other three herbs. Moreover, principal component analysis (PCA) and hierarchical clustering analysis (HCA) were performed to compare and discriminate the Dioscoreae herbs based on the quantitative data. The results demonstrated that qualitative and quantitative analysis of steroidal saponins based on HPLC-MS is a feasible method for quality control of Dioscoreae herbs. PMID:26344383
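
    A hedged sketch of how PCA and hierarchical clustering can be applied to a samples-by-saponins quantitation matrix of the kind described above. The random matrix, sample counts and cluster number are stand-ins, not the study's measurements.

```python
# Hypothetical sketch: PCA and hierarchical clustering of a (samples x saponins)
# quantitation matrix for herb discrimination; data are random stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.normal(size=(24, 7))                       # 24 samples x 7 quantified saponins

# PCA via SVD of the autoscaled matrix
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                                     # sample coordinates on the PCs
explained = s ** 2 / (s ** 2).sum()
print("variance explained by PC1/PC2:", explained[:2])

# Hierarchical clustering (Ward linkage) on the same autoscaled data
Z = linkage(Xc, method="ward")
groups = fcluster(Z, t=4, criterion="maxclust")
print("cluster assignments:", groups)
```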

  11. High performance systems

    SciTech Connect

    Vigil, M.B.

    1995-03-01

    This document provides a written compilation of the presentations and viewgraphs from the 1994 Conference on High Speed Computing given at the High Speed Computing Conference, "High Performance Systems," held at Gleneden Beach, Oregon, on April 18 through 21, 1994.

  12. High performance dash on warning air mobile, missile system. [intercontinental ballistic missiles - systems analysis

    NASA Technical Reports Server (NTRS)

    Levin, A. D.; Castellano, C. R.; Hague, D. S.

    1975-01-01

    An aircraft-missile system which performs a high acceleration takeoff followed by a supersonic dash to a 'safe' distance from the launch site is presented. Topics considered are: (1) technological feasibility to the dash on warning concept; (2) aircraft and boost trajectory requirements; and (3) partial cost estimates for a fleet of aircraft which provide 200 missiles on airborne alert. Various aircraft boost propulsion systems were studied such as an unstaged cryogenic rocket, an unstaged storable liquid, and a solid rocket staged system. Various wing planforms were also studied. Vehicle gross weights are given. The results indicate that the dash on warning concept will meet expected performance criteria, and can be implemented using existing technology, such as all-aluminum aircraft and existing high-bypass-ratio turbofan engines.

  13. Analysis of high performance conjugate heat transfer with the OpenPALM coupler

    NASA Astrophysics Data System (ADS)

    Duchaine, Florent; Jauré, Stéphan; Poitou, Damien; Quémerais, Eric; Staffelbach, Gabriel; Morel, Thierry; Gicquel, Laurent

    2015-01-01

    In many communities, such as climate science or industrial design, solving complex coupled problems with high fidelity by externally coupling legacy solvers puts a lot of pressure on the tool used for the coupling. The precision of such predictions depends not only on simulation resolution and the use of huge meshes but also on high performance computing to reduce restitution times. In this context, the current work aims at studying the scalability of code coupling on high performance computing architectures for a conjugate heat transfer problem. The flow solver is a Large Eddy Simulation code that has already been ported to massively parallel architectures. The conduction solver is based on the same data structure and thus shares the flow solver scalability properties. Accurately coupling solvers on massively parallel architectures while maintaining their scalability is challenging. It requires exchanging and treating information based on two different computational grids that are partitioned differently on a different number of cores. Such transfers have to be designed to preserve code scalability while maintaining numerical accuracy. This raises communication and high performance computing issues: transferring data from a distributed interface to another distributed interface in a parallel way and on a very large number of processors is not straightforward and solutions are not clear. Performance tests have been carried out up to 12 288 cores on the CURIE supercomputer (TGCC/CEA). Results show a good behavior of the coupled model when increasing the number of cores thanks to the fully distributed exchange process implemented in the coupler. Advanced analyses are carried out to draw new paths for future developments for coupled simulations, i.e. optimization of the data transfer protocols through asynchronous communications or coupling-aware preprocessing of the coupled models (mesh partitioning phase).
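
    To make the asynchronous-communication idea mentioned above more concrete, here is a minimal mpi4py sketch of overlapping interface data exchange with local computation using non-blocking sends and receives. The neighbour layout, buffer sizes and pairing scheme are hypothetical; this is not the OpenPALM exchange implementation.

```python
# Minimal mpi4py sketch of asynchronous interface exchange between two coupled
# partitions: post non-blocking sends/receives, keep computing, then wait.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
peer = rank ^ 1                                   # pair ranks (0<->1, 2<->3, ...)

send_buf = np.full(1000, float(rank))             # interface data owned by this rank
recv_buf = np.empty_like(send_buf)

if peer < size:
    reqs = [comm.Isend(send_buf, dest=peer, tag=0),
            comm.Irecv(recv_buf, source=peer, tag=0)]
    local_work = np.sin(send_buf).sum()           # overlap computation with transfer
    MPI.Request.Waitall(reqs)
    print(f"rank {rank}: received interface data from rank {peer}")
```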

  14. High-performance liquid chromatography analysis of plant saponins: An update 2005-2010

    PubMed Central

    Negi, Jagmohan S.; Singh, Pramod; Pant, Geeta Joshi Nee; Rawat, M. S. M.

    2011-01-01

    Saponins are widely distributed in the plant kingdom. In view of their wide range of biological activities and occurrence as complex mixtures, saponins have been purified and separated by high-performance liquid chromatography using reverse-phase columns with detection at low wavelengths. Most saponins are not detected by ultraviolet detectors due to their lack of chromophores. Electrospray ionization mass spectrometry, diode array detection, evaporative light scattering detection, and charged aerosol detection have been used for overcoming the detection problem of saponins. PMID:22303089

  15. Simultaneous Analysis and Quantification of Markers of Manjisthadi Churna Using High Performance Thin Layer Chromatography

    PubMed Central

    Patel, V. R.; Patel, R. K.

    2013-01-01

    Manjisthadi churna has been traditionally used in the Ayurvedic system of medicine and by traditional medical practices of India to treat hyperlipidemia. A rapid, simple and accurate method with high performance thin layer chromatography has been developed to standardise Manjisthadi churna using rubiadin, sennoside and ellagic acid as markers. A methanol extract of Manjisthadi churna was used for high performance thin layer chromatography on silica gel plates. The Rf values of rubiadin, sennoside-A and ellagic acid were found to be 0.48, 0.23 and 0.72, respectively, with densitometric scanning at 280 nm, and the calibration plots were linear in the range of 100-600 ng of markers. The correlation coefficients were higher than 0.99, indicative of good linear dependence of peak area on concentration. The rubiadin, sennoside-A and ellagic acid contents in Manjisthadi churna were found to be 0.014, 0.038 and 0.534% w/w, respectively. This method permits reliable quantification of rubiadin, sennoside-A and ellagic acid with good resolution and separation of the same from other constituents of the extract of Manjisthadi churna. Recovery values from 95.66 to 102.33% showed the reliability and reproducibility of the method. The proposed high performance thin layer chromatography method for simultaneous quantification of markers in Manjisthadi churna can be used for routine quality testing. PMID:23901170

  16. High-precision image aided inertial navigation with known features: observability analysis and performance evaluation.

    PubMed

    Jiang, Weiping; Wang, Li; Niu, Xiaoji; Zhang, Quan; Zhang, Hui; Tang, Min; Hu, Xiangyun

    2014-01-01

    A high-precision image-aided inertial navigation system (INS) is proposed as an alternative to the carrier-phase-based differential Global Navigation Satellite Systems (CDGNSSs) when satellite-based navigation systems are unavailable. In this paper, the image/INS integrated algorithm is modeled by a tightly-coupled iterative extended Kalman filter (IEKF). Tightly-coupled integration ensures that the integrated system is reliable, even if few known feature points (i.e., less than three) are observed in the images. A new global observability analysis of this tightly-coupled integration is presented to guarantee that the system is observable under the necessary conditions. The analysis conclusions were verified by simulations and field tests. The field tests also indicate that high-precision position (centimeter-level) and attitude (half-degree-level)-integrated solutions can be achieved in a global reference. PMID:25330046

  17. High-Precision Image Aided Inertial Navigation with Known Features: Observability Analysis and Performance Evaluation

    PubMed Central

    Jiang, Weiping; Wang, Li; Niu, Xiaoji; Zhang, Quan; Zhang, Hui; Tang, Min; Hu, Xiangyun

    2014-01-01

    A high-precision image-aided inertial navigation system (INS) is proposed as an alternative to the carrier-phase-based differential Global Navigation Satellite Systems (CDGNSSs) when satellite-based navigation systems are unavailable. In this paper, the image/INS integrated algorithm is modeled by a tightly-coupled iterative extended Kalman filter (IEKF). Tightly-coupled integration ensures that the integrated system is reliable, even if few known feature points (i.e., less than three) are observed in the images. A new global observability analysis of this tightly-coupled integration is presented to guarantee that the system is observable under the necessary conditions. The analysis conclusions were verified by simulations and field tests. The field tests also indicate that high-precision position (centimeter-level) and attitude (half-degree-level)-integrated solutions can be achieved in a global reference. PMID:25330046
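
    The core step iterated by the tightly-coupled image/INS IEKF described above is the Kalman measurement update. The sketch below shows a single (non-iterated) extended Kalman filter update for illustration; the state layout, Jacobian and noise matrices are toy values, not the paper's formulation.

```python
# Hypothetical sketch of one extended Kalman filter measurement update, the core
# step iterated by a tightly-coupled image/INS IEKF; matrices are toy values.
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """x, P: prior state/covariance; z: measurement; h: predicted measurement;
    H: measurement Jacobian; R: measurement noise covariance."""
    y = z - h                                   # innovation
    S = H @ P @ H.T + R                         # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

x = np.zeros(4); P = np.eye(4)
H = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0]])
z = np.array([0.3, -0.1]); R = 0.01 * np.eye(2)
x, P = ekf_update(x, P, z, h=H @ x, H=H, R=R)
```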

  18. Optimal design analysis for thermal performance of high power 2.5D package

    NASA Astrophysics Data System (ADS)

    Xiaoyang, Liu; He, Ma; Daquan, Yu; Wenlu, Chen; Xiaolong, Wu

    2016-03-01

    Based on the ANSYS and Icepak software, a numerical analysis method is used to build up the thermal analysis model of the 2.5D package, which contains a high power CPU chip. The focus of the research is on the determination of the contributing factors and their effects on the thermal resistance and heat distribution of the package. The parametric analysis illustrates that the substrate conductivity, TIM conductivity and fin height are more crucial for heat conduction in the package. Furthermore, these major parameters are compared and analyzed by orthogonal tests, and the optimal solution for 2.5D integration is proposed. The factors' influence patterns on thermal resistance, obtained in this article, could be utilized as a thermal design reference. Project supported by the National S & T Major Projects (No. 2011ZX02709-2) and the China National Science Foundation (No. 61176098).

  19. High performance N2O4/amine elements: Blowapart. [test and analysis of single element injectors

    NASA Technical Reports Server (NTRS)

    Lawver, B. R.

    1974-01-01

    Work conducted to define the mechanisms governing blowapart of hypergolic propellants through the design, fabrication, test, and analysis of single element injectors is reported. The data show that the parameters exhibiting a controlling influence over blowapart are chamber pressure, orifice diameter, and propellant temperature. Mixing, popping (cyclic blowapart), low pressure separation, and high pressure separation were identified as modes of reactive impingement.

  20. Aerothermodynamic heating and performance analysis of a high-lift aeromaneuvering AOTV concept

    NASA Technical Reports Server (NTRS)

    Menees, G. P.; Brown, K. G.; Wilson, J. F.; Davies, C. B.

    1985-01-01

    The thermal-control requirements for design-optimized aeromaneuvering performance are determined for space-based applications and low-earth orbit sorties involving large, multiple plane-inclination changes. The leading-edge heating analysis is the most advanced developed for hypersonic-rarefied flow over lifting surfaces at incidence. The effects of leading-edge bluntness, low-density viscous phenomena, and finite-rate flow-field chemistry and surface catalysis are accounted for. The predicted aerothermodynamic heating characteristics are correlated with thermal-control and flight-performance capabilities. The mission payload capability for delivery, retrieval, and combined operations is determined for round-trip sorties extending to polar orbits. Recommendations are given for future design refinements. The results help to identify technology issues required to develop prototype operational systems.

  1. MIR Performance Analysis

    SciTech Connect

    Hazen, Damian; Hick, Jason

    2012-06-12

    We provide analysis of Oracle StorageTek T10000 Generation B (T10KB) Media Information Record (MIR) Performance Data gathered over the course of a year from our production High Performance Storage System (HPSS). The analysis shows information in the MIR may be used to improve tape subsystem operations. Most notably, we found the MIR information to be helpful in determining whether the drive or tape was most suspect given a read or write error, and for helping identify which tapes should not be reused given their history of read or write errors. We also explored using the MIR Assisted Search to order file retrieval requests. We found that MIR Assisted Search may be used to reduce the time needed to retrieve collections of files from a tape volume.

  2. Simultaneous analysis of monosaccharides and oligosaccharides by high-performance liquid chromatography with postcolumn fluorescence derivatization.

    PubMed

    Kakita, Hirotaka; Kamishima, Hiroshi; Komiya, Katsuo; Kato, Yoshio

    2002-06-28

    To develop a fluorimetric HPLC technique for the simultaneous microanalysis of reducing mono- and oligosaccharides, the technique of linear gradient elution was introduced into the postcolumn fluorimetric determination system of reducing saccharides with benzamidine. Fluorescence measurement was performed at 288 nm for excitation and 470 nm for emission, and an optimization study of this postcolumn fluorescence derivatization was carried out. Under optimum conditions, the detection limits of D-glucose and maltohexaose were 1.78 and 2.59 pmol, respectively. The present method was successfully applied to saccharide analysis and should prove useful for automated simultaneous microanalysis of reducing mono- and oligosaccharides in foods. PMID:12186393

  3. High Performance Polymers

    NASA Technical Reports Server (NTRS)

    Venumbaka, Sreenivasulu R.; Cassidy, Patrick E.

    2003-01-01

    This report summarizes results from research on high performance polymers. The research areas proposed in this report include: 1) Effort to improve the synthesis and to understand and replicate the dielectric behavior of 6HC17-PEK; 2) Continue preparation and evaluation of flexible, low dielectric silicon- and fluorine- containing polymers with improved toughness; and 3) Synthesis and characterization of high performance polymers containing the spirodilactam moiety.

  4. RAPID ANALYSIS OF CYANURIC ACID IN SWIMMING POOL WATERS BY HIGH PERFORMANCE LIQUID CHROMATOGRAPHY USING POROUS GRAPHITIC CARBON

    EPA Science Inventory

    An innovative approach is presented for reducing analysis times of cynuric acid in swimming pool waters by high performance liquid chromatography (HPLC). The HPLC method exploits the unique selectivity of porous graphitic carbon (PGC) to fully resolve within 10 minutes cyanuric ...

  5. RAPID ANALYSIS OF CYANURIC ACID IN SWIMMING POOL WATERS BY HIGH PERFORMANCE LIQUID CHROMATOGRAPHY USING POROUS GRAPHITIC CARBON COLUMN

    EPA Science Inventory

    An innovative approach is presented for reducing analysis times of cyanuric acid in swimming pool waters by high performance liquid chromatography (HPLC). The HPLC method exploits the unique selectivity of porous graphitic carbon (PGC) to fully resolve cyanuric acid from other p...

  6. Phytochemical analysis of Hibiscus caesius using high performance liquid chromatography coupled with mass spectrometry.

    PubMed

    Ain, Quratul; Naveed, Muhammad Na; Mumtaz, Abdul Samad; Farman, Muhammad; Ahmed, Iftikhar; Khalid, Nauman

    2015-09-01

    Various species in the genus Hibiscus are traditionally known for their therapeutic attributes. The present study focused on the phytochemical analysis of a rather unexplored species, Hibiscus caesius (H. caesius), using high-pressure liquid chromatography coupled with mass spectrometry (HPLC-MS). The analysis revealed five major compounds in the aqueous extract, viz. vanillic acid, protocatechuic acid, quercetin, quercetin glucoside and apigenin, being reported for the first time in H. caesius. Literature suggests that these compounds have important pharmacological traits such as anti-cancer, anti-inflammatory, anti-bacterial and hepatoprotective effects; however, this requires further pharmacological investigations at the in vitro and in vivo scale. The study thus highlights the medicinal potential of H. caesius. PMID:26408882

  7. A rapid and sensitive high performance liquid chromatographic analysis of clofazimine in plasma.

    PubMed

    Krishnan, T R; Abraham, I

    1992-12-01

    The high performance liquid chromatographic (HPLC) method of Gidoh, et al. has been modified substantially to provide a simple, rapid, and relatively inexpensive procedure for measuring clofazimine in plasma. The modification involves the use of commonly available laboratory reagents instead of custom-made ones. It also employs a solid phase system for efficient extraction instead of the conventional, less efficient and more labor intensive, liquid-liquid extraction. The inclusion of an internal standard (salicylic acid) improves the precision and reproducibility. It is demonstrated that the method can be used to monitor in vivo clofazimine levels as may be required in formal pharmacokinetic studies or therapeutic drug monitoring. PMID:1299710

  8. High Performance Network Monitoring

    SciTech Connect

    Martinez, Jesse E

    2012-08-10

    Network Monitoring requires a substantial use of data and error analysis to overcome issues with clusters. Zenoss and Splunk help to monitor system log messages that are reporting issues about the clusters to monitoring services. The InfiniBand infrastructure on a number of clusters was upgraded to ibmon2. ibmon2 requires different filters to report errors to system administrators. The focus for this summer was to: (1) Implement ibmon2 filters on monitoring boxes to report system errors to system administrators using Zenoss and Splunk; (2) Modify and improve scripts for monitoring and administrative usage; (3) Learn more about networks including services and maintenance for high performance computing systems; and (4) Gain life experience working with professionals under real world situations. Filters were created to account for clusters running ibmon2 v1.0.0-1; 10 filters are currently implemented for ibmon2 using Python. Filters look for thresholds on port counters. Over certain counts, filters report errors to on-call system administrators and modify the grid to show the local host with the issue.
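
    The report states that the filters are written in Python and watch port-counter thresholds. The sketch below shows a generic threshold filter of that kind; the counter names, threshold values and alerting hook are assumptions, not the ibmon2 code.

```python
# Hypothetical sketch of a port-counter threshold filter of the kind described;
# counter names, thresholds and the alerting hook are assumptions, not ibmon2 code.
THRESHOLDS = {"SymbolErrorCounter": 10, "LinkDownedCounter": 1, "PortRcvErrors": 50}

def check_counters(host, counters):
    """counters: dict of counter name -> value read from the fabric monitor."""
    alerts = []
    for name, limit in THRESHOLDS.items():
        value = counters.get(name, 0)
        if value > limit:
            alerts.append(f"{host}: {name}={value} exceeds threshold {limit}")
    return alerts

sample = {"SymbolErrorCounter": 42, "LinkDownedCounter": 0, "PortRcvErrors": 7}
for alert in check_counters("node042", sample):
    print(alert)        # in production this would go to Zenoss/Splunk/on-call email
```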

  9. Extending PowerPack for Profiling and Analysis of High Performance Accelerator-Based Systems

    SciTech Connect

    Li, Bo; Chang, Hung-Ching; Song, Shuaiwen; Su, Chun-Yi; Meyer, Timmy; Mooring, John; Cameron, Kirk

    2014-12-01

    Accelerators offer a substantial increase in efficiency for high-performance systems, offering speedups for computational applications that leverage hardware support for highly-parallel codes. However, the power use of some accelerators exceeds 200 watts at idle, which means use at exascale comes at a significant increase in power at a time when we face a power ceiling of about 20 megawatts. Despite the growing domination of accelerator-based systems in the Top500 and Green500 lists of fastest and most efficient supercomputers, there are few detailed studies comparing the power and energy use of common accelerators. In this work, we conduct detailed experimental studies of the power usage and distribution of Xeon-Phi-based systems in comparison to NVIDIA Tesla- and SandyBridge-based systems.

  10. Performance Analysis of high-order remap-type advection scheme on icosahedral-hexagonal grid

    NASA Astrophysics Data System (ADS)

    Mittal, Rashmi; Dubey, Sarvesh; Saxena, Vaibhav; Meurdesoif, Yann

    2014-05-01

    A comparative performance analysis of the computational cost of the second order advection scheme FF-CSLAM (flux-form conservative semi-Lagrangian multi-tracer transport scheme) and its two simplifications on an icosahedral grid is presented. Tracer transport is one of the main building blocks in atmospheric models and hence its performance greatly determines the overall performance of the model. FF-CSLAM falls in the category of arbitrary Lagrangian Eulerian (ALE) schemes. It exploits the finite volume formulation and therefore it is inherently conservative. Flux areas through edges are approximated with great circle arcs in an upwind fashion. Bi-quadratic sub-grid scale reconstructions using a weighted least-squares method are employed to approximate the tracer field. Area integrals on the overlap region of the flux area and the static Eulerian mesh are evaluated via line integrals. A brief description of the implementation of FF-CSLAM on icosahedral-hexagonal meshes, along with its numerical accuracy on standard test cases, will be presented. A comparative analysis of the computational overhead is necessary to assess the suitability of FF-CSLAM for massively parallel and multi-threading computer architectures in comparison to other advection schemes implemented on icosahedral grids. The main focus of this work is to present the implementation of the shared memory parallelization and to describe the memory access pattern of the numerical scheme. FF-CSLAM is a remap-type advection scheme, thus extra calculations are done in comparison to the other advection schemes. The additional computations are associated with the search required to find the overlap area between the area swept through the edge and the underlying grid. However, the experiments show that the associated computational overhead is minimal for multi-tracer transport. It will be shown that for Courant numbers less than one, the FF-CSLAM computations are not expensive. Since the grid cells are arranged in…

  11. Modeling and analysis of a high-performance midwave infrared panoramic periscope

    NASA Astrophysics Data System (ADS)

    Nichols, Jonathan M.; Waterman, James R.; Menon, Raghu; Devitt, John

    2010-11-01

    A high-resolution midwave infrared panoramic periscope sensor system has been developed. The sensor includes an f/2.5 catadioptric optical system that provides a field of view with 360-deg horizontal azimuth and -10- to +30-deg elevation without requiring moving components (e.g., rotating mirrors). The focal plane is a 2048×2048, 15-μm-pitch InSb detector operating at 80 K. An onboard thermoelectric reference source allows for real-time nonuniformity correction using the two-point correction method. The entire system (detector-Dewar assembly, cooler, electronics, and optics) is packaged to fit in an 8-in.-high, 6.5-in.-diameter volume. This work describes both the system optics and the electronics and presents sample imagery. We model both the sensor's radiometric performance, quantified by the noise-equivalent temperature difference, and its resolution performance. Model predictions are then compared with estimates obtained from experimental data. The ability of the system to resolve targets as a function of imaged spatial frequency is also presented.

  12. Fully automated high-performance liquid chromatographic assay for the analysis of free catecholamines in urine.

    PubMed

    Said, R; Robinet, D; Barbier, C; Sartre, J; Huguet, C

    1990-08-24

    A totally automated and reliable high-performance liquid chromatographic method is described for the routine determination of free catecholamines (norepinephrine, epinephrine and dopamine) in urine. The catecholamines were isolated from urine samples using small alumina columns. A standard automated method for pH adjustment of urine before the extraction step has been developed. The extraction was performed on an ASPEC (Automatic Sample Preparation with Extraction Columns, Gilson). The eluate was collected in a separate tube and then automatically injected into the chromatographic column. The catecholamines were separated by reversed-phase ion-pair liquid chromatography and quantified by fluorescence detection. No manual intervention was required during the extraction and separation procedure. One sample may be run every 15 min, ca. 96 samples in 24 h. Analytical recoveries for all three catecholamines are 63-87%, and the detection limits are 0.01, 0.01, and 0.03 microM for norepinephrine, epinephrine and dopamine, respectively, which is highly satisfactory for urine. Day-to-day coefficients of variation were less than 10%. PMID:2277100

  13. Electrochemical detection coupled with high-performance liquid chromatography in pharmaceutical and biomedical analysis: a mini review.

    PubMed

    Wang, Chengyin; Xu, Jianyun; Zhou, Guiyou; Qu, Qishu; Yang, Gongjun; Hu, Xiaoya

    2007-08-01

    Recent advances in electrochemical detection techniques coupled with high-performance liquid chromatography (HPLC-ECD) in pharmaceutical and biomedical analysis are reviewed. ECD classifications and modes, including common amperometric, coulometric, conductimetric, and potentiometric detectors, are outlined, and some typical examples of determinations in pharmaceutical and biomedical analysis are described. The electrochemical detection system can offer superior merits over other detectors commonly used with HPLC. These techniques have great potential owing to their prominent characteristics in high-throughput screening procedures of drugs in various matrices. Sixty-seven fundamental references from the last 5 years related to the field are cited in this review. PMID:17979637

  14. High-Performance Computational Analysis of Glioblastoma Pathology Images with Database Support Identifies Molecular and Survival Correlates

    PubMed Central

    Kong, Jun; Wang, Fusheng; Teodoro, George; Cooper, Lee; Moreno, Carlos S.; Kurc, Tahsin; Pan, Tony; Saltz, Joel; Brat, Daniel

    2014-01-01

    In this paper, we present a novel framework for microscopic image analysis of nuclei, data management, and high performance computation to support translational research involving nuclear morphometry features, molecular data, and clinical outcomes. Our image analysis pipeline consists of nuclei segmentation and feature computation facilitated by high performance computing with coordinated execution in multi-core CPUs and Graphical Processor Units (GPUs). All data derived from image analysis are managed in a spatial relational database supporting highly efficient scientific queries. We applied our image analysis workflow to 159 glioblastomas (GBM) from The Cancer Genome Atlas dataset. With integrative studies, we found that statistics of four specific nuclear features were significantly associated with patient survival. Additionally, we correlated nuclear features with molecular data and found interesting results that support pathologic domain knowledge. We found that Proneural subtype GBMs had the smallest mean of nuclear Eccentricity and the largest means of nuclear Extent and MinorAxisLength. We also found gene expressions of the stem cell marker MYC and the cell proliferation marker MKI67 were correlated with nuclear features. To complement and inform pathologists of relevant diagnostic features, we queried the most representative nuclear instances from each patient population based on genetic and transcriptional classes. Our results demonstrate that specific nuclear features carry prognostic significance and associations with transcriptional and genetic classes, highlighting the potential of high throughput pathology image analysis as a complementary approach to human-based review and translational research. PMID:25098236
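
    The named nuclear features (Eccentricity, Extent, MinorAxisLength) correspond to standard region properties of a segmented object. The sketch below computes them with scikit-image from a toy labelled mask; the mask is a placeholder standing in for the output of the actual GPU-accelerated segmentation pipeline.

```python
# Sketch of per-nucleus morphometry features (eccentricity, extent, minor axis
# length) from a labelled mask using scikit-image; the toy mask is a stand-in
# for the real segmentation output.
import numpy as np
from skimage.measure import label, regionprops

mask = np.zeros((64, 64), dtype=np.uint8)
mask[10:30, 12:26] = 1                      # two fake "nuclei"
mask[40:55, 35:60] = 1

for region in regionprops(label(mask)):
    print(region.label,
          round(region.eccentricity, 3),
          round(region.extent, 3),
          round(region.minor_axis_length, 2))
```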

  15. An analysis of high school students' perceptions and academic performance in laboratory experiences

    NASA Astrophysics Data System (ADS)

    Mirchin, Robert Douglas

    This research study is an investigation of student-laboratory (i.e., lab) learning based on students' perceptions of experiences using questionnaire data and evidence of their science-laboratory performance based on paper-and-pencil assessments using Maryland-mandated criteria, Montgomery County Public Schools (MCPS) criteria, and published laboratory questions. A 20-item questionnaire consisting of 18 Likert-scale items and 2 open-ended items that addressed what students liked most and least about lab was administered to students before labs were observed. A pre-test and post-test assessing laboratory achievement were administered before and after the laboratory experiences. The three labs observed were: soda distillation, stoichiometry, and separation of a mixture. Five significant results or correlations were found. For soda distillation, there were two positive correlations. Student preference for analyzing data was positively correlated with achievement on the data analysis dimension of the lab rubric. A student preference for using numbers and graphs to analyze data was positively correlated with achievement on the analysis dimension of the lab rubric. For the separating a mixture lab data the following pairs of correlations were significant. Student preference for doing chemistry labs where numbers and graphs were used to analyze data had a positive correlation with writing a correctly worded hypothesis. Student responses that lab experiences help them learn science positively correlated with achievement on the data dimension of the lab rubric. The only negative correlation found related to the first result where students' preference for computers was inversely correlated to their performance on analyzing data on their lab report. Other findings included the following: students like actual experimental work most and the write-up and analysis of a lab the least. It is recommended that lab science instruction be inquiry-based, hands-on, and that students be

  16. Analysis of lipophilic pigments from a phototrophic microbial mat community by high performance liquid chromatography

    NASA Technical Reports Server (NTRS)

    Palmisano, A. C.; Cronin, S. E.; Des Marais, D. J.

    1988-01-01

    An assay for lipophilic pigments in phototrophic microbial mat communities using reverse-phase high performance liquid chromatography was developed that allows the separation of 15 carotenoids and chloropigments in a single 30 min program. Lipophilic pigments in a laminated mat from a commercial salina near Laguna Guerrero Negro, Baja California Sur, Mexico, reflected their source organisms. Myxoxanthophyll, echinenone, canthaxanthin, and zeaxanthin were derived from cyanobacteria; chlorophyll c and fucoxanthin from diatoms; chlorophyll a from cyanobacteria and diatoms; bacteriochlorophylls a and c, bacteriophaeophytin a, and gamma-carotene from Chloroflexus spp.; and beta-carotene from a variety of phototrophs. Sensitivity of detection was 0.6-6.1 ng for carotenoids and 1.7-12 ng for most chloropigments. This assay represents a significant improvement over previous analyses of lipophilic pigments in microbial mats and promises to have a wider application to other types of phototrophic communities.

  17. Experimental and theoretical performance analysis for a CMOS-based high resolution image detector

    NASA Astrophysics Data System (ADS)

    Jain, Amit; Bednarek, Daniel R.; Rudin, Stephen

    2014-03-01

    Increasing complexity of endovascular interventional procedures requires superior x-ray imaging quality. Present state-of-the-art x-ray imaging detectors may not be adequate due to their inherent noise and resolution limitations. With recent developments, CMOS-based detectors present an option to fulfill the need for better image quality. For this work, a new CMOS detector has been analyzed experimentally and theoretically in terms of sensitivity, MTF and DQE. The detector (Dexela Model 1207, Perkin-Elmer Co., London, UK) features 14-bit image acquisition, a CsI phosphor, 75 μm pixels and an active area of 12 cm x 7 cm with a frame rate of over 30 fps. This detector has two modes of operation with different full-well capacities: high sensitivity and low sensitivity. The sensitivity and instrumentation noise equivalent exposure (INEE) were calculated for both modes. The detector modulation transfer function (MTF), noise power spectrum (NPS) and detective quantum efficiency (DQE) were measured using an RQA5 spectrum. For the theoretical performance evaluation, a linear cascade model with an added aliasing stage was used. The detector showed excellent linearity in both modes. The sensitivity and the INEE of the detector were found to be 31.55 DN/μR and 0.55 μR in high-sensitivity mode, and 9.87 DN/μR and 2.77 μR in low-sensitivity mode. The theoretical and experimental values for the MTF and DQE showed close agreement, with good DQE even at fluoroscopic exposure levels. In summary, the Dexela detector's imaging performance in terms of sensitivity, linear-system metrics, and INEE demonstrates that it can overcome the noise and resolution limitations of present state-of-the-art x-ray detectors.
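    For reference, the DQE reported in such detector characterizations is commonly computed from the measured MTF and NPS using the standard linear-systems relation below (a hedged reminder of the usual form, not a quotation of this paper's cascade model, which additionally includes gain and aliasing stages):

    $$ \mathrm{DQE}(f) = \frac{\bar{d}^{\,2}\,\mathrm{MTF}^{2}(f)}{\bar{q}\,\mathrm{NPS}(f)} $$

    where d̄ is the mean linearized detector signal for the exposure used, q̄ the incident photon fluence (photons per unit area) of the RQA5 beam, and NPS(f) the measured noise power spectrum.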

  18. High precision carbon-interspaced antiscatter grids: Performance testing and moiré pattern analysis

    NASA Astrophysics Data System (ADS)

    Lee, S. J.; Cho, H. S.; Oh, J. E.; Choi, S. I.; Cho, H. M.; Park, Y. O.; Hong, D. K.; Lee, M. S.; Yang, Y. J.; Je, U. K.; Kim, D. S.; Lee, H. K.

    2011-10-01

    Recently, we have developed high precision carbon-interspaced antiscatter grids suitable for digital radiography (DR), adopting a precise sawing process. For a systematic evaluation of grid performance, we prepared several sample grids having different grid frequencies (4.0-8.5 lines/mm) and grid ratios (5:1-10:1) and established a well-controlled test condition based upon the IEC standard. In this paper, we present the performance characteristics of the carbon-interspaced grids in terms of the transmission of primary radiation (Tp), the transmission of scattered radiation (Ts), the transmission of total radiation (Tt), the contrast improvement factor (Cif), and the Bucky factor (B). We also describe the grid line artifact, known as a moiré pattern, which may be the most critical problem to be solved for successful grid use in DR. We examined the factors that affect the moiré pattern by integrating the sample grids with an a-Se based flat panel detector having a 139 μm × 139 μm pixel size.
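    The quantities listed are conventionally related by the standard grid-characterization definitions below (assuming the usual IEC-style definitions; the paper's exact conventions may differ slightly):

    $$ C_{if} = \frac{T_p}{T_t}, \qquad B = \frac{1}{T_t} $$

    i.e., the contrast improvement factor compares primary transmission to total transmission, and the Bucky factor is the dose penalty implied by the loss of total transmitted radiation.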

  19. Geostatistical analysis of Landsat-TM lossy compression images in a high-performance computing environment

    NASA Astrophysics Data System (ADS)

    Pesquer, Lluís; Cortés, Ana; Serral, Ivette; Pons, Xavier

    2011-11-01

    The main goal of this study is to characterize the effects of lossy image compression procedures on the spatial patterns of remotely sensed images, as well as to test the performance of job distribution tools specifically designed for obtaining geostatistical parameters (variogram) in a High Performance Computing (HPC) environment. For this purpose, radiometrically and geometrically corrected Landsat-5 TM images from April, July, August and September 2006 were compressed using two different methods: Band-Independent Fixed-Rate (BIFR) and three-dimensional Discrete Wavelet Transform (3d-DWT) applied to the JPEG 2000 standard. For both methods, a wide range of compression ratios (2.5:1, 5:1, 10:1, 50:1, 100:1, 200:1 and 400:1, from soft to hard compression) was compared. Variogram analyses conclude that all compression ratios maintain the variogram shapes and that the higher ratios (more than 100:1) reduce the variance (sill parameter) by about 5%. Moreover, the parallel solution in a distributed environment demonstrates that HPC offers a suitable scientific test bed for time-demanding execution processes, such as geostatistical analyses of remote sensing images.
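    To make the variogram statistic concrete, the sketch below computes an empirical semivariogram for one image band. It is a minimal illustration under stated assumptions (single 2-D band, integer pixel lags along one direction only), not the study's distributed implementation.

```python
# Minimal sketch of the empirical semivariogram whose sill the study tracks
# across compression ratios. Assumptions: one band as a 2-D array, row-direction
# lags only; the real analysis was distributed over an HPC cluster.
import numpy as np

def semivariogram(z: np.ndarray, max_lag: int = 50):
    """gamma(h) = 0.5 * mean[(z(x) - z(x+h))^2] for integer pixel lags h."""
    lags = np.arange(1, max_lag + 1)
    gamma = np.empty(len(lags))
    for i, h in enumerate(lags):
        diff = z[:, h:] - z[:, :-h]                  # pixel pairs separated by h
        gamma[i] = 0.5 * np.mean(diff.astype(np.float64) ** 2)
    return lags, gamma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    band = rng.normal(size=(512, 512))               # stand-in for a Landsat-TM band
    lags, gamma = semivariogram(band, max_lag=20)
    print(gamma[:5])                                  # sill ~ variance for white noise
```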

  20. Towards Real-Time High Performance Computing For Power Grid Analysis

    SciTech Connect

    Hui, Peter SY; Lee, Barry; Chikkagoudar, Satish

    2012-11-16

    Real-time computing has traditionally been considered largely in the context of single-processor and embedded systems, and indeed, the terms real-time computing, embedded systems, and control systems are often mentioned in closely related contexts. However, real-time computing in the context of multinode systems, specifically high-performance, cluster-computing systems, remains relatively unexplored. Imposing real-time constraints on a parallel (cluster) computing environment introduces a variety of challenges with respect to the formal verification of the system's timing properties. In this paper, we give a motivating example that demonstrates the need for such a system, an application to estimate the electromechanical states of the power grid, and we introduce a formal method for verifying certain temporal properties within a system of parallel processes. We describe our work towards a full real-time implementation of the target application: our progress towards extracting a key mathematical kernel from the application, the formal process by which we analyze the intricate timing behavior of the processes on the cluster, and timing measurements taken on our test cluster to demonstrate the use of these concepts.

  1. High performance polymeric foams

    SciTech Connect

    Gargiulo, M.; Sorrentino, L.; Iannace, S.

    2008-08-28

    The aim of this work was to investigate the foamability of high-performance polymers (polyethersulfone, polyphenylsulfone, polyetherimide and polyethylene naphthalate). Two different methods were used to prepare the foam samples: high-temperature expansion and a two-stage batch process. The effects of processing parameters (saturation time and pressure, foaming temperature) on the densities and microcellular structures of these foams were analyzed using scanning electron microscopy.

  2. Analysis of Metabolomics Datasets with High-Performance Computing and Metabolite Atlases

    PubMed Central

    Yao, Yushu; Sun, Terence; Wang, Tony; Ruebel, Oliver; Northen, Trent; Bowen, Benjamin P.

    2015-01-01

    Even with the widespread use of liquid chromatography mass spectrometry (LC/MS) based metabolomics, there are still a number of challenges facing this promising technique. Many diverse experimental workflows exist, yet there is a lack of infrastructure and systems for tracking and sharing information. Here, we describe the Metabolite Atlas framework and interface, which provides highly efficient, web-based access to raw mass spectrometry data in concert with assertions about the chemicals detected, to help address some of these challenges. This integration, by design, enables experimentalists to explore their raw data and to specify and refine feature annotations so that they can be leveraged for future experiments. Queries of the data through the web are served by SciDB, a parallelized database for high performance computing, which keeps the process fast. By using scripting containers such as IPython or Jupyter to analyze the data, scientists can utilize a wide variety of freely available graphing, statistics, and information management resources. In addition, the interfaces facilitate integration with systems biology tools to ultimately link metabolomics data with biological models. PMID:26287255
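    The core idea of matching an "atlas" of asserted compounds against raw LC/MS points can be illustrated generically as below. This is a pandas toy only, not the Metabolite Atlas or SciDB API; all column names, compound labels and tolerance values are assumptions.

```python
# Generic illustration (NOT the Metabolite Atlas/SciDB API): select raw LC/MS
# points falling inside the m/z and retention-time windows asserted for each
# compound in a small "atlas" table. All names and values are hypothetical.
import pandas as pd

raw = pd.DataFrame({                 # raw centroided points: m/z, RT (min), intensity
    "mz": [180.0634, 180.0641, 203.0526, 300.1000],
    "rt": [2.31, 2.35, 5.02, 7.70],
    "intensity": [1.2e5, 9.8e4, 4.4e4, 1.0e3],
})

atlas = pd.DataFrame({               # asserted compounds with tolerance windows
    "name": ["compound_A", "compound_B"],
    "mz": [180.0634, 203.0526],
    "mz_tol": [0.01, 0.01],
    "rt_min": [2.0, 4.5],
    "rt_max": [2.6, 5.5],
})

for _, cpd in atlas.iterrows():
    hits = raw[(raw.mz - cpd.mz).abs() <= cpd.mz_tol]
    hits = hits[(hits.rt >= cpd.rt_min) & (hits.rt <= cpd.rt_max)]
    print(cpd["name"], "->", len(hits), "matching points")
```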

  3. Trace analysis of endectocides in milk by high performance liquid chromatography with fluorescence detection.

    PubMed

    Cerkvenik-Flajs, Vesna; Milcinski, Luka; Süssinger, Adica; Hodoscek, Lena; Danaher, Martin; Antonić, Jan

    2010-03-24

    An analytical method has been developed for the simultaneous determination of the following endectocide drugs in milk: ivermectin, abamectin, doramectin, moxidectin, eprinomectin, emamectin and nemadectin. Samples were extracted with acetonitrile, purified by solid-phase extraction on reversed-phase C8, derivatised with N-methylimidazole, trifluoroacetic anhydride and acetic acid to a stable fluorescent derivative, and further analysed by gradient high performance liquid chromatography (HPLC) on an endcapped reversed-phase Supelcosil LC-8-DB column. The derivatisation step was mathematically optimised and the method was validated according to the requirements of Commission Decision 2002/657/EC, using fortified raw bovine milk. Mean recovery was between 78 and 98%. The repeatability (CVr) and within-laboratory reproducibility (CVW) ranged from 4.6 to 13.4% and from 6.6 to 14.5%, respectively. Decision limits (CCα) for the analytes with MRL values, namely eprinomectin and moxidectin, were determined to be 24.8 and 50.6 μg kg⁻¹, respectively. CCα values for unauthorised endectocides ranged from 0.1 to 0.2 μg kg⁻¹. Owing to its high acceptability with respect to the required criteria and its applicability to ovine and caprine milk, which gave similar results, this multi-analyte method has been successfully implemented in pharmacokinetic research studies as well as statutory residue monitoring in Slovenia. PMID:20206006

  4. Analysis of Metabolomics Datasets with High-Performance Computing and Metabolite Atlases.

    PubMed

    Yao, Yushu; Sun, Terence; Wang, Tony; Ruebel, Oliver; Northen, Trent; Bowen, Benjamin P

    2015-01-01

    Even with the widespread use of liquid chromatography mass spectrometry (LC/MS) based metabolomics, there are still a number of challenges facing this promising technique. Many diverse experimental workflows exist, yet there is a lack of infrastructure and systems for tracking and sharing information. Here, we describe the Metabolite Atlas framework and interface, which provides highly efficient, web-based access to raw mass spectrometry data in concert with assertions about the chemicals detected, to help address some of these challenges. This integration, by design, enables experimentalists to explore their raw data and to specify and refine feature annotations so that they can be leveraged for future experiments. Queries of the data through the web are served by SciDB, a parallelized database for high performance computing, which keeps the process fast. By using scripting containers such as IPython or Jupyter to analyze the data, scientists can utilize a wide variety of freely available graphing, statistics, and information management resources. In addition, the interfaces facilitate integration with systems biology tools to ultimately link metabolomics data with biological models. PMID:26287255

  5. POPE: A distributed query system for high performance analysis of very large persistent object stores

    SciTech Connect

    Fischler, M.S.; Isely, M.C.; Nigri, A.M.; Rinaldo, F.J.

    1996-01-01

    Analysis of large physics data sets is a major computing task at Fermilab. One step in such an analysis involves culling "interesting" events via the use of complex query criteria. What makes this unusual is the scale required: hundreds of gigabytes of event data must be scanned at tens of megabytes per second for the typical queries that are applied, and data must be extracted from tens of terabytes based on the result of the query. The Physics Object Persistency Manager (POPM) system is a solution tailored to this scale of problem. A running POPM environment can support multiple queries in progress, each scanning at rates exceeding 10 megabytes per second, all of which share access to a very large persistent address space distributed across multiple disks on multiple hosts. Specifically, POPM employs the following techniques to permit this scale of performance and access. Persistent objects: experimental data to be scanned are "populated" as a data structure into the persistent address space supported by POPM; C++ classes with a few key overloaded operators provide nearly transparent semantics for access to the persistent storage. Distributed and parallel I/O: the persistent address space is automatically distributed across the disks of multiple "I/O nodes" within the POPM system; a striping-unit concept is implemented in POPM, permitting fast parallel I/O across the storage nodes, even for small single queries. Efficient shared access: POPM implements an efficient mechanism for arbitration and multiplexing of I/O access among multiple queries on the same or separate compute nodes.
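    The striping-unit idea can be sketched as a simple address mapping: a byte offset in the persistent address space determines an I/O node and a node-local offset. The sketch below is conceptual only (not POPM code); the stripe size and node count are hypothetical.

```python
# Conceptual sketch of round-robin striping (not the POPM implementation):
# map a persistent-address offset to an I/O node and a local offset.
STRIPE_UNIT = 1 << 20          # 1 MiB striping unit (hypothetical value)
N_IO_NODES = 8                 # number of I/O nodes holding the address space

def locate(offset: int):
    stripe_index = offset // STRIPE_UNIT
    node = stripe_index % N_IO_NODES              # round-robin across I/O nodes
    local_stripe = stripe_index // N_IO_NODES     # which stripe on that node
    within = offset % STRIPE_UNIT
    return node, local_stripe * STRIPE_UNIT + within

if __name__ == "__main__":
    for off in (0, STRIPE_UNIT - 1, STRIPE_UNIT, 5 * STRIPE_UNIT + 123):
        print(off, "->", locate(off))
```

Because consecutive stripes land on different nodes, even a single sequential query reads from all nodes in parallel, which is the point of the striping unit described above.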

  6. High performance parallel architectures

    SciTech Connect

    Anderson, R.E.

    1989-09-01

    In this paper the author describes current high performance parallel computer architectures. A taxonomy is presented to show computer architecture from the user/programmer's point of view. The effects of the taxonomy upon the programming model are described. Some current architectures are described with respect to the taxonomy. Finally, some predictions about future systems are presented. 5 refs., 1 fig.

  7. High-Performance Happy

    ERIC Educational Resources Information Center

    O'Hanlon, Charlene

    2007-01-01

    Traditionally, the high-performance computing (HPC) systems used to conduct research at universities have amounted to silos of technology scattered across the campus and falling under the purview of the researchers themselves. This article reports that a growing number of universities are now taking over the management of those systems and…

  8. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    NASA Technical Reports Server (NTRS)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms require access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters and graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project now enables scientists to solve demanding data analysis problems in IDL that previously required specialized software, and it allows those problems to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution times for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the
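    The tools themselves are IDL-based; as a language-neutral analogue, the sketch below shows the scatter/compute/gather pattern that a task farm such as TaskDL or mpiDL embodies, written with Python and mpi4py. It is an assumption-laden illustration of the pattern, not the mpiDL or TaskDL API.

```python
# Analogous task-farm pattern in Python with mpi4py (not the mpiDL/TaskDL API):
# the root rank scatters independent work items and gathers the results.
# Run with, e.g.: mpiexec -n 4 python task_farm.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    work = [list(range(i, i + 1000)) for i in range(size)]   # one chunk per rank
else:
    work = None

chunk = comm.scatter(work, root=0)       # each rank receives its chunk
partial = sum(x * x for x in chunk)      # independent per-rank computation
results = comm.gather(partial, root=0)   # root collects the partial results

if rank == 0:
    print("total =", sum(results))
```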

  9. High Performance, Dependable Multiprocessor

    NASA Technical Reports Server (NTRS)

    Ramos, Jeremy; Samson, John R.; Troxel, Ian; Subramaniyan, Rajagopal; Jacobs, Adam; Greco, James; Cieslewski, Grzegorz; Curreri, John; Fischer, Michael; Grobelny, Eric; George, Alan; Aggarwal, Vikas; Patel, Minesh; Some, Raphael

    2006-01-01

    With the ever-increasing demand for higher bandwidth and processing capacity in today's space exploration, space science, and defense missions, the ability to efficiently apply commercial-off-the-shelf (COTS) processors for on-board computing is now a critical need. In response to this need, NASA's New Millennium Program office has commissioned the development of Dependable Multiprocessor (DM) technology for use in payload and robotic missions. The Dependable Multiprocessor technology is a COTS-based, power-efficient, high-performance, highly dependable, fault-tolerant cluster computer. To date, Honeywell has successfully demonstrated a TRL4 prototype of the Dependable Multiprocessor [1], and is now working on the development of a TRL5 prototype. For the present effort Honeywell has teamed up with the University of Florida's High-performance Computing and Simulation (HCS) Lab, and together the team has demonstrated major elements of the Dependable Multiprocessor TRL5 system.

  10. HX-MS2 for high performance conformational analysis of complex protein states

    PubMed Central

    Burns, Kyle M; Sarpe, Vladimir; Wagenbach, Mike; Wordeman, Linda; Schriemer, David C

    2015-01-01

    Water-mediated hydrogen exchange (HX) processes involving the protein main chain are sensitive to structural dynamics and molecular interactions. Measuring deuterium uptake in amide bonds provides information on conformational states, structural transitions and binding events. Increasingly, deuterium levels are measured by mass spectrometry (MS) from proteolytically generated peptide fragments of large molecular systems. However, this bottom-up method has limited spectral capacity and requires a burdensome manual validation exercise, both of which restrict analysis of protein systems to generally less than 150 kDa. In this study, we present a bottom-up HX-MS2 method that improves peptide identification rates, localizes high-quality HX data and simplifies validation. The method combines a new peptide scoring algorithm (WUF, weighted unique fragment) with data-independent acquisition of peptide fragmentation data. Scoring incorporates the validation process and emphasizes identification accuracy. The HX-MS2 method is illustrated using data from a conformational analysis of microtubules treated with dimeric kinesin MCAK. When compared to a conventional Mascot-driven HX-MS method, HX-MS2 produces two-fold higher α/β-tubulin sequence depth at a peptide utilization rate of 74%. A Mascot approach delivers a utilization rate of 44%. The WUF score can be constrained by false utilization rate (FUR) calculations to return utilization values exceeding 90% without serious data loss, indicating that automated validation should be possible. The HX-MS2 data confirm that N-terminal MCAK domains anchor kinesin force generation in kinesin-mediated depolymerization, while the C-terminal tails regulate MCAK-tubulin interactions. PMID:26009873

  11. Ion-pair ultra-high performance liquid chromatographic analysis of monoamines: peak-splitting at high flow rates.

    PubMed

    Van Schoors, Jolien; Brouwer, Hendrik-Jan; Maes, Katrien; Michotte, Yvette; Van Eeckhaut, Ann

    2013-12-20

    The use of ion-pair ultra-high performance liquid chromatography (UHPLC) coupled with electrochemical detection (ECD) is of great interest for the fast and sensitive determination of the monoamine neurotransmitters dopamine, noradrenaline and serotonin in microdialysis samples. However, when applying high flow rates in ion-pair UHPLC, peaks other than the initial compound peaks appear in the chromatogram. This peak-splitting phenomenon is caused by disturbed ion-pair retention mechanisms. The influence of several chromatographic parameters was investigated. Peak-splitting is delayed to higher flow rates when the concentration of ion-pair reagent or buffering agent in the mobile phase is increased, when the percentage of organic modifier in the mobile phase is decreased, when a stationary phase with a smaller amount of packing material is applied, or when the separation temperature is increased. One or a combination of these conditions can be applied to analyze the monoamine neurotransmitters using ion-pair UHPLC-ECD at high flow rates. PMID:24238712

  12. Modeling and analysis of transient vehicle underhood thermo- hydrodynamic events using computational fluid dynamics and high performance computing.

    SciTech Connect

    Tentner, A.; Froehle, P.; Wang, C.; Nuclear Engineering Division

    2004-01-01

    This work has explored the preliminary design of a Computational Fluid Dynamics (CFD) tool for the analysis of transient vehicle underhood thermo-hydrodynamic events using high performance computing platforms. The goal of this tool will be to extend the capabilities of an existing, established CFD code, STAR-CD, allowing car manufacturers to analyze the impact of transient operational events on underhood thermal management by exploiting the computational efficiency of modern high performance computing systems. In particular, the project has focused on the CFD modeling of the radiator behavior during a specified transient. The 3-D radiator calculations were performed using STAR-CD, which can perform both steady-state and transient calculations, on the cluster computer available at ANL in the Nuclear Engineering Division. Specified transient boundary conditions, based on experimental data provided by Adapco and DaimlerChrysler, were used. The possibility of using STAR-CD in a transient mode for the entire period of time analyzed has been compared with other strategies that involve the use of STAR-CD in a steady-state mode at specified time intervals, while transient heat transfer calculations would be performed for the rest of the time. The results of these calculations have been compared with the experimental data provided by Adapco/DaimlerChrysler, and recommendations for future development of an optimal strategy for the CFD modeling of transient thermo-hydrodynamic events have been made. The results of this work open the way for the development of a CFD tool for the transient analysis of underhood thermo-hydrodynamic events, which will allow the integrated transient thermal analysis of the entire cooling system, including both the engine block and the radiator, on high performance computing systems.

  13. Modeling and analysis of transient vehicle underhood thermo - hydrodynamic events using computational fluid dynamics and high performance computing.

    SciTech Connect

    Froehle, P.; Tentner, A.; Wang, C.

    2003-09-05

    This work has explored the preliminary design of a Computational Fluid Dynamics (CFD) tool for the analysis of transient vehicle underhood thermo-hydrodynamic events using high performance computing platforms. The goal of this tool will be to extend the capabilities of an existing, established CFD code, STAR-CD, allowing car manufacturers to analyze the impact of transient operational events on underhood thermal management by exploiting the computational efficiency of modern high performance computing systems. In particular, the project has focused on the CFD modeling of the radiator behavior during a specified transient. The 3-D radiator calculations were performed using STAR-CD, which can perform both steady-state and transient calculations, on the cluster computer available at ANL in the Nuclear Engineering Division. Specified transient boundary conditions, based on experimental data provided by Adapco and DaimlerChrysler, were used. The possibility of using STAR-CD in a transient mode for the entire period of time analyzed has been compared with other strategies that involve the use of STAR-CD in a steady-state mode at specified time intervals, while transient heat transfer calculations would be performed for the rest of the time. The results of these calculations have been compared with the experimental data provided by Adapco/DaimlerChrysler, and recommendations for future development of an optimal strategy for the CFD modeling of transient thermo-hydrodynamic events have been made. The results of this work open the way for the development of a CFD tool for the transient analysis of underhood thermo-hydrodynamic events, which will allow the integrated transient thermal analysis of the entire cooling system, including both the engine block and the radiator, on high performance computing systems.

  14. Osiris: A Modern, High-Performance, Coupled, Multi-Physics Code For Nuclear Reactor Core Analysis

    SciTech Connect

    Procassini, R J; Chand, K K; Clouse, C J; Ferencz, R M; Grandy, J M; Henshaw, W D; Kramer, K J; Parsons, I D

    2007-02-26

    To meet the simulation needs of the GNEP program, LLNL is leveraging a suite of high-performance codes to be used in the development of a multi-physics tool for modeling nuclear reactor cores. The Osiris code project, which began last summer, is employing modern computational science techniques in the development of the individual physics modules and the coupling framework. Initial development is focused on coupling thermal-hydraulics and neutral-particle transport, while later phases of the project will add thermal-structural mechanics and isotope depletion. Osiris will be applicable to the design of existing and future reactor systems through the use of first-principles, coupled physics models with fine-scale spatial resolution in three dimensions and fine-scale particle-energy resolution. Our intent is to replace an existing set of legacy, serial codes which require significant approximations and assumptions, with an integrated, coupled code that permits the design of a reactor core using a first-principles physics approach on a wide range of computing platforms, including the world's most powerful parallel computers. A key research activity of this effort deals with the efficient and scalable coupling of physics modules which utilize rather disparate mesh topologies. Our approach allows each code module to use a mesh topology and resolution that is optimal for the physics being solved, and employs a mesh-mapping and data-transfer module to effect the coupling. Additional research is planned in the area of scalable, parallel thermal-hydraulics, high-spatial-accuracy depletion and coupled-physics simulation using Monte Carlo transport.

  15. High performance steam development

    SciTech Connect

    Duffy, T.; Schneider, P.

    1995-12-31

    DOE has launched a program to make a step change in power plant steam conditions to 1500°F, since the highest possible performance gains can be achieved in a 1500°F steam system when a topping turbine is used with a back-pressure steam turbine for cogeneration. A 500-hour proof-of-concept steam generator test module was designed, fabricated, and successfully tested. It has four once-through steam generator circuits. The complete HPSS (high performance steam system) was tested above 1500°F and 1500 psig for over 102 hours at full power.

  16. Analysis of Phase Separation in High Performance PbTe–PbS Thermoelectric Materials

    SciTech Connect

    Girard, Steven N.; Schmidt-Rohr, Klaus; Chasapis, Thomas C.; Hatzikraniotis, Euripides; Njegic, B.; Levin, E. M.; Rawal, A.; Paraskevopoulos, Konstantios M.; Kanatzidis, Mercouri G.

    2013-02-11

    Phase immiscibility in PbTe-based thermoelectric materials is an effective means of top-down synthesis of nanostructured composites exhibiting low lattice thermal conductivities. PbTe1-xSx thermoelectric materials can be synthesized as metastable solid-solution alloys through rapid quenching. Subsequent post-annealing induces phase separation at the nanometer scale, producing nanostructures that increase phonon scattering and reduce lattice thermal conductivity. However, there has not yet been any study investigating in detail the local chemical structure of both the solid-solution and nanostructured variants of this material system. Herein, quenched and annealed (i.e., solid-solution and phase-separated) samples of PbTe–PbS are analyzed by in situ high-resolution synchrotron powder X-ray diffraction, solid-state 125Te nuclear magnetic resonance (NMR), and infrared (IR) spectroscopy. For high concentrations of PbS in PbTe, e.g., x > 16%, NMR and IR analyses reveal that rapidly quenched samples exhibit incipient phase separation that is not detected by state-of-the-art synchrotron X-ray diffraction, providing an example of a PbTe thermoelectric “alloy” that is in fact phase inhomogeneous. Thermally induced PbS phase separation in PbTe–PbS occurs close to 200 °C for all compositions studied, and the solubility of the PbS phase in PbTe at elevated temperatures >500 °C is reported. The findings of this study suggest that there may be a large number of thermoelectric alloy systems that are phase inhomogeneous or nanostructured despite adherence to Vegard's Law of alloys, highlighting the importance of careful chemical characterization to differentiate between thermoelectric alloys and composites.
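    For reference, Vegard's law as invoked above is the standard linear interpolation of the alloy lattice parameter between the end members,

    $$ a(\mathrm{PbTe}_{1-x}\mathrm{S}_x) = (1-x)\,a_{\mathrm{PbTe}} + x\,a_{\mathrm{PbS}} $$

    so a diffraction pattern whose lattice parameter follows this linear trend does not by itself rule out nanoscale phase inhomogeneity, which is the point the study makes.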

  17. Performance analysis of a new biolistic gun using high power laser irradiation

    NASA Astrophysics Data System (ADS)

    Han, Tae-Hee; Lee, Hyunhee; Choi, Soojin; Gojani, Ardian B.; Yoh, Jack J.

    2010-11-01

    Impingement of a high power laser pulse (above 10⁹ W/cm²) on a metal foil causes ablation, which is characterized by a rapid expulsion of matter and initiation of a strong shock wave inside the solid metal. The shock propagates through the foil and reverberates on the rear side, causing instant deformation of the foil, whose surface is treated with micro particles prior to ablation. Based on this principle of micro particle ejection, we develop a new biolistic gun with improved controllability, stability, and efficiency compared with our previous system, and perform characterization of the penetration shapes at varying confinements and energy levels. The confinement media include BK7 glass, water, and succulent jelly (ultrasound gel). Biological tissue was replicated by a gelatin-water solution at a 3% weight ratio. The present data show that the confinement effect results in a conspicuous enhancement of the penetration reached by 5 μm cobalt micro particles. Also, there exists an optimal thickness at each energy level when using liquid confinement for enhanced particle delivery.

  18. Towards the design of high performance IR photonics: Optical analysis of textured gallium antimonide surfaces

    NASA Astrophysics Data System (ADS)

    Wassweiler, Ella; Prineas, John; Toor, Fatima

    Gallium antimonide (GaSb) is used for the fabrication of various optoelectronic devices, such as laser diodes, light emitting diodes, and photodetectors for the mid-infrared (MIR) wavelengths of 3 μm to 30 μm. The light extraction or collection efficiency of GaSb-based MIR devices can be significantly enhanced by surface texturing due to the density-graded effect. However, to the best of our knowledge, no systematic study exists that analyzes the etch chemistries, surface textures, and resultant reflectivity of GaSb surfaces. In this work we present the characterization of GaSb textures and how they correlate to reflectivity at visible and MIR wavelengths. A parametric sweep of etch chemistries involving hydrofluoric acid (HF), hydrogen peroxide (H2O2), and citric acid (C4H6O6) provides a variety of surface textures that correspond to low reflectivity in different wavelength regimes. The size of the surface features causes scattering at wavelengths of the same magnitude and as a result lowers the reflectivity. In addition, an analytical equation derived from our experimental data is presented that correlates reflectivity measurements to etch depth and wavelength, which can be used to design high performance IR photonic devices.

  19. Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.

    PubMed

    Trudgian, David C; Mirzaei, Hamid

    2012-12-01

    We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net. PMID:23088505

  20. Analysis of abamectin residues in avocados by high-performance liquid chromatography with fluorescence detection.

    PubMed

    Hernández-Borges, Javier; Ravelo-Pérez, Lidia M; Hernández-Suárez, Estrella M; Carnero, Aurelio; Rodríguez-Delgado, Miguel Angel

    2007-09-21

    In this work an analytical method for the determination of abamectin residues in avocados is developed using high-performance liquid chromatography (HPLC) with fluorescence (FL) detection. A pre-column derivatization with trifluoroacetic anhydride (TFAA) and N-methylimidazole (NMIM) was carried out. The mobile phase consisted of water, methanol and acetonitrile (5:47.5:47.5 v/v/v) and was pumped at a rate of 1 mL/min (isocratic elution). The fluorescence detector was set at an excitation wavelength of 365 nm and an emission wavelength of 470 nm. Homogenized avocado samples were extracted twice with acetonitrile:water 8:2 (v/v) and cleaned using C18 solid-phase extraction (SPE) cartridges. Recovery values were in the range 87-98% with RSD values lower than 13%. The limits of detection (LODs) and quantification (LOQs) of the whole method were 0.001 and 0.003 mg/kg, respectively. These values are lower than the maximum residue limit (MRL) established by the European Union (EU) and the Spanish legislation in avocado samples. PMID:17681518

  1. ANALYSIS OF DRUG INTERACTIONS WITH VERY LOW DENSITY LIPOPROTEIN BY HIGH PERFORMANCE AFFINITY CHROMATOGRAPHY

    PubMed Central

    Sobansky, Matthew R.; Hage, David S.

    2014-01-01

    High-performance affinity chromatography (HPAC) was utilized to examine the binding of very low density lipoprotein (VLDL) with drugs, using R/S-propranolol as a model. These studies indicated that two mechanisms existed for the binding of R- and S-propranolol with VLDL. The first mechanism involved non-saturable partitioning of these drugs with VLDL, which probably occurred with the lipoprotein's non-polar core. This partitioning was described by overall affinity constants of 1.2 (± 0.3) × 10⁶ M⁻¹ for R-propranolol and 2.4 (± 0.6) × 10⁶ M⁻¹ for S-propranolol at pH 7.4 and 37 °C. The second mechanism occurred through saturable binding by these drugs at fixed sites on VLDL, such as represented by apolipoproteins on the surface of the lipoprotein. The association equilibrium constants for this saturable binding at 37 °C were 7.0 (± 2.3) × 10⁴ M⁻¹ for R-propranolol and 9.6 (± 2.2) × 10⁴ M⁻¹ for S-propranolol. Comparable results were obtained at 20 °C and 27 °C for the propranolol enantiomers. This work provided fundamental information on the processes involved in the binding of R- and S-propranolol to VLDL, while also illustrating how HPAC can be used to evaluate relatively complex interactions between agents such as VLDL and drugs or other solutes. PMID:25103529
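    A hedged sketch of the two-mechanism picture described above writes total binding as one class of saturable sites plus non-saturable partitioning,

    $$ b = \frac{n\,K_a\,[\mathrm{D}]}{1 + K_a\,[\mathrm{D}]} + K_p\,[\mathrm{D}] $$

    where b is the amount of drug bound per mole of VLDL, [D] the free drug concentration, n the number of saturable sites with association constant K_a, and K_p the overall non-saturable (partition-like) constant. This exact functional form is an assumption, a common way of expressing mixed binding, and is not necessarily the equation fitted by the authors.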

  2. Analysis of Lidocaine Interactions with Serum Proteins Using High-Performance Affinity Chromatography

    PubMed Central

    Soman, Sony; Yoo, Michelle J.; Jang, Yoon Jeong; Hage, David S.

    2010-01-01

    High-performance affinity chromatography was used to study binding by the drug lidocaine to human serum albumin (HSA) and α1-acid glycoprotein (AGP). AGP had strong binding to lidocaine, with an association equilibrium constant (Ka) of 1.1-1.7 × 10⁵ M⁻¹ at 37 °C and pH 7.4. Lidocaine had weak-to-moderate binding to HSA, with a Ka in the range of 10³ to 10⁴ M⁻¹. Competitive experiments with site-selective probes showed that lidocaine was interacting with Sudlow site II of HSA and the propranolol site of AGP. These results agree with previous observations in the literature and provide a better quantitative understanding of how lidocaine binds to these serum proteins and is transported in the circulation. This study also demonstrates how HPAC can be used to examine the binding of a drug with multiple serum proteins and provide detailed information on the interaction sites and equilibrium constants that are involved in such processes. PMID:20138813

  3. High-performance computational analysis and peptide screening from databases of cyclotides from poaceae.

    PubMed

    Porto, William F; Miranda, Vivian J; Pinto, Michelle F S; Dohms, Stephan M; Franco, Octavio L

    2016-01-01

    Cyclotides are a family of head-to-tail cyclized peptides containing three conserved disulfide bonds, in a structural scaffold also known as a cyclic cysteine knot. Due to the high degree of cysteine conservation, novel members of this peptide family can be identified in protein databases through a search using regular expressions (REGEX). In this work, six novel cyclotide-like precursors from the Poaceae were identified in NCBI's non-redundant protein database by the use of REGEX. Two out of six sequences (named Zea mays L and M) showed an Asp residue at the C-terminus, which indicated that they could be cyclic. Gene expression in maize tissues was investigated, showing that the previously described cyclotide-like Z. mays J is expressed in the roots. According to molecular dynamics, the structure of Z. mays J seems to be stable, despite the putative absence of cyclization. As regards cyclotide evolution, it was hypothesized that this is an outcome of convergent evolution and/or horizontal gene transfer. The results show that peptide screening of databases should be performed periodically in order to include novel sequences, which are deposited as the databases grow. Indeed, advances in computational and experimental methods will together help to answer key questions and reach new horizons in defense-related peptide identification. PMID:26572696
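    The REGEX-screening idea can be sketched as below. The cysteine-spacing pattern shown is an illustrative placeholder, not the expression used in the paper, and the demo record is fabricated for the example only.

```python
# Hypothetical sketch of REGEX screening for cysteine-rich precursors.
# The pattern below is an illustrative cysteine-spacing expression,
# NOT the exact REGEX used in the study.
import re

CYS_MOTIF = re.compile(
    r"C.{3,7}C.{3,6}C.{3,8}C.{1,5}C.{4,10}C"   # six cysteines with loose loop lengths
)

def screen(fasta_text: str):
    """Yield (header, sequence) records whose sequence matches the motif."""
    header, seq = None, []
    for line in fasta_text.splitlines() + [">"]:
        if line.startswith(">"):
            if header and CYS_MOTIF.search("".join(seq)):
                yield header, "".join(seq)
            header, seq = line[1:].strip(), []
        else:
            seq.append(line.strip())

if __name__ == "__main__":
    demo = ">demo_precursor\nMKTLACSGETCVGGTCNTPGCTCSWPVCTRNGLPV\n"
    for h, s in screen(demo):
        print(h, s)
```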

  4. Simulink models for performance analysis of high speed DQPSK modulated optical link

    NASA Astrophysics Data System (ADS)

    Sharan, Lucky; Rupanshi, Chaubey, V. K.

    2016-03-01

    This paper presents a design approach for the development of simulation models to study and analyze the transmission of a 10 Gbps DQPSK signal over a single-channel peer-to-peer link using Matlab Simulink. The simulation model considers the different optical components used in the link design, with their behavior represented initially by theoretical interpretation, including the transmitter topology, the Mach-Zehnder modulator (MZM) module, and the propagation model for optical fibers, thus allowing scope for direct realization in experimental configurations. It provides the flexibility to incorporate the various photonic components as either user-defined or fixed, and they can also be enhanced or removed from the model as per the design requirements. We describe the detailed operation and need of every component model and its representation in Simulink blocksets. Moreover, the developed model can be extended in the future to support Dense Wavelength Division Multiplexing (DWDM) systems, thereby allowing high speed transmission with N × 40 Gbps systems. The various compensation techniques and their influence on system performance can be easily investigated by using such models.

  5. Performance Measurement Analysis System

    Energy Science and Technology Software Center (ESTSC)

    1989-06-01

    The PMAS4.0 (Performance Measurement Analysis System) is a user-oriented system designed to track the cost and schedule performance of Department of Energy (DOE) major projects (MPs) and major system acquisitions (MSAs) reporting under DOE Order 5700.4A, Project Management System. PMAS4.0 provides for the analysis of performance measurement data produced from management control systems complying with the Federal Government's Cost and Schedule Control Systems Criteria.
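    Cost and schedule performance under such criteria is conventionally summarized with earned-value measures; as a hedged reminder of the standard definitions (not necessarily PMAS4.0's exact report fields):

    $$ \mathrm{CV} = \mathrm{BCWP} - \mathrm{ACWP}, \qquad \mathrm{SV} = \mathrm{BCWP} - \mathrm{BCWS}, \qquad \mathrm{CPI} = \frac{\mathrm{BCWP}}{\mathrm{ACWP}}, \qquad \mathrm{SPI} = \frac{\mathrm{BCWP}}{\mathrm{BCWS}} $$

    where BCWP is the budgeted cost of work performed (earned value), ACWP the actual cost of work performed, and BCWS the budgeted cost of work scheduled.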

  6. High Performance FORTRAN

    NASA Technical Reports Server (NTRS)

    Mehrotra, Piyush

    1994-01-01

    High Performance FORTRAN is a set of extensions to FORTRAN 90 designed to allow the specification of data-parallel algorithms. The programmer annotates the program with distribution directives to specify the desired layout of data. The underlying programming model provides a global name space and a single thread of control. Explicitly parallel constructs allow the expression of fairly controlled forms of parallelism, in particular data parallelism. Thus the code is specified in a high-level, portable manner with no explicit tasking or communication statements. The goal is to allow architecture-specific compilers to generate efficient code for a wide variety of architectures, including SIMD and MIMD shared- and distributed-memory machines.

  7. High Performance Window Retrofit

    SciTech Connect

    Shrestha, Som S; Hun, Diana E; Desjarlais, Andre Omer

    2013-12-01

    The US Department of Energy (DOE) Office of Energy Efficiency and Renewable Energy (EERE) and Traco partnered to develop cost-effective, high-performance windows for commercial buildings. The main performance requirement for these windows was that they needed to have an R-value of at least 5 ft²·°F·h/Btu. This project seeks to quantify the potential energy savings from installing these windows in commercial buildings that are at least 20 years old. To this end, we are conducting evaluations at a two-story test facility that is representative of a commercial building from the 1980s, and are gathering measurements on the performance of its windows before and after double-pane, clear-glazed units are upgraded with R5 windows. Additionally, we will use these data to calibrate EnergyPlus models that will allow us to extrapolate results to other climates. Findings from this project will provide empirical data on the benefits of high-performance windows, which will help promote their adoption in new and existing commercial buildings. This report describes the experimental setup and includes some of the field and simulation results.

  8. Analysis of phenolics in wine by high performance thin-layer chromatography with gradient elution and high resolution plate imaging.

    PubMed

    Agatonovic-Kustrin, Snezana; Hettiarachchi, Chandima G; Morton, David W; Razic, Slavica

    2015-01-01

    The health benefits of wine, especially red wine, have been linked to the presence of a wide range of phenolic antioxidants. Thus, the aim of this study was to develop a simple, high performance thin-layer chromatographic (HPTLC) method combined with high resolution digital plate images to visually compare multiple wine samples simultaneously on a single chromatographic plate and to quantify levels of gallic acid, caffeic acid, resveratrol and rutin, as representatives of the four different classes of phenolics found in wines. We also wanted to investigate the contribution of the investigated phenolic compounds to the total polyphenolic content (TPC) and total antioxidant capacity (TAC) of the wine samples. The average concentrations of caffeic acid, gallic acid, resveratrol, and rutin in the red wines were 2.15, 30.17, 0.59 and 2.47 mg/L, respectively, with their concentrations below the limit of quantification in the white wine samples. The highest concentrations of resveratrol and rutin were found in the Cabernet and Shiraz wine samples. The amounts of gallic acid were correlated with TPC (r=0.58). The Italian wines had the highest correlation between TPC and TAC (r=0.99); although they did not contain detectable amounts of resveratrol, they contained a significant amount of rutin. Therefore, antioxidant properties might be associated with the presence of flavanols in these wines. PMID:25255450

  9. Simultaneous analysis and monitoring of 16 UV filters in cosmetics by high-performance liquid chromatography.

    PubMed

    Kim, Dojung; Kim, Sangseop; Kim, Seol-A; Choi, Myoengsin; Kwon, Kyoung-Jin; Kim, Mijeong; Kim, Dong-Sup; Kim, Seung-Hee; Choi, Bo-Kyung

    2012-01-01

    Sixteen UV filters were simultaneously analyzed using a high-performance liquid chromatographic method. They were drometrizole (USAN Drometrizole), 4-methylbenzylidene camphor (USAN Enzacamene), menthyl anthranilate (USAN Menthyl anthranilate), benzophenone-3 (USAN Oxybenzone), benzophenone-8 (USAN Dioxybenzone), butyl methoxydibenzoylmethane (USAN Avobenzone), ethylhexyl triazone (USAN Octyl triazone), octocrylene (USAN Octocrylene), ethylhexyl dimethyl p-aminobenzoic acid (USAN Padimate O), ethylhexyl methoxycinnamate (USAN Octinoxate), p-aminobenzoic acid (USAN Aminobenzoic acid), 2-phenylbenzimidazole-5-sulfonic acid (USAN Ensulizole), isoamyl p-methoxycinnamate (USAN Amiloxate), and recent UV filters such as diethylhexyl butamidotriazone (USAN Iscotrizinol), methylene bis-benzotriazolyl tetramethylbutylphenol (USAN Bisoctrizole), and terephthalylidene dicamphor sulfonic acid (USAN Ecamsule). Separation of the UV filters was carried out on a C18 column with a gradient of methanol-phosphate buffer, and the UV detection was at 300, 320, or 360 nm without any interference. The limits of detection were between 0.08 and 1.94 μg/ml, and the limits of quantitation were between 0.24 and 5.89 μg/ml. The extracting solvent for the UV filters was methanol, except for ethylhexyl triazone and methylene bis-benzotriazolyl tetramethylbutylphenol, which were prepared with tetrahydrofuran. The recoveries from spiked samples were between 94.90% and 116.54%, depending on the matrixes used. The developed method was applied to 23 sunscreens obtained from local markets, and the results were acceptable with respect to the products' own criteria and to the maximum authorized concentrations. Consequently, these results provide a simple extraction method and a simultaneous determination of various UV filters, which can improve the quality control process as well as the environmental monitoring of sunscreens. PMID:22591562

  10. Rapid high performance liquid chromatography-high resolution mass spectrometry methodology for multiple prenol lipids analysis in zebrafish embryos.

    PubMed

    Martano, Chiara; Mugoni, Vera; Dal Bello, Federica; Santoro, Massimo M; Medana, Claudio

    2015-09-18

    The analysis of lipid molecules in living organisms is an important step in deciphering metabolic pathways. Recently, the zebrafish has been adopted as a valuable animal model system to perform in vivo metabolomics studies; however, limited methodologies and protocols are currently available to investigate the zebrafish lipidome, and even fewer to analyze specific classes of lipids. Here we present an HPLC-HRMS based method to rapidly measure multiple prenol lipid molecules from zebrafish tissues. In particular, we have optimized our method for the concurrent detection of ubiquinones (Coenzyme Q6, Coenzyme Q9, Coenzyme Q10), cholesterol, vitamin E (α-tocopherol), vitamin K1 and vitamin K2. The purpose of this study was to compare different ionization modes, mobile phases and stationary phases in order to optimize the separation of lipid molecules. After selection of the HPLC-HRMS parameters, several extraction conditions for zebrafish embryos were evaluated. We assessed our methodology by quantitation of the analytical recovery in extracts from wild-type zebrafish or zebrafish mutants (barolo) affected by impaired biosynthesis of ubiquinones. PMID:26283533

  11. High performance cyclone development

    SciTech Connect

    Giles, W.B.

    1981-01-01

    The results of cold-flow experiments at atmospheric conditions on an air-shielded 18-in-dia electrocyclone with a central cusped electrode are reported, using fine test dusts of both flyash and nickel powder. These results are found to confirm expectations of enhanced performance, similar to earlier work on a 12-in-dia model. An analysis of the combined inertial-electrostatic force field is also presented, which identifies general design goals and scaling laws. From this, it is found that electrostatic enhancement will be particularly beneficial for fine dusts in large cyclones. Recommendations for further improvement in cyclone collection efficiency are proposed.

  12. Analysis of therapeutic proteins and peptides using multiangle light scattering coupled to ultra high performance liquid chromatography.

    PubMed

    Espinosa-de la Garza, Carlos E; Miranda-Hernández, Mariana P; Acosta-Flores, Lilia; Pérez, Néstor O; Flores-Ortiz, Luis F; Medina-Rivero, Emilio

    2015-05-01

    Analysis of the physical properties of biotherapeutic proteins is crucial throughout all the stages of their lifecycle. Herein, we used size-exclusion ultra high performance liquid chromatography coupled to multiangle light scattering and refractive index detection systems to determine the molar mass, mass-average molar mass, molar-mass dispersity and hydrodynamic radius of two monoclonal antibodies (rituximab and trastuzumab), a fusion protein (etanercept), and a synthetic copolymer (glatiramer acetate) employed as models. A customized instrument configuration was set to diminish band-broadening effects and enhance sensitivity throughout detectors. The customized configuration showed a performance improvement with respect to the high-performance liquid chromatography standard configuration, as observed by a 3 h column conditioning and a higher resolution analysis in 20 min. Analysis of the two monoclonal antibodies showed averaged values of 148.0 kDa for mass-average molar mass and 5.4 nm for hydrodynamic radius, whereas for etanercept these values were 124.2 kDa and 6.9 nm, respectively. Molar-mass dispersity was 1.000 on average for these proteins. Regarding glatiramer acetate, a molar mass range from 3 to 45 kDa and a molar-mass dispersity of 1.304 were consistent with its intrinsic peptide diversity, and its mass-average molar mass was 10.4 kDa. Overall, this method demonstrated an accurate determination of molar mass, overcoming the difficulties of size-exclusion chromatography. PMID:25727056
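    For reference, the reported quantities follow the standard definitions of the molar-mass averages and dispersity (a hedged reminder of the usual conventions, not a quotation of this paper's data treatment):

    $$ M_w = \frac{\sum_i c_i M_i}{\sum_i c_i}, \qquad M_n = \frac{\sum_i n_i M_i}{\sum_i n_i}, \qquad D_M = \frac{M_w}{M_n} $$

    where c_i and n_i are the mass concentration and molar amount of species with molar mass M_i, and D_M (the molar-mass dispersity, often written ĐM) equals 1 for a monodisperse sample; in SEC-MALS practice M_w is obtained from the light-scattering signal combined with the concentration from the refractive index detector.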

  13. High Performance Buildings Database

    DOE Data Explorer

    The High Performance Buildings Database is a shared resource for the building industry, a unique central repository of in-depth information and data on high-performance, green building projects across the United States and abroad. The database includes information on the energy use, environmental performance, design process, finances, and other aspects of each project. Members of the design and construction teams are listed, as are sources for additional information. In total, up to twelve screens of detailed information are provided for each project profile. Projects range in size from small single-family homes or tenant fit-outs within buildings to large commercial and institutional buildings and even entire campuses. The database is a data repository as well. A series of Web-based data-entry templates allows anyone to enter information about a building project into the database. Once a project has been submitted, each of the partner organizations can review the entry and choose whether or not to publish that particular project on its own Web site.

  14. High-performance liquid chromatography with fluorescence detection for the rapid analysis of pheophytins and pyropheophytins in virgin olive oil.

    PubMed

    Li, Xueqi; Woodman, Michael; Wang, Selina C

    2015-08-01

    Pheophytins and pyropheophytins are degradation products of chlorophyll pigments, and their ratios can be used as a sensitive indicator of stress during the manufacturing and storage of olive oil. They increase over time depending on the storage conditions and on whether the oil is exposed to heat treatments during the refining process. The traditional analysis method includes solvent- and time-consuming steps of solid-phase extraction followed by analysis by high-performance liquid chromatography with ultraviolet detection. We developed an improved dilute/fluorescence method in which the multi-step sample preparation was replaced by a simple isopropanol dilution before the high-performance liquid chromatography injection. A quaternary solvent gradient method was used to include a fourth strong solvent wash on a quaternary gradient pump, which avoided the need to premix any solvents and greatly reduced the oil residues left on the column from previous analyses. This new method not only reduces analysis cost and time but shows reliability, repeatability, and improved sensitivity, which is especially important for low-level samples. PMID:26047465

  15. Performance Support for Performance Analysis

    ERIC Educational Resources Information Center

    Schaffer, Scott; Douglas, Ian

    2004-01-01

    Over the past several years, there has been a shift in emphasis in many business, industry, government and military training organizations toward human performance technology or HPT (Rossett, 2002; Dean, 1995). This trend has required organizations to increase the human performance knowledge, skills, and abilities of the training workforce.…

  16. Analysis of dynamic interaction between catenary and pantograph with experimental verification and performance evaluation in new high-speed line

    NASA Astrophysics Data System (ADS)

    Lee, Jin Hee; Park, Tae Won; Oh, Hyuck Keun; Kim, Young Guk

    2015-08-01

    Understanding the dynamic interaction between the catenary and pantograph of a high-speed train is one of the most important technical issues in the railway industry. This is because the catenary-pantograph system plays a crucial role in providing electric power to the railway vehicle for stable operation. The aim of the present paper is to estimate the current-collection performance of this system by using numerical analysis, in particular the flexible multibody dynamic analysis technique. To implement large deformable catenary wires, an absolute nodal coordinate formulation is used for the cable element. Additionally, an efficient contact element and an interactive model for the catenary-pantograph system are introduced. Each developed model is then used for analytical and experimental verification. Actual on-line test results of existing high-speed railway vehicles are presented and used to verify the analysis model. Finally, the performance characteristics of a new 400 km/h-class high-speed line are estimated and evaluated on the basis of international standards.

  17. Analysis of isomeric forms of oxidized triacylglycerols using ultra-high-performance liquid chromatography and tandem mass spectrometry.

    PubMed

    Suomela, Jukka-Pekka; Leskinen, Heidi; Kallio, Heikki

    2011-08-10

    Detailed studies on the regioisomeric structures of oxidized species of triacylglycerols (TAG), formed in food during storage and processing, have not been published thus far. In this study, an analytical approach based on efficient ultra-high-performance liquid chromatographic (UHPLC) separation of different isomers of oxidized TAG species and their tandem mass spectrometric analysis was created. A linear solvent gradient based on acetonitrile and acetone was used in the UHPLC method. A novel positive-ion ESI method, with ammonia supplemented in the nebulizer gas, was used to produce ammonium adduct ions for mass spectrometric analysis. With the UHPLC method used, different regioisomers of TAG species containing oxidized linoleic or oleic acid could be efficiently resolved. Differences in the fragmentation patterns of many of the oxidized TAG isomers could be demonstrated by the tandem mass spectrometric method. On the basis of the results, the approach enables regiospecific analysis of oxidized TAG molecules. PMID:21702477

  18. High Performance Work Practices and Firm Performance.

    ERIC Educational Resources Information Center

    Department of Labor, Washington, DC. Office of the American Workplace.

    A literature survey established that a substantial amount of research has been conducted on the relationship between productivity and the following specific high performance work practices: employee involvement in decision making, compensation linked to firm or worker performance, and training. According to these studies, high performance work…

  19. A high performance cloud computing platform for mRNA analysis.

    PubMed

    Lin, Feng-Seng; Shen, Chia-Ping; Sung, Hsiao-Ya; Lam, Yan-Yu; Lin, Jeng-Wei; Lai, Feipei

    2013-01-01

    Multiclass classification is an important technique for many complex bioinformatics problems. However, its performance is limited by the available computing power. Based on the Apache Hadoop design framework, this study proposes a two-layer architecture that exploits the inherent parallelism of GA-SVM classification to speed up the work. The performance evaluations on an mRNA benchmark cancer dataset show that the approach reduced the feature set by 86.55% and raised accuracy from 97.53% to 98.03%. With a user-friendly web interface, the system provides researchers an easy way to investigate the unrevealed secrets in the fast-growing repository of bioinformatics data. PMID:24109986
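
    To make the GA-SVM idea concrete, here is a minimal single-machine sketch of the underlying technique: a genetic algorithm searching over feature subsets whose fitness is cross-validated SVM accuracy. It is not the authors' Hadoop implementation; the synthetic dataset, population size, and mutation rate are illustrative assumptions.

    ```python
    # Minimal single-machine sketch of GA-based feature selection scored by an SVM.
    # Not the Hadoop implementation from the record; data and GA parameters are illustrative.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X, y = make_classification(n_samples=200, n_features=60, n_informative=10,
                               n_classes=3, n_clusters_per_class=1, random_state=0)

    def fitness(mask):
        """Cross-validated SVM accuracy using only the selected features."""
        if not mask.any():
            return 0.0
        return cross_val_score(SVC(kernel="rbf", C=1.0), X[:, mask], y, cv=3).mean()

    # Initialise a small population of random feature masks (True = feature kept).
    pop = rng.random((20, X.shape[1])) < 0.5
    for generation in range(15):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(scores)[::-1][:10]]       # keep the best half
        children = []
        for _ in range(10):                                 # uniform crossover + bit-flip mutation
            a, b = parents[rng.integers(10, size=2)]
            child = np.where(rng.random(X.shape[1]) < 0.5, a, b)
            child ^= rng.random(X.shape[1]) < 0.02
            children.append(child)
        pop = np.vstack([parents, children])

    best = pop[np.argmax([fitness(m) for m in pop])]
    print(f"selected {best.sum()} of {X.shape[1]} features, "
          f"CV accuracy = {fitness(best):.3f}")
    ```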

  20. A meta-analysis of country differences in the high-performance work system-business performance relationship: the roles of national culture and managerial discretion.

    PubMed

    Rabl, Tanja; Jayasinghe, Mevan; Gerhart, Barry; Kühlmann, Torsten M

    2014-11-01

    Our article develops a conceptual framework based primarily on national culture perspectives but also incorporating the role of managerial discretion (cultural tightness-looseness, institutional flexibility), which is aimed at achieving a better understanding of how the effectiveness of high-performance work systems (HPWSs) may vary across countries. Based on a meta-analysis of 156 HPWS-business performance effect sizes from 35,767 firms and establishments in 29 countries, we found that the mean HPWS-business performance effect size was positive overall (corrected r = .28) and positive in each country, regardless of its national culture or degree of institutional flexibility. In the case of national culture, the HPWS-business performance relationship was, on average, actually more strongly positive in countries where the degree of a priori hypothesized consistency or fit between an HPWS and national culture (according to national culture perspectives) was lower, except in the case of tight national cultures, where greater a priori fit of an HPWS with national culture was associated with a more positive HPWS-business performance effect size. However, in loose cultures (and in cultures that were neither tight nor loose), less a priori hypothesized consistency between an HPWS and national culture was associated with higher HPWS effectiveness. As such, our findings suggest the importance of not only national culture but also managerial discretion in understanding the HPWS-business performance relationship. (PsycINFO Database Record (c) 2014 APA, all rights reserved). PMID:25222523

  1. AVES: A high performance computer cluster array for the INTEGRAL satellite scientific data analysis

    NASA Astrophysics Data System (ADS)

    Federici, Memmo; Martino, Bruno Luigi; Ubertini, Pietro

    2012-07-01

    In this paper we describe a new computing system array, designed, built and now used at the Space Astrophysics and Planetary Institute (IAPS) in Rome, Italy, for the INTEGRAL Space Observatory scientific data analysis. This new system has become necessary in order to reduce the processing time of the INTEGRAL data accumulated during the more than 9 years of in-orbit operation. In order to fulfill the scientific data analysis requirements with a moderately limited investment, the starting approach has been to use a `cluster' array of commercial quad-CPU computers, with the extremely large scientific and calibration data archive kept on line.

  2. Recent advances in ultra-high performance liquid chromatography for the analysis of traditional chinese medicine

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Traditional Chinese medicines (TCMs) have been widely used for the prevention and treatment of various diseases for thousands of years in China. Ultra-high-performance liquid chromatography (UHPLC) is a relatively new technique offering new possibilities in liquid chromatography. This paper reviews recen...

  3. An Analysis of High School Students' Perceptions and Academic Performance in Laboratory Experiences

    ERIC Educational Resources Information Center

    Mirchin, Robert Douglas

    2012-01-01

    This research study is an investigation of student-laboratory (i.e., lab) learning based on students' perceptions of experiences using questionnaire data and evidence of their science-laboratory performance based on paper-and-pencil assessments using Maryland-mandated criteria, Montgomery County Public Schools (MCPS) criteria, and published…

  4. High Performance Liquid Chromatography of Some Analgesic Compounds: An Instrumental Analysis Experiment.

    ERIC Educational Resources Information Center

    Haddad, Paul; And Others

    1983-01-01

    Background information, procedures, and results are provided for an experiment demonstrating techniques of solvent selection, gradient elution, pH control, and ion-pairing in the analysis of an analgesic mixture using reversed-phase liquid chromatography on an octadecylsilane column. Although developed using sophisticated/expensive equipment, less…

  5. Visiting Science Museums during Middle and High School: A Longitudinal Analysis of Student Performance in Science

    ERIC Educational Resources Information Center

    Suter, Larry E.

    2014-01-01

    This exploratory analysis of student attendance at science museums finds that student achievement in science and mathematics is somewhat higher for those students who visited science museums frequently during the school year or summer. The strength of the association with cognitive achievement is sufficiently noteworthy to encourage further…

  6. START High Performance Discharges

    NASA Astrophysics Data System (ADS)

    Gates, D. A.

    1997-11-01

    Improvements to START (Small Tight Aspect Ratio Tokamak), the first spherical tokamak in the world to achieve high plasma temperature with both a significant pulse length and confinement time, have been ongoing since 1991. Recent modifications include: expansion of the existing capacitor banks allowing plasma currents as high as 300kA, an increase in the available neutral beam heating power ( ~ 500kW), and improvements to the vacuum system. These improvements have led to the achievement of the world record plasma β (≡ 2μ_0⟨p⟩/B^2) of ~ 30% in a tokamak. The normalised β (β_N ≡ β aB/I_p) reached 4.5 with q_95 = 2.3. Properties of the reconstructed equilibrium will be discussed in detail. The theoretical limit to β is higher in a spherical tokamak than in a conventional machine, due to the higher values of normalised current (I_N ≡ I_p/aB) achievable at low aspect ratio. The record β was achieved with I_N ~ 8 while conventional tokamaks are limited to I_N ~ 3, or less. Calculations of the ideal MHD stability of the record discharge indicate high β low-n kink modes are stable, but that the entire profile is at or near marginal stability for high-n ballooning modes. The phenomenology of the events leading up to the plasma termination is discussed. An important aspect of the START program is to explore the physics of neutral beam absorption at low aspect ratio. A passive neutral particle analyser has been used to study the temporal and spatial dependence of the fast hydrogen beam ions. These measurements have been used in conjunction with a single particle orbit code to estimate the fast ion losses due to collisions with slow neutrals from the plasma edge. Numerical analysis of neutral beam power deposition profiles are compared with the data from an instrumented beam stop. The global energy confinement time τ_E in beam heated discharges on START is similar to that obtained in Ohmic discharges, even though the input power has roughly doubled over the Ohmic case.

  7. Performance analysis of reversible image compression techniques for high-resolution digital teleradiology.

    PubMed

    Kuduvalli, G R; Rangayyan, R M

    1992-01-01

    The performances of a number of block-based, reversible compression algorithms suitable for compression of very-large-format images (4096x4096 pixels or more) are compared to that of a novel two-dimensional linear predictive coder developed by extending the multichannel version of the Burg algorithm to two dimensions. The compression schemes implemented are: Huffman coding, Lempel-Ziv coding, arithmetic coding, two-dimensional linear predictive coding (in addition to the aforementioned one), transform coding using discrete Fourier-, discrete cosine-, and discrete Walsh transforms, linear interpolative coding, and combinations thereof. The performances of these coding techniques for a few mammograms and chest radiographs digitized to sizes up to 4096x4096 10 b pixels are discussed. Compression from 10 b to 2.5-3.0 b/pixel on these images has been achieved without any loss of information. The modified multichannel linear predictor outperforms the other methods while offering certain advantages in implementation. PMID:18222885
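
    To illustrate why predictive coding lowers the achievable bit rate, the sketch below applies a fixed two-dimensional predictor to a synthetic smooth image and compares the zeroth-order entropy of the raw pixels with that of the prediction residuals. The predictor and the test image are assumptions for demonstration only; this is not the multichannel Burg predictor evaluated in the record.

    ```python
    # Illustrative only: a fixed 2D predictor (mean of left and upper neighbours),
    # not the multichannel Burg predictor described in the record.
    import numpy as np

    def entropy_bits(values):
        """Zeroth-order entropy in bits/sample of an integer array."""
        _, counts = np.unique(values, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    rng = np.random.default_rng(1)
    # Smooth synthetic 10-bit test image: low-frequency ramp plus mild noise.
    x = np.linspace(0, 1, 256)
    img = (300 * np.outer(x, x) + 400 + rng.normal(0, 2, (256, 256))).astype(int)
    img = np.clip(img, 0, 1023)

    pred = np.zeros_like(img)
    pred[1:, 1:] = (img[1:, :-1] + img[:-1, 1:]) // 2   # mean of left and upper pixels
    residual = img - pred

    print(f"raw entropy      : {entropy_bits(img):.2f} bits/pixel")
    print(f"residual entropy : {entropy_bits(residual):.2f} bits/pixel")
    ```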

  8. Quality Assessment of Ojeok-San, a Traditional Herbal Formula, Using High-Performance Liquid Chromatography Combined with Chemometric Analysis

    PubMed Central

    Kim, Jung-Hoon; Seo, Chang-Seob; Kim, Seong-Sil; Shin, Hyeun-Kyoo

    2015-01-01

    Ojeok-san (OJS) is a traditional herbal formula consisting of 17 herbal medicines that has been used to treat various disorders. In this study, quantitative analytical methods were developed using high-performance liquid chromatography equipped with a photodiode array detector to determine 19 marker compounds in OJS preparations, which was then combined with chemometric analysis. The method developed was validated in terms of its precision and accuracy. The intra- and interday precision of the marker compounds were <3.0% of the relative standard deviation (RSD) and the recovery of the marker compounds was 92.74%–104.16% with RSD values <3.0%. The results of our quantitative analysis show that the quantities of the 19 marker compounds varied between a laboratory water extract and commercial OJS granules. The chemometric analysis used, principal component analysis (PCA) and hierarchical clustering analysis (HCA), also showed that the OJS water extract produced using a laboratory method clearly differed from the commercial OJS granules; therefore, an equalized production process is required for quality control of OJS preparations. Our results suggest that the HPLC analytical methods developed are suitable for the quantification and quality assessment of OJS preparations when combined with chemometric analysis involving PCA and HCA. PMID:26539304
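
    To make the chemometric step concrete, the sketch below runs PCA and Ward-linkage hierarchical clustering on a synthetic marker-compound matrix with two groups of preparations. The data, the autoscaling, and the linkage method are illustrative assumptions, not the study's actual settings.

    ```python
    # Minimal PCA + hierarchical clustering sketch on a synthetic marker-compound matrix.
    # The data, scaling, and linkage method are illustrative assumptions.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # 12 preparations x 19 marker compounds: one "laboratory extract" group and
    # one "commercial granule" group with shifted mean concentrations.
    lab = rng.normal(10.0, 1.0, size=(6, 19))
    commercial = rng.normal(13.0, 1.5, size=(6, 19))
    X = StandardScaler().fit_transform(np.vstack([lab, commercial]))

    scores = PCA(n_components=2).fit_transform(X)            # PCA score plot coordinates
    clusters = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")

    for i, (pc, c) in enumerate(zip(scores, clusters)):
        kind = "lab" if i < 6 else "commercial"
        print(f"sample {i:2d} ({kind:10s})  PC1={pc[0]:6.2f}  PC2={pc[1]:6.2f}  cluster={c}")
    ```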

  9. High performance sapphire windows

    NASA Technical Reports Server (NTRS)

    Bates, Stephen C.; Liou, Larry

    1993-01-01

    High-quality, wide-aperture optical access is usually required for the advanced laser diagnostics that can now make a wide variety of non-intrusive measurements of combustion processes. Specially processed and mounted sapphire windows are proposed to provide this optical access to extreme environments. Through surface treatments and proper thermal stress design, single crystal sapphire can be a mechanically equivalent replacement for high strength steel. A prototype sapphire window and mounting system have been developed in a successful NASA SBIR Phase 1 project. A large and reliable increase in sapphire design strength (as much as 10x) has been achieved, and the initial specifications necessary for these gains have been defined. Failure testing of small windows has conclusively demonstrated the increased sapphire strength, indicating that a nearly flawless surface polish is the primary cause of strengthening, while an unusual mounting arrangement also significantly contributes to a larger effective strength. Phase 2 work will complete specification and demonstration of these windows, and will fabricate a set for use at NASA. The enhanced capabilities of these high performance sapphire windows will lead to many diagnostic capabilities not previously possible, as well as new applications for sapphire.

  10. Performance analysis of radiation cooled dc transmission lines for high power space systems

    NASA Technical Reports Server (NTRS)

    Schwarze, G. E.

    1985-01-01

    As space power levels increase to meet mission objectives and also as the transmission distance between power source and load increases, the mass, volume, power loss, and operating voltage and temperature become important system design considerations. This analysis develops the dependence of the specific mass and percent power loss on the power and voltage levels, transmission distance, operating temperature and conductor material properties. Only radiation cooling is considered since the transmission line is assumed to operate in a space environment. The results show that the limiting conditions for achieving low specific mass, percent power loss, and volume for a space-type dc transmission line are the permissible transmission voltage and operating temperature. Other means to achieve low specific mass include the judicious choice of conductor materials. The results of this analysis should be immediately applicable to power system trade-off studies including comparisons with ac transmission systems.
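
    The following back-of-the-envelope sketch illustrates the kind of trade described: for a chosen power, voltage, line length, and allowed percent loss, the conductor cross-section and mass follow from the resistivity, and the conductor surface must radiate that loss at the operating temperature. The material properties and the simple round-conductor, space-background radiation balance are illustrative assumptions, not the paper's model.

    ```python
    # First-order sizing sketch for a radiation-cooled DC line (illustrative assumptions only).
    import math

    P = 100e3          # transmitted power, W
    V = 1000.0         # transmission voltage, V
    L = 100.0          # one-way line length, m
    loss_frac = 0.02   # allowed fractional I^2 R loss
    T = 500.0          # conductor operating temperature, K
    eps = 0.8          # surface emissivity
    sigma = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

    # Assumed aluminium-like conductor properties at temperature T.
    rho_e = 5.0e-8     # electrical resistivity, ohm*m
    rho_m = 2700.0     # mass density, kg/m^3

    I = P / V
    R_allowed = loss_frac * P / I**2            # total resistance budget (both conductors)
    A_c = rho_e * (2 * L) / R_allowed           # required conductor cross-section, m^2
    r = math.sqrt(A_c / math.pi)                # radius of an equivalent round conductor
    mass = rho_m * A_c * (2 * L)                # conductor mass for the two-wire line

    # Radiative heat rejection capability of the conductor surface (space background neglected).
    A_s = 2 * math.pi * r * (2 * L)
    P_radiated = eps * sigma * A_s * T**4

    print(f"current          : {I:8.1f} A")
    print(f"conductor radius : {1e3*r:8.2f} mm")
    print(f"specific mass    : {mass / (P/1e3):8.2f} kg/kW")
    print(f"I^2R loss        : {loss_frac*P/1e3:8.2f} kW "
          f"(radiative capability {P_radiated/1e3:8.2f} kW)")
    ```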

  11. Performance analysis of radiation cooled dc transmission lines for high power space systems

    NASA Technical Reports Server (NTRS)

    Schwarze, G. E.

    1985-01-01

    As space power levels increase to meet mission objectives and also as the transmission distance between power source and load increases, the mass, volume, power loss, and operating voltage and temperature become important system design considerations. This analysis develops the dependence of the specific mass and percent power loss on the power and voltage levels, transmission distance, operating temperature and conductor material properties. Only radiation cooling is considered since the transmission line is assumed to operate in a space environment. The results show that the limiting conditions for achieving low specific mass, percent power loss, and volume for a space-type dc transmission line are the permissible transmission voltage and operating temperature. Other means to achieve low specific mass include the judicious choice of conductor materials. The results of this analysis should be immediately applicable to power system trade-off studies including comparisons with ac transmission systems.

  12. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    SciTech Connect

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  13. High-performance analysis of single interphase cells with custom DNA probes spanning translocation break points

    NASA Astrophysics Data System (ADS)

    Weier, Heinz-Ulli G.; Munne, S.; Lersch, Robert A.; Marquez, C.; Wu, J.; Pedersen, Roger A.; Fung, Jingly

    1999-06-01

    The chromatin organization of interphase cell nuclei, albeit an object of intense investigation, is only poorly understood. In the past, this has hampered the cytogenetic analysis of tissues derived from specimens where only few cells were actively proliferating or where a significant number of metaphase cells could not be obtained by induction of growth. Typical examples of such hard-to-analyze cell systems are solid tumors, germ cells and, to a certain extent, fetal cells such as amniocytes, blastomeres or cytotrophoblasts. Balanced reciprocal translocations that do not disrupt essential genes and thus do not lead to disease symptoms exist in less than one percent of the general population. Since the presence of translocations interferes with homologue pairing in meiosis, many of these individuals experience problems in their reproduction, such as reduced fertility, infertility or a history of spontaneous abortions. The majority of translocation carriers enrolled in our in vitro fertilization (IVF) programs carry simple translocations involving only two autosomes. While most translocations are relatively easy to spot in metaphase cells, the majority of cells biopsied from embryos produced by IVF are in interphase and thus unsuitable for analysis by chromosome banding or FISH-painting. We therefore set out to analyze single interphase cells for presence or absence of specific translocations. Our assay, based on fluorescence in situ hybridization (FISH) of breakpoint-spanning DNA probes, detects translocations in interphase by visual microscopic inspection of hybridization domains. Probes are prepared so that they span a breakpoint and cover several hundred kb of DNA adjacent to the breakpoint. On normal chromosomes, such probes label a contiguous stretch of DNA and produce a single hybridization domain per chromosome in interphase cells. The translocation disrupts the hybridization domain and the resulting two fragments appear as physically separated hybridization domains in

  14. A parallel-vector algorithm for rapid structural analysis on high-performance computers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Agarwal, Tarun K.

    1990-01-01

    A fast, accurate Choleski method for the solution of symmetric systems of linear equations is presented. This direct method is based on a variable-band storage scheme and takes advantage of column heights to reduce the number of operations in the Choleski factorization. The method employs parallel computation in the outermost DO-loop and vector computation via the loop unrolling technique in the innermost DO-loop. The method avoids computations with zeros outside the column heights, and as an option, zeros inside the band. The close relationship between Choleski and Gauss elimination methods is examined. The minor changes required to convert the Choleski code to a Gauss code to solve non-positive-definite symmetric systems of equations are identified. The results for two large-scale structural analyses performed on supercomputers demonstrate the accuracy and speed of the method.
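
    As a compact illustration of the column-height idea, here is a sketch of a Cholesky factorization whose inner dot products skip entries above the column heights. The dense-array storage is a simplification of the variable-band scheme, and the parallel outer loop and unrolled vector inner loop described in the record are omitted.

    ```python
    # Column-height (skyline-style) Cholesky sketch; storage and loop structure are
    # simplified relative to the variable-band, parallel/vector scheme in the record.
    import numpy as np

    def skyline_cholesky(A):
        """Return lower-triangular L with A = L @ L.T; dot products skip entries above column heights."""
        n = A.shape[0]
        # height[j]: row index of the first (topmost) nonzero in column j of A.
        height = np.array([next(i for i in range(j + 1) if A[i, j] != 0.0 or i == j)
                           for j in range(n)])
        L = np.zeros_like(A, dtype=float)
        for j in range(n):
            for i in range(j, n):
                start = max(height[i], height[j])      # no work above either column height
                s = A[i, j] - L[i, start:j] @ L[j, start:j]
                L[i, j] = np.sqrt(s) if i == j else s / L[j, j]
        return L

    # Small banded symmetric positive-definite test matrix.
    A = np.diag(np.full(6, 4.0)) + np.diag(np.full(5, 1.0), 1) + np.diag(np.full(5, 1.0), -1)
    L = skyline_cholesky(A)
    print(np.allclose(L @ L.T, A))   # True
    ```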

  15. An application of high performance liquid chromatographic assay for the kinetic analysis of degradation of faropenem.

    PubMed

    Cielecka-Piontek, J; Krause, A; Paczkowska, M

    2012-11-01

    An isocratic RP-HPLC-DAD procedure was developed and validated for kinetic analysis of degradation of faropenem in bulk drug substance and in tablets. It involved the use of a C-18 analytical column (5 microm particle size, 250 mm x 4.6 mm), a flow rate of 1.3 ml/min and a 50 microl injection volume. The mobile phase consisted of acetate buffer (pH 3.5) - acetonitrile (70:30 v/v). The determination was carried out at a wavelength of 323 nm. Kinetic studies of faropenem degradation in aqueous solutions included hydrolysis, oxidation, photolysis and thermal degradation. Derivative spectrophotometry was used as an alternative method to compare the observed rate constants. PMID:23210240
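
    For context, observed rate constants in degradation kinetics of this kind are commonly obtained by fitting a first-order model to concentration-time data; the sketch below shows that step on invented data. The time points, concentrations, and the first-order assumption are illustrative, not the study's results.

    ```python
    # Estimating an observed first-order rate constant k_obs from concentration-time data.
    # The data points and the first-order assumption are illustrative only.
    import numpy as np

    t = np.array([0.0, 1.0, 2.0, 4.0, 6.0, 8.0])          # hours
    c = np.array([100.0, 82.0, 67.0, 45.0, 30.0, 20.0])   # % of initial concentration

    slope, intercept = np.polyfit(t, np.log(c), 1)         # ln C = ln C0 - k_obs * t
    k_obs = -slope
    t_half = np.log(2) / k_obs

    print(f"k_obs = {k_obs:.3f} 1/h")
    print(f"t_1/2 = {t_half:.2f} h")
    ```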

  16. Multicriteria Decision Analysis of Material Selection of High Energy Performance Residential Building

    NASA Astrophysics Data System (ADS)

    Čuláková, Monika; Vilčeková, Silvia; Katunská, Jana; Krídlová Burdová, Eva

    2013-11-01

    In a world with a limited amount of energy sources and with serious environmental pollution, interest in comparing the embodied environmental impacts of buildings using different structural systems and alternative building materials will increase. This paper shows the significance of the life cycle energy and carbon perspective and of material selection in reducing energy consumption and emissions production in the built environment. The study evaluates the embodied environmental impacts of nearly zero energy residential structures. The environmental assessment uses the framework of LCA within the boundary cradle to gate. Designed alternative scenarios of material compositions are also assessed in terms of energy effectiveness through selected thermal-physical parameters. This study uses multi-criteria decision analysis to make the selection between alternative scenarios clearer. The results of the MCDA show that alternative E, based on natural plant-based materials (wood, straw bales, massive wood panel), presents a possible path toward sustainable nearly zero energy houses in the Slovak Republic.

  17. Comparative analysis of conjugated bile acids in human serum using high-performance liquid chromatography and capillary electrophoresis.

    PubMed

    Lee, B L; New, A L; Ong, C N

    1997-12-19

    This paper describes the analysis of conjugated bile acids in human serum using reversed-phase high-performance liquid chromatography (HPLC) and micellar electrokinetic capillary electrophoresis (CE). Samples from healthy subjects and patients with different hepatic diseases were pretreated with a simple preparation procedure using a solid-phase extraction technique. The optimal analytical conditions of both chromatographic methods were investigated for convenience and reliability in routine analysis. Both the HPLC and CE methods were found to be reliable and compatible. The recoveries of nine bile acid conjugates using both methods were generally >85%, and reproducibility was >90%. The day-to-day variation of retention time was <5% for HPLC, while the variation of migration time for CE was <3%. Although the HPLC method (detection limit 1 nmol/ml) was five times more sensitive than the CE method, the CE method was considered to be more time and cost effective. PMID:9518169

  18. [Diol column as stationary phase for high performance liquid chromatographic analysis of carbohydrates in drinks with evaporative light scattering detection].

    PubMed

    Wei, Y; Guo, L; Ding, M Y

    2001-11-01

    A high performance liquid chromatographic method with a diol column and evaporative light scattering detector (ELSD) was established for the direct analysis of fructose, glucose, sucrose, maltose and raffinose in a mixture. A separation column (Lichrospher 100 Diol, 250 mm x 4.0 mm i.d., 5 microns, Hewlett-Packard, USA) and a guard column (Zorbax Rx-SIL, 12.5 mm x 4.6 mm i.d., 5 microns) were used. The mobile phase was a mixture of dichloromethane-methanol (3.2:1, volume ratio). Regression equations revealed a linear relationship (correlation coefficients: 0.995-0.999) between the mass of carbohydrates injected and the peak area of carbohydrates detected by ELSD. The detection limits of ELSD (S/N = 3) were about 0.20 microgram for all carbohydrates. This system could be used for the routine analysis of simple carbohydrates in some common drinks on the market. PMID:12545463

  19. High Performance Liquid Chromatographic Analysis of Almotriptan Malate in Bulk and Tablets

    PubMed Central

    Lavudu, Petikam; Rani, Avula Prameela; Divya, Chepuri; Sekharan, Chandra Bala

    2013-01-01

    Purpose: A simple RP-HPLC method has been developed and validated for the determination of almotriptan malate (ATM) in bulk and tablets. Methods: Chromatographic separation of ATM was achieved by using a Thermo Scientific C18 column. A mobile phase containing a mixture of methanol, water and acetic acid (4:8:0.1 v/v) was pumped at a flow rate of 1 mL/min. Detection was performed at 227 nm. The method was validated according to ICH guidelines. Results: The calibration curve was linear in the concentration range 5–60 µg/mL for ATM with a regression coefficient of 0.9999. The method was precise, with RSD <1.2%. Excellent recoveries of 99.60–100.80% proved the accuracy of the method. The limits of detection and quantification were found to be 0.025 and 0.075 µg/mL, respectively. Conclusion: The method was successfully applied for the quantification of ATM in tablets with acceptable accuracy and precision. PMID:24312833
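
    As an illustration of the validation arithmetic reported (linearity, LOD, LOQ), the sketch below fits a calibration line and derives ICH-style detection and quantification limits from the residual standard deviation and the slope. The calibration points are invented, and the residual standard deviation is only one of several accepted estimates of sigma in the ICH formulas.

    ```python
    # Calibration linearity and ICH-style LOD/LOQ from residual standard deviation and slope.
    # Calibration data are invented for illustration.
    import numpy as np

    conc = np.array([5.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0])          # micrograms/mL
    area = np.array([51.0, 101.5, 203.0, 302.0, 405.0, 503.0, 607.0])   # peak areas

    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    residual_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
    r = np.corrcoef(conc, area)[0, 1]

    lod = 3.3 * residual_sd / slope    # ICH Q2(R1) convention
    loq = 10.0 * residual_sd / slope

    print(f"slope={slope:.3f}, intercept={intercept:.3f}, r^2={r**2:.5f}")
    print(f"LOD = {lod:.3f} ug/mL, LOQ = {loq:.3f} ug/mL")
    ```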

  20. Characterization of high-power lithium-ion cells-performance and diagnostic analysis

    SciTech Connect

    Striebel, K.A.; Shim, J.; Kostecki, R.; Richardson, T.J.; Ross, P.N.; Song, X.; Zhuang, G.V.

    2003-11-25

    Lithium-ion cells, with graphite anodes and LiNi0.8Co0.15Al0.05O2 cathodes, were cycled for up to 1000 cycles over different ranges of SOC and temperatures. The decline in cell performance increases with the span of SOC and temperature during cycling. Capacity fade was caused by a combination of the loss of cycleable Li and degradation of the cathode. The room temperature anodes showed SEI compositions and degrees of graphite disorder that correlated with the extent of the Li consumption, which was linear in cell test time. TEM of the cathodes showed evidence of crystalline defects, though no major new phases were identified, consistent with XRD. No evidence of polymeric deposits on the cathode particles (FTIR) was detected although both Raman and TEM showed evidence of P-containing deposits from electrolyte salt degradation. Raman microscopy showed differences in relative carbon contents of the cycled cathodes, which is blamed for part of the cathode degradation.

  1. High Performance Work Systems and Firm Performance.

    ERIC Educational Resources Information Center

    Kling, Jeffrey

    1995-01-01

    A review of 17 studies of high-performance work systems concludes that benefits of employee involvement, skill training, and other high-performance work practices tend to be greater when new methods are adopted as part of a consistent whole. (Author)

  2. Beyond the CM-5: A case study in performance analysis for the CM-5, T3D, and high performance RISC workstations

    SciTech Connect

    Beazley, D.M.; Lomdahl, P.S.

    1995-03-22

    We present a comprehensive performance evaluation of our molecular dynamics code SPaSM on the CM-5 in order to devise optimization strategies for the CM-5, T3D, and RISC workstations. In this analysis, we focus on the effective use of the SPARC microprocessor by performing measurements of instruction set utilization, cache effects, memory access patterns, and pipeline stall cycles. We then show that we can account for more than 99% of the observed execution time of our program. Optimization strategies are devised, and we show that our highly optimized ANSI C program running only on the SPARC microprocessor of the CM-5 is only twice as slow as our Gordon-Bell prize winning code that utilized the CM-5 vector units. On the CM-5E, we show that this optimized code runs faster than the vector unit version. We then apply these techniques to the Cray T3D and measure resulting speedups. Finally, we show that simple optimization strategies are effective on a wide variety of high performance RISC workstations.

  3. Wheat gluten amino acid analysis by high-performance anion-exchange chromatography with integrated pulsed amperometric detection.

    PubMed

    Rombouts, Ine; Lagrain, Bert; Lamberts, Lieve; Celus, Inge; Brijs, Kristof; Delcour, Jan A

    2012-01-01

    This chapter describes an accurate and user-friendly method for determining the amino acid composition of wheat gluten proteins and their gliadin and glutenin fractions. The method consists of hydrolysis of the peptide bonds in 6.0 M hydrochloric acid solution at 110°C for 24 h, followed by evaporation of the acid and separation of the free amino acids by high-performance anion-exchange chromatography with integrated pulsed amperometric detection. In contrast to conventional methods, the analysis requires neither pre- nor postcolumn derivatization, nor a time-consuming oxidation or derivatization step prior to hydrolysis. Correction factors account for incomplete release of Val and Ile even after hydrolysis for 24 h, and for losses of Ser during evaporation. Gradient conditions including an extra eluent allow multiple sequential sample analyses without risk of Glu accumulation on the anion-exchange column, which otherwise would result from high Gln levels in gluten proteins. PMID:22125156

  4. Comprehensive sample analysis using high performance liquid chromatography with multi-detection.

    PubMed

    Pravadali, Sercan; Bassanese, Danielle N; Conlan, Xavier A; Francis, Paul S; Smith, Zoe M; Terry, Jessica M; Shalliker, R Andrew

    2013-11-25

    Herein we assess the separation space offered by a liquid chromatography system with an optimised uni-dimensional separation for the determination of the key chemical entities in the highly complex matrix of a tobacco leaf extract. Multiple modes of detection, including UV-visible absorbance, chemiluminescence (acidic potassium permanganate, manganese(IV), and tris(2,2'-bipyridine)ruthenium(III)), mass spectrometry and DPPH radical scavenging were used in an attempt to systematically reduce the data complexity of the sample whilst obtaining a greater degree of molecule-specific information. A large amount of chemical data was obtained, but several limitations in the ability to assign detector responses to particular compounds, even with the aid of complementary detection systems, were observed. Thirty-three compounds were detected via MS on the tobacco extract and 12 out of 32 compounds gave a peak height ratio (PHR) greater than 0.33 on one or more detectors. This paper serves as a case study of these limitations, illustrating why multidimensional chromatography is an important consideration when developing a comprehensive chemical detection system. PMID:24216214

  5. Analysis of histidine and urocanic acid isomers by reversed-phase high-performance liquid chromatography.

    PubMed

    Hermann, K; Abeck, D

    2001-01-01

    The qualitative separation performance of C18, C8 and C4 reversed-phase columns was investigated for the separation of histidine and its metabolites histamine, 1-methylhistamine and trans- and cis-urocanic acid. Trans- and cis-urocanic acid were baseline separated from their precursor histidine on all three columns using isocratic elution with a mobile phase composed of 0.01 M aqueous TEAP pH 3.0 and acetonitrile at a ratio of 98:2 (v/v). However, histidine was not separated from histamine and 1-methylhistamine. Selecting the C8 column and introducing 0.005 M of the ion-pairing reagent 1-octanesulfonic acid sodium salt into the aqueous solution and acetonitrile at a ratio of 90:10 (v/v) significantly improved the separation. The improved separation was also accompanied by a change in the retention times and the order of elution. The sequence of elution was histidine, cis-urocanic acid, trans-urocanic acid, histamine and 1-methylhistamine with retention times of 5.58 +/- 0.07, 7.03 +/- 0.15, 7.92 +/- 0.18, 18.77 +/- 0.24 and 20.79 +/- 0.21 min (mean +/- SD; n=5). The separation on the C8 column in the presence of the ion-pairing reagent was further improved with gradient elution that resulted in a reduction in the retention times and elution volumes of histamine and 1-methylhistamine. The detection limits of histidine and trans-urocanic acid at a wavelength of 210 nm and an injection volume of 0.05 ml were 5 x 10(-8) mol l(-1) (n=3). The kinetics of the in vitro conversion of the trans- into the cis-isomer after UV irradiation depended on the time of exposure and the energy of the light source. UVB light induced a significantly faster conversion than UVA light. TUCA and cUCA samples kept at -25 degrees C were stable for up to 50 weeks. Samples eluted from human skin showed various concentrations of histidine and trans- and cis-urocanic acid with an average of 1.69 +/- 0.33 x 10(-5) mol l(-1), 1.17 +/- 0.43 x 10(-5) mol l(-1) and 1.67 +/- 0.33 x 10(-5) mol l(-1), respectively.

  6. High-performance liquid chromatography analysis of naturally occurring D-amino acids in sake.

    PubMed

    Gogami, Yoshitaka; Okada, Kaori; Oikawa, Tadao

    2011-11-01

    We measured all of the D- and L-amino acids in 141 bottles of sake using HPLC. We used two precolumn derivatization methods for amino acid enantiomer detection, with o-phthalaldehyde and N-acetyl-L-cysteine as well as (+)-1-(9-fluorenyl)ethyl chloroformate/1-aminoadamantane, and one postcolumn derivatization method with o-phthalaldehyde and N-acetyl-L-cysteine. We found that the sakes contained the D-amino acid forms of Ala, Asn, Asp, Arg, Glu, Gln, His, Ile, Leu, Lys, Ser, Tyr, Val, Phe, and Pro. We were not able to detect D-Met, D-Thr, or D-Trp in any of the sakes analyzed. The most abundant D-amino acids, D-Ala, D-Asp, and D-Glu, ranged from 66.9 to 524.3 μM, corresponding to relative D-enantiomer contents of 34.4, 12.0, and 14.6%, respectively. The basic parameters that generally determine the taste of sake, such as the sake meter value (SMV; "Nihonshudo"), acidity ("Sando"), amino acid value ("Aminosando"), alcohol content by volume, and rice species of the raw material, show no significant relationship to the D-amino acid content of sake. The brewing water ("Shikomimizu") and brewing process had effects on the D-amino acid content of the sakes: the D-amino acid contents of the sakes brewed with deep-sea water "Kaiyoushinosousui", "Kimoto yeast starter", "Yamahaimoto", and the long aging process "Choukijukusei" are high compared with those of other sakes analyzed. Additionally, the D-amino acid content of sakes that were brewed with the adenine auxotroph of sake yeast ("Sekishoku seishu kobo", Saccharomyces cerevisiae) without pasteurization ("Hiire") increased after storage at 25 °C for three months. PMID:21555255

  7. High performance thin layer chromatography (HPTLC) and high performance liquid chromatography (HPLC) for the qualitative and quantitative analysis of Calendula officinalis-advantages and limitations.

    PubMed

    Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana

    2014-09-01

    Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant to allow identification and to quantify the main constituents within the plant. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis and to investigate the effect of different extraction techniques on the C. officinalis extract composition from different parts of the plant. HPTLC was found to be effective for qualitative analysis; however, HPLC was found to be more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. PMID:24880991

  8. Elution strategies for reversed-phase high-performance liquid chromatography analysis of sucrose alkanoate regioisomers with charged aerosol detection.

    PubMed

    Lie, Aleksander; Pedersen, Lars Haastrup

    2013-10-11

    A broad range of elution strategies for RP-HPLC analysis of sucrose alkanoate regioisomers with CAD was systematically evaluated. The HPLC analyses were investigated using design-of-experiments methodology and analysed by analysis of variance (ANOVA) and regression modelling. Isocratic elutions, isocratic elutions with increased flow, and gradient elutions with step-down profiles and step-up profiles were performed, and the chromatographic parameters of the different elution strategies were described by suitable variables. Based on peak resolutions, a general resolution deviation for multiple peaks (RDm) was developed for sample-independent evaluation of the separation of any number of peaks in chromatographic analysis. Isocratic elutions of sucrose alkanoates showed similar relationships between eluent acetonitrile concentration and retention time for all regioisomers of sucrose caprate and sucrose laurate, as confirmed by evaluation of the curvatures using approximate second derivatives and Kendall rank correlation coefficients. Regression modelling and statistical analysis showed that acetonitrile concentration and flow rate were highly significant for both average adjusted retention time and RDm for sucrose laurate. For both responses, the effect of changes in acetonitrile concentration was larger than the effect of changes in flow rate over the ranges studied. Regression modelling of the step-down gradient profiles for the sucrose alkanoates showed that the eluent acetonitrile concentrations were the overall most significant variables for retention time and separation. The models for average adjusted retention time of sucrose caprate and sucrose laurate showed only a few differences in the significance levels of terms, while the models for RDm showed larger differences between the sucrose alkanoates, in both the number of terms and their significance. Efficiency evaluation of elution strategies, in terms of RDm and analysis time, showed that the best results were
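
    The record's RDm metric builds on pairwise peak resolutions; the sketch below computes the conventional resolution Rs = 2(t2 - t1)/(w1 + w2) for adjacent peaks and a simple mean shortfall from a target resolution. This summary is only a stand-in for the authors' RDm definition, and the retention times, peak widths, and target value are assumptions.

    ```python
    # Pairwise peak resolution and a simple deviation-from-target summary.
    # This is a stand-in for the record's RDm metric; data and target are assumptions.
    import numpy as np

    retention = np.array([4.2, 5.0, 5.6, 7.1, 8.3])    # retention times, min
    width = np.array([0.30, 0.32, 0.35, 0.40, 0.42])   # baseline peak widths, min
    target_rs = 1.5                                     # conventional baseline resolution

    rs = 2.0 * np.diff(retention) / (width[:-1] + width[1:])
    shortfall = np.mean(np.clip(target_rs - rs, 0.0, None))   # penalise only under-resolved pairs

    for i, r in enumerate(rs):
        print(f"peaks {i+1}-{i+2}: Rs = {r:.2f}")
    print(f"mean shortfall from Rs = {target_rs}: {shortfall:.2f}")
    ```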

  9. Assessment of repeatability of composition of perfumed waters by high-performance liquid chromatography combined with numerical data analysis based on cluster analysis (HPLC UV/VIS - CA).

    PubMed

    Ruzik, L; Obarski, N; Papierz, A; Mojski, M

    2015-06-01

    High-performance liquid chromatography (HPLC) with UV/VIS spectrophotometric detection combined with the chemometric method of cluster analysis (CA) was used for the assessment of repeatability of composition of nine types of perfumed waters. In addition, the chromatographic method of separating components of the perfume waters under analysis was subjected to an optimization procedure. The chromatograms thus obtained were used as sources of data for the chemometric method of cluster analysis (CA). The result was a classification of a set comprising 39 perfumed water samples with a similar composition at a specified level of probability (level of agglomeration). A comparison of the classification with the manufacturer's declarations reveals a good degree of consistency and demonstrates similarity between samples in different classes. A combination of the chromatographic method with cluster analysis (HPLC UV/VIS - CA) makes it possible to quickly assess the repeatability of composition of perfumed waters at selected levels of probability. PMID:25533703

  10. Commoditization of High Performance Storage

    SciTech Connect

    Studham, Scott S.

    2004-04-01

    The commoditization of high performance computers started in the late 80s with the attack of the killer micros. Previously, high performance computers were exotic vector systems that could only be afforded by an illustrious few. Now everyone has a supercomputer composed of clusters of commodity processors. A similar commoditization of high performance storage has begun. Commodity disks are being used for high performance storage, enabling a paradigm change in storage and significantly changing the price point of high volume storage.

  11. Seismic Assessment of R/C Building Structure through Nonlinear Probabilistic Analysis with High-performance Computing

    SciTech Connect

    Faggella, M.; Barbosa, A.; Conte, J. P.; Restrepo, J. I.; Spacone, E.

    2008-07-08

    This paper presents a probabilistic seismic demand analysis of a three-dimensional R/C building model subjected to tri-axial earthquake excitation. Realistic probability distributions are assumed for the main structural and material properties and for the ground motion Intensity Measure (IM) Sa(T1). Natural ground motions are used in the analyses to represent the inherent randomness in the earthquake ground motion time histories. Monte Carlo simulations are performed to account for the record-to-record variability, and Tornado diagrams are used to represent the uncertainty induced in the response by the basic uncertainties in the structural properties. In order to perform a probabilistic study on three-dimensional engineering demand parameters (EDPs), a large number of ensemble time history analyses were carried out using the TeraGrid high-performance computing resources available at the San Diego Supercomputer Center. Early results show that for the testbed building used in this study, uncertainty in the structural parameters contributes little to the uncertainty of the EDPs, while large variations in the EDPs are due to the variability of the ground motion intensity measure and the record-to-record variability.
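
    To make the tornado-diagram idea concrete, the sketch below perturbs each input of a toy demand model one at a time between low and high percentile values, holding the others at their medians, and reports the resulting response swings in descending order. The response function and the parameter ranges are invented, so the printed ordering is illustrative only.

    ```python
    # One-at-a-time swing (tornado diagram) analysis for a toy demand model.
    # The response function and parameter ranges are invented for illustration.
    import numpy as np

    def demand(params):
        """Toy engineering demand parameter (a drift-like response)."""
        stiffness, strength, damping, im = params
        return im / (stiffness * damping) + 0.1 * im / strength

    # (low 10th percentile, median, high 90th percentile) for each input.
    ranges = {
        "stiffness": (0.8, 1.0, 1.2),
        "strength":  (0.9, 1.0, 1.1),
        "damping":   (0.03, 0.05, 0.07),
        "IM Sa(T1)": (0.2, 0.5, 1.0),
    }
    names = list(ranges)
    medians = np.array([ranges[n][1] for n in names])

    swings = []
    for k, name in enumerate(names):
        lo, _, hi = ranges[name]
        p_lo, p_hi = medians.copy(), medians.copy()
        p_lo[k], p_hi[k] = lo, hi
        swings.append((name, abs(demand(p_hi) - demand(p_lo))))

    # Tornado ordering: largest swing first.
    for name, swing in sorted(swings, key=lambda s: s[1], reverse=True):
        print(f"{name:10s} swing = {swing:.3f}")
    ```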

  12. Evaluation of ternary mobile phases for the analysis of carbonyl compound derivatives using high-performance liquid chromatography.

    PubMed

    Ho, Duy Xuan; Kim, Ki-Hyun

    2011-01-01

    In this study, the feasibility of ternary mobile phases was examined in a high-performance liquid chromatography (HPLC)-based analysis of carbonyl compounds (CCs). To test the performance of different ternary phases, the liquid phase standards containing a 15 aldehyde/ketone-DNPH(o) mix were analyzed through a series of five-point calibration experiments. For this comparison, three types of ternary mobile phases were prepared initially by mixing water (W) with two of the following three organic solvents: isopropanol (I), methanol (M), and tetrahydrofuran (T). The resulting three types of ternary phases (named as WIM, WTM, and WIT) were tested and evaluated in relation to the water content or in terms of methanol-to-water ratio (M/W). The results derived by the three ternary phases revealed that the optimal resolution was attained near maximum water content, while those of WIT consistently suffered from poor resolution problems. The relative performances of WIM and WTM phases, if assessed by three key operating parameters (sensitivity, retention time, and resolution), were found to be reliable for most selected CCs with the decreasing M/W ratio. PMID:21218260

  13. New trend in the LC separation analysis of pharmaceuticals--high performance separation by ultra high-performance liquid chromatography (UHPLC) with core-shell particle C18 columns--.

    PubMed

    Nishi, Hiroyuki; Nagamatsu, Kumi

    2014-01-01

    This article presents a mini-review of the recent results in the ultra high-performance liquid chromatography (UHPLC) separation of pharmaceuticals by our group. High performance UHPLC separation employing core-shell particle C18 columns was demonstrated. High performance (a high theoretical plate number of approximately 20000 per 10 cm and a low theoretical plate height of 5 μm) was obtained without any specific devices in a conventional HPLC apparatus, only by changing the detector sampling time and the inner diameter of the connecting tube. The high theoretical plate numbers with low column back pressure obtained with the core-shell particle columns enabled fast separation of the analytes. Methanol, which gives higher column pressure drops in reversed-phase mode HPLC than acetonitrile, can be used without any trouble. One analysis in the purity testing of diltiazem hydrochloride was performed within 100 s. One analysis in the photostability testing of mecobalamin (a vitamin B12 analogue) was successful within 180 s. PMID:24521905

  14. High Performance Computing Today

    SciTech Connect

    Dongarra, Jack; Meuer,Hans; Simon,Horst D.; Strohmaier,Erich

    2000-04-01

    In the last 50 years, the field of scientific computing has seen a rapid change of vendors, architectures, technologies and the usage of systems. Despite all these changes, the evolution of performance on a large scale seems to be a very steady and continuous process. Moore's Law is often cited in this context. If the authors plot in Figure 1 the peak performance of the various computers of the last 5 decades that could have been called the supercomputers of their time, they indeed see how well this law holds for almost the complete lifespan of modern computing. On average, they see an increase in performance of two orders of magnitude every decade.

  15. Modern, PC based, high resolution portable EDXRF analyzer offers laboratory performance for field, in-situ analysis of environmental contaminants

    NASA Astrophysics Data System (ADS)

    Piorek, Stanislaw

    1994-12-01

    The introduction of a new, high-resolution, portable probe that improved the sensitivity of conventional field portable X-ray fluorescence (FPXRF) by up to an order of magnitude was reported earlier [S. Piorek and J.R. Pasmore, Proc. 2nd Int. Symp. on Field Screening Methods for Hazardous Wastes and Toxic Chemicals, Las Vegas, 1991, p. 737]. The high-resolution Si(Li) detector probe operates connected to a multichannel X-ray analyzer (2048 channels), which is housed in a portable, battery-powered industrial computer. The improved energy resolution of the detector allows the implementation of more sophisticated data treatment methods to convert the measured intensities into mass concentrations of the analytes. A backscatter with fundamental parameters approach (BFP) is one of the best methods, specifically for metallic contaminants in soil. A program has been written based on the BFP method for use with the new probe. The new software/probe combination enables one to quickly assess levels of contaminants on the site without the need for analyzed samples for instrument calibration. The performance of the EDXRF system in application to the analysis of metals in contaminated soil is discussed in this paper. Also discussed is the extension of this method to the analysis of other types of environmental samples, such as air particulates collected on filter paper.

  16. Fast analysis of isoflavones by high-performance liquid chromatography using a column packed with fused-core particles.

    PubMed

    Manchón, N; D'Arrigo, M; García-Lafuente, A; Guillamón, E; Villares, A; Ramos, A; Martínez, J A; Rostagno, M A

    2010-10-15

    The recent development of fused-core technology in HPLC columns is enabling faster and highly efficient separations. This technology was evaluated for the development of a fast analysis method for the most relevant soy isoflavones. A step-by-step strategy was used to optimize temperature (25-50°C), flow rate (1.2-2.7 mL/min), mobile phase composition and equilibration time (1-5 min). Optimized conditions provided a method for the separation of all isoflavones in less than 5.8 min and a total analysis time (sample-to-sample) of 11.5 min. Evaluation of chromatographic performance revealed excellent reproducibility, resolution, selectivity, peak symmetry and low limits of detection and quantification. The use of a fused-core column allows highly efficient, sensitive, accurate and reproducible determination of isoflavones with an outstanding sample throughput and resolution. The developed method was validated with different soy samples with total isoflavone concentrations ranging from 1941.53 to 2460.84 μg g(-1), with the predominant isoflavones being isoflavone glucosides and malonyl derivatives. PMID:20875606

  17. High Performance Tools And Technologies

    SciTech Connect

    Collette, M R; Corey, I R; Johnson, J R

    2005-01-24

    The goal of this project was to evaluate the capability and limits of current scientific simulation development tools and technologies with specific focus on their suitability for use with the next generation of scientific parallel applications and High Performance Computing (HPC) platforms. The opinions expressed in this document are those of the authors, and reflect the authors' current understanding of the functionality of the many tools investigated. As a deliverable for this effort, we are presenting this report describing our findings along with an associated spreadsheet outlining current capabilities and characteristics of leading and emerging tools in the high performance computing arena. This first chapter summarizes our findings (which are detailed in the other chapters) and presents our conclusions, remarks, and anticipations for the future. In the second chapter, we detail how various teams in our local high performance community utilize HPC tools and technologies, and mention some common concerns they have about them. In the third chapter, we review the platforms currently or potentially available for using these tools and technologies to help in software development. Subsequent chapters attempt to provide an exhaustive overview of the available parallel software development tools and technologies, including their strong and weak points and future concerns. We categorize them as debuggers, memory checkers, performance analysis tools, communication libraries, data visualization programs, and other parallel development aids. The last chapter contains our closing information. Included with this paper at the end is a table of the discussed development tools and their operational environment.

  18. Comparative analysis of different plant oils by high-performance liquid chromatography-atmospheric pressure chemical ionization mass spectrometry.

    PubMed

    Jakab, Annamaria; Héberger, Károly; Forgács, Esther

    2002-11-01

    Different vegetable oil samples (almond, avocado, corngerm, grapeseed, linseed, olive, peanut, pumpkin seed, soybean, sunflower, walnut, wheatgerm) were analyzed using high-performance liquid chromatography-atmospheric pressure chemical ionization mass spectrometry. A gradient elution technique was applied using acetone-acetonitrile eluent systems on an ODS column (Purospher, RP-18e, 125 x 4 mm, 5 microm). Identification of triacylglycerols (TAGs) was based on the pseudomolecular ion [M+1]+ and the diacylglycerol fragments. The positional isomers of the triacylglycerols were identified from the relative intensities of the [M-RCO2]+ fragments. Linear discriminant analysis (LDA), a common multivariate mathematical-statistical method, was successfully used to distinguish the oils based on their TAG composition. LDA showed that 97.6% of the samples were classified correctly. PMID:12462617
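
    A hedged sketch of the discriminant-analysis step: linear discriminant analysis applied to a synthetic triacylglycerol composition matrix, with leave-one-out cross-validation giving the fraction of samples classified correctly. The data, the number of oil types, and the absence of preprocessing are assumptions, not the study's measurements.

    ```python
    # Linear discriminant analysis of oil samples from a synthetic TAG composition matrix,
    # with leave-one-out estimation of the fraction classified correctly.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)
    oil_types, samples_per_type, n_tags = 4, 10, 12

    # Each oil type gets its own mean TAG profile; samples scatter around it.
    profiles = rng.uniform(1, 20, size=(oil_types, n_tags))
    X = np.vstack([rng.normal(p, 0.8, size=(samples_per_type, n_tags)) for p in profiles])
    y = np.repeat(np.arange(oil_types), samples_per_type)

    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut()).mean()
    print(f"fraction classified correctly: {100 * acc:.1f}%")
    ```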

  19. Development and validation of a high-performance liquid chromatographic method for the analysis of propylthiouracil in pharmaceuticals.

    PubMed

    Abdul-Fattah, A M; Bhargava, H N

    2001-09-01

    A simple, rapid, and stability-indicating high-performance liquid chromatographic (HPLC) method was developed and validated for the assay of propylthiouracil (PTU). The method was used to quantify PTU in topical formulations and in tablets. Excellent linearity was observed between PTU concentration and the peak area (R2= 0.999). The limit of detection was 1 ng, and the limit of quantitation was 1.2 ng. The method proved to be selective. Selectivity was validated by subjecting a stock solution of PTU to acidic, basic, and oxidative degradations. The peaks of the degradation products did not interfere with the peak of PTU. Excipients present in the dosage forms did not interfere with the analysis, and the recovery of PTU from each dosage form was quantitative. PMID:11699835

  20. Feasibility of thiocarbamate pesticide analysis in apples by supercritical fluid extraction and high-performance liquid chromatography.

    PubMed

    Howard, A L; Braue, C; Taylor, L T

    1993-08-01

    Supercritical fluid extraction produced results comparable with those of liquid-solid extraction for the analysis of several thiocarbamate pesticides from apples at the 2 ppm spike level. These results were achieved with a simple one-step extraction procedure. The use of diatomaceous earth (Celite, Supelco, Inc.; Bellefonte, PA) served to increase thiocarbamate recoveries by aiding in the immobilization of the aqueous component of the apple matrix. High-performance liquid chromatography coupled with ultraviolet absorbance detection (HPLC-UV) was the most viable means of quantitation when compared with micro-HPLC-sulfur chemiluminescence detection (SCD) and gas chromatography-flame-ionization detection (GC-FID). The small injection volumes used with the micro-HPLC-SCD system made thiocarbamate detection at a spiking level of 2 ppm impossible. SCD did provide, however, valuable qualitative information about the nature of the apple coextractants. PMID:8376544

  1. High-performance liquid chromatographic, capillary electrophoretic and capillary electrophoretic-electrospray ionisation mass spectrometric analysis of selected alkaloid groups.

    PubMed

    Stöckigt, Joachim; Sheludk, Yuri; Unger, Matthias; Gerasimenko, Irina; Warzecha, Heribert; Stöckigt, Detlef

    2002-08-16

    Systems for efficient separation of selected alkaloid groups by high performance liquid chromatography (HPLC), capillary electrophoresis (CE) and capillary electrophoresis coupled with electrospray ionisation mass spectrometry (CE-ESI-MS) are described. The optimized HPLC system was applied for the separation of 23 standard indole alkaloids as well as for qualitative and quantitative analyses of crude alkaloid extracts of Rauvolfia serpentina X Rhazya stricta hybrid cell cultures. The developed conditions for CE analysis proved to be efficient for separation of mixtures of standard indole and beta-carboline alkaloids. The described buffer system is also applicable in the combination of CE with electrospray ionisation mass spectrometry. This analytical technique allowed the separation and identification of components of standard indole alkaloid mixture as well as crude extracts of R. serpentina roots, R. serpentina cell suspension cultures and cortex of Aspidosperma quebracho-blanco. The influence of buffer composition and analyte structures on separation is discussed. PMID:12219932

  2. New High-Performance Droplet Freezing Assay (HP-DFA) for the Analysis of Ice Nuclei with Complex Composition

    NASA Astrophysics Data System (ADS)

    Kunert, Anna Theresa; Scheel, Jan Frederik; Helleis, Frank; Klimach, Thomas; Pöschl, Ulrich; Fröhlich-Nowoisky, Janine

    2016-04-01

    Freezing of water above the homogeneous freezing temperature is catalyzed by ice nucleation active (INA) particles called ice nuclei (IN), which can be of various inorganic or biological origins. The freezing temperatures reach up to -1 °C for some biological samples and are dependent on the chemical composition of the IN. The standard method to analyze IN in solution is the droplet freezing assay (DFA) established by Gabor Vali in 1970. Several modifications and improvements have been made within the last decades, but existing assays are still limited by either small droplet numbers, large droplet volumes or inadequate separation of the individual droplets, resulting in mutual interference and therefore improper measurements. The probability that miscellaneous IN are concentrated together in one droplet increases with the volume of the droplet, which can be described by the Poisson distribution. At a given concentration, the partition of a droplet into several smaller droplets leads to finely dispersed IN, resulting in better statistics and therefore in a better resolution of the nucleation spectrum. We designed a new customized high-performance droplet freezing assay (HP-DFA), which represents an upgrade of the previously existing DFAs in terms of temperature range and statistics. The necessity of observing freezing events at temperatures lower than homogeneous freezing, due to freezing point depression, requires high-performance thermostats combined with optimal insulation. Furthermore, we developed a cooling setup which allows both large and small temperature changes within a very short period of time. Besides that, the new DFA allows the analysis of more than 750 droplets per run with a small droplet volume of 5 μL. This enables a fast and more precise analysis of biological samples with complex IN composition as well as better statistics for every sample at the same time.
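
    To make the Poisson argument concrete, the sketch below computes the expected number of IN per droplet and the probability that a droplet contains more than one IN for two droplet volumes, and then converts a frozen fraction into a cumulative IN concentration with the standard Vali-type relation K(T) = -ln(1 - f)/V. The concentrations, volumes, and frozen fraction are illustrative values.

    ```python
    # Poisson statistics of ice nuclei (IN) per droplet and the standard Vali-type
    # conversion of a frozen fraction into a cumulative IN concentration.
    # Concentrations, volumes and frozen fractions below are illustrative values.
    import math

    c_in = 200.0                      # IN per mL of suspension (assumed)
    for volume_uL in (50.0, 5.0):     # larger vs smaller droplets
        lam = c_in * volume_uL * 1e-3                      # expected IN per droplet
        p_multiple = 1.0 - math.exp(-lam) * (1.0 + lam)    # P(more than one IN per droplet)
        print(f"{volume_uL:5.1f} uL droplets: mean IN/droplet = {lam:.2f}, "
              f"P(>1 IN) = {p_multiple:.2f}")

    # Cumulative IN concentration from a frozen fraction f at temperature T:
    # K(T) = -ln(1 - f) / V   (per unit volume of suspension)
    f_frozen = 0.30                   # fraction of droplets frozen at some temperature
    V_mL = 5.0e-3                     # droplet volume in mL (5 uL)
    K = -math.log(1.0 - f_frozen) / V_mL
    print(f"K(T) = {K:.1f} IN per mL at the corresponding temperature")
    ```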

  3. Ultra-high performance supercritical fluid chromatography coupled with quadrupole-time-of-flight mass spectrometry as a performing tool for bioactive analysis.

    PubMed

    Grand-Guillaume Perrenoud, Alexandre; Guillarme, Davy; Boccard, Julien; Veuthey, Jean-Luc; Barron, Denis; Moco, Sofia

    2016-06-10

    Secondary metabolites are an almost unlimited reservoir of potential bioactive compounds. In view of the wide chemical space covered by natural compounds, their comprehensive analysis requires multiple cutting-edge approaches. This study evaluates the applicability of ultra-high performance supercritical fluid chromatography coupled to quadrupole-time-of-flight mass spectrometry (UHPSFC-QqToF-MS) as an analytical strategy for plant metabolite profiling. The versatility of this analytical platform was first assessed using 120 highly diverse natural compounds (in terms of lipophilicity, hydrogen bond capability, acid-base properties, molecular mass and chemical structure) that were screened on a set of 15 rationally chosen stationary phase chemistries. UHPSFC-QqToF-MS provided a suitable analytical solution for 88% of the tested compounds. Three stationary phases (Diol, not endcapped C18 and 2-EP) were highlighted as particularly polyvalent, since they allowed suitable elution of 101 out of 120 natural compounds. The systematic evaluation of retention and selectivity further underlined the suitability of these three columns for the separation of natural compounds. This reduced set of key stationary phases constitutes a basis for untargeted scouting analysis and method development. Although less versatile, stationary phases such as the endcapped T3C18 and the polar P-PFP were nevertheless found to provide extended selectivity for specific sub-classes of natural molecules. Finally, the identified polyvalent conditions were successfully applied to the analysis of complex polar and non-polar plant extracts. These first experimental hits demonstrate the full applicability and potential of UHPSFC-QqToF-MS for plant metabolite profiling. PMID:27156735

  4. Analysis of aspartyl peptide degradation products by high-performance liquid chromatography and high-performance liquid chromatography-mass spectrometry.

    PubMed

    De Boni, Silvia; Oberthür, Christine; Hamburger, Matthias; Scriba, Gerhard K E

    2004-01-01

    A reversed-phase HPLC method was developed for the analysis of the degradation products of the model aspartyl tripeptides Phe-Asp-GlyNH2 and Gly-Asp-PheNH2 after incubation at pH 2 and 10. Most of the compounds could be separated with a gradient of acetonitrile in water containing 0.1% trifluoroacetic acid. Resolution of the isomeric pairs L-Phe-alpha-L-Asp-GlyNH2/L-Phe-beta-L-Asp-GlyNH2 and L-Phe-alpha-D-Asp-GlyOH/L-Phe-beta-D-Asp-GlyOH was achieved with a gradient of acetonitrile in phosphate buffer, pH 5.0. Under acidic conditions the major degradation pathway was cleavage of the peptide backbone amide bonds, yielding dipeptides and amino acids, together with C-terminal deamidation and the formation of succinimidyl peptides. At alkaline pH, deamidation of the C-terminal amide as well as isomerization and concomitant enantiomerization of Asp were observed. The peaks were identified both by reference substances and by online electrospray mass spectrometry. The results were compared to a previously developed capillary electrophoresis method. Diastereomeric pairs of peptides that could not be separated by capillary electrophoresis were resolved by HPLC, while the separation of the corresponding pairs of alpha- and beta-Asp peptides was not always achieved by HPLC, in contrast to capillary electrophoresis, illustrating that the two techniques can be complementary in peptide analysis. PMID:14753775

  5. On-line coupled high performance liquid chromatography-gas chromatography for the analysis of contamination by mineral oil. Part 1: method of analysis.

    PubMed

    Biedermann, Maurus; Grob, Koni

    2012-09-14

    For the analysis of mineral oil saturated hydrocarbons (MOSH) and mineral oil aromatic hydrocarbons (MOAH), on-line coupled high performance liquid chromatography-gas chromatography-flame ionization detection (HPLC-GC-FID) offers important advantages: it separates MOSH and MOAH in a robust manner, enables direct injection of large aliquots of raw extracts (resulting in a low detection limit), avoids contamination of the sample during preparation and is fully automated. This review starts with an overview of the technology, particularly the fundamentals of introducing large volumes of solvent into GC and their implementation in various transfer techniques. The main part deals with the concepts of MOSH and MOAH analysis, with a thorough discussion of the choices made. It is followed by a description of the method. Finally, auxiliary tools are summarized for removing interfering components, enriching the sample in the case of a high fat content, and obtaining additional information about the MOSH and MOAH composition. PMID:22770383

  6. High Voltage TAL Performance

    NASA Technical Reports Server (NTRS)

    Jacobson, David T.; Jankovsky, Robert S.; Rawlin, Vincent K.; Manzella, David H.

    2001-01-01

    The performance of a two-stage, anode layer Hall thruster was evaluated. Experiments were conducted in single- and two-stage configurations. In the single-stage configuration, the thruster was operated at discharge voltages ranging from 300 to 1700 V. Discharge specific impulses ranged from 1630 to 4140 sec. Thruster investigations were conducted at input powers ranging from 1 to 8.7 kW, corresponding to power throttling of nearly 9:1. An extensive two-stage performance map was generated. Data taken with the total voltage (sum of discharge and accelerating voltage) held constant revealed a decrease in thruster efficiency as the discharge voltage was increased. Anode specific impulse values were comparable in the single- and two-stage configurations, showing no strong advantage for two-stage operation.
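
    For readers less familiar with the figures quoted above, the conventional Hall-thruster relations connecting them are worth recalling (these are standard definitions, not formulas restated in this report): the discharge specific impulse and discharge efficiency are commonly computed from the measured thrust F, the anode (discharge) propellant flow rate and the discharge power as

    ```latex
    I_{sp,d} = \frac{F}{\dot{m}_a\, g_0}, \qquad
    \eta_d = \frac{F^2}{2\, \dot{m}_a\, P_d}, \qquad g_0 = 9.81\ \mathrm{m\,s^{-2}}
    ```

    With these definitions, raising the discharge voltage raises the specific impulse roughly as the square root of the voltage, which is broadly consistent with the reported increase from 1630 to 4140 sec over 300 to 1700 V.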

  7. High Performance Arcjet Engines

    NASA Technical Reports Server (NTRS)

    Kennel, Elliot B.; Ivanov, Alexey Nikolayevich; Nikolayev, Yuri Vyacheslavovich

    1994-01-01

    This effort sought to exploit advanced single crystal tungsten-tantalum alloy material for the fabrication of a high strength, high temperature arcjet anode. The use of this material is expected to result in improved strength, temperature resistance, and lifetime compared to state-of-the-art polycrystalline alloys. In addition, the use of high electrical and thermal conductivity carbon-carbon composites was considered, and is believed to be a feasible approach. A highly conductive carbon-carbon composite anode capability represents enabling technology for rotating-arc designs derived from the Russian Scientific Research Institute of Thermal Processes (NIITP) because of the high heat fluxes at the anode surface. However, for US designs the anode heat flux is much smaller, and thus the benefits are not as great as in the case of NIITP-derived designs. Still, it does appear that the tensile properties of carbon-carbon can be even better than those of single crystal tungsten alloys, especially when nearly-single-crystal fibers such as vapor grown carbon fiber (VGCF) are used. Composites fabricated from such materials must be coated with a refractory carbide coating in order to ensure compatibility with high temperature hydrogen. Fabrication of tungsten alloy single crystals in the sizes required for an arcjet anode has been shown to be feasible. Test data indicate that the material can be expected to be at least the equal of the W-Re-HfC polycrystalline alloy in terms of its tensile properties, and possibly superior. We are also informed by our colleagues at Scientific Production Association Luch (NPO Luch) that it is possible to use Russian technology to fabricate polycrystalline W-Re-HfC or other high strength alloys if desired. This is important because existing engines must rely on previously accumulated stocks of these materials, and a fabrication capability for future requirements is not assured.

  8. High-throughput analysis of 19 endogenous androgenic steroids by ultra-performance convergence chromatography tandem mass spectrometry.

    PubMed

    Quanson, Jonathan L; Stander, Marietjie A; Pretorius, Elzette; Jenkinson, Carl; Taylor, Angela E; Storbeck, Karl-Heinz

    2016-09-15

    11-Oxygenated steroids such as 11-ketotestosterone and 11-ketodihydrotestosterone have recently been shown to play a putative role in the development and progression of castration resistant prostate cancer. In this study we report the development of a high-throughput ultra-performance convergence chromatography tandem mass spectrometry (UPC²-MS/MS) method for the analysis of thirteen 11-oxygenated and six canonical C19 steroids isolated from a cell culture matrix. Using an Acquity UPC² BEH 2-EP column, we found that UPC² resulted in superior selectivity, increased chromatographic efficiency and a scattered elution order when compared to conventional reversed-phase ultra-performance liquid chromatography (UPLC). Furthermore, there was a significant improvement in sensitivity (5-50 times). The lower limits of quantification ranged between 0.01 and 10 ng mL⁻¹, while the upper limit of quantification was 100 ng mL⁻¹ for all steroids. Accuracy, precision, intra-day variation, recovery, matrix effects and process efficiency were all evaluated and found to be within acceptable limits. Taken together, we show that the increased power of UPC²-MS/MS allows the analyst to complete in vitro assays at biologically relevant concentrations for the first time and, in doing so, to determine the routes of steroid metabolism, which is vital for studies of androgen-responsive cancers such as prostate cancer and could highlight new mechanisms of disease progression and new targets for cancer therapy. PMID:27479683

  9. Rapid high-performance liquid chromatography method for the analysis of sodium benzoate and potassium sorbate in foods.

    PubMed

    Pylypiw, H M; Grether, M T

    2000-06-23

    A rapid and reliable method is presented for the determination of the preservatives sodium benzoate and potassium sorbate in fruit juices, sodas, soy sauce, ketchup, peanut butter, cream cheese, and other foods. The procedure utilizes high-performance liquid chromatography (HPLC) followed by UV diode array detection for identification and quantitation of the two preservatives. Liquid samples were prepared by diluting 1 ml of the sample with 10 ml of an acetonitrile/ammonium acetate buffer solution. Samples of viscous or solid foods were prepared by blending the sample with the same buffer solution in a 1:5 ratio, followed by a dilution identical to that of the liquid samples. All samples were filtered to remove particulate matter prior to analysis. The HPLC determination of the preservatives was performed using a reversed-phase C18 column and UV detection at 225 nm for sodium benzoate and 255 nm for potassium sorbate. The percentage of preservative in the sample was calculated by external standardization using authentic sodium benzoate and potassium sorbate. Apple juice, apple sauce, soy sauce, and peanut butter, spiked at 0.10 and 0.050% for both sodium benzoate and potassium sorbate, yielded recoveries ranging from 82 to 96%. The method can detect 0.0010% (10 mg/l) of either preservative in a juice matrix. PMID:10910223
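
    The external-standard quantitation described above reduces to a simple calculation: read the concentration of the diluted extract off the standard calibration, multiply by the dilution factor, and express the result as a percentage of the sample. The sketch below is a hypothetical illustration; the 1:10 dilution (factor 11) follows from the abstract, but the peak area and calibration slope are invented numbers:

    ```python
    def preservative_percent(peak_area: float, calib_slope: float, dilution_factor: float) -> float:
        """Percent (w/v) preservative via external-standard quantitation.

        calib_slope     -- detector response per mg/L from the external standards (hypothetical)
        dilution_factor -- e.g. 11 for 1 mL sample diluted with 10 mL buffer
        """
        conc_injected_mg_l = peak_area / calib_slope   # concentration in the injected dilution
        conc_sample_mg_l = conc_injected_mg_l * dilution_factor
        return conc_sample_mg_l / 10_000.0             # 1% (w/v) = 10 g/L = 10,000 mg/L

    # Hypothetical juice sample: peak area 152000, calibration slope 1650 area units per mg/L
    print(f"benzoate: {preservative_percent(152_000, 1_650, 11.0):.3f} % (w/v)")
    ```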

  10. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    PubMed Central

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
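
    The authors' actual code is in their supporting information; the abstract only names the qualities it must have (stability, robustness, maintainability on a shared cluster). Purely as a generic illustration of one such measure — retrying transient scheduler failures with backoff — here is a minimal sketch; the sbatch command, script name and retry policy are assumptions, not the paper's implementation:

    ```python
    import subprocess
    import time

    def submit_with_retries(cmd: list[str], max_retries: int = 3, backoff_s: float = 60.0) -> str:
        """Submit a batch job, retrying when the scheduler rejects it transiently.

        cmd -- submission command, e.g. ["sbatch", "align_exome.sh"] (illustrative only)
        Returns the scheduler's stdout (typically the job id line) on success.
        """
        for attempt in range(1, max_retries + 1):
            result = subprocess.run(cmd, capture_output=True, text=True)
            if result.returncode == 0:
                return result.stdout.strip()
            # A production workflow would also distinguish permanent errors (bad script)
            # from transient ones (scheduler busy) before deciding to retry.
            print(f"attempt {attempt}/{max_retries} failed: {result.stderr.strip()}")
            time.sleep(backoff_s * attempt)
        raise RuntimeError(f"job submission failed after {max_retries} attempts: {cmd}")
    ```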

  11. Investigation of eluted monomers from resin-based root canal sealer by high-performance liquid chromatography analysis

    PubMed Central

    Omurlu, Huma; Arisu, Hacer Deniz; Dalkilic, Evrim Eliguzeloglu; Tamer, Ugur; Torul, Hilal

    2016-01-01

    Objective: The purpose of the current study was to determine the amount of urethane dimethacrylate (UDMA), bisphenol A-glycidyl methacrylate (Bis-GMA), poly (ethylene glycol) dimethacrylate (PEGDMA), bisphenol A ethoxylated dimethacrylate (Bis-EMA), and 2-hydroxyethyl methacrylate (HEMA) eluted from the resin-based root canal sealer Epiphany using high-performance liquid chromatography (HPLC). Materials and Methods: Epiphany was placed into plastic molds and light-cured with a light-emitting diode. After the curing process, each specimen in the first group (n = 12) was immersed in an Eppendorf tube containing phosphate-buffered saline (PBS) and incubated for 45 s. In the second group, each specimen (n = 12) was immersed in an Eppendorf tube containing PBS and incubated for 24 h. Of each specimen extract, 100 μL was subjected to HPLC. Data were analyzed by one-way analysis of variance (P < 0.05). Results: All of the samples eluted HEMA, UDMA, Bis-GMA, PEGDMA, and Bis-EMA. A significant difference between the two time periods was found for HEMA, UDMA, PEGDMA, and Bis-EMA (P < 0.05). Conclusion: The results of the current study showed that Epiphany releases HEMA, UDMA, Bis-GMA, PEGDMA, and Bis-EMA in both time periods. PMID:27011746

  12. Analysis of metal ions in crude oil by reversed-phase high performance liquid chromatography using short column.

    PubMed

    Salar Amoli, H; Porgam, A; Bashiri Sadr, Z; Mohanazadeh, F

    2006-06-16

    In this study, a rapid, simultaneous analysis of V, Ni, Fe and Cu in crude oil was achieved by high performance liquid chromatography using a 10 cm reversed-phase C18 column. Since the metal ions are present at very low levels, solvent extraction of the metals with a ligand, 8-hydroxyquinoline, from acidic media was investigated with some modifications to previous procedures. Average extraction recoveries were 99, 85, 94 and 96% for V, Ni, Fe and Cu, respectively. The proposed method was successfully applied to crude oil obtained from the Koshk area in southern Iran. Fast analysis of the metal ions on the reversed-phase short column was achieved with methanol/water (55/45, v/v), and detection limits, measured as three times the background noise, were determined. It was also shown that adding a small amount of 8-hydroxyquinoline to the mobile phase improved peak height and peak symmetry. A typical chromatogram for the separation of the 8-hydroxyquinoline complexes of V (V), Ni (II), Fe (III) and Cu (II) in crude oil was obtained in less than 4 min. PMID:16723133
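
    "Three times the background noise" refers to the usual signal-to-noise criterion for a detection limit. Expressed in concentration units (a standard convention, not a formula stated in the abstract), with the baseline noise and the calibration slope:

    ```latex
    \mathrm{LOD} \;\approx\; \frac{3\,\sigma_{\mathrm{noise}}}{S}
    ```

    where σ_noise is the standard deviation (or an equivalent estimate) of the baseline noise and S is the detector response per unit concentration.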

  13. Comparative Performance of Four Methods for High-throughput Glycosylation Analysis of Immunoglobulin G in Genetic and Epidemiological Research*

    PubMed Central

    Huffman, Jennifer E.; Pučić-Baković, Maja; Klarić, Lucija; Hennig, René; Selman, Maurice H. J.; Vučković, Frano; Novokmet, Mislav; Krištić, Jasminka; Borowiak, Matthias; Muth, Thilo; Polašek, Ozren; Razdorov, Genadij; Gornik, Olga; Plomp, Rosina; Theodoratou, Evropi; Wright, Alan F.; Rudan, Igor; Hayward, Caroline; Campbell, Harry; Deelder, André M.; Reichl, Udo; Aulchenko, Yurii S.; Rapp, Erdmann; Wuhrer, Manfred; Lauc, Gordan

    2014-01-01

    The biological and clinical relevance of glycosylation is becoming increasingly recognized, leading to a growing interest in large-scale clinical and population-based studies. In the past few years, several methods for high-throughput analysis of glycans have been developed, but thorough validation and standardization of these methods is required before significant resources are invested in large-scale studies. In this study, we compared liquid chromatography, capillary gel electrophoresis, and two MS methods for quantitative profiling of N-glycosylation of IgG in the same data set of 1201 individuals. To evaluate the accuracy of the four methods we then performed analysis of association with genetic polymorphisms and age. Chromatographic methods with either fluorescent or MS-detection yielded slightly stronger associations than MS-only and multiplexed capillary gel electrophoresis, but at the expense of lower levels of throughput. Advantages and disadvantages of each method were identified, which should inform the selection of the most appropriate method in future studies. PMID:24719452

  14. Metabolite analysis of toosendanin by an ultra-high performance liquid chromatography-quadrupole-time of flight mass spectrometry technique.

    PubMed

    Wu, Jian-Lin; Leung, Elaine Lai-Han; Zhou, Hua; Liu, Liang; Li, Na

    2013-01-01

    Toosendanin is the major bioactive component of Melia toosendan Sieb. et Zucc., which is traditionally used for the treatment of abdominal pain and as an insecticide. Previous studies reported that toosendanin possesses hepatotoxicity, but the mechanism remains unknown. Its bioavailability in rats is low, which indicates that the hepatotoxicity might be induced by its metabolites. In the current study, we therefore examined the metabolites obtained by incubating toosendanin with human liver microsomes; six of these metabolites (M1-M6) were identified for the first time by ultra-high performance liquid chromatography-quadrupole-time of flight mass spectrometry (UHPLC-Q-TOF/MS). Further analysis of the MS spectra showed that M1, M2, and M3 are oxidation products and M6 is a dehydrogenation product, while M4 and M5 are combined oxidation and dehydrogenation products of toosendanin. Moreover, their possible structures were deduced from the MS/MS spectral features. Quantitative analysis demonstrated that M1-M5 levels rapidly increased and reached a plateau at 30 min, while M6 rapidly reached a maximal level at 20 min and then decreased slowly afterwards. These findings provide valuable data not only for understanding the metabolic fate of toosendanin in liver microsomes, but also for elucidating the possible molecular mechanism of its hepatotoxicity. PMID:24084018

  15. Feasibility of analysis of polar compounds by high performance liquid chromatography with Fourier transform infrared spectroscopic detection

    SciTech Connect

    Amateis, P.G.

    1984-01-01

    High performance liquid chromatographic separations employing on-line flow cell Fourier transform infrared spectroscopic detection were developed for polar compounds including phenols, alcohols, amines and azaarenes. Detection by FTIR gave information concerning hydrogen bonding and solvent effects occurring during the separations, in addition to structural information about eluted species to aid in identification. Both analytical-size and microbore normal phase columns were employed. Experimental considerations such as column overload, injected minimum detectable quantities, the use of analytical vs. microbore columns, and flow cell pathlength were examined. The developed HPLC-FTIR systems were applied to the analysis of several coal liquefaction samples for heteroatom content. Confirmatory and additional information concerning the samples was provided by field ionization mass spectrometry, gas chromatography/mass spectrometry and reversed phase liquid chromatography employing UV detection. An equation relating reversed phase retention times to structural parameters was developed and applied to the analysis of the coal-derived samples. Two process solvents were found to contain primarily alkyl-substituted phenols in addition to azaarenes such as pyridine and quinoline. Some non-distillable coal-derived samples were found to contain azaarenes such as alkyl quinolines. Evidence was also found concerning the presence of hydroxy-pyridine type compounds and the incorporation of process solvent molecules into the coal structure during liquefaction.

  16. Global analysis of chemical constituents in Shengmai injection using high performance liquid chromatography coupled with tandem mass spectrometry.

    PubMed

    Li, Fei; Cheng, Tao-fang; Dong, Xin; Li, Ping; Yang, Hua

    2016-01-01

    This study aimed to develop a specific and reliable method to comprehensively analyze the chemical constituents in Shengmai injection (SMI) using high performance liquid chromatography coupled with tandem mass spectrometry. The qualitative analysis of SMI was achieved on a Kromasil 100-5C18 column; a total of sixty-two compounds in SMI were unambiguously assigned or tentatively identified, and twenty-one compounds, including fourteen saponins, six lignans and one L-borneol-7-O-[β-D-apiofuranosyl (1→6)]-β-D-glucopyranoside, were quantified by HPLC-MS. Furthermore, L-borneol-7-O-[β-D-apiofuranosyl (1→6)]-β-D-glucopyranoside, originating from Radix ophiopogonis, was identified and quantified in SMI for the first time. The method validation results indicated that the methods were simple, specific and reliable. All the investigated compounds showed good linearity (r² ≥ 0.9992) over relatively wide concentration ranges, with acceptable recoveries of 90.13-109.09%. The developed methods were successfully applied to the analysis of ten batches of SMI samples. The proposed methods may provide a useful and comprehensive reference for the quality control of SMI and supporting data for its clinical application. PMID:26342447

  17. Ultra-high-performance liquid chromatography/tandem high-resolution mass spectrometry analysis of sixteen red beverages containing carminic acid: identification of degradation products by using principal component analysis/discriminant analysis.

    PubMed

    Gosetti, Fabio; Chiuminatto, Ugo; Mazzucco, Eleonora; Mastroianni, Rita; Marengo, Emilio

    2015-01-15

    The study investigates the sunlight photodegradation process of carminic acid, a natural red colourant used in beverages. For this purpose, both carminic acid aqueous standard solutions and sixteen different commercial beverages, ten containing carminic acid and six containing E120 dye, were subjected to photoirradiation. The results show different patterns of degradation, not only between the standard solutions and the beverages, but also from beverage to beverage. Due to the different beverage recipes, unpredictable reactions take place between the dye and the other ingredients. To identify the dye degradation products in a very complex scenario, a methodology was used, based on the combined use of principal component analysis with discriminant analysis and ultra-high-performance liquid chromatography coupled with tandem high resolution mass spectrometry. The methodology is unaffected by beverage composition and allows the degradation products of carminic acid dye to be identified for each beverage. PMID:25149011

  18. Ultra high performance liquid chromatography tandem mass spectrometry for rapid analysis of trace organic contaminants in water

    PubMed Central

    2013-01-01

    Background The widespread utilization of organic compounds in modern society and their dispersion through wastewater have resulted in extensive contamination of source and drinking waters. The vast majority of these compounds are not regulated in wastewater outfalls or in drinking water, while trace amounts of certain compounds can impact aquatic wildlife. Hence it is prudent to monitor these contaminants in water sources until sufficient toxicological data relevant to humans become available. A method was developed for the analysis of 36 trace organic contaminants (TOrCs) including pharmaceuticals, pesticides, steroid hormones (androgens, progestins, and glucocorticoids), personal care products and polyfluorinated compounds (PFCs) using a single solid phase extraction (SPE) technique with ultra-high performance liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS). The method was applied to a variety of water matrices to demonstrate method performance and reliability. Results UHPLC-MS/MS in both positive and negative electrospray ionization (ESI) modes was employed to achieve optimum sensitivity while reducing sample analysis time (<20 min) compared with previously published methods. The detection limits for most compounds were lower than 1.0 picogram on the column, while reporting limits in water ranged from 0.1 to 15 ng/L based on the extraction of a 1 L sample and concentration to 1 mL. Recoveries in ultrapure water for most compounds were between 90 and 110%, while recoveries in surface water and wastewater were in the ranges of 39-121% and 38-141%, respectively. The analytical method was successfully applied to samples across several different water matrices including wastewater, groundwater, surface water and drinking water at different stages of treatment. Among the compounds detected in wastewater, sucralose and TCPP showed the highest concentrations. Conclusion The proposed method is sensitive, rapid and robust; hence it can
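
    The connection between the on-column detection limit and the reporting limit in water follows from the 1000-fold enrichment (1 L of sample concentrated to 1 mL of extract). As a rough worked example using the abstract's figures and an assumed injection volume of 10 μL (the injection volume is not stated above):

    ```latex
    c_{\mathrm{water}} \;\approx\; \frac{m_{\mathrm{on\text{-}column}}}{V_{\mathrm{inj}} \times \mathrm{EF}}
      \;=\; \frac{1\ \mathrm{pg}}{10\ \mu\mathrm{L} \times 1000}
      \;=\; 0.1\ \mathrm{ng\,L^{-1}}
    ```

    which lands at the lower end of the reported 0.1-15 ng/L reporting-limit range; compounds with poorer ionization or recovery sit correspondingly higher.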

  19. Alkaloids analysis using off-line two-dimensional supercritical fluid chromatography × ultra-high performance liquid chromatography.

    PubMed

    Li, Kuiyong; Fu, Qing; Xin, Huaxia; Ke, Yanxiong; Jin, Yu; Liang, Xinmiao

    2014-07-21

    In this study, an off-line two-dimensional (2-D) supercritical fluid chromatography (SFC) × ultra-high performance liquid chromatography (UHPLC) method with high orthogonality was developed for the analysis of a practical amide alkaloid fraction from P. longum L. The effects of SFC parameters such as column type, organic modifier, temperature and back-pressure on the separation were systematically evaluated. Different selectivities were observed for the different columns (BEH, BEH 2-EP, XAmide and CSH FP). The orthogonality of the different columns and systems was then investigated following a geometric approach, using a set of amide alkaloid samples. The orthogonality between a CSH FP column and a BEH column reached 50.79%, much higher than that of the other column combinations, while the orthogonality between SFC and UHPLC based on an XAmide column and an HSS T3 column reached 69.84%, the highest of all the combinations. Finally, the practical amide alkaloid fraction was analyzed with the off-line 2-D SFC × UHPLC system; in total, at least 340 peaks were detected. Rapid separation in the two dimensions and the easy post-treatment of SFC make this 2-D system well suited to the separation of complex samples. PMID:24828698
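
    The abstract quotes orthogonality percentages without spelling out the geometric approach. A widely used surrogate for such scores is bin counting: normalize the retention times from both dimensions, place each compound on an n×n grid, and score how much of the grid the peaks occupy. The sketch below illustrates that idea only; it is not necessarily the authors' exact metric, and the example retention times are invented:

    ```python
    import numpy as np

    def bin_coverage(rt1, rt2, n_bins: int = 10) -> float:
        """Fraction of grid bins occupied by peaks from two separation dimensions.

        Values near 1/n_bins indicate strongly correlated (redundant) dimensions;
        values approaching 1 indicate a well-spread, more orthogonal 2-D system.
        """
        rt1 = np.asarray(rt1, dtype=float)
        rt2 = np.asarray(rt2, dtype=float)
        x = (rt1 - rt1.min()) / (rt1.max() - rt1.min() + 1e-12)   # normalize to [0, 1]
        y = (rt2 - rt2.min()) / (rt2.max() - rt2.min() + 1e-12)
        ix = np.minimum((x * n_bins).astype(int), n_bins - 1)
        iy = np.minimum((y * n_bins).astype(int), n_bins - 1)
        occupied = len(set(zip(ix.tolist(), iy.tolist())))
        return occupied / n_bins**2

    # Invented retention times for 20 compounds in SFC (dim 1) and UHPLC (dim 2)
    rng = np.random.default_rng(1)
    rt_sfc, rt_uhplc = rng.uniform(1, 15, 20), rng.uniform(2, 30, 20)
    print(f"bin coverage: {bin_coverage(rt_sfc, rt_uhplc):.2f}")
    ```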

  20. High performance liquid chromatography coupled to mass spectrometry for profiling and quantitative analysis of folate monoglutamates in tomato.

    PubMed

    Tyagi, Kamal; Upadhyaya, Pallawi; Sarma, Supriya; Tamboli, Vajir; Sreelakshmi, Yellamaraju; Sharma, Rameshwar

    2015-07-15

    Folates are essential micronutrients for animals, as they play a major role in one-carbon metabolism. Animals are unable to synthesize folates and obtain them from plant-derived food. In the present study, a high performance liquid chromatography coupled to tandem mass spectrometry (HPLC-MS/MS) method was developed for the high-throughput screening and quantitative analysis of folate monoglutamates in tomato fruits. For folate extraction, several parameters were optimized, including extraction conditions, pH range, amount of tri-enzyme and boiling time. After processing, the extract was purified by ultrafiltration with a 10 kDa membrane filter. The ultrafiltered extract was chromatographed on a RP Luna C18 column using a gradient elution program. The method was validated by determining linearity, sensitivity and recovery. It was successfully applied to folate estimation in spinach, capsicum, and garden pea, demonstrating that it offers a versatile approach for the accurate and fast determination of different folate monoglutamates in vegetables. PMID:25722141

  1. High Performance Proactive Digital Forensics

    NASA Astrophysics Data System (ADS)

    Alharbi, Soltan; Moa, Belaid; Weber-Jahnke, Jens; Traore, Issa

    2012-10-01

    With the increase in the number of digital crimes and in their sophistication, High Performance Computing (HPC) is becoming a must in Digital Forensics (DF). According to the FBI annual report, the size of data processed during the 2010 fiscal year reached 3,086 TB (compared to 2,334 TB in 2009), and the number of agencies that requested Regional Computer Forensics Laboratory assistance increased from 689 in 2009 to 722 in 2010. Since most investigation tools are both I/O and CPU bound, next-generation DF tools are required to be distributed and to offer HPC capabilities. The need for HPC is even more evident when investigating crimes on clouds or when proactive DF analysis and on-site investigation, requiring semi-real-time processing, are performed. Although overcoming the performance challenge is a major goal in DF, as far as we know there is almost no research on HPC-DF except for a few papers. In this work, we therefore extend our earlier work on the need for a proactive system and present a high performance automated proactive digital forensic system. The most expensive phase of the system, namely proactive analysis and detection, uses a parallel extension of the iterative z algorithm. It also implements new parallel information-based outlier detection algorithms to proactively and forensically handle suspicious activities. To analyse a large number of targets and events, and to do so continuously (to capture the dynamics of the system), we rely on a multi-resolution approach to explore the digital forensic space. A data set from the Honeynet Forensic Challenge in 2001 is used to evaluate the system from DF and HPC perspectives.
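
    The abstract names a parallel "iterative z algorithm" and information-based outlier detection without giving details. Purely as a generic illustration of the kind of data-parallel z-score screening such a phase might run over event features, here is a minimal sketch; the feature matrix, chunking scheme and threshold are assumptions, not the authors' algorithm:

    ```python
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def flag_outliers(chunk: np.ndarray, mean: np.ndarray, std: np.ndarray, thresh: float) -> np.ndarray:
        """Return indices within the chunk whose largest |z-score| exceeds the threshold."""
        z = np.abs((chunk - mean) / (std + 1e-12))
        return np.where(z.max(axis=1) > thresh)[0]

    def parallel_outlier_scan(events: np.ndarray, n_workers: int = 4, thresh: float = 3.0) -> list[int]:
        """Split event feature rows across worker processes and collect flagged global indices."""
        mean, std = events.mean(axis=0), events.std(axis=0)
        chunks = np.array_split(events, n_workers)
        offsets = np.cumsum([0] + [len(c) for c in chunks[:-1]])
        flagged: list[int] = []
        with ProcessPoolExecutor(max_workers=n_workers) as pool:
            results = pool.map(flag_outliers, chunks,
                               [mean] * n_workers, [std] * n_workers, [thresh] * n_workers)
            for offset, local_idx in zip(offsets, results):
                flagged.extend(int(offset + i) for i in local_idx)
        return flagged
    ```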

  2. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis

    PubMed Central

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E.; Tkachenko, Valery; Torcivia-Rodriguez, John; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu PMID:26989153

  3. High-performance integrated virtual environment (HIVE): a robust infrastructure for next-generation sequence data analysis.

    PubMed

    Simonyan, Vahan; Chumakov, Konstantin; Dingerdissen, Hayley; Faison, William; Goldweber, Scott; Golikov, Anton; Gulzar, Naila; Karagiannis, Konstantinos; Vinh Nguyen Lam, Phuc; Maudru, Thomas; Muravitskaja, Olesja; Osipova, Ekaterina; Pan, Yang; Pschenichnov, Alexey; Rostovtsev, Alexandre; Santana-Quintero, Luis; Smith, Krista; Thompson, Elaine E; Tkachenko, Valery; Torcivia-Rodriguez, John; Voskanian, Alin; Wan, Quan; Wang, Jing; Wu, Tsung-Jung; Wilson, Carolyn; Mazumder, Raja

    2016-01-01

    The High-performance Integrated Virtual Environment (HIVE) is a distributed storage and compute environment designed primarily to handle next-generation sequencing (NGS) data. This multicomponent cloud infrastructure provides secure web access for authorized users to deposit, retrieve, annotate and compute on NGS data, and to analyse the outcomes using web interface visual environments appropriately built in collaboration with research and regulatory scientists and other end users. Unlike many massively parallel computing environments, HIVE uses a cloud control server which virtualizes services, not processes. It is both very robust and flexible due to the abstraction layer introduced between computational requests and operating system processes. The novel paradigm of moving computations to the data, instead of moving data to computational nodes, has proven to be significantly less taxing for both hardware and network infrastructure. The honeycomb data model developed for HIVE integrates metadata into an object-oriented model. Its distinction from other object-oriented databases is in the additional implementation of a unified application program interface to search, view and manipulate data of all types. This model simplifies the introduction of new data types, thereby minimizing the need for database restructuring and streamlining the development of new integrated information systems. The honeycomb model employs a highly secure hierarchical access control and permission system, allowing determination of data access privileges in a finely granular manner without flooding the security subsystem with a multiplicity of rules. HIVE infrastructure will allow engineers and scientists to perform NGS analysis in a manner that is both efficient and secure. HIVE is actively supported in public and private domains, and project collaborations are welcomed. Database URL: https://hive.biochemistry.gwu.edu. PMID:26989153

  4. Tough high performance composite matrix

    NASA Technical Reports Server (NTRS)

    Pater, Ruth H. (Inventor); Johnston, Norman J. (Inventor)

    1994-01-01

    This invention is a semi-interpenetrating polymer network which includes a high performance thermosetting polyimide having a nadic end group acting as a crosslinking site and a high performance linear thermoplastic polyimide. Provided is an improved high temperature matrix resin which is capable of performing in the 200 to 300 C range. This resin has significantly improved toughness and microcracking resistance, excellent processability, mechanical performance, and moisture and solvent resistance.

  5. Discrimination of Wild Paris Based on Near Infrared Spectroscopy and High Performance Liquid Chromatography Combined with Multivariate Analysis

    PubMed Central

    Zhao, Yanli; Zhang, Ji; Yuan, Tianjun; Shen, Tao; Li, Wei; Yang, Shihua; Hou, Ying; Wang, Yuanzhong; Jin, Hang

    2014-01-01

    Different geographical origins and species of Paris obtained from southwestern China were discriminated by near infrared (NIR) spectroscopy and high performance liquid chromatography (HPLC) combined with multivariate analysis. The NIR parameter settings were 64 scans, a resolution of 4 cm⁻¹, a scanning range of 10000-4000 cm⁻¹ and three parallel collections. The NIR spectra were optimized with the TQ 8.6 software, and the ranges 7455-6852 cm⁻¹ and 5973-4007 cm⁻¹ were selected according to the spectral standard deviation. The contents of polyphyllin I, polyphyllin II, polyphyllin VI, and polyphyllin VII and total steroid saponins were determined by HPLC. The chemical content data matrix and the spectral data matrix were integrated and analyzed by partial least squares discriminant analysis (PLS-DA). In the PLS-DA model of the NIR spectra, Paris samples were separated into three groups according to geographical origin; the cumulative R²X and Q²Y accounted for 99.50% and 94.03% of the total variance, respectively. The PLS-DA model based on 12 species of Paris described 99.62% of the variation in X and predicted 95.23% in Y. The contents of the chemical components quantitatively described the differences among the collections. The PLS-DA models showed that geographical origin had a much greater influence on Paris than species. NIR and HPLC combined with multivariate analysis could thus discriminate different geographical origins and different species, and the quality of Paris showed regional dependence. PMID:24558477
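
    PLS-DA as used above can be reproduced in outline with standard tooling: one-hot encode the class labels (geographical origin or species), fit a PLS regression of the spectra onto those indicators, and classify by the largest predicted indicator. The sketch below is a generic illustration with placeholder spectra and hypothetical class labels, not the authors' TQ-based workflow:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.preprocessing import LabelBinarizer

    def fit_plsda(X: np.ndarray, labels, n_components: int = 3):
        """Fit a PLS-DA model: PLS regression of spectra X onto one-hot class indicators."""
        lb = LabelBinarizer()
        Y = lb.fit_transform(labels)                     # one column per class
        return PLSRegression(n_components=n_components).fit(X, Y), lb

    def predict_plsda(pls: PLSRegression, lb: LabelBinarizer, X_new: np.ndarray):
        """Assign each sample to the class with the highest predicted indicator value."""
        scores = pls.predict(X_new)
        if scores.shape[1] == 1:                         # two-class case: single indicator column
            return lb.inverse_transform((scores > 0.5).astype(int))
        return lb.classes_[np.argmax(scores, axis=1)]

    # Placeholder data: 30 NIR spectra of 200 points each, three hypothetical origins
    rng = np.random.default_rng(0)
    X = rng.normal(size=(30, 200))
    y = np.repeat(["origin_A", "origin_B", "origin_C"], 10)
    model, binarizer = fit_plsda(X, y, n_components=3)
    print(predict_plsda(model, binarizer, X[:5]))
    ```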

  6. High Voltage SPT Performance

    NASA Technical Reports Server (NTRS)

    Manzella, David; Jacobson, David; Jankovsky, Robert

    2001-01-01

    A 2.3 kW stationary plasma thruster designed to operate at high voltage was tested at discharge voltages between 300 and 1250 V. Discharge specific impulses between 1600 and 3700 sec were demonstrated, with thrust between 40 and 145 mN. Test data indicated that the discharge voltage can be optimized for maximum discharge efficiency; the optimum discharge voltage was between 500 and 700 V for the various anode mass flow rates considered. The effect of operating voltage on the optimal magnetic field strength was investigated, and the effect of cathode flow rate on thruster efficiency was considered for an 800 V discharge.

  7. High performance steam development

    SciTech Connect

    Duffy, T.; Schneider, P.

    1995-10-01

    Over 30 years ago U.S. industry introduced the world's highest temperature (1200 °F at 5000 psig) and most efficient power plant, the Eddystone coal-burning steam plant. The highest alloy material used in the plant was 316 stainless steel. Problems during the first few years of operation caused a reduction in operating temperature to 1100 °F, which has generally become the highest temperature used in plants around the world. Leadership in high temperature steam has moved to Japan and Europe over the last 30 years.

  8. High Performance Pulse Tube Cryocoolers

    NASA Astrophysics Data System (ADS)

    Olson, J. R.; Roth, E.; Champagne, P.; Evtimov, B.; Nast, T. C.

    2008-03-01

    Lockheed Martin's Advanced Technology Center has been developing pulse tube cryocoolers for more than ten years. Recent innovations include the successful testing of four-stage coldheads, no-load temperatures below 4 K, and the development of a high-efficiency compressor. This paper discusses the predicted performance of single- and multiple-stage pulse tube coldheads driven by our new 6 kg "M5Midi" compressor, which achieves 90% efficiency at 200 W input power and has a maximum input power of 1000 W. This compressor retains the simplicity of earlier LM-ATC compressors: it has a moving magnet and an external electrical coil, minimizing organics in the working gas and requiring no electrical penetrations through the pressure wall. Motor losses were minimized during design, resulting in a simple, easily manufactured compressor with state-of-the-art motor efficiency. The predicted cryocooler performance is presented as simple formulae, allowing an engineer to include the impact of a highly optimized cryocooler in a full system analysis. Performance is given as a function of the heat rejection temperature, the cold tip temperatures and the cooling loads.
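
    The paper's own fit formulae are not reproduced in this abstract, but the generic form such system-level expressions take is an input-power estimate built from the cooling load and the temperature span, scaled by the cooler's efficiency relative to Carnot (the relation below is standard thermodynamics, not Lockheed Martin's correlation):

    ```latex
    P_{\mathrm{in}} \;\approx\; \frac{\dot{Q}_c}{\eta_{\mathrm{rel}}}\cdot\frac{T_{\mathrm{rej}} - T_c}{T_c}
    ```

    where the heat lift at cold-tip temperature T_c is divided by the Carnot coefficient of performance for the span up to the heat rejection temperature T_rej, and η_rel is the cooler's fraction-of-Carnot efficiency.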

  9. A Comparative Analysis of Social Media Usage and Academic Performance in Public and Private Senior High Schools

    ERIC Educational Resources Information Center

    Mingle, Jeffrey; Adams, Musah; Adjei, E. A.

    2016-01-01

    The study comparatively analyzed social media usage and academic performance in public and private senior high schools. The effect of social media on academic performance has been a much debated topic. This study further explores the relationship between private and public schools with regard to social media use and…

  10. High Performance Parallel Computational Nanotechnology

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    At a recent press conference, NASA Administrator Dan Goldin encouraged NASA Ames Research Center to take a lead role in promoting research and development of advanced, high-performance computer technology, including nanotechnology. Manufacturers of leading-edge microprocessors currently perform large-scale simulations in the design and verification of semiconductor devices and microprocessors. Recently, the need for this intensive simulation and modeling analysis has greatly increased, due in part to the ever-increasing complexity of these devices, as well as the lessons of experiences such as the Pentium fiasco. Simulation, modeling, testing, and validation will be even more important for designing molecular computers because of the complex specification of millions of atoms and thousands of assembly steps, as well as the simulation and modeling needed to ensure reliable, robust and efficient fabrication of the molecular devices. The software for this capacity does not exist today, but it can be extrapolated from the software currently used in molecular modeling for other applications: semi-empirical methods, ab initio methods, self-consistent field methods, Hartree-Fock methods, molecular mechanics, and simulation methods for diamondoid structures. Inasmuch as it seems clear that the application of such methods in nanotechnology will require highly powerful systems, this talk will discuss techniques and issues for performing these types of computations on parallel systems. We will describe system design issues (memory, I/O, mass storage, operating system requirements, special user interface issues, interconnects, bandwidths, and programming languages) involved in parallel methods for scalable classical, semiclassical, quantum, molecular mechanics, and continuum models; molecular nanotechnology computer-aided design (NanoCAD) techniques; visualization using virtual reality techniques of structural models and assembly sequences; software required to