Sample records for current analytical capabilities

  1. Actionable data analytics in oncology: are we there yet?

    PubMed

    Barkley, Ronald; Greenapple, Rhonda; Whang, John

    2014-03-01

    To operate under a new value-based paradigm, oncology providers must develop the capability to aggregate, analyze, measure, and report their value proposition--that is, their outcomes and associated costs. How are oncology providers positioned currently to perform these functions in a manner that is actionable? What is the current state of analytic capabilities in oncology? Are oncology providers prepared? This line of inquiry was the basis for the 2013 Cancer Center Business Summit annual industry research survey. This article reports on the key findings and implications of the 2013 research survey with regard to data analytic capabilities in the oncology sector. The essential finding from the study is that only a small number of oncology providers (7%) currently possess the analytic tools and capabilities necessary to satisfy internal and external demands for aggregating and reporting clinical outcome and economic data. However, there is an expectation that a majority of oncology providers (60%) will have developed such capabilities within the next 2 years.

  2. Bringing Business Intelligence to Health Information Technology Curriculum

    ERIC Educational Resources Information Center

    Zheng, Guangzhi; Zhang, Chi; Li, Lei

    2015-01-01

    Business intelligence (BI) and healthcare analytics are emerging technologies that provide the analytical capability to help the healthcare industry improve service quality, reduce costs, and manage risks. However, such a component on analytical healthcare data processing is largely missing from current healthcare information technology (HIT) or health…

  3. An investigation of the feasibility of improving oculometer data analysis through application of advanced statistical techniques

    NASA Technical Reports Server (NTRS)

    Rana, D. S.

    1980-01-01

    The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.

  4. Multidisciplinary Analysis and Optimal Design: As Easy as it Sounds?

    NASA Technical Reports Server (NTRS)

    Moore, Greg; Chainyk, Mike; Schiermeier, John

    2004-01-01

    The viewgraph presentation examines optimal design for precision, large aperture structures. Discussion focuses on aspects of design optimization, code architecture and current capabilities, and planned activities and collaborative area suggestions. The discussion of design optimization examines design sensitivity analysis; practical considerations; and new analytical environments including finite element-based capability for high-fidelity multidisciplinary analysis, design sensitivity, and optimization. The discussion of code architecture and current capabilities includes basic thermal and structural elements, nonlinear heat transfer solutions and process, and optical modes generation.

  5. Assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry

    USGS Publications Warehouse

    Taylor, Howard E.; Garbarino, John R.

    1988-01-01

    A thorough assessment of the analytical capabilities of inductively coupled plasma-mass spectrometry was conducted for selected analytes of importance in water quality applications and hydrologic research. A multielement calibration curve technique was designed to produce accurate and precise results in analysis times of approximately one minute. The suite of elements included Al, As, B, Ba, Be, Cd, Co, Cr, Cu, Hg, Li, Mn, Mo, Ni, Pb, Se, Sr, V, and Zn. The effects of sample matrix composition on the accuracy of the determinations showed that matrix elements (such as Na, Ca, Mg, and K) that may be present in natural water samples at concentration levels greater than 50 mg/L resulted in as much as a 10% suppression in ion current for analyte elements. Operational detection limits are presented.
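
The multielement calibration-curve technique described in the record above can be illustrated with a short sketch. All numbers below are invented for illustration, not taken from the USGS study; the sketch simply fits a least-squares calibration line, inverts it to report concentrations, and shows how a ~10% matrix-induced suppression of the ion current biases the result low by roughly the same fraction.

```python
import numpy as np

# Hypothetical single-element calibration (numbers invented, not from
# the USGS study): known standard concentrations (ug/L) and the ion
# currents (counts/s) measured for them.
std_conc = np.array([0.0, 10.0, 50.0, 100.0, 200.0])
std_counts = np.array([120.0, 5120.0, 25120.0, 50120.0, 100120.0])

# Least-squares linear calibration: counts = slope * conc + intercept.
slope, intercept = np.polyfit(std_conc, std_counts, 1)

def counts_to_conc(counts):
    """Invert the calibration curve to report a concentration."""
    return (counts - intercept) / slope

# A concentrated matrix (Na, Ca, Mg, K above ~50 mg/L) can suppress
# the analyte ion current by up to ~10%, biasing the reported
# concentration low by roughly the same fraction.
true_counts = 25120.0
suppressed_counts = 0.90 * true_counts
print(counts_to_conc(true_counts))       # ~50 ug/L
print(counts_to_conc(suppressed_counts)) # ~45 ug/L, a ~10% low bias
```

In a multielement method each analyte in the suite would carry its own slope and intercept, fitted as part of the same roughly one-minute analysis pass.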

  6. Modern Instrumental Methods in Forensic Toxicology*

    PubMed Central

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  7. Pressurization of cryogens - A review of current technology and its applicability to low-gravity conditions

    NASA Technical Reports Server (NTRS)

    Van Dresar, N. T.

    1992-01-01

    A review of technology, history, and current status for pressurized expulsion of cryogenic tankage is presented. Use of tank pressurization to expel cryogenic fluid will continue to be studied for future spacecraft applications over a range of operating conditions in the low-gravity environment. The review examines experimental test results and analytical model development for quiescent and agitated conditions in normal-gravity followed by a discussion of pressurization and expulsion in low-gravity. Validated, 1-D, finite difference codes exist for the prediction of pressurant mass requirements within the range of quiescent normal-gravity test data. To date, the effects of liquid sloshing have been characterized by tests in normal-gravity, but analytical models capable of predicting pressurant gas requirements remain unavailable. Efforts to develop multidimensional modeling capabilities in both normal and low-gravity have recently occurred. Low-gravity cryogenic fluid transfer experiments are needed to obtain low-gravity pressurized expulsion data. This data is required to guide analytical model development and to verify code performance.

  8. Pressurization of cryogens: A review of current technology and its applicability to low-gravity conditions

    NASA Technical Reports Server (NTRS)

    Vandresar, N. T.

    1992-01-01

    A review of technology, history, and current status for pressurized expulsion of cryogenic tankage is presented. Use of tank pressurization to expel cryogenic fluids will continue to be studied for future spacecraft applications over a range of operating conditions in the low-gravity environment. The review examines experimental test results and analytical model development for quiescent and agitated conditions in normal-gravity, followed by a discussion of pressurization and expulsion in low-gravity. Validated, 1-D, finite difference codes exist for the prediction of pressurant mass requirements within the range of quiescent normal-gravity test data. To date, the effects of liquid sloshing have been characterized by tests in normal-gravity, but analytical models capable of predicting pressurant gas requirements remain unavailable. Efforts to develop multidimensional modeling capabilities in both normal and low-gravity have recently occurred. Low-gravity cryogenic fluid transfer experiments are needed to obtain low-gravity pressurized expulsion data. This data is required to guide analytical model development and to verify code performance.

  9. Analytical and physical modeling program for the NASA Lewis Research Center's Altitude Wind Tunnel (AWT)

    NASA Technical Reports Server (NTRS)

    Abbott, J. M.; Deidrich, J. H.; Groeneweg, J. F.; Povinelli, L. A.; Reid, L.; Reinmann, J. J.; Szuch, J. R.

    1985-01-01

    An effort is currently underway at the NASA Lewis Research Center to rehabilitate and extend the capabilities of the Altitude Wind Tunnel (AWT). This extended capability will include a maximum test section Mach number of about 0.9 at an altitude of 55,000 ft and a -20 F stagnation temperature (octagonal test section, 20 ft across the flats). In addition, the AWT will include an icing and acoustic research capability. In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide essential input to the AWT final design process. This paper describes the modeling program, including the rationale and criteria used in program definition, and presents some early program results.

  10. Thermal Effects Modeling Developed for Smart Structures

    NASA Technical Reports Server (NTRS)

    Lee, Ho-Jun

    1998-01-01

    Applying smart materials in aeropropulsion systems may improve the performance of aircraft engines through a variety of vibration, noise, and shape-control applications. To facilitate the experimental characterization of these smart structures, researchers have been focusing on developing analytical models to account for the coupled mechanical, electrical, and thermal response of these materials. One focus of current research efforts has been directed toward incorporating a comprehensive thermal analysis modeling capability. Typically, temperature affects the behavior of smart materials by three distinct mechanisms: (1) induction of thermal strains because of coefficient of thermal expansion mismatch; (2) pyroelectric effects on the piezoelectric elements; and (3) temperature-dependent changes in material properties. Previous analytical models only investigated the first two thermal effects mechanisms. However, since the material properties of piezoelectric materials generally vary greatly with temperature (see the graph), incorporating temperature-dependent material properties will significantly affect the structural deflections, sensory voltages, and stresses. Thus, the current analytical model captures thermal effects arising from all three mechanisms through thermopiezoelectric constitutive equations. These constitutive equations were incorporated into a layerwise laminate theory with the inherent capability to model both the active and sensory response of smart structures in thermal environments. Corresponding finite element equations were formulated and implemented for both the beam and plate elements to provide a comprehensive thermal effects modeling capability.
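
The first of the three thermal mechanisms in the record above lends itself to a one-line worked example. The CTE values and temperature change below are assumed round numbers for illustration, not values from the NASA model:

```python
# Worked example of thermal mechanism (1): strain from a coefficient of
# thermal expansion (CTE) mismatch. All values are assumed, illustrative
# round numbers, not material data from the NASA study.
alpha_substrate = 23.0e-6  # 1/K, CTE of a hypothetical aluminum substrate
alpha_piezo = 4.0e-6       # 1/K, assumed CTE of the piezoelectric layer
delta_T = 80.0             # K, temperature rise above the bonding temperature

# The bonded layer must accommodate the differential expansion:
mismatch_strain = (alpha_substrate - alpha_piezo) * delta_T
print(mismatch_strain)  # ~1.52e-3, i.e. about 0.15% strain
```

Mechanisms (2) and (3) would add a pyroelectric charge term and temperature-dependent coefficients to the constitutive equations, which is precisely what the thermopiezoelectric formulation above is designed to capture.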

  11. On the current drive capability of low dimensional semiconductors: 1D versus 2D

    DOE PAGES

    Zhu, Y.; Appenzeller, J.

    2015-10-29

    Low-dimensional electronic systems are at the heart of many scaling approaches currently pursued for electronic applications. Here, we present a comparative study between an array of one-dimensional (1D) channels and its two-dimensional (2D) counterpart in terms of current drive capability. Our findings, based on analytical expressions derived in this article, reveal that under certain conditions an array of 1D channels can outperform a 2D field-effect transistor because of the added degree of freedom to adjust the threshold voltage in an array of 1D devices.

  12. Curved Thermopiezoelectric Shell Structures Modeled by Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Lee, Ho-Jun

    2000-01-01

    "Smart" structures composed of piezoelectric materials may significantly improve the performance of aeropropulsion systems through a variety of vibration, noise, and shape-control applications. The development of analytical models for piezoelectric smart structures is an ongoing, in-house activity at the NASA Glenn Research Center at Lewis Field focused toward the experimental characterization of these materials. Research efforts have been directed toward developing analytical models that account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. Current work revolves around implementing thermal effects into a curvilinear-shell finite element code. This enhances capabilities to analyze curved structures and to account for coupling effects arising from thermal effects and the curved geometry. The current analytical model implements a unique mixed multi-field laminate theory to improve computational efficiency without sacrificing accuracy. The mechanics can model both the sensory and active behavior of piezoelectric composite shell structures. Finite element equations are being implemented for an eight-node curvilinear shell element, and numerical studies are being conducted to demonstrate capabilities to model the response of curved piezoelectric composite structures (see the figure).

  13. Composable Analytic Systems for next-generation intelligence analysis

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Llinas, James; Barry, Kevin

    2015-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks, and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex operational environments. Our interactions with analysts illuminate the need to improve information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven approach that increases flexibility and the capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address current and future intelligence analysis needs as US forces engage threats in contested and denied environments.

  14. Nuclear Forensics and Attribution: A National Laboratory Perspective

    NASA Astrophysics Data System (ADS)

    Hall, Howard L.

    2008-04-01

    Current capabilities in technical nuclear forensics - the extraction of information from nuclear and/or radiological materials to support the attribution of a nuclear incident to material sources, transit routes, and ultimately perpetrator identity - derive largely from three sources: nuclear weapons testing and surveillance programs of the Cold War, advances in analytical chemistry and materials characterization techniques, and abilities to perform ``conventional'' forensics (e.g., fingerprints) on radiologically contaminated items. Leveraging that scientific infrastructure has provided a baseline capability to the nation, but we are only beginning to explore the scientific challenges that stand between today's capabilities and tomorrow's requirements. These scientific challenges include radically rethinking radioanalytical chemistry approaches, developing rapidly deployable sampling and analysis systems for field applications, and improving analytical instrumentation. Coupled with the ability to measure a signature faster or more exquisitely, we must also develop the ability to interpret those signatures for meaning. This requires understanding of the physics and chemistry of nuclear materials processes well beyond our current level - especially since we are unlikely to ever have direct access to all potential sources of nuclear threat materials.

  15. A Review of Biological Agent Sampling Methods and ...

    EPA Pesticide Factsheets

    This study was conducted to evaluate current sampling and analytical capabilities, from a time and resource perspective, for a large-scale biological contamination incident. The analysis will be useful for strategically directing future research investment.

  16. Range Scheduling Aid (RSA)

    NASA Technical Reports Server (NTRS)

    Logan, J. R.; Pulvermacher, M. K.

    1991-01-01

    Range Scheduling Aid (RSA) is presented in the form of the viewgraphs. The following subject areas are covered: satellite control network; current and new approaches to range scheduling; MITRE tasking; RSA features; RSA display; constraint based analytic capability; RSA architecture; and RSA benefits.

  17. Creating Synthetic Coronal Observational Data From MHD Models: The Forward Technique

    NASA Technical Reports Server (NTRS)

    Rachmeler, Laurel A.; Gibson, Sarah E.; Dove, James; Kucera, Therese Ann

    2010-01-01

    We present a generalized forward code for creating simulated coronal observables off the limb from numerical and analytical MHD models. This generalized forward model is capable of creating emission maps in various wavelengths for instruments such as SXT, EIT, EIS, and coronagraphs, as well as spectropolarimetric images and line profiles. The inputs to our code can be analytic models (of which four come with the code) or 2.5D and 3D numerical datacubes. We present some examples of the observable data created with our code as well as its functional capabilities. This code is currently available for beta-testing (contact authors), with the ultimate goal of release as a SolarSoft package.

  18. Dielectrophoretic label-free immunoassay for rare-analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Velmanickam, Logeeshan; Laudenbach, Darrin; Nawarathna, Dharmakeerthi

    2016-10-01

    The current gold standard for detecting or quantifying target analytes from blood samples is the ELISA (enzyme-linked immunosorbent assay). The detection limit of ELISA is about 250 pg/ml. However, quantifying analytes that are related to various stages of tumors, including early detection, requires detecting well below the current limit of the ELISA test. For example, interleukin 6 (IL-6) levels of early oral cancer patients are <100 pg/ml, and the prostate-specific antigen level in the early stage of prostate cancer is about 1 ng/ml. Further, it has been reported that there are significantly less than 1 pg/ml of analytes in the early stage of tumors. Therefore, depending on the tumor type and the stage of the tumor, it is necessary to quantify various levels of analytes ranging from ng/ml to pg/ml. To accommodate these critical needs in current diagnosis, there is a need for a technique that has a large dynamic range with an ability to detect extremely low levels of target analytes (

  19. An Assessment of Current Fan Noise Prediction Capability

    NASA Technical Reports Server (NTRS)

    Envia, Edmane; Woodward, Richard P.; Elliott, David M.; Fite, E. Brian; Hughes, Christopher E.; Podboy, Gary G.; Sutliff, Daniel L.

    2008-01-01

    In this paper, the results of an extensive assessment exercise carried out to establish the current state of the art for predicting fan noise at NASA are presented. Representative codes in the empirical, analytical, and computational categories were exercised and assessed against a set of benchmark acoustic data obtained from wind tunnel tests of three model scale fans. The chosen codes were ANOPP, representing an empirical capability, RSI, representing an analytical capability, and LINFLUX, representing a computational aeroacoustics capability. The selected benchmark fans cover a wide range of fan pressure ratios and fan tip speeds, and are representative of modern turbofan engine designs. The assessment results indicate that the ANOPP code can predict fan noise spectrum to within 4 dB of the measurement uncertainty band on a third-octave basis for the low and moderate tip speed fans except at extreme aft emission angles. The RSI code can predict fan broadband noise spectrum to within 1.5 dB of the experimental uncertainty band provided the rotor-only contribution is taken into account. The LINFLUX code can predict interaction tone power levels to within experimental uncertainties at low and moderate fan tip speeds, but could deviate by as much as 6.5 dB outside the experimental uncertainty band at the highest tip speeds in some cases.
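
The pass/fail criterion used in assessments like the one above can be sketched in a few lines. The spectra and the ±1 dB uncertainty half-width below are invented for illustration; the function just measures how far a prediction falls outside the measurement uncertainty band:

```python
# Sketch of the assessment criterion: how far does a predicted
# 1/3-octave spectrum stray outside the measurement uncertainty band?
# All levels and the band half-width are invented for illustration.
measured = [92.0, 95.0, 97.5, 96.0]   # dB, measured band levels
predicted = [93.5, 99.0, 97.0, 94.0]  # dB, code predictions
half_width = 1.0                      # dB, +/- uncertainty band

def excess_outside_band(pred, meas, hw):
    """dB by which a prediction exceeds the uncertainty band (0 if inside)."""
    return max(0.0, abs(pred - meas) - hw)

deviations = [excess_outside_band(p, m, half_width)
              for p, m in zip(predicted, measured)]
print(deviations)       # [0.5, 3.0, 0.0, 1.0]
print(max(deviations))  # worst-case excursion: 3.0 dB
```

Statements such as "within 4 dB of the measurement uncertainty band" then amount to a bound on the worst-case excursion across all bands and emission angles.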

  20. Current Status of Mycotoxin Analysis: A Critical Review.

    PubMed

    Shephard, Gordon S

    2016-07-01

    It is over 50 years since the discovery of aflatoxins focused the attention of food safety specialists on fungal toxins in the feed and food supply. Since then, analysis of this important group of natural contaminants has advanced in parallel with general developments in analytical science, and current MS methods are capable of simultaneously analyzing hundreds of compounds, including mycotoxins, pesticides, and drugs. This profusion of data may advance our understanding of human exposure, yet constitutes an interpretive challenge to toxicologists and food safety regulators. Despite these advances in analytical science, the basic problem of the extreme heterogeneity of mycotoxin contamination, although now well understood, cannot be circumvented. The real health challenges posed by mycotoxin exposure occur in the developing world, especially among small-scale and subsistence farmers. Addressing these problems requires innovative approaches in which analytical science must also play a role in providing suitable out-of-laboratory analytical techniques.

  1. Simulation study of a new inverse-pinch high Coulomb transfer switch

    NASA Technical Reports Server (NTRS)

    Choi, S. H.

    1984-01-01

    A simulation study of a simplified model of a high-coulomb transfer switch is performed. The switch operates in an inverse-pinch geometry formed by an all-metal chamber, which greatly reduces hot-spot formation on the electrode surfaces. Advantages of the switch over conventional switches are longer useful life, higher current capability, and lower inductance, which improve the characteristics required for a high-repetition-rate switch. The simulation determines the design parameters by analytical computation and by comparison with the experimentally measured risetime, current-handling capability, electrode damage, and hold-off voltages. The parameters of an initial switch design can be determined for the anticipated switch performance. Results are in agreement with the experimental results. Although the model is simplified, switch characteristics such as risetime, current-handling capability, electrode damage, and hold-off voltages are accurately determined.

  2. SmartR: an open-source platform for interactive visual analytics for translational research data

    PubMed Central

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-01-01

    Summary: In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical, or OMICS data, combined with strong visual analytical capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js and AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. Availability and Implementation: The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28334291

  3. SmartR: an open-source platform for interactive visual analytics for translational research data.

    PubMed

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical, or OMICS data, combined with strong visual analytical capabilities will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js and AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. Contact: reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  4. LSP 156, Low Power Embedded Analytics: FY15 Line Supported Information, Computation, and Exploitation Program

    DTIC Science & Technology

    2015-12-04

    from back-office big-data analytics to fieldable hot-spot systems providing storage-processing-communication services for off-grid sensors. Speed...and power efficiency are the key metrics. Current state-of-the-art approaches for big-data aim toward scaling out to many computers to meet...pursued within Lincoln Laboratory as well as external sponsors. Our vision is to bring new capabilities in big-data and internet-of-things applications

  5. Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to Defence Transformation

    DTIC Science & Technology

    2005-04-01

    Analytical Support Capabilities of Turkish General Staff Scientific...the end failed to achieve anything commensurate with the effort. The analytical support capabilities of the Turkish Scientific Decision Support Center to...percent of the...İpekkan, Z.; Özkil, A. (2005) Analytical Support Capabilities of Turkish General Staff Scientific Decision Support Centre (SDSC) to

  6. Using health information technology to manage a patient population in accountable care organizations.

    PubMed

    Wu, Frances M; Rundall, Thomas G; Shortell, Stephen M; Bloom, Joan R

    2016-06-20

    Purpose - The purpose of this paper is to describe the current landscape of health information technology (HIT) in early accountable care organizations (ACOs), the different strategies ACOs are using to develop HIT-based capabilities, and how ACOs are using these capabilities within their care management processes to advance health outcomes for their patient population. Design/methodology/approach - Mixed methods study pairing data from a cross-sectional National Survey of ACOs with in-depth, semi-structured interviews with leaders from 11 ACOs (both completed in 2013). Findings - Early ACOs vary widely in their electronic health record, data integration, and analytic capabilities. The most common HIT capability was drug-drug and drug-allergy interaction checks, with 53.2 percent of respondents reporting that the ACO possessed the capability to a high degree. Outpatient and inpatient data integration was the least common HIT capability (8.1 percent). In the interviews, ACO leaders commented on different HIT development strategies to gain a more comprehensive picture of patient needs and service utilization. ACOs realize the necessity for robust data analytics, and are exploring a variety of approaches to achieve it. Research limitations/implications - Data are self-reported. The qualitative portion was based on interviews with 11 ACOs, limiting generalizability to the universe of ACOs but allowing for a range of responses. Practical implications - ACOs are challenged with the development of sophisticated HIT infrastructure. They may benefit from targeted assistance and incentives to implement health information exchanges with other providers to promote more coordinated care management for their patient population. Originality/value - Using new empirical data, this study increases understanding of the extent of ACOs' current and developing HIT capabilities to support ongoing care management.

  7. Strategic analytics: towards fully embedding evidence in healthcare decision-making.

    PubMed

    Garay, Jason; Cartagena, Rosario; Esensoy, Ali Vahit; Handa, Kiren; Kane, Eli; Kaw, Neal; Sadat, Somayeh

    2015-01-01

    Cancer Care Ontario (CCO) has implemented multiple information technology solutions and collected health-system data to support its programs. There is now an opportunity to leverage these data and perform advanced end-to-end analytics that inform decisions around improving health-system performance. In 2014, CCO engaged in an extensive assessment of its current data capacity and capability, with the intent to drive increased use of data for evidence-based decision-making. The breadth and volume of data at CCO uniquely places the organization to contribute to not only system-wide operational reporting, but more advanced modelling of current and future state system management and planning. In 2012, CCO established a strategic analytics practice to assist the agency's programs contextualize and inform key business decisions and to provide support through innovative predictive analytics solutions. This paper describes the organizational structure, services and supporting operations that have enabled progress to date, and discusses the next steps towards the vision of embedding evidence fully into healthcare decision-making. Copyright © 2014 Longwoods Publishing.

  8. Analytical challenges: bridging the gap from regulation to enforcement.

    PubMed

    Van den Eede, Guy; Kay, Simon; Anklam, Elke; Schimmel, Heinz

    2002-01-01

    An overview is presented of the analytical steps that may be needed to determine the presence of genetically modified organisms (GMOs) or for analysis of GMO-derived produce. The analytical aspects necessary for compliance with labeling regulations are discussed along with bottlenecks that may develop when a plant product or a food sample is analyzed for conformity with current European Union GMO legislation. In addition to sampling and testing, other topics deal with complications that arise from biological and agricultural realities that may influence testing capabilities. The issues presented are intended to serve as elements to examine the different challenges that enforcement laboratories might face.

  9. Light Water Reactor Sustainability Program Status Report on the Grizzly Code Enhancements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novascone, Stephen R.; Spencer, Benjamin W.; Hales, Jason D.

    2013-09-01

    This report summarizes work conducted during fiscal year 2013 toward adding a full capability to evaluate fracture contour J-integrals to the Grizzly code. This is a progress report on ongoing work. During the next fiscal year, this capability will be completed, and Grizzly will be capable of evaluating these contour integrals for 3D geometry, including the effects of thermal stress and large deformation. A usable, limited capability has been developed, which is capable of evaluating these integrals on 2D geometry, without considering the effects of material nonlinearity, thermal stress, or large deformation. This report presents an overview of the approach used, along with a demonstration of the current capability in Grizzly, including a comparison with an analytical solution.

  10. Review of spectral imaging technology in biomedical engineering: achievements and challenges.

    PubMed

    Li, Qingli; He, Xiaofu; Wang, Yiting; Liu, Hongying; Xu, Dongrong; Guo, Fangmin

    2013-10-01

    Spectral imaging is a technology that integrates conventional imaging and spectroscopy to get both spatial and spectral information from an object. Although this technology was originally developed for remote sensing, it has been extended to the biomedical engineering field as a powerful analytical tool for biological and biomedical research. This review introduces the basics of spectral imaging, imaging methods, current equipment, and recent advances in biomedical applications. The performance and analytical capabilities of spectral imaging systems for biological and biomedical imaging are discussed. In particular, the current achievements and limitations of this technology in biomedical engineering are presented. The benefits and development trends of biomedical spectral imaging are highlighted to provide the reader with an insight into the current technological advances and its potential for biomedical research.

  11. Graph Analytics for Signature Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Johnson, John R.; Halappanavar, Mahantesh

    2013-06-01

    Within large amounts of seemingly unstructured data it can be difficult to find signatures of events. In our work we transform unstructured data into a graph representation. By doing this we expose underlying structure in the data and can take advantage of existing graph analytics capabilities, as well as develop new capabilities. Currently we focus on applications in the cybersecurity and communication domains. Within cybersecurity we aim to find signatures for perpetrators using the pass-the-hash attack, and in communications we look for emails or phone calls going up or down a chain of command. In both of these areas, and in many others, the signature we look for is a path with certain temporal properties. In this paper we discuss our methodology for finding these temporal paths within large graphs.
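A minimal sketch of this kind of temporal-path search (the representation and names here are illustrative, not the paper's actual implementation): given directed edges tagged with timestamps, enumerate paths whose timestamps strictly increase, which is the ordering an email or call moving up a chain of command must respect.

```python
from collections import defaultdict

def temporal_paths(edges, src, dst, max_len=6):
    """Enumerate src -> dst paths whose edge timestamps strictly increase.
    edges: iterable of (u, v, t) triples. Returns each path as a list of
    (node, timestamp) pairs; the source node carries timestamp None."""
    adj = defaultdict(list)
    for u, v, t in edges:
        adj[u].append((v, t))

    results = []

    def dfs(node, t_prev, path):
        if node == dst and len(path) > 1:
            results.append(list(path))
        if len(path) > max_len:  # cap depth to keep enumeration tractable
            return
        for nxt, t in adj[node]:
            if t > t_prev:  # only follow time-respecting edges
                path.append((nxt, t))
                dfs(nxt, t, path)
                path.pop()

    dfs(src, float("-inf"), [(src, None)])
    return results
```

For example, with edges [("a","b",1), ("b","c",2), ("b","c",0), ("a","c",4)], only a→b→c (times 1, 2) and a→c (time 4) are time-respecting; the b→c edge at time 0 cannot follow the a→b edge at time 1.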

  12. Update on SLD Engineering Tools Development

    NASA Technical Reports Server (NTRS)

    Miller, Dean R.; Potapczuk, Mark G.; Bond, Thomas H.

    2004-01-01

    The airworthiness authorities (FAA, JAA, Transport Canada) will be releasing a draft rule in the 2006 timeframe concerning the operation of aircraft in a Supercooled Large Droplet (SLD) environment aloft. The draft rule will require aircraft manufacturers to demonstrate that their aircraft can operate safely in an SLD environment for a period of time sufficient to facilitate a safe exit from the condition. It is anticipated that aircraft manufacturers will need the capability to demonstrate compliance with this rule via experimental means (icing tunnels or tankers) and by analytical means (ice prediction codes). Since existing icing research facilities and analytical codes were not developed to account for SLD conditions, current engineering tools are not adequate to support compliance activities in SLD conditions, and existing capabilities need to be augmented to include them. In response to this need, NASA and its partners conceived a strategy, or Roadmap, for developing experimental and analytical SLD simulation tools. Following review and refinement by the airworthiness authorities and other international research partners, this technical strategy has been crystallized into a project plan to guide the SLD Engineering Tool Development effort. This paper provides a brief overview of the latest version of the project plan and its technical rationale, and reports the status of selected SLD Engineering Tool Development research tasks currently underway.

  13. A review of selected inorganic surface water quality-monitoring practices: are we really measuring what we think, and if so, are we doing it right?

    USGS Publications Warehouse

    Horowitz, Arthur J.

    2013-01-01

    Successful environmental/water quality-monitoring programs usually require a balance between analytical capabilities, the collection and preservation of representative samples, and available financial/personnel resources. Due to current economic conditions, monitoring programs are under increasing pressure to do more with less. Hence, a review of current sampling and analytical methodologies, and some of the underlying assumptions that form the bases for these programs seems appropriate, to see if they are achieving their intended objectives within acceptable error limits and/or measurement uncertainty, in a cost-effective manner. That evaluation appears to indicate that several common sampling/processing/analytical procedures (e.g., dip (point) samples/measurements, nitrogen determinations, total recoverable analytical procedures) are generating biased or nonrepresentative data, and that some of the underlying assumptions relative to current programs, such as calendar-based sampling and stationarity are no longer defensible. The extensive use of statistical models as well as surrogates (e.g., turbidity) also needs to be re-examined because the hydrologic interrelationships that support their use tend to be dynamic rather than static. As a result, a number of monitoring programs may need redesigning, some sampling and analytical procedures may need to be updated, and model/surrogate interrelationships may require recalibration.

  14. Current and future technology in radial and axial gas turbines

    NASA Technical Reports Server (NTRS)

    Rohlik, H. E.

    1983-01-01

    Design approaches and flow analysis techniques currently employed by aircraft engine manufacturers are assessed. Studies were performed to define the characteristics of aircraft and engines for civil missions of the 1990's and beyond. These studies, coupled with experience in recent years, identified the critical technologies needed to meet long range goals in fuel economy and other operating costs. Study results, recent and current research and development programs, and an estimate of future design and analytic capabilities are discussed.

  15. Good Chemical Measurements, Good Public Policies

    NASA Astrophysics Data System (ADS)

    Faulkner, Larry R.

    2005-02-01

    At every turn now, one encounters sharply debated issues and important public policies that rest on chemical information. This is true in practically any arena where public interest intersects with the material world: health care practice and public health; energy; quality of air, water, and food; manufacturing standards and product liability; criminal justice; national and international security, including the defense against terrorism. The scale can be truly global, as in the case of the current debate over climate change, which extends into international efforts to regulate gaseous emissions. Sometimes the relevant chemical measurements and applicable theory are sound and their scope is appropriate to the policy; often they are inadequate and a policy or debate overreaches the analytical capability needed to support it. In the decades ahead, the issues with us today will become even more pressing and will drive a still greater reliance on analytical chemistry. This presentation will have four parts covering (a) illustrations of the impact of analytical chemistry on public debate and public policy, including instances where analytical capabilities actually gave rise to new issues and policies, (b) the manner in which chemical information is handled and understood in public debates, (c) areas of analytical chemistry that will be critical to sound public policy in the future, and (d) implications for the education of leaders and general citizens of modern societies.

  16. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.

  17. Evaluating the Performance of a Commercial Silicon Drift Detector for X-ray Microanalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenik, Edward A

    2011-01-01

    Silicon drift detectors (SDDs) are rapidly becoming the energy dispersive spectrometer (EDS) of choice, especially for scanning electron microscopy x-ray microanalysis. The complementary features of large active area (i.e., high collection angle) and high count rate capability contribute to the popularity of these detectors, as do the absence of liquid nitrogen cooling and good energy resolution. The performance of an EDAX Apollo 40 SDD on a JEOL 6500F SEM is discussed. The larger detector resulted in a significant increase (~3.5x) in geometric collection efficiency compared to the original 10 mm2 Si(Li) detector that it replaced. The SEM can provide high beam currents (up to 200 nA in some conditions) at small probe diameters. The high count rate capability of the SDD and the high current capability of the SEM complement each other and provide excellent EDS analytical capabilities for both single point and spectrum imaging applications.

  18. G-189A analytical simulation of the integrated waste management-water system using radioisotopes for thermal energy

    NASA Technical Reports Server (NTRS)

    Coggi, J. V.; Loscutoff, A. V.; Barker, R. S.

    1973-01-01

    An analytical simulation of the RITE Integrated Waste Management and Water Recovery System, which uses radioisotopes for thermal energy, was prepared for the NASA-Manned Space Flight Center (MSFC). The RITE system is the most advanced water-waste management system concept currently under development and has undergone extended-duration testing. It is capable of disposing of nearly all spacecraft wastes, including feces and trash, and of recovering water from the usual waste water sources: urine, condensate, wash water, etc. All of the process heat normally used in the system is produced from low-penalty radioisotope heat sources. The simulation was developed with the G189A computer program. The objective was to obtain an analytical simulation that can be used to (1) evaluate the current RITE system's steady-state and transient performance during normal operating conditions, as well as during off-normal operating conditions, including failure modes; and (2) evaluate the effects of variations in component design parameters and vehicle interface parameters on system performance.

  19. Simple Analytic Expressions for the Magnetic Field of a Circular Current Loop

    NASA Technical Reports Server (NTRS)

    Simpson, James C.; Lane, John E.; Immer, Christopher D.; Youngquist, Robert C.

    2001-01-01

    Analytic expressions for the magnetic induction (magnetic flux density, B) of a simple planar circular current loop have been published in Cartesian and cylindrical coordinates [1,2], and are also known implicitly in spherical coordinates [3]. In this paper, we present explicit analytic expressions for B and its spatial derivatives in Cartesian, cylindrical, and spherical coordinates for a filamentary current loop. These results were obtained with extensive use of Mathematica(TM) and are exact throughout all space outside of the conductor. The field expressions reduce to the well-known limiting cases and satisfy ∇ · B = 0 and ∇ × B = 0 outside the conductor. These results are general and applicable to any model using filamentary circular current loops. Solenoids of arbitrary size may be easily modeled by approximating the total magnetic induction as the sum of those for the individual loops. The inclusion of the spatial derivatives expands their utility to magnetohydrodynamics, where the derivatives are required. The equations can be coded into any high-level programming language. It is necessary to numerically evaluate complete elliptic integrals of the first and second kind, but this capability is now available with most programming packages.
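The well-known cylindrical-coordinate expressions the record refers to can be sketched compactly. The following is a hedged illustration of the standard elliptic-integral formulas for a filamentary loop (consistent with the on-axis limiting case, but not code from the report); K(m) and E(m) are evaluated with the arithmetic-geometric mean so only the standard library is needed:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def ellip_ke(m, tol=1e-15):
    """Complete elliptic integrals K(m) and E(m), with m = k^2, computed
    by the arithmetic-geometric mean (AGM) iteration."""
    a, b, c = 1.0, math.sqrt(1.0 - m), math.sqrt(m)
    csum, pow2 = 0.5 * c * c, 1.0  # accumulates sum of 2**(n-1) * c_n**2
    while abs(c) > tol:
        a, b, c = 0.5 * (a + b), math.sqrt(a * b), 0.5 * (a - b)
        pow2 *= 2.0
        csum += 0.5 * pow2 * c * c
    K = math.pi / (2.0 * a)
    return K, K * (1.0 - csum)

def loop_field(a, I, rho, z):
    """(B_rho, B_z) of a filamentary circular loop of radius a carrying
    current I, centered at the origin in the z = 0 plane, evaluated at
    cylindrical coordinates (rho, z) off the conductor."""
    q = (a + rho) ** 2 + z ** 2
    d = (a - rho) ** 2 + z ** 2
    K, E = ellip_ke(4.0 * a * rho / q)
    pre = MU0 * I / (2.0 * math.pi * math.sqrt(q))
    B_z = pre * (K + (a * a - rho * rho - z * z) / d * E)
    B_rho = 0.0 if rho == 0.0 else pre * (z / rho) * (-K + (a * a + rho * rho + z * z) / d * E)
    return B_rho, B_z
```

On axis (rho = 0) these expressions collapse to the elementary result B_z = MU0*I*a**2 / (2*(a**2 + z**2)**1.5), which provides a convenient sanity check.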

  20. Assembling substrate-less plasmonic metacrystals at the oil/water interface for multiplex ultratrace analyte detection.

    PubMed

    Lee, Yih Hong; Lee, Hiang Kwee; Ho, Jonathan Yong Chew; Yang, Yijie; Ling, Xing Yi

    2016-08-15

    Current substrate-less SERS platforms are limited to uncontrolled aggregation of plasmonic nanoparticles or quasi-crystalline arrays of spherical nanoparticles, with no study on how the lattice structures formed by nanoparticle self-assembly affect their detection capabilities. Here, we organize Ag octahedral building blocks into two large-area plasmonic metacrystals at the oil/water interface, and investigate their in situ SERS sensing capabilities. Amphiphilic octahedra assemble into a hexagonal close-packed metacrystal, while hydrophobic octahedra assemble into an open square metacrystal. The lower-packing-density square metacrystal gives rise to much stronger SERS enhancement than the denser hexagonal metacrystal, arising from the larger areas of plasmonic hotspots within the square metacrystal at the excitation wavelength. We further demonstrate the ability of the square metacrystal to achieve quantitative ultratrace detection of analytes from both the aqueous and organic phases. Detection limits are at nanomolar levels, with analytical enhancement factors reaching 10^8. In addition, multiplex detection across both phases can be achieved in situ without any loss of signal quantitation.

  1. Nonlinear feedback control for high alpha flight

    NASA Technical Reports Server (NTRS)

    Stalford, Harold

    1990-01-01

    Analytical aerodynamic models are derived from a high alpha 6 DOF wind tunnel model. One detailed model requires some interpolation between nonlinear functions of alpha. One analytical model requires no interpolation and as such is a completely continuous model. Flight path optimization is conducted on the basic maneuvers: half-loop, 90 degree pitch-up, and level turn. The optimal control analysis uses the derived analytical model in the equations of motion and is based on both moment and force equations. The maximum principle solution for the half-loop is a poststall trajectory that performs the half-loop in 13.6 seconds. The agility afforded by thrust vectoring capability had minimal effect on reducing the maneuver time. By means of thrust vectoring control, the 90 degree pitch-up maneuver can be executed in a small space over a short time interval; the agility provided by thrust vectoring is quite beneficial for pitch-up maneuvers. The level turn results are currently based only on outer-layer solutions of singular perturbation. Poststall solutions provide high turn rates but incur greater energy losses than classical sustained solutions.

  2. Phosphorescent nanosensors for in vivo tracking of histamine levels.

    PubMed

    Cash, Kevin J; Clark, Heather A

    2013-07-02

    Continuously tracking bioanalytes in vivo will enable clinicians and researchers to profile normal physiology and monitor diseased states. Current in vivo monitoring system designs are constrained by invasive implantation procedures and biofouling, which limit the utility of these tools for obtaining physiologic data. In this work, we demonstrate the first success in optically tracking histamine levels in vivo using a modular, injectable sensing platform based on diamine oxidase and a phosphorescent oxygen nanosensor. Our new approach increases the range of measurable analytes by combining an enzymatic recognition element with a reversible nanosensor capable of measuring the effects of enzymatic activity. We use these enzyme nanosensors (EnzNS) to monitor in vivo histamine dynamics as the concentration rapidly increases and decreases due to administration and clearance. The EnzNS system measured kinetics that match those reported from ex vivo measurements. This work establishes a modular approach to in vivo nanosensor design for measuring a broad range of potential target analytes. Simply replacing the recognition enzyme, or both the enzyme and nanosensor, can produce a new sensor system capable of measuring a wide range of specific analytical targets in vivo.

  3. Meeting report: Ocean 'omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013).

    PubMed

    Gilbert, Jack A; Dick, Gregory J; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R M; DeLong, Edward F

    2014-06-15

    The National Science Foundation's EarthCube End User Workshop was held at the USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community that is focusing on microbial and physical oceanography research, with a particular emphasis on 'omic research. The assembled researchers outlined the existing concerns regarding the vast data resources that are being generated, and how we will deal with these resources as their volume and diversity increase. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, and on the development of shared, interoperable, "big-data capable" analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyberinfrastructure constraints, (ii) the current and future ocean 'omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyberinfrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting and the outcome of this report is a definition of the 'omics tools, technologies, and infrastructures that will facilitate continued advances in ocean biology, marine biogeochemistry, and biological oceanography.

  4. 2.0 Introduction to the Delaware River Basin pilot study

    Treesearch

    Peter S. Murdoch; Jennifer C. Jenkins; Richard A. Birdsey

    2008-01-01

    The past 20 years of environmental research have shown that the environment is not made up of discrete components acting independently, but rather it is a mosaic of complex relationships among air, land, water, living resources, and human activities. The data collection and analytical capabilities of current ecosystem assessment and monitoring programs are insufficient...

  5. U.S. Offshore Wind Manufacturing and Supply Chain Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamilton, Bruce

    2013-02-22

    This report seeks to provide an organized, analytical approach to identifying and bounding uncertainties around offshore wind manufacturing and supply chain capabilities; projecting potential component-level supply chain needs under three demand scenarios; and identifying key supply chain challenges and opportunities facing the future U.S. market and current suppliers of the nation's land-based wind market.

  6. Analytical modeling of eddy-current losses caused by pulse-width-modulation switching in permanent-magnet brushless direct-current motors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deng, F.; Nehl, T.W.

    1998-09-01

    Because of its high efficiency and power density, the PM brushless dc motor is a strong candidate for electric and hybrid vehicle propulsion systems. An analytical approach is developed to predict the eddy-current losses in a permanent magnet brushless dc motor caused by the inverter's high-frequency pulse-width-modulation (PWM) switching. The model uses polar coordinates to take curvature effects into account, and is also capable of including the space harmonic effect of the stator magnetic field and the stator lamination effect on the losses. The model was applied to an existing motor design and was verified with the finite element method. Good agreement was achieved between the two approaches. Hence, the model is expected to be very helpful in predicting PWM switching losses in permanent magnet machine design.

  7. Evaluation of capillary electrophoresis for in-flight ionic contaminant monitoring of SSF potable water

    NASA Technical Reports Server (NTRS)

    Mudgett, Paul D.; Schultz, John R.; Sauer, Richard L.

    1992-01-01

    Until 1989, ion chromatography (IC) was the baseline technology selected for the Specific Ion Analyzer, an in-flight inorganic water quality monitor being designed for Space Station Freedom. Recent developments in capillary electrophoresis (CE) may offer significant savings of consumables, power consumption, and weight/volume allocation, relative to IC technology. A thorough evaluation of CE's analytical capability, however, is necessary before one of the two techniques is chosen. Unfortunately, analytical methods currently available for inorganic CE are unproven for NASA's target list of anions and cations. Thus, CE electrolyte chemistry and methods to measure the target contaminants must be first identified and optimized. This paper reports the status of a study to evaluate CE's capability with regard to inorganic and carboxylate anions, alkali and alkaline earth cations, and transition metal cations. Preliminary results indicate that CE has an impressive selectivity and trace sensitivity, although considerable methods development remains to be performed.

  8. Finite Element Analysis of Active and Sensory Thermopiezoelectric Composite Materials. Degree awarded by Northwestern Univ., Dec. 2000

    NASA Technical Reports Server (NTRS)

    Lee, Ho-Jun

    2001-01-01

    Analytical formulations are developed to account for the coupled mechanical, electrical, and thermal response of piezoelectric composite materials. The coupled response is captured at the material level through the thermopiezoelectric constitutive equations and leads to the inherent capability to model both the sensory and active responses of piezoelectric materials. A layerwise laminate theory is incorporated to provide more accurate analysis of the displacements, strains, stresses, electric fields, and thermal fields through-the-thickness. Thermal effects which arise from coefficient of thermal expansion mismatch, pyroelectric effects, and temperature dependent material properties are explicitly accounted for in the formulation. Corresponding finite element formulations are developed for piezoelectric beam, plate, and shell elements to provide a more generalized capability for the analysis of arbitrary piezoelectric composite structures. The accuracy of the current formulation is verified with comparisons from published experimental data and other analytical models. Additional numerical studies are also conducted to demonstrate additional capabilities of the formulation to represent the sensory and active behaviors. A future plan of experimental studies is provided to characterize the high temperature dynamic response of piezoelectric composite materials.

  9. Analytical and Experimental Evaluation of the Heat Transfer Distribution over the Surfaces of Turbine Vanes

    NASA Technical Reports Server (NTRS)

    Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.

    1983-01-01

    Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Several 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time-dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero-order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the database indicates a significant improvement in predictive capability.

  10. Analytical and experimental evaluation of the heat transfer distribution over the surfaces of turbine vanes

    NASA Astrophysics Data System (ADS)

    Hylton, L. D.; Mihelc, M. S.; Turner, E. R.; Nealy, D. A.; York, R. E.

    1983-05-01

    Three airfoil data sets were selected for use in evaluating currently available analytical models for predicting airfoil surface heat transfer distributions in a 2-D flow field. Two additional airfoils, representative of highly loaded, low solidity airfoils currently being designed, were selected for cascade testing at simulated engine conditions. Several 2-D analytical methods were examined and a version of the STAN5 boundary layer code was chosen for modification. The final form of the method utilized a time-dependent, transonic inviscid cascade code coupled to a modified version of the STAN5 boundary layer code featuring zero-order turbulence modeling. The boundary layer code is structured to accommodate a full spectrum of empirical correlations addressing the coupled influences of pressure gradient, airfoil curvature, and free-stream turbulence on airfoil surface heat transfer distribution and boundary layer transitional behavior. Comparison of predictions made with the model to the database indicates a significant improvement in predictive capability.

  11. Trace level detection of analytes using artificial olfactometry

    NASA Technical Reports Server (NTRS)

    Lewis, Nathan S. (Inventor); Severin, Erik J. (Inventor); Wong, Bernard (Inventor)

    2002-01-01

    The present invention provides a device for detecting the presence of an analyte, such as, for example, a lightweight device including: a sample chamber having a fluid inlet port for the influx of the analyte; a fluid concentrator in flow communication with the sample chamber, wherein the fluid concentrator has an absorbent material capable of absorbing the analyte and of desorbing a concentrated analyte; and an array of sensors in fluid communication with the concentrated analyte released from the fluid concentrator.

  12. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  13. Integrating emerging earth science technologies into disaster risk management: an enterprise architecture approach

    NASA Astrophysics Data System (ADS)

    Evans, J. D.; Hao, W.; Chettri, S. R.

    2014-12-01

    Disaster risk management has grown to rely on earth observations, multi-source data analysis, numerical modeling, and interagency information sharing. The practice and outcomes of disaster risk management will likely undergo further change as several emerging earth science technologies come of age: mobile devices; location-based services; ubiquitous sensors; drones; small satellites; satellite direct readout; Big Data analytics; cloud computing; Web services for predictive modeling, semantic reconciliation, and collaboration; and many others. Integrating these new technologies well requires developing and adapting them to meet current needs; but also rethinking current practice to draw on new capabilities to reach additional objectives. This requires a holistic view of the disaster risk management enterprise and of the analytical or operational capabilities afforded by these technologies. One helpful tool for this assessment, the GEOSS Architecture for the Use of Remote Sensing Products in Disaster Management and Risk Assessment (Evans & Moe, 2013), considers all phases of the disaster risk management lifecycle for a comprehensive set of natural hazard types, and outlines common clusters of activities and their use of information and computation resources. We are using these architectural views, together with insights from current practice, to highlight effective, interrelated roles for emerging earth science technologies in disaster risk management. These roles may be helpful in creating roadmaps for research and development investment at national and international levels.

  14. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

    Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing full characterization (elemental assay, isotopic composition, and metallic and nonmetallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation Program (SME), and several other inter-laboratory round robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities, and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  15. Acoustic Prediction State of the Art Assessment

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.

    2007-01-01

    The acoustic assessment task for both the Subsonic Fixed Wing and the Supersonic projects under NASA's Fundamental Aeronautics Program was designed to assess the current state of the art in noise prediction capability and to establish baselines for gauging future progress. The documentation of our current capabilities included quantifying the differences between predictions of noise from computer codes and measurements of noise from experimental tests. Quantifying the accuracy of both the computed and experimental results further enhanced the credibility of the assessment. This presentation gives sample results from codes representative of NASA's capabilities in aircraft noise prediction, both for systems and components. These include semi-empirical, statistical, analytical, and numerical codes. System-level results are shown for both aircraft and engines. Component-level results are shown for a landing gear prototype, for fan broadband noise, for jet noise from a subsonic round nozzle, and for propulsion airframe aeroacoustic interactions. Additional results are shown for modeling of the acoustic behavior of duct acoustic lining and the attenuation of sound in lined ducts with flow.

  16. Proceedings of the OECD/CSNI workshop on transient thermal-hydraulic and neutronic codes requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ebert, D.

    1997-07-01

    This is a report on the CSNI Workshop on Transient Thermal-Hydraulic and Neutronic Codes Requirements held at Annapolis, Maryland, USA, November 5-8, 1996. This experts' meeting drew 140 participants from 21 countries; 65 invited papers were presented. The meeting was divided into five areas: (1) current and prospective plans for thermal-hydraulic code development; (2) current and anticipated uses of thermal-hydraulic codes; (3) advances in modeling of thermal-hydraulic phenomena and associated additional experimental needs; (4) numerical methods in multi-phase flows; and (5) programming languages, code architectures, and user interfaces. The workshop consensus identified the following important action items to be addressed by the international community in order to maintain and improve the calculational capability: (a) preserve current code expertise and institutional memory; (b) preserve the ability to use the existing investment in plant transient analysis codes; (c) maintain essential experimental capabilities; (d) develop advanced measurement capabilities to support future code validation work; (e) integrate existing analytical capabilities so as to improve performance and reduce operating costs; (f) exploit the proven advances in code architecture, numerics, graphical user interfaces, and modularization in order to improve code performance and scrutability; and (g) more effectively utilize user experience in modifying and improving the codes.

  17. CFD Code Survey for Thrust Chamber Application

    NASA Technical Reports Server (NTRS)

    Gross, Klaus W.

    1990-01-01

    In the quest to find analytical reference codes, responses from a questionnaire are presented that portray the current computational fluid dynamics (CFD) program status and capability at various organizations for characterizing liquid rocket thrust chamber flow fields. Sample cases are identified to examine the ability, operational conditions, and accuracy of the codes. To select the best-suited programs for accelerated improvements, evaluation criteria are proposed.

  18. Defense Resource Management Studies: Introduction to Capability and Acquisition Planning Processes

    DTIC Science & Technology

    2010-08-01

    interchangeable and useful in a common contextual framework . Currently, both simulations use a common scenario, the same fictitious country, and...culture, legal framework , and institutions. • Incorporate Principles of Good Governance and Respect for Human Rights: Stress accountability and...Preparing for the assessments requires defining the missions to be analyzed; subdividing the mission definitions to provide a framework for analytic work

  19. Analytical and Radiochemistry for Nuclear Forensics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, Robert Ernest; Dry, Donald E.; Kinman, William Scott

    This presentation provides information about nonproliferation nuclear forensics, forensics activities at Los Alamos National Laboratory, radioanalytical work at LANL, radiochemical characterization capabilities, bulk chemical and materials analysis capabilities, and future interests in forensics interactions.

  20. Automatically measuring brain ventricular volume within PACS using artificial intelligence.

    PubMed

    Yepes-Calderon, Fernando; Nelson, Marvin D; McComb, J Gordon

    2018-01-01

    The picture archiving and communications system (PACS) is currently the standard platform to manage medical images but lacks analytical capabilities. Staying within PACS, the authors have developed an automatic method to retrieve the medical data and access it at the voxel level, decrypted and uncompressed, which allows analytics while not perturbing the system's daily operation. Additionally, the strategy is secure and vendor independent. Cerebral ventricular volume is important for the diagnosis and treatment of many neurological disorders. A significant change in ventricular volume is readily recognized, but subtle changes, especially over longer periods of time, may be difficult to discern. Clinical imaging protocols and parameters are often varied, making it difficult to use a general solution with standard segmentation techniques. Presented is a segmentation strategy based on an algorithm that uses four features extracted from the medical images to create a statistical estimator capable of determining ventricular volume. When compared with manual segmentations, the correlation was 94%, and the approach holds promise for even better accuracy by incorporating the large amount of data available. The volume of any segmentable structure can be accurately determined utilizing the machine learning strategy presented, which runs fully automatically within the PACS.
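    The abstract above does not specify the four image features or the learning algorithm, so the following is only a generic sketch of a feature-based statistical estimator: a linear model fit to four hypothetical image-derived features by ordinary least squares, in plain Python. The feature values and volumes are illustrative, not data from the study.

```python
def fit_linear(X, y):
    """Fit a linear volume estimator to rows of feature tuples by ordinary
    least squares: solve the normal equations (X'X) b = (X'y) with Gaussian
    elimination. An intercept column is prepended to each row."""
    rows = [[1.0] + list(r) for r in X]
    n = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * t for r, t in zip(rows, y)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef

def predict(coef, feats):
    """Predict a volume from a 4-tuple of (hypothetical) image features."""
    return coef[0] + sum(c * f for c, f in zip(coef[1:], feats))
```

    In practice the estimator would be trained against manually segmented volumes, then applied to new studies retrieved from PACS.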

  1. Analytical capabilities and services of Lawrence Livermore Laboratory's General Chemistry Division. [Methods available at Lawrence Livermore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gutmacher, R.; Crawford, R.

    This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.

  2. Imaging and Analytics: The changing face of Medical Imaging

    NASA Astrophysics Data System (ADS)

    Foo, Thomas

    There have been significant technological advances in imaging capability over the past 40 years. Medical imaging capabilities have developed rapidly, along with technology development in computational processing speed and miniaturization. With the move to all-digital imaging, the number of images acquired in a routine clinical examination has increased dramatically, from under 50 images in the early days of CT and MRI to more than 500-1000 images today. The staggering number of images that are routinely acquired poses significant challenges for clinicians to interpret the data and to correctly identify the clinical problem. Although the time provided to render a clinical finding has not substantially changed, the amount of data available for interpretation has grown exponentially. In addition, the image quality (spatial resolution) and information content (physiologically dependent image contrast) have also increased significantly with advances in medical imaging technology. On its current trajectory, medical imaging in the traditional sense is unsustainable. To assist in filtering and extracting the most relevant data elements from medical imaging, image analytics will have a much larger role. Automated image segmentation, generation of parametric image maps, and clinical decision support tools will be needed and developed apace to allow the clinician to manage, extract, and utilize only the information that will help improve diagnostic accuracy and sensitivity. As medical imaging devices continue to improve in spatial resolution and in functional and anatomical information content, image/data analytics will be more ubiquitous and integral to medical imaging capability.

  3. Productivity and injectivity of horizontal wells. Quarterly report, October 1--December 31, 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fayers, F.J.; Aziz, K.; Hewett, T.A.

    1993-03-10

    A number of activities have been carried out in the last three months. A list outlining these efforts is presented below, followed by a brief description of each activity in the subsequent sections of this report. Progress is being made on the development of a black oil three-phase simulator which will allow the use of a generalized Voronoi grid in the plane perpendicular to a horizontal well. The available analytical solutions in the literature for calculating productivity indices (inflow performance) of horizontal wells have been reviewed. The pseudo-steady state analytic model of Goode and Kuchuk has been applied to an example problem. A general mechanistic two-phase flow model is under development. The model is capable of predicting flow transition boundaries for a horizontal pipe at any inclination angle. It also has the capability of determining pressure drops and holdups for all the flow regimes. A large code incorporating all the features of the model has been programmed and is currently being tested.

  4. Calculation of ground vibration spectra from heavy military vehicles

    NASA Astrophysics Data System (ADS)

    Krylov, V. V.; Pickup, S.; McNuff, J.

    2010-07-01

    The demand for reliable autonomous systems capable of detecting and identifying heavy military vehicles has become an important issue for UN peacekeeping forces in the current delicate political climate. A promising method of detection and identification is one using the information extracted from the ground vibration spectra generated by heavy military vehicles, often termed their seismic signatures. This paper presents the results of a theoretical investigation of ground vibration spectra generated by heavy military vehicles, such as tanks and armored personnel carriers. A simple quarter-car model is considered to identify the resulting dynamic forces applied from a vehicle to the ground. The obtained analytical expressions for vehicle dynamic forces are then used to calculate the generated ground vibrations, predominantly Rayleigh surface waves, using the Green's function method. A comparison of the obtained theoretical results with published experimental data shows that analytical techniques based on the simplified quarter-car vehicle model are capable of producing ground vibration spectra of heavy military vehicles that reproduce the basic properties of experimental spectra.
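    The quarter-car force calculation described above can be sketched in a few lines: a single sprung mass on a spring-damper, base-excited by a harmonic surface profile, with the amplitude of the suspension force transmitted to the ground evaluated from the steady-state phasor response. The vehicle parameters below are hypothetical placeholders, not values from the paper.

```python
import math

def ground_force_amplitude(m, k, c, Y, freq_hz):
    """Amplitude |F| of the suspension force a single-DOF quarter-car
    transmits to the ground while traversing a harmonic surface profile
    y(t) = Y sin(w t). From m x'' = k(y - x) + c(y' - x'), the phasor
    response gives F = (k + i c w)(y - x) = (k + i c w)(-m w^2) Y / D,
    with D = k - m w^2 + i c w."""
    w = 2.0 * math.pi * freq_hz
    denom = (k - m * w**2) + 1j * c * w
    F = (k + 1j * c * w) * (-m * w**2) / denom * Y
    return abs(F)

# Hypothetical heavy-vehicle parameters: sprung mass (kg),
# suspension stiffness (N/m), damping (N s/m)
m, k, c = 10_000.0, 4.0e6, 4.0e4
f0 = math.sqrt(k / m) / (2.0 * math.pi)  # undamped natural frequency, ~3.2 Hz
```

    The transmitted force peaks near f0, the suspension resonance that dominates the low-frequency part of the seismic signature.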

  5. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both the carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
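    The inter-laboratory variability judged "acceptable" above can be quantified with a simple summary statistic: the spread of lab means relative to the grand mean. A minimal sketch follows; the acid-number replicates (mg KOH/g) and lab names are hypothetical illustrations, not data from the round robin.

```python
import statistics

def interlab_summary(results_by_lab):
    """Round-robin summary for one method: the grand mean of per-lab means
    and the relative standard deviation (RSD, %) of those lab means, a
    simple gauge of between-laboratory variability."""
    lab_means = [statistics.mean(v) for v in results_by_lab.values()]
    grand = statistics.mean(lab_means)
    rsd = 100.0 * statistics.stdev(lab_means) / grand
    return grand, rsd

# Hypothetical acid-number replicates (mg KOH/g) from four labs
acid_number = {
    "lab_A": [95.1, 94.8, 95.3],
    "lab_B": [96.0, 95.7, 96.2],
    "lab_C": [94.5, 94.9, 94.6],
    "lab_D": [95.8, 95.5, 95.9],
}
```

    A small between-lab RSD (here well under 1%) is the kind of result that lets a method pass validation.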

  6. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both the carbonyl titration and acid number methods yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  7. Activities in Aeroelasticity at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Perry, Boyd, III; Noll, Thomas E.

    1997-01-01

    This paper presents the results of recently-completed research and presents status reports of current research being performed within the Aeroelasticity Branch of the NASA Langley Research Center. Within the paper this research is classified as experimental, analytical, and theoretical aeroelastic research. The paper also describes the Langley Transonic Dynamics Tunnel, its features, capabilities, a new open-architecture data acquisition system, ongoing facility modifications, and the subsequent calibration of the facility.

  8. xQuake: A Modern Approach to Seismic Network Analytics

    NASA Astrophysics Data System (ADS)

    Johnson, C. E.; Aikin, K. E.

    2017-12-01

    While seismic networks have expanded over the past few decades and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation introduces the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, which is essentially a self-organizing graph database. An xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach to event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g., using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable seismic community support in the further development of its capabilities.

  9. The role of light microscopy in aerospace analytical laboratories

    NASA Technical Reports Server (NTRS)

    Crutcher, E. R.

    1977-01-01

    Light microscopy has greatly reduced analytical flow time and added new dimensions to laboratory capability. Aerospace analytical laboratories are often confronted with problems involving contamination, wear, or material inhomogeneity. The detection of potential problems and the solution of those that develop necessitate the most sensitive and selective applications of sophisticated analytical techniques and instrumentation. This inevitably involves light microscopy. The microscope can characterize and often identify the cause of a problem in 5-15 minutes, with confirmatory tests generally taking less than one hour. Light microscopy has made and will continue to make a very significant contribution to the analytical capabilities of aerospace laboratories.

  10. Collector modulation in high-voltage bipolar transistor in the saturation mode: Analytical approach

    NASA Astrophysics Data System (ADS)

    Dmitriev, A. P.; Gert, A. V.; Levinshtein, M. E.; Yuferev, V. S.

    2018-04-01

    A simple analytical model is developed, capable of replacing the numerical solution of a system of nonlinear partial differential equations by solving a simple algebraic equation when analyzing the collector resistance modulation of a bipolar transistor in the saturation mode. In this approach, the leakage of the base current into the emitter and the recombination of non-equilibrium carriers in the base are taken into account. The data obtained are in good agreement with the results of numerical calculations and make it possible to describe both the motion of the front of the minority carriers and the steady state distribution of minority carriers across the collector in the saturation mode.

  11. Automated glycopeptide analysis—review of current state and future directions

    PubMed Central

    Dallas, David C.; Martin, William F.; Hua, Serenus

    2013-01-01

    Glycosylation of proteins is involved in immune defense, cell–cell adhesion, cellular recognition and pathogen binding and is one of the most common and complex post-translational modifications. Science is still struggling to assign detailed mechanisms and functions to this form of conjugation. Even the structural analysis of glycoproteins—glycoproteomics—remains in its infancy due to the scarcity of high-throughput analytical platforms capable of determining glycopeptide composition and structure, especially platforms for complex biological mixtures. Glycopeptide composition and structure can be determined with high mass-accuracy mass spectrometry, particularly when combined with chromatographic separation, but the sheer volume of generated data necessitates computational software for interpretation. This review discusses the current state of glycopeptide assignment software—advances made to date and issues that remain to be addressed. The various software and algorithms developed so far provide important insights into glycoproteomics. However, there is currently no freely available software that can analyze spectral data in batch and unambiguously determine glycopeptide compositions for N- and O-linked glycopeptides from relevant biological sources such as human milk and serum. Few programs are capable of aiding in structural determination of the glycan component. To significantly advance the field of glycoproteomics, analytical software and algorithms are required that: (i) solve for both N- and O-linked glycopeptide compositions, structures and glycosites in biological mixtures; (ii) are high-throughput and process data in batches; (iii) can interpret mass spectral data from a variety of sources and (iv) are open source and freely available. PMID:22843980
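    The composition-assignment problem described above can be sketched as a brute-force mass search: enumerate glycan compositions whose summed residue masses, added to a neutral peptide mass, match the observed glycopeptide mass within a ppm tolerance. The monoisotopic residue masses below are standard values; the peptide and observed masses in the usage example are hypothetical, and a real tool would also score fragmentation spectra and glycosites.

```python
from itertools import product

# Standard monoisotopic residue masses (Da) of common monosaccharides
RESIDUES = {"Hex": 162.05282, "HexNAc": 203.07937,
            "Fuc": 146.05791, "NeuAc": 291.09542}

def match_compositions(observed_mass, peptide_mass, tol_ppm=10.0, max_count=10):
    """Enumerate glycan compositions whose residue-mass sum, added to the
    neutral peptide mass, matches the observed neutral glycopeptide mass
    within a ppm tolerance. A brute-force sketch, not an optimized search."""
    target = observed_mass - peptide_mass
    names = list(RESIDUES)
    hits = []
    for counts in product(range(max_count + 1), repeat=len(names)):
        m = sum(c * RESIDUES[n] for c, n in zip(counts, names))
        if m and abs(m - target) / observed_mass * 1e6 <= tol_ppm:
            hits.append(dict(zip(names, counts)))
    return hits
```

    For example, a hypothetical peptide of 1500.0 Da carrying a Hex5HexNAc2 glycan should be recovered from the corresponding observed mass at a 5 ppm tolerance.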

  12. Application of capability indices and control charts in the analytical method control strategy.

    PubMed

    Oliva, Alexis; Llabres Martinez, Matías

    2017-08-01

    In this study, we assessed the usefulness of control charts in combination with the process capability indices Cpm and Cpk in the control strategy of an analytical method. The traditional X-chart and moving range chart were used to monitor the analytical method over a 2-year period. The results confirmed that the analytical method is in control and stable. Different criteria were used to establish the specification limits (i.e., analyst requirements) for fixed method performance (i.e., method requirements). If the specification limits and control limits are equal in breadth, the method can be considered "capable" (Cpm = 1), but it does not satisfy the minimum method capability requirements proposed by Pearn and Shu (2003). Similar results were obtained using the Cpk index. The method capability was also assessed as a function of method performance for fixed analyst requirements. The results indicate that the method does not meet the requirements of the analytical target approach. Real data from a size-exclusion chromatography (SEC) method with light-scattering detection were used as a model, whereas previously published data were used to illustrate the applicability of the proposed approach. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
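    The two indices used in the study have standard textbook definitions, which can be computed directly from monitoring data; the assay values in the test below are hypothetical, not the paper's SEC data.

```python
import math
import statistics

def cpk(data, lsl, usl):
    """Cpk: process capability relative to the nearer specification limit,
    min(USL - mean, mean - LSL) / (3 * s)."""
    mu, sigma = statistics.mean(data), statistics.stdev(data)
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

def cpm(data, lsl, usl, target):
    """Cpm (Taguchi index): (USL - LSL) / (6 * tau), where tau also
    penalizes deviation of the mean from the target value."""
    mu, sigma = statistics.mean(data), statistics.stdev(data)
    tau = math.sqrt(sigma**2 + (mu - target)**2)
    return (usl - lsl) / (6.0 * tau)
```

    When the specification limits coincide with the natural control limits (mean ± 3s) and the process is on target, both indices equal 1, the borderline "capable" case noted in the abstract.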

  13. MERRA Analytic Services

    NASA Astrophysics Data System (ADS)

    Schnase, J. L.; Duffy, D. Q.; McInerney, M. A.; Tamkin, G. S.; Thompson, J. H.; Gill, R.; Grieg, C. M.

    2012-12-01

    MERRA Analytic Services (MERRA/AS) is a cyberinfrastructure resource for developing and evaluating a new generation of climate data analysis capabilities. MERRA/AS supports OBS4MIP activities by reducing the time spent in the preparation of Modern Era Retrospective-Analysis for Research and Applications (MERRA) data used in data-model intercomparison. It also provides a testbed for experimental development of high-performance analytics. MERRA/AS is a cloud-based service built around the Virtual Climate Data Server (vCDS) technology that is currently used by the NASA Center for Climate Simulation (NCCS) to deliver Intergovernmental Panel on Climate Change (IPCC) data to the Earth System Grid Federation (ESGF). Crucial to its effectiveness, MERRA/AS's servers will use a workflow-generated realizable object capability to perform analyses over the MERRA data using the MapReduce approach to parallel storage-based computation. The results produced by these operations will be stored by the vCDS, which will also be able to host code sets for those who wish to explore the use of MapReduce for more advanced analytics. While the work described here focuses on the MERRA collection, these technologies can be used to publish other reanalysis, observational, and ancillary OBS4MIP data to ESGF and, importantly, offer an architectural approach to climate data services that can be generalized to applications and customers beyond the traditional climate research community. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (A) MERRA/AS software stack. (B) Example MERRA/AS interfaces.
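    The MapReduce approach named above, reduced to its essentials in plain Python: a map phase emitting keyed partial (sum, count) pairs and a reduce phase combining them into per-key averages. The (grid cell, value) record layout is hypothetical, standing in for MERRA variables partitioned across storage nodes; MERRA/AS's actual vCDS implementation is not shown here.

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (grid_cell, (value, 1)) pairs from raw (cell, value)
    records, the per-record partial sums and counts."""
    for cell, value in records:
        yield cell, (value, 1)

def reduce_phase(pairs):
    """Reduce: merge partial (sum, count) pairs by key, then form means.
    In a distributed run, each reducer handles a shard of the keys."""
    acc = defaultdict(lambda: [0.0, 0])
    for cell, (s, n) in pairs:
        acc[cell][0] += s
        acc[cell][1] += n
    return {cell: s / n for cell, (s, n) in acc.items()}
```

    Because the partial pairs are associative and commutative, the reduce step can run in parallel next to the storage nodes, which is the point of storage-based computation.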

  14. Quantitative Story Telling: Initial steps towards bridging perspectives and tools for a robust nexus assessment

    NASA Astrophysics Data System (ADS)

    Cabello, Violeta

    2017-04-01

    This communication will present the development of an innovative analytical framework for the analysis of the Water-Energy-Food-Climate Nexus termed Quantitative Story Telling (QST). The methodology is currently under development within the H2020 project MAGIC - Moving Towards Adaptive Governance in Complexity: Informing Nexus Security (www.magic-nexus.eu). The key innovation of QST is that it bridges qualitative and quantitative analytical tools in an iterative research process in which each step is built and validated in interaction with stakeholders. The qualitative analysis focusses on the identification of the narratives behind the development of relevant WEFC-Nexus policies and innovations. The quantitative engine is the Multi-Scale Analysis of Societal and Ecosystem Metabolism (MuSIASEM), a resource accounting toolkit capable of integrating multiple analytical dimensions at different scales through relational analysis. Although QST may be labelled a story-driven rather than a data-driven approach, I will argue that improving models per se may not lead to an improved understanding of WEF-Nexus problems unless we are capable of generating more robust narratives to frame them. The communication will cover an introduction to the MAGIC project, the basic concepts of QST, and a case study focussed on agricultural production in a semi-arid region in Southern Spain. Data requirements for this case study and the limitations in finding, accessing or estimating them will be presented alongside a reflection on the relation between analytical scales and data availability.

  15. Recent Work in Hybrid Radiation Transport Methods with Applications to Commercial Nuclear Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulesza, Joel A.

    This talk will begin with an overview of hybrid radiation transport methods followed by a discussion of the author’s work to advance current capabilities. The talk will then describe applications for these methods in commercial nuclear power reactor analyses and techniques for experimental validation. When discussing these analytical and experimental activities, the importance of technical standards such as those created and maintained by ASTM International will be demonstrated.

  16. Mars Field Geology, Biology, and Paleontology Workshop: Summary and Recommendations

    NASA Technical Reports Server (NTRS)

    Budden, Nancy Ann (Editor)

    1998-01-01

    Current NASA planning envisions human missions to Mars as early as 2013, on a mission that would send six crew members for a 500-day stay on the surface of Mars. While our understanding of how we would get there and back is fairly mature, the planning for what the crew would do to explore while on the surface for 500 days is less detailed. Mission objectives are to understand the composition and geomorphology of the martian surface, and to continue to investigate and sample the geologic history of Mars. Special emphasis will focus on exploring for possible biogenic signatures, past or present, and on analyzing pre-biotic chemistry. The purpose of this workshop was to explore the strategies, desired capabilities, skills, and operational realities required to lend success to the first human missions to Mars. Current mission planning dictates that there will be considerable mobility, sampling and analytical capability available to human crews, at a site warranting long-term geologic and possibly biological interest. However, the details of specific capabilities are not yet clearly defined.

  17. Mars Field Geology, Biology, and Paleontology Workshop: Summary and Recommendations

    NASA Technical Reports Server (NTRS)

    Budden, Nancy Ann (Editor)

    1999-01-01

    Current NASA planning envisions human missions to Mars as early as 2013, on a mission that would send six crew members for a 500-day stay on the surface of Mars. While our understanding of how we would get there and back is fairly mature, the planning for what the crew would do to explore while on the surface for 500 days is less detailed. Mission objectives are to understand the composition and geomorphology of the martian surface, and to continue to investigate and sample the geologic history of Mars. Special emphasis will focus on exploring for possible biogenic signatures, past or present, and on analyzing pre-biotic chemistry. The purpose of this workshop was to explore the strategies, desired capabilities, skills, and operational realities required to lend success to the first human missions to Mars. Current mission planning dictates that there will be considerable mobility, sampling and analytical capability available to human crews, at a site warranting long-term geologic and possibly biological interest. However, the details of specific capabilities are not yet clearly defined.

  18. Analytical screening of low emissions, high performance duct burners for supersonic cruise aircraft engines

    NASA Technical Reports Server (NTRS)

    Lohmann, R. A.; Riecke, G. T.

    1977-01-01

    An analytical screening study was conducted to identify duct burner concepts capable of providing low emissions and high performance in advanced supersonic engines. Duct burner configurations ranging from current augmenter technology to advanced concepts such as premix-prevaporized burners were defined. Aerothermal and mechanical design studies provided the basis for screening these configurations using the criteria of emissions, performance, engine compatibility, cost, weight and relative risk. Technology levels derived from recently defined experimental low emissions main burners are required to achieve both low emissions and high performance goals. A configuration based on the Vorbix (Vortex burning and mixing) combustor concept was analytically determined to meet the performance goals and is consistent with the fan duct envelope of a variable cycle engine. The duct burner configuration has a moderate risk level compatible with the schedule of anticipated experimental programs.

  19. Development of methodologies and procedures for identifying STS users and uses

    NASA Technical Reports Server (NTRS)

    Archer, J. L.; Beauchamp, N. A.; Macmichael, D. C.

    1974-01-01

    A study was conducted to identify new uses and users of the new Space Transportation System (STS) within the domestic government sector. The study develops a series of analytical techniques and well-defined functions structured as an integrated planning process to assure efficient and meaningful use of the STS. The purpose of the study is to provide NASA with the following functions: (1) to realize efficient and economic use of the STS and other NASA capabilities, (2) to identify new users and uses of the STS, (3) to contribute to organized planning activities for both current and future programs, and (4) to aid in analyzing uses of NASA's overall capabilities.

  20. Tunable Ionization Modes of a Flowing Atmospheric-Pressure Afterglow (FAPA) Ambient Ionization Source.

    PubMed

    Badal, Sunil P; Michalak, Shawn D; Chan, George C-Y; You, Yi; Shelley, Jacob T

    2016-04-05

    Plasma-based ambient desorption/ionization sources are versatile in that they enable direct ionization of gaseous samples as well as desorption/ionization of analytes from liquid and solid samples. However, ionization matrix effects, caused by competitive ionization processes, can worsen sensitivity or even inhibit detection altogether. The present study is focused on expanding the analytical capabilities of the flowing atmospheric-pressure afterglow (FAPA) source by exploring additional types of ionization chemistry. Specifically, it was found that the abundance and type of reagent ions produced by the FAPA source, and thus the corresponding ionization pathways of analytes, can be altered by changing the source working conditions. A high abundance of proton-transfer reagent ions was observed at relatively high gas flow rates and low discharge currents. Conversely, charge-transfer reagent species were most abundant at low gas flows and high discharge currents. A rather nonpolar model analyte, biphenyl, was found to change ionization pathway significantly based on source operating parameters. Different analyte ions (e.g., MH(+) via proton transfer and M(+.) via charge transfer) were formed under distinct operating parameters, demonstrating two different operating regimes. These tunable ionization modes of the FAPA were used to enable or enhance detection of analytes that traditionally exhibit low sensitivity in plasma-based ADI-MS analyses. In one example, 2,2'-dichloroquaterphenyl was detected under charge-transfer FAPA conditions; this analyte is difficult or impossible to detect with proton-transfer FAPA or direct analysis in real time (DART). Overall, this unique mode of operation increases the number and range of detectable analytes and has the potential to lessen ionization matrix effects in ADI-MS analyses.

  1. Nanopore with Transverse Nanoelectrodes for Electrical Characterization and Sequencing of DNA

    PubMed Central

    Gierhart, Brian C.; Howitt, David G.; Chen, Shiahn J.; Zhu, Zhineng; Kotecki, David E.; Smith, Rosemary L.; Collins, Scott D.

    2009-01-01

    A DNA sequencing device which integrates transverse conducting electrodes for the measurement of electrode currents during DNA translocation through a nanopore has been nanofabricated and characterized. A focused electron beam (FEB) milling technique, capable of creating features on the order of 1 nm in diameter, was used to create the nanopore. The device was characterized electrically using gold nanoparticles as an artificial analyte with both DC and AC measurement methods. Single nanoparticle/electrode interaction events were recorded. A low-noise, high-speed transimpedance current amplifier for the detection of nano to picoampere currents at microsecond time scales was designed, fabricated and tested for future integration with the nanopore device. PMID:19584949

  2. Nanopore with Transverse Nanoelectrodes for Electrical Characterization and Sequencing of DNA.

    PubMed

    Gierhart, Brian C; Howitt, David G; Chen, Shiahn J; Zhu, Zhineng; Kotecki, David E; Smith, Rosemary L; Collins, Scott D

    2008-06-16

    A DNA sequencing device which integrates transverse conducting electrodes for the measurement of electrode currents during DNA translocation through a nanopore has been nanofabricated and characterized. A focused electron beam (FEB) milling technique, capable of creating features on the order of 1 nm in diameter, was used to create the nanopore. The device was characterized electrically using gold nanoparticles as an artificial analyte with both DC and AC measurement methods. Single nanoparticle/electrode interaction events were recorded. A low-noise, high-speed transimpedance current amplifier for the detection of nano to picoampere currents at microsecond time scales was designed, fabricated and tested for future integration with the nanopore device.

  3. Pre-Analytical Considerations for Successful Next-Generation Sequencing (NGS): Challenges and Opportunities for Formalin-Fixed and Paraffin-Embedded Tumor Tissue (FFPE) Samples

    PubMed Central

    Arreaza, Gladys; Qiu, Ping; Pang, Ling; Albright, Andrew; Hong, Lewis Z.; Marton, Matthew J.; Levitan, Diane

    2016-01-01

    In cancer drug discovery, it is important to investigate the genetic determinants of response or resistance to cancer therapy as well as factors that contribute to adverse events in the course of clinical trials. Despite the emergence of new technologies and the ability to measure more diverse analytes (e.g., circulating tumor cell (CTC), circulating tumor DNA (ctDNA), etc.), tumor tissue is still the most common and reliable source for biomarker investigation. Because of its worldwide use and ability to preserve samples for many decades at ambient temperature, formalin-fixed, paraffin-embedded tumor tissue (FFPE) is likely to be the preferred choice for tissue preservation in clinical practice for the foreseeable future. Multiple analyses are routinely performed on the same FFPE samples (such as immunohistochemistry (IHC), in situ hybridization, RNAseq, DNAseq, TILseq, Methyl-Seq, etc.). Thus, specimen prioritization and optimization of analyte isolation are critical to ensure successful completion of each assay. FFPE is notorious for producing suboptimal DNA quality and low DNA yield. However, commercial vendors tend to request a higher DNA sample mass than is actually required for downstream assays, which restricts the breadth of biomarker work that can be performed. We evaluated multiple genomics service laboratories to assess the current state of NGS pre-analytical processing of FFPE. Significant differences in pre-analytical capabilities were observed. Key aspects are highlighted and recommendations are made to improve current practice in translational research. PMID:27657050

  4. Research highlights: increasing paper possibilities.

    PubMed

    Wu, Chueh-Yu; Adeyiga, Oladunni; Lin, Jonathan; Di Carlo, Dino

    2014-09-07

    In this issue we highlight three recent papers that demonstrate new strategies to extend the capabilities of paper microfluidics. Paper (a mesh of porous fibers) has a long history as a substrate for biomolecular assays. Traditional lateral flow immunoassays (LFAs) are widely used for rapid diagnostic tests, and perform well when a yes-or-no answer is required and the analyte of interest is at relatively high concentrations. High concentrations are required because usually only a small volume of analyte-containing fluid flows past the detection region, leading to a limited signal. Further, the small pores within paper matrices prevent the use of paper to control the flow of larger particles and cells, limiting the use of paper microfluidics for cell-based diagnostics. The work we highlight addresses these important unmet challenges in paper microfluidics: enriching low-concentration analytes to a higher concentration in a smaller volume that can be processed effectively, and using paper to pump flows in larger channels amenable to cells. Applying these new approaches may allow diagnosis of disease states that are technically unachievable with current LFA systems, while maintaining many of the "un-instrumented" advantages of an assay on self-wicking paper.

  5. International Technical Working Group Round Robin Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dudder, Gordon B.; Hanlen, Richard C.; Herbillion, Georges M.

    The goal of nuclear forensics is to develop a preferred approach to support illicit trafficking investigations. This approach must be widely understood and accepted as credible. The principal objectives of the Round Robin Tests are to prioritize forensic techniques and methods, evaluate attribution capabilities, and examine the utility of databases. The HEU (Highly Enriched Uranium) Round Robin, and the previous Plutonium Round Robin, have made tremendous contributions to fulfilling these goals through a collaborative learning experience that resulted from the outstanding efforts of the nine participating international laboratories. A prioritized list of techniques and methods has been developed based on this exercise. Current work is focused on the extent to which the techniques and methods can be generalized. The HEU Round Robin demonstrated a rather high level of capability to determine the important characteristics of the materials and processes using analytical methods. When this capability is combined with the appropriate knowledge/database, it results in a significant capability to attribute the source of the materials to a specific process or facility. A number of shortfalls were also identified in the current capabilities, including procedures for non-nuclear forensics and the lack of a comprehensive network of data/knowledge bases. The results of the Round Robin will be used to develop guidelines or a ''recommended protocol'' to be made available to interested authorities and countries for use in real cases.

  6. Analytical electron microscopy in the study of biological systems.

    PubMed

    Johnson, D E

    1986-01-01

    The AEM is a powerful tool in biological research, capable of providing information simply not available by other means. The use of a field-emission STEM for this application can lead to a significant improvement in spatial resolution, which in most cases is now limited by the quality of the specimen preparation but may ultimately be limited by the effects of radiation damage. Increased elemental sensitivity is at least possible in selected cases with electron energy-loss spectrometry (ELS), but fundamental aspects of ELS will probably confine its role to that of a limited complement to EDS. The considerable margin for improvement in sensitivity of the basic analytical technique means that the search for technological improvement will continue. Fortunately, however, current technology can also continue to answer important biological questions.

  7. Development of a bidirectional ring thermal actuator

    NASA Astrophysics Data System (ADS)

    Stevenson, Mathew; Yang, Peng; Lai, Yongjun; Mechefske, Chris

    2007-10-01

    A new planar micro electrothermal actuator capable of bidirectional rotation is presented. The ring thermal actuator has a wheel-like geometry with eight arms connecting an outer ring to a central hub. Thermal expansion of the arms results in a rotation of the outer ring about its center. An analytical model is developed for the electrothermal and thermal-mechanical aspects of the actuator's operation. Finite element analysis is used to validate the analytic study. The actuator has been fabricated using the multi-user MEMS process and experimental displacement results are compared with model predictions. Experiments show a possible displacement of 7.4 µm in each direction. Also, by switching the current between the arms it is possible to achieve an oscillating motion.

  8. Status Report on Ex-Vessel Coolability and Water Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, M. T.; Robb, K. R.

    Specific to BWR plants, current accident management guidance calls for flooding the drywell to a level of approximately 1.2 m (4 feet) above the drywell floor once vessel breach has been determined. While this action can help to submerge ex-vessel core debris, it can also flood the wetwell and thereby render the wetwell vent path unavailable. An alternate strategy is being developed in the industry guidance for responding to the severe accident capable vent Order, EA-13-109. The alternate strategy being proposed would throttle the flooding rate to achieve a stable wetwell water level while preserving the wetwell vent path. The overall objective of this work is to upgrade existing analytical tools (i.e., MELTSPREAD and CORQUENCH, which have been used as part of the DOE-sponsored Fukushima accident analyses) in order to provide flexible, analytically capable, and validated models to support the development of water throttling strategies for BWRs that are aimed at keeping ex-vessel core debris covered with water while preserving the wetwell vent path.

  9. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, Marshall Clint; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flowfields/plumes. The Optical Plume Anomaly Detector (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and analytical packages, designed to utilize information collected by OPAD, is known as the Engine Diagnostic Filtering System (EDiFiS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Capabilities for real-time processing are being advanced on several fronts, including an effort to hardware-encode components of the EDiFiS for health monitoring and management. This paper addresses the OPAD with its tool suites and discusses what is considered a natural progression: a concept for taking OPAD to the next logical level of high-energy physics, incorporating fermion and boson particle analyses in the measurement of neutron flux.

  10. Chemical Detection and Identification Techniques for Exobiology Flight Experiments

    NASA Technical Reports Server (NTRS)

    Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.

    2002-01-01

    Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capabilities with minimal requirements for volume, weight, and consumables. Advances may be achieved by increasing the amount of information acquired by a given technique and by miniaturizing proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of GC detectors that provide sample identification independent of GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).

  11. An Assessment of Operational Energy Capability Improvement Fund (OECIF) Programs 17-S-2544

    DTIC Science & Technology

    2017-09-19

    persistently attack key operational energy problems. OECIF themes are summarized in Table 1, and Appendix A includes more detail on the programs within... problems. FY 2014: Analytical methods and tools; FY 2015: Improving fuel economy for the current tactical ground fleet; FY 2016: Increasing the operational... involve a variety of organizations to solve operational energy problems. In FY 2015, the OECIF program received a one-time $14.1M Congressional plus-up

  12. PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czuchlewski, Kristina Rodriguez; Hart, William E.

    Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data.
The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.

  13. TH-A-19A-06: Site-Specific Comparison of Analytical and Monte Carlo Based Dose Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuemann, J; Grassberger, C; Paganetti, H

    2014-06-15

    Purpose: To investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict dose distributions, and to verify currently used uncertainty margins in proton therapy. Methods: Dose distributions predicted by an analytical pencil-beam algorithm were compared with Monte Carlo simulations (MCS) using TOPAS. 79 complete patient treatment plans were investigated for 7 disease sites (liver, prostate, breast, medulloblastoma spine and whole brain, lung, and head and neck). A total of 508 individual passively scattered treatment fields were analyzed for field-specific properties. Comparisons based on target coverage indices (EUD, D95, D90 and D50) were performed. Range differences were estimated for the distal position of the 90% dose level (R90) and the 50% dose level (R50). Two-dimensional distal dose surfaces were calculated and the root mean square differences (RMSD), average range difference (ARD), and average distal dose degradation (ADD), the distance between the distal position of the 80% and 20% dose levels (R80-R20), were analyzed. Results: We found target coverage indices calculated by TOPAS to be generally around 1–2% lower than predicted by the analytical algorithm. Differences in R90 predicted by TOPAS and the planning system can be larger than currently applied range margins in proton therapy for small regions distal to the target volume. We estimate new site-specific range margins (R90) for analytical dose calculations considering total range uncertainties and uncertainties from dose calculation alone based on the RMSD. Our results demonstrate that a reduction of currently used uncertainty margins is feasible for liver, prostate, and whole brain fields even without introducing MC dose calculations. Conclusion: Analytical dose calculation algorithms predict dose distributions within clinical limits for more homogeneous patient sites (liver, prostate, whole brain).
However, we recommend treatment plan verification using Monte Carlo simulations for patients with complex geometries.
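    The coverage and range metrics named in the record (e.g., D95, RMSD) are simple statistics over voxel doses; a minimal sketch of how they might be computed follows. The array values and function names are illustrative only, not taken from the study.

```python
import numpy as np

def d_index(dose, fraction):
    """Dose received by at least `fraction` of the volume, e.g. D95 = d_index(dose, 0.95)."""
    return np.percentile(dose, 100 * (1 - fraction))

def rmsd(a, b):
    """Root mean square difference between two dose arrays."""
    return np.sqrt(np.mean((a - b) ** 2))

# toy target-voxel doses (Gy) from an analytical algorithm and from Monte Carlo
analytic = np.array([60.1, 60.3, 59.8, 60.0, 59.9, 60.2])
mc       = np.array([59.2, 59.5, 58.9, 59.1, 59.0, 59.4])

d95_gap = d_index(analytic, 0.95) - d_index(mc, 0.95)  # positive: MC predicts lower coverage
dose_rmsd = rmsd(analytic, mc)
```

    In this toy case the Monte Carlo doses sit about 1-2% below the analytical ones, mirroring the direction of the difference the abstract reports.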

  14. Structural Sizing Methodology for the Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) System

    NASA Technical Reports Server (NTRS)

    Jones, Thomas C.; Dorsey, John T.; Doggett, William R.

    2015-01-01

    The Tendon-Actuated Lightweight In-Space MANipulator (TALISMAN) is a versatile long-reach robotic manipulator that is currently being tested at NASA Langley Research Center. TALISMAN is designed to be highly mass-efficient and multi-mission capable, with applications including asteroid retrieval and manipulation, in-space servicing, and astronaut and payload positioning. The manipulator uses a modular, periodic, tension-compression design that lends itself well to analytical modeling. Given the versatility of application for TALISMAN, a structural sizing methodology was developed that could rapidly assess mass and configuration sensitivities for any specified operating work space, applied loads and mission requirements. This methodology allows the systematic sizing of the key structural members of TALISMAN, which include the truss arm links, the spreaders and the tension elements. This paper summarizes the detailed analytical derivations and methodology that support the structural sizing approach and provides results from some recent TALISMAN designs developed for current and proposed mission architectures.

  15. Summary Report for the Evaluation of Current QA Processes Within the FRMAC FAL and EPA MERL.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shanks, Sonoya T.; Redding, Ted; Jaussi, Lynn

    The Federal Radiological Monitoring and Assessment Center (FRMAC) relies on accurate and defensible analytical laboratory data to support its mission. Therefore, FRMAC must ensure that the environmental analytical laboratories providing analytical services maintain an ongoing capability to provide accurate analytical results to DOE. The more Quality Assurance (QA) and Quality Control (QC) measures required of a laboratory, the fewer resources are available for analysis of response samples. Because QA and QC measures constitute a major component of a laboratory’s operations, requirements should be imposed only if they are deemed “value-added” for the FRMAC mission. This report provides observations of areas for improvement and potential interoperability opportunities in the areas of batch quality control requirements, written communications, data review processes, and data reporting processes, along with lessons learned as they apply to items in the early phase of a response that will be critical for developing a more efficient, integrated response for future interactions between FRMAC and EPA assets.

  16. Comparison of ISS Power System Telemetry with Analytically Derived Data for Shadowed Cases

    NASA Technical Reports Server (NTRS)

    Fincannon, H. James

    2002-01-01

    Accurate International Space Station (ISS) power prediction requires the quantification of solar array shadowing. Prior papers have discussed the NASA Glenn Research Center (GRC) ISS power system tool SPACE (System Power Analysis for Capability Evaluation) and its integrated shadowing algorithms. On-orbit telemetry has become available that permits the correlation of theoretical shadowing predictions with actual data. This paper documents the comparison of a shadowing metric (total solar array current) as derived from SPACE predictions and on-orbit flight telemetry data for representative significant shadowing cases. Images from flight video recordings and the SPACE computer program graphical output are used to illustrate the comparison. The accuracy of the SPACE shadowing capability is demonstrated for the cases examined.

  17. Antiferromagnetic nano-oscillator in external magnetic fields

    NASA Astrophysics Data System (ADS)

    Checiński, Jakub; Frankowski, Marek; Stobiecki, Tomasz

    2017-11-01

    We describe the dynamics of an antiferromagnetic nano-oscillator in an external magnetic field of any given time distribution. The oscillator is powered by a spin current originating from spin-orbit effects in a neighboring heavy metal layer and is capable of emitting a THz signal in the presence of an additional easy-plane anisotropy. We derive an analytical formula describing the interaction between such a system and an external field, which can affect the output signal character. Interactions with magnetic pulses of different shapes, with a sinusoidal magnetic field and with a sequence of rapidly changing magnetic fields are discussed. We also perform numerical simulations based on the Landau-Lifshitz-Gilbert equation with spin-transfer torque effects to verify the obtained results and find a very good quantitative agreement between analytical and numerical predictions.
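    The numerical model mentioned above is typically the Landau-Lifshitz-Gilbert equation augmented with a spin-transfer torque term; in one common form (the notation here is generic, not necessarily the paper's):

```latex
\frac{d\mathbf{m}}{dt} = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}}
  + \alpha\,\mathbf{m}\times\frac{d\mathbf{m}}{dt}
  + \tau_{\mathrm{STT}}\,\mathbf{m}\times(\mathbf{m}\times\mathbf{p})
```

    where \(\mathbf{m}\) is the unit magnetization of a sublattice, \(\gamma\) the gyromagnetic ratio, \(\mathbf{H}_{\mathrm{eff}}\) the effective field (exchange, easy-plane anisotropy, and the external field discussed in the abstract), \(\alpha\) the Gilbert damping, and \(\tau_{\mathrm{STT}}\) the spin-torque amplitude set by the spin current with polarization \(\mathbf{p}\).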

  18. Development of computer-based analytical tool for assessing physical protection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mardhi, Alim, E-mail: alim-m@batan.go.id; Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330; Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system’s detection, delay, and response functions. The tool has the capability to analyze the most critical path and to quantify the probability of system effectiveness as a performance measure.

  19. Development of computer-based analytical tool for assessing physical protection system

    NASA Astrophysics Data System (ADS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts targeting a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are available for immediate use; however, for our research purposes it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodological approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool has the capability to analyze the most critical path and to quantify the probability of system effectiveness as a performance measure.
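    A minimal sketch of the network approach described in both records: model the facility as a graph whose edges carry detection probabilities, and find the adversary path with the lowest cumulative detection probability (the most critical path). All node names and probabilities below are hypothetical.

```python
import heapq
from math import exp, log

# toy facility graph: each edge is (next_element, probability of detection there)
edges = {
    "offsite":  [("fence", 0.3), ("gate", 0.6)],
    "fence":    [("building", 0.5)],
    "gate":     [("building", 0.2)],
    "building": [("target", 0.8)],
    "target":   [],
}

def most_critical_path(start, goal):
    """Dijkstra with -log(1 - p) edge weights, so minimizing the sum of weights
    minimizes the cumulative probability that the adversary is detected."""
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return path, 1.0 - exp(-cost)  # cumulative detection probability
        if node in seen:
            continue
        seen.add(node)
        for nxt, p in edges[node]:
            heapq.heappush(pq, (cost - log(1.0 - p), nxt, path + [nxt]))
    return None, None

path, p_detect = most_critical_path("offsite", "target")
```

    Here the fence route is the weakest link: detection probabilities 0.3, 0.5, 0.8 give a cumulative detection probability of 1 − 0.7·0.5·0.2 = 0.93, slightly below the gate route's 0.936. Delay and response times would enter a fuller model as additional edge attributes.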

  20. Analytical Tools for Behavioral Influences Operations

    DTIC Science & Technology

    2003-12-01

    NASIC’s Investment in Analytical Capabilities ... Study Limitations ... get started. This project is envisioned as a foundation for future work by NASIC analysts. They will use the tools identified in this study to... capabilities. Though this study took all three categories into account, most (90%) of the focus of the SRA team’s effort was on identifying and analyzing

  1. Failure of Standard Training Sets in the Analysis of Fast-Scan Cyclic Voltammetry Data.

    PubMed

    Johnson, Justin A; Rodeberg, Nathan T; Wightman, R Mark

    2016-03-16

    The use of principal component regression, a multivariate calibration method, in the analysis of in vivo fast-scan cyclic voltammetry data allows for separation of overlapping signal contributions, permitting evaluation of the temporal dynamics of multiple neurotransmitters simultaneously. To accomplish this, the technique relies on information about current-concentration relationships across the scan-potential window gained from analysis of training sets. The ability of the constructed models to resolve analytes depends critically on the quality of these data. Recently, the use of standard training sets obtained under conditions other than those of the experimental data collection (e.g., with different electrodes, animals, or equipment) has been reported. This study evaluates the analyte resolution capabilities of models constructed using this approach from both a theoretical and experimental viewpoint. A detailed discussion of the theory of principal component regression is provided to inform this discussion. The findings demonstrate that the use of standard training sets leads to misassignment of the current-concentration relationships across the scan-potential window. This directly results in poor analyte resolution and, consequently, inaccurate quantitation, which may lead to erroneous conclusions being drawn from experimental data. Thus, it is strongly advocated that training sets be obtained under the experimental conditions to allow for accurate data analysis.
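    A minimal numpy sketch of the principal component regression step the abstract describes: principal components are taken from the training voltammograms, and known concentrations are regressed on the component scores. The data shapes, signatures, and values below are purely illustrative.

```python
import numpy as np

def train_pcr(currents, concentrations, n_components=2):
    """currents: (n_samples, n_potentials) training voltammograms;
    concentrations: (n_samples, n_analytes) known concentrations."""
    x_mean = currents.mean(axis=0)
    c_mean = concentrations.mean(axis=0)
    # principal components of the centered training currents
    _, _, vt = np.linalg.svd(currents - x_mean, full_matrices=False)
    pcs = vt[:n_components]                   # (k, n_potentials)
    scores = (currents - x_mean) @ pcs.T      # (n_samples, k)
    # least-squares map from scores to centered concentrations
    coef, *_ = np.linalg.lstsq(scores, concentrations - c_mean, rcond=None)
    return x_mean, c_mean, pcs, coef

def predict_pcr(model, voltammogram):
    x_mean, c_mean, pcs, coef = model
    return (voltammogram - x_mean) @ pcs.T @ coef + c_mean

# synthetic training set: two analytes with fixed current signatures
s1 = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
s2 = np.array([0.0, 1.0, 0.0, -1.0, 0.0])
conc = np.array([[1, 0], [0, 1], [1, 1], [2, 1], [0, 0]], dtype=float)
currents = conc @ np.vstack([s1, s2])

model = train_pcr(currents, conc)
estimate = predict_pcr(model, 1.5 * s1 + 0.5 * s2)  # noise-free test mixture
```

    With noise-free synthetic data the regression recovers the mixture exactly; the abstract's point is that with real data the recovered current-concentration map is only as good as the training set it was built from.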

  2. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
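    At its simplest, the "basic set of commonly used operations" over reanalysis collections reduces to ensemble statistics across products. A hedged numpy sketch follows; the array shapes and values are invented for illustration and have no connection to the actual CDS API.

```python
import numpy as np

def ensemble_stats(fields):
    """fields: (n_products, n_months, n_lat, n_lon) monthly-mean values,
    one slab per reanalysis product. Returns the ensemble mean and spread."""
    return fields.mean(axis=0), fields.std(axis=0)

# three hypothetical reanalysis products, 2 months, on a 2x2 grid (kelvin)
rng = np.random.default_rng(0)
fields = 288.0 + rng.normal(0.0, 1.0, size=(3, 2, 2, 2))

ens_mean, ens_spread = ensemble_stats(fields)
```

    A server-side analytics service performs exactly this kind of reduction next to the data, so only the small mean/spread grids, not the full collections, travel to the client.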

  3. Quantum thermal diode based on two interacting spinlike systems under different excitations.

    PubMed

    Ordonez-Miranda, Jose; Ezzahri, Younès; Joulain, Karl

    2017-02-01

    We demonstrate that two interacting spinlike systems, characterized by different excitation frequencies and each coupled to its own thermal bath, can be used as a quantum thermal diode capable of efficiently rectifying the heat current. This is done by deriving analytical expressions for both the heat current and the rectification factor of the diode, based on the solution of a master equation for the density matrix. Higher rectification factors are obtained for lower heat currents, whose magnitude reaches its maximum at an interaction coupling proportional to the temperature of the hotter thermal bath. It is shown that the rectification ability of the diode increases with the difference between the excitation frequencies, which drives the asymmetry of the heat current when the temperatures of the thermal baths are inverted. Furthermore, explicit conditions for the optimization of the rectification factor and heat current are found.
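For readers unfamiliar with the figure of merit, one common convention for the rectification factor is the following (the paper's exact definition may differ). With forward heat current $J_{\mathrm{f}}$ for bath temperatures $(T_h, T_c)$ and reverse current $J_{\mathrm{r}}$ after the bath temperatures are inverted,

```latex
R = \frac{\lvert J_{\mathrm{f}} \rvert - \lvert J_{\mathrm{r}} \rvert}
         {\max\!\left(\lvert J_{\mathrm{f}} \rvert, \lvert J_{\mathrm{r}} \rvert\right)},
```

so that $R = 0$ when the device conducts symmetrically and $\lvert R \rvert \to 1$ in the ideal-diode limit where heat flows in one direction only.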

  4. Application of Interface Technology in Progressive Failure Analysis of Composite Panels

    NASA Technical Reports Server (NTRS)

    Sleight, D. W.; Lotts, C. G.

    2002-01-01

    A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.

  5. Real-time sensor data validation

    NASA Technical Reports Server (NTRS)

    Bickmore, Timothy W.

    1994-01-01

    This report describes the status of an on-going effort to develop software capable of detecting sensor failures on rocket engines in real time. This software could be used in a rocket engine controller to prevent the erroneous shutdown of an engine due to sensor failures which would otherwise be interpreted as engine failures by the control software. The approach taken combines analytical redundancy with Bayesian belief networks to provide a solution which has well defined real-time characteristics and well-defined error rates. Analytical redundancy is a technique in which a sensor's value is predicted by using values from other sensors and known or empirically derived mathematical relations. A set of sensors and a set of relations among them form a network of cross-checks which can be used to periodically validate all of the sensors in the network. Bayesian belief networks provide a method of determining if each of the sensors in the network is valid, given the results of the cross-checks. This approach has been successfully demonstrated on the Technology Test Bed Engine at the NASA Marshall Space Flight Center. Current efforts are focused on extending the system to provide a validation capability for 100 sensors on the Space Shuttle Main Engine.
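The two ingredients described above can be miniaturized in a deliberately simplified sketch with made-up numbers: an analytical-redundancy cross-check that predicts one sensor from others via an assumed empirical relation, and a Bayes update of the probability that the sensor is valid given the check outcomes. The real system uses full Bayesian belief networks over a network of such checks; the relation and all probabilities below are hypothetical.

```python
def cross_check(predicted, measured, tol):
    """Pass if the analytically predicted value matches the measurement."""
    return abs(predicted - measured) <= tol

def bayes_update(p_valid, passed, p_pass_valid=0.99, p_pass_failed=0.10):
    """Posterior probability that the sensor is valid after one check outcome."""
    like_v = p_pass_valid if passed else 1.0 - p_pass_valid
    like_f = p_pass_failed if passed else 1.0 - p_pass_failed
    return like_v * p_valid / (like_v * p_valid + like_f * (1.0 - p_valid))

# Hypothetical relation: pressure drop across an orifice ~ k * flow**2.
k = 2.0
readings = {"P1": 10.0, "P2": 6.5, "Q": 2.0}    # P2 reads high vs. prediction

predicted_P2 = readings["P1"] - k * readings["Q"] ** 2   # predicts 2.0
checks = [cross_check(predicted_P2, readings["P2"], tol=0.5)]
checks.append(False)        # a second (hypothetical) relation also flags P2

p = 0.99                    # prior: sensors rarely fail
for passed in checks:
    p = bayes_update(p, passed)
# After two failed cross-checks the posterior drops far below the prior,
# flagging P2 as suspect without declaring an engine failure.
```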

  6. Concept for Inclusion of Analytical and Computational Capability in Optical Plume Anomaly Detection (OPAD) for Measurement of Neutron Flux

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Cooper, Anita E.; Powers, W. T.

    2004-01-01

    Researchers are working on many fronts to make possible high-speed, automated classification and quantification of constituent materials in numerous environments. NASA's Marshall Space Flight Center has implemented a system for rocket engine flow fields/plumes; the Optical Plume Anomaly Detection (OPAD) system was designed to utilize emission and absorption spectroscopy for monitoring molecular and atomic particulates in gas plasma. An accompanying suite of tools and an analytical package designed to utilize information collected by OPAD are known as the Engine Diagnostic Filtering System (EDIFIS). The current combination of these systems identifies atomic and molecular species and quantifies mass loss rates in H2/O2 rocket plumes. Additionally, efforts are underway to hardware-encode components of the EDIFIS in order to address real-time operational requirements for health monitoring and management. This paper addresses OPAD and its tool suite, and discusses what is considered a natural progression: a concept for migrating OPAD toward detection of high-energy particles, including neutrons and gamma rays. The integration of these tools and capabilities will provide NASA with a systematic approach to monitoring the space vehicle's internal and external environment.

  7. The NASA Reanalysis Ensemble Service - Advanced Capabilities for Integrated Reanalysis Access and Intercomparison

    NASA Astrophysics Data System (ADS)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.

    2017-12-01

    NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) A full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations. The operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) A cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables. This near real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) A WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following: - A new API that supports full temporal, spatial, and grid-based resolution services with sample queries - A Docker-ready RES application to deploy across platforms - Extended capabilities that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, standard deviations, and ensemble averages - Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic subsetting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly) - Full support for the MERRA-2 reanalysis dataset, in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55, and NOAA/ESRL 20CR… - A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management - Supporting analytic services for NASA GMAO Forward Processing datasets - Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization) - The ability to compute and visualize multiple reanalyses for ease of intercomparison - Automated tools to retrieve and prepare data collections for analytic processing
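The arithmetic operations named in the list (ensemble average, anomaly, area average) reduce to simple array reductions over gridded monthly means. The sketch below uses synthetic data and plain NumPy rather than the CDSlib API, whose call signatures are not reproduced here; the grid sizes and values are made up.

```python
import numpy as np

# Synthetic stand-in for gridded monthly means from three reanalyses:
# dims are (reanalysis member, month, lat, lon).
rng = np.random.default_rng(1)
data = 288.0 + rng.normal(0.0, 1.5, size=(3, 12, 18, 36))

ens_mean = data.mean(axis=0)            # ensemble average across reanalyses
climatology = ens_mean.mean(axis=0)     # annual-mean field per grid cell
anomaly = ens_mean - climatology        # monthly departure from climatology

# Area average with cosine-latitude weights (regular lat/lon grid assumed).
lats = np.linspace(-85.0, 85.0, 18)
w = np.cos(np.deg2rad(lats))
area_avg = (ens_mean.mean(axis=2) * w).sum(axis=1) / w.sum()   # one value/month
```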

  8. ELECTRONICS UPGRADE TO THE SAVANNAH RIVER NATIONAL LABORATORY COULOMETER FOR PLUTONIUM AND NEPTUNIUM ASSAY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cordaro, J.; Holland, M.; Reeves, G.

    The Savannah River Site (SRS) has the analytical measurement capability to perform high-precision plutonium concentration measurements by controlled-potential coulometry. State-of-the-art controlled-potential coulometers were designed and fabricated by the Savannah River National Laboratory (SRNL) and installed in the Analytical Laboratories process control laboratory. The Analytical Laboratories uses coulometry for routine accountability measurements of plutonium and for verification of the standard preparations used to calibrate other plutonium measurement systems routinely applied to process control, nuclear safety, and other accountability applications. The SRNL coulometer has a demonstrated measurement reliability of approximately 0.05% for 10 mg samples. The system has also been applied to the characterization of neptunium standard solutions with comparable reliability. The SRNL coulometer features a patented current integration system; continuous electrical calibration versus Faraday's constant and Ohm's law; a control-potential adjustment technique for enhanced application of the Nernst equation; a wide operating room-temperature range; and fully automated instrument control and data acquisition. Systems have been supplied to the International Atomic Energy Agency (IAEA), Russia, the Japan Atomic Energy Agency (JAEA), and the New Brunswick Laboratory (NBL). The most recent vintage of electronics was based on early-1990s integrated circuits, and many of the components are no longer available. At the request of the IAEA and the Department of State, SRNL has completed an electronics upgrade of its controlled-potential coulometer design. Three systems have been built with the new design: one for the IAEA, which was installed at SAL in May 2011; one for Los Alamos National Laboratory (LANL); and one for the SRS Analytical Laboratory. The LANL and SRS systems are undergoing startup testing, with installation scheduled for this summer.

  9. System performance predictions for Space Station Freedom's electric power system

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Hojnicki, Jeffrey S.; Green, Robert D.; Follo, Jeffrey C.

    1993-01-01

    Space Station Freedom Electric Power System (EPS) capability to effectively deliver power to housekeeping and user loads continues to strongly influence Freedom's design and planned approaches for assembly and operations. The EPS design consists of silicon photovoltaic (PV) arrays, nickel-hydrogen batteries, and direct current power management and distribution hardware and cabling. To properly characterize the inherent EPS design capability, detailed system performance analyses must be performed for early stages as well as for the fully assembled station up to 15 years after beginning of life. Such analyses were repeatedly performed using the FORTRAN code SPACE (Station Power Analysis for Capability Evaluation) developed at the NASA Lewis Research Center over a 10-year period. SPACE combines orbital mechanics routines, station orientation/pointing routines, PV array and battery performance models, and a distribution system load-flow analysis to predict EPS performance. Time-dependent, performance degradation, low earth orbit environmental interactions, and EPS architecture build-up are incorporated in SPACE. Results from two typical SPACE analytical cases are presented: (1) an electric load driven case and (2) a maximum EPS capability case.
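The kind of energy-balance arithmetic at the heart of such a capability analysis can be illustrated with toy numbers (these are NOT Space Station Freedom values, and this is not the SPACE code): in sunlight the arrays must carry the loads and also recharge the battery energy drawn during eclipse.

```python
import math

# Back-of-envelope orbit-average power balance for a PV/battery system.
mu = 3.986e14                  # Earth's gravitational parameter, m^3/s^2
r = 6.778e6                    # ~400 km altitude circular orbit radius, m
period = 2 * math.pi * math.sqrt(r**3 / mu)   # orbital period, s (~92 min)

f_eclipse = 0.36               # assumed eclipse fraction of the orbit
p_array = 30e3                 # assumed array power in sunlight, W
eta_rt = 0.80                  # assumed battery round-trip efficiency

# Energy balance over one orbit:
#   p_array*(1 - f) = p_load*(1 - f) + p_load*f/eta_rt
# solved for the sustainable continuous load power:
p_load = p_array * (1 - f_eclipse) / ((1 - f_eclipse) + f_eclipse / eta_rt)
print(f"sustainable load: {p_load/1e3:.1f} kW over a {period/60:.0f}-minute orbit")
```

The full analysis layers array degradation, pointing, thermal effects, and distribution losses on top of this balance, which is why a dedicated code is required.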

  10. System and Method for Providing a Climate Data Analytic Services Application Programming Interface Distribution Package

    NASA Technical Reports Server (NTRS)

    Tamkin, Glenn S. (Inventor); Duffy, Daniel Q. (Inventor); Schnase, John L. (Inventor)

    2016-01-01

    A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.

  11. Visual Information for the Desktop, version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2006-03-29

    VZIN integrates visual analytics capabilities into popular desktop tools to aid a user in searching and understanding an information space. VZIN allows users to Drag-Drop-Visualize-Explore-Organize information within tools such as Microsoft Office, Windows Explorer, Excel, and Outlook. VZIN is tailorable to specific client or industry requirements. VZIN follows the desktop metaphors so that advanced analytical capabilities are available with minimal user training.

  12. Effect of ionization suppression by trace impurities in mobile phase water on the accuracy of quantification by high-performance liquid chromatography/mass spectrometry.

    PubMed

    Herath, H M D R; Shaw, P N; Cabot, P; Hewavitharana, A K

    2010-06-15

    The high-performance liquid chromatography (HPLC) column is capable of enriching/pre-concentrating trace impurities in the mobile phase during column equilibration, prior to sample injection and elution. These impurities elute during gradient elution and result in significant chromatographic peaks. Three types of purified water were tested for their impurity levels, and hence their performance as mobile phase, in HPLC followed by total ion current (TIC) mode of MS. Two types of HPLC-grade water produced 3-4 significant peaks in solvent blanks, while LC/MS-grade water produced no peaks (although the LC/MS-grade water also produced peaks after standing for a few days). None of the three waters produced peaks in HPLC followed by UV-Vis detection. These peaks, if co-eluted with the analyte, are capable of suppressing or enhancing the analyte signal in an MS detector. As it is not common practice to run solvent blanks in TIC mode, when quantification is commonly carried out using single ion monitoring (SIM) or single or multiple reaction monitoring (SRM or MRM), the effect of co-eluting impurities on the analyte signal, and hence on the accuracy of the results, is often unknown to the analyst. Running solvent blanks in TIC mode, regardless of the MS mode used for quantification, is essential in order to detect this problem and to take subsequent precautions. Copyright (c) 2010 John Wiley & Sons, Ltd.

  13. Ultra-sensitive chemical and biological analysis via specialty fibers with built-in microstructured optofluidic channels.

    PubMed

    Zhang, Nan; Li, Kaiwei; Cui, Ying; Wu, Zhifang; Shum, Perry Ping; Auguste, Jean-Louis; Dinh, Xuan Quyen; Humbert, Georges; Wei, Lei

    2018-02-13

    All-in-fiber optofluidics is an analytical tool that provides enhanced sensing performance with a simplified analysis-system design. Currently, its advance is limited either by complicated liquid manipulation and light injection configurations or by low sensitivity resulting from inadequate light-matter interaction. In this work, we design and fabricate a side-channel photonic crystal fiber (SC-PCF) and exploit its versatile sensing capabilities in in-line optofluidic configurations. The built-in microfluidic channel of the SC-PCF enables strong light-matter interaction and easy lateral access for liquid samples in these analytical systems. In addition, the sensing performance of the SC-PCF is demonstrated with methylene blue for absorptive molecular detection and with human cardiac troponin T protein, utilizing a Sagnac interferometry configuration, for ultra-sensitive and specific biomolecular detection. Owing to its great flexibility and compactness, high sensitivity to analyte variation, and efficient liquid manipulation/replacement, the demonstrated SC-PCF offers a generic solution that can be adapted to various fiber-waveguide sensors to detect a wide range of analytes in real time, for applications ranging from environmental monitoring to biological diagnosis.

  14. One-step patterning of hollow microstructures in paper by laser cutting to create microfluidic analytical devices.

    PubMed

    Nie, Jinfang; Liang, Yuanzhi; Zhang, Yun; Le, Shangwang; Li, Dunnan; Zhang, Songbai

    2013-01-21

    In this paper, we report a simple, low-cost method for rapid, highly reproducible fabrication of paper-based microfluidics using a commercially available, compact CO2 laser cutting/engraving machine. This method involves only one operation: cutting a piece of paper by laser according to a predesigned pattern. The hollow microstructures formed in the paper serve as the 'hydrophobic barriers' that define the hydrophilic flow paths. A typical device on a 4 cm × 4 cm piece of paper can be fabricated within ∼7-20 s and is ready for use once the cutting process is finished. The main fabrication parameters, such as the applied current and cutting rate of the laser, were optimized. The fabrication resolution and multiplexed analytical capability of the hollow microstructure-patterned paper were also characterized.

  15. The Mars Organic Molecule Analyzer (MOMA) Instrument: Characterization of Organic Material in Martian Sediments

    PubMed Central

    Goesmann, Fred; Brinckerhoff, William B.; Raulin, François; Danell, Ryan M.; Getty, Stephanie A.; Siljeström, Sandra; Mißbach, Helge; Steininger, Harald; Arevalo, Ricardo D.; Buch, Arnaud; Freissinet, Caroline; Grubisic, Andrej; Meierhenrich, Uwe J.; Pinnick, Veronica T.; Stalport, Fabien; Szopa, Cyril; Vago, Jorge L.; Lindner, Robert; Schulte, Mitchell D.; Brucato, John Robert; Glavin, Daniel P.; Grand, Noel; Li, Xiang; van Amerom, Friso H. W.

    2017-01-01

    The Mars Organic Molecule Analyzer (MOMA) instrument onboard the ESA/Roscosmos ExoMars rover (to launch in July 2020) will analyze volatile and refractory organic compounds in martian surface and subsurface sediments. In this study, we describe the design, current status of development, and analytical capabilities of the instrument. Data acquired on preliminary MOMA flight-like hardware and experimental setups are also presented, illustrating their contribution to the overall science return of the mission. Key Words: Mars—Mass spectrometry—Life detection—Planetary instrumentation. Astrobiology 17, 655–685.

  16. Space station environmental control and life support systems conceptual studies

    NASA Technical Reports Server (NTRS)

    Humphries, W. R.; Powell, L. E.

    1985-01-01

    It is pointed out that the establishment of a permanent manned Space Station requires the development of a comprehensive approach which combines new technologies and existing spacecraft subsystem capabilities into an optimum design. The present paper is concerned with studies which were conducted in connection with the development of the regenerative Environmental Control and Life Support Systems (ECLSS) for the Space Station. Attention is given to the current state of the ECLSS subsystems and system level analytical selection and group studies related to the integrated system conceptual design.

  17. Recent activities within the Aeroservoelasticity Branch at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Noll, Thomas E.; Perry, Boyd, III; Gilbert, Michael G.

    1989-01-01

    The objective of research in aeroservoelasticity at the NASA Langley Research Center is to enhance the modeling, analysis, and multidisciplinary design methodologies for obtaining multifunction digital control systems for application to flexible flight vehicles. Recent accomplishments are discussed, and a status report on current activities within the Aeroservoelasticity Branch is presented. In the area of modeling, improvements to the Minimum-State Method of approximating unsteady aerodynamics are shown to provide precise, low-order aeroservoelastic models for design and simulation activities. Analytical methods based on Matched Filter Theory and Random Process Theory to provide efficient and direct predictions of the critical gust profile and the time-correlated gust loads for linear structural design considerations are also discussed. Two research projects leading towards improved design methodology are summarized. The first program is developing an integrated structure/control design capability based on hierarchical problem decomposition, multilevel optimization and analytical sensitivities. The second program provides procedures for obtaining low-order, robust digital control laws for aeroelastic applications. In terms of methodology validation and application the current activities associated with the Active Flexible Wing project are reviewed.

  18. Nonlinear Dynamic Analysis of Disordered Bladed-Disk Assemblies

    NASA Technical Reports Server (NTRS)

    McGee, Oliver G., III

    1997-01-01

    In an effort to address current needs for efficient air propulsion systems, we have developed new analytical predictive tools for understanding and alleviating aircraft engine instabilities, which have led to accelerated high-cycle fatigue and catastrophic in-flight failures of these machines. A frequent cause of failure in jet engines is excessive resonant vibration and stall flutter instability. The likelihood of these phenomena is reduced when designers employ the analytical models we have developed. These prediction models will ultimately increase the nation's competitiveness in producing high-performance jet engines with enhanced operability, energy economy, and safety. The objectives of our current research in the final year are directed along two lines. First, we want to improve the current state of blade stress and aeromechanical reduced-order modeling of high-bypass engine fans. Specifically, a new reduced-order iterative redesign tool for passively controlling the mechanical authority of shroudless, wide-chord, laminated composite transonic bypass engine fans has been developed. Second, we aim to advance the current understanding of aeromechanical feedback control of dynamic flow instabilities in axial-flow compressors. A systematic theoretical evaluation of several approaches to aeromechanical feedback control of rotating stall in axial compressors has been conducted. Attached are abstracts of two papers under preparation for the 1998 ASME Turbo Expo in Stockholm, Sweden, sponsored under Grant No. NAG3-1571. Our goal during the final year under Grant No. NAG3-1571 is to enhance NASA's forced-response prediction capabilities for turbomachines (such as NASA FREPS). We will continue our development of reduced-order, three-dimensional component synthesis models for aeromechanical evaluation of integrated bladed-disk assemblies (i.e., the disk, non-identical blading, etc.). We will complete our development of component system design optimization strategies for specified vibratory stresses and increased fatigue-life prediction of assembly components, and for specified frequency margins on the Campbell diagrams of turbomachines. Finally, we will integrate the developed codes with NASA's turbomachinery aeromechanics prediction capability (such as NASA FREPS).

  19. Some thoughts on problems associated with various sampling media used for environmental monitoring

    USGS Publications Warehouse

    Horowitz, A.J.

    1997-01-01

    Modern analytical instrumentation is capable of measuring a variety of trace elements at concentrations down to the single- or double-digit parts-per-trillion (ng l-1) range. This holds for the three most common sample media currently used in environmental monitoring programs: filtered water, whole water, and separated suspended sediment. Unfortunately, current analytical capabilities have exceeded the current capacity to collect both uncontaminated and representative environmental samples. The success of any trace element monitoring program requires that this issue be both understood and addressed. The environmental monitoring of trace elements requires the collection of calendar- and event-based dissolved and suspended sediment samples. There are unique problems associated with the collection and chemical analysis of both types of sample media. Over the past 10 years, reported ambient dissolved trace element concentrations have declined. Generally, these decreases do not reflect better water quality, but rather improvements in the procedures used to collect, process, preserve, and analyze these samples without contaminating them along the way. Further, recent studies have shown that the currently accepted operational definition of dissolved constituents (material passing a 0.45 µm membrane filter) is inadequate owing to sampling and processing artifacts. The existence of these artifacts raises questions about the generation of accurate, precise, and comparable 'dissolved' trace element data. Suspended sediment and associated trace elements can display marked short- and long-term spatial and temporal variability. This implies that spatially representative samples can only be obtained by generating composites using depth- and width-integrated sampling techniques. Additionally, temporal variations have led to the view that the determination of annual trace element fluxes may require nearly constant (e.g., high-frequency) sampling and subsequent chemical analyses. Ultimately, the sampling frequency for flux estimates depends on the time period of concern (daily, weekly, monthly, yearly) and the amount of error deemed acceptable in these estimates.
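The sampling-frequency point can be illustrated numerically with a hypothetical year of daily discharge containing one short storm that carries most of the trace-element flux (all numbers below are invented for illustration): sparse fixed-interval sampling misses the event and badly underestimates the annual flux.

```python
import numpy as np

days = np.arange(365.0)
# Baseflow plus a ~2-week storm centered on day 165 (hypothetical record).
Q = 20.0 + 300.0 * np.exp(-((days - 165.0) / 5.0) ** 2)   # discharge, m^3/s
C = 5.0 + 0.5 * Q                       # sediment-bound conc. rises with flow

true_flux = np.sum(C * Q * 86400.0)     # daily-resolution flux integral

idx = np.arange(0, 365, 30)             # ~monthly grab samples, storm missed
est_flux = np.mean(C[idx] * Q[idx]) * 365.0 * 86400.0

rel_err = abs(est_flux - true_flux) / true_flux
print(f"relative error from monthly sampling: {rel_err:.0%}")
```

Shifting the storm under a sampling date flips the bias the other way, which is why flux estimates demand event-based or high-frequency sampling rather than fixed calendars alone.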

  20. Computer vision-based technologies and commercial best practices for the advancement of the motion imagery tradecraft

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Capel, David; Srinivasan, James

    2014-06-01

    Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing, and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing, Exploitation and Dissemination (PED) cells. Yet now the operational requirements are drastically changing. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look toward a technology application and commercial adoption model that will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting-edge computer vision technology to meet existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services developed from computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources, providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment, employing an extensible framework, leveraging scalable enterprise-wide infrastructure, and following commercial best practices.

  1. Limiting current of intense electron beams in a decelerating gap

    NASA Astrophysics Data System (ADS)

    Nusinovich, G. S.; Beaudoin, B. L.; Thompson, C.; Karakkad, J. A.; Antonsen, T. M.

    2016-02-01

    For numerous applications, it is desirable to develop electron-beam-driven, efficient sources of electromagnetic radiation that are capable of producing the required power at beam voltages as low as possible. This trend is limited by space-charge effects, which reduce the electron kinetic energy and can lead to electron reflection. So far, this effect has been analyzed for intense beams propagating in uniform metallic pipes. In the present study, the limiting currents of intense electron beams are analyzed for the case of beam propagation in tubes with gaps. The general treatment is illustrated by an example evaluating the limiting current in a high-power, tunable 1-10 MHz inductive output tube (IOT) currently under development for ionospheric modification. Results of the analytical theory are compared with results of numerical simulations, and the results obtained allow one to estimate the interaction efficiency of IOTs.
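For context, the uniform-pipe baseline that the gap analysis generalizes is the well-known space-charge limiting current for a thin solid beam of radius $a$ centered in a grounded pipe of radius $b$ (symbols are the conventional ones; the paper's notation may differ):

```latex
I_{\mathrm{lim}} \simeq \frac{4\pi\varepsilon_0 m c^{3}}{e}\,
\frac{\left(\gamma_0^{2/3}-1\right)^{3/2}}{1+2\ln(b/a)}
\approx 17\,\mathrm{kA}\;\frac{\left(\gamma_0^{2/3}-1\right)^{3/2}}{1+2\ln(b/a)},
```

where $\gamma_0$ is the injection Lorentz factor. A decelerating gap lowers the electron kinetic energy locally and therefore reduces the limiting current below this value, which is the effect the paper quantifies.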

  2. A Process for Assessing NASA's Capability in Aircraft Noise Prediction Technology

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.

    2008-01-01

    An acoustic assessment is being conducted by NASA that has been designed to assess the current state of the art in NASA's capability to predict aircraft-related noise and to establish baselines for gauging future progress in the field. The process for determining NASA's current capabilities includes quantifying the differences between noise predictions and measurements of noise from experimental tests. The computed noise predictions are obtained from semi-empirical, analytical, statistical, and numerical codes. In addition, errors and uncertainties are being identified and quantified, both in the predictions and in the measured data, to further enhance the credibility of the assessment. Because the assessment project has not been fully completed, this paper presents preliminary results, based on the contributions of many researchers, and shows a select sample of the types of results obtained regarding the prediction of aircraft noise at both the system and component levels. The system-level results are for engines and aircraft. The component-level results are for fan broadband noise, for jet noise from a variety of nozzles, and for airframe noise from flaps and landing gear parts. There are also sample results for sound attenuation in lined ducts with flow and for the behavior of acoustic lining in ducts.

  3. A generic interface between COSMIC/NASTRAN and PATRAN (R)

    NASA Technical Reports Server (NTRS)

    Roschke, Paul N.; Premthamkorn, Prakit; Maxwell, James C.

    1990-01-01

    Despite its powerful analytical capabilities, COSMIC/NASTRAN lacks adequate post-processing adroitness. PATRAN, on the other hand, is widely accepted for its graphical capabilities. A nonproprietary, public-domain code mnemonically titled CPI (for COSMIC/NASTRAN-PATRAN Interface) is designed to move a large number of files rapidly and efficiently between the two parent codes. In addition to preparing PATRAN results files, CPI also prepares PATRAN P/PLOT data files for xy plotting. The user is prompted for the necessary information during an interactive session. The current implementation supports NASTRAN's displacement approach, including the following rigid formats: (1) static analysis, (2) normal modal analysis, (3) direct transient response, and (4) modal transient response. A wide variety of data blocks are also supported. Error trapping is given special consideration. A sample session with CPI illustrates its simplicity and ease of use.

  4. Ballistic Puncture Self-Healing Polymeric Materials

    NASA Technical Reports Server (NTRS)

    Gordon, Keith L.; Siochi, Emilie J.; Yost, William T.; Bogert, Phil B.; Howell, Patricia A.; Cramer, K. Elliott; Burke, Eric R.

    2017-01-01

    Space exploration launch costs on the order of $10,000 per pound provide an incentive to seek ways to reduce structural mass while maintaining structural function to assure safety and reliability. Damage-tolerant structural systems provide a route to avoiding weight penalty while enhancing vehicle safety and reliability. Self-healing polymers capable of spontaneous puncture repair show promise to mitigate potentially catastrophic damage from events such as micrometeoroid penetration. Effective self-repair requires these materials to quickly heal following projectile penetration while retaining some structural function during the healing processes. Although there are materials known to possess this capability, they are typically not considered for structural applications. Current efforts use inexpensive experimental methods to inflict damage, after which analytical procedures are identified to verify that function is restored. Two candidate self-healing polymer materials for structural engineering systems are used to test these experimental methods.

  5. Resolved-particle simulation by the Physalis method: Enhancements and new capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierakowski, Adam J., E-mail: sierakowski@jhu.edu; Prosperetti, Andrea; Faculty of Science and Technology and J.M. Burgers Centre for Fluid Dynamics, University of Twente, P.O. Box 217, 7500 AE Enschede

    2016-03-15

    We present enhancements and new capabilities of the Physalis method for simulating disperse multiphase flows using particle-resolved simulation. The current work enhances the previous method by incorporating a new type of pressure-Poisson solver that couples with a new Physalis particle pressure boundary condition scheme and a new particle interior treatment to significantly improve overall numerical efficiency. Further, we implement a more efficient method of calculating the Physalis scalar products and incorporate short-range particle interaction models. We provide validation and benchmarking for the Physalis method against experiments of a sedimenting particle and of normal wall collisions. We conclude with an illustrative simulation of 2048 particles sedimenting in a duct. In the appendix, we present a complete and self-consistent description of the analytical development and numerical methods.

  6. Techniques for sensing methanol concentration in aqueous environments

    NASA Technical Reports Server (NTRS)

    Narayanan, Sekharipuram R. (Inventor); Chun, William (Inventor); Valdez, Thomas I. (Inventor)

    2001-01-01

    An analyte concentration sensor that is capable of fast and reliable sensing of analyte concentration in aqueous environments with high concentrations of the analyte. Preferably, the present invention is a methanol concentration sensor device coupled to a fuel metering control system for use in a liquid direct-feed fuel cell.

  7. Advances in spectroscopic methods for quantifying soil carbon

    USGS Publications Warehouse

    Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco; Hively, W. Dean

    2012-01-01

    The current gold standard for soil carbon (C) determination is elemental C analysis using dry combustion. However, this method requires expensive consumables, is limited by the number of samples that can be processed (~100/d), and is restricted to the determination of total carbon. With increased interest in soil C sequestration, faster methods of analysis are needed, and there is growing interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared, or mid-infrared spectral ranges. These spectral methods can decrease analytical requirements and speed sample processing, can be applied to large landscape areas using remote sensing imagery, and can be used to predict multiple analytes simultaneously. However, the methods require localized calibrations to establish the relationship between spectral data and reference analytical data, and they also have additional, specific problems. For example, remote sensing is capable of scanning entire watersheds for soil carbon content but is limited to the surface layer of tilled soils and may require difficult and extensive field sampling to obtain proper localized calibration reference values. The objective of this chapter is to discuss the present state of spectroscopic methods for the determination of soil carbon.
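    The calibration step described above, relating spectral data to reference analytical values, is at its core a multivariate regression. A minimal sketch with synthetic data (the band shapes, noise level, and carbon range are invented for illustration; real calibrations would use PLS regression with spectral preprocessing):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": 40 calibration samples x 200 wavelength bands, where soil C
# (the reference value from dry combustion) drives one broad absorption band.
n_samples, n_bands = 40, 200
true_c = rng.uniform(0.5, 5.0, n_samples)  # % carbon, hypothetical range
band = np.exp(-((np.arange(n_bands) - 120) / 15.0) ** 2)
spectra = np.outer(true_c, band) + 0.01 * rng.standard_normal((n_samples, n_bands))

# Localized calibration as a plain least-squares fit of reference values on spectra.
X = np.hstack([spectra, np.ones((n_samples, 1))])  # add intercept column
coef, *_ = np.linalg.lstsq(X, true_c, rcond=None)

predicted = X @ coef
rmse = float(np.sqrt(np.mean((predicted - true_c) ** 2)))
print(f"calibration RMSE: {rmse:.3f} % C")
```

With fewer samples than bands the least-squares fit is exact on the calibration set; in practice PLS or ridge regularization and independent validation samples are what make such calibrations trustworthy, which is the "localized calibration" caveat in the abstract.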

  8. Illustration and analysis of a coordinated approach to an effective forensic trace evidence capability.

    PubMed

    Stoney, David A; Stoney, Paul L

    2015-08-01

    An effective trace evidence capability is defined as one that exploits all useful particle types, chooses appropriate technologies to do so, and directly integrates the findings with case-specific problems. Limitations of current approaches inhibit the attainment of an effective capability and it has been strongly argued that a new approach to trace evidence analysis is essential. A hypothetical case example is presented to illustrate and analyze how forensic particle analysis can be used as a powerful practical tool in forensic investigations. The specifics in this example, including the casework investigation, laboratory analyses, and close professional interactions, provide focal points for subsequent analysis of how this outcome can be achieved. This leads to the specification of five key elements that are deemed necessary and sufficient for effective forensic particle analysis: (1) a dynamic forensic analytical approach, (2) concise and efficient protocols addressing particle combinations, (3) multidisciplinary capabilities of analysis and interpretation, (4) readily accessible external specialist resources, and (5) information integration and communication. A coordinating role, absent in current approaches to trace evidence analysis, is essential to achieving these elements. However, the level of expertise required for the coordinating role is readily attainable. Some additional laboratory protocols are also essential. However, none of these has greater staffing requirements than those routinely met by existing forensic trace evidence practitioners. The major challenges that remain are organizational acceptance, planning and implementation. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Development of analytically capable time-of-flight mass spectrometer with continuous ion introduction

    NASA Astrophysics Data System (ADS)

    Hárs, György; Dobos, Gábor

    2010-03-01

    The present article describes the results and findings explored in the course of developing an analytically capable prototype of a continuous time-of-flight (CTOF) mass spectrometer. Currently marketed pulsed TOF (PTOF) instruments use ion introduction with a pulse width of roughly 10 ns, followed by a waiting period of roughly 100 μs. Accordingly, the sample is under excitation for only about 10^-4 of the total measuring time. This very low duty cycle severely limits the sensitivity of the PTOF method. A possible approach to this problem is the linear sinusoidal dual-modulation technique (CTOF) described in this article. In this way the sensitivity of the method is increased, owing to the 50% duty cycle of the excitation. All other types of TOF spectrometer use a secondary electron multiplier (SEM) for detection, which unfortunately discriminates in amplification in favor of the lighter ions. This discrimination effect is especially undesirable in a mass spectrometric method that targets the high mass range. In the CTOF method, the SEM is replaced with a Faraday cup detector, thus eliminating the mass discrimination effect. Omitting the SEM is made possible by the high ion intensity and the very slow ion detection, with a detection bandwidth of a few hundred hertz. The electrometer electronics of the Faraday cup detector operate with an amplification of 10^10 V/A. The primary ion beam is highly monoenergetic due to the construction of the ion gun, which makes it possible to omit any electrostatic mirror configuration for bunching the ions. The measurement is controlled by a personal computer and an intelligent signal generator (Tabor WW 2571), which uses direct digital synthesis to produce arbitrary waveforms. The data are collected by a LabJack interface board, and the fast Fourier transformation is performed in software. A noble gas mixture was used to test the analytical capabilities of the prototype setup. The measurements presented confirm the results of the mathematical calculations, as well as the future potential of the method for the chemical analysis of gaseous mixtures.
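    The sensitivity argument above is a duty-cycle comparison; the arithmetic can be made explicit (the pulse width and repetition period are the figures quoted in the abstract):

```python
# Duty-cycle comparison behind the CTOF sensitivity argument.
pulse_width = 10e-9         # s, PTOF ion-introduction pulse
repetition_period = 100e-6  # s, waiting period between PTOF pulses

ptof_duty = pulse_width / repetition_period  # fraction of time the sample is excited
ctof_duty = 0.5                              # sinusoidal dual modulation

print(f"PTOF duty cycle: {ptof_duty:.0e}")                          # 1e-04
print(f"CTOF/PTOF excitation ratio: {ctof_duty / ptof_duty:.0f}")   # 5000
```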

  10. Meeting report: Ocean ‘omics science, technology and cyberinfrastructure: current challenges and future requirements (August 20-23, 2013)

    PubMed Central

    Gilbert, Jack A; Dick, Gregory J.; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R. M.

    2014-01-01

    The National Science Foundation’s EarthCube End User Workshop was held at the USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community focusing on microbial and physical oceanography research, with a particular emphasis on ‘omic research. The assembled researchers outlined existing concerns regarding the vast data resources being generated and how we will deal with these resources as their volume and diversity increase. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, and on the development of shared, interoperable, “big-data capable” analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyberinfrastructure constraints, (ii) the current and future ocean ‘omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyberinfrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting, and the outcome of this report, is a definition of the ‘omics tools, technologies, and infrastructures that facilitate continued advances in ocean biology, marine biogeochemistry, and biological oceanography. PMID:25197495

  11. Utilisation of the magnetic sensor in a smartphone for facile magnetostatics experiment: magnetic field due to electrical current in straight and loop wires

    NASA Astrophysics Data System (ADS)

    Septianto, R. D.; Suhendra, D.; Iskandar, F.

    2017-01-01

    This paper reports the results of research into the use of a smartphone for the experimental study of magnetostatics. The device gives good measurement results and can therefore replace dedicated magnetic sensors, which are relatively expensive. For the best experimental results, the position of the magnetic sensor within the smartphone must first be determined by mapping the field of a permanent magnet. The magnetostatics experiment investigated in this research was the measurement of the magnetic field due to electrical currents in two wire geometries, straight and looped. The current, the distance between the observation point and the wire, and the diameter of the loop were the parameters varied to test the smartphone's capabilities as a measurement tool. To evaluate the experimental results, the measured data were compared with theoretical values calculated using both an analytical and a numerical approach. The measured data agreed well with both approaches, which means that the magnetic sensor in a smartphone is viable for physics experiments, especially magnetic field measurement.
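    The analytical reference values for the two wire geometries follow from the Biot-Savart law. A minimal sketch (the specific currents and distances are illustrative, not taken from the paper):

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (T*m/A)

def b_straight_wire(current, distance):
    """Field magnitude (T) at `distance` (m) from a long straight wire."""
    return MU0 * current / (2 * math.pi * distance)

def b_loop_axis(current, radius, z):
    """Field magnitude (T) on the axis of a circular loop, `z` m from its center."""
    return MU0 * current * radius**2 / (2 * (radius**2 + z**2) ** 1.5)

# Example: 1 A in a wire, observed 2 cm away.
print(b_straight_wire(1.0, 0.02))   # 1e-5 T = 10 uT, same order as Earth's field
print(b_loop_axis(1.0, 0.05, 0.0))  # field at the center of a 5 cm radius loop
```

Fields of this magnitude (tens of microtesla) sit comfortably within the range of a typical smartphone magnetometer, which is what makes the experiment practical.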

  12. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    PubMed

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  13. Analytical applications of microbial fuel cells. Part I: Biochemical oxygen demand.

    PubMed

    Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo

    2015-01-15

    Microbial fuel cells (MFCs) are bio-electrochemical devices in which usually the anode (but sometimes the cathode, or both) contains microorganisms able to generate and sustain an electrochemical gradient, which is typically used to generate electrical power. In the most studied set-up, the anode contains heterotrophic bacteria in anaerobic conditions, capable of oxidizing organic molecules and releasing protons and electrons, as well as other by-products. The released protons reach the cathode (through a membrane or not) whereas the electrons travel across an external circuit, giving rise to an easily measurable direct current. MFCs have been proposed fundamentally as electric-power-producing devices or, more recently, as hydrogen-producing devices. Here we review the still incipient development of analytical uses of MFCs and related devices or set-ups, in the light of a non-restrictive MFC definition, as promising tools to assess water quality or other measurable parameters. An introduction to biologically based analytical methods, including bioassays and biosensors, as well as MFC design and operating principles, is also included. In addition, the use of MFCs as biochemical oxygen demand sensors (perhaps the main analytical application of MFCs) is discussed. In a companion review (Part 2), other new analytical applications are reviewed, including toxicity sensors, metabolic sensors, life detectors, and other proposed applications. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Huang, Zhenyu; Zhou, Ning

    With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability is being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.

  15. Some studies related to a new Hexagonal Compound Parabolic Concentrator (HCPC) as a secondary in tandem with a solar tower

    NASA Astrophysics Data System (ADS)

    Suresh, Deivarajan

    Secondary concentrators operate in the focal plane of a point-focusing system such as a paraboloidal dish or a tower and, when properly designed, are capable of enhancing the overall concentration ratio of the optical system by a factor of at least two to five. The viability of using different shapes has been demonstrated both analytically and experimentally in recent years, including Compound Parabolic Concentrators (CPCs) of circular cross section and 'trumpets' as secondaries. The current research effort is centered on a Hexagonal CPC (HCPC). Major areas addressed include an overview of the state of development of secondary concentrators, background information related to the design of the HCPC, the results of an analytical study of the thermal behavior of this HCPC under concentrated-flux conditions, and computer modeling for assessing possible thermal interactions between the secondary and a high-temperature receiver.

  16. Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy.

    PubMed

    Mulligan, Deirdre K; Koopman, Colin; Doty, Nick

    2016-12-28

    The meaning of privacy has been much disputed throughout its history in response to wave after wave of new technological capabilities and social configurations. The current round of disputes over privacy fuelled by data science has been a cause of despair for many commentators and a death knell for privacy itself for others. We argue that privacy's disputes are neither an accidental feature of the concept nor a lamentable condition of its applicability. Privacy is essentially contested. Because it is, privacy is transformable according to changing technological and social conditions. To make productive use of privacy's essential contestability, we argue for a new approach to privacy research and practical design, focused on the development of conceptual analytics that facilitate dissecting privacy's multiple uses across multiple contexts. This article is part of the themed issue 'The ethical impact of data science'. © 2016 The Author(s).

  17. Privacy is an essentially contested concept: a multi-dimensional analytic for mapping privacy

    PubMed Central

    Koopman, Colin; Doty, Nick

    2016-01-01

    The meaning of privacy has been much disputed throughout its history in response to wave after wave of new technological capabilities and social configurations. The current round of disputes over privacy fuelled by data science has been a cause of despair for many commentators and a death knell for privacy itself for others. We argue that privacy’s disputes are neither an accidental feature of the concept nor a lamentable condition of its applicability. Privacy is essentially contested. Because it is, privacy is transformable according to changing technological and social conditions. To make productive use of privacy’s essential contestability, we argue for a new approach to privacy research and practical design, focused on the development of conceptual analytics that facilitate dissecting privacy’s multiple uses across multiple contexts. This article is part of the themed issue ‘The ethical impact of data science’. PMID:28336797

  18. Array-based sensing using nanoparticles: an alternative approach for cancer diagnostics.

    PubMed

    Le, Ngoc D B; Yazdani, Mahdieh; Rotello, Vincent M

    2014-07-01

    Array-based sensing using nanoparticles (NPs) provides an attractive alternative to specific biomarker-focused strategies for cancer diagnosis. The physical and chemical properties of NPs provide both the recognition and transduction capabilities required for biosensing. Array-based sensors utilize a combined response from the interactions between sensors and analytes to generate a distinct pattern (fingerprint) for each analyte. These interactions can be the result of either the combination of multiple specific biomarker recognition (specific binding) or multiple selective binding responses, known as chemical nose sensing. The versatility of the latter array-based sensing using NPs can facilitate the development of new personalized diagnostic methodologies in cancer diagnostics, a necessary evolution in the current healthcare system to better provide personalized treatments. This review will describe the basic principle of array-based sensors, along with providing examples of both invasive and noninvasive samples used in cancer diagnosis.

  19. High Performance Visualization using Query-Driven Visualizationand Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, E. Wes; Campbell, Scott; Dart, Eli

    2006-06-15

    Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. The new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
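    The core idea, pushing a range query into the data-management layer so that only matching records ever reach the visualization pipeline, can be sketched in a few lines (the data and thresholds are synthetic; the names are illustrative, not the article's software):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "network traffic" records: (duration_s, bytes) for one million flows.
n_records = 1_000_000
duration = rng.exponential(2.0, n_records)
nbytes = rng.lognormal(8.0, 2.0, n_records)

# Query-driven selection: a compound range query evaluated before any rendering,
# so the visualization only touches the (typically tiny) matching subset.
mask = (duration > 5.0) & (nbytes > 1e6)
suspicious = np.column_stack([duration[mask], nbytes[mask]])

print(f"{suspicious.shape[0]} of {n_records} records selected for display")
```

In a real system the query would be answered by an index (e.g., bitmap indices over the stored data) rather than a full scan, which is where the "needles in haystacks" speedup comes from.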

  20. Telematics Options and Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, Cabell

    This presentation describes the data tracking and analytical capabilities of telematics devices. Federal fleet managers can use the systems to keep their drivers safe, maintain a fuel efficient fleet, ease their reporting burden, and save money. The presentation includes an example of how much these capabilities can save fleets.

  1. A new mathematical model and control of a three-phase AC-DC voltage source converter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blasko, V.; Kaura, V.

    1997-01-01

    A new mathematical model of the power circuit of a three-phase voltage source converter (VSC) was developed in the stationary and synchronous reference frames. The mathematical model was then used to analyze and synthesize the voltage and current control loops for the VSC. Analytical expressions were derived for calculating the gains and time constants of the current and voltage regulators. The mathematical model was used to control a 140-kW regenerative VSC. The synchronous reference frame model was used to define feedforward signals in the current regulators to eliminate the cross coupling between the d and q phases. It allowed the reduction of the current control loops to first-order plants and improved their tracking capability. The bandwidths of the current and voltage control loops were found to be approximately 20 and 60 times (respectively) smaller than the sampling frequency. All control algorithms were implemented in a digital signal processor. All results of the analysis were experimentally verified.
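    The synchronous-reference-frame idea underlying the model can be illustrated with the Park transform: a balanced three-phase current becomes a constant (d, q) pair, which is why, once cross-coupling is removed by feedforward, simple first-order plants and PI regulators suffice. A sketch using one common amplitude-invariant convention (conventions vary; this is not necessarily the paper's exact formulation):

```python
import math

def abc_to_dq(ia, ib, ic, theta):
    """Amplitude-invariant Park transform (one common convention; others differ
    by scaling or sign)."""
    k = 2.0 / 3.0
    d = k * (ia * math.cos(theta)
             + ib * math.cos(theta - 2 * math.pi / 3)
             + ic * math.cos(theta + 2 * math.pi / 3))
    q = -k * (ia * math.sin(theta)
              + ib * math.sin(theta - 2 * math.pi / 3)
              + ic * math.sin(theta + 2 * math.pi / 3))
    return d, q

# A balanced 10 A three-phase current maps to a constant operating point
# (d, q) = (10, 0) at every instant: DC quantities that PI current regulators
# (plus the decoupling feedforward mentioned in the abstract) can track with
# zero steady-state error.
omega, amp = 2 * math.pi * 50, 10.0
for t in (0.0, 0.004, 0.01):
    theta = omega * t
    ia = amp * math.cos(theta)
    ib = amp * math.cos(theta - 2 * math.pi / 3)
    ic = amp * math.cos(theta + 2 * math.pi / 3)
    d, q = abc_to_dq(ia, ib, ic, theta)
    print(round(d, 9), round(abs(q), 9))
```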

  2. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

    According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of the probability of rejecting an analytical run that contains critical-size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure because it aligns better with the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) to the capability of the analytical process, which allow for QC planning based on the risk of harm to a patient due to the reporting of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value of Max E(NUF). The PEDC value can be used to estimate the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
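    The PEDC concept, the probability that a QC procedure rejects a run containing a critical-size error, can be illustrated with a textbook single-rule model (this is a generic 1_ks control-rule illustration under a Gaussian assumption, not the statistical model used in the paper):

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_rejection(shift_sd, k=3.0, n=1):
    """Probability that a 1_k_s control rule (n control measurements per run)
    rejects a run affected by a systematic shift of `shift_sd` analytical SDs."""
    p_in = phi(k - shift_sd) - phi(-k - shift_sd)  # one control stays within limits
    return 1.0 - p_in ** n

# Detection probability grows with the size of the error and with n:
print(round(p_rejection(0.0), 4))  # false-rejection rate of the 1_3s rule
print(round(p_rejection(2.0), 4))  # modest shift: detection still poor
print(round(p_rejection(4.0), 4))  # large shift: detection much more likely
```

Evaluating such a function at the critical systematic error for a given analytical capability is what yields a PEDC-style performance figure for a candidate QC procedure.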

  3. Critical review of dog detection and the influences of physiology, training, and analytical methodologies.

    PubMed

    Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M

    2018-08-01

    Detection dogs serve a plethora of roles within modern society and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs have been marked by ambiguity. This is partially because the assessment of detection-dog effectiveness remains entrenched in a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs rest on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most important, however, is the emergence of progressively inexpensive and efficacious analytical methodologies, including gas chromatography-related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies provide both an alternative and a complement to the detection dog industry, but the interrelationship between the two detection paradigms requires clarification. These factors, when their relative contributions are considered, illustrate a need to address research gaps and formalise the detection dog industry and research process, as well as to take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of the factors involved in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Multi-analytical Approaches Informing the Risk of Sepsis

    NASA Astrophysics Data System (ADS)

    Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael

    Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization and prolonged intensive care unit (ICU) and hospital stays. The economic burden associated with sepsis is huge. With advances in medicine, there are now aggressive, goal-oriented treatments that can help these patients. If we could predict which patients are at risk for sepsis, we could start treatment early and potentially reduce the risk of mortality and morbidity. The analytic methods currently used in clinical research to determine a patient's risk of developing sepsis may be further enhanced by multi-modal analytic methods that together provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by combining multiple analytic methods to provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among the variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about their respective predictive capabilities, while considering their clinical significance.

  5. OCCIMA: Optical Channel Characterization in Maritime Atmospheres

    NASA Astrophysics Data System (ADS)

    Hammel, Steve; Tsintikidis, Dimitri; deGrassie, John; Reinhardt, Colin; McBryde, Kevin; Hallenborg, Eric; Wayne, David; Gibson, Kristofor; Cauble, Galen; Ascencio, Ana; Rudiger, Joshua

    2015-05-01

    The Navy is actively developing diverse optical application areas, including high-energy laser weapons and free-space optical communications, which depend on an accurate and timely knowledge of the state of the atmospheric channel. The Optical Channel Characterization in Maritime Atmospheres (OCCIMA) project is a comprehensive program to coalesce and extend the current capability to characterize the maritime atmosphere for all optical and infrared wavelengths. The program goal is the development of a unified and validated analysis toolbox. The foundational design for this program coordinates the development of sensors, measurement protocols, analytical models, and basic physics necessary to fulfill this goal.

  6. An overview of current Navy programs to develop thrust augmenting ejectors

    NASA Technical Reports Server (NTRS)

    Green, K. A.

    1979-01-01

    The primary objective of Navy-sponsored research in thrust augmentation is the development of an improved augmenter for V/STOL application. In support of this goal, a data base is being established to provide an accurate prediction capability for use in ejector design. General technology development of ejectors and associated effects is presently split into the more specific areas of lift and control, since thrust augmenting ejectors may be suitable for both. Research areas examined include advanced diffuser and end-wall design; advanced primary nozzles; analytic studies; augmenting reaction controls; and nozzle design.

  7. Digital Inverter Amine Sensing via Synergistic Responses by n and p Organic Semiconductors.

    PubMed

    Tremblay, Noah J; Jung, Byung Jun; Breysse, Patrick; Katz, Howard E

    2011-11-22

    Chemiresistors and sensitive OFETs have been substantially developed as cheap, scalable, and versatile sensing platforms. While new materials are expanding OFET sensing capabilities, the device architectures have changed little. Here we report higher order logic circuits utilizing OFETs sensitive to amine vapors. The circuits depend on the synergistic responses of paired p- and n-channel organic semiconductors, including an unprecedented analyte-induced current increase by the n-channel semiconductor. This represents the first step towards 'intelligent sensors' that utilize analog signal changes in sensitive OFETs to produce direct digital readouts suitable for further logic operations.

  8. Digital Inverter Amine Sensing via Synergistic Responses by n and p Organic Semiconductors

    PubMed Central

    Tremblay, Noah J.; Jung, Byung Jun; Breysse, Patrick; Katz, Howard E.

    2013-01-01

    Chemiresistors and sensitive OFETs have been substantially developed as cheap, scalable, and versatile sensing platforms. While new materials are expanding OFET sensing capabilities, the device architectures have changed little. Here we report higher order logic circuits utilizing OFETs sensitive to amine vapors. The circuits depend on the synergistic responses of paired p- and n-channel organic semiconductors, including an unprecedented analyte-induced current increase by the n-channel semiconductor. This represents the first step towards ‘intelligent sensors’ that utilize analog signal changes in sensitive OFETs to produce direct digital readouts suitable for further logic operations. PMID:23754969

  9. Numerical modeling of the coupling of an ICRH antenna with a plasma with self-consistent antenna currents

    NASA Astrophysics Data System (ADS)

    Pécoul, S.; Heuraux, S.; Koch, R.; Leclert, G.

    2002-07-01

    Realistic modeling of ICRH antennas requires knowledge of the antenna currents. The code ICANT determines these currents self-consistently and, as a byproduct, the electrical characteristics of the antenna (radiated power, propagation constants on straps, frequency response, ...). The formalism allows for the description of three-dimensional antenna elements (for instance, finite-size thick screen blades). The results obtained for various cases where analytical results are available are discussed, as are the resonances appearing in the spectrum and the occurrence of unphysical resonant modes. The capability of this self-consistent method is illustrated by a number of examples, e.g., fully conducting thin or thick screen bars leading to magnetic shielding effects, the frequency response and resonances of an end-tuned antenna, and field distributions in front of a Tore Supra-type antenna with tilted screen blades.

  10. Machine Learning Technologies Translates Vigilant Surveillance Satellite Big Data into Predictive Alerts for Environmental Stressors

    NASA Astrophysics Data System (ADS)

    Johnson, S. P.; Rohrer, M. E.

    2017-12-01

    The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. 
This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention, intervention, and prediction of emerging environmental threats.

  11. Visual Analytics in Public Safety: Example Capabilities for Example Government Agencies

    DTIC Science & Technology

    2011-10-01

    is not limited to: the Police Records Information Management Environment for British Columbia (PRIME-BC), the Police Reporting and Occurrence System...and filtering for rapid identification of relevant documents - Graphical environment for visual evidence marshaling - Interactive linking and...analytical reasoning facilitated by interactive visual interfaces and integration with computational analytics. Indeed, a wide variety of technologies

  12. New York State energy-analytic information system: first-stage implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allentuck, J.; Carroll, O.; Fiore, L.

    1979-09-01

    So that energy policy by state government may be formulated within the constraints imposed by policy determined at the national level, yet reflect the diverse interests of its citizens, large quantities of data and sophisticated analytic capabilities are required. This report presents the design of an energy-information/analytic system for New York State, the data for a base year, 1976, and projections of these data. At the county level, 1976 energy supply-demand data and electric generating plant data are provided as well. Data-base management is based on System 2000. Three computerized models provide the system's basic analytic capacity. The Brookhaven Energy System Network Simulator provides an integrating framework, while a price-response model and a weather-sensitive energy demand model furnish a short-term energy response estimation capability. The operation of these computerized models is described. 62 references, 25 figures, 39 tables.

  13. Big Data Analytics and Machine Intelligence Capability Development at NASA Langley Research Center: Strategy, Roadmap, and Progress

    NASA Technical Reports Server (NTRS)

    Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward

    2016-01-01

    In 2014, a team of researchers, engineers, and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid-, and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in the Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.

  14. Development and Applications of Liquid Sample Desorption Electrospray Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zheng, Qiuling; Chen, Hao

    2016-06-01

    Desorption electrospray ionization mass spectrometry (DESI-MS) is a recent advance in the field of analytical chemistry. This review surveys the development of liquid sample DESI-MS (LS-DESI-MS), a variant form of DESI-MS that focuses on fast analysis of liquid samples, and its novel analytical applications in bioanalysis, proteomics, and reaction kinetics. Due to its capability of directly ionizing liquid samples, liquid sample DESI (LS-DESI) has been successfully used to couple MS with various analytical techniques, such as microfluidics, microextraction, electrochemistry, and chromatography. This review also covers these hyphenated techniques. In addition, several closely related ionization methods, including transmission mode DESI, thermally assisted DESI, and continuous flow-extractive DESI, are briefly discussed. The capabilities of LS-DESI extend and/or complement the utilities of traditional DESI and electrospray ionization and will find extensive and valuable analytical application in the future.

  15. Operation of a voltage source converter at increased utility voltage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaura, V.; Blasko, V.

    1997-01-01

    The operation of a voltage source converter (VSC) with regeneration capability, controllable power factor, and low distortion of utility currents is analyzed at increased utility voltage. An increase in the utility voltage causes a VSC to saturate and enter a nonlinear mode of operation. To operate under elevated utility voltage, two steps are taken: (1) a pulse width modulation (PWM) algorithm is implemented which extends the linear region of operation by 15%, and (2) a PWM saturation regulator is used to control the reactive current at higher utility voltages. The PWM algorithm reduces the switching losses by at least 33% and the effect of blanking time by one-third. All analytical results are experimentally verified on a 100 kW three-phase VSC.
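    The abstract does not name the specific PWM scheme used. A standard way to obtain roughly a 15% extension of the linear modulation region is min-max (zero-sequence) injection; the sketch below illustrates that technique only, and should not be read as the paper's algorithm.

    ```python
    import math

    def minmax_injection(m, theta):
        """Three-phase references with min-max (zero-sequence) injection.

        m is the modulation index (peak phase reference in units of half
        the DC-bus voltage). Adding -(max+min)/2 keeps the modified
        references within [-1, 1] up to m = 2/sqrt(3), about 15% beyond
        the purely sinusoidal limit of m = 1.
        """
        refs = [m * math.cos(theta - k * 2 * math.pi / 3) for k in range(3)]
        zero_seq = -(max(refs) + min(refs)) / 2.0  # common-mode offset
        return [v + zero_seq for v in refs]

    # Peak of the modified references over one fundamental period at m = 1.15;
    # without injection the same m would overmodulate (peak = 1.15 > 1).
    peak = max(abs(v)
               for i in range(1000)
               for v in minmax_injection(1.15, 2 * math.pi * i / 1000))
    ```

    With the injected common-mode term the peak stays below 1 even at m = 1.15, which is the sense in which the linear region is extended by about 15%.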

  16. Efficient Power-Transfer Capability Analysis of the TET System Using the Equivalent Small Parameter Method.

    PubMed

    Yanzhen Wu; Hu, A P; Budgett, D; Malpas, S C; Dissanayake, T

    2011-06-01

    Transcutaneous energy transfer (TET) enables the transfer of power across the skin without direct electrical connection. It is a mechanism for powering implantable devices for the lifetime of a patient. For maximum power transfer, it is essential that TET systems be resonant on both the primary and secondary sides, which requires considerable design effort. Consequently, a strong need exists for an efficient method to aid the design process. This paper presents an analytical technique appropriate for analyzing complex TET systems. The system's steady-state solution in closed form with sufficient accuracy is obtained by employing the proposed equivalent small parameter method. It is shown that power-transfer capability can be correctly predicted without tedious iterative simulations or practical measurements. Furthermore, for TET systems utilizing a current-fed push-pull soft-switching resonant converter, it is found that the maximum energy transfer does not occur when the primary and secondary resonant tanks are "tuned" to the nominal resonant frequency. An optimal tuning point exists, corresponding to the system's maximum power-transfer capability when optimal tuning capacitors are applied.
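    The equivalent small parameter derivation itself is beyond the scope of an abstract, but the qualitative finding, that maximum transfer need not occur at the nominal resonant tuning, can be checked with a brute-force phasor sweep of a mutually coupled two-coil link. All component values below are illustrative assumptions, not parameters from the paper.

    ```python
    import numpy as np

    # Hypothetical inductive-link parameters (illustrative only)
    f = 200e3; w = 2 * np.pi * f
    L1, L2 = 20e-6, 20e-6          # coil inductances [H]
    R1, R2 = 0.2, 0.2              # coil series resistances [ohm]
    k = 0.3                        # coupling coefficient across the skin
    M = k * np.sqrt(L1 * L2)       # mutual inductance
    RL = 10.0                      # load resistance [ohm]
    C1 = 1 / (w**2 * L1)           # primary tuned to the nominal frequency
    V = 1.0                        # source phasor (RMS)

    def load_power(C2):
        """Solve the two-mesh phasor equations and return power in RL."""
        Z11 = R1 + 1j*w*L1 + 1/(1j*w*C1)
        Z22 = R2 + 1j*w*L2 + 1/(1j*w*C2) + RL
        Z = np.array([[Z11, 1j*w*M], [1j*w*M, Z22]])
        I = np.linalg.solve(Z, np.array([V, 0.0]))
        return abs(I[1])**2 * RL

    # Sweep the secondary tuning capacitor around its nominal value
    C2_nom = 1 / (w**2 * L2)
    grid = np.concatenate([np.linspace(0.5, 1.5, 401) * C2_nom, [C2_nom]])
    P = np.array([load_power(c) for c in grid])
    C2_opt, P_opt = grid[P.argmax()], P.max()
    ```

    The sweep's best capacitor never delivers less power than the nominal one, and for strongly coupled links the optimum typically sits away from nominal resonance, in line with the paper's conclusion.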

  17. Integrated multisensor perimeter detection systems

    NASA Astrophysics Data System (ADS)

    Kent, P. J.; Fretwell, P.; Barrett, D. J.; Faulkner, D. A.

    2007-10-01

    The report describes the results of a multi-year programme of research aimed at the development of an integrated multi-sensor perimeter detection system capable of being deployed at an operational site. The research was driven by end user requirements in protective security, particularly in threat detection and assessment, where effective capability was either not available or prohibitively expensive. Novel video analytics have been designed to provide robust detection of pedestrians in clutter while new radar detection and tracking algorithms provide wide area day/night surveillance. A modular integrated architecture based on commercially available components has been developed. A graphical user interface allows intuitive interaction and visualisation with the sensors. The fusion of video, radar and other sensor data provides the basis of a threat detection capability for real life conditions. The system was designed to be modular and extendable in order to accommodate future and legacy surveillance sensors. The current sensor mix includes stereoscopic video cameras, mmWave ground movement radar, CCTV and a commercially available perimeter detection cable. The paper outlines the development of the system and describes the lessons learnt after deployment in a pilot trial.

  18. Validation of a liquid chromatography-tandem mass spectrometry method for the identification and quantification of 5-nitroimidazole drugs and their corresponding hydroxy metabolites in lyophilised pork meat.

    PubMed

    Zeleny, Reinhard; Harbeck, Stefan; Schimmel, Heinz

    2009-01-09

    A liquid chromatography-electrospray ionisation tandem mass spectrometry method for the simultaneous detection and quantitation of 5-nitroimidazole veterinary drugs in lyophilised pork meat, the chosen format of a candidate certified reference material, has been developed and validated. Six analytes have been included in the scope of validation, i.e. dimetridazole (DMZ), metronidazole (MNZ), ronidazole (RNZ), hydroxymetronidazole (MNZOH), hydroxyipronidazole (IPZOH), and 2-hydroxymethyl-1-methyl-5-nitroimidazole (HMMNI). The analytes were extracted from the sample with ethyl acetate, chromatographically separated on a C18 column, and finally identified and quantified by tandem mass spectrometry in the multiple reaction monitoring (MRM) mode using matrix-matched calibration and ²H₃-labelled analogues of the analytes (except for MNZOH, where [²H₃]MNZ was used). The method was validated in accordance with Commission Decision 2002/657/EC, by determining selectivity, linearity, matrix effect, apparent recovery, repeatability and intermediate precision, decision limits and detection capabilities, robustness of the sample preparation method, and stability of extracts. Recovery at the 1 µg/kg level was at 100% (estimates in the range of 101-107%) for all analytes; repeatabilities and intermediate precisions at this level were in the range of 4-12% and 2-9%, respectively. Linearity of the calibration curves in the working range 0.5-10 µg/kg was confirmed, with r values typically >0.99. Decision limits (CCα) and detection capabilities (CCβ) according to ISO 11843-2 (calibration curve approach) were 0.29-0.44 and 0.36-0.54 µg/kg, respectively. 
    The method reliably identifies and quantifies the selected nitroimidazoles in the reconstituted pork meat in the low and sub-µg/kg range and will be applied in an interlaboratory comparison for determining the mass fraction of the selected nitroimidazoles in the candidate reference material currently developed at IRMM.
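    As a rough illustration of the calibration-curve approach, the sketch below fits a straight line to invented calibration data and applies the commonly used 2.33 and 1.64 multipliers of the residual standard deviation (banned substances, no permitted limit). The full ISO 11843-2 procedure includes additional terms; this is only the simplified textbook form, and the data are hypothetical.

    ```python
    import numpy as np

    # Hypothetical calibration data: concentration [ug/kg] vs. response
    # (peak-area ratio); these values are illustrative, not the paper's.
    x = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0])
    y = np.array([0.051, 0.098, 0.205, 0.401, 0.612, 0.795, 1.010])

    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    s_y = np.sqrt(np.sum(resid**2) / (len(x) - 2))  # residual std. deviation

    # Simplified calibration-curve estimates: a signal threshold 2.33*s above
    # the blank (alpha = 1%) gives CCalpha; a further 1.64*s (beta = 5%)
    # gives CCbeta. Both are converted to concentration via the slope.
    cc_alpha = 2.33 * s_y / slope
    cc_beta = cc_alpha + 1.64 * s_y / slope
    ```

    With well-behaved data, CCβ always sits above CCα, matching the ordering of the 0.29-0.44 and 0.36-0.54 µg/kg ranges reported in the record.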

  19. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Feng; Liu, Yijin; Yu, Xiqian

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, the advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as among the most effective methods, allowing nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and the associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy, and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools and are also discussed towards the end. 
    We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  20. Synchrotron X-ray Analytical Techniques for Studying Materials Electrochemistry in Rechargeable Batteries

    DOE PAGES

    Lin, Feng; Liu, Yijin; Yu, Xiqian; ...

    2017-08-30

    Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, the advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as among the most effective methods, allowing nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and the associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy, and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools and are also discussed towards the end. 
    We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.

  1. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation.

    PubMed

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels, including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream, upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks available in five countries: Canada, China, Russia, Spain, and the United States, using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework to build digital river networks that are as complete as possible and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support the creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  2. Building Virtual Watersheds: A Global Opportunity to Strengthen Resource Management and Conservation

    NASA Astrophysics Data System (ADS)

    Benda, Lee; Miller, Daniel; Barquin, Jose; McCleary, Richard; Cai, TiJiu; Ji, Y.

    2016-03-01

    Modern land-use planning and conservation strategies at landscape to country scales worldwide require complete and accurate digital representations of river networks, encompassing all channels, including the smallest headwaters. The digital river networks, integrated with widely available digital elevation models, also need analytical capabilities to support resource management and conservation, including attributing river segments with key stream and watershed data, characterizing topography to identify landforms, discretizing land uses at scales necessary to identify human-environment interactions, and connecting channels downstream, upstream, and to terrestrial environments. We investigate the completeness and analytical capabilities of national to regional scale digital river networks available in five countries: Canada, China, Russia, Spain, and the United States, using actual resource management and conservation projects involving 12 university, agency, and NGO organizations. In addition, we review one pan-European and one global digital river network. Based on our analysis, we conclude that the majority of the regional, national, and global scale digital river networks in our sample lack network completeness, analytical capabilities, or both. To address this limitation, we outline a general framework to build digital river networks that are as complete as possible and to integrate them with available digital elevation models to create robust analytical capabilities (e.g., virtual watersheds). We believe this presents a global opportunity for in-country agencies, or international players, to support the creation of virtual watersheds to increase environmental problem solving, broaden access to the watershed sciences, and strengthen resource management and conservation in countries worldwide.

  3. Ultra-sensitive fluorescent imaging-biosensing using biological photonic crystals

    NASA Astrophysics Data System (ADS)

    Squire, Kenny; Kong, Xianming; Wu, Bo; Rorrer, Gregory; Wang, Alan X.

    2018-02-01

    Optical biosensing is a growing area of research known for its low limits of detection. Among optical sensing techniques, fluorescence detection is one of the most established and prevalent. Fluorescence imaging is an optical biosensing modality that exploits the sensitivity of fluorescence in an easy-to-use process: a user places a sample on a sensor and uses an imager, such as a camera, to collect the results. The image can then be processed to determine the presence of the analyte. Fluorescence imaging is appealing because it can be performed with as little as a light source, a camera, and a data processor, making it well suited to non-specialist personnel without expensive equipment. Fluorescence imaging sensors generally employ an immunoassay procedure to selectively trap analytes such as antigens or antibodies. When the analyte is present, the sensor fluoresces, transducing the chemical reaction into an optical signal that can be imaged. Enhancing this fluorescence improves the detection capabilities of the sensor. Diatoms are unicellular algae with a biosilica shell called a frustule. The frustule is porous, with periodic nanopores that make diatoms biological photonic crystals; this porosity also provides a large surface area with multiple analyte binding sites. In this paper, we fabricate a diatom-based ultra-sensitive fluorescence imaging biosensor capable of detecting the antibody mouse immunoglobulin down to a concentration of 1 nM. The measured signal is enhanced 6× compared with sensors fabricated without diatoms.

  4. Buckling Testing and Analysis of Space Shuttle Solid Rocket Motor Cylinders

    NASA Technical Reports Server (NTRS)

    Weidner, Thomas J.; Larsen, David V.; McCool, Alex (Technical Monitor)

    2002-01-01

    A series of full-scale buckling tests were performed on the space shuttle Reusable Solid Rocket Motor (RSRM) cylinders. The tests were performed to determine the buckling capability of the cylinders and to provide data for analytical comparison. A nonlinear ANSYS Finite Element Analysis (FEA) model was used to represent and evaluate the testing. Analytical results demonstrated excellent correlation to test results, predicting the failure load within 5%. The analytical value was on the conservative side, predicting a lower failure load than was applied to the test. The resulting study and analysis indicated the important parameters for FEA to accurately predict buckling failure. The resulting method was subsequently used to establish the pre-launch buckling capability of the space shuttle system.
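    The paper's nonlinear FEA cannot be reproduced from an abstract, but a classical small-deflection estimate of the axial buckling load of a thin cylinder is a common hand check against such analyses. The formula below is the standard classical result; the empirical knockdown factor and all geometry and material values are illustrative assumptions, not RSRM properties.

    ```python
    import math

    def classical_axial_buckling_load(E, t, R, nu=0.3, knockdown=1.0):
        """Classical critical axial load of a thin-walled cylinder.

        sigma_cr = E*t / (R*sqrt(3*(1 - nu^2))), about 0.605*E*t/R for
        nu = 0.3, multiplied by the shell cross-section 2*pi*R*t and an
        optional empirical knockdown factor for imperfection sensitivity.
        """
        sigma_cr = E * t / (R * math.sqrt(3.0 * (1.0 - nu**2)))
        return knockdown * sigma_cr * 2.0 * math.pi * R * t

    # Illustrative steel-like numbers (E in Pa, t and R in m)
    P_classical = classical_axial_buckling_load(E=200e9, t=0.012, R=1.8)
    P_design = classical_axial_buckling_load(E=200e9, t=0.012, R=1.8,
                                             knockdown=0.65)
    ```

    Real cylinders buckle well below the classical load because of imperfection sensitivity, which is why test-anchored FEA of the kind described above is needed to establish flight capability.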

  5. Analytical Chemistry Laboratory

    NASA Technical Reports Server (NTRS)

    Anderson, Mark

    2013-01-01

    The Analytical Chemistry and Material Development Group maintains a capability in chemical analysis, materials R&D, failure analysis, and contamination control. The uniquely qualified staff and facility support the needs of flight projects, science instrument development, and various technical tasks, as well as Caltech.

  6. INTEGRATING BIOANALYTICAL CAPABILITY IN AN ENVIRONMENTAL ANALYTICAL LABORATORY

    EPA Science Inventory

    The product is a book chapter which serves as an introductory and summary chapter for the reference work "Immunoassays and Other Bioanalytical Techniques" to be published by CRC Press, Taylor and Francis Books. The chapter provides analytical chemists information on new techni...

  7. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological, and remote sensing datasets be assimilated, processed, and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (ViSiT, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis, and so on. These capabilities use data collected with sensors and analytical tools developed by multiple manufacturers, producing many different kinds of measurements. While scientists obviously leverage tools, capabilities, and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable, and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of ingestion, processing, and visualization of hydrological, geochemical, and geophysical (ERT/DTS) data. The core organizational element of PAF is a project/user model in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. 
PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows for automation of electrical geophysical ingestion and processing, as well as co-analysis and visualization of the raw and processed data with other data of interest (e.g., soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  8. Manipulability, force, and compliance analysis for planar continuum manipulators

    NASA Technical Reports Server (NTRS)

    Gravagne, Ian A.; Walker, Ian D.

    2002-01-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.

  9. Manipulability, force, and compliance analysis for planar continuum manipulators.

    PubMed

    Gravagne, Ian A; Walker, Ian D

    2002-06-01

    Continuum manipulators, inspired by the natural capabilities of elephant trunks and octopus tentacles, may find niche applications in areas like human-robot interaction, multiarm manipulation, and unknown environment exploration. However, their true capabilities will remain largely inaccessible without proper analytical tools to evaluate their unique properties. Ellipsoids have long served as one of the foremost analytical tools available to the robotics researcher, and the purpose of this paper is to first formulate, and then to examine, three types of ellipsoids for continuum robots: manipulability, force, and compliance.

  10. Performance enhancement of Pt/TiO2/Si UV-photodetector by optimizing light trapping capability and interdigitated electrodes geometry

    NASA Astrophysics Data System (ADS)

    Bencherif, H.; Djeffal, F.; Ferhati, H.

    2016-09-01

This paper presents a hybrid approach based on an analytical and metaheuristic investigation to study the impact of interdigitated-electrode engineering on both the speed and the optical performance of an interdigitated metal-semiconductor-metal ultraviolet photodetector (IMSM-UV-PD). In this context, analytical models for the speed and optical performance have been developed and validated against experimental results, with good agreement. Moreover, the developed analytical models have been used as objective functions to determine the optimized design parameters, including the interdigit configuration effect, via a multi-objective genetic algorithm (MOGA). The ultimate goal of the proposed hybrid approach is to identify the optimal design parameters associated with the maximum electrical and optical device performance. The optimized IMSM-PD not only reveals superior performance in terms of photocurrent and response time, but also exhibits higher optical reliability against the optical losses due to active-area shadowing effects. The advantages offered by the proposed design methodology suggest the possibility of overcoming the most challenging problem with the communication speed and power requirements of the UV optical interconnect: high derived current and commutation speed in the UV receiver.
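The selection step of a multi-objective genetic algorithm such as the MOGA mentioned above rests on Pareto dominance between objective vectors. A minimal, illustrative sketch (not the authors' implementation; the objective pairs below are invented):

```python
# Pareto-dominance core of a multi-objective GA (illustrative sketch).
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (maximization)."""
    return all(x >= y for x, y in zip(a, b)) and \
           any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# e.g. trading off photocurrent vs. response speed (both to be maximized);
# values are hypothetical design evaluations, not the paper's data:
designs = [(1.0, 0.2), (0.8, 0.9), (0.5, 0.5), (0.9, 0.8)]
front = pareto_front(designs)
```

A full MOGA iterates variation (crossover, mutation) and dominance-based selection until the front stabilizes.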

  11. MOMA Gas Chromatograph-Mass Spectrometer onboard the 2018 ExoMars Mission: results and performance

    NASA Astrophysics Data System (ADS)

    Buch, A.; Pinnick, V. T.; Szopa, C.; Grand, N.; Humeau, O.; van Amerom, F. H.; Danell, R.; Freissinet, C.; Brinckerhoff, W.; Gonnsen, Z.; Mahaffy, P. R.; Coll, P.; Raulin, F.; Goesmann, F.

    2015-10-01

The Mars Organic Molecule Analyzer (MOMA) is a dual-ion-source linear ion trap mass spectrometer designed for the 2018 joint ESA-Roscosmos mission to Mars. The main scientific aim of the mission is to search for signs of extant or extinct life in the near subsurface of Mars by acquiring samples from as deep as 2 m below the surface. MOMA will be a key analytical tool in providing chemical (molecular and chiral) information from the solid samples, with particular focus on the characterization of organic content. The MOMA instrument itself is a joint NASA-ESA venture to develop a mass spectrometer capable of analyzing samples from pyrolysis/chemical-derivatization gas chromatography (GC) as well as ambient-pressure laser desorption ionization (LDI). The combination of the two analytical techniques allows chemical characterization of a broad range of compounds, including volatile and non-volatile species. Generally, MOMA can provide information on the elemental and molecular makeup, polarity, chirality, and isotopic patterns of analyte species. Here we report on the current performance of the MOMA prototype instruments, specifically the demonstration of the gas chromatography-mass spectrometry (GC-MS) mode of operation.

  12. Adding a solar-radiance function to the Hošek-Wilkie skylight model.

    PubMed

    Hošek, Lukáš; Wilkie, Alexander

    2013-01-01

    One prerequisite for realistic renderings of outdoor scenes is the proper capturing of the sky's appearance. Currently, an explicit simulation of light scattering in the atmosphere isn't computationally feasible, and won't be in the foreseeable future. Captured luminance patterns have proven their usefulness in practice but can't meet all user needs. To fill this capability gap, computer graphics technology has employed analytical models of sky-dome luminance patterns for more than two decades. For technical reasons, such models deal with only the sky dome's appearance, though, and exclude the solar disc. The widely used model proposed by Arcot Preetham and colleagues employed a separately derived analytical formula for adding a solar emitter of suitable radiant intensity. Although this yields reasonable results, the formula is derived in a manner that doesn't exactly match the conditions in their sky-dome model. But the more sophisticated a skylight model is and the more subtly it can represent different conditions, the more the solar radiance should exactly match the skylight's conditions. Toward that end, researchers propose a solar-radiance function that exactly matches a recently published high-quality analytical skylight model.

  13. Acetaminophen and acetone sensing capabilities of nickel ferrite nanostructures

    NASA Astrophysics Data System (ADS)

    Mondal, Shrabani; Kumari, Manisha; Madhuri, Rashmi; Sharma, Prashant K.

    2017-07-01

The present work elucidates the gas-sensing and electrochemical-sensing capabilities of sol-gel-derived nickel ferrite (NF) nanostructures, based on their electrical and electrochemical properties. The choices of target species (acetone and acetaminophen) are governed by their practical utility and by safety considerations. Acetone, the target analyte for gas-sensing measurements, is a common chemical used in a variety of applications and also provides an indirect way to monitor diabetes. The gas-sensing experiments were performed in a homemade sensing chamber designed by our group. The acetone gas sensor (NF pellet sensor) response was monitored by tracking the change in resistance in the presence and absence of acetone. At the optimum operating temperature of 300 °C, the NF pellet sensor exhibits a selective response to acetone in the presence of common interfering gases such as ethanol, benzene, and toluene. The electrochemical sensor for acetaminophen was prepared by coating NF onto the surface of a pre-treated/cleaned pencil graphite electrode (NF-PGE). The target analyte acetaminophen, commonly known as paracetamol (PC), is a widely used pain killer; overdose can cause renal failure and even fatal disease in children, demanding accurate monitoring. Under optimal conditions the NF-PGE shows a detection limit as low as 0.106 μM, with selective detection of acetaminophen in the presence of ascorbic acid (AA), which co-exists in the body. Use of cheap and abundant PGEs instead of other electrodes (gold/Pt/glassy carbon) can effectively reduce the cost of such sensors. The results demonstrate the broad appeal of NF sensors in real analytical applications, e.g., environmental monitoring, the pharmaceutical industry, drug detection, and health monitoring.
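For resistance-type oxide gas sensors such as the NF pellet, the response is typically reported as a resistance ratio between clean air and the analyte gas. A small illustrative sketch (the paper's exact response definition may differ, and the resistance values below are invented):

```python
# Gas-sensor response sketch: ratio of baseline resistance in air to
# resistance under the target gas (one common convention; others use
# the inverse ratio or a percentage change).
def sensor_response(r_air, r_gas):
    """Response S = R_air / R_gas."""
    return r_air / r_gas

S = sensor_response(r_air=120e3, r_gas=40e3)  # resistances in ohms
```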

  14. Modeling and evaluation of the oil-spill emergency response capability based on linguistic variables.

    PubMed

    Kang, Jian; Zhang, Jixin; Bai, Yongqiang

    2016-12-15

    An evaluation of the oil-spill emergency response capability (OS-ERC) currently in place in modern marine management is required to prevent pollution and loss accidents. The objective of this paper is to develop a novel OS-ERC evaluation model, the importance of which stems from the current lack of integrated approaches for interpreting, ranking and assessing OS-ERC performance factors. In the first part of this paper, the factors influencing OS-ERC are analyzed and classified to generate a global evaluation index system. Then, a semantic tree is adopted to illustrate linguistic variables in the evaluation process, followed by the application of a combination of Fuzzy Cognitive Maps (FCM) and the Analytic Hierarchy Process (AHP) to construct and calculate the weight distribution. Finally, considering that the OS-ERC evaluation process is a complex system, a fuzzy comprehensive evaluation (FCE) is employed to calculate the OS-ERC level. The entire evaluation framework obtains the overall level of OS-ERC, and also highlights the potential major issues concerning OS-ERC, as well as expert opinions for improving the feasibility of oil-spill accident prevention and protection. Copyright © 2016 Elsevier Ltd. All rights reserved.
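The AHP step mentioned above derives criterion weights as the normalized principal eigenvector of a reciprocal pairwise-comparison matrix. A sketch with a hypothetical 3-criterion matrix (not the paper's actual OS-ERC index system):

```python
# AHP weight derivation sketch. Entry A[i][j] states how much more
# important criterion i is than criterion j; A is reciprocal (A[j][i] =
# 1/A[i][j]). Weights are the normalized principal eigenvector.
import numpy as np

def ahp_weights(A):
    vals, vecs = np.linalg.eig(A)
    k = int(np.argmax(vals.real))          # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

# hypothetical judgments: criterion 1 is 3x as important as 2, 5x as 3, etc.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w = ahp_weights(A)
```

In practice the principal eigenvalue is also compared against the matrix order to compute a consistency ratio before the weights are accepted.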

  15. The use of a quartz crystal microbalance as an analytical tool to monitor particle/surface and particle/particle interactions under dry ambient and pressurized conditions: a study using common inhaler components.

    PubMed

    Turner, N W; Bloxham, M; Piletsky, S A; Whitcombe, M J; Chianella, I

    2016-12-19

    Metered dose inhalers (MDI) and multidose powder inhalers (MPDI) are commonly used for the treatment of chronic obstructive pulmonary diseases and asthma. Currently, analytical tools to monitor particle/particle and particle/surface interaction within MDI and MPDI at the macro-scale do not exist. A simple tool capable of measuring such interactions would ultimately enable quality control of MDI and MDPI, producing remarkable benefits for the pharmaceutical industry and the users of inhalers. In this paper, we have investigated whether a quartz crystal microbalance (QCM) could become such a tool. A QCM was used to measure particle/particle and particle/surface interactions on the macroscale, by additions of small amounts of MDPI components, in the powder form into a gas stream. The subsequent interactions with materials on the surface of the QCM sensor were analyzed. Following this, the sensor was used to measure fluticasone propionate, a typical MDI active ingredient, in a pressurized gas system to assess its interactions with different surfaces under conditions mimicking the manufacturing process. In both types of experiments the QCM was capable of discriminating interactions of different components and surfaces. The results have demonstrated that the QCM is a suitable platform for monitoring macro-scale interactions and could possibly become a tool for quality control of inhalers.
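For thin rigid films, a QCM frequency shift maps to deposited mass through the Sauerbrey relation; the particle/particle interactions studied here require more elaborate models, but the relation is the usual baseline. A sketch using nominal AT-cut quartz constants:

```python
# Sauerbrey relation sketch: Δm = -Δf · A · sqrt(ρq·µq) / (2·f0²),
# with quartz density ρq (g/cm³) and shear modulus µq (g·cm⁻¹·s⁻²).
# Valid for thin, rigid, uniformly distributed films.
def sauerbrey_mass(delta_f_hz, f0_hz=5e6, area_cm2=1.0,
                   rho_q=2.648, mu_q=2.947e11):
    """Mass change in grams from a frequency shift in Hz."""
    return -delta_f_hz * area_cm2 * (rho_q * mu_q) ** 0.5 / (2 * f0_hz ** 2)

dm = sauerbrey_mass(delta_f_hz=-100.0)  # a 100 Hz decrease on a 5 MHz crystal
```

For a 5 MHz crystal this corresponds to the familiar sensitivity of roughly 17.7 ng per cm² per Hz.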

  16. EPA Region 6 Laboratory Method Specific Analytical Capabilities with Sample Concentration Range

    EPA Pesticide Factsheets

EPA Region 6 Environmental Services Branch (ESB) Laboratory is capable of analyzing a wide range of samples, with concentrations ranging from low part-per-trillion (ppt) to low percent (%) levels, depending on the sample matrix.

  17. SPECIATION OF ARSENIC IN EXPOSURE ASSESSMENT MATRICES

    EPA Science Inventory

The speciation of arsenic in water, food, and urine is an essential analytical capability in arsenic risk assessment. The cancer risk associated with arsenic has been the driving force in generating the analytical research in each of these matrices. This presentat...

  18. ENVIRONMENTAL TECHNOLOGICAL VERIFICATION REPORT - L2000 PCB/CHLORIDE ANALYZER - DEXSIL CORPORATION

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - ENVIROGARD PCB TEST KIT - STRATEGIC DIAGNOSTICS INC

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...

  20. Quantitative correlations between collision induced dissociation mass spectrometry coupled with electrospray ionization or atmospheric pressure chemical ionization mass spectrometry - Experiment and theory

    NASA Astrophysics Data System (ADS)

    Ivanova, Bojidarka; Spiteller, Michael

    2018-04-01

This paper treats quantitative correlation model equations between experimental kinetic and thermodynamic parameters of electrospray ionization (ESI) or atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) coupled with collision-induced dissociation mass spectrometry, accounting for the fact that the physical phenomena and mechanisms of ESI- and APCI-ion formation are completely different. Forty-two fragment reactions of three analytes under independent ESI and APCI measurements are described. The newly developed quantitative models allow the reaction kinetics and thermodynamics to be studied correlatively by mass spectrometry; their complementary application with quantum chemical methods provides 3D structural information on the analytes. Both static and dynamic quantum chemical computations are carried out. The objects of analysis are [2,3-dimethyl-4-(4-methyl-benzoyl)-2,3-di-p-tolyl-cyclobutyl]-p-tolyl-methanone (1) and the polycyclic aromatic hydrocarbon derivatives of dibenzoperylene (2) and tetrabenzo[a,c,fg,op]naphthacene (3). Because (1) is known to be a product of [2π+2π] cycloaddition reactions of chalcone (1,3-di-p-tolyl-propenone) that yield cyclic derivatives with different stereoselectivity, the study provides crucial data on the capability of mass spectrometry to determine the stereoselectivity of analytes. This work also provides the first quantitative treatment of the relations '3D molecular/electronic structure'-'quantum chemical diffusion coefficient'-'mass spectrometric diffusion coefficient', thus extending the capability of mass spectrometry to determine the exact 3D structure of analytes using independent measurements and computations of the diffusion coefficients.
The experimental diffusion parameters are determined with the 'current monitoring method', which evaluates the translational diffusion of charged analytes, while the theoretical modelling of MS ions and the computation of theoretical diffusion coefficients are based on the Arrhenius-type behavior of the charged species under ESI and APCI conditions. Although the study chiefly establishes quantitative relations between reaction kinetics/thermodynamics and the 3D structure of the analytes, together with the correlations between 3D molecular/electronic structure, quantum chemical diffusion coefficient, and mass spectrometric diffusion coefficient, contributing significantly to structural analytical chemistry, the results are also important to other areas such as organic synthesis and catalysis.
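The Arrhenius-type temperature dependence invoked for the theoretical diffusion coefficients has the standard form D(T) = D0·exp(-Ea/(R·T)). A generic sketch (parameter values are illustrative, not the paper's fitted ones):

```python
# Arrhenius-type diffusion coefficient sketch: D(T) = D0·exp(-Ea/(R·T)).
# D0 and Ea below are placeholders, not fitted values from the paper.
import math

def diffusion_coefficient(T_kelvin, D0=1e-9, Ea_j_per_mol=2e4):
    """Diffusion coefficient at temperature T, gas constant R = 8.314 J/(mol·K)."""
    R = 8.314
    return D0 * math.exp(-Ea_j_per_mol / (R * T_kelvin))

D300 = diffusion_coefficient(300.0)
D350 = diffusion_coefficient(350.0)  # diffusion speeds up with temperature
```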

  1. Analytical and experimental investigation of liquid double drop dynamics: Preliminary design for space shuttle experiments

    NASA Technical Reports Server (NTRS)

    1981-01-01

The preliminary grant assessed the use of laboratory experiments for simulating low-g liquid drop experiments in the space shuttle environment. Investigations were begun of appropriate immiscible liquid systems, design of experimental apparatus, and analyses. The current grant continued these topics, completed construction and preliminary testing of the experimental apparatus, and performed experiments on single and compound liquid drops. A continuing assessment of laboratory capabilities and of the interests of project personnel and available collaborators led, after consultations with NASA personnel, to a research emphasis on compound drops consisting of hollow plastic or elastic spheroids filled with liquids.

  2. Space tug thermal control. [design criteria and specifications

    NASA Technical Reports Server (NTRS)

    1974-01-01

It was determined that the space tug will require the capability to perform its mission within a broad range of thermal environments, with currently planned mission durations of up to seven days. An investigation was therefore conducted to define a thermal design for the forward and intertank compartments and the fuel-cell heat-rejection system that satisfies tug requirements for low-inclination geosynchronous deploy and retrieve missions. Passive concepts were demonstrated analytically for both the forward and intertank compartments, and a worst-case external heating environment was determined for use during the study. The resulting thermal control system specifications and designs are presented.

  3. The effective compliance of spatially evolving planar wing-cracks

    NASA Astrophysics Data System (ADS)

    Ayyagari, R. S.; Daphalapurkar, N. P.; Ramesh, K. T.

    2018-02-01

    We present an analytic closed form solution for anisotropic change in compliance due to the spatial evolution of planar wing-cracks in a material subjected to largely compressive loading. A fully three-dimensional anisotropic compliance tensor is defined and evaluated considering the wing-crack mechanism, using a mixed-approach based on kinematic and energetic arguments to derive the coefficients in incremental compliance. Material, kinematic and kinetic parametric influences on the increments in compliance are studied in order to understand their physical implications on material failure. Model verification is carried out through comparisons to experimental uniaxial compression results to showcase the predictive capabilities of the current study.

  4. Sociocultural Behavior Influence Modelling & Assessment: Current Work and Research Frontiers.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Michael Lewis

A common problem associated with the effort to better assess potential behaviors of various individuals within different countries is the sheer difficulty of comprehending the dynamic nature of populations, particularly over time and considering feedback effects. This paper discusses a theory-based analytical capability designed to enable analysts to better assess the influence of events on individuals interacting within a country or region. These events can include changes in policy, man-made or natural disasters, migration, war, or other changes in environmental/economic conditions. In addition, this paper describes potential extensions of this type of research to enable more timely and accurate assessments.

  5. Assembling Amperometric Biosensors for Clinical Diagnostics

    PubMed Central

    Belluzo, María Soledad; Ribone, María Élida; Lagier, Claudia Marina

    2008-01-01

Clinical diagnosis and disease prevention routinely require the assessment of species determined by chemical analysis. Biosensor technology offers several benefits over conventional diagnostic analysis: simplicity of use, specificity for the target analyte, speed in arriving at a result, capability for continuous monitoring and multiplexing, and the potential for coupling to low-cost, portable instrumentation. This work focuses on the basic lines of decision when designing electron-transfer-based biosensors for clinical analysis, with emphasis on the strategies currently used to improve device performance, the present status of amperometric electrodes for biomedicine, and the trends and challenges envisaged for the near future. PMID:27879771

  6. Experimental and Analytical Determinations of Spiral Bevel Gear-Tooth Bending Stress Compared

    NASA Technical Reports Server (NTRS)

    Handschuh, Robert F.

    2000-01-01

    Spiral bevel gears are currently used in all main-rotor drive systems for rotorcraft produced in the United States. Applications such as these need spiral bevel gears to turn the corner from the horizontal gas turbine engine to the vertical rotor shaft. These gears must typically operate at extremely high rotational speeds and carry high power levels. With these difficult operating conditions, an improved analytical capability is paramount to increasing aircraft safety and reliability. Also, literature on the analysis and testing of spiral bevel gears has been very sparse in comparison to that for parallel axis gears. This is due to the complex geometry of this type of gear and to the specialized test equipment necessary to test these components. To develop an analytical model of spiral bevel gears, researchers use differential geometry methods to model the manufacturing kinematics. A three-dimensional spiral bevel gear modeling method was developed that uses finite elements for the structural analysis. This method was used to analyze the three-dimensional contact pattern between the test pinion and gear used in the Spiral Bevel Gear Test Facility at the NASA Glenn Research Center at Lewis Field. Results of this analysis are illustrated in the preceding figure. The development of the analytical method was a joint endeavor between NASA Glenn, the U.S. Army Research Laboratory, and the University of North Dakota.

  7. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.

    PubMed

    Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A

    2005-04-07

    Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
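The generic workflow operators the meta-analysis identifies (sequencing, iteration, conditionals) can be sketched as plain higher-order functions; the names below are illustrative and are not GPIPE's actual API:

```python
# Sketch of generic workflow-composition operators (illustrative names,
# not GPIPE's API): sequencing, iteration over a collection, conditionals.
def seq(*steps):
    """Compose steps left-to-right into a single callable."""
    def run(x):
        for step in steps:
            x = step(x)
        return x
    return run

def map_over(step):
    """Iteration operator: apply a step to each item of a collection."""
    return lambda xs: [step(x) for x in xs]

def cond(pred, then_step, else_step):
    """Conditional branch between two alternative steps."""
    return lambda x: then_step(x) if pred(x) else else_step(x)

# toy pipeline: uppercase each sequence string, then sort if more than one
pipeline = seq(map_over(str.upper),
               cond(lambda xs: len(xs) > 1, sorted, lambda xs: xs))
result = pipeline(["tgca", "atgc"])
```

Defining the operators once makes each a parameterizable, reusable component, which is the point the abstract makes about moving beyond ad hoc, hard-coded control flow.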

  8. Nanomanipulation-Coupled Matrix-Assisted Laser Desorption/ Ionization-Direct Organelle Mass Spectrometry: A Technique for the Detailed Analysis of Single Organelles

    NASA Astrophysics Data System (ADS)

    Phelps, Mandy S.; Sturtevant, Drew; Chapman, Kent D.; Verbeck, Guido F.

    2016-02-01

We describe a novel technique combining precise organelle microextraction with deposition and matrix-assisted laser desorption/ionization (MALDI) for rapid, minimally invasive mass spectrometry (MS) analysis of single organelles from living cells. A dual-positioner nanomanipulator workstation was utilized both for extraction of organelle content and for precise co-deposition of analyte and matrix solution for MALDI-direct organelle mass spectrometry (DOMS) analysis. Here, the triacylglycerol (TAG) profiles of single lipid droplets from 3T3-L1 adipocytes were acquired and the results validated with nanoelectrospray ionization (NSI) MS. The results demonstrate the utility of the MALDI-DOMS technique: it enabled longer mass-analysis time, higher ionization efficiency, MS imaging of the co-deposited spot, and subsequent MS/MS capabilities of localized lipid content in comparison with NSI-DOMS. This method provides selective organellar resolution, which complements current biochemical analyses and prompts subsequent subcellular studies where limited sample and analyte volumes are of concern.

  9. Mission Analysis Program for Solar Electric Propulsion (MAPSEP). Volume 1: Analytical manual for earth orbital MAPSEP

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An introduction to the MAPSEP organization and a detailed analytical description of all models and algorithms are given. These include trajectory and error covariance propagation methods, orbit determination processes, thrust modeling, and trajectory correction (guidance) schemes. Earth orbital MAPSEP contains the capability of analyzing almost any currently projected low thrust mission from low earth orbit to super synchronous altitudes. Furthermore, MAPSEP is sufficiently flexible to incorporate extended dynamic models, alternate mission strategies, and almost any other system requirement imposed by the user. As in the interplanetary version, earth orbital MAPSEP represents a trade-off between precision modeling and computational speed consistent with defining necessary system requirements. It can be used in feasibility studies as well as in flight operational support. Pertinent operational constraints are available both implicitly and explicitly. However, the reader should be warned that because of program complexity, MAPSEP is only as good as the user and will quickly succumb to faulty user inputs.
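The error-covariance propagation such tools perform takes, in its linear discrete form, the familiar shape P' = Φ P Φᵀ + Q for state-transition matrix Φ and process-noise matrix Q. A minimal sketch with illustrative matrices (not MAPSEP's actual models):

```python
# Linear error-covariance propagation sketch: P' = Φ P Φᵀ + Q.
# The 2x2 matrices below (position/velocity toy state) are illustrative.
import numpy as np

def propagate_covariance(P, Phi, Q):
    """One propagation step of the state error covariance."""
    return Phi @ P @ Phi.T + Q

P = np.diag([1.0, 0.5])            # initial covariance
Phi = np.array([[1.0, 0.1],        # constant-velocity transition, dt = 0.1
                [0.0, 1.0]])
Q = 0.01 * np.eye(2)               # process noise
P_next = propagate_covariance(P, Phi, Q)
```

Covariance matrices stay symmetric under this update, a useful sanity check in any implementation.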

  10. Comparison of the analytical capabilities of the BAC Datamaster and Datamaster DMT forensic breath testing devices.

    PubMed

    Glinn, Michele; Adatsi, Felix; Curtis, Perry

    2011-11-01

    The State of Michigan uses the Datamaster as an evidential breath testing device. The newest version, the DMT, will replace current instruments in the field as they are retired from service. The Michigan State Police conducted comparison studies to test the analytical properties of the new instrument and to evaluate its response to conditions commonly cited in court defenses. The effects of mouth alcohol, objects in the mouth, and radiofrequency interference on paired samples from drinking subjects were assessed on the DMT. The effects of sample duration and chemical interferents were assessed on both instruments, using drinking subjects and wet-bath simulators, respectively. Our testing shows that Datamaster and DMT results are essentially identical; the DMT gave accurate readings as compared with measurements made using simulators containing standard ethanol solutions and that the DMT did not give falsely elevated breath alcohol results from any of the influences tested. © 2011 American Academy of Forensic Sciences.

  11. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    NASA Astrophysics Data System (ADS)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

Interactive data analytics are playing an increasingly vital role in generating new, critical insights into the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data-analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, and dynamic time warping; will support uncertainty visualization and the exploration of data provenance; and will support machine-learning discoveries, rendering diverse types of geospatial data and facilitating interactive analysis. Key components of the system architecture include NASA's WebWorldWind, the Globus toolkit, and PostgreSQL, as well as other custom-built software modules.
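Of the analytics listed, dynamic time warping has a compact textbook form: an O(n·m) recurrence over a pairwise cost matrix. A self-contained sketch (not ORNL's implementation):

```python
# Classic dynamic-time-warping (DTW) distance between two 1-D series,
# using the textbook O(n·m) dynamic-programming recurrence.
def dtw_distance(a, b):
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible alignments
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# a time-shifted copy of a series aligns at zero cost:
d = dtw_distance([0, 1, 2, 1, 0], [0, 0, 1, 2, 1, 0])
```

This elasticity to time shifts is what makes DTW useful for comparing, e.g., demand or streamflow curves whose features do not line up sample-for-sample.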

  12. INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS

    EPA Science Inventory

    A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...

  13. Anticipating Surprise: Analysis for Strategic Warning

    DTIC Science & Technology

    2002-12-01

Excerpts from the table of contents: Intentions versus Capabilities; Introduction to the Analytical Method; Specifics of the Analytical Method. Text excerpt: "Why is it that 'no one'—a slight but not great exaggeration—believes in the indications method, despite its demonstrably good record in these..."

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: IMMUNOASSAY KIT, ENVIROLOGIX, INC., PCB IN SOIL TUBE ASSAY

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCB's in soi...

  15. Probabilistic evaluation of on-line checks in fault-tolerant multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Nair, V. S. S.; Hoskote, Yatin V.; Abraham, Jacob A.

    1992-01-01

    The analysis of fault-tolerant multiprocessor systems that use concurrent error detection (CED) schemes is much more difficult than the analysis of conventional fault-tolerant architectures. Various analytical techniques have been proposed to evaluate CED schemes deterministically. However, these approaches are based on worst-case assumptions related to the failure of system components. Often, the evaluation results do not reflect the actual fault tolerance capabilities of the system. A probabilistic approach to evaluate the fault detecting and locating capabilities of on-line checks in a system is developed. The various probabilities associated with the checking schemes are identified and used in the framework of the matrix-based model. Based on these probabilistic matrices, estimates for the fault tolerance capabilities of various systems are derived analytically.
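One building block of such probabilistic evaluation: if check i detects a given fault with probability p_i and the checks act independently, the fault is detected with probability 1 - Π(1 - p_i). A sketch of that aggregate (the paper's matrix-based model additionally encodes fault location, which this omits):

```python
# Aggregate detection probability for independent on-line checks:
# P(detected) = 1 - Π(1 - p_i), where p_i is check i's coverage for
# the fault in question. Coverage values below are illustrative.
def detection_probability(coverages):
    miss = 1.0
    for p in coverages:
        miss *= (1.0 - p)     # probability every check misses the fault
    return 1.0 - miss

p = detection_probability([0.9, 0.8, 0.5])
```

Replacing worst-case (all-or-nothing) coverage with such probabilities is what lets the evaluation reflect actual, rather than pessimistic, fault-tolerance capability.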

  16. NGNP Data Management and Analysis System Analysis and Web Delivery Capabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cynthia D. Gentillon

    2011-09-01

Projects for the Very High Temperature Reactor (VHTR) Technology Development Office provide data in support of Nuclear Regulatory Commission licensing of the very high temperature reactor. Fuel and materials to be used in the reactor are tested and characterized to quantify performance in high-temperature and high-fluence environments. The NGNP Data Management and Analysis System (NDMAS) at the Idaho National Laboratory has been established to ensure that VHTR data are (1) qualified for use, (2) stored in a readily accessible electronic form, and (3) analyzed to extract useful results. This document focuses on the third NDMAS objective. It describes capabilities for displaying the data in meaningful ways and for data analysis to identify useful relationships among the measured quantities. The capabilities are described from the perspective of NDMAS users, starting with those who just view experimental data and analytical results on the INL NDMAS web portal. Web display and delivery capabilities are described in detail. Also the current web pages that show Advanced Gas Reactor, Advanced Graphite Capsule, and High Temperature Materials test results are itemized. Capabilities available to NDMAS developers are more extensive, and are described using a second series of examples. Much of the data analysis efforts focus on understanding how thermocouple measurements relate to simulated temperatures and other experimental parameters. Statistical control charts and correlation monitoring provide an ongoing assessment of instrument accuracy. Data analysis capabilities are virtually unlimited for those who use the NDMAS web data download capabilities and the analysis software of their choice. Overall, the NDMAS provides convenient data analysis and web delivery capabilities for studying a very large and rapidly increasing database of well-documented, pedigreed data.

  17. Immunochemistry for high-throughput screening of human exhaled breath condensate (EBC) media: implementation of automated Quanterix SIMOA instrumentation.

    PubMed

    Pleil, Joachim D; Angrish, Michelle M; Madden, Michael C

    2015-12-11

Immunochemistry is an important clinical tool for indicating biological pathways leading towards disease. Standard enzyme-linked immunosorbent assays (ELISA) are labor intensive and lack sensitivity at low-level concentrations. Here we report on emerging technology implementing fully-automated ELISA capable of molecular level detection and describe application to exhaled breath condensate (EBC) samples. The Quanterix SIMOA HD-1 analyzer was evaluated for analytical performance for inflammatory cytokines (IL-6, TNF-α, IL-1β and IL-8). The system was challenged with human EBC, the most dilute and analytically difficult of the biological media. Calibrations from synthetic samples and spiked EBC showed excellent linearity at trace levels (r² > 0.99). Sensitivities varied by analyte, but were robust, ranging from ~0.006 (IL-6) to ~0.01 (TNF-α) pg ml⁻¹. All analytes demonstrated response suppression when diluted with deionized water, and so assay buffer diluent was found to be a better choice. Analytical runs required ~45 min setup time for loading samples, reagents, calibrants, etc., after which the instrument performs without further intervention for up to 288 separate samples. Currently available kits are limited to single-plex analyses, and so sample volumes require adjustments. Sample dilutions should be made with assay diluent to avoid response suppression. Automation performs seamlessly, and data are automatically analyzed and reported in spreadsheet format. The internal 5-parameter logistic (5-PL) calibration model should be supplemented with a linear regression spline at the very lowest analyte levels (<1.3 pg ml⁻¹). The implementation of the automated Quanterix platform was successfully demonstrated using EBC, which poses the greatest challenge to ELISA due to limited sample volumes and low protein levels.
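The 5-parameter logistic calibration model mentioned above has a standard closed form. The sketch below is illustrative only (the parameter values are assumed, not taken from the study): it evaluates a 5-PL response curve and back-calculates concentration from a measured response, which is the inverse step an analyzer performs when reporting unknowns.

```python
# Hedged sketch of a 5-PL calibration curve and its inversion.
# Parameter values are invented for illustration.
def five_pl(x, a, b, c, d, g):
    """5-PL response: a = zero-dose asymptote, d = infinite-dose asymptote,
    c = inflection concentration, b = slope, g = asymmetry."""
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

def invert_five_pl(y, a, b, c, d, g):
    """Back-calculate concentration from a response strictly between a and d."""
    return c * (((a - d) / (y - d)) ** (1.0 / g) - 1.0) ** (1.0 / b)

params = dict(a=0.02, b=1.2, c=5.0, d=3.0, g=0.9)  # assumed values
x = 0.5  # pg/ml
y = five_pl(x, **params)
x_back = invert_five_pl(y, **params)
print(round(x_back, 6))  # recovers the input concentration
```

The abstract's point is that even a good 5-PL fit can bias back-calculation at the extreme low end of the curve, hence the recommendation to splice in a local linear fit below ~1.3 pg ml⁻¹.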

  18. Analytical Chemistry Developmental Work Using a 243Am Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Khalil J.; Stanley, Floyd E.; Porterfield, Donivan R.

    2015-02-24

This project seeks to reestablish our analytical capability to characterize Am bulk material and develop a reference material suitable for characterizing the purity and assay of 241Am oxide for industrial use. The tasks associated with this phase of the project included conducting initial separations experiments, developing thermal ionization mass spectrometry capability using the 243Am isotope as an isotope dilution spike, optimizing the spike for the determination of 241Pu-241Am radiochemistry, and, additionally, developing and testing a methodology which can detect trace to ultra-trace levels of Pu (both assay and isotopics) in bulk Am samples.

  19. Grade 8 students' capability of analytical thinking and attitude toward science through teaching and learning about soil and its pollution based on science technology and society (STS) approach

    NASA Astrophysics Data System (ADS)

    Boonprasert, Lapisarin; Tupsai, Jiraporn; Yuenyong, Chokchai

    2018-01-01

This study reported Grade 8 students' analytical thinking and attitude toward science in teaching and learning about soil and its pollution through a science technology and society (STS) approach. The participants were 36 Grade 8 students in Naklang, Nongbualumphu, Thailand. The teaching and learning about soil and its pollution through the STS approach was carried out for 6 weeks. The soil and its pollution unit was developed based on the framework of Yuenyong (2006), which consists of five stages: (1) identification of social issues, (2) identification of potential solutions, (3) need for knowledge, (4) decision-making, and (5) socialization. Students' analytical thinking and attitude toward science were collected during their learning by participant observation, an analytical thinking test, students' tasks, and journal writing. The findings revealed that students could develop their capability of analytical thinking. They demonstrated characteristics of analytical thinking such as classifying, comparing and contrasting, reasoning, interpreting, collecting data, and decision making. Students' journal writing reflected that the STS class on soil and its pollution motivated students. The paper discusses the implications of these findings for science teaching and learning through STS in Thailand.

  20. Generalization of Solovev’s approach to finding equilibrium solutions for axisymmetric plasmas with flow

    NASA Astrophysics Data System (ADS)

    M, S. CHU; Yemin, HU; Wenfeng, GUO

    2018-03-01

Solovev’s approach of finding equilibrium solutions was found to be extremely useful for generating a library of linearly superposable equilibria for the purpose of shaping studies. This set of solutions was subsequently expanded to include the vacuum solutions of Zheng, Wootton and Solano, resulting in a set of functions {SOLOVEV_ZWS} that is usually used for all toroidally symmetric plasmas, commonly recognized as being able to accommodate any desired plasma shape (complete shaping capability). The possibility of extending the Solovev approach to toroidal equilibria with a general plasma flow is examined theoretically. We found that the only meaningful extension is to plasmas with a pure toroidal rotation at a constant Mach number. We also show that the simplification ansatz made to the current profiles, which was the basis of the Solovev approach, should be applied more systematically to include an internal boundary condition at the magnetic axis, resulting in a modified and more useful set {SOLOVEV_ZWSm}. Explicit expressions for the functions in this set are given for equilibria with a quasi-constant current density profile, a toroidal flow at constant Mach number, and specific heat ratio 1. The properties of {SOLOVEV_ZWSm} are studied analytically. Numerical examples of achievable equilibria are demonstrated. Although the shaping capability of the set {SOLOVEV_ZWSm} is quite extensive, it nevertheless still does not have complete shaping capability, particularly for plasmas with negative-curvature points on the plasma boundary such as doublets or indented bean-shaped tokamaks.
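For background on why Solovev-type solutions superpose, here is a standard textbook sketch (not taken from this paper, which generalizes the equation to include toroidal flow):

```latex
% Static Grad--Shafranov equation for the poloidal flux \psi(R,Z):
\Delta^{*}\psi \equiv
  R\,\frac{\partial}{\partial R}\!\left(\frac{1}{R}\,\frac{\partial\psi}{\partial R}\right)
  + \frac{\partial^{2}\psi}{\partial Z^{2}}
  = -\mu_{0}R^{2}\,\frac{dp}{d\psi} - F\,\frac{dF}{d\psi}.
% Solovev's ansatz takes dp/d\psi and F\,dF/d\psi constant, so the
% right-hand side depends only on R and the equation becomes linear in
% \psi: any linear combination of particular solutions is again a
% solution, which is what enables shaping studies by superposition.
```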

  1. Analysis of Environmental Contamination resulting from Catastrophic Incidents: Part two: Building Laboratory Capability by Selecting and Developing Analytical Methodologies

    EPA Science Inventory

    Catastrophic incidents can generate a large number of samples with analytically diverse types including forensic, clinical, environmental, food, and others. Environmental samples include water, wastewater, soil, air, urban building and infrastructure materials, and surface resid...

  2. Three lessons for genetic toxicology from baseball analytics.

    PubMed

    Dertinger, Stephen D

    2017-07-01

In many respects the evolution of baseball statistics mirrors advances made in the field of genetic toxicology. From its inception, baseball and statistics have been inextricably linked. Generations of players and fans have used a number of relatively simple measurements to describe team and individual player's current performance, as well as for historical record-keeping purposes. Over the years, baseball analytics has progressed in several important ways. Early advances were based on deriving more meaningful metrics from simpler forerunners. Now, technological innovations are delivering much deeper insights. Videography, radar, and other advances that include automatic player recognition capabilities provide the means to measure more complex and useful factors. Fielders' reaction times, efficiency of the route taken to reach a batted ball, and pitch-framing effectiveness come to mind. With the current availability of complex measurements from multiple data streams, multifactorial analyses occurring via machine learning algorithms have become necessary to make sense of the terabytes of data that are now being captured in every Major League Baseball game. Collectively, these advances have transformed baseball statistics from being largely descriptive in nature to serving data-driven, predictive roles. Whereas genetic toxicology has charted a somewhat parallel course, a case can be made that greater utilization of baseball's mindset and strategies would serve our scientific field well. This paper describes three useful lessons for genetic toxicology, courtesy of the field of baseball analytics: seek objective knowledge; incorporate multiple data streams; and embrace machine learning. Environ. Mol. Mutagen. 58:390-397, 2017. © 2017 Wiley Periodicals, Inc.

  3. Adulterants in Urine Drug Testing.

    PubMed

    Fu, S

    Urine drug testing plays an important role in monitoring licit and illicit drug use for both medico-legal and clinical purposes. One of the major challenges of urine drug testing is adulteration, a practice involving manipulation of a urine specimen with chemical adulterants to produce a false negative test result. This problem is compounded by the number of easily obtained chemicals that can effectively adulterate a urine specimen. Common adulterants include some household chemicals such as hypochlorite bleach, laundry detergent, table salt, and toilet bowl cleaner and many commercial products such as UrinAid (glutaraldehyde), Stealth® (containing peroxidase and peroxide), Urine Luck (pyridinium chlorochromate, PCC), and Klear® (potassium nitrite) available through the Internet. These adulterants can invalidate a screening test result, a confirmatory test result, or both. To counteract urine adulteration, drug testing laboratories have developed a number of analytical methods to detect adulterants in a urine specimen. While these methods are useful in detecting urine adulteration when such activities are suspected, they do not reveal what types of drugs are being concealed. This is particularly the case when oxidizing urine adulterants are involved as these oxidants are capable of destroying drugs and their metabolites in urine, rendering the drug analytes undetectable by any testing technology. One promising approach to address this current limitation has been the use of unique oxidation products formed from reaction of drug analytes with oxidizing adulterants as markers for monitoring drug misuse and urine adulteration. This novel approach will ultimately improve the effectiveness of the current urine drug testing programs. © 2016 Elsevier Inc. All rights reserved.

  4. Overview: MURI Center on spectroscopic and time domain detection of trace explosives in condensed and vapor phases

    NASA Astrophysics Data System (ADS)

    Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay

    2003-09-01

The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing of explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include Terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring Down Spectroscopy (CRDS) and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to detection of explosives in condensed phases such as adsorbed species in soil or can be used for vapor phase detection above the source. Some techniques allow for remote detection while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental, technical issues associated with trace detection of explosive related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore simultaneous application of two or more techniques to assess the benefits of sensor fusion.

  5. Can cloud point-based enrichment, preservation, and detection methods help to bridge gaps in aquatic nanometrology?

    PubMed

    Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan

    2016-11-01

Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances. Size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes of the environmental conditions during sampling and sample preparation. This presents a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these unstable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.

  6. Correlation study of theoretical and experimental results for spin tests of a 1/10 scale radio control model

    NASA Technical Reports Server (NTRS)

    Bihrle, W., Jr.

    1976-01-01

A correlation study was conducted to determine the ability of current analytical spin prediction techniques to predict the flight motions of a current fighter airplane configuration during the spin entry, the developed spin, and the spin recovery motions. The airplane math model used aerodynamics measured on an exact replica of the flight test model using conventional static and forced-oscillation wind-tunnel test techniques and a recently developed rotation-balance test apparatus capable of measuring aerodynamics under steady spinning conditions. An attempt was made to predict the flight motions measured during stall/spin flight testing of an unpowered, radio-controlled model designed to be a 1/10 scale, dynamically-scaled model of a current fighter configuration. Comparison of the predicted and measured flight motions shows that while the post-stall and spin entry motions were not well predicted, the developed spinning motion (a steady flat spin) and the initial phases of the spin recovery motion were reasonably well predicted.

  7. An Examination of Advisor Concerns in the Era of Academic Analytics

    ERIC Educational Resources Information Center

    Daughtry, Jeremy J.

    2017-01-01

    Performance-based funding models are increasingly becoming the norm for many institutions of higher learning. Such models place greater emphasis on student retention and success metrics, for example, as requirements for receiving state appropriations. To stay competitive, universities have adopted academic analytics technologies capable of…

  8. Analytical Capability of Defocused µ-SORS in the Chemical Interrogation of Thin Turbid Painted Layers

    PubMed Central

    Realini, Marco; Botteon, Alessandra; Colombo, Chiara; Noll, Sarah; Elliott, Stephen R.; Matousek, Pavel

    2016-01-01

    A recently developed micrometer-scale spatially offset Raman spectroscopy (μ-SORS) method provides a new analytical capability for investigating non-destructively the chemical composition of sub-surface, micrometer-scale thickness, diffusely scattering layers at depths beyond the reach of conventional confocal Raman microscopy. Here, we demonstrate experimentally, for the first time, the capability of μ-SORS to determine whether two detected chemical components originate from two separate layers or whether the two components are mixed together in a single layer. Such information is important in a number of areas, including conservation of cultural heritage objects, and is not available, for highly turbid media, from conventional Raman microscopy, where axial (confocal) scanning is not possible due to an inability to facilitate direct imaging within the highly scattering sample. This application constitutes an additional capability for μ-SORS in addition to its basic capacity to determine the overall chemical make-up of layers in a turbid system. PMID:26767641

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hohimer, J.P.

    The use of laser-based analytical methods in nuclear-fuel processing plants is considered. The species and locations for accountability, process control, and effluent control measurements in the Coprocessing, Thorex, and reference Purex fuel processing operations are identified and the conventional analytical methods used for these measurements are summarized. The laser analytical methods based upon Raman, absorption, fluorescence, and nonlinear spectroscopy are reviewed and evaluated for their use in fuel processing plants. After a comparison of the capabilities of the laser-based and conventional analytical methods, the promising areas of application of the laser-based methods in fuel processing plants are identified.

  10. MIiSR: Molecular Interactions in Super-Resolution Imaging Enables the Analysis of Protein Interactions, Dynamics and Formation of Multi-protein Structures.

    PubMed

    Caetano, Fabiana A; Dirk, Brennan S; Tam, Joshua H K; Cavanagh, P Craig; Goiko, Maria; Ferguson, Stephen S G; Pasternak, Stephen H; Dikeakos, Jimmy D; de Bruyn, John R; Heit, Bryan

    2015-12-01

    Our current understanding of the molecular mechanisms which regulate cellular processes such as vesicular trafficking has been enabled by conventional biochemical and microscopy techniques. However, these methods often obscure the heterogeneity of the cellular environment, thus precluding a quantitative assessment of the molecular interactions regulating these processes. Herein, we present Molecular Interactions in Super Resolution (MIiSR) software which provides quantitative analysis tools for use with super-resolution images. MIiSR combines multiple tools for analyzing intermolecular interactions, molecular clustering and image segmentation. These tools enable quantification, in the native environment of the cell, of molecular interactions and the formation of higher-order molecular complexes. The capabilities and limitations of these analytical tools are demonstrated using both modeled data and examples derived from the vesicular trafficking system, thereby providing an established and validated experimental workflow capable of quantitatively assessing molecular interactions and molecular complex formation within the heterogeneous environment of the cell.
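As a toy illustration of the molecular-clustering analysis described above (a simplified stand-in, not the MIiSR algorithm; the function names and linking radius are invented), one can group 2-D localization coordinates by transitive radius-linking:

```python
# Hypothetical sketch: cluster super-resolution localizations by linking
# any two points closer than a chosen radius, then taking the transitive
# closure via breadth-first search. Illustrative only; not MIiSR's API.
from collections import deque

def cluster(points, radius):
    """Return a cluster label for each (x, y) point."""
    r2, n = radius * radius, len(points)
    labels, next_label = [None] * n, 0
    for seed in range(n):
        if labels[seed] is not None:
            continue
        labels[seed] = next_label
        queue = deque([seed])
        while queue:
            i = queue.popleft()
            xi, yi = points[i]
            for j in range(n):          # O(n^2); fine for a sketch
                if labels[j] is None:
                    xj, yj = points[j]
                    if (xi - xj) ** 2 + (yi - yj) ** 2 <= r2:
                        labels[j] = next_label
                        queue.append(j)
        next_label += 1
    return labels

pts = [(0, 0), (0.5, 0.2), (0.3, 0.6), (10, 10), (10.4, 9.8)]
print(cluster(pts, radius=1.0))  # -> [0, 0, 0, 1, 1]
```

Real super-resolution cluster analysis adds density thresholds and noise rejection, but the grouping step is the same idea.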

  11. 75 FR 49930 - Stakeholder Meeting Regarding Re-Evaluation of Currently Approved Total Coliform Analytical Methods

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-16

    ... Currently Approved Total Coliform Analytical Methods AGENCY: Environmental Protection Agency (EPA). ACTION... of currently approved Total Coliform Rule (TCR) analytical methods. At these meetings, stakeholders will be given an opportunity to discuss potential elements of a method re-evaluation study, such as...

  12. Shear joint capability versus bolt clearance

    NASA Technical Reports Server (NTRS)

    Lee, H. M.

    1992-01-01

The results of a conservative analysis approach to determining shear joint strength capability for typical space-flight hardware as a function of the bolt-hole clearance specified in the design are presented. These joints comprise high-strength steel fasteners and abutments constructed of aluminum alloys familiar to the aerospace industry. A general analytical expression was first derived relating bolt-hole clearance to the bolt shear load required to place all joint fasteners into a shear-transferring position. Extension of this work allowed the analytical development of joint load capability as a function of the number of fasteners, shear strength of the bolt, bolt-hole clearance, and the desired factor of safety. Analysis results clearly indicate that a typical space-flight hardware joint can withstand significant loading when less than ideal bolt-hole clearances are used in the design.
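The abstract does not reproduce the paper's expressions, so the following is only a simplified, hypothetical sketch of the final relation it describes: allowable joint load as a function of fastener count, bolt shear strength, bolt diameter, and factor of safety, assuming all bolts have already been brought into bearing (the paper's clearance-dependent terms are omitted).

```python
# Simplified sketch of shear-joint capability; NOT the paper's derivation.
# All numerical values below are illustrative assumptions.
import math

def bolt_shear_area(d):
    """Single-shear area of a bolt of diameter d."""
    return math.pi * d ** 2 / 4.0

def joint_capability(n_bolts, f_su, d, factor_of_safety):
    """Allowable joint shear load, assuming equal load sharing among
    n_bolts once clearance is taken up; f_su is ultimate shear strength."""
    return n_bolts * f_su * bolt_shear_area(d) / factor_of_safety

# 6 bolts, 0.25 in diameter, 95 ksi ultimate shear strength, FS = 1.4
print(round(joint_capability(6, 95000.0, 0.25, 1.4)))  # allowable load, lbf
```

The paper's contribution is precisely the part this sketch leaves out: how much of the applied load is consumed taking up bolt-hole clearance before all fasteners share shear.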

  13. The Synergistic Engineering Environment

    NASA Technical Reports Server (NTRS)

    Cruz, Jonathan

    2006-01-01

The Synergistic Engineering Environment (SEE) is a system of software dedicated to aiding the understanding of space mission operations. The SEE can integrate disparate sets of data with analytical capabilities, geometric models of spacecraft, and a visualization environment, all contributing to the creation of an interactive simulation of spacecraft. Initially designed to satisfy needs pertaining to the International Space Station, the SEE has been broadened in scope to include spacecraft ranging from those in low orbit around the Earth to those on deep-space missions. The SEE includes analytical capabilities in rigid-body dynamics, kinematics, orbital mechanics, and payload operations. These capabilities enable a user to perform real-time interactive engineering analyses focusing on diverse aspects of operations, including flight attitudes and maneuvers, docking of visiting spacecraft, robotic operations, impingement of spacecraft-engine exhaust plumes, obscuration of instrumentation fields of view, communications, and alternative assembly configurations.

  14. Size separation of analytes using monomeric surfactants

    DOEpatents

    Yeung, Edward S.; Wei, Wei

    2005-04-12

A sieving medium for use in the separation of analytes in a sample containing at least one such analyte comprises a monomeric non-ionic surfactant of the general formula B-A, wherein A is a hydrophilic moiety and B is a hydrophobic moiety, present in a solvent at a concentration forming a self-assembled micelle configuration under selected conditions and having an aggregation number providing an equivalent weight capable of effecting the size separation of the sample solution so as to resolve a target analyte(s) in a solution containing the same, the size separation taking place in a chromatography or electrophoresis separation system.

  15. Capabilities for Intercultural Dialogue

    ERIC Educational Resources Information Center

    Crosbie, Veronica

    2014-01-01

    The capabilities approach offers a valuable analytical lens for exploring the challenge and complexity of intercultural dialogue in contemporary settings. The central tenets of the approach, developed by Amartya Sen and Martha Nussbaum, involve a set of humanistic goals including the recognition that development is a process whereby people's…

  16. Design and testing of a caseless solid-fuel integral-rocket ramjet engine for use in small tactical missiles

    NASA Astrophysics Data System (ADS)

    Fruge, Keith J.

    1991-09-01

An investigation was conducted to determine the feasibility of a low-cost, caseless, solid-fuel integral rocket ramjet (IRSFRJ) that has no ejecta. Analytical design of a ramjet-powered air-to-ground missile capable of being fired from a remotely piloted vehicle or helicopter was accomplished using current JANNAF and Air Force computer codes. The results showed that an IRSFRJ-powered missile can exceed the velocity and range of current systems by more than a two-to-one ratio, without an increase in missile length and weight. A caseless IRSFRJ with a nonejecting port cover was designed and tested. The experimental results of the static tests showed that a low-cost, caseless IRSFRJ with a nonejectable port cover is a viable design. Rocket-ramjet transition was demonstrated and ramjet ignition was found to be insensitive to the booster tail-off-to-air-injection timing sequence.

  17. Firing patterns in the adaptive exponential integrate-and-fire model.

    PubMed

    Naud, Richard; Marcille, Nicolas; Clopath, Claudia; Gerstner, Wulfram

    2008-11-01

For simulations of large spiking neuron networks, an accurate, simple and versatile single-neuron modeling framework is required. Here we explore the versatility of a simple two-equation model: the adaptive exponential integrate-and-fire neuron. We show that this model generates multiple firing patterns depending on the choice of parameter values, and present a phase diagram describing the transition from one firing type to another. We give an analytical criterion to distinguish between continuous adaptation, initial bursting, regular bursting and two types of tonic spiking. Also, we report that the deterministic model is capable of producing irregular spiking when stimulated with constant current, indicating low-dimensional chaos. Lastly, the simple model is fitted to real experiments of cortical neurons under step current stimulation. The results provide support for the suitability of simple models such as the adaptive exponential integrate-and-fire neuron for large network simulations.
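The two-equation model described above is straightforward to integrate numerically. The sketch below uses forward Euler with commonly cited illustrative parameter values (assumptions for demonstration, not the paper's fitted values) and the standard AdEx reset rule: on crossing a peak voltage, the membrane potential resets and the adaptation current jumps by b.

```python
# Minimal forward-Euler sketch of the adaptive exponential
# integrate-and-fire (AdEx) neuron. Parameters are illustrative.
import math

def simulate_adex(I=1e-9, T=0.5, dt=1e-5):
    """Return spike times (s) for a constant input current I (A)."""
    C, gL, EL = 281e-12, 30e-9, -70.6e-3      # capacitance, leak conductance/reversal
    VT, DT = -50.4e-3, 2e-3                   # threshold, slope factor
    tau_w, a, b = 144e-3, 4e-9, 0.0805e-9     # adaptation time constant, coupling, jump
    V_peak, V_reset = 0.0, EL
    V, w, spikes = EL, 0.0, []
    for step in range(int(T / dt)):
        dV = (-gL * (V - EL) + gL * DT * math.exp((V - VT) / DT) - w + I) / C
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:                       # spike: reset voltage, bump adaptation
            V = V_reset
            w += b
            spikes.append(step * dt)
    return spikes

spike_times = simulate_adex()
print(len(spike_times))  # number of spikes in 0.5 s of constant current
```

Changing a, b, tau_w and the reset voltage moves the model between the firing regimes the abstract enumerates (tonic, adapting, bursting); production code would use a smaller dt or an adaptive integrator near the exponential upswing.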

  18. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  19. Shifting from Stewardship to Analytics of Massive Science Data

    NASA Astrophysics Data System (ADS)

    Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.

    2015-12-01

Currently, the analysis of large data collections is executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Data collection, archiving and analysis from future remote sensing missions, be it from earth science satellites, planetary robotic missions, or massive radio observatories, may not scale as more capable instruments stress existing architectural approaches and systems due to more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed in order to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, analysis, etc. are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase the size of the data, requiring new methodological approaches to data analytics in which users can more effectively interact with the data and apply automated mechanisms for data reduction and fusion across these massive data repositories. This presentation will discuss architecture, use cases, and approaches for developing a big data analytics strategy across multiple science disciplines.

  20. Analysis of a variety of inorganic and organic additives in food products by ion-pairing liquid chromatography coupled to high-resolution mass spectrometry.

    PubMed

    Kaufmann, Anton; Widmer, Mirjam; Maden, Kathryn; Butcher, Patrick; Walker, Stephan

    2018-03-05

A reversed-phase ion-pairing chromatographic method was developed for the detection and quantification of inorganic and organic anionic food additives. A single-stage high-resolution mass spectrometer (Orbitrap) was used to detect the accurate masses of the unfragmented analyte ions. The developed ion-pairing chromatography method was based on a dibutylamine/hexafluoro-2-propanol buffer. Dibutylamine can be charged to serve as a chromatographic ion-pairing agent. This ensures sufficient retention of inorganic and organic anions. Yet, unlike quaternary amines, it can be de-charged in the electrospray to prevent the formation of neutral analyte ion-pairing agent adducts. This process is significantly facilitated by the added hexafluoro-2-propanol. This approach permits the sensitive detection and quantification of additives like nitrate and mono-, di-, and triphosphate as well as citric acid, a number of artificial sweeteners like cyclamate and aspartame, flavor enhancers like glutamate, and preservatives like sorbic acid. This is a major advantage, since the analytical methods currently utilized in food safety laboratories are only capable of monitoring a few compounds or a particular category of food additives. Graphical abstract: Deprotonation of the ion-pairing agent in the electrospray interface.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romanov, Gennady

Typically the RFQs are designed using Parmteq, DesRFQ and other similar specialized codes, which produce files containing the field and geometrical parameters for every cell. The beam dynamic simulations with these analytical fields are, of course, ideal realizations of the designed RFQs. New advanced computing capabilities have made it possible to simulate beam and even dark current in the realistic 3D electromagnetic fields of RFQs that may reflect cavity tuning, presence of tuners and couplers, RFQ segmentation, etc. The paper describes the utilization of full 3D field distributions obtained with CST Studio Suite for beam dynamic simulations using both the PIC solver of CST Particle Studio and the beam dynamics code TRACK.

  2. The Ozone Widget Framework: towards modularity of C2 human interfaces

    NASA Astrophysics Data System (ADS)

    Hellar, David Benjamin; Vega, Laurian C.

    2012-05-01

    The Ozone Widget Framework (OWF) is a common webtop environment for distribution across the enterprise. A key mission driver for OWF is to enable rapid capability delivery by lowering time-to-market with lightweight components. OWF has been released as Government Open Source Software and has been deployed in a variety of C2 net-centric contexts, ranging from real-time analytics and cyber situational awareness to strategic and operational planning. This paper discusses the current and future evolution of OWF, including the availability of the OZONE Marketplace (OMP), user-activity-driven metrics, and architecture enhancements for accessibility. OWF is moving towards the rapid delivery of modular human interfaces supporting modern and future command and control contexts.

  3. Optimal guidance law development for an advanced launch system

    NASA Technical Reports Server (NTRS)

    Calise, Anthony J.; Hodges, Dewey H.; Leung, Martin S.; Bless, Robert R.

    1991-01-01

    The proposed investigation of a Matched Asymptotic Expansion (MAE) method was carried out. It was concluded that the method of MAE is not applicable to launch vehicle ascent trajectory optimization due to the lack of a suitable stretched variable. More work was done on the earlier regular perturbation approach, using a piecewise analytic zeroth-order solution to generate a more accurate approximation. In the meantime, a singular perturbation approach using manifold theory is currently under investigation. Work on a general computational environment based on the use of MACSYMA and the weak Hamiltonian finite element method continued during this period. This methodology is capable of solving a large class of optimal control problems.

  4. The CORSAGE Programme: Continuous Orbital Remote Sensing of Archipelagic Geochemical Effects

    NASA Technical Reports Server (NTRS)

    Acker, J. G.; Brown, C. W.; Hine, A. C.

    1997-01-01

    Current and pending oceanographic remote sensing technology allows the conceptualization of a programme designed to investigate ocean island interactions that could induce short-term nearshore fluxes of particulate organic carbon and biogenic calcium carbonate from pelagic island archipelagoes. These events will influence the geochemistry of adjacent waters, particularly the marine carbon system. Justification and design are provided for a study that would combine oceanographic satellite remote sensing (visible and infrared radiometry, altimetry and scatterometry) with shore-based facilities. A programme incorporating the methodology outlined here would seek to identify the mechanisms that cause such events, assess their geochemical significance, and provide both analytical and predictive capabilities for observations on greater temporal and spatial scales.

  5. Dynamic thermoregulation of the sample in flow cytometry.

    PubMed

    Graves, Steven W; Habbersett, Robert C; Nolan, John P

    2002-05-01

    Fine control of temperature is an important capability for any analytical platform. A circulating water bath has been the traditional means of maintaining constant temperature in the sample chamber of a flow cytometer, but this approach does not permit rapid changes in sample temperature. This unit explains the use of Peltier modules for regulation of sample temperature. The heat pumping generated by the passage of current through properly matched semiconductors, known as the Peltier effect, makes it possible for these thermoelectric modules to both heat and cool. The authors describe the construction of a Peltier module based thermoregulation unit in step-by-step detail and present a demonstration of flow cytometry measurements as a function of temperature.
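The heat pumping described above is commonly modeled as a balance of three terms: Peltier pumping proportional to the drive current, Joule heating, and back-conduction from the hot side. The sketch below uses this standard textbook model with assumed round-number module parameters, not measurements from the cited thermoregulation unit:

```python
def peltier_cooling_power(current, t_cold, t_hot,
                          seebeck=0.05, resistance=1.5, conductance=0.02):
    """Net heat pumped from the cold side of a Peltier module, in watts.

    Textbook model (module parameters here are assumed, illustrative values):
        Q_c = alpha*I*T_c - 0.5*I**2*R - K*(T_h - T_c)
    i.e. Peltier pumping minus half the Joule heating minus the heat
    conducted back from the hot side.
    """
    return (seebeck * current * t_cold
            - 0.5 * current**2 * resistance
            - conductance * (t_hot - t_cold))

# The same module heats or cools depending on the drive current:
print(peltier_cooling_power(2.0, 285.0, 300.0))   # positive: net cooling
print(peltier_cooling_power(20.0, 285.0, 300.0))  # negative: Joule heating wins
```

This sign change is why a current-controlled Peltier stage can both heat and cool the sample chamber, which a circulating water bath cannot do quickly.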

  6. Digital barcodes of suspension array using laser induced breakdown spectroscopy

    PubMed Central

    He, Qinghua; Liu, Yixi; He, Yonghong; Zhu, Liang; Zhang, Yilong; Shen, Zhiyuan

    2016-01-01

    We present a coding method for suspension arrays based on laser-induced breakdown spectroscopy (LIBS), which promotes the barcodes from analog to digital. As the foundation of the digital optical barcodes, nanocrystal-encoded microspheres are prepared with a self-assembly encapsulation method. We confirm that digital multiplexing with the LIBS-based coding method becomes feasible, since the microspheres can be coded with directly read-out wavelength data, and the method avoids fluorescence-signal crosstalk between barcodes and analyte tags, which leads to overall advantages in accuracy and stability over current fluorescent multicolor coding methods. This demonstration increases the capability of multiplexed detection and accurate filtering, expanding the applications of suspension arrays in the life sciences. PMID:27808270

  7. Electronic cooling design and test validation

    NASA Astrophysics Data System (ADS)

    Murtha, W. B.

    1983-07-01

    An analytical computer model has been used to design a counterflow air-cooled heat exchanger according to the cooling, structural, and geometric requirements of a U.S. Navy shipboard electronics cabinet, emphasizing high-reliability performance through the maintenance of electronic component junction temperatures below 110 C. Environmental testing of the resulting design verified that the analytical predictions were conservative. Correlation of the model to the test data furnishes an upgraded capability for the evaluation of tactical effects, and has established a growth potential of two orders of magnitude for increased electronics capabilities through enhanced heat dissipation. Electronics cabinets of this type are destined for use with Vertical Launching System-type combatant vessel magazines.

  8. Collaborative human-machine analysis to disambiguate entities in unstructured text and structured datasets

    NASA Astrophysics Data System (ADS)

    Davenport, Jack H.

    2016-05-01

    Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
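To make the disambiguation step concrete, here is a deliberately simple sketch that clusters entity mentions by surface-string similarity. It is a toy stand-in for the machine-learning resolution the abstract describes, not DAC's actual algorithm; the function name and threshold are invented for this example. In the hybrid workflow, each proposed merge would be shown to an analyst for validation:

```python
from difflib import SequenceMatcher

def resolve_entities(mentions, threshold=0.85):
    """Greedily cluster mention strings whose similarity to a cluster's
    representative (its first member) exceeds the threshold."""
    clusters = []
    for m in mentions:
        for cluster in clusters:
            if SequenceMatcher(None, m.lower(), cluster[0].lower()).ratio() >= threshold:
                cluster.append(m)   # candidate merge, pending analyst review
                break
        else:
            clusters.append([m])    # no match: start a new entity
    return clusters

mentions = ["John A. Smith", "John A Smith", "J. Smith", "Mary Jones"]
print(resolve_entities(mentions))
```

Real systems replace the string-ratio test with learned models over attributes and relationships, but the cluster-then-review loop is the same.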

  10. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    NASA Astrophysics Data System (ADS)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated, data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available, along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  11. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records, helping analysts rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools and supports robust export capabilities, so that selected data can be incorporated into other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and make trial-and-error guesses about the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  12. Introducing Chemometrics to the Analytical Curriculum: Combining Theory and Lab Experience

    ERIC Educational Resources Information Center

    Gilbert, Michael K.; Luttrell, Robert D.; Stout, David; Vogt, Frank

    2008-01-01

    Beer's law is an ideal technique that works only in certain situations. A method for dealing with more complex conditions needs to be integrated into the analytical chemistry curriculum. For that reason, the capabilities and limitations of two common chemometric algorithms, classical least squares (CLS) and principal component regression (PCR),…
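The two chemometric algorithms named in the abstract can be sketched in a few lines of NumPy on synthetic Beer's-law data. All spectra and concentrations below are invented for illustration; this is a minimal sketch of CLS and PCR, not the lab procedure from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: 2 analytes, 50 wavelengths, 8 mixtures
# obeying Beer's law A = C @ K, plus a little measurement noise.
K_true = np.abs(rng.normal(size=(2, 50)))            # pure-component spectra
C_cal = rng.uniform(0.1, 1.0, size=(8, 2))           # known concentrations
A_cal = C_cal @ K_true + 1e-3 * rng.normal(size=(8, 50))

# Classical least squares (CLS): estimate the pure spectra K, then invert.
K_hat = np.linalg.lstsq(C_cal, A_cal, rcond=None)[0]

def cls_predict(A):
    return np.linalg.lstsq(K_hat.T, A.T, rcond=None)[0].T

# Principal component regression (PCR): project centred spectra onto the
# leading principal components, then regress concentrations on the scores.
A_mean, C_mean = A_cal.mean(axis=0), C_cal.mean(axis=0)
_, _, Vt = np.linalg.svd(A_cal - A_mean, full_matrices=False)
P = Vt[:2]                                           # keep 2 components
B = np.linalg.lstsq((A_cal - A_mean) @ P.T, C_cal - C_mean, rcond=None)[0]

def pcr_predict(A):
    return C_mean + (A - A_mean) @ P.T @ B

A_new = np.array([[0.3, 0.6]]) @ K_true              # "unknown" mixture
print(cls_predict(A_new))                            # both estimates land
print(pcr_predict(A_new))                            # near [0.3, 0.6]
```

The practical difference the curriculum point turns on: CLS needs every absorbing species in the calibration model, while PCR only needs the principal components to span the spectral variation, which is why it copes better with the "more complex conditions" mentioned above.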

  13. Capabilities of Direct Sample Introduction - Comprehensive Two-Dimensional Gas Chromatography-Time-of-Flight Mass Spectrometry to Analyze Organic Chemicals of Interest in Fish Oils

    USDA-ARS?s Scientific Manuscript database

    Most analytical methods for persistent organic pollutants (POPs) focus on targeted analytes. Therefore, analysis of multiple classes of POPs typically entails several sample preparations, fractionations, and injections, whereas other chemicals of possible interest are neglected. To analyze a wider...

  14. Fire behavior modeling-a decision tool

    Treesearch

    Jack Cohen; Bill Bradshaw

    1986-01-01

    The usefulness of an analytical model as a fire management decision tool is determined by the correspondence of its descriptive capability to the specific decision context. Fire managers must determine the usefulness of fire models as a decision tool when applied to varied situations. Because the wildland fire phenomenon is complex, analytical fire spread models will...

  15. Give Me a Customizable Dashboard: Personalized Learning Analytics Dashboards in Higher Education

    ERIC Educational Resources Information Center

    Roberts, Lynne D.; Howell, Joel A.; Seaman, Kristen

    2017-01-01

    With the increased capability of learning analytics in higher education, more institutions are developing or implementing student dashboards. Despite the emergence of dashboards as an easy way to present data to students, students have had limited involvement in the dashboard development process. As part of a larger program of research examining…

  16. Investigation of practical applications of H infinity control theory to the design of control systems for large space structures

    NASA Technical Reports Server (NTRS)

    Irwin, R. Dennis

    1988-01-01

    The applicability of H infinity control theory to the problems of large space structure (LSS) control was investigated. A complete evaluation of any technique as a candidate for large space structure control involves analytical evaluation, algorithmic evaluation, evaluation via simulation studies, and experimental evaluation. The results of the analytical and algorithmic evaluations are documented. The analytical evaluation involves determining the appropriateness of the underlying assumptions inherent in H infinity theory, determining the capability of H infinity theory to achieve the design goals likely to be imposed on an LSS control design, and identifying any LSS-specific simplifications or complications of the theory. The results of the analytical evaluation are presented in the form of a tutorial on H infinity control theory with the LSS control designer in mind. The algorithmic evaluation of H infinity for LSS control pertains to the identification of general, high-level algorithms for effecting the application of H infinity to LSS control problems, the identification of specific, numerically reliable algorithms necessary for a computer implementation of the general algorithms, the recommendation of a flexible software system for implementing the H infinity design steps, and ultimately the actual development of the necessary computer codes. Finally, the state of the art in H infinity applications is summarized, with a brief outline of the most promising areas of current research.

  17. Characterization of spacecraft humidity condensate

    NASA Technical Reports Server (NTRS)

    Muckle, Susan; Schultz, John R.; Sauer, Richard L.

    1994-01-01

    When construction of Space Station Freedom reaches the Permanent Manned Capability (PMC) stage, the Water Recovery and Management Subsystem will be fully operational such that (distilled) urine, spent hygiene water, and humidity condensate will be reclaimed to provide water of potable quality. The reclamation technologies currently baselined to process these waste waters include adsorption, ion exchange, catalytic oxidation, and disinfection. To ensure that the baseline technologies will be able to effectively remove those compounds presenting a health risk to the crew, the National Research Council has recommended that additional information be gathered on specific contaminants in waste waters representative of those to be encountered on the Space Station. With the application of new analytical methods and the analysis of waste water samples more representative of the Space Station environment, advances in the identification of the specific contaminants continue to be made. Efforts by the Water and Food Analytical Laboratory at JSC were successful in enlarging the database of contaminants in humidity condensate. These efforts have not only included the chemical characterization of condensate generated during ground-based studies, but most significantly the characterization of cabin and Spacelab condensate generated during Shuttle missions. The analytical results presented in this paper will be used to show how the composition of condensate varies amongst enclosed environments and thus the importance of collecting condensate from an environment close to that of the proposed Space Station. Although advances were made in the characterization of space condensate, complete characterization, particularly of the organics, requires further development of analytical methods.

  18. Gas chromatography coupled to tunable pulsed glow discharge time-of-flight mass spectrometry for environmental analysis.

    PubMed

    Solà-Vázquez, Auristela; Lara-Gonzalo, Azucena; Costa-Fernández, José M; Pereiro, Rosario; Sanz-Medel, Alfredo

    2010-05-01

    A tuneable microsecond-pulsed direct-current glow discharge (GD) time-of-flight mass spectrometer (MS(TOF)) developed in our laboratory was coupled to a gas chromatograph (GC) to obtain sequential collection of mass spectra, at the different temporal regimes occurring in the GD pulses, during elution of the analytes. The capabilities of this set-up were explored using a mixture of volatile organic compounds of environmental concern: BrClCH, Cl(3)CH, Cl(4)C, BrCl(2)CH, Br(2)ClCH, Br(3)CH. The experimental parameters of the GC-pulsed GD-MS(TOF) prototype were optimized in order to appropriately separate and analyze the six selected organic compounds, and two GC carrier gases, helium and nitrogen, were evaluated. Mass spectra for all analytes were obtained in the prepeak, plateau, and afterpeak temporal regimes of the pulsed GD. Results showed that helium offered the best elemental sensitivity, while nitrogen provided higher signal intensities for fragment and molecular peaks. The analytical performance characteristics were also worked out for each analyte; absolute detection limits were on the order of nanograms. In a second step, headspace solid-phase microextraction (HS-SPME) was evaluated as a sample preparation and preconcentration technique for the quantification of the compounds under study, in order to achieve the analytical sensitivity required by European Union (EU) environmental legislation for trihalomethanes. The analytical figures of merit obtained using the proposed methodology showed rather good detection limits (between 2 and 13 microg L(-1), depending on the analyte). In fact, the developed methodology met the EU legislation requirements (the maximum level permitted in tap water for total trihalomethanes is set at 100 microg L(-1)). Analyses of real drinking water and river water samples were successfully carried out. To our knowledge, this is the first application of GC-pulsed GD-MS(TOF) to the analysis of real samples. Its ability to provide elemental, fragment, and molecular information on organic compounds is demonstrated.
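Detection limits of the kind quoted above are routinely derived from the calibration line using the common 3-sigma convention, LOD = 3·s_blank/slope. The calibration numbers below are invented, merely chosen to give an LOD of the same order as the 2-13 microg/L reported:

```python
import numpy as np

# Hypothetical calibration standards and instrument responses.
conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])    # standard conc., microg/L
signal = np.array([1.2, 6.1, 11.4, 21.8, 42.5])  # instrument response (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)   # linear calibration fit
s_blank = 1.0                                    # assumed std dev of blank signal
lod = 3 * s_blank / slope                        # 3-sigma limit of detection
print(f"LOD ~ {lod:.1f} microg/L")
```

A method "meets" a regulatory limit like the EU's 100 microg/L total-trihalomethane cap when its LOD sits comfortably below that threshold, as it does here.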

  19. Differential homogeneous immunosensor device

    DOEpatents

    Malmros, Mark K.; Gulbinski, III, Julian

    1990-04-10

    There is provided a novel method of testing for the presence of an analyte in a fluid suspected of containing it. In this method, in the presence of the analyte, a substance capable of modifying certain characteristics of the substrate is bound to the substrate, and the change in these characteristics is measured. While the method may be modified for carrying out quantitative differential analyses, it eliminates the need for washing analyte from the substrate, which is characteristic of prior-art methods.

  20. Decision making in prioritization of required operational capabilities

    NASA Astrophysics Data System (ADS)

    Andreeva, P.; Karev, M.; Kovacheva, Ts.

    2015-10-01

    The paper describes an expert heuristic approach to the prioritization of required operational capabilities in the field of defense. Based on expert assessment and application of the Analytic Hierarchy Process, a methodology for their prioritization has been developed. It has been applied in practical simulation-based decision-making games.
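The core computation in such a prioritization is small enough to sketch: capabilities are compared pairwise on Saaty's 1-9 scale, and the priority vector is the normalized principal eigenvector of the comparison matrix. The judgments below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical pairwise comparison of three required operational capabilities;
# A[i, j] states how much more important capability i is than capability j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP priorities: the normalized principal right eigenvector.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = vecs[:, k].real
w = w / w.sum()

# Consistency ratio; RI = 0.58 is Saaty's random index for n = 3.
n = A.shape[0]
ci = (vals.real[k] - n) / (n - 1)
cr = ci / 0.58
print("priorities:", w.round(3))            # descending importance
print("consistency ratio:", round(cr, 3))   # CR < 0.1 => judgments acceptable
```

The consistency check is what makes the method defensible with expert panels: wildly contradictory judgments (A over B, B over C, C over A) inflate CR and flag the comparison matrix for revision.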

  1. Bibliographic Post-Processing with the TIS Intelligent Gateway: Analytical and Communication Capabilities.

    ERIC Educational Resources Information Center

    Burton, Hilary D.

    TIS (Technology Information System) is an intelligent gateway system capable of performing quantitative evaluation and analysis of bibliographic citations using a set of Process functions. Originally developed by Lawrence Livermore National Laboratory (LLNL) to analyze information retrieved from three major federal databases, DOE/RECON,…

  2. Intelligent Vehicle Mobility M&S Capability Development (FY13 innovation Project) (Briefing Charts)

    DTIC Science & Technology

    2014-05-19

    Intelligent Vehicle Mobility M&S Capability Development (FY13 Innovation Project), briefing charts. P. Jayakumar and J. Raymond, Analytics, 19 May 2014.

  3. Langley Ground Facilities and Testing in the 21st Century

    NASA Technical Reports Server (NTRS)

    Ambur, Damodar R.; Kegelman, Jerome T.; Kilgore, William A.

    2010-01-01

    A strategic approach for retaining and more efficiently operating the essential Langley Ground Testing Facilities in the 21st Century is presented. This effort takes advantage of previously completed and ongoing studies at the Agency and national levels. The integrated approach takes into consideration the overall decline in the national test business base and the reduced utilization of each of the Langley facilities with capabilities to test in the subsonic, transonic, supersonic, and hypersonic speed regimes. The strategy accounts for the capability needs to meet the Agency's programmatic requirements and strategic goals and to execute test activities in the most efficient and flexible facility operating structure. The structure currently being implemented at Langley offers the agility to right-size capability and capacity from a national perspective, to accommodate the dynamic nature of testing needs, and to address the influence of existing and emerging analytical tools for design. The paradigm for testing in the retained facilities is to efficiently and reliably provide more accurate, high-quality test results at an affordable cost, supporting design-information needs in flight regimes where computational capability is not adequate, and to verify and validate existing and emerging computational tools. Each of these goals is planned to be achieved while keeping in mind the growing small-industry customer base engaged in developing unpiloted aerial vehicles and commercial space transportation systems.

  4. Strategies for Distinguishing Abiotic Chemistry from Martian Biochemistry in Samples Returned from Mars

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Burton, A. S.; Callahan, M. P.; Elsila, J. E.; Stern, J. C.; Dworkin, J. P.

    2012-01-01

    A key goal in the search for evidence of extinct or extant life on Mars will be the identification of chemical biosignatures, including complex organic molecules common to all life on Earth. These include amino acids, the monomer building blocks of proteins and enzymes, and nucleobases, which serve as the structural basis of information storage in DNA and RNA. However, many of these organic compounds can also be formed abiotically, as demonstrated by their prevalence in carbonaceous meteorites [1]. Therefore, an important challenge in the search for evidence of life on Mars will be distinguishing abiotic chemistry of either meteoritic or martian origin from any chemical biosignatures of an extinct or extant martian biota. Although current robotic missions to Mars, including the 2011 Mars Science Laboratory (MSL) and the planned 2018 ExoMars rovers, will have the analytical capability needed to identify these key classes of organic molecules if present [2,3], return of a diverse suite of martian samples to Earth would allow for much more intensive laboratory studies using a broad array of extraction protocols and state-of-the-art analytical techniques for bulk and spatially resolved characterization, molecular detection, and isotopic and enantiomeric compositions that may be required for unambiguous confirmation of martian life. Here we will describe current state-of-the-art laboratory analytical techniques that have been used to characterize the abundance and distribution of amino acids and nucleobases in meteorites, Apollo samples, and comet-exposed materials returned by the Stardust mission, with an emphasis on the molecular characteristics that can be used to distinguish abiotic chemistry from biochemistry as we know it.
The study of organic compounds in carbonaceous meteorites is highly relevant to Mars sample return analysis, since exogenous organic matter should have accumulated in the martian regolith over the last several billion years and the analytical techniques previously developed for the study of extraterrestrial materials can be applied to martian samples.

  5. On the analytic and numeric optimisation of airplane trajectories under real atmospheric conditions

    NASA Astrophysics Data System (ADS)

    Gonzalo, J.; Domínguez, D.; López, D.

    2014-12-01

    Since the beginning of the aviation era, economic constraints have forced operators to continuously improve the planning of flights; revenue depends on the cost per flight and on airspace occupancy. Many methods, the earliest dating from the middle of the last century, have explored analytical, numerical, and artificial-intelligence resources to reach optimal flight planning. In parallel, advances in meteorology and communications allow almost real-time knowledge of atmospheric conditions and a reliable, error-bounded forecast for the near future. Thus, apart from avoiding weather risks, airplanes can dynamically adapt their trajectories to minimise their costs. International regulators are aware of these capabilities, so it is reasonable to envisage changes that will soon allow this dynamic planning negotiation to become operational. Moreover, current unmanned airplanes, very popular and often small, suffer the impact of winds and other weather conditions in the form of dramatic changes in their performance. The present paper reviews analytic and numeric solutions for typical trajectory planning problems. Analytic methods are those that try to solve the problem using the Pontryagin principle, where influence parameters are added to the state variables to form a split-condition differential equation problem; the resulting system can be solved numerically (indirect optimisation) or with parameterised functions (direct optimisation). Numerical methods, on the other hand, are based on Bellman's dynamic programming (or Dijkstra algorithms), which exploit the fact that two optimal trajectories can be concatenated into a new optimal trajectory if the joint point belongs to the final optimal solution. There are no a priori conditions determining the best method; traditionally, analytic methods have been employed more for continuous problems and numeric methods for discrete ones. In the current problem, airplane behaviour is defined by continuous equations, while wind fields are given on a discrete grid at certain time intervals. The research demonstrates the advantages and disadvantages of each method, as well as performance figures for the solutions found under typical flight conditions in static and dynamic atmospheres. This provides significant parameters to be used in the selection of solvers for optimal trajectories.
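The numeric branch described above, Bellman's dynamic programming on a discrete wind grid, can be sketched with Dijkstra's algorithm. The per-cell costs below are hypothetical stand-ins for wind-adjusted fuel burn, not any real atmospheric data:

```python
import heapq

def cheapest_route(grid_cost, start, goal):
    """Dijkstra's algorithm on a 2D grid with 4-neighbour moves.
    grid_cost[i][j] is a hypothetical per-cell traversal cost (e.g. fuel
    burn adjusted for the local wind).  Returns the minimal total cost,
    including the start cell."""
    rows, cols = len(grid_cost), len(grid_cost[0])
    best = {start: grid_cost[start[0]][start[1]]}
    heap = [(best[start], start)]
    while heap:
        cost, (i, j) = heapq.heappop(heap)
        if (i, j) == goal:
            return cost
        if cost > best.get((i, j), float("inf")):
            continue  # stale heap entry
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < rows and 0 <= nj < cols:
                nc = cost + grid_cost[ni][nj]
                if nc < best.get((ni, nj), float("inf")):
                    best[(ni, nj)] = nc
                    heapq.heappush(heap, (nc, (ni, nj)))
    return float("inf")

# A headwind region (cost 9) in the centre makes the detour optimal:
wind_cost = [[1, 1, 1],
             [1, 9, 1],
             [1, 1, 1]]
print(cheapest_route(wind_cost, (0, 0), (2, 2)))  # 5, skirting the centre
```

The concatenation property mentioned in the abstract is exactly what justifies the greedy pop here: once a cell is settled with its cheapest cost, any optimal route through it extends an already-optimal prefix.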

  6. Ultrasensitive Ambient Mass Spectrometric Analysis with a Pin-to-Capillary Flowing Atmospheric-Pressure Afterglow Source

    PubMed Central

    Shelley, Jacob T.; Wiley, Joshua S.; Hieftje, Gary M.

    2011-01-01

    The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the Flowing Atmospheric-Pressure Afterglow (FAPA). FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn, and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown. PMID:21627097

  7. Ultrasensitive ambient mass spectrometric analysis with a pin-to-capillary flowing atmospheric-pressure afterglow source.

    PubMed

    Shelley, Jacob T; Wiley, Joshua S; Hieftje, Gary M

    2011-07-15

    The advent of ambient desorption/ionization mass spectrometry has resulted in a strong interest in ionization sources that are capable of direct analyte sampling and ionization. One source that has enjoyed increasing interest is the flowing atmospheric-pressure afterglow (FAPA). The FAPA has been proven capable of directly desorbing/ionizing samples in any phase (solid, liquid, or gas) and with impressive limits of detection (<100 fmol). The FAPA was also shown to be less affected by competitive-ionization matrix effects than other plasma-based sources. However, the original FAPA design exhibited substantial background levels, cluttered background spectra in the negative-ion mode, and significant oxidation of aromatic analytes, which ultimately compromised analyte identification and quantification. In the present study, a change in the FAPA configuration from a pin-to-plate to a pin-to-capillary geometry was found to vastly improve performance. Background signals in positive- and negative-ionization modes were reduced by 89% and 99%, respectively. Additionally, the capillary anode strongly reduced the amount of atomic oxygen that could cause oxidation of analytes. Temperatures of the gas stream that interacts with the sample, which heavily influences desorption capabilities, were compared between the two sources by means of IR thermography. The performance of the new FAPA configuration is evaluated through the determination of a variety of compounds in positive- and negative-ion mode, including agrochemicals and explosives. A detection limit of 4 amol was found for the direct determination of the agrochemical ametryn and appears to be spectrometer-limited. The ability to quickly screen for analytes in bulk liquid samples with the pin-to-capillary FAPA is also shown.

  8. Scalable Visual Analytics of Massive Textual Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krishnan, Manoj Kumar; Bohn, Shawn J.; Cowley, Wendy E.

    2007-04-01

This paper describes the first scalable implementation of the text processing engine used in visual analytics tools. These tools aid information analysts in interacting with and understanding large textual information content through visual interfaces. By developing a parallel implementation of the text processing engine, we enabled visual analytics tools to exploit cluster architectures and handle massive datasets. The paper describes key elements of our parallelization approach and demonstrates virtually linear scaling when processing multi-gigabyte data sets such as PubMed. This approach enables interactive analysis of large datasets beyond the capabilities of existing state-of-the-art visual analytics tools.
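The partition-and-merge strategy the abstract describes can be sketched in a few lines; the corpus, tokenizer, and worker count below are illustrative stand-ins, not the authors' engine, and a thread pool stands in for cluster-level parallelism.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def term_counts(docs):
    """Term frequencies for one partition of the corpus."""
    c = Counter()
    for doc in docs:
        c.update(doc.lower().split())
    return c

def partitioned_term_counts(corpus, n_workers=4):
    """Partition the corpus, count each chunk concurrently, then merge the partials."""
    chunks = [corpus[i::n_workers] for i in range(n_workers)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = pool.map(term_counts, chunks)
    total = Counter()
    for p in partials:
        total.update(p)
    return total

corpus = ["visual analytics of text", "scalable text processing", "text engines scale"]
counts = partitioned_term_counts(corpus, n_workers=2)
```

Because merging counters is associative, the partitioned result matches a serial count regardless of how the corpus is split, which is what makes the near-linear scaling reported above possible.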

  9. Many-Objective Reservoir Policy Identification and Refinement to Reduce Institutional Myopia in Water Management

    NASA Astrophysics Data System (ADS)

    Giuliani, M.; Herman, J. D.; Castelletti, A.; Reed, P. M.

    2013-12-01

    Institutional inertia strongly limits our ability to adapt water reservoir operations to better manage growing water demands as well as their associated uncertainties in a changing climate. Although it has long been recognized that these systems are generally framed in heterogeneous socio-economic contexts involving a myriad of conflicting, non-commensurable operating objectives, our broader understanding of the multiobjective consequences of current operating rules as well as their vulnerability to hydroclimatic uncertainties is severely limited. This study proposes a decision analytic framework to overcome policy inertia and myopia in complex river basin management contexts. The framework combines reservoir policy identification and many-objective optimization under uncertainty to characterize current operations and discover key tradeoffs between alternative policies for balancing evolving demands and system uncertainties. The approach is demonstrated on the Conowingo Dam, located within the Lower Susquehanna River, USA. The Lower Susquehanna River is an interstate water body that has been subject to intensive water management efforts due to the system's competing demands from urban water supply, atomic power plant cooling, hydropower production, and federally regulated environmental flows. Initially our proposed framework uses available streamflow observations to implicitly identify the Conowingo Dam's current but unknown operating policy. This baseline policy is identified by fitting radial basis functions to existing system dynamics. Our assumption in the baseline policy is that the dam operator is represented as a rational agent seeking to maximize primary operational objectives (i.e., guaranteeing the public water supply and maximizing the hydropower revenue). The quality of the identified baseline policy is evaluated by its ability to replicate historical release dynamics. 
Once identified, the historical baseline policy then provides a means of representing the decision preferences guiding current operations. Our results show that the estimated policy closely captures the dynamics of current releases and flows for the Lower Susquehanna. After identifying the historical baseline policy, our proposed decision analytic framework then combines evolutionary many-objective optimization with visual analytics to discover improved operating policies. Our Lower Susquehanna results confirm that the system's current history-based operations are negatively biased to overestimate the reliability of the reservoir's multi-sector services. Moreover, our proposed framework has successfully identified alternative reservoir policies that are more robust to hydroclimatic uncertainties while being capable of better addressing the tradeoffs across the Conowingo Dam's multi-sector services.
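The policy-identification step described above, fitting radial basis functions so that a parameterized rule reproduces observed releases, can be sketched with a least-squares fit; the Gaussian basis, the synthetic "historical" operating rule, and all variable names are illustrative assumptions rather than the Conowingo model.

```python
import numpy as np

def gaussian_rbf_features(storage, centers, width):
    """Evaluate one Gaussian basis function per center at each storage level."""
    return np.exp(-((storage[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

# Synthetic "historical" operating rule: release grows smoothly with storage.
rng = np.random.default_rng(0)
storage = rng.uniform(0.0, 1.0, 200)      # normalized reservoir storage
release = 0.2 + 0.6 * storage ** 2        # stand-in for observed releases

# Fit RBF weights by ordinary least squares against the observed releases.
centers = np.linspace(0.0, 1.0, 8)
Phi = gaussian_rbf_features(storage, centers, width=0.15)
weights, *_ = np.linalg.lstsq(Phi, release, rcond=None)

# The identified policy should closely replicate the historical release dynamics.
fitted = Phi @ weights
rmse = float(np.sqrt(np.mean((fitted - release) ** 2)))
```

A low residual on the historical record is exactly the quality criterion the abstract describes for the baseline policy; the real framework conditions the policy on several state variables, not storage alone.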

  10. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    NASA Astrophysics Data System (ADS)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. Historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage their natural human attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is due to the gap between military applications and their functions, and the functions and capabilities afforded by cutting edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. 
Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.

  11. Some new features of Direct Analysis in Real Time mass spectrometry utilizing the desorption at an angle option.

    PubMed

    Chernetsova, Elena S; Revelsky, Alexander I; Morlock, Gertrud E

    2011-08-30

    The present study is a first step towards the unexplored capabilities of Direct Analysis in Real Time (DART) mass spectrometry (MS) arising from the possibility of the desorption at an angle: scanning analysis of surfaces, including the coupling of thin-layer chromatography (TLC) with DART-MS, and a more sensitive analysis due to the preliminary concentration of analytes dissolved in large volumes of liquids on glass surfaces. In order to select the most favorable conditions for DART-MS analysis, proper positioning of samples is important. Therefore, a simple and cheap technique for the visualization of the impact region of the DART gas stream onto a substrate was developed. A filter paper or TLC plate, previously loaded with the analyte, was immersed in a derivatization solution. On this substrate, owing to the impact of the hot DART gas, reaction of the analyte to a colored product occurred. An improved capability of detection of DART-MS for the analysis of liquids was demonstrated by applying large volumes of model solutions of coumaphos into small glass vessels and drying these solutions prior to DART-MS analysis under ambient conditions. This allowed the introduction of, by up to more than two orders of magnitude, increased quantities of analyte compared with the conventional DART-MS analysis of liquids. Through this improved detectability, the capabilities of DART-MS in trace analysis could be strengthened. Copyright © 2011 John Wiley & Sons, Ltd.

  12. Retention in porous layer pillar array planar separation platforms

    DOE PAGES

    Lincoln, Danielle R.; Lavrik, Nickolay V.; Kravchenko, Ivan I.; ...

    2016-08-11

Here, this work presents the retention capabilities and surface area enhancement of highly ordered, high-aspect-ratio, open-platform, two-dimensional (2D) pillar arrays when coated with a thin layer of porous silicon oxide (PSO). Photolithographically prepared pillar arrays were coated with 50–250 nm of PSO via plasma-enhanced chemical vapor deposition and then functionalized with either octadecyltrichlorosilane or n-butyldimethylchlorosilane. Theoretical calculations indicate that a 50 nm layer of PSO increases the surface area of a pillar nearly 120-fold. Retention capabilities were tested by observing capillary-action-driven development under various conditions, as well as by running one-dimensional separations on varying thicknesses of PSO. Increasing the thickness of PSO on an array clearly resulted in greater retention of the analyte(s) in question in both experiments. In culmination, a two-dimensional separation of fluorescently derivatized amines was performed to further demonstrate the capabilities of these fabricated platforms.

  13. Retention in porous layer pillar array planar separation platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Danielle R.; Lavrik, Nickolay V.; Kravchenko, Ivan I.

Here, this work presents the retention capabilities and surface area enhancement of highly ordered, high-aspect-ratio, open-platform, two-dimensional (2D) pillar arrays when coated with a thin layer of porous silicon oxide (PSO). Photolithographically prepared pillar arrays were coated with 50–250 nm of PSO via plasma-enhanced chemical vapor deposition and then functionalized with either octadecyltrichlorosilane or n-butyldimethylchlorosilane. Theoretical calculations indicate that a 50 nm layer of PSO increases the surface area of a pillar nearly 120-fold. Retention capabilities were tested by observing capillary-action-driven development under various conditions, as well as by running one-dimensional separations on varying thicknesses of PSO. Increasing the thickness of PSO on an array clearly resulted in greater retention of the analyte(s) in question in both experiments. In culmination, a two-dimensional separation of fluorescently derivatized amines was performed to further demonstrate the capabilities of these fabricated platforms.

  14. An Overview of NASA's Integrated Design and Engineering Analysis (IDEA) Environment

    NASA Technical Reports Server (NTRS)

    Robinson, Jeffrey S.

    2011-01-01

    Historically, the design of subsonic and supersonic aircraft has been divided into separate technical disciplines (such as propulsion, aerodynamics and structures), each of which performs design and analysis in relative isolation from others. This is possible, in most cases, either because the amount of interdisciplinary coupling is minimal, or because the interactions can be treated as linear. The design of hypersonic airbreathing vehicles, like NASA's X-43, is quite the opposite. Such systems are dominated by strong non-linear interactions between disciplines. The design of these systems demands that a multi-disciplinary approach be taken. Furthermore, increased analytical fidelity at the conceptual design phase is highly desirable, as many of the non-linearities are not captured by lower fidelity tools. Only when these systems are designed from a true multi-disciplinary perspective, can the real performance benefits be achieved and complete vehicle systems be fielded. Toward this end, the Vehicle Analysis Branch at NASA Langley Research Center has been developing the Integrated Design and Engineering Analysis (IDEA) Environment. IDEA is a collaborative environment for parametrically modeling conceptual and preliminary designs for launch vehicle and high speed atmospheric flight configurations using the Adaptive Modeling Language (AML) as the underlying framework. The environment integrates geometry, packaging, propulsion, trajectory, aerodynamics, aerothermodynamics, engine and airframe subsystem design, thermal and structural analysis, and vehicle closure into a generative, parametric, unified computational model where data is shared seamlessly between the different disciplines. Plans are also in place to incorporate life cycle analysis tools into the environment which will estimate vehicle operability, reliability and cost. 
IDEA is currently being funded by NASA's Hypersonics Project, a part of the Fundamental Aeronautics Program within the Aeronautics Research Mission Directorate. The environment is currently focused around a two-stage-to-orbit configuration with a turbine-based combined cycle (TBCC) first stage and a reusable rocket second stage. IDEA will be rolled out in generations, with each successive generation providing a significant increase in capability, either through increased analytic fidelity, expansion of vehicle classes considered, or by the inclusion of advanced modeling techniques. This paper provides the motivation behind the current effort, an overview of the development of the IDEA environment (including the contents and capabilities to be included in Generation 1 and Generation 2), and a description of the current status and details of future plans.

  15. Design of power electronics for TVC EMA systems

    NASA Technical Reports Server (NTRS)

    Nelms, R. Mark

    1993-01-01

    The Composite Development Division of the Propulsion Laboratory at Marshall Space Flight Center (MSFC) is currently developing a class of electromechanical actuators (EMA's) for use in space transportation applications such as thrust vector control (TVC) and propellant control valves (PCV). These high power servomechanisms will require rugged, reliable, and compact power electronic modules capable of modulating several hundred amperes of current at up to 270 volts. MSFC has selected the brushless dc motor for implementation in EMA's. This report presents the results of an investigation into the applicability of two new technologies, MOS-controlled thyristors (MCT's) and pulse density modulation (PDM), to the control of brushless dc motors in EMA systems. MCT's are new power semiconductor devices, which combine the high voltage and current capabilities of conventional thyristors and the low gate drive requirements of metal oxide semiconductor field effect transistors (MOSFET's). The commanded signals in a PDM system are synthesized using a series of sinusoidal pulses instead of a series of square pulses as in a pulse width modulation (PWM) system. A resonant dc link inverter is employed to generate the sinusoidal pulses in the PDM system. This inverter permits zero-voltage switching of all semiconductors which reduces switching losses and switching stresses. The objectives of this project are to develop and validate an analytical model of the MCT device when used in high power motor control applications and to design, fabricate, and test a prototype electronic circuit employing both MCT and PDM technology for controlling a brushless dc motor.

  16. Exact analytic solutions of Maxwell's equations describing propagating nonparaxial electromagnetic beams.

    PubMed

    Garay-Avendaño, Roger L; Zamboni-Rached, Michel

    2014-07-10

    In this paper, we propose a method that is capable of describing in exact and analytic form the propagation of nonparaxial scalar and electromagnetic beams. The main features of the method presented here are its mathematical simplicity and the fast convergence in the cases of highly nonparaxial electromagnetic beams, enabling us to obtain high-precision results without the necessity of lengthy numerical simulations or other more complex analytical calculations. The method can be used in electromagnetism (optics, microwaves) as well as in acoustics.

  17. Differential homogeneous immunosensor device

    DOEpatents

    Malmros, M.K.; Gulbinski, J. III.

    1990-04-10

    There is provided a novel method of testing for the presence of an analyte in a fluid suspected of containing the same. In this method, in the presence of the analyte, a substance capable of modifying certain characteristics of the substrate is bound to the substrate and the change in these qualities is measured. While the method may be modified for carrying out quantitative differential analyses, it eliminates the need for washing the analyte from the substrate which is characteristic of prior art methods. 12 figs.

  18. Gyrokinetic Particle Simulations of Neoclassical Transport

    NASA Astrophysics Data System (ADS)

    Lin, Zhihong

    A time varying weighting (delta f) scheme based on the small gyro-radius ordering is developed and applied to a steady state, multi-species gyrokinetic particle simulation of neoclassical transport. Accurate collision operators conserving momentum and energy are developed and implemented. Benchmark simulation results using these operators are found to agree very well with neoclassical theory. For example, it is dynamically demonstrated that like-particle collisions produce no particle flux and that the neoclassical fluxes are ambipolar for an ion -electron plasma. An important physics feature of the present scheme is the introduction of toroidal flow to the simulations. In agreement with the existing analytical neoclassical theory, ion energy flux is enhanced by the toroidal mass flow and the neoclassical viscosity is a Pfirsch-Schluter factor times the classical viscosity in the banana regime. In addition, the poloidal electric field associated with toroidal mass flow is found to enhance density gradient driven electron particle flux and the bootstrap current while reducing temperature gradient driven flux and current. Modifications of the neoclassical transport by the orbit squeezing effects due to the radial electric field associated with sheared toroidal flow are studied. Simulation results indicate a reduction of both ion thermal flux and neoclassical toroidal rotation. Neoclassical theory in the steep gradient profile regime, where conventional neoclassical theory fails, is examined by taking into account finite banana width effects. The relevance of these studies to interesting experimental conditions in tokamaks is discussed. Finally, the present numerical scheme is extended to general geometry equilibrium. This new formulation will be valuable for the development of new capabilities to address complex equilibria such as advanced stellarator configurations and possibly other alternate concepts for the magnetic confinement of plasmas. 
In general, the present work demonstrates a valuable new capability for studying important aspects of neoclassical transport inaccessible by conventional analytical calculation processes.

  19. State-of-the-Art of (Bio)Chemical Sensor Developments in Analytical Spanish Groups

    PubMed Central

    Plata, María Reyes; Contento, Ana María; Ríos, Angel

    2010-01-01

(Bio)chemical sensors are one of the most exciting fields in analytical chemistry today. The development of these analytical devices simplifies and miniaturizes the whole analytical process. Although the initial expectation of the massive incorporation of sensors into routine analytical work has been truncated to some extent, in many other cases analytical methods based on sensor technology have solved important analytical problems. Many research groups are working in this field worldwide, reporting interesting results so far. Modestly, Spanish researchers have contributed to these recent developments. In this review, we summarize the most representative achievements of these groups. They cover a wide variety of sensors, including optical, electrochemical, piezoelectric or electro-mechanical devices, used for laboratory or field analyses. Their capabilities for use in different applied areas are also critically discussed. PMID:22319260

  20. Computer search for binary cyclic UEP codes of odd length up to 65

    NASA Technical Reports Server (NTRS)

    Lin, Mao-Chao; Lin, Chi-Chang; Lin, Shu

    1990-01-01

    Using an exhaustive computation, the unequal error protection capabilities of all binary cyclic codes of odd length up to 65 that have minimum distances at least 3 are found. For those codes that can only have upper bounds on their unequal error protection capabilities computed, an analytic method developed by Dynkin and Togonidze (1976) is used to show that the upper bounds meet the exact unequal error protection capabilities.
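The exhaustive approach the abstract describes can be illustrated on the smallest nontrivial case, the (7,4) binary cyclic code generated by g(x) = 1 + x + x^3 (the cyclic Hamming code): enumerate all 2^4 codewords and take the minimum nonzero weight, which for a linear code equals its minimum distance. The encoder below is a sketch, not the authors' search program.

```python
from itertools import product

def cyclic_encode(msg_bits, gen=(1, 1, 0, 1), n=7):
    """Multiply the message polynomial by g(x) = 1 + x + x^3 over GF(2)."""
    code = [0] * n
    for i, m in enumerate(msg_bits):
        if m:
            for j, g in enumerate(gen):
                code[(i + j) % n] ^= g
    return tuple(code)

def min_distance(k=4):
    """Exhaustively enumerate all 2^k codewords; minimum nonzero weight = distance."""
    best = None
    for msg in product((0, 1), repeat=k):
        if any(msg):
            w = sum(cyclic_encode(msg))
            best = w if best is None else min(best, w)
    return best
```

Lengths up to 65 remain feasible this way only because cyclic structure and minimum-distance bounds prune the search; a raw 2^k enumeration is the conceptual core, not the full method.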

  1. The use of surface-enhanced Raman scattering for detecting molecular evidence of life in rocks, sediments, and sedimentary deposits.

    PubMed

    Bowden, Stephen A; Wilson, Rab; Cooper, Jonathan M; Parnell, John

    2010-01-01

Raman spectroscopy is a versatile analytical technique capable of characterizing the composition of both inorganic and organic materials. Consequently, it is frequently suggested as a payload on many planetary landers. Only approximately 1 in every 10^6 photons is Raman scattered; therefore, the detection of trace quantities of an analyte dispersed in a sample matrix can be much harder to achieve. To overcome this, surface-enhanced Raman scattering (SERS) and surface-enhanced resonance Raman scattering (SERRS) both provide greatly enhanced signals (enhancements between 10^5 and 10^9) through the analyte's interaction with the locally generated surface plasmons, which occur at a "roughened" or nanostructured metallic surface (e.g., Cu, Au, and Ag). Both SERS and SERRS may therefore provide a viable technique for trace analysis of samples. In this paper, we describe the development of SERS assays for analyzing trace amounts of compounds present in the solvent extracts of sedimentary deposits. These assays were used to detect biological pigments present in an Arctic microoasis (a small locale of elevated biological productivity) and its detrital regolith, characterize the pigmentation of microbial mats around hydrothermal springs, and detect fossil organic matter in hydrothermal deposits. These field study examples demonstrate that SERS technology is sufficiently mature to be applied to many astrobiological analog studies on Earth. Many current and proposed imaging systems intended for remote deployment already possess the instrumental components needed for SERS. The addition of wet chemistry sample processing facilities to these instruments could yield field-deployable analytical instruments with a broadened analytical window for detecting organic compounds with a biological or geological origin.

  2. A microfabricated, low dark current a-Se detector for measurement of microplasma optical emission in the UV for possible use on-site

    NASA Astrophysics Data System (ADS)

    Abbaszadeh, Shiva; Karim, Karim S.; Karanassios, Vassili

    2013-05-01

Traditionally, samples are collected on-site (i.e., in the field) and are shipped to a lab for chemical analysis. An alternative is offered by portable chemical analysis instruments that can be used on-site. Many analytical measurements by optical emission spectrometry require use of light-sources and of spectral lines that are in the ultraviolet (UV, ~200-400 nm wavelength) region of the spectrum. For such measurements, a portable, battery-operated, fiber-optic spectrometer equipped with an un-cooled, linear, solid-state detector may be used. To take full advantage of the advanced measurement capabilities offered by state-of-the-art solid-state detectors, cooling of the detector is required. But cooling and other thermal management hamper portability and use on-site because they add size and weight and increase electrical power requirements. To address these considerations, an alternative was implemented, as described here. Specifically, a microfabricated solid-state detector for measurement of UV photons will be described. Unlike solid-state detectors developed on crystalline silicon, this miniaturized and low-cost detector utilizes amorphous selenium (a-Se) as its photosensitive material. Due to its low dark current, this detector does not require cooling, and it is thus better suited for portable use and for chemical measurements on-site. In this paper, a microplasma will be used as a light-source of UV photons for the a-Se detector. For example, spectra acquired using a microplasma as a light-source will be compared with those obtained with a portable, fiber-optic spectrometer equipped with a Si-based 2080-element detector. Finally, the analytical performance obtained by introducing ng-amounts of analytes into the microplasma will be described.

  3. Analytically Quantifying Gains in the Test and Evaluation Process through Capabilities-Based Analysis

    DTIC Science & Technology

    2011-09-01

Lednicky, Eric J. Naval Postgraduate School, Monterey, CA 93943-5000.

  4. Strategy for the elucidation of elemental compositions of trace analytes based on a mass resolution of 100,000 full width at half maximum.

    PubMed

    Kaufmann, Anton

    2010-07-30

Elemental compositions (ECs) can be elucidated by evaluating the high-resolution mass spectra of unknown or suspected unfragmented analyte ions. Classical approaches utilize the exact mass of the monoisotopic peak (M + 0) and the relative abundance of isotope peaks (M + 1 and M + 2). The availability of high-resolution instruments like the Orbitrap currently permits mass resolutions up to 100,000 full width at half maximum. This not only allows the determination of relative isotopic abundances (RIAs), but also the extraction of other diagnostic information from the spectra, such as fully resolved signals originating from ³⁴S isotopes and fully or partially resolved signals related to ¹⁵N isotopes (isotopic fine structure). Fully and partially resolved peaks can be evaluated by visual inspection of the measured peak profiles. This approach is shown to be capable of correctly discarding many of the EC candidates which were proposed by commercial EC calculating algorithms. Using this intuitive strategy significantly extends the upper mass range for the successful elucidation of ECs. Copyright 2010 John Wiley & Sons, Ltd.
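The monoisotopic-mass filtering step that precedes the isotope-pattern checks can be sketched as follows; the mass table is truncated to C, H, N, O, and S, and the candidate formulas and ppm window are illustrative, not drawn from the study.

```python
# Monoisotopic masses (u) of the most abundant isotopes, CHNOS subset only.
MONO_MASS = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052,
             "O": 15.9949146221, "S": 31.97207069}

def monoisotopic_mass(formula):
    """Sum isotope masses for a formula given as {element: count}."""
    return sum(MONO_MASS[el] * n for el, n in formula.items())

def within_tolerance(measured, formula, ppm=2.0):
    """Keep a candidate composition only if it matches within the ppm window."""
    calc = monoisotopic_mass(formula)
    return abs(measured - calc) / calc * 1e6 <= ppm

candidates = [
    {"C": 6, "H": 12, "O": 6},   # glucose, ~180.0634 u
    {"C": 10, "H": 12, "O": 3},  # a near-isobar that fails a 5 ppm window
]
measured = 180.0634
matches = [f for f in candidates if within_tolerance(measured, f, ppm=5.0)]
```

The paper's point is that exact mass alone leaves many such candidates standing at higher masses; the resolved ³⁴S and ¹⁵N fine-structure signals supply the extra constraints that discard them.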

  5. Communication: Analytical optimal pulse shapes obtained with the aid of genetic algorithms: Controlling the photoisomerization yield of retinal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guerrero, R. D., E-mail: rdguerrerom@unal.edu.co; Arango, C. A., E-mail: caarango@icesi.edu.co; Reyes, A., E-mail: areyesv@unal.edu.co

We recently proposed a Quantum Optimal Control (QOC) method constrained to build pulses from analytical pulse shapes [R. D. Guerrero et al., J. Chem. Phys. 143(12), 124108 (2015)]. This approach was applied to control the dissociation channel yields of the diatomic molecule KH, considering three potential energy curves and one degree of freedom. In this work, we utilized this methodology to study the strong field control of the cis-trans photoisomerization of 11-cis retinal. This more complex system was modeled with a Hamiltonian comprising two potential energy surfaces and two degrees of freedom. The resulting optimal pulse, made of 6 linearly chirped pulses, was capable of controlling the population of the trans isomer on the ground electronic surface for nearly 200 fs. The simplicity of the pulse generated with our QOC approach offers two clear advantages: a direct analysis of the sequence of events occurring during the driven dynamics, and its reproducibility in the laboratory with current laser technologies.

  6. An analytical model with flexible accuracy for deep submicron DCVSL cells

    NASA Astrophysics Data System (ADS)

    Valiollahi, Sepideh; Ardeshir, Gholamreza

    2018-07-01

Differential cascoded voltage switch logic (DCVSL) cells are among the best candidates of circuit designers for a wide range of applications due to advantages such as low input capacitance, high switching speed, small area and noise immunity; nevertheless, a proper model has not yet been developed to analyse them. This paper analyses deep submicron DCVSL cells based on a flexible accuracy-simplicity trade-off including the following key features: (1) the model is capable of producing closed-form expressions with an acceptable accuracy; (2) model equations can be solved numerically to offer higher accuracy; (3) the short-circuit currents occurring in high-low/low-high transitions are accounted for in the analysis and (4) the changes in the operating modes of transistors during transitions, together with an efficient submicron I-V model, which incorporates the most important non-ideal short-channel effects, are considered. The accuracy of the proposed model is validated in IBM 0.13 µm CMOS technology through comparisons with the accurate physically based BSIM3 model. The maximum error of the analytical solutions is below 10%, while that of the numerical solutions is below 7%.

  7. Vanishing tattoo multi-sensor for biomedical diagnostics

    NASA Astrophysics Data System (ADS)

    Moczko, E.; Meglinski, I.; Piletsky, S.

    2008-04-01

    Currently, precise non-invasive diagnostics systems for the real-time multi detection and monitoring of physiological parameters and chemical analytes in the human body are urgently required by clinicians, physiologists and bio-medical researchers. We have developed a novel cost effective smart 'vanishing tattoo' (similar to temporary child's tattoos) consisting of environmental-sensitive dyes. Painlessly impregnated into the skin the smart tattoo is capable of generating optical/fluorescence changes (absorbance, transmission, reflectance, emission and/or luminescence within UV, VIS or NIR regions) in response to physical or chemical changes. These changes allow the identification of colour pattern changes similar to bar-code scanning. Such a system allows an easy, cheap and robust comprehensive detection of various parameters and analytes in a small volume of sample (e.g. variations in pH, temperature, ionic strength, solvent polarity, presence of redox species, surfactants, oxygen). These smart tattoos have possible applications in monitoring the progress of disease and transcutaneous drug delivery. The potential of this highly innovative diagnostic tool is wide and diverse and can impact on routine clinical diagnostics, general therapeutic management, skin care and cosmetic products testing as well as fundamental physiological investigations.

  8. Vanishing "tattoo" multisensor for biomedical diagnostics

    NASA Astrophysics Data System (ADS)

    Moczko, E.; Meglinski, I.; Piletsky, S.

    2008-02-01

    Currently, precise non-invasive diagnostics systems for the real-time multi detection and monitoring of physiological parameters and chemical analytes in the human body are urgently required by clinicians, physiologists and bio-medical researchers. We have developed a novel cost effective smart 'vanishing tattoo' (similar to temporary child's tattoos) consisting of environmental-sensitive dyes. Painlessly impregnated into the skin the smart tattoo is capable of generating optical/fluorescence changes (absorbance, transmission, reflectance, emission and/or luminescence within UV, VIS or NIR regions) in response to physical or chemical changes. These changes allow the identification of colour pattern changes similar to bar-code scanning. Such a system allows an easy, cheap and robust comprehensive detection of various parameters and analytes in a small volume of sample (e.g. variations in pH, temperature, ionic strength, solvent polarity, presence of redox species, surfactants, oxygen). These smart tattoos have possible applications in monitoring the progress of disease and transcutaneous drug delivery. The potential of this highly innovative diagnostic tool is wide and diverse and can impact on routine clinical diagnostics, general therapeutic management, skin care and cosmetic products testing as well as fundamental physiological investigations.

  9. Development and Validation of a Multiplexed Protein Quantitation Assay for the Determination of Three Recombinant Proteins in Soybean Tissues by Liquid Chromatography with Tandem Mass Spectrometry.

    PubMed

    Hill, Ryan C; Oman, Trent J; Shan, Guomin; Schafer, Barry; Eble, Julie; Chen, Cynthia

    2015-08-26

    Currently, traditional immunochemistry technologies such as enzyme-linked immunosorbent assays (ELISA) are the predominant analytical tool used to measure levels of recombinant proteins expressed in genetically engineered (GE) plants. Recent advances in agricultural biotechnology have created a need to develop methods capable of selectively detecting and quantifying multiple proteins in complex matrices because of increasing numbers of transgenic proteins being coexpressed or "stacked" to achieve tolerance to multiple herbicides or to provide multiple modes of action for insect control. A multiplexing analytical method utilizing liquid chromatography with tandem mass spectrometry (LC-MS/MS) has been developed and validated to quantify three herbicide-tolerant proteins in soybean tissues: aryloxyalkanoate dioxygenase (AAD-12), 5-enol-pyruvylshikimate-3-phosphate synthase (2mEPSPS), and phosphinothricin acetyltransferase (PAT). Results from the validation showed high recovery and precision over multiple analysts and laboratories. Results from this method were comparable to those obtained with ELISA with respect to protein quantitation, and the described method was demonstrated to be suitable for multiplex quantitation of transgenic proteins in GE crops.

  10. Remote Geochemical and Mineralogical Analyses under Venus Atmospheric Conditions by Raman - Laser Induced Breakdown Spectroscopy (LIBS)

    NASA Astrophysics Data System (ADS)

    Clegg, S. M.; Wiens, R. C.; Newell, R. T.; DeCroix, D. S.; Sharma, S. K.; Misra, A. K.; Dyar, M. D.; Anderson, R. B.; Angel, S. M.; Martinez, R.; McInroy, R.

    2016-12-01

    The extreme Venus surface temperature (~740 K) and atmospheric pressure (~93 atm) create a challenging environment for surface geochemical and mineralogical investigations. Such investigations must be completed within hours of landing, before the lander is overcome by the harsh atmosphere. A combined remote Raman - LIBS spectrometer (RLS) is capable of accomplishing the geochemical science goals without the risks associated with collecting samples and bringing them into the lander. Wiens et al. [1], Sharma et al. [2] and Clegg et al. [3] demonstrated that both analytical techniques can be integrated into a single instrument similar to the SuperCam instrument selected for the Mars 2020 rover. The focus of this paper is to explore the capability to probe geologic samples by Raman and LIBS and to demonstrate quantitative analysis under Venus surface conditions. Raman and LIBS are highly complementary analytical techniques capable of determining both the mineralogical and geochemical composition of Venus surface samples. These techniques have the potential to profoundly increase our knowledge of the Venus surface composition, which is currently limited to geochemical data from the Venera and VEGA landers [4]. Based on the observed compositional differences, and recognizing the imprecise nature of the existing data, samples were chosen to constitute a Venus-analog suite for this study. LIBS data reduction involved generating a partial least squares (PLS) model with a subset of the rock powder standards to quantitatively determine the major elemental abundances of the remaining samples. The Raman experiments have been conducted under supercritical CO2 on single-mineral and mixed-mineral samples containing talc, olivine, pyroxenes, feldspars, anhydrite, barite, and siderite. These experiments employ a new RLS prototype similar to the SuperCam instrument, as well as a new 2-m-long pressure chamber capable of simulating the Venus surface temperature and pressure. Results of these combined Raman-LIBS investigations will be presented and discussed. [1] Wiens, R.C., et al. (2005) Spect. Acta A 61, 2324; [2] Sharma, S.K., et al. (2007) Spect. Acta A 68, 1036; [3] Clegg, S.M., et al. (2014) Appl. Spec. 68, 925; [4] Barsukov, V.L. (1992) in Venus Geology, Geochemistry, and Geophysics, Univ. Arizona Press, p. 165.
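
    The PLS quantitation in this record is multivariate, but the underlying calibration idea can be sketched with a deliberately simplified single-feature least-squares fit (all intensities and abundances below are invented for illustration; the actual work fits a partial least squares model over full spectra):

```python
# Hypothetical univariate calibration: fit emission-line intensity against
# known elemental abundance for calibration standards, then predict an
# unknown sample. A drastically simplified stand-in for the multivariate
# PLS model used in the LIBS study; all numbers are invented.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Invented standards: peak intensity (arbitrary units) vs. oxide weight percent.
intensity = [120.0, 180.0, 240.0, 300.0]
abundance = [40.0, 50.0, 60.0, 70.0]

m, b = fit_line(intensity, abundance)
predicted = m * 210.0 + b   # unknown sample with intensity 210
print(round(predicted, 1))  # 55.0 wt% for this perfectly linear toy data
```

    A real reduction pipeline would cross-validate the model against held-out standards, as the PLS approach in the record does.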

  11. Study of a tri-trophic prey-dependent food chain model of interacting populations.

    PubMed

    Haque, Mainul; Ali, Nijamuddin; Chakravarty, Santabrata

    2013-11-01

    The current paper accounts for the influence of intra-specific competition among predators in a prey-dependent tri-trophic food chain model of interacting populations. We offer a detailed mathematical analysis of the proposed food chain model to illustrate some of the significant results arising from the interplay of deterministic ecological phenomena and processes. Biologically feasible equilibria of the system are identified, and the behaviour of the system around each of them is described. In particular, persistence, stability (local and global) and bifurcation (saddle-node, transcritical, Hopf-Andronov) analyses of this model are presented. Relevant results from previous well-known food chain models are compared with the current findings. Global stability analysis is carried out by constructing appropriate Lyapunov functions. Numerical simulations show that the present system is capable of producing chaotic dynamics when the rate of self-interaction is very low, and that such chaotic behaviour disappears beyond a certain value of the self-interaction rate. In addition, numerical simulations with experimentally motivated parameter values confirm the analytical results and show that intra-specific competition plays a potential role in controlling the chaotic dynamics of the system; the role of self-interaction in a food chain model is thus illustrated for the first time. Finally, a discussion of the ecological applications of the analytical and numerical findings concludes the paper. Copyright © 2013 Elsevier Inc. All rights reserved.
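
    As a rough illustration of the kind of model analyzed here (not the authors' exact equations), a tri-trophic chain with Holling type II responses and an intra-specific competition term on the intermediate predator can be integrated numerically; all parameter values below are invented:

```python
# Illustrative tri-trophic chain (prey x, intermediate predator y, top
# predator z) with Holling type II functional responses and an
# intra-specific competition term -d*y**2 on the intermediate predator.
# Parameters and initial state are invented for illustration only.

def derivs(x, y, z, d):
    fxy = x / (1.0 + 0.5 * x)          # prey -> predator response
    fyz = y / (1.0 + 0.5 * y)          # predator -> top-predator response
    dx = x * (1.0 - x / 10.0) - fxy * y
    dy = 0.5 * fxy * y - 0.1 * y - fyz * z - d * y * y
    dz = 0.5 * fyz * z - 0.1 * z
    return dx, dy, dz

def simulate(d, steps=5000, h=0.01):
    """Forward-Euler integration from a fixed initial state."""
    x, y, z = 1.0, 0.5, 0.3
    for _ in range(steps):
        dx, dy, dz = derivs(x, y, z, d)
        x, y, z = x + h * dx, y + h * dy, z + h * dz
    return x, y, z

x, y, z = simulate(d=0.05)
print(round(x, 3), round(y, 3), round(z, 3))
```

    Sweeping d downward in such a model is how one would look numerically for the onset of chaos; a production study would use a higher-order integrator and the paper's actual equations.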

  12. Microalgal process-monitoring based on high-selectivity spectroscopy tools: status and future perspectives.

    PubMed

    Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini

    2018-08-01

    Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgal bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will become increasingly critical for maintaining product quality and consistency for downstream processing in a biorefinery, and for maintaining and valorizing these markets. The main contribution of this review is to present current and prospective advances in on-line and in-line process analytical technology (PAT) with high selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation, to aid and motivate development.

  13. MT3D-USGS version 1: A U.S. Geological Survey release of MT3DMS updated with new and expanded transport capabilities for use with MODFLOW

    USGS Publications Warehouse

    Bedekar, Vivek; Morway, Eric D.; Langevin, Christian D.; Tonkin, Matthew J.

    2016-09-30

    MT3D-USGS, a U.S. Geological Survey updated release of the groundwater solute transport code MT3DMS, includes new transport modeling capabilities to accommodate flow terms calculated by MODFLOW packages that were previously unsupported by MT3DMS and to provide greater flexibility in the simulation of solute transport and reactive solute transport. Unsaturated-zone transport and transport within streams and lakes, including solute exchange with connected groundwater, are among the new capabilities included in the MT3D-USGS code. MT3D-USGS also includes the capability to route a solute through dry cells that may occur in the Newton-Raphson formulation of MODFLOW (that is, MODFLOW-NWT). New chemical reaction package options include the ability to simulate inter-species reactions and parent-daughter chain reactions. A new pump-and-treat recirculation package enables the simulation of dynamic recirculation, with or without treatment, for combinations of wells that are represented in the flow model, mimicking the above-ground treatment of extracted water. A reformulation of the treatment of transient mass storage improves conservation of mass and yields solutions in better agreement with analytical benchmarks.
Several additional features of MT3D-USGS are (1) the separate specification of the partitioning coefficient (Kd) within mobile and immobile domains; (2) the capability to assign prescribed concentrations to the top-most active layer; (3) a global mass balance summary in which the change in mass storage owing to the change in water volume appears as its own budget item; (4) the ability to ignore cross-dispersion terms; (5) the definition of Hydrocarbon Spill-Source Package (HSS) mass loading zones using regular and irregular polygons, in addition to the currently supported circular zones; and (6) the ability to specify an absolute minimum thickness rather than the default percent minimum thickness in dry-cell circumstances. Benchmark problems that implement the new features and packages test the accuracy of the new code through comparison to analytical benchmarks, as well as to solutions from other published codes. The input file structure for MT3D-USGS adheres to MT3DMS conventions for backward compatibility: the new capabilities and packages described herein are readily invoked by adding three-letter package-name acronyms to the name file or by setting input flags as needed. Memory is managed in MT3D-USGS using FORTRAN modules in order to simplify code development and expansion.

  14. Exploration Laboratory Analysis

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Ronzano, K.; Shaw, T.

    2016-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) element in minimizing or reducing the risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact, space-ready laboratory analysis capability able to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long-duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements that will be finalized in FY16. Also, the downselected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.

  15. Exploration Laboratory Analysis

    NASA Technical Reports Server (NTRS)

    Krihak, M.; Ronzano, K.; Shaw, T.

    2016-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) element in minimizing or reducing the risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact, space-ready laboratory analysis capability able to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long-duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical System's lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements. The technology demonstrations and metrics for success will be finalized in FY16. Also, the downselected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.

  16. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.

    Recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army's acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.

  17. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE PAGES

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.; ...

    2016-02-01

    Recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army's acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
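
    CPAT itself is a multiphase mixed-integer linear program with scheduling and phasing constraints; as a heavily simplified, hypothetical stand-in for the core idea (selecting programs under a fixed budget to maximize a capability score), consider a 0-1 knapsack solved by dynamic programming:

```python
# Toy portfolio selection: pick modernization programs under a fixed budget
# to maximize a notional capability score. Program names, costs, and scores
# are invented; the real CPAT model is a far richer multiphase
# mixed-integer linear program.

def best_portfolio(programs, budget):
    """0-1 knapsack via dynamic programming over integer budget units."""
    # dp[b] = (best score achievable with budget b, names of chosen programs)
    dp = [(0, [])] * (budget + 1)
    for name, cost, score in programs:
        new_dp = dp[:]                     # copy so each program is used at most once
        for b in range(cost, budget + 1):
            prev_score, prev_names = dp[b - cost]
            if prev_score + score > new_dp[b][0]:
                new_dp[b] = (prev_score + score, prev_names + [name])
        dp = new_dp
    return dp[budget]

programs = [("Vehicle upgrade A", 4, 7),
            ("Sensor suite B", 3, 5),
            ("Network refresh C", 2, 4),
            ("New platform D", 6, 9)]
score, chosen = best_portfolio(programs, budget=9)
print(score, chosen)   # best score 16, from programs A, B, and C
```

    A real MILP adds time-phased budgets, fleet-mix constraints, and program interdependencies, which is why a general-purpose solver is used rather than a knapsack recursion.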

  18. Compositional control of continuously graded anode functional layer

    NASA Astrophysics Data System (ADS)

    McCoppin, J.; Barney, I.; Mukhopadhyay, S.; Miller, R.; Reitz, T.; Young, D.

    2012-10-01

    In this work, solid oxide fuel cells (SOFCs) are fabricated with linear compositionally graded anode functional layers (CGAFL) using a computer-controlled compound aerosol deposition (CCAD) system. Cells with different CGAFL thicknesses (30 μm and 50 μm) are prepared with a continuous compositionally graded interface deposited between the electrolyte and the anode-support current-collecting region. The compositional profile was characterized using energy-dispersive X-ray spectroscopic mapping. An analytical model of the compound aerosol deposition was developed. The model predicted compositional profiles for both samples that closely matched the measured profiles, suggesting that aerosol-based deposition methods are capable of creating functional gradation on length scales suitable for solid oxide fuel cell structures. The electrochemical performances of the two cells are analyzed using electrochemical impedance spectroscopy (EIS).

  19. Positron radiography of ignition-relevant ICF capsules

    NASA Astrophysics Data System (ADS)

    Williams, G. J.; Chen, Hui; Field, J. E.; Landen, O. L.; Strozzi, D. J.

    2017-12-01

    Laser-generated positrons are evaluated as a probe source to radiograph in-flight ignition-relevant inertial confinement fusion capsules. Current ultraintense laser facilities are capable of producing 2 × 10¹² relativistic positrons in a narrow energy bandwidth and short time duration. Monte Carlo simulations suggest that the unique characteristics of such positrons allow for the reconstruction of both capsule shell radius and areal density between 0.002 and 2 g/cm². The energy-downshifted positron spectrum and angular scattering of the source particles are sufficient to constrain the conditions of the capsule between preshot and stagnation. We evaluate the effects of magnetic fields near the capsule surface using analytic estimates, where it is shown that this diagnostic can tolerate line-integrated field strengths of 100 T mm.

  20. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    NASA Technical Reports Server (NTRS)

    Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen

    2016-01-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.

  1. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.

    2016-12-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
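
    The sharding strategy described in these records can be sketched in miniature: partition a long time series into chunks, have each "node" reduce its chunk to a partial result, then combine the partials. This is a toy, single-process stand-in for cloud-based sharding across filesystems or databases such as Cassandra or SciDB:

```python
# Sketch of sharded time-series analysis: split a series into contiguous
# chunks (one per notional node), reduce each chunk to (sum, count), and
# combine the partials into a global mean. Real deployments distribute the
# chunks across cloud nodes; this illustrates only the map-reduce structure.

def shard(series, n_nodes):
    """Split the series into n_nodes contiguous chunks of near-equal size."""
    k, r = divmod(len(series), n_nodes)
    chunks, start = [], 0
    for i in range(n_nodes):
        end = start + k + (1 if i < r else 0)
        chunks.append(series[start:end])
        start = end
    return chunks

def mean_over_time(series, n_nodes=4):
    # "Map": each node reduces its chunk to a partial (sum, count).
    partials = [(sum(c), len(c)) for c in shard(series, n_nodes)]
    # "Reduce": combine the partials into the global mean.
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

series = [float(i) for i in range(100)]   # stand-in for a long observation record
print(mean_over_time(series))             # 49.5, matching the single-node mean
```

    The benefit appears when chunks live on different nodes and the partial reductions run in parallel; the combine step stays cheap because each node ships back only a tiny summary, not its raw data.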

  2. Application of surface plasmon resonance for the detection of carbohydrates, glycoconjugates, and measurement of the carbohydrate-specific interactions: a comparison with conventional analytical techniques. A critical review.

    PubMed

    Safina, Gulnara

    2012-01-27

    Carbohydrates (glycans) and their conjugates with proteins and lipids contribute significantly to many biological processes, which makes these compounds important targets to be detected, monitored and identified. The identification of the carbohydrate content of their conjugates with proteins and lipids (glycoforms) is often a challenging task. Most conventional instrumental analytical techniques are time-consuming and require tedious sample pretreatment and various labeling agents. Surface plasmon resonance (SPR) has been intensively developed during the last two decades and has received increasing attention for different applications, from the real-time monitoring of affinity binding to biosensors. SPR does not require any labels and is capable of direct measurement of biospecific interactions occurring on the sensing surface. This review provides a critical comparison of modern analytical instrumental techniques with SPR in terms of their analytical capabilities to detect carbohydrates and their conjugates with proteins and lipids, and to study carbohydrate-specific binding. A few selected examples of the SPR approaches developed during 2004-2011 for the biosensing of glycoforms and for glycan-protein affinity studies are comprehensively discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  3. Collaborative visual analytics of radio surveys in the Big Data era

    NASA Astrophysics Data System (ADS)

    Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.

    2017-06-01

    Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered with a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform - allowing the research process to continue wherever you are.

  4. Recombinant drugs-on-a-chip: The usage of capillary electrophoresis and trends in miniaturized systems - A review.

    PubMed

    Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Aquino, Adriano; Cervantes, Cesar; Carrilho, Emanuel

    2016-09-07

    We present here a critical review covering conventional analytical tools of recombinant drug analysis and discuss their evolution towards miniaturized systems foreseeing a possible unique recombinant drug-on-a-chip device. Recombinant protein drugs and/or pro-drug analysis require sensitive and reproducible analytical techniques for quality control to ensure safety and efficacy of drugs according to regulatory agencies. The versatility of miniaturized systems combined with their low-cost could become a major trend in recombinant drugs and bioprocess analysis. Miniaturized systems are capable of performing conventional analytical and proteomic tasks, allowing for interfaces with other powerful techniques, such as mass spectrometry. Microdevices can be applied during the different stages of recombinant drug processing, such as gene isolation, DNA amplification, cell culture, protein expression, protein separation, and analysis. In addition, organs-on-chips have appeared as a viable alternative to testing biodrug pharmacokinetics and pharmacodynamics, demonstrating the capabilities of the miniaturized systems. The integration of individual established microfluidic operations and analytical tools in a single device is a challenge to be overcome to achieve a unique recombinant drug-on-a-chip device. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. An Analytical Framework for Assessing the Efficacy of Small Satellites in Performing Novel Imaging Missions

    NASA Astrophysics Data System (ADS)

    Weaver, Oesa A.

    In the last two decades, small satellites have opened up the use of space to groups other than governments and large corporations, allowing for increased participation and experimentation. This democratization of space was primarily enabled by two factors: improved technology and reduced launch costs. Improved technology allowed the miniaturization of components and reduced overall cost meaning many of the capabilities of larger satellites could be replicated at a fraction of the cost. In addition, new launcher systems that could host many small satellites as ride-shares on manifested vehicles lowered launch costs and simplified the process of getting a satellite into orbit. The potential of these smaller satellites to replace or augment existing systems has led to a flood of potential satellite and mission concepts, often with little rigorous study of whether the proposed satellite or mission is achievable or necessary. This work proposes an analytical framework to aid system designers in evaluating the ability of an existing concept or small satellite to perform a particular imaging mission, either replacing or augmenting existing capabilities. This framework was developed and then refined by application to the problem of using small satellites to perform a wide area search mission -- a mission not possible with existing imaging satellites, but one that would add to current capabilities. Requirements for a wide area search mission were developed, along with a list of factors that would affect image quality and system performance. Two existing small satellite concepts were evaluated for use by examining image quality from the systems, selecting an algorithm to perform the search function automatically, and then assessing mission feasibility by applying the algorithm to simulated imagery. Finally, a notional constellation design was developed to assess the number of satellites required to perform the mission. 
It was found that a constellation of 480 CubeSats producing 4 m spatial resolution panchromatic imagery and employing an on-board processing algorithm would be sufficient to perform a wide area search mission.

  6. The Relationship between Learning Capability and Organizational Performance: A Meta-Analytic Examination

    ERIC Educational Resources Information Center

    Goh, Swee C.; Elliott, Catherine; Quon, Tony K.

    2012-01-01

    Purpose: The purpose of this paper is to present a meta-analysis of a subset of published empirical research papers that measure learning capability and link it to organizational performance. It also seeks to examine both financial and non-financial performance. Design/methodology/approach: In a search of published research on learning capability…

  7. User's Guide To CHEAP0 II-Economic Analysis of Stand Prognosis Model Outputs

    Treesearch

    Joseph E. Horn; E. Lee Medema; Ervin G. Schuster

    1986-01-01

    CHEAP0 II provides supplemental economic analysis capability for users of version 5.1 of the Stand Prognosis Model, including recent regeneration and insect outbreak extensions. Although patterned after the old CHEAP0 model, CHEAP0 II has more features and analytic capabilities, especially for analysis of existing and uneven-aged stands....

  8. Why Are Teachers Absent? Utilising the Capability Approach and Critical Realism to Explain Teacher Performance in Tanzania

    ERIC Educational Resources Information Center

    Tao, Sharon

    2013-01-01

    Tanzanian teachers have been criticised for a variety of behaviours such as absenteeism, lack of preparation and rote-teaching. This paper introduces an analytical framework that attempts to provide explanations for these behaviours by locating Capability Approach concepts within a Critical Realist theory of causation. Qualitative data from three…

  9. Discreet passive explosive detection through 2-sided waveguided fluorescence

    DOEpatents

    Harper, Ross James [Stillwater, OK]; la Grone, Marcus [Cushing, OK]; Fisher, Mark [Stillwater, OK]

    2011-10-18

    The current invention provides a passive sampling device suitable for collecting and detecting the presence of target analytes. In particular, the passive sampling device is suitable for detecting nitro-aromatic compounds. The current invention further provides a passive sampling device reader suitable for determining the collection of target analytes. Additionally, the current invention provides methods for detecting target analytes using the passive sampling device and the passive sampling device reader.

  10. Analytical method for the accurate determination of tricothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

    The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze tricothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS(3) quantitation strategies in tandem. The tricothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyldeoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate adducts and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS(3) (linear ion trap) modes. Three experiments were made for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg(-1) (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three levels of spiking concentration of 40, 80, and 120 μg kg(-1). A quantitation limit of 2-6 μg kg(-1) was achieved by applying an MRM transition quantitation strategy. Under MS(3) mode, a quantitation limit of 4-10 μg kg(-1) was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS(3) quantitation, respectively. 
The successful utilization of MS(3) enabled accurate analyte fragmentation pattern matching and its quantitation, leading to the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are unavailable under MRM transition.

  11. Novel immunoassay formats for integrated microfluidic circuits: diffusion immunoassays (DIA)

    NASA Astrophysics Data System (ADS)

    Weigl, Bernhard H.; Hatch, Anson; Kamholz, Andrew E.; Yager, Paul

    2000-03-01

    Novel designs of integrated fluidic microchips allow separations, chemical reactions, and calibration-free analytical measurements to be performed directly in very small quantities of complex samples such as whole blood and contaminated environmental samples. This technology lends itself to applications such as clinical diagnostics, including tumor marker screening, and environmental sensing in remote locations. Lab-on-a-Chip based systems offer many advantages over traditional analytical devices: They consume extremely low volumes of both samples and reagents. Each chip is inexpensive and small. The sampling-to-result time is extremely short. They perform all analytical functions, including sampling, sample pretreatment, separation, dilution, and mixing steps, chemical reactions, and detection in an integrated microfluidic circuit. Lab-on-a-Chip systems enable the design of small, portable, rugged, low-cost, easy to use, yet extremely versatile and capable diagnostic instruments. In addition, fluids flowing in microchannels exhibit unique characteristics ('microfluidics'), which allow the design of analytical devices and assay formats that would not function on a macroscale. Existing Lab-on-a-Chip technologies work very well for highly predictable and homogeneous samples common in genetic testing and drug discovery processes. One of the biggest challenges for current Labs-on-a-Chip, however, is to perform analysis in the presence of the complexity and heterogeneity of actual samples such as whole blood or contaminated environmental samples. Micronics has developed a variety of Lab-on-a-Chip assays that can overcome those shortcomings. We will now present various types of novel Lab-on-a-Chip-based immunoassays, including the so-called Diffusion Immunoassays (DIA), which are based on the competitive laminar diffusion of analyte molecules and tracer molecules into a region of the chip containing antibodies that target the analyte molecules.
Advantages of this technique are a reduction in reagents, higher sensitivity, minimal preparation of complex samples such as blood, real-time calibration, and extremely rapid analysis.

  12. Revisiting the Relationship between Individual Differences in Analytic Thinking and Religious Belief: Evidence That Measurement Order Moderates Their Inverse Correlation.

    PubMed

    Finley, Anna J; Tang, David; Schmeichel, Brandon J

    2015-01-01

    Prior research has found that persons who favor more analytic modes of thought are less religious. We propose that individual differences in analytic thought are associated with reduced religious beliefs particularly when analytic thought is measured (hence, primed) first. The current study provides a direct replication of prior evidence that individual differences in analytic thinking are negatively related to religious beliefs when analytic thought is measured before religious beliefs. When religious belief is measured before analytic thinking, however, the negative relationship is reduced to non-significance, suggesting that the link between analytic thought and religious belief is more tenuous than previously reported. The current study suggests that whereas inducing analytic processing may reduce religious belief, more analytic thinkers are not necessarily less religious. The potential for measurement order to inflate the inverse correlation between analytic thinking and religious beliefs deserves additional consideration.

  13. Revisiting the Relationship between Individual Differences in Analytic Thinking and Religious Belief: Evidence That Measurement Order Moderates Their Inverse Correlation

    PubMed Central

    Finley, Anna J.; Tang, David; Schmeichel, Brandon J.

    2015-01-01

    Prior research has found that persons who favor more analytic modes of thought are less religious. We propose that individual differences in analytic thought are associated with reduced religious beliefs particularly when analytic thought is measured (hence, primed) first. The current study provides a direct replication of prior evidence that individual differences in analytic thinking are negatively related to religious beliefs when analytic thought is measured before religious beliefs. When religious belief is measured before analytic thinking, however, the negative relationship is reduced to non-significance, suggesting that the link between analytic thought and religious belief is more tenuous than previously reported. The current study suggests that whereas inducing analytic processing may reduce religious belief, more analytic thinkers are not necessarily less religious. The potential for measurement order to inflate the inverse correlation between analytic thinking and religious beliefs deserves additional consideration. PMID:26402334

  14. Next generation data systems and knowledge products to support agricultural producers and science-based policy decision making.

    PubMed

    Capalbo, Susan M; Antle, John M; Seavert, Clark

    2017-07-01

    Research on next generation agricultural systems models shows that the most important current limitation is data, both for on-farm decision support and for research investment and policy decision making. One of the greatest data challenges is to obtain reliable data on farm management decision making, both for current conditions and under scenarios of changed bio-physical and socio-economic conditions. This paper presents a framework for the use of farm-level and landscape-scale models and data to provide analysis that could be used in NextGen knowledge products, such as mobile applications or personal computer data analysis and visualization software. We describe two analytical tools - AgBiz Logic and TOA-MD - that demonstrate the current capability of farm-level and landscape-scale models. The use of these tools is explored with a case study of an oilseed crop, Camelina sativa, which could be used to produce jet aviation fuel. We conclude with a discussion of innovations needed to facilitate the use of farm and policy-level models to generate data and analysis for improved knowledge products.

  15. Improvement of analytical capabilities of neutron activation analysis laboratory at the Colombian Geological Survey

    NASA Astrophysics Data System (ADS)

    Parrado, G.; Cañón, Y.; Peña, M.; Sierra, O.; Porras, A.; Alonso, D.; Herrera, D. C.; Orozco, J.

    2016-07-01

    The Neutron Activation Analysis (NAA) laboratory at the Colombian Geological Survey has developed a technique for multi-elemental analysis of soil and plant matrices, based on Instrumental Neutron Activation Analysis (INAA) using the comparator method. In order to evaluate the analytical capabilities of the technique, the laboratory has been participating in inter-comparison tests organized by Wepal (Wageningen Evaluating Programs for Analytical Laboratories). In this work, the experimental procedure and results for the multi-elemental analysis of four soil and four plant samples during participation in the first round of 2015 of the Wepal proficiency test are presented. Only elements with radioactive isotopes with medium and long half-lives have been evaluated: 15 elements for soils (As, Ce, Co, Cr, Cs, Fe, K, La, Na, Rb, Sb, Sc, Th, U and Zn) and 7 elements for plants (Br, Co, Cr, Fe, K, Na and Zn). The performance assessment by Wepal, based on Z-score distributions, showed that most results achieved |Z-scores| ≤ 3.
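    The Z-score criterion used in proficiency testing of this kind can be illustrated with a minimal sketch; the measured and assigned values below are hypothetical, not taken from the Wepal round:

```python
def z_score(measured, assigned, sigma):
    """Proficiency-test Z-score: deviation of a laboratory result from
    the assigned value, in units of the target standard deviation."""
    return (measured - assigned) / sigma

# Hypothetical example: concentration of one element in mg/kg.
z = z_score(measured=52.0, assigned=50.0, sigma=2.5)  # 0.8
acceptable = abs(z) <= 3  # criterion reported in the record above
```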

  16. Site-specific range uncertainties caused by dose calculation algorithms for proton therapy

    NASA Astrophysics Data System (ADS)

    Schuemann, J.; Dowdell, S.; Grassberger, C.; Min, C. H.; Paganetti, H.

    2014-08-01

    The purpose of this study was to assess the possibility of introducing site-specific range margins to replace current generic margins in proton therapy. Further, the goal was to study the potential of reducing margins with current analytical dose calculations methods. For this purpose we investigate the impact of complex patient geometries on the capability of analytical dose calculation algorithms to accurately predict the range of proton fields. Dose distributions predicted by an analytical pencil-beam algorithm were compared with those obtained using Monte Carlo (MC) simulations (TOPAS). A total of 508 passively scattered treatment fields were analyzed for seven disease sites (liver, prostate, breast, medulloblastoma-spine, medulloblastoma-whole brain, lung and head and neck). Voxel-by-voxel comparisons were performed on two-dimensional distal dose surfaces calculated by pencil-beam and MC algorithms to obtain the average range differences and root mean square deviation for each field for the distal position of the 90% dose level (R90) and the 50% dose level (R50). The average dose degradation of the distal falloff region, defined as the distance between the distal position of the 80% and 20% dose levels (R80-R20), was also analyzed. All ranges were calculated in water-equivalent distances. Considering total range uncertainties and uncertainties from dose calculation alone, we were able to deduce site-specific estimations. For liver, prostate and whole brain fields our results demonstrate that a reduction of currently used uncertainty margins is feasible even without introducing MC dose calculations. We recommend range margins of 2.8% + 1.2 mm for liver and prostate treatments and 3.1% + 1.2 mm for whole brain treatments, respectively. On the other hand, current margins seem to be insufficient for some breast, lung and head and neck patients, at least if used generically. 
If no case-specific adjustments are applied, a generic margin of 6.3% + 1.2 mm would be needed for breast, lung and head and neck treatments. We conclude that the currently used generic range uncertainty margins in proton therapy should be redefined site-specifically and that complex geometries may require a field-specific adjustment. Routine verifications of treatment plans using MC simulations are recommended for patients with heterogeneous geometries.
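    The recommended margins above combine a relative term and a fixed absolute term; a minimal sketch of that arithmetic, assuming the percentage is applied to the nominal beam range in water-equivalent millimetres:

```python
def range_margin(nominal_range_mm, relative_pct, absolute_mm):
    """Range margin = (relative % of nominal range) + fixed absolute term."""
    return nominal_range_mm * relative_pct / 100.0 + absolute_mm

# Values from the study: 2.8% + 1.2 mm (liver, prostate),
# 3.1% + 1.2 mm (whole brain), 6.3% + 1.2 mm (generic breast/lung/H&N).
liver_margin = range_margin(100.0, 2.8, 1.2)  # 4.0 mm for a 100 mm range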

  17. Total analysis systems with Thermochromic Etching Discs technology.

    PubMed

    Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel

    2014-12-16

    A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features such as track independency, selective irradiation, a high power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities to design new compact disc-based total analysis systems applicable in chemistry and life sciences. In this paper, TED analytical implementation is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is demonstrated by describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within the point-of-care framework.

  18. Analyzing Matrices of Meta-Analytic Correlations: Current Practices and Recommendations

    ERIC Educational Resources Information Center

    Sheng, Zitong; Kong, Wenmo; Cortina, Jose M.; Hou, Shuofei

    2016-01-01

    Researchers have become increasingly interested in conducting analyses on meta-analytic correlation matrices. Methodologists have provided guidance and recommended practices for the application of this technique. The purpose of this article is to review current practices regarding analyzing meta-analytic correlation matrices, to identify the gaps…

  19. Application of Out-of-Plane Warping to Control Rotor Blade Twist

    NASA Technical Reports Server (NTRS)

    VanWeddingen, Yannick; Bauchau, Olivier; Kottapalli, Sesi; Ozbay, Serkan; Mehrotra, Yogesh

    2012-01-01

    The goal of this ongoing study is to develop and demonstrate the feasibility of a blade actuation system to dynamically change the twist, and/or the camber, of an airfoil section and, consequently, alter the in-flight aerodynamic loading on the blade for efficient flight control. The required analytical and finite element tools are under development to enable an accurate and comprehensive aeroelastic assessment of the current Full-Blade Warping and 3D Warping Actuated Trailing Edge Flap concepts. The feasibility of the current concepts for swashplateless rotors and higher harmonic blade control is also being investigated. In particular, the aim is to complete the following objectives, some of which have been completed (as noted below) and others that are currently ongoing: i) Develop a Vlasov finite element model and validate against the ABAQUS shell models (completed). ii) Implement the 3D warping actuation concept within the comprehensive analysis code DYMORE. iii) Perform preliminary aeroelastic simulations of blades using DYMORE with 3D warping actuation: a) Investigate the blade behavior under 1 per/rev actuation. Determine whether sufficient twist can be generated and sustained to achieve primary blade control. b) Investigate the behavior of a trailing edge flap configuration under higher harmonic excitations. Determine how much twist can be obtained at the harmonics 2-5 per/rev. iv) Determine actuator specifications such as the power required, load and displacements, and identify the stress and strain distributions in the actuated blades. In general, the completion of Item ii) above will give an additional research capability in rotorcraft dynamics analyses, i.e., the capability to calculate the rotor blade twist due to warping, something that is not currently available in any of the existing comprehensive rotorcraft analyses.

  20. Revisiting the positive DC corona discharge theory: Beyond Peek's and Townsend's law

    NASA Astrophysics Data System (ADS)

    Monrolin, Nicolas; Praud, Olivier; Plouraboué, Franck

    2018-06-01

    The classical positive Corona Discharge theory in a cylindrical axisymmetric configuration is revisited in order to find analytically the influence of gas properties and thermodynamic conditions on the corona current. The matched asymptotic expansion of Durbin and Turyn [J. Phys. D: Appl. Phys. 20, 1490-1495 (1987)] of a simplified but self-consistent problem is performed and explicit analytical solutions are derived. The mathematical derivation enables us to express a new positive DC corona current-voltage characteristic, choosing either a dimensionless or dimensional formulation. In dimensional variables, the current-voltage law and the corona inception voltage explicitly depend on the electrode size and physical gas properties such as ionization and photoionization parameters. The analytical predictions are successfully compared with experiments and with Peek's and Townsend's laws. An analytical expression of the corona inception voltage φ_on is proposed, which depends on the known values of physical parameters without adjustable parameters. As a proof of consistency, the classical Townsend current-voltage law I = Cφ(φ − φ_on) is retrieved by linearizing the non-dimensional analytical solution. A brief parametric study showcases the interest of this analytical current model, especially for exploring small corona wires or considering various thermodynamic conditions.
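    The Townsend law quoted in this record is a simple quadratic in the applied voltage; a minimal sketch, with purely illustrative parameter values (C and the voltages below are hypothetical, not from the paper):

```python
def townsend_current(phi, phi_on, C):
    """Townsend current-voltage law for positive DC corona:
    I = C * phi * (phi - phi_on); no corona current below inception."""
    return C * phi * (phi - phi_on) if phi > phi_on else 0.0

# Hypothetical values: C in A/V^2, voltages in volts.
current = townsend_current(phi=12_000.0, phi_on=8_000.0, C=1e-12)
below_onset = townsend_current(phi=5_000.0, phi_on=8_000.0, C=1e-12)  # 0.0
```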

  1. Spin–orbit DFT with Analytic Gradients and Applications to Heavy Element Compounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Zhiyong

    We have implemented the unrestricted DFT approach with one-electron spin–orbit operators in the massively parallel NWChem program. Also implemented is the analytic gradient in the DFT approach with spin–orbit interactions. The current capabilities include single-point calculations and geometry optimization. Vibrational frequencies can be calculated numerically from the analytically calculated gradients. The implementation is based on the spin–orbit interaction operator derived from the effective core potential approach. The exchange functionals used in the implementation are functionals derived for non-spin–orbit calculations, including GGA as well as hybrid functionals. Spin–orbit Hartree–Fock calculations can also be carried out. We have applied the spin–orbit DFT methods to the uranyl aqua complexes. We have optimized the structures and calculated the vibrational frequencies of both (UO2^2+)aq and (UO2^+)aq with and without spin–orbit effects. The effects of the spin–orbit interaction on the structures and frequencies of these two complexes are discussed. We also carried out calculations for Th2, and several low-lying electronic states are calculated. Our results indicate that, for open-shell systems, there are significant effects due to the spin–orbit interaction, and the electronic configurations with and without spin–orbit interactions could change due to the occupation of orbitals of larger spin–orbit interactions.

  2. Empirical testing of an analytical model predicting electrical isolation of photovoltaic models

    NASA Astrophysics Data System (ADS)

    Garcia, A., III; Minning, C. P.; Cuddihy, E. F.

    A major design requirement for photovoltaic modules is that the encapsulation system be capable of withstanding large DC potentials without electrical breakdown. Presented is a simple analytical model which can be used to estimate material thickness to meet this requirement for a candidate encapsulation system or to predict the breakdown voltage of an existing module design. A series of electrical tests to verify the model are described in detail. The results of these verification tests confirmed the utility of the analytical model for preliminary design of photovoltaic modules.

  3. Detection of biological molecules using chemical amplification and optical sensors

    DOEpatents

    Van Antwerp, William Peter; Mastrototaro, John Joseph

    2001-01-01

    Methods are provided for the determination of the concentration of biological levels of polyhydroxylated compounds, particularly glucose. The methods utilize an amplification system that is an analyte transducer immobilized in a polymeric matrix, where the system is implantable and biocompatible. Upon interrogation by an optical system, the amplification system produces a signal capable of detection external to the skin of the patient. Quantitation of the analyte of interest is achieved by measurement of the emitted signal. Specifically, the analyte transducer immobilized in a polymeric matrix can be a boronic acid moiety.

  4. Lightweight fuel cell powerplant components program

    NASA Technical Reports Server (NTRS)

    Martin, R. E.

    1980-01-01

    A lightweight hydrogen-oxygen alkaline fuel cell incorporated into the design of a lightweight fuel cell powerplant (LFCP) was analytically and experimentally developed. The powerplant operates with passive water removal, which contributes to a lower system weight and extended operating life. A preliminary LFCP specification and design table were developed, and a lightweight power section for the LFCP, consisting of repeating two-cell modules, was designed. Two four-cell modules were designed incorporating 0.508 sq ft active area Space Shuttle technology fuel cells. Over 1,200 hours of single-cell and over 8,800 hours of two-cell module testing was completed. The 0.25 sq ft active area lightweight cell design was shown to be capable of operating on propellant-purity reactants out to a current density of 600 ASF. Endurance testing of the two-cell module configuration exceeded the 2,500-hour LFCP voltage requirements out to 3,700 hours. A two-cell module capable of operating at increased reactant pressure completed 1,000 hours of operation at a 30 psia reactant pressure. A lightweight power section consisting of fifteen two-cell modules connected electrically in series was fabricated.

  5. Averting Denver Airports on a Chip

    NASA Technical Reports Server (NTRS)

    Sullivan, Kevin J.

    1995-01-01

    As a result of recent advances in software engineering capabilities, we are now in a more stable environment. De-facto hardware and software standards are emerging. Work on software architecture and design patterns signals a consensus on the importance of early system-level design decisions, and agreements on the uses of certain paradigmatic software structures. We now routinely build systems that would have been risky or infeasible a few years ago. Unfortunately, technological developments threaten to destabilize software design again. Systems designed around novel computing and peripheral devices will spark ambitious new projects that will stress current software design and engineering capabilities. Micro-electro-mechanical systems (MEMS) and related technologies provide the physical basis for new systems with the potential to produce this kind of destabilizing effect. One important response to anticipated software engineering and design difficulties is carefully directed engineering-scientific research. Two specific problems meriting substantial research attention are: A lack of sufficient means to build software systems by generating, extending, specializing, and integrating large-scale reusable components; and a lack of adequate computational and analytic tools to extend and aid engineers in maintaining intellectual control over complex software designs.

  6. A new method to evaluate human-robot system performance

    NASA Technical Reports Server (NTRS)

    Rodriguez, G.; Weisbin, C. R.

    2003-01-01

    One of the key issues in space exploration is that of deciding what space tasks are best done with humans, with robots, or a suitable combination of each. In general, human and robot skills are complementary. Humans provide as yet unmatched capabilities to perceive, think, and act when faced with anomalies and unforeseen events, but there can be huge potential risks to human safety in getting these benefits. Robots provide complementary skills in being able to work in extremely risky environments, but their ability to perceive, think, and act by themselves is currently not error-free, although these capabilities are continually improving with the emergence of new technologies. Substantial past experience validates these generally qualitative notions. However, there is a need for more rigorously systematic evaluation of human and robot roles, in order to optimize the design and performance of human-robot system architectures using well-defined performance evaluation metrics. This article summarizes a new analytical method to conduct such quantitative evaluations. While the article focuses on evaluating human-robot systems, the method is generally applicable to a much broader class of systems whose performance needs to be evaluated.

  7. Modeling and Simulation Tools for Heavy Lift Airships

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the National laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  8. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating has become a requirement. This paper presents the recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.

  9. Discreet passive explosive detection through 2-sided wave guided fluorescence

    DOEpatents

    Harper, Ross James; la Grone, Marcus; Fisher, Mark

    2012-10-16

    The current invention provides a passive sampling device suitable for collecting and detecting the presence of target analytes. In particular, the passive sampling device is suitable for detecting nitro-aromatic compounds. The current invention further provides a passive sampling device reader suitable for determining the collection of target analytes. Additionally, the current invention provides methods for detecting target analytes using the passive sampling device and the passive sampling device reader.

  10. Turbine Vane External Heat Transfer. Volume 1: Analytical and Experimental Evaluation of Surface Heat Transfer Distributions with Leading Edge Showerhead Film Cooling

    NASA Technical Reports Server (NTRS)

    Turner, E. R.; Wilson, M. D.; Hylton, L. D.; Kaufman, R. M.

    1985-01-01

    Progress in predictive design capabilities for external heat transfer to turbine vanes was summarized. A two dimensional linear cascade (previously used to obtain vane surface heat transfer distributions on nonfilm cooled airfoils) was used to examine the effect of leading edge shower head film cooling on downstream heat transfer. The data were used to develop and evaluate analytical models. Modifications to the two dimensional boundary layer model are described. The results were used to formulate and test an effective viscosity model capable of predicting heat transfer phenomena downstream of the leading edge film cooling array on both the suction and pressure surfaces, with and without mass injection.

  11. Numerical Analysis of Stress Concentration in Isotropic and Laminated Plates with Inclined Elliptical Holes

    NASA Astrophysics Data System (ADS)

    Khechai, Abdelhak; Tati, Abdelouahab; Belarbi, Mohamed Ouejdi; Guettala, Abdelhamid

    2018-03-01

    The design of high-performance composite structures frequently includes discontinuities to reduce the weight and fastener holes for joining. Understanding the behavior of perforated laminates is necessary for structural design. In the current work, stress concentrations taking place in laminated and isotropic plates subjected to tensile load are investigated. The stress concentrations are obtained using a recent quadrilateral finite element of four nodes with 32 DOFs. The present finite element (PE) is a combination of two finite elements. The first finite element is a linear isoparametric membrane element and the second is a high precision Hermitian element. One of the essential objectives of the current investigation is to confirm the capability and efficiency of the PE for stress determination in perforated laminates. Different geometric parameters, such as the cutout shape, size, and orientation, which have a considerable effect on the stress values, are studied. Using the present finite element formulation, the obtained results are found to be in good agreement with the analytical findings, which validates the capability and the efficiency of the proposed formulation. Finally, to understand the effect of material parameters such as the orientation of fibers and the degree of orthotropy on the stress values, many figures are presented using different ellipse major-to-minor axis ratios. The stress concentration values are considerably affected by increasing the orientation angle of the fibers and degree of orthotropy.
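    For context on the analytical findings such results are compared against: the classical Inglis solution for an elliptical hole in an infinite isotropic plate under remote tension (applied perpendicular to the major axis) gives a stress concentration factor K_t = 1 + 2a/b. A minimal sketch of this standard closed-form result, which is background theory rather than the finite-element formulation of the paper:

```python
def inglis_kt(a, b):
    """Stress concentration factor at the tip of an elliptical hole with
    semi-axes a and b in an infinite isotropic plate, remote tension
    applied perpendicular to the 2a axis: K_t = 1 + 2a/b."""
    return 1.0 + 2.0 * a / b

kt_circle = inglis_kt(1.0, 1.0)   # 3.0: the classical circular-hole value
kt_slender = inglis_kt(4.0, 1.0)  # 9.0: sharper ellipse, higher concentration
```

Laminated plates depart from this isotropic value with fiber orientation and orthotropy ratio, which is what the finite-element study above quantifies.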

  12. Keeping the Momentum and Nuclear Forensics at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, Robert Ernest; Dion, Heather M.; Dry, Donald E.

    LANL has 70 years of experience in nuclear forensics and supports the community through a wide variety of efforts and leveraged capabilities: expanding the understanding of nuclear forensics, providing training on nuclear forensics methods, and developing bilateral relationships to expand our understanding of nuclear forensic science. LANL remains highly supportive of several key organizations tasked with carrying forth the Nuclear Security Summit messages: IAEA, GICNT, and INTERPOL. Analytical chemistry measurements on plutonium and uranium matrices are critical to numerous programs, including safeguards accountancy verification measurements. Los Alamos National Laboratory operates capable actinide analytical chemistry and material science laboratories suitable for nuclear material and environmental forensic characterization, and uses numerous means to validate and independently verify that measurement data quality objectives are met. Numerous LANL nuclear facilities support the nuclear material handling, preparation, and analysis capabilities necessary to evaluate samples containing nearly any mass of an actinide (attogram to kilogram levels).

  13. Two logics of policy intervention in immigrant integration: an institutionalist framework based on capabilities and aspirations.

    PubMed

    Lutz, Philipp

    2017-01-01

    The effectiveness of immigrant integration policies has gained considerable attention across Western democracies dealing with ethnically and culturally diverse societies. However, the findings on what type of policy produces more favourable integration outcomes remain inconclusive. The conflation of normative and analytical assumptions on integration is a major challenge for causal analysis of integration policies. This article applies actor-centered institutionalism as a new framework for the analysis of immigrant integration outcomes in order to separate two different mechanisms of policy intervention. Conceptualising integration outcomes as a function of capabilities and aspirations allows separating assumptions on the policy intervention in assimilation and multiculturalism as the two main types of policy approaches. The article illustrates that assimilation is an incentive-based policy and primarily designed to increase immigrants' aspirations, whereas multiculturalism is an opportunity-based policy and primarily designed to increase immigrants' capabilities. Conceptualising causal mechanisms of policy intervention clarifies the link between normative concepts of immigrant integration and analytical concepts of policy effectiveness.

  14. High temperature ion channels and pores

    NASA Technical Reports Server (NTRS)

    Cheley, Stephen (Inventor); Gu, Li Qun (Inventor); Bayley, Hagan (Inventor); Kang, Xiaofeng (Inventor)

    2011-01-01

    The present invention includes an apparatus, system and method for stochastic sensing of an analyte with a protein pore. The protein pore may be an engineered protein pore, such as an ion channel, operating at temperatures above 55 °C and even as high as near 100 °C. The analyte may be any reactive analyte, including chemical weapons, environmental toxins and pharmaceuticals. The analyte covalently bonds to the sensor element to produce a detectable signal, such as a change in electrical current. Detection of the signal allows identification of the analyte and determination of its concentration in a sample solution. Multiple analytes present in the same solution may also be detected.

  15. Attitude Determination Error Analysis System (ADEAS) mathematical specifications document

    NASA Technical Reports Server (NTRS)

    Nicholson, Mark; Markley, F.; Seidewitz, E.

    1988-01-01

    The mathematical specifications of Release 4.0 of the Attitude Determination Error Analysis System (ADEAS), which provides a general-purpose linear error analysis capability for various spacecraft attitude geometries and determination processes, are presented. The analytical basis of the system is presented, and detailed equations are provided for both three-axis-stabilized and spin-stabilized attitude sensor models.
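    The kind of general-purpose linear error analysis ADEAS provides can be illustrated with a small covariance-propagation sketch. Everything below is a generic illustration, not ADEAS code: the six-state model (attitude errors plus gyro biases), the attitude-only measurement matrix, and the noise levels are invented for the example.

```python
import numpy as np

def linear_error_analysis(P0, H, R, n_measurements):
    """Sequentially apply linear measurement updates to the error covariance:
    P <- (I - K H) P with K = P H^T (H P H^T + R)^-1 (information gain only,
    no dynamics step, as in a static linear error analysis)."""
    P = P0.copy()
    for _ in range(n_measurements):
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        P = (np.eye(P.shape[0]) - K @ H) @ P
    return P

# Hypothetical 6-state budget: attitude errors (rad^2) and gyro biases
# ((rad/s)^2), uncorrelated a priori.
P0 = np.diag([1e-4] * 3 + [1e-8] * 3)
H = np.hstack([np.eye(3), np.zeros((3, 3))])   # sensor observes attitude only
R = np.eye(3) * 1e-6                           # sensor noise covariance

P = linear_error_analysis(P0, H, R, 10)
print(np.sqrt(np.diag(P))[:3])  # post-update attitude 1-sigma errors
```

    Because the measurement matrix never touches the bias states (and there is no dynamics coupling in this static sketch), only the attitude variances shrink.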

  16. Nonlinear Structural Health Monitoring of the Responsive Space Satellite Systems Using Magneto Elastic Active Sensors (MEAS)

    DTIC Science & Technology

    2011-11-30

    detection of fatigue damage at an early stage, well before the onset of fracture and crack development. Analytical and numerical models of MEAS and MMI are suggested. Finally, MEAS capability... 2.4.1 Far-Field Crack Detection

  17. Visual Analytics and Storytelling through Video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Pak C.; Perrine, Kenneth A.; Mackey, Patrick S.

    2005-10-31

    This paper supplements a video clip submitted to the Video Track of IEEE Symposium on Information Visualization 2005. The original video submission applies a two-way storytelling approach to demonstrate the visual analytics capabilities of a new visualization technique. The paper presents our video production philosophy, describes the plot of the video, explains the rationale behind the plot, and finally, shares our production experiences with our readers.

  18. A critical evaluation of the Beckman Coulter Access hsTnI: Analytical performance, reference interval and concordance.

    PubMed

    Pretorius, Carel J; Tate, Jillian R; Wilgen, Urs; Cullen, Louise; Ungerer, Jacobus P J

    2018-05-01

    We investigated the analytical performance, outlier rate, carryover and reference interval of the Beckman Coulter Access hsTnI in detail and compared it with historical and other commercial assays. We compared the imprecision, detection capability, analytical sensitivity, outlier rate and carryover against two previous Access AccuTnI assay versions. We established the reference interval with stored samples from a previous study and compared the concordances and variances with the Access AccuTnI+3 as well as with two commercial assays. The Access hsTnI had excellent analytical sensitivity, with a calibration slope 5.6 times steeper than the Access AccuTnI+3. The detection capability was markedly improved, with the SD of the blank 0.18-0.20 ng/L, LoB 0.29-0.33 ng/L and LoD 0.58-0.69 ng/L. All the reference interval samples had a result above the LoB value. At a mean concentration of 2.83 ng/L the SD was 0.28 ng/L (CV 9.8%). Carryover (0.005%) and outlier (0.046%) rates were similar to the Access AccuTnI+3. The combined male and female 99th percentile reference interval was 18.2 ng/L (90% CI 13.2-21.1 ng/L). Concordance amongst the assays was poor, with only 16.7%, 19.6% and 15.2% of samples identified by all 4 assays as above the 99th, 97.5th and 95th percentiles. Analytical imprecision was a minor contributor to the observed variances between assays. The Beckman Coulter Access hsTnI assay has excellent analytical sensitivity and precision characteristics at concentrations close to zero ng/L. This allows cTnI measurement in all healthy individuals and the capability to identify numerically small differences between serial samples as statistically significant. Concordance in healthy individuals remains poor amongst assays. Crown Copyright © 2018. Published by Elsevier Inc. All rights reserved.
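    The LoB and LoD figures quoted above are consistent with the standard parametric construction (LoB = mean of blanks + 1.645 × SD of blanks; LoD = LoB + 1.645 × SD of a low-concentration sample, CLSI EP17 style). A minimal sketch using simulated blank replicates at the paper's quoted blank SD (~0.18 ng/L); the data are synthetic, not the study's.

```python
import numpy as np

def limit_of_blank(blank_measurements, z=1.645):
    """LoB = mean_blank + 1.645 * SD_blank (95th percentile, normal model)."""
    b = np.asarray(blank_measurements, dtype=float)
    return b.mean() + z * b.std(ddof=1)

def limit_of_detection(lob, low_sample_sd, z=1.645):
    """LoD = LoB + 1.645 * SD of a low-concentration sample."""
    return lob + z * low_sample_sd

rng = np.random.default_rng(1)
blanks = rng.normal(0.0, 0.18, size=60)   # simulated blank replicates, SD ~0.18 ng/L
lob = limit_of_blank(blanks)
lod = limit_of_detection(lob, low_sample_sd=0.18)
print(f"LoB ~ {lob:.2f} ng/L, LoD ~ {lod:.2f} ng/L")
```

    With a blank SD of 0.18 ng/L this lands near the reported LoB of ~0.3 ng/L and LoD of ~0.6 ng/L.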

  19. Drinking Water and Wastewater Laboratory Networks

    EPA Pesticide Factsheets

    This website provides the drinking water sector with an integrated nationwide network of laboratories with the analytical capability to respond to intentional and unintentional drinking water incidents.

  20. Visual Analytics for Law Enforcement: Deploying a Service-Oriented Analytic Framework for Web-based Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dowson, Scott T.; Bruce, Joseph R.; Best, Daniel M.

    2009-04-14

    This paper presents key components of the Law Enforcement Information Framework (LEIF) that provides communications, situational awareness, and visual analytics tools in a service-oriented architecture supporting web-based desktop and handheld device users. LEIF simplifies interfaces and visualizations of well-established visual analytical techniques to improve usability. Advanced analytics capability is maintained by enhancing the underlying processing to support the new interface. LEIF development is driven by real-world user feedback gathered through deployments at three operational law enforcement organizations in the US. LEIF incorporates a robust information ingest pipeline supporting a wide variety of information formats. LEIF also insulates interface and analytical components from information sources, making it easier to adapt the framework for many different data repositories.

  1. Examining the Use of a Visual Analytics System for Sensemaking Tasks: Case Studies with Domain Experts.

    PubMed

    Kang, Youn-Ah; Stasko, J

    2012-12-01

    While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.

  2. ANALYTICAL MODELING OF ELECTRON BACK-BOMBARDMENT INDUCED CURRENT INCREASE IN UN-GATED THERMIONIC CATHODE RF GUNS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edelen, J. P.; Sun, Y.; Harris, J. R.

    In this paper we derive analytical expressions for the output current of an un-gated thermionic cathode RF gun in the presence of back-bombardment heating. We provide a brief overview of back-bombardment theory and discuss comparisons between the analytical back-bombardment predictions and simulation models. We then derive an expression for the output current as a function of the RF repetition rate and discuss relationships between back-bombardment, field enhancement, and output current. We discuss in detail the relevant approximations and then provide predictions about how the output current should vary as a function of repetition rate for some given system configurations.
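    The qualitative mechanism (back-bombardment heats the cathode, and a hotter cathode emits more current) can be sketched with the Richardson-Dushman law. The toy heating model below, in which equilibrium temperature grows linearly with repetition rate, and the cathode parameters are invented stand-ins, not the paper's derived expressions.

```python
import numpy as np

KB = 8.617333e-5     # Boltzmann constant, eV/K
A_RD = 1.20173e6     # Richardson constant, A m^-2 K^-2
WORK_FN = 2.0        # effective work function, eV (hypothetical cathode)

def current_density(T):
    """Richardson-Dushmann emission: J = A T^2 exp(-W / kT), in A/m^2."""
    return A_RD * T**2 * np.exp(-WORK_FN / (KB * T))

def cathode_temperature(rep_rate_hz, T0=1300.0, dT_per_hz=2.0):
    """Toy back-bombardment heating model: equilibrium temperature rises
    linearly with RF repetition rate (coefficient invented for illustration)."""
    return T0 + dT_per_hz * rep_rate_hz

for f in (1.0, 5.0, 10.0):
    T = cathode_temperature(f)
    print(f"{f:5.1f} Hz -> T = {T:.0f} K, J = {current_density(T):.3e} A/m^2")
```

    The exponential dependence on temperature is what makes even modest back-bombardment heating produce the strong output-current growth with repetition rate that the paper analyzes.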

  3. Exact analytical solution of a classical Josephson tunnel junction problem

    NASA Astrophysics Data System (ADS)

    Kuplevakhsky, S. V.; Glukhov, A. M.

    2010-10-01

    We give an exact and complete analytical solution of the classical problem of a Josephson tunnel junction of arbitrary length W ∈ (0, ∞) in the presence of external magnetic fields and transport currents. Contrary to a widespread belief, the exact analytical solution unambiguously proves that there is no qualitative difference between so-called "small" (W≪1) and "large" junctions (W≫1). Another unexpected physical implication of the exact analytical solution is the existence (in the current-carrying state) of unquantized Josephson vortices carrying fractional flux and located near one of the edges of the junction. We also refine the mathematical definition of the critical transport current.
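    The underlying boundary-value problem can also be explored numerically as a complement to the exact solution. A sketch under the standard normalization assumptions (lengths in units of the Josephson penetration depth, static sine-Gordon equation φ'' = sin φ, zero transport current, normalized external field h entering through Neumann boundary conditions); the values of W and h are arbitrary illustrations.

```python
import numpy as np
from scipy.integrate import solve_bvp

W, h = 5.0, 0.6  # normalized junction length and external field (illustrative)

def rhs(x, y):
    # y[0] = phi, y[1] = phi'; static sine-Gordon: phi'' = sin(phi)
    return np.vstack([y[1], np.sin(y[0])])

def bc(ya, yb):
    # Neumann conditions: phi'(0) = phi'(W) = h (no transport current)
    return np.array([ya[1] - h, yb[1] - h])

x = np.linspace(0.0, W, 200)
y0 = np.zeros((2, x.size))
y0[1] = h                       # initial guess consistent with the BCs
sol = solve_bvp(rhs, bc, x, y0)

# Total Josephson current: integrating phi'' = sin(phi) gives
# I = phi'(W) - phi'(0), which is zero for these boundary conditions.
phi = sol.sol(x)[0]
f = np.sin(phi)
I = float(np.sum((f[:-1] + f[1:]) * np.diff(x)) / 2.0)  # trapezoid rule
print(sol.status, round(I, 4))
```

    The numerically integrated Josephson current vanishes, consistent with the identity I = φ'(W) − φ'(0) for the chosen boundary conditions.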

  4. Harnessing Aptamers to Overcome Challenges in Gluten Detection

    PubMed Central

    Miranda-Castro, Rebeca; de-los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J.; Lobo-Castañón, María Jesús

    2016-01-01

    Celiac disease is a lifelong autoimmune disorder triggered by foods containing gluten, the storage protein in wheat, rye, and barley. The rapidly escalating number of patients diagnosed with this disease poses a great challenge to both food industry and authorities to guarantee food safety for all. Therefore, intensive efforts are being made to establish minimal disease-eliciting doses of gluten and consequently to improve gluten-free labeling. These efforts depend to a high degree on the availability of methods capable of detecting the protein in food samples at levels as low as possible. Current analytical approaches rely on the use of antibodies as selective recognition elements. With limited sensitivity, these methods exhibit some deficiencies that compromise the accuracy of the obtained results. Aptamers provide an ideal alternative for designing biosensors for fast and selective measurement of gluten in foods. This article highlights the challenges in gluten detection, the current status of the use of aptamers for solving this problem, and what remains to be done to move these systems into commercial applications. PMID:27104578

  5. Extravehicular Activity Operations Concepts Under Communication Latency and Bandwidth Constraints

    NASA Technical Reports Server (NTRS)

    Beaton, Kara H.; Chappell, Steven P.; Abercromby, Andrew F. J.; Miller, Matthew J.; Nawotniak, Shannon Kobs; Hughes, Scott; Brady, Allyson; Lim, Darlene S. S.

    2017-01-01

    The Biologic Analog Science Associated with Lava Terrains (BASALT) project is a multi-year program dedicated to iteratively develop, implement, and evaluate concepts of operations (ConOps) and supporting capabilities intended to enable and enhance human scientific exploration of Mars. This paper describes the planning, execution, and initial results from the first field deployment, referred to as BASALT-1, which consisted of a series of 10 simulated extravehicular activities (EVAs) on volcanic flows in Idaho's Craters of the Moon (COTM) National Monument. The ConOps and capabilities deployed and tested during BASALT-1 were based on previous NASA trade studies and analog testing. Our primary research question was whether those ConOps and capabilities work acceptably when performing real (non-simulated) biological and geological scientific exploration under 4 different Mars-to-Earth communication conditions: 5 and 15 min one-way light time (OWLT) communication latencies and low (0.512 Mb/s uplink, 1.54 Mb/s downlink) and high (5.0 Mb/s uplink, 10.0 Mb/s downlink) bandwidth conditions representing the lower and higher limits of technical communication capabilities currently proposed for future human exploration missions. The synthesized results of BASALT-1 with respect to the ConOps and capabilities assessment were derived from a variety of sources, including EVA task timing data, network analytic data, and subjective ratings and comments regarding the scientific and operational acceptability of the ConOps and the extent to which specific capabilities were enabling and enhancing, and are presented here. BASALT-1 established preliminary findings that the baseline ConOps, software systems, and communication protocols were scientifically and operationally acceptable with minor improvements desired by the "Mars" extravehicular (EV) and intravehicular (IV) crewmembers, but unacceptable with improvements required by the "Earth" Mission Support Center.
These data will provide a basis for guiding and prioritizing capability development for future BASALT deployments and, ultimately, future human exploration missions.

  6. ERLN Water Focus Area

    EPA Pesticide Factsheets

    The Water Laboratory Alliance (WLA), within Environmental Response Laboratory Network, maintains analytical capability and capacity in the event of intentional and unintentional water contamination with chemical, biological and radiochemical contaminants.

  7. Workspace Program for Complex-Number Arithmetic

    NASA Technical Reports Server (NTRS)

    Patrick, M. C.; Howell, Leonard W., Jr.

    1986-01-01

    COMPLEX is a workspace program designed to empower APL with complex-number capabilities. Complex-variable methods provide analytical tools invaluable for applications in mathematics, science, and engineering. COMPLEX is written in APL.

  8. Final report on mid-polarity analytes in food matrix: mid-polarity pesticides in tea

    NASA Astrophysics Data System (ADS)

    Sin, Della W. M.; Li, Hongmei; Wong, S. K.; Lo, M. F.; Wong, Y. L.; Wong, Y. C.; Mok, C. S.

    2015-01-01

    At the Paris meeting in April 2011, the CCQM Working Group on Organic Analysis (OAWG) agreed on a suite of Track A studies meant to support the assessment of measurement capabilities needed for the delivery of measurement services within the scope of the OAWG Terms of Reference. One of the studies discussed and agreed upon for the suite of ten Track A studies that support the 5-year plan of the CCQM Core Competence assessment was CCQM-K95 'Mid-Polarity Analytes in Food Matrix: Mid-Polarity Pesticides in Tea'. This key comparison was co-organized by the Government Laboratory of Hong Kong Special Administrative Region (GL) and the National Institute of Metrology, China (NIM). To allow wider participation, a pilot study, CCQM-P136, was run in parallel. Participants' capabilities in measuring mid-polarity analytes in food matrix were demonstrated through this key comparison. Most of the participating NMIs/DIs successfully measured beta-endosulfan and endosulfan sulphate in the sample, however, there is room for further improvement for some participants. This key comparison involved not only extraction, clean-up, analytical separation and selective detection of the analytes in a complex food matrix, but also the pre-treatment procedures of the material before the extraction process. The problem of incomplete extraction of the incurred analytes from the sample matrix may not be observed simply by using spike recovery. The relative standard deviations for the data included in the KCRV calculation in this key comparison were less than 7 % which was acceptable given the complexity of the matrix, the level of the analytes and the complexity of the analytical procedure. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. 
The final report has been peer-reviewed and approved for publication by CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  9. Sub-micron surface plasmon resonance sensor systems

    NASA Technical Reports Server (NTRS)

    Glazier, James A. (Inventor); Amarie, Dragos (Inventor)

    2013-01-01

    Wearable or implantable devices combining microfluidic control of sample and reagent flow and micro-cavity surface plasmon resonance sensors functionalized with surface treatments or coatings capable of specifically binding to target analytes, ligands, or molecules in a bodily fluid are provided. The devices can be used to determine the presence and concentration of target analytes in the bodily fluids and thereby help diagnose, monitor or detect changes in disease conditions.

  10. Device and method for enhanced collection and assay of chemicals with high surface area ceramic

    DOEpatents

    Addleman, Raymond S.; Li, Xiaohong Shari; Chouyyok, Wilaiwan; Cinson, Anthony D.; Bays, John T.; Wallace, Krys

    2016-02-16

    A method and device for enhanced capture of target analytes is disclosed. This invention relates to collection of chemicals for separations and analysis. More specifically, this invention relates to a solid phase microextraction (SPME) device having better capability for chemical collection and analysis. This includes better physical stability, capacity for chemical collection, flexible surface chemistry and high affinity for target analyte.

  11. Further Investigations of Content Analytic Techniques for Extracting the Differentiating Information Contained in the Narrative Sections of Performance Evaluations for Navy Enlisted Personnel. Technical Report No. 75-1.

    ERIC Educational Resources Information Center

    Ramsey-Klee, Diane M.; Richman, Vivian

    The purpose of this research is to develop content analytic techniques capable of extracting the differentiating information in narrative performance evaluations for enlisted personnel in order to aid in the process of selecting personnel for advancement, duty assignment, training, or quality retention. Four tasks were performed. The first task…

  12. Rapid, automated, parallel quantitative immunoassays using highly integrated microfluidics and AlphaLISA

    PubMed Central

    Tak For Yu, Zeta; Guan, Huijiao; Ki Cheung, Mei; McHugh, Walker M.; Cornell, Timothy T.; Shanley, Thomas P.; Kurabayashi, Katsuo; Fu, Jianping

    2015-01-01

    Immunoassays represent one of the most popular analytical methods for detection and quantification of biomolecules. However, conventional immunoassays such as ELISA and flow cytometry, even though providing high sensitivity and specificity and multiplexing capability, can be labor-intensive and prone to human error, making them unsuitable for standardized clinical diagnoses. Using a commercialized no-wash, homogeneous immunoassay technology (‘AlphaLISA’) in conjunction with integrated microfluidics, herein we developed a microfluidic immunoassay chip capable of rapid, automated, parallel immunoassays of microliter quantities of samples. Operation of the microfluidic immunoassay chip entailed rapid mixing and conjugation of AlphaLISA components with target analytes before quantitative imaging for analyte detections in up to eight samples simultaneously. Aspects such as fluid handling and operation, surface passivation, imaging uniformity, and detection sensitivity of the microfluidic immunoassay chip using AlphaLISA were investigated. The microfluidic immunoassay chip could detect one target analyte simultaneously for up to eight samples in 45 min with a limit of detection down to 10 pg mL−1. The microfluidic immunoassay chip was further utilized for functional immunophenotyping to examine cytokine secretion from human immune cells stimulated ex vivo. Together, the microfluidic immunoassay chip provides a promising high-throughput, high-content platform for rapid, automated, parallel quantitative immunosensing applications. PMID:26074253
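    Quantitative immunoassays of this kind typically convert measured signal to concentration through a four-parameter logistic (4PL) standard curve. A hedged sketch with synthetic, noiseless calibrators; the parameter values and concentration range are illustrative only, not taken from the paper's calibration.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, d, c, b):
    """4PL: a = response at zero dose, d = response at infinite dose,
    c = inflection point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Synthetic standards spanning the assay's working range (pg/mL)
conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)
signal = four_pl(conc, 200.0, 40000.0, 80.0, 1.2)  # noiseless demo data

popt, _ = curve_fit(four_pl, conc, signal,
                    p0=[100.0, 50000.0, 50.0, 1.0], bounds=(0.0, np.inf))

def concentration(y, a, d, c, b):
    """Invert the fitted 4PL to read a concentration back off the curve."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

y_unknown = four_pl(150.0, *popt)            # pretend this is a measured signal
print(concentration(y_unknown, *popt))        # recovers ~150 pg/mL
```

    The inversion step is the analytically exact inverse of the fitted curve, so readback error reflects only fit quality and measurement noise.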

  13. Performing data analytics on information obtained from various sensors on an OSUS compliant system

    NASA Astrophysics Data System (ADS)

    Cashion, Kelly; Landoll, Darian; Klawon, Kevin; Powar, Nilesh

    2017-05-01

    The Open Standard for Unattended Sensors (OSUS) was developed by DIA and ARL to provide a plug-and-play platform for sensor interoperability. Our objective is to use the standardized data produced by OSUS in performing data analytics on information obtained from various sensors. Data analytics can be integrated in one of three ways: within an asset itself; as an independent plug-in designed for one type of asset (i.e., a camera or seismic sensor); or as an independent plug-in designed to incorporate data from multiple assets. As a proof of concept, we develop a model of the second type: an independent component for camera images. The dataset used was collected as part of a demonstration and test of OSUS capabilities. The image data include images of empty outdoor scenes and scenes with human or vehicle activity. We design, train, and test a convolutional neural network (CNN) that analyzes these images and assesses the presence of activity. The resulting classifier labels input images as empty or activity with 86.93% accuracy, demonstrating the promising opportunities for deep learning, machine learning, and predictive analytics as an extension of OSUS's already robust suite of capabilities.
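    The classification pipeline described above (convolution, nonlinearity, pooling, and a binary activity score) can be sketched in plain NumPy. This is a structural illustration with random, untrained weights, not the authors' architecture or training setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernels):
    """Valid 2-D cross-correlation of an HxW image with K kxk kernels."""
    k = kernels.shape[-1]
    H, W = img.shape
    out = np.empty((kernels.shape[0], H - k + 1, W - k + 1))
    for i in range(H - k + 1):
        for j in range(W - k + 1):
            patch = img[i:i + k, j:j + k]
            out[:, i, j] = np.tensordot(kernels, patch, axes=([1, 2], [0, 1]))
    return out

def max_pool(x, s=2):
    """Non-overlapping sxs max pooling over each channel."""
    C, H, W = x.shape
    return x[:, :H - H % s, :W - W % s].reshape(C, H // s, s, W // s, s).max(axis=(2, 4))

def forward(img, kernels, w, b):
    """conv -> ReLU -> max-pool -> global average -> logistic 'activity' score."""
    h = np.maximum(conv2d(img, kernels), 0.0)
    h = max_pool(h)
    features = h.mean(axis=(1, 2))            # global average pooling
    return 1.0 / (1.0 + np.exp(-(features @ w + b)))

img = rng.random((32, 32))                    # stand-in for a camera frame
kernels = rng.standard_normal((4, 3, 3)) * 0.1
w, b = rng.standard_normal(4), 0.0
p_activity = forward(img, kernels, w, b)
print(f"P(activity) = {p_activity:.3f}")
```

    A threshold on the logistic output (e.g., 0.5) yields the empty/activity label; in practice the kernels and weights would be learned from the labeled OSUS image data.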

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean

    A new field of research, visual analytics, has recently been introduced. This has been defined as "the science of analytical reasoning facilitated by visual interfaces." Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.

  15. Integration of analytical measurements and wireless communications--current issues and future strategies.

    PubMed

    Diamond, Dermot; Lau, King Tong; Brady, Sarah; Cleary, John

    2008-05-15

    Rapid developments in wireless communications are opening up opportunities for new ways to perform many types of analytical measurements that up to now have been restricted in scope due to the need to have access to centralised facilities. This paper will address both the potential for new applications and the challenges that currently inhibit more widespread integration of wireless communications with autonomous sensors and analytical devices. Key issues are identified and strategies for closer integration of analytical information and wireless communications systems discussed.

  16. Sierra/Aria 4.48 Verification Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal Fluid Development Team

    Presented in this document is a portion of the tests that exist in the Sierra Thermal/Fluids verification test suite. Each of these tests is run nightly with the Sierra/TF code suite and the results of the test checked under mesh refinement against the correct analytic result. For each of the tests presented in this document the test setup, derivation of the analytic solution, and comparison of the code results to the analytic solution is provided. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems.

  17. Service line analytics in the new era.

    PubMed

    Spence, Jay; Seargeant, Dan

    2015-08-01

    To succeed under the value-based business model, hospitals and health systems require effective service line analytics that combine inpatient and outpatient data and that incorporate quality metrics for evaluating clinical operations. When developing a framework for collection, analysis, and dissemination of service line data, healthcare organizations should focus on five key aspects of effective service line analytics: Updated service line definitions. Ability to analyze and trend service line net patient revenues by payment source. Access to accurate service line cost information across multiple dimensions with drill-through capabilities. Ability to redesign key reports based on changing requirements. Clear assignment of accountability.

  18. Trace level detection of analytes using artificial olfactometry

    NASA Technical Reports Server (NTRS)

    Wong, Bernard (Inventor); Lewis, Nathan S. (Inventor); Severin, Erik J. (Inventor)

    2001-01-01

    The present invention provides a device for detecting the presence of an analyte, wherein said analyte is a microorganism marker gas. The device comprises a sample chamber having a fluid inlet port for the influx of the microorganism marker gas; a fluid concentrator in flow communication with the sample chamber, wherein the fluid concentrator has an absorbent material capable of absorbing the microorganism marker gas and thereafter releasing a concentrated microorganism marker gas; and an array of sensors in fluid communication with the concentrated microorganism marker gas. The sensor array detects and identifies the marker gas upon its release from the fluid concentrator.

  19. Simulations of binary black hole mergers

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey

    2017-01-01

    Advanced LIGO's observations of merging binary black holes have inaugurated the era of gravitational wave astronomy. Accurate models of binary black holes and the gravitational waves they emit are helping Advanced LIGO to find as many gravitational waves as possible and to learn as much as possible about the waves' sources. These models require numerical-relativity simulations of binary black holes, because near the time when the black holes merge, all analytic approximations break down. Following breakthroughs in 2005, many research groups have built numerical-relativity codes capable of simulating binary black holes. In this talk, I will discuss current challenges in simulating binary black holes for gravitational-wave astronomy, and I will discuss the tremendous progress that has already enabled such simulations to become an essential tool for Advanced LIGO.

  20. Ballistics Analysis of Orion Crew Module Separation Bolt Cover

    NASA Technical Reports Server (NTRS)

    Howard, Samuel A.; Konno, Kevin E.; Carney, Kelly S.; Pereira, J. Michael

    2013-01-01

    NASA is currently developing a new crew module to replace capabilities of the retired Space Shuttles and to provide a crewed vehicle for exploring beyond low earth orbit. The crew module is a capsule-type design, which is designed to separate from the launch vehicle during launch ascent once the launch vehicle fuel is expended. The separation is achieved using pyrotechnic separation bolts, wherein a section of the bolt is propelled clear of the joint at high velocity by an explosive charge. The resulting projectile must be contained within the fairing structure by a containment plate. This paper describes an analytical effort completed to augment testing of various containment plate materials and thicknesses. The results help guide the design and have potential benefit for future similar applications.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halavanau, Aliaksei; Eddy, Nathan; Edstrom, Dean

    Superconducting linacs are capable of producing intense, ultra-stable, high-quality electron beams that have widespread application in science and industry. Many current and planned projects employ 1.3-GHz 9-cell superconducting cavities of the TESLA design*. In the present paper we discuss the transverse-focusing properties of such a cavity and non-ideal transverse-map effects introduced by field asymmetries in the vicinity of the input and high-order-mode radiofrequency (RF) couplers**. We especially consider the case of a cavity located downstream of an RF gun in a setup similar to the photoinjector of the Fermilab Accelerator Science and Technology (FAST) facility. Preliminary experimental measurements of the CC2 cavity transverse matrix were carried out at the FAST facility. The results are discussed and compared with analytical and numerical simulations.

  2. Positron radiography of ignition-relevant ICF capsules

    DOE PAGES

    Williams, G. J.; Chen, Hui; Field, J. E.; ...

    2017-12-11

Laser-generated positrons are evaluated as a probe source to radiograph in-flight ignition-relevant inertial confinement fusion capsules. Current ultraintense laser facilities are capable of producing 2×10¹² relativistic positrons in a narrow energy bandwidth and short time duration. Monte Carlo simulations suggest that the unique characteristics of such positrons allow for the reconstruction of both capsule shell radius and areal density between 0.002 and 2 g/cm². The energy-downshifted positron spectrum and angular scattering of the source particles are sufficient to constrain the conditions of the capsule between preshot and stagnation. Here, we evaluate the effects of magnetic fields near the capsule surface using analytic estimates, where it is shown that this diagnostic can tolerate line-integrated field strengths of 100 T mm.

  3. Positron radiography of ignition-relevant ICF capsules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, G. J.; Chen, Hui; Field, J. E.

Laser-generated positrons are evaluated as a probe source to radiograph in-flight ignition-relevant inertial confinement fusion capsules. Current ultraintense laser facilities are capable of producing 2×10¹² relativistic positrons in a narrow energy bandwidth and short time duration. Monte Carlo simulations suggest that the unique characteristics of such positrons allow for the reconstruction of both capsule shell radius and areal density between 0.002 and 2 g/cm². The energy-downshifted positron spectrum and angular scattering of the source particles are sufficient to constrain the conditions of the capsule between preshot and stagnation. Here, we evaluate the effects of magnetic fields near the capsule surface using analytic estimates, where it is shown that this diagnostic can tolerate line-integrated field strengths of 100 T mm.

  4. Molecular-biological sensing in aquatic environments: recent developments and emerging capabilities.

    PubMed

    McQuillan, Jonathan S; Robidart, Julie C

    2017-06-01

Aquatic microbial communities are central to biogeochemical processes that maintain Earth's habitability. However, there is a significant paucity of data collected from these species in their natural environment. To address this, a suite of ocean-deployable sampling and sensing instrumentation has been developed to retrieve, archive and analyse water samples and their microbial fraction using state-of-the-art genetic assays. Recent deployments have shed new light onto the role microbes play in essential ocean processes and highlight the risks they may pose to coastal populations. Although current designs are generally too large, complex and expensive for widespread use, a host of emerging bio-analytical technologies have the potential to revolutionise this field and open new possibilities in aquatic microbial metrology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Hypersonic airframe structures: Technology needs and flight test requirements

    NASA Technical Reports Server (NTRS)

    Stone, J. E.; Koch, L. C.

    1979-01-01

Hypersonic vehicles that may be produced by the year 2000 were identified. Candidate thermal/structural concepts that merit consideration for these vehicles were described. The current status of analytical methods, materials, manufacturing techniques, and conceptual developments pertaining to these concepts was reviewed. Guidelines establishing meaningful technology goals were defined and twenty-eight specific technology needs were identified. The extent to which these technology needs can be satisfied, using existing capabilities and facilities without the benefit of a hypersonic research aircraft, was assessed. The role that a research aircraft can fill in advancing this technology was discussed and a flight test program was outlined. Research aircraft thermal/structural design philosophy was also discussed. Programs, integrating technology advancements with the projected vehicle needs, were presented. Program options were provided to reflect various scheduling and cost possibilities.

  6. Protocol for Future Amino Acid Analyses of Samples Returned by the Stardust Mission

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Doty, J. H., III; Matrajt, G.; Dworkin, J. P.

    2006-01-01

We have demonstrated that LC-ToF-MS coupled with UV fluorescence detection is a powerful tool for the detection of amino acids in meteorite extracts. Using this new analytical technique we were able to identify the extraterrestrial amino acid AIB extracted from fifteen 20-micron-sized Murchison meteorite grains. We found that the amino acid contamination levels in Stardust aerogels were much lower than the levels observed in the Murchison meteorite. In addition, the alpha-dialkyl amino acids AIB and isovaline, which are the most abundant amino acids in Murchison, were not detected in the aerogel above blank levels. We are currently integrating LIF detection capability into our existing nanoflow LC-ToF-MS for the enhanced sensitivity required for the analysis of amino acids in Stardust samples.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dayman, Ken J; Ade, Brian J; Weber, Charles F

High-dimensional, nonlinear function estimation using large datasets is a current area of interest in the machine learning community, and applications may be found throughout the analytical sciences, where ever-growing datasets are making more information available to the analyst. In this paper, we leverage the existing relevance vector machine, a sparse Bayesian version of the well-studied support vector machine, and expand the method to include integrated feature selection and automatic function shaping. These innovations produce an algorithm that is able to distinguish variables that are useful for making predictions of a response from variables that are unrelated or confusing. We test the technology using synthetic data, conduct initial performance studies, and develop a model capable of making position-independent predictions of the core-averaged burnup using a single specimen drawn randomly from a nuclear reactor core.

  8. Onward through the Fog: Uncertainty and Management Adaptation in Systems Analysis and Design

    DTIC Science & Technology

    1990-07-01

has fallen into stereotyped problem formulations and analytical approaches. In particular, treatments of uncertainty are typically quite incomplete...and often conceptually wrong. This report argues that these shortcomings produce pervasive systematic biases in analyses. Problem formulations ...capability were lost. The expected number of aircraft that would not be fully mission capable thirty days later was roughly twice the number

  9. US Army Research Laboratory Joint Interagency Field Experimentation 15-2 Final Report

    DTIC Science & Technology

    2015-12-01

February 2015, at Alameda Island, California. Advanced text analytics capabilities were demonstrated in a logically coherent workflow pipeline that... text processing capabilities allowed the targeted use of a persistent imagery sensor for rapid detection of mission-critical events. The creation of... a very large text database from open source data provides a relevant and unclassified foundation for continued development of text-processing

  10. Enzyme-based fiber optic sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulp, T.J.; Camins, I.; Angel, S.M.

Fiber optic chemical sensors capable of detecting glucose and penicillin were developed. Each consists of a polymer membrane that is covalently attached to the tip of a glass optical fiber. The membrane contains the enzyme and a pH-sensitive fluorescent dye (fluorescein). A signal is produced when the enzyme catalyzes the conversion of the analyte (glucose or penicillin) into a product (gluconic or penicilloic acid, respectively) that lowers the microenvironmental pH of the membrane and, consequently, lowers the fluorescence intensity of the dye. Each sensor is capable of responding to analyte concentrations in the range of approx. 0.1 to 100 mM. The penicillin optrode response time is 40 to 60 s while that for glucose is approx. 5 to 12 min. 7 figs.

  11. A Conceptual Architecture for National Biosurveillance: Moving Beyond Situational Awareness to Enable Digital Detection of Emerging Threats.

    PubMed

    Velsko, Stephan; Bates, Thomas

    2016-01-01

Despite numerous calls for improvement, the US biosurveillance enterprise remains a patchwork of uncoordinated systems that fail to take advantage of the rapid progress in information processing, communication, and analytics made in the past decade. By synthesizing components from the extensive biosurveillance literature, we propose a conceptual framework for a national biosurveillance architecture and provide suggestions for implementation. The framework differs from the current federal biosurveillance development pathway in that it is not focused on systems useful for "situational awareness" but is instead focused on the long-term goal of having true warning capabilities. Therefore, a guiding design objective is the ability to digitally detect emerging threats that span jurisdictional boundaries, because attempting to solve the most challenging biosurveillance problem first provides the strongest foundation to meet simpler surveillance objectives. Core components of the vision are: (1) a whole-of-government approach to support currently disparate federal surveillance efforts that have a common data need, including those for food safety, vaccine and medical product safety, and infectious disease surveillance; (2) an information architecture that enables secure national access to electronic health records, yet does not require that data be sent to a centralized location for surveillance analysis; (3) an inference architecture that leverages advances in "big data" analytics and learning inference engines, a significant departure from the statistical process control paradigm that underpins nearly all current syndromic surveillance systems; and (4) an organizational architecture with a governance model aimed at establishing national biosurveillance as a critical part of the US national infrastructure. Although it will take many years to implement, and a national campaign of education and debate to acquire public buy-in for such a comprehensive system, the potential benefits warrant increased consideration by the US government.

  12. A results-based process for evaluation of diverse visual analytics tools

    NASA Astrophysics Data System (ADS)

    Rubin, Gary; Berger, David H.

    2013-05-01

With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  13. Portable sample preparation and analysis system for micron and sub-micron particle characterization using light scattering and absorption spectroscopy

    DOEpatents

    Stark, Peter C [Los Alamos, NM; Zurek, Eduardo [Barranquilla, CO; Wheat, Jeffrey V [Fort Walton Beach, FL; Dunbar, John M [Santa Fe, NM; Olivares, Jose A [Los Alamos, NM; Garcia-Rubio, Luis H [Temple Terrace, FL; Ward, Michael D [Los Alamos, NM

    2011-07-26

    There is provided a method and device for remote sampling, preparation and optical interrogation of a sample using light scattering and light absorption methods. The portable device is a filtration-based device that removes interfering background particle material from the sample matrix by segregating or filtering the chosen analyte from the sample solution or matrix while allowing the interfering background particles to be pumped out of the device. The segregated analyte is then suspended in a diluent for analysis. The device is capable of calculating an initial concentration of the analyte, as well as diluting the analyte such that reliable optical measurements can be made. Suitable analytes include cells, microorganisms, bioparticles, pathogens and diseases. Sample matrixes include biological fluids such as blood and urine, as well as environmental samples including waste water.

  14. Test and Analysis Capabilities of the Space Environment Effects Team at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Finckenor, M. M.; Edwards, D. L.; Vaughn, J. A.; Schneider, T. A.; Hovater, M. A.; Hoppe, D. T.

    2002-01-01

    Marshall Space Flight Center has developed world-class space environmental effects testing facilities to simulate the space environment. The combined environmental effects test system exposes temperature-controlled samples to simultaneous protons, high- and low-energy electrons, vacuum ultraviolet (VUV) radiation, and near-ultraviolet (NUV) radiation. Separate chambers for studying the effects of NUV and VUV at elevated temperatures are also available. The Atomic Oxygen Beam Facility exposes samples to atomic oxygen of 5 eV energy to simulate low-Earth orbit (LEO). The LEO space plasma simulators are used to study current collection to biased spacecraft surfaces, arcing from insulators and electrical conductivity of materials. Plasma propulsion techniques are analyzed using the Marshall magnetic mirror system. The micro light gas gun simulates micrometeoroid and space debris impacts. Candidate materials and hardware for spacecraft can be evaluated for durability in the space environment with a variety of analytical techniques. Mass, solar absorptance, infrared emittance, transmission, reflectance, bidirectional reflectance distribution function, and surface morphology characterization can be performed. The data from the space environmental effects testing facilities, combined with analytical results from flight experiments, enable the Environmental Effects Group to determine optimum materials for use on spacecraft.

  15. Use of big data for drug development and for public and personal health and care.

    PubMed

    Leyens, Lada; Reumann, Matthias; Malats, Nuria; Brand, Angela

    2017-01-01

    The use of data analytics across the entire healthcare value chain, from drug discovery and development through epidemiology to informed clinical decision for patients or policy making for public health, has seen an explosion in the recent years. The increase in quantity and variety of data available together with the improvement of storing capabilities and analytical tools offer numerous possibilities to all stakeholders (manufacturers, regulators, payers, healthcare providers, decision makers, researchers) but most importantly, it has the potential to improve general health outcomes if we learn how to exploit it in the right way. This article looks at the different sources of data and the importance of unstructured data. It goes on to summarize current and potential future uses in drug discovery, development, and monitoring as well as in public and personal healthcare; including examples of good practice and recent developments. Finally, we discuss the main practical and ethical challenges to unravel the full potential of big data in healthcare and conclude that all stakeholders need to work together towards the common goal of making sense of the available data for the common good. © 2016 WILEY PERIODICALS, INC.

  16. Some aspects of analytical chemistry as applied to water quality assurance techniques for reclaimed water: The potential use of X-ray fluorescence spectrometry for automated on-line fast real-time simultaneous multi-component analysis of inorganic pollutants in reclaimed water

    NASA Technical Reports Server (NTRS)

    Ling, A. C.; Macpherson, L. H.; Rey, M.

    1981-01-01

The potential use of isotopically excited energy dispersive X-ray fluorescence (XRF) spectrometry for automated on-line fast real-time (5 to 15 minutes) simultaneous multicomponent (up to 20) trace (1 to 10 parts per billion) analysis of inorganic pollutants in reclaimed water was examined. Three anionic elements (chromium(VI), arsenic, and selenium) were studied. The inherent lack of sensitivity of XRF spectrometry for these elements mandates use of a preconcentration technique, and various methods were examined, including: several direct and indirect evaporation methods; ion exchange membranes; selective and nonselective precipitation; and complexation processes. It is shown that XRF spectrometry itself is well suited for automated on-line quality assurance, and can provide a nondestructive (and thus sample storage and repeat analysis capabilities) and particularly convenient analytical method. Further, the use of an isotopically excited energy dispersive unit (50 mCi Cd-109 source) coupled with a suitable preconcentration process can provide sufficient sensitivity to achieve the currently mandated minimum levels of detection without the need for high-power X-ray generating tubes.

  17. Modification of a Macromechanical Finite-Element Based Model for Impact Analysis of Triaxially-Braided Composites

    NASA Technical Reports Server (NTRS)

    Goldberg, Robert K.; Blinzler, Brina J.; Binienda, Wieslaw K.

    2010-01-01

A macro-level finite element-based model has been developed to simulate the mechanical and impact response of triaxially-braided polymer matrix composites. In the analytical model, the triaxial braid architecture is simulated by using four parallel shell elements, each of which is modeled as a laminated composite. For the current analytical approach, each shell element is considered to be a smeared homogeneous material. The commercial transient dynamic finite element code LS-DYNA is used to conduct the simulations, and a continuum damage mechanics model internal to LS-DYNA is used as the material constitutive model. The constitutive model requires stiffness and strength properties of an equivalent unidirectional composite. Simplified micromechanics methods are used to determine the equivalent stiffness properties, and results from coupon-level tests on the braided composite are utilized to back out the required strength properties. Simulations of quasi-static coupon tests of several representative braided composites are conducted to demonstrate the correlation of the model. Impact simulations of representative braided composites are conducted to demonstrate the capability of the model to predict the penetration velocity and damage patterns obtained experimentally.

  18. A Conductometric Indium Oxide Semiconducting Nanoparticle Enzymatic Biosensor Array

    PubMed Central

    Lee, Dongjin; Ondrake, Janet; Cui, Tianhong

    2011-01-01

We report a conductometric nanoparticle biosensor array to address the significant variation of electrical properties in nanomaterial biosensors due to the random-network nature of nanoparticle thin films. Indium oxide and silica nanoparticles (SNP) are assembled selectively on the multi-site channel area of the resistors using layer-by-layer self-assembly. To demonstrate enzymatic biosensing capability, glucose oxidase is immobilized on the SNP layer for glucose detection. The sensor chip, packaged onto a ceramic pin grid array, is tested using a syringe-pump-driven feed and a multi-channel I–V measurement system. It is successfully demonstrated that glucose is detected at many different sensing sites within a chip, leading to concentration-dependent currents. The sensitivity has been found to be dependent on the channel length of the resistor, 4–12 nA/mM for channel lengths of 5–20 μm, while the apparent Michaelis-Menten constant is 20 mM. By using the sensor array, analytical data could be obtained with a single step of sample solution feeding. This work sheds light on the applicability of the developed nanoparticle microsensor array to multi-analyte sensors, novel bioassay platforms, and sensing components in a lab-on-a-chip. PMID:22163696
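The concentration dependence implied by an apparent Michaelis-Menten constant follows the standard saturation form, where current rises roughly linearly below Km and saturates above it. A small sketch using the abstract's Km of 20 mM; the saturation current value is a made-up placeholder, not a measured figure:

```python
def sensor_current_nA(glucose_mM: float, i_max_nA: float = 100.0, km_mM: float = 20.0) -> float:
    """Michaelis-Menten response: i = i_max * S / (Km + S)."""
    return i_max_nA * glucose_mM / (km_mM + glucose_mM)

# At S = Km the response is exactly half the saturation current.
print(sensor_current_nA(20.0))  # → 50.0
```

Near zero concentration the slope is i_max/Km (5 nA/mM for these placeholder numbers), which is the regime in which per-channel sensitivities like the reported 4–12 nA/mM would be measured.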

  19. Environmental Response Laboratory Network Membership and Benefits

    EPA Pesticide Factsheets

    Member laboratories must meet core requirements including quality systems, policies and procedures, sample and data management, and analytical capabilities. Benefits include training and exercise opportunities, information sharing and technical support.

  20. A miniaturized optoelectronic system for rapid quantitative label-free detection of harmful species in food

    NASA Astrophysics Data System (ADS)

    Raptis, Ioannis; Misiakos, Konstantinos; Makarona, Eleni; Salapatas, Alexandros; Petrou, Panagiota; Kakabakos, Sotirios; Botsialas, Athanasios; Jobst, Gerhard; Haasnoot, Willem; Fernandez-Alba, Amadeo; Lees, Michelle; Valamontes, Evangelos

    2016-03-01

Optical biosensors have emerged in the past decade as the most promising candidates for portable, highly-sensitive bioanalytical systems that can be employed for in-situ measurements. In this work, a miniaturized optoelectronic system for rapid, quantitative, label-free detection of harmful species in food is presented. The proposed system has four distinctive features that render it a powerful tool for the next generation of point-of-need applications: it accommodates the light sources and ten interferometric biosensors on a single silicon chip with a less-than-40 mm² footprint; each sensor can be individually functionalized for a specific target analyte; the encapsulation can be performed at the wafer scale; and it exploits a new operation principle, Broad-band Mach-Zehnder Interferometry, to ameliorate its analytical capabilities. Multi-analyte evaluation schemes for the simultaneous detection of harmful contaminants, such as mycotoxins, allergens and pesticides, proved that the proposed system is capable of detecting these substances within a short time at concentrations below the limits imposed by regulatory authorities, rendering it a novel tool for near-future food safety applications.

  1. Modified symplectic schemes with nearly-analytic discrete operators for acoustic wave simulations

    NASA Astrophysics Data System (ADS)

    Liu, Shaolin; Yang, Dinghui; Lang, Chao; Wang, Wenshuai; Pan, Zhide

    2017-04-01

Using a structure-preserving algorithm significantly increases the computational efficiency of solving wave equations. However, only a few explicit symplectic schemes are available in the literature, and the capabilities of these symplectic schemes have not been sufficiently exploited. Here, we propose a modified strategy to construct explicit symplectic schemes for time advance. The acoustic wave equation is transformed into a Hamiltonian system. The classical symplectic partitioned Runge-Kutta (PRK) method is used for the temporal discretization. Additional spatial differential terms are added to the PRK schemes to form the modified symplectic methods, and two modified time-advancing symplectic methods with all-positive symplectic coefficients are constructed. The spatial differential operators are approximated by nearly-analytic discrete (NAD) operators, and we call the fully discretized scheme the modified symplectic nearly analytic discrete (MSNAD) method. Theoretical analyses show that the MSNAD methods exhibit less numerical dispersion and higher stability limits than conventional methods. Three numerical experiments are conducted to verify the advantages of the MSNAD methods, such as their numerical accuracy, computational cost, stability, and long-term calculation capability.
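The structure-preserving idea can be illustrated with the simplest explicit symplectic PRK scheme, Störmer-Verlet, applied to the 1D acoustic wave equation written as a Hamiltonian system in (u, p). An ordinary second-order finite difference stands in for the paper's nearly-analytic discrete operator, and the grid size, time step, and initial pulse are illustrative choices, not the authors' setup:

```python
import numpy as np

nx, dx, c, dt = 200, 1.0, 1.0, 0.5            # dt below the stability limit dx/c
x = np.arange(nx) * dx
u = np.exp(-0.01 * (x - 0.5 * nx * dx) ** 2)  # initial displacement pulse
p = np.zeros(nx)                              # conjugate momentum (velocity)

def laplacian(u):
    # Periodic second-order finite difference (stand-in for a NAD operator).
    return (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2

def energy(u, p):
    # Discrete Hamiltonian: kinetic + elastic energy.
    du = (np.roll(u, -1) - u) / dx
    return 0.5 * np.sum(p**2) + 0.5 * c**2 * np.sum(du**2)

e0 = energy(u, p)
for _ in range(2000):                  # Störmer-Verlet: kick / drift / kick
    p += 0.5 * dt * c**2 * laplacian(u)
    u += dt * p
    p += 0.5 * dt * c**2 * laplacian(u)

drift = abs(energy(u, p) - e0) / e0
print(f"relative energy drift after 2000 steps: {drift:.2e}")
```

The discrete energy oscillates but shows no secular drift, which is the long-term calculation capability the abstract attributes to symplectic time stepping; a non-symplectic scheme of the same order would typically show systematic energy growth or decay.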

  2. Remote Raman - laser induced breakdown spectroscopy (LIBS) geochemical investigation under Venus atmospheric conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Sanuel M; Barefield, James E; Humphries, Seth D

    2010-12-13

The extreme Venus surface temperatures (≈740 K) and atmospheric pressures (≈93 atm) create a challenging environment for surface missions. Scientific investigations capable of Venus geochemical observations must be completed within hours of landing, before the lander is overcome by the harsh atmosphere. A combined remote Raman-LIBS (Laser Induced Breakdown Spectroscopy) instrument is capable of accomplishing the geochemical science goals without the risks associated with collecting samples and bringing them into the lander. Wiens et al. and Sharma et al. demonstrated that both analytical techniques can be integrated into a single instrument capable of planetary missions. The focus of this paper is to explore the capability to probe geologic samples with Raman-LIBS and demonstrate quantitative analysis under Venus surface conditions. Raman and LIBS are highly complementary analytical techniques capable of detecting both the mineralogical and geochemical composition of Venus surface materials. These techniques have the potential to profoundly increase our knowledge of the Venus surface composition, which is currently limited to geochemical data from Soviet Venera and VEGA landers that collectively suggest a surface composition that is primarily tholeiitic basaltic, with some potentially more evolved compositions and, in some locations, K-rich trachyandesite. These landers were not equipped to probe the surface mineralogy as can be accomplished with Raman spectroscopy. Based on the observed compositional differences and recognizing the imprecise nature of the existing data, 15 samples were chosen to constitute a Venus-analog suite for this study, including five basalts, two each of andesites, dacites, and sulfates, and single samples of a foidite, trachyandesite, rhyolite, and basaltic trachyandesite, examined under Venus conditions.
LIBS data reduction involved generating a partial least squares (PLS) model with a subset of the rock powder standards to quantitatively determine the major elemental abundance of the remaining samples. PLS analysis suggests that the major element compositions can be determined with root mean square errors of ca. 5% (absolute) for SiO₂, Al₂O₃, Fe₂O₃ (total), MgO, and CaO, and ca. 2% or less for TiO₂, Cr₂O₃, MnO, K₂O, and Na₂O. Finally, the Raman experiments have been conducted under supercritical CO₂ involving single-mineral and mixed-mineral samples containing talc, olivine, pyroxenes, feldspars, anhydrite, barite, and siderite. The Raman data have shown that the individual minerals can easily be identified, alone or in mixtures.

  3. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    PubMed

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for a quality by design (QbD)-based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT).

  4. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    PubMed Central

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for a quality by design (QbD)-based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  5. Analytics for Education

    ERIC Educational Resources Information Center

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  6. Flow adjustment inside homogeneous canopies after a leading edge – An analytical approach backed by LES

    DOE PAGES

    Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik; ...

    2017-10-06

A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment region model is based on an analytic solution of the two-dimensional Reynolds-averaged Navier–Stokes equation in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are the separability of the velocity components with respect to the spatial variables and the neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment region model, equipped with the previously calculated parameters, was performed by varying the three relevant lengths, the canopy height (h), the canopy length, and the adjustment length (Lc), in additional LES. Even if the model parameters are, in general, functions of h/Lc, it was found that the model is capable of predicting the flow quantities in various cases when using constant parameters. Subsequently, the adjustment region model is combined with the one-dimensional model of Massman, which is applicable to the interior of the canopy, to attain an analytical model capable of describing the mean flow for the full canopy domain. As a result, the model is tested against an analytical model based on a linearization approach.

  7. Flow adjustment inside homogeneous canopies after a leading edge – An analytical approach backed by LES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroniger, Konstantin; Banerjee, Tirtha; De Roo, Frederik

    A two-dimensional analytical model for describing the mean flow behavior inside a vegetation canopy after a leading edge in neutral conditions was developed and tested by means of large eddy simulations (LES) employing the LES code PALM. The analytical model is developed for the region directly after the canopy edge, the adjustment region, where one-dimensional canopy models fail due to the sharp change in roughness. The derivation of this adjustment region model is based on an analytic solution of the two-dimensional Reynolds-averaged Navier–Stokes equations in neutral conditions for a canopy with constant plant area density (PAD). The main assumptions for solving the governing equations are the separability of the velocity components with respect to the spatial variables and the neglect of the Reynolds stress gradients. These two assumptions are verified by means of LES. To determine the emerging model parameters, a simultaneous fitting scheme was applied to the velocity and pressure data of a reference LES simulation. Furthermore, a sensitivity analysis of the adjustment region model, equipped with the previously calculated parameters, was performed by varying the three relevant lengths, the canopy height (h), the canopy length, and the adjustment length (Lc), in additional LES. Although the model parameters are, in general, functions of h/Lc, it was found that the model is capable of predicting the flow quantities in various cases when constant parameters are used. Subsequently, the adjustment region model is combined with the one-dimensional model of Massman, which is applicable in the interior of the canopy, to attain an analytical model capable of describing the mean flow for the full canopy domain. Finally, the combined model is tested against an analytical model based on a linearization approach.

  8. Older driver highway design handbook

    DOT National Transportation Integrated Search

    1998-01-01

    This project included literature reviews and research syntheses, using meta-analytic techniques where appropriate, in the areas of age-related (diminished) functional capabilities, and human factors and highway safety. A User-Requirements Analysi...

  9. Programmable Bio-Nano-Chip Systems for Serum CA125 Quantification: Towards Ovarian Cancer Diagnostics at the Point-of-Care

    PubMed Central

    Raamanathan, Archana; Simmons, Glennon W.; Christodoulides, Nicolaos; Floriano, Pierre N.; Furmaga, Wieslaw B.; Redding, Spencer W.; Lu, Karen H.; Bast, Robert C.; McDevitt, John T.

    2013-01-01

    Point-of-care (POC) implementation of early detection and screening methodologies for ovarian cancer may enable improved survival rates through early intervention. Current laboratory-confined immunoanalyzers have long turnaround times and are often incompatible with multiplexing and POC implementation. Rapid, sensitive and multiplexable POC diagnostic platforms compatible with promising early detection approaches for ovarian cancer are needed. To this end, we report the adaptation of the programmable bio-nano-chip (p-BNC), an integrated, microfluidic, modular (Programmable) platform for CA125 serum quantitation, a biomarker prominently implicated in multi-modal and multi-marker screening approaches. In the p-BNC, CA125 from diseased sera (Bio) is sequestered and assessed with a fluorescence-based sandwich immunoassay, completed in the nano-nets (Nano) of sensitized agarose microbeads localized in individually addressable wells (Chip), housed in a microfluidic module, capable of integrating multiple sample, reagent and biowaste processing and handling steps. Antibody pairs that bind to distinct epitopes on CA125 were screened. To permit efficient biomarker sequestration in a 3-D microfluidic environment, the p-BNC operating variables (incubation times, flow rates and reagent concentrations) were tuned to deliver optimal analytical performance in under 45 minutes. With short analysis times, competitive analytical performance (inter- and intra-assay precision of 1.2% and 1.9%, and an LOD of 1.0 U/mL) was achieved on this mini-sensor ensemble. Further validation with sera of ovarian cancer patients (n=20) demonstrated excellent correlation (R2 = 0.97) with gold-standard ELISA. Building on the integration capabilities of novel microfluidic systems programmed for ovarian cancer, the rapid, precise and sensitive miniaturized p-BNC system shows strong promise for ovarian cancer diagnostics. PMID:22490510

  10. Programmable bio-nano-chip systems for serum CA125 quantification: toward ovarian cancer diagnostics at the point-of-care.

    PubMed

    Raamanathan, Archana; Simmons, Glennon W; Christodoulides, Nicolaos; Floriano, Pierre N; Furmaga, Wieslaw B; Redding, Spencer W; Lu, Karen H; Bast, Robert C; McDevitt, John T

    2012-05-01

    Point-of-care (POC) implementation of early detection and screening methodologies for ovarian cancer may enable improved survival rates through early intervention. Current laboratory-confined immunoanalyzers have long turnaround times and are often incompatible with multiplexing and POC implementation. Rapid, sensitive, and multiplexable POC diagnostic platforms compatible with promising early detection approaches for ovarian cancer are needed. To this end, we report the adaptation of the programmable bio-nano-chip (p-BNC), an integrated, microfluidic, and modular (programmable) platform for CA125 serum quantitation, a biomarker prominently implicated in multimodal and multimarker screening approaches. In the p-BNCs, CA125 from diseased sera (Bio) is sequestered and assessed with a fluorescence-based sandwich immunoassay, completed in the nano-nets (Nano) of sensitized agarose microbeads localized in individually addressable wells (Chip), housed in a microfluidic module, capable of integrating multiple sample, reagent and biowaste processing, and handling steps. Antibody pairs that bind to distinct epitopes on CA125 were screened. To permit efficient biomarker sequestration in a three-dimensional microfluidic environment, the p-BNC operating variables (incubation times, flow rates, and reagent concentrations) were tuned to deliver optimal analytical performance in under 45 minutes. With short analysis times, competitive analytical performance (inter- and intra-assay precision of 1.2% and 1.9% and a limit of detection of 1.0 U/mL) was achieved on this minisensor ensemble. Furthermore, validation with sera of patients with ovarian cancer (n = 20) showed excellent correlation (R(2) = 0.97) with gold-standard ELISA. Building on the integration capabilities of novel microfluidic systems programmed for ovarian cancer, the rapid, precise, and sensitive miniaturized p-BNC system shows strong promise for ovarian cancer diagnostics.

  11. Acoustic Predictions of Manned and Unmanned Rotorcraft Using the Comprehensive Analytical Rotorcraft Model for Acoustics (CARMA) Code System

    NASA Technical Reports Server (NTRS)

    Boyd, D. Douglas, Jr.; Burley, Casey L.; Conner, David A.

    2005-01-01

    The Comprehensive Analytical Rotorcraft Model for Acoustics (CARMA) is being developed under the Quiet Aircraft Technology Project within the NASA Vehicle Systems Program. The purpose of CARMA is to provide analysis tools for the design and evaluation of efficient low-noise rotorcraft, as well as support the development of safe, low-noise flight operations. The baseline prediction system of CARMA is presented and current capabilities are illustrated for a model rotor in a wind tunnel, a rotorcraft in flight and for a notional coaxial rotor configuration; however, a complete validation of the CARMA system capabilities with respect to a variety of measured databases is beyond the scope of this work. For the model rotor illustration, predicted rotor airloads and acoustics for a BO-105 model rotor are compared to test data from HART-II. For the flight illustration, acoustic data from an MD-520N helicopter flight test, which was conducted at Eglin Air Force Base in September 2003, are compared with CARMA full vehicle flight predictions. Predicted acoustic metrics at three microphone locations are compared for limited level flight and descent conditions. Initial acoustic predictions using CARMA for a notional coaxial rotor system are made. The effect of increasing the vertical separation between the rotors on the predicted airloads and acoustic results is shown for both aerodynamically non-interacting and aerodynamically interacting rotors. The sensitivity of including the aerodynamic interaction effects of each rotor on the other, especially when the rotors are in close proximity to one another, is initially examined. The predicted coaxial rotor noise is compared to that of a conventional single rotor system of equal thrust, where both are of reasonable size for an unmanned aerial vehicle (UAV).

  12. Quantumness-generating capability of quantum dynamics

    NASA Astrophysics Data System (ADS)

    Li, Nan; Luo, Shunlong; Mao, Yuanyuan

    2018-04-01

    We study quantumness-generating capability of quantum dynamics, where quantumness refers to the noncommutativity between the initial state and the evolving state. In terms of the commutator of the square roots of the initial state and the evolving state, we define a measure to quantify the quantumness-generating capability of quantum dynamics with respect to initial states. Quantumness-generating capability is absent in classical dynamics and hence is a fundamental characteristic of quantum dynamics. For qubit systems, we present an analytical form for this measure, by virtue of which we analyze several prototypical dynamics such as unitary dynamics, phase damping dynamics, amplitude damping dynamics, and random unitary dynamics (Pauli channels). Necessary and sufficient conditions for the monotonicity of quantumness-generating capability are also identified. Finally, we compare these conditions for the monotonicity of quantumness-generating capability with those for various Markovianities and illustrate that quantumness-generating capability and quantum Markovianity are closely related, although they capture different aspects of quantum dynamics.
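The measure described above is built from the commutator of the square roots of the initial and evolving states. As a hedged numerical sketch (the norm choice and normalization here are assumptions; the paper's exact definition may differ), the following computes a Frobenius-norm version of that commutator for a qubit under an amplitude damping channel:

```python
import numpy as np

def psd_sqrt(rho):
    """Matrix square root of a positive semidefinite density matrix."""
    w, v = np.linalg.eigh(rho)
    return (v * np.sqrt(np.clip(w, 0, None))) @ v.conj().T

def quantumness(rho0, rho_t):
    """Noncommutativity between initial and evolved state, measured by the
    Frobenius norm of the commutator of their square roots (illustrative
    choice of norm)."""
    a, b = psd_sqrt(rho0), psd_sqrt(rho_t)
    return np.linalg.norm(a @ b - b @ a)

def amp_damp(rho, g):
    """Amplitude damping channel with damping probability g (Kraus form)."""
    K0 = np.array([[1, 0], [0, np.sqrt(1 - g)]], dtype=complex)
    K1 = np.array([[0, np.sqrt(g)], [0, 0]], dtype=complex)
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# Initial state |+><+|; the measure vanishes at g = 0 (states commute)
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
q = [quantumness(plus, amp_damp(plus, g)) for g in (0.0, 0.5, 0.9)]
```

At g = 0 the evolving state equals the initial state, so the commutator vanishes, consistent with the claim that quantumness generation is absent when the dynamics leave the state's eigenbasis untouched.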

  13. Bending of an Infinite beam on a base with two parameters in the absence of a part of the base

    NASA Astrophysics Data System (ADS)

    Aleksandrovskiy, Maxim; Zaharova, Lidiya

    2018-03-01

    In connection with the rapid development of high-rise construction and the refinement of models for the joint behavior of high-rise structures and their bases, questions concerning the choice of calculation methods have become topical. Rigorous analytical methods can characterize the behavior of structures in more detail and with greater accuracy, which improves the reliability of objects and can reduce their cost. In the article, a model with two parameters is used as the computational model of the base; it can effectively account for the distributive properties of the base by varying the coefficient reflecting the shear parameter. The paper constructs an effective analytical solution to the problem of a beam of infinite length interacting with a two-parameter base over a void. Using Fourier integral transforms, the original differential equation is reduced to a Fredholm integral equation of the second kind with a degenerate kernel, and all the integrals are evaluated analytically and explicitly, which increases the accuracy of the computations in comparison with approximate methods. The paper considers the solution of the problem of a beam loaded with a concentrated force applied at the origin, for a fixed length of the unsupported section. The paper also analyzes the results obtained for various values of the coefficient that accounts for the cohesion of the ground.

  14. Developing strategies to enhance loading efficiency of erythrosensors

    NASA Astrophysics Data System (ADS)

    Bustamante Lopez, Sandra C.; Ritter, Sarah C.; Meissner, Kenith E.

    2014-02-01

    For diabetics, continuous glucose monitoring and the resulting tighter control of glucose levels ameliorate serious complications from hypoglycemia and hyperglycemia. Diabetics measure their blood glucose levels multiple times a day by finger pricks, or use implantable monitoring devices. Still, glucose and other analytes in the blood fluctuate throughout the day, and the current monitoring methods are invasive, immunogenic, and/or present biodegradation problems. Using carrier erythrocytes loaded with a fluorescent sensor, we seek to develop a biodegradable, efficient, and potentially cost-effective method to continuously sense blood analytes. We aim to reintroduce sensor-loaded erythrocytes into the bloodstream while preserving the erythrocytes' 120-day lifetime in the circulatory system. Here, we compare the efficiency of two loading techniques: hypotonic dilution and electroporation. Hypotonic dilution employs a hypotonic buffer to create transient pores in the erythrocyte membrane, allowing dye entrance, and a hypertonic buffer to restore tonicity. Electroporation relies on controlled electrical pulses that result in reversible pore formation to allow cargo entrance, followed by incubation at 37°C to reseal the membrane. As part of the cellular characterization of loaded erythrocytes, we focus on cell size, shape, and hemoglobin content. Measurements of cell recovery, loading efficiency, and cargo release identify the optimal loading conditions. The detected fluorescent signal from sensor-loaded erythrocytes can be translated into a direct measurement of analyte levels in the bloodstream. The development of a suitable protocol to engineer carrier erythrocytes has profound and lasting implications for the erythrosensor's lifespan and sensing capabilities.

  15. Human performance modeling for system of systems analytics: combat performance-shaping factors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.; Miller, Dwight Peter

    The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-systems (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of performance-shaping factors (PSFs) that might affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the consensus most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.
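The PSF-by-task matrix analysis described above reduces to ranking factors by their aggregated impact levels. A minimal sketch of that aggregation, using a hypothetical miniature matrix (the real study used 47 PSFs, 22 tasks, and expert-filled ratings):

```python
# Hypothetical miniature PSF-by-task impact matrix: rows are
# performance-shaping factors, columns are military tasks, entries are
# estimated negative-impact levels (0 = none ... 3 = severe).
psf_impact = {
    "combat-related injury":      [3, 3, 2],
    "cognitive fatigue":          [2, 3, 3],
    "poor perceptual processing": [2, 2, 2],
    "thirst":                     [1, 2, 1],
}

def rank_psfs(matrix):
    """Rank PSFs by total estimated impact across tasks, descending."""
    return sorted(matrix, key=lambda p: sum(matrix[p]), reverse=True)

ranking = rank_psfs(psf_impact)
```

With expert-provided ratings in place of these invented numbers, the same aggregation surfaces the consensus most influential PSFs reported in the study.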

  16. Effect of Microscopic Damage Events on Static and Ballistic Impact Strength of Triaxial Braid Composites

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.; Binienda, Wieslaw K.; Arnold, William A.; Roberts, Gary D.; Goldberg, Robert K.

    2010-01-01

    The reliability of impact simulations for aircraft components made with triaxial-braided carbon-fiber composites is currently limited by inadequate material property data and lack of validated material models for analysis. Methods to characterize the material properties used in the analytical models from a systematically obtained set of test data are also lacking. A macroscopic finite element based analytical model to analyze the impact response of these materials has been developed. The stiffness and strength properties utilized in the material model are obtained from a set of quasi-static in-plane tension, compression and shear coupon level tests. Full-field optical strain measurement techniques are applied in the testing, and the results are used to help in characterizing the model. The unit cell of the braided composite is modeled as a series of shell elements, where each element is modeled as a laminated composite. The braided architecture can thus be approximated within the analytical model. The transient dynamic finite element code LS-DYNA is utilized to conduct the finite element simulations, and an internal LS-DYNA constitutive model is utilized in the analysis. Methods to obtain the stiffness and strength properties required by the constitutive model from the available test data are developed. Simulations of quasi-static coupon tests and impact tests of a represented braided composite are conducted. Overall, the developed method shows promise, but improvements that are needed in test and analysis methods for better predictive capability are examined.

  17. Raman spectroscopic analysis of geological and biogeological specimens of relevance to the ExoMars mission.

    PubMed

    Edwards, Howell G M; Hutchinson, Ian B; Ingley, Richard; Parnell, John; Vítek, Petr; Jehlička, Jan

    2013-06-01

    A novel miniaturized Raman spectrometer is scheduled to fly as part of the analytical instrumentation package on an ESA remote robotic lander in the ESA/Roscosmos ExoMars mission to search for evidence for extant or extinct life on Mars in 2018. The Raman spectrometer will be part of the first-pass analytical stage of the sampling procedure, following detailed surface examination by the PanCam scanning camera unit on the ExoMars rover vehicle. The requirements of the analytical protocol are stringent and critical; this study represents a blind laboratory interrogation of specimens drawn from a list of materials relevant to martian exploration. At this stage it simulates a test of current laboratory instrumentation to highlight the strengths of the Raman technique and the possible weaknesses that may be encountered in practice on the martian surface, from which future studies could be formulated. In this preliminary exercise, some 10 samples that are considered terrestrial representatives of the mineralogy and possible biogeologically modified structures that may be identified on Mars have been examined with Raman spectroscopy, and conclusions have been drawn about the viability of the unambiguous spectral identification of biomolecular life signatures. It is concluded that the Raman spectroscopic technique does indeed demonstrate the capability to identify biomolecular signatures and the mineralogy in real-world terrestrial samples with a very high degree of success, without any preconception being made about their origin and classification.

  18. Visual Analytics 101

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Burtner, Edwin R.; Cook, Kristin A.

    This course will introduce the field of Visual Analytics to HCI researchers and practitioners, highlighting the contributions they can make to this field. Topics will include a definition of visual analytics along with examples of current systems, types of tasks and end users, issues in defining user requirements, design of visualizations and interactions, guidelines and heuristics, the current state of user-centered evaluations, and metrics for evaluation. We encourage designers, HCI researchers, and HCI practitioners to attend to learn how their skills can contribute to advancing the state of the art of visual analytics.

  19. Social Exclusion and Education Inequality: Towards an Integrated Analytical Framework for the Urban-Rural Divide in China

    ERIC Educational Resources Information Center

    Wang, Li

    2012-01-01

    The aim of this paper is to build a capability-based framework, drawing upon the strengths of other approaches, which is applicable to the complexity of the urban-rural divide in education in China. It starts with a brief introduction to the capability approach. This is followed by a discussion of how the rights-based approach and resource-based…

  20. Dynamic Analytical Capability to Better Understand and Anticipate Extremist Shifts Within Populations under Authoritarian Regimes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernard, Michael Lewis

    2015-11-01

    The purpose of this work is to create a generalizable data- and theory-supported capability to better understand and anticipate (with quantifiable uncertainty): 1) how the dynamics of allegiance formations between various groups and society are impacted by active conflict and by third-party interventions and 2) how/why extremist allegiances co-evolve over time due to changing geopolitical, sociocultural, and military conditions.

  1. Improvements to the FATOLA computer program including nosewheel steering: Supplemental instruction manual

    NASA Technical Reports Server (NTRS)

    Carden, H. D.; Mcgehee, J. R.

    1978-01-01

    Modifications to a multidegree of freedom flexible aircraft take-off and landing analysis (FATOLA) computer program, which improved its simulation capabilities, are discussed, and supplemental instructions for use of the program are included. Sample analytical results which illustrate the capabilities of an added nosewheel steering option indicate consistent behavior of the airplane tracking, attitude, motions, and loads for the landing cases and steering situations which were investigated.

  2. Exploring Large Scale Data Analysis and Visualization for ARM Data Discovery Using NoSQL Technologies

    NASA Astrophysics Data System (ADS)

    Krishna, B.; Gustafson, W. I., Jr.; Vogelmann, A. M.; Toto, T.; Devarakonda, R.; Palanisamy, G.

    2016-12-01

    This paper presents a new way of providing ARM data discovery through data analysis and visualization services. ARM stands for Atmospheric Radiation Measurement; the program was created to study cloud formation processes and their influence on radiative transfer, and it also includes additional measurements of aerosol and precipitation at various highly instrumented ground and mobile stations. The total volume of ARM data is roughly 900 TB. The current search for ARM data is performed by using its metadata, such as the site name, instrument name, date, etc. NoSQL technologies were explored to improve the capabilities of data searching, not only by metadata but also by measurement values. Two technologies currently being implemented for testing are Apache Cassandra (a NoSQL database) and Apache Spark (a NoSQL-based analytics framework). Both technologies were developed to work in a distributed environment and hence can handle large volumes of data for storage and analytics. D3.js is a JavaScript library that can generate interactive data visualizations in web browsers by making use of the commonly used SVG, HTML5, and CSS standards. To test the performance of NoSQL for ARM data, we will be using ARM's popular measurements to locate data based on their values. Recently, NoSQL technology has been applied to a pilot project called LASSO, which stands for LES ARM Symbiotic Simulation and Observation Workflow. LASSO will package LES output and observations in "data bundles," and analyses will require the ability for users to analyze both observations and LES model output, either individually or together, across multiple time periods. The LASSO implementation strategy suggests that enormous data storage is required to store the above-mentioned quantities. Thus, NoSQL was used to provide a powerful means to store portions of the data, giving users search capabilities on each simulation's traits through a web application. Based on the user selection, plots are created dynamically, along with ancillary information that enables the user to locate and download data that fulfill their required traits.
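Searching by measurement value, rather than by metadata alone, is the core capability described above. A minimal pure-Python stand-in for that kind of range query (field names, values, and thresholds here are hypothetical; the production system serves this at scale via Cassandra and Spark):

```python
# Hypothetical measurement records of the kind the value-based search
# operates on; in production these would live in a distributed store.
records = [
    {"site": "SGP", "instrument": "MET", "time": "2016-06-01T00:00Z", "temperature_c": 24.1},
    {"site": "SGP", "instrument": "MET", "time": "2016-06-01T01:00Z", "temperature_c": 31.7},
    {"site": "NSA", "instrument": "MET", "time": "2016-06-01T00:00Z", "temperature_c": -2.3},
]

def search_by_value(rows, field, lo, hi):
    """Return the records whose measurement value falls in [lo, hi]."""
    return [r for r in rows if lo <= r[field] <= hi]

hot = search_by_value(records, "temperature_c", 30.0, 40.0)
```

The distributed implementation partitions such predicates across nodes, but the query semantics users see through the web application are the same: filter by value, then fetch the matching records.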

  3. 40 CFR 142.11 - Initial determination of primary enforcement responsibility.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., certified or approved by the Administrator and capable of performing analytical measurements of all...'s program activity to assure that the design and construction of new or substantially modified...

  4. 40 CFR 142.11 - Initial determination of primary enforcement responsibility.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., certified or approved by the Administrator and capable of performing analytical measurements of all...'s program activity to assure that the design and construction of new or substantially modified...

  5. DOE-FG02-00ER62797 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweedler, J.V.

    2004-12-01

    Specific aims: The overall goal of this proposal has been to develop and interface a new technology, molecular gates, with microfabricated systems to add an important capability to microfabricated DNA measurement systems. This project specifically focused on demonstrating how molecular gates could be used to capture a single analyte band, among a stream of bands from a separation or a flow injection analysis experiment, and release it for later measurement, thus allowing further manipulations on the selected analyte. Since the original proposal, the molecular gate concept has been greatly expanded to allow the gates to be used as externally controllable intelligent interconnects in multilayer microfluidic networks. We have demonstrated: (1) the ability of the molecular gates to work with a much wider range of biological molecules, including DNA, proteins, and small metabolites; and (2) the capability of performing an electrophoretic separation and sequestering individual picoliter-volume components (or even classes of components) into separate channels for further analysis. Both capabilities will enable characterization of small mass amounts of complex mixtures of DNA, proteins, and even small molecules, allowing them to be further separated and chemically characterized.

  6. Honing the Priorities and Making the Investment Case for Global Health.

    PubMed

    Mundel, Trevor

    2016-03-01

    In the aftermath of the Ebola crisis, the global health community has a unique opportunity to reflect on the lessons learned and apply them to prepare the world for the next crisis. Part of that preparation will entail knowing, with greater precision, what the scale and scope of our specific global health challenges are and what resources are needed to address them. However, how can we know the magnitude of the challenge, and what resources are needed, without knowing the current status of the world through accurate primary data? Once we know the current status, how can we decide on an intervention today with a predicted impact decades out if we cannot project into that future? Making a case for more investments will require not just better data generation and sharing but a whole new level of sophistication in our analytical capability: a fundamental shift in our thinking to set expectations to match reality. In this distributed world, being transparent with our assumptions and specific with the case for investing in global health is a powerful approach to finding solutions to the problems that have plagued us for centuries.

  7. Towards a Biosynthetic UAV

    NASA Technical Reports Server (NTRS)

    Block, Eli; Byemerwa, Jovita; Dispenza, Ross; Doughty, Benjamin; Gillyard, KaNesha; Godbole, Poorwa; Gonzales-Wright, Jeanette; Hull, Ian; Kannappan, Jotthe; Levine, Alexander

    2014-01-01

    We are currently working on a series of projects towards the construction of a fully biological unmanned aerial vehicle (UAV) for use in scientific and humanitarian missions. The prospect of a biologically-produced UAV presents numerous advantages over the current manufacturing paradigm. First, a foundational architecture built by cells allows for construction or repair in locations where it would be difficult to bring traditional tools of production. Second, a major limitation of current research with UAVs is the size and high power consumption of analytical instruments, which require bulky electrical components and large fuselages to support their weight. By moving these functions into cells with biosensing capabilities - for example, a series of cells engineered to report GFP, green fluorescent protein, when conditions exceed a certain threshold concentration of a compound of interest, enabling their detection post-flight - these problems of scale can be avoided. To this end, we are working to engineer cells to synthesize cellulose acetate as a novel bioplastic, characterize biological methods of waterproofing the material, and program this material's systemic biodegradation. In addition, we aim to use an "amberless" system to prevent horizontal gene transfer from live cells on the material to microorganisms in the flight environment.

  8. Baseline experimental investigation of an electrohydrodynamically assisted heat pipe

    NASA Technical Reports Server (NTRS)

    Duncan, A. B.

    1995-01-01

    The increases in power demand and associated thermal management requirements of future space programs such as potential Lunar/Mars missions will require enhancing the operating efficiencies of thermal management devices. Currently, the use of electrohydrodynamically (EHD) assisted thermal control devices is under consideration as a potential method of increasing thermal management system capacity. The objectives of the current investigation included completing build-up of the EHD-assisted heat pipe test bed, developing test procedures for an experimental evaluation of the unassisted heat pipe, developing an analytical model capable of predicting the performance limits of the unassisted heat pipe, and obtaining experimental data to define the performance characteristics of the unassisted heat pipe. The information obtained in this study will be used to provide extensive comparisons with the EHD-assisted performance observations to be obtained during the continuing investigation of EHD-assisted heat transfer devices. Through comparisons of the baseline test bed data and the EHD-assisted test bed data, accurate insight into the performance-enhancing characteristics of EHD augmentation may be obtained. This may lead to optimization, development, and implementation of EHD technology for future space programs.

  9. Mathematical analysis and coordinated current allocation control in battery power module systems

    NASA Astrophysics Data System (ADS)

    Han, Weiji; Zhang, Liang

    2017-12-01

    As the major energy storage device and power supply source in numerous energy applications, such as solar panels, wind plants, and electric vehicles, battery systems often face the issue of charge imbalance among battery cells/modules, which can accelerate battery degradation, cause more energy loss, and even pose a fire hazard. To tackle this issue, various circuit designs have been developed to enable charge equalization among battery cells/modules. Recently, the battery power module (BPM) design has emerged as one of the promising solutions for its capability of independent control of individual battery cells/modules. In this paper, we propose a new current allocation method based on charging/discharging space (CDS) for performance control in BPM systems. Based on the proposed method, the properties of CDS-based current allocation with constant parameters are analyzed. Then, the real-time external total power requirement is taken into account and an algorithm is developed for coordinated system performance control. By choosing appropriate control parameters, the desired system performance can be achieved by coordinating module charge balance and total power efficiency. Moreover, the proposed algorithm has a complete analytical solution and is thus computationally efficient. Finally, the efficacy of the proposed algorithm is demonstrated using simulations.
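A hedged sketch of one plausible reading of the CDS idea (the paper's actual control law, parameters, and normalization are not reproduced here): split a total discharge current among modules in proportion to each module's remaining discharging space, so that fuller modules carry more of the load and states of charge drift toward balance.

```python
def allocate_current(total_current, soc, capacity_ah):
    """Split a total discharge current among battery modules in
    proportion to each module's discharging space (remaining charge).
    This is an illustrative interpretation of CDS-based allocation,
    not the paper's exact algorithm."""
    space = [s * c for s, c in zip(soc, capacity_ah)]  # Ah left to discharge
    total_space = sum(space)
    return [total_current * sp / total_space for sp in space]

# Three hypothetical 10 Ah modules at 90%, 60%, and 30% state of charge
currents = allocate_current(30.0, soc=[0.9, 0.6, 0.3], capacity_ah=[10, 10, 10])
```

The allocation conserves the demanded total while loading the fullest module hardest; a charging-side rule would do the converse, weighting by the charging space (capacity minus stored charge).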

  10. Effects of 60-Hertz electric and magnetic fields on implanted cardiac pacemakers. Final report. [Hazards of power transmission line frequencies]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bridges, J.E.; Frazier, M.J.

    1979-09-01

    The effects of 60-Hz electric and magnetic fields of extra-high voltage (EHV) transmission lines on the performance of implanted cardiac pacemakers were studied by: (1) in vitro bench tests of a total of thirteen cardiac pacemakers; (2) in vivo tests of six implanted cardiac pacemakers in baboons; and (3) non-hazardous skin measurement tests on four humans. Analytical methods were developed to predict the thresholds of body current and electric fields capable of affecting normal pacemaker operation in humans. The field strengths calculated to alter implanted pacemaker performance were compared with the range of maximum electric and magnetic field strengths a human would normally encounter under transmission lines of various voltages. Results indicate that the electric field or body current necessary to alter the normal operation of pacemakers is highly dependent on the type of pacemaker and the location of the implanted electrodes. However, cardiologists have not so far detected harmful effects of pacemaker reversion to the asynchronous mode in current types of pacemakers and with present methods of implantation. Such interference can be eliminated by using advanced pacemakers less sensitive to 60-Hz voltages or by using implantation lead arrangements less sensitive to body current.

  11. Use of near-infrared spectroscopy (NIRs) in the biopharmaceutical industry for real-time determination of critical process parameters and integration of advanced feedback control strategies using MIDUS control.

    PubMed

    Vann, Lucas; Sheppard, John

    2017-12-01

    Control of biopharmaceutical processes is critical to achieving consistent product quality. The most challenging unit operation to control is cell growth in bioreactors, due to the exquisitely sensitive and complex nature of the cells that are converting raw materials into new cells and products. Monitoring capabilities are increasing; however, the main challenge is now becoming the ability to use the generated data effectively. Contributors to this challenge include the integration of different monitoring systems as well as the functionality to perform data analytics in real-time to generate process knowledge and understanding. In addition, there is a lack of ability to easily generate strategies and close the loop to feed back into the process for advanced process control (APC). The current research aims to demonstrate the use of advanced monitoring tools along with data analytics to generate process understanding in an Escherichia coli fermentation process. NIR spectroscopy was used to measure glucose and critical amino acids in real-time to help determine the root cause of failures associated with different lots of yeast extract. First, scale-down of the process was required to execute a simple design of experiments, followed by scale-up to build NIR models as well as soft sensors for advanced process control. In addition, the research demonstrates the potential for a novel platform technology that enables manufacturers to consistently achieve "golden batch" performance through monitoring, integration, data analytics, understanding, strategy design and control (MIDUS control). MIDUS control was employed to increase batch-to-batch consistency in final product titers, decrease the coefficient of variability from 8.49% to 1.16%, predict possible exhaust filter failures, and close the loop to prevent their occurrence and avoid lost batches.

  12. Visual analytics in medical education: impacting analytical reasoning and decision making for quality improvement.

    PubMed

    Vaitsis, Christos; Nilsson, Gunnar; Zary, Nabil

    2015-01-01

    The medical curriculum is the main tool representing the entire undergraduate medical education. Due to its complexity and multilayered structure, it is of limited use to teachers in medical education for quality-improvement purposes. In this study we evaluated three visualizations of curriculum data from a pilot course, using teachers from an undergraduate medical program and applying visual analytics methods. We found that visual analytics can positively impact analytical reasoning and decision making in medical education by realizing variables capable of enhancing human perception and cognition of complex curriculum data. The positive results, derived from a small-scale evaluation of a medical curriculum, signify the need to extend this method to an entire medical curriculum. As our approach sustains low levels of complexity, it opens a promising new direction in medical education informatics research.

  13. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    NASA Astrophysics Data System (ADS)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data analytics is a major topic today. As data-generation processes become more demanding and scalable, data acquisition and storage become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend toward “big data-as-a-service” is now discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve the security and other real-time problems of big data migration to cloud platforms. This article focuses on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, and the possibility of performing big data analytics on a cloud platform, are in demand for a new era of growth. The article also surveys available technologies and techniques for migrating big data to the cloud.

  14. Analytical Solution for Flow to a Partially Penetrating Well with Storage in a Confined Aquifer

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Mishra, P. K.; Neuman, S. P.

    2009-12-01

    Analytical solutions for radial flow toward a pumping well are commonly applied to analyze pumping tests conducted in confined aquifers. However, the existing analytical solutions cannot simultaneously take into account aquifer anisotropy, partial penetration, and the wellbore storage capacity of the pumping well. Ignoring these effects may have an important impact on the estimated aquifer properties. We present a new analytical solution for three-dimensional, axially symmetric flow to a pumping well in a confined aquifer that accounts for aquifer anisotropy, partial penetration, and the wellbore storage capacity of the pumping well. Our analytical solution reduces to that of Papadopulos et al. [1967] when the pumping well is fully penetrating, Hantush [1964] when the pumping well has no wellbore storage, and Theis [1935] when both conditions are fulfilled. The solution is evaluated through numerical inversion of its Laplace transform. We use our new solution to analyze data from synthetic and real pumping tests.
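
    The abstract does not reproduce the transform itself, so as a stand-in, here is a sketch of the Stehfest algorithm, a numerical Laplace inversion commonly used to evaluate well-hydraulics solutions of this kind, demonstrated on the test transform F(p) = 1/(p + 1), whose inverse is exp(-t):

```python
import math

# Stehfest numerical Laplace inversion: f(t) is approximated by a weighted
# sum of transform evaluations at p = k*ln(2)/t. The drawdown transform of
# the paper is not reproduced here; F(p) = 1/(p+1) serves as a check case.

def stehfest_weights(n):
    """Stehfest coefficients V_k for an even number of terms n."""
    v = []
    for k in range(1, n + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, n // 2) + 1):
            s += (j ** (n // 2) * math.factorial(2 * j)
                  / (math.factorial(n // 2 - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        v.append((-1) ** (k + n // 2) * s)
    return v

def stehfest_invert(F, t, n=12):
    """Approximate f(t) from its Laplace transform F(p) (n must be even)."""
    a = math.log(2.0) / t
    v = stehfest_weights(n)
    return a * sum(v[k - 1] * F(k * a) for k in range(1, n + 1))

# Recover exp(-1) from F(p) = 1/(p + 1) at t = 1:
approx = stehfest_invert(lambda p: 1.0 / (p + 1.0), t=1.0)
```

    For smooth, non-oscillatory drawdown curves the method converges quickly; more terms are not always better because the weights grow rapidly and amplify round-off.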

  15. Real-time, aptamer-based tracking of circulating therapeutic agents in living animals

    PubMed Central

    Ferguson, B. Scott; Hoggarth, David A.; Maliniak, Dan; Ploense, Kyle; White, Ryan J.; Woodward, Nick; Hsieh, Kuangwen; Bonham, Andrew J.; Eisenstein, Michael; Kippin, Tod; Plaxco, Kevin W.; Soh, H. Tom

    2014-01-01

    A sensor capable of continuously measuring specific molecules in the bloodstream in vivo would give clinicians a valuable window into patients’ health and their response to therapeutics. Such technology would enable truly personalized medicine, wherein therapeutic agents could be tailored with optimal doses for each patient to maximize efficacy and minimize side effects. Unfortunately, continuous, real-time measurement is currently only possible for a handful of targets, such as glucose, lactate, and oxygen, and the few existing platforms for continuous measurement are not generalizable for the monitoring of other analytes, such as small-molecule therapeutics. In response, we have developed a real-time biosensor capable of continuously tracking a wide range of circulating drugs in living subjects. Our microfluidic electrochemical detector for in vivo continuous monitoring (MEDIC) requires no exogenous reagents, operates at room temperature, and can be reconfigured to measure different target molecules by exchanging probes in a modular manner. To demonstrate the system's versatility, we measured therapeutic in vivo concentrations of doxorubicin (a chemotherapeutic) and kanamycin (an antibiotic) in live rats and in human whole blood for several hours with high sensitivity and specificity at sub-minute temporal resolution. Importantly, we show that MEDIC can also obtain pharmacokinetic parameters for individual animals in real time. Accordingly, just as continuous glucose monitoring technology is currently revolutionizing diabetes care, we believe MEDIC could be a powerful enabler for personalized medicine by ensuring delivery of optimal drug doses for individual patients based on direct detection of physiological parameters. PMID:24285484
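
    The abstract does not detail how pharmacokinetic parameters are extracted; one simple way such parameters could be obtained in real time, assuming a one-compartment elimination model C(t) = C0·exp(-k·t) (my illustration, not the MEDIC pipeline), is a log-linear least-squares fit over a window of sensor readings:

```python
import math

# Illustrative one-compartment PK fit: taking logs turns C(t) = C0*exp(-k*t)
# into a straight line, so C0 and the elimination rate k fall out of an
# ordinary least-squares fit on ln C(t).

def fit_elimination(times, concs):
    """Return (C0, k) from a log-linear least-squares fit."""
    n = len(times)
    ys = [math.log(c) for c in concs]
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
             / sum((t - tbar) ** 2 for t in times))
    return math.exp(ybar - slope * tbar), -slope

# Noise-free samples of C0 = 8, k = 0.5 recover the parameters exactly.
ts = [0.0, 1.0, 2.0, 3.0]
cs = [8.0 * math.exp(-0.5 * t) for t in ts]
c0, k = fit_elimination(ts, cs)
```

    On real sensor traces the fit would run over a sliding window, with the window length trading responsiveness against noise rejection.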

  16. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory.

    PubMed

    Horowitz, Gary L; Zaman, Zahur; Blanckaert, Norbert J C; Chan, Daniel W; Dubois, Jeffrey A; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W; Nilsen, Olaug L; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement over the existing laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality.

  17. MODULAR ANALYTICS: A New Approach to Automation in the Clinical Laboratory

    PubMed Central

    Zaman, Zahur; Blanckaert, Norbert J. C.; Chan, Daniel W.; Dubois, Jeffrey A.; Golaz, Olivier; Mensi, Noury; Keller, Franz; Stolz, Herbert; Klingler, Karl; Marocchi, Alessandro; Prencipe, Lorenzo; McLawhon, Ronald W.; Nilsen, Olaug L.; Oellerich, Michael; Luthe, Hilmar; Orsonneau, Jean-Luc; Richeux, Gérard; Recio, Fernando; Roldan, Esther; Rymo, Lars; Wicktorsson, Anne-Charlotte; Welch, Shirley L.; Wieland, Heinrich; Grawitz, Andrea Busse; Mitsumaki, Hiroshi; McGovern, Margaret; Ng, Katherine; Stockmann, Wolfgang

    2005-01-01

    MODULAR ANALYTICS (Roche Diagnostics) (MODULAR ANALYTICS, Elecsys and Cobas Integra are trademarks of a member of the Roche Group) represents a new approach to automation for the clinical chemistry laboratory. It consists of a control unit, a core unit with a bidirectional multitrack rack transportation system, and three distinct kinds of analytical modules: an ISE module, a P800 module (44 photometric tests, throughput of up to 800 tests/h), and a D2400 module (16 photometric tests, throughput up to 2400 tests/h). MODULAR ANALYTICS allows customised configurations for various laboratory workloads. The performance and practicability of MODULAR ANALYTICS were evaluated in an international multicentre study at 16 sites. Studies included precision, accuracy, analytical range, carry-over, and workflow assessment. More than 700 000 results were obtained during the course of the study. Median between-day CVs were typically less than 3% for clinical chemistries and less than 6% for homogeneous immunoassays. Median recoveries for nearly all standardised reference materials were within 5% of assigned values. Method comparisons versus existing routine instrumentation were clinically acceptable in all cases. During the workflow studies, the work from three to four single workstations was transferred to MODULAR ANALYTICS, which offered over 100 possible methods, with reduction in sample splitting, handling errors, and turnaround time. Typical sample processing time on MODULAR ANALYTICS was less than 30 minutes, an improvement over the existing laboratory systems. By combining multiple analytic units in flexible ways, MODULAR ANALYTICS met diverse laboratory needs and offered improvement in workflow over current laboratory situations. It increased overall efficiency while maintaining (or improving) quality. PMID:18924721

  18. 3D analysis of eddy current loss in the permanent magnet coupling.

    PubMed

    Zhu, Zina; Meng, Zhuo

    2016-07-01

    This paper first presents a 3D analytical model for analyzing the radial air-gap magnetic field between the inner and outer magnetic rotors of permanent magnet couplings, using the Amperian current model. Based on the air-gap field analysis, the eddy current loss in the isolation cover is predicted according to Maxwell's equations. A 3D finite element analysis model is constructed to analyze the spatial distributions of the magnetic field and the vector eddy currents, and the simulation results are compared with the analytical method. Finally, the current losses of two types of practical magnet couplings are measured experimentally for comparison with the theoretical results. It is concluded that the 3D analytical method is viable and could be used for eddy current loss prediction in magnet couplings.

  19. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines, including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe the Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user’s familiar environment without preventing researchers from using traditional tools and methods. We share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.

  20. Determination of hydrazine in drinking water: Development and multivariate optimization of a rapid and simple solid phase microextraction-gas chromatography-triple quadrupole mass spectrometry protocol.

    PubMed

    Gionfriddo, Emanuela; Naccarato, Attilio; Sindona, Giovanni; Tagarelli, Antonio

    2014-07-04

    In this work, the capabilities of solid-phase microextraction (SPME) were exploited in a fully optimized SPME-GC-QqQ-MS analytical approach for hydrazine assay. A rapid and easy method was obtained by a simple derivatization reaction with propyl chloroformate and pyridine carried out directly in water samples, followed by automated SPME analysis in the same vial without further sample handling. The affinity of the different derivatized compounds toward five commercially available SPME coatings was evaluated in order to achieve the best extraction efficiency. GC analyses were carried out using a GC-QqQ-MS instrument in selected reaction monitoring (SRM) acquisition mode, which achieved high specificity by selecting appropriate precursor-product ion pairs, improving analyte identification. The multivariate approach of experimental design was crucial for optimizing the derivatization reaction, the SPME process, and the tandem mass spectrometry parameters. Accuracy of the proposed protocol, tested at 60, 200 and 800 ng L(-1), provided satisfactory values (114.2%, 83.6% and 98.6%, respectively), whereas precision (RSD%) at the same concentration levels was 10.9%, 7.9% and 7.7%, respectively. Limits of detection and quantification of 4.4 and 8.3 ng L(-1), respectively, were obtained. The reliable application of the proposed protocol to real drinking-water samples confirmed its suitability as an analytical tool for routine analyses. Copyright © 2014 Elsevier B.V. All rights reserved.
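
    The figures of merit quoted above follow standard definitions; as an illustration (the replicate values and calibration numbers below are hypothetical, not the paper's raw data), recovery, RSD%, and ICH-style detection limits can be computed as:

```python
import statistics

# Standard method-validation figures of merit: recovery (accuracy),
# relative standard deviation (precision), and ICH Q2-style limits of
# detection/quantification from blank noise and calibration slope.

def recovery_pct(measured, nominal):
    """Mean measured value as a percentage of the nominal (spiked) level."""
    return 100.0 * statistics.mean(measured) / nominal

def rsd_pct(measured):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100.0 * statistics.stdev(measured) / statistics.mean(measured)

def lod_loq(sigma_blank, slope):
    """ICH Q2 estimates: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

replicates = [196.0, 205.0, 199.0]     # ng/L, hypothetical 200 ng/L spikes
rec = recovery_pct(replicates, 200.0)
prec = rsd_pct(replicates)
lod, loq = lod_loq(sigma_blank=0.02, slope=0.015)  # arbitrary signal units
```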

  1. Reporter Proteins in Whole-Cell Optical Bioreporter Detection Systems, Biosensor Integrations, and Biosensing Applications

    PubMed Central

    Close, Dan M.; Ripp, Steven; Sayler, Gary S.

    2009-01-01

    Whole-cell, genetically modified bioreporters are designed to emit detectable signals in response to a target analyte or related group of analytes. When integrated with a transducer capable of measuring those signals, a biosensor results that acts as a self-contained analytical system useful in basic and applied environmental, medical, pharmacological, and agricultural sciences. Historically, these devices have focused on signaling proteins such as green fluorescent protein, aequorin, firefly luciferase, and/or bacterial luciferase. The biochemistry and genetic development of these sensor systems as well as the advantages, challenges, and common applications of each one will be discussed. PMID:22291559

  2. Analytical modeling of circuit aerodynamics in the new NASA Lewis wind tunnel

    NASA Technical Reports Server (NTRS)

    Towne, C. E.; Povinelli, L. A.; Kunik, W. G.; Muramoto, K. K.; Hughes, C. E.; Levy, R.

    1985-01-01

    Rehabilitation and extension of the capability of the altitude wind tunnel (AWT) were analyzed. The analytical modeling program involves the use of advanced axisymmetric and three-dimensional viscous analyses to compute the flow through the various AWT components. Results for the analytical modeling of the high-speed leg aerodynamics are presented; these include: an evaluation of the flow quality at the entrance to the test section, an investigation of the effects of test section bleed for different model blockages, and an examination of three-dimensional effects in the diffuser due to reentry flow and due to the change in cross-sectional shape of the exhaust scoop.

  3. Analytical modeling and analysis of magnetic field and torque for novel axial flux eddy current couplers with PM excitation

    NASA Astrophysics Data System (ADS)

    Li, Zhao; Wang, Dazhi; Zheng, Di; Yu, Linxin

    2017-10-01

    Rotational permanent magnet eddy current couplers are promising devices for torque and speed transmission without any mechanical contact. In this study, flux-concentration disk-type permanent magnet eddy current couplers with double conductor rotor are investigated. Given the drawback of the accurate three-dimensional finite element method, this paper proposes a mixed two-dimensional analytical modeling approach. Based on this approach, the closed-form expressions of magnetic field, eddy current, electromagnetic force and torque for such devices are obtained. Finally, a three-dimensional finite element method is employed to validate the analytical results. Besides, a prototype is manufactured and tested for the torque-speed characteristic.

  4. Computational tools for Breakthrough Propulsion Physics: State of the art and future prospects

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2000-01-01

    To address problems in Breakthrough Propulsion Physics (BPP) one needs sheer computing capability. This is because General Relativity and Quantum Field Theory are so mathematically sophisticated that the amount of analytical calculation is prohibitive, and one can hardly do it all by hand. In this paper we make a comparative review of the main tensor calculus capabilities of the three most advanced and commercially available ``symbolic manipulator'' codes: Macsyma, Maple V and Mathematica. We also point out that one currently faces such a variety of conventions in tensor calculus that it is difficult or impossible to compare results obtained by different scholars in General Relativity and Quantum Field Theory. Mathematical physicists, experimental physicists and engineers each have their own way of customizing tensors, especially by using different metric signatures, different metric determinant signs, different definitions of the basic Riemann and Ricci tensors, and different systems of physical units. This chaos greatly hampers progress toward the chief NASA BPP goal: the design of the NASA Warp Drive. It is thus concluded that NASA should impose order by establishing international standards in symbolic tensor calculus and requiring anyone working in BPP to adopt these NASA BPP Standards.
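
    To make the convention clash concrete, here is a commonly cited example (my illustration, not taken from the paper): the curvature tensors themselves are defined only up to sign and contraction choices,

```latex
% Riemann tensor: both overall signs appear in the literature
R^{\rho}{}_{\sigma\mu\nu} = \pm\Bigl(
  \partial_{\mu}\Gamma^{\rho}{}_{\nu\sigma}
  - \partial_{\nu}\Gamma^{\rho}{}_{\mu\sigma}
  + \Gamma^{\rho}{}_{\mu\lambda}\Gamma^{\lambda}{}_{\nu\sigma}
  - \Gamma^{\rho}{}_{\nu\lambda}\Gamma^{\lambda}{}_{\mu\sigma}\Bigr)
% Metric signature: (-,+,+,+) or (+,-,-,-)
% Ricci tensor: R_{\sigma\nu} = R^{\mu}{}_{\sigma\mu\nu}, sign inherited
% from the Riemann choice
```

    so two correct computations can disagree by an overall sign unless the conventions in force are declared explicitly.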

  5. SIERRA Multimechanics Module: Aria User Manual Version 4.44

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal /Fluid Team

    2017-04-01

    Aria is a Galerkin finite element based program for solving coupled-physics problems described by systems of PDEs and is capable of solving nonlinear, implicit, transient and direct-to-steady-state problems in two and three dimensions on parallel architectures. The suite of physics currently supported by Aria includes thermal energy transport, species transport, and electrostatics, as well as generalized scalar, vector and tensor transport equations. Additionally, Aria includes support for manufacturing process flows via the incompressible Navier-Stokes equations specialized to a low Reynolds number (Re < 1) regime. Enhanced modeling support of manufacturing processing is made possible through use of either arbitrary Lagrangian-Eulerian (ALE) or level set based free and moving boundary tracking in conjunction with quasi-static nonlinear elastic solid mechanics for mesh control. Coupled-physics problems are solved in several ways, including a fully-coupled Newton's method with analytic or numerical sensitivities, fully-coupled Newton-Krylov methods, and a loosely-coupled nonlinear iteration about subsets of the system that are solved using combinations of the aforementioned methods. Error estimation, uniform and dynamic h-adaptivity, and dynamic load balancing are some of Aria's more advanced capabilities. Aria is based upon the Sierra Framework.

  6. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
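
    The graph-based dependency models mentioned above are often encoded as a D-matrix mapping each fault to the tests it causes to fail; a toy single-fault diagnoser over a hypothetical vehicle subsystem (my sketch, not the paper's hybrid method) looks like:

```python
# Minimal dependency-model diagnosis: under a single-fault assumption,
# candidate faults are those whose test signature exactly matches the
# observed set of failed tests. Fault and test names are hypothetical.

D_MATRIX = {
    "sensor_drift": {"t1", "t3"},
    "pump_wear":    {"t2", "t3"},
    "harness_open": {"t1", "t2", "t3"},
}

def diagnose(failed_tests, all_tests, d_matrix):
    """Return faults whose failure signature matches the observations."""
    return [fault for fault, signature in d_matrix.items()
            if signature == failed_tests and signature <= all_tests]

# Tests t1 and t3 failed while t2 passed:
candidates = diagnose({"t1", "t3"}, {"t1", "t2", "t3"}, D_MATRIX)
```

    The hybrid methods in the paper go further by deriving such signatures from quantitative simulation rather than hand-built rules, which is what makes them easier to validate and maintain.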

  7. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lightly damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically scaled models that are time-consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap and provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state of the art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data for various geometries and Reynolds numbers.
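
    At back-of-envelope level, the vortex-shedding concern above reduces to the Strouhal relation f = St·U/D and a proximity check against lightly damped structural modes; a sketch with illustrative numbers (not output of NASA's tool) is:

```python
# Strouhal-number estimate of vortex-shedding frequency for a cylinder of
# diameter D in wind of speed U, flagged when it falls within a band
# around a structural mode frequency (the "lock-in" concern).

def shedding_frequency(strouhal, wind_speed_ms, diameter_m):
    """Vortex-shedding frequency in Hz: f = St * U / D."""
    return strouhal * wind_speed_ms / diameter_m

def lock_in_risk(f_shed, f_mode, band=0.15):
    """True if shedding lies within +/- band (fraction) of the mode."""
    return abs(f_shed - f_mode) <= band * f_mode

# Illustrative: 5 m diameter vehicle in a 15 m/s wind, St ~ 0.2 for
# circular cylinders over a broad Reynolds-number range.
f = shedding_frequency(0.2, 15.0, 5.0)
risk = lock_in_risk(f, f_mode=0.55)
```

    A real analysis must also account for Reynolds-number effects on St and on the shedding coherence, which is part of why dynamically scaled wind-tunnel tests remain necessary.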

  8. Fluvial sediment in the environment: a national challenge

    USGS Publications Warehouse

    Larsen, Matthew C.; Gellis, Allen C.; Glysson, G. Douglas; Gray, John R.; Horowitz, Arthur J.

    2010-01-01

    Sediment and sediment-associated constituents can contribute substantially to water-quality impairment. In the past, sediment was viewed mainly as an engineering problem that affected reservoir storage capacity, shipping channel maintenance, and bridge scour, as well as the loss of agricultural soil. Sediment is now recognized as a major cause of aquatic system degradation in many rivers and streams as a result of light attenuation, loss of spawning substrate due to fine-grained sediment infilling, reduction in primary productivity, decreases in biotic diversity, and effects from sediment-associated chemical constituents. Recent advances in sediment measurement, assessment, source-identification, and analytical protocols provide new capabilities to quantify sediment and solid-phase chemical fluxes in aquatic systems. Developing, maintaining, and augmenting current sediment- and water-quality-monitoring networks is essential for determining the health of U.S. waterways and for evaluating the effectiveness of management actions in reducing sediment-related problems. The application of new scientific capabilities that address the adverse effects of sediment and sediment-associated constituents represents a major step in managing the Nation’s water quality. A robust Federal, national-scale effort, in collaboration with vested stakeholders, is needed to address these sediment-related water-quality issues across the United States.

  11. Application of Sequential Quadratic Programming to Minimize Smart Active Flap Rotor Hub Loads

    NASA Technical Reports Server (NTRS)

    Kottapalli, Sesi; Leyland, Jane

    2014-01-01

    In an analytical study, SMART active flap rotor hub loads have been minimized using nonlinear programming constrained optimization methodology. The recently developed NLPQLP system (Schittkowski, 2010) that employs Sequential Quadratic Programming (SQP) as its core algorithm was embedded into a driver code (NLP10x10) specifically designed to minimize active flap rotor hub loads (Leyland, 2014). Three types of practical constraints on the flap deflections have been considered. To validate the current application, two other optimization methods have been used: i) the standard, linear unconstrained method, and ii) the nonlinear Generalized Reduced Gradient (GRG) method with constraints. The new software code NLP10x10 has been systematically checked out. It has been verified that NLP10x10 is functioning as desired. The following are briefly covered in this paper: relevant optimization theory; implementation of the capability of minimizing a metric of all, or a subset, of the hub loads as well as the capability of using all, or a subset, of the flap harmonics; and finally, solutions for the SMART rotor. The eventual goal is to implement NLP10x10 in a real-time wind tunnel environment.
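    NLPQLP and NLP10x10 are not publicly available, but the constrained-minimization setup described above can be sketched with SciPy's SLSQP, another SQP-family implementation. The linear load map, baseline hub loads, and deflection bound below are invented for illustration, not SMART rotor data.

```python
# Illustrative sketch (not NLP10x10): minimize a quadratic hub-load metric
# over active-flap harmonic amplitudes with SciPy's SLSQP (an SQP method),
# subject to practical bounds on the flap deflection harmonics.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T = rng.standard_normal((6, 4))                   # assumed map: flap harmonics -> hub loads
h0 = np.array([2.0, -1.0, 0.5, 1.5, -0.5, 1.0])  # assumed baseline hub loads

def hub_load_metric(x):
    """Sum-of-squares metric over the six hub-load components."""
    return float(np.sum((h0 + T @ x) ** 2))

# Practical constraint: each flap harmonic amplitude limited to +/- 2 units
bounds = [(-2.0, 2.0)] * 4
res = minimize(hub_load_metric, x0=np.zeros(4), method="SLSQP", bounds=bounds)
print(res.success, hub_load_metric(res.x) < hub_load_metric(np.zeros(4)))
```

    The same pattern extends to minimizing a subset of the hub loads by restricting which components of `h0 + T @ x` enter the metric.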

  12. Predictive Capability of the Compressible MRG Equation for an Explosively Driven Particle with Validation

    NASA Astrophysics Data System (ADS)

    Garno, Joshua; Ouellet, Frederick; Koneru, Rahul; Balachandar, Sivaramakrishnan; Rollin, Bertrand

    2017-11-01

    An analytic model to describe the hydrodynamic forces on an explosively driven particle is not currently available. The Maxey-Riley-Gatignol (MRG) particle force equation generalized for compressible flows is well-studied in shock-tube applications, and captures the evolution of particle force extracted from controlled shock-tube experiments. In these experiments only the shock-particle interaction was examined, and the effects of the contact line were not investigated. In the present work, the predictive capability of this model is considered for the case where a particle is explosively ejected from a rigid barrel into ambient air. Particle trajectory information extracted from simulations is compared with experimental data. This configuration ensures that both the shock and contact produced by the detonation will influence the motion of the particle. The simulations are carried out using a finite volume, Euler-Lagrange code using the JWL equation of state to handle the explosive products. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program,under Contract No. DE-NA0002378.

  13. Big data analytics workflow management for eScience

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these software packages fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support, to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes.
Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, and climate change and biodiversity will be discussed in detail.
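    The array primitives named above (slicing and dicing, aggregation, concatenation) can be sketched in NumPy rather than with the Ophidia operators themselves; the toy (time, lat, lon) cube below is invented for illustration.

```python
# Hedged sketch of cube-style primitives on an n-dimensional dataset,
# using NumPy in place of Ophidia's parallel operators.
import numpy as np

# Assumed toy data cube with dimensions (time, lat, lon)
cube = np.arange(2 * 3 * 4, dtype=float).reshape(2, 3, 4)

slab = cube[0, :, :]        # slicing: fix one dimension
dice = cube[:, 1:3, 0:2]    # dicing: sub-ranges on several dimensions
tmax = cube.max(axis=0)     # aggregation over time (min/avg are analogous)
both = np.concatenate([cube, cube], axis=0)  # array concatenation

print(slab.shape, dice.shape, tmax.shape, both.shape)
```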

  14. Prioritizing pesticide compounds for analytical methods development

    USGS Publications Warehouse

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included in research methods that are expensive and for which there are few data on environmental samples.
The remaining quarter of Tier 1 compounds are high priority as new analytes. The objective for analytical methods development is to design an integrated analytical strategy that includes as many of the Tier 1 pesticide compounds as possible in a relatively few, cost-effective methods. More than 60 percent of the Tier 1 compounds are high priority because they are anticipated to be present at concentrations approaching levels that could be of concern to human health or aquatic life in surface water or groundwater. An additional 17 percent of Tier 1 compounds were frequently detected in monitoring studies, but either were not measured at levels potentially relevant to humans or aquatic organisms, or do not have benchmarks available with which to compare concentrations. The remaining 21 percent are pesticide degradates that were included because their parent pesticides were in Tier 1. Tier 1 pesticide compounds for water span all major pesticide use groups and a diverse range of chemical classes, with herbicides and their degradates composing half of compounds. Many of the high priority pesticide compounds also are in several national regulatory programs for water, including those that are regulated in drinking water by the U.S. Environmental Protection Agency under the Safe Drinking Water Act and those that are on the latest Contaminant Candidate List. For sediment, a total of 175 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods available for monitoring and studies. More than 60 percent of these compounds are included in some USGS analytical method; however, some are spread across several research methods that are expensive to perform, and monitoring data are not extensive for many compounds. The remaining Tier 1 compounds for sediment are high priority as new analytes. 
The objective for analytical methods development for sediment is to enhance an existing analytical method that currently includes nearly half of the pesticide compounds in Tier 1 by adding as many additional Tier 1 compounds as are analytically compatible. About 35 percent of the Tier 1 compounds for sediment are high priority on the basis of measured occurrence. A total of 74 compounds, or 42 percent, are high priority on the basis of predicted likelihood of occurrence according to physical-chemical properties, and either have potential toxicity to aquatic life, high pesticide usage, or both. The remaining 22 percent of Tier 1 pesticide compounds were either degradates of Tier 1 parent compounds or included for other reasons. As with water, the Tier 1 pesticide compounds for sediment are distributed across the major pesticide-use groups; insecticides and their degradates are the largest fraction, making up 45 percent of Tier 1. In contrast to water, organochlorines, at 17 percent, are the largest chemical class for Tier 1 in sediment, which is to be expected because there is continued widespread detection in sediments of persistent organochlorine pesticides and their degradates at concentrations high enough for potential effects on aquatic life. Compared to water, there are fewer available benchmarks with which to compare contaminant concentrations in sediment, but a total of 19 Tier 1 compounds have at least one sediment benchmark or screening value for aquatic organisms. Of the 175 compounds in Tier 1, 77 percent have high aquatic-life toxicity, as defined for this process. This evaluation of pesticides and degradates resulted in two lists of compounds that are priorities for USGS analytical methods development, one for water and one for sediment.
These lists will be used as the basis for redesigning and enhancing USGS analytical capabilities for pesticides in order to capture as many high-priority pesticide compounds as possible using an economically feasible approach.
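    The tiering pattern described above can be caricatured in a few lines of code. The thresholds and input fields below are invented for illustration and are not the USGS screening criteria.

```python
# Hypothetical sketch of tier classification: high occurrence or
# benchmark-relevant concentrations place a compound in Tier 1, and
# degradates of Tier 1 parents inherit Tier 1; moderate occurrence
# yields Tier 2, everything else Tier 3. Thresholds are illustrative.
def classify(detection_freq, near_benchmark, parent_tier1=False):
    """Return 1, 2 or 3 following the broad pattern in the report."""
    if near_benchmark or detection_freq > 0.10 or parent_tier1:
        return 1
    if detection_freq > 0.01:
        return 2
    return 3

print(classify(0.20, False),                      # frequently detected -> 1
      classify(0.005, True),                      # near a benchmark    -> 1
      classify(0.0, False, parent_tier1=True),    # Tier 1 degradate    -> 1
      classify(0.0, False))                       # neither             -> 3
# -> 1 1 1 3
```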

  15. Capacitive chemical sensor

    DOEpatents

    Manginell, Ronald P; Moorman, Matthew W; Wheeler, David R

    2014-05-27

    A microfabricated capacitive chemical sensor can be used as an autonomous chemical sensor or as an analyte-sensitive chemical preconcentrator in a larger microanalytical system. The capacitive chemical sensor detects changes in sensing film dielectric properties, such as the dielectric constant, conductivity, or dimensionality. These changes result from the interaction of a target analyte with the sensing film. This capability provides a low-power, self-heating chemical sensor suitable for remote and unattended sensing applications. The capacitive chemical sensor also enables a smart, analyte-sensitive chemical preconcentrator. After sorption of the sample by the sensing film, the film can be rapidly heated to release the sample for further analysis. Therefore, the capacitive chemical sensor can optimize the sample collection time prior to release to enable the rapid and accurate analysis of analytes by a microanalytical system.
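    As a back-of-envelope illustration of the sensing principle (not the patented device), a shift in the sensing film's dielectric constant changes a parallel-plate capacitance C = ε0·εr·A/d. The geometry and dielectric values below are assumed.

```python
# Minimal sketch: a 10% dielectric-constant shift from analyte sorption
# produces a picofarad-scale capacitance change for an assumed geometry.
EPS0 = 8.854e-12  # F/m, vacuum permittivity

def capacitance(eps_r, area_m2=1e-6, gap_m=1e-6):
    """Parallel-plate capacitance C = eps0 * eps_r * A / d."""
    return EPS0 * eps_r * area_m2 / gap_m

c_clean = capacitance(eps_r=3.0)   # film before sorption (assumed eps_r)
c_loaded = capacitance(eps_r=3.3)  # assumed 10% shift after sorption
delta = c_loaded - c_clean
print(f"dC = {delta:.3e} F")       # the change the readout must resolve
```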

  16. Detection of biological molecules using chemical amplification and optical sensors

    DOEpatents

    Van Antwerp, William Peter; Mastrototaro, John Joseph

    2000-01-01

    Methods are provided for the determination of the concentration of biological levels of polyhydroxylated compounds, particularly glucose. The methods utilize an amplification system that is an analyte transducer immobilized in a polymeric matrix, where the system is implantable and biocompatible. Upon interrogation by an optical system, the amplification system produces a signal capable of detection external to the skin of the patient. Quantitation of the analyte of interest is achieved by measurement of the emitted signal.

  17. MERRA/AS: The MERRA Analytic Services Project Interim Report

    NASA Technical Reports Server (NTRS)

    Schnase, John; Duffy, Dan; Tamkin, Glenn; Nadeau, Denis; Thompson, Hoot; Grieg, Cristina; Luczak, Ed; McInerney, Mark

    2013-01-01

    MERRA/AS is a cyberinfrastructure resource that will combine iRODS-based Climate Data Server (CDS) capabilities with Cloudera MapReduce to serve MERRA analytic products, store the MERRA reanalysis data collection in an HDFS to enable parallel, high-performance, storage-side data reductions, manage storage-side driver, mapper, and reducer code sets and realized objects for users, and provide a library of commonly used spatiotemporal operations that can be composed to enable higher-order analyses.
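    The storage-side mapper/reducer pattern described above can be sketched in plain Python rather than MapReduce: mappers emit partial (sum, count) pairs per data block, and a reducer combines them into a temporal mean. The block values are invented, not MERRA data.

```python
# Minimal map/reduce sketch of a storage-side reduction: each mapper sees
# one block of a reanalysis variable and emits a partial sum and count;
# the reducer folds the partials into a single mean.
from functools import reduce

blocks = [[280.0, 281.5], [279.0], [282.5, 280.5, 281.0]]  # assumed per-file blocks

def mapper(block):
    return (sum(block), len(block))        # partial (sum, count)

def reducer(a, b):
    return (a[0] + b[0], a[1] + b[1])      # combine partials

total, count = reduce(reducer, map(mapper, blocks))
print(total / count)                        # mean over all blocks, prints 280.75
```

    The same shape carries over to real spatiotemporal operators: only the mapper and reducer bodies change, while the partial-result contract stays fixed.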

  18. Detection of biological molecules using chemical amplification and optical sensors

    DOEpatents

    Van Antwerp, William Peter; Mastrototaro, John Joseph

    2004-10-12

    Methods are provided for the determination of the concentration of biological levels of polyhydroxylated compounds, particularly glucose. The methods utilize an amplification system that is an analyte transducer immobilized in a polymeric matrix, where the system is implantable and biocompatible. Upon interrogation by an optical system, the amplification system produces a signal capable of detection external to the skin of the patient. Quantitation of the analyte of interest is achieved by measurement of the emitted signal.

  19. Sample Return Missions Where Contamination Issues are Critical: Genesis Mission Approach

    NASA Technical Reports Server (NTRS)

    Allton, Judith H.; Stansbery E. K.

    2011-01-01

    The Genesis Mission sought the challenging analytical goal of accurately and precisely measuring the elemental and isotopic composition of the Sun to levels useful for planetary science, requiring sensitivities of ppm to ppt in the outer 100 nm of collector materials. Analytical capabilities were further challenged when the hard landing in 2004 broke open the canister containing the super-clean collectors. Genesis illustrates that returned samples allow flexibility and creativity to recover from setbacks.

  20. ATLAS, an integrated structural analysis and design system. Volume 1: ATLAS user's guide

    NASA Technical Reports Server (NTRS)

    Dreisbach, R. L. (Editor)

    1979-01-01

    Some of the many analytical capabilities provided by the ATLAS Version 4.0 System are described in the logical sequence in which model-definition data are prepared and the subsequent computer job is executed. The example data presented and the fundamental technical considerations that are highlighted can be used as guides during the problem-solving process. This guide does not describe the details of the ATLAS capabilities, but introduces the new user to ATLAS at a level from which the complete array of capabilities described in the ATLAS User's Manual can be exploited fully.

  1. Three-dimensional eddy current solution of a polyphase machine test model (abstract)

    NASA Astrophysics Data System (ADS)

    Pahner, Uwe; Belmans, Ronnie; Ostovic, Vlado

    1994-05-01

    This abstract describes a three-dimensional (3D) finite element solution of a test model that has been reported in the literature. The model is a basis for calculating the current redistribution effects in the end windings of turbogenerators. The aim of the study is to see whether the analytical results of the test model can be found using a general purpose finite element package, thus indicating that the finite element model is accurate enough to treat real end winding problems. The real end winding problems cannot be solved analytically, as the geometry is far too complicated. The model consists of a polyphase coil set, containing 44 individual coils. This set generates a two pole mmf distribution on a cylindrical surface. The rotating field causes eddy currents to flow in the inner massive and conducting rotor. In the analytical solution a perfect sinusoidal mmf distribution is put forward. The finite element model contains 85824 tetrahedra and 16451 nodes. A complex single scalar potential representation is used in the nonconducting parts. The computation time required was 3 h and 42 min. The flux plots show that the field distribution is acceptable. Furthermore, the induced currents are calculated and compared with the values found from the analytical solution. The distribution of the eddy currents is very close to the distribution of the analytical solution. The most important results are the losses, both local and global. The value of the overall losses is less than 2% away from those of the analytical solution. Also the local distribution of the losses is at any given point less than 7% away from the analytical solution. The deviations of the results are acceptable and are partially due to the fact that the sinusoidal mmf distribution was not modeled perfectly in the finite element method.

  2. Thermal fatigue durability for advanced propulsion materials

    NASA Technical Reports Server (NTRS)

    Halford, Gary R.

    1989-01-01

    A review is presented of thermal and thermomechanical fatigue (TMF) crack initiation life prediction and cyclic constitutive modeling efforts sponsored recently by the NASA Lewis Research Center in support of advanced aeronautical propulsion research. A brief description is provided of the more significant material durability models that were created to describe TMF fatigue resistance of both isotropic and anisotropic superalloys, with and without oxidation resistant coatings. The two most significant crack initiation models are the cyclic damage accumulation model and the total strain version of strainrange partitioning. Unified viscoplastic cyclic constitutive models are also described. A troika of industry, university, and government research organizations contributed to the generation of these analytic models. Based upon current capabilities and established requirements, an attempt is made to project which TMF research activities most likely will impact future generation propulsion systems.

  3. Remote real-time monitoring of subsurface landfill gas migration.

    PubMed

    Fay, Cormac; Doherty, Aiden R; Beirne, Stephen; Collins, Fiachra; Foley, Colum; Healy, John; Kiernan, Breda M; Lee, Hyowon; Maher, Damien; Orpen, Dylan; Phelan, Thomas; Qiu, Zhengwei; Zhang, Kirk; Gurrin, Cathal; Corcoran, Brian; O'Connor, Noel E; Smeaton, Alan F; Diamond, Dermot

    2011-01-01

    The cost of monitoring greenhouse gas emissions from landfill sites is of major concern for regulatory authorities. The current monitoring procedure is recognised as labour intensive, requiring agency inspectors to physically travel to perimeter borehole wells in rough terrain and manually measure gas concentration levels with expensive hand-held instrumentation. In this article we present a cost-effective and efficient system for remotely monitoring landfill subsurface migration of methane and carbon dioxide concentration levels. Based purely on an autonomous sensing architecture, the proposed sensing platform was capable of performing complex analytical measurements in situ and successfully communicating the data remotely to a cloud database. A web tool was developed to present the sensed data to relevant stakeholders. We report our experiences in deploying such an approach in the field over a period of approximately 16 months.

  4. HZETRN: A heavy ion/nucleon transport code for space radiations

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Chun, Sang Y.; Badavi, Forooz F.; Townsend, Lawrence W.; Lamkin, Stanley L.

    1991-01-01

    The galactic heavy ion transport code (GCRTRN) and the nucleon transport code (BRYNTRN) are integrated into a code package (HZETRN). The code package is computer efficient and capable of operating in an engineering design environment for manned deep space mission studies. The nuclear data set used by the code is discussed including current limitations. Although the heavy ion nuclear cross sections are assumed constant, the nucleon-nuclear cross sections of BRYNTRN with full energy dependence are used. The relation of the final code to the Boltzmann equation is discussed in the context of simplifying assumptions. Error generation and propagation is discussed, and comparison is made with simplified analytic solutions to test numerical accuracy of the final results. A brief discussion of biological issues and their impact on fundamental developments in shielding technology is given.
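    The abstract mentions comparing the code against simplified analytic solutions to test numerical accuracy. A minimal sketch of that idea, assuming a 1-D, straight-ahead, energy-independent transport problem (not the HZETRN formulation itself): the flux then obeys dφ/dx = -σφ, with analytic solution φ0·exp(-σx), against which a marching scheme can be checked. The cross section and depth are illustrative.

```python
# Accuracy check of an explicit-Euler marching scheme against the analytic
# attenuation solution of simplified 1-D straight-ahead transport.
import math

sigma = 0.5          # assumed macroscopic cross section, 1/cm
phi = 1.0            # incident flux (normalized)
dx, depth = 0.01, 10.0

for _ in range(int(depth / dx)):
    phi -= sigma * phi * dx            # Euler step of dphi/dx = -sigma*phi

analytic = math.exp(-sigma * depth)
rel_err = abs(phi - analytic) / analytic
print(rel_err)                          # ~1% error at this step size
```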

  5. Remote Real-Time Monitoring of Subsurface Landfill Gas Migration

    PubMed Central

    Fay, Cormac; Doherty, Aiden R.; Beirne, Stephen; Collins, Fiachra; Foley, Colum; Healy, John; Kiernan, Breda M.; Lee, Hyowon; Maher, Damien; Orpen, Dylan; Phelan, Thomas; Qiu, Zhengwei; Zhang, Kirk; Gurrin, Cathal; Corcoran, Brian; O’Connor, Noel E.; Smeaton, Alan F.; Diamond, Dermot

    2011-01-01

    The cost of monitoring greenhouse gas emissions from landfill sites is of major concern for regulatory authorities. The current monitoring procedure is recognised as labour intensive, requiring agency inspectors to physically travel to perimeter borehole wells in rough terrain and manually measure gas concentration levels with expensive hand-held instrumentation. In this article we present a cost-effective and efficient system for remotely monitoring landfill subsurface migration of methane and carbon dioxide concentration levels. Based purely on an autonomous sensing architecture, the proposed sensing platform was capable of performing complex analytical measurements in situ and successfully communicating the data remotely to a cloud database. A web tool was developed to present the sensed data to relevant stakeholders. We report our experiences in deploying such an approach in the field over a period of approximately 16 months. PMID:22163975

  6. A review of microdialysis coupled to microchip electrophoresis for monitoring biological events

    PubMed Central

    Saylor, Rachel A.; Lunte, Susan M.

    2015-01-01

    Microdialysis is a powerful sampling technique that enables monitoring of dynamic processes in vitro and in vivo. The combination of microdialysis with chromatographic or electrophoretic separations and selective detection methods yields a “separation-based sensor” capable of monitoring multiple analytes in near real time. Analysis of microdialysis samples requires techniques that are fast (<1 min), have low volume requirements (nL–pL), and, ideally, can be employed on-line. Microchip electrophoresis fulfills these requirements and also permits the possibility of integrating sample preparation and manipulation with detection strategies directly on-chip. Microdialysis coupled to microchip electrophoresis has been employed for monitoring biological events in vivo and in vitro. This review discusses technical considerations for coupling microdialysis sampling and microchip electrophoresis, including various interface designs, and current applications in the field. PMID:25637011

  7. Data and Tools | Concentrating Solar Power | NREL

    Science.gov Websites

    Available for download. The Solar Power tower Integrated Layout and Optimization Tool (SolarPILOT(tm)) combines the rapid layout and optimization capability of the analytical DELSOL3 program with the accuracy and

  8. AmO 2 Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO 2 Production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, Kevin John; Bland, Galey Jean; Fulwyler, James Brent

    Americium oxide samples will be measured for various analytes to support AmO 2 production. The key analytes that are currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy, and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST-traceable quality control materials, given that there are no suitable AmO 2 reference materials currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO 2 production operations.

  9. Analytical model for describing ion guiding through capillaries in insulating polymers

    NASA Astrophysics Data System (ADS)

    Liu, Shi-Dong; Zhao, Yong-Tao; Wang, Yu-Yu; Stolterfoht, N.; Cheng, Rui; Zhou, Xian-Ming; Xu, Hu-Shan; Xiao, Guo-Qing

    2015-08-01

    An analytical description for guiding of ions through nanocapillaries is given on the basis of previous work. The current entering into the capillary is assumed to be divided into a current fraction transmitted through the capillary, a current fraction flowing away via the capillary conductivity and a current fraction remaining within the capillary, which is responsible for its charge-up. The discharging current is assumed to be governed by the Frenkel-Poole process. At higher conductivities the analytical model shows a blocking of the ion transmission, which is in agreement with recent simulations. Also, it is shown that ion blocking observed in experiments is well reproduced by the analytical formula. Furthermore, the asymptotic fraction of transmitted ions is determined. Apart from the key controlling parameter (charge-to-energy ratio), the ratio of the capillary conductivity to the incident current is included in the model. Differences resulting from the nonlinear and linear limits of the Frenkel-Poole discharge are pointed out. Project supported by the Major State Basic Research Development Program of China (Grant No. 2010CB832902) and the National Natural Science Foundation of China (Grant Nos. 11275241, 11275238, 11105192, and 11375034).
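    The current balance described above can be sketched qualitatively: the injected current splits into transmitted, conducted-away, and deposited fractions, and the deposited charge relaxes through a Frenkel-Poole-like discharge term of the form exp(c·√Q). All functional forms and constants below are assumed for illustration and are not taken from the paper.

```python
# Qualitative explicit-Euler sketch of capillary charge-up: transmission
# improves as deposited charge Q builds, while a Frenkel-Poole-like
# discharge ~ exp(c*sqrt(Q)) drains Q, so the transmitted fraction
# approaches an asymptote. Units are arbitrary; forms are assumed.
import math

I_in = 1.0            # injected current
Q, dt = 0.0, 0.01     # deposited charge, time step
c, g0 = 2.0, 0.05     # assumed discharge nonlinearity and base conductivity

history = []
for _ in range(5000):
    transmitted = I_in * min(1.0, 0.2 + 0.1 * Q)        # guiding grows with Q
    discharge = g0 * math.exp(c * math.sqrt(Q)) * min(Q, 1.0)
    Q += dt * (I_in - transmitted - discharge)          # charge balance
    history.append(transmitted)

print(f"asymptotic transmitted fraction ~ {history[-1] / I_in:.2f}")
```

    Raising g0 in this toy model suppresses the asymptotic transmission, mirroring the blocking at higher conductivities reported above.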

  10. Steel Shear Walls, Behavior, Modeling and Design

    NASA Astrophysics Data System (ADS)

    Astaneh-Asl, Abolhassan

    2008-07-01

    In recent years steel shear walls have become one of the more efficient lateral load resisting systems in tall buildings. The basic steel shear wall system consists of a steel plate welded to boundary steel columns and boundary steel beams. In some cases the boundary columns have been concrete-filled steel tubes. Seismic behavior of steel shear wall systems during actual earthquakes and based on laboratory cyclic tests indicates that the systems are quite ductile and can be designed in an economical way to have sufficient stiffness, strength, ductility and energy dissipation capacity to resist seismic effects of strong earthquakes. This paper, after summarizing the past research, presents the results of two tests of an innovative steel shear wall system where the boundary elements are concrete-filled tubes. Then, a review of currently available analytical models of steel shear walls is provided with a discussion of capabilities and limitations of each model. We have observed that the tension-only "strip model", which forms the basis of the current AISC seismic design provisions for steel shear walls, is not capable of predicting the behavior of steel shear walls with length-to-thickness ratio less than about 600, which is the range most common in buildings. The main reason for this shortcoming of the AISC seismic design provisions is that they ignore the compression field in the shear walls, which can be significant in typical shear walls. The AISC method also is not capable of incorporating stresses in the shear wall due to overturning moments. A more rational seismic design procedure for design of shear walls proposed in 2000 by the author is summarized in the paper. The design method, based on procedures used for design of steel plate girders, takes into account both tension and compression stress fields and is applicable to all values of length-to-thickness ratios of steel shear walls. 
The method is also capable of including the effect of overturning moments and any normal forces that might act on the steel shear wall.
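    The applicability limit reported above (the tension-only strip model becoming unreliable below a length-to-thickness ratio of roughly 600) can be expressed as a tiny screening helper; the function and example geometries are hypothetical.

```python
# Hypothetical screening check: is the wall slender enough (L/t >= ~600)
# for a tension-only strip model to be plausible, per the limit cited above?
def strip_model_applicable(length_mm, thickness_mm, limit=600.0):
    """True when the length-to-thickness ratio meets the reported limit."""
    return (length_mm / thickness_mm) >= limit

print(strip_model_applicable(3000, 10))  # L/t = 300, typical building wall -> False
print(strip_model_applicable(6000, 6))   # L/t = 1000 -> True
```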

  11. Influence of polydimethylsiloxane outer coating and packing material on analyte recovery in dual-phase headspace sorptive extraction.

    PubMed

    Bicchi, Carlo; Cordero, Chiara; Liberto, Erica; Sgorbini, Barbara; David, Frank; Sandra, Pat; Rubiolo, Patrizia

    2007-09-14

    Dual-phase twisters (DP twisters), consisting of a polydimethylsiloxane (PDMS) outer coating and a second complementary (ad)sorbent as inner packing, have recently been shown to extend the applicability of headspace sorptive extraction (HSSE). In comparison to HSSE using PDMS only, the recovery of analytes from the headspace of a solid or liquid matrix is increased by combining the concentration capabilities of two sampling materials operating on different mechanisms (sorption and adsorption). This study compares the performance of DP twisters consisting of different PDMS outer coatings and different packing materials, including Tenax GC, a bisphenol-PDMS copolymer, Carbopack coated with 5% of Carbowax and beta-cyclodextrin, for the analysis of the headspace of roasted Arabica coffee, dried sage leaves and an aqueous test mixture containing compounds with different water solubility, acidity, polarity and volatility as test samples. In general, DP twisters showed a higher concentration capability than the corresponding conventional PDMS twisters for the analytes considered. The highest recoveries were obtained with DP twisters consisting of 0.2 mm thick PDMS coating combined with Tenax GC, a bisphenol-PDMS copolymer and Carbopack coated with 5% of Carbowax as inner adsorption phase.

  12. Nanoscaled aptasensors for multi-analyte sensing

    PubMed Central

    Saberian-Borujeni, Mehdi; Johari-Ahar, Mohammad; Hamzeiy, Hossein; Barar, Jaleh; Omidi, Yadollah

    2014-01-01

    Introduction: Nanoscaled aptamers (Aps), short single-stranded DNA or RNA oligonucleotides, are able to bind their specific targets with high affinity, which makes them powerful diagnostic and analytical sensing tools (the so-called "aptasensors"). Aptamers are selected from a random pool of oligonucleotides through a procedure known as "systematic evolution of ligands by exponential enrichment". Methods: In this work, the most recent studies in the field of aptasensors are reviewed and discussed, with a main focus on the potential of aptasensors for multi-analyte detection. Results: Owing to the specific folding capability of aptamers in the presence of an analyte, aptasensors have been exploited successfully for the detection of a wide range of small and large molecules (e.g., drugs and their metabolites, toxins, and associated biomarkers in various diseases) at very low concentrations in biological fluids/samples, even in the presence of interfering species. Conclusion: Biological samples are complex matrices. Hence, the main challenge is the development of aptasensors capable of determining various targets simultaneously within a biological matrix. To this end, integration of key scientific domains such as bioengineering and systems biology with biomedical research is inevitable. PMID:25671177

  13. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron - with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photo-electron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger electron spectroscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.

  14. Visual programming for next-generation sequencing data analytics.

    PubMed

    Milicchio, Franco; Rose, Rebecca; Bian, Jiang; Min, Jae; Prosperi, Mattia

    2016-01-01

    High-throughput or next-generation sequencing (NGS) technologies have become an established and affordable experimental framework in biological and medical sciences for all basic and translational research. Processing and analyzing NGS data is challenging. NGS data are big, heterogeneous, sparse, and error prone. Although a plethora of tools for NGS data analysis has emerged in the past decade, (i) software development is still lagging behind data generation capabilities, and (ii) there is a 'cultural' gap between the end user and the developer. Generic software template libraries specifically developed for NGS can help in dealing with the former problem, whilst coupling template libraries with visual programming may help with the latter. Here we scrutinize the state-of-the-art low-level software libraries implemented specifically for NGS and graphical tools for NGS analytics. An ideal developing environment for NGS should be modular (with a native library interface), scalable in computational methods (i.e. serial, multithread, distributed), transparent (platform-independent), interoperable (with external software interface), and usable (via an intuitive graphical user interface). These characteristics should facilitate both the run of standardized NGS pipelines and the development of new workflows based on technological advancements or users' needs. We discuss in detail the potential of a computational framework blending generic template programming and visual programming that addresses all of the current limitations. In the long term, a proper, well-developed (although not necessarily unique) software framework will bridge the current gap between data generation and hypothesis testing. This will eventually facilitate the development of novel diagnostic tools embedded in routine healthcare.

  15. Tidally induced residual current over the Malin Sea continental slope

    NASA Astrophysics Data System (ADS)

    Stashchuk, Nataliya; Vlasenko, Vasiliy; Hosegood, Phil; Nimmo-Smith, W. Alex M.

    2017-05-01

    Tidally induced residual currents generated over shelf-slope topography are investigated analytically and numerically using the Massachusetts Institute of Technology general circulation model. Observational support for the presence of such a slope current was recorded over the Malin Sea continental slope during the 88th cruise of the RRS James Cook in July 2013. A simple analytical formula, developed here in the framework of time-averaged shallow water equations, has been validated against a fully nonlinear nonhydrostatic numerical solution. Good agreement between the analytical and numerical solutions is found for a wide range of input parameters of the tidal flow and bottom topography. Applied to the Malin Shelf area, both the numerical model and the analytical solution predict a northward-moving current confined to the slope, with its core located above the 400 m isobath and with vertically averaged maximum velocities up to 8 cm s-1, consistent with the in-situ data recorded at three moorings and along cross-slope transects.

  16. Modeling of phonon scattering in n-type nanowire transistors using one-shot analytic continuation technique

    NASA Astrophysics Data System (ADS)

    Bescond, Marc; Li, Changsheng; Mera, Hector; Cavassilas, Nicolas; Lannoo, Michel

    2013-10-01

    We present a one-shot current-conserving approach to model the influence of electron-phonon scattering in nano-transistors using the non-equilibrium Green's function formalism. The approach is based on the lowest order approximation (LOA) to the current and its simplest analytic continuation (LOA+AC). By means of a scaling argument, we show how both LOA and LOA+AC can be easily obtained from the first iteration of the usual self-consistent Born approximation (SCBA) algorithm. Both LOA and LOA+AC are then applied to model n-type silicon nanowire field-effect-transistors and are compared to SCBA current characteristics. In this system, the LOA fails to describe electron-phonon scattering, mainly because of the interactions with acoustic phonons at the band edges. In contrast, the LOA+AC still well approximates the SCBA current characteristics, thus demonstrating the power of analytic continuation techniques. The limits of validity of LOA+AC are also discussed, and more sophisticated and general analytic continuation techniques are suggested for more demanding cases.

  17. Detection of biological molecules using boronate-based chemical amplification and optical sensors

    DOEpatents

    Van Antwerp, William Peter; Mastrototaro, John Joseph; Lane, Stephen M.; Satcher, Jr., Joe H.; Darrow, Christopher B.; Peyser, Thomas A.; Harder, Jennifer

    1999-01-01

    Methods are provided for the determination of the concentration of biological levels of polyhydroxylated compounds, particularly glucose. The methods utilize an amplification system that is an analyte transducer immobilized in a polymeric matrix, where the system is implantable and biocompatible. Upon interrogation by an optical system, the amplification system produces a signal capable of detection external to the skin of the patient. Quantitation of the analyte of interest is achieved by measurement of the emitted signal.

  18. New perspectives in laser analytics: Resonance-enhanced multiphoton ionization in a Paul ion trap combined with a time-of-flight mass spectrometer

    NASA Astrophysics Data System (ADS)

    Bisling, Peter; Heger, Hans Jörg; Michaelis, Walfried; Weitkamp, Claus; Zobel, Harald

    1995-04-01

    A new laser analytical device has been developed that is based on resonance-enhanced multiphoton ionization in the very center of a radio-frequency quadrupole ion trap. Applications in speciation analysis of biological and environmental samples and in materials science will all benefit from laser-optical selectivity in the resonance excitation process, combined with mass-spectrometric sensitivity, which is further enhanced by the ion accumulation and storage capability.

  19. A general, cryogenically-based analytical technique for the determination of trace quantities of volatile organic compounds in the atmosphere

    NASA Technical Reports Server (NTRS)

    Coleman, R. A.; Cofer, W. R., III; Edahl, R. A., Jr.

    1985-01-01

    An analytical technique for the determination of trace (sub-ppbv) quantities of volatile organic compounds in air was developed. A liquid nitrogen-cooled trap operated at reduced pressures in series with a Dupont Nafion-based drying tube and a gas chromatograph was utilized. The technique is capable of analyzing a variety of organic compounds, from simple alkanes to alcohols, while offering a high level of precision, peak sharpness, and sensitivity.

  20. Detection of biological molecules using boronate-based chemical amplification and optical sensors

    DOEpatents

    Van Antwerp, William Peter; Mastrototaro, John Joseph; Lane, Stephen M.; Satcher, Jr., Joe H.; Darrow, Christopher B.; Peyser, Thomas A.; Harder, Jennifer

    2004-06-15

    Methods are provided for the determination of the concentration of biological levels of polyhydroxylated compounds, particularly glucose. The methods utilize an amplification system that is an analyte transducer immobilized in a polymeric matrix, where the system is implantable and biocompatible. Upon interrogation by an optical system, the amplification system produces a signal capable of detection external to the skin of the patient. Quantitation of the analyte of interest is achieved by measurement of the emitted signal.

  1. Analytical stability and simulation response study for a coupled two-body system

    NASA Technical Reports Server (NTRS)

    Tao, K. M.; Roberts, J. R.

    1975-01-01

    An analytical stability study and a digital simulation response study of two connected rigid bodies are documented. Relative rotation of the bodies at the connection is allowed, thereby providing a model suitable for studying system stability and response during a soft-dock regime. Provisions are made for a docking-port axis-alignment torque and a despin-torque capability for encountering spinning payloads. Although the stability analysis is based on linearized equations, the digital simulation is based on nonlinear models.

  2. Minimum Detectable Dose as a Measure of Bioassay Programme Capability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carbaugh, Eugene H.

    2003-01-01

    This paper suggests that minimum detectable dose (MDD) be used to describe the capability of bioassay programs for which intakes are expected to be rare. This allows expression of the capability in units that correspond directly to primary dose limits. The concept uses the well-established analytical statistic minimum detectable amount (MDA) as the starting point and assumes MDA detection at a prescribed time post intake. The resulting dose can then be used as an indication of the adequacy or capability of the program for demonstrating compliance with the performance criteria. MDDs can be readily tabulated or plotted to demonstrate the effectiveness of different types of monitoring programs. The inclusion of cost factors for bioassay measurements can allow optimisation.

  3. Minimum detectable dose as a measure of bioassay programme capability.

    PubMed

    Carbaugh, E H

    2003-01-01

    This paper suggests that minimum detectable dose (MDD) be used to describe the capability of bioassay programmes for which intakes are expected to be rare. This allows expression of the capability in units that correspond directly to primary dose limits. The concept uses the well established analytical statistic minimum detectable amount (MDA) as the starting point, and assumes MDA detection at a prescribed time post-intake. The resulting dose can then be used as an indication of the adequacy or capability of the programme for demonstrating compliance with the performance criteria. MDDs can be readily tabulated or plotted to demonstrate the effectiveness of different types of monitoring programmes. The inclusion of cost factors for bioassay measurements can allow optimisation.
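    The MDD calculation described in this record reduces to a simple chain: smallest detectable intake inferred from the MDA, then converted to dose. A minimal sketch with invented nuclide parameters (real assessments use ICRP biokinetic models for the excretion fraction m(t)):

```python
# Minimum detectable dose (MDD) from a bioassay MDA.
# The smallest intake detectable at time t post-intake is
#   intake_min = MDA / m(t),
# where m(t) is the fraction of the intake expected in the bioassay
# compartment (e.g. a 24-h urine sample) at time t.  MDD is then
# intake_min times the committed-dose coefficient e (Sv/Bq).
def mdd(mda_bq, m_t, e_sv_per_bq):
    """Minimum detectable dose (Sv) for a measurement at time t post-intake."""
    intake_min = mda_bq / m_t        # smallest detectable intake (Bq)
    return intake_min * e_sv_per_bq  # committed effective dose (Sv)

# Hypothetical numbers: MDA = 1 Bq per sample, m(t) = 1e-3 of the intake
# excreted at day t, e = 1e-6 Sv/Bq.
print(mdd(1.0, 1e-3, 1e-6))  # 0.001 Sv, i.e. 1 mSv
```

Tabulating mdd() over a grid of measurement times t directly reproduces the kind of MDD-versus-time plots the paper proposes for comparing monitoring programmes.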

  4. Unsteady transonic potential flow over a flexible fuselage

    NASA Technical Reports Server (NTRS)

    Gibbons, Michael D.

    1993-01-01

    A flexible fuselage capability has been developed and implemented within version 1.2 of the CAP-TSD code. The capability required adding time dependent terms to the fuselage surface boundary conditions and the fuselage surface pressure coefficient. The new capability will allow modeling the effect of a flexible fuselage on the aeroelastic stability of complex configurations. To assess the flexible fuselage capability several steady and unsteady calculations have been performed for slender fuselages with circular cross-sections. Steady surface pressures are compared with experiment at transonic flight conditions. Unsteady cross-sectional lift is compared with other analytical results at a low subsonic speed and a transonic case has been computed. The comparisons demonstrate the accuracy of the flexible fuselage modifications.

  5. Performance Analyses of Intercity Ground Passenger Transportation Systems

    DOT National Transportation Integrated Search

    1976-04-01

    This report documents the development of analytical techniques and their use for investigating the performance of intercity ground passenger transportation systems. The purpose of the study is twofold: (1) to provide a capability of evaluating new pa...

  6. Stability and Curving Performance of Conventional and Advanced Rail Transit Vehicles

    DOT National Transportation Integrated Search

    1984-01-01

    Analytical studies are presented which compare the curving performance and speed capability of conventional rail transit trucks with self steering (cross-braced) and forced steering (linkages between carbody and wheelsets) radial trucks. Truck curvin...

  7. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10 year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. The numerous factors that potentially degrade system reliability, and the ways in which factors peculiar to highly reliable fault-tolerant systems are accounted for in credible reliability assessments, are described. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  8. Laser Time-of-Flight Mass Spectrometry for Future In Situ Planetary Missions

    NASA Technical Reports Server (NTRS)

    Getty, S. A.; Brinckerhoff, W. B.; Cornish, T.; Ecelberger, S. A.; Li, X.; Floyd, M. A. Merrill; Chanover, N.; Uckert, K.; Voelz, D.; Xiao, X.; et al.

    2012-01-01

    Laser desorption/ionization time-of-flight mass spectrometry (LD-TOF-MS) is a versatile, low-complexity instrument class that holds significant promise for future landed in situ planetary missions that emphasize compositional analysis of surface materials. Here we describe a 5kg-class instrument that is capable of detecting and analyzing a variety of analytes directly from rock or ice samples. Through laboratory studies of a suite of representative samples, we show that detection and analysis of key mineral composition, small organics, and particularly, higher molecular weight organics are well suited to this instrument design. A mass range exceeding 100,000 Da has recently been demonstrated. We describe recent efforts in instrument prototype development and future directions that will enhance our analytical capabilities targeting organic mixtures on primitive and icy bodies. We present results on a series of standards, simulated mixtures, and meteoritic samples.

  9. Description of a Generalized Analytical Model for the Micro-dosimeter Response

    NASA Technical Reports Server (NTRS)

    Badavi, Francis F.; Stewart-Sloan, Charlotte R.; Xapsos, Michael A.; Shinn, Judy L.; Wilson, John W.; Hunter, Abigail

    2007-01-01

    An analytical prediction capability for space radiation in Low Earth Orbit (LEO), correlated with the Space Transportation System (STS) Shuttle Tissue Equivalent Proportional Counter (TEPC) measurements, is presented. The model takes into consideration the energy loss straggling and chord length distribution of the TEPC detector, and is capable of predicting energy deposition fluctuations in a micro-volume by incoming ions through both direct and indirect ionic events. The charged particle transport calculations correlated with STS 56, 51, 110 and 114 flights are accomplished by utilizing the most recent version (2005) of the Langley Research Center (LaRC) deterministic ionized particle transport code High charge (Z) and Energy TRaNsport (HZETRN), which has been extensively validated with laboratory beam measurements and available space flight data. The agreement between the TEPC model prediction (response function) and the TEPC measured differential and integral spectra in the lineal energy (y) domain is promising.
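    As background to the lineal energy (y) domain mentioned above, here is the generic microdosimetric definition in sketch form (this is not the paper's full response-function model, which additionally includes energy-loss straggling and the chord-length distribution):

```python
# Lineal energy y = energy deposited in a single event / mean chord length
# of the sensitive site.  For any convex body the Cauchy mean chord is
# 4V/S; for a sphere of diameter d this reduces to 2d/3.
def lineal_energy_kev_per_um(energy_kev, site_diameter_um):
    """Lineal energy (keV/um) for a deposition in a spherical site."""
    mean_chord_um = 2.0 * site_diameter_um / 3.0  # Cauchy 4V/S for a sphere
    return energy_kev / mean_chord_um

# Example: a 1 keV deposition in a TEPC simulating a 2-um tissue sphere.
print(lineal_energy_kev_per_um(1.0, 2.0))  # 0.75 keV/um
```

A TEPC's measured pulse-height spectrum, binned in y this way, is what the model's predicted response function is compared against.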

  10. Heat pipe development

    NASA Technical Reports Server (NTRS)

    Bienart, W. B.

    1973-01-01

    The objective of this program was to investigate analytically and experimentally the performance of heat pipes with composite wicks--specifically, those having pedestal arteries and screwthread circumferential grooves. An analytical model was developed to describe the effects of screwthreads and screen secondary wicks on the transport capability of the artery. The model describes the hydrodynamics of the circumferential flow in triangular grooves with azimuthally varying capillary menisci and liquid cross-sections. Normalized results were obtained which give the influence of evaporator heat flux on the axial heat transport capability of the arterial wick. In order to evaluate the priming behavior of composite wicks under actual load conditions, an 'inverted' glass heat pipe was designed and constructed. The results obtained from the analysis and from the tests with the glass heat pipe were applied to the OAO-C Level 5 heat pipe, and an improved correlation between predicted and measured evaporator and transport performance was obtained.

  11. Laboratory Instruments Available to Support Space Station Researchers at Marshall Space Flight Center

    NASA Technical Reports Server (NTRS)

    Panda, Binayak; Gorti, Sridhar

    2013-01-01

    A number of research instruments are available at NASA's Marshall Space Flight Center (MSFC) to support ISS researchers and their investigations. These modern analytical tools yield valuable, and sometimes new, information resulting from sample characterization. Instruments include modern scanning electron microscopes equipped with field emission guns, providing analytical capabilities that include angstrom-level image resolution of dry, wet and biological samples. These microscopes are also equipped with silicon drift X-ray detectors (SDD) for fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations in crystalline alloys. Sample chambers admit large samples and provide variable pressures for wet samples, and quantitative analysis software is available to determine phase relations. Advances in solid-state electronics have also facilitated improvements in surface chemical analysis, successfully employed to analyze metallic materials and alloys, ceramics, slags, and organic polymers. Another analytical capability at MSFC is a magnetic sector Secondary Ion Mass Spectroscopy (SIMS) instrument that quantitatively determines and maps light elements such as hydrogen, lithium, and boron along with their isotopes, and identifies and quantifies very low level impurities even at parts per billion (ppb) levels. Still other methods available at MSFC include X-ray photo-electron spectroscopy (XPS), which can determine oxidation states of elements as well as identify polymers and measure film thicknesses on coated materials, and scanning Auger electron spectroscopy (SAM), which combines surface sensitivity, spatial lateral resolution (approximately 20 nm), and depth profiling capabilities to describe elemental compositions in near surface regions and even the chemical state of analyzed atoms. A conventional Transmission Electron Microscope (TEM) for observing internal microstructures at very high magnifications and an Electron Probe Micro-analyzer (EPMA) for very precise microanalysis are available as needed by the researcher. Space Station researchers are invited to work with MSFC in analyzing their samples using these techniques.

  12. Green analytical chemistry--theory and practice.

    PubMed

    Tobiszewski, Marek; Mechlińska, Agata; Namieśnik, Jacek

    2010-08-01

    This tutorial review summarises the current state of green analytical chemistry with special emphasis on environmentally friendly sample preparation techniques. Green analytical chemistry is a part of the sustainable development concept; its history and origins are described. Miniaturisation of analytical devices and shortening the time elapsing between performing analysis and obtaining reliable analytical results are important aspects of green analytical chemistry. Solventless extraction techniques, the application of alternative solvents and assisted extractions are considered to be the main approaches complying with green analytical chemistry principles.

  13. Contact and Impact Dynamic Modeling Capabilities of LS-DYNA for Fluid-Structure Interaction Problems

    DTIC Science & Technology

    2010-12-02

    rigid sphere in a vertical water entry,” Applied Ocean Research, 13(1), pp. 43-48. Monaghan, J.J., 1994. “ Simulating free surface flows with SPH ...The kinematic free surface condition was used to determine the intersection between the free surface and the body in the outer flow domain...and the results were compared with analytical and numerical predictions. The predictive capability of ALE and SPH features of LS-DYNA for simulation

  14. The Transformation from Defence Procurement to Defence Acquisition - Opportunities for New Forms of Analytical Support

    DTIC Science & Technology

    2010-04-01

    Exchanges of Services ( ATARES ); Strategic Airlift Interim Solution (SALIS); Strategic Airlift Capability (SAC); the European Air Transport Fleet (EATF... ATARES is a TA, established in order to facilitate the exchange of military capabilities based on equivalent flying hours with Lockheed C-130 Hercules...initiatives such as NATO PfP, EU BG, MNE, NAMSA, MCCE, ATARES , SALIS, and SAC. The participation in the EU BG concept was as one of the contributors

  15. Analytic solution of magnetic induction distribution of ideal hollow spherical field sources

    NASA Astrophysics Data System (ADS)

    Xu, Xiaonong; Lu, Dingwei; Xu, Xibin; Yu, Yang; Gu, Min

    2017-12-01

    The Halbach type hollow spherical permanent magnet arrays (HSPMA) are volume-compacted, energy-efficient field sources capable of producing a multi-Tesla field in the cavity of the array, which has attracted intense interest in many practical applications. Here, we present analytical solutions of the magnetic induction for the ideal HSPMA in the entire space: outside the array, within its cavity, and in the interior of the magnet. We obtain the solutions using the concept of magnetic charge to solve Poisson's and Laplace's equations for the HSPMA. Using these analytical field expressions inside the material, a scalar demagnetization function is defined to approximately indicate the regions of magnetization reversal, partial demagnetization, and inverse magnetic saturation. The analytical field solution provides deeper insight into the nature of the HSPMA and offers guidance in designing optimized ones.
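    For orientation on why such arrays reach multi-Tesla cavity fields: the ideal spherical Halbach dipole has a well-known closed-form cavity field, B = (4/3) Br ln(Ro/Ri). This back-of-envelope sketch uses that textbook result, not the paper's full multipole solution:

```python
# Cavity field of an ideal spherical Halbach dipole ("magic sphere").
# Textbook closed form:  B_cavity = (4/3) * B_r * ln(R_outer / R_inner),
# where B_r is the remanence of the magnet material.  Note the field grows
# only logarithmically with the radius ratio, hence "volume compacted".
import math

def halbach_sphere_field(b_remanent_t, r_outer, r_inner):
    """Uniform cavity field (T) of an ideal hollow spherical Halbach array."""
    return (4.0 / 3.0) * b_remanent_t * math.log(r_outer / r_inner)

# Example: NdFeB with B_r = 1.4 T and an outer/inner radius ratio of 3
# already exceeds 2 T in the cavity.
print(halbach_sphere_field(1.4, 3.0, 1.0))
```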

  16. NASA Laboratory Analysis for Manned Exploration Missions

    NASA Technical Reports Server (NTRS)

    Krihak, Michael K.; Shaw, Tianna E.

    2014-01-01

    The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability Element under the NASA Human Research Program. ELA instrumentation is identified as an essential capability for future exploration missions to diagnose and treat evidence-based medical conditions. However, mission architecture limits the medical equipment, consumables, and procedures that will be available to treat medical conditions during human exploration missions. Allocated resources such as mass, power, volume, and crew time must be used efficiently to optimize the delivery of in-flight medical care. Although commercial instruments can provide the blood- and urine-based measurements required for exploration missions, these commercial off-the-shelf devices are unsuitable for deployment in the space environment. The objective of the ELA project is to close the gap between current minimally invasive laboratory capabilities and the analytical measurements required, within the constraints that the mission architecture imposes on exploration missions. Besides microgravity and radiation tolerance, other principal issues that generally fail to meet NASA requirements include excessive mass, volume, power and consumables, and nominal reagent shelf-life. Though manned exploration missions will not occur for nearly a decade, NASA has already taken strides toward the development of ELA medical diagnostics by establishing mission requirements and concepts of operations that are coupled with strategic investments and partnerships toward meeting these challenges. This paper focuses on the remote environment, its challenges, biomedical diagnostics requirements, and candidate technologies that may lead to successful blood-urine chemistry and biomolecular measurements in future space exploration missions.

  17. Background: Preflight Screening, In-flight Capabilities, and Postflight Testing

    NASA Technical Reports Server (NTRS)

    Gibson, Charles Robert; Duncan, James

    2009-01-01

    Recommendations for minimal in-flight capabilities: Retinal imaging - provide in-flight capability for the visual monitoring of ocular health (specifically, imaging of the retina and optic nerve head) with the capability of downlinking video/still images. Tonometry - provide more accurate and reliable in-flight capability for measuring intraocular pressure. Ultrasound - explore capabilities of the current on-board system for monitoring ocular health. We currently have limited in-flight capabilities on board the International Space Station for performing an internal ocular health assessment: visual acuity, direct ophthalmoscope, ultrasound, and tonometry (Tonopen).

  18. Method For Chemical Sensing Using A Microfabricated Teeter-Totter Resonator

    DOEpatents

    Adkins, Douglas Ray; Heller, Edwin J.; Shul, Randy J.

    2004-11-30

    A method for sensing a chemical analyte in a fluid stream comprises providing a microfabricated teeter-totter resonator that relies upon a Lorentz force to cause oscillation in a paddle, applying a static magnetic field substantially aligned in-plane with the paddle, energizing a current conductor line on a surface of the paddle with an alternating electrical current to generate the Lorentz force, exposing the resonator to the analyte, and detecting the response of the oscillatory motion of the paddle to the chemical analyte. Preferably, a chemically sensitive coating is disposed on at least one surface of the paddle to enhance the sorption of the analyte by the paddle. The concentration of the analyte in a fluid stream can be determined by measuring the change in the resonant frequency or phase of the teeter-totter resonator as the chemical analyte is added to or removed from the paddle.
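    The frequency-shift readout described above can be sketched with the generic harmonic-oscillator small-mass approximation; the paddle parameters below are invented for illustration and this is not the patent's quantitative model:

```python
# Gravimetric sensing with a resonator: sorbed analyte mass lowers the
# resonant frequency.  For f = (1/(2*pi)) * sqrt(k/m), a small added mass
# dm shifts the frequency by df ~ -(f0/2)*(dm/m), so the sorbed mass can
# be estimated from the measured downshift as  dm ~ -2*m*(df/f0).
def added_mass(m_eff_kg, f0_hz, f_meas_hz):
    """Estimate sorbed analyte mass (kg) from the measured frequency shift."""
    df = f_meas_hz - f0_hz            # negative for mass loading
    return -2.0 * m_eff_kg * df / f0_hz

# Hypothetical paddle: effective mass 1e-9 kg, 10 kHz resonance,
# 1 Hz downshift observed after analyte exposure.
dm = added_mass(1e-9, 10e3, 10e3 - 1.0)
print(dm)  # 2e-13 kg, i.e. 0.2 ng of sorbed analyte
```

Tracking the shift continuously as analyte sorbs onto and desorbs from the coated paddle gives the concentration-versus-time readout the claim describes.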

  19. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  20. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
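    The UCL95% used in these two tank reports is the standard one-sided Student-t upper confidence bound on the mean, xbar + t(0.95, n-1) * s / sqrt(n). A minimal sketch with invented concentrations (the one-sided t quantile for df = 5 is hard-coded):

```python
# One-sided upper 95% confidence limit on a mean concentration,
# as computed from the six scrape-sample averages:
#   UCL95 = xbar + t(0.95, n-1) * s / sqrt(n)
import math
import statistics

def ucl95(samples, t_quantile):
    """Upper 95% confidence limit on the mean of `samples`."""
    n = len(samples)
    xbar = statistics.mean(samples)
    s = statistics.stdev(samples)          # sample standard deviation (n-1)
    return xbar + t_quantile * s / math.sqrt(n)

# Hypothetical analyte concentrations from six scrape samples (mg/kg);
# one-sided t(0.95, df=5) = 2.015.
conc = [12.1, 10.8, 11.5, 13.0, 12.4, 11.2]
print(ucl95(conc, 2.015))
```

Because the bound grows with s and shrinks with sqrt(n), pooling all six compatible samples (rather than splitting by tank region) is exactly what tightens the UCL95% in the reports.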

  1. Fabricating a UV-Vis and Raman Spectroscopy Immunoassay Platform.

    PubMed

    Hanson, Cynthia; Israelsen, Nathan D; Sieverts, Michael; Vargis, Elizabeth

    2016-11-10

Immunoassays are used to detect proteins based on the presence of associated antibodies. Because of their extensive use in research and clinical settings, a large infrastructure of immunoassay instruments and materials exists. For example, 96- and 384-well polystyrene plates are available commercially and have a standard design to accommodate ultraviolet-visible (UV-Vis) spectroscopy machines from various manufacturers. In addition, a wide variety of immunoglobulins, detection tags, and blocking agents for customized immunoassay designs such as enzyme-linked immunosorbent assays (ELISA) are available. Despite the existing infrastructure, standard ELISA kits do not meet all research needs, requiring individualized immunoassay development, which can be expensive and time-consuming. For example, ELISA kits have low multiplexing (detection of more than one analyte at a time) capabilities as they usually depend on fluorescence or colorimetric methods for detection. Colorimetric and fluorescent-based analyses have limited multiplexing capabilities due to broad spectral peaks. In contrast, Raman spectroscopy-based methods have a much greater capability for multiplexing due to narrow emission peaks. Another advantage of Raman spectroscopy is that Raman reporters experience significantly less photobleaching than fluorescent tags [1]. Despite the advantages that Raman reporters have over fluorescent and colorimetric tags, protocols to fabricate Raman-based immunoassays are limited. The purpose of this paper is to provide a protocol to prepare functionalized probes to use in conjunction with polystyrene plates for direct detection of analytes by UV-Vis analysis and Raman spectroscopy. This protocol will allow researchers to take a do-it-yourself approach for future multi-analyte detection while capitalizing on pre-established infrastructure.

  2. Effect-Based Screening Methods for Water Quality Characterization Will Augment Conventional Analyte-by-Analyte Chemical Methods in Research As Well As Regulatory Monitoring

    EPA Science Inventory

Conventional approaches to water quality characterization can provide data on individual chemical components of each water sample. This analyte-by-analyte approach currently serves many useful research and compliance monitoring needs. However, these approaches, which require a ...

  3. Capture and exploration of sample quality data to inform and improve the management of a screening collection.

    PubMed

    Charles, Isabel; Sinclair, Ian; Addison, Daniel H

    2014-04-01

    A new approach to the storage, processing, and interrogation of the quality data for screening samples has improved analytical throughput and confidence and enhanced the opportunities for learning from the accumulating records. The approach has entailed the design, development, and implementation of a database-oriented system, capturing information from the liquid chromatography-mass spectrometry capabilities used for assessing the integrity of samples in AstraZeneca's screening collection. A Web application has been developed to enable the visualization and interactive annotation of the analytical data, monitor the current sample queue, and report the throughput rate. Sample purity and identity are certified automatically on the chromatographic peaks of interest if predetermined thresholds are reached on key parameters. Using information extracted in parallel from the compound registration and container inventory databases, the chromatographic and spectroscopic profiles for each vessel are linked to the sample structures and storage histories. A search engine facilitates the direct comparison of results for multiple vessels of the same or similar compounds, for single vessels analyzed at different time points, or for vessels related by their origin or process flow. Access to this network of information has provided a deeper understanding of the multiple factors contributing to sample quality assurance.
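The automatic certification step described above is, in essence, a conjunction of threshold checks over key peak parameters. The sketch below illustrates that logic only; the parameter names and threshold values are assumptions for demonstration, not AstraZeneca's actual criteria:

```python
def certify(peak):
    """Auto-certify purity and identity for a chromatographic peak of interest
    when every key parameter clears its threshold (thresholds assumed)."""
    return (
        peak["purity_pct"] >= 85.0            # UV purity of the target peak
        and peak["mass_found"]                # expected mass seen in the MS trace
        and peak["signal_to_noise"] >= 10.0   # chromatographic signal-to-noise
    )

good = {"purity_pct": 97.2, "mass_found": True, "signal_to_noise": 52.0}
poor = {"purity_pct": 61.5, "mass_found": True, "signal_to_noise": 52.0}
print(certify(good), certify(poor))
```

Peaks failing any check would fall through to the interactive annotation path in the web application rather than being certified automatically.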

  4. Rapid identification of regulated organic chemical compounds in toys using ambient ionization and a miniature mass spectrometry system.

    PubMed

    Guo, Xiangyu; Bai, Hua; Lv, Yueguang; Xi, Guangcheng; Li, Junfang; Ma, Xiaoxiao; Ren, Yue; Ouyang, Zheng; Ma, Qiang

    2018-04-01

Rapid, on-site analysis was achieved through significantly simplified operation procedures for a wide variety of toy samples (crayon, temporary tattoo sticker, finger paint, modeling clay, and bubble solution) using a miniature mass spectrometry system with ambient ionization capability. The labor-intensive analytical protocols involving sample workup and chemical separation, traditionally required for MS-based analysis, were replaced by direct sampling analysis using ambient ionization methods. A Mini β ion trap miniature mass spectrometer was coupled with versatile ambient ionization methods, e.g., paper spray, extraction spray, and slug-flow microextraction nanoESI, for direct identification of prohibited colorants, carcinogenic primary aromatic amines, allergenic fragrances, preservatives, and plasticizers from raw toy samples. The use of paper substrates coated with Co3O4 nanoparticles allowed a great increase in sensitivity for paper spray. Limits of detection as low as 5 μg/kg were obtained for target analytes. The methods developed here, based on the integration of ambient ionization with a miniature mass spectrometer, represent alternatives to current in-lab MS analysis and would enable fast, outside-the-lab screening of toy products to ensure children's safety and health. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Brownian Motion at Lipid Membranes: A Comparison of Hydrodynamic Models Describing and Experiments Quantifying Diffusion within Lipid Bilayers.

    PubMed

    Block, Stephan

    2018-05-22

The capability of lipid bilayers to exhibit fluid-phase behavior is a fascinating property, which enables, for example, membrane-associated components, such as lipids (domains) and transmembrane proteins, to diffuse within the membrane. These diffusion processes are of paramount importance for cells, as they are, for example, involved in cell signaling processes or the recycling of membrane components, but also for recently developed analytical approaches, which use differences in mobility for certain analytical purposes, such as in-membrane purification of membrane proteins or the analysis of multivalent interactions. Here, models describing the Brownian motion of membrane inclusions (lipids, peptides, proteins, and complexes thereof) in model bilayers (giant unilamellar vesicles, black lipid membranes, supported lipid bilayers) are summarized and model predictions are compared with the available experimental data, thereby allowing for evaluating the validity of the introduced models. It will be shown that models describing the diffusion in freestanding (Saffman-Delbrück and Hughes-Pailthorpe-White model) and supported bilayers (the Evans-Sackmann model) are well supported by experiments, though only a few experimental studies have been published so far for the latter case, calling for additional tests to reach the level of experimental confirmation currently available for freestanding bilayers.
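The Saffman-Delbrück model mentioned above has a simple closed form that is easy to evaluate. The sketch below uses one common convention (a membrane between two aqueous half-spaces, valid for small reduced radius); the parameter values are illustrative assumptions, not data from the review:

```python
import math

KB = 1.380649e-23          # Boltzmann constant, J/K
EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def saffman_delbruck(T, mu_m, h, mu_w, a):
    """Saffman-Delbrueck diffusion coefficient (m^2/s) of a cylindrical
    inclusion of radius a in a freestanding membrane.

    T:    temperature (K)
    mu_m: membrane viscosity (Pa s)
    h:    membrane thickness (m)
    mu_w: viscosity of the surrounding water (Pa s)
    a:    inclusion radius (m)

    Valid only while a << L_SD; larger inclusions need the
    Hughes-Pailthorpe-White extension discussed in the review.
    """
    l_sd = mu_m * h / (2.0 * mu_w)   # Saffman-Delbrueck length (two water half-spaces)
    return KB * T / (4.0 * math.pi * mu_m * h) * (math.log(l_sd / a) - EULER_GAMMA)

# Illustrative numbers (assumed): a lipid-sized inclusion (a = 0.5 nm)
# in a 4 nm thick bilayer at room temperature.
D = saffman_delbruck(T=298.0, mu_m=0.1, h=4e-9, mu_w=1e-3, a=0.5e-9)
print(f"{D:.2e} m^2/s")
```

Note the hallmark of the model: D depends only logarithmically on the inclusion radius, which is why lipids and much larger proteins diffuse at comparable rates in freestanding bilayers.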

  6. Fast determination of sugars in Coke and Diet Coke by miniaturized capillary electrophoresis with amperometric detection.

    PubMed

    Chu, Qingcui; Fu, Liang; Guan, Yueqing; Ye, Jiannong

    2005-02-01

The fast separation capability of a novel miniaturized capillary electrophoresis with amperometric detection (CE-AD) system was demonstrated by determining sugar contents in Coke and Diet Coke with an estimated separation efficiency of 60,000 TP/m. Factors influencing the separation and detection processes were examined and optimized. The end-capillary 300 microm Cu wire amperometric detector offers favorable signal-to-noise characteristics at a relatively low potential (+0.50 V vs. Ag/AgCl) for detecting sugars. Three sugars (sucrose, glucose, and fructose) have been separated within 330 s in an 8.5 cm capillary at a separation voltage of 1000 V using a 50 mM NaOH running buffer (pH 12.7). A highly linear response is obtained for the above compounds over the range of 5.0 to 2.0 × 10^2 microg/mL, with a low detection limit, down to 0.8 microg/mL for glucose (S/N = 3). The injection-to-injection repeatability for analytes in peak current (RSD < 3.6%) and for migration times (RSD < 1.4%) was excellent. The new miniaturized CE-AD system should find a wide range of analytical applications involving assays of carbohydrates as an alternative to conventional CE and micro-CE.
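The S/N = 3 detection limit quoted above follows directly from the calibration slope and the baseline noise: the limit is the concentration whose signal equals three times the noise. The sketch below uses hypothetical calibration points and a hypothetical noise level (both assumptions, not the paper's data):

```python
import statistics

def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def lod_3sigma(noise_sd, slope):
    """Detection limit at S/N = 3: the smallest concentration whose
    signal equals three times the baseline noise."""
    return 3.0 * noise_sd / slope

# Hypothetical calibration: peak current (nA) vs. glucose (microg/mL).
conc = [5.0, 25.0, 50.0, 100.0, 200.0]
current = [2.6, 12.4, 25.1, 49.8, 100.2]
slope, intercept = linear_fit(conc, current)
print(round(lod_3sigma(noise_sd=0.13, slope=slope), 2))   # microg/mL
```

With these assumed numbers the estimate lands near the 0.8 microg/mL figure reported for glucose, showing how slope and noise together set the limit.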

  7. A Model for Axial Magnetic Bearings Including Eddy Currents

    NASA Technical Reports Server (NTRS)

    Kucera, Ladislav; Ahrens, Markus

    1996-01-01

    This paper presents an analytical method of modelling eddy currents inside axial bearings. The problem is solved by dividing an axial bearing into elementary geometric forms, solving the Maxwell equations for these simplified geometries, defining boundary conditions and combining the geometries. The final result is an analytical solution for the flux, from which the impedance and the force of an axial bearing can be derived. Several impedance measurements have shown that the analytical solution can fit the measured data with a precision of approximately 5%.

  8. Human Centred Design Considerations for Connected Health Devices for the Older Adult

    PubMed Central

    Harte, Richard P.; Glynn, Liam G.; Broderick, Barry J.; Rodriguez-Molinero, Alejandro; Baker, Paul M. A.; McGuiness, Bernadette; O’Sullivan, Leonard; Diaz, Marta; Quinlan, Leo R.; ÓLaighin, Gearóid

    2014-01-01

Connected health devices are generally designed for unsupervised use by non-healthcare professionals, facilitating independent control of the individual's own healthcare. Older adults are major users of such devices and a population that is increasing significantly in size. This group presents challenges due to the wide spectrum of capabilities and attitudes towards technology. The fit between the capabilities of the user and the demands of the device can be optimised in a process called Human Centred Design. Here we review examples of some connected health devices chosen by random selection, assess older adults' known capabilities and attitudes, and finally make analytical recommendations for design approaches and design specifications. PMID:25563225

  9. Spectral imaging of chemical compounds using multivariate optically enhanced filters integrated with InGaAs VGA cameras

    NASA Astrophysics Data System (ADS)

    Priore, Ryan J.; Jacksen, Niels

    2016-05-01

Infrared hyperspectral imagers (HSI) have been fielded for the detection of hazardous chemical and biological compounds, tag detection (friend-versus-foe detection), and other defense-critical sensing missions over the last two decades. Low size, weight, power, and cost (SWaP-C) methods for the spectroscopic identification of chemical compounds have been a long-term goal for handheld applications. We describe a new HSI concept for low-cost, high-performance InGaAs SWIR camera chemical identification for military, security, industrial, and commercial end-user applications. Multivariate Optical Elements (MOEs) are thin-film devices that encode a broadband spectroscopic pattern, allowing a simple broadband detector to generate a highly sensitive and specific detection for a target analyte. MOEs can be matched 1:1 to a discrete analyte or class prediction. Additionally, MOE filter sets are capable of sensing an orthogonal projection of the original sparse spectroscopic space, enabling a small set of MOEs to discriminate a multitude of target analytes. This paper identifies algorithms and broadband optical filter designs that have been demonstrated to identify chemical compounds using high-performance InGaAs VGA detectors. It shows how some of the initial models have been reduced to simple spectral designs and tested to produce positive identification of such chemicals. We are also developing pixelated MOE compressed-detection sensors for the detection of a multitude of chemical targets in challenging backgrounds/environments for both commercial and defense/security applications. This MOE-based, real-time HSI sensor will exhibit superior sensitivity and specificity as compared to currently fielded HSI systems.
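The compressed-detection idea above can be illustrated with dot-product projections: each MOE reduces an incoming spectrum to a single broadband detector reading, and a small bank of such readings is enough to identify an analyte. Everything below (spectra, filter curves, analyte names) is a toy construction, not an actual MOE design:

```python
def moe_responses(spectrum, filters):
    """Broadband detector readings behind each MOE: the dot product of the
    incoming spectrum with each filter's transmission curve."""
    return [sum(s * t for s, t in zip(spectrum, f)) for f in filters]

def classify(spectrum, filters, references):
    """Pick the reference analyte whose response pattern is nearest
    (least-squares) to the measured pattern."""
    measured = moe_responses(spectrum, filters)
    def dist(name):
        return sum((m - r) ** 2 for m, r in zip(measured, references[name]))
    return min(references, key=dist)

# Toy 4-channel spectra for two target analytes, and two MOE filters.
spectrum_a = [1.0, 0.2, 0.8, 0.1]
spectrum_b = [0.1, 0.9, 0.2, 1.0]
filters = [[1, 0, 1, 0], [0, 1, 0, 1]]
refs = {"analyte A": moe_responses(spectrum_a, filters),
        "analyte B": moe_responses(spectrum_b, filters)}

noisy_a = [0.9, 0.25, 0.85, 0.15]   # spectrum A with small perturbations
print(classify(noisy_a, filters, refs))
```

The point of the design: two detector readings stand in for a full spectrum, which is why a few MOEs can cover a multitude of targets when the filter bank spans the discriminating directions of the spectral space.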

  10. The application of quality risk management to the bacterial endotoxins test: use of hazard analysis and critical control points.

    PubMed

    Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti

    2013-01-01

    Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can be also integrated with other tools, such as fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide data templates necessary to document and useful to follow current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand choices that have been made. The aim of this article is to show how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.

  11. An efficient approach for treating composition-dependent diffusion within organic particles

    DOE PAGES

    O'Meara, Simon; Topping, David O.; Zaveri, Rahul A.; ...

    2017-09-07

Mounting evidence demonstrates that under certain conditions the rate of component partitioning between the gas and particle phase in atmospheric organic aerosol is limited by particle-phase diffusion. To date, however, particle-phase diffusion has not been incorporated into regional atmospheric models. An analytical rather than numerical solution to diffusion through organic particulate matter is desirable because of its comparatively small computational expense in regional models. Current analytical models assume diffusion to be independent of composition and therefore use a constant diffusion coefficient. To realistically model diffusion, however, it should be composition-dependent (e.g. due to the partitioning of components that plasticise, vitrify or solidify). This study assesses the modelling capability of an analytical solution to diffusion corrected to account for composition dependence against a numerical solution. Results show reasonable agreement when the gas-phase saturation ratio of a partitioning component is constant and particle-phase diffusion limits partitioning rate (<10% discrepancy in estimated radius change). However, when the saturation ratio of the partitioning component varies, a generally applicable correction cannot be found, indicating that existing methodologies are incapable of deriving a general solution. Until such time as a general solution is found, caution should be given to sensitivity studies that assume constant diffusivity. Furthermore, the correction was implemented in the polydisperse, multi-process Model for Simulating Aerosol Interactions and Chemistry (MOSAIC) and is used to illustrate how the evolution of number size distribution may be accelerated by condensation of a plasticising component onto viscous organic particles.
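A minimal numerical treatment of the composition dependence discussed above is an explicit finite-difference step with the diffusion coefficient evaluated locally from the current concentration. The grid, time step, and functional form of D(c) below are illustrative assumptions, not the study's model:

```python
def diffuse_step(c, d_of_c, dx, dt):
    """One explicit finite-difference step of dc/dt = d/dx(D(c) dc/dx) on a
    1-D grid with a fixed surface concentration (c[-1]) and a zero-flux
    inner boundary (c[0])."""
    new = c[:]
    for i in range(1, len(c) - 1):
        d_r = 0.5 * (d_of_c(c[i]) + d_of_c(c[i + 1]))   # interface diffusivities
        d_l = 0.5 * (d_of_c(c[i]) + d_of_c(c[i - 1]))
        new[i] = c[i] + dt / dx ** 2 * (
            d_r * (c[i + 1] - c[i]) - d_l * (c[i] - c[i - 1]))
    new[0] = new[1]                                     # zero-flux inner boundary
    return new

# A plasticising partitioning component: diffusivity rises with its own
# local concentration (functional form assumed for illustration).
d_of_c = lambda c: 1e-3 * (1.0 + 9.0 * c)

c = [0.0] * 10 + [1.0]        # initially dry particle, saturated surface
for _ in range(2000):
    c = diffuse_step(c, d_of_c, dx=0.1, dt=0.1)
print(round(c[0], 3))         # interior concentration after uptake
```

The constant-D analytical solution that regional models favour corresponds to freezing `d_of_c` at a single value; the mismatch between the two treatments is exactly what the correction in the study tries to absorb.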

  12. Raman Spectroscopic Analysis of Geological and Biogeological Specimens of Relevance to the ExoMars Mission

    PubMed Central

    Edwards, Howell G.M.; Ingley, Richard; Parnell, John; Vítek, Petr; Jehlička, Jan

    2013-01-01

A novel miniaturized Raman spectrometer is scheduled to fly as part of the analytical instrumentation package on an ESA remote robotic lander in the ESA/Roscosmos ExoMars mission to search for evidence for extant or extinct life on Mars in 2018. The Raman spectrometer will be part of the first-pass analytical stage of the sampling procedure, following detailed surface examination by the PanCam scanning camera unit on the ExoMars rover vehicle. The requirements of the analytical protocol are stringent and critical; this study represents a laboratory blind interrogation of specimens that form a list of materials that are of relevance to martian exploration and at this stage simulates a test of current laboratory instrumentation to highlight the Raman technique strengths and possible weaknesses that may be encountered in practice on the martian surface and from which future studies could be formulated. In this preliminary exercise, some 10 samples that are considered terrestrial representatives of the mineralogy and possible biogeologically modified structures that may be identified on Mars have been examined with Raman spectroscopy, and conclusions have been drawn about the viability of the unambiguous spectral identification of biomolecular life signatures. It is concluded that the Raman spectroscopic technique does indeed demonstrate the capability to identify biomolecular signatures and the mineralogy in real-world terrestrial samples with a very high degree of success without any preconception being made about their origin and classification. Key Words: Biosignatures—Mars Exploration Rovers—Raman spectroscopy—Search for life (biosignatures)—Planetary instrumentation. Astrobiology 13, 543–549. PMID:23758166

  13. An efficient approach for treating composition-dependent diffusion within organic particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Meara, Simon; Topping, David O.; Zaveri, Rahul A.

Mounting evidence demonstrates that under certain conditions the rate of component partitioning between the gas and particle phase in atmospheric organic aerosol is limited by particle-phase diffusion. To date, however, particle-phase diffusion has not been incorporated into regional atmospheric models. An analytical rather than numerical solution to diffusion through organic particulate matter is desirable because of its comparatively small computational expense in regional models. Current analytical models assume diffusion to be independent of composition and therefore use a constant diffusion coefficient. To realistically model diffusion, however, it should be composition-dependent (e.g. due to the partitioning of components that plasticise, vitrify or solidify). This study assesses the modelling capability of an analytical solution to diffusion corrected to account for composition dependence against a numerical solution. Results show reasonable agreement when the gas-phase saturation ratio of a partitioning component is constant and particle-phase diffusion limits partitioning rate (<10% discrepancy in estimated radius change). However, when the saturation ratio of the partitioning component varies, a generally applicable correction cannot be found, indicating that existing methodologies are incapable of deriving a general solution. Until such time as a general solution is found, caution should be given to sensitivity studies that assume constant diffusivity. Furthermore, the correction was implemented in the polydisperse, multi-process Model for Simulating Aerosol Interactions and Chemistry (MOSAIC) and is used to illustrate how the evolution of number size distribution may be accelerated by condensation of a plasticising component onto viscous organic particles.

  14. Implementation of GIS-based highway safety analyses : bridging the gap

    DOT National Transportation Integrated Search

    2001-01-01

    In recent years, efforts have been made to expand the analytical features of the Highway Safety Information System (HSIS) by integrating Geographic Information System (GIS) capabilities. The original version of the GIS Safety Analysis Tools was relea...

  15. System Operations Studies for Automated Gateway Transit Systems - Detailed Station Model Programmer's Manual.

    DOT National Transportation Integrated Search

    1982-01-01

    The Detailed Station Model (DSM) provides operational and performance measures of alternative station configurations and management policies with respect to vehicle and passenger capabilities. It provides an analytic tool to support tradeoff studies ...

  16. Automated drug identification system

    NASA Technical Reports Server (NTRS)

    Campen, C. F., Jr.

    1974-01-01

    System speeds up analysis of blood and urine and is capable of identifying 100 commonly abused drugs. System includes computer that controls entire analytical process by ordering various steps in specific sequences. Computer processes data output and has readout of identified drugs.

  17. Instrumentation development for drug detection on the breath

    DOT National Transportation Integrated Search

    1972-09-01

    Based on a survey of candidate analytical methods, mass spectrometry was identified as a promising technique for drug detection on the breath. To demonstrate its capabilities, an existing laboratory mass spectrometer was modified by the addition of a...

  18. Renal Cancer Biomarkers | NCI Technology Transfer Center | TTC

    Cancer.gov

    The National Cancer Institute's Laboratory of Proteomics and Analytical Technologies is seeking statements of capability or interest from parties interested in collaborative research to further develop, evaluate, or commercialize diagnostic, therapeutic and prognostic cancer biomarkers from clinical specimens.

  19. NREL Continuum

    Science.gov Websites

Innovation Portal: Bridging the Information Gap. A database revolutionizes access to NREL's ever-expanding analytical capabilities.

    Dan Says: http://www.nrel.gov/continuum/analysis/dan_says.html

  20. Sustainability Tools Inventory Initial Gap Analysis

    EPA Science Inventory

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consu...

  1. Analytical evaluation of current starch methods used in the international sugar industry: Part I

    USDA-ARS?s Scientific Manuscript database

    Several analytical starch methods currently exist in the international sugar industry that are used to prevent or mitigate starch-related processing challenges as well as assess the quality of traded end-products. These methods use simple iodometric chemistry, mostly potato starch standards, and uti...

  2. Spiral wound extraction cartridge

    DOEpatents

    Wisted, Eric E.; Lundquist, Susan H.

    1999-01-01

    A cartridge device for removing an analyte from a fluid comprises a hollow core, a sheet composite comprising a particulate-loaded porous membrane and optionally at least one reinforcing spacer sheet, the particulate being capable of binding the analyte, the sheet composite being formed into a spiral configuration about the core, wherein the sheet composite is wound around itself and wherein the windings of sheet composite are of sufficient tightness so that adjacent layers are essentially free of spaces therebetween, two end caps which are disposed over the core and the lateral ends of the spirally wound sheet composite, and means for securing the end caps to the core, the end caps also being secured to the lateral ends of the spirally wound sheet composite. A method for removing an analyte from a fluid comprises the steps of providing a spirally wound element of the invention and passing the fluid containing the analyte through the element essentially normal to a surface of the sheet composite so as to bind the analyte to the particulate of the particulate-loaded porous membrane, the method optionally including the step of eluting the bound analyte from the sheet composite.

  3. Climate Analytics as a Service. Chapter 11

    NASA Technical Reports Server (NTRS)

    Schnase, John L.

    2016-01-01

    Exascale computing, big data, and cloud computing are driving the evolution of large-scale information systems toward a model of data-proximal analysis. In response, we are developing a concept of climate analytics as a service (CAaaS) that represents a convergence of data analytics and archive management. With this approach, high-performance compute-storage implemented as an analytic system is part of a dynamic archive comprising both static and computationally realized objects. It is a system whose capabilities are framed as behaviors over a static data collection, but where queries cause results to be created, not found and retrieved. Those results can be the product of a complex analysis, but, importantly, they also can be tailored responses to the simplest of requests. NASA's MERRA Analytic Service and associated Climate Data Services API provide a real-world example of climate analytics delivered as a service in this way. Our experiences reveal several advantages to this approach, not the least of which is orders-of-magnitude time reduction in the data assembly task common to many scientific workflows.

  4. Big data analytics as a service infrastructure: challenges, desired properties and solutions

    NASA Astrophysics Data System (ADS)

    Martín-Márquez, Manuel

    2015-12-01

CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, the cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient day-to-day accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs for CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative, and educational); and potential solutions.

  5. An app for climate-based Chikungunya risk monitoring and mapping

    NASA Astrophysics Data System (ADS)

    Soebiyanto, R. P.; Rama, X.; Jepsen, R.; Bijoria, S.; Linthicum, K. J.; Anyamba, A.

    2017-12-01

There is an increasing concern for the reemergence and spread of Chikungunya over the last 10 years in Africa, the Indian Ocean, and Asia, and a range expansion that now reaches the Caribbean and South America and threatens North America. The outbreak of Chikungunya in 2013 and its spread throughout the Americas has so far resulted in more than 1.7 million suspected cases. This has demonstrated the importance of readiness in assessing the potential risk of emerging vector-borne diseases. Climate and ecological conditions are now recognized as major contributors to the emergence and re-emergence of various vector-borne diseases, including Chikungunya. Variations and persistence of extreme climate conditions provide a suitable environment for increases in certain disease vector populations, which further amplify vector-borne disease transmission. This highlights the importance of climate anomaly information in assessing regions at risk for Chikungunya. To address this issue, we are developing a climate-based app, CHIKRISK, which will help decision makers answer three critical questions: (i) where has Chikungunya activity occurred; (ii) where is it occurring now; and (iii) which regions are currently at risk for Chikungunya. We first develop a database of historical Chikungunya outbreak locations compiled from publicly available information. These records are used to map where Chikungunya activity has occurred over time. We leverage various satellite-based climate data records, such as rainfall and land surface and near-surface temperature, to characterize evolving conditions prior to and during Chikungunya activity. Chikungunya outbreak data, climate data, and ancillary (i.e., population and elevation) data are used to develop an analytics capability that produces risk maps. The CHIKRISK app has the capability to visualize historical Chikungunya activity locations, climate anomaly conditions, and Chikungunya risk maps. Currently, the focus of development is on the Caribbean and South America regions; the capability will be expanded in a phased manner to the entire world.

  6. Magnetic islands and singular currents at rational surfaces in three-dimensional magnetohydrodynamic equilibria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loizu, J., E-mail: joaquim.loizu@ipp.mpg.de; Princeton Plasma Physics Laboratory, P.O. Box 451, Princeton New Jersey 08543; Hudson, S.

    2015-02-15

Using the recently developed multiregion, relaxed MHD (MRxMHD) theory, which bridges the gap between Taylor's relaxation theory and ideal MHD, we provide a thorough analytical and numerical proof of the formation of singular currents at rational surfaces in non-axisymmetric ideal MHD equilibria. These include the force-free singular current density represented by a Dirac δ-function, which presumably prevents the formation of islands, and the Pfirsch-Schlüter 1/x singular current, which arises as a result of finite pressure gradient. An analytical model based on linearized MRxMHD is derived that can accurately (1) describe the formation of magnetic islands at resonant rational surfaces, (2) retrieve the ideal MHD limit where magnetic islands are shielded, and (3) compute the subsequent formation of singular currents. The analytical results are benchmarked against numerical simulations carried out with a fully nonlinear implementation of MRxMHD.
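Schematically, the two singular contributions named above can be written in one expression (notation assumed for illustration, with $x$ the distance from the rational surface and $p'$ the pressure gradient):

```latex
j_\parallel(x) \;=\; \underbrace{\Delta\,\delta(x)}_{\text{force-free sheet current}}
\;+\; \underbrace{\frac{A}{x}}_{\text{Pfirsch--Schl\"uter}}\,,
\qquad A \propto p' ,
```

so the δ-function sheet survives even at zero pressure gradient, while the $1/x$ term vanishes with $p'$, consistent with the decomposition described in the abstract.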

  7. Inelastic response of metal matrix composites under biaxial loading

    NASA Technical Reports Server (NTRS)

    Mirzadeh, F.; Pindera, Marek-Jerzy; Herakovich, Carl T.

    1990-01-01

Elements of the analytical/experimental program to characterize the response of silicon carbide/titanium (SCS-6/Ti-15-3) composite tubes under biaxial loading are outlined. The analytical program comprises prediction of initial yielding and subsequent inelastic response of unidirectional and angle-ply silicon carbide/titanium tubes using a combined micromechanics approach and laminate analysis. The micromechanics approach is based on the method-of-cells model and is capable of generating the effective thermomechanical response of metal matrix composites in the linear and inelastic regions, accounting for temperature- and time-dependent properties of the individual constituents and imperfect bonding, and of predicting the initial yield surfaces and inelastic response of (0) and (±45)s SCS-6/Ti-15-3 laminates loaded by different combinations of stresses. The generated analytical predictions will be compared with the experimental results. The experimental program comprises generation of initial yield surfaces and subsequent stress-strain curves and determination of failure loads of the SCS-6/Ti-15-3 tubes under selected loading conditions. The results of the analytical investigation are employed to define the actual loading paths for the experimental program. A brief overview of the experimental methodology is given. This includes the test capabilities of the Composite Mechanics Laboratory at the University of Virginia, the SCS-6/Ti-15-3 composite tubes secured from McDonnell Douglas Corporation, a test fixture specifically developed for combined axial-torsional loading, and the MTS combined axial-torsion loader that will be employed in the actual testing.

  8. Analytical determination of selenium in medical samples, staple food and dietary supplements by means of total reflection X-ray fluorescence spectroscopy

    NASA Astrophysics Data System (ADS)

    Stosnach, Hagen

    2010-09-01

    Selenium is essential for many aspects of human health and is thus the object of intensive medical research. This demands analytical techniques capable of analysing selenium at low concentrations with high accuracy, in widespread matrices, and sometimes in the smallest sample amounts. In connection with the increasing importance of selenium, there is a need for rapid and simple on-site (or near-to-site) selenium analysis in staple foods like wheat at processing and production sites, as well as for the analysis of this element in dietary supplements. Common analytical techniques like electrothermal atomic absorption spectroscopy (ETAAS) and inductively-coupled plasma mass spectrometry (ICP-MS) are capable of analysing selenium in medical samples with detection limits in the range from 0.02 to 0.7 μg/l. Since in many cases less complicated and less expensive analytical techniques are required, TXRF has been tested regarding its suitability for selenium analysis in different medical, staple food and dietary supplement samples, applying very simple sample preparation techniques. The reported results indicate that accurate analysis of selenium in all sample types is possible. The detection limits of TXRF are in the range from 7 to 12 μg/l for medical samples and 0.1 to 0.2 mg/kg for staple foods and dietary supplements. Although this sensitivity is low compared to established techniques, it is sufficient for the physiological concentrations of selenium in the investigated samples.

  9. An Analytical-Numerical Model for Two-Phase Slug Flow through a Sudden Area Change in Microchannels

    DOE PAGES

    Momen, A. Mehdizadeh; Sherif, S. A.; Lear, W. E.

    2016-01-01

    In this article, two new analytical models have been developed to calculate two-phase slug flow pressure drop in microchannels through a sudden contraction. Even though many studies have been reported on two-phase flow in microchannels, considerable discrepancies still exist, mainly due to the difficulties in experimental setup and measurements. Numerical simulations were performed to support the new analytical models and to explore in more detail the physics of the flow in microchannels with a sudden contraction. Both analytical and numerical results were compared to the available experimental data and other empirical correlations. Results show that the models, which were developed based on the slug and semi-slug assumptions, agree well with experiments in microchannels. Moreover, in contrast to the previous empirical correlations, which were tuned for a specific geometry, the new analytical models are capable of taking geometrical parameters as well as flow conditions into account.

  10. Semantic Interaction for Sensemaking: Inferring Analytical Reasoning for Model Steering.

    PubMed

    Endert, A; Fiaux, P; North, C

    2012-12-01

    Visual analytic tools aim to support the cognitively demanding task of sensemaking. Their success often depends on the ability to leverage capabilities of mathematical models, visualization, and human intuition through flexible, usable, and expressive interactions. Spatially clustering data is one effective metaphor for users to explore similarity and relationships between information, adjusting the weighting of dimensions or characteristics of the dataset to observe the change in the spatial layout. Semantic interaction is an approach to user interaction in such spatializations that couples these parametric modifications of the clustering model with users' analytic operations on the data (e.g., direct document movement in the spatialization, highlighting text, search, etc.). In this paper, we present results of a user study exploring the ability of semantic interaction in a visual analytic prototype, ForceSPIRE, to support sensemaking. We found that semantic interaction captures the analytical reasoning of the user through keyword weighting, and aids the user in co-creating a spatialization based on the user's reasoning and intuition.

  11. Visual analytics of brain networks.

    PubMed

    Li, Kaiming; Guo, Lei; Faraco, Carlos; Zhu, Dajiang; Chen, Hanbo; Yuan, Yixuan; Lv, Jinglei; Deng, Fan; Jiang, Xi; Zhang, Tuo; Hu, Xintao; Zhang, Degang; Miller, L Stephen; Liu, Tianming

    2012-05-15

    Identification of regions of interest (ROIs) is a fundamental issue in brain network construction and analysis. Recent studies demonstrate that multimodal neuroimaging approaches and joint analysis strategies are crucial for accurate, reliable and individualized identification of brain ROIs. In this paper, we present a novel approach of visual analytics and its open-source software for ROI definition and brain network construction. By combining neuroscience knowledge and computational intelligence capabilities, visual analytics can generate accurate, reliable and individualized ROIs for brain networks via joint modeling of multimodal neuroimaging data and an intuitive and real-time visual analytics interface. Furthermore, it can be used as a functional ROI optimization and prediction solution when fMRI data is unavailable or inadequate. We have applied this approach to an operation span working memory fMRI/DTI dataset, a schizophrenia DTI/resting state fMRI (R-fMRI) dataset, and a mild cognitive impairment DTI/R-fMRI dataset, in order to demonstrate the effectiveness of visual analytics. Our experimental results are encouraging.

  12. Existing and Required Modeling Capabilities for Evaluating ATM Systems and Concepts

    NASA Technical Reports Server (NTRS)

    Odoni, Amedeo R.; Bowman, Jeremy; Delahaye, Daniel; Deyst, John J.; Feron, Eric; Hansman, R. John; Khan, Kashif; Kuchar, James K.; Pujet, Nicolas; Simpson, Robert W.

    1997-01-01

    ATM systems throughout the world are entering a period of major transition and change. The combination of important technological developments and of the globalization of the air transportation industry has necessitated a reexamination of some of the fundamental premises of existing Air Traffic Management (ATM) concepts. New ATM concepts have to be examined, concepts that may place more emphasis on: strategic traffic management; planning and control; partial decentralization of decision-making; and added reliance on the aircraft to carry out strategic ATM plans, with ground controllers confined primarily to a monitoring and supervisory role. 'Free Flight' is a case in point. In order to study, evaluate and validate such new concepts, the ATM community will have to rely heavily on models and computer-based tools/utilities, covering a wide range of issues and metrics related to safety, capacity and efficiency. The state of the art in such modeling support is adequate in some respects, but clearly deficient in others. It is the objective of this study to assist in: (1) assessing the strengths and weaknesses of existing fast-time models and tools for the study of ATM systems and concepts and (2) identifying and prioritizing the requirements for the development of additional modeling capabilities in the near future. A three-stage process has been followed to this purpose: 1. Through the analysis of two case studies involving future ATM system scenarios, as well as through expert assessment, modeling capabilities and supporting tools needed for testing and validating future ATM systems and concepts were identified and described. 2. Existing fast-time ATM models and support tools were reviewed and assessed with regard to the degree to which they offer the capabilities identified under Step 1. 3. The findings of Steps 1 and 2 were combined to draw conclusions about (1) the best capabilities currently existing, (2) the types of concept testing and validation that can be carried out reliably with such existing capabilities, and (3) the currently unavailable modeling capabilities that should receive high priority for near-term research and development. It should be emphasized that the study is concerned only with the class of 'fast-time' analytical and simulation models. 'Real-time' models, which typically involve humans-in-the-loop, comprise another extensive class which is not addressed in this report. However, the relationship between some of the fast-time models reviewed and a few well-known real-time models is identified in several parts of this report, and the potential benefits from the combined use of these two classes of models, a very important subject, are discussed in chapters 4 and 7.

  13. Improving Coastal Ocean Color Validation Capabilities through Application of Inherent Optical Properties (IOPs)

    NASA Technical Reports Server (NTRS)

    Mannino, Antonio

    2008-01-01

    Understanding how the different components of seawater alter the path of incident sunlight through scattering and absorption is essential to using remotely sensed ocean color observations effectively. This is particularly apropos in coastal waters, where the different optically significant components (phytoplankton, detrital material, inorganic minerals, etc.) vary widely in concentration, often independently from one another. Inherent Optical Properties (IOPs) form the link between these biogeochemical constituents and the Apparent Optical Properties (AOPs). Understanding this interrelationship is at the heart of successfully carrying out inversions of satellite-measured radiance to biogeochemical properties. While sufficient covariation of seawater constituents in case I waters typically allows empirical algorithms connecting AOPs and biogeochemical parameters to behave well, these empirical algorithms normally do not hold for case II regimes (Carder et al. 2003). Validation in the context of ocean color remote sensing refers to in-situ measurements used to verify or characterize algorithm products or any assumption used as input to an algorithm. In this project, validation capabilities are considered those measurement capabilities, techniques, methods, models, etc. that allow effective validation. Enhancing current validation capabilities by incorporating state-of-the-art IOP measurements and optical models is the purpose of this work. This pursuit involves improving core IOP measurement capabilities (spectral, angular, and spatio-temporal resolutions), improving our understanding of the behavior of analytical AOP-IOP approximations in complex coastal waters, and improving the spatial and temporal resolution of biogeochemical data for validation by applying biogeochemical-IOP inversion models so that these parameters can be computed from real-time IOP sensors with high sampling rates. Research cruises supported by this project provide for collection and processing of seawater samples for biogeochemical (pigments, DOC and POC) and optical (CDOM and POM absorption coefficients) analyses, to enhance our understanding of the linkages between in-water optical measurements (IOPs and AOPs) and biogeochemical constituents and to provide a more comprehensive suite of validation products.

  14. U.S. Navy Capstone Strategies and Concepts (2001-2010): Strategy, Policy, Concept, and Vision Documents

    DTIC Science & Technology

    2011-12-01

    Naval Analytical Capabilities: Improving Capabilities-Based Planning (2005); Milan Vego, "Searching for a Strategy," Armed Forces Journal (Apr ...); Schultz, Perry, Kissinger, Nunn, "A World Free of Nuclear Weapons," Wall Street Journal (Jan 2007).

  15. Determining your organization's 'risk capability'.

    PubMed

    Hannah, Bill; Hancock, Melinda

    2014-05-01

    An assessment of a provider's level of risk capability should focus on three key elements: business intelligence, including sophisticated analytical models that can offer insight into the expected cost and quality of care for a given population; clinical enterprise maturity, marked by the ability to improve health outcomes and to manage utilization and costs to drive change; and revenue transformation, emphasizing the need for a revenue cycle platform that allows for risk acceptance and management and that provides incentives for performance against defined objectives.

  16. Nanoscale Surface Plasmonics Sensor With Nanofluidic Control

    NASA Technical Reports Server (NTRS)

    Wei, Jianjun; Singhal, Sameer; Waldeck, David H.; Kofke, Matthew

    2013-01-01

    Conventional quantitative protein assays of bodily fluids typically involve multiple steps to obtain desired measurements. Such methods are not well suited for fast and accurate assay measurements in austere environments such as spaceflight and in the aftermath of disasters. Consequently, there is a need for a protein assay technology capable of routinely monitoring proteins in austere environments. For example, there is an immediate need for a urine protein assay to assess astronaut renal health during spaceflight. The disclosed nanoscale surface plasmonics sensor provides a core detection method that can be integrated to a lab-on-chip device that satisfies the unmet need for such a protein assay technology. Assays based upon combinations of nanoholes, nanorings, and nanoslits with transmission surface plasmon resonance (SPR) are used for assays requiring extreme sensitivity, and are capable of detecting specific analytes at concentrations as low as picomole to femtomole level in well-controlled environments. The device operates in a transmission mode configuration in which light is directed at one planar surface of the array, which functions as an optical aperture. The incident light induces surface plasmon light transmission from the opposite surface of the array. The presence of a target analyte is detected by changes in the spectrum of light transmitted by the array when a target analyte induces a change in the refractive index of the fluid within the nanochannels. This occurs, for example, when a target analyte binds to a receptor fixed to the walls of the nanochannels in the array. Independent fluid handling capability for individual nanoarrays on a nanofluidic chip containing a plurality of nanochannel arrays allows each array to be used to sense a different target analyte and/or for paired arrays to analyze control and test samples simultaneously in parallel. 
The present invention incorporates transmission mode nanoplasmonics and nanofluidics into a single, microfluidically controlled device. The device comprises one or more arrays of aligned nanochannels that are in fluid communication with inflowing and outflowing fluid handling manifolds that control the flow of fluid through the arrays. The array acts as an aperture in a plasmonic sensor. Fluid, in the form of a liquid or a gas and comprising a sample for analysis, is moved from an inlet manifold through the nanochannel array, and out through an exit manifold. The fluid may also contain a reagent used to modify the interior surfaces of the nanochannels, and/or a reagent required for the detection of an analyte.

  17. Modeling and analysis of a novel planar eddy current damper

    NASA Astrophysics Data System (ADS)

    Zhang, He; Kou, Baoquan; Jin, Yinxi; Zhang, Lu; Zhang, Hailin; Li, Liyi

    2014-05-01

    In this paper, a novel 2-DOF permanent magnet planar eddy current damper is proposed, of which the stator is made of a copper plate and the mover is composed of two orthogonal 1-D permanent magnet arrays with a double sided structure. The main objective of the planar eddy current damper is to provide two orthogonal damping forces for dynamic systems like the 2-DOF high precision positioning system. Firstly, the basic structure and the operating principle of the planar damper are introduced. Secondly, the analytical model of the planar damper is established where the magnetic flux density distribution of the permanent magnet arrays is obtained by using the equivalent magnetic charge method and the image method. Then, the analytical expressions of the damping force and damping coefficient are derived. Lastly, to verify the analytical model, the finite element method (FEM) is adopted for calculating the flux density and a planar damper prototype is manufactured and thoroughly tested. The results from FEM and experiments are in good agreement with the ones from the analytical expressions indicating that the analytical model is reasonable and correct.

  18. [Microsecond Pulsed Hollow Cathode Lamp as Enhanced Excitation Source of Hydride Generation Atomic Fluorescence Spectrometry].

    PubMed

    Zhang, Shuo

    2015-09-01

    The spectral, electrical and atomic fluorescence characteristics of As, Se, Sb and Pb hollow cathode lamps (HCLs) powered by a laboratory-built high-current microsecond pulse (HCMP) power supply were studied, and the feasibility of using HCMP-HCLs as the excitation source of hydride generation atomic fluorescence spectrometry (HG-AFS) was evaluated. Under the HCMP power supply mode, the As, Se, Sb and Pb HCLs can maintain a stable glow discharge at frequencies of 100~1000 Hz, pulse widths of 4.0~20 μs and pulse currents up to 4.0 A. The relationship between the intensity of characteristic emission lines and the HCMP power supply parameters, such as pulse current, power supply voltage, pulse width and frequency, was studied in detail. Compared with the conventional pulsed (CP) HCLs used in commercial AFS instruments, HCMP-HCLs have a narrower pulse width and a much stronger pulse current. Under the optimized HCMP power supply parameters, the intensity of the atomic emission lines of the As, Se and Sb HCLs was sharply enhanced, indicating their potential as a novel HG-AFS excitation source. However, the attenuation of atomic lines and the enhancement of ionic lines ruled out this feasibility for the HCMP-Pb HCL. The analytical capability of HG-AFS using the HCMP-As/Se/Sb HCL excitation source was then evaluated, and the results showed that the HCMP-HCL is a promising excitation source for HG-AFS.

  19. Seeking maximum linearity of transfer functions

    NASA Astrophysics Data System (ADS)

    Silva, Filipi N.; Comin, Cesar H.; Costa, Luciano da F.

    2016-12-01

    Linearity is an important and frequently sought property in electronics and instrumentation. Here, we report a method capable of, given a transfer function (theoretical or derived from some real system), identifying its most linear region of operation with a fixed width. This methodology, which is based on least squares regression and systematic consideration of all possible regions, is illustrated with respect to both an analytical case (a sigmoid transfer function) and a simple experimental situation involving a low-power, one-stage class A transistor current amplifier. The approach, applied to transfer functions derived from experimentally obtained characteristic surfaces, also yielded contributions such as the estimation of local constants of the device, as opposed to the average values typically considered. The reported method and results pave the way to further applications in other types of devices and systems, intelligent control operation, and other areas such as identifying regions of power-law behavior.
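    The search this abstract describes (a least-squares fit over every fixed-width window, keeping the window with the smallest residual) can be sketched as follows. This is an illustrative reimplementation of the idea, not the authors' code; the function name, sampling grid and window width are all choices made here:

    ```python
    import numpy as np

    def most_linear_region(x, y, width):
        """Fit a least-squares line to every contiguous window of `width`
        samples and return (start_index, slope, intercept) of the window
        with the smallest sum of squared residuals."""
        best = None
        for i in range(len(x) - width + 1):
            xs, ys = x[i:i + width], y[i:i + width]
            slope, intercept = np.polyfit(xs, ys, 1)
            resid = np.sum((ys - (slope * xs + intercept)) ** 2)
            if best is None or resid < best[0]:
                best = (resid, i, slope, intercept)
        return best[1], best[2], best[3]

    # Analytical example from the abstract: a sigmoid transfer function.
    x = np.linspace(-6.0, 6.0, 601)
    y = 1.0 / (1.0 + np.exp(-x))

    # By symmetry, the most linear fixed-width window of a sigmoid
    # should be centered near its inflection point at x = 0.
    i, slope, intercept = most_linear_region(x, y, width=101)
    ```

    The exhaustive scan is O(n * width); for long records, the window fits can be made incremental by updating the running sums that enter the normal equations instead of refitting each window from scratch.
    
    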

  20. Investigation of burn effect on skin using simultaneous Raman-Brillouin spectroscopy, and fluorescence microspectroscopy

    NASA Astrophysics Data System (ADS)

    Coker, Zachary; Meng, Zhaokai; Troyanova-Wood, Maria; Traverso, Andrew; Ballmann, Charles; Petrov, Georgi; Ibey, Bennett L.; Yakovlev, Vladislav

    2017-02-01

    Burns are thermal injuries that can completely damage, or at least compromise, the protective function of skin and affect the ability of tissues to manage moisture. Burn-damaged tissues exhibit lower elasticity than healthy tissues, due to significantly reduced water concentrations and plasma retention. Current methods for determining burn intensity are limited to visual inspection and, potentially, hospital x-ray examination. We present a unique confocal microscope capable of measuring Raman and Brillouin spectra simultaneously, with concurrent fluorescence investigation from a single spatial location, and demonstrate its application by investigating and characterizing the properties of burn-afflicted tissue on a chicken skin model. Raman and Brillouin scattering offer complementary information about a material's chemical and mechanical structure, while fluorescence can serve as a useful diagnostic indicator and imaging tool. The developed instrument has the potential for very diverse analytical applications in basic biomedical science and biomedical diagnostics and imaging.
