Review of recent advances in analytical techniques for the determination of neurotransmitters
Perry, Maura; Li, Qiang; Kennedy, Robert T.
2009-01-01
Methods and advances for monitoring neurotransmitters in vivo or for tissue analysis of neurotransmitters over the last five years are reviewed. The review is organized primarily by neurotransmitter type. Transmitter and related compounds may be monitored by either in vivo sampling coupled to analytical methods or implanted sensors. Sampling is primarily performed using microdialysis, but low-flow push-pull perfusion may offer advantages of spatial resolution while minimizing the tissue disruption associated with higher flow rates. Analytical techniques coupled to these sampling methods include liquid chromatography, capillary electrophoresis, enzyme assays, sensors, and mass spectrometry. Methods for the detection of amino acid, monoamine, neuropeptide, acetylcholine, nucleoside, and soluble gas neurotransmitters have been developed and improved upon. Advances in the speed and sensitivity of these methods have enabled improvements in temporal resolution and increased the number of compounds detectable. Similar advances have enabled improved detection in tissue samples, with a substantial emphasis on single cells and other small samples. Sensors provide excellent temporal and spatial resolution for in vivo monitoring. Advances in application to catecholamines, indoleamines, and amino acids have been prominent. Improvements in stability, sensitivity, and selectivity of the sensors have been of paramount interest. PMID:19800472
Advances in NMR Spectroscopy for Lipid Oxidation Assessment
USDA-ARS?s Scientific Manuscript database
Although there are many analytical methods developed for the assessment of lipid oxidation, different analytical methods often give different, sometimes even contradictory, results. The reason for this inconsistency is that although there are many different kinds of oxidation products, most methods ...
2017-08-01
of metallic additive manufacturing processes and show that combining experimental data with modelling and advanced data processing and analytics methods will accelerate that...geometries, we develop a methodology that couples experimental data and modelling to convert the scan paths into spatially resolved local thermal histories
Advanced methods of structural and trajectory analysis for transport aircraft
NASA Technical Reports Server (NTRS)
Ardema, Mark D.
1995-01-01
This report summarizes the efforts in two areas: (1) development of advanced methods of structural weight estimation, and (2) development of advanced methods of trajectory optimization. The majority of the effort was spent in the structural weight area. A draft of 'Analytical Fuselage and Wing Weight Estimation of Transport Aircraft', resulting from this research, is included as an appendix.
The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...
Toward Usable Interactive Analytics: Coupling Cognition and Computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; North, Chris; Chang, Remco
Interactive analytics provide users a myriad of computational means to aid in extracting meaningful information from large and complex datasets. Much prior work focuses either on advancing the capabilities of machine-centric approaches by the data mining and machine learning communities, or human-driven methods by the visualization and CHI communities. However, these methods do not yet support a true human-machine symbiotic relationship where users and machines work together collaboratively and adapt to each other to advance an interactive analytic process. In this paper we discuss some of the inherent issues, outlining what we believe are the steps toward usable interactive analytics that will ultimately increase the effectiveness for both humans and computers to produce insights.
Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van
2018-04-01
In this work, an advanced analytical procedure was applied to calculate radioactivity in spiked water samples in close-geometry gamma spectroscopy. It included the MCNP-CP code to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, and the two codes showed good agreement. Finally, the validity of the developed procedure was confirmed by a proficiency test to calculate the activities of various radionuclides. The radioactivity measurements obtained with both detectors using the advanced analytical procedure received "Accepted" statuses in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.
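As generic background on where a coincidence summing correction enters (the symbols below are standard gamma-spectrometry quantities assumed for this sketch, not notation or values from the paper), the massic activity deduced from a full-energy peak can be written as

\[ A = \frac{N_{\mathrm{net}}\,\mathrm{CSF}}{\varepsilon(E)\, I_{\gamma}\, t_{\mathrm{live}}\, m}, \]

where N_net is the net peak area, ε(E) the full-energy-peak efficiency, I_γ the gamma emission probability, t_live the counting live time, and m the sample mass; with this convention CSF > 1 compensates for summing-out losses in close geometry (the reciprocal convention is also common, so the factor's placement depends on the code).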
2014-01-01
This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482
Advances in Adaptive Control Methods
NASA Technical Reports Server (NTRS)
Nguyen, Nhan
2009-01-01
This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve performance and robustness of adaptive control systems. A new technique has been developed to provide an analytical method for computing time delay stability margin for adaptive control systems.
Current Status of Mycotoxin Analysis: A Critical Review.
Shephard, Gordon S
2016-07-01
It is over 50 years since the discovery of aflatoxins focused the attention of food safety specialists on fungal toxins in the feed and food supply. Since then, analysis of this important group of natural contaminants has advanced in parallel with general developments in analytical science, and current MS methods are capable of simultaneously analyzing hundreds of compounds, including mycotoxins, pesticides, and drugs. This profusion of data may advance our understanding of human exposure, yet constitutes an interpretive challenge to toxicologists and food safety regulators. Despite these advances in analytical science, the basic problem of the extreme heterogeneity of mycotoxin contamination, although now well understood, cannot be circumvented. The real health challenges posed by mycotoxin exposure occur in the developing world, especially among small-scale and subsistence farmers. Addressing these problems requires innovative approaches in which analytical science must also play a role in providing suitable out-of-laboratory analytical techniques.
Recent advances in immunosensor for narcotic drug detection
Gandhi, Sonu; Suman, Pankaj; Kumar, Ashok; Sharma, Prince; Capalash, Neena; Suri, C. Raman
2015-01-01
Introduction: Immunosensors for illicit drugs have gained immense interest and have found several applications in drug abuse monitoring. This technology offers low-cost detection of narcotics, thereby providing a confirmatory platform to complement existing analytical methods. Methods: In this minireview, we define the basic concept of transducers for immunosensor development that utilize antibodies and low-molecular-mass hapten (opiate) molecules. Results: This article emphasizes recent advances in immunoanalytical techniques for the monitoring of opiate drugs. Our results demonstrate that high-quality antibodies can be used for immunosensor development against the target analyte with greater sensitivity, specificity and precision than other available analytical methods. Conclusion: In this review we highlight the fundamentals of different transducer technologies and their applications for immunosensor development currently being pursued in our laboratory, using rapid screening via an immunochromatographic kit and label-free optical detection via enzyme-, fluorescence-, gold nanoparticle- and carbon nanotube-based immunosensing for sensitive and specific monitoring of opiates. PMID:26929925
Modified symplectic schemes with nearly-analytic discrete operators for acoustic wave simulations
NASA Astrophysics Data System (ADS)
Liu, Shaolin; Yang, Dinghui; Lang, Chao; Wang, Wenshuai; Pan, Zhide
2017-04-01
Using a structure-preserving algorithm significantly increases the computational efficiency of solving wave equations. However, only a few explicit symplectic schemes are available in the literature, and the capabilities of these symplectic schemes have not been sufficiently exploited. Here, we propose a modified strategy to construct explicit symplectic schemes for time advance. The acoustic wave equation is transformed into a Hamiltonian system. The classical symplectic partitioned Runge-Kutta (PRK) method is used for the temporal discretization. Additional spatial differential terms are added to the PRK schemes to form the modified symplectic methods, and two modified time-advancing symplectic methods with all-positive symplectic coefficients are then constructed. The spatial differential operators are approximated by nearly-analytic discrete (NAD) operators, and we call the fully discretized scheme the modified symplectic nearly analytic discrete (MSNAD) method. Theoretical analyses show that the MSNAD methods exhibit less numerical dispersion and higher stability limits than conventional methods. Three numerical experiments are conducted to verify the advantages of the MSNAD methods, such as their numerical accuracy, computational cost, stability, and long-term calculation capability.
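As background on the construction being modified here, a minimal textbook example of an explicit symplectic PRK step for the acoustic wave equation is the Störmer-Verlet update sketched below (generic background, not the MSNAD scheme itself). Writing q = u and p = ∂u/∂t turns u_tt = c²∇²u into the Hamiltonian system

\[ \dot{q} = p, \qquad \dot{p} = c^{2}\nabla^{2}q, \]

and one explicit symplectic step of size Δt reads

\[ q^{n+1/2} = q^{n} + \tfrac{\Delta t}{2}\,p^{n}, \qquad p^{n+1} = p^{n} + \Delta t\,c^{2}\nabla^{2}q^{n+1/2}, \qquad q^{n+1} = q^{n+1/2} + \tfrac{\Delta t}{2}\,p^{n+1}, \]

where, in the paper's fully discrete scheme, the Laplacian would be replaced by the NAD spatial operators and the stage coefficients by the modified all-positive symplectic coefficients.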
Recent Advances in Paper-Based Sensors
Liana, Devi D.; Raguse, Burkhard; Gooding, J. Justin; Chow, Edith
2012-01-01
Paper-based sensors are a new alternative technology for fabricating simple, low-cost, portable and disposable analytical devices for many application areas including clinical diagnosis, food quality control and environmental monitoring. The unique properties of paper which allow passive liquid transport and compatibility with chemicals/biochemicals are the main advantages of using paper as a sensing platform. Depending on the main goal to be achieved in paper-based sensors, the fabrication methods and the analysis techniques can be tuned to fulfill the needs of the end-user. Current paper-based sensors are focused on microfluidic delivery of solution to the detection site whereas more advanced designs involve complex 3-D geometries based on the same microfluidic principles. Although paper-based sensors are very promising, they still suffer from certain limitations such as accuracy and sensitivity. However, it is anticipated that in the future, with advances in fabrication and analytical techniques, there will be more new and innovative developments in paper-based sensors. These sensors could better meet the current objectives of a viable low-cost and portable device in addition to offering high sensitivity and selectivity, and multiple analyte discrimination. This paper is a review of recent advances in paper-based sensors and covers the following topics: existing fabrication techniques, analytical methods and application areas. Finally, the present challenges and future outlooks are discussed. PMID:23112667
Study and characterization of a MEMS micromirror device
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
2004-08-01
In this paper, advances in our study and characterization of a MEMS micromirror device are presented. The micromirror device, of 510 μm characteristic length, operates in a dynamic mode with a maximum displacement on the order of 10 μm along its principal optical axis and oscillation frequencies of up to 1.3 kHz. Developments are carried out by analytical, computational, and experimental methods. Analytical and computational nonlinear geometrical models are developed in order to determine the optimal loading-displacement operational characteristics of the micromirror. Due to the operational mode of the micromirror, the experimental characterization of its loading-displacement transfer function requires utilization of advanced optical metrology methods. Optoelectronic holography (OEH) methodologies based on multiple wavelengths that we are developing to perform such characterization are described. It is shown that the combined analytical, computational, and experimental approach is effective in our developments.
Rovina, Kobun; Siddiquee, Shafiquzzaman; Shaarani, Sharifudin M.
2016-01-01
Allura Red AC (E129) is an azo dye that is widely used in drinks, juices, bakery, meat, and sweets products. High consumption of Allura Red has been claimed to have adverse effects on human health, including allergies, food intolerance, cancer, multiple sclerosis, attention deficit hyperactivity disorder, brain damage, nausea, cardiac disease and asthma, due to the reaction of aromatic azo compounds (R = R′ = aromatic). Several countries have banned or strictly controlled the use of Allura Red in food and beverage products. This review paper critically summarizes the available analytical and advanced methods for the determination of Allura Red and also concisely discusses the acceptable daily intake, toxicology and extraction methods. PMID:27303385
A mean curvature model for capillary flows in asymmetric containers and conduits
NASA Astrophysics Data System (ADS)
Chen, Yongkang; Tavan, Noël; Weislogel, Mark M.
2012-08-01
Capillarity-driven flows resulting from a critical geometric wetting criterion are observed to yield significant shifts of the bulk fluid from one side of the container to the other during "zero gravity" experiments. For wetting fluids, such bulk shift flows consist of advancing and receding menisci sometimes separated by secondary capillary flows such as rivulet-like flows along gaps. Here we study the mean curvature of an advancing meniscus in hopes of approximating a critical boundary condition for fluid dynamics solutions. It is found that the bulk shift flows behave as if the bulk menisci are either "connected" or "disconnected." For the connected case, an analytic method is developed to calculate the mean curvature of the advancing meniscus in an asymptotic sense. In contrast, for the disconnected case the method to calculate the mean curvature of the advancing and receding menisci uses a well-established procedure. Both disconnected and connected bulk shifts can occur as the first-tier flow of more complex compound capillary flows. Preliminary comparisons between the analytic method and the results of drop tower experiments are encouraging.
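For context, the mean curvature computed in such models sets the capillary driving pressure through the Young-Laplace relation, quoted here as standard background rather than the paper's asymptotic result (σ is the surface tension, R₁ and R₂ the principal radii of curvature, H the mean curvature):

\[ \Delta P = \sigma\left(\frac{1}{R_{1}} + \frac{1}{R_{2}}\right) = 2\sigma H, \]

so an estimate of H at the advancing meniscus translates directly into the pressure boundary condition available to drive the bulk shift flow.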
Major advances in testing of dairy products: milk component and dairy product attribute testing.
Barbano, D M; Lynch, J M
2006-04-01
Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.
GENETIC-BASED ANALYTICAL METHODS FOR BACTERIA AND FUNGI
In the past two decades, advances in high-throughput sequencing technologies have led to a veritable explosion in the generation of nucleic acid sequence information (1). While these advances are illustrated most prominently by the successful sequencing of the human genome, they...
Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom
ERIC Educational Resources Information Center
Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy
2016-01-01
The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…
Protein Quantification by Elemental Mass Spectrometry: An Experiment for Graduate Students
ERIC Educational Resources Information Center
Schwarz, Gunnar; Ickert, Stefanie; Wegner, Nina; Nehring, Andreas; Beck, Sebastian; Tiemann, Ruediger; Linscheid, Michael W.
2014-01-01
A multiday laboratory experiment was designed to integrate inductively coupled plasma-mass spectrometry (ICP-MS) in the context of protein quantification into an advanced practical course in analytical and environmental chemistry. Graduate students were familiar with the analytical methods employed, whereas the combination of bioanalytical assays…
Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...
Pharmaceutical cocrystals, salts and polymorphs: Advanced characterization techniques.
Pindelska, Edyta; Sokal, Agnieszka; Kolodziejski, Waclaw
2017-08-01
The main goal of novel drug development is to obtain a drug with optimal physicochemical, pharmaceutical and biological properties. Pharmaceutical companies and scientists modify active pharmaceutical ingredients (APIs), which often are cocrystals, salts or carefully selected polymorphs, to improve the properties of a parent drug. To find the best form of a drug, various advanced characterization methods should be used. In this review, we have described such analytical methods, dedicated to solid drug forms. Thus, diffraction, spectroscopic, thermal and also pharmaceutical characterization methods are discussed. They all are necessary to study a solid API in its intrinsic complexity from bulk down to the molecular level, gain information on its structure, properties, purity and possible transformations, and make the characterization efficient, comprehensive and complete. Furthermore, these methods can be used to monitor and investigate physical processes, involved in the drug development, in situ and in real time. The main aim of this paper is to gather information on the current advancements in the analytical methods and highlight their pharmaceutical relevance. Copyright © 2017 Elsevier B.V. All rights reserved.
The Importance of Method Selection in Determining Product Integrity for Nutrition Research
Mudge, Elizabeth M; Brown, Paula N
2016-01-01
The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. PMID:26980823
The Importance of Method Selection in Determining Product Integrity for Nutrition Research.
Mudge, Elizabeth M; Betz, Joseph M; Brown, Paula N
2016-03-01
The American Herbal Products Association estimates that there are as many as 3000 plant species in commerce. The FDA estimates that there are about 85,000 dietary supplement products in the marketplace. The pace of product innovation far exceeds that of analytical methods development and validation, with new ingredients, matrixes, and combinations resulting in an analytical community that has been unable to keep up. This has led to a lack of validated analytical methods for dietary supplements and to inappropriate method selection where methods do exist. Only after rigorous validation procedures to ensure that methods are fit for purpose should they be used in a routine setting to verify product authenticity and quality. By following systematic procedures and establishing performance requirements for analytical methods before method development and validation, methods can be developed that are both valid and fit for purpose. This review summarizes advances in method selection, development, and validation regarding herbal supplement analysis and provides several documented examples of inappropriate method selection and application. © 2016 American Society for Nutrition.
Big Data Analytics for a Smart Green Infrastructure Strategy
NASA Astrophysics Data System (ADS)
Barrile, Vincenzo; Bonfa, Stefano; Bilotta, Giuliana
2017-08-01
As is well known, Big Data is a term for data sets so large or complex that traditional data processing applications aren’t sufficient to process them. The term “Big Data” often refers to the use of predictive analytics, user behavior analytics, or other advanced data analytics methods that extract value from data, and rarely to a particular size of data set. This is especially true for the huge amount of Earth Observation data that satellites constantly orbiting the Earth transmit daily.
Analytical concepts for health management systems of liquid rocket engines
NASA Technical Reports Server (NTRS)
Williams, Richard; Tulpule, Sharayu; Hawman, Michael
1990-01-01
Substantial improvement in health management systems performance can be realized by implementing advanced analytical methods of processing existing liquid rocket engine sensor data. In this paper, such techniques ranging from time series analysis to multisensor pattern recognition to expert systems to fault isolation models are examined and contrasted. The performance of several of these methods is evaluated using data from test firings of the Space Shuttle main engines.
Churchwell, Mona I; Twaddle, Nathan C; Meeker, Larry R; Doerge, Daniel R
2005-10-25
Recent technological advances have made available reverse phase chromatographic media with a 1.7 μm particle size along with a liquid handling system that can operate such columns at much higher pressures. This technology, termed ultra performance liquid chromatography (UPLC), offers significant theoretical advantages in resolution, speed, and sensitivity for analytical determinations, particularly when coupled with mass spectrometers capable of high-speed acquisitions. This paper explores the differences in LC-MS performance by conducting a side-by-side comparison of UPLC for several methods previously optimized for HPLC-based separation and quantification of multiple analytes with maximum throughput. In general, UPLC produced significant improvements in method sensitivity, speed, and resolution. Sensitivity increases with UPLC, which were found to be analyte-dependent, were as large as 10-fold and improvements in method speed were as large as 5-fold under conditions of comparable peak separations. Improvements in chromatographic resolution with UPLC were apparent from generally narrower peak widths and from a separation of diastereomers not possible using HPLC. Overall, the improvements in LC-MS method sensitivity, speed, and resolution provided by UPLC show that further advances can be made in analytical methodology to add significant value to hypothesis-driven research.
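As textbook-level background on why the smaller particles help (these scaling relations are general chromatography results, not measurements from this study): for a well-packed column the minimum plate height is roughly twice the particle diameter, while the pressure drop grows as the inverse square of the particle diameter,

\[ N \approx \frac{L}{2\,d_{p}}, \qquad \Delta P = \frac{\phi\,\eta\,L\,u}{d_{p}^{2}}, \]

where L is the column length, d_p the particle diameter, u the linear velocity, η the mobile-phase viscosity, and φ a flow-resistance factor. Moving from 3.5-5 μm HPLC packings to 1.7 μm UPLC packings therefore buys plate count and speed at the cost of much higher operating pressure, which is why a dedicated high-pressure liquid handling system is required.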
On the Use of Accelerated Test Methods for Characterization of Advanced Composite Materials
NASA Technical Reports Server (NTRS)
Gates, Thomas S.
2003-01-01
A rational approach to the problem of accelerated testing for material characterization of advanced polymer matrix composites is discussed. The experimental and analytical methods provided should be viewed as a set of tools useful in the screening of material systems for long-term engineering properties in aerospace applications. Consideration is given to long-term exposure in extreme environments that include elevated temperature, reduced temperature, moisture, oxygen, and mechanical load. Analytical formulations useful for predictive models that are based on the principles of time-based superposition are presented. The need for reproducible mechanisms, indicator properties, and real-time data are outlined as well as the methodologies for determining specific aging mechanisms.
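A standard form of the time-based superposition referenced above is the time-temperature shift; the WLF expression below is quoted as generic background (the constants C₁ and C₂ and the reference temperature T_ref are material-dependent and are not values from this report):

\[ \log a_{T} = \frac{-C_{1}\,(T - T_{\mathrm{ref}})}{C_{2} + (T - T_{\mathrm{ref}})}, \qquad t_{\mathrm{reduced}} = \frac{t}{a_{T}}, \]

so short-duration tests at elevated temperature are mapped onto an equivalent longer time at the reference (service) temperature through the shift factor a_T, which is the basis for accelerated characterization of long-term viscoelastic properties.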
Historically, risk assessment has relied upon toxicological data to obtain hazard-based reference levels, which are subsequently compared to exposure estimates to determine whether an unacceptable risk to public health may exist. Recent advances in analytical methods, biomarker ...
Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention
ERIC Educational Resources Information Center
West, Deborah; Heath, David; Huijser, Henk
2016-01-01
This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…
Learning Analytics in Higher Education Development: A Roadmap
ERIC Educational Resources Information Center
Adejo, Olugbenga; Connolly, Thomas
2017-01-01
The increase in education data and advance in technology are bringing about enhanced teaching and learning methodology. The emerging field of Learning Analytics (LA) continues to seek ways to improve the different methods of gathering, analysing, managing and presenting learners' data with the sole aim of using it to improve the student learning…
Analytical methods for dating modern writing instrument inks on paper.
Ezcurra, Magdalena; Góngora, Juan M G; Maguregui, Itxaso; Alonso, Rosa
2010-04-15
This work reviews the different analytical methods that have been proposed in the field of forensic dating of inks from different modern writing instruments. The reported works have been classified according to the writing instrument studied and the ink component analyzed in relation to aging. The study, done chronologically, shows the advances experienced in the ink dating field in the last decades. Copyright 2009 Elsevier Ireland Ltd. All rights reserved.
Recent advances in analytical methods, biomarker discovery, cell-based assay development, computational tools, sensor/monitor, and omics technology have enabled new streams of exposure and toxicity data to be generated at higher volumes and speed. These new data offer the opport...
Web-based Visual Analytics for Extreme Scale Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Evans, Katherine J; Harney, John F
In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.
MSFC Advanced Concepts Office and the Iterative Launch Vehicle Concept Method
NASA Technical Reports Server (NTRS)
Creech, Dennis
2011-01-01
This slide presentation reviews the work of the Advanced Concepts Office (ACO) at Marshall Space Flight Center (MSFC) with particular emphasis on the method used to model launch vehicles using INTegrated ROcket Sizing (INTROS), a modeling system that assists in establishing the launch concept design, and stage sizing, and facilitates the integration of exterior analytic efforts, vehicle architecture studies, and technology and system trades and parameter sensitivities.
ERIC Educational Resources Information Center
Albright, Jessica C.; Beussman, Douglas J.
2017-01-01
Capillary electrophoresis is an important analytical separation method used to study a wide variety of samples, including those of biological origin. Capillary electrophoresis may be covered in the classroom, especially in advanced analytical courses, and while many students are exposed to gel electrophoresis in biology or biochemistry…
Weisner, Thomas S; Fiese, Barbara H
2011-12-01
Mixed methods in family psychology refer to the systematic integration of qualitative and quantitative techniques to represent family processes and settings. Over the past decade, significant advances have been made in study design, analytic strategies, and technological support (such as software) that allow for the integration of quantitative and qualitative methods and for making appropriate inferences from mixed methods. This special section of the Journal of Family Psychology illustrates how mixed methods may be used to advance knowledge in family science through identifying important cultural differences in family structure, beliefs, and practices, and revealing patterns of family relationships to generate new measurement paradigms and inform clinical practice. Guidance is offered to advance mixed methods research in family psychology through sound principles of peer review.
Introductory Guide to the Statistics of Molecular Genetics
ERIC Educational Resources Information Center
Eley, Thalia C.; Rijsdijk, Fruhling
2005-01-01
Background: This introductory guide presents the main two analytical approaches used by molecular geneticists: linkage and association. Methods: Traditional linkage and association methods are described, along with more recent advances in methodologies such as those using a variance components approach. Results: New methods are being developed all…
The Water-Energy-Food Nexus: Advancing Innovative, Policy-Relevant Methods
NASA Astrophysics Data System (ADS)
Crootof, A.; Albrecht, T.; Scott, C. A.
2017-12-01
The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex Anthropocene challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, a primary limitation of the nexus approach is the absence - or gaps and inconsistent use - of adequate methods to advance an innovative and policy-relevant nexus approach. This paper presents an analytical framework to identify robust nexus methods that align with nexus thinking and highlights innovative nexus methods at the frontier. The current state of nexus methods was assessed with a systematic review of 245 journal articles and book chapters. This review revealed (a) use of specific and reproducible methods for nexus assessment is uncommon - less than one-third of the reviewed studies present explicit methods; (b) nexus methods frequently fall short of capturing interactions among water, energy, and food - the very concept they purport to address; (c) assessments strongly favor quantitative approaches - 70% use primarily quantitative tools; (d) use of social science methods is limited (26%); and (e) many nexus methods are confined to disciplinary silos - only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. Despite some pitfalls of current nexus methods, there are a host of studies that offer innovative approaches to help quantify nexus linkages and interactions among sectors, conceptualize dynamic feedbacks, and support mixed method approaches to better understand WEF systems. Applying our analytical framework to all 245 studies, we identify, and analyze herein, seventeen studies that implement innovative multi-method and cross-scalar tools to demonstrate promising advances toward improved nexus assessment. This paper finds that, to make the WEF nexus effective as a policy-relevant analytical tool, methods are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and policy-makers.
[Automated analyzer of enzyme immunoassay].
Osawa, S
1995-09-01
Automated analyzers for enzyme immunoassay can be classified from several points of view: the kind of labeled antibodies or enzymes, detection methods, the number of tests per unit time, and analytical time and speed per run. In practice, it is important to consider several points such as detection limits, the number of tests per unit time, analytical range, and precision. Most of the automated analyzers on the market can randomly access and measure samples. I will describe recent advances in automated analyzers, reviewing their labeling antibodies and enzymes, detection methods, the number of tests per unit time, and analytical time and speed per test.
Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika
2017-02-01
Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.
Andrew Taylor, R; Venkatesh, Arjun; Parwani, Vivek; Chekijian, Sharon; Shapiro, Marc; Oh, Andrew; Harriman, David; Tarabar, Asim; Ulrich, Andrew
2018-01-04
Emergency Department (ED) leaders are increasingly confronted with large amounts of data with the potential to inform and guide operational decisions, and routine use of advanced analytic methods may provide additional insights. Our objective was to examine the practical application of available advanced analytic methods to guide operational decision making around patient boarding. We performed a retrospective analysis of the effect of boarding on ED operational metrics from a single site between 1/2015 and 1/2017. Time series were visualized through decompositional techniques accounting for seasonal trends, to determine the effect of boarding on ED performance metrics and to determine the impact of boarding "shocks" to the system on operational metrics over several days. There were 226,461 visits, and the mean (IQR) number of visits per day was 273 (258-291). Decomposition of the boarding count time series illustrated an upward trend in the last 2-3 quarters as well as clear seasonal components. All performance metrics were significantly impacted (p<0.05) by boarding count, except for overall Press Ganey scores (p<0.65). For every additional increase in boarder count, overall length-of-stay (LOS) increased by 1.55 min (0.68, 1.50). Smaller effects were seen for waiting room LOS and treat-and-release LOS. The impulse responses indicate that boarding shocks are characterized by changes in the performance metrics within the first day that fade out after 4-5 days. In this study regarding the use of advanced analytics in daily ED operations, time series analysis provided multiple useful insights into boarding and its impact on performance metrics. Copyright © 2018. Published by Elsevier Inc.
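A minimal sketch of the kind of seasonal decomposition described above (the file name and column names are hypothetical, and this is illustrative code, not the authors' analysis):

```python
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical daily ED data: one row per day with a boarder count and an overall length-of-stay metric.
df = pd.read_csv("ed_daily_metrics.csv", parse_dates=["date"], index_col="date")

# Split the boarding count into trend, weekly seasonal, and residual components.
decomp = seasonal_decompose(df["boarder_count"], model="additive", period=7)
trend, seasonal, resid = decomp.trend, decomp.seasonal, decomp.resid

# Crude look at the boarding/LOS association after removing weekly seasonality from LOS.
los_seasonal = seasonal_decompose(df["overall_los_min"], model="additive", period=7).seasonal
print((df["overall_los_min"] - los_seasonal).corr(df["boarder_count"]))
```

The decomposition components correspond to the trend and seasonal structure reported; reproducing the impulse-response analysis would additionally require a transfer-function or vector autoregression model fitted to the de-seasonalized series.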
MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications
Medina, Isabel; Cappiello, Achille; Careri, Maria
2018-01-01
Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are showing an increasing interest in the scientific community, especially in food safety, environmental, and forensic investigation areas where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at the early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017. PMID:29850370
Advances in high-resolution mass spectrometry based on metabolomics studies for food--a review.
Rubert, Josep; Zachariasova, Milena; Hajslova, Jana
2015-01-01
Food authenticity has become a necessity for global food policies, since food placed on the market must be authentic. It has always been a challenge, since in the past minor components, also called markers, were mainly monitored by chromatographic methods in order to authenticate food. Nowadays, however, advanced analytical methods have allowed food fingerprints to be obtained. At the same time they have also been combined with chemometrics, which uses statistical methods to verify food and to provide maximum information by analysing chemical data. These sophisticated methods, based on different separation techniques or used stand-alone, have recently been coupled to high-resolution mass spectrometry (HRMS) in order to verify the authenticity of food. The new generation of HRMS detectors has experienced significant advances in resolving power, sensitivity, robustness, extended dynamic range, easier mass calibration and tandem mass capabilities, making HRMS more attractive and useful to the food metabolomics community and therefore a reliable tool for food authenticity. The purpose of this review is to summarise and describe the most recent metabolomics approaches in the area of food metabolomics, and to discuss the strengths and drawbacks of the HRMS analytical platforms combined with chemometrics.
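A minimal sketch of the kind of chemometric step layered on such HRMS fingerprints (the feature matrix here is synthetic, and scikit-learn PCA is used only as a representative unsupervised method, not the specific workflow of the review):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic HRMS fingerprint matrix: rows are food samples, columns are aligned m/z features.
rng = np.random.default_rng(0)
authentic = rng.normal(0.0, 1.0, size=(30, 500))
adulterated = rng.normal(0.4, 1.0, size=(30, 500))  # systematically shifted feature intensities
X = np.vstack([authentic, adulterated])

# Autoscale the features, then project the samples onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
print(scores[:3])  # score plots of these components are inspected for authentic vs adulterated clustering
```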
Charmaz, Kathy
2015-12-01
This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.
DOT National Transportation Integrated Search
2010-10-01
The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...
Panuwet, Parinya; Hunter, Ronald E.; D’Souza, Priya E.; Chen, Xianyu; Radford, Samantha A.; Cohen, Jordan R.; Marder, M. Elizabeth; Kartavenka, Kostya; Ryan, P. Barry; Barr, Dana Boyd
2015-01-01
The ability to quantify levels of target analytes in biological samples accurately and precisely, in biomonitoring, involves the use of highly sensitive and selective instrumentation such as tandem mass spectrometers and a thorough understanding of highly variable matrix effects. Typically, matrix effects are caused by co-eluting matrix components that alter the ionization of target analytes as well as the chromatographic response of target analytes, leading to reduced or increased sensitivity of the analysis. Thus, before the desired accuracy and precision standards of laboratory data are achieved, these effects must be characterized and controlled. Here we present our review and observations of matrix effects encountered during the validation and implementation of tandem mass spectrometry-based analytical methods. We also provide systematic, comprehensive laboratory strategies needed to control challenges posed by matrix effects in order to ensure delivery of the most accurate data for biomonitoring studies assessing exposure to environmental toxicants. PMID:25562585
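As background on how such matrix effects are commonly quantified (the post-extraction addition convention below is a standard formulation, not necessarily the exact protocol used by these authors):

\[ \mathrm{ME}(\%) = \frac{B}{A}\times 100, \]

where A is the analyte response in neat solvent and B is the response for the same concentration spiked into blank matrix extract after extraction; values below 100% indicate ion suppression and values above 100% indicate ion enhancement.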
NASA Technical Reports Server (NTRS)
Keiter, I. D.
1982-01-01
Studies of several General Aviation aircraft indicated that the application of advanced technologies to General Aviation propellers can reduce fuel consumption in future aircraft by a significant amount. Propeller blade weight reductions achieved through the use of composites, propeller efficiency and noise improvements achieved through the use of advanced concepts and improved propeller analytical design methods result in aircraft with lower operating cost, acquisition cost and gross weight.
Immunochemical methods are responding to the changing needs of regulatory and monitoring programs and are meeting new analytical challenges as they arise. Recent advances in environmental immunochemistry have expanded the role of immunoassays from field screening methods to hig...
Advances in aptamer screening and small molecule aptasensors.
Kim, Yeon Seok; Gu, Man Bock
2014-01-01
It has been 20 years since aptamer and SELEX (systematic evolution of ligands by exponential enrichment) were described independently by Andrew Ellington and Larry Gold. Based on the great advantages of aptamers, there have been numerous isolated aptamers for various targets that have actively been applied as therapeutic and analytical tools. Over 2,000 papers related to aptamers or SELEX have been published, attesting to their wide usefulness and the applicability of aptamers. SELEX methods have been modified or re-created over the years to enable aptamer isolation with higher affinity and selectivity in more labor- and time-efficient manners, including automation. Initially, most of the studies about aptamers have focused on the protein targets, which have physiological functions in the body, and their applications as therapeutic agents or receptors for diagnostics. However, aptamers for small molecules such as organic or inorganic compounds, drugs, antibiotics, or metabolites have not been studied sufficiently, despite the ever-increasing need for rapid and simple analytical methods for various chemical targets in the fields of medical diagnostics, environmental monitoring, food safety, and national defense against targets including chemical warfare. This review focuses on not only recent advances in aptamer screening methods but also its analytical application for small molecules.
Piezocone Penetration Testing Device
DOT National Transportation Integrated Search
2017-01-03
Hydraulic characteristics of soils can be estimated from piezocone penetration test (called PCPT hereinafter) by performing dissipation test or on-the-fly using advanced analytical techniques. This research report presents a method for fast estimatio...
Applications of Optical Microcavity Resonators in Analytical Chemistry
Wade, James H.; Bailey, Ryan C.
2018-01-01
Optical resonator sensors are an emerging class of analytical technologies that use recirculating light confined within a microcavity to sensitively measure the surrounding environment. Bolstered by advances in microfabrication, these devices can be configured for a wide variety of chemical or biomolecular sensing applications. The review begins with a brief description of optical resonator sensor operation followed by discussions regarding sensor design, including different geometries, choices of material systems, methods of sensor interrogation, and new approaches to sensor operation. Throughout, key recent developments are highlighted, including advancements in biosensing and other applications of optical sensors. Alternative sensing mechanisms and hybrid sensing devices are then discussed in terms of their potential for more sensitive and rapid analyses. Brief concluding statements offer our perspective on the future of optical microcavity sensors and their promise as versatile detection elements within analytical chemistry. PMID:27049629
Clinical and diagnostic utility of saliva as a non-invasive diagnostic fluid: a systematic review
Nunes, Lazaro Alessandro Soares; Mussavira, Sayeeda
2015-01-01
This systematic review presents the latest trends in salivary research and its applications in health and disease. Among the large number of analytes present in saliva, many are affected by diverse physiological and pathological conditions. Further, the non-invasive, easy and cost-effective collection methods prompt interest in evaluating its diagnostic or prognostic utility. Data accumulated over the past two decades indicate the possible utility of saliva to monitor overall health, to diagnose and treat various oral or systemic disorders, and for drug monitoring. Advances in saliva-based systems biology have also contributed to the identification of several biomarkers and the development of diverse salivary diagnostic kits and other sensitive analytical techniques. However, its utilization should be carefully evaluated in relation to standardization of pre-analytical and analytical variables, such as collection and storage methods, analyte circadian variation, sample recovery, prevention of sample contamination and analytical procedures. In spite of all these challenges, there is an escalating evolution of knowledge with the use of this biological matrix. PMID:26110030
Methods for geochemical analysis
Baedecker, Philip A.
1987-01-01
The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.
Modern analytical methods for the detection of food fraud and adulteration by food category.
Hong, Eunyoung; Lee, Sang Yoo; Jeong, Jae Yun; Park, Jung Min; Kim, Byung Hee; Kwon, Kisung; Chun, Hyang Sook
2017-09-01
This review provides current information on the analytical methods used to identify food adulteration in the six most adulterated food categories: animal origin and seafood, oils and fats, beverages, spices and sweet foods (e.g. honey), grain-based food, and others (organic food and dietary supplements). The analytical techniques (both conventional and emerging) used to identify adulteration in these six food categories involve sensory, physicochemical, DNA-based, chromatographic and spectroscopic methods, and have been combined with chemometrics, making these techniques more convenient and effective for the analysis of a broad variety of food products. Despite recent advances, the need remains for suitably sensitive and widely applicable methodologies that encompass all the various aspects of food adulteration. © 2017 Society of Chemical Industry.
Propeller flow visualization techniques
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Paulovich, F. J.; Greissing, J. P.; Walker, E. D.
1982-01-01
Propeller flow visualization techniques were tested. The actual operating blade shape, which determines actual propeller performance and noise, was established. The ability to photographically determine advanced propeller blade tip deflections and local flow field conditions, and to gain insight into aeroelastic instability, is demonstrated. The analytical prediction methods which are being developed can be compared with experimental data. These comparisons contribute to the verification of these improved methods and give improved capability for designing future advanced propellers with enhanced performance and noise characteristics.
Analytical methods for determining individual aldehyde, ketone, and alcohol emissions from gasoline-, methanol-, and variable-fueled vehicles are described. These methods were used in the Auto/Oil Air quality Improvement Research Program to provide emission data for comparison of...
NASA Astrophysics Data System (ADS)
Borazjani, Iman; Asgharzadeh, Hafez
2015-11-01
Flow simulations involving complex geometries and moving boundaries suffer from time-step size restrictions and low convergence rates with explicit and semi-implicit schemes. Implicit schemes can be used to overcome these restrictions. However, implementing an implicit solver for nonlinear equations, including the Navier-Stokes equations, is not straightforward. Newton-Krylov subspace methods (NKMs) are among the most advanced iterative methods for solving nonlinear equations such as the implicit discretization of the Navier-Stokes equations. The efficiency of NKMs depends heavily on the Jacobian formation method; for example, automatic differentiation is very expensive, and matrix-free methods slow down as the mesh is refined. The analytical Jacobian is an inexpensive approach, but deriving the analytical Jacobian for the Navier-Stokes equations on a staggered grid is challenging. An NKM with a novel analytical Jacobian was developed and validated against the Taylor-Green vortex and pulsatile flow in a 90-degree bend. The developed method successfully handled complex geometries such as an intracranial aneurysm with multiple overset grids and immersed boundaries. It is shown that the NKM with an analytical Jacobian is 3 to 25 times faster than the fixed-point implicit Runge-Kutta method, and more than 100 times faster than automatic differentiation, depending on the grid (size) and the flow problem. The developed methods are fully parallelized, with a parallel efficiency of 80-90% on the problems tested.
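A minimal sketch of a Jacobian-free Newton-Krylov solve for a generic nonlinear system, using SciPy's built-in solver on a toy 1D nonlinear diffusion-reaction problem; this only illustrates the class of methods, not the authors' staggered-grid Navier-Stokes solver or their analytical Jacobian:

```python
import numpy as np
from scipy.optimize import newton_krylov

def residual(u):
    """Residual F(u) of -u'' + u*exp(u) = 10 on (0, 1) with zero Dirichlet boundaries."""
    n = u.size
    h = 1.0 / (n + 1)
    u_pad = np.concatenate(([0.0], u, [0.0]))  # enforce homogeneous Dirichlet boundary values
    lap = (u_pad[2:] - 2.0 * u_pad[1:-1] + u_pad[:-2]) / h**2
    return -lap + u * np.exp(u) - 10.0

u0 = np.zeros(100)                               # initial guess on a uniform interior grid
u = newton_krylov(residual, u0, method="lgmres", f_tol=1e-10)
print("max |F(u)| =", np.abs(residual(u)).max())
```

Here the Jacobian-vector products are approximated by finite differences inside the Krylov solver; supplying an analytical Jacobian, as in the abstract above, replaces that approximation and is what yields the reported speedups.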
Remote Sensing is a scientific discipline of non-contact monitoring. It includes a range of technologies that span from aerial photography to advanced spectral imaging and analytical methods. This Session is designed to demonstrate contemporary practical applications of remote se...
Focus on out-of-equilibrium dynamics in strongly interacting one-dimensional systems
NASA Astrophysics Data System (ADS)
Daley, A. J.; Rigol, M.; Weiss, D. S.
2014-09-01
In the past few years, there have been significant advances in understanding out-of-equilibrium dynamics in strongly interacting many-particle quantum systems. This is the case for 1D dynamics, where experimental advances—both with ultracold atomic gases and with solid state systems—have been accompanied by advances in theoretical methods, both analytical and numerical. This ‘focus on’ collection brings together 17 new papers, which together give a representative overview of the recent advances.
Salgueiro-González, N; Castiglioni, S; Zuccato, E; Turnes-Carou, I; López-Mahía, P; Muniategui-Lorenzo, S
2018-09-18
The problem of endocrine disrupting compounds (EDCs) in the environment has become a worldwide concern in recent decades. Besides their toxicological effects at low concentrations and their widespread use in industrial and household applications, these pollutants pose a risk for non-target organisms and also for public safety. Analytical methods to determine these compounds at trace levels in different matrices are urgently needed. This review critically discusses trends in analytical methods for well-known EDCs like alkylphenols and bisphenol A in solid environmental matrices, including sediment and aquatic biological samples (from 2006 to 2018). Information about extraction, clean-up and determination is covered in detail, including analytical quality parameters (QA/QC). Conventional and novel analytical techniques are compared, with their advantages and drawbacks. Ultrasound assisted extraction followed by solid phase extraction clean-up is the most widely used procedure for sediment and aquatic biological samples, although softer extraction conditions have been employed for the latter. The use of liquid chromatography followed by tandem mass spectrometry has greatly increased in the last five years. The majority of these methods have been employed for the analysis of river sediments and bivalve molluscs because of their usefulness in aquatic ecosystem (bio)monitoring programs. Green, simple, fast analytical methods are now needed to determine these compounds in complex matrices. Copyright © 2018 Elsevier B.V. All rights reserved.
Applications of flight control system methods to an advanced combat rotorcraft
NASA Technical Reports Server (NTRS)
Tischler, Mark B.; Fletcher, Jay W.; Morris, Patrick M.; Tucker, George T.
1989-01-01
Advanced flight control system design, analysis, and testing methodologies developed at the Ames Research Center are applied in an analytical and flight test evaluation of the Advanced Digital Optical Control System (ADOCS) demonstrator. The primary objectives are to describe the knowledge gained about the implications of digital flight control system design for rotorcraft, and to illustrate the analysis of the resulting handling qualities in the context of the proposed new handling-qualities specification for rotorcraft. Topics covered in depth are digital flight control design and analysis methods, flight testing techniques, ADOCS handling-qualities evaluation results, and correlation of flight test results with analytical models and the proposed handling-qualities specification. The evaluation of the ADOCS demonstrator indicates desirable response characteristics based on equivalent damping and frequency, but undesirably large effective time delays (exceeding 240 msec in all axes). Piloted handling qualities are found to be desirable or adequate for all low, medium, and high pilot-gain tasks; but handling qualities are inadequate for ultra-high-gain tasks such as slope and running landings.
Advances in Molecular Rotational Spectroscopy for Applied Science
NASA Astrophysics Data System (ADS)
Harris, Brent; Fields, Shelby S.; Pulliam, Robin; Muckle, Matt; Neill, Justin L.
2017-06-01
Advances in chemical sensitivity and robust, solid-state designs for microwave/millimeter-wave instrumentation compel the expansion of molecular rotational spectroscopy as a research tool into applied science. Molecular rotational spectroscopy is most familiar in the context of air analysis. Those techniques are included in our presentation of a broader application space for materials analysis using Fourier Transform Molecular Rotational Resonance (FT-MRR) spectrometers. There are potentially transformative advantages for direct gas analysis of complex mixtures, determination of unknown evolved gases from solid materials with parts-per-trillion detection limits, and unambiguous chiral determination. The introduction of FT-MRR as an alternative detection principle for analytical chemistry has created a ripe research space for the development of new analytical methods and sampling equipment to fully enable FT-MRR. We present the current state of purpose-built FT-MRR instrumentation and the latest application measurements that make use of new sampling methods.
Technology advancement for integrative stem cell analyses.
Jeong, Yoon; Choi, Jonghoon; Lee, Kwan Hyi
2014-12-01
Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity among the individual cells within a given population. The limitations of this blanket approach to the analysis and characterization of stem cells underscore the need for new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Although recent advances in micro/nano technology have led to growth in the stem cell analytical field, the underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concepts of vertical and horizontal approaches, we propose that adequate methods are needed to integrate information so that multiple descriptive parameters from a stem cell can be obtained from a single experiment.
Technology Advancement for Integrative Stem Cell Analyses
Jeong, Yoon
2014-01-01
Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity among the individual cells within a given population. The limitations of this blanket approach to the analysis and characterization of stem cells underscore the need for new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Although recent advances in micro/nano technology have led to growth in the stem cell analytical field, the underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concepts of vertical and horizontal approaches, we propose that adequate methods are needed to integrate information so that multiple descriptive parameters from a stem cell can be obtained from a single experiment. PMID:24874188
Cell bioprocessing in space - Applications of analytical cytology
NASA Technical Reports Server (NTRS)
Todd, P.; Hymer, W. C.; Goolsby, C. L.; Hatfield, J. M.; Morrison, D. R.
1988-01-01
Cell bioprocessing experiments in space are reviewed and the development of on-board cell analytical cytology techniques that can serve such experiments is discussed. Methods and results of experiments involving the cultivation and separation of eukaryotic cells in space are presented. It is suggested that an advanced cytometer should be developed for the quantitative analysis of large numbers of specimens of suspended eukaryotic cells and bioparticles in experiments on the Space Station.
Sosa-Ferrera, Zoraida; Mahugo-Santana, Cristina; Santana-Rodríguez, José Juan
2013-01-01
Endocrine-disruptor compounds (EDCs) can mimic natural hormones and produce adverse effects in the endocrine functions by interacting with estrogen receptors. EDCs include both natural and synthetic chemicals, such as hormones, personal care products, surfactants, and flame retardants, among others. EDCs are characterised by their ubiquitous presence at trace-level concentrations and their wide diversity. Since the discovery of the adverse effects of these pollutants on wildlife and human health, analytical methods have been developed for their qualitative and quantitative determination. In particular, mass-based analytical methods show excellent sensitivity and precision for their quantification. This paper reviews recently published analytical methodologies for the sample preparation and for the determination of these compounds in different environmental and biological matrices by liquid chromatography coupled with mass spectrometry. The various sample preparation techniques are compared and discussed. In addition, recent developments and advances in this field are presented. PMID:23738329
Annual banned-substance review: Analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans
2018-01-01
Several high-profile revelations concerning anti-doping rule violations over the past 12 months have outlined the importance of tackling prevailing challenges and reducing the limitations of the current anti-doping system. At this time, the necessity to enhance, expand, and improve analytical test methods in response to the substances outlined in the World Anti-Doping Agency's (WADA) Prohibited List represents an increasingly crucial task for modern sports drug-testing programs. The ability to improve analytical testing methods often relies on the expedient application of novel information regarding superior target analytes for sports drug-testing assays, drug elimination profiles, alternative test matrices, together with recent advances in instrumental developments. This annual banned-substance review evaluates literature published between October 2016 and September 2017 offering an in-depth evaluation of developments in these arenas and their potential application to substances reported in WADA's 2017 Prohibited List. Copyright © 2017 John Wiley & Sons, Ltd.
Tests of Measurement Invariance without Subgroups: A Generalization of Classical Methods
ERIC Educational Resources Information Center
Merkle, Edgar C.; Zeileis, Achim
2013-01-01
The issue of measurement invariance commonly arises in factor-analytic contexts, with methods for assessment including likelihood ratio tests, Lagrange multiplier tests, and Wald tests. These tests all require advance definition of the number of groups, group membership, and offending model parameters. In this paper, we study tests of measurement…
Rankin, Kristin M; Kroelinger, Charlan D; Rosenberg, Deborah; Barfield, Wanda D
2012-12-01
The purpose of this article is to summarize the methodology, partnerships, and products developed as a result of a distance-based workforce development initiative to improve analytic capacity among maternal and child health (MCH) epidemiologists in state health agencies. This effort was initiated by the Centers for Disease Control's MCH Epidemiology Program and faculty at the University of Illinois at Chicago to encourage and support the use of surveillance data by MCH epidemiologists and program staff in state agencies. Beginning in 2005, distance-based training in advanced analytic skills was provided to MCH epidemiologists. To support participants, this model of workforce development included lectures about the practical application of innovative epidemiologic methods; development of multidisciplinary teams within and across agencies; and systematic, tailored technical assistance. The goal of this initiative evolved to emphasize the direct application of advanced methods to the development of state data products using complex sample surveys, resulting in the articles published in this supplement to MCHJ. Innovative methods were applied by participating MCH epidemiologists, including regional analyses across geographies and datasets, multilevel analyses of state policies, and new indicator development. Support was provided for developing cross-state and regional partnerships and for developing and publishing the results of analytic projects. This collaboration was successful in building analytic capacity, facilitating partnerships and promoting surveillance data use to address state MCH priorities, and may have broader application beyond MCH epidemiology. In an era of decreasing resources, such partnership efforts between state and federal agencies and academia are essential for promoting effective data use.
Rabiei-Dastjerdi, Hamidreza; Matthews, Stephen A
2018-01-01
Recent interest in the social determinants of health (SDOH) and in the effects of neighborhood contexts on individual health and well-being has grown exponentially. In this brief communication, we describe recent developments in both analytical perspectives and methods that have opened up new opportunities for researchers interested in exploring neighborhoods and health research within a SDOH framework. We focus specifically on recent advances in geographic information science, statistical methods, and spatial analytical tools. We close with a discussion of how these recent developments have the potential to enhance SDOH research in Iran.
Ng, Nyuk-Ting; Kamaruddin, Amirah Farhan; Wan Ibrahim, Wan Aini; Sanagi, Mohd Marsin; Abdul Keyon, Aemi S
2018-01-01
The efficiency of the extraction and removal of pollutants from food and the environment has been an important issue in analytical science. By incorporating inorganic species into an organic matrix, a new material known as an organic-inorganic hybrid material is formed. Because they possess high selectivity, permeability, and mechanical and chemical stability, organic-inorganic hybrid materials constitute an emerging research field and have become popular as sorbents in various separation science methods. Here, we review recent significant advances in analytical solid-phase extraction employing organic-inorganic composite/nanocomposite sorbents for the extraction of organic and inorganic pollutants from various types of food and environmental matrices. The physicochemical characteristics, extraction properties, and analytical performances of the sorbents are discussed, including morphology and surface characteristics, types of functional groups, interaction mechanisms, selectivity and sensitivity, accuracy, and regeneration abilities. Organic-inorganic hybrid sorbents combined with extraction techniques are highly promising for sample preparation of various food and environmental matrices containing analytes at trace levels. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Speciated Elemental and Isotopic Characterization of Atmospheric Aerosols - Recent Advances
NASA Astrophysics Data System (ADS)
Shafer, M.; Majestic, B.; Schauer, J.
2007-12-01
Detailed elemental, isotopic, and chemical speciation analysis of aerosol particulate matter (PM) can provide valuable information on PM sources, atmospheric processing, and climate forcing. Certain PM sources may best be resolved using trace metal signatures, and elemental and isotopic fingerprints can supplement and enhance molecular marker analysis of PM for source apportionment modeling. In the search for toxicologically relevant components of PM, health studies are increasingly demanding more comprehensive characterization schemes. It is also clear that total metal analysis is at best a poor surrogate for the bioavailable component, and analytical techniques that address the labile component or specific chemical species are needed. Recent sampling and analytical developments advanced by the project team have facilitated comprehensive characterization of even very small masses of atmospheric PM. Historically, this level of detail was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. These advances have enabled the coupling of advanced chemical characterization to vital field sampling approaches that typically supply only very limited PM mass, e.g., (1) particle size-resolved sampling; (2) personal sampler collections; and (3) fine temporal scale sampling. The analytical tools that our research group is applying include: (1) sector field (high-resolution, HR) ICP-MS, (2) liquid waveguide long-path spectrophotometry (LWG-LPS), and (3) synchrotron x-ray absorption spectroscopy (sXAS). When coupled with an efficient and validated solubilization method, HR-ICP-MS can provide quantitative elemental information on over 50 elements in microgram quantities of PM. The high mass resolution and enhanced signal-to-noise of HR-ICP-MS significantly advance data quality and quantity over that possible with traditional quadrupole ICP-MS. The LWG-LPS system enables an assessment of the soluble/labile components of PM, while simultaneously providing critical oxidation-state speciation data. Importantly, the LWG-LPS can be deployed in a semi-real-time configuration to probe fine temporal scale variations in atmospheric processing or sources of PM. The sXAS provides complementary oxidation-state speciation of bulk PM. Using examples from our research, we will illustrate the capabilities and applications of these new methods.
Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen
2016-04-07
Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009), comprising colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant methods along with their ability to meet current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in sample pretreatment and extraction and in the different instrumental approaches developed to address this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research in this area has tended toward green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are multianalyte approaches. Copyright © 2016 Elsevier B.V. All rights reserved.
2011-09-01
project research addresses our long-term goal to develop an analytical suite of the Advanced Laser Fluorescence (ALF) methods and instruments to improve...demonstrated ALF utility as an integrated tool for aquatic research and observations. The ALF integration into the major oceanographic programs is...currently in progress, including the California Current Ecosystem Long Term Ecological Research (CCE LTER, NSF) and California Cooperative Oceanic
NASA Astrophysics Data System (ADS)
Gen, Masao; Kakuta, Hideo; Kamimoto, Yoshihito; Wuled Lenggoro, I.
2011-06-01
A detection method for organic molecules of a model analyte (a pesticide) is proposed, based on a surface-enhanced Raman spectroscopy (SERS)-active substrate derived from aerosol nanoparticles combined with a colloidal suspension. The approach can detect the analyte molecules from solution at concentration levels of parts per billion (ppb). For substrate fabrication, a gas-phase method is used to deposit Ag nanoparticles directly onto a silicon substrate having pyramidal structures. By mixing the target analyte with a suspension of commercially obtained Ag colloids in advance, the clothianidin analyte adsorbed on the Ag colloids can reside in the junctions of co-aggregated colloids. Using (i) a nanostructured substrate made from aerosol nanoparticles and (ii) a colloidal suspension increases the number of SERS-active spots.
Investigation and Development of Advanced Surface Microanalysis Techniques and Methods
1983-04-01
[Fragmentary report front matter; recoverable publication information: P. K. Chu and S. L. Grube (Watkins-Johnson, Scotts Valley, California), Analytical Chemistry, 1985, 57; E. Silberg, T. Y. Chang, E. A. Caridi, C. A. Evans Jr., and C. J. Hitzman, in Gallium Arsenide and Related Compounds 1982, 10th International Symposium; and "Direct Lateral and In-Depth Distributional Analysis for Ionic Contaminants in..." by P. K. Chu and S. L. Grube, Analytical Chemistry.]
Frantz, Terrill L
2012-01-01
This paper introduces the contemporary perspectives and techniques of social network analysis (SNA) and agent-based modeling (ABM) and advocates applying them to advance various aspects of complementary and alternative medicine (CAM). SNA and ABM are invaluable methods for representing, analyzing and projecting complex, relational, social phenomena; they provide both an insightful vantage point and a set of analytic tools that can be useful in a wide range of contexts. Applying these methods in the CAM context can aid the ongoing advances in the CAM field, in both its scientific aspects and in developing broader acceptance in associated stakeholder communities. Copyright © 2012 S. Karger AG, Basel.
NASA Technical Reports Server (NTRS)
Clayton, Joseph P.; Tinker, Michael L.
1991-01-01
This paper describes experimental and analytical characterization of a new flexible thermal protection material known as Tailorable Advanced Blanket Insulation (TABI). This material utilizes a three-dimensional ceramic fabric core structure and an insulation filler. TABI is the leading candidate for use in deployable aeroassisted vehicle designs. Such designs require extensive structural modeling, and the most significant in-plane material properties necessary for model development are measured and analytically verified in this study. Unique test methods are developed for damping measurements. Mathematical models are developed for verification of the experimental modulus and damping data, and finally, transverse properties are described in terms of the in-plane properties through use of a 12-dof finite difference model of a simple TABI configuration.
Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A
2008-01-01
Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.
IN-SITU OXIDATION OF 1,4-DIOXANE (LABORATORY RESULTS)
Interest in the solvent stabilizer, 1,4-dioxane, is increasing because analytical detection limits have decreased indicating its presence at chlorinated volatile organic compound contaminated sites. The most common method for removing 1,4-dioxane from contaminated water is advanc...
Advances in explosives analysis—part II: photon and neutron methods
Brown, Kathryn E.; Greenfield, Margo T.; McGrane, Shawn D.; ...
2015-10-07
The number and capability of explosives detection and analysis methods have increased dramatically since publication of the Analytical and Bioanalytical Chemistry special issue devoted to Explosives Analysis [Moore DS, Goodpaster JV, Anal Bioanal Chem 395:245-246, 2009]. Here we review and critically evaluate the most important advances in explosives detection of the past five years, with details of the improvements over previous methods, and suggest possible avenues towards further advances in, e.g., stand-off distance, detection limit, selectivity, and penetration through camouflage or packaging. Our review consists of two parts. Part I discussed methods based on animals, chemicals (including colorimetry, molecularly imprinted polymers, electrochemistry, and immunochemistry), ions (both ion-mobility spectrometry and mass spectrometry), and mechanical devices. In Part II, we review methods based on photons, from very energetic photons including X-rays and gamma rays down to the terahertz range, and neutrons.
Advances in explosives analysis—part I. animal, chemical, ion, and mechanical methods
Brown, Kathryn E.; Greenfield, Margo T.; McGrane, Shawn D.; ...
2015-10-13
The number and capability of explosives detection and analysis methods have increased substantially since the publication of the Analytical and Bioanalytical Chemistry special issue devoted to Explosives Analysis (Moore and Goodpaster, Anal Bioanal Chem 395(2):245-246, 2009). We review and critically evaluate the most important advances in explosives detection of the past five years, with details of the improvements over previous methods, and suggest possible avenues towards further advances in, e.g., stand-off distance, detection limit, selectivity, and penetration through camouflage or packaging. The review consists of two parts. Part I reviews methods based on animals, chemicals (including colorimetry, molecularly imprinted polymers, electrochemistry, and immunochemistry), ions (both ion-mobility spectrometry and mass spectrometry), and mechanical devices. Part II will review methods based on photons, from very energetic photons including X-rays and gamma rays down to the terahertz range, and neutrons.
Advances in spectroscopic methods for quantifying soil carbon
Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco; Hively, W. Dean
2012-01-01
The current gold standard for soil carbon (C) determination is elemental C analysis using dry combustion. However, this method requires expensive consumables, is limited by the number of samples that can be processed (~100/d), and is restricted to the determination of total carbon. With increased interest in soil C sequestration, faster methods of analysis are needed, and there is growing interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared or mid-infrared spectral ranges. These spectral methods can decrease analytical requirements and speed sample processing, be applied to large landscape areas using remote sensing imagery, and be used to predict multiple analytes simultaneously. However, the methods require localized calibrations to establish the relationship between spectral data and reference analytical data, and also have additional, specific problems. For example, remote sensing is capable of scanning entire watersheds for soil carbon content but is limited to the surface layer of tilled soils and may require difficult and extensive field sampling to obtain proper localized calibration reference values. The objective of this chapter is to discuss the present state of spectroscopic methods for determination of soil carbon.
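The chapter summary above does not name a particular calibration algorithm; partial least squares (PLS) regression is a common choice for relating diffuse-reflectance spectra to reference carbon values, and the sketch below illustrates the localized-calibration idea on synthetic data (the waveband count, sample count, and component number are assumptions of the example, not values from the chapter).

```python
# Hedged sketch: a spectral calibration for soil carbon via PLS regression on synthetic data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
spectra = rng.normal(size=(120, 300))                            # 120 soils x 300 wavebands (synthetic)
soil_c = 0.8 * spectra[:, 50] + rng.normal(scale=0.1, size=120)  # synthetic reference C values

pls = PLSRegression(n_components=8)          # component count chosen by cross-validation in practice
pls.fit(spectra[:90], soil_c[:90])           # "localized" calibration set
pred = pls.predict(spectra[90:]).ravel()     # prediction on held-out soils
rmse = float(np.sqrt(np.mean((pred - soil_c[90:]) ** 2)))
print("hold-out RMSE (synthetic units):", round(rmse, 3))
```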
Analytical methods for the determination of personal care products in human samples: an overview.
Jiménez-Díaz, I; Zafra-Gómez, A; Ballesteros, O; Navalón, A
2014-11-01
Personal care products (PCPs) are organic chemicals widely used in everyday human life. Nowadays, preservatives, UV filters, antimicrobials and musk fragrances are widely used PCPs. Different studies have shown that some of these compounds can cause adverse health effects, such as genotoxicity, which could even lead to mutagenic or carcinogenic effects, or estrogenicity because of their endocrine-disruption activity. Due to the absence of official monitoring protocols, there is an increasing demand for analytical methods that allow the determination of these compounds in human samples in order to obtain more information regarding their behavior and fate in the human body. The complexity of the biological matrices and the low concentration levels of these compounds necessitate advanced sample treatment procedures that provide both sample clean-up, to remove potentially interfering matrix components, and concentration of the analytes. In the present work, a review of the more recent analytical methods published in the scientific literature for the determination of PCPs in human fluids and tissue samples is presented. The work focuses on sample preparation and the analytical techniques employed. Copyright © 2014 Elsevier B.V. All rights reserved.
Determination of Copper and Zinc in Brass: Two Basic Methods
ERIC Educational Resources Information Center
Fabre, Paul-Louis; Reynes, Olivier
2010-01-01
In this experiment, the concentrations of copper and zinc in brass are obtained by two methods. This experiment does not require advanced instrumentation, uses inexpensive chemicals, and can be easily carried out during a 3-h upper-level undergraduate laboratory. Pedagogically, the basic concepts of analytical chemistry in solutions, such as pH,…
Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F
2016-01-01
Background: The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. Objective: To introduce the conceptual bases of data visualisation, and to propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. Methods: The paper introduces the conceptual bases of data visualisation, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results: Application of the visual analytic and visualisation platform is presented as a solution for improved access to heterogeneous data sources, enhanced data exploration and analysis, effective data communication, and support for decision-making. Conclusions: Applications of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective: Rapid advances in high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous, complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHRs. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, and it has a long-lasting societal impact. PMID:27740470
Analytical N beam position monitor method
NASA Astrophysics Data System (ADS)
Wegscheider, A.; Langner, A.; Tomás, R.; Franchi, A.
2017-11-01
Measurement and correction of focusing errors is of great importance for the performance and machine protection of circular accelerators. Furthermore, the LHC needs to provide equal luminosities to the experiments ATLAS and CMS. High demands are also placed on the speed of optics commissioning, as the foreseen operation with β*-leveling on luminosity will require many operational optics. A fast measurement of the β-function around a storage ring is usually done by using the measured phase advance between three consecutive beam position monitors (BPMs). A recent extension of this established technique, called the N-BPM method, was successfully applied for optics measurements at CERN, ALBA, and ESRF. We present here an improved algorithm that uses analytical calculations for both random and systematic errors and takes into account the presence of quadrupole, sextupole, and BPM misalignments, in addition to quadrupolar field errors. This new scheme, called the analytical N-BPM method, is much faster, further improves the measurement accuracy, and is applicable to very pushed beam optics where the existing numerical N-BPM method tends to fail.
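For orientation, the classical three-BPM relation that the N-BPM approach generalizes estimates the beta function at one BPM from the measured and model phase advances to the next two BPMs. The sketch below implements only that textbook relation with made-up numbers; it does not reproduce the paper's analytical propagation of random and systematic errors over quadrupole, sextupole, and BPM misalignments.

```python
# Hedged sketch of the classical three-BPM beta-function estimate (phase advances in radians).
import math

def beta_from_three_bpms(beta_model, ph12_meas, ph13_meas, ph12_model, ph13_model):
    """Beta at BPM 1 from measured/model phase advances to BPMs 2 and 3."""
    cot = lambda phi: 1.0 / math.tan(phi)
    return beta_model * (cot(ph12_meas) - cot(ph13_meas)) / (cot(ph12_model) - cot(ph13_model))

# Illustrative numbers only, not machine data:
print(beta_from_three_bpms(beta_model=120.0,
                           ph12_meas=0.62, ph13_meas=1.31,
                           ph12_model=0.60, ph13_model=1.30))
```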
LOCATING BURIED WORLD WAR 1 MUNITIONS WITH REMOTE SENSING AND GIS
Remote Sensing is a scientific discipline of non-contact monitoring. It includes a range of technologies that span from aerial photography to advanced spectral imaging and analytical methods. This Session is designed to demonstrate contemporary practical applications of remote ...
Species delimitation: A case study in a problematic ant taxon
USDA-ARS?s Scientific Manuscript database
Species delimitation has been invigorated as a discipline in systematics by an influx of new character sets, analytical methods, and conceptual advances. We use genetic data from 68 markers, combined with distributional, bioclimatic, and coloration information, to distinguish evolutionarily indepe...
CHALLENGES IN BIODEGRADATION OF TRACE ORGANIC CONTAMINANTS-GASOLINE OXYGENATES AND SEX HORMONES
Advances in analytical methods have led to the identification of several classes of organic chemicals that are associated with adverse environmental impacts. Two such classes of organic chemicals, gasoline oxygenates and sex hormones, are used to illustrate challenges associated ...
Computational thermo-fluid dynamics contributions to advanced gas turbine engine design
NASA Technical Reports Server (NTRS)
Graham, R. W.; Adamczyk, J. J.; Rohlik, H. E.
1984-01-01
The design practices for the gas turbine are traced throughout history, with particular emphasis on calculational or analytical methods. Three principal components of the gas turbine engine will be considered: the compressor, the combustor, and the turbine.
2012-02-09
Advanced signal processing analysis of laser-induced breakdown spectroscopy data for the discrimination of obsidian sources. [Fragmentary abstract:] ...different sources [12,13], but the analytical techniques needed for such analysis (XRD, INAA, and ICP-MS) are time consuming and require expensive... partial least-squares discriminant analysis (PLSDA) that used the SIMPLS solving method [33]. In the experiment design, a leave-one-sample-out (LOSO) para...
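For readers unfamiliar with the workflow named in the fragment, the sketch below shows a generic PLS-DA classification with leave-one-sample-out cross-validation on synthetic spectra. scikit-learn has no dedicated PLS-DA class, so regressing one-hot class labels with its NIPALS-based PLSRegression stands in for the SIMPLS implementation cited in the report; the data, class labels, and component count are invented for illustration.

```python
# Hedged sketch: PLS-DA with leave-one-sample-out (LOSO) cross-validation on synthetic spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(1)
spectra = rng.normal(size=(30, 200))          # 30 synthetic spectra x 200 channels
labels = np.repeat([0, 1, 2], 10)             # three hypothetical sources
onehot = np.eye(3)[labels]                    # dummy-coded class membership

hits = 0
for train, test in LeaveOneOut().split(spectra):
    model = PLSRegression(n_components=5).fit(spectra[train], onehot[train])
    hits += int(np.argmax(model.predict(spectra[test])) == labels[test][0])
print("LOSO accuracy on synthetic data:", hits / len(labels))
```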
Analytical Glycobiology at High Sensitivity: Current Approaches and Directions
Novotny, Milos V.; Alley, William R.; Mann, Benjamin F.
2013-01-01
This review summarizes the analytical advances made during the last several years in the structural and quantitative determination of glycoproteins in complex biological mixtures. The main analytical techniques used in the fields of glycomics and glycoproteomics involve different modes of mass spectrometry and their combinations with capillary separation methods such as microcolumn liquid chromatography and capillary electrophoresis. The need for high-sensitivity measurements has been emphasized in the oligosaccharide profiling used in the field of biomarker discovery through MALDI mass spectrometry. High-sensitivity profiling of both glycans and glycopeptides from biological fluids and tissue extracts has been aided significantly through lectin preconcentration and the use of affinity chromatography. PMID:22945852
NASA Astrophysics Data System (ADS)
Pritykin, F. N.; Nebritov, V. I.
2017-06-01
The structure of a graphic database is proposed that specifies the shape and the projected position of the work envelope of an android arm mechanism for various positions of forbidden zones known in advance. A technique for analytical specification of the work envelope, based on methods of analytical geometry and set theory, is presented. The results of these studies can be applied in creating knowledge bases for intelligent android control systems that operate autonomously in complex environments.
Thermal conductivity of Rene 41 honeycomb panels
NASA Astrophysics Data System (ADS)
Deriugin, V.
1980-12-01
Effective thermal conductivities of Rene 41 panels suitable for advanced space transportation vehicle structures were determined analytically and experimentally over the temperature range from 20.4 K (-423 F) to 1186 K (1675 F). The cryogenic data were obtained using a cryostat, whereas the high-temperature data were measured using a heat flow meter and a comparative thermal conductivity instrument, respectively. Comparisons were made between analysis and experimental data. Analytical methods appear to provide a reasonable definition of the honeycomb panel effective thermal conductivities.
Thermal conductivity of Rene 41 honeycomb panels. [space transportation vehicles
NASA Technical Reports Server (NTRS)
Deriugin, V.
1980-01-01
Effective thermal conductivities of Rene 41 panels suitable for advanced space transportation vehicle structures were determined analytically and experimentally over the temperature range from 20.4 K (-423 F) to 1186 K (1675 F). The cryogenic data were obtained using a cryostat, whereas the high-temperature data were measured using a heat flow meter and a comparative thermal conductivity instrument, respectively. Comparisons were made between analysis and experimental data. Analytical methods appear to provide a reasonable definition of the honeycomb panel effective thermal conductivities.
NASA Technical Reports Server (NTRS)
Bekey, G. A.
1971-01-01
Studies are summarized on the application of advanced analytical and computational methods to the development of mathematical models of human controllers in multiaxis manual control systems. Specific accomplishments include the following: (1) The development of analytical and computer methods for the measurement of random parameters in linear models of human operators. (2) Discrete models of human operator behavior in a multiple display situation were developed. (3) Sensitivity techniques were developed which make possible the identification of unknown sampling intervals in linear systems. (4) The adaptive behavior of human operators following particular classes of vehicle failures was studied and a model structure proposed.
Analytical methods for human biomonitoring of pesticides. A review.
Yusa, Vicent; Millet, Maurice; Coscolla, Clara; Roca, Marta
2015-09-03
Biomonitoring of both currently-used and banned-persistent pesticides is a very useful tool for assessing human exposure to these chemicals. In this review, we present current approaches and recent advances in the analytical methods for determining the biomarkers of exposure to pesticides in the most commonly used specimens, such as blood, urine, and breast milk, and in emerging non-invasive matrices such as hair and meconium. We critically discuss the main applications for sample treatment, and the instrumental techniques currently used to determine the most relevant pesticide biomarkers. We finally look at the future trends in this field. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Niknejadi, Pardis; Madey, John M. J.
2017-09-01
By the covariant statement of the distance in space-time separating transmitter and receivers, the emission and absorption of the retarded and advanced waves are all simultaneous. In other words, for signals carried on electromagnetic waves (advanced or retarded), the invariant interval (c dt)^2 - dr^2 between the emission of a wave and its absorption at the non-reflecting boundary is always identically zero. Utilizing this principle, we have previously explained the advantages of including the coherent radiation reaction force as part of the solution to the boundary value problem for FELs that radiate into "free space" (Self-Amplified Spontaneous Emission (SASE) FELs) and discussed how the advanced field of the absorber can interact with the radiating particles at the time of emission. Here we present an analytical test which verifies that a multilayer mirror can act as a band-pass filter and can contribute to microbunching in the electron beam. We also discuss the motivation, conditions, requirements, and method for testing this effect.
Application of advanced control techniques to aircraft propulsion systems
NASA Technical Reports Server (NTRS)
Lehtinen, B.
1984-01-01
Two programs are described which involve the application of advanced control techniques to the design of engine control algorithms. Multivariable control theory is used in the F100 MVCS (multivariable control synthesis) program to design controls which coordinate the control inputs for improved engine performance. A systematic method for handling a complex control design task is given. Methods of analytical redundancy are aimed at increasing control system reliability. The F100 DIA (detection, isolation, and accommodation) program, which investigates the use of software to replace or augment hardware redundancy for certain critical engine sensors, is described.
Interactive program for analysis and design problems in advanced composites technology
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Swedlow, J. L.
1971-01-01
During the past year an experimental program in the fracture of advanced fiber composites has been completed. The experimental program has given direction to additional experimental and theoretical work. A synthesis program for designing low weight multifastener joints in composites is proposed, based on extensive analytical background. A number of failed joints have been thoroughly analyzed to evaluate the failure hypothesis used in the synthesis procedure. Finally, a new solution is reported for isotropic and anisotropic laminates using the boundary-integral method. The solution method offers significant savings of computer core and time for important problems.
Introduction: Ecological knowledge, theory and information in space and time [Chapter 1
Samuel A. Cushman; Falk Huettmann
2010-01-01
A central theme of this book is that there is a strong mutual dependence between explanatory theory, available data and analytical method in determining the lurching progress of ecological knowledge (Fig. 1.1). The two central arguments are first that limits in each of theory, data and method have continuously constrained advances in understanding ecological systems...
Review of Processing and Analytical Methods for Francisella ...
The etiological agent of tularemia, Francisella tularensis, is a resilient organism in the environment and can be acquired in many ways (infectious aerosols and dust, contaminated food and water, infected carcasses, and arthropod bites). However, isolating F. tularensis from environmental samples can be challenging due to its nutritionally fastidious and slow-growing nature. In order to determine the current state of the science regarding available processing and analytical methods for detection and recovery of F. tularensis from water and soil matrices, a review of the literature was conducted. In the reviewed literature, culture, immunoassays, and genomic identification were the most commonly reported methods for F. tularensis detection in environmental samples. Other methods included combined culture and genomic analysis for rapid quantification of viable microorganisms and the use of one assay to identify multiple pathogens from a single sample. Gaps in the literature identified during this review suggest that further work to integrate culture and genomic identification would advance our ability to detect and to assess the viability of Francisella spp. The optimization of DNA extraction, whole genome amplification with inhibition-resistant polymerases, and multiagent microarray detection would also advance biothreat detection.
Tsikas, Dimitrios
2017-07-15
Tyrosine and tyrosine residues in proteins are attacked by the reactive oxygen and nitrogen species peroxynitrite (O=N-OO-) to generate 3-nitrotyrosine (3-NT) and 3-nitrotyrosine-proteins (3-NTProt), respectively. 3-NT and 3-NTProt are widely accepted as biomarkers of nitr(os)ative stress. Over the years many different analytical methods have been reported for 3-NT and 3-NTProt. Reported concentrations often differ by more than three orders of magnitude, indicative of serious analytical problems. Strategies to overcome pre-analytical and analytical shortcomings and pitfalls have been proposed. The present review investigated whether recently published work on the quantitative measurement of biological 3-nitrotyrosine adequately considered the analytical past of this biomolecule. 3-Nitrotyrosine was taken as a representative of biomolecules that occur in biological samples in the pM-to-nM concentration range. This examination revealed that in many cases the main protagonists involved in the publication of scientific work, i.e., authors, reviewers and editors, failed to do so. Learning from the analytical history of 3-nitrotyrosine means advancing analytical and biological science and implies the following key issues. (1) Choosing the most reliable analytical approach in terms of sensitivity and accuracy; at present this is best achieved by stable-isotope dilution tandem mass spectrometry coupled with gas chromatography (GC-MS/MS) or liquid chromatography (LC-MS/MS). (2) Minimizing artificial formation of 3-nitrotyrosine during sample work-up, a major pitfall in 3-nitrotyrosine analysis. (3) Adequately validating the final method in the intended biological matrix and over the established concentration range. (4) Inviting experts in the field to critically evaluate the novelty and reliability of the proposed analytical method, placing special emphasis on the compliance of the analytical outcome with 3-nitrotyrosine concentrations obtained by validated GC-MS/MS and LC-MS/MS methods. Copyright © 2017 Elsevier B.V. All rights reserved.
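To make the recommended quantification principle concrete, the sketch below (not taken from the review) shows the arithmetic of stable-isotope dilution: the sample is spiked with a known amount of an isotope-labeled analogue, and the analyte amount follows from the measured peak-area ratio. For simplicity it assumes equal detector response for the labeled and unlabeled forms; real GC-MS/MS or LC-MS/MS assays use a multi-point calibration of area ratio versus amount ratio.

```python
# Hedged sketch of stable-isotope dilution quantification (illustrative numbers only).
def isotope_dilution_conc(area_analyte, area_labeled, amount_labeled_pmol, sample_volume_ml):
    """Analyte concentration (pmol/mL) from the analyte/internal-standard peak-area ratio."""
    amount_analyte_pmol = (area_analyte / area_labeled) * amount_labeled_pmol
    return amount_analyte_pmol / sample_volume_ml

# Example: 0.5 mL of sample spiked with 10 pmol of the labeled internal standard.
print(isotope_dilution_conc(area_analyte=1.2e4, area_labeled=4.8e4,
                            amount_labeled_pmol=10.0, sample_volume_ml=0.5), "pmol/mL")
```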
Review of spectral imaging technology in biomedical engineering: achievements and challenges.
Li, Qingli; He, Xiaofu; Wang, Yiting; Liu, Hongying; Xu, Dongrong; Guo, Fangmin
2013-10-01
Spectral imaging is a technology that integrates conventional imaging and spectroscopy to get both spatial and spectral information from an object. Although this technology was originally developed for remote sensing, it has been extended to the biomedical engineering field as a powerful analytical tool for biological and biomedical research. This review introduces the basics of spectral imaging, imaging methods, current equipment, and recent advances in biomedical applications. The performance and analytical capabilities of spectral imaging systems for biological and biomedical imaging are discussed. In particular, the current achievements and limitations of this technology in biomedical engineering are presented. The benefits and development trends of biomedical spectral imaging are highlighted to provide the reader with an insight into the current technological advances and its potential for biomedical research.
Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D.; Baranov, Vladimir I.; Nitz, Mark; Winnik, Mitchell A.
2008-01-01
Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping. PMID:19122859
Liu, Yuxuan; Huang, Xiangyi; Ren, Jicun
2016-01-01
CE is an ideal analytical method for extremely volume-limited biological microenvironments. However, the small injection volume makes it a challenge to achieve highly sensitive detection. Chemiluminescence (CL) detection provides low background and excellent sensitivity because it requires no light source. The coupling of CL with CE and microchip electrophoresis (MCE) has become a powerful analytical method. So far, this method has been widely applied to chemical analysis, bioassays, drug analysis, and environmental analysis. In this review, we first introduce developments in CE-CL and MCE-CL systems, and then place the emphasis on applications from the last 10 years. Finally, we discuss future prospects. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Automated Deployment of Advanced Controls and Analytics in Buildings
NASA Astrophysics Data System (ADS)
Pritoni, Marco
Buildings use 40% of primary energy in the US. Recent studies show that developing energy analytics and enhancing control strategies can significantly improve their energy performance. However, the deployment of advanced control software applications has been mostly limited to academic studies. Larger-scale implementations are prevented by the significant engineering time and customization required, due to significant differences among buildings. This study demonstrates how physics-inspired data-driven models can be used to develop portable analytics and control applications for buildings. Specifically, I demonstrate application of these models in all phases of the deployment of advanced controls and analytics in buildings: in the first phase, "Site Preparation and Interface with Legacy Systems" I used models to discover or map relationships among building components, automatically gathering metadata (information about data points) necessary to run the applications. During the second phase: "Application Deployment and Commissioning", models automatically learn system parameters, used for advanced controls and analytics. In the third phase: "Continuous Monitoring and Verification" I utilized models to automatically measure the energy performance of a building that has implemented advanced control strategies. In the conclusions, I discuss future challenges and suggest potential strategies for these innovative control systems to be widely deployed in the market. This dissertation provides useful new tools in terms of procedures, algorithms, and models to facilitate the automation of deployment of advanced controls and analytics and accelerate their wide adoption in buildings.
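As one concrete, purely illustrative example of a physics-inspired data-driven model of the kind the abstract describes, a lumped first-order resistance-capacitance (RC) zone model can be fitted to measured temperatures by least squares. The model form, synthetic data, and parameter names below are assumptions of this sketch, not the dissertation's implementation.

```python
# Hedged sketch: fit T_in[k+1] = T_in[k] + a*(T_out[k] - T_in[k]) + b*Q[k],
# where a = dt/(R*C) and b = dt/C, by ordinary least squares on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
dt, n = 300.0, 500                                   # 5-minute samples
t_out = 10.0 + 5.0 * np.sin(2 * np.pi * np.arange(n) * dt / 86400.0)
q_hvac = rng.uniform(0.0, 2000.0, size=n)            # heating power in W (made up)
a_true, b_true = 0.02, 1.0e-5
t_in = np.empty(n); t_in[0] = 20.0
for k in range(n - 1):
    t_in[k + 1] = t_in[k] + a_true * (t_out[k] - t_in[k]) + b_true * q_hvac[k]
t_in += rng.normal(scale=0.05, size=n)               # sensor noise

X = np.column_stack([t_out[:-1] - t_in[:-1], q_hvac[:-1]])
(a_hat, b_hat), *_ = np.linalg.lstsq(X, np.diff(t_in), rcond=None)
print("estimated R*C (s):", dt / a_hat, "   estimated C (J/K):", dt / b_hat)
```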
Burger, Jessica L; Lovestead, Tara M; Bruno, Thomas J
2016-03-17
As the sources of natural gas become more diverse, the trace constituents of the C6+ fraction are of increasing interest. Analysis of fuel gas (including natural gas) for compounds with more than 6 carbon atoms (the C6+ fraction) has historically been complex and expensive. Hence, this is a procedure that is used most often in troubleshooting rather than for day-to-day operations. The C6+ fraction affects gas quality issues and safety considerations such as anomalies associated with odorization. Recent advances in dynamic headspace vapor collection can be applied to this analysis and provide a faster, less complex alternative for compositional determination of the C6+ fraction of natural gas. Porous layer open tubular capillaries maintained at low temperatures (PLOT-cryo) form the basis of a dynamic headspace sampling method that was developed at NIST initially for explosives in 2009. This method has recently been advanced by combining multiple PLOT capillary traps into one "bundle," or wafer, resulting in a device that allows the rapid trapping of relatively large amounts of analyte. In this study, natural gas analytes were collected by flowing natural gas from the laboratory (gas out of the wall) or a prepared surrogate gas through a chilled wafer. The analytes were then removed from the PLOT-cryo wafer by thermal desorption and subsequent flushing of the wafer with helium. Gas chromatography (GC) with mass spectrometry (MS) was then used to identify the analytes.
Characterization of spacecraft humidity condensate
NASA Technical Reports Server (NTRS)
Muckle, Susan; Schultz, John R.; Sauer, Richard L.
1994-01-01
When construction of Space Station Freedom reaches the Permanent Manned Capability (PMC) stage, the Water Recovery and Management Subsystem will be fully operational such that (distilled) urine, spent hygiene water, and humidity condensate will be reclaimed to provide water of potable quality. The reclamation technologies currently baselined to process these waste waters include adsorption, ion exchange, catalytic oxidation, and disinfection. To ensure that the baseline technologies will be able to effectively remove those compounds presenting a health risk to the crew, the National Research Council has recommended that additional information be gathered on specific contaminants in waste waters representative of those to be encountered on the Space Station. With the application of new analytical methods and the analysis of waste water samples more representative of the Space Station environment, advances in the identification of the specific contaminants continue to be made. Efforts by the Water and Food Analytical Laboratory at JSC were successful in enlarging the database of contaminants in humidity condensate. These efforts have not only included the chemical characterization of condensate generated during ground-based studies, but most significantly the characterization of cabin and Spacelab condensate generated during Shuttle missions. The analytical results presented in this paper will be used to show how the composition of condensate varies amongst enclosed environments and thus the importance of collecting condensate from an environment close to that of the proposed Space Station. Although advances were made in the characterization of space condensate, complete characterization, particularly of the organics, requires further development of analytical methods.
Using quantum chemistry muscle to flex massive systems: How to respond to something perturbing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertoni, Colleen
Computational chemistry uses the theoretical advances of quantum mechanics and the algorithmic and hardware advances of computer science to give insight into chemical problems. It is currently possible to do highly accurate quantum chemistry calculations, but the most accurate methods are very computationally expensive. Thus it is only feasible to do highly accurate calculations on small molecules, since typically more computationally efficient methods are also less accurate. The overall goal of my dissertation work has been to try to decrease the computational expense of calculations without decreasing the accuracy. In particular, my dissertation work focuses on fragmentation methods, intermolecular interaction methods, analytic gradients, and taking advantage of new hardware.
Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches
NASA Technical Reports Server (NTRS)
Farassat, Fereidoun; Casper, Jay H.
2006-01-01
In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and the propagation methods need further development before the fully numerical method becomes useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.
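For reference, one common statement of the permeable-surface FW-H equation (notation varies by author; this form assumes the surface function f is normalized so that |∇f| = 1, and it is supplied here for context rather than quoted from the paper) is

\[
\left(\frac{\partial^2}{\partial t^2}-c^2\nabla^2\right)\bigl[p'(\mathbf{x},t)\,H(f)\bigr]
=\frac{\partial^2}{\partial x_i\,\partial x_j}\bigl[T_{ij}\,H(f)\bigr]
-\frac{\partial}{\partial x_i}\bigl[L_i\,\delta(f)\bigr]
+\frac{\partial}{\partial t}\bigl[Q\,\delta(f)\bigr],
\]

where \(Q=\rho_0 v_n+\rho\,(u_n-v_n)\) is the thickness (mass-flux) source, \(L_i=P_{ij}\hat n_j+\rho u_i\,(u_n-v_n)\) is the loading source, \(T_{ij}\) is the Lighthill stress tensor, and \(H\) and \(\delta\) are the Heaviside and Dirac functions on the data surface \(f=0\).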
Applications of Raman Spectroscopy in Biopharmaceutical Manufacturing: A Short Review.
Buckley, Kevin; Ryder, Alan G
2017-06-01
The production of active pharmaceutical ingredients (APIs) is currently undergoing its biggest transformation in a century. The changes are based on the rapid and dramatic introduction of protein- and macromolecule-based drugs (collectively known as biopharmaceuticals) and can be traced back to the huge investment in biomedical science (in particular in genomics and proteomics) that has been ongoing since the 1970s. Biopharmaceuticals (or biologics) are manufactured using biological-expression systems (such as mammalian, bacterial, insect cells, etc.) and have spawned a large (>€35 billion sales annually in Europe) and growing biopharmaceutical industry (BioPharma). The structural and chemical complexity of biologics, combined with the intricacy of cell-based manufacturing, imposes a huge analytical burden to correctly characterize and quantify both processes (upstream) and products (downstream). In small molecule manufacturing, advances in analytical and computational methods have been extensively exploited to generate process analytical technologies (PAT) that are now used for routine process control, leading to more efficient processes and safer medicines. In the analytical domain, biologic manufacturing is considerably behind and there is both a huge scope and need to produce relevant PAT tools with which to better control processes, and better characterize product macromolecules. Raman spectroscopy, a vibrational spectroscopy with a number of useful properties (nondestructive, non-contact, robustness) has significant potential advantages in BioPharma. Key among them are intrinsically high molecular specificity, the ability to measure in water, the requirement for minimal (or no) sample pre-treatment, the flexibility of sampling configurations, and suitability for automation. Here, we review and discuss a representative selection of the more important Raman applications in BioPharma (with particular emphasis on mammalian cell culture). The review shows that the properties of Raman have been successfully exploited to deliver unique and useful analytical solutions, particularly for online process monitoring. However, it also shows that its inherent susceptibility to fluorescence interference and the weakness of the Raman effect mean that it can never be a panacea. In particular, Raman-based methods are intrinsically limited by the chemical complexity and wide analyte-concentration-profiles of cell culture media/bioprocessing broths which limit their use for quantitative analysis. Nevertheless, with appropriate foreknowledge of these limitations and good experimental design, robust analytical methods can be produced. In addition, new technological developments such as time-resolved detectors, advanced lasers, and plasmonics offer potential of new Raman-based methods to resolve existing limitations and/or provide new analytical insights.
Isolation and analysis of ginseng: advances and challenges
Wang, Chong-Zhi
2011-01-01
Ginseng occupies a prominent position in the list of best-selling natural products in the world. Because of its complex constituents, multidisciplinary techniques are needed to validate the analytical methods that support ginseng’s use worldwide. In the past decade, rapid development of technology has advanced many aspects of ginseng research. The aim of this review is to illustrate the recent advances in the isolation and analysis of ginseng, and to highlight their new applications and challenges. Emphasis is placed on recent trends and emerging techniques. The current article reviews the literature between January 2000 and September 2010. PMID:21258738
NASA Astrophysics Data System (ADS)
Iqbal, M.; Islam, A.; Hossain, A.; Mustaque, S.
2016-12-01
Multi-Criteria Decision Making (MCDM) is an advanced analytical method for evaluating an appropriate result or decision in a multiple-criterion environment, allowing a logical decision to be reached among conflicting alternatives. Geospatial approaches (e.g., remote sensing and GIS) provide a complementary set of techniques for collecting, processing, and analyzing diverse spatial data. GIS and remote sensing together with MCDM techniques can therefore form an effective platform for solving complex decision-making problems, and this combination has been used effectively for site selection in urban solid waste management. The most popular MCDM technique is the Weighted Linear Combination (WLC) method, while the Analytical Hierarchy Process (AHP) is another widely used and consistent decision-making technique. Consequently, the main objective of this study is to develop an AHP model as the MCDM technique, integrated with a Geographic Information System (GIS), to select a suitable landfill site for urban solid waste management. To protect the urban environment in a sustainable way, municipal waste needs an appropriate landfill site selected with consideration of the environmental, geological, social, and technical aspects of the region. An MCDM model was generated from five criterion classes related to environmental, geological, social, and technical factors using the AHP method, and the resulting weights were input into GIS to produce the final suitability map for landfill siting. The final suitable locations amount to 12.2% of the total study area, corresponding to 22.89 km2. The study area is Keraniganj sub-district of Dhaka district in Bangladesh, a densely populated area that currently has an unmanaged waste management system and, in particular, lacks suitable landfill sites for waste dumping.
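To make the AHP weighting step above concrete, the sketch below derives criterion weights from a Saaty-style pairwise comparison matrix and checks the consistency ratio; the criterion names and judgment values are hypothetical illustrations, not the study's actual data.

```python
import numpy as np

# Minimal AHP weighting sketch (hypothetical pairwise judgments, not the study's data).
# Criteria loosely follow the classes named in the abstract; "land_cover" is an
# invented fifth criterion for illustration.
criteria = ["environmental", "geological", "social", "technical", "land_cover"]

# Saaty-scale pairwise comparison matrix A, where A[i, j] is the judged importance
# of criterion i relative to criterion j (reciprocal by construction).
A = np.array([
    [1,   3,   5,   3,   7],
    [1/3, 1,   3,   1,   5],
    [1/5, 1/3, 1,   1/3, 3],
    [1/3, 1,   3,   1,   5],
    [1/7, 1/5, 1/3, 1/5, 1],
])

# Priority weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio (CR); judgments are usually accepted when CR < 0.1.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]   # random consistency index
cr = ci / ri

for name, weight in zip(criteria, w):
    print(f"{name:13s} {weight:.3f}")
print(f"consistency ratio = {cr:.3f}")
```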
The Analytical Chemistry of Drug Monitoring in Athletes
NASA Astrophysics Data System (ADS)
Bowers, Larry D.
2009-07-01
The detection and deterrence of the abuse of performance-enhancing drugs in sport are important to maintaining a level playing field among athletes and to decreasing the risk to athletes’ health. The World Anti-Doping Program consists of six documents, three of which play a role in analytical development: The World Anti-Doping Code, The List of Prohibited Substances and Methods, and The International Standard for Laboratories. Among the classes of prohibited substances, three have given rise to the most recent analytical developments in the field: anabolic agents; peptide and protein hormones; and methods to increase oxygen delivery to the tissues, including recombinant erythropoietin. Methods for anabolic agents, including designer steroids, have been enhanced through the use of liquid chromatography/tandem mass spectrometry and gas chromatography/combustion/isotope-ratio mass spectrometry. Protein and peptide identification and quantification have benefited from advances in liquid chromatography/tandem mass spectrometry. Incorporation of techniques such as flow cytometry and isoelectric focusing has supported the detection of blood doping.
DEVELOPING THE TRANSDISCIPLINARY AGING RESEARCH AGENDA: NEW DEVELOPMENTS IN BIG DATA.
Callaghan, Christian William
2017-07-19
In light of dramatic advances in big data analytics and the application of these advances in certain scientific fields, new potentialities exist for breakthroughs in aging research. Translating these new potentialities to research outcomes for aging populations, however, remains a challenge, as underlying technologies which have enabled exponential increases in 'big data' have not yet enabled a commensurate era of 'big knowledge,' or similarly exponential increases in biomedical breakthroughs. Debates also reveal differences in the literature, with some arguing that big data analytics heralds a new era associated with the 'end of theory,' one that makes the scientific method obsolete, in which correlation supersedes causation and science can advance without theory and hypothesis testing. On the other hand, others argue theory cannot be subordinate to data, no matter how comprehensive data coverage can ultimately become. Given these two tensions, namely between exponential increases in data absent exponential increases in biomedical research outputs, and between the promise of comprehensive data coverage and data-driven inductive versus theory-driven deductive modes of enquiry, this paper seeks to provide a critical review of theory and literature that offers useful perspectives on developments in big data analytics and their theoretical implications for aging research.
Advanced Doubling Adding Method for Radiative Transfer in Planetary Atmospheres
NASA Astrophysics Data System (ADS)
Liu, Quanhua; Weng, Fuzhong
2006-12-01
The doubling adding method (DA) is one of the most accurate tools for detailed multiple-scattering calculations. The principle of the method goes back to the nineteenth century in a problem dealing with reflection and transmission by glass plates. Since then the doubling adding method has been widely used as a reference tool for other radiative transfer models. The method has never been used in operational applications owing to tremendous demand on computational resources from the model. This study derives an analytical expression replacing the most complicated thermal source terms in the doubling adding method. The new development is called the advanced doubling adding (ADA) method. Thanks also to the efficiency of matrix and vector manipulations in FORTRAN 90/95, the advanced doubling adding method is about 60 times faster than the doubling adding method. The radiance (i.e., forward) computation code of ADA is easily translated into tangent linear and adjoint codes for radiance gradient calculations. The simplicity in forward and Jacobian computation codes is very useful for operational applications and for the consistency between the forward and adjoint calculations in satellite data assimilation.
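The core of the doubling adding idea described above can be illustrated with a minimal scalar sketch (no angular discretization, no thermal sources, invented layer properties): two layers are combined with the classical adding equations, and a thick layer is built up by repeatedly doubling a thin one. This is only the basic doubling/adding bookkeeping, not the ADA formulation itself.

```python
import numpy as np

def add_layers(R1, T1, R2, T2):
    """Combine two plane-parallel layers (layer 1 above layer 2) using the
    classical adding equations for symmetric layers; the geometric series of
    inter-layer reflections is summed by the 1/(1 - R1*R2) factor."""
    denom = 1.0 - R1 * R2
    R = R1 + T1 * R2 * T1 / denom
    T = T2 * T1 / denom
    return R, T

# Start from a thin layer with assumed reflection/transmission and double it.
R, T = 0.02, 0.97          # hypothetical thin-layer properties (1% absorbed)
for _ in range(6):          # 6 doublings -> a layer 64x as thick
    R, T = add_layers(R, T, R, T)
print(f"R = {R:.4f}, T = {T:.4f}, absorbed = {1 - R - T:.4f}")
```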
Recent Work in Hybrid Radiation Transport Methods with Applications to Commercial Nuclear Power
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kulesza, Joel A.
This talk will begin with an overview of hybrid radiation transport methods followed by a discussion of the author’s work to advance current capabilities. The talk will then describe applications for these methods in commercial nuclear power reactor analyses and techniques for experimental validation. When discussing these analytical and experimental activities, the importance of technical standards such as those created and maintained by ASTM International will be demonstrated.
The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.
2014-10-01
Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. To create new tools and techniques that achieve this goal requires that researchers have an understanding of analytical questions to be addressed, data that illustrates the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems or operational data that is used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.
A graph algebra for scalable visual analytics.
Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V
2012-01-01
Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
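As a rough illustration of the atomic operators named above, the following sketch implements selection (an induced subgraph under a predicate) and aggregation (collapsing nodes that share an attribute into weighted super-nodes) on a toy attribute graph; the data model and operator signatures are assumptions for illustration, not the authors' formal algebra.

```python
# Toy graph-algebra operators: selection and aggregation on an attribute graph.

def select(nodes, edges, pred):
    """Induced subgraph on nodes whose attributes satisfy a predicate."""
    kept = {n for n, attrs in nodes.items() if pred(attrs)}
    return ({n: nodes[n] for n in kept},
            [(u, v) for (u, v) in edges if u in kept and v in kept])

def aggregate(nodes, edges, key):
    """Collapse nodes sharing an attribute value into super-nodes;
    super-edge weight = number of original edges between the groups."""
    group = {n: attrs[key] for n, attrs in nodes.items()}
    super_nodes = {g: {key: g} for g in set(group.values())}
    weights = {}
    for u, v in edges:
        gu, gv = group[u], group[v]
        if gu != gv:
            e = tuple(sorted((gu, gv)))
            weights[e] = weights.get(e, 0) + 1
    return super_nodes, weights

nodes = {"a": {"dept": "x"}, "b": {"dept": "x"}, "c": {"dept": "y"}, "d": {"dept": "y"}}
edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")]
print(select(nodes, edges, lambda at: at["dept"] == "x"))
print(aggregate(nodes, edges, "dept"))
```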
NASA Astrophysics Data System (ADS)
Recent advances in the analytical and numerical treatment of physical and engineering problems are discussed in reviews and reports. Topics addressed include fluid mechanics, numerical methods for differential equations, FEM approaches, and boundary-element methods. Consideration is given to optimization, decision theory, stochastics, actuarial mathematics, applied mathematics and mathematical physics, and numerical analysis.
Gear and Transmission Research at NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Townsend, Dennis P.
1997-01-01
This paper is a review of some of the research work of the NASA Lewis Research Center Mechanical Components Branch. It includes a brief review of the NASA Lewis Research Center and the Mechanical Components Branch. The research topics discussed are crack propagation of gear teeth, gear noise of spiral bevel and other gears, design optimization methods, methods we have investigated for transmission diagnostics, the analytical and experimental study of gear thermal conditions, the analytical and experimental study of split torque systems, the evaluation of several new advanced gear steels and transmission lubricants and the evaluation of various aircraft transmissions. The area of research needs for gearing and transmissions is also discussed.
Advances in analytical instrumentation have not only increased the number and types of chemicals measured, but reduced the quantitation limits, allowing these chemicals to be detected at progressively lower concentrations in various environmental matrices. Such analytical advanc...
Endo, Yasushi
2018-01-01
Edible fats and oils are among the basic components of the human diet, along with carbohydrates and proteins, and they are the source of high energy and essential fatty acids such as linoleic and linolenic acids. Edible fats and oils are used for pan- and deep-frying, and in salad dressing, mayonnaise and processed foods such as chocolates and cream. The physical and chemical properties of edible fats and oils can affect the quality of oil foods and hence must be evaluated in detail. The physical characteristics of edible fats and oils include color, specific gravity, refractive index, melting point, congeal point, smoke point, flash point, fire point, and viscosity, while the chemical characteristics include acid value, saponification value, iodine value, fatty acid composition, trans isomers, triacylglycerol composition, unsaponifiable matter (sterols, tocopherols) and minor components (phospholipids, chlorophyll pigments, glycidyl fatty acid esters). Peroxide value, p-anisidine value, carbonyl value, polar compounds and polymerized triacylglycerols are indexes of the deterioration of edible fats and oils. This review describes the analytical methods to evaluate the quality of edible fats and oils, especially the Standard Methods for Analysis of Fats, Oils and Related Materials edited by the Japan Oil Chemists' Society (the JOCS standard methods) and advanced methods.
Harel, Elad; Schröder, Leif; Xu, Shoujun
2008-01-01
Nuclear magnetic resonance (NMR) is a well-established analytical technique in chemistry. The ability to precisely control the nuclear spin interactions that give rise to the NMR phenomenon has led to revolutionary advances in fields as diverse as protein structure determination and medical diagnosis. Here, we discuss methods for increasing the sensitivity of magnetic resonance experiments, moving away from the paradigm of traditional NMR by separating the encoding and detection steps of the experiment. This added flexibility allows for diverse applications ranging from lab-on-a-chip flow imaging and biological sensors to optical detection of magnetic resonance imaging at low magnetic fields. We aim to compare and discuss various approaches for a host of problems in material science, biology, and physics that differ from the high-field methods routinely used in analytical chemistry and medical imaging.
Yan, Guanyong; Wang, Xiangzhao; Li, Sikun; Yang, Jishuo; Xu, Dongbo; Erdmann, Andreas
2014-03-10
We propose an in situ aberration measurement technique based on an analytical linear model of through-focus aerial images. The aberrations are retrieved from aerial images of six isolated space patterns, which have the same width but different orientations. The imaging formulas of the space patterns are investigated and simplified, and then an analytical linear relationship between the aerial image intensity distributions and the Zernike coefficients is established. The linear relationship is composed of linear fitting matrices and rotation matrices, which can be calculated numerically in advance and utilized to retrieve Zernike coefficients. Numerical simulations using the lithography simulators PROLITH and Dr.LiTHO demonstrate that the proposed method can measure wavefront aberrations up to Z(37). Experiments on a real lithography tool confirm that our method can monitor lens aberration offset with an accuracy of 0.7 nm.
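The retrieval step implied by the linear model above can be sketched as a least-squares inversion: assuming a precomputed fitting (sensitivity) matrix that maps Zernike coefficients to the stacked through-focus intensity perturbation, the coefficients are recovered from the measured images. The matrix and noise levels below are random placeholders, not values from the paper.

```python
import numpy as np

# Illustrative retrieval step only: assume a precomputed linear fitting matrix S
# maps Zernike coefficients z to the stacked through-focus intensity perturbation
# dI = S @ z, as in a linear aerial-image model; S here is a random stand-in.
rng = np.random.default_rng(0)
n_pixels, n_zernike = 600, 33                    # e.g. Z5..Z37
S = rng.normal(size=(n_pixels, n_zernike))       # precomputed sensitivities (placeholder)
z_true = rng.normal(scale=0.01, size=n_zernike)  # "unknown" aberrations, waves

dI = S @ z_true + rng.normal(scale=1e-3, size=n_pixels)  # measured minus nominal image

# Least-squares retrieval of the Zernike coefficients from the aerial images.
z_hat, *_ = np.linalg.lstsq(S, dI, rcond=None)
print("max retrieval error (waves):", np.abs(z_hat - z_true).max())
```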
Laboratory Investigations Of Mechanisms For 1,4-Dioxane Destruction By Ozone In Water (Presentation)
Advances in analytical detection methods have made it possible to quantify 1,4-dioxane contamination in groundwater, even at well-characterized sites where it had not been previously detected. Although 1,4-dioxane is difficult to treat because of its chemical and physical propert...
While knowledge of exposure is fundamental to assessing and mitigating risks, exposure information has been costly and difficult to generate. Driven by major scientific advances in analytical methods, biomonitoring, computational tools, and a newly articulated vision for a great...
A Graphical Approach to Teaching Amplifier Design at the Undergraduate Level
ERIC Educational Resources Information Center
Assaad, R. S.; Silva-Martinez, J.
2009-01-01
Current methods of teaching basic amplifier design at the undergraduate level need further development to match today's technological advances. The general class approach to amplifier design is analytical and heavily based on mathematical manipulations. However, the students' mathematical abilities are generally modest, creating a void in which…
Laboratory Investigation Of Mechanisms For 1,4-Dioxane Destruction By Ozone In Water
Advances in analytical detection methods have made it possible to quantify 1,4-dioxane contamination in groundwater, even at well-characterized sites where it had not been previously detected. Although 1,4-dioxane is difficult to treat because of its chemical and physical proper...
DOT National Transportation Integrated Search
2010-01-01
Current AASHTO provisions for the conventional load rating of flat slab bridges rely on the equivalent strip method of analysis for determining live load effects, which is generally regarded as overly conservative by many professional engineers. A...
Big–deep–smart data in imaging for guiding materials design
Kalinin, Sergei V.; Sumpter, Bobby G.; Archibald, Richard K.
2015-09-23
Harnessing big data, deep data, and smart data from state-of-the-art imaging might accelerate the design and realization of advanced functional materials. Here we discuss new opportunities in materials design enabled by the availability of big data in imaging and data analytics approaches, including their limitations, in material systems of practical interest. We specifically focus on how these tools might help realize new discoveries in a timely manner. Such methodologies are particularly appropriate to explore in light of continued improvements in atomistic imaging, modelling and data analytics methods.
Metabolomics of Genetically Modified Crops
Simó, Carolina; Ibáñez, Clara; Valdés, Alberto; Cifuentes, Alejandro; García-Cañas, Virginia
2014-01-01
Metabolomic-based approaches are increasingly applied to analyse genetically modified organisms (GMOs), making it possible to obtain broader and deeper information on the composition of GMOs compared to that obtained from traditional analytical approaches. The combination in metabolomics of advanced analytical methods and bioinformatics tools provides wide chemical compositional data that helps to corroborate (or not) the substantial equivalence of GMOs and to detect unintended changes resulting from genetic transformation. This review provides insight into recent progress in metabolomics studies on transgenic crops, focusing mainly on papers published in the last decade. PMID:25334064
Sandetskaya, Natalia; Allelein, Susann; Kuhlmeier, Dirk
2013-12-01
A combination of Micro-Electro-Mechanical Systems and nanoscale structures allows for the creation of novel miniaturized devices, which broaden the boundaries of diagnostic approaches. Some materials possess unique properties at the nanolevel, which are different from those in bulk materials. In the last few years these properties, as well as methods for the production, design and operation of nanoobjects, have become a focus of interest for many researchers. Intensive research and development work has resulted in numerous inventions exploiting nanotechnology in miniaturized systems. Modern technical and laboratory equipment allows for the precise control of such devices, making them suitable for sensitive and accurate detection of analytes. The current review highlights recent patents in the field of nanotechnology in microdevices applicable for medical, environmental or food analysis. The paper covers the structural and functional basis of such systems and describes specific embodiments in three principal branches: application of nanoparticles, nanofluidics, and nanosensors in miniaturized systems for advanced analytics and diagnostics. This overview is an update of an earlier review article.
Reagentless, Structure-Switching, Electrochemical Aptamer-Based Sensors
NASA Astrophysics Data System (ADS)
Schoukroun-Barnes, Lauren R.; Macazo, Florika C.; Gutierrez, Brenda; Lottermoser, Justine; Liu, Juan; White, Ryan J.
2016-06-01
The development of structure-switching, electrochemical, aptamer-based sensors over the past ˜10 years has led to a variety of reagentless sensors capable of analytical detection in a range of sample matrices. The crux of this methodology is the coupling of target-induced conformation changes of a redox-labeled aptamer with electrochemical detection of the resulting altered charge transfer rate between the redox molecule and electrode surface. Using aptamer recognition expands the highly sensitive detection ability of electrochemistry to a range of previously inaccessible analytes. In this review, we focus on the methods of sensor fabrication and how sensor signaling is affected by fabrication parameters. We then discuss recent studies addressing the fundamentals of sensor signaling as well as quantitative characterization of the analytical performance of electrochemical aptamer-based sensors. Although the limits of detection of reported electrochemical aptamer-based sensors do not often reach that of gold-standard methods such as enzyme-linked immunosorbent assays, the operational convenience of the sensor platform enables exciting analytical applications that we address. Using illustrative examples, we highlight recent advances in the field that impact important areas of analytical chemistry. Finally, we discuss the challenges and prospects for this class of sensors.
NASA Technical Reports Server (NTRS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.;
2016-01-01
This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2-600 Gpc(exp -3) yr(exp -1). Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
NASA Technical Reports Server (NTRS)
Halford, Gary R.
1993-01-01
The evolution of high-temperature, creep-fatigue, life-prediction methods used for cyclic crack initiation is traced from inception in the late 1940's. The methods reviewed are material models as opposed to structural life prediction models. Material life models are used by both structural durability analysts and by material scientists. The latter use micromechanistic models as guidance to improve a material's crack initiation resistance. Nearly one hundred approaches and their variations have been proposed to date. This proliferation poses a problem in deciding which method is most appropriate for a given application. Approaches were identified as being combinations of thirteen different classifications. This review is intended to aid both developers and users of high-temperature fatigue life prediction methods by providing a background from which choices can be made. The need for high-temperature, fatigue-life prediction methods followed immediately on the heels of the development of large, costly, high-technology industrial and aerospace equipment immediately following the second world war. Major advances were made in the design and manufacture of high-temperature, high-pressure boilers and steam turbines, nuclear reactors, high-temperature forming dies, high-performance poppet valves, aeronautical gas turbine engines, reusable rocket engines, etc. These advances could no longer be accomplished simply by trial and error using the 'build-em and bust-em' approach. Development lead times were too great and costs too prohibitive to retain such an approach. Analytic assessments of anticipated performance, cost, and durability were introduced to cut costs and shorten lead times. The analytic tools were quite primitive at first and out of necessity evolved in parallel with hardware development. After forty years more descriptive, more accurate, and more efficient analytic tools are being developed. These include thermal-structural finite element and boundary element analyses, advanced constitutive stress-strain-temperature-time relations, and creep-fatigue-environmental models for crack initiation and propagation. The high-temperature durability methods that have evolved for calculating high-temperature fatigue crack initiation lives of structural engineering materials are addressed. Only a few of the methods were refined to the point of being directly useable in design. Recently, two of the methods were transcribed into computer software for use with personal computers.
NASA Astrophysics Data System (ADS)
Halford, Gary R.
1993-10-01
The evolution of high-temperature, creep-fatigue, life-prediction methods used for cyclic crack initiation is traced from inception in the late 1940's. The methods reviewed are material models as opposed to structural life prediction models. Material life models are used by both structural durability analysts and by material scientists. The latter use micromechanistic models as guidance to improve a material's crack initiation resistance. Nearly one hundred approaches and their variations have been proposed to date. This proliferation poses a problem in deciding which method is most appropriate for a given application. Approaches were identified as being combinations of thirteen different classifications. This review is intended to aid both developers and users of high-temperature fatigue life prediction methods by providing a background from which choices can be made. The need for high-temperature, fatigue-life prediction methods followed immediately on the heels of the development of large, costly, high-technology industrial and aerospace equipment immediately following the second world war. Major advances were made in the design and manufacture of high-temperature, high-pressure boilers and steam turbines, nuclear reactors, high-temperature forming dies, high-performance poppet valves, aeronautical gas turbine engines, reusable rocket engines, etc. These advances could no longer be accomplished simply by trial and error using the 'build-em and bust-em' approach. Development lead times were too great and costs too prohibitive to retain such an approach. Analytic assessments of anticipated performance, cost, and durability were introduced to cut costs and shorten lead times. The analytic tools were quite primitive at first and out of necessity evolved in parallel with hardware development. After forty years more descriptive, more accurate, and more efficient analytic tools are being developed. These include thermal-structural finite element and boundary element analyses, advanced constitutive stress-strain-temperature-time relations, and creep-fatigue-environmental models for crack initiation and propagation. The high-temperature durability methods that have evolved for calculating high-temperature fatigue crack initiation lives of structural engineering materials are addressed. Only a few of the methods were refined to the point of being directly useable in design.
Burger, Jessica L.; Lovestead, Tara M.; Bruno, Thomas J.
2017-01-01
As the sources of natural gas become more diverse, the trace constituents of the C6+ fraction are of increasing interest. Analysis of fuel gas (including natural gas) for compounds with more than 6 carbon atoms (the C6+ fraction) has historically been complex and expensive. Hence, this is a procedure that is used most often in troubleshooting rather than for day-to-day operations. The C6+ fraction affects gas quality issues and safety considerations such as anomalies associated with odorization. Recent advances in dynamic headspace vapor collection can be applied to this analysis and provide a faster, less complex alternative for compositional determination of the C6+ fraction of natural gas. Porous layer open tubular capillaries maintained at low temperatures (PLOT-cryo) form the basis of a dynamic headspace sampling method that was developed at NIST initially for explosives in 2009. This method has been recently advanced by the combining of multiple PLOT capillary traps into one “bundle,” or wafer, resulting in a device that allows the rapid trapping of relatively large amounts of analyte. In this study, natural gas analytes were collected by flowing natural gas from the laboratory (gas out of the wall) or a prepared surrogate gas flowing through a chilled wafer. The analytes were then removed from the PLOT-cryo wafer by thermal desorption and subsequent flushing of the wafer with helium. Gas chromatography (GC) with mass spectrometry (MS) was then used to identify the analytes. PMID:29332993
USDA-ARS?s Scientific Manuscript database
Electrical impedance spectroscopy (EIS), as an effective analytical technique for electrochemical system, has shown a wide application for food quality and safety assessment recently. Individual differences of livestock cause high variation in quality of raw meat and fish and their commercialized pr...
Subdimensions of Adolescent Belonging in High School
ERIC Educational Resources Information Center
Wallace, Tanner LeBaron; Ye, Feifei; Chhuon, Vichet
2012-01-01
Adolescents' sense of belonging in high school may serve a protective function, linking school-based relationships to positive youth outcomes. To advance the study of sense of belonging, we conducted a mixed method, factor analytic study (Phase 1 focus groups, N = 72; Phase 2 cross-sectional survey, N = 890) to explore the multidimensionality of…
Driven by major scientific advances in analytical methods, biomonitoring, computation, and a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deploy...
Developing Systemic Theories Requires Formal Methods
ERIC Educational Resources Information Center
Gobet, Fernand
2012-01-01
Ziegler and Phillipson (Z&P) advance an interesting and ambitious proposal, whereby current analytical/mechanistic theories of gifted education are replaced by systemic theories. In this commentary, the author focuses on the pros and cons of using systemic theories. He argues that Z&P's proposal both goes too far and not far enough. The future of…
Developments in Cylindrical Shell Stability Analysis
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Starnes, James H., Jr.
1998-01-01
Today high-performance computing systems and new analytical and numerical techniques enable engineers to explore the use of advanced materials for shell design. This paper reviews some of the historical developments of shell buckling analysis and design. The paper concludes by identifying key research directions for reliable and robust methods development in shell stability analysis and design.
Making and Using a Sensing Polymeric Material for Cu[superscript 2+]
ERIC Educational Resources Information Center
Paddock, Jean R.; Maghasi, Anne T.; Heineman, William R.; Seliskar, Carl J.
2005-01-01
A simple chemical sensor-related experiment rooted in the synthesis of polymeric materials for use in either an advanced high-school or undergraduate college laboratory is presented. Students are introduced to and combine the concepts of the chemical sensor, polymer chemistry, spectroscopy, metal chelates, and quantitative analytical methods.
Bayes Nets in Educational Assessment: Where Do the Numbers Come from? CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell G.; Yan, Duanli; Steinberg, Linda S.
Educational assessments that exploit advances in technology and cognitive psychology can produce observations and pose student models that outstrip familiar test-theoretic models and analytic methods. Bayesian inference networks (BINs), which include familiar models and techniques as special cases, can be used to manage belief about students'…
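A minimal sketch of the kind of belief updating such networks perform is shown below: a single skill node with a noisy item-response observation, updated by Bayes' rule. All probabilities are invented for illustration and are not taken from the report.

```python
# A two-node toy Bayesian network (skill -> item response) showing belief
# management about a student's skill; all numbers are hypothetical.
p_skill = 0.50                  # prior P(student has the skill)
p_correct_given_skill = 0.85    # complement of the "slip" probability
p_correct_given_noskill = 0.20  # "guess" probability

def update(p_skill, observed_correct):
    """Posterior P(skill | one observed item response) via Bayes' rule."""
    if observed_correct:
        num = p_correct_given_skill * p_skill
        den = num + p_correct_given_noskill * (1 - p_skill)
    else:
        num = (1 - p_correct_given_skill) * p_skill
        den = num + (1 - p_correct_given_noskill) * (1 - p_skill)
    return num / den

belief = p_skill
for response in [True, True, False, True]:   # a hypothetical response sequence
    belief = update(belief, response)
    print(f"observed {'correct' if response else 'incorrect'} -> P(skill) = {belief:.3f}")
```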
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertelson, P.C.; Francis, T.L.
1959-10-21
Studies of reflector control for the Advanced Engineering Test Reactor were made. The performance of various parts of the reflector control system model, such as the safety reflector and the water jet eductor, boric acid injection, and demineralizer systems, is discussed. The experimental methods and results obtained are discussed. Four reflector control schemes were studied: a single-region reflector, a three-region reflector, two separate reflectors, and two connected reflectors. Calculations were made of shim and safety reflector worth for a variety of parameters. Safety reflector thickness was varied from 7.75 to 0 inches, with and without boron. Boric acid concentration was varied from 100 to 2% of saturation in the shim reflectors. Neutron flux plots are presented (C.J.G.)
Predicting adverse hemodynamic events in critically ill patients.
Yoon, Joo H; Pinsky, Michael R
2018-06-01
The art of predicting future hemodynamic instability in the critically ill has rapidly become a science with the advent of advanced analytical processes based on computer-driven machine learning techniques. How these methods have progressed beyond severity scoring systems to interface with decision support is summarized. Data mining of large multidimensional clinical time-series databases using a variety of machine learning tools has led to our ability to identify alert artifact and filter it from bedside alarms, display real-time risk stratification at the bedside to aid in clinical decision-making, and predict the subsequent development of cardiorespiratory insufficiency hours before these events occur. This fast-evolving field is primarily limited by the difficulty of linking high-quality granular data to physiologic rationale across heterogeneous clinical care domains. Using advanced analytic tools to glean knowledge from clinical data streams is rapidly becoming a reality whose clinical impact potential is great.
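A toy sketch of the general approach, predicting an instability label from simple window statistics of synthetic vital-sign series with an off-the-shelf classifier, is given below; the signals, features, and model are illustrative assumptions, not the methods reviewed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sketch only: synthetic "vital sign" windows stand in for the multidimensional
# clinical time series described above; features are simple window statistics.
rng = np.random.default_rng(1)

def make_window(unstable):
    # 60 samples of heart rate and mean arterial pressure for one patient-window
    hr = rng.normal(90 + (25 if unstable else 0), 8, 60)
    map_ = rng.normal(75 - (15 if unstable else 0), 6, 60)
    return [hr.mean(), hr.std(), np.polyfit(np.arange(60), hr, 1)[0],
            map_.mean(), map_.std(), np.polyfit(np.arange(60), map_, 1)[0]]

y = rng.integers(0, 2, 400)                       # 1 = instability follows the window
X = np.array([make_window(label) for label in y])

clf = LogisticRegression(max_iter=1000).fit(X[:300], y[:300])
print("held-out accuracy:", clf.score(X[300:], y[300:]))
print("risk for first held-out window:", clf.predict_proba(X[300:301])[0, 1])
```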
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Y.A.; Chapman, D.M.; Hill, D.J.
2000-12-15
The dynamic rod worth measurement (DRWM) technique is a method of quickly validating the predicted bank worth of control rods and shutdown rods. The DRWM analytic method is based on three-dimensional, space-time kinetic simulations of the rapid rod movements. Its measurement data is processed with an advanced digital reactivity computer. DRWM has been used as the method of bank worth validation at numerous plant startups with excellent results. The process and methodology of DRWM are described, and the measurement results of using DRWM are presented.
NASA Technical Reports Server (NTRS)
Aniversario, R. B.; Harvey, S. T.; Mccarty, J. E.; Parsons, J. T.; Peterson, D. C.; Pritchett, L. D.; Wilson, D. R.; Wogulis, E. R.
1983-01-01
The horizontal stabilizer of the 737 transport was redesigned. Five shipsets were fabricated using composite materials. Weight reduction greater than the 20% goal was achieved. Parts and assemblies were readily produced on production-type tooling. Quality assurance methods were demonstrated. Repair methods were developed and demonstrated. Strength and stiffness analytical methods were substantiated by comparison with test results. Cost data was accumulated in a semiproduction environment. FAA certification was obtained.
2013-01-01
Influenza virus-like particle vaccines are one of the most promising ways to respond to the threat of future influenza pandemics. VLPs are composed of viral antigens but lack nucleic acids, making them non-infectious, which limits the risk of recombination with wild-type strains. By taking advantage of the advancements in cell culture technologies, the process from strain identification to manufacturing has the potential to be completed rapidly and easily at large scales. After closely reviewing the current research done on influenza VLPs, it is evident that the development of quantification methods has been consistently overlooked. VLP quantification at all stages of the production process has been left to rely on current influenza quantification methods (i.e. Hemagglutination assay (HA), Single Radial Immunodiffusion assay (SRID), NA enzymatic activity assays, Western blot, Electron Microscopy). These are analytical methods developed decades ago for influenza virions and final bulk influenza vaccines. Although these methods are time-consuming and cumbersome, they have been sufficient for the characterization of final purified material. Nevertheless, these analytical methods are impractical for in-line process monitoring because VLP concentration in crude samples generally falls outside the range of detection for these methods. This consequently impedes the development of robust influenza-VLP production and purification processes. Thus, development of functional process analytical techniques that are applicable at every stage of production and compatible with different production platforms is greatly needed to assess, optimize and exploit the full potential of novel manufacturing platforms. PMID:23642219
NREL’s Advanced Analytics Research for Buildings – Social Media Version
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Forty percent of the total energy consumption in the United States comes from buildings. Working together, we can dramatically shrink that number. NREL’s advanced analytics research has already proven to reduce energy use, save money, and stabilize the grid.
Big data sharing and analysis to advance research in post-traumatic epilepsy.
Duncan, Dominique; Vespa, Paul; Pitkanen, Asla; Braimah, Adebayo; Lapinlampi, Nina; Toga, Arthur W
2018-06-01
We describe the infrastructure and functionality for a centralized preclinical and clinical data repository and analytic platform to support importing heterogeneous multi-modal data, automatically and manually linking data across modalities and sites, and searching content. We have developed and applied innovative image and electrophysiology processing methods to identify candidate biomarkers from MRI, EEG, and multi-modal data. Based on heterogeneous biomarkers, we present novel analytic tools designed to study epileptogenesis in animal models and humans, with the goal of tracking the probability of developing epilepsy over time.
Advances in Assays and Analytical Approaches for Botulinum Toxin Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grate, Jay W.; Ozanich, Richard M.; Warner, Marvin G.
2010-08-04
Methods to detect botulinum toxin, the most poisonous substance known, are reviewed. Current assays are being developed with two main objectives in mind: 1) to obtain sufficiently low detection limits to replace the mouse bioassay with an in vitro assay, and 2) to develop rapid assays for screening purposes that are as sensitive as possible while requiring an hour or less to process the sample and obtain the result. This review emphasizes the diverse analytical approaches and devices that have been developed over the last decade, while also briefly reviewing representative older immunoassays to provide background and context.
Chemical Detection and Identification Techniques for Exobiology Flight Experiments
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.
2002-01-01
Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capabilities with minimal requirements of volume, weight and consumables. Advances in technology may be achieved by increasing the amount of information acquired by a given technique with greater analytical capabilities, and by miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using Gas Chromatography. These efforts to expand the analytical capabilities of GC technology are focused on the development of detectors for the GC which provide sample identification independent of the GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).
Lewis, Nathan S
2004-09-01
Arrays of broadly cross-reactive vapor sensors provide a man-made implementation of an olfactory system, in which an analyte elicits a response from many receptors and each receptor responds to a variety of analytes. Pattern recognition methods are then used to detect analytes based on the collective response of the sensor array. With the use of this architecture, arrays of chemically sensitive resistors made from composites of conductors and insulating organic polymers have been shown to robustly classify, identify, and quantify a diverse collection of organic vapors, even though no individual sensor responds selectively to a particular analyte. The properties and functioning of these arrays are inspired by advances in the understanding of biological olfaction, and in turn, evaluation of the performance of the man-made array provides suggestions regarding some of the fundamental odor detection principles of the mammalian olfactory system.
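A minimal sketch of array-based pattern recognition of this kind is shown below: each vapor produces a characteristic response pattern across non-selective sensors, and an unknown sample is classified by comparing its concentration-normalized pattern with stored class centroids. The response values are invented, not measured data.

```python
import numpy as np

# Toy "electronic nose" classification: invented fractional resistance changes
# for three vapors across six non-selective sensors.
rng = np.random.default_rng(2)
patterns = {
    "ethanol": np.array([0.9, 0.3, 0.5, 0.1, 0.7, 0.2]),
    "toluene": np.array([0.2, 0.8, 0.4, 0.9, 0.1, 0.6]),
    "acetone": np.array([0.5, 0.5, 0.9, 0.3, 0.4, 0.8]),
}

def normalize(r):
    """Concentration-independent fingerprint: pattern divided by its total response."""
    return r / r.sum()

centroids = {name: normalize(p) for name, p in patterns.items()}

# Simulate an unknown exposure: scaled (different concentration) + noisy ethanol.
unknown = normalize(3.0 * patterns["ethanol"] + rng.normal(0, 0.05, 6))

# Nearest-centroid classification of the unknown pattern.
best = min(centroids, key=lambda name: np.linalg.norm(unknown - centroids[name]))
print("classified as:", best)
```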
Development and Applications of Liquid Sample Desorption Electrospray Ionization Mass Spectrometry
NASA Astrophysics Data System (ADS)
Zheng, Qiuling; Chen, Hao
2016-06-01
Desorption electrospray ionization mass spectrometry (DESI-MS) is a recent advance in the field of analytical chemistry. This review surveys the development of liquid sample DESI-MS (LS-DESI-MS), a variant form of DESI-MS that focuses on fast analysis of liquid samples, and its novel analy-tical applications in bioanalysis, proteomics, and reaction kinetics. Due to the capability of directly ionizing liquid samples, liquid sample DESI (LS-DESI) has been successfully used to couple MS with various analytical techniques, such as microfluidics, microextraction, electrochemistry, and chromatography. This review also covers these hyphenated techniques. In addition, several closely related ionization methods, including transmission mode DESI, thermally assisted DESI, and continuous flow-extractive DESI, are briefly discussed. The capabilities of LS-DESI extend and/or complement the utilities of traditional DESI and electrospray ionization and will find extensive and valuable analytical application in the future.
Advanced rotorcraft technology: Task force report
NASA Technical Reports Server (NTRS)
1978-01-01
The technological needs and opportunities related to future civil and military rotorcraft were determined and a program plan for NASA research which was responsive to the needs and opportunities was prepared. In general, the program plan places the primary emphasis on design methodology where the development and verification of analytical methods is built upon a sound data base. The four advanced rotorcraft technology elements identified are aerodynamics and structures, flight control and avionic systems, propulsion, and vehicle configurations. Estimates of the total funding levels that would be required to support the proposed program plan are included.
NASA Astrophysics Data System (ADS)
Noda, Isao
2014-07-01
Noteworthy experimental practices, which are advancing the frontiers of the field of two-dimensional (2D) correlation spectroscopy, are reviewed with a focus on the various perturbation methods currently practiced to induce spectral changes, pertinent examples of applications in various fields, and the types of analytical probes employed. The types of perturbation methods found in the published literature are very diverse, encompassing both dynamic and static effects. Although a sizable portion of publications report the use of dynamic perturbations, a much greater number of studies employ static effects, especially that of temperature. Fields of application covered by the literature are also very broad, ranging from fundamental research to practical applications in a number of physical, chemical and biological systems, such as synthetic polymers, composites and biomolecules. Aside from IR spectroscopy, which is the most commonly used tool, many other analytical probes are used in 2D correlation analysis. The ever expanding trend in depth, breadth and versatility of 2D correlation spectroscopy techniques and their broad applications all point to the robust and healthy state of the field.
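The generalized 2D correlation computation that underlies these studies can be sketched directly from a set of perturbation-dependent spectra: mean-centered dynamic spectra give the synchronous spectrum, and the Hilbert-Noda transform gives the asynchronous spectrum. The band positions and intensity trends below are synthetic examples, not data from any cited study.

```python
import numpy as np

# Generalized 2D correlation (Noda) sketch on synthetic perturbation-dependent spectra.
wn = np.linspace(1000, 1100, 201)                 # wavenumber axis
t = np.linspace(0, 1, 15)                         # perturbation variable (e.g. T)

band = lambda c, w: np.exp(-((wn - c) / w) ** 2)
# band at 1030 grows, band at 1070 decays with the perturbation
Y = np.outer(t, band(1030, 8)) + np.outer(1 - t, band(1070, 8))

Yd = Y - Y.mean(axis=0)                           # dynamic (mean-centered) spectra
m = Y.shape[0]

sync = Yd.T @ Yd / (m - 1)                        # synchronous spectrum

# Hilbert-Noda transform matrix for the asynchronous spectrum
j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
with np.errstate(divide="ignore"):
    N = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
asyn = Yd.T @ N @ Yd / (m - 1)

i1, i2 = np.abs(wn - 1030).argmin(), np.abs(wn - 1070).argmin()
print("sync(1030,1070) =", round(sync[i1, i2], 4))   # negative: opposite trends
print("async(1030,1070) =", round(asyn[i1, i2], 4))
```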
Turbine blade tip durability analysis
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Laflen, J. H.; Spamer, G. T.
1981-01-01
An air-cooled turbine blade from an aircraft gas turbine engine, chosen for its history of cracking, was subjected to advanced analytical and life-prediction techniques. The utility of advanced structural analysis techniques and advanced life-prediction techniques in the life assessment of hot section components is verified. Three-dimensional heat transfer and stress analyses were applied to the turbine blade mission cycle and the results were input into advanced life-prediction theories. Shortcut analytical techniques were developed. The proposed life-prediction theories are evaluated.
NASA Astrophysics Data System (ADS)
Rolla, L. Barrera; Rice, H. J.
2006-09-01
In this paper a "forward-advancing" field discretization method suitable for solving the Helmholtz equation in large-scale problems is proposed. The forward wave expansion method (FWEM) is derived from a highly efficient discretization procedure based on interpolation of wave functions known as the wave expansion method (WEM). The FWEM computes the propagated sound field by means of an exclusively forward advancing solution, neglecting the backscattered field. It is thus analogous to methods such as the (one way) parabolic equation method (PEM) (usually discretized using standard finite difference or finite element methods). These techniques do not require the inversion of large system matrices and thus enable the solution of large-scale acoustic problems where backscatter is not of interest. Calculations using FWEM are presented for two propagation problems and comparisons to data computed with analytical and theoretical solutions and show this forward approximation to be highly accurate. Examples of sound propagation over a screen in upwind and downwind refracting atmospheric conditions at low nodal spacings (0.2 per wavelength in the propagation direction) are also included to demonstrate the flexibility and efficiency of the method.
Multi-analytical Approaches Informing the Risk of Sepsis
NASA Astrophysics Data System (ADS)
Gwadry-Sridhar, Femida; Lewden, Benoit; Mequanint, Selam; Bauer, Michael
Sepsis is a significant cause of mortality and morbidity and is often associated with increased hospital resource utilization and prolonged intensive care unit (ICU) and hospital stay. The economic burden associated with sepsis is huge. With advances in medicine, there are now aggressive, goal-oriented treatments that can be used to help these patients. If we were able to predict which patients may be at risk for sepsis, we could start treatment early and potentially reduce the risk of mortality and morbidity. Analytic methods currently used in clinical research to determine the risk of a patient developing sepsis may be further enhanced by using multi-modal analytic methods that together could be used to provide greater precision. Researchers commonly use univariate and multivariate regressions to develop predictive models. We hypothesized that such models could be enhanced by using multiple analytic methods that together provide greater insight. In this paper, we analyze data about patients with and without sepsis using a decision tree approach and a cluster analysis approach. A comparison with a regression approach shows strong similarity among the variables identified, though not an exact match. We compare the variables identified by the different approaches and draw conclusions about their respective predictive capabilities, while considering their clinical significance.
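A rough sketch of the multi-method comparison idea, fitting a logistic regression and a decision tree to the same synthetic, invented patient features and comparing the variables each leans on, is given below; neither the features nor the data reflect the study's cohort.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

# Synthetic, invented patient features and "sepsis" labels for illustration only.
rng = np.random.default_rng(3)
n = 1000
X = np.column_stack([
    rng.normal(37.5, 1.0, n),    # temperature
    rng.normal(95, 20, n),       # heart rate
    rng.normal(12, 6, n),        # white-cell count
    rng.normal(1.2, 0.8, n),     # lactate
])
logit = -18 + 0.3 * X[:, 0] + 0.03 * X[:, 1] + 0.1 * X[:, 2] + 0.8 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))     # synthetic outcome labels

Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
lr = LogisticRegression(max_iter=2000).fit(Xtr, ytr)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(Xtr, ytr)

# Compare which variables each approach emphasizes, and held-out accuracy.
print("logistic coefficients:   ", np.round(lr.coef_[0], 3))
print("tree feature importances:", np.round(tree.feature_importances_, 3))
print("accuracies:", round(lr.score(Xte, yte), 3), round(tree.score(Xte, yte), 3))
```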
Analysis of multiple mycotoxins in food.
Hajslova, Jana; Zachariasova, Milena; Cajka, Tomas
2011-01-01
Mycotoxins are secondary metabolites of microscopic filamentous fungi. Given the widespread distribution of fungi in the environment, mycotoxins are considered to be among the most important natural contaminants in foods and feeds. To protect consumers' health and reduce economic losses, surveillance and control of mycotoxins in food and feed have become a major objective for producers, regulatory authorities, and researchers worldwide. In this context, the availability of reliable analytical methods applicable for this purpose is essential. Since the variety of chemical structures of mycotoxins makes it impossible to use one single technique for their analysis, a vast number of analytical methods have been developed and validated. Both the large variability of food matrices and growing demands for fast, cost-saving and accurate determination of multiple mycotoxins by a single method outline new challenges for analytical research. This strong effort is facilitated by technical developments in mass spectrometry that reduce the influence of matrix effects even when the sample clean-up step is omitted. The current state-of-the-art together with future trends is presented in this chapter. Attention is focused mainly on instrumental methods; advances in biosensors and other screening bioanalytical approaches enabling analysis of multiple mycotoxins are not discussed in detail.
Possibilities of application of the swirling flows in cooling systems of laser mirrors
NASA Astrophysics Data System (ADS)
Shanin, Yu; Chernykh, A.
2018-03-01
The paper presents analytical investigations of advanced cooling systems for laser mirrors in which heat exchange is intensified by imposing an ordered vortex structure on the coolant flow. The advantages and effectiveness of the proposed cooling systems are estimated in terms of the reduction of the displacement of the optical mirror surface due to flexure.
Logic of Sherlock Holmes in Technology Enhanced Learning
ERIC Educational Resources Information Center
Patokorpi, Erkki
2007-01-01
Abduction is a method of reasoning that people use under uncertainty in a context in order to come up with new ideas. The use of abduction in this exploratory study is twofold: (i) abduction is a cross-disciplinary analytic tool that can be used to explain certain key aspects of human-computer interaction in advanced Information Society Technology…
Analytical surveillance of emerging drugs of abuse and drug formulations
Thomas, Brian F.; Pollard, Gerald T.; Grabenauer, Megan
2012-01-01
Uncontrolled recreational drugs are proliferating in number and variety. Effects of long-term use are unknown, and regulation is problematic, as efforts to control one chemical often lead to several other structural analogs. Advanced analytical instrumentation and methods are continuing to be developed to identify drugs, chemical constituents of products, and drug substances and metabolites in biological fluids. Several mass spectrometry based approaches appear promising, particularly those that involve high resolution chromatographic and mass spectrometric methods that allow unbiased data acquisition and sophisticated data interrogation. Several of these techniques are shown to facilitate both targeted and broad spectrum analysis, which is often of particular benefit when dealing with misleadingly labeled products or assessing a biological matrix for illicit drugs and metabolites. The development and application of novel analytical approaches such as these will help to assess the nature and degree of exposure and risk and, where necessary, inform forensics and facilitate implementation of specific regulation and control measures. PMID:23154240
Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F
2016-04-01
The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity to shape the next generation of injury surveillance. The aim is to introduce the conceptual bases of data visualisation and to propose a visual analytic and visualisation platform for public health surveillance for injury prevention and control. The paper introduces the conceptual bases of data visualisation, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Application of the visual analytic and visualisation platform is presented as a solution to improve access to heterogeneous data sources, enhance data exploration and analysis, communicate data effectively, and support decision-making. Application of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control.
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics, and scientific queries will be covered in these recommendations. These combinations are complex, since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and the underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach. This talk will provide insights about big data analytics methods in the context of science within various communities and offers different views of how approaches of correlation and causality offer complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand what approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications in order to provide clear recommendations to the scientific community on what approaches are technically and scientifically feasible.
NASA Technical Reports Server (NTRS)
Kempler, Steve; Mathews, Tiffany
2016-01-01
The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Because little has been published on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available, and still needed, to support ESDA.
Advancements in nano-enabled therapeutics for neuroHIV management.
Kaushik, Ajeet; Jayant, Rahul Dev; Nair, Madhavan
This viewpoint is a global call to promote fundamental and applied research aiming toward designing smart nanocarriers of desired properties, novel noninvasive strategies to open the blood-brain barrier (BBB), delivery/release of single/multiple therapeutic agents across the BBB to eradicate neurohuman immunodeficiency virus (HIV), strategies for on-demand site-specific release of antiretroviral therapy, developing novel nanoformulations capable to recognize and eradicate latently infected HIV reservoirs, and developing novel smart analytical diagnostic tools to detect and monitor HIV infection. Thus, investigation of novel nanoformulations, methodologies for site-specific delivery/release, analytical methods, and diagnostic tools would be of high significance to eradicate and monitor neuroacquired immunodeficiency syndrome. Overall, these developments will certainly help to develop personalized nanomedicines to cure HIV and to develop smart HIV-monitoring analytical systems for disease management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haab, Brian B.; Geierstanger, Bernhard H.; Michailidis, George
2005-08-01
Four different immunoassay and antibody microarray methods performed at four different sites were used to measure the levels of a broad range of proteins (N = 323 assays; 39, 88, 168, and 28 assays at the respective sites; 237 unique analytes) in the human serum and plasma reference specimens distributed by the Plasma Proteome Project (PPP) of the HUPO. The methods provided a means to (1) assess the level of systematic variation in protein abundances associated with blood preparation methods (serum, citrate-anticoagulated-plasma, EDTA-anticoagulated-plasma, or heparin-anticoagulated-plasma) and (2) evaluate the dependence on concentration of MS-based protein identifications from data sets using the HUPO specimens. Some proteins, particularly cytokines, had highly variable concentrations between the different sample preparations, suggesting specific effects of certain anticoagulants on the stability or availability of these proteins. The linkage of antibody-based measurements from 66 different analytes with the combined MS/MS data from 18 different laboratories showed that protein detection and the quality of MS data increased with analyte concentration. The conclusions from these initial analyses are that the optimal blood preparation method is variable between analytes and that the discovery of blood proteins by MS can be extended to concentrations below the ng/mL range under certain circumstances. Continued developments in antibody-based methods will further advance the scientific goals of the PPP.
Bruno, Thomas J; Ott, Lisa S; Lovestead, Tara M; Huber, Marcia L
2010-04-16
The analysis of complex fluids such as crude oils, fuels, vegetable oils and mixed waste streams poses significant challenges arising primarily from the multiplicity of components, the different properties of the components (polarity, polarizability, etc.) and matrix properties. We have recently introduced an analytical strategy that simplifies many of these analyses, and provides the added potential of linking compositional information with physical property information. This aspect can be used to facilitate equation of state development for the complex fluids. In addition to chemical characterization, the approach provides the ability to calculate thermodynamic properties for such complex heterogeneous streams. The technique is based on the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. The analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. By far the most widely used analytical technique with the ADC has been gas chromatography. This has enabled us to study finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oil streams (used automotive and transformer oils). In this special issue of the Journal of Chromatography, specifically dedicated to extraction technologies, we describe the essential features of the advanced distillation curve metrology as an analytical strategy for complex fluids. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Petrova, N.; Zagidullin, A.; Nefedyev, Y.; Kosulin, V.; Andreev, A.
2017-11-01
Observing the physical librations of celestial bodies and the Moon represents one of the astronomical methods of remotely assessing the internal structure of a celestial body without conducting expensive space experiments. The paper contains a review of recent advances in studying the Moon's structure using various methods of obtaining and applying the lunar physical librations (LPhL) data. In this article LPhL simulation methods of assessing viscoelastic and dissipative properties of the lunar body and lunar core parameters, whose existence has recently been confirmed during reprocessing of the seismic data from the "Apollo" space missions, are described. Much attention is paid to the physical interpretation of the free librations phenomenon and the methods for its determination. In the paper the practical application of the most accurate analytical LPhL tables (Rambaux and Williams, 2011) is discussed. The tables were built on the basis of complex analytical processing of the residual differences obtained when comparing long-term series of laser observations with the numerical ephemeris DE421. In the paper an efficiency analysis of two approaches to LPhL theory is conducted: the numerical and the analytical approaches. It has been shown that in lunar investigations both approaches complement each other in various aspects: the numerical approach provides the high accuracy of the theory required for the proper processing of modern observations, while the analytical approach allows one to comprehend the essence of the phenomena in lunar rotation and to predict and interpret new effects in observations of the lunar body and lunar core parameters.
Główka, Franciszek K; Romański, Michał; Teżyk, Artur; Żaba, Czesław
2013-01-01
Treosulfan (TREO) is an alkylating agent registered for the treatment of advanced platinum-resistant ovarian carcinoma. Nowadays, TREO is increasingly applied intravenously in high doses as a promising myeloablative agent with low organ toxicity in children. Under physiological conditions it undergoes pH-dependent transformation into epoxy-transformers (S,S-EBDM and S,S-DEB). The mechanism of this reaction is generally known, but not its kinetic details. In order to investigate the kinetics of TREO transformation, an HPLC method with refractometric detection for the simultaneous determination of the three analytes in one analytical run has been developed for the first time. The samples containing TREO, S,S-EBDM, S,S-DEB and acetaminophen (internal standard) were directly injected onto the reversed-phase column. To assure stability of the analytes and obtain their complete resolution, a mobile phase composed of acetate buffer pH 4.5 and acetonitrile was applied. The linear range of the calibration curves of TREO, S,S-EBDM and S,S-DEB spanned concentrations of 20-6000, 34-8600 and 50-6000 μM, respectively. Intra- and interday precision and accuracy of the developed method fulfilled analytical criteria. The stability of the analytes in experimental samples was also established. The validated HPLC method was successfully applied to the investigation of the kinetics of TREO activation to S,S-EBDM and S,S-DEB. At pH 7.4 and 37 °C the transformation of TREO followed first-order kinetics with a half-life of 1.5 h. Copyright © 2012 Elsevier B.V. All rights reserved.
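For readers who want the arithmetic behind the reported half-life, the minimal sketch below evaluates the first-order decay implied by a 1.5 h half-life at pH 7.4 and 37 °C; the initial concentration is an assumed placeholder, not a value from the paper.

```python
import numpy as np

t_half = 1.5                 # h, half-life reported in the abstract
k = np.log(2) / t_half       # first-order rate constant [1/h]
C0 = 1000.0                  # assumed initial TREO concentration [uM]

t = np.linspace(0, 8, 9)     # hours
C = C0 * np.exp(-k * t)      # TREO remaining under first-order kinetics
for ti, ci in zip(t, C):
    print(f"t = {ti:4.1f} h   TREO remaining = {ci:7.1f} uM")
```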
Ding, Juefang; Chen, Xiaoyan; Dai, Xiaojian; Zhong, Dafang
2012-05-01
Apatinib, also known as YN968D1, is a novel antiangiogenic agent that selectively inhibits vascular endothelial growth factor receptor-2. Currently, apatinib is undergoing phase II/III clinical trials in China for the treatment of solid tumors. Apatinib is extensively metabolized in humans, and its major metabolites in circulation include cis-3-hydroxy-apatinib (M1-1), trans-3-hydroxy-apatinib (M1-2), apatinib-25-N-oxide (M1-6), and cis-3-hydroxy-apatinib-O-glucuronide (M9-2). To investigate the pharmacokinetics of apatinib and its four major metabolites in patients with advanced colorectal cancer, a sensitive and selective liquid chromatography-tandem mass spectrometry method was developed and validated for the simultaneous determination of apatinib, M1-1, M1-2, M1-6, and M9-2 in human plasma. After a simple protein precipitation using acetonitrile as the precipitation solvent, all the analytes and the internal standard vatalanib were separated on a Zorbax Eclipse XDB C(18) column (50 mm × 4.6 mm, 1.8 μm, Agilent) using acetonitrile: 5 mmol/L ammonium acetate with 0.1% formic acid as the mobile phase with gradient elution. A chromatographic total run time of 9 min was achieved. Mass spectrometry detection was conducted through electrospray ionization in positive ion multiple reaction monitoring modes. The method was linear over the concentration range of 3.00-2000 ng/mL for each analyte. The lower limit of quantification for each analyte was 3.00 ng/mL. The intra-assay precision for all the analytes was less than 11.3%, the inter-assay precision was less than 13.8%, and the accuracy was between -5.8% and 3.3%. The validated method was successfully applied to a clinical pharmacokinetic study following oral administration of 500 mg apatinib mesylate in patients with advanced colorectal cancer. Copyright © 2012 Elsevier B.V. All rights reserved.
Dillon, Roslyn; Croner, Lisa J; Bucci, John; Kairs, Stefanie N; You, Jia; Beasley, Sharon; Blimline, Mark; Carino, Rochele B; Chan, Vicky C; Cuevas, Danissa; Diggs, Jeff; Jennings, Megan; Levy, Jacob; Mina, Ginger; Yee, Alvin; Wilcox, Bruce
2018-05-30
Early detection of colorectal cancer (CRC) is key to reducing associated mortality. Despite the importance of early detection, approximately 40% of individuals in the United States between the ages of 50-75 have never been screened for CRC. The low compliance with colonoscopy and fecal-based screening may be addressed with a non-invasive alternative such as a blood-based test. We describe here the analytical validation of a multiplexed blood-based assay that measures the plasma concentrations of 15 proteins to assess advanced adenoma (AA) and CRC risk in symptomatic patients. The test was developed on an electrochemiluminescent immunoassay platform employing four multi-marker panels, to be implemented in the clinic as a laboratory developed test (LDT). Under the Clinical Laboratory Improvement Amendments (CLIA) and College of American Pathologists (CAP) regulations, a United States-based clinical laboratory utilizing an LDT must establish performance characteristics relating to analytical validity prior to releasing patient test results. This report describes a series of studies demonstrating the precision, accuracy, analytical sensitivity, and analytical specificity for each of the 15 assays, as required by CLIA/CAP. In addition, the report describes studies characterizing each of the assays' dynamic range, parallelism, tolerance to common interfering substances, spike recovery, and stability to sample freeze-thaw cycles. Upon completion of the analytical characterization, a clinical accuracy study was performed to evaluate concordance of AA and CRC classifier model calls using the analytical method intended for use in the clinic. Of 434 symptomatic patient samples tested, the percent agreement with original CRC and AA calls was 87% and 92% respectively. All studies followed CLSI guidelines and met the regulatory requirements for implementation of a new LDT. The results provide the analytical evidence to support the implementation of the novel multi-marker test as a clinical test for evaluating CRC and AA risk in symptomatic individuals. Copyright © 2018 Elsevier B.V. All rights reserved.
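A percent-agreement (concordance) figure like the reported 87% and 92% can be computed as sketched below; the labels in the example are invented, not the study's calls.

```python
def percent_agreement(original_calls, new_calls):
    """Fraction of samples on which two methods make the same call, in percent."""
    assert len(original_calls) == len(new_calls)
    matches = sum(a == b for a, b in zip(original_calls, new_calls))
    return 100.0 * matches / len(original_calls)

# Invented example calls (not study data): 7 of 8 agree -> 87.5%.
original = ["pos", "neg", "neg", "pos", "neg", "neg", "pos", "neg"]
repeated = ["pos", "neg", "pos", "pos", "neg", "neg", "pos", "neg"]
print(f"percent agreement: {percent_agreement(original, repeated):.1f}%")
```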
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
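The quantitative step of qHNMR ultimately rests on the standard internal-calibrant relation, which the hedged sketch below evaluates; the integrals, proton counts, and calibrant concentration are assumed example values, and the HiFSA profiling workflow itself is not reproduced here.

```python
def qhnmr_concentration(I_analyte, N_analyte, I_cal, N_cal, C_cal):
    """Internal-calibrant qHNMR relation for two signals in the same solution:
    C_analyte = (I_analyte / I_cal) * (N_cal / N_analyte) * C_cal."""
    return (I_analyte / I_cal) * (N_cal / N_analyte) * C_cal

# Assumed example: a catechin signal integrating 1 H vs. a 3 H calibrant singlet.
conc = qhnmr_concentration(I_analyte=0.42, N_analyte=1, I_cal=1.00, N_cal=3, C_cal=2.0)
print(f"analyte concentration ~ {conc:.2f} mM (illustrative numbers)")
```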
Unsteady Loss in the Stator Due to the Incoming Rotor Wake in a Highly-Loaded Transonic Compressor
NASA Technical Reports Server (NTRS)
Hah, Chunill
2015-01-01
The present paper reports an investigation of unsteady loss generation in the stator due to the incoming rotor wake in an advanced GE transonic compressor design, using a high-fidelity numerical method. This advanced compressor, with high reaction and high stage loading, has been investigated both experimentally and analytically in the past. The measured efficiency of this advanced compressor is significantly lower than the design goal. The general understanding is that the current generation of compressor design analysis tools misses some important flow physics in this modern compressor design. To pinpoint the source of the efficiency shortfall, an advanced test with a detailed flow traverse was performed for the front one-and-a-half stage at the NASA Glenn Research Center.
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.
NASA Astrophysics Data System (ADS)
Wu, Haiqing; Bai, Bing; Li, Xiaochun
2018-02-01
Existing analytical or approximate solutions that are appropriate for describing the migration mechanics of CO2 and the evolution of fluid pressure in reservoirs do not consider the high compressibility of CO2, which reduces their calculation accuracy and application value. Therefore, this work first derives a new governing equation that represents the movement of complex fluids in reservoirs, based on the equation of continuity and the generalized Darcy's law. A more rigorous definition of the coefficient of compressibility of fluid is then presented, and a power function model (PFM) that characterizes the relationship between the physical properties of CO2 and the pressure is derived. Meanwhile, to avoid the difficulty of determining the saturation of fluids, a method that directly assumes the average relative permeability of each fluid phase in different fluid domains is proposed, based on the theory of gradual change. An advanced analytical solution is obtained that includes both the partial miscibility and the compressibility of CO2 and brine in evaluating the evolution of fluid pressure by integrating within different regions. Finally, two typical sample analyses are used to verify the reliability, improved nature and universality of this new analytical solution. Based on the physical characteristics and the results calculated for the examples, this work elaborates the concept and basis of partitioning for use in further work.
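As a hedged sketch of two ingredients named above, the snippet below fits a power-function model rho = a*p^b in log-log space and evaluates one standard form of the compressibility coefficient, c = (1/rho) d(rho)/dp, which for the fitted model reduces to b/p; the pressure-density pairs are assumed, not CO2 property data.

```python
import numpy as np

# Assumed pressure-density pairs (not CO2 property data).
p = np.array([8.0, 10.0, 12.0, 15.0, 20.0])          # pressure [MPa]
rho = np.array([280.0, 500.0, 660.0, 740.0, 810.0])  # density [kg/m^3]

# Power-function model rho = a * p**b, fitted as ln(rho) = ln(a) + b*ln(p).
b, ln_a = np.polyfit(np.log(p), np.log(rho), 1)
a = np.exp(ln_a)
print(f"fitted model: rho ~= {a:.1f} * p^{b:.2f}")

# Compressibility coefficient c = (1/rho) d(rho)/dp; for rho = a*p**b, c = b/p.
print("c(p) [1/MPa]:", np.round(b / p, 4))
```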
Advances in spatial epidemiology and geographic information systems.
Kirby, Russell S; Delmelle, Eric; Eberth, Jan M
2017-01-01
The field of spatial epidemiology has evolved rapidly in the past 2 decades. This study serves as a brief introduction to spatial epidemiology and the use of geographic information systems in applied research in epidemiology. We highlight technical developments and opportunities to apply spatial analytic methods in epidemiologic research, focusing on methodologies involving geocoding, distance estimation, residential mobility, record linkage and data integration, spatial and spatio-temporal clustering, small area estimation, and Bayesian applications to disease mapping. The articles included in this issue incorporate many of these methods into their study designs and analytical frameworks. It is our hope that these studies will spur further development and utilization of spatial analysis and geographic information systems in epidemiologic research. Copyright © 2016 Elsevier Inc. All rights reserved.
Scott, Frank I; McConnell, Ryan A; Lewis, Matthew E; Lewis, James D
2012-04-01
Significant advances have been made in clinical and epidemiologic research methods over the past 30 years. We sought to demonstrate the impact of these advances on published gastroenterology research from 1980 to 2010. Twenty original clinical articles were randomly selected from each of three journals from 1980, 1990, 2000, and 2010. Each article was assessed for topic, whether the outcome was clinical or physiologic, study design, sample size, number of authors and centers collaborating, reporting of various statistical methods, and external funding. From 1980 to 2010, there was a significant increase in analytic studies, clinical outcomes, number of authors per article, multicenter collaboration, sample size, and external funding. There was increased reporting of P values, confidence intervals, and power calculations, and increased use of large multicenter databases, multivariate analyses, and bioinformatics. The complexity of clinical gastroenterology and hepatology research has increased dramatically, highlighting the need for advanced training of clinical investigators.
Bajoub, Aadil; Bendini, Alessandra; Fernández-Gutiérrez, Alberto; Carrasco-Pancorbo, Alegría
2018-03-24
Over the last decades, olive oil quality and authenticity control has become an issue of great importance to consumers, suppliers, retailers, and regulators in both traditional and emerging olive oil producing countries, mainly due to the increasing worldwide popularity and the trade globalization of this product. Thus, in order to ensure olive oil authentication, various national and international laws and regulations have been adopted, although some of them have generated considerable debate about the risk they may pose to the harmonization of international olive oil trade standards. Within this context, this review was designed to provide a critical overview and comparative analysis of selected regulatory frameworks for olive oil authentication, with special emphasis on the quality and purity criteria considered by these regulation systems, their thresholds and the analytical methods employed for monitoring them. To complete the general overview, recent analytical advances to overcome drawbacks and limitations of the official methods to evaluate olive oil quality and to determine possible adulterations were reviewed. Furthermore, the latest trends in analytical approaches to assess olive oil geographical and varietal origin traceability were also examined.
Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.
Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís
2016-01-08
In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.
Neylon, J; Min, Y; Kupelian, P; Low, D A; Santhanam, A
2017-04-01
In this paper, a multi-GPU cloud-based server (MGCS) framework is presented for dose calculations, exploring the feasibility of remote computing power for the parallelization and acceleration of computationally and time intensive radiotherapy tasks in moving toward online adaptive therapies. An analytical model was developed to estimate theoretical MGCS performance acceleration and intelligently determine workload distribution. Numerical studies were performed with a computing setup of 14 GPUs distributed over 4 servers interconnected by a 1 gigabit per second (Gbps) network. Inter-process communication methods were optimized to facilitate resource distribution and minimize data transfers over the server interconnect. The analytically predicted computation times matched experimental observations within 1-5%. MGCS performance approached a theoretical limit of acceleration proportional to the number of GPUs utilized when computational tasks far outweighed memory operations. The MGCS implementation reproduced ground-truth dose computations with negligible differences by distributing the work among several processes and implementing optimization strategies. The results showed that a cloud-based computation engine was a feasible solution for enabling clinics to make use of fast dose calculations for advanced treatment planning and adaptive radiotherapy. The cloud-based system was able to exceed the performance of a local machine even for optimized calculations, and provided significant acceleration for computationally intensive tasks. Such a framework can provide access to advanced technology and computational methods to many clinics, providing an avenue for standardization across institutions without the requirements of purchasing, maintaining, and continually updating hardware.
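A generic fixed-overhead scaling model of the kind the abstract alludes to is sketched below; it is not the authors' analytical model, and the work and overhead figures are assumed.

```python
def predicted_time(work_s, overhead_s, n_gpus):
    """Total time = parallel compute work split across GPUs + fixed overhead."""
    return work_s / n_gpus + overhead_s

work, overhead = 140.0, 4.0            # seconds (assumed, not measured)
t1 = predicted_time(work, overhead, 1)
for n in (1, 2, 4, 8, 14):
    tn = predicted_time(work, overhead, n)
    print(f"{n:2d} GPUs: {tn:6.1f} s   speedup x{t1 / tn:4.1f}")
```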
NASA Technical Reports Server (NTRS)
Duke, J. C., Jr.; Henneke, E. G., II
1986-01-01
To evaluate the response of composite materials, it is imperative that the input excitation as well as the observed output be well characterized. This characterization ideally should be in terms of displacements as a function of time with high spatial resolution. Additionally, the ability to prescribe these features for the excitation is highly desirable. Various methods for generating and detecting ultrasound in advanced composite materials are examined. Characterization and tailoring of input excitation is considered for contact and noncontact, mechanical, and electromechanical devices. Type of response as well as temporal and spatial resolution of detection methods are discussed as well. Results of investigations at Virginia Tech in application of these techniques to characterizing the response of advanced composites are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2013-07-01
The Mathematics and Computation Division of the American Nuclear Society (ANS) and the Idaho Section of the ANS hosted the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering (M and C 2013). This proceedings volume contains over 250 full papers, with topics ranging across reactor physics; radiation transport; materials science; nuclear fuels; core performance and optimization; reactor systems and safety; fluid dynamics; medical applications; analytical and numerical methods; algorithms for advanced architectures; and validation, verification, and uncertainty quantification.
ERIC Educational Resources Information Center
Moskovkin, Vladimir M.; Bocharova, Emilia A.; Balashova, Oksana V.
2014-01-01
Purpose: The purpose of this paper is to introduce and develop the methodology of journal benchmarking. Design/Methodology/ Approach: The journal benchmarking method is understood to be an analytic procedure of continuous monitoring and comparing of the advance of specific journal(s) against that of competing journals in the same subject area,…
Crew appliance computer program manual, volume 1
NASA Technical Reports Server (NTRS)
Russell, D. J.
1975-01-01
Trade studies of numerous appliance concepts for advanced spacecraft galley, personal hygiene, housekeeping, and other areas were made to determine which best satisfy the space shuttle orbiter and modular space station mission requirements. Analytical models of selected appliance concepts not currently included in the G-189A Generalized Environmental/Thermal Control and Life Support Systems (ETCLSS) Computer Program subroutine library were developed. The new appliance subroutines are given along with complete analytical model descriptions, solution methods, user's input instructions, and validation run results. The appliance components modeled were integrated with G-189A ETCLSS models for shuttle orbiter and modular space station, and results from computer runs of these systems are presented.
ERIC Educational Resources Information Center
Lee, Alwyn Vwen Yen; Tan, Seng Chee
2017-01-01
Understanding ideas in a discourse is challenging, especially in textual discourse analysis. We propose using temporal analytics with unsupervised machine learning techniques to investigate promising ideas for the collective advancement of communal knowledge in an online knowledge building discourse. A discourse unit network was constructed and…
A TENTATIVE GUIDE, DIFFERENTIAL AND INTEGRAL CALCULUS.
ERIC Educational Resources Information Center
BRANT, VINCENT; GERARDI, WILLIAM
THE COURSE IS INTENDED TO GO BEYOND THE REQUIREMENTS OF THE ADVANCED PLACEMENT PROGRAM IN MATHEMATICS AS DESIGNED BY THE COLLEGE ENTRANCE EXAMINATION BOARD. THE ADVANCED PLACEMENT PROGRAM CONSISTS OF A 1-YEAR COURSE COMBINING ANALYTIC GEOMETRY AND CALCULUS. PRESUPPOSED HERE ARE--A SEMESTER COURSE IN ANALYTIC GEOMETRY AND A THOROUGH KNOWLEDGE OF…
Advanced, Analytic, Automated (AAA) Measurement of Engagement during Learning
ERIC Educational Resources Information Center
D'Mello, Sidney; Dieterle, Ed; Duckworth, Angela
2017-01-01
It is generally acknowledged that engagement plays a critical role in learning. Unfortunately, the study of engagement has been stymied by a lack of valid and efficient measures. We introduce the advanced, analytic, and automated (AAA) approach to measure engagement at fine-grained temporal resolutions. The AAA measurement approach is grounded in…
NASA Astrophysics Data System (ADS)
Abbott, B. P.; Abbott, R.; Abbott, T. D.; Abernathy, M. R.; Acernese, F.; Ackley, K.; Adams, C.; Adams, T.; Addesso, P.; Adhikari, R. X.; Adya, V. B.; Affeldt, C.; Agathos, M.; Agatsuma, K.; Aggarwal, N.; Aguiar, O. D.; Aiello, L.; Ain, A.; Ajith, P.; Allen, B.; Allocca, A.; Altin, P. A.; Anderson, S. B.; Anderson, W. G.; Arai, K.; Araya, M. C.; Arceneaux, C. C.; Areeda, J. S.; Arnaud, N.; Arun, K. G.; Ascenzi, S.; Ashton, G.; Ast, M.; Aston, S. M.; Astone, P.; Aufmuth, P.; Aulbert, C.; Babak, S.; Bacon, P.; Bader, M. K. M.; Baker, P. T.; Baldaccini, F.; Ballardin, G.; Ballmer, S. W.; Barayoga, J. C.; Barclay, S. E.; Barish, B. C.; Barker, D.; Barone, F.; Barr, B.; Barsotti, L.; Barsuglia, M.; Barta, D.; Bartlett, J.; Bartos, I.; Bassiri, R.; Basti, A.; Batch, J. C.; Baune, C.; Bavigadda, V.; Bazzan, M.; Behnke, B.; Bejger, M.; Bell, A. S.; Bell, C. J.; Berger, B. K.; Bergman, J.; Bergmann, G.; Berry, C. P. L.; Bersanetti, D.; Bertolini, A.; Betzwieser, J.; Bhagwat, S.; Bhandare, R.; Bilenko, I. A.; Billingsley, G.; Birch, J.; Birney, R.; Biscans, S.; Bisht, A.; Bitossi, M.; Biwer, C.; Bizouard, M. A.; Blackburn, J. K.; Blair, C. D.; Blair, D. G.; Blair, R. M.; Bloemen, S.; Bock, O.; Bodiya, T. P.; Boer, M.; Bogaert, G.; Bogan, C.; Bohe, A.; Bojtos, P.; Bond, C.; Bondu, F.; Bonnand, R.; Boom, B. A.; Bork, R.; Boschi, V.; Bose, S.; Bouffanais, Y.; Bozzi, A.; Bradaschia, C.; Brady, P. R.; Braginsky, V. B.; Branchesi, M.; Brau, J. E.; Briant, T.; Brillet, A.; Brinkmann, M.; Brisson, V.; Brockill, P.; Brooks, A. F.; Brown, D. A.; Brown, D. D.; Brown, N. M.; Buchanan, C. C.; Buikema, A.; Bulik, T.; Bulten, H. J.; Buonanno, A.; Buskulic, D.; Buy, C.; Byer, R. L.; Cadonati, L.; Cagnoli, G.; Cahillane, C.; Calderón Bustillo, J.; Callister, T.; Calloni, E.; Camp, J. B.; Cannon, K. C.; Cao, J.; Capano, C. D.; Capocasa, E.; Carbognani, F.; Caride, S.; Casanueva Diaz, J.; Casentini, C.; Caudill, S.; Cavaglià, M.; Cavalier, F.; Cavalieri, R.; Cella, G.; Cepeda, C. B.; Cerboni Baiardi, L.; Cerretani, G.; Cesarini, E.; Chakraborty, R.; Chalermsongsak, T.; Chamberlin, S. J.; Chan, M.; Chao, S.; Charlton, P.; Chassande-Mottin, E.; Chen, H. Y.; Chen, Y.; Cheng, C.; Chincarini, A.; Chiummo, A.; Cho, H. S.; Cho, M.; Chow, J. H.; Christensen, N.; Chu, Q.; Chua, S.; Chung, S.; Ciani, G.; Clara, F.; Clark, J. A.; Cleva, F.; Coccia, E.; Cohadon, P.-F.; Colla, A.; Collette, C. G.; Cominsky, L.; Constancio, M., Jr.; Conte, A.; Conti, L.; Cook, D.; Corbitt, T. R.; Cornish, N.; Corsi, A.; Cortese, S.; Costa, C. A.; Coughlin, M. W.; Coughlin, S. B.; Coulon, J.-P.; Countryman, S. T.; Couvares, P.; Cowan, E. E.; Coward, D. M.; Cowart, M. J.; Coyne, D. C.; Coyne, R.; Craig, K.; Creighton, J. D. E.; Cripe, J.; Crowder, S. G.; Cumming, A.; Cunningham, L.; Cuoco, E.; Dal Canton, T.; Danilishin, S. L.; D’Antonio, S.; Danzmann, K.; Darman, N. S.; Dattilo, V.; Dave, I.; Daveloza, H. P.; Davier, M.; Davies, G. S.; Daw, E. J.; Day, R.; De, S.; DeBra, D.; Debreczeni, G.; Degallaix, J.; De Laurentis, M.; Deléglise, S.; Del Pozzo, W.; Denker, T.; Dent, T.; Dereli, H.; Dergachev, V.; De Rosa, R.; DeRosa, R. T.; DeSalvo, R.; Dhurandhar, S.; Díaz, M. C.; Di Fiore, L.; Di Giovanni, M.; Di Lieto, A.; Di Pace, S.; Di Palma, I.; Di Virgilio, A.; Dojcinoski, G.; Dolique, V.; Donovan, F.; Dooley, K. L.; Doravari, S.; Douglas, R.; Downes, T. P.; Drago, M.; Drever, R. W. P.; Driggers, J. C.; Du, Z.; Ducrot, M.; Dwyer, S. E.; Edo, T. B.; Edwards, M. C.; Effler, A.; Eggenstein, H.-B.; Ehrens, P.; Eichholz, J.; Eikenberry, S. 
S.; Engels, W.; Essick, R. C.; Etzel, T.; Evans, M.; Evans, T. M.; Everett, R.; Factourovich, M.; Fafone, V.; Fair, H.; Fairhurst, S.; Fan, X.; Fang, Q.; Farinon, S.; Farr, B.; Farr, W. M.; Favata, M.; Fays, M.; Fehrmann, H.; Fejer, M. M.; Ferrante, I.; Ferreira, E. C.; Ferrini, F.; Fidecaro, F.; Fiori, I.; Fiorucci, D.; Fisher, R. P.; Flaminio, R.; Fletcher, M.; Fong, H.; Fournier, J.-D.; Franco, S.; Frasca, S.; Frasconi, F.; Frei, Z.; Freise, A.; Frey, R.; Frey, V.; Fricke, T. T.; Fritschel, P.; Frolov, V. V.; Fulda, P.; Fyffe, M.; Gabbard, H. A. G.; Gair, J. R.; Gammaitoni, L.; Gaonkar, S. G.; Garufi, F.; Gatto, A.; Gaur, G.; Gehrels, N.; Gemme, G.; Gendre, B.; Genin, E.; Gennai, A.; George, J.; Gergely, L.; Germain, V.; Ghosh, Archisman; Ghosh, S.; Giaime, J. A.; Giardina, K. D.; Giazotto, A.; Gill, K.; Glaefke, A.; Goetz, E.; Goetz, R.; Gondan, L.; González, G.; Gonzalez Castro, J. M.; Gopakumar, A.; Gordon, N. A.; Gorodetsky, M. L.; Gossan, S. E.; Gosselin, M.; Gouaty, R.; Graef, C.; Graff, P. B.; Granata, M.; Grant, A.; Gras, S.; Gray, C.; Greco, G.; Green, A. C.; Groot, P.; Grote, H.; Grunewald, S.; Guidi, G. M.; Guo, X.; Gupta, A.; Gupta, M. K.; Gushwa, K. E.; Gustafson, E. K.; Gustafson, R.; Hacker, J. J.; Hall, B. R.; Hall, E. D.; Hammond, G.; Haney, M.; Hanke, M. M.; Hanks, J.; Hanna, C.; Hannam, M. D.; Hanson, J.; Hardwick, T.; Harms, J.; Harry, G. M.; Harry, I. W.; Hart, M. J.; Hartman, M. T.; Haster, C.-J.; Haughian, K.; Heidmann, A.; Heintze, M. C.; Heitmann, H.; Hello, P.; Hemming, G.; Hendry, M.; Heng, I. S.; Hennig, J.; Heptonstall, A. W.; Heurs, M.; Hild, S.; Hoak, D.; Hodge, K. A.; Hofman, D.; Hollitt, S. E.; Holt, K.; Holz, D. E.; Hopkins, P.; Hosken, D. J.; Hough, J.; Houston, E. A.; Howell, E. J.; Hu, Y. M.; Huang, S.; Huerta, E. A.; Huet, D.; Hughey, B.; Husa, S.; Huttner, S. H.; Huynh-Dinh, T.; Idrisy, A.; Indik, N.; Ingram, D. R.; Inta, R.; Isa, H. N.; Isac, J.-M.; Isi, M.; Islas, G.; Isogai, T.; Iyer, B. R.; Izumi, K.; Jacqmin, T.; Jang, H.; Jani, K.; Jaranowski, P.; Jawahar, S.; Jiménez-Forteza, F.; Johnson, W. W.; Jones, D. I.; Jones, R.; Jonker, R. J. G.; Ju, L.; K, Haris; Kalaghatgi, C. V.; Kalogera, V.; Kandhasamy, S.; Kang, G.; Kanner, J. B.; Karki, S.; Kasprzack, M.; Katsavounidis, E.; Katzman, W.; Kaufer, S.; Kaur, T.; Kawabe, K.; Kawazoe, F.; Kéfélian, F.; Kehl, M. S.; Keitel, D.; Kelley, D. B.; Kells, W.; Kennedy, R.; Key, J. S.; Khalaidovski, A.; Khalili, F. Y.; Khan, I.; Khan, S.; Khan, Z.; Khazanov, E. A.; Kijbunchoo, N.; Kim, C.; Kim, J.; Kim, K.; Kim, Nam-Gyu; Kim, Namjun; Kim, Y.-M.; King, E. J.; King, P. J.; Kinzel, D. L.; Kissel, J. S.; Kleybolte, L.; Klimenko, S.; Koehlenbeck, S. M.; Kokeyama, K.; Koley, S.; Kondrashov, V.; Kontos, A.; Korobko, M.; Korth, W. Z.; Kowalska, I.; Kozak, D. B.; Kringel, V.; Krishnan, B.; Królak, A.; Krueger, C.; Kuehn, G.; Kumar, P.; Kuo, L.; Kutynia, A.; Lackey, B. D.; Landry, M.; Lange, J.; Lantz, B.; Lasky, P. D.; Lazzarini, A.; Lazzaro, C.; Leaci, P.; Leavey, S.; Lebigot, E. O.; Lee, C. H.; Lee, H. K.; Lee, H. M.; Lee, K.; Lenon, A.; Leonardi, M.; Leong, J. R.; Leroy, N.; Letendre, N.; Levin, Y.; Levine, B. M.; Li, T. G. F.; Libson, A.; Littenberg, T. B.; Lockerbie, N. A.; Logue, J.; Lombardi, A. L.; Lord, J. E.; Lorenzini, M.; Loriette, V.; Lormand, M.; Losurdo, G.; Lough, J. D.; Lück, H.; Lundgren, A. P.; Luo, J.; Lynch, R.; Ma, Y.; MacDonald, T.; Machenschalk, B.; MacInnis, M.; Macleod, D. M.; Magaña-Sandoval, F.; Magee, R. 
M.; Mageswaran, M.; Majorana, E.; Maksimovic, I.; Malvezzi, V.; Man, N.; Mandel, I.; Mandic, V.; Mangano, V.; Mansell, G. L.; Manske, M.; Mantovani, M.; Marchesoni, F.; Marion, F.; Márka, S.; Márka, Z.; Markosyan, A. S.; Maros, E.; Martelli, F.; Martellini, L.; Martin, I. W.; Martin, R. M.; Martynov, D. V.; Marx, J. N.; Mason, K.; Masserot, A.; Massinger, T. J.; Masso-Reid, M.; Matichard, F.; Matone, L.; Mavalvala, N.; Mazumder, N.; Mazzolo, G.; McCarthy, R.; McClelland, D. E.; McCormick, S.; McGuire, S. C.; McIntyre, G.; McIver, J.; McManus, D. J.; McWilliams, S. T.; Meacher, D.; Meadors, G. D.; Meidam, J.; Melatos, A.; Mendell, G.; Mendoza-Gandara, D.; Mercer, R. A.; Merilh, E.; Merzougui, M.; Meshkov, S.; Messenger, C.; Messick, C.; Meyers, P. M.; Mezzani, F.; Miao, H.; Michel, C.; Middleton, H.; Mikhailov, E. E.; Milano, L.; Miller, J.; Millhouse, M.; Minenkov, Y.; Ming, J.; Mirshekari, S.; Mishra, C.; Mitra, S.; Mitrofanov, V. P.; Mitselmakher, G.; Mittleman, R.; Moggi, A.; Mohan, M.; Mohapatra, S. R. P.; Montani, M.; Moore, B. C.; Moore, C. J.; Moraru, D.; Moreno, G.; Morriss, S. R.; Mossavi, K.; Mours, B.; Mow-Lowry, C. M.; Mueller, C. L.; Mueller, G.; Muir, A. W.; Mukherjee, Arunava; Mukherjee, D.; Mukherjee, S.; Mukund, N.; Mullavey, A.; Munch, J.; Murphy, D. J.; Murray, P. G.; Mytidis, A.; Nardecchia, I.; Naticchioni, L.; Nayak, R. K.; Necula, V.; Nedkova, K.; Nelemans, G.; Neri, M.; Neunzert, A.; Newton, G.; Nguyen, T. T.; Nielsen, A. B.; Nissanke, S.; Nitz, A.; Nocera, F.; Nolting, D.; Normandin, M. E.; Nuttall, L. K.; Oberling, J.; Ochsner, E.; O’Dell, J.; Oelker, E.; Ogin, G. H.; Oh, J. J.; Oh, S. H.; Ohme, F.; Oliver, M.; Oppermann, P.; Oram, Richard J.; O’Reilly, B.; O’Shaughnessy, R.; Ottaway, D. J.; Ottens, R. S.; Overmier, H.; Owen, B. J.; Pai, A.; Pai, S. A.; Palamos, J. R.; Palashov, O.; Palomba, C.; Pal-Singh, A.; Pan, H.; Pankow, C.; Pannarale, F.; Pant, B. C.; Paoletti, F.; Paoli, A.; Papa, M. A.; Paris, H. R.; Parker, W.; Pascucci, D.; Pasqualetti, A.; Passaquieti, R.; Passuello, D.; Patricelli, B.; Patrick, Z.; Pearlstone, B. L.; Pedraza, M.; Pedurand, R.; Pekowsky, L.; Pele, A.; Penn, S.; Perreca, A.; Phelps, M.; Piccinni, O.; Pichot, M.; Piergiovanni, F.; Pierro, V.; Pillant, G.; Pinard, L.; Pinto, I. M.; Pitkin, M.; Poggiani, R.; Popolizio, P.; Porter, E. K.; Post, A.; Powell, J.; Prasad, J.; Predoi, V.; Premachandra, S. S.; Prestegard, T.; Price, L. R.; Prijatelj, M.; Principe, M.; Privitera, S.; Prodi, G. A.; Prokhorov, L.; Puncken, O.; Punturo, M.; Puppo, P.; Pürrer, M.; Qi, H.; Qin, J.; Quetschke, V.; Quintero, E. A.; Quitzow-James, R.; Raab, F. J.; Rabeling, D. S.; Radkins, H.; Raffai, P.; Raja, S.; Rakhmanov, M.; Rapagnani, P.; Raymond, V.; Razzano, M.; Re, V.; Read, J.; Reed, C. M.; Regimbau, T.; Rei, L.; Reid, S.; Reitze, D. H.; Rew, H.; Reyes, S. D.; Ricci, F.; Riles, K.; Robertson, N. A.; Robie, R.; Robinet, F.; Rocchi, A.; Rolland, L.; Rollins, J. G.; Roma, V. J.; Romano, R.; Romanov, G.; Romie, J. H.; Rosińska, D.; Rowan, S.; Rüdiger, A.; Ruggi, P.; Ryan, K.; Sachdev, S.; Sadecki, T.; Sadeghian, L.; Salconi, L.; Saleem, M.; Salemi, F.; Samajdar, A.; Sammut, L.; Sampson, L.; Sanchez, E. J.; Sandberg, V.; Sandeen, B.; Sanders, J. R.; Sassolas, B.; Sathyaprakash, B. S.; Saulson, P. R.; Sauter, O.; Savage, R. L.; Sawadsky, A.; Schale, P.; Schilling, R.; Schmidt, J.; Schmidt, P.; Schnabel, R.; Schofield, R. M. S.; Schönbeck, A.; Schreiber, E.; Schuette, D.; Schutz, B. F.; Scott, J.; Scott, S. M.; Sellers, D.; Sengupta, A. 
S.; Sentenac, D.; Sequino, V.; Sergeev, A.; Serna, G.; Setyawati, Y.; Sevigny, A.; Shaddock, D. A.; Shah, S.; Shahriar, M. S.; Shaltev, M.; Shao, Z.; Shapiro, B.; Shawhan, P.; Sheperd, A.; Shoemaker, D. H.; Shoemaker, D. M.; Siellez, K.; Siemens, X.; Sigg, D.; Silva, A. D.; Simakov, D.; Singer, A.; Singer, L. P.; Singh, A.; Singh, R.; Singhal, A.; Sintes, A. M.; Slagmolen, B. J. J.; Smith, J. R.; Smith, N. D.; Smith, R. J. E.; Son, E. J.; Sorazu, B.; Sorrentino, F.; Souradeep, T.; Srivastava, A. K.; Staley, A.; Steinke, M.; Steinlechner, J.; Steinlechner, S.; Steinmeyer, D.; Stephens, B. C.; Stevenson, S.; Stone, R.; Strain, K. A.; Straniero, N.; Stratta, G.; Strauss, N. A.; Strigin, S.; Sturani, R.; Stuver, A. L.; Summerscales, T. Z.; Sun, L.; Sutton, P. J.; Swinkels, B. L.; Szczepańczyk, M. J.; Tacca, M.; Talukder, D.; Tanner, D. B.; Tápai, M.; Tarabrin, S. P.; Taracchini, A.; Taylor, R.; Theeg, T.; Thirugnanasambandam, M. P.; Thomas, E. G.; Thomas, M.; Thomas, P.; Thorne, K. A.; Thorne, K. S.; Thrane, E.; Tiwari, S.; Tiwari, V.; Tokmakov, K. V.; Tomlinson, C.; Tonelli, M.; Torres, C. V.; Torrie, C. I.; Töyrä, D.; Travasso, F.; Traylor, G.; Trifirò, D.; Tringali, M. C.; Trozzo, L.; Tse, M.; Turconi, M.; Tuyenbayev, D.; Ugolini, D.; Unnikrishnan, C. S.; Urban, A. L.; Usman, S. A.; Vahlbruch, H.; Vajente, G.; Valdes, G.; Vallisneri, M.; van Bakel, N.; van Beuzekom, M.; van den Brand, J. F. J.; Van Den Broeck, C.; Vander-Hyde, D. C.; van der Schaaf, L.; van Heijningen, J. V.; van Veggel, A. A.; Vardaro, M.; Vass, S.; Vasúth, M.; Vaulin, R.; Vecchio, A.; Vedovato, G.; Veitch, J.; Veitch, P. J.; Venkateswara, K.; Verkindt, D.; Vetrano, F.; Viceré, A.; Vinciguerra, S.; Vine, D. J.; Vinet, J.-Y.; Vitale, S.; Vo, T.; Vocca, H.; Vorvick, C.; Voss, D.; Vousden, W. D.; Vyatchanin, S. P.; Wade, A. R.; Wade, L. E.; Wade, M.; Walker, M.; Wallace, L.; Walsh, S.; Wang, G.; Wang, H.; Wang, M.; Wang, X.; Wang, Y.; Ward, R. L.; Warner, J.; Was, M.; Weaver, B.; Wei, L.-W.; Weinert, M.; Weinstein, A. J.; Weiss, R.; Welborn, T.; Wen, L.; Wesels, P.; Westphal, T.; Wette, K.; Whelan, J. T.; White, D. J.; Whiting, B. F.; Williams, R. D.; Williamson, A. R.; Willis, J. L.; Willke, B.; Wimmer, M. H.; Winkler, W.; Wipf, C. C.; Wittel, H.; Woan, G.; Worden, J.; Wright, J. L.; Wu, G.; Yablon, J.; Yam, W.; Yamamoto, H.; Yancey, C. C.; Yap, M. J.; Yu, H.; Yvert, M.; Zadrożny, A.; Zangrando, L.; Zanolin, M.; Zendri, J.-P.; Zevin, M.; Zhang, F.; Zhang, L.; Zhang, M.; Zhang, Y.; Zhao, C.; Zhou, M.; Zhou, Z.; Zhu, X. J.; Zucker, M. E.; Zuraw, S. E.; Zweizig, J.; LIGO Scientific Collaboration; Virgo Collaboration
2016-12-01
This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc^-3 yr^-1. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
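As a heavily hedged illustration only (not the collaboration's hierarchical analysis), the sketch below shows how a textbook Poisson rate posterior with a Jeffreys prior turns an event count and a sensitive space-time volume into a rate with a 90% interval in Gpc^-3 yr^-1; the count and volume are placeholder values.

```python
from scipy.stats import gamma

N = 2        # assumed number of detected events (placeholder)
VT = 0.10    # assumed sensitive volume-time [Gpc^3 yr] (placeholder)

# Poisson likelihood with a Jeffreys prior gives R | N ~ Gamma(N + 1/2, scale = 1/VT).
posterior = gamma(a=N + 0.5, scale=1.0 / VT)
lo, hi = posterior.ppf([0.05, 0.95])
print(f"median rate {posterior.median():.0f} Gpc^-3 yr^-1, "
      f"90% interval [{lo:.0f}, {hi:.0f}] Gpc^-3 yr^-1")
```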
A Comparison of Interactional Aerodynamics Methods for a Helicopter in Low Speed Flight
NASA Technical Reports Server (NTRS)
Berry, John D.; Letnikov, Victor; Bavykina, Irena; Chaffin, Mark S.
1998-01-01
Recent advances in computing subsonic flow have been applied to helicopter configurations with various degrees of success. This paper is a comparison of two specific methods applied to a particularly challenging regime of helicopter flight, very low speeds, where the interaction of the rotor wake and the fuselage are most significant. Comparisons are made between different methods of predicting the interactional aerodynamics associated with a simple generic helicopter configuration. These comparisons are made using fuselage pressure data from a Mach-scaled powered model helicopter with a rotor diameter of approximately 3 meters. The data shown are for an advance ratio of 0.05 with a thrust coefficient of 0.0066. The results of this comparison show that in this type of complex flow both analytical techniques have regions where they are more accurate in matching the experimental data.
Turning the Page: Advancing Paper-Based Microfluidics for Broad Diagnostic Application.
Gong, Max M; Sinton, David
2017-06-28
Infectious diseases are a major global health issue. Diagnosis is a critical first step in effectively managing their spread. Paper-based microfluidic diagnostics first emerged in 2007 as a low-cost alternative to conventional laboratory testing, with the goal of improving accessibility to medical diagnostics in developing countries. In this review, we examine the advances in paper-based microfluidic diagnostics for medical diagnosis in the context of global health from 2007 to 2016. The theory of fluid transport in paper is first presented. The next section examines the strategies that have been employed to control fluid and analyte transport in paper-based assays. Tasks such as mixing, timing, and sequential fluid delivery have been achieved in paper and have enabled analytical capabilities comparable to those of conventional laboratory methods. The following section examines paper-based sample processing and analysis. The most impactful advancement here has been the translation of nucleic acid analysis to a paper-based format. Smartphone-based analysis is another exciting development with potential for wide dissemination. The last core section of the review highlights emerging health applications, such as male fertility testing and wearable diagnostics. We conclude the review with the future outlook, remaining challenges, and emerging opportunities.
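The theory of fluid transport in paper usually starts from the classical Lucas-Washburn relation, which the sketch below evaluates with assumed pore radius and contact angle; it is illustrative only and not taken from the review.

```python
import numpy as np

gamma = 0.072    # surface tension of water [N/m]
mu = 1.0e-3      # viscosity of water [Pa s]
r = 1.0e-6       # effective pore radius [m] (assumed)
theta = 0.0      # contact angle [rad] (assumed fully wetting)

t = np.array([1.0, 10.0, 60.0, 300.0])                   # time [s]
L = np.sqrt(gamma * r * np.cos(theta) * t / (2.0 * mu))  # wicked length [m]
for ti, Li in zip(t, L):
    print(f"t = {ti:5.0f} s  ->  wicked length ~ {Li * 1000:.0f} mm")
```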
Relative frequencies of constrained events in stochastic processes: An analytical approach.
Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C
2015-10-01
The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of a PDF are difficult to specify in advance. Knowing the shapes of the PDFs, and using experimental data, different optimization schemes can be applied in order to evaluate the probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding, and often not feasible. We show that, in the case where experimentally accessed properties are directly related to the frequencies of events involved, it may be possible to replace the heavy Monte Carlo core of optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor of the order of the sample size (at least ≈10^4). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in the exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled by a constrained stochastic process. Constrained systems are quite common, and this makes the method useful for various applications.
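A minimal Gillespie-style SSA sketch is given below for two competing first-order events, estimating a branching fraction by direct simulation and comparing it with its simple analytical value k1/(k1+k2); this is an illustration of the Monte Carlo core being discussed, not the paper's CRP model or its analytical replacement.

```python
import random

def ssa_branching(k1, k2, n_A, seed=0):
    """Direct SSA for two competing events A -> B (rate k1) and A -> C (rate k2)."""
    rng = random.Random(seed)
    t = 0.0
    count1 = total = 0
    while n_A > 0:
        a1, a2 = k1 * n_A, k2 * n_A          # propensities
        t += rng.expovariate(a1 + a2)        # waiting time to the next event
        if rng.random() < a1 / (a1 + a2):    # which event fires
            count1 += 1
        total += 1
        n_A -= 1                             # either event consumes one A
    return count1 / total

print("simulated branching fraction:", ssa_branching(0.7, 0.3, 10000))
print("analytical value k1/(k1+k2):", 0.7 / (0.7 + 0.3))
```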
Hrvolová, Barbora; Martínez-Huélamo, Miriam; Colmán-Martínez, Mariel; Hurtado-Barroso, Sara; Lamuela-Raventós, Rosa Maria; Kalina, Jiří
2016-01-01
The concentration of carotenoids and fat-soluble vitamins in human plasma may play a significant role in numerous chronic diseases such as age-related macular degeneration and some types of cancer. Although these compounds are of utmost interest for human health, methods for their simultaneous determination are scarce. A new high pressure liquid chromatography (HPLC)-tandem mass spectrometry (MS/MS) method for the quantification of selected carotenoids and fat-soluble vitamins in human plasma was developed, validated, and then applied in a pilot dietary intervention study with healthy volunteers. In 50 min, 16 analytes were separated with an excellent resolution and suitable MS signal intensity. The proposed HPLC–MS/MS method led to improvements in the limits of detection (LOD) and quantification (LOQ) for all analyzed compounds compared to the most often used HPLC–DAD methods, in some cases being more than 100-fold lower. LOD values were between 0.001 and 0.422 µg/mL and LOQ values ranged from 0.003 to 1.406 µg/mL, according to the analyte. The accuracy, precision, and stability met with the acceptance criteria of the AOAC (Association of Official Analytical Chemists) International. According to these results, the described HPLC-MS/MS method is adequately sensitive, repeatable and suitable for the large-scale analysis of compounds in biological fluids. PMID:27754400
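One common (ICH-style) way LOD and LOQ figures such as those above are derived from a linear calibration is LOD = 3.3*sigma/S and LOQ = 10*sigma/S, with S the slope and sigma the residual standard deviation; the sketch below uses invented calibration data, not the authors' procedure or results.

```python
import numpy as np

# Invented calibration data (concentration in ug/mL vs. instrument response).
conc = np.array([0.01, 0.05, 0.1, 0.5, 1.0, 2.0])
signal = np.array([0.9, 4.8, 10.2, 50.5, 99.0, 201.0])

S, intercept = np.polyfit(conc, signal, 1)   # slope and intercept of the fit
residuals = signal - (S * conc + intercept)
sigma = residuals.std(ddof=2)                # residual standard deviation

print(f"LOD = {3.3 * sigma / S:.4f} ug/mL,  LOQ = {10 * sigma / S:.4f} ug/mL")
```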
Active Control of Inlet Noise on the JT15D Turbofan Engine
NASA Technical Reports Server (NTRS)
Smith, Jerome P.; Hutcheson, Florence V.; Burdisso, Ricardo A.; Fuller, Chris R.
1999-01-01
This report presents the key results obtained by the Vibration and Acoustics Laboratories at Virginia Tech over the year from November 1997 to December 1998 on the Active Noise Control of Turbofan Engines research project funded by NASA Langley Research Center. The concept of implementing active noise control techniques with fuselage-mounted error sensors is investigated both analytically and experimentally. The analytical part of the project involves the continued development of an advanced modeling technique to provide prediction and design guidelines for application of active noise control techniques to large, realistic high bypass engines of the type on which active control methods are expected to be applied. Results from the advanced analytical model are presented that show the effectiveness of the control strategies, and the analytical results presented for fuselage error sensors show good agreement with the experimentally observed results and provide additional insight into the control phenomena. Additional analytical results are presented for active noise control used in conjunction with a wavenumber sensing technique. The experimental work is carried out on a running JT15D turbofan jet engine in a test stand at Virginia Tech. The control strategy used in these tests was the feedforward Filtered-X LMS algorithm. The control inputs were supplied by single and multiple circumferential arrays of acoustic sources equipped with neodymium iron cobalt magnets mounted upstream of the fan. The reference signal was obtained from an inlet mounted eddy current probe. The error signals were obtained from a number of pressure transducers flush-mounted in a simulated fuselage section mounted in the engine test cell. The active control methods are investigated when implemented with the control sources embedded within the acoustically absorptive material on a passively-lined inlet. The experimental results show that the combination of active control techniques with fuselage-mounted error sensors and passive control techniques is an effective means of reducing radiated noise from turbofan engines. Strategic selection of the location of the error transducers is shown to be effective for reducing the radiation towards particular directions in the farfield. An analytical model is used to predict the behavior of the control system and to guide the experimental design configurations, and the analytical results presented show good agreement with the experimentally observed results.
Asgharzadeh, Hafez; Borazjani, Iman
2017-02-15
The explicit and semi-implicit schemes in flow simulations involving complex geometries and moving boundaries suffer from time-step size restriction and low convergence rates. Implicit schemes can be used to overcome these restrictions, but implementing them to solve the Navier-Stokes equations is not straightforward due to their non-linearity. Among the implicit schemes for nonlinear equations, Newton-based techniques are preferred over fixed-point techniques because of their high convergence rate, but each Newton iteration is more expensive than a fixed-point iteration. Krylov subspace methods are among the most advanced iterative methods that can be combined with Newton methods, i.e., Newton-Krylov Methods (NKMs), to solve non-linear systems of equations. The success of NKMs vastly depends on the scheme for forming the Jacobian, e.g., automatic differentiation is very expensive, and matrix-free methods without a preconditioner slow down as the mesh is refined. A novel, computationally inexpensive analytical Jacobian for NKM is developed to solve the unsteady incompressible Navier-Stokes momentum equations on staggered overset-curvilinear grids with immersed boundaries. Moreover, the analytical Jacobian is used to form a preconditioner for the matrix-free method in order to improve its performance. The NKM with the analytical Jacobian was validated and verified against the Taylor-Green vortex, inline oscillations of a cylinder in a fluid initially at rest, and pulsatile flow in a 90 degree bend. The capability of the method in handling complex geometries with multiple overset grids and immersed boundaries is shown by simulating an intracranial aneurysm. It was shown that the NKM with an analytical Jacobian is 1.17 to 14.77 times faster than the fixed-point Runge-Kutta method, and 1.74 to 152.3 times (excluding an intensively stretched grid) faster than automatic differentiation, depending on the grid (size) and the flow problem. In addition, it was shown that using only the diagonal of the Jacobian further improves the performance by 42-74% compared to the full Jacobian. The NKM with an analytical Jacobian showed better performance than the fixed-point Runge-Kutta because it converged with higher time steps and in approximately 30% fewer iterations, even when the grid was stretched and the Reynolds number was increased. In fact, stretching the grid decreased the performance of all methods, but the fixed-point Runge-Kutta performance decreased 4.57 and 2.26 times more than the NKM with a diagonal and full Jacobian, respectively, when the stretching factor was increased. The NKM with a diagonal analytical Jacobian and the matrix-free method with an analytical preconditioner are the fastest methods, and the superiority of one over the other depends on the flow problem. Furthermore, the implemented methods are fully parallelized with parallel efficiency of 80-90% on the problems tested. The NKM with the analytical Jacobian can guide building preconditioners for other techniques to improve their performance in the future. PMID:28042172
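For readers unfamiliar with the machinery, the sketch below (Python, not the authors' curvilinear/immersed-boundary solver) shows the bare bones of a matrix-free Newton-Krylov iteration on a small hypothetical nonlinear system: each Newton step solves J(u) du = -F(u) with GMRES, and the Jacobian-vector product is approximated by a finite-difference directional derivative. Supplying an analytical Jacobian, or using it as a preconditioner for this matrix-free step, is where the reported speedups come from.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def residual(u):
    # Small nonlinear test system (not the Navier-Stokes residual of the paper).
    return np.array([u[0]**2 + u[1] - 3.0,
                     u[0] + u[1]**2 - 5.0])

def jfnk(u0, tol=1e-10, max_newton=20, eps=1e-7):
    """Jacobian-free Newton-Krylov: solve J(u) du = -F(u) with GMRES at each
    Newton step, approximating J(u) @ v by a finite-difference directional derivative."""
    u = u0.astype(float)
    for _ in range(max_newton):
        F = residual(u)
        if np.linalg.norm(F) < tol:
            break
        def jv(v):
            return (residual(u + eps * v) - F) / eps   # matrix-free J(u) @ v
        J = LinearOperator((u.size, u.size), matvec=jv)
        du, info = gmres(J, -F, atol=1e-12)
        u = u + du
    return u

print(jfnk(np.array([1.0, 1.0])))   # converges to the root (1, 2)
```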
Mean curvature model for a quasi-static advancing meniscus: a drop tower test
NASA Astrophysics Data System (ADS)
Chen, Yongkang; Tavan, Noel; Weislogel, Mark
A critical geometric wetting condition resulting in a significant shift of a capillary fluid from one region of a container to another was recently demonstrated during experiments performed aboard the International Space Station (the Capillary Flow Experiments, Vane Gap test units, bulk shift phenomena). Such phenomena are of interest for advanced methods of control for large quantities of liquids aboard spacecraft. The dynamics of the flows are well understood, but analytical models remain qualitative without the correct capillary pressure driving force for the shifting bulk fluid—where one large interface (meniscus) advances while another recedes. To determine this pressure an investigation of the mean curvature of the advancing meniscus is presented which is inspired by earlier studies of receding bulk menisci in non-circular cylindrical containers. The approach is permissible only in the quasi-static limit. It will be shown that the mean curvature of the advancing bulk meniscus is related to that of the receding bulk meniscus, both of which are highly sensitive to container geometry and wetting conditions. The two meniscus curvatures are identical for any control parameter at the critical value identified by the Concus-Finn analysis. However, they differ when the control parameter is below its critical value. Experiments along these lines are well suited for drop towers and comparisons with the analytical predictions implementing the mean curvature model are presented. The validation opens a pathway to the analysis of such flows in containers of great geometric complexity.
Historical review of missile aerodynamic developments
NASA Technical Reports Server (NTRS)
Spearman, M. Leroy
1989-01-01
The development of missiles from early history up to about 1970 is discussed. Early unpowered missiles beyond the rock include the spear, the bow and arrow, the gun and bullet, and the cannon and projectile. Combining gunpowder with projectiles resulted in the first powered missiles. In the early 1900's, the development of guided missiles was begun. Significant advances in missile technology were made by German scientists during World War II. The dispersion of these advances to other countries following the war resulted in accelerating the development of guided missiles. In the late 1940's and early 1950's there was a proliferation in the development of missile systems in many countries. These developments were based primarily on experimental work and on relatively crude analytical techniques. Discussed here are some of the missile systems that were developed up to about 1970; some of the problems encountered; the development of an experimental data base for use with missiles; and early efforts to develop analytical methods applicable to missiles.
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer's patients result in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated into a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, where existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology, can be combined with data-driven analytics to generate predictive actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Aljabri, Abdullah S.
1988-01-01
High speed subsonic transports powered by advanced propellers provide significant fuel savings compared to turbofan-powered transports. Unfortunately, however, propfans must operate in aircraft-induced nonuniform flow fields which can lead to high blade cyclic stresses, vibration, and noise. To optimize the design and installation of these advanced propellers, therefore, detailed knowledge of the complex flow field is required. As part of the NASA Propfan Test Assessment (PTA) program, a 1/9 scale semispan model of the Gulfstream II propfan test-bed aircraft was tested in the NASA-Lewis 8 x 6 supersonic wind tunnel to obtain propeller flow field data. Detailed radial and azimuthal surveys were made to obtain the total pressure in the flow and the three components of velocity. Data were acquired for Mach numbers ranging from 0.6 to 0.85. Analytical predictions were also made using a subsonic panel method, QUADPAN. Comparison of wind-tunnel measurements and analytical predictions shows good agreement throughout the Mach range.
Jabłońska-Czapla, Magdalena
2015-01-01
Chemical speciation is a very important subject in the environmental protection, toxicology, and chemical analytics due to the fact that toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex matrix samples, requires more and more advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and (underestimated) thallium attract the closest attention of toxicologists and analysts. The properties of those elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question why the speciation analytics is so important. The paper also provides numerous examples of the hyphenated technique usage (e.g., the LC-ICP-MS application in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962
ERIC Educational Resources Information Center
Piunno, Paul A. E.; Zetina, Adrian; Chu, Norman; Tavares, Anthony J.; Noor, M. Omair; Petryayeva, Eleonora; Uddayasankar, Uvaraj; Veglio, Andrew
2014-01-01
An advanced analytical chemistry undergraduate laboratory module on microfluidics that spans 4 weeks (4 h per week) is presented. The laboratory module focuses on comprehensive experiential learning of microfluidic device fabrication and the core characteristics of microfluidic devices as they pertain to fluid flow and the manipulation of samples.…
AN ADVANCED PLACEMENT COURSE IN ANALYTIC GEOMETRY AND CALCULUS (MATHEMATICS XV X AP).
ERIC Educational Resources Information Center
DEROLF, JOHN J.; MIENTKA, WALTER E.
THIS TEXT ON ANALYTIC GEOMETRY AND CALCULUS IS A CORRESPONDENCE COURSE DESIGNED FOR ADVANCED PLACEMENT OF HIGH SCHOOL STUDENTS IN COLLEGE. EACH OF THE 21 LESSONS INCLUDES READING ASSIGNMENTS AND LISTS OF PROBLEMS TO BE WORKED. IN ADDITION, SUPPLEMENTARY EXPLANATIONS AND COMMENTS ARE INCLUDED THAT (1) PROVIDE ILLUSTRATIVE EXAMPLES OF CONCEPTS AND…
ERIC Educational Resources Information Center
Wilczek-Vera, Grazyna; Salin, Eric Dunbar
2011-01-01
An experiment on fluorescence spectroscopy suitable for an advanced analytical laboratory is presented. Its conceptual development used a combination of the expository and discovery styles. The "learn-as-you-go" and direct "hands-on" methodology applied ensures an active role for a student in the process of visualization and discovery of concepts.…
Independent Research and Independent Exploratory Development Annual Report Fiscal Year 1975
1975-09-01
Contents fragments: "...and Coding Study"; "Optical Covert Communications Using Laser Transceivers"; publications include Wagner, N. K., "Analysis of Microelectronic Materials Using Auger Spectroscopy and Additional Advanced Analytical Techniques," NELC Technical Note 2904.
Lin, Shan-Yang; Wang, Shun-Li
2012-04-01
The solid-state chemistry of drugs has seen growing importance in the pharmaceutical industry for the development of useful APIs (active pharmaceutical ingredients) and stable dosage forms. The stability of drugs in various solid dosage forms is an important issue because solid dosage forms are the most common pharmaceutical formulation in clinical use. In solid-state stability studies of drugs, an ideal accelerated method must not only be selected from among different, complicated methods, but must also detect the formation of degradation products. In this review article, an analytical technique combining differential scanning calorimetry and Fourier-transform infrared (DSC-FTIR) microspectroscopy simulates the accelerated stability test and simultaneously detects the decomposed products in real time. The pharmaceutical dipeptides aspartame hemihydrate, lisinopril dihydrate, and enalapril maleate, either with or without Eudragit E, were used as test examples. This one-step simultaneous DSC-FTIR technique for real-time detection of diketopiperazine (DKP) directly evidenced the dehydration process and DKP formation as an impurity common in pharmaceutical dipeptides. DKP formation in various dipeptides determined by different analytical methods has been collected and compiled. Although many analytical methods have been applied, the combined DSC-FTIR technique is an easy and fast analytical method which not only simulates accelerated drug stability testing but also, at the same time, enables exploration of phase transformation as well as degradation due to thermal reactions. This technique offers quick and proper interpretations. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Deguchi, Shuji; Watson, William D.
1988-01-01
Statistical methods are developed for gravitational lensing in order to obtain analytic expressions for the average surface brightness that include the effects of microlensing by stellar (or other compact) masses within the lensing galaxy. The primary advance here is in utilizing a Markoff technique to obtain expressions that are valid for sources of finite size when the surface density of mass in the lensing galaxy is large. The finite size of the source is probably the key consideration for the occurrence of microlensing by individual stars. For the intensity from a particular location, the parameter which governs the importance of microlensing is determined. Statistical methods are also formulated to assess the time variation of the surface brightness due to the random motion of the masses that cause the microlensing.
Structural analysis at aircraft conceptual design stage
NASA Astrophysics Data System (ADS)
Mansouri, Reza
In the past 50 years, computers have augmented human efforts at a tremendous pace. The aircraft industry is not an exception. It is more than ever dependent on computing because of a high level of complexity and the increasing need for excellence to survive a highly competitive marketplace. Designers choose computers to perform almost every analysis task. But while doing so, existing effective, accurate, and easy-to-use classical analytical methods are often forgotten, even though they can be very useful, especially in the early phases of aircraft design, where concept generation and evaluation demand physical visibility of design parameters to make decisions [39, 2004]. Structural analysis methods have been used by human beings since the earliest civilizations. Centuries before computers were invented, the pyramids were designed and constructed by the Egyptians around 2000 B.C., the Parthenon was built by the Greeks around 440 B.C., and Dujiangyan was built by the Chinese. Persepolis, Hagia Sophia, the Taj Mahal, and the Eiffel Tower are only a few more examples of historical buildings, bridges, and monuments that were constructed before any advancement was made in computer-aided engineering. The aircraft industry is no exception either. In the first half of the 20th century, engineers used classical methods and designed civil transport aircraft such as the Ford Tri-Motor (1926), Lockheed Vega (1927), Lockheed 9 Orion (1931), Douglas DC-3 (1935), Douglas DC-4/C-54 Skymaster (1938), Boeing 307 (1938), and Boeing 314 Clipper (1939), which became airborne without difficulty. Evidently, while advanced numerical methods such as finite element analysis are among the most effective structural analysis methods, classical structural analysis methods can be just as useful, especially during the early phase of a fixed-wing aircraft design, where major decisions are made and concept generation and evaluation demand physical visibility of design parameters. Considering the strengths and limitations of both methodologies, the questions to be answered in this thesis are: How valuable and compatible are the classical analytical methods in today's conceptual design environment? And can these methods complement each other? To answer these questions, this thesis investigates the pros and cons of classical analytical structural analysis methods during the conceptual design stage through the following objectives: illustrate the structural design methodology of these methods within the framework of the Aerospace Vehicle Design (AVD) lab's design lifecycle; and demonstrate the effectiveness of the moment distribution method through four case studies (a sketch of the method follows below). This is done by considering and evaluating the strengths and limitations of these methods. In order to objectively quantify the limitations and capabilities of the analytical method at the conceptual design stage, each case study becomes more complex than the one before.
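As a concrete illustration of the kind of classical method the thesis champions, here is a minimal sketch (Python, with hypothetical span, load, and stiffness values, not one of the thesis case studies) of the moment distribution method for a two-span beam with fixed outer ends: fixed-end moments are balanced at the interior joint using distribution factors, and half of each correction is carried over to the far ends.

```python
# Hypothetical two-span beam A-B-C with fixed ends at A and C and a continuous
# interior support at B; equal spans L, equal EI, uniform load w on span AB only.
w, L = 10.0, 4.0                           # assumed load (kN/m) and span (m)

# Member-end moments keyed by (near_joint, far_joint); start from fixed-end moments.
M = {("A", "B"): -w * L**2 / 12, ("B", "A"): +w * L**2 / 12,
     ("B", "C"): 0.0,            ("C", "B"): 0.0}

DF = {("B", "A"): 0.5, ("B", "C"): 0.5}    # equal stiffness -> equal distribution factors
CARRY = 0.5                                # carry-over factor for prismatic members

for _ in range(20):                        # iterate until joint B is balanced
    unbalanced = M[("B", "A")] + M[("B", "C")]
    if abs(unbalanced) < 1e-9:
        break
    for far in ("A", "C"):
        dM = -DF[("B", far)] * unbalanced  # distribute the balancing moment at B
        M[("B", far)] += dM
        M[(far, "B")] += CARRY * dM        # carry half of it to the far (fixed) end

print({k: round(v, 3) for k, v in M.items()})   # converged member-end moments (kN*m)
```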
A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines
Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua
2018-01-01
The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxin in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provided a detailed summary of related analytical methods for mycotoxin determination. This review focuses on analytical techniques including sampling, extraction, cleanup, and detection for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation, and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides a good insight regarding the advanced research that has been done and closes with an indication of future demand for the emerging technologies. PMID:29393905
Beyond Antibodies as Binding Partners: The Role of Antibody Mimetics in Bioanalysis.
Yu, Xiaowen; Yang, Yu-Ping; Dikici, Emre; Deo, Sapna K; Daunert, Sylvia
2017-06-12
The emergence of novel binding proteins or antibody mimetics capable of binding to ligand analytes in a manner analogous to that of the antigen-antibody interaction has spurred increased interest in the biotechnology and bioanalytical communities. The goal is to produce antibody mimetics designed to outperform antibodies with regard to binding affinities, cellular and tumor penetration, large-scale production, and temperature and pH stability. The generation of antibody mimetics with tailored characteristics involves the identification of a naturally occurring protein scaffold as a template that binds to a desired ligand. This scaffold is then engineered to create a superior binder by first creating a library that is then subjected to a series of selection steps. Antibody mimetics have been successfully used in the development of binding assays for the detection of analytes in biological samples, as well as in separation methods, cancer therapy, targeted drug delivery, and in vivo imaging. This review describes recent advances in the field of antibody mimetics and their applications in bioanalytical chemistry, specifically in diagnostics and other analytical methods.
Global open data management in metabolomics.
Haug, Kenneth; Salek, Reza M; Steinbeck, Christoph
2017-02-01
Chemical biology employs chemical synthesis, analytical chemistry, and other tools to study biological systems. Recent advances in molecular biology, such as next-generation sequencing (NGS), have led to unprecedented insights into the evolution of organisms' biochemical repertoires. Because of the specific data-sharing culture in genomics, genomes from all kingdoms of life become readily available for further analysis by other researchers. While the genome expresses the potential of an organism to adapt to external influences, the metabolome presents a molecular phenotype that allows us to assess, in a dynamic way, the external influences under which an organism exists and develops. Steady advancements in instrumentation towards high-throughput and high-resolution methods have led to a revival of analytical chemistry methods for the measurement and analysis of the metabolome of organisms. This steady growth of metabolomics as a field is leading to an accumulation of big data across laboratories worldwide similar to that observed in all of the other omics areas. This calls for the development of methods and technologies for handling and dealing with such large datasets, for efficiently distributing them, and for enabling re-analysis. Here we describe the recently emerging ecosystem of global open-access databases and data exchange efforts between them, as well as the foundations and obstacles that enable or prevent the sharing and re-analysis of these data. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Cho, Il-Hoon; Ku, Seockmo
2017-09-30
The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abbott, B. P.; Abbott, R.; Abernathy, M. R.
This article provides supplemental information for a Letter reporting the rate of binary black hole (BBH) coalescences inferred from 16 days of coincident Advanced LIGO observations surrounding the transient gravitational-wave (GW) signal GW150914. In that work we reported various rate estimates whose 90% confidence intervals fell in the range 2–600 Gpc⁻³ yr⁻¹. Here we give details on our method and computations, including information about our search pipelines, a derivation of our likelihood function for the analysis, a description of the astrophysical search trigger distribution expected from merging BBHs, details on our computational methods, a description of the effects and our model for calibration uncertainty, and an analytic method for estimating our detector sensitivity, which is calibrated to our measurements.
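The actual analysis marginalizes over a foreground/background mixture model, but the basic ingredient of any such rate estimate can be sketched with a simple Poisson posterior: given a count of confident events and a sensitive space-time volume ⟨VT⟩, the merger rate R has posterior p(R|n) ∝ R^(n−1/2) e^(−R⟨VT⟩) under a Jeffreys-type prior. The numbers below are hypothetical, not the values from the LIGO analysis.

```python
import numpy as np

# Hypothetical inputs, not the values from the LIGO analysis.
n_events = 1        # confident detections in the observing period
VT = 0.01           # sensitive space-time volume <VT> in Gpc^3 yr

# Posterior p(R | n) ~ R^(n - 1/2) * exp(-R*VT) for a Jeffreys-type prior R^(-1/2).
R = np.linspace(1e-3, 2000.0, 200_000)   # rate grid in Gpc^-3 yr^-1
dR = R[1] - R[0]
post = R ** (n_events - 0.5) * np.exp(-R * VT)
post /= post.sum() * dR                  # normalize numerically

cdf = np.cumsum(post) * dR
lo, hi = R[np.searchsorted(cdf, 0.05)], R[np.searchsorted(cdf, 0.95)]
print(f"90% credible interval: [{lo:.0f}, {hi:.0f}] Gpc^-3 yr^-1")
```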
NASA Astrophysics Data System (ADS)
Iwamoto, Mitsumasa; Taguchi, Dai
2018-03-01
Thermally stimulated current (TSC) measurement is widely used in a variety of research fields, i.e., physics, electronics, electrical engineering, chemistry, ceramics, and biology. TSC is short-circuit current that flows owing to the displacement of charges in samples during heating. TSC measurement is very simple, but TSC curves give very important information on charge behaviors. In the 1970s, TSC measurement contributed greatly to the development of electrical insulation engineering, semiconductor device technology, and so forth. Accordingly, the TSC experimental technique and its analytical method advanced. Over the past decades, many new molecules and advanced functional materials have been discovered and developed. Along with this, TSC measurement has attracted much attention in industries and academic laboratories as a way of characterizing newly discovered materials and devices. In this review, we report the latest research trend in the TSC method for the development of materials and devices in Japan.
Recent advances in the modeling of plasmas with the Particle-In-Cell methods
NASA Astrophysics Data System (ADS)
Vay, Jean-Luc; Lehe, Remi; Vincenti, Henri; Godfrey, Brendan; Lee, Patrick; Haber, Irv
2015-11-01
The Particle-In-Cell (PIC) approach is the method of choice for self-consistent simulations of plasmas from first principles. The fundamentals of the PIC method were established decades ago but improvements or variations are continuously being proposed. We report on several recent advances in PIC related algorithms, including: (a) detailed analysis of the numerical Cherenkov instability and its remediation, (b) analytic pseudo-spectral electromagnetic solvers in Cartesian and cylindrical (with azimuthal modes decomposition) geometries, (c) arbitrary-order finite-difference and generalized pseudo-spectral Maxwell solvers, (d) novel analysis of Maxwell's solvers' stencil variation and truncation, in application to domain decomposition strategies and implementation of Perfectly Matched Layers in high-order and pseudo-spectral solvers. Work supported by US-DOE Contracts DE-AC02-05CH11231 and the US-DOE SciDAC program ComPASS. Used resources of NERSC, supported by US-DOE Contract DE-AC02-05CH11231.
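None of the advances listed above are reproduced here, but to fix ideas about what these field solvers feed into, the following is a sketch of the textbook (non-relativistic) particle-push step at the core of every PIC loop, the Boris rotation, written in Python with hypothetical uniform fields.

```python
import numpy as np

def boris_push(x, v, E, B, q_over_m, dt):
    """One textbook Boris step: half electric kick, magnetic rotation, half kick,
    then a leapfrog position update."""
    v_minus = v + 0.5 * q_over_m * E * dt
    t = 0.5 * q_over_m * B * dt
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)
    v_plus = v_minus + np.cross(v_prime, s)
    v_new = v_plus + 0.5 * q_over_m * E * dt
    x_new = x + v_new * dt
    return x_new, v_new

# Hypothetical uniform fields: with E = 0 and B along z, the particle simply gyrates.
x, v = np.zeros(3), np.array([0.0, 1.0, 0.0])
E, B = np.zeros(3), np.array([0.0, 0.0, 1.0])
for _ in range(100):
    x, v = boris_push(x, v, E, B, q_over_m=1.0, dt=0.1)
print(x, v)
```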
Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M
2004-09-01
The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The experimental design for this validation was based on the National Institute for Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3). Total fluorine results may be used to determine if the individual compounds quantified provide a suitable mass balance of total airborne organofluorochemicals based on known fluorine content. Improvements in precision and/or recovery as well as some additional testing would be needed to meet all NIOSH validation criteria. This study provided valuable information about the accuracy of this method for organofluorochemical exposure assessment.
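As an illustration of the arithmetic behind such a validation (hypothetical numbers, not data from the study): spike recovery and precision are computed from replicate measurements, and a collected mass is converted to an airborne concentration using the sampled air volume, so 0.06 µg on a tube with 60 L of air corresponds to 0.001 mg/m3.

```python
import numpy as np

# Hypothetical spike/recovery replicates for one analyte on OVS tubes (µg per tube).
spiked_ug = 0.60
measured_ug = np.array([0.57, 0.61, 0.59, 0.58, 0.62, 0.60])

mean_recovery = (measured_ug / spiked_ug).mean() * 100.0      # accuracy (%)
rsd = measured_ug.std(ddof=1) / measured_ug.mean() * 100.0    # precision (%RSD)

# Convert the mean collected mass to an airborne concentration for a 60 L air sample.
air_volume_L = 60.0
conc_mg_m3 = measured_ug.mean() / air_volume_L   # µg/L is numerically equal to mg/m^3

print(f"recovery {mean_recovery:.1f}%, precision {rsd:.1f}% RSD, "
      f"concentration {conc_mg_m3:.4f} mg/m^3")
```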
ERIC Educational Resources Information Center
Polito, Vincent A., Jr.
2010-01-01
The objective of this research was to explore the possibilities of identifying knowledge style factors that could be used as central elements of a professional business analyst's (PBA) performance attributes at work for those decision makers that use advanced analytical technologies on decision making tasks. Indicators of knowledge style were…
Sampling and physico-chemical analysis of precipitation: a review.
Krupa, Sagar V
2002-01-01
Wet deposition is one of two processes governing the transfer of beneficial and toxic chemicals from the atmosphere on to surfaces. Since the early 1970s, numerous investigators have sampled and analyzed precipitation for its chemical constituents, in the context of "acidic rain" and related atmospheric processes. Since then, significant advances have been made in our understanding of how to sample rain, cloud, and fog water to preserve their physico-chemical integrity prior to analyses. Since the 1970s, large-scale precipitation sampling networks have been in operation to broadly address regional and multi-regional issues. However, in examining the results from such efforts at a site-specific level, concerns have been raised about the accuracy and precision of the information gathered. There is mounting evidence to demonstrate the instability of precipitation samples (e.g., with N species) that have been subjected to prolonged ambient or field conditions. At the present time, precipitation sampling procedures allow unrefrigerated or refrigerated collection of wet deposition from individual events, sequential fractions within events, in situ continuous chemical analyses in the field, and even sampling of single or individual rain, cloud, and fog droplets. Similarly, analytical procedures for precipitation composition have advanced from time-consuming methods to rapid and simultaneous analyses of major anions and cations, from bulk samples to single droplets. For example, analytical techniques have evolved from colorimetry to ion chromatography to capillary electrophoresis. Overall, these advances allow a better understanding of heterogeneous reactions and atmospheric pollutant scavenging processes by precipitation. In addition, from an environmental perspective, these advances allow better quantification of semi-labile species (e.g., NH4+, whose deposition values are frequently underestimated) or labile species [e.g., S (IV)] in precipitation, and measurements of toxic chemicals such as Hg and PCBs (polychlorinated biphenyls). Similarly, methods now exist for source-receptor studies, using, for example, the characterization of reduced elemental states and/or the use of stable isotopes in precipitation as tracers. Future studies on the relationship between atmospheric deposition and environmental impacts must exploit these advances. This review provides a comprehensive and comparative treatment of state-of-the-art sampling methods for precipitation and its physico-chemical analysis.
Propfan experimental data analysis
NASA Technical Reports Server (NTRS)
Vernon, David F.; Page, Gregory S.; Welge, H. Robert
1984-01-01
A data reduction method, which is consistent with the performance prediction methods used for analysis of new aircraft designs, is defined and compared to the method currently used by NASA, using data obtained from an Ames Research Center 11-foot transonic wind tunnel test. Pressure and flow visualization data from the Ames test for both the powered straight underwing nacelle and an unpowered contoured overwing nacelle installation are used to determine the flow phenomena present for a wing-mounted turboprop installation. The test data are compared to analytic methods, showing the analytic methods to be suitable for design and analysis of new configurations. The data analysis indicated that designs with zero interference drag levels are achievable with proper wing and nacelle tailoring. A new overwing contoured nacelle design and a modification to the wing leading edge extension for the current wind tunnel model design are evaluated. Hardware constraints of the current model parts prevent obtaining any significant performance improvement from modified nacelle contouring. A new aspect ratio wing design for an up-outboard-rotation turboprop installation is defined, and an advanced contoured nacelle is provided.
Wianowska, Dorota; Dawidowicz, Andrzej L
2016-05-01
This paper proposes and shows the analytical capabilities of a new variant of matrix solid phase dispersion (MSPD) with the solventless blending step in the chromatographic analysis of plant volatiles. The obtained results prove that the use of a solvent is redundant as the sorption ability of the octadecyl brush is sufficient for quantitative retention of volatiles from 9 plants differing in their essential oil composition. The extraction efficiency of the proposed simplified MSPD method is equivalent to the efficiency of the commonly applied variant of MSPD with the organic dispersing liquid and pressurized liquid extraction, which is a much more complex, technically advanced and highly efficient technique of plant extraction. The equivalency of these methods is confirmed by the variance analysis. The proposed solventless MSPD method is precise, accurate, and reproducible. The recovery of essential oil components estimated by the MSPD method exceeds 98%, which is satisfactory for analytical purposes. Copyright © 2016 Elsevier B.V. All rights reserved.
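The equivalence claim rests on a variance analysis; a minimal sketch of that kind of test is shown below (Python, with hypothetical essential-oil yields, not the paper's data): a one-way ANOVA compares the between-method variance to the within-method variance, and a large p-value indicates no detectable difference between extraction methods.

```python
from scipy import stats

# Hypothetical essential-oil yields (mg/g) for the same plant material extracted by
# three methods; not data from the paper.
solventless_mspd = [12.1, 11.8, 12.4, 12.0]
classic_mspd     = [12.3, 11.9, 12.2, 12.1]
ple              = [12.0, 12.2, 11.9, 12.3]

f_stat, p_value = stats.f_oneway(solventless_mspd, classic_mspd, ple)
# A large p-value means the between-method variance is indistinguishable from the
# within-method variance, i.e., no evidence the methods give different yields.
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```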
Numerical studies of the Bethe-Salpeter equation for a two-fermion bound state
NASA Astrophysics Data System (ADS)
de Paula, W.; Frederico, T.; Salmè, G.; Viviani, M.
2018-03-01
Some recent advances on the solution of the Bethe-Salpeter equation (BSE) for a two-fermion bound system directly in Minkowski space are presented. The calculations are based on the expression of the Bethe-Salpeter amplitude in terms of the so-called Nakanishi integral representation and on the light-front projection (i.e. the integration over the light-front variable k⁻ = k⁰ − k³). The latter technique allows for the analytically exact treatment of the singularities plaguing the two-fermion BSE in Minkowski space. The good agreement observed between our results and those obtained using other existing numerical methods, based on both Minkowski and Euclidean space techniques, fully corroborates our analytical treatment.
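Schematically (conventions and normalizations differ between papers; this is only the form implied by the text), the light-front projection amounts to integrating the BS amplitude over the minus component of the relative momentum:

```latex
% Light-front projection of the Bethe-Salpeter amplitude \Phi(k,p):
% integrate over the light-front "minus" variable k^- = k^0 - k^3.
\psi\!\left(\xi,\mathbf{k}_\perp\right)\;\propto\;\int_{-\infty}^{+\infty}\frac{\mathrm{d}k^-}{2\pi}\,\Phi(k,p)
```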
Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo
2015-01-01
Modern omics disciplines dealing with food flavor focus the analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.) by means of a comprehensive, integrated treatment of sample constituents, such as physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit fully this potential. This review focuses on advances in comprehensive two-dimensional gas chromatography and analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.
Advancing Clinical Proteomics via Analysis Based on Biological Complexes: A Tale of Five Paradigms.
Goh, Wilson Wen Bin; Wong, Limsoon
2016-09-02
Despite advances in proteomic technologies, idiosyncratic data issues, for example, incomplete coverage and inconsistency, resulting in large data holes, persist. Moreover, because of naïve reliance on statistical testing and its accompanying p values, differential protein signatures identified from such proteomics data have little diagnostic power. Thus, deploying conventional analytics on proteomics data is insufficient for identifying novel drug targets or precise yet sensitive biomarkers. Complex-based analysis is a new analytical approach that has potential to resolve these issues but requires formalization. We categorize complex-based analysis into five method classes or paradigms and propose an even-handed yet comprehensive evaluation rubric based on both simulated and real data. The first four paradigms are well represented in the literature. The fifth and newest paradigm, the network-paired (NP) paradigm, represented by a method called Extremely Small SubNET (ESSNET), dominates in precision-recall and reproducibility, maintains strong performance in small sample sizes, and sensitively detects low-abundance complexes. In contrast, the commonly used over-representation analysis (ORA) and direct-group (DG) test paradigms maintain good overall precision but have severe reproducibility issues. The other two paradigms considered here are the hit-rate and rank-based network analysis paradigms; both of these have good precision-recall and reproducibility, but they do not consider low-abundance complexes. Therefore, given its strong performance, NP/ESSNET may prove to be a useful approach for improving the analytical resolution of proteomics data. Additionally, given its stability, it may also be a powerful new approach toward functional enrichment tests, much like its ORA and DG counterparts.
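Of the five paradigms, the over-representation analysis (ORA) baseline is the easiest to make concrete; the sketch below (Python, with hypothetical counts) tests whether one protein complex is enriched among the proteins called differential using a hypergeometric tail probability. It illustrates the paradigm the review benchmarks against, not the ESSNET method itself.

```python
from scipy.stats import hypergeom

# Minimal over-representation analysis (ORA) for one protein complex, hypothetical counts.
M = 5000    # proteins quantified in the experiment (background)
n = 20      # proteins of the complex present in the background
N = 300     # proteins called differential
k = 6       # differential proteins that belong to the complex

# P(X >= k) under sampling without replacement (hypergeometric upper tail).
p_value = hypergeom.sf(k - 1, M, n, N)
print(f"ORA p-value for this complex: {p_value:.2e}")
```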
Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D
2017-04-15
In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their high use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs) which can affect the hormonal system of humans and wildlife, even at low concentrations. Due to the fact that these pollutants enter into the environment through waters, and it is the most affected compartment, analytical methods which allow the determination of these compounds in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is considered (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and approaches proposed to reduce time- and reagents consumption according to Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, being wastewater and surface water the most investigated. Copyright © 2017 Elsevier B.V. All rights reserved.
John Herschel's Graphical Method
NASA Astrophysics Data System (ADS)
Hankins, Thomas L.
2011-01-01
In 1833 John Herschel published an account of his graphical method for determining the orbits of double stars. He had hoped to be the first to determine such orbits, but Felix Savary in France and Johann Franz Encke in Germany beat him to the punch using analytical methods. Herschel was convinced, however, that his graphical method was much superior to analytical methods, because it used the judgment of the hand and eye to correct the inevitable errors of observation. Line graphs of the kind used by Herschel became common only in the 1830s, so Herschel was introducing a new method. He also found computation fatiguing and devised a "wheeled machine" to help him out. Encke was skeptical of Herschel's methods. He said that he lived for calculation and that the English would be better astronomers if they calculated more. It is difficult to believe that the entire Scientific Revolution of the 17th century took place without graphs and that only a few examples appeared in the 18th century. Herschel promoted the use of graphs, not only in astronomy, but also in the study of meteorology and terrestrial magnetism. Because he was the most prominent scientist in England, Herschel's advocacy greatly advanced graphical methods.
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2017-01-01
There has been an immense amount of visibility of doping issues on the international stage over the past 12 months, with the complexity of doping controls reiterated on various occasions. Hence, analytical test methods that are continuously updated, expanded, and improved to provide specific, sensitive, and comprehensive test results in line with the World Anti-Doping Agency's (WADA) 2016 Prohibited List represent one of several critical cornerstones of doping controls. This enterprise necessitates expediting the (combined) exploitation of newly generated information on novel and/or superior target analytes for sports drug testing assays, drug elimination profiles, alternative test matrices, and recent advances in instrumental developments. This paper is a continuation of the series of annual banned-substance reviews appraising the literature published between October 2015 and September 2016 concerning human sports drug testing in the context of WADA's 2016 Prohibited List. Copyright © 2016 John Wiley & Sons, Ltd.
Science Update: Analytical Chemistry.
ERIC Educational Resources Information Center
Worthy, Ward
1980-01-01
Briefly discusses new instrumentation in the field of analytical chemistry. Advances in liquid chromatography, photoacoustic spectroscopy, the use of lasers, and mass spectrometry are also discussed. (CS)
Advanced flight control system study
NASA Technical Reports Server (NTRS)
Hartmann, G. L.; Wall, J. E., Jr.; Rang, E. R.; Lee, H. P.; Schulte, R. W.; Ng, W. K.
1982-01-01
A fly by wire flight control system architecture designed for high reliability includes spare sensor and computer elements to permit safe dispatch with failed elements, thereby reducing unscheduled maintenance. A methodology capable of demonstrating that the architecture does achieve the predicted performance characteristics consists of a hierarchy of activities ranging from analytical calculations of system reliability and formal methods of software verification to iron bird testing followed by flight evaluation. Interfacing this architecture to the Lockheed S-3A aircraft for flight test is discussed. This testbed vehicle can be expanded to support flight experiments in advanced aerodynamics, electromechanical actuators, secondary power systems, flight management, new displays, and air traffic control concepts.
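The kind of analytical reliability calculation mentioned above can be illustrated with a standard k-out-of-n redundancy model (hypothetical channel counts and failure probabilities, not figures from the study): the probability that enough sensor or computer channels survive the flight is a binomial tail sum.

```python
from math import comb

def k_of_n_reliability(k, n, p):
    """Probability that at least k of n identical, independent channels survive,
    each with survival probability p (standard k-out-of-n redundancy model)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical numbers: a quadruplex sensor set that remains safe with 2 of 4 channels,
# each channel having a 10^-4 probability of failing during the flight.
p_channel = 1.0 - 1e-4
print(f"P(at least 2 of 4 operative) = {k_of_n_reliability(2, 4, p_channel):.12f}")
```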
Advanced superposition methods for high speed turbopump vibration analysis
NASA Technical Reports Server (NTRS)
Nielson, C. E.; Campany, A. D.
1981-01-01
The small, high pressure Mark 48 liquid hydrogen turbopump was analyzed and dynamically tested to determine the cause of high speed vibration at an operating speed of 92,400 rpm. This approaches the design point operating speed of 95,000 rpm. The initial dynamic analysis in the design stage and subsequent further analysis of the rotor only dynamics failed to predict the vibration characteristics found during testing. An advanced procedure for dynamics analysis was used in this investigation. The procedure involves developing accurate dynamic models of the rotor assembly and casing assembly by finite element analysis. The dynamically instrumented assemblies are independently rap tested to verify the analytical models. The verified models are then combined by modal superposition techniques to develop a completed turbopump model where dynamic characteristics are determined. The results of the dynamic testing and analysis obtained are presented and methods of moving the high speed vibration characteristics to speeds above the operating range are recommended. Recommendations for use of these advanced dynamic analysis procedures during initial design phases are given.
Advances in analytical chemistry
NASA Technical Reports Server (NTRS)
Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.
1991-01-01
Implementation of computer programs based on multivariate statistical algorithms makes it possible to obtain reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.
Bicubic uniform B-spline wavefront fitting technology applied in computer-generated holograms
NASA Astrophysics Data System (ADS)
Cao, Hui; Sun, Jun-qiang; Chen, Guo-jie
2006-02-01
This paper presents a bicubic uniform B-spline wavefront-fitting technique for deriving the analytical expression of the object wavefront used in computer-generated holograms (CGHs). In many cases, to decrease the difficulty of optical processing, off-axis CGHs rather than complex aspherical surface elements are used in modern advanced military optical systems. In order to design and fabricate an off-axis CGH, we have to fit the analytical expression for the object wavefront. Zernike polynomials are well suited to fitting the wavefronts of centrosymmetric optical systems, but not of axisymmetrical optical systems. Although a high-degree polynomial fitting method can achieve high fitting precision at all fitting nodes, its greatest shortcoming is that any departure from the fitting nodes results in large fitting errors, the so-called pulsation phenomenon. Furthermore, high-degree polynomial fitting increases the calculation time for coding the computer-generated hologram and solving the basic equation. Based on the basis functions of the cubic uniform B-spline and the character mesh of the bicubic uniform B-spline wavefront, the bicubic uniform B-spline wavefront is described as the product of a series of matrices. Employing standard MATLAB routines, four different analytical expressions for the object wavefront are fitted by bicubic uniform B-splines as well as by high-degree polynomials. Calculation results indicate that, compared with high-degree polynomials, the bicubic uniform B-spline is the more competitive method for fitting the analytical expression of the object wavefront used in an off-axis CGH, owing to its higher fitting precision and C² continuity.
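The "product of a series of matrices" formulation can be sketched as follows (the paper used MATLAB; this is an equivalent Python/NumPy illustration with a hypothetical 4x4 control mesh, not the paper's wavefront data): one bicubic uniform B-spline patch is evaluated as z(u, v) = U M G Mᵀ Vᵀ, where M is the uniform cubic B-spline basis matrix and G the grid of control values.

```python
import numpy as np

# Uniform cubic B-spline basis matrix for the parameter vector U = [u^3, u^2, u, 1].
M = (1.0 / 6.0) * np.array([[-1.0,  3.0, -3.0, 1.0],
                            [ 3.0, -6.0,  3.0, 0.0],
                            [-3.0,  0.0,  3.0, 0.0],
                            [ 1.0,  4.0,  1.0, 0.0]])

def bspline_patch(u, v, G):
    """Evaluate one bicubic uniform B-spline patch z(u, v) = U M G M^T V^T,
    where G is the 4x4 grid of control (character-mesh) values and 0 <= u, v <= 1."""
    U = np.array([u**3, u**2, u, 1.0])
    V = np.array([v**3, v**2, v, 1.0])
    return U @ M @ G @ M.T @ V

# Hypothetical 4x4 mesh of wavefront control values (e.g., in waves).
G = np.array([[0.00, 0.02, 0.05, 0.09],
              [0.01, 0.03, 0.06, 0.10],
              [0.02, 0.05, 0.08, 0.13],
              [0.04, 0.07, 0.11, 0.16]])

print(bspline_patch(0.5, 0.5, G))
```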
ERIC Educational Resources Information Center
Pierce, Karisa M.; Schale, Stephen P.; Le, Trang M.; Larson, Joel C.
2011-01-01
We present a laboratory experiment for an advanced analytical chemistry course where we first focus on the chemometric technique partial least-squares (PLS) analysis applied to one-dimensional (1D) total-ion-current gas chromatography-mass spectrometry (GC-TIC) separations of biodiesel blends. Then, we focus on n-way PLS (n-PLS) applied to…
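As a hedged illustration of the PLS workflow named above (not the authors' data or laboratory procedure), a partial least-squares calibration can be run on synthetic stand-in chromatograms with scikit-learn.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for 1D GC-TIC traces of biodiesel blends: each row is a
# chromatogram whose peak area scales with the blend percentage y.
rng = np.random.default_rng(1)
n_samples, n_points = 30, 500
y = rng.uniform(0, 20, n_samples)              # blend % (illustrative)
base_peak = np.exp(-0.5 * ((np.arange(n_points) - 250) / 10.0) ** 2)
X = np.outer(y, base_peak) + rng.normal(0, 0.05, (n_samples, n_points))

pls = PLSRegression(n_components=3)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()   # cross-validated predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSECV = {rmsecv:.2f} blend %")
```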
Computational toxicity in 21st century safety sciences (China ...
presentation at the Joint Meeting of Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou, China
The Shock and Vibration Digest. Volume 15, Number 3
1983-03-01
High Temperature Gas-Cooled Reactor Core with Block-type Fuel (2nd Report: An Analytical Method of Two-dimensional Vibration of Interacting Columns) ... Computer-aided techniques, Design techniques: a suite of computer programs has been developed which allows advanced fatigue analysis procedures to be ... values with those developed by bearing analysis computer programs were used to formulate an understanding of the mechanisms that induce ball skidding
NASA Astrophysics Data System (ADS)
Roubidoux, J. A.; Jackson, J. E.; Lasseigne, A. N.; Mishra, B.; Olson, D. L.
2010-02-01
This paper correlates nonlinear material properties to nondestructive electronic measurements by using wave analysis techniques (e.g. Perturbation Methods) and incorporating higher-order phenomena. The correlations suggest that nondestructive electronic property measurements and practices can be used to assess thin films, surface layers, and other advanced materials that exhibit modified behaviors based on their space-charged interfacial behavior.
NASA Technical Reports Server (NTRS)
Ambur, Manjula; Schwartz, Katherine G.; Mavris, Dimitri N.
2016-01-01
The fields of machine learning and big data analytics have made significant advances in recent years, which has created an environment where cross-fertilization of methods and collaborations can achieve previously unattainable outcomes. The Comprehensive Digital Transformation (CDT) Machine Learning and Big Data Analytics team planned a workshop at NASA Langley in August 2016 to unite leading experts in the field of machine learning with NASA scientists and engineers. The primary goal for this workshop was to assess the state-of-the-art in this field, introduce these leading experts to the aerospace and science subject matter experts, and develop opportunities for collaboration. The workshop was held over a three-day period with lectures from 15 leading experts followed by significant interactive discussions. This report provides an overview of the 15 invited lectures and a summary of the key discussion topics that arose during both formal and informal discussion sessions. Four key workshop themes were identified after the workshop concluded and are also highlighted in the report. Furthermore, several workshop attendees provided their feedback on how they are already utilizing machine learning algorithms to advance their research, new methods they learned about during the workshop, and collaboration opportunities they identified during the workshop.
From pixel to voxel: a deeper view of biological tissue by 3D mass spectral imaging
Ye, Hui; Greer, Tyler; Li, Lingjun
2011-01-01
Three-dimensional mass spectral imaging (3D MSI) is an exciting field that grants the ability to study a broad mass range of molecular species ranging from small molecules to large proteins by creating lateral and vertical distribution maps of select compounds. Although the general premise behind 3D MSI is simple, factors such as choice of ionization method, sample handling, software considerations and many others must be taken into account for the successful design of a 3D MSI experiment. This review provides a brief overview of ionization methods, sample preparation, software types and technological advancements driving 3D MSI research of a wide range of low- to high-mass analytes. Future perspectives in this field are also provided, concluding that this powerful analytical tool promises ever-growing applications in the biomedical field as its development continues. PMID:21320052
An analytical study for the design of advanced rotor airfoils
NASA Technical Reports Server (NTRS)
Kemp, L. D.
1973-01-01
A theoretical study has been conducted to design and evaluate two airfoils for helicopter rotors. The best basic shape, designed with a transonic hodograph design method, was modified to meet subsonic criteria. One airfoil had an additional constraint for low pitching-moment at the transonic design point. Airfoil characteristics were predicted. Results of a comparative analysis of helicopter performance indicate that the new airfoils will produce reduced rotor power requirements compared to the NACA 0012. The hodograph design method, written in CDC Algol, is listed and described.
The physics of proton therapy.
Newhauser, Wayne D; Zhang, Rui
2015-04-21
The physics of proton therapy has advanced considerably since it was proposed in 1946. Today analytical equations and numerical simulation methods are available to predict and characterize many aspects of proton therapy. This article reviews the basic aspects of the physics of proton therapy, including proton interaction mechanisms, proton transport calculations, the determination of dose from therapeutic and stray radiations, and shielding design. The article discusses underlying processes as well as selected practical experimental and theoretical methods. We conclude by briefly speculating on possible future areas of research of relevance to the physics of proton therapy.
Let's Go Off the Grid: Subsurface Flow Modeling With Analytic Elements
NASA Astrophysics Data System (ADS)
Bakker, M.
2017-12-01
Subsurface flow modeling with analytic elements has the major advantage that no grid or time stepping is needed. Analytic element formulations exist for steady state and transient flow in layered aquifers and unsaturated flow in the vadose zone. Analytic element models are vector-based and consist of points, lines and curves that represent specific features in the subsurface. Recent advances allow for the simulation of partially penetrating wells and multi-aquifer wells, including skin effect and wellbore storage, horizontal wells of poly-line shape including skin effect, sharp changes in subsurface properties, and surface water features with leaky beds. Input files for analytic element models are simple, short and readable, and can easily be generated from, for example, GIS databases. Future plans include the incorporation of analytic elements into parts of grid-based models where additional detail is needed. This presentation will give an overview of advanced flow features that can be modeled, many of which are implemented in free and open-source software.
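A minimal sketch of the analytic element idea, superposing the complex potentials of a uniform flow and a single well; the parameter values and the confined-aquifer head conversion are illustrative assumptions, not part of the presentation itself.

```python
import numpy as np

# Complex discharge potential for steady 2D flow: uniform flow plus one extraction well.
# This is the simplest pair of analytic elements; real codes superpose many such functions.
def potential(z, Q0=1.0, Qw=100.0, zw=0 + 0j):
    return -Q0 * z + Qw / (2 * np.pi) * np.log(z - zw)

# Head in a confined aquifer of transmissivity T follows from the real part of the
# potential (Phi = T * h), relative to an arbitrary datum.
T = 100.0                                      # m^2/d, illustrative transmissivity
z = np.array([50 + 0j, 100 + 50j, 200 + 0j])   # observation points (x + iy, metres)
head = potential(z).real / T
print(head)
```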
Afonso-Olivares, Cristina; Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Santana-Rodríguez, José Juan
2016-12-01
Today, the presence of contaminants in the environment is a topic of interest for society in general and for the scientific community in particular. A very large amount of different chemical substances reaches the environment after passing through wastewater treatment plants without being eliminated. This is due to the inefficiency of conventional removal processes and the lack of government regulations. The list of compounds entering treatment plants is gradually becoming longer and more varied because most of these compounds come from pharmaceuticals, hormones or personal care products, which are increasingly used by modern society. As a result of this increase in compound variety, to address these emerging pollutants, the development of new and more efficient removal technologies is needed. Different advanced oxidation processes (AOPs), especially photochemical AOPs, have been proposed as supplements to traditional treatments for the elimination of pollutants, showing significant advantages over the use of conventional methods alone. This work aims to review the analytical methodologies employed for the analysis of pharmaceutical compounds from wastewater in studies in which advanced oxidation processes are applied. Due to the low concentrations of these substances in wastewater, mass spectrometry detectors are usually chosen to meet the low detection limits and identification power required. Specifically, time-of-flight detectors are required to analyse the by-products.
Poore, Joshua C; Forlines, Clifton L; Miller, Sarah M; Regan, John R; Irvine, John M
2014-12-01
The decision sciences are increasingly challenged to advance methods for modeling analysts, accounting for both analytic strengths and weaknesses, to improve inferences taken from increasingly large and complex sources of data. We examine whether psychometric measures-personality, cognitive style, motivated cognition-predict analytic performance and whether psychometric measures are competitive with aptitude measures (i.e., SAT scores) as analyst sample selection criteria. A heterogeneous, national sample of 927 participants completed an extensive battery of psychometric measures and aptitude tests and was asked 129 geopolitical forecasting questions over the course of 1 year. Factor analysis reveals four dimensions among psychometric measures; dimensions characterized by differently motivated "top-down" cognitive styles predicted distinctive patterns in aptitude and forecasting behavior. These dimensions were not better predictors of forecasting accuracy than aptitude measures. However, multiple regression and mediation analysis reveals that these dimensions influenced forecasting accuracy primarily through bias in forecasting confidence. We also found that these facets were competitive with aptitude tests as forecast sampling criteria designed to mitigate biases in forecasting confidence while maximizing accuracy. These findings inform the understanding of individual difference dimensions at the intersection of analytic aptitude and demonstrate that they wield predictive power in applied, analytic domains.
Nonlinear analysis for dual-frequency concurrent energy harvesting
NASA Astrophysics Data System (ADS)
Yan, Zhimiao; Lei, Hong; Tan, Ting; Sun, Weipeng; Huang, Wenhu
2018-05-01
The dual-frequency responses of the hybrid energy harvester undergoing base excitation and galloping were analyzed numerically. In this work, an approximate dual-frequency analytical method is proposed for the nonlinear analysis of such a system. To obtain the approximate analytical solutions of the fully coupled distributed-parameter model, the forcing interactions are first neglected. Then, the electromechanically decoupled governing equation is developed using the equivalent structure method. The hybrid mechanical response is finally separated into self-excited and forced responses to derive the analytical solutions, which are confirmed by numerical simulations of the fully coupled model. The forced response has a great impact on the self-excited response. The boundary of the Hopf bifurcation is determined analytically by the onset wind speed for galloping, which increases linearly with the electrical damping. A quenching phenomenon appears when increasing base excitation suppresses the galloping. The theoretical quenching boundary depends on the forced mode velocity. The quenching region increases with the base acceleration and electrical damping, but decreases with the wind speed. Unlike the base-excitation-alone case, the presence of the aerodynamic force protects the hybrid energy harvester at resonance from damage caused by excessively large displacements. From the viewpoint of harvested power, the hybrid system surpasses both the base-excitation-alone and galloping-alone systems. This study advances our knowledge of the intrinsic nonlinear dynamics of dual-frequency energy harvesting systems by taking advantage of the analytical solutions.
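The paper's distributed-parameter model is not reproduced here. As a hedged sketch of the dual-frequency situation it analyzes, a lumped single-mode harvester driven simultaneously by base excitation and a quasi-steady galloping force can be integrated numerically; all parameter values below are invented for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lumped-parameter stand-in for the hybrid harvester: one mechanical mode coupled to a
# resistive electrical load, forced by base acceleration and a cubic galloping force.
m, c, k = 0.01, 0.01, 40.0            # kg, N*s/m, N/m
theta, Cp, R = 1e-4, 1e-7, 1e5        # N/V coupling, F capacitance, ohm load
rho, U, D, L = 1.225, 5.0, 0.02, 0.1  # air density, wind speed, bluff-body width, length
a1, a3 = 2.3, 18.0                    # quasi-steady galloping coefficients
ab, w_b = 2.0, np.sqrt(k / m)         # base acceleration amplitude and (resonant) frequency

def rhs(t, s):
    x, xdot, V = s
    F_gal = 0.5 * rho * U**2 * D * L * (a1 * (xdot / U) - a3 * (xdot / U) ** 3)
    F_base = -m * ab * np.sin(w_b * t)
    xddot = (F_gal + F_base - c * xdot - k * x - theta * V) / m
    Vdot = (theta * xdot - V / R) / Cp
    return [xdot, xddot, Vdot]

sol = solve_ivp(rhs, (0, 20), [0, 0, 0], max_step=1e-3)
V = sol.y[2][sol.t > 10]              # discard the start-up transient
print(f"RMS load voltage ~ {np.sqrt(np.mean(V**2)):.2f} V")
```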
Solid-Phase Extraction (SPE): Principles and Applications in Food Samples.
Ötles, Semih; Kartal, Canan
2016-01-01
Solid-Phase Extraction (SPE) is a sample preparation method practised in numerous application fields due to its many advantages over traditional methods. SPE was invented as an alternative to liquid/liquid extraction and eliminates several of its disadvantages, such as the use of large amounts of solvent, extended operation times and procedure steps, potential sources of error, and high cost. Moreover, SPE can optionally be combined with other analytical methods and sample preparation techniques. The SPE technique is a useful tool for many purposes thanks to its versatility; isolation, concentration, purification and clean-up are the main approaches in its practice. Foods represent a complicated matrix and can take different physical forms, such as solid, viscous or liquid. The sample preparation step therefore plays a particularly important role in the determination of specific compounds in foods. SPE offers many opportunities not only for the analysis of a large diversity of food samples but also for optimization and advances. This review aims to provide a comprehensive overview of the basic principles of SPE and its applications for many analytes in food matrices.
Analytical methods for determination of mycotoxins: An update (2009-2014).
Turner, Nicholas W; Bramhmbhatt, Heli; Szabo-Vezse, Monika; Poma, Alessandro; Coker, Raymond; Piletsky, Sergey A
2015-12-11
Mycotoxins are a problematic and toxic group of small organic molecules that are produced as secondary metabolites by several fungal species that colonise crops. They lead to contamination at both the field and postharvest stages of food production, with a considerable range of foodstuffs affected, from coffee and cereals to dried fruit and spices. Given the wide-ranging structural diversity of mycotoxins, the severe toxic effects caused by these molecules and their high chemical stability, the requirement for robust and effective detection methods is clear. This paper builds on our previous review and summarises the most recent advances in this field, in the years 2009-2014 inclusive. This review summarises traditional methods such as chromatographic and immunochemical techniques, as well as newer approaches such as biosensors, and optical techniques which are becoming more prevalent. A section on sampling and sample treatment has been prepared to highlight the importance of this step in the analytical methods. We close with a look at emerging technologies that will bring effective and rapid analysis out of the laboratory and into the field. Copyright © 2015 Elsevier B.V. All rights reserved.
Hill, Ryan C; Oman, Trent J; Shan, Guomin; Schafer, Barry; Eble, Julie; Chen, Cynthia
2015-08-26
Currently, traditional immunochemistry technologies such as enzyme-linked immunosorbent assays (ELISA) are the predominant analytical tool used to measure levels of recombinant proteins expressed in genetically engineered (GE) plants. Recent advances in agricultural biotechnology have created a need to develop methods capable of selectively detecting and quantifying multiple proteins in complex matrices because of increasing numbers of transgenic proteins being coexpressed or "stacked" to achieve tolerance to multiple herbicides or to provide multiple modes of action for insect control. A multiplexing analytical method utilizing liquid chromatography with tandem mass spectrometry (LC-MS/MS) has been developed and validated to quantify three herbicide-tolerant proteins in soybean tissues: aryloxyalkanoate dioxygenase (AAD-12), 5-enol-pyruvylshikimate-3-phosphate synthase (2mEPSPS), and phosphinothricin acetyltransferase (PAT). Results from the validation showed high recovery and precision over multiple analysts and laboratories. Results from this method were comparable to those obtained with ELISA with respect to protein quantitation, and the described method was demonstrated to be suitable for multiplex quantitation of transgenic proteins in GE crops.
NASA Technical Reports Server (NTRS)
Deepak, Adarsh; Wang, Pi-Huan
1985-01-01
The research program is documented for developing space and ground-based remote sensing techniques performed during the period from December 15, 1977 to March 15, 1985. The program involved the application of sophisticated radiative transfer codes and inversion methods to various advanced remote sensing concepts for determining atmospheric constituents, particularly aerosols. It covers detailed discussions of the solar aureole technique for monitoring columnar aerosol size distribution, and the multispectral limb scattered radiance and limb attenuated radiance (solar occultation) techniques, as well as the upwelling scattered solar radiance method for determining the aerosol and gaseous characteristics. In addition, analytical models of aerosol size distribution and simulation studies of the limb solar aureole radiance technique and the variability of ozone at high altitudes during satellite sunrise/sunset events are also described in detail.
FANS Simulation of Propeller Wash at Navy Harbors (ESTEP Project ER-201031)
2016-08-01
In this study, the Finite-Analytic Navier–Stokes (FANS) code was employed to solve the Reynolds-Averaged Navier–Stokes equations in conjunction with advanced near-wall turbulence models. For site-specific harbor configurations, it is desirable to perform the propeller wash study by solving the Navier–Stokes equations directly.
Application of the first collision source method to CSNS target station shielding calculation
NASA Astrophysics Data System (ADS)
Zheng, Ying; Zhang, Bin; Chen, Meng-Teng; Zhang, Liang; Cao, Bo; Chen, Yi-Xue; Yin, Wen; Liang, Tian-Jiao
2016-04-01
Ray effects are an inherent problem of the discrete ordinates method. RAY3D, a functional module of ARES, which is a discrete ordinates code system, employs a semi-analytic first collision source method to mitigate ray effects. This method decomposes the flux into uncollided and collided components, and then calculates them with an analytical method and discrete ordinates method respectively. In this article, RAY3D is validated by the Kobayashi benchmarks and applied to the neutron beamline shielding problem of China Spallation Neutron Source (CSNS) target station. The numerical results of the Kobayashi benchmarks indicate that the solutions of DONTRAN3D with RAY3D agree well with the Monte Carlo solutions. The dose rate at the end of the neutron beamline is less than 10.83 μSv/h in the CSNS target station neutron beamline shutter model. RAY3D can effectively mitigate the ray effects and obtain relatively reasonable results. Supported by Major National S&T Specific Program of Large Advanced Pressurized Water Reactor Nuclear Power Plant (2011ZX06004-007), National Natural Science Foundation of China (11505059, 11575061), and the Fundamental Research Funds for the Central Universities (13QN34).
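A minimal one-group sketch of the semi-analytic first collision source idea, assuming a point source in a homogeneous medium; this is the textbook form of the decomposition, not the RAY3D/ARES implementation, and all cross-section values are illustrative.

```python
import numpy as np

# First collision source idea: compute the uncollided flux from the point source
# analytically, then hand the resulting first-collision source to the SN sweep.
S = 1.0e10          # source strength, n/s
sigma_t = 0.2       # total macroscopic cross section, 1/cm
sigma_s = 0.15      # scattering cross section, 1/cm

def uncollided_flux(r):
    """Analytic point-source uncollided flux: phi_u = S * exp(-sigma_t * r) / (4*pi*r^2)."""
    return S * np.exp(-sigma_t * r) / (4.0 * np.pi * r**2)

r = np.linspace(10.0, 200.0, 5)                     # cm
q_first_collision = sigma_s * uncollided_flux(r)    # isotropic part of the collided-source term
for ri, qi in zip(r, q_first_collision):
    print(f"r = {ri:6.1f} cm   q_fc = {qi:.3e} n/cm^3/s")
```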
DOT National Transportation Integrated Search
2016-12-25
The key objectives of this study were to: 1. Develop advanced analytical techniques that make use of a dynamically configurable connected vehicle message protocol to predict traffic flow regimes in near-real time in a virtual environment and examine ...
An Analytical Assessment of NASA's N+1 Subsonic Fixed Wing Project Noise Goal
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.; Envia, Edmane; Burley, Casey L.
2009-01-01
The Subsonic Fixed Wing Project of NASA's Fundamental Aeronautics Program has adopted a noise reduction goal for new, subsonic, single-aisle, civil aircraft expected to replace current 737 and A320 airplanes. These so-called 'N+1' aircraft - designated in NASA vernacular as such since they will follow the current, in-service, 'N' airplanes - are intended to achieve certification noise levels 32 cumulative EPNdB below current Stage 4 noise limits. A notional, N+1, single-aisle, twinjet transport with ultrahigh bypass ratio turbofan engines is analyzed in this study using NASA software and methods. Several advanced noise-reduction technologies are analytically applied to the propulsion system and airframe. Certification noise levels are predicted and compared with the NASA goal.
HOST turbine heat transfer program summary
NASA Technical Reports Server (NTRS)
Gladden, Herbert J.; Simoneau, Robert J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding with the remainder going to analytical efforts. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.
Tassi, Marco; De Vos, Jelle; Chatterjee, Sneha; Sobott, Frank; Bones, Jonathan; Eeltink, Sebastiaan
2018-01-01
The characterization of biotherapeutics represents a major analytical challenge. This review discusses the current state-of-the-art in analytical technologies to profile biopharma products under native conditions, i.e., the protein three dimensional conformation is maintained during liquid chromatographic analysis. Native liquid-chromatographic modes that are discussed include aqueous size-exclusion chromatography, hydrophobic interaction chromatography, and ion-exchange chromatography. Infusion conditions and the possibilities and limitations to hyphenate native liquid chromatography to mass spectrometry are discussed. Furthermore, the applicability of native liquid-chromatography methods and intact mass spectrometry analysis for the characterization of monoclonal antibodies and antibody-drug conjugates is discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
Training the next generation analyst using red cell analytics
NASA Astrophysics Data System (ADS)
Graham, Meghan N.; Graham, Jacob L.
2016-05-01
We have seen significant change in the study and practice of human reasoning in recent years from both a theoretical and methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA lies not always in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the area of security and risk and intelligence training.
The structure of biodiversity – insights from molecular phylogeography
Hewitt, Godfrey M
2004-01-01
DNA techniques, analytical methods and palaeoclimatic studies are greatly advancing our knowledge of the global distribution of genetic diversity, and how it evolved. Such phylogeographic studies are reviewed from Arctic, Temperate and Tropical regions, seeking commonalities of cause in the resulting genetic patterns. The genetic diversity is differently patterned within and among regions and biomes, and is related to their histories of climatic changes. This has major implications for conservation science. PMID:15679920
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Leung, Martin S. K.
1995-01-01
The objective of this research effort was to develop a real-time guidance approach for launch vehicles ascent to orbit injection. Various analytical approaches combined with a variety of model order and model complexity reduction have been investigated. Singular perturbation methods were first attempted and found to be unsatisfactory. The second approach based on regular perturbation analysis was subsequently investigated. It also fails because the aerodynamic effects (ignored in the zero order solution) are too large to be treated as perturbations. Therefore, the study demonstrates that perturbation methods alone (both regular and singular perturbations) are inadequate for use in developing a guidance algorithm for the atmospheric flight phase of a launch vehicle. During a second phase of the research effort, a hybrid analytic/numerical approach was developed and evaluated. The approach combines the numerical methods of collocation and the analytical method of regular perturbations. The concept of choosing intelligent interpolating functions is also introduced. Regular perturbation analysis allows the use of a crude representation for the collocation solution, and intelligent interpolating functions further reduce the number of elements without sacrificing the approximation accuracy. As a result, the combined method forms a powerful tool for solving real-time optimal control problems. Details of the approach are illustrated in a fourth order nonlinear example. The hybrid approach is then applied to the launch vehicle problem. The collocation solution is derived from a bilinear tangent steering law, and results in a guidance solution for the entire flight regime that includes both atmospheric and exoatmospheric flight phases.
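As a small illustration of the bilinear tangent steering law mentioned above, the profile can be evaluated directly; the coefficients here are arbitrary placeholders, not values produced by the collocation/perturbation solution.

```python
import numpy as np

# Bilinear tangent steering law: tan(theta) = (a + b*t) / (c + d*t).
# In the guidance scheme these coefficients would come from the collocation solution;
# the numbers below simply trace a plausible pitch-over profile.
a, b, c, d = 1.2, -0.004, 1.0, 0.002

def pitch_angle(t):
    return np.arctan2(a + b * t, c + d * t)

for t in (0.0, 100.0, 200.0, 300.0):
    print(f"t = {t:5.1f} s   pitch = {np.degrees(pitch_angle(t)):6.2f} deg")
```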
Klotz, Christian; Šoba, Barbara; Skvarč, Miha; Gabriël, Sarah; Robertson, Lucy J
2017-11-09
Food-borne parasites (FBPs) are a neglected topic in food safety, partly due to a lack of awareness of their importance for public health, especially as symptoms tend not to develop immediately after exposure. In addition, methodological difficulties with both diagnosis in infected patients and detection in food matrices result in under-detection and therefore the potential for underestimation of their burden on our societies. This, in consequence, leads to lower prioritization for basic research, e.g. for the development of new and more advanced detection methods for different food matrices and diagnostic samples, and thus a vicious circle of neglect and lack of progress is propagated. The COST Action FA1408, A European Network for Foodborne Parasites (Euro-FBP), aims to combat the impact of FBPs on public health by facilitating multidisciplinary cooperation and partnership between groups of researchers and between researchers and stakeholders. The COST Action TD1302, the European Network for cysticercosis/taeniosis, CYSTINET, has a specific focus on Taenia solium and T. saginata, two neglected FBPs, and aims to advance knowledge and understanding of these zoonotic disease complexes via collaborations in a multidisciplinary scientific network. This report summarizes the results of a meeting within the Euro-FBP consortium entitled 'Analytical methods for food-borne parasites in human and veterinary diagnostics and in food matrices' and of the joint Euro-FBP and CYSTINET meeting.
Advanced Elemental and Isotopic Characterization of Atmospheric Aerosols
NASA Astrophysics Data System (ADS)
Shafer, M. M.; Schauer, J. J.; Park, J.
2001-12-01
Recent sampling and analytical developments advanced by the project team enable the detailed elemental and isotopic fingerprinting of extremely small masses of atmospheric aerosols. Historically, this type of characterization was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. However, with the introduction of 3rd and 4th generation ICP-MS instrumentation and the application of state-of-the- art "clean-techniques", quantitative analysis of over 40 elements in sub-milligram samples can be realized. When coupled with an efficient and validated solubilization method, ICP-MS approaches provide distinct advantages in comparison with traditional methods; greatly enhanced detection limits, improved accuracy, and isotope resolution capability, to name a few. Importantly, the ICP-MS approach can readily be integrated with techniques which enable phase differentiation and chemical speciation information to be acquired. For example, selective chemical leaching can provide data on the association of metals with major phase-components, and oxidation state of certain metals. Critical information on metal-ligand stability can be obtained when electrochemical techniques, such as adsorptive cathodic stripping voltammetry (ACSV), are applied to these same extracts. Our research group is applying these techniques in a broad range of research projects to better understand the sources and distribution of trace metals in particulate matter in the atmosphere. Using examples from our research, including recent Pb and Sr isotope ratio work on Asian aerosols, we will illustrate the capabilities and applications of these new methods.
Glycation of antibodies: Modification, methods and potential effects on biological functions.
Wei, Bingchuan; Berning, Kelsey; Quan, Cynthia; Zhang, Yonghua Taylor
Glycation is an important protein modification that could potentially affect bioactivity and molecular stability, and glycation of therapeutic proteins such as monoclonal antibodies should be well characterized. Glycated proteins can undergo further degradation into advanced glycation end (AGE) products. Here, we review the root cause of glycation during the manufacturing, storage and in vivo circulation of therapeutic antibodies, and the current analytical methods used to detect and characterize glycation and AGEs, including boronate affinity chromatography, charge-based methods, liquid chromatography-mass spectrometry and colorimetric assays. The biological effects of therapeutic protein glycation and AGEs, which range from no effect to loss of activity, are also discussed.
NASA Technical Reports Server (NTRS)
Griswold, M.; Roskam, J.
1980-01-01
An analytical method is presented for predicting lateral-directional aerodynamic characteristics of light twin engine propeller-driven airplanes. This method is applied to the Advanced Technology Light Twin Engine airplane. The calculated characteristics are correlated against full-scale wind tunnel data. The method predicts the sideslip derivatives fairly well, although angle of attack variations are not well predicted. Spoiler performance was predicted somewhat high but was still reasonable. The rudder derivatives were not well predicted, in particular the effect of angle of attack. The predicted dynamic derivatives could not be correlated due to lack of experimental data.
Intelligent model-based diagnostics for vehicle health management
NASA Astrophysics Data System (ADS)
Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki
2003-08-01
The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
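A toy sketch of the graph-based dependency-model side of such a hybrid scheme: a fault-to-test dependency matrix (D-matrix) is compared against observed test outcomes under a single-fault assumption. Fault names, tests and matrix entries are hypothetical, not taken from the paper.

```python
import numpy as np

# Rows are candidate faults, columns are tests; a 1 means the test is expected to
# fail when that fault is present.
faults = ["sensor_bias", "actuator_stuck", "harness_open"]
D = np.array([[1, 0, 1, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]])

observed = np.array([1, 0, 1, 0])   # pass/fail outcomes of the four tests (1 = fail)

# Under a single-fault, deterministic-test assumption, a fault is consistent when its
# expected test signature matches the observed one exactly.
consistent = [f for f, row in zip(faults, D)
              if np.array_equal(row, observed)]
print("candidate faults:", consistent or ["no single fault matches"])
```

In the hybrid approach described above, candidates surviving this graph-based screen would then be checked against the quantitative (analytical) models.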
NASA Technical Reports Server (NTRS)
Koh, Severino L. (Editor); Speziale, Charles G. (Editor)
1989-01-01
Various papers on recent advances in engineering science are presented. Some individual topics addressed include: advances in adaptive methods in computational fluid mechanics, mixtures of two medicomorphic materials, computer tests of rubber elasticity, shear bands in isotropic micropolar elastic materials, nonlinear surface wave and resonator effects in magnetostrictive crystals, simulation of electrically enhanced fibrous filtration, plasticity theory of granular materials, dynamics of viscoelastic media with internal oscillators, postcritical behavior of a cantilever bar, boundary value problems in nonlocal elasticity, stability of flexible structures with random parameters, electromagnetic tornadoes in earth's ionosphere and magnetosphere, helicity fluctuations and the energy cascade in turbulence, mechanics of interfacial zones in bonded materials, propagation of a normal shock in a varying area duct, analytical mechanics of fracture and fatigue.
Sun, Xiaohong; Ouyang, Yue; Chu, Jinfang; Yan, Jing; Yu, Yan; Li, Xiaoqiang; Yang, Jun; Yan, Cunyu
2014-04-18
A sensitive and reliable in-advance stable isotope labeling strategy was developed for simultaneous relative quantification of 8 acidic plant hormones in sub-milligram amounts of plant material. Bromocholine bromide (BETA) and its deuterated counterpart D9-BETA were used to derivatize control and sample extracts individually in advance; the extracts were then combined and subjected to solid-phase extraction (SPE) purification followed by UPLC-MS/MS analysis. Relative quantification of target compounds was obtained by calculating the peak area ratios of BETA/D9-BETA labeled plant hormones. The in-advance stable isotope labeling strategy enabled internal-standard-based relative quantification of multiple classes of plant hormones without requiring an internal standard for every analyte, with sensitivity enhanced by 1-3 orders of magnitude. Meanwhile, the in-advance labeling contributes to higher sample throughput and greater reliability. The method was successfully applied to determine 8 plant hormones in 0.8 mg DW (dry weight) of seedlings and 4 plant hormones from a single seed of Arabidopsis thaliana. The results show the potential of the method for relative quantification of multiple plant hormones in tiny plant tissues or organs, which will advance knowledge of the crosstalk mechanisms of plant hormones. Copyright © 2014 Elsevier B.V. All rights reserved.
Ray, Chris; Saracco, James; Jenkins, Kurt J.; Huff, Mark; Happe, Patricia J.; Ransom, Jason I.
2017-01-01
During 2015-2016, we completed development of a new analytical framework for landbird population monitoring data from the National Park Service (NPS) North Coast and Cascades Inventory and Monitoring Network (NCCN). This new tool for analysis combines several recent advances in modeling population status and trends using point-count data and is designed to supersede the approach previously slated for analysis of trends in the NCCN and other networks, including the Sierra Nevada Network (SIEN). Advances supported by the new model-based approach include 1) the use of combined data on distance and time of detection to estimate detection probability without assuming perfect detection at zero distance, 2) seamless accommodation of variation in sampling effort and missing data, and 3) straightforward estimation of the effects of downscaled climate and other local habitat characteristics on spatial and temporal trends in landbird populations. No changes in the current field protocol are necessary to facilitate the new analyses. We applied several versions of the new model to data from each of 39 species recorded in the three mountain parks of the NCCN, estimating trends and climate relationships for each species during 2005-2014. Our methods and results are also reported in a manuscript in revision for the journal Ecosphere (hereafter, Ray et al.). Here, we summarize the methods and results outlined in depth by Ray et al., discuss benefits of the new analytical framework, and provide recommendations for its application to synthetic analyses of long-term data from the NCCN and SIEN. All code necessary for implementing the new analyses is provided within the Appendices to this report, in the form of fully annotated scripts written in the open-access programming languages R and JAGS.
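As a rough sketch of the idea of combining distance and time-of-detection information without assuming perfect detection at zero distance, the detection probability for a bird at a given distance can be written as below. Parameter values are illustrative and this is not the model fitted in the report.

```python
import numpy as np

# Per-interval detection probability falls off with distance (half-normal) and is
# scaled by a rate at distance zero, so p(0) < 1 is allowed; repeated time intervals
# within a count give additional chances of detection.
sigma = 75.0      # half-normal distance scale (m), illustrative
p0 = 0.4          # per-interval detection probability at distance zero, illustrative
J = 3             # number of equal time intervals in the point count

def detection_prob(d):
    p_interval = p0 * np.exp(-d**2 / (2 * sigma**2))
    return 1.0 - (1.0 - p_interval) ** J   # detected in at least one interval

for d in (0, 50, 100, 150):
    print(f"d = {d:3d} m   p(detect) = {detection_prob(d):.2f}")
```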
Russell, Shane R; Claridge, Shelley A
2016-04-01
Because noncovalent interface functionalization is frequently required in graphene-based devices, biomolecular self-assembly has begun to emerge as a route for controlling substrate electronic structure or binding specificity for soluble analytes. The remarkable diversity of structures that arise in biological self-assembly hints at the possibility of equally diverse and well-controlled surface chemistry at graphene interfaces. However, predicting and analyzing adsorbed monolayer structures at such interfaces raises substantial experimental and theoretical challenges. In contrast with the relatively well-developed monolayer chemistry and characterization methods applied at coinage metal surfaces, monolayers on graphene are both less robust and more structurally complex, levying more stringent requirements on characterization techniques. Theory presents opportunities to understand early binding events that lay the groundwork for full monolayer structure. However, predicting interactions between complex biomolecules, solvent, and substrate is necessitating a suite of new force fields and algorithms to assess likely binding configurations, solvent effects, and modulations to substrate electronic properties. This article briefly discusses emerging analytical and theoretical methods used to develop a rigorous chemical understanding of the self-assembly of peptide-graphene interfaces and prospects for future advances in the field.
Evolution of accelerometer methods for physical activity research.
Troiano, Richard P; McClain, James J; Brychta, Robert J; Chen, Kong Y
2014-07-01
The technology and application of current accelerometer-based devices in physical activity (PA) research allow the capture and storage or transmission of large volumes of raw acceleration signal data. These rich data not only provide opportunities to improve PA characterisation, but also bring logistical and analytic challenges. We discuss how researchers and developers from multiple disciplines are responding to the analytic challenges and how advances in data storage, transmission and big data computing will minimise logistical challenges. These new approaches also bring the need for several paradigm shifts for PA researchers, including a shift from count-based approaches and regression calibrations for PA energy expenditure (PAEE) estimation to activity characterisation and EE estimation based on features extracted from raw acceleration signals. Furthermore, a collaborative approach towards analytic methods is proposed to facilitate PA research, which requires a shift away from multiple independent calibration studies. Finally, we make the case for a distinction between PA represented by accelerometer-based devices and PA assessed by self-report. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
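As a hedged sketch of the shift toward features extracted from raw acceleration signals, a window of synthetic triaxial data can be summarized by a few simple time- and frequency-domain features; the feature set is illustrative, not a validated PAEE model.

```python
import numpy as np

def window_features(acc, fs=80):
    """Summarize a raw (n, 3) accelerometer window (in g) sampled at fs Hz."""
    vm = np.linalg.norm(acc, axis=1)              # vector magnitude
    enmo = np.maximum(vm - 1.0, 0.0)              # Euclidean norm minus one, clipped at zero
    spectrum = np.abs(np.fft.rfft(vm - vm.mean()))
    return {
        "mean_enmo": enmo.mean(),
        "sd_vm": vm.std(),
        "dominant_freq_hz": spectrum.argmax() * fs / len(vm),
    }

# Synthetic "walking" window: a ~2 Hz gait component plus noise, gravity on the z axis.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 80)
walk = np.column_stack([0.3 * np.sin(2 * np.pi * 2 * t),
                        0.1 * rng.normal(size=t.size),
                        1.0 + 0.2 * np.sin(2 * np.pi * 2 * t)])
print(window_features(walk))
```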
NASA Astrophysics Data System (ADS)
Monfared, Vahid
2016-12-01
An analytically based model is presented for analyzing the behavior of plastic deformations in reinforced materials using circular (trigonometric) functions. The analytical method is proposed to predict the creep behavior of fibrous composites from basic and constitutive equations under a tensile axial stress. A new insight of the work is the prediction of several important behaviors of the creeping matrix, and in the present model this prediction is simpler than with available methods. The principal creep strain rate is an important quantity in the design of creeping fibrous composites, and analysis of this parameter in reinforced materials is necessary for failure, fracture, and fatigue studies of short-fiber composites under creep. Shuttles, spacecraft, turbine blades and discs, and nozzle guide vanes are commonly subjected to creep effects. Predicting creep behavior is also significant for the design of advanced optoelectronic and photonic composites with optical fibers. As a result, uniform behavior with a constant gradient is observed in the principal creep strain rate, and creep rupture may occur at the fiber end. Finally, good agreement is found between the analytical and FEM results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Block, R.C.; Feiner, F.
This document, Volume 3, includes papers presented at the 7th International Meeting on Nuclear Reactor Thermal-Hydraulics (NURETH-7), September 10-15, 1995, at Saratoga Springs, N.Y. The following subjects are discussed: progress in analytical and experimental work on the fundamentals of nuclear thermal-hydraulics, the development of advanced mathematical and numerical methods, and the application of advances in the field to the development of novel reactor concepts. Combined issues of thermal-hydraulics and reactor/power-plant safety, core neutronics, and/or radiation are also addressed. Selected abstracts have been indexed separately for inclusion in the Energy Science and Technology Database.
Zhang, Chunhui; Ning, Ke; Zhang, Wenwen; Guo, Yuanjie; Chen, Jun; Liang, Chen
2013-04-01
Increased attention is currently being directed towards the potential negative effects of antibiotics and other PPCPs discharged into the aquatic environment via municipal WWTP secondary effluents. A number of analytical methods, such as high performance liquid chromatography technologies, including a high performance liquid chromatography-fluorescence method (HPLC-FLD), high performance liquid chromatography-UV detection method (HPLC-UV) and high performance liquid chromatography-mass spectrometry method (HPLC-MS), have been suggested as determination technologies for antibiotic residues in water. In this study, we implement a HPLC-MS/MS combined method to detect and analyze antibiotics in WWTP secondary effluent and apply a horizontal subsurface flow constructed wetland (CW) as an advanced wastewater treatment for removing antibiotics in the WWTP secondary effluent. The results show that there were 2 macrolides, 2 quinolones and 5 sulfas in WWTP secondary effluent among all the 22 antibiotics considered. After the CW advanced treatment, the concentration removal efficiencies and removal loads of 9 antibiotics were 53-100% and 0.004-0.7307 μg m(-2) per day, respectively.
NASA Technical Reports Server (NTRS)
Corrigan, J. C.; Cronkhite, J. D.; Dompka, R. V.; Perry, K. S.; Rogers, J. P.; Sadler, S. G.
1989-01-01
Under a research program designated Design Analysis Methods for VIBrationS (DAMVIBS), existing analytical methods are used for calculating coupled rotor-fuselage vibrations of the AH-1G helicopter for correlation with flight test data from an AH-1G Operational Load Survey (OLS) test program. The analytical representation of the fuselage structure is based on a NASTRAN finite element model (FEM), which has been developed, extensively documented, and correlated with ground vibration test. One procedure that was used for predicting coupled rotor-fuselage vibrations using the advanced Rotorcraft Flight Simulation Program C81 and NASTRAN is summarized. Detailed descriptions of the analytical formulation of rotor dynamics equations, fuselage dynamic equations, coupling between the rotor and fuselage, and solutions to the total system of equations in C81 are included. Analytical predictions of hub shears for main rotor harmonics 2p, 4p, and 6p generated by C81 are used in conjunction with 2p OLS measured control loads and a 2p lateral tail rotor gearbox force, representing downwash impingement on the vertical fin, to excite the NASTRAN model. NASTRAN is then used to correlate with measured OLS flight test vibrations. Blade load comparisons predicted by C81 showed good agreement. In general, the fuselage vibration correlations show good agreement between analysis and test in vibration response through 15 to 20 Hz.
Detecting Gear Tooth Fatigue Cracks in Advance of Complete Fracture
NASA Technical Reports Server (NTRS)
Zakrajsek, James J.; Lewicki, David G.
1996-01-01
Results of using vibration-based methods to detect gear tooth fatigue cracks are presented. An experimental test rig was used to fail a number of spur gear specimens through bending fatigue. The gear tooth fatigue crack in each test was initiated through a small notch in the fillet area of a tooth on the gear. The primary purpose of these tests was to verify analytical predictions of fatigue crack propagation direction and rate as a function of gear rim thickness. The vibration signal from a total of three tests was monitored and recorded for gear fault detection research. The damage consisted of complete rim fracture on the two thin rim gears and single tooth fracture on the standard full rim test gear. Vibration-based fault detection methods were applied to the vibration signal both on-line and after the tests were completed. The objectives of this effort were to identify methods capable of detecting the fatigue crack and to determine how far in advance of total failure positive detection was given. Results show that the fault detection methods failed to respond to the fatigue crack prior to complete rim fracture in the thin rim gear tests. In the standard full rim gear test all of the methods responded to the fatigue crack in advance of tooth fracture; however, only three of the methods responded to the fatigue crack in the early stages of crack propagation.
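The specific detection methods evaluated in these tests are not reproduced here. As a generic illustration of vibration-based condition indicators of the kind used in such studies, kurtosis and crest factor can be computed on a synthetic signal with and without periodic impacts.

```python
import numpy as np
from scipy.stats import kurtosis

def condition_indicators(signal):
    """Simple time-domain condition indicators for a vibration record."""
    rms = np.sqrt(np.mean(signal**2))
    return {
        "kurtosis": kurtosis(signal, fisher=False),   # ~3 for a healthy, Gaussian-like signal
        "crest_factor": np.max(np.abs(signal)) / rms,
    }

rng = np.random.default_rng(2)
healthy = rng.normal(0, 1.0, 20_000)
damaged = healthy.copy()
damaged[::2000] += 8.0        # synthetic periodic impacts, e.g. from a cracked tooth

print("healthy:", condition_indicators(healthy))
print("damaged:", condition_indicators(damaged))
```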
NASA Astrophysics Data System (ADS)
Huang, Y.; Longo, W. M.; Zheng, Y.; Richter, N.; Dillon, J. T.; Theroux, S.; D'Andrea, W. J.; Toney, J. L.; Wang, L.; Amaral-Zettler, L. A.
2017-12-01
Alkenones are mature, well-established paleo-sea surface temperature proxies that have been widely applied for more than three decades. However, recent advances across a broad range of alkenone-related topics at Brown University are inviting new paleoclimate and paleo-environmental applications for these classic biomarkers. In this presentation, I will summarize our progress in the following areas: (1) Discovery of a freshwater alkenone-producing haptophyte species and structural elucidation of novel alkenone structures unique to the species, performing in-situ temperature calibrations, and classifying alkenone-producing haptophytes into three groups based on molecular ecological approaches (with the new species belonging to Group I Isochrysidales); (2) A global survey of Group I haptophyte distributions and environmental conditions favoring the presence of this alga, as well as examples of using Group I alkenones for paleotemperature reconstructions; (3) New gas chromatographic columns that allow unprecedented resolution of alkenones and alkenoates and associated structural isomers, and development of a new suite of paleotemperature and paleoenvironmental proxies; (4) A new liquid chromatographic separation technique that allows efficient cleanup of alkenones and alkenoates (without the need for saponification) for subsequent coelution-free gas chromatographic analysis; (5) Novel structural features revealed by new analytical methods that now allow a comprehensive re-assessment of taxonomic features of various haptophyte species, with principal component analysis capable of fully resolving species biomarker distributions; (6) Development of UK37 double prime (UK37'') for Group II haptophytes (e.g., those occurring in saline lakes and estuaries), that differs from the traditional unsaturation indices used for SST reconstructions; (7) New assessment of how mixed inputs from different alkenone groups may affect SST reconstructions in marginal ocean environments and possible approaches to solving the problem; and, (8) Optimization of analytical methods for determining the double-bond positions of alkenones and alkenoates, and subsequent discovery of new structural features of short-chain alkenones and the proposal of new biosynthetic pathways.
Liu, Min; Zhang, Chunsun; Liu, Feifei
2015-09-03
In this work, we first introduce the fabrication of microfluidic cloth-based analytical devices (μCADs) using a wax screen-printing approach that is suitable for simple, inexpensive, rapid, low-energy-consumption and high-throughput preparation of cloth-based analytical devices. We have carried out a detailed study on the wax screen-printing of μCADs and have obtained some interesting results. Firstly, an analytical model is established for the spreading of molten wax in cloth. Secondly, a new wax screen-printing process has been proposed for fabricating μCADs, where the melting of wax into the cloth is much faster (∼5 s) and the heating temperature is much lower (75 °C). Thirdly, the experimental results show that the patterning effects of the proposed wax screen-printing method depend to a certain extent on types of screens, wax melting temperatures and melting time. Under optimized conditions, the minimum printing width of hydrophobic wax barrier and hydrophilic channel is 100 μm and 1.9 mm, respectively. Importantly, the developed analytical model is also well validated by these experiments. Fourthly, the μCADs fabricated by the presented wax screen-printing method are used to perform a proof-of-concept assay of glucose or protein in artificial urine with rapid high-throughput detection taking place on a 48-chamber cloth-based device and being performed by a visual readout. Overall, the developed cloth-based wax screen-printing and arrayed μCADs should provide a new research direction in the development of advanced sensor arrays for detection of a series of analytes relevant to many diverse applications. Copyright © 2015 Elsevier B.V. All rights reserved.
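The paper's fitted spreading model is not reproduced here. As a hedged sketch of the usual starting point for molten-wax wicking in a fibrous substrate, a Washburn-type relation can be evaluated with rough, assumed parameter values.

```python
import numpy as np

# Washburn-type capillary wicking: L(t) = sqrt(gamma * r * cos(theta) * t / (2 * eta)).
# All values below are rough, illustrative numbers, not the paper's fitted parameters.
gamma = 0.03      # surface tension of molten wax, N/m
eta = 0.005       # viscosity of molten wax, Pa*s
r = 5e-6          # effective capillary (pore) radius of the cloth, m
theta = np.deg2rad(30)   # contact angle

def spread_length(t):
    return np.sqrt(gamma * r * np.cos(theta) * t / (2 * eta))

for t in (1.0, 3.0, 5.0):
    print(f"t = {t:.0f} s   spread ~ {spread_length(t) * 1e3:.2f} mm")
```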
Investigation of Acoustical Shielding by a Wedge-Shaped Airframe
NASA Technical Reports Server (NTRS)
Gerhold, Carl H.; Clark, Lorenzo R.; Dunn, Mark H.; Tweed, John
2006-01-01
Experiments on a scale model of an advanced unconventional subsonic transport concept, the Blended Wing Body (BWB), have demonstrated significant shielding of inlet-radiated noise. A computational model of the shielding mechanism has been developed using a combination of boundary integral equation method (BIEM) and equivalent source method (ESM). The computation models the incident sound from a point source in a nacelle and determines the scattered sound field. In this way the sound fields with and without the airfoil can be estimated for comparison to experiment. An experimental test bed using a simplified wedge-shape airfoil and a broadband point noise source in a simulated nacelle has been developed for the purposes of verifying the analytical model and also to study the effect of engine nacelle placement on shielding. The experimental study is conducted in the Anechoic Noise Research Facility at NASA Langley Research Center. The analytic and experimental results are compared at 6300 and 8000 Hz. These frequencies correspond to approximately 150 Hz on the full scale aircraft. Comparison between the experimental and analytic results is quite good, not only for the noise scattering by the airframe, but also for the total sound pressure in the far field. Many of the details of the sound field that the analytic model predicts are seen or indicated in the experiment, within the spatial resolution limitations of the experiment. Changing nacelle location produces comparable changes in noise shielding contours evaluated analytically and experimentally. Future work in the project will be enhancement of the analytic model to extend the analysis to higher frequencies corresponding to the blade passage frequency of the high bypass ratio ducted fan engines that are expected to power the BWB.
Investigation of Acoustical Shielding by a Wedge-Shaped Airframe
NASA Technical Reports Server (NTRS)
Gerhold, Carl H.; Clark, Lorenzo R.; Dunn, Mark H.; Tweed, John
2004-01-01
Experiments on a scale model of an advanced unconventional subsonic transport concept, the Blended Wing Body (BWB), have demonstrated significant shielding of inlet-radiated noise. A computational model of the shielding mechanism has been developed using a combination of the boundary integral equation method (BIEM) and the equivalent source method (ESM). The computation models the incident sound from a point source in a nacelle and determines the scattered sound field. In this way the sound fields with and without the airfoil can be estimated for comparison to experiment. An experimental test bed using a simplified wedge-shaped airfoil and a broadband point noise source in a simulated nacelle has been developed for the purposes of verifying the analytical model and studying the effect of engine nacelle placement on shielding. The experimental study is conducted in the Anechoic Noise Research Facility at NASA Langley Research Center. The analytic and experimental results are compared at 6300 and 8000 Hz. These frequencies correspond to approximately 150 Hz on the full-scale aircraft. Agreement between the experimental and analytic results is quite good, not only for the noise scattering by the airframe, but also for the total sound pressure in the far field. Many of the details of the sound field that the analytic model predicts are seen or indicated in the experiment, within the spatial resolution limitations of the experiment. Changing the nacelle location produces comparable changes in the noise shielding contours evaluated analytically and experimentally. Future work in the project will be the enhancement of the analytic model to extend the analysis to higher frequencies corresponding to the blade passage frequency of the high-bypass-ratio ducted fan engines that are expected to power the BWB.
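The two records above quote model-scale test frequencies of 6300 and 8000 Hz corresponding to roughly 150 Hz at full scale. The short sketch below shows the implied geometric frequency scaling and the associated wavelengths, assuming frequency scales inversely with model size at a fixed speed of sound; the scale factors printed are inferred from the quoted numbers, not stated in the abstracts.
    # Hedged sketch: geometric frequency scaling between the scale-model test and the
    # full-scale BWB aircraft. Assumes frequency scales inversely with model scale at
    # equal speed of sound; the scale factors are inferred from the approximate
    # 6300/8000 Hz -> ~150 Hz correspondence quoted above, not stated in the abstracts.
    c = 343.0                      # speed of sound, m/s (room temperature)
    f_model = [6300.0, 8000.0]     # model-scale test frequencies, Hz
    f_full = 150.0                 # corresponding approximate full-scale frequency, Hz

    for f in f_model:
        scale = f / f_full         # implied (approximate) geometric scale factor
        print(f"model {f:.0f} Hz -> implied scale ~1:{scale:.0f}, "
              f"model wavelength {c / f * 100:.1f} cm, "
              f"full-scale wavelength {c / f_full:.2f} m")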
Della Pelle, Flavio; Compagnone, Dario
2018-02-04
Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value for their well-known health benefits, for their technological role, and also for marketing. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenols (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are reviewed. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid, beyond the NMs themselves, to the success of the applications in real samples. In particular, the discussion has been focused on methods/devices presenting, in the opinion of the authors, clear advancement in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis and are mature enough to be integrated into the rapid assessment of food quality in the lab or directly in the field.
2018-01-01
Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value for their well-known health benefits, for their technological role, and also for marketing. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenols (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are reviewed. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid, beyond the NMs themselves, to the success of the applications in real samples. In particular, the discussion has been focused on methods/devices presenting, in the opinion of the authors, clear advancement in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis and are mature enough to be integrated into the rapid assessment of food quality in the lab or directly in the field. PMID:29401719
Earth Science Data Analytics: Preparing for Extracting Knowledge from Information
NASA Technical Reports Server (NTRS)
Kempler, Steven; Barbieri, Lindsay
2016-01-01
Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to much of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data Analytics takes a practitioner's approach to applying expertise and skills to solve issues and gain subject knowledge. Data Science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that combine an understanding of the science domain with data preparation, reduction, and analysis techniques from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from the point of view of advancing science: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, so that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impact of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA Cluster is to facilitate the preparation of individuals to understand and apply the needed skills to Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the science eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.
Suzuki, Shigeru
2014-01-01
The techniques and measurement methods developed in the Environmental Survey and Monitoring of Chemicals by Japan’s Ministry of the Environment, as well as the large amount of knowledge archived in the survey, have led to the advancement of environmental analysis. Recently, technologies such as non-target liquid chromatography/high-resolution mass spectrometry and liquid chromatography with microbore columns have further developed the field. Here, the general strategy of a method developed for the liquid chromatography/mass spectrometry (LC/MS) analysis of environmental chemicals is briefly presented. Also presented is a non-target analysis for the identification of environmental pollutants using a provisional fragment database and “MsMsFilter,” an elemental composition elucidation tool. This analytical method is shown to be highly effective in the identification of a model chemical, the pesticide Bendiocarb. Our improved micro-liquid chromatography injection system showed substantially enhanced sensitivity to perfluoroalkyl substances, with peak areas 32–71 times larger than those observed in conventional LC/MS. PMID:26819891
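The record above mentions elemental composition elucidation for non-target identification. As a generic, hedged illustration of that step (not the MsMsFilter implementation), the sketch below brute-forces CHNO compositions whose monoisotopic mass matches a measured neutral mass within a ppm tolerance, using Bendiocarb's nominal composition as the test case.
    # Hedged sketch: brute-force elemental-composition candidates for a measured
    # accurate (monoisotopic, neutral) mass within a ppm tolerance. This is a generic
    # illustration of composition elucidation, not the MsMsFilter implementation.
    from itertools import product

    MONO = {"C": 12.0, "H": 1.0078250319, "N": 14.0030740052, "O": 15.9949146221}

    def candidates(target_mass, tol_ppm=5.0, max_atoms=(20, 40, 5, 10)):
        cmax, hmax, nmax, omax = max_atoms
        tol = target_mass * tol_ppm / 1e6
        hits = []
        for c, h, n, o in product(range(cmax + 1), range(hmax + 1),
                                  range(nmax + 1), range(omax + 1)):
            m = c * MONO["C"] + h * MONO["H"] + n * MONO["N"] + o * MONO["O"]
            if abs(m - target_mass) <= tol:
                hits.append((f"C{c}H{h}N{n}O{o}", m))
        return sorted(hits, key=lambda x: abs(x[1] - target_mass))

    # Bendiocarb (C11H13NO4) has a monoisotopic neutral mass of ~223.0845 Da
    for formula, mass in candidates(223.0845)[:5]:
        print(formula, f"{mass:.4f}")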
NASA Technical Reports Server (NTRS)
Sawyer, W. C.; Allen, J. M.; Hernandez, G.; Dillenius, M. F. E.; Hemsch, M. J.
1982-01-01
This paper presents a survey of engineering computational methods and experimental programs used for estimating the aerodynamic characteristics of missile configurations. Emphasis is placed on those methods which are suitable for preliminary design of conventional and advanced concepts. An analysis of the technical approaches of the various methods is made in order to assess their suitability to estimate longitudinal and/or lateral-directional characteristics for different classes of missile configurations. Some comparisons between the predicted characteristics and experimental data are presented. These comparisons are made for a large variation in flow conditions and model attitude parameters. The paper also presents known experimental research programs developed for the specific purpose of validating analytical methods and extending the capability of data-base programs.
Diagnostics for insufficiencies of posterior calculations in Bayesian signal inference.
Dorn, Sebastian; Oppermann, Niels; Ensslin, Torsten A
2013-11-01
We present an error-diagnostic validation method for posterior distributions in Bayesian signal inference, an advancement of a previous work. It transfers deviations from the correct posterior into characteristic deviations from a uniform distribution of a quantity constructed for this purpose. We show that this method is able to reveal and discriminate several kinds of numerical and approximation errors, as well as their impact on the posterior distribution. For this we present four typical analytical examples of posteriors with incorrect variance, skewness, position of the maximum, or normalization. We show further how this test can be applied to multidimensional signals.
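The diagnostic described above maps posterior errors onto deviations from a uniform distribution. The hedged sketch below illustrates the same idea with a probability-integral-transform check in a conjugate Gaussian toy model (not the authors' exact test quantity): with the correct noise variance the transformed values are uniform, while an underestimated variance is flagged by a Kolmogorov-Smirnov test.
    # Hedged sketch: probability-integral-transform check of posterior calculations in
    # a Gaussian toy model, in the spirit of the diagnostic described above (the
    # authors' exact test quantity is not reproduced). If the posterior is correct,
    # its CDF evaluated at the true signal over many realisations is uniform on [0, 1].
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_runs, prior_var, noise_var = 2000, 1.0, 0.5

    def posterior_cdf_at_truth(var_used):
        """Simulate s ~ N(0, prior_var), d = s + noise, return P(posterior <= s)."""
        s = rng.normal(0.0, np.sqrt(prior_var), n_runs)
        d = s + rng.normal(0.0, np.sqrt(noise_var), n_runs)
        post_var = 1.0 / (1.0 / prior_var + 1.0 / var_used)   # conjugate update
        post_mean = post_var * d / var_used
        return stats.norm.cdf(s, loc=post_mean, scale=np.sqrt(post_var))

    for label, var_used in [("correct noise variance", noise_var),
                            ("underestimated noise variance", 0.2 * noise_var)]:
        u = posterior_cdf_at_truth(var_used)
        ks = stats.kstest(u, "uniform")
        print(f"{label}: KS statistic {ks.statistic:.3f}, p-value {ks.pvalue:.3g}")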
Akan, Ozgur B.
2018-01-01
We consider a microfluidic molecular communication (MC) system, where the concentration-encoded molecular messages are transported via fluid flow-induced convection and diffusion, and detected by a surface-based MC receiver with ligand receptors placed at the bottom of the microfluidic channel. The overall system is a convection-diffusion-reaction system that can only be solved by numerical methods, e.g., finite element analysis (FEA). However, analytical models are key for the information and communication technology (ICT), as they enable an optimisation framework to develop advanced communication techniques, such as optimum detection methods and reliable transmission schemes. In this direction, we develop an analytical model to approximate the expected time course of bound receptor concentration, i.e., the received signal used to decode the transmitted messages. The model obviates the need for computationally expensive numerical methods by capturing the nonlinearities caused by laminar flow resulting in parabolic velocity profile, and finite number of ligand receptors leading to receiver saturation. The model also captures the effects of reactive surface depletion layer resulting from the mass transport limitations and moving reaction boundary originated from the passage of finite-duration molecular concentration pulse over the receiver surface. Based on the proposed model, we derive closed form analytical expressions that approximate the received pulse width, pulse delay and pulse amplitude, which can be used to optimize the system from an ICT perspective. We evaluate the accuracy of the proposed model by comparing model-based analytical results to the numerical results obtained by solving the exact system model with COMSOL Multiphysics. PMID:29415019
Kuscu, Murat; Akan, Ozgur B
2018-01-01
We consider a microfluidic molecular communication (MC) system, where the concentration-encoded molecular messages are transported via fluid flow-induced convection and diffusion, and detected by a surface-based MC receiver with ligand receptors placed at the bottom of the microfluidic channel. The overall system is a convection-diffusion-reaction system that can only be solved by numerical methods, e.g., finite element analysis (FEA). However, analytical models are key for the information and communication technology (ICT), as they enable an optimisation framework to develop advanced communication techniques, such as optimum detection methods and reliable transmission schemes. In this direction, we develop an analytical model to approximate the expected time course of bound receptor concentration, i.e., the received signal used to decode the transmitted messages. The model obviates the need for computationally expensive numerical methods by capturing the nonlinearities caused by laminar flow resulting in parabolic velocity profile, and finite number of ligand receptors leading to receiver saturation. The model also captures the effects of reactive surface depletion layer resulting from the mass transport limitations and moving reaction boundary originated from the passage of finite-duration molecular concentration pulse over the receiver surface. Based on the proposed model, we derive closed form analytical expressions that approximate the received pulse width, pulse delay and pulse amplitude, which can be used to optimize the system from an ICT perspective. We evaluate the accuracy of the proposed model by comparing model-based analytical results to the numerical results obtained by solving the exact system model with COMSOL Multiphysics.
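The two records above describe an analytical approximation of the bound-receptor time course at a saturable MC receiver. As a hedged stand-in for that model, the sketch below integrates a simple Langmuir binding ODE driven by a passing Gaussian concentration pulse and reports the received pulse amplitude, delay and width; the rate constants and pulse parameters are illustrative assumptions, not the authors' values.
    # Hedged sketch: saturable ligand-receptor binding at an MC receiver surface driven
    # by a passing concentration pulse. A simple Langmuir ODE with a Gaussian ligand
    # pulse stands in for the full convection-diffusion-reaction model; rate constants
    # and pulse parameters are illustrative assumptions.
    import numpy as np

    k_on, k_off, r_total = 1e-3, 0.5, 1.0   # binding/unbinding rates, receptor density
    t = np.linspace(0.0, 60.0, 6001)
    dt = t[1] - t[0]
    ligand = 500.0 * np.exp(-0.5 * ((t - 20.0) / 4.0) ** 2)   # passing pulse (a.u.)

    bound = np.zeros_like(t)                 # occupied-receptor level over time
    for i in range(1, t.size):
        db = k_on * ligand[i - 1] * (r_total - bound[i - 1]) - k_off * bound[i - 1]
        bound[i] = bound[i - 1] + dt * db    # explicit Euler step

    amplitude = bound.max()
    delay = t[np.argmax(bound)]
    above_half = t[bound >= 0.5 * amplitude]
    width = above_half[-1] - above_half[0]
    print(f"received pulse: amplitude {amplitude:.3f}, "
          f"delay {delay:.1f} s, width {width:.1f} s")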
NREL’s Advanced Analytics Research for Energy-Efficient Buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kutscher, Chuck; Livingood, Bill; Wilson, Eric
At NREL, we believe in building better buildings: high-performance buildings that can do more and be smarter than ever before. Forty percent of the total energy consumption in the United States comes from buildings. Working together, we can dramatically shrink that number. But first, it starts with the research: our observations, experiments, modeling, analysis, and more. NREL’s advanced analytics research has already been shown to reduce energy use, save money, and help stabilize the grid.
Advanced Energy Storage Devices: Basic Principles, Analytical Methods, and Rational Materials Design
Liu, Jilei; Wang, Jin; Xu, Chaohe; Li, Chunzhong; Lin, Jianyi
2017-01-01
Abstract Tremendous efforts have been dedicated to the development of high-performance energy storage devices with nanoscale design and hybrid approaches. The boundary between electrochemical capacitors and batteries becomes less distinctive. The same material may display capacitive or battery-like behavior depending on the electrode design and the charge storage guest ions. Therefore, the underlying mechanisms and the electrochemical processes occurring upon charge storage may be confusing for researchers who are new to the field as well as for some of the chemists and materials scientists already in the field. This review provides the fundamentals of the similarities and differences between electrochemical capacitors and batteries from a kinetic and materials point of view. Basic techniques and analysis methods to distinguish capacitive from battery-like behavior are discussed. Furthermore, guidelines for material selection, the state-of-the-art materials, and the electrode design rules for advanced electrodes are proposed. PMID:29375964
Liu, Jilei; Wang, Jin; Xu, Chaohe; Jiang, Hao; Li, Chunzhong; Zhang, Lili; Lin, Jianyi; Shen, Ze Xiang
2018-01-01
Tremendous efforts have been dedicated to the development of high-performance energy storage devices with nanoscale design and hybrid approaches. The boundary between electrochemical capacitors and batteries becomes less distinctive. The same material may display capacitive or battery-like behavior depending on the electrode design and the charge storage guest ions. Therefore, the underlying mechanisms and the electrochemical processes occurring upon charge storage may be confusing for researchers who are new to the field as well as for some of the chemists and materials scientists already in the field. This review provides the fundamentals of the similarities and differences between electrochemical capacitors and batteries from a kinetic and materials point of view. Basic techniques and analysis methods to distinguish capacitive from battery-like behavior are discussed. Furthermore, guidelines for material selection, the state-of-the-art materials, and the electrode design rules for advanced electrodes are proposed.
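One standard analysis method of the kind the review discusses for distinguishing capacitive from battery-like charge storage is b-value analysis, which fits the power law i = a*v^b between peak current and scan rate from cyclic voltammetry (b near 1 indicates capacitive behavior, b near 0.5 indicates diffusion control). The sketch below performs the log-log fit on synthetic, illustrative data, not data from the review.
    # Hedged sketch: b-value analysis from cyclic voltammetry, a standard method for
    # distinguishing capacitive (b ~ 1) from diffusion-controlled, battery-like
    # (b ~ 0.5) charge storage. Scan rates and peak currents below are synthetic.
    import numpy as np

    scan_rate_mV_s = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])       # v
    peak_current_mA = np.array([0.11, 0.16, 0.27, 0.39, 0.57, 0.94])   # i (synthetic)

    # Fit log10(i) = log10(a) + b * log10(v)
    b, log_a = np.polyfit(np.log10(scan_rate_mV_s), np.log10(peak_current_mA), 1)
    print(f"b-value = {b:.2f} "
          f"({'mostly capacitive' if b > 0.8 else 'significant diffusion control'})")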
Sweetow, Robert W; Sabes, Jennifer Henderson
2007-06-01
The level of interest in aural rehabilitation has increased recently, both in clinical use and in research presentations and publications. Advances in aural rehabilitation have seen previous techniques such as speech tracking and analytic auditory training reappear in computerized forms. These new delivery methods allow for a consistent, cost-effective, and convenient training program. Several computerized aural rehabilitation programs for hearing aid wearers and cochlear implant recipients have recently been developed and were reported on at the 2006 State of the Science Conference of the Rehabilitation Engineering Research Center on Hearing Enhancement at Gallaudet University. This article reviews these programs and outlines the similarities and differences in their design. Another promising area of aural rehabilitation research is the use of pharmaceuticals in the rehabilitation process. The results from a study of the effect of d-amphetamine in conjunction with intensive aural rehabilitation with cochlear implant patients are also described.
Validation of material point method for soil fluidisation analysis
NASA Astrophysics Data System (ADS)
Bolognin, Marco; Martinelli, Mario; Bakker, Klaas J.; Jonkman, Sebastiaan N.
2017-06-01
The main aim of this paper is to describe and analyse the modelling of vertical column tests that undergo fluidisation through the application of a hydraulic gradient. A recent advancement of the material point method (MPM) allows studying both stationary and non-stationary fluid flow while interacting with the solid phase. The fluidisation initiation and post-fluidisation processes of the soil will be investigated with an advanced MPM formulation (Double Point) in which the behavior of the solid and the liquid phase is evaluated separately, assigning to each of them a set of material points (MPs). The results of these simulations are compared to analytic solutions and measurements from laboratory experiments. This work is used as a benchmark test for the MPM double point formulation in the Anura3D software and to verify the feasibility of the software for possible future engineering applications.
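A common analytic benchmark for fluidisation onset in a vertical column test is the classical critical hydraulic gradient; the hedged sketch below computes it for illustrative soil parameters, not values from the paper.
    # Hedged sketch: classical critical hydraulic gradient for the onset of
    # fluidisation (heave) in a vertical soil column, a typical analytic benchmark
    # for such tests. Soil parameters are illustrative assumptions.
    def critical_gradient(porosity, specific_gravity):
        """i_c = (gamma_sat - gamma_w) / gamma_w = (G_s - 1) * (1 - n)."""
        return (specific_gravity - 1.0) * (1.0 - porosity)

    n, Gs = 0.4, 2.65
    i_c = critical_gradient(n, Gs)
    print(f"critical hydraulic gradient i_c = {i_c:.2f}")
    print("fluidisation expected once the applied upward gradient exceeds i_c")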
Sweetow, Robert W.; Sabes, Jennifer Henderson
2007-01-01
The level of interest in aural rehabilitation has increased recently, both in clinical use and in research presentations and publications. Advances in aural rehabilitation have seen previous techniques such as speech tracking and analytic auditory training reappear in computerized forms. These new delivery methods allow for a consistent, cost-effective, and convenient training program. Several computerized aural rehabilitation programs for hearing aid wearers and cochlear implant recipients have recently been developed and were reported on at the 2006 State of the Science Conference of the Rehabilitation Engineering Research Center on Hearing Enhancement at Gallaudet University. This article reviews these programs and outlines the similarities and differences in their design. Another promising area of aural rehabilitation research is the use of pharmaceuticals in the rehabilitation process. The results from a study of the effect of d-amphetamine in conjunction with intensive aural rehabilitation with cochlear implant patients are also described. PMID:17494876
IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics
2016-01-01
Background We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. Objective To report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. Methods The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Results Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix. Conclusions IBMWA is a new alternative for data analytics software that automates descriptive, predictive, and visual analytics. This program is very user-friendly but requires data preprocessing, statistical conceptual understanding, and domain expertise. PMID:27729304
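The abstract notes that sensitivity, specificity, odds ratios, confidence intervals and a confusion matrix are not part of the IBMWA output; these are straightforward to compute downstream, as the sketch below shows for a hypothetical 2x2 table.
    # Hedged sketch: the classification metrics the abstract notes are absent from the
    # IBMWA output (sensitivity, specificity, odds ratio with 95% CI), computed from a
    # hypothetical 2x2 confusion matrix.
    import math

    tp, fn, fp, tn = 80, 20, 30, 170   # hypothetical counts

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    odds_ratio = (tp * tn) / (fp * fn)
    se_log_or = math.sqrt(1 / tp + 1 / fn + 1 / fp + 1 / tn)   # Woolf standard error
    ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

    print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}")
    print(f"odds ratio {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")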
A possible explanation for foreland thrust propagation
NASA Astrophysics Data System (ADS)
Panian, John; Pilant, Walter
1990-06-01
A common feature of thin-skinned fold and thrust belts is the sequential nature of foreland-directed thrust systems. As a rule, younger thrusts develop in the footwalls of older thrusts, the whole sequence propagating towards the foreland in the transport direction. As each new younger thrust develops, the entire sequence is thickened, particularly in the frontal region. The compressive toe region can be likened to an advancing wave: as the mountainous thrust belt advances, the down-surface-slope stresses drive thrusts ahead of it, much like a wave carrying a surfboard rider. In an attempt to investigate the stresses in the frontal regions of thrust sheets, a numerical method has been devised from the algorithm given by McTigue and Mei [1981]. The algorithm yields a quickly computed approximate solution of the gravity- and tectonic-induced stresses of a two-dimensional homogeneous elastic half-space with an arbitrarily shaped free surface of small slope. A comparison of the numerical method with analytical examples shows excellent agreement. The numerical method was devised because it greatly facilitates the stress calculations and frees one from using the restrictive, simple topographic profiles necessary to obtain an analytical solution. The numerical version of the McTigue and Mei algorithm shows that there is a region of increased maximum resolved shear stress, τ, directly beneath the toe of the overthrust sheet. Utilizing the Mohr-Coulomb failure criterion, predicted fault lines are computed. It is shown that they flatten and become horizontal in some portions of this zone of increased τ. Thrust sheets are known to advance upon weak decollement zones. If there is a coincidence of increased τ, a weak rock layer, and a potential fault line parallel to this weak layer, we have in place all the elements necessary to initiate a new thrusting event; that is, this combination acts as a nucleating center for a new thrusting event. Therefore, thrusts develop in sequence towards the foreland as a consequence of the stress-concentrating ability of the toe of the thrust sheet. The gravity- and tectonic-induced stresses due to the surface topography (usually ignored in previous analyses) of an advancing thrust sheet play a key role in the nature of shallow foreland thrust propagation.
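The study above relies on the maximum resolved shear stress and the Mohr-Coulomb failure criterion to predict fault lines. The hedged sketch below shows the basic calculation for a single 2D stress state, including the two conjugate fault-plane orientations; the stress values and rock properties are illustrative, not taken from the numerical model.
    # Hedged sketch: maximum shear stress and a Mohr-Coulomb failure check for a 2D
    # stress state (compression positive), with the two conjugate fault-plane
    # orientations predicted by the criterion. Stresses and rock properties are
    # illustrative assumptions.
    import math

    sxx, syy, sxy = 90.0, 20.0, 15.0      # MPa, stresses near the thrust toe (toy)
    cohesion, phi_deg = 10.0, 30.0        # MPa, friction angle

    centre = 0.5 * (sxx + syy)
    tau_max = math.hypot(0.5 * (sxx - syy), sxy)        # radius of Mohr's circle
    s1, s3 = centre + tau_max, centre - tau_max         # principal stresses
    phi = math.radians(phi_deg)

    # Mohr-Coulomb: failure when s1 - s3 >= (s1 + s3) * sin(phi) + 2 * c * cos(phi)
    fails = (s1 - s3) >= (s1 + s3) * math.sin(phi) + 2.0 * cohesion * math.cos(phi)
    theta_s1 = 0.5 * math.degrees(math.atan2(2.0 * sxy, sxx - syy))  # s1 direction
    fault_angles = [theta_s1 + 45.0 - phi_deg / 2.0, theta_s1 - 45.0 + phi_deg / 2.0]

    print(f"s1 = {s1:.1f} MPa, s3 = {s3:.1f} MPa, tau_max = {tau_max:.1f} MPa")
    print(f"Mohr-Coulomb failure predicted: {fails}")
    print("conjugate fault planes (deg from x-axis): "
          + ", ".join(f"{a:.1f}" for a in fault_angles))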
Bridging paradigms: hybrid mechanistic-discriminative predictive models.
Doyle, Orla M; Tsaneva-Atansaova, Krasimira; Harte, James; Tiffin, Paul A; Tino, Peter; Díaz-Zuccarini, Vanessa
2013-03-01
Many disease processes are extremely complex and characterized by multiple stochastic processes interacting simultaneously. Current analytical approaches have included mechanistic models and machine learning (ML), which are often treated as orthogonal viewpoints. However, to facilitate truly personalized medicine, new perspectives may be required. This paper reviews the use of both mechanistic models and ML in healthcare as well as emerging hybrid methods, which are an exciting and promising approach for biologically based, yet data-driven advanced intelligent systems.
Development of an improved method of consolidating fatigue life data
NASA Technical Reports Server (NTRS)
Leis, B. N.; Sampath, S. G.
1978-01-01
A fatigue data consolidation model that incorporates recent advances in life prediction methodology was developed. A combined analytic and experimental study of fatigue of notched 2024-T3 aluminum alloy under constant amplitude loading was carried out. Because few systematic and complete data sets for 2024-T3 were available, the program generated data for fatigue crack initiation and separation failure for both zero and nonzero mean stresses. Consolidations of these data are presented.
1998 Technology Showcase. JOAP International Condition Monitoring Conference.
1998-04-01
Excerpted topics include "Systems using Automated SEM/EDX and New Diagnostic Routines" (N. W. Farrant & T. Luckhurst) and advanced diagnostic systems such as model-based diagnostics of gas ... The proceedings describe Scanning Electron Microscopy with Energy Dispersive X-Ray (SEM/EDX) microanalysis packages and Energy Dispersive X-Ray Fluorescence (EDXRF) analytical equipment, and wear particles separated by the ferrogram method.
Proactive human-computer collaboration for information discovery
NASA Astrophysics Data System (ADS)
DiBona, Phil; Shilliday, Andrew; Barry, Kevin
2016-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and the computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing high-value information that is needed to inform and substantiate hypotheses.
Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.
Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria
2017-06-15
Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data
Feikin, Daniel R.; Scott, J. Anthony G.; Zeger, Scott L.; Murdoch, David R.; O’Brien, Katherine L.; Deloria Knoll, Maria
2017-01-01
Abstract Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. PMID:28575372
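One standard way to adjust positivity rates for imperfect sensitivity and specificity, as described in the two records above, is the Rogan-Gladen estimator; the sketch below applies it to hypothetical values.
    # Hedged sketch: Rogan-Gladen adjustment of an observed positivity rate for
    # imperfect test sensitivity and specificity, one standard way of doing the
    # adjustment the abstracts describe. All input values are hypothetical.
    def rogan_gladen(apparent_prevalence, sensitivity, specificity):
        """True prevalence estimate: (p_obs + Sp - 1) / (Se + Sp - 1), clipped to [0, 1]."""
        est = (apparent_prevalence + specificity - 1.0) / (sensitivity + specificity - 1.0)
        return min(max(est, 0.0), 1.0)

    p_obs, se, sp = 0.35, 0.60, 0.95   # e.g. pathogen detected in 35% of swabs
    print(f"adjusted prevalence: {rogan_gladen(p_obs, se, sp):.2f}")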
Managing knowledge business intelligence: A cognitive analytic approach
NASA Astrophysics Data System (ADS)
Surbakti, Herison; Ta'a, Azman
2017-10-01
The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and an analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies are strongly associated with KM processes for attaining competitive advantage. KM is strongly influenced by human and social factors and turns them into the most valuable assets when an efficient system is run under BI tactics and technologies. Predictive analytics, however, is rooted in the field of BI. Extracting tacit knowledge so that it can serve as a new source for BI analysis is a major challenge. Analytic methods that address the diversity of the data corpus - structured and unstructured - require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive and prescriptive results. This is a major challenge nowadays, and this initial work aims to elaborate on it in detail.
Hines, Erin P; Rayner, Jennifer L; Barbee, Randy; Moreland, Rae Ann; Valcour, Andre; Schmid, Judith E; Fenton, Suzanne E
2007-05-01
Breast milk is a primary source of nutrition that contains many endogenous compounds that may affect infant development. The goals of this study were to develop reliable assays for selected endogenous breast milk components and to compare the levels of those components in milk and serum collected from the same mother twice during lactation (2-7 weeks and 3-4 months). Reliable assays were developed for glucose, secretory IgA, interleukin-6, tumor necrosis factor-α, triglycerides, prolactin, and estradiol from participants in a US EPA study called Methods Advancement in Milk Analysis (MAMA). Fresh and frozen (-20 degrees C) milk samples were assayed to determine the effects of storage on endogenous analytes. The source effect (serum vs milk) seen in all 7 analytes indicates that serum should not be used as a surrogate for milk in children's health studies. The authors propose to use these assays in studies to examine relationships between the levels of milk components and children's health.
NASA Astrophysics Data System (ADS)
Jeon, Haemin; Yu, Jaesang; Lee, Hunsu; Kim, G. M.; Kim, Jae Woo; Jung, Yong Chae; Yang, Cheol-Min; Yang, B. J.
2017-09-01
Continuous fiber-reinforced composites are important materials that have the highest commercialized potential in the upcoming future among existing advanced materials. Despite their wide use and value, their theoretical mechanisms have not been fully established due to the complexity of the compositions and their unrevealed failure mechanisms. This study proposes an effective three-dimensional damage modeling of a fibrous composite by combining analytical micromechanics and evolutionary computation. The interface characteristics, debonding damage, and micro-cracks are considered to be the most influential factors on the toughness and failure behaviors of composites, and a constitutive equation considering these factors was explicitly derived in accordance with the micromechanics-based ensemble volume averaged method. The optimal set of various model parameters in the analytical model were found using modified evolutionary computation that considers human-induced error. The effectiveness of the proposed formulation was validated by comparing a series of numerical simulations with experimental data from available studies.
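The study above fits parameters of an analytical micromechanics model using modified evolutionary computation. As a hedged, simplified analogue, the sketch below fits two parameters of a toy damage-type stress-strain model to synthetic data with SciPy's differential evolution; the model form and data are illustrative stand-ins, not the paper's constitutive equation.
    # Hedged sketch: fitting material-model parameters to stress-strain data with an
    # evolutionary algorithm (SciPy's differential evolution). The exponential-damage
    # stress model and the synthetic "experimental" data are illustrative stand-ins.
    import numpy as np
    from scipy.optimize import differential_evolution

    def stress(strain, E, eps0):
        """Toy damage model: sigma = E * strain * exp(-strain / eps0)."""
        return E * strain * np.exp(-strain / eps0)

    rng = np.random.default_rng(1)
    strain = np.linspace(0.0, 0.05, 50)
    sigma_exp = stress(strain, 70e3, 0.02) + rng.normal(0.0, 20.0, strain.size)  # MPa

    def objective(params):
        E, eps0 = params
        return np.sum((stress(strain, E, eps0) - sigma_exp) ** 2)

    result = differential_evolution(objective, bounds=[(1e3, 2e5), (1e-3, 0.1)], seed=1)
    E_fit, eps0_fit = result.x
    print(f"fitted E = {E_fit:.0f} MPa, eps0 = {eps0_fit:.4f}")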
NASA Technical Reports Server (NTRS)
Marsik, S. J.; Morea, S. F.
1985-01-01
A research and technology program for advanced high pressure, oxygen-hydrogen rocket propulsion technology is presently being pursued by the National Aeronautics and Space Administration (NASA) to establish the basic discipline technologies, develop the analytical tools, and establish the data base necessary for an orderly evolution of the staged combustion reusable rocket engine. The need for the program is based on the premise that the USA will depend on the Shuttle and its derivative versions as its principal Earth-to-orbit transportation system for the next 20 to 30 yr. The program is focused in three principal areas of enhancement: (1) life extension, (2) performance, and (3) operations and diagnosis. Within the technological disciplines the efforts include: rotordynamics, structural dynamics, fluid and gas dynamics, materials fatigue/fracture/life, turbomachinery fluid mechanics, ignition/combustion processes, manufacturing/producibility/nondestructive evaluation methods and materials development/evaluation. An overview of the Advanced High Pressure Oxygen-Hydrogen Rocket Propulsion Technology Program Structure and Working Groups objectives are presented with highlights of several significant achievements.
NASA Astrophysics Data System (ADS)
Marsik, S. J.; Morea, S. F.
1985-03-01
A research and technology program for advanced high pressure, oxygen-hydrogen rocket propulsion technology is presently being pursued by the National Aeronautics and Space Administration (NASA) to establish the basic discipline technologies, develop the analytical tools, and establish the data base necessary for an orderly evolution of the staged combustion reusable rocket engine. The need for the program is based on the premise that the USA will depend on the Shuttle and its derivative versions as its principal Earth-to-orbit transportation system for the next 20 to 30 yr. The program is focused in three principal areas of enhancement: (1) life extension, (2) performance, and (3) operations and diagnosis. Within the technological disciplines the efforts include: rotordynamics, structural dynamics, fluid and gas dynamics, materials fatigue/fracture/life, turbomachinery fluid mechanics, ignition/combustion processes, manufacturing/producibility/nondestructive evaluation methods and materials development/evaluation. An overview of the Advanced High Pressure Oxygen-Hydrogen Rocket Propulsion Technology Program Structure and Working Groups objectives are presented with highlights of several significant achievements.
Advanced Engineering Environments: Implications for Aerospace Manufacturing
NASA Technical Reports Server (NTRS)
Thomas, D.
2001-01-01
There are significant challenges facing today's aerospace industry. Global competition, more complex products, geographically distributed design teams, demands for lower cost, higher reliability and safer vehicles, and the need to incorporate the latest technologies more quickly all confront the developer of aerospace systems. New information technologies offer promising opportunities to develop advanced engineering environments (AEEs) to meet these challenges. Significant advances in the state of the art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. These advances will enable modeling and simulation of manufacturing methods, which will in turn allow manufacturing considerations to be included much earlier in the system development cycle. Significant cost savings, increased quality, and decreased manufacturing cycle time are expected to result. This paper will give an overview of NASA's Intelligent Synthesis Environment, the agency initiative to develop an AEE, with a focus on the anticipated benefits in aerospace manufacturing.
Recent Advances in the Measurement of Arsenic, Cadmium, and Mercury in Rice and Other Foods
Punshon, Tracy
2015-01-01
Trace element analysis of foods is of increasing importance because of raised consumer awareness and the need to evaluate and establish regulatory guidelines for toxic trace metals and metalloids. This paper reviews recent advances in the analysis of trace elements in food, including challenges, state-of-the-art methods, and the use of spatially resolved techniques for localizing the distribution of As and Hg within rice grains. Total elemental analysis of foods is relatively well established, but the push for ever lower detection limits requires that methods be robust against potential matrix interferences, which can be particularly severe for food. Inductively coupled plasma mass spectrometry (ICP-MS) is the method of choice, allowing for multi-element and highly sensitive analyses. For arsenic, speciation analysis is necessary because the inorganic forms are more likely to be subject to regulatory limits. Chromatographic techniques coupled to ICP-MS are most often used for arsenic speciation, and a range of methods now exist for a variety of different arsenic species in different food matrices. Speciation and spatial analysis of foods, especially rice, can also be achieved with synchrotron techniques. Sensitive analytical techniques and methodological advances provide robust methods for the assessment of several metals in animal- and plant-based foods, in particular for arsenic, cadmium and mercury in rice and arsenic speciation in foodstuffs. PMID:25938012
Attenuation correction in emission tomography using the emission data—A review
Li, Yusheng
2016-01-01
The problem of attenuation correction (AC) for quantitative positron emission tomography (PET) had been considered solved to a large extent after the commercial availability of devices combining PET with computed tomography (CT) in 2001; single photon emission computed tomography (SPECT) has seen a similar development. However, stimulated in particular by technical advances toward clinical systems combining PET and magnetic resonance imaging (MRI), research interest in alternative approaches for PET AC has grown substantially in the last years. In this comprehensive literature review, the authors first present theoretical results with relevance to simultaneous reconstruction of attenuation and activity. The authors then look back at the early history of this research area especially in PET; since this history is closely interwoven with that of similar approaches in SPECT, these will also be covered. We then review algorithmic advances in PET, including analytic and iterative algorithms. The analytic approaches are either based on the Helgason–Ludwig data consistency conditions of the Radon transform, or generalizations of John’s partial differential equation; with respect to iterative methods, we discuss maximum likelihood reconstruction of attenuation and activity (MLAA), the maximum likelihood attenuation correction factors (MLACF) algorithm, and their offspring. The description of methods is followed by a structured account of applications for simultaneous reconstruction techniques: this discussion covers organ-specific applications, applications specific to PET/MRI, applications using supplemental transmission information, and motion-aware applications. After briefly summarizing SPECT applications, we consider recent developments using emission data other than unscattered photons. In summary, developments using time-of-flight (TOF) PET emission data for AC have shown promising advances and open a wide range of applications. These techniques may both remedy deficiencies of purely MRI-based AC approaches in PET/MRI and improve standalone PET imaging. PMID:26843243
Attenuation correction in emission tomography using the emission data—A review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berker, Yannick, E-mail: berker@mail.med.upenn.edu; Li, Yusheng
2016-02-15
The problem of attenuation correction (AC) for quantitative positron emission tomography (PET) had been considered solved to a large extent after the commercial availability of devices combining PET with computed tomography (CT) in 2001; single photon emission computed tomography (SPECT) has seen a similar development. However, stimulated in particular by technical advances toward clinical systems combining PET and magnetic resonance imaging (MRI), research interest in alternative approaches for PET AC has grown substantially in the last years. In this comprehensive literature review, the authors first present theoretical results with relevance to simultaneous reconstruction of attenuation and activity. The authors then look back at the early history of this research area especially in PET; since this history is closely interwoven with that of similar approaches in SPECT, these will also be covered. We then review algorithmic advances in PET, including analytic and iterative algorithms. The analytic approaches are either based on the Helgason–Ludwig data consistency conditions of the Radon transform, or generalizations of John’s partial differential equation; with respect to iterative methods, we discuss maximum likelihood reconstruction of attenuation and activity (MLAA), the maximum likelihood attenuation correction factors (MLACF) algorithm, and their offspring. The description of methods is followed by a structured account of applications for simultaneous reconstruction techniques: this discussion covers organ-specific applications, applications specific to PET/MRI, applications using supplemental transmission information, and motion-aware applications. After briefly summarizing SPECT applications, we consider recent developments using emission data other than unscattered photons. In summary, developments using time-of-flight (TOF) PET emission data for AC have shown promising advances and open a wide range of applications. These techniques may both remedy deficiencies of purely MRI-based AC approaches in PET/MRI and improve standalone PET imaging.
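Among the iterative algorithms reviewed above, MLAA alternates an activity update with an attenuation update. The hedged toy sketch below implements only the MLEM activity update with fixed attenuation factors on a small synthetic system, as a minimal illustration of that building block; the attenuation update is omitted and the geometry and data are invented.
    # Hedged sketch: MLEM activity update with fixed attenuation factors on a tiny
    # synthetic system, the activity half of an alternating MLAA-style scheme (the
    # attenuation update step is omitted here). Geometry and data are toy values.
    import numpy as np

    rng = np.random.default_rng(0)
    n_pix, n_lor = 8, 24
    A = rng.uniform(0.0, 1.0, (n_lor, n_pix))          # toy geometric system matrix
    atten = np.exp(-rng.uniform(0.0, 0.5, n_lor))      # known attenuation per LOR
    lam_true = rng.uniform(1.0, 5.0, n_pix)            # true activity
    y = rng.poisson(atten * (A @ lam_true))            # measured counts

    lam = np.ones(n_pix)                               # MLEM iterations
    sens = A.T @ atten                                 # sensitivity image
    for _ in range(200):
        expected = atten * (A @ lam)
        lam *= (A.T @ (atten * y / np.maximum(expected, 1e-12))) / sens

    print("true activity:       ", np.round(lam_true, 2))
    print("reconstructed (MLEM):", np.round(lam, 2))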
Information Tailoring Enhancements for Large Scale Social Data
2016-03-15
Work performed within this reporting period included the following tasks. Implemented temporal analysis algorithms for advanced analytics in Scraawl: we implemented our backend web service design for temporal analysis and created a prototype GUI web service for the Scraawl analytics dashboard. Upgraded the Scraawl computational framework to increase ...
MIT CSAIL and Lincoln Laboratory Task Force Report
2016-08-01
The projects have been very diverse, spanning several areas of CSAIL concentration, including robotics, big data analytics, wireless communications, and computing architectures, as well as machine learning systems and algorithms, such as recommender systems, and "Big Data" analytics. Advanced computing architectures broadly refer to ...
Novel Advances in Shotgun Lipidomics for Biology and Medicine
Wang, Miao; Wang, Chunyan; Han, Rowland H.; Han, Xianlin
2015-01-01
The field of lipidomics, as coined in 2003, has made profound advances and expanded rapidly. The mass spectrometry-based strategies of this analytical, methodology-oriented research discipline for lipid analysis largely fall into three categories: direct infusion-based shotgun lipidomics, liquid chromatography-mass spectrometry-based platforms, and matrix-assisted laser desorption/ionization mass spectrometry-based approaches (particularly for imaging lipid distribution in tissues or cells). This review focuses on shotgun lipidomics. After briefly introducing its fundamentals, the article covers its recent advances. These include novel methods of lipid extraction, novel shotgun lipidomics strategies for the identification and quantification of previously hardly accessible lipid classes and molecular species, including isomers, and novel tools for the processing and interpretation of lipidomics data. Representative applications of advanced shotgun lipidomics for biological and biomedical research are also presented in this review. We believe that with these novel advances, shotgun lipidomics should become more comprehensive and high throughput, thereby greatly accelerating the lipidomics field in substantiating aberrant lipid metabolism, signaling, trafficking, and homeostasis under pathological conditions and their underpinning biochemical mechanisms. PMID:26703190
Counterfeit drugs: analytical techniques for their identification.
Martino, R; Malet-Martino, M; Gilard, V; Balayssac, S
2010-09-01
In recent years, the number of counterfeit drugs has increased dramatically, including not only "lifestyle" products but also vital medicines. Besides the threat to public health, the financial and reputational damage to pharmaceutical companies is substantial. The lack of robust information on the prevalence of fake drugs is an obstacle in the fight against drug counterfeiting. It is generally accepted that approximately 10% of drugs worldwide could be counterfeit, but it is also well known that this number covers very different situations depending on the country, the places where the drugs are purchased, and the definition of what constitutes a counterfeit drug. The chemical analysis of drugs suspected to be fake is a crucial step as counterfeiters are becoming increasingly sophisticated, rendering visual inspection insufficient to distinguish the genuine products from the counterfeit ones. This article critically reviews the recent analytical methods employed to control the quality of drug formulations, using as an example artemisinin derivatives, medicines particularly targeted by counterfeiters. Indeed, a broad panel of techniques have been reported for their analysis, ranging from simple and cheap in-field ones (colorimetry and thin-layer chromatography) to more advanced laboratory methods (mass spectrometry, nuclear magnetic resonance, and vibrational spectroscopies) through chromatographic methods, which remain the most widely used. The conclusion section of the article highlights the questions to be posed before selecting the most appropriate analytical approach.
Petchkovsky, Leon
2017-06-01
Analytical psychology shares with many other psychotherapies the important task of repairing the consequences of developmental trauma. The majority of analytic patients come from compromised early developmental backgrounds: they may have experienced neglect, abuse, or failures of empathic resonance from their carers. Functional brain imaging techniques, including Quantitative Electroencephalogram (QEEG) and functional Magnetic Resonance Imaging (fMRI), allow us to track mental processes in ways beyond verbal reportage and introspection. This independent perspective is useful for developing new psychodynamic hypotheses, testing current ones, providing diagnostic markers, and monitoring treatment progress. Jung, with the Word Association Test, grasped these principles 100 years ago. Brain imaging techniques have contributed to powerful recent advances in our understanding of neurodevelopmental processes in the first three years of life. If adequate nurturance is compromised, a range of difficulties may emerge. This has important implications for how we understand and treat our psychotherapy clients. The paper provides an overview of functional brain imaging and advances in developmental neuropsychology, and looks at applications of some of these findings (including neurofeedback) in the Jungian psychotherapy domain. © 2017, The Society of Analytical Psychology.
Schweitzer, Mary Higby; Schroeter, Elena R; Goshe, Michael B
2014-07-15
Advances in resolution and sensitivity of analytical techniques have provided novel applications, including the analyses of fossil material. However, the recovery of original proteinaceous components from very old fossil samples (defined as >1 million years (1 Ma) from previously named limits in the literature) is far from trivial. Here, we discuss the challenges to recovery of proteinaceous components from fossils, and the need for new sample preparation techniques, analytical methods, and bioinformatics to optimize and fully utilize the great potential of information locked in the fossil record. We present evidence for survival of original components across geological time, and discuss the potential benefits of recovery, analyses, and interpretation of fossil materials older than 1 Ma, both within and outside of the fields of evolutionary biology.
Wright, Aidan G C; Hallquist, Michael N
2014-01-01
Studying personality and its pathology as it changes, develops, or remains stable over time offers exciting insight into the nature of individual differences. Researchers interested in examining personal characteristics over time have a number of time-honored analytic approaches at their disposal. In recent years there have also been considerable advances in person-oriented analytic approaches, particularly longitudinal mixture models. In this methodological primer we focus on mixture modeling approaches to the study of normative and individual change in the form of growth mixture models and ipsative change in the form of latent transition analysis. We describe the conceptual underpinnings of each of these models, outline approaches for their implementation, and provide accessible examples for researchers studying personality and its assessment.
Analytical methodologies for aluminium speciation in environmental and biological samples--a review.
Bi, S P; Yang, X D; Zhang, F P; Wang, X L; Zou, G W
2001-08-01
It is recognized that aluminium (Al) is a potential environmental hazard. Acidic deposition has been linked to increased Al concentrations in natural waters. Elevated levels of Al might have serious consequences for biological communities. Of particular interest is the speciation of Al in aquatic environments, because Al toxicity depends on its forms and concentrations. In this paper, advances in analytical methodologies for Al speciation in environmental and biological samples during the past five years are reviewed. Concerns about the specific problems of Al speciation and highlights of some important methods are elucidated in sections devoted to hybrid techniques (HPLC or FPLC coupled with ET-AAS, ICP-AES, or ICP-MS), flow-injection analysis (FIA), nuclear magnetic resonance (27Al NMR), electrochemical analysis, and computer simulation. More than 130 references are cited.
Review and assessment of the database and numerical modeling for turbine heat transfer
NASA Technical Reports Server (NTRS)
Gladden, H. J.; Simoneau, R. J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.
Asher, Lucy; Collins, Lisa M.; Ortiz-Pelaez, Angel; Drewe, Julian A.; Nicol, Christine J.; Pfeiffer, Dirk U.
2009-01-01
While the incorporation of mathematical and engineering methods has greatly advanced in other areas of the life sciences, they have been under-utilized in the field of animal welfare. Exceptions are beginning to emerge and share a common motivation to quantify ‘hidden’ aspects in the structure of the behaviour of an individual, or group of animals. Such analyses have the potential to quantify behavioural markers of pain and stress and quantify abnormal behaviour objectively. This review seeks to explore the scope of such analytical methods as behavioural indicators of welfare. We outline four classes of analyses that can be used to quantify aspects of behavioural organization. The underlying principles, possible applications and limitations are described for: fractal analysis, temporal methods, social network analysis, and agent-based modelling and simulation. We hope to encourage further application of analyses of behavioural organization by highlighting potential applications in the assessment of animal welfare, and increasing awareness of the scope for the development of new mathematical methods in this area. PMID:19740922
Noyes, Pamela D.; Lema, Sean C.; Roberts, Simon C.; Cooper, Ellen M.
2014-01-01
Thyroid hormones are critical regulators of normal development and physiological functioning in all vertebrates. Radioimmunoassay (RIA) approaches have been the method of choice for measuring circulating levels of thyroid hormones in vertebrates. While sensitive, RIA-based approaches only allow for a single analyte measurement per assay, can lack concordance across platforms and laboratories, and can be prone to analytical interferences especially when used with fish plasma. Ongoing advances in liquid chromatography tandem mass spectrometry (LC/MS/MS) have led to substantial decreases in detection limits for thyroid hormones and other biomolecules in complex matrices, including human plasma. Despite these advances, current analytical approaches do not allow for the measurement of native thyroid hormone in teleost fish plasma by mass spectrometry and continue to rely on immunoassay. In this study, we developed a new method that allows for the rapid extraction and simultaneous measurement of total T4 (TT4) and total T3 (TT3) in low volumes (50 μL) of fish plasma by LC/MS/MS. Methods were optimized initially in plasma from rainbow trout (Oncorhynchus mykiss) and applied to plasma from other teleost fishes, including fathead minnows (Pimephales promelas), mummichogs (Fundulus heteroclitus), sockeye salmon (Oncorhynchus nerka), and coho salmon (Oncorhynchus kisutch). Validation of method performance with T4- and T3-spiked rainbow trout plasma at 2 and 4 ng/mL produced mean recoveries ranging from 82 to 95 % and 97 to 105 %, respectively. Recovery of 13C12-T4 internal standard in plasma extractions was: 99±1.8 % in rainbow trout, 85±11 % in fathead minnow, 73±5.0 % in mummichog, 73±1.7 % in sockeye salmon, and 80±8.4 % in coho salmon. While absolute levels of thyroid hormones measured in identical plasma samples by LC/MS/MS and RIA varied depending on the assay used, T4/T3 ratios were generally consistent across both techniques. Less variability was measured among samples subjected to LC/MS/MS suggesting a more precise estimate of thyroid hormone homeostasis in the species targeted. Overall, a sensitive and reproducible method was established that takes advantage of LC/MS/MS techniques to rapidly measure TT4 and TT3 with negligible interferences in low volumes of plasma across a variety of teleost fishes. PMID:24343452
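As a minimal illustration of the validation arithmetic summarized above, the following sketch computes spike recovery and internal-standard recovery; the numeric inputs are invented, not the study's measurements.

```python
# Illustrative recovery arithmetic of the kind reported above; the numeric
# values here are made up, not the study's data.
def spike_recovery(measured_spiked, measured_unspiked, spike_level):
    """Percent recovery of a known spike added to plasma."""
    return 100.0 * (measured_spiked - measured_unspiked) / spike_level

def internal_standard_recovery(peak_area_extracted, peak_area_neat):
    """Percent recovery of the 13C12-T4 internal standard after extraction."""
    return 100.0 * peak_area_extracted / peak_area_neat

# Example: 2 ng/mL T4 spike into trout plasma with 0.1 ng/mL endogenous signal.
print(f"T4 spike recovery: {spike_recovery(1.95, 0.10, 2.0):.0f} %")
print(f"IS recovery:       {internal_standard_recovery(9.9e5, 1.0e6):.0f} %")
```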
Analytical investigation of thermal barrier coatings on advanced power generation gas turbines
NASA Technical Reports Server (NTRS)
Amos, D. J.
1977-01-01
An analytical investigation of present and advanced gas turbine power generation cycles incorporating thermal barrier turbine component coatings was performed. Approximately 50 parametric points considering simple, recuperated, and combined cycles (including gasification) with gas turbine inlet temperatures from current levels through 1644K (2500 F) were evaluated. The results indicated that thermal barriers would be an attractive means to improve performance and reduce cost of electricity for these cycles. A recommended thermal barrier development program has been defined.
Microfluidic devices to enrich and isolate circulating tumor cells
Myung, J. H.; Hong, S.
2015-01-01
Given the potential clinical impact of circulating tumor cells (CTCs) in blood as a clinical biomarker for diagnosis and prognosis of various cancers, a myriad of detection methods for CTCs have been recently introduced. Among those, a series of microfluidic devices are particularly promising as these uniquely offer micro-scale analytical systems that are highlighted by low consumption of samples and reagents, high flexibility to accommodate other cutting-edge technologies, precise and well-defined flow behaviors, and automation capability, presenting significant advantages over the conventional larger scale systems. In this review, we highlight the advantages of microfluidic devices and their translational potential into CTC detection methods, categorized by miniaturization of bench-top analytical instruments, integration capability with nanotechnologies, and in situ or sequential analysis of captured CTCs. This review provides a comprehensive overview of recent advances in CTC detection achieved through the application of microfluidic devices, and of the challenges that these promising technologies must overcome to be clinically impactful. PMID:26549749
NASA Technical Reports Server (NTRS)
Hohenemser, K. H.; Banerjee, D.
1977-01-01
An introduction to aircraft state and parameter identification methods is presented. A simplified form of the maximum likelihood method is selected to extract analytical aeroelastic rotor models from simulated and dynamic wind tunnel test results for accelerated cyclic pitch stirring excitation. The dynamic inflow characteristics for forward flight conditions were identified from the blade flapping responses without direct inflow measurements. The rotor blades are essentially rigid for inplane bending and for torsion within the frequency range of study, but flexible in out-of-plane bending. Reverse flow effects are considered for high rotor advance ratios. Two inflow models are studied; the first is based on an equivalent blade Lock number, the second is based on a time delayed momentum inflow. In addition to the inflow parameters, basic rotor parameters like the blade natural frequency and the actual blade Lock number are identified together with measurement bias values. The effect of the theoretical dynamic inflow on the rotor eigenvalues is evaluated.
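The report's simplified maximum likelihood identification applies to a multi-parameter aeroelastic rotor model; as a much smaller, purely illustrative stand-in, the sketch below identifies the gain and time constant of a first-order lag (loosely analogous to a time-delayed inflow model) from simulated noisy response data by least-squares output error.

```python
# A greatly simplified, illustrative parameter-identification sketch: recover
# the time constant and gain of a first-order lag (a stand-in for a
# time-delayed inflow model) from simulated noisy response data by
# least-squares output error. This is not the rotor model of the report.
import numpy as np
from scipy.optimize import least_squares

def simulate(tau, gain, u, dt):
    """First-order lag: tau * xdot + x = gain * u, forward-Euler integration."""
    x = np.zeros_like(u)
    for k in range(1, u.size):
        x[k] = x[k - 1] + dt * (gain * u[k - 1] - x[k - 1]) / tau
    return x

dt = 0.01
t = np.arange(0.0, 5.0, dt)
u = np.sin(2.0 * np.pi * 0.5 * t)            # cyclic-pitch-like excitation
true_tau, true_gain = 0.35, 1.8
rng = np.random.default_rng(1)
y_meas = simulate(true_tau, true_gain, u, dt) + rng.normal(0, 0.02, t.size)

residual = lambda p: simulate(p[0], p[1], u, dt) - y_meas
fit = least_squares(residual, x0=[0.1, 1.0], bounds=([1e-3, 0.0], [5.0, 10.0]))
print("identified tau, gain:", fit.x)        # should be close to 0.35, 1.8
```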
Electrospray Modifications for Advancing Mass Spectrometric Analysis
Meher, Anil Kumar; Chen, Yu-Chie
2017-01-01
Generation of analyte ions in the gas phase is a primary requirement for mass spectrometric analysis. One of the ionization techniques that can be used to generate gas phase ions is electrospray ionization (ESI). ESI is a soft ionization method that can be used to analyze analytes ranging from small organics to large biomolecules. Numerous ionization techniques derived from ESI have been reported in the past two decades. These ion sources are aimed at achieving simplicity and ease of operation. Many of these ionization methods allow the flexibility to eliminate or minimize sample preparation steps prior to mass spectrometric analysis. Such ion sources have opened up new possibilities for tackling scientific challenges that might be limited by the conventional ESI technique. Thus, the number of ESI variants continues to increase. This review provides an overview of ionization techniques based on the use of electrospray reported in recent years. A brief discussion of the instrumentation, underlying processes, and selected applications is also presented. PMID:28573082
On the analytic and numeric optimisation of airplane trajectories under real atmospheric conditions
NASA Astrophysics Data System (ADS)
Gonzalo, J.; Domínguez, D.; López, D.
2014-12-01
From the beginning of the aviation era, economic constraints have forced operators to continuously improve the planning of flights. Revenue depends on the cost per flight and on airspace occupancy. Many methods, the first dating from the middle of the last century, have explored analytical, numerical and artificial-intelligence resources to reach an optimal flight plan. In parallel, advances in meteorology and communications allow almost real-time knowledge of the atmospheric conditions and a reliable, error-bounded forecast for the near future. Thus, apart from weather risks to be avoided, airplanes can dynamically adapt their trajectories to minimise their costs. International regulators are aware of these capabilities, so it is reasonable to envisage changes that will soon allow this dynamic planning negotiation to become operational. Moreover, current unmanned airplanes, very popular and often small, suffer the impact of winds and other weather conditions in the form of dramatic changes in their performance. The present paper reviews analytic and numeric solutions for typical trajectory planning problems. Analytic methods are those that try to solve the problem using the Pontryagin principle, where influence parameters are added to the state variables to form a split boundary-condition differential equation problem; the system can be solved numerically (indirect optimisation) or using parameterised functions (direct optimisation). Numerical methods, on the other hand, are based on Bellman's dynamic programming (or Dijkstra-type algorithms), which exploit the fact that two optimal trajectories can be concatenated into a new optimal one if the joining point is shown to belong to the final optimal solution. There are no a priori conditions favouring either method: traditionally, analytic approaches have been employed more for continuous problems and numeric approaches for discrete ones. In the present problem, airplane behaviour is defined by continuous equations, while wind fields are given on a discrete grid at certain time intervals. The research demonstrates advantages and disadvantages of each method, as well as performance figures of the solutions found for typical flight conditions under static and dynamic atmospheres. This provides significant parameters to be used in the selection of solvers for optimal trajectories.
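To make the dynamic-programming branch concrete, the following hedged sketch runs Dijkstra's algorithm over a coarse discretized grid whose edge costs are flight times under an invented wind field; the grid size, airspeed, and winds are placeholders, not the paper's atmospheric data.

```python
# Minimal sketch of the graph/dynamic-programming approach: Dijkstra over a
# coarse 2-D grid where each edge cost is flight time with a (made-up) wind
# field added to a fixed airspeed. Grid size, airspeed, and winds are
# illustrative, not real atmospheric data.
import heapq
import numpy as np

NX, NY, CELL = 30, 20, 10_000.0      # grid nodes and spacing [m]
TAS = 230.0                          # true airspeed [m/s]
rng = np.random.default_rng(2)
wind_u = 20.0 + 10.0 * rng.standard_normal((NX, NY))   # west-east wind [m/s]

def edge_time(x, y, dx, dy):
    """Time to cross one edge: distance / (airspeed + along-track wind)."""
    dist = CELL * np.hypot(dx, dy)
    along_track_wind = wind_u[x, y] * dx / max(np.hypot(dx, dy), 1e-9)
    return dist / max(TAS + along_track_wind, 50.0)     # keep speed positive

def dijkstra(start, goal):
    best = {start: 0.0}
    frontier = [(0.0, start)]
    moves = [(1, 0), (0, 1), (0, -1), (1, 1), (1, -1)]  # generally eastbound
    while frontier:
        cost, (x, y) = heapq.heappop(frontier)
        if (x, y) == goal:
            return cost
        for dx, dy in moves:
            nx_, ny_ = x + dx, y + dy
            if 0 <= nx_ < NX and 0 <= ny_ < NY:
                c = cost + edge_time(x, y, dx, dy)
                if c < best.get((nx_, ny_), np.inf):
                    best[(nx_, ny_)] = c
                    heapq.heappush(frontier, (c, (nx_, ny_)))
    return np.inf

print(f"minimum-time eastbound crossing: {dijkstra((0, 10), (NX - 1, 10)) / 60:.1f} min")
```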
Trojanowicz, Marek; Bobrowski, Krzysztof; Szostek, Bogdan; Bojanowska-Czajka, Anna; Szreder, Tomasz; Bartoszewicz, Iwona; Kulisa, Krzysztof
2018-01-15
The monitoring of Advanced Oxidation/Reduction Processes (AO/RPs) for the evaluation of the yield and mechanisms of decomposition of perfluorinated compounds (PFCs) is often a more difficult task than their determination in environmental, biological or food samples with complex matrices. This is mostly due to the formation of hundreds, or even thousands, of both intermediate and final products. The AO/RPs considered, all involving free-radical reactions, include photolytic and photocatalytic processes, Fenton reactions, sonolysis, ozonation, the application of ionizing radiation and several wet oxidation processes. The main attention is paid to the PFCs occurring most commonly in the environment, namely PFOA and PFOS. The most powerful and widely exploited method for this purpose is without a doubt LC/MS/MS, which allows the identification and trace quantitation of all species, with detectability and resolving power depending on the particular instrumental configuration. GC/MS is often employed for the monitoring of volatile fluorocarbons, confirming the formation of radicals in the processes of C–C and C–S bond cleavage. For direct monitoring of the radicals participating in PFC decomposition reactions, molecular spectroscopy is employed, especially electron paramagnetic resonance (EPR). UV/Vis spectrophotometry as a detection method is of special importance in evaluating the kinetics of radical reactions with the use of pulse radiolysis methods. Ion chromatography is most commonly employed for determining the yield of PFC mineralization, together with potentiometry with ion-selective electrodes and measurements of general parameters such as Total Organic Carbon and Total Organic Fluoride. The presented review is based on about 100 original papers published in both analytical and environmental journals. Copyright © 2017 Elsevier B.V. All rights reserved.
Ponce-Robles, Laura; Rivas, Gracia; Esteban, Belen; Oller, Isabel; Malato, Sixto; Agüera, Ana
2017-10-01
An analytical method was developed and validated for the determination of ten pesticides in sewage sludge coming from an agro-food industry. The method was based on the application of Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) extraction for solid sewage sludge and SPE extraction for the sludge aqueous phase, followed by liquid chromatography (LC) coupled to hybrid quadrupole/linear ion trap mass spectrometry (QqLIT-MS). The QuEChERS method was reported 14 years ago and nowadays is mainly applied to the analysis of pesticides in food. More recent applications have been reported in other matrices such as sewage sludge, but the complexity of the matrix makes it necessary to optimize the cleanup step to improve the efficiency of the analysis. With this aim, several dispersive solid-phase extraction cleanup sorbents were tested, and C18 + PSA was chosen as the d-SPE sorbent. The proposed method was satisfactorily validated for most compounds investigated, showing recoveries higher than 80% in most cases, with the only exception of prochloraz (71%) at the low concentration level. Limits of quantification were below 40 ng L⁻¹ in the aqueous phase and below 40 ng g⁻¹ in the solid phase for the majority of the analytes. The method was applied to solid sludge and the sludge aqueous phase coming from an agro-food industry which processes fruits and vegetables. Graphical abstract: Application of LC/MS/MS advanced analytical techniques for determination of pesticides contained in sewage sludge.
Management of thyroid cytological material, pre-analytical procedures and bio-banking.
Bode-Lesniewska, Beata; Cochand-Priollet, Beatrix; Straccia, Patrizia; Fadda, Guido; Bongiovanni, Massimo
2018-06-09
Thyroid nodules are common and increasingly detected due to recent advances in imaging techniques. However, clinically relevant thyroid cancer is rare and the mortality from aggressive thyroid cancer remains constant. FNAC (Fine Needle Aspiration Cytology) is a standard method for diagnosing thyroid malignancy and discriminating malignant nodules from goiter. As the nodules examined by thyroid FNAC are often small incidental findings, it is important to maintain a low rate of undetermined diagnoses requiring further clinical work-up or surgery. The most important factors determining the accuracy of the cytological diagnosis and the suitability of thyroid FNACs for biobanking are the quality of the sample and the availability of adequate tissue for auxiliary studies. This article analyses technical aspects (pre-analytics) of performing thyroid FNACs, including image guidance and rapid on-site evaluation (ROSE), sample collection methods (conventional slides, liquid based methods (LBC), cell blocks) and storage (bio-banking). The spectrum of special studies (immunocytochemistry on direct slides or LBC, immunohistochemistry on cell blocks, and molecular methods) required for improving the precision of the cytological diagnosis of thyroid nodules is discussed. This article is protected by copyright. All rights reserved.
Contributions of immunoaffinity chromatography to deep proteome profiling of human biofluids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chaochao; Duan, Jicheng; Liu, Tao
2016-05-01
Human biofluids, especially blood plasma or serum, hold great potential as the sources of potential biomarkers for various diseases; however, the enormous dynamic range of protein concentrations in biofluids represents a significant analytical challenge to detect promising low-abundance protein biomarkers. Over the last decade, various immunoaffinity chromatographic methods have been developed and routinely applied for separating low-abundance proteins from the high and moderate-abundance proteins, thus enabling more effective detection of low-abundance proteins. Herein, we review the advances of immunoaffinity separation methods and their contributions to the proteomics applications of different human biofluids. The limitations and future perspective of immunoaffinity separation methods are also discussed.
Optimal guidance law development for an advanced launch system
NASA Technical Reports Server (NTRS)
Calise, Anthony J.; Hodges, Dewey H.; Leung, Martin S.; Bless, Robert R.
1991-01-01
The proposed investigation of a Matched Asymptotic Expansion (MAE) method was carried out. It was concluded that the method of MAE is not applicable to launch vehicle ascent trajectory optimization due to the lack of a suitable stretched variable. More work was done on the earlier regular perturbation approach, using a piecewise analytic zeroth-order solution to generate a more accurate approximation. In the meantime, a singular perturbation approach using manifold theory is under investigation. Work on a general computational environment based on the use of MACSYMA and the weak Hamiltonian finite element method continued during this period. This methodology is capable of solving a large class of optimal control problems.
Posse, Stefan
2011-01-01
The rapid development of fMRI was paralleled early on by the adaptation of MR spectroscopic imaging (MRSI) methods to quantify water relaxation changes during brain activation. This review describes the evolution of multi-echo acquisition from high-speed MRSI to multi-echo EPI and beyond. It highlights milestones in the development of multi-echo acquisition methods, such as the discovery of considerable gains in fMRI sensitivity when combining echo images, advances in quantification of the BOLD effect using analytical biophysical modeling and interleaved multi-region shimming. The review conveys the insight gained from combining fMRI and MRSI methods and concludes with recent trends in ultra-fast fMRI, which will significantly increase temporal resolution of multi-echo acquisition. PMID:22056458
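One commonly cited way of combining echoes for BOLD sensitivity weights each echo by TE·exp(−TE/T2*); the sketch below illustrates that generic weighting on synthetic multi-echo time series, with echo times and T2* assumed for the example rather than taken from the review.

```python
# Illustrative TE-weighted combination of multi-echo fMRI time series, using
# the commonly cited w(TE) ~ TE * exp(-TE / T2*) weighting. Echo times, T2*,
# and the synthetic data are assumptions for the sketch, not values from the
# review.
import numpy as np

TE = np.array([0.012, 0.028, 0.044])        # echo times [s]
T2_STAR = 0.040                             # assumed gray-matter T2* [s]

weights = TE * np.exp(-TE / T2_STAR)
weights /= weights.sum()

rng = np.random.default_rng(3)
n_vol = 200
signal = np.exp(-TE[:, None] / T2_STAR) * (100 + rng.normal(0, 1, (3, n_vol)))

combined = weights @ signal                  # weighted sum across echoes
print("per-echo tSNR:", (signal.mean(1) / signal.std(1)).round(1))
print("combined tSNR:", round(combined.mean() / combined.std(), 1))
```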
Bobst, Cedric E.; Kaltashov, Igor A.
2012-01-01
Mass spectrometry has already become an indispensable tool in the analytical armamentarium of the biopharmaceutical industry, although its current uses are limited to characterization of the covalent structure of recombinant protein drugs. However, the scope of applications of mass spectrometry-based methods is beginning to expand to include characterization of the higher order structure and dynamics of biopharmaceutical products, a development which is catalyzed by the recent progress in mass spectrometry-based methods to study higher order protein structure. The two particularly promising methods that are likely to have the most significant and lasting impact in many areas of biopharmaceutical analysis, direct ESI MS and hydrogen/deuterium exchange, are the focus of this article. PMID:21542797
State-of-the-art Instruments for Detecting Extraterrestrial Life
NASA Technical Reports Server (NTRS)
Bada, Jeffrey L.
2003-01-01
In the coming decades, state-of-the-art spacecraft-based instruments that can detect key components associated with life as we know it on Earth will directly search for extinct or extant extraterrestrial life in our solar system. Advances in our analytical and detection capabilities, especially those based on microscale technologies, will be important in enhancing the abilities of these instruments. Remote sensing investigations of the atmospheres of extrasolar planets could provide evidence of photosynthetic-based life outside our solar system, although less advanced life will remain undetectable by these methods. Finding evidence of extraterrestrial life would have profound consequences both with respect to our understanding of chemical and biological evolution, and whether the biochemistry on Earth is unique in the universe.
Direct Multiple Shooting Optimization with Variable Problem Parameters
NASA Technical Reports Server (NTRS)
Whitley, Ryan J.; Ocampo, Cesar A.
2009-01-01
Taking advantage of a novel approach to the design of the orbital transfer optimization problem and advanced non-linear programming algorithms, several optimal transfer trajectories are found for problems with and without known analytic solutions. This method treats the fixed known gravitational constants as optimization variables in order to reduce the need for an advanced initial guess. Complex periodic orbits are targeted with very simple guesses and the ability to find optimal transfers in spite of these bad guesses is successfully demonstrated. Impulsive transfers are considered for orbits in both the 2-body frame as well as the circular restricted three-body problem (CRTBP). The results with this new approach demonstrate the potential for increasing robustness for all types of orbit transfer problems.
Optimal design application on the advanced aeroelastic rotor blade
NASA Technical Reports Server (NTRS)
Wei, F. S.; Jones, R.
1985-01-01
The vibration and performance optimization procedure using regression analysis was successfully applied to an advanced aeroelastic blade design study. The major advantage of this regression technique is that multiple optimizations can be performed to evaluate the effects of various objective functions and constraint functions. The databases obtained from the rotorcraft flight simulation program C81 and the Myklestad mode shape program are analytically determined as functions of each design variable. This approach has been verified for various blade radial ballast weight locations and blade planforms. This method can also be utilized to ascertain, without any additional effort, the effect of a particular cost function composed of several objective functions with different weighting factors for various mission requirements.
Mirasoli, Mara; Guardigli, Massimo; Michelini, Elisa; Roda, Aldo
2014-01-01
Miniaturization of analytical procedures through microchips, lab-on-a-chip or micro total analysis systems is one of the most recent trends in chemical and biological analysis. These systems are designed to perform all the steps in an analytical procedure, with the advantages of low sample and reagent consumption, fast analysis, reduced costs, and the possibility of extra-laboratory application. A range of detection technologies have been employed in miniaturized analytical systems, but most applications have relied on fluorescence and electrochemical detection. Chemical luminescence (which includes chemiluminescence, bioluminescence, and electrogenerated chemiluminescence) represents an alternative detection principle that has offered comparable (or better) analytical performance and easier implementation in miniaturized analytical devices. Nevertheless, chemical luminescence-based devices represent only a small fraction of the microfluidic devices reported in the literature, and until now no review has focused on these devices. Here we review the most relevant applications (since 2009) of miniaturized analytical devices based on chemical luminescence detection. After a brief overview of the main chemical luminescence systems and of the recent technological advancements regarding their implementation in miniaturized analytical devices, analytical applications are reviewed according to the nature of the device (microfluidic chips, microchip electrophoresis, lateral flow- and paper-based devices) and the type of application (micro-flow injection assays, enzyme assays, immunoassays, gene probe hybridization assays, cell assays, whole-cell biosensors). Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2017-12-01
NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) a full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations; the operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables; this near real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) a WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following:
- A new API that supports full temporal, spatial, and grid-based resolution services with sample queries
- A Docker-ready RES application to deploy across platforms
- Extended capabilities that enable single- and multiple-reanalysis area averages, vertical averages, re-gridding, standard deviations, and ensemble averages
- Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly), illustrated in the sketch following this list
- Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55 and NOAA/ESRL 20CR…
- A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management
- Supporting analytic services for NASA GMAO Forward Processing datasets
- Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization)
- The ability to compute and visualize multiple reanalyses for ease of inter-comparison
- Automated tools to retrieve and prepare data collections for analytic processing
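Since the CDSlib API itself is not shown here, the following plain-NumPy sketch only illustrates, on a synthetic monthly-means cube, the kind of area-average and anomaly operations listed above; variable names and grid dimensions are assumptions.

```python
# Generic illustration (plain NumPy, not CDSlib -- its API is not shown here)
# of the kinds of operations listed above: an area-weighted average and a
# monthly-climatology anomaly over a synthetic monthly-means cube.
import numpy as np

rng = np.random.default_rng(4)
n_years, n_lat, n_lon = 10, 91, 180
lat = np.linspace(-90, 90, n_lat)
data = 288 + 10 * np.cos(np.deg2rad(lat))[None, :, None] \
           + rng.normal(0, 0.5, (12 * n_years, n_lat, n_lon))   # [months, lat, lon]

# Area-weighted (cos-latitude) global mean for each month.
w = np.cos(np.deg2rad(lat))
global_mean = (data * w[None, :, None]).sum(axis=(1, 2)) / (w.sum() * n_lon)

# Anomaly relative to the mean annual cycle (monthly climatology).
climatology = global_mean.reshape(n_years, 12).mean(axis=0)
anomaly = global_mean - np.tile(climatology, n_years)
print("first year of anomalies [K]:", anomaly[:12].round(2))
```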
NASA Astrophysics Data System (ADS)
Xu, Yao; Zhang, Chun-Hui; Niebur, Ernst; Wang, Jun-Song
2018-04-01
Not Available. Project supported by the National Natural Science Foundation of China (Grant No. 61473208), the Tianjin Research Program of Application Foundation and Advanced Technology, China (Grant No. 15JCYBJC47700), the National Institutes of Health, USA (Grant Nos. R01DA040990 and R01EY027544), and the Project of Humanities and Social Sciences from the Ministry of Education, China (Grant No. 17YJAZH092).
Exploring Novel Spintronic Responses from Advanced Functional Organic Materials
2015-08-10
[Fragmentary report text; only portions are recoverable: a figure caption describing a substrate under a 365 nm handheld UV lamp before (left) and after (right) writing letters with ascorbic acid (AA) solution as ink; experimental notes on UV exposure (360 nm, 20 min) and octadecyltrichlorosilane (OTS) treatment carried out in an OTS vapor chamber (80 °C, vacuum, 2 h 30 min); and a remark that many analytical detection methods for AA, relevant to levels up to 20 mM in food and pharmaceuticals, have been developed, including electrophoresis and UV-Vis.]
Chemistry and Biochemistry of Dietary Polyphenols
Tsao, Rong
2010-01-01
Polyphenols are the largest group of phytochemicals, and many of them have been found in plant-based foods. Polyphenol-rich diets have been linked to many health benefits. This paper is intended to review the chemistry and biochemistry of polyphenols as related to classification, extraction, separation and analytical methods, their occurrence and biosynthesis in plants, and their biological activities and implications in human health. The discussion focuses on the most important and most recent advances in the above aspects, and challenges are identified for future research. PMID:22254006
Eric Davidson, his philosophy, and the history of science.
Deichmann, Ute
2017-10-16
Eric Davidson, a passionate molecular developmental biologist and intellectual, believed that conceptual advances in the sciences should be based on knowledge of conceptual history. Convinced of the superiority of a causal-analytical approach over other methods, he succeeded in applying this approach to the complex feature of organismal development by introducing the far-reaching concept of developmental Gene Regulatory Networks. This essay reviews Davidson's philosophy, his support for the history of science, and some aspects of his scientific personality.
Fatigue and fracture: Overview
NASA Technical Reports Server (NTRS)
Halford, G. R.
1984-01-01
A brief overview of the status of the fatigue and fracture programs is given. The programs involve the development of appropriate analytic material behavior models for cyclic stress-strain-temperature-time response, cyclic crack initiation, and cyclic crack propagation. The underlying thrust of these programs is the development and verification of workable engineering methods for the calculation, in advance of service, of the local cyclic stress-strain response at the critical life-governing location in hot section components, and the resultant crack initiation and crack growth lifetimes.
Momchilova, Svetlana M; Nikolova-Damyanova, Boryana M
2012-01-01
An effort is made to critically present the achievements in silver ion chromatography during the last decade. Novelties in columns, mobile-phase compositions and detectors are described. Recent applications of silver ion chromatography in the analysis of fatty acids and triacylglycerols are presented while stressing novel analytical strategies or new objects. The tendencies in the application of the method in complementary ways with reversed-phase chromatography, chiral chromatography and, especially, mass detection are outlined.
Coherent imaging at the diffraction limit
Thibault, Pierre; Guizar-Sicairos, Manuel; Menzel, Andreas
2014-01-01
X-ray ptychography, a scanning coherent diffractive imaging technique, holds promise for imaging with dose-limited resolution and sensitivity. If the foreseen increase of coherent flux by orders of magnitude can be matched by additional technological and analytical advances, ptychography may approach imaging speeds familiar from full-field methods while retaining its inherently quantitative nature and metrological versatility. Beyond promises of high throughput, spectroscopic applications in three dimensions become feasible, as do measurements of sample dynamics through time-resolved imaging or careful characterization of decoherence effects. PMID:25177990
Lewczuk, Piotr; Riederer, Peter; O'Bryant, Sid E; Verbeek, Marcel M; Dubois, Bruno; Visser, Pieter Jelle; Jellinger, Kurt A; Engelborghs, Sebastiaan; Ramirez, Alfredo; Parnetti, Lucilla; Jack, Clifford R; Teunissen, Charlotte E; Hampel, Harald; Lleó, Alberto; Jessen, Frank; Glodzik, Lidia; de Leon, Mony J; Fagan, Anne M; Molinuevo, José Luis; Jansen, Willemijn J; Winblad, Bengt; Shaw, Leslie M; Andreasson, Ulf; Otto, Markus; Mollenhauer, Brit; Wiltfang, Jens; Turner, Martin R; Zerr, Inga; Handels, Ron; Thompson, Alexander G; Johansson, Gunilla; Ermann, Natalia; Trojanowski, John Q; Karaca, Ilker; Wagner, Holger; Oeckl, Patrick; van Waalwijk van Doorn, Linda; Bjerke, Maria; Kapogiannis, Dimitrios; Kuiperij, H Bea; Farotti, Lucia; Li, Yi; Gordon, Brian A; Epelbaum, Stéphane; Vos, Stephanie J B; Klijn, Catharina J M; Van Nostrand, William E; Minguillon, Carolina; Schmitz, Matthias; Gallo, Carla; Lopez Mato, Andrea; Thibaut, Florence; Lista, Simone; Alcolea, Daniel; Zetterberg, Henrik; Blennow, Kaj; Kornhuber, Johannes
2018-06-01
In the 12 years since the publication of the first Consensus Paper of the WFSBP on biomarkers of neurodegenerative dementias, enormous advancement has taken place in the field, and the Task Force now takes the opportunity to extend and update the original paper. New concepts of Alzheimer's disease (AD) and the conceptual interactions between AD and dementia due to AD were developed, resulting in two sets of diagnostic/research criteria. Procedures for pre-analytical sample handling, biobanking, analyses and post-analytical interpretation of the results were intensively studied and optimised. A global quality control project was introduced to evaluate and monitor the inter-centre variability in measurements with the goal of harmonisation of results. Contexts of use and how to approach candidate biomarkers in biological specimens other than cerebrospinal fluid (CSF), e.g. blood, were precisely defined. Important development was achieved in neuroimaging techniques, including studies comparing amyloid-β positron emission tomography results to fluid-based modalities. Similarly, development in research laboratory technologies, such as ultra-sensitive methods, raises our hopes to further improve analytical and diagnostic accuracy of classic and novel candidate biomarkers. Synergistically, advancement in clinical trials of anti-dementia therapies energises and motivates the efforts to find and optimise the most reliable early diagnostic modalities. Finally, the first studies were published addressing the potential cost-effectiveness of the biomarkers-based diagnosis of neurodegenerative disorders.
Algorithms and software for U-Pb geochronology by LA-ICPMS
NASA Astrophysics Data System (ADS)
McLean, Noah M.; Bowring, James F.; Gehrels, George
2016-07-01
The past 15 years have produced numerous innovations in geochronology, including experimental methods, instrumentation, and software that are revolutionizing the acquisition and application of geochronological data. For example, exciting advances are being driven by Laser-Ablation ICP Mass Spectrometry (LA-ICPMS), which allows for rapid determination of U-Th-Pb ages with 10s of micrometer-scale spatial resolution. This method has become the most commonly applied tool for dating zircons, constraining a host of geological problems. The LA-ICPMS community is now faced with archiving these data with associated analytical results and, more importantly, ensuring that data meet the highest standards for precision and accuracy and that interlaboratory biases are minimized. However, there is little consensus with regard to analytical strategies and data reduction protocols for LA-ICPMS geochronology. The result is systematic interlaboratory bias and both underestimation and overestimation of uncertainties on calculated dates that, in turn, decrease the value of data in repositories such as EarthChem, which archives data and analytical results from participating laboratories. We present free open-source software that implements new algorithms for evaluating and resolving many of these discrepancies. This solution is the result of a collaborative effort to extend the U-Pb_Redux software for the ID-TIMS community to the LA-ICPMS community. Now named ET_Redux, our new software automates the analytical and scientific workflows of data acquisition, statistical filtering, data analysis and interpretation, publication, community-based archiving, and the compilation and comparison of data from different laboratories to support collaborative science.
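The underlying age equations are standard: t = ln(1 + daughter/parent)/λ, with the 238U and 235U decay constants of Jaffey et al. (1971). The sketch below applies them to invented isotope ratios; it is not ET_Redux code and omits the uncertainty propagation and interlaboratory corrections discussed above.

```python
# The standard U-Pb age equations, t = ln(1 + D/P) / lambda, with the decay
# constants of Jaffey et al. (1971). The isotope ratios in the example are
# invented for illustration, not measured LA-ICPMS data.
import math

LAMBDA_238 = 1.55125e-10   # 238U decay constant [1/yr]
LAMBDA_235 = 9.8485e-10    # 235U decay constant [1/yr]

def pb206_u238_date(ratio_206_238):
    return math.log1p(ratio_206_238) / LAMBDA_238

def pb207_u235_date(ratio_207_235):
    return math.log1p(ratio_207_235) / LAMBDA_235

# Example: a radiogenic 206Pb/238U of 0.0168 corresponds to roughly 107 Ma.
print(f"206Pb/238U date: {pb206_u238_date(0.0168) / 1e6:.1f} Ma")
print(f"207Pb/235U date: {pb207_u235_date(0.1130) / 1e6:.1f} Ma")
```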
Metal species involved in long distance metal transport in plants
Álvarez-Fernández, Ana; Díaz-Benito, Pablo; Abadía, Anunciación; López-Millán, Ana-Flor; Abadía, Javier
2014-01-01
The mechanisms plants use to transport metals from roots to shoots are not completely understood. It has long been proposed that organic molecules participate in metal translocation within the plant. However, until recently the identity of the complexes involved in the long-distance transport of metals could only be inferred by using indirect methods, such as analyzing separately the concentrations of metals and putative ligands and then using in silico chemical speciation software to predict metal species. Molecular biology approaches also have provided a breadth of information about putative metal ligands and metal complexes occurring in plant fluids. The new advances in analytical techniques based on mass spectrometry and the increased use of synchrotron X-ray spectroscopy have allowed for the identification of some metal-ligand species in plant fluids such as the xylem and phloem saps. Also, some proteins present in plant fluids can bind metals and a few studies have explored this possibility. This study reviews the analytical challenges researchers have to face to understand long-distance metal transport in plants as well as the recent advances in the identification of the ligand and metal-ligand complexes in plant fluids. PMID:24723928
Fibrinolysis standards: a review of the current status.
Thelwell, C
2010-07-01
Biological standards are used to calibrate measurements of components of the fibrinolytic system, either for assigning potency values to therapeutic products or to determine levels in human plasma as an indicator of thrombotic risk. Traditionally, WHO International Standards are calibrated in International Units based on consensus values from collaborative studies. The International Unit is defined by the response activity of a given amount of the standard in a bioassay, independent of the method used. Assay validity is based on the assumption that both the standard and the test preparation contain the same analyte, and that the response in an assay is a true function of this analyte. This principle is reflected in the diversity of source materials used to prepare fibrinolysis standards, which has depended on the contemporary preparations they were employed to measure. With advancing recombinant technology and improved analytical techniques, a reference system based on reference materials and associated reference methods has been recommended for future fibrinolysis standards. Careful consideration and scientific judgement must, however, be applied when deciding on an approach to develop a new standard, with decisions based on the suitability of a standard to serve its purpose, and not just to satisfy a metrological ideal. Copyright © 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.
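As a hedged illustration of how a bioassay response is converted to a potency relative to a standard, the sketch below fits a classical parallel-line model (common slope on log dose, separate intercepts) to simulated data; the doses and responses are invented and the calculation is generic, not tied to any particular WHO standard.

```python
# Illustrative parallel-line potency calculation: both preparations are fit
# with a common slope on log-dose, and the relative potency is the antilog of
# the horizontal shift between the two lines. Doses and responses are
# simulated, not from any real collaborative study.
import numpy as np

log_dose = np.log10(np.array([1.0, 2.0, 4.0, 8.0]))
resp_standard = np.array([10.1, 14.9, 20.2, 25.1])     # response to standard
resp_test     = np.array([ 8.2, 13.0, 18.1, 23.0])     # response to test sample

# One design matrix with a common slope and separate intercepts.
n = log_dose.size
X = np.column_stack([
    np.concatenate([log_dose, log_dose]),              # common slope
    np.concatenate([np.ones(n), np.zeros(n)]),         # intercept, standard
    np.concatenate([np.zeros(n), np.ones(n)]),         # intercept, test
])
y = np.concatenate([resp_standard, resp_test])
slope, a_std, a_test = np.linalg.lstsq(X, y, rcond=None)[0]

log_rel_potency = (a_test - a_std) / slope
print(f"relative potency of test vs. standard: {10 ** log_rel_potency:.2f}")
```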
The evolution of analytical chemistry methods in foodomics.
Gallo, Monica; Ferranti, Pasquale
2016-01-08
The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches based on different techniques to study the chemical composition of a food at the molecular level may allow the definition of a 'food fingerprint', valuable for assessing the nutritional value, safety and quality, authenticity and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability and food production in response to world-wide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we will explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies to produce progress in our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review will be to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.
Gonzalez, Maria E; Barrett, Diane M
2010-01-01
Advanced food processing methods that accomplish inactivation of microorganisms but minimize adverse thermal exposure are of great interest to the food industry. High pressure (HP) and pulsed electric field (PEF) processing are commercially applied to produce high quality fruit and vegetable products in the United States, Europe, and Japan. Both microbial and plant cell membranes are significantly altered following exposure to heat, HP, or PEF. Our research group sought to quantify the degree of damage to plant cell membranes that occurs as a result of exposure to heat, HP, or PEF, using the same analytical methods. In order to evaluate whether new advanced processing methods are superior to traditional thermal processing methods, it is necessary to compare them. In this review, we describe the existing state of knowledge related to effects of heat, HP, and PEF on both microbial and plant cells. The importance and relevance of compartmentalization in plant cells as it relates to fruit and vegetable quality is described and various methods for quantification of plant cell membrane integrity are discussed. These include electrolyte leakage, cell viability, and proton nuclear magnetic resonance (1H-NMR). PMID:20492210
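Of the membrane-integrity measures mentioned, electrolyte leakage has a particularly simple arithmetic: the conductivity of the bathing solution after treatment expressed as a percentage of the total conductivity after complete cell disruption. The sketch below illustrates that calculation with invented conductivity values.

```python
# A common electrolyte-leakage index of membrane integrity: conductivity of
# the bathing solution after treatment, expressed as a percentage of the
# total conductivity after complete cell disruption (e.g., freeze-thaw or
# autoclaving). Conductivity values are illustrative only.
def electrolyte_leakage_percent(ec_after_treatment, ec_total_after_disruption):
    return 100.0 * ec_after_treatment / ec_total_after_disruption

samples = {
    "raw tissue":         (12.0, 410.0),   # (EC_treatment, EC_total) in uS/cm
    "HP processed":       (95.0, 400.0),
    "thermally blanched": (310.0, 405.0),
}
for name, (ec_t, ec_tot) in samples.items():
    print(f"{name:18s} leakage = {electrolyte_leakage_percent(ec_t, ec_tot):5.1f} %")
```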
Baldrian, Petr; López-Mondéjar, Rubén
2014-02-01
Molecular methods for the analysis of biomolecules have undergone rapid technological development in the last decade. The advent of next-generation sequencing methods and improvements in instrumental resolution enabled the analysis of complex transcriptome, proteome and metabolome data, as well as a detailed annotation of microbial genomes. The mechanisms of decomposition by model fungi have been described in unprecedented detail by the combination of genome sequencing, transcriptomics and proteomics. The increasing number of available genomes for fungi and bacteria shows that the genetic potential for decomposition of organic matter is widespread among taxonomically diverse microbial taxa, while expression studies document the importance of the regulation of expression in decomposition efficiency. Importantly, high-throughput methods of nucleic acid analysis used for the analysis of metagenomes and metatranscriptomes indicate the high diversity of decomposer communities in natural habitats and their taxonomic composition. Today, the metaproteomics of natural habitats is of interest. In combination with advanced analytical techniques to explore the products of decomposition and the accumulation of information on the genomes of environmentally relevant microorganisms, advanced methods in microbial ecophysiology should increase our understanding of the complex processes of organic matter transformation.
Glycoprotein Disease Markers and Single Protein-omics*
Chandler, Kevin; Goldman, Radoslav
2013-01-01
Glycoproteins are well represented among biomarkers for inflammatory and cancer diseases. Secreted and membrane-associated glycoproteins make excellent targets for noninvasive detection. In this review, we discuss clinically applicable markers of cancer diseases and methods for their analysis. High throughput discovery continues to supply marker candidates with unusual glycan structures, altered glycoprotein abundance, or distribution of site-specific glycoforms. Improved analytical methods are needed to unlock the potential of these discoveries in validated clinical assays. A new generation of targeted quantitative assays is expected to advance the use of glycoproteins in early detection of diseases, molecular disease classification, and monitoring of therapeutic interventions. PMID:23399550
NASA Technical Reports Server (NTRS)
Della-Corte, Christopher
2012-01-01
Foil gas bearings are a key technology in many commercial and emerging oil-free turbomachinery systems. These bearings are nonlinear and have been difficult to model analytically in terms of performance characteristics such as load capacity, power loss, stiffness, and damping. Previous investigations led to an empirically derived method to estimate load capacity. This method has been a valuable tool in system development. The current work extends this tool concept to include rules for stiffness and damping coefficient estimation. It is expected that these rules will further accelerate the development and deployment of advanced oil-free machines operating on foil gas bearings.
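The empirically derived load-capacity estimate referred to above is commonly quoted in the rule-of-thumb form W ≈ ℑ·(L·D)·(D·N); the sketch below implements that form with an assumed load-capacity coefficient purely for illustration, and neither the coefficient value nor the example bearing is taken from this report.

```python
# Sketch of the empirically derived load-capacity "rule of thumb" form,
# W ~ LCC * (L * D) * (D * N), with bearing length L and diameter D in inches
# and shaft speed N in thousands of rpm. The load-capacity coefficient (LCC)
# used below is an assumed placeholder, not a value from this report.
def foil_bearing_load_capacity(length_in, diameter_in, speed_krpm, lcc=1.0):
    """Estimated steady load capacity in lbf (LCC in lbf / (in^3 * krpm))."""
    return lcc * (length_in * diameter_in) * (diameter_in * speed_krpm)

# Example: a 2 in x 2 in journal bearing at 30 krpm with an assumed LCC of 1.0.
print(f"estimated load capacity: {foil_bearing_load_capacity(2.0, 2.0, 30.0):.0f} lbf")
```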
Rogue athletes and recombinant DNA technology: challenges for doping control.
Azzazy, Hassan M E; Mansour, Mai M H
2007-10-01
The quest for athletic excellence holds no limit for some athletes, and the advances in recombinant DNA technology have handed these athletes the ultimate doping weapons: recombinant proteins and gene doping. Some detection methods are now available for several recombinant proteins that are commercially available as pharmaceuticals and being abused by dopers. However, researchers are struggling to come up with efficient detection methods in preparation for the imminent threat of gene doping, expected in the 2008 Olympics. This Forum article presents the main detection strategies for recombinant proteins and the forthcoming detection strategies for gene doping as well as the prime analytical challenges facing them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chaochao; Duan, Jicheng; Liu, Tao
Human biofluids, especially blood plasma or serum, hold great potential as the sources of candidate biomarkers for various diseases; however, the enormous dynamic range of protein concentrations in biofluids represents a significant analytical challenge for detecting promising low-abundance proteins. Over the last decade, various immunoaffinity chromatographic methods have been developed and routinely applied for separating low-abundance proteins from the high- and moderate-abundance proteins, thus enabling much more effective detection of low-abundance proteins. Herein, we review the advances of immunoaffinity separation methods and their contributions to the proteomic applications in human biofluids. The limitations and future perspectives of immunoaffinity separation methods are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meinke, Rainer B.; Goodzeit, Carl L.; Ball, Millicent J.
This research project advanced the development of reliable, cost-effective arrays of superconducting quadrupole magnets for use in multi-beam inertial fusion accelerators. The field in each array cell must be identical and meet stringent requirements for field quality and strength. An optimized compact array design using flat double-layer pancake coils was developed. Analytical studies of edge termination methods showed that it is feasible to meet the requirements for field uniformity in all cells and elimination of stray external field in several ways: active methods that involve placement of field compensating coils on the periphery of the array or a passive method that involves use of iron shielding.
Yim, Sehyuk; Gultepe, Evin; Gracias, David H; Sitti, Metin
2014-02-01
This paper proposes a new wireless biopsy method where a magnetically actuated untethered soft capsule endoscope carries and releases a large number of thermo-sensitive, untethered microgrippers (μ-grippers) at a desired location inside the stomach and retrieves them after they self-fold and grab tissue samples. We describe the working principles and analytical models for the μ-gripper release and retrieval mechanisms, and evaluate the proposed biopsy method in ex vivo experiments. This hierarchical approach combining the advanced navigation skills of centimeter-scaled untethered magnetic capsule endoscopes with highly parallel, autonomous, submillimeter scale tissue sampling μ-grippers offers a multifunctional strategy for gastrointestinal capsule biopsy.
Usefulness of Analytical Research: Rethinking Analytical R&D&T Strategies.
Valcárcel, Miguel
2017-11-07
This Perspective is intended to help foster true innovation in Research & Development & Transfer (R&D&T) in Analytical Chemistry in the form of advances that are primarily useful for analytical purposes rather than solely for publishing. Devising effective means to strengthen the crucial contribution of Analytical Chemistry to progress in Chemistry, Science & Technology, and Society requires carefully examining the present status of our discipline and also identifying internal and external driving forces with a potential adverse impact on its development. The diagnostic process should be followed by administration of an effective therapy and supported by adoption of a theragnostic strategy if Analytical Chemistry is to enjoy a better future.
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
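As an illustration of the kind of exact solution such suites provide (not a problem drawn from the suite itself), the sketch below evaluates two textbook one-group results, the infinite-medium multiplication factor k∞ = νΣf/Σa and a bare-slab diffusion-theory keff, for assumed cross sections.

```python
# Not a problem from the suite itself: a textbook one-group analytic result
# of the kind such suites provide as a reference. Infinite-medium k_inf and a
# bare-slab diffusion-theory k_eff for assumed (illustrative) cross sections.
import math

NU_SIGMA_F = 0.157   # nu * fission cross section [1/cm]  (assumed)
SIGMA_A    = 0.120   # absorption cross section   [1/cm]  (assumed)
D          = 0.90    # diffusion coefficient      [cm]    (assumed)
SLAB_WIDTH = 60.0    # physical slab thickness    [cm]

k_inf = NU_SIGMA_F / SIGMA_A

extrap = 2.13 * D                              # extrapolation distance ~2.13*D
buckling = (math.pi / (SLAB_WIDTH + 2 * extrap)) ** 2
k_eff = NU_SIGMA_F / (SIGMA_A + D * buckling)

print(f"k_inf = {k_inf:.4f}")
print(f"bare-slab k_eff = {k_eff:.4f}")
```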
Zakaria, Rosita; Allen, Katrina J.; Koplin, Jennifer J.; Roche, Peter
2016-01-01
Introduction: Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential applications of dried blood spot based mass spectrometric applications. Analysts, on the other hand, face challenges of sensitivity, reproducibility and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn screening applications of dried blood spot quantification by mass spectrometry. Methods: To address these aims we performed a keyword search of the PubMed and MEDLINE online databases in conjunction with individual manual searches to gather information. Keywords for the initial search included "blood spot" and "mass spectrometry", while excluding "newborn" and "neonate". In addition, databases were restricted to English-language and human-specific records. There was no time period limit applied. Results: As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: 1) clinical applications; and 2) analytical considerations across the total testing process, comprising pre-analytical, analytical and post-analytical considerations. Conclusions: DBS analysis using MS applications is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement, and further bridging experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for the dried blood spot sample matrix is required. PMID:28149263
New test techniques and analytical procedures for understanding the behavior of advanced propellers
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Bober, L. J.; Neumann, H. E.
1983-01-01
Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.
Contributions of Analytical Chemistry to the Clinical Laboratory.
ERIC Educational Resources Information Center
Skogerboe, Kristen J.
1988-01-01
Highlights several analytical techniques that are being used in state-of-the-art clinical labs. Illustrates how other advances in instrumentation may contribute to clinical chemistry in the future. Topics include: biosensors, polarization spectroscopy, chemiluminescence, fluorescence, photothermal deflection, and chromatography in clinical…
Analytical technique characterizes all trace contaminants in water
NASA Technical Reports Server (NTRS)
Foster, J. N.; Lysyj, I.; Nelson, K. H.
1967-01-01
A properly programmed combination of advanced chemical and physical analytical techniques critically characterizes all trace contaminants in both the potable and waste water from the Apollo Command Module. This methodology can also be applied to the investigation of the source of water pollution.
NASA Astrophysics Data System (ADS)
Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.
2017-10-01
Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since introducing WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. First, collaboration with human computer interaction experts led to a complete interface redesign that deeply immerses the analyst within an ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Second, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states and redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and inform how others may freely access the tool.
Review and assessment of the database and numerical modeling for turbine heat transfer
NASA Technical Reports Server (NTRS)
Gladden, H. J.; Simoneau, R. J.
1989-01-01
The objectives of the NASA Hot Section Technology (HOST) Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.
Gradient Optimization for Analytic conTrols - GOAT
NASA Astrophysics Data System (ADS)
Assémat, Elie; Machnes, Shai; Tannor, David; Wilhelm-Mauch, Frank
Quantum optimal control has become a necessary step in a number of studies in the quantum realm. Recent experimental advances have shown that superconducting qubits can be controlled with impressive accuracy. However, most of the standard optimal control algorithms are not designed to manage such high accuracy. To tackle this issue, a novel quantum optimal control algorithm has been introduced: the Gradient Optimization for Analytic conTrols (GOAT). It avoids the piecewise constant approximation of the control pulse used by standard algorithms. This allows an efficient implementation of very high accuracy optimization. It also includes a novel method to compute the gradient that provides many advantages, e.g. the absence of backpropagation or the natural route to optimize the robustness of the control pulses. This talk will present the GOAT algorithm and a few applications to transmon systems.
Drury, Colin G
2015-01-01
In recent years, advances in sensor technology, connectedness and computational power have come together to produce huge data-sets. The treatment and analysis of these data-sets is known as big data analytics (BDA), and by the somewhat related term data mining. Fields allied to human factors/ergonomics (HFE), e.g. statistics, have developed computational methods to derive meaningful, actionable conclusions from these databases. This paper examines BDA, often characterised by volume, velocity and variety, giving examples of successful BDA use. This examination provides context by considering examples of using BDA on human data, uses of BDA in HFE studies, and studies of how people perform BDA. Significant issues for HFE are the reliance of BDA on correlation rather than hypotheses and theory, the ethics of BDA and the use of HFE in data visualisation.
Analytical separations of mammalian decomposition products for forensic science: a review.
Swann, L M; Forbes, S L; Lewis, S W
2010-12-03
The study of mammalian soft tissue decomposition is an emerging area in forensic science, with a major focus of the research being the use of various chemical and biological methods to study the fate of human remains in the environment. Decomposition of mammalian soft tissue is a postmortem process that, depending on environmental conditions and physiological factors, will proceed until complete disintegration of the tissue. The major stages of decomposition involve complex reactions which result in the chemical breakdown of the body's main constituents: lipids, proteins, and carbohydrates. The first step to understanding this chemistry is identifying the compounds present in decomposition fluids and determining when they are produced. This paper provides an overview of decomposition chemistry and reviews recent advances in this area utilising analytical separation science. Copyright © 2010 Elsevier B.V. All rights reserved.
The Water-Energy-Food Nexus: A systematic review of methods for nexus assessment
NASA Astrophysics Data System (ADS)
Albrecht, Tamee R.; Crootof, Arica; Scott, Christopher A.
2018-04-01
The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex resource and development challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, while the WEF nexus offers a promising conceptual approach, the use of WEF nexus methods to systematically evaluate water, energy, and food interlinkages or support development of socially and politically-relevant resource policies has been limited. This paper reviews WEF nexus methods to provide a knowledge base of existing approaches and promote further development of analytical methods that align with nexus thinking. The systematic review of 245 journal articles and book chapters reveals that (a) use of specific and reproducible methods for nexus assessment is uncommon (less than one-third); (b) nexus methods frequently fall short of capturing interactions among water, energy, and food—the very linkages they conceptually purport to address; (c) assessments strongly favor quantitative approaches (nearly three-quarters); (d) use of social science methods is limited (approximately one-quarter); and (e) many nexus methods are confined to disciplinary silos—only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. To help overcome these limitations, we derive four key features of nexus analytical tools and methods—innovation, context, collaboration, and implementation—from the literature that reflect WEF nexus thinking. By evaluating existing nexus analytical approaches based on these features, we highlight 18 studies that demonstrate promising advances to guide future research. This paper finds that to address complex resource and development challenges, mixed-methods and transdisciplinary approaches are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and decision-makers.
Video Analysis of Anterior Cruciate Ligament (ACL) Injuries
Carlson, Victor R.; Sheehan, Frances T.; Boden, Barry P.
2016-01-01
Background: As the most viable method for investigating in vivo anterior cruciate ligament (ACL) rupture, video analysis is critical for understanding ACL injury mechanisms and advancing preventative training programs. Despite the limited number of published studies involving video analysis, much has been gained through evaluating actual injury scenarios. Methods: Studies meeting criteria for this systematic review were collected by performing a broad search of the ACL literature with use of variations and combinations of video recordings and ACL injuries. Both descriptive and analytical studies were included. Results: Descriptive studies have identified specific conditions that increase the likelihood of an ACL injury. These conditions include close proximity to opposing players or other perturbations, high shoe-surface friction, and landing on the heel or the flat portion of the foot. Analytical studies have identified high-risk joint angles on landing, such as a combination of decreased ankle plantar flexion, decreased knee flexion, and increased hip flexion. Conclusions: The high-risk landing position appears to influence the likelihood of ACL injury to a much greater extent than inherent risk factors. As such, on the basis of the results of video analysis, preventative training should be applied broadly. Kinematic data from video analysis have provided insights into the dominant forces that are responsible for the injury (i.e., axial compression with potential contributions from quadriceps contraction and valgus loading). With the advances in video technology currently underway, video analysis will likely lead to enhanced understanding of non-contact ACL injury. PMID:27922985
Alternatives to current flow cytometry data analysis for clinical and research studies.
Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul
2018-02-01
Flow cytometry has well-established methods for data analysis based on traditional data collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now a standard on virtually every new instrument, and so users can easily accumulate multiple data sets quickly. Further, because the user must carefully define the layout of the plate, this information is already defined when considering the analytical process, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow-cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis. Complex data analysis is already a challenge to traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes difficult to define without more advanced analytical tools. In settings such as clinical labs where rapid and accurate data analysis is a priority, rapid, efficient and intuitive software is needed. This paper outlines opportunities for analysis of complex data sets using examples of multiplexed bead-based assays, drug screens and cell cycle analysis - any of which could become integrated into the clinical environment. Copyright © 2017. Published by Elsevier Inc.
Advanced industrial fluorescence metrology used for qualification of high quality optical materials
NASA Astrophysics Data System (ADS)
Engel, Axel; Becker, Hans-Juergen; Sohr, Oliver; Haspel, Rainer; Rupertus, Volker
2003-11-01
Schott Glas is developing and producing optical materials for various specialized applications in telecommunication, biomedical, optical, and micro lithography technology. The requirements on quality for optical materials are extremely high and still increasing. For example, in micro lithography applications the impurities of the material are specified to be in the low ppb range. Usually the impurities in the lower ppb range are determined using analytical methods like LA ICP-MS and Neutron Activation Analysis. On the other hand, absorption and laser resistivity of optical material are qualified with optical methods like precision spectral photometers and in-situ transmission measurements using UV lasers. Analytical methods have the drawback that they are time consuming and rather expensive, whereas the sensitivity of the absorption method will not be sufficient to characterize future needs (absorption coefficients well below 10^-3 cm^-1). In order to achieve the current and future quality requirements, a Jobin Yvon FLUOROLOG 3.22 fluorescence spectrometer is employed to enable fast and precise qualification and analysis. The main advantage of this setup is the combination of highest sensitivity (more than one order of magnitude higher sensitivity than state-of-the-art UV absorption spectroscopy) and fast measurement and evaluation cycles (several minutes compared to the several hours necessary for chemical analytics). An overview of the spectral characteristics is given using specified standards. Moreover, correlations to the material qualities are shown. In particular, we have investigated the elementary fluorescence and absorption of rare earth element impurities as well as defect-induced luminescence originating from impurities.
An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Zhou, Ning
With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
Analysis of Big Data in Gait Biomechanics: Current Trends and Future Directions.
Phinyomark, Angkoon; Petri, Giovanni; Ibáñez-Marcelo, Esther; Osis, Sean T; Ferber, Reed
2018-01-01
The increasing amount of data in biomechanics research has greatly increased the importance of developing advanced multivariate analysis and machine learning techniques, which are better able to handle "big data". Consequently, advances in data science methods will expand the knowledge for testing new hypotheses about biomechanical risk factors associated with walking and running gait-related musculoskeletal injury. This paper begins with a brief introduction to an automated three-dimensional (3D) biomechanical gait data collection system, 3D GAIT, followed by a discussion of how studies in the field of gait biomechanics fit the 5 V's definition of big data: volume, velocity, variety, veracity, and value. Next, we provide a review of recent research and development in multivariate and machine learning-based gait analysis methods that can be applied to big data analytics. These modern biomechanical gait analysis methods include several main modules such as initial input features, dimensionality reduction (feature selection and extraction), and learning algorithms (classification and clustering). Finally, a promising big data exploration tool called "topological data analysis" and directions for future research are outlined and discussed.
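As a rough illustration of the pipeline modules listed above (input features, dimensionality reduction, learning algorithm), the following Python sketch chains standard scikit-learn components on synthetic data; the array shapes and labels are hypothetical placeholders, not 3D GAIT data.

# Minimal sketch of the gait-analysis pipeline stages named in the abstract
# (features -> dimensionality reduction -> classifier), using synthetic data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 300))        # e.g. 120 trials x 300 gait waveform features (hypothetical)
y = rng.integers(0, 2, size=120)       # e.g. injured vs. uninjured labels (hypothetical)

model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="linear"))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())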
Jim Starnes' Contributions to Residual Strength Analysis Methods for Metallic Structures
NASA Technical Reports Server (NTRS)
Young, Richard D.; Rose, Cheryl A.; Harris, Charles E.
2005-01-01
A summary of advances in residual strength analyses methods for metallic structures that were realized under the leadership of Dr. James H. Starnes, Jr., is presented. The majority of research led by Dr. Starnes in this area was conducted in the 1990's under the NASA Airframe Structural Integrity Program (NASIP). Dr. Starnes, respectfully referred to herein as Jim, had a passion for studying complex response phenomena and dedicated a significant amount of research effort toward advancing damage tolerance and residual strength analysis methods for metallic structures. Jim's efforts were focused on understanding damage propagation in built-up fuselage structure with widespread fatigue damage, with the goal of ensuring safety in the aging international commercial transport fleet. Jim's major contributions in this research area were in identifying the effects of combined internal pressure and mechanical loads, and geometric nonlinearity, on the response of built-up structures with damage. Analytical and experimental technical results are presented to demonstrate the breadth and rigor of the research conducted in this technical area. Technical results presented herein are drawn exclusively from papers where Jim was a co-author.
Lindgren, Annie R; Anderson, Frank E
2018-01-01
Historically, deep-level relationships within the molluscan class Cephalopoda (squids, cuttlefishes, octopods and their relatives) have remained elusive due in part to the considerable morphological diversity of extant taxa, a limited fossil record for species that lack a calcareous shell and difficulties in sampling open ocean taxa. Many conflicts identified by morphologists in the early 1900s remain unresolved today in spite of advances in morphological, molecular and analytical methods. In this study we assess the utility of transcriptome data for resolving cephalopod phylogeny, with special focus on the orders of Decapodiformes (open-eye squids, bobtail squids, cuttlefishes and relatives). To do so, we took new and previously published transcriptome data and used a unique cephalopod core ortholog set to generate a dataset that was subjected to an array of filtering and analytical methods to assess the impacts of: taxon sampling, ortholog number, compositional and rate heterogeneity and incongruence across loci. Analyses indicated that datasets that maximized taxonomic coverage but included fewer orthologs were less stable than datasets that sacrificed taxon sampling to increase the number of orthologs. Clades recovered irrespective of dataset, filtering or analytical method included Octopodiformes (Vampyroteuthis infernalis + octopods), Decapodiformes (squids, cuttlefishes and their relatives), and orders Oegopsida (open-eyed squids) and Myopsida (e.g., loliginid squids). Ordinal-level relationships within Decapodiformes were the most susceptible to dataset perturbation, further emphasizing the challenges associated with uncovering relationships at deep nodes in the cephalopod tree of life. Copyright © 2017 Elsevier Inc. All rights reserved.
Quantitative metabolomics by H-NMR and LC-MS/MS confirms altered metabolic pathways in diabetes.
Lanza, Ian R; Zhang, Shucha; Ward, Lawrence E; Karakelides, Helen; Raftery, Daniel; Nair, K Sreekumaran
2010-05-10
Insulin is a major postprandial hormone with profound effects on carbohydrate, fat, and protein metabolism. In the absence of exogenous insulin, patients with type 1 diabetes exhibit a variety of metabolic abnormalities including hyperglycemia, glycosuria, accelerated ketogenesis, and muscle wasting due to increased proteolysis. We analyzed plasma from type 1 diabetic (T1D) humans during insulin treatment (I+) and acute insulin deprivation (I-) and from non-diabetic participants (ND) by (1)H nuclear magnetic resonance spectroscopy and liquid chromatography-tandem mass spectrometry. The aim was to determine if this combination of analytical methods could provide information on metabolic pathways known to be altered by insulin deficiency. Multivariate statistics differentiated proton spectra from I- and I+ based on several derived plasma metabolites that were elevated during insulin deprivation (lactate, acetate, allantoin, ketones). Mass spectrometry revealed significant perturbations in levels of plasma amino acids and amino acid metabolites during insulin deprivation. Further analysis of metabolite levels measured by the two analytical techniques indicates several known metabolic pathways that are perturbed in T1D (I-) (protein synthesis and breakdown, gluconeogenesis, ketogenesis, amino acid oxidation, mitochondrial bioenergetics, and oxidative stress). This work demonstrates the promise of combining multiple analytical methods with advanced statistical methods in quantitative metabolomics research, which we have applied to the clinical situation of acute insulin deprivation in T1D to reflect the numerous metabolic pathways known to be affected by insulin deficiency.
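The multivariate discrimination step described above can be illustrated with a minimal Python sketch on synthetic data; the spectral bins, group sizes, and the PCA-plus-classifier combination are assumptions made for illustration, not the authors' exact statistical workflow.

# Sketch: separating two groups of binned spectra with PCA followed by a linear classifier.
# The spectra are random stand-ins for 1H-NMR intensities, with a planted group difference.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
spectra = rng.normal(size=(40, 200))          # 40 plasma samples x 200 spectral bins (hypothetical)
spectra[:20, :10] += 1.0                      # planted elevation in a few bins for one group
groups = np.array([1] * 20 + [0] * 20)        # 1 = insulin-deprived, 0 = insulin-treated (assumed labels)

model = make_pipeline(PCA(n_components=5), LogisticRegression())
print("cross-validated accuracy:", cross_val_score(model, spectra, groups, cv=5).mean())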
NASA Astrophysics Data System (ADS)
Chien, Chih-Chun; Kouachi, Said; Velizhanin, Kirill A.; Dubi, Yonatan; Zwolak, Michael
2017-01-01
We present a method for calculating analytically the thermal conductance of a classical harmonic lattice with both alternating masses and nearest-neighbor couplings when placed between individual Langevin reservoirs at different temperatures. The method utilizes recent advances in analytic diagonalization techniques for certain classes of tridiagonal matrices. It recovers the results from a previous method that was applicable for alternating on-site parameters only, and extends the applicability to realistic systems in which masses and couplings alternate simultaneously. With this analytic result in hand, we show that the thermal conductance is highly sensitive to the modulation of the couplings. This is due to the existence of topologically induced edge modes at the lattice-reservoir interface and is also a reflection of the symmetries of the lattice. We make a connection to a recent work that demonstrates thermal transport is analogous to chemical reaction rates in solution given by Kramers' theory [Velizhanin et al., Sci. Rep. 5, 17506 (2015)], 10.1038/srep17506. In particular, we show that the turnover behavior in the presence of edge modes prevents calculations based on single-site reservoirs from coming close to the natural—or intrinsic—conductance of the lattice. Obtaining the correct value of the intrinsic conductance through simulation of even a small lattice where ballistic effects are important requires quite large extended reservoir regions. Our results thus offer a route for both the design and proper simulation of thermal conductance of nanoscale devices.
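As a numerical companion to the analytic diagonalization described above, one can build and diagonalize the mass-weighted tridiagonal matrix of an alternating chain directly; the parameters and end pinning in the sketch below are illustrative assumptions, not the paper's closed-form result.

# Numerical cross-check sketch: dynamical matrix of a 1D chain with alternating
# masses m1, m2 and alternating nearest-neighbor couplings k1, k2.
import numpy as np

N, m1, m2, k1, k2 = 40, 1.0, 2.0, 1.0, 0.5          # illustrative parameters
masses = np.array([m1 if i % 2 == 0 else m2 for i in range(N)])
springs = np.array([k1 if i % 2 == 0 else k2 for i in range(N - 1)])

K = np.zeros((N, N))                                 # force-constant matrix
for i, k in enumerate(springs):
    K[i, i] += k; K[i + 1, i + 1] += k
    K[i, i + 1] -= k; K[i + 1, i] -= k
K[0, 0] += k1; K[-1, -1] += k2                       # end couplings standing in for the reservoirs

D = K / np.sqrt(np.outer(masses, masses))            # mass-weighted (tridiagonal) matrix
omega = np.sqrt(np.linalg.eigvalsh(D))               # phonon frequencies of the finite chain
print("lowest and highest phonon frequencies:", omega.min(), omega.max())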
An Experimental Introduction to Interlaboratory Exercises in Analytical Chemistry
ERIC Educational Resources Information Center
Puignou, L.; Llaurado, M.
2005-01-01
An experimental exercise on analytical proficiency studies in collaborative trials is proposed. This practical provides students in advanced undergraduate courses in chemistry, pharmacy, and biochemistry, with the opportunity to improve their quality assurance skills. It involves an environmental analysis, determining the concentration of a…
Penman-Aguilar, Ana; Talih, Makram; Huang, David; Moonesinghe, Ramal; Bouye, Karen; Beckles, Gloria
2016-01-01
Reduction of health disparities and advancement of health equity in the United States require high-quality data indicative of where the nation stands vis-à-vis health equity, as well as proper analytic tools to facilitate accurate interpretation of these data. This article opens with an overview of health equity and social determinants of health. It then proposes a set of recommended practices in measurement of health disparities, health inequities, and social determinants of health at the national level to support the advancement of health equity, highlighting that (1) differences in health and its determinants that are associated with social position are important to assess; (2) social and structural determinants of health should be assessed and multiple levels of measurement should be considered; (3) the rationale for methodological choices made and measures chosen should be made explicit; (4) groups to be compared should be simultaneously classified by multiple social statuses; and (5) stakeholders and their communication needs can often be considered in the selection of analytic methods. Although much is understood about the role of social determinants of health in shaping the health of populations, researchers should continue to advance understanding of the pathways through which they operate on particular health outcomes. There is still much to learn and implement about how to measure health disparities, health inequities, and social determinants of health at the national level, and the challenges of health equity persist. We anticipate that the present discussion will contribute to the laying of a foundation for standard practice in the monitoring of national progress toward achievement of health equity.
Strategic analytics: towards fully embedding evidence in healthcare decision-making.
Garay, Jason; Cartagena, Rosario; Esensoy, Ali Vahit; Handa, Kiren; Kane, Eli; Kaw, Neal; Sadat, Somayeh
2015-01-01
Cancer Care Ontario (CCO) has implemented multiple information technology solutions and collected health-system data to support its programs. There is now an opportunity to leverage these data and perform advanced end-to-end analytics that inform decisions around improving health-system performance. In 2014, CCO engaged in an extensive assessment of its current data capacity and capability, with the intent to drive increased use of data for evidence-based decision-making. The breadth and volume of data at CCO uniquely position the organization to contribute not only to system-wide operational reporting, but also to more advanced modelling of current and future state system management and planning. In 2012, CCO established a strategic analytics practice to assist the agency's programs in contextualizing and informing key business decisions and to provide support through innovative predictive analytics solutions. This paper describes the organizational structure, services and supporting operations that have enabled progress to date, and discusses the next steps towards the vision of embedding evidence fully into healthcare decision-making. Copyright © 2014 Longwoods Publishing.
Initial alignment method for free space optics laser beam
NASA Astrophysics Data System (ADS)
Shimada, Yuta; Tashiro, Yuki; Izumi, Kiyotaka; Yoshida, Koichi; Tsujimura, Takeshi
2016-08-01
The authors have recently proposed and constructed an active free space optics transmission system. It is equipped with a motor-driven laser emitting mechanism and positioning photodiodes, and it transmits a collimated thin laser beam and accurately steers the laser beam direction. It is necessary to introduce the laser beam within the detectable range of the receiver in advance of laser beam tracking control. This paper studies an estimation method for the laser reaching point for initial laser beam alignment. Distributed photodiodes detect the laser luminescence at their respective positions, and the optical axis of the laser beam is analytically estimated based on Gaussian beam optics. Computer simulation evaluates the accuracy of the proposed estimation methods, and the results show that the methods help to guide the laser beam to a distant receiver.
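A minimal sketch of the kind of estimation described above, assuming a Gaussian irradiance profile and noiseless readings from photodiodes at hypothetical positions, fits the beam-spot centre by linear least squares on the log-intensities.

# Hypothetical sketch: estimate the beam-spot centre from photodiode intensity
# readings, assuming a Gaussian profile I = I0 * exp(-2 r^2 / w^2). Taking ln(I)
# makes the model linear in the unknowns, so ordinary least squares applies.
import numpy as np

diode_xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 2.0]])  # mm (assumed layout)
true_centre, w, I0 = np.array([6.0, 3.5]), 8.0, 1.0          # synthetic ground truth for the demo
r2 = np.sum((diode_xy - true_centre) ** 2, axis=1)
intensities = I0 * np.exp(-2.0 * r2 / w**2)                  # simulated photodiode readings

A = np.column_stack([np.ones(len(diode_xy)), diode_xy, np.sum(diode_xy**2, axis=1)])
coef, *_ = np.linalg.lstsq(A, np.log(intensities), rcond=None)
a, b, c, d = coef                                            # ln I ~ a + b*x + c*y + d*(x^2 + y^2)
estimated_centre = np.array([-b / (2 * d), -c / (2 * d)])
print("estimated beam centre (mm):", estimated_centre)       # recovers ~[6.0, 3.5]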
NASA Technical Reports Server (NTRS)
Mcgary, Michael C.
1988-01-01
The anticipated application of advanced turboprop propulsion systems is expected to increase the interior noise of future aircraft to unacceptably high levels. The absence of technically and economically feasible noise source-path diagnostic tools has been a prime obstacle in the development of efficient noise control treatments for propeller-driven aircraft. A new diagnostic method that permits the separation and prediction of the fully coherent airborne and structureborne components of the sound radiated by plates or thin shells has been developed. Analytical and experimental studies of the proposed method were performed on an aluminum plate. The results of the study indicate that the proposed method could be used in flight, and has fewer encumbrances than the other diagnostic tools currently available.
NASA Technical Reports Server (NTRS)
Revell, J. D.; Balena, F. J.; Koval, L. R.
1980-01-01
The acoustical treatment mass penalties required to achieve an interior noise level of 80 dBA for high speed, fuel efficient propfan-powered aircraft are determined. The prediction method used is based on theory developed for the outer shell dynamics, and a modified approach for add-on noise control element performance. The present synthesis of these methods is supported by experimental data. Three different sized aircraft are studied, including a widebody, a narrowbody and a business sized aircraft. Noise control penalties are calculated for each aircraft for two kinds of noise control designs: add-on designs, where the outer wall structure cannot be changed, and advanced designs where the outer wall stiffness level and the materials usage can be altered. For the add-on designs, the mass penalties range from 1.7 to 2.4 percent of the takeoff gross weight (TOGW) of the various aircraft, similar to preliminary estimates. Results for advanced designs show significant reductions of the mass penalties. For the advanced aluminum designs the penalties are 1.5% of TOGW, and for an all composite aircraft the penalties range from 0.74 to 1.4% of TOGW.
Delamination Defect Detection Using Ultrasonic Guided Waves in Advanced Hybrid Structural Elements
NASA Astrophysics Data System (ADS)
Yan, Fei; Qi, Kevin ``Xue''; Rose, Joseph L.; Weiland, Hasso
2010-02-01
Nondestructive testing for multilayered structures is challenging because of increased numbers of layers and plate thicknesses. In this paper, ultrasonic guided waves are applied to detect delamination defects inside a 23-layer Alcoa Advanced Hybrid Structural plate. A semi-analytical finite element (SAFE) method generates dispersion curves and wave structures in order to select appropriate wave structures to detect certain defects. One guided wave mode and frequency is chosen to achieve large in-plane displacements at regions of interest. The interactions of the selected mode with defects are simulated using finite element models. Experiments are conducted and compared with bulk wave measurements. It is shown that guided waves can detect deeply embedded damage inside thick multilayer fiber-metal laminates with suitable mode and frequency selection.
Özel, Rıfat Emrah; Hayat, Akhtar; Andreescu, Silvana
2015-01-01
Neurotransmitters are important biological molecules that are essential to many neurophysiological processes including memory, cognition, and behavioral states. The development of analytical methodologies to accurately detect neurotransmitters is of great importance in neurological and biological research. Specifically designed microelectrodes or microbiosensors have demonstrated potential for rapid, real-time measurements with high spatial resolution. Such devices can facilitate study of the role and mechanism of action of neurotransmitters and can find potential uses in biomedicine. This paper reviews the current status and recent advances in the development and application of electrochemical sensors for the detection of small-molecule neurotransmitters. Measurement challenges and opportunities of electroanalytical methods to advance study and understanding of neurotransmitters in various biological models and disease conditions are discussed. PMID:26973348
Hypersonic airframe structures: Technology needs and flight test requirements
NASA Technical Reports Server (NTRS)
Stone, J. E.; Koch, L. C.
1979-01-01
Hypersonic vehicles that may be produced by the year 2000 were identified. Candidate thermal/structural concepts that merit consideration for these vehicles were described. The current status of analytical methods, materials, manufacturing techniques, and conceptual developments pertaining to these concepts was reviewed. Guidelines establishing meaningful technology goals were defined and twenty-eight specific technology needs were identified. The extent to which these technology needs can be satisfied, using existing capabilities and facilities without the benefit of a hypersonic research aircraft, was assessed. The role that a research aircraft can fill in advancing this technology was discussed and a flight test program was outlined. Research aircraft thermal/structural design philosophy was also discussed. Programs integrating technology advancements with the projected vehicle needs were presented. Program options were provided to reflect various scheduling and cost possibilities.
Advanced Software V&V for Civil Aviation and Autonomy
NASA Technical Reports Server (NTRS)
Brat, Guillaume P.
2017-01-01
With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally-intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.
Materials and structural aspects of advanced gas-turbine helicopter engines
NASA Technical Reports Server (NTRS)
Freche, J. C.; Acurio, J.
1979-01-01
Advances in materials, coatings, turbine cooling technology, structural and design concepts, and component-life prediction of helicopter gas-turbine-engine components are presented. Stationary parts including the inlet particle separator, the front frame, rotor tip seals, vanes and combustors and rotating components - compressor blades, disks, and turbine blades - are discussed. Advanced composite materials are considered for the front frame and compressor blades, prealloyed powder superalloys will increase strength and reduce costs of disks, the oxide dispersion strengthened alloys will have 100 °C higher use temperature in combustors and vanes than conventional superalloys, ceramics will provide the highest use temperature of 1400 °C for stator vanes and 1370 °C for turbine blades, and directionally solidified eutectics will afford up to 50 °C temperature advantage at turbine blade operating conditions. Coatings for surface protection at higher surface temperatures and design trends in turbine cooling technology are discussed. New analytical methods of life prediction such as strain gage partitioning for high temperature prediction, fatigue life, computerized prediction of oxidation resistance, and advanced techniques for estimating coating life are described.
Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.
2010-06-07
Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.
Asadollahi, Aziz; Khazanovich, Lev
2018-04-11
The emergence of ultrasonic dry point contact (DPC) transducers that emit horizontal shear waves has enabled efficient collection of high-quality data in the context of a nondestructive evaluation of concrete structures. This offers an opportunity to improve the quality of evaluation by adapting advanced imaging techniques. Reverse time migration (RTM) is a simulation-based reconstruction technique that offers advantages over conventional methods, such as the synthetic aperture focusing technique. RTM is capable of imaging boundaries and interfaces with steep slopes and the bottom boundaries of inclusions and defects. However, this imaging technique requires a massive amount of memory and its computation cost is high. In this study, both bottlenecks of the RTM are resolved when shear transducers are used for data acquisition. An analytical approach was developed to obtain the source and receiver wavefields needed for imaging using reverse time migration. It is shown that the proposed analytical approach not only eliminates the high memory demand, but also drastically reduces the computation time from days to minutes. Copyright © 2018 Elsevier B.V. All rights reserved.
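For context, a common RTM imaging condition is the zero-lag cross-correlation of the source and receiver wavefields; the sketch below applies it to random placeholder arrays and does not reproduce the authors' analytical computation of those wavefields.

# Sketch of the zero-lag cross-correlation imaging condition, I(x) = sum_t S(x, t) * R(x, t).
# The arrays are random stand-ins for the forward-simulated source wavefield and the
# reverse-time-extrapolated receiver wavefield.
import numpy as np

nt, nz, nx = 500, 64, 64
rng = np.random.default_rng(1)
source_wavefield = rng.normal(size=(nt, nz, nx))      # S(x, t)
receiver_wavefield = rng.normal(size=(nt, nz, nx))    # R(x, t)

image = np.einsum("tij,tij->ij", source_wavefield, receiver_wavefield)  # sum over time
print("image grid shape:", image.shape)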
Huerta, B; Rodríguez-Mozaz, S; Barceló, D
2012-11-01
The presence of pharmaceuticals in the aquatic environment is an ever-increasing issue of concern as they are specifically designed to target specific metabolic and molecular pathways in organisms, and they may have the potential for unintended effects on nontarget species. Information on the presence of pharmaceuticals in biota is still scarce, but the scientific literature on the subject has established the possibility of bioaccumulation in exposed aquatic organisms through other environmental compartments. However, few studies have correlated both bioaccumulation of pharmaceutical compounds and the consequent effects. Analytical methodology to detect pharmaceuticals at trace quantities in biota has advanced significantly in the last few years. Nonetheless, there are still unresolved analytical challenges associated with the complexity of biological matrices, which require exhaustive extraction and purification steps, and highly sensitive and selective detection techniques. This review presents the trends in the analysis of pharmaceuticals in aquatic organisms in the last decade, recent data about the occurrence of these compounds in natural biota, and the environmental implications that chronic exposure could have on aquatic wildlife.
Large-scale retrieval for medical image analytics: A comprehensive review.
Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting
2018-01-01
Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
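A minimal sketch of the indexing and searching stages of such a retrieval pipeline, using random vectors as stand-ins for learned image descriptors and hypothetical sizes throughout, could use an off-the-shelf nearest-neighbour index.

# Sketch of feature indexing and searching with scikit-learn's NearestNeighbors.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
database_features = rng.normal(size=(10_000, 128))    # e.g. 10k images x 128-D descriptors (hypothetical)
query_features = rng.normal(size=(1, 128))            # descriptor of a query image

index = NearestNeighbors(n_neighbors=5, metric="euclidean").fit(database_features)   # indexing
distances, indices = index.kneighbors(query_features)                                # searching
print("top-5 retrieved image ids:", indices[0])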
Uchiyama, Shigehisa; Inaba, Yohei; Kunugita, Naoki
2011-05-15
Derivatization of carbonyl compounds with 2,4-dinitrophenylhydrazine (DNPH) is one of the most widely used analytical methods. In this article, we highlight recent advances using DNPH provided by our studies over the past seven years. DNPH reacts with carbonyls to form the corresponding stable 2,4-DNPhydrazone derivatives (DNPhydrazones). This method may result in analytical error because DNPhydrazones have both E- and Z-stereoisomers arising from the C=N double bond. Purified aldehyde-2,4-DNPhydrazone demonstrated only the E-isomer, but under UV irradiation and the addition of acid, both E- and Z-isomers were seen. In order to resolve this isomeric problem, a method for transforming the C=N double bond of carbonyl-2,4-DNPhydrazone into a C-N single bond, by reductive amination using 2-picoline borane, has been developed. The C1-C10 aldehyde DNPhydrazones are completely converted into the reduced forms by this amination reaction and can be analyzed with high-performance liquid chromatography. As a new application using DNPH derivatization, the simultaneous measurement of carbonyls with carboxylic acids or ozone is described in this review. Copyright © 2010 Elsevier B.V. All rights reserved.
Cardiovascular Redox and Ox Stress Proteomics
Kumar, Vikas; Calamaras, Timothy Dean; Haeussler, Dagmar; Colucci, Wilson Steven; Cohen, Richard Alan; McComb, Mark Errol; Pimentel, David
2012-01-01
Significance: Oxidative post-translational modifications (OPTMs) have been demonstrated as contributing to cardiovascular physiology and pathophysiology. These modifications have been identified using antibodies as well as advanced proteomic methods, and the functional importance of each is beginning to be understood using transgenic and gene deletion animal models. Given that OPTMs are involved in cardiovascular pathology, the use of these modifications as biomarkers and predictors of disease has significant therapeutic potential. Adequate understanding of the chemistry of the OPTMs is necessary to determine what may occur in vivo and which modifications would best serve as biomarkers. Recent Advances: By using mass spectrometry, advanced labeling techniques, and antibody identification, OPTMs have become accessible to a larger proportion of the scientific community. Advancements in instrumentation, database search algorithms, and processing speed have allowed MS to fully expand on the proteome of OPTMs. In addition, the role of enzymatically reversible OPTMs has been further clarified in preclinical models. Critical Issues: The identification of OPTMs suffers from limitations in analytic detection based on the methodology, instrumentation, sample complexity, and bioinformatics. Currently, each type of OPTM requires a specific strategy for identification, and generalized approaches result in an incomplete assessment. Future Directions: Novel types of highly sensitive MS instrumentation that allow for improved separation and detection of modified proteins and peptides have been crucial in the discovery of OPTMs and biomarkers. To further advance the identification of relevant OPTMs in advanced search algorithms, standardized methods for sample processing and depository of MS data will be required. Antioxid. Redox Signal. 17, 1528–1559. PMID:22607061
Biomolecular logic systems: applications to biosensors and bioactuators
NASA Astrophysics Data System (ADS)
Katz, Evgeny
2014-05-01
The paper presents an overview of recent advances in biosensors and bioactuators based on the biocomputing concept. Novel biosensors digitally process multiple biochemical signals through Boolean logic networks of coupled biomolecular reactions and produce output in the form of a YES/NO response. Compared to traditional single-analyte sensing devices, the biocomputing approach enables high-fidelity multi-analyte biosensing, particularly beneficial for biomedical applications. Multi-signal digital biosensors thus promise advances in rapid diagnosis and treatment of diseases by processing complex patterns of physiological biomarkers. Specifically, they can provide timely detection of and alerts to medical emergencies, along with an immediate therapeutic intervention. Application of the biocomputing concept has been successfully demonstrated for systems performing logic analysis of biomarkers corresponding to different injuries, particularly exemplified for liver injury. Wide-ranging applications of multi-analyte digital biosensors in medicine, environmental monitoring and homeland security are anticipated. "Smart" bioactuators, for example for signal-triggered drug release, were designed by interfacing switchable electrodes and biocomputing systems. Integration of novel biosensing and bioactuating systems with biomolecular information processing systems holds promise for further scientific advances and numerous practical applications.
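A toy sketch of the Boolean biocomputing idea, with hypothetical biomarker names and thresholds, digitizes two signals and combines them with an AND gate to yield a single YES/NO output.

# Toy sketch: two biomarker concentrations are thresholded to logic values and
# combined with an AND gate. Marker names and thresholds are hypothetical.
def digitize(concentration, threshold):
    return concentration >= threshold              # logic 1 if the marker is elevated

def injury_alert(alt_u_per_l, ldh_u_per_l):
    # AND gate: flag only when both (hypothetical) liver-injury markers are elevated
    return digitize(alt_u_per_l, 200.0) and digitize(ldh_u_per_l, 400.0)

print(injury_alert(250.0, 450.0))   # True  -> "YES" output
print(injury_alert(250.0, 100.0))   # False -> "NO" output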
Umari, A.M.; Gorelick, S.M.
1986-01-01
It is possible to obtain analytic solutions to the groundwater flow and solute transport equations if space variables are discretized but time is left continuous. From these solutions, hydraulic head and concentration fields for any future time can be obtained without 'marching' through intermediate time steps. This analytical approach involves matrix exponentiation and is referred to as the Matrix Exponential Time Advancement (META) method. Two algorithms are presented for the META method, one for symmetric and the other for non-symmetric exponent matrices. A numerical accuracy indicator, referred to as the matrix condition number, was defined and used to determine the maximum number of significant figures that may be lost in the META method computations. The relative computational and storage requirements of the META method with respect to the time-marching method increase with the number of nodes in the discretized problem. The potentially greater accuracy of the META method, and the associated greater reliability through use of the matrix condition number, have to be weighed against the increased relative computational and storage requirements of this approach as the number of nodes becomes large. For a particular number of nodes, the META method may be computationally more efficient than the time-marching method, depending on the size of time steps used in the latter. A numerical example illustrates application of the META method to a sample ground-water-flow problem. (Author's abstract)
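The core idea behind the META method can be sketched for a generic discretized linear system dh/dt = A h, whose solution at any future time is h(t) = exp(At) h(0); the diffusion-style matrix below is an illustrative stand-in, not the paper's groundwater model.

# Sketch: advance a spatially discretized linear system directly to time t via the
# matrix exponential, with no intermediate time steps.
import numpy as np
from scipy.linalg import expm

n, alpha = 20, 0.1                            # nodes and a diffusivity-like coefficient (assumed)
A = alpha * (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1))

h0 = np.zeros(n); h0[n // 2] = 1.0            # initial head perturbation at the middle node
t = 50.0                                      # jump straight to t = 50 time units
h_t = expm(A * t) @ h0
print("peak head at t =", t, ":", h_t.max())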
Rogers, Richard S; Abernathy, Michael; Richardson, Douglas D; Rouse, Jason C; Sperry, Justin B; Swann, Patrick; Wypych, Jette; Yu, Christopher; Zang, Li; Deshpande, Rohini
2017-11-30
Today, we are experiencing unprecedented growth and innovation within the pharmaceutical industry. Established protein therapeutic modalities, such as recombinant human proteins, monoclonal antibodies (mAbs), and fusion proteins, are being used to treat previously unmet medical needs. Novel therapies such as bispecific T cell engagers (BiTEs), chimeric antigen T cell receptors (CARTs), siRNA, and gene therapies are paving the path towards increasingly personalized medicine. This advancement of new indications and therapeutic modalities is paralleled by development of new analytical technologies and methods that provide enhanced information content in a more efficient manner. Recently, a liquid chromatography-mass spectrometry (LC-MS) multi-attribute method (MAM) has been developed and designed for improved simultaneous detection, identification, quantitation, and quality control (monitoring) of molecular attributes (Rogers et al. MAbs 7(5):881-90, 2015). Based on peptide mapping principles, this powerful tool represents a true advancement in testing methodology that can be utilized not only during product characterization, formulation development, stability testing, and development of the manufacturing process, but also as a platform quality control method in dispositioning clinical materials for both innovative biotherapeutics and biosimilars.
Recent Methodology in Ginseng Analysis
Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill
2012-01-01
Matching its popularity in herbal prescriptions and remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade, including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean; Burtner, Edwin R.; Cook, Kristin A.
This course will introduce the field of Visual Analytics to HCI researchers and practitioners, highlighting the contributions they can make to this field. Topics will include a definition of visual analytics along with examples of current systems, types of tasks and end users, issues in defining user requirements, design of visualizations and interactions, guidelines and heuristics, the current state of user-centered evaluations, and metrics for evaluation. We encourage designers, HCI researchers, and HCI practitioners to attend to learn how their skills can contribute to advancing the state of the art of visual analytics.
NASA Technical Reports Server (NTRS)
Daly, J. K.; Torian, J. G.
1979-01-01
An overview of studies conducted to establish the requirements for advanced subsystem analytical tools is presented. Modifications are defined for updating current computer programs used to analyze environmental control, life support, and electric power supply systems so that consumables for future advanced spacecraft may be managed.
DNApod: DNA polymorphism annotation database from next-generation sequence read archives.
Mochizuki, Takako; Tanizawa, Yasuhiro; Fujisawa, Takatomo; Ohta, Tazro; Nikoh, Naruo; Shimizu, Tokurou; Toyoda, Atsushi; Fujiyama, Asao; Kurata, Nori; Nagasaki, Hideki; Kaminuma, Eli; Nakamura, Yasukazu
2017-01-01
With the rapid advances in next-generation sequencing (NGS), datasets for DNA polymorphisms among various species and strains have been produced, stored, and distributed. However, reliability varies among these datasets because the experimental and analytical conditions used differ among assays. Furthermore, such datasets have been frequently distributed from the websites of individual sequencing projects. It is desirable to integrate DNA polymorphism data into one database featuring uniform quality control that is distributed from a single platform at a single place. DNA polymorphism annotation database (DNApod; http://tga.nig.ac.jp/dnapod/) is an integrated database that stores genome-wide DNA polymorphism datasets acquired under uniform analytical conditions, and this includes uniformity in the quality of the raw data, the reference genome version, and evaluation algorithms. DNApod genotypic data are re-analyzed whole-genome shotgun datasets extracted from sequence read archives, and DNApod distributes genome-wide DNA polymorphism datasets and known-gene annotations for each DNA polymorphism. This new database was developed for storing genome-wide DNA polymorphism datasets of plants, with crops being the first priority. Here, we describe our analyzed data for 679, 404, and 66 strains of rice, maize, and sorghum, respectively. The analytical methods are available as a DNApod workflow in an NGS annotation system of the DNA Data Bank of Japan and a virtual machine image. Furthermore, DNApod provides tables of links of identifiers between DNApod genotypic data and public phenotypic data. To advance the sharing of organism knowledge, DNApod offers basic and ubiquitous functions for multiple alignment and phylogenetic tree construction by using orthologous gene information.
Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators
NASA Technical Reports Server (NTRS)
Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)
2002-01-01
Ground-based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verifications of specific structural, thermal, and optical parameters. Utilizing integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.
Shiokawa, Yuka; Date, Yasuhiro; Kikuchi, Jun
2018-02-21
Computer-based technological innovation provides advancements in sophisticated and diverse analytical instruments, enabling massive amounts of data collection with relative ease. This is accompanied by a fast-growing demand for technological progress in data mining methods for analysis of big data derived from chemical and biological systems. From this perspective, use of a general "linear" multivariate analysis alone limits interpretations due to "non-linear" variations in metabolic data from living organisms. Here we describe a kernel principal component analysis (KPCA)-incorporated analytical approach for extracting useful information from metabolic profiling data. To overcome the limitation of important variable (metabolite) determinations, we incorporated a random forest conditional variable importance measure into our KPCA-based analytical approach to demonstrate the relative importance of metabolites. Using a market basket analysis, hippurate, the most important variable detected in the importance measure, was associated with high levels of some vitamins and minerals present in foods eaten the previous day, suggesting a relationship between increased hippurate and intake of a wide variety of vegetables and fruits. Therefore, the KPCA-incorporated analytical approach described herein enabled us to capture input-output responses, and should be useful not only for metabolic profiling but also for profiling in other areas of biological and environmental systems.
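The following is a minimal sketch, under stated assumptions, of the kind of pipeline described above: a kernel PCA projection of metabolite profiles followed by a random-forest importance ranking. The original work uses a conditional variable importance measure (as in the R party package); scikit-learn's permutation importance is used here only as a readily available stand-in, and the data are random placeholders.

```python
# Minimal sketch of a KPCA + random-forest importance pipeline (not the authors' code).
# Assumption: permutation importance substitutes for the conditional variable importance
# measure mentioned in the abstract; the data matrix is a dummy placeholder.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

X = np.random.rand(60, 40)          # 60 samples x 40 metabolite intensities (dummy data)

# Non-linear projection of the metabolic profiles.
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.1)
scores = kpca.fit_transform(X)

# Rank metabolites by how well they explain the first KPCA component.
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, scores[:, 0])
imp = permutation_importance(rf, X, scores[:, 0], n_repeats=20, random_state=0)
ranking = np.argsort(imp.importances_mean)[::-1]
print("Most influential metabolite indices:", ranking[:5])
```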
NASA Technical Reports Server (NTRS)
Sakata, I. F.; Davis, G. W.
1975-01-01
The materials and advanced producibility methods that offer potential structural mass savings in the design of the primary structure for a supersonic cruise aircraft are identified and reported. A summary of the materials and fabrication techniques selected for this analytical effort is presented. Both metallic and composite material systems were selected for application to a near-term start-of-design technology aircraft. Selective reinforcement of the basic metallic structure was considered as the appropriate level of composite application for the near-term design.
Frequency Response of Pressure Sensitive Paints
NASA Technical Reports Server (NTRS)
Winslow, Neal A.; Carroll, Bruce F.; Setzer, Fred M.
1996-01-01
An experimental method for measuring the frequency response of Pressure Sensitive Paints (PSP) is presented. These results lead to the development of a dynamic correction technique for PSP measurements, which is of great importance to the advancement of PSP as a measurement technique. Such a dynamic corrector is most easily designed from the frequency response of the given system. An example of this correction technique is shown. In addition to the experimental data, an analytical model for the frequency response is developed from the one-dimensional mass diffusion equation.
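As a hedged numerical illustration (not the paper's analytical model), the sketch below estimates the gain and phase lag of a thin paint layer governed by the one-dimensional mass diffusion equation when its surface concentration is forced sinusoidally; the diffusivity, thickness, and frequency values are placeholders.

```python
# Hedged numerical sketch: drive the 1-D diffusion equation dc/dt = D d2c/dx2 with a
# sinusoidal surface concentration and a sealed wall, then compare the depth-averaged
# response (a proxy for the luminescence signal) to the forcing. All parameter values
# are placeholders, not data from the paper.
import numpy as np

D, h = 1e-9, 10e-6                 # oxygen diffusivity (m^2/s), paint thickness (m)
f = 50.0                           # forcing frequency (Hz)
nx, ncycles, steps_per_cycle = 40, 20, 800

x = np.linspace(0.0, h, nx)
dx = x[1] - x[0]
dt = 1.0 / (f * steps_per_cycle)
assert D * dt / dx**2 < 0.5        # explicit-scheme stability check

c = np.zeros(nx)
t_hist, resp = [], []
t = 0.0
for _ in range(ncycles * steps_per_cycle):
    c_new = c.copy()
    c_new[1:-1] = c[1:-1] + D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c_new[0] = np.sin(2 * np.pi * f * (t + dt))   # oscillating surface boundary
    c_new[-1] = c_new[-2]                         # impermeable wall (zero flux)
    c, t = c_new, t + dt
    t_hist.append(t)
    resp.append(c.mean())

# Fit the steady-state portion to a*sin(wt) + b*cos(wt) to get gain and phase lag.
t_hist, resp = np.array(t_hist), np.array(resp)
keep = t_hist > (ncycles - 5) / f
w = 2 * np.pi * f
M = np.column_stack([np.sin(w * t_hist[keep]), np.cos(w * t_hist[keep])])
(a, b), *_ = np.linalg.lstsq(M, resp[keep], rcond=None)
print(f"gain ~ {np.hypot(a, b):.3f}, phase lag ~ {np.degrees(np.arctan2(-b, a)):.1f} deg")
```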
Next-Generation Technologies for Multiomics Approaches Including Interactome Sequencing
Ohashi, Hiroyuki; Miyamoto-Sato, Etsuko
2015-01-01
The development of high-speed analytical techniques such as next-generation sequencing and microarrays allows high-throughput analysis of biological information at a low cost. These techniques contribute to medical and bioscience advancements and provide new avenues for scientific research. Here, we outline a variety of new innovative techniques and discuss their use in omics research (e.g., genomics, transcriptomics, metabolomics, proteomics, and interactomics). We also discuss the possible applications of these methods, including an interactome sequencing technology that we developed, in future medical and life science research. PMID:25649523
Olive Tree (Olea europaea L.) Leaves: Importance and Advances in the Analysis of Phenolic Compounds
Abaza, Leila; Taamalli, Amani; Nsir, Houda; Zarrouk, Mokhtar
2015-01-01
Phenolic compounds are becoming increasingly popular because of their potential role in contributing to human health. Experimental evidence obtained from human and animal studies demonstrates that phenolic compounds from Olea europaea leaves have biological activities which may be important in reducing the risk and severity of certain chronic diseases. Therefore, accurate profiling of phenolics is a crucial issue. In this article, we review current treatment and analytical methods used to extract, identify, and/or quantify phenolic compounds in olive leaves. PMID:26783953
Design and analytical study of a rotor airfoil
NASA Technical Reports Server (NTRS)
Dadone, L. U.
1978-01-01
An airfoil section for use on helicopter rotor blades was defined and analyzed by means of potential flow/boundary layer interaction and viscous transonic flow methods to meet, as closely as possible, a set of advanced airfoil design objectives. The design efforts showed that the first-priority objectives, including the selected low-speed pitching moment, maximum lift, and drag divergence requirements, can be met, though marginally. The maximum lift requirement at M = 0.5 and most of the profile drag objectives cannot be met without some compromise of at least one of the higher-priority objectives.
Advanced propeller aerodynamic analysis
NASA Technical Reports Server (NTRS)
Bober, L. J.
1980-01-01
The analytical approaches as well as the capabilities of three advanced analyses for predicting propeller aerodynamic performance are presented. It is shown that two of these analyses use a lifting line representation for the propeller blades, and the third uses a lifting surface representation.
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.
TERRA: Building New Communities for Advanced Biofuels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cornelius, Joe; Mockler, Todd; Tuinstra, Mitch
ARPA-E’s Transportation Energy Resources from Renewable Agriculture (TERRA) program is bringing together top experts from different disciplines – agriculture, robotics and data analytics – to rethink the production of advanced biofuel crops. ARPA-E Program Director Dr. Joe Cornelius discusses the TERRA program and explains how ARPA-E’s model enables multidisciplinary collaboration among diverse communities. The video focuses on two TERRA projects—Donald Danforth Center and Purdue University—that are developing and integrating cutting-edge remote sensing platforms, complex data analytics tools and plant breeding technologies to tackle the challenge of sustainably increasing biofuel stocks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The Chemical Technology (CMT) Division is a diverse technical organization with principal emphases in environmental management and development of advanced energy sources. The Division conducts research and development in three general areas: (1) development of advanced power sources for stationary and transportation applications and for consumer electronics, (2) management of high-level and low-level nuclear wastes and hazardous wastes, and (3) electrometallurgical treatment of spent nuclear fuel. The Division also performs basic research in catalytic chemistry involving molecular energy resources, mechanisms of ion transport in lithium battery electrolytes, and the chemistry of technology-relevant materials and electrified interfaces. In addition, the Division operates the Analytical Chemistry Laboratory, which conducts research in analytical chemistry and provides analytical services for programs at Argonne National Laboratory (ANL) and other organizations. Technical highlights of the Division's activities during 1997 are presented.
Electrochemical detection for microscale analytical systems: a review.
Wang, Joseph
2002-02-11
As the field of chip-based microscale systems continues its rapid growth, there are urgent needs for developing compatible detection modes. Electrochemical detection offers considerable promise for such microfluidic systems, with features that include remarkable sensitivity, inherent miniaturization and portability, independence of optical path length or sample turbidity, low cost, low-power requirements and high compatibility with advanced micromachining and microfabrication technologies. This paper highlights recent advances, directions and key strategies in controlled-potential electrochemical detectors for miniaturized analytical systems. Subjects covered include the design and integration of the electrochemical detection system, its requirements and operational principles, common electrode materials, derivatization reactions, electrical-field decouplers, typical applications and future prospects. It is expected that electrochemical detection will become a powerful tool for microscale analytical systems and will facilitate the creation of truly portable (and possibly disposable) devices.
UTOPIAN: user-driven topic modeling based on interactive nonnegative matrix factorization.
Choo, Jaegul; Lee, Changhyun; Reddy, Chandan K; Park, Haesun
2013-12-01
Topic modeling has been widely used for analyzing text document collections. Recently, there have been significant advancements in various topic modeling techniques, particularly in the form of probabilistic graphical modeling. State-of-the-art techniques such as Latent Dirichlet Allocation (LDA) have been successfully applied in visual text analytics. However, most of the widely used methods based on probabilistic modeling have drawbacks in terms of consistency across multiple runs and empirical convergence. Furthermore, due to the complexity of the formulation and the algorithm, LDA cannot easily incorporate various types of user feedback. To tackle this problem, we propose a reliable and flexible visual analytics system for topic modeling called UTOPIAN (User-driven Topic modeling based on Interactive Nonnegative Matrix Factorization). Centered around its semi-supervised formulation, UTOPIAN enables users to interact with the topic modeling method and steer the result in a user-driven manner. We demonstrate the capability of UTOPIAN via several usage scenarios with real-world document corpora, such as the InfoVis/VAST paper data set and product review data sets.
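A minimal sketch of plain NMF-based topic modeling in the spirit of UTOPIAN is shown below using scikit-learn; the semi-supervised, interactive steering that UTOPIAN adds is not reproduced, and the documents are toy placeholders.

```python
# Minimal NMF topic-model sketch (not UTOPIAN itself): TF-IDF features factorized into
# document-topic and topic-term matrices. Documents are toy placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

docs = [
    "interactive visual analytics for document exploration",
    "probabilistic topic models and latent dirichlet allocation",
    "nonnegative matrix factorization for clustering text",
    "user feedback steers topic refinement in visual tools",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)                      # documents x terms

nmf = NMF(n_components=2, init="nndsvd", random_state=0)
W = nmf.fit_transform(X)                           # document-topic weights
H = nmf.components_                                # topic-term weights

terms = tfidf.get_feature_names_out()
for k, topic in enumerate(H):
    top = topic.argsort()[::-1][:4]
    print(f"topic {k}:", ", ".join(terms[i] for i in top))
```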
Major and Modified Nucleosides, RNA, and DNA
NASA Astrophysics Data System (ADS)
Gehrke, Charles W.; Kuo, Kenneth C.
Most analytical chemists are well aware of the rapid rate of development of high-performance liquid chromatography (HPLC) over the past 5 years. A number of articles have been published in Analytical Chemistry on different topics in HPLC and many papers appear in the chromatographic journals. Some books also have been published covering this subject. HPLC has proved to be a very effective, broadly applicable chromatographic method for the separation and analysis of complex molecules in fields as diverse as biochemistry and environmental, pharmaceutical, medical, and polymer chemistry. HPLC is now having a major impact on the clinical and research aspects of medical biochemistry. Although the contributions of HPLC to other disciplines generally complement gas-liquid chromatography, this method is destined to play a much greater role in medical and biochemical research. This is because many of the biomolecules, owing to their molecular complexity and size, are thermally unstable or nonvolatile, preventing or complicating an analysis by GC. A major factor contributing to the powerful advances in biomedical liquid chromatography is the development of reversed-phase high-performance liquid chromatography (RP-HPLC) using n-alkyl and phenyl chemically bonded substrates.
Analytical Modeling of Acoustic Phonon-Limited Mobility in Strained Graphene Nanoribbons
NASA Astrophysics Data System (ADS)
Yousefvand, Ali; Ahmadi, Mohammad T.; Meshginqalam, Bahar
2017-11-01
Recent advances in graphene nanoribbon-based electronic devices encourage researchers to develop modeling and simulation methods to explore device physics. On the other hand, increasing the operating speed of nanoelectronic devices has recently attracted significant attention, and the modification of acoustic phonon interactions, because of their important effect on carrier mobility, can be considered as a method for carrier mobility optimization, which subsequently enhances device speed. Moreover, strain has an important influence on the electronic properties of nanoelectronic devices. In this paper, the acoustic phonon mobility of armchair graphene nanoribbons (n-AGNRs) under uniaxial strain is modeled analytically. In addition, strain, width, and temperature effects on the acoustic phonon mobility of strained n-AGNRs are investigated. An increase in the strained AGNR acoustic phonon mobility with increasing ribbon width is reported. Additionally, two different behaviors of the acoustic phonon mobility with increasing applied strain are observed among the 3m, 3m+2, and 3m+1 AGNR families. Finally, the temperature effect on the modeled AGNR phonon mobility is explored, and mobility reduction with increasing temperature is reported.
Mattle, Eveline; Weiger, Markus; Schmidig, Daniel; Boesiger, Peter; Fey, Michael
2009-06-01
Hair care for humans is a major world industry with specialised tools, chemicals and techniques. Studying the effect of hair care products has become a considerable field of research, and besides mechanical and optical testing numerous advanced analytical techniques have been employed in this area. In the present work, another means of studying the properties of hair is added by demonstrating the feasibility of magnetic resonance imaging (MRI) of the human hair. Established dedicated nuclear magnetic resonance microscopy hardware (solenoidal radiofrequency microcoils and planar field gradients) and methods (constant time imaging) were adapted to the specific needs of hair MRI. Images were produced at a spatial resolution high enough to resolve the inner structure of the hair, showing contrast between cortex and medulla. Quantitative evaluation of a scan series with different echo times provided a T2* value of 2.6 ms for the cortex and a water content of about 90% for hairs saturated with water. The demonstration of the feasibility of hair MRI potentially adds a new tool to the large variety of analytical methods used nowadays in the development of hair care products.
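The quantitative step mentioned above amounts to fitting a mono-exponential decay S = S0 exp(-TE/T2*) to signals acquired at several echo times. The sketch below shows that fit with SciPy; the echo times and signal values are invented placeholders chosen only to roughly reproduce the reported 2.6 ms.

```python
# Hedged sketch of a T2* estimate: fit S = S0 * exp(-TE / T2*) to signal intensities at
# several echo times. The TE/signal values are invented placeholders, not study data.
import numpy as np
from scipy.optimize import curve_fit

TE = np.array([0.5, 1.0, 2.0, 3.0, 5.0, 8.0])          # echo times (ms), hypothetical
S = np.array([0.83, 0.68, 0.47, 0.32, 0.15, 0.05])     # normalized signal, hypothetical

def decay(te, s0, t2star):
    return s0 * np.exp(-te / t2star)

(s0, t2star), _ = curve_fit(decay, TE, S, p0=(1.0, 2.0))
print(f"fitted T2* = {t2star:.2f} ms")
```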
Martignac, Marion; Balayssac, Stéphane; Gilard, Véronique; Benoit-Marquié, Florence
2015-06-18
We have investigated the removal of bortezomib, an anticancer drug prescribed in multiple myeloma, using the photochemical advanced oxidation process of V-UV/UV (185/254 nm). We used two complementary analytical techniques to follow the removal rate of bortezomib. Nuclear magnetic resonance (NMR) is a nonselective method that requires no prior knowledge of the structures of the byproducts and provides a spectral signature (fingerprinting approach). This untargeted method provides clues to the molecular structure changes and information on the degradation of the parent drug during the irradiation process. This holistic NMR approach could provide information for monitoring aromaticity evolution. We used liquid chromatography, coupled with high-resolution mass spectrometry (LC-MS), to correlate results obtained by 1H NMR and for accurate identification of the byproducts, in order to understand the mechanistic degradation pathways of bortezomib. The results show that primary byproducts come from photoassisted deboronation of bortezomib at 254 nm. A secondary byproduct of pyrazinecarboxamide was also identified. We obtained a reliable correlation between these two analytical techniques.
Metabolic Engineering for the Production of Natural Products
Pickens, Lauren B.; Tang, Yi; Chooi, Yit-Heng
2014-01-01
Natural products and natural product-derived compounds play an important role in modern healthcare as frontline treatments for many diseases and as inspiration for chemically synthesized therapeutics. With advances in sequencing and recombinant DNA technology, many of the biosynthetic pathways responsible for the production of these chemically complex and pharmaceutically valuable compounds have been elucidated. With an ever-expanding toolkit of biosynthetic components, metabolic engineering is an increasingly powerful method to improve natural product titers and generate novel compounds. Heterologous production platforms have enabled access to pathways from difficult-to-culture strains; systems biology and metabolic modeling tools have resulted in increasing predictive and analytic capabilities; advances in expression systems and regulation have enabled the fine-tuning of pathways for increased efficiency; and characterization of individual pathway components has facilitated the construction of hybrid pathways for the production of new compounds. These advances in the many aspects of metabolic engineering have not only yielded fascinating scientific discoveries but also make it an increasingly viable approach for the optimization of natural product biosynthesis. PMID:22432617
Technological advances in real-time tracking of cell death
Skommer, Joanna; Darzynkiewicz, Zbigniew; Wlodkowic, Donald
2010-01-01
A cell population can be viewed as a quantum system, which, like Schrödinger's cat, exists as a combination of survival- and death-allowing states. Tracking and understanding cell-to-cell variability in processes of high spatio-temporal complexity such as cell death is at the core of current systems biology approaches. As probabilistic modeling tools attempt to impute information inaccessible by current experimental approaches, advances in technologies for single-cell imaging and omics (proteomics, genomics, metabolomics) should go hand in hand with the computational efforts. Over the last few years we have made exciting technological advances that allow cell death to be studied dynamically, in real time, and with unprecedented accuracy. These approaches are based on innovative fluorescent assays and recombinant proteins, bioelectrical properties of cells, and more recently also on state-of-the-art optical spectroscopy. Here, we review the current status of the most innovative analytical technologies for dynamic tracking of cell death, and address the interdisciplinary promises and future challenges of these methods. PMID:20519963
Let's Not Forget: Learning Analytics Are about Learning
ERIC Educational Resources Information Center
Gaševic, Dragan; Dawson, Shane; Siemens, George
2015-01-01
The analysis of data collected from the interaction of users with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new research field, learning analytics, and its closely related discipline, educational…
ERIC Educational Resources Information Center
Dawson, Shane; Siemens, George
2014-01-01
The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional "literacy" skills towards an enhanced set of…
Li, Jun; Jiang, Bin; Song, Hongwei; ...
2015-04-17
Here, we survey the recent advances in theoretical understanding of quantum state resolved dynamics, using the title reactions as examples. It is shown that the progress was made possible by major developments in two areas. First, an accurate analytical representation of many high-level ab initio points over a large configuration space can now be made with high fidelity and the necessary permutation symmetry. The resulting full-dimensional global potential energy surfaces enable dynamical calculations using either quasi-classical trajectory or more importantly quantum mechanical methods. The second advance is the development of accurate and efficient quantum dynamical methods, which are necessary for providing a reliable treatment of quantum effects in reaction dynamics such as tunneling, resonances, and zero-point energy. The powerful combination of the two advances has allowed us to achieve a quantitatively accurate characterization of the reaction dynamics, which unveiled rich dynamical features such as steric steering, strong mode specificity, and bond selectivity. The dependence of reactivity on reactant modes can be rationalized by the recently proposed sudden vector projection model, which attributes the mode specificity and bond selectivity to the coupling of reactant modes with the reaction coordinate at the relevant transition state. The deeper insights provided by these theoretical studies have advanced our understanding of reaction dynamics to a new level.
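As a hedged illustration of the sudden vector projection idea mentioned above, the sketch below scores each reactant normal mode by its normalized projection onto the reaction-coordinate vector at the transition state; the vectors are made-up placeholders, not results from the cited work.

```python
# Hedged sketch of the sudden vector projection (SVP) idea: mode specificity is gauged
# by how strongly each reactant normal-mode vector projects onto the reaction-coordinate
# vector at the transition state. Vectors below are made-up placeholders.
import numpy as np

def svp_overlap(mode_vec, rc_vec):
    """Normalized projection of a reactant mode onto the reaction coordinate."""
    mode = mode_vec / np.linalg.norm(mode_vec)
    rc = rc_vec / np.linalg.norm(rc_vec)
    return abs(np.dot(mode, rc))

rc = np.array([0.7, -0.1, 0.05, 0.7, 0.0, 0.0])          # hypothetical TS reaction coordinate
modes = {
    "symmetric stretch": np.array([0.6, 0.0, 0.0, 0.6, 0.0, 0.0]),
    "bend":              np.array([0.0, 0.7, -0.7, 0.0, 0.0, 0.0]),
}
for name, vec in modes.items():
    print(f"{name}: SVP overlap = {svp_overlap(vec, rc):.2f}")
```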
[Recent advances in sample preparation methods of plant hormones].
Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng
2014-04-01
Plant hormones are a group of naturally occurring trace substances that play a crucial role in controlling plant development, growth, and environmental response. With the development of chromatography and mass spectrometry techniques, chromatographic analysis has become a widely used approach for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one. Thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones, namely acidic and basic plant hormones, brassinosteroids, and plant polypeptides, the sample preparation methods are reviewed in sequence, with emphasis on recently developed methods. The review covers novel methods, devices, extraction materials, and derivatization reagents for sample preparation in phytohormone analysis, including some related work from our group. Finally, prospects for future developments in this field are also discussed.
Advances in spectroscopic methods for quantifying soil carbon
Liebig, Mark; Franzluebbers, Alan J.; Follett, Ronald F.; Hively, W. Dean; Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco
2012-01-01
The gold standard for soil C determination is combustion. However, this method requires expensive consumables, is limited to determination of total carbon, and is limited in the number of samples that can be processed (~100/d). With increased interest in soil C sequestration, faster methods are needed; hence the interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared, or mid-infrared ranges using either proximal or remote sensing. These methods can analyze more samples (two to three times as many per day) or huge areas (imagery) and can determine multiple analytes simultaneously, but they require calibrations relating spectral and reference data and have specific limitations; for example, remote sensing is capable of scanning entire watersheds, thus reducing the sampling needed, but is limited to the surface layer of tilled soils and by the difficulty of obtaining proper calibration reference values. The objective of this discussion is to present the current state of spectroscopic methods for soil C determination.
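A hedged sketch of the calibration step such spectroscopic methods rely on is shown below: relating diffuse reflectance spectra to combustion-measured reference carbon values. Partial least squares is a common choice for this, but the abstract does not name a specific algorithm, and the spectra here are synthetic placeholders.

```python
# Hedged sketch of a spectral calibration for soil carbon using partial least squares.
# PLS is a common choice, not necessarily the authors' method; spectra and reference
# values below are synthetic placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_bands = 120, 300
spectra = rng.normal(size=(n_samples, n_bands)).cumsum(axis=1)            # smooth-ish fake spectra
soil_c = 0.02 * spectra[:, 150] + rng.normal(scale=0.1, size=n_samples)   # fake reference C (%)

pls = PLSRegression(n_components=8)
r2 = cross_val_score(pls, spectra, soil_c, cv=5, scoring="r2")
print("cross-validated R^2 per fold:", r2.round(2))
```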
Inorganic chemical analysis of environmental materials—A lecture series
Crock, J.G.; Lamothe, P.J.
2011-01-01
At the request of the faculty of the Colorado School of Mines, Golden, Colorado, the authors prepared and presented a lecture series to the students of a graduate-level advanced instrumental analysis class. The slides and text presented in this report are a compilation and condensation of this series of lectures. The purpose of this report is to present the slides and notes and to emphasize the thought processes that should be used by a scientist submitting samples for analyses in order to procure analytical data to answer a research question. First and foremost, the analytical data generated can be no better than the samples submitted. The questions to be answered must first be well defined and the appropriate samples collected from the population that will answer the question. The proper methods of analysis, including proper sample preparation and digestion techniques, must then be applied. Care must be taken to achieve the required limits of detection of the critical analytes to yield detectable analyte concentration (above "action" levels) for the majority of the study's samples and to address what portion of those analytes answers the research question: total or partial concentrations. To guarantee a robust analytical result that answers the research question(s), a well-defined quality assurance and quality control (QA/QC) plan must be employed. This QA/QC plan must include the collection and analysis of field and laboratory blanks, sample duplicates, and matrix-matched standard reference materials (SRMs). The proper SRMs may include in-house materials and/or a selection of widely available commercial materials. A discussion of the preparation and applicability of in-house reference materials is also presented. Only when all these analytical issues are sufficiently addressed can the research questions be answered with known certainty.
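As a hedged illustration of two routine QA/QC checks discussed above, the sketch below computes percent recovery against a standard reference material and the relative percent difference between duplicates; the acceptance limits and measurement values are placeholders, since real programs set them per method and analyte.

```python
# Hedged QA/QC sketch: SRM percent recovery and duplicate relative percent difference
# (RPD). Acceptance limits and values are placeholders; real programs set them per
# method and analyte.
def percent_recovery(measured, certified):
    return 100.0 * measured / certified

def relative_percent_difference(a, b):
    return 100.0 * abs(a - b) / ((a + b) / 2.0)

srm_measured, srm_certified = 48.2, 50.0          # e.g., mg/kg of some analyte (hypothetical)
dup_a, dup_b = 12.1, 11.4                         # duplicate results (hypothetical)

rec = percent_recovery(srm_measured, srm_certified)
rpd = relative_percent_difference(dup_a, dup_b)
print(f"SRM recovery {rec:.1f}% ({'pass' if 90 <= rec <= 110 else 'flag'})")
print(f"duplicate RPD {rpd:.1f}% ({'pass' if rpd <= 20 else 'flag'})")
```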
NASA Astrophysics Data System (ADS)
Sheu, R.; Marcotte, A.; Khare, P.; Ditto, J.; Charan, S.; Gentner, D. R.
2017-12-01
Intermediate-volatility and semi-volatile organic compounds (I/SVOCs) are major precursors to secondary organic aerosol, and contribute to tropospheric ozone formation. Their wide volatility range, chemical complexity, behavior in analytical systems, and trace concentrations present numerous hurdles to characterization. We present an integrated sampling-to-analysis system for the collection and offline analysis of trace gas-phase organic compounds with the goal of preserving and recovering analytes throughout sample collection, transport, storage, and thermal desorption for accurate analysis. Custom multi-bed adsorbent tubes are used to collect samples for offline analysis by advanced analytical detectors. The analytical instrumentation comprises an automated thermal desorption system that introduces analytes from the adsorbent tubes into a gas chromatograph, which is coupled with an electron ionization mass spectrometer (GC-EIMS) and other detectors. In order to optimize the collection and recovery for a wide range of analyte volatility and functionalization, we evaluated a variety of commercially-available materials, including Res-Sil beads, quartz wool, glass beads, Tenax TA, and silica gel. Key properties for optimization include inertness, versatile chemical capture, minimal affinity for water, and minimal artifacts or degradation byproducts; these properties were assessed with a diverse mix of traditionally-measured and functionalized analytes. Along with a focus on material selection, we provide recommendations spanning the entire sampling-and-analysis process to improve the accuracy of future comprehensive I/SVOC measurements, including oxygenated and other functionalized I/SVOCs. We demonstrate the performance of our system by providing results on speciated VOCs-SVOCs from indoor, outdoor, and chamber studies that establish the utility of our protocols and pave the way for precise laboratory characterization via a mix of detection methods.
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher
2010-01-01
Foil gas bearings are a key technology in many commercial and emerging Oil-Free turbomachinery systems. These bearings are non-linear and have been difficult to analytically model in terms of performance characteristics such as load capacity, power loss, stiffness, and damping. Previous investigations led to an empirically derived method, a rule of thumb, to estimate load capacity. This method has been a valuable tool in system development. The current paper extends this tool concept to include rules for stiffness and damping coefficient estimation. It is expected that these rules will further accelerate the development and deployment of advanced Oil-Free machines operating on foil gas bearings.
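A hedged sketch of the kind of rule-of-thumb load-capacity estimate referred to above is given below: maximum load taken as proportional to the bearing's projected area and surface speed, scaled by an empirical load-capacity coefficient. The functional form and coefficient value are quoted from general foil-bearing literature rather than from this paper and should be treated as placeholders.

```python
# Hedged sketch of a foil-bearing load-capacity rule of thumb: load scales with projected
# area (L*D) and surface speed (D*speed/2) through an empirical coefficient. Form and
# values are placeholders drawn from general literature, not from this paper.
def estimate_load_capacity(coeff, length_in, diameter_in, speed_krpm):
    """coeff in lbf/(in^3 * krpm); returns an estimated maximum load in lbf."""
    return coeff * (length_in * diameter_in) * (diameter_in * speed_krpm / 2.0)

# Hypothetical advanced-generation bearing: 1.5 in x 1.5 in journal at 30 krpm.
print(estimate_load_capacity(coeff=1.0, length_in=1.5, diameter_in=1.5, speed_krpm=30.0))
```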
Experimental and analytical research on the aerodynamics of wind driven turbines. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohrbach, C.; Wainauski, H.; Worobel, R.
1977-12-01
This aerodynamic research program was aimed at providing a reliable, comprehensive data base on a series of wind turbine models covering a broad range of the prime aerodynamic and geometric variables. Such data obtained under controlled laboratory conditions on turbines designed by the same method, of the same size, and tested in the same wind tunnel had not been available in the literature. Moreover, this research program was further aimed at providing a basis for evaluating the adequacy of existing wind turbine aerodynamic design and performance methodology, for assessing the potential of recent advanced theories and for providing a basis for further method development and refinement.
Contributions of immunoaffinity chromatography to deep proteome profiling of human biofluids
Wu, Chaochao; Duan, Jicheng; Liu, Tao; ...
2016-01-12
Human biofluids, especially blood plasma or serum, hold great potential as the sources of candidate biomarkers for various diseases; however, the enormous dynamic range of protein concentrations in biofluids represents a significant analytical challenge for detecting promising low-abundance proteins. Over the last decade, various immunoaffinity chromatographic methods have been developed and routinely applied for separating low-abundance proteins from the high- and moderate-abundance proteins, thus enabling much more effective detection of low-abundance proteins. Herein, we review the advances of immunoaffinity separation methods and their contributions to the proteomic applications in human biofluids. The limitations and future perspectives of immunoaffinity separation methods are also discussed.
A study of fracture phenomena in fiber composite laminates. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Konish, H. J., Jr.
1973-01-01
The extension of linear elastic fracture mechanics from ostensibly homogeneous isotropic metallic alloys to heterogeneous anisotropic advanced fiber composites is considered. It is analytically demonstrated that the effects of material anisotropy do not alter the principal characteristics exhibited by a crack in an isotropic material. The heterogeneity of fiber composites is experimentally shown to have a negligible effect on the behavior of a sufficiently long crack. A method is proposed for predicting the fracture strengths of a large class of composite laminates; the values predicted by this method show good agreement with limited experimental data. The limits imposed by material heterogeneity are briefly discussed, and areas for further study are recommended.
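As a hedged illustration of the linear elastic fracture mechanics framework the study extends, the sketch below evaluates the basic strength estimate sigma_f = K_c / (Y * sqrt(pi * a)) for a through crack; the toughness, geometry factor, and crack size are placeholders, not laminate data from the thesis.

```python
# Hedged LEFM illustration (not the thesis's proposed method): strength of a cracked
# member estimated as sigma_f = K_c / (Y * sqrt(pi * a)). All values are placeholders.
import math

def fracture_strength(K_c, a, Y=1.0):
    """K_c in MPa*sqrt(m), crack half-length a in m, Y a dimensionless geometry factor."""
    return K_c / (Y * math.sqrt(math.pi * a))

print(f"{fracture_strength(K_c=35.0, a=0.005):.0f} MPa")  # ~279 MPa for these placeholder values
```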
Yim, Sehyuk; Gultepe, Evin; Gracias, David H.
2014-01-01
This paper proposes a new wireless biopsy method where a magnetically actuated untethered soft capsule endoscope carries and releases a large number of thermo-sensitive, untethered microgrippers (μ-grippers) at a desired location inside the stomach and retrieves them after they self-fold and grab tissue samples. We describe the working principles and analytical models for the μ-gripper release and retrieval mechanisms, and evaluate the proposed biopsy method in ex vivo experiments. This hierarchical approach combining the advanced navigation skills of centimeter-scaled untethered magnetic capsule endoscopes with highly parallel, autonomous, submillimeter scale tissue sampling μ-grippers offers a multifunctional strategy for gastrointestinal capsule biopsy. PMID:24108454
Progress and challenges associated with halal authentication of consumer packaged goods.
Premanandh, Jagadeesan; Bin Salem, Samara
2017-11-01
Abusive business practices are increasingly evident in consumer packaged goods. Although consumers have the right to protect themselves against such practices, rapid urbanization and industrialization result in greater distances between producers and consumers, raising serious concerns about the supply chain. The operational complexities surrounding halal authentication pose serious challenges to the integrity of consumer packaged goods. This article attempts to address the progress and challenges associated with halal authentication. Advances and concerns in the application of new, rapid analytical methods for halal authentication are discussed. The significance of a zero-tolerance policy for consumer packaged foods and its impact on analytical testing are presented. The role of halal assurance systems and their challenges are also considered. In conclusion, consensus on the establishment of one standard approach, coupled with a sound traceability system and constant monitoring, would certainly improve and ensure the halalness of consumer packaged goods. © 2017 Society of Chemical Industry.
Unambiguous detection of nitrated explosive vapours by fluorescence quenching of dendrimer films
NASA Astrophysics Data System (ADS)
Geng, Yan; Ali, Mohammad A.; Clulow, Andrew J.; Fan, Shengqiang; Burn, Paul L.; Gentle, Ian R.; Meredith, Paul; Shaw, Paul E.
2015-09-01
Unambiguous and selective standoff (non-contact) infield detection of nitro-containing explosives and taggants is an important goal but difficult to achieve with standard analytical techniques. Oxidative fluorescence quenching is emerging as a high sensitivity method for detecting such materials but is prone to false positives: everyday items such as perfumes elicit similar responses. Here we report thin films of light-emitting dendrimers that detect vapours of explosives and taggants selectively; fluorescence quenching is not observed for a range of common interferents. Using a combination of neutron reflectometry, quartz crystal microbalance and photophysical measurements we show that the origin of the selectivity is primarily electronic and not the diffusion kinetics of the analyte or its distribution in the film. The results are a major advance in the development of sensing materials for the standoff detection of nitro-based explosive vapours, and deliver significant insights into the physical processes that govern the sensing efficacy.
An overview on current fluid-inclusion research and applications
Chi, G.; Chou, I.-Ming; Lu, H.-Z.
2003-01-01
This paper provides an overview of some of the more important developments in fluid-inclusion research and applications in recent years, including fluid-inclusion petrography, PVTX studies, and analytical techniques. In fluid-inclusion petrography, the introduction of the concept of 'fluid-inclusion assemblage' has been a major advance. In PVTX studies, the use of synthetic fluid inclusions and hydrothermal diamond-anvil cells has greatly contributed to the characterization of the phase behaviour of geologically relevant fluid systems. Various analytical methods are being developed and refined rapidly, with the Laser-Raman and LA-ICP-MS techniques being particularly useful for volatile and solute analyses, respectively. Ore deposit research has been and will continue to be the main field of application of fluid inclusions. However, fluid inclusions have been increasingly applied to other fields of earth science, especially in petroleum geology and the study of magmatic and earth interior processes.
NASA Astrophysics Data System (ADS)
Kumarapperuma, Lakshitha; Premaratne, Malin; Jha, Pankaj K.; Stockman, Mark I.; Agrawal, Govind P.
2018-05-01
We demonstrate that it is possible to derive an approximate analytical expression to characterize the spasing (L-L) curve of a coherently enhanced spaser with 3-level gain-medium chromophores. The utility of this solution stems from the fact that it enables optimization of the large parameter space associated with spaser designing, a functionality not offered by the methods currently available in the literature. This is vital for the advancement of spaser technology towards the level of device realization. Owing to the compact nature of the analytical expressions, our solution also facilitates the grouping and identification of key processes responsible for the spasing action, whilst providing significant physical insights. Furthermore, we show that our expression generates results within 0.1% error compared to numerically obtained results for pumping rates higher than the spasing threshold, thereby drastically reducing the computational cost associated with spaser designing.
A communal catalogue reveals Earth's multiscale microbial diversity.
Thompson, Luke R; Sanders, Jon G; McDonald, Daniel; Amir, Amnon; Ladau, Joshua; Locey, Kenneth J; Prill, Robert J; Tripathi, Anupriya; Gibbons, Sean M; Ackermann, Gail; Navas-Molina, Jose A; Janssen, Stefan; Kopylova, Evguenia; Vázquez-Baeza, Yoshiki; González, Antonio; Morton, James T; Mirarab, Siavash; Zech Xu, Zhenjiang; Jiang, Lingjing; Haroon, Mohamed F; Kanbar, Jad; Zhu, Qiyun; Jin Song, Se; Kosciolek, Tomasz; Bokulich, Nicholas A; Lefler, Joshua; Brislawn, Colin J; Humphrey, Gregory; Owens, Sarah M; Hampton-Marcell, Jarrad; Berg-Lyons, Donna; McKenzie, Valerie; Fierer, Noah; Fuhrman, Jed A; Clauset, Aaron; Stevens, Rick L; Shade, Ashley; Pollard, Katherine S; Goodwin, Kelly D; Jansson, Janet K; Gilbert, Jack A; Knight, Rob
2017-11-23
Our growing awareness of the microbial world's importance and diversity contrasts starkly with our limited understanding of its fundamental structure. Despite recent advances in DNA sequencing, a lack of standardized protocols and common analytical frameworks impedes comparisons among studies, hindering the development of global inferences about microbial life on Earth. Here we present a meta-analysis of microbial community samples collected by hundreds of researchers for the Earth Microbiome Project. Coordinated protocols and new analytical methods, particularly the use of exact sequences instead of clustered operational taxonomic units, enable bacterial and archaeal ribosomal RNA gene sequences to be followed across multiple studies and allow us to explore patterns of diversity at an unprecedented scale. The result is both a reference database giving global context to DNA sequence data and a framework for incorporating data from future studies, fostering increasingly complete characterization of Earth's microbial diversity.
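The analytical shift highlighted above, tallying exact sequences so that identical reads match across studies rather than clustering reads into operational taxonomic units, is illustrated by the toy sketch below; the reads are invented placeholders, and real pipelines also denoise and quality-filter.

```python
# Toy illustration of exact-sequence tallying across samples (in contrast to OTU
# clustering). Reads are invented placeholders; real pipelines also denoise and
# quality-filter before building the feature table.
from collections import Counter

samples = {
    "soil_A":  ["ACGTACGT", "ACGTACGT", "ACGTTCGT"],
    "ocean_B": ["ACGTACGT", "GGGTACGT"],
}

# Exact-sequence feature table: sequence -> per-sample counts.
table = {}
for sample, reads in samples.items():
    for seq, n in Counter(reads).items():
        table.setdefault(seq, {})[sample] = n

for seq, counts in table.items():
    print(seq, counts)
```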
Unambiguous detection of nitrated explosive vapours by fluorescence quenching of dendrimer films.
Geng, Yan; Ali, Mohammad A; Clulow, Andrew J; Fan, Shengqiang; Burn, Paul L; Gentle, Ian R; Meredith, Paul; Shaw, Paul E
2015-09-15
Unambiguous and selective standoff (non-contact) infield detection of nitro-containing explosives and taggants is an important goal but difficult to achieve with standard analytical techniques. Oxidative fluorescence quenching is emerging as a high sensitivity method for detecting such materials but is prone to false positives—everyday items such as perfumes elicit similar responses. Here we report thin films of light-emitting dendrimers that detect vapours of explosives and taggants selectively—fluorescence quenching is not observed for a range of common interferents. Using a combination of neutron reflectometry, quartz crystal microbalance and photophysical measurements we show that the origin of the selectivity is primarily electronic and not the diffusion kinetics of the analyte or its distribution in the film. The results are a major advance in the development of sensing materials for the standoff detection of nitro-based explosive vapours, and deliver significant insights into the physical processes that govern the sensing efficacy.
Unambiguous detection of nitrated explosive vapours by fluorescence quenching of dendrimer films
Geng, Yan; Ali, Mohammad A.; Clulow, Andrew J.; Fan, Shengqiang; Burn, Paul L.; Gentle, Ian R.; Meredith, Paul; Shaw, Paul E.
2015-01-01
Unambiguous and selective standoff (non-contact) infield detection of nitro-containing explosives and taggants is an important goal but difficult to achieve with standard analytical techniques. Oxidative fluorescence quenching is emerging as a high sensitivity method for detecting such materials but is prone to false positives—everyday items such as perfumes elicit similar responses. Here we report thin films of light-emitting dendrimers that detect vapours of explosives and taggants selectively—fluorescence quenching is not observed for a range of common interferents. Using a combination of neutron reflectometry, quartz crystal microbalance and photophysical measurements we show that the origin of the selectivity is primarily electronic and not the diffusion kinetics of the analyte or its distribution in the film. The results are a major advance in the development of sensing materials for the standoff detection of nitro-based explosive vapours, and deliver significant insights into the physical processes that govern the sensing efficacy. PMID:26370931
The geometry of singularities and the black hole information paradox
NASA Astrophysics Data System (ADS)
Stoica, O. C.
2015-07-01
The information loss occurs in an evaporating black hole only if the time evolution ends at the singularity. But as we shall see, the black hole solutions admit analytical extensions beyond the singularities, to globally hyperbolic solutions. The method used is similar to that for the apparent singularity at the event horizon, but at the singularity, the resulting metric is degenerate. When the metric is degenerate, the covariant derivative, the curvature, and the Einstein equation become singular. However, recent advances in the geometry of spacetimes with singular metric show that there are ways to extend analytically the Einstein equation and other field equations beyond such singularities. This means that the information can get out of the singularity. In the case of charged black holes, the obtained solutions have nonsingular electromagnetic field. As a bonus, if particles are such black holes, spacetime undergoes dimensional reduction effects like those required by some approaches to perturbative Quantum Gravity.
NASA Astrophysics Data System (ADS)
Takeuchi, Toshie; Nakagawa, Takafumi; Tsukima, Mitsuru; Koyama, Kenichi; Tohya, Nobumoto; Yano, Tomotaka
A new electromagnetically actuated vacuum circuit breaker (VCB) has been designed and developed on the basis of transient electromagnetic analysis coupled with motion. The VCB has three advanced bi-stable electromagnetic actuators, which control each phase independently. The VCB serves as a synchronous circuit breaker as well as a standard circuit breaker. In this work, the flux delay due to the eddy current is analytically formulated using the delay time constant of the actuator coil current, thereby enabling accurate prediction of the driving behavior. With this analytical method, the electromagnetic mechanism for a 24 kV rated VCB has been optimized; as a result, the driving energy is reduced to one fifth of that of a conventional VCB employing a spring mechanism, and the number of parts is significantly decreased. Therefore, the developed VCB becomes compact, highly reliable and highly durable.
Engineered Antibodies for Monitoring of Polynuclear Aromatic Hydrocarbons
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexander E. Karu Ph.D; Victoria A. Roberts Ph.D.; Qing X. Li, Ph.D.
2002-01-17
This project was undertaken to fill needs in DOE's human and ecosystem health effects research, site remediation, rapid emergency response, and regulatory compliance monitoring programs. DOE has greatly stimulated development and validation of antibody-based, rapid, field-portable detection systems for small hazardous compounds. These range from simple dipsticks, microplate enzyme-linked immunosorbent assays (ELISAs), and hand-held colorimeters, to ultrasensitive microfluidic reactors, fiber-optic sensors and microarrays that can identify multiple analytes from patterns of cross-reactivity. Unfortunately, the technology to produce antibodies with the most desirable properties did not keep pace. Lack of antibodies remains a limiting factor in production and practical use of such devices. The goals of our project were to determine the chemical and structural bases for the antibody-analyte binding interactions using advanced computational chemistry, and to use this information to create useful new binding properties through in vitro genetic engineering and combinatorial library methods.
Takeuchi, Masaki; Tsunoda, Hiromichi; Tanaka, Hideji; Shiramizu, Yoshimi
2011-01-01
This paper describes the performance of our automated monitor for acidic gases (CH₃COOH, HCOOH, HCl, HNO₂, SO₂, and HNO₃) utilizing a parallel-plate wet denuder (PPWD). The PPWD quantitatively collects gaseous contaminants at a high sample flow rate (∼8 dm³ min⁻¹) compared with conventional methods used in a clean room. Rapid response to any variability in the sample concentration enables near-real-time monitoring. In the developed monitor, the analyte collected with the PPWD is pumped into one of two preconcentration columns for 15 min and determined by means of ion chromatography. While one preconcentration column is used for chromatographic separation, the other is used for loading the sample solution. The system allows continuous monitoring of the common acidic gases in an advanced semiconductor manufacturing clean room. © 2011 The Japan Society for Analytical Chemistry
Tanaka, Shiro; Matsuyama, Yutaka; Ohashi, Yasuo
2017-08-30
Increasing attention has been focused on the use and validation of surrogate endpoints in cancer clinical trials. The previous literature on validation of surrogate endpoints is classified into four approaches: the proportion explained approach, the indirect effects approach, the meta-analytic approach, and the principal stratification approach. The mainstream in cancer research has seen the application of the meta-analytic approach. However, VanderWeele (2013) showed that all four of these approaches potentially suffer from the surrogate paradox. It was also shown that, if a principal surrogate satisfies additional criteria called one-sided average causal sufficiency, the surrogate cannot exhibit a surrogate paradox. Here, we propose a method for estimating principal effects under a monotonicity assumption. Specifically, we consider cancer clinical trials that compare a binary surrogate endpoint and a time-to-event clinical endpoint under two naturally ordered treatments (e.g., combined therapy vs. monotherapy). Estimation based on a mean score estimating equation will be implemented by the expectation-maximization algorithm. We will also apply the proposed method, as well as other surrogacy criteria, to evaluate the surrogacy of prostate-specific antigen using data from a phase III advanced prostate cancer trial, clarifying the complementary roles of both the principal stratification and meta-analytic approaches in the evaluation of surrogate endpoints in cancer. Copyright © 2017 John Wiley & Sons, Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P Yu
Unlike traditional 'wet' analytical methods, which during processing for analysis often result in destruction or alteration of the intrinsic protein structures, advanced synchrotron radiation-based Fourier transform infrared microspectroscopy has been developed as a rapid and nondestructive bioanalytical technique. This cutting-edge synchrotron-based bioanalytical technology, which takes advantage of synchrotron light brightness (millions of times brighter than the sun), is capable of exploring the molecular chemistry or structure of a biological tissue without destroying its inherent structures, at ultra-spatial resolution. In this article, a novel approach is introduced to show the potential of the advanced synchrotron-based analytical technology, which can be used to study plant-based food or feed protein molecular structure in relation to nutrient utilization and availability. Recent progress is reported on using the synchrotron-based bioanalytical techniques of synchrotron radiation-based Fourier transform infrared microspectroscopy and diffuse reflectance infrared Fourier transform spectroscopy to detect the effects of gene transformation (Application 1), autoclaving (Application 2), and bio-ethanol processing (Application 3) on plant-based food and feed protein structure changes on a molecular basis. The synchrotron-based technology provides a new approach for plant-based protein structure research at ultra-spatial resolution at the cellular and molecular levels.
Analytical quality by design: a tool for regulatory flexibility and robust analytics.
Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT).
Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics
Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723
Determination of the Performance Parameters of a Spectrophotometer: An Advanced Experiment.
ERIC Educational Resources Information Center
Cope, Virgil W.
1978-01-01
Describes an advanced analytical chemistry laboratory experiment developed for the determination of the performance parameters of a spectrophotometer. Among the parameters are baseline linearity with wavelength, wavelength accuracy and repeatability, stray light, noise level, and pen response time. (HM)
Analytical Chemistry Division annual progress report for period ending December 31, 1988
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The Analytical Chemistry Division of Oak Ridge National Laboratory (ORNL) is a large and diversified organization. As such, it serves a multitude of functions for a clientele that exists both in and outside of ORNL. These functions fall into the following general categories: (1) Analytical Research, Development, and Implementation. The division maintains a program to conceptualize, investigate, develop, assess, improve, and implement advanced technology for chemical and physicochemical measurements. Emphasis is on problems and needs identified with ORNL and Department of Energy (DOE) programs; however, attention is also given to advancing the analytical sciences themselves. (2) Programmatic Research, Development, and Utilization. The division carries out a wide variety of chemical work that typically involves analytical research and/or development plus the utilization of analytical capabilities to expedite programmatic interests. (3) Technical Support. The division performs chemical and physicochemical analyses of virtually all types. The Analytical Chemistry Division is organized into four major sections, each of which may carry out any of the three types of work mentioned above. Chapters 1 through 4 of this report highlight progress within the four sections during the period January 1 to December 31, 1988. A brief discussion of the division's role in an especially important environmental program is given in Chapter 5. Information about quality assurance, safety, and training programs is presented in Chapter 6, along with a tabulation of analyses rendered. Publications, oral presentations, professional activities, educational programs, and seminars are cited in Chapters 7 and 8.
Recent Advances in Bioprinting and Applications for Biosensing
Dias, Andrew D.; Kingsley, David M.; Corr, David T.
2014-01-01
Future biosensing applications will require high performance, including real-time monitoring of physiological events, incorporation of biosensors into feedback-based devices, detection of toxins, and advanced diagnostics. Such functionality will necessitate biosensors with increased sensitivity, specificity, and throughput, as well as the ability to simultaneously detect multiple analytes. While these demands have yet to be fully realized, recent advances in biofabrication may allow sensors to achieve the high spatial sensitivity required, and bring us closer to achieving devices with these capabilities. To this end, we review recent advances in biofabrication techniques that may enable cutting-edge biosensors. In particular, we focus on bioprinting techniques (e.g., microcontact printing, inkjet printing, and laser direct-write) that may prove pivotal to biosensor fabrication and scaling. Recent biosensors have employed these fabrication techniques with success, and further development may enable higher performance, including multiplexing multiple analytes or cell types within a single biosensor. We also review recent advances in 3D bioprinting, and explore their potential to create biosensors with live cells encapsulated in 3D microenvironments. Such advances in biofabrication will expand biosensor utility and availability, with impact realized in many interdisciplinary fields, as well as in the clinic. PMID:25587413
Optimal clinical trial design based on a dichotomous Markov-chain mixed-effect sleep model.
Steven Ernest, C; Nyberg, Joakim; Karlsson, Mats O; Hooker, Andrew C
2014-12-01
D-optimal designs for discrete-type responses have been derived using generalized linear mixed models, simulation-based methods, and analytical approximations for computing the Fisher information matrix (FIM) of non-linear mixed effect models with homogeneous probabilities over time. In this work, D-optimal designs using an analytical approximation of the FIM for a dichotomous, non-homogeneous, Markov-chain phase-advanced sleep non-linear mixed effect model were investigated. The non-linear mixed effect model consisted of transition probabilities of dichotomous sleep data estimated as logistic functions using piecewise linear functions. Theoretical linear and nonlinear dose effects were added to the transition probabilities to modify the probability of being in either sleep stage. D-optimal designs were computed by determining an analytical approximation of the FIM for each Markov component (one where the previous state was awake and another where the previous state was asleep). Each Markov component FIM was weighted either equally or by the average probability of the response being awake or asleep over the night, and the components were summed to derive the total FIM (FIM(total)). The reference designs were placebo and 0.1-, 1-, 6-, 10-, and 20-mg dosing for a 2- to 6-way crossover study in six dosing groups. Optimized design variables were dose and number of subjects in each dose group. The designs were validated using stochastic simulation/re-estimation (SSE). Contrary to expectations, the predicted parameter uncertainty obtained via FIM(total) was larger than the uncertainty in parameter estimates computed by SSE. Nevertheless, the D-optimal designs decreased the uncertainty of parameter estimates relative to the reference designs. Additionally, the improvement for the D-optimal designs was more pronounced using SSE than predicted via FIM(total). Through the use of an approximate analytic solution and weighting schemes, FIM(total) for a non-homogeneous, dichotomous Markov-chain phase-advanced sleep model was computed and provided more efficient trial designs and increased nonlinear mixed-effects modeling parameter precision.
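As an illustration of the weighting step described above, the following is a minimal sketch (not the authors' implementation) of how two Markov-component FIMs might be combined into FIM(total) and scored with the D-optimality criterion. The matrices, weights, and parameter count are hypothetical.

```python
import numpy as np

def d_criterion(fim_awake, fim_asleep, w_awake=0.5, w_asleep=0.5):
    """Weighted total FIM and its D-criterion (log-determinant).

    fim_awake, fim_asleep : per-Markov-component FIMs (p x p arrays)
    w_awake, w_asleep     : weights (equal, or average state probabilities)
    """
    fim_total = w_awake * fim_awake + w_asleep * fim_asleep
    sign, logdet = np.linalg.slogdet(fim_total)
    return fim_total, logdet  # maximize logdet over the design variables

# Toy usage: two hypothetical 3-parameter component FIMs for one candidate design
fim_a = np.diag([40.0, 10.0, 2.5])
fim_s = np.diag([25.0, 18.0, 1.5])
_, crit = d_criterion(fim_a, fim_s)
print(f"D-criterion (log det FIM_total): {crit:.3f}")
```

In an optimization loop, this criterion would be evaluated for each candidate combination of doses and group sizes and the design with the largest value retained.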
Structural Benchmark Creep Testing for the Advanced Stirling Convertor Heater Head
NASA Technical Reports Server (NTRS)
Krause, David L.; Kalluri, Sreeramesh; Bowman, Randy R.; Shah, Ashwin R.
2008-01-01
The National Aeronautics and Space Administration (NASA) has identified the high-efficiency Advanced Stirling Radioisotope Generator (ASRG) as a candidate power source for use on long-duration science missions such as lunar applications, Mars rovers, and deep space missions. For the inherently long lifetimes required, a structurally significant design limit for the heater head component of the ASRG Advanced Stirling Convertor (ASC) is creep deformation induced at low stress levels and high temperatures. Demonstrating proof of adequate margins on creep deformation and rupture for the operating conditions and the MarM-247 material of construction is a challenge that the NASA Glenn Research Center is addressing. The combined analytical and experimental program ensures integrity and high reliability of the heater head for its 17-year design life. The life assessment approach starts with an extensive series of uniaxial creep tests on thin MarM-247 specimens that have the same chemistry, microstructure, and heat treatment processing as the heater head itself. This effort addresses a scarcity of openly available creep properties for the material as well as the virtual absence of understanding of the effect on creep properties of very thin walls, fine grains, low stress levels, and high-temperature fabrication steps. The approach continues with a considerable analytical effort, both deterministic, to evaluate the median creep life using nonlinear finite element analysis, and probabilistic, to calculate the heater head's reliability to a higher degree. Finally, the approach includes a substantial structural benchmark creep testing activity to calibrate and validate the analytical work. This last element provides high-fidelity testing of prototypical heater head test articles; the testing includes the relevant material issues and the essential multiaxial stress state, and applies prototypical and accelerated temperature profiles for timely results in a highly controlled laboratory environment. This paper focuses on the last element and presents a preliminary methodology for creep rate prediction, the experimental methods, test challenges, and results from benchmark testing of a trial MarM-247 heater head test article. The results compare favorably with the analytical strain predictions. A description of other test findings is provided, and recommendations for future test procedures are suggested. The manuscript concludes by describing the potential impact of the heater head creep life assessment and benchmark testing effort on the ASC program.
Reference Materials: Critical Importance to the Infant Formula Industry.
Wargo, Wayne F
2017-09-01
Infant formula is one of the most regulated foods in the world. It has advanced in complexity over the years as a result of numerous research innovations. To ensure product safety and quality, analytical technologies have also had to advance to keep pace. Given the rigorous performance demands expected of these methods and the ever-growing array of complex matrixes, there is the potential for gaps to exist in current Official MethodsSM and other recognized international methods for infant formula and adult nutritionals. Food safety concerns, particularly for infants, drive the need for extensive testing by manufacturers and regulators. The net effect is the potential for an increase in time- and resource-consuming regulatory disputes. In an effort to mitigate such costly activities, AOAC INTERNATIONAL, under the direction of the Infant Formula Council of America-a trade association of manufacturers and marketers of formulated nutritional products-agreed to establish voluntary consensus Standard Method Performance Requirements, and, ultimately, to identify and publish globally recognized, fit-for-purpose standard methods. To accomplish this task, nutritional reference materials (RMs), representing all major commercially available nutritional formulations, were (and continue to be) a critical necessity. In this paper, various types of RMs will be defined, followed by review and discussion of their importance to the infant formula industry.
NASA Technical Reports Server (NTRS)
Denner, Brett William
1989-01-01
An approximate method was developed to analyze and predict the acoustics of a counterrotating propeller configuration. The method employs the analytical techniques of Lock and Theodorsen, as described by Davidson, to predict the steady performance of a counterrotating configuration. Then, a modification of the method of Lesieutre is used to predict the unsteady forces on the blades. Finally, the steady and unsteady loads are used in the numerical method of Succi to predict the unsteady acoustics of the propeller. The numerical results are compared with experimental acoustic measurements by Gazzaniga of a counterrotating propeller configuration operating under several combinations of advance ratio, blade pitch, and number of blades. In addition, a constant-speed commuter-class propeller configuration was designed with the Davidson method and its acoustics analyzed at three advance ratios. Noise levels and frequency spectra were calculated at a number of locations around the configuration. The directivity patterns of the harmonics in both the horizontal and vertical planes were examined, with the conclusion that the noise levels of the even harmonics are relatively independent of direction, whereas the noise levels of the odd harmonics are strongly dependent on azimuthal direction in the horizontal plane. The equations of Succi are examined to explain this behavior.
Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment
NASA Technical Reports Server (NTRS)
Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.
1979-01-01
The paper reviews the scale of fatigue crack phenomena in relation to the size-detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can predict many of their features, although predictions from analytic models based on finite element computer analysis disagree with the data in certain respects. Experimental analyses of rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume, and crack depth, should provide improved methods of crack characterization for fracture mechanics and life prediction.
A higher order panel method for linearized supersonic flow
NASA Technical Reports Server (NTRS)
Ehlers, F. E.; Epton, M. A.; Johnson, F. T.; Magnus, A. E.; Rubbert, P. E.
1979-01-01
The basic integral equations of linearized supersonic theory for an advanced supersonic panel method are derived. Methods using only linearly varying source strength over each panel or only quadratic doublet strength over each panel gave good agreement with analytic solutions over cones and zero-thickness cambered wings. For three-dimensional bodies and wings of general shape, combined source and doublet panels with interior boundary conditions to eliminate the internal perturbations led to a stable method providing good agreement with experiment. A panel system with all edges contiguous resulted from dividing the basic four-point non-planar panel into eight triangular subpanels, and the doublet strength was made continuous at all edges by a quadratic distribution over each subpanel. Superinclined panels were developed and tested on a simple nacelle and on an airplane model having engine inlets, with excellent results.
Green design assessment of electromechanical products based on group weighted-AHP
NASA Astrophysics Data System (ADS)
Guo, Jinwei; Zhou, MengChu; Li, Zhiwu; Xie, Huiguang
2015-11-01
Manufacturing industry is the backbone of a country's economy, while environmental pollution is a serious problem that human beings must face today. The green design of electromechanical products based on enterprise information systems is an important method for addressing this environmental problem. The question of how to design green products must be answered by designers using both advanced design methods and effective assessment methods for electromechanical products. Making an objective and precise assessment of green design is one of the problems that must be solved when green design is conducted. An assessment method for the green design of electromechanical products based on Group Weighted-AHP (Analytic Hierarchy Process) is proposed in this paper, together with the characteristics of green products. The assessment steps of green design are also established. The results are illustrated via the assessment of a refrigerator design.
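For readers unfamiliar with AHP, the sketch below illustrates the generic computation that a group weighted AHP assessment builds on: individual pairwise comparison matrices are aggregated (here by a weighted geometric mean, one common choice), criterion weights are taken from the principal eigenvector, and a consistency ratio is checked. The criteria, judgments, and expert weights are hypothetical, and the paper's specific weighting scheme may differ.

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights from a pairwise comparison matrix (principal eigenvector)."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = pairwise.shape[0]
    ci = (vals.real[k] - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty random index (n = 3..5 shown)
    return w, ci / ri

def group_matrix(matrices, expert_weights):
    """Aggregate expert judgments by element-wise weighted geometric mean."""
    logs = [w * np.log(m) for m, w in zip(matrices, expert_weights)]
    return np.exp(sum(logs))

# Toy example: two experts compare three green-design criteria
# (e.g., energy use, recyclability, hazardous materials)
m1 = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]], float)
m2 = np.array([[1, 2, 4], [1/2, 1, 3], [1/4, 1/3, 1]], float)
agg = group_matrix([m1, m2], expert_weights=[0.6, 0.4])
w, cr = ahp_weights(agg)
print("criterion weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
```

A consistency ratio below about 0.1 is conventionally taken to mean the judgments are acceptably consistent; the resulting weights would then score candidate product designs.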
2017-01-01
Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513
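For context, multimarker qHNMR quantitation generally rests on the proportionality between integral area and the number of contributing protons. A commonly used form of the relation, with a calibrant c and analyte a, is shown below; this is the generic qHNMR equation, not necessarily the exact expression used in the study.

```latex
% Content (w/w) of analyte a from integrals I, proton counts N,
% molar masses M, weighed masses m, and calibrant purity P_c:
\[
  P_a \;=\; \frac{I_a}{I_c}\cdot\frac{N_c}{N_a}\cdot\frac{M_a}{M_c}\cdot\frac{m_c}{m_a}\cdot P_c
\]
```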
Higham, Desmond J.; Batty, Michael; Bettencourt, Luís M. A.; Greetham, Danica Vukadinović; Grindrod, Peter
2017-01-01
We introduce the 14 articles in the Royal Society Open Science themed issue on City Analytics. To provide a high-level, strategic, overview, we summarize the topics addressed and the analytical tools deployed. We then give a more detailed account of the individual contributions. Our overall aims are (i) to highlight exciting advances in this emerging, interdisciplinary field, (ii) to encourage further activity and (iii) to emphasize the variety of new, public-domain, datasets that are available to researchers. PMID:28386454
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
SAM Radiochemical Methods Query
Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.
Zhou, Weiqiang; Sherwood, Ben; Ji, Hongkai
2017-01-01
Technological advances have led to an explosive growth of high-throughput functional genomic data. Exploiting the correlation among different data types, it is possible to predict one functional genomic data type from other data types. Prediction tools are valuable in understanding the relationship among different functional genomic signals. They also provide a cost-efficient solution to inferring the unknown functional genomic profiles when experimental data are unavailable due to resource or technological constraints. The predicted data may be used for generating hypotheses, prioritizing targets, interpreting disease variants, facilitating data integration, quality control, and many other purposes. This article reviews various applications of prediction methods in functional genomics, discusses analytical challenges, and highlights some common and effective strategies used to develop prediction methods for functional genomic data. PMID:28076869
NASA Astrophysics Data System (ADS)
Yehia, Ali M.; Mohamed, Heba M.
2016-01-01
Three advanced chemometric-assisted spectrophotometric methods, namely Concentration Residuals Augmented Classical Least Squares (CRACLS), Multivariate Curve Resolution-Alternating Least Squares (MCR-ALS), and Principal Component Analysis-Artificial Neural Networks (PCA-ANN), were developed, validated, and benchmarked against PLS calibration to resolve the severely overlapped spectra and simultaneously determine Paracetamol (PAR), Guaifenesin (GUA), and Phenylephrine (PHE) in their ternary mixture and in the presence of p-aminophenol (AP), the main degradation product and synthesis impurity of Paracetamol. The analytical performance of the proposed methods was described by percentage recoveries, root mean square error of calibration, and standard error of prediction. The four multivariate calibration methods could be used directly without any preliminary separation step and were successfully applied to pharmaceutical formulation analysis, showing no interference from excipients.
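As a point of reference for the PLS benchmark mentioned above, the following is a minimal, hypothetical sketch of multivariate spectral calibration with scikit-learn. The arrays are random stand-ins for recorded UV spectra and analyte concentrations, and the number of latent variables would in practice be chosen by cross-validation; it is not the authors' CRACLS, MCR-ALS, or PCA-ANN implementation.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical calibration set: rows = mixtures, columns = wavelengths / analytes
rng = np.random.default_rng(0)
X_cal = rng.random((25, 120))        # stand-in for recorded UV spectra
Y_cal = rng.random((25, 3))          # stand-in for PAR, GUA, PHE concentrations

pls = PLSRegression(n_components=4)  # latent variables; tune by cross-validation
Y_cv = cross_val_predict(pls, X_cal, Y_cal, cv=5)
rmsecv = np.sqrt(((Y_cal - Y_cv) ** 2).mean(axis=0))
print("RMSECV per analyte:", np.round(rmsecv, 3))

pls.fit(X_cal, Y_cal)                # final model applied to unknown spectra
```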
Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping
2016-09-01
There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover metabolites and co-factors. We report the development and the validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides a robust coverage of central metabolites and co-factors in one single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.
Efficient estimation of diffusion during dendritic solidification
NASA Technical Reports Server (NTRS)
Yeum, K. S.; Poirier, D. R.; Laxmanan, V.
1989-01-01
A very efficient finite difference method has been developed to estimate the solute redistribution during solidification with diffusion in the solid. This method is validated by comparing the computed results with the results of an analytical solution derived by Kobayashi (1988) for the assumptions of a constant diffusion coefficient, a constant equilibrium partition ratio, and a parabolic rate of the advancement of the solid/liquid interface. The flexibility of the method is demonstrated by applying it to the dendritic solidification of a Pb-15 wt pct Sn alloy, for which the equilibrium partition ratio and diffusion coefficient vary substantially during solidification. The fraction eutectic at the end of solidification is also obtained by estimating the fraction solid, in greater resolution, where the concentration of solute in the interdendritic liquid reaches the eutectic composition of the alloy.
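The sketch below conveys the flavor of such a calculation, not the authors' scheme: an explicit finite-difference model of back-diffusion in the solid with a parabolic interface advance, simplified to a constant partition ratio and diffusivity (the paper allows both to vary during solidification). All numerical values are illustrative.

```python
import numpy as np

# Minimal sketch: 1-D back-diffusion in the solid during solidification.
# Constant partition ratio k and diffusivity D are assumed for clarity;
# the interface advances parabolically, x_i(t) = L * sqrt(t / t_f).
k, D, C0 = 0.3, 1e-12, 0.15        # partition ratio, m^2/s, initial liquid concentration
L, t_f = 50e-6, 60.0               # half dendrite-arm spacing (m), local solidification time (s)
N, steps = 200, 20000
dx, dt = L / N, t_f / steps
C_s = np.zeros(N)                  # solute concentration in solid nodes
C_l = C0                           # well-mixed interdendritic liquid
i_old = 0
for n in range(1, steps + 1):
    x_i = L * np.sqrt(n * dt / t_f)
    i_new = min(int(x_i / dx), N - 1)
    # freeze newly solidified nodes at k*C_l and enrich the remaining liquid
    for j in range(i_old, i_new):
        C_s[j] = k * C_l
        liq_frac = 1.0 - (j + 1) / N
        if liq_frac > 0:
            C_l += (C_l - C_s[j]) * (dx / L) / liq_frac
    # explicit diffusion step inside the solid (stable if D*dt/dx^2 <= 0.5)
    if i_new > 1:
        lap = np.zeros(i_new)
        lap[1:-1] = C_s[2:i_new] - 2 * C_s[1:i_new - 1] + C_s[:i_new - 2]
        C_s[:i_new] += D * dt / dx**2 * lap
        C_s[i_new - 1] = k * C_l   # interface node tied to the liquid composition
    i_old = i_new
print(f"final liquid concentration: {C_l:.3f} (compare with the eutectic composition)")
```

Comparing the final liquid concentration against the eutectic composition is the kind of check used to estimate the fraction eutectic at the end of solidification.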
Structural Assessment of Advanced Composite Tow-Steered Shells
NASA Technical Reports Server (NTRS)
Wu, K. Chauncey; Stanford, Bret K.; Hrinda, Glenn A.; Wang, Zhuosong; Martin, Robert A.; Kim, H. Alicia
2013-01-01
The structural performance of two advanced composite tow-steered shells, manufactured using a fiber placement system, is assessed using both experimental and analytical methods. The fiber orientation angles vary continuously around the shell circumference, from 10 degrees on the shell crown and keel to 45 degrees on the shell sides. The two shells differ in that one shell has the full 24-tow course applied during each pass of the fiber placement system, while the second shell uses the fiber placement system's tow drop/add capability to achieve a more uniform shell wall thickness. The shells are tested in axial compression, and estimates of their prebuckling axial stiffnesses and bifurcation buckling loads are predicted using linear finite element analyses. These preliminary predictions compare well with the test results, with an average agreement of approximately 10 percent.
Reliability Demonstration Approach for Advanced Stirling Radioisotope Generator
NASA Technical Reports Server (NTRS)
Ha, CHuong; Zampino, Edward; Penswick, Barry; Spronz, Michael
2010-01-01
Developed for future space missions as a high-efficiency power system, the Advanced Stirling Radioisotope Generator (ASRG) has a design life requirement of 14 yr in space following a potential storage of 3 yr after fueling. In general, the demonstration of long-life dynamic systems remains difficult, in part because of the perception that the wearout of moving parts cannot be minimized and the associated failures are unpredictable. This paper shows that a combination of systematic analytical methods, extensive experience gained from technology development, and well-planned tests can be used to ensure a high level of reliability for the ASRG. With this approach, all potential risks from each life phase of the system are evaluated and their mitigation is adequately addressed. This paper also provides a summary of important test results obtained to date for the ASRG and the planned effort for system-level extended operation.
The NASA Lewis large wind turbine program
NASA Technical Reports Server (NTRS)
Thomas, R. L.; Baldwin, D. H.
1981-01-01
The program is directed toward development of the technology for safe, reliable, environmentally acceptable large wind turbines that have the potential to generate a significant amount of electricity at costs competitive with conventional electric generation systems. In addition, these large wind turbines must be fully compatible with electric utility operations and interface requirements. Advances are made by gaining a better understanding of the system design drivers, improvements in the analytical design tools, verification of design methods with operating field data, and the incorporation of new technology and innovative designs. An overview of the program activities is presented and includes results from the first and second generation field machines (Mod-OA, -1, and -2), the design phase of the third generation wind turbine (Mod-5) and the advanced technology projects. Also included is the status of the Department of Interior WTS-4 machine.
Three dimensional calculation of thermonuclear ignition conditions for magnetized targets
NASA Astrophysics Data System (ADS)
Cortez, Ross; Cassibry, Jason; Lapointe, Michael; Adams, Robert
2017-10-01
Fusion power balance calculations, often performed using analytic methods, are used to estimate the design space for ignition conditions. In this paper, fusion power balance is calculated using a 3-D smoothed particle hydrodynamics code (SPFMax) incorporating recent stopping-power routines. Effects of thermal conduction, multigroup radiation emission and nonlocal absorption, ion/electron thermal equilibration, and compressional work are studied as a function of target and liner parameters and geometry for D-T, D-D, and 6Li-D fuels to identify the potential ignition design space. Here, ignition is defined as the condition when fusion particle deposition equals or exceeds the losses from heat conduction and radiation. The simulations are in support of ongoing research with NASA to develop advanced propulsion systems for rapid interplanetary space travel. Supported by NASA Innovative Advanced Concepts and NASA Marshall Space Flight Center.
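For orientation, the ignition definition above reduces, in zero dimensions, to a Lawson-style power balance. The sketch below implements that much simpler check for a 50/50 D-T plasma using a crude T-squared reactivity fit (valid roughly 10-20 keV) and a standard bremsstrahlung estimate, with conduction losses parameterized by an effective confinement time. It is not the 3-D SPFMax calculation, and all inputs are illustrative.

```python
import numpy as np

keV = 1.602e-16          # J per keV
E_ALPHA = 3.5e3 * keV    # alpha-particle energy per D-T reaction (J)

def ignited(n, T_keV, tau_E, f_dep=1.0):
    """Zero-dimensional Lawson-style ignition check for a 50/50 D-T plasma.

    n      : total ion (= electron) density, m^-3
    T_keV  : temperature, keV (assumes T_i = T_e)
    tau_E  : effective energy confinement time for conduction losses, s
    f_dep  : fraction of charged-particle energy deposited locally
    """
    sigv = 1.1e-24 * T_keV**2                               # crude D-T reactivity fit (m^3/s)
    p_fus_dep = f_dep * (n / 2) * (n / 2) * sigv * E_ALPHA  # deposited fusion power, W/m^3
    p_brem = 5.35e-37 * n * n * np.sqrt(T_keV)              # bremsstrahlung, W/m^3
    p_cond = 3.0 * n * T_keV * keV / tau_E                  # conduction, W/m^3
    return p_fus_dep >= p_brem + p_cond

# Example scan over density and confinement time at 10 keV
for n in (1e21, 1e22):
    for tau in (0.1, 1.0):
        print(f"n={n:.0e} m^-3, tau_E={tau} s ->", ignited(n, 10.0, tau))
```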
Diagnostics and Active Control of Aircraft Interior Noise
NASA Technical Reports Server (NTRS)
Fuller, C. R.
1998-01-01
This project deals with developing advanced methods for investigating and controlling interior noise in aircraft. The work concentrates on developing and applying the techniques of Near Field Acoustic Holography (NAH) and Principal Component Analysis (PCA) to the aircraft interior noise dynamic problem. This involves investigating the current state of the art, developing new techniques and then applying them to the particular problem being studied. The knowledge gained under the first part of the project was then used to develop and apply new, advanced noise control techniques for reducing interior noise. A new fully active control approach based on the PCA was developed and implemented on a test cylinder. Finally an active-passive approach based on tunable vibration absorbers was to be developed and analytically applied to a range of test structures from simple plates to aircraft fuselages.
Flight experience with flight control redundancy management
NASA Technical Reports Server (NTRS)
Szalai, K. J.; Larson, R. R.; Glover, R. D.
1980-01-01
Flight experience with both current and advanced redundancy management schemes was gained in recent flight research programs using the F-8 digital fly-by-wire aircraft. The flight performance of fault detection, isolation, and reconfiguration (FDIR) methods for sensors, computers, and actuators is reviewed. Results of induced failures as well as of actual random failures are discussed. Deficiencies in modeling and implementation techniques are also discussed. The paper also presents a comparison of multisensor tracking in smooth air, in turbulence, during large maneuvers, and during maneuvers typical of large commercial transport aircraft. The results of flight tests of an advanced analytic redundancy management algorithm are compared with the performance of a contemporary algorithm in terms of time to detection, false alarms, and missed alarms. The performance of computer redundancy management in both iron-bird and flight tests is also presented.
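To make the sensor FDIR idea concrete, here is a minimal, hypothetical sketch of a mid-value-select scheme with a persistence counter, which captures the general flavor of cross-channel redundancy management. It is not the F-8 digital fly-by-wire algorithm; the thresholds, channel names, and data are invented.

```python
from collections import defaultdict

def fdi_step(readings, threshold, counts, persistence=3):
    """One detection/isolation pass over redundant sensor channels.

    readings    : dict of channel name -> current measurement
    threshold   : residual magnitude that flags a disagreement
    counts      : running miscompare counters (mutated in place)
    persistence : consecutive miscompares required before declaring a failure
    """
    mid = sorted(readings.values())[len(readings) // 2]   # mid-value select
    failed = []
    for ch, value in readings.items():
        if abs(value - mid) > threshold:
            counts[ch] += 1
            if counts[ch] >= persistence:
                failed.append(ch)                         # isolate the channel
        else:
            counts[ch] = 0                                # reset on agreement
    return mid, failed

# Toy usage: triplex pitch-rate sensors with channel B drifting
counts = defaultdict(int)
for sample in ({"A": 1.00, "B": 1.02, "C": 0.99},
               {"A": 1.01, "B": 1.60, "C": 1.00},
               {"A": 1.02, "B": 1.75, "C": 1.01},
               {"A": 1.00, "B": 1.90, "C": 1.02}):
    selected, failed = fdi_step(sample, threshold=0.25, counts=counts)
    print(f"selected={selected:.2f} failed={failed}")
```

The persistence counter is what trades detection time against false alarms, which is exactly the comparison the flight tests quantify.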
Polybrominated Diphenyl Ethers in Dryer Lint: An Advanced Analysis Laboratory
ERIC Educational Resources Information Center
Thompson, Robert Q.
2008-01-01
An advanced analytical chemistry laboratory experiment is described that involves environmental analysis and gas chromatography-mass spectrometry. Students analyze lint from clothes dryers for traces of flame retardant chemicals, polybrominated diphenylethers (PBDEs), compounds receiving much attention recently. In a typical experiment, ng/g…
Advanced electron microscopy methods for the analysis of MgB2 superconductor
NASA Astrophysics Data System (ADS)
Birajdar, B.; Peranio, N.; Eibl, O.
2008-02-01
Advanced electron microscopy methods used for the analysis of superconducting MgB2 wires and tapes are described. The wires and tapes were prepared by the powder-in-tube method using different processing technologies and thoroughly characterized for their superconducting properties within the HIPERMAG project. Microstructure analysis on μm to nm length scales is necessary to understand the superconducting properties of MgB2. For MgB2 phase analysis on the μm scale an analytical SEM is used, and for analysis on the nm scale an energy-filtered STEM is used. Both microscopes were equipped with an EDX detector and a field emission gun. Electron microscopy and spectroscopy of MgB2 are challenging because of the boron analysis, carbon and oxygen contamination, and the presence of a large number of secondary phases. Advanced electron microscopy involves combined SEM, EPMA, and TEM analysis with artefact-free sample preparation, elemental mapping, and chemical quantification of point spectra. Details of the acquisition conditions and achieved accuracy are presented. Ex-situ wires show oxygen-free MgB2 colonies (a colony is a dense arrangement of several MgB2 grains) embedded in a porous and oxygen-rich matrix, introducing structural granularity. In comparison, in-situ wires are generally denser but show inhibited MgB2 phase formation with a significantly higher fraction of B-rich secondary phases. SiC additives in the in-situ wires form Mg2Si secondary phases. Advanced electron microscopy has been used to extract microstructure parameters such as colony size, B-rich secondary phase fraction, O mole fraction, and MgB2 grain size, and to establish a microstructure-critical current density model [1]. In summary, conventional secondary electron imaging in the SEM and diffraction contrast imaging in the TEM are by far not sufficient, and advanced electron microscopy methods are essential for the analysis of superconducting MgB2 wires and tapes.
Analytical Ultrasonics in Materials Research and Testing
NASA Technical Reports Server (NTRS)
Vary, A.
1986-01-01
Research results in analytical ultrasonics for characterizing structural materials, from metals and ceramics to composites, are presented. General topics covered by the conference included: status and advances in analytical ultrasonics for characterizing material microstructures and mechanical properties; status and prospects for ultrasonic measurements of microdamage, degradation, and underlying morphological factors; status and problems in precision measurements of frequency-dependent velocity and attenuation for materials analysis; procedures and requirements for automated, digital signal acquisition, processing, analysis, and interpretation; incentives for analytical ultrasonics in materials research and materials processing, testing, and inspection; and examples of progress in ultrasonics for interrelating microstructure, mechanical properties, and dynamic response.
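As a small illustration of the velocity and attenuation measurements discussed at the conference, the sketch below estimates longitudinal velocity from the time between two back-wall echo peaks in a pulse-echo A-scan, and attenuation from their amplitude ratio (diffraction and coupling corrections ignored). The waveform is synthetic and all parameters are illustrative.

```python
import numpy as np

def velocity_and_attenuation(signal, fs, thickness):
    """Estimate velocity and attenuation from two back-wall echoes (pulse-echo).

    signal    : digitized A-scan containing at least two echoes
    fs        : sampling rate (Hz)
    thickness : specimen thickness (m)
    """
    env = np.abs(signal)
    p1 = int(np.argmax(env))                  # first echo peak
    search = env.copy()
    search[:p1 + int(0.2e-6 * fs)] = 0        # blank out the first echo region
    p2 = int(np.argmax(search))               # second echo peak
    dt = (p2 - p1) / fs                       # round trip between echoes
    velocity = 2.0 * thickness / dt
    alpha = np.log(env[p1] / env[p2]) / (2.0 * thickness)   # Np/m
    return velocity, alpha

# Synthetic A-scan: two decaying tone-burst echoes 2 microseconds apart
fs, f0 = 100e6, 5e6
t = np.arange(0, 8e-6, 1 / fs)
burst = lambda t0, a: a * np.exp(-((t - t0) / 0.2e-6) ** 2) * np.sin(2 * np.pi * f0 * (t - t0))
ascan = burst(1.0e-6, 1.0) + burst(3.0e-6, 0.5)
v, a = velocity_and_attenuation(ascan, fs, thickness=0.006)
print(f"velocity ~ {v:.0f} m/s, attenuation ~ {a:.1f} Np/m")
```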
Learning Analytics as a Counterpart to Surveys of Student Experience
ERIC Educational Resources Information Center
Borden, Victor M. H.; Coates, Hamish
2017-01-01
Analytics derived from the student learning environment provide new insights into the collegiate experience; they can be used as a supplement to or, to some extent, in place of traditional surveys. To serve this purpose, however, greater attention must be paid to conceptual frameworks and to advancing institutional systems, activating new…
ERIC Educational Resources Information Center
Shtoyko, Tanya; Zudans, Imants; Seliskar, Carl J.; Heineman, William R.; Richardson, John N.
2004-01-01
A sensor experiment that can be applied to an advanced undergraduate laboratory course in physical or analytical chemistry is described, along with concepts such as the demonstration of chemical sensing, preparation of thin films on a substrate, microtitration, optical determination of complex-ion stoichiometry, and the isosbestic point. It is seen…
Juicing the Juice: A Laboratory-Based Case Study for an Instrumental Analytical Chemistry Course
ERIC Educational Resources Information Center
Schaber, Peter M.; Dinan, Frank J.; St. Phillips, Michael; Larson, Renee; Pines, Harvey A.; Larkin, Judith E.
2011-01-01
A young, inexperienced Food and Drug Administration (FDA) chemist is asked to distinguish between authentic fresh orange juice and suspected reconstituted orange juice falsely labeled as fresh. In an advanced instrumental analytical chemistry application of this case, inductively coupled plasma (ICP) spectroscopy is used to distinguish between the…